GRADING MACHINES WITH IMPROVED CONTROL

Information

  • Patent Application
  • Publication Number
    20220081878
  • Date Filed
    March 04, 2021
  • Date Published
    March 17, 2022
Abstract
A mobile machine includes a powertrain that propels the mobile machine about a worksite and a controllable subsystem configured to affect a surface of the worksite. The mobile machine includes a terrain sensor configured to sense a characteristic of the surface and generate sensor signals indicative of the characteristic of the surface. The mobile machine includes a control system configured to control the controllable subsystem, with a control signal, to affect a portion of the surface. The control system is configured to receive a sensor signal from the terrain sensor after the controllable subsystem has affected the portion of the surface, the sensor signal indicative of the characteristic of the portion of the surface after the controllable subsystem affects the portion of the surface, and to control the mobile machine based on the sensor signal and the control signal.
Description
CROSS-REFERENCE TO RELATED APPLICATION

The present application claims priority to the Indian patent application Serial No. 202021039429, filed Sep. 11, 2020, the content of which is hereby incorporated by reference in its entirety.


FIELD OF THE DESCRIPTION

The present description relates to earth moving operations. More specifically, the present description relates to an earth grading control system.


BACKGROUND

There are many different types of work machines. Some such work machines include agricultural machines, construction machines, forestry machines, turf management machines, among others. Many of these pieces of mobile equipment have mechanisms that are controlled by the operator in performing operations. For instance, a construction machine can have multiple different mechanical, electrical, hydraulic, pneumatic and electro-mechanical subsystems, among others, all of which can be operated by the operator to work a site. Various types of work machines use a blade or bucket to grade a site. For example, an excavator has a bucket that can be moved to scoop, or otherwise rotated, to remove material from a surface (e.g., the ground). Similarly, a grader has a blade that is movable to change its height and angle. A crawler is generally a tracked machine and has a blade that can be raised or lowered as well as rotated. These are simply examples of work machines that have a blade or bucket that is movable in multiple degrees of freedom to interact with the work site. Achieving a proper grade is often both the first step of a worksite operation and the last step needed to finish it.


The discussion above is merely provided for general background information and is not intended to be used as an aid in determining the scope of the claimed subject matter.


SUMMARY

A mobile machine includes a powertrain that propels the mobile machine about a worksite and a controllable subsystem configured to affect a surface of the worksite. The mobile machine includes a terrain sensor configured to sense a characteristic of the surface and generate sensor signals indicative of the characteristic of the surface. The mobile machine includes a control system configured to control the controllable subsystem, with a control signal, to affect a portion of the surface. The control system is configured to receive a sensor signal from the terrain sensor after the controllable subsystem has affected the portion of the surface, the sensor signal being indicative of the characteristic of the portion of the surface after the controllable subsystem affects the portion of the surface. The control system controls the mobile machine based on the sensor signal and the control signal.


This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter. The claimed subject matter is not limited to implementations that solve any or all disadvantages noted in the background.





BRIEF DESCRIPTION OF THE DRAWINGS


FIGS. 1A-B are side views showing example mobile machines in example worksite environments.



FIG. 2 is a block diagram showing an example worksite environment.



FIGS. 3A-C are diagrams showing example worksite cross-sections.



FIG. 4 is a flow diagram showing an example machine operation.



FIG. 5 is a block diagram showing the architecture illustrated in FIG. 2 deployed in a remote server architecture.



FIGS. 6-8 show examples of mobile devices that can be used in the architectures shown in the previous FIGS.



FIG. 9 is a block diagram showing one example of a computing environment that can be used in the architecture illustrated in previous FIGS.





DETAILED DESCRIPTION


FIG. 1A is a side view showing an example worksite 100. Worksite 100 includes an operator 150 controlling a mobile machine 102. As shown, mobile machine 102 is a crawler dozer; however, in other examples, machine 102 can be other types of earth moving or grading machines as well, such as an excavator, a scraper, etc. Operator 150 interacts with machine 102 through user interface devices 112. As shown, user interface devices 112 include a screen, a joystick and pedals. In other examples, user interface devices 112 include steering wheels, levers, joysticks, switches, buttons, touch screens, microphones, etc. Operator 150 controls powertrain 104, blade 106 and ripper 108.


As shown, blade 106 is a six-way blade that affects terrain 160 of worksite 100. Blade 106 is used to grade the surface of terrain 160 to a level or other job-specified terrain. Powertrain 104, as shown, includes tracks; however, in other examples, powertrain 104 can include other propulsion and steering mechanisms as well. Ripper 108 is used to rip heavy or compacted ground that may be part of terrain 160 prior to finish grading.


Operator 150 is assisted by an autonomous or semiautonomous control system to control machine 102. For example, sensors 110 sense conditions of terrain 160 and/or machine 102 to assist in automatic or semi-automatic control of machine 102. For example, sensors 110 can include accelerometers, gyroscopes, linear displacement transducers, range scanners, such as lidar or radar, cameras, etc. Of course, sensors 110 can include other sensors as well. In one instance, sensors 110 monitor the height of blade 106 relative to powertrain 104 such that the cutting depth and angle of blade 106 are known. This way, blade 106 can be controlled to grade terrain 160 of worksite 100 to job specifications.


Typically, sensors 110 monitor controllable subsystems of machine 102 (e.g., powertrain 104, blade 106, ripper 108, etc.). However, in an example herein disclosed, sensors 110 also sense terrain 160 of worksite 100 before and after an operation has been completed by blade 106 (or some other controllable subsystem). Sensing terrain 160 of worksite 100 before it is affected allows the control system to know the initial state of worksite 100. Then, sensing the affected area of terrain 160 after it has been affected by machine 102 allows the control system to know how machine 102 acted upon worksite 100. When a different affected terrain is sensed than what the operation was expected to produce, the control system can counter the error made when acting on that portion of worksite 100. For example, blade 106 may be controlled in such a way that two inches of soil are to be removed from worksite 100; however, after blade 106 acts on the surface, it is sensed that three inches were removed from that portion of worksite 100. The control system incorporates this discrepancy data and corrects for it the next time it controls blade 106 to dig two inches, for example by scaling back the command so that it does not cut three inches again. This terrain measurement can also be used to calculate the productivity of machine 102, that is, the amount of earth moved or not moved by machine 102 when completing an operation. FIG. 1B shows an example mobile machine 102, in this case an excavator, in worksite 100. The systems and methods herein described may also apply to excavators and other mobile work machines.
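

As a non-limiting illustration only, the scaling-back described above can be sketched in a few lines of Python. The simple ratio model and the function name are editorial assumptions and are not prescribed by the present description:

    # Hypothetical sketch of the correction described above: the machine is
    # commanded to cut two inches but removes three, so the next two-inch cut
    # is scaled back. The ratio model and all names are illustrative assumptions.

    def corrected_cut_command(desired_depth_in, commanded_depth_in, observed_depth_in):
        """Scale the next blade-depth command by how much the last command overshot."""
        # Ground actually moved per commanded inch on the last pass (3.0 / 2.0 = 1.5).
        gain = observed_depth_in / commanded_depth_in
        # Command less depth so the expected result matches the desired result.
        return desired_depth_in / gain

    # Example from the text: commanded 2 in, observed 3 in, still want a 2 in cut.
    next_command = corrected_cut_command(2.0, 2.0, 3.0)  # approximately 1.33 in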



FIG. 2 is a block diagram showing an example environment 200. Environment 200 is similar to environment 100 in FIGS. 1A-B, and some components are similarly numbered. As shown, environment 200 includes machine 102, remote system 115, user 152 and can include other items as well, as indicated by block 201. Machine 102 includes user interface devices 112, control system 202, processors or controllers 203, sensors 110, datastore 230, controllable subsystems 240 and can include other items as well, as indicated by block 250. Controller 203 can include various computers, processors, servers or other components that can implement mechanical or electronic control of other components of machine 102. For instance, controller 203 can be used to implement control system 202, which sends electrical impulses to actuators 103 that control the various controllable subsystems 240 of machine 102.


Controllable subsystems 240, as shown, include powertrain 104, blade 106 and ripper 108 and their corresponding actuators (e.g., hydraulic cylinders, etc.) 103. However, controllable subsystems 240 can include other items as well, as indicated by block 109. Powertrain 104 includes the main components that generate power and deliver that power to the ground, water, or air to propel machine 102 about worksite 100. For example, powertrain 104 includes the engine, transmission, torque converters, driveshafts, differentials and the final drive elements (drive wheels, continuous tracks, propeller, rotor, jet, etc.). Blade 106 and ripper 108 are controllable subsystems that machine 102 uses to complete a grading operation or otherwise affect the worksite surface. In other examples, other components can be provided as well that affect the worksite surface. For example, where machine 102 is an excavator, a bucket or packer can be provided to affect the surface of the worksite. Of course, these are examples only and other controllable subsystems may be provided as well, as indicated by block 109.


Control system 202 controls the above-mentioned controllable subsystems 240 and other components utilizing various logic components. Logic components can include software, hardware, firmware and/or some combination of electronic and mechanical components. The logic components of control system 202 include control path generator logic 204, terrain prediction logic 206, terrain sensing logic 208, terrain comparison logic 210, control model logic 212, control signal generator logic 222 and can include other items as well, as indicated by block 224. Control system 202 automatically improves the commanded actions of machine 102 by identifying the difference between what was expected to result from a command and what actually resulted from it. In the examples below, the expected result of a command is expressed as predicted terrain characteristics and the actual result is sensed from the affected terrain. However, in other examples, the prediction and sensing can be expressed in other ways as well, for instance as a predicted travel path of a component and the sensed actual path of that component.


Control path generator logic 204 generates control paths to reach a desired outcome on the worksite. For example, a control path includes the driving route of machine 102 and the various states of the controllable subsystems 240 (e.g., blade height and angle) as machine 102 passes over the worksite. For instance, a simple control path example includes driving machine 102 due North for ten meters while blade 106 is two inches lower than the tracks. Of course, a complex job likely includes a significant number of control paths that are more complicated than this provided example.
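

As a non-limiting illustration only, a control path of the kind just described (a drive heading and distance together with blade settings) might be represented with a small data structure such as the following sketch; the field names are editorial assumptions:

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class ControlPathSegment:
        """One segment of a control path: where to drive and how to hold the blade."""
        heading_deg: float      # compass bearing to drive, e.g. 0.0 for due North
        distance_m: float       # distance to travel along that bearing
        blade_offset_m: float   # blade height relative to the tracks (negative = cut)
        blade_angle_deg: float  # blade tilt/angle setting

    # The simple example from the text: drive due North for ten meters with the
    # blade two inches (about 0.05 m) below the tracks.
    simple_path: List[ControlPathSegment] = [
        ControlPathSegment(heading_deg=0.0, distance_m=10.0,
                           blade_offset_m=-0.05, blade_angle_deg=0.0),
    ]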


Terrain prediction logic 206 is configured to predict characteristics of the terrain after completing a given control path on the worksite. For instance, using the above example control path, terrain prediction logic 206 predicts that the terrain as wide as blade 106 will be two inches lower than it was before machine 102 passed over the area and also that a given amount of spillage will be on either side of the path of machine 102. Terrain prediction logic 206 can provide more specific information such as the angle of the terrain, the filling or creating of voids in the worksite surface, etc. Terrain prediction logic 206 can also predict the path that a component of machine 102 should travel (e.g., blade 106).
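

As a non-limiting illustration only, the following sketch shows one simple way such a prediction could be rolled forward over a gridded height map, assuming a uniform cut over the blade swath and crude spillage at the swath edges; the function and its parameters are editorial assumptions, not the prediction model of terrain prediction logic 206:

    import numpy as np

    def predict_terrain(heightmap, row_slice, col_slice, cut_depth_m, spill_m=0.0):
        """Predict the surface after one pass: lower the swath covered by the blade
        by the commanded cut depth and (optionally) pile spillage along both edges.
        Soil flow, compaction and carry-over between cells are ignored here.
        """
        predicted = heightmap.copy()
        predicted[row_slice, col_slice] -= cut_depth_m
        if spill_m > 0.0:
            # Crude spillage: raise the columns just outside the swath.
            left = max(col_slice.start - 1, 0)
            right = min(col_slice.stop, predicted.shape[1] - 1)
            predicted[row_slice, left] += spill_m
            predicted[row_slice, right] += spill_m
        return predicted

    # Example: a flat grid, one pass cutting 0.05 m over a three-cell-wide swath.
    surface = np.zeros((20, 10))
    predicted = predict_terrain(surface, slice(0, 10), slice(3, 6),
                                cut_depth_m=0.05, spill_m=0.01)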


Terrain sensing logic 208 receives sensor signals from terrain sensors 116 and generates signals or data indicative of characteristics of the terrain. For example, terrain sensing logic 208 can receive sensor signals and create a point cloud or three-dimensional model of the worksite surface.
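

As a non-limiting illustration only, the following sketch grids a range-scanned point cloud into a 2.5-dimensional height map of the kind terrain sensing logic 208 could produce; noise filtering and occlusion handling are omitted and all names are editorial assumptions:

    import numpy as np

    def points_to_heightmap(points, cell_size_m, bounds):
        """Grid a point cloud (N x 3 array of x, y, z) into a height map by
        keeping the highest return in each cell. Illustrative only.
        """
        (x_min, x_max), (y_min, y_max) = bounds
        cols = int(np.ceil((x_max - x_min) / cell_size_m))
        rows = int(np.ceil((y_max - y_min) / cell_size_m))
        heightmap = np.full((rows, cols), np.nan)
        for x, y, z in points:
            c = int((x - x_min) / cell_size_m)
            r = int((y - y_min) / cell_size_m)
            if 0 <= r < rows and 0 <= c < cols:
                if np.isnan(heightmap[r, c]) or z > heightmap[r, c]:
                    heightmap[r, c] = z
        return heightmap

    # Example: three lidar returns gridded at 0.5 m resolution over a 2 m x 2 m patch.
    scan = np.array([[0.2, 0.3, 10.1], [0.7, 0.4, 10.0], [1.6, 1.8, 9.8]])
    grid = points_to_heightmap(scan, cell_size_m=0.5, bounds=((0.0, 2.0), (0.0, 2.0)))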


Terrain comparison logic 210 receives signals from terrain prediction logic 206 and terrain sensing logic 208 and compares the terrain characteristics. For instance, terrain comparison logic 210 receives a point cloud from terrain prediction logic 206 that corresponds to a predicted surface after machine 102 completes a control path on it and compares it to a point cloud received from terrain sensing logic 208 that corresponds to the actual terrain sensed after machine 102 has acted on it. Terrain comparison logic 210 can then generate signals or data indicative of differences between these two point clouds.
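

As a non-limiting illustration only, the following sketch compares a predicted height map with a sensed height map on the same grid and produces a deviation map together with summary error figures; the specific metrics are editorial assumptions:

    import numpy as np

    def compare_terrain(predicted, sensed):
        """Compare predicted and sensed height maps (same shape, same grid) and
        return a deviation map plus summary numbers the control system could act on.
        """
        deviation = sensed - predicted          # negative = more material removed than predicted
        rms_error_m = float(np.sqrt(np.nanmean(deviation ** 2)))
        max_error_m = float(np.nanmax(np.abs(deviation)))
        return deviation, rms_error_m, max_error_m

    # Example: the machine cut about 0.01 m deeper than predicted over part of the swath.
    predicted = np.zeros((4, 4))
    sensed = predicted.copy()
    sensed[:, 1:3] -= 0.01
    deviation_map, rms, worst = compare_terrain(predicted, sensed)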


Control model logic 212 creates and manages models used by control system 202 to control machine 102 (e.g., control signal generator logic 222 references the model when generating a control signal used to control a controllable subsystem 240). Control model logic 212 includes machine learning logic 214 that receives data over time and learns which inputs generate a given set of outputs. For instance, machine learning logic 214 can include a neural network, decision tree, random forest, and/or other machine learning mechanisms that operate according to different protocols.


Model generator logic 216 utilizes machine learning logic 214 or other means to generate a control model. For instance, model generator logic 216 can be preprogrammed at manufacture to control machine 102 using default or standard controls. Alternatively, model generator logic 216 can gather various data from terrain comparison logic 210 over a given amount of time before it generates a control model.


Model modification logic 218 receives data from terrain comparison logic 210 or other components to modify (e.g., improve) a control model generated by model generator logic 216. For instance, as long as data is collected, model modification logic 218 can use this data to improve the control model generated by model generator logic 216. Of course, control model logic 212 can include other items as well, as indicated by block 220.
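

As a non-limiting illustration only, the following sketch shows a deliberately small control model, a single gain relating commanded cut depth to achieved cut depth, being modified after each pass from comparison data; the exponential-smoothing update is an editorial assumption, and the present description leaves the form of the control model open:

    class DepthGainModel:
        """Toy control model: one gain (observed depth per commanded depth)."""

        def __init__(self, initial_gain=1.0, learning_rate=0.3):
            self.gain = initial_gain
            self.learning_rate = learning_rate

        def update(self, commanded_depth_m, observed_depth_m):
            """Blend the newly observed gain into the running estimate."""
            observed_gain = observed_depth_m / commanded_depth_m
            self.gain += self.learning_rate * (observed_gain - self.gain)

        def command_for(self, desired_depth_m):
            """Depth to command so the expected result equals the desired result."""
            return desired_depth_m / self.gain

    model = DepthGainModel()
    model.update(commanded_depth_m=0.05, observed_depth_m=0.075)  # first pass overshot
    next_command = model.command_for(0.05)                        # scaled-back command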


Control signal generator logic 222 generates control signals, based on a control model, that are sent to control controllable subsystems 240 to complete an action. For instance, a control signal can include electrical or mechanical impulses that control the actuator 103 of a controllable subsystem 240.


Productivity determination logic 223 determines the productivity of machine 102. For instance, productivity determination logic 223 determines the productivity as the amount of earth moved by machine 102 over a given time, per pass, per operator, per shift, etc. Productivity metric generator 225 generates a productivity metric based on the productivity of machine 102.
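

As a non-limiting illustration only, one plausible productivity figure is the volume of material removed between the before and after height maps, as sketched below; the calculation and names are editorial assumptions:

    import numpy as np

    def productivity_volume_m3(before, after, cell_area_m2):
        """Volume of material removed between two height maps of the same grid.
        Only lowered cells are counted; this is one way to turn the before/after
        terrain measurements into a productivity figure.
        """
        removed = np.clip(before - after, a_min=0.0, a_max=None)
        return float(np.nansum(removed) * cell_area_m2)

    # Example: 0.05 m taken off a 10 x 10 grid of 1 m cells -> 5 cubic meters moved.
    before = np.zeros((10, 10))
    after = before - 0.05
    volume = productivity_volume_m3(before, after, cell_area_m2=1.0)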


Sensors 110 include machine sensors 114 that sense the various characteristics of machine 102. For example, machine sensors 114 can include linear displacement transducers, strain gauges, potentiometers, odometers, thermometers, etc. Sensors 110 also include terrain sensors 116 that sense characteristics of the terrain. For instance, terrain sensors 116 can include range scanning sensors such as lidar, radar, sonar, etc. Terrain sensors 116 can also include image capturing devices such as a camera or stereo camera systems that can detect characteristics (e.g., heights, contours, type, temperature, etc.) of the terrain. Other examples of terrain sensors can include soil type sensors, moisture sensors, thermal imaging devices, etc.



FIG. 3A is a diagram showing example cross-sectional terrain modifications. Sectional line 302 represents the final finish grade of the terrain. Sectional line 304 represents the initial terrain that is sensed by a terrain sensor 116. The difference between the initial terrain represented by line 304 and finished grade represented by line 302 is so large that it requires multiple passes of machine 102 to complete.


Control path generator logic 204 generates a first set of control paths configured to guide machine 102 over the terrain represented by line 304. This first set of control paths is received by terrain prediction logic 206, which predicts that executing the control paths will result in terrain represented by sectional line 306 (or predicts that line 306 represents the path blade 106 will travel). However, after machine 102 passes over this portion of the worksite, it is sensed by terrain sensor 116 that the terrain surface is actually represented by sectional line 308 (or it is otherwise determined that line 308 represents the path blade 106 actually traveled). The difference between these two, the predicted surface represented by line 306 and the actual surface represented by line 308, is represented by shaded portion 310.


Shaded portion 310 is divided into portions 310-1, 310-2, 310-3, 310-4 and 310-5. Portions 310-4 and 310-5 are points of intersection where the predicted surface and actual surface match, while portions 310-1, 310-2 and 310-3 are portions where the result of the generated control path differed from the expected result. These portions of deviation can be used by control model logic 212 to generate or modify the control model used to control controllable subsystems 240, so as to reduce the deviation between predicted surfaces and resultant surfaces on subsequent executions of control paths.
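

As a non-limiting illustration only, the following sketch splits a predicted-versus-sensed cross-section into contiguous deviating portions, analogous to portions 310-1, 310-2 and 310-3, and estimates the area of each; the tolerance and names are editorial assumptions:

    import numpy as np

    def deviation_segments(predicted_z, actual_z, dx_m=1.0, tol_m=0.005):
        """Return (start index, end index, signed area) for each contiguous run of
        samples where the sensed cross-section deviates from the predicted one by
        more than a tolerance. Illustrative only.
        """
        deviation = np.asarray(actual_z) - np.asarray(predicted_z)
        deviating = np.abs(deviation) > tol_m
        segments, start = [], None
        for i, flag in enumerate(deviating):
            if flag and start is None:
                start = i
            elif not flag and start is not None:
                segments.append((start, i - 1, float(deviation[start:i].sum() * dx_m)))
                start = None
        if start is not None:
            segments.append((start, len(deviation) - 1, float(deviation[start:].sum() * dx_m)))
        return segments

    # Example: the middle of a 10 m pass is cut 0.02 m deeper than predicted.
    predicted = np.zeros(11)
    actual = predicted.copy()
    actual[3:7] -= 0.02
    portions = deviation_segments(predicted, actual, dx_m=1.0)  # one deviating portion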



FIG. 3B is a diagram showing example cross-sectional terrain modifications. Sectional line 338 represents the initial terrain that is sensed by a terrain sensor 116, which corresponds to line 308 of FIG. 3A. Control path generator logic 204 generates a set of control paths corresponding to a second pass across this portion of the worksite. Terrain prediction logic 206 predicts that this second pass will result in a surface represented by line 334. As machine 102 passes over this portion, the terrain is sensed again and terrain sensing logic 208 determines that the affected surface is represented by line 332. The difference between these two, the predicted surface represented by line 334 and the actual surface represented by line 332, is represented by shaded portion 336.


Shaded portion 336 represents a deviation between the predicted and actual terrain cross-section. As can be seen, shaded portion 336 in FIG. 3B is substantially smaller than shaded portion 310 in FIG. 3A. This shows that the second pass was more accurate as the deviation from the first pass in FIG. 3A was added into the control model by control model logic 212 to refine the generation of control signals when controlling machine 102. Deviations are recorded and the inputs that created these deviations are used to modify the model or control path generator logic 204 such that the deviation between the predicted and affected terrain (e.g., the difference between the expected result of a commanded operation, and the actual result) is reduced.



FIG. 3C is a diagram showing example cross-sectional terrain modifications. Sectional line 368 represents the initial terrain that is sensed by a terrain sensor 116, which corresponds to line 332 of FIG. 3B. Control path generator logic 204 generates a set of control paths that correspond to a third pass across this portion of the worksite. Terrain prediction logic 206 predicts that this third pass will result in a surface represented by line 362. As machine 102 passes over this portion the terrain is sensed again and terrain sensing logic 208 determines that the affected surface is represented by line 364. As can be seen, the predicted and actual cross-sections of the surface are nearly identical. This is because the deviations from both FIGS. 3A and 3B have been compiled into the control model and the inputs (e.g., errors in control signal, variations, miscalibrations, etc.) that cause the deviations have been reduced. This process can be continually applied after each execution and sensing of a control path such that most or all deviations can be controlled for.



FIG. 4 is a flow diagram showing an example machine operation 400. Operation 400 begins at block 402 where the target terrain of a worksite is generated or retrieved. As indicated by block 404, the target terrain can be received from worksite specifications. As indicated by block 406, the target terrain can be generated or retrieved in other ways as well. For example, the terrain may need to be flattened to a given elevation and the target terrain is automatically generated as a plane. More complex target terrains can be generated as well.


Operation 400 proceeds at block 410 where the initial terrain of the worksite is detected. As indicated by block 412, the terrain can be sensed via range scanning, e.g., using a system that scans distances of points and plots the various points as a point cloud. Often range scanning is completed via lidar, radar, sonar or some other device. As indicated by block 414, the terrain of the initial worksite can be scanned via image analysis. For example, one or more cameras capture an image of the worksite and the image or images are analyzed to retrieve characteristics of the worksite terrain. Of course, the worksite terrain can be detected in other ways as well, as indicated by block 416.


Operation 400 proceeds at block 420 where control paths are generated to grade the worksite from the initially sensed terrain of block 410 to the target terrain of block 402. The control paths can be generated based on machine characteristics, as indicated by block 422. For example, some relevant machine characteristics include the blade width, blade size, blade angle, machine torque, machine weight, machine speed, max cutting depth, etc. The control paths can be generated based on worksite characteristics, as indicated by block 424. For example, some relevant worksite characteristics include soil type, moisture, density, standard deviation of height, etc. The control paths can be generated based on a combination of machine and worksite characteristics. For instance, a given machine with a given torque may be able to take deeper cuts based on the soil type or characteristics.
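

As a non-limiting illustration only, the following sketch shows how a per-pass cut limit derived from machine and worksite characteristics could determine the number of passes a set of control paths must include; the derating relationship is an editorial assumption, not a formula from the present description:

    import math

    def plan_passes(cut_required_m, max_cut_per_pass_m, soil_density_factor=1.0):
        """Rough planning sketch: passes needed to remove a given depth when the
        per-pass cut is limited by the machine (blade size, torque) and derated
        for heavier soil. The derating factor is a stand-in for a real model.
        """
        effective_cut = max_cut_per_pass_m / soil_density_factor
        passes = math.ceil(cut_required_m / effective_cut)
        return passes, effective_cut

    # Example: 0.30 m to remove, 0.08 m nominal cut, dense soil derates cuts by 1.5x.
    n_passes, per_pass = plan_passes(0.30, 0.08, soil_density_factor=1.5)
    # -> about 6 passes of roughly 0.053 m each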


Operation 400 proceeds at block 430 where machine 102 is actuated through one or more control paths. For example, machine 102 can be controlled to automatically set blade 106 at a certain digging depth and to drive at a given bearing for a given distance. In another example, the bearing, blade depth and/or other settings can be output to operator 150, who can manually set them.


Operation 400 proceeds at block 440 where the terrain affected by machine 102 in block 430 is sensed. As indicated by block 442, the terrain can be sensed via range scanning, e.g., using a system that scans distances and plots the various points as a point cloud. Often range scanning is completed via lidar, radar, sonar or some other device. As indicated by block 444, the terrain of the initial worksite can be scanned via image analysis. For example, one or more cameras capture an image of the worksite and the image or images are analyzed to retrieve characteristics of the worksite terrain. Of course, the worksite terrain can be detected in other ways as well, as indicated by block 446.


Operation 400 proceeds at block 450 where a difference between the initial terrain sensed in block 410 and the affected terrain sensed in block 440 is determined. For example, the difference can be generated by comparing a point cloud from block 440 and a point cloud from block 410. The difference is also indicative of productivity, as indicated by block 452. As indicated by block 454, a productivity metric can be generated, based at least in part on the difference.


Operation 400 proceeds at block 460 where model generator logic 216 generates a control model, or the difference data from block 450 is provided to machine learning logic 214, which identifies how the existing control model is to be changed, as indicated by block 462, and model modification logic 218 modifies the existing control model to improve its accuracy. The control model (or modified control model) is used by control signal generator logic 222 to generate control signals based on an intended result, learned control errors, positively reinforced control successes, control variations based on environmental or machine factors, etc. As indicated by block 462, a machine learning algorithm can be used to generate or modify the control model. In one example, in order to improve the model's performance, machine learning logic 214 uses a machine learning algorithm that stores inputs and factors and the results that these inputs and factors produce. Model modification logic 218 can then modify the control model so that, the next time the control system faces a similar set of inputs and factors, the result can be predicted and, if the past results were not as desired, they can be avoided.
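

As a non-limiting illustration only, the following sketch stores commanded and achieved cut depths as (input, result) pairs, fits a simple linear control model to them by least squares, and inverts that model when generating the next command; the linear form is an editorial assumption, and a neural network, decision tree or random forest could be used instead, as noted above:

    import numpy as np

    # Stored (input, result) pairs collected over several passes.
    commanded = np.array([0.02, 0.04, 0.06, 0.08])     # meters commanded
    achieved  = np.array([0.03, 0.055, 0.085, 0.115])  # meters actually cut

    # Least-squares fit of a toy control model: achieved ~= a * commanded + b.
    A = np.vstack([commanded, np.ones_like(commanded)]).T
    (a, b), *_ = np.linalg.lstsq(A, achieved, rcond=None)

    def command_for(desired_depth_m):
        """Invert the fitted model to choose a command expected to yield the desired cut."""
        return (desired_depth_m - b) / a

    next_command = command_for(0.05)  # command the model expects to cut 0.05 m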


Operation 400 proceeds at block 470 where it is determined if there are more control paths to complete. If not, then operation 400 ends. If so, then operation 400 proceeds at block 480. At block 480, one or more additional control paths are executed using the control model generated or modified at block 460. The operation 400 proceeds again at block 440.


The present description thus describes a system that automatically improves the commanded actions of the machine by identifying the difference between what was expected to result from a command and what actually resulted from it. This is different than comparing the current terrain to the target finished terrain. Instead, the present description improves machine control so the individual machine control operations more closely conform to what is expected.


The present discussion has mentioned processors and servers. In one example, the processors and servers include computer processors with associated memory and timing circuitry, not separately shown. They are functional parts of the systems or devices to which they belong and are activated by and facilitate the functionality of the other components or items in those systems.


Also, a number of user interface displays have been discussed. They can take a wide variety of different forms and can have a wide variety of different user actuatable input mechanisms disposed thereon. For instance, the user actuatable input mechanisms can be text boxes, check boxes, icons, links, drop-down menus, search boxes, etc. They can also be actuated in a wide variety of different ways. For instance, they can be actuated using a point and click device (such as a track ball or mouse). They can be actuated using hardware buttons, switches, a joystick or keyboard, thumb switches or thumb pads, etc. They can also be actuated using a virtual keyboard or other virtual actuators. In addition, where the screen on which they are displayed is a touch sensitive screen, they can be actuated using touch gestures. Also, where the device that displays them has speech recognition components, they can be actuated using speech commands.


A number of data stores have also been discussed. It will be noted they can each be broken into multiple data stores. All can be local to the systems accessing them, all can be remote, or some can be local while others are remote. All of these configurations are contemplated herein.


Also, the figures show a number of blocks with functionality ascribed to each block. It will be noted that fewer blocks can be used so the functionality is performed by fewer components. Also, more blocks can be used with the functionality distributed among more components.


It will be noted that the above discussion has described a variety of different systems, components and/or logic. It will be appreciated that such systems, components and/or logic can be comprised of hardware items (such as processors and associated memory, or other processing components, some of which are described below) that perform the functions associated with those systems, components and/or logic. In addition, the systems, components and/or logic can be comprised of software that is loaded into a memory and is subsequently executed by a processor or server, or other computing component, as described below. The systems, components and/or logic can also be comprised of different combinations of hardware, software, firmware, etc., some examples of which are described below. These are only some examples of different structures that can be used to form the systems, components and/or logic described above. Other structures can be used as well.



FIG. 5 is a block diagram of machine 102, shown in FIG. 2, except that it communicates with elements in a remote server architecture 500. In one example, remote server architecture 500 can provide computation, software, data access, and storage services that do not require end-user knowledge of the physical location or configuration of the system that delivers the services. In various examples, remote servers can deliver the services over a wide area network, such as the internet, using appropriate protocols. For instance, remote servers can deliver applications over a wide area network and they can be accessed through a web browser or any other computing component. Software or components shown in FIG. 2 as well as the corresponding data, can be stored on servers at a remote location. The computing resources in a remote server environment can be consolidated at a remote data center location or they can be dispersed. Remote server infrastructures can deliver services through shared data centers, even though they appear as a single point of access for the user. Thus, the components and functions described herein can be provided from a remote server at a remote location using a remote server architecture. Alternatively, they can be provided from a conventional server, or they can be installed on client devices directly, or in other ways.


In the example shown in FIG. 5, some items are similar to those shown in FIG. 2 and they are similarly numbered. FIG. 5 specifically shows that control system 202 and datastore 230 can be located at a remote server location 502. Therefore, machine 102 accesses those systems through remote server location 502.



FIG. 5 also depicts another example of a remote server architecture. FIG. 5 shows that it is also contemplated that some elements of FIG. 2 are disposed at remote server location 502 while others are not. By way of example, datastore 230 or control system 202 can be disposed at a location separate from location 502, and accessed through the remote server at location 502. Regardless of where they are located, they can be accessed directly by machine 102, through a network (either a wide area network or a local area network), they can be hosted at a remote site by a service, or they can be provided as a service, or accessed by a connection service that resides in a remote location. Also, the data can be stored in substantially any location and intermittently accessed by, or forwarded to, interested parties. For instance, physical carriers can be used instead of, or in addition to, electromagnetic wave carriers. In such an example, where cell coverage is poor or nonexistent, another mobile machine (such as a fuel truck) can have an automated information collection system. As mobile machine 102 comes close to the fuel truck for fueling, the system automatically collects the information from machine 102 using any type of ad-hoc wireless connection. The collected information can then be forwarded to the main network as the fuel truck reaches a location where there is cellular coverage (or other wireless coverage). For instance, the fuel truck may enter a covered location when traveling to fuel other machines or when at a main fuel storage location. All of these architectures are contemplated herein. Further, the information can be stored on the mobile machine until the mobile machine enters a covered location. The mobile machine, itself, can then send the information to the main network.


It will also be noted that the elements of FIG. 2, or portions of them, can be disposed on a wide variety of different devices. Some of those devices include servers, desktop computers, laptop computers, tablet computers, or other mobile devices, such as palm top computers, cell phones, smart phones, multimedia players, personal digital assistants, etc.



FIG. 6 is a simplified block diagram of one illustrative example of a handheld or mobile computing device that can be used as a user's or client's handheld device 16, in which the present system (or parts of it) can be deployed. For instance, a mobile device can be deployed in the operator compartment of machine 102 for use in generating, processing, or displaying blade 106 control settings. FIGS. 7-8 are examples of handheld or mobile devices.



FIG. 6 provides a general block diagram of the components of a client device 16 that can run some components shown in FIG. 2, that interacts with them, or both. In the device 16, a communications link 13 is provided that allows the handheld device to communicate with other computing devices and, under some examples, provides a channel for receiving information automatically, such as by scanning. Examples of communications link 13 include allowing communication through one or more communication protocols, such as wireless services used to provide cellular access to a network, as well as protocols that provide local wireless connections to networks.


Under other examples, applications can be received on a removable Secure Digital (SD) card that is connected to an interface 15. Interface 15 and communication links 13 communicate with a processor 17 (which can also embody controller 203 from FIG. 2) along a bus 19 that is also connected to memory 21 and input/output (I/O) components 23, as well as clock 25 and location system 27.


I/O components 23, in one example, are provided to facilitate input and output operations. I/O components 23 for various examples of the device 16 can include input components such as buttons, touch sensors, optical sensors, microphones, touch screens, proximity sensors, accelerometers and orientation sensors, and output components such as a display device, a speaker, and/or a printer port. Other I/O components 23 can be used as well.


Clock 25 illustratively comprises a real time clock component that outputs a time and date. It can also, illustratively, provide timing functions for processor 17.


Location system 27 illustratively includes a component that outputs a current geographical location of device 16. This can include, for instance, a global positioning system (GPS) receiver, a LORAN system, a dead reckoning system, a cellular triangulation system, or other positioning system. It can also include, for example, mapping software or navigation software that generates desired maps, navigation routes and other geographic functions.


Memory 21 stores operating system 29, network settings 31, applications 33, application configuration settings 35, data store 37, communication drivers 39, and communication configuration settings 41. Memory 21 can include all types of tangible volatile and non-volatile computer-readable memory devices. It can also include computer storage media (described below). Memory 21 stores computer readable instructions that, when executed by processor 17, cause the processor to perform computer-implemented steps or functions according to the instructions. Processor 17 can be activated by other components to facilitate their functionality as well.



FIG. 7 shows one example in which device 16 is a tablet computer 600. In FIG. 7, computer 600 is shown with user interface display screen 602. Screen 602 can be a touch screen or a pen-enabled interface that receives inputs from a pen or stylus. It can also use an on-screen virtual keyboard. Of course, it might also be attached to a keyboard or other user input device through a suitable attachment mechanism, such as a wireless link or USB port, for instance. Computer 600 can also illustratively receive voice inputs as well.



FIG. 8 is similar to FIG. 7 except that the phone is a smart phone 71. Smart phone 71 has a touch sensitive display 73 that displays icons or tiles or other user input mechanisms 75. Mechanisms 75 can be used by a user to run applications, make calls, perform data transfer operations, etc. In general, smart phone 71 is built on a mobile operating system and offers more advanced computing capability and connectivity than a feature phone.


Note that other forms of the devices 16 are possible.



FIG. 9 is one example of a computing environment in which elements of FIG. 2, or parts of it, (for example) can be deployed. With reference to FIG. 9, an example system for implementing some examples includes a general-purpose computing device in the form of a computer 810 programmed to operate as discussed above. Components of computer 810 may include, but are not limited to, a processing unit 820 (which can comprise controller 203), a system memory 830, and a system bus 821 that couples various system components including the system memory to the processing unit 820. The system bus 821 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. Memory and programs described with respect to FIG. 2 can be deployed in corresponding portions of FIG. 9.


Computer 810 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 810 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media is different from, and does not include, a modulated data signal or carrier wave. It includes hardware storage media including both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 810. Communication media may embody computer readable instructions, data structures, program modules or other data in a transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.


The system memory 830 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 831 and random access memory (RAM) 832. A basic input/output system 833 (BIOS), containing the basic routines that help to transfer information between elements within computer 810, such as during start-up, is typically stored in ROM 831. RAM 832 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 820. By way of example, and not limitation, FIG. 9 illustrates operating system 834, application programs 835, other program modules 836, and program data 837.


The computer 810 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only, FIG. 9 illustrates a hard disk drive 841 that reads from or writes to non-removable, nonvolatile magnetic media, and an optical disk drive 855 that reads from or writes to a removable, nonvolatile optical disk 856. The hard disk drive 841 is typically connected to the system bus 821 through a non-removable memory interface, such as interface 840, and optical disk drive 855 is typically connected to the system bus 821 by a removable memory interface, such as interface 850.


Alternatively, or in addition, the functionality described herein can be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (e.g., ASICs), Application-specific Standard Products (e.g., ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), etc.


The drives and their associated computer storage media discussed above and illustrated in FIG. 9, provide storage of computer readable instructions, data structures, program modules and other data for the computer 810. In FIG. 9, for example, hard disk drive 841 is illustrated as storing operating system 844, application programs 845, other program modules 846, and program data 847. Note that these components can either be the same as or different from operating system 834, application programs 835, other program modules 836, and program data 837.


A user may enter commands and information into the computer 810 through input devices such as a keyboard 862, a microphone 863, and a pointing device 861, such as a mouse, trackball or touch pad. Other input devices (not shown) may include a joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 820 through a user input interface 860 that is coupled to the system bus, but may be connected by other interface and bus structures. A visual display 891 or other type of display device is also connected to the system bus 821 via an interface, such as a video interface 890. In addition to the monitor, computers may also include other peripheral output devices such as speakers 897 and printer 896, which may be connected through an output peripheral interface 895.


The computer 810 is operated in a networked environment using logical connections (such as a controller area network—CAN, local area network—LAN, or wide area network—WAN) to one or more remote computers, such as a remote computer 880.


When used in a LAN networking environment, the computer 810 is connected to the LAN 871 through a network interface or adapter 870. When used in a WAN networking environment, the computer 810 typically includes a modem 872 or other means for establishing communications over the WAN 873, such as the Internet. In a networked environment, program modules may be stored in a remote memory storage device. FIG. 9 illustrates, for example, that remote application programs 885 can reside on remote computer 880.


It is noted that while grading machines have been particularly discussed with respect to the examples described herein, other machines can also be implemented with said examples. Thus, the present disclosure is not limited to use of the systems and processes discussed with merely grading machines. They can be used with other machines as well, some of which are mentioned above.


It should also be noted that the different examples described herein can be combined in different ways. That is, parts of one or more examples can be combined with parts of one or more other examples. All of this is contemplated herein.


Example 1 is a method of controlling a mobile machine, the method comprising


obtaining an initial terrain sensor signal from a terrain sensor, the initial terrain sensor signal indicative of a characteristic of a surface of a worksite;


generating a terrain prediction of the characteristic of the surface of the worksite to be observed after one or more controllable subsystems execute a control path to affect the surface of the worksite;


sending control signals to the one or more controllable subsystems to execute the control path;


obtaining a second terrain sensor signal from the terrain sensor, the second terrain sensor signal being indicative of a characteristic of the surface of the worksite after the one or more controllable subsystems affected the surface of the worksite; and


controlling the mobile machine based on a comparison of the terrain prediction with the second terrain sensor signal.


Example 2 is the method of any or all previous examples, wherein sending control signals to the one or more controllable subsystems to execute the control path comprises sending a set of controls to a blade actuator to move a blade position to grade the surface of the worksite.


Example 3 is the method of any or all previous examples, wherein controlling the mobile machine based on the comparison of the terrain prediction with the second terrain sensor signal comprises sending a second set of control signals to the one or more controllable subsystems to execute a second control path.


Example 4 is the method of any or all previous examples, further comprising:


generating a subsequent terrain prediction of the characteristic of the surface of the worksite to be observed after the one or more controllable subsystems execute the second control path;


obtaining a third terrain sensor signal from the terrain sensor, the third terrain sensor signal being indicative of a characteristic of the surface of the worksite after the one or more controllable subsystems affected the surface of the worksite while executing the second control path; and


controlling the mobile machine based on a comparison of the subsequent prediction with the third sensor signal.


Example 5 is the method of any or all previous examples, further comprising generating a productivity metric based on the initial terrain sensor signal and the second terrain sensor signal and wherein controlling the mobile machine comprises controlling the mobile machine based on the productivity metric.


Example 6 is the method of any or all previous examples, wherein obtaining the second sensor signal from the terrain sensor comprises: range scanning the surface of the worksite.


Example 7 is a mobile machine comprising:


a powertrain that propels the mobile machine about a worksite;


a controllable subsystem configured to affect a surface of the worksite;


a terrain sensor configured to sense a characteristic of the surface and generate terrain sensor signals indicative of the characteristic of the surface; and


a control system configured to:


control the controllable subsystem, with a control signal, to affect a portion of the surface;


receive a sensor signal from the terrain sensor after the controllable subsystem has affected the portion of the surface, the sensor signal being indicative of the characteristic of the portion of the surface after the controllable subsystem affects the portion of the surface; and


control the mobile machine based on the terrain sensor signal and control signal.


Example 8 is the mobile machine of any or all previous examples, wherein the control system comprises:


terrain prediction logic that generates a terrain prediction indicative of a predicted characteristic of the portion of the surface to be observed after the controllable subsystem affects the portion of the surface and wherein the control system controls the mobile machine based on the terrain prediction.


Example 9 is the mobile machine of any or all previous examples, wherein the control system comprises:


terrain comparison logic that compares the terrain prediction to the terrain sensor signal and generates a comparison signal indicative of the comparison and wherein the control system controls the mobile machine based on the comparison signal.


Example 10 is the mobile machine of any or all previous examples, wherein the control system comprises:


control model logic that generates a control model based on the comparison signal and wherein the control system controls the mobile machine based on the control model.


Example 11 is the mobile machine of any or all previous examples, wherein the control system controls the controllable subsystem to affect the surface of the worksite a second time, the terrain prediction logic generates a second terrain prediction indicative of a predicted characteristic of the portion of the surface to be observed after the controllable subsystem affects the portion of the surface the second time, the terrain comparison logic receives the second terrain prediction and a second terrain sensor signal from the terrain sensor after the controllable subsystem has affected the portion of the surface the second time, the second terrain sensor signal indicative of the characteristic of the portion of the surface after the controllable subsystem affects the portion of the surface the second time, and the terrain comparison logic compares the second terrain prediction and the second terrain sensor signal and generates a second comparison signal indicative of the comparison, the control system being configured to control the mobile machine based on the second comparison signal.


Example 12 is the mobile machine of any or all previous examples, wherein the control model logic is configured to modify the control model based on the second terrain comparison.


Example 13 is the mobile machine of any or all previous examples, wherein the control model logic comprises machine learning logic that generates or modifies the control model.


Example 14 is the mobile machine of any or all previous examples, wherein the control model comprises a neural network.


Example 15 is the mobile machine of any or all previous examples, wherein the control system is configured to generate a productivity metric based on the sensor signal and control the mobile machine based on the productivity metric.


Example 16 is the mobile machine of any or all previous examples, wherein the mobile machine comprises a crawler.


Example 17 is the mobile machine of any or all previous examples, wherein the mobile machine comprises an excavator.


Example 18 is a control system for a mobile machine comprising:


control path generator logic that generates a control path for the mobile machine;


terrain prediction logic that generates a terrain prediction indicative of a predicted effect on the terrain by executing the control path;


terrain sensing logic that receives a terrain sensor signal and determines a characteristic of the terrain after the control path is executed, based on the terrain sensor signal; and


terrain comparison logic that compares the characteristic of the terrain after the control path is executed with the terrain prediction and generates a comparison signal indicative of the comparison, wherein the control system controls the mobile machine based on the comparison signal.


Example 19 is the control system of any or all previous examples, further comprising:


control model logic that generates a control model based on the comparison signal and wherein the control system controls the mobile machine based on the control model.


Example 20 is the control system of any or all previous examples, wherein the control path generator logic generates a second control path for the mobile machine, the terrain prediction logic generates a second terrain prediction indicative of a predicted effect on the terrain by executing the second control path, the terrain sensing logic receives a second terrain sensor signal and determines a characteristic of the terrain after the second control path is executed based on the second terrain sensor signal, the terrain comparison logic compares the characteristic of the terrain after the second control path is executed with the second terrain prediction and generates a second comparison signal indicative of the second comparison, and the control model logic modifies the control model based on the second comparison signal; and


wherein the control system controls the mobile machine based on the modified control model.


Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims
  • 1. A method of controlling a mobile machine, the method comprising: obtaining an initial terrain sensor signal from a terrain sensor, the initial terrain sensor signal indicative of a characteristic of a surface of a worksite; generating a terrain prediction of the characteristic of the surface of the worksite to be observed after one or more controllable subsystems execute a control path to affect the surface of the worksite; sending control signals to the one or more controllable subsystems to execute the control path; obtaining a second terrain sensor signal from the terrain sensor, the second terrain sensor signal being indicative of a characteristic of the surface of the worksite after the one or more controllable subsystems affected the surface of the worksite; and controlling the mobile machine based on a comparison of the terrain prediction with the second terrain sensor signal.
  • 2. The method of claim 1, wherein sending control signals to the one or more controllable subsystems to execute the control path comprises sending a set of controls to a blade actuator to move a blade position to grade the surface of the worksite.
  • 3. The method of claim 1, wherein controlling the mobile machine based on the comparison of the terrain prediction with the second terrain sensor signal comprises sending a second set of control signals to the one or more controllable subsystems to execute a second control path.
  • 4. The method of claim 3, further comprising: generating a subsequent terrain prediction of the characteristic of the surface of the worksite to be observed after the one or more controllable subsystems execute the second control path; obtaining a third terrain sensor signal from the terrain sensor, the third terrain sensor signal being indicative of a characteristic of the surface of the worksite after the one or more controllable subsystems affected the surface of the worksite while executing the second control path; and controlling the mobile machine based on a comparison of the subsequent prediction with the third sensor signal.
  • 5. The method of claim 3, further comprising generating a productivity metric based on the initial terrain sensor signal and the second terrain sensor signal and wherein controlling the mobile machine comprises controlling the mobile machine based on the productivity metric.
  • 6. The method of claim 1, wherein obtaining the second sensor signal from the terrain sensor comprises: range scanning the surface of the worksite.
  • 7. A mobile machine comprising: a powertrain that propels the mobile machine about a worksite; a controllable subsystem configured to affect a surface of the worksite; a terrain sensor configured to sense a characteristic of the surface and generate terrain sensor signals indicative of the characteristic of the surface; and a control system configured to: control the controllable subsystem, with a control signal, to affect a portion of the surface; receive a sensor signal from the terrain sensor after the controllable subsystem has affected the portion of the surface, the sensor signal being indicative of the characteristic of the portion of the surface after the controllable subsystem affects the portion of the surface; and control the mobile machine based on the terrain sensor signal and control signal.
  • 8. The mobile machine of claim 7, wherein the control system comprises: terrain prediction logic that generates a terrain prediction indicative of a predicted characteristic of the portion of the surface to be observed after the controllable subsystem affects the portion of the surface and wherein the control system controls the mobile machine based on the terrain prediction.
  • 9. The mobile machine of claim 8, wherein the control system comprises: terrain comparison logic that compares the terrain prediction to the terrain sensor signal and generates a comparison signal indicative of the comparison and wherein the control system controls the mobile machine based on the comparison signal.
  • 10. The mobile machine of claim 9, wherein the control system comprises: control model logic that generates a control model based on the comparison signal and wherein the control system controls the mobile machine based on the control model.
  • 11. The mobile machine of claim 10, wherein the control system controls the controllable subsystem to affect the surface of the worksite a second time, the terrain prediction logic generates a second terrain prediction indicative of a predicted characteristic of the portion of the surface to be observed after the controllable subsystem affects the portion of the surface the second time, the terrain comparison logic receives the second terrain prediction and a second terrain sensor signal from the terrain sensor after the controllable subsystem has affected the portion of the surface the second time, the second terrain sensor signal indicative of the characteristic of the portion of the surface after the controllable subsystem affects the portion of the surface the second time, and the terrain comparison logic compares the second terrain prediction and the second terrain sensor signal and generates a second comparison signal indicative of the comparison, the control system being configured to control the mobile machine based on the second comparison signal.
  • 12. The mobile machine of claim 11, wherein the control model logic is configured to modify the control model based on the second terrain comparison.
  • 13. The mobile machine of claim 12, wherein the control model logic comprises machine learning logic that generates or modifies the control model.
  • 14. The mobile machine of claim 10, wherein the control model comprises a neural network.
  • 15. The mobile machine of claim 7, wherein the control system is configured to generate a productivity metric based on the sensor signal and control the mobile machine based on the productivity metric.
  • 16. The mobile machine of claim 7, wherein the mobile machine comprises a crawler.
  • 17. The mobile machine of claim 7, wherein the mobile machine comprises an excavator.
  • 18. A control system for a mobile machine comprising: control path generator logic that generates a control path for the mobile machine; terrain prediction logic that generates a terrain prediction indicative of a predicted effect on the terrain by executing the control path; terrain sensing logic that receives a terrain sensor signal and determines a characteristic of the terrain after the control path is executed, based on the terrain sensor signal; and terrain comparison logic that compares the characteristic of the terrain after the control path is executed with the terrain prediction and generates a comparison signal indicative of the comparison, wherein the control system controls the mobile machine based on the comparison signal.
  • 19. The control system of claim 18, further comprising: control model logic that generates a control model based on the comparison signal and wherein the control system controls the mobile machine based on the control model.
  • 20. The control system of claim 19, wherein the control path generator logic generates a second control path for the mobile machine, the terrain prediction logic generates a second terrain prediction indicative of a predicted effect on the terrain by executing the second control path, the terrain sensing logic receives a second terrain sensor signal and determines a characteristic of the terrain after the second control path is executed based on the second terrain sensor signal, the terrain comparison logic compares the characteristic of the terrain after the second control path is executed with the second terrain prediction and generates a second comparison signal indicative of the second comparison, and the control model logic modifies the control model based on the second comparison signal; and wherein the control system controls the mobile machine based on the modified control model.
Priority Claims (1)
Number Date Country Kind
202021039429 Sep 2020 IN national