WORK MACHINE, REMOTE OPERATION ASSISTING DEVICE, AND ASSISTING SYSTEM

Information

  • Publication Number
    20250215668
  • Date Filed
    December 17, 2024
  • Date Published
    July 03, 2025
Abstract
A work machine includes: a work device configured to change a shape of an object in a work range of the work machine; and a display device configured to recommend, in a visible way, a target work location in the work range of the work machine, based on information about a current shape of the object, the target work location being where the work device changes the shape of the object.
Description
RELATED APPLICATION

The present application claims priority under 35 USC § 119 to Japanese Patent Application No. 2023-223705, filed Dec. 28, 2023, the entire contents of which are incorporated herein by reference.


BACKGROUND
1. Technical Field

The present disclosure generally relates to work machines.


2. Description of Related Art

Functions to assist an operator in operating a work machine have been known heretofore.


SUMMARY

The present disclosure aims to provide a technique for properly assisting a user's operation of a work machine.


To achieve the above object, according to one embodiment of the present disclosure, a work machine is provided. The work machine includes: a work device configured to change a shape of an object in a work range of the work machine; and a display device configured to recommend, in a visible way, a target work location in the work range of the work machine, based on information about a current shape of the object, the target work location being where the work device changes the shape of the object.


According to the above embodiment, it is possible to properly assist a user's operations of a work machine.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a diagram that shows an example of an operation assisting system;



FIG. 2 is a top view of an example of an excavator;



FIG. 3 is a diagram that shows an example structure relating to an excavator's remote operations;



FIG. 4 is a diagram that shows an example structure of an excavator;



FIG. 5 is a diagram that shows an example structure of an information processing device;



FIG. 6 is a functional block diagram that shows a first example functional structure of an operation assisting system;



FIG. 7 is a diagram that shows an example of a work range;



FIG. 8 is a diagram that shows a first example method of displaying recommended work locations for an excavator;



FIG. 9 is a diagram that shows a second example method of displaying recommended work locations for an excavator;



FIG. 10 is a diagram that shows a third example method of displaying recommended work locations for an excavator;



FIG. 11 shows a first example method of processing a suggestion about an excavator's work site;



FIG. 12 shows a second example method of processing a suggestion about an excavator's work site;



FIG. 13 is a functional block diagram that shows a second example functional structure of an operation assisting system;



FIG. 14 shows a third example of processing a suggestion about an excavator's work site;



FIG. 15 is a functional block diagram that shows a third example functional structure of an operation assisting system; and



FIG. 16 is a diagram that shows a fourth example of processing a suggestion about an excavator's work site.





DETAILED DESCRIPTION

For example, according to a machine guidance function of related art, only the distance between the target working surface and the working part of the attachment (for example, a bucket) is presented to the operator. The target working surface refers to the point or plane of reference for the job when the excavator performs excavation. Therefore, for example, an inexperienced operator may not be able to locate the target work location properly. For example, he/she may not be able to decide from which location in the work range excavation should be started, which location is to be excavated next, and so forth.
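As an illustrative sketch only (not part of the disclosed embodiment; the function name and the assumption of a planar target working surface are hypothetical), the distance that such a machine guidance function presents can be computed as the signed distance from the bucket's working part to the target working surface:

```python
import numpy as np

def distance_to_target_surface(bucket_tip, plane_point, plane_normal):
    """Signed distance from the bucket's working part to a planar target
    working surface (positive on the normal's side of the plane)."""
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)  # unit normal of the target working surface
    offset = np.asarray(bucket_tip, dtype=float) - np.asarray(plane_point, dtype=float)
    return float(np.dot(offset, n))

# Bucket tip 0.8 m above a horizontal target plane at z = 0
print(distance_to_target_surface([2.0, 0.0, 0.8], [0.0, 0.0, 0.0], [0.0, 0.0, 1.0]))
```

A guidance display of this kind tells the operator how far the bucket is from the reference plane, but, as noted above, not *where* in the work range to excavate next.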


An embodiment of the present disclosure (hereinafter “the present embodiment”) will be described below with reference to the accompanying drawings.


[Overview of the Operation Assisting System]

An overview of an operation assisting system SYS according to the present embodiment will be described with reference to FIG. 1 to FIG. 3.



FIG. 1 is a diagram that shows an example of the operation assisting system SYS. In FIG. 1, a left side view of the excavator 100 is shown. FIG. 2 is a top view showing an example of the excavator 100. FIG. 3 is a diagram that shows an example structure that pertains to the remote operations of the excavator 100. In the following description, the direction in which an attachment AT extends as viewed from the excavator 100 (the upward direction in FIG. 2) may be referred to as "front," and directions in the excavator 100, or directions viewed from the excavator 100, will be described accordingly.


As shown in FIG. 1, the operation assisting system SYS includes an excavator 100, an information processing device 200, and a sensor group 300.


The operation assisting system SYS cooperates with the excavator 100 using the information processing device 200, and provides assistance for the operation of the excavator 100.


The operation assisting system SYS may include one excavator 100 or multiple excavators 100.


In the operation assisting system SYS, the excavator 100 is the target work machine subject to operation-related assistance.


As shown in FIG. 1 and FIG. 2, the excavator 100 includes a lower traveling body 1, an upper rotating body 3, an attachment AT including a boom 4, an arm 5, and a bucket 6, and a cabin 10.


The lower traveling body 1 drives the excavator 100 using crawlers 1C. The crawlers 1C include a left crawler 1CL and a right crawler 1CR. The crawler 1CL is driven hydraulically by a drive hydraulic motor 1ML. Similarly, the crawler 1CR is driven hydraulically by a drive hydraulic motor 1MR. By this means, the lower traveling body 1 can travel by itself.


The upper rotating body 3 is mounted rotatably (in a freely-rotatable fashion) on the lower traveling body 1 via a rotating mechanism 2. For example, the upper rotating body 3 rotates relative to the lower traveling body 1 as the rotating mechanism 2 is driven hydraulically by the rotating hydraulic motor 2M.


The boom 4 is attached to the front center of the upper rotating body 3 such that the boom 4 can be moved upward or downward about a rotating axis that extends in the left-right direction. The arm 5 is attached to the tip of the boom 4 such that the arm 5 can rotate about a rotating axis that extends in the left-right direction. The bucket 6 is attached to the tip of the arm 5 such that the bucket 6 can rotate about a rotating axis that extends in the left-right direction.


The bucket 6 is an example of an end attachment and is used for, for example, excavation, sloping, land-leveling, and so forth.


The bucket 6 is attached to the tip of the arm 5 such that the bucket 6 can be replaced as appropriate, depending on the details of the job that the excavator 100 performs. That is, a bucket of a different type from the bucket 6 may be attached to the tip of the arm 5 instead of the bucket 6, for example, a relatively large bucket, a bucket for slopes, a bucket for dredging, and so forth. Also, a type of end attachment other than a bucket may be attached to the tip of the arm 5. For example, an agitator, a breaker, a crusher, and so forth may be attached to the tip of the arm 5. Also, a spare attachment such as a quick coupling or tilt rotator may be provided between the arm 5 and the end attachment.


The boom 4, the arm 5, and the bucket 6 are driven hydraulically by the boom cylinder 7, the arm cylinder 8, and the bucket cylinder 9, respectively.


The cabin 10 is a cab in which the operator sits, and is mounted on the front left side of the upper rotating body 3.


The excavator 100 is equipped with a communication device 60 and therefore can communicate with external devices such as the information processing device 200 and the sensor group 300 via a predetermined communication network NW.


The communication network NW may be, for example, a local area network (LAN) at the work site. The communication network NW may also be a wide area network (WAN). A wide area network may be, for example, a mobile communication network terminating at base stations, a satellite communication network using communication satellites, the Internet, and so forth. The communication network NW may also include, for example, a short-distance communication network based on wireless communication standards such as Wi-Fi and Bluetooth (registered trademark).


For example, the excavator 100 operates responding elements, such as the lower traveling body 1 (that is, the pair of left and right crawlers 1CL and 1CR), the upper rotating body 3, the boom 4, the arm 5, and the bucket 6, in accordance with operations by the operator in the cabin 10.


Also, instead of or in addition to being structured such that the excavator 100 can be operated by the operator aboard the cabin 10, the excavator 100 may also be structured such that it can be operated remotely from outside the excavator 100. When the excavator 100 is operated remotely, the inside of the cabin 10 may be unmanned. Also, when the excavator 100 is one dedicated for remote use, the cabin 10 may be omitted. The following description will presume that the operator's operations include at least one of: operations that the operator performs on the operating device 26 in the cabin 10; and remote operations that an external operator performs.


For example, as shown in FIG. 3, “remote operations” may refer to a mode of operation in which the excavator 100 is operated by an operation input for an actuator of the excavator 100 performed on a remote operation assisting device 400 that can communicate with the excavator 100 through a communication network NW. The remote operation assisting device 400 may be provided separately from the information processing device 200, or may be the information processing device 200.


The remote operation assisting device 400 may be provided, for example, in a management center where the jobs of the excavator 100 are managed from outside. Also, the remote operation assisting device 400 may be a portable operation terminal. In this case, the operator can directly check the status of the job to be performed by the excavator 100 from around the excavator 100, and operate the excavator 100 remotely.


The excavator 100 may transmit, for example, images that show the surroundings of the excavator 100 (hereinafter "surrounding images"), including ones that show the front of the excavator 100, to the remote operation assisting device 400, through the communication device 60, based on images captured by and output from image capturing devices mounted on the excavator 100. Also, the excavator 100 may transmit the captured images output from the image capturing devices to the remote operation assisting device 400 through the communication device 60, and the remote operation assisting device 400 may process the captured images received from the excavator 100 and generate surrounding images. Then, the remote operation assisting device 400 may display the surrounding images that show the surroundings of the excavator 100, including ones that show the front of the excavator 100, on the display device of the remote operation assisting device 400. Also, a variety of information images (information screens) displayed on an output device 50 (display device 50A) provided inside the cabin 10 of the excavator 100 may also be displayed on the display device of the remote operation assisting device 400. By this means, the operator using the remote operation assisting device 400 can operate the excavator 100 remotely, while checking the images showing the surroundings of the excavator 100 on the display device, the contents displayed on the information screens, and so forth. Then, the excavator 100 may run the actuators and drive the respective responding elements, such as the lower traveling body 1, the upper rotating body 3, the boom 4, the arm 5, and the bucket 6, in accordance with remote operation signals that indicate the details of remote operations and are received from the remote operation assisting device 400 through the communication device 60.


Also, the term “remote operation” here may refer to modes of operation in which, for example, the excavator 100 is operated based on external sound/voice inputs or gesture inputs from people (for example, workers) around the excavator 100. To be more specific, the excavator 100 recognizes the voices spoken by nearby workers, their gestures, and so forth, through a sound input device (for example, a microphone) or a gesture input device (for example, an image capturing device) mounted on the excavator 100. Then, the excavator 100 may run the actuators according to the contents of the recognized voices and gestures, and drive the responding elements such as the lower traveling body 1 (that is, the crawlers 1CL and 1CR), upper rotating body 3, boom 4, arm 5, and bucket 6.


Also, the excavator 100 may run the actuators automatically regardless of the details of operations by the operator. By this means, the excavator 100 can implement a function to automatically operate at least some of the responding elements, such as the lower traveling body 1, the upper rotating body 3, and the attachment AT. This function is also commonly known as an “autonomous driving function,” “machine control (MC) function,” and so forth.


The autonomous driving function may refer to, for example, a semi-autonomous driving function (operation-assisting MC function). The semi-autonomous driving function here refers to a function to automatically operate some of the elements (actuators) to be driven, other than the elements (actuators) that are driven in accordance with operations by the operator. Also, the autonomous driving function may refer to a fully-autonomous driving function (fully-autonomous MC function). The fully-autonomous driving function is a function to automatically operate at least some of the multiple responding elements (hydraulic actuators) on the assumption that the operator makes no operations. In the event the fully-autonomous driving function is enabled in the excavator 100, the interior of the cabin 10 may be unmanned. Also, when the excavator 100 is one dedicated for fully-autonomous driving use, the cabin 10 may be omitted. Also, the semi-autonomous driving function or the fully-autonomous driving function may refer to, for example, a rule-based autonomous driving function. A rule-based autonomous driving function is an example autonomous driving function in which the details in which the elements (actuators) to be driven and subject to autonomous driving are operated are determined according to rules specified in advance. Also, the semi-autonomous driving function and the fully-autonomous driving function may include an autonomous-decision-based driving function. The autonomous-decision-based driving function here may refer to a function which allows the excavator 100 to make various decisions autonomously, and in which the details of operations to be performed on the elements (hydraulic actuators) that are to be driven and subject to autonomous driving are determined in accordance with those decisions.
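A rule-based autonomous driving function of the kind described above can be illustrated with a minimal sketch (the function name, the rules, and the depth values are hypothetical assumptions for illustration, not the rules of the embodiment): the command for the driven element is selected according to rules specified in advance.

```python
def rule_based_command(bucket_depth, target_depth, tolerance=0.05):
    """Pick an actuator command from rules specified in advance:
    keep excavating until the target depth is reached, then hold."""
    error = target_depth - bucket_depth  # remaining depth to excavate (m)
    if abs(error) <= tolerance:
        return "hold"        # within tolerance: no further motion
    # positive error: not deep enough yet; negative: overshot the target
    return "lower_boom" if error > 0 else "raise_boom"

print(rule_based_command(bucket_depth=0.4, target_depth=1.0))  # → lower_boom
```

A semi-autonomous MC function would apply such rules only to some of the driven elements while the operator controls the rest; a fully-autonomous MC function would apply them without any operator input.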


Also, the job of the excavator 100 may be monitored remotely. In this case, a remote operation assisting device having the same functions as those of the remote operation assisting device 400 may be employed. The remote operation assisting device may be, for example, the information processing device 200. By this means, the supervisor, who is the user of the remote operation assisting device, can monitor the status of the job to be performed by the excavator 100, while also checking the surrounding images displayed on the display device of the remote operation assisting device. Also, for example, if the supervisor judges it necessary from the viewpoint of safety, the supervisor may intervene in the operation of the excavator 100 by the operator, or in the autonomous driving of the excavator 100, and force the excavator 100 to make an emergency stop, by making a predetermined input using the input device of the remote operation assisting device.


The information processing device 200 cooperates with the excavator 100 by communicating with it, and assists the operation of the excavator 100.


The information processing device 200 may be, for example, a server device or a managing terminal device. These devices may be installed in the management office at the work site of the excavator 100, or in a management center that manages the operation status of the excavator 100 from a location different from the work site of the excavator 100. The server device may be an on-premise server, a cloud server, an edge server, or the like. The managing terminal device may be, for example, a stationary terminal device such as a desktop personal computer (PC), or a portable terminal device (mobile terminal) such as a tablet terminal, a smartphone, or a laptop PC. In the latter case, the workers at the work site, the supervisor who oversees the workers' jobs, the manager who manages the work site, and others can move around the work site carrying the portable information processing device 200 with them. Furthermore, in the latter case, the operator may, for example, carry the portable information processing device 200 into the cabin of the excavator 100 with him/her.


The information processing device 200, for example, acquires data about the operating state of the excavator 100 from the excavator 100. By this means, the information processing device 200 can learn the operating state of the excavator 100 and monitor whether or not the excavator 100 shows an anomaly. Also, the information processing device 200 can, for example, display data about the operating state of the excavator 100 via a display device 208 (described later) and allow the user to check the data. Also, the information processing device 200 can, for example, cause a training model to learn the operating state of the excavator 100, and generate a trained model for assisting the operation of the excavator 100.


Also, the information processing device 200 may transmit, to the excavator 100, various data such as programs and reference data used in the processes in the controller 30. By this means, the excavator 100 can perform various processes that pertain to the operation of the excavator 100 using the various data downloaded from the information processing device 200.


The sensor group 300 is provided at the work site of the excavator 100.


For example, if the operation assisting system SYS includes multiple excavators 100, a sensor group 300 is provided for each excavator 100. Also, if multiple excavators 100 are included in the operation assisting system SYS and perform their respective jobs within the same work site, one sensor group 300 may be shared between all the excavators 100.


The sensor group 300 may include sensors 300-1 to 300-M (M is an integer greater than or equal to 2). The sensors 300-1 to 300-M may measure the state of objects in the work site around the excavator 100 and acquire measurement data. The objects in the work site may include objects in the work range around the excavator 100. The objects in the work range of the excavator 100 may include, for example, earth and sand in the work range around the excavator 100. Also, the objects in the work site may include obstacles around the excavator 100, in addition to the objects in the work range around the excavator 100 (for example, earth and sand in the work range). The obstacles around the excavator 100 may include, for example, other excavators near the excavator 100, work machines such as bulldozers, and work vehicles such as trucks for transporting earth and sand. The state of an object may include, for example, the shape and properties of the object.


The sensors 300-1 to 300-M include, for example, distance measurement sensors (distance sensors). The distance measurement sensors include, for example, a light-detecting-and-ranging (LIDAR) sensor, a millimeter wave radar, an ultrasonic sensor, an infrared sensor, and so forth. The sensors 300-1 to 300-M may also include, for example, a 3D camera for acquiring distance (depth) data, in addition to two-dimensional images, such as a stereo camera, a time-of-flight (TOF) camera, and the like. The sensors 300-1 to 300-M may also include a mixture of distance measurement sensors and 3D cameras. By this means, the sensor group 300 can acquire measurement data that represents the shape of objects in the work site around the excavator 100. Hereinafter, a sensor that can acquire measurement data that represents the shape of an object, such as a distance measurement sensor or a 3D camera, may be referred to as a "shape sensor" for ease of explanation.


Also, the sensors 300-1 to 300-M may include multi-spectral cameras. Multi-spectral cameras may include, for example, hyper-spectral cameras. By this means, for example, the sensor group 300 can acquire measurement data that represents the properties of objects in the work site around the excavator 100, such as the hardness and water content of soil. Hereinafter, a sensor that can acquire measurement data that represents the properties of an object, such as a hyper-spectral camera, may be referred to as a "property sensor" for ease of explanation.


For example, the sensors 300-1 to 300-M may include multiple shape sensors. Then, the multiple shape sensors may be provided at varying locations in the work site around the excavator 100 such that each shape sensor's sensing range overlaps at least one other shape sensor's sensing range. As a consequence, for example, if an occlusion occurs in one shape sensor's sensing range and measurement data cannot be acquired with respect to the shape of some of the objects in that sensing range, another shape sensor may be able to acquire measurement data that represents the shape of objects in that range. Therefore, the sensor group 300 can more reliably acquire measurement data that represents the shape of the target object in the work site around the excavator 100.
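The fusion of overlapping shape sensors can be sketched as follows (an illustrative example with hypothetical names, assuming each shape sensor reports a height map over a common grid in the site frame, with NaN marking occluded cells): cells that one sensor cannot observe are filled from the other sensors' measurements.

```python
import numpy as np

def fuse_height_maps(maps):
    """Fuse height maps from overlapping shape sensors. NaN marks cells a
    sensor could not observe (e.g. due to occlusion). Cells seen by several
    sensors are averaged; cells seen by no sensor remain NaN."""
    stack = np.stack([np.asarray(m, dtype=float) for m in maps])
    with np.errstate(invalid="ignore"):
        return np.nanmean(stack, axis=0)  # per-cell mean, ignoring NaNs

nan = float("nan")
a = [[1.0, nan], [2.0, 3.0]]   # sensor A: top-right cell occluded
b = [[1.0, 4.0], [nan, 3.0]]   # sensor B: bottom-left cell occluded
print(fuse_height_maps([a, b]))
```

The fused map has no occluded cells, illustrating why overlapping sensing ranges make the shape measurement more reliable.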


Also, the sensors 300-1 to 300-M may include multiple property sensors. Then, the multiple property sensors may be provided at varying locations in the work site around the excavator 100 such that each property sensor's sensing range overlaps at least one other property sensor's sensing range. As a consequence, for example, if an occlusion occurs in one property sensor's sensing range and measurement data cannot be acquired with respect to the properties of some of the objects in that sensing range, another property sensor may be able to acquire measurement data that represents the properties of objects in that range. Therefore, the sensor group 300 can more reliably acquire measurement data that represents the properties of objects in the work site around the excavator 100.


Also, the sensors 300-1 to 300-M may include a sensor (hereinafter "centralized sensor") that functions as both a shape sensor and a property sensor. In this case, the sensors 300-1 to 300-M may include multiple centralized sensors as well. Then, the multiple centralized sensors may be provided at varying locations in the work site around the excavator 100 such that each centralized sensor's sensing range overlaps at least one other centralized sensor's sensing range.


Note that the sensor group 300 may simply include only one shape sensor or one property sensor. Also, the operation assisting system SYS may simply include, instead of the sensor group 300, only one sensor that can acquire measurement data with respect to the state of objects in a work site around the excavator 100.


The sensors 300-1 to 300-M may be fixed at the work site around the excavator 100, or may be mounted on a mobile entity that can move within the work site around the excavator 100. The mobile entity may be, for example, a work machine or a work vehicle that moves within the work site. Also, for example, an aircraft such as a drone that flies above the work site may also be a mobile entity that moves within the work site.


Outputs (measurement data) of the sensors 300-1 to 300-M are taken into the information processing device 200 through the communication network NW. Outputs of the sensors 300-1 to 300-M are taken directly into the information processing device 200 through the communication network NW, for example. Also, outputs of the sensors 300-1 to 300-M may be once received as inputs by the excavator 100 through the communication network NW and taken into the information processing device 200 via the excavator 100. Also, when the sensors 300-1 to 300-M are mounted on a specific device such as one of the mobile entities mentioned earlier, outputs of the sensors 300-1 to 300-M may be once received as inputs in the specific device and taken into the information processing device 200 from the device.


Note that the sensor group 300 may be omitted.


[Structure of Operation Assisting System SYS]

Next, the hardware structure of the operation assisting system SYS will be described with reference to FIG. 1 to FIG. 3, and, in addition, with reference to FIG. 4 and FIG. 5.


<Structure of the Excavator 100>


FIG. 4 is a diagram that shows an example structure of the excavator 100.


Note that, in FIG. 4, the solid lines are paths on which mechanical drive force travels, the dashed lines are paths on which pilot pressure travels, and the dotted lines are paths on which electric signals travel. The excavator 100 includes various components, such as those constituting a hydraulic drive system for hydraulically driving the responding elements, an operation system for operating the responding elements, a user interface system for exchanging information with users, a communication system for communicating with the outside, and a control system for implementing various controls.


<<Hydraulic Drive System>>

As shown in FIG. 4, the hydraulic drive system of the excavator 100 includes a hydraulic actuator HA. The hydraulic actuator HA hydraulically drives each of the responding elements, including a lower traveling body 1 (that is, crawlers 1CL and 1CR), an upper rotating body 3, a boom 4, an arm 5, and a bucket 6, as described earlier. Also, the hydraulic drive system of the excavator 100 according to the present embodiment includes an engine 11, a regulator 13, a main pump 14, and a control valve 17.


The hydraulic actuator HA includes drive hydraulic motors 1ML and 1MR, a rotating hydraulic motor 2M, a boom cylinder 7, an arm cylinder 8, and a bucket cylinder 9, and so forth.


Note that, in the excavator 100, part or the whole of the hydraulic actuator HA may be replaced with an electric actuator. That is, the excavator 100 may be a hybrid excavator or an electric excavator.


The engine 11 is the motor of the excavator 100 and the main power source in the hydraulic drive system. The engine 11 may be, for example, a diesel engine that runs on light oil. The engine 11 is mounted, for example, in a rear part of the upper rotating body 3. The engine 11 rotates at a constant speed at a pre-configured target number of rotations per unit time, under direct or indirect control by the controller 30 (described later), thereby driving the main pump 14 and the pilot pump 15.


Note that, instead of or in addition to the engine 11, another motor (for example, an electric motor) may be mounted on the excavator 100.


The regulator 13 controls (adjusts) the amount of discharge from the main pump 14 under the control of the controller 30. For example, the regulator 13 adjusts the angle of the swashplate of the main pump 14 (hereinafter “tilting angle”) in accordance with control commands from the controller 30.


The main pump 14 supplies hydraulic oil to the control valve 17 through a high-pressure hydraulic line. The main pump 14 is attached to the rear part of the upper rotating body 3, like the engine 11, for example. The main pump 14 is driven by the engine 11, as described earlier. The main pump 14 may be, for example, a variable displacement hydraulic pump. As described earlier, under the control of the controller 30, the tilting angle of the swashplate is adjusted by the regulator 13, so that the piston stroke length is adjusted, and the discharge flow rate, discharge pressure, and so forth are controlled.


The control valve 17 drives the hydraulic actuator HA in accordance with the details of operations that the operator performs on the operating device 26, the details of remote operations, or operation commands supporting the autonomous driving function of the excavator 100. The control valve 17 is mounted, for example, in the center of the upper rotating body 3. As mentioned earlier, the control valve 17 is connected to the main pump 14 via a high-pressure hydraulic line, and selectively supplies hydraulic oil supplied from the main pump 14 to each hydraulic actuator in response to the operator's operation or the operation command supporting the autonomous driving function of the excavator 100. To be more specific, the control valve 17 includes multiple control valves that control the rate and direction of flow of hydraulic oil supplied from the main pump 14 to each hydraulic actuator HA (also referred to as “direction switching valve”).


<<Operation System>>

As shown in FIG. 4, the operation system of the excavator 100 includes a pilot pump 15, an operating device 26, a hydraulic control valve 31, a shuttle valve 32, and a hydraulic control valve 33.


The pilot pump 15 supplies pilot pressures to various hydraulic equipment via a pilot line 25. The pilot pump 15, like the engine 11, is attached to the rear part of the upper rotating body 3. The pilot pump 15 may be, for example, a fixed displacement hydraulic pump, and is driven by the engine 11 as described earlier.


Note that the pilot pump 15 may be omitted. In this case, relatively high-pressure hydraulic oil may be discharged from the main pump 14 and its pressure may be reduced by means of a predetermined pressure-reducing valve. The resulting, relatively low-pressure hydraulic oil may be supplied to various hydraulic equipment as pilot pressure.


The operating device 26 may be provided near the cockpit of the cabin 10 and used by the operator to operate various responding elements. To be more specific, the operating device 26 may be used by the operator to operate the hydraulic actuators HA for driving respective responding elements. As a result of this, the operator is able to operate the responding elements, which are the targets to be driven by the hydraulic actuators HA. The operating device 26 may be, for example, a pedal device, a lever device, and the like, for operating respective responding elements (hydraulic actuators HA).


For example, as shown in FIG. 4, the operating device 26 is a hydraulic pilot device. To be more specific, the operating device 26 uses the hydraulic oil, supplied from the pilot pump 15 through the pilot line 25 and a pilot line 25A branching from the pilot line 25, and outputs suitable pilot pressures (that is, pressures that match the details of corresponding operations) to the secondary pilot line 27A. The pilot line 27A is connected to one inlet port of the shuttle valve 32, and connected to the control valve 17 via the pilot line 27 connected to an outlet port of the shuttle valve 32. By this means, the control valve 17 may receive, as inputs, pilot pressures that match the details of operations performed on the operating device 26 with respect to various responding elements (hydraulic actuators HA), through the shuttle valve 32. As a consequence, the control valve 17 can drive the hydraulic actuators HA according to the details of operations made on or with respect to the operating device 26 by the operator, etc.


Also, the operating device 26 may be electric. In this case, the pilot line 27A, the shuttle valve 32, and the hydraulic control valve 33 are omitted. To be more specific, the operating device 26 outputs electric signals (hereinafter "operation signals") that match the details of operations made, and the operation signals may be input to the controller 30. Then, the controller 30 outputs control commands according to the operation signals' contents (that is, control signals that match the details of operations made on or with respect to the operating device 26), to the hydraulic control valve 31. By this means, the hydraulic control valve 31 inputs pilot pressures that match the details of operations made on or with respect to the operating device 26 to the control valve 17, so that the control valve 17 can drive the individual hydraulic actuators HA according to the details of operations made on or with respect to the operating device 26.


Also, the control valves (direction switching valves) built in the control valve 17 for driving respective hydraulic actuators HA may be electromagnetic solenoid valves. In this case, operation signals output from the operating device 26 may be directly input to the control valve 17 (that is, directly to electromagnetic solenoid type control valves).


Also, as described earlier, part or all of the hydraulic actuators HA may be replaced with electric actuators. In this case, the controller 30 may output control commands that match the details of operations made on or with respect to the operating device 26, or the details of remote operations indicated by remote operation signals, to the electric actuators or the driver for driving the electric actuators. Also, when the excavator 100 is operated remotely, the operating device 26 may be omitted.


A hydraulic control valve 31 may be provided for every responding element (hydraulic actuator HA) that works in conjunction with operations made on or with respect to the operating device 26, in every direction in which the responding elements (hydraulic actuators HA) might move (for example, the directions in which the boom 4 rises and drops). For example, two hydraulic control valves 31 are provided for each double-acting hydraulic actuator HA for driving the lower traveling body 1, upper rotating body 3, boom 4, arm 5, bucket 6, and so forth. The hydraulic control valve 31 may be provided in the pilot line 25B between the pilot pump 15 and the control valve 17, for example, and may be structured such that the flow path area (that is, the cross-sectional area in which the hydraulic oil can flow) can be changed. By this means, the hydraulic control valve 31 can output certain pilot pressures to the secondary pilot line 27B by using hydraulic oil supplied from the pilot pump 15 through the pilot line 25B. Therefore, through the shuttle valve 32 provided between the pilot line 27B and the pilot line 27, the hydraulic control valve 31 can apply, indirectly, certain pilot pressures that match control signals from the controller 30, to the control valve 17. As a consequence, for example, the controller 30 can supply pilot pressures that match operation commands supporting the autonomous driving function of the excavator 100, from the hydraulic control valve 31 to the control valve 17, thereby enabling the excavator 100 to operate based on its autonomous driving function.


Also, the controller 30 may control the hydraulic control valve 31 and operate the excavator 100 remotely. To be more specific, the controller 30 outputs control signals that match the details of remote operations indicated by remote operation signals from the remote operation assisting device 400, to the hydraulic control valve 31, via the communication device 60. By this means, the controller 30 can supply pilot pressures that match the details of remote operations, from the hydraulic control valve 31 to the control valve 17, thereby enabling the excavator 100 to operate based on the operator's remote operations.


Also, when the operating device 26 is an electric one, the controller 30 can directly supply pilot pressures that match the details of operations (operation signals) made on or with respect to the operating device 26, from the hydraulic control valve 31 to the control valve 17, thereby enabling the excavator 100 to operate based on the operator's operations.


Every shuttle valve 32 has two inlet ports and one outlet port. When different pilot pressures are input to the two inlet ports, the shuttle valve 32 outputs the hydraulic oil having the higher pilot pressure to the outlet port. Similar to the hydraulic control valve 31, the shuttle valve 32 is provided per responding element (hydraulic actuator HA) that works in conjunction with operations performed on or with respect to the operating device 26, and per direction in which the corresponding driven element (hydraulic actuator HA) moves. For example, two shuttle valves 32 are provided per double-acting hydraulic actuator HA for driving the lower traveling body 1, upper rotating body 3, boom 4, arm 5, bucket 6, and so forth. One of the two inlet ports of every shuttle valve 32 is connected to a secondary pilot line 27A of the operating device 26 (to be more specific, the above-mentioned lever device or pedal device included in the operating device 26), and the other one is connected to a pilot line 27B, which is a secondary pilot line of the hydraulic control valve 31. The outlet port of each shuttle valve 32 is connected to the pilot port of the corresponding control valve in the control valve 17 through the pilot line 27. A shuttle valve 32's "corresponding control valve" refers to at least one control valve for driving a hydraulic actuator HA that works in conjunction with the operation of the above-mentioned lever device or pedal device connected to one inlet port of the shuttle valve 32. Therefore, these shuttle valves 32 can apply the higher one of: the pilot pressure of the secondary pilot line 27A of the operating device 26; and the pilot pressure of the secondary pilot line 27B of the hydraulic control valve 31, to the pilot ports of respective corresponding control valves.
That is, the controller 30 can cause the hydraulic control valve 31 to output pilot pressures higher than the secondary pilot pressure of the operating device 26, so that the corresponding control valves can be controlled regardless of the operation of the operating device 26 by the operator. Consequently, the controller 30 can control the operation of the driven elements (the lower traveling body 1, upper rotating body 3, boom 4, arm 5, and bucket 6) and implement the autonomous driving function, remote operation function, and so forth of the excavator 100, regardless of how or in what way the operator maneuvers the operating device 26.
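The higher-pressure selection performed by the shuttle valve 32, and the way the controller 30 can use it to override operator input, can be illustrated with a minimal sketch. This is not part of the disclosed embodiment; the function names and pressure values below are hypothetical, and a real valve is of course a hydraulic component, not software.

```python
# Hypothetical sketch of the shuttle valve 32 / hydraulic control valve 31
# interaction described above. All names and pressure values (MPa) are
# illustrative assumptions, not part of the disclosure.

def shuttle_valve(pilot_27a: float, pilot_27b: float) -> float:
    """Output the higher of the two inlet pilot pressures
    (line 27A from the operating device 26 vs. line 27B from valve 31)."""
    return max(pilot_27a, pilot_27b)

def pressure_to_control_valve(operator_pressure: float,
                              controller_pressure: float) -> float:
    """The controller 30 dominates the operator's input by commanding the
    hydraulic control valve 31 to output a pressure above the operator's
    secondary pilot pressure; otherwise the operator's pressure passes."""
    return shuttle_valve(operator_pressure, controller_pressure)

# Controller commands 3.0 MPa while the operator's lever produces 1.2 MPa:
# the control valve 17 receives the controller's pressure.
assert pressure_to_control_valve(1.2, 3.0) == 3.0
# Controller outputs nothing (0 MPa): the operator's input passes through.
assert pressure_to_control_valve(1.2, 0.0) == 1.2
```

The same max-selection models why the hydraulic control valve 33 matters: unless the operator-side pressure is forcibly reduced, the operator can still win the selection whenever his or her pilot pressure is the higher one.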


The hydraulic control valve 33 may be provided in the pilot line 27A that connects between the operating device 26 and the shuttle valve 32. The hydraulic control valve 33 may be, for example, structured such that its flow path area can be changed. The hydraulic control valve 33 operates in accordance with control signals received as inputs from the controller 30. By this means, when the operating device 26 is operated by an operator, the controller 30 can forcibly reduce the pilot pressures output from the operating device 26. Therefore, even when the operating device 26 is operated, the controller 30 can prevent or substantially prevent the hydraulic actuators HA from operating in accordance with operations made on or with respect to the operating device 26, or force the operation of the hydraulic actuators HA to a stop. Also, even when the operating device 26 is operated, the controller 30 can reduce the pilot pressures output from the operating device 26 below the pilot pressures output from the hydraulic control valve 31. Therefore, by controlling the hydraulic control valve 31 and the hydraulic control valve 33, the controller 30 can reliably apply desired pilot pressures to the pilot ports of individual control valves in the control valve 17, regardless of the details of operations made on or with respect to the operating device 26. Consequently, the controller 30 can implement the autonomous driving function, the remote operation function, and so forth of the excavator 100 more properly by controlling the hydraulic control valve 33 in addition to the hydraulic control valve 31, for example.


<User Interface System>

As shown in FIG. 4, the user interface system of the excavator 100 includes an operating device 26, an output device 50, and an input device 52.


The output device 50 may output a variety of information to the user of the excavator 100 (for example, the operator in the cabin 10, the remote operator, etc.) and to people around the excavator 100 (for example, workers, the drivers of work vehicles, etc.).


For example, the output device 50 may include lighting equipment and a display device 50A (see FIG. 6) that output a variety of information in a visualized manner. The lighting equipment may include, for example, a warning light (indicator lamp). The display device 50A may be, for example, a liquid crystal display or an organic electroluminescence (EL) display. For example, as shown in FIG. 2, the lighting equipment and the display device 50A may be provided inside the cabin 10, and output a variety of information in a visualized manner to the operator in the cabin 10. Also, the lighting equipment and the display device 50A may be provided, for example, on a side surface of the upper rotating body 3 and output a variety of information in a visualized manner to workers around the excavator 100.


Also, the output device 50 may include a sound output device that outputs a variety of information in an auditory manner. The sound output device may be, for example, a buzzer or a speaker. The sound output device may be provided, at least, either inside or outside the cabin 10, and output a variety of information in an auditory manner, to the operator inside the cabin 10 or people (for example, workers) around the excavator 100.


Also, the output device 50 may include a device that outputs a variety of information in a tactile manner, such as by vibration of the cockpit.


The input device 52 receives various inputs from the user of the excavator 100, and signals corresponding to the received inputs are taken into the controller 30. For example, as shown in FIG. 2, the input device 52 is provided inside the cabin 10 and receives inputs from the operator in the cabin 10 or the like. Also, the input device 52 may be provided, for example, on a side surface or elsewhere on the upper rotating body 3 and receive inputs from the workers or the like around the excavator 100.


For example, the input device 52 may be a mechanical input device that receives the mechanical operations that the user performs thereon as inputs. This mechanical input device may include: a touch panel implemented on the display device 50A; a touch pad, a button switch, a lever, a toggle, and so forth provided so as to surround the display device 50A; and a knob switch provided on the operating device 26 (lever device).


Also, the input device 52 may include a sound input device that receives sound/voice inputs from the user. The sound input device may be a microphone, for example.


Also, the input device 52 may be a gesture input device that receives gesture inputs from the user. The gesture input device may include, for example, an image capturing device that captures images of gestures made by the user.


Also, the input device 52 may be a biological input device that receives biological inputs from the user. The biological inputs include, for example, biological information such as the user's fingerprint, iris, and so forth.


<<Communication System>>

As shown in FIG. 4, the communication system of the excavator 100 according to the present embodiment includes a communication device 60.


The communication device 60 connects with an external communication network NW and communicates with devices provided apart from the excavator 100. These devices may include ones situated outside the excavator 100, as well as a portable terminal device that the user of the excavator 100 carries into the cabin 10 with him/her. The communication device 60 may include, for example, a mobile communication module conforming to standards such as 4G (4th Generation) and 5G (5th Generation). The communication device 60 may also include, for example, a satellite communication module. The communication device 60 may also include, for example, a Wi-Fi communication module or a Bluetooth (registered trademark) communication module. Also, when there are multiple connectable communication networks NW, the communication device 60 may include multiple communication devices in accordance with the types of the communication networks NW.


For example, the communication device 60 communicates with external devices such as the information processing device 200 and remote operation assisting device 400 within the work site through a local communication line established at the work site. The local communication line may be, for example, a local fifth-generation (5G) mobile communication line established at the work site (commonly known as "local 5G") or a local network based on Wi-Fi 6.


The communication device 60 may also communicate with the information processing device 200, sensor group 300, remote operation assisting device 400, and so forth, situated outside the work site, through a wide-area communication line covering the work site, that is, a wide-area network.


<<Control System>>

As shown in FIG. 4, the control system of the excavator 100 includes a controller 30. Also, the control system of the excavator 100 according to the present embodiment includes an operation pressure sensor 29, sensors 40, and sensors S1 to S9.


The controller 30 controls the excavator 100 in a variety of ways.


The functions of the controller 30 may be implemented by any hardware or by any combination of hardware and software. For example, as shown in FIG. 3, the controller 30 includes a secondary memory device 30A, a memory device 30B, a central processing unit (CPU) 30C, and an interface device 30D, which are all connected via a bus BS1.


The secondary memory device 30A is a non-volatile storage means that stores the programs that are installed, as well as necessary files and data. The secondary memory device 30A may be, for example, an electrically erasable programmable read-only memory (EEPROM), a flash memory, or the like.


The memory device 30B, for example, loads a program in the secondary memory device 30A so that the CPU 30C can read it when there is a command to start the program. The memory device 30B may be, for example, a static random access memory (SRAM).


The CPU 30C, for example, executes a program loaded in the memory device 30B and implements a variety of functions of the controller 30 according to the program's instructions.


The interface device 30D may, for example, function as a communication interface for connecting with the internal communication network of the excavator 100. The interface device 30D may include a variety of types of communication interfaces, used depending on the type of the communication network to which it is connected.


Also, the interface device 30D may function as an external interface for reading data from a recording medium and writing data to the recording medium. The recording medium may be, for example, a dedicated tool connected by a detachable cable to a connector installed inside the cabin 10. Also, the recording medium may be, for example, a general-purpose recording medium such as an SD memory card or a universal serial bus (USB) memory. By this means, a program for implementing a variety of functions of the controller 30 may be provided, for example, by a portable recording medium, and installed in the secondary memory device 30A of the controller 30. Also, a program may be downloaded from another computer (for example, the information processing device 200) outside the excavator 100, through the communication device 60, and installed in the secondary memory device 30A.


Note that some of the functions of the controller 30 may be implemented by other controllers (control devices). In other words, the functions of the controller 30 may be distributed over and implemented by multiple controllers mounted on the excavator 100.


The operation pressure sensor 29 detects secondary pilot pressures (pressures on the pilot line 27A) in the hydraulic-pilot operating device 26. That is, the operation pressure sensor 29 detects pilot pressures that match the way in which the individual driven elements (hydraulic actuators) in the operating device 26 are operated. Pilot-pressure detection signals, produced by the operation pressure sensor 29 and indicating the way each individual driven element (hydraulic actuator HA) is operated in the operating device 26, are taken into the controller 30.


Note that the operation pressure sensor 29 may be omitted when the operating device 26 is an electric one. This is because the controller 30 can learn the operating state of each driven element, through the operating device 26, based on operation signals received from the operating device 26.


The sensors 40 acquire measurement data with respect to the state of objects around the excavator 100, for example.


For example, the sensors 40 may include a shape sensor, such as a distance measurement sensor or an image capturing device that can acquire measurement data that represents the shape of objects around the excavator 100. Also, the sensors 40 may include a centralized sensor that, in addition to having the function of a shape sensor, functions as a property sensor, such as a hyper-spectral camera, and that can acquire measurement data that represents the properties of objects around the excavator 100.


For example, as shown in FIG. 2, the sensors 40 may include sensors 40F, 40B, 40L, and 40R. The sensor 40F may measure the state (shape, properties, etc.) of objects in front of the upper rotating body 3. The sensor 40B may measure the state of objects behind the upper rotating body 3. The sensor 40L may measure the state of objects to the left of the upper rotating body 3. The sensor 40R may measure the state of objects to the right of the upper rotating body 3. By this means, the sensors 40 can measure the state of objects in a range covering the entire circumference of the excavator 100, that is, an angular range of 360 degrees, when viewed from above the excavator 100. Hereinafter, any one of sensors 40F, 40B, 40L, and 40R may be individually referred to as a "sensor 40X."


The output data of the sensor 40X (that is, measurement data with respect to the state of objects around the excavator 100) is input to the controller 30 via a one-to-one communication line or an in-vehicle network. By this means, for example, the controller 30 can learn the state of objects around the excavator 100, such as their shapes and properties, based on output data of the sensor 40X.


Note that some or all of the sensors 40B, 40L, and 40R may be omitted.


The sensor S1 may be attached to the boom 4 and measure the posture of the boom 4. The sensor S1 may output measurement data that represents the posture of the boom 4. The posture of the boom 4 may be, for example, the posture/angle (hereinafter "boom angle") of the proximal end of the boom 4 about the rotating axis. The proximal end of the boom 4 may be, for example, the part of the boom 4 connecting with the upper rotating body 3. The sensor S1 may be, for example, a rotary potentiometer, a rotary encoder, an acceleration sensor, an angular acceleration sensor, a 6-axis sensor, an inertial measurement unit (IMU), and so forth. The same may be true for the sensors S2 to S4. The sensor S1 may also be a cylinder sensor that detects the extended/retracted position of the boom cylinder 7. The same may be true for the sensors S2 and S3. The outputs of the sensor S1 (measurement data that represents the posture of the boom 4) may be input to the controller 30. By this means, the controller 30 can learn the posture of the boom 4.


The sensor S2 may be attached to the arm 5 and measure the posture of the arm 5. The sensor S2 may output measurement data that represents the posture of the arm 5. The posture of the arm 5 may be, for example, the posture/angle of the proximal end of the arm 5 (hereinafter referred to as "arm angle") about the rotating axis. The proximal end of the arm 5 may be, for example, the part of the arm 5 connecting with the boom 4. The outputs of the sensor S2 (measurement data that represents the posture of the arm 5) may be input to the controller 30. By this means, the controller 30 can learn the posture of the arm 5.


The sensor S3 may be attached to the bucket 6 and measure the posture of the bucket 6. The sensor S3 may output measurement data that represents the posture of the bucket 6. The posture of the bucket 6 may be, for example, the posture/angle of the proximal end of the bucket 6 (hereinafter "bucket angle") about the rotating axis. The proximal end of the bucket 6 may be, for example, the part of the bucket 6 connecting with the arm 5. The outputs of the sensor S3 (measurement data that represents the posture of the bucket 6) may be taken into the controller 30. By this means, the controller 30 can learn the posture of the bucket 6.


The sensor S4 may measure the posture of the body (for example, the upper rotating body 3) of the excavator 100. The sensor S4 may output measurement data that represents the posture of the body of the excavator 100. The posture of the body of the excavator 100 may be, for example, the tilt of the body relative to a specific reference plane (for example, a horizontal plane). For example, the sensor S4 may be attached to the upper rotating body 3 and measure the tilting angles about two axes, one in the front-to-back direction and the other in the left-to-right direction with respect to the excavator 100 (hereinafter "front-to-back tilting angle" and "left-to-right tilting angle"). The outputs of the sensor S4 (measurement data that represents the posture of the body of the excavator 100) may be taken into the controller 30. By this means, the controller 30 can learn the posture (tilt) of the body (the upper rotating body 3) of the excavator 100.
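When the sensor S4 is an accelerometer-based device such as an IMU, the two tilting angles can be recovered from the direction of the gravity vector in a static reading. The following is a minimal illustrative sketch only; the axis conventions (x forward, y left, z up) are assumptions, and the disclosure does not specify how the sensor S4 computes its outputs.

```python
import math

# Hypothetical sketch: deriving front-to-back (pitch) and left-to-right (roll)
# tilting angles from a static 3-axis accelerometer reading of gravity.
# The axis conventions (x forward, y left, z up) are illustrative assumptions.
def tilt_angles(ax: float, ay: float, az: float) -> tuple[float, float]:
    """Return (front-to-back, left-to-right) tilting angles in degrees."""
    pitch = math.degrees(math.atan2(ax, math.hypot(ay, az)))  # front-to-back
    roll = math.degrees(math.atan2(ay, az))                   # left-to-right
    return pitch, roll

# On level ground, gravity appears only on the z axis, so both tilts are zero.
pitch, roll = tilt_angles(0.0, 0.0, 9.81)
assert pitch == 0.0 and roll == 0.0
```

In practice such a static formula would be fused with gyroscope data to reject accelerations caused by the machine's own motion, which is one reason a 6-axis sensor or IMU is listed among the candidate devices.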


The sensor S5 may be attached to the upper rotating body 3 and measure the rotation of the upper rotating body 3. The sensor S5 may output measurement data that represents the rotation of the upper rotating body 3. The sensor S5 may measure, for example, the rotating angular velocity and rotating angle of the upper rotating body 3. The sensor S5 may be, for example, a gyro sensor, a resolver, a rotary encoder, and so forth. The outputs of the sensor S5 (measurement data that represents the rotation of the upper rotating body 3) may be taken into the controller 30. By this means, the controller 30 can learn the rotation of the upper rotating body 3, such as the rotating angle.


The controller 30 can identify (estimate) the position of the tip (the bucket 6) of the attachment AT based on the outputs of the sensors S1 to S5.
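Such an estimate can be formed by forward kinematics over the boom, arm, and bucket angles. The planar sketch below is purely illustrative: the link lengths, angle conventions, and function names are assumptions, and the actual computation performed in the controller 30 is not specified in this disclosure.

```python
import math

# Hypothetical planar forward-kinematics sketch for estimating the bucket-tip
# position from the boom, arm, and bucket angles (sensors S1 to S3).
# Link lengths (meters) and angle conventions are illustrative assumptions.
BOOM_LEN, ARM_LEN, BUCKET_LEN = 5.7, 2.9, 1.5

def tip_position(boom_deg: float, arm_deg: float, bucket_deg: float):
    """Return (x, z) of the bucket tip relative to the boom foot pin,
    treating each joint angle as relative to the previous link."""
    a1 = math.radians(boom_deg)
    a2 = a1 + math.radians(arm_deg)
    a3 = a2 + math.radians(bucket_deg)
    x = (BOOM_LEN * math.cos(a1) + ARM_LEN * math.cos(a2)
         + BUCKET_LEN * math.cos(a3))
    z = (BOOM_LEN * math.sin(a1) + ARM_LEN * math.sin(a2)
         + BUCKET_LEN * math.sin(a3))
    return x, z

# With all joints at zero, the links lie along one line, so the tip distance
# is simply the summed link lengths (5.7 + 2.9 + 1.5 = 10.1 m).
x, z = tip_position(0.0, 0.0, 0.0)
```

A full 3D estimate would additionally rotate this planar result by the rotating angle from the sensor S5 and tilt it by the body posture from the sensor S4, and could be translated into site coordinates using the sensor S6.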


Note that if the sensor S4 includes a gyro sensor, 6-axis sensor, IMU, or the like that can detect angular velocity about three axes, the rotation of the upper rotating body 3 may be detected based on detection signals of the sensor S4 (for example, the rotating angular velocity). In this case, the sensor S5 may be omitted.


The sensor S6 may measure the position of the excavator 100. The sensor S6 may measure the position in a world (global) coordinate system or using local coordinates at the work site. In the former case, the sensor S6 may be, for example, a global navigation satellite system (GNSS) sensor. In the latter case, the sensor S6 may be a transceiver that communicates with a device that serves as a point of reference for the work site's position, and output a signal corresponding to the position of the excavator 100 relative to the point of reference. The outputs of the sensor S6 are taken into the controller 30.


The sensor S7 may measure the pressure (cylinder pressure) in the oil chambers of the boom cylinder 7. The sensor S7 may include, for example, a sensor that measures the cylinder pressure (rod pressure) in the rod-side oil chamber of the boom cylinder 7 and a sensor that measures the cylinder pressure (bottom pressure) in the bottom-side oil chamber. The outputs of the sensor S7 (measurement data of the cylinder pressure of the boom cylinder 7) may be input to the controller 30.


The sensor S8 may measure the pressure (cylinder pressure) of the oil chamber of the arm cylinder 8. The sensor S8 may be, for example, a sensor that measures the cylinder pressure (rod pressure) of the rod-side oil chamber of the arm cylinder 8, and a sensor that measures the cylinder pressure (bottom pressure) of the bottom-side oil chamber of the arm cylinder 8. The outputs of the sensor S8 (measurement data of the cylinder pressure of the arm cylinder 8) may be input to the controller 30.


The sensor S9 may measure the pressure (cylinder pressure) of the oil chamber of the bucket cylinder 9. The sensor S9 may be, for example, a sensor that measures the cylinder pressure (rod pressure) of the rod-side oil chamber of the bucket cylinder 9, and a sensor that measures the cylinder pressure (bottom pressure) of the bottom-side oil chamber of the bucket cylinder 9. The outputs of the sensor S9 (measurement data of the cylinder pressure of the bucket cylinder 9) may be input to the controller 30.


The controller 30 can learn the state of load acting on the attachment AT based on the output of the sensors S7 to S9. The load that acts on the attachment AT may include, for example, the reactive force acting on the bucket 6 from the objects in the work range and the weight of earth and sand (for example, ground soil) caught in the bucket 6.
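A common way to derive such a load estimate from the rod and bottom pressures is the net-force calculation of a double-acting cylinder. The sketch below is an illustrative assumption only; the cylinder dimensions and pressures are hypothetical, and the disclosure does not state how the controller 30 performs this calculation.

```python
import math

# Hypothetical net-force sketch for a double-acting cylinder (e.g. the boom
# cylinder 7), using the rod and bottom pressures measured by sensor S7.
# Bore/rod diameters (m) and pressures (Pa) are illustrative assumptions.
BORE_D = 0.12   # piston (bottom-side) diameter
ROD_D = 0.085   # rod diameter

def cylinder_force(p_bottom_pa: float, p_rod_pa: float) -> float:
    """Net extending force (N): the bottom pressure acts on the full piston
    area, while the rod pressure acts on the annular area (piston area
    minus rod cross-section)."""
    a_bottom = math.pi * (BORE_D / 2) ** 2
    a_rod_side = a_bottom - math.pi * (ROD_D / 2) ** 2
    return p_bottom_pa * a_bottom - p_rod_pa * a_rod_side

# Example: 20 MPa bottom pressure against 2 MPa rod pressure yields a net
# extending force on the order of 2e5 N.
force = cylinder_force(20e6, 2e6)
```

Combining such per-cylinder forces with the attachment geometry from the sensors S1 to S3 would let the controller 30 separate, for example, digging reaction forces from the weight of earth and sand held in the bucket 6.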


Note that some or all of the sensors S1 to S9 may be omitted, depending on necessity. Also, in addition to the sensors S1 to S9, the excavator 100 may be equipped with other sensors that can learn the state of the excavator 100. For example, the excavator 100 may be equipped with an orientation sensor that can detect its own orientation. The orientation sensor may be, for example, an electronic compass including a geomagnetic sensor.


<Structure of Information Processing Device>


FIG. 5 is a diagram that shows an example structure of an information processing device 200.


The functions of the information processing device 200 may be implemented by any hardware or any combination of hardware and software. For example, as shown in FIG. 5, the information processing device 200 may include an external interface 201, a secondary storage device 202, a memory device 203, a CPU 204, a high-speed calculation device 205, a communication interface 206, an input device 207, a display device 208, and a sound output device 209. These are all connected by a bus BS2.


The external interface 201 may function as an interface for reading data from a recording medium 201A and writing data to the recording medium 201A. The recording medium 201A may be, for example, a flexible disk, a compact disc (CD), a digital versatile disc (DVD), a Blu-ray (registered trademark) disc (BD), a secure digital (SD) memory card, a universal serial bus (USB) memory, and the like. By this means, the information processing device 200 can read various data used in various processes through the recording medium 201A and store the data in the secondary storage device 202, or install programs for implementing a variety of functions.


Note that the information processing device 200 may acquire various data and programs used in various processes from external devices via the communication interface 206.


The secondary storage device 202 may store various programs installed, as well as files and data necessary for various processes. The secondary storage device 202 may include, for example, a hard disc drive (HDD), solid state drive (SSD), a flash memory, and so forth.


When a command to start a program is received, the memory device 203 may read the program from the secondary storage device 202 and store it. The memory device 203 may include, for example, a dynamic random access memory (DRAM), a static random access memory (SRAM), and the like.


The CPU 204 executes various programs loaded from the secondary storage device 202 into the memory device 203, and implements a variety of functions related to the information processing device 200 according to the programs.


The high-speed calculation device 205 works in conjunction with the CPU 204 to perform calculation processes at a relatively high speed. The high-speed calculation device 205 may be, for example, a graphics processing unit (GPU), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), and so forth.


Note that the high-speed calculation device 205 may be omitted, depending on the speed required in the calculation processes.


The communication interface 206 is used as an interface for allowing communication with external devices. By this means, the information processing device 200 can communicate with external devices such as, for example, the excavator 100, through the communication interface 206. Also, the communication interface 206 may include multiple types of communication interfaces, chosen depending on the communication method used with the connected devices, and so on.


The input device 207 may receive various inputs from the user. The input device 207 may include a remote operation device for operating the excavator 100 remotely.


The input device 207 may be, for example, an input device (hereinafter “mechanical input device”) that receives mechanical operation inputs from the user. When the excavator 100 is operated remotely, the operating device for the remote operation may be a mechanical input device. The “mechanical input device” in this case may be, for example, a button, a toggle, a lever, a keyboard, a mouse, a touch panel implemented in the display device 208, a touch pad provided apart from the display device 208, and so on.


Also, the input device 207 may include a sound input device that can accept sound/voice input from the user. The sound input device may, for example, include a microphone that can collect the user's voice.


Also, the input device 207 may include a gesture input device that can recognize and receive the user's gestures as inputs. The gesture input device may, for example, include a camera that can capture images of the user's gestures.


Also, the input device 207 may include a biological input device that can receive the user's biological inputs. The biological input device may be, for example, a camera that can acquire image data containing information about the user's fingerprint or iris.


The display device 208 may show an information screen or an operation screen to the user of the information processing device 200. The display device 208 may, for example, be an LCD display or an organic EL display.


The sound output device 209 communicates a variety of information to the user of the information processing device 200 through sound. The sound output device 209 may be, for example, a buzzer, an alarm, a speaker, and so forth.


<Structure of Remote Operation Assisting Device>

The hardware structure of the remote operation assisting device 400 may be the same as that of the information processing device 200. Therefore, illustration and explanation of the hardware structure of the remote operation assisting device 400 will be omitted.


Hereinafter, the communication interface, input device, and display device of the remote operation assisting device 400 may be referred to as “communication interface 406,” “input device 407,” and “display device 408,” respectively, for ease of explanation.


[First Example Functional Structure of Operation Assisting System]

Next, a first example functional structure of the operation assisting system SYS will be described below with reference to FIG. 1 to FIG. 5, as well as FIG. 6 to FIG. 10.



FIG. 6 is a functional block diagram that shows a first example functional structure of the operation assisting system SYS. FIG. 7 is a diagram that shows an example of a work range. FIG. 8 to FIG. 10 are diagrams that show first to third example methods of displaying recommended work locations for the excavator 100.


Note that, in FIG. 8 to FIG. 10, differences in the colors of the heat maps are expressed using varying patterns.


In the following description, the term "the path of the working part of the excavator 100" may be used to refer both to the route that the working part of the excavator 100 has already traveled (that is, the path) and to the route that the working part of the excavator 100 may travel in the future. The working part corresponds to the tip of the attachment AT used to make changes to the working target. To be more specific, the working part may be the bucket 6.


The excavator 100 may include an assisting device 150. In this example, the assisting device 150 assists the user who performs jobs by operating the excavator 100 (that is, the operator of the excavator 100).


As shown in FIG. 6, the assisting device 150 includes a controller 30, sensors 40, a display device 50A, and sensors S1 to S9.


The controller 30 includes, as its functional parts, an operation log providing part 301 and an operation assisting part 302.


Note that, when the operation assisting system SYS includes multiple excavators 100, there may be an excavator 100 in which the controller 30 includes only the operation log providing part 301, and an excavator 100 in which the controller 30 includes only the operation assisting part 302. In this case, the former excavator 100 only keeps an operation log of the excavator 100 and gives it to the information processing device 200, while the job assisting function is used in the latter excavator 100. The same may be true for the following second example functional structure (FIG. 13) and third example functional structure (FIG. 15).


The information processing device 200 may include, as its functional parts, a log acquiring part 2001, a simulator part 2002, a log storage part 2003, a training data generating part 2004, a machine learning part 2005, a trained model storage part 2006, and a delivery part 2007.


The operation log providing part 301 may be a functional part for keeping an operation log while the excavator 100 performs predetermined movements and giving the log to the information processing device 200.


Examples of predetermined movements used when the excavator 100 excavates may include excavation, an operation of raising and rotating the boom, an operation of lowering and rotating the boom, unloading of earth and sand, and brooming. Also, examples of predetermined movements used when the excavator 100 levels land may include excavation, unloading of earth and sand, sweeping, flattening, compaction, brooming, and so forth. Furthermore, examples of predetermined movements used when the excavator 100 slopes land may include cutting of land, compaction, and so forth. Sweeping may, for example, refer to an operation in which the attachment AT is controlled such that the bucket 6 is pushed forward along the ground, thereby sweeping out earth and sand with the back of the bucket 6. In the sweeping operation, for example, the attachment AT may operate to lower the boom 4 and then open up the arm 5, and so on. Flattening may, for example, refer to an operation in which the attachment AT is controlled such that the tip of the bucket 6 moves substantially horizontally along the ground, toward the front part of the excavator 100, thereby smoothing out the unevenness of the ground (the terrain's surface). In the flattening operation, for example, the attachment AT may operate to raise the boom 4 and then fold the arm 5, and so on. Compaction may, for example, refer to an operation in which the attachment AT is controlled such that the back of the bucket 6 is pressed against the ground. Also, compaction may refer to an operation in which the bucket 6 is moved upward and downward and the back of the bucket 6 is pressed against the ground, thereby hitting the ground with the back of the bucket 6.
Also, compaction may refer to an operation in which the bucket 6 is pushed forward along the ground such that the earth and sand are swept forward with the back of the bucket 6 up to a specific position, and in which, subsequently, the back of the bucket 6 is pressed against the ground at the specific position. In the compaction operation, for example, the attachment AT operates to lower the boom 4 to compress the ground. Brooming may, for example, refer to an operation in which the upper rotating body 3 of the excavator 100 is controlled such that the bucket 6 rotates left and right while it is kept aligned with the ground. Also, brooming may refer to, for example, an operation in which the attachment AT and the upper rotating body 3 are controlled such that the bucket 6 is pushed forward while being kept aligned with the ground and rotated left and right alternately. In the brooming operation, for example, the upper rotating body 3 repeats rotating left and right alternately. Also, in the brooming operation, for example, in addition to the alternate left and right rotation of the upper rotating body 3, the attachment AT may lower the boom 4 and open up the arm 5, as in the sweeping operation.


The operation log of the excavator 100 may be time sequence data that represents the operation of the excavator 100. For example, the operation log of the excavator 100 may include time sequence data that represents the details of the operation by the operator. The time sequence data that represents the details of the operation by the operator may be, for example, time sequence output data of the operation pressure sensor 29 in the case of a hydraulic-pilot operating device 26, time sequence output data (operation signal data) of the operating device 26 in the case of an electric operating device 26, and so forth. Also, the operation log of the excavator 100 may be time sequence data that represents the posture of the excavator 100, and that is obtained from time sequence data output from the sensors S1 to S5 or computed from data output from the sensors S1 to S5.


For example, the operation log providing part 301 may acquire an operation log when the operator of the excavator 100 has a long history of operating the excavator 100 and is relatively experienced with respect to a predetermined criterion (hereinafter, for ease of explanation, referred to as “expert”), and give the operation log to the information processing device 200. By this means, as will be described below, machine learning based on the operation log of the excavator 100 can generate trained learning models LM1 and LM2 that reflect the operations of the excavator 100 by an expert.


The operation log providing part 301 may include an operation log recording part 301A, an operation log storage part 301B, and an operation log transmission part 301C.


The operation log recording part 301A may keep an operation log when the excavator 100 performs predetermined movements and record it in the operation log storage part 301B. For example, every time the excavator 100 makes a predetermined movement, the operation log recording part 301A records the operation log during that movement in the operation log storage part 301B.


The operation log storage part 301B stores the operation log of the excavator 100. For example, the operation log storage part 301B stores, for every predetermined movement that the excavator 100 performs, an operation log and data about the time (date and time) the predetermined movement is performed, which are associated with each other. The data about the time a predetermined movement is performed includes data about the times at which the predetermined movement is started and ended by the excavator 100. Also, if multiple predetermined movements are determined to be carried out, the operation log storage part 301B may store, for every predetermined movement the excavator 100 performs, an operation log, data about the time the predetermined movement is performed, and data about identification information of the predetermined movement, which are associated with each other. Hereinafter, data associated with the operation log of the excavator 100 may be referred to as “accompanying data” for ease of explanation. For example, the operation log storage part 301B accumulates record data that represents the correspondence between the operation log and the accompanying data for every predetermined movement that the excavator 100 performs. This builds a database of operation logs that are recorded when the excavator 100 performs predetermined movements.
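The pairing of an operation log with its accompanying data, accumulated per predetermined movement, can be sketched as follows. This is a minimal illustration; names such as `LogRecord` and `OperationLogStore` are hypothetical and not taken from the disclosure.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class LogRecord:
    # Time-sequence samples recorded while the predetermined movement runs.
    operation_log: list
    # Accompanying data: start/end times and the movement's identification.
    started_at: datetime
    ended_at: datetime
    movement_id: str

class OperationLogStore:
    """Accumulates one record per predetermined movement (cf. part 301B)."""
    def __init__(self):
        self._records = []

    def add(self, record: LogRecord) -> None:
        self._records.append(record)

    def records_for(self, movement_id: str) -> list:
        # Query the accumulated database by movement type.
        return [r for r in self._records if r.movement_id == movement_id]

store = OperationLogStore()
store.add(LogRecord([0.1, 0.4, 0.2],
                    datetime(2023, 12, 28, 9, 0), datetime(2023, 12, 28, 9, 1),
                    "excavation"))
store.add(LogRecord([0.0, 0.3],
                    datetime(2023, 12, 28, 9, 2), datetime(2023, 12, 28, 9, 3),
                    "compaction"))
print(len(store.records_for("excavation")))
```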


Of those operation logs stored in the operation log storage part 301B, ones that have been transmitted to the information processing device 200 by the operation log transmission part 301C described below may be subsequently deleted.


The operation log transmission part 301C may transmit, for example, an operation log and its accompanying data to the information processing device 200 via the communication device 60. The operation log may be kept when the excavator 100 makes a predetermined movement and stored in the operation log storage part 301B. The accompanying data may be associated with the operation log. The operation log transmission part 301C may also transmit record data that represents the correspondence between the operation logs of the excavator 100 and the accompanying data, prepared for every predetermined movement that the excavator 100 performs, to the information processing device 200.


For example, the operation log transmission part 301C may transmit an operation log of the excavator 100 and its accompanying data that are stored in the operation log storage part 301B but have not been transmitted before, to the information processing device 200, in response to receipt of a transmission request for an operation log of the excavator 100 from the information processing device 200. Also, the operation log transmission part 301C may, for example, autonomously transmit an operation log of the excavator 100 and its accompanying data that are stored in the operation log storage part 301B but have not been transmitted before, to the information processing device 200 at predetermined timing. The predetermined timing may be, for example, when the excavator 100 stops operating (when the key switch is turned off), when the excavator 100 starts operating (when the key switch is turned on), and so forth.
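The "transmit only logs that have not been transmitted before" behavior of the operation log transmission part 301C might look like the following sketch; the `send` callable is a hypothetical stand-in for an upload through the communication device 60.

```python
def transmit_pending(records, send):
    """records: list of dicts with a 'sent' flag; send: callable(record).
    Uploads every record not yet transmitted and marks it as sent, so the
    same log is never uploaded twice (cf. part 301C)."""
    sent_count = 0
    for rec in records:
        if not rec["sent"]:
            send(rec)           # upload the log plus its accompanying data
            rec["sent"] = True  # transmitted logs may be deleted later
            sent_count += 1
    return sent_count

logs = [{"id": 1, "sent": True}, {"id": 2, "sent": False}, {"id": 3, "sent": False}]
uploaded = []
n = transmit_pending(logs, uploaded.append)
print(n, [r["id"] for r in uploaded])  # 2 [2, 3]
```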


The log acquiring part 2001 may keep a log when the excavator 100 makes a predetermined movement.


The log when the excavator 100 makes a predetermined movement includes an operation log when the excavator 100 makes the predetermined movement and a log of the state of the target object in the work range. The log of the state of objects in the work range includes data that represents the state of the target object in the work range before and after the execution of the predetermined movement of the excavator 100. The state of objects in the work range includes the shape of objects in the work range and the properties of earth and sand in the work range (for example, the topographical shape of the ground surface in the work range). The operation log that is kept when the excavator 100 makes a predetermined movement is uploaded from the excavator 100. The state log of objects in the work range when the excavator 100 makes a predetermined movement is kept based on measurement data uploaded from the sensor group 300 and accompanying data uploaded from the excavator 100 (data at the time when a predetermined movement is executed).


The simulator part 2002 uses a virtual model of the excavator 100 and the object (that is, earth and sand) in the work range to perform a computer simulation of the predetermined movement of the excavator 100.


For example, earth and sand in the work range may be modeled as a collective body of micro particles using a distinct element method (DEM). By this means, the simulator part 2002 can have a virtual model of the excavator 100 perform a predetermined movement such as excavation. Then, by analyzing the movement of each micro particle, the simulator part 2002 can virtually reproduce the overall behavior of earth and sand in the work range as a collective body, the reactive force from the earth and sand, and so forth.
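As a toy illustration of modeling earth and sand as a collective body of particles (a drastically simplified stand-in for an actual DEM simulation), a one-dimensional sandpile can be relaxed until no slope exceeds a limit that plays the role of the angle of repose:

```python
def drop_grains(heights, column, count, repose_limit=2):
    """Toy sandpile: drop `count` grains at `column`, then relax until no
    neighboring columns differ by more than `repose_limit` (a stand-in for
    the angle of repose of earth and sand). Not an actual DEM solver."""
    h = list(heights)
    h[column] += count
    moved = True
    while moved:
        moved = False
        for i in range(len(h)):
            for j in (i - 1, i + 1):
                if 0 <= j < len(h) and h[i] - h[j] > repose_limit:
                    h[i] -= 1   # one grain slides to the lower neighbor
                    h[j] += 1
                    moved = True
    return h

pile = drop_grains([0, 0, 0, 0, 0], column=2, count=9)
print(pile, sum(pile))  # mass is conserved: the grains sum to 9
```

The relaxed pile approximates how dumped earth and sand spreads out until its slopes settle at the angle of repose, which is exactly the kind of aggregate behavior the simulator part 2002 reproduces at far higher fidelity.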


When the excavator 100 makes a predetermined movement using computer simulation, the simulator part 2002 acquires data about the path of the working part of the excavator 100 as a log, and also acquires data about the state of earth and sand in the work range before and after the predetermined movement is performed as a log. The former data is in effect an operation log of the computer simulation in which the excavator 100 makes the predetermined movement. The latter data is in effect a log of the state of objects in the work range in the computer simulation in which the excavator 100 makes the predetermined movement.


The simulator part 2002 performs the computer simulation such that the excavator 100 makes the predetermined movement in numerous patterns. For example, the state of the working target (earth and sand) might vary in each simulation, the working part of the excavator 100 might follow various paths, and so forth. By this means, the simulator part 2002 can accumulate, in the log storage part 2003, logs of when the excavator 100 makes a predetermined movement, based on computer simulations employing varying conditions.


The log storage part 2003 stores and accumulates the logs acquired by the log acquiring part 2001 and the simulator part 2002 when the excavator 100 makes a predetermined movement. For example, every time the excavator 100 actually performs a predetermined movement, or every time a computer simulation of a predetermined movement is performed, the log storage part 2003 stores an operation log, a log of the state of objects in the work range, and accompanying data, in association with each other. In the log storage part 2003, a log that is acquired by the log acquiring part 2001 and a log that is acquired by the simulator part 2002 may be stored in a distinguishable manner, or may be stored so as to be mixed up in an indistinguishable manner.


The training data generating part 2004 may generate training data for machine learning based on the logs stored in the log storage part 2003 when the excavator 100 makes a predetermined movement, and output a training data set, which is a collective body of a large volume of training data. The training data generating part 2004 may generate the training data autonomously by batch processing. The training data generating part 2004 may also generate the training data based on, or in response to, an input from the user of the information processing device 200. The training data generating part 2004 includes a training data generating part 2004A.


The training data generating part 2004A generates a training data set for generating the trained model LM1. Based on data that represents the state of objects in the work range of the excavator 100, the trained model LM1 infers the distribution of recommendability of a target work location (also interchangeably referred to as “work site”) in the work range of the excavator 100. For example, as shown in FIG. 7, the work range TA of the excavator 100 is divided into multiple rectangular small areas. The trained model LM1 infers the recommendability of each small area as a work location. Also, the trained model LM1 may infer the distribution of work location recommendability of the excavator 100 in the work range based on data that represents the state of objects in the work range of the excavator 100 and data that represents the target shape of the target object in the work range. For example, data that represents an object's target shape in the work range of the excavator 100 may be data representing a target working surface in the work range. Also, assuming that the excavator 100 makes a predetermined movement N times, the trained model LM1 may infer the distribution of work locations' recommendability for the excavator 100, for each of the first time to the N-th time, in the work range (where N is an integer of 2 or greater). Objects in the work range of the excavator 100 may include soil and sand in the work range as the working target object of the excavator 100. Also, objects in the work range of the excavator 100 may include obstacles other than the object that serves as the working target of the excavator 100. Saying that the excavator 100 changes the shape of an object in a predetermined movement may refer to, for example, a case in which an object's shape is changed in a predetermined movement in which the working part (for example, the bucket 6) of the attachment AT of the excavator 100 comes into contact with the object. 
A predetermined movement in which the working part of the attachment AT comes into contact with an object may be, for example, excavation in which the bucket 6 excavates earth and sand in the work range. Also, saying that the excavator 100 changes the shape of an object in a predetermined movement may refer to, for example, a case in which an object's shape is changed by unloading, in the work range, the object such as earth and sand contained in the bucket 6 that serves as the working part of the attachment AT. In a case in which the bucket 6 serves as the working part of the attachment AT of the excavator 100, “unloading” may refer to an operation of emptying the bucket 6 of the earth and sand contained therein. For example, if the predetermined movement of the excavator 100 refers to excavation, the trained model LM1 may infer the distribution of work location recommendability for the excavator 100 within the work range where the excavator 100 excavates with the bucket 6, based on data representing the shape of the target object in the work range of the excavator 100.
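The division of the work range TA into rectangular small areas, each receiving a recommendability score, can be sketched as a grid. The scoring function below is a hypothetical stand-in for the trained model LM1, not the actual model:

```python
def recommendability_grid(current, target, score):
    """current/target: 2-D lists giving the current and target terrain height
    per small area; score: stand-in for trained model LM1, mapping a
    (current, target) pair to a recommendability in [0, 1]."""
    return [[score(h, t) for h, t in zip(hrow, trow)]
            for hrow, trow in zip(current, target)]

def excess_soil_score(h, t):
    # Assumed toy rule: the farther the current surface sits above the
    # target working surface, the more recommendable the cell is as an
    # excavation location (clipped to [0, 1]).
    return max(0.0, min(1.0, (h - t) / 3.0))

current = [[3.0, 1.0], [4.0, 2.0]]
target  = [[1.0, 1.0], [1.0, 1.0]]
grid = recommendability_grid(current, target, excess_soil_score)
print(grid)
```

The real LM1 would replace `excess_soil_score` with a neural network trained on expert logs, but the output shape, one recommendability per small area of the work range, is the same.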


Input data supporting the trained model LM1 includes data that represents the state of the target object in the work range before the excavator 100 makes the predetermined movement. The state of the target object in the work range includes, for example, the shape of the target object in the work range. Also, the state of the target object in the work range may include the properties of the object (for example, earth and sand). For example, the data representing the properties of earth and sand includes data about the angle of repose of earth and sand. By this means, the trained model LM1 can infer the distribution of recommendability of a more appropriate work site by taking into account the angle of repose of earth and sand. Also, input data supporting the trained model LM1 may include data that represents the target shape of the target object in the work range of the excavator 100.


The training data is a combination of a type of input data specified for the trained model LM1 and data that represents the correct inference result (that is, true data) for the input data. The true data is given in response to the input data included in the training data, and represents the location where the excavator 100 changes the shape of the object in a predetermined movement in the work range. Also, when the distribution of work location recommendability for the excavator 100 within the work range is inferred for each of N predetermined movements of the excavator 100, the true data includes data representing the location in the work range where the excavator 100 changes the shape of the object in each of the first to N-th predetermined movements. Also, when multiple types of predetermined movements are specified, the trained model LM1 may be generated for each type of predetermined movement. In this case, the training data generating part 2004A generates a training data set for each type of predetermined movement.
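A training-data pair as described above, with the true data encoded as a one-hot grid over the small areas, might be assembled like this (a sketch; the flattened input representation is an assumption):

```python
def make_training_pair(pre_state, dig_cell, grid_shape):
    """pre_state: data representing the object's state before the movement
    (flattened here for simplicity); dig_cell: (row, col) of the small area
    where the operator actually changed the object's shape. Returns
    (input, true) with the true data as a one-hot grid."""
    rows, cols = grid_shape
    true = [[0.0] * cols for _ in range(rows)]
    r, c = dig_cell
    true[r][c] = 1.0
    return pre_state, true

x, y = make_training_pair([3.0, 1.0, 4.0, 2.0], dig_cell=(1, 0), grid_shape=(2, 2))
print(y)  # [[0.0, 0.0], [1.0, 0.0]]
```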


The training data set for generating the trained model LM1 is generated based on logs acquired by the log acquiring part 2001, for example. To be more specific, the training data may be the combination of data that represents the state of the target object in the work range before an expert starts operating the excavator 100 to make a predetermined movement, and data that represents the location where the excavator 100 changes the shape of the object in the predetermined movement. By this means, the information processing device 200 can generate a trained model LM1 that reflects the location where the excavator 100 changes the shape of the target object in the work range when the excavator 100 makes the predetermined movement being operated by an expert. Therefore, the trained model LM1 can receive data that represents the state of the target object in the work range of the excavator 100 as an input, and infer a distribution of work location recommendability for the excavator 100, that is, one that takes the expertise of an expert into account. In other words, the distribution of every potential work location's recommendability for the excavator 100 corresponds to the distribution of every potential work location's reliability for the excavator 100 in terms of job efficiency, safety, and so forth, based on the expertise of an expert. For example, the shorter the distance that the lower traveling body 1 needs to travel to perform a job to change the shape of an object at a target location, and the smaller the rotating angle of the upper rotating body 3, the easier it is for the working part of the attachment AT to reach that location. On the other hand, the greater the distance that the lower traveling body 1 needs to travel to perform a job to change the shape of an object at a target location, and the greater the rotating angle of the upper rotating body 3, the more difficult it is for the working part of the attachment AT to reach that location. 
Therefore, from the perspective of job efficiency and so forth, an expert is more likely to select, as the work location of the excavator 100, a location to which the working part of the attachment AT of the excavator 100 can be brought with ease. In this case, the trained model LM1 may estimate the distribution of work location recommendability for the excavator 100 within the work range such that a location that the attachment AT of the excavator 100 can reach with ease shows a higher recommendability as a potential work location than a location that the attachment AT of the excavator 100 has difficulty reaching. Also, the training data set for generating the trained model LM1 may include a base training data set and a training data set for final fine tuning.


The machine learning part 2005 generates a trained model by performing machine learning on the base training model based on the training data set generated by the training data generating part 2004. The trained model (and the base training model) includes, for example, a neural network such as a deep neural network (DNN).


The machine learning part 2005 includes a machine learning part 2005A.


The machine learning part 2005A controls the base training model M1 to perform machine learning based on the training data set output from the training data generating part 2004A. By this means, the machine learning part 2005A can generate a trained model LM1 that can output (infer) the distribution of work location recommendability for the excavator 100 in a given work range by using data about the state of objects in the work range of the excavator 100 as an input. For example, the machine learning part 2005A can use an error backpropagation algorithm based on the error between the output data of the training model M1 for the input data included in the training data, and the true data, thus optimizing the training model M1 and generating a trained model LM1.
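A minimal supervised-learning sketch of the error backpropagation idea, using a linear-softmax stand-in for the training model M1 (the real model would be a deep neural network, and all names here are illustrative):

```python
import math, random

def softmax(z):
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def train_step(W, x, y, lr=0.1):
    """One error-backpropagation step for a linear-softmax model:
    cell scores z = W x, p = softmax(z); for cross-entropy loss the
    gradient is dL/dW[k][i] = (p[k] - y[k]) * x[i]."""
    z = [sum(w_i * x_i for w_i, x_i in zip(row, x)) for row in W]
    p = softmax(z)
    for k, row in enumerate(W):
        g = p[k] - y[k]
        for i in range(len(row)):
            row[i] -= lr * g * x[i]
    # Cross-entropy loss against the one-hot true data (before the update).
    return -math.log(p[y.index(1.0)])

random.seed(0)
x = [3.0, 1.0, 4.0, 2.0]   # state of the work range, flattened
y = [0.0, 0.0, 1.0, 0.0]   # true data: the expert worked at cell 2
W = [[random.uniform(-0.1, 0.1) for _ in x] for _ in y]
losses = [train_step(W, x, y) for _ in range(50)]
print(losses[0] > losses[-1])  # the loss decreases as W is optimized
```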


Also, the machine learning part 2005A may generate a trained model LM1 by applying reinforcement learning instead of supervised learning to the training model M1. In this case, the training data generating part 2004A may be omitted. For example, the machine learning part 2005A may apply reinforcement learning to the training model M1 based on logs acquired from the log acquiring part 2001 and the simulator part 2002 such that a predetermined reward related to the job efficiency or safety of the excavator 100 is maximized. At this time, the machine learning part 2005A may work in conjunction with the simulator part 2002 to have the simulator part 2002 try many movement patterns, thereby enabling more efficient reinforcement learning for the training model M1.
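The reward-maximizing idea can be illustrated with a toy bandit-style loop, where candidate work locations are the actions and a stand-in for the simulator part 2002 returns a job-efficiency reward. This is far simpler than actual reinforcement learning on the training model M1; the reward means below are assumed values.

```python
import random

def reinforce_bandit(reward_fn, n_actions, episodes=500, eps=0.2, seed=1):
    """Epsilon-greedy bandit sketch: each action is a candidate work
    location; reward_fn plays the role of the simulator returning a reward
    related to job efficiency; Q-values are learned as running averages."""
    rng = random.Random(seed)
    q = [0.0] * n_actions
    n = [0] * n_actions
    for _ in range(episodes):
        if rng.random() < eps:
            a = rng.randrange(n_actions)     # explore another location
        else:
            a = q.index(max(q))              # exploit the best so far
        r = reward_fn(a, rng)
        n[a] += 1
        q[a] += (r - q[a]) / n[a]            # incremental mean update
    return q

# Assumed environment: location 1 yields the highest reward on average.
means = [0.2, 0.9, 0.5]
q = reinforce_bandit(lambda a, rng: means[a] + rng.gauss(0, 0.05), 3)
print(q.index(max(q)))  # 1: the loop learns to favor the efficient location
```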


The trained model storage part 2006 may store the trained model LM1 output from the machine learning part 2005. Also, when the trained model LM1 is trained again or additionally by the machine learning part 2005A, the trained model LM1 in the trained model storage part 2006 may be updated. Also, when the trained model LM1 is updated, the trained model LM1 before the updating may be stored in the trained model storage part 2006 or in a different memory part in a reusable fashion. By this means, for example, when there is a problem with the trained model LM1 after it is updated, the trained model LM1 can be restored to its pre-update state and reused.


The delivery part 2007 distributes the data of the trained model LM1 to the excavator 100.


For example, assuming that a trained model LM1 is generated or updated by the machine learning part 2005A, the delivery part 2007 may deliver the most recently generated or updated trained model LM1 to the excavator 100. Also, the delivery part 2007 may deliver the latest trained model LM1 in the trained model storage part 2006, to the excavator 100, in response to a signal from the excavator 100 requesting delivery of the trained model LM1.


The operation assisting part 302 is a functional part for assisting the user (that is, the operator of the excavator 100) who performs jobs by operating the excavator 100.


The operation assisting part 302 includes a trained model storage part 302A, a state acquiring part 302B, a recommendability distribution estimating part 302C, and a recommending part 302D.


The trained model storage part 302A stores the trained model LM1 delivered from the information processing device 200 and received through the communication device 60.


The state acquiring part 302B acquires data that represents the state of the target object in the work range of the excavator 100. For example, the state acquiring part 302B acquires data that represents the state of the shape, properties, etc. of the target object in the work range of the excavator 100 based on the output of the sensors 40. Also, the state acquiring part 302B may acquire data that represents the state of the shape, properties, etc. of the target object in the work range of the excavator 100 based on the output of the sensor 300-X acquired through the communication device 60. In this case, the sensors 40 may be omitted. Also, there may be a location where the sensors 40 or the sensor 300-X are occluded and cannot detect data that represents the state of the target object. In this case, the state acquiring part 302B may acquire data representing the shape of the target object in the work range of the excavator 100 by estimating the change in the shape of the target object (that is, earth and sand) since before the excavator 100 last performed a predetermined movement, based on the path (that is, route) that the working part of the attachment AT followed during that movement. Also, the state acquiring part 302B may estimate the reactive force that worked from the target object in the work range onto the working part when the excavator 100 last performed the predetermined movement, based on the measurement data of the sensors S7 to S9. The state acquiring part 302B may acquire data that represents the state of the target object in the work range of the excavator 100 based on that estimation result.


Note that the state of the target object in the work range of the excavator 100 may also be estimated, and data that represents the state of the target object acquired, without relying on the output of the sensors 40 or the sensor 300-X. For example, the information processing device 200 may use the simulator part 2002 to build a digital twin that shows the status of the job at the work site of the excavator 100. In this case, the information processing device 200 can update the digital twin based on log data uploaded on a real time basis from the excavator 100, and output the estimated state in the work range, such as the state of the target object in the current work range of the excavator 100 and its properties, from the digital twin. Therefore, the output of the digital twin may be sent from the information processing device 200 to the excavator 100 through the communication interface 206. This allows the state acquiring part 302B to acquire data that represents the estimated state of the target object in the work range of the excavator 100. Also, for a given input, the machine learning part 2005 of the information processing device 200 may generate a trained learning model LM4 that can infer the state of the target object in the work range of the excavator 100. For example, the trained learning model LM4 receives, as inputs, the state of the target object in the work range before the excavator 100 makes a predetermined movement and the path (that is, route) of the working part while the excavator 100 makes the predetermined movement, and estimates the shape of the target object in the work range after the excavator 100 makes the predetermined movement. In this case, the training data set for the trained learning model LM4 is generated by the training data generating part 2004 based on log data acquired by the log acquiring part 2001 and log data generated based on the simulation by the simulator part 2002. 
By this means, every time the excavator 100 makes a predetermined movement, the state acquiring part 302B can estimate the current state in the work range after the excavator 100 makes the predetermined movement, using the trained learning model LM4 delivered from the delivery part 2007 of the information processing device 200.
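Estimating the terrain's shape change from the path of the working part, as described above, can be sketched for a one-dimensional terrain profile (a simplification; the real estimation would be three-dimensional and could use the trained learning model LM4):

```python
def apply_dig_path(heights, path):
    """heights: 1-D terrain profile, one height per cell; path: (cell,
    tip_height) samples of the working part's route during the movement.
    Wherever the bucket tip passed below the current surface, the surface
    is assumed to be cut down to the tip's height."""
    h = list(heights)
    for cell, tip in path:
        if tip < h[cell]:
            h[cell] = tip
    return h

before = [2.0, 2.0, 2.0, 2.0]
after = apply_dig_path(before, [(1, 1.2), (2, 0.8), (2, 1.5)])
print(after)  # [2.0, 1.2, 0.8, 2.0]
```

This kind of dead-reckoned update lets the shape estimate advance even for cells the sensors 40 or the sensor 300-X cannot currently observe.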


The recommendability distribution estimating part 302C estimates the distribution of work location recommendability for the excavator 100 within the work range based on data that represents the state of the target object in the work range of the excavator 100 acquired by the state acquiring part 302B. To be more specific, the recommendability distribution estimating part 302C applies the trained model LM1 to the input data including data that represents the state of the target object in the work range of the excavator 100, and estimates the distribution of work location recommendability for the excavator 100 within the work range. Also, the recommendability distribution estimating part 302C may use the trained model LM1 to estimate the distribution of work location recommendability in each of the first time to the N-th time the excavator 100 makes a predetermined movement in the work range, based on the data that represents the state of the target object in the work range of the excavator 100.


Based on the estimation result in the recommendability distribution estimating part 302C, the recommending part 302D recommends a work location for the excavator 100 in the work range to the user in the cabin 10 in a visible manner through the display device 50A. Also, when the excavator 100 is operated remotely, the recommending part 302D recommends a work location for the excavator 100 in the work range to the user in a visible manner through the display device 408 of the remote operation assisting device 400 based on the estimation result of the recommendability distribution estimating part 302C. In this case, for example, the recommending part 302D can control the display device 408 by transmitting a control command including information for recommending a work site of the excavator 100 in the work range to the remote operation assisting device 400 via the communication device 60. By this means, the user can determine, relatively easily, at which location in the excavator 100's work range the shape of a target object should be changed when the excavator 100 is operated in a predetermined way, such as the location where the bucket 6 is to excavate in the excavation operation of the excavator 100.


For example, the recommending part 302D displays the distribution of work location recommendability for the excavator 100 within the work range estimated by the recommendability distribution estimating part 302C, on the display device 50A or on the display device 408. By this means, the recommending part 302D can recommend, to the user, as a potential work location, a location where the recommendability is relatively high within the work range of the excavator 100, based on the distribution of work location recommendability for the excavator 100 within the work range displayed on the display device 50A, etc.


For example, as shown in FIG. 8, the recommending part 302D recommends a work location for the excavator 100 in the work range through a screen 800 of the display device 50A.


The screen 800 includes a work range image 801, an excavator image 802, a target shape image 803, and a recommendability distribution image 804.


The work range image 801 is an image that represents a simulated two-dimensional cross-sectional view of the work range of the excavator 100.


The excavator image 802 is an image that represents a simulated left side view of the excavator 100.


The target shape image 803 is an image that represents, in two dimensions, the target shape of the work range, to be more specific, the target working surface.


The recommendability distribution image 804 is an image that represents the distribution of work location recommendability for the excavator 100 in the work range. In this example, the recommendability distribution image 804 extracts and displays, from the distribution of work location recommendability for the excavator 100 within the work range, only the distribution of recommendability around locations where the recommendability is significantly high. The recommendability distribution image 804 includes recommendability distribution images 804A to 804C that correspond to locations where the recommendability as a potential work location is significantly high in the work range of the excavator 100. By this means, the controller 30 can recommend, via the display device 50A, the three locations corresponding to the recommendability distribution images 804A to 804C in the work range as work locations for the excavator 100.
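The extraction of only the distribution around significantly high recommendability can be sketched as a thresholded peak-neighborhood filter. The function name, the threshold, and the neighborhood radius below are illustrative assumptions, not part of the disclosure.

```python
from typing import List

def extract_peak_regions(
    scores: List[float], threshold: float = 0.6, radius: int = 1
) -> List[int]:
    """Return the cell indices to display: cells within `radius` of any
    cell whose recommendability exceeds `threshold`. All other cells are
    omitted, so only the distribution around high peaks is shown."""
    peaks = [i for i, s in enumerate(scores) if s > threshold]
    keep = set()
    for p in peaks:
        for i in range(max(0, p - radius), min(len(scores), p + radius + 1)):
            keep.add(i)
    return sorted(keep)

scores = [0.1, 0.7, 0.3, 0.2, 0.9, 0.4, 0.1]
regions = extract_peak_regions(scores)  # cells 1 and 4 exceed the threshold
```

Each contiguous run of kept cells would then correspond to one displayed image, such as the recommendability distribution images 804A to 804C.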


A recommendability distribution image 804A is an image showing the distribution of work location recommendability near the top of a slope in the work range of the excavator 100. A recommendability distribution image 804B is an image showing the distribution of work location recommendability near the middle of a slope in the work range of the excavator 100. A recommendability distribution image 804C is an image showing the distribution of work location recommendability near the foot of a slope in the work range of the excavator 100.


In this example, recommendability distribution image 804B has a maximum at a relatively large value of the recommendability as a potential work location. On the other hand, recommendability distribution images 804A and 804C have maxima at relatively small values of the recommendability as a potential work location. Therefore, the user can easily recognize that the location corresponding to recommendability distribution image 804B is more highly recommended as a work location than the locations corresponding to recommendability distribution images 804A and 804C.


In this way, in this example, the recommending part 302D can recommend a work location of the excavator 100 in the work range to the user by displaying, on the display device 50A, the work range represented in two dimensions and the distribution of work location recommendability for the excavator 100 within the work range.


Also, as shown in FIG. 9, the recommending part 302D may recommend a work location of the excavator 100 in the work range through a screen 900 of the display device 50A.


The screen 900 includes a work range image 901 and a recommendability distribution image 904.


The work range image 901 is an image that represents the work range of the excavator 100 in a simulated three-dimensional manner. In this example, the work range image 901 represents the work range of the excavator 100 in a simulated three-dimensional manner as seen from the viewpoint of an operator of the cabin 10 of the excavator 100.


Note that the work range image 901 may be replaced with a viewpoint conversion image that is generated based on an image captured by an image capturing device (camera) as the sensor 40.


The recommendability distribution image 904 is an image that represents the distribution of work location recommendability for the excavator 100 within the work range. In this example, like the recommendability distribution image 804 described above, the recommendability distribution image 904 extracts and displays, from the distribution of work location recommendability for the excavator 100 within the work range, only the distribution of recommendability around locations where the recommendability is significantly high. The recommendability distribution image 904 includes recommendability distribution images 904A to 904C corresponding to locations where the recommendability as a potential work location is significantly high in the work range of the excavator 100. By this means, the controller 30 can recommend, via the display device 50A, the three locations corresponding to the recommendability distribution images 904A to 904C in the work range as work locations for the excavator 100.


In this example, recommendability distribution image 904A has a maximum at a relatively large value of the recommendability as a potential work location. On the other hand, recommendability distribution images 904B and 904C have maxima at relatively small values of the recommendability as a potential work location. Therefore, the user can easily recognize that the location corresponding to recommendability distribution image 904A is more highly recommended as a work location than the locations corresponding to recommendability distribution images 904B and 904C.


Thus, in this example, the recommending part 302D can recommend a work location to the user by displaying, on the display device 50A, the work range as seen from the cabin of the excavator 100, represented in three dimensions, and the distribution of work location recommendability for the excavator 100 within that work range. Therefore, for example, a user (operator) in the cabin 10 of the excavator 100 can visually recognize the distribution of work location recommendability in the work range in the same field of view as the actual work range seen by the operator himself from the cabin 10.


Note that the screen 900 may include, in addition to the work range image 901 and the recommendability distribution image 904, an image that represents the target shape of the working target in three dimensions.


Also, as shown in FIG. 10, the recommending part 302D may recommend a work location of the excavator 100 in the work range through the screen 1000 of the display device 50A.


The screen 1000 includes a work range image 1001, an excavator image 1002, and a recommendability distribution image 1004.


The work range image 1001 is an image that simulates the work range of the excavator 100 in three dimensions. In this example, the work range image 1001 simulates the work range when viewed from a viewpoint in front of the excavator 100 facing the excavator 100.


Note that the work range image 1001 may be replaced with a viewpoint conversion image that is generated based on an image captured by an image capturing device (camera) serving as the sensor 40.


The excavator image 1002 is an image that simulates the excavator 100 viewed from the front in three dimensions.


The recommendability distribution image 1004 is an image that represents the distribution of work location recommendability for the excavator 100 in the work range. In this example, the recommendability distribution image 1004 extracts and displays, from the distribution of work location recommendability for the excavator 100 in the work range, only the distribution of recommendability around locations where the recommendability is significantly high, and includes recommendability distribution images 1004A to 1004C corresponding to such locations. By this means, the controller 30 can recommend, through the display device 50A, the three locations corresponding to the recommendability distribution images 1004A to 1004C in the work range as work locations for the excavator 100.


In this example, recommendability distribution image 1004B has a maximum at a relatively large value of the recommendability as a potential work location. On the other hand, recommendability distribution images 1004A and 1004C have maxima at relatively small values of the recommendability as a potential work location. Therefore, the user can easily recognize that the location corresponding to the recommendability distribution image 1004B is more highly recommended as a work location than the locations corresponding to the recommendability distribution images 1004A and 1004C.


Thus, in this example, the recommending part 302D can recommend a work location to the user by displaying, on the display device 50A, the work range represented in three dimensions as viewed from the front of the excavator 100, and the distribution of work location recommendability for the excavator 100 in that work range. Therefore, the user can visually confirm the distribution of recommendability while checking both the work range viewed from the excavator 100 side and the work range viewed from the front of the excavator 100, i.e., from the opposite side.


Note that screen 1000 may include an image showing the target shape of the working target in three dimensions, in addition to work range image 1001, excavator image 1002, and recommendability distribution image 1004.


Also, when the excavator 100 is operated remotely, the recommending part 302D may cause the display device 408 of the remote operation assisting device 400 to display screen 800, screen 900, or screen 1000.


Also, the recommending part 302D may express differences in the recommendability as a potential work location for each recommended work site by expressing each recommendability as a numerical value instead of using varying colors for each location recommended as a potential work location.


Also, when displaying the distribution of work location recommendability in a work range, the recommending part 302D may display the distribution of recommendability limited to a predetermined range in the work range that the excavator 100 judges to be easy to reach with the working part of the attachment AT. Here, "small relative to a reference distance" may mean equal to or less than the reference distance, or less than the reference distance. Similarly, "small relative to a reference angle" may mean equal to or less than the reference angle, or less than the reference angle. The reference distance may be zero. Similarly, the reference angle may be zero.


Also, the recommending part 302D may not recommend a work location if there is no location in the working target of the excavator 100 whose recommendability as a potential work location is high relative to a predetermined criterion. In this case, the recommending part 302D, for example, displays on the display device 50A or the display device 408 an indication that there are no work locations to recommend.


Also, the recommending part 302D may visually recommend, through the display device 50A or the display device 408, a work location for each of the first to N-th future repetitions of the predetermined movement of the excavator 100 in the work range. In this case, the recommending part 302D, for example, displays an image representing the work range on the display device 50A or the display device 408, and overlays, for each repetition, an image representing the location with the highest recommendability in the work range.


Note that some or all of the functions of the state acquiring part 302B, the recommendability distribution estimating part 302C, and the recommending part 302D may be transferred to the outside of the excavator 100. For example, the functions of the state acquiring part 302B, the recommendability distribution estimating part 302C, and the recommending part 302D may be transferred to the information processing device 200, and the display contents of the display device 50A may be controlled from the information processing device 200. Also, for example, when the excavator 100 is operated remotely, the functions of the state acquiring part 302B, the recommendability distribution estimating part 302C, and the recommending part 302D may be provided in the remote operation assisting device 400. In these cases, for example, the outputs of the sensor 40 and the sensors S1 to S9 are uploaded in real time to the information processing device 200 or the remote operation assisting device 400 via the communication device 60.


[First Example Process for Recommending Work Location]

Next, referring to FIG. 11, a first example of a process for recommending a work location of the excavator 100 will be described.



FIG. 11 is a flowchart that schematically shows a first example of a process for recommending a work location of the excavator 100.


The flowchart of FIG. 11 is executed, for example, when a function to recommend a work location of the excavator 100 is enabled and a job by a predetermined movement of the excavator 100 is executed. The same may be true for the flowcharts of FIG. 12, FIG. 14, and FIG. 16, which will be described below.


As shown in FIG. 11, in step S10, the state acquiring part 302B acquires data that represents the shape of a target object in the work range of the excavator 100 based on the output from the sensor 40, etc.


When the process in step S10 is completed, the controller 30 proceeds to step S11.


In step S11, the recommendability distribution estimating part 302C estimates the distribution of work location recommendability for the excavator 100 within the work range, by using the trained model LM1, based on the data obtained in step S10.


When the process in step S11 is completed, the controller 30 proceeds to step S12.


In step S12, the recommending part 302D recommends a work location for the excavator 100 through the display device 50A or the display device 408, based on the distribution of work location recommendability for the excavator 100 within the work range estimated in step S11.


When the process of step S12 is completed, the controller 30 proceeds to step S13.


In step S13, the controller 30 determines whether or not the excavator 100 has started and completed a predetermined movement after the recommendation in step S12. The controller 30 makes this determination using predefined operating conditions, based on, for example, the output of the sensors S1 to S9. If the excavator 100 has started and then completed a predetermined movement, the controller 30 proceeds to step S14; otherwise, the controller 30 repeats the process of step S13.


In step S14, the controller 30 determines whether a predetermined condition (hereinafter “job end condition”) indicating the end of the job of the excavator 100 is satisfied. The job end condition may be, for example, that a predetermined input indicating the end of the job is received from the user through the input device 52 or the input device 407. The job end condition may also be, for example, a condition related to time. The job end condition may also be a condition indicating that the shape of the target object in the work range has arrived at the target shape. If the job end condition is satisfied, the controller 30 ends the procedures of this flowchart, and if the job end condition is not satisfied, the controller 30 returns to step S10 and repeats the processing from step S10 onwards.
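The control flow of FIG. 11 (steps S10 to S14) can be sketched as a loop. The callables below are hypothetical hooks standing in for the parts 302B to 302D and the determinations by the controller 30; none of these names come from the disclosure.

```python
def recommendation_loop(acquire_state, estimate, recommend,
                        movement_completed, job_ended):
    """One-to-one sketch of the FIG. 11 flowchart."""
    cycles = 0
    while True:
        state = acquire_state()          # S10: shape of the target object
        distribution = estimate(state)   # S11: apply trained model LM1
        recommend(distribution)          # S12: recommend a work location
        while not movement_completed():  # S13: wait until the predetermined
            pass                         #      movement is completed
        cycles += 1
        if job_ended():                  # S14: job end condition satisfied?
            return cycles                # end the procedure

# One-cycle dry run with trivial stub hooks:
shown = []
cycles = recommendation_loop(
    acquire_state=lambda: [3.0, 2.0],
    estimate=lambda s: [v / max(s) for v in s],
    recommend=shown.append,
    movement_completed=lambda: True,
    job_ended=lambda: True,
)
```

When the job end condition is not satisfied, control returns to S10, matching the flowchart's loop back from step S14.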


[Second Example Process for Recommending Work Location]

Next, referring to FIG. 12, a second example of a process for recommending a work location of the excavator 100 will be described.


As shown in FIG. 12, the process in step S20 is the same as the process in step S10 in FIG. 11, and so its description will be omitted.


When the process in step S20 is completed, the controller 30 proceeds to step S21.


In step S21, the recommendability distribution estimating part 302C estimates the distribution of work location recommendability for the excavator 100 within the work range, in each of the first to N-th times the excavator 100 performs the predetermined movement, using the trained model LM1 based on the data acquired in step S20.


When the process for step S21 is completed, the controller 30 proceeds to step S22.


In step S22, the recommending part 302D recommends potential work locations for the excavator 100, corresponding to the first to N-th times the excavator 100 performs the predetermined movement in the future, through the display device 50A or the display device 408, based on the distribution of work location recommendability for the excavator 100 in the work range estimated in step S21.


When the process for step S22 is completed, the controller 30 proceeds to step S23.


The processes of steps S23 and S24 are the same as the processes of steps S13 and S14 in FIG. 11, and so their description will be omitted.


Note that in step S23, whether or not the predetermined movement of the excavator 100 has been completed N times is determined, not whether or not the predetermined movement of the excavator 100 has been completed once.


[Second Example Functional Structure of Operation Assisting System]

Next, a second example functional structure of the operation assisting system SYS will be described with reference to FIG. 13, in addition to FIG. 1 to FIG. 5.


In the following description of this example, the same reference numerals are used for parts that are the same as or correspond to those in the first example (FIG. 6) described above. The description focuses on parts that differ from the first example, and description of the same or corresponding parts may be omitted.



FIG. 13 is a functional block diagram that shows a second example functional structure of the operation assisting system SYS.


This example differs from the first example described above mainly in that the trained learning model LM2 is generated by the information processing device 200 and that the controller 30 includes a path generation part 302E.


The controller 30 of the excavator 100 includes an operation log providing part 301 and an operation assisting part 302 as functional parts, as in the first example described above.


The information processing device 200 includes, as its functional parts, a log acquiring part 2001, a simulator part 2002, a log storage part 2003, a training data generating part 2004, a machine learning part 2005, a trained model storage part 2006, and a delivery part 2007, as in the first example described above.


The training data generating part 2004 includes a training data generating part 2004B in addition to the training data generating part 2004A.


The training data generating part 2004B generates training data for generating a trained learning model LM2. The trained learning model LM2 infers the path of the working part when the excavator 100 makes a predetermined movement, based on data that represents the state of the target object in the work range of the excavator 100. The trained learning model LM2 may also infer the path of the working part when the excavator 100 makes a predetermined movement, based on data that represents the state of the target object in the work range of the excavator 100 and data that represents the target shape of the target object in the work range. Also, the trained learning model LM2 may receive data specifying a work location of the excavator 100 as a constraint, and may infer the path of the working part in the predetermined movement of the excavator 100 under the constraint.


The input data corresponding to the trained learning model LM2 includes, for example, data that represents the state of the target object in the work range of the excavator 100. The data that represents the state of the target object (that is, earth and sand) in the work range includes, for example, data representing the shape of the target object in the work range, as described earlier. Also, the data that represents the state of the target object in the work range may include data representing the properties of the object (that is, earth and sand) in the work range, as described earlier. Also, the input data corresponding to the trained learning model LM2 may include data that represents the target shape of the working target. Also, when the predetermined movement of the excavator 100 is operation of unloading sand and earth, the input data corresponding to the trained learning model LM2 may include data representing the weight or volume of earth and sand contained in the bucket 6 before unloading.


The training data is a combination of input data of a type specified for the trained learning model LM2 and data (true data) representing a correct inference result for the input data. To be more specific, the input data included in the training data includes data that represents the state of the work range before the excavator 100 makes the predetermined movement. Also, the input data included in the training data may include data that represents the target shape of an object in the work range of the excavator 100. Also, when the predetermined movement of the excavator 100 is an operation of unloading sand and earth, the input data included in the training data may include data representing the weight or volume of the earth contained in the bucket 6 before unloading. Also, the true data included in the training data includes data representing the path of the working part when the excavator 100 makes the predetermined movement by operation by the expert, assuming the state of the target object in the work range for the input data. That is, the training data generating part 2004B generates a training data set based on the log acquired by the log acquiring part 2001 when the excavator 100 makes a predetermined movement through an operation by an expert. Also, if multiple types of predetermined movements are specified, a trained learning model LM2 may be generated for each type of predetermined movement. In this case, the training data generating part 2004B generates a training data set for each type of predetermined movement.
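The pairing of input data with true data described above can be sketched as follows. The log record fields (`shape_before`, `target_shape`, `expert_path`) are hypothetical names chosen for illustration; the disclosure does not specify a log format.

```python
from typing import Any, Dict, List, Tuple

def build_training_set(
    logs: List[Dict[str, Any]],
) -> List[Tuple[Dict[str, Any], Any]]:
    """Turn expert-operation logs into (input data, true data) pairs
    for training toward the trained learning model LM2."""
    pairs = []
    for record in logs:
        input_data = {
            # state of the work range before the predetermined movement
            "shape_before": record["shape_before"],
            # target shape of the object in the work range (optional)
            "target_shape": record.get("target_shape"),
        }
        # true data: path of the working part under expert operation
        truth = record["expert_path"]
        pairs.append((input_data, truth))
    return pairs

logs = [{"shape_before": [3.0, 4.0], "expert_path": [(0, 3.0), (1, 2.5)]}]
dataset = build_training_set(logs)
```

If multiple types of predetermined movements are specified, one such data set would be built per movement type, as the text notes.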


The machine learning part 2005 includes a machine learning part 2005B in addition to the machine learning part 2005A.


The machine learning part 2005B causes the training model M2, which serves as the base, to perform machine learning based on the training data set output from the training data generating part 2004B. By this means, the machine learning part 2005B can generate a trained learning model LM2 that can infer the path of a working part in a predetermined movement of the excavator 100 by using data about the state of an object in the work range of the excavator 100 as input. For example, the machine learning part 2005B can generate the trained learning model LM2 by optimizing the training model M2 using an error backpropagation algorithm, based on the error between the output data of the training model M2 for the input data included in the training data and the true data.
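In its simplest form, the optimization described above reduces to gradient descent on the error between the model output and the true data. As a minimal, hedged stand-in (a single linear weight rather than the actual training model M2):

```python
def train_step(w: float, x: float, y_true: float, lr: float = 0.1) -> float:
    """One gradient-descent update for the toy model y = w * x,
    minimizing the squared error (w * x - y_true) ** 2."""
    y_pred = w * x
    grad = 2.0 * (y_pred - y_true) * x  # d/dw of the squared error
    return w - lr * grad

# Repeated updates drive the weight toward the value that reproduces
# the true data, analogous to optimizing M2 into the trained model LM2.
w = 0.0
for _ in range(100):
    w = train_step(w, x=1.0, y_true=2.0)
```

Backpropagation generalizes this single-weight update to every parameter of a multi-layer model by applying the chain rule layer by layer.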


Also, the machine learning part 2005B may generate the trained learning model LM2 by applying reinforcement learning instead of supervised learning to the training model M2. In this case, the training data generating part 2004B is omitted. For example, the machine learning part 2005B causes the training model M2 to perform reinforcement learning so as to maximize a predetermined reward related to job efficiency or safety, based on logs acquired by the log acquiring part 2001 and the simulator part 2002. At this time, by linking with the simulator part 2002 and causing the simulator part 2002 to try out a large number of movement patterns, it is possible to perform reinforcement learning on the training model M2 more efficiently.


Note that, instead of the trained learning models LM1 and LM2, the machine learning part 2005 may generate a trained learning model LM3 that can infer both the distribution of work location recommendability for the excavator 100 within the work range and the path of the working part during a predetermined movement by the excavator 100 using data representing the shape of the target object in the excavator's work range as an input. In this case, instead of the machine learning parts 2005A and 2005B, a machine learning part that generates the trained learning model LM3 is provided. Also, in this case, instead of the training data generating parts 2004A and 2004B, a training data generating part that generates a training data set for generating the trained learning model LM3 is provided.


The trained model storage part 2006 stores the trained learning models LM1 and LM2 output by the machine learning part 2005. Also, when the machine learning part 2005B re-learns or additionally learns the trained learning model LM2, the trained learning model LM2 in the trained model storage part 2006 is updated. Also, when the trained learning model LM2 is updated, the trained learning model LM2 before the update may be stored in the trained model storage part 2006 or another memory part in a reusable manner. By this means, for example, when there is a problem with the trained learning model LM2, the trained learning model LM2 before the update can be restored and reused.


Note that when trained learning model LM3 is generated, trained learning model LM3 is stored in trained model storage part 2006 instead of trained learning models LM1 and LM2.


The delivery part 2007 distributes the data of the trained learning model LM2 to the excavator 100, in addition to the trained model LM1.


For example, when the trained learning model LM2 is generated or updated by the machine learning part 2005B, the delivery part 2007 distributes the most recently generated or updated trained learning model LM2 to the excavator 100. Also, the delivery part 2007 may deliver the latest trained learning model LM2 in the trained model storage part 2006 to the excavator 100 in response to a signal received from the excavator 100 requesting delivery of the trained learning model LM2.


Note that when trained learning model LM3 is generated, the delivery part 2007 distributes the trained learning model LM3 to the excavator 100 instead of the trained learning models LM1 and LM2.


The operation assisting part 302 includes, as functional parts, a trained model storage part 302A, a state acquiring part 302B, a recommendability distribution estimating part 302C, and a recommending part 302D, as well as a path generating part 302E.


The trained model storage part 302A stores the trained learning models LM1 and LM2 delivered from the information processing device 200.


The path generating part 302E generates a path of the working part in the predetermined movement of the excavator 100 (hereinafter, for ease of explanation, "recommended path") based on data that represents the state of the target object in the current work range of the excavator 100, acquired by the state acquiring part 302B. To be more specific, the path generating part 302E generates a recommended path of the working part in the predetermined movement of the excavator 100 by applying the trained learning model LM2 to data that represents the state of the target object in the current work range of the excavator 100. For example, the path generating part 302E generates a recommended path for changing the shape of an object at a location where the recommendability as a potential work location of the excavator 100 is relatively high in the work range, based on the estimation result of the recommendability distribution estimating part 302C. In the work range, the location where the recommendability as a potential work location of the excavator 100 is relatively high corresponds to a location recommended as a potential work location by the recommending part 302D. The location where the recommendability as a work location of the excavator 100 is relatively high may be, for example, the location with the highest recommendability as a work location of the excavator 100. Also, the location where the recommendability as a potential work location of the excavator 100 is relatively high may include a location where the recommendability as a potential work location is significantly high for the excavator 100.
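A minimal sketch of this idea: select the location with the highest recommendability and produce a path toward the target surface there. A real system would query the trained learning model LM2; the straight-line interpolation and the function name here are assumptions of this example only.

```python
from typing import List, Tuple

def generate_recommended_path(
    scores: List[float],
    heights: List[float],
    target_heights: List[float],
    steps: int = 4,
) -> Tuple[int, List[float]]:
    """Pick the highest-recommendability location and return that
    location's index plus a linearly interpolated height path from
    the current surface to the target surface at that location."""
    loc = max(range(len(scores)), key=lambda i: scores[i])
    start, end = heights[loc], target_heights[loc]
    path = [start + (end - start) * k / steps for k in range(steps + 1)]
    return loc, path

scores = [0.2, 0.9, 0.4]    # estimated recommendability per location
heights = [5.0, 6.0, 4.0]   # current surface heights (m)
targets = [5.0, 4.0, 4.0]   # target surface heights (m)
loc, path = generate_recommended_path(scores, heights, targets)
```

The returned index corresponds to the location recommended by the recommending part 302D, and the height sequence stands in for the recommended path of the working part.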


Note that, when the trained learning model LM3 is generated, the functions of the recommendability distribution estimating part 302C and the path generation part 302E may be integrated into one functional part. In this case, the integrated functional parts use the trained learning model LM3 to estimate the distribution of work location recommendability for the excavator 100 within the work range and the recommended path of the working part in the predetermined movement of the excavator 100.


In addition to recommending a work location of the excavator 100 in the work range, the recommending part 302D presents to the user the recommended path of the working part in a predetermined movement of the excavator 100 at that work location, which is generated by the path generating part 302E.


For example, the recommending part 302D displays images showing the recommended paths of the working part of the excavator 100 in two dimensions at the corresponding locations, in place of the recommendability distribution images 804A to 804C on the screen 800 in FIG. 8 described above. By this means, the recommending part 302D can recommend a work location of the excavator 100 in the work range based on the position of the image in the work range image 801, and recommend a recommended path in a predetermined movement of the excavator 100 based on the shape of the image.


Also, the recommending part 302D displays images showing a recommended path of the working part of the excavator 100 in three dimensions, as seen from the cabin 10, in place of the recommendability distribution images 904A to 904C on the screen 900 in FIG. 9 described above. By this means, the recommending part 302D can recommend a work location of the excavator 100 in the work range based on the position of the image in the work range image 901, and can recommend a recommended path for the excavator 100 in a predetermined movement based on the shape of the image. Similarly, the recommending part 302D displays, in place of the recommendability distribution images 1004A to 1004C on the screen 1000 of FIG. 10 described above, images showing a recommended path of the working part of the excavator 100 in three dimensions as viewed from the front of the excavator 100, at the corresponding locations. By this means, the recommending part 302D can recommend a work location of the excavator 100 in the work range based on the position of the image in the work range image 1001, and recommend a recommended path for the excavator 100 in a predetermined movement based on the shape of the image.


Also, in any of the screens 800, 900, and 1000 in FIG. 8 to FIG. 10 described above, the recommending part 302D changes the color of the image representing the recommended path of the working part of the excavator 100 in each of the three locations according to the recommendability as a potential work location for that location. By this means, the recommending part 302D can notify the user of the recommendability as a potential work location for a location in the work range corresponding to the position where the image is displayed, based on the color of the image representing the recommended path of the working part of the excavator 100.


Also, the recommending part 302D may display the recommendability as a potential work location for that location as a numerical value so as to correspond to the image representing the recommended path of the working part of the excavator 100 in each of the three locations in any of the screens 800, 900, and 1000 in FIG. 8 to FIG. 10 described above. By this means, the recommending part 302D can express differences in the recommendability as a potential work location for each location in the work range corresponding to a position representing the recommended path of the working part of the excavator 100 by using varying numerical values.
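As an illustrative sketch only (not the disclosed implementation), the color-and-numeral presentation described above can be expressed as a mapping from a recommendability score to a display color and a numeric label. The three-step color scale, the thresholds, and the assumption that the score is normalized to [0, 1] are all assumptions introduced here for illustration.

```python
def recommendability_style(score):
    """Map a normalized recommendability score in [0, 1] to a display
    color and a numeric label for the recommended-path image.
    Thresholds and colors are illustrative assumptions."""
    if not 0.0 <= score <= 1.0:
        raise ValueError("score must be normalized to [0, 1]")
    # Simple three-step scale: low -> blue, middle -> yellow, high -> red.
    if score < 0.4:
        color = "blue"
    elif score < 0.7:
        color = "yellow"
    else:
        color = "red"
    label = f"{score * 100:.0f}%"  # numeric value shown next to the path image
    return color, label

# A location with recommendability 0.85 would be drawn in red, labeled "85%".
style = recommendability_style(0.85)
```

In this sketch, a user comparing three candidate locations would see both a color cue and a percentage for each, matching the varying-colors and varying-numerical-values presentations described above.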


Note that some or all of the functions of the state acquiring part 302B, the recommendability distribution estimating part 302C, the recommending part 302D, and the path generating part 302E may be provided outside the excavator 100. For example, the functions of the state acquiring part 302B, the recommendability distribution estimating part 302C, the recommending part 302D, and the path generating part 302E may be transferred to the information processing device 200, and the display contents of the display device 50A may be controlled from the information processing device 200.


Also, for example, when the excavator 100 is operated remotely, the functions of the state acquiring part 302B, the recommendability distribution estimating part 302C, the recommending part 302D, and the path generating part 302E may be provided in the remote operation assisting device 400. In these cases, for example, the outputs of the sensors 40 and the sensors S1 to S9 are uploaded on a real-time basis to the information processing device 200 and the remote operation assisting device 400 via the communication device 60.


[Third Example Process for Recommending Work Location]

Next, a third example of the process for recommending a potential work location of the excavator 100 will be described with reference to FIG. 14.



FIG. 14 is a flowchart showing a third example of the process for recommending a potential work location of the excavator 100.


As shown in FIG. 14, the processes of steps S30 and S31 are the same as the processes of steps S10 and S11 in FIG. 11, and therefore will not be described.


When the process for step S31 is completed, the controller 30 proceeds to step S32.


In step S32, the path generating part 302E generates, based on the estimation result of the recommendability distribution estimating part 302C, a recommended path along which a predetermined movement changes the shape of the object at a location in the work range where the recommendability as a potential work location of the excavator 100 is relatively high.
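The selection-and-path step can be sketched as follows. This is an illustrative sketch under stated assumptions, not the disclosed algorithm: the recommendability distribution is assumed to be a 2D grid of scores, and the "predetermined movement" is modeled as a short dragging stroke of the working part toward the machine at a fixed cutting depth; the grid layout, stroke length, and depth are all assumptions.

```python
def generate_recommended_path(recommendability_grid, stroke_cells=2, cut_depth=0.3):
    """Pick the grid cell with the highest recommendability and generate a
    simple dragging path toward row 0 (assumed machine side) as a list of
    (row, col, depth) waypoints. Grid layout and path shape are assumptions."""
    best = max(
        ((r, c) for r, row in enumerate(recommendability_grid)
         for c, _ in enumerate(row)),
        key=lambda rc: recommendability_grid[rc[0]][rc[1]],
    )
    r0, c0 = best
    # Drag the working part over `stroke_cells` cells toward the machine.
    path = []
    for i in range(stroke_cells):
        r = max(r0 - i, 0)
        path.append((r, c0, cut_depth))
    return best, path

grid = [
    [0.1, 0.2, 0.1],
    [0.3, 0.9, 0.2],
    [0.4, 0.5, 0.3],
]
best, path = generate_recommended_path(grid)
# best == (1, 1); the path starts at the most recommendable cell
# and moves toward row 0.
```

A real path generator would additionally account for the machine's kinematics and reachable workspace; this sketch only shows how a path can be anchored at the high-recommendability location.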


When the process in step S32 is completed, the controller 30 proceeds to step S33.


In step S33, the recommending part 302D recommends a work location for the excavator 100 in the work range, and recommends, in a visible manner through the display device 50A or the display device 408, a recommended path of the working part in a predetermined movement of the excavator 100 at that work location, based on the results of the processes in steps S31 and S32.


When the process in step S33 is completed, the controller 30 proceeds to step S34.


The processes in steps S34 and S35 are the same as the processes in steps S13 and S14 in FIG. 11, and therefore will not be described.


[Third Example Functional Structure of Operation Assisting System]

Next, a third example functional structure of the operation assisting system SYS will be described with reference to FIG. 15 in addition to FIG. 1 to FIG. 5.


In the following description, the same reference numerals are used to designate components that are the same as or correspond to those in the first example (FIG. 6) and the second example (FIG. 13) described above. The description focuses on parts that differ from the first and second examples described above, and may omit description of parts that are the same as or correspond to them.


In this example, the main difference from the second example described above is that the controller 30 includes an operation control part 302F.


The controller 30 of the excavator 100 includes, as its functional parts, an operation log providing part 301 and an operation assisting part 302, as in the first example described above.


The operation assisting part 302 includes, as functional parts, a trained model storage part 302A, a state acquiring part 302B, a recommendability distribution estimating part 302C, a recommending part 302D, and a path generating part 302E, as well as an operation control part 302F.


The operation control part 302F automatically causes the excavator 100 to perform an operation according to a predetermined input from the user through the input device 52 or the input device 407.


For example, when a work site is selected by the user through the input device 52 or the input device 407 from among the work sites recommended by the recommending part 302D, the operation control part 302F automatically causes the excavator 100 to make a predetermined movement. To be more specific, the operation control part 302F controls the hydraulic actuator HA so that the working part moves along a recommended path corresponding to the selected work site. By this means, the user can cause the excavator 100 to make a predetermined movement so that the working part moves along the recommended path, simply by selecting a work location through the input device 52 or the like, without operating the operating device 26 or the operating device for remote operation. Therefore, even a user with little experience in operating the excavator 100 can efficiently proceed with the job, and, as a result, the controller 30 can improve the job efficiency of the excavator 100.


For example, the user can perform an operation to select one of the images representing the recommended path of the working part on the screen displayed on the display device 50A, etc., via the input device 52, etc. Then, the operation control part 302F causes the excavator 100 to make a predetermined movement so that the working part moves along the recommended path corresponding to the image representing the selected recommended path.
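The select-and-execute behavior described above can be sketched as follows. This is an illustrative sketch, not the disclosed control implementation: `send_setpoint` stands in for whatever interface drives the hydraulic actuator HA toward a waypoint, and the flat list of waypoint paths is an assumption.

```python
def execute_selected_path(recommended_paths, selected_index, send_setpoint):
    """Drive the working part along the recommended path the user selected.
    `send_setpoint` is a hypothetical stand-in for the actuator control
    interface; it is called once per waypoint in order."""
    if not 0 <= selected_index < len(recommended_paths):
        raise IndexError("no such recommended path")
    for waypoint in recommended_paths[selected_index]:
        send_setpoint(waypoint)  # command the actuator toward this waypoint
    return len(recommended_paths[selected_index])

# Record the setpoints that would be sent when the user selects path 1.
sent = []
n = execute_selected_path([[(0, 0)], [(1, 1), (1, 2)]], 1, sent.append)
```

The point of the sketch is the division of labor: the user only picks an index (via the input device 52 or the like), and the controller replays the corresponding pre-generated waypoints without any lever operation.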


Note that some or all of the functions of the state acquiring part 302B, the recommendability distribution estimating part 302C, the recommending part 302D, the path generating part 302E, and the operation control part 302F may be provided outside the excavator 100. For example, the functions of the state acquiring part 302B, the recommendability distribution estimating part 302C, the recommending part 302D, the path generating part 302E, and the operation control part 302F may be transferred to the information processing device 200, and the display contents of the display device 50A and the operation of the excavator 100 may be controlled from the information processing device 200. Also, for example, when the excavator 100 is operated remotely, the functions of the state acquiring part 302B, the recommendability distribution estimating part 302C, the recommending part 302D, the path generating part 302E, and the operation control part 302F may be provided in the remote operation assisting device 400. In these cases, for example, the outputs of the sensors 40 and the sensors S1 to S9 are uploaded on a real-time basis to the information processing device 200 and the remote operation assisting device 400 via the communication device 60.


[Fourth Example of Process for Recommending Work Location]

Next, a fourth example of the process for recommending a potential work location of the excavator 100 will be described with reference to FIG. 16.



FIG. 16 is a flowchart showing a fourth example of the process for recommending a potential work location of the excavator 100.


As shown in FIG. 16, the process for steps S40 to S43 is the same as the process for steps S30 to S33 in FIG. 14, and therefore its description will be omitted.


When the process for step S43 is completed, the controller 30 proceeds to step S44.


In step S44, the controller 30 determines whether an operation to select and confirm a work site from among the work sites displayed in a visible way through the display device 50A has been performed through the input device 52 or the input device 407. If the target operation has been performed, the controller 30 proceeds to step S45; otherwise, the controller 30 repeats the process in step S44.


In step S45, the operation control part 302F performs operation control to cause the excavator 100 to make a predetermined movement so that the working part moves along the recommended path corresponding to the work site selected in step S44.


When the process in step S45 is completed, the controller 30 proceeds to step S46.


The process in step S46 is the same as the process in step S14 in FIG. 11, so its description will be omitted.


Other Embodiments and Examples

The embodiments and examples described hereinabove may be modified or changed as appropriate by, for example, combining or replacing/substituting their details.


For example, the display device 50A according to one embodiment described above may be a head-up display. In this case, under the control of the recommending part 302D, the display device 50A may display images (for example, the above-described recommendability distribution images 904A to 904C) that recommend potential work locations in the work range, within the field of view of the user who is viewing the work range around the excavator 100 from the cockpit.


Also, in at least one embodiment described above, the recommending part 302D may display a recommended work location for the excavator 100, a recommended path, and so on, in a visible way, through smart glasses or augmented-reality (AR) glasses worn by the user, instead of through the display device 50A or the display device 408.


Also, in at least one embodiment described above, the display device 50A may be a projection device. This projection device may project images on the ground in the work range around the excavator 100 and indicate information to the user in the cabin 10 or the user of the remote operation assisting device by employing a technique such as projection mapping. In this case, for example, the recommending part 302D may control the projection device to project an image for recommending a potential work location of the excavator 100 in the work range, right or substantially right onto that location.


Also, the methods of recommending potential work locations and recommended paths according to the above embodiments, examples, and their variations may be applied to work machines other than the excavator 100. For example, the other work machine may be a continuous unloader. In this case, the operation assisting system SYS may, for example, show a recommended work location where the lower part (scraping part) of the continuous unloader's bucket elevator scrapes loose cargo (for example, iron ore, coal, etc.) from the ship's hold, show a recommended path for the scraping part as a working part, and so on.


Advantages

Next, advantages that the work machine, remote operation assisting device, and operation assisting system according to the present disclosure can provide will be described below.


According to a first example of the present disclosure, a work machine includes a work device and a display device. The work machine refers to the excavator 100 described above. The work machine may be the continuous unloader described earlier. The work device may be, for example, the attachment AT described above. The work device may also be, for example, the bucket elevator of the continuous unloader. The display device may be, for example, the display device 50A described above. To be more specific, the work device may operate so as to change the shape of an object in the work range of the work machine. Then, the display device may be configured to recommend, in a visible way, a target work location in the work range of the work machine, based on information about the current shape of the object, the target work location being where the work device changes the shape of the object.


By this means, the user can determine where in the excavator's work range the target work location is located, based on the content shown on the display device. Therefore, the work machine can properly assist the user's operation of the work machine.


Also, according to a second example of the present disclosure, in the first example above, the display device may be further configured to display, in a distinguishable way, a location of relatively high recommendability and a location of relatively low recommendability as potential target work locations in the work range of the work machine.


By this means, the user can determine where in the excavator's work range the target work location is located, based on how high or low every potential work location's recommendability is.


Also, according to a third example of the present disclosure, in the second example above, the display device may be further configured to display the difference between the high recommendability and the low recommendability by using varying colors or varying numerical values.


By this means, the user can readily learn how work location recommendability varies from one potential work location to another within the excavator's work range.


Also, according to a fourth example of the present disclosure, in the third example above, the display device may be further configured to display the difference between the high recommendability and the low recommendability by using a heat map.


By this means, the user can readily learn how work location recommendability varies from one potential work location to another within the excavator's work range.


Also, according to a fifth example of the present disclosure, in any one of the second to fourth examples described above, the display device may be further configured to display a location in the work range of the work machine where the recommendability shows a maximal value.


By this means, the work machine can recommend, to the user, as a target work location, a location in the work range where the recommendability as a target work location has a maximal value.


Also, according to a sixth example of the present disclosure, in any one of the second to fifth examples described above, the display device may be further configured to display an indication that no work location is recommendable when no work location in the work range of the work machine shows relatively high recommendability with respect to a predetermined criterion.


By this means, the work machine can prevent or substantially prevent a situation in which the job efficiency or the safety of the work machine decreases, by recommending a location with relatively low recommendability as a target work location.
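The criterion check in the sixth example can be sketched as a threshold test over the recommendability distribution. This is an illustrative sketch, not the disclosed criterion: the grid representation and the threshold value are assumptions.

```python
def check_recommendable(distribution, threshold=0.6):
    """Return the best (row, col) location if any cell meets the assumed
    criterion, else None, meaning 'no work location is recommendable'.
    The threshold is an illustrative assumption."""
    best = max(
        ((v, (r, c)) for r, row in enumerate(distribution)
         for c, v in enumerate(row)),
        default=(0.0, None),
    )
    return best[1] if best[0] >= threshold else None

found = check_recommendable([[0.2, 0.7], [0.1, 0.3]])
none_found = check_recommendable([[0.2, 0.3]])
# found == (0, 1); none_found is None, so the display would show the
# "no work location is recommendable" indication.
```

When the function returns None, the display side would show the indication instead of recommending a low-recommendability location.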


Also, according to a seventh example of the present disclosure, in any one of the second to sixth examples described above, the high recommendability and the low recommendability may be defined in association with the job efficiency or the safety of the work machine.


By this means, the work machine can recommend potential target work locations such that the job efficiency and safety of the work machine are improved.


Also, according to an eighth example of the present disclosure, in the seventh example described above, the work machine may include a processing device. The processing device may be, for example, the controller 30 described above. To be more specific, the processing device may be configured to output a distribution of recommendability in the work range of the work machine, based on the information about the current shape of the object in the work range of the work machine, by using a trained model that has been machine-learned using training data that is associated with the shape of the object and that is modeled on operation of the work machine by an operator with relatively high expertise based on a predetermined criterion. The operator with relatively high expertise based on the predetermined criterion may be, for example, the expert described above. The trained model may be, for example, the trained model LM1 described above.


By this means, the work machine can recommend, to the user, a target work location that reflects the job efficiency and safety aspects of operation by an operator with high expertise.
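The inference step of the eighth example can be sketched as follows. This is a stand-in only: the actual trained model LM1 and its features are not specified here, so the scoring rule (peak-normalized height above a target level) is a purely illustrative assumption about how a shape-conditioned recommendability distribution might be produced.

```python
def estimate_recommendability(heightmap, target_level=0.0):
    """Stand-in for inference with a trained model such as LM1: score each
    cell of a terrain heightmap by how much material stands above the target
    level, normalized to [0, 1]. The scoring rule is purely illustrative."""
    excess = [[max(h - target_level, 0.0) for h in row] for row in heightmap]
    peak = max((v for row in excess for v in row), default=0.0)
    if peak == 0.0:
        # Flat or below-target terrain: nothing stands out as recommendable.
        return [[0.0 for _ in row] for row in excess]
    return [[v / peak for v in row] for row in excess]

dist = estimate_recommendability([[0.5, 1.0], [0.0, 0.25]])
# dist is peak-normalized: [[0.5, 1.0], [0.0, 0.25]]
```

The output plays the role of the recommendability distribution that the recommending part overlays on the work range image; a real model trained on expert operation data would replace this hand-written rule.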


Also, according to a ninth example of the present disclosure, in any one of the first to eighth examples described above, the display device may be further configured to recommend, in the visible way, a plurality of potential target work locations in a time sequence based on the information about the current shape of the object in the work range of the work machine.


By this means, the work machine can recommend target work locations for multiple operations altogether, for example. Therefore, the work machine can offer improved user friendliness and job efficiency.


Also, according to a tenth example of the present disclosure, in any one of the first to ninth examples above, the display device may be further configured to recommend, in the visible way, the target work location in the work range of the work machine, and a path that a working part of the work device follows when the work device changes the shape of the object in the work range of the work machine, based on the information about the current shape of the object. The working part may be, for example, the bucket 6 described above. Also, the working part may be the scraping part of the continuous unloader described above.


By this means, the work machine can not only recommend a target work location, but can also recommend a path that the working part follows when changing the shape of the object at that location.


Also, according to an eleventh example of the present disclosure, in any one of the first to tenth examples described above, the display device may be further configured to display an image of a recommended target work location in the work range of the work machine by overlaying the image of the recommended work location over an image showing the current shape of the object in the work range of the work machine. Examples of images showing the shape of the object in the work range of the work machine may include, for example, the work range images 801, 901, and 1001, which have been described earlier. The image of a recommended target work location may be, for example, the above-described recommendability distribution images 804A to 804C, recommendability distribution images 904A to 904C, and recommendability distribution images 1004A to 1004C.


By this means, the work machine can recommend potential work locations in the work range of the work machine in association with the shape of the target object in the work range.


Also, according to a twelfth example of the present disclosure, in any one of the first to tenth examples described above, the display device may be further configured to display an image of a recommended target work location in the work range of the work machine, within the field of view of the user of the work machine who is viewing the work range of the work machine from the cockpit of the work machine.


By this means, the work machine can recommend potential work locations in the work range directly within the user's view of the work range of the work machine.


Furthermore, according to a thirteenth example of the present disclosure, in any one of the first to tenth examples described above, the display device may be further configured to display an image of a recommended target work location in the work range of the work machine by projecting the image of the recommended target work location over the work range of the work machine.


By this means, the work machine can recommend potential work locations in the work range of the work machine in association with the shape of the target object in the work range.


Also, according to a fourteenth example of the present disclosure, a remote operation assisting device may include an operating part, a communication part, and a display part. The remote operation assisting device may be, for example, the remote operation assisting device 400 described above. The operating part may be, for example, an operating device for remote operation as the input device 407 described above. The communication part may be, for example, the communication interface 406 described above. The display part may be, for example, the display device 408 described above. To be more specific, the operating part is used by a user to remotely operate a work machine including a work device that operates to change the shape of an object in the work range of the work machine. The communication part transmits information indicating the operating state of the operating part to the work machine. Then, the display part may be configured to recommend, in a visible way, a target work location in the work range of the work machine where the work device changes the shape of an object, based on information about the current shape of the object.


By this means, the user performing remote operation can determine where in the excavator's work range the target work location is located, based on the content shown on the display device. Therefore, the remote operation assisting device can properly assist the user's operation of the work machine.


Also, according to a fifteenth example of the present disclosure, an assisting system assists a user's operation of a work machine having a work device that operates to change the shape of an object in the work range of the work machine. The assisting system may be, for example, the above-described operation assisting system SYS. To be more specific, the assisting system may be provided with a display part. The display part may be, for example, the display device 50A or display device 408 described earlier. Also, the display part may be, for example, the above-mentioned smart glasses or AR glasses. To be more specific, the display part may be configured to recommend, in a visible way, a target work location in the work range of the work machine, based on information about the current shape of the object, the target work location being where the work device changes the shape of the object.


By this means, the user can determine a target work location in the work range based on the content displayed on the display part. Therefore, the assisting system can properly assist the user's operations of the work machine.


Although embodiments of the present disclosure have been described in detail above, the present disclosure is not limited to such specific embodiments, and various modifications and variations are possible within the scope of the gist described in the claims.

Claims
  • 1. A work machine comprising: a work device configured to change a shape of an object in a work range of the work machine; and a display device configured to recommend, in a visible way, a target work location in the work range of the work machine, based on information about a current shape of the object, the target work location being where the work device changes the shape of the object.
  • 2. The work machine according to claim 1, wherein the display device is further configured to display, in a distinguishable way, a location of relatively high recommendability and a location of relatively low recommendability as potential target work locations in the work range of the work machine.
  • 3. The work machine according to claim 2, wherein the display device is further configured to display a difference between the high recommendability and the low recommendability by using varying colors or varying numerical values.
  • 4. The work machine according to claim 3, wherein the display device is further configured to display the difference between the high recommendability and the low recommendability by using a heat map.
  • 5. The work machine according to claim 2, wherein the display device is further configured to display a location in the work range of the work machine where recommendability shows a maximal value.
  • 6. The work machine according to claim 2, wherein the display device is further configured to display an indication that no work location is recommendable when no work location in the work range of the work machine shows relatively high recommendability with respect to a predetermined criterion.
  • 7. The work machine according to claim 2, wherein the high recommendability and the low recommendability are defined in association with job efficiency or safety of the work machine.
  • 8. The work machine according to claim 7, further comprising a processing device configured to output a distribution of recommendability in the work range of the work machine, based on the information about the current shape of the object in the work range of the work machine, by using a trained model that has been machine-learned using training data that is associated with the shape of the object and that is modeled on operation of the work machine by an operator with relatively high expertise based on a predetermined criterion.
  • 9. The work machine according to claim 1, wherein the display device is further configured to recommend, in the visible way, a plurality of potential target work locations in a time sequence based on the information about the current shape of the object in the work range of the work machine.
  • 10. The work machine according to claim 1, wherein the display device is further configured to recommend, in the visible way, the target work location in the work range of the work machine, and a path that a working part of the work device follows when the work device changes the shape of the object in the work range of the work machine, based on the information about the current shape of the object.
  • 11. The work machine according to claim 1, wherein the display device is further configured to display an image of a recommended target work location in the work range of the work machine by overlaying the image of the recommended work location over an image showing the current shape of the object.
  • 12. The work machine according to claim 1, wherein the display device is further configured to display an image of a recommended target work location in the work range of the work machine, within a field of view of a user of the work machine who is viewing the work range of the work machine from a cockpit of the work machine.
  • 13. The work machine according to claim 1, wherein the display device is further configured to display an image of a recommended target work location in the work range of the work machine by projecting the image of the recommended target work location over the work range of the work machine.
  • 14. A remote operation assisting device comprising: an operating part configured to allow a user to remotely operate a work machine having a work device for changing a shape of an object in a work range of the work machine; a communication part configured to transmit information about an operating state of the operating part to the work machine; and a display part configured to recommend, in a visible way, a target work location in the work range of the work machine, based on information about a current shape of the object, the target work location being where the work device changes the shape of the object.
  • 15. An assisting system for assisting operation of a work machine having a work device that is operated by a user when changing a shape of an object in a work range of the work machine, the assisting system comprising: a display device configured to recommend, in a visible way, a target work location in the work range of the work machine, based on information about a current shape of the object, the target work location being where the work device changes the shape of the object.
Priority Claims (1)
Number: 2023-223705  Date: Dec 2023  Country: JP  Kind: national