SUPPORTING DEVICE, WORK MACHINE, AND PROGRAM

Information

  • Patent Application
  • 20250003197
  • Publication Number
    20250003197
  • Date Filed
    September 09, 2024
  • Date Published
    January 02, 2025
Abstract
A supporting device includes a memory, and a processor that is connected to the memory and configured to execute obtaining data related to a shape of a part that has already been constructed in a constructing target, and estimating a target shape of the constructing target based on the data obtained.
Description
BACKGROUND
1. Field of the Invention

The present disclosure relates to a supporting device or the like for work machines.


2. Description of the Related Art

Techniques related to machine guidance and machine control of work machines are known, which support operation of an operator or automatically perform construction by using data of a target shape of a constructing target.


SUMMARY

According to one embodiment of the present disclosure, a supporting device including a memory, and a processor connected to the memory is provided. The processor is configured to execute obtaining data related to a shape of a part that has already been constructed in a constructing target, and estimating a target shape of the constructing target based on the data obtained.


According to another embodiment of the present disclosure, a work machine including a memory, and a processor connected to the memory is provided. The processor is configured to execute obtaining data related to a shape of a part that has already been constructed in a constructing target, and estimating a target shape of the constructing target based on the data obtained.


According to still another embodiment of the present disclosure, a non-transitory computer-readable recording medium storing a program is provided. The program causes a supporting device to execute obtaining data related to a shape of a part that has already been constructed in a constructing target, and estimating a target shape of the constructing target based on the data obtained.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a side view of an example of a shovel;



FIG. 2 is a top view of the example of the shovel;



FIG. 3 is a diagram illustrating an example of a configuration related to remote operation of a shovel;



FIG. 4 is a block diagram illustrating an example of a hardware configuration of a shovel;



FIG. 5 is a diagram schematically illustrating an example of a method for estimating a target shape of a constructing target in slope work;



FIG. 6 is a diagram schematically illustrating an example of a method for estimating a target shape of a constructing target in floor excavation work;



FIG. 7 is a functional block diagram illustrating a first example of a functional configuration related to estimation of a target shape of a constructing target;



FIG. 8 is a flowchart schematically illustrating the first example of processing related to estimation of a target shape of a constructing target;



FIG. 9 is a diagram illustrating an example of displayed contents of a display illustrating a topographic shape around a shovel;



FIG. 10 is a diagram illustrating an example of displayed contents of a display illustrating an estimation result of the target shape of a constructing target;



FIG. 11 is a diagram illustrating another example of displayed contents of a display illustrating an estimation result of the target shape of the constructing target;



FIG. 12 is a diagram illustrating an example of an operation support system;



FIG. 13 is a diagram illustrating an example of a hardware configuration of an information processor;



FIG. 14 is a functional block diagram illustrating a second example of a functional configuration related to estimation of a target shape of a constructing target; and



FIG. 15 is a flowchart schematically illustrating the second example of the processing related to estimation of the target shape of a constructing target.





DETAILED DESCRIPTION

In techniques related to machine guidance and machine control, it is necessary for an operator or the like to prepare data of a target shape of a constructing target in advance or to manually input parameters related to the target shape. Therefore, for example, in a small-scale work site, there is a possibility that introduction of such techniques is difficult or that the operator needs to spend time on such preparation.


In view of the above issues, the present disclosure provides a technique capable of more easily obtaining data related to the target shape of a constructing target.
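As a purely hypothetical sketch of the idea (the disclosure does not specify the estimation method; the function names, the plane model, and the sample values below are assumptions), the target shape of a constructing target such as a slope could be estimated by fitting a plane, by least squares, to 3-D points measured on the already-constructed part and extrapolating that plane:

```python
import numpy as np

def fit_target_plane(points):
    """Fit z = a*x + b*y + c to 3-D points measured on the
    already-constructed part, by least squares.
    Returns the coefficients (a, b, c)."""
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    coeffs, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    return coeffs

def estimate_height(coeffs, x, y):
    """Extrapolate the fitted plane: predicted target-surface
    height at position (x, y)."""
    a, b, c = coeffs
    return a * x + b * y + c
```

A shape obtained this way could then serve as the target shape for machine guidance without the operator preparing design data in advance, which is the benefit the disclosure describes.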


Hereinafter, embodiments will be described with reference to the drawings.


[Outline of the Shovel]

First, an outline of a shovel 100 according to the present embodiment will be described with reference to FIGS. 1 to 3.



FIG. 1 is a side view illustrating an example of the shovel 100. FIG. 2 is a top view illustrating the example of the shovel 100. FIG. 3 is a diagram illustrating an example of a configuration related to remote operation of a shovel. Hereinafter, a direction in which an attachment AT extends in the top view of the shovel 100 (an upward direction in FIG. 2) may be defined as “front” to describe a direction in the shovel 100 or a direction as seen from the shovel 100.


As illustrated in FIGS. 1 and 2, the shovel 100 includes a lower traveling body 1, an upper rotatable body 3, the attachment AT including a boom 4, an arm 5, and a bucket 6, and a cab 10.


The lower traveling body 1 is configured to cause the shovel 100 to travel by using a crawler 1C. The crawler 1C includes a left crawler 1CL and a right crawler 1CR. The crawler 1CL is hydraulically driven by a traveling hydraulic motor 1ML. Similarly, the crawler 1CR is hydraulically driven by a traveling hydraulic motor 1MR. Thus, the lower traveling body 1 is self-propelled.


The upper rotatable body 3 is mounted on the lower traveling body 1 in a rotatable manner via a turner 2. For example, the upper rotatable body 3 rotates relative to the lower traveling body 1 when the turner 2 is hydraulically driven by a rotation hydraulic motor 2M.


The boom 4 is mounted at the center of the front portion of the upper rotatable body 3 so as to be elevated about a rotation axis along a left-right direction. The arm 5 is mounted at a tip of the boom 4 so as to be rotated about a rotation axis along the left-right direction. The bucket 6 is mounted at a tip of the arm 5 so as to be rotated about a rotation axis along the left-right direction.


The bucket 6 is an example of an end attachment and is used for, for example, excavation work.


The bucket 6 is attached to the tip of the arm 5 in such a manner that it can be appropriately replaced according to the work of the shovel 100. That is, instead of the bucket 6, a bucket of a type different from the bucket 6, such as a relatively large bucket, a bucket for slope, a bucket for dredging, or the like, may be attached to the tip of the arm 5. An end attachment of a type other than the bucket, such as an agitator, a breaker, a crusher, or the like, may be attached to the tip of the arm 5. A spare attachment such as a quick coupling or a tilt rotator, for example, may be provided between the arm 5 and the end attachment.


The boom 4, the arm 5, and the bucket 6 are hydraulically driven by a boom cylinder 7, an arm cylinder 8, and a bucket cylinder 9, respectively.


The cab 10 is a cockpit in which an operator rides and operates the shovel 100. The cab 10 is mounted, for example, on a front left side of the upper rotatable body 3.


For example, the shovel 100 operates driven elements such as the lower traveling body 1 (that is, a pair of left and right crawlers 1CL and 1CR), the upper rotatable body 3, the boom 4, the arm 5, and the bucket 6 in response to an operation by an operator who is in the cab 10.


Instead of or in addition to being operable by an operator in the cab 10, the shovel 100 may be configured to be remotely operated (remotely controlled) from outside the shovel 100. When the shovel 100 is remotely operated, the interior of the cab 10 may be unattended. Hereinafter, the description proceeds on the assumption that the operation by the operator includes at least one of an operation of an operation device 26 by an operator in the cab 10 and a remote operation by an operator outside the cab 10.


For example, as illustrated in FIG. 3, the remote operation includes a mode in which the shovel 100 is operated by an operation input related to an actuator of the shovel 100 performed by a remote operation supporting device 300.


The remote operation supporting device 300 is provided, for example, in a management center for managing the work of the shovel 100 from the outside. The remote operation supporting device 300 may be a portable operation terminal, and in this case, the operator can perform the remote operation of the shovel 100 while directly confirming the working state of the shovel 100 from the periphery of the shovel 100.


For example, the shovel 100 may transmit an image representing the state of its periphery, including the area in front of the shovel 100 (hereinafter referred to as a “peripheral image”), generated based on captured images output by an imaging device 40 described in the following, to the remote operation supporting device 300 through a communicator 60 described in the following. The remote operation supporting device 300 may cause its display to display the peripheral image received from the shovel 100. Further, various information images (information screens) displayed on an output device 50 (display) inside the cab 10 of the shovel 100 may similarly be displayed on the display of the remote operation supporting device 300. Thus, the operator using the remote operation supporting device 300 can remotely operate the shovel 100 while checking the displayed contents, such as the peripheral image or information screens, on the display. The shovel 100 may then operate the actuator in response to a remote operation signal, representing the content of the remote operation, received from the remote operation supporting device 300 by the communicator 60, and may drive driven elements such as the lower traveling body 1, the upper rotatable body 3, the boom 4, the arm 5, and the bucket 6.


The remote operation may include a mode in which the shovel 100 is operated by, for example, external voice input or gesture input to the shovel 100 by a person (for example, a worker) around the shovel 100. Specifically, the shovel 100 recognizes, through a voice input device (for example, a microphone), a gesture input device (for example, an imaging device), or the like mounted on the shovel itself, voices spoken by workers around the shovel, gestures performed by workers, or the like. The shovel 100 may operate the actuator in response to the content of the recognized voice, gesture, or the like, and may drive driven elements such as the lower traveling body 1 (left and right crawlers 1C), the upper rotatable body 3, the boom 4, the arm 5, and the bucket 6.


The shovel 100 may automatically operate the actuator regardless of the content of the operation by the operator. Thus, the shovel 100 can perform a function (“automatic drive function” or “machine control (MC) function”) for automatically operating at least a part of the driven elements such as the lower traveling body 1, the upper rotatable body 3, the boom 4, the arm 5, and the bucket 6.


The automatic drive function includes, for example, a function (“semi-automatic drive function” or “operation-assisted MC function”) for automatically operating a driven element (actuator) other than the driven element (actuator) to be operated in response to the operation or remote operation of the operation device 26 by the operator. The automatic drive function may also include a function (“fully automatic drive function” or “fully automatic MC function”) for automatically operating at least a part of a plurality of driven elements (actuators) on the assumption that there is no operation or remote operation of the operation device 26 by the operator. When the fully automatic drive function is effective in the shovel 100, the interior of the cab 10 may be unattended. The semi-automatic drive function, the fully automatic drive function, and the like may include a mode in which the operation to be performed by the driven elements (actuators) in automatic driving is automatically decided according to a predetermined rule. They may also include a mode (“autonomous operation function”) in which the shovel 100 autonomously performs various determinations and, in accordance with the determination results, autonomously decides the operation to be performed by the driven elements (actuators) that are the target of automatic driving.
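A minimal sketch of how the drive modes described above could combine operator and automatic commands (hypothetical; the dictionary-based command representation and the mode names are assumptions, not the disclosed implementation):

```python
def mix_commands(operator_cmd, auto_cmd, mode):
    """Combine operator and automatically generated actuator commands.
    operator_cmd / auto_cmd: dicts mapping actuator name -> command value.
    'manual': operator commands only.
    'semi':   automatic commands for actuators the operator does not
              operate; the operator's own commands take precedence.
    'full':   automatic commands only (no operator input assumed)."""
    if mode == "manual":
        return dict(operator_cmd)
    if mode == "semi":
        merged = dict(auto_cmd)
        merged.update(operator_cmd)  # operator keeps the actuators they operate
        return merged
    if mode == "full":
        return dict(auto_cmd)
    raise ValueError(f"unknown mode: {mode}")
```

For example, in "semi" mode an operator command for the arm would be kept while an automatically decided boom command is added alongside it.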


In addition, the operation of the shovel 100 may be remotely monitored. In this case, a remote monitoring supporting device having the same function as the remote operation supporting device 300 may be provided. The remote monitoring supporting device is, for example, an information processor 200 described in the following. Thus, a supervisor who is a user of the remote monitoring supporting device can monitor the status of the operation of the shovel 100 while checking peripheral images displayed on the display of the remote monitoring supporting device. For example, when determined to be necessary from the viewpoint of safety, the supervisor can intervene in the operation of the shovel 100 by the operator and urgently stop the shovel by performing a predetermined input using an input device of the remote monitoring supporting device.


[Hardware Configuration of Shovel]

Next, the hardware configuration of the shovel 100 will be described with reference to FIG. 4.



FIG. 4 is a block diagram illustrating an example of a hardware configuration of the shovel 100.


In FIG. 4, a path through which mechanical power is transmitted is indicated by a double line, a path through which high-pressure hydraulic fluid for driving a hydraulic actuator flows is indicated by a solid line, a path through which pilot pressure is transmitted is indicated by a dashed line, and a path through which electric signals are transmitted is indicated by a dotted line.


The shovel 100 includes components such as a hydraulic drive system for hydraulically driving the driven elements, an operation system for operating the driven elements, a user interface system for exchanging information with a user, a communication system for communicating with the outside, and a control system for various controls.


<Hydraulic Drive System>

As illustrated in FIG. 4, the hydraulic drive system of the shovel 100 includes a hydraulic actuator HA for hydraulically driving each of the driven elements such as the lower traveling body 1 (left and right crawlers 1C), the upper rotatable body 3, and the attachment AT, as described above. The hydraulic drive system of the shovel 100 according to the present embodiment includes an engine 11, a regulator 13, a main pump 14, and a control valve 17.


The hydraulic actuator HA includes the traveling hydraulic motors 1ML, 1MR, the rotation hydraulic motor 2M, the boom cylinder 7, the arm cylinder 8, the bucket cylinder 9, and the like.


In the shovel 100, a part or all of the hydraulic actuators HA may be replaced with electric actuators. That is, the shovel 100 may be a hybrid shovel or an electric shovel.


The engine 11 is a prime mover of the shovel 100 and a main power source in the hydraulic drive system. The engine 11 is, for example, a diesel engine using light oil as fuel. The engine 11 is mounted, for example, at the rear of the upper rotatable body 3. The engine 11 rotates at a predetermined target speed under direct or indirect control by a controller 30 described in the following to drive the main pump 14 and a pilot pump 15.


Instead of or in addition to the engine 11, another prime mover (for example, an electric motor) or the like may be mounted on the shovel 100.


The regulator 13 controls (adjusts) a discharge amount of the main pump 14 under control of the controller 30. For example, the regulator 13 adjusts an angle (hereinafter, referred to as “tilt angle”) of a swash plate of the main pump 14 in response to a control command from the controller 30.


The main pump 14 supplies hydraulic fluid to the control valve 17 through a high-pressure hydraulic line. Like the engine 11, the main pump 14 is mounted, for example, at the rear of the upper rotatable body 3. As described above, the main pump 14 is driven by the engine 11. The main pump 14 is, for example, a variable displacement hydraulic pump; as described above, the regulator 13 adjusts the tilt angle of the swash plate under control of the controller 30, thereby adjusting the stroke length of the pistons and controlling the discharge flow rate and discharge pressure.
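The text does not give the relation between tilt angle and discharge. As a hedged illustration only (a common idealization, not a statement about the main pump 14), a swash-plate pump's discharge flow can be modeled as displacement proportional to the sine of the tilt angle, multiplied by shaft speed:

```python
import math

def pump_discharge_lpm(max_disp_cc, tilt_deg, max_tilt_deg, speed_rpm):
    """Idealized discharge flow of a variable-displacement
    swash-plate pump, in liters per minute.
    max_disp_cc: displacement per revolution at full tilt [cc/rev].
    Displacement is assumed proportional to sin(tilt angle)."""
    disp = max_disp_cc * math.sin(math.radians(tilt_deg)) \
                       / math.sin(math.radians(max_tilt_deg))
    return disp * speed_rpm / 1000.0  # cc/min -> L/min
```

Under this model, reducing the tilt angle reduces the discharge flow at a given engine speed, which is how the regulator 13 would control the pump output.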


The control valve 17 drives the hydraulic actuator HA in response to the content of the operator's operation or remote operation of the operation device 26 or an operation command corresponding to the automatic drive function. The control valve 17 is mounted, for example, in the central portion of the upper rotatable body 3. As described above, the control valve 17 is connected to the main pump 14 via the high-pressure hydraulic line, and selectively supplies hydraulic fluid supplied from the main pump 14 to the respective hydraulic actuators in response to an operation by an operator or an operation command corresponding to the automatic drive function. Specifically, the control valve 17 includes a plurality of control valves (also referred to as “direction switching valves”) for controlling the flow rate and flow direction of the hydraulic fluid supplied from the main pump 14 to the respective hydraulic actuators HA.


<Operation System>

As illustrated in FIG. 4, the operation system of the shovel 100 includes the pilot pump 15, the operation device 26, a hydraulic control valve 31, a shuttle valve 32, and a hydraulic control valve 33.


The pilot pump 15 supplies pilot pressure to various hydraulic devices through a pilot line 25. The pilot pump 15 is mounted similarly to the engine 11, for example, at the rear of the upper rotatable body 3. The pilot pump 15 is, for example, a fixed-displacement hydraulic pump, and is driven by the engine 11 as described above.


The pilot pump 15 may be omitted. In such a case, a relatively low-pressure hydraulic fluid obtained after pressure of a relatively high-pressure hydraulic fluid discharged from the main pump 14 is reduced by a predetermined pressure reducing valve may be supplied to various hydraulic devices as the pilot pressure.


The operation device 26 is provided near the cockpit of the cab 10 and is used by an operator to operate various driven elements. Specifically, the operation device 26 is used by an operator to operate the hydraulic actuator HA for driving each driven element, and as a result, the operator can operate the driven element to be driven by the hydraulic actuator HA. The operation device 26 includes a pedal device and a lever device for operating each driven element (hydraulic actuator HA).


For example, as illustrated in FIG. 4, the operation device 26 is of a hydraulic pilot type. Specifically, the operation device 26 utilizes hydraulic fluid supplied from the pilot pump 15 through the pilot line 25 and a pilot line 25A branched therefrom, and outputs a pilot pressure corresponding to the contents of the operation to a pilot line 27A on a secondary side. The pilot line 27A is connected to one inlet port of the shuttle valve 32 and is connected to the control valve 17 through a pilot line 27 connected to an outlet port of the shuttle valve 32. As a result, the pilot pressure corresponding to the contents of the operation to the operation device 26 concerning various driven elements (hydraulic actuators HA) can be input to the control valve 17 through the shuttle valve 32. Therefore, the control valve 17 can drive the respective hydraulic actuators HA according to the content of the operation performed to the operation device 26 by an operator or the like.


The operation device 26 may be an electric type. In this case, the pilot line 27A, the shuttle valve 32, and the hydraulic control valve 33 are omitted. Specifically, the operation device 26 outputs an electric signal (hereinafter referred to as “operation signal”) corresponding to the content of the operation, and the operation signal is taken into the controller 30. Then, the controller 30 outputs to the hydraulic control valve 31 a control command according to the content of the operation signal, that is, a control signal according to the content of operation to the operation device 26. As a result, the pilot pressure according to the content of operation to the operation device 26 is input from the hydraulic control valve 31 to the control valve 17, and the control valve 17 can drive the respective hydraulic actuators HA according to the content of operation to the operation device 26.
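As a hypothetical sketch of the electric-type signal path (the mapping curve, dead band, and pressure values below are assumptions, not disclosed values), the controller might convert a lever's operation signal into pilot-pressure commands for the pair of hydraulic control valves 31 serving one driven element, one per driving direction:

```python
def lever_to_pilot_command(lever_fraction, p_max=3.5, deadband=0.05):
    """Map an electric lever's operation signal (-1.0 .. 1.0) to a
    pair of pilot-pressure commands (forward, reverse) in MPa,
    one for each driving direction of the driven element.
    A small dead band near neutral suppresses drift."""
    mag = abs(lever_fraction)
    if mag < deadband:
        return (0.0, 0.0)
    p = p_max * (mag - deadband) / (1.0 - deadband)
    return (p, 0.0) if lever_fraction > 0 else (0.0, p)
```

A neutral lever commands no pilot pressure; full stroke in either direction commands the maximum pressure on the corresponding valve.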


Further, the control valve (direction switching valve) included in the control valve 17 for driving the respective hydraulic actuators HA may be of an electromagnetic solenoid type. In this case, an operation signal output from the operation device 26 may be directly input to the control valve 17, that is, to the electromagnetic solenoid type control valve.


Further, as described above, a part or all of the hydraulic actuators HA may be replaced with electric actuators. In this case, the controller 30 may output a control command corresponding to the content of the operation through the operation device 26 and the content of remote operation defined by a remote operation signal, to the electric actuator or a driver or the like for driving the electric actuator. When the shovel 100 is remotely operated, the operation device 26 may be omitted.


The hydraulic control valve 31 is provided for each driven element (hydraulic actuator HA) to be operated by the operation device 26 and for each driving direction (for example, raising and lowering directions of the boom 4) of the driven element (hydraulic actuator HA). That is, two hydraulic control valves 31 are provided for each double-acting hydraulic actuator HA. The hydraulic control valve 31 may be provided, for example, in a pilot line 25B between the pilot pump 15 and the control valve 17, and may be configured so that its flow-path area (i.e., a cross-sectional area through which hydraulic fluid can flow) can be changed. Thus, the hydraulic control valve 31 can output a predetermined pilot pressure to a pilot line 27B on the secondary side by using the hydraulic fluid of the pilot pump 15 supplied through the pilot line 25B. Therefore, the hydraulic control valve 31 can indirectly apply a predetermined pilot pressure corresponding to a control signal from the controller 30 to the control valve 17 through the shuttle valve 32 between the pilot line 27B and the pilot line 27. Therefore, the controller 30 can cause the hydraulic control valve 31 to supply the pilot pressure corresponding to an operation command corresponding to the automatic drive function to the control valve 17, thereby realizing operation of the shovel 100 by the automatic drive function.


Further, the controller 30 may control, for example, the hydraulic control valve 31 to realize remote operation of the shovel 100. Specifically, the controller 30 outputs a control signal corresponding to the content of the remote operation specified by the remote operation signal received from the remote operation supporting device 300 to the hydraulic control valve 31 by the communicator 60. Thus, the controller 30 can cause the hydraulic control valve 31 to supply a pilot pressure corresponding to the content of the remote operation to the control valve 17, thereby realizing operation of the shovel 100 based on the remote operation by an operator.


When the operation device 26 is an electric type, the controller 30 can cause the hydraulic control valve 31 to directly supply a pilot pressure corresponding to the content of the operation (operation signal) of the operation device 26 to the control valve 17, thereby realizing the operation of the shovel 100 based on the operation by an operator.


The shuttle valve 32 includes two inlet ports and one outlet port, and outputs to the outlet port the hydraulic fluid having the higher of the pilot pressures input to the two inlet ports. The shuttle valve 32 is provided for each driven element (hydraulic actuator HA) to be operated by the operation device 26 and for each driving direction of the driven element (hydraulic actuator HA). One of the two inlet ports of the shuttle valve 32 is connected to the pilot line 27A on the secondary side of the operation device 26 (specifically, the lever device and the pedal device described above included in the operation device 26), and the other is connected to the pilot line 27B on the secondary side of the hydraulic control valve 31. The outlet port of the shuttle valve 32 is connected to the pilot port of a corresponding control valve of the control valve 17 through the pilot line 27. The corresponding control valve is a control valve for driving the hydraulic actuator to be operated by the lever device or the pedal device connected to one inlet port of the shuttle valve 32. For this reason, the shuttle valve 32 can apply, to the pilot port of the corresponding control valve, either the pilot pressure of the pilot line 27A on the secondary side of the operation device 26 or the pilot pressure of the pilot line 27B on the secondary side of the hydraulic control valve 31, whichever is higher. That is, by causing the hydraulic control valve 31 to output a pilot pressure higher than the pilot pressure on the secondary side of the operation device 26, the controller 30 can control the corresponding control valve without depending on the operation of the operation device 26 by the operator.
Therefore, the controller 30 can control the operation of the driven elements (lower traveling body 1, upper rotatable body 3, attachment AT) without depending on a state of operation to the operation device 26 by the operator, thereby realizing a remote operation function and an automatic drive function.
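The shuttle valve's selection rule, and how the controller overrides the operator through it, can be sketched as follows (a toy illustration; the pressure values are assumptions):

```python
def shuttle_valve(p_inlet_a, p_inlet_b):
    """Pass the higher of the two inlet pilot pressures to the
    outlet port (and hence to the control valve's pilot port)."""
    return max(p_inlet_a, p_inlet_b)

# The controller overrides the operator simply by commanding the
# hydraulic control valve 31 to output a pilot pressure above the
# operator's lever output (hypothetical values in MPa).
p_operator = 1.5  # pilot line 27A, from the operation device 26
p_control = 2.0   # pilot line 27B, from the hydraulic control valve 31
assert shuttle_valve(p_operator, p_control) == p_control
```

When the operator's pilot pressure is higher, the operator's command reaches the control valve instead, so manual operation still works whenever the controller commands nothing.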


The hydraulic control valve 33 is provided on the pilot line 27A that connects the operation device 26 and the shuttle valve 32. The hydraulic control valve 33 is configured so that, for example, its flow-path area can be changed. The hydraulic control valve 33 operates in response to a control signal input from the controller 30. Thus, the controller 30 can forcibly reduce the pilot pressure output from the operation device 26 when the operation device 26 is operated by an operator, and can thereby forcibly suppress or stop the operation of the hydraulic actuator corresponding to that operation even while the operation device 26 is being operated. The controller 30 can also reduce the pilot pressure output from the operation device 26 to below the pilot pressure output from the hydraulic control valve 31 even when the operation device 26 is operated. Therefore, by controlling the hydraulic control valve 31 and the hydraulic control valve 33, the controller 30 can reliably apply a desired pilot pressure to the pilot port of the control valve in the control valve 17 regardless of the content of the operation to the operation device 26. Thus, by controlling the hydraulic control valve 33 in addition to the hydraulic control valve 31, the controller 30 can more appropriately realize the remote operation function and the automatic drive function of the shovel 100.


<User Interface System>

As illustrated in FIG. 4, the user interface system of the shovel 100 includes an operation device 26, the output device 50, and an input device 52.


The output device 50 outputs various kinds of information to the user of the shovel 100 (e.g., an operator in the cab 10 or an external remote operator), people around the shovel 100 (e.g., workers and drivers of working vehicles), and the like.


For example, the output device 50 includes a lighting device, a display 50A (see FIG. 5), and the like for outputting various kinds of information in a visual manner. The lighting device is, for example, a warning lamp (indicator lamp) or the like. The display 50A is, for example, a liquid crystal display, an organic electroluminescence (EL) display, or the like. For example, as illustrated in FIG. 2, the lighting device and the display 50A may be provided inside the cab 10 and output various kinds of information in a visual manner to an operator or the like inside the cab 10. The lighting device and the display 50A may also be provided, for example, on a side surface of the upper rotatable body 3, and output various kinds of information in a visual manner to an operator or the like around the shovel 100.


For example, the output device 50 includes a sound output device for outputting various kinds of information in an auditory manner. The sound output device includes, for example, a buzzer, a speaker, and the like. The sound output device may be provided, for example, in at least one of the interior or exterior of the cab 10, and may output various kinds of information by an auditory method to an operator inside the cab 10 and a person (such as an operator) around the shovel 100.


The output device 50 may also include, for example, a device that outputs various types of information by a tactile method such as vibration of the cockpit.


The input device 52 receives various inputs from a user of the shovel 100, and signals corresponding to the received inputs are taken into the controller 30. The input device 52 is provided, for example, in the cab 10, and receives inputs from an operator or the like inside the cab 10. The input device 52 may also be provided, for example, on a side surface of the upper rotatable body 3, and may receive inputs from an operator or the like around the shovel 100.


For example, the input device 52 includes an operation input device that receives mechanically operated inputs from a user. The operation input device may include a touch panel mounted on the display, a touch pad mounted around the display, a button switch, a lever, a toggle, a knob switch provided on the operation device 26 (lever device), and the like.


Further, for example, the input device 52 may include a voice input device that receives a user's voice input. The voice input device may include, for example, a microphone.


Further, for example, the input device 52 may include a gesture input device that receives a user's gesture input. The gesture input device may include, for example, an imaging device that captures an image of a gesture performed by a user.


Further, for example, the input device 52 may include a biological input device that receives a user's biological input. The biological input may include, for example, input of biological information such as a user's fingerprint or iris.


<Communication System>

As illustrated in FIG. 4, a communication system of the shovel 100 according to the present embodiment includes the communicator 60.


The communicator 60 is connected to an external communication line and communicates with a device provided separately from the shovel 100. The device provided separately from the shovel 100 may include a portable terminal device (mobile terminal) brought into the cab 10 by a user of the shovel 100, in addition to the device provided outside the shovel 100. The communicator 60 may include, for example, a mobile communication module conforming to standards such as 4G (fourth generation) and 5G (fifth generation). The communicator 60 may include, for example, a satellite communication module. The communicator 60 may include, for example, a WiFi communication module and a Bluetooth (registered trademark) communication module. The communicator 60 may include a plurality of communicators corresponding to a communication line to be connected.


For example, the communicator 60 communicates with external devices such as the information processor 200 and the remote operation supporting device 300 in the work site through a local communication line established in the work site. The local communication line may be, for example, a local 5G mobile communication line (so-called local 5G) or a local area network (LAN) using WiFi6 constructed in the work site.


Further, for example, the communicator 60 communicates with the information processor 200, the remote operation supporting device 300, and the like located outside the work site through a wide area communication line covering the work site, that is, a wide area network (WAN). The wide area network may include, for example, a wide area mobile communication network, a satellite communication network, an Internet network, and the like.


<Control System>

As illustrated in FIG. 4, a control system of the shovel 100 includes the controller 30. The control system of the shovel 100 according to the present embodiment includes an operation pressure sensor 29, the imaging device 40, and sensors S1 to S5.


The controller 30 performs various controls related to the shovel 100.


The functions of the controller 30 may be realized by any hardware, any combination of hardware and software, or the like. For example, as illustrated in FIG. 4, the controller 30 includes an auxiliary storage 30A, a memory 30B, a central processing unit (CPU) 30C, and an interface device 30D, connected by a bus B1.


The auxiliary storage 30A is a nonvolatile storage, and stores programs to be installed as well as necessary files, data, and the like. The auxiliary storage 30A is, for example, an electrically erasable programmable read-only memory (EEPROM) or a flash memory.


When a command to start a program is issued, the program in the auxiliary storage 30A is loaded into the memory 30B so that the CPU 30C can read it. The memory 30B is, for example, a static random access memory (SRAM).


The CPU 30C executes, for example, a program loaded into the memory 30B, and realizes various functions of the controller 30 in accordance with a command of the program.


The interface device 30D functions as, for example, a communication interface for connecting to the communication line inside the shovel 100. The interface device 30D may include a plurality of different types of communication interfaces in accordance with the types of communication lines to be connected.


The interface device 30D functions as an external interface for reading data from a recording medium and writing data to the recording medium. The recording medium is, for example, a dedicated tool connected to a connector installed inside the cab 10 by a detachable cable. The recording medium may be, for example, a general-purpose recording medium such as an SD memory card or a universal serial bus (USB) memory. Thus, a program for realizing various functions of the controller 30 can be provided by, for example, a portable recording medium and installed in the auxiliary storage 30A of the controller 30. The program may be downloaded from another computer outside the shovel 100 through the communicator 60 and installed in the auxiliary storage 30A.


A part of the functions of the controller 30 may be realized by another controller (control device). That is, the functions of the controller 30 may be realized by a plurality of controllers in a distributed manner.


The operation pressure sensor 29 detects a pilot pressure on the secondary side (pilot line 27A) of the hydraulic pilot type operation device 26, that is, the pilot pressure corresponding to the operation state of each driven element (hydraulic actuator HA) in the operation device 26. The detection signal of the pilot pressure by the operation pressure sensor 29 is taken into the controller 30.


When the operation device 26 is of an electric type, the operation pressure sensor 29 is omitted. This is because the controller 30 can recognize the operation state of each driven element based on an operation signal taken in from the operation device 26.


The imaging device 40 obtains an image around the shovel 100. The imaging device 40 may also obtain (generate) three-dimensional data (hereinafter, simply referred to as “three-dimensional data of an object”) representing a position and an outline of an object around the shovel 100 within an imaging range (angle of view) based on the obtained image and data related to a distance described in the following. The three-dimensional data of the object around the shovel 100 is, for example, data of coordinate information of a point group representing a surface of the object, distance image data, and the like.
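
The conversion from an obtained image with distance data to three-dimensional data of an object can be illustrated, for the distance-image case, by a standard pinhole-camera back-projection. The following is a minimal sketch only; the function name and the intrinsic parameters fx, fy, cx, cy are hypothetical and are not specified in the present disclosure.

```python
import numpy as np

def depth_image_to_points(depth, fx, fy, cx, cy):
    """Convert a distance (depth) image to a point group (N x 3).

    Assumes a simple pinhole camera model; fx, fy, cx, cy are
    hypothetical camera intrinsics (focal lengths and principal
    point in pixels), not values from the present disclosure.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx  # back-project each pixel using its depth
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # discard pixels with no depth
```

A point group obtained in this manner corresponds to the "data of coordinate information of a point group representing a surface of the object" mentioned above.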


For example, as illustrated in FIG. 2, the imaging device 40 includes a camera 40F for imaging the front of the upper rotatable body 3, a camera 40B for imaging the rear of the upper rotatable body 3, a camera 40L for imaging the left side of the upper rotatable body 3, and a camera 40R for imaging the right side of the upper rotatable body 3. As a result, the imaging device 40 can image the entire area around the shovel 100, that is, a range extending 360 degrees in the angular direction, in the top view of the shovel 100. The operator can view peripheral images, such as images captured by the cameras 40B, 40L, and 40R and processed images generated based on the captured images, through the output device 50 (display 50A) or the display for remote control, and can confirm the state to the left, right, and rear of the upper rotatable body 3. The operator can also remotely control the shovel 100 while confirming the operation of the attachment AT including the bucket 6, by viewing peripheral images such as images captured by the camera 40F and processed images generated based on the captured images, through the display for remote control. Hereinafter, the cameras 40F, 40B, 40L, and 40R may be collectively or individually referred to as "camera 40X".


The camera 40X is, for example, a monocular camera. The camera 40X may also be a camera configured to obtain data on a distance (depth) in addition to a two-dimensional image, such as a stereo camera or a time of flight (TOF) camera (hereinafter, collectively referred to as "3D camera").


Output data (for example, image data, three-dimensional data of objects around the shovel 100, etc.) from the imaging device 40 (camera 40X) is taken into the controller 30 through a one-to-one communication line or an in-vehicle network. Thus, for example, the controller 30 can monitor objects around the shovel 100 based on the output data from the camera 40X. Further, for example, the controller 30 can confirm the environment around the shovel 100 based on the output data from the camera 40X. Further, for example, the controller 30 can confirm the posture of the attachment AT reflected in the captured image based on the output data from the camera 40X (camera 40F). Further, for example, the controller 30 can confirm the posture of the body (upper rotatable body 3) of the shovel 100 based on the output data from the camera 40X with the objects around the shovel 100 as a reference.


Some of the cameras 40F, 40B, 40L, and 40R may be omitted. For example, when the shovel 100 is not remotely operated, the cameras 40F and 40L may be omitted. This is because the front and left sides of the shovel 100 are relatively easy for an operator in the cab 10 to check directly. Further, instead of or in addition to the imaging device 40 (camera 40X), a distance sensor may be provided on the upper rotatable body 3. The distance sensor is mounted on the upper rotatable body 3, for example, and obtains data related to the distance and direction of objects around the shovel 100. The distance sensor may also obtain (generate) three-dimensional data (for example, data of coordinate information of a point group) of objects around the shovel 100 within a sensing range based on the obtained data. The distance sensor may be, for example, a light detection and ranging (LIDAR) device. The distance sensor may also be, for example, a millimeter-wave radar, an ultrasonic sensor, an infrared sensor, or the like.


The sensor S1 is attached to the boom 4 and detects a posture angle (hereinafter, referred to as “boom angle”) around the rotation axis of a base end corresponding to a connecting portion of the boom 4 with the upper rotatable body 3. The sensor S1 includes, for example, a rotary potentiometer, a rotary encoder, an acceleration sensor, an angular acceleration sensor, a 6-axis sensor, an inertial measurement unit (IMU), and the like. The same may be applied to the sensors S2 and S4. The sensor S1 may also include a cylinder sensor for detecting the telescopic position of the boom cylinder 7. The same may be applied to the sensor S2. The detection signal of the boom angle by the sensor S1 is taken into the controller 30. Thus, the controller 30 can recognize the posture state of the boom 4.


The sensor S2 is attached to the arm 5 and detects a posture angle (hereinafter, referred to as “arm angle”) around the rotation axis of the base end corresponding to the connecting portion of the arm 5 with the boom 4. The detection signal of the arm angle by the sensor S2 is taken into the controller 30. Thus, the controller 30 can recognize the posture state of the arm 5.


The sensor S3 is attached to the bucket 6 and detects a posture angle (hereinafter, referred to as "bucket angle") around the rotation axis of the base end corresponding to the connecting portion of the bucket 6 with the arm 5. The detection signal of the bucket angle by the sensor S3 is taken into the controller 30. Thus, the controller 30 can recognize the posture state of the bucket 6.


The sensor S4 detects an inclination state of the machine body (for example, the upper rotatable body 3) with respect to a predetermined reference plane (e.g., horizontal plane). For example, the sensor S4 is attached to the upper rotatable body 3 and detects inclination angles (hereinafter, referred to as “front-rear inclination angle” and “left-right inclination angle”) of the shovel 100 (that is, the upper rotatable body 3) around two axes in the front-rear direction and the left-right direction. A detection signal corresponding to the inclination angle (front-rear inclination angle and left-right inclination angle) detected by the sensor S4 is taken into the controller 30. Thus, the controller 30 can recognize the inclination state of the machine body (upper rotatable body 3).


The sensor S5 is attached to the upper rotatable body 3 and outputs detection information on the rotation state of the upper rotatable body 3. The sensor S5 detects, for example, the rotation angular velocity and the rotation angle of the upper rotatable body 3. The sensor S5 includes, for example, a gyro sensor, a resolver, a rotary encoder, and the like. The detection information on the rotation state detected by the sensor S5 is taken into the controller 30. Thus, the controller 30 can recognize the rotation state such as the rotation angle of the upper rotatable body 3.


When the sensor S4 includes a gyro sensor capable of detecting the angular velocity around three axes, a 6-axis sensor, an IMU, and the like, the rotation state (e.g. rotation angular velocity) of the upper rotatable body 3 may be detected based on the detection signal of the sensor S4. In this case, the sensor S5 may be omitted. Further, when the posture state of the upper rotatable body 3, the attachment AT, and the like can be recognized based on the output of the imaging device 40 and the distance sensor, at least a part of the sensors S1 to S5 may be omitted.
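
From the boom angle, arm angle, and bucket angle detected by the sensors S1 to S3, the controller 30 can compute the position of a work portion such as the toe of the bucket 6. The following is a minimal planar-kinematics sketch under assumed conditions: the angles are treated as absolute posture angles from the horizontal, and the link lengths are hypothetical example values, not values from the present disclosure.

```python
import math

def toe_position(boom_angle, arm_angle, bucket_angle,
                 boom_len=5.7, arm_len=2.9, bucket_len=1.4):
    """Planar forward kinematics of the attachment AT.

    Angles are in radians, measured from the horizontal; the link
    lengths (meters) are hypothetical example values. Returns (x, z),
    the horizontal reach and height of the bucket toe relative to the
    boom foot pin on the upper rotatable body 3.
    """
    x = (boom_len * math.cos(boom_angle)
         + arm_len * math.cos(arm_angle)
         + bucket_len * math.cos(bucket_angle))
    z = (boom_len * math.sin(boom_angle)
         + arm_len * math.sin(arm_angle)
         + bucket_len * math.sin(bucket_angle))
    return x, z
```

In practice, the inclination of the machine body detected by the sensor S4 and the rotation state detected by the sensor S5 would further transform this position into a site coordinate system.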


[Overview of Target-Shape Estimation Function]

Next, referring to FIGS. 1 to 4 as well as FIGS. 5 and 6, an outline of a function (hereinafter, referred to as “target-shape estimation function” for convenience) related to estimation of a target shape of an object to be constructed by the shovel 100 will be described.



FIG. 5 is a diagram schematically illustrating an example of a method for estimating a target shape of a constructing target in slope work.



FIG. 6 is a diagram schematically illustrating an example of a method for estimating a target shape in floor excavation work.


The constructing target is an object to be formed on the ground or the like by the shovel 100 in construction work. The construction work is, for example, slope work, floor excavation work, and ground leveling work. The constructing target is, for example, a horizontal surface, slope, groove, embankment, and the like.


The target shape of the constructing target is a goal for the shape of the constructing target that is expected to be finally realized by the operation of the shovel 100. The target shape of the constructing target is, for example, a target construction surface to be formed as a flat surface. The target shape of the constructing target may also be a target construction surface formed by a curved surface having a predetermined curvature.


In the present embodiment, the shovel 100 (controller 30) estimates the target shape of the constructing target based on the shape of a part of the constructing target that has already been constructed by the operation of a shovel by a skilled person. The estimated target shape of the constructing target may be the overall target shape of the constructing target, or it may be the target shape of a partial area of the constructing target that is larger than the part that has already been constructed. The shovel used by the skilled person for the construction work of the constructing target may be the shovel 100 or another shovel different from the shovel 100. For example, the skilled person is an operator who has relatively long experience in operation of a shovel and a relatively high level of operation skill.


For example, as illustrated in FIG. 5, in the slope work, an area 501, which is a part that has already been constructed in the slope of the constructing target, is formed. As described above, the area 501 is constructed in advance based on the operation of a shovel by a skilled person.


First, the shovel 100 (controller 30) uses the imaging device 40 to capture an image of the shape of the area 501, and obtains shape data of the area 501 based on an output of the imaging device 40.


Next, the shovel 100 (controller 30) estimates a target construction surface 502 corresponding to the target shape of the constructing target based on the shape data of the area 501.


For example, the controller 30 estimates the target construction surface 502 by extending a planar shape of the area 501 in a width direction of the slope or duplicating and arranging the planar shape in the width direction, and obtains data on the target construction surface 502. Hereinafter, this estimation method may be referred to as a “first estimation method” of the target shape of the constructing target.
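
The first estimation method can be illustrated by fitting a plane to the shape data of the already-constructed area and sampling that plane over a wider region. This is a minimal sketch only; the function names and the least-squares plane model are assumptions for illustration, not the implementation of the present disclosure.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane z = a*x + b*y + c through a point group.

    The planar shape of the already-constructed area (e.g. area 501)
    is modeled as a plane; returns the coefficients (a, b, c).
    """
    A = np.c_[points[:, 0], points[:, 1], np.ones(len(points))]
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    return coeffs

def extend_surface(coeffs, x_range, y_range, step):
    """Sample the fitted plane over a wider area, i.e. extend the
    planar shape in the width direction of the slope."""
    xs = np.arange(*x_range, step)
    ys = np.arange(*y_range, step)
    gx, gy = np.meshgrid(xs, ys)
    gz = coeffs[0] * gx + coeffs[1] * gy + coeffs[2]
    return np.stack([gx, gy, gz], axis=-1)  # (rows, cols, xyz) grid
```

The sampled grid plays the role of the data on the target construction surface 502 in this sketch.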


Further, the controller 30 may use a trained model generated by machine learning based on training data that is a set of shape data of a part that has already been constructed in a slope of the constructing target and shape data of the target construction surface of the slope to be constructed. In this case, the controller 30 can estimate the target construction surface 502 and obtain data on the target construction surface 502 by applying the trained model to the shape data of the area 501 as input data. Hereinafter, this estimation method may be referred to as a "second estimation method" of the target shape of the constructing target.


Further, as illustrated in FIG. 6, for example, in a floor excavation work, a groove 601 that is a part that has already been constructed in the groove of a constructing target is formed. As described above, the groove 601 is constructed in advance based on the operation of a shovel by a skilled person.


First, the shovel 100 (controller 30) uses the imaging device 40 to capture an image of the shape of the groove 601 and obtains shape data of the groove 601 based on an output of the imaging device 40.


Subsequently, the shovel 100 (controller 30) estimates a target construction surface 602 corresponding to a target shape of the groove to be constructed based on the shape data of the groove 601. The target construction surface 602 includes target construction surfaces corresponding to side surfaces at both ends in the width direction of the groove, target construction surfaces corresponding to side surfaces at both ends in the length direction of the groove, and a target construction surface corresponding to the bottom of the groove.


For example, the controller 30 estimates the target construction surface 602 by the first estimation method. Specifically, the controller 30 may estimate the target construction surfaces corresponding to the side surfaces of the target construction surface 602, which are at both ends in the width direction of the groove, by extending the shapes of the side surfaces at both ends in the width direction of the groove 601 in the length direction or by duplicating and arranging them in the length direction. The controller 30 may estimate the target construction surface that corresponds to the bottom of the groove of the target construction surface 602 by extending the shape of the bottom of the groove 601 in the length direction of the groove of the constructing target or by duplicating and arranging the shape in the length direction. The controller 30 may also estimate the target construction surfaces that correspond to the side surfaces at both ends in the length direction of the groove of the target construction surface 602 by using the shapes of both end surfaces in the length direction of the groove 601 as they are or by offsetting their positions in the length direction of the groove of the constructing target.
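
For the groove case, the "duplicating and arranging" variant of the first estimation method can be sketched as tiling one measured cross-section along the groove length. The function name, the (y, z) profile encoding, and the pitch parameter are hypothetical choices made for illustration.

```python
import numpy as np

def extend_groove_profile(profile, length, pitch):
    """Duplicate a groove cross-section along the groove length.

    profile: (N, 2) array of (y, z) points describing one cross-section
    (width and depth) of the already-constructed groove (e.g. groove
    601). Copies are arranged at `pitch` intervals along the length
    direction (x) up to `length`, yielding an (M, 3) point group for
    the estimated target construction surface.
    """
    xs = np.arange(0.0, length + 1e-9, pitch)
    sections = [np.c_[np.full(len(profile), x), profile] for x in xs]
    return np.vstack(sections)
```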


The controller 30 may also estimate the target construction surface 602 by the second estimation method. Specifically, the controller 30 may use a trained model generated by machine learning based on training data that is a set of data on the shape of a part that has already been constructed in the groove of the constructing target and shape data of the target construction surface of the groove to be constructed. In this case, the controller 30 can estimate the target construction surface 602 and obtain data on the target construction surface 602 by applying the trained model to the shape data of the groove 601 as input data and using the output data of the trained model.
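
The second estimation method can be sketched abstractly as a learned map from features of the already-constructed part (input data) to parameters of the target construction surface (output data). The toy model below is a stand-in only: it uses a least-squares linear regressor, and the feature and output encodings are hypothetical, since the disclosure does not fix a model form.

```python
import numpy as np

class TargetShapeModel:
    """Toy stand-in for a trained model of the second estimation method."""

    def fit(self, X, Y):
        # X: (n_samples, n_features) features of already-constructed parts.
        # Y: (n_samples, n_outputs) parameters of target construction surfaces.
        A = np.c_[np.asarray(X, float), np.ones(len(X))]  # affine terms
        self.W, *_ = np.linalg.lstsq(A, np.asarray(Y, float), rcond=None)
        return self

    def predict(self, X):
        # Apply the learned map to new shape features (input data).
        A = np.c_[np.asarray(X, float), np.ones(len(X))]
        return A @ self.W
```

After training on pairs collected from work completed by skilled operators, predict() would return estimated surface parameters for an input such as the shape data of the area 501 or the groove 601.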


As described above, in the present embodiment, the controller 30 can estimate the target shape of the constructing target based on the shape data of a part of the constructing target that has already been constructed by operation of a shovel by a skilled person, and can obtain data related to the target shape. Thus, the controller 30 can more easily obtain data related to the target shape of the constructing target. Since there is no need to prepare data related to the target shape in advance, for example, there is no need for the user to manually input parameters related to the target shape. Moreover, even in a small-scale construction site where, for example, a system or funds for preparing data related to the target shape in advance are not secured, data related to the target shape can be obtained, and work efficiency can be enhanced by employing techniques related to machine guidance and machine control.


For example, the target-shape estimation function is applied to the shovel 100 operated by an operator in the cab 10 or a remote operator. In this case, the shovel 100 can support the operation of the operator by using the machine guidance function and the MC function of an operation support type based on data related to the target shape estimated by the target-shape estimation function. The target-shape estimation function may also be applied to the shovel 100 operated by a fully automatic drive function (fully automatic MC function). In this case, based on the data related to the target shape estimated by the target-shape estimation function, the shovel 100 can automatically perform construction by the fully automatic drive function so that the constructing target attains the target shape.


[First Example of Functional Configuration Related to Estimation of Target Shape of Constructing Target]

Next, a first example of a functional configuration related to estimation of a target shape of a constructing target will be described with reference to FIG. 7 in addition to FIGS. 1 to 6.



FIG. 7 is a functional block diagram illustrating a first example of a functional configuration related to estimation of a target shape of a constructing target.


The shovel 100 includes a supporting device 150 as illustrated in FIG. 7.


The supporting device 150 supports work of the shovel 100. The supporting device 150 includes the controller 30, the imaging device 40, the display 50A, the input device 52, and the communicator 60.


The controller 30 includes, as functional parts, a topographic shape obtainer 301, a target shape estimator 302, a display processor 303, a target shape corrector 304, a target shape data storage 305, and a work support controller 306.


The topographic shape obtainer 301 obtains, based on an output of the imaging device 40 and the distance sensor, data on the topographic shape of a place where a constructing target is formed around the shovel 100, including a place that has already been constructed in the constructing target. The topographic shape data includes, for example, both a part that has already been constructed and a part that has not been constructed in the constructing target. The topographic shape data may be, for example, image data or three-dimensional data.


The target shape estimator 302 estimates a target shape of the constructing target, based on the data obtained by the topographic shape obtainer 301, by using the first estimation method described above, and obtains data on the target shape of the constructing target.


The display processor 303 causes the display 50A to display the target shape of the constructing target as the estimation result by the target shape estimator 302. When the shovel 100 is remotely operated or remotely monitored, the display processor 303 may transmit the image data of the target shape of the constructing target as the estimation result by the target shape estimator 302 to the remote operation supporting device 300 or the remote monitoring supporting device through the communicator 60. Thus, the display processor 303 can cause the display of the remote operation supporting device 300 or the remote monitoring supporting device to display the target shape of the constructing target as the estimation result by the target shape estimator 302, and a user (operator) can visually confirm the data related to the target shape of the constructing target.


The display processor 303 may also cause the display 50A to display the target shape of the constructing target as the estimation result by the target shape estimator 302 and a current topographical shape based on an output from the imaging device 40 so as to be comparable (see FIGS. 10 and 11). Similarly, the display processor 303 may transmit an image capable of comparing the target shape of the constructing target as the estimation result by the target shape estimator 302 and the current topographical shape based on the output from the imaging device 40, to the remote operation supporting device 300 and the remote monitoring supporting device. Thus, the user (operator) can compare the current topographical shape with the target shape of the constructing target to confirm whether or not an appropriate target shape has been generated (estimated).


The target shape corrector 304 corrects data related to the target shape of the constructing target, which is the estimation result by the target shape estimator 302, in response to a predetermined input from the user (operator). The input from the user is received by the input device 52. When the shovel 100 is remotely operated or remotely monitored, the input from the user is received from the remote operation supporting device 300 or the remote monitoring supporting device through the communicator 60. For example, the target shape corrector 304 corrects the data related to the target shape of the constructing target with a total of 6 degrees of freedom in a three-dimensional orthogonal coordinate system, namely, translational motion along the X-axis, the Y-axis, and the Z-axis and rotational motion about the X-axis, the Y-axis, and the Z-axis. The target shape corrector 304 may also correct the data related to the target shape of the constructing target with a total of 9 degrees of freedom by adding expansion or reduction in the directions along the X-axis, the Y-axis, and the Z-axis.
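
The 6-degree-of-freedom (and optional 9-degree-of-freedom) correction described above can be sketched as applying per-axis scaling, an X-Y-Z rotation, and a translation to the target-shape point data. The function below and its argument conventions are hypothetical choices made for illustration.

```python
import numpy as np

def correct_target_shape(points, translation=(0, 0, 0),
                         rotation=(0, 0, 0), scale=(1, 1, 1)):
    """Apply the 9-degree-of-freedom correction to target-shape points.

    points: (N, 3) array; translation along X/Y/Z, rotation (radians)
    about X/Y/Z applied in X-Y-Z order, and per-axis scaling. With
    scale fixed at (1, 1, 1) this reduces to the 6-DOF correction.
    """
    rx, ry, rz = rotation
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(rx), -np.sin(rx)],
                   [0, np.sin(rx), np.cos(rx)]])
    Ry = np.array([[np.cos(ry), 0, np.sin(ry)],
                   [0, 1, 0],
                   [-np.sin(ry), 0, np.cos(ry)]])
    Rz = np.array([[np.cos(rz), -np.sin(rz), 0],
                   [np.sin(rz), np.cos(rz), 0],
                   [0, 0, 1]])
    R = Rz @ Ry @ Rx
    scaled = points * np.asarray(scale, float)
    return scaled @ R.T + np.asarray(translation, float)
```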


The target shape data storage 305 stores data related to the target shape of the constructing target. Specifically, data related to the target shape of the constructing target as the estimation result by the target shape estimator 302 or data related to the target shape of the constructing target as the correction result by the target shape corrector 304 may be stored.


The work support controller 306 performs control for supporting work related to the construction of the constructing target based on data related to the target shape of the constructing target.


For example, the work support controller 306 performs control related to machine guidance based on data related to the target shape of the constructing target. Specifically, the work support controller 306 may notify the operator of information such as a distance between the target shape and a work portion such as a toe or a back of the bucket 6 through the output device 50 such as the display 50A or the communicator 60.
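
The distance notified in machine guidance can be computed, for example, as a signed point-to-plane distance when the target construction surface is locally approximated by a plane. This is a minimal sketch; the function and its sign convention are assumptions for illustration, not the implementation of the present disclosure.

```python
import numpy as np

def toe_to_surface_distance(toe, plane_point, plane_normal):
    """Signed distance from the bucket toe to the target construction surface.

    The surface is locally approximated by a plane through `plane_point`
    with normal `plane_normal` (pointing up, away from the ground).
    A positive value means the toe is above the target surface.
    """
    n = np.asarray(plane_normal, float)
    n = n / np.linalg.norm(n)  # normalize so the result is a length
    return float(np.dot(np.asarray(toe, float) - np.asarray(plane_point, float), n))
```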


In addition, the work support controller 306 may perform control related to machine control (automatic drive function) based on data related to the target shape of the constructing target. More specifically, the work support controller 306 may control the hydraulic control valve 31 and operate the attachment AT or the like so as to support operation of the operator, or so as to move the work portion such as the toe or the back of the bucket 6 on a track along the target shape without depending on the operation by the operator.


[First Example of Processing Related to Estimation of Target Shape of Constructing Target]

Next, a first example of processing related to estimation of the target shape of the constructing target will be described with reference to FIGS. 8 to 11 in addition to FIGS. 1 to 7.



FIG. 8 is a flowchart schematically illustrating a first example of processing related to estimation of the target shape of the constructing target. FIG. 9 is a diagram illustrating an example (screen 900) of the displayed contents of the display 50A displaying a topographical shape around the shovel 100. FIGS. 10 and 11 are diagrams illustrating an example and another example of the displayed contents of the display 50A displaying an estimation result of the target shape of the constructing target. Specifically, FIG. 10 is a diagram illustrating the displayed contents of the display 50A when the target shape of the constructing target as the estimation result is appropriate, and FIG. 11 is a diagram illustrating the displayed contents of the display 50A when the target shape of the constructing target as the estimation result is inappropriate.


The flowchart of FIG. 8 starts when a predetermined input from a user (operator, supervisor, etc.) is received through, for example, the input device 52 or the communicator 60.


As illustrated in FIG. 8, in step S102 (one example of an obtainment step), the topographic shape obtainer 301 obtains, based on the output of the imaging device 40, image data around the shovel 100 or three-dimensional data representing a state around the shovel 100 generated from the image data.


When processing of step S102 is completed, the controller 30 proceeds to step S104.


In step S104, the display processor 303 causes the display 50A, the remote operation supporting device 300, or the like to display an image representing the topographic shape of the place around the shovel 100 where the constructing target is to be formed, based on the data obtained in step S102 (see FIG. 9).


For example, as illustrated in FIG. 9, images 901 to 903 are displayed on the screen 900 of the display 50A.


The image 901 is an image representing a topographical shape around the shovel 100 where the constructing target (in this example, a slope) is to be formed. Specifically, in the image 901, a topographical shape around the shovel 100 is represented by three-dimensional mesh data. Thus, the user can recognize the current topographical shape of the place where the constructing target is to be formed.


The image 902 is an operation icon for executing processing for estimating the target shape of the constructing target. Thus, the user can execute the processing for estimating the target shape of the constructing target by operating the icon of the image 902 through the input device 52 or the like (see step S106 in FIG. 8).


The image 903 is an operation icon for ending the processing of the flowchart in FIG. 8 and returning to a predetermined screen. The same applies to images 1004 and 1104 described in the following. By using the operation icon, the processing of the flowchart of FIG. 8 can be stopped halfway.


Returning to FIG. 8, when the processing of step S104 is completed, the controller 30 proceeds to step S106.


In step S106, the controller 30 determines whether or not an operation for executing the processing for estimating the target shape of the constructing target has been performed through the input device 52 or the like. When the operation has been performed, the controller 30 proceeds to step S108; otherwise (for example, when the image 903 is operated or when no operation is performed even after a certain period of time has elapsed), the controller 30 ends the processing of the flowchart for this case.


The processing of steps S104 and S106 may be omitted.


In step S108 (one example of an estimation step), the target shape estimator 302 estimates the target shape of the constructing target based on the data obtained in step S102 by using the above-described first estimation method, and obtains data related to the target shape of the constructing target.


When the processing in step S108 is completed, the controller 30 proceeds to step S110.


In step S110, the display processor 303 causes the display 50A, the remote operation supporting device 300 (display), or the like to display the target shape of the constructing target as the estimation result of step S108 (see FIGS. 10 and 11).


For example, as illustrated in FIG. 10, a screen 1000 includes images 1001 to 1005. Similarly, as illustrated in FIG. 11, a screen 1100 includes images 1101 to 1105.


The images 1001 and 1101 are each an image representing the topographical shape around the shovel 100 where a constructing target (in this example, a slope) is to be formed, similarly to the image 901 in FIG. 9.


The images 1002 and 1102 are each an image representing a target shape (target construction surface) of a constructing target.


The images 1003 and 1103 are each an operation icon for confirming (finalizing) the target shape (target construction surface) of the constructing target as the currently displayed contents.


The images 1004 and 1104 are each an operation icon for shifting to a screen for correcting the target shape of the constructing target from the currently displayed contents.


On the screen 1000, the target shape (image 1002) of the constructing target is appropriately estimated with respect to the current topographical shape (image 1001) around the shovel 100. Therefore, the user can confirm data of the currently displayed contents as the target shape of the constructing target by operating the icon of the image 1003 through the input device 52, the remote operation supporting device 300, or the like.


On the other hand, on the screen 1100, the target shape of the constructing target (image 1102) is estimated in a form that does not match the current topographical shape around the shovel 100 at all. Accordingly, the user may correct the target shape of the constructing target by operating the operation icon of the image 1104 through the input device 52, the remote operation supporting device 300, or the like.


Returning to FIG. 8, when the processing of step S110 is completed, the controller 30 proceeds to step S112.


In step S112, the controller 30 determines whether or not an operation for correcting the target shape of the constructing target has been performed through the input device 52, the remote operation supporting device 300 (input device), or the like. When the operation for correcting the target shape of the constructing target is performed, the controller 30 proceeds to step S114, and when the operation for confirming the target shape of the constructing target is performed, the controller 30 proceeds to step S116.


In step S114, the target shape corrector 304 corrects the data related to the target shape of the constructing target in response to input from the user through the input device 52, the remote operation supporting device 300, or the like.


When the processing of step S114 is completed, the controller 30 proceeds to step S116.


In step S116, the controller 30 confirms the target shape and stores the data related to the target shape in the target shape data storage 305.


When the processing of step S116 is completed, the controller 30 ends the processing of the flowchart for the case.


As described above, in the present example, the supporting device 150 is configured to estimate the target shape of the constructing target based on the shape of the part that has already been constructed by, for example, shovel operation by a skilled person. Thus, data related to the target shape of the constructing target can be obtained more easily.


Furthermore, in the present example, the supporting device 150 can correct the target shape of the constructing target as the estimation result in response to an input from the user.


[Outline of Operation Support System]

Next, an operation support system SYS will be described with reference to FIG. 12 in addition to FIGS. 1 to 3.


As illustrated in FIG. 12, the shovel 100 may be a component of the operation support system SYS. Specifically, the operation support system SYS includes the shovel 100 and the information processor 200.


The operation support system SYS cooperates with the shovel 100 by using the information processor 200 to support operation of the shovel 100.


The operation support system SYS may include one shovel 100 or a plurality of shovels 100.


The shovel 100 is a work machine whose operation is supported by the operation support system SYS.


The information processor 200 communicates and mutually cooperates with the shovel 100 to support operation of the shovel 100.


The information processor 200 is, for example, a server or a management terminal device installed in a management office in a work site of the shovel 100, or in a management center that is located in a place different from the work site of the shovel 100 and that manages the operation status of the shovel 100. The management terminal device may be, for example, a stationary terminal device such as a desktop personal computer (PC), or a portable terminal device (mobile terminal) such as a tablet terminal, a smartphone, or a laptop PC. In the latter case, a worker at the work site, a supervisor who supervises the work, a manager who manages the work site, or the like can move within the work site while holding the portable information processor 200, and an operator can, for example, bring the portable information processor 200 into the cab of the shovel 100.


The information processor 200 obtains data on the operating state from, for example, the shovel 100. Thus, the information processor 200 can recognize the operating state of the shovel 100 and monitor whether or not there are any abnormalities in the shovel 100. Further, the information processor 200 can display data related to the operating state of the shovel 100 through a display 208 described in the following, and make the user check the data.


Further, the information processor 200 transmits, for example, various types of data, such as a program used in the processing of the controller 30 and reference data, to the shovel 100. Thus, the shovel 100 can perform various types of processing related to the operation of the shovel 100 by using the various types of data downloaded from the information processor 200.


[Hardware Configuration of Operation Support System]

Next, a hardware configuration of the operation support system SYS will be described with reference to FIG. 13 in addition to FIGS. 1 to 4 and FIG. 12.


Since the hardware configuration of the shovel 100 is the same as that illustrated in FIG. 4, a description thereof will be omitted.



FIG. 13 is a block diagram illustrating an example of a hardware configuration of the information processor 200.


The functions of the information processor 200 are realized by any hardware, any combination of hardware and software, or the like. For example, as illustrated in FIG. 13, the information processor 200 includes an external interface 201, an auxiliary storage 202, a memory 203, a CPU 204, a high-speed computer 205, a communication interface 206, an input device 207, and the display 208 connected by a bus B2.


The external interface 201 functions as an interface for reading data from a recording medium 201A and writing data to the recording medium 201A. The recording medium 201A includes, for example, a flexible disk, a compact disc (CD), a digital versatile disc (DVD), a Blu-ray disc (BD (registered trademark)), an SD memory card, a USB memory, and the like. As a result, the information processor 200 can read various types of data used in processing through the recording medium 201A, store them in the auxiliary storage 202, and install programs for realizing various functions.


The information processor 200 may obtain various types of data and programs used in the processing from an external device through the communication interface 206.


The auxiliary storage 202 stores various installed programs and also stores files, data, and the like required for the various types of processing. The auxiliary storage 202 includes, for example, a hard disk drive (HDD), a solid state drive (SSD), a flash memory, and the like.


The memory 203 reads a program from the auxiliary storage 202 and stores the program in response to an instruction to start the program. The memory 203 includes, for example, a dynamic random access memory (DRAM) and a static random access memory (SRAM).


The CPU 204 executes various programs loaded from the auxiliary storage 202 into the memory 203 and realizes various functions related to the information processor 200 according to the programs.


The high-speed computer 205 performs computation processing at a relatively high speed in conjunction with the CPU 204. The high-speed computer 205 includes, for example, a graphics processing unit (GPU), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), and the like.


Note that the high-speed computer 205 may be omitted depending on the required speed of computation processing.


The communication interface 206 is used as an interface for communicatively connecting to an external device. Thus, the information processor 200 can communicate with, for example, an external device such as the shovel 100 through the communication interface 206. The communication interface 206 may include a plurality of types of communication interfaces depending on a communication method or the like with the connected device.


The input device 207 receives various inputs from a user.


The input device 207 includes, for example, an operation input device that receives mechanical operation input from a user. The operation input device includes, for example, a button, a toggle, a lever, and the like. The operation input device also includes, for example, a touch panel mounted on the display 208, a touch pad provided separately from the display 208, and the like.


The input device 207 also includes, for example, a voice input device configured to receive voice input from a user. The voice input device includes, for example, a microphone configured to collect voice of a user.


The input device 207 also includes, for example, a gesture input device configured to receive gesture input from a user. The gesture input device includes, for example, a camera configured to image a gesture of the user.


The input device 207 includes, for example, a biological input device configured to receive biological input from a user. The biological input device includes, for example, a camera configured to obtain image data including information on a fingerprint and an iris of the user.


The display 208 displays an information screen and an operation screen to the user. For example, the display 208 includes the above-described display for remote operation. The display 208 is, for example, a liquid crystal display or an organic electroluminescence (EL) display.


Like the information processor 200, the remote operation supporting device 300 is realized by any hardware, any combination of hardware and software, or the like, and the same hardware configuration may be adopted. For example, like the information processor 200 (FIG. 13), the remote operation supporting device 300 is configured around a computer including a CPU, a memory, an auxiliary storage, an interface device, an input device, and a display. The memory is, for example, an SRAM or a DRAM. The auxiliary storage is, for example, an HDD, an SSD, an EEPROM, a flash memory, or the like. The interface device includes an external interface for connecting with an external recording medium and a communication interface for communicating with the outside such as the shovel 100. The input device includes, for example, a lever-type operation input device. Thus, via the operation input device, the operator can perform an operation input related to the actuator of the shovel 100, and via the communication interface, the remote operation supporting device 300 can transmit a signal corresponding to the operation input to the shovel 100. Accordingly, the operator can remotely operate the shovel 100 using the remote operation supporting device 300.


[Second Example of Functional Configuration Related to Estimation of Target Shape of Constructing Target]

Next, a second example of a functional configuration related to estimation of a target shape of a constructing target will be described with reference to FIG. 14 in addition to FIGS. 1 to 4, and FIGS. 12 and 13.


Hereinafter, description will be focused on points that are different from the first example (FIG. 7), and the description that is the same or that corresponds to that in the first example described above may be simplified or omitted.



FIG. 14 is a functional block diagram illustrating a second example of a functional configuration related to estimation of a target shape of a constructing target.


As illustrated in FIG. 14, the shovel 100 includes the supporting device 150.


Similar to the case of the first example (FIG. 7) described above, the supporting device 150 includes the controller 30, the imaging device 40, the display 50A, the input device 52, and the communicator 60.


The controller 30 includes, as functional parts, the topographic shape obtainer 301, the target shape estimator 302, the display processor 303, the target shape corrector 304, the target shape data storage 305, the work support controller 306, a trained model storage 307, and a transmitter 308.


The trained model storage 307 stores a trained model LM.


The trained model LM is used to estimate a target shape of a constructing target by the second estimation method. The trained model LM is distributed from the information processor 200.


The target shape estimator 302 estimates the target shape of the constructing target by using the second estimation method described above, based on the data obtained by the topographic shape obtainer 301, and obtains data related to the target shape of the constructing target. Specifically, the target shape estimator 302 may estimate the target shape of the constructing target by applying the trained model LM to the data obtained by the topographic shape obtainer 301 as input data, and using the resulting output data of the trained model LM.
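The flow above, in which obtained topographic data is fed to the trained model LM as input data and its output is used as the estimate, may be sketched as follows. This is only an illustrative sketch: the toy "model" (a moving average over a height profile) stands in for a real trained model, and all function names are assumptions, not part of the disclosure.

```python
# Toy stand-in for the trained model LM: maps a measured height profile
# to a target height profile. A real model would be trained elsewhere
# and distributed to the shovel; here, a 3-point moving average suffices
# to illustrate the call pattern.

def toy_trained_model(terrain_heights):
    """Stand-in for LM: 3-point moving average of the height profile."""
    out = []
    for i in range(len(terrain_heights)):
        window = terrain_heights[max(0, i - 1):i + 2]
        out.append(sum(window) / len(window))
    return out

def estimate_target_shape(obtained_heights, trained_model):
    """Apply the trained model to the obtained topographic data."""
    return trained_model(obtained_heights)

# Heights (m) of the current terrain at sampled positions.
estimate = estimate_target_shape([10.0, 9.0, 9.5, 8.0], toy_trained_model)
```

The estimator itself stays agnostic of the model internals; swapping in an updated trained model distributed from the information processor 200 would not change the call site.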


The transmitter 308 transmits, to the information processor 200, a set of: the data obtained by the topographic shape obtainer 301; and the data that corresponds to the obtained data and that relates to the target shape of the constructing target stored (registered) in the target shape data storage 305.


The information processor 200 includes a training data storage 2001, a machine learning part 2002, a trained model storage 2003, and a distributor 2004 as functional parts related to the estimation of the target shape of the constructing target.


The training data storage 2001 stores (registers) training data for generating the trained model LM. The training data storage 2001 may also include training data for updating the trained model LM by re-learning or additional learning. The training data is a set of data on a topographic shape including a place that has already been constructed in a constructing target and data on a target shape of the constructing target.


For example, the training data includes a set of: the data obtained by the topographic shape obtainer 301 and received from the shovel 100 (transmitter 308); and the data that corresponds to the obtained data and that is finally confirmed as data related to the target shape of the constructing target.


The machine learning part 2002 performs machine learning on a predetermined model by using the training data set of the training data storage 2001 to generate the trained model LM.


The machine learning part 2002 may update the trained model LM by re-learning or additionally learning the trained model LM by using the training data set of the training data storage 2001.


The trained model LM generated by the machine learning part 2002 is stored (registered) in the trained model storage 2003. The trained model LM in the trained model storage 2003 may be updated by re-learning or additional learning by the machine learning part 2002.
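The registration-and-retraining cycle described above may be sketched as follows. This is a hedged illustration only: the "retraining" here fits nothing more than an average per-point height offset from the registered (terrain, target) pairs, as a stand-in for whatever model the machine learning part 2002 actually trains; all class and function names are assumptions.

```python
# Illustrative training-data storage and a toy retraining step.
# Each registered pair is (terrain profile, confirmed target profile).

class TrainingDataStorage:
    def __init__(self):
        self.pairs = []                      # list of (terrain, target) pairs

    def register(self, terrain, target):
        self.pairs.append((terrain, target))

def retrain(storage):
    """Fit the mean per-point offset between target and terrain profiles,
    and return a callable 'trained model' applying that offset."""
    deltas = [t - h
              for terrain, target in storage.pairs
              for h, t in zip(terrain, target)]
    offset = sum(deltas) / len(deltas)
    return lambda terrain: [h + offset for h in terrain]

storage = TrainingDataStorage()
storage.register([10.0, 9.0], [9.0, 8.0])    # confirmed pair from a shovel
storage.register([8.0, 7.0], [7.0, 6.0])
lm = retrain(storage)                         # updated "trained model"
```

Re-learning or additional learning corresponds to calling `retrain` again after further pairs, including corrected target shapes, have been registered.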


The distributor 2004 distributes the trained model LM to the shovel 100.


[Second Example of Processing Related to Estimation of Target Shape of Constructing Target]

Next, a second example of processing related to estimation of a target shape of a constructing target will be described with reference to FIG. 15 in addition to FIGS. 1 to 4 and FIGS. 12 to 14.


Hereinafter, description will be focused on points that are different from the first example (FIG. 8), and the description that is the same or that corresponds to that in the first example described above may be simplified or omitted.


As illustrated in FIG. 15, processes of steps S202 and S204 are the same as those of steps S102 and S104 in FIG. 8, and therefore the description thereof is omitted.


When processing of step S204 is completed, the controller 30 proceeds to step S206.


In step S206, the controller 30 determines whether or not an operation that executes processing for estimating a target shape of a constructing target has been performed via the input device 52 or the like, as in step S106 in FIG. 8. When the operation has been performed, the controller 30 proceeds to step S208, and otherwise ends the processing of the flowchart for the case.


In step S208, the target shape estimator 302 estimates a target shape of a constructing target based on the data obtained in step S202 by using the second estimation method described above, and obtains data related to the target shape of the constructing target.


When the processing of step S208 is completed, the controller 30 proceeds to step S210.


Since processes of steps S210 to S216 are the same as those of steps S110 to S116 in FIG. 8, the description thereof is omitted.


When the processing of step S216 is completed, the controller 30 proceeds to step S218.


In step S218, the transmitter 308 transmits the data obtained in step S202 and the data of the target shape of the constructing target confirmed in step S216 to the information processor 200. Specifically, the transmitter 308 transmits to the information processor 200 a set of data including data of the current topographic shape of a place where the constructing target is to be formed, including a place that has already been constructed in the constructing target, and the data of the target shape of the constructing target finally confirmed.


When the processing of step S218 is completed, the controller 30 ends the processing of the flowchart for the case.


The per-execution transmission in step S218 may be omitted, and the transmitter 308 may instead collectively transmit, as batch processing, the data obtained by executing the processing of the flowchart a plurality of times to the information processor 200.
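The batch variant above can be sketched as a small buffer that accumulates (terrain, target) pairs and flushes them in one transmission. The transport is stubbed out as a plain callable; the class and parameter names are illustrative assumptions, not from the disclosure.

```python
# Illustrative batching of (terrain, target) pairs before transmission.

class BatchTransmitter:
    def __init__(self, send_fn, batch_size=3):
        self.send_fn = send_fn               # transport stub, e.g. an upload
        self.batch_size = batch_size
        self.buffer = []

    def queue(self, terrain, target):
        self.buffer.append((terrain, target))
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.buffer:
            self.send_fn(list(self.buffer))  # one batched send
            self.buffer.clear()

sent = []                                    # records each batched send
tx = BatchTransmitter(sent.append, batch_size=2)
tx.queue([10.0], [9.0])
tx.queue([8.0], [7.0])                       # second pair triggers one send
```

A final `flush()` call would cover any remainder when the flowchart is executed a number of times that is not a multiple of the batch size.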


As described above, in the present example, the supporting device 150 can estimate the target shape of the constructing target using the trained model LM based on the data related to the current topographic shape of the place where the constructing target is to be formed, including a place that has already been constructed in the constructing target.


In the present example, the information processor 200 can update the trained model LM by using the training data, which is a set of the data related to the topographic shape including the place that has already been constructed in the constructing target and the data related to the target shape of the constructing target. Thus, the result of the correction of the target shape of the constructing target by the target shape corrector 304 can be reflected in the trained model LM. Accordingly, accuracy in estimation of the target shape of the constructing target using the trained model LM can be enhanced.


OTHER EMBODIMENTS

Next, another embodiment will be described.


The contents of the above-described embodiments may be appropriately combined, or modified or changed.


For example, in the above-described embodiments, when the shovel 100 carries out the construction work of a part of the constructing target, processing related to the construction work as pre-processing and processing related to estimation of the target shape (see FIGS. 8 and 15) may be executed as a series of processing. In this case, in the processing related to the construction work as the pre-processing or in processing parallel with the processing, the trajectory of the work portion such as the toe or the back of the bucket 6 during the construction work may be recorded in the auxiliary storage 30A or the like. As a result, the topographic shape obtainer 301 can obtain data related to the topographic shape including a place that has already been constructed in the constructing target on the basis of the data related to the trajectory of the work portion. This is because the data related to the trajectory of the work portion is considered to represent the relative topographic shape of the completed portion as seen from the shovel 100. In this case, the target shape estimator 302 may estimate the target shape of the constructing target on the basis of the data related to the trajectory of the work portion. Furthermore, in this case, the target shape corrector 304 may correct (modify) the data related to the target shape of the constructing target as the estimation result of the target shape estimator 302 on the basis of the data related to the trajectory of the work portion.
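Deriving a topographic shape from the recorded trajectory of the work portion, as described above, may be sketched as follows. This is an illustrative assumption about one possible realization: it takes the lowest bucket-toe position passed in each horizontal bin as an approximation of the finished surface there, and all names are hypothetical.

```python
# Illustrative mapping from bucket-toe (x, z) trajectory samples to a
# constructed-surface profile: for each horizontal bin, keep the lowest
# z the toe passed through, since the final pass defines the surface.

def surface_from_trajectory(trajectory, bin_width=0.5):
    """Return per-bin (x, lowest z) pairs from toe trajectory samples."""
    bins = {}
    for x, z in trajectory:
        key = round(x / bin_width)           # horizontal bin index
        if key not in bins or z < bins[key]:
            bins[key] = z                    # keep the lowest pass
    return [(key * bin_width, bins[key]) for key in sorted(bins)]

# Sampled toe positions (m) recorded during construction work.
toe_samples = [(0.1, 9.8), (0.2, 9.6), (0.6, 9.2), (0.7, 9.4), (1.1, 8.9)]
profile = surface_from_trajectory(toe_samples)
```

The resulting profile is relative to the shovel 100, matching the observation that the trajectory data represents the completed portion as seen from the shovel.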


In the above-described embodiment and modified examples thereof, a part or all of the functions of the supporting device 150 may be transferred to the remote operation supporting device 300.


In the above-described embodiment and modified examples thereof, a part or all of the functions of the supporting device 150 may be transferred to the information processor 200.


In the above-described embodiment and modified examples thereof, a part or all of the functional parts of the information processor 200 related to the target-shape estimation function may be transferred to the shovel 100.


[Effects]

Next, the operation of the supporting device according to the present embodiment will be described.


In the present embodiment, the supporting device includes an obtainer configured to obtain data on the shape of a part that has already been constructed in the constructing target, and an estimator configured to estimate the target shape of the constructing target based on the data obtained by the obtainer. The supporting device is, for example, the supporting device 150. The obtainer is, for example, the topographic shape obtainer 301. The estimator is, for example, the target shape estimator 302.


As a result, data related to the target shape of the constructing target can be obtained only by executing the construction of a part of the constructing target. Therefore, the supporting device can more easily obtain data related to the target shape of the constructing target.


Further, in the present embodiment, the estimator may estimate the target shape of the constructing target by using a trained model that has been subjected to machine learning based on training data, which is a set of data relating to the shape of a part that has already been constructed in the constructing target and data relating to the target shape of the constructing target. The trained model is, for example, the trained model LM.


Thus, the supporting device can obtain data on the target shape of the constructing target from the shape of the part that has already been constructed in the constructing target.


In the present embodiment, the estimator may estimate the target shape of the constructing target by duplicating or extending the shape of a part that has already been constructed in the constructing target, in a direction corresponding to the constructing target.
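The duplicate-or-extend estimation above may be sketched, under illustrative assumptions, as fitting a line to the already-constructed slope profile and extrapolating it in the construction direction. The function names and the least-squares choice are assumptions for illustration only.

```python
# Illustrative "extend" estimation: fit z = a*x + b to the constructed
# profile by least squares, then extrapolate to the remaining positions.

def fit_line(points):
    """Least-squares fit z = a*x + b to (x, z) profile points."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sz = sum(z for _, z in points)
    sxx = sum(x * x for x, _ in points)
    sxz = sum(x * z for x, z in points)
    a = (n * sxz - sx * sz) / (n * sxx - sx * sx)
    b = (sz - a * sx) / n
    return a, b

def extend_target_shape(constructed_profile, target_xs):
    """Extend the constructed slope profile to the remaining x positions."""
    a, b = fit_line(constructed_profile)
    return [(x, a * x + b) for x in target_xs]

# Already-constructed slope section: z drops 0.5 m per 1 m of x.
constructed = [(0.0, 10.0), (1.0, 9.5), (2.0, 9.0)]
target = extend_target_shape(constructed, [3.0, 4.0])
```

Duplication of a repeated shape (rather than extension) would instead copy the fitted profile at an offset along the direction corresponding to the constructing target.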


Thus, the supporting device can obtain data on the target shape of the constructing target from the shape of the part that has already been constructed in the constructing target.


In the present embodiment, the estimator may estimate the target shape of the constructing target based on data on the trajectory of the work portion when the work machine has performed the construction work of a part that has already been constructed in the constructing target. The work machine is, for example, the shovel 100, and the work portion is, for example, the toe or the back of the bucket 6 of the shovel 100.


As a result, the supporting device can obtain data related to the target shape of the constructing target from the shape of a part that has already been constructed in the constructing target.


In the present embodiment, the supporting device may include a display part for displaying the target shape of the constructing target as the estimation result by the estimator. The display part is, for example, the display 50A.


As a result, the supporting device can have the user visually confirm the target shape of the constructing target as the estimation result. Therefore, the user's convenience can be enhanced.


Further, in the present embodiment, the supporting device may include a corrector configured to correct the target shape of the constructing target as the estimation result of the estimator in response to an input from the user. The corrector is, for example, the target shape corrector 304.


Thus, when the target shape of the constructing target as the estimation result is inappropriate, the user can operate the supporting device to correct the target shape of the constructing target.


In the present embodiment, the supporting device may also include a corrector, and an updating part that updates the trained model by using, as the training data, a set of data including the data obtained by the obtainer and the data of the target shape of the constructing target as the correction result by the corrector. The updating part is, for example, the machine learning part 2002.


Thus, the correction result by the corrector can be reflected in the trained model. Therefore, accuracy in estimation of the target shape of the constructing target using the trained model can be enhanced.


In the present embodiment, the work machine may be provided with the above-described supporting device.


Thus, the work machine can more easily estimate and obtain the target shape of the constructing target by using the supporting device.


According to the above-described embodiments, data related to the target shape of the constructing target can be easily obtained.


Although the embodiments have been described above in detail, the present disclosure is not limited to such specific embodiments, and various modifications and changes are possible within the scope of the claims.

Claims
  • 1. A supporting device, comprising: a memory; anda processor connected to the memory and configured to execute: obtaining data related to a shape of a part that has already been constructed in a constructing target, andestimating a target shape of the constructing target based on the data obtained.
  • 2. The supporting device according to claim 1, wherein the processor is configured to estimate the target shape of the constructing target by using a trained model that is trained by machine learning based on training data, the training data being a set of: data on the shape of the part that has already been constructed in the constructing target; anddata on the target shape of the constructing target.
  • 3. The supporting device according to claim 1, wherein the processor is configured to estimate the target shape of the constructing target by duplicating or extending the shape of the part that has already been constructed in the constructing target, in a direction in accordance with the constructing target.
  • 4. The supporting device according to claim 1, wherein the processor is configured to estimate the target shape of the constructing target based on data on trajectory of a work portion, the data being obtained by a work machine that performed construction work of the part that has already been constructed in the constructing target.
  • 5. The supporting device according to claim 1, wherein the processor is configured to display the target shape of the constructing target as an estimation result.
  • 6. The supporting device according to claim 1, wherein the processor is configured to correct the target shape of the constructing target as an estimation result in response to an input from a user.
  • 7. The supporting device according to claim 2, wherein the processor is configured to correct a target shape of a constructing target as an estimation result in response to an input from a user, andupdate the trained model by using, as training data, a set of data being a set of the data obtained, anddata of the target shape of the constructing target as a correction result.
  • 8. A work machine, comprising: a memory; anda processor connected to the memory and configured to execute: obtaining data related to a shape of a part that has already been constructed in a constructing target, andestimating a target shape of the constructing target based on the data obtained.
  • 9. A non-transitory computer-readable recording medium storing a program causing a supporting device to execute: obtaining data related to a shape of a part that has already been constructed in a constructing target, andestimating a target shape of the constructing target based on the data obtained.
Priority Claims (1)
Number Date Country Kind
2022-058416 Mar 2022 JP national
RELATED APPLICATION

This application is a continuation application of International Application No. PCT/JP2023/013128, filed on Mar. 30, 2023, and designated the U.S., which is based upon and claims priority to Japanese Patent Application No. 2022-058416 filed on Mar. 31, 2022, the entire contents of each of which are incorporated herein by reference.

Continuations (1)
Number Date Country
Parent PCT/JP2023/013128 Mar 2023 WO
Child 18828239 US