The present disclosure relates to a working system, a computer implemented method, a method for producing a trained posture estimation model, and training data.
When a wheel loader performs an excavation work, the vehicle is moved forward to push a work implement into a mass of soil while the work implement is raised. The soil is thus scooped into the work implement.
Conventionally, in order to perform efficient excavation work, a technique for automatically controlling the operation of a work implement has been proposed (for example, see PTL 1).
PTL 1: Japanese Patent Laying-Open No. 2018-135649
The above document discloses a technique for automatically driving and controlling a boom based on the operator's accelerator operation and bucket operation during an excavation work. The technique still requires an accelerator operation by the operator's foot and a lever operation by the operator's right hand, and the operator needs to be skilled to perform these operations smoothly.
Determining a target posture that the work implement at work should assume and automatically controlling the work implement in accordance with the target posture allow further automation of work by the work machine.
Accordingly, the present disclosure relates to a working system, a computer implemented method, a method for producing a trained posture estimation model, and training data, for determining a target posture for a work implement at work to assume.
According to an aspect of the present disclosure, there is provided a working system, comprising: a body; a work implement attached to the body; and a computer. The computer has a trained posture estimation model to determine a target posture for the work implement to assume at work. The computer obtains a target value for an amount of a work performed by the work implement, a period of time elapsing since the work implement started to work, and mechanical data for operation of the body and the work implement, uses the trained posture estimation model to estimate a target posture from the target value, the elapsed period of time and the mechanical data, and outputs the estimated target posture.
According to an aspect of the present disclosure, a computer-implemented method is provided. The method comprises the following steps. A first step is to obtain a target value for an amount of a work performed by a work implement attached to a body, a period of time elapsing since the work implement started to work, and mechanical data for operation of the body and the work implement. A second step is to, using a trained posture estimation model for determining a target posture for the work implement to assume at work, estimate a target posture from the target value, the elapsed period of time and the mechanical data to obtain an estimated target posture.
According to an aspect of the present disclosure, a computer-implemented method is provided. A first step is to obtain a target value for an amount of a work performed by a work implement attached to a body, a period of time elapsing since the work implement started to work, and mechanical data for operation of the body and the work implement. A second step is to select one trained posture estimation model, based on the target value, from a plurality of trained posture estimation models for determining a target posture for the work implement to assume at work. A third step is to use the selected trained posture estimation model to estimate a target posture from the elapsed period of time and the mechanical data to obtain an estimated target posture.
According to an aspect of the present disclosure, a method for producing a trained posture estimation model is provided. The method comprises the following steps. A first step is to obtain training data including: an amount of a work performed by a work implement attached to a body; a period of time elapsing since the work implement started to work; mechanical data for operation of the body and the work implement; and posture data of the work implement at work. A second step is to train the posture estimation model using the training data.
According to an aspect of the present disclosure, there is provided training data used to train a posture estimation model used to determine a target posture for a work implement attached to a body to assume at work. The training data includes an amount of a work performed by the work implement, a period of time elapsing since the work implement started to work, mechanical data for operation of the body and the work implement at a point in time when the elapsed period of time is measured, and posture data indicating a posture assumed by the work implement at the point in time when the elapsed period of time is measured.
According to an aspect of the present disclosure, a method for producing a trained posture estimation model is provided. The method comprises the following steps. A first step is to obtain a target value for an amount of a work performed by a work implement attached to a body, a period of time elapsing since the work implement started to work, and mechanical data for operation of the body and the work implement. A second step is to use a trained first posture estimation model to estimate a target posture for the work implement to assume at work from the target value, the elapsed period of time and the mechanical data to thus obtain an estimated target posture. A third step is to train a second posture estimation model using training data including the elapsed period of time, the mechanical data and the target value as well as the estimated target posture.
According to the present disclosure, a target posture for a work implement at work can be obtained accurately.
Hereinafter, an embodiment will be described with reference to the drawings. In the following description, identical components are identically denoted. Their names and functions are also identical. Accordingly, they will not be described repeatedly in detail.
<General Configuration>
In an embodiment, as one example of a work machine, a wheel loader 1 will be described.
As shown in
Traveling apparatus 4 is for causing the vehicular body of wheel loader 1 to travel, and includes traveling wheels 4a and 4b. When traveling wheels 4a and 4b are rotationally driven, wheel loader 1 can travel by itself, and perform a desired work using work implement 3.
Vehicular body frame 2 includes a front frame 2a and a rear frame 2b. Front frame 2a and rear frame 2b are attached to be capable of mutually swinging rightward and leftward. A pair of steering cylinders 11 is attached across front frame 2a and rear frame 2b. Steering cylinder 11 is a hydraulic cylinder. Steering cylinder 11 is extended and retracted by hydraulic oil received from a steering pump 12 (see
In the present specification, a direction in which wheel loader 1 travels straight forward/backward is referred to as a forward/backward direction of wheel loader 1. In the forward/backward direction of wheel loader 1, a side on which work implement 3 is located with respect to vehicular body frame 2 is defined as a forward direction, and a side opposite to the forward direction is defined as a backward direction. A rightward/leftward direction of wheel loader 1 is a direction orthogonal to the forward/backward direction in a plan view. When looking in the forward direction, a right side and a left side in the rightward/leftward direction are a rightward direction and a leftward direction, respectively. An upward/downward direction of wheel loader 1 is a direction orthogonal to a plane defined by the forward/backward direction and the rightward/leftward direction. In the upward/downward direction, a side on which the ground is present is a downward side, and a side on which the sky is present is an upward side.
Work implement 3 and a pair of traveling wheels (front wheels) 4a are attached to front frame 2a. Work implement 3 is disposed in front of the vehicular body. Work implement 3 is driven by hydraulic oil received from a work implement pump 13 (see
Boom 14 has a proximal end portion rotatably attached to front frame 2a by a boom pin 9. Bucket 6 is rotatably attached to boom 14 by a bucket pin 17 located at the distal end of boom 14.
Front frame 2a and boom 14 are coupled by a pair of boom cylinders 16. Boom cylinder 16 is a hydraulic cylinder. Boom cylinder 16 has a proximal end attached to front frame 2a. Boom cylinder 16 has a distal end attached to boom 14. Boom 14 is moved up and down when boom cylinder 16 is extended and retracted by hydraulic oil received from work implement pump 13 (see
Work implement 3 further includes a bell crank 18, a bucket cylinder 19, and a link 15. Bell crank 18 is rotatably supported by boom 14 via a support pin 18a located substantially at the center of boom 14. Bucket cylinder 19 couples bell crank 18 and front frame 2a together. Link 15 is coupled to a coupling pin 18c provided at a distal end portion of bell crank 18. Link 15 couples bell crank 18 and bucket 6 together.
Bucket cylinder 19 is a hydraulic cylinder and work tool cylinder. Bucket cylinder 19 has a proximal end attached to front frame 2a. Bucket cylinder 19 has a distal end attached to a coupling pin 18b provided at a proximal end portion of bell crank 18. When bucket cylinder 19 is extended and retracted by hydraulic oil received from work implement pump 13 (see
Cab 5 and a pair of traveling wheels (rear wheels) 4b are attached to rear frame 2b. Cab 5 is disposed behind boom 14. Cab 5 is mounted on vehicular body frame 2. In cab 5, a seat on which an operator of wheel loader 1 sits, an operation device 8 described hereinafter, and the like are disposed.
<System Configuration>
Engine 21 is for example a diesel engine. As the driving source, engine 21 may be replaced with a motor driven by a power storage unit, or the engine and the motor may both be used. Engine 21 includes a fuel injection pump 24. Fuel injection pump 24 is provided with an electronic governor 25. Output of engine 21 is controlled by adjusting the amount of fuel injected into the cylinder. This adjustment is performed by control device 10 controlling electronic governor 25.
Engine speed is sensed by an engine speed sensor 91. Engine speed sensor 91 outputs a detection signal which is in turn input to control device 10.
Traveling apparatus 4 is an apparatus receiving a driving force from engine 21 to thereby cause wheel loader 1 to travel. Traveling apparatus 4 has a power transmission device 23, front and rear wheels 4a and 4b described above, and the like.
Power transmission device 23 is a device that transmits driving force from engine 21 to front and rear wheels 4a and 4b, and it is for example a transmission. In wheel loader 1, front wheel 4a attached to front frame 2a and rear wheel 4b attached to rear frame 2b both serve as driving wheels that receive the driving force to cause wheel loader 1 to travel. Power transmission device 23 changes the speed of rotation of an input shaft 27 and outputs the rotation to an output shaft 28.
Output shaft 28 is provided with an output rotation speed sensor 92. Output rotation speed sensor 92 senses rotation speed of output shaft 28. Output rotation speed sensor 92 outputs a detection signal which is in turn input to control device 10. Control device 10 calculates vehicular speed based on the detection signal of output rotation speed sensor 92.
Driving force output from power transmission device 23 is transmitted to wheels 4a and 4b via a shaft 32 and the like. In this way, a part of the driving force from engine 21 is transmitted to traveling apparatus 4, and wheel loader 1 travels.
A part of the driving force of engine 21 is transmitted to work implement pump 13 and steering pump 12 via a power extraction unit 33. Power extraction unit 33 is a device that distributes output of engine 21 to power transmission device 23 and a cylinder driving unit composed of work implement pump 13 and steering pump 12.
Work implement pump 13 and steering pump 12 are hydraulic pumps driven by a driving force output from engine 21. Work implement pump 13 pumps out hydraulic oil which is in turn supplied to boom cylinder 16 and bucket cylinder 19 via a work implement control valve 34. Steering pump 12 pumps out hydraulic oil which is in turn supplied to steering cylinder 11 via a steering control valve 35. Work implement 3 is driven by a part of the driving force output from engine 21.
A first hydraulic pressure detector 95 is attached to boom cylinder 16. First hydraulic pressure detector 95 detects pressure of hydraulic oil inside an oil chamber of boom cylinder 16. First hydraulic pressure detector 95 outputs a detection signal which is in turn input to control device 10.
A second hydraulic pressure detector 96 is attached to bucket cylinder 19. Second hydraulic pressure detector 96 detects pressure of hydraulic oil inside an oil chamber of bucket cylinder 19. Second hydraulic pressure detector 96 outputs a detection signal which is in turn input to control device 10.
A first angle detector 29 is, for example, a potentiometer attached to boom pin 9. First angle detector 29 detects a boom angle representing an angle by which boom 14 is lifted up with respect to the vehicular body. First angle detector 29 outputs a detection signal indicating the boom angle to control device 10.
Specifically, as shown in
First angle detector 29 may be a stroke sensor disposed on boom cylinder 16.
A second angle detector 48 is, for example, a potentiometer attached to support pin 18a. Second angle detector 48 detects a bell crank angle representing an angle of bell crank 18 with respect to boom 14. Second angle detector 48 outputs a detection signal representing the bell crank angle to control device 10.
Specifically, as shown in
Second angle detector 48 may detect an angle of bucket 6 (a bucket angle) with respect to boom 14. The bucket angle is an angle formed by: a straight line passing through the center of bucket pin 17 and a blade edge 6a of bucket 6; and boom reference line A. Second angle detector 48 may be a potentiometer or a proximity switch attached to bucket pin 17. Alternatively, second angle detector 48 may be a stroke sensor disposed on bucket cylinder 19.
Operation device 8 is operated by an operator. Operation device 8 includes a plurality of types of operating members that are operated by the operator to operate wheel loader 1. Specifically, operation device 8 includes an accelerator operating member 81a, a steering member 82a, a boom operating member 83a, a bucket operating member 84a, a gear-shifting member 85a, and an FR operating member 86a.
Accelerator operating member 81a is operated to set a target engine speed for engine 21. Accelerator operating member 81a is, for example, an accelerator pedal. When accelerator operating member 81a is operated in an increased amount (for an accelerator pedal, when it is depressed in an increased amount), the vehicular body is accelerated. When accelerator operating member 81a is operated in a decreased amount, the vehicular body is decelerated. An amount by which accelerator operating member 81a is operated will be referred to as an amount of operation of the accelerator. Accelerator operation detection unit 81b detects the amount of operation of the accelerator and outputs a detection signal to control device 10. Control device 10 controls the output from engine 21 based on the detection signal received from accelerator operation detection unit 81b.
Steering member 82a is operated to control in which direction the vehicle moves. Steering member 82a is, for example, a steering handle. Steering operation detection unit 82b detects a position of steering member 82a and outputs a detection signal to control device 10. Control device 10 controls steering control valve 35 based on the detection signal output from steering operation detection unit 82b. Steering cylinder 11 extends and retracts to change a direction in which the vehicle travels.
Boom operating member 83a is operated to operate boom 14. Boom operating member 83a is, for example, a control lever. Boom operation detection unit 83b detects a position of boom operating member 83a. Boom operation detection unit 83b outputs a detection signal to control device 10. Control device 10 controls work implement control valve 34 based on the detection signal received from boom operation detection unit 83b. Boom cylinder 16 extends and retracts to operate boom 14.
Bucket operating member 84a is operated to operate bucket 6. Bucket operating member 84a is, for example, a control lever. Bucket operation detection unit 84b detects a position of bucket operating member 84a. Bucket operation detection unit 84b outputs a detection signal to control device 10. Control device 10 controls work implement control valve 34 based on the detection signal received from bucket operation detection unit 84b. Bucket cylinder 19 extends and retracts to operate bucket 6.
Gear shifting member 85a is operated to set changing of speed by power transmission device 23. Gear shifting member 85a is, for example, a shift lever. Gear-shifting operation detection unit 85b detects a position of gear shifting member 85a. Gear-shifting operation detection unit 85b outputs a detection signal to control device 10. Control device 10 controls power transmission device 23 based on the detection signal received from gear-shifting operation detection unit 85b.
FR operating member 86a is operated to switch the vehicle between traveling forward and traveling backward. FR operating member 86a is switched to each of a forward position, a neutral position, and a reverse position. FR operation detection unit 86b detects a position of FR operating member 86a. FR operation detection unit 86b outputs a detection signal to control device 10. In response to the detection signal received from FR operation detection unit 86b, control device 10 controls power transmission device 23 to switch the vehicle between a forward traveling state, a reverse traveling state, and a neutral state.
Display unit 50 receives a command signal from control device 10 and displays various types of information. Various types of information displayed on display unit 50 may for example be information for a work performed by wheel loader 1, vehicular body information such as a remaining amount of fuel, coolant's temperature and hydraulic oil's temperature, an image of an environment of wheel loader 1, and the like. Display unit 50 may be a touch panel, and in that case, a signal generated when the operator touches a portion of display unit 50 is output from display unit 50 to control device 10.
Control device 10 is generally implemented by a CPU (Central Processing Unit) reading various programs. Control device 10 includes a memory 10M and a timer 10T. Memory 10M functions as a work memory and stores various programs for implementing a function of the wheel loader. Control device 10 reads the current time from timer 10T. Control device 10 for example calculates a period of time elapsing while wheel loader 1 is performing an excavation work since wheel loader 1 started to perform the excavation work.
Control device 10 sends an engine command signal to electronic governor 25 so that a target rotational speed corresponding to an amount of operation of accelerator operating member 81a is obtained. Based on an amount of fuel supplied to engine 21 that varies as controlled by electronic governor 25, control device 10 can calculate fuel consumption per unit running time of engine 21, fuel consumption per unit traveling distance of wheel loader 1, and fuel consumption per unit loaded weight in bucket 6.
Control device 10 calculates a vehicular speed of wheel loader 1 based on the detection signal of output rotation speed sensor 92. Control device 10 reads from memory 10M a map defining a relationship between wheel loader 1's vehicular speed and traction, and calculates traction based on the map.
Control device 10 receives a detection signal of engine speed from engine speed sensor 91. Control device 10 reads from memory 10M a map defining a relationship between engine speed and engine torque, and calculates engine torque based on the map.
Traction and engine torque may be calculated in a different manner than reference to a map. For example, traction and engine torque may be calculated by referring to a table, or calculation using a mathematical expression, or the like.
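As one illustrative sketch only (not part of the embodiment itself), the map-based lookup described above can be implemented as interpolation over sampled curves; the speed, traction, and torque values below are hypothetical placeholders.

```python
# Minimal sketch (hypothetical values): looking up traction and engine torque
# from maps stored as sampled curves, as control device 10 is described to do.
import numpy as np

# Hypothetical map: vehicular speed [km/h] -> traction [kN]
SPEED_POINTS = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
TRACTION_POINTS = np.array([180.0, 160.0, 130.0, 100.0, 70.0, 50.0])

# Hypothetical map: engine speed [rpm] -> engine torque [Nm]
RPM_POINTS = np.array([800, 1200, 1600, 2000, 2400])
TORQUE_POINTS = np.array([600.0, 900.0, 950.0, 880.0, 700.0])

def traction_from_speed(vehicle_speed_kmh: float) -> float:
    """Interpolate traction from the vehicular-speed/traction map."""
    return float(np.interp(vehicle_speed_kmh, SPEED_POINTS, TRACTION_POINTS))

def torque_from_engine_speed(engine_rpm: float) -> float:
    """Interpolate engine torque from the engine-speed/torque map."""
    return float(np.interp(engine_rpm, RPM_POINTS, TORQUE_POINTS))

print(traction_from_speed(5.0))        # 115.0 kN with these sample points
print(torque_from_engine_speed(1400))  # 925.0 Nm with these sample points
```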
Control device 10 automatically controls operation of boom 14 and bucket 6. This automatic control will more specifically be described hereinafter.
<Excavation Work>
Wheel loader 1 of the present embodiment performs excavation work for scooping a target to be excavated, such as soil and sand.
As shown in
Wheel loader 1 of the present embodiment performs an excavating operation to cause bucket 6 to scoop target to be excavated 100, and a loading operation to load a load (or target to be excavated 100) in bucket 6 onto a carrier such as a dump truck.
More specifically, wheel loader 1 performs a plurality of work steps, which will be described hereinafter, sequentially to excavate target to be excavated 100 and load target to be excavated 100 onto a carrier such as a dump truck.
A first step is to move forward toward target to be excavated 100 (hereinafter also referred to as the step of moving forward without any load). A second step is to move wheel loader 1 forward until blade edge 6a of bucket 6 bites into target to be excavated 100 (hereinafter also referred to as the excavating (plowing) step). A third step is to operate boom cylinder 16 to raise bucket 6 and also operate bucket cylinder 19 to tilt bucket 6 back (hereinafter also referred to as the excavating (scooping) step). A fourth step is to move wheel loader 1 backward after target to be excavated 100 is scooped into bucket 6 (hereinafter also referred to as the step of moving backward with a load).
A fifth step is to move wheel loader 1 forward to approach the dump truck while keeping bucket 6 raised or raising bucket 6 (hereinafter also referred to as the step of moving forward with a load). A sixth step is to dump bucket 6 at a predetermined position to load target to be excavated 100 onto the loading platform of the dump truck (hereinafter also referred to as the soil dumping step). A seventh step is to lower boom 14 while moving wheel loader 1 backward to return bucket 6 to an excavating position (hereinafter also referred to as the step of moving backward and lowering the boom). The above are typical work steps configuring one cycle of an excavating and loading process.
For example, whether wheel loader 1 is currently performing an excavating step (that is, whether work implement 3 is currently performing an excavation work) can be determined based on a combination of conditions concerning an operation by an operator to move wheel loader 1 forward and rearward, an operation by the operator for work implement 3, and the current hydraulic pressure of the cylinder of work implement 3.
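The following is a minimal sketch of such a combined determination; the signal names and the pressure threshold are hypothetical and would in practice be tuned to the machine.

```python
# Minimal sketch (hypothetical thresholds and signal names): deciding whether
# work implement 3 is currently excavating, by combining the forward/reverse
# operation, the work implement operation, and the boom cylinder pressure.
def is_excavating(fr_position: str,
                  boom_lever_operated: bool,
                  bucket_lever_operated: bool,
                  boom_cylinder_pressure_mpa: float,
                  pressure_threshold_mpa: float = 15.0) -> bool:
    moving_forward = (fr_position == "forward")
    implement_operated = boom_lever_operated or bucket_lever_operated
    loaded = boom_cylinder_pressure_mpa >= pressure_threshold_mpa
    return moving_forward and implement_operated and loaded

print(is_excavating("forward", True, False, 22.0))  # True
print(is_excavating("reverse", True, True, 22.0))   # False
```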
<Detailed Configuration of Computer 102A>
Computer 102A includes a processor 103, a storage device 104, a communication interface 105, and an I/O interface 106. Processor 103 is for example a CPU.
Storage device 104 includes a medium which stores information such as stored programs and data so as to be readable by processor 103. Storage device 104 includes a RAM (Random Access Memory), or a ROM (Read Only Memory) or a similar system memory, and an auxiliary storage device. The auxiliary storage device may for example be a magnetic recording medium such as a hard disk, an optical recording medium such as a CD (Compact Disc) or a DVD (Digital Versatile Disc), or a semiconductor memory such as a flash memory. Storage device 104 may be built into computer 102A. Storage device 104 may include an external recording medium 109 detachably connected to computer 102A. External recording medium 109 may be a CD-ROM.
Communication interface 105 is, for example, a wired LAN (Local Area Network) module, or a wireless LAN module, and is an interface for performing communications via a communication network. I/O interface 106 is, for example, a USB (Universal Serial Bus) port, and is an interface for connecting to an external device.
Computer 102A is connected to an input device 107 and an output device 108 via I/O interface 106. Input device 107 is a device used by a user for input to computer 102A. Input device 107 includes, for example, a mouse, or a trackball or a similar pointing device. Input device 107 may include a device such as a keyboard for inputting text. Output device 108 includes, for example, a display (display unit 50, see
Calculation unit 161 receives from first hydraulic pressure detector 95 a detection signal indicative of pressure of hydraulic oil internal to an oil chamber of boom cylinder 16 as detected. Calculation unit 161 receives from accelerator operation detection unit 81b a detection signal indicative of the amount of operation of the accelerator as detected. Calculation unit 161 receives from output rotation speed sensor 92 a detection signal indicative of rotational speed of the output shaft 28 as detected. Calculation unit 161 calculates vehicular speed of wheel loader 1 based on the detection signal of output rotation speed sensor 92. Calculation unit 161 receives from engine speed sensor 91 a detection signal indicative of engine speed as detected.
Calculation unit 161 calculates an amount of fuel supplied to engine 21, based on the amount of operation of the accelerator. Calculation unit 161 calculates an amount of target to be excavated 100 such as soil loaded into bucket 6, based on the hydraulic pressure in the oil chamber of boom cylinder 16. Furthermore, calculation unit 161 calculates an amount of target to be excavated 100 loaded per amount of fuel supplied (i.e., a fuel consumption rate).
Hereinafter will be described an example of a method for calculating an amount of target to be excavated 100 scooped into bucket 6 in an excavation work (hereinafter referred to as an amount of soil excavated). An excavation work corresponds to a work performed by work implement 3 according to an embodiment. An amount of soil excavated corresponds to an amount of work performed by work implement 3 according to an embodiment.
When boom angle θ1 and boom pressure Pτ at a certain time point are obtained, the amount of soil excavated at that time point can be calculated. For example, assuming that boom angle θ1=θk and boom pressure Pτ=Pτk at a certain time point mk as shown in
As shown in
When Pτk is located between PτA and PτC as shown in
The method of calculating the amount of soil excavated in bucket 6 is not limited to the examples shown in
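As one possible reading of the calculation described above, the load in bucket 6 can be interpolated between reference pressure curves measured at known loads for each boom angle; the reference values and rated load below are hypothetical, and the sketch only illustrates the interpolation idea.

```python
# Minimal sketch (hypothetical reference curves): estimating the amount of soil
# in bucket 6 from boom angle and boom cylinder pressure by interpolating
# between two reference pressure curves measured at known loads.
import numpy as np

BOOM_ANGLES_DEG = np.array([-10.0, 0.0, 10.0, 20.0, 30.0])

# Reference boom pressures [MPa] at each boom angle (hypothetical values):
P_EMPTY = np.array([8.0, 9.0, 10.0, 11.5, 13.0])    # bucket empty (load 0 t)
P_RATED = np.array([20.0, 22.0, 24.0, 26.5, 29.0])  # bucket at rated load
RATED_LOAD_T = 6.0

def soil_amount_t(boom_angle_deg: float, boom_pressure_mpa: float) -> float:
    """Linearly interpolate the load between the empty and rated reference curves."""
    p_empty = np.interp(boom_angle_deg, BOOM_ANGLES_DEG, P_EMPTY)
    p_rated = np.interp(boom_angle_deg, BOOM_ANGLES_DEG, P_RATED)
    ratio = (boom_pressure_mpa - p_empty) / (p_rated - p_empty)
    return float(np.clip(ratio, 0.0, 1.0) * RATED_LOAD_T)

print(soil_amount_t(boom_angle_deg=10.0, boom_pressure_mpa=17.0))  # ~3.0 t
```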
Referring again to
Boom cylinder 16's hydraulic pressure, an amount of operation of the accelerator, vehicular speed, engine speed, a fuel consumption rate, traction, and engine torque are included in mechanical data for operation of the body of the work machine (or the vehicular body) and work implement 3. The mechanical data includes data for traveling of the work vehicular body, such as an amount of operation of the accelerator, vehicular speed, engine speed, traction, and engine torque.
Processor 103 has a timer 162. Calculation unit 161 reads the current time from timer 162, and calculates a period of time elapsing while wheel loader 1 is performing an excavation work since wheel loader 1 started to perform the excavation work.
That the excavation work has started, that is, that wheel loader 1 has transitioned in the work process from the step of moving forward without any load to the excavating (plowing) step, is determined by detecting that the hydraulic pressure in the oil chamber of boom cylinder 16 increases when blade edge 6a of bucket 6 plows into target to be excavated 100 and the load of target to be excavated 100 starts to act on bucket 6, and by confirming through boom angle θ1 and bell crank angle θ2 that work implement 3 is in a posture to start the excavation work. The point in time when the work starts may instead be determined based on a load received by boom cylinder 16 in the work, or based on data of an image of an environment surrounding wheel loader 1, as captured by an imaging device.
That the excavation work has ended, that is, that wheel loader 1 has transitioned in the work process from the excavating (scooping) step to the step of moving backward with a load, is determined by detecting that the direction in which wheel loader 1 travels has changed from forward to backward and that bucket 6, having been tilted back to scoop target to be excavated 100, is now in a neutral operation.
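A minimal sketch of measuring the elapsed period of time from these start and end determinations might look as follows; the class and signal names are hypothetical.

```python
# Minimal sketch (hypothetical signal names): tracking the period of time
# elapsing since an excavation work started, using the start/end transitions
# described above.
import time

class ExcavationTimer:
    def __init__(self):
        self.start_time = None

    def update(self, excavation_started: bool, excavation_ended: bool, now: float) -> float:
        """Return elapsed excavation time in seconds (0.0 when not excavating)."""
        if excavation_started and self.start_time is None:
            self.start_time = now
        if excavation_ended:
            self.start_time = None
        return 0.0 if self.start_time is None else now - self.start_time

timer = ExcavationTimer()
t0 = time.monotonic()
print(timer.update(excavation_started=True, excavation_ended=False, now=t0))        # 0.0
print(timer.update(excavation_started=False, excavation_ended=False, now=t0 + 2.5)) # 2.5
```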
Processor 103 includes an angle detection unit 163. Angle detection unit 163 receives from first angle detector 29 a detection signal indicative of boom angle θ1 as detected. Angle detection unit 163 receives from second angle detector 48 a detection signal indicative of bell crank angle θ2 as detected.
Boom angle θ1 and bell crank angle θ2 detected at a point in time during an excavation work are associated with a period of time elapsing at the point in time since the excavation work was started, mechanical data obtained at the point in time, and an amount of soil excavated by the excavation work, and are stored to storage device 104 as training data. Storage device 104 stores a training data set 188 for training a target posture estimation model 180. Training data set 188 includes a plurality of training data each labelling a posture assumed by work implement 3 (or boom angle θ1 and bell crank angle θ2) at a point in time during an excavation work with respect to a period of time elapsing since the excavation work was started and mechanical data at the point in time as well as an amount of soil excavated in the excavation work.
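A single record of training data set 188 can be pictured as follows; the field names and the example values are hypothetical.

```python
# Minimal sketch (hypothetical field names): one sample of training data set
# 188, labelling the posture (boom angle, bell crank angle) observed at a point
# in time with the elapsed time, mechanical data, and amount of soil excavated.
from dataclasses import dataclass
from typing import List

@dataclass
class TrainingSample:
    soil_amount_t: float          # amount of soil excavated in the work
    elapsed_time_s: float         # time since the excavation work started
    mechanical_data: List[float]  # e.g. boom pressure, accelerator, speed, rpm, ...
    boom_angle_deg: float         # label: boom angle theta1 at that point in time
    bell_crank_angle_deg: float   # label: bell crank angle theta2 at that point in time

training_data_set: List[TrainingSample] = [
    TrainingSample(5.8, 1.2, [18.5, 0.7, 4.2, 1850.0], 5.0, 95.0),
    TrainingSample(5.8, 2.4, [23.0, 0.9, 2.1, 2050.0], 12.0, 110.0),
]
```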
Processor 103 includes a target posture estimation unit 165. Storage device 104 has target posture estimation model 180 stored therein.
Target posture estimation model 180 is an artificial intelligence model for determining a target posture for work implement 3 to assume during an excavation work. Target posture estimation model 180 is configured to determine a target posture for work implement 3 to assume during an excavation work from a target amount of soil to be excavated in the excavation work, as well as a period of time elapsing since the excavation work was started and mechanical data. Computer 102A uses target posture estimation model 180 of artificial intelligence to estimate a target posture for work implement 3 to assume during an excavation work. Target posture estimation unit 165 uses target posture estimation model 180 to estimate a target posture for work implement 3 from a target amount of soil to be excavated, an elapsed period of time and mechanical data to obtain an estimated target posture.
More specifically, target posture estimation unit 165 reads target posture estimation model 180 from storage device 104 and inputs a target amount of soil to be excavated in an excavation work, as well as a period of time elapsing since the excavation work was started and mechanical data, as calculated by calculation unit 161, to target posture estimation model 180 to obtain an output of a result of an estimation of boom angle θ1 and bell crank angle θ2 to be a target.
Target posture estimation model 180 includes a neural network. Target posture estimation model 180 includes, for example, a deep neural network such as a convolutional neural network (CNN).
The model in the embodiment may be implemented in hardware, software executable on hardware, firmware, or a combination thereof. The model may include programs, algorithms, and data executed by processor 103. The model may have functionality performed by a single module or across multiple modules in a distributed manner. The model may be distributed across a plurality of computers.
Processor 103 includes an error calculation unit 166 and a target posture estimation model update unit 167.
Error calculation unit 166 selects training data corresponding to the elapsed period of time and mechanical data calculated by calculation unit 161. Error calculation unit 166 compares a result of an estimation of boom angle θ1 and bell crank angle θ2 by target posture estimation unit 165 with boom angle θ1 and bell crank angle θ2 included in the selected training data. Error calculation unit 166 calculates an error of the result of the estimation of boom angle θ1 and bell crank angle θ2 by target posture estimation unit 165 with respect to the values of boom angle θ1 and bell crank angle θ2 included in the training data.
Target posture estimation model update unit 167 updates target posture estimation model 180 based on the error of boom angle θ1 and bell crank angle θ2 calculated by error calculation unit 166. In this way, target posture estimation model 180 is trained. Target posture estimation model 180 is trained in a factory before shipment of wheel loader 1.
<Method for Producing Target Posture Estimation Model 180 Trained>
As shown in
In step S102, for the excavation work for which an amount of soil excavated is calculated in step S101, a period of time elapsing at a point in time during the excavation work since the excavation work was started and mechanical data are obtained. In step S103, the work implement's posture data at the point in time are obtained.
Calculation unit 161 calculates a period of time elapsing at a point in time during an excavation work since the excavation work was started. Further, calculation unit 161 calculates mechanical data at the point in time, based on results of detection done by various sensors including first hydraulic pressure detector 95, accelerator operation detection unit 81b, output rotation speed sensor 92, and engine speed sensor 91. Angle detection unit 163 detects boom angle θ1 and bell crank angle θ2 at the point in time, based on results of detection done by first angle detector 29 and second angle detector 48.
As shown in
The training data may further include data manually input by an operator, such as the angle of inclination of target to be excavated 100 and the type of soil of the target, as well as data of an image of an environment surrounding wheel loader 1, as captured by an imaging device.
Subsequently, in step S104, a target posture for work implement 3 is output. Computer 102A, more specifically, target posture estimation unit 165 reads target posture estimation model 180 from storage device 104. Target posture estimation model 180 includes the neural network shown in
Adjacent layers have their respective units connected to each other, and a weight is set for each connection. A bias is set for each unit. A threshold value is set for each unit. An output value of each unit is determined depending on whether a total sum of a product of a value input to each unit and the weight plus the bias exceeds the threshold value.
Target posture estimation model 180 is trained to determine a target posture for work implement 3 at work to assume from an amount of soil excavated, a period of time elapsing since an excavation work was started, and mechanical data. A parameter obtained for target posture estimation model 180 through training is stored to storage device 104. The parameter for target posture estimation model 180 for example includes the number of layers of the neural network, the number of units in each layer, a relationship between units in connectivity, a weight applied to a connection between each unit and another unit, a bias associated with each unit, and a threshold value for each unit.
Target posture estimation unit 165 inputs an amount of soil excavated, an elapsed period of time, and mechanical data calculated by calculation unit 161 to input layer 181. Output layer 183 outputs a target posture for work implement 3, more specifically, an output value indicating boom angle θ1 and bell crank angle θ2. For example, computer 102A uses an amount of soil excavated, an elapsed period of time, and mechanical data as an input to input layer 181 to compute forward propagation of the neural network of target posture estimation model 180. Thus, computer 102A obtains an estimated target posture for work implement 3 as an output value output from the neural network at output layer 183.
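As an illustration only, the forward propagation described above can be sketched for a small fully connected network; the layer sizes and weights are hypothetical, and the actual target posture estimation model 180 may be deeper or convolutional.

```python
# Minimal sketch (hypothetical layer sizes and weights): forward propagation of
# a small feed-forward network taking [soil amount, elapsed time, mechanical
# data ...] at the input layer and producing [boom angle, bell crank angle] at
# the output layer.
import numpy as np

rng = np.random.default_rng(0)

# Input vector: [soil amount (t), elapsed time (s), mechanical data ...]
x = np.array([6.0, 2.4, 23.0, 0.9, 2.1, 2050.0])

# Hypothetical "trained" parameters: weights and biases for two layers
W1, b1 = rng.normal(size=(16, x.size)) * 0.1, np.zeros(16)
W2, b2 = rng.normal(size=(2, 16)) * 0.1, np.zeros(2)

def forward(x: np.ndarray) -> np.ndarray:
    h = np.maximum(W1 @ x + b1, 0.0)   # weighted sum plus bias, ReLU activation
    return W2 @ h + b2                 # output: [boom angle, bell crank angle]

boom_angle, bell_crank_angle = forward(x)
print(boom_angle, bell_crank_angle)
```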
Step S104 does not necessarily have to be performed after step S103. Steps S103 and S104 may be performed simultaneously, or step S103 may be performed after step S104.
Subsequently, in step S105, a difference is calculated between the target posture for work implement 3 output in step S104 and the work implement's posture data obtained in step S103. Computer 102A, more specifically, error calculation unit 166 compares the estimated target posture of work implement 3 output from target posture estimation model 180 at output layer 183 with a posture of work implement 3 included in corresponding training data, and calculates an error of the estimated target posture with respect to the work implement's posture data.
Computer 102A trains target posture estimation model 180 using an amount of soil excavated in an excavation work, a period of time elapsing at a point in time during the excavation work since the excavation work was started, and mechanical data obtained at that point in time as input data, and posture data indicating a posture assumed by work implement 3 at that point in time (i.e., boom angle θ1 and bell crank angle θ2) as teacher data. From an error of an output value as calculated, computer 102A calculates through back propagation an error of a weight applied to a connection between each unit and another unit, an error of each unit's bias, and an error of the threshold value for each unit.
Subsequently, in step S106, target posture estimation model 180 is updated. Computer 102A, more specifically, target posture estimation model update unit 167 updates parameters of target posture estimation model 180, such as a weight applied to a connection between each unit and another unit, each unit's bias, and the threshold value for each unit, based on the error, calculated by error calculation unit 166, of the estimated target posture with respect to the posture of work implement 3 obtained by angle detection unit 163. Thus, when the same elapsed period of time and mechanical data are input to input layer 181 again, an output value closer to the posture data indicating the posture of work implement 3 can be output. The updated parameters of target posture estimation model 180 are stored to storage device 104.
The next time a target posture is estimated for work implement 3, an amount of soil excavated, an elapsed period of time, and mechanical data are input to the updated target posture estimation model 180 to obtain an output of an estimated target posture for work implement 3. Computer 102A repeats step S101 to step S105 until target posture estimation model 180 outputs an estimated target posture for work implement 3 that matches the posture data indicating the posture that work implement 3 assumes at the point in time when the elapsed period of time and the mechanical data are obtained. In this way, target posture estimation model 180 has its parameters optimized and is thus trained.
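The training loop of steps S101 to S106 can be sketched as follows, here using PyTorch as one possible implementation; the network size, data, and hyperparameters are hypothetical.

```python
# Minimal sketch (hypothetical data and sizes) of the training loop in steps
# S101-S106: estimate a target posture, compare it with the observed posture,
# and update the model parameters from the error by back propagation.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(6, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Hypothetical batch: inputs [soil amount, elapsed time, mechanical data ...]
inputs = torch.tensor([[5.8, 1.2, 18.5, 0.7, 4.2, 1.85],
                       [5.8, 2.4, 23.0, 0.9, 2.1, 2.05]])
# Teacher data: observed [boom angle, bell crank angle] at those points in time
targets = torch.tensor([[5.0, 95.0],
                        [12.0, 110.0]])

for epoch in range(200):
    estimated = model(inputs)            # step S104: estimated target posture
    loss = loss_fn(estimated, targets)   # step S105: error against posture data
    optimizer.zero_grad()
    loss.backward()                      # back propagation of the error
    optimizer.step()                     # step S106: update model parameters

print(loss.item())
```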
Once target posture estimation model 180 has sufficiently been trained, and as a result comes to obtain a sufficiently accurately estimated target posture, computer 102A ends training target posture estimation model 180. Target posture estimation model 180 trained is thus produced. Then, the process ends (END in
Initial values for various parameters of target posture estimation model 180 may be provided by a template. Alternatively, the initial values of the parameters may be manually given by human input. When retraining target posture estimation model 180, computer 102A may prepare initial values for parameters, based on values stored in storage device 104 as parameters of target posture estimation model 180 to be retrained.
<Estimating Target Posture for Work Implement 3 Using Target Posture Estimation Model 180 Trained>
Processor 103 includes calculation unit 161, timer 162, and target posture estimation unit 165, as well as shown in
Initially, in step S201, a target amount of soil to be excavated in an excavation work is obtained. An operator of wheel loader 1 may operate input device 107 (
Alternatively, based on a history of amounts of soil excavated in previous excavation works, computer 102B may calculate an amount of soil to be excavated into bucket 6 for the current excavation work. For example, when wheel loader 1 is engaged in a work to load a dump truck capable of carrying a load of 30 tons and a total of 24 tons of target to be excavated 100 has been loaded on the dump truck by the previous two loading operations, the amount of soil to be scooped into bucket 6 for the current excavation work may be calculated to be (30−24)=6 tons. A weight of target to be excavated 100 having been loaded on the dump truck may be determined as a cumulative value of amounts of soil excavated, as calculated with reference to
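A minimal sketch of this calculation, with a hypothetical bucket capacity used as an upper bound, is:

```python
# Minimal sketch of computing the target amount of soil for the current
# excavation from the remaining capacity of the dump truck, as in the
# 30 t - 24 t = 6 t example above (the bucket capacity value is hypothetical).
def target_soil_amount_t(truck_capacity_t: float,
                         already_loaded_t: float,
                         bucket_capacity_t: float = 8.0) -> float:
    remaining = max(truck_capacity_t - already_loaded_t, 0.0)
    return min(remaining, bucket_capacity_t)

print(target_soil_amount_t(30.0, 24.0))  # 6.0 t
```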
Subsequently, in step S202, an elapsed period of time and mechanical data are obtained. Computer 102B, more specifically, calculation unit 161 calculates a period of time elapsing at a point in time during an excavation work since the excavation work was started. Calculation unit 161 calculates mechanical data for the point in time based on results of detection done by various sensors including first hydraulic pressure detector 95, accelerator operation detection unit 81b, output rotation speed sensor 92, and engine speed sensor 91. Input data 191 shown in
Subsequently, in step S203, a target posture is estimated for work implement 3. Computer 102B, more specifically, target posture estimation unit 165 reads target posture estimation model 180 and an optimal value of a trained parameter from storage device 104 to obtain target posture estimation model 180 trained. Target posture estimation unit 165 uses the target amount of soil to be excavated, the elapsed period of time, and the mechanical data as data 191 input to target posture estimation model 180. Target posture estimation unit 165 inputs the target amount of soil to be excavated, the elapsed period of time, and the mechanical data to each unit included in input layer 181 of target posture estimation model 180 trained. Target posture estimation model 180 trained outputs at output layer 183 an estimated target posture which is an estimation of a target posture for work implement 3 to assume during an excavation work, more specifically, an angular output value 197 including boom angle θ1 and bell crank angle θ2 (see
Subsequently, in step S204, computer 102B operates work implement 3 based on the estimated target posture.
Target posture estimation unit 165 outputs to boom control unit 168 boom angle θ1 targeted. Boom control unit 168 outputs a control signal to boom cylinder 16 based on boom angle θ1 targeted. In response to the control signal, boom cylinder 16 extends or retracts to perform automatic control to operate boom 14 so that an actual value of boom angle θ1 approaches a target value.
Bell crank angle θ2 targeted is output from target posture estimation unit 165 to bucket control unit 169. Bucket control unit 169 outputs a control signal to bucket cylinder 19 based on bell crank angle θ2 targeted. In response to the control signal, bucket cylinder 19 extends or retracts to perform automatic control to operate bucket 6 so that an actual value of bell crank angle θ2 approaches a target value.
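As an illustrative stand-in for the automatic control performed by boom control unit 168 and bucket control unit 169, a simple proportional command toward the target angles might look as follows; the gain and limit are hypothetical, and the actual valve and cylinder control is not described here.

```python
# Minimal sketch (hypothetical gain and limit): driving the actual boom angle
# and bell crank angle toward the estimated target posture with a simple
# proportional law standing in for the actual cylinder control.
def cylinder_command(actual_angle_deg: float, target_angle_deg: float,
                     gain: float = 0.05, limit: float = 1.0) -> float:
    """Return a normalized extend(+)/retract(-) command in [-1, 1]."""
    command = gain * (target_angle_deg - actual_angle_deg)
    return max(-limit, min(limit, command))

boom_cmd = cylinder_command(actual_angle_deg=5.0, target_angle_deg=12.0)
bucket_cmd = cylinder_command(actual_angle_deg=95.0, target_angle_deg=110.0)
print(boom_cmd, bucket_cmd)  # 0.35, 0.75
```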
Finally, in step S205, computer 102B generates management data including a posture of work implement 3. Computer 102B stores the management data to storage device 104. Then, the process ends (END of
Thus, in the working system according to the embodiment, computer 102B includes target posture estimation model 180 that has been trained for determining a target posture for work implement 3 to assume during an excavation work. As shown in
Target posture estimation model 180 of artificial intelligence suitable for estimating a target posture for work implement 3 can thus be used to estimate a target posture for work implement 3 to assume during an excavation work. Computer 102B can thus easily and accurately determine a target posture for work implement 3 using artificial intelligence.
Based on a target amount of soil to be excavated, a target posture for work implement 3 is determined, and an appropriate posture for work implement 3 that corresponds to a specific amount of soil excavated is estimated. By controlling work implement 3 to assume the estimated target posture, a target amount of soil to be excavated as specified can be automatically excavated with precision. As an amount of soil automatically excavated can thus be adjusted, a weight loaded on a dump truck can be adjusted, and target to be excavated 100 can be efficiently loaded on the dump truck for every excavating and loading process to a maximum load that the dump truck can carry. Increased productivity can thus be achieved.
As shown in
Wheel loader 1 after shipment from the factory may include first angle detector 29, second angle detector 48, and angle detection unit 163. In this case, target posture estimation model 180 may be additionally trained after shipment from the factory.
As shown in
<A First Modified Example for Training Target Posture Estimation Model 180>
In the
A target posture estimation model 180A corresponding to amount of soil excavated PL1 is created by using training data group 188Ga to perform the same process as steps S102 to S106 in the
Thereafter, in step S203, a period of time elapsing since an excavation work was started and mechanical data are input to the selected target posture estimation model to estimate a target posture for work implement 3.
Thus creating a plurality of target posture estimation models that are trained to correspond to specific amounts of soil excavated and using the different target posture estimation models for different amounts of soil excavated allow a simple process to be employed to estimate a target posture for work implement 3.
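A minimal sketch of this selection, assuming that the model whose associated amount of soil excavated is closest to the target value is chosen (the amounts and the stand-in models are hypothetical), is:

```python
# Minimal sketch (hypothetical amounts and stand-in models): selecting, from a
# plurality of trained models each corresponding to a specific amount of soil
# excavated, the one closest to the target amount, and then estimating the
# target posture from the elapsed time and mechanical data only.
from typing import Callable, Dict, Sequence, Tuple

# Hypothetical: one trained model per amount of soil excavated (PL1, PL2, ...)
models: Dict[float, Callable[[Sequence[float]], Tuple[float, float]]] = {
    4.0: lambda features: (8.0, 100.0),   # stand-in for model 180A
    6.0: lambda features: (12.0, 110.0),  # stand-in for model 180B
    8.0: lambda features: (15.0, 118.0),  # stand-in for model 180C
}

def estimate_target_posture(target_soil_t: float,
                            elapsed_time_s: float,
                            mechanical_data: Sequence[float]) -> Tuple[float, float]:
    key = min(models, key=lambda amount: abs(amount - target_soil_t))
    return models[key]([elapsed_time_s, *mechanical_data])

print(estimate_target_posture(5.5, 2.4, [23.0, 0.9, 2.1, 2050.0]))  # uses the 6 t model
```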
<A Second Modified Example for Training Target Posture Estimation Model 180>
A first wheel loader 1 (a wheel loader 1A), a second wheel loader 1 (a wheel loader 1B), a third wheel loader 1 (a wheel loader 1C), and a fourth wheel loader 1 (a wheel loader 1D) shown in
Computer 102A obtains, from each of wheel loaders 1A, 1B and 1C, a period of time elapsing at a point in time during an excavation work since the excavation work was started, and mechanical data for that point in time. Computer 102A also obtains the work implement's posture data (boom angle θ1 and bell crank angle θ2) at that point in time from each of wheel loaders 1A, 1B and 1C in association with the elapsed period of time and the mechanical data. Computer 102A extracts amounts of soil excavated, elapsed periods of time, mechanical data, and the work implement's posture data of a plurality of excavation works performed by wheel loaders 1A, 1B and 1C, and collects them as training data. Using these training data, computer 102A trains target posture estimation model 180 to be able to estimate a target posture for work implement 3 from a target amount of soil to be excavated, an elapsed period of time and mechanical data to thus obtain an estimated target posture.
Computer 102A may obtain an elapsed period of time, mechanical data and the work implement's posture data from each of wheel loaders 1A, 1B, 1C via communication interface 105 (see
Computer 102A may be located at the same work site as wheel loaders 1A, 1B, 1C. Alternatively, computer 102A may be located in a remote place away from a work site, such as a management center for example. Wheel loaders 1A, 1B, 1C may be located at the same work site or at different work sites.
Target posture estimation model 180 trained is provided to each wheel loader 1A, 1B, 1C via communication interface 105, external recording medium 109, or the like. Each wheel loader 1A, 1B, 1C is thus provided with target posture estimation model 180 trained.
When target posture estimation model 180 is already stored in each wheel loader 1A, 1B, 1C, target posture estimation model 180 stored is overwritten. Target posture estimation model 180 may be overwritten periodically by periodically collecting training data and training target posture estimation model 180, as described above. Whenever target posture estimation model 180 has a parameter updated, the latest, updated value is stored to storage device 104.
Target posture estimation model 180 trained is also provided to wheel loader 1D. Target posture estimation model 180 is provided to both wheel loaders 1A, 1B, 1C that provide training data and wheel loader 1D that does not provide training data. Wheel loader 1D may be located at the same work site as any of wheel loaders 1A, 1B, 1C, or may be located at a work site different than wheel loaders 1A, 1B, 1C. Wheel loader 1D may be before shipment from a factory.
<Method for Producing Distillation Model>
Target posture estimation model 180 described above is not limited to a model trained through machine learning using training data 188A, 188B, 188C, . . . , and may be a model generated using the trained model. For example, target posture estimation model 180 may be another trained model (a distillation model) trained based on a result obtained by repeatedly inputting/outputting data to/from a trained model.
As shown in
Subsequently, in step S303, computer 102A uses a trained first target posture estimation model to obtain an estimated target posture which is an estimation of a target posture for work implement 3 to assume during the excavation work. In step S304, computer 102A outputs the estimated target posture for work implement 3.
Computer 102A, more specifically, target posture estimation unit 165 reads the trained first target posture estimation model from storage device 104. Target posture estimation unit 165 inputs the amount of soil excavated, elapsed period of time and mechanical data calculated by calculation unit 161 to input layer 181 of the trained first target posture estimation model. The trained first target posture estimation model outputs from output layer 183 a target posture for work implement 3 to assume during the excavation work, more specifically, an estimated target posture indicating boom angle θ1 and bell crank angle θ2.
Subsequently, in step S305, computer 102A stores the amount of soil excavated, elapsed period of time and mechanical data obtained in steps S301 and S302 and the target posture output in step S304 for work implement 3 to storage device 104 as training data.
Subsequently, in step S306, computer 102A uses the training data to train a second target posture estimation model. Computer 102A inputs an amount of soil excavated, an elapsed period of time and mechanical data to the second target posture estimation model at an input layer. Computer 102A outputs from an output layer of the second target posture estimation model a target posture for work implement 3 to assume during an excavation work, more specifically, an output value indicating a result of estimating boom angle θ1 and bell crank angle θ2. A difference is calculated between the estimated target posture of work implement 3 output from the second target posture estimation model, and the estimated target posture of work implement 3 output from the first target posture estimation model, as output in step S304. Based on this difference, computer 102A updates a parameter of the second target posture estimation model. The second target posture estimation model is thus trained.
Finally, in step S307, the second posture estimation model has the updated parameter stored to storage device 104 as a trained parameter. Then, the process ends (END in
Thus, an amount of soil excavated, an elapsed period of time, mechanical data, and a target posture estimated for work implement 3 by a first target posture estimation model can be used as training data to train a second target posture estimation model (or a distillation model). Computer 102A can then use the second target posture estimation model, which is simpler than the first target posture estimation model, to estimate a target posture for work implement 3 to assume during an excavation work. This can alleviate the load imposed on computer 102A for estimating a target posture for work implement 3. Computer 102A may also train the second target posture estimation model by using training data generated by another computer.
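The distillation procedure of steps S301 to S307 can be sketched as follows, using PyTorch as one possible implementation; the teacher and student network sizes and the data are hypothetical stand-ins.

```python
# Minimal sketch (hypothetical data and sizes) of the distillation procedure:
# the trained first model's estimated target postures serve as the teacher data
# for a smaller second model.
import torch
import torch.nn as nn

teacher = nn.Sequential(nn.Linear(6, 64), nn.ReLU(), nn.Linear(64, 64),
                        nn.ReLU(), nn.Linear(64, 2))   # first model (stand-in)
student = nn.Sequential(nn.Linear(6, 8), nn.ReLU(), nn.Linear(8, 2))  # simpler second model

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

inputs = torch.rand(256, 6)  # hypothetical [soil amount, elapsed time, mechanical data ...]

for step in range(500):
    with torch.no_grad():
        teacher_posture = teacher(inputs)   # steps S303-S304: first model's estimate
    student_posture = student(inputs)       # step S306: second model's estimate
    loss = loss_fn(student_posture, teacher_posture)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()                        # update the second model's parameters

print(loss.item())
```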
In the above embodiment, target posture estimation model 180 includes a neural network. This is not exclusive, however, and target posture estimation model 180 may be another model, such as a support vector machine or a decision tree, capable of accurately estimating, through machine learning, a target posture for work implement 3 at work to assume from a period of time elapsing since a work was started and mechanical data.
The work machine to which the idea of the present disclosure is applicable is not limited to a wheel loader, and may be a work machine having a work implement, such as a hydraulic excavator, a crawler dozer, and the like. For a hydraulic excavator, the target posture estimation model may receive mechanical data including the boom cylinder's hydraulic pressure, the dipper stick cylinder's hydraulic pressure, engine torque, engine speed, a hydraulic pump's capacity, and the like. For a hydraulic excavator, the target posture estimation model may output an estimated target posture for the work implement including an angle of the boom with respect to the vehicular body, an angle of the dipper stick with respect to the boom, and an angle of the bucket with respect to the dipper stick.
The presently disclosed embodiments are to be considered as illustrative in every respect and not restrictive. The scope of the present invention is not indicated by the above description but by the scope of the claims, and is intended to include meanings equivalent to the scope of the claims and any modifications within the scope.
1, 1A, 1B, 1C, 1D wheel loader, 2 vehicular body frame, 2a front frame, 3 work implement, 4 traveling apparatus, 5 cab, 6 bucket, 6a blade edge, 8 operation device, 9 boom pin, 10 control device, 11 steering cylinder, 14 boom, 16 boom cylinder, 17 bucket pin, 18 bell crank, 18a support pin, 18b, 18c coupling pin, 19 bucket cylinder, 21 engine, 29 first angle detector, 48 second angle detector, 81a accelerator operating member, 81b accelerator operation detection unit, 82a steering member, 82b steering operation detection unit, 83a boom operating member, 83b boom operation detection unit, 84a bucket operating member, 84b bucket operation detection unit, 85a gear shifting member, 85b gear-shifting operation detection unit, 86a FR operating member, 86b FR operation detection unit, 91 engine speed sensor, 92 output rotation speed sensor, 95 first hydraulic pressure detector, 96 second hydraulic pressure detector, 100 target to be excavated, 102A, 102B computer, 103 processor, 104 storage device, 105 communication interface, 106 I/O interface, 107 input device, 108 output device, 109 external recording medium, 161 calculation unit, 162 timer, 163 angle detection unit, 165 target posture estimation unit, 166 error calculation unit, 167 target posture estimation model update unit, 168 boom control unit, 169 bucket control unit, 180 target posture estimation model, 181 input layer, 182 intermediate layer, 183 output layer, 188 training data set, 188A, 188B, 188C training data, 191 input data, 197 angular output value, A boom reference line, B bucket reference line, H horizontal line, L bucket's locus.
Priority application: Japanese Patent Application No. 2020-044103, filed March 2020 (JP, national).
International application: PCT/JP2021/008391, filed 3/4/2021 (WO).