The present disclosure relates to a system including a work machine, a computer implemented method, a method for producing a trained position estimation model, and training data.
For a hydraulic excavator, Japanese Patent Laying-Open No. 2017-71982 (PTL 1) discloses attaching a boom angle sensor to a boom pin, a dipper stick angle sensor to a dipper stick pin, and a bucket angle sensor to a bucket link, and using the sensed values to calculate the position of the tip of a tooth of the bucket.
PTL 1: Japanese Patent Laying-Open No. 2017-71982
The configuration described in the above document requires attaching an angle sensor to the pivot axis of each of the boom, the dipper stick, and the bucket in order to determine the posture of the work implement, which increases the number of components.
Herein is disclosed a system including a work machine, a computer implemented method, a method for producing a trained position estimation model, and training data to determine the position of a work implement.
In one aspect of the present disclosure, there is provided a system comprising: a work machine body; a work implement attached to the work machine body; an imaging device that captures an image of the work implement; and a computer. The computer has a trained position estimation model to determine a position of the work implement. The computer is programmed to obtain the image of the work implement captured by the imaging device and use the trained position estimation model to obtain a position of the work implement estimated from the captured image.
In one aspect of the present disclosure, there is provided a method implemented by a computer. The method comprises the following steps: A first step is to obtain an image including a work implement provided to a work machine body. A second step is to use a trained position estimation model for determining a position of the work implement to obtain a position of the work implement estimated from the obtained image.
In one aspect of the present disclosure, there is provided a method for producing a trained position estimation model. The method comprises the following steps: A first step is to obtain training data. The training data includes a captured image of a work implement attached to a work machine body, and a position of the work implement measured when the image is captured. A second step is to train the position estimation model by using the training data.
In one aspect of the present disclosure, there is provided training data for training a position estimation model used to determine a position of a work implement. The training data comprises: an image of the work implement captured by an imaging device; and a position of the work implement measured when the image is captured.
In one aspect of the present disclosure, there is provided a method for producing a trained position estimation model. The method comprises the following steps: A first step is to obtain a captured image of a work implement attached to a work machine body. A second step is to use a trained first position estimation model to obtain a position of the work implement estimated from the captured image. A third step is to train a second position estimation model by using training data including the captured image and the estimated position.
The present disclosure thus allows the position of a work implement to be determined accurately.
Hereinafter, an embodiment will be described with reference to the drawings. In the following description, identical components are identically denoted. Their names and functions are also identical. Accordingly, they will not be described repeatedly.
In the embodiment, a configuration of a hydraulic excavator, which is an example of a work machine to which the idea of the present disclosure is applicable, will be described first.
As shown in
Revolving unit 3 is disposed on traveling apparatus 5 and supported by traveling apparatus 5. Revolving unit 3 can revolve about an axis of revolution RX with respect to traveling apparatus 5. Revolving unit 3 has a cab 4. An occupant (or operator) of hydraulic excavator 100 gets in cab 4 and operates hydraulic excavator 100. Cab 4 is provided with an operator's seat 4S where the operator sits. The operator can operate hydraulic excavator 100 in cab 4. The operator in cab 4 can operate work implement 2, operate revolving unit 3 to revolve it with respect to traveling apparatus 5, and operate traveling apparatus 5 to cause hydraulic excavator 100 to travel.
Revolving unit 3 has an engine compartment 9 accommodating an engine, and a counterweight provided in a rear portion of revolving unit 3. In engine compartment 9 are disposed the engine, a hydraulic pump, and so forth (not shown).
Revolving unit 3 is provided with a handrail 29 frontwardly of engine compartment 9. Handrail 29 is provided with an antenna 21. Antenna 21 is for example an antenna for GNSS (Global Navigation Satellite Systems). Antenna 21 has a first antenna 21A and a second antenna 21B provided on revolving unit 3 and spaced from each other in a vehicular widthwise direction.
Work implement 2 is supported by revolving unit 3. Work implement 2 has a boom 6, a dipper stick 7, and a bucket 8. Boom 6 is pivotably coupled to revolving unit 3. Dipper stick 7 is pivotably coupled to boom 6. Bucket 8 is pivotably coupled to dipper stick 7. Bucket 8 has a plurality of teeth. Bucket 8 has a distal end portion, which will be referred to as a tooth tip 8a.
Boom 6 has a proximal end portion coupled to revolving unit 3 via a boom pin 13. Dipper stick 7 has a proximal end portion coupled to a distal end portion of boom 6 via a dipper stick pin 14. Bucket 8 is coupled to a distal end portion of dipper stick 7 via a bucket pin 15. Bucket 8 is an example of an attachment detachably attached to a tip of work implement 2. Depending on the type of work, the attachment is replaced with a breaker, grapple, a lifting magnet, or the like.
Hydraulic excavator 100 has a variety of components, and in the present embodiment, their positional relationship will be described with work implement 2 serving as a reference.
Boom 6 of work implement 2 pivots with respect to revolving unit 3 about boom pin 13 provided at the proximal end portion of boom 6. When a specific portion of boom 6 that pivots with respect to revolving unit 3, for example the distal end portion of boom 6, moves, its locus is an arc. A plane including the arc is specified as an operating plane P. When hydraulic excavator 100 is seen in a plan view, operating plane P is represented as a straight line. The direction in which this straight line extends is the fore/aft direction of main body 1 of hydraulic excavator 100, or of revolving unit 3, and is hereinafter also simply referred to as the fore/aft direction. The lateral direction (or vehicular widthwise direction) of main body 1 of hydraulic excavator 100, or the lateral direction of revolving unit 3, is orthogonal to the fore/aft direction in a plan view and is hereinafter also simply referred to as the lateral direction.
A side where work implement 2 protrudes from main body 1 of hydraulic excavator 100 in the fore/aft direction is the fore direction and a direction opposite to the fore direction is the aft direction. A right side and a left side of the lateral direction when one faces front are the right direction and the left direction, respectively.
The fore/aft direction refers to a fore/aft direction of an operator who sits at the operator's seat in cab 4. A direction in which the operator sitting at the operator's seat faces is defined as the fore direction and a direction behind the operator who sits at the operator's seat is defined as the aft direction. The lateral direction refers to a lateral direction of the operator who sits at the operator's seat. A right side and a left side when the operator sitting at the operator's seat faces front are defined as the right direction and the left direction, respectively.
Boom 6 is pivotable about boom pin 13. Dipper stick 7 is pivotable about dipper stick pin 14. Bucket 8 is pivotable about bucket pin 15. Dipper stick 7 and bucket 8 are each a movable member movable on the side of the distal end of boom 6. Boom pin 13, dipper stick pin 14, and bucket pin 15 extend in a direction orthogonal to operating plane P, i.e., in the lateral direction. Operating plane P is orthogonal to at least one (in the embodiment, all three) of axes that serve as centers about which boom 6, dipper stick 7, and bucket 8 pivot.
As has been set forth above, boom 6 pivots on operating plane P with respect to revolving unit 3. Similarly, dipper stick 7 pivots on operating plane P with respect to boom 6, and bucket 8 pivots on operating plane P with respect to dipper stick 7. Work implement 2 of the embodiment has its entirety operated on operating plane P. Tooth tip 8a of bucket 8 moves on operating plane P. Operating plane P is a vertical plane including a range in which work implement 2 is movable. Operating plane P intersects each of boom 6, dipper stick 7, and bucket 8. Operating plane P can be set at a center of boom 6, dipper stick 7, and bucket 8 in the lateral direction.
As shown in
Work implement 2 has a boom cylinder 10, a dipper stick cylinder 11, and a bucket cylinder 12. Boom cylinder 10 drives boom 6. Dipper stick cylinder 11 drives dipper stick 7. Bucket cylinder 12 drives bucket 8. Boom cylinder 10, dipper stick cylinder 11, and bucket cylinder 12 are each a hydraulic cylinder driven with hydraulic oil.
Work implement 2 has a bucket link. The bucket link has a first link member 16 and a second link member 17. First link member 16 and second link member 17 have their respective tips relatively rotatably coupled together via a bucket cylinder top pin 19. Bucket cylinder top pin 19 is coupled to a tip of bucket cylinder 12. Therefore, first link member 16 and second link member 17 are pinned to bucket cylinder 12.
First link member 16 has a proximal end rotatably coupled to dipper stick 7 via a first link pin 18 in a vicinity of bucket pin 15 located at the distal end portion of dipper stick 7. First link member 16 is pinned to dipper stick 7. Second link member 17 has a proximal end rotatably coupled via a second link pin 20 to a bracket located at a foot of bucket 8. Second link member 17 is pinned to bucket 8.
Hydraulic excavator 100 has an imaging device 50. Imaging device 50 in the embodiment is a monocular camera.
Imaging device 50 is attached to revolving unit 3. Imaging device 50 is attached to cab 4. Imaging device 50 is attached inside cab 4. Imaging device 50 is attached in a vicinity of an upper end of a left front pillar of cab 4. Imaging device 50 is disposed in an internal space of cab 4 in a vicinity of the left front pillar at a position away from work implement 2 in the lateral direction. Imaging device 50 is disposed apart from operating plane P of work implement 2 in the lateral direction. Imaging device 50 is disposed leftwardly of operating plane P.
As shown in
An angle formed in a side view by a straight line passing through boom pin 13 and dipper stick pin 14 and a straight line passing through dipper stick pin 14 and bucket pin 15 is defined as dipper stick angle θa. Dipper stick angle θa is an angle of dipper stick 7 with respect to boom 6.
An angle formed in a side view by a straight line passing through dipper stick pin 14 and bucket pin 15 and a straight line passing through bucket pin 15 and tooth tip 8a is defined as bucket angle θk. Bucket angle θk is an angle of bucket 8 with respect to dipper stick 7.
A posture of work implement 2 on operating plane P is determined by a combination of boom angle θb, dipper stick angle θa, and bucket angle θk. For example, a position, or XY coordinates, on operating plane P of first link pin 18 located at the distal end portion of dipper stick 7 is determined by a combination of boom angle θb and dipper stick angle θa. A position, or XY coordinates, on operating plane P of bucket cylinder top pin 19 displacing as bucket 8 operates is determined by a combination of boom angle θb, dipper stick angle θa, and bucket angle θk.
Imaging device 50 is attached at a position at which it views operating plane P of work implement 2 obliquely. Imaging device 50 captures an image of work implement 2 at an angle larger than 0° with respect to operating plane P. Work implement 2 and imaging device 50 are both attached to revolving unit 3, so the positional relationship of imaging device 50 with respect to operating plane P remains unchanged even when hydraulic excavator 100 travels or revolves. The position at which imaging device 50 is attached with respect to operating plane P is predetermined for each type of hydraulic excavator 100.
Imaging device 50 captures an image of work implement 2. Imaging device 50 images operating plane P of work implement 2. Imaging device 50 captures an image of work implement 2 moving on operating plane P. The image captured by imaging device 50 includes at least a portion of work implement 2.
Computer 102A may be designed exclusively for the system according to the embodiment, or may be a general-purpose PC (Personal Computer). Computer 102A has a processor 103, a storage device 104, a communication interface 105, and an I/O interface 106. Processor 103 is, for example, a CPU (Central Processing Unit).
Storage device 104 includes a medium that stores information such as programs and data in a manner readable by processor 103. Storage device 104 includes a system memory such as a RAM (Random Access Memory) or a ROM (Read Only Memory), and an auxiliary storage device. The auxiliary storage device may be a magnetic recording medium such as a hard disk, an optical recording medium such as a CD (Compact Disc) or a DVD (Digital Versatile Disc), or a semiconductor memory such as a flash memory. Storage device 104 may be incorporated in computer 102A. Storage device 104 may include an external recording medium 109 that is detachably connected to computer 102A. External recording medium 109 may be a CD-ROM.
Communication interface 105 is, for example, a wired LAN (Local Area Network) module or a wireless LAN module, and is an interface for performing communications via a communication network. I/O interface 106 is, for example, a USB (Universal Serial Bus) port, and is an interface for connecting to an external device.
Computer 102A is connected to input device 107 and output device 108 via I/O interface 106. Input device 107 is a device for a user to input to computer 102A. Input device 107 includes a pointing device such as a mouse or a trackball, for example. Input device 107 may include a device such as a keyboard used to input text. Output device 108 includes, for example, a display.
Image processing unit 61 receives from imaging device (a camera) 50 an image captured thereby. Image processing unit 61 subjects the received captured image to image processing.
Position estimation model 80 is an artificial intelligence model for determining a position of work implement 2 relative to main body 1. Position estimation model 80 is configured to determine a relative position of work implement 2 from a captured image. Computer 102A estimates the relative position of work implement 2 by using this artificial intelligence position estimation model. Work implement position estimation unit 65 uses position estimation model 80 to obtain a relative position of work implement 2 estimated from a captured image. More specifically, work implement position estimation unit 65 reads position estimation model 80 from storage device 104 and inputs a captured image to position estimation model 80 to obtain, as its output, an estimation result of boom angle θb, dipper stick angle θa, and bucket angle θk.
Position estimation model 80 includes a neural network. Position estimation model 80 includes, for example, a deep neural network such as a convolutional neural network (CNN).
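As an illustration only, a CNN of this kind, mapping a captured image to the three angles θb, θa, and θk, could be sketched as follows. This is a minimal sketch with hypothetical layer sizes and names, assuming a PyTorch implementation and a 224×224 RGB input; the disclosure does not specify the network's architecture.

```python
import torch
import torch.nn as nn


class PositionEstimationModel(nn.Module):
    """Minimal CNN sketch: captured image -> (boom, dipper stick, bucket) angles."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2),   # 224 -> 112
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2, padding=2),  # 112 -> 56
            nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),  # 56 -> 28
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # global average pooling -> 64 features
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64, 64),
            nn.ReLU(),
            nn.Linear(64, 3),  # outputs: theta_b, theta_a, theta_k
        )

    def forward(self, image: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(image))


# Example: one 224x224 RGB image -> three estimated angles, shape (1, 3).
model = PositionEstimationModel()
angles = model(torch.randn(1, 3, 224, 224))
```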
The model in the embodiment may be implemented in hardware, software executable on hardware, firmware, or a combination thereof. The model may include programs, algorithms, and data executed by processor 103. The model may have its functionalities implemented by a single module or distributed among multiple modules and implemented thereby. The model may be distributed among a plurality of computers.
Hydraulic excavator 100 before shipment further includes an encoder 161. Encoder 161 is a general term for a boom angle sensor attached to boom pin 13, a dipper stick angle sensor attached to the dipper stick pin, and a bucket angle sensor attached to the bucket link. Instead of encoder 161, a potentiometer may be attached to work implement 2 to measure an angle. A stroke sensor that senses the stroke of the hydraulic cylinder may be attached to convert an amount of movement of the hydraulic cylinder into an angle.
Processor 103 has an angle conversion unit 162, an error detection unit 66, and a position estimation model updating unit 67. Angle conversion unit 162 receives an electrical signal from encoder 161 and converts the electrical signal into boom angle θb, dipper stick angle θa, and bucket angle θk. Encoder 161 obtains an electrical signal at the time when imaging device 50 captures an image, and outputs the electrical signal to angle conversion unit 162. Angle conversion unit 162 thus obtains boom angle θb, dipper stick angle θa, and bucket angle θk measured when the image is captured, in association with the captured image.
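As a concrete illustration of this time-based association, the following is a minimal sketch. The function and variable names are hypothetical, not from the disclosure; it assumes each captured image and each encoder reading carries a timestamp, and simply picks the reading nearest the capture time.

```python
import bisect


def angles_at_capture(capture_time: float,
                      encoder_times: list[float],
                      encoder_angles: list[tuple[float, float, float]]):
    """Return (theta_b, theta_a, theta_k) measured closest to the capture time.

    Assumes encoder_times is non-empty and sorted in ascending order.
    """
    i = bisect.bisect_left(encoder_times, capture_time)
    # Pick the nearer of the two neighboring readings.
    candidates = [j for j in (i - 1, i) if 0 <= j < len(encoder_times)]
    j = min(candidates, key=lambda j: abs(encoder_times[j] - capture_time))
    return encoder_angles[j]
```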
Error detection unit 66 compares the result of the estimation of boom angle θb, dipper stick angle θa, and bucket angle θk by work implement position estimation unit 65 with the result of the measurement of boom angle θb, dipper stick angle θa, and bucket angle θk obtained from the detection by encoder 161 as converted by angle conversion unit 162. Error detection unit 66 calculates an error of the result of the estimation with respect to the true values of boom angle θb, dipper stick angle θa, and bucket angle θk.
Position estimation model updating unit 67 updates position estimation model 80 based on the errors of boom angle θb, dipper stick angle θa, and bucket angle θk calculated by error detection unit 66. In this way, position estimation model 80 is trained. An image of work implement 2 captured by imaging device 50, together with boom angle θb, dipper stick angle θa, and bucket angle θk calculated in angle conversion unit 162 when the image is captured, constitutes training data for position estimation model 80. Position estimation model 80 is trained in a factory before hydraulic excavator 100 is shipped.
As shown in
Subsequently, in step S102, angle measurement data is obtained. Computer 102A, more specifically angle conversion unit 162, obtains from encoder 161 measurement data of boom angle θb, dipper stick angle θa, and bucket angle θk detected by encoder 161. The measurement data is assigned to the captured image: an image captured at a specific time is associated with measurement data detected at that specific time. As shown in
The training data includes a plurality of captured images of work implement 2 in different postures, as shown in
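As an illustration of such training data, one image-plus-angles record and a dataset over such records could be sketched as follows. This is a minimal sketch with hypothetical names, assuming a PyTorch-style workflow; it is not a structure prescribed by the disclosure.

```python
from dataclasses import dataclass

import torch
from torch.utils.data import Dataset


@dataclass
class TrainingRecord:
    image: torch.Tensor   # captured image of work implement 2, shape (3, H, W)
    angles: torch.Tensor  # (theta_b, theta_a, theta_k) measured at capture time


class WorkImplementDataset(Dataset):
    """Training data: captured images paired with angles measured at capture."""

    def __init__(self, records: list[TrainingRecord]):
        self.records = records

    def __len__(self) -> int:
        return len(self.records)

    def __getitem__(self, i: int) -> tuple[torch.Tensor, torch.Tensor]:
        record = self.records[i]
        return record.image, record.angles
```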
Subsequently, in step S103, a relative position of work implement 2 is output. Computer 102A, more specifically, work implement position estimation unit 65, reads position estimation model 80 from storage device 104. Position estimation model 80 includes a neural network shown in
Immediately adjacent layers have their neurons connected together, and for each connection a weight (a connection weight) is set. The number of connections of neurons may be set as appropriate. A threshold value is set for each neuron, and an output value of each neuron is determined by whether a sum of products of a value input to each neuron and a weight exceeds the threshold value.
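In symbols, the behavior described above corresponds to a thresholded weighted sum: with values $x_i$ input to a neuron, connection weights $w_i$, and threshold value $\theta$, the neuron's output can be written as

$$y = \varphi\!\left(\sum_{i} w_i x_i - \theta\right),$$

where $\varphi$ is the activation function (a step function under the strict threshold reading above; in practice, smooth functions such as a sigmoid or ReLU are commonly assumed).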
Position estimation model 80 is trained to determine a relative position of work implement 2 from a captured image. Through training, a parameter is obtained for position estimation model 80, and the parameter is stored in storage device 104. The parameter includes, for example, the number of layers of the neural network, the number of neurons in each layer, a relation in which neurons are connected together, a weight applied to a connection between each neuron and another neuron, and a threshold value for each neuron.
Work implement position estimation unit 65 inputs an image captured by imaging device 50 to input layer 81. Output layer 83 outputs a position of work implement 2 relative to main body 1, more specifically, a value indicating boom angle θb, dipper stick angle θa and bucket angle θk. For example, computer 102A uses the captured image as an input to input layer 81 to perform a computation process for a forward propagation through the neural network of position estimation model 80. As a result, computer 102A obtains an estimated relative position of work implement 2 as a value output from output layer 83 of the neural network.
Step S103 need not be performed after step S102. Step S102 and step S103 may be performed at the same time, or step S103 may precede step S102.
Subsequently, in step S104, a difference is calculated between the estimated position of work implement 2 output in step S103 and the measurement data of the angles of work implement 2 obtained in step S102. Computer 102A, more specifically error detection unit 66, compares the relative position of work implement 2 estimated from the captured image and output from output layer 83 of position estimation model 80 with the measured relative position of work implement 2 obtained in angle conversion unit 162, and calculates an error of the estimated value with respect to the true value of the relative position of work implement 2.
Computer 102A trains position estimation model 80 using a captured image as input data and a relative position of work implement 2 measured when the image is captured as teacher data. From the calculated error of the output value, computer 102A calculates through backpropagation an error of a weight applied to a connection between each neuron and another neuron and an error of the threshold value of each neuron.
Subsequently, in step S105, position estimation model 80 is updated. Computer 102A, more specifically position estimation model updating unit 67, updates parameters of position estimation model 80, such as the weights applied to connections between neurons and each neuron's threshold value, based on the error of the estimated value with respect to the true value of the relative position of work implement 2 calculated in error detection unit 66, so that a value closer to the true value is output when the same captured image is input to input layer 81. The updated parameters of position estimation model 80 are stored to storage device 104.
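Taken together, steps S103 through S105 form an ordinary supervised training loop: forward propagation, error computation against the measured angles, backpropagation, and a parameter update. The following is a minimal sketch under the same assumptions as the earlier sketches (PyTorch, the hypothetical model and dataset defined above); the disclosure does not name a specific loss function, so mean squared error stands in for the error measure.

```python
import torch
from torch.utils.data import DataLoader

# model and dataset are the hypothetical PositionEstimationModel and
# WorkImplementDataset from the earlier sketches.
loader = DataLoader(dataset, batch_size=16, shuffle=True)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = torch.nn.MSELoss()
num_epochs = 10  # stand-in; the disclosure repeats until estimates match measurements

for epoch in range(num_epochs):
    for images, measured_angles in loader:
        estimated = model(images)                   # step S103: forward propagation
        loss = loss_fn(estimated, measured_angles)  # step S104: error vs. true values
        optimizer.zero_grad()
        loss.backward()                             # backpropagation of the error
        optimizer.step()                            # step S105: update weights/thresholds

torch.save(model.state_dict(), "position_estimation_model.pt")  # store trained parameters
```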
When the relative position of work implement 2 is estimated next time, the captured image is input to the updated position estimation model 80, and a result of an estimation of the relative position of work implement 2 is obtained. Computer 102A repeats step S101 to step S105 until the result of the estimation of the relative position of work implement 2 output by position estimation model 80 matches the measured relative position of work implement 2. Position estimation model 80 thus has its parameters optimized and is thus trained.
Once position estimation model 80 has been trained sufficiently to output a sufficiently accurate estimation result, computer 102A finishes training position estimation model 80. Position estimation model 80 is thus trained. Then, the process ends (end).
Initial values for a variety of parameters of position estimation model 80 may be provided by a template. Alternatively, the initial values for the parameters may be manually provided by human input. When re-training position estimation model 80, computer 102A may prepare initial values for the parameters based on values stored in storage device 104 as the parameters of position estimation model 80 to be re-trained.
Initially, in step S201, a captured image is obtained. Computer 102B, more specifically, image processing unit 61 obtains from imaging device (a camera) 50 an image 71 (see
Subsequently, in step S202, a relative position of work implement 2 is output. Computer 102B, more specifically work implement position estimation unit 65, reads position estimation model 80 and the optimal values of its trained parameters from storage device 104 to obtain the trained position estimation model 80. Work implement position estimation unit 65 uses image 71 captured by imaging device 50 as data input to position estimation model 80. Work implement position estimation unit 65 inputs the captured image 71 to each neuron included in input layer 81 of the trained position estimation model 80. The trained position estimation model 80 outputs from output layer 83 an estimated position of work implement 2 relative to main body 1, more specifically, an output angle value 77 indicating boom angle θb, dipper stick angle θa and bucket angle θk (see
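As an illustration, this inference step could be sketched as follows, under the same assumptions and hypothetical names as the earlier sketches; the file path stands in for the trained parameters kept in storage device 104, and a random tensor stands in for captured image 71.

```python
import torch

model = PositionEstimationModel()  # same architecture as at training time
model.load_state_dict(torch.load("position_estimation_model.pt"))  # trained parameters
model.eval()

with torch.no_grad():
    image = torch.randn(1, 3, 224, 224)          # stand-in for captured image 71
    theta_b, theta_a, theta_k = model(image)[0]  # estimated boom/dipper stick/bucket angles
```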
Finally, in step S203, computer 102B generates management data including the position of work implement 2 relative to main body 1. Computer 102B records the management data in storage device 104. Then, the process ends (end).
As described above, in the system according to the embodiment, computer 102B has position estimation model 80 trained for determining a position of work implement 2 relative to main body 1. As shown in
A posture of work implement 2 can thus be estimated using position estimation model 80, an artificial intelligence model suited to estimating a position of work implement 2 relative to main body 1. The posture of work implement 2 can therefore be determined easily and accurately by computer 102B using artificial intelligence.
As the posture of work implement 2 can be estimated from a captured image of the work implement, sensors for sensing boom angle θb, dipper stick angle θa, and bucket angle θk can be dispensed with. As the sensors are absent, their durability does not affect the operation of hydraulic excavator 100 either. This allows the current posture of work implement 2 to be determined, as in a conventional hydraulic excavator 100, with a simple, inexpensive, and highly reliable configuration.
As shown in
When hydraulic excavator 100 shipped from a factory is equipped with a sensor such as encoder 161 for sensing boom angle θb, dipper stick angle θa and bucket angle θk, it is also possible to additionally train position estimation model 80 after the shipment from the factory.
As shown in
As shown in
As shown in
Computer 102A obtains an image captured by imaging device 50 from each of hydraulic excavators 100A, 100B, 100C. Computer 102A also obtains from each of hydraulic excavators 100A, 100B, 100C, in association with the captured image, boom angle θb, dipper stick angle θa and bucket angle θk measured when the image is captured. Computer 102A uses the captured image and angles of work implement 2 obtained at the same time to train position estimation model 80 so that a relative position of work implement 2 estimated from a captured image can be obtained.
Computer 102A may obtain a captured image and measurement data of angles of work implement 2 from each of hydraulic excavators 100A, 100B, 100C via communication interface 105 (see
Computer 102A may be located at the same work site as hydraulic excavators 100A, 100B, 100C. Alternatively, computer 102A may be located in a remote place away from a work site, such as a management center for example. Hydraulic excavators 100A, 100B, 100C may be located at the same work site or at different work sites.
The trained position estimation model 80 is provided to each of hydraulic excavators 100A, 100B, 100C via communication interface 105, external recording medium 109, or the like. Each of hydraulic excavators 100A, 100B, 100C is thus provided with the trained position estimation model 80.
When position estimation model 80 is already stored in each of hydraulic excavators 100A, 100B, 100C, the stored position estimation model 80 is overwritten. Training data may be collected periodically and position estimation model 80 re-trained, as described above, so that the stored model is overwritten periodically. Whenever a parameter of position estimation model 80 is updated, the latest, updated value is stored to storage device 104.
Position estimation model 80 trained is also provided to hydraulic excavator 100D. Position estimation model 80 is provided to both hydraulic excavators 100A, 100B, 100C that provide training data and hydraulic excavator 100D that does not provide training data. Hydraulic excavator 100D may be located at the same work site as any of hydraulic excavators 100A, 100B, 100C, or may be located at a work site different than hydraulic excavators 100A, 100B, 100C. Hydraulic excavator 100D may be before shipment from a factory.
Position estimation model 80 described above is not limited to a model trained through machine learning using training data 61A, 61B, 61C, . . . , and may be a model generated using such a trained model. For example, position estimation model 80 may be another trained model (a distillation model) trained using data obtained by repeatedly inputting data to a trained model and collecting its outputs.
As shown in
Subsequently, in step S302, computer 102A uses a trained first position estimation model to obtain an estimated position of work implement 2 relative to main body 1. In step S303, computer 102A outputs the estimated relative position of work implement 2.
Computer 102A, more specifically, work implement position estimation unit 65 reads the trained first position estimation model from storage device 104. Work implement position estimation unit 65 inputs image 71 captured by imaging device 50 to input layer 81 of the trained first position estimation model. The trained first position estimation model outputs from output layer 83 a result of an estimation of a position of work implement 2 relative to main body 1, more specifically, output angle value 77 (see
Subsequently, in step S304, computer 102A stores the captured image obtained in step S301 and the result of the estimation of the relative position of work implement 2 output in step S303 in storage device 104 as training data.
Subsequently, in step S305, computer 102A uses the trained model to train a second position estimation model. Computer 102A inputs a captured image to an input layer of the second position estimation model. Computer 102A outputs from an output layer of the second position estimation model an output value indicating a result of an estimation of a position of work implement 2 relative to main body 1, more specifically, boom angle θb, dipper stick angle θa and bucket angle θk. A difference is calculated between the relative position of work implement 2 output from the second position estimation model and the relative position of work implement 2 output from the first position estimation model in step S303. Based on this difference, computer 102A updates the second position estimation model's parameters. The second position estimation model is thus trained.
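As an illustration, this distillation step could be sketched as follows, under the same assumptions and hypothetical names as the earlier sketches; the simpler second model shown here is an arbitrary stand-in, and `loader` is the data loader over captured images from the training sketch.

```python
import torch
import torch.nn as nn

# Trained first position estimation model (the teacher), from the earlier sketches.
teacher = PositionEstimationModel()
teacher.load_state_dict(torch.load("position_estimation_model.pt"))
teacher.eval()

# Hypothetical, simpler second position estimation model (the distillation model).
student = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=7, stride=4, padding=3), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(8, 3),  # boom, dipper stick, and bucket angles
)
optimizer = torch.optim.Adam(student.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

for images, _ in loader:              # captured images; measured angles are not needed
    with torch.no_grad():
        target = teacher(images)      # steps S302/S303: first model's estimate
    loss = loss_fn(student(images), target)  # difference between the two estimates
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()                  # update the second model's parameters

torch.save(student.state_dict(), "distilled_position_estimation_model.pt")  # step S306
```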
Finally, in step S306, the updated parameters of the second position estimation model are stored in storage device 104 as trained parameters. Then, the process ends (end).
Thus, a captured image of work implement 2 and a relative position of work implement 2 estimated through a first position estimation model can be used as training data to train a second position estimation model (or obtain a distillation model), and computer 102A can use the second position estimation model that is simpler than the first position estimation model to estimate a position of work implement 2 relative to main body 1. This can alleviate a load imposed on computer 102A for estimating the relative position of work implement 2. Computer 102A may train the second position estimation model by using training data generated by another computer.
In the above embodiment, position estimation model 80 includes a neural network. This is not exclusive, however, and position estimation model 80 may be a model, such as a support vector machine, capable of accurately estimating a position of work implement 2 relative to main body 1 from a captured image of work implement 2 through machine learning.
The work machine to which the idea of the present disclosure is applicable is not limited to a hydraulic excavator, and may be a work machine having a work implement, such as a bulldozer, a motor grader, or a wheel loader.
The presently disclosed embodiments are to be considered in every respect as illustrative and not restrictive. The scope of the present invention is indicated not by the above description but by the terms of the claims, and is intended to include any modifications within the meaning and scope equivalent to the terms of the claims.
Number | Date | Country | Kind
--- | --- | --- | ---
2018-111231 | Jun 2018 | JP | national

Filing Document | Filing Date | Country | Kind
--- | --- | --- | ---
PCT/JP2019/011560 | 3/19/2019 | WO | 00