This application is a U.S. National stage application of International Application No. PCT/JP2021/000335, filed on Jan. 7, 2021. This U.S. National stage application claims priority under 35 U.S.C. § 119(a) to Japanese Patent Application No. 2020-015440, filed in Japan on Jan. 31, 2020, the entire contents of which are hereby incorporated herein by reference.
The present invention relates to a system and a control method for preventing an erroneous operation of a work machine, and an excavator.
Generally, a work machine is provided with an operating member, such as a lever, for an operator to operate the work machine. For example, the operator operates the operating member by holding it with his or her hand. However, when the operator performs an action other than operating the work machine, the operator's body or clothes may accidentally touch the operating member. In that case, the work machine performs an operation that is contrary to the intention of the operator.
In order to prevent an erroneous operation as described above, for example, Japanese Unexamined Patent Publication No. 2010-250459 discloses an erroneous operation prevention device. In this erroneous operation prevention device, a tactile sensor is mounted on the entire surface of the grip of an operating lever. When the pressure detected by the tactile sensor continues for a predetermined time, a controller determines that the operating lever is being held and releases a hydraulic locking mechanism.
The way of operating the operating member (for example, the way of holding or touching it) varies from operator to operator. Therefore, in the above-mentioned erroneous operation prevention device, the tactile sensor may not accurately detect the holding by the operator. Also, in the above-mentioned erroneous operation prevention device, the controller determines whether the pressure detected by the tactile sensor has continued for a predetermined time. Therefore, it takes time to release the hydraulic locking mechanism. As a result, the operability of the work machine during normal operation deteriorates.
An object of the present disclosure is to detect an erroneous operation of a work machine with high accuracy.
A system according to one aspect of the present disclosure is a system for preventing an erroneous operation of a work machine. The system includes an operating member, a camera, and a controller. The operating member is operable by an operator. The camera captures an image of a region including at least a portion of the operating member and generates image data indicative of the image. The controller acquires the image data from the camera. The controller determines whether an operation of the operating member by the operator is an intentional operation or an unintentional operation based on the image. When the operation of the operating member by the operator is the intentional operation, the controller controls the work machine according to the operation of the operating member. When the operation of the operating member by the operator is the unintentional operation, the controller invalidates the operation of the operating member.
A method according to another aspect of the present disclosure is a control method for preventing an erroneous operation of a work machine. The control method includes the following processes. A first process is to acquire image data indicative of an image of a region including at least a portion of an operating member. A second process is to determine whether an operation of the operating member by an operator is an intentional operation or an unintentional operation based on the image. A third process is to control the work machine according to the operation of the operating member when the operation of the operating member by the operator is the intentional operation. A fourth process is to invalidate the operation of the operating member when the operation of the operating member by the operator is the unintentional operation.
An excavator according to another aspect of the present disclosure includes a traveling body, a rotating body, a work implement, a cab, an operating member, a camera, and a controller. The rotating body is rotatably attached to the traveling body. The work implement is attached to the rotating body. The cab is disposed on the rotating body. The operating member is disposed in the cab. The operating member is operable by an operator in order to operate at least one of the traveling body, the rotating body, or the work implement. The camera captures an image of a region including at least a portion of the operating member. The camera generates image data indicative of the image. The controller acquires the image data from the camera. The controller determines whether an operation of the operating member by the operator is an intentional operation or an unintentional operation based on the image. When the operation of the operating member by the operator is the intentional operation, the controller controls at least one operation of the traveling body, the rotating body, or the work implement according to the operation of the operating member. When the operation of the operating member by the operator is the unintentional operation, the controller invalidates the operation of the operating member.
According to the present disclosure, it is determined whether the operation of the operating member by the operator is the intentional operation or the unintentional operation based on the image of the region including at least a portion of the operating member. Therefore, it is possible to detect an erroneous operation with high accuracy regardless of the way in which each operator operates the operating member. Further, it is possible to quickly determine whether the operation of the operating member by the operator is a normal operation. Therefore, the deterioration of the operability of the work machine during normal operation can be reduced.
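By way of illustration only, and not as part of the disclosed embodiments, the control flow summarized above could be sketched in Python as follows. The camera, operating_member, machine, and classify_operation interfaces are hypothetical placeholders and are not defined in the present disclosure.

```python
# Illustrative sketch of the four processes of the control method described
# above; all interfaces here are hypothetical placeholders.

def control_cycle(camera, operating_member, machine, classify_operation):
    # Acquire image data of a region including at least a portion of the
    # operating member.
    image = camera.capture()

    # Determine whether the operation is intentional or unintentional
    # based on the image.
    if classify_operation(image) == "intentional":
        # Control the work machine according to the operation of the
        # operating member.
        machine.execute(operating_member.read_command())
    else:
        # Invalidate the operation of the operating member.
        machine.invalidate_operation()
```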
Hereinafter, a control system of a work machine 1 according to an embodiment will be described with reference to the drawings.
As illustrated in the drawings, the work machine 1 according to the present embodiment is a hydraulic excavator. The work machine 1 includes a work implement 3, a rotating body 4, and a traveling body 5. The rotating body 4 is rotatably attached to the traveling body 5. A cab 6 is disposed on the rotating body 4. A seat 37 for an operator is disposed in the cab 6.
The work implement 3 includes a boom 11, an arm 12, and a bucket 13. The boom 11 is attached to the rotating body 4 so as to be movable up and down. The arm 12 is movably attached to the boom 11. The bucket 13 is movably attached to the arm 12. The work implement 3 includes a boom cylinder 14, an arm cylinder 15, and a bucket cylinder 16. The boom cylinder 14, the arm cylinder 15, and the bucket cylinder 16 are hydraulic cylinders and driven by hydraulic fluid supplied from a hydraulic pump 22 described later. The boom cylinder 14 actuates the boom 11. The arm cylinder 15 actuates the arm 12. The bucket cylinder 16 actuates the bucket 13.
The work machine 1 includes an engine 21, a hydraulic pump 22, a power transmission device 23, and a controller 24. The work machine 1 includes a rotation motor 25. The rotation motor 25 is a hydraulic motor and is driven by hydraulic fluid from the hydraulic pump 22. The rotation motor 25 causes the rotating body 4 to rotate. Although one hydraulic pump 22 is illustrated in the drawings, a plurality of hydraulic pumps may be provided.
The hydraulic pump 22 is a variable displacement pump. A pump control device 26 is connected to the hydraulic pump 22. The pump control device 26 controls the tilt angle of the hydraulic pump 22. The pump control device 26 includes, for example, an electromagnetic valve and is controlled by command signals from the controller 24. The controller 24 controls the pump control device 26, thereby controlling the displacement of the hydraulic pump 22.
The work machine 1 includes a control valve 27. The hydraulic pump 22, the cylinders 14 to 16, and the rotation motor 25 are connected to each other by means of a hydraulic circuit via the control valve 27. The control valve 27 is controlled by command signals from the controller 24. The control valve 27 controls the flow rate of the hydraulic fluid supplied from the hydraulic pump 22 to the cylinders 14 to 16 and the rotation motor 25. The controller 24 controls the control valve 27, thereby controlling the operation of the work implement 3. The controller 24 controls the control valve 27, thereby controlling the rotation of the rotating body 4.
The power transmission device 23 transmits driving force of the engine 21 to the traveling body 5. The crawler belts 6a and 6b are driven by the driving force from the power transmission device 23 to cause the work machine 1 to travel. The power transmission device 23 may be, for example, a torque converter or a transmission having a plurality of transmission gears. Alternatively, the power transmission device 23 may be another type of transmission, such as a hydrostatic transmission (HST) or a hydraulic mechanical transmission (HMT).
The controller 24 is programmed to control the work machine 1 based on acquired data. The controller 24 controls the engine 21, the traveling body 5, and the power transmission device 23, thereby causing the work machine 1 to travel. The controller 24 controls the hydraulic pump 22 and the control valve 27, thereby causing the work implement 3 to operate. The controller 24 controls the hydraulic pump 22 and the control valve 27, thereby causing the rotating body 4 to rotate.
The controller 24 includes a processor 31, such as a CPU. The processor 31 executes processes for controlling the work machine 1. The controller 24 includes a storage device 32. The storage device 32 includes a memory, such as a RAM or a ROM, and an auxiliary storage device, such as a hard disk drive (HDD) or a solid state drive (SSD). The storage device 32 stores data and programs for controlling the work machine 1.
The work machine 1 includes a first operating member 33, a second operating member 34, a third operating member 35, and a fourth operating member 36.
The first operating member 33 is a lever. The first operating member 33 is tiltable in the front-back and left-right directions from a neutral position. The first operating member 33 outputs a signal indicative of the operating direction and operating amount of the first operating member 33. The controller 24 receives the signal from the first operating member 33. The controller 24 causes the work implement 3 to operate according to the operation of the first operating member 33 by the operator. Alternatively, the controller 24 causes the rotating body 4 to rotate according to the operation of the first operating member 33 by the operator.
The second operating member 34 is a lever. The second operating member 34 is tiltable in the front-back and left-right directions from a neutral position. The second operating member 34 outputs a signal indicative of the operating direction and operating amount of the second operating member 34. The controller 24 receives the signal from the second operating member 34. The controller 24 causes the work implement 3 to operate according to the operation of the second operating member 34 by the operator.
The third operating member 35 is disposed in front of the seat 37. The third operating member 35 is a lever. The third operating member 35 is tiltable in the front-back direction. The third operating member 35 outputs a signal indicative of the operating direction and operating amount of the third operating member 35. The controller 24 receives the signal from the third operating member 35. The controller 24 causes the work machine 1 to travel according to the operation of the third operating member 35 by the operator.
The fourth operating member 36 is a pedal. The fourth operating member 36 is coupled to the third operating member 35. The fourth operating member 36 operates integrally with the third operating member 35. The controller 24 causes the work machine 1 to travel according to the operation of the third operating member 35 or the fourth operating member 36 by the operator.
The work machine 1 includes a locking member 38. The locking member 38 is disposed in the cab 6. The locking member 38 is disposed at a side of the seat 37. The locking member 38 is movable between a locked position and a released position. When the locking member 38 is in the locked position, the controller 24 invalidates the operation of the first operating member 33 and the second operating member 34. That is, when the locking member 38 is in the locked position, the controller 24 prohibits the operation of the work implement 3 regardless of the operation of the first operating member 33 and the second operating member 34. When the locking member 38 is in the locked position, the controller 24 prohibits the rotation of the rotating body 4 regardless of the operation of the first operating member 33.
For example, in a case where the control valve 27 is an electric pilot type, the controller 24 does not output a command signal to the control valve 27 regardless of the operation of the first operating member 33 and the second operating member 34 when the locking member 38 is in the locked position. Alternatively, in a case where the control valve 27 is a hydraulic pilot type, the controller 24 stops supplying the pilot pressure to the control valve 27 when the locking member 38 is in the locked position.
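As a non-limiting sketch of the lock handling described above, the two pilot types could be handled as follows. The control_valve and lever_signal interfaces are assumptions made for illustration, since the actual command interfaces are not specified here.

```python
# Illustrative sketch: lock handling for an electric pilot type and a
# hydraulic pilot type control valve. All interfaces are hypothetical.

def apply_lever_command(control_valve, lever_signal, locked, pilot_type):
    if locked:
        if pilot_type == "electric":
            # Electric pilot type: output no command signal to the control
            # valve regardless of the lever operation.
            return
        # Hydraulic pilot type: stop supplying pilot pressure to the
        # control valve.
        control_valve.stop_pilot_pressure()
        return
    # Released position: command the control valve according to the
    # operating direction and operating amount of the lever.
    control_valve.command(lever_signal.direction, lever_signal.amount)
```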
When the locking member 38 is in the released position, the controller 24 controls the work implement 3 or the rotating body 4 according to the operation of the first operating member 33 and the second operating member 34. That is, when the locking member 38 is in the released position, the controller 24 causes the work implement 3 to operate according to the operation of the first operating member 33 and the second operating member 34. When the locking member 38 is in the released position, the controller 24 causes the rotating body 4 to rotate according to the operation of the first operating member 33.
The work machine 1 includes a camera 39. The camera 39 captures an image of a region including the first operating member 33, the second operating member 34, and the seat 37 in the cab 6. The number of cameras 39 is not limited to one and a plurality of cameras may be disposed in the cab 6. The camera 39 generates image data indicative of the captured image. The camera 39 communicates with the controller 24 by wire or wirelessly. The controller 24 receives the image data from the camera 39. The image indicated by the image data may be a still image or a moving image.
The controller 24 detects an erroneous operation of the first operating member 33 and the second operating member 34 by the operator based on the image. Hereinafter, processes for detecting an erroneous operation executed by the controller 24 will be described. In the following description, a case where the first operating member 33 is operated will be described. However, the same processes may be executed in a case where the second operating member 34 is operated.
In step S102, the controller 24 acquires the image data. The controller 24 acquires the image data indicative of an image including the first operating member 33 from the camera 39.
In step S103, the controller 24 determines whether the operation of the first operating member 33 is performed. The controller 24 determines whether the operation of the first operating member 33 is performed based on a signal from the first operating member 33. When the operation of the first operating member 33 is not performed, the lock is maintained in step S106. When the operation of the first operating member 33 is performed, the process proceeds to step S104.
In step S104, the controller 24 determines whether the operation of the first operating member 33 by the operator is an intentional operation or an unintentional operation. The controller 24 performs this determination based on the image indicated by the image data.
The controller 24 determines whether the operation shown in the image is an intentional operation or an unintentional operation by using image recognition technology that uses artificial intelligence (AI). As illustrated in the drawings, the controller 24 includes a trained image recognition model 41 and performs the determination by inputting the image data to the image recognition model 41.
The image recognition model 41 performs image analysis using deep learning. The image recognition model 41 includes a neural network. The neural network includes an input layer 121, one or more intermediate layers, and an output layer 123.
The image data D11 is input to the input layer 121. The output data D12 indicative of a classification of the operation detected in the image is output from the output layer 123. The classification includes an intentional operation and an unintentional operation. The image recognition model 41 is trained to output the output data D12 indicative of the classification of the operation detected in the image when the image data D11 is input. Trained parameters of the image recognition model 41 acquired by the training are stored in the controller 24. The trained parameters include, for example, the number of layers of the neural network, the number of neurons in each layer, the coupling relationships among the neurons, the weights of the couplings among the neurons, and the threshold of each neuron.
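For illustration, a model of this kind could be sketched in Python with PyTorch as follows. The layer structure, layer sizes, and image resolution are assumptions made for the sketch and are not the trained parameters of the image recognition model 41.

```python
import torch
import torch.nn as nn

class OperationClassifier(nn.Module):
    """Illustrative two-class image classifier: intentional vs. unintentional."""

    def __init__(self):
        super().__init__()
        # Feature extraction over the cab image (sizes are illustrative).
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((8, 8)),
        )
        # Output layer: two classes, intentional (0) and unintentional (1).
        self.classifier = nn.Linear(32 * 8 * 8, 2)

    def forward(self, image_d11: torch.Tensor) -> torch.Tensor:
        x = self.features(image_d11)
        return self.classifier(torch.flatten(x, start_dim=1))

# Usage: classify a single RGB image (batch of 1, 3 channels, 224 x 224 pixels).
model = OperationClassifier()
logits = model(torch.zeros(1, 3, 224, 224))
label = ("intentional", "unintentional")[logits.argmax(dim=1).item()]
```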
For the image showing that the first operating member 33 is held by a hand of an operator 100 as illustrated in the drawings, the image recognition model 41 determines that the operation is the intentional operation.
For the images of various ways of holding the first operating member 33 by the operator 100 as illustrated in the drawings, the image recognition model 41 likewise determines that the operation is the intentional operation.
On the other hand, for the image showing that the operating member is touched by a portion other than the hand of the operator 100 as illustrated in the drawings, the image recognition model 41 determines that the operation is the unintentional operation.
For example, for the image showing that the first operating member 33 is touched by a foot of the operator 100 as illustrated in the drawings, the image recognition model 41 determines that the operation is the unintentional operation.
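A minimal training sketch under the same assumptions could look as follows, where images of the operator holding the lever by hand are labeled as the intentional operation (0) and images of contact by other body parts are labeled as the unintentional operation (1). The OperationClassifier from the previous sketch, the dummy tensors, and the hyperparameters are placeholders for illustration.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

def train(model: nn.Module, images: torch.Tensor, labels: torch.Tensor) -> None:
    # images: (N, 3, H, W) cab images; labels: 0 = intentional, 1 = unintentional.
    loader = DataLoader(TensorDataset(images, labels), batch_size=8, shuffle=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(10):  # number of epochs is an arbitrary choice for the sketch
        for batch_images, batch_labels in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(batch_images), batch_labels)
            loss.backward()
            optimizer.step()

# Example call with dummy data (16 blank images, random labels):
# train(OperationClassifier(), torch.zeros(16, 3, 224, 224), torch.randint(0, 2, (16,)))
```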
When it is determined in step S104 that the operation by the operator 100 is the intentional operation, the process proceeds to step S105. In step S105, the controller 24 allows the operation of the first operating member 33. That is, the controller 24 causes the work implement 3 or the rotating body 4 to operate according to the operation of the first operating member 33.
When it is determined in step S104 that the operation by the operator 100 is the unintentional operation, the process proceeds to step S106. In step S106, the controller 24 maintains the lock. That is, the controller 24 invalidates the operation of the first operating member 33 and does not cause the work implement 3 or the rotating body 4 to operate regardless of the operation of the first operating member 33.
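The steps S102 to S106 described above could be sketched as follows, purely for illustration. The camera, lever, machine, and classifier objects are hypothetical placeholders and are not part of the disclosed embodiment.

```python
# Illustrative sketch of steps S102 to S106; all objects are hypothetical.

def erroneous_operation_check(camera, lever, machine, classifier):
    # Step S102: acquire image data of a region including the first
    # operating member.
    image = camera.capture()

    # Step S103: determine whether the operating member is operated, based
    # on the signal from the operating member itself.
    signal = lever.read_signal()
    if not signal.is_operated():
        machine.maintain_lock()  # Step S106: maintain the lock.
        return

    # Step S104: determine from the image whether the operation is
    # intentional or unintentional.
    if classifier.is_intentional(image):
        # Step S105: allow the operation; actuate the work implement or
        # the rotating body according to the lever signal.
        machine.execute(signal)
    else:
        # Step S106: maintain the lock; the lever operation is invalidated.
        machine.maintain_lock()
```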
According to the control system of the work machine 1 according to the present embodiment described above, it is determined whether the operation of the first operating member 33 by the operator 100 is the intentional operation or the unintentional operation based on the image of the region including at least a portion of the first operating member 33. Therefore, it is possible to detect an erroneous operation with high accuracy regardless of the way in which the operator 100 operates the first operating member 33. Further, it is possible to quickly determine whether the operation of the first operating member 33 by the operator 100 is a normal operation. Therefore, the deterioration of the operability of the work machine 1 during normal operation can be reduced. The same effect as described above can be achieved in a case where the second operating member 34 is operated.
Although an embodiment of the present invention has been described so far, the present invention is not limited to the above embodiment and various modifications can be made without departing from the gist of the invention.
The work machine 1 is not limited to the hydraulic excavator and may be another type of work machine, such as a wheel loader, a bulldozer, or a motor grader. The configuration of the work machine 1 is not limited to that as mentioned above and may be changed. For example, the rotation motor 25 may be an electric motor.
The first to fourth operating members 33 to 36 are not limited to those of the above embodiment and may be modified. For example, the first to fourth operating members 33 to 36 are not limited to levers and may be switches. A portion of the first to fourth operating members 33 to 36 may be omitted or changed. Alternatively, another operating member, such as a steering wheel, may be provided. The controller 24 may execute the same processes for detecting an erroneous operation on the steering wheel as described above. The work machine 1 may include a steering mechanism. The controller 24 may steer the work machine 1 according to the operation of the operating member by the operator.
The field of view of the camera 39 may include only one of the first operating member 33 or the second operating member 34. The field of view of the camera 39 may not include the seat 37. A camera may be provided individually for each of the first operating member 33 and the second operating member 34. The field of view of the camera 39 may include the third operating member 35 or the fourth operating member 36. The controller 24 may execute the same processes for detecting an erroneous operation on the third operating member 35 or the fourth operating member 36 as described above.
The controller 24 may include a plurality of processors, such as a CPU and a GPU. The above processes may be distributed and executed among the plurality of processors. The controller 24 is not limited to one unit, and the above processes may be distributed and executed among a plurality of controllers.
The order of the above-mentioned processes may be changed. Some of the above-mentioned processes may be changed or omitted. For example, the determination between the intentional operation and the unintentional operation may be performed by another image recognition technology using AI such as a support vector machine, instead of deep learning. Alternatively, the determination between the intentional operation and the unintentional operation is not limited to AI and may be performed by a rule-based image recognition technology such as pattern matching.
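As one possible sketch of the support vector machine alternative mentioned above, the following uses scikit-learn with a naive flattening of the image as the feature vector; both the library choice and the feature extraction are assumptions made for illustration.

```python
import numpy as np
from sklearn.svm import SVC

def train_svm(images: np.ndarray, labels: np.ndarray) -> SVC:
    # images: (N, H, W, C) cab images; labels: 0 = intentional, 1 = unintentional.
    features = images.reshape(len(images), -1)
    model = SVC(kernel="rbf")
    model.fit(features, labels)
    return model

def is_intentional(model: SVC, image: np.ndarray) -> bool:
    # Classify a single image with the trained support vector machine.
    return bool(model.predict(image.reshape(1, -1))[0] == 0)
```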
According to the present disclosure, it is possible to detect an erroneous operation of the work machine with high accuracy.
Number | Date | Country | Kind |
---|---|---|---|
2020-015440 | Jan 2020 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2021/000335 | 1/7/2021 | WO |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2021/153182 | 8/5/2021 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
9363441 | Crookham | Jun 2016 | B2 |
20080127531 | Stanek | Jun 2008 | A1 |
20080133094 | Stanek | Jun 2008 | A1 |
20100217492 | Kodaka | Aug 2010 | A1 |
20150004566 | Bomer | Jan 2015 | A1 |
20150153733 | Ohmura | Jun 2015 | A1 |
20170107693 | Yamada | Apr 2017 | A1 |
20170275848 | Marquette | Sep 2017 | A1 |
20170305018 | Machida | Oct 2017 | A1 |
20180134262 | Kurahashi | May 2018 | A1 |
20200032482 | Meguriya | Jan 2020 | A1 |
20200032489 | Yamazaki | Jan 2020 | A1 |
20200173143 | Kurokawa et al. | Jun 2020 | A1 |
20200173148 | Nishi | Jun 2020 | A1 |
20210002860 | Otani | Jan 2021 | A1 |
20230134855 | Hodel | May 2023 | A1 |
20230183942 | Tamura | Jun 2023 | A1 |
20230279638 | Danguchi | Sep 2023 | A1 |
20230279645 | Danguchi | Sep 2023 | A1 |
Number | Date | Country |
---|---|---|
101815630 | Aug 2010 | CN |
106170414 | Nov 2016 | CN |
207277400 | Apr 2018 | CN |
113727883 | Nov 2021 | CN |
113728141 | Nov 2021 | CN |
113748247 | Dec 2021 | CN |
113767202 | Dec 2021 | CN |
2624552 | Aug 2013 | EP |
3333831 | Jun 2018 | EP |
3575501 | Dec 2019 | EP |
2004-19127 | Jan 2004 | JP |
2007-72629 | Mar 2007 | JP |
2010-250459 | Nov 2010 | JP |
2015-108860 | Jun 2015 | JP |
2018-79707 | May 2018 | JP |
2019-176401 | Oct 2019 | JP |
10-2008-0006950 | Jan 2008 | KR |
10-2015-0067926 | Jun 2015 | KR |
2019039522 | Feb 2019 | WO |
WO-2019155843 | Aug 2019 | WO |
WO-2019189589 | Oct 2019 | WO |
WO-2020003994 | Jan 2020 | WO |
Entry |
---|
The Office Action for the corresponding Chinese application No. 202180006632.7, issued on May 27, 2023. |
Tongzhu Liu, “Construction Mode and Innovation of Intelligent Hospital”, Publishing Company of China Science and Technology University; Dec. 2019. |
The International Search Report for the corresponding international application No. PCT/JP2021/000335, issued on Mar. 30, 2021. |
The Office Action for the corresponding Chinese application No. 202180006632.7, issued on Jan. 13, 2023. |
The Office Action for the corresponding Korean application No. 10-2022-7016081, issued on Nov. 20, 2023. |
Number | Date | Country | |
---|---|---|---|
20230018377 A1 | Jan 2023 | US |