This application claims foreign priority of JP2023-197872 filed Nov. 22, 2024, the disclosure of which is hereby incorporated by reference in its entirety.
The present invention relates to a work machine control method, a work machine control program, and a work machine control system that are used for a work machine having a function to detect a detection target in a surrounding monitoring area, and to a work machine.
As the related art, a work machine (shovel) capable of detecting an object present in the surroundings has been known (for example, see Patent Document 1). The work machine according to the related art includes a camera attached to an upper turning body, a display device, and an object detector. The object detector detects a predetermined object within a predetermined detection range that is set around the shovel. In this work machine, an image that is displayed on the display device includes: a first region where a camera image is displayed; and a second region where an actuation state of an object detection function is displayed.
In the above related art, a large amount of information is displayed on the display device. Thus, an operator may miss an actuation state of a detection processor (object detection function), or is required to keep gazing at the display device in order not to miss the actuation state of the detection processor.
An object of the present invention is to provide a work machine control method, a work machine control program, a work machine control system, and a work machine with which an operator can easily and intuitively comprehend an actuation state of a detection processor.
A work machine control method according to an aspect of the present invention includes: acquiring a captured image of a monitoring area around a work machine; displaying a display screen including the captured image, on a display device; and, in the case where a detection processor for detecting a detection target in the monitoring area is enabled, displaying a valid object, which moves in at least one of a circumferential direction and a radial direction of a virtual circle centered on a reference point set for the captured image, in a superimposed manner on the captured image on the display screen.
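As a non-limiting illustration of the motion of the valid object, the position of a marker that moves in at least one of the circumferential direction and the radial direction of a virtual circle centered on a reference point may be computed as follows (Python sketch; the function name, the pixel coordinates, and the frame loop are assumptions for illustration and are not part of the disclosure):

```python
import math

def valid_object_position(center, radius, angle_deg, radial_offset=0.0):
    """Position of a 'valid object' marker on a virtual circle.

    center: (x, y) reference point set for the captured image (pixels)
    radius: radius of the virtual circle (pixels)
    angle_deg: current angle of the marker; animating this value moves
               the marker in the circumferential direction
    radial_offset: animating this value moves the marker in the radial
                   direction
    """
    r = radius + radial_offset
    theta = math.radians(angle_deg)
    x = center[0] + r * math.cos(theta)
    y = center[1] + r * math.sin(theta)
    return (x, y)

# One position per display frame: advancing the angle each frame moves
# the superimposed marker along the circumference of the virtual circle.
frames = [valid_object_position((160, 120), 50, 10 * i) for i in range(36)]
```

Superimposing the marker at each successive position produces the circumferential (or, by animating `radial_offset`, radial) motion described above.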
A work machine control program according to another aspect of the present invention is a program for causing one or more processors to execute the work machine control method.
A work machine control system according to another aspect of the present invention includes an image acquisition unit and a display processor. The image acquisition unit acquires a captured image of a monitoring area around a work machine. The display processor displays a display screen including the captured image, on a display device. In the case where a detection processor for detecting a detection target in the monitoring area is enabled, the display processor displays a valid object, which moves in at least one of a circumferential direction and a radial direction of a virtual circle centered on a reference point set for the captured image, in a superimposed manner on the captured image on the display screen.
A work machine according to still another aspect of the present invention includes the work machine control system and a machine body, on which the display device is mounted.
The present invention can provide the work machine control method, the work machine control program, the work machine control system, and the work machine with which an operator can easily and intuitively comprehend an actuation state of the detection processor.
Hereinafter, a description will be made on embodiments of the present invention with reference to the accompanying drawings. Each of the following embodiments is an example that embodies the present invention, and has no intention to limit the technical scope of the present invention.
As illustrated in
The term “work machine” in the present disclosure means any of various types of work machines, and examples include work machines such as backhoes (including a hydraulic excavator, a mini excavator, and the like), a wheel loader, and a carrier. The work machine 3 includes a work unit 33 configured to be able to perform one or more types of work at least including hanging work. The work machine 3 is not limited to a “vehicle” and may be a work vessel, a work flying object such as a drone or a multicopter, or the like, for example. Furthermore, the work machine 3 is not limited to a construction machine and may be an agricultural machine such as a rice transplanter, a tractor, or a combine harvester, for example. In the present embodiment, unless otherwise specified, the work machine 3 is the backhoe with a hanging function (with a crane function), and a description will be made on a case where the work machine 3 can perform, as work, excavation work, ground leveling work, trench excavation work, loading work, or the like, in addition to the hanging work, as an example.
In addition, in the present embodiment, for convenience of description, a vertical direction in a usable state of the work machine 3 is defined as an up-down direction D1. Furthermore, in a non-turning state of the turning unit 32, a front-rear direction D2 and a left-right direction D3 are defined with a direction seen from a user (operator) who is seated in (a driving unit 321 of) the work machine 3 being a reference. In other words, each of the directions used in the present embodiment is a direction that is defined with the machine body 30 of the work machine 3 being a reference. A direction in which the machine body 30 moves during forward travel of the work machine 3 is “forward”, and a direction in which the machine body 30 moves during reverse travel of the work machine 3 is “rearward”. Similarly, a direction in which a front end portion of the machine body 30 moves during a right turn of the work machine 3 is “rightward”, and a direction in which the front end portion of the machine body 30 moves during a left turn of the work machine 3 is “leftward”.
However, these directions do not intend to limit a use direction (direction during use) of the work machine 3.
The work machine 3 includes an engine as a power source. In the work machine 3, the machine body 30 is driven when a hydraulic pump 41 (see
In the present embodiment, it is assumed that the work machine 3 is a riding-type backhoe as described above. Thus, the work unit 33 is driven according to an operation of the user (operator) who is seated in the driving unit 321, and performs the work such as the excavation work. The driving unit 321 which the user gets in is provided in the turning unit 32.
The travel unit 31 has a travel function and is configured to be able to travel (including turning) on the ground. For example, the travel unit 31 has a pair of left and right crawlers 311, a blade 312, and the like. The travel unit 31 further has the travel hydraulic motor 43 (hydraulic actuator) and the like for driving the crawlers 311.
The turning unit 32 is located above the travel unit 31 and configured to be turnable about a rotation shaft along the vertical direction. The turning unit 32 has the turning hydraulic motor (hydraulic actuator) and the like. In addition to the driving unit 321, the engine, the hydraulic pump 41, and the like are mounted on the turning unit 32. Furthermore, a boom bracket 322, to which the work unit 33 is attached, is provided in a front end portion of the turning unit 32.
The work unit 33 is configured to be able to perform the work including the hanging work. The work unit 33 is supported by the boom bracket 322 of the turning unit 32 and performs the work. The work unit 33 includes a bucket 331, a boom 332, an arm 333, and the like. The work unit 33 further has the hydraulic actuator (including the hydraulic cylinder 44, the hydraulic motor, and the like) for driving each unit.
The bucket 331 is a type of attachment (work tool) that is attached to the machine body 30 of the work machine 3, and is any tool that is selected from plural types of attachments according to a work content.
For example, the bucket 331 is detachably attached to the machine body 30, and is replaced according to the work content. In addition to the bucket 331, examples of the attachment for the work machine 3 include various tools such as a breaker, an auger, a crusher, a fork, a fork claw, a steel frame cutter, an asphalt milling machine, a mower, a ripper, a mulcher, a tilt rotator, and a tamper. The work unit 33 drives the bucket 331 with power from a drive device to perform the work.
The boom 332 is rotatably supported by the boom bracket 322 of the turning unit 32. More specifically, the boom 332 is supported by the boom bracket 322 in a manner to be rotatable about a rotation axis along a horizontal direction. The boom 332 has a shape that extends upward from a base end portion, which is supported by the boom bracket 322. The arm 333 is coupled to a distal end of the boom 332. The arm 333 is supported by the boom 332 in a manner to be rotatable about a rotation axis along the horizontal direction. The bucket 331 is attached to a distal end of the arm 333.
The work unit 33 acts when receiving the power from the engine as the power source. More specifically, when the hydraulic pump 41 is driven by the engine, and the hydraulic oil is supplied from the hydraulic pump 41 to the hydraulic actuator (the hydraulic cylinder 44 and the like) of the work unit 33, sections (the bucket 331, the boom 332, and the arm 333) of the work unit 33 act.
In the present embodiment, in particular, the work unit 33 has an articulated structure in which the boom 332 and the arm 333 are independently rotatable. That is, due to the rotation of each of the boom 332 and the arm 333 about the rotation axis along the horizontal direction, for example, the articulated work unit 33 including the boom 332 and the arm 333 can be extended or retracted as a whole.
Similar to the work unit 33, the travel unit 31 and the turning unit 32 also act when receiving the power from the engine as the power source. That is, when the hydraulic oil is supplied from the hydraulic pump 41 to the hydraulic motor 43 of the travel unit 31, the hydraulic motor of the turning unit 32, and the like, the turning unit 32 and the travel unit 31 act.
As described above, the engine functions as the power source that supplies the power to each unit. Here, together with the hydraulic pump 41 and the like, the engine is mounted on the turning unit 32. In the present embodiment, as an example, the engine is a diesel engine. The engine is driven by fuel (light fuel herein) supplied from a fuel tank.
Here, the machine body 30 includes various types of sensors (including a camera) for detecting a detection target Ob1 (see
The left camera 341, the right camera 342, and the rear camera 343 are connected to the control system 1 and each output a captured image to the control system 1.
The left camera 341, the right camera 342, and the rear camera 343 are respectively installed to be directed leftward, rightward, and rearward with the driving unit 321 being a reference, so as to be able to capture images of the monitoring area A1 on the left side, the right side, and the rear side when seen from the operator who is seated in the driving unit 321 of the turning unit 32. That is, as illustrated in
The detector 5 is connected to the control system 1 and outputs a detection result of the detection target Ob1 in the monitoring area A1 to the control system 1. Here, as an example, the detector 5 is installed to face rearward, so as to be able to detect the detection target Ob1 in the monitoring area A1 with the monitoring area A1 that can be imaged by the left camera 341, the right camera 342, and the rear camera 343 being a target. That is, a detection range of the detection target Ob1 by the detector 5 is the same as the imageable range (monitoring area A1) by the left camera 341, the right camera 342, and the rear camera 343.
In the present embodiment, the detector 5 includes a sensor 51 (see
That is, the detection result by the detector 5 may include presence or absence of the detection target Ob1 in the monitoring area A1, the position of the detection target Ob1 in the monitoring area A1 when the detection target Ob1 is present in the monitoring area A1, the attributes of the detection target Ob1, and the like.
In short, the detector 5 detects the detection target Ob1 in the monitoring area A1 around the work machine 3. The detector 5 determines whether the detection target Ob1 is present in the monitoring area A1 (presence or absence), and outputs a detection result indicating whether the detection target Ob1 is present in the monitoring area A1. In the present embodiment, as an example, the detection target Ob1 is a “person”. That is, in the case where the “person” enters the monitoring area A1 around the work machine 3 as a result of movement of the work machine 3 or movement of the “person” around the work machine 3, the detector 5 detects the “person” as the detection target Ob1. In the case where plural detection targets Ob1 are present in the monitoring area A1, the detector 5 may also detect the number of the detection targets Ob1 (the number of the persons).
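The form of the detection result described above can be sketched as follows (illustrative Python under the assumption that the detector classifies candidates by an attribute label; the data structure and all names are hypothetical and are not part of the embodiment):

```python
from dataclasses import dataclass, field

@dataclass
class DetectionResult:
    present: bool                                    # presence or absence of Ob1 in A1
    positions: list = field(default_factory=list)    # positions of Ob1 within A1
    attributes: list = field(default_factory=list)   # attributes, e.g. "person"

def detect(candidates):
    """Minimal stand-in for detector 5: report only 'person' candidates
    found in the monitoring area A1 (attribute labels are assumed)."""
    persons = [c for c in candidates if c["attribute"] == "person"]
    return DetectionResult(
        present=bool(persons),
        positions=[c["position"] for c in persons],
        attributes=[c["attribute"] for c in persons],
    )
```

The number of detected persons then corresponds to `len(result.positions)`.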
As illustrated in
The hydraulic oil from the hydraulic pump 41, which is driven by the engine, is supplied to the hydraulic motor 43 of the travel unit 31, the hydraulic motor of the turning unit 32, the hydraulic cylinder 44 of the work unit 33, and the like. Thus, the hydraulic actuators such as the hydraulic motor 43 and the hydraulic cylinder 44 are driven.
The hydraulic actuators such as the hydraulic motor 43 and the hydraulic cylinder 44 are each provided with the pilot-type direction switching valve 47 capable of switching a direction and a flow rate of the hydraulic oil from the hydraulic pump 41. The direction switching valve 47 is driven when supplied with the pilot oil that serves as an input command from the pilot pump 42.
Here, for example, the remote control valve 45 is provided in the pilot oil supply path to the direction switching valve 47 that corresponds to the hydraulic cylinder 44 of the work unit 33. The remote control valve 45 outputs a work operation command of the work unit 33 in response to an operation of the operation lever. The work operation command instructs extending motion, retracting motion, and the like of the work unit 33. The electromagnetic control valve 46 (electromagnetic valve) is inserted between the remote control valve 45 and the pilot pump 42. The control valve 46 is connected to a power source 351 via a cutoff relay 352 and a cutoff switch 353, and acts according to a supply current from the power source 351.
Similarly, a remote control valve is provided in the pilot oil supply path to the direction switching valve that corresponds to the hydraulic motor 43 of the travel unit 31. This remote control valve outputs a travel operation command of the travel unit 31 in response to the operation of the operation lever. The travel operation command instructs travel motion (forward travel, reverse travel, or the like) of the travel unit 31. Furthermore, a remote control valve is also provided in the pilot oil supply path to the direction switching valve that corresponds to the hydraulic motor of the turning unit 32. This remote control valve outputs a turning operation command of the turning unit 32 in response to the operation of the operation lever. The turning operation command instructs turning motion (left turn, right turn, or the like) of the turning unit 32. The electromagnetic control valve 46 (electromagnetic valve) is also inserted between these remote control valves and the pilot pump 42. The control valve 46 is connected to the power source 351 via the cutoff relay 352 and the cutoff switch 353, and acts according to a supply current from the power source 351.
In an energized state, that is, in a state where the current is supplied, the control valve 46 opens a pilot oil flow path from the pilot pump 42 to the remote control valve 45. In an unenergized state, that is, in a state where the current supply is cut off, the control valve 46 blocks the pilot oil flow path. Accordingly, when the supply current to the control valve 46 is cut off, the hydraulic actuator that corresponds to the remote control valve 45 can no longer be driven, and the output of the hydraulic actuator is forcibly stopped regardless of the operation of the operation lever.
Here, the cutoff relay 352 is connected to the control system 1 and switched on and off according to a control signal (electric signal) from the control system 1. The cutoff switch 353 is switched on and off according to an operation of a cutoff lever, and is turned on when the cutoff lever is operated downward. Accordingly, when both of the cutoff relay 352 and the cutoff switch 353 are on, the control valve 46 is brought into the energized state, and thus the pilot oil flow path from the pilot pump 42 to the remote control valve 45 is opened. As a result, the hydraulic actuator is driven in response to the operation of the operation lever. Meanwhile, when at least one of the cutoff relay 352 and the cutoff switch 353 is off, the control valve 46 is brought into the unenergized state, and thus the pilot oil flow path is shut off. As a result, the hydraulic actuator can no longer be driven.
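The relationship between the states of the cutoff relay 352 and the cutoff switch 353 and the energized state of the control valve 46 described above amounts to a logical AND, which can be sketched as follows (illustrative Python; the function name is a hypothetical label, not part of the embodiment):

```python
def control_valve_energized(cutoff_relay_on, cutoff_switch_on):
    """The control valve 46 is energized only when both the cutoff relay 352
    and the cutoff switch 353 are on; in any other combination the pilot oil
    flow path is shut off and the hydraulic actuator cannot be driven."""
    return cutoff_relay_on and cutoff_switch_on
```

Turning off either the relay (by the control system 1) or the switch (by the cutoff lever) therefore suffices to forcibly stop the corresponding hydraulic actuator.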
For example, in the off state of at least one of the cutoff relay 352 and the cutoff switch 353, which are connected to the control valve 46 inserted between the pilot pump 42 and the remote control valve corresponding to the hydraulic motor of the turning unit 32, the hydraulic motor of the turning unit 32 cannot be driven. In this state, since the output from the hydraulic actuator (hydraulic motor of the turning unit 32) is forcibly stopped irrespective of the operation of the operation lever, the turning motion of the turning unit 32 is prohibited.
The control system 1 mainly includes a computer system that has one or more processors, such as a central processing unit (CPU), and one or more pieces of memory, such as read only memory (ROM) and random access memory (RAM), and executes various types of processing (information processing). In the present embodiment, the control system 1 is an integrated controller that controls the entire work machine 3, and includes an electronic control unit (ECU), for example. However, the control system 1 may be provided separately from the integrated controller. The control system 1 will be described in detail in “[2] Configuration of Control System”.
The display device 2 is arranged in the driving unit 321 of the machine body 30, and is a user interface that accepts an operation input by the user (operator) and outputs various types of information to the user. For example, the display device 2 outputs an electrical signal that corresponds to the user's operation and thereby accepts the various types of the user's operations. This enables the user (operator) to visually recognize a display screen Dp1 (see
As illustrated in
The control unit 21 controls the display device 2 according to the data from the control system 1. More specifically, the control unit 21 outputs an electrical signal that corresponds to the user's operation accepted by the operation unit 22, and displays the display screen Dp1, which is generated by the control system 1, on the display unit 23.
The operation unit 22 is a user interface for accepting an operation input performed by the user (operator) to the display screen Dp1 displayed on the display unit 23. For example, the operation unit 22 outputs an electric signal corresponding to an operation performed by a user U1 (refer to
The operation unit 22 may include a touch panel, an operation dial, or the like. Also, in this case, by operating the operation unit 22, any of the items on the display screen Dp1 is operated (selected).
The display unit 23 is a user interface, such as a liquid-crystal display or an organic EL display, that displays various types of information to present the information to the user U1 (operator). In the present embodiment, as an example, the display unit 23 is a full-color liquid-crystal display with a backlight, and has a “laterally long” display region that is long in a lateral direction as illustrated in
On the display screen Dp1, the display device 2 presents the various types of the information to the user U1 (operator) who operates the work machine 3. That is, the user U1 who operates the work machine 3 can visually acquire the various types of the information on the work machine 3 by seeing the display screen Dp1 that is displayed on the display device 2. As an example, when the display device 2 displays information on a running state of the work machine 3, such as a coolant temperature and a hydraulic oil temperature, the user U1 can check, on the display device 2, the information on the running state of the work machine 3 that is required to operate the work machine 3. The display device 2 can also display, on the display screen Dp1, the images of the surroundings of the work machine 3 (images of the monitoring area A1) that are captured by the left camera 341, the right camera 342, and the rear camera 343. This enables the user U1 (operator) who operates the work machine 3 to check situations on the sides, behind, and the like of the work machine 3, which tend to be the blind spots from the driving unit 321, by seeing the display screen Dp1 displayed on the display device 2.
Furthermore, the work machine 3 includes the sound output unit 36 (see
The machine body 30 further includes the operation lever, the cutoff lever, a communication terminal, the fuel tank, a battery, and the like in addition to the above-described components. Moreover, the machine body 30 includes sensors for monitoring a running state of the machine body 30, and examples of such sensors are a coolant temperature sensor, a hydraulic oil temperature sensor, a tachometer that measures an engine speed, and an hour meter that measures a running time. In addition to the above, the machine body 30 further includes sensors for detecting states or the like of the cutoff lever, a starter key switch, and the like.
Next, a description will be made on a configuration of the control system 1 according to the present embodiment with reference to
The term “screen” such as the display screen Dp1 in the present disclosure means a video (image) displayed on the display device 2 and includes a graphic image, a figure, a photograph, text, a moving image, and the like. Thus, the control system 1 can cause the display device 2 to display, for example, the display screen Dp1 including an icon that indicates information on the running state of the work machine 3 such as a coolant temperature and a hydraulic oil temperature. Here, the display screen Dp1 is not limited to a still picture but may include a video that changes every moment.
As illustrated in
The control system 1 is configured to be communicable with the device that is provided to each part of the machine body 30. That is, at least the display device 2, the detector 5, the sound output unit 36, the cutoff relay 352, the left camera 341, the right camera 342, the rear camera 343, and the like are connected to the control system 1. Accordingly, the control system 1 can control the display device 2, the sound output unit 36, and the like, control the cutoff relay 352 to control the control valve 46, and acquire the detection result by the detector 5 and the images captured by the left camera 341, the right camera 342, the rear camera 343, and the like. Here, the control system 1 may exchange the various types of the information (data) with each of the devices either directly or indirectly via a relay or the like.
The image acquisition unit 14 executes image acquisition processing to acquire the captured image of the monitoring area A1 around the work machine 3. In the present embodiment, the image acquisition unit 14 regularly or irregularly acquires the respective outputs of the left camera 341, the right camera 342, and the rear camera 343. That is, the image acquisition unit 14 acquires the image data (captured image) of the monitoring area A1 (each of the small areas A11, A12, A13) around the work machine 3. The data acquired by the image acquisition unit 14 is stored in the memory or the like, for example.
The detection processor 15 executes detection processing to detect the detection target Ob1 in the monitoring area A1. In the present embodiment, since the detector 5 detects the detection target Ob1 in the monitoring area A1, the detection processor 15 detects the detection target Ob1 in the monitoring area A1 by acquiring the detection result by the detector 5 from the detector 5.
The restraint processor 12 executes restraint processing to restrain the motion of the work machine 3 on the basis of the detection result by the detector 5. In the present embodiment, in the case where the detection result by the detector 5 is a result that indicates the presence of the detection target Ob1 (the person herein) in the monitoring area A1, the restraint processor 12 executes the restraint processing. The term “restraint processing” in the present disclosure means processing that relates to the motion of the work machine 3 and suppresses the motion of the work machine 3 in some way. As an example, the restraint processing includes processing to indirectly restrain the motion of the work machine 3 by warning the user U1 (operator) who operates the work machine 3 by the sound or the light (including the display). Furthermore, the restraint processing includes processing to directly restrain the motion of the work machine 3 by controlling the travel unit 31, the turning unit 32, the work unit 33, and the like of the work machine 3.
In the present embodiment, the restraint processor 12 includes a sound output processor 121 and a restriction processor 122.
In the case where the detection target Ob1 is present in the monitoring area A1, the sound output processor 121 controls the sound output unit 36 to output notification sound. That is, in the present embodiment, the restraint processing includes sound output processing to output the notification sound. The notification sound may be a simple beep or may be voice, such as a message “Please be careful”. Furthermore, the notification sound may be changed according to the detection result by the detector 5 (the distance from the machine body 30 to the detection target Ob1, or the like). In this way, the motion of the work machine 3 can be restrained indirectly by warning the user U1 (operator), who operates the work machine 3, using the notification sound. Thus, there is a high degree of freedom in the operation of the work machine 3. That is, when the user U1 operates the work machine 3 while paying attention to the detection target Ob1, it is possible to continue the motion of the work machine 3 while contact with the detection target Ob1 is avoided.
In the case where the detection target Ob1 is present in the monitoring area A1, the restriction processor 122 turns off the cutoff relay 352 by controlling the cutoff relay 352. Consequently, the control valve 46, which is connected to the power source 351 via the cutoff relay 352, is brought into the unenergized state, and the output of the hydraulic actuator corresponding to the control valve 46 is forcibly stopped. That is, in the present embodiment, the restraint processing includes restriction processing to restrict the motion of the work machine 3. The term “restriction processing” in the present disclosure means processing that relates to the motion of the work machine 3 and restricts the motion of the work machine 3 in some way. Examples of the restriction processing include processing to prohibit the travel motion of the travel unit 31 (disable the travel motion), processing to prohibit the turning motion of the turning unit 32 (disable the turning motion), and processing to prohibit the motion of the work unit 33 (disable the work). In this way, the motion of the work machine 3 can be restricted forcibly without relying on the operation by the user U1 (operator). That is, it is possible to avoid contact of the machine body 30 with the detection target Ob1, which is caused by the motion of the work machine 3.
Here, the restriction processing that is executed by the restriction processor 122 at least includes processing to restrict the turning motion of the turning unit 32. More specifically, the restriction processor 122 is configured to be able to control the cutoff relay 352 that is connected to the control valve 46 corresponding to the hydraulic motor of the turning unit 32, and turns off the cutoff relay 352 when the detection target Ob1 is present in the monitoring area A1. In this way, when the detection target Ob1 is present in the monitoring area A1, the hydraulic motor of the turning unit 32 cannot be driven, and the turning unit 32 is brought to an emergency stop during its turning motion, or the turning motion is prohibited when the turning unit 32 is not turning. That is, in the present embodiment, the work machine 3 includes the travel unit 31 and the turning unit 32 that can turn with respect to the travel unit 31. The restriction processing at least restricts the turning motion of the turning unit 32. In this way, when the detection target Ob1 is present in the monitoring area A1, which is a blind spot for the user U1 (operator), it is possible to avoid the contact of the machine body 30 with the detection target Ob1, which is caused by turning of the turning unit 32.
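One pass of the restraint processing described above can be sketched as follows (illustrative Python; the function and flag names are assumptions, and the enable flags anticipate that the sound output processing and the restriction processing can be switched independently, as described later):

```python
def restraint_step(target_present, sound_enabled=True, restriction_enabled=True):
    """One pass of the restraint processing.

    Returns (notify, relay_on): whether the notification sound is output,
    and the commanded state of the cutoff relay 352 (False = relay off,
    i.e. the turning motion of the turning unit 32 is restricted).
    """
    if not target_present:
        # No detection target Ob1 in the monitoring area A1: no restraint.
        return (False, True)
    notify = sound_enabled              # sound output processing (indirect restraint)
    relay_on = not restriction_enabled  # restriction processing: turn the relay off
    return (notify, relay_on)
```

With both types of specific processing enabled, detecting the target both warns the operator and shuts off the pilot oil flow path via the relay.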
The switching processor 13 switches between an enabled state and a disabled state of the detection processor 15. That is, the switching processor 13 switches between an enabled state and a disabled state of the function related to the detection processing for the detection target Ob1 in the monitoring area A1 by the detection processor 15. In short, the detection processor 15 is not always enabled but can be switched between the enabled state and the disabled state. In the case where the detection processor 15 is enabled, and the detection target Ob1 is present in the monitoring area A1, the detection target Ob1 is detected by the detection processor 15. Thus, the restraint processor 12 executes the restraint processing. Meanwhile, in the case where the detection processor 15 is disabled, the detection target Ob1 is not detected by the detection processor 15 even when the detection target Ob1 is present in the monitoring area A1. Thus, the restraint processor 12 does not execute the restraint processing.
In the present embodiment, depending on whether to actuate the detector 5, the switching processor 13 switches between the enabled state and the disabled state of the detection processor 15. That is, the switching processor 13 actuates the detector 5 when the detection processor 15 is enabled. The switching processor 13 does not actuate the detector 5 when the detection processor 15 is disabled. In this way, when the detection processor 15 is disabled, power consumption by the detector 5 can be suppressed.
However, the configuration of the switching processor 13 is not limited thereto. The switching processor 13 may disable the detection processor 15 by disabling the detection processing by the detection processor 15 (acquisition of the detection result from the detector 5), disabling the acquired detection result, or the like. In this case, when the detection processor 15 is disabled, the detector 5 itself is actuated. However, even in the case where the detection target Ob1 is present in the monitoring area A1, the restraint processor 12 does not execute the restraint processing. In short, the switching processor 13 may disable the detection processor 15 by disabling the processing using the detection result by the detector 5.
In the present embodiment, as an example, the enabled state and the disabled state of the detection processor 15 are switched when the user U1 (operator) operates the display device 2. That is, when the user U1 operates the operation unit 22 of the display device 2 to enable the detection processor 15, the switching processor 13 accepts this operation to enable the detection processor 15. Meanwhile, when the user U1 operates the operation unit 22 of the display device 2 to disable the detection processor 15, the switching processor 13 accepts this operation to disable the detection processor 15.
In the present embodiment, the restraint processing executed by the restraint processor 12 includes the sound output processing executed by the sound output processor 121 and the restriction processing executed by the restriction processor 122. Just as described, the restraint processing includes plural types of specific processing (the sound output processing, the restriction processing, and the like) for restraining the motion of the work machine 3. Here, in the restraint processing, each type of the specific processing can be switched independently between an enabled state and a disabled state. That is, the switching processor 13 can switch between the enabled state and the disabled state of each of the sound output processor 121 and the restriction processor 122 in the restraint processor 12. As an example, the sound output processor 121 can be enabled, and the restriction processor 122 can be disabled.
Alternatively, the sound output processor 121 can be disabled, and the restriction processor 122 can be enabled. In this way, depending on the situation, it is possible to enable only the necessary specific processing, which improves the degree of freedom in the restraint processing.
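The independently switchable specific processing can be sketched as follows; the class name, flag names, and log strings are hypothetical and only illustrate the idea that each type of restraint processing is toggled on its own.

```python
class RestraintProcessor:
    """Executes only the specific processing that is currently enabled."""

    def __init__(self):
        # Each specific processing has its own enable flag.
        self.flags = {"sound_output": True, "restriction": True}
        self.log = []

    def set_enabled(self, name, enabled):
        self.flags[name] = enabled

    def execute(self):
        # Sound output processing (warning sound) if enabled.
        if self.flags["sound_output"]:
            self.log.append("warning sound emitted")
        # Restriction processing (restrain the machine motion) if enabled.
        if self.flags["restriction"]:
            self.log.append("motion restricted")
```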
The display processor 11 executes display processing to display at least the display screen Dp1 on the display device 2. More specifically, the display processor 11 generates the display screen Dp1 on the basis of the data acquired by the image acquisition unit 14 and the like, and controls the display device 2 to display the display screen Dp1 on the display unit 23 of the display device 2. Furthermore, the display processor 11 acts in response to the operation accepted by the operation unit 22 of the display device 2. The display processor 11 displays, on the display screen Dp1, the captured image Im100 (see
Here, the display processor 11 can display a valid object X1 (see
In this way, the display of the valid object X1 on the display screen Dp1 allows the user U1 (operator) to comprehend that the detection processor 15 is enabled.
That is, the user U1 can visually recognize whether the detection processor 15 is enabled or disabled, and can actuate the work machine 3 after recognizing that the detection processor 15 is enabled or disabled.
Incidentally, in the present embodiment, the detection processor 15 acquires the detection result from the detector 5 on the outside of the control system 1 and executes the detection processing for the detection target Ob1 on the basis of the detection result. However, the detection processor 15 is not limited to this configuration. For example, the detection processor 15 may execute the detection processing for the detection target Ob1 in the monitoring area A1 on the basis of the output of the sensor 51 and/or the imaging unit 52 on the outside of the control system 1.
Hereinafter, a description will be mainly made on an example of a control method for the work machine 3 (hereinafter simply referred to as a “control method”) that is executed by the control system 1 with reference to
The control method according to the present embodiment is executed by the control system 1 that has the computer system as the main component, and thus, in other words, is embodied by the work machine control program (hereinafter simply referred to as a "control program"). That is, the control program according to the present embodiment is a computer program for causing the one or more processors to execute each processing related to the control method. Such a control program may be executed cooperatively by the control system 1 and the display device 2, for example.
Here, in the case where a particular start operation, which is set in advance to execute the control program, is performed, the control system 1 executes the following various types of processing related to the control method. The start operation is an activation operation of the engine of the work machine 3, or the like, for example. Meanwhile, in the case where a particular termination operation, which is set in advance, is performed, the control system 1 terminates the following various types of processing related to the control method. The termination operation is a stop operation of the engine of the work machine 3, or the like, for example.
Here, a description will first be made on a configuration of the display screen Dp1 that is displayed on the display unit 23 of the display device 2 by the control method according to the present embodiment. In the drawings in
The display screen Dp1 illustrated in
The home screen is a basic display screen Dp1 that is first displayed on the display device 2 while the work machine 3 is running. The display screen Dp1 can be shifted from the home screen to any of various display screens Dp1 including a menu screen, a crane screen, a mode screen, a PTO screen, and the like according to the operation on the operation unit 22.
As illustrated in
Specifically, the display screen Dp1 is divided into four regions in the vertical direction (up-down direction).
Each of the three regions from the top is further divided into three regions in the lateral direction (left-right direction). Thus, the display screen Dp1 is divided into a total of ten regions. The regions in the second row from the top are the first region R1, the second region R2, and the third region R3 in order from left to right. The region in the lowermost row is the fourth region R4. Furthermore, the regions in the third row from the top are the fifth region R5, the sixth region R6, and the seventh region R7 in order from left to right, and the regions in the uppermost row are the eighth region R8, the ninth region R9, and the tenth region R10 in order from left to right. Of the four regions into which the display screen Dp1 is divided in the vertical direction, the region in the second row from the top (region including the first region R1, the second region R2, and the third region R3) has the largest vertical size. Among the three regions transversely divided, a transverse size of middle regions (the second region R2, the sixth region R6, and the ninth region R9) is the largest.
However, the placement and size of the respective regions are merely examples and can be properly changed. In addition, it is not essential that the regions are clearly divided by boundaries. For example, in the example of
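As a rough illustration, the ten-region layout described above could be expressed as a simple grid configuration; the row names and ordering here are assumptions made only for the sketch.

```python
# Four rows, the top three of which are split into three columns.
ROWS = ["top", "second", "third", "bottom"]
LAYOUT = {
    "top":    ["R8", "R9", "R10"],  # uppermost row
    "second": ["R1", "R2", "R3"],   # largest vertical size
    "third":  ["R5", "R6", "R7"],
    "bottom": ["R4"],               # band over the entire screen width
}
regions = [r for row in ROWS for r in LAYOUT[row]]
assert len(regions) == 10  # the screen is divided into ten regions in total
```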
The first region R1 is a rectangular region extending in the vertical direction. In the first region R1, for example, remaining amount information G1 on a remaining amount of the fuel (for example, the light oil) for the engine is displayed. The display processor 11 generates the remaining amount information G1 in the display screen Dp1, for example, on the basis of output (a sensor signal) from a remaining amount sensor.
The second region R2 is a rectangular region that is long in the lateral direction. The captured image Im100 of the monitoring area A1, the valid object X1, and the like are displayed in the second region R2. A machine body object Im10, a captured image Im200 of the rear camera 343, and the like are further displayed in the second region R2.
The captured image Im100 is an overhead image that is generated by subjecting the image captured by the left camera 341, the image captured by the right camera 342, and the image captured by the rear camera 343 to coordinate transformation and synthesis. That is, the captured image Im100 is an image that is acquired by synthesizing an image of the small area A11 on the left side of the driving unit 321 captured by the left camera 341, an image of the small area A12 on the right side of the driving unit 321 captured by the right camera 342, and an image of the small area A13 on the rear side of the driving unit 321 captured by the rear camera 343, and that is displayed in such a form that the monitoring area A1 is seen from above the machine body 30. The display processor 11 synthesizes the captured images acquired by the image acquisition unit 14 and displays the synthesized image in real time.
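A minimal sketch of the synthesis step follows. A real implementation would apply per-camera coordinate transformations (homographies) before stitching; this toy version simply pastes each already-transformed image into its sector of an overhead canvas, with sector boundaries chosen arbitrarily for the example.

```python
def synthesize_overhead(left_img, right_img, rear_img, size=9):
    """Paste three (already transformed) camera images into an overhead
    canvas. Each 'image' is a single label here for simplicity."""
    canvas = [[" "] * size for _ in range(size)]
    third = size // 3
    for y in range(size):
        for x in range(third):
            canvas[y][x] = left_img               # left camera sector
            canvas[y][size - 1 - x] = right_img   # right camera sector
    for y in range(size - third, size):
        for x in range(size):
            canvas[y][x] = rear_img               # rear camera sector (bottom)
    return canvas
```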
The valid object X1 is the object indicating that the detection processor 15 is enabled, and is displayed in the superimposed manner on the captured image Im100 on the display screen Dp1. The valid object X1 will be described in detail in the section “[3.2] Details”.
The machine body object Im10 is displayed in an 11th region R11 that is set at a center of the second region R2. In the present embodiment, as an example, the machine body object Im10 is a graphic image (icon) that imitates the machine body 30 seen from above. The captured image Im200 is an image of the small area A13 on the rear side of the driving unit 321 that is captured by the rear camera 343.
The third region R3 is a rectangular region that is long in the vertical direction. In the third region R3, a graphic image (icon) Im1 that corresponds to the running state of each of the units of the work machine 3 is displayed. In the third region R3, the plural graphic images Im1 can be displayed, and each of the graphic images Im1 uses a design (pattern) to indicate, for example, the state of the battery, a seatbelt, the coolant temperature, the hydraulic oil temperature, or the like.
Here, each of the graphic images Im1 indicates the running state by a display mode including a display color, a size, or the like, for example. The display processor 11 determines the state of each of the units of the work machine 3 by using the output of the various sensors (including the coolant temperature sensor and the hydraulic oil temperature sensor), each of which detects the running state of the corresponding unit of the work machine 3. When an abnormal value is detected in any portion, the display processor 11 provides warning display by changing the display mode, such as the display color, of the graphic image Im1 of the corresponding portion, or the like.
The fourth region R4 is a band-shaped region that extends over an entire width of the display screen Dp1. In the fourth region R4, items for operations on the display screen Dp1 are displayed. In
Furthermore, in the present embodiment, in the fourth region R4, one of the items is highlighted in response to the operation of the operation dial (or a cursor key) in the operation unit 22, or the like.
In the example of
In the fifth region R5, a warning display graphic image (icon) is displayed to indicate that an abnormal value has been detected by any of the various sensors (including the coolant temperature sensor and the hydraulic oil temperature sensor). In the sixth region R6, for example, information on the running work unit 33 in the work machine 3 is displayed. In the seventh region R7, for example, information on the running state of the work machine 3, such as the engine speed, is displayed. In the eighth region R8, for example, the current time is displayed. In the ninth region R9, for example, information on the item to which the currently displayed display screen Dp1 belongs is displayed. In the tenth region R10, for example, information on a running time (hour meter) of the work machine 3 is displayed.
Next, the control method according to the present embodiment will be described in detail.
The control method according to the present embodiment has: acquiring the captured image Im100 of the monitoring area A1 around the work machine 3 (image acquisition processing); and displaying the display screen Dp1, which includes the captured image Im100, on the display device 2 (display processing). The control method further has, in the display processing: displaying the valid object X1 in the superimposed manner on the captured image Im100 on the display screen Dp1 in the case where the detection processor 15, which detects the detection target in the monitoring area A1, is enabled. The valid object X1 moves in at least one of a circumferential direction and a radial direction of a virtual circle C1 (see
In short, by displaying the display screen Dp1, which includes the captured image Im100, on the display device 2, the user U1 can check the presence or the absence of the detection target Ob1 in the monitoring area A1, and the like in the captured image Im100 on the display screen Dp1. Here, in the case where the detection processor 15 is enabled, the valid object X1, which moves in at least one of the circumferential direction and the radial direction of the virtual circle C1 centered on the reference point P1, is displayed in the superimposed manner on the captured image Im100 on the display screen Dp1. Accordingly, the user U1 can comprehend whether the detection processor 15 is enabled or disabled by the valid object X1 that is dynamically displayed at a position that naturally enters a field of view of the user U1 when the user U1 sees the captured image Im100. Thus, the control method according to the present embodiment allows the operator to intuitively comprehend the actuation state of the detector 5.
A detailed description will hereinafter be made on the display of the second region R2 in the display screen Dp1 with reference to
As illustrated in
In detail, the captured image Im100 has such a fan shape that, when the virtual circle C1 that is centered on the reference point P1 located near a center of the second region R2 is set, an excluded range from a first end E101 to a second end E102 as a circumferential part of the virtual circle C1 is cut out. In the example of
As illustrated in
In the present embodiment, the upper side of the machine body object Im10 corresponds to the front side of the machine body 30 in a real space, the lower side of the machine body object Im10 corresponds to the rear side of the machine body 30 in the real space, the right side of the machine body object Im10 corresponds to the right side of the machine body 30 in the real space, and the left side of the machine body object Im10 corresponds to the left side of the machine body 30 in the real space. More specifically, the lower portion of the machine body object Im10 is formed in the arc shape that imitates a rear portion (counterweight) of the turning unit 32, and the upper portion of the machine body object Im10 is formed in the shape that imitates a front portion (work unit 33) of the turning unit 32. In the present embodiment, the work unit 33 is arranged at a position that is offset to the right from the center of the turning unit 32 in the left-right direction D3, and the driving unit 321 is arranged at a position that is offset to the left from the center of the turning unit 32 in the left-right direction D3. Thus, in the machine body object Im10, a figure that imitates the work unit 33 is located in an upper right portion, and a figure that imitates the driving unit 321 is located in a left portion.
In the present embodiment, the machine body object Im10 is displayed not with an orientation of the travel unit 31 but with an orientation of the turning unit 32, on which the driving unit 321 is mounted, being a reference. That is, when the turning unit 32 turns relative to the travel unit 31, the direction that corresponds to the upper side of the machine body object Im10 in the real space is changed. In this way, the upper side of the machine body object Im10 constantly corresponds to the front side of the turning unit 32, which is located on the front side as seen by the operator seated in the driving unit 321.
Here, the direction of the machine body 30 represented by the machine body object Im10 matches the direction of the captured image Im100 that is seen from the machine body 30. That is, both the front side of the machine body 30, which is simulated by the machine body object Im10, and the front side in the captured image Im100 are directed upward in the display screen Dp1. This makes it easier for the operator to intuitively comprehend the position of the detection target Ob1 by looking at the captured image Im100. However, the present disclosure is not limited to this example, and the machine body object Im10 may be arranged at a position other than the apex angle of the captured image Im100, for example, on the lower side of the captured image Im100.
The captured image Im200 by the rear camera 343 is arranged in the excluded range between the first end E101 and the second end E102 in the circumferential direction of the virtual circle C1. That is, the captured image Im200 is arranged above the captured image Im100 in the display screen Dp1. In the present embodiment, as an example, the captured image Im200 has a pentagonal outer shape that includes two sides along the first end E101 and the second end E102. In this way, the captured image Im200 of the small area A13, which is located behind the driving unit 321 and is captured by the rear camera 343, can be displayed by using a dead space in the second region R2.
Furthermore, in the present embodiment, the detection target Ob1 in the monitoring area A1 is the target of detection by the detector 5. Thus, in the case where the detection target Ob1 is present in the monitoring area A1, the detection target Ob1 is naturally captured in the captured image Im100 of the monitoring area A1. Just as described, when the detection result indicates the presence of the detection target Ob1 in the monitoring area A1, at least a part of the detection target Ob1 is captured in the captured image Im100 on the display screen Dp1. This allows the user U1 to check the detection target Ob1 in the captured image Im100. That is, on the display screen Dp1 displayed on the display device 2, the operator (user U1) can check the situations on the lateral side, the rear side, and the like of the work machine 3, which are likely to be blind spots from the driving unit 321. Thus, in the case where the detection target Ob1 is present in the monitoring area A1, the detailed situation of the detection target Ob1 can easily be comprehended on the display screen Dp1.
Meanwhile, when the detection processor 15 is enabled, the valid object X1 is displayed in the superimposed manner on the captured image Im100 on the display screen Dp1. The valid object X1 moves in at least one of the circumferential direction and the radial direction of the virtual circle C1 that is centered on the reference point P1. In the present embodiment, the valid object X1 continuously moves in the circumferential direction of the virtual circle C1.
As illustrated in
Here, as described above, the captured image Im100 has such a fan shape that the excluded range from the first end E101 to the second end E102 as the circumferential part of the virtual circle C1 is cut out. Then, as illustrated in
More specifically, in the circumferential direction of the virtual circle C1, the valid object X1 continuously moves on the captured image Im100 and skips the excluded range (range from the first end E101 to the second end E102) where the captured image Im100 is not provided. In this way, the valid object X1 moves from the first end E101 to the second end E102 in a non-continuous manner. Accordingly, the valid object X1 is not superimposed on the captured image Im200 by the rear camera 343 and is only displayed in the superimposed manner on the captured image Im100. Thus, the user U1 further easily comprehends whether the detection processor 15 is enabled.
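The circumferential motion with the non-continuous skip over the excluded range can be sketched as an angle update; the concrete angles of the first end E101 and the second end E102 and the step size are illustrative assumptions.

```python
# Assumed angles (degrees): the excluded range from E101 to E102 wraps
# through 0 degrees in this example.
EXCLUDED_START = 330.0  # angle of the first end E101
EXCLUDED_END = 30.0     # angle of the second end E102

def next_angle(angle, step=5.0):
    """Advance the valid object continuously in the circumferential
    direction, skipping the excluded range where no image exists."""
    angle = (angle + step) % 360.0
    in_excluded = angle >= EXCLUDED_START or angle < EXCLUDED_END
    if in_excluded:
        # Jump non-continuously from the first end to the second end.
        angle = EXCLUDED_END
    return angle
```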
In the present embodiment, the valid object X1 has transparency. That is, although the valid object X1 is displayed in the superimposed manner on the captured image Im100, a portion of the captured image Im100, on which the valid object X1 is superimposed, is also displayed through the valid object X1. This allows the user U1 to visually recognize the entire captured image Im100 through the valid object X1. Thus, compared to a case where the captured image Im100 is partially blocked by the valid object X1, the visual recognition property of the captured image Im100 is improved.
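The transparency described here is, in effect, per-pixel alpha blending; a minimal sketch follows, where the alpha value is an assumption made for the example.

```python
def blend(image_px, object_px, alpha=0.4):
    """Blend one RGB pixel of the valid object over the captured image.
    With alpha < 1, the captured image remains visible through the object."""
    return tuple(
        round(alpha * o + (1.0 - alpha) * i)
        for i, o in zip(image_px, object_px)
    )
```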
Furthermore, the captured image Im100 is an overhead image. That is, the valid object X1 is displayed in the superimposed manner on the captured image Im100 such that the valid object X1 moves on the captured image Im100 as the overhead image in which the monitoring area A1 is seen from above. Thus, for the user U1, the valid object X1 appears like a radar that scans the monitoring area A1. As a result, the user U1 can easily and intuitively understand that the valid object X1 indicates the validity of the detection processor 15.
In addition, in the control method according to the present embodiment, the display mode of the valid object X1 is preferably changed according to the detection result by the detector 5. Since the display mode of the valid object X1 is changed according to the detection result by the detector 5 in this case, there are at least two display modes: a display mode during “non-detection” in which the detection target Ob1 is not present in the monitoring area A1; and a display mode during “detection” in which the detection target Ob1 is present in the monitoring area A1.
Furthermore, the display mode of the valid object X1 is preferably changed according to the distance from the work machine 3 to the detection target Ob1 and/or the direction of the detection target Ob1 (that is, the position of the detection target Ob1). Accordingly, even during the “detection” time, in which the detection target Ob1 is present in the monitoring area A1, the display mode of the valid object X1 further differs between a case where the distance from the work machine 3 to the detection target Ob1 is long and a case where the distance from the work machine 3 to the detection target Ob1 is short.
The display mode of the valid object X1, which is changed according to the detection result by the detector 5, includes the display color and the like of the valid object X1, for example. In this case, the display color of the valid object X1 on the display screen Dp1 is changed according to the detection result by the detector 5.
For example, as the detection target Ob1 is located closer to the machine body 30, the display color in the display mode of the valid object X1 becomes more conspicuous. As an example, when the detection target Ob1 moves closer to the machine body 30, the display color of the valid object X1 is changed from yellow to red.
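A minimal sketch of this color mapping follows, assuming a 3 m threshold and a green color during non-detection (neither value is specified in the text).

```python
def valid_object_color(distance_m):
    """Return the display color of the valid object X1 from the
    detection result (distance to the closest detection target)."""
    if distance_m is None:   # non-detection: no target in the area
        return "green"
    if distance_m < 3.0:     # target close to the machine body
        return "red"
    return "yellow"          # target detected but farther away
```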
In addition to the above, the display mode of the valid object X1, which is changed according to the detection result by the detector 5, may include a moving speed, a moving range, and/or a moving direction of the valid object X1.
For example, the display mode of the valid object X1 is changed such that, as the detection target Ob1 is located closer to the machine body 30, the moving speed of the valid object X1 is increased.
Just as described, by changing the display mode of the valid object X1 according to the detection result by the detector 5, the user U1 can intuitively comprehend the presence of the detection target Ob1 (for example, the person) in the monitoring area A1 by the display mode of the valid object X1.
Here, there is a case where the plural detection targets Ob1 are simultaneously present in the monitoring area A1. In this case, the display mode of the valid object X1 is preferably determined according to the position of the detection target Ob1, which is the closest to the machine body 30 of the work machine 3 among the plural detection targets Ob1. For example, in the case where the detection target Ob1 that is present at a position within a threshold distance from the work machine 3 and the detection target Ob1 that is present at a position farther than the threshold distance from the work machine 3 are simultaneously detected, the display mode of the valid object X1 is determined on the basis of the detection target Ob1 that is present at the position within the threshold distance from the work machine 3. Just as described, since the display mode of the valid object X1 is determined according to the detection result of the detection target Ob1 with the highest urgency, the operator can easily and intuitively comprehend the detection result.
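Selecting the closest (most urgent) target among plural detections can be sketched as follows, assuming each target is represented by a (distance, direction) tuple.

```python
def select_reference_target(targets):
    """targets: list of (distance_m, direction) tuples.
    The display mode of the valid object is determined from the target
    closest to the machine body; returns None during non-detection."""
    if not targets:
        return None
    return min(targets, key=lambda t: t[0])
```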
Next, a description will be made on an overall flow of processing related to the control method with reference to
As illustrated in
Here, the display processor 11 of the control system 1 determines whether the detection processor 15 is enabled or disabled by the switching processor 13 (S3). If the detection processor 15 is enabled (S3: Yes), the display processor 11 displays the valid object X1, which moves in the circumferential direction of the virtual circle C1, in the superimposed manner on the captured image Im100 on the display screen Dp1 (S4). On the other hand, if the detection processor 15 is disabled (S3: No), the display processor 11 does not display the valid object X1 (S5).
The control system 1 repeatedly executes the processing in above steps S1 to S5. In this way, the display screen Dp1 is displayed on the display device 2 at any time. However, the flowchart illustrated in
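One cycle of steps S1 to S5 can be sketched as follows; the function signature and the dictionary-based screen representation are assumptions made only for the sketch.

```python
def control_cycle(acquire_image, display, detection_enabled):
    """One iteration of the repeatedly executed control flow."""
    image = acquire_image()             # S1: image acquisition processing
    screen = {"image": image}           # S2: build the display screen Dp1
    if detection_enabled:               # S3: detection processor enabled?
        screen["valid_object"] = True   # S4: superimpose the valid object X1
    else:
        screen["valid_object"] = False  # S5: do not display the valid object
    display(screen)
    return screen
```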
Hereinafter, modified examples of the first embodiment will be listed.
The modified examples, which will be described below, can be applied in appropriate combinations.
The control system 1 in the present disclosure includes the computer system. The computer system has, as the main components, the one or more processors and the one or more memories as the hardware. When the processor executes the program that is stored in the memory of the computer system, the function as the control system 1 in the present disclosure is implemented. The program may be recorded in advance in the memory of the computer system, may be provided through an electric communication line, or may be recorded and provided in a non-transitory recording medium, such as a memory card, an optical disc, or a hard disk drive, that can be read by the computer system. Some or all of the functional units included in the control system 1 may be configured by an electronic circuit.
It is not essential that at least some of the functions of the control system 1 be consolidated in a single casing. The components of the control system 1 may be separately provided in plural casings. On the contrary, the functions that are separately provided in the plural devices (for example, the control system 1 and the display device 2) in the first embodiment may be consolidated in a single casing. Furthermore, at least some of the functions of the control system 1 may be implemented on the cloud (by cloud computing) or the like.
The power source of the work machine 3 is not limited to the diesel engine. For example, the power source of the work machine 3 may be an engine other than the diesel engine, a motor (electric motor), or a hybrid power source including the engine and the motor (electric motor).
The display device 2 is not limited to the dedicated device. For example, the display device 2 may be a general-purpose terminal such as a laptop computer, a tablet terminal, or a smartphone. Furthermore, the display unit 23 is not limited to the mode that directly displays the display screen, such as the liquid-crystal display or the organic EL display. For example, the display unit 23 may be configured to display the display screen by projection, such as a projector.
As an information input mode of the operation unit 22, a mode other than the push-button switch, the touch panel, the operation dial, and the like may be employed. For example, the operation unit 22 may employ an input mode using a keyboard or a pointing device such as a mouse, a voice input mode, a gesture input mode, an input mode of an operation signal from another terminal, or the like.
The restriction processing executed by the restriction processor 122 only needs to be the processing to restrict the motion of the work machine 3, and is not limited to the processing to prohibit (disable) the motion (turning motion and the like) of the work machine 3. The restriction processing may be, for example, processing to decelerate the motion (turning motion and the like) of the work machine 3, processing to reduce a motion range (turning angle and the like) of the work machine 3, processing to restrict an allowable area for the motion of the work machine 3, or the like.
It is not essential that the restraint processing includes the plural types of the specific processing (sound output processing, restriction processing, and the like) for restraining the motion of the work machine 3. Furthermore, even in the case where the restraint processing includes the plural types of the specific processing, it is not essential that the enabled state and the disabled state are independently switched per specific processing, and the enabled state and the disabled state may be switchable for the plural types of the specific processing all at once.
The function related to the restraint processing by the restraint processor 12 is not essential in the first place, and thus the restraint processor 12 can be omitted as appropriate.
The detector 5 for detecting the detection target Ob1 in the monitoring area A1 around the work machine 3 may include, in addition to or instead of the sensor 51 and the imaging unit 52, a sensor such as a motion sensor, a sonar sensor, a radar, or a light detection and ranging (LiDAR) sensor.
The detection processor 15 may detect the detection target Ob1 in the monitoring area A1 on the basis of the output (image data) of the left camera 341, the right camera 342, and the rear camera 343, for example.
More specifically, the detection processor 15 subjects the image data acquired by the image acquisition unit 14 to the image processing and thereby extracts a characteristic amount in the image. Then, based on the characteristic amount, the detection processor 15 determines whether the detection target Ob1 (“person” in the present embodiment) is captured in the image. Here, in the case where the detection target Ob1 is captured in the image, the detection processor 15 determines whether the detection target Ob1 is captured in the image captured by the left camera 341, the right camera 342, or the rear camera 343. That is, the detection processor 15 determines whether the detection target Ob1 is present in the small area A11, the image of which is captured by the left camera 341, the small area A12, the image of which is captured by the right camera 342, or the small area A13, the image of which is captured by the rear camera 343, and thereby detects the detection target Ob1.
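Mapping per-camera detection results to the small areas A11 to A13 can be sketched as follows, assuming the image-processing stage has already produced a boolean result per camera; the function and dictionary names are hypothetical.

```python
def locate_target(per_camera_hits):
    """per_camera_hits: dict mapping camera name -> bool
    (True when the detection target is captured in that camera's image).
    Returns the small areas in which the target is determined to be present."""
    area_of = {"left": "A11", "right": "A12", "rear": "A13"}
    return [area_of[cam] for cam, hit in per_camera_hits.items() if hit]
```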
The captured image Im100, which is displayed on the display screen Dp1, is not limited to the overhead image, which is generated by performing a coordinate transformation on the captured images by the left camera 341, the right camera 342, and the rear camera 343 and synthesizing these captured images. The captured image that is displayed on the display screen Dp1 may be an image that is captured by at least one of the left camera 341, the right camera 342, and the rear camera 343 (that is, an image that is not synthesized).
The cameras that capture the captured image Im100 are not limited to the left camera 341, the right camera 342, and the rear camera 343, and may include one, two, four, or more cameras (image sensors). Furthermore, for example, the image may be captured by a camera capable of capturing an image in all directions as seen from the work machine 3, such as a half celestial sphere camera (360-degree camera).
The detection target Ob1 may include, in addition to or instead of the “person”, a moving object such as a vehicle (including another work machine), a structure such as a wall or a pillar, a plant, an animal, a step, a trench, or another obstacle.
It is not essential to reflect the positional relationship of the monitoring area A1 with the work machine 3 in the real space in the positional relationship of the captured image Im100 with the machine body object Im10 on the display screen Dp1. Furthermore, it is not essential to reflect the positional relationship of the monitoring area A1 with the turning unit 32 in the real space in the positional relationship of the captured image Im100 with the machine body object Im10 on the display screen Dp1.
It is not essential that the display mode of the valid object X1 is changed according to the detection result by the detector 5.
As illustrated in
In the present embodiment, the valid object X1 moves in the radial direction of the virtual circle C1 from the reference point P1 side toward the outer circumferential side of the virtual circle C1 (see
Also, in the present embodiment, the valid object X1 is not displayed in the excluded range from the first end E101 to the second end E102 of the captured image Im100. Thus, similarly to the captured image Im100, the valid object X1 has a shape in which the excluded range from the first end E101 to the second end E102 is cut out.
Furthermore, once moving to the outermost end of a movable range, the valid object X1 moves to the innermost end of the movable range. That is, once moving from the reference point P1 side to the outer circumference of the captured image Im100, which is the outermost end of the movable range, the valid object X1 moves to an inner circumference of the captured image Im100, which is the innermost end of the movable range.
More specifically, in the radial direction of the virtual circle C1, the valid object X1 continuously moves on the captured image Im100. Then, when moving to the outermost end, at which the captured image Im100 is not provided, the valid object X1 moves from the outer circumference to the inner circumference of the captured image Im100 in a non-continuous manner. In this way, the valid object X1 repeatedly moves on the captured image Im100, which allows the user U1 to more easily comprehend whether the detection processor 15 is enabled.
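One animation step of this movement can be sketched as follows: the valid object X1 advances outward in the radial direction of the virtual circle C1 and, upon passing the outermost end of the movable range, wraps non-continuously back to the innermost end, while points inside the excluded angular range are not drawn. All numeric values here are assumptions for illustration, not values from the embodiment.

```python
import math

# Illustrative sketch (assumed values): radial animation of the valid
# object X1 with a non-continuous wrap from the outermost end of the
# movable range back to the innermost end, plus the excluded angular
# range (from the first end E101 to the second end E102) in which the
# object is not displayed.

R_INNER, R_OUTER = 20.0, 100.0                      # movable range (assumed)
EXCLUDED = (math.radians(-30), math.radians(30))    # cut-out range (assumed)

def step_radius(r, dr=5.0):
    """Advance outward; past the outermost end, wrap to the innermost end."""
    r += dr
    if r > R_OUTER:
        r = R_INNER
    return r

def is_drawn(theta):
    """The valid object is not displayed within the excluded angular range."""
    return not (EXCLUDED[0] <= theta <= EXCLUDED[1])

r = 95.0
r = step_radius(r)   # reaches the outermost end (100.0)
r = step_radius(r)   # exceeds it, so wraps to the innermost end
print(r)             # -> 20.0
print(is_drawn(math.radians(90)))  # -> True  (outside the cut-out)
print(is_drawn(0.0))               # -> False (inside the cut-out)
```

Calling `step_radius` once per display frame yields the repeated outward sweep described above; the same structure could instead drive movement in the circumferential direction by stepping an angle with a modulo wrap.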
The configuration according to Embodiment 2 can be adopted by being combined with the various configurations (including the variations) described in Embodiment 1 as appropriate.
Hereinafter, the summary of the invention extracted from the above-described embodiments will be additionally described. Note that each of the configurations and processing functions described in the following supplementary notes can be selected and arbitrarily combined.
A work machine control method includes:
The work machine control method according to Supplementary Note 1 further includes:
The work machine control method according to Supplementary Note 1 or 2, in which
The work machine control method according to any one of Supplementary Notes 1 to 3, in which
The work machine control method according to any one of Supplementary Notes 1 to 4, in which
The work machine control method according to any one of Supplementary Notes 1 to 5, in which
The work machine control method according to Supplementary Note 6, in which
The work machine control method according to any one of Supplementary Notes 1 to 5, in which
The work machine control method according to Supplementary Note 8, in which
A work machine control program that causes one or more processors to execute
Number | Date | Country | Kind
---|---|---|---
2023-197872 | Nov 2023 | JP | national