WORK MACHINE CONTROL METHOD, WORK MACHINE CONTROL PROGRAM, WORK MACHINE CONTROL SYSTEM, AND WORK MACHINE

Information

  • Publication Number
    20250163680
  • Date Filed
    October 31, 2024
  • Date Published
    May 22, 2025
Abstract
The work machine control method includes: acquiring a captured image of a monitoring area around a work machine; and displaying a display screen including the captured image on a display device. The work machine control method further includes: in the case where a detection processor for detecting a detection target in a monitoring area is enabled, displaying a valid object, which moves in at least one of a circumferential direction and a radial direction of a virtual circle centered on a reference point set for the captured image, in a superimposed manner on the captured image on the display screen.
Description
CROSS-REFERENCE

This application claims foreign priority of JP2023-197872 filed Nov. 22, 2023, the disclosure of which is hereby incorporated by reference in its entirety.


TECHNICAL FIELD

The present invention relates to a work machine control method, a work machine control program, and a work machine control system that are used for a work machine having a function to detect a detection target in a surrounding monitoring area, and to a work machine.


BACKGROUND ART

As related art, a work machine (shovel) capable of detecting an object present in its surroundings is known (for example, see Patent Document 1). The work machine according to the related art includes a camera attached to an upper turning body, a display device, and an object detector. The object detector detects a predetermined object within a predetermined detection range that is set around the shovel. In this work machine, an image that is displayed on the display device includes: a first region where a camera image is displayed; and a second region where an actuation state of an object detection function is displayed.


PRIOR ART DOCUMENT
Patent Document





    • Patent Document 1: WO2018/008542





SUMMARY OF INVENTION
Technical Problem

In the above related art, a large amount of information is displayed on the display device. Thus, an operator may miss an actuation state of a detection processor (object detection function), or must keep gazing at the display device to avoid missing the actuation state of the detection processor.


An object of the present invention is to provide a work machine control method, a work machine control program, a work machine control system, and a work machine with which an operator can easily and intuitively comprehend an actuation state of a detection processor.


Solution to Problem

A work machine control method according to an aspect of the present invention includes: acquiring a captured image of a monitoring area around a work machine; displaying a display screen including the captured image, on a display device; and, in the case where a detection processor for detecting a detection target in the monitoring area is enabled, displaying a valid object, which moves in at least one of a circumferential direction and a radial direction of a virtual circle centered on a reference point set for the captured image, in a superimposed manner on the captured image on the display screen.


A work machine control program according to another aspect of the present invention is a program for causing one or more processors to execute the work machine control method.


A work machine control system according to another aspect of the present invention includes an image acquisition unit and a display processor. The image acquisition unit acquires a captured image of a monitoring area around a work machine. The display processor displays a display screen including the captured image, on a display device. In the case where a detection processor for detecting a detection target in the monitoring area is enabled, the display processor displays a valid object, which moves in at least one of a circumferential direction and a radial direction of a virtual circle centered on a reference point set for the captured image, in a superimposed manner on the captured image on the display screen.


A work machine according to still another aspect of the present invention includes the work machine control system and a machine body, on which the display device is mounted.


Advantageous Effects of Invention

The present invention can provide the work machine control method, the work machine control program, the work machine control system, and the work machine with which an operator can easily and intuitively comprehend an actuation state of the detection processor.





BRIEF DESCRIPTION OF DRAWINGS


FIG. 1 is a schematic perspective view illustrating an overall configuration of a work machine according to a first embodiment.



FIG. 2 is a schematic view illustrating a hydraulic circuit and the like of the work machine according to the first embodiment.



FIG. 3 is a schematic plan view schematically illustrating a monitoring area that is set around the work machine, and the like when the work machine according to the first embodiment is seen from above.



FIG. 4 is a schematic external view of a display device on which a display screen is displayed by a work machine control system according to the first embodiment.



FIG. 5 is a view illustrating an example of the display screen that is displayed by the work machine control system according to the first embodiment.



FIG. 6 is a view illustrating a display example of a second region in the display screen that is displayed by the work machine control system according to the first embodiment.



FIG. 7 is a view illustrating another display example of the second region in the display screen that is displayed by the work machine control system according to the first embodiment.



FIG. 8 is a view illustrating a motion example of a valid object in the second region in the display screen that is displayed by the work machine control system according to the first embodiment.



FIG. 9 is a flowchart illustrating a motion example of the work machine control system according to the first embodiment.



FIG. 10 is a view illustrating a motion example of a valid object in a second region in a display screen that is displayed by a work machine control system according to a second embodiment.





DESCRIPTION OF EMBODIMENTS

Hereinafter, a description will be made on embodiments of the present invention with reference to the accompanying drawings. Each of the following embodiments is an example that embodies the present invention, and has no intention to limit the technical scope of the present invention.


First Embodiment
[1] Overall Configuration

As illustrated in FIG. 1, a work machine 3 according to the present embodiment has a machine body 30 that includes a travel unit 31, a turning unit 32, and a work unit 33. As illustrated in FIG. 2, the work machine 3 further includes a work machine control system 1 (hereinafter also simply referred to as a “control system 1”). Moreover, the machine body 30 further includes a display device 2, an operation device, and the like.


The term “work machine” in the present disclosure means any of various types of work machines, and examples include work machines such as backhoes (including a hydraulic excavator, a mini excavator, and the like), a wheel loader, and a carrier. The work machine 3 includes a work unit 33 configured to be able to perform one or more types of work at least including hanging work. The work machine 3 is not limited to a “vehicle” and may be a work vessel, a work flying object such as a drone or a multicopter, or the like, for example. Furthermore, the work machine 3 is not limited to a construction machine and may be an agricultural machine such as a rice transplanter, a tractor, or a combine harvester, for example. In the present embodiment, unless otherwise specified, the work machine 3 is the backhoe with a hanging function (with a crane function), and a description will be made on a case where the work machine 3 can perform, as work, excavation work, ground leveling work, trench excavation work, loading work, or the like, in addition to the hanging work, as an example.


In addition, in the present embodiment, for convenience of description, a vertical direction in a usable state of the work machine 3 is defined as an up-down direction D1. Furthermore, in a non-turning state of the turning unit 32, a front-rear direction D2 and a left-right direction D3 are defined with a direction seen from a user (operator) who is seated in (a driving unit 321 of) the work machine 3 being a reference. In other words, each of the directions used in the present embodiment is a direction that is defined with the machine body 30 of the work machine 3 being a reference. A direction in which the machine body 30 moves during forward travel of the work machine 3 is “forward”, and a direction in which the machine body 30 moves during reverse travel of the work machine 3 is “rearward”. Similarly, a direction toward which a front end portion of the machine body 30 moves during a right turn of the work machine 3 is “rightward”, and a direction toward which the front end portion of the machine body 30 moves during a left turn of the work machine 3 is “leftward”.


However, these directions do not intend to limit a use direction (direction during use) of the work machine 3.


The work machine 3 includes an engine as a power source. In the work machine 3, the machine body 30 is driven when a hydraulic pump 41 (see FIG. 2) is driven by the engine, and hydraulic oil is supplied from the hydraulic pump 41 to hydraulic actuators (including a hydraulic motor 43, a hydraulic cylinder 44, and the like) in the units of the machine body 30, for example. In addition, the work machine 3 is controlled when the user (operator) who is seated in the driving unit 321 of the machine body 30 operates an operation lever or the like of the operation device, for example.


In the present embodiment, it is assumed that the work machine 3 is a riding-type backhoe as described above. Thus, the work unit 33 is driven according to an operation of the user (operator) who is seated in the driving unit 321, and performs the work such as the excavation work. The driving unit 321 which the user gets in is provided in the turning unit 32.


The travel unit 31 has a travel function and is configured to be able to travel (including turning) on the ground. For example, the travel unit 31 has a pair of left and right crawlers 311, a blade 312, and the like. The travel unit 31 further has the travel hydraulic motor 43 (hydraulic actuator) and the like for driving the crawlers 311.


The turning unit 32 is located above the travel unit 31 and configured to be turnable about a rotation shaft along the vertical direction. The turning unit 32 has the turning hydraulic motor (hydraulic actuator) and the like. In addition to the driving unit 321, the engine, the hydraulic pump 41, and the like are mounted on the turning unit 32. Furthermore, a boom bracket 322, to which the work unit 33 is attached, is provided in a front end portion of the turning unit 32.


The work unit 33 is configured to be able to perform the work including the hanging work. The work unit 33 is supported by the boom bracket 322 of the turning unit 32 and performs the work. The work unit 33 includes a bucket 331, a boom 332, an arm 333, and the like. The work unit 33 further has the hydraulic actuator (including the hydraulic cylinder 44, the hydraulic motor, and the like) for driving each unit.


The bucket 331 is a type of an attachment (work tool) that is attached to the machine body 30 of the work machine 3, and is any tool that is selected from plural types of attachments according to a work content.


For example, the bucket 331 is detachably attached to the machine body 30, and is replaced according to the work content. In addition to the bucket 331, examples of the attachment for the work machine 3 include various tools such as a breaker, an auger, a crusher, a fork, a fork claw, a steel frame cutter, an asphalt milling machine, a mower, a ripper, a mulcher, a tilt rotator, and a tamper. The work unit 33 drives the bucket 331 with power from a drive device to perform the work.


The boom 332 is rotatably supported by the boom bracket 322 of the turning unit 32. More specifically, the boom 332 is supported by the boom bracket 322 in a manner to be rotatable about a rotation axis along a horizontal direction. The boom 332 has a shape that extends upward from a base end portion, which is supported by the boom bracket 322. The arm 333 is coupled to a distal end of the boom 332. The arm 333 is supported by the boom 332 in a manner to be rotatable about a rotation axis along the horizontal direction. The bucket 331 is attached to a distal end of the arm 333.


The work unit 33 acts when receiving the power from the engine as the power source. More specifically, when the hydraulic pump 41 is driven by the engine, and the hydraulic oil is supplied from the hydraulic pump 41 to the hydraulic actuator (the hydraulic cylinder 44 and the like) of the work unit 33, sections (the bucket 331, the boom 332, and the arm 333) of the work unit 33 act.


In the present embodiment, in particular, the work unit 33 has an articulated structure in which the boom 332 and the arm 333 are independently rotatable. That is, due to the rotation of each of the boom 332 and the arm 333 about the rotation axis along the horizontal direction, for example, the articulated work unit 33 including the boom 332 and the arm 333 can be extended or retracted as a whole.


Similar to the work unit 33, the travel unit 31 and the turning unit 32 also act when receiving the power from the engine as the power source. That is, when the hydraulic oil is supplied from the hydraulic pump 41 to the hydraulic motor 43 of the travel unit 31, the hydraulic motor of the turning unit 32, and the like, the turning unit 32 and the travel unit 31 act.


As described above, the engine functions as the power source that supplies the power to each unit. Here, together with the hydraulic pump 41 and the like, the engine is mounted on the turning unit 32. In the present embodiment, as an example, the engine is a diesel engine. The engine is driven by fuel (light oil herein) supplied from a fuel tank.


Here, the machine body 30 includes various types of sensors (including a camera) for detecting a detection target Ob1 (see FIG. 3) in a monitoring area A1 (see FIG. 3) around the work machine 3, and the various types of the sensors include the camera that captures an image of the surroundings of the machine body 30. As illustrated in FIG. 3, in the present embodiment, as an example, plural (three herein) cameras (imaging devices) including a left camera 341, a right camera 342, and a rear camera 343 are mounted on the turning unit 32 of the machine body 30. Furthermore, as illustrated in FIG. 3, a detector 5 that detects the detection target Ob1 in the monitoring area A1 is mounted on the turning unit 32 of the machine body 30.


The left camera 341, the right camera 342, and the rear camera 343 are connected to the control system 1 and each output a captured image to the control system 1. FIG. 3 is a plan view in which the work machine 3 is seen from above, and schematically illustrates the monitoring area A1, which is set around the work machine 3, the detection target Ob1, and the machine body 30 (including the left camera 341, the right camera 342, and the rear camera 343) of the work machine 3.


The left camera 341, the right camera 342, and the rear camera 343 are respectively installed to be directed leftward, rightward, and rearward with the driving unit 321 being a reference, so as to be able to capture images of the monitoring area A1 on the left side, the right side, and the rear side when seen from the operator who is seated in the driving unit 321 of the turning unit 32. That is, as illustrated in FIG. 3, the monitoring area A1 includes plural (three herein) small areas A11, A12, A13, and the left camera 341 captures an image of the small area A11 (left area) on the left side of these when seen from the operator who is seated in the driving unit 321. Similarly, the right camera 342 captures an image of the small area A12 (right area) on the right side when seen from the operator who is in the driving unit 321, and the rear camera 343 captures an image of the small area A13 (rear area) on the rear side when seen from the operator who is seated in the driving unit 321. In this way, the left camera 341, the right camera 342, and the rear camera 343 can cover the areas on the lateral sides (left side and right side) and the rear side, which tend to be blind spots for the operator.


The detector 5 is connected to the control system 1 and outputs a detection result of the detection target Ob1 in the monitoring area A1 to the control system 1. Here, as an example, the detector 5 is installed to face rearward, so as to be able to detect the detection target Ob1 in the monitoring area A1 with the monitoring area A1 that can be imaged by the left camera 341, the right camera 342, and the rear camera 343 being a target. That is, a detection range of the detection target Ob1 by the detector 5 is the same as the imageable range (monitoring area A1) by the left camera 341, the right camera 342, and the rear camera 343.


In the present embodiment, the detector 5 includes a sensor 51 (see FIG. 2) and an imaging unit 52 (see FIG. 2). The sensor 51 is a three-dimensional sensor that measures a distance to the detection target Ob1 by a Time Of Flight (TOF) method for measuring a distance to a ranging point on the basis of a round-trip time of a radio wave, light, sound, or the like to the ranging point. The sensor 51 is a ranging sensor, such as a millimeter-wave radar, that uses the radio wave as a medium to identify the distance to the detection target Ob1, an azimuth in which the detection target Ob1 is present, and the like. The imaging unit 52 is a camera (including an image sensor and optical elements) that captures an image of a detection region by the sensor 51 (the monitoring area A1). Just as described, in the case where the detection target Ob1 is present in the monitoring area A1, the detector 5, in which the sensor 51 and the imaging unit 52 are combined, can identify a three-dimensional position and attributes (shape, size, color, motion, and the like) of the detection target Ob1.
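

The TOF relation described above reduces to a single formula: the distance to the ranging point is half of the propagation speed multiplied by the measured round-trip time. The following is a minimal sketch of that calculation, assuming a radio-wave (millimeter-wave) medium; the function name is illustrative and not taken from the disclosure.

```python
# Minimal sketch of Time Of Flight (TOF) ranging as described above.
# The distance to a ranging point is half the distance traveled by the
# medium (a radio wave here) during the measured round-trip time.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0  # propagation speed of a radio wave

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance to the ranging point from a measured round-trip time."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# Example: a round trip of about 66.7 ns corresponds to roughly 10 m.
print(tof_distance_m(66.7e-9))  # ~10.0
```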


That is, the detection result by the detector 5 may include presence or absence of the detection target Ob1 in the monitoring area A1, the position of the detection target Ob1 in the monitoring area A1 when the detection target Ob1 is present in the monitoring area A1, the attributes of the detection target Ob1, and the like.


In short, the detector 5 detects the detection target Ob1 in the monitoring area A1 around the work machine 3. The detector 5 determines whether the detection target Ob1 is present in the monitoring area A1 (presence or absence), and outputs a detection result indicating whether the detection target Ob1 is present in the monitoring area A1. In the present embodiment, as an example, the detection target Ob1 is a “person”. That is, in the case where the “person” enters the monitoring area A1 around the work machine 3 as a result of movement of the work machine 3 or movement of the “person” around the work machine 3, the detector 5 detects the “person” as the detection target Ob1. In the case where plural detection targets Ob1 are present in the monitoring area A1, the detector 5 may also detect the number of the detection targets Ob1 (the number of the persons).
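

The detection result enumerated above (presence or absence, positions, attributes, and the number of targets) can be pictured as a simple record. The sketch below is illustrative only; the field names are assumptions, not identifiers from the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class DetectionResult:
    """Illustrative record of the output of the detector 5 (names assumed)."""
    present: bool  # whether any detection target Ob1 is in the area A1
    positions: list[tuple[float, float, float]] = field(default_factory=list)
    # three-dimensional position of each detected target
    attributes: list[dict] = field(default_factory=list)
    # per-target attributes (shape, size, color, motion, and the like)

    @property
    def count(self) -> int:
        """Number of detected targets (e.g., the number of persons)."""
        return len(self.positions)
```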



FIG. 2 schematically illustrates a hydraulic circuit and an electric circuit (electrical connection relationships) of the work machine 3 according to the present embodiment. In FIG. 2, solid lines represent high-pressure oil paths (for the hydraulic oil), dotted lines represent low-pressure oil paths (for pilot oil), and one-dot chain arrows represent electric signal paths.


As illustrated in FIG. 2, the work machine 3 includes a pilot pump 42, a remote control valve 45, a control valve 46, a direction switching valve (control valve) 47, and the like, in addition to the hydraulic pump 41, the hydraulic motor 43 (not illustrated in FIG. 2), and the hydraulic cylinder 44.


The hydraulic oil from the hydraulic pump 41, which is driven by the engine, is supplied to the hydraulic motor 43 of the travel unit 31, the hydraulic motor of the turning unit 32, the hydraulic cylinder 44 of the work unit 33, and the like. Thus, the hydraulic actuators such as the hydraulic motor 43 and the hydraulic cylinder 44 are driven.


The hydraulic actuators such as the hydraulic motor 43 and the hydraulic cylinder 44 are each provided with the pilot-type direction switching valve 47 capable of switching a direction and a flow rate of the hydraulic oil from the hydraulic pump 41. The direction switching valve 47 is driven when supplied with the pilot oil that serves as an input command from the pilot pump 42.


Here, for example, the remote control valve 45 is provided in the pilot oil supply path to the direction switching valve 47 that corresponds to the hydraulic cylinder 44 of the work unit 33. The remote control valve 45 outputs a work operation command of the work unit 33 in response to an operation of the operation lever. The work operation command instructs extending motion, retracting motion, and the like of the work unit 33. The electromagnetic control valve 46 (electromagnetic valve) is inserted between the remote control valve 45 and the pilot pump 42. The control valve 46 is connected to a power source 351 via a cutoff relay 352 and a cutoff switch 353, and acts according to a supply current from the power source 351.


Similarly, a remote control valve is provided in the pilot oil supply path to the direction switching valve that corresponds to the hydraulic motor 43 of the travel unit 31. This remote control valve outputs a travel operation command of the travel unit 31 in response to the operation of the operation lever. The travel operation command instructs travel motion (forward travel, reverse travel, or the like) of the travel unit 31. Furthermore, a remote control valve is also provided in the pilot oil supply path to the direction switching valve that corresponds to the hydraulic motor of the turning unit 32. This remote control valve outputs a turning operation command of the turning unit 32 in response to the operation of the operation lever. The turning operation command instructs turning motion (left turn, right turn, or the like) of the turning unit 32. The electromagnetic control valve 46 (electromagnetic valve) is also inserted between each of these remote control valves and the pilot pump 42. The control valve 46 is connected to the power source 351 via the cutoff relay 352 and the cutoff switch 353, and acts according to a supply current from the power source 351.


In an energized state, that is, in a state where the current is supplied, the control valve 46 opens a pilot oil flow path from the pilot pump 42 to the remote control valve 45. In an unenergized state, that is, in a state where the current supply is cut off, the control valve 46 blocks the pilot oil flow path. Accordingly, when the supply current to the control valve 46 is cut off, the hydraulic actuator that corresponds to the remote control valve 45 can no longer be driven, and the output of the hydraulic actuator is forcibly stopped regardless of the operation of the operation lever.


Here, the cutoff relay 352 is connected to the control system 1 and switched on/off according to a control signal (electric signal) from the control system 1. The cutoff switch 353 is switched on/off according to an operation of a cutoff lever, and is turned on when the cutoff lever is operated downward. Accordingly, when both of the cutoff relay 352 and the cutoff switch 353 are on, the control valve 46 is brought into the energized state, and thus the pilot oil flow path from the pilot pump 42 to the remote control valve 45 is opened. As a result, the hydraulic actuator is driven in response to the operation of the operation lever. Meanwhile, when at least one of the cutoff relay 352 and the cutoff switch 353 is off, the control valve 46 is brought into the unenergized state, and thus the pilot oil flow path is shut off. As a result, the hydraulic actuator can no longer be driven.
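

The energization logic just described amounts to a series (AND) interlock: the control valve 46 is energized, and the pilot oil flow path is therefore open, only while both the cutoff relay 352 and the cutoff switch 353 are on. A minimal sketch of that logic follows, with invented function names.

```python
def control_valve_energized(cutoff_relay_on: bool, cutoff_switch_on: bool) -> bool:
    # The control valve 46 receives current from the power source 351 only
    # when both the cutoff relay 352 and the cutoff switch 353 are on.
    return cutoff_relay_on and cutoff_switch_on

def actuator_drivable(cutoff_relay_on: bool, cutoff_switch_on: bool) -> bool:
    # The pilot oil flow path from the pilot pump 42 to the remote control
    # valve is open only while the control valve 46 is energized; otherwise
    # the hydraulic actuator stops regardless of the operation lever.
    return control_valve_energized(cutoff_relay_on, cutoff_switch_on)
```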


For example, in the off state of at least one of the cutoff relay 352 and the cutoff switch 353, which are connected to the control valve 46 inserted between the pilot pump 42 and the remote control valve corresponding to the hydraulic motor of the turning unit 32, the hydraulic motor of the turning unit 32 cannot be driven. In this state, since the output from the hydraulic actuator (hydraulic motor of the turning unit 32) is forcibly stopped irrespective of the operation of the operation lever, the turning motion of the turning unit 32 is prohibited.


The control system 1 mainly includes a computer system that has one or more processors, such as a central processing unit (CPU), and one or more pieces of memory, such as read only memory (ROM) and random access memory (RAM), and executes various types of processing (information processing). In the present embodiment, the control system 1 is an integrated controller that controls the entire work machine 3, and includes an electronic control unit (ECU), for example. However, the control system 1 may be provided separately from the integrated controller. The control system 1 will be described in detail in “[2] Configuration of Control System”.


The display device 2 is arranged in the driving unit 321 of the machine body 30, and is a user interface that accepts an operation input by the user (operator) and outputs various types of information to the user. For example, the display device 2 outputs an electrical signal that corresponds to the user's operation and thereby accepts the various types of the user's operations. This enables the user (operator) to visually recognize a display screen Dp1 (see FIG. 4) that is displayed on the display device 2 and to operate the display device 2 when necessary.


As illustrated in FIG. 2, the display device 2 includes a control unit 21, an operation unit 22, and a display unit 23. The display device 2 is configured to be communicable with the control system 1 and can exchange data with the control system 1. In the present embodiment, as an example, the display device 2 is a dedicated device used for the work machine 3.


The control unit 21 controls the display device 2 according to the data from the control system 1. More specifically, the control unit 21 outputs an electrical signal that corresponds to the user's operation accepted by the operation unit 22, and displays the display screen Dp1, which is generated by the control system 1, on the display unit 23.


The operation unit 22 is a user interface for accepting an operation input performed by the user (operator) to the display screen Dp1 displayed on the display unit 23. For example, the operation unit 22 outputs an electric signal corresponding to an operation performed by a user U1 (see FIG. 4) and thereby accepts various operations performed by the user U1. In the present embodiment, as an example, as illustrated in FIG. 4, the operation unit 22 includes plural (six herein) mechanical push-button switches 221 to 226. These plural push-button switches 221 to 226 are arranged in the vicinity of (below, in the example of FIG. 4) the display region of the display unit 23 in a manner to follow a peripheral edge of the display region. These plural push-button switches 221 to 226 correspond to items displayed on the display screen Dp1, which will be described below. When any of the plural push-button switches 221 to 226 is operated, the corresponding item on the display screen Dp1 is operated (selected).


The operation unit 22 may include a touch panel, an operation dial, or the like. Also, in this case, by operating the operation unit 22, any of the items on the display screen Dp1 is operated (selected).


The display unit 23 is a user interface, such as a liquid-crystal display or an organic EL display, for presenting various types of information to the user U1 (operator). The display unit 23 displays the various types of the information and thereby presents the information to the user. In the present embodiment, as an example, the display unit 23 is a full-color liquid-crystal display with a backlight, and has a “laterally long” display region that is long in a lateral direction as illustrated in FIG. 4.


On the display screen Dp1, the display device 2 presents the various types of the information to the user U1 (operator) who operates the work machine 3. That is, the user U1 who operates the work machine 3 can visually acquire the various types of the information on the work machine 3 by seeing the display screen Dp1 that is displayed on the display device 2. As an example, when the display device 2 displays information on a running state of the work machine 3, such as a coolant temperature and a hydraulic oil temperature, the user U1 can check, on the display device 2, the information on the running state of the work machine 3 that is required to operate the work machine 3. The display device 2 can also display, on the display screen Dp1, the images of the surroundings of the work machine 3 (images of the monitoring area A1) that are captured by the left camera 341, the right camera 342, and the rear camera 343. This enables the user U1 (operator) who operates the work machine 3 to check situations on the sides, behind, and the like of the work machine 3, which tend to be the blind spots from the driving unit 321, by seeing the display screen Dp1 displayed on the display device 2.


Furthermore, the work machine 3 includes the sound output unit 36 (see FIG. 2) that outputs sound (including voice) to the user U1 (operator). The sound output unit 36 includes a buzzer, a speaker, or the like, and outputs sound when receiving the electric signal. The sound output unit 36 is connected to the control system 1 and outputs the sound, such as a beep or the voice, in response to a sound control signal from the control system 1. In the present embodiment, similar to the display device 2, the sound output unit 36 is provided in the driving unit 321 of the machine body 30. The sound output unit 36 may integrally be provided with the display device 2.


The machine body 30 further includes the operation lever, the cutoff lever, a communication terminal, the fuel tank, a battery, and the like in addition to the above-described components. Moreover, the machine body 30 includes sensors for monitoring a running state of the machine body 30, and examples of such sensors are a coolant temperature sensor, a hydraulic oil temperature sensor, a tachometer that measures an engine speed, and an hour meter that measures a running time. In addition to the above, the machine body 30 further includes sensors for detecting states or the like of the cutoff lever, a starter key switch, and the like.


[2] Configuration of Control System

Next, a description will be made on a configuration of the control system 1 according to the present embodiment with reference to FIG. 2. The control system 1 controls the display device 2 and thereby causes the display device 2 to display the display screen Dp1. In the present embodiment, as described above, the display device 2 is mounted on the machine body 30 of the work machine 3. The control system 1 is a component of the work machine 3 and constitutes the work machine 3 with the machine body 30 and the like. In other words, the work machine 3 according to the present embodiment at least includes the control system 1 and the machine body 30 (including the travel unit 31, the turning unit 32, and the work unit 33), on which the display device 2 is mounted.


The term “screen” such as the display screen Dp1 in the present disclosure means a video (image) displayed on the display device 2 and includes a graphic image, a figure, a photograph, text, a moving image, and the like. Thus, the control system 1 can cause the display device 2 to display, for example, the display screen Dp1 including an icon that indicates the information on the running state of the work machine 3, such as the coolant temperature and the hydraulic oil temperature. Here, in the case where the display screen Dp1 includes the video and the like, the display screen Dp1 is not limited to a still picture and includes a video that changes every moment.


As illustrated in FIG. 2, the control system 1 includes a display processor 11, a restraint processor 12, a switching processor 13, an image acquisition unit 14, and a detection processor 15. In the present embodiment, as an example, the control system 1 has, as the main component, the computer system having the one or more processors. Accordingly, when the one or more processors execute a work machine control program, these plural functional units (the display processor 11 and the like) are realized. These plural functional units included in the control system 1 may be provided separately in plural casings or may be provided in a single casing.


The control system 1 is configured to be communicable with the device that is provided to each part of the machine body 30. That is, at least the display device 2, the detector 5, the sound output unit 36, the cutoff relay 352, the left camera 341, the right camera 342, the rear camera 343, and the like are connected to the control system 1. Accordingly, the control system 1 can control the display device 2, the sound output unit 36, and the like, control the cutoff relay 352 to control the control valve 46, and acquire the detection result by the detector 5 and the images captured by the left camera 341, the right camera 342, the rear camera 343, and the like. Here, the control system 1 may exchange the various types of the information (data) with each of the devices either directly or indirectly via a relay or the like.


The image acquisition unit 14 executes image acquisition processing to acquire the captured image of the monitoring area A1 around the work machine 3. In the present embodiment, the image acquisition unit 14 regularly or irregularly acquires the outputs of the left camera 341, the right camera 342, and the rear camera 343. That is, the image acquisition unit 14 acquires the image data (captured image) of the monitoring area A1 (each of the small areas A11, A12, A13) around the work machine 3. The data acquired by the image acquisition unit 14 is stored in the memory or the like, for example.
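

As a sketch of this image acquisition processing, the class below polls the three cameras and keeps the latest frame for each small area; the camera interface (a capture() method) is an assumption for illustration.

```python
class ImageAcquisitionUnit:
    """Sketch of the image acquisition unit 14 (camera interface assumed)."""

    def __init__(self, left_camera, right_camera, rear_camera):
        self.cameras = {"left": left_camera,    # small area A11
                        "right": right_camera,  # small area A12
                        "rear": rear_camera}    # small area A13
        self.latest = {}  # most recent captured image per small area

    def acquire(self):
        # Regularly or irregularly read one frame from each camera and keep
        # it in memory for the display processor and detection processing.
        for name, camera in self.cameras.items():
            self.latest[name] = camera.capture()  # assumed camera API
        return self.latest
```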


The detection processor 15 executes detection processing to detect the detection target Ob1 in the monitoring area A1. In the present embodiment, since the detector 5 detects the detection target Ob1 in the monitoring area A1, the detection processor 15 detects the detection target Ob1 in the monitoring area A1 by acquiring the detection result by the detector 5 from the detector 5.


The restraint processor 12 executes restraint processing to restrain the motion of the work machine 3 on the basis of the detection result by the detector 5. In the present embodiment, in the case where the detection result by the detector 5 is a result that indicates the presence of the detection target Ob1 (the person herein) in the monitoring area A1, the restraint processor 12 executes the restraint processing. The term “restraint processing” in the present disclosure means processing that relates to the motion of the work machine 3 and suppresses the motion of the work machine 3 in some way. As an example, the restraint processing includes processing to indirectly restrain the motion of the work machine 3 by warning the user U1 (operator) who operates the work machine 3 by the sound or the light (including the display). Furthermore, the restraint processing includes processing to directly restrain the motion of the work machine 3 by controlling the travel unit 31, the turning unit 32, the work unit 33, and the like of the work machine 3.


In the present embodiment, the restraint processor 12 includes a sound output processor 121 and a restriction processor 122.


In the case where the detection target Ob1 is present in the monitoring area A1, the sound output processor 121 controls the sound output unit 36 to output notification sound. That is, in the present embodiment, the restraint processing includes sound output processing to output the notification sound. The notification sound may be a simple beep or may be the voice, such as a message “Please be careful”. Furthermore, the notification sound may be changed according to the detection result by the detector 5 (the distance from the machine body 30 to the detection target Ob1, or the like). In this way, the motion of the work machine 3 can be restrained indirectly by warning the user U1 (operator), who operates the work machine 3, using the notification sound. Thus, there is a high degree of freedom in the operation of the work machine 3. That is, when the user U1 operates the work machine 3 while paying attention to the detection target Ob1, it is possible to continue the motion of the work machine 3 while contact with the detection target Ob1 is avoided.


In the case where the detection target Ob1 is present in the monitoring area A1, the restriction processor 122 turns off the cutoff relay 352 by controlling the cutoff relay 352. Consequently, the control valve 46, which is connected to the power source 351 via the cutoff relay 352, is brought into the unenergized state, and the output of the hydraulic actuator corresponding to the control valve 46 is forcibly stopped. That is, in the present embodiment, the restraint processing includes restriction processing to restrict the motion of the work machine 3. The term “restriction processing” in the present disclosure means processing that relates to the motion of the work machine 3 and restricts the motion of the work machine 3 in some way. Examples of the restriction processing include processing to prohibit the travel motion of the travel unit 31 (disable the travel motion), processing to prohibit the turning motion of the turning unit 32 (disable the turning motion), and processing to prohibit the motion of the work unit 33 (disable the work). In this way, the motion of the work machine 3 can be restricted forcibly without relying on the operation by the user U1 (operator). That is, it is possible to avoid contact of the machine body 30 with the detection target Ob1, which is caused by the motion of the work machine 3.


Here, the restriction processing that is executed by the restriction processor 122 at least includes processing to restrict the turning motion of the turning unit 32. More specifically, the restriction processor 122 is configured to be able to control the cutoff relay 352 that is connected to the control valve 46 corresponding to the hydraulic motor of the turning unit 32 and turn off the cutoff relay 352 when the detection target Ob1 is present in the monitoring area A1. In this way, when the detection target Ob1 is present in the monitoring area A1, the hydraulic motor of the turning unit 32 cannot be driven, and the turning unit 32 is emergently stopped during the turning motion of the turning unit 32, or the turning motion of the turning unit 32 is prohibited when the turning unit 32 is not performing the turning motion. That is, in the present embodiment, the work machine 3 includes the travel unit 31 and the turning unit 32 that can turn with respect to the travel unit 31. The restriction processing at least restricts the turning motion of the turning unit 32. In this way, when the detection target Ob1 is present in the monitoring area A1, which is the blind spot for the user U1 (operator), it is possible to avoid the contact of the machine body 30 with the detection target Ob1, which is caused by turning of the turning unit 32.
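

Combining the sound output processing and the restriction processing described above, the decision flow of the restraint processor can be sketched as follows. The interfaces (play(), turn_off()) and the per-processing enable flags are illustrative assumptions; the flags anticipate the independent switching described below.

```python
class RestraintProcessor:
    """Sketch of the restraint processor 12 (interfaces are assumed)."""

    def __init__(self, sound_output, cutoff_relay):
        self.sound_output = sound_output
        self.cutoff_relay = cutoff_relay
        self.sound_enabled = True        # sound output processing enabled
        self.restriction_enabled = True  # restriction processing enabled

    def on_detection(self, target_present: bool):
        if not target_present:
            return
        if self.sound_enabled:
            # Indirect restraint: warn the operator with notification sound.
            self.sound_output.play("Please be careful")
        if self.restriction_enabled:
            # Direct restraint: de-energize the control valve 46 via the
            # cutoff relay 352, which at least restricts the turning motion.
            self.cutoff_relay.turn_off()
```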


The switching processor 13 switches between an enabled state and a disabled state of the detection processor 15. That is, the switching processor 13 switches between an enabled state and a disabled state of the function related to the detection processing for the detection target Ob1 in the monitoring area A1 by the detection processor 15. In short, the detection processor 15 is not always enabled but can be switched between the enabled state and the disabled state. In the case where the detection processor 15 is enabled, and the detection target Ob1 is present in the monitoring area A1, the detection target Ob1 is detected by the detection processor 15. Thus, the restraint processor 12 executes the restraint processing. Meanwhile, in the case where the detection processor 15 is disabled, the detection target Ob1 is not detected by the detection processor 15 even when the detection target Ob1 is present in the monitoring area A1. Thus, the restraint processor 12 does not execute the restraint processing.


In the present embodiment, depending on whether to actuate the detector 5, the switching processor 13 switches between the enabled state and the disabled state of the detection processor 15. That is, the switching processor 13 actuates the detector 5 when the detection processor 15 is enabled. The switching processor 13 does not actuate the detector 5 when the detection processor 15 is disabled. In this way, when the detection processor 15 is disabled, power consumption by the detector 5 can be suppressed.


However, the configuration of the switching processor 13 is not limited thereto. The switching processor 13 may disable the detection processor 15 by disabling the detection processing by the detection processor 15 (acquisition of the detection result from the detector 5), disabling the acquired detection result, or the like. In this case, when the detection processor 15 is disabled, the detector 5 itself is actuated. However, even in the case where the detection target Ob1 is present in the monitoring area A1, the restraint processor 12 does not execute the restraint processing. In short, the switching processor 13 may disable the detection processor 15 by disabling the processing using the detection result by the detector 5.


In the present embodiment, as an example, the enabled state and the disabled state of the detection processor 15 are switched when the user U1 (operator) operates the display device 2. That is, when the user U1 operates the operation unit 22 of the display device 2 to enable the detection processor 15, the switching processor 13 accepts this operation to enable the detection processor 15. Meanwhile, when the user U1 operates the operation unit 22 of the display device 2 to disable the detection processor 15, the switching processor 13 accepts this operation to disable the detection processor 15.


In the present embodiment, the restraint processing executed by the restraint processor 12 includes the sound output processing executed by the sound output processor 121 and the restriction processing executed by the restriction processor 122. Just as described, the restraint processing includes plural types of specific processing (the sound output processing, the restriction processing, and the like) for restraining the motion of the work machine 3. Here, in the restraint processing, each type of the specific processing can be switched independently between an enabled state and a disabled state. That is, the switching processor 13 can switch between the enabled state and the disabled state of each of the sound output processor 121 and the restriction processor 122 in the restraint processor 12. As an example, the sound output processor 121 can be enabled, and the restriction processor 122 can be disabled.


Alternatively, the sound output processor 121 can be disabled, and the restriction processor 122 can be enabled. In this way, depending on the situation, it is possible to only enable the necessary specific processing, which improves a degree of freedom in the restraint processing.
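

The switching behavior described in this subsection, enabling or disabling the detection processor 15 (here, by actuating or not actuating the detector 5) and independently enabling each type of specific restraint processing, can be sketched as follows; the detector power interface is an assumption.

```python
class SwitchingProcessor:
    """Sketch of the switching processor 13 (interfaces are assumed)."""

    def __init__(self, detector, restraint_processor):
        self.detector = detector
        self.restraint = restraint_processor

    def set_detection_enabled(self, enabled: bool):
        # In the present embodiment, the detector 5 itself is actuated only
        # while the detection processor 15 is enabled, which saves power.
        if enabled:
            self.detector.power_on()
        else:
            self.detector.power_off()

    # Each type of specific restraint processing is switched independently.
    def set_sound_output_enabled(self, enabled: bool):
        self.restraint.sound_enabled = enabled

    def set_restriction_enabled(self, enabled: bool):
        self.restraint.restriction_enabled = enabled
```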


The display processor 11 executes display processing to display at least the display screen Dp1 on the display device 2. More specifically, the display processor 11 generates the display screen Dp1 on the basis of the data acquired by the image acquisition unit 14 and the like, controls the display device 2, and thereby displays the display screen Dp1 on the display unit 23 of the display device 2. Furthermore, the display processor 11 acts in response to the operation accepted by the operation unit 22 of the display device 2. The display processor 11 displays, on the display screen Dp1, the captured image Im100 (see FIG. 5) that is captured by the left camera 341, the right camera 342, and the rear camera 343, for example. That is, the display processor 11 displays the image of the monitoring area A1 around the work machine 3 on the display device 2.


Here, the display processor 11 can display a valid object X1 (see FIG. 5) on the display screen Dp1 in addition to the captured image Im100, and the valid object X1 indicates that the detection processor 15 is enabled. When the detection processor 15 is enabled, the display processor 11 displays the valid object X1 in a superimposed manner on the captured image Im100 in the display screen Dp1. When the detection processor 15 is disabled, the display processor 11 does not display the valid object X1. The term “object” such as the valid object X1 described in the present disclosure includes a symbol, a graphic image, a figure, a photograph, text, a video, and the like that are displayed on the display screen Dp1 as well as a combination of these.


In this way, the display of the valid object X1 on the display screen Dp1 allows the user U1 (operator) to comprehend that the detection processor 15 is enabled.


That is, the user U1 can visually recognize whether the detection processor 15 is enabled or disabled, and can actuate the work machine 3 after recognizing that the detection processor 15 is enabled or disabled.


Note that, in the present embodiment, the detection processor 15 acquires the detection result by the detector 5 on the outside of the control system 1 and executes the detection processing for the detection target Ob1 on the basis of the detection result. However, the detection processor 15 is not limited to this configuration. For example, the detection processor 15 may execute the detection processing for the detection target Ob1 in the monitoring area A1 on the basis of the output of the sensor 51 and/or the imaging unit 52 on the outside of the control system 1.


[3] Work Machine Control Method

Hereinafter, a description will be mainly made on an example of a control method for the work machine 3 (hereinafter simply referred to as a “control method”) that is executed by the control system 1 with reference to FIG. 5 to FIG. 9.


The control method according to the present embodiment is executed by the control system 1 that has the computer system as the main component, and thus, in other words, is embodied by the work machine control program (hereinafter simply referred to as a “control program”). That is, the control program according to the present embodiment is a computer program for causing the one or more processors to execute each processing related to the control method. Such a control program may be executed in cooperation by the control system 1 and the display device 2, for example.


Here, in the case where a particular start operation, which is set in advance to execute the control program, is performed, the control system 1 executes the following various types of processing related to the control method. The start operation is an activation operation of the engine of the work machine 3, or the like, for example. Meanwhile, in the case where a particular termination operation, which is set in advance, is performed, the control system 1 terminates the following various types of processing related to the control method. The termination operation is a stop operation of the engine of the work machine 3, or the like, for example.
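

In other words, the processing of the control method runs only between the start operation and the termination operation, as in the following loop sketch; the engine-state interface is assumed for illustration.

```python
def run_control_method(engine, image_acquisition, detection, display):
    """Sketch of the overall flow gated by the start/termination operations."""
    # The start operation (e.g., engine activation) begins the processing;
    # the termination operation (e.g., engine stop) ends it.
    while engine.is_running():  # assumed engine-state interface
        images = image_acquisition.acquire()  # image acquisition processing
        result = detection.detect(images)     # detection processing (if enabled)
        display.update(images, result)        # display processing
```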


[3.1] Display Screen

Herein, a description will firstly be made on a configuration of the display screen Dp1 that is displayed on the display unit 23 of the display device 2 by the control method according to the present embodiment. In FIG. 5 and the other drawings that each illustrate the display screen Dp1 displayed on the display unit 23 of the display device 2, one-dot chain lines, leading lines, and reference signs representing the regions are merely provided for an illustrative purpose, and are not actually displayed on the display device 2.


The display screen Dp1 illustrated in FIG. 5 is a home screen that is displayed first by the control method.


The home screen is a basic display screen Dp1 that is first displayed on the display device 2 while the work machine 3 is running. The display screen Dp1 can be shifted from the home screen to any of various display screens Dp1 including a menu screen, a crane screen, a mode screen, a PTO screen, and the like according to the operation on the operation unit 22.


As illustrated in FIG. 5, the display screen Dp1 includes a first region R1, a second region R2, a third region R3, a fourth region R4, a fifth region R5, a sixth region R6, a seventh region R7, an eighth region R8, a ninth region R9, and a tenth region R10. In the control method according to the present embodiment, as an example, the captured image Im100 of the monitoring area A1, the valid object X1, and the like are displayed in the second region R2 that occupies most of the display screen Dp1.


Specifically, the display screen Dp1 is vertically (in the up-down direction) divided into four regions. Each of the three regions from the top is further divided into three regions in the lateral direction (left-right direction). Thus, the display screen Dp1 is divided into a total of ten regions. The regions in the second row from the top are the first region R1, the second region R2, and the third region R3 in order from left to right. The region in the lowermost row is the fourth region R4. Furthermore, the regions in the third row from the top are the fifth region R5, the sixth region R6, and the seventh region R7 in order from left to right, and the regions in the uppermost row are the eighth region R8, the ninth region R9, and the tenth region R10 in order from left to right. Of the four regions into which the display screen Dp1 is divided in the vertical direction, the region in the second row from the top (the region including the first region R1, the second region R2, and the third region R3) has the largest vertical size. Among the three regions in each laterally divided row, the middle regions (the second region R2, the sixth region R6, and the ninth region R9) have the largest lateral size.


However, the placement and size of the respective regions are merely examples and can be properly changed. In addition, it is not essential that the regions are clearly divided by boundaries. For example, in the example of FIG. 5, while the second region R2 and the third region R3 are clearly separated by the boundary, no boundary is present between the first region R1 and the second region R2. Of course, the first region R1 and the second region R2 may be clearly divided by the boundary line.


The first region R1 is a rectangular region extending in the vertical direction. In the first region R1, for example, remaining amount information G1 on a remaining amount of the fuel (for example, the light oil) for the engine is displayed. The display processor 11 generates the remaining amount information G1 in the display screen Dp1, for example, on the basis of output (a sensor signal) from a remaining amount sensor.


The second region R2 is a rectangular region that is long in the lateral direction. The captured image Im100 of the monitoring area A1, the valid object X1, and the like are displayed in the second region R2. A machine body object Im10, a captured image Im200 of the rear camera 343, and the like are further displayed in the second region R2.


The captured image Im100 is an overhead image that is generated by subjecting the image captured by the left camera 341, the image captured by the right camera 342, and the image captured by the rear camera 343 to coordinate transformation and synthesis. That is, the captured image Im100 is an image that is acquired by synthesizing an image of the small area A11 on the left side of the driving unit 321 captured by the left camera 341, an image of the small area A12 on the right side of the driving unit 321 captured by the right camera 342, and an image of the small area A13 on the rear side of the driving unit 321 captured by the rear camera 343, and that is displayed in such a form that the monitoring area A1 is seen from above the machine body 30. The display processor 11 synthesizes the captured images acquired by the image acquisition unit 14 and displays the synthesized image in real time.
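

An overhead (bird's-eye) image of this kind is typically produced by warping each camera image onto the ground plane with a calibrated homography and compositing the warped images. The sketch below uses OpenCV for illustration; the homography matrices are placeholders that would come from camera calibration, and the compositing is deliberately simplified.

```python
import numpy as np
import cv2  # OpenCV; the homographies below would come from calibration

def synthesize_overhead(left_img, right_img, rear_img, homographies,
                        out_size=(480, 480)):
    """Warp each camera image onto the ground plane and composite them."""
    canvas = np.zeros((out_size[1], out_size[0], 3), dtype=np.uint8)
    for img, H in zip((left_img, right_img, rear_img), homographies):
        warped = cv2.warpPerspective(img, H, out_size)  # ground-plane view
        mask = warped.any(axis=2)    # pixels covered by this camera
        canvas[mask] = warped[mask]  # simple overwrite compositing
    return canvas
```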


The valid object X1 is the object indicating that the detection processor 15 is enabled, and is displayed in the superimposed manner on the captured image Im100 on the display screen Dp1. The valid object X1 will be described in detail in the section “[3.2] Details”.


The machine body object Im10 is displayed in an 11th region R11 that is set at a center of the second region R2. In the present embodiment, as an example, the machine body object Im10 is a graphic image (icon) that imitates the machine body 30 seen from above. The captured image Im200 is an image of the small area A13 on the rear side of the driving unit 321 that is captured by the rear camera 343.


The third region R3 is a rectangular region that is long in the vertical direction. In the third region R3, a graphic image (icon) Im1 that corresponds to the running state of each of the units of the work machine 3 is displayed. In the third region R3, the plural graphic images Im1 can be displayed, and each of the graphic images Im1 uses a design (pattern) to indicate, for example, the state of the battery, a seatbelt, the coolant temperature, the hydraulic oil temperature, or the like.


Here, each of the graphic images Im1 indicates the running state, depending on a display mode including a display color, a size, or the like, for example. The display processor 11 determines the state of each of the units of the work machine 3 by using the output of the various sensors (including the coolant temperature sensor and the hydraulic oil temperature sensor), each of which detects the running state of each of the units of the work machine 3. When an abnormal value is detected in any portion, the display processor 11 provides warning display by changing the display mode, such as the display color, of the graphic image Im1 of the respective portion, or the like.


The fourth region R4 is a band-shaped region that extends over an entire width of the display screen Dp1. In the fourth region R4, items for operations on the display screen Dp1 are displayed. In FIG. 5, as an example, six items of “MENU”, “CRANE”, “MODE”, “CAMERA”, “PTO”, and “SWITCH” are arranged in this order from left to right in the fourth region R4. These six items are associated with the six push-button switches 221 to 226 in the operation unit 22 that are located directly below the six items. For example, the item “MENU” is associated with the push-button switch 221, and the item “CRANE” is associated with the push-button switch 222. Thus, for example, when the user U1 operates the push-button switch 224 that corresponds to the item “CAMERA” (see FIG. 4), the item “CAMERA” is operated (selected).
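

The association between the six displayed items and the six push-button switches amounts to a fixed lookup from switch to item, as in the sketch below; the handler is illustrative.

```python
# Sketch of the item/switch association in the fourth region R4: each
# push-button switch 221 to 226 selects the item displayed directly above it.
ITEM_FOR_SWITCH = {
    221: "MENU", 222: "CRANE", 223: "MODE",
    224: "CAMERA", 225: "PTO", 226: "SWITCH",
}

def on_switch_pressed(switch_id: int) -> str:
    item = ITEM_FOR_SWITCH[switch_id]
    return item  # e.g., operating the push-button switch 224 selects "CAMERA"
```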


Furthermore, in the present embodiment, in the fourth region R4, one of the items is highlighted in response to the operation of the operation dial (or a cursor key) of the operation unit 22, or the like.


In the example of FIG. 5, the item “MENU” is highlighted, and the highlighted item is switched by the operation of the operation dial (or the cursor key), or the like. While the desired item is highlighted, the user U1 can operate a determination button to select the desired item. Thus, for example, when the determination button is operated after the highlighted item is switched to the item “CAMERA”, the item “CAMERA” is operated (selected). When the operation unit 22 includes the touch panel, the user U1 can perform the operation of touching the desired item on the display screen Dp1 to select the desired item.


In the fifth region R5, a warning display graphic image (icon) is displayed to indicate that the abnormal value has been detected by any of the various sensors (including the coolant temperature sensor and the hydraulic oil temperature sensor). In the sixth region R6, for example, information on the running work unit 33 in the work machine 3 is displayed. In the seventh region R7, for example, the information on the running state of the work machine 3, such as the engine speed, is displayed. In the eighth region R8, for example, current time is displayed. In the ninth region R9, for example, information on the item to which the currently displayed display screen Dp1 belongs is displayed. In the tenth region R10, for example, information on a running time (hour meter) of the work machine 3 is displayed.


[3.2] Details

Next, the control method according to the present embodiment will be described in detail.


The control method according to the present embodiment has: acquiring the captured image Im100 of the monitoring area A1 around the work machine 3 (image acquisition processing); and displaying the display screen Dp1, which includes the captured image Im100, on the display device 2 (display processing). The control method further has, in the display processing: displaying the valid object X1 in the superimposed manner on the captured image Im100 on the display screen Dp1 in the case where the detection processor 15, which detects the detection target in the monitoring area A1, is enabled. The valid object X1 moves in at least one of a circumferential direction and a radial direction of a virtual circle C1 (see FIG. 6) that is centered on a reference point P1 (see FIG. 6) set on the captured image Im100.


In short, since the display screen Dp1 including the captured image Im100 is displayed on the display device 2, the user U1 can check, in the captured image Im100 on the display screen Dp1, the presence or absence of the detection target Ob1 in the monitoring area A1, and the like. Here, in the case where the detection processor 15 is enabled, the valid object X1, which moves in at least one of the circumferential direction and the radial direction of the virtual circle C1 centered on the reference point P1, is displayed in the superimposed manner on the captured image Im100 on the display screen Dp1. Accordingly, the user U1 can comprehend whether the detection processor 15 is enabled or disabled from the valid object X1, which is dynamically displayed at a position that naturally enters the field of view of the user U1 when the user U1 sees the captured image Im100. Thus, the control method according to the present embodiment allows the operator to easily and intuitively comprehend the actuation state of the detection processor 15.


A detailed description will hereinafter be made on the display of the second region R2 in the display screen Dp1 with reference to FIGS. 6 to 8. FIGS. 6 to 8 each illustrate an example of display contents of the second region R2 in the display screen Dp1. In FIGS. 6 to 8, only the area around the second region R2 in the display screen Dp1 is illustrated, and other illustrations are omitted. Furthermore, the reference point P1 and the virtual circle C1, which are illustrated in FIGS. 6 and 7, are a virtual point and a virtual circle, respectively, and are not actually displayed on the display device 2.


As illustrated in FIG. 6, the captured image Im100 is a fan-shaped image that has an arc-shaped contour on the far side from the machine body 30. The term “arc-shaped” in the present disclosure is not limited to an arc constituting a part of a perfect circle, and generally includes a line that is curved outward. Thus, as illustrated in FIG. 6, even in the case where the contour line of the captured image Im100 includes a line that is curved outward as a whole, the captured image Im100 can be regarded as the fan-shaped image having the arc-shaped contour. Here, the captured image Im100 is the fan-shaped image in which an upper portion of the arc on the display screen Dp1 is cut out, and is the image of the monitoring area A1 (the small areas A11, A12, and A13) in which the left portion, the right portion, and the lower portion of the image respectively correspond to the left side, the right side, and the rear side of the machine body 30. That is, on the display screen Dp1, the captured image Im100 is displayed in a state where at least the contours of the portions corresponding to the monitoring area A1 on the left side, the right side, and the rear side of the work machine 3 are formed in the arc shape.


In detail, the captured image Im100 has such a fan shape that, when the virtual circle C1 that is centered on the reference point P1 located near a center of the second region R2 is set, an excluded range from a first end E101 to a second end E102 as a circumferential part of the virtual circle C1 is cut out. In the example of FIG. 6, on the display screen Dp1, the captured image Im100 has: the linear first end E101 that extends upward to the left from the reference point P1; and the linear second end E102 that extends upward to the right from the reference point P1.
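The fan shape can be expressed, for example, as a mask over the display pixels. The following sketch assumes NumPy, measures angles in screen degrees with “up” at 90, and uses an illustrative excluded range of 60 to 120 degrees for the wedge between the first end E101 and the second end E102; all concrete values are assumptions.

    import numpy as np

    def fan_mask(height, width, center, r_max, gap=(60.0, 120.0)):
        """Boolean mask that is True inside the fan-shaped area of the
        captured image Im100; the wedge between gap[0] and gap[1]
        degrees (the excluded range above the reference point) is cut
        out."""
        ys, xs = np.mgrid[0:height, 0:width]
        dx, dy = xs - center[0], ys - center[1]
        r = np.hypot(dx, dy)
        # Screen y grows downward, so negate dy to place "up" at +90 deg.
        ang = np.degrees(np.arctan2(-dy, dx)) % 360.0
        in_gap = (ang > gap[0]) & (ang < gap[1])
        return (r <= r_max) & ~in_gap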


As illustrated in FIG. 6, the machine body object Im10 is arranged in an apex portion of the captured image Im100. That is, the machine body object Im10 is arranged at the position of the reference point P1, which corresponds to the apex angle of the fan-shaped captured image Im100. In other words, the control method according to the present embodiment further includes displaying the machine body object Im10 at the reference point P1 on the display screen Dp1.


In the present embodiment, the upper side of the machine body object Im10 corresponds to the front side of the machine body 30 in a real space, the lower side of the machine body object Im10 corresponds to the rear side of the machine body 30 in the real space, the right side of the machine body object Im10 corresponds to the right side of the machine body 30 in the real space, and the left side of the machine body object Im10 corresponds to the left side of the machine body 30 in the real space. More specifically, the lower portion of the machine body object Im10 is formed in the arc shape that imitates a rear portion (counterweight) of the turning unit 32, and the upper portion of the machine body object Im10 is formed in the shape that imitates a front portion (work unit 33) of the turning unit 32. In the present embodiment, the work unit 33 is arranged at a position that is offset to the right from the center of the turning unit 32 in the left-right direction D3, and the driving unit 321 is arranged at a position that is offset to the left from the center of the turning unit 32 in the left-right direction D3. Thus, in the machine body object Im10, a figure that imitates the work unit 33 is located in an upper right portion, and a figure that imitates the driving unit 321 is located in a left portion.


In the present embodiment, the machine body object Im10 is displayed not with the orientation of the travel unit 31 but with the orientation of the turning unit 32, on which the driving unit 321 is mounted, as a reference. That is, when the turning unit 32 turns relative to the travel unit 31, the direction in the real space that corresponds to the upper side of the machine body object Im10 changes. In this way, the upper side of the machine body object Im10 constantly corresponds to the front side of the turning unit 32, which is the front side as seen by the operator seated in the driving unit 321.


Here, the direction of the machine body 30 represented by the machine body object Im10 matches the direction of the captured image Im100 that is seen from the machine body 30. That is, both the front side of the machine body 30, which is simulated by the machine body object Im10, and the front side in the captured image Im100 are directed upward in the display screen Dp1. This makes it easier for the operator to intuitively comprehend the position of the detection target Ob1 by looking at the captured image Im100. However, the present disclosure is not limited to this example, and the machine body object Im10 may be arranged at a position other than the apex angle of the captured image Im100, for example, on the lower side of the captured image Im100.


The captured image Im200 by the rear camera 343 is arranged in the excluded range between the first end E101 and the second end E102 in the circumferential direction of the virtual circle C1. That is, the captured image Im200 is arranged above the captured image Im100 in the display screen Dp1. In the present embodiment, as an example, the captured image Im200 has a pentagonal outer shape that includes two sides along the first end E101 and the second end E102. In this way, the captured image Im200 of the small area A13, which is located behind the driving unit 321 and is captured by the rear camera 343, can be displayed by using a dead space in the second region R2.


Furthermore, in the present embodiment, the detection target Ob1 in the monitoring area A1 is set as the target to be detected by the detector 5. Thus, in the case where the detection target Ob1 is present in the monitoring area A1, the detection target Ob1 is naturally captured in the captured image Im100 of the monitoring area A1. Just as described, when the detection result indicates the presence of the detection target Ob1 in the monitoring area A1, at least a part of the detection target Ob1 is captured in the captured image Im100 on the display screen Dp1. This allows the user U1 to check the detection target Ob1 in the captured image Im100. That is, on the display screen Dp1 displayed on the display device 2, the operator (user U1) can check the situations on the lateral side, the rear side, and the like of the work machine 3, which are likely to be blind spots from the driving unit 321. Thus, in the case where the detection target Ob1 is present in the monitoring area A1, the detailed situation of the detection target Ob1 can be easily comprehended on the display screen Dp1.


Meanwhile, when the detection processor 15 is enabled, the valid object X1 is displayed in the superimposed manner on the captured image Im100 on the display screen Dp1. The valid object X1 moves in at least one of the circumferential direction and the radial direction of the virtual circle C1 that is centered on the reference point P1. In the present embodiment, the valid object X1 continuously moves in the circumferential direction of the virtual circle C1.


As illustrated in FIGS. 6 and 7, this valid object X1 is an image (object) that is long along a radius of the virtual circle C1 and moves in the circumferential direction of the virtual circle C1. More specifically, as exemplified in FIG. 6, the valid object X1 is an elongated object that extends from the reference point P1 as the center of the virtual circle C1 toward the arc-shaped outer shape (outer circumference) of the captured image Im100. Such a valid object X1 moves about the reference point P1 in the circumferential direction of the virtual circle C1 in a manner to rotate in a clockwise direction (a direction indicated by a broken arrow in FIG. 7). In this way, the valid object X1 is likely to enter the field of view of the user U1, and thus the user U1 can further easily comprehend whether the detection processor 15 is enabled.
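Drawing such an elongated object can be sketched as follows, assuming OpenCV and the same screen-angle convention as above (names are hypothetical):

    import numpy as np
    import cv2

    def draw_sweep_line(img, center, r_max, theta_deg, color=(0, 255, 255)):
        """Draw the valid object X1 as a radial line from the reference
        point (center, an (x, y) pixel tuple) toward the arc-shaped
        outer circumference."""
        end = (int(center[0] + r_max * np.cos(np.radians(theta_deg))),
               int(center[1] - r_max * np.sin(np.radians(theta_deg))))
        cv2.line(img, center, end, color, thickness=3)
        return img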


Here, as described above, the captured image Im100 has such a fan shape that the excluded range from the first end E101 to the second end E102 as the circumferential part of the virtual circle C1 is cut out. Then, as illustrated in FIG. 8, the valid object X1 moves from the first end E101 to the second end E102 while skipping the excluded range. That is, as illustrated in an upper portion of FIG. 8, when the valid object X1 moves in the clockwise direction (the direction indicated by a broken arrow in FIG. 8) and reaches the first end E101 of the captured image Im100, as illustrated in a lower portion of FIG. 8, the valid object X1 moves to the second end E102 of the captured image Im100.


More specifically, in the circumferential direction of the virtual circle C1, the valid object X1 continuously moves on the captured image Im100 and skips the excluded range (the range from the first end E101 to the second end E102) where the captured image Im100 is not provided. In this way, the valid object X1 moves from the first end E101 to the second end E102 in a non-continuous manner. Accordingly, the valid object X1 is not superimposed on the captured image Im200 by the rear camera 343 and is displayed in the superimposed manner only on the captured image Im100. Thus, the user U1 can further easily comprehend whether the detection processor 15 is enabled.
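A per-frame angle update that realizes this skip can be sketched as follows; clockwise motion corresponds to a decreasing screen angle, and the excluded range reuses the illustrative 60-120 degree wedge assumed above:

    def advance_sweep(theta_deg, step_deg=3.0, gap=(60.0, 120.0)):
        """Advance the sweep clockwise; on entering the excluded range
        at the first end (gap[1]), jump to the second end (gap[0])."""
        theta_deg = (theta_deg - step_deg) % 360.0  # clockwise on screen
        if gap[0] < theta_deg < gap[1]:             # inside the cut-out wedge
            theta_deg = gap[0]                      # resume at the second end
        return theta_deg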


In the present embodiment, the valid object X1 has transparency. That is, although the valid object X1 is displayed in the superimposed manner on the captured image Im100, the portion of the captured image Im100 on which the valid object X1 is superimposed is also displayed through the valid object X1. This allows the user U1 to visually recognize the entire captured image Im100 through the valid object X1. Thus, compared to a case where the captured image Im100 is partially blocked by the valid object X1, the visibility of the captured image Im100 is improved.
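Such transparency can be realized by, for example, alpha-blending the overlay onto the captured image (OpenCV assumed; the blend ratio is an arbitrary example):

    import cv2

    def blend_overlay(base_img, overlay_img, alpha=0.4):
        """Composite the overlay with transparency so that the captured
        image remains visible through the valid object."""
        return cv2.addWeighted(overlay_img, alpha, base_img, 1.0 - alpha, 0)

For instance, the sweep line would be drawn on a copy of the captured image and then blended back: blend_overlay(frame, draw_sweep_line(frame.copy(), center, r_max, theta)).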


Furthermore, the captured image Im100 is an overhead image. That is, the valid object X1 is displayed in the superimposed manner on the captured image Im100 such that the valid object X1 moves on the captured image Im100 as the overhead image in which the monitoring area A1 is seen from above. Thus, for the user U1, the valid object X1 appears like a radar that scans the monitoring area A1. As a result, the user U1 can easily and intuitively understand that the valid object X1 indicates the validity of the detection processor 15.


In addition, in the control method according to the present embodiment, the display mode of the valid object X1 is preferably changed according to the detection result by the detector 5. Since the display mode of the valid object X1 is changed according to the detection result by the detector 5 in this case, there are at least two display modes: a display mode during “non-detection” in which the detection target Ob1 is not present in the monitoring area A1; and a display mode during “detection” in which the detection target Ob1 is present in the monitoring area A1.


Furthermore, the display mode of the valid object X1 is preferably changed according to the distance from the work machine 3 to the detection target Ob1 and/or the direction of the detection target Ob1 (that is, the position of the detection target Ob1). Accordingly, even during the “detection” time, in which the detection target Ob1 is present in the monitoring area A1, the display mode of the valid object X1 further differs between a case where the distance from the work machine 3 to the detection target Ob1 is long and a case where the distance from the work machine 3 to the detection target Ob1 is short.


The display mode of the valid object X1, which is changed according to the detection result by the detector 5, includes the display color and the like of the valid object X1, for example. In this case, the display color of the valid object X1 on the display screen Dp1 is changed according to the detection result by the detector 5.


For example, as the detection target Ob1 is located closer to the machine body 30, the display color in the display mode of the valid object X1 becomes more conspicuous. As an example, when the detection target Ob1 moves closer to the machine body 30, the display color of the valid object X1 is changed from yellow to red.
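As a sketch of this color rule (the distance threshold and the BGR color values are hypothetical examples):

    def sweep_color(distance_m):
        """Pick the display color of the valid object X1 from the
        distance to the detected target; None means non-detection."""
        if distance_m is None:
            return (255, 255, 255)  # non-detection: neutral white (BGR)
        if distance_m < 3.0:        # hypothetical "near" threshold
            return (0, 0, 255)      # close target: red (BGR)
        return (0, 255, 255)        # detected but farther: yellow (BGR)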


In addition to the above, the display mode of the valid object X1, which is changed according to the detection result by the detector 5, may include a moving speed, a moving range, and/or a moving direction of the valid object X1.


For example, the display mode of the valid object X1 is changed such that, as the detection target Ob1 is located closer to the machine body 30, the moving speed of the valid object X1 is increased.
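A corresponding speed rule can be sketched in the same manner (the base speed and multipliers are assumptions):

    def sweep_speed(distance_m, base_deg_per_s=90.0):
        """Increase the angular speed of the valid object X1 as the
        detection target approaches the machine body."""
        if distance_m is None:
            return base_deg_per_s  # non-detection: base speed
        return base_deg_per_s * (2.0 if distance_m < 3.0 else 1.5)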


Just as described, by changing the display mode of the valid object X1 according to the detection result by the detector 5, the user U1 can intuitively comprehend the presence of the detection target Ob1 (for example, the person) in the monitoring area A1 by the display mode of the valid object X1.


Here, there is a case where the plural detection targets Ob1 are simultaneously present in the monitoring area A1. In this case, the display mode of the valid object X1 is preferably determined according to the position of the detection target Ob1, which is the closest to the machine body 30 of the work machine 3 among the plural detection targets Ob1. For example, in the case where the detection target Ob1 that is present at a position within a threshold distance from the work machine 3 and the detection target Ob1 that is present at a position farther than the threshold distance from the work machine 3 are simultaneously detected, the display mode of the valid object X1 is determined on the basis of the detection target Ob1 that is present at the position within the threshold distance from the work machine 3. Just as described, since the display mode of the valid object X1 is determined according to the detection result of the detection target Ob1 with the highest urgency, the operator can easily and intuitively comprehend the detection result.
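In code form, this amounts to keying the display mode off the minimum distance among the detected targets (a sketch with hypothetical inputs):

    def display_mode_distance(distances_m):
        """With plural detection targets, follow the one closest to the
        machine body (the highest-urgency result); None if no target."""
        return min(distances_m) if distances_m else None

For example, sweep_color(display_mode_distance([4.2, 1.8])) would yield the near-target color under the thresholds assumed above.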


[3.3] Overall Processing

Next, a description will be made on an overall flow of processing related to the control method with reference to FIG. 9. FIG. 9 is a flowchart illustrating an example of the processing related to the control method.


As illustrated in FIG. 9, the image acquisition unit 14 of the control system 1 acquires the captured images from the left camera 341, the right camera 342, and the rear camera 343 (S1). Then, the display processor 11 of the control system 1 displays the display screen Dp1 including the captured image Im100 on the display device 2 (S2).


Here, the display processor 11 of the control system 1 determines whether the detection processor 15 is enabled or disabled by the switching processor 13 (S3). If the detection processor 15 is enabled (S3: Yes), the display processor 11 displays the valid object X1, which moves in the circumferential direction of the virtual circle C1, in the superimposed manner on the captured image Im100 on the display screen Dp1 (S4). On the other hand, if the detection processor 15 is disabled (S3: No), the display processor 11 does not display the valid object X1 (S5).


The control system 1 repeatedly executes the processing in steps S1 to S5 described above. In this way, the display screen Dp1 is constantly displayed on the display device 2. However, the flowchart illustrated in FIG. 9 is merely one example; processing may be added or omitted as appropriate, or the order of the processing may be changed as appropriate.
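One possible structure for this S1-S5 loop is sketched below; the helper functions are injected and hypothetical, and advance_sweep is the angle update sketched earlier:

    def control_loop(acquire_frames, compose_screen, detection_enabled,
                     draw_valid_object, show_on_display):
        """Repeat S1-S5: acquire images, build the display screen, and
        superimpose the valid object only while detection is enabled."""
        theta = 0.0
        while True:
            frames = acquire_frames()            # S1: left/right/rear cameras
            screen = compose_screen(frames)      # S2: display screen Dp1
            if detection_enabled():              # S3: switching processor
                theta = advance_sweep(theta)     # move the sweep
                screen = draw_valid_object(screen, theta)  # S4: show X1
            # S5: when disabled, the valid object is simply not drawn
            show_on_display(screen)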


[4] MODIFIED EXAMPLES

Hereinafter, modified examples of the first embodiment will be listed.


The modified examples, which will be described below, can be applied in appropriate combinations.


The control system 1 in the present disclosure includes a computer system. The computer system has, as main hardware components, one or more processors and one or more memories. The function of the control system 1 in the present disclosure is implemented when the processor executes the program stored in the memory of the computer system. The program may be recorded in advance in the memory of the computer system, may be provided through an electric communication line, or may be recorded and provided in a non-transitory recording medium, such as a memory card, an optical disc, or a hard disk drive, that can be read by the computer system. Some or all of the functional units included in the control system 1 may be configured by an electronic circuit.


It is not essential that at least some of the functions of the control system 1 be consolidated in a single casing; the components of the control system 1 may be distributed among plural casings. Conversely, the functions that are distributed among plural devices (for example, the control system 1 and the display device 2) in the first embodiment may be consolidated in a single casing. Furthermore, at least some of the functions of the control system 1 may be implemented on the cloud (by cloud computing) or the like.


The power source of the work machine 3 is not limited to the diesel engine. For example, the power source of the work machine 3 may be the engine other than the diesel engine, a motor (electric motor), or a hybrid power source including the engine and the motor (electric motor).


The display device 2 is not limited to the dedicated device. For example, the display device 2 may be a general-purpose terminal such as a laptop computer, a tablet terminal, or a smartphone. Furthermore, the display unit 23 is not limited to the mode that directly displays the display screen, such as the liquid-crystal display or the organic EL display. For example, the display unit 23 may be configured to display the display screen by projection, such as a projector.


As an information input mode of the operation unit 22, a mode other than the push-button switch, the touch panel, the operation dial, and the like may be employed. For example, the operation unit 22 may employ an input mode using a keyboard or a pointing device such as a mouse, a voice input mode, a gesture input mode, an input mode of an operation signal from another terminal, or the like.


The restriction processing executed by the restriction processor 122 only needs to be the processing to restrict the motion of the work machine 3, and is not limited to the processing to prohibit (disable) the motion (turning motion and the like) of the work machine 3. The restriction processing may be, for example, processing to decelerate the motion (turning motion and the like) of the work machine 3, processing to reduce a motion range (turning angle and the like) of the work machine 3, processing to restrict an allowable area for the motion of the work machine 3, or the like.


It is not essential that the restraint processing includes the plural types of the specific processing (sound output processing, restriction processing, and the like) for restraining the motion of the work machine 3. Furthermore, even in the case where the restraint processing includes the plural types of the specific processing, it is not essential that the enabled state and the disabled state are independently switched per specific processing, and the enabled state and the disabled state may be switchable for the plural types of the specific processing all at once.


The function related to the restraint processing by the restraint processor 12 is not essential in the first place, and thus the restraint processor 12 can be omitted appropriately.


The detector 5 for detecting the detection target Ob1 in the monitoring area A1 around the work machine 3 may include, in addition to or instead of the sensor 51 and the imaging unit 52, a sensor such as a motion sensor, a sonar sensor, a radar, or a light detection and ranging (LiDAR) sensor.


The detection processor 15 may detect the detection target Ob1 in the monitoring area A1 on the basis of the output (image data) of the left camera 341, the right camera 342, and the rear camera 343, for example.


More specifically, the detection processor 15 subjects the image data acquired by the image acquisition unit 14 to the image processing and thereby extracts a characteristic amount in the image. Then, based on the characteristic amount, the detection processor 15 determines whether the detection target Ob1 (“person” in the present embodiment) is captured in the image. Here, in the case where the detection target Ob1 is captured in the image, the detection processor 15 determines whether the detection target Ob1 is captured in the image captured by the left camera 341, the right camera 342, or the rear camera 343. That is, the detection processor 15 determines whether the detection target Ob1 is present in the small area A11, the image of which is captured by the left camera 341, the small area A12, the image of which is captured by the right camera 342, or the small area A13, the image of which is captured by the rear camera 343, and thereby detects the detection target Ob1.
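A sketch of this per-camera decision, assuming a hypothetical trained scoring function for the detection target:

    def locate_target(left_img, right_img, rear_img, score_fn, thresh=0.5):
        """Return the small areas (A11/A12/A13) in which the detection
        target is judged to be present; an empty list means
        non-detection. score_fn is a hypothetical classifier that
        returns a confidence score for a single image."""
        areas = {"A11": left_img, "A12": right_img, "A13": rear_img}
        return [name for name, img in areas.items()
                if score_fn(img) >= thresh]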


The captured image Im100, which is displayed on the display screen Dp1, is not limited to the overhead image, which is generated by performing a coordinate transformation on the captured images by the left camera 341, the right camera 342, and the rear camera 343 and synthesizing these captured images. The captured image that is displayed on the display screen Dp1 may be an image that is captured by at least one of the left camera 341, the right camera 342, and the rear camera 343 (that is, an image that is not synthesized).


The cameras that capture the captured image Im100 are not limited to the left camera 341, the right camera 342, and the rear camera 343, and may include one, two, four, or more cameras (image sensors). Furthermore, for example, the image may be captured by a camera capable of capturing an image in all directions as seen from the work machine 3, such as a half celestial sphere camera (360-degree camera).


The detection target Ob1 may include, in addition to or instead of the “person”, a moving object such as a vehicle (including another work machine), a structure such as a wall or a pillar, a plant, an animal, a step, a trench, or another obstacle.


It is not essential to reflect the positional relationship of the monitoring area A1 with the work machine 3 in the real space to the positional relationship of the captured image Im100 with the machine body object Im10 on the display screen Dp1. Furthermore, it is not essential to reflect the positional relationship of the monitoring area A1 with the turning unit 32 in the real space to the positional relationship of the captured image Im100 with the machine body object Im10 on the display screen Dp1.


It is not essential that the display mode of the valid object X1 is changed according to the detection result by the detector 5.


Second Embodiment

As illustrated in FIG. 10, the work machine 3 according to the present embodiment differs from the work machine 3 according to the first embodiment in display contents of the second region R2 in the display screen Dp1. Hereinafter, the same components as those in the first embodiment will be denoted by the same reference signs, and the description thereon will not be made. In FIG. 10, only the second region R2 of the display screen Dp1 is illustrated, and the regions other than the second region R2 are not illustrated.


In the present embodiment, the valid object X1 moves in the radial direction of the virtual circle C1 from the reference point P1 side toward the outer circumferential side of the virtual circle C1 (see FIG. 6). More specifically, as exemplified in FIG. 10, the valid object X1 is an arc-shaped (ring-shaped) object that is centered on the reference point P1 as the center of the virtual circle C1. That is, like a wave pattern centered on the reference point P1, the valid object X1 moves in a manner to expand toward the outer side of the captured image Im100 (in directions indicated by broken arrows in FIG. 10) from a state illustrated in an upper portion of FIG. 10 toward a state illustrated in a lower portion of FIG. 10. Such a valid object X1 moves about the reference point P1 toward the outer circumferential side of the virtual circle C1 (that is, a direction away from the reference point P1) in the radial direction of the virtual circle C1. In this way, the valid object X1 is likely to enter the field of view of the user U1, and thus the user U1 can further easily comprehend whether the detection processor 15 is enabled.


Also, in the present embodiment, the valid object X1 is not displayed in the excluded range from the first end E101 to the second end E102 of the captured image Im100. Thus, similar to the captured image Im100, the valid object X1 has a shape in which the excluded range from the first end E101 to the second end E102 is cut out.


Furthermore, once moving to the outermost end of a movable range, the valid object X1 moves to the innermost end of the movable range. That is, once moving from the reference point P1 side to the outer circumference of the captured image Im100, which is the outermost end of the movable range, the valid object X1 moves to an inner circumference of the captured image Im100, which is the innermost end of the movable range.


More specifically, in the radial direction of the virtual circle C1, the valid object X1 continuously moves on the captured image Im100. Then, upon moving to the outermost end, beyond which the captured image Im100 is not provided, the valid object X1 moves from the outer circumference to the inner circumference of the captured image Im100 in a non-continuous manner. In this way, the valid object X1 repeatedly moves on the captured image Im100, which allows the user U1 to further easily comprehend whether the detection processor 15 is enabled.
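The per-frame radius update for this wave-like motion can be sketched as follows (the step and range values, in pixels, are arbitrary examples):

    def advance_ring(radius_px, step_px=4.0, r_min=10.0, r_max=200.0):
        """Expand the ring outward; past the outermost end of the
        movable range, jump back to the innermost end (the
        non-continuous move described above)."""
        radius_px += step_px
        if radius_px > r_max:   # passed the outer circumference
            radius_px = r_min   # restart from the inner circumference
        return radius_px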


The configuration according to the second embodiment can be adopted in appropriate combination with the various configurations (including the modified examples) described in the first embodiment.


Supplementary Note of the Invention

Hereinafter, the summary of the invention extracted from the above-described embodiments will be additionally described. Note that each of the configurations and processing functions described in the following supplementary notes can be selected and arbitrarily combined.


<Supplementary Note 1>

A work machine control method includes:

    • acquiring the captured image of the monitoring area around the work machine;
    • displaying the display screen including the captured image on the display device; and
    • in the case where the detection processor for detecting the detection target in the monitoring area is enabled, displaying the valid object, which moves in at least one of the circumferential direction and the radial direction of the virtual circle centered on the reference point set for the captured image, in the superimposed manner on the captured image on the display screen.


<Supplementary Note 2>

The work machine control method according to Supplementary Note 1 further includes:

    • displaying the machine body object at the reference point on the display screen.


<Supplementary Note 3>

The work machine control method according to Supplementary Note 1 or 2, in which

    • the display mode of the valid object is changed according to the detection result by the detection processor.


<Supplementary Note 4>

The work machine control method according to any one of Supplementary Notes 1 to 3, in which

    • the valid object has transparency.


<Supplementary Note 5>

The work machine control method according to any one of Supplementary Notes 1 to 4, in which

    • the captured image is the overhead image.


<Supplementary Note 6>

The work machine control method according to any one of Supplementary Notes 1 to 5, in which

    • the valid object is long along the radius of the virtual circle and moves in the circumferential direction of the virtual circle.


<Supplementary Note 7>

The work machine control method according to Supplementary Note 6, in which

    • the captured image has the shape in which the excluded range from the first end to the second end, which is the circumferential part of the virtual circle, is cut out, and
    • the valid object skips the excluded range and moves from the first end to the second end.


<Supplementary Note 8>

The work machine control method according to any one of Supplementary Notes 1 to 5, in which

    • the valid object moves in the radial direction of the virtual circle from the reference point side toward the outer circumferential side of the virtual circle.


<Supplementary Note 9>

The work machine control method according to Supplementary Note 8, in which

    • when moving to the outermost end of the movable range, the valid object moves to the innermost end of the movable range.


<Supplementary Note 10>

A work machine control program that causes one or more processors to execute

    • the work machine control method according to any one of Supplementary Notes 1 to 9.


REFERENCE SIGNS LIST

    • 1: Work machine control system
    • 2: Display device
    • 3: Work machine
    • 5: Detector
    • 11: Display processor
    • 14: Image acquisition unit
    • 30: Machine body
    • A1: Monitoring area
    • C1: Virtual circle
    • E101: First end
    • E102: Second end
    • Dp1: Display screen
    • Im10: Machine body object
    • Im100: Captured image
    • Ob1: Detection target
    • P1: Reference point
    • X1: Valid object

Claims
  • 1. A work machine control method comprising: acquiring a captured image of a monitoring area around a work machine; displaying a display screen including the captured image, on a display device; and in the case where a detection processor for detecting a detection target in the monitoring area is enabled, displaying a valid object, which moves in at least one of a circumferential direction and a radial direction of a virtual circle centered on a reference point set for the captured image, in a superimposed manner on the captured image on the display screen.
  • 2. The work machine control method according to claim 1, further comprising: displaying a machine body object at the reference point on the display screen.
  • 3. The work machine control method according to claim 1, wherein a display mode of the valid object is changed according to a detection result by the detection processor.
  • 4. The work machine control method according to claim 1, wherein the valid object has transparency.
  • 5. The work machine control method according to claim 1, wherein the captured image is an overhead image.
  • 6. The work machine control method according to claim 1, wherein the valid object is long along a radius of the virtual circle and moves in the circumferential direction of the virtual circle.
  • 7. The work machine control method according to claim 6, wherein the captured image has a shape in which an excluded range from a first end to a second end, which is a circumferential part of the virtual circle, is cut out, and the valid object skips the excluded range and moves from the first end to the second end.
  • 8. The work machine control method according to claim 1, wherein the valid object moves in the radial direction of the virtual circle from the reference point side toward an outer circumferential side of the virtual circle.
  • 9. The work machine control method according to claim 8, wherein when moving to an outermost end of a movable range, the valid object moves to an innermost end of the movable range.
  • 10. A work machine control program for causing one or more processors to execute the work machine control method according to claim 1.
  • 11. A work machine control system comprising: an image acquisition unit that acquires a captured image of a monitoring area around a work machine; and a display processor that displays a display screen including the captured image, on a display device, wherein in the case where a detection processor for detecting a detection target in the monitoring area is enabled, the display processor displays a valid object, which moves in at least one of a circumferential direction and a radial direction of a virtual circle centered on a reference point set for the captured image, in a superimposed manner on the captured image on the display screen.
  • 12. A work machine comprising: the work machine control system according to claim 11; and a machine body on which the display device is mounted.
Priority Claims (1)
    • Number: 2023-197872; Date: Nov 2023; Country: JP; Kind: national