WORK ASSISTANCE SERVER AND WORK ASSISTANCE METHOD

Information

  • Patent Application
  • Publication Number
    20230092296
  • Date Filed
    January 25, 2021
  • Date Published
    March 23, 2023
Abstract
A work assistance server 10 according to the present invention includes a first assistance processing element 121 that generates a surroundings image including a region around a work machine 40 when a request to start work by the work machine 40 is made, and a second assistance processing element 122 that permits the work by the work machine 40 on condition that the surroundings image is displayed on an image output device 221 of a remote operation apparatus 20.
Description
TECHNICAL FIELD

The present invention relates to a work assistance server that assists remote operation of a work machine and a work assistance method for assisting the work of the work machine.


BACKGROUND ART

Conventionally, a plurality of fixed point cameras are installed at a work site where a work machine such as a remotely operated shovel works, enabling the entire site to be overlooked and the motion or work of a specific shovel to be monitored.


To confirm a situation around a work machine, a method of mounting a plurality of cameras on the work machine to pick up images has been known.


For example, in a shovel in Patent Literature 1, described below, a left camera is attached to a stay on the left front side of a cabin, and a back camera is attached to an upper portion of a counter weight. A right camera is attached to a handrail on the right front side of an upper turning body.


The left camera and the right camera are attached such that a side surface of the upper turning body is included in their respective image pickup ranges. The back camera is attached such that a rear end of an upper surface of the counter weight is included in its image pickup range. An operator of the shovel can easily recognize a positional relationship with an object existing therearound using the cameras (Patent Literature 1/paragraphs 0015 to 0019, FIG. 2).


CITATION LIST
Patent Literature



  • Patent Literature 1: International Publication No. WO 2017/115808



SUMMARY OF INVENTION
Technical Problem

However, when a work machine is remotely operated, the operator (worker) is at a position spaced apart from the work machine. Accordingly, when performing the work, the worker desires to be able to efficiently confirm a video image from a camera that picks up an image of the surroundings of the work machine to be remotely operated.


The present invention has been made in view of the above-described points, and is directed to providing a work assistance server that enables a worker to efficiently perceive a situation around a work machine.


Solution to Problem

According to a first aspect of the invention, a work assistance server that assists in work by a work machine to be operated such that the work machine to be operated can be remotely operated in response to an operation performed on a remote operation apparatus including a display device comprises a first assistance processing element that generates a surroundings image including a region around the work machine when a request to start the work by the work machine is made, and a second assistance processing element that permits the work by the work machine on condition that the surroundings image is displayed on the display device.


With the work assistance server according to the present invention, a picked-up image (a still image or a moving image) obtained by picking up an image of the region around the work machine is sent to the display device, and a worker can confirm the situation from the image. As a result, the work assistance server assists the worker in the work by the work machine.


The first assistance processing element generates the surroundings image including the region around the work machine when the request to start the work by the work machine is made, and displays the surroundings image on the display device of the remote operation apparatus. The second assistance processing element permits the work by the work machine on condition that the surroundings image is displayed on the display device. As a result, an operator overlooks the region around the work machine before starting the work by the work machine. The work assistance server can permit the remote operation apparatus to operate the work machine after the surroundings image to be confirmed by the operator is reliably displayed on the display device of the remote operation apparatus.


According to a second aspect of the invention, a work assistance method for assisting in work by a work machine to be operated such that the work machine to be operated can be remotely operated in response to an operation performed on a remote operation apparatus including a display device comprises a first step of generating a surroundings image including a region around the work machine when a request to start the work by the work machine is made, and a second step of permitting the work by the work machine on condition that the surroundings image is displayed on the display device.


In the work assistance method according to the present invention, in the first step, the surroundings image including the region around the work machine is generated when the request to start the work by the work machine is made, and the surroundings image is displayed on the display device of the remote operation apparatus. In the second step, the work by the work machine is permitted on condition that the surroundings image is displayed on the display device. As a result, the operator overlooks the region around the work machine before starting the work by the work machine.


In the work assistance method, the remote operation apparatus can be permitted to operate the work machine after the surroundings image to be confirmed by the operator is reliably displayed on the display device of the remote operation apparatus.





BRIEF DESCRIPTION OF DRAWINGS


FIG. 1 is a diagram illustrating an outline of a work assistance system according to an embodiment of the present invention.



FIG. 2 is a diagram illustrating a configuration of an operation mechanism in a work machine.



FIG. 3A is a diagram (1) for describing details of the work machine.



FIG. 3B is a diagram (2) for describing details of the work machine.



FIG. 4A is a flowchart (1) of processing to be performed by each of a work assistance server, a remote operation apparatus, and the work machine.



FIG. 4B is a flowchart (2) of processing to be performed by each of the work assistance server, the remote operation apparatus, and the work machine.



FIG. 5 illustrates an example of an image (a surroundings image 1) to be displayed on an image output device.



FIG. 6 is a diagram (1) for describing means for acquiring a surroundings image of the work machine.



FIG. 7 illustrates an example of an image (a surroundings image 2) to be displayed on the image output device.



FIG. 8 is a diagram (2) for describing means for acquiring a surroundings image of the work machine.



FIG. 9 illustrates an example of an image (a surroundings image 3) to be displayed on the image output device.



FIG. 10 is a diagram illustrating a work environment image to be displayed on the image output device.





DESCRIPTION OF EMBODIMENT

Details of a work assistance server according to the present invention will be described below with reference to the drawings.


First, a work assistance server 10 according to an embodiment of the present invention and a work assistance system 1 including the work assistance server 10 will be described with reference to FIG. 1.


The work assistance system 1 for a construction machine according to the present embodiment is a system configured such that a plurality of work machines 40 assigned as operation targets to an operator OP or to a remote operation apparatus 20 can be selectively and remotely operated by the operator OP operating the remote operation apparatus 20. A work site where the plurality of work machines 40 to be operated by the remote operation apparatus 20 are arranged may be one work site or any one of a plurality of work sites.


The work assistance system 1 comprises at least the work assistance server 10 and the remote operation apparatus 20 configured to remotely operate the work machine 40 in addition to the work machine 40. The work assistance server 10, the remote operation apparatus 20, and the work machine 40 are configured to be communicable with one another by a network NW including a wireless communication network. The work assistance server 10 automatically selects an appropriate image pickup device or picked-up image in response to switching, movement, or a work content, for example, of the work machine 40. The work assistance server 10 outputs the picked-up image to an image output device 221 (a “display device” in the present invention), described below, in order to assist an operator OP of the remote operation apparatus 20 in performing an operation.


The work assistance server 10 comprises a database 102, a first assistance processing element 121, a second assistance processing element 122, and server wireless communication equipment 125. The database 102 stores and holds a picked-up image picked up by an actual machine image pickup device 412, a surroundings image pickup device 413, or the like, described below. The database 102 may be constituted by a database server separate from the work assistance server 10.


Each of the assistance processing elements 121 and 122 is constituted by an arithmetic processing unit (a single core processor or a multi-core processor or a processor core constituting them). Each of the assistance processing elements 121 and 122 reads required data and software from a storage device such as a memory and performs arithmetic processing conforming to the software with the data used as a target. The server wireless communication equipment 125 issues an instruction to perform display on the image output device 221 via the network NW, and receives the picked-up image from the image pickup device.
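

By way of illustration only, the server-side structure described above can be pictured with the following minimal Python sketch, in which the database 102 and the assistance processing elements 121 and 122 are modeled as plain objects held by the server. All class, attribute, and method names are hypothetical assumptions of this sketch and are not defined in the present disclosure.

class Database:
    """Stands in for the database 102: stores picked-up images and machine positions."""
    def __init__(self):
        self.picked_up_images = {}   # work machine identifier -> latest images
        self.machine_positions = {}  # work machine identifier -> detected position

class FirstAssistanceProcessingElement:
    """Generates a surroundings image when a request to start the work is made."""
    def generate_surroundings_image(self, picked_up_images):
        # Placeholder: the concrete synthesis is described below for surroundings images 1 to 3.
        return {"kind": "surroundings_image", "sources": list(picked_up_images)}

class SecondAssistanceProcessingElement:
    """Permits the work on condition that the surroundings image is displayed."""
    def permit_work(self, surroundings_image_displayed: bool) -> bool:
        return surroundings_image_displayed

class WorkAssistanceServer:
    """Holds the database 102 and the two assistance processing elements 121 and 122."""
    def __init__(self):
        self.database = Database()
        self.first_element = FirstAssistanceProcessingElement()
        self.second_element = SecondAssistanceProcessingElement()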


Then, the remote operation apparatus 20 comprises a remote control device 200, a remote input interface 210, and a remote output interface 220. The remote control device 200 is constituted by an arithmetic processing unit (a single core processor or a multi-core processor or a processor core constituting them). The remote control device 200 reads required data and software from a storage device such as a memory and performs arithmetic processing conforming to the software with the data used as a target.


The remote input interface 210 comprises a remote operation mechanism 211. The remote output interface 220 comprises the image output device 221, a worker image pickup device 222, and remote wireless communication equipment 223.


The worker image pickup device 222 is a camera attached to the remote operation apparatus 20, and picks up an image of at least an operation seat (a seat St).


The remote wireless communication equipment 223 transmits an operation signal to the work machine 40 via the network NW, and receives a picked-up image from the actual machine image pickup device 412, the surroundings image pickup device 413, or the like.


The remote operation mechanism 211 comprises a traveling operation device, a turning operation device, a boom operation device, an arm operation device, and a bucket operation device. Each of the operation devices has an operation lever that receives a rotation operation. The operation lever (traveling lever) of the traveling operation device is operated to operate a lower traveling body 427 in the work machine 40 (see FIGS. 3A and 3B). The traveling lever may also serve as a traveling pedal. For example, a traveling pedal fixed to a base portion or a lower end portion of the traveling lever may be provided.


The operation lever (turning lever) of the turning operation device is used to operate a hydraulic turning motor constituting a turning mechanism 430 in the work machine 40. The operation lever (boom lever) of the boom operation device is used to operate a boom cylinder 442 in the work machine 40 (see FIGS. 3A and 3B).


The operation lever (arm lever) of the arm operation device is used to operate an arm cylinder 444 in the work machine 40. The operation lever (bucket lever) of the bucket operation device is used to operate a bucket cylinder 446 in the work machine 40 (see FIGS. 3A and 3B).


An example of the operation levers constituting the remote operation mechanism 211 and the seat St on which the operator OP sits is illustrated in FIG. 2. The seat St has a form such as a high back chair with an armrest. The seat St may have any form in which the operator can sit, for example, a form like a low back chair with no headrest or a form like a chair with no backrest.


A pair of left and right traveling levers 2110 respectively corresponding to left and right crawlers are arranged laterally side by side in front of the seat St. One operation lever may also serve as a plurality of operation levers. For example, a right-side operation lever 2111 provided in front of a right-side frame of the seat St may function as a boom lever when operated in a front-rear direction and function as a bucket lever when operated in a left-right direction.


Similarly, a left-side operation lever 2112 provided in front of a left-side frame of the seat St may function as an arm lever when operated in the front-rear direction and function as a turning lever when operated in the left-right direction. A lever pattern may be arbitrarily changed in response to an operation instruction from the operator OP.
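

The lever assignment described above can be summarized, by way of illustration only, as a lookup table that is swapped when the lever pattern is changed. The pattern name and the data structure below are assumptions of this sketch.

# Hypothetical mapping from (lever, operating direction) to the operated device.
# "Pattern A" reflects the example given above; other patterns may reassign functions.
LEVER_PATTERNS = {
    "A": {
        ("right_lever", "front_rear"): "boom",
        ("right_lever", "left_right"): "bucket",
        ("left_lever", "front_rear"): "arm",
        ("left_lever", "left_right"): "turning",
    },
    # Further patterns could be added and selected in response to an operation
    # instruction from the operator OP.
}

def operated_device(pattern: str, lever: str, direction: str) -> str:
    """Return which device a lever operation drives under the selected lever pattern."""
    return LEVER_PATTERNS[pattern][(lever, direction)]

# Example: under pattern A, the right-side operation lever 2111 moved in the
# front-rear direction functions as a boom lever.
assert operated_device("A", "right_lever", "front_rear") == "boom"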


The image output device 221 comprises a central image output device 2210, a left-side image output device 2211, and a right-side image output device 2212 respectively having substantially rectangular screens arranged in front of, diagonally leftward in front of, and diagonally rightward in front of the seat St, as illustrated in FIG. 2. Respective shapes and sizes of the screens (image display regions) of the central image output device 2210, the left-side image output device 2211, and the right-side image output device 2212 may be the same as or different from one another.


A right edge of the left-side image output device 2211 is adjacent to a left edge of the central image output device 2210 such that the screen of the central image output device 2210 and the screen of the left-side image output device 2211 form an inclined angle θ1 (e.g., 120°≤θ1≤150°). Similarly, a left edge of the right-side image output device 2212 is adjacent to a right edge of the central image output device 2210 such that the screen of the central image output device 2210 and the screen of the right-side image output device 2212 form an inclined angle θ2 (e.g., 120°≤θ2≤150°). The inclined angles θ1 and θ2 may be the same as or different from each other.


The respective screens of the central image output device 2210, the left-side image output device 2211, and the right-side image output device 2212 may be parallel to one another in a vertical direction, or may be inclined in the vertical direction. At least one of the central image output device 2210, the left-side image output device 2211, and the right-side image output device 2212 may be constituted by a plurality of separated image output devices. For example, the central image output device 2210 may be constituted by a pair of image output devices, which are vertically adjacent to each other, each having a substantially rectangular screen. Each of the image output devices 2210 to 2212 may further comprise a speaker (voice output device).


Then, the work machine 40 comprises an actual machine control device 400, an actual machine input interface 410, an actual machine output interface 420, and an actuation mechanism (work attachment) 440. The actual machine control device 400 is constituted by an arithmetic processing unit (a single core processor or a multi-core processor or a processor core constituting them). The actual machine control device 400 reads required data and software from a storage device such as a memory and performs arithmetic processing conforming to the software with the data used as a target.



FIGS. 3A and 3B illustrate an example of the work machine 40 according to the present embodiment. The work machine 40 is a crawler shovel (construction machine), for example, and comprises a machine body 450 constituted by the crawler type lower traveling body 427 and an upper turning body 435 turnably mounted on the lower traveling body 427 via the turning mechanism 430, and the actuation mechanism 440. A front left side portion of the upper turning body 435 is provided with a cab (operation room) 425, and a front central portion of the upper turning body 435 is provided with the actuation mechanism 440. A machine housing portion 436 that houses machinery such as an engine, and a counter weight 437 arranged behind the machine housing portion 436, are disposed behind the cab (operation room) 425.


Referring to FIG. 1 again, the actual machine input interface 410 comprises an actual machine operation mechanism 411, the actual machine image pickup device 412, and the surroundings image pickup device 413. The actual machine operation mechanism 411 comprises a plurality of operation levers arranged similarly to the remote operation mechanism 211 around a seat arranged in the cab 425. The cab 425 is provided with a driving mechanism or a robot that receives a signal corresponding to an operation mode of the remote operation lever and moves the actual machine operation levers based on the received signal.


As illustrated in FIGS. 3A and 3B, the actual machine image pickup device 412 is installed in the cab 425, for example. The actual machine image pickup device 412 picks up an image of a region ahead of the work machine 40 through a front window of the cab 425.


As illustrated in FIGS. 3A and 3B, the surroundings image pickup device 413 is installed in a front lower portion of the cab 425, for example. The surroundings image pickup device 413 comprises a front camera 413A that picks up an image of the front of the work machine 40, a right camera 413B that is installed on the machine body 450 and picks up an image of the right of the work machine 40, a left camera 413D that is installed on the machine body 450 and picks up an image of the left of the work machine 40, and a rear camera 413C that is installed on the machine body 450 and picks up an image of the rear of the work machine 40. Respective image pickup ranges of the cameras 413A to 413D are set to overlap one another, so that an omnidirectional (360°) image of a region around the work machine 40 can be picked up.


The actual machine output interface 420 comprises actual machine wireless communication equipment 422. The actual machine wireless communication equipment 422 receives a signal corresponding to the operation mode of the remote operation lever from the remote wireless communication equipment 223 in the remote operation apparatus 20 via the network NW. The signal is further transmitted to the actual machine control device 400, and the work machine 40 operates in response to the signal. The cab 425 is provided with a driving mechanism or a robot that actuates the actual machine operation levers based on the signal.


The actuation mechanism 440 comprises a boom 441 mounted on the upper turning body 435, an arm 443 rotatably connected to a distal end portion of the boom 441, and a bucket 445 rotatably connected to a distal end portion of the arm 443. The boom cylinder 442, the arm cylinder 444, and the bucket cylinder 446 each constituted by a stretchable hydraulic cylinder are mounted on the actuation mechanism 440.


A positioning device 460 is a device that detects a position of the work machine 40. The positioning device 460 is constituted by a GNSS receiver (GNSS: Global Navigation Satellite System), for example. The position of the work machine 40 detected by the positioning device 460 is transmitted to the work assistance server 10 via the actual machine wireless communication equipment 422, and is stored in the database 102.


The boom cylinder 442 is interposed between the boom 441 and the upper turning body 435, and expands and contracts upon being supplied with hydraulic oil to rotate the boom 441 in a rise and fall direction. The arm cylinder 444 is interposed between the arm 443 and the boom 441, and expands and contracts upon being supplied with hydraulic oil to rotate the arm 443 around a horizontal axis relative to the boom 441.


The bucket cylinder 446 is interposed between the bucket 445 and the arm 443, and expands and contracts upon being supplied with hydraulic oil to rotate the bucket 445 around a horizontal axis relative to the arm 443.


The work assistance system 1 can further comprise an installation camera 70 installed at a work site where each of the work machines 40 is arranged and an unmanned aircraft 80 that flies above the work site where the work machine 40 is arranged.


The work assistance server 10 and the remote operation apparatus 20 can appropriately acquire a picked-up video image picked up by an image pickup device 713 in the installation camera 70 and a picked-up video image picked up by an image pickup device 813 in the unmanned aircraft 80 via the network NW and output the picked-up video images to the image output device 221.


Then, flowcharts relating to a function of the work assistance system 1 according to the present embodiment will be described with reference to FIGS. 4A and 4B. Selection of the image pickup device, display of the picked-up image, and the like will be appropriately described with reference to FIGS. 5 to 10.


Although they are used to simplify the description, blocks “C10” to “C44” in the flowcharts indicate transmission and/or reception of data, and mean that processing in the branch direction is performed on condition that the data is transmitted and/or received.


(First Function)

In the remote operation apparatus 20, the presence or absence of a first designation operation through the remote input interface 210 by the operator OP is determined (FIG. 4A/STEP210). The “first designation operation” is an operation for selecting the work machine 40 that cooperates with the remote operation apparatus 20.


The presence or absence of the first designation operation is determined in the following manner, for example. A map, a list, or the like representing respective existence positions of work machines 40 that can cooperate with the remote operation apparatus 20 is outputted to the remote output interface 220. Then, it is determined whether or not an operation such as a tap for designating the one work machine 40 that cooperates with the remote operation apparatus 20 has been performed by the operator OP.


If a result of the determination is negative (FIG. 4A/NO in STEP210), processing of the first function ends. On the other hand, if the result of the determination is affirmative (FIG. 4A/YES in STEP210), the remote operation apparatus 20 transmits a connection request signal to the work assistance server 10 via the remote wireless communication equipment 223 (FIG. 4A/STEP211). The “connection request signal” includes a work machine identifier for identifying the work machine 40 that has established communication with the remote operation apparatus 20 or the work machine 40 designated through the remote input interface 210.
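

By way of illustration only, the connection request signal can be sketched as a small message that carries at least the work machine identifier; the JSON format and field names below are assumptions of this sketch and are not specified in the disclosure.

import json
from typing import Optional

def build_connection_request(work_machine_id: str, operator_id: Optional[str] = None) -> str:
    """Build the connection request signal sent from the remote operation apparatus 20
    to the work assistance server 10. The disclosure only requires that a work machine
    identifier be included; everything else here is illustrative."""
    payload = {"type": "connection_request", "work_machine_id": work_machine_id}
    if operator_id is not None:
        payload["operator_id"] = operator_id
    return json.dumps(payload)

# Example: the operator OP designates work machine "40-001" by a tap on the map or list.
print(build_connection_request("40-001"))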


If the work assistance server 10 receives the connection request signal (FIG. 4A/C10), the first assistance processing element 121 issues an instruction to start transmission of a picked-up image to the work machine 40 identified by the work machine identifier (FIG. 4A/STEP111). At that time, at least the actual machine image pickup device 412 among the image pickup devices of the work machine 40 is selected.


The work machine 40 transmits, if it receives the transmission start instruction (FIG. 4A/C40), a picked-up image to the work assistance server 10 (FIG. 4A/STEP411). The work assistance server 10 generates, if it receives the picked-up image via the server wireless communication equipment 125 (FIG. 4A/C11), a surroundings image (FIG. 4A/STEP112), and transmits the surroundings image to the remote operation apparatus 20 via the server wireless communication equipment 125 (FIG. 4A/STEP113). Examples of the surroundings image include surroundings images 1 to 3 listed below.


(Surroundings Image 1)

For example, the work machine 40 transmits, if it comprises the surroundings image pickup device 413 illustrated in FIGS. 3A and 3B, images respectively picked up by the cameras 413A to 413D constituting the surroundings image pickup device 413 to the work assistance server 10 via the actual machine wireless communication equipment 422 (FIG. 4A/STEP411).


If the work assistance server 10 receives the picked-up images respectively picked up by the cameras 413A to 413D via the server wireless communication equipment 125 (FIG. 4A/C11), the first assistance processing element 121 synthesizes the picked-up images respectively picked up by the cameras 413A to 413D. The work assistance server 10 generates a surroundings image as viewed from above the work machine 40 (FIG. 4A/STEP112), as illustrated in FIG. 5, and transmits the surroundings image to the remote operation apparatus 20 via the server wireless communication equipment 125 (FIG. 4A/STEP113).


The remote operation apparatus 20 outputs, if it receives the surroundings image via the remote wireless communication equipment 223 (FIG. 4A/C20), the surroundings image to the image output device 221 (FIG. 4A/STEP212).


In the surroundings image illustrated in FIG. 5, an image of the work machine 40 illustrated at its center may be previously stored as graphical information in the database 102. In the surroundings image, a broken line indicates each of joints among the picked-up images respectively picked up by the cameras 413A to 413D. The broken lines need not be outputted to the image output device 221.


The surroundings image pickup device 413 in the work machine 40 may generate the surroundings image by synthesizing the picked-up images respectively picked up by the cameras 413A to 413D. In this case, the work machine 40 transmits the surroundings image generated by the surroundings image pickup device 413 to the work assistance server 10 via the actual machine wireless communication equipment 422 (FIG. 4A/STEP411).


The work assistance server 10 can be configured to transmit, if it receives the surroundings image via the server wireless communication equipment 125 (FIG. 4A/C11), the surroundings image to the remote operation apparatus 20 via the server wireless communication equipment 125 (FIG. 4A/STEP113).


The remote output interface 220 in the remote operation apparatus 20 may generate the surroundings image by synthesizing the picked-up images respectively picked up by the cameras 413A to 413D. The work assistance server 10 transmits, if it receives the picked-up images respectively picked up by the cameras 413A to 413D via the server wireless communication equipment 125 (FIG. 4A/C11), the picked-up images to the remote operation apparatus 20 via the server wireless communication equipment 125 (FIG. 4A/STEP113).


The remote operation apparatus 20 synthesizes, if it receives the picked-up images respectively picked up by the cameras 413A to 413D via the remote wireless communication equipment 223, the picked-up images. Then, the remote operation apparatus 20 outputs the generated surroundings image to the image output device 221 (FIG. 4A/STEP212).
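

The synthesis described for the surroundings image 1 can be pictured, by way of illustration only, as pasting the four camera views around a stored graphic of the work machine 40 on a single overhead canvas. The sketch below assumes that each view has already been projected to a top view of identical size; the actual warping of the camera images and the blending of their joints (the broken lines in FIG. 5) are omitted.

import numpy as np

def synthesize_birds_eye(front, right, rear, left, machine_graphic):
    """Combine four camera views into one overhead surroundings image with the stored
    graphic of the work machine 40 at the center, as in FIG. 5. Each input is assumed
    to already be a top-view projection of identical shape (s, s, 3)."""
    s = machine_graphic.shape[0]
    canvas = np.zeros((3 * s, 3 * s, 3), dtype=np.uint8)
    canvas[0:s, s:2 * s] = front                # region in front of the machine
    canvas[2 * s:, s:2 * s] = rear              # region behind the machine
    canvas[s:2 * s, 0:s] = left                 # region on the left of the machine
    canvas[s:2 * s, 2 * s:] = right             # region on the right of the machine
    canvas[s:2 * s, s:2 * s] = machine_graphic  # pre-stored graphic (database 102)
    return canvas

# Example with dummy single-color tiles standing in for the projected camera views.
tile = lambda value: np.full((100, 100, 3), value, dtype=np.uint8)
surroundings_image = synthesize_birds_eye(tile(60), tile(120), tile(180), tile(240), tile(255))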


(Surroundings Image 2)

The work machine 40 transmits a picked-up image picked up by the actual machine image pickup device 412 to the work assistance server 10 via the actual machine wireless communication equipment 422 (FIG. 4A/STEP411). Further, a position of the work machine 40 detected by the positioning device 460 is transmitted to the work assistance server 10 via the actual machine wireless communication equipment 422, and is stored in the database 102.


If the work assistance server 10 receives the picked-up image picked up by the actual machine image pickup device 412 via the server wireless communication equipment 125 (FIG. 4A/C11), the first assistance processing element 121 generates a surroundings image obtained by superimposing picked-up images respectively picked up by the installation cameras 70A and 70B on the picked-up image picked up by the actual machine image pickup device 412 (FIG. 4A/STEP112), as illustrated in FIG. 7.


Specifically, in addition to the picked-up image picked up by the actual machine image pickup device 412, the work assistance server 10 can receive, via the server wireless communication equipment 125, a picked-up image picked up by the image pickup device 713 and sent via communication equipment 723 in the installation camera 70.


The first assistance processing element 121 grasps the position and the shootable range of each installation camera 70 at the work site. Based on the position of the work machine 40 stored in the database 102, the work assistance server 10 specifies the installation camera 70 whose shootable range includes that position. Further, the work assistance server 10 superimposes the picked-up image picked up by the specified installation camera 70, in which the work machine 40 is captured, on the picked-up image picked up by the actual machine image pickup device 412, and outputs the obtained image to the image output device 221 (FIG. 4A/STEP212).


As illustrated in FIG. 6, if there exist a plurality of installation cameras 70 the respective shootable ranges of which include the position of the work machine 40, all picked-up images respectively picked up by the installation cameras 70 may be displayed, or some of the picked-up images may be displayed.


In FIG. 7, a picked-up image picked up by an installation camera 70B that shoots the work machine 40 from the left side is outputted to the left side of the image output device 221. A picked-up image picked up by an installation camera 70A that shoots the work machine 40 from the right side is outputted to the right side of the image output device 221. When the picked-up images respectively obtained by shooting the work machine 40 in different directions are selected, a situation around the work machine 40 can be reflected on the surroundings image.
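

By way of illustration only, the specification of an installation camera 70 whose shootable range includes the position of the work machine 40 can be sketched as below, modeling the shootable range as a maximum distance and a horizontal field of view around the camera's facing direction. This representation of the shootable range, and all names and numerical values, are assumptions of this sketch.

import math
from dataclasses import dataclass

@dataclass
class InstallationCamera:
    """Hypothetical model of an installation camera 70: site position, facing direction
    (compass bearing in degrees), horizontal field of view, and maximum shooting range."""
    camera_id: str
    x: float
    y: float
    facing_deg: float
    fov_deg: float = 90.0
    max_range_m: float = 50.0

    def can_see(self, mx: float, my: float) -> bool:
        """True if the point (mx, my) lies within this camera's shootable range."""
        dx, dy = mx - self.x, my - self.y
        if math.hypot(dx, dy) > self.max_range_m:
            return False
        bearing = math.degrees(math.atan2(dx, dy)) % 360.0
        diff = abs((bearing - self.facing_deg + 180.0) % 360.0 - 180.0)
        return diff <= self.fov_deg / 2.0

def select_cameras(cameras, machine_x, machine_y):
    """Return the installation cameras whose shootable range includes the work machine
    position, as in the selection for the surroundings image 2."""
    return [c for c in cameras if c.can_see(machine_x, machine_y)]

# Example: two installation cameras and a work machine 40 located at (30, 10).
cameras = [
    InstallationCamera("70A", x=0.0, y=0.0, facing_deg=70.0),
    InstallationCamera("70B", x=60.0, y=40.0, facing_deg=225.0),
]
print([c.camera_id for c in select_cameras(cameras, 30.0, 10.0)])  # both cameras see it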


(Surroundings Image 3)

A case where the unmanned aircraft 80 that can fly above the work site where the work machine 40 is arranged exists will be described with reference to FIG. 8. The work assistance server 10 can receive a picked-up image picked up by the image pickup device 813 mounted on the unmanned aircraft 80, which is transmitted via communication equipment 823.


The unmanned aircraft 80 may be one that can fly wirelessly in a rechargeable manner or may be supplied with power by wire. The unmanned aircraft 80 waits at a work site or a station provided on the work machine 40. The image pickup device 813 can pick up an image of at least a portion below the unmanned aircraft 80.


If the work machine 40 receives a transmission start instruction (FIG. 4A/C40), the unmanned aircraft 80 moves to the area above the work machine 40, and the image pickup device 813 starts image pickup.


The work machine 40 transmits the picked-up image picked up by the actual machine image pickup device 412 to the work assistance server 10 via the actual machine wireless communication equipment 422 (FIG. 4A/STEP411). At the same time, the picked-up image picked up by the image pickup device 813 is transmitted to the work assistance server 10 via the communication equipment 823. The picked-up image picked up by the image pickup device 813 may be transmitted to the work assistance server 10 via the actual machine wireless communication equipment 422 in the work machine 40.


If the work assistance server 10 receives the picked-up image picked up by the image pickup device 813 via the server wireless communication equipment 125 (FIG. 4A/C11), the first assistance processing element 121 considers the picked-up image as a surroundings image, and transmits the surroundings image to the remote operation apparatus 20 via the server wireless communication equipment 125 (FIG. 4A/STEP113).


The remote operation apparatus 20 outputs, if it receives the surroundings image via the remote wireless communication equipment 223 (FIG. 4A/C20), the surroundings image as illustrated in FIG. 9 to the image output device 221 (FIG. 4A/STEP212). In the surroundings image, an image of the work machine 40 illustrated at its center and an image of a work site around the work machine 40 are each a picked-up image picked up by the image pickup device 813.


A height and a direction of the unmanned aircraft 80 when the work machine 40 is shot from above may be set. For example, an image of the work machine 40 whose direction and size are appropriate in a picked-up image is stored as a defined image, and the height and the direction of the unmanned aircraft 80 can be controlled to be a height and a direction at which the defined image can be picked up.
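

By way of illustration only, the height and direction control mentioned above can be sketched as a comparison between the current picked-up image and the stored defined image, assuming that the apparent size and heading of the work machine 40 in an image are measured elsewhere. The gains and tolerances below are assumptions of this sketch.

def adjust_aircraft(current_size_px, defined_size_px,
                    current_heading_deg, defined_heading_deg,
                    size_tolerance=0.05, heading_tolerance_deg=3.0):
    """Return simple (climb, yaw) corrections so that the picked-up image of the work
    machine 40 approaches the stored defined image. Positive climb means ascend (the
    machine appears too large), positive yaw means rotate clockwise. How the apparent
    size and heading are measured is outside this sketch."""
    climb = 0.0
    size_error = (current_size_px - defined_size_px) / defined_size_px
    if abs(size_error) > size_tolerance:
        climb = size_error        # appears too large -> ascend, too small -> descend
    yaw = 0.0
    heading_error = (defined_heading_deg - current_heading_deg + 180.0) % 360.0 - 180.0
    if abs(heading_error) > heading_tolerance_deg:
        yaw = heading_error
    return climb, yaw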


The above-described surroundings images 1 to 3 can be appropriately selected. If the work machine 40 is loaded with the surroundings image pickup device 413, for example, the surroundings image 1 can be adopted.


If the work assistance server 10 and the remote operation apparatus 20 can acquire the picked-up image picked up by the installation camera 70 installed at the work site where the work machine 40 is arranged via the network NW, the surroundings image 2 can be adopted.


If the unmanned aircraft 80 exists above the work machine 40, and the picked-up image picked up by the image pickup device 813 can be acquired via the network NW, the surroundings image 3 can be adopted.
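

The selection among the surroundings images 1 to 3 described above can be summarized, by way of illustration only, as a priority check on the equipment that is actually available; the priority order shown below is merely one possible choice.

def choose_surroundings_image_source(has_surroundings_pickup_device: bool,
                                     installation_camera_available: bool,
                                     aircraft_image_available: bool) -> str:
    """Pick which kind of surroundings image to generate, following the availability
    conditions described above for the surroundings images 1 to 3."""
    if has_surroundings_pickup_device:
        return "surroundings image 1"   # synthesized from the cameras 413A to 413D
    if installation_camera_available:
        return "surroundings image 2"   # superimposed installation camera images
    if aircraft_image_available:
        return "surroundings image 3"   # image from the unmanned aircraft 80
    raise RuntimeError("no image source available for a surroundings image")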


As described above, if the surroundings image has been outputted to the image output device 221 (FIG. 4A/STEP212), the operator OP can confirm safety, for example, whether or not a vehicle or a field worker exists around the work machine 40.


Then, operator information is transmitted to the work assistance server 10 via the remote wireless communication equipment 223 (FIG. 4A/STEP213). The “operator information” is information capable of determining whether or not the operator OP has confirmed the surroundings image outputted to the image output device 221.


For example, the operator information is information indicating that the surroundings image has been displayed on the image output device 221 over a predetermined time period (e.g., 10 seconds). If the information is transmitted, and the work assistance server 10 receives the information via the server wireless communication equipment 125 (FIG. 4A/C12), the second assistance processing element 122 determines whether or not a work permission condition is satisfied (FIG. 4A/STEP114).


If a time period during which the surroundings image is displayed is less than a predetermined time period, the possibility that the operator has not confirmed the surroundings image is high. Therefore, the second assistance processing element 122 determines that the work permission condition is not satisfied (FIG. 4A/NO in STEP114), and generates the surroundings image again (FIG. 4A/STEP112), to update the surroundings image.


On the other hand, if the time period during which the surroundings image is displayed is the predetermined time period or more, the possibility that the operator OP has confirmed the surroundings image is high. Accordingly, the second assistance processing element 122 determines that the work permission condition is satisfied (FIG. 4A/YES in STEP114), and transmits an operation receiving permission signal to the work machine 40 via the server wireless communication equipment 125 (FIG. 4A/STEP115).
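

By way of illustration only, the time-based work permission condition can be sketched as follows, with the 10 second threshold taken from the example above; the actual predetermined time period is a design parameter.

def work_permission_satisfied(displayed_seconds: float,
                              required_seconds: float = 10.0) -> bool:
    """Time-based check of the work permission condition (FIG. 4A/STEP114): permit the
    work only when the surroundings image has been displayed on the image output
    device 221 for at least the predetermined time period."""
    return displayed_seconds >= required_seconds

def on_operator_information(displayed_seconds: float) -> str:
    """Decide the next step of the work assistance server 10; the step labels are those
    of FIG. 4A and the return strings are illustrative only."""
    if work_permission_satisfied(displayed_seconds):
        return "transmit operation receiving permission signal (STEP115)"
    return "regenerate surroundings image (STEP112)"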


The operator information may be an image capturing an operation of the operator OP for confirming the surroundings image outputted to the image output device 221. In this case, the remote operation apparatus 20 picks up an image of the operator OP who sits on the seat St using the worker image pickup device 222, and transmits the image as an operator image to the work assistance server 10 via the remote wireless communication equipment 223 (FIG. 4A/STEP213).


If the work assistance server 10 receives the operator image via the server wireless communication equipment 125 (FIG. 4A/C12), the second assistance processing element 122 determines whether or not a work permission condition is satisfied (FIG. 4A/STEP114).


The second assistance processing element 122 analyzes the operator image, and specifies an operation of the operator. For example, if it is detected that the eye line and the face of the operator OP are oriented toward the image output device 221 and that the operator OP has pointed at the image output device 221 to confirm it, the second assistance processing element 122 determines that the work permission condition is satisfied (FIG. 4A/YES in STEP114). The second assistance processing element 122 transmits the operation receiving permission signal to the work machine 40 via the server wireless communication equipment 125 (FIG. 4A/STEP115).


On the other hand, if the second assistance processing element 122 determines that the work permission condition is not satisfied (FIG. 4A/NO in STEP114), a surroundings image is generated again (FIG. 4A/STEP112). As a result, the operator OP is made to confirm the surroundings image outputted to the image output device 221.


The second assistance processing element 122 may perform control to output a warning sound or the like from a speaker of the remote operation apparatus 20 if, as a result of the analysis of the operator image, a state where the work permission condition is not satisfied (FIG. 4A/NO in STEP114) continues for a predetermined time period or more.
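

By way of illustration only, the operator-image branch of the work permission condition, including the warning when the unconfirmed state continues, can be sketched as below. The gaze and pointing analysis itself is not specified in the disclosure, so it is passed in here as a placeholder callable; the interval value is an assumption of this sketch.

import time

def supervise_confirmation(get_operator_image, detect_confirmation, warn,
                           warning_interval_s: float = 15.0) -> None:
    """Keep analyzing operator images picked up by the worker image pickup device 222
    until the confirmation operation is detected (FIG. 4A/YES in STEP114), and have the
    remote operation apparatus 20 output a warning sound if the unconfirmed state
    continues for a predetermined time period. `detect_confirmation` stands in for the
    (unspecified) analysis of the operator's eye line, face orientation, and pointing."""
    last_warning = time.monotonic()
    while not detect_confirmation(get_operator_image()):
        if time.monotonic() - last_warning >= warning_interval_s:
            warn()                      # e.g. a warning sound from the speaker
            last_warning = time.monotonic()
        time.sleep(0.5)
    # Once confirmed, the operation receiving permission signal can be transmitted
    # to the work machine 40 (FIG. 4A/STEP115).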


The work machine 40 starts, if it receives the operation receiving permission signal through the actual machine wireless communication equipment 422 (FIG. 4A/C41), to receive a remote operation by the remote operation apparatus 20 (FIG. 4A/STEP412).


(Second Function)

Then, a further function of the work assistance system 1 will be described with reference to a flowchart illustrated in FIG. 4B.


The work assistance server 10 transmits a picked-up image switching instruction to the work machine 40 via the server wireless communication equipment 125 (FIG. 4B/STEP120). The “picked-up image switching instruction” is an instruction to display on the image output device 221 an image to be used when the remote operation apparatus 20 remotely operates the work machine 40.


If the work machine 40 receives the picked-up image switching instruction via the actual machine wireless communication equipment 422 (FIG. 4B/C42), the actual machine control device 400 acquires an actual machine picked-up image from the actual machine image pickup device 412 (FIG. 4B/STEP421). Then, the actual machine control device 400 transmits actual machine picked-up image data representing the actual machine picked-up image to the work assistance server 10 (FIG. 4B/STEP422).


In the work assistance server 10, if the first assistance processing element 121 receives the actual machine picked-up image data via the server wireless communication equipment 125 (FIG. 4B/C14), the second assistance processing element 122 transmits the actual machine picked-up image data to the remote operation apparatus 20 (FIG. 4B/STEP121).


If the remote operation apparatus 20 receives the actual machine picked-up image data via the remote wireless communication equipment 223 (FIG. 4B/C21), the remote control device 200 outputs a work environment image corresponding to the actual machine picked-up image data to the image output device 221 (FIG. 4B/STEP220).


As a result, the work environment image in which the boom 441, the arm 443, and the bucket 445, which are parts of the actuation mechanism 440, are reflected is outputted to the image output device 221, as illustrated in FIG. 10, for example.


In the remote operation apparatus 20, the remote control device 200 recognizes an operation mode of the remote operation mechanism 211 (FIG. 4B/STEP221). Further, the remote operation apparatus 20 transmits a remote operation command corresponding to the operation mode to the work assistance server 10 via the remote wireless communication equipment 223 (FIG. 4B/STEP222).


In the work assistance server 10, if the second assistance processing element 122 receives the remote operation command (FIG. 4B/C15), the first assistance processing element 121 transmits the remote operation command to the work machine 40 (FIG. 4B/STEP122).


The work machine 40 controls, if the actual machine control device 400 receives the remote operation command via the actual machine wireless communication equipment 422 (FIG. 4B/C43), an operation of the actuation mechanism 440 or the like (FIG. 4B/STEP423). As a result, the work machine 40 starts to be remotely operated. For example, work for scooping soil in front of the work machine 40 using the bucket 445 and dropping the soil from the bucket 445 after turning the upper turning body 435 is performed.
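

By way of illustration only, the command flow of the second function (the operation mode recognized at the remote operation apparatus 20, relayed by the work assistance server 10, and executed by the work machine 40) can be sketched as below; the message fields and the JSON encoding are assumptions of this sketch.

import json

def build_remote_operation_command(work_machine_id: str, lever: str,
                                   direction: str, amount: float) -> str:
    """Remote operation command corresponding to the recognized operation mode of the
    remote operation mechanism 211 (FIG. 4B/STEP221 and STEP222). The field names below
    are hypothetical; the disclosure does not fix a message format."""
    return json.dumps({
        "type": "remote_operation_command",
        "work_machine_id": work_machine_id,
        "lever": lever,
        "direction": direction,
        "amount": amount,
    })

def relay_command(command_json: str, send_to_work_machine) -> None:
    """Work assistance server 10 side (FIG. 4B/C15 and STEP122): forward the received
    command, unchanged, to the work machine 40 designated by the identifier."""
    command = json.loads(command_json)
    send_to_work_machine(command["work_machine_id"], command_json)

# Example: the right-side lever operated in the front-rear direction drives the boom.
relay_command(build_remote_operation_command("40-001", "right_lever", "front_rear", 0.5),
              lambda machine_id, message: None)   # stand-in for the wireless transmission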


Then, in the remote operation apparatus 20, the operator OP ends the work by the work machine 40. At this time, the remote operation apparatus 20 transmits a work end signal toward the work assistance server 10 (FIG. 4B/STEP223).


The work assistance server 10 transmits, if it receives the work end signal via the server wireless communication equipment 125 (FIG. 4B/C16), the work end signal toward the work machine 40 (FIG. 4B/STEP123). At this time, the work assistance server 10 may issue a work end instruction not only to the image pickup device currently transmitting a picked-up image but to all image pickup devices.


The work machine 40 stops, if it receives the work end signal (FIG. 4B/C44), image pickup by the image pickup device and driving of the work machine 40 (FIG. 4B/STEP424).


Finally, the remote operation apparatus 20 stops outputting the picked-up image (FIG. 4B/STEP224). As a result, display of the picked-up image on the image output device 221 ends.


The operator OP can operate the plurality of work machines 40 from the remote operation apparatus 20, and may switch between the work machines 40. When switching to a new work machine 40, the work assistance system 1 may perform the processing illustrated in FIGS. 4A and 4B described above. With the foregoing, the series of processes of the work assistance server 10 (the first function and the second function) ends.


Thus, the work assistance server 10 uses the picked-up images respectively picked up by the actual machine image pickup device 412 and the surroundings image pickup device 413 in the work machine 40, the installation camera 70, and the like to display a picked-up image around the work machine 40 on the image output device 221. The operator OP can grasp not only a situation around the work machine 40 but also a relative positional relationship with an obstacle, for example, and reliably perform work. Time and effort required for the operator OP himself/herself to switch the image pickup device are also saved.


In the work assistance server according to the first aspect of the invention, the second assistance processing element preferably permits the work by the work machine based on information about confirming the surroundings image displayed on the display device by an operator who operates the remote operation apparatus.


The second assistance processing element permits the remote operation apparatus to operate the work machine based on the information about confirming the surroundings image displayed on the display device by the operator. As a result, the work assistance server can permit the operation of the work machine in a situation where the possibility that the operator has confirmed the surroundings image is high.


In the work assistance server according to the first aspect of the invention, the information is information about a time period during which the surroundings image is displayed on the display device, and the second assistance processing element preferably permits the work by the work machine based on the surroundings image being displayed on the display device for a predetermined time period.


The second assistance processing element can estimate that the possibility that the operator has confirmed the surroundings image is high based on the surroundings image being displayed on the display device for the predetermined time period and permit the work by the work machine.


In the work assistance server according to the first aspect of the invention, the information is information about a detection result of a confirmation operation of the operator, and the second assistance processing element preferably permits the work by the work machine based on the confirmation operation having been detected for the surroundings image displayed on the display device.


In this case, the second assistance processing element can permit the work by the work machine in a situation where the confirmation operation has been detected for the surroundings image displayed on the display device, which means that the possibility that the operator has reliably confirmed the surroundings image is high.


In the work assistance server according to the first aspect of the invention, the first assistance processing element preferably generates the surroundings image based on a picked-up image picked up by the surroundings image pickup device that is provided in the work machine and picks up an image of the surroundings of the work machine.


The first assistance processing element can generate the surroundings image based on the picked-up image picked up by the surroundings image pickup device provided in the work machine. Then, the surroundings image is displayed on the display device. Thus, the work assistance server can use the surroundings image to permit the operator to perform the work.


In the work assistance server according to the first aspect of the invention, the first assistance processing element preferably grasps respective positions and image pickup directions of a plurality of installation cameras arranged at a work site where the work machine is positioned, selects the installation camera that captures the work machine based on the respective positions and image pickup directions of the installation cameras and the position of the work machine, and generates the surroundings image based on a picked-up image picked up by the selected installation camera.


In this case, the first assistance processing element can generate the surroundings image based on the picked-up image picked up by the installation camera arranged at the work site. As a result, the work assistance server can generate the surroundings image even when the work machine comprises no surroundings image pickup device.


In the work assistance server according to the first aspect of the invention, the first assistance processing element preferably generates the surroundings image based on a picked-up image picked up by an image pickup device provided in an aircraft that can fly above the work machine.


In this case, the first assistance processing element can generate the surroundings image based on the picked-up image picked up by the image pickup device provided in the aircraft that flies above the work machine. As a result, the work assistance server can generate the surroundings image even when the work machine comprises no surroundings image pickup device and when there is no appropriate installation camera.


REFERENCE SIGNS LIST


1 . . . work assistance system, 10 . . . work assistance server, 20 . . . remote operation apparatus, 40 . . . work machine, 70, 70A, 70B . . . installation camera, 80 . . . unmanned aircraft, 121 . . . first assistance processing element, 122 . . . second assistance processing element, 200 . . . remote control device, 221 . . . image output device, 222 . . . worker image pickup device, 412 . . . actual machine image pickup device, 413 . . . surroundings image pickup device, 425 . . . cab, 427 . . . lower traveling body, 430 . . . turning mechanism, 435 . . . upper turning body, 440 . . . actuation mechanism (work attachment), 445 . . . bucket, 460 . . . positioning device, 713 . . . image pickup device, 723 . . . communication equipment, 813 . . . image pickup device, 823 . . . communication equipment, 2110 . . . traveling lever, 2111 . . . right-side operation lever, 2112 . . . left-side operation lever.

Claims
  • 1. A work assistance server that assists a work by a work machine to be operated such that the work machine to be operated is remotely operated in response to an operation of a remote operation apparatus including a display device, the work assistance server comprising: a first assistance processing element that generates a surroundings image including a region around the work machine when a request to start the work by the work machine is made; and a second assistance processing element that permits the work by the work machine on condition that the surroundings image is displayed on the display device.
  • 2. The work assistance server according to claim 1, wherein the second assistance processing element permits the work by the work machine based on information about confirming the surroundings image displayed on the display device by an operator who operates the remote operation apparatus.
  • 3. The work assistance server according to claim 2, wherein the information is information about a time period during which the surroundings image is displayed on the display device, and the second assistance processing element permits the work by the work machine based on the surroundings image being displayed on the display device for a predetermined time period.
  • 4. The work assistance server according to claim 2, wherein the information is information about a detection result of a confirmation operation by the operator, and the second assistance processing element permits the work by the work machine based on the confirmation operation having been detected for the surroundings image displayed on the display device.
  • 5. The work assistance server according to claim 1, wherein the first assistance processing element generates the surroundings image based on a picked-up image picked up by a surroundings image pickup device that is provided in the work machine and picks up an image of the surroundings of the work machine.
  • 6. The work assistance server according to claim 1, wherein the first assistance processing element grasps respective positions and image pickup directions of a plurality of installation cameras arranged at a work site where the work machine is positioned, selects the installation camera that captures the work machine by the respective positions and image pickup directions of the installation cameras and the position of the work machine, and generates the surroundings image based on a picked-up image picked up by the selected installation camera.
  • 7. The work assistance server according to claim 1, wherein the first assistance processing element generates the surroundings image based on a picked-up image picked up by an image pickup device provided in an aircraft that can fly above the work machine.
  • 8. A work assistance method for assisting a work by a work machine to be operated such that the work machine to be operated is remotely operated in response to an operation performed on a remote operation apparatus including a display device, the work assistance method comprising: a first step of generating a surroundings image including a region around the work machine when a request to start the work by the work machine is made; and a second step of permitting the work by the work machine on condition that the surroundings image is displayed on the display device.
Priority Claims (1)
Number: 2020-043744; Date: Mar 2020; Country: JP; Kind: national
PCT Information
Filing Document: PCT/JP2021/002418; Filing Date: 1/25/2021; Country: WO