The present invention relates to a work assist server to assist a plurality of workers in sharing information about a work site, by communication with a plurality of clients assigned to the plurality of workers, respectively.
A terminal device for a remote monitoring assistance system has been proposed to allow a worker patrolling and inspecting a plant and a person waiting outside the work site to share information with sufficient accuracy (see Patent Literature 1, for example). This terminal device comprises a video input unit which inputs video data of the site, an input operation selecting unit such as a pen or a mouse, a detection unit which detects whether there is new video to be obtained, a communication control unit which wirelessly transmits and receives data to and from the outside, and an input/output screen display unit which displays an input screen for inputting predetermined data.
Patent Literature 1: Japanese Patent Laid-Open No. 2005-242830
However, it is preferable that a plurality of people involved in work can share information about an object requiring attention in their common work area.
To solve the problem, an object of the present invention is to provide a server and system which allow a plurality of people involved in work to share information about an object requiring attention in a work area.
The work assist server of the present invention comprises: a first assist processing element which, based on communication with a first client among the plurality of clients, recognizes the existence of a target object in a target object image region that is a part of a captured image obtained through an imaging device, target object related information about the target object, and a real space position and real space posture of the imaging device, and which presumes an extension mode of the target object in a real space based on the real space position and real space posture of the imaging device; and a second assist processing element which, based on communication with a second client among the plurality of clients, causes an output interface of the second client to output a work environment image indicating the extension mode of the target object presumed by the first assist processing element and the target object related information.
A work assist system of the present invention includes the work assist server of the present invention, and the plurality of clients.
According to the work assist server and the work assist system (hereinafter referred to as “the work assist server and the like” as appropriate) of the present invention, the work environment image, which indicates the extension mode in the real space of the target object in the target object image region (a part of the captured image obtained through the imaging device of the first client) together with the target object related information, is outputted to the output interface of the second client.
Consequently, for example, when a worker notices a target object around him or her, the worker can immediately obtain a captured image of the target object by using his or her client as the first client. Another worker can then immediately recognize the extension mode of the target object in the real space and the target object related information through the work environment image outputted to the output interface of his or her client as the second client. Furthermore, for example, when each of the plurality of workers at a common site uses his or her client as the first client, a map with an abundant amount of information about various target objects can be shared among the workers. Consequently, for example, when a worker works using a work machine, the worker can smoothly perform the work while being aware of the extension mode of the target object.
(Configuration of Work Assist System)
A work assist system as an embodiment of the present invention shown in
(Configuration of Work Assist Server)
The work assist server 10 comprises a database 102, a first assist processing element 121, and a second assist processing element 122. The database 102 stores and holds “a captured image”, “a real space position and real space posture of an imaging device 612”, “an extension mode of a target object image region in the captured image”, “target object related information about a target object existing in the target object image region” and the like. The database 102 may be configured with a database server separate from the work assist server 10. Each assist processing element includes an arithmetic processing unit (a single core processor, a multi-core processor, or a processor core included in the multi-core processor), reads required data and software from a storage device such as a memory, and executes the after-mentioned arithmetic processing on that data in accordance with the software.
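As a purely illustrative sketch, the items listed above that the database 102 stores and holds for one target object could be modeled as a single record. The class name, field names, and field types below are assumptions for illustration, not part of the invention:

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class TargetObjectRecord:
    """One hypothetical entry held by the database 102."""
    captured_image: bytes                         # image data obtained through the imaging device 612
    camera_position: Tuple[float, float, float]   # real space position (e.g. latitude, longitude, height)
    camera_posture: Tuple[float, float, float]    # real space posture (e.g. roll, pitch, yaw)
    region: List[Tuple[int, int]]                 # target object image region R, as a pixel-coordinate polygon
    related_info: Optional[str] = None            # target object related information (may be absent)

# Example record: a region designated by a worker, annotated with related information.
record = TargetObjectRecord(
    captured_image=b"",
    camera_position=(35.0, 139.0, 10.0),
    camera_posture=(0.0, -90.0, 0.0),
    region=[(100, 200), (300, 200), (300, 400), (100, 400)],
    related_info="buried pipe",
)
```

Whether such records are held in one table or split across several is an implementation choice; the point is only that the four kinds of data are associated with each other per target object.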
(Configuration of Remote Operation Device)
The remote operation device 20 constituting one client comprises a remote control device 200, a remote input interface 210, and a remote output interface 220. The remote control device 200 includes an arithmetic processing unit (a single core processor, a multi-core processor, or a processor core included in the multi-core processor), reads required data and software from a storage device such as a memory, and executes arithmetic processing on that data in accordance with the software. The remote input interface 210 comprises a remote operation mechanism 211. The remote output interface 220 comprises an image output device 221 and remote wireless communication equipment 222.
The one client may include a mobile terminal which cooperates with the remote operation device 20 or has a function of mutual communication with it. The mobile terminal has a configuration similar to that of the after-mentioned worker terminal 60.
The remote operation mechanism 211 includes an operation device for traveling, an operation device for turning, an operation device for boom, an operation device for arm, and an operation device for bucket. Each operation device includes an operation lever which receives a rotating operation. The operation levers (travel levers) of the operation device for traveling are operated to move a lower traveling body 410 of the work machine 40. The travel levers may also serve as travel pedals; for example, travel pedals fixed to a base portion or a bottom end of the travel levers may be provided. The operation lever (turn lever) of the operation device for turning is operated to drive a hydraulic swing motor included in a turning mechanism 430 of the work machine 40. The operation lever (boom lever) of the operation device for boom is operated to move a boom cylinder 442 of the work machine 40. The operation lever (arm lever) of the operation device for arm is operated to move an arm cylinder 444 of the work machine 40. The operation lever (bucket lever) of the operation device for bucket is operated to move a bucket cylinder 446 of the work machine 40.
The respective operation levers included in the remote operation mechanism 211 are arranged around a seat St on which an operator sits as shown in
In front of the seat St, a pair of left and right travel levers 2110 corresponding to left and right crawlers are arranged laterally in a left-right direction. One operation lever may serve as a plurality of operation levers. For example, a right-side operation lever 2111 provided in front of a right frame of the seat St shown in
For example, as shown in
(Configuration of Work Machine)
The work machine 40 comprises an actual machine control device 400, an actual machine input interface 410, an actual machine output interface 420, and a working mechanism 440. The actual machine control device 400 includes an arithmetic processing unit (a single core processor, a multi-core processor, or a processor core included in the multi-core processor), reads required data and software from a storage device such as a memory, and executes arithmetic processing on that data in accordance with the software.
The work machine 40 is, for example, a crawler shovel (construction machine), and comprises a crawler-type lower traveling body 410, and an upper turning body 420 rotatably mounted on the lower traveling body 410 via the turning mechanism 430 as shown in
The actual machine input interface 410 comprises an actual machine operation mechanism 411 and an actual machine imaging device 412. The actual machine operation mechanism 411 comprises a plurality of operation levers arranged around a seat disposed inside the cab 424 in the same manner as in the remote operation mechanism 211. A drive mechanism or a robot which receives a signal depending on an operation mode of a remote operation lever and moves an actual machine operation lever based on the received signal is provided in the cab 424. The actual machine imaging device 412 is installed, for example, inside the cab 424, and images an environment including at least a part of the working mechanism 440 through a front window of the cab 424.
The actual machine output interface 420 comprises actual machine wireless communication equipment 422.
The work attachment 440 as the working mechanism comprises a boom 441 mounted on the upper turning body 420 such that the boom can be undulated, an arm 443 rotatably coupled to a tip end of the boom 441, and a bucket 445 rotatably coupled to a tip end of the arm 443. The boom cylinder 442, the arm cylinder 444 and the bucket cylinder 446, each of which is configured with a telescopic hydraulic cylinder, are attached to the work attachment 440.
The boom cylinder 442 is interposed between the boom 441 and the upper turning body 420, and extends and retracts on receiving supply of hydraulic oil, thereby rotating the boom 441 in an undulating direction. The arm cylinder 444 is interposed between the arm 443 and the boom 441, and extends and retracts on receiving the supply of hydraulic oil, thereby rotating the arm 443 relative to the boom 441 about a horizontal axis. The bucket cylinder 446 is interposed between the bucket 445 and the arm 443, and extends and retracts on receiving the supply of hydraulic oil, thereby rotating the bucket 445 relative to the arm 443 about the horizontal axis.
(Configuration of Worker Terminal)
The worker terminal 60 constituting the other client is a terminal device such as a smartphone or a tablet terminal, and comprises a control device 600, an input interface 610, and an output interface 620. The control device 600 includes an arithmetic processing unit (a single core processor, a multi-core processor, or a processor core included in the multi-core processor), reads required data and software from a storage device such as a memory, and executes arithmetic processing on that data in accordance with the software.
The input interface 610 comprises the imaging device 612, and includes a button, a switch, or the like of a touch panel. The output interface 620 comprises an image output device 621 (and a voice output device as required) and wireless communication equipment 622.
(Function)
(First Function (Output of Work Environment Image))
Description will be made as to a function of the work assist system with the above configuration with reference to flowcharts shown in
When each constituent element (arithmetic processing resource or hardware resource) of the present invention “recognizes” information, the term encompasses any processing that prepares the information in a form available for subsequent processing, such as: receiving the information; reading or retrieving the information from the storage device or the like; writing (storing and holding) or registering the information in the storage device or the like; and presuming, determining, identifying, measuring, or predicting the information by executing arithmetic processing, according to a predetermined algorithm, on an output signal from a sensor and/or on received or retrieved basic information.
In the remote operation device 20, it is determined whether there is a first designated operation through the remote input interface 210 by an operator OP (
In the work assist server 10, in a case where the first environment confirmation request is received (
In the remote operation device 20, in a case where the first environment image data is received through the remote wireless communication equipment 222 (
In the worker terminal 60, the captured image is obtained through the imaging device 612 in response to an imaging operation through the input interface 610 by the first worker (
The control device 600 measures a real space position and real space posture of the imaging device 612 when the captured image is obtained (
The control device 600 determines whether a target object image region R is designated through an operation in the input interface 610 (
In a case where the determination result is positive (YES in
In a case where the determination result is positive (YES in
In the work assist server 10, in a case where the data representing each of “the captured image data”, “the real space position and real space posture of the imaging device 612”, “the extension mode of the target object image region R” and “the target object related information” is received (
On these assumptions, the extension mode (spread in each of a latitude direction and a longitude direction) of the target object is presumed based on the real space position and real space posture of the imaging device 612, as well as the angle of view or a focal length, in addition to the extension mode of the target object image region R in a captured image coordinate system. In a case where the target object is a structure at a position higher than the ground or a depression depressed in the ground, presumption accuracy based on the assumptions becomes lower, but the extension mode of the target object in the real space can be roughly grasped.
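The presumption described above can be illustrated by a minimal sketch. Assuming, as in the text, that the target object lies on flat ground, each pixel of the target object image region R defines a ray from the imaging device; intersecting that ray with the ground plane yields the spread of the region in the real space. The function names and conventions here (a local world frame with x east, y north, z up; a pinhole camera whose z axis points forward; a rotation matrix R representing the real space posture) are illustrative assumptions, not the invention's specified method:

```python
import numpy as np

def pixel_ray(u, v, fx, fy, cx, cy):
    """Ray through pixel (u, v) in camera coordinates (x right, y down, z forward)."""
    return np.array([(u - cx) / fx, (v - cy) / fy, 1.0])

def ground_intersection(cam_pos, R, ray_cam):
    """Intersect a camera ray with the ground plane z = 0.

    cam_pos: camera real space position in a local world frame (x east, y north, z up)
    R:       3x3 rotation from camera to world coordinates (the camera posture)
    Returns the ground point, or None if the ray never reaches the ground.
    """
    d = R @ ray_cam              # ray direction in world coordinates
    if d[2] >= 0:
        return None              # ray points at or above the horizon
    t = -cam_pos[2] / d[2]       # parameter at which the ray's z becomes 0
    return cam_pos + t * d

# Camera 10 m above the ground, looking straight down.
R_down = np.array([[1.0, 0.0, 0.0],
                   [0.0, -1.0, 0.0],
                   [0.0, 0.0, -1.0]])
p = np.array([0.0, 0.0, 10.0])
center = ground_intersection(p, R_down, pixel_ray(320, 240, 500.0, 500.0, 320.0, 240.0))
corner = ground_intersection(p, R_down, pixel_ray(420, 240, 500.0, 500.0, 320.0, 240.0))
```

Applying this to the vertices of the target object image region R gives the spread of the target object in the latitude and longitude directions, which is exactly where the flat-ground assumption makes the result only approximate for raised structures or depressions.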
The captured image (or a distance image as the captured image), to which a distance to the target object measured with a distance measurement sensor (or the distance measurement sensor as the imaging device 612) mounted in the worker terminal 60 is assigned as a pixel value, may be transmitted from the worker terminal 60 to the work assist server 10. In this case, the extension mode in the real space of the target object, or of the surface of the object, can be presumed more accurately based on the real space position of the imaging device 612.
Subsequently, the second assist processing element 122 generates a work environment image indicating a presumption result of the extension mode of the target object in the real space by the first assist processing element 121 and the target object related information, based on the first environment image (
In the remote operation device 20, in a case where the remote wireless communication equipment 222 included in the remote output interface 220 receives the work environment image data (
In the worker terminal 60, in a case where the control device 600 determines that the target object related information is not inputted through the operation of the first worker in the input interface 610 (NO in
In the work assist server 10, in a case where data representing each of “the captured image data”, “the real space position and real space posture of the imaging device 612” and “the extension mode of the target object image region R” is received (
In the worker terminal 60, in a case where the control device 600 determines that the target object image region R is not designated through the operation of the first worker in the input interface 610 (NO in
In the work assist server 10, in a case where the data representing each of “the captured image data” and “the real space position and real space posture of the imaging device 612” is received (
(Second Function (Remote Operation of Work Machine))
In the remote operation device 20, it is determined whether there is a second designated operation through the remote input interface 210 by the operator OP (
In the work assist server 10, in a case where the second environment confirmation request is received, the first assist processing element 121 transmits the second environment confirmation request to the corresponding work machine 40 (
In the work machine 40, in a case where the environment confirmation request is received through the actual machine wireless communication equipment 422 (
In the work assist server 10, in a case where the captured image data is received (
In the remote operation device 20, in a case where the second environment image data is received through the remote wireless communication equipment 222 (
In the remote operation device 20, the remote control device 200 recognizes an operation mode of the remote operation mechanism 211 (
In the work assist server 10, in a case where the remote operation command is received, the first assist processing element 121 transmits the remote operation command to the work machine 40 (
In the work machine 40, in a case where the actual machine control device 400 receives the operation command through the actual machine wireless communication equipment 422 (
(Effects)
According to the work assist system with the above configuration and the work assist server 10 included in this system, the work environment image, which indicates the extension mode in the real space of the target object in the target object image region R (a part of the captured image obtained through the imaging device (e.g., the imaging device 612) of the first client (e.g., the worker terminal 60)) together with the target object related information, is outputted to the output interface (e.g., the remote output interface 220) of the second client (e.g., the remote operation device 20) (see
Consequently, for example, when each worker recognizes the existence of the target object around the worker, the worker can immediately obtain the captured image of the target object by use of the client as the first client (see
In the above embodiment, the work assist server 10 is configured with one or more servers separate from each of the remote operation device 20, the work machine 40 and the worker terminal 60 (see
In the above embodiment, in the first client (e.g., the worker terminal 60 or the remote operation device 20), the designation of the target object image region R and the input of the target object related information are possible. As another embodiment, however, the designation of the target object image region R and the input of the target object related information may be omitted in the first client, and the captured image obtained through the imaging device may be transmitted to the work assist server 10 together with the data representing the real space position and real space posture of the imaging device (see
Alternatively, the input of the target object related information may be omitted in the first client, and the captured image obtained through the imaging device may be transmitted to the work assist server 10 together with the target object image region R and the data representing the real space position and real space posture of the imaging device (see
The first assist processing element 121 recognizes the extension mode of the target object in the target object image region designated through the input interface (210, 610) of the first client and the target object related information about the target object, in the captured image outputted to the output interface (220, 620) of the first client, based on the communication with the first client (e.g., the worker terminal 60 or the remote operation device 20).
Consequently, for example, when each worker recognizes the existence of the target object around the worker as described above, the worker can immediately obtain the captured image of the target object by use of the client as the first client. Furthermore, each worker can designate the image region where the target object exists, the image region being a part of the captured image of the target object outputted to the output interface of the first client, as the target object image region through the input interface and can input the target object related information. Consequently, the existence of the target object noticed by each worker and the target object related information can be more accurately conveyed to the other worker.
The first assist processing element 121 recognizes the existence of the target object in the target object image region that is a part of the captured image obtained through the actual machine imaging device 412 mounted in the work machine 40 and the target object related information about the target object, and the real space position and real space posture of the actual machine imaging device 412, based on the communication with the remote operation device 20 as the first client, for remotely operating the work machine 40.
Consequently, the plurality of workers can share the extension mode of the target object around the work machine 40 and the target object related information based on the captured image obtained through the actual machine imaging device 412 mounted in the work machine 40.
Number | Date | Country | Kind
---|---|---|---
2019-212619 | Nov 2019 | JP | national
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2020/030682 | 8/12/2020 | WO |