CONSTRUCTION MACHINE AND ASSISTANCE SYSTEM FOR CONSTRUCTION MACHINE

Information

  • Patent Application
  • Publication Number
    20250084614
  • Date Filed
    November 22, 2024
  • Date Published
    March 13, 2025
Abstract
A construction machine includes a lower travel body, an upper slewing body mounted on the lower travel body in a slewable manner, a cab mounted on the upper slewing body, and a space recognition device disposed on a pillar in the cab.
Description
BACKGROUND
1. Technical Field

The present disclosure relates to a construction machine and an assistance system for a construction machine.


2. Description of Related Art

There is known a construction machine including a camera disposed at a position that allows it to capture an image of an operator seated in the operator's seat and of an operation lever operated by the operator, and another camera disposed at a position that allows it to capture an image of a traveling operation lever and a traveling operation pedal. In this construction machine, images captured by these cameras are used to prevent malfunctions unintended by the operator.


SUMMARY

A construction machine includes a lower travel body, an upper slewing body mounted on the lower travel body in a slewable manner, a cab mounted on the upper slewing body, and a space recognition device disposed on a pillar in the cab.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a diagram illustrating an example of a system configuration of an assistance system for an excavator;



FIG. 2 is a diagram illustrating the arrangement of an imaging device inside a cab;



FIG. 3A is a front view illustrating a state in which the front window is partway open;



FIG. 3B is a left-side view illustrating a state in which the front window is partway open;



FIG. 4 is a block diagram illustrating a configuration example of a drive system of an excavator;



FIG. 5 is a diagram for explaining an example of a state detection method by a state detector;



FIG. 6 is a flowchart for explaining processing of an excavator according to a first embodiment;



FIG. 7A is a first diagram illustrating an example of image data to be output;



FIG. 7B is a second diagram illustrating an example of image data to be output;



FIG. 8 is a flowchart for explaining an operation of an excavator according to a second embodiment; and



FIG. 9 is a diagram for explaining a system configuration of an assistance system for a construction machine according to a third embodiment.





DETAILED DESCRIPTION

An operator seated in the operator's seat does not always face forward. For this reason, images of the operator's facial features, such as the eyes, nose, and mouth, cannot always be captured.


In view of the above circumstances, it is desirable to be able to capture an image of the operator's facial features. A construction machine disclosed herein includes a lower travel body, an upper slewing body slewably mounted on the lower travel body, a cab mounted on the upper slewing body, and a space recognition device disposed on a pillar in the cab.


First Embodiment

Hereinafter, embodiments will be described with reference to the drawings. In the accompanying drawings, an X axis, a Y axis, and a Z axis are orthogonal to each other. Specifically, the X axis extends along the front-rear axis of the excavator, the Y axis extends along the left-right axis of the excavator, and the Z axis extends along the slew axis of the excavator. In the present embodiment, the X axis and the Y axis extend in the horizontal direction, and the Z axis extends in the vertical direction.



FIG. 1 is a diagram illustrating an example of a system configuration of an assistance system for an excavator. An excavator assistance system SYS of the present embodiment includes an excavator 100 and a management apparatus 200. In the following descriptions, the excavator assistance system SYS is simply referred to as an “assistance system SYS”.


In the assistance system SYS of the present embodiment, the excavator 100 and the management apparatus 200 are connected to each other via a network or the like. The excavator 100 is an example of a work machine.


In the excavator 100 of the present embodiment, a plurality of imaging devices are disposed at respective positions in a cab 10, which is described later, from which a face image of an operator seated on an operator's seat can always be captured. In the present embodiment, a state of the operator is detected based on the image data captured by the plurality of imaging devices disposed in the cab 10, and the detection results and the image data are stored in association with each other.
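The disclosure does not prescribe any data format for storing detection results in association with image data. The following is a minimal sketch of such an association store; the record fields, the state labels, and the store API are all illustrative assumptions, not part of the disclosure:

```python
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class OperatorRecord:
    """One entry pairing captured image data with a detection result."""
    timestamp: datetime
    camera_id: str     # e.g. "S6PL" / "S6PR", the in-cab cameras named later
    image_data: bytes  # encoded still-image or moving-image data
    state: str         # detected operator state (detection method not shown)


class RecordStore:
    """Stores detection results and image data in association with each other."""

    def __init__(self) -> None:
        self._records: list[OperatorRecord] = []

    def add(self, camera_id: str, image_data: bytes, state: str) -> None:
        """Append a new record, timestamping it at insertion time."""
        self._records.append(
            OperatorRecord(datetime.now(timezone.utc), camera_id, image_data, state))

    def by_state(self, state: str) -> list[OperatorRecord]:
        """Retrieve all records whose detected state matches `state`."""
        return [r for r in self._records if r.state == state]
```

A store of this kind could also back the transmission to the management apparatus 200 described below, with records serialized before being sent via the communication device T1.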


Furthermore, the excavator 100 of the present embodiment may transmit information including image data captured by the plurality of imaging devices and detection results to the management apparatus 200, and may cause the management apparatus 200 to manage the information.


In the present embodiment, the management apparatus 200 may be caused to display detection results regarding a state of an operator and image data. A state of an operator in the present embodiment will be described in detail later.


Image data in the present embodiment includes moving image data and still image data. In the example of FIG. 1, the assistance system SYS includes the excavator 100 and the management apparatus 200; however, the assistance system SYS is not limited to this example. The assistance system SYS may include an assistance device that assists an operator who operates the excavator 100. The assistance device may be a portable terminal apparatus such as a smartphone, a tablet device, or a wearable device.


In the example of FIG. 1, the management apparatus 200 is implemented by a single information processing apparatus; however, the management apparatus 200 is not limited to this example. The management apparatus 200 may be implemented by a plurality of information processing apparatuses. In other words, the functions implemented by the management apparatus 200 may be implemented by a plurality of information processing apparatuses.


Furthermore, in the present embodiment, a plurality of imaging devices are disposed inside the cab 10; however, the present disclosure is not limited to this example. It suffices that the imaging devices are disposed at respective positions from which face images of an operator seated on the operator's seat can always be captured.


Hereinafter, the excavator 100 of the present embodiment will be described. FIG. 1 illustrates a side view of the excavator 100.


The excavator 100 is an example of a construction machine, and includes a lower travel body 1, a swing mechanism 2, and an upper slewing body 3. In the excavator 100, the upper slewing body 3 is slewably mounted on the lower travel body 1 via the swing mechanism 2. A boom 4 is attached to the upper slewing body 3. An arm 5 is attached to the distal end of the boom 4, and a bucket 6 as an end attachment is attached to the distal end of the arm 5.


The boom 4, the arm 5, and the bucket 6 constitute an excavation attachment as an example of an attachment. The boom 4 is driven by a boom cylinder 7, the arm 5 is driven by an arm cylinder 8, and the bucket 6 is driven by a bucket cylinder 9. A boom angle sensor S1 is attached to the boom 4, an arm angle sensor S2 is attached to the arm 5, and a bucket angle sensor S3 is attached to the bucket 6.


The boom angle sensor S1 is configured to detect a rotation angle of the boom 4. In the present embodiment, the boom angle sensor S1 is an accelerometer, and is capable of detecting a rotation angle of the boom 4 with respect to the upper slewing body 3 (hereinafter, referred to as a “boom angle”). The boom angle is, for example, at a minimum when the boom 4 is lowered to its lowest position, and increases as the boom 4 is raised.


The arm angle sensor S2 is configured to detect a rotation angle of the arm 5. In the present embodiment, the arm angle sensor S2 is an accelerometer, and is capable of detecting a rotation angle of the arm 5 with respect to the boom 4 (hereinafter, referred to as an “arm angle”). The arm angle is, for example, at a minimum when the arm 5 is closed to the maximum, and increases as the arm 5 is opened.


The bucket angle sensor S3 is configured to detect a rotation angle of the bucket 6. In the present embodiment, the bucket angle sensor S3 is an accelerometer, and is capable of detecting a rotation angle of the bucket 6 with respect to the arm 5 (hereinafter, referred to as a “bucket angle”). The bucket angle is, for example, at a minimum when the bucket 6 is closed to the maximum, and increases as the bucket 6 is opened.


The boom angle sensor S1, the arm angle sensor S2, and the bucket angle sensor S3 may be potentiometers using variable resistors, stroke sensors that detect stroke amounts of corresponding hydraulic cylinders, rotary encoders that detect rotation angles around coupling pins, gyro sensors, combinations of accelerometers and gyro sensors, or the like.


A boom rod pressure sensor S7R and a boom bottom pressure sensor S7B are attached to the boom cylinder 7. An arm rod pressure sensor S8R and an arm bottom pressure sensor S8B are attached to the arm cylinder 8.


A bucket rod pressure sensor S9R and a bucket bottom pressure sensor S9B are attached to the bucket cylinder 9. The boom rod pressure sensor S7R, the boom bottom pressure sensor S7B, the arm rod pressure sensor S8R, the arm bottom pressure sensor S8B, the bucket rod pressure sensor S9R, and the bucket bottom pressure sensor S9B may be collectively referred to as “cylinder pressure sensors”.


The boom rod pressure sensor S7R detects a pressure of a rod-side oil chamber of the boom cylinder 7 (hereinafter, a “boom rod pressure”), and the boom bottom pressure sensor S7B detects a pressure of a bottom-side oil chamber of the boom cylinder 7 (hereinafter, a “boom bottom pressure”). The arm rod pressure sensor S8R detects a pressure of a rod-side oil chamber of the arm cylinder 8 (hereinafter, an “arm rod pressure”), and the arm bottom pressure sensor S8B detects a pressure of a bottom-side oil chamber of the arm cylinder 8 (hereinafter, an “arm bottom pressure”).


The bucket rod pressure sensor S9R detects a pressure of a rod-side oil chamber of the bucket cylinder 9 (hereinafter, a “bucket rod pressure”), and the bucket bottom pressure sensor S9B detects a pressure of a bottom-side oil chamber of the bucket cylinder 9 (hereinafter, a “bucket bottom pressure”).


The upper slewing body 3 is provided with a cab 10 as an operator's cab and a power source such as an engine 11. A sensor for detecting an amount of emitted CO2 may be provided in the vicinity of an exhaust mechanism of the engine 11.


In addition, the upper slewing body 3 is provided with a controller 30, a display device 40, an input device 42, a sound output device 43, a storage device 47, a position measurement device P1, a machine body inclination sensor S4, a slewing angular speed sensor S5, an imaging device S6, and a communication device T1.


On the upper slewing body 3, a power storage that supplies electric power, a motor generator that generates electric power using a rotational driving force of the engine 11, and the like may be mounted. The power storage is, for example, a capacitor or a lithium ion battery. The motor generator may function as a motor to drive a mechanical load or as a generator to supply electric power to an electrical load.


The controller 30 functions as a main control unit that performs drive control of the excavator 100. In the present embodiment, the controller 30 is configured by a computer including a CPU, a RAM, a ROM, and the like. The various functions of the controller 30 are realized by the CPU executing programs stored in the ROM, for example. The various functions may include, for example, at least one of a machine guidance function of guiding a manual operation of the excavator 100 by an operator or a machine control function of automatically assisting a manual operation of the excavator 100 by an operator.


The display device 40 is configured to display various kinds of information. The display device 40 may be connected to the controller 30 via a communication network such as a CAN, or may be connected to the controller 30 via a dedicated line.


The input device 42 is configured to allow an operator to input various kinds of information to the controller 30. The input device 42 includes at least one of a touch panel, a knob switch, a membrane switch, and the like installed in the cab 10.


The sound output device 43 is configured to output sound. The sound output device 43 may be, for example, an in-vehicle speaker connected to the controller 30, or an alarm such as a buzzer. In the present embodiment, the sound output device 43 is configured to output various kinds of information by sound in response to a sound output command from the controller 30.


The storage device 47 is configured to store various kinds of information. The storage device 47 is, for example, a nonvolatile storage medium such as a semiconductor memory. The storage device 47 may store information that is output from various devices during an operation of the excavator 100, or may store information acquired via various devices before an operation of the excavator 100 is started.


Specifically, the storage device 47 may store information including image data captured by a plurality of imaging devices (cameras) disposed in the cab 10 and detection results regarding a state of the operator by the controller 30.


The storage device 47 may store, for example, information regarding a target construction surface acquired via the communication device T1 or the like. The target construction surface may be set by an operator of the excavator 100 or may be set by a construction manager or the like.


The position measurement device P1 is configured to measure a position of the upper slewing body 3. The position measurement device P1 may be configured to measure an orientation of the upper slewing body 3. In the present embodiment, the position measurement device P1 is, for example, a GNSS compass, and detects a position and an orientation of the upper slewing body 3 and outputs the detected values to the controller 30. For this reason, the position measurement device P1 can also function as an orientation detection device that detects an orientation of the upper slewing body 3. The orientation detection device may be an azimuth sensor attached to the upper slewing body 3.


The machine body inclination sensor S4 is configured to detect an inclination of the upper slewing body 3. In the present embodiment, the machine body inclination sensor S4 is an accelerometer that detects a front-rear inclination angle around the front-rear axis and a left-right inclination angle around the left-right axis of the upper slewing body 3 with respect to a virtual horizontal plane. The front-rear axis and the left-right axis of the upper slewing body 3 are orthogonal to each other at, for example, an excavator center point, which is one point on the slew axis of the excavator 100.


The slewing angular speed sensor S5 is configured to detect a slewing angular speed of the upper slewing body 3. The slewing angular speed sensor S5 may be configured to detect or calculate a slewing angle of the upper slewing body 3. In the present embodiment, the slewing angular speed sensor S5 is a gyro sensor. The slewing angular speed sensor S5 may be a resolver, a rotary encoder, or the like.


The imaging device S6 is an example of a space recognition device and is configured to acquire an image of the periphery of the excavator 100. In the present embodiment, the imaging device S6 includes a front camera S6F that images space in front of the excavator 100, a left camera S6L that images space on the left side of the excavator 100, a right camera S6R that images space on the right side of the excavator 100, and a rear camera S6B that images space in the rear of the excavator 100.


The imaging device S6 of the present embodiment may include a plurality of imaging devices provided in the cab 10. The arrangement of the plurality of imaging devices provided in the cab 10 will be described in detail later.


The imaging device S6 is, for example, a monocular camera having an image sensor such as a CCD or a CMOS, and outputs a captured image to the display device 40. The imaging device S6 may be a stereo camera, a range image camera, or the like. The imaging device S6 may be replaced with another space recognition device such as a three-dimensional range image sensor, an ultrasonic sensor, a millimeter wave radar, a LIDAR, or an infrared sensor, or may be replaced with a combination of another space recognition device and a camera.


The front camera S6F is attached to, for example, the ceiling of the cab 10, that is, inside the cab 10. The front camera S6F may be attached to the outside of the cab 10, such as the roof of the cab 10 or the side surface of the boom 4. The left camera S6L is attached to the left end of the upper surface of the upper slewing body 3; the right camera S6R is attached to the right end of the upper surface of the upper slewing body 3; and the rear camera S6B is attached to the rear end of the upper surface of the upper slewing body 3.


The communication device T1 controls communication with an external device outside the excavator 100. In the present embodiment, the communication device T1 controls communication with external devices via a satellite network, a mobile telephone network, the Internet, or the like. The external device may be, for example, the management apparatus 200 such as a server installed in an external facility, or may be an assistance device such as a smartphone carried by a worker in the vicinity of the excavator 100.


The excavator 100 may transmit image data captured by the imaging device S6 to an external device such as the management apparatus 200 via the communication device T1. With this configuration, a worker, a manager, or the like outside the excavator 100 can visually recognize a state of the periphery of the excavator 100 and a state of the operator through a display device, such as a monitor connected to the management apparatus 200 or to the assistance device.


Next, the arrangement of the imaging device S6 in the inside of the cab 10 of the present embodiment will be described with reference to FIG. 2. FIG. 2 is a diagram illustrating the arrangement of the imaging device inside the cab.



FIG. 2 is a perspective view of the inside of the cab 10, illustrating a state when the front is viewed from the operator's seat in the cab 10.


As illustrated in FIG. 2, an operator's seat 90 is installed in the cab 10. A left console 90L is installed on the left side of the operator's seat 90, and a right console 90R is installed on the right side of the operator's seat 90. A left operation lever 26L is attached to an upper portion of a front end of the left console 90L, and a right operation lever 26R is attached to a position corresponding to the left operation lever 26L on the right console 90R. A main monitor 40M, which is one of the display devices 40, is attached to an upper portion of a front end of the right console 90R.


In the cab 10, a camera S6PL is attached to the left pillar 111L, and a camera S6PR is attached to the right pillar 111R. The left pillar 111L and the right pillar 111R are parts of the frame body 110 illustrated in FIG. 3. The camera S6PL and the camera S6PR are imaging devices and are examples of a three-dimensional sensor.


In the present embodiment, as illustrated in FIG. 3, which is described later, the cameras S6PL and S6PR are attached to the left and right pillars at respective positions between a height H1 of a seat surface 90m of the operator's seat 90 and a height H2 of an upper end portion of a headrest 90c of the operator's seat 90. The “height” in the present embodiment may be, for example, a vertical distance from the floor surface 10f of the cab 10.


In the present embodiment, by disposing the cameras S6PL and S6PR in this manner, an image of an operator who is seated on the operator's seat 90 and looking forward can be captured from both the left side and the right side.


Specifically, for example, in the case where an operator seated on the operator's seat 90 is facing the right, an image including face parts of the operator can be captured by the camera S6PR attached to the right pillar 111R. In the case where an operator seated on the operator's seat 90 is facing the left, an image including face parts of the operator can be captured by the camera S6PL attached to the left pillar 111L.
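The text leaves open how the image from the appropriate pillar camera would be chosen. A minimal sketch of such a selection, assuming a hypothetical head-yaw estimate (positive when the operator faces right) and an illustrative threshold, neither of which appears in the disclosure:

```python
def select_face_camera(head_yaw_deg: float, threshold_deg: float = 15.0) -> str:
    """Choose which pillar camera most likely captures the operator's
    facial features. The yaw convention, the threshold value, and the
    return labels are illustrative assumptions."""
    if head_yaw_deg >= threshold_deg:
        return "S6PR"   # right pillar camera sees a right-facing operator
    if head_yaw_deg <= -threshold_deg:
        return "S6PL"   # left pillar camera sees a left-facing operator
    return "either"     # facing roughly forward: both pillar cameras apply
```

For example, `select_face_camera(30.0)` would pick the right pillar camera S6PR, matching the right-facing case described above.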


The cameras S6PL and S6PR of the present embodiment are attached so as not to obstruct the field of view of an operator through the front window 62. For this reason, the cameras S6PL and S6PR of the present embodiment are set to have a size that fits within the widths of the left pillar 111L and the right pillar 111R.


In the present embodiment, by setting the cameras S6PL and S6PR to such a size, the cameras S6PL and S6PR are prevented from interfering with the opening and closing operation of the front window 62.


In the example of FIG. 2, one imaging device is attached to each of the left and right pillars; however, the number of imaging devices attached to each of the left pillar 111L and the right pillar 111R is not limited to this example. In the present embodiment, for example, a plurality of imaging devices may be attached to each of the left and right pillars.


The cameras S6PL and S6PR of the present embodiment may be movable. For example, a rail having a groove may be attached to each of the left pillar 111L and the right pillar 111R, and the cameras S6PL and S6PR may be fitted into the rails attached to the left pillar 111L and the right pillar 111R.


In this case, an operator can move the mounting positions of the cameras S6PL and S6PR by sliding the cameras S6PL and S6PR along the rails.


In the present embodiment, for example, a plurality of attachment portions for attaching the cameras S6PL and S6PR may be provided on each of the left pillar 111L and the right pillar 111R. The attachment portions may be, for example, a USB hub including a plurality of USB (Universal Serial Bus) connectors.


In this case, the cameras S6PL and S6PR are connected to any of the USB connectors of the USB hubs, and thus the attachment positions of the cameras S6PL and S6PR can be changed.


In addition, the cameras S6PL and S6PR of the present embodiment may each be provided with a fixing member, for example on the back surface of the housing, for fixing the camera to the left pillar 111L or the right pillar 111R. In the case where the left pillar 111L and the right pillar 111R are formed of a metal such as steel, the fixing member may be, for example, a magnet, a screw, or the like.


In this case, the cameras S6PL and S6PR are fixed to arbitrary positions of the left pillar 111L and the right pillar 111R, and thus the attachment positions of the cameras S6PL and S6PR can be moved.


As described above, in the present embodiment, the cameras S6PL and S6PR are installed in a movable manner, and thus the cameras S6PL and S6PR can be moved to a position suitable for capturing the face image of the operator according to the sitting height or the like when an operator is seated on the operator's seat 90.


In the present embodiment, although not illustrated, a left monitor and a right monitor may be provided on the left pillar 111L and the right pillar 111R, respectively, at positions where they do not interfere with the cameras S6PL and S6PR.


In this case, the left monitor is preferably attached at a height higher than that of the main monitor 40M and lower than that of the left rear-view mirror 10c. The “height” here is, for example, a vertical distance from the ground. Preferably, the left monitor is attached at substantially the same height as the left camera S6L. Similarly, the right monitor is preferably attached at a height higher than that of the main monitor 40M and lower than that of the right rear-view mirror. Preferably, the right monitor is attached at substantially the same height as the right camera S6R.


The back monitor 40B is attached to an upper portion of the right pillar 111R so as to be disposed along the front ceiling frame 113. The front ceiling frame 113 is a part of the frame body 110 illustrated in FIG. 3.


The back monitor 40B may be attached to an upper portion of the left pillar 111L so as to be disposed along the front ceiling frame 113.


In this way, the left monitor may be disposed, as if it were a left rear-view mirror, in the left portion of the field of view of an operator who is seated on the operator's seat of the excavator and viewing the front. Similarly, the right monitor may be disposed, as if it were a right rear-view mirror, in the right portion of the field of view.


In the present embodiment, the back monitor 40B is likewise disposed in the upper portion of the field of view as if it were a rear-view mirror. Because the monitors are arranged in this way, an operator of the excavator can intuitively recognize that the image displayed on the left monitor is a mirror image of the left rear area of the excavator. Similarly, the operator can intuitively recognize that the image displayed on the right monitor is a mirror image of the right rear area of the excavator and that the image displayed on the back monitor 40B is a mirror image of the area behind the excavator.


The images displayed on the left monitor, the right monitor, and the back monitor 40B correspond to the images captured by the left camera S6L, the right camera S6R, and the rear camera S6B, respectively. In other words, the left monitor, the right monitor, and the back monitor 40B independently display views in different directions. The display of the left monitor, the right monitor, and the back monitor 40B is started simultaneously with the activation of the main monitor 40M when an operator turns the key ON. Alternatively, the display may be started at the same time as the start of the engine 11.
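Presenting a camera feed as a mirror image amounts to a left-right flip of each frame. A minimal sketch, representing a frame as a hypothetical 2-D list of pixel values rather than any format used by the actual cameras:

```python
def mirror_for_monitor(frame: list[list[int]]) -> list[list[int]]:
    """Left-right flip each row of a camera frame so the monitor shows a
    mirror image, as a physical rear-view mirror would. The 2-D-list frame
    representation is an illustrative assumption."""
    return [row[::-1] for row in frame]


# Example: a 1x3 single-channel frame is reversed horizontally.
mirrored = mirror_for_monitor([[1, 2, 3]])   # [[3, 2, 1]]
```

The same flip would be applied to the feeds of the left camera S6L, the right camera S6R, and the rear camera S6B before display.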


The left monitor, the right monitor, and the back monitor 40B are attached so as not to obstruct the field of view of an operator through the front window 62. For this reason, in the present embodiment, the left monitor and the right monitor have sizes that fit within the widths of the left pillar 111L and the right pillar 111R, and the back monitor 40B is attached to the upper right corner of the front window 62. The left monitor and the right monitor may have widths wider than the widths of the left pillar 111L and the right pillar 111R, and the back monitor 40B may have a size that fits within the widths of the front ceiling frame 113. The left monitor, the right monitor, and the back monitor 40B are attached at respective positions where they do not interfere with the opening and closing of the front window 62.


The screen sizes and resolutions of the left monitor, the right monitor, and the back monitor 40B are preferably selected so that an image of a person within a predetermined range (for example, 12 m) from the excavator is displayed on the screen in a size larger than a predetermined size (for example, 7 mm×7 mm). For example, monitors having a screen size of 7 inches or more, preferably 7 inches or 8 inches, are adopted as the left monitor and the right monitor.
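Whether a given camera and screen combination satisfies such a criterion can be estimated from the object's angular extent relative to the camera's field of view. The following sketch uses a pinhole-camera simplification and assumes the field of view maps linearly onto the screen; the field-of-view and screen-width figures are illustrative assumptions, not values from the disclosure:

```python
import math


def on_screen_size_mm(object_m: float, distance_m: float,
                      fov_deg: float, screen_mm: float) -> float:
    """Approximate displayed size of an object: its angular extent as a
    fraction of the camera's field of view, mapped onto the screen width.
    Pinhole-camera simplification; all numeric inputs are illustrative."""
    angular_extent = 2.0 * math.atan(object_m / (2.0 * distance_m))
    return screen_mm * angular_extent / math.radians(fov_deg)


# A 0.5 m-wide person at 12 m, with an assumed 90-degree horizontal field of
# view shown on an assumed 155 mm-wide (7-inch-class) screen:
width_mm = on_screen_size_mm(0.5, 12.0, 90.0, 155.0)
meets_criterion = width_mm >= 7.0   # compare against the 7 mm example figure
```

A helper of this kind lets one check candidate combinations of screen size and camera field of view against the 7 mm example figure; a narrower field of view or a larger screen increases the displayed size.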


The left monitor and the right monitor are attached at the same height with respect to a reference horizontal plane. The reference horizontal plane is, for example, the ground on which the excavator is located. In the present embodiment, the monitors are attached so as to be bilaterally symmetrical with respect to the center line of the operator's cab indicated by the alternate long and short dash line in FIG. 2.


The left monitor, the right monitor, and the back monitor 40B may be configured such that the attachment angles thereof can be adjusted in accordance with a body shape, a working posture, and the like of an operator seated on the operator's seat 90.


As illustrated in FIG. 2, the operator's seat 90 is provided at the center of the cab 10, and a left operation lever 26L and a right operation lever 26R are provided on both sides of the operator's seat 90. For this reason, an operator seated on the operator's seat 90 can move the bucket 6 to a desired position and perform excavation work by operating the left operation lever 26L with the left hand, and operating the right operation lever 26R with the right hand.


An image display part 41M and a switch panel 42M of the main monitor 40M are installed on the right front side of the operator's seat 90. An operator of the excavator can ascertain an operation state of the excavator by viewing the image display part 41M. In the example of FIG. 2, the image display part 41M displays an overhead view image. The overhead view image is an example of a composite image generated based on images captured by the rear camera S6B and the left and right side cameras. Specifically, the overhead view image is a viewpoint conversion image representing a state when the surroundings of the excavator are viewed from a virtual viewpoint directly above.
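A viewpoint conversion image of this kind is commonly rendered by inverse mapping through a ground-plane homography. The disclosure does not describe the implementation; the following is a minimal nearest-neighbour sketch, assuming a 3x3 homography obtained from prior camera calibration and a single-channel frame represented as a 2-D list:

```python
def top_down_view(img: list[list[int]], hom: list[list[float]],
                  out_w: int, out_h: int) -> list[list[int]]:
    """Render a viewpoint-conversion (overhead) image by inverse mapping:
    each output pixel is a point on the ground plane, and the homography
    `hom` (assumed to come from calibration) gives the corresponding source
    pixel, sampled nearest-neighbour. Out-of-frame pixels stay 0."""
    h_src, w_src = len(img), len(img[0])
    out = [[0] * out_w for _ in range(out_h)]
    for y in range(out_h):
        for x in range(out_w):
            # Homogeneous transform: [u, v, w] = hom . [x, y, 1]
            u = hom[0][0] * x + hom[0][1] * y + hom[0][2]
            v = hom[1][0] * x + hom[1][1] * y + hom[1][2]
            w = hom[2][0] * x + hom[2][1] * y + hom[2][2]
            if w == 0:
                continue  # point at infinity; leave the pixel empty
            sx, sy = round(u / w), round(v / w)
            if 0 <= sx < w_src and 0 <= sy < h_src:
                out[y][x] = img[sy][sx]
    return out
```

A full overhead view as displayed on the image display part 41M would additionally stitch the warped outputs of the rear camera S6B and the left and right side cameras into one composite image.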


The left monitor is attached to the left pillar 111L, and the right monitor is attached to the right pillar 111R. The left monitor and the right monitor may be attached at respective positions such that they fall within the operator's peripheral field of view when the operator holds the bucket 6 in the central field of view through the front window 62 of the cab 10. For this reason, when an operator is performing excavation work while viewing the bucket 6 in the central field of view, the operator can see the states of the left rear area and the right rear area of the excavator displayed on the left monitor and the right monitor in the peripheral field of view without moving the line of sight.


A selection dial 52 and an operation device 53 are provided on the left console 90L. In the case where the operator wants to change the range of the left monitor displaying an image of the left rear area of the excavator, the operator operates the selection dial 52 to select the left camera S6L. Then, the operator operates the operation device 53 to change the orientation of the left camera S6L, thereby changing the range of the image displayed on the left monitor. The same operation is performed in the case where the range of the image reflected by the left rear-view mirror 10c is changed.


Next, opening and closing of the front window 62 will be described with reference to FIG. 3. FIG. 3 illustrates a state in which the front window is partway open; FIG. 3A is a front view, and FIG. 3B is a left-side view.


The cab 10 includes an operation device 26 as a driving operation part, the operator's seat 90, the display device 40, a gate lock lever 45, the cameras S6PL and S6PR, and the like. Although not illustrated, the controller 30 illustrated in FIG. 1 and the like are mounted in the cab 10.


The operation device 26 includes a left operation lever 26L, a right operation lever 26R, a travel lever 26B, a pedal 26C, and the like. The gate lock lever 45 and a gate lock valve 19 are provided at a lower left position of the operator's seat 90. In the case where the gate lock lever 45 is pulled up so that an operator cannot exit from the cab 10, the gate lock valve 19 is switched to a communicating state (open state), whereby the pilot line communicates and the various operation devices become operable. In the case where the gate lock lever 45 is pulled down so that the operator can exit from the cab 10, the gate lock valve 19 is switched to a non-communicating state (closed state), whereby the pilot line is shut off and the various operation devices become inoperable.


The operator's seat 90 has an armrest 90a, a seat back 90b, and a headrest 90c. In the present embodiment, the height from the floor surface 10f of the cab 10 to the seat surface 90m of the operator's seat 90 is H1, and the height from the floor surface 10f of the cab 10 to the upper end of the headrest 90c is H2.


The cameras S6PL and S6PR of the present embodiment are disposed at respective positions where the height from the floor surface 10f of the cab 10 is between H1 and H2, on the left pillar 111L and the right pillar 111R, respectively. In other words, the cameras S6PL and S6PR of the present embodiment are arranged in such a manner that the height of the cameras from the floor surface 10f of the cab 10 is within the height range H3 from H1 to H2.


In the present embodiment, by arranging the cameras S6PL and S6PR in this way, it is possible to capture a face image of an operator regardless of the sitting height of the operator seated on the seat surface 90m.


The cab 10 has a frame body 110. The frame body 110 is formed by combining a vertical frame, a horizontal frame, and a connecting frame. The vertical frame includes a pair of left and right pillars 111 (111L and 111R) positioned on the front side (the traveling direction side, the Z1 side) and a pair of left and right vertical frames (pillars) 112 positioned on the rear side (the Z2 side). The horizontal frame includes a front ceiling frame 113 horizontally bridged between the left and right pillars 111L and 111R on the front side, and a rear ceiling frame 114 horizontally bridged between the left and right pillars 112 on the rear side. The pair of left and right pillars 111 on the front side and the pair of left and right pillars 112 on the rear side are connected by a pair of left and right connecting frames 115, respectively.


In the example of FIG. 2 of the present embodiment, an imaging device may also be attached to each of the left and right pillars 112 on the rear side. In this case, at least four imaging devices are provided in the cab 10.


In the present embodiment, attaching an imaging device to each of the rear left and right pillars 112 allows the imaging devices to capture a face image of an operator even when the operator's face is directed rearward.


The cab 10 has a window surface front portion 60 arranged in a front surface frame formed by the frame body 110. In the cab 10, side windows 65 are disposed in the left and right frames formed by the frame body 110. Furthermore, the cab 10 has a head window disposed in an upper surface frame formed by the frame body 110.


The window surface front portion 60 includes a lower front window 61, a front window 62, and an upper front window 63.


The front window 62 is supported by a slide mechanism that allows the front window 62 to slide. In the present embodiment, the front window 62 and the lower front window 61 are separated from each other so that the front window 62 can be accommodated in the cab 10 when slid.


The slide mechanism of the present embodiment includes a pair of left and right slide rails 111a provided on the respective inner surfaces of the pair of left and right pillars 111, and a pair of left and right slide rails 115a provided on the respective inner surfaces of the left and right connecting frames 115. The front window 62 is disposed between the left and right slide rails 111a and between the left and right slide rails 115a. The slide rail 111a and the slide rail 115a may be formed in such a manner that the rail grooves are continuous, and a sliding portion (not illustrated) provided at an end portion of the front window 62 or the like may be configured to be movable from the slide rail 111a to the slide rail 115a. The sliding portion may be a roller or the like.


The front window 62 is disposed between the lower front window 61 and the upper front window 63 when the front window 62 is closed. When the front window 62 is opened, the front window 62 is separated from the lower front window 61 and the upper front window 63 and slides in the opening direction (toward the Y1 side). The front window 62 slides in a closing direction (toward the Y2 side) when closing, and is disposed between the lower front window 61 and the upper front window 63.


The front window 62 is provided with a handle portion 62a at each of an upper left position and an upper right position. The handle portion 62a may be provided only on one of the left and right sides. An operator grips the handle portion 62a and slides the front window 62 in the opening direction or the closing direction. The handle portion 62a may have a lock mechanism for fixing the front window 62 between the lower front window 61 and the upper front window 63.


A sealing member (not illustrated) may be interposed between the upper front window 63 and the front window 62. Sealing members may be provided at the lower edge portion of the upper front window 63 and at the upper edge portion of the front window 62, respectively. The sealing member provided at the lower edge portion of the upper front window 63 may have an eaves portion on the front side (the Z1 side). The eaves portion prevents rainwater or the like from entering from between the two sealing members. The two sealing members may be configured to be separable from each other.


The lower front window 61 is directly fixed to the frame body 110 or the like. A sealing member may be interposed between the lower front window 61 and the front window 62.


The front window 62 is disposed flush with the upper front window 63 with a sealing member interposed therebetween when closed, and constitutes the window surface front portion 60 of the cab 10. When the operator performs an opening operation on the handle portion 62a, the front window 62 moves rearward (toward the Z2 side) and separates from the upper front window 63. At this time, the front window 62 also separates from the lower front window 61. The opening operation refers to an unlocking operation for releasing the front window 62 from the fixed state and an operation for sliding the front window 62 upward.


Next, when an operator grips the handle portion 62a of the front window 62 and performs an opening operation, the front window 62 slides in the opening direction (the Y1 direction) as illustrated in FIG. 3A. At this time, as the cameras S6PL and S6PR have sizes that fit within the respective widths of the left pillar 111L and the right pillar 111R, the front window 62 does not touch the cameras S6PL and S6PR. In other words, the cameras S6PL and S6PR do not interfere with the opening and closing operation of the front window 62.


When the front window 62 is slid, the front window 62 is disposed at a position parallel to the upper surface of the cab 10. At this time, the front surface of the cab 10 is opened at a portion where the front window 62 is disposed.


Next, a configuration of a drive system of the excavator 100 will be described with reference to FIG. 4. FIG. 4 is a block diagram illustrating a configuration example of a drive system of the excavator. In FIG. 4, the mechanical power system, the high-pressure hydraulic line, the pilot line, and the electric control system are indicated by a double line, a thick solid line, a broken line, and a dotted line, respectively.


The drive system of the excavator 100 mainly includes the engine 11, a regulator 13, a main pump 14, a pilot pump 15, a control valve 17, the operation device 26, a discharge pressure sensor 28, an operation pressure sensor 29, the controller 30, a proportional valve 31, a work mode selection dial 32, and the like.


The engine 11 is a drive source of the excavator. In the present embodiment, the engine 11 is, for example, a diesel engine that operates to maintain a predetermined rotational speed. An output shaft of the engine 11 is connected to respective input shafts of the main pump 14 and the pilot pump 15.


The main pump 14 supplies a hydraulic oil to the control valve 17 via a high-pressure hydraulic line. In the present embodiment, the main pump 14 is a swash plate type variable displacement hydraulic pump.


The regulator 13 controls a discharge amount of the main pump 14. In the present embodiment, the regulator 13 controls a discharge amount of the main pump 14 by adjusting the swash plate tilting angle of the main pump 14 in response to a control command from the controller 30.


The pilot pump 15 supplies a hydraulic oil to various hydraulic control devices including the operation device 26 and the proportional valve 31 via the pilot line. In the present embodiment, the pilot pump 15 is a fixed displacement hydraulic pump.


The control valve 17 is a hydraulic control device that controls the hydraulic system in the excavator. The control valve 17 includes control valves 171 to 176 and a bleed valve 177. The control valve 17 can selectively supply the hydraulic oil discharged by the main pump 14 to one or a plurality of hydraulic actuators through the control valves 171 to 176.


The control valves 171 to 176 control a flow rate of a hydraulic oil flowing from the main pump 14 to the hydraulic actuator and a flow rate of the hydraulic oil flowing from the hydraulic actuator to a hydraulic oil tank. The hydraulic actuators include the boom cylinder 7, the arm cylinder 8, the bucket cylinder 9, a left travel hydraulic motor 1A, a right travel hydraulic motor 1B, and a slew hydraulic motor 2A.


The bleed valve 177 controls the flow rate of hydraulic oil discharged from the main pump 14 to the hydraulic oil tank without passing through the hydraulic actuators (hereinafter, "bleed flow rate"). The bleed valve 177 may be provided outside the control valve 17.


The operation device 26 is a device used by an operator to operate the hydraulic actuators. In the present embodiment, the operation device 26 supplies a hydraulic oil discharged by the pilot pump 15 to the pilot ports of the control valves respectively corresponding to the hydraulic actuators via the pilot lines. The pressure (pilot pressure) of a hydraulic oil supplied to each of the pilot ports is a pressure in accordance with an operation direction and an operation amount of a lever or a pedal (not illustrated) of the operation device 26 corresponding to each of the hydraulic actuators.


The discharge pressure sensor 28 detects a discharge pressure of the main pump 14. In the present embodiment, the discharge pressure sensor 28 outputs the detected value to the controller 30.


The operation pressure sensor 29 detects content of an operator's operation using the operation device 26. In the present embodiment, the operation pressure sensor 29 detects an operation direction and an operation amount of the lever or the pedal of the operation device 26 corresponding to each of the hydraulic actuators in the form of pressure (operation pressure), and outputs the detected value to the controller 30. Content of an operation of the operation device 26 may be detected using a sensor other than the operation pressure sensor.


The controller 30 is a control unit that controls the entire excavator 100. The functions of the controller 30 of the present embodiment will be described in detail later.


The proportional valve 31 operates in response to a control command that is output from the controller 30. In the present embodiment, the proportional valve 31 is an electromagnetic valve that adjusts a secondary pressure introduced from the pilot pump 15 to the pilot port of the bleed valve 177 in the control valve 17 in accordance with an electric current command that is output from the controller 30. The proportional valve 31 operates in such a manner that, for example, the secondary pressure introduced into the pilot port of the bleed valve 177 increases as the electric current command increases.


The work mode selection dial 32 is a dial for an operator to select an operation mode, and enables switching between a plurality of different operation modes. Data indicating a setting state of the engine speed and a setting state of the acceleration/deceleration characteristics corresponding to a work mode is constantly transmitted from the work mode selection dial 32 to the controller 30.


The work mode selection dial 32 is configured to switch the work mode among multiple stages: an SP mode, an H mode, an A mode, and an IDLE mode. In other words, the work mode selection dial 32 of the present embodiment can switch the setting condition of the excavator 100.


The SP mode is an example of a first mode, and the H mode is an example of a second mode. FIG. 4 illustrates a state in which the SP mode is selected by the work mode selection dial 32.


The SP mode is a work mode selected in the case where a work amount is prioritized, and uses a highest engine speed and highest acceleration/deceleration characteristics. The H mode is a work mode selected in the case where it is desired to achieve both a work amount and fuel efficiency, and uses a second highest engine speed and second highest acceleration/deceleration characteristics.


The A mode is a work mode selected in the case where it is desired to operate the excavator with low noise by making the acceleration and deceleration characteristics of the hydraulic actuators gentle in response to lever operations, thereby improving fine operability and safety; it uses a third highest engine speed and third highest acceleration/deceleration characteristics. The IDLE mode is a work mode selected in the case where the engine 11 is desired to be in a low idling state, and uses a lowest engine speed and lowest acceleration/deceleration characteristics.


Here, in the case where the operation of each actuator is stopped while the engine is running in each work mode (high idling state), the controller 30 causes the engine 11 to maintain the engine speed that is set for each work mode. The controller 30 may switch the engine speed to the low idling state when the high idling state continues for a predetermined time. The idling state includes the high idling state and the low idling state.
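The idle-switching behavior described above can be sketched as follows. This is a minimal illustration, not the application's implementation: the function name, the rpm values, and the timeout are hypothetical stand-ins for the per-mode engine speeds and the "predetermined time".

```python
# Hypothetical engine speeds (rpm) per work mode; values are illustrative only.
MODE_RPM = {"SP": 2100, "H": 1900, "A": 1700, "IDLE": 1000}
LOW_IDLE_RPM = 800          # assumed low-idling speed
LOW_IDLE_TIMEOUT = 5.0      # assumed "predetermined time", in seconds

def target_engine_speed(work_mode, actuators_idle_since, now):
    """Return the engine speed the controller should command.

    While the actuators are stopped (high idling state), the speed set for
    the current work mode is maintained; once the high idling state has
    continued for the predetermined time, switch to the low idling speed.
    `actuators_idle_since` is None while any actuator is operating.
    """
    if actuators_idle_since is not None and now - actuators_idle_since >= LOW_IDLE_TIMEOUT:
        return LOW_IDLE_RPM
    return MODE_RPM[work_mode]
```

For example, with the assumed values above, a machine in SP mode whose actuators have been stopped longer than the timeout would be commanded down to the low-idling speed.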


In the above description, the names of the respective stages of the work mode are the SP mode, the H mode, the A mode, and the IDLE mode; however, the names of the respective stages are not limited thereto. For example, the SP mode, the H mode, the A mode, and the IDLE mode may instead be referred to as a POWER mode, an STD mode, an ECO mode, and an IDLE mode (low idling state), respectively. The work mode is not limited to those of the present embodiment, and may be set to five or more stages.


The engine 11 is controlled to rotate at a constant speed corresponding to the engine speed of the work mode that is set by the work mode selection dial 32. The opening of the bleed valve 177 is controlled based on the bleed valve opening characteristics of the work mode that is set by the work mode selection dial 32.


In the present embodiment, each of the above-described work modes may be referred to as a setting condition of the excavator 100, and information indicating the setting condition may be referred to as setting condition information. The setting condition information is information in which a designated item and a value of the item are associated with each other. The designated item is, for example, an item indicating the state of an engine speed corresponding to each work mode or an item indicating the state of the acceleration/deceleration characteristics. Therefore, the setting condition information of the present embodiment includes an item and a value of the item indicating the state of an engine speed corresponding to each work mode, and an item and a value of the item indicating the state of the acceleration/deceleration characteristics.
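As a minimal sketch of such setting condition information, the item-and-value association can be represented as a mapping. The item names and values below are hypothetical, chosen only to illustrate the structure described above.

```python
# Hypothetical setting condition information: each designated item is
# associated with a value of that item. Names and values are illustrative.
setting_condition = {
    "work_mode": "SP",
    "engine_speed_rpm": 2100,                 # state of the engine speed
    "accel_decel_characteristic": "highest",  # state of the accel/decel characteristics
}

def get_item(info, item):
    """Look up the value associated with a designated item."""
    return info[item]
```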


In the configuration diagram of FIG. 4, the ECO mode is set as one of the modes selected by the work mode selection dial 32, but an ECO mode switch may be provided separately from the work mode selection dial 32. In this case, the engine speed may be adjusted in accordance with each mode selected using the work mode selection dial 32, and the acceleration/deceleration characteristics corresponding to each mode of the work mode selection dial 32 may be gradually changed when the ECO mode switch is turned on.


The change of the work mode may be realized by sound input. In this case, the excavator is provided with a sound input device for inputting a sound uttered by an operator to the controller 30. The controller 30 is also provided with a sound identification section for identifying a sound that is input via the sound input device.


In this manner, the work mode is selected by the work mode selection dial 32, the ECO mode switch, the sound identification section, and the like.


Next, the function of the controller 30 of the present embodiment will be described. The controller 30 of the present embodiment includes an image data acquirer 301, a state detector 302, and an outputter 303.


The image data acquirer 301 acquires image data captured by the imaging device S6. Specifically, the image data acquirer 301 acquires the image data (moving image data) captured by the camera S6PL and the camera S6PR. In the case where the left and right pillars 112 on the rear side are also provided with cameras, the image data acquirer 301 additionally acquires image data captured by these cameras.


The state detector 302 detects a state of an operator based on image data acquired by the image data acquirer 301. At this time, in the present embodiment, image data suitable for detection of a state is selected from the image data captured by the cameras S6PL and S6PR in accordance with the state of the operator that is the target of detection. Then, the state detector 302 detects the state of the operator based on the selected image data. The processing of the state detector 302 will be described in detail later.


The controller 30 stores information in which image data acquired by the image data acquirer 301 and a detection result acquired by the state detector 302 are associated with each other in the storage device 47. In the following description, information in which image data (moving image data) acquired by the image data acquirer 301 and a detection result by the state detector 302 are associated with each other may be referred to as “state history information”.


The outputter 303 outputs the state history information stored in the storage device 47 to the communication device T1. In other words, the outputter 303 outputs the state history information to the management apparatus 200 via the communication device T1.


The controller 30 of the present embodiment may stop capturing image data by the camera S6PL and the camera S6PR in the case where, for example, a state in which an operator is not seated in the operator's seat 90 continues for a certain period of time. The state where an operator is not seated on the operator's seat 90 is a state where an image of the operator is not included in the image represented by the image data acquired by the image data acquirer 301. The certain period of time may be, for example, about one hour.
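This timeout behavior can be sketched as follows; the function, the frame representation, and the exact timeout value are hypothetical, assuming only that each captured frame carries a timestamp and whether the operator appears in it.

```python
ABSENCE_TIMEOUT = 3600.0  # assumed "certain period of time" (about one hour), in seconds

def should_stop_capturing(frames, now):
    """Decide whether to stop capturing with the cameras (e.g. S6PL/S6PR).

    `frames` is a list of (timestamp, operator_visible) tuples, oldest first.
    Capturing stops when the operator has not been seen for ABSENCE_TIMEOUT
    seconds as of `now`.
    """
    seen = [t for t, visible in frames if visible]
    if seen:
        return now - max(seen) >= ABSENCE_TIMEOUT
    # The operator was never seen: require a full observation window first.
    return bool(frames) and now - frames[0][0] >= ABSENCE_TIMEOUT
```

Stopping the cameras (and, as noted below, possibly the engine) would then be a side effect driven by this predicate in the controller's periodic processing.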


The controller 30 of the present embodiment may turn off the engine 11 as the drive source when a state where an operator is not seated on the operator's seat 90 continues for a certain time or more.


The state of an operator and the processing of the state detector 302 in the present embodiment will be described below.


The state of an operator in the present embodiment includes a posture of an operator when an operator is seated on the operator's seat 90, whether or not the operator wears accessories (helmet, seat belt, etc.), facial expressions, behaviors, and the like.


Specifically, for example, the state of an operator includes a state in which an operator is sitting with their legs crossed, a state in which an operator is not wearing a helmet, and the like. The posture of sitting with the legs crossed may cause an erroneous operation, and is therefore an inappropriate posture for an operator to take during operation of the excavator 100. The state in which a helmet is not worn is a state in which accessories for enhancing the safety of an operator during a work are not worn, and is therefore an undesirable state.


The state of an operator includes, for example, a state in which an operator is yawning, a state in which the operator's eyes are closed, a state in which an operator is continuously blinking, and the like. The yawning state is estimated to be a state in which the operator's concentration on work is declining. The state in which the operator's eyes are closed for a predetermined time or more is estimated to be a state in which the operator feels drowsy. The state in which an operator continuously blinks is estimated to be a state in which the operator feels fatigue.


Furthermore, the state of an operator includes, for example, a state in which an operator is performing an action unrelated to work. The state of performing an action unrelated to work is estimated to be, for example, a state of not being engaged in the operator's work.


In the present embodiment, the state detector 302 detects the above-described states as a state of an operator. In other words, the state detector 302 can detect a posture, presence or absence of accessories, drowsiness, a decline in concentration, a degree of fatigue, a working attitude, and the like as a state of an operator. A state of the operator detected by the state detector 302 may be defined in advance.


In addition, when detecting a plurality of states of an operator, the state detector 302 of the present embodiment selects image data corresponding to the state to be detected from image data captured by the cameras S6PL and S6PR, and detects a state using the selected image data.


Specifically, for example, when drowsiness, a decline in concentration, a degree of fatigue, or the like is detected as a state of an operator, the facial expression of the operator is important. For this reason, the state detector 302 selects image data (moving image data) including a face image from images captured by the cameras S6PL and S6PR. Then, the state detector 302 detects whether the operator feels drowsy, whether the concentration is declining, the degree of fatigue, and the like from the selected image data, and stores state history information including the detection result in the storage device 47.


Furthermore, for example, when a posture, presence or absence of accessories, a working attitude, and the like are detected as a state of an operator, an image of the entire body of the operator is important. For this reason, the state detector 302 selects image data (moving image data) including the entire body of the operator from image data captured by the cameras S6PL and S6PR. Then, the state detector 302 detects whether or not the posture of the operator is appropriate, whether or not the operator appropriately wears accessories, and whether or not the operator is engaged in work from the selected image data, and stores state history information including the detection result in the storage device 47.
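The selection logic in the two preceding paragraphs can be sketched as a simple dispatch on the target state. The state names, frame dictionary keys, and grouping below are hypothetical, chosen only to mirror the face-image versus whole-body distinction described above.

```python
# Hypothetical grouping: facial states need a face image, whole-body states
# need an image of the operator's entire body. Names are illustrative only.
FACE_STATES = {"drowsiness", "decline_in_concentration", "fatigue"}
BODY_STATES = {"posture", "accessories", "working_attitude"}

def select_image_data(target_state, frames):
    """Pick, from frames captured by the cameras (e.g. S6PL/S6PR), the
    image data suitable for detecting `target_state`.

    Each frame is a dict such as
    {"camera": "S6PL", "has_face": True, "has_full_body": False}.
    """
    if target_state in FACE_STATES:
        return [f for f in frames if f["has_face"]]
    if target_state in BODY_STATES:
        return [f for f in frames if f["has_full_body"]]
    raise ValueError(f"unknown state: {target_state}")
```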


The state history information of the present embodiment is transmitted to the management apparatus 200 by the outputter 303 and is managed by the management apparatus 200. The state history information may include, for example, identification information for identifying an operator and a machine number for identifying the excavator 100.


Therefore, in the present embodiment, for example, in the case where a manager tries to ascertain the state of a certain operator whose concentration declined during work, it suffices that the manager inputs identification information for identifying the operator and the information "decline in concentration" designating the state to the management apparatus 200. In response to this input, it suffices that the management apparatus 200 displays the image data associated with the detection result "decline in concentration" in the state history information including the identification information of the operator.
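Such a lookup over the state history information can be sketched as a filter; the record keys below are hypothetical stand-ins for the identification information, the detection result, and the associated image data.

```python
def query_state_history(history, operator_id, state):
    """Return the image data from state history records that match an
    operator's identification information and a designated detection result.

    `history` is a list of dicts; the keys are illustrative, not taken
    from the application.
    """
    return [rec["image_data"] for rec in history
            if rec["operator_id"] == operator_id
            and rec["detected_state"] == state]
```

On the management apparatus side, the matching image data would then be displayed to the manager.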


The state detector 302 according to the present embodiment may be, for example, a trained model generated by learning such as machine learning.


A state detection method by the state detector 302 of the present embodiment will be described below with reference to FIG. 5.



FIG. 5 is a diagram for explaining an example of a state detection method by the state detector. The state detector 302 according to the present embodiment may be a trained model mainly configured by a neural network DNN. In other words, the controller 30 may have a trained model that realizes the state detector 302.


The neural network DNN is a so-called deep neural network having one or more intermediate layers (hidden layers) between an input layer and an output layer. In the neural network DNN, a weighting parameter representing a connection strength with the lower layer is defined for each of the plurality of neurons constituting each intermediate layer. The neural network DNN is configured in such a manner that each neuron multiplies the input values from the plurality of neurons of the upper layer by the weighting parameters defined for those connections, sums the results, and outputs the sum to the neurons of the lower layer through a threshold function.


Machine learning, specifically deep learning, is performed on the neural network DNN, and the above-described weighting parameters are optimized. Thus, the neural network DNN can receive image data acquired by the image data acquirer 301 as an input signal x and output, as an output signal y, a probability that the state of the operator is each predefined state.
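The weighted-sum-and-threshold operation of one layer, and the conversion of output-layer scores into per-state probabilities, can be sketched as follows. This is a toy illustration with arbitrary weights, not the application's trained model; a sigmoid stands in for the unspecified threshold function.

```python
import math

def layer_forward(inputs, weights, biases):
    """One layer of the DNN: each neuron multiplies the upper-layer inputs
    by its weighting parameters, sums the results, and passes the sum
    through a threshold function (here, a sigmoid)."""
    outputs = []
    for neuron_weights, bias in zip(weights, biases):
        s = sum(x * w for x, w in zip(inputs, neuron_weights)) + bias
        outputs.append(1.0 / (1.0 + math.exp(-s)))
    return outputs

def softmax(scores):
    """Convert output-layer scores into probabilities, one per predefined
    operator state, summing to 1."""
    exps = [math.exp(v) for v in scores]
    total = sum(exps)
    return [e / total for e in exps]
```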


In the present embodiment, the output signal y1 that is output from the neural network DNN indicates that a prediction probability that a state of an operator at the time of input of an input signal x is an inappropriate posture is 10%. The output signal y2 indicates that a probability that an operator is in a state of low concentration at the time of an input of an input signal x is 50%.


The state detector 302 according to the present embodiment may acquire, for example, a set of a plurality of states included in the output signal y and a probability that a state of an operator is each of the plurality of states as a detection result regarding a state of an operator and accumulate the set as a part of the state history information.


In the present embodiment, accumulating the state history information in this way allows a manager or the like to ascertain the details of the history of the state of the operator during work, even after the operator's work is finished.


The state detector 302 may set a state having a highest probability among all the states included in the output signal y as the state of the operator when the input signal x is input (detection result).


Specifically, for example, it is assumed that a state with a highest probability among the plurality of states included in the output signal y is a state of low concentration. In this case, the state detector 302 may select “state of low concentration” as a state of an operator at this timing from among the plurality of states, and may use the selected state as the detection result.


The neural network DNN is, for example, a convolutional neural network (CNN). The CNN is a neural network to which an existing image processing technique (convolution processing and pooling processing) is applied.


Next, the operation of the excavator 100 of the present embodiment will be described with reference to FIG. 6. FIG. 6 is a flowchart for explaining the processing of the excavator according to the first embodiment.


The controller 30 of the excavator 100 according to the present embodiment acquires image data from the cameras S6PL and S6PR through the image data acquirer 301 (step S601).


Subsequently, the state detector 302 selects image data to be used for detection from the plurality of acquired image data in accordance with the type of the state, and detects a state of an operator using the selected image data (step S602).


Subsequently, the controller 30 causes the outputter 303 to store the state history information including the detection result and the image data used for the state detection in the storage device 47 (step S603).
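Steps S601 to S603 above can be summarized in one sketch. The camera, detector, and storage interfaces are hypothetical stand-ins for the imaging device S6, the state detector 302, and the storage device 47; only the ordering of the three steps is taken from the flowchart.

```python
def process_cycle(cameras, detector, storage):
    """One pass of FIG. 6: acquire image data (S601), select the data
    suited to each target state and detect (S602), then store the state
    history information (S603)."""
    frames = [camera.capture() for camera in cameras]     # S601
    records = []
    for state in detector.target_states:                  # S602
        selected = detector.select(state, frames)
        result = detector.detect(state, selected)
        records.append({"state": state, "result": result, "image_data": selected})
    storage.save(records)                                 # S603
    return records
```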


In this way, in the excavator 100 of the present embodiment, image data suitable for detection of a state of an operator is selected from a plurality of image data (moving image data) captured by a plurality of imaging devices disposed in the cab 10, and a state of the operator is detected using the selected image data.


In other words, in the present embodiment, for example, if a state of an operator that a manager or the like wants to detect is considered as a “purpose”, image data suitable for detecting the purpose is automatically selected from a plurality of image data captured inside the cab 10. In the present embodiment, a state of an operator, which is a purpose of the detection, is detected using selected image data.


In this way, in the present embodiment, a state of an operator can be detected in response to a request from a construction manager or the like who manages the operator, the excavator 100, or the like.


An example of image data of a state of an operator stored in the storage device 47 as the state history information will be described below with reference to FIGS. 7A and 7B. FIG. 7A is a first diagram illustrating an example of image data to be output, and FIG. 7B is a second diagram illustrating an example of image data to be output. The image 71 illustrated in FIG. 7A is an example of an image that is output as a result of detecting a state in which the operator is in an inappropriate posture.


It is understood from the image 71 that the operator is gripping the lever. However, the operator operates the lever with their legs crossed, and this is found to be an inappropriate posture for operation.


The image 72 of FIG. 7B illustrates a state of an operator not engaged in work, that is, a state of an operator not gripping the lever and not wearing accessories (helmet). The image 72 is an example of an image that is output as a result of detecting a state of an operator being in an inappropriate posture while the excavator 100 is in an operable state (the engine 11 is ON and the gate lock valve 19 is in an open state).


It is understood from the image 72 that the operator is not wearing a helmet and crossing their legs, and is not operating the excavator 100. In particular, it is possible to confirm whether the state of the operator is appropriate according to the state of the excavator 100 (whether the excavator 100 is in an operable state or not, etc.) by combining the image with various detection information that is output from various sensors attached to the excavator 100.


In the present embodiment, displaying such an image on the management apparatus 200 as a detection result allows a manager to ascertain the situation of the operator at the time the state was detected.


Second Embodiment

Hereinafter, the second embodiment will be described with reference to the drawings. The second embodiment differs from the first embodiment in that a three-dimensional model of an operator who is working is created from a plurality of image data acquired by a plurality of imaging devices attached to pillars, and a state of the operator is detected based on the three-dimensional model. In the following description of the second embodiment, the differences from the first embodiment will be described.



FIG. 8 is a flowchart for explaining an operation of an excavator according to the second embodiment. The excavator 100 of the present embodiment acquires image data (moving image data) captured by the camera S6PL and the camera S6PR through the image data acquirer 301 of the controller 30 (step S801).


Subsequently, the controller 30 of the present embodiment creates a three-dimensional model of the operator from the plurality of image data acquired by the image data acquirer 301, and the state detector 302 detects a state of the operator based on the created three-dimensional model (step S802). The three-dimensional model of the operator is represented by a point group in a three-dimensional coordinate space.


Subsequently, the controller 30 causes the outputter 303 to store the state history information including the detection result and the image used for the state detection in the storage device 47 (step S803).
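The flow of steps S801 to S803 in FIG. 8 can be sketched as follows. This is an illustrative sketch only: the class structure, method names, and the placeholder model construction are assumptions for explanation, and the embodiment does not prescribe any particular implementation of the three-dimensional reconstruction or state classification.

```python
# Hypothetical sketch of the FIG. 8 flow: acquire images (S801),
# build a 3D model and detect the operator's state (S802),
# store state history information (S803).
from dataclasses import dataclass, field

@dataclass
class ControllerSketch:
    storage: list = field(default_factory=list)  # stands in for storage device 47

    def acquire_images(self, cameras):
        # S801: acquire image data from the cameras on the pillars
        return [camera() for camera in cameras]

    def detect_state(self, images):
        # S802: create a point-group model from multiple views and
        # classify the operator's state (reconstruction details omitted)
        model = {"points": [(0.0, 0.0, 0.0)]}  # placeholder point group
        state = "appropriate" if len(images) >= 2 else "unknown"
        return state, model

    def store_history(self, state, images):
        # S803: store the detection result together with the images used
        self.storage.append({"state": state, "images": images})

controller = ControllerSketch()
imgs = controller.acquire_images([lambda: "S6PL_frame", lambda: "S6PR_frame"])
state, _model = controller.detect_state(imgs)
controller.store_history(state, imgs)
print(state, len(controller.storage))  # appropriate 1
```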


As described above, in the present embodiment, since a three-dimensional model of the operator is generated from the plurality of image data acquired by the image data acquirer 301, it is not necessary to select, as a target for detection, image data suited to each of a plurality of states of the operator. In other words, since a three-dimensional model created from a plurality of image data obtained by imaging the operator from different directions is used to detect the state of the operator, the imaging direction of each camera does not affect the detection of the state.


In the present embodiment, as the number of cameras disposed on the pillars in the cab 10 increases, a three-dimensional model with higher accuracy can be created, and the accuracy in detection of a state of an operator can be increased.


Third Embodiment

The third embodiment will be described below. The third embodiment differs from the first and second embodiments in that a state determination function of the excavator 100 is provided in the management apparatus 200.



FIG. 9 is a diagram for explaining a system configuration of an assistance system for a construction machine according to the third embodiment.


In the excavator assistance system SYS of the present embodiment, the management apparatus 200 includes the image data acquirer 301, the state detector 302, and an outputter 303A.


The management apparatus 200 may be a general computer including an arithmetic processing apparatus and a storage apparatus, and the functions of the image data acquirer 301, the state detector 302, and the outputter 303A are realized by the arithmetic processing apparatus reading and executing a state detecting program stored in the storage apparatus of the management apparatus 200.


The excavator 100 of the present embodiment transmits image data captured by the camera S6PL and the camera S6PR to the management apparatus 200.


The management apparatus 200 acquires the plurality of image data transmitted from the excavator 100 through the image data acquirer 301, detects a state of the operator by using the plurality of image data through the state detector 302, and stores state history information including the image data and the detection result.


The outputter 303A causes a display apparatus included in the management apparatus 200, a display connected to the management apparatus 200, or the like to display the state history information.


Specifically, for example, when information designating a state that a manager wants to detect is input to the management apparatus 200 together with identification information of an operator and a machine number of the excavator 100, the outputter 303A reads the state history information corresponding to the identification information of the operator and the machine number of the excavator 100. The management apparatus 200 may then cause, among the detection results included in the read state history information, the designated detection result and the image data associated with that detection result to be displayed.


In the case where only identification information of an operator and a machine number of the excavator 100 are input, the outputter 303A may read out the state history information corresponding to the identification information of the operator and the machine number of the excavator 100, and cause a list of the detected states to be displayed. In response to selection of a certain state from the list of the detected states, the outputter 303A may cause the image data associated with the selected state to be displayed. In response to, for example, input of the identification information of an operator, the outputter 303A may refer to the state history information associated with the identification information and cause the number of times the inappropriate state is detected to be displayed.
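The lookup behavior of the outputter 303A described above can be sketched as a simple filter over state history records. This is an illustrative sketch only; the record fields, function name, and sample data are assumptions for explanation, not part of the embodiment's actual implementation.

```python
# Hypothetical sketch of the outputter 303A's lookup: filter state
# history by operator ID and machine number, and optionally by a
# designated state.
def query_history(history, operator_id, machine_no, state=None):
    records = [r for r in history
               if r["operator_id"] == operator_id
               and r["machine_no"] == machine_no]
    if state is not None:
        records = [r for r in records if r["state"] == state]
    return records

history = [
    {"operator_id": "OP1", "machine_no": "EX100",
     "state": "legs crossed", "image": "img_001"},
    {"operator_id": "OP1", "machine_no": "EX100",
     "state": "no helmet", "image": "img_002"},
]

# Designating a state returns only matching records with their images;
# omitting it returns the full list of detected states for the operator.
print([r["image"] for r in query_history(history, "OP1", "EX100", "no helmet")])
```

Counting the returned records for a given operator likewise yields the number of times an inappropriate state was detected, as described above.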


In the present embodiment, outputting a detection result regarding a state of an operator can cause a manager to ascertain a state of the operator.


In the present embodiment, the state of the operator is detected from the plurality of image data by the management apparatus 200, and it is thus possible to reduce the processing load of the controller 30 of the excavator 100.


In the present embodiment, the cab 10 as an operator's cab is the cab 10 of the excavator 100; however, the present disclosure is not limited to this example. The present embodiment can also be applied to a work machine other than the excavator 100. For example, such a work machine may include a gantry crane, a crawler crane, a traveling crane, an overhead crane, a jib crane, or the like.


A gantry crane includes a pair of legs disposed in a lateral direction and capable of traveling in a longitudinal direction, a girder bridged between the legs, a main trolley capable of traversing along the girder, an operator's cab for performing crane operation, and a passage portion connecting the main trolley and the operator's cab. The present embodiment may be applied to the operator's cab of such a gantry crane.


In a crawler crane, an upper slewing body is mounted, in a slewable manner, on a lower travel body having crawler belts, and a tower boom is attached to the upper slewing body so as to be able to be raised and lowered. The upper slewing body is provided with an operator's seat. A tower jib that does not expand or contract is pivotally supported on the upper end side of the tower boom so as to be raised and lowered with respect to the tower boom, and a tower strut, which is interposed between the tower jib and a pendant rope and serves as an auxiliary member when raising and lowering the tower jib, is pivotally supported on another portion of the upper end side of the tower boom. The present embodiment may be applied to the operator's seat of such a crawler crane.


A traveling crane includes, for example, a traveling portion that travels on a rail, a fixed portion provided on the traveling portion, a revolving portion provided on the fixed portion so as to be capable of revolving, an operator's cab provided on the revolving portion, and a jib provided on the revolving portion. The traveling crane also includes a hoisting tool for hoisting a load, a wire rope for winding up and down the hoisting tool, and a drum for winding up and feeding out the wire rope. The traveling crane can travel on the rail by driving the traveling portion in a state where a load is suspended by the hoisting tool, thereby conveying the load. The present embodiment may be applied to the operator's seat of such a traveling crane.


An overhead crane is installed in a building, and lifts an object to be lifted (hereinafter, simply referred to as an “object”) by a hoisting tool and moves the hoisting tool in a horizontal direction to thereby transport the object. The overhead crane includes a girder provided so as to bridge between a pair of rails provided in a building, and the girder can travel on the rails. A trolley capable of traversing along the extending direction of the girder is provided on the girder. A hoisting tool is attached to the trolley. In other words, the hoisting tool moves in the traveling direction when the girder travels, and the hoisting tool moves in the traversing direction when the trolley traverses. The object is conveyed by the movement of the hoisting tool in this manner. The girder is provided with an operator's cab for operating the overhead crane. The present embodiment may be applied to the operator's seat of such an overhead crane.


In a jib crane, a jib is attached, so as to be freely raised and lowered around a raising and lowering center shaft, to an end of a revolving body on which a back stay is erected; the jib is raised and lowered by winding up and feeding out a raising and lowering rope, and a suspended load is lifted and lowered from the tip of the jib by winding up and feeding out a hoisting rope. The present embodiment may be applied to the operator's seat of such a jib crane.


Although the preferred embodiments of the present disclosure have been described in detail, the present disclosure is not limited to the above-described embodiments, and various modifications and substitutions can be made to the above-described embodiments without departing from the scope of the present disclosure.

Claims
  • 1. A construction machine, comprising: a cab including an operator's seat; and a space recognition device disposed on a pillar in the cab.
  • 2. The construction machine according to claim 1, further comprising: a lower travel body; and an upper slewing body mounted on the lower travel body in a slewable manner, wherein the cab is mounted on the upper slewing body.
  • 3. The construction machine according to claim 1, wherein the space recognition device is disposed between a height of a seat surface of an operator's seat provided in the cab and a height of an upper end portion of a headrest of the operator's seat.
  • 4. The construction machine according to claim 1, wherein the space recognition device is disposed at a position where the space recognition device does not interfere with a front window provided in the cab at a time of opening and closing the front window.
  • 5. The construction machine according to claim 3, wherein the space recognition device is movable.
  • 6. The construction machine according to claim 4, wherein a plurality of space recognition devices is arranged, each of the space recognition devices being the space recognition device, the pillar includes an attachment member for attaching the plurality of space recognition devices, and the plurality of space recognition devices each include a fixing member that is attachable to and detachable from the attachment member, the fixing member fixing a corresponding space recognition device of the plurality of space recognition devices to the pillar.
  • 7. The construction machine according to claim 6, wherein the fixing member is a magnet.
  • 8. The construction machine according to claim 6, further comprising: an image data acquirer configured to acquire a plurality of image data acquired by each of the plurality of space recognition devices; and a state detector configured to detect a state of an operator seated on the operator's seat provided in the cab by using the plurality of image data.
  • 9. The construction machine according to claim 8, wherein the state detector is configured to select image data in accordance with the state of the operator to be detected from the plurality of image data, and detect the state of the operator based on the selected image data.
  • 10. The construction machine according to claim 8, further comprising: an outputter configured to cause a display device to display image data at a time of detecting the state of the operator by the state detector.
  • 11. An assistance system for a construction machine comprising a construction machine and a management apparatus, wherein the construction machine includes a lower travel body; an upper slewing body mounted on the lower travel body in a slewable manner; a cab mounted on the upper slewing body; a space recognition device disposed on a pillar in the cab; and an outputter configured to output a plurality of image data captured by the space recognition device to the management apparatus, and the management apparatus includes a storage section configured to store the plurality of image data.
Priority Claims (1)
Number Date Country Kind
2022-084645 May 2022 JP national
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation application of International Application No. PCT/JP2023/017688 filed on May 11, 2023, and designated the U.S., which is based upon and claims priority to Japanese Patent Application No. 2022-084645 filed on May 24, 2022, the entire contents of which are incorporated herein by reference.

Continuations (1)
Number Date Country
Parent PCT/JP2023/017688 May 2023 WO
Child 18956407 US