The present disclosure relates to a system and a method for a work machine.
Conventionally, a technique for displaying an image of a work machine captured by a camera on a display is known. For example, in US Patent Publication No. 2014/0240506, an on-vehicle camera mounted on the work machine captures an image of the work machine and its surroundings in the front, rear, left, or right direction, and the image is displayed on the display. US Patent Publication No. 2014/0240506 further provides a site camera that automatically moves to follow the work machine as the work machine moves. The site camera is disposed away from the work machine to capture a wider field of view of the work site.
In the above-mentioned technique, the image captured by the on-vehicle camera is displayed on the display as it is. In that case, depending on the topography, such as terrain with large undulations, it may be difficult to accurately recognize the positional relationship between the work machine and the topography from the image displayed on the display.
In the above-mentioned technique, a wider field of view of the work site can be captured using the site camera. However, in that case, the site camera must be controlled with high accuracy, which complicates the system or increases its cost.
An object of the present disclosure is to provide a system and a method capable of easily and accurately recognizing a positional relationship between a work machine and an object around the work machine.
A system according to a first aspect is a system including a work machine, a light detection and ranging (LiDAR) device, a processor, and a display. The work machine includes a work implement. The LiDAR is attached to the work machine and includes a laser and a photodetector. The LiDAR measures a distance to at least a part of the work implement and a distance to an object around the work machine. The processor acquires position data from the distances measured by the LiDAR. The position data indicates a position of at least the part of the work implement and a position of the object around the work machine. The processor generates an image indicative of the position of at least the part of the work implement and the position of the object around the work machine based on the position data. The display displays the image in response to a signal from the processor.
A method according to a second aspect is a method executed by a processor in order to display a topography around a work machine including a work implement and a position of the work implement on a display. The method includes the following processes. A first process is to measure a distance to at least a part of the work implement and a distance to an object around the work machine by a LiDAR. A second process is to acquire position data from the distances measured by the LiDAR. The position data indicates a position of at least the part of the work implement and a position of the object around the work machine. A third process is to generate an image indicative of the position of at least the part of the work implement and the position of the object around the work machine based on the position data. A fourth process is to display the image on the display.
A system according to a third aspect is a system including a processor and a display. The processor acquires a distance to at least a part of a work implement and a distance to an object around a work machine measured by a LiDAR. The processor acquires position data from the distances measured by the LiDAR. The position data indicates a position of at least the part of the work implement and a position of the object around the work machine. The processor generates an image indicative of the position of at least the part of the work implement and the position of the object around the work machine based on the position data. The display displays the image in response to a signal from the processor.
In the present disclosure, the position data is acquired from the distances measured by the LiDAR. Then, the image is generated based on the position data and displayed on the display. The image indicates the position of at least the part of the work implement and the position of the object around the work machine. Therefore, the positional relationship between the work machine and the object around the work machine can be easily and accurately recognized.
A system for a work machine according to an embodiment will now be described with reference to the drawings.
The vehicle body 2 includes an engine compartment 11. An operating cabin 12 is disposed at the rear of the engine compartment 11. A ripper device 5 is attached to a rear part of the vehicle body 2. The travel device 4 is a device for causing the work machine 1 to travel. The travel device 4 includes a pair of crawler belts 13 disposed on the left and right sides of the vehicle body 2. The work machine 1 travels due to the crawler belts 13 being driven.
The work implement 3 is disposed in front of the vehicle body 2. The work implement 3 is used for work such as digging, earthmoving, or ground leveling. The work implement 3 includes a blade 14, a lift cylinder 15, a tilt cylinder 16, and an arm 17. The blade 14 is supported on the vehicle body 2 via the arm 17. The blade 14 is configured to move in the up-down direction. The lift cylinder 15 and the tilt cylinder 16 are driven by hydraulic fluid discharged from a hydraulic pump 22 described later and change the posture of the blade 14.
The power transmission device 23 transmits the driving force of the engine 21 to the travel device 4. The power transmission device 23 may be a hydrostatic transmission (HST), for example. Alternatively, the power transmission device 23 may be, for example, a torque converter or a transmission having a plurality of transmission gears.
The control valve 24 is a proportional control valve and is controlled according to an input command signal. The control valve 24 is disposed between the hydraulic pump 22 and hydraulic actuators such as the lift cylinder 15 and the tilt cylinder 16. The control valve 24 controls the flow rate of the hydraulic fluid supplied from the hydraulic pump 22 to the lift cylinder 15 and the tilt cylinder 16. The control valve 24 may be a pressure proportional control valve. Alternatively, the control valve 24 may be an electromagnetic proportional control valve.
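For illustration only, the proportional relationship between a command signal and the flow rate through the control valve 24 may be sketched as follows. This is a minimal sketch; the deadband, current range, and maximum flow values are assumptions chosen for illustration and are not specifications of the embodiment.

```python
# Sketch: flow rate through the control valve 24 as a proportional
# function of the command current above a deadband. All numeric values
# are illustrative assumptions, not part of the embodiment.
def valve_flow_lpm(command_ma: float,
                   deadband_ma: float = 200.0,
                   max_ma: float = 600.0,
                   max_flow_lpm: float = 120.0) -> float:
    """Flow rate (L/min) proportional to the command current above a deadband."""
    span = max_ma - deadband_ma
    fraction = (command_ma - deadband_ma) / span
    return max_flow_lpm * min(max(fraction, 0.0), 1.0)

print(valve_flow_lpm(400.0))  # half of the usable span -> 60.0 L/min
```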
The system 100 includes a first controller 31, a second controller 32, an input device 33, communication devices 34 and 35, and a display 36. The first controller 31 and the communication device 34 are mounted on the work machine 1. The second controller 32, the input device 33, the communication device 35, and the display 36 are disposed outside of the work machine 1. For example, the second controller 32, the input device 33, the communication device 35, and the display 36 are disposed in a control center away from the work site. The work machine 1 can be operated remotely using the input device 33 from outside of the work machine 1.
The first controller 31 and the second controller 32 are programmed to control the work machine 1. The first controller 31 includes a memory 311 and a processor 312. The memory 311 includes, for example, a volatile memory such as a RAM and a non-volatile memory such as a ROM. The memory 311 stores programs and data for controlling the work machine 1. The processor 312 is, for example, a central processing unit (CPU) and executes processes for controlling the work machine 1 according to a program. The first controller 31 controls the travel device 4 or the power transmission device 23, thereby causing the work machine 1 to travel. The first controller 31 controls the control valve 24, thereby causing the work implement 3 to operate.
The second controller 32 includes a memory 321 and a processor 322. The memory 321 includes, for example, a volatile memory such as a RAM and a non-volatile memory such as a ROM. The memory 321 stores programs and data for controlling the work machine 1. The processor 322 is, for example, a central processing unit (CPU) and executes processes for controlling the work machine 1 according to a program. The second controller 32 receives an operation signal from the input device 33. Further, the second controller 32 outputs a signal to the display 36, thereby causing the display 36 to display an image as described later.
The input device 33 receives an operation by an operator and outputs an operation signal according to the operation. The input device 33 outputs an operation signal to the second controller 32. The input device 33 includes an operating element such as an operating lever, a pedal, a switch, or the like for operating the travel device 4 and/or the work implement 3. The input device 33 may include a touch screen. The travel of the work machine 1 such as forward or reverse is controlled according to the operation of the input device 33. Also, the movement of the work implement 3 such as raising or lowering is controlled according to the operation of the input device 33.
The display 36 is, for example, a CRT, an LCD or an OELD. However, the display 36 is not limited to the aforementioned displays and may be another type of display. The display 36 displays an image based on a signal from the second controller 32.
The second controller 32 is configured to wirelessly communicate with the first controller 31 via the communication devices 34 and 35. The second controller 32 transmits the operation signal from the input device 33 to the first controller 31. The first controller 31 controls the travel device 4 and/or the work implement 3 according to the operation signal.
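For illustration only, the relay of an operation signal from the second controller 32 to the first controller 31 may be sketched as follows. The message format and all names (`OperationSignal`, `encode`, `decode`) are assumptions for illustration; the embodiment does not specify a serialization scheme.

```python
# Minimal sketch of the operation-signal relay, assuming a simple
# serializable message; all names here are illustrative, not part
# of the embodiment.
import json
from dataclasses import dataclass, asdict

@dataclass
class OperationSignal:
    travel_forward: float   # -1.0 (reverse) .. 1.0 (forward)
    blade_lift: float       # -1.0 (lower) .. 1.0 (raise)

def encode(signal: OperationSignal) -> bytes:
    """Second controller 32: serialize the signal for the communication devices."""
    return json.dumps(asdict(signal)).encode()

def decode(payload: bytes) -> OperationSignal:
    """First controller 31: recover the signal before dispatching it."""
    return OperationSignal(**json.loads(payload))

# Round trip: operator input -> control center -> work machine.
payload = encode(OperationSignal(travel_forward=0.5, blade_lift=-0.2))
print(decode(payload))
```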
The system 100 includes a position sensor 36 and a light detection and ranging (LiDAR) 37. The position sensor 36 and the LiDAR 37 are mounted on the work machine 1. The position sensor 36 includes a global navigation satellite system (GNSS) receiver 38 and an IMU 39. The GNSS receiver 38 is, for example, a receiver for global positioning system (GPS). The GNSS receiver 38 receives a positioning signal from a satellite and acquires vehicle body position data indicative of position coordinates of the work machine 1 from the positioning signal. The first controller 31 acquires the vehicle body position data from the GNSS receiver 38.
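For illustration only, position coordinates obtained from the GNSS receiver 38 may be mapped to local coordinates on the work site as sketched below, using a flat-earth approximation that is adequate over the extent of a work site. The approximation and the function name are assumptions; the embodiment does not specify how the position coordinates are expressed.

```python
# Sketch: convert GNSS latitude/longitude to local east/north meters
# around a reference point. The flat-earth approximation and all names
# are illustrative assumptions, not part of the embodiment.
import math

EARTH_RADIUS_M = 6_378_137.0  # WGS-84 equatorial radius

def to_local_en(lat_deg: float, lon_deg: float,
                ref_lat_deg: float, ref_lon_deg: float) -> tuple[float, float]:
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    ref_lat, ref_lon = math.radians(ref_lat_deg), math.radians(ref_lon_deg)
    east = (lon - ref_lon) * math.cos(ref_lat) * EARTH_RADIUS_M
    north = (lat - ref_lat) * EARTH_RADIUS_M
    return east, north

print(to_local_en(35.0001, 139.0001, 35.0, 139.0))  # ~ (9.1 m, 11.1 m)
```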
The IMU 39 is an inertial measurement unit. The IMU 39 acquires inclination angle data. The inclination angle data includes an angle with respect to the horizontal in the vehicle front-rear direction (pitch angle) and an angle with respect to the horizontal in the vehicle lateral direction (roll angle). The first controller 31 acquires the inclination angle data from the IMU 39.
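For illustration only, the pitch angle and the roll angle may be estimated from the gravity vector measured by a three-axis accelerometer, as sketched below. The embodiment does not specify the internal computation of the IMU 39; this is one common method, with axis conventions assumed for illustration.

```python
# Sketch: pitch and roll estimated from the gravity vector measured by a
# 3-axis accelerometer (x: forward, y: left, z: up). One common method;
# the IMU 39's internal computation is not specified by the embodiment.
import math

def pitch_roll(ax: float, ay: float, az: float) -> tuple[float, float]:
    pitch = math.atan2(-ax, math.hypot(ay, az))  # angle about the lateral axis
    roll = math.atan2(ay, az)                    # angle about the longitudinal axis
    return math.degrees(pitch), math.degrees(roll)

# Level machine: gravity acts only on the z axis -> pitch = roll = 0.
print(pitch_roll(0.0, 0.0, 9.81))
```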
The LiDAR 37 measures three-dimensional shapes of at least a part of the work implement 3 and an object around the work machine 1.
The LiDAR 37 includes a motor 43, a laser 44, and a photodetector 45. The motor 43 rotates the rotating head 42 around the rotation axis Ax1. The laser 44 is provided on the rotating head 42. The laser 44 includes a plurality of light emitting elements 441, such as laser diodes. The plurality of light emitting elements 441 are aligned in the direction of the rotation axis Ax1.
The photodetector 45 includes a plurality of light receiving elements 451, such as photodiodes. The LiDAR 37 emits laser light from the laser 44 and detects the reflected light with the photodetector 45. As a result, the LiDAR 37 measures the distance from the LiDAR 37 to a measurement point on an object to be measured.
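For illustration only, the distance measurement may be understood as a time-of-flight calculation: the distance is half the round-trip time of the laser light multiplied by the speed of light. The sketch below assumes this common ranging principle; the embodiment does not detail the LiDAR's internal signal processing.

```python
# Sketch: time-of-flight ranging. The distance is half the round-trip
# time multiplied by the speed of light; the function name and interface
# are illustrative assumptions.
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_distance_m(round_trip_time_s: float) -> float:
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

# A reflection detected 200 ns after emission is about 30 m away.
print(tof_distance_m(200e-9))  # 29.979...
```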
The LiDAR 37 measures the positions of a plurality of measurement points at a predetermined cycle while rotating the laser 44 around the rotation axis Ax1. That is, the LiDAR 37 measures the distances to a plurality of measurement points at each rotation angle. The LiDAR 37 outputs measurement point data. The measurement point data includes information on which light emitting element has been used for measuring each measurement point, at which rotation angle each measurement point has been measured, and the positional relationship between each measurement point and each element.
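For illustration only, one measurement (light emitting element, rotation angle, distance) may be converted into a three-dimensional point in the LiDAR frame as sketched below. It is assumed, for illustration, that each light emitting element fires at a fixed elevation angle and that the rotation angle sweeps the azimuth; the exact geometry is given by the positional relationships included in the measurement point data.

```python
# Sketch: converting one measurement (emitting element, rotation angle,
# distance) into a 3-D point in the LiDAR frame. The elevation angles
# below are illustrative assumptions, not part of the embodiment.
import math

# Illustrative fixed elevation angle (deg) per light emitting element 441.
ELEVATION_DEG = [-15.0, -5.0, 5.0, 15.0]

def to_point(element: int, rotation_deg: float, distance_m: float):
    elev = math.radians(ELEVATION_DEG[element])
    azim = math.radians(rotation_deg)
    horizontal = distance_m * math.cos(elev)
    return (horizontal * math.cos(azim),   # x: forward
            horizontal * math.sin(azim),   # y: lateral
            distance_m * math.sin(elev))   # z: up

print(to_point(element=2, rotation_deg=10.0, distance_m=12.0))
```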
In the present embodiment, based on the positions of the measurement points measured by the LiDAR 37, images indicative of the blade 14 and the object in front of the blade 14 are generated and displayed on the display 36. Hereinafter, processes executed by the first controller 31 and the second controller 32 in order to generate an image will be described.
In step S101, the first controller 31 acquires the measurement point data from the LiDAR 37 and transmits the measurement point data to the second controller 32.
In step S102, the second controller 32 acquires position data. Here, the second controller 32 receives the measurement point data from the first controller 31. The second controller 32 stores information indicative of a positional relationship between the LiDAR 37 and the work machine 1. The second controller 32 calculates and acquires, from the measurement point data, the position data indicative of the positions of the blade 14 and the topography in front of the blade 14. Instead of the second controller 32, the first controller 31 may calculate and acquire the position data from the measurement point data. In that case, the second controller 32 may receive the position data from the first controller 31.
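For illustration only, the calculation of the position data from the measurement point data may be sketched as a rigid transformation from the LiDAR frame into the work machine frame, using the stored positional relationship between the LiDAR 37 and the work machine 1. The mounting values and function names below are assumptions for illustration.

```python
# Sketch: mapping measurement points from the LiDAR frame into position
# data in the work machine frame via a fixed mounting pose. The mounting
# values are illustrative assumptions, not part of the embodiment.
import numpy as np

# Illustrative mounting of the LiDAR 37 on the work machine 1:
# translation (m) and roll/pitch/yaw (rad) of the LiDAR in the machine frame.
MOUNT_T = np.array([1.5, 0.0, 2.0])
MOUNT_RPY = (0.0, np.radians(10.0), 0.0)  # illustrative mounting pitch

def rotation_matrix(roll: float, pitch: float, yaw: float) -> np.ndarray:
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return rz @ ry @ rx

def lidar_to_machine(points_lidar: np.ndarray) -> np.ndarray:
    """Map an (N, 3) array of LiDAR-frame points into the machine frame."""
    r = rotation_matrix(*MOUNT_RPY)
    return points_lidar @ r.T + MOUNT_T

print(lidar_to_machine(np.array([[10.0, 0.0, 0.0]])))
```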
In step S103, the second controller 32 generates an image 50 indicative of the blade 14 and the object in front of the blade 14 based on the position data.
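For illustration only, one possible way to generate an image such as the image 50 from the position data is to rasterize the measurement points into a top-down height map, as sketched below. The grid size, resolution, and rendering style are assumptions; the embodiment does not limit the image to this form.

```python
# Sketch: rasterizing machine-frame points into a top-down height map as
# one possible image. Grid size and resolution are illustrative assumptions.
import numpy as np

def height_map(points: np.ndarray, size: int = 200, res: float = 0.1) -> np.ndarray:
    """Project (N, 3) machine-frame points onto a size x size grid of
    res-meter cells centered on the machine; each cell keeps the highest z."""
    img = np.full((size, size), np.nan)
    rows = (points[:, 0] / res + size / 2).astype(int)   # x: forward
    cols = (points[:, 1] / res + size / 2).astype(int)   # y: lateral
    ok = (rows >= 0) & (rows < size) & (cols >= 0) & (cols < size)
    for r, c, z in zip(rows[ok], cols[ok], points[ok, 2]):
        if np.isnan(img[r, c]) or z > img[r, c]:
            img[r, c] = z
    return img

img = height_map(np.array([[5.0, 0.0, 0.3], [5.0, 0.1, 0.5], [-2.0, 1.0, 0.0]]))
print(np.argwhere(np.isfinite(img)))  # grid cells that received points
```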
In step S104, the second controller 32 outputs a signal indicative of the image 50 to the display 36. As a result, the display 36 displays the image 50. The image 50 is updated in real time and displayed as a moving image. Therefore, when the work machine 1 is traveling or operating, the image 50 is changed and displayed according to a change in the surroundings of the work machine 1.
In the system 100 according to the present embodiment described above, the position data is acquired from the distances to the plurality of measurement points measured by the LiDAR 37. Then, based on the position data, the image 50 is generated and displayed on the display 36. The image 50 indicates the positions of at least the part of the work implement 3 and the object around the work machine 1. Therefore, a user is able to easily and accurately recognize the positional relationship between the work machine 1 and the object around the work machine 1 owing to the image 50.
Although an embodiment of the present disclosure has been described above, the present invention is not limited to the above embodiment and various modifications may be made within the scope of the invention.
The work machine 1 is not limited to a bulldozer and may be another vehicle such as a wheel loader, a motor grader, a hydraulic excavator, or the like. The work machine 1 may be a vehicle driven by an electric motor. The operating cabin 12 may be omitted from the work machine 1.
The work machine 1 may be operated from within the operating cabin 12 instead of being remotely operated.
The first controller 31 is not limited to one unit and may be divided into a plurality of controllers. The second controller 32 is not limited to one unit and may be divided into a plurality of controllers.
The configuration and/or disposition of the LiDAR 37 is not limited to that of the above embodiment and may be changed. For example, the rotation axis Ax1 of the LiDAR 37 may be disposed along the vertical direction. Alternatively, the LiDAR 37 may be non-rotatable. The direction measured by the LiDAR 37 is not limited to the front direction of the work machine 1 and may be a rear direction, a lateral direction, or another direction of the work machine 1. The object around the work machine 1 measured by the LiDAR 37 is not limited to the topography 200 and may include another work machine, a building, a person, or the like.
In the present disclosure, the positional relationship between the work machine and the object around the work machine can be easily and accurately recognized owing to the image.
This application is a U.S. National stage application of International Application No. PCT/JP2020/001774, filed on Jan. 20, 2020. This U.S. National stage application claims priority under 35 U.S.C. § 119(a) to Japanese Patent Application No. 2019-008902, filed in Japan on Jan. 23, 2019, the entire contents of which are hereby incorporated herein by reference.