The present invention relates to a projection display device, a method for controlling the projection display device, and a program for controlling the projection display device.
JP2014-129676A and JP2010-018141A disclose techniques for improving operation efficiency during construction work in various operation machines, such as a hydraulic shovel, a wheel loader, a bulldozer, and a motor grader, in which the operator can observe a working machine from a driver's cabin by using a head-up display (HUD).
JP2014-129676A discloses a hydraulic shovel that displays information of an ascending-and-descending direction and an ascent-and-descent amount of a bucket ahead of a driver's seat.
JP2010-018141A discloses a hydraulic shovel that displays an execution scheme drawing ahead of the driver's seat by using the HUD.
JP2012-255286A discloses a hydraulic shovel that displays an execution scheme drawing, information indicating the current execution situation, and a bucket image indicating the position of a bucket in the current situation on a display apparatus within the driver's cabin.
According to the techniques in JP2010-018141A and JP2012-255286A, the operator can operate the operation machine while checking the execution scheme drawing, and thus, the operation efficiency can be improved. However, only with the display of the execution scheme drawing and the current position of the bucket, it is difficult for an inexperienced operator to perform execution as intended.
According to the technique in JP2014-129676A, since the ascent-and-descent amount of the bucket is displayed together with the execution scheme drawing, it is effective as operational support for an inexperienced operator. However, the range in which execution is to be performed may be wide, and in such cases the operator needs to determine the position within that range at which the bucket is to ascend and descend. JP2014-129676A does not consider the necessity of supporting such determination.
In addition, with the hydraulic shovel in JP2014-129676A, the operator is expected to rotate or move the vehicle body forward to move the bucket to an appropriate position, stop it there, and repeat a process of operating the bucket at this position in accordance with the up-and-down amount displayed by the HUD. With such a process, depending on the experience level of the operator, the vehicle body may be moved more than necessary, and the operation efficiency may decrease.
Although a construction machine has been described above as an example, a similar need for improving operation efficiency arises for an HUD mounted in a vehicle (e.g., a forklift) equipped with a working machine that can be operated by the operator ahead of the driver's seat.
The present invention has been made in view of the above circumstances, and an object is to provide a projection display device that can improve the operation efficiency of a vehicle having a working machine, a method for controlling the projection display device, and a program for controlling the projection display device.
A projection display device according to the present invention is a projection display device to be mounted in a vehicle having a movable working machine and a main body part, the working machine being attached to the main body part, the main body part having a driver's cabin. The projection display device includes: a detection unit, a projection display unit, and a display control unit. The detection unit detects a position of the vehicle and a direction of the driver's cabin. The projection display unit includes a light modulation unit that, on the basis of image information to be input, spatially modulates light emitted from a light source, and projects image light, obtained through spatial modulation by the light modulation unit, onto a projection surface mounted in the driver's cabin to display a virtual image based on the image light. The display control unit controls the image information to be input to the light modulation unit and controls the virtual image that is to be displayed by the projection display unit. The display control unit causes the projection display unit to display a working-machine virtual image that represents the working machine on the basis of operation plan information and the position and the direction detected by the detection unit, the operation plan information specifying the position of the vehicle, the direction of the driver's cabin, and a posture of the working machine that are stored in a storage unit.
A method for controlling a projection display device according to the present invention is a method for controlling a projection display device, the projection display device being to be mounted in a vehicle having a movable working machine and a main body part, the working machine being attached to the main body part, the main body part having a driver's cabin. The projection display device has a light modulation unit and a projection display unit. On the basis of image information to be input, the light modulation unit spatially modulates light emitted from a light source. The projection display unit projects image light, obtained through spatial modulation by the light modulation unit, onto a projection surface mounted in the driver's cabin to display a virtual image based on the image light. The method includes: a detection step and a display control step. The detection step detects a position of the vehicle and a direction of the driver's cabin. The display control step causes the projection display unit to display a working-machine virtual image that represents the working machine on the basis of operation plan information and the position and the direction detected in the detection step, the operation plan information specifying the position of the vehicle, the direction of the driver's cabin, and a posture of the working machine that are stored in a storage unit.
A program for controlling a projection display device according to the present invention is a program for controlling a projection display device, the projection display device being to be mounted in a vehicle having a movable working machine and a main body part, the working machine being attached to the main body part, the main body part having a driver's cabin. The projection display device has a light modulation unit and a projection display unit. On the basis of image information to be input, the light modulation unit spatially modulates light emitted from a light source. The projection display unit projects image light, obtained through spatial modulation by the light modulation unit, onto a projection surface mounted in the driver's cabin to display a virtual image based on the image light. The program is for causing a computer to execute: a detection step and a display control step. The detection step detects a position of the vehicle and a direction of the driver's cabin. The display control step causes the projection display unit to display a working-machine virtual image that represents the working machine on the basis of operation plan information and the position and the direction detected in the detection step, the operation plan information specifying the position of the vehicle, the direction of the driver's cabin, and a posture of the working machine that are stored in a storage unit.
According to the present invention, it is possible to provide a projection display device that can improve the operation efficiency of a vehicle having a working machine, a method for controlling the projection display device, and a program for controlling the projection display device.
Now, an embodiment of the present invention will be described with reference to the drawings.
The construction machine 1 is a hydraulic shovel and is composed of units such as an undercarriage 2, an upper rotatable body 3 that is supported by the undercarriage 2 in a rotatable manner, and a front operation unit 4 that is supported by the upper rotatable body 3. The undercarriage 2 and the upper rotatable body 3 constitute a main body part of the construction machine 1.
The undercarriage 2 includes a metal or rubber crawler for traveling on a public road or in a construction site.
The upper rotatable body 3 includes a driver's cabin 5, a direction sensor 14 that detects the direction of the driver's cabin 5, and a global positioning system (GPS) receiver 15 that detects the position (latitude and longitude) of the construction machine 1. In the driver's cabin 5, a control device for controlling the front operation unit 4 and a driver's seat 6 on which an operator is seated are provided.
The front operation unit 4 includes an arm 4C, a boom 4B, and a bucket 4A. The arm 4C is supported by the upper rotatable body 3 such that the arm 4C is movable in the gravity direction and a direction perpendicular to the gravity direction (vertical direction in the drawing and direction perpendicular to the drawing). The boom 4B is supported by the arm 4C such that the boom 4B is rotatable relative to the arm 4C. The bucket 4A is supported by the boom 4B such that the bucket 4A is rotatable relative to the boom 4B. The bucket 4A is a part that can directly contact a target such as the earth or an object to be carried and constitutes a working machine.
Note that instead of the bucket 4A, another working machine, such as a steel frame cutting machine, a concrete crushing machine, a grabbing machine, or a hitting breaker, may be attached to the boom 4B.
The bucket 4A is movable in the vertical direction of the drawing relative to the driver's cabin 5 via the arm 4C and the boom 4B. In addition, the bucket 4A is rotatable around axes that are the line-of-sight direction of the operator who is seated on the driver's seat 6 and a direction perpendicular to the gravity direction. In addition, the boom 4B is rotatable around an axis that is perpendicular to the drawing.
Although omitted from the illustration, a group of sensors such as an angular rate sensor and a three-axis acceleration sensor for detecting the posture of the front operation unit 4 is provided in the front operation unit 4.
The driver's cabin 5 is provided with a front windshield 11 ahead of the driver's seat 6, and a part of the front windshield 11 is a region processed to reflect image light, which will be described later. Furthermore, this region constitutes a projection area 11A onto which image light emitted from the HUD 10 is projected. The direction sensor 14 is provided for detecting the direction of a front surface of the front windshield 11.
The HUD 10 is set within the driver's cabin 5 and displays a virtual image with image light projected onto the projection area 11A, which is a part of a region of the front windshield 11, so that the operator who is seated on the driver's seat 6 can visually recognize the virtual image ahead of the front windshield 11.
By seeing image light that has been projected onto and reflected on the projection area 11A of the front windshield 11, the operator of the construction machine 1 can visually recognize, as a virtual image, information such as an image or characters for supporting the operation by using the construction machine 1. The projection area 11A has a function of reflecting the image light projected from the HUD 10 and transmitting light from the outdoor space (the outside) at the same time. Thus, the operator can visually recognize the virtual image based on the image light projected from the HUD 10, the virtual image overlapping with the outside scene.
Although the HUD 10 is mounted in the hydraulic shovel in this example, the HUD 10 may also be mounted in another vehicle having a working machine that can be operated by the operator, such as another construction machine, an agricultural machine, or a forklift.
The driver's cabin 5 is surrounded by the front windshield 11, a right-side windshield 21, and a left-side windshield 22. The driver's cabin 5 includes a left control lever 23, a right control lever 24, and the like around the driver's seat 6. The left control lever 23 is for controlling folding and stretching of the front operation unit 4 and rotation of the upper rotatable body 3. The right control lever 24 is for controlling digging and releasing of the bucket 4A in the front operation unit 4. Note that the operation functions assigned to the left control lever 23 and the right control lever 24 are examples and are not limited to the above examples.
The front windshield 11 has the projection area 11A onto which the image light emitted from the HUD 10 is projected, and the projection area 11A reflects the image light and transmits light from the outdoor space (the outside) at the same time.
Note that the construction machine 1 is equipped with, although omitted from the illustration, a steering device, an accelerator, a brake, and the like that are operated when the construction machine 1 travels by using the undercarriage 2.
The HUD 10 includes a light source unit 40, a light modulation element 44, a driving unit 45 that drives the light modulation element 44, a projection optical system 46, a diffusion plate 47, a reflective mirror 48, a magnifying glass 49, a system control unit 60 that controls the light source unit 40 and the driving unit 45, and a storage unit 70 that may be composed of a storage medium such as a flash memory.
The light source unit 40 includes a light source control unit 40A, an R light source 41r, a G light source 41g, a B light source 41b, a dichroic prism 43, a collimator lens 42r, a collimator lens 42g, and a collimator lens 42b. The R light source 41r is a red light source that emits red light, the G light source 41g is a green light source that emits green light, and the B light source 41b is a blue light source that emits blue light. The collimator lens 42r is provided between the R light source 41r and the dichroic prism 43, the collimator lens 42g is provided between the G light source 41g and the dichroic prism 43, and the collimator lens 42b is provided between the B light source 41b and the dichroic prism 43.
The dichroic prism 43 is an optical member for guiding light emitted from each of the R light source 41r, the G light source 41g, and the B light source 41b to the same optical path. That is, the dichroic prism 43 transmits red light collimated by the collimator lens 42r and emits the red light to the light modulation element 44. In addition, the dichroic prism 43 reflects green light collimated by the collimator lens 42g and emits the green light to the light modulation element 44. Furthermore, the dichroic prism 43 reflects blue light collimated by the collimator lens 42b and emits the blue light to the light modulation element 44. The optical member having such a function is not limited to the dichroic prism. For example, a cross dichroic mirror may also be used.
For each of the R light source 41r, the G light source 41g, and the B light source 41b, a light emitting element such as a laser or a light emitting diode (LED) is used. The R light source 41r, the G light source 41g, and the B light source 41b constitute a light source of the HUD 10. Although the light source of the HUD 10 includes three light sources, which are the R light source 41r, the G light source 41g, and the B light source 41b, in this embodiment, the number of light sources may be one, two, or four or more.
The light source control unit 40A sets the light emission amount of each of the R light source 41r, the G light source 41g, and the B light source 41b to a predetermined light emission amount pattern, and performs control so as to cause the R light source 41r, the G light source 41g, and the B light source 41b to sequentially emit light in accordance with the light emission amount pattern.
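As a rough, non-limiting illustration of this sequential driving, the Python sketch below generates drive commands frame by frame; the `LightSourceDrive` representation, the normalized emission values, and the per-frame structure are assumptions introduced for the sketch, not details of the light source control unit 40A.

```python
from dataclasses import dataclass
from typing import Iterable, Iterator, Tuple

@dataclass
class LightSourceDrive:
    """Hypothetical drive command for one light source during one sub-frame."""
    name: str          # "R", "G", or "B"
    emission: float    # light emission amount, normalized to 0.0-1.0

def sequential_emission(pattern: Iterable[Tuple[str, float]],
                        frames: int) -> Iterator[LightSourceDrive]:
    """Yield drive commands so the sources emit one after another, frame by frame.

    `pattern` is the predetermined light emission amount pattern, e.g.
    [("R", 0.8), ("G", 1.0), ("B", 0.6)]; a light source controller would
    convert each command into a drive current for the corresponding source.
    """
    for _ in range(frames):
        for name, emission in pattern:
            yield LightSourceDrive(name, emission)

# Example: two display frames of the R -> G -> B sequence.
for cmd in sequential_emission([("R", 0.8), ("G", 1.0), ("B", 0.6)], frames=2):
    print(cmd)
```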
The light modulation element 44 spatially modulates the light emitted from the dichroic prism 43 on the basis of image information and emits the spatially modulated light (red image light, blue image light, and green image light) to the projection optical system 46.
As the light modulation element 44, for example, a liquid crystal on silicon (LCOS), a digital micromirror device (DMD), a micro electro mechanical systems (MEMS) element, a liquid crystal display element, or the like can be used.
On the basis of image information that is input from the system control unit 60, the driving unit 45 drives the light modulation element 44 to cause light (red image light, blue image light, and green image light) in accordance with image information to be emitted from the light modulation element 44 to the projection optical system 46.
The light modulation element 44 and the driving unit 45 constitute a light modulation unit of the HUD 10.
The projection optical system 46 is an optical system for projecting the light emitted from the light modulation element 44 onto the diffusion plate 47. This optical system is not limited to a lens, and a scanner can also be used. For example, light emitted from a scanner may be diffused by the diffusion plate 47 to form a plane light source.
The reflective mirror 48 reflects the light diffused by the diffusion plate 47 toward the magnifying glass 49.
The magnifying glass 49 enlarges and projects an image based on the light reflected on the reflective mirror 48 onto the projection area 11A.
The light source unit 40, the light modulation element 44, the driving unit 45, the projection optical system 46, the diffusion plate 47, the reflective mirror 48, and the magnifying glass 49 constitute a projection display unit 50. The projection display unit 50 spatially modulates light emitted from the R light source 41r, the G light source 41g, and the B light source 41b on the basis of image information that is input from the system control unit 60 and projects the spatially modulated image light onto the projection area 11A. The projection area 11A constitutes a display area in which a virtual image can be displayed by the projection display unit 50.
The system control unit 60 controls the light source control unit 40A and the driving unit 45 so as to cause image light based on image information to be emitted to the diffusion plate 47 through the projection optical system 46.
The diffusion plate 47, the reflective mirror 48, and the magnifying glass 49 form the optical path through which the light emitted from the projection optical system 46 reaches the projection area 11A.
The system control unit 60 is mainly composed of a processor and includes a read only memory (ROM) in which a program to be executed by the processor or the like is stored, a random access memory (RAM) as a work memory, and the like.
The storage unit 70 stores a plurality of operation plan information items.
The operation plan information is information that specifies each of the position (latitude and longitude) of the construction machine 1 at which digging by using the bucket 4A is to be started, the direction of the driver's cabin 5 at that position, the posture of the bucket 4A (including the position of the bucket 4A in the vertical direction, the distance to the bucket 4A from the driver's cabin 5, and the like) at the time of start of digging at that position, and the digging amount at that position. Note that the information on the digging amount may be omitted from the operation plan information.
Hereinafter, the position of the construction machine 1 specified by the operation plan information will be called planned position, the direction of the driver's cabin 5 specified by the operation plan information will be called planned direction, and the posture of the bucket 4A specified by the operation plan information will be called planned posture.
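For concreteness, one possible in-memory representation of a single operation plan information item is sketched below in Python. The class and field names (and the example coordinate values) are assumptions introduced here for illustration; as noted above, the digging amount may be omitted.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BucketPosture:
    """Planned posture of the bucket 4A at the time of start of digging."""
    height_m: float       # position of the bucket in the vertical direction
    distance_m: float     # distance to the bucket from the driver's cabin

@dataclass
class OperationPlanItem:
    """One operation plan information item stored in the storage unit."""
    order: int                      # execution order (a smaller value is earlier)
    planned_latitude: float         # planned position of the construction machine
    planned_longitude: float
    planned_direction_deg: float    # planned direction of the driver's cabin
    planned_posture: BucketPosture  # planned posture of the bucket
    digging_amount_m3: Optional[float] = None  # may be omitted from the plan

# Example: the storage unit holds the items sorted by execution order.
plan = sorted(
    [
        OperationPlanItem(2, 35.6581, 139.7414, 92.0, BucketPosture(0.5, 6.0), 1.2),
        OperationPlanItem(1, 35.6580, 139.7413, 90.0, BucketPosture(0.4, 5.5), 1.0),
    ],
    key=lambda item: item.order,
)
```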
Sensors 80 are the group of sensors, such as the angular rate sensor and the three-axis acceleration sensor, that is provided in the front operation unit 4 and that outputs acceleration information and angular rate information for detecting the posture of the front operation unit 4 to the system control unit 60.
On the basis of the operation plan information that is read out from the storage unit 70, the direction information that is input from the direction sensor 14, and the position information that is input from the GPS receiver 15, the system control unit 60 generates image information for displaying a working-machine virtual image that represents the bucket 4A and causes image light based on the image information to be projected onto the projection area 11A. Note that the HUD 10 includes the storage unit 70 in this non-limiting example. The HUD 10 may read out the operation plan information that is stored in a storage medium that is externally attached to the HUD 10. Alternatively, the HUD 10 may read out the operation plan information from a storage medium that is outside the construction machine 1 through a network.
The system control unit 60 includes a detection unit 61, an overlap determining unit 62, and a display control unit 63. The detection unit 61, the overlap determining unit 62, and the display control unit 63 are functional blocks formed by the processor of the system control unit 60 executing programs including a control program stored in the ROM.
On the basis of the acceleration information and angular rate information that are input from the sensors 80, the detection unit 61 detects the posture of the bucket 4A determined on the basis of the position of the bucket 4A in the vertical direction and the distance to the bucket 4A from the driver's cabin 5. The posture of the bucket 4A detected by the detection unit 61 will be called detected posture below.
In addition, the detection unit 61 detects the direction of the driver's cabin 5 (the direction of the front surface of the front windshield 11) on the basis of the direction information that is input from the direction sensor 14. The direction of the driver's cabin 5 detected by the detection unit 61 will be hereinafter called detected direction.
Furthermore, the detection unit 61 detects the position of the construction machine 1 on the basis of the position information that is input from the GPS receiver 15. The position of the construction machine 1 detected by the detection unit 61 will be hereinafter called detected position.
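A minimal sketch of how the detection unit 61 might aggregate these inputs is shown below. The sensor interfaces are hypothetical, and the step that converts acceleration and angular-rate information into a bucket posture is abbreviated to a placeholder because its kinematics are not specified here.

```python
from dataclasses import dataclass

@dataclass
class DetectedState:
    latitude: float           # detected position of the construction machine
    longitude: float
    direction_deg: float      # detected direction of the driver's cabin
    bucket_height_m: float    # detected posture: vertical position of the bucket
    bucket_distance_m: float  # detected posture: distance from the driver's cabin

class DetectionUnit:
    """Aggregates the outputs of the GPS receiver, the direction sensor, and sensors 80."""

    def __init__(self, gps, direction_sensor, posture_sensors):
        self.gps = gps
        self.direction_sensor = direction_sensor
        self.posture_sensors = posture_sensors

    def detect(self) -> DetectedState:
        lat, lon = self.gps.read_position()              # position information
        heading = self.direction_sensor.read_heading()   # direction information
        # Placeholder: derive the bucket posture from acceleration and
        # angular-rate information (the concrete kinematics are omitted here).
        height, distance = self.posture_sensors.estimate_bucket_posture()
        return DetectedState(lat, lon, heading, height, distance)
```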
The display control unit 63 controls the image information to be input to the driving unit 45 and controls the virtual image to be displayed by the projection display unit 50.
On the basis of any of the plurality of operation plan information items stored in the storage unit 70 and the detected position and detected direction detected by the detection unit 61, the display control unit 63 causes the projection display unit 50 to display a bucket virtual image (working-machine virtual image) that represents the bucket 4A at a predetermined position in the projection area 11A, thereby presenting to the operator the position of the construction machine 1, the direction of the driver's cabin 5, and the posture of the bucket 4A that are appropriate for starting a digging operation.
In a case where the detected position corresponds with the planned position and the detected direction corresponds with the planned direction, the overlap determining unit 62 determines, on the basis of the detected posture and the planned posture, whether the bucket virtual image displayed by the projection display unit 50 overlaps with the bucket 4A when seen from the driver's seat 6. The state where the bucket virtual image overlaps with the bucket 4A includes, in addition to the state where the outline of the bucket virtual image completely overlaps with the outline of the bucket 4A, the state where these two outlines are slightly misaligned.
In addition, the correspondence between the detected position and the planned position means not only the case where the detected position completely corresponds with the planned position but also the case where the difference between the detected position and the planned position is less than or equal to a predetermined value. The correspondence between the detected direction and the planned direction means not only the case where the front surface of the front windshield 11 faces the position for performing a digging operation and the detected direction completely corresponds with the planned direction but also the case where the front surface of the front windshield 11 faces the position for performing a digging operation and the difference between the detected direction and the planned direction is less than or equal to a predetermined value.
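This tolerance-based notion of correspondence can be expressed compactly, as in the sketch below; the tolerance values and the flat-earth distance approximation are assumptions made for illustration.

```python
import math

def positions_correspond(detected, planned, tolerance_m=0.5):
    """True if the detected position is within a predetermined distance of the plan."""
    # Equirectangular approximation; adequate for the short distances involved.
    lat = math.radians((detected[0] + planned[0]) / 2.0)
    dy = (detected[0] - planned[0]) * 111_320.0
    dx = (detected[1] - planned[1]) * 111_320.0 * math.cos(lat)
    return math.hypot(dx, dy) <= tolerance_m

def directions_correspond(detected_deg, planned_deg, tolerance_deg=2.0):
    """True if the detected cabin direction is within a predetermined angle of the plan."""
    diff = (detected_deg - planned_deg + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    return abs(diff) <= tolerance_deg
```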
Specifically, the overlap determining unit 62 calculates the difference between the detected posture and the planned posture according to the operation plan information stored in the storage unit 70. In a case where the difference is less than a threshold value, the overlap determining unit 62 determines that the bucket virtual image and the bucket 4A overlap with each other when seen from the driver's seat 6. In a case where the difference is greater than or equal to the threshold value, the overlap determining unit 62 determines that the bucket virtual image and the bucket 4A do not overlap with each other when seen from the driver's seat 6.
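A corresponding sketch of this threshold test follows; the posture-difference metric and the threshold value are assumptions chosen for illustration.

```python
def postures_overlap(detected_posture, planned_posture, threshold=0.1):
    """Return True if the bucket and the bucket virtual image are judged to overlap.

    Each posture is a (vertical_position_m, distance_from_cabin_m) pair; the
    difference is taken as the larger of the two component differences.
    """
    dz = abs(detected_posture[0] - planned_posture[0])
    dd = abs(detected_posture[1] - planned_posture[1])
    return max(dz, dd) < threshold

# Example: a bucket 3 cm above and 5 cm short of the planned posture still counts
# as overlapping, so the report information would be displayed.
print(postures_overlap((0.43, 5.45), (0.40, 5.50)))  # True
```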
Note that a digital camera that can capture an image of the same range as the field of view of the operator who is seated on the driver's seat 6 may be installed in the driver's cabin 5, and the overlap determining unit 62 may analyze the image captured by the digital camera so as to determine whether the bucket virtual image that is being displayed by the projection display unit 50 and the bucket 4A overlap with each other when seen from the driver's seat 6.
In the case where the overlap determining unit 62 determines that the bucket virtual image and the bucket 4A overlap with each other when seen from the driver's seat 6, the display control unit 63 generates image information including report information for informing the operator that the state where the digging operation is to be started has been reached, inputs this image information to the driving unit 45, and causes the report information to be displayed.
The report information is information for reporting that the posture of the bucket 4A corresponds with the planned posture, and is presented as an image, characters, or the like that the operator can easily recognize visually.
In the HUD 10, a plurality of operation plan information items Dn (n is an integer of two or more) are stored in advance in the storage unit 70, and each of the plurality of operation plan information items Dn is stored in association with an operation execution order. Note that a smaller value of n corresponds to an earlier execution order.
When the HUD 10 is started and is set to an operational support mode, the display control unit 63 first reads out an operation plan information item Dn whose execution order is first from the storage unit 70 (step S1).
Subsequently, on the basis of the direction information from the direction sensor 14 and the position information from the GPS receiver 15, the detection unit 61 detects the position of the construction machine 1 and the direction of the driver's cabin 5 (step S2).
Subsequently, the display control unit 63 determines whether the planned position according to the operation plan information that is read out from the storage unit 70 in step S1 corresponds with the detected position that is detected in step S2 (step S3).
In a case where the determination in step S3 is YES, the display control unit 63 determines whether the planned direction according to the operation plan information that is read out from the storage unit 70 in step S1 corresponds with the detected direction that is detected in step S2 (step S4).
In a case where the determination in step S4 is YES, the display control unit 63 causes the projection display unit 50 to display a bucket virtual image 101C in the projection area 11A on the basis of the planned posture specified by the operation plan information that is read out from the storage unit 70 in step S1 (step S5).
The bucket virtual image 101C virtually represents the bucket 4A as observed within the projection area 11A from the driver's seat 6 in the state where the bucket 4A is in the planned posture. Accordingly, by operating the bucket 4A so that it overlaps with the bucket virtual image 101C, the operator can bring the position of the bucket 4A in real space into correspondence with the position specified by the plan.
Upon display of the bucket virtual image in step S5, the overlap determining unit 62 determines whether the bucket virtual image and the bucket 4A overlap with each other (step S6).
When the operator moves the bucket 4A upward so that the bucket 4A overlaps with the bucket virtual image 101C when seen from the driver's seat 6, the determination in step S6 becomes YES, and the display control unit 63 causes the report information to be displayed.
After the report information has been displayed, the operator can start the digging operation at this position in the state according to the operation plan information.
In a case where the display control unit 63 determines in step S4 that the planned direction according to the operation plan information that is read out from the storage unit 70 in step S1 does not correspond with the detected direction that is detected in step S2 (step S4: NO), the display control unit 63 determines whether the difference between the planned direction according to the operation plan information that is read out from the storage unit 70 in step S1 and the detected direction that is detected in step S2 is greater than or equal to a threshold value (step S11).
In a case where the determination in step S11 is YES, the display control unit 63 does not display the bucket virtual image based on the planned posture, but causes the projection display unit 50 to display a rotation instruction virtual image indicating the direction of rotation of the driver's cabin 5, the rotation being necessary for making the direction of the driver's cabin 5 closer to the planned direction (step S12).
In a case where the determination in step S11 is NO, on the basis of the planned posture according to the operation plan information that is read out from the storage unit 70 in step S1 and the difference between the planned direction according to the operation plan information that is read out in step S1 and the detected direction that is detected in step S2, the display control unit 63 determines the display position and display size of the bucket virtual image and causes the projection display unit 50 to display the bucket virtual image at the determined display position with the determined display size (step S13).
In the case where the determination in step S11 is NO, although the position of the construction machine 1 is according to the plan, the direction of the driver's cabin 5 is misaligned to the left or right from the planned direction.
Accordingly, in order to express the misalignment of this direction, the display control unit 63 controls the display position of the bucket virtual image on the basis of the difference between the planned direction and the detected direction without changing the display size of the bucket virtual image based on the planned posture included in the operation plan information.
Specifically, in a case where the detected direction is on a more right side than the planned direction, the display control unit 63 displays the bucket virtual image at a position shifted to the left of the display position based on the planned posture, and in a case where the detected direction is on a more left side than the planned direction, the display control unit 63 displays the bucket virtual image at a position shifted to the right.
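One way to realize this position control is to convert the angular difference into a horizontal offset of the display position, as in the hedged sketch below; the pixel scale, the screen coordinates, and the sign convention (which follows the description above) are assumptions.

```python
def bucket_image_x(base_x_px, planned_dir_deg, detected_dir_deg, px_per_deg=40.0):
    """Horizontal display position of the bucket virtual image.

    If the cabin faces further right than planned (positive difference), the
    planned digging point lies to the left of the line of sight, so the image
    is shifted left (smaller x); the converse holds for a leftward misalignment.
    """
    diff_deg = (detected_dir_deg - planned_dir_deg + 180.0) % 360.0 - 180.0
    return base_x_px - diff_deg * px_per_deg

# Example: cabin 3 degrees to the right of the plan -> image shifted 120 px left.
print(bucket_image_x(640.0, planned_dir_deg=90.0, detected_dir_deg=93.0))  # 520.0
```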
After the process in step S13, the display control unit 63 causes the projection display unit 50 to display a rotation instruction virtual image indicating an instruction for making the direction of the driver's cabin 5 closer to the planned direction according to the operation plan information (step S14).
After the process in step S12 or the process in step S14, the process returns to step S4 in
In a case where it is determined in step S3 that the planned position according to the operation plan information that is read out from the storage unit 70 in step S1 does not correspond with the detected position that is detected in step S2 (step S3: NO), the display control unit 63 determines whether the planned direction according to the operation plan information that is read out from the storage unit 70 in step S1 corresponds with the detected direction that is detected in step S2 (step S15).
In a case where the determination in step S15 is YES, on the basis of the planned posture according to the operation plan information that is read out from the storage unit 70 in step S1 and the difference between the planned direction according to the operation plan information that is read out in step S1 and the detected direction that is detected in step S2, the display control unit 63 determines the display position and display size of the bucket virtual image and causes the projection display unit 50 to display the bucket virtual image at the determined display position with the determined display size (step S16).
In the case where the determination in step S15 is YES, although the direction of the driver's cabin 5 of the construction machine 1 is according to the plan, the position of the construction machine 1 is ahead of or behind the planned position.
Accordingly, in order to express the misalignment of this position, the display control unit 63 controls the display size of the bucket virtual image on the basis of the difference between the planned position and the detected position without changing the display position of the bucket virtual image based on the planned posture included in the operation plan information.
Specifically, in a case where the planned position is ahead of the detected position, the display control unit 63 displays the bucket virtual image with a display size smaller than the display size based on the planned posture, so that the bucket virtual image appears farther away.
In addition, in a case where the planned position is behind the detected position, the display control unit 63 displays the bucket virtual image with a display size larger than the display size based on the planned posture, so that the bucket virtual image appears closer.
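Similarly, the size control can be sketched as a scale factor derived from the difference between the planned position and the detected position, under a simplified perspective model; the function below and its parameters are assumptions for illustration.

```python
def bucket_image_scale(planned_distance_m, position_error_m):
    """Scale factor applied to the bucket virtual image based on the planned posture.

    position_error_m > 0 means the planned position is ahead of the detected
    position (the target is farther away), which shrinks the image; a negative
    value means the machine has overshot the plan, which enlarges it.
    """
    apparent_distance = planned_distance_m + position_error_m
    if apparent_distance <= 0:
        return 1.0  # degenerate case: fall back to the planned-posture size
    return planned_distance_m / apparent_distance

# Example: the plan lies 2 m ahead of a bucket normally drawn for a 6 m distance,
# so the image is drawn at 75 % of its planned-posture size.
print(bucket_image_scale(planned_distance_m=6.0, position_error_m=2.0))  # 0.75
```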
After the process in step S16, the display control unit 63 causes the projection display unit 50 to display a movement instruction virtual image indicating an instruction for making the position of the construction machine 1 closer to the planned position included in the operation plan information (step S17).
After the process in step S17, the process returns to step S2.
In a case where the determination in step S15 is NO, the display control unit 63 determines whether the difference between the planned direction according to the operation plan information that is read out from the storage unit 70 in step S1 and the detected direction that is detected in step S2 is greater than or equal to the threshold value (step S18).
In a case where the determination in step S18 is YES, the display control unit 63 does not display the bucket virtual image based on the planned posture and causes the projection display unit 50 to display a movement instruction virtual image indicating the direction of movement of the construction machine 1, the movement being necessary to make the direction of the driver's cabin 5 closer to the planned direction (step S19).
In a case where the determination in step S18 is NO, on the basis of the planned posture according to the operation plan information that is read out from the storage unit 70 in step S1, the difference between the planned direction according to the operation plan information that is read out in step S1 and the detected direction that is detected in step S2, and the difference between the planned position according to the operation plan information that is read out in step S1 and the detected position that is detected in step S2, the display control unit 63 determines the display position and display size of the bucket virtual image and causes the projection display unit 50 to display the bucket virtual image at the determined display position with the determined display size (step S20).
The case where the determination in step S18 is NO corresponds to the state where the position of the construction machine 1 is misaligned from the plan and the direction of the driver's cabin 5 is slightly misaligned from the plan.
Accordingly, in order to express the misalignment of the position and the direction, the display control unit 63 controls the display position of the bucket virtual image based on the planned posture included in the operation plan information on the basis of the difference between the planned direction and the detected direction and controls the display size of the bucket virtual image based on the planned posture included in the operation plan information on the basis of the difference between the planned position and the detected position.
Specifically, in a case where the detected direction is on a more right side than the planned direction and the detected position is behind the planned position, the display control unit 63 displays the bucket virtual image at a position shifted to the left of the display position based on the planned posture and with a display size smaller than the display size based on the planned posture.
After the process in step S20, the display control unit 63 causes the projection display unit 50 to display the movement instruction virtual image indicating an instruction for making the position of the construction machine 1 and the direction of the driver's cabin 5 closer to the planned position and planned direction according to the operation plan information (step S21).
After the process in step S19 or the process in step S21, the process returns to step S2.
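Putting the branches of steps S1 to S21 together, the control flow can be summarized by the following hedged pseudo-implementation. It reuses the hypothetical helpers sketched earlier (positions_correspond, directions_correspond, postures_overlap), treats the display object and its methods as placeholders, and assumes a threshold value for steps S11 and S18; it is a reading of the flow described above, not the actual control program of the HUD 10.

```python
DIRECTION_THRESHOLD_DEG = 15.0   # assumed value for the step S11/S18 threshold

def operational_support_step(plan_item, detected, display):
    """One pass of the support loop for the current operation plan item."""
    pos_ok = positions_correspond(
        (detected.latitude, detected.longitude),
        (plan_item.planned_latitude, plan_item.planned_longitude))
    dir_ok = directions_correspond(
        detected.direction_deg, plan_item.planned_direction_deg)
    dir_diff = abs((detected.direction_deg - plan_item.planned_direction_deg
                    + 180.0) % 360.0 - 180.0)

    if pos_ok and dir_ok:                                   # steps S3, S4: YES
        display.show_bucket_image(plan_item.planned_posture)            # S5
        if postures_overlap(
                (detected.bucket_height_m, detected.bucket_distance_m),
                (plan_item.planned_posture.height_m,
                 plan_item.planned_posture.distance_m)):                # S6
            display.show_report_information()
    elif pos_ok:                                            # S4: NO
        if dir_diff >= DIRECTION_THRESHOLD_DEG:                         # S11: YES
            display.show_rotation_instruction(plan_item)                # S12
        else:                                                           # S11: NO
            display.show_bucket_image(plan_item.planned_posture,
                                      shift_for=dir_diff)               # S13
            display.show_rotation_instruction(plan_item)                # S14
    elif dir_ok:                                            # S3: NO, S15: YES
        display.show_bucket_image(plan_item.planned_posture,
                                  scale_for=detected)                   # S16
        display.show_movement_instruction(plan_item)                    # S17
    else:                                                   # S15: NO
        if dir_diff >= DIRECTION_THRESHOLD_DEG:                         # S18: YES
            display.show_movement_instruction(plan_item)                # S19
        else:                                                           # S18: NO
            display.show_bucket_image(plan_item.planned_posture,
                                      shift_for=dir_diff,
                                      scale_for=detected)               # S20
            display.show_movement_instruction(plan_item)                # S21
```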
As described above, with the HUD 10, on the basis of the operation plan information stored in the storage unit 70, the detected position of the construction machine 1, and the detected direction of the driver's cabin 5, the bucket virtual image representing the bucket 4A can be displayed at a position indicating a digging point according to the plan.
Thus, the bucket virtual image enables the operator to check the position of the construction machine 1, the direction of the driver's cabin 5, and the posture of the bucket 4A that are appropriate for starting digging. Therefore, even in a case where an inexperienced operator performs an operation, by overlapping the bucket 4A with the bucket virtual image, the operator can start the operation in an appropriate state according to the operation plan information. This can realize the execution according to the plan without unnecessary movement, thereby improving the operation efficiency.
In addition, with the HUD 10, in a case where the bucket virtual image is not displayed, or in a case where the position of the construction machine 1 or the direction of the driver's cabin 5 is not according to the plan although the bucket virtual image is displayed, the rotation instruction virtual image or the movement instruction virtual image is displayed. By following these instructions, the operator can bring the position of the construction machine 1 and the direction of the driver's cabin 5 closer to the plan without moving the vehicle body more than necessary.
Furthermore, with the HUD 10, in a case where it is determined that the bucket virtual image and the bucket 4A overlap with each other, the report information is displayed. Thus, the operator can clearly recognize that the state where the digging operation is to be started has been reached, and can start the operation with confidence.
Note that instead of displaying the text image 112 as the report information, the display color of the bucket virtual image 101C may be changed, or the bucket virtual image 101C may be displayed in a blinking manner, to inform the operator that the posture of the bucket 4A corresponds to the planned posture.
In addition, instead of displaying the text image 112, a speaker may be added to the HUD 10, and the display control unit 63 may, by using the speaker, inform the operator that the posture of the bucket 4A corresponds to the planned posture. Furthermore, the display of the text image 112 may be combined with the change in the display color of the bucket virtual image 101C or the display of the bucket virtual image 101C in a blinking manner, or the display of the text image 112 may be combined with the report by using the speaker. Such a configuration enables the operator to perform the operation more accurately.
In addition, the text image 111 indicating the operation content of the bucket 4A, such as the digging amount specified by the operation plan information, may be displayed together with the bucket virtual image, and the display of the text image 111 may be switched on or off in accordance with an instruction from the operator.
For example, in a case where the operator considers that the display content projected onto the projection area 11A disturbs the operation, the display of images other than the bucket virtual image may be switched off, thereby maintaining the operation efficiency.
As described above, the following matters are disclosed herein.
(1) A projection display device to be mounted in a vehicle having a movable working machine and a main body part, the working machine being attached to the main body part, the main body part having a driver's cabin, the projection display device including:
a detection unit that detects a position of the vehicle and a direction of the driver's cabin;
a projection display unit that includes a light modulation unit that, on the basis of image information to be input, spatially modulates light emitted from a light source, and that projects image light, obtained through spatial modulation by the light modulation unit, onto a projection surface mounted in the driver's cabin to display a virtual image based on the image light; and
a display control unit that controls the image information to be input to the light modulation unit and that controls the virtual image that is to be displayed by the projection display unit,
wherein the display control unit causes the projection display unit to display a working-machine virtual image that represents the working machine on the basis of operation plan information and the position and the direction detected by the detection unit, the operation plan information specifying the position of the vehicle, the direction of the driver's cabin, and a posture of the working machine that are stored in a storage unit.
(2) The projection display device according to (1),
wherein the display control unit controls a display position of the working-machine virtual image on the basis of a difference between the direction specified by the operation plan information and the direction detected by the detection unit.
(3) The projection display device according to (1) or (2),
wherein the display control unit controls a display size of the working-machine virtual image on the basis of a difference between the position specified by the operation plan information and the position detected by the detection unit.
(4) The projection display device according to any one of (1) to (3),
wherein, in a case where the difference between the direction specified by the operation plan information and the direction detected by the detection unit is greater than or equal to a threshold value, the display control unit causes the projection display unit to display, instead of the working-machine virtual image, information indicating an instruction for changing the direction of the driver's cabin.
(5) The projection display device according to any one of (1) to (4), further including:
an overlap determining unit that determines whether the working-machine virtual image displayed by the projection display unit and the working machine overlap with each other when seen from a driver's seat in the driver's cabin,
wherein, in a case where the overlap determining unit determines that the working-machine virtual image and the working machine overlap with each other, the display control unit informs an operator within the driver's cabin that the working machine is in an optimal posture.
(6) The projection display device according to (5),
wherein, in the case where the overlap determining unit determines that the working-machine virtual image and the working machine overlap with each other, the display control unit causes the projection display unit to display report information for informing the operator within the driver's cabin that the working machine is in the optimal posture.
(7) The projection display device according to (5),
wherein, in the case where the overlap determining unit determines that the working-machine virtual image and the working machine overlap with each other, the display control unit informs, by using a sound, the operator within the driver's cabin that the working machine is in the optimal posture.
(8) The projection display device according to any one of (1) to (7),
wherein, if the working-machine virtual image is being displayed, the display control unit further causes the projection display unit to display information indicating operation content of the working machine.
(9) A method for controlling a projection display device, the projection display device being to be mounted in a vehicle having a movable working machine and a main body part, the working machine being attached to the main body part, the main body part having a driver's cabin, the projection display device having
a light modulation unit that, on the basis of image information to be input, spatially modulates light emitted from a light source, and
a projection display unit that projects image light, obtained through spatial modulation by the light modulation unit, onto a projection surface mounted in the driver's cabin to display a virtual image based on the image light, the method including:
a detection step of detecting a position of the vehicle and a direction of the driver's cabin; and
a display control step of causing the projection display unit to display a working-machine virtual image that represents the working machine on the basis of operation plan information and the position and the direction detected in the detection step, the operation plan information specifying the position of the vehicle, the direction of the driver's cabin, and a posture of the working machine that are stored in a storage unit.
(10) The method for controlling a projection display device according to (9),
wherein, in the display control step, a display position of the working-machine virtual image is controlled on the basis of a difference between the direction specified by the operation plan information and the direction detected in the detection step.
(11) The method for controlling a projection display device according to (9) or (10),
wherein, in the display control step, a display size of the working-machine virtual image is controlled on the basis of a difference between the position specified by the operation plan information and the position detected in the detection step.
(12) The method for controlling a projection display device according to any one of (9) to (11),
wherein, in the display control step, in a case where the difference between the direction specified by the operation plan information and the direction detected in the detection step is greater than or equal to a threshold value, the projection display unit is caused to display, instead of the working-machine virtual image, information indicating an instruction for changing the direction of the driver's cabin.
(13) The method for controlling a projection display device according to any one of (9) to (12), further including:
an overlap determining step of determining whether the working-machine virtual image displayed by the projection display unit and the working machine overlap with each other when seen from a driver's seat in the driver's cabin,
wherein, in a case where it is determined in the overlap determining step that the working-machine virtual image and the working machine overlap with each other, in the display control step, an operator within the driver's cabin is informed that the working machine is in an optimal posture.
(14) The method for controlling a projection display device according to (13),
wherein, in the case where it is determined in the overlap determining step that the working-machine virtual image and the working machine overlap with each other, in the display control step, the projection display unit is caused to display report information for informing the operator within the driver's cabin that the working machine is in the optimal posture.
(15) The method for controlling a projection display device according to (13),
wherein, in the case where it is determined in the overlap determining step that the working-machine virtual image and the working machine overlap with each other, in the display control step, by using a sound, the operator within the driver's cabin is informed that the working machine is in the optimal posture.
(16) The method for controlling a projection display device according to any one of (9) to (15),
wherein, in a case where the working-machine virtual image is being displayed, in the display control step, the projection display unit is further caused to display information indicating operation content of the working machine.
(17) A program for controlling a projection display device, the projection display device being to be mounted in a vehicle having a movable working machine and a main body part, the working machine being attached to the main body part, the main body part having a driver's cabin, the projection display device having
a light modulation unit that, on the basis of image information to be input, spatially modulates light emitted from a light source, and
a projection display unit that projects image light, obtained through spatial modulation by the light modulation unit, onto a projection surface mounted in the driver's cabin to display a virtual image based on the image light, the program causing a computer to execute:
a detection step of detecting a position of the vehicle and a direction of the driver's cabin; and
a display control step of causing the projection display unit to display a working-machine virtual image that represents the working machine on the basis of operation plan information and the position and the direction detected in the detection step, the operation plan information specifying the position of the vehicle, the direction of the driver's cabin, and a posture of the working machine that are stored in a storage unit.
According to the present invention, it is possible to increase the operation efficiency of a vehicle having a working machine, such as a construction machine or an agricultural machine.
This application is a Continuation of PCT International Application No. PCT/JP2017/036270 filed on Oct. 5, 2017, which claims priority under 35 U.S.C. § 119(a) to Japanese Patent Application No. 2016-245679 filed on Dec. 19, 2016. Each of the above applications is hereby expressly incorporated by reference, in its entirety, into the present application.