The present invention relates to a head-up display device which is mounted on a vehicle.
A head-up display device provides a variety of visual information to a driver whose line of sight is directed outside a vehicle (c.f. Patent Document 1). Therefore, the head-up display device contributes to safe driving of the vehicle.
Patent Document 1 proposes assigning a priority to information which is given to the driver. According to Patent Document 1, a display area for the information is determined according to the priority.
The priority proposed in Patent Document 1 is determined irrespective of a driving operation pattern of the driver. Therefore, if information is displayed by the techniques disclosed in Patent Document 1, the driver may not intuitively grasp necessary information.
An object of the present invention is to provide a head-up display device which allows a driver to intuitively acquire necessary information.
A head-up display device according to one aspect of the present invention is mounted on a vehicle. The head-up display device includes a projection device which emits an image light onto a reflective surface including a first area and a second area below the first area. The image light includes a first image light representing first information about an external factor outside the vehicle, and a second image light representing second information about the vehicle itself. The projection device emits the first image light onto the first area and the second image light onto the second area.
The head-up display device allows a driver to intuitively acquire necessary information.
These and other objects, features and advantages of the present invention will become more apparent upon reading the following detailed description along with the accompanying drawings.
The inventors found that a driver may intuitively acquire information if the information is displayed according to a driving operation pattern of the driver. In many cases, the driver directs his/her line of sight upwardly when the driver views a factor existing outside a vehicle (e.g. a signboard indicating a legal speed, a line formed on a road surface for use in indicating a lane, or a preceding vehicle). On the other hand, a lot of information about the vehicle itself is displayed on an indicator panel. Therefore, the driver directs his/her line of sight downwardly in many cases in order to obtain information about the vehicle itself. A head-up display device configured to display information according to such a driving operation pattern is described in the first embodiment.
The head-up display device 100 includes a projection device 200. The projection device 200 is mounted on a vehicle (not shown). The projection device 200 may be a general projector configured to emit an image light in response to an image signal. The principles of the present embodiment are not limited to a specific structure of the projection device 200.
The projection device 200 generates an image light in response to an image signal. The image light is emitted from the projection device 200 to the reflective surface RFT.
The first image light may represent first information including information about an external factor outside the vehicle. For instance, the first information may include navigation information for navigating to a destination. The first information may include legal speed information about a legal speed determined for a lane along which the vehicle runs. The projection device 200 may generate an image signal from a signal which is output from a navigation system mounted on the vehicle, and then may generate an image light representing the navigation information and/or the legal speed information. The image generation techniques for displaying the navigation information and the legal speed information as an image may rely on various image processing techniques applied to existing vehicles. Therefore, the principles of the present embodiment are not limited to a specific image generation technique for displaying the navigation information and the legal speed information as an image.
The first information may include inter-vehicle distance information about a distance setting between the vehicle driven by the driver DRV and a preceding vehicle which is targeted in an auto cruise control. The projection device 200 may generate an image signal representing the inter-vehicle distance information in collaboration with a control program which is used for the auto cruise control. The image generation techniques for displaying the inter-vehicle distance information as an image may rely on existing auto cruise control techniques. Therefore, the principles of the present embodiment are not limited to a specific image generation technique for displaying the inter-vehicle distance information as an image.
The first information may include lane information indicating a positional relationship between the vehicle driven by the driver DRV and a lane along which the vehicle runs. The projection device 200 may use a signal from a camera device mounted on the vehicle to generate an image signal representing the lane information. The image generation techniques for displaying the lane information as an image may rely on various existing image processing techniques. Therefore, the principles of the present embodiment are not limited to a specific image generation technique for displaying the lane information as an image.
The second image light may represent second information about the vehicle itself which is driven by the driver DRV. For instance, the second information may include running speed information about an actual running speed of the vehicle driven by the driver DRV. The projection device 200 may generate an image light representing running speed information from a detection signal which is output from various sensors mounted on the vehicle. The image generation techniques for displaying the running speed information as an image may rely on various signal processing techniques applied to existing vehicles. Therefore, the principles of the present embodiment are not limited to a specific image generation technique for displaying the running speed information as an image.
The second image light may include setting speed information about a running speed setting of the vehicle in the auto cruise control. The projection device 200 may generate an image signal which represents the setting speed information in collaboration with a control program which is used for the auto cruise control. The image generation techniques for displaying the setting speed information as an image may rely on existing auto cruise control techniques. Therefore, the principles of the present embodiment are not limited to a specific image generation technique for displaying the setting speed information as an image.
The projection device 200 may emit an image light including a variety of information. If information exclusively concerns the vehicle itself and is irrelevant to external factors outside the vehicle, the projection device 200 may output the information as the second image light. On the other hand, if information is associated with an external factor outside the vehicle, the information may be emitted as the first image light. Therefore, the principles of the present embodiment are not limited to specific information represented by an image light.
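Although the embodiment does not prescribe a specific implementation, the routing rule described above (external-factor information to the first image light, vehicle-internal information to the second image light) may be sketched, for illustration only, as follows. All names and data are hypothetical and are not part of the device itself.

```python
# Illustrative sketch of the routing rule: information associated with an
# external factor outside the vehicle is emitted as the first image light
# (upper area), while information exclusively about the vehicle itself is
# emitted as the second image light (lower area).
from dataclasses import dataclass

@dataclass
class InfoItem:
    name: str
    external: bool  # True if associated with a factor outside the vehicle

def route(items):
    first = [i.name for i in items if i.external]       # first image light
    second = [i.name for i in items if not i.external]  # second image light
    return first, second

items = [
    InfoItem("legal_speed", True),
    InfoItem("lane_position", True),
    InfoItem("running_speed", False),
    InfoItem("setting_speed", False),
]
upper, lower = route(items)
```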
A head-up display device may emit an image light toward a windshield of a vehicle. The windshield of the vehicle allows partial transmission of the image light, and reflects other parts of the image light. The reflected image light enters into the driver's eyes. Consequently, the driver may view a virtual image through the windshield. A head-up display device which emits image light toward a windshield is described in the second embodiment.
The projection device 200 is stored in the dashboard DSB. The projection device 200 emits an image light toward the windshield WSD. As described in the context of the first embodiment, an area of the windshield WSD which receives an image light is conceptually divided into the upper and lower display areas UDA, LDA. Both of the first image light incident onto the upper display area UDA and the second image light incident onto the lower display area LDA are reflected toward the driver DRV. Consequently, the driver DRV may visually recognize images represented by the first and second image lights as a virtual image through the windshield WSD.
Appropriate optical processing may be applied to the windshield WSD. Consequently, multiple images are less likely to occur. Techniques for avoiding the multiple images may rely on various existing optical techniques. Therefore, the principles of the present embodiment are not limited to specific optical characteristics of the windshield WSD.
A head-up display device may display various images on the basis of the principles described in the context of the first and second embodiments. An exemplary image to be displayed by a head-up display device is described in the third embodiment.
Each of
Each of
As shown in
As shown in
Unlike
When the driver DRV activates a navigation system (not shown) of the vehicle VCL as shown in
As shown in
Switching an image between
As described in the context of the first to third embodiments, if information about an external factor outside a vehicle is included, the head-up display device emits an image light onto an upper area. On the other hand, the head-up display device emits an image light onto a lower area in order to display information about the vehicle itself. The head-up display device may emit a boundary light as an image light representing a boundary between the upper area and the lower area, in addition to an image light representing a variety of information. When the boundary light is emitted, a driver is likely to receive a visual impression that the arrangement of images is well-organized. On the other hand, when the boundary light is constantly emitted, the head-up display device may unnecessarily consume electric power. Therefore, the head-up display device may switch a display mode between a first display mode, in which the boundary is displayed, and a second display mode, in which the boundary is not displayed. The first and second display modes are described in the fourth embodiment.
Each of
As shown in
The head-up display device 100 may emit at least one of the first and second image lights (c.f.
Unlike the first display mode, the head-up display device 100 does not emit the boundary light (c.f.
The head-up display device 100 may switch a display mode between the first and second display modes in response to a manual operation of the driver DRV (c.f.
A head-up display device may display a boundary image under the first display mode according to the principles described in the context of the fourth embodiment. A positional relationship between a boundary image and an area above the boundary image is similar to a positional relationship between a hood and scenery viewed through a windshield. The head-up display device may use the aforementioned positional similarity to display an image which is used for a distance setting between a vehicle driven by a driver and a preceding vehicle to be targeted in an auto cruise control. An exemplary image for use in setting an auto cruise control is described in the fifth embodiment.
As shown in
As described with reference to
As described with reference to
Like
A head-up display device may provide a driver with an image for use in setting an auto cruise control according to the principles described in the context of the fifth embodiment. A designer designing the head-up display device may use various image generation techniques (e.g. programming techniques or circuit designing techniques) for displaying an image described in the context of the fifth embodiment. Exemplary techniques for displaying an image for use in setting an auto cruise control are described in the sixth embodiment.
The head-up display device 100 includes an optical processor 300 and a signal processor 400. The optical processor 300 corresponds to the projection device 200 described with reference to
As described in the context of the second embodiment, the vehicle VCL includes the windshield WSD. The optical processor 300 emits an image light (the first and/or second image lights, c.f.
The vehicle VCL includes a sensor group SSG and an interface ITF, in addition to the windshield WSD. The sensor group SSG may include various sensor elements for detecting a running state of the vehicle VCL and various devices (e.g. a camera device or a communication device) for acquiring information outside the vehicle VCL. The interface ITF receives a manual operation of the driver DRV (c.f.
The sensor group SSG includes a speed detector SDT. The speed detector SDT may include various sensor elements for detecting a running speed of the vehicle VCL. The speed detector SDT generates a detection signal representing a running speed of the vehicle VCL. The detection signal is output from the speed detector SDT to the signal processor 400. The detection techniques for detecting a speed of the vehicle VCL may rely on techniques for use in various existing vehicles. The principles of the present embodiment are not limited to a specific technique for detecting a running speed of the vehicle VCL.
The interface ITF includes an operation portion MOP and a request signal generator RSG. The driver DRV may operate the operation portion MOP to request displaying the image (c.f.
The request signal includes a switching request signal, a setting distance signal and a setting speed signal. The switching request signal transmits a display switching request for the image, which is described in the context of the fifth embodiment, to the signal processor 400. The setting distance signal transmits a distance setting between the vehicle VCL and a preceding vehicle in an auto cruise control to the signal processor 400. The setting speed signal transmits a running speed setting of the vehicle VCL in the auto cruise control to the signal processor 400.
The operation portion MOP may be a steering switch near a steering wheel. The steering switch may be a lever, a button, a dial or another structure configured to receive a manual operation of the driver DRV. The principles of the present embodiment are not limited to a specific structure of the operation portion MOP.
The request signal generator RSG may be a computer which executes a program for the auto cruise control. Designing the request signal generator RSG may rely on a variety of existing auto cruise control techniques. The principles of the present embodiment are not limited to a specific program or a specific computer device for use in the request signal generator RSG.
The signal processor 400 includes a detection signal receiver 410, a request receiver 420 and an image signal processor 430. The detection signal receiver 410 receives the detection signal from the speed detector SDT. The detection signal is then output from the detection signal receiver 410 to the image signal processor 430. The image signal processor 430 may display various images (e.g. the images described with reference to
The request receiver 420 includes a switching request receiver 421, a distance setting receiver 422 and a speed setting receiver 423. The request receiver 420 receives the request signal (i.e. the switching request signal, the setting distance signal and the setting speed signal) from the request signal generator RSG. The request signal is then output from the request receiver 420 to the image signal processor 430.
The switching request receiver 421 receives the switching request signal from the request signal generator RSG. The switching request signal is then output from the switching request receiver 421 to the image signal processor 430. The image signal processor 430 processes signals in response to the switching request signal in order to display the image (i.e. the image for use in setting the auto cruise control) which is described with reference to
The distance setting receiver 422 receives the setting distance signal from the request signal generator RSG. The setting distance signal is then output from the distance setting receiver 422 to the image signal processor 430. The image signal processor 430 determines a distance between the boundary image BDI (c.f.
The speed setting receiver 423 receives the setting speed signal from the request signal generator RSG. The setting speed signal is then output from the speed setting receiver 423 to the image signal processor 430. The image signal processor 430 determines contents of the setting speed image ASI (c.f.
The image signal processor 430 includes a switching portion 431, a storage 432 and an image signal generator 433. The switching portion 431 receives the detection signal from the detection signal receiver 410. The switching portion 431 receives the switching request signal from the switching request receiver 421. The switching portion 431 switches an output destination of the detection signal in response to the switching request signal. The storage 432 stores various data required for image generation. The image signal generator 433 may read data from the storage 432 to generate various images in response to the detection signal and/or the request signal.
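Although the embodiment is not limited to a specific implementation, the behavior of the switching portion 431 described above, which switches the output destination of the detection signal in response to the switching request signal, may be sketched for illustration as follows. All names are hypothetical.

```python
# Illustrative sketch of the switching portion 431: the output destination
# of the detection signal is switched in response to the switching request
# signal. The string destinations stand in for the first and second image
# signal processors 441, 442.
class SwitchingPortion:
    def __init__(self):
        # default: normal running display (second image signal processor)
        self.destination = "second_image_signal_processor"

    def on_switching_request(self, setting_mode_requested: bool):
        # the setting mode of the auto cruise control uses the first
        # image signal processor; otherwise the second one is used
        self.destination = ("first_image_signal_processor"
                            if setting_mode_requested
                            else "second_image_signal_processor")

    def route(self, detection_signal):
        # forward the detection signal to the selected output destination
        return (self.destination, detection_signal)
```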
The image signal generator 433 includes a first image signal processor 441 and a second image signal processor 442. The switching portion 431 sets one of the first and second image signal processors 441, 442 as the output destination of the detection signal in response to the switching request signal. When the switching portion 431 outputs the detection signal to the first image signal processor 441, the first image signal processor 441 generates an image signal for displaying the image described with reference to
The first image signal processor 441 includes an image data reader 451, a display position adjuster 452, a speed image generator 453, an upper image generator 454, a lower image generator 455 and a combining portion 456.
The image data reader 451 receives the switching request signal from the switching request receiver 421. The image data reader 451 reads the image data from the storage 432 in response to receiving the switching request signal. The image data which is read from the storage 432 may include information about the boundary image BDI and information about a shape of the symbol image SBI. The image data is then output from the image data reader 451 to the upper image generator 454.
The display position adjuster 452 receives the setting distance signal from the distance setting receiver 422. The display position adjuster 452 reads information about a display position of the symbol image SBI from the storage 432 in response to receiving the setting distance signal. The information about the display position of the symbol image SBI may represent an initial setting value about the position of the symbol image SBI relative to the boundary image BDI. Alternatively, information about the position of the symbol image SBI relative to the boundary image BDI may represent a value which is set immediately before. The display position adjuster 452 refers to the setting distance signal and data about the display position read from the storage 432 to determine a display position of the symbol image SBI. The position data about the determined display position is output from the display position adjuster 452 to the upper image generator 454 and the storage 432. As a result of the output of the position data to the storage 432, the position data is updated in the storage 432.
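The behavior of the display position adjuster 452, which determines the display position of the symbol image SBI from the setting distance signal and updates the stored position data, may be sketched for illustration as follows. The mapping from the setting distance to a display position, and all names and values, are hypothetical; the embodiment prescribes no particular mapping.

```python
# Illustrative sketch of the display position adjuster 452. A dict stands
# in for the storage 432; the position value is an abstract display
# coordinate of the symbol image SBI relative to the boundary image BDI.
class DisplayPositionAdjuster:
    def __init__(self, storage):
        self.storage = storage  # stands in for the storage 432

    def adjust(self, setting_distance_m=None, units_per_meter=0.5):
        if setting_distance_m is None:
            # no new setting: reuse the value set immediately before
            # (or an initial setting value of 0.0)
            return self.storage.get("symbol_position", 0.0)
        # hypothetical mapping: a larger inter-vehicle distance setting
        # places the symbol image SBI farther above the boundary image BDI
        position = setting_distance_m * units_per_meter
        self.storage["symbol_position"] = position  # update the storage
        return position
```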
As described above, the image data for use in forming the boundary image BDI and the symbol image SBI is output from the image data reader 451 to the upper image generator 454. The upper image generator 454 generates an image signal to form the boundary image BDI and the symbol image SBI, which are formed in an area above the boundary image BDI. The upper image generator 454 generates an image signal so that the symbol image SBI is formed at a position which is determined by the position data output from the display position adjuster 452.
The speed image generator 453 receives the setting speed signal from the speed setting receiver 423. The speed image generator 453 generates image data for use in displaying the setting speed image ASI in response to the setting speed signal. The image data for use in displaying the setting speed image ASI is output from the speed image generator 453 to the lower image generator 455.
The lower image generator 455 receives the detection signal from the switching portion 431, in addition to the image data from the speed image generator 453. The lower image generator 455 generates image data for use in displaying the running speed image RSI in response to the detection signal. The lower image generator 455 uses the image data for displaying the setting speed image ASI and the running speed image RSI to generate an image signal for use in displaying an image in an area below the boundary image BDI.
The image signal representing the boundary image BDI and the symbol image SBI above the boundary image BDI is output from the upper image generator 454 to the combining portion 456. The image signal for use in displaying the running speed image RSI and the setting speed image ASI below the boundary image BDI is output from the lower image generator 455 to the combining portion 456. The combining portion 456 may combine these image signals to generate an image signal for use in displaying the image described with reference to
When the detection signal is output from the switching portion 431 to the second image signal processor 442, the second image signal processor 442 refers to the detection signal to generate the image signal for use in displaying the running speed image RSI in the lower display area LDA (c.f.
The optical processor 300 includes a light source portion 310, a modulator 320 and an emitting portion 330. The light source portion 310 generates laser light or other light suitable for generating an image light. The modulator 320 receives the image signal from one of the combining portion 456 and the second image signal processor 442. The modulator 320 modulates light emitted from the light source portion 310 in response to the image signal to generate an image light. The image light is emitted from the emitting portion 330 to the windshield WSD.
The light source portion 310 may be a general laser source. The light source portion 310 may include laser sources configured to emit laser lights which are different in wavelength from each other. In this case, the head-up display device 100 may form an image on the windshield WSD with different hues. The principles of the present embodiment are not limited to a specific type of a light source which is used as the light source portion 310.
The modulator 320 may be a general spatial light modulator element. For instance, the modulator 320 may drive liquid crystal elements in response to an image signal from one of the combining portion 456 and the second image signal processor 442 to generate an image light. Alternatively, the modulator 320 may include a MEMS mirror which is driven by an image signal. Further alternatively, the modulator 320 may include a galvano mirror or another reflective element which is driven by an image signal. The principles of the present embodiment are not limited to a specific structure of the modulator 320.
The emitting portion 330 may include various optical elements for image formation on the windshield WSD. For instance, the emitting portion 330 may include a projection lens or a screen. A designer designing the head-up display device 100 may use a structure for use in existing projectors in designing the emitting portion 330. The principles of the present embodiment are not limited to a specific structure of the emitting portion 330.
The head-up display device described in the context of the sixth embodiment may execute various processes to switch a display mode between the first and second display modes. Exemplary processes for switching a display mode are described in the seventh embodiment.
Step S110 is started, for instance, when the driver DRV (c.f.
When the driver DRV turns off the ignition switch (not shown), the head-up display device 100 (c.f.
The switching portion 431 (c.f.
When the driver DRV operates the operation portion MOP (c.f.
The request signal generator RSG (c.f.
The switching portion 431 starts measuring an elapsed time "Tc".
The switching portion 431 determines whether or not the elapsed time "Tc" exceeds a predetermined threshold time "Tt". When the elapsed time "Tc" does not exceed the threshold time "Tt", Step S170 is repeated. The threshold time "Tt" is set to a time period long enough (e.g. five seconds) for the driver DRV to complete setting the auto cruising control. While Step S170 is executed, the driver DRV may operate the operation portion MOP to set an inter-vehicle distance between the vehicle VCL and a preceding vehicle to be targeted in the auto cruising control, and a setting speed in the auto cruising control. The first image signal processor 441 generates an image signal so that a position of the symbol image SBI (c.f.
The switching portion 431 switches an output destination of the detection signal from the first image signal processor 441 to the second image signal processor 442. Consequently, the setting mode of the auto cruising control is interrupted or finished. In addition, the switching portion 431 resets a value of the elapsed time "Tc" to "0". Step S130 is then executed.
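The time-out behavior described in these steps, in which the setting image is kept displayed while the elapsed time "Tc" does not exceed the threshold time "Tt" and the output destination is then switched back, may be sketched for illustration as follows. The one-second tick and all names are hypothetical; the embodiment only requires that "Tt" be long enough (e.g. five seconds) for the driver to complete the setting.

```python
# Illustrative sketch of the time-out logic around Step S170: while the
# elapsed time Tc does not exceed the threshold Tt, the detection signal
# keeps flowing to the first image signal processor (setting image shown);
# once Tc exceeds Tt, the destination reverts and Tc is reset to 0.
def update(tc: float, tt: float = 5.0, dt: float = 1.0):
    """Advance the elapsed time by dt; return (new_tc, destination)."""
    tc += dt
    if tc > tt:
        # setting mode is interrupted or finished; Tc is reset to 0
        return 0.0, "second_image_signal_processor"
    # setting mode continues; the setting image stays displayed
    return tc, "first_image_signal_processor"
```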
A head-up display device may display an image for use in setting an auto cruise control under the first display mode according to the principles described in the context of the fifth to seventh embodiments. Alternatively, the head-up display device may display other information under the first display mode. For instance, the head-up display device may display lane information under the first display mode, the lane information indicating a positional relationship between a vehicle and a lane along which the vehicle runs. Exemplary lane information indicating a positional relationship between a vehicle and a lane along which the vehicle runs is described in the eighth embodiment.
As described with reference to
A camera device (not shown) for acquiring image data about an image on a road surface is mounted on the vehicle VCL (c.f.
When the vehicle VCL is about to deviate from a lane and/or deviates from the lane, the head-up display device 100 may display the lane image LLI. When the vehicle VCL is about to run over a right line mark formed on a road surface, the head-up display device 100 may blink the right lane image RFI. Meanwhile, the head-up display device 100 may display the left lane image LFI at fixed luminance. The driver DRV may then recognize that the vehicle VCL is deviating rightward from the lane on the basis of the difference in display pattern between the right and left lane images RFI, LFI. The difference in display pattern between the right and left lane images RFI, LFI may be a hue difference between the right and left lane images RFI, LFI. The principles of the present embodiment are not limited to a specific indication for notifying the driver DRV of a deviation direction of the vehicle VCL. With regard to the present embodiment, the lane information is exemplified by the lane image LLI.
As described with reference to
A head-up display device may provide a driver with an image as the lane information according to the principles described in the context of the eighth embodiment, the image representing a positional relationship between a vehicle and a lane along which the vehicle runs. A designer designing the head-up display device may use various image generation techniques (e.g. a programming technique or a circuit designing technique) for displaying the image described in the context of the eighth embodiment. Techniques for displaying an image for use in providing the lane information are described in the ninth embodiment.
Like the sixth embodiment, the head-up display device 100 includes the optical processor 300. The description of the sixth embodiment is applied to the optical processor 300.
The head-up display device 100 further includes a signal processor 400A. The signal processor 400A may be a part of the projection device 200 (c.f.
As described in the context of the second embodiment, the vehicle VCL includes the windshield WSD. The optical processor 300 emits an image light (the first and/or second image lights: c.f.
The vehicle VCL includes a sensor group SSH and an interface ITG, in addition to the windshield WSD. The sensor group SSH may include various sensor elements for detecting a running state of the vehicle VCL, and various devices (e.g. a communication device) for acquiring information outside the vehicle VCL. The interface ITG receives a manual operation of the driver DRV (c.f.
Like the sixth embodiment, the sensor group SSH includes the speed detector SDT. The description of the sixth embodiment is applied to the speed detector SDT. The speed detector SDT generates speed data representing a running speed of the vehicle VCL. The speed data corresponds to the detection signal described in the context of the sixth embodiment. Therefore, the description about the detection signal in the sixth embodiment may be applied to the speed data. The speed data is output from the speed detector SDT to the signal processor 400A.
The sensor group SSH further includes a camera device CMD. The camera device CMD captures an image of a road surface to generate image data representing the road surface. The image data is output from the camera device CMD to the signal processor 400A. The camera device CMD may be a CCD camera, a CMOS camera or any other device configured to generate image data representing a road surface. The principles of the present embodiment are not limited to a specific device for use as the camera device CMD.
The interface ITG includes an operation portion MOQ. The driver DRV operates the operation portion MOQ to generate a notification signal which notifies that the auto cruising control is activated. The notification signal is output from the operation portion MOQ to the signal processor 400A. The signal processor 400A processes signals for displaying the setting state image ACI described with reference to
The operation portion MOQ may be a steering switch near a steering wheel. The steering switch may be a lever, a button, a dial or another structure configured to receive a manual operation of the driver DRV. The principles of the present embodiment are not limited to a specific structure of the operation portion MOQ.
The signal processor 400A includes a data receiver 410A and an image signal processor 430A.
The data receiver 410A includes an image data determination portion 411 and a speed data receiver 412. The image data is output from the camera device CMD to the image data determination portion 411. The image data determination portion 411 determines from the image data whether or not the vehicle VCL is running on an appropriate position in a lane. The determination techniques for determining whether or not the vehicle VCL is running on the appropriate position in the lane may rely on existing image recognition techniques. The principles of the present embodiment are not limited to a specific technique for determining whether or not the vehicle VCL is running on the appropriate position in the lane.
When a positional relationship between the vehicle VCL and a line mark formed on a road surface (a mark indicating an edge of the lane) is inappropriate, the image data determination portion 411 generates a request signal which requests display of the lane image LLI described with reference to
The speed data receiver 412 receives the speed data from the speed detector SDT. The speed data is then output from the speed data receiver 412 to the image signal processor 430A.
The image signal processor 430A includes a switching portion 431A, a storage 432A and an image signal generator 433A.
The switching portion 431A includes an output data determination portion 461 and an output destination determination portion 462. The output data determination portion 461 receives the speed data from the speed data receiver 412. When the image data determination portion 411 generates the request signal, the request signal is output from the image data determination portion 411 to the output data determination portion 461. When the driver DRV operates the operation portion MOQ to activate the auto cruising control, the notification signal is output from the operation portion MOQ to the output data determination portion 461.
When the output data determination portion 461 receives only the speed data, the output data determination portion 461 generates the speed data as output data. When the output data determination portion 461 receives the notification signal in addition to the speed data, the output data determination portion 461 generates output data, in which the display request for the setting state image ACI is added to the speed data. When the output data determination portion 461 receives the request signal in addition to the speed data, the output data determination portion 461 generates output data, in which the display request for the lane image LLI is added to the speed data. The output data is output from the output data determination portion 461 to the output destination determination portion 462. The output destination determination portion 462 determines an output destination of the output data according to the contents of the output data.
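The three cases above can be sketched in Python as follows; the class, the function name and the request labels ("ACI", "LLI") are illustrative assumptions, not names used by the device:

```python
from dataclasses import dataclass, field

@dataclass
class OutputData:
    """Hypothetical container for the output data of portion 461."""
    speed: float                                   # running speed from the speed data
    requests: list = field(default_factory=list)   # display requests added to the speed data

def determine_output_data(speed, notification=False, lane_request=False):
    """Build output data for the three cases described above:
    speed only, speed plus the ACI request, or speed plus the LLI request."""
    data = OutputData(speed=speed)
    if notification:                 # auto cruising control is activated
        data.requests.append("ACI")  # display request for the setting state image
    if lane_request:                 # positional relationship is inappropriate
        data.requests.append("LLI")  # display request for the lane image
    return data
```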
The storage 432A includes a first storage 471 and a second storage 472. The first storage 471 stores image data about the boundary image BDI (c.f.
The image signal generator 433A includes a first image signal processor 441A and a second image signal processor 442A. When the output data includes the display request for the lane image LLI, the output destination determination portion 462 determines the first image signal processor 441A as the output destination of the output data. Otherwise, the output destination determination portion 462 determines the second image signal processor 442A as the output destination of the output data.
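The routing rule described above can be sketched as follows; the request label and the return strings are illustrative assumptions:

```python
def select_output_destination(requests):
    """Route the output data: output data carrying the lane-image (LLI)
    display request goes to the first image signal processor; all other
    output data goes to the second image signal processor."""
    if "LLI" in requests:
        return "first_image_signal_processor_441A"
    return "second_image_signal_processor_442A"
```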
When the output data is output from the output destination determination portion 462 to the first image signal processor 441A, the first image signal processor 441A generates an image signal for use in displaying the image described with reference to
The first image signal processor 441A includes a lower image generator 455A and a combining portion 456A. The lower image generator 455A receives the output data from the output destination determination portion 462. The lower image generator 455A processes signals according to the output data to generate lower image data representing an image to be displayed in an area below the boundary image BDI. When the output data includes the display request for the setting state image ACI, the lower image data includes information for use in displaying the running speed image RSI and the setting state image ACI. Otherwise, the lower image data includes information for use in displaying only the running speed image RSI. The lower image data is output from the lower image generator 455A to the combining portion 456A.
The combining portion 456A includes a boundary combining portion 481 and a blink processor 482. The boundary combining portion 481 receives the lower image data from the lower image generator 455A. The boundary combining portion 481 reads image data representing the boundary image BDI from the first storage 471. The boundary combining portion 481 combines the image data representing the boundary image BDI with the lower image data. The combined image data is output from the boundary combining portion 481 to the blink processor 482. The blink processor 482 reads the image data representing the lane image LLI from the second storage 472. The blink processor 482 incorporates the image data representing the lane image LLI into the image data which is received from the boundary combining portion 481. In this case, the blink processor 482 processes signals for blinking one of the left and right lane images LFI, RFI (c.f.
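The combining pipeline described above can be sketched as follows, modelling an image as a list of named layers (an assumption made only for illustration):

```python
def combining_portion(lower_image_data, boundary_image, lane_image=None):
    """Overlay the stored boundary image BDI on the lower image data
    (boundary combining portion 481), then incorporate the lane image
    LLI when one is provided (blink processor 482)."""
    combined = list(lower_image_data) + [boundary_image]  # portion 481
    if lane_image is not None:
        combined.append(lane_image)                       # processor 482
    return combined
```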
When the output data is output from the output destination determination portion 462 to the second image signal processor 442A, the second image signal processor 442A refers to the speed data contained in the output data to generate image data for use in displaying the running speed image RSI. When the output data includes the display request for the setting state image ACI in addition to the speed data, the second image signal processor 442A generates image data for use in displaying the running speed image RSI and the setting state image ACI. The image data generated by the second image signal processor 442A is output to the modulator 320 in the optical processor 300.
The head-up display device described in the context of the ninth embodiment uses the camera device to determine whether or not a positional relationship between a vehicle and a lane is appropriate. The head-up display device may use various determination techniques for determining a position of a vehicle with respect to a lane. Exemplary determination techniques are described in the tenth embodiment.
Each of
Each of
The image data determination portion 411 sets a scanning area SCA in the imaging area CPA. The image data determination portion 411 scans the scanning area SCA to determine whether or not the line mark LNM exists in the scanning area SCA. As shown in
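A minimal sketch of scanning the area SCA is given below; the brightness-threshold criterion is an assumption for illustration only, since the embodiment permits any image recognition technique:

```python
def line_mark_in_scanning_area(image, scan_rows, scan_cols, threshold=200):
    """Scan the rectangular scanning area SCA of the captured image and
    report whether any pixel is bright enough to belong to the line
    mark LNM. image is a 2-D list of grayscale values (0..255)."""
    r0, r1 = scan_rows
    c0, c1 = scan_cols
    return any(image[r][c] >= threshold
               for r in range(r0, r1)
               for c in range(c0, c1))
```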
The scanning area SCA may be set so that the request signal is generated before the vehicle VCL deviates from the lane. If the scanning area SCA is appropriately set, the lane image LLI (c.f.
The head-up display device described in the context of the ninth embodiment may blink a lane image to notify a driver that a positional relationship between a vehicle and a lane is inappropriate. Blinking the lane image may rely on various image processing techniques. Exemplary image processing techniques for blinking a lane image are described in the eleventh embodiment.
Each of
The determination techniques described in the context of the tenth embodiment are applied to each of a pair of line marks indicating both edges of a lane. Therefore, the image data determination portion 411 (c.f.
When there is a high risk of deviation of the vehicle to the left, the request signal which is generated by the image data determination portion 411 may include request information which causes blinking of the left lane image LFI. When there is a high risk of deviation of the vehicle to the right, the request signal which is generated by the image data determination portion 411 may include request information which causes blinking of the right lane image RFI.
When the request signal includes request information which causes blinking of the left lane image LFI, the blink processor 482 (c.f.
When the request signal includes request information which causes blinking of the right lane image RFI, the blink processor 482 alternately incorporates the first image data and the second image data into image data received from the boundary combining portion 481. Consequently, the right lane image RFI displayed on the windshield WSD blinks.
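The alternation performed by the blink processor can be sketched as a frame sequence; treating each stored image data variant as an opaque frame payload is an assumption made for illustration:

```python
from itertools import cycle, islice

def blink_sequence(first_image_data, second_image_data, n_frames):
    """Alternately incorporate the first and second image data into
    successive frames; shown at the display frame rate, the lane image
    omitted from one variant appears to blink."""
    return list(islice(cycle([first_image_data, second_image_data]), n_frames))
```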
The head-up display device described in the context of the ninth embodiment may execute various processes to switch a display mode between the first and second display modes. An exemplary process for switching a display mode is described in the twelfth embodiment.
Step S210 is started, for instance, when the driver DRV (c.f.
When the driver DRV turns off the ignition switch, the head-up display device 100 finishes the process. Otherwise, Step S230 is executed.
The output data determination portion 461 acquires the speed data but does not receive the request signal. Therefore, the output data determination portion 461 generates output data without the display request for the lane image LLI (c.f.
The image data determination portion 411 (c.f.
The image data determination portion 411 generates a request signal requesting display of the lane image LLI. The request signal is output from the image data determination portion 411 to the output data determination portion 461. The output data determination portion 461 incorporates the display request for the lane image LLI into the output data. The output data is output from the output data determination portion 461 to the output destination determination portion 462. Since the output data includes the display request for the lane image LLI, the output destination determination portion 462 selects the first image signal processor 441A (c.f.
The image data determination portion 411 determines a risk of deviation of the vehicle VCL from the lane according to the determination techniques described in the context of the tenth embodiment. When the image data determination portion 411 determines that there is a high risk of deviation from the lane, Step S250 is executed. Otherwise, Step S230 is executed.
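The flow of Steps S230 to S250 can be sketched as a control loop; the callables standing in for the ignition check and the risk determination, and the mode labels, are illustrative assumptions:

```python
def display_mode_loop(ignition_on, high_risk):
    """While the ignition stays on, display under the second display
    mode and check the deviation risk each cycle (cf. Step S230); a
    high risk switches the device to the first display mode with the
    lane image (cf. Step S250)."""
    trace = []
    while ignition_on():
        if high_risk():
            trace.append("first_mode_with_LLI")  # Step S250
        else:
            trace.append("second_mode")          # Step S230
    return trace
```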
As described in the context of the third and fourth embodiments, the head-up display device may provide a driver with navigation information for navigating to a destination under the second display mode. A head-up display device which provides a driver with navigation information under the second display mode is described in the thirteenth embodiment.
Like the ninth embodiment, the head-up display device 100 includes the optical processor 300. The description of the ninth embodiment is applied to the optical processor 300.
The head-up display device 100 further includes a signal processor 400B. The signal processor 400B may be a part of the projection device 200. Alternatively, the signal processor 400B may be a circuit provided independently of the projection device 200. In this case, the signal processor 400B may be a part of a circuit for processing various signals in the vehicle VCL (c.f.
As described in the context of the second embodiment, the vehicle VCL includes the windshield WSD. The optical processor 300 emits the image light (the first and/or second image lights, c.f.
The vehicle VCL includes a sensor group SSI and an interface ITH, in addition to the windshield WSD. The sensor group SSI may include various sensor elements for detecting a running state of the vehicle VCL, and various devices (e.g. a communication device) for acquiring information outside the vehicle VCL. The interface ITH receives a manual operation of the driver DRV (c.f.
Like the ninth embodiment, the sensor group SSI includes the speed detector SDT and the camera device CMD. The description of the ninth embodiment is applied to the speed detector SDT and the camera device CMD.
The sensor group SSI further includes a navigation system NVS. The navigation system NVS is operated on the basis of a global positioning system (GPS) and generates navigation data. The navigation system NVS may be a commercially available device. The principles of the present embodiment are not limited to a specific device for use as the navigation system NVS.
The navigation data may include information about a position of the vehicle VCL (e.g. a name of a road on which the vehicle VCL runs or a legal speed determined for the road on which the vehicle VCL runs), or information about a positional relationship between the vehicle VCL and a destination set by the driver DRV (e.g. route information for helping arrival of the vehicle VCL at the destination, or a distance between the vehicle VCL and the destination). The principles of the present embodiment are not limited to specific contents of the navigation data.
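The navigation data described above might be modelled as follows; the field names are assumptions made for illustration, not an API defined by the embodiment:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class NavigationData:
    """Hypothetical shape of the navigation data."""
    road_name: Optional[str] = None       # name of the road being driven
    legal_speed: Optional[int] = None     # legal speed determined for that road
    route: Optional[List[str]] = None     # route information to the destination
    distance_to_destination: Optional[float] = None  # remaining distance

def has_destination(nav):
    """Route information is present only after the driver sets a destination."""
    return nav.route is not None
```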
Like the ninth embodiment, the interface ITH includes the operation portion MOQ. The description of the ninth embodiment is applied to the operation portion MOQ.
The interface ITH further includes an operation portion MOR which is operated by the driver DRV in order to activate the navigation system NVS. When the driver DRV operates the operation portion MOR to activate the navigation system NVS, the navigation system NVS generates the navigation data.
The operation portion MOR may be a touch panel (not shown) provided in the navigation system NVS. The principles of the present embodiment are not limited to a specific structure of the operation portion MOR.
The signal processor 400B includes a data receiver 410B and an image signal processor 430B.
Like the ninth embodiment, the data receiver 410B includes the image data determination portion 411 and the speed data receiver 412. The description of the ninth embodiment is applied to the image data determination portion 411 and the speed data receiver 412.
The data receiver 410B further includes a navigation data receiver 413. The navigation data receiver 413 receives the navigation data from the navigation system NVS. The navigation data is then output from the navigation data receiver 413 to the image signal processor 430B.
Like the ninth embodiment, the image signal processor 430B includes the storage 432A. The description of the ninth embodiment is applied to the storage 432A.
The image signal processor 430B further includes a switching portion 431B and an image signal generator 433B.
The switching portion 431B includes an output data determination portion 461B and an output destination determination portion 462B. Like the ninth embodiment, the output data determination portion 461B receives the speed data, the request signal and the notification signal from the speed data receiver 412, the image data determination portion 411 and the operation portion MOQ. The description of the ninth embodiment is applied to the speed data, the request signal and the notification signal.
When the driver DRV operates the operation portion MOR to activate the navigation system NVS, the output data determination portion 461B may receive the navigation data from the navigation data receiver 413. The output data determination portion 461B does not receive the navigation data unless the navigation system NVS is activated.
If the output data determination portion 461B receives neither the request signal nor the navigation data, the image described with reference to
When the navigation system NVS is operated without setting a destination, the output data determination portion 461B may receive the navigation data including only information about a legal speed determined for a road on which the vehicle VCL runs. The image described with reference to
When the driver DRV sets a destination for the navigation system NVS, the output data determination portion 461B may receive the navigation data including information for navigating to the destination. The image described with reference to
When the output data determination portion 461B receives both of the request signal and the navigation data, the output data determination portion 461B incorporates the display request for the lane image LLI (c.f.
Like the ninth embodiment, the image signal generator 433B includes the first image signal processor 441A. The description of the ninth embodiment is applied to the first image signal processor 441A.
When the output data includes the display request for the lane image LLI, the output data determination portion 461B selects the first image signal processor 441A as the output destination of the output data like the ninth embodiment. Therefore, the description of the ninth embodiment is applied to the output data determination portion 461B which selects the first image signal processor 441A as the output destination of the output data.
The image signal generator 433B further includes a second image signal processor 442B. When the navigation data incorporated in the output data includes information about a legal speed determined for a road on which the vehicle VCL runs, the second image signal processor 442B processes signals for displaying the legal speed image LSI (c.f.
As described in the context of the fourth embodiment, the boundary image may be displayed only under the first display mode. Alternatively, the boundary image may be displayed not only under the first display mode but also under the second display mode. When the boundary image is displayed under both of the display modes, a driver may look for information on the basis of the boundary image. As described in the context of the fifth and eighth embodiments, the first display mode is used as a display mode in which the boundary image gives a visual impression like a hood of a vehicle. In this case, it is preferable that the boundary image is more highlighted under the first display mode than under the second display mode. Therefore, a designer designing a head-up display device may provide a difference in display style of the boundary image between the first and second display modes. For instance, a head-up display device may use a boundary light having higher light energy to display the boundary image under the first display mode than under the second display mode. Alternatively, the head-up display device may use a difference in another display style (e.g. a thickness of the boundary image, or a line pattern of the boundary image (e.g. a straight line or a chain line)) to provide a difference of the boundary image between the first and second display modes. Boundary images having a difference in light energy are described in the fourteenth embodiment.
The boundary image BDI is displayed on the windshield WSD under the first display mode. The boundary image BD2 is displayed on the windshield WSD under the second display mode.
The projection device 200 (c.f.
As described in the context of the fourteenth embodiment, a difference in energy of a boundary light which forms a boundary image may be provided between the first and second display modes. The difference in energy of the boundary light may be provided by adjustment to an output of a light source which emits the boundary light. Alternatively, the difference in energy of the boundary light may be provided by modulation for a light which is emitted from a light source. A head-up display device which changes power from a light source between the first and second display modes is described in the fifteenth embodiment.
The head-up display device 100 includes an optical processor 300C and a signal processor 400C. Like the thirteenth embodiment, the optical processor 300C includes the modulator 320 and the emitting portion 330. The description of the thirteenth embodiment is applied to the modulator 320 and the emitting portion 330.
The optical processor 300C further includes a light source portion 310C. The light source portion 310C emits a boundary light having high energy under the first display mode. The light source portion 310C emits a boundary light having low energy under the second display mode.
Like the thirteenth embodiment, the signal processor 400C includes the data receiver 410B. The description of the thirteenth embodiment is applied to the data receiver 410B.
The signal processor 400C further includes an image signal processor 430C. Like the thirteenth embodiment, the image signal processor 430C includes the storage 432A. The description of the thirteenth embodiment is applied to the storage 432A.
The image signal processor 430C includes a switching portion 431C and an image signal generator 433C. Like the thirteenth embodiment, the switching portion 431C includes the output data determination portion 461B. The description of the thirteenth embodiment is applied to the output data determination portion 461B.
The switching portion 431C further includes an output destination determination portion 462C. Like the thirteenth embodiment, the output destination determination portion 462C determines the output destination of the output data. Therefore, the description of the thirteenth embodiment is applied to the output destination determination portion 462C.
The output destination determination portion 462C not only determines the output destination of the output data but also generates a luminance signal designating luminance of the boundary light to be emitted from the light source portion 310C. When the output data from the output data determination portion 461B includes the display request for the lane image LLI (c.f.
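The luminance designation can be sketched as follows; the concrete luminance values are illustrative assumptions, since the embodiment only requires the boundary light under the first display mode to be brighter:

```python
def luminance_signal(requests, high=1.0, low=0.4):
    """Designate a higher boundary-light luminance when the lane-image
    display request is present (first display mode), and a lower one
    otherwise (second display mode)."""
    return high if "LLI" in requests else low
```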
Like the thirteenth embodiment, the image signal generator 433C includes the first image signal processor 441A. The description of the thirteenth embodiment is applied to the first image signal processor 441A.
The image signal generator 433C further includes a second image signal processor 442C. Like the thirteenth embodiment, the second image signal processor 442C processes signals for displaying an image on the upper and lower display areas UDA, LDA (c.f.
The second image signal processor 442C reads image data from the first storage 471, the image data representing the boundary image BD2 (c.f.
The head-up display device described in the context of the fifteenth embodiment may display boundary images different in luminance between the first and second display modes. Additionally or alternatively, the head-up display device may display boundary images different in thickness between the first and second display modes. A head-up display device which displays boundary images different in thickness between the first and second display modes is described in the sixteenth embodiment.
The first storage 471 stores the first boundary data and the second boundary data. The boundary combining portion 481 (c.f.
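Selecting between the two stored boundary data sets may be sketched as follows; the image names follow the fourteenth embodiment, while the pixel thickness values are illustrative assumptions:

```python
# Hypothetical stored boundary data: the same boundary drawn with two
# different thicknesses, as the sixteenth embodiment describes.
FIRST_BOUNDARY_DATA = {"image": "BDI", "thickness_px": 4}   # first display mode
SECOND_BOUNDARY_DATA = {"image": "BD2", "thickness_px": 1}  # second display mode

def boundary_data_for_mode(first_mode):
    """Read the thicker boundary data under the first display mode and
    the thinner boundary data under the second display mode."""
    return FIRST_BOUNDARY_DATA if first_mode else SECOND_BOUNDARY_DATA
```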
The principles of the aforementioned various embodiments may be combined to meet requirements for vehicles.
The exemplary head-up display devices described in the context of the aforementioned various embodiments mainly include the following features.
A head-up display device according to one aspect of the aforementioned embodiments is mounted on a vehicle. The head-up display device includes a projection device which emits image light onto a reflective surface including a first area, and a second area below the first area. The image light includes first image light representing first information including information about an external factor outside the vehicle, and second image light representing second information about the vehicle itself. The projection device emits the first image light onto the first area and the second image light onto the second area.
According to the aforementioned configuration, the first image light is emitted onto the first area whereas the second image light is emitted onto the second area below the first area. Since a driver trying to acquire information about an external factor outside a vehicle directs his/her line of sight toward an upper area in many cases, the driver may intuitively acquire the first information including information about the external factor outside the vehicle. On the other hand, a driver trying to acquire information about a vehicle driven by the driver directs his/her line of sight toward a lower area in many cases. Therefore, the driver may intuitively acquire the second information about the vehicle itself.
With regard to the aforementioned configuration, the image light may include a boundary light representing a boundary between the first and second areas. The projection device may switch a display mode between a first display mode, under which the boundary light is emitted, and a second display mode under which at least one of the first and second image lights is emitted without emission of the boundary light.
According to the aforementioned configuration, since the projection device emits the boundary light under the first display mode, the first information is displayed above the boundary light whereas the second information is displayed below the boundary light. Since a positional relationship among the first information, the boundary light and the second information is similar to a positional relationship among an external factor in a field of view of a driver, a hood and an interior of the vehicle, the driver may intuitively acquire the first information and the second information. Since the projection device switches the display mode from the first display mode to the second display mode, the boundary light is not emitted unnecessarily. This results in a reduction in electric power consumption of the projection device.
With regard to the aforementioned configuration, the image light may include first boundary light representing a boundary between the first and second areas, and second boundary light representing the boundary which is different in display style from the first boundary light. The projection device may switch a display mode between a first display mode, under which the first boundary light is emitted, and a second display mode under which the second boundary light is emitted.
According to the aforementioned configuration, since the second boundary light is different in display style from the first boundary light, a driver may visually recognize a difference in display style between the first and second boundary lights and know that the display mode is switched.
With regard to the aforementioned configuration, the first boundary light may be higher in light intensity than the second boundary light.
According to the aforementioned configuration, since the first boundary light is higher in light intensity than the second boundary light, the head-up display device may emphasize the boundary between the first and second areas under the first display mode.
With regard to the aforementioned configuration, the first boundary light may draw a boundary thicker than the second boundary light.
According to the aforementioned configuration, since the first boundary light draws a boundary thicker than the second boundary light, the head-up display device may emphasize the boundary between the first and second areas under the first display mode.
With regard to the aforementioned configuration, the first information may include inter-vehicle distance information about a distance setting between the vehicle and a preceding vehicle which is targeted in an auto cruise control. The projection device may emit the first image light representing the inter-vehicle distance information under the first display mode.
According to the aforementioned configuration, since a preceding vehicle is a factor existing outside the vehicle driven by a driver, the driver is likely to direct his/her line of sight toward an upper area when the driver tries to acquire information about the preceding vehicle. Since the first image light representing inter-vehicle distance information about a distance setting between the vehicle and the preceding vehicle which is targeted in an auto cruise control is emitted onto the first area above the second area, the driver may intuitively acquire the inter-vehicle distance information.
With regard to the aforementioned configuration, the first image light may represent a symbol image indicating the preceding vehicle as the inter-vehicle distance information. When a distance between the vehicle and the preceding vehicle is set to a first value, the symbol image may be displayed so that the symbol image is distant from the boundary by a first length. When the distance between the vehicle and the preceding vehicle is set to a second value larger than the first value, the symbol image may be displayed so that the symbol image is distant from the boundary by a second length longer than the first length.
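The mapping from the distance setting to the on-screen offset can be sketched as follows; the linear scale factor is an illustrative assumption, since the configuration only requires a larger setting to yield a longer displayed length:

```python
def symbol_offset_from_boundary(distance_setting, scale=0.5):
    """Map the inter-vehicle distance setting to the displayed length
    between the boundary and the preceding-vehicle symbol image: a
    second value larger than a first value yields a second length
    longer than the first length."""
    return distance_setting * scale
```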
According to the aforementioned configuration, a position of the symbol image from the boundary is changed in response to a distance setting between the vehicle and the preceding vehicle. Since a positional relationship between the boundary and the symbol image is similar to a positional relationship between a hood and a preceding vehicle in a field of view of a driver, the driver may intuitively acquire the inter-vehicle distance information.
With regard to the aforementioned configuration, the first information may include lane information for use in notifying a positional relationship between the vehicle and a lane along which the vehicle runs. The projection device may emit the first image light representing the lane information under the first display mode.
According to the aforementioned configuration, since a lane is a factor existing outside the vehicle driven by a driver, the driver is likely to direct his/her line of sight toward an upper area when the driver tries to acquire information about the lane. Since the first image light representing lane information for use in notifying a positional relationship between the vehicle and the lane is emitted onto the first area above the second area, the driver may intuitively acquire the lane information.
With regard to the aforementioned configuration, the first image light may represent a straight line image as the lane information, the straight line image extending upwardly from the boundary.
According to the aforementioned configuration, since the straight line image extends upwardly from the boundary, a positional relationship between the straight line image and the boundary is similar to a positional relationship between a hood in a field of view of a driver and a line formed on a road surface. Accordingly, the driver may intuitively recognize whether or not the vehicle deviates from the lane.
With regard to the aforementioned configuration, the first information may include at least one selected from a group consisting of navigation information for navigating to a destination, legal speed information about a legal speed determined for a lane along which the vehicle runs, inter-vehicle distance information about a distance setting between the vehicle and a preceding vehicle which is targeted in an auto cruise control, and lane information for use in notifying a positional relationship between the vehicle and the lane.
According to the aforementioned configuration, since a destination, a legal speed, a preceding vehicle and a lane are factors existing outside the vehicle driven by a driver, the driver is likely to direct his/her line of sight toward an upper area when the driver tries to acquire information about these factors. Since the first image light is emitted onto the first area above the second area, the driver may intuitively acquire information about a destination, a legal speed, a preceding vehicle and a lane.
With regard to the aforementioned configuration, the second information may include at least one selected from a group consisting of running speed information about a running speed of the vehicle, and setting speed information about a running speed setting of the vehicle in the auto cruise control.
According to the aforementioned configuration, since a running speed of a vehicle is associated with a vehicle itself, the driver is likely to direct his/her line of sight toward a lower area when the driver tries to acquire running speed information or setting speed information. Since the second image light is emitted onto the second area below the first area, the driver may intuitively acquire the information about a running speed of the vehicle.
The principles of the aforementioned embodiments are advantageously used in designing various vehicles.
Number | Date | Country | Kind
---|---|---|---
2015-062987 | Mar 2015 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2016/058192 | 3/15/2016 | WO | 00