The present invention relates to a head-up display system.
Conventionally, as a head-up display system, for example, Japanese Patent Application Laid-open No. 2019-038451 describes a head-up display that emits display light representing a display image to a transmission and reflection unit of a vehicle to allow a viewer to visually recognize a virtual image of the display image. This head-up display controls a display, which displays the display image, so that an emphasized image is superimposed and displayed on an object included in the landscape in front of the vehicle for at least a certain period of time from when the vehicle is started until it starts traveling. As a result, the head-up display demonstrates the superimposed display of the emphasized image before the vehicle starts traveling, thereby allowing the viewer to recognize the presence and significance of the function of superimposing and displaying the emphasized image on the object.
Meanwhile, in the head-up display described in Japanese Patent Application Laid-open No. 2019-038451, when the emphasized image is superimposed and displayed on an object, the emphasized image may be visually recognized as a double image due to convergence if the viewer focuses on the object.
Therefore, the present invention has been made in view of the foregoing, and an object thereof is to provide a head-up display system capable of appropriately performing superimposed display.
In order to achieve the above-mentioned object, a head-up display system according to one aspect of the present invention includes: a display unit that displays a virtual image by emitting display light including a display image toward a reflecting member, the reflecting member being provided in a vehicle and having transparency; an object detection unit that detects an object included in a landscape in front of the vehicle; a line-of-sight detection unit that detects a line of sight of a driver of the vehicle; a distance measuring unit that measures an actual distance from an eye point of the driver to a specific object among the objects detected by the object detection unit, the specific object being located ahead of the line of sight of the driver detected by the line-of-sight detection unit and matching the driver's focus; and a control unit that controls the display unit to superimpose and display the virtual image on the specific object, wherein the control unit controls the display unit to execute double image prevention processing of adjusting a virtual image display distance, which is a distance from the eye point of the driver to the virtual image, to the actual distance measured by the distance measuring unit.
The above and other objects, features, advantages and technical and industrial significance of this invention will be better understood by reading the following detailed description of presently preferred embodiments of the invention, when considered in connection with the accompanying drawings.
A mode (embodiment) for carrying out the present invention will be described in detail with reference to the drawings. The present invention is not limited by the contents described in the following embodiment. In addition, the constituent elements described below include those that can be easily assumed by those skilled in the art and those that are substantially the same.
Furthermore, the configurations described below can be appropriately combined. In addition, various omissions, substitutions, or changes in the configuration can be made without departing from the gist of the present invention.
An AR-HUD system 1 according to an embodiment will be described with reference to the drawings. The AR-HUD system 1 is an example of a head-up display system. The AR-HUD system 1 emits display light including a display image P toward a windshield W as a reflecting member provided in a vehicle and having transparency, and superimposes and displays a virtual image S reflected by the windshield W toward an eye point EP side on an object OJ. Here, the eye point EP is a position assumed in advance as the position of the driver's eyes or is the actual position of the driver's eyes. For example, in a case where the eye point EP is the actual position of the driver's eyes, the position of the driver's eyes is detected by a driver monitor 20 described later.
As illustrated in the drawings, the AR-HUD system 1 includes an object detection sensor 10, a driver monitor 20, a distance output device 30, and an AR-HUD device 40.
The object detection sensor 10 is provided in the vehicle and detects the object OJ included in the landscape in front of the vehicle. Here, the object OJ is, for example, an object to be recognized by the driver, such as a person, another vehicle, or a sign. The object detection sensor 10 includes, for example, a stereo camera, and captures an image of the landscape in front of the vehicle. Then, the object detection sensor 10 performs image analysis on the captured image by a known image processing method such as a pattern matching method, and detects a plurality of objects OJ such as persons, other vehicles, and signs. Furthermore, the object detection sensor 10 measures an actual distance D1 from the eye point EP of the driver to each detected object OJ. That is, the object detection sensor 10 measures the actual distance D1, which is the distance from the eye point EP of the driver to the object OJ. The object detection sensor 10 outputs, to the distance output device 30, object detection information indicating the position (XY coordinates) of each detected object OJ, the size of the object OJ, and the actual distance D1 from the eye point EP of the driver to the object OJ.
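The embodiment does not detail how the stereo camera computes the actual distance D1; a common approach is triangulation from the disparity between the left and right camera images. The following Python sketch illustrates that computation under assumed, hypothetical calibration values (focal length in pixels, stereo baseline, and a fixed camera-to-eye-point offset); it is an illustration, not part of the disclosure.

```python
# Illustrative only: stereo triangulation with hypothetical calibration values.
def stereo_distance_m(disparity_px: float,
                      focal_length_px: float = 1200.0,  # assumed focal length
                      baseline_m: float = 0.12) -> float:  # assumed baseline
    """Distance from the stereo camera to an object: Z = f * B / disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px


def actual_distance_d1(disparity_px: float,
                       camera_to_eyepoint_m: float = 0.4) -> float:
    """Approximate D1 by shifting the camera distance to the eye point EP."""
    return stereo_distance_m(disparity_px) + camera_to_eyepoint_m


# Example: a 4.8-pixel disparity corresponds to 30 m from the camera,
# i.e. roughly 30.4 m from the eye point EP.
print(actual_distance_d1(4.8))
```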
Next, the driver monitor 20 will be described. The driver monitor 20 is provided in the vehicle, monitors the driver, and includes a line-of-sight detection unit 21. The line-of-sight detection unit 21 detects a line of sight En of the driver. For example, the line-of-sight detection unit 21 is disposed with its camera lens facing the driver, and detects the line of sight En of the driver by a known line-of-sight detection method. For example, the line-of-sight detection unit 21 detects the line of sight En of the driver on the basis of the position of the pupil of the eyeball in a face image of the driver. In this case, the line-of-sight detection unit 21 compares a predetermined image of the eye with the face image of the driver to detect the position of the pupil of the driver, and detects the line of sight En of the driver from the detected pupil position. The line-of-sight detection unit 21 outputs line-of-sight information (XY coordinates) representing the detected line of sight En to the distance output device 30.
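The mapping from the detected pupil position to line-of-sight coordinates is likewise left to known methods. As a rough sketch, assuming a simple two-point calibration and a linear relation per axis (all coordinates hypothetical), the line of sight En could be estimated as follows:

```python
# Illustrative only: linear two-point calibration from pupil position (px)
# to line-of-sight coordinates En (XY); all values are hypothetical.
def gaze_xy(pupil_xy, calib):
    """Interpolate line-of-sight XY from two (pupil, gaze) calibration pairs."""
    (p0, g0), (p1, g1) = calib

    def lerp(p, a0, a1, b0, b1):
        t = (p - a0) / (a1 - a0)
        return b0 + t * (b1 - b0)

    x = lerp(pupil_xy[0], p0[0], p1[0], g0[0], g1[0])
    y = lerp(pupil_xy[1], p0[1], p1[1], g0[1], g1[1])
    return (x, y)


# Example: a pupil midway between the calibration points maps to (0, 0),
# i.e. the driver looking straight ahead.
calibration = (((310, 230), (-10.0, -5.0)), ((350, 250), (10.0, 5.0)))
print(gaze_xy((330, 240), calibration))  # -> (0.0, 0.0)
```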
Next, the distance output device 30 will be described. The distance output device 30 outputs the actual distance D1 from the eye point EP to the object OJ on the basis of the object detection information output from the object detection sensor 10 and the line-of-sight information output from the line-of-sight detection unit 21 of the driver monitor 20. Specifically, the distance output device 30 detects, from among the plurality of objects OJ and on the basis of the line-of-sight information (XY coordinates) indicating the line of sight En of the driver, a specific object OJ that is located ahead of the line of sight En of the driver and matches the driver's focus. That is, the distance output device 30 detects the specific object OJ actually visually recognized by the driver, namely the object OJ at a position (XY coordinates) that matches the position (XY coordinates) of the line of sight En of the driver. Then, the distance output device 30 outputs, to the AR-HUD device 40, object information indicating the position (XY coordinates) of the specific object OJ and the actual distance D1 from the eye point EP to the specific object OJ.
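Stated as code, the selection of the specific object OJ reduces to finding the detected object whose image region contains the line-of-sight position. The sketch below assumes hypothetical data structures for the object detection information:

```python
# Illustrative only: pick the specific object OJ whose region contains the
# gaze point, and report its measured distance D1.  Data layout is assumed.
from dataclasses import dataclass


@dataclass
class DetectedObject:
    x: float           # region center X
    y: float           # region center Y
    width: float       # region width
    height: float      # region height
    distance_m: float  # actual distance D1 measured by the sensor


def specific_object(gaze, objects):
    """Return the object whose region contains the gaze point, else None."""
    gx, gy = gaze
    for obj in objects:
        if (abs(gx - obj.x) <= obj.width / 2
                and abs(gy - obj.y) <= obj.height / 2):
            return obj
    return None


objs = [DetectedObject(5.0, 0.0, 2.0, 2.0, 42.0),   # e.g. another vehicle
        DetectedObject(-3.0, 1.0, 1.0, 2.0, 18.0)]  # e.g. a pedestrian
hit = specific_object((-3.2, 0.5), objs)
print(hit.distance_m if hit else "no match")  # -> 18.0
```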
Next, the AR-HUD device 40 will be described. The AR-HUD device 40 emits display light including the display image P toward the windshield W, and superimposes and displays the virtual image S reflected by the windshield W toward the eye point EP on the object OJ. The AR-HUD device 40 includes a reflection unit 41, a display unit 42, and a control unit 43. The reflection unit 41, the display unit 42, and the control unit 43 are communicably connected to each other.
The reflection unit 41 reflects the display light emitted from the display unit 42 toward the windshield W. The reflection unit 41 includes a first intermediate mirror 411, a second intermediate mirror 412, and a last mirror 413. The first intermediate mirror 411 totally reflects the display light emitted from the display unit 42 toward the second intermediate mirror 412. The second intermediate mirror 412 totally reflects the display light emitted from the display unit 42 and reflected by the first intermediate mirror 411 toward the last mirror 413. The last mirror 413 totally reflects the display light emitted from the display unit 42 and reflected by the first intermediate mirror 411 and the second intermediate mirror 412 toward the windshield W.
The display unit 42 emits display light including the display image P, and emits the display light to the windshield W via the reflection unit 41. The display unit 42 includes a display 421. For example, the display 421 includes a lenticular lens 421b, and displays the display image P as a three-dimensional image including a left-eye image LP visually recognized by a left eye LE of the driver and a right-eye image RP visually recognized by a right eye RE of the driver.
The control unit 43 controls the display unit 42 to superimpose and display the virtual image S on the specific object OJ. At this time, the control unit 43 executes double image prevention processing of adjusting a virtual image display distance D2 from the eye point EP to the virtual image S to the actual distance D1 from the eye point EP to the specific object OJ on the basis of the object information output from the distance output device 30. Typically, in a case where the double image prevention processing is executed, the control unit 43 matches the virtual image display distance D2 from the eye point EP to the virtual image S with the actual distance D1 from the eye point EP to the specific object OJ. Note that the control unit 43 does not need to exactly match the virtual image display distance D2 with the actual distance D1 as long as the double image due to convergence can be prevented, and the actual distance D1 and the virtual image display distance D2 may be slightly different. The above-described virtual image display distance D2 is a distance from the eye point EP to a display position at which the virtual image S is displayed. That is, the virtual image display distance D2 is a straight-line distance connecting the eye point EP and the position at which the virtual image S is displayed. In other words, the virtual image display distance D2 is a distance from the eye point EP to an image forming position where the virtual image S is formed.
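The distance-matching rule above can be summarized in a few lines: D2 tracks D1, but an existing D2 is kept if it already lies within a tolerance small enough that no double image is perceived. The tolerance value in this Python sketch is a hypothetical placeholder; the patent gives no numeric figure.

```python
# Illustrative only: D2 follows D1, with an assumed perceptual tolerance.
DOUBLE_IMAGE_TOLERANCE_M = 0.5  # hypothetical threshold, not from the patent


def target_display_distance(d1_m: float, current_d2_m: float) -> float:
    """Return the D2 to use: keep the current D2 if it is close enough to D1."""
    if abs(current_d2_m - d1_m) <= DOUBLE_IMAGE_TOLERANCE_M:
        return current_d2_m  # close enough; no readjustment needed
    return d1_m              # otherwise match D2 to the measured D1
```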
The control unit 43 controls the display unit 42 to change the parallax between the left-eye image LP and the right-eye image RP, thereby executing the double image prevention processing of adjusting the virtual image display distance D2 to the actual distance D1. In the double image prevention processing, for example, the control unit 43 adjusts the inclination of the reflection surface of each mirror of the reflection unit 41 and changes the optical path of the display light including the left-eye image LP and the optical path of the display light including the right-eye image RP emitted from the lenticular lens 421b, thereby changing the parallax between the left-eye image LP and the right-eye image RP.
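The geometry behind this parallax adjustment is standard stereoscopy: with the left-eye and right-eye images presented on an image plane at distance v from the eye point and an interpupillary distance e, an uncrossed horizontal offset p between LP and RP makes the fused virtual image appear at distance d2 satisfying p = e(1 − v/d2). A sketch with illustrative numbers (neither e nor v is specified in the patent):

```python
# Illustrative only: on-plane parallax needed to fuse the virtual image S at
# distance d2_m, from similar triangles; numeric values are assumptions.
def required_parallax_m(d2_m: float, image_plane_m: float,
                        ipd_m: float = 0.065) -> float:
    """Parallax p = e * (1 - v / d2) placing the fused image at d2_m."""
    return ipd_m * (1.0 - image_plane_m / d2_m)


# Example: image plane at 3 m, specific object OJ (and thus D2) at 30 m.
print(required_parallax_m(30.0, 3.0))  # -> 0.0585 m of uncrossed parallax
```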
Next, an operation example of the AR-HUD system 1 will be described with reference to a flowchart. First, the AR-HUD system 1 detects the line of sight En of the driver (Step S1). For example, the AR-HUD system 1 causes the line-of-sight detection unit 21 of the driver monitor 20 to detect the line of sight En of the driver.
Next, the AR-HUD system 1 determines whether or not the detected line of sight En of the driver is within a display range (Step S2). For example, the AR-HUD system 1 causes the driver monitor 20 to determine whether or not the line of sight En of the driver is included in a predetermined display range in which the virtual image S is displayed. If the line of sight En of the driver is within the display range (Step S2; Yes), the AR-HUD system 1 detects a plurality of objects OJ included in the landscape in front of the vehicle (Step S3). For example, the AR-HUD system 1 causes the object detection sensor 10 to perform image analysis on a captured image by a known image processing method such as a pattern matching method, and detects a plurality of objects OJ such as persons, other vehicles, and signs. Then, the object detection sensor 10 measures the actual distance D1 from the eye point EP of the driver to each detected object OJ.
Next, the AR-HUD system 1 determines whether or not the line of sight En of the driver and the object OJ match (Step S4). For example, the AR-HUD system 1 causes the distance output device 30 to detect the specific object OJ from among the plurality of objects OJ on the basis of the line-of-sight information (XY coordinates) indicating the line of sight En of the driver. If the line of sight En of the driver and the object OJ match (Step S4; Yes), the AR-HUD system 1 outputs the actual distance D1 from the eye point EP of the driver to the matched specific object OJ (Step S5).
Next, the AR-HUD system 1 adjusts the virtual image display distance D2 from the eye point EP to the virtual image S to the actual distance D1 from the eye point EP to the specific object OJ, and superimposes and displays the virtual image S on the specific object OJ (Step S6). For example, the AR-HUD system 1 causes the control unit 43 to control the display unit 42 to change the parallax between the left-eye image LP and the right-eye image RP, thereby adjusting the virtual image display distance D2 to the actual distance D1 and superimposing and displaying the virtual image S on the specific object OJ.
Note that in Step S2 described above, if the detected line of sight En of the driver is not within the display range (Step S2; No), the AR-HUD system 1 ends the processing. In Step S4 described above, if the line of sight En of the driver and the object OJ do not match (Step S4; No), the AR-HUD system 1 ends the processing.
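Putting Steps S1 to S6 together, the processing of one cycle can be sketched as below, reusing the hypothetical helpers from the earlier sketches (gaze_xy, specific_object, target_display_distance, required_parallax_m); the display range bounds and the rendering call are likewise assumed placeholders.

```python
# Illustrative only: one processing cycle covering Steps S1-S6.
DISPLAY_RANGE = (-15.0, 15.0, -5.0, 5.0)  # assumed XY bounds of the range


def in_display_range(gaze) -> bool:
    x0, x1, y0, y1 = DISPLAY_RANGE
    return x0 <= gaze[0] <= x1 and y0 <= gaze[1] <= y1


def render_superimposed(obj, parallax_m):
    # Placeholder for driving the display unit 42; prints instead of rendering.
    print(f"superimpose on {obj} with parallax {parallax_m:.4f} m")


def hud_cycle(objects, pupil_xy, calibration, current_d2_m, image_plane_m=3.0):
    gaze = gaze_xy(pupil_xy, calibration)            # Step S1
    if not in_display_range(gaze):                   # Step S2; No -> end
        return current_d2_m
    hit = specific_object(gaze, objects)             # Steps S3 and S4
    if hit is None:                                  # Step S4; No -> end
        return current_d2_m
    d1 = hit.distance_m                              # Step S5
    d2 = target_display_distance(d1, current_d2_m)   # Step S6
    render_superimposed(hit, required_parallax_m(d2, image_plane_m))
    return d2
```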
As described above, the AR-HUD system 1 according to the embodiment includes the display unit 42, the line-of-sight detection unit 21, and the control unit 43. The display unit 42 displays the virtual image S by emitting display light including the display image P toward the windshield W provided in the vehicle and having transparency. The object detection sensor 10 detects a plurality of objects OJ included in the landscape in front of the vehicle. The line-of-sight detection unit 21 detects the line of sight En of the driver of the vehicle. The distance output device 30 identifies (measures) the actual distance D1 from the eye point EP of the driver to a specific object OJ that is located ahead of the line of sight En of the driver detected by the line-of-sight detection unit 21 and matches the driver's focus, from among the plurality of objects OJ detected by the object detection sensor 10. The control unit 43 controls the display unit 42 to superimpose and display the virtual image S on the specific object OJ, and executes the double image prevention processing of adjusting the virtual image display distance D2 from the eye point EP of the driver to the virtual image S to the actual distance D1.
With this configuration, the AR-HUD system 1 can prevent the virtual image S from appearing as a double image due to convergence when the driver focuses on the specific object OJ. In particular, when displaying the virtual image S indicating emergency information, the AR-HUD system 1 can cause the driver to clearly view the virtual image S. Thus, the AR-HUD system 1 can appropriately superimpose and display the virtual image S.
In the AR-HUD system 1, the display image P is a three-dimensional image including the left-eye image LP visually recognized by the left eye LE of the driver and the right-eye image RP visually recognized by the right eye RE of the driver. The control unit 43 controls the display unit 42 to change the parallax between the left-eye image LP and the right-eye image RP, thereby executing the double image prevention processing of adjusting the virtual image display distance D2 to the actual distance D1. With this configuration, the AR-HUD system 1 can prevent the three-dimensional virtual image S from appearing as a double image due to convergence when the driver focuses on the specific object OJ. As a result, the AR-HUD system 1 can appropriately superimpose and display the three-dimensional virtual image S.
Next, a modification of the embodiment will be described. Note that in the modification, constituent elements equivalent to those in the embodiment are denoted by the same reference numerals, and a detailed description thereof will be omitted. In the AR-HUD system 1, the example in which the display image P is a three-dimensional image has been described, but the display image P is not limited thereto, and may be, for example, a two-dimensional image visually recognized by both the left eye LE and the right eye RE of the driver. In this case, the control unit 43 controls the display unit 42 to change the optical path length of the display light, thereby executing the double image prevention processing of adjusting the virtual image display distance D2 to the actual distance D1. For example, the display unit 42 includes an optical path length adjusting unit including a plurality of folding mirrors. The optical path length adjusting unit selects the folding mirror that reflects the display light on the basis of the actual distance D1 to change the optical path length of the display light to the windshield W, and thereby adjusts the virtual image display distance D2 for displaying the virtual image S including the two-dimensional image to the actual distance D1. As a result, the AR-HUD system 1 according to the modification can prevent the two-dimensional virtual image S from appearing as a double image due to convergence when the driver focuses on the specific object OJ, and can therefore appropriately superimpose and display the two-dimensional virtual image S.
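As a sketch of this modification, the optical path length adjusting unit can be modeled as a table of selectable folding mirrors, each yielding a different virtual image display distance; the mirror whose resulting distance best approximates D1 is selected. The mirror names and distances below are hypothetical:

```python
# Illustrative only: assumed table of folding mirrors and the virtual image
# display distance (m) that each optical path length produces.
FOLDING_MIRRORS = {"mirror_near": 5.0, "mirror_mid": 15.0, "mirror_far": 40.0}


def select_folding_mirror(d1_m: float) -> str:
    """Pick the folding mirror whose resulting D2 best approximates D1."""
    return min(FOLDING_MIRRORS, key=lambda m: abs(FOLDING_MIRRORS[m] - d1_m))


print(select_folding_mirror(18.0))  # -> "mirror_mid"
```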
The example in which the object detection sensor 10 includes the stereo camera has been described, but the present invention is not limited thereto, and any sensor may be used as long as the sensor can detect the object OJ and measure the actual distance D1. The object detection sensor 10 may include, for example, a known sensor such as a monocular camera, an infrared camera, a laser radar, a millimeter wave radar, or an ultrasonic sensor.
The head-up display system according to the present embodiment can prevent a virtual image from appearing as a double image due to convergence when a driver focuses on a specific object. As a result, the head-up display system can appropriately perform superimposed display.
Although the invention has been described with respect to specific embodiments for a complete and clear disclosure, the appended claims are not to be thus limited but are to be construed as embodying all modifications and alternative constructions that may occur to one skilled in the art that fairly fall within the basic teaching herein set forth.
| Number | Date | Country | Kind |
|---|---|---|---|
| 2022-169695 | Oct 2022 | JP | national |
This application is a continuation application of International Application No. PCT/JP2023/037644, filed on Oct. 18, 2023 and designating the U.S., which claims the benefit of priority from Japanese Patent Application No. 2022-169695, filed on Oct. 24, 2022, the entire contents of both of which are incorporated herein by reference.
| Relation | Number | Date | Country |
|---|---|---|---|
| Parent | PCT/JP2023/037644 | Oct 2023 | WO |
| Child | 18924256 | | US |