The present invention relates to a head-up display (HUD) device that projects the display light of an image onto a projected member such as a windshield or combiner of a vehicle and displays a virtual image in front of a driver, a display control device, and the like.
In HUD devices, image correction processing (hereinafter referred to as warping processing) is known in which an image to be projected is pre-distorted so as to have a characteristic opposite to the distortion of a virtual image caused by the curved surface shape of an optical system, a windshield, or the like. Warping processing in HUD devices is described, for example, in Patent Document 1.
In addition, performing warping processing based on the driver's viewpoint position (viewpoint follow-up warping) is described, for example, in Patent Document 2.
Patent Document 1: Japanese Unexamined Patent Application Publication No. 2015-87619
Patent Document 2: Japanese Unexamined Patent Application Publication No. 2014-199385
The present inventor examined the implementation of a viewpoint position follow-up warping control which updates a warping parameter in accordance with the viewpoint position of a driver (which can be interpreted broadly, such as a pilot or crew member), and recognized the new issue described below.
After a driver's viewpoint position moves and a HUD device temporarily loses that viewpoint position, the viewpoint position may be re-detected and the warping parameter may be updated on the basis of the re-detected viewpoint position.
Here, ideally, the image (virtual image) displayed after the warping processing should be a flat virtual image after the distortion caused by the optical system or the like is completely corrected. In the viewpoint position follow-up warping, it is ideal that a virtual image without distortion is always obtained even when the position of the viewpoint of a driver (user) changes, but the distortion cannot be completely removed.
Therefore, if, after the viewpoint position loss (lost) occurs, simple viewpoint position follow-up warping is performed on the basis of the re-detected viewpoint position, then even when a virtual image of the same image is displayed, the appearance of the virtual image seen by the driver (the state of the virtual image, the impression received from it, etc.) may change instantaneously, causing a sense of discomfort to the driver.
Further, as a typical mode of the viewpoint lost, it can be assumed that the viewpoint moves out of the eye box for some reason and then returns to the inside of the eye box; however, the viewpoint lost is not limited to this, and may also occur when the driver's viewpoint moves instantaneously within the eye box. Although referred to by the single term "viewpoint lost," there are many different modes, and it is also important, where necessary, to adopt measures that take the situation of the viewpoint lost into account.
Furthermore, in recent years, HUD devices capable of displaying a virtual image over a fairly wide range in front of a vehicle have been developed, and such HUD devices tend to be large in size. Although the design of the optical system or the like has been devised to reduce distortion, it is difficult to achieve a uniform distortion reduction effect in, for example, all areas of the eye box. For instance, a situation may occur in which, when the driver's viewpoint is in the central area of the eye box, the degree of distortion can be significantly suppressed, but when the viewpoint is in the periphery of the eye box, the degree of residual distortion increases to some extent. This point may also be a factor that significantly changes the appearance of the virtual image produced by the warping processing after the viewpoint position is lost.
An object of the present invention is to suppress, in a HUD device that performs viewpoint position follow-up warping control to update the warping parameter in accordance with the driver's viewpoint position, the sense of discomfort caused to the driver by an instantaneous change in the appearance of the image accompanying the update of the warping parameter when the viewpoint position is lost and then re-detected.
Other objects of the present invention will become apparent to those skilled in the art by referring to the aspects and the best embodiment exemplified below, and to the attached drawings.
Aspects according to the present invention are exemplified below to allow the summary of the present invention to be easily understood.
In a first aspect, a display control device controls a head-up display (HUD) device which is mounted on a vehicle, which includes a display unit that displays an image and an optical system including an optical member that reflects and projects display light of the image onto a projected member provided in the vehicle, and which, by projecting the image onto the projected member, makes a driver visually recognize a virtual image of the image. The display control device includes a control unit that performs viewpoint position follow-up warping control to update a warping parameter in accordance with a viewpoint position of the driver in an eye box and to pre-distort an image to be displayed on the display unit with use of the warping parameter in such a manner that the image has a characteristic opposite to a distortion characteristic of the virtual image of the image. When a viewpoint lost in which a position of at least one of the right and left viewpoints of the driver is unclear is detected, the control unit maintains, in a viewpoint lost period, the warping parameter set immediately before the viewpoint lost period, and when a position of the viewpoint is re-detected after the viewpoint lost period, the control unit disables at least one warping processing that uses the warping parameter corresponding to the re-detected viewpoint position.
In the first aspect, during the viewpoint lost (sometimes referred to as a viewpoint loss or a viewpoint position loss) period, the warping parameter immediately before is maintained, and at the timing when the viewpoint position is re-detected, the warping by a new warping parameter is not performed immediately, but is performed with a delay.
In other words, when the viewpoint position is re-detected after the viewpoint lost, the control unit disables at least one warping processing that uses the warping parameter corresponding to the re-detected viewpoint position (setting the disabling period after re-detection), and then enables (performs) the warping processing that uses the warping parameter corresponding to the viewpoint position after the disabling period ends.
For example, assume that the eye box is divided into multiple partial areas and that a method for detecting the viewpoint position in units of each partial area is adopted. Suppose that the viewpoint was found to be in "partial area A" immediately before the viewpoint lost occurred, and that the viewpoint had moved to "partial area B" by the timing when the viewpoint position was re-detected. Even so, the warping processing to which the warping parameter corresponding to the position of "partial area B" is applied is disabled (at least one disabling). When the viewpoint position then moves from partial area B to partial area C, the warping processing to which the warping parameter corresponding to the position of partial area C is applied can be disabled again (a second disabling).
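The count-based disabling described above can be sketched as follows. This is an illustrative sketch only; the class and method names (`WarpingController`, `on_viewpoint_detected`, etc.) and the number of disables are assumptions, not details from the present specification.

```python
# Hypothetical sketch: keep the pre-lost warping parameter and skip
# (disable) a configurable number of parameter updates after re-detection.

class WarpingController:
    def __init__(self, num_disables=1):
        self.num_disables = num_disables    # how many re-detections to ignore
        self.active_param = None            # warping parameter currently applied
        self.remaining_disables = 0
        self.lost = False

    def on_viewpoint_detected(self, param_for_area):
        """Called each time the viewpoint position is (re-)detected."""
        if self.lost:
            # First detection after a lost: start the disabling phase.
            self.lost = False
            self.remaining_disables = self.num_disables
        if self.remaining_disables > 0:
            # Disable this update: keep the parameter from before the lost.
            self.remaining_disables -= 1
            return self.active_param
        self.active_param = param_for_area
        return self.active_param

    def on_viewpoint_lost(self):
        # During the lost period the previous parameter is simply held.
        self.lost = True
```

With `num_disables=1`, the first parameter seen after re-detection (partial area B in the example above) is ignored, and the next one (partial area C) is applied.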
It is also possible to adaptively determine how many times to disable, taking into account the mode of the viewpoint lost (e.g., length of the viewpoint lost period), vehicle traveling state (e.g., vehicle speed), and the like.
In a second aspect dependent on the first aspect, the control unit compares the viewpoint lost time with a threshold value, and if the viewpoint lost time is shorter than the threshold value, the control unit may perform a control to lengthen a period during which the warping processing is disabled as compared to a case where the viewpoint lost time is longer than the threshold value.
In the second aspect, the control unit clocks a time during which the viewpoint position is lost (lost time or loss time), and if the lost time is shorter than a predetermined threshold value, the control unit performs a control to lengthen the period for disabling.
In the second aspect, by setting a longer period during which the warping parameter is held fixed and the visual quality of the image (virtual image) is maintained at a constant level, the driver can be made aware that the re-detection after the viewpoint lost has succeeded and that the corresponding processing is being performed. In other words, by extending the period during which the warping parameter is fixed before image processing using the updated warping parameter begins, even when the appearance of the image (virtual image) changes, the driver perceives that the change does not occur suddenly with the movement of his or her eyes, but is a change with a margin in time.
This makes it easier for the driver to sense that, although the viewpoint position was lost due to the movement of his or her own eyes, the system of the HUD device succeeded in re-detecting the viewpoint position, and the processing corresponding to the viewpoint lost has been performed.
In other words, the HUD device (system side) can show the driver, who is the user, that the processing after the viewpoint lost is being performed properly. This gives the driver a sense of security and mental stability, and thus it is possible to obtain the effect of making a sense of discomfort less likely to occur or of reducing a sense of discomfort.
In a third aspect dependent on the first aspect, the control unit compares the viewpoint lost time with a threshold value, and if the viewpoint lost time is shorter than the threshold value, the control unit may perform a control to shorten a period during which the warping processing is disabled as compared to a case where the viewpoint lost time is longer than the threshold value.
In the third aspect, the control unit clocks the viewpoint position lost time (loss time), and if the lost time is shorter than a predetermined threshold value, the control unit performs a control to shorten the period for disabling. The direction of control is opposite to the second aspect, and the second and third aspects have different effects to be obtained, and thus each aspect can be applied selectively in accordance with the expected effect.
When the viewpoint lost time is short, the change in the viewpoint position is considered to be small (the viewpoint movement distance is considered to be relatively short). Therefore, the time during which the warping processing using a new warping parameter is disabled is set to be shorter than when the viewpoint lost time is longer. This makes it possible, for example, after performing the minimum necessary disabling (timing delay), to reduce a sense of discomfort or suppress its occurrence by quickly displaying an appropriately warping-corrected image (virtual image) corresponding to the viewpoint position.
In other words, when the viewpoint lost period is short, the movement distance of the viewpoint position is presumed to be small, and it can be assumed that there is little difference in the distortion of the virtual image before and after the update of the warping parameter. Considering this point, a sudden change in the appearance of the virtual image immediately after the re-detection of the viewpoint position (in other words, within a fairly short time) is prevented, and the normal viewpoint follow-up warping control is then restored, thereby reliably obtaining the effect of improving visibility.
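The choice between the second aspect (lengthen on a short lost) and the third aspect (shorten on a short lost) can be sketched as a single threshold comparison. All numeric values and the function name are illustrative assumptions; the specification defines only the qualitative behavior.

```python
# Hedged sketch of the second/third-aspect control: compare the measured
# lost time with a threshold and pick the disabling-period length.

def disabling_period_ms(lost_time_ms, threshold_ms=500,
                        short_ms=100, long_ms=400, lengthen_on_short=True):
    """Return the disabling-period length applied after re-detection.

    lengthen_on_short=True  -> second aspect (reassure the driver)
    lengthen_on_short=False -> third aspect (restore normal warping quickly)
    """
    if lost_time_ms < threshold_ms:
        return long_ms if lengthen_on_short else short_ms
    return short_ms if lengthen_on_short else long_ms
```

Each aspect can thus be applied selectively according to the expected effect, as noted above, simply by switching the mode flag.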
In a fourth aspect dependent on any one of the first to third aspects, when an update cycle of the warping parameter before the viewpoint lost occurs and during the viewpoint lost period is defined as a first update cycle RT1 and an update cycle of the warping parameter during a period during which the warping processing is disabled is defined as a second update cycle RT2, the control unit may change a parameter update cycle in such a manner that RT1<RT2.
In the fourth aspect, processing that lengthens the update cycle of the warping parameter (warping parameter update cycle change processing) is used in conjunction with the processing that maintains, during the disabling period, the parameter from immediately before the viewpoint lost.
For example, if the frame rate of the image (virtual image) is 60 fps (frames per second), then image processing (image display processing) is performed at 60 frames per second (in other words, one frame period is 1/60 second). As an example, assume that it is normal to also update the warping parameter every frame.
Here, in the disabling period after the viewpoint lost, if the parameter is updated every 2 frames, the update cycle becomes 2/60 seconds, and if it is updated every 3 frames, the update cycle becomes 3/60 seconds; the update cycle thus becomes longer. In this way, it is possible to lengthen (increase) the update cycle by switching to an update based on multiple frames. As the update cycle becomes longer, the reflection of the updated warping parameter in the image (virtual image) becomes slower. In other words, the sensitivity with which the updated parameter is reflected in the display is reduced. After the disabling period, the parameter update cycle is restored (from RT2 to RT1). In reality, however, restoring the update cycle is not instantaneous but requires a certain amount of time, and thus even when the parameter is switched, the reflection of the switched parameter in the actual display is delayed.
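The frame-based cadence in the 60 fps example above can be expressed directly. The helper name and the particular frame counts are illustrative assumptions.

```python
# Illustrative computation of the update cycles RT1 and RT2 from the
# frame rate, matching the 60 fps example in the text.

FPS = 60  # frames per second; one frame period is 1/60 second

def update_cycle_s(frames_per_update):
    """Update cycle in seconds when the parameter is updated once
    every `frames_per_update` frames."""
    return frames_per_update / FPS

RT1 = update_cycle_s(1)   # normal operation: update every frame -> 1/60 s
RT2 = update_cycle_s(3)   # disabling period: update every 3 frames -> 3/60 s
assert RT1 < RT2          # the fourth-aspect condition
```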
Lengthening the update cycle in this way makes it easier to provide an appropriate delay of some time width (a delay long enough for the driver to perceive that the change in appearance in the actual display has been delayed; in other words, a delay that effectively extends the width of the disabling period slightly). An effect such as facilitating the design of a timing control circuit can also be expected.
Further, the degree of increase in the update cycle of the parameter can be controlled in a variable manner, or the timing of restoring the increased update cycle to its original cycle can be devised, thereby expanding the range of control variations and enabling flexible responses. The amount of delay in the actual display control can also be easily set quite widely.
In a fifth aspect dependent on the fourth aspect, after changing the parameter update cycle from the RT1 to the RT2, at an end timing of a period during which the warping processing is disabled, the control unit may restore the parameter update cycle from the RT2 to the RT1, or at a timing when a predetermined time has further elapsed from the end timing of the period during which the warping processing is disabled, the control unit may restore the parameter update cycle from the RT2 to the RT1, or the control unit may start changing the parameter update cycle starting from the end timing of the period during which the warping processing is disabled, and gradually restore the parameter update cycle from the RT2 to the RT1 with a lapse of time.
In the fifth aspect, examples are given of modes in which the update cycle is restored after the processing of lengthening the update cycle in the fourth aspect is performed.
In a first example, the long update cycle is synchronized with the switching (updating) of the warping parameter and is restored to the original short update cycle. Even in this case, a certain amount of time is needed to change the update cycle, and thus that much delay is ensured.
In a second example, the parameter update cycle is restored after a predetermined time has further elapsed from the time of switching (updating) the warping parameter. In this example, the timing of restoring the update cycle is delayed by a predetermined amount of time from the timing of switching (updating) the parameter, further delaying the reflection of the changed parameter in the display, which thus makes it easier to ensure the delay of an appropriate length that can be perceived by a human eye.
In a third example, when restoring the update cycle, it is restored gradually over a predetermined period. In other words, when the parameter update cycle is restored from, for example, 1/15 second to 1/60 second, a control is performed in which the cycle is not restored immediately, but is gradually switched, using a predetermined time as a unit, to 1/30 second, then 1/45 second, and then 1/60 second. By controlling the gradual switching of the update cycle on the time axis, the delay in reflecting the changed parameter in the display can be managed with greater precision.
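The stepwise restore of the third example can be sketched as a schedule of intermediate cycles. The step values are the ones used in the text; the scheduling function itself and its name are assumptions.

```python
# Sketch of the third example: restore the update cycle stepwise from
# RT2 back to RT1, e.g. 1/15 s -> 1/30 s -> 1/45 s -> 1/60 s.

def gradual_restore(rt2=1 / 15, rt1=1 / 60, steps=(1 / 30, 1 / 45)):
    """Return the sequence of update cycles applied, one per
    predetermined time unit, ending at the normal cycle RT1."""
    schedule = [rt2, *steps, rt1]
    # The cycle must shrink monotonically back toward RT1.
    assert all(a > b for a, b in zip(schedule, schedule[1:]))
    return schedule
```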
In a sixth aspect dependent on any one of the first to fifth aspects, a low-speed state determination unit that determines whether a speed of the vehicle is in a low-speed state may be further included, and the control unit may lengthen a period during which the warping processing is disabled when the vehicle is in the low-speed state including a stopped state, as compared to a period during which the warping processing is disabled in a state where the vehicle is in a state faster than the low-speed state.
In the sixth aspect, when the vehicle is in a low-speed state, the disabling period is set longer than when the vehicle is in a medium-speed or high-speed state (in other words, the timing for switching the warping parameter is further delayed). In a low-speed state, the driver is sensitive to visual fluctuations in the scene ahead and the like, and can easily detect these fluctuations. Therefore, at such a time, a measure is taken such that the reflection of the new parameter in the image after the viewpoint lost is delayed to a greater extent, and a sense of discomfort caused by an instantaneous change in display appearance is less likely to occur. As the vehicle speed increases out of the low-speed state, the disabling period is reduced (which may include the case where the disabling period is eliminated), and a control is performed with an emphasis on accelerating the distortion correction of the image based on the viewpoint position re-detected after the lost. This allows for an appropriate warping control in accordance with the vehicle speed.
In a seventh aspect dependent on any one of the first to sixth aspects, the control unit may change a period during which the warping processing is disabled in accordance with a vehicle speed of the vehicle, and in this case, when a speed of the vehicle is within a range of equal to or higher than a first speed value U1 (U1>0) and equal to or lower than a second speed value U2 that is higher than the first speed value, the control unit may perform a control to reduce the period during which the warping processing is disabled as the vehicle speed increases, or may perform a control to moderate a degree of the reduction when the vehicle speed is in a range close to the first speed value and to make the degree of the reduction steeper as the vehicle speed moves away from the first speed value, or may perform a control to moderate the degree of the reduction when the vehicle speed is in a range close to the first speed value, to make the degree of the reduction steeper as the vehicle speed moves away from the first speed value, and to moderate the degree of the reduction again as the vehicle speed approaches the second speed value.
In the seventh aspect, when performing a control to shorten (in other words, reduce) the period during which the warping processing is disabled as the vehicle speed increases, the control may be performed when the vehicle speed is within a range of equal to or higher than a first speed value U1 (U1>0) and equal to or lower than a second speed value U2 that is higher than the first speed value (first control). In this case, the control may not be performed in a range where the vehicle speed is below the first speed U1 and above the second speed U2, so as not to impose an excessive burden on the system of the HUD device. Further, by reducing the disabling period with respect to the speed, a more flexible and appropriate warping processing according to the speed is possible.
Further, when the vehicle is in a low-speed state where the driver can easily perceive visual changes in the image (virtual image) (in other words, the vehicle speed is in a range close to the first speed value U1), the control to moderate a degree of reduction in the disabling period in such a manner that a sudden warping parameter update is suppressed may be performed (second control). In this case, a more precise control can be achieved.
Further, in addition to the second control described above, a control to moderate the degree of reduction in the disabling period as the vehicle speed approaches the second speed value U2 (a control with an inverted S-shaped characteristic) may be performed, and when the vehicle speed reaches the second speed value U2, the reduction is stopped and the disabling period becomes constant, thereby suppressing the sense of discomfort that would occur if the change suddenly (unexpectedly) peaked out (third control). This can further improve the visibility of the virtual image.
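The inverted S-shaped characteristic of the third control can be sketched with a smoothstep interpolation between U1 and U2. The smoothstep choice and all numeric values are assumptions; the specification defines only the qualitative shape (gentle near U1 and U2, steeper in between, constant outside the range).

```python
# Hedged sketch of the seventh-aspect third control: the disabling period
# decreases with vehicle speed between U1 and U2 with gentle slopes at
# both ends (inverted-S characteristic).

def disabling_period(speed, u1=10.0, u2=60.0, p_max=400.0, p_min=100.0):
    """Disabling period in ms as a function of vehicle speed (km/h)."""
    if speed <= u1:
        return p_max            # at or below U1: no reduction is performed
    if speed >= u2:
        return p_min            # at or above U2: reduction stops, constant
    t = (speed - u1) / (u2 - u1)
    s = t * t * (3 - 2 * t)     # smoothstep: gentle near U1/U2, steep between
    return p_max - (p_max - p_min) * s
```

Omitting the smoothstep term (using `s = t` directly) would give the simpler first control, a linear reduction with speed.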
In an eighth aspect dependent on any one of the first to seventh aspects, when adjusting a position of the eye box in accordance with a height position of the viewpoint of the driver, the head-up display device may not move the optical member and may change a reflection position of display light of the image in the optical member.
In the eighth aspect, when the height position of the eye box EB is adjusted in accordance with the height position of the driver's eyes (viewpoint), the HUD device that performs the above control does not rotate the optical member which projects light onto the projected member by, for example, using an actuator, but changes the reflection position of the light in the optical member.
In recent years, HUD devices tend to be developed on the premise of displaying a virtual image over a fairly wide area in front of the vehicle, which inevitably makes the device larger. As a matter of course, the optical member also becomes larger. If this optical member is rotated with the use of an actuator or the like, the precision of controlling the height position of the eye box may be degraded by the resulting error. To prevent this, the position at which a light ray is reflected by the optical member is changed instead.
In such a large optical member, the distortion of the virtual image is prevented as much as possible by optimally designing the reflection surface as a free curved surface. However, as mentioned above, distortions may inevitably become apparent when, for example, the driver's viewpoint is located in the periphery of the eye box. In such a case, therefore, the above control to temporarily disable (delay) the application of the parameter corresponding to the viewpoint position re-detected within a predetermined range is performed, whereby a sense of discomfort due to a change in appearance caused by distortion of the virtual image can be made less likely to occur, and the above control can be effectively utilized to improve visibility.
In a ninth aspect dependent on any one of the first to eighth aspects, a hypothetical virtual image display surface corresponding to an image display surface of the display unit may be arranged so as to be superimposed on a road surface in front of the vehicle, or may be arranged at an angle with respect to the road surface in such a manner that a distance between a near end portion that is an end portion of the virtual image display surface on a side closer to the vehicle and the road surface is small, and a distance between a far end portion that is an end portion of the virtual image display surface on a side further from the vehicle and the road surface is large.
In the ninth aspect, a hypothetical virtual image display surface (corresponding to a display surface such as a screen as a display unit) arranged in front of the vehicle or the like in a HUD device is provided so as to be superimposed on the road surface or inclined with respect to the road surface. The former is referred to as a road surface superimposition HUD, while the latter is sometimes referred to as an inclined surface HUD.
These can perform various displays with the use of the wide virtual image display surface superimposed on the road surface, or the wide virtual image display surface provided at an angle with respect to the road surface, in a range of, for example, 5 m to 100 m in front of the vehicle. When the HUD device is increased in size and the eye box is accordingly enlarged, it is preferable to detect the viewpoint position with high accuracy over a wider range than before and to perform image correction using an appropriate warping parameter. However, if the viewpoint lost occurs, the highly accurate control of switching the warping parameter may in fact reduce the visibility of the image (virtual image) after the viewpoint is re-detected. The application of the control method of the present invention therefore becomes effective.
Those skilled in the art will readily understand that the exemplified aspects according to the present invention may be further modified without departing from the spirit of the invention.
The best embodiment described below is used so that the present invention is easily understood. Therefore, a person skilled in the art should note that the present invention is not unduly limited by the embodiments described below.
The following description refers to
As illustrated in
Part of the display light of the image is reflected by the windshield 2 and is incident on a viewpoint (eye) A of the driver or the like located inside (or on) a preset eye box EB (here, the eye box EB has a quadrangular shape with a predetermined area). The light forms an image in front of the vehicle 1, and thus a virtual image V is displayed on a hypothetical virtual image display surface PS corresponding to a display surface 102 of the display unit 101.
The image on the display unit 101 is distorted due to the shapes of the curved mirror 105 and the windshield 2. To offset that distortion, the image is given a distortion with the opposite characteristic in advance. This pre-distortion method of image correction is referred to herein as warping processing or warping image correction processing.
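The pre-distortion principle can be illustrated in miniature: if the optics apply a known distortion to every image point, the source image is corrected with the inverse characteristic so the two cancel. The one-dimensional quadratic distortion model below is purely illustrative and is not the distortion model of the HUD device 100.

```python
# Minimal illustration of warping as pre-distortion (assumed 1-D model).

def distort(y, k=0.1):
    """Distortion introduced by the optical system (assumed quadratic model)."""
    return y + k * y * y

def prewarp(y, k=0.1):
    """Opposite-characteristic correction applied to the source image:
    the positive root of k*x^2 + x - y = 0, i.e. the inverse of distort()."""
    return (-1 + (1 + 4 * k * y) ** 0.5) / (2 * k)

# Pre-distorting and then passing through the optics recovers the point.
y = 0.8
assert abs(distort(prewarp(y)) - y) < 1e-9
```

In the actual device, the correction is applied two-dimensionally per viewpoint position using the warping parameters described below, rather than by a closed-form inverse.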
Ideally, the virtual image V displayed on the virtual image display surface PS by the warping processing becomes an uncurved flat image. However, in a large HUD device 100 or the like that projects the display light onto the wide projection area 4 on the windshield 2 and sets the virtual image display distance over a fairly wide range, some residual distortion is unavoidable.
In the upper left of
Further, the degree of distortion or the mode of distortion of the virtual image V in which the distortion remains depends on the position of the viewpoint A in the eye box EB. Since the optical system of the HUD device 100 is designed on the assumption that the viewpoint A is located near the center, the distortion of the virtual image is relatively small when the viewpoint A is near the center, and tends to increase as the viewpoint A approaches the periphery.
The description will be continued with reference to
As illustrated in
Display light K of the image is emitted from a projection optical system 118 of the HUD device 100, and part of the display light K is reflected by the windshield 2 and incident on the driver's viewpoint (eye) A. When the viewpoint A is in the eye box, the driver can visually recognize the virtual image of the image.
The HUD device 100 includes a ROM 210, and the ROM 210 incorporates an image conversion table 212. The image conversion table 212 stores, for example, a warping parameter WP that determines a polynomial, a multiplier, a constant, or the like for image correction (warping image correction) by a digital filter. The warping parameter WP is provided for each of the partial areas Z1 to Z9 in the eye box EB. In
When the viewpoint A moves, the position of the viewpoint A among the multiple partial areas Z1 to Z9 is detected. Then, any one of the warping parameters WP (Z1) to WP (Z9) corresponding to the detected partial area is read from the ROM 210 (warping parameter update), and the warping processing is performed with the use of the warping parameter.
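The per-area lookup described above can be sketched as follows. The 3x3 grid layout, the eye-box dimensions, and the table contents are illustrative assumptions; the specification states only that a warping parameter WP is stored for each of the partial areas Z1 to Z9.

```python
# Sketch of the partial-area warping-parameter lookup (assumed 3x3 grid).

# Placeholder table standing in for the image conversion table 212.
WARP_TABLE = {f"Z{i}": f"WP(Z{i})" for i in range(1, 10)}

def area_of(viewpoint_xy, box_w=300.0, box_h=90.0):
    """Map viewpoint coordinates (mm, origin at an eye-box corner) to
    one of the partial areas Z1..Z9 on an assumed 3x3 grid."""
    x, y = viewpoint_xy
    col = min(int(3 * x / box_w), 2)   # clamp to the last column/row so a
    row = min(int(3 * y / box_h), 2)   # viewpoint on the edge stays in-box
    return f"Z{row * 3 + col + 1}"

def lookup(viewpoint_xy):
    """Read the warping parameter for the detected partial area."""
    return WARP_TABLE[area_of(viewpoint_xy)]
```

When the viewpoint A moves into a different partial area, `lookup` returns the corresponding parameter, which corresponds to reading WP(Z1) to WP(Z9) from the ROM 210.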
The description will be continued with reference to
As illustrated in
In the example of
In the example of
In the example of
Thus, even for a virtual image V displaying the same content (virtual image V after the warping processing), the appearance varies considerably depending on the position of viewpoint A. For example, when the viewpoint A is located in the center of the eye box, the visible virtual image V has relatively little distortion as illustrated in
In this state (in the case of
The description will be continued with reference to
A typical example of the viewpoint lost is when, while driving, the driver's viewpoint A leaves the eye box EB and the detection of its position is interrupted, and then the viewpoint A returns to the eye box EB.
Further, the viewpoint lost may occur when the viewpoint A does not leave the eye box EB but moves instantaneously across multiple partial areas, as in viewpoint movement examples (5) and (6). In the viewpoint movement examples (5) and (6), the movement distance is shorter, the lost time of the viewpoint is shorter, and the viewpoint movement is relatively stable compared to the movement examples (1) to (4) above. There are a variety of modes of the viewpoint lost, and a flexible response is desirable.
In the present embodiment, basically, a measure is taken in which, during the period when the viewpoint lost (viewpoint loss) occurs, the warping parameter immediately before is maintained, and the disabling period is started from the timing when the viewpoint A is re-detected, and in the disabling period, at least one warping processing that uses the warping parameter corresponding to the re-detected viewpoint position is disabled. In the disabling period, the warping parameter immediately before the viewpoint lost is maintained.
It can be assumed that a warping parameter corresponding to the position of a center (reference sign CP) of the eye box EB is adopted during the viewpoint lost period. However, in this case, the parameter is subjected to stepwise processing of shifting from the parameter immediately before the viewpoint lost to the parameter corresponding to the center CP once, and then to the parameter corresponding to the position after re-detection. Since there is a high possibility that the appearance of the virtual image will change due to the switching of the warping parameter, the warping parameter corresponding to the center CP is not adopted in the present embodiment. As described above, during the viewpoint lost period and the subsequent disabling period, the warping parameter immediately before viewpoint lost is maintained, thereby performing the control to suppress changes in the appearance of the virtual image. By setting the disabling period to an appropriate length, for example, only the warping associated with the re-detection of the viewpoint A immediately after the viewpoint movement (2) in
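The control of this embodiment, holding the pre-lost parameter through the lost period and through a timed disabling period started at re-detection, can be sketched as a small time-based state machine. The class name, the millisecond timing interface, and the default duration are assumptions for illustration.

```python
# Time-based sketch of the embodiment's control: hold the previous
# warping parameter during the lost period and for a disabling period
# that starts when the viewpoint is re-detected.

class ViewpointFollowWarp:
    def __init__(self, disable_ms=300):
        self.disable_ms = disable_ms
        self.param = None          # warping parameter currently applied
        self.disable_until = None  # timestamp (ms) when disabling ends
        self.lost = False

    def update(self, now_ms, detected_param):
        """Called every frame; detected_param is None while the
        viewpoint position is lost."""
        if detected_param is None:          # viewpoint lost
            self.lost = True
            return self.param               # hold the previous parameter
        if self.lost:                       # re-detected: start disabling
            self.lost = False
            self.disable_until = now_ms + self.disable_ms
        if self.disable_until is not None and now_ms < self.disable_until:
            return self.param               # still disabled: keep old param
        self.disable_until = None
        self.param = detected_param         # disabling ended: apply new one
        return self.param
```

Note that, as described above, the parameter for the center CP is never substituted at any point; the held parameter transitions directly to the re-detected one when the disabling period ends.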
Further, in the disabling period, the warping parameter update cycle change processing (processing to lengthen the update cycle) can be used in combination. Further, in doing so, the application processing of continuing the period of increasing the parameter update cycle for some time after the disabling period ends can be performed, and the application processing of gradually restoring the update cycle to its original cycle over time can also be used in combination (examples of
Further, the disabling period may be controlled in a variable manner depending on whether the viewpoint lost period is longer or shorter than a threshold value (a threshold value for comparison and determination). For example, the driver (user) can be reassured that the viewpoint has been re-detected after the viewpoint lost (a lost caused by a relatively short viewpoint movement) caused by the viewpoint movement (5) in
The description will be continued with reference to
Further, the HUD device 100 includes a light source 112, a light projection unit 114, the projection optical system 118, a viewpoint position detection unit (viewpoint position determination unit) 120, a bus 150, a bus interface 170, a display control unit 180, an image generation unit 200, the ROM 210 that incorporates the image conversion table 212, and a VRAM 220 that stores image (original image) data 222 and temporarily stores image data 224 after warping processing. The display control unit (display control device) 180 is composed of one or a plurality of processors, one or a plurality of image processing circuits, one or a plurality of memories, and the like, and executes a program stored in the memories, thereby making it possible to control the HUD device 100 (display unit 116), for example, to generate and/or transmit image data. The processor and/or image processing circuit may include at least one general-purpose microprocessor (e.g., a central processing unit (CPU)), at least one application-specific integrated circuit (ASIC), at least one field-programmable gate array (FPGA), or any combination thereof. The memories include any type of magnetic media such as hard disks, any type of optical media such as CDs and DVDs, and any type of semiconductor memory, both volatile and non-volatile. The volatile memory may include DRAM and SRAM, and the non-volatile memory may include ROM and NVRAM.
The viewpoint position detection unit 120 includes a viewpoint coordinate detection unit 122 and an in-eye box partial area detection unit 124 that detects (determines) which partial area of the eye box the viewpoint A is located in on the basis of detected viewpoint coordinates.
Further, the display control unit 180 includes a speed detection unit 182 that detects (determines) the speed of the vehicle 1 (which also serves as a low-speed state determination unit that determines a low-speed state), a warping control unit 184 (which includes a warping management unit 185), a timer 190, a memory control unit 192, and a warping processing unit (warping image correction processing unit) 194.
Further, the warping management unit 185 includes a viewpoint loss detection unit (or viewpoint lost detection unit) 186 that detects that a viewpoint loss (viewpoint lost) has occurred, and a warping parameter switching delay unit (disabling period setting unit) 187, a warping parameter update cycle changing unit 188, and a temporary accumulation unit of eye box partial area information 189, which temporarily stores the partial area information of the eye box corresponding to a detected viewpoint position.
Here, when the viewpoint position is first re-detected after the viewpoint lost, the warping parameter switching delay unit (disabling period setting unit) 187 performs a control not to immediately switch the warping parameter on the basis of the re-detected viewpoint position at that timing, but to temporarily delay the switching, thereby disabling the switching of the warping parameter.
Further, the warping parameter update cycle changing unit 188 performs a control to change the update cycle of the warping parameter (specifically, lengthen the update cycle) at least in the disabling period, in parallel with the disabling period setting processing due to the delay in the switching timing of the warping parameter. By changing this update cycle, for example, it is possible to appropriately delay the timing of reflecting the updated warping parameter in the actual display.
The basic operation is as follows: using the partial area information (information indicating which partial area of the eye box the viewpoint A is in) sent from the viewpoint position detection unit 120 as an address variable, the memory control unit 192 accesses the ROM 210 and reads the corresponding warping parameter; using the read warping parameter, the warping processing unit (warping image correction processing unit) 194 performs warping processing on the original image; the image generation unit 200 generates an image of a predetermined format on the basis of the data after the warping processing; and the image is supplied to, for example, the light source 112, the light projection unit 114, or the like.
However, if warping simply follows the movement of the viewpoint A, as described above, when the viewpoint position is re-detected after the viewpoint lost, the appearance of the virtual image V may change instantaneously, causing visual discomfort to the driver. Therefore, a control is performed to intentionally blunt (suppress) the sensitivity of the warping processing. There are several modes of this control, which are explained step by step below.
The following description refers to
Further, as illustrated in
As illustrated by the solid line in
Further, at time t12, the position of the viewpoint A is re-detected, but the switch to the parameter corresponding to the re-detected position is not performed immediately, and the parameter value WP1 is maintained for a predetermined period from the re-detection time t12 (here, the period up to time t13), and at the time t13, the parameter value is changed to a value WP2 based on the re-detected position. The period from time t12 to t13 is a disabling period Ta. When the viewpoint position is re-detected at least once during this disabling period Ta, the parameter change (application of the parameter) based on the re-detected position is disabled and warping using the maintained original parameter is performed.
Providing the disabling period fixes the parameter for a short while, so that no instantaneous switching of the warping parameter is made. Further, in the disabling period, even if the viewpoint position is unstable and moves, for example, through multiple partial areas of the eye box, the parameter is not changed on the basis of the continuous re-detection of the viewpoint position. This stabilizes the warping processing. Therefore, for example, when the driver's viewpoint A goes out of the eye box and then returns into the eye box, an instantaneous change in the appearance of the virtual image V, and the resulting sense of discomfort, are suppressed.
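As an illustration only, the parameter-holding control described above can be sketched as a small state machine in Python (the class, method, and variable names are hypothetical and are not part of the embodiment):

```python
class WarpingParameterController:
    """Sketch: during a viewpoint lost and for a disabling period Ta
    after the first re-detection, the warping parameter held
    immediately before the lost is maintained."""

    def __init__(self, disabling_period):
        self.disabling_period = disabling_period  # Ta, in seconds
        self.current_param = None                 # parameter in use
        self.lost = False                         # viewpoint lost?
        self.redetect_time = None                 # first re-detection time

    def on_viewpoint(self, param_for_position, now):
        """Called each time a viewpoint position is detected; returns
        the warping parameter actually applied."""
        if self.lost:
            # First re-detection after a lost: start the disabling
            # period and keep the parameter held before the lost.
            self.lost = False
            self.redetect_time = now
            return self.current_param
        if (self.redetect_time is not None
                and now - self.redetect_time < self.disabling_period):
            # Within the disabling period Ta: switching is disabled.
            return self.current_param
        # Normal viewpoint-follow-up warping.
        self.redetect_time = None
        self.current_param = param_for_position
        return self.current_param

    def on_viewpoint_lost(self):
        """Called when the viewpoint can no longer be detected."""
        self.lost = True
```

For example, with Ta set to 0.5 s, a position re-detected 0.3 s after the re-detection timing still yields the maintained parameter; only a detection after Ta has elapsed switches to the new parameter.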
The description will be continued with reference to
As illustrated in
Further, as illustrated in
Further, the example of
For example, if the frame rate of the image (virtual image) is 60 fps (frames per second), then image processing (image display processing) is performed at 60 frames per second (in other words, one frame period is 1/60 second). As an example, assume that it is normal to also update the warping parameter every frame.
Here, in the disabling period Ta after the viewpoint lost, if the parameter is updated every 2 frames, the update cycle becomes 2/60 seconds, and if it is updated every 3 frames, the update cycle becomes 3/60 seconds; the update cycle thus becomes longer. In this way, it is possible to lengthen (increase) the update cycle by switching to an update based on multiple frames. As the update cycle becomes longer, the reflection of the updated warping parameter in the image (virtual image) becomes slower. In other words, the sensitivity of reflecting the updated parameter in the display is reduced. After the disabling period, the changed update cycle is restored (from RT2 to RT1). In reality, however, restoration of the update cycle is not instantaneous but requires a certain amount of time, which means that even when the warping parameter is switched, the reflection of the switched parameter in the actual display is delayed.
This makes it easier to provide an appropriate delay of some time width (a delay long enough for the driver to perceive that the change in appearance in the actual display has been delayed; in other words, a delay that effectively extends the width of the disabling period slightly). An effect such as facilitating the design of a timing control circuit can also be expected.
Further, the degree of increase in the update cycle of the parameter can be controlled in a variable manner, or the timing of restoring the increased update cycle to its original cycle can be devised, thereby expanding the range of control variations and enabling flexible responses. The amount of delay in the actual display control can also be easily set quite widely.
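The frame-based arithmetic above can be written out as follows (a minimal sketch; the 60 fps figure and the 2- and 3-frame examples come from the text, while the function name is hypothetical):

```python
def warping_update_cycle(frames_per_update, frame_rate=60):
    """Update cycle (in seconds) of the warping parameter when it is
    updated once every `frames_per_update` frames at `frame_rate` fps."""
    return frames_per_update / frame_rate

# Normal operation: an update every frame gives a cycle of 1/60 s.
# In the disabling period, updating every 2 or 3 frames lengthens the
# cycle to 2/60 s or 3/60 s, slowing the reflection of a newly
# updated parameter in the displayed image (virtual image).
```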
A variation of the timing for restoring the increased update cycle is illustrated in
Further, in
The description will be continued with reference to
In the example of
In
By setting a longer period during which the warping parameter is fixed and the visual quality of the image (virtual image) is maintained at a constant level, the driver can be made aware that re-detection after the viewpoint lost has succeeded and that the corresponding processing is being performed. In other words, by extending the period during which the warping parameter is fixed and then performing image processing using the updated warping parameter, even when the appearance of the image (virtual image) changes, the driver perceives that the change does not occur suddenly with the movement of the driver's own eyes but with a margin in time.
This makes it easier for the driver to sense that, although the viewpoint position was lost due to the movement of one's own eyes, the system of the HUD device succeeded in re-detecting the viewpoint position, and the processing corresponding to the viewpoint lost has been performed.
In other words, the HUD device (system side) can show the driver, who is the user, that the processing after the viewpoint lost is being performed properly. This gives the driver a sense of security and mental stability, making a sense of discomfort less likely to occur or reducing it.
The description will be continued with reference to
In the example of
When the viewpoint lost time is short, the change in the viewpoint position is considered to be small (the viewpoint movement distance is considered to be relatively short). Therefore, in the example of
This allows, for example, after performing the minimum necessary disabling (timing delay), to reduce a sense of discomfort or suppress the occurrence of a sense of discomfort by quickly displaying an appropriate warping-corrected image (virtual image) corresponding to the viewpoint position.
In other words, when the viewpoint lost period is short, the movement distance of the viewpoint position is presumed to be small, and it can be assumed that there is little difference in the distortion of the virtual image before and after the updating of the warping parameter. Considering this point, a sudden change in the appearance of the virtual image immediately after the re-detection of the viewpoint position (in other words, within a fairly short time) is prevented, and then the normal viewpoint follow-up warping control is restored, thereby reliably obtaining the effect of improving visibility.
The description will be continued with reference to
The speed detection unit 182 of
In
In the low-speed state, the driver (user) is sensitive to visual fluctuations in the scene ahead and can easily detect them. Therefore, at this time, a measure is taken such that the reflection of the new parameter in the image after the viewpoint lost is delayed to a greater extent, so that a sense of discomfort caused by an instantaneous change in the display appearance is less likely to occur. As the vehicle speed increases out of the low-speed state, the disabling periods Ta, Te, and Tf are shortened (this may include a case where the disabling period is eliminated), and control is performed with an emphasis on accelerating the distortion correction of the image based on the viewpoint position re-detected after the lost. This allows for more flexible and appropriate warping control in response to the vehicle speed.
Further, in the control example in
Further, in a case where a control illustrated by a characteristic line Q3 is performed, the degree of reduction in the disabling period is moderated when the vehicle speed is in a range close to the first speed value U1, and is made steeper as the vehicle speed moves away from the first speed value U1.
In other words, when the vehicle is in a low-speed state where the driver can easily perceive visual changes in the image (virtual image) (in other words, the vehicle speed is in a range close to the first speed value U1), a control to moderate a degree of reduction in the disabling period in such a manner that a sudden warping parameter update is suppressed is performed (this is the second control). In this case, a more precise control can be achieved.
Further, in a case where a control illustrated by a characteristic line Q4 is performed, the degree of reduction in the disabling period is moderated when the vehicle speed is in a range close to the first speed value U1, is made steeper as the vehicle speed moves away from the first speed value, and is moderated again as the vehicle speed approaches the second speed value U2 (a control with an inverted S-shaped characteristic; this is the third control). In this third control, in addition to the second control described above, the degree of reduction in the disabling period is moderated as the vehicle speed approaches the second speed value U2, and when the vehicle speed reaches the second speed value U2, the reduction is stopped and the disabling period becomes constant, thereby suppressing a sudden (unexpected) peaking out of the change and the resulting sense of discomfort. This can further improve the visibility of the virtual image.
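As an illustration only, an inverted S-shaped characteristic such as the line Q4 might be approximated as follows (a sketch using a smoothstep interpolation; the curve shape, function name, and all values are assumptions and not part of the embodiment):

```python
def disabling_period_for_speed(speed, u1, u2, t_low_speed, t_high_speed):
    """Inverted-S characteristic: the disabling period falls from
    t_low_speed at the first speed value u1 to t_high_speed at the
    second speed value u2, changing gently near both ends and more
    steeply in between."""
    if speed <= u1:
        return t_low_speed          # low-speed state: longest period
    if speed >= u2:
        return t_high_speed         # reduction stops; period constant
    x = (speed - u1) / (u2 - u1)    # 0..1 across the transition range
    s = x * x * (3.0 - 2.0 * x)     # smoothstep: gentle ends, steep middle
    return t_low_speed + (t_high_speed - t_low_speed) * s
```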
The description will be continued with reference to
If N in step S2, the processing returns to step S1. If Y, the warping parameter immediately before the viewpoint loss is maintained (step S3). Subsequently, it is determined whether the viewpoint position has been re-detected after the viewpoint loss (step S4). If N, the processing returns to step S3. If Y, the processing proceeds to step S5.
In step S5, the processing of delaying (disabling) the update (switch) to the warping parameter corresponding to the re-detected viewpoint position is performed, thereby disabling at least one warping processing that uses the parameter corresponding to the re-detected viewpoint position. In doing so, the parameter update cycle change processing (the processing in
In step S6, it is determined whether the predetermined time (disabling period) Ta has elapsed. If N, the processing returns to step S5. If Y, the processing proceeds to step S7.
In step S7, update (switch) to the warping parameter corresponding to the re-detected viewpoint position is performed. In principle, the disabling period ends at this time (however, when the control of characteristic lines Tk1 and Tk2 of
Further, if the parameter update cycle has been changed in step S5, in step S7, the processing of restoring the parameter update cycle is also performed. There are three possible methods illustrated in
Subsequently, in step S8, it is determined whether to end image correction. If Y, the processing ends. If N, the processing returns to step S1.
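As an illustration only, the flow of steps S1 to S8 can be traced over a hypothetical sequence of timestamped viewpoint samples (all names are hypothetical; a sample of `None` stands for a viewpoint lost):

```python
def warping_flow_s1_to_s8(samples, ta, param_for):
    """samples: list of (time, viewpoint-or-None); param_for maps a
    viewpoint position to its warping parameter.  Returns, per sample,
    the parameter applied and whether the update cycle is lengthened."""
    applied, lengthened = [], []
    param = None
    lost = False
    redetect_t = None
    slow_cycle = False
    for t, vp in samples:
        if vp is None:                 # S2: viewpoint lost
            lost = True                # S3: maintain the previous parameter
        elif lost:                     # S4: first re-detection
            lost = False
            redetect_t = t             # S5: disabling period starts,
            slow_cycle = True          #     update cycle is lengthened
        elif redetect_t is not None and t - redetect_t < ta:
            pass                       # S6: within Ta, switching disabled
        else:
            if redetect_t is not None: # S7: Ta elapsed; switch parameter
                slow_cycle = False     #     and restore the update cycle
                redetect_t = None
            param = param_for(vp)      # S1/S7: normal follow-up warping
        applied.append(param)
        lengthened.append(slow_cycle)
    return applied, lengthened
```

Step S8 (the determination of whether to end image correction) corresponds here simply to the end of the sample list.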
The description will be continued with reference to
In step S4-1, it is determined whether the viewpoint loss period (viewpoint lost period) is shorter than a threshold value. If N, the processing proceeds to step S5.
If Y, in step S4-2, Te (>Ta) is adopted as the disabling period (delay time for switching a parameter) (in the case of
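Steps S4-1 and S4-2 can be sketched, as an illustration only, as a simple selection of the disabling period (the function and parameter names are hypothetical; the relation Te > Ta follows the text):

```python
def choose_disabling_period(lost_duration, threshold, ta, te):
    """S4-1: compare the viewpoint-lost period against the threshold;
    S4-2: if it is shorter, adopt the longer period Te (> Ta) as the
    disabling period; otherwise proceed to step S5 with Ta."""
    assert te > ta                   # Te > Ta, as specified in the text
    if lost_duration < threshold:    # S4-1: lost period shorter? (Y)
        return te                    # S4-2: adopt Te
    return ta                        # (N): proceed with Ta
```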
The description will be continued with reference to
In step S10, the vehicle speed is detected. In step S11, the traveling state of the vehicle (including stopping) is determined. For example, the determination of respective low-speed, medium-speed, and high-speed states may be performed.
In step S12, the viewpoint is monitored, and in step S13, it is determined whether there is a viewpoint loss (viewpoint lost). If N, the processing returns to step S12. If Y, the processing proceeds to step S14.
In step S14, the warping processing such as the control example 1 in
The description will be continued with reference to
These can perform various displays with the use of the wide virtual image display surface PS superimposed on the road surface 41, or the wide virtual image display surface PS provided at an angle with respect to the road surface 41, in a range of, for example, 5 m to 100 m in front of the vehicle 1. It is preferable that, as the HUD device and the eye box EB are increased in size, the viewpoint position be detected with high accuracy in a wider range than before and image correction be performed using an appropriate warping parameter. However, when the viewpoint lost occurs, this highly accurate control of switching the warping parameter may in fact reduce the visibility of the image (virtual image) after the viewpoint is re-detected. Therefore, applying the control method of the present invention is effective.
The description will be continued with reference to
In the example of
In other words, when the height position of the eye box EB is adjusted in accordance with the height position of the driver's eyes (viewpoint A), the optical member 131 which projects light onto the projected member 2 is not rotated by, for example, using an actuator, but the reflection position of the light in the optical member 131 is changed. The height direction is the Y direction in the figure (the direction along the vertical line of the road surface 41, and the direction away from the road surface 41 is the positive direction). The X direction is the right-left direction of the vehicle 1, and the Z direction is the front-rear direction (or forward direction) of the vehicle 1.
In recent years, HUD devices tend to be developed on the premise of displaying a virtual image over a fairly wide area in front of a vehicle, for example, and this inevitably makes the device larger. As a matter of course, the optical member 131 also becomes larger. If this optical member 131 were rotated with the use of an actuator or the like, the precision of controlling the height position of the eye box EB could be degraded by errors. To prevent this, the position at which a light ray is reflected by the reflection surface of the optical member 131 is changed instead.
In such a large optical member 131, the distortion of the virtual image is prevented as much as possible by optimally designing the reflection surface as a free curved surface. However, as mentioned above, distortions may inevitably become apparent when, for example, the driver's viewpoint A is located in the periphery of the eye box EB.
Therefore, in such a case, the above control to temporarily disable (delay) the application of the parameter corresponding to the viewpoint position after the re-detection within a predetermined range is performed, whereby it is possible to make a sense of discomfort due to a change in appearance caused by distortion of the virtual image less likely to occur, and to effectively utilize the above control to improve the visibility of the virtual image.
As described above, according to the present invention, when performing the viewpoint position follow-up warping control that updates the warping parameter in accordance with the driver's viewpoint position, it is possible to effectively suppress an instantaneous change in the appearance of the image accompanying the update of the warping parameter after a viewpoint loss (viewpoint lost) occurs, and the resulting sense of discomfort to the driver.
The present invention can be used in either a monocular HUD device in which the display light of the same image is incident on each of the right and left eyes, or a parallax HUD device in which an image having parallax is incident on each of the right and left eyes.
In this specification, the term vehicle can also be interpreted, in a broad sense, as a conveyance. Further, terms related to navigation (e.g., signs, etc.) shall be interpreted in a broad sense, taking into account, for example, the perspective of navigation information in a broad sense that is useful for vehicle operation. Further, the HUD device shall also include those used as simulators (e.g., aircraft simulators).
The present invention is not limited to the exemplary embodiments described above, and modifications that a person skilled in the art may easily make to the exemplary embodiments described above are included within the scope of the claims.
1 vehicle (own vehicle)
2 projected member (reflective transmissive member, windshield, etc.)
4 projection area
5 virtual image display area
7 steering wheel
51 display light
100 HUD device
110 viewpoint detection camera
112 light source
114 light projection unit
116 display unit
117 display surface (image display surface)
118 projection optical system
120 viewpoint position detection unit (viewpoint position determination unit)
122 viewpoint coordinate detection unit
124 in-eye box partial area detection unit
130 operation unit
131 curved mirror (concave mirror, etc.)
133 reflecting mirror (including reflective mirror, corrective mirror, etc.)
140 vehicle ECU
150 bus
151 light source unit
161 display unit (screen, etc.)
163 display surface (image display surface)
170 bus interface
171 control unit
173 actuator that drives the display unit
180 display control unit (display control device)
182 speed detection unit
184 warping control unit
185 warping management unit
186 viewpoint loss (viewpoint lost) detection unit
187 warping parameter switching delay unit (disabling period setting unit)
188 warping parameter update cycle changing unit
189 temporary accumulation unit of eye box partial area information
192 memory control unit
194 warping processing unit
200 image generation unit
210 ROM
220 VRAM
212 image conversion table
222 image (original image) data
224 image data after warping processing
EB eye box
Z (Z1 to Z9, etc.) partial area of eye box
WP warping parameter
PS virtual image display surface
V virtual image
Priority: Japanese Patent Application No. 2019-235289, filed Dec 2019 (national)
International filing: PCT/JP2020/047470, filed December 18, 2020 (WO)