This application is based on Japanese Patent Applications No. 2015-23621 filed on Feb. 9, 2015, and No. 2015-236915 filed on Dec. 3, 2015, the disclosures of which are incorporated herein by reference.
The present disclosure relates to a vehicle display control device and a vehicle display unit including the same.
Conventionally, there has been widely known a head-up display (HUD) that projects a display image onto a projection member for transmitting outside scenery therethrough in a subject vehicle, thereby virtually displaying the display image in association with a front obstacle in the outside scenery. In order to control the virtual image display by the HUD, Patent Literatures 1 and 2 each disclose a vehicle display control technique for virtually displaying as a display image a highlighting image that highlights a front obstacle.
Specifically, in the technique disclosed in Patent Literature 1, a virtual image display position and a virtual image display size are controlled so that a highlighting image having an annular linear shape is superimposed on a front obstacle transmitted through a projection member. According to the technique, even when the virtual image display position of the highlighting image deviates within the range of a control error caused by, for example, disturbance, it is possible to maintain the association of the highlighting image with the front obstacle by a superimposed state.
However, in the technique disclosed in Patent Literature 1, part of the front obstacle is hidden behind the highlighting image, and a user may thus feel inconvenience. In view of this, in the technique disclosed in Patent Literature 2, a virtual image display position and a virtual image display size are controlled so that a highlighting image having a rectangular linear shape surrounds the entire periphery of a front obstacle transmitted through a projection member with a margin left between the highlighting image and the front obstacle. According to the technique, even when the virtual image display position of the highlighting image deviates within the range of a control error, it is possible to prevent part of the front obstacle from being hidden behind the highlighting image by the margin to reduce inconvenience to a user.
In the technique disclosed in Patent Literature 2, when the highlighting image deviates upward, leftward, or rightward with respect to the front obstacle, even if a user perceives the deviation, the front obstacle still looks as if it is pointed at by the highlighting image on the same plane, for the following reason. Typically, a space is present on the upper side, the left side, and the right side of the front obstacle. Thus, a user is less likely to feel separation in the front-rear direction with respect to the front obstacle in a linear portion of the rectangular linear highlighting image that extends right and left or up and down and is superimposed on the space.
However, in the technique disclosed in Patent Literature 2, when the highlighting image deviates downward with respect to the front obstacle, a user is likely to feel the deviation, and the front obstacle no longer looks as if it is pointed at by the highlighting image, for the following reason. The ground is present under the front obstacle. Thus, in a linear portion of the rectangular linear highlighting image that extends right and left and is superimposed on the ground, the deviation with respect to the front obstacle is likely to be conspicuous because the association with the ground calls a horizontal line to mind. When the front obstacle is a preceding vehicle, a horizontal line is particularly likely to be recalled in a linear portion extending right and left along a bumper of the preceding vehicle, and the deviation becomes especially conspicuous. Thus, in the linear portion extending right and left under the front obstacle, a user is likely to perceive a downward deviation as separation in the front-rear direction with respect to the obstacle. As a result, the association with the front obstacle becomes ambiguous, which may reduce the highlighting effect or give a user an illusion that the front obstacle has become separated.
Patent Literature 1: WO-2009/072366-A
Patent Literature 2: JP-2005-343351-A
It is an object of the present disclosure to provide a vehicle display control device that appropriately highlights a front obstacle by virtual image display of a highlighting image and a vehicle display unit including the same.
According to a first aspect of the present disclosure, a vehicle display control device that controls to display a virtual image in a subject vehicle equipped with a head-up display that displays the virtual image in association with at least one front obstacle in outside scenery by projecting a display image on a projection member for transmitting the outside scenery therethrough, includes: an image storage device that stores, as the display image, a highlighting image for highlighting the front obstacle by a linear portion having a virtual image display size surrounding the front obstacle with a margin spaced apart from the front obstacle at a virtual image display position corresponding to an entire range less than an entire circumference other than a lower side of a periphery of the front obstacle; and a virtual image display control device that is provided by at least one processor and controls the virtual image display position and the virtual image display size.
According to such a vehicle display control device, the highlighting image, as the display image that highlights the front obstacle in the outside scenery, is controlled to the virtual image display size at which the linear portion surrounds the front obstacle with the margin left, at the virtual image display position corresponding to the entire range of the periphery of the front obstacle other than its lower side, which is less than a full circumference. Thus, even if a user perceives a deviation with respect to the front obstacle, the front obstacle still looks as if it is pointed at by the highlighting image, which is superimposed on a space within the outside scenery on the upper side, left side, and right side of the front obstacle. The user is therefore less likely to feel separation in the front-rear direction with respect to the front obstacle.
Accordingly, even if the virtual image display position of the highlighting image deviates within the range of a control error caused by, for example, disturbance, it is possible to maintain the association of the highlighting image with the front obstacle, and also possible to avoid the illusion as if the front obstacle becomes separated. Further, even if the virtual image display position of the highlighting image deviates within the control error range, the margin formed by the linear portion prevents part of the front obstacle from being hidden behind the highlighting image, which enables reduction in inconvenience to a user. As described above, the vehicle display control device that can achieve an inconvenience reducing action in addition to an association maintaining action and an illusion avoidance action makes it possible to appropriately highlight the front obstacle by the virtual image display of the highlighting image.
According to a second aspect of the present disclosure, a vehicle display control device that controls to display a virtual image in a subject vehicle equipped with a head-up display that displays the virtual image in association with at least one front obstacle in outside scenery by projecting a display image on a projection member for transmitting the outside scenery therethrough, includes: an image storage device that stores, as the display image, a highlighting image for highlighting the front obstacle by a first linear portion, having a first virtual image display size surrounding the front obstacle with a margin spaced apart from the front obstacle at a first virtual image display position corresponding to an entire range less than an entire circumference other than a lower side of a periphery of the front obstacle, and a second linear portion, having a second virtual image display size surrounding the front obstacle with a margin spaced apart from the front obstacle and a lower brightness than the first linear portion at a second virtual image display position between opposing ends of the first linear portion in the periphery of the front obstacle; and a virtual image display control device that is provided by at least one processor and controls a virtual image display position including the first virtual image display position and the second virtual image display position and a virtual image display size including the first virtual image display size and the second virtual image display size.
According to such a vehicle display control device, the highlighting image, as the display image that highlights the front obstacle in the outside scenery, is controlled to the virtual image display size at which the first linear portion surrounds the front obstacle with the margin left, at the virtual image display position corresponding to the entire range of the periphery of the front obstacle other than its lower side, which is less than a full circumference. Thus, even if a user perceives a deviation with respect to the front obstacle, the front obstacle still looks as if it is pointed at by the first linear portion, which is superimposed on a space within the outside scenery on the upper side, left side, and right side of the front obstacle. The user is therefore less likely to feel separation in the front-rear direction with respect to the front obstacle.
Further, according to the above vehicle display control device, the highlighting image is controlled to the virtual image display size at which the second linear portion surrounds the front obstacle with the margin left, at the virtual image display position corresponding to the part of the periphery of the front obstacle between the opposing ends of the first linear portion. Even when the second linear portion, which has a lower brightness than the first linear portion, is superimposed on the ground present under the front obstacle, the fixation point of a user is likely to be focused on the first linear portion rather than the second linear portion. The lower brightness of the second linear portion thus weakens the association with the ground. Accordingly, the user is less likely to feel separation in the front-rear direction with respect to the front obstacle.
Accordingly, even if the virtual image display positions of the linear portions deviate within the range of a control error caused by, for example, disturbance, it is possible to maintain the association of the highlighting image with the front obstacle, and also possible to avoid the illusion as if the front obstacle becomes separated. Further, even if the virtual image display positions of the linear portions deviate within the control error range, the margins formed by the linear portions prevent part of the front obstacle from being hidden behind the highlighting image, which enables reduction in inconvenience to a user. As described above, the vehicle display control device that can achieve an inconvenience reducing action in addition to an association maintaining action and an illusion avoidance action makes it possible to appropriately highlight the front obstacle by the virtual image display of the highlighting image.
According to a third aspect of the present disclosure, a vehicle display unit includes: the vehicle display control device according to the first aspect or the second aspect; and the head-up display.
In such a vehicle display unit, the virtual image display position and the virtual image display size of the highlighting image by the HUD are controlled by the vehicle display control device of the first or second aspect. Thus, it is possible to appropriately highlight the front obstacle by the highlighting image.
The above and other objects, features and advantages of the present disclosure will become more apparent from the following detailed description made with reference to the accompanying drawings. In the drawings:
Hereinbelow, a plurality of embodiments of the present disclosure will be described with reference to the drawings. Corresponding elements in the respective embodiments may be denoted by the same reference signs to avoid repetitive description. In each of the embodiments, when only a part of a configuration is described, the configurations of the preceding embodiments can be applied to the other parts of the configuration. Further, in addition to the combinations of configurations clearly stated in the respective embodiments, configurations of a plurality of embodiments may be partially combined even if not clearly stated, as long as no problem arises from the combination.
A travel assist system 1 of a first embodiment to which the present disclosure is applied is mounted on a subject vehicle 2 as illustrated in
As illustrated in
The periphery monitoring system 3 is provided with an external sensor 30 and a periphery monitoring electronic control unit (ECU) 31. The external sensor 30 detects, for example, another vehicle, an artificial structure, a human and an animal, or a traffic sign present outside, as an obstacle that is present outside the subject vehicle 2 and may collide with the subject vehicle 2. The external sensor 30 includes, for example, one or more kinds selected from a sonar, a radar, and a camera.
Specifically, the sonar is an ultrasonic sensor that is installed, for example, in a front part or a rear part of the subject vehicle 2. The sonar receives reflected waves of ultrasonic waves transmitted to a detection area outside the subject vehicle 2 to detect an obstacle within the detection area, and thereby outputs a detection signal. The radar is a millimeter wave sensor or a laser sensor that is installed, for example, in the front part or the rear part of the subject vehicle 2. The radar receives reflected waves of millimeter or submillimeter waves or laser beams transmitted to the detection area outside the subject vehicle 2 to detect an obstacle within the detection area, and thereby outputs a detection signal. The camera is a monocular or compound-eye camera that is installed, for example, in a rearview mirror or a door mirror of the subject vehicle 2. The camera captures an image of the detection area outside the subject vehicle 2 to detect an obstacle or a traffic sign within the detection area, and thereby outputs an image signal.
The periphery monitoring ECU 31 mainly includes a microcomputer including a processor and a memory, and is connected to the external sensor 30 and the in-vehicle network 6. The periphery monitoring ECU 31 acquires, for example, sign information such as a speed limit sign and a lane sign, and lane marking information such as a white line and a yellow line, on the basis of an output signal of the external sensor 30. In addition, the periphery monitoring ECU 31 acquires, for example, obstacle information such as the type of an obstacle, a moving direction and a moving speed of a front obstacle 8b (see
The vehicle control system 4 is provided with a vehicle state sensor 40, an occupant sensor 41, and a vehicle control ECU 42. The vehicle state sensor 40 is connected to the in-vehicle network 6. The vehicle state sensor 40 detects a traveling state of the subject vehicle 2. The vehicle state sensor 40 includes, for example, one or more kinds selected from a vehicle speed sensor, an engine speed sensor, a steering angle sensor, a fuel sensor, a water temperature sensor, and a radio receiver.
Specifically, the vehicle speed sensor detects a vehicle speed of the subject vehicle 2 and thereby outputs a vehicle speed signal corresponding to the detection. The engine speed sensor detects an engine speed in the subject vehicle 2 and thereby outputs an engine speed signal corresponding to the detection. The steering angle sensor detects a steering angle of the subject vehicle 2 and thereby outputs a steering angle signal corresponding to the detection. The fuel sensor detects a remaining fuel amount in a fuel tank of the subject vehicle 2 and thereby outputs a fuel signal corresponding to the detection. The water temperature sensor detects a cooling water temperature in an internal combustion engine in the subject vehicle 2 and thereby outputs a water temperature signal corresponding to the detection. The radio receiver receives, for example, output radio waves from a positioning satellite, a transmitter of another vehicle for vehicle-vehicle communication, and a roadside machine for road-vehicle communication, and thereby outputs a traffic signal. The traffic signal is, for example, a signal representing traffic information relating to the subject vehicle 2, such as a traveling position, a traveling direction, a traveling road state, and a speed limit, or a signal representing the above obstacle information.
The occupant sensor 41 is connected to the in-vehicle network 6. The occupant sensor 41 detects a state or an operation of a user inside a vehicle cabin 2c of the subject vehicle 2 illustrated in
Specifically, the power switch is turned on by a user inside the vehicle cabin 2c for starting the internal combustion engine or a motor generator of the subject vehicle 2 and thereby outputs a power signal corresponding to the turn-on operation. The user state monitor captures an image of a state of a user on a driver's seat 20 inside the vehicle cabin 2c using an image sensor to detect the user state and thereby outputs an image signal. The display setting switch is operated by a user for setting a display state inside the vehicle cabin 2c and thereby outputs a display setting signal corresponding to the operation. The turn switch is turned on by a user inside the vehicle cabin 2c for actuating a direction indicator of the subject vehicle 2 and thereby outputs a turn signal corresponding to the turn-on operation.
The cruise control switch is turned on by a user inside the vehicle cabin 2c for automatically controlling the following distance of the subject vehicle 2 with respect to a preceding vehicle as the front obstacle 8b or the vehicle speed of the subject vehicle 2 and thereby outputs a cruise control signal corresponding to the turn-on operation. The lane control switch is turned on by a user inside the vehicle cabin 2c for automatically controlling a width-direction position of the subject vehicle 2 in a traveling lane and thereby outputs a lane control signal corresponding to the turn-on operation.
The vehicle control ECU 42 illustrated in
includes one or more kinds of ECUs selected from an engine control ECU, a motor control ECU, a brake control ECU, a steering control ECU, and an integrated control ECU, and includes at least the integrated control ECU.
Specifically, the engine control ECU controls actuation of a throttle actuator and a fuel injection valve of the internal combustion engine in accordance with an operation of an acceleration pedal 26 inside the vehicle cabin 2c illustrated in
In particular, the integrated control ECU of the present embodiment performs full speed range adaptive cruise control (FSRA) for automatically controlling the following distance and the vehicle speed of the subject vehicle 2 in a full speed range when the cruise control switch is turned on. The integrated control ECU mounted on the subject vehicle 2 as a “following distance control unit” that performs the FSRA controls actuation of the engine control ECU or the motor control ECU and actuation of the brake control ECU on the basis of acquired information in the periphery monitoring ECU 31 and an output signal of the radio receiver.
The integrated control ECU of the present embodiment performs lane keeping assist (LKA) for restricting a departure of the subject vehicle 2 from the white line or the yellow line to automatically control the width-direction position in the traveling lane when the lane control switch is turned on. The integrated control ECU mounted on the subject vehicle 2 also as a “lane control unit” that performs LKA controls actuation of the steering control ECU on the basis of acquired information in the periphery monitoring ECU 31 and an output signal of the radio receiver.
The display system 5 as a “vehicle display unit” is mounted on the subject vehicle 2 for visually presenting information. The display system 5 is provided with an HUD 50, a multi-function display (MFD) 51, a combination meter 52, and a human machine interface (HMI) control unit (HCU) 54.
The HUD 50 is installed in an instrument panel 22 inside the vehicle cabin 2c illustrated in
As illustrated in
In addition to such display of the highlighting image 560, for example, display of an image representing one or more kinds of information selected from navigation information, sign information, and obstacle information may be employed as virtual image display by the HUD 50. Further, virtual image display can also be achieved by using a light transmissive combiner that is disposed on the instrument panel 22 and transmits the outside scenery 8 therethrough in cooperation with the windshield 21, and by projecting the display image 56 on the combiner. The above navigation information can be acquired, for example, in the HCU 54 (described in detail below) on the basis of map information stored in a memory 54m and an output signal of the vehicle state sensor 40.
The MFD 51 is installed in a center console 23 inside the vehicle cabin 2c illustrated in
The combination meter 52 is installed in the instrument panel 22 inside the vehicle cabin 2c. The combination meter 52 displays vehicle information relating to the subject vehicle 2 so as to be visually recognizable by a user on the driver's seat 20. The combination meter 52 is a digital meter that displays vehicle information as an image formed on a liquid crystal panel or an analog meter that displays vehicle information by indicating scales by an indicator. For example, display representing one or more kinds of information selected from the vehicle speed, the engine speed, the remaining fuel amount, the cooling water temperature, and an operation state of the turn switch, the cruise control switch and the lane control switch is employed as such display by the combination meter 52.
The HCU 54 illustrated in
In particular, in the present embodiment, data of the display image 56 including the highlighting image 560 is stored in the memory 54m as an “image storage device”, so that the HCU 54 functions as a “vehicle display control device”. Specifically, the HCU 54 executes a display control program using the processor 54p to achieve a display control flow for reading the highlighting image 560 from the memory 54m and displaying the read highlighting image 560 as illustrated in
In S101 of the display control flow, it is determined whether one front obstacle 8b to be highlighted by the highlighting image 560 to call attention has been detected. Specifically, the determination in S101 is made on the basis of, for example, one or more kinds of information selected from obstacle information acquired by the periphery monitoring ECU 31 and obstacle information represented by an output signal of the radio receiver as the occupant sensor 41. While negative determination is made in S101, S101 is repeatedly executed. On the other hand, when positive determination is made in S101, a shift to S102 is made.
In the following S102, required information I for virtually displaying the highlighting image 560 is acquired. Specifically, the required information I includes, for example, one or more kinds selected from acquired information in the periphery monitoring ECU 31 and information based on output signals of the sensors 40, 41. Examples of the acquired information in the periphery monitoring ECU 31 include obstacle information. Examples of the information based on an output signal of the vehicle state sensor 40 include a vehicle speed represented by an output signal of the vehicle speed sensor and a steering angle represented by an output signal of the steering angle sensor. Examples of the information based on the occupant sensor 41 include a set value of a display state represented by an output signal of the display setting switch, a user state such as an eyeball state represented by an output signal of the user state monitor, and traffic information and obstacle information represented by an output signal of the radio receiver.
In the following S103, the virtual image display position α and the virtual image display size β of the highlighting image 560 are set on the basis of the required information I acquired in S102. Specifically, a fixation point or a fixation line obtained when a user fixes his/her eyes on the front obstacle 8b detected in S101 is first estimated on the basis of the required information I. Then, the virtual image display position α is set over the entire range of the periphery of the front obstacle 8b other than its lower side, on the estimated fixation point or fixation line. Further, the virtual image display size β is set so as to form the linear portion 560p with the margin 560m left with respect to the front obstacle 8b.
In the following S104, display data for virtually displaying the highlighting image 560 with the virtual image display position α and the virtual image display size β set in S103 is generated. At this time, the display data is generated by applying image processing to data of the highlighting image 560 read from the memory 54m.
In the following S105, the display data generated in S104 is provided to the HUD 50 to form the highlighting image 560 by the display device 50i, thereby controlling the virtual image display position α and the virtual image display size β of the linear portion 560p. As a result, as illustrated in
In the first embodiment as described above, part of the HCU 54 that executes S101, S102, S103, S104, and S105 corresponds to a “virtual image display control device” constructed by the processor 54p.
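The geometric setting performed in S103 can be sketched as follows. This is a minimal illustration, assuming the front obstacle is represented by a bounding box on the display plane and that the margin 560m is derived from a fixed margin ratio; all function names, the record layout, and the margin value are hypothetical and are not part of the actual HCU 54 implementation.

```python
# Hypothetical sketch of S103: setting the virtual image display
# position (alpha) and size (beta) so that the linear portion
# surrounds the front obstacle with a margin, except its lower side.
# The bounding-box representation and the margin ratio are assumptions.

def set_highlight_geometry(bbox, margin_ratio=0.25):
    """bbox = (x, y, width, height) of the obstacle on the display
    plane, with y increasing downward. Returns (alpha, beta): the
    top-left corner of the highlighting frame and its overall size."""
    x, y, w, h = bbox
    mx, my = w * margin_ratio, h * margin_ratio   # margin on each side
    alpha = (x - mx, y - my)                      # virtual image display position
    beta = (w + 2 * mx, h + 2 * my)               # virtual image display size
    return alpha, beta

def frame_segments(alpha, beta):
    """Line segments of the linear portion: the upper side and the
    left and right sides; no segment is drawn along the lower side."""
    (x0, y0), (bw, bh) = alpha, beta
    top = ((x0, y0), (x0 + bw, y0))
    left = ((x0, y0), (x0, y0 + bh))
    right = ((x0 + bw, y0), (x0 + bw, y0 + bh))
    return [top, left, right]
```

For example, an obstacle box of width 40 and height 20 with a margin ratio of 0.25 yields a frame of size (60.0, 30.0) whose lower side is left open, so the ground under the obstacle remains unobscured.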
The action and effect of the first embodiment described hereinabove will be described below.
The highlighting image 560 that highlights the front obstacle 8b in the outside scenery 8 is controlled to the virtual image display size β at which the linear portion 560p surrounds the front obstacle 8b with the margin 560m left, at the virtual image display position α corresponding to the entire range of the periphery of the front obstacle 8b other than its lower side. Thus, as illustrated in
Accordingly, even if the virtual image display position α of the highlighting image 560 deviates within a control error range, it is possible to maintain the association of the highlighting image 560 with the front obstacle 8b, and also possible to avoid the illusion as if the front obstacle 8b becomes separated. Further, even if the virtual image display position α of the highlighting image 560 deviates within the control error range, the margin 560m formed by the linear portion 560p prevents part of the front obstacle 8b from being hidden behind the highlighting image 560, which enables reduction in inconvenience to a user. As described above, the first embodiment that can achieve an inconvenience reducing action in addition to an association maintaining action and an illusion avoidance action makes it possible to appropriately highlight the front obstacle 8b by the virtual image display of the highlighting image 560.
As illustrated in
As described above, since the virtual image display position α and the virtual image display size β of the highlighting image 560 displayed by the HUD 50 are controlled by the HCU 54, it is possible to appropriately highlight the front obstacle 8b by the highlighting image 560.
A second embodiment of the present disclosure is a modification of the first embodiment. As illustrated in
In S2101, it is determined whether one vehicle immediately ahead that travels in the same lane and in the same direction as the subject vehicle 2 has been detected as the front obstacle 8b under automatic following distance control by FSRA of the integrated control ECU in the vehicle control ECU 42. Specifically, the determination in S2101 is made on the basis of, for example, one or more kinds selected from control information of the integrated control ECU, obstacle information represented by an output signal of the radio receiver, and sign information, lane marking information and obstacle information acquired by the periphery monitoring ECU 31. While negative determination is made in S2101, a return to S2100 is made. On the other hand, when positive determination is made in S2101, a return to S2100 is made after the execution of S102, S103, S104, and S105. Note that when negative determination is made in S2100 or S2101 immediately after the return from S105, the virtual image display of the highlighting image 560 is finished.
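The determination in S2101 can be sketched as a selection predicate. This is a hedged illustration under the assumption that detected obstacles are available as simple records with lane, direction, and distance fields; the function name and the record layout are hypothetical stand-ins, not the actual interface of the integrated control ECU or the periphery monitoring ECU 31.

```python
# Hypothetical sketch of S2101: select the one vehicle immediately
# ahead that travels in the same lane and the same direction as the
# subject vehicle, but only while FSRA following distance control is
# active. Obstacle records are assumed to be dicts with "lane",
# "same_direction", and "distance" keys.

def select_preceding_vehicle(obstacles, fsra_active, own_lane):
    if not fsra_active:
        return None  # negative determination: FSRA is not in operation
    candidates = [o for o in obstacles
                  if o["lane"] == own_lane and o["same_direction"]]
    if not candidates:
        return None  # no preceding vehicle in the same lane and direction
    # the immediately preceding vehicle is the nearest candidate ahead
    return min(candidates, key=lambda o: o["distance"])
```

When this returns `None`, the flow would return to S2100 without highlighting; otherwise the selected vehicle would be highlighted through S102 to S105.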
As described above, in the second embodiment, the following distance of the subject vehicle 2 with respect to a preceding vehicle as the front obstacle 8b is automatically controlled similarly to the first embodiment. Thus, the position α and the size β of the highlighting image 560 are controlled as illustrated in
A third embodiment of the present disclosure is a modification of the first embodiment. As illustrated in
In S3101, it is determined whether one vehicle immediately ahead that travels in the same or a different lane and in the same direction as the subject vehicle 2 has been detected as the front obstacle 8b under automatic control by LKA of the integrated control ECU in the vehicle control ECU 42. Specifically, the determination in S3101 is made on the basis of, for example, one or more kinds selected from control information of the integrated control ECU, obstacle information represented by an output signal of the radio receiver, and sign information, lane marking information and obstacle information acquired by the periphery monitoring ECU 31. While negative determination is made in S3101, a return to S3100 is made. On the other hand, when positive determination is made in S3101, a return to S3100 is made after the execution of S102, S103, S104, and S105. Note that when negative determination is made in S3100 or S3101 immediately after the return from S105, the virtual image display of the highlighting image 560 is finished.
As described above, in the third embodiment, the width-direction position of the subject vehicle 2 in the traveling lane is automatically controlled similarly to the first embodiment. Thus, the position α and the size β of the highlighting image 560 are controlled as illustrated in
A fourth embodiment of the present disclosure is a modification of the first embodiment. As illustrated in
A first virtual image display size β1, which is the size of the first linear portion 4560p1, is variably set so that the first linear portion continuously surrounds the front obstacle 8b at the first virtual image display position α1 corresponding to the entire range of the periphery of the front obstacle 8b other than its lower side. In addition, the virtual image display size β1 of the first linear portion 4560p1 is variably set so as to leave a margin 4560m1, which allows a user to directly visually recognize the outside scenery 8 other than the front obstacle 8b, between the first linear portion 4560p1 and the front obstacle 8b on the inner peripheral side. A virtual image display color of the first linear portion 4560p1 is fixedly set, or variably set by a user, to a translucent color that allows the part superimposed on the outside scenery 8 to remain visible and thereby reduces inconvenience, in a predetermined high-brightness color tone that highlights the front obstacle 8b so as to call the user's attention to it. For example, the virtual image display color of the first linear portion 4560p1 is set to light yellow, light red, light green, or light amber.
On the other hand, a second virtual image display size β2, which is the size of the second linear portion 4560p2, is variably set so as to continuously surround the front obstacle 8b at the second virtual image display position α2 between the opposite ends of the first linear portion 4560p1 at the lower side of the periphery of the front obstacle 8b. In addition, the virtual image display size β2 of the second linear portion 4560p2 is variably set so as to leave, between the second linear portion 4560p2 and the front obstacle 8b on the inner peripheral side, a margin 4560m2 that allows a user to directly visually recognize the outside scenery 8 except the front obstacle 8b. A virtual image display color of the second linear portion 4560p2 is fixedly set, or variably set by a user, to a translucent color that keeps the part superimposed on the outside scenery 8 visible and thus reduces inconvenience, and to a predetermined color tone having a lower brightness than the first linear portion 4560p1. For example, the virtual image display color of the second linear portion 4560p2 is set to dark yellow, dark red, dark green, or dark amber. The color tones of the respective linear portions 4560p1, 4560p2 may be set to similar or dissimilar color tones. The brightness of the linear portions 4560p1, 4560p2 is adjusted by setting their gradation values so that the brightness value of the brightness signal is lower at the second linear portion 4560p2 than at the first linear portion 4560p1.
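The brightness relation between the two linear portions, gradation values chosen so that the brightness signal is lower at 4560p2 than at 4560p1, could be sketched as below. The RGB representation, the 0-255 gradation scale, and the dimming ratio are assumptions for illustration, not taken from the disclosure.

```python
# Hypothetical sketch: deriving gradation values so that the second
# linear portion 4560p2 is rendered darker than the first linear
# portion 4560p1. Color values and the 0-255 scale are assumptions.

def gradation_values(base_color, dim_ratio=0.5):
    """Return (first, second) RGB gradation tuples; the second linear
    portion receives a lower brightness value than the first."""
    first = base_color
    # Scale each channel down so the brightness signal is lower
    # at 4560p2 than at 4560p1.
    second = tuple(int(c * dim_ratio) for c in base_color)
    return first, second

# e.g. a light amber for 4560p1 yields a dark amber for 4560p2
p1, p2 = gradation_values((255, 191, 0))
```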
As illustrated in
In the following S4104, display data for virtually displaying the linear portions 4560p1, 4560p2 with the virtual image display positions α1, α2 and the virtual image display sizes β1, β2 set in S4103 is generated. At this time, the display data is generated by applying image processing to data of the highlighting image 4560 read from the memory 54m.
In the following S4105, the display data generated in S4104 is provided to the HUD 50 to form the highlighting image 4560 by the display device 50i, thereby controlling the virtual image display positions α1, α2 and the virtual image display sizes β1, β2 of the linear portions 4560p1, 4560p2. As a result, the highlighting image 4560 is visually recognized with the first virtual image display size β1 surrounding the front obstacle 8b with the margin 4560m1 left by the first linear portion 4560p1 at the first virtual image display position α1 corresponding to the entire range less than a circle except the lower side of the periphery of the front obstacle 8b. In addition, the highlighting image 4560 is visually recognized with the second virtual image display size β2 surrounding the front obstacle 8b with the margin 4560m2 left by the second linear portion 4560p2 having a lower brightness than the first linear portion 4560p1 at the second virtual image display position α2 corresponding to the lower side of the periphery of the front obstacle 8b. Thereafter, in the display control flow, a return to S101 is made. When negative determination is made in S101 immediately after the return, the virtual image display of the highlighting image 4560 is finished.
In the fourth embodiment as described above, part of the HCU 54 that executes S101, S102, S4103, S4104, and S4105 corresponds to the “virtual image display control device” constructed by the processor 54p.
The action and effect of the fourth embodiment described hereinabove will be described below.
The highlighting image 4560 that highlights the front obstacle 8b is controlled to the virtual image display size β1 surrounding the front obstacle 8b with the margin 4560m1 left by the first linear portion 4560p1 at the first virtual image display position α1 corresponding to the entire range less than a circle except the lower side of the periphery of the front obstacle 8b. Thus, as illustrated in
Further, the highlighting image 4560 is controlled to the virtual image display size β2 surrounding the front obstacle 8b with the margin 4560m2 left by the second linear portion 4560p2 at the second virtual image display position α2 corresponding to a part of the periphery of the front obstacle 8b between the opposite ends of the first linear portion 4560p1. As illustrated in
Accordingly, even if the virtual image display positions α1, α2 of the linear portions 4560p1, 4560p2 deviate within a control error range, it is possible to maintain the association of the highlighting image 4560 with the front obstacle 8b, and also possible to avoid the illusion as if the front obstacle 8b becomes separated. Further, even if the virtual image display positions α1, α2 of the linear portions 4560p1, 4560p2 deviate within the control error range, the margins 4560m1, 4560m2 formed by the linear portions 4560p1, 4560p2 prevent part of the front obstacle 8b from being hidden behind the highlighting image 4560, which enables inconvenience to a user to be reduced. As described above, the fourth embodiment that can achieve an inconvenience reducing action in addition to an association maintaining action and an illusion avoidance action makes it possible to appropriately highlight the front obstacle 8b by the virtual image display of the highlighting image 4560.
Further, even when the second linear portion 4560p2 that curvedly extends between the opposite ends of the first linear portion 4560p1 at the second virtual image display position α2 is superimposed on the ground 4008g, not only is the fixation point of a user less likely to be focused thereon due to its low brightness, but the user is also less likely to recall the horizontal line. Thus, when the virtual image display positions α1, α2 of the linear portions 4560p1, 4560p2 deviate within the control error range, the association of the second linear portion 4560p2 with the ground 4008g is weakened even at the lower side of the front obstacle 8b, and it is thus possible to divert the fixation point from the second linear portion 4560p2. Accordingly, it is possible to reliably exhibit the association maintaining action and the illusion avoidance action. Thus, the front obstacle 8b can be appropriately highlighted.
As described above, since the virtual image display positions α1, α2 and the virtual image display sizes β1, β2 of the highlighting image 4560 displayed by the HUD 50 are controlled by the HCU 54, it is possible to appropriately highlight the front obstacle 8b by the highlighting image 4560.
A fifth embodiment of the present disclosure is a modification of the first embodiment. As illustrated in
In S5101a, it is determined whether at least one front obstacle 8b to be highlighted by the highlighting image 560 to call attention has been detected. The determination at this time is made in the same manner as S101. While negative determination is made in S5101a, S5101a is repeatedly executed. On the other hand, when positive determination is made in S5101a, a shift to S5101b is made.
In S5101b, it is determined whether a plurality of front obstacles 8b have been detected in S101. As the result, when negative determination is made, S102, S103, S104, and S105 are executed as processing for the single front obstacle 8b. On the other hand, when positive determination is made, S5102, S5103, S5104, and S5105 are executed as individual processing for each of the front obstacles 8b.
In S5102, required information I for virtually displaying the highlighting image 560 is individually acquired for each front obstacle 8b detected in S101. At this time, the required information I for each front obstacle 8b is acquired in the same manner as in S102.
In the following S5103, the virtual image display position α and the virtual image display size β of the highlighting image 560 are individually set for each front obstacle 8b detected in S101. At this time, based on the required information I for each front obstacle 8b acquired in S5102, the virtual image display size β is set to be smaller as the front obstacle 8b to be highlighted by the highlighting image 560 is farther from the subject vehicle 2 as illustrated in
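The distance-dependent sizing of S5103, with the virtual image display size β set smaller as the highlighted front obstacle 8b lies farther from the subject vehicle 2, might look like the following sketch. The reference distance, the minimum-size clamp, and the function name are hypothetical; the disclosure states only the monotonic relation.

```python
# Hedged sketch: virtual image display size β shrinks with distance
# from the subject vehicle. Reference distance and minimum clamp are
# assumed values, not from the disclosure.

def display_size(base_size, distance_m, ref_distance_m=10.0, min_size=0.2):
    """Scale β inversely with distance, clamped to a minimum so that a
    far obstacle still retains a visible highlighting function."""
    scale = ref_distance_m / max(distance_m, ref_distance_m)
    return max(base_size * scale, min_size)
```

The clamp reflects the effect described above: a far obstacle receives a small, but never vanishing, highlighting image.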
In the following S5104, as illustrated in
In the following S5105, the display data generated in S5104 is provided to the HUD 50 to form the highlighting image 560 by the display device 50i. Accordingly, the virtual image display position α and the virtual image display size β of the linear portion 560p of the highlighting image 560 are individually controlled for each front obstacle 8b detected in S101. As a result, as illustrated in
In the display control flow, a return to S5101a is made after the execution of S5105. When negative determination is made in S5101a immediately after the return, virtual image display of all the highlighting images 560 is finished. When positive determination is made in S5101a immediately after the return and negative determination is made in S5101b, virtual image display of the highlighting image 560 for the front obstacle 8b that becomes undetected is finished, but virtual image display of the highlighting image 560 for the front obstacle 8b that remains detected is continued. Note that a return to S5101a is made also after the execution of S105.
As described above, according to the fifth embodiment, the highlighting images 560 that individually highlight the plurality of front obstacles 8b are controlled to a smaller size as the front obstacle 8b to be highlighted is farther from the subject vehicle 2. Accordingly, it is possible to increase the degree of highlighting by the highlighting image 560 having a large size for the front obstacle 8b that is close to the subject vehicle 2 and thus requires particular attention and, at the same time, ensure a highlighting function by the highlighting image 560 having a small size also for the front obstacle 8b that is far from the subject vehicle 2. Thus, highlighting of the plurality of obstacles 8b by the respective highlighting images 560 can be appropriately performed in a prioritized manner. In the above fifth embodiment, part of the HCU 54 that executes S5101a, S5101b, S102, S103, S104, S105, S5102, S5103, S5104, and S5105 corresponds to the “virtual image display control device” constructed by the processor 54p.
A sixth embodiment of the present disclosure is a modification of the fifth embodiment. As illustrated in
In S5203a, the virtual image display position α and the virtual image display size β of the highlighting image 560 are individually set for each front obstacle 8b detected in S101. At this time, the virtual image display position α and the virtual image display size β are set in the same manner as S5103.
In the following S5203b, a virtual image display shape γ of the highlighting image 560 is individually set for each front obstacle 8b detected in S101. At this time, as illustrated in
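A per-type shape table is one plausible way to realize S5203b, where the virtual image display shape γ is varied according to the type of the front obstacle 8b. The type names and shape labels below are hypothetical; the disclosure states only that γ varies with type.

```python
# Sketch only: selecting virtual image display shape γ per obstacle
# type. Both the type names and the shape labels are assumptions.

SHAPE_BY_TYPE = {
    "vehicle": "arc",           # curved circular-arc linear portion
    "pedestrian": "inverted_u",
    "cyclist": "bracket",
}

def shape_for(obstacle_type, default="arc"):
    """Return the shape label γ for a detected obstacle type,
    falling back to a default shape for unknown types."""
    return SHAPE_BY_TYPE.get(obstacle_type, default)
```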
In the following S5204, as illustrated in
In the following S5205, the display data generated in S5204 is provided to the HUD 50 to form the highlighting image 560 by the display device 50i, thereby controlling the virtual image display position α, the virtual image display size β, and the virtual image display shape γ of the linear portion 560p. As a result, as illustrated in
As described above, according to the sixth embodiment, the virtual image display shapes γ of the highlighting images 560 that individually highlight the plurality of front obstacles 8b are varied according to the type of the front obstacle 8b to be highlighted. Accordingly, a user can determine the type of each front obstacle 8b from the virtual image display shape γ of the corresponding highlighting image 560. Thus, it is possible to enhance the association of the highlighting images 560 with the respective obstacles 8b to thereby appropriately highlight these obstacles 8b. In the above sixth embodiment, part of the HCU 54 that executes S5101a, S5101b, S102, S103, S104, S105, S5102, S5203a, S5203b, S5204, and S5205 corresponds to the “virtual image display control device” constructed by the processor 54p.
A seventh embodiment of the present disclosure is a modification of the fifth embodiment. As illustrated in
In S5303a, the virtual image display position α and the virtual image display size β of the highlighting image 560 are individually set for each front obstacle 8b detected in S101. At this time, the virtual image display position α and the virtual image display size β are set in the same manner as in S5103.
For example, an intersection or a city area has many points that are desired to be closely watched by a user. Thus, in the following S5303b, it is determined whether virtual image display positions α of the highlighting images 560 set for the respective front obstacles 8b in S5303a are superimposed on one another. As the result, when positive determination is made, a shift to S5303c is made.
In S5303c, based on required information I acquired in S5102, the virtual image display shape γ is changed in one of the highlighting images 560 whose virtual image display positions α are superimposed as illustrated in
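The cut at the superimposed point P, with the highlighting image for the farther obstacle giving way to the nearer one, can be sketched in one dimension as below. Modelling each linear portion as a simple span, as well as the dictionary keys, are assumptions made for illustration.

```python
# Minimal sketch under assumptions: when two highlighting images'
# display positions superimpose, the image for the farther obstacle
# is cut at the overlap (point P). Spans are 1-D for simplicity.

def cut_farther(images):
    """images: list of two dicts with 'distance' and 'span' = (lo, hi).
    Trims the overlapping part off the farther image's span in place
    and returns (near, far)."""
    images = sorted(images, key=lambda im: im["distance"])
    near, far = images[0], images[1]
    lo = max(near["span"][0], far["span"][0])
    hi = min(near["span"][1], far["span"][1])
    if lo < hi:  # the spans superimpose: cut the farther image at P
        if far["span"][0] < lo:
            far["span"] = (far["span"][0], lo)
        else:
            far["span"] = (hi, far["span"][1])
    return near, far
```

The nearer image keeps its full span, preserving the stronger degree of highlighting described for the closer obstacle.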
In the following S5304, as illustrated in
In the following S5305, the display data generated in S5304 is provided to the HUD 50 to form the highlighting image 560 by the display device 50i, thereby controlling the virtual image display position α, the virtual image display size β, and the virtual image display shape γ of the linear portion 560p. As a result, as illustrated in
In the display control flow, a return to S5101a is made after the execution of S5305. On the other hand, when negative determination is made in S5303b, S5104 and S5105 described in the fifth embodiment are executed prior to the return to S5101a without the execution of S5303c, S5304, and S5305. In this case, each highlighting image 560 for each front obstacle 8b is visually recognized with the position α and the size β similar to S5105.
As described above, according to the seventh embodiment, when the virtual image display positions α of the highlighting images 560 that individually highlight the plurality of front obstacles 8b are superimposed on one another, the virtual image display of the highlighting image 560 that highlights the front obstacle 8b farther from the subject vehicle 2 is cut at the superimposed point P. Accordingly, it is possible to increase the degree of highlighting by the highlighting image 560 having no cut for the front obstacle 8b that is close to the subject vehicle 2 and thus requires particular attention and, at the same time, ensure a highlighting function also for the front obstacle 8b farther from the subject vehicle 2 by the cut highlighting image 560. Further, inconvenience to a user caused by the superimposed virtual image display positions α can be reduced. Thus, highlighting of the plurality of obstacles 8b by the respective highlighting images 560 can be appropriately performed in a prioritized manner. In the above seventh embodiment, part of the HCU 54 that executes S5101a, S5101b, S102, S103, S104, S105, S5102, S5303a, S5303b, S5303c, S5304, S5305, S5104, and S5105 corresponds to the “virtual image display control device” constructed by the processor 54p.
An eighth embodiment of the present disclosure is a modification of the sixth embodiment. As illustrated in
In S5403a, the virtual image display position α and the virtual image display size β of the highlighting image 560 are individually set for each front obstacle 8b detected in S101. At this time, the virtual image display position α and the virtual image display size β are set in the same manner as in S5103.
In the following S5403b, the virtual image display shape γ of the highlighting image 560 is individually set for each front obstacle 8b detected in S101. At this time, the virtual image display shape γ of each highlighting image 560 is set so as to limit a virtual image display range of the linear portion 560p to a range except both lateral sides in addition to the lower side in the periphery of the front obstacle 8b to be highlighted as illustrated in
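The limited virtual image display range of S5403b, with the linear portion 560p confined to the periphery above the obstacle and excluding the lower side and both lateral sides, reduces to an angle computation such as the following sketch. The 60° lateral gap is an assumed value; the disclosure does not specify the angles.

```python
# Sketch (assumed angles): limit the linear portion 560p to the upper
# range of the obstacle periphery, excluding the lower side and both
# lateral sides, so that neighboring highlighting images overlap less.

def upper_arc(lateral_gap_deg=60.0):
    """Return (start_deg, end_deg) of the arc kept above the obstacle;
    angles below the horizontal (the lower side) are excluded entirely,
    and `lateral_gap_deg` is shaved off each lateral side."""
    start = lateral_gap_deg / 2.0            # just above one lateral side
    end = 180.0 - lateral_gap_deg / 2.0      # just above the other
    return start, end
```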
As illustrated in
As described above, according to the eighth embodiment, the virtual image display of each of the highlighting images 560 that individually highlights the plurality of front obstacles 8b is limited to the range except not only the lower side, but also the lateral sides in the periphery of the front obstacle 8b to be highlighted. This makes the virtual image display positions α of the highlighting images 560 corresponding to the respective obstacles 8b less likely to be superimposed on one another. Thus, it is possible not only to individually associate the highlighting images 560 with the respective obstacles 8b, but also to reduce inconvenience to a user caused by such superimposition. Therefore, it is possible to more appropriately highlight the plurality of obstacles 8b by the respective highlighting images 560. In the above eighth embodiment, part of the HCU 54 that executes S5101a, S5101b, S102, S103, S104, S105, S5102, S5403a, S5403b, S5204, and S5205 corresponds to the “virtual image display control device” constructed by the processor 54p.
A ninth embodiment of the present disclosure is a modification of the second embodiment. As illustrated in
In S6101, it is determined whether the preceding vehicle in the same lane, once detected as the front obstacle 8b in S2101, has been lost under automatic following distance control by FSRA. It is assumed that such detection loss occurs not only when the preceding vehicle moves to a lane different from the lane of the subject vehicle 2 and thus becomes undetected, but also when the preceding vehicle erroneously becomes undetected due to disturbance even while remaining in the same lane.
When negative determination is made in S6101, a return to S101 is made. When positive determination is made in S6101, a shift to S6103 is made. In S6103, as illustrated in
In the following S6104, as illustrated in
In the following S6105, the display data generated in S6104 is provided to the HUD 50 to form the highlighting image 560 by the display device 50i, thereby controlling the virtual image display position α, the virtual image display size β, and the virtual image display brightness δ of the linear portion 560p. As a result, as illustrated in
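S6103 through S6105 amount to dimming the virtual image display brightness δ of the highlighting image when detection of the preceding vehicle is lost. The sketch below covers both the partial dimming of this ninth embodiment and the full dimming of the tenth embodiment described later. The segment naming, the alternating-segment rule for partial dimming, and the 0.4 factor are all assumptions.

```python
# Hedged sketch: reduce brightness δ of part (partial=True) or all
# (partial=False) of a highlighting image when the once-detected
# preceding vehicle is lost. Segment names and factor are assumptions.

def apply_lost_state(segment_brightness, lost, partial=True, factor=0.4):
    """segment_brightness: {segment_name: δ}. Returns a dimmed copy;
    the input is left unchanged."""
    if not lost:
        return dict(segment_brightness)
    dimmed = {}
    for i, (name, delta) in enumerate(sorted(segment_brightness.items())):
        # partial: dim every other segment; full: dim every segment
        if not partial or i % 2 == 0:
            dimmed[name] = delta * factor
        else:
            dimmed[name] = delta
    return dimmed
```

The brightness change, rather than removal of the image, is what lets the user intuitively notice the detection lost state while still seeing the obstacle.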
As described above, according to the ninth embodiment, when the front obstacle 8b once detected by the subject vehicle 2 has been lost, the virtual image display brightness δ of the highlighting image 560 that highlights the lost front obstacle 8b is partially reduced. Accordingly, even when a user can visually recognize the front obstacle 8b, the user can intuitively understand a detection lost state of the subject vehicle 2 from a change in the brightness of the highlighting image 560. Thus, it is possible to ensure the safety and security for a user using the highlighting image 560. In the above ninth embodiment, part of the HCU 54 that executes S2100, S2101, S102, S103, S104, S105, S6101, S6103, S6104, and S6105 corresponds to the “virtual image display control device” constructed by the processor 54p.
A tenth embodiment of the present disclosure is a modification of the ninth embodiment. As illustrated in
In S6203, as illustrated in
In the display control flow, as illustrated in
As described above, according to the tenth embodiment, when the front obstacle 8b once detected by the subject vehicle 2 has been lost, the virtual image display brightness δ of the entire highlighting image 560 that highlights the lost front obstacle 8b is reduced. Accordingly, even when a user can visually recognize the front obstacle 8b, the user can intuitively understand a detection lost state of the subject vehicle 2 from a change in the brightness of the highlighting image 560. Thus, it is possible to ensure the safety and security for a user using the highlighting image 560. In the above tenth embodiment, part of the HCU 54 that executes S2100, S2101, S102, S103, S104, S105, S6101, S6203, S6104, and S6105 corresponds to the “virtual image display control device” constructed by the processor 54p.
An eleventh embodiment of the present disclosure is a modification of the second embodiment. The integrated control ECU in the vehicle control ECU 42 according to the eleventh embodiment performs adaptive cruise control (ACC) for forcibly and automatically controlling the following distance and the vehicle speed in a specific vehicle speed range such as a high speed range instead of FSRA. The integrated control ECU as an “automatic control unit” that performs ACC switches manual driving by a user to automatic control driving when the cruise control switch is turned on and the vehicle speed of the subject vehicle 2 falls within the specific vehicle speed range. On the other hand, the integrated control ECU switches automatic control driving to manual driving when the cruise control switch is turned off during the automatic control driving or when the vehicle speed falls outside the specific vehicle speed range during the automatic control driving.
In the display control flow of the eleventh embodiment, as illustrated in
In S7100, it is determined whether the vehicle speed of the subject vehicle 2 falls within the specific vehicle speed range on the basis of an output signal of the vehicle speed sensor of the vehicle state sensor 40. As the result, when negative determination is made, a return to S2100 is made. On the other hand, when positive determination is made, S7101, S7102, S7103a, S7103b, S7104, and S7105 are executed after the execution of S2101, S102, S103, S104, and S105.
In S7101, it is determined whether the vehicle speed falls outside the specific vehicle speed range on the basis of an output signal of the vehicle speed sensor. As the result, when negative determination is made due to the vehicle speed kept within the specific vehicle speed range, a return to S2100 is made. On the other hand, when positive determination is made due to the vehicle speed outside the specific vehicle speed range, a shift to S7102 is made along with automatic switching from the automatic control driving to the manual driving by the integrated control ECU.
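The determinations of S7100 and S7101 reduce to checking the vehicle speed against the specific vehicle speed range in which ACC is active, combined with the cruise control switch state. The numeric bounds below are assumed for illustration; the disclosure names only "a specific vehicle speed range such as a high speed range".

```python
# Illustrative sketch of the S7100/S7101 range check and the resulting
# driving mode. The numeric bounds of the range are assumed values.

SPEC_RANGE_KPH = (60.0, 180.0)  # hypothetical specific vehicle speed range

def in_specific_range(speed_kph, rng=SPEC_RANGE_KPH):
    """True while the vehicle speed falls within the specific range."""
    lo, hi = rng
    return lo <= speed_kph <= hi

def driving_mode(speed_kph, cruise_switch_on):
    """ACC drives automatically only while the cruise control switch is
    on and the speed stays inside the specific range; otherwise the
    integrated control ECU switches to manual driving."""
    if cruise_switch_on and in_specific_range(speed_kph):
        return "automatic"
    return "manual"
```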
In S7102, required information I for virtually displaying the highlighting image 560 is acquired in the same manner as in S102. In the following S7103a, the virtual image display position α and the virtual image display size β of the highlighting image 560 are set on the basis of the required information I acquired in S7102. In other respects, the virtual image display position α and the virtual image display size β are set in the same manner as in S103.
In the following S7103b, as illustrated in
In the following S7104, as illustrated in
In the following S7105, the display data generated in S7104 is provided to the HUD 50 to form the highlighting image 560 by the display device 50i, thereby controlling the virtual image display position α, the virtual image display size β, and the virtual image display color ε of the linear portion 560p. As a result, the highlighting image 560 is visually recognized with the virtual image display color ε changed from
As described above, according to the eleventh embodiment, the virtual image display color ε of the highlighting image 560 is changed along with switching from automatic control driving to manual driving by a user by the integrated control ECU. Accordingly, a user can intuitively understand the switching from automatic control driving to manual driving from the change in the display color of the highlighting image 560. Thus, it is possible to ensure the safety and security for a user using the highlighting image 560. In the above eleventh embodiment, part of the HCU 54 that executes S2100, S7100, S2101, S102, S103, S104, S105, S7101, S7102, S7103a, S7103b, S7104, and S7105 corresponds to the “virtual image display control device” constructed by the processor 54p.
Although the plurality of embodiments of the present disclosure have been described above, the present disclosure is not limited to these embodiments, and can be applied to various embodiments and combinations without departing from the gist of the present disclosure.
In a first modification, the virtual image display control for the highlighting image 4560 according to the fourth embodiment may be applied to the second embodiment.
In a second modification, the virtual image display control for the highlighting image 4560 according to the fourth embodiment may be applied to the third embodiment.
In a third modification, the virtual image display control for the highlighting image 4560 according to the fourth embodiment may be applied to the fifth embodiment.
In a fourth modification, the virtual image display control for the highlighting image 4560 according to the fourth embodiment may be applied to the sixth embodiment.
In a fifth modification, the virtual image display control for the highlighting image 4560 according to the fourth embodiment may be applied to the seventh embodiment.
In a sixth modification, the virtual image display control for the highlighting image 4560 according to the fourth embodiment may be applied to the eighth embodiment.
In a seventh modification, the virtual image display control for the highlighting image 4560 according to the fourth embodiment may be applied to the ninth embodiment.
In an eighth modification, the virtual image display control for the highlighting image 4560 according to the fourth embodiment may be applied to the tenth embodiment.
In a ninth modification, the virtual image display control for the highlighting image 4560 according to the fourth embodiment may be applied to the eleventh embodiment.
In a tenth modification, the linear portion 560p of the highlighting image 560 virtually displayed by the first to third embodiments and the fifth to eleventh embodiments may be formed in a virtual image display shape other than a curved circular arc shape, for example, a substantially inverted U shape which is not curved as illustrated in
In an eleventh modification, the first linear portion 4560p1 of the highlighting image 4560 virtually displayed by the fourth embodiment and the first to ninth modifications may be formed in a virtual image display shape other than a curved circular arc shape, for example, a substantially inverted U shape which is not curved as illustrated in
In a thirteenth modification, the highlighting image 560 or 4560 virtually displayed by the second, third, and ninth to eleventh embodiments and the first, second, and seventh to ninth modifications may be virtually displayed around each of a plurality of front obstacles 8b according to any of the fifth to eighth embodiments and the third to sixth modifications. In a fourteenth modification, the virtual image display sizes β, β1, β2 that become smaller as the front obstacle 8b is farther from the subject vehicle 2 may not be employed in the sixth to eighth embodiments and the fourth to sixth modifications.
In a fifteenth modification, a virtual image display color of a color tone that is varied according to the type of the front obstacle 8b may be employed instead of or in addition to the virtual image display shape γ that is varied according to the type of the front obstacle 8b by the sixth embodiment and the fourth modification. In a sixteenth modification, the highlighting image 560 may be caused to blink instead of or in addition to reducing the virtual image display brightness δ of at least part of the highlighting image 560 by the ninth and tenth embodiments and the seventh and eighth modifications.
In a seventeenth modification, the virtual image display color ε may be changed along with switching from manual driving to automatic control driving instead of or in addition to changing the virtual image display color ε along with switching from automatic control driving to manual driving by the eleventh embodiment and the ninth modification. In an eighteenth modification, a virtual image display shape that is changed along with switching from automatic control driving to manual driving may be employed instead of or in addition to the virtual image display color ε that is changed along with the switching from automatic control driving to manual driving by the eleventh embodiment and the ninth modification.
In a nineteenth modification, the seventh embodiment and the fifth modification may be combined with the sixth embodiment and the fourth modification, respectively. In a twentieth modification, the eighth embodiment and the sixth modification may be combined with the sixth embodiment and the fourth modification, respectively. In a twenty-first modification, the ninth embodiment and the seventh modification may be combined with the eleventh embodiment and the ninth modification, respectively. In a twenty-second modification, the tenth embodiment and the eighth modification may be combined with the eleventh embodiment and the ninth modification, respectively.
In a twenty-third modification, the ACC according to the eleventh embodiment and the ninth modification may be performed instead of FSRA by the integrated control ECU of the vehicle control ECU 42 also in the other embodiments and modifications. In a twenty-fourth modification, the integrated control ECU of the vehicle control ECU 42 according to the eleventh embodiment and the ninth modification may be caused to function as the “automatic control unit” that performs LKA to change the virtual image display color ε along with switching from automatic control driving to manual driving by LKA. In this case, the third embodiment and the second modification can be combined. In a twenty-fifth modification, the integrated control ECU of the vehicle control ECU 42 according to the eleventh embodiment and the ninth modification may be caused to function as the “automatic control unit” that performs automatic control driving other than ACC and LKA to change the virtual image display color ε along with switching from the automatic control driving to manual driving. Examples of the applicable automatic control driving other than ACC and LKA include driving that automatically controls merging traveling at a junction on a traveling road, branch-off traveling at a branch point on a traveling road, and traveling from a gate to a junction.
In a twenty-sixth modification, the HCU 54 may not be provided. In the twenty-sixth modification, for example, one or more kinds of ECUs selected from the ECUs 31, 42, and the display ECU provided for controlling the display elements 50, 51, 52 may be caused to function as the “vehicle display control device”. That is, the display control flow of each of the embodiments may be achieved by the processor(s) included in one or more kinds of ECUs to construct the “virtual image display control device”.
It is noted that a flowchart or the processing of the flowchart in the present application includes sections (also referred to as steps), each of which is represented, for instance, as S101. Further, each section can be divided into several sub-sections while several sections can be combined into a single section. Furthermore, each of thus configured sections can be also referred to as a device, module, or means.
While the present disclosure has been described with reference to embodiments thereof, it is to be understood that the disclosure is not limited to the embodiments and constructions. The present disclosure is intended to cover various modifications and equivalent arrangements. In addition, while the various combinations and configurations described are exemplary, other combinations and configurations, including more, less or only a single element, are also within the spirit and scope of the present disclosure.
Number | Date | Country | Kind
---|---|---|---
2015-023621 | Feb 2015 | JP | national
2015-236915 | Dec 2015 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2016/000371 | 1/26/2016 | WO | 00