The disclosure relates to a head up display (HUD) in a motor vehicle.
A head up display emits light that reflects from the front windshield to be seen by the driver. The light appears to come from a virtual image in front of the driver and in front of the windshield. This type of head up display is currently commercially available.
Conventional head up displays create the virtual image by first using a display to create an image. Next, the light from the image is reflected from one or more mirrors. Next, the light from the mirrors is reflected from the windshield. The mirrors are designed and positioned relative to the display so that the light seen by the driver, which is reflected from the windshield, appears to come from a virtual image that is outside of the vehicle. The mirrors and display are typically contained in a package that occupies a volume beneath the top surface of the dashboard.
In a head up display, graphics are projected onto a transparent glass surface. Because there is no opaque surface such as that of a traditional thin-film-transistor (TFT) liquid-crystal display (LCD), there is no true black available when rendering graphics. This absence of black or opaqueness creates obstacles to ensuring adequate contrast, especially color contrast. Current head up display systems project graphics onto glass with no way of determining the colors of the background environment beyond or behind the glass surface (e.g., the area in front of the car). Additionally, color contrast is measured by determining the difference in color value between two (or more) referenced objects. As an example, the color contrast of this text is calculated by taking the color value assigned to the text and the color value assigned to its background (the white of the paper) and measuring the difference between the two color values. To optimize color contrast, the value of each element in the total composition must be known. In current head up display systems, the color values assigned to the rendered graphical elements are known, but the background environment has no calculable value. Therefore, creating visible graphics is not as easy as with a traditional TFT LCD, because the color value of the environment over which the rendered graphics appear to be overlaid cannot be appropriately measured. Today's solutions make use of brightness sensors to determine an appropriate brightness level for the graphics.
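By way of illustration only, the color-contrast comparison described above might be quantified as in the following sketch, which uses the well-known WCAG relative-luminance contrast ratio as one possible metric; the disclosure does not prescribe a particular metric, and the function names and example RGB values below are assumptions chosen for illustration.

# Illustrative sketch only: one common way to quantify "color contrast"
# between a rendered graphic color and a background color, using the
# WCAG 2.x relative-luminance contrast ratio. The metric, names and
# example values are assumptions, not part of the disclosed embodiments.

def _linearize(channel_8bit):
    """Convert an 8-bit sRGB channel to linear light."""
    c = channel_8bit / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    r, g, b = (_linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(color_a, color_b):
    """WCAG contrast ratio, from 1:1 (no contrast) to 21:1 (black on white)."""
    la, lb = relative_luminance(color_a), relative_luminance(color_b)
    lighter, darker = max(la, lb), min(la, lb)
    return (lighter + 0.05) / (darker + 0.05)

# Example: orange speed digits against a yellow sunset-like background.
print(contrast_ratio((255, 140, 0), (255, 230, 80)))   # low ratio -> poor contrast
print(contrast_ratio((30, 60, 200), (255, 230, 80)))   # higher ratio -> better contrast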
Current systems have no way of determining what colors are in the environment beyond the displayed graphics; therefore, there are limits to the amount of visual color contrast that can be created. Today's solutions are very limited and mostly cater to day/night modes by adjusting brightness.
As more content, and even camera feeds, are displayed on HUDs, there are more obstacles to creating highly visible, high-contrast images, because the environmental factors outside the car cannot be controlled. As more features are added to HUDs, it will be important to ensure that the added features are visible in all situations and under all conditions.
The present invention may provide a method for creating an automatic, contextually-configured human-machine interface (HMI) for a head up display (HUD) in a motor vehicle. The HUD may be contextually driven by the surrounding environment and/or terrain. Utilizing a combination of image sensors (e.g., image sensors on the outside of the vehicle), the captured images and information may be used to determine the colors of the surrounding environment. Based on the information gained by these sensors, the HUD graphics may be re-rendered by applying a new color and/or adjusting the placement position to create highly visible graphics within the context of that specific environment background. Thus, the invention may ultimately provide virtual images with greater color contrast with the background against which the virtual images are presented.
In one embodiment, if the vehicle is driving toward a sunset, the vehicle camera sensors may detect the light intensity, color and detail of the ambient environment in the forward-facing direction. Within these image feeds, the HUD may use color detection technology to detect, for example, the presence of bright yellows, oranges and hues of red. The HUD may then compare that color data to the colors presented in the current HUD graphics (e.g., the virtual image). If the current graphics colors are close in family to the colors of the sunset background (e.g., based on color theory studies), the HUD may augment those graphics by applying a new color code that is in the family of complementary colors (e.g., according to color theory) to create graphics with greater contrast vis-a-vis the background scene. In this sunset background example, the graphics may be given treatments of blues and greens based on the science of color theory and complementary color principles.
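As a purely illustrative sketch of the sunset example above, the following code detects a dominant warm hue in a forward-camera frame and derives a roughly complementary color for the HUD graphics; the frame format, helper names and saturation/value choices are assumptions and not part of the disclosed embodiments.

# Illustrative sketch only: detecting a dominant warm hue in a forward-camera
# frame and choosing a roughly complementary hue for the HUD graphics.
import colorsys
import numpy as np

def dominant_hue_degrees(frame_rgb):
    """Return the hue (0-360 deg) of the mean color of an HxWx3 uint8 frame."""
    mean_rgb = frame_rgb.reshape(-1, 3).mean(axis=0) / 255.0
    h, _, _ = colorsys.rgb_to_hsv(*mean_rgb)
    return h * 360.0

def complementary_rgb(hue_degrees, saturation=0.8, value=0.9):
    """Pick a fully specified RGB color on the opposite side of the hue wheel."""
    comp_hue = ((hue_degrees + 180.0) % 360.0) / 360.0
    r, g, b = colorsys.hsv_to_rgb(comp_hue, saturation, value)
    return int(r * 255), int(g * 255), int(b * 255)

# Example: a sunset-dominated frame (yellows/oranges).
sunset_frame = np.zeros((480, 640, 3), dtype=np.uint8)
sunset_frame[...] = (250, 170, 60)
hue = dominant_hue_degrees(sunset_frame)
print(hue)                      # roughly 35 degrees (orange)
print(complementary_rgb(hue))   # roughly a blue -> candidate HUD graphic color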
The present invention improves not just brightness; it also modifies the colors of the displayed images depending upon a variety of environmental factors. Ultimately, color contrast is controlled, which is not possible through the use of brightness sensors alone.
In one embodiment, the invention comprises a head up display arrangement for a motor vehicle having a human driver. A light source emits a light field including a first color such that the light field is visible to the driver as a virtual image. A light sensor detects a second color that is visible to the driver in a forward direction of the vehicle in an ambient environment outside of the vehicle. An electronic processor is communicatively coupled to both the light source and the light sensor. The processor changes the first color in the light field to a third color. The third color is dependent upon the second color.
In another embodiment, the invention comprises a head up display method for a motor vehicle having a human driver. A light field including a first color is emitted such that the light field is visible to the driver as a virtual image. A second color that is visible in a forward direction of the vehicle in an ambient environment outside of the vehicle is detected. The first color in the light field is changed to a third color. The third color is dependent upon the second color.
In yet another embodiment, the invention comprises a head up display arrangement for a motor vehicle having a human driver. A light source emits a light field including a first color such that the light field is reflected off of a windshield of the vehicle and the light field is visible to the human driver as a virtual image. A light sensor detects a second color that is visible in a background of the virtual image as viewed by the human driver. An electronic processor is communicatively coupled to both the light source and the light sensor. The processor compares the first color to the second color. If the color comparison reveals that a contrast between the first color and the second color is less than a threshold, then the first color in the light field is changed to a third color such that a contrast between the third color and the second color is greater than the contrast between the first color and the second color.
An advantage of the present invention is that by using intelligent data processing, the exact color(s) and content in the background behind the displayed graphics may be determined so that augmented graphics may be better displayed.
Another advantage of the present invention is that it may provide graphics that are more visible in specific situations wherein the graphics of current head up displays are not easily visible.
Yet another advantage of the present invention is that it may provide the ability to augment graphics in a variety of ways to achieve better visibility in all environmental states and with all background colors.
A better understanding of the present invention will be had upon reference to the following description in conjunction with the accompanying drawings.
An electronic processor 28 is communicatively coupled to both light source 14 and light sensor 20. Processor 28 may change the first color in the light field to a third color. The third color is dependent upon the second color. For example, the third color may have a high degree of contrast vis-a-vis the second color. More particularly, processor 28 may compare the first color to the second color. If the color comparison reveals that a contrast between the first color and the second color is less than a threshold, then processor 28 may change the first color in the light field to a third color. A contrast between the third color and the second color may be greater than the contrast between the first color and the second color.
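A minimal sketch of the comparison logic attributed to processor 28 might look as follows; the contrast metric (a simple Euclidean distance in RGB space), the threshold value and the candidate palette are all assumptions chosen for illustration, not a definitive implementation.

# Illustrative sketch of the decision logic described for processor 28:
# compare the contrast of the rendered (first) color against the detected
# background (second) color; if it falls below a threshold, substitute a
# higher-contrast (third) color.
from math import dist

CONTRAST_THRESHOLD = 150.0          # assumed value; tuned per application
CANDIDATE_PALETTE = [               # assumed candidate HUD colors
    (30, 60, 200),    # blue
    (20, 160, 90),    # green
    (255, 255, 255),  # white
]

def rgb_contrast(a, b):
    """Simple Euclidean distance in RGB space as a stand-in contrast measure."""
    return dist(a, b)

def select_display_color(first_color, second_color):
    """Keep the first color if contrast is adequate, else pick a higher-contrast third color."""
    if rgb_contrast(first_color, second_color) >= CONTRAST_THRESHOLD:
        return first_color
    # Pick the candidate with the greatest contrast against the background.
    return max(CANDIDATE_PALETTE, key=lambda c: rgb_contrast(c, second_color))

# Example: orange digits over a yellow sunset background are swapped to blue.
print(select_display_color((255, 140, 0), (255, 230, 80)))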
In a next step 504, a second color that is visible in a forward direction of the vehicle is detected in an ambient environment outside of the vehicle. For example, with vehicle 10 driving toward a sunset, light sensor 20 may detect the color yellow in the sky as background for the two orange digits indicating the speed of vehicle 10.
In a final step 506, the first color in the light field is changed to a third color. The third color is dependent upon the second color. For example, processor 28 may determine that there is a low level of contrast between orange and yellow. Accordingly, processor 28 may change the color of the two digits indicating the speed of vehicle 10 to a color having a higher degree of contrast with the yellow background, such as blue, for example.
The invention has been described above as including changing colors of the virtual image, but changes are not limited to color within the scope of the invention. Other augmentations may include flipping the image representation to cut-outs or negatives. In addition to different colors, the HUD may also display different sets of content. Using the same or similar methodologies, the HUD system may accommodate the surroundings to display information that is contextual to that setting. For example, if a large amount of snow is recognized in the image sensor signals, the HUD may display augmented road lines and augmented lane markers in this environment.
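As one hedged sketch of the snow example above, a simple check of the fraction of near-white pixels in a forward-camera frame could be used to trigger augmented lane markers; the thresholds and the hud.show_augmented_lane_markers() call are hypothetical and chosen for illustration only.

# Illustrative sketch only: a simple check for a snow-dominated scene that
# could trigger display of augmented lane markers.
import numpy as np

def is_snowy_scene(frame_rgb, whiteness_threshold=200, required_fraction=0.5):
    """True if at least the required fraction of pixels is near-white on all channels."""
    near_white = (frame_rgb > whiteness_threshold).all(axis=-1)
    return near_white.mean() >= required_fraction

# Hypothetical usage in a render loop:
# if is_snowy_scene(forward_camera_frame):
#     hud.show_augmented_lane_markers()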
As the industry adds more features to the HUD, including image feeds, it is also possible within the scope of the invention to apply filters to image feeds to augment their appearance for better visibility. Such image feeds may be from backup cameras directed in the reverse direction, for example. If a HUD projects characters, symbols and icons that are superimposed over an image feed from a camera directed in a reverse direction, the colors of the characters, symbols and icons may be controlled and modified so as to provide high color contrast between those colors and the colors in the image feed.
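As a further illustrative sketch, an overlay color for a symbol superimposed on a rear-camera feed might be chosen by sampling the feed pixels behind the symbol's bounding box; the frame layout, bounding-box convention and the simple light/dark rule below are assumptions for illustration.

# Illustrative sketch only: choosing an overlay color for a symbol superimposed
# on a rear-camera feed by sampling the feed region behind the symbol.
import numpy as np

def region_mean_color(frame_rgb, box):
    """Mean RGB of the feed region (x, y, w, h) that the symbol will cover."""
    x, y, w, h = box
    region = frame_rgb[y:y + h, x:x + w].reshape(-1, 3)
    return tuple(region.mean(axis=0))

def high_contrast_overlay_color(background_rgb):
    """Very simple rule: light backgrounds get dark symbols and vice versa."""
    luminance = (0.2126 * background_rgb[0]
                 + 0.7152 * background_rgb[1]
                 + 0.0722 * background_rgb[2])
    return (0, 0, 0) if luminance > 128 else (255, 255, 255)

# Example: a bright region of the rear-camera feed yields a dark overlay icon.
feed = np.full((480, 640, 3), 200, dtype=np.uint8)   # mostly bright frame
icon_box = (500, 380, 64, 64)
print(high_contrast_overlay_color(region_mean_color(feed, icon_box)))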
The foregoing description may refer to “motor vehicle”, “automobile”, “automotive”, or similar expressions. It is to be understood that these terms are not intended to limit the invention to any particular type of transportation vehicle. Rather, the invention may be applied to any type of transportation vehicle whether traveling by air, water, or ground, such as airplanes, boats, etc.
The foregoing detailed description is given primarily for clearness of understanding, and no unnecessary limitations are to be understood therefrom, for modifications may be made by those skilled in the art upon reading this disclosure without departing from the spirit of the invention.
This application claims benefit of U.S. Provisional Application No. 62/740,461 filed on Oct. 3, 2018, the disclosure of which is hereby incorporated by reference in its entirety for all purposes.