In transportation, it is desirable to have information instantaneously available to monitor the functions of a moving vehicle. Present methodologies include dashboard display units such as speedometers, odometers, fuel gauges, engine temperature gauges, transmission indicators, and the like, often with illuminated dials. More recent methodologies, sometimes referred to as Head Up Displays (“HUDs”), cause one or more of these functions to be displayed on a windshield or a similar transparent screen. Typically, the display is positioned directly in the driver's line of sight, so that while driving the vehicle, the driver is able to view the vehicle speed or other desired data directly, without needing to raise or lower her head to change her field of view.
In one conventional HUD device, the vehicle information to be displayed is first projected by an optical device onto a display screen that is placed inside the windshield, facing the windshield. The vehicle information displayed on the display screen is then reflected by the windshield and displayed on the windshield for viewing by the driver.
With such a device, the associated hardware is quite large, requiring a voluminous space in the instrument panel to house all of the components. Accordingly, there is a need in the industry, among others, to provide HUD capabilities to vehicles that have minimal space available in the instrument panel.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
In accordance with an aspect of the present disclosure, a method is carried out in a vehicle having a graphical display. The method is implemented in computer-executable instructions for displaying information about the vehicle. The method comprises obtaining view data from a camera, the view data indicative of a field of view of an operator when operating the vehicle, obtaining vehicle data, and causing the graphical display to render the vehicle data and the view data in an overlaying manner.
In accordance with another aspect of the present disclosure, a computer-readable medium is provided having modules for conveying information about a vehicle on a graphical display. The computer-readable medium includes an image acquisition module configured to collect data from a camera associated with a field of view of a driver of the vehicle, a graphical element rendering module configured to collect vehicle information from one or more information components and to generate one or more graphical elements representative of the collected vehicle information, a superimpose module configured to generate a data stream composed of one or more of the generated graphical elements superimposed onto the data collected from the camera, and a display module configured to cause the data stream to be presented to a graphical display for display.
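By way of illustration only, the following Python sketch shows one possible way to organize the four modules recited above in software. The class names and interfaces are assumptions made for this sketch and are not drawn from the disclosure.

```python
# Hypothetical sketch of the four modules described above; names and
# interfaces are illustrative assumptions, not part of the disclosure.

class ImageAcquisitionModule:
    def __init__(self, camera):
        self.camera = camera          # any source of forward-facing frames

    def get_frame(self):
        return self.camera.read()    # one frame of the driver's field of view


class GraphicalElementRenderingModule:
    def __init__(self, information_components):
        self.sources = information_components

    def render_elements(self):
        # Collect vehicle data (speed, fluid levels, etc.) and turn each
        # reading into a drawable graphical element.
        return [source.as_graphic() for source in self.sources]


class SuperimposeModule:
    def compose(self, frame, elements):
        # Overlay each graphical element onto the camera frame.
        for element in elements:
            frame = element.draw_onto(frame)
        return frame


class DisplayModule:
    def __init__(self, display):
        self.display = display

    def show(self, composed_frame):
        self.display.present(composed_frame)
```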
In accordance with another embodiment of the present disclosure, a system is provided for providing information to a vehicle operator. The system includes a graphical display, a camera mounted in a suitable position so as to capture one or more images indicative of a field of view of a vehicle operator, and a display generator coupled to the camera and to the graphical display. The display generator in one embodiment is configured to: receive vehicle information from one or more information components; receive the one or more images captured by the camera; generate one or more graphical elements indicative of the vehicle information; generate a stream of data representing the one or more graphical elements superimposed onto the one or more images captured by the camera; and present the stream of data to the graphical display for display.
The foregoing aspects and many of the attendant advantages of the claimed subject matter will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings, wherein:
The detailed description set forth below in connection with the appended drawings, where like numerals reference like elements, is intended as a description of various embodiments of the disclosed subject matter and is not intended to represent the only embodiments. Each embodiment described in this disclosure is provided merely as an example or illustration and should not be construed as preferred or advantageous over other embodiments. The illustrative examples provided herein are not intended to be exhaustive or to limit the claimed subject matter to the precise forms disclosed.
The present disclosure relates to an inventive Heads Up Display (HUD) suitable for use in a vehicle. Instead of an image projected onto the windshield of the vehicle as is done by conventional HUDs, the HUD described herein comprises a smart device, such as a cellular phone, tablet, smart display or the like, that is configured to display one or more types of vehicle information while at the same time diminishing any visual impairment caused by the device to the driver.
Heretofore, the lack of space below the surface of a heavy truck's instrument panel has restricted the use of any conventional HUD. Thus, previous solutions to this space problem have placed displays in the driver's field of view (e.g., on the A-pillar, on the back of the sun visor, etc.) to provide information to the driver. However, such displays limit visibility outside the vehicle.
To address this problem and others, embodiments of the disclosure employ smart devices, such as a cellular phone, a tablet, or a smart display, etc., mounted in or otherwise associated with the field of view of the driver. In embodiments disclosed herein, the “smart” HUD can be mounted to the visor or windshield, suspended from the headliner, etc. The “smart” HUD is configured to communicate with one or more information generating or collecting components (“information components”). In one embodiment, the “smart” HUD is configured to communicate with the one or more information components via the truck's vehicle-wide network, such as a controller area network (CAN), in order to display to the driver such information as virtual gauges, diagnostics, and truck-based applications, among others.
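By way of illustration only, a smart device might read vehicle information from a CAN bus using the open-source python-can library, as sketched below. The arbitration identifier, byte layout, and scaling are placeholders chosen for the sketch and do not correspond to any actual message definition.

```python
# Minimal sketch of reading vehicle data over CAN using the python-can
# library. The arbitration ID and scaling below are placeholders; a real
# truck would use its documented message definitions (e.g., J1939 PGNs).
import can

SPEED_MSG_ID = 0x123  # placeholder identifier for a "vehicle speed" frame

def read_vehicle_speed(bus: can.BusABC, timeout: float = 1.0):
    """Return a speed reading in km/h, or None if no frame arrived."""
    msg = bus.recv(timeout=timeout)
    if msg is None or msg.arbitration_id != SPEED_MSG_ID:
        return None
    # Assume the first two data bytes carry speed * 100 (little-endian).
    raw = int.from_bytes(msg.data[0:2], byteorder="little")
    return raw / 100.0

if __name__ == "__main__":
    bus = can.interface.Bus(channel="can0", bustype="socketcan")
    print(read_vehicle_speed(bus))
```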
In order to solve the visibility issue for the driver, the “smart” HUD is equipped with or associated with a forward-facing camera, and is configured to act as a direct view display to the driver. In that regard, real-time video captured by the camera is displayed as the background of the vehicle information presented to the driver by the direct view display. In other words, vehicle information represented in any graphical form is superimposed on the real-time video acquired by the camera, and the two are presented together to the driver. This results in a “see through” effect in that the vehicle information on the display appears to be floating on the camera's captured real-time image/video, as if the smart device were transparent.
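By way of illustration only, the following sketch shows the “see through” effect using OpenCV: live camera frames serve as the background, and a vehicle reading is drawn on top of each frame. The camera index, font, and placement are arbitrary choices made for this sketch.

```python
# Sketch of the "see through" effect using OpenCV: live camera frames are
# used as the background and the vehicle speed is drawn on top of them.
import cv2

def run_hud(get_vehicle_speed, camera_index=0):
    cap = cv2.VideoCapture(camera_index)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        speed = get_vehicle_speed()            # e.g., latest CAN reading
        cv2.putText(frame, f"{speed:.0f}", (50, 80),
                    cv2.FONT_HERSHEY_SIMPLEX, 2.0, (255, 255, 255), 3)
        cv2.imshow("HUD", frame)               # vehicle data floats on video
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()
```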
In the following description, numerous specific details are set forth in order to provide a thorough understanding of one or more embodiments of the present disclosure. It will be apparent to one skilled in the art, however, that many embodiments of the present disclosure may be practiced without some or all of the specific details. In some instances, well-known process steps have not been described in detail in order to not unnecessarily obscure various aspects of the present disclosure. Further, it will be appreciated that embodiments of the present disclosure may employ any combination of features described herein.
Although representative embodiments of the present disclosure are described with reference to Class 8 trucks, it will be appreciated that aspects of the present disclosure have wide application, and therefore may be suitable for use with many types of powered vehicles, such as passenger vehicles, buses, RVs, commercial vehicles, light and medium duty vehicles, and the like. Accordingly, the descriptions and illustrations herein should be considered illustrative in nature and thus not limiting of the scope of the claimed subject matter.
Turning now to
Still referring to
The vehicle information can be processed by the display generator 36 or other components so that the appropriate readings may be presented on the graphical display 26. In this regard and by way of example only, the information components 40 may report vehicle information about a number of vehicle systems, including but not limited to vehicle and engine speed, fluid levels, tire pressure monitoring, battery level, transmission and engine temperatures, collision detection system data, hybrid drive data, heating/cooling system data, infotainment data, among others. The graphical display 26 may be, for example, a liquid crystal display (LCD) or a light emitting polymer display (LPD) that may include a “touch screen” or sensitive layer configured to recognize direct input applied to the surface of the graphical display 26. For example, the position of the direct input, the pressure of the direct input, or general direction of the direct input may be recognized in order to obtain input from a vehicle operator. In other embodiments, the vehicle may include conventional operator control inputs (not illustrated) connected to the HUD 20 for obtaining input from a vehicle operator that may include, but are not limited to, buttons, switches, knobs, etc.
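By way of illustration only, the readings reported by the information components 40 could be carried in a simple record such as the following; the field names and units are assumptions made for this sketch.

```python
# Illustrative container for the kinds of readings the information
# components might report; field names and units are assumptions.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class VehicleInfo:
    vehicle_speed_kph: Optional[float] = None
    engine_rpm: Optional[float] = None
    fuel_level_pct: Optional[float] = None
    coolant_temp_c: Optional[float] = None
    tire_pressures_kpa: dict = field(default_factory=dict)

    def update(self, name: str, value):
        # Merge a single reading as it arrives from an information component.
        setattr(self, name, value)
```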
The display generator 36 is also connected in communication with the camera 32 or other digital image capture device. The camera 32 is configured to capture real-time images and video and to transmit the captured images and video to the display generator 36. In that regard, the camera 32 can include any known image sensor, such as a CCD sensor or CMOS sensor, and associated circuitry for providing real-time images and/or video to the display generator 36. In one embodiment, the camera 32 is mounted to the body 22 on the side opposite the graphical display 26. Thus, when the HUD 20 is mounted in front of the driver, the graphical display 26 faces the vehicle driver and the camera 32 faces forwardly through the windshield W so as to capture images similar to the driver's field of view. In other embodiments, the camera 32 may be discrete from the body 22 and mounted to the vehicle in an appropriate location to capture similar forward looking views. In use, the real-time images may be processed by the display generator 36 or other components so that the real-time images, in the form of video, may be presented on the graphical display 26 in real time or near real time.
In accordance with an aspect of the present disclosure, the display generator 36 is configured to: (1) generate one or more graphical representations or “elements” of the vehicle information; (2) superimpose the generated graphical elements onto video obtained from the camera 32; and (3) present the graphical display 26 with the “combined” or superimposed graphical data stream in real time or near real time (e.g., with a 0.001 to 0.1 second delay, etc.) for display. As displayed, the superimposed graphical data stream provides a “see through” effect (i.e., it causes the vehicle information on the display to appear to be “floating” on the camera's captured real-time image/video, as if the “smart” HUD 20 were transparent). As a result, forward visibility through the windshield of the vehicle is improved.
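By way of illustration only, the “combined” data stream described in step (2) could be produced by alpha-blending a rendered overlay layer onto each camera frame, for example with OpenCV's addWeighted function, so that graphical elements appear to float over the scene. The blend factor and masking approach below are assumptions made for this sketch.

```python
# One way to superimpose a rendered overlay onto the live frame with a
# slight transparency, so graphics appear to float over the scene.
import cv2
import numpy as np

def superimpose(frame: np.ndarray, overlay: np.ndarray,
                alpha: float = 0.8) -> np.ndarray:
    """Blend overlay pixels onto the frame only where the overlay is drawn."""
    mask = overlay.any(axis=2)                       # non-black overlay pixels
    blended = cv2.addWeighted(frame, 1.0 - alpha, overlay, alpha, 0.0)
    out = frame.copy()
    out[mask] = blended[mask]
    return out
```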
Turning now to
Still referring to
By way of example, the speed of the vehicle can be represented in graphical form in a variety of ways. In one embodiment, vehicle speed is represented graphically by a digital gauge (i.e., digital speedometer). In another embodiment, vehicle speed is represented graphically by text (e.g., a numeral, such as “53”), as shown in
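By way of illustration only, the two representations mentioned above (a numeral and a gauge) could be drawn with OpenCV primitives as sketched below; the gauge geometry, colors, and speed range are arbitrary values chosen for this sketch.

```python
# Sketch of rendering vehicle speed either as plain text or as a simple
# circular gauge with OpenCV drawing primitives; geometry is illustrative.
import cv2
import numpy as np

def draw_speed_text(frame, speed_kph, origin=(50, 80)):
    cv2.putText(frame, f"{speed_kph:.0f}", origin,
                cv2.FONT_HERSHEY_SIMPLEX, 2.0, (255, 255, 255), 3)

def draw_speed_gauge(frame, speed_kph, center=(120, 120), radius=60,
                     max_kph=140):
    # Background arc from 135 to 405 degrees; needle angle is proportional
    # to the current speed.
    cv2.ellipse(frame, center, (radius, radius), 0, 135, 405,
                (200, 200, 200), 4)
    angle = np.deg2rad(135 + 270 * min(speed_kph, max_kph) / max_kph)
    tip = (int(center[0] + radius * np.cos(angle)),
           int(center[1] + radius * np.sin(angle)))
    cv2.line(frame, center, tip, (0, 255, 0), 3)
```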
In some embodiments, if the collected and processed vehicle information is associated with an abnormal vehicle condition, the associated graphical element may be rendered so as to be displayed with, for example, increased size, in a color indicative of an abnormal or warning condition (e.g., yellow, red, etc.) and/or flashing in one or more colors, etc. For example, if the vehicle speed information is determined to represent a vehicle speed that is greater than a speed limit set by a fleet operator or the speed limit posted roadside and provided to the HUD 20 by, for example, a GPS or navigation system, the graphical element can be generated as an enlarged numerical representation of such information. Similarly, in another embodiment, the graphical element can be generated as a flashing numerical representation of such information.
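By way of illustration only, the styling rule described above could be expressed as a small function that selects the size and color of the speed element based on a limit; the thresholds, colors, and flash rate are assumptions made for this sketch.

```python
# Illustrative rule for styling the speed element when an abnormal
# condition (speed over the posted or fleet limit) is detected.
import time

def speed_style(speed_kph, limit_kph):
    if speed_kph <= limit_kph:
        return {"color": (255, 255, 255), "scale": 2.0}   # normal: white
    # Over the limit: enlarge the numeral and flash it red about once per second.
    flashing_on = int(time.time() * 2) % 2 == 0
    color = (0, 0, 255) if flashing_on else (0, 0, 80)    # BGR red / dim red
    return {"color": color, "scale": 3.0}
```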
The display generator 36 also includes a video acquisition module 70. The video acquisition module 70 implements logic for obtaining real-time images or video from the camera 32. In one embodiment, the video acquisition module 70 is configured to control the operation of the camera 32. For example, in one embodiment, the video acquisition module 70 is configured to turn the camera on and off, retrieve the real-time video generated by the camera, or cause the camera to send the real-time video to the display generator 36, etc. During the acquisition process, the images or video received from the camera 32 can be processed and temporarily stored, such as in memory and/or an associated buffer. It will be appreciated that the functionality of module 70 can be incorporated into the camera 32 in some embodiments.
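By way of illustration only, the video acquisition module 70 could be sketched as a small class that starts and stops the camera and keeps only the most recent frame in a buffer; the class name and threading approach are assumptions made for this sketch.

```python
# Sketch of a video acquisition module that starts/stops the camera and
# keeps only the most recent frame as a one-frame buffer.
import threading
import cv2

class VideoAcquisition:
    def __init__(self, camera_index=0):
        self.camera_index = camera_index
        self.latest_frame = None
        self._running = False

    def start(self):
        self._running = True
        self._cap = cv2.VideoCapture(self.camera_index)
        threading.Thread(target=self._loop, daemon=True).start()

    def _loop(self):
        while self._running:
            ok, frame = self._cap.read()
            if ok:
                self.latest_frame = frame      # acts as a one-frame buffer

    def stop(self):
        self._running = False
        self._cap.release()
```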
The display generator 36 also includes a superimpose module 74. The superimpose module 74 implements logic for superimposing one or more graphical elements rendered by the graphic element rendering module 66 onto the real-time images or video obtained from the camera 32. As such, a data stream is generated, which contains one or more of the rendered graphical elements overlaying the captured real time video in one or more sections thereof. In the exemplary embodiment depicted in
As further illustrated in
As used herein, the term processor is not limited to integrated circuits referred to in the art as a computer, but broadly refers to a microcontroller, a microcomputer, a microprocessor, a programmable logic controller, an application specific integrated circuit, other programmable circuits, combinations of the above, among others. Therefore, as used herein, the term “processor” can be used to generally describe these aforementioned components, and can be either hardware or software, or combinations thereof, that implement logic for carrying out various aspects of the present disclosure. Similarly, the term “module” can include logic that may be implemented in either hardware or software, or combinations thereof.
As illustrated in
If a start event occurs at block 602, the method 600 proceeds to block 604, where the display generator 36 begins collecting information from the information components 40 and renders one or more graphical elements 80 for subsequent display. In some embodiments, the rendered graphical elements 80 can be temporarily stored in memory 58 or an associated buffer. This information may be continually collected and processed so that current readings may be conveyed on the graphical display 26. From block 604, the method proceeds to block 606, where real time video representing part of the driver's field of view captured by camera 32 is received and temporarily stored, for example, in memory 58 or an associated buffer. Next, at block 608, the rendered graphical elements from block 604 are superimposed onto the real-time video from block 606, resulting in a composite data (e.g., video) stream. The composite data stream is then presented to the display 26 at block 610 for display. Once received by the display 26, the composite data stream is rendered by display 26, as shown in the example of
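By way of illustration only, the flow of blocks 604 through 612 could be expressed as a simple loop such as the following; the helper functions are assumed to be supplied by the modules sketched earlier.

```python
# Sketch of the overall flow of blocks 604-612: collect and render vehicle
# data, grab a frame, superimpose, display, and repeat until a termination
# event occurs.
def hud_loop(acquire_frame, collect_vehicle_info, render_elements,
             superimpose, show, termination_requested):
    while not termination_requested():            # block 612
        info = collect_vehicle_info()             # block 604: gather readings
        elements = render_elements(info)          # block 604: render graphics
        frame = acquire_frame()                   # block 606: real-time video
        composite = superimpose(frame, elements)  # block 608
        show(composite)                           # block 610: present to display
```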
The method 600 then proceeds to block 612, where a determination is made as to whether a process termination event has occurred. The termination event can be turning the ignition key to the “off” position, powering down the HUD, or placing the HUD in sleep or stand-by mode, etc. If a termination event occurs at block 612, then the method 600 ends. If not, the method returns to block 604 so that a continuous feed is presented to the display 26.
It should be well understood that the routine 600 described above with reference to
The principles, representative embodiments, and modes of operation of the present disclosure have been described in the foregoing description. However, aspects of the present disclosure which are intended to be protected are not to be construed as limited to the particular embodiments disclosed. Further, the embodiments described herein are to be regarded as illustrative rather than restrictive. It will be appreciated that variations and changes may be made by others, and equivalents employed, without departing from the spirit of the present disclosure. Accordingly, it is expressly intended that all such variations, changes, and equivalents fall within the spirit and scope of the present disclosure, as claimed.