The present invention relates in general to a vehicle instrument panel, that is to say a control system arranged on a structure provided with one or more panels carrying adjustment or measurement devices, indicator instruments, display devices and the like able to allow a driver to control the vehicle conditions.
It is known that the instrument panels of some current vehicles are provided with processors able to receive signals from devices disposed in the vehicle, and to present the driver with the corresponding information in an organic and unitary manner by means of a display device.
Because of the constraints on space within the passenger compartment of a motor vehicle, such on-board computers are generally disposed to one side of and/or lower than the driver's head. This arrangement forces the driver to take his eyes off the road temporarily in order to read the information appearing on the display.
This can naturally give rise to dangerous situations, which become more likely the heavier and more intense the traffic and, in general, the more frequent the obstructions to be avoided and the variations in the path to be travelled by the vehicle.
The object of this invention is to provide an instrument panel which eliminates, or at least reduces, the occurrence of the dangerous situations resulting from the above-mentioned disadvantages.
This object is achieved according to the invention by a display arrangement for an instrument panel of a motor vehicle having the characteristics defined in the claims.
One preferred, but non-limitative, embodiment of the invention will now be described with reference to the attached drawings, in which:
FIGS. 3 to 5 are representations which illustrate examples of use of the display arrangement of the invention.
With reference to
The processor unit 20 is preferably integrated on the vehicle in such a way as to reduce to a minimum the computational load of the interface 10; in another preferred embodiment the processor unit can be constituted by two sub-units, one integrated on the vehicle and one integrated on the wearable interface 10, the two being connected together by cable or by “wireless” connection (for example via radio frequency or infrared).
With reference to
The wearable interface unit 10 includes information rendering means 100, disposed on the support structure 11, for rendering information from the processor unit 20. This information can be of video and/or audio type. These rendering means 100 include a virtual image generator 102 operable to generate a virtual image and to present it to the driver's eyes through the transparent screen 12 at a predetermined distance, the said virtual image being superimposed on the scene visible to the driver through the transparent screen 12. If the distance at which the virtual image is presented is sufficiently large (for example greater than 5 metres) the driver's eye is able to focus both the background and the virtual image generated by the means 102 on the retina with minimal accommodation.

To this end the virtual image generator 102 includes miniaturised image formation means, for example of liquid crystal (LCD), cathode ray tube (CRT) or organic light emitting diode (OLED) type, operable to form a synthetic real image, and an optical system for transformation of the said real image into a virtual image located at a certain distance from the observer and visible to the driver through the transparent screen 12. The transformation of the synthetic image from real to virtual serves to present the video information to the driver at a predetermined distance from the eyes in such a way as to minimise the accommodation of the focal distance.

Moreover, the virtual image is presented in such a way that information of critical importance is displayed in rapidly accessible, high-resolution regions of the field of view (close to the fovea of the eye). Information which is not critical for safety can be displayed in marginal regions of the field of view, for example at the top (possibly superimposed over the overhead light) or at the bottom (possibly superimposed over the dashboard). Preferably, the virtual image is selectively positionable within the field of view of the user by mechanical and/or electronic and/or software means in such a way as to optimise the visibility and usability of the information presented. The virtual image can be presented to only one or to both eyes, and may or may not contain the same information for both eyes. For example, the field of view subtended by the image presented to the right eye may be only partially superimposed over that presented to the left eye. The information rendering means 100 preferably further include a speaker 103 able to make a sound signal available to the driver.
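As a rough quantitative check of the 5-metre figure (using the standard optics relation, which the source states only as the distance): the accommodation demand of a target at distance d is 1/d dioptres, so a virtual image at 5 m differs from a distant background by only 0.2 D, comparable to the eye's typical depth of focus of roughly ±0.3 D.

```latex
A = \frac{1}{d} \qquad
A_{\text{image}} = \frac{1}{5\,\text{m}} = 0.2\,\text{D} \qquad
A_{\text{background}} \to 0\,\text{D} \quad (d \to \infty)
```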
The wearable interface unit 10 further includes sensor means 110, also disposed on the support structure 11, operable to perceive a physical signal from the operator and/or from the surrounding environment, and to make available a corresponding electrical signal representative of this physical signal. The sensor means 110 include a unit 111 for detection of the position and/or orientation of the driver's head. In a variant of the invention the driver's head position and/or orientation detection unit 111 is constituted by an inertial measurement unit integrated entirely on the wearable interface unit 10; in a further variant the position/orientation unit 111 includes an inertial measurement unit integrated on the wearable interface unit and a non-inertial measurement unit (for example of the mechanical, magnetic, optical or ultrasonic type) partly integrated on the wearable interface unit 10 and partly integrated on the vehicle.
The visual information can be rendered to the user in a manner fixedly related to the vehicle's frame of reference, similar to what occurs with a traditional instrument panel: this is done for all information for which no spatial correlation is required between the virtual image and the environment surrounding the vehicle (or background) over which this virtual image is superimposed. This is the case, for example, with vision-aid systems, for example for improving night vision (described hereinafter), in which the sensors used are conventionally integrated on the vehicle and therefore fixedly related to it.
If the head position/orientation detection unit 111 is constituted by an inertial measurement unit integrated on the wearable interface unit 10, this unit 111 will provide inertial coordinates, that is to say coordinates referred to the frame of reference of the ground (the environment surrounding the vehicle). To obtain the coordinates of the user's head with respect to the frame of reference of the vehicle, the coordinates with respect to the frame of reference of the ground are corrected by taking into account the inertial coordinates of the vehicle detected by the vehicle navigation system (described hereinafter).
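A minimal sketch of this correction, assuming both the wearable inertial unit and the vehicle navigation system deliver a pose as a rotation matrix plus a position vector in the ground frame (the function name and the NumPy formulation are illustrative, not taken from the source):

```python
import numpy as np

def head_pose_in_vehicle_frame(R_gh, t_gh, R_gv, t_gv):
    """Express the head pose in the vehicle frame of reference.

    R_gh, t_gh: rotation (3x3) and position (3,) of the head in the ground
                frame, as measured by the inertial unit on the wearable 10.
    R_gv, t_gv: rotation and position of the vehicle in the ground frame,
                as measured by the vehicle navigation system 202.
    Returns (R_vh, t_vh): the head pose relative to the vehicle.
    """
    R_vg = R_gv.T                 # inverse of a rotation matrix is its transpose
    R_vh = R_vg @ R_gh            # compose: ground->vehicle with head->ground
    t_vh = R_vg @ (t_gh - t_gv)   # head position re-expressed in vehicle axes
    return R_vh, t_vh
```

The point of the composition is that the ground frame cancels out: only the relative pose of head and vehicle is needed for rendering vehicle-fixed information.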
In another variant of the invention, more suitable if the virtual image is correlated not with the environment surrounding the vehicle but rather with the interior of the vehicle, the head position/orientation detection unit 111 comprises a measurement unit of non-inertial type, that is to say one adapted to measure directly the coordinates of the head in the frame of reference of the vehicle. Preferably, this measurement unit comprises a video camera of the optical sensor matrix type, for example CCD or CMOS, integrated on the interface unit 10 in such a position that a plurality of position-locating means are present in the field of view of the video camera. The recognition in real time of the position-locating means makes it possible for the detection unit 111 to identify the position of the user's head with respect to the position-locating means and therefore its coordinates with respect to the vehicle.
To this end the position-locating means are constituted by visual indicator means, for example LEDs operating with visible or infrared radiation. Alternatively, such LEDs are integrated on the interface unit 10 and the position-locating means are constituted by simple reflectors which receive the radiation from the LEDs and reflect it towards the video camera integrated on the interface unit 10.
In an alternative configuration the video camera is integrated on the vehicle, whilst the position-locating means are integrated on the interface unit 10.
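One conventional way to recover the head coordinates from the marker sightings is a perspective-n-point (PnP) solution; the sketch below uses OpenCV's solvePnP and assumes the 3D marker positions in the vehicle frame and the camera calibration are known (all numerical values are illustrative):

```python
import numpy as np
import cv2

# Known 3D positions of the LED/reflector markers in the vehicle frame (metres)
# and their detected 2D centroids in the camera image (pixels); illustrative values.
marker_points_vehicle = np.array([[ 0.30, -0.20, 1.10],
                                  [ 0.30,  0.20, 1.10],
                                  [-0.30, -0.20, 1.15],
                                  [-0.30,  0.20, 1.15]], dtype=np.float64)
marker_pixels = np.array([[412.0, 301.5], [618.2, 298.7],
                          [405.8, 455.1], [611.4, 452.0]], dtype=np.float64)

camera_matrix = np.array([[800.0,   0.0, 512.0],   # intrinsics from calibration
                          [  0.0, 800.0, 384.0],
                          [  0.0,   0.0,   1.0]])
dist_coeffs = np.zeros(5)                          # assume negligible distortion

# Pose of the vehicle-fixed markers as seen from the camera (head) frame.
ok, rvec, tvec = cv2.solvePnP(marker_points_vehicle, marker_pixels,
                              camera_matrix, dist_coeffs)
if ok:
    R, _ = cv2.Rodrigues(rvec)               # rotation vector -> 3x3 matrix
    R_head_in_vehicle = R.T                  # invert the pose to obtain the
    t_head_in_vehicle = -R.T @ tvec.ravel()  # head coordinates in vehicle frame
```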
The data from the head position/orientation detection unit 111 may be neither inertial nor non-inertial coordinates, but rather raw or only partially conditioned sensor data, which are sent to the processor unit 20 and there processed to obtain the coordinates of the head. This is necessary if it is decided to reduce to a minimum the computational power and memory integrated in the wearable interface unit 10, by delegating the more onerous calculations to the non-wearable subunits of the system, that is to say those integrated on the vehicle.
Preferably, the sensor means 110 further include an “eye tracking” position unit 111b for measuring the coordinates of the pupil with respect to the frame of reference of the head, and therefore the direction in which the driver is looking. The data coming from the unit 111b are transmitted to the processor unit 20 in a manner similar to that for the data coming from the unit 111.
As well as its primary function of measuring the position of the pupil, the eye tracking system 111b can also be utilised for monitoring the driver's attention and preventing accidents caused by the driver falling asleep. The eye tracking system can simultaneously be utilised for identification of the driver through retinal recognition.
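A minimal sketch of the attention-monitoring use, assuming the eye tracking unit 111b delivers an eyelid-opening measure once per frame (the threshold, window length and alarm hook are illustrative; production systems typically use richer measures such as PERCLOS):

```python
from collections import deque

class DrowsinessMonitor:
    """Flag possible drowsiness when the eye stays closed too long.

    eye_openness: 0.0 (closed) .. 1.0 (fully open), one sample per frame.
    """
    def __init__(self, fps=30, closed_threshold=0.2, max_closed_seconds=1.5):
        self.closed_threshold = closed_threshold
        self.max_closed_frames = int(fps * max_closed_seconds)
        self.recent = deque(maxlen=self.max_closed_frames)

    def update(self, eye_openness):
        self.recent.append(eye_openness < self.closed_threshold)
        # Alarm only if every frame in the window shows a closed eye.
        return len(self.recent) == self.recent.maxlen and all(self.recent)

monitor = DrowsinessMonitor()
# if monitor.update(sample):   # called once per eye-tracker frame
#     trigger_audio_alarm()    # hypothetical hook, e.g. via the speaker 103
```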
The sensor means 110 further include an ambient illumination sensor 112 operable to detect the brightness of the environment surrounding the driver, in such a way as to make it possible to adapt the illumination and the colour of the virtual image presented to the driver to those of the real image, thereby optimising the contrast and visibility. In one possible embodiment the function of measuring the brightness, effected by the brightness sensor 112, is achieved via the same optical sensor matrix as is utilised for the non-inertial measurement unit previously mentioned. The adaptation of the brightness of the virtual image can be obtained by means of a screen of variable transmittance, capable of varying the transmittance of the optical virtual display system, and/or by varying the brightness of the miniaturised image formation means, in dependence on the ambient luminance, in such a way as to optimise the contrast.
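A minimal sketch of this contrast control, assuming the ambient luminance from the sensor 112 is available in cd/m² and that the display luminance and screen transmittance can be commanded directly (the target contrast ratio and the limit values are illustrative):

```python
def adapt_display(ambient_luminance, target_contrast=3.0,
                  max_display_luminance=3000.0, min_transmittance=0.2):
    """Choose display luminance and screen transmittance for the virtual image.

    The see-through background luminance is ambient * transmittance; the
    virtual image should exceed it by roughly `target_contrast` to stay legible.
    """
    transmittance = 1.0
    display = target_contrast * ambient_luminance * transmittance
    if display > max_display_luminance:
        # Display maxed out: dim the background instead by darkening the screen.
        transmittance = max(min_transmittance,
                            max_display_luminance /
                            (target_contrast * ambient_luminance))
        display = max_display_luminance
    return display, transmittance

# Bright daylight forces the variable-transmittance screen to darken;
# at night the display luminance alone suffices.
print(adapt_display(10000.0))   # -> (3000.0, 0.2)
print(adapt_display(10.0))      # -> (30.0, 1.0)
```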
In an alternative embodiment the interface unit 10 is formed by two separately wearable sub-units connected to one another by a cable or via “wireless” connection (for example radio frequency or infrared), one of which is integrated on the visor or helmet and the other carried on an article of clothing.
Other interface devices are also provided, disposed on the vehicle's instrument panel, that is to say one or more manually controllable devices 113 of conventional type, and a microphone 114 arranged to transmit a voice command to the processor unit 20. Preferably, the microphone 114 is also integrated on the support structure 11 of the wearable interface unit 10. These devices make it possible, in a known manner, to select the type of data to be displayed, and possibly their manner of presentation (colour, dimension, position), or to interact with other systems of the vehicle.
Operatively connected to the processor unit 20 are on-board devices and systems of the type normally utilised for the provision to the user of information about the vehicle and its operating conditions. For example, in the case of motor vehicles, such devices and systems include sensor means 201 comprising any type of known sensor which can be installed in modern motor vehicles to detect signals relating to the operating conditions and functionality of the vehicle (for example temperature, pressure and coolant liquid level signals and the like), a trip data processor (or “trip computer”) to process data relating to fuel consumption, average speed, number of kilometres travelled and the like, a navigation system 202 (equipped, for example, with vehicle inertial sensors, a GPS receiver and a database of maps), driver assistance systems 203 (for example a “lane warning” system for automatically detecting the position of the vehicle with respect to the edges of the roadway, an “overtake warning” system for controlling the overtaking manoeuvre, an “adaptive cruise control” system for control of the cruising speed and safe distances, and a “stop & go” system for control of the speed and safe distances in low-speed tailback conditions), an infotelematic unit 204 connected to a network according to the GSM/GPRS or UMTS protocols, and an assistance system for night vision of the road, including an infrared video camera 205. The connection between the processor unit 20 and the above-mentioned on-board devices/systems can be achieved by cable or by “wireless” connection (for example by radio frequency or infrared); the “wireless” mode of connection may be unnecessary if at least one subunit of the processor unit 20 is integrated in the wearable interface unit 10.
The processor unit 20 processes the signals from the on-board systems in a conventional manner to provide for the generation of video and audio signals containing the information to be delivered to the driver. The data relating to the position of the driver's head with respect to the vehicle and/or the background, and to the position of the pupil with respect to the head, are utilised by the processor unit 20 to determine, on the basis of the access priority, the distribution of the individual elements of information within the driver's field of view. The processor unit 20 then transmits, in a known manner (via cable or via radio), the video signal to the virtual image generator 102, which provides for the associated display to be superimposed on the scene viewed directly by the driver, and the audio signal to the speaker 103.
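A minimal sketch of this priority-based distribution, assuming each element of information carries a priority class and the gaze direction (head plus pupil) is known as a pair of angles; the zone offsets and names are illustrative, not taken from the source:

```python
def place_elements(elements, gaze_azimuth_deg, gaze_elevation_deg):
    """Assign each information element an angular position in the field of view.

    elements: list of (name, priority), with priority 0 = safety-critical.
    Returns {name: (azimuth, elevation)} in degrees.
    High-priority items land near the fovea; lower ones toward the periphery.
    """
    # Angular offsets from the line of sight, per priority class (illustrative).
    zones = {0: (0.0, 0.0),     # critical: centre of the field of view
             1: (0.0, -10.0),   # vehicle data: lower margin
             2: (0.0, +10.0)}   # auxiliary data: upper margin
    layout = {}
    for i, (name, priority) in enumerate(sorted(elements, key=lambda e: e[1])):
        az, el = zones.get(priority, (0.0, +12.0))
        layout[name] = (gaze_azimuth_deg + az + 4.0 * i,  # spread items apart
                        gaze_elevation_deg + el)
    return layout

layout = place_elements([("night_vision", 0), ("speed", 1), ("fuel", 1)],
                        gaze_azimuth_deg=0.0, gaze_elevation_deg=0.0)
```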
FIGS. 3 to 5 show examples of application of the wearable interface unit 10 whilst driving a motor vehicle. In these examples, in which the interface unit 10 is formed as spectacles, the information is displayed in predetermined portions of the driver's field of view defined by the lenses 12 of the spectacles 11. Preferably, the information relating to the functioning of the vehicle (for example data on speed, engine speed, fuel level, oil temperature, kilometres travelled, turning point indicator etc.) is displayed in peripheral zones of the field of view (for example at the bottom), whilst information relating to vision assistance (for example for night vision) or to indications of failures, imminent dangers or malfunctions is displayed in the central zone of the field of view (corresponding to the path of the vehicle), and information relating to navigation is displayed in the zones of the field of view relevant to the rendered navigation information (for example turning point indication, clearance etc.). The peripheral region of the field of view, dedicated to data relating to the functioning of the vehicle, can also be of the occlusive type, that is, not allowing the background to be seen: this can allow an increase in contrast.
More preferably still, part of the information (for example data on speed, engine speed, fuel level, oil temperature, kilometres travelled, etc.) is presented to the user in a permanent manner, as takes place with a current on-board instrument panel, in such a way that the overall effect is that of having a virtual instrument panel superimposed partly over the scene, that is the environment surrounding the vehicle, and partly within the vehicle.
In an advantageous embodiment the overall field of view processed by the system is greater than that which is presented instantaneously to the driver, who can thus explore the overall field of view with movements of the head and/or rotations of the pupil. With respect to the previously-described embodiment, this configuration allows the use of image-formation means of lower resolution, in that only a part of the virtual instrument panel is presented to the driver at any instant. Moreover, the optical virtual display system works on a narrower field of view, which makes this optical system simpler, lighter and of lower cost. Finally, the driver only sees information presented within a narrow field of view, centred about the direction in which the driver's head is pointing or in which the driver is looking: this is ergonomically advantageous in that the information presented at any time is limited, and so does not distract or confuse the driver.
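A minimal sketch of this windowing logic, assuming the virtual instrument panel is laid out in vehicle-fixed angular coordinates and the head direction measured by the unit 111 selects the instantaneous window (field-of-view sizes and element names are illustrative):

```python
def visible_elements(panel_layout, head_azimuth_deg, head_elevation_deg,
                     fov_width_deg=30.0, fov_height_deg=20.0):
    """Return only the panel elements inside the instantaneous field of view.

    panel_layout: {name: (azimuth, elevation)} over the overall processed FOV.
    The window is centred on the current head direction, so the image-formation
    means only ever has to render this narrow sub-view.
    """
    visible = {}
    for name, (az, el) in panel_layout.items():
        if (abs(az - head_azimuth_deg) <= fov_width_deg / 2 and
                abs(el - head_elevation_deg) <= fov_height_deg / 2):
            # Convert to window-relative coordinates for the display.
            visible[name] = (az - head_azimuth_deg, el - head_elevation_deg)
    return visible

panel = {"speed": (-20.0, -8.0), "fuel": (20.0, -8.0), "nav_arrow": (0.0, 0.0)}
print(visible_elements(panel, head_azimuth_deg=-15.0, head_elevation_deg=0.0))
# -> {'speed': (-5.0, -8.0), 'nav_arrow': (15.0, 0.0)}; "fuel" lies outside
#    the instantaneous window until the head turns toward it.
```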
As will be appreciated, although the display arrangement has been illustrated as a replacement for the traditional on-board instrument panel display devices, it can also be utilised as a complement to such on-board instrument panels, to present information of assistance to the driver which is essential for safety.
Naturally, the principle of the invention remaining the same, the details of construction and the embodiments can be widely varied with respect to what has been described and illustrated, without by this departing from the scope of the invention.
Foreign application priority data: TO2003A000662, Aug 2003, Italy (national).