The present invention relates to a 3Dimension stereoscopic display device that displays a three-dimensional stereoscopic image or a 3Dimension stereoscopic movie.
A conventional stereoscopic display device disclosed by patent reference 1 provides a three-dimensional stereoscopic image mainly intended for home use. Because this stereoscopic display device enables the user to watch a three-dimensional stereoscopic movie without wearing stereoscopic vision glasses, the stereoscopic display device offers high convenience to the user. For example, the stereoscopic display device is suitable for use as a content playback device or an RSE (Rear Seat Entertainment) display device for rear seats. The stereoscopic display device is also suitable for use in FA (Factory Automation) control systems and other image display applications.
However, when a conventional technology, typified by the technology disclosed in patent reference 1, is applied to a three-dimensional stereoscopic display of an icon or a button, a problem arises: a user operation performed on an icon or the like displayed in a three-dimensional stereoscopic manner may not be accepted, because no correspondence is clearly defined between the virtual position space in which the icon or the button is displayed in a three-dimensional stereoscopic manner and the operation input unit that actually accepts an operation on the icon or the button. More specifically, because the virtual three-dimensional stereoscopic display of the icon or the button and the hardware switch or touch panel surface that actually accepts an operation on the icon or the button exist at different positions or in different spaces, the conventional technology gives the user a feeling that something is abnormal.
The present invention is made in order to solve the above-mentioned problem, and it is therefore an object of the present invention to provide a 3Dimension stereoscopic display device that can provide an HMI (Human Machine Interface) with a three-dimensional stereoscopic display which enables the user to perform an operation matching the user's intuition.
In accordance with the present invention, there is provided a 3Dimension stereoscopic display device including: a stereoscopic display monitor unit for displaying a right-eye image or video image and a left-eye image or video image for three-dimensional stereoscopic display of an operation screen in a three-dimensional stereoscopic manner; a touch panel unit disposed on a screen of the stereoscopic display monitor unit, for detecting a relative position of a pointing object relative to a touch surface thereof, the pointing object being used for performing a touch operation on the operation screen which is displayed on the screen of the stereoscopic display monitor unit in a three-dimensional stereoscopic manner; a screen composition processing unit for generating the right-eye image or video image and the left-eye image or video image for three-dimensional stereoscopic display in which a virtual display surface for three-dimensional stereoscopic display of an icon image which is an operation target on the operation screen is set to be placed at a position forward with respect to the screen of the stereoscopic display monitor unit; and a control unit for determining that the icon image is operated when the touch panel unit detects a pointing object used for performing a touch operation on the icon image.
According to the present invention, there is provided an advantage of being able to provide an HMI with a three-dimensional stereoscopic display which enables a user to perform an operation matching the user's intuition.
Hereafter, in order to explain this invention in greater detail, the preferred embodiments of the present invention will be described with reference to the accompanying drawings.
A stereoscopic display system 1B shown in
A stereoscopic display system 1C shown in
The screen composition processing unit 4 carries out a three-dimensional stereoscopic video image compositing process on the right and left video data in the content for stereoscopic display read from the storage unit 8, the three-dimensional stereoscopic video image compositing process being specific to the present invention, and outputs the right and left video data processed thereby to the video image playback device 5. The stereoscopic display monitor 6 displays the right and left video data played back by the video image playback device 5 in a stereoscopic manner when viewed from a viewer, like that shown in
A video signal for left eyes (L) and a video signal for right eyes (R), which the video image playback device 5 generates by playing back the right and left video data, are alternately inputted to the stereoscopic display monitor 6 in order of L, R, L, R, and . . . . When receiving the video signal for left eyes (L), the liquid crystal display element group 6a operates the liquid crystal element group for left eyes, whereas when receiving the video signal for right eyes (R), the liquid crystal display element group 6a operates the liquid crystal element group for right eyes. The parallax barrier unit 6b blocks the light emitted from the backlight and passing through the liquid crystal display element group for right eyes while the liquid crystal element group for left eyes operates, and blocks the light emitted from the backlight and passing through the liquid crystal display element group for left eyes while the liquid crystal element group for right eyes operates. As a result, the left-eye video image and the right-eye video image are displayed alternately on the screen of the stereoscopic display monitor 6, so that a viewer can watch the stereoscopic video image at his or her point of view shown in
The present invention is not limited to the stereoscopic display monitor 6 having the structure shown in
The main CPU 4a controls each component disposed in the in-vehicle information system 1. This main CPU 4a functions as the screen composition processing unit 4 shown in
The GPS receiver 9 receives the position information about the position of the vehicle from GPS satellites, and the speed sensor 10 detects vehicle speed pulses for calculating the vehicle speed of the vehicle. The internal memory 11 serves as a work area when the main CPU 4a executes the application program for in-vehicle information processing. The CD/DVD drive device 12 plays back an AV source stored in a memory medium 12a, such as a CD or DVD. When stereoscopic display video data are included in an AV source stored in the memory medium 12a, the CD/DVD drive device functions as the stereoscopic video image content receiver 7 shown in
The HDD (hard disk drive) 13 is a mass storage device mounted in the in-vehicle information system 1, and stores a map database (abbreviated as a map DB hereafter) 13a, icon data 13b, and a program 13d. The map DB 13a is a database in which map data for use in navigation processing are registered. POI information, in which the locations of POIs (Points Of Interest) on a map and detailed information associated with these POIs are described, is also included in the map data. The icon data 13b show icons which are to be displayed on the screen of the stereoscopic display monitor 6, including icons showing operation buttons used for enabling the user to carry out various operations on the screen. The program 13d is an application program for in-vehicle information processing which the main CPU 4a executes; for example, the program includes an application program for map display having a program module for implementing the functions of the screen composition processing unit 4.
The radio receiver 14 receives a radio broadcast, and makes a channel selection according to, for example, an operation on a not-shown button selector. The DTV receiver 15 receives a digital television broadcast, and makes a channel selection according to an operation on a not-shown button selector, like the radio receiver 14. The DTV receiver 15 also functions as the stereoscopic video image content receiver 7 shown in
The in-vehicle LAN_I/F unit 16 is an interface between an in-vehicle LAN (Local Area Network) 17 and the main CPU 4a, and relays data communications between, for example, other equipment connected to the in-vehicle LAN 17, and the main CPU 4a. Further, the storage unit 8 shown in
Sound signals played back by the CD/DVD drive device 12, the radio receiver 14, and the DTV receiver 15 and a sound signal from the main CPU 4a are amplified by the amplifier 19, and a sound is outputted via the speaker 20. As the sound signal from the main CPU 4a, there is a route guidance voice signal generated through the navigation processing, for example.
The three-dimensional touch panel 22 detects, in a noncontact manner, that a pointing object, such as a user's finger, has come within a predetermined distance of a touch surface thereof, and also detects contact of a pointing object with the touch surface. More specifically, the three-dimensional touch panel has, as its detection region, a three-dimensional space extending from the touch surface in the direction of the normal to the touch surface.
A plurality of infrared LEDs 23 are arranged along two sides which perpendicularly intersect each other and which are included in the periphery of the touch switch 22b, and a plurality of light receiving elements 24, each for receiving a corresponding one of the infrared light rays from the plurality of infrared LEDs 23, are arranged along the two other sides opposite to the two sides along which the infrared LEDs 23 are arranged. In this arrangement, the infrared light rays emitted from the plurality of infrared LEDs 23 form a grid over the touch switch 22b, as shown in
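As an illustrative sketch only (not part of the disclosed hardware), the decoding of such an infrared grid into a finger position could proceed as follows; the beam pitch, the detection height z3, and all function names are assumptions introduced for illustration.

```python
# Hypothetical decoding of the IR grid over the touch switch 22b into a
# finger position. Beam pitch and detection height are assumed values.

BEAM_PITCH_MM = 5.0   # assumed spacing between adjacent infrared beams
Z3_MM = 20.0          # assumed height z3 of the noncontact detection region

def decode_finger_position(blocked_x_beams, blocked_y_beams):
    """Return (x, y, z3) for a finger interrupting the IR grid, or None
    when no beam is blocked.

    blocked_x_beams / blocked_y_beams are lists of indices of the light
    receiving elements 24 that stopped receiving light from the opposing
    infrared LEDs 23.
    """
    if not blocked_x_beams or not blocked_y_beams:
        return None
    # Take the centre of the shadowed beams on each axis.
    x = BEAM_PITCH_MM * sum(blocked_x_beams) / len(blocked_x_beams)
    y = BEAM_PITCH_MM * sum(blocked_y_beams) / len(blocked_y_beams)
    # The IR grid sits at the fixed height z3 above the touch surface,
    # so a blocked beam always reports that height.
    return (x, y, Z3_MM)
```

A finger shadowing beams 3 and 4 on one axis and beam 7 on the other would thus be reported as a single point at the grid height.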
Next, the operation of the 3Dimension stereoscopic display device will be explained.
The 3Dimension stereoscopic display device composites images into a three-dimensional stereoscopic image in which a virtual display of a planar image is placed at a position forward or backward with respect to the touch surface of the three-dimensional touch panel 22 (which is assumed to be at the same position as the screen of the stereoscopic display monitor 6) when viewed from a viewer's position, and displays the three-dimensional stereoscopic image in a stereoscopic manner. For example, when displaying a planar map in a map display in an in-vehicle navigation device, the 3Dimension stereoscopic display device places the virtual display of the planar map at a position backward with respect to the touch surface of the three-dimensional touch panel 22 (i.e., farther away from the driver than the touch surface). In this case, the 3Dimension stereoscopic display device can lessen the difference in focus distance between the focus position of the road scene ahead of the vehicle at which the driver is looking while driving, and the position of the virtual map display surface. More specifically, the 3Dimension stereoscopic display device enables the driver looking at an area ahead of the vehicle to direct his or her line of sight towards the touch surface of the three-dimensional touch panel 22 with a shorter travel distance of his or her focus position, and to view the map on the screen without having a feeling that something is abnormal. In this way, the 3Dimension stereoscopic display device can make the map displayed in a stereoscopic manner legible, and, as a result, can improve the safety of the driver at the time of looking at the map display.
In the example shown in
Further, the position of the driver's right eye is expressed as a point Or(xr, yr, 0), the position of the driver's left eye is expressed as a point Ol(xl, yl, 0), and the gap between the left and right eyes is expressed as d; that is, the following relationship: |xr−xl|=d is established. The projection of a point p(x, y) on the planar map shown by the planar map data Pic_plane onto the virtual map display surface P yields a point p(x, y, z) on the map display surface P.
Right-eye image data Pic_R(x, y) of the planar map are expressed by a set of points pr at each of which a straight line (vector Vr) which connects between a point p(x, y, z) on the virtual map display surface P and the point Or(xr, yr, 0) which is the position of the right eye intersects the touch surface of the three-dimensional touch panel 22. Similarly, left-eye image data Pic_L(x, y) of the planar map are expressed by a set of points pl at each of which a straight line (vector Vl) which connects between the point p(x, y, z) on the virtual map display surface P and the point Ol(xl, yl, 0) which is the position of the left eye intersects the touch surface of the three-dimensional touch panel 22.
The screen composition processing unit 4 calculates the points pr and pl by using the planar map data Pic_plane and the parameters Z0, z, and d in such a way that the distance between the virtual map display surface P and the position of each of the driver's right and left eyes is equal to z to generate right-eye image data Pic_R(x, y) and left-eye image data Pic_L (x, y), and outputs these right-eye and left-eye image data to the video image playback device 5.
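The projection described above can be sketched as follows; the geometry (eyes in the plane z = 0, the screen Q at z = Z0, the virtual surface P at depth z) follows the description, while the concrete coordinates and function names are illustrative assumptions only.

```python
# Sketch of the parallax projection: a point p(x, y, z) on the virtual
# map display surface P is projected towards each eye onto the screen
# plane z = Z0. All numeric values used here are illustrative.

def project_to_screen(p, eye, Z0):
    """Intersect the line from `eye` (in the plane z = 0) through `p`
    (at depth z > 0) with the screen plane z = Z0, returning the
    (x, y) screen point."""
    x, y, z = p
    ex, ey, _ = eye
    t = Z0 / z                       # parameter where the line crosses z = Z0
    return (ex + (x - ex) * t, ey + (y - ey) * t)

def stereo_pair(p, eye_left, eye_right, Z0):
    """Return (pl, pr): the left-eye and right-eye screen points for a
    single point p on the virtual display surface P."""
    return (project_to_screen(p, eye_left, Z0),
            project_to_screen(p, eye_right, Z0))
```

With z greater than Z0 the left-eye and right-eye points fall on one side of each other, and with z smaller than Z0 the disparity reverses sign, which is what makes the surface appear to float in front of the screen.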
The video image playback device 5 plays back the right-eye image data Pic_R(x, y) and the left-eye image data Pic_L(x, y) which are generated by the screen composition processing unit 4, and outputs them to the stereoscopic display monitor 6. The stereoscopic display monitor 6 displays the planar map in a stereoscopic manner by using the right-eye image data Pic_R(x, y) and the left-eye image data Pic_L(x, y) which are played back by the video image playback device 5. At this time, the 3Dimension stereoscopic display device makes the planar map look as if it is displayed on the virtual map display surface P placed at a position backward with respect to the touch surface of the three-dimensional touch panel 22 when viewed from the driver's position by using stereoscopic vision.
Further, when displaying a planar map in a map display in an in-vehicle navigation device, the 3Dimension stereoscopic display device can place the virtual display surface P of the planar map at a position forward with respect to the touch surface of the three-dimensional touch panel 22 (i.e., at a position closer to the driver than the touch surface). At this time, the 3Dimension stereoscopic display device makes the planar map look as if it is displayed to be floating from the touch surface of the three-dimensional touch panel 22 when viewed from the driver's position by using stereoscopic vision.
In the case of z<Z0, the 3Dimension stereoscopic display device makes the planar map shown by the planar map data Pic_plane look as if it is displayed on the virtual map display surface P placed at a position forward with respect to the touch surface of the three-dimensional touch panel 22 when viewed from the driver's position by using stereoscopic vision. As an alternative, when carrying out the screen compositing process with the following relationship: z=Z0, the 3Dimension stereoscopic display device makes the virtual map display surface P of the planar map coincide with the touch surface of the three-dimensional touch panel 22, so that the planar map looks as if it is displayed on the screen Q. As an alternative, when carrying out the screen compositing process with the following relationship: z>Z0, the 3Dimension stereoscopic display device makes the planar map look as if it is displayed on the virtual map display surface P placed at a position backward with respect to the touch surface of the three-dimensional touch panel 22 (i.e., farther away from the driver than the touch surface) when viewed from the driver's position by using stereoscopic vision.
Although the case in which the present invention is applied to the in-vehicle information system 1 is shown in the above-mentioned explanation, the application of the present invention is not limited to in-vehicle systems, and the present invention can be applied to any system that requires display and operation, such as an FA system, a panel computer, or a guidance display system.
Further, in the case of z>Z0, when displaying the virtual map display surface P at a position farther away from the driver than the touch surface of the three-dimensional touch panel 22, the 3Dimension stereoscopic display device can improve the safety of the driver at a time when the driver looks at the map display, as mentioned above. In the case of z<Z0, the 3Dimension stereoscopic display device provides an advantage of making the screen legible by displaying the display screen in such a way that it is floating closer to the driver. Therefore, the 3Dimension stereoscopic display device can perform a control operation of setting the relationship between the parameters z and Z0 to z>Z0 when the vehicle is travelling, whereas the 3Dimension stereoscopic display device can perform a control operation of setting the relationship between the parameters z and Z0 to z<Z0 when the vehicle is at rest.
In the above-mentioned explanation, the case in which a planar map is displayed on a virtual map display surface P is shown. A case in which software buttons for operational input, such as icons, are displayed in a stereoscopic manner on another virtual display surface parallel to the virtual map display surface P will be mentioned hereafter.
First, the main CPU 4a reads map data from the map DB 13a stored in the HDD 13, and generates planar map data Pic_plane according to a predetermined map drawing algorithm, as shown in
In this example, the 3Dimension stereoscopic display device displays the planar map shown by the planar map data Pic_plane on the virtual map display surface P which is placed at a position backward with respect to the touch surface of the three-dimensional touch panel 22, and displays an enter button and a return button on the virtual display surface R which is placed at a position forward with respect to the touch surface of the three-dimensional touch panel 22. Hereafter, the distance between the touch surface of the three-dimensional touch panel 22 and the display surface R of the icons is expressed as z1. More specifically, the 3Dimension stereoscopic display device makes each of the icons of the enter button and the return button look as if it is floating at the distance z1 from the touch surface of the three-dimensional touch panel 22 with respect to the driver's position by using stereoscopic vision. In the example shown in
Right-eye image data Pic_R(x, y) of the composited screen are expressed by a set of points pr at each of which a straight line (vector Vr) which connects between a point p(x, y, z) on the virtual map display surface P or a point p(x, y, Z0-z1) on the display surface R and the point Or(xr, yr, 0) which is the position of the right eye intersects the screen Q of the stereoscopic display monitor 6. Similarly, left-eye image data Pic_L(x, y) are expressed by a set of points pl at each of which a straight line (vector Vl) which connects between the point p(x, y, z) on the virtual map display surface P or the point p(x, y, Z0-z1) on the display surface R and the point Ol(xl, yl, 0) which is the position of the left eye intersects the screen Q of the stereoscopic display monitor 6. Each of the icons of the enter button and the return button is thus expressed by a set of points pr in the right-eye image and a set of points pl in the left-eye image.
The screen composition processing unit 4 receives the planar map data Pic_plane generated by the main CPU 4a (step ST1). Next, the screen composition processing unit 4 receives the icon data about the enter button and the return button which the main CPU 4a has read from the HDD 13 (step ST2). The screen composition processing unit 4 then receives the parameters Z0, z, d, and z1 from the internal memory 11 (step ST3).
The screen composition processing unit 4 then calculates the points pr and pl in such a way that the distance between the virtual map display surface P and the position of the driver's eyes is equal to z and the distance between the display surface R of the icons and the position of the driver's eyes is equal to (Z0-z1) by using the planar map data Pic_plane, the parameters Z0, z, d, and z1 and the icon data to generate right-eye image data Pic_R(x, y) and left-eye image data Pic_L(x, y) in the same way that the screen composition processing unit according to above-mentioned Embodiment 1 does (step ST4). After that, the screen composition processing unit 4 outputs the right-eye image data Pic_R(x, y) and the left-eye image data Pic_L(x, y) which are generated thereby to the video image playback device 5 (step ST5).
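Steps ST1 to ST5 can be sketched as follows, under the same assumed geometry as before: the map surface P sits at depth z, the icon surface R at depth Z0 − z1, and every source point is projected towards each eye onto the screen plane z = Z0. The data structures and function names are illustrative, not those of the actual unit.

```python
# Sketch of steps ST1 to ST5: composite map points (on surface P) and
# icon points (on surface R) into left-eye and right-eye image data.

def project(p, eye, Z0):
    """Project point p = (x, y, depth) towards `eye` onto z = Z0."""
    x, y, depth = p
    ex, ey, _ = eye
    t = Z0 / depth
    return (ex + (x - ex) * t, ey + (y - ey) * t)

def compose_stereo(map_points, icon_points, Z0, z, z1, eye_left, eye_right):
    """Return (Pic_L, Pic_R): lists of projected screen points.
    map_points / icon_points are 2-D (x, y) points on the planar map
    and on the icon images respectively."""
    pic_l, pic_r = [], []
    for (x, y) in map_points:          # ST1: planar map data
        p = (x, y, z)                  # lift onto virtual surface P
        pic_l.append(project(p, eye_left, Z0))
        pic_r.append(project(p, eye_right, Z0))
    for (x, y) in icon_points:         # ST2: icon data (enter/return)
        p = (x, y, Z0 - z1)            # lift onto icon surface R
        pic_l.append(project(p, eye_left, Z0))
        pic_r.append(project(p, eye_right, Z0))
    return pic_l, pic_r                # ST4/ST5: hand off to playback
```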
The video image playback device 5 plays back the right-eye image data Pic_R(x, y) and the left-eye image data Pic_L(x, y) which are generated by the screen composition processing unit 4, and outputs them to the stereoscopic display monitor 6. The stereoscopic display monitor 6 displays the planar map and the icons in a stereoscopic manner by using the right-eye image data Pic_R(x, y) and the left-eye image data Pic_L(x, y) which are played back by the video image playback device 5 (step ST6). At this time, the 3Dimension stereoscopic display device makes the enter button and the return button look as if they are floating above the touch surface of the three-dimensional touch panel 22 with respect to the driver's position by using stereoscopic vision.
In the above-mentioned display state, the main CPU 4a determines whether or not the user's finger is approaching the touch surface of the three-dimensional touch panel 22 according to a detection signal from the three-dimensional touch panel 22 (step ST7). For example, when the user's finger moves and then enters the detection region defined by z3 and formed by the infrared LEDs 23 and the light receiving elements 24 in the three-dimensional touch panel 22, the three-dimensional touch panel 22 detects the coordinates of this finger as a point (x, y, z3). When the user's finger touches the touch surface, the three-dimensional touch panel 22 detects the coordinates of the finger as (x, y, 0), and outputs a detection signal showing the coordinates to the main CPU 4a. In the structure shown in
When the main CPU 4a determines that the user's finger is approaching the touch surface of the three-dimensional touch panel 22 (when YES in step ST7), the main CPU 4a carries out a predetermined process and a screen transition which are to be performed when the corresponding icon which the finger has approached is touched (step ST8). For example, when the user's finger is approaching the "return button," the main CPU shifts to step ST8 and carries out a predetermined operation by assuming that the "return button" is pushed down and, after that, returns to the process of step ST1. Thus, the 3Dimension stereoscopic display device enables the user to simply operate, in a noncontact manner, a stereoscopic image icon which looks as if it is floating via stereoscopic vision, to make the in-vehicle information system carry out the function corresponding to the operation. In contrast, when the main CPU 4a determines that the user's finger is not approaching the touch surface of the three-dimensional touch panel 22 (when NO in step ST7), the 3Dimension stereoscopic display device returns to the process of step ST1.
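The decision in steps ST7 and ST8 can be sketched as follows; the icon rectangles, coordinate units, and function names are assumptions introduced for illustration only.

```python
# Sketch of steps ST7/ST8: hit-test a finger detected at (x, y, z3)
# against on-screen icon rectangles. Icon bounds are assumed values.

ICONS = {
    "return": (10, 10, 60, 40),   # (x_min, y_min, x_max, y_max), illustrative
    "enter":  (80, 10, 130, 40),
}

def icon_under_finger(detection):
    """`detection` is None, or the (x, y, z) coordinates reported by the
    three-dimensional touch panel 22. Return the name of the icon the
    finger is approaching, or None (the NO branch of step ST7)."""
    if detection is None:
        return None
    x, y, _z = detection
    for name, (x0, y0, x1, y1) in ICONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name               # YES branch: treat the icon as pushed
    return None
```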
Further, the 3Dimension stereoscopic display device can generate a three-dimensional stereoscopic image in which the distance z1 at which the icon image is floating from the touch surface of the three-dimensional touch panel 22 via stereoscopic vision is made equal to the detection distance z3 at which a pointing object approaching the three-dimensional touch panel 22 is detected. In this way, the 3Dimension stereoscopic display device can implement a user-friendly operation screen in which the display position of each icon coincides with the sensitivity area of the three-dimensional touch panel 22.
In addition, the case in which the virtual map display surface P of the planar map is placed at a position backward with respect to the touch surface of the three-dimensional touch panel 22 is shown in the above-mentioned explanation. As an alternative, the 3Dimension stereoscopic display device can make the map display surface P coincide with the touch surface of the three-dimensional touch panel 22 (z=Z0), so that each stereoscopic image icon looks as if only the icon is floating from the touch surface via stereoscopic vision; this improves the designability of the three-dimensional stereoscopic image and enables the user to easily operate each stereoscopic image icon.
Further, the 3Dimension stereoscopic display device can set the parameters, such as z and z1, to the screen composition processing unit 4 through a user operation, and can change the already-set parameters through a user operation. For example, the 3Dimension stereoscopic display device enables the user to freely set the distance z1 at which each icon image is floating from the touch surface of the three-dimensional touch panel 22 via stereoscopic vision by performing a drag process of dragging the icon image along the z-axis of the three-dimensional touch panel 22 (in the direction of the normal to the touch surface).
In addition, the parameters, such as z and z1, can include a distance which is predetermined according to the state of the vehicle equipped with or holding the 3Dimension stereoscopic display device. As the distance which is predetermined according to the state of the vehicle, a distance which is predetermined according to the speed of the vehicle can be provided. More specifically, the 3Dimension stereoscopic display device can set preset values of the distance z to the map display surface P and of the distance z1 between the touch surface of the three-dimensional touch panel 22 and the display surface R of the icons to the screen composition processing unit 4 according to the speed of the vehicle. For example, the 3Dimension stereoscopic display device sets z to satisfy the following relationship: z>Z0 when the vehicle is travelling, whereas the device sets z to satisfy the following relationship: z=Z0 when the vehicle is at rest. Further, when the vehicle is travelling, the 3Dimension stereoscopic display device sets z1 to a smaller value than that when the vehicle is at rest.
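The parameter control just described can be sketched as follows; the concrete distances are illustrative assumptions, and only the relationships (z > Z0 and a smaller z1 while travelling, z = Z0 and a larger z1 at rest) come from the description above.

```python
# Sketch of speed-dependent parameter selection for the screen
# composition processing unit 4. All numbers are assumed values.

Z0 = 400.0  # assumed distance from the driver's eyes to the touch surface

def depth_parameters(vehicle_speed_kmh):
    """Return (z, z1) according to the vehicle state."""
    if vehicle_speed_kmh > 0:        # travelling
        return Z0 * 1.5, 10.0        # z > Z0, smaller icon float distance z1
    return Z0, 30.0                  # at rest: z = Z0, larger float distance
```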
As mentioned above, the 3Dimension stereoscopic display device in accordance with this Embodiment 1 includes the stereoscopic display monitor 6 for displaying a right-eye image or video image and a left-eye image or video image for three-dimensional stereoscopic display of an operation screen in a three-dimensional stereoscopic manner, the three-dimensional touch panel 22 disposed on the screen of the stereoscopic display monitor 6, for detecting a relative position of a pointing object relative to the touch surface thereof, the pointing object being used for performing a touch operation on the operation screen which is displayed on the screen of the stereoscopic display monitor 6 in a three-dimensional stereoscopic manner, the screen composition processing unit 4 for generating the right-eye image or video image and the left-eye image or video image for three-dimensional stereoscopic display in which the virtual display surface R for three-dimensional stereoscopic display of an icon image which is an operation target on the operation screen is set to be placed at a position forward with respect to the screen Q of the stereoscopic display monitor 6, and the main CPU 4a for determining that the icon image is operated when the three-dimensional touch panel unit 22 detects a pointing object used for performing a touch operation on the icon image. Because the 3Dimension stereoscopic display device is constructed in this way, the 3Dimension stereoscopic display device can provide an advantage of being able to provide an HMI with a three-dimensional stereoscopic display which enables the user to perform an operation matching the user's intuition.
Further, the 3Dimension stereoscopic display device according to above-mentioned Embodiment 1 can display each stereoscopic image icon in the following way according to a user operation.
When the user's finger approaches the “return button” in the display state shown in
When the user then performs a gesture of further pushing the stereoscopic image icon of the “return button,” the three-dimensional touch panel 22 detects the travel distance of the finger caused by the gesture and then notifies this travel distance to the screen composition processing unit 4. The screen composition processing unit 4 changes the distance z1 between the virtual display surface R for display of the stereoscopic image icon of the “return button” and the touch surface according to the travel distance of the finger caused by the above-mentioned gesture to display the stereoscopic image icon of the “return button” in such a way that the stereoscopic image icon is recessed according to the gesture of pushing the stereoscopic image icon by using the finger, as shown in
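The push gesture handling above can be sketched as follows; the function name and the clamping at the touch surface are assumptions for illustration.

```python
# Sketch of the push gesture: the panel reports how far the finger has
# travelled towards the touch surface, and the float distance z1 of the
# "return button" icon surface R is reduced by that amount so that the
# icon appears to be pressed in.

def pushed_float_distance(z1, finger_travel):
    """Return the new float distance of the icon surface R after the
    finger has moved `finger_travel` towards the touch surface. The
    icon is assumed not to recede behind the touch surface (z1 >= 0)."""
    return max(0.0, z1 - finger_travel)
```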
In this way, the 3Dimension stereoscopic display device can provide an HMI which enables the user to perform an operation matching the user's intuition. As an alternative, the 3Dimension stereoscopic display device can notify the user that the stereoscopic image icon is focused by changing the color or shape of the stereoscopic image, vibrating the stereoscopic image, or providing a change in the user's sense of touch, and can likewise notify the user that the stereoscopic image icon is operated by changing the color or shape of the stereoscopic image to a predetermined color or shape, vibrating the stereoscopic image, or providing a change in the user's sense of touch.
In above-mentioned Embodiment 1, a user operation of moving the user's finger can be an operation of moving a finger as if to draw a circle, a V-shaped checking operation of moving a finger as if to mark a checkbox, or an operation of moving a finger up and down or rightward and leftward as long as the main CPU 4a can identify the user operation by using the detection information from the three-dimensional touch panel 22 which is a pointing object detection unit.
As an alternative, the 3Dimension stereoscopic display device enables the user to select a pattern from preset patterns as the user operation of moving the user's finger, or can provide a gesture registration mode in which the user is allowed to register his or her own gesture in the system and enables the user to perform a registered gesture as the above-mentioned user operation.
In addition, the 3Dimension stereoscopic display device can perform a control operation of not changing the focus position, which is determined via stereoscopic vision, of an icon corresponding to a function which is not permitted to be performed according to the state of the vehicle equipped with or holding the 3Dimension stereoscopic display device even when detecting a user operation on the icon. For example, as an example of the icon corresponding to a function which is not permitted to be performed according to the above-mentioned state of the vehicle, there can be an icon which does not accept the operation assigned thereto because of restrictions on operations at a time when the vehicle is travelling. In this case, the 3Dimension stereoscopic display device can display the above-mentioned icon by changing the color and shape of the icon to a color and a shape different from those of icons corresponding to the functions which are permitted to be performed when the vehicle is travelling, or can send out a warning sound or a warning message when the user operates the icon. Further, the 3Dimension stereoscopic display device can change the color of an icon which is not permitted to be operated to gray, make the icon semi-transparent, or reduce the degree of projection with which the icon looks as if it is projecting via stereoscopic vision.
Further, when the user's finger approaches an icon, the 3Dimension stereoscopic display device in accordance with above-mentioned Embodiment 1 can display icons existing in a fixed region surrounding the finger in a larger size, thereby making it easy for the user to operate any one of the icons.
As shown in
When the screen composition processing unit 4 specifies, by using the coordinate data inputted thereto from the three-dimensional touch panel 22, the character key button 50a which the user's finger has approached and the character key buttons 50a adjacent to that character key button, the screen composition processing unit 4 generates a three-dimensional stereoscopic image in which those character key buttons 50a are displayed larger, by a predetermined size, than the other character key buttons 50a and the other various buttons 52 to 56, and displays the three-dimensional stereoscopic image on the stereoscopic display monitor 6 via the video image playback device 5. As a result, as shown in
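The selection of the approached character key button and its neighbours can be sketched as below. This is an illustrative assumption about the geometry (a regular key grid with a fixed pitch), not the embodiment's actual layout: the key nearest the reported finger coordinates is taken as the approached button, and every key within one grid pitch of it is enlarged.

```python
def keys_to_enlarge(finger_xy, key_centers, pitch):
    """key_centers: {label: (x, y)} on a regular grid with the given pitch.
    Returns the key the finger has approached plus its grid neighbours."""
    fx, fy = finger_xy
    # the key whose center is closest to the finger coordinates
    target = min(key_centers,
                 key=lambda k: (key_centers[k][0] - fx) ** 2
                             + (key_centers[k][1] - fy) ** 2)
    tx, ty = key_centers[target]
    # the target key and every key at most one pitch away in x and y
    return {label for label, (x, y) in key_centers.items()
            if abs(x - tx) <= pitch and abs(y - ty) <= pitch}

def scale_for(label, enlarged, factor=1.5):
    """Draw enlarged keys larger by a predetermined factor."""
    return factor if label in enlarged else 1.0
```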
Further, when the user's finger approaches a character key button, the 3Dimension stereoscopic display device increases the degree of projection with which the character key button which the user's finger has approached and the character key buttons 50a adjacent to that character key button appear to project via stereoscopic vision, as shown in
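The increased degree of projection can be related to the on-screen parallax by simple geometry. The sketch below is an assumption for illustration (the eye separation and viewing distance values are placeholders, not the embodiment's figures): for an object that should appear `pop_out` mm in front of the screen, similar triangles give the required horizontal offset between the left-eye and right-eye images, and approached keys are simply assigned a larger pop-out than the rest.

```python
def screen_disparity(pop_out, eye_sep=65.0, view_dist=600.0):
    """Horizontal offset (mm) between left- and right-eye images needed to
    make a button appear pop_out mm in front of the screen plane
    (crossed disparity, derived from similar triangles)."""
    return eye_sep * pop_out / (view_dist - pop_out)

def pop_out_for_key(label, approached_keys, base=5.0, boost=2.0):
    """The approached key and its neighbours project further than others."""
    return base * boost if label in approached_keys else base
```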
Further, although the case in which a planar map is displayed in a stereoscopic manner is shown in above-mentioned Embodiment 1, the present invention can also be applied to a display of information, such as a menu screen for an AV system, vehicle information, or safety information, as long as the information is of a type typically displayed by the in-vehicle information system. For example, the present invention can be used for a display of an icon for control of an air conditioner, a meter panel in the dashboard, information about the fuel efficiency of the vehicle, preventive safety information, VICS (registered trademark) information, or the like.
In addition, although the case in which a stereoscopic display viewed stereoscopically with the naked eye is produced is shown in above-mentioned Embodiment 1, the present invention can also use a stereoscopic display method of providing a stereoscopic image by using polarization eyeglasses. Further, although in Embodiment 1 an optical type three-dimensional touch panel for detecting that a finger or a pointing object has reached a region at a distance of z3 or less from the touch surface is used as the three-dimensional touch panel, as shown in
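The proximity detection mentioned above can be sketched as a simple zone classification. The threshold values below are illustrative assumptions only (the embodiment defines z3 but its actual figures are not reproduced here): the detected finger height above the touch surface is mapped to a state that the main CPU 4a could use to drive enlargement, projection, and press handling.

```python
def proximity_state(z, z1=2.0, z2=10.0, z3=40.0):
    """Classify the detected finger height z (mm) above the touch surface.
    z1 < z2 < z3 are placeholder thresholds, not the embodiment's values."""
    if z <= z1:
        return "touch"      # treat as an actual press on the button
    if z <= z2:
        return "approach"   # enlarge / pop out the nearby buttons
    if z <= z3:
        return "detected"   # finger has entered the sensing region
    return "none"           # outside the panel's detection range
```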
Although the case in which the 3Dimension stereoscopic display device in accordance with the present invention is applied to an in-vehicle information system is shown in above-mentioned Embodiment 1, the 3Dimension stereoscopic display device in accordance with the present invention can be applied to any display device having such a stereoscopic display monitor as mentioned above. For example, the 3Dimension stereoscopic display device in accordance with the present invention can be applied not only to an in-vehicle navigation device, but also to a display device for use in a mobile telephone terminal or a mobile information terminal (PDA; Personal Digital Assistant). Further, the 3Dimension stereoscopic display device in accordance with the present invention can be applied to a display device, such as a PND (Portable Navigation Device), which a person carries onto a moving object, such as a car, a train, a ship, or an airplane, and uses there.
The present invention is not limited to the structure explained in above-mentioned Embodiment 1. That is, it is to be understood that some of the structural components shown in above-mentioned Embodiment 1 can be freely combined, and a variation and an omission of each of the structural components can be made without departing from the spirit and scope of the invention.
Because the 3Dimension stereoscopic display device in accordance with the present invention can provide an HMI with a three-dimensional stereoscopic display which enables the user to perform an operation matching the user's intuition, the 3Dimension stereoscopic display device is suitable for use as a display device mounted in an in-vehicle information system.
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---|
PCT/JP2010/006220 | 10/20/2010 | WO | 00 | 12/13/2012 |