The present invention relates to a display device including a display unit such as a touch panel.
A display device allowing for touch input can typically be implemented by providing a resistive film sheet, a capacitive sheet, or the like on an LCD (Liquid Crystal Display). Display devices of this type are used as interfaces allowing for intuitive operation in various applications such as ATMs installed in banks, ticket-vending machines, car navigation systems, mobile devices, and gaming machines.
Basically, a user provides an input or inputs to such a display device by touching one point or a plurality of points on it. The display device can determine only whether the user has touched the screen and where on the display screen the touch has occurred.
Known techniques for implementing touch input are a resistive-film method, a capacitive method, an infrared method, an electromagnetic induction method, and an ultrasonic method, as well as a method of implementing touch input by analyzing an image captured by a photosensor built-in LCD so as to detect a touch location. For example, Japanese Patent Laying-Open No. 2006-244446 (Patent Document 1) describes such a photosensor built-in LCD.
As such, there are various known techniques for detecting a touch location. However, the direction of a touch operation cannot be determined merely by detecting the touch location.
In view of this, Japanese Patent Laying-Open No. 2006-47534 (Patent Document 2) proposes performing display control in accordance with a result of determining from which of the driver's seat and the passenger's seat a touch operation has been performed on the display screen of a dual-view LCD car navigation system, which is installed in a vehicle and presents different images to a person on the left side and a person on the right side. This system determines which of the persons on the left and right seats has performed the touch operation by transmitting different signals from the driver's seat and the passenger's seat via the human bodies in contact with the vehicular seats or the steering wheel.
Patent Document 1: Japanese Patent Laying-Open No. 2006-244446
Patent Document 2: Japanese Patent Laying-Open No. 2006-47534
In the system proposed in Patent Document 2, the persons always need to be in contact with the signal transmitting devices provided in addition to the display device, i.e., always need to sit on the seats or touch the steering wheel. Hence, the system can determine a touch operation direction only under such a limited circumstance. Further, a relatively large-scale device needs to be provided outside the display device to transmit the signals. It is therefore unrealistic to apply the system to mobile devices, which users carry with them when they go out.
Apart from the system for transmitting the determination signals via human bodies, Patent Document 2 also proposes a system for determining a touch operation direction by analyzing the display device's display screen image captured from behind the passenger's seat and the driver's seat by a camera installed at a rear portion of the ceiling in the vehicular compartment.
However, this system can hardly detect the touch direction with precision when the touched portion is hidden by obstacles such as the persons' heads or backs, or when one person's hand crosses the other person's hand. As with the system in which the signal transmitting device is provided outside the display device, this method likewise cannot be applied to mobile devices, which users carry with them when they go out.
The present invention is made to solve the foregoing problems, and its object is to provide a display device capable of specifying a touch operation direction when a touch operation is performed (direction of touch onto a display screen in the touch operation), as well as a control method in such a display device.
A display device according to the present invention includes: a display unit having a display screen; an image capturing unit for capturing, in an image, a touch operation performed onto the display screen, from inside the display unit; a touch location detecting unit for detecting a touch location of the touch operation performed onto the display screen; and a direction determining unit for determining a touch direction of the touch operation performed onto the display screen, based on the image captured by the image capturing unit.
Preferably, the direction determining unit determines the touch direction based on an image of a predetermined area from the touch location in the image captured by the image capturing unit.
Preferably, the direction determining unit determines the touch direction based on a change in density in an image of the touch location and a periphery thereof, when a finger or a touch member touches the display screen for the touch operation and is captured in the image by the image capturing unit.
Preferably, the image capturing unit is a photosensor or a temperature sensor.
Preferably, the touch location detecting unit detects the touch location based on the image captured by the image capturing unit.
According to the present invention, a touch operation onto a display screen is captured in an image from inside the display unit. Based on the image thus captured, a touch operation direction for the touch location is determined. Accordingly, the display device itself is capable of specifying the touch operation direction when the touch operation is detected.
1, 2, 11: display device; 10, 18: photosensor built-in LCD; 14: touch location detecting unit; 15: touch operation direction determining unit; 17: capacitive type touch panel; 20: photosensor built-in dual-view LCD.
The following describes embodiments of the present invention with reference to figures. In the description below, the same parts and components are given the same reference characters. Their names and functions are the same.
Display device 1 includes a photosensor built-in LCD (liquid crystal panel/display) 10 that has pixels in each of which a photosensor is built and that is capable of not only displaying but also capturing an image. Display device 1 further includes a touch location detecting unit 14 for detecting a touch location by analyzing the image captured by photosensor built-in LCD 10. Display device 1 further includes a touch operation direction determining unit 15 for determining in what direction it is touched, by analyzing the image captured by photosensor built-in LCD 10. Display device 1 further includes a control unit 19, which receives data indicating a touch location and a touch operation direction, for performing general control over display device 1. Display device 1 further includes a memory unit 12 for storing data and instructions therein. Display device 1 further includes an image generating unit 13 for generating an image to be displayed on a screen.
When a touch operation onto the display screen is detected, photosensor built-in LCD 10 captures from its inside an image of the display screen thus touched and operated. Photosensor built-in LCD 10 provides the captured image to touch location detecting unit 14. Touch location detecting unit 14 analyzes the image to detect the touch location.
Exemplary methods usable for detecting a touch location are techniques such as edge feature extraction and pattern matching. Because the touch location is detected by such image processing, touch location detecting unit 14 can detect a plurality of touch locations when a plurality of locations on the screen are touched simultaneously. This allows an operator to simultaneously operate a plurality of operation targets displayed on the screen, and also permits a plurality of operators to operate them simultaneously.
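As an illustration of such image-based detection of multiple touch locations, the following sketch (in Python) uses a simple background-difference and connected-component approach; the array inputs, the threshold values, and the use of a touch-free background frame are assumptions made for this illustration, and the embodiment's own edge-extraction or pattern-matching technique may differ.

    import numpy as np
    from scipy import ndimage

    def detect_touch_locations(frame, background, diff_threshold=30, min_area=40):
        # Pixels that depart from the touch-free background frame are
        # candidate finger-contact pixels.
        diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
        mask = diff > diff_threshold
        # Group connected candidate pixels into blobs; tiny blobs are
        # treated as noise and discarded.
        labels, count = ndimage.label(mask)
        locations = []
        for index in range(1, count + 1):
            blob = labels == index
            if blob.sum() >= min_area:
                # One centroid per blob, so simultaneous touches at
                # several locations yield several detected coordinates.
                locations.append(ndimage.center_of_mass(blob))
        return locations

Each returned centroid corresponds to one touched region, which is what allows several operators, or several fingers, to be handled at once.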
Touch operation direction determining unit 15 analyzes the image obtained by photosensor built-in LCD 10, with reference to the touch location detected by touch location detecting unit 14. Through the analysis, touch operation direction determining unit 15 determines a touch operation direction.
Control unit 19 uses the touch location detected by touch location detecting unit 14 and the touch operation direction determined by touch operation direction determining unit 15 to control display device 1, which serves as, for example, a gaming machine.
Photosensor built-in LCD 10 includes photosensors (not shown) respectively corresponding to the pixels. The photosensors mainly respond to frequencies in the visible light region, and receive external light as well as light that is emitted from backlight 16 and reflected by a finger or the like placed on the display screen.
Images captured by the photosensors upon touch operations differ depending on the intensities of the external light and the backlight as well as a manner of placing a finger thereon.
Even when the intensity of the external light is changed, the brightness of a portion corresponding to the finger pad placed in contact with or adjacent to the display screen is not changed much. In contrast, the brightness in a portion not shaded by the finger and thus receiving the external light is directly affected by the change in brightness of the external light. Hence, the portion not shaded by the finger and receiving the external light gets darker or brighter than the portion corresponding to the finger pad, whereby contrast changes therebetween.
Part (a) of the figure shows an exemplary image captured when the display screen of display device 1 is irradiated with relatively bright external light.
Part (b) of the figure shows an exemplary image captured when the external light is relatively dark.
Part (c) of the figure shows an exemplary image captured when the external light is parallel light such as sunlight and is very bright. Finger pad image 101, the image of the finger pad partially in contact with the screen and receiving the reflected light of backlight 16, is darker because the portions around finger pad image 101 are exposed to external light more intense than the reflected light.
The following fully describes methods by which touch operation direction determining unit 15 determines a touch operation direction. Touch operation direction determining unit 15 determines the touch operation direction based on an image of a predetermined area from the touch location in the image captured by photosensor built-in LCD 10. As the methods by which touch operation direction determining unit 15 determines the touch operation direction, the following three methods will be described below sequentially: “determination from a direction of density gradient”, “determination from edge feature distribution”, and “determination from a fingerprint shape”.
Determination from Direction of Density Gradient
This method determines the touch operation direction by examining, in the image of the entire display screen captured by photosensor built-in LCD 10, the direction of density gradient in a grayscale image of the portion at and around the detected touch location, relative to the density at the touch location.
Part (a) of the figure shows the image of the portion at and around the touch location in the image captured by photosensor built-in LCD 10. In this image, as in the examples above, finger pad image 101 and finger shadow image 102 are captured, and the central portion of cross mark 105 represents the coordinate of the detected touch location.
Observing finger pad image 101, the density in the grayscale image changes gradually around the touch location, becoming darker in the direction toward the finger base and brighter in the direction toward the fingertip. Hence, in finger pad image 101 of this example, the touch operation direction can be found by following the direction of this density gradient.
Part (b) of the figure shows a criterion for determining, based on the above-described principle, which one of the four directions, i.e., the upward, downward, leftward, and rightward directions, the touch operation direction corresponds to: the direction in which the grayscale image becomes darker is determined as the touch operation direction.
It should be noted that only the four directions are illustrated for the touch operation direction for simplicity; the determination may also be made among a larger number of directions, including oblique directions.
It should be also noted that in the description herein, the direction in which the grayscale image is getting darker (direction toward the finger base) is determined as the touch operation direction, but the direction in which the grayscale image is getting brighter (direction toward the fingertip) may be determined as the touch operation direction.
Generally, in the vicinity of the touch location, densities differ in the grayscale image among, for example, a portion in which the finger is strongly pressed against the display screen, a portion in which the finger is lightly pressed thereagainst, a portion in which the finger is slightly spaced therefrom, and the like. Thus, display device 1 determines the touch operation direction from the direction of density gradient in the grayscale image, thereby accurately determining the touch operation direction.
Next, the following describes touch operation direction determining process P1 performed by display device 1.
In touch operation direction determining process P1, display device 1 first obtains a grayscale image having its center at the coordinate of a touch location (S1). The grayscale image thus obtained is, for example, an image of the portion within an analysis area 80 in the image captured by photosensor built-in LCD 10. Analysis area 80 is a square area with vertical and horizontal lengths L1×L1, and has its center at the coordinate of the touch location, represented by the central portion of cross mark 105.
It should be noted that display device 1 may be provided with a function of registering the size of a user's finger and may set the size of analysis area 80 based on the size thus registered. Further, the shape of analysis area 80 is not limited to the square but it may be rectangular, circular, or elliptic.
Next, in order to find the direction of density gradient, display device 1 performs a Sobel filtering process on the obtained grayscale image in the horizontal direction (x direction) and the vertical direction (y direction) (S2, S3).
Display device 1 applies the Sobel filtering to the grayscale image of the portion within analysis area 80 so as to calculate Sobel values in the horizontal direction (x direction) and the vertical direction (y direction) for each pixel constituting the grayscale image.
Then, based on the Sobel values calculated for each pixel in the horizontal and vertical directions in S2 and S3, display device 1 classifies the direction of density gradient in each pixel as one of the four directions, i.e., the upward, downward, leftward, and rightward directions (S4). Specifically, display device 1 compares the absolute values of the horizontal Sobel value Gx and the vertical Sobel value Gy for the pixel, selects the component with the larger absolute value, and uses its sign to decide between the two opposite directions.
For example, when (Gx, Gy)=(−10, +15), display device 1 determines the direction of Gy, i.e., the upward direction corresponding to the positive vertical direction, as the direction of density gradient for the pixel, since |Gy|>|Gx|. In consideration of influences of noise and the like, display device 1 does not determine a direction of density gradient for a pixel when the absolute values of Gx and Gy are each equal to or smaller than a predetermined threshold.
In this way, for each pixel, display device 1 classifies the direction of density gradient as one of the four upward, downward, leftward, and rightward directions, thus determining the direction of density gradient for each pixel in the image of the portion within analysis area 80.
Next, display device 1 finds which direction is largest in number among the determined directions of density gradient, i.e., the upward, downward, leftward, and rightward directions, determines the direction largest in number as the touch operation direction (S5), and terminates the process.
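A minimal sketch of process P1 under these steps might look as follows; the analysis window size, the noise threshold, and the sign convention mapping a positive vertical Sobel response to the upward direction are illustrative assumptions rather than values taken from the embodiment.

    import numpy as np
    from scipy import ndimage

    def determine_direction_p1(gray, touch_rc, half=24, noise_threshold=8.0):
        r, c = touch_rc
        # S1: grayscale image of the portion within analysis area 80,
        # centered on the touch location (border clipping is ignored here).
        patch = gray[r - half:r + half, c - half:c + half].astype(np.float32)
        # S2, S3: Sobel responses in the horizontal (x) and vertical (y)
        # directions for each pixel.
        gx = ndimage.sobel(patch, axis=1)
        gy = ndimage.sobel(patch, axis=0)
        # S4: classify each pixel's density-gradient direction as one of
        # the four directions; pixels whose responses both fall within
        # the noise threshold are left unclassified.
        votes = {"up": 0, "down": 0, "left": 0, "right": 0}
        for x, y in zip(gx.ravel(), gy.ravel()):
            if abs(x) <= noise_threshold and abs(y) <= noise_threshold:
                continue
            if abs(y) >= abs(x):
                # Assumed sign convention: a positive vertical response
                # is taken as the upward direction.
                votes["up" if y > 0 else "down"] += 1
            else:
                votes["right" if x > 0 else "left"] += 1
        # S5: the direction largest in number is the touch operation
        # direction.
        return max(votes, key=votes.get)

The resultant-vector variant described next would instead sum the per-pixel (Gx, Gy) responses and pick the candidate direction closest in angle to the sum.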
In the case where display device 1 determines the touch operation direction from among more than the four directions, oblique directions are also defined in accordance with the Sobel values calculated for each pixel in the horizontal and vertical directions. For example, from the results of the processes in S2 and S3, display device 1 calculates density gradient vectors 111 for the respective pixels. Then, display device 1 compares the slope angle of the resultant vector obtained by combining the respective density gradient vectors 111 with each of the upward, downward, leftward, rightward, and oblique directions, and determines the direction with the smallest difference in angle as the touch operation direction.
Alternatively, even in the case where the touch operation direction is determined from among the four directions, display device 1 may compare the slope angle of the resultant vector with each of the upward, downward, leftward, and rightward directions, and determine as the touch operation direction the direction with the smallest difference in angle.
The above-described method of determination from the direction of density gradient is intended for a case where an image containing both finger pad image 101 and finger shadow image 102 is captured. Depending on the image-capturing conditions, however, such an image is not always obtained. In particular, when no finger shadow image is formed in the image and the background of finger pad image 101 is white, the relation between the touch operation direction and the direction of density gradient differs from the one described above, and the determination may fail if the same criterion is applied.
Hence, it is desirable for display device 1 to determine the touch operation direction based on not only the image of the finger pad at and around the touch location but also images of the portions other than the finger pad.
The following describes touch operation direction determining process P1a, which takes this point into consideration.
In the case where there is a finger shadow image 102 in the image, the periphery of the touch location is darker than the remainder of the captured image. Display device 1 therefore determines whether or not a finger shadow image is formed, for example by comparing the density of the image at and around the touch location with the density of the entire captured image (S12).
When it is determined as “NO” in S12, display device 1 forwards the process to step S14, in which display device 1 determines as the touch operation direction a direction in which image density is getting higher. On the other hand, when it is determined as “YES” in S12, display device 1 determines as the touch operation direction a direction in which image density is getting lower (S13).
In this procedure, display device 1 first checks whether or not a finger shadow image is present, and can therefore determine the touch operation direction correctly regardless of the image-capturing conditions.
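The following sketch illustrates one plausible reading of process P1a. The shadow test used here (comparing the mean pixel value of the analysis window against the mean of the whole frame) and the convention that darker pixels represent higher density are assumptions made for this illustration.

    import numpy as np
    from scipy import ndimage

    def determine_direction_p1a(gray, touch_rc, half=24):
        r, c = touch_rc
        patch = gray[r - half:r + half, c - half:c + half].astype(np.float32)
        # Resultant brightness-gradient vector over the analysis window;
        # with darker pixels treated as higher density, the direction of
        # increasing density is the opposite of this vector.
        brightness_grad = np.array([ndimage.sobel(patch, axis=1).sum(),
                                    ndimage.sobel(patch, axis=0).sum()])
        # S12: a window darker than the frame overall is taken to contain
        # a finger shadow image (an assumed test, see the lead-in).
        shadow_present = patch.mean() < gray.mean()
        # S13 (shadow present): follow the direction of decreasing
        # density, i.e. increasing brightness; S14 (no shadow): follow
        # the direction of increasing density instead.
        direction = brightness_grad if shadow_present else -brightness_grad
        return direction  # (x, y) vector; quantize to a direction as needed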
Determination from Edge Feature Distribution
The following describes a method by which display device 1 determines a touch operation direction based on edge feature distribution.
In this method of determination, display device 1 extracts, from a grayscale image of the detected touch location and its periphery within the entire display screen image captured by photosensor built-in LCD 10, an edge of the portion around the fingertip, and determines the touch operation direction from the distribution of the extracted edge features.
Part (a) of the figure shows, on its left side, the image of the touch location and its periphery in the image captured by photosensor built-in LCD 10. As in the examples above, finger pad image 101 and finger shadow image 102 are captured in this image.
As shown in the figure, the density value in finger shadow image 102 is high at the very end of the fingertip, where the difference in density from the background is distinct, and decreases with distance therefrom in the direction toward the finger base. Accordingly, display device 1 obtains an edge 103 extracted from such an image, as shown on the right side of part (a): the edge surrounds the fingertip and opens in the direction toward the finger base.
Part (b) of the figure shows a criterion for determining, based on this principle, which one of the four upward, downward, leftward, and rightward directions the touch operation direction corresponds to: the direction in which the extracted edge opens is determined as the touch operation direction.
It should be noted that only the four directions are illustrated for the touch operation direction for simplicity; as with the determination from the direction of density gradient, the determination may also be made among a larger number of directions.
Herein, display device 1 is configured to determine the direction toward the finger base (edge opening direction) as the touch operation direction, but display device 1 may be configured to determine the direction toward the finger tip as the touch operation direction.
By applying the edge detection technique in this way, display device 1 can obtain different edge features around the touch location at, for example, a portion in which the finger is pressed strongly against the display screen, a portion in which the finger is pressed lightly thereagainst, a portion in which the finger is slightly spaced therefrom, and the like. Display device 1 examines distribution of the edge features thus obtained, thereby determining the touch operation direction. In this way, the touch operation direction can be determined accurately from slight differences in degree of contact of the finger or the like with the display screen.
Next, the following describes touch operation direction determining process P2 performed by display device 1.
In touch operation direction determining process P2, display device 1 first obtains a grayscale image having its center at the coordinate of a touch location (S21). The grayscale image thus obtained is, for example, an image of the portion within an analysis area in an image captured by photosensor built-in LCD 10. The analysis area is a square area having its center at the coordinate of the touch location, which corresponds to the central portion of cross mark 105.
Next, display device 1 performs a process of extracting edge features based on the grayscale image thus obtained (S22). For the extraction of the edge features, display device 1 utilizes Sobel filtering or other filtering for edge extraction. Display device 1 extracts the edge features using a predetermined threshold.
Next, display device 1 specifies a direction in which the edge features are small when viewed from the coordinates of the center (the coordinates of the touch location), i.e., the direction in which an opening is located, and determines that the specified direction in which the opening is located is the touch operation direction (S23). With this, display device 1 terminates the process.
Alternatively, in step S23, display device 1 may calculate barycentric coordinates of a plurality of pixels having edge features, and determine the touch operation direction therefrom. Specifically, display device 1 first extracts a plurality of pixels whose edge features are each equal to or larger than a predetermined threshold. Then, display device 1 weights each of the extracted pixels by the magnitude of its pixel value and calculates the barycentric coordinates. Display device 1 determines, as the touch operation direction, the direction from the barycentric coordinates toward the coordinates of the touch location along the straight line connecting the barycentric coordinates to the coordinates of the center of the analysis area (the coordinates of the touch location).
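A sketch of process P2 using this barycentric variant follows; the window size and edge threshold are assumed values.

    import numpy as np
    from scipy import ndimage

    def determine_direction_p2(gray, touch_rc, half=24, edge_threshold=40.0):
        r, c = touch_rc
        # S21: grayscale image of the analysis area centered on the
        # touch location.
        patch = gray[r - half:r + half, c - half:c + half].astype(np.float32)
        # S22: edge features via Sobel filtering, kept only where the
        # edge magnitude exceeds the threshold.
        magnitude = np.hypot(ndimage.sobel(patch, axis=1),
                             ndimage.sobel(patch, axis=0))
        rows, cols = np.nonzero(magnitude > edge_threshold)
        if rows.size == 0:
            return None  # no distinct edge features in the window
        # S23 (barycentric variant): barycenter of the edge pixels,
        # weighted by edge magnitude. The fingertip edge is the most
        # distinct, so the barycenter lies on the fingertip side, and the
        # touch operation direction runs from the barycenter toward the
        # window center (the touch location), i.e. toward the opening.
        weights = magnitude[rows, cols]
        barycenter = np.array([(cols * weights).sum(),
                               (rows * weights).sum()]) / weights.sum()
        center = np.array([half, half], dtype=np.float32)
        return center - barycenter  # (x, y) vector toward the opening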
It should be noted that the determination of the touch operation direction using the edge feature is not limited to the above-described example and any method can be employed as long as it allows for determination of the touch operation direction from the edge features.
Determination from Fingerprint
The following describes a method by which display device 1 determines a touch operation direction based on a fingerprint. When a finger pad touches the display screen, the ridge pattern of the fingerprint is captured in the finger pad image, and the orientation of this pattern reflects the orientation of the finger.
Specifically, first, display device 1 calculates Sobel values for each pixel in a finger pad image by means of Sobel filtering or the like, as with the method of determination from the direction of density gradient. Next, display device 1 finds a direction for each pixel based on the calculated Sobel values, again as with that method. From the distribution of the respective directions for the pixels, display device 1 specifies the fingertip portion and the finger's joint portion.
A fingerprint is not always captured in the same orientation; display device 1 therefore determines the touch operation direction in accordance with the orientation of the fingerprint pattern thus specified, taking the direction from the fingertip portion toward the joint portion as the touch operation direction.
In this way, display device 1 reads the pattern of the fingerprint from the grayscale image obtained upon the touch operation with the finger, and determines the touch operation direction from the pattern. Display device 1 can thus determine the touch operation direction accurately.
The method of determining a touch operation direction based on a fingerprint is not limited to the one described above. For example, an operator's fingerprint may be registered in advance in display device 1, and display device 1 may compare a captured finger pad image with the registered fingerprint by means of pattern matching to determine the touch operation direction.
The following describes an example in which the above-described methods of determination are applied to control of display device 1. Here, control of a hockey game is illustrated.
In the hockey game, two operators A and B face each other across the display screen of display device 1.
Display device 1 displays respective images of goal 30a of team A and goal 30b of team B at the opposite ends of the display screen. Between the goals, display device 1 displays respective images of five pentagon-shaped pieces 20a of team A, five octagon-shaped pieces 20b of team B, and a ball 30, which changes its direction of movement when hit by pieces 20a, 20b. Display device 1 displays uniform numbers 1-5 on pieces 20a and on pieces 20b, respectively. In the figure, fingers 100a represent fingers of operator A and fingers 100b represent fingers of operator B. In the description below, a piece having a uniform number n (n is a natural number of not less than 1 and not more than 5) is referred to as "piece n".
Operators A, B operate pieces 20a, 20b of their teams with fingers 100a, 100b in order to push ball 30 into the opponent team's goal. When one team pushes the ball into its opponent team's goal, display device 1 adds a score for that team. The control of the game based on the operations of operators A, B is performed by, for example, control unit 19 of display device 1.
For operations of the pieces, display device 1 assigns operation rights: each operator is permitted to operate only the pieces of his or her own team.
The control above is implemented by display device 1 identifying whether or not an operator has an operation right, based on the touch operation direction (right or left). The following describes this control.
First, display device 1 determines whether or not touch location detecting unit 14 has detected a touch operation on a piece (S31). If no touch operation on a piece is detected, display device 1 ends the process.
When touch operations on pieces are detected, display device 1 specifies, based on the touch locations detected by touch location detecting unit 14 and the current locations of the pieces, the pieces on which the touch operations have been performed, and determines their identities (IDs) (S32).
Thereafter, for the specified pieces, display device 1 respectively specifies the touch operation directions determined by touch operation direction determining unit 15 (S33). Then, display device 1 verifies the touch operation directions specified for the pieces against the touch operation directions retained in association with the pieces, respectively (S34). Display device 1 determines whether or not each of the touch operation directions matches its retained counterpart for each of the pieces (S35).
When the touch operation direction specified for a piece matches the touch operation direction retained in association with the piece, display device 1 accepts the operation of the piece by the operator. Meanwhile, for example, if operator A attempts to operate piece 1 of team B, the specified information does not match the retained operation right data, and display device 1 therefore does not accept the operation.
As such, display device 1 specifies a piece for which a touch operation has been detected, based on a touch location detected by touch location detecting unit 14 and the current locations of pieces 20a, 20b. Display device 1 determines the operator based on the direction determined by touch operation direction determining unit 15. In this way, display device 1 can perform control to permit only an operator with an operation right to operate the corresponding piece.
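The checks of S31 through S35 can be sketched as follows. The operation-right table contents, the team-to-direction assignment, the piece identifiers, and the hit radius are all hypothetical values chosen for this illustration.

    import math

    # Hypothetical operation-right table: each piece is operable only
    # from the touch operation direction retained for its team.
    OPERATION_RIGHTS = {("A", n): "right" for n in range(1, 6)}
    OPERATION_RIGHTS.update({("B", n): "left" for n in range(1, 6)})

    def handle_piece_touch(touch_rc, touch_direction, piece_locations, radius=20.0):
        # S31/S32: specify the touched piece from the touch location and
        # the current piece locations (piece_locations maps hypothetical
        # IDs such as ("A", 1) to screen coordinates).
        touched = None
        for piece_id, (pr, pc) in piece_locations.items():
            if math.hypot(touch_rc[0] - pr, touch_rc[1] - pc) <= radius:
                touched = piece_id
                break
        if touched is None:
            return None  # S31: no touch operation on a piece
        # S33-S35: verify the determined touch operation direction
        # against the direction retained for the piece; only a matching
        # operator is permitted to operate it.
        if OPERATION_RIGHTS.get(touched) == touch_direction:
            return touched  # operation accepted
        return None  # operation refused: the operator lacks the right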
As such, display device 1 specifies the operator, thereby clearly specifying who has attempted to operate a piece when there are a plurality of operators.
When a plurality of operators operate a plurality of operation targets on one screen, display device 1 permits each operator to operate only the operation targets that match the operation right of that operator, thus preventing erroneous operations. Further, display device 1 can impose restrictions in a game or the like so that operators can operate only the pieces of their own teams.
Display device 1 according to the embodiment illustrated above analyzes an input image of a finger, a pen, or the like, allowing for detection of a touch location and determination of a touch operation direction. Hence, control unit 19 can determine in what direction the operator who performed the operation is located, based on the signal indicating the touch location, the signal indicating the touch operation direction, and the information stored in memory unit 12. Thus, control unit 19 can perform information processing in accordance with the operation right provided to the operator.
Display device 1 according to the present embodiment analyzes an image obtained by the photosensors provided behind the display screen to determine what direction a person having operated an operation target on the display screen is in, relative to the screen. Thus, display device 1 with such a small scale configuration is capable of accurately determining an operation direction.
Hence, even when there are obstacles in front of the display screen of display device 1, no obstacles are captured in the input image used in determining the operation direction, so the determination is not hindered. Further, display device 1 does not need to be provided with an apparatus for transmitting a signal outside the display device through a human body, unlike the system described above in BACKGROUND ART. As such, a touch operation direction can be determined, and subsequent image processing performed, by display device 1 alone with such a small-scale, simplified configuration.
Exemplified in the above-described embodiment is a case where a touch location is detected by analyzing an image obtained by photosensor built-in LCD 10. However, the configuration for detecting a touch location is not limited to this and other embodiments can be employed.
For example, a display device 11 may include a capacitive type touch panel 17 provided on a photosensor built-in LCD 18. In display device 11, the touch panel detects the touch location, while the touch operation direction is determined by analyzing the image obtained by photosensor built-in LCD 18, as in display device 1.
For the touch panel, various types of touch panels can be used as long as they can detect a touch location, such as resistive-film type, infrared type, electromagnetic induction type, and ultrasonic type touch panels.
However, in the case where the touch operation direction is determined by analyzing the image obtained by the photosensor built-in LCD, it is preferable to use the photosensor built-in LCD, rather than the capacitive type touch panel or the like, for the detection of the touch location as well. This is because components of a touch panel such as a resistive film, and the step of providing such a film on the display screen, can be omitted, thus simplifying the configuration of the display device. With the configuration thus simplified, cost can advantageously be reduced.
In the above-described embodiment, the photosensor built-in LCD mainly responsive to the frequency of light in the visible light region is exemplified and illustrated as an image capturing device (image input device) for capturing an image, but the configuration of the image capturing device is not limited to this. Various other configurations can be utilized.
For example, each of display devices 1, 11 may include photosensors mainly responsive to infrared frequencies rather than to the frequencies of the visible light region. In each of display devices 1, 11, infrared rays emitted from behind the display screen are reflected by a finger, a pen, or the like, and the reflected infrared rays are received by the photosensors. Each of display devices 1, 11 converts the received infrared rays into an image. The use of photosensors mainly responsive to infrared rays rather than to visible light allows each of display devices 1, 11 to obtain an image of the reflection of a finger pad and the shadow of a finger without influences of external light such as room light. An image obtained by the infrared-responsive photosensors utilizes the reflection from the finger, and is therefore basically the same as one obtained by photosensors responsive to the frequencies of the visible light region.
As such, in each of display devices 1, 11, the photosensors obtain the image from the light emitted from the backlight and reflected by an operation input object such as a finger, whereby the degree of contact of the operation input object such as a finger or a touch pen with the display screen can be fully ascertained.
Alternatively, each of display devices 1, 11 may include temperature sensors instead of the photosensors so as to convert into an input image a temperature change taking place when a finger or a pen is placed thereon. The image provided by the temperature sensors is, for example, an image in which a location touched by a finger appears at a temperature different from that of its surroundings.
In each of display devices 1, 11, when the image is obtained by the temperature sensors, the degree of contact of the operation input object such as a finger or a touch pen with the display screen can likewise be fully ascertained. Further, unlike the photosensors, the temperature sensors are less likely to be affected by external light such as room light or sunlight. Hence, each of display devices 1, 11 can detect only the temperature change caused by the operation input object.
Hence, each of display devices 1, 11 can detect a touch location and determine a touch direction based on an image provided by the infrared photosensors or an image provided by the temperature sensors, using the same algorithm as that described above for detecting a touch location and determining a touch direction from an image provided by the photosensors mainly responsive to the visible light region.
It should be noted that any type of image capturing devices can be used as the image capturing device as long as they are sensors or cameras that can capture an image of a finger or a pen placed thereon.
The first embodiment exemplifies display device 1 applied to a gaming machine. The display device can also be applied to a car navigation system by configuring the display device to include a photosensor built-in dual-view LCD instead of photosensor built-in LCD 10.
Display device 2 includes photosensor built-in dual-view LCD 20 (hereinafter, simply referred to as “LCD 20”) and touch location detecting unit 14 described above. Display device 2 further includes a control unit 21 for identifying whether an operator is on the left or right side based on a signal received from touch operation direction determining unit 15 and controlling LCD 20. Display device 2 further includes a memory unit 22 for storing therein various information concerned with the control. Display device 2 further includes output devices 26-28 (a television receiver 26, a navigation device 27, and a DVD player 28). Display device 2 further includes an image selecting unit 25, which receives data from each of output devices 26-28, classifies the data into images for the left side and images for the right side, and outputs them selectively. Display device 2 further includes a left side display control unit 23 and a right side display control unit 24 for respectively controlling images to be displayed on the left side and right side of the LCD screen of LCD 20, based on the image data sent from image selecting unit 25.
In dual-view presentation, LCD 20 presents different screens to the driver's seat side and the passenger's seat side: for example, a navigation screen including a scroll button 301 to the driver's seat side and a television screen including a channel button 401 to the passenger's seat side. Buttons 301 and 401 may be displayed at locations superimposed on each other on the shared display screen. When the operator on the driver's seat side operates scroll button 301 and the operator on the passenger's seat side operates channel button 401 simultaneously, display device 2 must determine from which side each touch operation has been performed.
As with each of display devices 1, 11 described as the first embodiment, display device 2 according to the second embodiment includes touch operation direction determining unit 15 in addition to touch location detecting unit 14. Accordingly, display device 2 can detect a touch operation onto button 301 or 401, and specify whether the touch operation is performed from the passenger's seat side or the driver's seat side.
As a result, for example, even when scroll button 301 and channel button 401 are displayed at superimposed locations, display device 2 can execute the scroll operation if the touch operation is performed from the driver's seat side and the channel operation if it is performed from the passenger's seat side.
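One way to realize this behavior is to keep a separate button table for each viewing side and resolve a touch against the table of the side the touch came from; the button names, screen regions, and side labels below are hypothetical.

    # Hypothetical button tables, one per viewing side: each maps a
    # button name to the screen region it occupies on that side's image.
    DRIVER_BUTTONS = {"scroll_301": ((100, 200), (140, 260))}      # navigation
    PASSENGER_BUTTONS = {"channel_401": ((100, 200), (140, 260))}  # television

    def region_contains(region, rc):
        (r0, c0), (r1, c1) = region
        return r0 <= rc[0] <= r1 and c0 <= rc[1] <= c1

    def dispatch_touch(touch_rc, side):
        # The same screen location resolves to different buttons
        # depending on the seat side the touch operation came from, so
        # superimposed buttons 301 and 401 do not conflict.
        table = DRIVER_BUTTONS if side == "driver" else PASSENGER_BUTTONS
        for name, region in table.items():
            if region_contains(region, touch_rc):
                return name
        return None  # the touch hit no button on that side's screen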
In the description herein, the buttons for operating the television and the buttons for navigation in the car navigation system employed in a vehicle are exemplified, but the car navigation system and associated sources are mere examples. The display device is also applicable to a system in which a source whose operation is assigned to one side (for example, driver's seat side) relative to the center of the dual-view display screen is different from a source whose operation is assigned to the other side (for example, passenger's seat side).
Further, any of the methods of “determination from a direction of density gradient”, “determination from edge feature distribution”, and “determination from a fingerprint shape”, each of which is described in the first embodiment, may be employed as a method of determination by touch operation direction determining unit 15 in the second embodiment.
Alternatively, as the method of determination by touch operation direction determining unit 15, a method of “determination from an ellipse longitudinal direction” can be employed.
For example, when an operator touches the display screen with a finger, the portion of the finger pad in contact with the screen has an elliptic shape, and the inclination of the ellipse differs depending on whether the operator touches from the left side or the right side of the screen.
Hence, when touch operations are detected and finger pad images 101L, 101R each having an elliptic shape are captured, display device 2 determines, from the direction of inclination of the longitudinal axis 52 of each ellipse, whether the touch operation has been performed from the left side or the right side.
Since the shape of the finger pad of a finger is elliptic as such upon touching the screen, display device 2 determines the touch operation direction based on the direction of the longitudinal axis of this ellipse, thereby achieving accurate determination of the touch operation direction.
In the description herein, display device 2 determines the touch operation direction based on the inclination of longitudinal axis 52 of the ellipse, but the determination of the touch operation direction is not limited to the determination from longitudinal axis 52 of the ellipse. Display device 2 may determine the touch operation direction based on the shorter axis of the ellipse.
The longitudinal axis of the ellipse can be found by calculation, for example, from the distribution of the pixels constituting the captured finger pad image.
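One standard way to carry out such a calculation is via the second-order central moments of the finger pad region, sketched below. The moment method is a common image-processing technique assumed for this illustration, not necessarily the calculation of the embodiment, and the mapping of the angle's sign to a seat side is likewise an assumption.

    import numpy as np

    def ellipse_axis_angle(pad_mask):
        # pad_mask: boolean array marking the pixels of the elliptic
        # finger pad image.
        rows, cols = np.nonzero(pad_mask)
        x = cols - cols.mean()
        y = rows - rows.mean()
        # Second-order central moments of the region.
        mu20 = (x * x).mean()
        mu02 = (y * y).mean()
        mu11 = (x * y).mean()
        # Orientation of the major (longitudinal) axis of the equivalent
        # ellipse, measured from the horizontal image axis; the sign
        # depends on the image coordinate convention.
        return 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)

    def operator_side(theta):
        # A positive inclination is mapped to one seat side and a
        # negative one to the other; which sign corresponds to the
        # driver's seat is an assumption of this sketch.
        return "driver" if theta > 0 else "passenger"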
The following describes how display device 2 operates based on the determination principle described above. Display device 2 first obtains the finger pad image at a detected touch location and calculates the direction of inclination of its longitudinal axis.
Next, display device 2 determines whether or not the direction of inclination of the longitudinal axis is a positive direction (an inclination of angle θ toward one predetermined side). When the inclination is positive, display device 2 determines that the touch operation has been performed from one seat side; otherwise, it determines that the touch operation has been performed from the other seat side.
Display device 2 according to the second embodiment eliminates the need of providing a camera outside the display device or installing a signal transmitting device in a seat or a steering wheel in order to specify an operator. Thus, whether an operator operates from the left side or the right side can be determined by display device 2 alone. Further, the image is captured from inside the display screen, which precludes obstacles from interfering with the determination. Accordingly, even when there are obstacles in front of the display screen, for example, even when the hands of the person on the passenger's seat and the person on the driver's seat cross each other, display device 2 can accurately determine from which side a touch operation has been performed.
It should be noted that, in order to distinguish which of the person on the passenger's seat and the person on the driver's seat has pressed a button, each button for the passenger's seat side and each button for the driver's seat side could instead be arranged at different locations so as not to be superimposed on each other. This, however, disadvantageously limits the area available for displaying the buttons. Display device 2 according to the second embodiment is beneficial also in this respect.
The display device according to each of the embodiments described above eliminates the need of installing a camera, a signal transmitting device, and the like outside the display device. Hence, the display device can accurately determine in what direction it is touched under any circumstances, for example, even when there are obstacles in front of the display screen, and whether the display device is placed upright or laid flat. Thus, the display device can accurately perform subsequent information processing using the determined information. Further, with such a simple configuration constituted only by the display device, the touch direction can be accurately determined, so the cost is as small as that of a conventional display device having a touch panel. Accordingly, the display device according to each of the embodiments can be used in various applications, including mobile applications in which a user carries the display device.
The following describes variations and features of each of the embodiments described above.
(1) The display device may analyze an image based on a touch location detected by touch location detecting unit 14, so as to determine a touch operation direction.
(2) In each of the first and second embodiments, the display device analyzes an image of a predetermined area in a captured image to determine a touch operation direction. In this way, the display device processes only the image of the touch location and its periphery, thereby achieving simplified and fast processing. Further, the display device does not analyze unnecessary image areas, which improves the accuracy of determining the touch operation direction. The simplified processing also allows for a reduced number of gates in fabricating a circuit of the display device, which reduces its manufacturing cost. However, the display device may instead analyze the entire captured image.
(3) The methods of determination by touch operation direction determining unit 15 are not limited to those exemplified in the above-described embodiments, and any method of determination may be employed as long as the method allows for determination of a touch operation direction from an image.
(4) Illustrated in each of the first and second embodiments is a case where a touch operation is performed with a finger. However, even when the touch operation is performed with a member such as a touch pen, each of the first and second embodiments is applicable, because an image of a shadow having its center at the coordinates at which the touch operation has been detected is captured.
(5) Display device 2 according to the second embodiment may not include television receiver 26, navigation device 27, and/or DVD player 28.
(6) The method of determination from an ellipse longitudinal direction as illustrated in the second embodiment may be employed in each of display devices 1, 11 of the first embodiment.
(7) Any of the methods of "determination from a direction of density gradient", "determination from edge feature distribution", "determination from a fingerprint shape", and "determination from an ellipse longitudinal direction" may be employed in display devices 1, 2, 11. Further, each of display devices 1, 2, 11 may be provided with a plurality of touch operation direction determining units 15 so as to selectively perform determination processing based on any of the plurality of methods of determination. Each of display devices 1, 2, 11 may further include a selecting operation unit for selecting any of the plurality of touch operation direction determining units 15 in response to an operator's operation. Alternatively, each of display devices 1, 2, 11 may operate the plurality of touch operation direction determining units 15 and determine a touch operation direction based on the plurality of determination results.
Although the embodiments of the present invention have been described, it should be considered that the embodiments disclosed herein are illustrative and non-restrictive in any respect. The scope of the present invention is defined by the scope of claims, and is intended to include any modifications within the scope and meaning equivalent to the terms of the claims.
Priority: Japanese Patent Application No. 2008-124654, filed in Japan in May 2008.
International filing: PCT/JP2009/054269, filed Mar. 6, 2009 (WO); 371(c) date: Oct. 4, 2010.