The present invention relates to a display device, and in particular, relates to a technology that corrects an input location by a touch pen in a display device equipped with a touch panel.
Mobile phones, tablet devices, and the like that include a touch panel display have recently become popular. There may be instances in a display device equipped with a touch panel in which, depending on the usage environment, there is a decrease in the accuracy in detecting an input location, and it may be necessary to perform calibration in order to correct the input location. As disclosed in Japanese Patent Application Laid Open Publication No. 2011-164742, for example, there is a method for performing calibration in which the display device displays an adjustment screen showing a cross mark or the like in order to adjust coordinates on the touch panel, and then adjusts the coordinates on the touch panel in accordance with the results of a contact operation for the cross mark by a user.
Japanese Patent Laid Open Publication No. 2003-271314 discloses a technology that corrects an input location on a touch panel in accordance with dominant hand information. In this publication, the display device takes into account that a position differing from a target position on the display panel may be input on the touch panel as a result of factors such as the roundness of the touch pen, the respective thicknesses of the touch panel and the display panel, and the like. The device then displays a plurality of marks in prescribed locations and corrects the input location. When a touch pen or the like is used to perform a contact operation on the marks, the contact operation becomes more difficult as the location approaches the edges of the display area near the dominant hand. Thus, in accordance with dominant hand information input by the user, the plurality of marks are displayed at a higher density toward the edges of the display area near the dominant hand. By increasing the display density of the marks at the edges of the display area near the dominant hand, the amount of correction near those edges increases, and the input location is corrected more accurately.
As disclosed in Japanese Patent Laid Open Publication No. 2011-164742 and Japanese Patent Laid Open Publication No. 2003-271314, when symbols such as a cross mark are displayed in order to adjust a coordinate range on the touch panel, there are cases in which the input operation does not accurately trace what is displayed. In such cases, it is not possible to appropriately adjust the coordinate range on the touch panel. One reason is parallax occurring in the touch panel display as a result of the distance between the display panel and the touch panel, and the like, which produces a difference between an input target location and the input location input on the touch panel. Another reason is that, since the direction of the line of sight toward the touch panel varies according to which hand is gripping the touch pen, shifts in input location due to parallax also vary according to which hand is gripping the touch pen. Thus, when the input location is corrected by a fixed amount regardless of which hand is gripping the touch pen, it is not possible to correct the input location to the location that the user desires.
The present invention provides a technology that makes it possible to correct shifts in input location due to parallax based on which hand is gripping the touch pen.
A display device according to a first aspect of the present invention includes: a display unit having a rectangular display area; a touch panel; a display control unit that displays a character image in a determination area located in a corner of the display area such that the character image contacts two sides of the determination area that border an exterior of the display area; a determination unit that determines whether a hand gripping a touch pen is a left hand or a right hand in accordance with input locations of the touch pen on the character image and a display location of the character image; a detection unit that detects a shift amount with respect to the respective input locations in accordance with the input locations of the touch pen on the character image and the display location of the character image; and a correction unit that corrects an input location of the touch pen in the display area in accordance with a determination result from the determination unit and a detection result from the detection unit.
A second aspect of the present invention is configured such that, in the first aspect of the present invention: a plurality of the determination areas are respectively located in two opposing corners of the display area, the character image is displayed in the plurality of the determination areas, the display device further includes a setting unit that sets a coordinate range of the touch panel that corresponds to the display area in accordance with the input locations of the touch pen on the character images displayed in the respective determination areas, and the correction unit further corrects the input location of the touch pen in accordance with the coordinate range set by the setting unit.
A third aspect of the present invention is configured such that, in either the first or second aspects of the present invention: an upright direction of a character shown in the character image is substantially parallel to an extension direction of one of the two sides of the determination area, the character image has a line segment that is substantially parallel to the one side, and the detection unit detects the shift amount in the input locations in accordance with input locations of the touch pen on the line segment and a display location of the line segment.
A fourth aspect of the present invention is configured such that, in any one of the first to third aspects of the present invention: the character image includes a curved line that curves toward one of the two sides of the determination area, the one side being substantially parallel to an upright direction of a character shown in the character image, the display control unit displays the character image such that a section of the curved line contacts the one side of the determination area, and the determination unit determines whether the hand gripping the touch pen is the left hand or the right hand by determining whether or not a trajectory of input locations of the touch pen on the section of the curved line is located within the determination area.
A fifth aspect of the present invention is configured such that, in any one of the first to third aspects of the present invention: the character image includes a curved line that curves toward the two respective sides of the determination area, and the display control unit displays the character image such that portions of the curved line contact the two sides of the determination area.
A sixth aspect of the present invention is configured such that, in any one of the first to fifth aspects of the present invention, if an input operation other than by the touch pen is performed on the touch panel and if the input operation is an input operation by a hand, the determination unit further determines which hand is gripping the touch pen in accordance with a positional relationship between an input location of the input operation by the hand and the input locations of the touch pen.
A seventh aspect of the present invention is configured such that, in any one of the first to fifth aspects of the present invention, if an input operation other than by the touch pen is performed on the touch panel near at least two opposing sides of four sides forming the display area, the determination unit further determines which hand is gripping the touch pen in accordance with a contact area of the input operation.
An eighth aspect of the present invention is configured such that, in any one of the first to seventh aspects of the present invention: the display control unit sets a lock function when the display device is turned ON and/or when a non-input state in which no input operation is performed on the touch panel lasts for a prescribed period of time, the lock function not allowing any input operation to be accepted until a character sequence that matches a character sequence of a prescribed password is input, the display control unit deactivating the lock function when a character sequence that matches the prescribed password is input, the character image is a portion of the character sequence of the prescribed password, and when an input operation is performed by the touch pen after the lock function has been deactivated, the correction unit corrects an input location of the input operation.
A ninth aspect of the present invention is configured such that, in any one of the first to eighth aspects of the present invention, the display control unit further displays an instruction image for directing operation of an application installed on the display device in a location in the display area based on the determination result from the determination unit.
According to the configurations of the present invention, it is possible to appropriately adjust a coordinate range on the touch panel and correct an input location based on which hand is gripping the touch pen.
A display device according to one embodiment of the present invention includes: a display unit that has a rectangular display area; a touch panel; a display control unit that displays a character image in a determination area located in a corner of the display area such that the character image contacts two sides of the determination area that border the exterior of the display area; a determination unit that determines which hand is gripping a touch pen in accordance with an input location of the character image by the touch pen and a display location of the character image; a detection unit that detects an amount of shift in the input location in accordance with the input location of the character image by the touch pen and the display location of the character image; and a correction unit that corrects the input location by the touch pen in the display area in accordance with the determination results of the determination unit and the detection results of the detection unit (Configuration 1).
According to Configuration 1, the display control unit displays the character image in the determination area located in the corner of the rectangular display area such that the character image contacts two sides of the determination area that border the exterior of the display area. The determination unit determines which hand is gripping the touch pen in accordance with the input location of the character image by the touch pen and the display location of the character image. The detection unit detects the amount of shift in the input location. When the touch pen performs an input operation, the correction unit corrects the input location in accordance with the determination results of the hand gripping the touch pen and the amount of shift in the input location. The character image is displayed in the determination area so as to contact two sides of the determination area that border the exterior of the display area. Compared to a symbol such as a cross mark, it is easier to input a character image exactly as displayed. Thus, when the character image is appropriately input, it is possible to correct the input location to the correct location in accordance with the detection results of the amount of shift in the input location and the determination results regarding the hand gripping the touch pen that are based on the input location of the character image. As a result, compared to cases in which the input location is corrected by a fixed amount regardless of which hand is gripping the touch pen, it is possible to improve the accuracy in correcting the input location.
Configuration 2 may be configured such that in Configuration 1: the character image is displayed in respective determination areas located in two opposing corners of the display area; the display device further includes a setting unit that sets a coordinate range on the touch panel that corresponds to the display area in accordance with the input location by the touch pen of the character image displayed in the determination areas; and the correction unit further corrects the input location by the touch pen in accordance with the coordinate range set by the setting unit.
According to Configuration 2, the character image is displayed in the respective determination areas located in two opposing corners of the display area. The setting unit sets the coordinate range on the touch panel in accordance with the input location by the touch pen of the character image displayed in the respective determination areas. The correction unit corrects the input location by the touch pen in accordance with the set coordinate range of the touch panel. When the character images displayed in the respective determination areas are appropriately traced, it is possible to identify opposite corner locations of the display area from the input locations, appropriately set a coordinate range of the touch panel that corresponds to the display area, and improve accuracy in correcting the input location.
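The coordinate-range adjustment described above amounts to a linear rescaling from the calibrated corner coordinates of the sensing area to the fixed corner coordinates of the display area. The following sketch is purely illustrative (the function name and coordinate conventions are assumptions for explanation, not part of the claimed configuration):

```python
def map_touch_to_display(tx, ty, touch_range, display_range):
    """Linearly map a raw touch coordinate into display coordinates.

    touch_range / display_range are ((x_min, y_min), (x_max, y_max))
    pairs: the calibrated opposite-corner coordinates of the sensing
    area and the fixed corner coordinates of the display area.
    """
    (tx0, ty0), (tx1, ty1) = touch_range
    (dx0, dy0), (dx1, dy1) = display_range
    dx = dx0 + (tx - tx0) * (dx1 - dx0) / (tx1 - tx0)
    dy = dy0 + (ty - ty0) * (dy1 - dy0) / (ty1 - ty0)
    return dx, dy

# A touch at the calibrated top-left corner maps to the display origin.
print(map_touch_to_display(5, 8, ((5, 8), (1005, 608)), ((0, 0), (1000, 600))))
# prints (0.0, 0.0)
```

Once the two opposite-corner reference coordinates have been set from the traced character images, every subsequent pen input can be rescaled this way before the parallax correction is applied.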
Configuration 3 may be configured such that, in Configuration 1 or Configuration 2: an upright direction of the character shown in the character image is substantially parallel to an extension direction of one of the two sides of the determination area; the character image includes a line segment that is substantially parallel to the one side; and the detection unit detects shift in the input location in accordance with the input location of the line segment and the display location of the line segment.
According to Configuration 3, the upright direction of the character in the character image is substantially parallel to an extension direction of one of the two sides of the determination area that border the exterior of the display area. The character image includes a line segment that is substantially parallel to the above-mentioned side. In other words, the character image includes a line segment that is substantially parallel to the upright direction of the character. Normally, a user performs input after fixing the display device such that the characters are displayed upright in front of the user. When a line segment that is substantially parallel to the upright direction of the character is appropriately traced, it is possible to detect the amount of shift in a direction orthogonal to the line segment, or in other words, in the left/right direction of the user, and it is possible to correct the shift in the input location that results from the parallax experienced by the user.
Configuration 4 of the present invention may be configured such that, in any one of Configurations 1 to 3: the character image includes a curved line that curves toward one of the two sides of the determination area, the side of the determination area being substantially parallel to the upright direction of the character displayed in the character image; the display control unit displays the character image such that a section of the curved line contacts the one side of the determination area; and the determination unit determines which hand is gripping the touch pen by determining whether or not the trajectory of the input locations of the section of the curved line by the touch pen is located within the determination area.
According to Configuration 4, the character image includes a curved line that curves toward one of the two sides of the determination area, with the one side of the determination area being substantially parallel to the upright direction of the character in the character image. A portion of the curved line contacts the above-mentioned one side of the determination area. The device determines which hand is gripping the touch pen by determining whether or not the trajectory of the input locations of the section of the curved line input by the touch pen is located within the determination area. Compared to a straight line, it is easier to appropriately input a curved line without stopping until the line reaches the border with the exterior of the display area. Since the input location shifts to the left or right due to parallax based on which hand is holding the pen, it is possible to easily determine which direction the input location has shifted toward according to whether or not the trajectory of the input locations is within the determination area. As a result, it is possible to easily determine which hand is gripping the touch pen.
Configuration 5 may be configured such that, in any one of Configurations 1 to 3: the character image includes a curved line that curves toward both respective sides of the determination area; and the display control unit displays the character image such that a portion of the curved line contacts the two sides of the determination area.
According to Configuration 5, the character image includes a curved line that curves toward both respective sides of the determination area that border the exterior of the display area, and is displayed such that a portion of the curved line touches the respective sides. Compared to a straight line, it is easier to appropriately input a curved line without stopping until the line reaches a border with the exterior of the display area. When the character image is appropriately traced, it is possible to identify, from the input locations, locations on the touch panel that border the exterior of the display area, and to more appropriately adjust the coordinate range on the touch panel.
Configuration 6 may be configured such that, in any one of Configurations 1 to 5, when an input operation is performed on the touch panel by an object that is not the touch pen, the determination unit further determines, when the input operation is an input operation by a hand, which hand is gripping the touch pen in accordance with the positional relationship of the input location of the input operation by the hand and the input location by the touch pen.
According to Configuration 6, the device determines which hand is gripping the touch pen in accordance with the positional relationship of the input location by the hand and the input location by the touch pen. When an input operation is performed by the touch pen, there may be cases in which the input operation is performed in a state in which the hand gripping the touch pen is being supported by the touch panel. There is a fixed positional relationship between the location at which the hand gripping the touch pen contacts the touch panel and the input location by the touch pen, or in other words, the location of the tip of the touch pen. Thus, it is possible to more reliably determine which hand is gripping the touch pen as a result of the positional relationship between the input location where the hand is contacting the touch panel and the input location by the touch pen.
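The positional-relationship determination in Configuration 6 can be sketched as a simple comparison of where the supporting hand rests relative to the pen tip. The rule below is a deliberately simplified assumption for illustration (a right-handed grip typically places the resting hand to the right of the pen tip), and the function name is hypothetical:

```python
def hand_from_palm_position(pen_xy, palm_xy):
    """Infer the gripping hand from where the supporting hand rests
    relative to the pen tip. Assumption for illustration: a right-handed
    grip places the resting hand to the right of the pen tip, and a
    left-handed grip places it to the left.
    """
    pen_x, _pen_y = pen_xy
    palm_x, _palm_y = palm_xy
    return "right" if palm_x > pen_x else "left"
```

A production implementation would also check that the non-pen contact is large enough to be a resting hand rather than a stray fingertip, and would accumulate the decision over several samples.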
Configuration 7 may be configured such that, in any one of Configurations 1 to 5, when an input operation is performed on the touch panel near at least two opposing sides among four sides forming the display area by an object that is not the touch pen, the determination unit further determines which hand is gripping the touch pen in accordance with a contact area of the input operation.
According to Configuration 7, when an input operation is performed by an object that is not the touch pen near two opposing sides that are part of the display area, the device further determines which hand is gripping the touch pen in accordance with a contact area on the touch panel by the input operation and the input location by the touch pen. When the user holds the display device in the hand opposite from the hand holding the touch pen and performs an input operation on the touch panel via the touch pen, the fingers of the hand holding the display device may contact the touch panel at a location near the two opposing sides of the display area. In such a case, the area of the fingers contacting the touch panel near the two sides will vary according to which hand is gripping the touch pen. Thus, by determining which hand is gripping the touch pen via the contact area of the input operation that is near the opposing two sides of the display area and that is not performed by the touch pen, it is possible to increase determination accuracy compared to Configuration 1.
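The contact-area comparison in Configuration 7 can be sketched as follows. Note that the source does not specify which edge exhibits the larger area for which grip, so the decision rule below (larger contact area near the left edge implies the device is held in the left hand, hence the pen in the right) is an explicit assumption for illustration only:

```python
def hand_from_grip_areas(left_edge_area, right_edge_area):
    """Infer the pen hand from finger contact areas detected near the
    left and right edges of the display area. Assumption: the device is
    gripped by the hand opposite the pen hand, so a larger contact area
    near the left edge suggests a left-hand grip on the device and
    therefore a right-handed pen grip, and vice versa.
    """
    return "right" if left_edge_area > right_edge_area else "left"
```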
Configuration 8 may be configured such that, in any one of Configurations 1 to 7: the display control unit sets a lock function when the device is turned ON and/or when a non-input state, in which no input operation is performed on the touch panel, continues so as to exceed a fixed period of time, the lock function making it so that no input operation other than a character sequence that matches a character sequence of a prescribed password is accepted until the character sequence is input; the display control unit deactivates the lock function when a character sequence that matches the above-mentioned character sequence is input; the character image is a portion of the character sequence of the prescribed password; and when an input operation is performed by the touch pen after the lock function has been deactivated, the correction unit corrects the input location of the input operation.
According to Configuration 8, the device sets a lock function, displaying a character image that is a portion of the character sequence of the prescribed password, when the device is turned ON and/or when a non-input state on the touch panel continues so as to exceed a fixed period of time. When the prescribed password is input via an input operation, the lock function is deactivated. After the lock function is deactivated, the input location of an input operation is corrected if the input operation was performed by the touch pen. Thus, it is not necessary to separately provide a screen for correcting the input location and a screen for setting the lock function, and the user may perform input operations on just one screen.
Configuration 9 may be configured such that, in any one of Configurations 1 to 8, the display control unit further displays an instruction image for directing the operation of an application installed on the device in a location in the display area based on the determination results of the determination unit.
According to Configuration 9, an instruction image for directing the operation of an application installed on the display device is displayed in a location of the display area based on the determination results regarding the hand gripping the touch pen; thus, it is possible to improve the operability of the application.
An embodiment of the present invention will be described in detail below with reference to the drawings. Portions in the drawings that are the same or similar are assigned the same reference characters and descriptions thereof will not be repeated.
(Configuration)
The touch panel 10 is a capacitive type touch panel, for example. The touch panel 10 includes a group of drive electrodes (not shown) and a group of sense electrodes (not shown) arranged in a matrix, and has a sensing area formed by the group of drive electrodes and the group of sense electrodes. The touch panel 10 is provided upon the display panel 20 such that a display area 20A (see
The touch panel control unit 11 outputs sequential scan signals to the drive electrodes of the touch panel 10, and detects contact on the touch panel 10 when the signal value output from the sense electrodes meets or exceeds a threshold value. When the touch panel control unit 11 detects contact on the touch panel 10, the touch panel control unit 11 detects whether or not the contact was made by a touch pen 12 in accordance with the signal value from the sense electrodes. The operation of determining whether or not the contact was made by the touch pen 12 is performed by determining whether or not the signal value from the sense electrodes falls within a threshold range (hereafter referred to as a “touch pen determination threshold range”) that represents a change in capacitance when the touch pen 12 contacts the touch panel 10. When the signal value from the sense electrodes falls within the touch pen determination threshold range, the touch panel control unit 11 determines that the contact was made by the touch pen 12. When the signal value does not fall within the touch pen determination threshold range, the touch panel control unit 11 determines that the contact was not made by the touch pen 12.
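The pen/non-pen classification performed by the touch panel control unit 11 can be sketched as a band check on the sense-electrode signal value. The threshold range below is a hypothetical placeholder; actual values depend on the panel hardware and the touch pen 12:

```python
def classify_contact(signal_value, pen_range=(30, 80)):
    """Classify a contact on a capacitive panel as pen or non-pen.

    A pen tip is smaller than a fingertip, so the capacitance change it
    causes falls within a narrower band. pen_range here is a
    hypothetical touch pen determination threshold range; real values
    are hardware specific.
    """
    lo, hi = pen_range
    return "pen" if lo <= signal_value <= hi else "not_pen"
```

In the embodiment, a contact is first detected when the signal value meets or exceeds a detection threshold, and only then is this band check applied to distinguish the touch pen 12 from a finger or other object.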
Moreover, the touch panel control unit 11 detects, as the input location, coordinates corresponding to the intersection of the drive electrode and the sense electrode from which the signal value was obtained. In addition, the touch panel control unit 11 outputs to the control unit 40 the detection results indicating whether or not the contact was made by the touch pen 12, along with coordinates representing the detected input location. The coordinates of the input location detected by the touch panel control unit 11 are coordinates within a coordinate range (default values) that is initially set to correspond to the display area 20A.
The display panel 20 is a liquid crystal panel in which a liquid crystal layer (not shown) is sandwiched between an active matrix substrate, which transmits light, and an opposite substrate (both not shown). On the active matrix substrate, a plurality of gate lines (not shown) and a plurality of source lines (not shown) that intersect the gate lines are formed. The substrate includes the display area 20A (see
The display panel control unit 21 has: a gate driver (not shown) that scans the gate lines (not shown) of the display panel 20, and a source driver (not shown) that provides data signals to the source lines (not shown) of the display panel 20. The display panel control unit 21 outputs a prescribed voltage signal to the common electrode and outputs control signals, including timing signals such as clock signals, to the gate driver and the source driver. As a result, the gate lines are sequentially scanned by the gate driver, the data signals are provided to the source lines by the source driver, and an image based on the data signals is displayed in the respective pixels.
The backlight 30 is provided to the rear of the display panel 20. The backlight 30 has a plurality of LEDs (light-emitting diodes), and lights the plurality of LEDs in accordance with the luminance indicated by the backlight control unit 31, which will be described later. The backlight control unit 31 outputs a luminance signal to the backlight 30 that is based on a luminance indicated by the control unit 40.
The control unit 40 has a CPU (central processing unit; not shown) and memory, which includes ROM (read only memory) and RAM (random access memory).
The display control unit 401 causes the display panel control unit 21 to display a character image for calibrating the touch panel 10 when the display device 1 is turned ON.
Character images 200a, 200b are respectively displayed in a display area 201 (a determination area) and a display area 202 (a determination area). The display area 201 and the display area 202 are located in opposing corners that serve as a reference for indicating the range of the display area 20A. The display area 201 is enclosed by the sides 20x1, 20y1, as well as two other sides that serve as borders with other display areas in the display area 20A. The display area 202 is enclosed by the sides 20x2, 20y2, as well as two other sides that serve as borders with other display areas in the display area 20A.
In the present embodiment, the character image 200a is a capital letter “G,” while the character image 200b is a lowercase letter “b.” In the example shown in
Next, the setting unit 402 will be explained. When an input operation has been made by the touch pen 12 for the character images 200a, 200b displayed in the display area 20A, the setting unit 402 sets reference coordinates representing a coordinate range within the touch panel 10 in accordance with the input location of the character images 200a, 200b within the default values of the sensing area.
In the display area 201 shown in
In other words, the character displayed in the character image 200a includes a line that is not parallel to the sides 20x1, 20y1, which form the border of the display area 201 and the exterior of the display area. Portions of this non-parallel line contact the sides 20x1 (X axis), 20y1 (Y axis) respectively. In other words, in
Similarly, in the coordinate plane shown in
In other words, the letter shown in the character image 200b includes a line that is not parallel to the sides 20x2, 20y2, which form the border of the display area 202 and the exterior of the display area. Portions of this non-parallel line contact the sides 20x2 (Y=Dy1), 20y2 (X=Dx1), respectively. In other words, in
The default values for the sensing area of the touch panel 10 are set so as to correspond to the coordinate plane of the display area 20A.
An input operation made by the touch pen 12 for the character images 200a, 200b is an operation in which the character images 200a, 200b are traced using the touch pen 12. Since portions of the character image 200a are in contact with the X axis and the Y axis, input locations near the X axis and the Y axis of the sensing area are detected when the character image 200a is traced appropriately. Therefore, among the coordinates for the input locations by the touch pen 12 for the character image 200a, the minimum X coordinate (Tx0′, for example) and the minimum Y coordinate (Ty0′, for example) correspond to the minimum X coordinate (Dx0) and the minimum Y coordinate (Dy0) in the display area 20A.
In addition, since portions of the character image 200b are in contact with Y=Dy1 and X=Dx1, input locations near X=Dx1 and Y=Dy1 in the sensing area are detected when the character image 200b is traced appropriately. Therefore, among the coordinates for the input locations by the touch pen for the character image 200b, the maximum X coordinate (Tx1′, for example) and the maximum Y coordinate (Ty1′, for example) correspond to the maximum X coordinate (Dx1) and the maximum Y coordinate (Dy1) in the display area 20A.
The setting unit 402 resets the default values by setting (Tx0′, Ty0′) and (Tx1′, Ty1′), which are based on the input locations for the character images 200a, 200b, as reference coordinates indicating the coordinate range of the sensing area. The setting unit 402 stores the reference coordinates (Tx0′, Ty0′), (Tx1′, Ty1′) for the touch panel 10 in the storage unit 50.
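The operation of the setting unit 402 described above can be sketched as taking the minimum coordinates from the trace of the character image 200a and the maximum coordinates from the trace of the character image 200b. This is a minimal sketch under the assumptions stated in the text (200a contacts the minimum-coordinate sides, 200b the maximum-coordinate sides); the function name and data layout are illustrative:

```python
def set_reference_coordinates(trace_a, trace_b):
    """Derive sensing-area reference coordinates from two pen traces.

    trace_a: (x, y) samples from tracing the character image in the
             determination area whose sides lie on the minimum X and Y
             borders of the display area (image 200a in the text).
    trace_b: samples from the opposite corner (image 200b).
    Returns ((Tx0', Ty0'), (Tx1', Ty1')).
    """
    tx0 = min(x for x, _ in trace_a)
    ty0 = min(y for _, y in trace_a)
    tx1 = max(x for x, _ in trace_b)
    ty1 = max(y for _, y in trace_b)
    return (tx0, ty0), (tx1, ty1)
```

These two corner coordinates then replace the default values as the coordinate range of the sensing area corresponding to the display area 20A.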
Next, the relationship between the hand (dominant hand) gripping the touch pen 12 and shifts in the input location as a result of parallax will be explained. The direction to which the input location shifts as a result of parallax depends on which hand is gripping the touch pen 12.
When the user grips the touch pen 12 with the left hand and performs input on the touch panel 10, the user sees the location where he will perform input from the right side of the left hand. Thus, as shown in
Meanwhile, when the user grips the touch pen 12 with the right hand and performs input on the touch panel 10, the user sees the location where he will perform input from the left side of the right hand. Thus, as shown in
In this manner, the input location shifts to the right (positive direction of the X axis) of the display location (input target location) on the display panel 20 when the hand gripping the touch pen 12 is the right hand, and shifts to the left (negative direction of the X axis) of the display location (target input location) of the display panel 20 when the left hand is gripping the touch pen 12. The determination unit 403 determines which hand is gripping the touch pen 12 in accordance with the display location of the character image 200a and the input location of the character image 200a by the touch pen 12 within the default value coordinate plane of the sensing area.
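Given the determined hand and a detected shift amount, the correction itself reduces to undoing the parallax shift along the X axis. The sketch below follows the sign convention stated above (input shifts right for a right-handed grip, left for a left-handed grip); the function name is an illustrative assumption:

```python
def correct_input_location(x, y, hand, shift_x):
    """Undo the parallax shift along the X axis.

    shift_x is the magnitude of the shift detected by the detection
    unit. Input shifts in the positive X direction for a right-handed
    grip and in the negative X direction for a left-handed grip, so the
    correction is applied in the opposite direction.
    """
    return (x - shift_x, y) if hand == "right" else (x + shift_x, y)
```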
The portion 211a of the curved line in the character image 200a shown in
When the user traces the portion 211a of the character image 200a while gripping the touch pen 12 with his right hand, the input location will shift to the right (positive direction of the X axis) of the display location (input target location) as a result of parallax; thus, the input location for the portion 211a of the character image 200a will be located within the display area 201. Thus, the determination unit 403 determines that the hand gripping the touch pen 12 is the right hand when a line connecting a plurality of continuous input locations in the determination target range is located within the display area 201.
Meanwhile, the determination unit 403 determines that the hand gripping the touch pen 12 is the left hand when a line connecting a plurality of continuous input locations in the determination target range is not located within the display area 201. When the user traces the portion 211a of the character image 200a while gripping the touch pen 12 with the left hand, the input location will shift to the left (negative direction of the X axis) of the display location (input target location) as a result of parallax; thus, it is likely that the portion 211a of the character image 200a will overlap the border (Y axis) with the exterior of the display area. As a result, the line connecting the input locations for the portion 211a of the character image 200a will be cut off at the border (Y axis) with the exterior of the display area. The determination unit 403 stores in the storage unit 50 the determination results determined in accordance with the input location of the partial image 211a.
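The trajectory-based determination just described can be summarized in a short sketch. This is an assumption-laden illustration (the function name and the rectangle representation of display area 201 are hypothetical), showing only the core test: whether every continuous input location stays inside the display area.

```python
def determine_gripping_hand(trajectory, area):
    """Classify the gripping hand from the traced curve for portion 211a.

    trajectory: continuous (x, y) input samples in the determination range.
    area: (x0, y0, x1, y1) bounds of display area 201 (upper-left corner).
    A rightward parallax shift keeps the curve inside area 201 -> right hand;
    a leftward shift pushes it past the border, cutting the line -> left hand.
    """
    x0, y0, x1, y1 = area
    inside = all(x0 <= x <= x1 and y0 <= y <= y1 for x, y in trajectory)
    return "right" if inside else "left"
```

For character image 200b the test would be inverted, as the following paragraphs explain.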
Next, the detection unit 404 will be explained. The detection unit 404 detects shifts along the X axis for the input location of the character image 200a, and detects the amount of shift (correction value) between the input location and the display location of the character image 200a. Specifically, in the present embodiment, the detection unit 404 detects the amount of shift between the display location of the partial image 213a indicated by diagonal lines in the character image 200a shown in
In the present embodiment, in accordance with the input location of the character image 200a, the device determines which hand is gripping the touch pen 12 and then detects an amount of shift in the input location. However, the determination of the hand gripping the touch pen 12 and the detection of the amount of shift in the input location may be performed in accordance with the input location of the character image 200b. When the character image 200b is used, the determination unit 403 determines, within the default values for the sensing area, whether or not the trajectory of the plurality of continuous input locations that were input within the determination target range and that correspond to the portion 211b of the character image 200b is located within the display area 202.
When the user grips the touch pen 12 in the right hand and then traces the character image 200b, it is likely that the input position will shift to the right due to parallax. On the other hand, when the user grips the touch pen 12 in his left hand and then traces the character image 200b, it is likely that the input position will shift to the left due to parallax. Thus, the determination unit 403 determines that the hand gripping the touch pen 12 is the right hand when the line connecting the input locations corresponding to the portion 211b of the character image 200b is not located within the display area 202, and determines that the hand gripping the touch pen 12 is the left hand when the line is located within the display area 202.
In addition, when a shift in the input location is detected using the character image 200b, the detection unit 404 detects the shift in accordance with the input location of the partial image 213b of the character image 200b shown in
The description will be continued while referring back to
The operation button unit 60 includes the operation buttons of the display device 1, such as the power button and menu buttons. The operation button unit 60 outputs an operation signal that represents the content of the user operation to the control unit 40.
The user then performs an operation of tracing the character images 200a, 200b displayed in the display area 20A via the touch pen 12. The control unit 40, via the touch panel control unit 11, waits while displaying the character images 200a, 200b until the control unit 40 acquires the coordinates indicating the location contacted by the touch pen 12 (Step S13: No). After acquiring the coordinates indicating the position contacted by the touch pen 12 (Step S13: Yes), the control unit 40, via the touch panel control unit 11, resets the default values for the coordinate range of the touch panel 10 by setting reference coordinates indicating the coordinate range of the touch panel 10 in accordance with the acquired coordinates (Step S14).
Specifically, the control unit 40 identifies, from among the coordinates acquired from the touch panel control unit 11, coordinates included in a region 101 (see
Furthermore, the control unit 40 identifies, from among the coordinates acquired from the touch panel control unit 11, coordinates included in a region 102 (see
Next, the control unit 40 determines which hand is gripping the touch pen 12 and detects the amount of shift (correction value) of the input location in accordance with the input location of the character image 200a and the display location of the character image 200a.
Specifically, the control unit 40 determines whether or not a trajectory of a plurality of continuous input locations input by the user within the determination range, which corresponds to the partial image 211a of the character image 200a shown in
Next, the control unit 40 detects the difference between the display location of the partial image 213a of the character image 200a shown in
After calibration, the control unit 40 executes a prescribed application, and waits until coordinates indicating the input location are acquired from the touch panel control unit 11 (Step S16: No). When the control unit 40 acquires coordinates ((Tx2, Ty2), for example) indicating the input location from the touch panel control unit 11 (Step S16: Yes), if the coordinates are an input location contacted by the touch pen 12 (Step S17: Yes), the control unit 40 corrects the acquired coordinates (Tx2, Ty2) in accordance with the reference coordinates (Tx0′, Ty0′), (Tx1′, Ty1′) in the storage unit 50, the information indicating which hand is gripping the touch pen 12, and the amount of shift (correction value) in the input location (Step S18).
The coordinates acquired from the touch panel control unit 11 are coordinates within the default values ((Tx0, Ty0) to (Tx1, Ty1)) of the sensing area. The control unit 40 converts the acquired coordinates (Tx2, Ty2) into coordinates within the coordinate range ((Tx0′, Ty0′) to (Tx1′, Ty1′)) of the reference coordinates. Specifically, the coordinates (Tx2′, Ty2′) within the reference coordinate range may be acquired by treating the default values, the reference coordinates, and the acquired coordinates as variables and substituting them into a prescribed arithmetic formula for correcting the acquired coordinates, for example.
The amount of shift in the input location due to parallax is not taken into account in the converted coordinates (Tx2′, Ty2′). The control unit 40 corrects the coordinates (Tx2′, Ty2′) in accordance with the information indicating which hand is gripping the touch pen 12 and the correction value. In other words, when the hand gripping the touch pen 12 is the right hand, the control unit 40 corrects the X coordinate of the coordinates (Tx2′, Ty2′) to an X coordinate (Tx2′-Δdx) that has been shifted toward the negative direction of the X axis (left direction) by the correction amount (Δdx). Furthermore, when the hand gripping the touch pen 12 is the left hand, the control unit 40 corrects the X coordinate of the coordinates (Tx2′, Ty2′) to an X coordinate (Tx2′+Δdx) that has been shifted toward the positive direction of the X axis (right direction) by the correction amount (Δdx). The control unit 40 outputs coordinates representing the corrected input location to the application that is being executed.
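The two-step correction described above (range conversion followed by parallax compensation) can be sketched as follows. The patent refers only to "a prescribed arithmetic formula," so the linear remapping below is one plausible assumption, and all function and parameter names are hypothetical.

```python
def correct_input(pt, default_min, default_max, ref_min, ref_max, hand, ddx):
    """Map raw touch coordinates into the reference range, then undo parallax.

    pt: raw (Tx2, Ty2) acquired from the touch panel control unit.
    default_min/default_max: default coordinate range of the sensing area.
    ref_min/ref_max: reference coordinates set during calibration.
    hand: "right" or "left"; ddx: detected shift amount (correction value).
    """
    def remap(v, d0, d1, r0, r1):
        # Linear interpolation from the default range into the reference range
        # (an assumed form of the "prescribed arithmetic formula").
        return r0 + (v - d0) * (r1 - r0) / (d1 - d0)

    tx = remap(pt[0], default_min[0], default_max[0], ref_min[0], ref_max[0])
    ty = remap(pt[1], default_min[1], default_max[1], ref_min[1], ref_max[1])
    # Right hand: input lands right of the target, so shift left; and vice versa.
    tx = tx - ddx if hand == "right" else tx + ddx
    return tx, ty
```

The corrected pair corresponds to the coordinates the control unit 40 outputs to the running application.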
Meanwhile, when the control unit 40 acquires the coordinates representing the contact location (input location) from the touch panel control unit 11 (Step S16: Yes), if the coordinates are not an input location resulting from contact by the touch pen 12 (Step S17: No), the control unit 40 outputs the acquired coordinates to the application being executed (Step S19).
The control unit 40, via the operation button unit 60, repeats the processing starting at Step S17 until receiving an operation signal indicating that the power has been turned OFF (Step S20: No). When the control unit 40 receives the operation signal indicating that the power has been turned OFF, the control unit 40 ends the above-mentioned processing (Step S20: Yes).
In the above-mentioned embodiment, a character image 200a representing the letter “G” was displayed in the display area 201 located in one of two opposing corners in the display area 20A, while a character image 200b representing the letter “b” was displayed in the other display area 202. The letters in the character images 200a, 200b include lines that are not parallel to the two sides of the display areas 201, 202 that border the exterior of the display area. These character images 200a, 200b are displayed such that portions of these non-parallel lines contact the above-mentioned two sides. Since the character images 200a, 200b represent letters, the user can more intuitively trace what is displayed compared to cases in which a symbol such as a cross mark is displayed. Thus, it is possible to detect a location near the border of the display areas 201, 202 and the exterior of the display area from the input location of the character images 200a, 200b, and it is possible to set an appropriate coordinate range as the sensing area of the touch panel 10 that corresponds to the display area 20A.
In addition, in the above-mentioned embodiment, the display device determines whether or not a trajectory of the input locations for the portion 211a of the character image 200a displayed so as to contact the border with the exterior of the display area is located within the display area 201. The portion 211a of the character image 200a is a curved line. A curved line is easier to input without stopping compared to a straight line. Thus, it is possible to determine the direction to which the input location shifts as a result of parallax, or in other words, determine which hand is gripping the touch pen 12, from the trajectory of the input locations for the portion 211a of the character image 200a.
The partial image 213a of the character image 200a is a portion of a line segment that is substantially parallel to the upright direction of the character in the character image 200a. When the partial image is a portion of a line that is not parallel to the upright direction of the character, the partial image has a component in the upright direction of the character and a component in a direction orthogonal to the upright direction of the character; thus, when the device detects a shift in the input location, these various components must be taken into consideration. As a countermeasure, in the above-mentioned embodiment, it is possible to easily detect the amount of shift in the input location in the X axis direction, or in other words, the amount of shift in the left-right direction of the user, by simply obtaining the difference between the input location of the partial image 213a and the display location of the partial image 213a.
In addition, in the above-mentioned embodiment, it is possible to correct the input location acquired after calibration to coordinates that reflect which hand is gripping the touch pen 12 and the amount of shift in the input location within the coordinate range based on the reference coordinates of the touch panel 10 obtained during calibration. As a result, the desired input location of the user is output to the application that is being executed, and the appropriate processing that the user desires to perform can be carried out.
An embodiment of the present invention has been described above, but the above embodiment is a mere example of an implementation of the present invention. Thus, the present invention is not limited to the embodiment described above, and can be implemented by appropriately modifying the embodiment described above without departing from the spirit of the present invention. Next, modification examples of the present invention will be explained.
(1) In the above-described embodiment, an example was used in which letters of the alphabet that were different from one another were displayed as the character images 200a, 200b. Japanese characters may be displayed as well, however. As shown in
In addition, as shown in
In the examples shown in
In addition, in
When determining which hand is gripping the touch pen 12 in accordance with the input locations for the character image 200b shown in
Even in cases in which the character represented by the character images 200a, 200b is a Japanese character, the respective characters of the character images 200a, 200b have a line that is not parallel to the two sides of the display areas 201, 202 that border the exterior of the display area. In the present modification example, the portions (211a, 211b, 212a, 212b) of the character images 200a, 200b are curved line portions that curve toward, or in other words, portions of lines that are not parallel to, the two sides of the display area that border the exterior of the display areas in which the respective character images 200a, 200b are displayed.
It is preferable that the character represented by one of the character images 200a, 200b include a line segment, away from the border of the display area 20A, that is substantially parallel to the upright direction of the character. When the partial image is a portion of a line that is not parallel to the upright direction of the character, the partial image contains a component in the upright direction of the character and a component in a direction that is orthogonal to the upright direction; thus, it is not possible to acquire the amount of shift in the input location in the left-right direction of the character just by acquiring the difference between the display location and input location of the partial image. In the present modification example, the respective characters represented by the character images 200a, 200b contain linear segments that are substantially parallel to the upright direction of the character; thus, it is possible to easily acquire the amount of shift in the input location in a direction orthogonal to the upright direction of the character, or in other words, in the left-right direction of the user, by acquiring the difference between the input location and the display location of the partial images of the line segments.
(2) In the above-mentioned embodiment and Modification Example (1), examples were used in which the respective characters in the character images 200a, 200b were displayed in the display areas 201, 202 such that respective portions of the characters contacted the two sides bordering the exterior of the display area. The invention may be configured as follows, however. The characters may be displayed so as to protrude to the exterior of the display area by displaying the characters such that portions of the characters corresponding to the character images 200a, 200b overlap the border between the display areas 201, 202 and the exterior of the display area. In other words, the character images 200a, 200b may be images that represent a portion of the respective characters.
As shown in
In addition, as shown in
In the present modification example, the minimum reference coordinates (Tx0′, Ty0′) of the sensing area are set in accordance with the input location of the character image 200a, and the maximum reference coordinates (Tx1′, Ty1′) of the sensing area are set in accordance with the input location of the character image 200b.
In the example shown in
In the example shown in
In addition, in the present modification example, the display device will detect the amount of shift in the input location and the determination results regarding which hand is gripping the touch pen 12 in accordance with the input location of the character image 200a. Specifically, in the example shown in
In the example shown in
In the present modification example, the Japanese hiragana characters corresponding to the character images 200a, 200b are displayed such that the characters visibly protrude to the exterior of the display area. The more familiar a user is with the Japanese hiragana characters corresponding to the character images 200a, 200b, the more accurately the character images 200a, 200b will be traced. As a result, a location near the border with the exterior of the display area is input, and it is possible to appropriately set reference coordinates representing the coordinate range of the touch panel 10 corresponding to the display area 20A.
(3) In the above-mentioned embodiment, an example was used in which one letter of the alphabet was used for each of the character images 200a, 200b. Two or more letters of the alphabet may be used, however. An example will be explained hereafter in which two letters of the alphabet are used as the character images 200a, 200b, for example.
As shown in
In addition, as shown in
The portions 251a, 252a of the character image 200a respectively contact the Y axis and the X axis of the display area 20A. When the character image 200a is appropriately traced, locations near the Y axis and the X axis of the display area 20A are input. As a result, it is possible to identify reference coordinates (Tx0′, Ty0′) in the sensing area that correspond to the minimum coordinates (Dx0, Dy0) of the display area 20A. In addition, when the user recognizes the letters “d” and “Q” and inputs the entire letters (including the portions that are not displayed), a location on the X axis or the Y axis of the display area 20A will be input, and it is thus possible to more accurately set the reference coordinates (Tx0′, Ty0′).
Similarly, the portions 251b, 252b of the character image 200b respectively contact the side X=Dx1 and the side Y=Dy1 of the display area 20A. When the character image 200b is appropriately traced, locations near the side X=Dx1 and the side Y=Dy1 of the display area 20A are input. As a result, it is possible to identify reference coordinates (Tx1′, Ty1′) in the sensing area that correspond to the maximum coordinates (Dx1, Dy1) of the display area 20A. In addition, when the user recognizes the letters “b” and “P” and inputs the entire letters (including the portions that are not displayed), a location on the side X=Dx1 or the side Y=Dy1 of the display area 20A will be input, and it is thus possible to more accurately set the reference coordinates (Tx1′, Ty1′).
In the present modification example, the character image 200a and the character image 200b are displayed such that a portion of the curved line in the letter “d” represented by the image 200a_1 and a portion of the curved line in the letter “P” represented by the image 200b_2 respectively protrude to the exterior of the display area. Thus, in the present modification example, the determination regarding which hand is gripping the touch pen 12 and the detection of the amount of shift in the input location are performed in the following manner.
When the determination regarding which hand is gripping the touch pen 12 and the detection of the amount of shift in the input location are performed using the character image 200a, the device may determine which hand is gripping the touch pen 12 and detect the amount of shift in the input location in accordance with the input location of the partial image 253a in the image 200a_1 shown in
Furthermore, when the determination regarding which hand is gripping the touch pen 12 and the detection of the amount of shift in the input location are performed using the character image 200b, the device may determine which hand is gripping the touch pen 12 and detect the amount of shift in the input location in accordance with the input location of the partial image 253b in the image 200b_2 shown in
In such a case, the control unit 40 calculates the difference, in the sensing area, between the plurality of X coordinates corresponding to the display location of the partial image 253a or the partial image 253b and the X coordinates of the plurality of input locations for the partial image 253a or the partial image 253b, and then acquires the average value (Tx_Avg) for these calculations. When the average value Tx_Avg satisfies |Tx_Avg| ≦ (a prescribed threshold) and Tx_Avg > 0, the control unit 40 determines that the direction of the shift in the input location is in the negative direction of the X axis (the left hand direction), and then determines that the hand gripping the touch pen 12 is the left hand. Furthermore, when |Tx_Avg| ≦ (a prescribed threshold) and Tx_Avg < 0, the control unit 40 determines that the direction of the shift in the input location is in the positive direction of the X axis (the right hand direction), and then determines that the hand gripping the touch pen 12 is the right hand. The control unit 40 detects |Tx_Avg| as the amount of shift in the input location.
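The averaging and sign test just described can be sketched as follows. This is an illustrative Python rendering with hypothetical names; it follows the stated convention that the difference is taken as display-location X minus input-location X.

```python
def detect_shift(display_xs, input_xs, threshold):
    """Average the X differences between displayed and traced segment points.

    display_xs: X coords of the displayed vertical partial image (253a/253b).
    input_xs:   X coords of the corresponding input locations.
    Returns (hand, shift_amount), or (None, None) outside the threshold.
    """
    diffs = [d - i for d, i in zip(display_xs, input_xs)]
    tx_avg = sum(diffs) / len(diffs)
    if abs(tx_avg) <= threshold:
        if tx_avg > 0:
            # Input lies left of the display location -> leftward shift.
            return "left", abs(tx_avg)
        if tx_avg < 0:
            # Input lies right of the display location -> rightward shift.
            return "right", abs(tx_avg)
    return None, None
```

The detected |Tx_Avg| would then be stored as the correction value applied after calibration.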
The partial images 253a, 253b are line segment images that are substantially parallel to the Y axis. Thus, by acquiring the difference between the input location of the partial image 253a or the partial image 253b and the display location of the partial image 253a or the partial image 253b, it is possible to detect the direction to which the input location has shifted as a result of parallax. As a result, it is possible to determine which hand is gripping the touch pen 12 via the direction toward which the input location has shifted.
In the present modification example, an example was used in which the images 200a_1, 200a_2, which formed the character image 200a, were arranged and displayed along the X axis direction (a direction orthogonal to the upright direction of the characters). The images may be arranged and displayed along the Y axis direction (the upright direction of the characters), however. When the images 200a_1, 200a_2 forming the character image 200a are arranged along the Y axis direction, the images should be displayed in the order of the image 200a_2, and then the image 200a_1. In addition, when the images 200b_1, 200b_2 forming the character image 200b are arranged along the Y axis direction, the images should be displayed in the order of the image 200b_2, and then the image 200b_1.
Essentially, when the upright direction of the characters forming the character image 200a and the Y axis of the display area 20A are substantially parallel to each other, the image of a character having a curved line that curves toward the X axis of the display area 20A should be disposed such that a portion of the curved line contacts the X axis, and the image of a character having a curved line that curves toward the Y axis of the display area 20A should be disposed such that a portion of the curved line contacts the Y axis. In addition, when the upright direction of the characters forming the character image 200b and the Y axis of the display area 20A are substantially parallel to each other, the image of a character having a curved line that curves toward the side Y=Dy1 should be disposed such that a portion of the curved line contacts the side Y=Dy1, and the image of a character having a curved line that curves toward the side X=Dx1 should be disposed such that a portion of the curved line contacts the side X=Dx1.
Also in the present modification example, an example was used in which characters forming the character images 200a, 200b were displayed such that the characters visibly protruded to the exterior of the display area. However, if the characters do not visibly protrude, the device may be configured so as to display information near the character images 200a, 200b that prompts the user to trace the characters forming the character images 200a, 200b. In other words, in
(4) In the above-mentioned embodiment, an example was used in which the device determined which hand was gripping the touch pen 12 in accordance with the input location by the touch pen 12 for the partial image 211a that was a curved line segment of the character image 200a. Similar to Modification Example (3), however, the device may be configured so as to perform the determination regarding which hand is gripping the touch pen 12 and the detection of the amount of shift in the input location in accordance with the input location of the partial image 213a, which is a line segment substantially parallel to the Y axis, and the display location of the partial image 213a.
(5) In the above-mentioned embodiment, the display device 1 may be prevented from accepting input instructions on the touch panel 10 when the display device 1 is turned ON until the device receives an input operation of a prescribed password formed of a sequence of a plurality of characters. When the display device 1 is turned on, the control unit 40 displays the lock screen shown in
(6) In the above-mentioned embodiment and modification examples, an example was used in which, when the display device 1 is turned ON, the display device 1 displays the character images 200a, 200b, and then determines which hand is gripping the touch pen 12 and detects the amount of shift in the input location. The device may be configured to perform these operations at the following times, however. The device may perform these operations when the device receives an operation from the user that instructs the device to perform calibration, and the device may also perform these operations after a non-operation period has elapsed in which no contact has been detected on the touch panel 10 for a fixed period of time, for example.
(7) In the above-mentioned embodiment, an example was used in which the display device determined which hand was gripping the touch pen 12 in accordance with the input location of the character image 200a. The determination regarding which hand is gripping the touch pen 12 may alternatively be determined using the following method. There are instances in which the character image 200a is traced using the touch pen 12 while a portion of the pinky-finger side of the palm of the hand gripping the touch pen 12 is supported by the touch panel 10. At such time, the contact area of the portion of the palm contacting the touch panel 10 is larger than the contact area of the touch pen 12; thus, the changes in capacitance will be different when the palm contacts the touch panel 10 as a result of the difference in size of the contact areas.
In the present modification example, the touch panel 10 is configured as a touch panel that can perform multi-touch detection, and can furthermore preset a threshold range for a signal value that corresponds to a change in capacitance resulting from contact by a portion of the palm. When a signal value output from the sense electrodes falls within the threshold range for signal values corresponding to contact by a portion of the palm, the touch panel control unit 11 outputs to the control unit 40 a signal indicating contact by a portion of the palm and the coordinates representing the contact location. When the control unit 40 acquires from the touch panel control unit 11 the coordinates of the contact by the portion of the palm, the control unit 40 determines which hand is gripping the touch pen 12 as a result of the relationship between the coordinates of the contact by the portion of the palm and the coordinates of the contact by the touch pen 12.
As shown in
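The positional relationship between the palm contact and the pen contact can be tested with a one-line comparison. The figure-specific relationship is not reproduced in the text, so the assumption below (palm resting to the right of the pen tip for a right hand, to the left for a left hand) is a hypothetical reading, and the function name is invented for illustration.

```python
def hand_from_palm_contact(pen_xy, palm_xy):
    """Infer the gripping hand from palm vs. pen contact coordinates.

    Assumes the pinky-side palm rest lies on the same side as the gripping
    hand relative to the pen tip (an assumed, figure-dependent relationship).
    """
    return "right" if palm_xy[0] > pen_xy[0] else "left"
```

A real implementation would also weight the palm contact by its larger contact area, as the preceding paragraph notes.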
(8) In the above-mentioned embodiment, the control unit 40 may be further configured such that, when the control unit 40 detects an input location via contact by a hand, instead of contact by the touch pen 12, near the two opposing sides of the display area 20A, the control unit 40 determines which hand is gripping the touch pen 12 in accordance with these detection results. When the user holds the touch pen 12 with his right hand and performs input while the left hand holds the display device 1 as shown in
(9) In the above-mentioned embodiment, the control unit 40 may be configured so as to display, in a location based on which hand is gripping the touch pen 12, a display location of an instruction image that directs the operation of a menu icon, an operation icon, or the like that has been set in an application installed in the display device 1. In such a case, when the hand gripping the touch pen 12 is the right hand, for example, the control unit 40 may display an instruction image that has a frequency of operation higher than a prescribed frequency of operation on the right side (positive direction of the X axis) of the display area 20A, and may display an instruction image with a frequency of operation below the prescribed frequency of operation on the left side (the negative direction of the X axis) of the display area 20A. In addition, when the hand gripping the touch pen 12 is the left hand, the control unit may display the instruction images in a manner opposite to that when the right hand was gripping the touch pen 12; namely, the control unit 40 may display an instruction image with a frequency of operation higher than a prescribed frequency of operation on the left side (negative direction of the X axis) of the display area 20A, and may display an instruction image with a frequency of operation below the prescribed frequency of operation on the right side (the positive direction of the X axis) of the display area 20A. In short, the display device should display an instruction image in a location of the display area 20A that is closer to the hand gripping the touch pen 12. By displaying the instruction image with a higher frequency of operation in a location closer to the hand gripping the touch pen 12, it is possible to improve the user friendliness of the device.
(10) In the above-mentioned embodiment, an example was used in which display areas located in opposing corners of the display area 20A included a display area 201 that included minimum values (Dx0, Dy0) located in the upper left of the display area 20A shown in
(11) In the above-mentioned embodiment and modification examples, an example was used in which the respective characters in the character images 200a, 200b were letters from the alphabet or Japanese hiragana characters. However, the characters in the character images 200a, 200b may instead be characters from the languages of a variety of countries, such as Japanese katakana letters, kanji, hangul characters, or the like, or may instead be numbers. In addition, the character images 200a, 200b may be made of two or more characters that are a combination of letters from the alphabet and numbers, or may be made of two or more characters that are combination of characters from the languages of a variety of countries.
(12) In the above-mentioned embodiment and modification examples, an example was used in which the display areas 201, 202 located in opposing corners of the display area 20A were determination areas and the character images 200a, 200b were respectively displayed in the respective determination areas. The display device may be configured, however, such that display areas located in one or more corners of the display area 20A are determination regions and the character images are displayed therein.
The present invention can be applied to the industry of display devices equipped with touch panels.
| Number | Date | Country | Kind |
|---|---|---|---|
| 2013-183291 | Sep 2013 | JP | national |

| Filing Document | Filing Date | Country | Kind |
|---|---|---|---|
| PCT/JP2014/071374 | 8/13/2014 | WO | 00 |