The present invention relates to a display control device, a display control method and a program, which support an input operation using a touch panel.
In recent years, the number of electronic apparatuses equipped with touch panels has increased. The touch panel, which can provide an intuitive user interface, is widely used as a device which receives an input operation on an electronic apparatus such as a mobile phone or a smartphone. The touch panel enables both reception of an input operation and display of the execution result of that operation to be performed on the same screen of a display unit (for example, a liquid crystal display (LCD) or an electroluminescent (EL) display) provided in the electronic apparatus.
In addition, in recent years, a touch panel which can detect the proximity of a finger is known (for example, refer to Patent Literature 1). A noncontact type user input device disclosed in Patent Literature 1 includes a plurality of linear transmission electrodes, a transmitter which supplies an
AC current for transmission to each of the transmission electrodes, a plurality of linear reception electrodes which are disposed so as not to make contact with the respective transmission electrodes, and a receiver which receives the AC current flowing through the reception electrodes. A capacitor is formed at each intersection between a transmission electrode and a reception electrode. Further, since an additional capacitance is formed when the fingertip of a user approaches, the capacitance value of the capacitor changes depending on the degree of proximity of the fingertip. The noncontact type user input device can recognize the distance between the touch panel and the finger on the basis of this change in the capacitance value.
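The relationship described above can be pictured with a short sketch. The following Python fragment assumes a simple parallel-plate style inverse model between the capacitance change and the finger height; the constants and function name are illustrative assumptions, not values from Patent Literature 1.

```python
K = 50.0          # sensor constant (illustrative assumption)
C_BASELINE = 1.0  # capacitance with no finger present (illustrative)

def estimate_height(measured_capacitance: float) -> float | None:
    """Return the estimated finger height, or None if no finger is near."""
    delta = measured_capacitance - C_BASELINE
    if delta <= 0:
        return None  # no measurable change: no finger in proximity
    return K / delta  # larger capacitance change -> smaller distance

print(estimate_height(6.0))  # 10.0: finger relatively close
print(estimate_height(1.0))  # None: no proximity detected
```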
The touch panel as disclosed in Patent Literature 1 can detect a state in which the finger is held at a position in the space below a predetermined height from the horizontal surface of the touch panel, that is, a proximity state between the finger and the touch panel. On the basis of the capacitance value, which is determined by the distance between the finger and the touch panel, it can also detect a sliding operation performed with the finger in the space approximately in parallel with the touch panel, in the same manner as a sliding operation performed with the finger directly touching the touch panel. For this reason, a touch panel which can detect the proximity of the finger is expected to be established as a new user interface.
Patent Literature 1: JP-A-2002-342033
However, in the touch panel of the related art, there is a problem in that a user does not know what kind of operation will be executed until the user actually performs an input operation on the touch panel in an application. For example, there may be a case where a novice user uses an unfamiliar application; since the user is unaccustomed to the functions of buttons such as icons displayed on the screen, the user performs a touch operation on a button, an operation which the user does not intend is executed, and the user thus makes operation errors.
In addition, Patent Literature 1 does not particularly disclose an operation for the above-described use case, that is, a case where a proximity operation is performed on buttons such as icons displayed on the screen while the user uses an unfamiliar application. However, the same problem described in the above use case may also occur in a touch panel which can detect the proximity of the finger.
The present invention has been made in consideration of the above-described circumstances, and an object thereof is to provide a display control device, a display control method and a program, which explicitly display support information or related information regarding an input operation target when proximity is detected, before a user selects and fixes the input operation target, so as to support the user's accurate input operation, thereby improving operability.
A display control device according to one aspect of the present invention includes: a display unit that displays data on a screen; a proximity detection unit that detects proximity of a finger to the screen and outputs a proximity detection signal; a contact detection unit that detects a contact of the finger on the screen; a display control unit that displays related information or support information regarding an item displayed at a proximity correspondence position on the screen, at the proximity correspondence position or in the vicinity of the proximity correspondence position with reference to the proximity detection signal, wherein the proximity correspondence position is a position on the screen corresponding to the proximity detection signal for the finger whose proximity is detected; and an operation execution unit that, in accordance with the contact of the finger on the related information or the support information, executes an operation corresponding to an item falling at the contact position where the contact is detected.
According to this configuration, when the proximity of the finger to the screen is detected, related information or support information regarding an item (for example, a button) displayed at the proximity correspondence position, which is the position on the screen located directly below the finger in the vertical direction, is displayed at the proximity correspondence position or in the vicinity thereof. It is therefore possible to explicitly display support information or related information regarding an input operation target when proximity is detected, before a user selects and fixes the input operation target, so as to support the user's accurate input operation, thereby improving operability.
A display control method according to one aspect of the present invention is a display control method for a display control device which displays data on a screen, and the method includes the steps of: detecting proximity of a finger to the screen and outputting a proximity detection signal; displaying related information or support information regarding an item displayed at a proximity correspondence position on the screen, at the proximity correspondence position or in the vicinity of the proximity correspondence position with reference to the proximity detection signal, wherein the proximity correspondence position is a position on the screen corresponding to the proximity detection signal for the finger whose proximity is detected; detecting a contact of the finger on the screen; and executing an operation corresponding to an item falling at the proximity correspondence position or the contact position where the contact is detected, in accordance with the proximity of the finger to, or the contact of the finger on, the related information or the support information.
According to this method, when the proximity of the finger to the screen is detected, related information or support information regarding an item (for example, a button) displayed at the proximity correspondence position, which is the position on the screen located directly below the finger in the vertical direction, is displayed at the proximity correspondence position or in the vicinity thereof. It is therefore possible to explicitly display support information or related information regarding an input operation target when proximity is detected, before a user selects and fixes the input operation target, so as to support the user's accurate input operation, thereby improving operability.
An aspect of the present invention is a program causing a computer which is a display control device which displays data on a screen, to execute the steps of: detecting proximity of a finger to the screen and outputting a proximity detection signal; displaying related information or support information regarding an item displayed at a proximity correspondence position on the screen, at the proximity correspondence position or in the vicinity of the proximity correspondence position with reference to the proximity detection signal, wherein the proximity correspondence position is a position on the screen corresponding to the proximity detection signal for the finger whose proximity is detected; detecting a contact of the finger on the screen; and executing an operation corresponding to an item falling at the proximity correspondence position or the contact position where the contact is detected, in accordance with the proximity of the finger to, or the contact of the finger on, the related information or the support information.
According to this program, when the proximity of the finger to the screen is detected, related information or support information regarding an item (for example, a button) displayed at the proximity correspondence position, which is the position on the screen located directly below the finger in the vertical direction, is displayed at the proximity correspondence position or in the vicinity thereof. It is therefore possible to explicitly display support information or related information regarding an input operation target when proximity is detected, before a user selects and fixes the input operation target, so as to support the user's accurate input operation, thereby improving operability.
According to the present invention, it is possible to explicitly display support information or related information regarding an input operation target when proximity is detected, before a user selects and fixes the input operation target, so as to support the user's accurate input operation, thereby improving operability.
Hereinafter, respective embodiments of a display control device, a display control method, and a program related to the present invention will be described with reference to the drawings. The display control device related to the present invention is an electronic apparatus including a display unit which displays data on a screen, and is, for example, a mobile phone, a smartphone, a tablet terminal, a digital still camera, a personal digital assistant (PDA), or an electronic book terminal. Hereinafter, a description will be made by using a portable terminal (for example, a smartphone) as an example of the display control device for describing each embodiment.
In addition, the present invention can be represented as a display control device in terms of a device, or a program causing a computer to function as the display control device in order to execute each of the operations (steps) executed by the display control device. Further, the present invention can be represented as a display control method including each of the operations (steps) executed by the display control device. In other words, the present invention can be represented as any category of device, method, and program.
In addition, in the following description, an item which receives a user's input operation and allows a content item of an application displayed on the screen of a display unit (for example, an LCD or an organic EL display) of the portable terminal to be selected, or an item for performing a predetermined operation related to a content item through such selection, is defined as a “button”. The predetermined operation is, for example, an operation of executing, in an application, the content related to a currently displayed content item (for example, an operation of reproducing video data).
A “button”, as a portion through which a user gives an instruction for operating an application or changing settings, may be hyperlinked text (for example, the headline of a news item in a case where the headline is displayed as a content item of the application), an image (for example, an icon) for prompting the user to perform a selection operation, or a combination of text and an image. The portable terminal receives selection of, for example, a “headline of news” corresponding to a button as an input operation on the button in response to a user's input operation, and displays details of the news corresponding to the selected button. In addition, the “button” is specified according to the application which is activated in the portable terminal.
In the following description, a user's finger (for example, the index finger) is used as an example of an indication medium (detection target) on the touch panel, but the indication medium is not limited to the finger, and a conductive stylus gripped by a user's hand may be used. In addition, the indication medium (detection target) is not particularly limited as long as the touch panel can detect its proximity and touch (contact), according to the structure and detection method of the touch panel.
In addition, the two axes representing the horizontal surface of the touch panel are set as an x axis and a y axis, and the axis representing the direction perpendicular to the horizontal surface of the touch panel (the height direction) is set as a z axis. Further, in the following description, “coordinates” indicate either touch coordinates (x, y), specified by a position on the horizontal surface of the touch panel when the touch panel detects touching (contact) of the finger, or proximity coordinates (x, y, z), specified by a position in the space when the touch panel detects the proximity of the finger. The z coordinate value of the proximity coordinates indicates the height of the finger above the horizontal surface of the touch panel.
In addition, in the following description, an operation of holding the finger at a position within the proximity detection region in the space above the horizontal surface of the touch panel is defined as a “hover operation”, and an operation of sliding (moving) the finger approximately in parallel to the horizontal surface of the touch panel is defined as a “hover-slide operation”. Therefore, an operation in which the finger directly touches the surface of the touch panel is not a “hover operation” but a “touch operation”. Further, an operation in which the finger touches (comes into contact with) the horizontal surface of the touch panel and is then slid (moved) along the horizontal surface is defined as a “touch-slide operation”.
Still further, in order to detect a hover operation or a hover-slide operation, the distance between the finger and the touch panel preferably corresponds to the range of capacitance values which the touch panel can detect, since the distance is in inverse proportion to the detected capacitance value. For this reason, a proximity detection region (z coordinate value: z3) in which the proximity of the finger can be detected is set in advance in the touch panels provided in the portable terminals of the following respective embodiments (refer to
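The coordinate conventions and operation definitions above can be summarized in a short sketch. The following Python fragment classifies a detected finger position into the operations just defined; the value of the proximity detection region z3 is an illustrative assumption.

```python
Z3 = 20.0  # height of the proximity detection region (illustrative units)

def classify_operation(z: float, moving: bool) -> str:
    """Name the operation for a finger detected at height z above the panel."""
    if z == 0.0:
        return "touch-slide operation" if moving else "touch operation"
    if z <= Z3:
        return "hover-slide operation" if moving else "hover operation"
    return "out of the proximity detection region"

print(classify_operation(0.0, False))  # touch operation
print(classify_operation(12.5, True))  # hover-slide operation
```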
In a first embodiment, in a case where the finger approaches any one of the keys of a QWERTY type software keyboard displayed on a screen, a portable terminal 1 enlargedly displays the key displayed at the position on the screen corresponding to the proximity coordinates of the finger with which a hover operation or a hover-slide operation is being performed (hereinafter, referred to as the “proximity correspondence position”), together with all keys adjacent to that key, as related information or support information regarding the item (for example, an alphabet “C” key) displayed at the proximity correspondence position.
First, with reference to
Each of the key position/finger position determination unit 30, the keyboard application 40, the key image generation unit 51, the enlarged key image generation unit 52, and the key image combining unit 53 can be operated by a processor (not illustrated) built into the portable terminal 1 reading and executing a program related to the present invention. In addition, the processor is, for example, a central processing unit (CPU), a micro-processing unit (MPU), or a digital signal processor (DSP), and the same applies to the following respective embodiments.
The proximity detection unit 10 detects that a user's finger FG (refer to
The proximity coordinate evaluation unit 11 calculates proximity coordinates (x, y, z) of the finger FG to the touch panel 15 as a proximity detection signal during the detection of the proximity on the basis of the proximity notification sent from the proximity detection unit 10. In addition, in the following description, a proximity detection signal will be described by using proximity coordinates, but a capacitance value calculated when the proximity is detected may be used. As described above, in the proximity coordinates (x, y, z), the x component and the y component are coordinate values indicating a position on the horizontal surface of the touch panel 15 mounted on a screen DP (refer to
The touch detection unit 20 as a contact detection unit detects an operation in which the finger FG touches (comes into contact with) the touch panel 15 through a touch operation or a touch-slide operation. The touch detection unit 20 sends a contact notification indicating that the finger FG touches (comes into contact with) the touch panel 15 to the touch coordinate evaluation unit 21.
The touch coordinate evaluation unit 21 calculates touch coordinates (x,y) when the finger FG touches (comes into contact with) the touch panel 15 on the basis of the contact notification sent from the touch detection unit 20. The touch coordinate evaluation unit 21 outputs information regarding the calculated touch coordinates (x,y) to the key position/finger position determination unit 30. In addition, the touch detection unit 20 and the touch coordinate evaluation unit 21 may be collectively configured as a touch detection unit.
In the respective embodiments including the present embodiment, the touch panel 15 which can detect both the touching (contact) and the proximity of the finger FG may be configured by using the proximity detection unit 10, the proximity coordinate evaluation unit 11, the touch detection unit 20, and the touch coordinate evaluation unit 21.
The key position/finger position determination unit 30 determines, from the x coordinate value and the y coordinate value of the proximity coordinates (x, y, z), the key (hereinafter, referred to as the “proximity correspondence key”) which is displayed at the proximity correspondence position of the finger FG with which the hover operation or the hover-slide operation is being performed, on the basis of the proximity coordinates (x, y, z) output from the proximity coordinate evaluation unit 11 and the display key position information 31.
In a case where a QWERTY type software keyboard is displayed on the screen DP (refer to
In addition, the display key position information 31 is held in the key position/finger position determination unit 30 but may be held in the keyboard application 40. In this case, the key position/finger position determination unit 30 outputs the information regarding the proximity coordinates (x, y, z) which is output from the proximity coordinate evaluation unit 11, to the keyboard application 40. The keyboard application 40 determines information regarding a proximity correspondence key displayed at a proximity correspondence position on the basis of the display key position information 31 and the information output from the key position/finger position determination unit 30, and outputs the determined information to the key position/finger position determination unit 30.
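As a rough illustration of this lookup, the display key position information 31 can be modeled as a table of key display regions which is searched with the x and y components of the proximity coordinates. The layout values below are assumptions for illustration, not the actual coordinates used by the portable terminal 1.

```python
# display key position information 31, modeled as key -> (x0, y0, x1, y1)
DISPLAY_KEY_POSITION_INFO = {
    "X": (40, 120, 80, 160),
    "C": (80, 120, 120, 160),
    "V": (120, 120, 160, 160),
}

def proximity_correspondence_key(x: float, y: float) -> str | None:
    """Return the key displayed at (x, y); the z coordinate is not needed."""
    for key, (x0, y0, x1, y1) in DISPLAY_KEY_POSITION_INFO.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return key
    return None

print(proximity_correspondence_key(95, 140))  # "C"
```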
The key position/finger position determination unit 30 determines the proximity correspondence key, then inquires of the keyboard application 40 information regarding the number of keys which are enlargedly displayed simultaneously with the determined proximity correspondence key, and outputs, to the enlarged key image generation unit 52, an enlarged key image generation instruction indicating that enlarged key images (data) of a plurality of keys, including the proximity correspondence key, are to be generated.
The keyboard application 40 as an operation execution unit is an application which is stored in advance in a read only memory (ROM) built into the portable terminal 1, causes the key image generation unit 51 to generate screen data of, for example, a QWERTY type software keyboard, and receives a user's input operation (for example, a text input operation) on the QWERTY type software keyboard displayed on the screen DP.
The keyboard application 40 outputs the information regarding the number of keys which are enlargedly displayed simultaneously with the proximity correspondence key to the key position/finger position determination unit 30 as response information to the inquiry from the key position/finger position determination unit 30 on the basis of the simultaneous enlargement unit setting information 41.
The simultaneous enlargement unit setting information 41 is information indicating the number of keys which are enlargedly displayed simultaneously with the proximity correspondence key, and is information indicating “all” keys adjacent to the proximity correspondence key in the present embodiment.
For example, if the finger FG approaches the “C” key displayed on the screen DP, the “C” key becomes the proximity correspondence key. The keyboard application 40 outputs, to the key position/finger position determination unit 30, number information (=four) indicating that the keys to be enlargedly displayed simultaneously with the proximity correspondence key are all keys adjacent to the proximity correspondence key, as response information, on the basis of the simultaneous enlargement unit setting information 41. The key position/finger position determination unit 30 determines that the keys enlargedly displayed simultaneously with the proximity correspondence key (the “C” key) are the “X”, “V”, “D” and “F” keys on the basis of the response information from the keyboard application 40 and the display key position information 31. Further, the key position/finger position determination unit 30 determines that the proximity correspondence key (the “C” key) and the keys (the “X”, “V”, “D” and “F” keys) enlargedly displayed simultaneously with the proximity correspondence key are the targets to be generated by the enlarged key image generation unit 52. The key position/finger position determination unit 30 outputs an enlarged key image generation instruction for generating enlargedly displayed key images of the proximity correspondence key (the “C” key) and all the keys (the “X”, “V”, “D” and “F” keys) adjacent to the proximity correspondence key, to the enlarged key image generation unit 52.
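The determination described above amounts to an adjacency lookup on the keyboard layout. A minimal Python sketch, modeling the simultaneous enlargement setting as a precomputed adjacency table for a fragment of the QWERTY layout (the table itself is an illustrative assumption):

```python
# adjacency table for a fragment of the QWERTY layout (illustrative)
ADJACENT_KEYS = {
    "C": ["X", "V", "D", "F"],
    "V": ["C", "B", "F", "G"],
}

def keys_to_enlarge(proximity_key: str) -> list[str]:
    """The proximity correspondence key plus all keys enlarged with it."""
    return [proximity_key] + ADJACENT_KEYS.get(proximity_key, [])

print(keys_to_enlarge("C"))  # ['C', 'X', 'V', 'D', 'F']
```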
Therefore, as illustrated in
Hereinafter, in the QWERTY type software keyboard, a key which is enlargedly displayed is referred to as an “enlargedly displayed key”. In addition, a part of the QWERTY type software keyboard (specifically, all of the “X”, “C”, “V”, “D” and “F” keys, and parts of “Z”, “S”, “E”, “R”, “T”, “G” and “B” keys) is hidden by the region AR2 of all the enlargedly displayed keys which are enlargedly displayed (refer to the right side of the arrow of
Next, the drawing change setting information 42 is information indicating the timing at which the display content of all enlargedly displayed keys changes according to the movement of the finger FG in a case where the finger FG with which a hover operation is being performed is moved in parallel to the screen DP (a hover-slide operation), and is predefined in the operation of the keyboard application 40. However, the drawing change setting information 42 may be changed as appropriate by a user's setting changing operation.
The drawing change setting information 42 is any one of information pieces indicating the following three kinds of timings. This will be described in detail with reference to
The first timing is a timing at which the finger FG exceeds a point P1 on a boundary BR1 of the display region of the “C” key before being enlargedly displayed when the finger FG with which a hover operation is being performed on the “C” key is moved (a hover-slide operation) in the direction of an arrow RW. In other words, the key position/finger position determination unit 30 outputs the information regarding the proximity coordinates (x, y, z) output from the proximity coordinate evaluation unit 11, to the keyboard application 40. The keyboard application 40 outputs information indicating that the proximity coordinates (x, y, z) satisfy the first timing of the drawing change setting information 42 to the enlarged key image generation unit 52 on the basis of the proximity coordinates (x, y, z) output from the key position/finger position determination unit 30. Consequently, the portable terminal 1 enables an expert user, who can perform consecutive input operations even if the finger FG does not hover out, to understand and recognize a change in the display content of the enlargedly displayed keys at the time when the finger exceeds the boundary of the key as displayed at the original magnification before enlargement. Therefore, it is possible to improve the user's operability.
The second timing is a timing at which the finger FG exceeds a point P2 on a boundary BR2 of a display region of the “C” key after being enlargedly displayed when the finger FG with which a hover operation is being performed on the “C” key is moved (a hover-slide operation) in the direction of the arrow RW. In other words, the key position/finger position determination unit 30 outputs the information regarding the proximity coordinates (x, y, z) output from the proximity coordinate evaluation unit 11, to the keyboard application 40. The keyboard application 40 outputs information indicating that the proximity coordinates (x, y, z) satisfy the second timing of the drawing change setting information 42 to the enlarged key image generation unit 52 on the basis of the proximity coordinates (x, y, z) output from the key position/finger position determination unit 30. Consequently, the portable terminal 1 enables even a user who is not an expert to explicitly understand and recognize a change in display content of the enlargedly displayed key at the time when the finger exceeds the boundary BR2 of the enlargedly displayed key “C” which is actually displayed on the screen DP. Therefore, it is possible to reduce operation errors by the user and thus to improve the user's operability.
The third timing is a timing at which the finger FG exceeds a point P3 on a boundary corresponding to an outer line of a display region of the region AR2 of all the enlargedly displayed keys when the finger FG with which a hover operation is being performed on the “C” key is moved (a hover-slide operation) in the direction of the arrow RW. In other words, the key position/finger position determination unit 30 outputs the information regarding the proximity coordinates (x, y, z) output from the proximity coordinate evaluation unit 11, to the keyboard application 40. The keyboard application 40 outputs information indicating that the proximity coordinates (x, y, z) satisfy the third timing of the drawing change setting information 42 to the enlarged key image generation unit 52 on the basis of the proximity coordinates (x, y, z) output from the key position/finger position determination unit 30. Consequently, the portable terminal 1 enables a user such as a beginner who is not accustomed to a hover operation or a hover-slide operation to explicitly understand and recognize a change in display content of the enlargedly displayed key at the time when the finger exceeds the outer line boundary of the region AR2 of the enlargedly displayed keys which are actually displayed on the screen DP. Therefore, it is possible to reduce operation errors by the user and thus to improve the user's operability.
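The three timings differ only in which boundary the finger must cross before the redraw is triggered. A minimal sketch, assuming illustrative rectangles for the boundary BR1, the boundary BR2, and the outer line of the region AR2:

```python
# boundary rectangles (x0, y0, x1, y1) for the "C" key example (illustrative)
BOUNDARIES = {
    1: (80, 120, 120, 160),   # first timing: original key region (BR1)
    2: (60, 100, 140, 180),   # second timing: enlarged key region (BR2)
    3: (20, 60, 180, 220),    # third timing: outer line of region AR2
}

def redraw_required(timing_setting: int, x: float, y: float) -> bool:
    """True when the hover-sliding finger has crossed the chosen boundary."""
    x0, y0, x1, y1 = BOUNDARIES[timing_setting]
    return not (x0 <= x < x1 and y0 <= y < y1)

print(redraw_required(1, 125, 140))  # True: past BR1 (expert-user setting)
print(redraw_required(2, 125, 140))  # False: still inside the enlarged key
```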
As mentioned above, since the movement speed of the finger FG with respect to the touch panel 15 during a hover-slide operation differs depending on the user's hover operation or hover-slide operation skill, the portable terminal 1 sets the drawing change setting information 42 which is suitable for the skill or preference of the user who uses it, and can thus change the enlargedly displayed keys appropriately in consideration of convenience for the user.
Next, the enlarged display position setting information 43 is information indicating a display position of an enlargedly displayed key as the center when display content of enlargedly displayed keys changes according to the movement of the finger FG in a case where the proximity coordinates (x, y, z) satisfy the timing of the drawing change setting information 42.
The enlarged display position setting information 43 is information indicating the following two display positions. This will be described in detail with reference to
A first display position is a display position of a key (the “V” key) before being enlargedly displayed, which is displayed at a proximity correspondence position of the finger FG after being moved (a hover-slide operation) in a case where the finger FG is moved (a hover-slide operation) as the above-described premise. In other words, the key position/finger position determination unit 30 outputs the information regarding the proximity coordinates (x, y, z) output from the proximity coordinate evaluation unit 11, to the keyboard application 40, in a case where the finger FG is moved (a hover-slide operation) as the above-described premise. Further, the keyboard application 40 outputs, to the enlarged key image generation unit 52, information indicating that information regarding a display position of an enlargedly displayed key as the center when display content of an enlargedly displayed key changes according to the movement of the finger FG is “a display position of the “V” key before being enlargedly displayed, which is displayed at the proximity correspondence position of the finger FG after being moved (a hover-slide operation)”, when the proximity coordinates (x, y, z) satisfy the first timing of the drawing change setting information 42, on the basis of the proximity coordinates (x, y, z) output from the key position/finger position determination unit 30. Therefore, in a case where the finger FG is moved (a hover-slide operation) as the above-described premise, the portable terminal 1 enlargedly displays (draws) the “V” key as a proximity correspondence key and all keys (the “C”, “B”, “F” and “G” keys) adjacent to the “V” key centering on the display position of the “V” key before being enlargedly displayed, which is displayed at the proximity correspondence position of the finger FG after being moved (a hover-slide operation) (refer to the upper right part of
A second display position is a display position of an enlargedly displayed key which is displayed at a proximity correspondence position of the finger FG after being moved among initial display positions of all enlargedly displayed keys which have been displayed before the movement (a hover-slide operation) of the finger FG in a case where the finger FG is moved (a hover-slide operation) as the above-described premise. In addition, in a case where the enlarged display position setting information 43 is information indicating a second display position, display of enlargedly displayed keys of all keys (the “B” and “G” keys) adjacent in the movement (a hover-slide operation) direction of the “V” key after being enlargedly displayed, which is displayed at the proximity correspondence position of the finger FG after being moved, is added (continuously performed), and the display of the enlargedly displayed keys of all the keys (the “C” and “F” keys) adjacent on an opposite side to the same movement (a hover-slide operation) direction is erased. In other words, the key position/finger position determination unit 30 outputs the information regarding the proximity coordinates (x, y, z) output from the proximity coordinate evaluation unit 11, to the keyboard application 40, in a case where the finger FG is moved (a hover-slide operation) as the above-described premise. Further, the keyboard application 40 outputs, to the enlarged key image generation unit 52, information indicating that information regarding a display position of an enlargedly displayed key as the center when display content of an enlargedly displayed key changes according to the movement of the finger FG is “an initial display position of an enlargedly displayed key (the “V” key) which is displayed at a proximity correspondence position of the finger FG after being moved (a hover-slide operation) among initial display positions of all enlargedly displayed keys which have been displayed before the movement (a hover-slide operation) of the finger FG”, when the proximity coordinates (x, y, z) satisfy the first timing of the drawing change setting information 42, on the basis of the proximity coordinates (x, y, z) output from the key position/finger position determination unit 30. Therefore, in a case where the finger FG is moved (a hover-slide operation) as the above-described premise, the portable terminal 1 additionally displays the enlargedly displayed keys of all the keys (the “B” and “G” keys) adjacent in the movement (a hover-slide operation) direction and erases the enlargedly displayed keys of all the keys (the “C” and “F” keys) adjacent on an opposite side to the movement (a hover-slide operation) direction centering on the initial display position of the proximity correspondence key (the “V” key) of the finger FG after being moved (a hover-slide operation) (refer to the lower right part of
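The difference between the two display positions reduces to which coordinates are used as the center of the redraw. A minimal Python sketch, with illustrative center coordinates for the keys involved:

```python
# illustrative center coordinates of the keys before enlargement and of the
# enlargedly displayed keys as initially drawn
ORIGINAL_CENTERS = {"C": (100, 140), "V": (140, 140)}
INITIAL_ENLARGED_CENTERS = {"C": (100, 140), "V": (180, 140)}

def redraw_center(display_position: int, new_key: str) -> tuple[int, int]:
    """Center used when the enlarged keys are redrawn after a hover-slide."""
    if display_position == 1:  # first display position: follow the finger
        return ORIGINAL_CENTERS[new_key]
    return INITIAL_ENLARGED_CENTERS[new_key]  # second: keep initial layout

print(redraw_center(1, "V"))  # (140, 140): less finger movement needed
print(redraw_center(2, "V"))  # (180, 140): existing keys stay where drawn
```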
As mentioned above, the portable terminal 1 sets the enlarged display position setting information 43 which is suitable for the skill or preference of the user who uses it. When the enlarged display position setting information 43 indicates the first display position, the movement amount of the finger FG required when the display content of the enlargedly displayed keys is changed according to the movement (a hover-slide operation) of the finger FG is smaller than in the case of the enlarged display position setting information 43 indicating the second display position, and thus the user's operation is simplified.
In addition, according to the enlarged display position setting information 43 indicating the second display position, the portable terminal 1 additionally displays all keys adjacent in a movement direction of the finger FG without changing display positions of enlargedly displayed keys which have already been displayed, and deletes all keys adjacent in an opposite direction to the movement direction, and thus it is possible to improve visibility of all enlargedly displayed keys after the movement (a hover-slide operation) of the finger FG.
The screen display unit 50, which is configured by using, for example, an LCD or an organic EL display, has a function of displaying data on the screen DP, and displays screen data output from the key image combining unit 53 which will be described later, on the screen DP. In the present embodiment, the screen data displayed by the screen display unit 50 is data regarding a screen of a QWERTY type software keyboard, or data regarding a screen in which a screen of the QWERTY type software keyboard is combined with enlargedly displayed keys.
The key image generation unit 51 generates screen data of a QWERTY type software keyboard (refer to
The enlarged key image generation unit 52 generates image data of an enlargedly displayed key on the basis of an enlarged key image generation instruction output from the key position/finger position determination unit 30, and outputs the image data to the key image combining unit 53. In addition, the enlarged key image generation unit 52 generates image data of enlargedly displayed keys after the movement (a hover-slide operation) of the finger FG on the basis of the enlarged key image generation instruction output from the key position/finger position determination unit 30 and each of the information pieces output from the keyboard application 40, and outputs the generated image data to the key image combining unit 53. Further, each of the information pieces output from the keyboard application 40 includes information that the drawing change setting information 42 is satisfied, and information regarding a display position of an enlargedly displayed key as a center when display content of an enlargedly displayed key changes according to a movement of the finger FG. Still further, the enlarged key image generation unit 52 outputs the information regarding a display position of an enlargedly displayed key as a center among all enlargedly displayed keys output from the keyboard application 40, to the key image combining unit 53.
The key image combining unit 53 as a display control unit combines the screen data of the QWERTY type software keyboard output from the key image generation unit 51 with the image data of the enlargedly displayed key output from the enlarged key image generation unit 52, and displays the combined screen data at a predetermined position on the screen DP of the screen display unit 50 on the basis of the information regarding the display position of the enlargedly displayed key as a center among the enlargedly displayed keys output from the enlarged key image generation unit 52. The predetermined position is a predetermined position based on the display position of the enlargedly displayed key as a center, output from the enlarged key image generation unit 52. In addition, in a case where image data of an enlargedly displayed key is not input from the enlarged key image generation unit 52, the key image combining unit 53 displays the screen data of the QWERTY type software keyboard output from the key image generation unit 51 on the screen DP of the screen display unit 50 as it is.
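The combining step can be pictured as a simple overlay of one image onto another. The following sketch models both images as nested lists of pixel values; the representation and helper name are assumptions for illustration only.

```python
def combine(keyboard, enlarged, top_left):
    """Overlay the enlarged-key image onto a copy of the keyboard image."""
    if enlarged is None:
        return keyboard  # nothing to combine: display the keyboard as it is
    out = [row[:] for row in keyboard]
    r0, c0 = top_left
    for r, row in enumerate(enlarged):
        for c, pixel in enumerate(row):
            out[r0 + r][c0 + c] = pixel
    return out

screen = combine([[0] * 4 for _ in range(3)], [[9, 9]], (1, 1))
print(screen)  # [[0, 0, 0, 0], [0, 9, 9, 0], [0, 0, 0, 0]]
```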
Further, in a case where the combined screen data is displayed on the screen DP of the screen display unit 50, the key image combining unit 53 temporarily stores information indicating that the enlargedly displayed key is currently being displayed on the screen DP, in a random access memory (RAM) (not illustrated) built into the portable terminal 1. This information is referred to by, for example, the key position/finger position determination unit 30.
In
After step S11, an operation of the portable terminal 1 proceeds to step S12. Details of an operation in step S12 will be described later with reference to
After step S12, the portable terminal 1 waits for the touch detection unit 20 to detect touching (contact) of the finger FG on the touch panel 15 (step S13). If the touch detection unit 20 detects touching (contact) of the finger FG on the touch panel 15 (YES in step S13), the touch coordinate evaluation unit 21 calculates touch coordinates (x,y) of the finger FG on the touch panel 15 on the basis of a contact notification sent from the touch detection unit 20. The touch coordinate evaluation unit 21 outputs information regarding the calculated touch coordinates (x,y) to the key position/finger position determination unit 30.
The key position/finger position determination unit 30 fixes a key of the QWERTY type software keyboard displayed at a position of the touch coordinates (x,y) as a key corresponding to a user's input operation target on the basis of the information regarding the touch coordinates (x,y) output from the touch coordinate evaluation unit 21 (step S14). The key position/finger position determination unit 30 outputs information regarding the key displayed at the position of the touch coordinates (x,y) and an operation execution instruction for executing an operation (for example, a text input operation) corresponding to the key, to the keyboard application 40.
The keyboard application 40 executes an operation (for example, a text input operation) corresponding to the key fixed as a key corresponding to a user's input operation target on the basis of the operation execution instruction output from the key position/finger position determination unit 30. Then, the operation of the portable terminal 1 returns to step S12.
On the other hand, if the touch detection unit 20 does not detect touching (contact) of the finger FG on the touch panel 15 (NO in step S13), the key position/finger position determination unit 30 causes the keyboard application 40 to determine whether or not the finger FG with which a hover operation is being performed at the proximity correspondence position is moved (a hover-slide operation) so as to exceed the proximity correspondence key, that is, whether or not the proximity coordinates (x, y, z) satisfy any one of the timings specified by the drawing change setting information 42, as illustrated in
If the keyboard application 40 determines that the finger FG with which a hover operation is being performed is not moved (a hover-slide operation) so as to exceed the proximity correspondence key (NO in step S15), display content of the enlargedly displayed key which is currently being displayed does not change even if there is the movement (a hover-slide operation) of the finger FG, and the operation of the portable terminal 1 returns to step S13.
On the other hand, if the keyboard application 40 determines that the finger FG with which a hover operation is being performed is moved (a hover-slide operation) so as to exceed the proximity correspondence key (YES in step S15), display content of the enlargedly displayed key which is currently being displayed changes according to the movement (a hover-slide operation) of the finger FG, and the operation of the portable terminal 1 returns to step S12.
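The flow of steps S11 to S15 can be summarized as a single event loop. In the following sketch, the helper functions stand in for the units described above; they are illustrative stubs rather than the portable terminal 1's actual interfaces.

```python
def show_enlarged_keys(xyz):                # step S12 (detailed above)
    print("enlarge around", xyz[:2])

def key_at(xy):                             # key fixed from touch coordinates
    return "C"

def execute_key_operation(key):             # e.g. a text input operation
    print("input", key)

def crossed_boundary(xyz):                  # drawing change setting satisfied?
    return True

def run(events):
    for ev in events:
        if ev["type"] == "proximity":                  # S11: proximity detected
            show_enlarged_keys(ev["xyz"])              # S12: enlarged display
        elif ev["type"] == "touch":                    # S13 YES: contact
            execute_key_operation(key_at(ev["xy"]))    # S14: fix key, execute
        elif ev["type"] == "hover_slide" and crossed_boundary(ev["xyz"]):
            show_enlarged_keys(ev["xyz"])              # S15 YES: back to S12
        # otherwise (S15 NO): display content unchanged; keep waiting

run([{"type": "proximity", "xyz": (95, 140, 10)},
     {"type": "touch", "xy": (95, 140)}])
```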
Next, in
If it is determined that the enlargedly displayed key has not already been displayed (NO in step S12-1), the key position/finger position determination unit 30 determines a proximity correspondence key which is displayed at coordinates (x,y) of the proximity correspondence position on the basis of the information regarding the proximity coordinates (x, y, z) output from the proximity coordinate evaluation unit 11 and the display key position information 31. The key position/finger position determination unit 30 determines the proximity correspondence key and then inquires of the keyboard application 40 information regarding the number of keys which are enlargedly displayed simultaneously with the determined proximity correspondence key.
The keyboard application 40 outputs, to the key position/finger position determination unit 30, information regarding the number (=four) of keys which are enlargedly displayed simultaneously with the proximity correspondence key, as response information to the inquiry from the key position/finger position determination unit 30, on the basis of the simultaneous enlargement unit setting information 41.
The key position/finger position determination unit 30 determines that the keys enlargedly displayed simultaneously with the proximity correspondence key (the “C” key) are “X”, “V”, “D” and “F” keys on the basis of the response information from the keyboard application 40 and the display key position information 31. Further, the key position/finger position determination unit 30 determines that the proximity correspondence key (the “C” key) and the keys (the “X”, “V”, “D” and “F” keys) enlargedly displayed simultaneously with the proximity correspondence key are targets generated by the enlarged key image generation unit 52. The key position/finger position determination unit 30 outputs an enlarged key image generation instruction for generating enlargedly displayed key images of the proximity correspondence key (the “C” key) and all the keys (the “X”, “V”, “D” and “F” keys) adjacent to the proximity correspondence key, to the enlarged key image generation unit 52.
The enlarged key image generation unit 52 generates image data of enlargedly displayed keys on the basis of the enlarged key image generation instruction output from the key position/finger position determination unit 30, and outputs the image data to the key image combining unit 53. The key image combining unit 53 combines the screen data of the software keyboard output from the key image generation unit 51 with the image data of the enlargedly displayed keys output from the enlarged key image generation unit 52, and displays the combined screen data on the screen DP of the screen display unit 50 (step S12-2). In other words, the key image combining unit 53 displays the enlargedly displayed keys generated by the enlarged key image generation unit 52 on the screen DP of the screen display unit 50, centering on the display position of the proximity correspondence key (the “C” key) before being enlargedly displayed with respect to the software keyboard displayed on the screen DP of the screen display unit 50 (refer to
On the other hand, if it is determined that the enlargedly displayed key has already been displayed (YES in step S12-1), the key position/finger position determination unit 30 determines a proximity correspondence key which is displayed at a proximity correspondence position of the finger FG after being moved (a hover-slide operation) on the basis of the information regarding the proximity coordinates (x, y, z) output from the proximity coordinate evaluation unit 11 and the display key position information 31.
After the proximity correspondence key is determined, the key position/finger position determination unit 30 outputs information regarding the proximity correspondence key to the keyboard application 40, and inquires of the keyboard application 40 the enlarged display position setting information 43 for specifying the display position of the enlargedly displayed key as a center when the display content of the enlargedly displayed keys changes according to a movement (a hover-slide operation) of the finger FG (step S12-3). In response to the inquiry from the key position/finger position determination unit 30, the keyboard application 40 refers to the enlarged display position setting information 43 and outputs, to the enlarged key image generation unit 52, information regarding the display position of the proximity correspondence key as a center when the display content of the enlargedly displayed keys changes according to a movement (a hover-slide operation) of the finger FG. Further, the keyboard application 40 outputs, to the key position/finger position determination unit 30, number information (=four) indicating that the keys to be enlargedly displayed simultaneously with the proximity correspondence key are all keys adjacent to the proximity correspondence key, as response information to the inquiry from the key position/finger position determination unit 30, on the basis of the simultaneous enlargement unit setting information 41.
In addition, the key position/finger position determination unit 30 determines that the keys enlargedly displayed simultaneously with the proximity correspondence key (the “C” key) are “X”, “V”, “D” and “F” keys on the basis of the response information from the keyboard application 40 and the display key position information 31. Further, the key position/finger position determination unit 30 determines that the proximity correspondence key (the “C” key) and the keys (the “X”, “V”, “D” and “F” keys) enlargedly displayed simultaneously with the proximity correspondence key are targets generated by the enlarged key image generation unit 52. The key position/finger position determination unit 30 outputs an enlarged key image generation instruction for generating enlargedly displayed key images of the proximity correspondence key (the “C” key) and all the keys (the “X”, “V”, “D” and “F” keys) adjacent to the proximity correspondence key, to the enlarged key image generation unit 52.
If the enlarged display position setting information 43 indicates the first display position, that is, the display position of the key before being enlargedly displayed, which is displayed at the proximity correspondence position of the finger FG after being moved (a hover-slide operation) (the first display position in step S12-3), as illustrated in the upper right part of
The enlarged key image generation unit 52 generates image data of enlargedly displayed keys after the movement (a hover-slide operation) of the finger FG on the basis of the enlarged key image generation instruction output from the key position/finger position determination unit 30 and each of the information pieces output from the keyboard application 40, and outputs the generated image data to the key image combining unit 53. The key image combining unit 53 combines the screen data of the software keyboard output from the key image generation unit 51 with the image data of the enlargedly displayed keys output from the enlarged key image generation unit 52, and displays the combined screen data on the screen DP of the screen display unit 50. In other words, the key image combining unit 53 displays (draws) the enlargedly displayed keys generated by the enlarged key image generation unit 52 on the screen DP of the screen display unit 50, centering on the display position of the proximity correspondence key before being enlargedly displayed after the finger FG is moved, with respect to the software keyboard displayed on the screen DP of the screen display unit 50 (step S12-2).
On the other hand, if the enlarged display position setting information 43 indicates the second display position, that is, a display position with the initial display position of the enlargedly displayed key as a reference (the second display position in step S12-3), as illustrated in the lower right of
The enlarged key image generation unit 52 generates image data of enlargedly displayed keys after the movement (a hover-slide operation) of the finger FG on the basis of the enlarged key image generation instruction output from the key position/finger position determination unit 30 and each of the information pieces output from the keyboard application 40, and outputs the generated image data to the key image combining unit 53. The key image combining unit 53 combines the screen data of the software keyboard output from the key image generation unit 51 with the image data of the enlargedly displayed keys output from the enlarged key image generation unit 52, and displays the combined screen data on the screen DP of the screen display unit 50. In other words, the key image combining unit 53 additionally displays the enlargedly displayed keys of all the keys adjacent in the movement direction and erases the enlargedly displayed keys of all the keys adjacent on an opposite side to the movement direction centering on the initial display position of the proximity correspondence key of the finger FG after being moved with respect to the software keyboard displayed on the screen DP of the screen display unit 50 (step S12-4).
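Step S12 itself can be sketched as a small subroutine that first branches on whether enlarged keys are already displayed (step S12-1) and then on the enlarged display position setting information 43 (step S12-3). The state flag and helper below are assumptions for illustration.

```python
state = {"enlarged_shown": False}

def draw_enlarged(key, center):
    print(f"draw {key} and adjacent keys centered on the {center} position")

def step_s12(proximity_key, position_setting):
    if not state["enlarged_shown"]:                    # S12-1: NO
        draw_enlarged(proximity_key, "original")       # S12-2
        state["enlarged_shown"] = True
    elif position_setting == 1:                        # S12-3: first position
        draw_enlarged(proximity_key, "original")       # S12-2 (follow finger)
    else:                                              # S12-3: second position
        draw_enlarged(proximity_key, "initial")        # S12-4 (keep layout)

step_s12("C", position_setting=2)  # first display: centered on original "C"
step_s12("V", position_setting=2)  # after hover-slide: initial layout kept
```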
As mentioned above, when the proximity of the finger FG to the screen DP on which the QWERTY type software keyboard is displayed is detected, the portable terminal 1 of the present embodiment displays all keys adjacent to the proximity correspondence key in the vicinity of the proximity correspondence position, as related information or support information regarding the key displayed at the proximity correspondence position, which is the position on the screen corresponding to the proximity coordinates of the finger FG with which a hover operation or a hover-slide operation is being performed. Consequently, the portable terminal 1 can explicitly display (draw) the support information or related information regarding the input operation target when the proximity is detected, before the key as the user's input operation target is fixedly selected, so as to support the user's accurate input operation, thereby improving operability.
In a second embodiment, in a case where the finger approaches any one of the keys of a numeric keypad type software keyboard displayed on a screen, a portable terminal 1A displays or enlargedly displays the key (for example, an “あ” key of hiragana characters) displayed at a proximity correspondence position and related keys (for example, the “い”, “う”, “え”, and “お” keys of the hiragana characters) which can be selected in a case where the key is pressed once or multiple times, as related information or support information regarding the item (for example, the “あ” key) displayed at the proximity correspondence position, in the vicinity of the proximity correspondence position (refer to
The portable terminal 1A illustrated in
Each of the key position/finger position determination unit 30A, the keyboard application 40A, the key image generation unit 51, the key image combining unit 53, the display target key determination unit 54, and the display target key image generation unit 55 can be operated by a processor (not illustrated) built into the portable terminal 1A reading and executing a program related to the present invention.
The key position/finger position determination unit 30A outputs information regarding proximity coordinates (x, y, z) output from the proximity coordinate evaluation unit 11, and a determination instruction for determining a key displayed at a proximity correspondence position corresponding to the proximity coordinates (x, y, z), to the keyboard application 40A. In addition, the key position/finger position determination unit 30A outputs a key displayed at a proximity correspondence position of the finger FG whose proximity is detected, and a determination instruction for determining related information regarding the key or a related key used as support information, to the display target key determination unit 54.
The keyboard application 40A as an operation execution unit is an application which is stored in advance in a ROM built into the portable terminal 1A, causes the key image generation unit 51 to generate screen data of, for example, a numeric keypad type software keyboard, and receives a user's input operation (for example, a text input operation) on the numeric keypad type software keyboard displayed on the screen DP.
The keyboard application 40A determines a proximity correspondence key displayed at an x coordinate value and a y coordinate value of the proximity coordinates (x, y, z), that is, at the proximity correspondence position of the finger FG whose proximity is detected, on the basis of the information regarding the proximity coordinates (x, y, z) output from the key position/finger position determination unit 30A, the determination instruction for determining a key displayed at the proximity correspondence position corresponding to the proximity coordinates (x, y, z), and the display key position information 31.
Further, the keyboard application 40A determines related keys used as related information or support information regarding the determined proximity correspondence key on the basis of the related key information 44.
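For a numeric keypad type software keyboard, the related key information 44 can be modeled as a table mapping each key to the characters selectable by pressing it once or multiple times. The following sketch assumes the standard multi-tap hiragana assignment; the table is an illustration, not the source's data.

```python
# related key information 44, modeled as proximity key -> related keys
RELATED_KEY_INFO = {
    "あ": ["い", "う", "え", "お"],
    "か": ["き", "く", "け", "こ"],
}

def related_keys(proximity_key: str) -> list[str]:
    """Related keys displayed near the proximity correspondence key."""
    return RELATED_KEY_INFO.get(proximity_key, [])

print(related_keys("あ"))  # ['い', 'う', 'え', 'お']
```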
The display target key determination unit 54 as a display target item determination unit determines that the related keys are displayed near the proximity correspondence position corresponding to the finger FG whose proximity is detected as related information or support information regarding the proximity correspondence key on the basis of the determination instruction output from the key position/finger position determination unit 30A and the information regarding each of the proximity correspondence key and the related keys output from the keyboard application 40A. The display target key determination unit 54 outputs the information regarding the related keys and a related key image generation instruction for generating image data of the related keys, to the display target key image generation unit 55.
The display target key image generation unit 55 generates the image data of the related keys on the basis of the information regarding the related keys and the related key image generation instruction output from the display target key determination unit 54, and outputs the image data to the key image combining unit 53.
In
The key position/finger position determination unit 30A outputs information regarding proximity coordinates (x, y, z) output from the proximity coordinate evaluation unit 11, and a determination instruction for determining a key displayed at a proximity correspondence position corresponding to the proximity coordinates (x, y, z), to the keyboard application 40A. In addition, the key position/finger position determination unit 30A outputs a key displayed at a proximity correspondence position of the finger FG whose proximity is detected, and a determination instruction for determining related information regarding the key or a related key used as support information, to the display target key determination unit 54.
The keyboard application 40A determines a proximity correspondence key displayed at an x coordinate value and a y coordinate value of the proximity coordinates (x, y, z), that is, at the proximity correspondence position of the finger FG whose proximity is detected, on the basis of the information regarding the proximity coordinates (x, y, z) output from the key position/finger position determination unit 30A, the determination instruction for determining a key displayed at the proximity correspondence position corresponding to the proximity coordinates (x, y, z), and the display key position information 31 (step S22).
Further, the keyboard application 40A determines related keys used as related information or support information regarding the determined proximity correspondence key on the basis of the related key information 44 (step S23). The keyboard application 40A outputs information regarding each of the determined proximity correspondence key and the related keys to the display target key determination unit 54. The display target key determination unit 54 determines that the related keys are displayed near the proximity correspondence position corresponding to the finger FG whose proximity is detected as related information or support information regarding the proximity correspondence key on the basis of the determination instruction output from the key position/finger position determination unit 30A and the information regarding each of the proximity correspondence key and the related keys output from the keyboard application 40A (step S23). The display target key determination unit 54 outputs the information regarding the related keys and a related key image generation instruction for generating image data of the related keys, to the display target key image generation unit 55.
The display target key image generation unit 55 generates the image data of the related keys on the basis of the information regarding the related keys and the related key image generation instruction output from the display target key determination unit 54, and outputs the image data to the key image combining unit 53. The key image combining unit 53 combines screen data of the numeric keypad type software keyboard output from the key image generation unit 51 with the image data of the related keys output from the display target key image generation unit 55, and displays the combined screen data on the screen DP of the screen display unit 50. In other words, the key image combining unit 53 additionally displays (draws) the related keys in the vicinity of the display position of the proximity correspondence key of the finger FG with which a hover operation is being performed on the numeric keypad type software keyboard displayed on the screen DP of the screen display unit 50, as related information or support information regarding the proximity correspondence key (step S24).
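The placement of the additionally drawn related keys near the proximity correspondence key can be sketched as follows; the overlay geometry and the clamping policy are assumptions made for the example only.

```python
# Minimal sketch (assumed geometry): placing the related-key overlay in the
# vicinity of the proximity correspondence key before compositing.
def overlay_position(key_rect, screen_w, screen_h, overlay_w=120, overlay_h=40):
    """Place the overlay just above the key, clamped to the screen DP."""
    x0, y0, x1, y1 = key_rect
    ox = min(max(0, (x0 + x1) // 2 - overlay_w // 2), screen_w - overlay_w)
    oy = max(0, y0 - overlay_h)          # above the key when room permits
    return ox, oy

print(overlay_position((40, 40, 80, 80), 320, 480))  # -> (0, 0)
```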
After step S24, the portable terminal 1A waits for the touch detection unit 20 to detect touching (contact) of the finger FG on the touch panel 15 (step S25). If the touch detection unit 20 detects touching (contact) of the finger FG on the touch panel 15 (YES in step S25), the touch coordinate evaluation unit 21 calculates touch coordinates (x,y) of the finger FG on the touch panel 15 on the basis of a contact notification sent from the touch detection unit 20. The touch coordinate evaluation unit 21 outputs information regarding the calculated touch coordinates (x,y) to the key position/finger position determination unit 30A.
The key position/finger position determination unit 30A fixes a key of the numeric keypad type software keyboard displayed at a position of the touch coordinates (x,y) as a key corresponding to a user's input operation target on the basis of the information regarding the touch coordinates (x,y) output from the touch coordinate evaluation unit 21 (step S26). The key position/finger position determination unit 30A outputs information regarding the key displayed at the position of the touch coordinates (x,y) and an operation execution instruction for executing an operation (for example, a text input operation) corresponding to the key, to the keyboard application 40A. The keyboard application 40A executes an operation (for example, a text input operation) corresponding to the key fixed as a key corresponding to a user's input operation target on the basis of the operation execution instruction output from the key position/finger position determination unit 30A.
On the other hand, if the touch detection unit 20 does not detect touching (contact) of the finger FG on the touch panel 15 (NO in step S25), the key position/finger position determination unit 30A determines whether or not the finger FG with which a hover operation or a hover-slide operation is being performed has moved to another key (a hover-slide operation) on the basis of the information regarding the proximity coordinates (x, y, z) output from the proximity coordinate evaluation unit 11 (step S27).
If it is determined that the finger FG with which a hover operation or a hover-slide operation is being performed has not moved to another key (NO in step S27), the operation of the portable terminal 1A returns to step S25.
On the other hand, if it is determined that the finger FG with which a hover operation or a hover-slide operation is being performed has moved to another key (YES in step S27), the operation of the portable terminal 1A returns to step S22.
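The loop of steps S25 to S27 (waiting for a touch, fixing the touched key, or redrawing when a hover-slide moves the finger to another key) can be sketched as follows; the event stream and the key lookup are stubbed out as assumptions for the example.

```python
# Minimal sketch of the step S25 -> S27 loop; the real terminal reacts to
# touch-panel detections rather than consuming a prepared event list.
def input_loop(events, key_at):
    """events: iterable of ('touch', x, y) or ('hover', x, y, z) tuples."""
    current_key = None
    for ev in events:
        if ev[0] == "touch":                      # step S25: contact detected
            _, x, y = ev
            return key_at(x, y)                   # step S26: fix the key
        _, x, y, z = ev                           # still hovering
        key = key_at(x, y)
        if key != current_key:                    # step S27: hover-slide
            current_key = key                     # back to step S22: redraw
    return None

events = [("hover", 55, 50, 10), ("hover", 60, 20, 9), ("touch", 60, 20)]
print(input_loop(events, lambda x, y: "2" if y < 40 else "5"))  # -> '2'
```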
As mentioned above, when detecting the proximity of the finger FG to the screen DP on which the numeric keypad type software keyboard is displayed, the portable terminal 1A displays related keys which can be selected when a proximity correspondence key is pressed once or multiple times, in the vicinity of a proximity correspondence position, as related information or support information regarding a key displayed at the proximity correspondence position which is a position on the screen corresponding to proximity coordinates of the finger FG. Consequently, the portable terminal 1A can explicitly display (draw) the support information or related information regarding the input operation target when the proximity is detected before a key as a user's input operation target is fixedly selected, so as to support a user's accurate input operation, thereby improving operability.
In a third embodiment, a portable terminal 1B displays an operation execution item (for example, a button) which will be described later and a setting changing item (for example, a button) which will be described later in a transmissive manner with respect to a subject in order to improve a user's visibility for the subject on a preview screen of an application with a camera function.
In addition, in the third embodiment, in a case where the finger approaches a "button for executing a setting changing function" which is displayed in a transmissive manner on the preview screen of the application with a camera function, the portable terminal 1B displays a plurality of setting changing items (for example, buttons) having a setting changing function in the same group corresponding to the corresponding button in a visible display form, in the vicinity of a proximity correspondence position, as related information or support information regarding the button.
Further, in the third embodiment, in a case where the finger approaches a "button for executing an operation function" which is displayed in a transmissive manner on the preview screen of the application with a camera function, the portable terminal 1B displays a button displayed at a proximity correspondence position in a visible display form, and erases the buttons other than the button displayed at the proximity correspondence position from the screen among all items (for example, buttons) displayed on the screen.
Still further, in the third embodiment, an item (for example, a button), which is displayed in a transmissive manner with respect to a subject on a preview screen in order to execute operation functions including an operation of capturing a still image or a moving image and an operation of viewing captured data, is defined as an “operation execution item” in the application with a camera function. Furthermore, an item (for example, a button), which is displayed in a transmissive manner with respect to a subject on a preview screen in order to execute a setting changing function including the content of setting changes which are used as various conditions for an image capturing operation, is defined as a “setting changing item” in the application with a camera function.
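The distinction between the two item kinds defined above can be modeled with a small data structure; the enum and field names below are editorial assumptions for the example, not reference numerals from the specification.

```python
# Minimal sketch of the two item kinds on the camera preview screen.
from enum import Enum
from dataclasses import dataclass

class ItemType(Enum):
    OPERATION_EXECUTION = 1   # capture still/moving image, view captured data
    SETTING_CHANGE = 2        # change image-capturing conditions

@dataclass
class PreviewItem:
    label: str
    item_type: ItemType
    transmissive: bool = True   # drawn translucently over the subject

items = [PreviewItem("still", ItemType.OPERATION_EXECUTION),
         PreviewItem("white balance", ItemType.SETTING_CHANGE)]
print([i.label for i in items if i.item_type is ItemType.SETTING_CHANGE])
```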
The portable terminal 1B includes the button position/finger position determination unit 30B, the camera application 40B, the button execution content determination unit 56, the display variable button image generation unit 57, the camera preview screen generation unit 58, and the camera screen combining unit 59.
Each of the button position/finger position determination unit 30B, the camera application 40B, the button execution content determination unit 56, the display variable button image generation unit 57, the camera preview screen generation unit 58, and the camera screen combining unit 59 can be operated by a processor (not illustrated) built into the portable terminal 1B reading and executing a program related to the present invention.
The button position/finger position determination unit 30B outputs information regarding proximity coordinates (x, y, z) output from the proximity coordinate evaluation unit 11, and a determination instruction for determining a button (hereinafter, referred to as a "proximity correspondence button") displayed at a proximity correspondence position corresponding to the proximity coordinates (x, y, z), to the camera application 40B. In addition, the button position/finger position determination unit 30B outputs a determination instruction for determining whether a button displayed at a proximity correspondence position of the finger FG whose proximity is detected is an operation execution item or a setting changing item, to the button execution content determination unit 56.
The camera application 40B as an operation execution unit is an application which executes operation functions including an operation of capturing still image data or moving image data using a camera mechanism (not illustrated) provided in the portable terminal 1B and an operation of viewing captured data, and a setting changing function including the content of setting changes which are used as various conditions for an image capturing operation, and is stored in advance in a ROM built into the portable terminal 1B. In addition, the camera application 40B causes the camera preview screen generation unit 58 to generate screen data of a preview screen on which a subject is displayed to be checked before an image capturing operation, and receives a user's input operation (for example, an input operation for capturing a still image) on the content of the preview screen on the screen DP.
The camera application 40B determines a proximity correspondence button displayed at an x coordinate value and a y coordinate value of the proximity coordinates (x, y, z), that is, at the proximity correspondence position of the finger FG whose proximity is detected, on the basis of the information regarding the proximity coordinates (x, y, z) output from the button position/finger position determination unit 30B, the determination instruction for determining a button displayed at the proximity correspondence position corresponding to the proximity coordinates (x, y, z), and the display button position information 45.
The display button position information 45 is information for specifying a display position of each of an operation execution item and a setting changing item displayed on the preview screen generated by the camera preview screen generation unit 58. The camera application 40B determines execution content of the proximity correspondence button which is determined by using the display button position information 45, on the basis of the button execution content information 46.
The button execution content information 46 is information for specifying an operation performed in a case where each of the operation execution item and the setting changing item displayed on the preview screen generated by the camera preview screen generation unit 58 is selected through a user's touch operation, or the setting change content. The camera application 40B outputs information regarding the execution content of the proximity correspondence button determined by using the button execution content information 46, to the button execution content determination unit 56.
In addition, in the present embodiment, the button execution content information 46 is held in the camera application 40B, but may be held in the button execution content determination unit 56. In this case, the button execution content determination unit 56 obtains information regarding the proximity correspondence position from the camera application 40B, and determines execution content of the proximity correspondence button on the basis of the button execution content information 46 held in the button execution content determination unit 56.
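For illustration, the following sketch models the display button position information 45 and the button execution content information 46 as two lookup tables; the button names, rectangles, and execution content strings are assumptions made for this example.

```python
# Minimal sketch: the display button position information maps screen
# rectangles to buttons, and the button execution content information maps
# buttons to what selecting them does. All concrete values are assumptions.
DISPLAY_BUTTON_POSITION_INFO = {
    "flash": (0, 0, 48, 48),
    "still": (272, 200, 320, 248),
}
BUTTON_EXECUTION_CONTENT_INFO = {
    "flash": ("setting_change", "toggle flash on/off"),
    "still": ("operation_execution", "capture a still image"),
}

def proximity_correspondence_button(x, y):
    for name, (x0, y0, x1, y1) in DISPLAY_BUTTON_POSITION_INFO.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

def execution_content(x, y):
    name = proximity_correspondence_button(x, y)
    return name, BUTTON_EXECUTION_CONTENT_INFO.get(name)

print(execution_content(10, 10))  # -> ('flash', ('setting_change', ...))
```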
The button execution content determination unit 56 as an item operation determination unit determines whether the proximity correspondence button displayed at the proximity correspondence position corresponding to the finger FG whose proximity is detected is an operation execution item or a setting changing item, on the basis of the determination instruction output from the button position/finger position determination unit 30B and the information regarding the execution content of the proximity correspondence button output from the camera application 40B.
If it is determined that the proximity correspondence button is a setting changing item, the button execution content determination unit 56 outputs, to the display variable button image generation unit 57, information for displaying specific setting change content in a display form which can be visually recognized by a user as related information or support information related to a simple button which is displayed in a transmissive manner with respect to a subject on a preview screen, and a generation instruction for generating button image data of the setting changing item in a visible display form.
If it is determined that the proximity correspondence button is an operation execution item, the button execution content determination unit 56 outputs, to the display variable button image generation unit 57, information for displaying the proximity correspondence button in a display form which can be visually recognized by the user on a preview screen, and an erasure instruction for erasing buttons (including a simple button indicating a setting changing item and operation execution items) other than the operation execution item (for example, a button) displayed at the proximity correspondence position from the screen among all items which are displayed in a transmissive manner with respect to a subject on the preview screen.
The display variable button image generation unit 57 generates button image data in which a display form of the setting changing item (simple button) displayed at the proximity correspondence position corresponding to the finger FG whose proximity is detected on the preview screen is changed from an initial transmissive display form to a display form which can be visually recognized by a user, on the basis of the information output from the button execution content determination unit 56 and the generation instruction. The display variable button image generation unit 57 outputs the generated button image data of the setting changing item to the camera screen combining unit 59.
The display variable button image generation unit 57 generates button image data in which a display form of the operation execution item displayed at the proximity correspondence position corresponding to the finger FG whose proximity is detected on the preview screen is changed from an initial transmissive display form to a display form which can be visually recognized by a user, on the basis of the information output from the button execution content determination unit 56 and the erasure instruction, and also outputs information for erasing remaining operation execution items and setting changing items (simple buttons), to the camera screen combining unit 59.
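The branch performed after the item-kind determination can be sketched as follows: a setting changing item reveals its group of setting changing items, while an operation execution item stays visible and the remaining buttons are erased. The group table and the returned structure are assumptions made for the example.

```python
# Minimal sketch of the display decision after the item-kind determination.
SETTING_GROUPS = {"flash": ["flash auto", "flash on", "flash off"]}

def display_plan(button, kind, all_buttons):
    if kind == "setting_change":
        # make the simple button's whole group visible near the finger
        return {"visible": [button] + SETTING_GROUPS.get(button, []),
                "erased": []}
    # operation execution: keep only the button under the finger
    return {"visible": [button],
            "erased": [b for b in all_buttons if b != button]}

print(display_plan("still", "operation_execution", ["flash", "still", "movie"]))
```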
The camera preview screen generation unit 58 generates screen data of the preview screen on the basis of the generation instruction of screen data of the preview screen output from the camera application 40B. In the present embodiment, as described above, the camera preview screen generation unit 58 generates screen data so that the setting changing items and the operation execution items are in a transmissive form with respect to a subject on the preview screen. The camera preview screen generation unit 58 outputs the generated screen data to the camera screen combining unit 59.
The camera screen combining unit 59 as a display control unit combines the screen data of the preview screen output from the camera preview screen generation unit 58 with the button image data of the setting changing item output from the display variable button image generation unit 57, and displays the combined screen data on the screen DP of the screen display unit 50.
A plurality of simple buttons BT1 are displayed in a transmissive display form with respect to a subject at a left end of the left preview screen.
On the other hand, a plurality of buttons BT2 are displayed in a transmissive display form with respect to the subject at a right end of the left preview screen.
In a case where the finger FG approaches any one of the simple buttons indicating the setting changing items, as illustrated in the right preview screen, the portable terminal 1B displays the plurality of setting changing items of the group corresponding to the simple button in a display form which can be visually recognized by a user, in the vicinity of the proximity correspondence position.
In addition, the camera screen combining unit 59 combines the screen data of the preview screen output from the camera preview screen generation unit 58 with the button image data of the operation execution item output from the display variable button image generation unit 57, and displays the combined screen data on the screen DP of the screen display unit 50.
Further, in a case where information indicating that a predetermined time or more has elapsed after the proximity of the finger FG to the proximity correspondence button displayed at the proximity correspondence position is acquired from a timer (not illustrated) of the portable terminal 1B, the camera screen combining unit 59 displays, on the screen DP of the screen display unit 50, a screen from which the simple buttons of the remaining setting changing items and the remaining operation execution items, excluding the operation execution item displayed at the proximity correspondence position where the proximity of the finger FG is detected, are erased, on the basis of the erasure instruction output from the display variable button image generation unit 57.
In a case where the finger FG approaches the central button (the moving image capturing button) among the buttons BT2 indicating the operation execution items for a predetermined time or more, as illustrated in the right preview screen, the buttons other than the moving image capturing button are erased from the preview screen.
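The condition that the finger FG stays over the same button for a predetermined time or more before the remaining buttons are erased can be sketched with a simple dwell tracker; the threshold value and the use of time.monotonic() in place of the terminal's timer (not illustrated) are assumptions for the example.

```python
# Minimal sketch of the dwell condition: the remaining buttons are erased only
# once the finger has stayed over the same button for DWELL_SEC or more.
import time

DWELL_SEC = 1.0

class DwellTracker:
    def __init__(self):
        self.button, self.since = None, 0.0
    def update(self, button, now=None):
        now = time.monotonic() if now is None else now
        if button != self.button:
            self.button, self.since = button, now   # hover target changed
        return button is not None and now - self.since >= DWELL_SEC

t = DwellTracker()
print(t.update("movie", now=0.0), t.update("movie", now=1.2))  # False True
```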
The portable terminal 1B waits for the proximity detection unit 10 to detect the proximity of the finger FG to the touch panel 15 (step S32). If the proximity detection unit 10 detects the proximity of the finger FG to the touch panel 15 (YES in step S32), the proximity coordinate evaluation unit 11 calculates proximity coordinates (x, y, z) of the finger FG to the touch panel 15 on the basis of a proximity notification sent from the proximity detection unit 10. The proximity coordinate evaluation unit 11 outputs information regarding the calculated proximity coordinates (x, y, z) to the button position/finger position determination unit 30B.
The button position/finger position determination unit 30B outputs information regarding proximity coordinates (x, y, z) output from the proximity coordinate evaluation unit 11, and a determination instruction for determining a proximity correspondence button, to the camera application 40B. In addition, the button position/finger position determination unit 30B outputs a determination instruction for determining whether a button displayed at a proximity correspondence position of the finger FG whose proximity is detected is an operation execution item or a setting changing item, to the button execution content determination unit 56.
The camera application 40B determines a proximity correspondence button displayed at an x coordinate value and a y coordinate value of the proximity coordinates (x, y, z), that is, at the proximity correspondence position of the finger FG whose proximity is detected, on the basis of the information regarding the proximity coordinates (x, y, z) output from the button position/finger position determination unit 30B, the determination instruction for determining a button displayed at the proximity correspondence position corresponding to the proximity coordinates (x, y, z), and the display button position information 45. The camera application 40B determines execution content of the proximity correspondence button which is determined by using the display button position information 45, on the basis of the button execution content information 46. The camera application 40B outputs information regarding the execution content of the proximity correspondence button determined by using the button execution content information 46, to the button execution content determination unit 56.
The button execution content determination unit 56 determines whether the proximity correspondence button displayed at the proximity correspondence position corresponding to the finger FG whose proximity is detected is an operation execution item or a setting changing item, on the basis of the determination instruction output from the button position/finger position determination unit 30B and the information regarding the execution content of the proximity correspondence button output from the camera application 40B (step S33).
If it is determined that the proximity correspondence button is a setting changing item (a setting changing item in step S33), the button execution content determination unit 56 outputs, to the display variable button image generation unit 57, information for displaying specific setting change content in a display form which can be visually recognized by a user as related information or support information related to a simple button which is displayed in a transmissive manner with respect to a subject on a preview screen, and a generation instruction for generating button image data of the setting changing item in a visible display form.
The display variable button image generation unit 57 generates button image data in which a display form of the setting changing item displayed at the proximity correspondence position corresponding to the finger FG whose proximity is detected on the preview screen is changed from an initial transmissive display form to a display form which can be visually recognized by a user, on the basis of the information output from the button execution content determination unit 56 and the generation instruction. The display variable button image generation unit 57 outputs the generated button image data of the setting changing item to the camera screen combining unit 59.
The camera screen combining unit 59 combines the screen data of the preview screen output from the camera preview screen generation unit 58 with the button image data of the setting changing item output from the display variable button image generation unit 57, and displays the combined screen data on the screen DP of the screen display unit 50 (step S34).
After step S34, the portable terminal 1B waits for the touch detection unit 20 to detect touching (contact) of the finger FG on the touch panel 15 (step S35). If the touch detection unit 20 detects touching (contact) of the finger FG on the touch panel 15 (YES in step S35), the button position/finger position determination unit 30B outputs the information regarding the touch coordinates (x,y) output from the touch coordinate evaluation unit 21, to the camera application 40B. The camera application 40B executes an operation or the setting change content corresponding to an item on the preview screen at the touch coordinates (x,y) output from the button position/finger position determination unit 30B (step S36).
On the other hand, if it is determined that the proximity correspondence button is an operation execution item (an operation execution item in step S33), the button execution content determination unit 56 outputs, to the display variable button image generation unit 57, information for displaying the proximity correspondence button in a display form which can be visually recognized by the user on a preview screen, and an erasure instruction for erasing buttons (including a simple button indicating a setting changing item and operation execution items) other than the operation execution item (for example, a button) displayed at the proximity correspondence position from the screen among all items which are displayed in a transmissive manner with respect to a subject on the preview screen.
The display variable button image generation unit 57 generates button image data in which a display form of the operation execution item displayed at the proximity correspondence position corresponding to the finger FG whose proximity is detected on the preview screen is changed from an initial transmissive display form to a display form which can be visually recognized by a user, on the basis of the information output from the button execution content determination unit 56 and the erasure instruction, and also outputs information for erasing the remaining operation execution items and setting changing items to the camera screen combining unit 59.
The camera screen combining unit 59 combines the screen data of the preview screen output from the camera preview screen generation unit 58 with the button image data of the operation execution item output from the display variable button image generation unit 57, and displays the combined screen data on the screen DP of the screen display unit 50 (step S37).
Further, in a case where information indicating that a predetermined time or more has elapsed after the proximity of the finger FG to the proximity correspondence button displayed at the proximity correspondence position is acquired from a timer (not illustrated) of the portable terminal 1B (YES in step S38), the camera screen combining unit 59 displays, on the screen DP of the screen display unit 50, a screen from which the simple buttons of the remaining setting changing items and the remaining operation execution items, excluding the operation execution item displayed at the proximity correspondence position where the proximity of the finger FG is detected, are erased, on the basis of the erasure instruction output from the display variable button image generation unit 57 (step S39).
As mentioned above, in a case where the finger FG approaches a simple button indicating a plurality of setting changing items which are displayed in a transmissive manner and are set in the same group on a preview screen of the application (the camera application 40B) with the camera function, the portable terminal 1B of the present embodiment changes the display form of the plurality of setting changing items corresponding to the simple button from a transmissive display form to a display form which can be visually recognized by a user.
In addition, in a case where the finger FG approaches a button indicating each operation execution item which is displayed in a transmissive manner on a preview screen of the application (the camera application 40B) with the camera function, the portable terminal 1B changes the display form of the proximity correspondence button from a transmissive display form to a display form which can be visually recognized by a user, and also erases the display of the respective buttons of the remaining items (the operation execution items and the simple buttons of the setting changing items) in a case where the proximity of the finger FG to the same proximity correspondence button continues for a predetermined time or more. Consequently, for example, in a case where the finger FG approaches a moving image capturing button for a predetermined time or more, the portable terminal 1B erases all buttons which are not necessary in the moving image capturing operation, and thus it is possible to improve visibility and operability when a user performs an operation related to an operation execution item for a subject.
In a fourth embodiment, a portable terminal 1C displays, on a security lock screen displayed by a security lock (for example, a screen lock) function which is activated when an input operation is not performed by the user for a predetermined time or more, an image of a trajectory of a cancellation operation and a detection region of the proximity or contact in the image of the trajectory in an identifiable display form, as related information or support information for a user cancelling the security lock or for supporting the cancellation operation.
The portable terminal 1C includes the trajectory detection position/finger position determination unit 32, the security lock application 40C, the trajectory matching determination unit 47, the trajectory holding unit 60, the trajectory image generation unit 61, the security lock screen generation unit 62, and the security lock screen combining unit 63.
Each of the trajectory detection position/finger position determination unit 32, the security lock application 40C, the trajectory matching determination unit 47, the trajectory image generation unit 61, the security lock screen generation unit 62, and the security lock screen combining unit 63 can be operated by a processor (not illustrated) built into the portable terminal 1C reading and executing a program related to the present invention.
The trajectory detection position/finger position determination unit 32 as a trajectory position determination unit determines a trajectory position of a cancellation operation using the finger FG on a security lock screen and a detection region through which the finger FG is passing on the basis of information regarding proximity coordinates (x, y, z) output from the proximity coordinate evaluation unit 11 or touch coordinates (x,y) output from the touch coordinate evaluation unit 21, and the detection region information 33. The trajectory detection position/finger position determination unit 32 outputs information regarding the determined trajectory position of the cancellation operation using the finger FG to the security lock application 40C, and temporarily stores the determined trajectory position of the cancellation operation using the finger FG and information regarding a detection region through which the finger FG is passing in the trajectory holding unit 60 in correlation with each other.
The detection region information 33 is information indicating a relationship between a trajectory position of a cancellation operation using the finger FG on a security lock screen and a predefined detection region. The detection region information 33 will be described below.
In the touch panel 15, three detection regions, that is, a touch detection region, a low hovering detection region, and a high hovering detection region, are defined depending on a height in the z axis direction.
Further, in the touch panel 15 of the portable terminal 1C, nine detection points are specified for each layer of the three detection regions.
In other words, a total of nine detection points DT1 to DT9 are provided in the touch detection region, a total of nine detection points DT10 to DT18 are provided in the low hovering detection region, and a total of nine detection points DT19 to DT27 are provided in the high hovering detection region. The trajectory detection position/finger position determination unit 32 determines through which detection points the finger FG with which a cancellation operation is being performed has passed among the total of twenty-seven detection points DT1 to DT27. In addition, the touch detection region is provided at a position where a height in the z axis direction is 0 (zero), the low hovering detection region is provided at a position where a height in the z axis direction is between 0 (zero) and z1, and the high hovering detection region is provided at a position where a height in the z axis direction is between z1 and z2.
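For illustration, the mapping from coordinates (x, y, z) to one of the twenty-seven detection points DT1 to DT27 can be sketched as follows; the panel extent, the region boundary heights, and the 3x3 grid geometry are assumptions made for this example.

```python
# Minimal sketch of the 27-point grid: heights z1 < z2 split the space into
# the touch, low hovering and high hovering detection regions, each carrying
# a 3x3 grid of detection points. All concrete values are assumptions.
Z1, Z2 = 20, 40            # region boundaries in the z axis direction
W, H = 320, 480            # assumed touch panel 15 extent in x and y

def detection_point(x, y, z):
    """Map coordinates to one of DT1..DT27, or None above the high region."""
    if z <= 0:
        layer = 0          # touch detection region (DT1..DT9)
    elif z <= Z1:
        layer = 1          # low hovering detection region (DT10..DT18)
    elif z <= Z2:
        layer = 2          # high hovering detection region (DT19..DT27)
    else:
        return None
    col = min(int(x * 3 / W), 2)
    row = min(int(y * 3 / H), 2)
    return 1 + layer * 9 + row * 3 + col   # DT numbering, 1-based

print(detection_point(160, 400, 10))   # -> 17, in the low hovering region
```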
Further, the trajectory detection position/finger position determination unit 32 refers to the trajectory holding unit 60, and outputs, to the trajectory image generation unit 61, information for generating a trajectory image including a trajectory (for example, a straight line or a curve) for a cancellation operation using the finger FG and a detection region where the trajectory of the cancellation operation using the finger FG is passing.
The security lock application 40C as an operation execution unit is an application which is stored in advance in a ROM built into the portable terminal 1C, and is used to lock a user's use of the portable terminal 1C in a case where an input operation is not performed on the portable terminal 1C for a predetermined time or more. The security lock application 40C causes the security lock screen generation unit 62 to generate a security lock screen on the basis of a certain input operation (for example, pressing of a power supply button or a predetermined home button) in a stop state of the portable terminal 1C due to the user not performing an input operation for a predetermined time or more, and receives a cancellation operation on the security lock screen on the screen DP of the screen display unit 50 from the user.
In a case where the security lock application 40C acquires information regarding a trajectory position output from the trajectory detection position/finger position determination unit 32 and information indicating that the finger FG has hovered out is output from the trajectory detection position/finger position determination unit 32, the trajectory matching determination unit 47 compares the information regarding the trajectory position output from the trajectory detection position/finger position determination unit 32 up to the hovering-out with the cancelation trajectory information 48.
In addition, the trajectory matching determination unit 47 temporarily stores information regarding a comparison result, the information regarding the trajectory position of a cancellation operation using the finger FG stored in the trajectory holding unit 60 by the trajectory detection position/finger position determination unit 32, and information regarding the detection region through which the finger FG is passing, in the trajectory holding unit 60 in correlation with each other. If the information regarding the trajectory position output from the trajectory detection position/finger position determination unit 32 matches the cancelation trajectory information 48 up to the hovering-out, the trajectory matching determination unit 47 outputs information for canceling the security lock to the security lock application 40C.
The cancelation trajectory information 48 is trajectory information of a cancellation operation which is registered in advance in order to enable a user's input operation on the portable terminal 1C by canceling a security lock screen when the security lock screen is displayed. The cancelation trajectory information 48 may be changed as appropriate through a user's setting changing operation.
The trajectory holding unit 60 is configured by using a RAM built into the portable terminal 1C, and temporarily stores information regarding a trajectory position of a cancellation operation using the finger FG output from the trajectory detection position/finger position determination unit 32 and information regarding a detection region through which the finger FG is passing in correlation with each other. In addition, the trajectory holding unit 60 temporarily stores a comparison result from the trajectory matching determination unit 47, information regarding a trajectory position of a cancellation operation using the finger FG, and information regarding a detection region through which the finger FG is passing in correlation with each other.
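The recording of trajectory positions in the trajectory holding unit 60 and the comparison with the cancelation trajectory information 48 on hovering-out can be sketched as follows; the registered detection point sequence and the exact-match policy are assumptions made for the example.

```python
# Minimal sketch of trajectory holding and matching: detection points are
# appended while the finger moves, and on hover-out the recorded sequence
# is compared with the registered cancellation trajectory.
CANCELLATION_TRAJECTORY_INFO = [7, 8, 9, 14, 23]   # assumed registered DTs

class TrajectoryHolder:
    def __init__(self):
        self.points = []
    def record(self, detection_point):
        if detection_point and (not self.points
                                or self.points[-1] != detection_point):
            self.points.append(detection_point)    # ignore repeats in place
    def matches(self, registered):
        return self.points == registered            # compared on hover-out

holder = TrajectoryHolder()
for dt in [7, 7, 8, 9, 14, 23]:
    holder.record(dt)
print(holder.matches(CANCELLATION_TRAJECTORY_INFO))   # -> True
```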
The trajectory image generation unit 61 generates trajectory image data including a trajectory (for example, a straight line or a curve) for a cancellation operation using the finger FG and a detection region through which the trajectory of the cancellation operation using the finger FG is passing, by referring to the trajectory holding unit 60, on the basis of the information output from the trajectory detection position/finger position determination unit 32, and outputs the trajectory image data to the security lock screen combining unit 63. The trajectory image data will be described later.
The security lock screen generation unit 62 generates screen data of a security lock screen on the basis of a generation instruction of the security lock screen output from the security lock application 40C. The security lock screen generation unit 62 outputs the generated screen data to the security lock screen combining unit 63.
The security lock screen combining unit 63 as a display control unit combines the screen data of the security lock screen output from the security lock screen generation unit 62 with the trajectory image data output from the trajectory image generation unit 61, and displays the combined screen data on the screen DP of the screen display unit 50.
Next, a description will be made of the trajectory image data generated in a cancellation operation, by using the state A to the state E.
The state A is a state in which a cancellation operation using the finger FG has not yet been input. The state B indicates a state in which the finger FG touches the detection point DT7 at a start of the cancellation operation.
Next, the state C indicates a state in which the finger FG passes through the detection point DT8 at the position of the touch detection region.
Next, the state D indicates a state right before the finger FG passes through the central detection point of the low hovering detection region at the position of the low hovering detection region. The trajectory image generation unit 61 generates image data of a display form (for example, a translucent green circular form; refer to a dot pattern portion in
Finally, the state E indicates a state right after the finger FG passes through the central detection point of the high hovering detection region at the position of the high hovering detection region. The trajectory image generation unit 61 generates image data of a display form (for example, a translucent yellow circular form; refer to a hatched portion in
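The per-region display form of the trajectory image can be sketched as a simple lookup; the colors for the low and high hovering detection regions follow the states described above, while the form used for the touch detection region is an assumption, since the corresponding description is not preserved here.

```python
# Minimal sketch of the per-region display form used for the trajectory image.
REGION_STYLE = {
    "touch":       "opaque circle",          # assumed form for z == 0
    "low_hover":   "translucent green circle",
    "high_hover":  "translucent yellow circle",
}

def region_of(detection_point):
    layer = (detection_point - 1) // 9       # DT1..DT9, DT10..DT18, DT19..DT27
    return ("touch", "low_hover", "high_hover")[layer]

print([region_of(dt) + ": " + REGION_STYLE[region_of(dt)]
       for dt in (7, 14, 23)])
```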
As mentioned above, between starting and ending of a cancellation operation using the finger FG on the security lock screen, the portable terminal 1C displays a height (a detection region) of a trajectory position of the finger FG and a trajectory in which the finger FG passes through that height, for example, in a display form of the same color or the same pattern, as related information or support information regarding a position (a proximity correspondence position or a touch position) which the finger FG approaches or comes into contact with among the positions of the predetermined detection points. Consequently, the portable terminal 1C can simply display a trajectory (for example, a straight line or a curve) drawn by the finger FG in a cancellation operation and a detection region through which the trajectory is currently passing or has previously passed, in correlation with each other. Therefore, the portable terminal 1C enables a user to visually compare the cancelation trajectory information 48 memorized by the user with the cancellation operation which is currently being performed, and also enables the user to easily perform the cancellation operation on the security lock screen.
If either the proximity detection unit 10 detects the proximity of the finger FG to the touch panel 15 or the touch detection unit 20 detects a touch of the finger FG on the touch panel 15 (YES in step S42), the trajectory detection position/finger position determination unit 32 determines a trajectory position of a cancellation operation using the finger FG on the security lock screen and a detection region through which the finger FG is passing on the basis of proximity coordinates (x, y, z) output from the proximity coordinate evaluation unit 11 or touch coordinates (x,y) output from the touch coordinate evaluation unit 21 and the detection region information 33. The trajectory detection position/finger position determination unit 32 outputs information regarding the determined trajectory position of the cancellation operation using the finger FG to the security lock application 40C, and also temporarily stores the information regarding the determined trajectory position of the cancellation operation using the finger FG and information regarding the detection region through which the finger FG is passing in the trajectory holding unit 60 in correlation with each other. In addition, the trajectory detection position/finger position determination unit 32 refers to the trajectory holding unit 60, and outputs, to the trajectory image generation unit 61, information for generating a trajectory image including a trajectory (for example, a straight line or a curve) for a cancellation operation using the finger FG and a detection region where the trajectory of the cancellation operation using the finger FG is passing.
The trajectory image generation unit 61 generates trajectory image data including a trajectory (for example, a straight line or a curve) for a cancellation operation using the finger FG and a detection region through which the trajectory of the cancellation operation using the finger FG is passing, by referring to the trajectory holding unit 60, on the basis of the information output from the trajectory detection position/finger position determination unit 32, and outputs the trajectory image data to the security lock screen combining unit 63.
The security lock screen combining unit 63 combines the screen data of the security lock screen output from the security lock screen generation unit 62 with the trajectory image data output from the trajectory image generation unit 61, and displays the combined screen data on the screen DP of the screen display unit 50 (step S43).
After step S43, if there is a movement (a hover-slide operation or a touch-slide operation) of the finger FG with which a hover operation or a touch operation is being performed (YES in step S44), in the same manner as in step S43, the security lock screen combining unit 63 displays trajectory image data indicating a trajectory position and a detection region (a height of the trajectory position) through which the trajectory is passing in the cancellation operation using the finger FG on the security lock screen according to the movement (a hover-slide operation or a touch-slide operation) of the finger FG (step S45; refer to the state C, the state D, and the state E).
After step S45, if the finger FG does not hover out (NO in step S46), the operation of the portable terminal 1C returns to step S44. On the other hand, after step S45, if the finger FG hovers out (YES in step S46), the trajectory matching determination unit 47 compares a trajectory of the cancellation operation using the finger FG, detected up to the hovering-out with the cancelation trajectory information 48 (step S47).
If the trajectory of the cancellation operation using the finger FG, detected up to the hovering-out matches the cancelation trajectory information 48 (YES in step S47), the trajectory matching determination unit 47 outputs information for canceling the security lock to the security lock application 40C. The security lock application 40C cancels the security lock screen displayed on the screen DP of the screen display unit 50 on the basis of the information output from the trajectory matching determination unit 47 (step S49), and switches a state of the portable terminal 1C to a state in which an input operation can be received from a user.
On the other hand, if the trajectory of the cancellation operation using the finger FG, detected up to the hovering-out, does not match the cancelation trajectory information 48 (NO in step S47), the trajectory matching determination unit 47 erases the trajectory of the cancellation operation using the finger FG after a predetermined time has elapsed from the comparison time point in step S47 (step S50). After step S50, the operation of the portable terminal 1C returns to step S41.
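The control flow of steps S44 to S50 (extending the drawn trajectory, comparing it with the registered trajectory on hovering-out, and either cancelling the lock or erasing the trajectory) can be sketched as follows; the event encoding is an assumption made for the example.

```python
# Minimal sketch of steps S44..S50: track the finger until it hovers out,
# then either cancel the lock or erase the drawn trajectory.
def lock_screen_loop(events, registered):
    """events: iterable of detection points, with None marking a hover-out."""
    drawn = []
    for dt in events:
        if dt is None:                       # YES in step S46: hover-out
            if drawn == registered:          # step S47: comparison
                return "unlocked"            # step S49: cancel the lock
            drawn.clear()                    # step S50: erase the trajectory
            continue                         # back to steps S41/S42
        if not drawn or drawn[-1] != dt:     # steps S44/S45: extend trajectory
            drawn.append(dt)
    return "locked"

print(lock_screen_loop([7, 8, 9, None], [7, 8, 9]))   # -> 'unlocked'
```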
As mentioned above, between starting and ending of a cancellation operation using the finger FG on the security lock screen, and under the condition that the finger FG does not hover out, the portable terminal 1C displays a height (a detection region) of a trajectory position of the finger FG and a trajectory in which the finger FG passes through that height, for example, in a display form of the same color or the same pattern, as related information or support information regarding a position which the finger FG approaches or comes into contact with among the positions of the predetermined detection points.
Consequently, the portable terminal 1C can simply display a trajectory (for example, a straight line or a curve) drawn by the finger FG in a cancellation operation and a detection region through which the trajectory is currently passing or has previously passed, in correlation with each other. Therefore, the portable terminal 1C enables a user to visually compare the cancelation trajectory information 48 memorized by the user with the cancellation operation which is currently being performed, and also enables the user to easily perform the cancellation operation on the security lock screen.
As mentioned above, although the various embodiments have been described with reference to the drawings, it is needless to say that the present invention is not limited to the embodiments. It is obvious that a person skilled in the art can conceive of alterations or modifications of the various embodiments and combinations of the various embodiments within the scope recited in the claims, and it is understood that they naturally fall within the technical scope of the present invention.
In addition, in the fourth embodiment, a description has been made of a case where the trajectory detection position/finger position determination unit 32 uses the respective nine detection points specified for each layer of the three detection regions (the touch detection region, the low hovering detection region, and the high hovering detection region); however, the number and the arrangement of the detection points are not limited to this example, and the trajectory detection position/finger position determination unit 32 may determine a trajectory position of a cancellation operation using the finger FG in the same manner.
In addition, the trajectory detection position/finger position determination unit 32 identifies the total of nine detection points of the touch detection region in the same manner.
This application is based on Japanese Patent Application No. 2012-217711 filed on Sep. 28, 2012, the contents of which are incorporated herein by reference.
The present invention is useful for a display control device, a display control method, and a program, which explicitly display support information or related information regarding an input operation target when the proximity is detected before a user selects and fixes the input operation target, so as to support a user's accurate input operation, thereby improving operability.