The present invention relates to input devices for receiving user inputs on an outer edge of a casing thereof, wearable terminals including such an input device, mobile terminals including the input device, methods of controlling the input device, and control programs for controlling operation of the input device.
Smart watches and other like compact wearable terminals have only a small display screen on which a touch panel is stacked. Therefore, improvement of GUI (Graphical User Interface) operability has been a large issue with these terminals. In relation to this GUI operability improvement, Patent Literature 1 discloses a GUI that improves operability by displaying radial submenus around a first touch position in a menu. The GUI also displays submenus in such a manner that a series of strokes of selecting from the submenus ends near the origin of the first stroke.
However, the GUI disclosed in Patent Literature 1 is designed basically on the assumption that the user operates it with one finger (including the thumb). When this GUI is used on the small display screen of a wearable terminal, the problems detailed below arise.
When the GUI disclosed in Patent Literature 1 is applied to a wearable terminal, the limited display area for opening submenus could significantly degrade visibility: for example, submenu items may need to be displayed in a small size or superimposed on the background image. In addition, submenus are opened in various directions and therefore may be hidden and made invisible by a finger, which also seriously degrades operability.
Other problems also exist. Since smart watches and other like compact wearable terminals have only a small display screen, it would be easier for the user to touch an edge of the screen or touch a side face of the casing than to touch a display item on the screen. However, if the user wearing the smart watch on the arm (or around the wrist) attempts to touch an edge of the screen or a side face of the casing with one finger, the finger will often move (displace) the terminal due to the lack of structural support for the terminal before the user can complete the touch operation.
The inventors of the present invention have diligently worked in order to solve these problems and as a result, have found that the operability of the terminal improves if two or more fingers are used, for example, by touching a side or end of the terminal with the forefinger (or a finger other than the thumb) while supporting the opposite side or end thereof with the thumb.
In view of these problems, it is an object of the present invention to provide an input or like device that improves operability in an input operation that involves use of two or more fingers.
To address the problems, an input device in accordance with an aspect of the present invention is directed to an input device for receiving an input from a user on an outer edge of a casing of the input device, the input device including: a detection unit configured to detect a contact position of a first finger of the user on the outer edge; and a second setup unit configured to set up, by using as a reference a position opposite from the contact position of the first finger of the user detected by the detection unit, a second input area where an input made with a second finger of the user is received.
Additionally, to address the problems, a method in accordance with an aspect of the present invention is directed to a method of controlling an input device for receiving an input from a user on an outer edge of a casing of the input device, the method including: (a) detecting a contact position of a first finger of the user on the outer edge; and (b) setting up, by using as a reference a position opposite from the contact position of the first finger detected in step (a), a second input area where an input made with a second finger of the user is received.
An aspect of the present invention can improve operability in an input operation that involves use of two or more fingers.
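Purely as an informal illustration (not part of the disclosed embodiments), the setup step described above can be sketched in Python, assuming a round casing whose outer edge is parameterized by an angle in degrees; the function names and the 60-degree area span are hypothetical:

```python
def opposite_position(contact_angle_deg: float) -> float:
    """Return the point on the outer edge diametrically opposite the
    first finger's contact position (angles in degrees around the casing)."""
    return (contact_angle_deg + 180.0) % 360.0


def second_input_area(contact_angle_deg: float, span_deg: float = 60.0):
    """Set up the second input area as an arc of `span_deg` degrees
    centred on the position opposite the first finger's contact."""
    centre = opposite_position(contact_angle_deg)
    half = span_deg / 2.0
    return ((centre - half) % 360.0, (centre + half) % 360.0)
```

For a rectangular casing, the "opposite" position could instead be taken as the point mirrored through the center of the display.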
Portions (a) to (d) of
Portions (a) and (b) of
Portions (a) to (d) of
Portions (a) to (d) of
Portions (a) to (d) of
Portions (a) to (d) of
Portions (a) to (c) of
Portions (a) to (c) of
Portions (a) to (c) of
Portions (a) to (c) of
Portions (a) to (d) of
Portions (a) to (d) of
Portions (a) to (d) of
The following will describe embodiments of the present invention in reference to
The configuration of a terminal device (input device, wearable terminal, or mobile terminal) 10 in accordance with embodiments of the present invention will be described in reference to
In the present embodiment, the detection unit 1 includes a touch panel (detection unit) 11 and a side face touch sensor (detection unit) 12. The touch panel 11 is stacked on the display unit 3. The side face touch sensor 12 is disposed on a side face on the outer edge of the display unit 3 provided in the casing of the terminal device 10.
The touch panel 11 is configured to detect a target object touching or approaching a display screen of the display unit 3 in the casing and also to detect a first or a second finger touching or approaching the outer edge of the casing (or the display unit 3) (detection step). This configuration enables the touch panel 11, which is stacked on the display unit 3 in the casing and which also detects a target object touching or approaching the display screen of the display unit 3, to detect a first or a second finger touching or approaching the outer edge of the casing (or the display unit 3). Therefore, no new detection member needs to be provided to detect touching or approaching of the outer edge of the casing (or the display unit 3). That in turn can reduce the parts count.
Meanwhile, the side face touch sensor 12 is configured to detect a first or a second finger touching or approaching a side face of the casing. This configuration enables the side face touch sensor 12, disposed on a side face of the casing of the terminal device 10, to detect the first or the second finger touching or approaching the outer edge of the casing.
The detection unit 1 (touch device) may be provided in any form including the touch panel 11 and the side face touch sensor 12, provided that a touch can be detected on a side (corner) of a display device in the display unit 3 or on a side face of the casing of the terminal device 10.
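As a rough sketch of how the touch panel 11 alone could distinguish an outer-edge touch from an ordinary screen touch (so that no separate detection member is needed), a contact point might be classified by its distance from the display boundary; the margin value below is an assumption for illustration:

```python
def classify_touch(x: int, y: int, width: int, height: int,
                   edge_margin: int = 10) -> str:
    """Classify a touch on the touch panel as an outer-edge touch
    (within `edge_margin` pixels of the display boundary) or a
    central-screen touch."""
    on_edge = (x < edge_margin or x > width - edge_margin or
               y < edge_margin or y > height - edge_margin)
    return "edge" if on_edge else "center"
```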
The control unit 2, built around, for example, a CPU (central processing unit), collectively controls each unit in the terminal device 10. Referring to
The detection unit controller 21 includes a contact position determination unit 221 to determine the location of a target object on the display screen of the display unit 3 (the “contact position”; e.g., coordinates) by means of the touch panel 11 based on a result of detection of the target object touching or approaching the display screen. The contact position determination unit 221 in the detection unit controller 21 is configured to determine the contact position (coordinates) of the target object on the outer edge of the display unit 3 based on a result of the detection by the touch panel 11 of the first or the second finger touching or approaching the outer edge of the display unit 3.
The detection unit controller 21 is configured to determine the contact position of the target object on the side face touch sensor 12 based on a result of the detection of contact or approach of the target object by the side face touch sensor 12. The contact position determination unit 221 is configured to provide the setup unit 22 and/or the process specification unit 24 with information on the contact position of the target object in the display screen as determined or with information on the contact position of the target object as provided by the side face touch sensor 12.
The setup unit 22 is configured to set up, in or proximate to the contact position of the first finger detected by the detection unit 1, a first input area where an input with the first finger is received. The setup unit 22 is further configured to set up a second input area where an input with the user's second finger is received, using a position opposite from the contact position of the first finger detected by the detection unit 1 as a reference (second setup step). This configuration results in the second input area for the second finger being set up across from the contact position of the user's first finger where the first finger has touched the outer edge of the display unit 3, which can improve operability in an input operation that involves use of two or more fingers. The configuration also enables reception of an input that involves use of the first finger as well as an input that involves use of the second finger, which enables reception of more than one input. The setup unit 22 is configured to provide the detection unit controller 21, the display control unit 23, and/or the process specification unit 24 with information on the first input area and the second input area that have been set up.
The setup unit 22 may set up the first input area if the detection unit 1 has detected the contact position of the first finger and subsequently detected the contact position of the second finger, and may set up the second input area if the detection unit 1 has detected the contact position of the second finger and subsequently detected the contact position of the first finger. The detection unit 1 may alternately detect the contact position of the first finger and the contact position of the second finger, so that the setup unit 22 alternately sets up the first input area and the second input area. According to this configuration, the first input area and the second input area are alternately set up if an input is made alternately with the first finger and with the second finger, which can improve operability in an input operation that involves use of two or more fingers.
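The alternating setup just described might be modeled, as an illustrative sketch only, by tracking the order in which the two fingers' contact positions are detected; the class and its names are hypothetical:

```python
class AlternatingSetup:
    """Toy model of the setup unit: which input area is set up depends
    on the order in which the two fingers were detected, so alternating
    contacts alternately set up the first and second input areas."""

    def __init__(self):
        self.prev_finger = None
        self.areas = []  # record of input areas set up, in order

    def on_detect(self, finger: str):
        # (first, then second) -> set up the first input area;
        # (second, then first) -> set up the second input area.
        if self.prev_finger == "first" and finger == "second":
            self.areas.append("first input area")
        elif self.prev_finger == "second" and finger == "first":
            self.areas.append("second input area")
        self.prev_finger = finger
```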
The display control unit 23 controls the display unit 3 to display predetermined and other images (for example, a main menu, submenus, and icons in each menu (menu items) that will be described later in detail). Particularly, the display control unit 23 of the present embodiment is configured to control the display unit 3 to display, in or near the first input area on the display unit 3, a main menu as a first input-use image that prompts the user to make an input in the first input area with the first finger. According to this configuration, the first input-use image is displayed in or near the first input area. That in turn enables the user to visually recognize the first input-use image (main menu) so that the user can make an input in the first input area while visually recognizing that image.
The display control unit 23 is configured to control the display unit 3 to display, in or near the second input area on the display unit 3, a submenu as a second input-use image that prompts the user to make an input in the second input area with the second finger. According to this configuration, a submenu is displayed in or near the second input area, which is triggered by the input in the first input area with the first finger as prompted by the main menu. That in turn enables the user to visually recognize the submenu upon that input so that the user can make an input in the second input area while visually recognizing the submenu, which can improve the visibility of the menus on the display screen and the operability of the terminal device 10.
The submenu is not displayed in or near the second input area until an input is made in the first input area. Therefore, the user cannot recognize the presence of the second input area before making an input in the first input area. In other words, the user cannot make an input in the second input area before making an input in the first input area. Thus, no input is allowed in the second input area while the user is making an input in the first input area. The configuration can hence prevent malfunctions that could be caused if inputs are permitted in more than one location.
Alternatively, the display control unit 23 may display the first input-use image if the detection unit 1 has detected the contact position of the first finger and subsequently detected the contact position of the second finger, display the second input-use image if the detection unit 1 has detected the contact position of the second finger and subsequently detected the contact position of the first finger, and alternately display the first input-use image and the second input-use image if the detection unit 1 has alternately detected the contact position of the first finger and the contact position of the second finger. According to this configuration, the first input-use image and the second input-use image are alternately displayed if an input is made alternately with the first finger and with the second finger. That in turn enables the user to visually recognize the first input-use image and the second input-use image alternately upon such inputs so that the user can make an input alternately in the first input area and in the second input area while visually recognizing those images, which can improve operability in an input operation that involves use of two or more fingers.
As a further alternative, the display control unit 23 may display hierarchically lower-level submenus in accordance with the sequence in which the first input-use image and the second input-use image are alternately displayed. This configuration enables selection of menu items in hierarchically lower-level submenus in accordance with the sequence in which the first input-use image and the second input-use image are alternately displayed, which can improve operability in an input operation that involves use of two or more fingers.
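The descent through hierarchically lower-level submenus can be pictured, purely as a sketch, as a walk through a nested menu structure in which each alternate selection opens the next-lower submenu; the menu contents below are hypothetical:

```python
def descend(menu_tree, selections):
    """Walk hierarchically lower-level submenus in the order the first
    and second input-use images are alternately displayed: each
    selection opens the next-lower submenu."""
    node = menu_tree
    for item in selections:
        node = node[item]
    return node


# Hypothetical hierarchical menu (main menu -> submenus).
menus = {"Music": {"Artist A": {"Album 1": ["Song 1", "Song 2"]}}}
```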
The process specification unit 24 is configured to specify the processing to be executed in response to the user's input operations, based on information on the inputs in the first and second input areas set up by the setup unit 22 and on either information on the contact position of the target object in the display screen as determined by the contact position determination unit 221 in the detection unit controller 21 or information on the contact position of the target object as provided by the side face touch sensor 12. The process specification unit 24 is further configured to provide the process execution unit 25 with information on the specified processing.
The process execution unit 25 is configured to cause an appropriate block (particularly, the display control unit 23) in the control unit 2 to execute a process in accordance with the specified processing based on the information on the processing received from the process specification unit 24.
The display unit 3 of the present embodiment includes, for example, a liquid crystal panel as a predetermined display screen to display images. The display panel used in the display unit 3 is by no means limited to a liquid crystal panel and may be an organic EL (electroluminescence) panel, an inorganic EL panel, or a plasma panel.
The display unit 3 of the present embodiment is configured to display, particularly in or near the first input area, the main menu as the first input-use image that prompts the user to make an input in the first input area with the first finger. The display unit 3 is further configured to display, in or near the second input area, a submenu as the second input-use image that prompts the user to make an input in the second input area with the second finger.
The present embodiment has so far described the terminal device 10 including the display unit 3. The present invention is not necessarily embodied in this form that includes a display unit. For example, the present invention may be embodied, without the display unit 3, in the form of an input or control device that only receives touch operations on the outer edge of the casing.
The memory unit 4 prestores various information required for the operation of all the units in the control unit 2 and also stores various information generated by the units during the operation of the terminal device 10 on the fly. Examples of the information prestored in the memory unit 4 include information on the OS (operating system), which is basic software to operate the terminal device 10, information on various applications (software), and information on the GUI (graphical user interface) produced on the display unit 3.
Examples of the various information generated by the units during the operation of the terminal device 10 include information on the contact position of the first or the second finger determined by the contact position determination unit 221 in the detection unit controller 21, information on the first or the second input area set up by the setup unit 22, and information on the first input-use image (main menu image) or the second input-use image (submenu image) generated by the display control unit 23.
Next, referring to
Portion (a) of
Portions (b) and (c) of
Portion (b) of
In contrast, portion (c) of
Next, portion (d) of
Next, referring to
Portion (b) of
Next, referring to
The main menu may be displayed either close to where a touch has been made on the outer edge of the display unit 3 with the thumb (first finger) in response to that touch or close to where a touch is expected to be made with the thumb since before the touch is actually made. The following modes (1) and (2) are given here as more specific examples.
(1) Upon starting an application, the main menu is displayed in a peripheral part of the screen of the display unit 3 (since before the thumb touches). The main menu in the peripheral part of the screen disappears when the central part of the screen (the area of the screen where the menu is not being displayed) is touched. After that, the main menu is displayed close to where a touch is made on the outer edge of the display unit 3, in response to that touch.
(2) No main menu is displayed upon the start of an application. The main menu is displayed close to where a touch is made on the outer edge of the display unit 3, in response to that touch.
Portions (c) and (d) of
The modes shown in
The compact input device has a limited screen display area. Operability and visibility of the compact input device will improve, for example, if the operation command menu is displayed on a peripheral part of the screen so that the user can touch the edge of the screen and the side face of the casing for operation. For example, if two or more operation buttons are to be displayed in the central part of the screen, the buttons need to be displayed in small size, which can lead to poor visibility and wrong touches. The screen may be partially hidden and made invisible by the finger being used in the operation.
If the user wears, for example, a watch on the wrist and attempts to operate the compact input device on an edge or side face thereof, the user cannot readily touch or press the edge or side face with one finger without displacing the casing. The user would find it easier to support the casing with one finger and operate the input device with another finger. However, the input device would interpret this operation that involves use of two fingers as two touches at different points and might fail to determine which of the touches should be interpreted as an input operation, possibly resulting in a malfunction. For these reasons, as mentioned above, the submenu associated with the first touch (which also supports the terminal) that selects from the main menu is displayed near the position opposite from the position of that first touch, so that the user can readily perform delicate operations with another finger. This manner of operation restrains wrong operations in an input that involves use of two fingers and simultaneously improves operability and visibility.
Operation examples for the terminal device 10 in accordance with Embodiment 1 will be described in reference to
Portions (a) and (b) of
The example in portion (b) of
The user can select an item in a submenu by, for example, a "single tap (brief touch and release)" or a "touch, slide, and release to select" operation. When a menu contains many items (e.g., a long list of items), the user needs to scroll the menu. To distinguish this scroll operation from a single tap and a "touch and release to select" operation, the user can perform, for example, a "double tap," a "touch and swipe in toward the center of the screen," or a "release the touching thumb to select the item being touched with the forefinger" operation.
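As a purely illustrative sketch of the disambiguation just described, a classifier over touch events given as (time, x, y, kind) tuples might look like the following; the time and distance thresholds are invented for the sketch, not values from this disclosure:

```python
def classify_gesture(events, double_tap_window=0.3, slide_threshold=15):
    """Rough gesture classifier for submenu operations: distinguishes
    a single tap, a double tap, and a touch-slide-release selection
    from a sequence of (time, x, y, kind) events."""
    ups = [e for e in events if e[3] == "up"]
    # Two releases within the window -> double tap.
    if len(ups) >= 2 and ups[1][0] - ups[0][0] <= double_tap_window:
        return "double tap"
    down = next(e for e in events if e[3] == "down")
    up = next(e for e in events if e[3] == "up")
    moved = abs(up[1] - down[1]) + abs(up[2] - down[2])
    return "slide and release" if moved > slide_threshold else "single tap"
```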
Portions (c) and (d) of
This example in portion (c) of
The example in portion (d) of
In watches and like compact input devices, each of these modes displays an associated submenu across from the first touch select position so that the user can support the casing with one finger and perform delicate touch operations with another finger, thereby improving operability. In compact input devices with limited display content, these modes display operation menus on the periphery of the screen, thereby also improving screen visibility.
Operation examples for the terminal device 10 in accordance with Embodiment 2 will be described in reference to
Portion (b) of
Portion (c) of
Portion (d) of
In the mode shown in
Touching a song select icon in the main menu, for example, in the area A1 (contact position P1) with the thumb (first finger) invokes a display of a list of artists in a peripheral part of the screen opposite from the contact position P1 (area A2 or contact position P2), thereby enabling selection with another finger (second finger). Selecting from the list of artists in the area A2 (contact position P2) invokes a display of a list of albums of the selected artist in a peripheral part of the screen (area A3 or contact position P3) opposite from the area A2, enabling selection alternately with the thumb and with the other finger. Selecting from the list of albums in the area A3 (contact position P3) invokes a display of the titles of the songs in the selected album in a peripheral part of the screen (area A4 or contact position P4) opposite from the area A3, thereby enabling selection alternately with the thumb and with the other finger. This manner of selecting alternately with the thumb and with another finger enables the user to select a series of menu items to sequentially move down to a hierarchically lower level through the hierarchically structured menus and submenus.
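The alternating placement of the successive lists (areas A1 to A4) can be sketched, as an illustration only, by placing each new list in the peripheral area opposite the previous selection; the angle parameterization is a hypothetical simplification:

```python
def next_list_position(prev_angle_deg: float) -> float:
    """Each successive list (artists, albums, song titles) appears in
    the peripheral area opposite the previous selection, so the user
    alternates between the thumb and another finger."""
    return (prev_angle_deg + 180.0) % 360.0


def plan_positions(start_angle_deg: float, depth: int):
    """Angular positions A1, A2, ... for `depth` hierarchical lists."""
    positions = [start_angle_deg]
    for _ in range(depth - 1):
        positions.append(next_list_position(positions[-1]))
    return positions
```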
The terminal device 10 may be configured so that the user can proceed to a next page or move down sequentially through the list by touching the "1" area at the bottom of each list.
The terminal device 10 may be configured so that the user can return to the initial screen of the list of artists, the list of albums, and the list of song titles by touching the “Artist,” “Album,” or “Song Title” areas respectively.
Each of these modes displays operation menus on the periphery of the screen to enable inputs on the edge. That in turn prevents the display contents on the screen (e.g., information on the song being played and the list of songs) from being hidden behind displayed keys and fingers, thereby ensuring visibility. The modes also allow for selection of a key on the edge of the screen. That can reduce wrong inputs (wrong button operations) over cases where small input keys are crammed on the screen, thereby improving operability.
Operation examples for the terminal device 10 in accordance with Embodiment 3 will be described in reference to
Portions (a) and (b) of
Portion (c) of
Portions (a) and (b) of
Candidate conversions (menu items) may be displayed based on inputted characters as shown in portion (c) of
If there are many candidates, as with candidate conversions, the user may need to scroll the list or jump to its next page. This, however, would involve the same operation as a single tap or a release of the finger. It is therefore preferable to "input" by a double tap, a swipe into the screen, or a release of the thumb off the screen. Alternatively, if, after a candidate conversion is tentatively selected by a single tap or a release of the finger, "scroll/next page" is touched again in the right peripheral side, the tentatively selected candidate conversion may be deselected to allow subsequent operations. If an operation is performed on the thumb side to input a next character after a candidate conversion is tentatively selected, the tentatively selected candidate conversion may be "inputted."
Portions (a) and (b) of
Next, input candidates (menu items), each being a single word, may be displayed as shown in portion (c) of
In the mode shown in portions (a) and (b) of
Portions (c) and (d) of
Portions (a) and (b) of
In response to the input of the first letter on the thumb side (first finger side or area A1), letters that are likely to follow are selectively displayed on the opposite side (area A2) for selection with another finger (second finger). If there is no candidate, the user can input another letter on the thumb side. Subsequent input letter candidates may be displayed only on the other finger's side or alternately on the thumb side and on the other finger's side.
Portions (c) and (d) of
In response to the input of the first letter on the thumb side (first finger side or area A1), letters that are likely to follow are selectively displayed on the opposite side (area A2) for selection with another finger (second finger). If there is no candidate, the user can input another letter on the thumb side.
The mode shown in portions (a) and (b) of
Input letters and words can be predicted, for example, by preparing, in advance, dictionary data containing frequently used common words and retrieving candidates from that data or by presenting candidates based on the user's input history.
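The prediction scheme just described might be sketched, as a hypothetical illustration, as a prefix lookup that prefers the user's input history over a prepared dictionary of frequently used common words:

```python
def predict_candidates(prefix, dictionary, history=None, limit=5):
    """Retrieve input candidates for a typed prefix, preferring words
    from the user's input history over a prepared dictionary of
    frequently used common words."""
    history = history or []
    hits = [w for w in history if w.startswith(prefix)]
    hits += [w for w in dictionary if w.startswith(prefix) and w not in hits]
    return hits[:limit]
```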
Each of these modes displays letter input keys on the periphery of the screen of the display unit 3 to enable manipulation of the keys on the edge. That in turn prevents the display contents (email body) on the screen from being hidden behind displayed keys and fingers, thereby ensuring visibility. The modes also allow for selection of a key on the edge of the screen. That can reduce wrong inputs (wrong button operations) over cases where small input keys are crammed on the screen, thereby improving operability.
Operation examples for the terminal device 10 in accordance with Embodiment 4 will be described in reference to
Portion (b) of
Portion (c) of
Portion (d) of
Each of these examples displays operation menus on the periphery of the screen of the display unit 3 to enable operations on the edge. That in turn prevents the display contents on the screen (Web pages) from being hidden behind displayed keys and fingers, thereby ensuring visibility. The examples also allow for selection of a key on the edge of the screen. That can reduce wrong inputs (wrong button operations) over cases where small input keys are crammed on the screen, thereby improving operability.
Variation examples of display items (menu items) in the main menu and submenus will be described in reference to
The control blocks of the terminal device 10 (particularly, the detection unit controller 21, the setup unit 22, and the display control unit 23) may be implemented with logic circuits (hardware) fabricated, for example, on an integrated circuit (IC chip) or may be implemented by software running on a CPU (central processing unit).
In the latter case, the terminal device 10 includes, among others, a CPU that executes instructions from programs or software by which various functions are implemented, a ROM (read-only memory) or like storage device (referred to as a "storage medium") containing the programs and various data in a computer-readable (or CPU-readable) format, and a RAM (random access memory) into which the programs are loaded. The computer (or CPU) retrieves and executes the programs contained in the storage medium, thereby achieving the object of the present invention. The storage medium may be a "non-transitory tangible medium" such as a tape, a disc, a card, a semiconductor memory, or programmable logic circuitry. The programs may be supplied to the computer via any transmission medium (e.g., over a communications network or by broadcasting waves) that can transmit the programs. The present invention encompasses data signals on a carrier wave that are generated during electronic transmission of the programs.
The input device (the terminal device 10) in accordance with aspect 1 of the present invention is directed to an input device for receiving an input from a user on an outer edge of a casing of the input device, the input device including: a detection unit (1) configured to detect a contact position of a first finger of the user on the outer edge; and a second setup unit (setup unit 22) configured to set up, by using as a reference a position opposite from the contact position of the first finger of the user detected by the detection unit, a second input area where an input made with a second finger of the user is received.
According to this configuration, the second input area for the second finger is set up across from the contact position of the first finger of the user on the outer edge of the casing. That in turn can improve operability in an input operation that involves use of two or more fingers.
The input device in accordance with aspect 2 of the present invention may further include a first setup unit (setup unit 22) configured to set up, in or near the contact position of the first finger detected by the detection unit in aspect 1, a first input area where an input made with the first finger is received. According to this configuration, an input made with the first finger can also be received as well as an input made with the second finger. Therefore, two or more inputs can be received.
The input device in accordance with aspect 3 of the present invention may be configured so that in aspect 2, the first setup unit and the second setup unit alternately set up the first input area and the second input area respectively. That can improve operability in an input operation that involves use of two or more fingers.
The input device in accordance with aspect 4 of the present invention may be configured so that in aspect 2 or 3, a slide operation or a scroll operation with the second finger is enabled in the second input area while the first finger is touching the first input area. According to this configuration, operability can be improved in an input operation that involves use of two or more fingers.
The input device in accordance with aspect 5 of the present invention may further include, in aspect 2, a display control unit (23) configured to cause a first input-use image prompting the user to make an input in the first input area with the first finger to be displayed in or near the first input area. According to this configuration, the first input-use image is displayed in or near the first input area. That in turn enables the user to visually recognize the first input-use image so that the user can make an input in the first input area while visually recognizing that image.
The input device in accordance with aspect 6 of the present invention may be configured so that in aspect 5, the display control unit is further configured to cause a second input-use image prompting the user to make an input in the second input area with the second finger to be displayed in or near the second input area in response to an input in the first input area.
According to this configuration, the second input-use image is displayed in or near the second input area in response to an input in the first input area. That in turn enables the user to visually recognize the second input-use image upon that input so that the user can make an input in the second input area while visually recognizing that image.
In addition, the second input-use image is not displayed in or near the second input area until an input is made in the first input area. The user therefore cannot recognize the presence of the second input area, and hence cannot make an input there, before making an input in the first input area. Thus, no input is accepted in the second input area until the user has made an input in the first input area. The configuration can hence prevent malfunctions that could be caused if inputs were permitted in more than one location.
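The gating described in aspects 5 and 6 can be sketched as a small state machine. The class and method names below are hypothetical; the point is that displaying the second input-use image and enabling input in the second area occur together, only in response to an input in the first area.

```python
# Illustrative sketch (hypothetical names) of the aspect 5/6 gating.
class DisplayControlUnit:
    def __init__(self):
        self.second_image_visible = False

    def show_first_image(self):
        # The first input-use image is displayed in or near the
        # first input area from the start.
        return "first_image"

    def on_first_area_input(self):
        # The second input-use image appears only in response to an
        # input in the first input area (aspect 6); showing it doubles
        # as enabling input in the second area.
        self.second_image_visible = True

    def second_area_enabled(self):
        return self.second_image_visible
```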
The input device in accordance with aspect 7 of the present invention may be configured so that in aspect 6: the second input-use image includes a plurality of menu items; and in response to the second finger being released from the second input area while the second finger is being slid over the second input area, a menu item associated with a position where the second finger is released is selected. According to this configuration, operability can be improved in an input operation that involves use of two or more fingers.
The input device in accordance with aspect 8 of the present invention may be configured so that in aspect 6: the second input-use image includes a plurality of menu items; and in response to the first finger being released from the first input area while the second finger is touching the second input area, a menu item associated with a position where the second finger is touching the second input area is selected. According to this configuration, operability can be improved in an input operation that involves use of two or more fingers.
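The selection step common to aspects 7 and 8 is essentially a mapping from a finger position within the second input area onto one of the displayed menu items. The following sketch uses assumed names and a linear layout of menu items along the area; the patent leaves the layout unspecified.

```python
# Illustrative sketch (hypothetical names) of aspect 7/8 item selection.
def select_menu_item(menu_items, area_start, area_end, finger_position):
    """Map a finger position inside the second input area onto the menu
    item of the second input-use image displayed at that position."""
    span = area_end - area_start
    if span <= 0 or not (area_start <= finger_position <= area_end):
        return None  # position outside the second input area
    index = int((finger_position - area_start) / span * len(menu_items))
    index = min(index, len(menu_items) - 1)  # clamp the area's far end
    return menu_items[index]
```

In aspect 7 the position passed in is where the second finger is released after a slide; in aspect 8 it is where the second finger is touching when the first finger is released.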
The input device in accordance with aspect 9 of the present invention may further include, in aspect 2, a display control unit configured to cause a first input-use image prompting the user to make an input in the first input area with the first finger to be displayed in or near the first input area and further configured to cause a second input-use image prompting the user to make an input in the second input area with the second finger to be displayed in or near the second input area in response to an input in the first input area, wherein the first input-use image and the second input-use image are alternately displayed if the detection unit alternately detects the contact position of the first finger and a contact position of the second finger. According to this configuration, the first input-use image and the second input-use image are alternately displayed by making inputs alternately with the first finger and the second finger. That in turn enables the user to visually recognize the first input-use image and the second input-use image alternately upon such inputs, so that the user can make inputs alternately in the first input area and in the second input area while visually recognizing those images. This can improve operability in an input operation that involves use of two or more fingers.
The input device in accordance with aspect 10 of the present invention may be configured so that in aspect 6, the second input-use image includes a submenu associated with a main menu shown in the first input-use image prompting the user to make an input in the first input area with the first finger. According to this configuration, the submenu is displayed in or near the second input area, triggered by an input made with the first finger in the first input area as prompted by the main menu. That can improve the visibility of the menus and the operability of the input device.
The input device in accordance with aspect 11 of the present invention may be configured so that in aspect 9, the display control unit is configured to cause hierarchically lower-level submenus to be displayed in accordance with a sequence in which the first input-use image and the second input-use image are alternately displayed. This configuration enables selection of menu items in hierarchically lower-level submenus in accordance with the sequence in which the first input-use image and the second input-use image are alternately displayed, which can improve operability in an input operation that involves use of two or more fingers.
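Aspects 9 through 11 together describe alternating inputs that walk down a menu hierarchy, with the two input-use images taking turns showing each level. A minimal sketch, under an assumed nested-dictionary menu structure and hypothetical names:

```python
# Illustrative sketch (hypothetical names and menu tree) of aspects 9-11.
MENU_TREE = {
    "Settings": {"Display": {"Brightness": {}, "Contrast": {}},
                 "Sound": {"Volume": {}}},
}


class HierarchicalMenuController:
    def __init__(self, tree):
        self.level = tree            # submenu shown by the current image
        self.path = []               # items selected so far
        self.active_image = "first"  # which input-use image shows the level

    def choose(self, item):
        """Selecting an item descends one level in the hierarchy and
        swaps which input-use image displays the next submenu."""
        self.path.append(item)
        self.level = self.level[item]
        self.active_image = ("second" if self.active_image == "first"
                             else "first")
        return self.level
```

Each alternation thus shows a hierarchically lower-level submenu, in the sequence in which the first and second input-use images are displayed.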
The input device in accordance with aspect 12 of the present invention may be configured so that in any of aspects 1 to 11, the detection unit is stacked on a display unit in the casing to detect a target object touching or approaching a display screen of the display unit and also detect the first finger or the second finger touching or approaching the outer edge. This configuration enables the detection unit, which is stacked on the display unit in the casing and which also detects a target object touching or approaching the display screen of the display unit, to detect the first or the second finger touching or approaching the outer edge. Therefore, no new detection member needs to be provided to detect touching or approaching of the outer edge. That in turn can reduce the parts count.
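Aspect 12 can be illustrated by classifying each touch reported by the single stacked detection unit as either an on-screen touch or an outer-edge touch. The round-face geometry, radii, and function name below are assumptions for illustration only:

```python
# Illustrative sketch (assumed geometry) of aspect 12: one detection
# unit stacked on the display serves both the screen and the outer edge,
# so no separate edge sensor is required.
SCREEN_RADIUS = 100  # detectable radius in pixels (assumed round face)
EDGE_BAND = 10       # outermost ring treated as the "outer edge"


def classify_touch(x, y):
    """Return 'edge' for touches in the outer ring, 'screen' for inner
    touches, or None if the point is outside the detectable surface."""
    r2 = x * x + y * y
    if r2 > SCREEN_RADIUS ** 2:
        return None
    if r2 >= (SCREEN_RADIUS - EDGE_BAND) ** 2:
        return "edge"
    return "screen"
```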
The input device in accordance with aspect 13 of the present invention may be configured so that in any of aspects 1 to 11, the detection unit is disposed on a side face of the casing. This configuration enables the detection unit, disposed on a side face of the casing, to detect the first or the second finger touching or approaching the outer edge.
A wearable terminal in accordance with aspect 14 of the present invention preferably includes the input device in any of aspects 1 to 13. This configuration provides a wearable terminal that can improve operability in an input operation that involves use of two or more fingers.
A mobile terminal in accordance with aspect 15 of the present invention preferably includes the input device in any of aspects 1 to 13. This configuration provides a mobile terminal that can improve operability in an input operation that involves use of two or more fingers.
A method of controlling an input device in accordance with aspect 16 of the present invention is directed to a method of controlling an input device for receiving an input from a user on an outer edge of a casing of the input device, the method including: (a) detecting a contact position of a first finger of the user on the outer edge; and (b) setting up, by using as a reference a position opposite the contact position of the first finger detected in step (a), a second input area where an input made with a second finger of the user is received. This method achieves the same effects as aspect 1.
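Steps (a) and (b) of the aspect 16 method can be sketched directly. The event representation and area width below are assumed for illustration; the edge is again modeled as 360 angular positions:

```python
# Illustrative sketch (assumed names) of the aspect 16 control method.
EDGE = 360  # outer edge modeled as angular positions 0..359


def control_method(touch_events):
    """touch_events: iterable of (kind, position) tuples."""
    # (a) detect the contact position of the first finger on the edge
    first_position = next(p for kind, p in touch_events if kind == "touch")
    # (b) set up the second input area, using the position opposite the
    #     detected contact position as a reference
    opposite = (first_position + EDGE // 2) % EDGE
    second_area = ((opposite - 45) % EDGE, (opposite + 45) % EDGE)
    return first_position, second_area
```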
A control program for an input device in accordance with aspect 17 of the present invention may be directed to a control program for controlling an operation of an input device in aspect 1, the control program causing a computer to operate as the second setup unit in the input device.
The input device in each aspect of the present invention may be implemented on a computer. When this is the case, the present invention encompasses programs, for controlling the input device, which when run on a computer cause the computer to function as the units (software elements) of the input device so as to implement the input device, and also encompasses computer-readable storage media storing such a program.
The present invention is not limited to the description of the embodiments above, but may be altered within the scope of the claims. An embodiment based on a proper combination of technical means disclosed in different embodiments is encompassed in the technical scope of the present invention. Furthermore, new technological features can be created by combining technological means disclosed in different embodiments.
The present invention is applicable, for example, to input devices receiving user inputs on an outer edge of the casing thereof, wearable terminals including such an input device, and mobile terminals including such an input device.
Number | Date | Country | Kind |
---|---|---|---|
2014-254477 | Dec 2014 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2015/077830 | Sep 30, 2015 | WO | 00 |