The present invention relates to a system, method and program for realizing a user interface based on finger identification.
Various techniques have been proposed for user interfaces that allow users to interact with computers. One of these techniques is realized by identifying each finger of the user's hand and assigning a function to each.
The basic example of such a technique is to assign a different function to each finger. If fixed functions are assigned, only 10 functions are available for 10 fingers. Since 10 functions are not sufficient for many applications, it is necessary to switch the sets of assigned functions appropriately. To use the assigned functions smoothly, it is important to make the correspondence between functions and fingers easy to understand.
Patent document 1 describes a technology that associates a TV function with a finger. When said finger is in contact with the touch screen 2, the technology associates a cancel function or a setting function of the TV function with another finger (FIG. 21, etc.).
Patent document 1: International Publication No. 2006/104132.
Further improvements can be made in the user interface that identifies the user's fingers and assigns functions to multiple identified fingers.
The present invention was made in view of the above, and its object is to realize a more intuitive and efficient user interface, based on the identification of each finger of the user, in a system, method or program for providing a user interface to a user.
The first aspect of the present invention is a method for providing a user interface to a user, comprising steps of: identifying a plurality of fingers of the user; assigning a mode switching function to a first finger of the identified plurality of fingers; and switching each of functions assigned to second and third fingers of the plurality of fingers to another function in response to a touch action by the first finger; wherein at least one of the second finger and the third finger is different from the first finger.
Also, the second aspect of the present invention is the method according to the first aspect, wherein the touch action is a touch action on an object.
Also, the third aspect of the present invention is the method according to the second aspect, wherein said another function is one of a mode switching function, a parameter control function and an object selection function.
Also, the fourth aspect of the present invention is the method according to the second aspect, wherein a function assigned to the second finger after switching is a parameter control function, and a function assigned to the third finger after switching is a function to change the parameter control function.
Also, the fifth aspect of the present invention is the method according to the second aspect, wherein a function assigned to the second finger after switching is a parameter control function, the method further comprising a step of assigning a function to change the parameter control function to a finger different from the second finger of the plurality of fingers in response to a touch action by the second finger.
Also, the sixth aspect of the present invention is the method according to the second aspect, wherein a function assigned to the second finger after switching is a command processing function.
Also, the seventh aspect of the present invention is the method according to the sixth aspect, wherein the command processing function is one of an editing function, a conversion function, a search function, a save function, a copy function, a computation function, a transmission function, and any combination of these functions.
Also, the eighth aspect of the present invention is the method of any of the first to fourth aspects, wherein the second finger is the same finger as the first finger.
Also, the ninth aspect of the present invention is the method of any of the first to eighth aspects, further comprising a step of returning the functions of the second finger and the third finger to pre-switching functions when the touch action of the first finger is released.
Also, the tenth aspect of the present invention is the method of any of the first to eighth aspects, further comprising a step of returning the functions of the second finger and the third finger to pre-switching functions when the touch action of the first finger is released within a predetermined time.
Also, the eleventh aspect of the present invention is the method of any of the second to eighth aspects, wherein the selection state of the object is maintained even after the touch action is released.
Also, the twelfth aspect of the present invention is the method according to the eleventh aspect, further comprising a step of switching each of the functions assigned to the second finger and the third finger to a function different from functions before and during the touch action in response to the release of the touch action.
Also, the thirteenth aspect of the present invention is the method according to the eleventh or twelfth aspect, further comprising steps of terminating the selection state in response to a touch action by one of the plurality of fingers; and switching each of the functions assigned to the second finger and the third finger to its pre-switching function in response to the termination of the selection state.
Also, the fourteenth aspect of the present invention is the method of any of the first to thirteenth aspects, wherein a virtual hand representing the plurality of identified fingers is displayed on a display used by the user, together with an icon or label representing functions assigned to each finger.
Also, the fifteenth aspect of the present invention is the method according to the fourteenth aspect, wherein the label is displayed only when a finger associated with the label is positioned at or above a predetermined height from the desk used by the user.
Also, the sixteenth aspect of the present invention is the method according to the fourteenth aspect, wherein when the user's finger is outside an area of a screen of the display, the icon or the label is displayed at an edge of the screen.
Also, the seventeenth aspect of the present invention is the method of any of the first through sixteenth aspects, wherein the plurality of fingers of the user includes one of the fingers of the right hand of the user and one of the fingers of the left hand of the user.
Also, the eighteenth aspect of the present invention is the method of any of the first to seventeenth aspects, wherein at least one of the first finger, the second finger, and the third finger is a set of fingers including a plurality of fingers.
Also, the nineteenth aspect of the present invention is a program for causing a computer to perform a method of providing a user interface to a user, the method comprising steps of: identifying a plurality of fingers of the user; assigning a mode switching function to a first finger of the identified plurality of fingers; and switching each of functions assigned to second and third fingers of the plurality of fingers to another function in response to a touch action by the first finger; wherein at least one of the second finger and the third finger is different from the first finger.
Also, the twentieth aspect of the present invention is an apparatus for providing a user interface with an application to a user, configured to: identify a plurality of fingers of the user; assign a mode switching function to a first finger of the identified plurality of fingers; and switch each of functions assigned to second and third fingers of the plurality of fingers to another function in response to a touch action by the first finger; wherein at least one of the second finger and the third finger is different from the first finger.
According to one aspect of the present invention, an intuitive and efficient user interface can be realized based on the identification of each finger of the user.
Embodiments of the present invention are explained with reference to the figures. All figures are illustrative. The present specification discloses a system, method, and program regarding an interaction with a computer realized by identifying multiple fingers of a user and performing multiple touch actions using the identified fingers (hereinafter also referred to as “finger-identification multi-touch interaction”).
As a guide to the user (101), a schematic representation of the user's hand (hereinafter also referred to as a “virtual hand”) is preferably displayed on a display (104). It is preferable that the virtual hand and the application screen are superimposed on the display (104) so that the user can directly perform actions such as shape editing on the application screen. To improve the accuracy of the image recognition of the hands by the camera (102), a sheet of a single color may be placed under both hands of the user (101). Black is preferred for said color. A tactile feedback (haptics) device (not shown) may be placed on the desk to provide feedback for touch or other actions by the user. A device may also be provided that gives feedback on user input by sound, light, or other means. The display (104) and the surface for actions may be integrated into a single structure.
The camera (102) is preferably equipped with a depth sensor to increase the accuracy of finger position recognition. It is also preferable that the camera captures color images to increase the accuracy of finger recognition. Input by the user (101) is preferably performed not only with one hand but also with both hands. Recognition of the finger operations of the user (101) may be realized by any recognition method, such as a conventional touch screen, a touch pad, a smart glove with built-in sensors, or a combination of these, instead of capturing by the camera (102). The functionality of the computer (103) may be realized by an external server, such as a cloud. The computer (103) may be equipped with other input devices (mouse, keyboard, etc., not shown) for auxiliary use.
Here, a “touch action” is typically the action of lowering a floating finger to the surface of a desk, but it may also be a contact with an object other than a desk, such as a touch screen, touch pad, or other part of the body such as the thigh/other hand, etc. and may be realized in any way (these actions are also referred to as “touch actions”). A touch action may be performed simultaneously with multiple fingers as a set of fingers. In addition to a contact, the fact that a finger is pushed in with a pressure above a certain level may be detected as a touch action.
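The detection rule described above can be sketched as follows. This is an illustrative sketch, not part of the specification; the threshold values and function names are hypothetical.

```python
# Hypothetical thresholds: a touch action is registered when the fingertip
# height above the action surface falls below a contact threshold, or when
# a measured push-in pressure exceeds a certain level.
CONTACT_HEIGHT_MM = 2.0      # assumed height threshold (mm)
PRESSURE_THRESHOLD = 0.5     # assumed normalized pressure threshold

def is_touch_action(height_mm, pressure=None):
    """Return True if the finger state counts as a touch action."""
    if pressure is not None and pressure >= PRESSURE_THRESHOLD:
        return True  # push-in with pressure above a certain level
    return height_mm <= CONTACT_HEIGHT_MM
```

The same predicate applies whether the surface is a desk, a touch screen, or a part of the body; only the height/pressure source differs.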
It is not necessary that each part of the control program (200) corresponds to a single program module. Some or all of the functions may be realized by linking the control program (200) with a program such as an application. Examples of such programs include a web browser, JavaScript (registered trademark) that can run on a web browser, a native program that can be linked with the JavaScript, etc. In addition, some or all of the functions may be performed on a peripheral device such as a camera, or on an external computer such as a cloud. It can also be stored on a computer-readable storage medium to form a non-transitory program product.
The finger-identification multi-touch interaction according to an embodiment of the present invention may display icons or labels on the fingertips of the virtual hand representing fingers identified by a camera or other means, to represent the functions assigned to the fingers. This, together with the temporary mode switching described below, allows more types of direct actions than the number of fingers to be performed on the object of the actions. It may also be possible to select functions in a hierarchical manner by switching the assignment of a group of functions by means of temporary mode switching. In addition, the mode can also be switched by selecting an object of an action, to enable context-sensitive function selection.
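The mode-based assignment can be pictured as a mapping from identified fingers to functions, with the mode-switch finger pushing a new mapping while its touch continues. The following is a minimal sketch under assumed finger and function names; it is not the claimed implementation.

```python
# Hypothetical finger-to-function mappings. "thumb" carries the
# mode-switching function; touching it swaps the active mapping,
# releasing it restores the pre-switching assignment.
BASE_MODE = {"index": "select", "middle": "move", "thumb": "mode_switch"}
EDIT_MODE = {"index": "rotate", "middle": "scale", "thumb": "mode_switch"}

class FingerUI:
    def __init__(self):
        self.mode_stack = [BASE_MODE]  # stack enables hierarchical selection

    def active_function(self, finger):
        return self.mode_stack[-1][finger]

    def on_touch(self, finger):
        if self.active_function(finger) == "mode_switch":
            self.mode_stack.append(EDIT_MODE)  # temporary mode switch

    def on_release(self, finger):
        if self.active_function(finger) == "mode_switch" and len(self.mode_stack) > 1:
            self.mode_stack.pop()  # restore pre-switching functions
```

Stacking mappings rather than overwriting them is one way to realize the hierarchical function selection mentioned above.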
Here, even if the fingertip is outside the screen area, as long as it is within the recognition range of the fingertip position recognition means such as the camera (102), it is preferable to display the icon at the edge of the screen so that the user can see it and touch it. It is preferable that the vertical coordinate of the icon (at the left or right screen edge) or its horizontal coordinate (at the top or bottom screen edge) moves in conjunction with the corresponding fingertip. This seamlessly presents an icon as the finger moves out of or into the screen, and prevents the screen edge from being permanently obscured by an icon. When the fingertip is outside the recognition range of the camera (102) or other fingertip position recognition means, it is preferable that the icon or label is hidden, or changed in color or transparency, to indicate to the user that the finger is out of recognition range.
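The edge behavior described above amounts to clamping the icon to the screen rectangle, which automatically keeps the along-edge coordinate linked to the fingertip. A minimal sketch, with assumed pixel coordinates:

```python
# Clamp an icon to the screen rectangle: an off-screen fingertip's icon
# sticks to the nearest edge, while the coordinate along that edge still
# follows the fingertip (so the icon is not fixed at one spot).
def icon_position(finger_xy, width, height):
    x, y = finger_xy
    return (min(max(x, 0), width), min(max(y, 0), height))
```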
The icons displayed with the virtual hand may be made larger to facilitate visual exploration. In the case of a conventional user interface such as a toolbar, if the icons are large, they will occupy a fixed amount of screen space, but in the finger identification multi-touch interaction according to the present embodiment, even if the icons are displayed relatively large, they can be moved simply by moving the hand, so the screen space is not occupied in a fixed manner. It is preferable to enable displaying larger icons for users with poor eyesight or beginners, and smaller icons for skilled users.
It is preferable to display a cursor on one or more fingertips to indicate that object selection is possible, or to indicate a hotspot or area for selection. This is because, in addition to the fingers for object selection, there are fingers for command processing or mode switching, and it is necessary to clarify which fingers are available for object selection. For a finger not used for object selection, it is preferable to display a distinct cursor indicating so, or to display only an icon.
If the user is a novice, it is preferable to display both icons and labels, because the user searches for functions by visual or direct exploration. For skilled users, however, selection can be made without looking, so the labels need not always be presented; it is preferable to hide the labels and display them only when fingers are positioned above a certain height from the desk, so that skilled users can check them immediately in case they forget.
In the basic process of temporary mode switching, when the touch action with the finger that activated the temporary mode is released, the mode is restored and the function group assignment to the fingers returns to the state before the switch. Alternatively, the mode may be switched permanently when the touch action is shorter than a predetermined time (tap action), and returned to the pre-switching state upon release of the touch action when the touch action lasts for the predetermined time or longer (long press action). Although the functions assigned to multiple fingers are mainly described as being switched on a mode-by-mode basis, it is also conceivable, when predetermined conditions are met, to return only some of the multiple fingers to the pre-switching state, or not to return to the pre-switching state at all.
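The tap/long-press distinction can be expressed as a simple timing rule on release. The threshold value below is an assumption for illustration:

```python
# Assumed threshold separating a tap from a long press.
TAP_THRESHOLD_S = 0.3

def mode_after_release(duration_s, current_mode, previous_mode):
    """Decide which mode is active after the mode-switch finger is released."""
    if duration_s < TAP_THRESHOLD_S:
        return current_mode    # tap: the mode switch becomes permanent
    return previous_mode       # long press: revert to the pre-switching mode
```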
Even if the touch action is released, the selection state of the object of an action may be continued. In this case, control may be performed such that the mode is not restored even after the touch action is released. For example, the selection state of an object may be terminated when a touch action is performed again on the object in question, or when that touch action is released. Any other action may be used to terminate the selection state. The mode may then be restored in response to the termination of the selection state. Alternatively, if the touch action is released while an object is selected, control may be performed to assign a different set of functions to the fingers than before the selection of the object in question; the fact that an object is selected means that further actions on that object are likely. Furthermore, when the touch action is released while the object is selected, control may be performed to assign to the fingers a set of functions different from those before and during the touch action on the object in question.
Multiple objects may be selectable, and the function group assigned to each finger may be switched according to the types of multiple objects selected. In this case, the switching of the function group can be performed, for example, at the time when multiple objects are selected, at the time when additional objects are selected while multiple objects are selected, at the time when the selection state of any object ends while multiple objects are selected, or at the time when a touch action is performed on any of the objects in a state in which multiple objects are selected.
In the object selection state, control may be performed such that a touch action on another object with the finger to which the selection function is assigned (e.g., right index finger) results in an additional selection while the touch action of the specific finger (e.g., right thumb) is continued, if necessary. Alternatively, in the state of object selection, while continuing the touch action of a specific finger as necessary, another object may be additionally selected by dragging the finger used to select that object to the area where another object exists. A multiple selection function may be assigned to a finger so that multiple touch actions may be performed with the finger and multiple objects existing within a rectangular, polygonal, circular or elliptical area connecting the points where those touch actions were performed may be selected. In addition, a multiple selection function may be assigned to a finger so that a drag operation may be performed with the finger and multiple objects that exist within the area enclosed by the locus may be selected. The function of the finger to which a selection function is assigned may be switched to a multiple selection function by continued touch action of a specific finger.
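The area-based multiple selection described above (selecting all objects inside the polygon connecting successive touch points) can be sketched with a standard ray-casting point-in-polygon test. Object representation and function names here are assumptions for illustration:

```python
# Ray-casting test: count edge crossings of a horizontal ray from the point.
def point_in_polygon(pt, poly):
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge spans the ray's y level
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def select_in_area(objects, touch_points):
    """Select objects lying inside the polygon of successive touch points."""
    return [o for o in objects if point_in_polygon(o["pos"], touch_points)]
```

The same predicate serves the drag-locus variant: the locus points simply form the polygon.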
Here, a hover action may be used instead of a touch action as a trigger for object selection.
More generally, if a parameter control function is assigned to a specific finger, another finger can be assigned a function to change that parameter control function; while the touch action with that other finger continues, at least one of the type of parameter determined by the action of the specific finger, the number of parameters, and the unit of increase or decrease of the value of each parameter can be changed. An example of changing the number of parameters is changing from a two-dimensional rotation angle to a three-dimensional rotation angle. The change function may be assigned to the other finger at the time the parameter control function is assigned to the specific finger, or alternatively in response to the detection of a touch action by the specific finger.
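A parameter control function can be described by the set of parameters it drives and its unit step; the change function then swaps in a different description while the other finger's touch continues. The variants and step values below are illustrative assumptions:

```python
# Hypothetical parameter-control descriptions: controlled parameters plus
# the unit of increase/decrease per drag unit.
ROTATE_2D = {"params": ("angle",), "step": 1.0}
ROTATE_3D = {"params": ("pitch", "yaw", "roll"), "step": 1.0}   # more parameters
ROTATE_2D_FINE = {"params": ("angle",), "step": 0.1}            # finer unit

def active_control(change_finger_touching, base, variant):
    """Variant applies only while the change finger's touch continues."""
    return variant if change_finger_touching else base

def apply_drag(values, delta_units, control):
    """Increment each controlled parameter by delta * unit step."""
    return {p: values.get(p, 0.0) + delta_units * control["step"]
            for p in control["params"]}
```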
In the above description, the command processing is mainly exemplified by editing objects, but it can be processing other than editing. Examples of such processing include conversion such as translation of selected text, web search based on selected text, saving such as exporting the selected object to a file, copying the selected object, computation based on the selected text, transmission of the selected object, etc., and any combination of these. For example, when an object is selected and moved with a specific finger (e.g., the index finger of the right hand), i.e., when an object is selected with a finger to which the selection function is assigned and the movement function is assigned to that finger, a combined control may be performed such that moving after a touch action with another finger (e.g., the ring finger of the right hand) copies the object and then moves it, instead of the editing processing of simply moving the object.
Number | Date | Country | Kind
---|---|---|---
2022-006752 | Jan 2022 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2023/001599 | 1/19/2023 | WO |