Embodiments of the present invention relate to user input. In particular, they relate to processing input patterns detected by a touch sensitive display device.
Some electronic devices, such as mobile telephones, include a touch sensitive display. Typically, a user provides input by touching the touch sensitive display with a fingertip. For example, a user may navigate through a menu by selecting graphical items using a fingertip.
According to some, but not necessarily all, embodiments of the invention, there is provided an apparatus, comprising: at least one processor; and at least one memory storing computer program instructions, the at least one processor being configured to execute the computer program instructions to cause the apparatus at least to perform: processing an input pattern, detected at an instance in time by a touch sensitive display, to discriminate between a fingertip input pattern and an elongate input pattern; performing a first action after discriminating that the input pattern corresponds with the fingertip input pattern; and performing a second action after discriminating that the input pattern corresponds with the elongate input pattern, wherein the second action is different to the first action.
According to some, but not necessarily all, embodiments of the invention, there is provided a method, comprising: processing an input pattern, detected at an instance in time by a touch sensitive display, to discriminate between a fingertip input pattern and an elongate input pattern; performing a first action after discriminating that the input pattern corresponds with the fingertip input pattern; and performing a second action after discriminating that the input pattern corresponds with the elongate input pattern, wherein the second action is different to the first action.
According to some, but not necessarily all, embodiments of the invention, there is provided a computer program comprising computer program instructions that, when executed by at least one processor, cause an apparatus at least to perform: processing an input pattern, detected at an instance in time by a touch sensitive display, to discriminate between a fingertip input pattern and an elongate input pattern; performing a first action after discriminating that the input pattern corresponds with the fingertip input pattern; and performing a second action after discriminating that the input pattern corresponds with the elongate input pattern, wherein the second action is different to the first action.
According to some, but not necessarily all, embodiments of the invention, there is provided an apparatus, comprising: means for processing an input pattern, detected at an instance in time by a touch sensitive display, to discriminate between a fingertip input pattern and an elongate input pattern; means for performing a first action after discriminating that the input pattern corresponds with the fingertip input pattern; and means for performing a second action after discriminating that the input pattern corresponds with the elongate input pattern, wherein the second action is different to the first action.
For a better understanding of various examples of embodiments of the present invention, reference will now be made, by way of example only, to the accompanying drawings in which:
Embodiments of the invention relate to processing input patterns detected by a touch sensitive display. In particular, they relate to discriminating between an input provided at the touch sensitive display using a fingertip and an input provided at the touch sensitive display using a side edge of a hand.
The Figures illustrate an apparatus 10/30, comprising: at least one processor 12; and at least one memory 14 storing computer program instructions 18, the at least one processor 12 being configured to execute the computer program instructions 18 to cause the apparatus 10/30 at least to perform: processing an input pattern, detected at an instance in time by a touch sensitive display 22, to discriminate between a fingertip input pattern 80 and an elongate input pattern 90/90a/90c; performing a first action after discriminating that the input pattern corresponds with the fingertip input pattern 80; and performing a second action after discriminating that the input pattern corresponds with the elongate input pattern 90/90a/90c, wherein the second action is different to the first action.
The processor 12 is configured to read from and write to the memory 14. The processor 12 may also comprise an output interface via which data and/or commands are output by the processor 12 and an input interface via which data and/or commands are input to the processor 12.
Although the memory 14 is illustrated as a single component it may be implemented as one or more separate components, some or all of which may be integrated/removable and/or may provide permanent/semi-permanent/dynamic/cached storage.
The memory 14 stores a computer program 16 comprising computer program instructions 18 that control the operation of the apparatus 10/30 when loaded into the processor 12. The computer program instructions 18 provide the logic and routines that enable the apparatus 10/30 to perform the method illustrated in
The computer program 16 may arrive at the apparatus 10/30 via any suitable delivery mechanism 40. The delivery mechanism 40 may be, for example, a tangible computer-readable storage medium, a computer program product, a memory device, a record medium such as a CD-ROM, DVD or Blu-Ray disc, or any article of manufacture that tangibly embodies the computer program 16. The delivery mechanism 40 may be a signal configured to reliably transfer the computer program 16.
The apparatus 30 illustrated in
The processor 12 is configured to provide outputs to the touch sensitive display 22 and the radio frequency transceiver 24. The processor 12 is configured to receive inputs from the radio frequency transceiver 24 and the touch sensitive display 22.
The memory 14 in
The touch sensitive display 22 is configured to provide a graphical user interface. The touch sensitive display 22 may be any type of touch sensitive display, such as a resistive touch sensitive display or a capacitive touch sensitive display. The touch sensitive display 22 may be configured to detect multiple (spatially separate) touch inputs simultaneously.
The radio frequency transceiver 24 is configured to transmit and receive radio frequency signals. The radio frequency transceiver 24 may, for example, be a cellular transceiver that is compatible with one or more cellular protocols such as GSM (Global System for Mobile Communications), IS-95 (Interim Standard 95) or UMTS (Universal Mobile Telecommunications System). Alternatively, the radio frequency transceiver 24 may be a short range transceiver that is compatible with one or more short range protocols, such as Bluetooth protocols or IEEE (Institute of Electrical and Electronics Engineers) protocols. In some embodiments of the invention, the apparatus 30 comprises one or more cellular transceivers and one or more short range transceivers.
The hand 100 is connected to an arm 79 by a wrist 90. The hand 100 comprises a thumb 101 and four fingers 102-105. The first finger 102 from the thumb 101 is known as the “index finger”. The second finger 103 from the thumb 101 is known as the “middle finger”. The third finger 104 from the thumb 101 is known as the “ring finger”. The fourth finger 105 from the thumb 101 is known as the “little finger”.
The area of the hand 100 designated by the dotted line 110 includes the metacarpal bones. Consequently, this region will be referred to as the “metacarpal region 110”.
Each finger 102-105 includes three separate bones: the proximal phalanx 120, the intermediate phalanx 130 and the distal phalanx 140. The proximal phalanx 120 is connected to the metacarpal region 110 by the metacarpophalangeal joint 115. The intermediate phalanx 130 is connected to the proximal phalanx 120 by the proximal interphalangeal joint 125. The distal phalanx 140 is connected to the intermediate phalanx 130 by the distal interphalangeal joint 135. The reference numerals 115, 120, 125, 130, 135 and 140 are only illustrated in relation to the little finger 105 in
The first side surface 107 is a portion of the metacarpal region 110 of the hand. The first side surface 107 is on the opposite side of the metacarpal region 110 to the thumb 101. The second side surface 106 is a portion of the little finger 105 of the hand 100. The second side surface 106 is on the opposite side of the little finger 105 to the ring finger 104.
The length of the hand 100 can be considered to be substantially aligned with the direction of the fingers 102-105 in
The processor 12 may control the touch sensitive display 22 to display content. In
In
The graphical items 71-73 may, for instance, be menu items in a hierarchical menu. For example, the graphical items 71-73 may be items in a first level of the hierarchical menu, and selection of one of the graphical items 71-73 may cause the processor 12 to display one or more further graphical items. The further graphical items are at a second (lower) level in the hierarchical menu.
For instance, consider an example in which the first graphical item 71 relates to “messaging”, the second graphical item 72 relates to “contacts” and the third graphical item 73 relates to “settings”. Selection of the first graphical item 71 may cause the processor 12 to remove the second and third graphical items 72, 73 from display, and to display one or more further graphical items relating to an “inbox”, “sent items”, “drafts”, etc. The inbox, sent items and drafts may be accessed by selecting the relevant graphical item.
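The hierarchical menu described above can be sketched as a simple data structure. This is a minimal illustrative sketch only: the item names follow the example in the text, but the dictionary layout and the `select` helper are assumptions, not part of the embodiment.

```python
# Hypothetical two-level menu structure, following the "messaging" /
# "contacts" / "settings" example above. Selecting a first-level item
# reveals the graphical items at the second (lower) level.
MENU = {
    "messaging": ["inbox", "sent items", "drafts"],
    "contacts": [],
    "settings": [],
}

def select(item):
    """Return the second-level graphical items shown after a
    first-level graphical item is selected."""
    return MENU.get(item, [])
```

For example, selecting the "messaging" item would replace the first-level items with the "inbox", "sent items" and "drafts" items.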
In
Once the user has placed the side edge 400 of his hand 100 on the display 22, he moves it across the display 22. In this example, the user places his hand on the right hand side of the display 22 and then he moves his hand to the left, such that the dorsal surface 300 of the hand 100 is the leading surface of the movement.
The side edge of the hand 100 remains in contact with the display 22 as the hand 100 moves across the display 22. The movement of the hand 100 may, for example, be caused by rotating the wrist, the elbow and/or the shoulder.
The gesture that is illustrated in
The elongate input pattern 90 has a length L and a width W. The length L is larger than the width W. For example, the length L may be: i) more than two times larger than the width W, ii) more than three times larger than the width W, or iii) more than four times larger than the width W.
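The length-to-width comparison above suggests a simple classification rule. The sketch below is one possible implementation, assuming the display driver reports a bounding-box length and width for each detected input pattern; the function name and threshold parameter are illustrative, not taken from the embodiment.

```python
def classify_pattern(length, width, ratio_threshold=2.0):
    """Classify a detected input pattern as 'elongate' when its length
    exceeds its width by more than the given ratio, else 'fingertip'.
    A threshold of 2.0 corresponds to option (i) above; 3.0 and 4.0
    would correspond to options (ii) and (iii)."""
    longer, shorter = max(length, width), min(length, width)
    return "elongate" if longer > ratio_threshold * shorter else "fingertip"
```

A roughly circular fingertip contact has a length close to its width and falls below the threshold, whereas the side edge of a hand produces a pattern whose length is several times its width.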
The elongate input pattern 90 comprises a first elongate portion 94 and a second elongate portion 92. The first elongate portion 94 is produced due to the input provided by the side surface 106 of the little finger 105. The second elongate portion 92 is produced due to the input provided by the side surface 107 of the metacarpal region 110 of the hand 100.
In this example, the side surface 107 of the metacarpal region 110 of the hand 100 has a larger width than the side surface 106 of the little finger 105. Consequently, the width of the second elongate portion 92 is larger than the width of the first elongate portion 94.
Movement of the side edge 400 of the hand 100 across the display 22 causes the elongate input pattern 90 to move across the display 22. The arrow 95 in
In this example, the hand 100 is moved by rotating the elbow and/or the shoulder, so the length of the elongate input pattern 90 remains substantially aligned with the length of the display 22 during movement. The direction of movement of the elongate input pattern 90 is substantially perpendicular to the width of the display 22 in this example.
The display 22 may respond to movement of the side edge 400 of the hand 100 by detecting the elongate input pattern 90 at various positions on the display 22 at particular instances in time, over a period of time. The processor 12 may be configured to determine that the elongate input pattern 90 is moving (and to determine the direction of movement) by analyzing the location of the elongate input pattern 90 on the display 22 over a period of time. For example, the processor 12 may ascertain that the elongate input pattern 90 is moving by determining that the elongate input pattern 90 is at different positions on the display 22 at different instances in time. In the example illustrated in
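The movement determination described above can be sketched as follows. This is a hedged illustration under the assumption that the processor samples a single (x, y) centroid position for the elongate input pattern at each instance in time; the function name and displacement threshold are assumptions.

```python
def detect_movement(positions, min_displacement=1.0):
    """Given (x, y) centroid positions of the elongate input pattern
    sampled at successive instances in time, report whether the pattern
    is moving and, if so, its overall displacement vector.

    The pattern is judged to be moving when it appears at sufficiently
    different positions at different instances in time."""
    if len(positions) < 2:
        return False, (0.0, 0.0)
    (x0, y0), (x1, y1) = positions[0], positions[-1]
    dx, dy = x1 - x0, y1 - y0
    moving = (dx * dx + dy * dy) ** 0.5 >= min_displacement
    return moving, (dx, dy)
```

In the right-to-left hand swipe described above, successive samples would show a decreasing x coordinate, yielding a negative horizontal displacement.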
After the processor 12 determines that an elongate input pattern 90 is moving across the display 22, it controls the display 22 to display another plurality of graphical items 74-76. The graphical items 74-76 are different to the graphical items 71-73 displayed on the display 22 in
Thus, in this example, the hand swipe by the user does not result in selection of any of the graphical items 71-73 originally displayed on the display 22 (in
For example, each of the graphical items 71-76 may relate to a different software application. However, it may not be possible to display all of the graphical items 71-76 on the display 22 at the same time (for example, due to the size of the display 22). In embodiments of the invention, a user may perform the hand swipe gesture to search through the graphical items 71-76, in order to find the one that he is looking for. When a desired graphical item is displayed, selection of that graphical item (for example, by providing fingertip input at the graphical item) may result in the software application being executed.
The processor 12 may control browsing across a level in the menu structure in such a way that the user perceives it to be “continuous”. The processor 12 may control the display 22 to display different graphical items after each hand swipe is detected, until all of the graphical items in a particular menu level have been displayed. When the final set of graphical items in a particular menu level are displayed, the processor 12 may also control the display 22 to display some indication that there are no further graphical items to view in that menu level. After all of the graphical items in a menu level have been displayed, detection of a further hand swipe gesture may cause the processor 12 to display the graphical items 71-73 that were initially displayed on the display 22 prior to the detection of the first hand swipe gesture.
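The "continuous" browsing behaviour described above amounts to cycling through the sets of graphical items in a menu level with wrap-around. The sketch below is an assumed implementation; the paging granularity and return values are illustrative.

```python
def next_page(pages, current_index):
    """Advance to the next set of graphical items on each hand swipe,
    wrapping back to the first set once every set in the menu level has
    been shown. Returns (new_index, at_final_page), where at_final_page
    signals that an 'end of menu level' indication should be displayed."""
    new_index = (current_index + 1) % len(pages)
    return new_index, new_index == len(pages) - 1
```

With two sets of graphical items (71-73 and 74-76), a first swipe displays the second set together with the end-of-level indication, and a further swipe returns to the initially displayed set.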
The hand 100 is moved in this example by rotating the wrist. The dotted line designated by the reference numeral 90b illustrates the position of the elongate input pattern at a later instance in time. The arrows 95a and 95b indicate that the direction of movement of the elongate input pattern 90a, 90b remains perpendicular to the length L of the elongate input pattern as the elongate input pattern moves.
The first elongate portion 94c is produced due to input provided by the side surface 106 of the little finger 105. The second elongate portion 92c is produced due to input provided by the side surface 107 of the metacarpal region 110 of the hand 100. In this example, the side surface 106 of the little finger 105 is inclined with respect to the side surface 107 of the metacarpal region 110 (by means of the metacarpophalangeal joint 115). Consequently, a portion of the side surface 106 of the little finger 105 and/or a portion of the side surface 107 of the metacarpal region 110 does not contact the display 22. The arrow 95c indicates the direction of movement of the elongate input pattern 90c.
It will be appreciated by those skilled in the art that the elongate input pattern may take a form that is different to those illustrated in
A method according to embodiments of the invention will now be described in relation to
The touch sensitive display 22 responds to user input by detecting an input pattern at an instance in time. The touch sensitive display 22 provides the input pattern to the processor 12.
At block 600 of
The processor 12 may analyze the input pattern to perform the discrimination. For example, the processor 12 may analyze the input pattern by comparing one or more characteristics of the detected input pattern with one or more stored reference characteristics 19. For example, the processor 12 may analyze the input pattern to determine whether it has any of the characteristics of the elongate input patterns 90, 90a, 90c described above, and/or any of the characteristics of the fingertip input pattern 80 described above.
If the processor 12 discriminates that the detected input pattern corresponds with the fingertip input pattern 80, the method proceeds along arrow 602 to block 604. If the processor 12 discriminates that the input pattern corresponds with the elongate input pattern 90, 90a, 90c, the method proceeds along arrow 606 to block 608.
At block 604 of
At block 608 of
In some embodiments of the invention, the processor 12 may perform the second action after determining that the input pattern (corresponding with the elongate pattern) is moving across the display 22. For instance, the processor 12 may determine that the input pattern is moving by determining that the input pattern is at different positions on the display 22 at different instances in time (as described above).
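The overall method of blocks 600 to 608 can be sketched as below. This is a minimal assumed implementation: the stored reference characteristic is represented here by a simple aspect-ratio threshold (a real implementation could instead match shape templates), and the two actions are supplied as callables.

```python
def discriminate(length, width, ratio=2.0):
    """Compare the detected pattern's characteristics against a stored
    reference characteristic; here, simply an aspect-ratio threshold."""
    longer, shorter = max(length, width), min(length, width)
    return "elongate" if longer > ratio * shorter else "fingertip"

def process_input(length, width, first_action, second_action):
    """Block 600: discriminate between a fingertip input pattern and an
    elongate input pattern. Block 604: perform the first action for a
    fingertip pattern. Block 608: perform the second, different action
    for an elongate pattern."""
    if discriminate(length, width) == "fingertip":
        return first_action()
    return second_action()
```

A roughly circular contact thus triggers the first action (for example, selecting a graphical item), while a hand-edge contact triggers the second action (for example, browsing to further graphical items).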
References to ‘a tangible computer-readable storage medium’, ‘a computer program product’, a ‘computer’, and a ‘processor’ etc. should be understood to encompass not only computers having different architectures such as single/multi-processor architectures and sequential (Von Neumann)/parallel architectures but also specialized circuits such as field-programmable gate arrays (FPGA), application-specific integrated circuits (ASIC), signal processing devices and other devices. References to computer program, instructions, code etc. should be understood to encompass software for a programmable processor or firmware such as, for example, the programmable content of a hardware device, whether instructions for a processor or configuration settings for a fixed-function device, gate array or programmable logic device etc.
The blocks illustrated in the
Although embodiments of the present invention have been described in the preceding paragraphs with reference to various examples, it should be appreciated that modifications to the examples given can be made without departing from the scope of the invention as claimed. For example, the user may move his hand in the opposite direction to that illustrated in
An example is described above in which the first graphical item 71 relates to “messaging”, the second graphical item 72 relates to “contacts” and the third graphical item 73 relates to “settings”. It should be appreciated that, in other embodiments of the invention, the graphical items 71, 72, 73 may not relate to “messaging”, “contacts” and “settings”. For example, each graphical item 71, 72, 73 may relate to media content. In this regard, each graphical item 71, 72, 73 may relate to an individual file containing audio, video or audiovisual content.
In this example, detection of a fingertip input pattern 80 at a graphical item 71, 72, 73 causes the processor 12 to control playback of the related media content. Detection of an elongate input pattern 90 may cause the processor 12 to display further graphical items 74-76, each of which relates to further media content. This enables a user to browse through his media content, for example by swiping the side edge 400 of his hand 100 across the display 22.
In exemplary embodiments of the invention described above, detection of an elongate input pattern 90 when the first, second and third graphical items 71-73 are displayed on the display 22 causes the processor 12 to control the display 22 to display further graphical items 74-76 at the same level in a hierarchical menu structure. In alternative embodiments of the invention, the processor 12 may change the level of the menu structure that is displayed on the display 22 in response to detection of an elongate input pattern 90. In these embodiments of the invention, the further graphical items 74-76 displayed after the detection of the elongate input pattern 90 may, for example, be part of a higher level in the menu structure than the first, second and third graphical items 71-73. In other words, detection of the elongate input pattern 90 causes a de-selection of a previously selected graphical item, causing graphical items from a higher level in the menu structure to be displayed.
In some implementations, the processor 12 may be configured to perform different functions in dependence upon the direction of a hand swipe gesture. For example, the processor 12 may cause different graphical items to be displayed depending upon whether a user swipes his hand 100 from right to left across the display 22 or from left to right across the display 22. Detection of a hand swipe gesture in a vertical direction (from a lower part of the display 22 to an upper part of the display 22, or vice versa) may cause the processor 12 to display graphical items from a different level in the menu structure. For example, detection of a hand swipe gesture from a lower part of the display 22 to an upper part of the display 22 may cause the processor 12 to display graphical items from a higher level in the menu structure.
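The direction-dependent behaviour described above can be sketched as a simple dispatch on the swipe displacement vector. This is an assumed mapping for illustration only: the action names are hypothetical, and screen coordinates are assumed to increase rightwards (x) and downwards (y), so an upward swipe has a negative dy.

```python
def action_for_swipe(dx, dy):
    """Map a hand-swipe displacement to an action. Horizontal swipes
    browse within the current menu level (direction-dependent);
    vertical swipes change the displayed menu level, with an upward
    swipe moving to a higher level."""
    if abs(dx) >= abs(dy):
        return "browse_forward" if dx < 0 else "browse_backward"
    return "level_up" if dy < 0 else "level_down"
```

Under this assumed mapping, a right-to-left swipe and a left-to-right swipe display different graphical items within a level, while an upward swipe displays graphical items from a higher level in the menu structure.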
In some embodiments of the invention, it is not necessary for the user to move his hand across the display 22 to cause the processor 12 to perform the “second action” referred to in block 608 of
Features described in the preceding description may be used in combinations other than the combinations explicitly described.
Although functions have been described with reference to certain features, those functions may be performable by other features whether described or not.
Although features have been described with reference to certain embodiments, those features may also be present in other embodiments whether described or not.
Whilst endeavoring in the foregoing specification to draw attention to those features of the invention believed to be of particular importance it should be understood that the Applicant claims protection in respect of any patentable feature or combination of features hereinbefore referred to and/or shown in the drawings whether or not particular emphasis has been placed thereon.
Filing Document | Filing Date | Country | Kind | 371c Date
---|---|---|---|---
PCT/CN2010/070509 | 2/4/2010 | WO | 00 | 7/31/2012