The present invention relates to methods for operation of a touch input device and in particular to methods of operation where the operational state of the touch input device is contingent upon the type, size or shape of a detected touch object. The invention has been developed primarily for use with touch input devices that include a display capable of presenting a plurality of user-selectable graphical elements, and will be described hereinafter with reference to this application. However, it will be appreciated that the invention is not limited to this particular field of use.
Any discussion of the prior art throughout the specification should in no way be considered as an admission that such prior art is widely known or forms part of the common general knowledge in the field.
Input devices based on touch sensing (touch screens) have long been used in electronic devices such as computers, personal digital assistants (PDAs), handheld games and point of sale kiosks, and are now appearing in other portable consumer electronics devices such as mobile phones. Generally, touch-enabled devices allow a user to interact with the device by touching one or more graphical elements, such as icons or keys of a virtual keyboard, presented on a display.
Several touch-sensing technologies are known, including resistive, capacitive, projected capacitive, surface acoustic wave and optical, all of which have advantages and disadvantages in areas such as cost, reliability, ease of viewing in bright light, ability to sense different types of touch object (e.g. finger, gloved finger, stylus), and single or multi-touch capability. For example, resistive touch screens are inexpensive and can sense virtually any rigid touch object, but have poor screen viewability in bright light and can only sense single touches. Projected capacitive touch screens have multi-touch capability but cannot sense a non-conductive stylus or a gloved finger, and likewise have poor screen viewability in bright light. Optical touch screens have good screen viewability in bright light and limited multi-touch capability, and are sensitive to virtually any touch object, but their detectors can potentially be saturated by sunlight.
Furthermore some touch-sensing technologies, including optical and surface acoustic wave, are sensitive to near-touches as well as to actual touches, whereas other technologies such as resistive require an actual touch.
The sensitivity of some touch technologies to selected types of touch object can be used to advantage. For example U.S. Pat. Nos. 4,686,332 and 5,956,020 describe capacitive touch screens that, in addition to detecting finger touch, can detect an active stylus from signals emitted by the stylus, while U.S. Pat. No. 5,777,607 and US Patent Publication No 2001/0013855 A1 describe touch tablets that detect finger touch capacitively and stylus touch resistively. This finger/stylus discrimination enables the touch system controller to reject an inadvertent ‘palm touch’ from a user's hand holding the stylus, or to make decisions as to which applications or operations to enable.
Several touch technologies are able to distinguish different types of touch object based on the size of the object, with size determined either as a linear dimension (e.g. using resistive touch in Japanese Patent Publication No 2004213312 A2 or infrared touch in U.S. Pat. No. 4,672,195 and U.S. Pat. No. 4,868,912) or a contact area (e.g. using projected capacitive touch in US 2006/0026535 A1 or in-cell optical touch in U.S. Pat. No. 7,166,966). In some cases (U.S. Pat. No. 4,672,195, U.S. Pat. No. 4,868,912) size information is used to reject touch objects that are too small (e.g. an insect) or too large (e.g. a ‘palm touch’), while in other cases (US 2006/0139340 A1) it can help resolve ‘phantom’ touches from real touches in the ‘double touch ambiguity’ that occurs with some touch technologies, or to decide whether to activate an icon being touched (US 2006/0053387 A1). In yet other cases, described for example in U.S. Pat. No. 7,190,348, US 2008/0204421 A1 and US 2008/0284751 A1, size information is used to distinguish between stylus and finger touch. It has also been suggested that stylus and finger touch can be distinguished on the basis of pressure (JP 04199416 A2), temperature or direct imaging (US 2008/0284751 A1).
Irrespective of the means used to distinguish between finger and stylus touch, several groups have used the information to address the problem of using a finger (a convenient but relatively large touch object) to select small icons accurately. Known methods for improving finger operation of a touch screen include presenting a set of larger icons (U.S. Pat. No. 7,190,348, JP 2003271310 A2, US 2005/0237310 A1, US 2007/0057926 A1, US 2008/0284743 A1), enlarging a portion of the touch interface (US 2006/0026535 A1), and using an offset cursor (U.S. Pat. No. 7,190,348, US 2008/0204421 A1).
Gestural inputs, where a user moves one or more touch objects (usually fingers, with the thumb considered to be a finger) across a touch-sensitive surface, or places one or more touch objects on a touch-sensitive surface in a particular sequence, are an increasingly popular means for enhancing the power of touch input devices beyond the simple ‘touch to select’ function, with a large number of gestures of varying complexity known in the art (see for example US Patent Publication Nos 2006/0026535 A1, 2006/0274046 A1 and 2007/0177804 A1). A given gesture may be interpreted differently depending on whether the touch object is a finger or stylus. In one example (U.S. Pat. No. 6,611,258) a drawing application may interpret a stroke as a line when performed by a stylus or as an erase gesture when performed by a finger. In another (US 2008/0284743 A1) a stylus or finger stroke may be interpreted as a ‘panning’ gesture or an erase gesture. As discussed in US 2006/0097991 A1, touch technologies such as projected capacitive that can accurately detect several simultaneous touch events are particularly well suited to gestural input, with gestures interpreted according to the number of fingers used. US 2007/0177804 A1 discusses the concept of a ‘chord’ as a set of fingers contacting a multi-touch surface, and suggests the use of a gesture dictionary assigning gestures to different motions of a chord. However, for touch technologies with no multi-touch capability (e.g. resistive and surface capacitive) or limited multi-touch capability (e.g. infrared and surface acoustic wave), gestural input based on chords is of limited applicability.
It is an object of the present invention to overcome or ameliorate at least one of the disadvantages of the prior art, or to provide a useful alternative.
It is an object of the invention in its preferred form to provide a method for operation of a touch input device where the operational state of the device is contingent on the type, size or shape of the object used to provide the touch input.
In a first aspect, the present invention provides a method for operation of a touch input device comprising a touch input area, said method comprising the steps of (i) detecting a touch or near-touch of an object on or near said touch input area; (ii) determining a parameter indicative of the size and/or shape of said object; (iii) comparing said parameter with at least one predetermined value; and (iv) enabling an operational state of said touch input device in response to said comparison, wherein said operational state is a sleep mode or an active mode.
In a preferred form of the invention the predetermined values are threshold values and the parameter is compared with said threshold values to determine which function is enabled by the touch object. The parameter may be compared with a single threshold value such that if the parameter is greater than the threshold value the device enters a sleep mode, and if the parameter is less than or equal to the threshold value it enters an active mode. In an alternative embodiment, the predetermined values are a set of threshold values whereby the parameter is compared with a first lower threshold value and a second upper threshold value greater than the first lower threshold value. If the parameter is greater than the second threshold value the device enters sleep mode, and if the parameter is less than the first threshold value the device enters an active mode.
In a second aspect, the present invention provides a method for operation of a text entry mode of a touch input device comprising a touch input area operatively associated with a display; said method comprising the steps of: (i) detecting a touch or near-touch of an object on or near said touch input area; (ii) determining whether said touch object is a stylus or a finger; and (iii) displaying on said display a full keyboard if said touch object is determined to be a stylus, or a reduced keyboard if said touch object is determined to be a finger.
In a third aspect, the present invention provides a method for operation of a touch input device comprising a touch input area operatively associated with a display, said method comprising the steps of: (i) detecting a touch or near-touch of an object on or near said touch input area; (ii) determining the size and/or shape of said object; and (iii) displaying a cursor on said display in response to said determining step, wherein said cursor is a graphical representation of the determined touch object.
In a fourth aspect, the present invention provides a method for operation of a touch input device comprising a touch input area operatively associated with a display, said method comprising the steps of: (i) detecting a touch or near-touch of an object on or near said touch input area; (ii) determining whether said touch object is a stylus or a finger; and (iii) displaying a cursor on said display in response to said determining step, wherein said cursor is a graphical representation of the determined touch object.
According to this aspect, in a preferred form the cursor may be a graphical representation of a stylus or a hand holding a stylus if said touch object is determined to be a stylus. Alternatively the cursor may be a graphical representation of a pointing hand, a finger or a group of fingers if said touch object is determined to be a finger or group of fingers.
In a fifth aspect, the present invention provides a method for operation of a touch input device comprising a touch input area, said method comprising the steps of: (i) detecting a touch or near-touch of an object on or near said touch input area; (ii) determining a parameter indicative of the size and/or shape of said object; and (iii) presenting said parameter to a user of said device.
According to this aspect, the parameter may be displayed on a display operatively associated with said touch input area. The parameter may be displayed graphically and/or alphanumerically in one or more dimensions to the user of the device.
In a sixth aspect, the present invention provides a method for operation of a touch input device comprising a touch input area, said method comprising the steps of: (i) detecting a touch or near-touch of an object on or near said touch input area, said object comprising one or more fingers bunched together; (ii) determining a parameter indicative of the size and/or shape of said object; (iii) comparing said parameter with at least one predetermined value; and (iv) on the basis of said comparison, differentiating said object as a single finger or as a plurality of fingers bunched together.
Preferably, the parameter is compared with one or more predetermined threshold values, these threshold values delimiting a plurality of functions such that the size and/or shape of said object enables one or more of said functions.
In a seventh aspect, the present invention provides a method for interacting with a touch input device comprising a touch input area, said method comprising placing one or more touch objects on or near said touch input area, wherein at least one of said touch objects comprises at least two fingers bunched together.
In preferred forms of the invention the number and magnitude of the predetermined values may be user definable. In some embodiments the parameter may include at least one linear dimension of said object, with, for example, a linear dimension threshold value in the range of 2 mm to 5 mm.
In other embodiments the parameter may include an area of said object, with, for example, an area threshold value in the range of 4 mm² to 25 mm².
In a still further embodiment the parameter may include a measure of symmetry of the object.
The display which is operatively associated with the touch input area is preferably, but not necessarily, coincident with said touch input area.
Preferred embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings.
One size-related measure that can be calculated is an interaction area 40 between a touch object and a display, applicable for example to the optical touch input device 2 or to the alternative form of input device 42 referred to in the accompanying drawings.
In a first aspect of the present invention, the size and/or shape of a detected touch object are used to determine whether an input device should be in sleep mode or active mode. For example when an optical touch input device 2 or 42 is in sleep mode, it operates at a frame rate of order one frame per second (with a ‘frame’ including pulsing the optical source(s) 6 and scanning the multi-element detector 18), whereas in active mode it operates at much higher frame rates, of order 100 frames per second or even higher for demanding applications such as signature capture. In general an input device will remain in sleep mode whenever possible, to conserve power. For example if an input device in active mode is placed into a pocket or a sleeve, the device controller will detect the pocket or sleeve as a touch with a parameter indicative of size and/or shape larger than a predetermined value and will direct the device to enter sleep mode. In certain embodiments the device will only enter sleep mode if this ‘large’ touch persists for a certain time. Optionally the device may provide a warning message such as a beep before entering sleep mode, which could be useful if a user were inadvertently resting their hand on the input area. Alternatively or additionally, if the input device is in sleep mode and detects a touch object with a parameter indicative of size and/or shape smaller than a predetermined value, e.g. consistent with a stylus or finger, the controller will direct the input device to enter active mode. We note that this aspect does not require the presence of a display, i.e. it is applicable to touch panel devices where the input area does not coincide with a display.
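By way of non-limiting illustration, the following sketch outlines this single-threshold behaviour, including the optional persistence delay and warning before entering sleep mode. The threshold, delay and frame-rate values, and all names, are assumed for illustration only and are not prescribed by the specification.

```python
# Illustrative sketch only: the threshold, persistence delay and frame rates
# are assumed example values, not taken from the specification.
SLEEP_THRESHOLD_MM2 = 400.0    # a 'large' touch, e.g. a pocket or sleeve over the panel
SLEEP_DELAY_S = 2.0            # how long the large touch must persist before sleeping
SLEEP_FRAME_RATE_HZ = 1        # of order one frame per second in sleep mode
ACTIVE_FRAME_RATE_HZ = 100     # of order 100 frames per second in active mode


class SleepModeController:
    def __init__(self):
        self.mode = "active"
        self.frame_rate_hz = ACTIVE_FRAME_RATE_HZ
        self._large_touch_since = None   # time at which a persistent large touch began

    def on_touch(self, size_parameter_mm2, now_s):
        """Update the operational state from a size-indicative parameter (area in mm^2)."""
        if self.mode == "active" and size_parameter_mm2 > SLEEP_THRESHOLD_MM2:
            if self._large_touch_since is None:
                self._large_touch_since = now_s          # start the persistence timer
            elif now_s - self._large_touch_since >= SLEEP_DELAY_S:
                print("warning: entering sleep mode")    # optional warning, e.g. a beep
                self.mode, self.frame_rate_hz = "sleep", SLEEP_FRAME_RATE_HZ
        elif self.mode == "sleep" and size_parameter_mm2 <= SLEEP_THRESHOLD_MM2:
            # Touch consistent with a stylus or finger: return to active mode.
            self.mode, self.frame_rate_hz = "active", ACTIVE_FRAME_RATE_HZ
            self._large_touch_since = None
        else:
            self._large_touch_since = None               # reset the timer otherwise
```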
In another embodiment the predetermined values may be two predetermined threshold values with which the size and/or shape indicative parameter is compared, with a first predetermined threshold value being smaller than a second predetermined threshold value. A device in sleep mode will enter active mode if it detects a touch object with size and/or shape parameter smaller than the first predetermined threshold value, and a device in active mode will enter sleep mode if it detects a touch object with size and/or shape parameter larger than the second predetermined threshold value. By setting the second predetermined threshold value to correspond to a significant fraction of the input area, i.e. much larger than a finger, the likelihood of a user inadvertently sending the device into sleep mode, say with a palm touch, is reduced.
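A minimal sketch of this two-threshold (hysteresis) variant follows; again the threshold values and names are assumed examples only, with the upper threshold set to a significant fraction of the input area.

```python
# Two-threshold (hysteresis) variant; the values are assumed examples.
WAKE_THRESHOLD_MM2 = 100.0     # first (lower) threshold: stylus- or finger-sized touch
SLEEP_THRESHOLD_MM2 = 5000.0   # second (upper) threshold: much larger than a finger or palm

def next_mode(current_mode, size_parameter_mm2):
    if current_mode == "sleep" and size_parameter_mm2 < WAKE_THRESHOLD_MM2:
        return "active"
    if current_mode == "active" and size_parameter_mm2 > SLEEP_THRESHOLD_MM2:
        return "sleep"
    return current_mode   # touches between the two thresholds leave the mode unchanged
```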
In another aspect of the present invention, a touch input device controller first determines whether a touch object is a stylus or a finger, and then presents a suitable user interface for alphanumeric text entry. In preferred embodiments the stylus/finger decision is made based on determining a parameter indicative of the size and/or shape of the touch object as described below, but in alternative embodiments the decision is made based on one or more other criteria known in the art, including those described previously. If the device controller determines that the touch object is a stylus, it presents a full keyboard (such as a QWERTY keyboard or the like, including variations used for alphabet-based languages other than English); if the touch object is determined to be a finger, it presents a reduced keyboard (such as a T9 keypad) with multiple characters per key. Many other types of reduced keyboards are known in the art, including an expanding circular arrangement disclosed in US 2007/0256029 A1 entitled ‘Systems and methods for interfacing a user with a touch screen’ and incorporated herein by reference. A QWERTY keyboard has the advantage of unambiguous input but requires a larger display area, whereas reduced keyboards require a smaller display area but frequently need some form of ‘disambiguation’ routine and are often slower to use. We note that U.S. Pat. No. 6,611,258 discloses a somewhat contrary text entry system where a QWERTY keyboard is presented for finger touch, and a character drawing pad for stylus touch.
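By way of non-limiting illustration, the keyboard-selection step might be sketched as follows when the stylus/finger decision is based on a linear-dimension threshold; the threshold value, keyboard identifiers and function name are assumed.

```python
# Illustrative only: the threshold and the keyboard identifiers are assumed.
STYLUS_MAX_DIMENSION_MM = 5.0

def select_keyboard(width_mm, height_mm):
    """Return the keyboard layout to display for a detected touch object."""
    if max(width_mm, height_mm) <= STYLUS_MAX_DIMENSION_MM:
        return "full_qwerty"      # stylus-sized object: full keyboard
    return "reduced_t9"           # finger-sized object: reduced keyboard
```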
In another aspect of the present invention the parameter determined by the controller to identify the touch object is a parameter indicative of shape. The determination of this parameter may be quite straightforward, such as measuring a plurality of linear dimensions to determine the actual shape, or it may provide a measure of the symmetry of the object producing the touch or near-touch.
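By way of illustration, one simple shape-indicative parameter is the ratio of the two measured linear dimensions, a value near 1 indicating a roughly symmetric footprint (such as a stylus tip or single fingertip) and a larger value indicating an elongated object. The following sketch, with assumed names, computes such a ratio.

```python
def symmetry_ratio(width_mm, height_mm):
    """Rough symmetry measure: 1.0 for a square or circular footprint, larger for elongated objects."""
    long_dim, short_dim = max(width_mm, height_mm), min(width_mm, height_mm)
    return long_dim / short_dim if short_dim > 0 else float("inf")
```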
In certain embodiments the number or magnitudes of the one or more predetermined threshold values are fixed, while in other embodiments they are user-definable. In alternative embodiments, a decision as to which keyboard to display is made based on a touch made anywhere on the display. In yet other embodiments, the displayed keyboard can be changed dynamically during text entry, say if the user switches between finger and stylus operation.
In another aspect of the present invention, a touch input device controller first determines the origin of the touch or near-touch, e.g. whether a touch object is a stylus, a finger or a bunch of fingers in contact with each other, or another object such as a credit card. The device then presents a cursor with shape indicative of the touch object, for example a pointing hand or a finger for a finger touch, or a stylus or a hand holding a stylus for a stylus touch. In general the intuitive part of the cursor (i.e. the fingertip or stylus tip) will be the ‘hot spot’ of the cursor, and the cursor may be coincident with the touch object or offset as is known in the art. In preferred embodiments the stylus/finger decision is made based on measuring one or more dimensions of the touch object as described below, but in alternative embodiments the decision is made based on one or more other criteria known in the art including those described previously.
By way of specific example, if a touch input device controller detects a touch object with both linear dimensions less than a predetermined linear threshold of 5 mm it will display a cursor shaped like a stylus or pen, and if it detects a touch object with both linear dimensions greater than the predetermined linear threshold it will display a cursor shaped like a finger. In another example, a touch input device controller will display a cursor shaped like a stylus or pen if it detects a touch object with interaction area less than a predetermined area threshold of 25 mm², or a cursor shaped like a finger if it detects a touch object with interaction area greater than the predetermined area threshold.
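By way of non-limiting illustration, this cursor-selection logic might be sketched as follows, using the 5 mm and 25 mm² thresholds quoted above; the function and cursor names are assumed.

```python
# Thresholds from the example above; function and cursor names are assumed.
LINEAR_THRESHOLD_MM = 5.0
AREA_THRESHOLD_MM2 = 25.0

def select_cursor(width_mm, height_mm, area_mm2=None):
    """Choose a cursor glyph based on the detected touch object's size."""
    if area_mm2 is not None:
        return "stylus_cursor" if area_mm2 < AREA_THRESHOLD_MM2 else "finger_cursor"
    if width_mm < LINEAR_THRESHOLD_MM and height_mm < LINEAR_THRESHOLD_MM:
        return "stylus_cursor"
    return "finger_cursor"   # objects larger than the linear threshold in either dimension
```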
In a fourth aspect of the present invention, a touch input device has a ‘measure object’ mode (enabled for example by a tab 65 presented on the display) in which the controller determines a parameter indicative of the size and/or shape of a detected touch object and presents that parameter, graphically and/or alphanumerically, to the user of the device.
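By way of non-limiting illustration, the measured parameter might be formatted for presentation to the user as in the following sketch; the display format, the rectangular area estimate and all names are assumed.

```python
def format_measurement(width_mm, height_mm):
    """Format a detected object's dimensions for on-screen display (assumed format)."""
    area_mm2 = width_mm * height_mm   # simple rectangular estimate of the interaction area
    return f"object: {width_mm:.1f} mm x {height_mm:.1f} mm (~{area_mm2:.0f} mm²)"
```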
A further aspect of the present invention concerns gestural input for touch technologies with limited or no multi-touch capability. For example, a resistive touch screen is limited to a single touch point, with two simultaneous touch events being reported as a single touch event midway between the two touch objects. As explained in PCT Patent Publication No WO 2008/138046 A1 entitled ‘Double touch inputs’ and incorporated herein by reference, touch technologies relying on two intersecting energy paths to determine the location of a touch object, such as the ‘infrared’ technologies referred to above, cannot always distinguish the true locations of two simultaneous touch objects from ‘phantom’ locations.
This ‘double touch ambiguity’ can lead to certain gestures being misinterpreted.
The present invention provides a device controller that uses touch object recognition to determine whether a given gesture includes two or more adjacent or bunched fingers, and assigns a function accordingly. Unlike the ‘chords’ of the prior art where a user's fingers are separated and individually detectable, bunched fingers place no multi-touch requirement on the device controller, since they are detected as a single touch event. On the basis of the determined parameter indicative of size and/or shape, however, the number of fingers in a bunch can be determined, expanding the range of functions that can be applied to simple gestures such as a linear or arcuate swipe.
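By way of non-limiting illustration, the following sketch estimates the number of bunched fingers from a single measured touch width and maps the result to a gesture function; the nominal finger width, the mapping and all names are assumed examples, not prescribed by the specification.

```python
NOMINAL_FINGER_WIDTH_MM = 15.0   # assumed typical fingertip width

def count_bunched_fingers(touch_width_mm):
    """Estimate how many bunched fingers produced a single detected touch."""
    return max(1, round(touch_width_mm / NOMINAL_FINGER_WIDTH_MM))

def gesture_function(touch_width_mm):
    # Example mapping only: one finger pans, two bunched fingers scroll, three or more switch pages.
    mapping = {1: "pan", 2: "scroll"}
    return mapping.get(count_bunched_fingers(touch_width_mm), "page_switch")
```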
The concept of performing gestures with bunched fingers can be extended to chords that include both bunched and separate fingers, e.g. bunched index finger and middle finger with a separate thumb. In a touch system with multi-touch capability and the ability to determine touch object dimensions, this has the advantage of further increasing the ‘vocabulary’ of gestural input. Another advantage of such chords, particularly for touch technologies that are subject to double touch ambiguity, is that the two components of the chord will have quite different sizes. As recognised in US 2006/0139340 A1, a size differential is one means by which an ambiguity may be resolved.
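By way of a non-limiting sketch, such a size differential might be used to resolve the ambiguity by pairing the measured extents of the candidate touches along each axis, so that the wide X-axis extent (the bunched fingers) is matched with the wide Y-axis extent rather than with the narrow extent of the lone thumb; the function and its inputs below are assumed for illustration.

```python
def resolve_double_touch(x_extents_mm, y_extents_mm):
    """
    Pick the candidate pairing whose X and Y extents are most consistent.
    x_extents_mm, y_extents_mm: the two measured widths along each axis, e.g.
    [large, small] when a bunched-finger group and a lone thumb are present.
    Returns index pairs (xi, yi) identifying the chosen (real) touch locations.
    """
    # Pairing A matches x0 with y0 and x1 with y1; pairing B swaps them.
    mismatch_a = abs(x_extents_mm[0] - y_extents_mm[0]) + abs(x_extents_mm[1] - y_extents_mm[1])
    mismatch_b = abs(x_extents_mm[0] - y_extents_mm[1]) + abs(x_extents_mm[1] - y_extents_mm[0])
    return [(0, 0), (1, 1)] if mismatch_a <= mismatch_b else [(0, 1), (1, 0)]
```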
Although the invention has been described with reference to specific examples, it will be appreciated by those skilled in the art that the invention may be embodied in many other forms.
Number | Date | Country | Kind
---|---|---|---
2008901068 | Mar 2008 | AU | national
2008902412 | May 2008 | AU | national

Filing Document | Filing Date | Country | Kind | 371(c) Date
---|---|---|---|---
PCT/AU2009/000274 | 3/5/2009 | WO | 00 | 9/7/2010