1. Technical Field
The invention relates to interactive displays. More particularly, the invention relates to a method and apparatus for determining user gestures to control a touch detecting, interactive display.
2. Description of the Prior Art
There are many situations in which one or more individuals interactively explore image-based data. For example, a team of paleontologists may wish to discuss an excavation plan for a remote dig site. To do so, they wish to explore in detail the geographic characteristics of the site as represented on digitized maps. In most laboratories, this requires the team either to huddle around a single workstation and view maps and images on a small display, or to sit at separate workstations and converse by phone. The activity of exploring imagery is much more easily and effectively accomplished with the users surrounding a single large display. A particularly effective approach is a touch detecting, interactive display such as that disclosed in the related patent application entitled Touch Detecting Interactive Display, filed Aug. 6, 2004, Ser. No. 10/913,105. In such a system, an image is produced on a touch detecting display surface. A touch sensor determines the locations at which a user contacts the display surface, and based on the position and motions of these locations, user gestures are determined. The display is then updated based on the determined user gestures.
A wide variety of mechanisms are available for determining the locations at which a user contacts the touch sensor. Often, a grid-based approach is used in which measurements acquired on the perimeter of the touch sensor indicate a horizontal coordinate x1 and a vertical coordinate y1 for each contact location.
This method of determining and reporting the locations of contacts differentiates grid-based sensors from many other touch sensor technologies such as the Synaptics TouchPad™ found on many laptop computers. By measuring changes in capacitance near a wire mesh, the TouchPad™ determines contact positions directly and reports an absolute position to the host device. Clearly, an ability to directly ascertain and report the position of a contact is advantageous in many situations. However, capacitive sensors do not scale well, and are therefore impractical or prohibitively expensive for incorporation into large interactive displays.
A number of methods have been proposed for recognizing user gestures through tracking the position and motion of one or more contact locations determined by a touch sensor. Clearly, these methods encounter difficulty when used in conjunction with a grid-based sensor that cannot disambiguate the location of multiple simultaneous contact points. It would thus be advantageous to define a set of user gestures in terms of the bounding box surrounding the detected contact locations. Such a set of user gestures would permit the use of inexpensive, highly reliable, and highly scalable grid-based touch sensors yet still allow users to interact with the display in an intuitive manner.
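To make the ambiguity concrete, consider a sketch of what a grid-based sensor can and cannot report. The representation below, in which the sensor yields only the sets of activated horizontal and vertical coordinates, is an illustrative assumption; the function names are not taken from the patent:

```python
# Sketch: why a grid-based sensor cannot disambiguate two simultaneous
# contacts, and why the bounding box is nonetheless well defined.
# Assumed/illustrative: a sensor that reports only the sorted sets of
# x and y coordinates at which perimeter measurements detect contact.

def grid_readings(contacts):
    """Return what a grid-based sensor actually observes: the sets of
    horizontal and vertical coordinates, with the pairings lost."""
    xs = sorted({x for x, _ in contacts})
    ys = sorted({y for _, y in contacts})
    return xs, ys

def bounding_box(xs, ys):
    """Corners of the bounding box recoverable from grid readings."""
    return (min(xs), min(ys)), (max(xs), max(ys))

# Two different pairs of contact points...
config_a = [(2, 3), (7, 9)]
config_b = [(2, 9), (7, 3)]

# ...are indistinguishable to the sensor:
assert grid_readings(config_a) == grid_readings(config_b)

# ...but the bounding box is identical either way, so gestures defined
# in terms of the bounding box remain well defined:
xs, ys = grid_readings(config_a)
print(bounding_box(xs, ys))  # ((2, 3), (7, 9))
```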
Summary of the Invention

The invention provides a method and apparatus for identifying gestures performed by a user to control an interactive display. The gestures are identified based on a bounding box enclosing the points at which a user contacts a touch sensor corresponding with the display surface. The invention thus permits the use of inexpensive and highly reliable grid-based touch sensors that provide a bounding box to describe contact information. In identifying the gestures, the position, motion, shape, and deformation of the bounding box may all be considered. In particular, the center, width, height, aspect ratio, length of the diagonal, and orientation of the diagonal of the bounding box may be determined. A stretch factor, defined as the maximum of the ratio of the height of the bounding box to the width of the bounding box and the ratio of the width of the bounding box to the height of the bounding box, may also be computed. Finally, gestures may be identified based on the changes in time of these characteristics and quantities.
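These quantities follow directly from the two corners of the bounding box. A minimal sketch of their computation (the function and field names are illustrative, not taken from the patent):

```python
import math
from dataclasses import dataclass

@dataclass
class BoxMetrics:
    center: tuple          # (x, y) midpoint of the box
    width: float
    height: float
    aspect_ratio: float    # width / height
    diag_length: float     # length of the diagonal
    diag_angle: float      # orientation of the diagonal, in radians
    stretch: float         # max(h/w, w/h), always >= 1

def box_metrics(x_min, y_min, x_max, y_max):
    """Compute the bounding box characteristics enumerated in the text,
    given the lower-left and upper-right corners of the box."""
    w = x_max - x_min
    h = y_max - y_min
    center = ((x_min + x_max) / 2.0, (y_min + y_max) / 2.0)
    diag = math.hypot(w, h)
    angle = math.atan2(h, w)
    aspect = w / h if h > 0 else float("inf")
    stretch = max(h / w, w / h) if w > 0 and h > 0 else float("inf")
    return BoxMetrics(center, w, h, aspect, diag, angle, stretch)

print(box_metrics(2, 3, 7, 9))
```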
Gestures that may be identified include pan, zoom, and rotate gestures. Display commands that may be associated with the identified gestures include panning, zooming, and rotation commands that, when executed, provide a translation, a change in the magnification, or a change in the orientation of the displayed imagery. In a preferred embodiment of the invention, a pan gesture is identified only if the motion of the bounding box is greater than a predetermined motion threshold and the deformation of the bounding box is less than a predetermined deformation threshold. A zoom gesture is identified only if the stretch factor is greater than a predetermined stretch threshold and is increasing. A rotate gesture is identified only if the deformation of the bounding box is greater than a predetermined deformation threshold. Ambiguity in the direction of rotation implied by a rotate gesture is resolved by a convention in which the bounding box is specified with a particular pair of opposing corners, e.g. lower left and upper right, by determining the relative intensity of contact locations, or by measuring the torque applied by the user to the display surface.
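These identification rules reduce to a few comparisons between successive bounding boxes. A hedged sketch: the deformation measure (change in diagonal length), the threshold names, and the threshold values below are all illustrative assumptions, since the text specifies only that the thresholds are predetermined:

```python
import math

# Illustrative thresholds; the text calls these "predetermined" but
# does not fix values. The EPS_* names are assumptions.
EPS_MOTION = 5.0       # minimum center displacement for a pan
EPS_DEFORM = 3.0       # deformation bound separating pan from rotate
EPS_STRETCH = 1.2      # minimum stretch factor for a zoom

def center(box):
    (x0, y0), (x1, y1) = box
    return ((x0 + x1) / 2.0, (y0 + y1) / 2.0)

def stretch(box):
    (x0, y0), (x1, y1) = box
    w, h = x1 - x0, y1 - y0
    return max(w / h, h / w) if w > 0 and h > 0 else float("inf")

def diag(box):
    (x0, y0), (x1, y1) = box
    return math.hypot(x1 - x0, y1 - y0)

def classify(prev, curr):
    """Classify the transition between two bounding boxes as a gesture,
    following the preferred-embodiment rules in the text. Each box is
    ((x_min, y_min), (x_max, y_max))."""
    (px, py), (cx, cy) = center(prev), center(curr)
    motion = math.hypot(cx - px, cy - py)
    # Deformation sketched here as the change in diagonal length; the
    # text leaves the exact measure open.
    deform = abs(diag(curr) - diag(prev))

    if motion > EPS_MOTION and deform < EPS_DEFORM:
        return "pan"
    if stretch(curr) > EPS_STRETCH and stretch(curr) > stretch(prev):
        return "zoom"
    if deform > EPS_DEFORM:
        return "rotate"
    return None  # no gesture identified

print(classify(((0, 0), (10, 10)), ((6, 6), (16, 16))))  # pan
```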
Brief Description of the Drawings

FIGS. 3a-3d show several gestures identified based on bounding box position, shape, motion, and deformation according to the invention.
Detailed Description of the Invention

The invention provides a method and apparatus for identifying gestures performed by a user to control an interactive display. The gestures are identified based on a bounding box enclosing the points at which a user contacts a touch sensor corresponding with the display surface. The invention thus permits the use of inexpensive and highly reliable grid-based touch sensors that provide a bounding box to describe contact information.
Corresponding with the display surface is a touch sensor 155 that is capable of detecting when and where a user touches the display surface. Based upon the contact information provided by the touch sensor, user gestures are identified, and a command associated with the user gesture is determined. The command is executed, altering the displayed imagery in the manner requested by the user via the gesture.
In the preferred embodiment of the invention, the touch sensor and the display are physically coincident.
As noted above, cost and reliability often motivate the use of a grid-based sensor in touch detecting displays. Such a sensor may, for example, incorporate a series of infrared emitters and receivers arrayed along the perimeter of the display surface.
FIGS. 3a-3d show several gestures identified based on bounding box position, shape, motion, and deformation according to the invention.
In the preferred embodiment of the invention, gestures are identified using the threshold-based procedure summarized above, in which the motion, stretch factor, and deformation of the bounding box are compared against predetermined thresholds.
After a gesture is identified, a display command consistent with the identified gesture is determined, and the display is updated appropriately. In the preferred embodiment of the invention, a pan gesture is associated with a panning command that translates the displayed imagery, a zoom gesture with a zooming command that changes the magnification of the displayed imagery, and a rotate gesture with a rotation command that changes the orientation of the displayed imagery.
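A sketch of this association, assuming successive bounding boxes are available and representing the display update as a translate/scale/rotate tuple; the representation and function names are illustrative choices, not taken from the patent:

```python
import math

def display_command(gesture, prev, curr):
    """Map an identified gesture and a pair of successive bounding
    boxes to a display update, per the association described above.
    Returns (dx, dy, scale, dtheta) to apply to the imagery; this
    representation is an illustrative choice."""
    (px0, py0), (px1, py1) = prev
    (cx0, cy0), (cx1, cy1) = curr
    if gesture == "pan":
        # Translate by the motion of the box center.
        dx = (cx0 + cx1 - px0 - px1) / 2.0
        dy = (cy0 + cy1 - py0 - py1) / 2.0
        return (dx, dy, 1.0, 0.0)
    if gesture == "zoom":
        # Magnify by the ratio of diagonal lengths.
        scale = math.hypot(cx1 - cx0, cy1 - cy0) / math.hypot(px1 - px0, py1 - py0)
        return (0.0, 0.0, scale, 0.0)
    if gesture == "rotate":
        # Rotate by the change in diagonal orientation; the sign is
        # ambiguous from the box alone (see the discussion below).
        dtheta = (math.atan2(cy1 - cy0, cx1 - cx0)
                  - math.atan2(py1 - py0, px1 - px0))
        return (0.0, 0.0, 1.0, dtheta)
    return (0.0, 0.0, 1.0, 0.0)  # no gesture: leave the display as-is

print(display_command("zoom", ((0, 0), (10, 10)), ((0, 0), (20, 20))))
# (0.0, 0.0, 2.0, 0.0): the imagery doubles in magnification
```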
In the preferred embodiment of the invention, the identification procedure is performed upon or shortly after initiation of contact by the user. Once the gesture has been identified, the identification is maintained until the contact is terminated. Throughout the duration of the contact, the display is continually updated, preferably each time updated bounding box information is received from the touch sensor. Initiation and termination of the single gesture are therefore determined based upon the appearance and disappearance of the bounding box, which are typically events explicitly declared by the touch sensor.
Experimentation has indicated that such a rigid gesture classification is preferred by users, because it is difficult in practice to execute gestures that are purely of one type. Classifying the bounding box motion and deformation as a gesture of one type averts the frustration experienced by a user when, for example, an attempt to zoom results in both a zooming and a rotating motion of the display.
Nonetheless, in an alternative embodiment of the invention, the identification procedure is performed more frequently. For example, the identification procedure may be performed each time updated bounding box information is received from the touch sensor. In this approach, a single user motion, as delineated by the appearance and disappearance of a bounding box, potentially contains pan, zoom, and rotate gestures. Over the duration of the gesture, the display is updated with a combination of panning, zooming, and rotational motions that, to the user, appear smooth and continuous. Successful implementation of this embodiment requires especially careful selection of the thresholds ε1, εc, and εS.
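Both embodiments fit a small event loop keyed to the bounding box events just described. A minimal sketch, assuming the sensor explicitly declares bounding-box appearance, update, and disappearance; the class, method, and flag names are illustrative:

```python
class GestureTracker:
    """Tracks one contact episode, from bounding-box appearance to
    disappearance. With reidentify_each_update=False (the preferred
    embodiment), the first identified gesture is held for the whole
    episode; with True (the alternative embodiment), each sensor
    update may yield a different pan/zoom/rotate component."""

    def __init__(self, classify, reidentify_each_update=False):
        self.classify = classify      # e.g. the classify() sketched earlier
        self.reidentify = reidentify_each_update
        self.prev_box = None
        self.gesture = None

    def on_box_appeared(self, box):   # contact initiated
        self.prev_box = box
        self.gesture = None

    def on_box_updated(self, box):    # called per sensor update
        if self.gesture is None or self.reidentify:
            identified = self.classify(self.prev_box, box)
            if identified is not None:
                self.gesture = identified
        # ...update the display here from self.gesture, prev_box, box...
        self.prev_box = box
        return self.gesture

    def on_box_disappeared(self):     # contact terminated
        self.prev_box = None
        self.gesture = None
```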
In the above gesture identification procedure, the gesture for rotation remains partly ambiguous. Specifically, the direction of rotation cannot be determined from the bounding box alone. The pairs of points [C1,C2] and [C1,C2′], which lie on opposite diagonals of the box, yield the same bounding box, so a given deformation of the box is consistent with either a clockwise or a counterclockwise rotation. The ambiguity may be resolved by the conventions noted above: specifying the bounding box with a particular pair of opposing corners, determining the relative intensity of the contact locations, or measuring the torque applied by the user to the display surface.
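Under the first of these conventions, the rotation direction follows from the orientation of the segment joining the reported contact corners. A sketch, assuming the sensor flags which diagonal of the box the contacts occupy; the flag and function names are illustrative:

```python
import math

def contact_diagonal_angle(box, on_main_diagonal):
    """Angle of the segment joining the two contacts, given the box
    corners and which diagonal the contacts lie on. The assumed
    convention: the sensor reports whether the contacts are at the
    lower-left/upper-right corners (True) or upper-left/lower-right
    corners (False) of the bounding box."""
    (x0, y0), (x1, y1) = box
    if on_main_diagonal:
        return math.atan2(y1 - y0, x1 - x0)   # lower-left -> upper-right
    return math.atan2(y0 - y1, x1 - x0)       # upper-left -> lower-right

def rotation_direction(prev_box, curr_box, on_main_diagonal):
    """Signed rotation between updates: positive = counterclockwise."""
    return (contact_diagonal_angle(curr_box, on_main_diagonal)
            - contact_diagonal_angle(prev_box, on_main_diagonal))

# The same deforming box implies opposite rotation directions
# depending on which diagonal the contacts occupy:
prev, curr = ((0, 0), (10, 10)), ((0, 0), (12, 8))
print(rotation_direction(prev, curr, True))    # negative: clockwise
print(rotation_direction(prev, curr, False))   # positive: counterclockwise
```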
It should be noted that although the invention is described above with reference to a bounding box defined by two contact locations, the bounding box may also be defined for the case of three or more contact points. For a set of contact points C, defined by contact locations (xi,yi), the bounding box is defined by the corners (min[xi],min[yi]) and (max[xi],max[yi]).
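A one-function sketch of this generalization (names illustrative):

```python
def bounding_box(contacts):
    """Bounding box of any number of contact points (x_i, y_i): the
    corners are (min x_i, min y_i) and (max x_i, max y_i), as in the
    text."""
    xs = [x for x, _ in contacts]
    ys = [y for _, y in contacts]
    return (min(xs), min(ys)), (max(xs), max(ys))

print(bounding_box([(2, 7), (5, 1), (9, 4)]))  # ((2, 1), (9, 7))
```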
While the description herein references a grid-based sensor incorporating a series of infrared emitters and receivers, the invention is equally applicable to other grid-based sensors. For example, the invention may be used with laser break-beam grids, resistive grids, capacitive grids, and arrays of acoustic, e.g. ultrasonic, emitters and microphones. The invention may also be used with non-grid-based sensors that return contact information in the form of a bounding box.
Finally, while the invention is described with reference to a rectangular bounding box, alternative embodiments of the invention may use non-rectangular bounding boxes. For example, a touch sensor incorporating corner based sensors that determine an angular bearing to each point of contact may return contact information in the form of a quadrilateral bounding box. The techniques described herein can be applied to a generalized quadrilateral bounding box with appropriate definition of a bounding box center, width, height, aspect ratio, and diagonal. The invention may thus be used in conjunction with sensors that are not strictly grid-based.
Although the invention is described herein with reference to several embodiments, including the preferred embodiment, one skilled in the art will readily appreciate that other applications may be substituted for those set forth herein without departing from the spirit and scope of the invention.
Accordingly, the invention should only be limited by the following Claims.
This application claims priority to U.S. patent application Ser. No. 10/913,105 entitled Touch-Detecting Interactive Display filed Aug. 6, 2004, and to U.S. provisional patent application No. 60/647,343 entitled Touch Table Detection and Gesture Recognition Technologies filed Jan. 25, 2005; both of which are incorporated herein in their entirety by this reference thereto.
U.S. Patent Documents

Number | Name | Date | Kind
---|---|---|---
3478220 | Milroy | Nov 1969 | A |
3673327 | Johnson | Jun 1972 | A |
3764813 | Clement | Oct 1973 | A |
3775560 | Ebeling | Nov 1973 | A |
3860754 | Johnson | Jan 1975 | A |
4144449 | Funk | Mar 1979 | A |
4247767 | O'Brien | Jan 1981 | A |
4463380 | Hooks, Jr. | Jul 1984 | A |
4507557 | Tsikos | Mar 1985 | A |
4517559 | Deitch | May 1985 | A |
4722053 | Dubno | Jan 1988 | A |
4742221 | Sasaki | May 1988 | A |
4746770 | McAvinney | May 1988 | A |
4782328 | Denlinger | Nov 1988 | A |
5105186 | May | Apr 1992 | A |
5239373 | Tang et al. | Aug 1993 | A |
5436639 | Arai et al. | Jul 1995 | A |
5448263 | Martin | Sep 1995 | A |
5483261 | Yasutake | Jan 1996 | A |
5512826 | Hardy et al. | Apr 1996 | A |
5528263 | Platzker | Jun 1996 | A |
5982352 | Pryor | Nov 1999 | A |
6008798 | Mato, Jr. | Dec 1999 | A |
6057845 | Dupouy | May 2000 | A |
6141000 | Martin | Oct 2000 | A |
6215477 | Morrison | Apr 2001 | B1 |
6232957 | Hinckley | May 2001 | B1 |
6333753 | Hinckley | Dec 2001 | B1 |
6335722 | Tani et al. | Jan 2002 | B1 |
6335724 | Takekawa | Jan 2002 | B1 |
6337681 | Martin | Jan 2002 | B1 |
6352351 | Ogasahara | Mar 2002 | B1 |
6384809 | Smith | May 2002 | B1 |
6414671 | Gillespie | Jul 2002 | B1 |
6421042 | Omura | Jul 2002 | B1 |
6429856 | Omura | Aug 2002 | B1 |
6504532 | Ogasahara | Jan 2003 | B1 |
6518959 | Ito | Feb 2003 | B1 |
6531999 | Trajkovic | Mar 2003 | B1 |
6532006 | Takekawa | Mar 2003 | B1 |
6563491 | Omura | May 2003 | B1 |
6594023 | Omura | Jul 2003 | B1 |
6608619 | Omura et al. | Aug 2003 | B2 |
6636635 | Matsugu | Oct 2003 | B2 |
6654007 | Ito | Nov 2003 | B2 |
6723929 | Kent | Apr 2004 | B2 |
6747636 | Martin | Jun 2004 | B2 |
6764185 | Beardsley | Jul 2004 | B1 |
6765558 | Dotson | Jul 2004 | B1 |
6788297 | Itoh et al. | Sep 2004 | B2 |
6791700 | Omura | Sep 2004 | B2 |
6803906 | Morrison | Oct 2004 | B1 |
6810351 | Katsurahira | Oct 2004 | B2 |
6825890 | Matsufusa | Nov 2004 | B2 |
6828959 | Takekawa | Dec 2004 | B2 |
6888536 | Westerman | May 2005 | B2 |
6922642 | Sullivan | Jul 2005 | B2 |
6999061 | Hara et al. | Feb 2006 | B2 |
7339580 | Westerman et al. | Mar 2008 | B2 |
7474296 | Obermeyer et al. | Jan 2009 | B2 |
20010019325 | Takekawa | Sep 2001 | A1 |
20010022579 | Hirabayashi | Sep 2001 | A1 |
20010026268 | Ito | Oct 2001 | A1 |
20020036617 | Pryor | Mar 2002 | A1 |
20020185981 | Dietz | Dec 2002 | A1 |
20030001825 | Omura et al. | Jan 2003 | A1 |
20030063775 | Rafii et al. | Apr 2003 | A1 |
20030137494 | Tulbert | Jul 2003 | A1 |
20030231167 | Leung | Dec 2003 | A1 |
20040046744 | Rafii et al. | Mar 2004 | A1 |
20050052427 | Wu et al. | Mar 2005 | A1 |
20060022955 | Kennedy | Feb 2006 | A1 |
20060026521 | Hotelling et al. | Feb 2006 | A1 |
20060026536 | Hotelling et al. | Feb 2006 | A1 |
20070252821 | Hollemans et al. | Nov 2007 | A1 |
20070268273 | Westerman et al. | Nov 2007 | A1 |
20080211785 | Hotelling et al. | Sep 2008 | A1 |
Foreign Patent Documents

Number | Date | Country
---|---|---
0881591 (B1) | Dec 1998 | EP |
0881592 (B2) | Dec 1998 | EP |
2001175807 | Jun 2001 | JP |
Prior Publication Data

Number | Date | Country
---|---|---
20060288313 A1 | Dec 2006 | US |
Related U.S. Application Data

Number | Date | Country
---|---|---
60647343 | Jan 2005 | US |