1. Technical Field
The invention relates to interactive displays. More particularly, the invention relates to multi-user displays suitable for displaying geographical imagery.
2. Description of the Prior Art
There are many situations in which one or more individuals interactively explore image-based data. For example, a team of paleontologists may wish to discuss an excavation plan for a dig at a remote site. To do so, they must explore in detail the geographic characteristics of the site as represented on digitized maps. In most laboratories, this requires the team either to huddle around a single workstation and view maps and images on a small display, or to sit at separate workstations and converse by phone.
The activity of exploring imagery is much more easily and effectively accomplished with the users surrounding a single large display. While such displays do exist, the interface to the display is typically still operated by a single user. For example, the National Center for Supercomputing Applications (NCSA) and the Geographic Modeling Systems Laboratory have developed an interactive display for museum exhibits in which imagery is displayed on a large horizontal projection surface (http://archive.ncsa.uiuc.edu/Cyberia/RWConsort_UIUC/index.html). A nearby touch screen control console allows visitors to move through and investigate 3-D graphical representations of the exhibited geographic region. However, as the adjacent touch screen console must still be operated by a single user, the ability of the team as a whole to interact meaningfully with the display is limited.
Moreover, the interfaces to most displays are not particularly intuitive. While the NCSA system does provide a convenient touch screen interface, it is disjoint from the display itself, distancing the controlling user from the display and lessening any sense of immersion within the imagery. Other displays do not offer even this level of convenience, requiring users to manipulate the imagery through the use of keyboards and pointing devices such as mice.
It would be advantageous to provide a collaborative display with which several users can meaningfully discuss and interact with imagery as a group. It would also be advantageous if such a display allowed the users to control the imagery and underlying information through direct interaction with the display itself, using a set of natural gestures. It would be further advantageous to provide a display that is useful for individuals exploring Geographic Information Systems (GIS) data, such as scientists and military planners.
The invention provides an interactive display that is controlled by user gestures identified on a touch detecting display surface. In the preferred embodiment of the invention, imagery is projected onto a horizontal projection surface from a projector located above the projection surface. Locations where a user contacts the projection surface are detected using a set of infrared emitters and receivers arrayed around the perimeter of the projection surface. For each contact location, a computer software application stores a history of contact position information and, from the position history, determines a velocity for each contact location. Based upon the position history and the velocity information, gestures are identified. The identified gestures are associated with display commands that are executed to update the displayed imagery accordingly. Thus, the invention enables users to control the display through direct physical interaction with the imagery.
The contact locations may be detected by any of several approaches, including infrared emitters and receivers, a capacitive or resistive touch pad, ultrasound, and visual analysis of a material layer below the display surface that exhibits a visible change in response to applied pressure. Optionally, the position history and velocity information are supplemented with measurements of the intensity with which a user contacts the display surface.
In the preferred embodiment of the invention, gestures are identified and associated with commands by pairing each contact location with a pixel within the imagery and updating the imagery, such that each of the pixels remains coincident with the corresponding contact location. Alternatively, gestures are identified by classifying the position history and velocity information as one of several distinct, allowable gestures.
Commands that may be associated with the identified gestures include, for example, panning, zooming and rotation. Objects represented within the imagery may be selected, and menus and submenus may be navigated. If the displayed imagery contains imagery layers, as in the case of geographical information systems imagery, the visibility and transparency of the layers may be adjusted. The displayed imagery preferably includes a control interface, such as a menu, positioned near and oriented towards a point on the edge of the display surface. At the request of a user, the control interface may be repositioned near and oriented towards another point on the edge of the display surface. Finally, the display surface is preferably surrounded by a railing that provides a visual cue that discourages users from leaning onto the display surface.
The invention provides an interactive display that is controlled by user gestures identified on a touch detecting display surface.
The display surface is capable of detecting when and where a user touches it. Based upon this information, user gestures are identified, and a command associated with the user gesture is determined. The command is executed, altering the displayed imagery in the manner requested by the user via the gesture.
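The control flow just described, namely detecting contacts, identifying a gesture, and executing the associated command to update the imagery, can be summarized in a short sketch. The following Python fragment is purely illustrative; the callables read_contacts, identify_gesture, and execute_command are hypothetical placeholders, not elements of the invention.

```python
# Minimal sketch of the sense -> identify -> execute loop described above.
# The callables (read_contacts, identify_gesture, execute_command) are
# hypothetical placeholders used for illustration only.

def run_display_loop(read_contacts, identify_gesture, execute_command, steps=100):
    """Repeatedly sample contacts, identify a gesture, and update the display."""
    for _ in range(steps):
        contacts = read_contacts()            # when and where the surface is touched
        gesture = identify_gesture(contacts)  # classify from position/velocity history
        if gesture is not None:
            execute_command(gesture)          # e.g. pan, zoom, or rotate the imagery

# Example wiring with trivial stand-ins:
if __name__ == "__main__":
    run_display_loop(
        read_contacts=lambda: [],             # no touches in this stub
        identify_gesture=lambda contacts: None,
        execute_command=lambda gesture: None,
        steps=3,
    )
```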
While projection from above onto a horizontally oriented display is preferred, other display surface orientations, projector configurations, and display technologies are possible. For example, a horizontally oriented rear-projection surface may be used as the display surface, with the projector mounted below the display surface, projecting in an upward direction. This approach offers the advantage of eliminating the shadows generated in those configurations where a user may position his body between the projector and the projection surface. The display may also be mounted in a vertical orientation and affixed to a wall or other supporting structure. In this case, non-projection, thin-profile display technologies may be most appropriate, such as LCDs, OLEDs, or plasma displays, although those skilled in the art will appreciate that any display technology may be used in connection with the invention herein.
Detection of when and where a user touches the display surface may be achieved by a number of different approaches. In the preferred embodiment of the invention, a set of infrared emitters and receivers is arrayed around the perimeter of the projection surface, oriented such that each emitter emits light in a plane a short distance above the projection surface. The location where the user is touching the projection surface is determined by considering which emitters are and are not occluded as viewed from each of the receivers. A configuration incorporating a substantially continuous set of emitters around the perimeter and three receivers, each positioned in a corner of the projection surface, is particularly effective in resolving multiple locations of contact.
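By way of illustration only, the occlusion principle can be sketched in a few lines of Python. The geometry below, with one occluded sight line per receiver and the contact estimated as the intersection of the two lines, is a simplifying assumption for exposition and not the patented detection procedure.

```python
# Illustrative sketch of locating a touch from occluded emitter/receiver sight lines.
# Geometry and data format are simplifying assumptions: each receiver reports the
# position of an emitter whose light it no longer sees, and the touch is estimated
# as the intersection of the corresponding receiver-to-emitter sight lines.

import numpy as np

def intersect_lines(p1, d1, p2, d2):
    """Intersect two 2-D lines given points p and direction vectors d."""
    # Solve p1 + t*d1 = p2 + s*d2 for t.
    a = np.array([[d1[0], -d2[0]], [d1[1], -d2[1]]], dtype=float)
    b = np.array([p2[0] - p1[0], p2[1] - p1[1]], dtype=float)
    t, _ = np.linalg.solve(a, b)
    return np.asarray(p1, dtype=float) + t * np.asarray(d1, dtype=float)

def locate_contact(receiver_a, occluded_emitter_a, receiver_b, occluded_emitter_b):
    """Estimate the contact point from one occluded sight line per receiver."""
    d_a = np.subtract(occluded_emitter_a, receiver_a)
    d_b = np.subtract(occluded_emitter_b, receiver_b)
    return intersect_lines(receiver_a, d_a, receiver_b, d_b)

# Example: receivers in two corners of a unit-square surface, with sight lines
# that cross near the centre of the surface.
print(locate_contact((0.0, 0.0), (1.0, 1.0), (1.0, 0.0), (0.0, 1.0)))  # ~[0.5, 0.5]
```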
Alternatively, a resistive touch pad, such as those commonly used in laptop computers, may be placed beneath a flexible display surface. The resistive touch pad comprises two layers of plastic that are separated by a compressible insulator such as air, and a voltage differential is maintained across the separated layers. When the upper layer is touched with sufficient pressure, it is deflected until it contacts the lower layer, changing the resistive characteristics of the upper to lower layer current pathway. By considering these changes in resistive characteristics, the location of the contact can be determined. Capacitive touch pads may also be used, such as the Synaptics TouchPad™ (www.synaptics.com/products/touchpad.cfm).
In yet another embodiment of the invention, a thin layer of material that changes optical properties in response to pressure, such as a liquid crystal film, is placed beneath a flexible display surface. One or more video cameras trained on the underside of the material capture the changes in optical properties that occur when a user touches the projection surface and therefore applies pressure to the thin layer. The location of contact is then determined through analysis of the video camera images. Alternatively, ultrasound may be used to detect contact information. Further, a combination of such schemes, e.g. IR and ultrasound, may be used to detect contact information.
Regardless of the approach used to determine contact locations on the display surface, the location information is analyzed to identify user gestures. In the preferred embodiment of the invention, the infrared emitters and receivers periodically provide location information to a computer software application. For each of M distinct contact locations, the software records over time the discrete sequence of positions $\vec{x}_i(n)$, where $i \in [1, M]$. To determine whether or not two contact locations are distinct from one another, the distance between the points of contact may be considered, with two points separated by a distance greater than a predetermined threshold distance $\varepsilon_d$ considered distinct contact locations.
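As a rough sketch of this bookkeeping, the following Python fragment appends each reported contact point to the nearest existing position history and opens a new history when the point lies farther than the threshold $\varepsilon_d$ from all existing contacts. The data layout and the nearest-track matching rule are illustrative assumptions.

```python
# Sketch of maintaining per-contact position histories, treating two touch points
# as distinct contacts when they are farther apart than a threshold epsilon_d.
# The data structures and matching rule are illustrative assumptions.

import math

EPSILON_D = 0.03  # distinctness threshold, in the same units as the positions

def update_tracks(tracks, raw_points):
    """Append each raw point to the nearest existing track, or start a new one.

    tracks: list of position histories, each a list of (x, y) tuples.
    raw_points: contact points reported for the current sampling interval.
    """
    for point in raw_points:
        best, best_dist = None, float("inf")
        for history in tracks:
            dist = math.dist(history[-1], point)
            if dist < best_dist:
                best, best_dist = history, dist
        if best is not None and best_dist <= EPSILON_D:
            best.append(point)          # same contact: extend its history
        else:
            tracks.append([point])      # farther than epsilon_d: a distinct contact
    return tracks

# Example: a second touch far from the first starts a new track.
print(update_tracks([[(0.10, 0.10)]], [(0.11, 0.10), (0.60, 0.40)]))
```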
By considering a recent history of the contact positions, a discrete sequence of velocities $\vec{\nu}_i(n)$, including both a rate and direction of motion, is determined for each distinct contact. Most simply,

$$\vec{\nu}_i(n) = \frac{\vec{x}_i(n) - \vec{x}_i(n-1)}{\Delta T},$$

where $\Delta T$ is the interval at which the infrared emitters and receivers periodically provide location information. Preferably, to provide a smoothing effect and to mitigate the effects of noise in the contact location measurements, a longer history of position information is incorporated. For example, the expression

$$\vec{\nu}_i(n) = \frac{\vec{x}_i(n) - \vec{x}_i(n-N)}{N \, \Delta T}$$

may be used, where $N$ is the number of time steps that are considered in the recent history. More sophisticated techniques may also be employed, such as a least squares curve fit to the recent history of contact positions.
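The finite-difference velocity estimate above may be sketched as follows; the sampling interval, the history length, and the tuple layout are illustrative assumptions.

```python
# Sketch of estimating contact velocity from a recorded position history,
# using the smoothed N-step finite difference described above.  The sampling
# interval and history layout are illustrative assumptions.

def estimate_velocity(history, delta_t=0.02, n_steps=4):
    """Return (vx, vy) from the last n_steps samples of an (x, y) history."""
    n = min(n_steps, len(history) - 1)
    if n < 1:
        return (0.0, 0.0)  # not enough history to form a difference
    (x_old, y_old), (x_new, y_new) = history[-1 - n], history[-1]
    return ((x_new - x_old) / (n * delta_t), (y_new - y_old) / (n * delta_t))

# Example: a contact moving steadily to the right at 1 unit per second.
history = [(0.02 * k, 0.0) for k in range(6)]
print(estimate_velocity(history))  # approximately (1.0, 0.0)
```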
The position and velocity information determined in this manner is used to identify gestures. The gestures are in turn associated with specific display commands, which are executed to update the display as desired by the user. In the preferred embodiment of the invention, gestures are both identified and associated with display control commands via a single procedure. Up to two distinct contact locations are tracked by the software; that is, M is constrained to be no greater than two at any one time interval n. If a third contact location is identified, it is ignored until one or both of the existing contact locations is released by the user. The one or two contact locations are associated with individual points within the displayed imagery, i.e. pixels. As the contact locations move, the display is updated such that the corresponding pixels within the image remain coincident with the contact locations. Thus, to the user, the displayed image appears and feels like an elastic sheet of paper that can be translated, rotated, and stretched as desired.
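One way to realize this elastic-sheet behavior for two tracked contacts is to compute, at each update, the two-dimensional similarity transform (translation, rotation, and uniform scale) that carries the anchored image points from their previous positions to the current contact positions. The complex-number formulation below is an illustrative choice rather than the specific procedure of the preferred embodiment.

```python
# Sketch of the "elastic sheet" update for two tracked contacts: compute the
# translation, rotation, and uniform scale that carry the two anchored image
# points from their original contact positions to the current ones, so the
# anchored pixels stay under the user's fingers.  Using a 2-D similarity
# transform here is an illustrative choice, not necessarily the exact procedure.

import cmath

def two_point_transform(p1_old, p2_old, p1_new, p2_new):
    """Return (scale, rotation_radians, translation) mapping old points to new."""
    a_old, b_old = complex(*p1_old), complex(*p2_old)
    a_new, b_new = complex(*p1_new), complex(*p2_new)
    # Complex ratio of the new segment to the old segment encodes scale and rotation.
    z = (b_new - a_new) / (b_old - a_old)
    scale, angle = abs(z), cmath.phase(z)
    # Translation chosen so the first anchored point maps exactly onto its new position.
    translation = a_new - z * a_old
    return scale, angle, (translation.real, translation.imag)

# Example: the fingers move apart and twist slightly -> combined zoom and rotation.
print(two_point_transform((0, 0), (1, 0), (0, 0), (1, 0.2)))
```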
An important feature of this approach to identifying gestures and controlling the displayed imagery is that the basic motions described above can be combined to effect more complicated changes in the displayed imagery. For example, if a user establishes a right and a left contact location and initiates an offset separating motion, upward on the right and downward on the left, the result is a combined inward zoom and counterclockwise rotation.
In practice, inexperienced users of the interactive display may be overwhelmed by the power and flexibility of the above gestures and associated display control commands. For example, a user may be confused or disoriented when an imperfect attempt to zoom inward results in a combined inward zoom and a slight rotation. It may therefore be beneficial for the gesture identification procedure to classify gestures as one of several distinct, allowable gestures.
For example, if a user establishes and moves two contact locations and the velocities of the two contacts are directed in substantially the same direction, that is, if

$$|\phi_1 - \phi_2| < \theta_{cp},$$

where $\phi_1$ and $\phi_2$ are the directions of the two contact velocities and $\theta_{cp}$ is a predetermined tolerance, then either the first or second contact location and velocity is ignored, or the two contact locations and velocities are averaged. In either case, a single contact location and velocity is obtained, and the identified gesture is associated with a panning movement of the display.
As a further example, consider an instance in which a user establishes and moves two contact locations such that the velocities of the two contacts are directed substantially opposite one another. That is, if

$$|\pi - \phi_d| < \theta_{cz},$$

where $\phi_d$ is the difference between the directions of the two contact velocities and $\theta_{cz}$ is a predetermined tolerance, then the gesture is identified as a request for a zoom. To determine whether the display should be zoomed in or zoomed out, the length of the line segment $S_{12}$ connecting the two contact locations is considered. If the length is decreasing, the gesture is identified as a request for an outward zoom. Correspondingly, if the length is increasing, the gesture is identified as a request for an inward zoom.
As yet another example, consider an instance in which a user establishes and moves two contact locations such that the velocities of the two contacts are directed substantially opposite one another and substantially perpendicular to the line segment connecting the two contact locations, to within a predetermined tolerance $\theta_{cr}$. In this case, the gesture is identified as a request for a rotation of the display. The direction of the rotation may be determined by computing a cross product of one of the velocity vectors and a vector $\vec{s}_{12}$ connecting the two positions $\vec{x}_1$ and $\vec{x}_2$.
Finally, consider the instance in which none of the above criteria for a pan, zoom, or rotation is satisfied. In this case, the gesture cannot be identified as a pure pan, zoom, or rotation. The gesture is therefore ignored, and the user may be alerted with an audible tone or visual cue that the gesture was not identified.
As noted above, if a gesture is identified, the display is updated accordingly. In each of the three cases, the rate at which the display is altered, e.g. panned, zoomed, or rotated, is proportional to the magnitude of the velocity of the contact points. Either one of the two velocities may be selected, or an average magnitude may be computed. A gain or attenuation may be applied to this velocity to provide the desired balance of speed and precision in display control.
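For illustration, the pan, zoom, and rotation criteria above can be collected into a single classification routine. The threshold values, the test used to separate rotation from zoom, and the use of the relative velocity in the cross product are stated assumptions made for the sketch, not the definitive procedure.

```python
# Sketch of classifying a two-contact gesture as pan, zoom, rotate, or unknown
# from the two contact positions and velocities.  Threshold values and the
# rotation test are illustrative assumptions consistent with the description.

import math

THETA_CP = 0.35  # pan tolerance (radians)
THETA_CZ = 0.35  # zoom tolerance (radians)
THETA_CR = 0.35  # rotation tolerance (radians)

def angle(v):
    return math.atan2(v[1], v[0])

def angle_diff(a, b):
    """Absolute difference between two angles, wrapped to [0, pi]."""
    d = abs(a - b) % (2 * math.pi)
    return min(d, 2 * math.pi - d)

def classify(x1, x2, v1, v2):
    phi1, phi2 = angle(v1), angle(v2)
    phi_d = angle_diff(phi1, phi2)
    s12 = (x2[0] - x1[0], x2[1] - x1[1])

    if phi_d < THETA_CP:
        return "pan"
    if abs(math.pi - phi_d) < THETA_CZ:
        # Velocities roughly opposite.  Distinguish zoom from rotation by how the
        # motion lies relative to the segment s12 (an illustrative disambiguation).
        perp = abs(angle_diff(phi1, angle(s12)) - math.pi / 2) < THETA_CR
        if perp:
            # Sign of the cross product of s12 with the relative velocity gives the
            # sense of rotation (positive -> counterclockwise in x-right, y-up axes).
            cross = s12[0] * (v2[1] - v1[1]) - s12[1] * (v2[0] - v1[0])
            return "rotate counterclockwise" if cross > 0 else "rotate clockwise"
        # Otherwise the contacts move along the segment: zoom in if separating.
        separating = (v2[0] - v1[0]) * s12[0] + (v2[1] - v1[1]) * s12[1] > 0
        return "zoom in" if separating else "zoom out"
    return "unknown"

# Examples: parallel motion pans; opposed, separating motion zooms in;
# opposed motion perpendicular to the connecting segment rotates.
print(classify((0, 0), (1, 0), (0.1, 0.0), (0.1, 0.0)))   # pan
print(classify((0, 0), (1, 0), (-0.1, 0.0), (0.1, 0.0)))  # zoom in
print(classify((0, 0), (1, 0), (0.0, -0.1), (0.0, 0.1)))  # rotate counterclockwise
```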
The gesture based display control commands, such as those above, may be supplemented by other gestures and associated commands that extend the ability of users to interact intuitively with the information provided on the display. For example, if a user touches the display surface at a point corresponding to a displayed object for which properties are known, and the user maintains the contact for a period of time longer than a predetermined period of time τ1, the corresponding displayed object is selected. The user may then be presented with a series of operations that may be performed on the object. For example, a user selecting a city may be presented with options to list and update the census data associated with the city. Detailed information of this nature may be provided to the user directly on the display surface or via an auxiliary display located near the display surface.
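The touch-and-hold selection described above may be sketched as follows; the hit-test callable, the threshold value, and the class structure are hypothetical and serve only to illustrate the timing logic.

```python
# Sketch of touch-and-hold selection: a contact that stays on a displayed object
# longer than a threshold tau_1 selects that object.  The hit_test callable and
# the threshold value are illustrative assumptions.

import time

TAU_1 = 0.8  # hold duration threshold, in seconds

class HoldSelector:
    def __init__(self, hit_test, tau=TAU_1):
        self.hit_test = hit_test  # maps (x, y) to a displayed object, or None
        self.tau = tau
        self.start_time = None
        self.candidate = None

    def on_contact(self, position, now=None):
        """Call once per sampling interval while the contact persists."""
        now = time.monotonic() if now is None else now
        obj = self.hit_test(position)
        if obj != self.candidate:
            self.candidate, self.start_time = obj, now  # contact moved onto a new object
            return None
        if obj is not None and now - self.start_time >= self.tau:
            return obj  # held long enough: the object is selected
        return None

# Example with a stub hit test: any touch maps to the object "city".
selector = HoldSelector(hit_test=lambda pos: "city")
selector.on_contact((0.4, 0.6), now=0.0)
print(selector.on_contact((0.4, 0.6), now=1.0))  # "city" (held for 1.0 s >= tau)
```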
The gesture based display control commands described above may be further supplemented by commands activated when a user touches specially defined regions within the display surface associated with a control interface presented within the imagery. A primary example is the use of menus, through which users may perform more complicated operations than can be described using simple gestures.
The particular configuration of load cells shown in
Although the invention is described herein with reference to several embodiments, including the preferred embodiment, one skilled in the art will readily appreciate that other applications may be substituted for those set forth herein without departing from the spirit and scope of the invention.
Accordingly, the invention should only be limited by the following claims.