This application is related to the U.S. patent applications identified in the related-application data at the end of this document.
The present disclosure relates generally to user interaction with an electronic device, and more particularly to dual-sided gestures implemented using an electronic device that accepts touch input on multiple sides.
Electronic device manufacturers are increasingly using touch-sensitive displays (touch screens), which enable a device to visually convey information to a user and enable the user to interact contextually with displayed graphical user interface elements or otherwise provide input to the electronic device. Some electronic device manufacturers are contemplating devices with a touch pad as well as a touch screen. In one contemplated configuration, a touch-sensitive display is placed on an obverse side of a housing of an electronic device and a touch pad is placed on a reverse side of the housing. Given this contemplated configuration, there are various opportunities to develop new touch interactions with an electronic device.
The various aspects, features and advantages of the disclosure will become more fully apparent to those having ordinary skill in the art upon careful consideration of the following Drawings and accompanying Detailed Description.
An electronic device for manipulating graphical user interface elements has a touch-sensitive display (touch screen) on an obverse side of the electronic device and a touch-sensitive surface (touch pad) on a reverse side of the electronic device. The electronic device displays at least two graphical user interface elements (data icon, program icon, application window, digital photograph, etc.) on the touch screen. A user touches a first element using either the touch screen or the touch pad. This touch selects the first element and “anchors” it while the user's slide motion on the other touch-sensitive surface manipulates a second element.
The slide contact sensed on the other touch-sensitive surface can be interpreted by the electronic device as a drag (lateral movement of the second element within the plane of the display screen) relative to the first element, a push (virtual movement of the second element in front of or behind the plane of the display screen) relative to the first element, a rotate (rotational movement of the second element within the plane of the display screen) relative to the first element, or a pixel-based move (zoom in/out or enlarge/reduce) relative to the first element.
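As a loose illustration (not part of the disclosure), the four interpretations could be modeled as a simple enumeration; all names below are hypothetical:

```python
from enum import Enum, auto

class Manipulation(Enum):
    """Four interpretations of a slide contact, per the summary above."""
    DRAG = auto()        # lateral movement within the plane of the display
    PUSH = auto()        # virtual movement in front of or behind the plane
    ROTATE = auto()      # rotational movement within the plane of the display
    PIXEL_MOVE = auto()  # zoom in/out or enlarge/reduce
```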
Various features of the slide movement, such as the speed, the length, the pressure, the direction, and/or the pattern may affect the interpretation. For example, a rotational slide movement may direct the electronic device to rotate the second element relative to the first element while a linear slide movement may direct the electronic device to drag the second element relative to the first element. As another example, sliding to the right may control a zoom out (with the length of the slide movement relating to the percentage of zoom out) while sliding to the left may control a zoom in (with the length of the slide movement relating to the percentage of zoom in).
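A minimal classifier along these lines might inspect the sampled slide path. The sketch below is assumption-laden: the 2.5 circularity threshold, the length-to-percentage scaling, and the function name are all invented for illustration, not taken from the disclosure.

```python
import math

def classify_slide(points):
    """Map a sampled slide path [(x, y), ...] to a (kind, parameter) pair.

    Assumes at least two samples; all thresholds are illustrative only.
    """
    (x0, y0), (xn, yn) = points[0], points[-1]
    dx, dy = xn - x0, yn - y0
    net = math.hypot(dx, dy)                         # net displacement
    path = sum(math.hypot(b[0] - a[0], b[1] - a[1])
               for a, b in zip(points, points[1:]))  # total path length

    # A path much longer than its net displacement suggests a rotational
    # (circular) slide; a straighter path suggests a linear slide.
    if path > 2.5 * net and path > 0:
        return ("rotate", path)
    # Example mapping from the text: slide right = zoom out, slide left =
    # zoom in, with the slide length setting the zoom percentage.
    if abs(dx) > abs(dy):
        percent = min(100.0, path / 4.0)             # invented scaling
        return ("zoom_out" if dx > 0 else "zoom_in", percent)
    return ("drag", (dx, dy))
```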
The touch and slide movements on touch-sensitive surfaces on different sides of the electronic device are at least partially overlapping in time, with the touch starting prior to or simultaneously with the slide. By supporting dual-sided gestures, the electronic device may enable known graphical user interface interactions and provide support for additional graphical user interface interactions.
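The temporal constraint might be checked as follows; the timestamp parameters are assumptions made for this sketch:

```python
def is_dual_sided_gesture(touch_start, touch_end, slide_start, slide_end):
    """True when the touch and slide contacts overlap in time and the
    touch began prior to, or simultaneously with, the slide."""
    overlap = max(touch_start, slide_start) < min(touch_end, slide_end)
    return overlap and touch_start <= slide_start
```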
The electronic device 100 shown is a portable, handheld electronic device such as a mobile phone, remote controller, personal digital assistant, portable audio or video player, handheld game console, or the like; however, the electronic device could be implemented as a non-handheld device such as an interactive table-like surface.
The touch screen 181 shows, in this example, six graphical user interface elements 110, 112, 114, 120, 122, 124. These elements can be data icons (e.g., file folders, documents, spreadsheets, address book contacts, photo files, music files, video files, electronic book files, etc.) to access data, program icons (e.g., word processing, presentation, browser, media player, calendar, geographic navigation, electronic mail or text messaging, electronic games, etc.) to access software applications, application windows (for individual instances of opened software applications, file folder navigation, etc.), links/shortcuts to any of the above, and the like. If a user would like to manipulate one or more of the graphical user interface elements, the user touches a first element, using either the touch screen 181 on the obverse side 191 or the touch pad 185 on the reverse side 195, to select it. If a slide contact occurs on the other touch-sensitive surface, overlapping in time with the touch contact, the electronic device 100 will manipulate the other graphical user interface elements relative to the selected first element.
Although a two-dimensional matrix layout has been shown, the graphical user interface screen can be reduced to a one-dimensional matrix layout such as a list of song files, electronic book files, address book contact files, etc.
The dual-sided gesture has been illustrated as a single-handed gesture; however, two hands can be used to perform the touch and slide movements shown. In some electronic device implementations, the electronic device 100 as shown is only a portion of a larger electronic device akin to a hinged laptop computer. When the laptop-configured electronic device is in an open position, using two hands to perform the dual-sided gesture may be ergonomically easier—depending on the size and location of the touch screen, the size and location of the touch pad, and individual user preferences.
Initially, the electronic device displays 510 at least two graphical user interface elements on a touch screen. Next, the electronic device selects 520 a first element based on a touch contact detected on a touch-sensitive surface. The selection may be indicated to a user visually or audibly as previously described. After that, the electronic device manipulates 530 a non-selected graphical user interface element based on a slide contact detected on a different touch-sensitive surface.
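Steps 510-530 could be sketched as a small controller. Every class, hook, and manipulation kind below is a hypothetical stand-in chosen to mirror the description, not the disclosed implementation:

```python
class Element:
    """Toy graphical user interface element with a position and scale."""
    def __init__(self, x=0.0, y=0.0):
        self.x, self.y, self.scale = x, y, 1.0

    def apply(self, kind, amount):
        if kind == "drag":                       # lateral move in the plane
            dx, dy = amount
            self.x += dx
            self.y += dy
        elif kind in ("zoom_in", "zoom_out"):    # pixel-based move
            step = amount / 100.0
            self.scale *= (1.0 + step) if kind == "zoom_in" else (1.0 - step)
        # rotate and push are omitted here for brevity

class DualSidedGestureController:
    def __init__(self, elements):
        self.elements = elements                 # step 510: displayed elements
        self.selected = None
        self.slide_active = False

    def on_touch(self, element):                 # step 520: touch selects/anchors
        self.selected = element

    def on_slide(self, kind, amount):            # step 530: slide manipulates
        if self.selected is None:
            return                               # no anchor: ignore the slide
        self.slide_active = True
        for el in self.elements:
            if el is not self.selected:
                el.apply(kind, amount)           # only non-selected elements move
```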
Different types of manipulations are possible depending on the operational mode of the electronic device, the type of graphical user interface element selected via the touch contact, the pattern of the slide movement, and other factors. One type of manipulation drags 532 the non-selected element with respect to the first element as shown in
The electronic device can optionally cease 540 manipulating based on detecting cessation of the slide contact. If the touch contact has been persistent throughout the slide contact, the electronic device can manipulate 550 the first element based on movement of the touch contact after the slide contact ceases.
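Continuing the controller sketch above, steps 540 and 550 might be handled as follows (again, with hypothetical names):

```python
def on_slide_end(controller):
    """Step 540: cease manipulating when the slide contact lifts."""
    controller.slide_active = False

def on_touch_move(controller, dx, dy):
    """Step 550: if the touch persisted throughout the slide, movement of
    that touch after the slide ceases drags the selected element itself."""
    if not controller.slide_active and controller.selected is not None:
        controller.selected.x += dx
        controller.selected.y += dy
```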
Although the touch and the slide are performed on different touch-sensitive surfaces, the electronic device is agnostic as to whether the touch contact is performed on a touch-sensitive surface on the obverse side or the reverse side.
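That symmetry might be expressed by assigning gesture roles from contact behavior rather than from surface identity; the `Contact` fields and stillness threshold below are assumptions:

```python
from dataclasses import dataclass

@dataclass
class Contact:
    surface: str   # "obverse" or "reverse"
    travel: float  # total movement in pixels since touchdown

def assign_roles(a: Contact, b: Contact, stillness_px: float = 8.0):
    """Return (anchor, slide) regardless of which side each contact is on:
    the near-stationary contact selects, the moving contact manipulates."""
    return (a, b) if a.travel <= stillness_px else (b, a)
```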
The dragging manipulation of non-selected elements relative to a stationary selected element allows a user to move one or more elements within the plane of the display screen. A different mode can allow a push manipulation to move non-selected elements virtually in front of or behind a selected element relative to the plane of the display screen.
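One way to model a push is as a z-order adjustment; the integer z model below is an invented illustration:

```python
from dataclasses import dataclass

@dataclass
class StackedElement:
    name: str
    z: int = 0   # larger z = virtually closer to the viewer (assumed model)

def push(elements, anchor, steps):
    """Move every non-selected element `steps` layers in front of (positive)
    or behind (negative) the anchored element."""
    for el in elements:
        if el is not anchor:
            el.z += steps
```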
As alluded to previously, using a touch pad on a reverse side of an electronic device to select an element from the “bottom” of a virtual stack of user interface elements may be easier for a user than trying to select that same element using a touch screen.
In addition to linear slide movements directing drag and push interactions with graphical user interface elements, a circular slide movement may direct rotate interactions with graphical user interface elements.
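The rotate interaction might derive an incremental angle from successive slide samples taken about the anchored element's center; this is a sketch of one plausible computation, not the disclosed method:

```python
import math

def rotation_step(center, prev, curr):
    """Signed angle (radians) swept about `center` as the slide contact
    moves from `prev` to `curr`; all points are (x, y) tuples."""
    a0 = math.atan2(prev[1] - center[1], prev[0] - center[0])
    a1 = math.atan2(curr[1] - center[1], curr[0] - center[0])
    # Wrap into [-pi, pi) so small circular motions yield small steps.
    return (a1 - a0 + math.pi) % (2 * math.pi) - math.pi
```

Summing these steps over the circular slide gives the total rotation to apply to the non-selected elements.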
A slide user interaction on a second touch-sensitive surface can also direct a pixel-based move relative to a first element selected using a touch user interaction on a first touch-sensitive surface. This variant mode uses a touch user interaction to select a pixel to “anchor” instead of a graphical user interface element in its entirety. Pixel-based moves allow a user to direct zoom in/out interactions and enlarge/reduce interactions.
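Anchoring a pixel during a zoom amounts to scaling coordinates about that pixel. A minimal sketch:

```python
def zoom_about_pixel(point, anchor, factor):
    """Scale `point` about `anchor` by `factor` (> 1 zooms in, < 1 zooms
    out); the anchored pixel itself stays fixed on screen."""
    px, py = point
    ax, ay = anchor
    return (ax + (px - ax) * factor, ay + (py - ay) * factor)
```

For example, `zoom_about_pixel((10, 10), (0, 0), 2.0)` returns `(20.0, 20.0)`: every point doubles its distance from the anchored pixel while the anchor stays put.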
More complicated slide user interactions may be implemented on an electronic device.
A memory 3389, coupled to the processor 3388, stores software programs for manipulating graphical user interface elements in accordance with the flow diagram of
When executing various software programs, the processor 3388 implements a touch detection module 3371 for detecting a touch user interaction on a first touch-sensitive surface of the electronic device 3300. The touch detection module determines which graphical user interface element has been selected via the detected touch. A slide detection module 3375 detects a slide user interaction on a second touch-sensitive surface of the electronic device 3300. Based on the detected slide motion (possibly including the pressure, velocity, direction, and pattern of the slide motion), a graphical user interface element manipulation module 3376 manipulates the non-selected graphical user interface elements relative to the selected element(s). As mentioned previously, the manipulation may be classified as a drag, push, rotate, or pixel-based move based on the slide user interaction. Signals from the manipulation module are coupled to the display screen controller 3382 to cause the graphical user interface elements to change as directed by the processor 3388.
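Read as software structure, the module decomposition might resemble the following sketch; every class name and signature is invented to mirror the description rather than taken from it:

```python
class TouchDetectionModule:                    # cf. touch detection module 3371
    def select(self, elements, x, y):
        """Return the topmost element containing the touch point, if any."""
        for el in reversed(elements):          # assumes back-to-front ordering
            if el.contains(x, y):
                return el
        return None

class SlideDetectionModule:                    # cf. slide detection module 3375
    def features(self, samples):
        """Reduce raw (x, y, t, pressure) samples to slide features
        (speed, length, pressure, direction, pattern)."""
        ...

class ManipulationModule:                      # cf. manipulation module 3376
    def __init__(self, display_controller):    # cf. display screen controller 3382
        self.display = display_controller

    def apply(self, selected, others, kind, amount):
        for el in others:                      # manipulate non-selected elements
            el.apply(kind, amount)
        self.display.refresh()                 # redraw as directed by the processor
```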
The electronic device 3300 can also include a variety of other components (not shown) depending on the particular implementation. For example, if the electronic device 3300 were implemented as a mobile phone, it would also include a microphone and a wireless transceiver, and possibly additional input components such as a keypad, accelerometer, and vibration alert. If the electronic device 3300 were implemented as a remote controller, an infrared transmitter could also be included.
The touch interaction 3410 and the slide interaction 3420 both continue for a period of time 3460, and it is generally expected that a user will release the touch interaction 3410 before completing the slide interaction 3420 as shown in
Thus, the electronic device and method for manipulating graphical user interface elements detect a touch user interaction on a first touch-sensitive surface of an electronic device to select a first interactive user element, detect a slide user interaction on a second touch-sensitive surface of the electronic device, and manipulate non-selected interactive user elements relative to the first interactive user element based on the slide user interaction. This document has disclosed drag, push, rotate, and pixel-based moves based on the slide user interaction.
While the present invention is susceptible of embodiment in various forms, the drawings show and the text describes embodiments with the understanding that the present disclosure is to be considered an exemplification of the invention and is not intended to limit the invention to the specific embodiments illustrated. Furthermore, while the various figures are intended to illustrate the various claimed aspects of the present invention, in doing so, the elements are not necessarily intended to be drawn to scale. In other words, the size, shape, and dimensions of some elements, features, components, and/or regions are for purposes of clarity (or for purposes of better describing or illustrating the concepts intended to be conveyed) and may be exaggerated and/or emphasized relative to other illustrated elements.
While various embodiments of the invention have been illustrated and described, it is to be understood that the invention is not so limited. Numerous modifications, changes, variations, substitutions, and equivalents will occur to those skilled in the art without departing from the spirit and scope of the present invention as defined by the appended claims.
Number | Name | Date | Kind |
---|---|---|---|
4076262 | Deventer | Feb 1978 | A |
5128671 | Thomas, Jr. | Jul 1992 | A |
5494447 | Zaidan | Feb 1996 | A |
5543588 | Bisset et al. | Aug 1996 | A |
5610971 | Vandivier | Mar 1997 | A |
5623280 | Akins et al. | Apr 1997 | A |
5729219 | Armstrong et al. | Mar 1998 | A |
5795300 | Bryars | Aug 1998 | A |
5832296 | Wang et al. | Nov 1998 | A |
5896575 | Higginbotham et al. | Apr 1999 | A |
5898600 | Isashi | Apr 1999 | A |
5959260 | Hoghooghi et al. | Sep 1999 | A |
6020878 | Robinson | Feb 2000 | A |
6201554 | Lands | Mar 2001 | B1 |
6233138 | Osgood | May 2001 | B1 |
6392870 | Miller, Jr. | May 2002 | B1 |
6457547 | Novitschitsch | Oct 2002 | B2 |
6466198 | Feinstein | Oct 2002 | B1 |
6504530 | Wilson et al. | Jan 2003 | B1 |
6532147 | Christ, Jr. | Mar 2003 | B1 |
6549789 | Kfoury | Apr 2003 | B1 |
6597347 | Yasutake | Jul 2003 | B1 |
6927747 | Amirzadeh et al. | Aug 2005 | B2 |
7058433 | Carpenter | Jun 2006 | B2 |
7075513 | Silfverberg et al. | Jul 2006 | B2 |
7205959 | Henriksson | Apr 2007 | B2 |
7218313 | Marcus et al. | May 2007 | B2 |
7423526 | Despotis | Sep 2008 | B2 |
7453442 | Poynter | Nov 2008 | B1 |
7453443 | Rytivaara et al. | Nov 2008 | B2 |
8265717 | Gorsica et al. | Sep 2012 | B2 |
8669863 | Alhuwaishel | Mar 2014 | B2 |
20010052122 | Nanos et al. | Dec 2001 | A1 |
20030103324 | Gallivan | Jun 2003 | A1 |
20030197678 | Siddeeq | Oct 2003 | A1 |
20030199290 | Viertola | Oct 2003 | A1 |
20030234768 | Rekimoto et al. | Dec 2003 | A1 |
20040192260 | Sugimoto et al. | Sep 2004 | A1 |
20040212599 | Cok et al. | Oct 2004 | A1 |
20050012723 | Pallakoff | Jan 2005 | A1 |
20050020325 | Enger et al. | Jan 2005 | A1 |
20050024339 | Yamazaki et al. | Feb 2005 | A1 |
20050031390 | Orozco-Abundis | Feb 2005 | A1 |
20050096106 | Bennetts et al. | May 2005 | A1 |
20050124395 | Bae et al. | Jun 2005 | A1 |
20050275416 | Hervieux et al. | Dec 2005 | A1 |
20050282596 | Park et al. | Dec 2005 | A1 |
20060017711 | Pihlaja | Jan 2006 | A1 |
20060024601 | Ogawa et al. | Feb 2006 | A1 |
20060034601 | Andersson et al. | Feb 2006 | A1 |
20060037175 | Hyun | Feb 2006 | A1 |
20060084482 | Saila | Apr 2006 | A1 |
20060092355 | Yang et al. | May 2006 | A1 |
20060111160 | Lin et al. | May 2006 | A1 |
20060139320 | Lang | Jun 2006 | A1 |
20060170649 | Kosugi et al. | Aug 2006 | A1 |
20060197750 | Kerr et al. | Sep 2006 | A1 |
20060284853 | Shapiro | Dec 2006 | A1 |
20070075915 | Cheon et al. | Apr 2007 | A1 |
20070076861 | Ju | Apr 2007 | A1 |
20070097151 | Rosenberg | May 2007 | A1 |
20070103454 | Elias | May 2007 | A1 |
20070127199 | Arneson | Jun 2007 | A1 |
20070177803 | Elias et al. | Aug 2007 | A1 |
20080004085 | Jung et al. | Jan 2008 | A1 |
20080102888 | Seligren et al. | May 2008 | A1 |
20080150903 | Chuang | Jun 2008 | A1 |
20080192977 | Gruenhagen et al. | Aug 2008 | A1 |
20080211783 | Hotelling et al. | Sep 2008 | A1 |
20080252608 | Geaghan | Oct 2008 | A1 |
20080261661 | Jessop | Oct 2008 | A1 |
20080266118 | Pierson et al. | Oct 2008 | A1 |
20090046110 | Sadler et al. | Feb 2009 | A1 |
20090061948 | Lee et al. | Mar 2009 | A1 |
20090066660 | Ure | Mar 2009 | A1 |
20090096749 | Kawahara et al. | Apr 2009 | A1 |
20090131117 | Choi | May 2009 | A1 |
20090140863 | Liu et al. | Jun 2009 | A1 |
20090199130 | Tsern et al. | Aug 2009 | A1 |
20090201253 | Jason et al. | Aug 2009 | A1 |
20090241048 | Augustine et al. | Sep 2009 | A1 |
20090251419 | Radely-Smith | Oct 2009 | A1 |
20090273571 | Bowens | Nov 2009 | A1 |
20090298547 | Kim et al. | Dec 2009 | A1 |
20090315834 | Nurmi et al. | Dec 2009 | A1 |
20100007603 | Kirkup | Jan 2010 | A1 |
20100020034 | Kim | Jan 2010 | A1 |
20100029327 | Jee | Feb 2010 | A1 |
20100110495 | Letocha | May 2010 | A1 |
20100113100 | Harmon et al. | May 2010 | A1 |
20100134409 | Challener et al. | Jun 2010 | A1 |
20100219943 | Vanska et al. | Sep 2010 | A1 |
20100235742 | Hsu et al. | Sep 2010 | A1 |
20100277420 | Charlier et al. | Nov 2010 | A1 |
20100277421 | Charlier et al. | Nov 2010 | A1 |
20110003665 | Burton et al. | Jan 2011 | A1 |
20110012921 | Cholewin et al. | Jan 2011 | A1 |
20110012928 | Cholewin et al. | Jan 2011 | A1 |
20110157799 | Harmon et al. | Jun 2011 | A1 |
20110190675 | Vess | Aug 2011 | A1 |
20110221688 | Byun et al. | Sep 2011 | A1 |
20120092383 | Hysek et al. | Apr 2012 | A1 |
20120127070 | Ryoo et al. | May 2012 | A1 |
20120139904 | Lee et al. | Jun 2012 | A1 |
20130044215 | Rothkopf et al. | Feb 2013 | A1 |
20130197857 | Lu et al. | Aug 2013 | A1 |
20140018686 | Medelius et al. | Jan 2014 | A1 |
Number | Date | Country |
---|---|---|
0913977 | May 1999 | EP |
1335567 | Feb 2002 | EP |
1408400 | Apr 2004 | EP |
1517223 | Mar 2005 | EP |
1754424 | Feb 2007 | EP |
2065786 | Jun 2009 | EP |
2150031 | Feb 2010 | EP |
2771769 | Jun 1999 | FR |
2339505 | Jan 2000 | GB |
2368483 | May 2002 | GB |
100683535 | Feb 2007 | KR |
1020070035026 | Mar 2007 | KR |
2004114636 | Dec 2004 | WO |
2005071928 | Aug 2005 | WO |
2005111769 | Nov 2005 | WO |
2008030563 | Mar 2008 | WO |
2009123406 | Oct 2009 | WO |
2010097692 | Sep 2010 | WO |
Entry |
---|
Patent Cooperation Treaty, “PCT Search Report and Written Opinion of the International Searching Authority” for International Application No. PCT/US2013/055165, Mar. 12, 2014, 10 pages. |
International Preliminary Report on Patentability from International Application No. PCT/US2010/040876, dated Jul. 20, 2009, 8 pp. |
Prosecution History from U.S. Appl. No. 12/505,775, dated Dec. 23, 2011 through Mar. 25, 2013, 53 pp. |
Prosecution History from U.S. Appl. No. 12/565,200, dated Sep. 18, 2012 through Apr. 3, 2013, 48 pp. |
Illustration of GPS system, published by lucid touch microsoft research & mitsubishi research, Nov. 26, 2008, retrieved from http://research.microsoft.com/users/baudisch/projects/lucidtouch/applications, 3 pp. |
Chu et al., “Lucid Touch prototype,” published by lucid touch microsoft research & mitsubishi electric research labs, Nov. 26, 2008, retrieved from http://research.microsoft.com/users/baudisch/projects/lucidtouch/index.html, 1 p. |
Erh-Li (Early) Shen et al., “Double-side Multi-touch Input for Mobile Devices”, CHI 2009—Spotlight on Works in Progress—Session 2, Apr. 4, 2009, pp. 4339-4344. |
Patrick Baudisch, “Application Areas of Lucid Touch”, http://research.microsoft.com/users/baudisch/projects/lucidtouch/applications, accessed Nov. 26, 2008, 3 pages. |
Dance With Shadows, “Microsoft's LucidTouch See-Through Touchscreen Unveiled”, http://www.dancewithshadows.com/tech/lucid-touch.asp, Mar. 10, 2008, 2 pages. |
Patrick Baudisch, “Lucid Touch Homepage”, http://research.microsoft.com/users/baudisch/projects/lucidtouch/index.html, accessed Nov. 26, 2008, 5 pages. |
Masanori Sugimoto and Keiichi Hiroki, “HybridTouch: An Intuitive Manipulation Technique for PDAs Using Their Front and Rear Surfaces”, Proc. of the 8th Int'l Conf. on Human Computer Interaction with Mobile Devices and Services (MobileHCI 2006), Sep. 12, 2006, pp. 137-140. |
Daniel Wigdor et al., “Lucid Touch: A See-through Mobile Device”, Proc. of the 20th Annual ACM Symposium on User Interface Software and Tech., Oct. 7, 2007, pp. 269-278, XP002582051. |
Adesso, Inc., “Adesso Easy Cat 2 Button Glidepoint Touchpad (Black)”, http://www.adesso.com/en/component/content/article/63-touchpads/189-gp-160.html, downloaded Sep. 12, 2012, 2 pages. |
Gregory Wilson, “Evaluating the Effectiveness of Using Touch Sensor Capacitors as an Input Device for a Wrist Watch Computer”, Georgia Institute of Technology undergraduate thesis, smartech.gatech.edu/xmlui/handle/1853/19947, Dec. 17, 2007, 15 pages. |
Jun Rekimoto, “GestureWrist and GesturePad: Unobtrusive Wearable Interaction Devices”, http://www.sonycsl.co.jp/person/rekimoto/papers/iswc01.pdf, 5th IEEE Int'l Symp. on Wearable Computers, 2001, pp. 21-27. |
Paul H. Dietz et al., “A Practical Pressure Sensitive Computer Keyboard”, 22nd Ass'n for Computing Machinery Symp. on User Interface Software and Tech., Oct. 4, 2009, 4 pages. |
Samsung, “Samsung Wearable Mobile Device Makes Communication Easier for an Active Lifestyle”, http://www.tuvie.com/samsung-wearable-mobile-device-can-make-communication-easier-in-adventurous-trips/, Apr. 3, 2010, 10 pages. |
Charles McLellan, “Eleksen Wireless Fabric Keyboard: A First Look”, http://www.zdnet.com/eleksen-wireless-fabric-keyboard-a-first-look-3039278954, Jul. 17, 2006, 9 pages. |
United States Patent and Trademark Office, “Final Office Action” for U.S. Appl. No. 12/565,200, Jan. 16, 2013, 12 pages. |
United States Patent and Trademark Office, “Non-Final Office Action” for U.S. Appl. No. 12/433,253, Feb. 16, 2012, 22 pages. |
United States Patent and Trademark Office, “Non-Final Office Action” for U.S. Appl. No. 12/565,200, Sep. 18, 2012, 22 pages. |
Patent Cooperation Treaty, “PCT Search Report and Written Opinion of the International Searching Authority” for Int'l Appln. No. PCT/US2010/031879, Jul. 7, 2010, 14 pages. |
Patent Cooperation Treaty, “PCT Search Report and Written Opinion of the International Searching Authority” for Int'l Appln. No. PCT/US2010/037568, Nov. 25, 2010, 19 pages. |
Patent Cooperation Treaty, “PCT Search Report and Written Opinion of the International Searching Authority” for Int'l Appln. No. PCT/US2010/040876, Oct. 5, 2010, 14 pages. |
United States Patent and Trademark Office, “Non-Final Office Action” for U.S. Appl. No. 12/505,775, Dec. 23, 2011, 19 pages. |
Number | Date | Country | |
---|---|---|---|
20130215064 A1 | Aug 2013 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 12565200 | Sep 2009 | US |
Child | 13852211 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 12505775 | Jul 2009 | US |
Child | 12565200 | US |