Method for manipulating a plurality of non-selected graphical user elements

Information

  • Patent Grant
  • Patent Number
    9,250,729
  • Date Filed
    Thursday, March 28, 2013
  • Date Issued
    Tuesday, February 2, 2016
Abstract
A method for manipulating graphical user interface elements includes displaying a plurality of elements on a touch screen of an electronic device. Based on a first touch screen contact detected on a first side of the electronic device and a slide contact detected on a second side of the electronic device, a first touch screen element is selected and the non-selected touch screen element(s) are manipulated relative to the first element. The slide contact can be interpreted by the electronic device as a drag, push, or rotate relative to the first element. Various features of the slide movement, such as the speed, the length, the pressure, the direction, and/or the pattern may affect the manipulation of the non-selected elements.
Description
CROSS REFERENCE TO RELATED APPLICATIONS

This application is related to the following U.S. patent applications:

    • “Method for Implementing Zoom Functionality on a Portable Device with Opposing Touch Sensitive Surfaces” by Erik Cholewin et al., application Ser. No. 12/505,775 filed on Jul. 20, 2009; and
    • “Electronic Device and Method for Manipulating Graphic User Interface Elements” by Erik Cholewin et al., application Ser. No. 12/565,200 filed on Sep. 23, 2009.


      The related applications are assigned to the assignee of the present application and are hereby incorporated herein in their entirety by this reference thereto.


FIELD OF THE DISCLOSURE

The present disclosure relates generally to user interaction with an electronic device, and more particularly to dual-sided gestures implemented using an electronic device that accepts touch input on multiple sides.


BACKGROUND

Electronic device manufacturers are increasingly using touch-sensitive displays (touch screens), which enable a device to visually convey information to a user as well as to enable a user to interact contextually with displayed graphical user elements and otherwise provide user input to the electronic device. Some electronic device manufacturers are contemplating devices with a touch pad as well as a touch screen. In one contemplated configuration, a touch sensitive display is placed on an obverse side of a housing of an electronic device and a touch pad is placed on a reverse side of the housing. Given this contemplated configuration, there are various opportunities to develop new touch interactions with an electronic device.


The various aspects, features and advantages of the disclosure will become more fully apparent to those having ordinary skill in the art upon careful consideration of the following Drawings and accompanying Detailed Description.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates an obverse side of an electronic device with a touch screen and a touch pad, and it depicts a user interaction with the touch screen.



FIG. 2 illustrates a reverse side of the electronic device of FIG. 1, and it depicts a user interaction with the touch pad.



FIG. 3 illustrates the obverse side of the electronic device of FIG. 1 after completion of the user interaction with the touch pad, and it also depicts a subsequent user interaction with the touch screen.



FIG. 4 illustrates an obverse side of the electronic device of FIG. 1 after completion of the subsequent user interaction with the touch screen.



FIG. 5 illustrates a flow chart for an electronic device manipulating graphical user interface elements.



FIG. 6 illustrates a reverse side of an electronic device with a touch screen and a touch pad, and it depicts a user interaction with the touch pad.



FIG. 7 illustrates an obverse side of the electronic device of FIG. 6, and it depicts a user interaction with the touch screen.



FIG. 8 illustrates an obverse side of the electronic device of FIG. 6 after completion of the user interaction with the touch screen.



FIG. 9 illustrates an obverse side of an electronic device with a touch screen and a touch pad, and it depicts a user interaction with the touch screen.



FIG. 10 illustrates a reverse side of the electronic device of FIG. 9, and it depicts a user interaction with the touch pad.



FIG. 11 illustrates an obverse side of the electronic device of FIG. 9 after completion of the user interaction with the touch pad.



FIG. 12 illustrates a reverse side of an electronic device with a touch screen and a touch pad, and it depicts a user interaction with the touch pad.



FIG. 13 illustrates an obverse side of the electronic device of FIG. 12, and it depicts a user interaction with the touch screen.



FIG. 14 illustrates an obverse side of the electronic device of FIG. 12 after completion of the user interaction with the touch screen.



FIG. 15 illustrates an obverse side of the electronic device of FIG. 12, and it depicts an alternate user interaction with the touch screen.



FIG. 16 illustrates an obverse side of the electronic device of FIG. 12 after completion of the alternate user interaction with the touch screen.



FIG. 17 illustrates an obverse side of an electronic device with a touch screen and a touch pad, and it depicts a user interaction with the touch screen.



FIG. 18 illustrates a reverse side of the electronic device of FIG. 17, and it depicts a user interaction with the touch pad.



FIG. 19 illustrates an obverse side of the electronic device of FIG. 17 after completion of the user interaction with the touch pad.



FIG. 20 illustrates an obverse side of an electronic device with a touch screen and a touch pad, and it depicts a user interaction with the touch screen.



FIG. 21 illustrates a reverse side of the electronic device of FIG. 20, and it depicts a user interaction with the touch pad.



FIG. 22 illustrates an obverse side of the electronic device of FIG. 20 after completion of the user interaction with the touch pad.



FIG. 23 illustrates a reverse side of the electronic device of FIG. 20, and it depicts an alternate user interaction with the touch pad.



FIG. 24 illustrates an obverse side of the electronic device of FIG. 20 after completion of the alternate user interaction with the touch pad.



FIG. 25 illustrates an obverse side of an electronic device with a touch screen and a touch pad, and it depicts a user interaction with the touch screen.



FIG. 26 illustrates a reverse side of the electronic device of FIG. 25, and it depicts a user interaction with the touch pad.



FIG. 27 illustrates an obverse side of the electronic device of FIG. 25 after completion of the user interaction with the touch pad.



FIG. 28 illustrates a reverse side of the electronic device of FIG. 25, and it depicts an alternate user interaction with the touch pad.



FIG. 29 illustrates an obverse side of the electronic device of FIG. 25 after completion of the alternate user interaction with the touch pad.



FIG. 30 illustrates an obverse side of an electronic device with a touch screen and a touch pad, and it depicts a user interaction with the touch screen.



FIG. 31 illustrates a reverse side of the electronic device of FIG. 30, and it depicts a user interaction with the touch pad.



FIG. 32 illustrates an obverse side of the electronic device of FIG. 30 after completion of the user interaction with the touch pad.



FIG. 33 illustrates a simplified block diagram of an electronic device with a touch screen and a touch pad.



FIGS. 34-35 illustrate various timing options available for user interaction with the touch screen and the touch pad.





DETAILED DESCRIPTION

An electronic device for manipulating graphical user interface elements has a touch-sensitive display (touch screen) on an obverse side of the electronic device and a touch-sensitive surface (touch pad) on a reverse side of the electronic device. The electronic device displays at least two graphical user elements (data icon, program icon, application window, digital photograph, etc.) on the touch screen. A user touches a first element using either the touch screen or the touch pad. This touch selects the first element and “anchors” it while a user's slide motion on the other touch-sensitive surface manipulates the second element.


The slide contact sensed on the other touch-sensitive surface can be interpreted by the electronic device as a drag (lateral movement of the second element within the plane of the display screen) relative to the first element, a push (virtual movement of the second element in front of or behind the plane of the display screen) relative to the first element, a rotate (rotational movement of the second element within the plane of the display screen) relative to the first element, or a pixel-based move (zoom in/out or enlarge/reduce) relative to the first element.


Various features of the slide movement, such as the speed, the length, the pressure, the direction, and/or the pattern may affect the interpretation. For example, a rotational slide movement may direct the electronic device to rotate the second element relative to the first element while a linear slide movement may direct the electronic device to drag the second element relative to the first element. As another example, sliding to the right may control a zoom out (with the length of the slide movement relating to the percentage of zoom out) while sliding to the left may control a zoom in (with the length of the slide movement relating to the percentage of zoom in).
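
By way of illustration only, the following Python sketch shows one way such slide features might be classified; the point-sample representation, the 90-degree turning threshold, and the per-pixel zoom gain are assumptions for the example and not part of the disclosed method.

    import math

    def classify_slide(points):
        """Classify a slide (a list of (x, y) samples) as 'rotate' or 'drag'.

        Heuristic: if the path's accumulated turning angle is large, treat
        the slide as rotational; otherwise treat it as linear.
        """
        if len(points) < 3:
            return "drag"
        turning = 0.0
        for (x0, y0), (x1, y1), (x2, y2) in zip(points, points[1:], points[2:]):
            a1 = math.atan2(y1 - y0, x1 - x0)
            a2 = math.atan2(y2 - y1, x2 - x1)
            # wrap the heading change into (-pi, pi] before accumulating
            turning += abs(math.atan2(math.sin(a2 - a1), math.cos(a2 - a1)))
        return "rotate" if turning > math.pi / 2 else "drag"

    def zoom_percent(points, percent_per_px=0.25):
        """Map a horizontal slide to a zoom percentage: per the example
        above, sliding right zooms out and sliding left zooms in."""
        dx = points[-1][0] - points[0][0]
        return -dx * percent_per_px  # positive value means zoom in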


The touch and slide movements on touch-sensitive surfaces on different sides of the electronic device are at least partially overlapping in time, with the touch starting prior to or simultaneously with the slide. By supporting dual-sided gestures, the electronic device may enable known graphical user interface interactions and provide support for additional graphical user interface interactions.



FIGS. 1-4 illustrate an obverse side 191 and a reverse side 195 of an electronic device 100 with a touch screen 181 and a touch pad 185. An obverse side of an object is a side that is primarily intended to be viewed by a user, and the obverse side 191 of the electronic device 100 contains the touch screen 181 or other type of touch-sensitive display. The reverse side of an object is a side that is not intended to be primarily viewed by a user, and the reverse side 195 of the electronic device 100 contains a touch pad 185 or other type of touch-sensitive surface without a display. If desired, the reverse side 195 could contain a touch screen instead of the touch pad 185 shown, although the user is not expected to look at the reverse side when the electronic device 100 is subjected to the dual-sided gestures.


The electronic device 100 shown is a portable, handheld electronic device such as a mobile phone, remote controller, personal digital assistant, portable audio or video player, handheld game console, or the like; however, the electronic device could be implemented as a non-handheld device such as an interactive table-like surface.


The touch screen 181 shows, in this example, six graphical user interface elements 110, 112, 114, 120, 122, 124. These elements can be data icons (e.g., file folders, documents, spreadsheets, address book contacts, photo files, music files, video files, electronic book files, etc.) to access data, program icons (e.g., word processing, presentation, browser, media player, calendar, geographic navigation, electronic mail or text messaging, electronic games, etc.) to access software applications, application windows (for individual instances of opened software applications, file folder navigation, etc.), links/shortcuts to any of the above, and the like. If a user would like to manipulate one or more of the graphical user elements, the user touches a first element, using either the touch screen 181 on the obverse side 191 or the touch pad 185 on the reverse side 195, to select a first element. If a slide contact occurs on the other touch-sensitive surface, overlapping in time with the touch contact, the electronic device 100 will manipulate the other graphical user interface elements relative to the first element.



FIG. 1 depicts a user interaction 171 with the touch screen 181. At this point in time, the user has touched a first element 120 using the touch screen 181. In order to provide feedback to the user regarding the touch-selection of the first element 120, the selected graphical user element 120 may graphically change properties (e.g., darken, lighten, “blink”, underline, bold, etc.), the electronic device 100 may provide a haptic response (e.g., a vibration), or the electronic device 100 may provide audio feedback (e.g., a click, tone, or other noise) using an audio speaker 187. A highlighting visual feedback is shown.



FIG. 2 depicts a user interaction 175 with the touch pad 185. Looking at the reverse side 195 of the electronic device 100, the user slides a finger along the surface of the touch pad 185. The slide interaction 175 at least partially overlaps in time with the touch interaction 171. After the slide interaction 175 begins, the user may release the touch interaction 171. Alternately, the user may maintain the touch interaction 171 after the slide interaction 175 begins, as shown in FIG. 3 via the continued highlighting visual effect. Because the touch pad 185 is on the reverse side 195 of the electronic device 100, the slide interaction 175 is mirrored on the obverse side 191. FIG. 3 illustrates the obverse side 191 of the electronic device after completion of the slide user interaction 175 with the touch pad 185. Note that the non-selected graphical user elements 110, 112, 114, 122, 124 have been dragged relative to the selected element 120, while the selected element 120 has remained stationary on the touch screen 181. The movement of the non-selected elements 110, 112, 114, 122, 124 follows, in a mirror-image relationship, the slide interaction 175 on the touch pad 185. Note that some of the non-selected elements 114, 124 have moved off the right edge of the touch screen due to the limited size of the touch screen 181. A scrolling function (not shown) may be implemented to move those elements 114, 124 back onto the touch screen 181, perhaps with the side effect of element 120 moving off the left edge of the touch screen.
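
As a rough illustration of this mirror-image relationship and the drag of non-selected elements, consider the following Python sketch; the Element type and the delta representation are illustrative assumptions.

    from dataclasses import dataclass

    @dataclass
    class Element:      # illustrative stand-in for a graphical user element
        name: str
        x: float
        y: float

    def mirror_delta(dx_pad, dy_pad):
        """A slide on the reverse-side touch pad appears horizontally
        mirrored on the obverse-side display, so the horizontal component
        is negated; the vertical component is unchanged."""
        return -dx_pad, dy_pad

    def drag_non_selected(elements, selected, dx, dy):
        """Translate every element except the anchored (selected) one."""
        for e in elements:
            if e is not selected:
                e.x += dx
                e.y += dy

    # e.g., a leftward pad slide produces the rightward apparent drag of
    # FIG. 3:  mirror_delta(-80, 0)  ->  (80, 0)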



FIG. 3 also depicts an optional subsequent user interaction 177 with the touch screen 181. At this point, a known touch screen interaction is performed as an extension of the touch interaction 171. For example, the user drags from the location of the original touch interaction 171. The touched element 120 follows the subsequent user interaction 177, and FIG. 4 illustrates the touch screen 181 after completion of the additional user interaction 177, when the user is no longer contacting the touch screen 181 or the touch pad 185. If the touch interaction 171 has been maintained throughout the slide interaction 175, the subsequent interaction 177 may simply be a movement on the touch screen 181 after the slide interaction 175 has concluded. Alternately, the user could release the touch interaction 171 after the slide interaction 175 begins, and the subsequent user interaction 177 could be a separate touch-and-drag interaction.


Although a two-dimensional matrix layout has been shown, the graphical user interface screen can be reduced to a one-dimensional matrix layout such as a list of song files, electronic book files, address book contact files, etc.


The dual-sided gesture has been illustrated as a single-handed gesture; however, two hands can be used to perform the touch and slide movements shown. In some electronic device implementations, the electronic device 100 as shown is only a portion of a larger electronic device akin to a hinged laptop computer. When the laptop-configured electronic device is in an open position, using two hands to perform the dual-sided gesture may be ergonomically easier—depending on the size and location of the touch screen, the size and location of the touch pad, and individual user preferences.



FIG. 5 illustrates a flow chart 500 for an electronic device manipulating graphical user interface elements. The electronic device has a touch screen (or other type of touch-sensitive display) and another touch-sensitive surface (such as a touch pad or a second touch screen).


Initially, the electronic device displays 510 at least two graphical user interface elements on a touch screen. Next, the electronic device selects 520 a first element based on a touch contact detected on a touch-sensitive surface. The selection may be indicated to a user visually or audibly as previously described. After that, the electronic device manipulates 530 a non-selected graphical user interface element based on a slide contact detected on a different touch-sensitive surface.


Different types of manipulations are possible depending on the operational mode of the electronic device, the type of graphical user interface element selected via the touch contact, the pattern of the slide movement, and other factors. One type of manipulation drags 532 the non-selected element with respect to the first element as shown in FIGS. 1-3 and 6-8. Another type of manipulation pushes 534 the non-selected element with respect to the first element as shown in FIGS. 9-16. A third type of manipulation rotates 536 the non-selected element with respect to the first element as shown in FIGS. 17-19. A fourth type of manipulation moves pixels 538 relative to the touch contact, which the electronic device interprets in this mode as selecting a pixel rather than an element. In this manner, zoom in/out or icon enlargement/reduction can be performed as directed by the slide movement as shown in FIGS. 20-29.


The electronic device can optionally cease 540 manipulating based on detecting cessation of the slide contact. If the touch contact has been persistent throughout the slide contact, the electronic device can manipulate 550 the first element based on movement of the touch contact after the slide contact ceases.
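
The dispatch of block 530 might be sketched as follows in Python; the Element fields and mode names are assumptions for illustration. Pixel-based moves 538 act on the viewport rather than on individual elements, so they are only noted in a comment.

    from dataclasses import dataclass

    @dataclass
    class Element:
        x: float = 0.0
        y: float = 0.0
        z: int = 0          # virtual stack level (for push 534)
        angle: float = 0.0  # rotation in degrees (for rotate 536)

    def manipulate(elements, selected, mode, amount):
        """One step of block 530: apply the slide-directed manipulation to
        every non-selected element while the selected element stays
        anchored. Pixel-based moves (538) would zoom the viewport about
        the selected pixel instead and are not modeled here."""
        for e in elements:
            if e is selected:
                continue
            if mode == "drag":        # 532: lateral move in the display plane
                e.x += amount[0]
                e.y += amount[1]
            elif mode == "push":      # 534: move through virtual stack levels
                e.z += amount
            elif mode == "rotate":    # 536: rotate relative to the anchor
                e.angle += amount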


Although the touch and the slide are performed on different touch-sensitive surfaces, the electronic device is agnostic as to whether the touch contact is performed on a touch-sensitive surface on the obverse side or the reverse side. FIGS. 6-8 show the same user interaction with the six elements shown in FIGS. 1-3 but starting with a touch on the reverse-side touch-sensitive surface.



FIGS. 6-8 illustrate an obverse side 691 and a reverse side 695 of an electronic device 600 with a touch screen 681 and a touch pad 685. On the obverse side 691, the touch screen 681 displays six graphical user interface elements 610, 612, 614, 620, 622, 624. Instead of touching the touch screen 681 to select a first element, a user touches the touch pad 685 to select the first element. FIG. 6 depicts a touch user interaction 671 with the touch pad 685 to select the first element 620. FIG. 7 illustrates the obverse side 691 of the electronic device 600 with selected element 620 highlighted to provide visual indication of the touch selection from the reverse side 695. Note that the user interaction 671 on the reverse side 695 results in a mirror image selection (i.e., element 620 is selected and not element 624). FIG. 7 also depicts a slide user interaction 675 with the touch screen 681. The user interaction 675 is generally a mirror image of the user interaction 175 shown in FIG. 2, because the slide is performed on the obverse side 691 touch screen 681 instead of the reverse side 695 touch pad 685. The non-selected elements 610, 612, 614, 622, 624 are dragged consistent with the slide user interaction 675, and the graphical user interface at the completion of the slide is the same as shown in FIG. 3. FIG. 8 illustrates the obverse side 691 of the electronic device 600 after completion of the user interaction 675 with the touch screen 681. Note that, in FIG. 8, the touch interaction 671 has ceased and thus no element is selected at this time.


The dragging manipulation of non-selected elements relative to a stationary selected element allows a user to move one or more elements within the plane of the display screen. A different mode allows a push manipulation to move non-selected elements virtually in front of or behind a selected element relative to the plane of the display screen.



FIGS. 9-11 illustrate an obverse side 991 and a reverse side 995 of an electronic device 900 with a touch screen 981 and a touch pad 985. On the obverse side 991, the touch screen 981 displays thirteen graphical user interface elements 910, 912, 914, 920, 922, 930, 932, 934, 940, 942, 944, 950, 954. These elements are depicted in a virtual three-dimensional matrix. Thus, some elements seem to be “above” or “below” other elements. In order to select an element that is “on top of” another element, the touch screen 981 on the obverse side 991 may be used for selection. In order to select an element that is “underneath” another element, the touch pad 985 on the reverse side 995 may be used for selection.



FIG. 9 depicts a user interaction 971 with the touch screen 981 to select a first element 932. Feedback for the selection is depicted by highlighting the first element 932. Note that the touch location of user interaction 971 is ambiguous and the electronic device 900 defaults to the element 932 that is virtually closer to the touched side (the obverse side, in this example). If an ambiguous touch occurred (i.e., in the same location) on the reverse side, the lower element 942 would have initially been selected. By making slight finger movements, however, the selection may switch to nearby elements as is known in the art.
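
A minimal Python sketch of this side-dependent default selection, assuming each candidate element carries a stack level z (higher meaning virtually closer to the obverse touch screen):

    from types import SimpleNamespace as El

    def default_selection(candidates, touched_side):
        """Resolve an ambiguous touch to the element virtually nearest the
        touched surface."""
        if touched_side == "obverse":
            return max(candidates, key=lambda e: e.z)  # topmost wins
        return min(candidates, key=lambda e: e.z)      # bottom of stack wins

    # the stack of FIG. 9: element 932 sits above element 942
    stack = [El(name="932", z=1), El(name="942", z=0)]
    assert default_selection(stack, "obverse").name == "932"
    assert default_selection(stack, "reverse").name == "942"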



FIG. 10 depicts a user interaction 975 with the touch pad 985 on the reverse side 995 of the electronic device 900. The slide user interaction directs the electronic device 900 to virtually push the non-selected elements 910, 912, 914, 920, 922, 930, 934, 940, 942, 944, 950, 954 “above” the first element 932. FIG. 11 illustrates the obverse side 991 after completion of the user interaction 975 with the touch pad 985. Note that the first element 932 is now located “below” the other element 942 because all the non-selected elements 910, 912, 914, 920, 922, 930, 934, 940, 942, 944, 950, 954 have been moved “up” relative to the selected element 932. Note also that the touch user interaction 971 was released prior to the completion of the push user interaction 975, so no element is highlighted in FIG. 11.


As alluded to previously, using a touch pad on a reverse side of an electronic device to select an element from the “bottom” of a virtual stack of user interface elements may be easier for a user than trying to select that same element using a touch screen.



FIGS. 12-16 illustrate an obverse side 1291 and a reverse side 1295 of an electronic device 1200 with a touch screen 1281 and a touch pad 1285. On the obverse side 1291, the touch screen 1281 displays thirteen graphical user interface elements 1210, 1212, 1214, 1220, 1222, 1230, 1232, 1234, 1240, 1242, 1244, 1250, 1254 in a virtual three-dimensional matrix similar to FIG. 9. FIG. 12 depicts a user interaction 1271 with the touch pad 1285. The user touches the touch pad 1285 in a location that could be interpreted as a selection of any one of elements 1230, 1240, or 1250. Because the touch user interaction 1271 is from the reverse side 1295, however, the electronic device 1200 selects the “lowest” element 1250 of the stack by default. In this example, the electronic device 1200 provides user feedback regarding the selection by highlighting the element 1250. Slight finger movement will adjust the default selection as desired by the user and should be reflected back to the user via visual or audible feedback. After the first element 1250 is properly selected, the user performs a slide user interaction 1275 on the other touch-sensitive surface. FIG. 13 depicts a slide user interaction 1275 with the touch screen 1281. This slide user interaction 1275 is slight and thus moves the non-selected user elements 1210, 1212, 1214, 1220, 1222, 1230, 1232, 1234, 1240, 1242, 1244, 1254 only by one level in the virtual three-dimensional matrix. Thus, the first element 1250 has moved from “below” the other elements 1230, 1240 in its stack to “between” the other elements 1230, 1240 as shown in FIG. 14.



FIG. 15 depicts an alternate user interaction 1575 with the touch screen 1281. In this situation, the same first element 1250 is selected, but the slide user interaction 1575 is more prominent and thus moves the non-selected user elements 1210, 1212, 1214, 1220, 1222, 1230, 1232, 1234, 1240, 1242, 1244, 1254 two levels down in the virtual three-dimensional matrix. In this example, the first element 1250 has moved from “below” the other elements 1230, 1240 in its stack to “above” the other elements 1230, 1240 as shown in FIG. 16. Although some of the non-selected elements 1240, 1254 have moved off the bottom edge of the touch screen 1281 due to the limited size of the touch screen, a scrolling function (not shown) may be implemented to move those elements 1240, 1254 back onto the touch screen 1281.
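
One plausible mapping from slide prominence to stack levels is sketched below in Python; the 60-pixels-per-level threshold is an illustrative assumption.

    def levels_from_slide(slide_length_px, px_per_level=60):
        """Map the prominence (length) of the slide to the number of stack
        levels the non-selected elements move."""
        return max(1, round(slide_length_px / px_per_level))

    assert levels_from_slide(60) == 1    # slight slide, as in FIGS. 13-14
    assert levels_from_slide(130) == 2   # more prominent slide, FIGS. 15-16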


In addition to linear slide movements directing drag and push interactions with graphical user interface elements, a circular slide movement may direct rotate interactions with graphical user interface elements. FIGS. 17-19 illustrate an obverse side 1791 and a reverse side 1795 of an electronic device 1700 with a touch screen 1781 and a touch pad 1785. FIG. 17 depicts a touch user interaction 1771 with the touch screen 1781 to select one thumbnail photograph graphical user interface element 1710 out of the two photograph graphical user interface elements 1710, 1720 displayed on the touch screen 1781. FIG. 18 depicts a slide user interaction 1775 with the touch pad 1785 on the reverse side 1795 of the electronic device 1700. This clockwise circular slide user interaction 1775 directs the electronic device 1700 to rotate the non-selected graphical user interface element 1720 counterclockwise (due to the mirror image effect of slide user interactions on the reverse side 1795).



FIG. 19 illustrates an obverse side 1791 of the electronic device 1700 after completion of the slide user interaction 1775 with the touch pad 1785. More degrees in a circular slide user interaction can increase the rotation of the non-selected graphical user interface element. For example, the 180 degrees of circular slide interaction 1775 shown translates to 90 degrees of rotation of the non-selected graphical user interface element 1720. Extrapolating this mapping, 360 degrees of circular slide interaction would translate to 180 degrees of rotation of the non-selected graphical user interface element, and 540 degrees of circular slide interaction would translate to 270 degrees of rotation of the non-selected graphical user interface element. Alternately, the mapping of the circular slide interaction to rotation of the non-selected graphical user interface element could be more direct, such as 180 degrees of circular slide interaction translating to 180 degrees of rotation of the non-selected graphical user interface element.
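
The mapping can be captured in a few lines of Python; the ratio parameter and the sign convention follow the examples above, while the function itself is an illustrative sketch.

    def rotation_from_slide(slide_degrees, ratio=0.5, mirrored=True):
        """Map degrees of circular slide to degrees of element rotation.

        ratio=0.5 reproduces the extrapolated mapping above (180 degrees of
        slide -> 90 degrees of rotation); ratio=1.0 gives the direct
        mapping. A clockwise slide on the reverse side rotates the element
        counterclockwise, hence the sign flip when mirrored."""
        rotation = slide_degrees * ratio
        return -rotation if mirrored else rotation

    assert rotation_from_slide(180) == -90    # FIGS. 18-19
    assert rotation_from_slide(540) == -270
    assert rotation_from_slide(180, ratio=1.0, mirrored=False) == 180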


A slide user interaction on a second touch-sensitive surface can also direct a pixel-based move relative to a first element selected using a touch user interaction on a first touch-sensitive surface. This variant mode uses a touch user interaction to select a pixel to “anchor” instead of a graphical user interface element in its entirety. Pixel-based moves allow a user to direct zoom in/out interactions and enlarge/reduce interactions.



FIGS. 20-24 illustrate an obverse side 2091 and a reverse side 2095 of an electronic device 2000 with a touch screen 2081 and a touch pad 2085. On the obverse side 2091, the touch screen 2081 displays a digital photograph 2010. FIG. 20 depicts a touch user interaction 2071 with the touch screen. This touch user interaction 2071 selects a pixel that will be the central pixel of a zoom in/out function that will be directed by a slide user interaction 2075, 2375 on a touch pad 2085 on the reverse side 2095 of the electronic device 2000.



FIG. 21 depicts a slide user interaction 2075 with the touch pad 2085. This slide interaction 2075 directs the processor of the electronic device 2000 to zoom in the digital photograph 2010. The direction, speed, pressure, and/or length of the slide may direct the enlargement value of the zoom function. In this example, sliding to the left (as shown in FIG. 21) directs a zoom in as shown in FIG. 22. Meanwhile, sliding to the right (as shown in FIG. 23) directs a zoom out as shown in FIG. 24.
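
A Python sketch of this direction-and-length mapping follows; the gain per pixel and the clamping floor are illustrative assumptions.

    def zoom_scale(slide_dx_px, percent_per_px=0.5):
        """Map a horizontal reverse-side slide to a multiplicative zoom
        factor (>1 enlarges, <1 reduces). Per FIGS. 21-24, sliding left
        zooms in and sliding right zooms out."""
        percent = -slide_dx_px * percent_per_px  # leftward slide: dx < 0
        return max(0.1, 1.0 + percent / 100.0)

    assert zoom_scale(-100) == 1.5   # 100 px left  -> zoom in by 50%
    assert zoom_scale(100) == 0.5    # 100 px right -> zoom out by 50%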



FIGS. 25-29 illustrate an obverse side 2591 and a reverse side 2595 of an electronic device 2500 with a touch screen 2581 and a touch pad 2585. In this example, a slide user interaction 2575, 2875 directs an enlargement or a reduction of a non-selected icon 2210. FIG. 25 depicts a user interaction 2571, 2572 with the touch screen 2581 that selects two interactive user elements 2220, 2230 from the three interactive user elements 2210, 2220, 2230 displayed. Note that in this case, the touch user interaction 2571, 2572 is a dual-touch user interaction and the two user elements could also be considered a two-part selected first element.



FIG. 26 depicts a slide user interaction 2575 with the touch pad 2585 on the reverse side 2595 of the electronic device 2500. In this mode, the slide user interaction 2575 enlarges the non-selected interactive user element 2210 as shown. The slide speed, direction, pressure, and/or length may determine the degree of enlargement implemented by the processor of the electronic device 2500 as shown in FIG. 27.



FIG. 28 depicts an alternate user interaction 2875 with the touch pad 2585. In this example, a slide in the “up” direction directs an enlargement and a slide in the “down” direction directs a reduction. FIG. 29 illustrates the obverse side 2591 of the electronic device 2500 after completion of the alternate user interaction 2875 with the touch pad 2585.


More complicated slide user interactions may be implemented on an electronic device. FIGS. 30-32 illustrate an obverse side 3091 and a reverse side 3095 of an electronic device 3000 with a touch screen 3081 and a touch pad 3085. In this example, instead of a linear slide gesture directing movement of non-selected elements as shown in FIGS. 1-3, 6-16, and 20-29 or a circular slide gesture as shown in FIGS. 17-19, FIG. 31 depicts an angled slide user interaction 3075. FIG. 30 depicts a user interaction 3071, 3072 with two interactive user elements 3010, 3020 displayed on the touch screen 3081. This dual-touch user interaction 3071, 3072 anchors both elements 3010, 3020 (i.e., a dual-part selected element) such that a subsequent slide interaction does not move the anchored elements 3010, 3020.



FIG. 31 depicts a slide user interaction 3075 with the touch pad 3085 on the reverse side 3095 of the electronic device 3000. This slide interaction 3075 is angled with an “up” portion followed by a “right” portion. Such a dual-slide user interaction may be helpful to prevent unintended adjustment of interactive user interface elements due to inadvertent contact with the electronic device's touch-sensitive surfaces. As shown in FIG. 32, the dual-slide user interaction 3075 directs the non-selected element 3030 to move left (due to the mirror effect of the slide on the reverse side 3095). The “up” portion of the slide interaction 3075 does not direct a drag of the non-selected element 3030 but rather complicates the slide user interaction such that inadvertent drags are less likely to occur.
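
A Python sketch of how such an angled gesture might be recognized follows; the elbow-finding shortcut and the pixel thresholds are illustrative assumptions.

    def is_angled_gesture(points, min_leg_px=40, tol_px=15):
        """Detect the two-leg 'up then right' slide of FIG. 31.

        Requiring two roughly perpendicular legs makes an accidental brush
        against the pad unlikely to register as a drag; only the second
        (rightward) leg would direct element movement."""
        ys = [p[1] for p in points]
        elbow = ys.index(min(ys))   # screen y grows downward: topmost sample
        up_leg = points[0][1] - points[elbow][1]
        right_leg = points[-1][0] - points[elbow][0]
        drift = abs(points[elbow][0] - points[0][0])  # wobble in the up leg
        return (up_leg >= min_leg_px and right_leg >= min_leg_px
                and drift <= tol_px)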



FIG. 33 illustrates a simplified block diagram of an electronic device 3300 with a touch screen 3381 and a touch pad 3385. As shown, the touch screen 3381 is on an obverse side of the electronic device 3300 and the touch pad 3385 is on a reverse side of the electronic device 3300. In other embodiments, however, the touch pad could be on the top of the electronic device, the bottom of the electronic device, or even on the obverse side of the electronic device along with the touch screen 3381. As noted previously, the touch screen 3381 and touch pad 3385 are examples of touch-sensitive surfaces, and the touch pad 3385 can be replaced with a second touch screen in an alternate embodiment. The electronic device 3300 also has a touch screen controller 3382 coupled to the touch screen 3381 and a touch pad controller 3386 coupled to the touch pad 3385. Both controllers 3382, 3386 are coupled to a processor 3388. In other embodiments, the controllers may be integrated into a single controller or into the processor 3388. The processor 3388 receives signals from the touch screen 3381 and touch pad 3385 via their respective controllers 3382, 3386 and directs signals to the touch screen 3381 display (via its controller 3382) and/or audio speaker 3387.


A memory 3389, coupled to the processor 3388, stores software programs for manipulating graphical user interface elements in accordance with the flow diagram of FIG. 5, an operating system, various application programs, and data files. The memory 3389 can include one or more forms of volatile and/or non-volatile, fixed and/or removable memory, such as read-only memory (ROM), erasable programmable read-only memory (EPROM), random access memory (RAM), and electrically erasable programmable read-only memory (EEPROM).


When executing various software programs, the processor 3388 has a touch detection module 3371 for detecting a touch user interaction on a first touch-sensitive surface of an electronic device 3300. The detection module determines which graphical user interface element has been selected via the detected touch. A slide detection module 3375 detects a slide user interaction on a second touch-sensitive surface of the electronic device 3300. Based on the detected slide motion (possibly including pressure, velocity, direction, and pattern of the slide motion), a graphical user interface element manipulation module 3376 manipulates non-selected interactive user elements relative to the first interactive user element(s). As mentioned previously, the manipulation may be classified as drag, push, rotate, or pixel-based moves based on the slide user interaction. Signals from the manipulation module are coupled to the display screen controller 3382 to cause the graphical user interface elements to change as directed by the processor 3388.
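
The module structure might be sketched as plain classes, as below; the method signatures are assumptions for illustration and not the patented implementation.

    class TouchDetectionModule:            # 3371
        def element_for_touch(self, surface, point, elements):
            """Resolve a touch on either surface to a selected element."""
            ...

    class SlideDetectionModule:            # 3375
        def features(self, samples):
            """Extract pressure, velocity, direction, and pattern from the
            sampled slide motion."""
            ...

    class ManipulationModule:              # 3376
        def apply(self, elements, selected, slide_features):
            """Drag, push, rotate, or pixel-move the non-selected elements
            and emit updated geometry to the display controller."""
            ...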


The electronic device 3300 can also include a variety of other components (not shown) based on the particular implementation. For example, if the electronic device 3300 were implemented as a mobile phone, it would also include a microphone and a wireless transceiver, and possibly additional input components such as a keypad, an accelerometer, and a vibration alert. If the electronic device 3300 were implemented as a remote controller, an infrared transmitter could also be included.



FIGS. 34-35 illustrate various timing options available for user interaction with the touch screen and the touch pad. In FIG. 34, a touch user interaction 3410 occurs on a first touch-sensitive surface of an electronic device. This touch user interaction 3410 has a positive time duration as shown. After starting the touch user interaction 3410 and before ending the touch user interaction 3410, a slide user interaction 3420 occurs on a second touch-sensitive surface of the electronic device. The time elapsed 3450 between the commencement of the touch user interaction 3410 and the commencement of the slide user interaction 3420 may be any non-negative time period, including zero, which means that the touch interaction 3410 and the slide interaction 3420 commenced at almost the same time. (The tolerance for a “zero time elapsed” determination may be set by a manufacturer setting, a user-configurable setting, or through a learning process by the electronic device.)


The touch interaction 3410 and the slide interaction 3420 both continue for a period of time 3460, and it is generally expected that a user will release the touch interaction 3410 before completing the slide interaction 3420 as shown in FIG. 34, resulting in a time period 3470 where the electronic device only detects the slide interaction 3420. FIGS. 6-32 all presume the kind of timing depicted in FIG. 34. The touch user interaction 3410 on a first touch-sensitive surface selects an interactive user element (or pixel) on a display of the electronic device while the slide user interaction 3420 on a second touch-sensitive surface manipulates other interactive user elements on the display.



FIG. 35 illustrates an alternate timing option where the touch user interaction 3510 starts before and ends after a slide user interaction 3520. This situation occurs in FIGS. 1-4 described earlier. Thus, there is a time period 3550 between starting the touch user interaction 3510 and starting the slide user interaction 3520, a time period 3560 when the electronic device detects both the touch user interaction 3510 and the slide user interaction 3520, followed by a time period 3580 after the slide interaction 3520 completes where the touch interaction 3510 remains. In this time period 3580, the touch user interaction 3510 could change to become a drag user interaction (or other type of touch-based user interaction) as shown in FIG. 3.
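
The two timing options can be captured in a short Python sketch; the 50 ms tolerance for a “zero time elapsed” start is an illustrative assumption, since the disclosure leaves that tolerance to a manufacturer setting, a user setting, or a learning process.

    def classify_timing(touch_start, touch_end, slide_start, slide_end,
                        tol=0.05):
        """Classify dual-sided gesture timing per FIGS. 34-35 (times in
        seconds)."""
        if slide_start < touch_start - tol:
            return "invalid"          # slide began first: not this gesture
        if touch_end <= slide_start:
            return "invalid"          # touch and slide never overlapped
        if touch_end < slide_end:
            return "fig. 34"          # touch released before slide completes
        return "fig. 35"              # touch persists; may become a drag

    assert classify_timing(0.0, 1.0, 0.4, 1.5) == "fig. 34"
    assert classify_timing(0.0, 2.0, 0.4, 1.5) == "fig. 35"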


Thus, the electronic device and method for manipulating graphical user interface elements detects a touch user interaction on a first touch-sensitive surface of an electronic device to select a first interactive user element, detects a slide user interaction on a second touch-sensitive surface of the electronic device, and manipulates non-selected interactive user elements relative to the first interactive user element(s) based on the slide user interaction. This document has disclosed drag, push, rotate, and pixel-based moves based on the slide user interaction.


While the present invention is susceptible of embodiment in various forms, the drawings show and the text describes embodiments with the understanding that the present disclosure is to be considered an exemplification of the invention and is not intended to limit the invention to the specific embodiments illustrated. Furthermore, while the various figures are intended to illustrate the various claimed aspects of the present invention, in doing so, the elements are not necessarily intended to be drawn to scale. In other words, the size, shape, and dimensions of some elements, features, components, and/or regions are for purposes of clarity (or for purposes of better describing or illustrating the concepts intended to be conveyed) and may be exaggerated and/or emphasized relative to other illustrated elements.


While various embodiments of the invention have been illustrated and described, it is to be understood that the invention is not so limited. Numerous modifications, changes, variations, substitutions, and equivalents will occur to those skilled in the art without departing from the spirit and scope of the present invention as defined by the appended claims.

Claims
  • 1. A method comprising: displaying, by an electronic device, at least two elements; selecting, by the electronic device and based on a touch input detected on a first side of the electronic device, a first element from the at least two elements; manipulating, by the electronic device, a non-selected second element from the at least two elements in a direction of a slide touch input detected on a second side of the electronic device; and responsive to detecting a cessation of the slide touch input, moving, based on a movement of the touch input after cessation of the slide touch input, the first element.
  • 2. The method in accordance with claim 1, wherein the manipulating comprises: dragging the second element laterally, within the virtual plane of the touch screen, relative to the first element in the direction of the slide touch input.
  • 3. The method in accordance with claim 1, wherein the manipulating comprises: rotating the second element relative to the first element.
  • 4. The method in accordance with claim 1, wherein the selecting comprises selecting, based on the touch input, a two-part first element, wherein the touch input is a dual-touch contact, and wherein moving the first element comprises moving the two-part first element.
  • 5. A device comprising: a touch screen positioned on a first side of the device and configured to display at least two elements; a touch pad positioned on a second side of the device; and one or more processors configured to: select, based on a touch input detected by the touch screen, a first element from the at least two elements; manipulate a non-selected second element from the at least two elements, relative to the first element, in a direction of a slide touch input detected by the touch pad; and responsive to detecting a cessation of the slide touch input, move the first element based on a movement of the touch input after cessation of the slide touch input.
  • 6. The device of claim 5, wherein the one or more processors are configured to manipulate the non-selected second element by at least being configured to drag the second element laterally, within the virtual plane of the touch screen, relative to the first element in the direction of the slide touch input.
  • 7. The device of claim 5, wherein the one or more processors are configured to manipulate the non-selected second element by at least being configured to rotate the second element, within the plane of the touch screen, relative to the first element.
  • 8. The device of claim 5, wherein the one or more processors are configured to select the first element by at least being configured to select, based on the touch input, a two-part first element, wherein the touch input is a dual-touch contact, and wherein the one or more processors are configured to move the first element by at least being configured to move the two-part first element.
  • 9. A non-transitory computer-readable storage medium encoded with instructions for causing one or more processors of an electronic device to: select, based on a touch input detected on a first side of the electronic device, a first element from at least two elements, wherein the at least two elements are displayed by a touch screen positioned on the first side of the electronic device; manipulate a non-selected second element from the at least two elements, relative to the first element, in a direction of a slide touch input detected on a second side of the electronic device; and responsive to detecting a cessation of the slide touch input, move the first element based on a movement of the touch input after cessation of the slide touch input.
  • 10. The non-transitory computer-readable storage medium of claim 9 further encoded with instructions to cause the one or more processors to manipulate the non-selected element by at least causing the one or more processors to drag the second element laterally, within the virtual plane of the touch screen, relative to the first element in the direction of the slide touch input.
  • 11. The non-transitory computer-readable storage medium of claim 9 further encoded with instructions to cause the one or more processors to manipulate the non-selected element by at least causing the one or more processors to rotate the second element, within the plane of the touch screen, relative to the first element.
  • 12. The non-transitory computer-readable storage medium of claim 9 further encoded with instructions to cause the one or more processors to select the first element by at least causing the one or more processors to select, based on the touch input, a two-part first element, wherein the touch input is a dual-touch contact, and move the first element by at least causing the one or more processors to move the two-part first element.
US Referenced Citations (107)
Number Name Date Kind
4076262 Deventer Feb 1978 A
5128671 Thomas, Jr. Jul 1992 A
5494447 Zaidan Feb 1996 A
5543588 Bisset et al. Aug 1996 A
5610971 Vandivier Mar 1997 A
5623280 Akins et al. Apr 1997 A
5729219 Armstrong et al. Mar 1998 A
5795300 Bryars Aug 1998 A
5832296 Wang et al. Nov 1998 A
5896575 Higginbotham et al. Apr 1999 A
5898600 Isashi Apr 1999 A
5959260 Hoghooghi et al. Sep 1999 A
6020878 Robinson Feb 2000 A
6201554 Lands Mar 2001 B1
6233138 Osgood May 2001 B1
6392870 Miller, Jr. May 2002 B1
6457547 Novitschitsch Oct 2002 B2
6466198 Feinstein Oct 2002 B1
6504530 Wilson et al. Jan 2003 B1
6532147 Christ, Jr. Mar 2003 B1
6549789 Kfoury Apr 2003 B1
6597347 Yasutake Jul 2003 B1
6927747 Amirzadeh et al. Aug 2005 B2
7058433 Carpenter Jun 2006 B2
7075513 Silfverberg et al. Jul 2006 B2
7205959 Henriksson Apr 2007 B2
7218313 Marcus et al. May 2007 B2
7423526 Despotis Sep 2008 B2
7453442 Poynter Nov 2008 B1
7453443 Rytivaara et al. Nov 2008 B2
8265717 Gorsica et al. Sep 2012 B2
8669863 Alhuwaishel Mar 2014 B2
20010052122 Nanos et al. Dec 2001 A1
20030103324 Gallivan Jun 2003 A1
20030197678 Siddeeq Oct 2003 A1
20030199290 Viertola Oct 2003 A1
20030234768 Rekimoto et al. Dec 2003 A1
20040192260 Sugimoto et al. Sep 2004 A1
20040212599 Cok et al. Oct 2004 A1
20050012723 Pallakoff Jan 2005 A1
20050020325 Enger et al. Jan 2005 A1
20050024339 Yamazaki et al. Feb 2005 A1
20050031390 Orozco-Abundis Feb 2005 A1
20050096106 Bennetts et al. May 2005 A1
20050124395 Bae et al. Jun 2005 A1
20050275416 Hervieux et al. Dec 2005 A1
20050282596 Park et al. Dec 2005 A1
20060017711 Pihlaja Jan 2006 A1
20060024601 Ogawa et al. Feb 2006 A1
20060034601 Andersson et al. Feb 2006 A1
20060037175 Hyun Feb 2006 A1
20060084482 Saila Apr 2006 A1
20060092355 Yang et al. May 2006 A1
20060111160 Lin et al. May 2006 A1
20060139320 Lang Jun 2006 A1
20060170649 Kosugi et al. Aug 2006 A1
20060197750 Kerr et al. Sep 2006 A1
20060284853 Shapiro Dec 2006 A1
20070075915 Cheon et al. Apr 2007 A1
20070076861 Ju Apr 2007 A1
20070097151 Rosenberg May 2007 A1
20070103454 Elias May 2007 A1
20070127199 Arneson Jun 2007 A1
20070177803 Elias et al. Aug 2007 A1
20080004085 Jung et al. Jan 2008 A1
20080102888 Seligren et al. May 2008 A1
20080150903 Chuang Jun 2008 A1
20080192977 Gruenhagen et al. Aug 2008 A1
20080211783 Hotelling et al. Sep 2008 A1
20080252608 Geaghan Oct 2008 A1
20080261661 Jessop Oct 2008 A1
20080266118 Pierson et al. Oct 2008 A1
20090046110 Sadler et al. Feb 2009 A1
20090061948 Lee et al. Mar 2009 A1
20090066660 Ure Mar 2009 A1
20090096749 Kawahara et al. Apr 2009 A1
20090131117 Choi May 2009 A1
20090140863 Liu et al. Jun 2009 A1
20090199130 Tsern et al. Aug 2009 A1
20090201253 Jason et al. Aug 2009 A1
20090241048 Augustine et al. Sep 2009 A1
20090251419 Radely-Smith Oct 2009 A1
20090273571 Bowens Nov 2009 A1
20090298547 Kim et al. Dec 2009 A1
20090315834 Nurmi et al. Dec 2009 A1
20100007603 Kirkup Jan 2010 A1
20100020034 Kim Jan 2010 A1
20100029327 Jee Feb 2010 A1
20100110495 Letocha May 2010 A1
20100113100 Harmon et al. May 2010 A1
20100134409 Challener et al. Jun 2010 A1
20100219943 Vanska et al. Sep 2010 A1
20100235742 Hsu et al. Sep 2010 A1
20100277420 Charlier et al. Nov 2010 A1
20100277421 Charlier et al. Nov 2010 A1
20110003665 Burton et al. Jan 2011 A1
20110012921 Cholewin et al. Jan 2011 A1
20110012928 Cholewin et al. Jan 2011 A1
20110157799 Harmon et al. Jun 2011 A1
20110190675 Vess Aug 2011 A1
20110221688 Byun et al. Sep 2011 A1
20120092383 Hysek et al. Apr 2012 A1
20120127070 Ryoo et al. May 2012 A1
20120139904 Lee et al. Jun 2012 A1
20130044215 Rothkopf et al. Feb 2013 A1
20130197857 Lu et al. Aug 2013 A1
20140018686 Medelius et al. Jan 2014 A1
Foreign Referenced Citations (18)
Number Date Country
0913977 May 1999 EP
1335567 Feb 2002 EP
1408400 Apr 2004 EP
1517223 Mar 2005 EP
1754424 Feb 2007 EP
2065786 Jun 2009 EP
2150031 Feb 2010 EP
2771769 Jun 1999 FR
2339505 Jan 2000 GB
2368483 May 2002 GB
100683535 Feb 2007 KR
1020070035026 Mar 2007 KR
2004114636 Dec 2004 WO
2005071928 Aug 2005 WO
2005111769 Nov 2005 WO
2008030563 Mar 2008 WO
2009123406 Oct 2009 WO
2010097692 Sep 2010 WO
Non-Patent Literature Citations (25)
Entry
Patent Cooperation Treaty, “PCT Search Report and Written Opinion of the International Searching Authority” for International Application No. PCT/US2013/055165, Mar. 12, 2014, 10 pages.
International Preliminary Report on Patentability from international application No. PCT/US2010/040876, dated Jul. 20, 2009, 8 pp.
Prosecution History from U.S. Appl. No. 12/505,775, dated Dec. 23, 2011 through Mar. 25, 2013, 53 pp.
Prosecution History from U.S. Appl. No. 12/565,200, dated Sep. 18, 2012 through Apr. 3, 2013, 48 pp.
Illustration of GPS system, published by lucid touch microsoft research & mitsubishi research, Nov. 26, 2008, retrieved from http://research.microsoft.com/users/baudisch/projects/lucidtouch/applications, 3 pp.
Chu et al., “Lucid Touch prototype,” published by lucid touch microsoft research & mitsubishi electric research labs, Nov. 26, 2008, retrieved from http://research.microsoft.com/users/baudisch/projects/lucidtouch/index.html, 1 pp.
Erh-Li (Early) Shen et al., “Double-side Multi-touch Input for Mobile Devices”, CHI 2009—Spotlight on Works in Progress—Session 2, Apr. 4, 2009, pp. 4339-4344.
Patrick Baudisch, “Application Areas of Lucid Touch”, http://research.microsoft.com/users/baudisch/projects/lucidtouch/applications, accessed Nov. 26, 2008, 3 pages.
Dance With Shadows, “Microsoft's LucidTouch See-Through Touchscreen Unveiled”, http://www.dancewithshadows.com/tech/lucid-touch.asp, Mar. 10, 2008, 2 pages.
Patrick Baudisch, “Lucid Touch Homepage”, http://research.microsoft.com/users/baudisch/projects/lucidtouch/index.html, accessed Nov. 26, 2008, 5 pages.
Masanori Sugimoto and Keiichi Hiroki, “HybridTouch: An Intuitive Manipulation Technique for PDAs Using Their Front and Rear Surfaces”, Proc. of the 8th Int'l Conf. on Human Computer Interaction with Mobile Devices and Services (MobileHCI 2006), Sep. 12, 2006, pp. 137-140.
Daniel Wigdor et al., “Lucid Touch: A See-through Mobile Device”, Proc. of the 20th Annual ACM Symposium on User Interface Software and Tech., Oct. 7, 2007, pp. 269-278, XP002582051.
Adesso, Inc., “Adesso Easy Cat 2 Button Glidepoint Touchpad (Black)”, http://www.adesso.com/en/component/content/article/63-touchpads/189-gp-160.html, downloaded Sep. 12, 2012, 2 pages.
Gregory Wilson, “Evaluating the Effectiveness of Using Touch Sensor Capacitors as an Input Device for a Wrist Watch Computer”, Georgia Institute of Technology undergraduate thesis, smartech.gatech.edu/xmlui/handle/1853/19947, Dec. 17, 2007, 15 pages.
Jun Rekimoto, “GestureWrist and GesturePad: Unobtrusive Wearable Interaction Devices”, http://www.sonycsl.co.jp/person/rekimoto/papers/iswc01.pdf, 5th IEEE Int'l Symp. on Wearable Computers, 2001, pp. 21-27.
Paul H. Dietz et al., “A Practical Pressure Sensitive Computer Keyboard”, 22nd Ass'n for Computing Machinery Symp. on User Interface Software and Tech., Oct. 4, 2009, 4 pages.
Samsung, “Samsung Wearable Mobile Device Makes Communication Easier for an Active Lifestyle”, http://www.tuvie.com/samsung-wearable-mobile-device-can-make-communication-easier-in-adventurous-trips/, Apr. 3, 2010, 10 pages.
Charles McLellan, “Eleksen Wireless Fabric Keyboard: A First Look”, http://www.zdnet.com/eleksen-wireless-fabric-keyboard-a-first-look-3039278954, Jul. 17, 2006, 9 pages.
United States Patent and Trademark Office, “Final Office Action” for U.S. Appl. No. 12/565,200, Jan. 16, 2013, 12 pages.
United States Patent and Trademark Office, “Non-Final Office Action” for U.S. Appl. No. 12/433,253, Feb. 16, 2012, 22 pages.
United States Patent and Trademark Office, “Non-Final Office Action” for U.S. Appl. No. 12/565,200, Sep. 18, 2012, 22 pages.
Patent Cooperation Treaty, “PCT Search Report and Written Opinion of the International Searching Authority” for Int'l Appln. No. PCT/US2010/031879, Jul. 7, 2010, 14 pages.
Patent Cooperation Treaty, “PCT Search Report and Written Opinion of the International Searching Authority” for Int'l Appln. No. PCT/US2010/037568, Nov. 25, 2010, 19 pages.
Patent Cooperation Treaty, “PCT Search Report and Written Opinion of the International Searching Authority” for Int'l Appln. No. PCT/US2010/040876, Oct. 5, 2010, 14 pages.
United States Patent and Trademark Office, “Non-Final Office Action” for U.S. Appl. No. 12/505,775, Dec. 23, 2011, 19 pages.
Related Publications (1)
Number Date Country
20130215064 A1 Aug 2013 US
Continuations (1)
Number Date Country
Parent 12565200 Sep 2009 US
Child 13852211 US
Continuation in Parts (1)
Number Date Country
Parent 12505775 Jul 2009 US
Child 12565200 US