This disclosure relates to computing devices, and more particularly, to user interface (UI) techniques for managing active applications on touch sensitive computing devices.
Touch sensitive computing devices such as smart phones, eReaders, tablet computers, personal digital assistants (PDAs), and other such devices are commonly used for displaying consumable content and running multiple software applications (also known as applications or apps). The applications may vary based on the device, but may include applications in categories such as communications, entertainment, children, social, games, news and weather, tools and utilities, and productivity, just to name a few. The devices are useful for displaying a user interface that allows a user to interact with the displayed content, such as content provided by the various applications. The touch sensitive computing device may receive user input from a touch screen or some other touch sensitive surface/interface, such as a track pad (e.g., in combination with a non-touch sensitive display). The user may interact with the touch sensitive interface using fingers, a stylus, or some other implement to provide input to the user interface.
FIGS. 1a-b illustrate an example touch sensitive computing device having a manage active apps mode configured in accordance with an embodiment of the present invention.
FIGS. 1c-d illustrate example configuration screen shots of the user interface of the touch sensitive computing device shown in FIGS. 1a-b, configured in accordance with an embodiment of the present invention.
FIG. 2a illustrates a block diagram of a touch sensitive computing device configured in accordance with an embodiment of the present invention.
FIG. 2b illustrates a block diagram of a communication system including the touch sensitive computing device of FIG. 2a, configured in accordance with an embodiment of the present invention.
FIGS. 3a-f collectively illustrate an example manage active apps mode pinch and flick input for closing a list of active applications, in accordance with an embodiment of the present invention.
FIGS. 4a-c collectively illustrate an example manage active apps mode flick gesture, where the direction of the flick determines the mode function performed, in accordance with an embodiment of the present invention.
FIGS. 5a-c illustrate an example manage active apps mode separate action using a spread gesture to separate a previously formed stack of active applications, in accordance with an embodiment of the present invention.
Techniques are disclosed for managing active applications on a touch sensitive computing device using a pinch and flick gesture input, generally referred to herein as a manage active apps mode. The manage active apps mode allows a user to perform a pinch gesture on a display of active applications to form a stack of those active applications. The user can then perform a flick gesture on the stack to perform a function on all of the active applications in the stack. The function may include, for example, closing, stopping, force stopping, quitting, or deleting the active applications in the stack. In some cases, the manage active apps mode may further include an action, such as a spread gesture, that separates a previously formed stack. In some cases, the manage active apps mode may be configured to provide feedback (e.g., an animation or sound) after a stack has been flicked to indicate that the function was performed (e.g., that the apps were closed, stopped, etc.). Numerous other configurations and variations will be apparent in light of this disclosure.
General Overview
As previously described, touch sensitive computing devices such as tablets, eReaders, and smart phones are commonly used for displaying user interfaces and consumable content, and are generally configured to run multiple software applications (also known as applications or apps) at the same time. In this manner, a device can have multiple applications active at the same time. Some applications may become active automatically (e.g., when the device is powered on) or manually (e.g., when a user selects the specific application), for example. In some instances, a user may desire to manage all of the device's active applications by closing (or stopping, deleting, etc.) all of them at the same time. For example, the user may desire to manage all of the active applications to start a new set of active applications or to save device memory, power, or data usage.
Thus, and in accordance with one or more embodiments of the present invention, techniques are disclosed for managing active applications on a touch sensitive computing device using a pinch and flick gesture input, generally referred to herein as a manage active apps mode. As will be apparent in light of this disclosure, a pinch gesture can be performed on a display of active applications to form a stack of the active applications. As used herein, “pinch” and “pinch gesture” refer, in addition to their ordinary meaning, to making contact (whether direct or proximate) with a touch sensitive surface/interface using two or more fingers and then bringing those fingers toward each other or together. After a stack is formed, a flick gesture performed on the stack can be used to perform a function on all of the active applications in the stack. As used herein, “flick” and “flick gesture” refer, in addition to their ordinary meaning, to making contact (whether direct or proximate) with a touch sensitive surface/interface using one or more fingers and then making a throwing, swiping, and/or dragging motion. The function caused by flicking the stack may include, for example, closing, stopping, force stopping, quitting, and/or deleting all of the active applications in the stack. In some cases, the pinch and/or flick gesture may be made with the assistance of a stylus or other implement. For example, in some embodiments of the manage active apps mode, after a list of active applications is pinched to form them into a stack (e.g., using two fingers), a stylus may be used to flick the stack off of the touch screen to close (or stop, delete, etc.) all of the active applications in that stack.
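For purposes of illustration only, the following minimal Kotlin sketch shows one possible way a UI layer could classify tracked contact paths as the pinch and flick gestures defined above; all names and threshold values in the sketch are hypothetical and chosen solely for illustration.

```kotlin
import kotlin.math.hypot

// One sampled position of a single contact point over time (t in seconds).
data class Sample(val x: Float, val y: Float, val t: Float)

// A pinch brings two (or more) contact points toward each other or together.
fun isPinch(a: List<Sample>, b: List<Sample>, minClosurePx: Float = 80f): Boolean {
    if (a.size < 2 || b.size < 2) return false
    val startGap = hypot(a.first().x - b.first().x, a.first().y - b.first().y)
    val endGap = hypot(a.last().x - b.last().x, a.last().y - b.last().y)
    return startGap - endGap >= minClosurePx // the contact points converged
}

// A flick is a throwing/swiping motion: enough distance covered fast enough.
fun isFlick(path: List<Sample>, minSpeedPxPerSec: Float = 1000f): Boolean {
    if (path.size < 2) return false
    val dt = path.last().t - path.first().t
    if (dt <= 0f) return false
    val dist = hypot(path.last().x - path.first().x, path.last().y - path.first().y)
    return dist / dt >= minSpeedPxPerSec
}
```

In this sketch, a pinch is recognized when two contact paths converge by a minimum amount, and a flick when a single path covers enough distance at sufficient speed; an actual embodiment may use any suitable recognition technique, including recognition of proximate (hover) contact.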
As used herein, “application(s)” and “app(s)” refer, in addition to their ordinary meaning, to software or programs for a computing device that serve a user in some capacity or help a user perform an activity, task, or job. As used herein, “active” used in conjunction with application(s) and app(s) refers, in addition to its ordinary meaning, to being currently running, open, or displayed in the foreground and/or background. “Active” may also refer to using device memory, power, and/or data. Applications included in a display of active applications (e.g., a list of active applications) may include any application made active automatically (e.g., when the device is powered on) and/or manually (e.g., started by a user). However, a list of active applications need not include every application active on the device. For example, some devices may separate active applications into categories, such that the techniques described herein can be used on one categorical list of active applications (e.g., active entertainment applications). In such an example, a pinch and flick input may be used to close all of the active entertainment applications, without affecting the active applications in any other category. In another example, in devices that include multiple user profiles, lists of active applications may be specific to each user profile. In such an example, a pinch and flick input may be used to close the active applications of one user profile, without affecting the active applications of any other user profile. In still other embodiments, some active applications may effectively be designated as exempt (based on user-configuration and/or hard-coded), such that those applications are not affected by a pinch and flick input of the manage active apps mode (e.g., even if they are active and included in the stack, the pinch and flick based function will not operate on exempted apps). As will be apparent in light of this disclosure, the manage active apps mode can be used on any display of active applications, whether that display is a list, group, grid, menu, icon layout, or any other suitable display.
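As an illustrative sketch only, with hypothetical names and fields not prescribed by this disclosure, the following Kotlin fragment models how a stack might be scoped to one category or one user profile, and how exempted applications could be skipped when the flicked function is applied.

```kotlin
// Hypothetical model of an active application. Category, owning user profile,
// and an exemption flag control which apps a pinch-and-flick input affects.
data class ActiveApp(
    val name: String,
    val category: String,       // e.g., "entertainment"
    val profile: String,        // e.g., the owning user profile
    val exempt: Boolean = false // exempted apps ignore the flicked function
)

// Form a stack from the displayed list, optionally scoped to one category
// and/or one user profile; other active applications are unaffected.
fun formStack(displayed: List<ActiveApp>, category: String? = null, profile: String? = null) =
    displayed.filter {
        (category == null || it.category == category) &&
                (profile == null || it.profile == profile)
    }

// Apply the flicked function to the stack, skipping exempted applications.
fun applyToStack(stack: List<ActiveApp>, function: (ActiveApp) -> Unit) =
    stack.filterNot { it.exempt }.forEach(function)
```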
The functions available using the manage active apps mode may vary based on the device and/or the device's user interface (or operating system), as will be apparent in light of this disclosure. In some embodiments, closing active applications may stop them from running in the foreground, but one or more of the closed applications may still run in the background or may continue to cause use of device memory, power, and/or data. In some such embodiments, stopping, force stopping, or quitting active applications may stop them from running in both the foreground and the background. In other embodiments, managing active applications may include other functions, such as force quitting, ending, or killing active applications, or even deleting (removing/uninstalling) applications. In some embodiments, managing active applications may include any function that makes the applications inactive in some manner. In some embodiments, the function performed on all of the active applications in the stack may be determined by the characteristics of the flick gesture (e.g., the direction of the flick). For example, in one embodiment, flicking the stack to the left closes the active applications in the stack, flicking the stack to the right force stops the active applications, and flicking the stack downward deletes the active applications. With respect to deleting apps, a special gesture and/or confirmation can be used. For instance, if the user flicks the stack away before releasing from the pinch gesture (such that pinch and flick are effectively one continuous gesture), then a delete/removal of the applications in the stack would be executed, in accordance with one embodiment. The user may also be prompted to confirm the deletion, if so desired. Applications that are required or otherwise restricted can be excluded or otherwise exempted from any such deletion requests, as previously explained. Allowing a user to select the function performed on the stack of active applications, e.g., based on the direction and/or nature of the flick gesture, may enhance the user's experience when managing active applications.
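The mapping from flick characteristics to functions may, for instance, be structured as in the following hypothetical Kotlin sketch, in which a continuous pinch-and-flick selects deletion (subject to confirmation) and the dominant flick direction otherwise selects the function; the specific assignments shown are illustrative only.

```kotlin
import kotlin.math.abs

enum class StackFunction { CLOSE, STOP, FORCE_STOP, QUIT, DELETE }

// Map the net displacement of the flick (dx, dy, screen coordinates with y
// increasing downward) and its continuity with the pinch to a function.
fun functionForFlick(dx: Float, dy: Float, continuousWithPinch: Boolean): StackFunction =
    when {
        continuousWithPinch -> StackFunction.DELETE // may prompt for confirmation
        abs(dx) >= abs(dy) && dx < 0 -> StackFunction.CLOSE // flick left
        abs(dx) >= abs(dy) -> StackFunction.FORCE_STOP      // flick right
        dy > 0 -> StackFunction.DELETE                      // flick downward
        else -> StackFunction.STOP                          // flick upward
    }
```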
In some embodiments, the manage active apps mode may include a gesture for separating a previously formed stack of active applications prior to performing a flick gesture. In some such embodiments, a spread gesture performed on the stack, for example, may be used to restore all of the active applications into their original list (to undo forming the stack). In some embodiments, the manage active apps mode may be configured to provide feedback to indicate that the active applications have been closed (or stopped, force stopped, etc.) after the flick gesture has been performed. For example, the feedback may be visual (e.g., an animation or text is displayed), auditory (e.g., a notification sound is played), and/or tactile (e.g., a vibration is provided). In some embodiments, the manage active apps mode pinch and flick may be made using one continuous gesture (by maintaining contact between the pinch gesture and flick gesture), as will be discussed in turn.
In some embodiments, the manage active apps mode may be configured at a device level (based on the settings of the electronic device or administrative user) and/or at a user profile level (based on the specific user profile being used). For example, in devices having multiple user profiles, the manage active apps mode may be configured to close the active applications in response to a pinch and flick input when one user profile is active, whereas it may be configured to stop the active applications in response to the same pinch and flick input in another user profile. To this end, the manage active apps mode may be user-configurable, hard-coded, or some combination thereof (e.g., where some aspects are user-configurable and others are hard-coded). Further, the manage active apps mode as variously described herein may be included initially with the user interface (or operating system) of a touch sensitive computing device or be a separate program/service/application configured to interface with an already existing UI for a touch sensitive computing device to incorporate the functionality of the manage active apps mode as variously described herein. In some instances, the manage active apps mode may be provided as a non-transitory computer program product comprising a set of instructions. For ease of reference, user input (e.g., the input used for the pinch and flick gestures) is sometimes referred to as contact or user contact. However, direct and/or proximate contact (e.g., hovering within a few centimeters of the touch sensitive surface) may be used to perform gestures as variously described herein depending on the specific touch sensitive device/interface being used. In other words, in some embodiments, a user may be able to use the manage active apps mode without physically touching the touch sensitive device, as will be apparent in light of this disclosure.
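One hypothetical way to realize such device-level and profile-level configuration is sketched below in Kotlin, where a profile-specific configuration, when present, overrides the device-level (possibly hard-coded) default; the structure and field names are illustrative assumptions only.

```kotlin
// Hypothetical two-level configuration: a profile-level setting, when present,
// overrides the device-level (possibly hard-coded) default.
data class ModeConfig(
    val enabled: Boolean,
    val flickFunction: String,             // e.g., "close" or "stop"
    val directionSelectsFunction: Boolean, // does flick direction pick the function?
    val feedback: Boolean                  // provide feedback after the flick?
)

class ModeSettings(
    private val deviceDefault: ModeConfig,
    private val perProfile: Map<String, ModeConfig> = emptyMap()
) {
    // Resolve the effective configuration for the active user profile.
    fun resolve(profile: String): ModeConfig = perProfile[profile] ?: deviceDefault
}
```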
Device and Configuration Examples
FIGS. 1a-b illustrate an example touch sensitive computing device having a manage active apps mode configured in accordance with an embodiment of the present invention. The device could be, for example, a tablet computer such as the NOOK® Tablet by Barnes & Noble. In a more general sense, the device may be any electronic device having a touch sensitive user interface and capability for displaying content to a user, such as a mobile phone or mobile computing device such as an eReader, a tablet or laptop, a desktop computing system, a television, a smart display screen, or any other device having a touch screen display or a non-touch display screen that can be used in conjunction with a touch sensitive surface/interface. As will be appreciated in light of this disclosure, the claimed invention is not intended to be limited to any particular kind or type of touch sensitive computing device.
As can be seen with this example configuration, the device comprises a housing/frame that includes a number of hardware features such as a power button and a press-button (sometimes called a home button herein). A touch screen based user interface (UI) is also provided, which in this example embodiment includes a quick navigation menu having six main categories to choose from (Home, Library, Shop, Search, Light, and Settings) and a status bar that includes a number of icons (a night-light icon, a wireless network icon, and a book icon), a battery indicator, and a clock. Other embodiments may have fewer or additional such UI touch screen controls and features, or different UI touch screen controls and features altogether, depending on the target application of the device. Any such general UI controls and features can be implemented using any suitable conventional or custom technology, as will be appreciated. Although the touch sensitive computing device shown in FIGS. 1a-b is a tablet computer, the manage active apps mode as variously described herein can be implemented on any suitable touch sensitive computing device, as previously explained.
The power button can be used to turn the device on and off, and may be used in conjunction with a touch-based UI control feature that allows the user to confirm a given power transition action request (e.g., such as a slide bar or tap point graphic to turn power off). In this example device, the home button is a physical press-button that can be used as follows: when the device is awake and in use, tapping the button will display the device's home screen or holding the button will display an active apps screen (e.g., a list of active applications). Numerous other configurations and variations will be apparent in light of this disclosure, and the claimed invention is not intended to be limited to any particular set of hardware buttons or features, or device form factor.
Continuing from FIG. 1a, after the user selects the Settings category from the quick navigation menu, one or more configuration screens may be presented, such as the example settings screens shown in FIGS. 1c-d, which can be used to configure the manage active apps mode.
As will be appreciated, the various UI control features and sub-menus displayed to the user are implemented as UI touch screen controls in this example embodiment. Such UI touch screen controls can be programmed or otherwise configured using any number of conventional or custom technologies. In general, the touch screen translates one or more touches (whether direct or proximate and whether made by a user's hand, a stylus, or some other suitable implement) in a particular location(s) into an electrical signal which is then received and processed by the underlying operating system (OS), system software, and circuitry (processor, etc.) of the touch sensitive computing device. In some instances, note that the user need not actually physically touch the touch sensitive surface/interface to provide user input (e.g., when the touch sensitive surface/interface recognizes hovering input). Additional example details of the underlying OS and circuitry in accordance with some embodiments will be discussed in turn with reference to FIG. 2a.
As previously explained, and with further reference to FIGS. 1c-d, the manage active apps mode may be user-configurable via one or more settings screens.
In the example case shown in FIG. 1c, a settings screen for the manage active apps mode is provided, which includes a number of selectable options for configuring the pinch and flick input.
Continuing with the example settings screen shown in FIG. 1c, a pinch gesture section allows the user to configure aspects of the pinch gesture used to form a stack of active applications.
The next selectable setting shown in the example of FIG. 1c allows the user to select (e.g., via the Yes and No check boxes shown) whether the direction of the flick gesture affects the function performed on the stack.
If the flick direction does affect the function performed (i.e., if the Yes box is selected) in this example case, then the user can further configure the manage active apps mode to assign various functions (e.g., close apps, stop apps, delete apps, etc.) to flick directions, as will be apparent in light of this disclosure. This can be achieved, for example, by selecting the Configure button next to the Yes box when the Yes box is selected. For example, the user may be able to specify that flicking the stack to the left closes the applications in the stack and flicking the stack to the right force stops the applications, as shown in FIG. 1d.
The next settings option under the flick gesture section shown in FIG. 1c allows the user to configure the feedback (e.g., visual, auditory, and/or tactile) provided after a stack has been flicked to indicate that the function was performed.
In one or more embodiments, the user may specify the user profiles in which the manage active apps mode is available. Such a configuration feature may be helpful, for instance, in a smart phone or tablet computer or other multifunction computing device that includes multiple user profiles (as opposed to a device having only one user profile). In one example case, for instance, the administrative user of the device may be able to designate which user profiles can use the manage active apps mode as variously described herein, or determine whether or not the users have access to configure the manage active apps mode. In some embodiments, the manage active apps mode may also be related or tied to another aspect of the device's UI (or operating system), such that the manage active apps mode is only available when the other aspect is running or invoked. For example, the manage active apps mode may only be available, active, or running when an active apps screen is displayed (e.g., using an active apps button).
As can be further seen in FIG. 1c, a back button may be provided to allow the user to save the configured settings and return to the previous screen, if so desired.
Architecture
FIG. 2a illustrates a block diagram of a touch sensitive computing device configured in accordance with an embodiment of the present invention. As can be seen, this example device includes a processor, memory (e.g., RAM and/or ROM for processor workspace and storage), additional storage/memory (e.g., for content), a communications module, a touch screen, and an audio module. A communications bus and interconnect is also provided to allow inter-device communication. Other typical componentry and functionality not reflected in the block diagram will be apparent (e.g., battery, co-processor, etc.). Further note that although a touch screen display is provided, other embodiments may include a non-touch screen and a touch sensitive surface such as a track pad, or a touch sensitive housing configured with one or more acoustic sensors, etc. In this manner, a non-touch sensitive computing device can become a touch sensitive computing device by adding an interfacing touch sensitive component. The principles provided herein equally apply to any such touch sensitive devices. For ease of description, examples are provided with touch screen technology.
The touch sensitive surface (touch sensitive display or touch screen, in this example) can be any device that is configured with user input detecting technologies, whether capacitive, resistive, acoustic, active or passive stylus, and/or other input detecting technology. The screen display can be layered above input sensors, such as a capacitive sensor grid for passive touch-based input (e.g., with a finger or passive stylus in the case of a so-called in-plane switching (IPS) panel), or an electro-magnetic resonance (EMR) sensor grid (e.g., for sensing a resonant circuit of the stylus). In some embodiments, the touch screen display can be configured with a purely capacitive sensor, while in other embodiments the touch screen display may be configured to provide a hybrid mode that allows for both capacitive input and active stylus input. In any such embodiments, a touch screen controller may be configured to selectively scan the touch screen display and/or selectively report contacts detected directly on or otherwise sufficiently proximate to (e.g., within a few centimeters) the touch screen display. The proximate contact may include, for example, hovering input used to cause location specific input as though direct contact were being provided on a touch sensitive surface (such as a touch screen). Numerous touch screen display configurations can be implemented using any number of known or proprietary screen based input detecting technology.
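By way of a hypothetical illustration of such selective reporting, the following Kotlin sketch classifies a sensed contact as direct, proximate (hover), or ignorable based on its distance from the surface; the distance values are illustrative assumptions only.

```kotlin
enum class ContactKind { DIRECT, PROXIMATE, NONE }

// Report a contact as direct when the implement touches the surface, as
// proximate (hover) when within a few centimeters, and otherwise ignore it.
fun classifyContact(distanceCm: Float, hoverRangeCm: Float = 3f): ContactKind =
    when {
        distanceCm <= 0f -> ContactKind.DIRECT
        distanceCm <= hoverRangeCm -> ContactKind.PROXIMATE
        else -> ContactKind.NONE
    }
```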
Continuing with the example embodiment shown in FIG. 2a, the memory includes a number of modules stored therein that can be accessed and executed by the processor (and/or a co-processor), including an operating system (OS) module, a user interface (UI) module, and a power conservation (Power) module.
The processor can be any suitable processor (e.g., 800 MHz Texas Instruments® OMAP3621 applications processor), and may include one or more co-processors or controllers to assist in device control. In this example case, the processor receives input from the user, including input from or otherwise derived from the power button, home button, and touch sensitive surface. The processor can also have a direct connection to a battery so that it can perform base level tasks even during sleep or low power modes. The memory (e.g., for processor workspace and executable file storage) can be any suitable type of memory and size (e.g., 256 or 512 Mbytes SDRAM), and in other embodiments may be implemented with non-volatile memory or a combination of non-volatile and volatile memory technologies. The storage (e.g., for storing consumable content and user files) can also be implemented with any suitable memory and size (e.g., 2 GBytes of flash memory).
The display can be implemented, for example, with a 6-inch E-ink Pearl 800×600 pixel screen with Neonode® zForce® touch screen, or any other suitable display and touch screen interface technology. The communications module can be, for instance, any suitable 802.11b/g/n WLAN chip or chip set, which allows for connection to a local network so that content can be downloaded to the device from a remote location (e.g., content provider, etc., depending on the application of the display device). In some specific example embodiments, the device housing that contains all the various componentry measures about 6.5″ high by about 5″ wide by about 0.5″ thick, and weighs about 6.9 ounces. Any number of suitable form factors can be used, depending on the target application (e.g., laptop, desktop, mobile phone, etc.). The device may be smaller, for example, for smart phone and tablet applications and larger for smart computer monitor and laptop applications.
The operating system (OS) module can be implemented with any suitable OS, but in some example embodiments is implemented with the Google Android OS, Linux OS, Microsoft OS, or Apple OS. In other example embodiments, the OS module may be implemented with any OS that can run multiple applications and has a UI capable of displaying a list (or group) of active applications. The power management (Power) module can be configured as typically done, such as to automatically transition the device to a low power consumption or sleep mode after a period of non-use. A wake-up from that sleep mode can be achieved, for example, by a physical button press and/or a touch screen swipe or other action. The UI module can be, for example, based on touch screen technology, and the various example screen shots and example use-cases shown in FIGS. 1a-d, 3a-f, 4a-c, and 5a-c can be provided through the UI module, in accordance with some such embodiments.
Client-Server System
FIG. 2b illustrates a block diagram of a communication system including the touch sensitive computing device of FIG. 2a, configured in accordance with an embodiment of the present invention. As can be seen, the device is capable of communicating with a server via a network/cloud, which may be a public and/or private network (e.g., a private local area network operatively coupled to a wide area network such as the Internet).
Manage Active Apps Mode Examples
FIGS. 3a-f collectively illustrate an example manage active apps mode pinch and flick input for closing a list of active applications, in accordance with an embodiment of the present invention.
The screen shot in FIG. 3a illustrates an example active apps screen displaying a list of the applications currently active on the device.
FIG. 3b illustrates a pinch gesture used to form a stack of the targeted active applications in the active apps screen (which may be a sub-set of those active apps, as will be appreciated in light of this disclosure). In this example, a user is using two fingers (from the user's hand) to perform a pinch gesture. In some embodiments, the pinch may be initiated on one of the applications in the active apps screen to form the active applications into a stack. In other embodiments, the pinch may be initiated anywhere on the active apps screen to form the active applications into a stack. As shown in FIG. 3c, the active applications are formed into a stack in response to the pinch gesture.
After the stack of the active applications is formed (e.g., as shown in FIG. 3c), a flick gesture can be performed on the stack to perform a function on all of the active applications in the stack.
In some embodiments, the flick gesture may have to reach a certain threshold (e.g., based on speed, distance, etc.) to cause the stack to go off of the screen (e.g., by flicking it or throwing it off). In other embodiments, the user may have to flick, swipe, or drag the stack off of the screen to cause the stack to go off the screen, i.e., the user may have to maintain contact (whether direct or proximate) with the touch sensitive surface until the edge of the touch screen is reached or nearly reached. In some embodiments, the pinch and flick method may have to be performed using one continuous gesture, i.e., without losing contact (whether direct or proximate) with the touch sensitive surface/interface. In the example shown in FIGS. 3d-e, the user performs a flick gesture on the stack to move it off of the touch screen, thereby closing all of the active applications in the stack.
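The threshold and edge conditions described above could, for example, be combined as in the following hypothetical Kotlin sketch, whose specific values are illustrative only.

```kotlin
// Accept a flick on the stack when it is fast enough, travels far enough, or
// is dragged to (or near) an edge of the screen.
fun flickAccepted(
    speedPxPerSec: Float,
    distancePx: Float,
    endX: Float, endY: Float,       // where the gesture ended
    screenW: Float, screenH: Float, // screen dimensions in pixels
    minSpeed: Float = 1000f,
    minDistance: Float = 200f,
    edgeMarginPx: Float = 24f
): Boolean {
    val nearEdge = endX <= edgeMarginPx || endY <= edgeMarginPx ||
            endX >= screenW - edgeMarginPx || endY >= screenH - edgeMarginPx
    return speedPxPerSec >= minSpeed || distancePx >= minDistance || nearEdge
}
```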
After a flick gesture is performed on the stack to perform a function on all of the active applications (e.g., as shown in FIGS. 3d-e), feedback may be provided to indicate that the function was performed (e.g., as shown in FIG. 3f), in accordance with some embodiments.
FIGS. 4a-c collectively illustrate an example manage active apps mode flick gesture where the direction of the flick determines the mode function performed, in accordance with an embodiment of the present invention.
FIGS. 5a-c illustrate an example manage active apps mode separate action using a spread gesture to separate a previously formed stack of active applications, in accordance with an embodiment of the present invention.
Methodology
FIG. 6 illustrates an example method for managing active applications on a touch sensitive computing device, in accordance with one or more embodiments of the present invention. The method generally includes sensing a user's input by a touch sensitive surface. In general, any touch sensitive device/interface may be used to detect contact (whether direct or proximate) with it by one or more fingers and/or styluses or other suitable implements. As soon as the user initiates contact with the touch sensitive surface/interface at one or more contact points, the UI can track the path of each contact point and determine the gesture(s) being performed, including the pinch and flick gestures variously described herein. The release point(s) can also be captured by the UI as they may be used to execute or to stop executing a function or action started when the user initiated contact with the touch sensitive surface (e.g., to form a stack of active applications after a pinch gesture is performed on an active apps screen or to select a flick function as determined by the direction of the flick). These main detections can be used in various ways to implement UI functionality, including a manage active apps mode as variously described herein, as will be appreciated in light of this disclosure.
In this example case, the method includes detecting 601 user contact at the touch sensitive interface. In general, the touch monitoring is effectively continuous. Although the method illustrated in FIG. 6 assumes such continuous monitoring, in some embodiments the monitoring may only be active while an active apps screen is being displayed (e.g., to conserve power and/or memory), as previously described.
If a pinch gesture has been performed while an active apps screen is being displayed, then the method continues with forming 606 the active applications into a stack. In some embodiments, determining 604 if a pinch gesture has been performed may include determining if the pinch gesture was performed on the active apps screen portion of the display, such as when the active apps screen does not take up the entire display area, or determining if the appropriate number of contact points were used in the pinch gesture, for example. Also, recall that a selection of a subset of the displayed active applications can be made as well, and/or some of the displayed apps may be exempt from the pinch and flick app management function. Once a recognizable pinch gesture has been performed to form 606 the targeted active applications into a stack, the method continues by determining 607 if a flick gesture has been performed on the stack. If a flick gesture has not been performed on the stack, the method determines 608 if an action to separate the stack has been performed. Actions used to separate the stack may include, for example, a spread gesture on the stack, a double tap gesture on the stack, or a press-and-hold gesture on the stack. In some embodiments, the manage active apps mode may be configured to separate a previously formed stack of active applications when a user exits or leaves the active apps screen (i.e., when the active apps screen is no longer being displayed). In some such embodiments, any action that causes the active apps screen to be exited or left can also cause the stack to be separated. In other embodiments, the stack of active applications may remain stacked even if the active apps screen was exited or left. As previously described, in some such embodiments, the stack may or may not incorporate newly activated applications after it is formed, based on the configuration of the manage active apps mode.
If an action to separate the stack has been performed, the method continues back at step 605 by determining if an active apps screen is still being displayed. If an action to separate the stack has not been performed, the method continues to review 607 for a flick gesture performed on the stack until either a flick gesture has been performed on the stack or until the stack of active applications has been separated (e.g., through an action that separates the stack such as a spread gesture performed on the stack). If a flick gesture has been performed on a stack of active applications, then the method continues with performing 609 a function on all of the active applications in the stack. In some embodiments, determining 607 if a flick gesture has been performed may include determining if the flick gesture exceeds a certain speed or distance threshold, determining if the flick gesture caused the stack of active applications to go off of the screen or display, and/or determining if the appropriate number of flick contact points were used, for example. In some embodiments, determining 607 if a flick gesture has been performed may include determining if contact was maintained from the pinch gesture such that the pinch and flick were performed as one continuous gesture. As previously described, the function performed on all of the active applications in the flicked stack may include closing, stopping, force stopping, quitting, or deleting the active applications, for example. In some embodiments, characteristics of the flick gesture, such as the direction of the flick gesture, may determine the function performed. The function performed in response to a flick gesture performed on a stack of active applications may be user-configurable (e.g., see FIGS. 1c-d), hard-coded, or some combination thereof.
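The overall flow of FIG. 6 can be viewed as a small state machine, sketched below in Kotlin for illustration only (the event names and structure are hypothetical): a pinch over the active apps screen forms the stack, a flick performs the configured function, and a separate action (or leaving the screen) restores the original list.

```kotlin
// Events roughly corresponding to the decision points of FIG. 6.
sealed interface Event
object PinchOnActiveAppsScreen : Event
data class FlickOnStack(val direction: String) : Event
object SeparateStack : Event // spread, double tap, press-and-hold, or screen exit
object OtherUiRequest : Event

enum class State { MONITORING, STACKED }

// Minimal two-state machine: a pinch forms the stack; a flick performs the
// configured function; a separate action restores the original list.
class ManageActiveAppsMode(private val perform: (String) -> Unit) {
    var state: State = State.MONITORING
        private set

    fun onEvent(e: Event) {
        state = when (state) {
            State.MONITORING ->
                if (e is PinchOnActiveAppsScreen) State.STACKED else State.MONITORING
            State.STACKED -> when (e) {
                is FlickOnStack -> { perform(e.direction); State.MONITORING }
                SeparateStack -> State.MONITORING // stack restored to the list
                else -> State.STACKED             // keep reviewing for a flick
            }
        }
    }
}
```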
The example method shown in FIG. 6 may also include providing feedback (e.g., visual, auditory, and/or tactile) after the function has been performed on the active applications in the stack, depending on how the manage active apps mode is configured, as previously described.
Regardless of whether the manage active apps mode is configured to provide feedback, the method of this example embodiment continues with a default action 612, such as displaying the device's home screen or doing nothing until further user contact/input. Likewise, the received contact can be reviewed for some other UI request, as done at 603. The method may continue in the touch monitoring mode indefinitely or as otherwise desired, so that any contact provided by the user when an active apps screen is displayed can be evaluated for use in the manage active apps mode, if appropriate. As previously described, the manage active apps mode may be user profile specific, such that it is only available, enabled, and/or active when certain user profiles are being used. In addition, the manage active apps mode may have different configurations for different user profiles, particularly where the manage active apps mode is user-configurable. In some embodiments, the manage active apps mode may only be available, enabled, and/or active when an active apps screen is displayed (e.g., when multiple active applications are displayed in a list, group, menu, or some other suitable format). In this manner, power and/or memory may be conserved since the manage active apps mode may only run or otherwise be available when an active apps screen is displayed.
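A hypothetical guard implementing such profile-specific, screen-specific availability is sketched below in Kotlin for illustration only.

```kotlin
// The mode is only available for enabled user profiles and only while an
// active apps screen is displayed, so no monitoring cost is paid elsewhere.
class ModeAvailability(private val enabledProfiles: Set<String>) {
    fun isAvailable(profile: String, activeAppsScreenShown: Boolean): Boolean =
        activeAppsScreenShown && profile in enabledProfiles
}
```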
Numerous variations and embodiments will be apparent in light of this disclosure. One example embodiment of the present invention provides a device including a display for displaying content to a user, a touch sensitive surface for allowing user input, and a user interface. The user interface includes the ability to interact with multiple applications, wherein a pinch gesture performed on the touch sensitive surface forms a stack of active applications and a flick gesture on the touch sensitive surface performs a function on all of the active applications in the stack. In some cases, the flick gesture on the touch sensitive surface performs one of closing, stopping, force stopping, quitting, and deleting all of the active applications in the stack. In some cases, the display is a touch screen display that includes the touch sensitive surface. In some cases, the flick gesture causes the stack to go off of the display. In some cases, the direction of the flick gesture determines the function performed with respect to all of the active applications in the stack. In some cases, at least one of a spread gesture, double tap gesture, and press-and-hold gesture performed on the touch sensitive surface separates the stack into the original display of active applications. In some cases, feedback is provided after the flick gesture to indicate that the function has been performed, the feedback being visual, auditory, and/or tactile. In some cases, the pinch gesture and flick gesture are made using one continuous gesture. In some cases, a flick gesture performed on a single active application performs one of closing, stopping, force stopping, quitting, and deleting the single active application.
Another example embodiment of the present invention provides a mobile computing device including a display having a touch screen interface and for displaying content to a user, and a user interface. The user interface includes a manage active apps mode that can be invoked in response to user input via the touch sensitive surface. The user input includes a pinch gesture performed on an active apps screen that displays active applications (wherein the pinch gesture causes the active applications to form into a stack) and a flick gesture performed on the stack, wherein the manage active apps mode is configured to perform one of a close, stop, force stop, quit, and delete function on all of the active applications in the stack when invoked. In some cases, the function performed is determined by the direction of the flick gesture. In some cases, the flick gesture includes dragging the stack until a portion of the stack is off of the active apps screen. In some cases, the stack is separated into the original display of active applications when the active apps screen is exited. In some cases, the manage active apps mode is user-configurable.
Another example embodiment of the present invention provides a computer program product including a plurality of instructions non-transiently encoded thereon to facilitate operation of an electronic device according to a process. The computer program product may include one or more computer readable mediums such as, for example, a hard drive, compact disk, memory stick, server, cache memory, register memory, random access memory, read only memory, flash memory, or any suitable non-transitory memory that is encoded with instructions that can be executed by one or more processors, or a plurality or combination of such memories. In this example embodiment, the process is configured to form a stack of active applications in response to a first user input via a touch sensitive interface of a device capable of displaying content (wherein the first user input includes a pinch gesture performed on the touch sensitive surface), and perform a function on the active applications in the stack in response to a second user input via the touch sensitive interface (wherein the second user input includes a flick gesture). In some cases, the function invoked is one of closing, stopping, force stopping, quitting, and deleting all of the active applications in the stack. In some cases, the direction of the flick gesture determines the function performed. In some cases, at least one of a spread gesture, double tap gesture, and press-and-hold gesture performed on the stack of active applications separates the stack into the original display of active applications. In some cases, the touch sensitive surface is a touch screen display. In some cases, feedback is provided after the flick gesture to indicate that the function was performed, the feedback being visual, auditory, and/or tactile.
The foregoing description of the embodiments of the invention has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many modifications and variations are possible in light of this disclosure. It is intended that the scope of the invention be limited not by this detailed description, but rather by the claims appended hereto.