Radial menus with bezel gestures

Information

  • Patent Grant
  • 10268367
  • Patent Number
    10,268,367
  • Date Filed
    Friday, June 10, 2016
  • Date Issued
    Tuesday, April 23, 2019
Abstract
Bezel gestures for touch displays are described. In at least some embodiments, the bezel of a device is used to extend functionality that is accessible through the use of so-called bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create screen input through a bezel gesture. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger, different-hand bezel gestures.
Description
BACKGROUND

One of the challenges that continues to face designers of devices having user-engageable displays, such as touch displays, pertains to providing enhanced functionality for users, without necessarily permanently manifesting the functionality as part of the “chrome” of a device's user interface. This is so, not only with devices having larger or multiple screens, but also in the context of devices having a smaller footprint, such as tablet PCs, hand-held devices, smaller multi-screen devices and the like.


SUMMARY

This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.


Bezel gestures for touch displays are described. In at least some embodiments, the bezel of a device is used to extend functionality that is accessible through the use of so-called bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create screen input through a bezel gesture. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger, different-hand bezel gestures.





BRIEF DESCRIPTION OF THE DRAWINGS

The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items.



FIG. 1 is an illustration of an environment in an example implementation in accordance with one or more embodiments.



FIG. 2 is an illustration of a system in an example implementation showing FIG. 1 in greater detail.



FIG. 3 illustrates an example computing device in accordance with one or more embodiments.



FIG. 4 is a flow diagram that describes the steps in a method in accordance with one or more embodiments.



FIG. 5 is a flow diagram that describes the steps in a method in accordance with one or more embodiments.



FIG. 6 illustrates an example computing device in accordance with one or more embodiments.



FIG. 7 illustrates an example computing device in accordance with one or more embodiments.



FIG. 8 illustrates an example computing device in accordance with one or more embodiments.



FIG. 9 illustrates an example computing device in accordance with one or more embodiments.



FIG. 10 is a flow diagram that describes the steps in a method in accordance with one or more embodiments.



FIG. 11 is a flow diagram that describes the steps in a method in accordance with one or more embodiments.



FIG. 12 illustrates an example computing device in accordance with one or more embodiments.



FIG. 13 illustrates an example computing device in accordance with one or more embodiments.



FIG. 14 illustrates an example computing device in accordance with one or more embodiments.



FIG. 15 is a flow diagram that describes the steps in a method in accordance with one or more embodiments.



FIG. 16 is a flow diagram that describes the steps in a method in accordance with one or more embodiments.



FIG. 17 illustrates an example computing device in accordance with one or more embodiments.



FIG. 18 is a flow diagram that describes the steps in a method in accordance with one or more embodiments.



FIG. 19 illustrates an example computing device in accordance with one or more embodiments.



FIG. 20 is a flow diagram that describes the steps in a method in accordance with one or more embodiments.



FIG. 21 illustrates an example computing device in accordance with one or more embodiments.



FIG. 22 illustrates an example computing device in accordance with one or more embodiments.



FIG. 23 illustrates an example computing device in accordance with one or more embodiments.



FIG. 24 illustrates an example computing device in accordance with one or more embodiments.



FIG. 25 is a flow diagram that describes the steps in a method in accordance with one or more embodiments.



FIG. 26 is a flow diagram that describes the steps in a method in accordance with one or more embodiments.



FIG. 27 illustrates an example computing device in accordance with one or more embodiments.



FIG. 28 illustrates an example computing device in accordance with one or more embodiments.



FIG. 29 illustrates an example computing device in accordance with one or more embodiments.



FIG. 30 is a flow diagram that describes the steps in a method in accordance with one or more embodiments.



FIG. 31 is a flow diagram that describes the steps in a method in accordance with one or more embodiments.



FIG. 32 is a flow diagram that describes the steps in a method in accordance with one or more embodiments.



FIG. 33 illustrates an example computing device that can be utilized to implement various embodiments described herein.





DETAILED DESCRIPTION

Overview


Bezel gestures for touch displays are described. In at least some embodiments, the bezel of a device is used to extend functionality that is accessible through the use of so-called bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create screen input through a bezel gesture. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger, different-hand bezel gestures.


In the following discussion, a variety of different implementations are described that involve bezel gestures, or gestures associated with bezel gestures, to initiate and/or implement functions on a computing device. In this way, a user may readily access enhanced functions of a computing device in an efficient and intuitive manner.


In the following discussion, an example environment is first described that is operable to employ the gesture techniques described herein. Example illustrations of the gestures and procedures are then described, which may be employed in the example environment, as well as in other environments. Accordingly, the example environment is not limited to performing the example gestures and the gestures are not limited to implementation in the example environment.


Example Environment



FIG. 1 is an illustration of an environment 100 in an example implementation that is operable to employ bezel gestures and other techniques described herein. The illustrated environment 100 includes an example of a computing device 102 that may be configured in a variety of ways. For example, the computing device 102 may be configured as a traditional computer (e.g., a desktop personal computer, laptop computer, and so on), a mobile station, an entertainment appliance, a set-top box communicatively coupled to a television, a wireless phone, a netbook, a game console, a handheld device, and so forth as further described in relation to FIG. 2. Thus, the computing device 102 may range from full-resource devices with substantial memory and processor resources (e.g., personal computers, game consoles) to low-resource devices with limited memory and/or processing resources (e.g., traditional set-top boxes, hand-held game consoles). The computing device 102 also includes software that causes the computing device 102 to perform one or more operations as described below.


Computing device 102 includes a bezel 103 that forms part of the device's housing. The bezel is made up of the frame structure adjacent to the device's display, also referred to as display device 108 below. Computing device 102 includes a gesture module 104 and a bezel gesture module 105 that forms part of the gesture module 104. The gesture modules can be implemented in connection with any suitable type of hardware, software, firmware, or combination thereof. In at least some embodiments, the gesture modules are implemented in software that resides on some type of tangible, computer-readable medium, examples of which are provided below.


Gesture module 104 and bezel gesture module 105 are representative of functionality that recognizes gestures and bezel gestures, respectively, and causes operations to be performed that correspond to the gestures. The gestures may be recognized by modules 104, 105 in a variety of different ways. For example, the gesture module 104 may be configured to recognize a touch input, such as a finger of a user's hand 106a as proximal to display device 108 of the computing device 102 using touchscreen functionality. In addition, bezel gesture module 105 can be configured to recognize a touch input, such as a finger of a user's hand 106b, that initiates a gesture on or adjacent bezel 103 and proceeds onto display device 108. Any suitable technology can be utilized to sense an input on or adjacent bezel 103. For example, in at least some embodiments, the digitizer or sensing elements associated with display device 108 can extend underneath bezel 103. In this instance, technologies such as capacitive field technologies, as well as others, can be utilized to sense the user's input on or adjacent to the bezel 103.


Alternately or additionally, in embodiments in which display device 108 does not extend underneath bezel 103, but rather lies flush with the bezel, bezel gesture module 105 can detect the changing contact profile of the user's finger as it emerges onto display device 108 from bezel 103. Alternately or additionally, approaches that utilize the centroid of the user's touch profile can be utilized to detect a changing centroid contact profile that is suggestive of a bezel gesture. Further, techniques for fingerprint sensing can be employed. Specifically, if the sensing substrate is sensitive enough to determine ridges of the finger or fingers contacting the display, then the orientation of the finger(s) as well as the fact that the fingerprint is clipped by the bezel can be detected. Needless to say, any number of different techniques can be utilized to sense a user's input relative to the bezel 103. The touch input may also be recognized as including attributes (e.g., movement, selection point, etc.) that are usable to differentiate the touch input from other touch inputs recognized by the gesture modules 104, 105. This differentiation may then serve as a basis to identify a gesture from the touch inputs and consequently an operation that is to be performed based on identification of the gesture. This yields the general benefit that gestures that start from the bezel and enter onto the screen are, in general, distinguishable from other ostensibly similar gestures that access on-screen content, since there is no reason for users to position their fingers starting partially or fully off-screen if their intent is to interact with something on the screen. Hence, normal direct manipulative gestures, even for objects near the screen boundaries, are still possible and do not interfere with bezel gestures and vice versa.
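To make this distinction concrete, the following is a minimal sketch, not drawn from the patent itself, of how a recognizer might classify a stroke as a bezel gesture when the contact begins on or adjacent to the bezel and then proceeds onto the display. The screen dimensions, margin width, and label strings are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class TouchSample:
    x: float
    y: float
    t: float  # seconds since the stroke began

SCREEN_W, SCREEN_H = 1280, 800   # assumed display size, in pixels
BEZEL_MARGIN = 12                # assumed width of the sensed band at each edge

def start_edge(sample):
    """Return the edge (if any) that a contact lies on or adjacent to."""
    if sample.x <= BEZEL_MARGIN:
        return "left"
    if sample.x >= SCREEN_W - BEZEL_MARGIN:
        return "right"
    if sample.y <= BEZEL_MARGIN:
        return "top"
    if sample.y >= SCREEN_H - BEZEL_MARGIN:
        return "bottom"
    return None

def classify(samples):
    """Label a stroke as a bezel gesture only if it starts at an edge and then
    travels onto the display; otherwise treat it as an ordinary on-screen gesture."""
    if not samples:
        return "none"
    edge = start_edge(samples[0])
    moved_onto_screen = any(start_edge(s) is None for s in samples[1:])
    if edge and moved_onto_screen:
        return "bezel-gesture-from-" + edge
    return "on-screen-gesture"

# Example: a drag that begins on the left bezel band and enters the canvas.
stroke = [TouchSample(4, 300, 0.00), TouchSample(60, 300, 0.05), TouchSample(200, 300, 0.10)]
print(classify(stroke))  # bezel-gesture-from-left
```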


For example, a finger of the user's hand 106a is illustrated as selecting 110 an image 112 displayed by the display device 108. Selection 110 of the image 112 and subsequent movement of the finger of the user's hand 106a may be recognized by the gesture module 104. The gesture module 104 may then identify this recognized movement as indicating a “drag and drop” operation to change a location of the image 112 to a point in the display at which the finger of the user's hand 106a was lifted away from the display device 108. Thus, recognition of the touch input that describes selection of the image, movement of the selection point to another location, and then lifting of the finger of the user's hand 106a may be used to identify a gesture (e.g., drag-and-drop gesture) that is to initiate the drag-and-drop operation.


A variety of different types of gestures may be recognized by the gesture modules 104, 105 such as gestures that are recognized from a single type of input (e.g., touch gestures such as the previously described drag-and-drop gesture) as well as gestures involving multiple types of inputs. For example, modules 104, 105 can be utilized to recognize single-finger gestures and bezel gestures, multiple-finger/same-hand gestures and bezel gestures, and/or multiple-finger/different-hand gestures and bezel gestures.


For example, the computing device 102 may be configured to detect and differentiate between a touch input (e.g., provided by one or more fingers of the user's hand 106a, 106b) and a stylus input (e.g., provided by a stylus 116). The differentiation may be performed in a variety of ways, such as by detecting an amount of the display device 108 that is contacted by the finger of the user's hand 106 versus an amount of the display device 108 that is contacted by the stylus 116.
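As a rough illustration of one such differentiation, the sketch below (an assumed heuristic, not the patent's implementation) separates stylus contacts from finger contacts by comparing the reported contact area against a threshold.

```python
STYLUS_MAX_AREA_MM2 = 4.0   # assumed cutoff: a stylus tip covers far less area than a fingertip

def input_type(contact_area_mm2):
    """Classify a contact as stylus or touch from its reported contact area."""
    return "stylus" if contact_area_mm2 <= STYLUS_MAX_AREA_MM2 else "touch"

print(input_type(2.5))   # stylus
print(input_type(80.0))  # touch
```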


Thus, the gesture modules 104, 105 may support a variety of different gesture techniques through recognition and leverage of a division between stylus and touch inputs, as well as different types of touch inputs.


Accordingly, the gesture modules 104, 105 may support a variety of different gestures. Examples of gestures described herein include a single-finger gesture 118, a single-finger bezel gesture 120, a multiple-finger/same-hand gesture 122, a multiple-finger/same-hand bezel gesture 124, a multiple-finger/different-hand gesture 126, and a multiple-finger/different-hand bezel gesture 128. Each of these different types of bezel gestures is described below.



FIG. 2 illustrates an example system showing the gesture module 104 and bezel gesture module 105 of FIG. 1 as being implemented in an environment where multiple devices are interconnected through a central computing device. The central computing device may be local to the multiple devices or may be located remotely from the multiple devices. In one embodiment, the central computing device is a “cloud” server farm, which comprises one or more server computers that are connected to the multiple devices through a network or the Internet or other means.


In one embodiment, this interconnection architecture enables functionality to be delivered across multiple devices to provide a common and seamless experience to the user of the multiple devices. Each of the multiple devices may have different physical requirements and capabilities, and the central computing device uses a platform to enable the delivery of an experience to the device that is both tailored to the device and yet common to all devices. In one embodiment, a “class” of target device is created and experiences are tailored to the generic class of devices. A class of device may be defined by physical features or usage or other common characteristics of the devices. For example, as previously described, the computing device 102 may be configured in a variety of different ways, such as for mobile 202, computer 204, and television 206 uses. Each of these configurations has a generally corresponding screen size and thus the computing device 102 may be configured as one of these device classes in this example system 200. For instance, the computing device 102 may assume the mobile 202 class of device which includes mobile telephones, music players, game devices, and so on. The computing device 102 may also assume a computer 204 class of device that includes personal computers, laptop computers, netbooks, and so on. The television 206 configuration includes configurations of devices that involve display in a casual environment, e.g., televisions, set-top boxes, game consoles, and so on. Thus, the techniques described herein may be supported by these various configurations of the computing device 102 and are not limited to the specific examples described in the following sections.


Cloud 208 is illustrated as including a platform 210 for web services 212. The platform 210 abstracts underlying functionality of hardware (e.g., servers) and software resources of the cloud 208 and thus may act as a “cloud operating system.” For example, the platform 210 may abstract resources to connect the computing device 102 with other computing devices. The platform 210 may also serve to abstract scaling of resources to provide a corresponding level of scale to encountered demand for the web services 212 that are implemented via the platform 210. A variety of other examples are also contemplated, such as load balancing of servers in a server farm, protection against malicious parties (e.g., spam, viruses, and other malware), and so on.


Thus, the cloud 208 is included as a part of the strategy that pertains to software and hardware resources that are made available to the computing device 102 via the Internet or other networks. For example, the gesture modules 104, 105 may be implemented in part on the computing device 102 as well as via a platform 210 that supports web services 212.


For example, the gesture techniques supported by the gesture modules may be detected using touchscreen functionality in the mobile configuration 202, track pad functionality of the computer 204 configuration, by a camera as part of support of a natural user interface (NUI) that does not involve contact with a specific input device, and so on. Further, performance of the operations to detect and recognize the inputs to identify a particular gesture may be distributed throughout the system 200, such as by the computing device 102 and/or the web services 212 supported by the platform 210 of the cloud 208.


Generally, any of the functions described herein can be implemented using software, firmware, hardware (e.g., fixed logic circuitry), manual processing, or a combination of these implementations. The terms “module,” “functionality,” and “logic” as used herein generally represent software, firmware, hardware, or a combination thereof. In the case of a software implementation, the module, functionality, or logic represents program code that performs specified tasks when executed on a processor (e.g., CPU or CPUs). The program code can be stored in one or more computer readable memory devices. The features of the gesture techniques described below are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.


In the discussion that follows, various sections describe example bezel gestures and gestures associated with bezel gestures. A first section entitled “Use of Bezel as an Input Mechanism” describes embodiments in which a computing device's bezel can be used as an input mechanism. Following this, a section entitled “Using Off-Screen Motion to Create On-Screen Input” describes how a motion away from a device's screen can be utilized, through gestures, to create on-screen input. Next, a section entitled “Use of Multiple Fingers for Gesturing” describes how multiple fingers can be utilized to provide gestural input. Following this section, a section entitled “Radial Menus” describes embodiments in which radial menus can be utilized to provide a robust collection of input options. Next, a section entitled “On and Off Screen Gestures and Combinations—Page/Object Manipulation” describes various types of gestures and combinations that can be utilized to manipulate pages and/or objects. Last, a section entitled “Example Device” describes aspects of an example device that can be utilized to implement one or more embodiments.


Use of Bezel as an Input Mechanism


In one or more embodiments, the bezel of a device can be utilized as an input mechanism. For example, in instances in which the display device extends under the bezel, a user's finger or other input mechanism can be sensed when it hovers over or physically engages the bezel. Alternately or additionally, the bezel can include sensing mechanisms, such as infrared mechanisms as well as others, that sense a user's finger or other input mechanism hovering over or physically engaging the bezel. Any combination of inputs relative to the bezel can be used. For example, to provide various inputs to the device, the bezel can be tapped one or more times, held, slid over, hovered over and/or any combination of these or other inputs.


As an example, consider the following. Many selection, manipulation, and context menu activation schemes utilize a distinction between a device's background canvas and objects that appear on the canvas. Using the bezel as an input mechanism can provide a way to access a page in the background canvas, even if the page itself is covered by many closely-spaced objects. For example, tapping on the bezel may provide a mechanism to deselect all objects. Holding on the bezel could be used to trigger a context menu on the page. As an example, consider FIG. 3, which illustrates an example environment 300 that includes a computing device 302 having a bezel 303 and a display device 308. In this instance, a finger of the user's hand 306a is tapping on bezel 303. By tapping on the bezel, the user's input is sensed and an associated functionality that is mapped to the input can be provided. In the above example, such functionality might deselect all objects appearing on display device 308. In addition, input can be received at different locations on the bezel and can be mapped to different functionality. For example, input received on the right side of the bezel might be mapped to a first functionality; input received on the left side of the bezel might be mapped to a second functionality, and so on. Furthermore, input received in different regions of a bezel side might be mapped to different functionality, or to no functionality at all, depending on the orientation of the device and how the user is holding it. Some bezel edges may be left unassigned or may be insensitive to touch-and-hold, so that inadvertent operations will not be triggered. Thus, any one particular side of the bezel might be utilized to receive input and, accordingly, map that input to different functionality depending on what region of the bezel receives the input. It is to be appreciated and understood that input received via the bezel can be received independent of any input received via hardware input devices, such as buttons, track balls, and other instrumentalities that might be located on an associated device. Further, in at least some embodiments, input received via the bezel can be the only user input that is utilized to ascertain and access a particular functionality. For example, input received solely on the bezel can provide the basis by which device functionality can be accessed. Further, in some embodiments, orientation sensors (e.g. accelerometers) may be used as an input to help decide which bezel edges are active. In some embodiments, a quick, intentional tap remains available, and only touch-and-hold is ignored, so as to differentiate deliberate input from simply holding the device with a finger that happens to be resting on the bezel.
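The following sketch illustrates one way such a mapping could be organized; the edge names, handlers, and orientation rule are invented here for illustration and are not taken from the patent.

```python
def deselect_all():
    print("deselect all objects on the canvas")

def show_page_context_menu():
    print("show the context menu for the page")

# Assumed mapping from (edge, input kind) to a handler; unlisted pairs do nothing.
BEZEL_MAP = {
    ("right", "tap"): deselect_all,
    ("right", "hold"): show_page_context_menu,
}

def active_edges(orientation):
    """Use an orientation reading to decide which bezel edges respond at all,
    e.g. leaving edges under the gripping hand unassigned."""
    return {"right", "top"} if orientation == "landscape" else {"right"}

def on_bezel_input(edge, kind, orientation):
    if edge not in active_edges(orientation):
        return  # inactive edge: a resting finger triggers nothing
    handler = BEZEL_MAP.get((edge, kind))
    if handler is not None:
        handler()

on_bezel_input("right", "tap", "portrait")   # deselect all objects on the canvas
on_bezel_input("left", "hold", "portrait")   # ignored: inactive edge
```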


Alternately or additionally, in at least some embodiments, a visual affordance can be utilized to provide a hint or indication of accessible functionality associated with the bezel. Specifically, a visual affordance can be utilized to indicate functionality that is accessible by virtue of a bezel gesture. Any suitable type of visual affordance can be utilized. As an example, consider again FIG. 3. There, a visual affordance in the form of a semi-transparent strip 304 provides an indication that additional functionality can be accessed through utilization of a bezel gesture. The visual affordance can take any suitable form and can be located at any suitable location on display device 308. Furthermore, the visual affordance can be exposed in any suitable way. For example, in at least some embodiments, input received via the bezel can be used to expose or display the visual affordance. Specifically, in at least some embodiments, a “peek out” visual affordance can be presented responsive to detecting a hover over, or a physical engagement of the device's bezel. The “peek out” visual affordance can, in at least some embodiments, be deselected by the user such that the “peek out” is hidden.


In this particular example, the additional functionality associated with semi-transparent strip 304 resides in the form of a so-called bezel menu which is accessible using a bezel gesture. Specifically, in one or more embodiments, the bezel menu can be accessed through a gesture in which a finger of user's hand 306b touches the bezel and then moves across the bezel and onto the display device 308 in the direction of the illustrated arrow. This can allow the bezel menu to be dropped down as will be described in more detail below.


Accordingly, various embodiments can use the bezel itself as an input mechanism, as in the first example above. Alternately or additionally, various other embodiments can use the bezel in connection with a visual affordance that provides a clue to the user that additional functionality can be accessed by virtue of a bezel gesture.



FIG. 4 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system such as those systems that are described above and below.


Step 400 receives an input associated with a bezel. Any suitable type of input can be received, examples of which are provided above. Step 402 accesses functionality associated with the received input. Any suitable type of functionality can be accessed. By virtue of providing a variety of different types of recognizable inputs (e.g., taps, tap combinations, tap/hold combinations, slides, etc.), and mapping those recognizable inputs to different types of functionalities, a robust collection of user input mechanisms can be provided.



FIG. 5 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system such as those systems that are described above and below.


Step 500 displays a visual affordance on a display device associated with a computing device. Any suitable type of visual affordance can be utilized, an example of which is provided above. Step 502 receives a bezel gesture input relative to the visual affordance. Any suitable type of bezel gesture input can be utilized. Step 504 accesses functionality associated with the received bezel gesture input. Any suitable type of functionality can be accessed, an example of which is provided above and described in more detail below.


Having considered examples in which the bezel can be used as an input mechanism, consider now various embodiments that can utilize off-screen or off-display motion to create screen or display input.


Using Off-Screen Motion to Create On-Screen Input


In at least some embodiments, off-screen to on-screen motion (or vice versa) can be utilized as a mechanism to expose a menu or to access some other type of functionality. The off-screen motion or input can be provided, as indicated above, relative to the device's bezel. Any suitable type of bezel gesture input can be provided in order to effectuate the off-screen to on-screen motion. For example, bezel gestures or inputs can, by way of example and not limitation, start or end on the bezel, cross or recross the bezel, cross at different locations of the bezel (e.g., the corners, or particular ranges of coordinates along a particular edge), and/or occur on one or more bezels associated with multiple screens (with the possibility of different semantics depending on the screen or edge thereof). Further, bezel inputs can include, by way of example and not limitation, a single-contact drag (finger or pen), two-contact drag (two fingers), and/or a hand-contact drag (multiple fingers/whole hand/multiple or single fingers on different hands). For example, pinch gestures from off-screen space (i.e. originating on the bezel) can be utilized and mapped to different functionalities. For example, bezel gestures with multiple contacts entering from different edges of the screen can have different semantics. Specifically, two fingers entering from adjacent edges of the bezel (i.e. spanning a corner) might be mapped to a zoom out operation that zooms out on a page to show an extended workspace or canvas. Two fingers entering from opposite edges, with either one hand (if the screen is small enough), or two hands (one finger from each hand) can be mapped to a different functionality. Multiple fingers entering on one edge of the bezel and one finger entering from an adjacent or opposite edge of the bezel might be mapped to a different functionality. Additionally, multiple fingers entering from two or more edges can further be mapped to additional functionality.
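As a concrete illustration of these mappings, the sketch below (with assumed operation names) resolves a multi-contact bezel gesture by looking at the set of edges from which the contacts entered the screen.

```python
ADJACENT = {frozenset(p) for p in [("top", "left"), ("top", "right"),
                                   ("bottom", "left"), ("bottom", "right")]}
OPPOSITE = {frozenset(("left", "right")), frozenset(("top", "bottom"))}

def resolve_multi_contact(entry_edges):
    """entry_edges holds, per contact, the bezel edge the contact crossed in from."""
    edges = frozenset(entry_edges)
    if len(entry_edges) == 2 and edges in ADJACENT:
        return "zoom-out-to-extended-canvas"    # two fingers spanning a corner
    if len(entry_edges) == 2 and edges in OPPOSITE:
        return "opposite-edge-operation"        # one hand, or one finger from each hand
    if len(edges) == 1 and len(entry_edges) > 1:
        return "same-edge-multi-finger-operation"
    if len(entry_edges) == 1:
        return "single-finger-bezel-gesture"
    return "other-multi-edge-operation"

print(resolve_multi_contact(["top", "left"]))    # zoom-out-to-extended-canvas
print(resolve_multi_contact(["left", "right"]))  # opposite-edge-operation
```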


As another example, consider FIG. 6. There, device 602 includes a bezel 603 and a visual affordance 604 that is rendered on display device 608. As noted above, visual affordance 604, in the form of a semi-transparent strip, can be utilized to provide a hint or indication of accessible functionality, in this case a bezel menu, associated with the bezel.


In one or more embodiments, the bezel menu can be accessed through a bezel gesture in which a finger of user's hand 606 touches the bezel and then moves across the bezel and onto the display device 608 in the direction of the illustrated arrow. This can allow bezel menu 610 to be dropped down at which time it can become fully opaque.


In the illustrated and described embodiment, bezel menu 610 includes multiple selectable icons or slots 612, 614, 616, 618, and 620. Each of the icons or slots is associated with a different functionality such as, for example, paint functionality, pen functionality, note functionality, object creation, object editing, and the like. It is to be appreciated and understood, that any type of functionality can be associated with the icons or slots.


In the illustrated and described environment, bezel menu 610 can enable a user to access and activate commands, tools, and objects. The bezel menu can be configured to respond to both touch input and pen input. Alternately or additionally, the bezel menu can be configured to respond only to touch input.


In at least some embodiments, different gestural modes can be utilized to access functionality associated with the bezel menu 610. For example, one gestural mode can be a novice mode, and another gestural mode can be an expert mode.


In the novice mode, after the user gestures to reveal the bezel menu 610, the user can lift their finger, whereupon the bezel menu can remain open for a configurable interval (or indefinitely). The user may then tap on a desired item associated with one of the icons or slots 612, 614, 616, 618, and 620. Through this gesture, the functionality associated with a particular icon or slot can be accessed. For example, tapping on a particular icon or slot may cause an object to be created on the canvas associated with display device 608. In at least some embodiments, in the novice mode, objects that are accessed from the bezel menu appear in default locations on the canvas. The user may close the bezel menu by sliding it back off of the screen (an on-screen-to-offscreen gesture) or by tapping outside of the bezel menu, without activating any function.


In the expert mode, once the user is familiar with the location of commonly used items accessible from the bezel menu, the user can perform a continuous finger-drag that crosses through the slot or icon and onto the canvas to create and drag an associated object (or tool, or interface mode) to a specific desired position or path, in a single transaction. The user can then let go of the object and interact with it. As an example, consider FIG. 7. There, the user has performed a bezel gesture that has dragged across icon or slot 614 to access functionality associated with a post-it note and has positioned the corresponding note on the canvas as indicated. At this point, the user can lift a finger and annotate the digital post-it as desired using an associated pen. In at least some embodiments, the bezel menu 610 may or may not remain fully open after a particular functionality has been accessed.


In at least some other embodiments, in the expert mode, the bezel menu may not necessarily be revealed at all in order to access functionality associated with an icon or slot. Rather, a bezel gesture that crosses the visual affordance at a location that corresponds to a particular icon or slot may access functionality associated with the icon or slot. As an example, consider FIG. 8. There, visual affordance 604 is illustrated. Notice that the bezel gesture crosses over a portion of the visual affordance that corresponds to icon or slot 614 (FIG. 7). Notice also that by virtue of this bezel gesture, a corresponding post-it note has been accessed. This feature can be implemented by using a time delay, e.g. ⅓ second, and considering the location of the user's finger before actually deciding whether to deploy the bezel menu responsive to a bezel gesture. The idea here is that the bezel menu stays hidden unless the user pauses, or just pulls out the menu, without completing a drag-off of the desired item. This is accomplished using a time delay before the bezel menu starts to slide out. Hence, once users are familiar with a particular operation on the bezel menu, they can rapidly drag through it to create and position an object without ever having to be distracted by the opening of the visual menu itself. This can encourage expert performance based on ballistic motion driven by procedural memory, rather than visually guided performance based on direct manipulation of a widget. The concept succeeds because the novice way of using it helps to learn and encourage the expert way of working with it.


As but one example of how this can work in accordance with one embodiment, consider the following. When the finger is observed to cross from the screen bezel into a slot of the bezel menu, a timer is started. No other immediate visual feedback occurs. When the timer expires, if the finger is still in the region occupied by the bezel menu, the bezel menu slides out and tracks with the user's finger. When the user's finger lifts inside the bezel menu area, it stays posted. This is the novice mode described above. The user can lift a finger to inspect all slots, and tap on the desired one to create the desired object (rather than dragging it). The user can also touch down and drag an item onto the canvas from the novice mode. If the finger has slid past a threshold distance or region, then the bezel menu remains closed but the function indicated by the slot that was crossed is activated, e.g. a post-it is created and starts following the user's finger. This is the expert mode described above. An implementation consideration is that the slot that is selected by the expert mode gesture can be determined by the location at which the finger crosses the screen edge.
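A condensed sketch of this timing logic follows; the delay, the distance threshold, and the returned strings are illustrative assumptions rather than values taken from the patent.

```python
MENU_DELAY_S = 0.33        # roughly the 1/3-second delay mentioned above
EXPERT_DISTANCE_PX = 120   # assumed drag distance that commits to the expert path

def bezel_menu_decision(crossed_slot, dwell_s, drag_px, lifted_in_menu):
    """Decide what a drag that crossed from the bezel into a menu slot should do."""
    if drag_px >= EXPERT_DISTANCE_PX:
        # Expert mode: the menu never opens; the crossed slot's object is
        # created and starts following the finger.
        return "create '%s' and drag it; menu stays hidden" % crossed_slot
    if dwell_s >= MENU_DELAY_S:
        # Timer expired while the finger was still over the menu region.
        if lifted_in_menu:
            return "menu posted; wait for a tap on a slot (novice mode)"
        return "menu sliding out and tracking the finger"
    return "no visual feedback yet; timer still running"

print(bezel_menu_decision("post-it", dwell_s=0.1, drag_px=200, lifted_in_menu=False))
print(bezel_menu_decision("post-it", dwell_s=0.5, drag_px=40, lifted_in_menu=True))
```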


In at least some embodiments, the bezel menu can be scrollable in order to provide access to the additional functionality. For example, the bezel menu can have left and right arrows on either side to enable scrollability. Alternately or additionally, a single or multi-finger drag that is orthogonal to the opening direction of the bezel menu can scroll it, without the need for any arrows.


In at least some embodiments, the bezel menu can create space for additional slots or icons. For example, by reducing the width of slots or icons that appear at the edge of the bezel menu, additional slots or icons can be added. As an example, consider FIG. 9.


There, a device includes a bezel 903 and a bezel menu 910 that appears on display device 908. Additional slots or icons 912, 914 appear in the bezel menu 910. Notice that the slots or icons 912, 914 have a reduced width relative to other slots or icons. In this example, the width is reduced by about one half. In order to access objects associated with slots or icons 912, 914, a bezel gesture can be used that drags over the slot or icon from the side of the device as shown. In some embodiments, the corner slots or icons can have a special status. For example, the corner slots or icons may be permanently assigned to a particular functionality and may not be customizable.


Accordingly, bezel menus can be used to expose functionality to a user in a manner that does not permanently cause screen real estate to be occupied or require the use of a dedicated hardware button.



FIG. 10 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system such as those systems that are described above and below.


Step 1000 displays a visual affordance associated with an accessible bezel menu. An example of a suitable visual affordance is given above. Step 1002 receives a bezel gesture input relative to the visual affordance. Any suitable bezel gesture can be utilized, an example of which is provided above. Step 1004 presents, responsive to receiving the bezel gesture input, a bezel menu. Any suitable bezel menu can be utilized. In at least some embodiments, the bezel menu can be presented simply by virtue of receiving a bezel gesture without necessarily displaying a visual affordance. Alternately or additionally, the visual affordance may fade in when the user's finger or pen hovers above an associated bezel edge.



FIG. 11 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system such as those systems that are described above and below.


Step 1100 receives a gesture input. The input can be received relative to a bezel menu or a visual affordance associated with a bezel menu. Any suitable gesture input can be received. For example, the gesture input can comprise an input that does not use or incorporate the bezel. An example of this was provided above in the discussion of FIG. 6 relative to a user tapping on an exposed portion of the bezel menu. Alternately or additionally, the gesture input can comprise a bezel gesture input. An example of this was provided above in the discussion of FIGS. 7-9. Step 1102 ascertains a functionality associated with the gesture input. Step 1104 accesses the functionality that was ascertained in step 1102. Examples of how this can be done are provided above.


The examples above illustrate gestures, including bezel gestures that utilize a single finger. In other embodiments, more than one finger can be utilized in connection with gestures including bezel gestures.


Use of Multiple Fingers for Gesturing


In one or more embodiments, multiple fingers can be utilized for gesturing, including bezel gesturing. The multiple fingers can reside on one hand or, collectively, on both hands. The use of multiple fingers can enable multiple numbers of touches to be mapped to different functionalities or objects associated with functionalities. For example, a two-finger gesture or bezel gesture might be mapped to a first functionality or a first object associated therewith, and a three-finger gesture or bezel gesture might be mapped to a second functionality or a second object associated therewith. As an example, consider FIG. 12.


There, device 1202 includes a bezel 1203 and a visual affordance 1204 that is rendered on the display device. As noted above, visual affordance 1204, in the form of a semi-transparent strip, can be utilized to provide a hint or indication of accessible functionality, in this case a bezel menu 1210, associated with the bezel.


As noted above, the bezel menu 1210 can be accessed through a bezel gesture in which a finger of the user's hand touches the bezel and then moves across the bezel and onto the display device to drag the bezel menu down.


In one or more embodiments, bezel menu 1210 can be exposed and further extended into a drawer illustrated at 1212. In the illustrated and described embodiment, the following bezel gesture can be used to expose drawer 1212. First, a user touches down with one or more fingers on or near the bezel 1203. This is illustrated in the top-most portion of FIG. 12. From there, the user can drag multiple fingers onto the display device as illustrated in the bottom-most portion of FIG. 12, thereby exposing drawer 1212. In at least some embodiments, no objects are created, by default, when multiple fingers simultaneously cross the bezel menu. That is, in these embodiments, a multi-finger gesture as described above indicates that the drawer 1212 is being accessed. Drawer 1212 can have additional objects such as those that are illustrated. Additional objects can include, by way of example and not limitation, additional tools, colors, and various other objects. In addition, in at least some embodiments, drawer 1212 can be utilized to store and/or arrange various items. Items can be arranged or rearranged in any suitable way such as, by direct manipulation by the user, e.g. by dragging and dropping an object within the drawer.


In at least some embodiments, lifting the hand may leave the drawer open until it is later closed by way of a similar gesture in the opposite direction. In at least some embodiments, bezel menu 1210 can be customized using, for example, contents from drawer 1212. As an example, consider FIG. 13.


There, a user can change the default assignment of tools and/or objects to the main bezel menu slots via a drag and drop operation. For example, in the top-most portion of FIG. 13, a user touches down on a new tool 1300. The user then proceeds to drag tool 1300 into or onto one of the slots of bezel menu 1210. This gesture causes the object previously associated with the slot to be replaced with the new object dropped by the user.


Alternately or additionally, the user can also drag content from the page or canvas into the drawer 1212. As an example, consider FIG. 14. There, the user has touched down on an object 1400 on the page or canvas and has dragged the object into drawer 1212. By lifting the finger, the object 1400 is deposited into the drawer 1212.


It is to be appreciated and understood that while one drawer has been described above, various other embodiments can utilize multiple drawers. For example, other edges of the display device can be associated with different drawers. These different drawers may hold different tools, objects, or other content. On dual or multiple-screen devices, the drawers for each screen edge may be identical or may be differentiated. In at least some embodiments, the multiple drawers may also be accessed on each screen edge by sliding orthogonal to the direction that the drawer is opened. This can be done either by a single touch, and/or multiple touches. If the bezel menu extends all the way to the screen edges, it can also be done by a bezel gesture from the orthogonal edge.


In the embodiment described just above, multiple touches were used to access drawer 1212. Specifically, as illustrated in FIG. 12, three touches were used to access the illustrated drawer. In one or more embodiments, different numbers of touches can be utilized to access different drawers. For example, two touches can be mapped to a first drawer, three touches can be mapped to a second drawer, four touches can be mapped to a third drawer, and so on. Alternately or additionally, the spacing between multiple touches and variances therebetween can be mapped to different functionalities. For example, a two-finger touch with a first spacing might be mapped to a first functionality; and, a two-finger touch with a second, greater spacing might be mapped to a second different functionality.
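The sketch below illustrates such mappings with invented drawer names and an assumed spacing threshold.

```python
DRAWER_BY_TOUCH_COUNT = {2: "drawer-1", 3: "drawer-2", 4: "drawer-3"}  # assumed assignment
SPACING_THRESHOLD_PX = 150   # assumed boundary between "close" and "wide" two-finger touches

def select_drawer(touch_count):
    """Map the number of fingers crossing the bezel to a drawer, if any."""
    return DRAWER_BY_TOUCH_COUNT.get(touch_count)

def two_finger_function(spacing_px):
    """Map the spacing between two fingers to one of two functionalities."""
    return "first-functionality" if spacing_px < SPACING_THRESHOLD_PX else "second-functionality"

print(select_drawer(3))            # drawer-2
print(two_finger_function(220.0))  # second-functionality
```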



FIG. 15 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system such as those systems that are described above and below.


Step 1500 receives multiple-finger gesture input. Any suitable type of gesture can be utilized including, by way of example and not limitation, bezel gesture input such as that described above. Step 1502 ascertains a functionality associated with the multiple-finger gesture input. Examples of functionalities are described above. Step 1504 accesses the ascertained functionality. Examples of how this can be done are described above.



FIG. 16 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system such as those systems that are described above and below.


Step 1600 receives a bezel gesture input. Examples of bezel gesture inputs are described above. Step 1602 ascertains a functionality associated with the bezel gesture input. In this particular embodiment, the functionality associated with the bezel gesture input is one that is associated with accessing one or more drawers. Step 1604 exposes one or more drawers for the user. Examples of how this can be done are described above.


Radial Menus


In at least some embodiments, so-called radial menus can be utilized in connection with menus such as bezel menus. Although radial menus are described, other types of menus can be used without departing from the spirit and scope of the claimed subject matter. For example, pull down menus can be used in conjunction with bezel menus. One of the general ideas associated with radial menus is that a user can touch down at a certain location and stroke or slide their finger a certain direction to access and implement a particular functionality or menu command. The presence of a radial menu can be indicated by a small icon associated with a larger icon or slot of the bezel menu. As an example, consider FIG. 17.


There, device 1702 includes a bezel 1703 and a bezel menu 1710 that has been exposed on display device 1708 as described above. In the illustrated and described embodiment, bezel menu 1710 includes multiple selectable icons or slots, one of which is designated at 1712. Each of the icons or slots is associated with a different functionality such as, for example, paint functionality, pen functionality, note functionality, object creation, object editing, and the like. It is to be appreciated and understood, that any type of functionality can be associated with the icons or slots.


As noted above, bezel menu 1710 can enable a user to access and activate commands, tools, and objects. The bezel menu can be configured to respond to both touch input and pen input. Alternately or additionally, the bezel menu can be configured to respond only to touch input. In the illustrated and described embodiment, icon or slot 1712 includes a radial menu icon 1714 that gives a clue to the user that one or more radial menus, for example radial menu 1715, is associated with this particular icon or slot. In the illustrated and described embodiment, the radial menu 1715 can be accessed in any suitable way, e.g. through a pen or touch. For example, in at least some embodiments, the radial menu 1715 can be accessed by hovering a pen over or near radial menu icon 1714. Alternately or additionally, a pen or finger can be used to pull down the radial menu 1715. Alternately or additionally, the radial menu 1715 can be accessed through a tap-and-hold of the pen or finger on or near the radial menu icon 1714. In some embodiments, tapping on the radial menu icon triggers a default action which may or may not be different than the action associated with tapping on the bezel menu slot.


Once the radial menu 1715 is exposed, the user can access various functionalities or commands by touching down on or near radial menu icon 1714 and stroking in a particular direction. In the illustrated and described embodiment, five different directions are indicated by the arrows. Each direction corresponds to a different functionality or command. Each functionality or command is represented, in the drawing, by a cross-hatched square. In at least some embodiments, each icon or slot 1712 has a default functionality or command. By selecting a particular radial menu functionality or command, the default functionality or command may be replaced by the selected functionality or command.
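As an illustration of direction-based selection, the following sketch (with assumed command names) divides the circle evenly among the available options and maps the direction of a stroke, measured from the radial menu icon, to one of them.

```python
import math

COMMANDS = ["copy", "paste", "delete", "duplicate", "properties"]  # five assumed options

def radial_command(dx, dy):
    """Map a stroke direction from the radial menu icon to a command.
    The circle is divided evenly among the options, starting at 'right'."""
    angle = math.degrees(math.atan2(-dy, dx)) % 360   # screen y grows downward
    sector = 360.0 / len(COMMANDS)
    index = int(((angle + sector / 2) % 360) // sector)
    return COMMANDS[index]

print(radial_command(35, 0))    # stroke to the right -> copy
print(radial_command(0, -40))   # stroke straight up  -> paste (next sector counterclockwise)
```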


In at least some embodiments, the number of options presented by a radial menu can change depending on the location of the corresponding slot or icon with which the radial menu is associated. For example, in the illustrated and described embodiment, slot or icon 1712 includes five options for the user. Radial menus associated with slots or icons that appear at the ends of the bezel menu 1710 may have fewer options due to spacing constraints. Alternately or additionally, radial menus associated with slots or icons that appear as part of an exposed drawer may have more selectable options.


In at least some embodiments, radial menus can be implemented to include both a novice mode and an expert mode. In the novice mode, the radial menu can be fully exposed to enable users who are unfamiliar with its accessible functionalities or commands to be visually guided through the selection process. In the expert mode, intended for users who are familiar with the content and behavior of radial menus, the radial menu might not be exposed at all. Rather, a quick touch-and-stroke gesture associated with an icon or slot, such as icon 1712, may enable the radial menu's functionality or command to be accessed directly.



FIG. 18 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system such as those systems that are described above and below.


Step 1800 presents a bezel menu. Examples of bezel menus are provided above. Step 1802 provides an indication of one or more radial menus associated with the bezel menu. In the illustrated and described embodiment, the indication resides in the form of a radial menu icon that appears on a slot or icon of the bezel menu. Step 1804 receives user input associated with one of the radial menus. Examples of how this can be done are provided above. For example, in at least some embodiments, a radial menu can be visually presented to the user so that the user can then touch and stroke in a particular direction to provide the input. Alternately or additionally, a radial menu need not necessarily be visually presented. Rather, a user who is familiar with the radial menu's content and behavior can correspondingly gesture, as described above, to provide the input. Step 1806 accesses, responsive to the received user input, the associated functionality or command.


In one or more embodiments, the bezel menu may or may not be rotated when the screen orientation is rotated. For example, in some instances it may be desirable to not rotate a bezel menu when the screen orientation is rotated. This may be particularly relevant in applications where the content should not be rotated, e.g., a journal page or a sketch pad where the user rotates the screen to afford different drawing angles. In other instances, it may be desirable to rotate the bezel menu when the screen orientation is rotated. By default, it may be desirable to support the same number of bezel menu slots on all four edges of the screen so that menu items can be rotated from the long edge of the screen to the short edge of the screen without losing some items.


Alternately or additionally, bezel menus can be customizable per screen orientation to enable different numbers of slots to be used on the long and short edges of the screen. In some instances, some edges of the screen may be left without bezel items depending on the screen orientation. For example, the left and bottom edges, for a right-handed individual, may be more likely to be swiped by accident, and may be left without bezel menus if desired.


On and Off Screen Gestures and Combinations—Page/Object Manipulation


In one or more embodiments, on and off screen gesture combinations can be utilized to manipulate pages and/or other objects. For example, combinations of on and off screen gestures can include gestures in which input is received on the screen relative to an object using one hand, and additional input in the form of a bezel gesture is received relative to the object using the same or a different hand. Any suitable type of gesture combinations can be used. As an example, consider FIG. 19.


There, a device 1902 includes a bezel 1903. A page 1904 is displayed on the display device (not designated). In the illustrated and described embodiment, a tear operation is performed using a combination of on and off screen gestures. Specifically, in the bottommost portion of FIG. 19, a user's left hand or left index finger holds an object which, in this example, comprises page 1904. Using the right hand, the user initiates a bezel gesture starting on bezel 1903 and moving in the direction of the indicated arrow through a portion of page 1904. By virtue of using a single finger to indicate the tear operation, a partial tear of the page is performed. A tear operation can be implemented by creating a bitmap of the portion of the page that has been torn away and rendering only that portion of the page that was not torn away. Alternately or additionally, an object can be created to represent the torn-away portion. In this created object, objects appearing in the torn-away portion can be created to represent items appearing on the page.
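The bitmap-splitting idea can be sketched as follows; the page representation and the TornPiece type are invented for illustration, and a real implementation would split along the actual tear path rather than a straight column.

```python
from dataclasses import dataclass

@dataclass
class TornPiece:
    pixels: list     # bitmap of the torn-away region
    origin: tuple    # where it was torn from, so it can be rendered while dragged

def tear_page(page_pixels, tear_x):
    """page_pixels: a 2-D list of pixel values (rows). Everything at or right of
    tear_x is torn away; the page keeps rendering only the remaining part."""
    remaining = [row[:tear_x] for row in page_pixels]
    torn = [row[tear_x:] for row in page_pixels]
    return remaining, TornPiece(pixels=torn, origin=(tear_x, 0))

page = [[1] * 8 for _ in range(4)]            # a tiny 8-wide, 4-high "page" bitmap
keep, piece = tear_page(page, tear_x=5)
print(len(keep[0]), len(piece.pixels[0]))     # 5 3
```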


In one or more other embodiments, a tear operation can be implemented using multiple fingers. In these embodiments, the multiple finger input can be mapped to an operation that completely tears a page out of the canvas or book in which the page appears.


In at least some embodiments, the direction of tearing can carry with it different semantics. For example, a top-to-bottom tear may tear out and delete a page. A bottom-to-top tear may tear out and allow dragging of the page to a new location.



FIG. 20 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system such as those systems that are described above and below.


Step 2000 receives on-screen input associated with an object. Any suitable type of on-screen input can be received including, by way of example and not limitation, single-finger input and/or multiple-finger input. Step 2002 receives a bezel gesture input associated with the object. Any suitable type of bezel gesture input can be received including, by way of example and not limitation, single-finger input and/or multiple-finger input. Step 2004 ascertains functionality associated with both inputs. Step 2006 accesses the associated functionality. Any suitable type of functionality can be associated with the combination of on-screen and bezel gesture inputs, an example of which is provided above.


Other page manipulations can be provided through the use of gestures, including bezel gestures. For example, page flipping and page saving (also termed “page pocketing”) can be provided as described below.


As an example, consider FIG. 21. There, a device 2102 includes a bezel 2103 and a page 2104. As shown in the bottommost portion of FIG. 21, a user can flip to a previous page by using a bezel gesture that starts on bezel 2103 and proceeds rightward across the screen in the direction of the arrow. Doing so reveals the previous page 2106. Likewise, to turn to the next page, a user would utilize a similar bezel gesture, but in the opposite direction. Using the page flipping gesture, the user's finger can lift at any suitable location on the screen.


In one or more embodiments, the semantics of page flipping gestures can vary from that described above. For example, in some instances a page flipping gesture can be initiated as described above. However, if the user pauses with their finger on the screen, multiple pages can be flipped through. Alternately or additionally, pausing the finger on the screen in the middle of a page flipping gesture can cause additional controls, such as section tabs, command palettes, or a bezel menu to appear.


Alternately or additionally, in at least some embodiments, the further a user's finger progresses across the screen, the more pages can be flipped. Alternately or additionally, multiple pages can be flipped by initiating the page flipping gesture as described above, and then moving the finger in a circular motion, either clockwise or counterclockwise. In this instance, clockwise motion would represent forward flipping, and counterclockwise motion would represent backwards flipping. In this implementation, a circle may be fitted to the last N samples of motion. The speed of motion can be a function of the diameter of the circle. Note that in this implementation, the user does not have to circle around any particular location on the screen, or even draw a well-formed circle at all. Rather, any curvilinear motion can be mapped to page flipping in an intuitive manner, while also allowing the user to easily stop and reverse course to flip in the opposite direction.
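

The following Python sketch maps curvilinear finger motion to page flips by accumulating the signed turning direction over the most recent samples, with clockwise turning flipping forward and counterclockwise turning flipping backward. It is a simplification of the approach described above; an implementation closer to the text might instead fit a circle to the last N samples and scale the flipping speed by its diameter. The window size and the degrees-per-flip constant are assumptions.

import math
from collections import deque


class CircularFlipTracker:
    def __init__(self, degrees_per_flip: float = 90.0, window: int = 8):
        self.points = deque(maxlen=window)   # last N touch samples (x, y)
        self.accumulated = 0.0               # signed turning angle, in degrees
        self.degrees_per_flip = degrees_per_flip

    def add_sample(self, x: float, y: float) -> int:
        """Return +1 to flip forward, -1 to flip backward, 0 for no flip yet."""
        self.points.append((x, y))
        if len(self.points) < 3:
            return 0
        (x0, y0), (x1, y1), (x2, y2) = list(self.points)[-3:]
        v1 = (x1 - x0, y1 - y0)
        v2 = (x2 - x1, y2 - y1)
        cross = v1[0] * v2[1] - v1[1] * v2[0]
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        # In screen coordinates (y grows downward) a positive cross product
        # corresponds to a clockwise turn between successive motion vectors.
        turn = math.degrees(math.atan2(cross, dot))
        self.accumulated += turn
        if abs(self.accumulated) >= self.degrees_per_flip:
            direction = 1 if self.accumulated > 0 else -1  # clockwise: forward
            self.accumulated = 0.0
            return direction
        return 0


tracker = CircularFlipTracker()
# A clockwise circle of on-screen samples (screen y grows downward) reports forward flips.
for angle_deg in range(0, 360, 20):
    a = math.radians(angle_deg)
    result = tracker.add_sample(200 + 100 * math.cos(a), 200 + 100 * math.sin(a))
    if result:
        print("flip", "forward" if result > 0 else "backward")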


In at least some embodiments, a similar gesture can be used to save or “pocket” a page. In these embodiments, rather than the gesture terminating on the screen, as in the page flipping example, the gesture can terminate on a bezel portion or other structure that lies across the screen from where the gesture originated. As an example, consider FIGS. 22 and 23.


There, a device 2202 includes a bezel 2203 and a page 2204. As shown in the bottommost portion of FIG. 22, a user can save or pocket a page by using a bezel gesture that starts on bezel 2203 and proceeds rightward across the screen in the direction of the arrow to a bezel portion that lies opposite where the gesture originated. Doing so reveals another page 2206. In one or more embodiments, a distance threshold can be defined such that, prior to the threshold, the page flipping experience, such as that described and shown in FIG. 21, can be provided. After the defined distance threshold, a different page-saving or page-pocketing experience can be provided. For example, in the FIG. 22 illustration, page 2204 has been reduced to a thumbnail. The page-saving or page-pocketing experience can be triggered by a combination of passing the minimum distance threshold and the expiration of a minimum timeout, such as ⅓ second, by which time most page flipping gestures would have been completed. In at least some embodiments, if the user lifts their finger prior to reaching the opposite-side bezel, a page flipping operation can be presumed.
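

The following Python sketch illustrates one way the flip-versus-pocket decision described above could be made at finger lift, assuming a fixed distance threshold, a ⅓ second minimum timeout, and knowledge of whether the finger reached the opposite-side bezel. The constants and the Gesture record are illustrative only.

from dataclasses import dataclass

DISTANCE_THRESHOLD_PX = 400      # assumed threshold; device-specific in practice
MIN_POCKET_TIMEOUT_S = 1.0 / 3   # most flip gestures complete sooner than this


@dataclass
class Gesture:
    travel_px: float             # distance travelled across the screen
    duration_s: float            # time from bezel entry to finger lift
    ended_on_opposite_bezel: bool


def classify_page_gesture(g: Gesture) -> str:
    # Lifting before reaching the opposite-side bezel presumes a flip.
    if not g.ended_on_opposite_bezel:
        return "flip"
    # Pocketing requires both the distance threshold and the minimum timeout.
    if g.travel_px >= DISTANCE_THRESHOLD_PX and g.duration_s >= MIN_POCKET_TIMEOUT_S:
        return "pocket"
    return "flip"


print(classify_page_gesture(Gesture(620, 0.5, True)))    # pocket
print(classify_page_gesture(Gesture(300, 0.2, False)))   # flip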



FIG. 23 illustrates a device 2302 that includes a bezel 2303 and two separate display screens 2304, 2306 separated by a spine 2308. Spine 2308 can be considered as comprising part of the bezel or physical structure of the device. A page 2310 is illustrated as being displayed on display screen 2304.


As shown in the bottommost portion of FIG. 23, a user can save or pocket a page by using a bezel gesture that starts on bezel 2303 and proceeds rightward across the screen in the direction of the arrow to spine 2308, which lies across the screen 2304 from where the gesture originated. Doing so reveals another page 2312. In one or more embodiments, a distance threshold can be defined such that, prior to the threshold, the page flipping experience, such as that described and shown in FIG. 21, can be provided. After the defined distance threshold, a different page-saving or page-pocketing experience can be provided. For example, in the FIG. 23 illustration, page 2310 has been reduced to a thumbnail. The page-saving or page-pocketing experience can be provided after a minimum timeout, such as ⅓ second, by which time most page flipping gestures would have been completed. In at least some embodiments, if the user lifts their finger prior to reaching the spine 2308, a page flipping operation can be presumed.


In one or more embodiments, portions of pages can be saved or pocketed. As an example, consider FIG. 24. There, a device 2402 includes a bezel 2403 and two separate display screens 2404, 2406 separated by a spine 2408. Spine 2408 can be considered as comprising part of the bezel or physical structure of the device. A page 2410 is illustrated as being displayed on display screen 2404.


As shown in the bottommost portion of FIG. 24, a user can save or pocket a portion of the page by using a bezel gesture. First, two fingers of a user's hand (in this case the left hand) sweep onto the screen from the bezel. In this particular instance, the user's left hand initiates the bezel gesture from the spine 2408 and moves in the direction of the top-most arrow. The region between the fingers, here illustrated at 2412, is then highlighted. The user's other hand can then sweep across the highlighted area to tear out the highlighted portion of the page and pocket or save the highlighted portion as shown. In one or more embodiments, this gesture can be supported on any of the four edges of the screen, thus allowing horizontal or vertical strips to be torn from either screen by either right-handed or left-handed users. In at least some embodiments, the torn portion of the page can have two torn edges and two clean-cut edges to distinguish it from pocketed pages or other pocketed objects.
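

The following Python sketch captures the two-handed strip tear in simplified form: the first two fingers define a highlighted band between their positions, and a sweep by the other hand tears the band out if its path crosses the band. The Strip record and the intersection test are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class Strip:
    top: float      # position of the upper finger as it sweeps in from the bezel or spine
    bottom: float   # position of the lower finger


def highlight_strip(finger1_y: float, finger2_y: float) -> Strip:
    return Strip(top=min(finger1_y, finger2_y), bottom=max(finger1_y, finger2_y))


def sweep_tears_strip(strip: Strip, sweep_start_y: float, sweep_end_y: float) -> bool:
    """True if the other hand's sweep passes through the highlighted band."""
    lo, hi = sorted((sweep_start_y, sweep_end_y))
    return lo <= strip.bottom and hi >= strip.top


strip = highlight_strip(220.0, 360.0)
print(sweep_tears_strip(strip, 180.0, 400.0))   # True: the sweep crosses the band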



FIG. 25 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system such as those systems that are described above and below.


Step 2500 receives bezel gesture input relative to a page. Step 2502 ascertains page manipulation functionality associated with the input. Any suitable type of page manipulation functionality can be ascertained, examples of which are provided above. Step 2504 accesses the ascertained page manipulation functionality.



FIG. 26 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system such as those systems that are described above and below.


Step 2600 receives on-screen input relative to a page. Any suitable type of input can be received. In at least some embodiments, the received screen input comprises a touch input or a stylus input. Step 2602 receives a bezel gesture input relative to the page. Any suitable type of bezel gesture input can be received, examples of which are provided above. Step 2604 ascertains page manipulation functionality associated with the combined input. Examples of page manipulation functionality are provided above. Step 2606 accesses the ascertained page manipulation functionality for purposes of implementing the functionality relative to the page.


Thus, page flipping and page saving operations can be unified through the use of bezel gestures that include at least some common aspects. Unification of these two operations yields simplicity and facilitates discoverability for users.


In one or more embodiments, other page manipulation operations can be implemented through the use of bezel gestures. As an example, consider FIG. 27. There, a device 2702 includes a bezel 2703. A page 2704 is displayed on the display device (not designated). In the illustrated and described embodiment, a bookmark tab can be created through the use of a bezel gesture. Specifically, as shown in the bottommost portion of FIG. 27, a bookmark tab 2706 can be created by initiating a gesture on the bezel 2703 and moving on to page 2704. In the illustrated and described embodiment, the bezel gesture that creates the bookmark tab originates on a corner of the bezel as shown. Any suitable location on the bezel can be utilized for creating a bookmark tab.


Alternately or additionally, bezel gestures can be utilized to dog-ear a page. As an example, consider FIG. 28. There, a device 2802 includes a bezel 2803. A page 2804 is displayed on the display device (not designated). In the illustrated and described embodiment, a dog-ear can be created through the use of a bezel gesture. Specifically, as shown in the bottommost portion of FIG. 28, a dog-ear 2806 can be created by initiating a gesture on the bezel 2803 and moving onto page 2804 and then exiting the page in an opposite direction as illustrated by the arrows. In the illustrated and described embodiment, the bezel gesture that creates the dog-ear originates on a corner of the bezel as shown. Any suitable location on the bezel can be utilized for creating a dog-ear. For example, in other embodiments, a dog-ear can be created through a bezel gesture that cuts across a corner of the page.
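

The following Python sketch distinguishes the two corner gestures just described, assuming the gesture is reported as a list of points beginning at a top-left bezel corner: a stroke that moves onto the page and lifts there creates a bookmark tab (FIG. 27), while a stroke that moves onto the page and then retreats back toward the corner creates a dog-ear (FIG. 28). The point format and the thresholds are illustrative assumptions.

from typing import List, Tuple

Point = Tuple[float, float]


def classify_corner_gesture(points: List[Point], screen_w: float) -> str:
    xs = [p[0] for p in points]
    start_x, end_x = xs[0], xs[-1]
    max_x = max(xs)
    entered_page = (max_x - start_x) > 0.05 * screen_w            # moved onto the page
    exited_back = entered_page and (max_x - end_x) > 0.8 * (max_x - start_x)
    if not entered_page:
        return "none"
    return "dog-ear" if exited_back else "bookmark-tab"


# Enter from the corner and lift on the page: bookmark tab.
print(classify_corner_gesture([(0, 0), (40, 40), (80, 80)], screen_w=1024))
# Enter and then retreat back toward the corner: dog-ear.
print(classify_corner_gesture([(0, 0), (60, 60), (10, 90)], screen_w=1024))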


In one or more embodiments, gestures can be utilized to expose tabs such as user-created or predefined tabs in a document. As an example, consider FIG. 29. There, a device 2902 includes a bezel 2903. A page 2904 is displayed on the display device (not designated). In one or more embodiments, tabs can be exposed by utilizing a bezel gesture that pulls at the edge of page 2904, as shown, to reveal a tab structure 2906. As the bezel gesture moves onto the screen, the page is pulled slightly to the right to expose tab structure 2906. In this instance, the gesture includes two or more fingers that are held together as shown, rather than spread apart with a gap therebetween.
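

The following Python sketch suggests one way the tab-exposing pull (fingers held together) could be told apart from the two-finger strip gesture of FIG. 24 (fingers spread apart), using only the spacing between the contact points. The spacing threshold is an assumption; a real recognizer would likely consider contact size and motion as well.

FINGERS_TOGETHER_MAX_GAP_PX = 40.0   # assumed maximum gap for "held together"


def classify_two_finger_bezel_gesture(y1: float, y2: float) -> str:
    gap = abs(y1 - y2)
    return "expose-tabs" if gap <= FINGERS_TOGETHER_MAX_GAP_PX else "highlight-strip"


print(classify_two_finger_bezel_gesture(300.0, 318.0))   # expose-tabs
print(classify_two_finger_bezel_gesture(220.0, 360.0))   # highlight-strip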


In one or more embodiments, continuing to drag the page can reveal further structure. For example, continuing to drag the page can expose a table organizational view to the left of page 2904. In at least some embodiments, continuing the gesture across the entire page can save or pocket the entire page as described above.
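

The following Python sketch illustrates the staged behavior described above with three assumed thresholds on how far the dragging gesture has progressed across the page: a short pull exposes the tab structure, a longer pull exposes the organizational view, and dragging across the entire page pockets it. The threshold values are illustrative assumptions.

def staged_page_pull(drag_fraction: float) -> str:
    """drag_fraction is how far across the page the gesture has progressed (0..1)."""
    if drag_fraction >= 1.0:
        return "pocket-page"
    if drag_fraction >= 0.5:
        return "show-organizational-view"
    if drag_fraction >= 0.1:
        return "show-tabs"
    return "none"


for f in (0.05, 0.2, 0.6, 1.0):
    print(f, staged_page_pull(f))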



FIG. 30 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system such as those systems that are described above and below.


Step 3000 receives a bezel gesture input relative to a page. Step 3002 creates a bookmark tab relative to the page, responsive to receiving the bezel gesture input. Examples of how this can be done are provided above.



FIG. 31 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system such as those systems that are described above and below.


Step 3100 receives a bezel gesture input relative to a page. Step 3102 creates a dog-ear on the page, responsive to receiving the bezel gesture input. Examples of how this can be done are provided above.



FIG. 32 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system such as those systems that are described above and below.


Step 3200 receives a bezel gesture input relative to a page. Step 3202 exposes tab structure associated with the page. Examples of how this can be done are provided above.


Example Device



FIG. 33 illustrates various components of an example device 3300 that can be implemented as any type of portable and/or computer device as described with reference to FIGS. 1 and 2 to implement embodiments of the gesture techniques described herein. Device 3300 includes communication devices 3302 that enable wired and/or wireless communication of device data 3304 (e.g., received data, data that is being received, data scheduled for broadcast, data packets of the data, etc.). The device data 3304 or other device content can include configuration settings of the device, media content stored on the device, and/or information associated with a user of the device. Media content stored on device 3300 can include any type of audio, video, and/or image data. Device 3300 includes one or more data inputs 3306 via which any type of data, media content, and/or inputs can be received, such as user-selectable inputs, messages, music, television media content, recorded video content, and any other type of audio, video, and/or image data received from any content and/or data source.


Device 3300 also includes communication interfaces 3308 that can be implemented as any one or more of a serial and/or parallel interface, a wireless interface, any type of network interface, a modem, and as any other type of communication interface. The communication interfaces 3308 provide a connection and/or communication links between device 3300 and a communication network by which other electronic, computing, and communication devices communicate data with device 3300.


Device 3300 includes one or more processors 3310 (e.g., any of microprocessors, controllers, and the like) which process various computer-executable or readable instructions to control the operation of device 3300 and to implement the gesture embodiments described above. Alternatively or in addition, device 3300 can be implemented with any one or combination of hardware, firmware, or fixed logic circuitry that is implemented in connection with processing and control circuits which are generally identified at 3312. Although not shown, device 3300 can include a system bus or data transfer system that couples the various components within the device. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.


Device 3300 also includes computer-readable media 3314, such as one or more memory components, examples of which include random access memory (RAM), non-volatile memory (e.g., any one or more of a read-only memory (ROM), flash memory, EPROM, EEPROM, etc.), and a disk storage device. A disk storage device may be implemented as any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewriteable compact disc (CD), any type of a digital versatile disc (DVD), and the like. Device 3300 can also include a mass storage media device 3316.


Computer-readable media 3314 provides data storage mechanisms to store the device data 3304, as well as various device applications 3318 and any other types of information and/or data related to operational aspects of device 3300. For example, an operating system 3320 can be maintained as a computer application with the computer-readable media 3314 and executed on processors 3310. The device applications 3318 can include a device manager (e.g., a control application, software application, signal processing and control module, code that is native to a particular device, a hardware abstraction layer for a particular device, etc.). The device applications 3318 also include any system components or modules to implement embodiments of the gesture techniques described herein. In this example, the device applications 3318 include an interface application 3322 and a gesture-capture driver 3324 that are shown as software modules and/or computer applications. The gesture-capture driver 3324 is representative of software that is used to provide an interface with a device configured to capture a gesture, such as a touchscreen, track pad, camera, and so on. Alternatively or in addition, the interface application 3322 and the gesture-capture driver 3324 can be implemented as hardware, software, firmware, or any combination thereof.
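

The following Python sketch illustrates, under simplifying assumptions, the division of labor between a gesture-capture driver and an interface application of the kind described above: the driver abstracts the capture hardware and forwards contact events, and the application consumes them. The event shape and callback protocol are assumptions for illustration, not the actual driver interface.

from dataclasses import dataclass
from typing import Callable, List


@dataclass
class ContactEvent:
    x: float
    y: float
    on_bezel: bool   # True when the contact is reported in the bezel region


class GestureCaptureDriver:
    """Abstracts the capture hardware (touchscreen, track pad, camera, and so on)."""

    def __init__(self) -> None:
        self._listeners: List[Callable[[ContactEvent], None]] = []

    def register(self, listener: Callable[[ContactEvent], None]) -> None:
        self._listeners.append(listener)

    def inject(self, event: ContactEvent) -> None:
        for listener in self._listeners:
            listener(event)


class InterfaceApplication:
    def on_contact(self, event: ContactEvent) -> None:
        origin = "bezel" if event.on_bezel else "screen"
        print(f"contact at ({event.x}, {event.y}) from the {origin}")


driver = GestureCaptureDriver()
app = InterfaceApplication()
driver.register(app.on_contact)
driver.inject(ContactEvent(x=0.0, y=120.0, on_bezel=True))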


Device 3300 also includes an audio and/or video input-output system 3326 that provides audio data to an audio system 3328 and/or provides video data to a display system 3330. The audio system 3328 and/or the display system 3330 can include any devices that process, display, and/or otherwise render audio, video, and image data. Video signals and audio signals can be communicated from device 3300 to an audio device and/or to a display device via an RF (radio frequency) link, S-video link, composite video link, component video link, DVI (digital video interface), analog audio connection, or other similar communication link. In an embodiment, the audio system 3328 and/or the display system 3330 are implemented as external components to device 3300. Alternatively, the audio system 3328 and/or the display system 3330 are implemented as integrated components of example device 3300.


CONCLUSION

Bezel gestures for touch displays have been described. In at least some embodiments, the bezel of a device is used to extend functionality that is accessible through the use of bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create screen input through a bezel gesture. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger, different-hand bezel gestures.


Although the embodiments have been described in language specific to structural features and/or methodological acts, it is to be understood that the embodiments defined in the appended claims are not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claimed embodiments.

Claims
  • 1. A computing device comprising: a display device; a bezel disposed adjacent the display device; a display gesture module configured to recognize display touch input on the display device; a bezel gesture module configured to recognize bezel touch input on the bezel; the computing device configured to render a display, in response to a single-contact dragging bezel gesture recognized by the bezel gesture module as a single gesture that originates on the bezel and moves on to the display device, of first objects indicative of first functionality that is associated only with the single-contact dragging bezel gesture, wherein the first objects have not been previously rendered; and the computing device further configured to render a display, in response to a multi-contact dragging bezel gesture recognized by the bezel gesture module as a single gesture that originates on the bezel and moves on to the display device, of second objects indicative of second functionality that is different than the first functionality and is associated only with the multi-contact dragging bezel gesture, wherein the second objects have not been previously rendered, wherein the multi-contact dragging bezel gesture comprises two or more fingers moving together to produce the dragging bezel gesture.
  • 2. The computing device of claim 1 further configured to indicate, in response to bezel gestures at different locations on the bezel, a different functionality mapped to each of the bezel gestures at its corresponding different location.
  • 3. The computing device of claim 1 further configured to display a visual affordance via the display device, where the displayed visual affordance indicates functionality that is accessible via touch input on the bezel.
  • 4. The computing device of claim 1 further configured to indicate, in response to the recognized gesture also crossing an icon displayed on the display device, functionality associated with the icon.
  • 5. The device of claim 1 where the bezel touch input on the bezel is recognized independent of any buttons on the bezel.
  • 6. The device of claim 1 where the multi-contact dragging bezel gesture originates on multiple sides of the bezel.
  • 7. A method performed on a computing device that comprises a display gesture module, a bezel gesture module, a display device, and a bezel disposed adjacent the display device, the method comprising: rendering a display of first objects indicative of first functionality in response to a single-contact dragging bezel gesture recognized by the bezel gesture module as a single gesture that originates on the bezel and moves on to the display device that is associated only with the single-contact dragging bezel gesture, wherein the first objects have not been previously rendered; and rendering a display of second objects indicative of second functionality in response to a multi-contact dragging bezel gesture recognized by the bezel gesture module as a single gesture that originates on the bezel and moves on to the display device, where the second functionality is different than the first functionality and is associated only with the multi-contact dragging bezel gesture, wherein the second objects have not been previously rendered, wherein the multi-contact dragging bezel gesture comprises two or more fingers moving together to produce the dragging bezel gesture.
  • 8. The method of claim 7 further comprising indicating, in response to bezel gestures at different locations on the bezel, a different functionality mapped to each of the bezel gestures at its corresponding different location.
  • 9. The method of claim 7 further comprising displaying a visual affordance via the display device, where the displayed visual affordance indicates functionality that is accessible via touch input on the bezel.
  • 10. The method of claim 7 further comprising indicating, in response to the recognized gesture also crossing an icon displayed on the display device, functionality associated with the icon.
  • 11. The method of claim 7 where the bezel touch input on the bezel is recognized independent of any buttons on the bezel.
  • 12. The method of claim 7 where the multi-contact dragging bezel gesture originates on multiple sides of the bezel.
  • 13. At least one non-transitory computer-readable hardware storage media that includes computer-executable instructions which, based on execution by a computing device that comprises a display gesture module, a bezel gesture module, a display device, and a bezel disposed adjacent the display device, configure the computing device to perform actions comprising: rendering a display of first objects indicative of first functionality in response to a single-contact dragging bezel gesture recognized by the bezel gesture module as a single gesture that originates on the bezel and moves on to the display device that is associated only with the single-contact dragging bezel gesture, wherein the first objects have not been previously rendered; and rendering a display of second objects indicative of second functionality in response to a multi-contact dragging bezel gesture recognized by the bezel gesture module as a single gesture that originates on the bezel and moves on to the display device, where the second functionality is different than the first functionality and is associated only with the multi-contact dragging bezel gesture, wherein the second objects have not been previously rendered, wherein the multi-contact dragging bezel gesture comprises two or more fingers moving together to produce the dragging bezel gesture.
  • 14. The at least one non-transitory computer-readable hardware storage media of claim 13 further comprising indicating, in response to bezel gestures at different locations on the bezel, a different functionality mapped to each of the bezel gestures at its corresponding different location.
  • 15. The at least one non-transitory computer-readable hardware storage media of claim 13 further comprising displaying a visual affordance via the display device, where the displayed visual affordance indicates functionality that is accessible via touch input on the bezel.
  • 16. The at least one non-transitory computer-readable hardware storage media of claim 13 further comprising indicating, in response to the recognized gesture also crossing an icon displayed on the display device, functionality associated with the icon.
  • 17. The at least one non-transitory computer-readable hardware storage media of claim 13 where the bezel touch input on the bezel is recognized independent of any buttons on the bezel, or where the multi-contact dragging bezel gesture originates on multiple sides of the bezel.
RELATED APPLICATIONS

This application is a Continuation of, and claims benefit from, U.S. patent application Ser. No. 12/709,301 that was filed on Feb. 19, 2010, and that is incorporated herein by reference in its entirety.

US Referenced Citations (505)
Number Name Date Kind
4686332 Greanias et al. Aug 1987 A
4843538 Lane et al. Jun 1989 A
4868912 Doering Sep 1989 A
5231578 Levin et al. Jul 1993 A
5237647 Roberts et al. Aug 1993 A
5252951 Tannenbaum et al. Oct 1993 A
D341848 Bigelow et al. Nov 1993 S
5305435 Bronson Apr 1994 A
5347628 Brewer et al. Sep 1994 A
5351995 Booker et al. Oct 1994 A
5392388 Gibson Feb 1995 A
5404458 Zetts Apr 1995 A
5463725 Henckel et al. Oct 1995 A
5491783 Douglas et al. Feb 1996 A
5496974 Akebi et al. Mar 1996 A
5497776 Yamazaki et al. Mar 1996 A
5511148 Wellner Apr 1996 A
5555369 Menendez et al. Sep 1996 A
5596697 Foster et al. Jan 1997 A
5615320 Lavendel Mar 1997 A
5661773 Swerdloff et al. Aug 1997 A
5664128 Bauer Sep 1997 A
5664133 Malamud et al. Sep 1997 A
5689669 Lynch et al. Nov 1997 A
5694150 Sigona et al. Dec 1997 A
5731813 O—Rourke et al. Mar 1998 A
5761485 Munyan Jun 1998 A
5777596 Herbert Jul 1998 A
5817019 Kawashima Oct 1998 A
5821930 Hansen Oct 1998 A
5838889 Booker et al. Nov 1998 A
5898434 Small et al. Apr 1999 A
5943052 Allen Aug 1999 A
5969720 Lisle et al. Oct 1999 A
6025839 Schell et al. Feb 2000 A
6029214 Dorfman et al. Feb 2000 A
6037937 Beaton et al. Mar 2000 A
6061061 Conrad et al. May 2000 A
6072476 Harada et al. Jun 2000 A
6073036 Heikkinen et al. Jun 2000 A
6097392 Leyerle Aug 2000 A
6115724 Booker et al. Sep 2000 A
6167439 Levine et al. Dec 2000 A
6208331 Singh Mar 2001 B1
6226010 Long May 2001 B1
6239798 Ludolph et al. May 2001 B1
6246395 Goyins Jun 2001 B1
6266050 Oh et al. Jul 2001 B1
6278443 Amro et al. Aug 2001 B1
6310610 Beaton et al. Oct 2001 B1
6340979 Beaton et al. Jan 2002 B1
6396523 Segal et al. May 2002 B1
6448987 Easty et al. Sep 2002 B1
6459424 Resman Oct 2002 B1
6501491 Brown et al. Dec 2002 B1
6507352 Cohen et al. Jan 2003 B1
6525749 Moran et al. Feb 2003 B1
6545669 Kinawi et al. Apr 2003 B1
6831631 Chuang Dec 2004 B2
6859909 Lerner et al. Feb 2005 B1
6920619 Milekic Jul 2005 B1
6925598 Melhem et al. Aug 2005 B2
6941521 Lin et al. Sep 2005 B2
6957233 Beezer et al. Oct 2005 B1
7023427 Kraus et al. Apr 2006 B2
7053887 Kraus et al. May 2006 B2
7158123 Myers et al. Jan 2007 B2
7180524 Axelrod Feb 2007 B1
7256770 Hinckley et al. Aug 2007 B2
7295191 Kraus et al. Nov 2007 B2
7302650 Allyn et al. Nov 2007 B1
7338224 Jones et al. Mar 2008 B2
7339580 Westerman et al. Mar 2008 B2
7454717 Hinckley et al. Nov 2008 B2
7479949 Jobs Jan 2009 B2
7506269 Lang Mar 2009 B2
7509348 Burtner et al. Mar 2009 B2
7532196 Hinckley May 2009 B2
7561146 Hotelling Jul 2009 B1
7570943 Sorvari et al. Aug 2009 B2
7636071 O'Gorman Dec 2009 B2
7643012 Kim et al. Jan 2010 B2
7656393 King et al. Feb 2010 B2
7676767 Hofmeister et al. Mar 2010 B2
7735020 Chaudhri Jun 2010 B2
7760187 Kennedy Jul 2010 B2
7821780 Choy Oct 2010 B2
7827495 Bells et al. Nov 2010 B2
7847789 Kolmykov-Zotov et al. Dec 2010 B2
7853877 Giesen et al. Dec 2010 B2
D631043 Kell Jan 2011 S
7898529 Fitzmaurice Mar 2011 B2
7941758 Tremblay May 2011 B2
7956847 Christie Jun 2011 B2
8018440 Townsend et al. Sep 2011 B2
8019843 Cash et al. Sep 2011 B2
8040142 Bokma et al. Oct 2011 B1
8089482 Axelrod Jan 2012 B1
8102858 Rahim et al. Jan 2012 B1
8122384 Partridge et al. Feb 2012 B2
8136027 Underwood et al. Mar 2012 B2
8169418 Birkler May 2012 B2
8181122 Davidson May 2012 B2
8212788 Lam Jul 2012 B2
8219930 Johns Jul 2012 B2
8239784 Hotelling et al. Aug 2012 B2
8239785 Hinckley Aug 2012 B2
8239882 Dhanjal et al. Aug 2012 B2
8261213 Hinckley Sep 2012 B2
8274482 Kim et al. Sep 2012 B2
8284170 Bernstein Oct 2012 B2
8294669 Partridge et al. Oct 2012 B2
8294686 Townsend et al. Oct 2012 B2
8289289 Rimon et al. Nov 2012 B2
8327295 Ikeda et al. Dec 2012 B2
8335996 Davidson et al. Dec 2012 B2
8345008 Lee et al. Jan 2013 B2
8373660 Paliakoff Feb 2013 B2
8395600 Kawashima et al. Mar 2013 B2
8407606 Davidson et al. Mar 2013 B1
8418165 Hoff et al. Apr 2013 B2
8473870 Hinckley et al. Jun 2013 B2
8477114 Miller et al. Jul 2013 B2
8539384 Hinckley et al. Sep 2013 B2
8578294 Eom et al. Nov 2013 B2
8581864 Miyazawa et al. Nov 2013 B2
8587526 Engelhardt et al. Nov 2013 B2
8640047 Mouton et al. Jan 2014 B2
8643628 Eriksson et al. Feb 2014 B1
8659570 Townsend et al. Feb 2014 B2
8671343 Oberstein Mar 2014 B2
8707174 Hinckley et al. Apr 2014 B2
8751970 Hinckley et al. Jun 2014 B2
8788967 Davidson et al. Jul 2014 B2
8799827 Hinckley et al. Aug 2014 B2
8810533 Chen Aug 2014 B2
8836648 Wilairat Sep 2014 B2
8836659 Chen et al. Sep 2014 B2
9021398 Kotler et al. Apr 2015 B2
9047009 King Jun 2015 B2
9075522 Hinckley et al. Jul 2015 B2
9116602 Kotler et al. Aug 2015 B2
9256342 Davidson et al. Feb 2016 B2
9261964 Townsend et al. Feb 2016 B2
9274682 Hinckley et al. Mar 2016 B2
9310994 Hinckley et al. Apr 2016 B2
9418346 Lehtiniemi et al. Aug 2016 B2
9965165 Hinckley et al. May 2018 B2
20010012000 Eberhard Aug 2001 A1
20010035860 Segal et al. Nov 2001 A1
20010047263 Smith et al. Nov 2001 A1
20020060701 Naughton et al. May 2002 A1
20020097229 Rose et al. Jul 2002 A1
20020101457 Lang Aug 2002 A1
20020116421 Fox et al. Aug 2002 A1
20020156870 Boroumand et al. Oct 2002 A1
20020198906 Press Dec 2002 A1
20030016253 Aoki et al. Jan 2003 A1
20030067451 Tagg et al. Apr 2003 A1
20030095095 Pihlaja May 2003 A1
20030098858 Perski et al. May 2003 A1
20030142081 Iizuka Jul 2003 A1
20030179541 Sullivan Sep 2003 A1
20030184585 Lin et al. Oct 2003 A1
20030231219 Leung Dec 2003 A1
20040001048 Kraus et al. Jan 2004 A1
20040113941 Sliwa et al. Jun 2004 A1
20040155871 Perski et al. Aug 2004 A1
20040178994 Kairls, Jr. Sep 2004 A1
20040212601 Cake et al. Oct 2004 A1
20040236741 Burstrom et al. Nov 2004 A1
20040236774 Baird et al. Nov 2004 A1
20040255254 Weingart et al. Dec 2004 A1
20050012723 Pallakoff Jan 2005 A1
20050017957 Yi Jan 2005 A1
20050017959 Kraus et al. Jan 2005 A1
20050052432 Kraus et al. Mar 2005 A1
20050076300 Martinez Apr 2005 A1
20050101864 Zheng et al. May 2005 A1
20050129314 Chen Jun 2005 A1
20050162402 Watanachote Jul 2005 A1
20050177796 Takahashi Aug 2005 A1
20050183040 Kondo et al. Aug 2005 A1
20050184973 Lum et al. Aug 2005 A1
20050189154 Perski et al. Sep 2005 A1
20050198592 Keely et al. Sep 2005 A1
20050216834 Gu Sep 2005 A1
20050285965 Zimmer et al. Dec 2005 A1
20060001650 Robbins et al. Jan 2006 A1
20060010371 Shur et al. Jan 2006 A1
20060012580 Perski et al. Jan 2006 A1
20060012581 Haim et al. Jan 2006 A1
20060022955 Kennedy Feb 2006 A1
20060026521 Hotelling et al. Feb 2006 A1
20060026535 Hotelling et al. Feb 2006 A1
20060026536 Hotelling et al. Feb 2006 A1
20060031786 Hillis et al. Feb 2006 A1
20060036959 Heatherly et al. Feb 2006 A1
20060071912 Hill Apr 2006 A1
20060073814 Allen et al. Apr 2006 A1
20060092177 Blasko May 2006 A1
20060093219 Gounares et al. May 2006 A1
20060101354 Hashimoto et al. May 2006 A1
20060109252 Kolmykov-Zotov et al. May 2006 A1
20060112335 Hofmeister et al. May 2006 A1
20060161870 Hotelling et al. Jul 2006 A1
20060197750 Kerr et al. Sep 2006 A1
20060197753 Hoteliing Sep 2006 A1
20060197963 Royal et al. Sep 2006 A1
20060238517 King Oct 2006 A1
20060238520 Westerman et al. Oct 2006 A1
20060262105 Smith et al. Nov 2006 A1
20060262188 Elyada et al. Nov 2006 A1
20060267955 Hino Nov 2006 A1
20060284852 Hofmeister Dec 2006 A1
20070043744 Carro Feb 2007 A1
20070055936 Dhanjal et al. Mar 2007 A1
20070063987 Sato et al. Mar 2007 A1
20070075976 Kun et al. Apr 2007 A1
20070078735 Wan et al. Apr 2007 A1
20070092243 Allen et al. Apr 2007 A1
20070097096 Rosenberg May 2007 A1
20070106939 Qassoudi May 2007 A1
20070109274 Reynolds May 2007 A1
20070120762 O'Gorman May 2007 A1
20070124677 de los reyes et al. May 2007 A1
20070146337 Ording et al. Jun 2007 A1
20070146347 Rosenberg Jun 2007 A1
20070150496 Feinsmith Jun 2007 A1
20070152976 Townsend et al. Jul 2007 A1
20070168890 Zhao et al. Jul 2007 A1
20070171211 Perski et al. Jul 2007 A1
20070236468 Tuli Oct 2007 A1
20070240057 Satterfield Oct 2007 A1
20070242056 Engelhardt et al. Oct 2007 A1
20070250786 Jeon et al. Oct 2007 A1
20070256029 Maxwell Nov 2007 A1
20070262951 Huie et al. Nov 2007 A1
20070271528 Park et al. Nov 2007 A1
20080001924 de los Reyes et al. Jan 2008 A1
20080005703 Radivojevic et al. Jan 2008 A1
20080036743 Westerman Feb 2008 A1
20080040692 Sunday et al. Feb 2008 A1
20080042978 Perez-Noguera Feb 2008 A1
20080046425 Perski Feb 2008 A1
20080052945 Matas et al. Mar 2008 A1
20080059504 Barbetta et al. Mar 2008 A1
20080059914 Allyn et al. Mar 2008 A1
20080062141 Chandhri Mar 2008 A1
20080065720 Brodersen et al. Mar 2008 A1
20080074402 Cornish et al. Mar 2008 A1
20080082903 McCurdy et al. Apr 2008 A1
20080084400 Rosenberg Apr 2008 A1
20080143681 Xiaoping Jun 2008 A1
20080164982 Andrews et al. Jul 2008 A1
20080165141 Christie Jul 2008 A1
20080165255 Christie et al. Jul 2008 A1
20080168382 Louch et al. Jul 2008 A1
20080168396 Matas et al. Jul 2008 A1
20080168403 Westerman et al. Jul 2008 A1
20080180404 Han et al. Jul 2008 A1
20080211766 Westerman et al. Sep 2008 A1
20080211778 Ording et al. Sep 2008 A1
20080218494 Perski et al. Sep 2008 A1
20080228853 Brinck et al. Sep 2008 A1
20080229192 Gear et al. Sep 2008 A1
20080249682 Wisniewski et al. Oct 2008 A1
20080278455 Atkins et al. Nov 2008 A1
20080303798 Matsudate et al. Dec 2008 A1
20090007015 Mandic et al. Jan 2009 A1
20090019188 Mattice et al. Jan 2009 A1
20090033632 Szolyga et al. Feb 2009 A1
20090037813 Newman et al. Feb 2009 A1
20090054107 Feland, III et al. Feb 2009 A1
20090058830 Herz Mar 2009 A1
20090059730 Lyons et al. Mar 2009 A1
20090064012 Tremblay Mar 2009 A1
20090074255 Holm Mar 2009 A1
20090077501 Partridge et al. Mar 2009 A1
20090079699 Sun Mar 2009 A1
20090083665 Anttila et al. Mar 2009 A1
20090094562 Jeong et al. Apr 2009 A1
20090096758 Hotelling et al. Apr 2009 A1
20090117943 Lee et al. May 2009 A1
20090128505 Partridge et al. May 2009 A1
20090138830 Borgaonkar et al. May 2009 A1
20090143141 Wells et al. Jun 2009 A1
20090153289 Hope et al. Jun 2009 A1
20090153438 Miller et al. Jun 2009 A1
20090158134 Wang et al. Jun 2009 A1
20090167696 Griffin Jul 2009 A1
20090167702 Nurmi Jul 2009 A1
20090174679 Westerman Jul 2009 A1
20090184921 Scott et al. Jul 2009 A1
20090187860 Fleck et al. Jul 2009 A1
20090193366 Davidson Jul 2009 A1
20090213136 Desjardins et al. Aug 2009 A1
20090217211 Hildreth et al. Aug 2009 A1
20090231356 Barnes et al. Sep 2009 A1
20090249236 Westerman et al. Oct 2009 A1
20090249247 Tseng et al. Oct 2009 A1
20090251432 Wang et al. Oct 2009 A1
20090251434 Rimon et al. Oct 2009 A1
20090256857 Davidson et al. Oct 2009 A1
20090276701 Nurmi Nov 2009 A1
20090278806 Duarte et al. Nov 2009 A1
20090282332 Porat Nov 2009 A1
20090284478 De la Torre Baltierra Nov 2009 A1
20090284488 Sip Nov 2009 A1
20090288044 Matthews et al. Nov 2009 A1
20090292989 Matthews et al. Nov 2009 A1
20090295753 King et al. Dec 2009 A1
20090307589 Inose et al. Dec 2009 A1
20090309846 Trachtenberg et al. Dec 2009 A1
20090319893 Pihlaja Dec 2009 A1
20090320070 Inoguchi Dec 2009 A1
20090327963 Mouilleseaux et al. Dec 2009 A1
20090327964 Mouilleseaux et al. Dec 2009 A1
20090327975 Stedman Dec 2009 A1
20100013768 Leung Jan 2010 A1
20100013792 Fukushima Jan 2010 A1
20100016049 Shirakawa et al. Jan 2010 A1
20100020025 Lemort et al. Jan 2010 A1
20100039392 Pratt et al. Feb 2010 A1
20100042827 Pratt et al. Feb 2010 A1
20100045705 Vertegaal et al. Feb 2010 A1
20100050076 Roth Feb 2010 A1
20100051355 Yang Mar 2010 A1
20100053103 No et al. Mar 2010 A1
20100053861 Kim et al. Mar 2010 A1
20100058182 Jung Mar 2010 A1
20100060607 Ludwig Mar 2010 A1
20100066667 MacDougall et al. Mar 2010 A1
20100066694 Jonsdottir Mar 2010 A1
20100066698 Seo Mar 2010 A1
20100079392 Chiang et al. Apr 2010 A1
20100081475 Chiang et al. Apr 2010 A1
20100083154 Takeshita Apr 2010 A1
20100083190 Roberts et al. Apr 2010 A1
20100088641 Choi Apr 2010 A1
20100090971 Choi et al. Apr 2010 A1
20100097338 Miyashita et al. Apr 2010 A1
20100103136 Ono et al. Apr 2010 A1
20100105443 Vaisanen Apr 2010 A1
20100107067 Vaisanen Apr 2010 A1
20100110019 Ozias et al. May 2010 A1
20100115455 Kim May 2010 A1
20100123675 Ippel May 2010 A1
20100134415 Iwase et al. Jun 2010 A1
20100134424 Brisebois et al. Jun 2010 A1
20100137027 Kim Jun 2010 A1
20100149109 Elias Jun 2010 A1
20100164878 Bestle et al. Jul 2010 A1
20100164959 Brown et al. Jul 2010 A1
20100169813 Chang Jul 2010 A1
20100182247 Petschnigg et al. Jul 2010 A1
20100182264 Hahn et al. Jul 2010 A1
20100185983 Szoczei et al. Jul 2010 A1
20100188371 Lowles et al. Jul 2010 A1
20100192102 Chmielewski et al. Jul 2010 A1
20100201634 Coddington Aug 2010 A1
20100213040 Yeh et al. Aug 2010 A1
20100217428 Strong et al. Aug 2010 A1
20100220900 Orsley Sep 2010 A1
20100241973 Whiddett Sep 2010 A1
20100245242 Wu et al. Sep 2010 A1
20100245263 Parada, Jr. et al. Sep 2010 A1
20100251112 Hinckley et al. Sep 2010 A1
20100251189 Jaeger Sep 2010 A1
20100262928 Abbott Oct 2010 A1
20100283748 Hsieh et al. Nov 2010 A1
20100295795 Wilairat Nov 2010 A1
20100295797 Nicholson et al. Nov 2010 A1
20100295799 Nicholson et al. Nov 2010 A1
20100295817 Nicholson et al. Nov 2010 A1
20100299592 Zalewski et al. Nov 2010 A1
20100299595 Zalewski et al. Nov 2010 A1
20100299596 Zalewski et al. Nov 2010 A1
20100299597 Shin et al. Nov 2010 A1
20100302172 Wilairat Dec 2010 A1
20100302712 Wilairat Dec 2010 A1
20100306702 Warner Dec 2010 A1
20100313124 Privault et al. Dec 2010 A1
20100321326 Grunthaner et al. Dec 2010 A1
20110012841 Lin Jan 2011 A1
20110013203 Grosz et al. Jan 2011 A1
20110016431 Grosz et al. Jan 2011 A1
20110018821 Kii Jan 2011 A1
20110041096 Larco et al. Feb 2011 A1
20110043472 Hada Feb 2011 A1
20110050594 Kim et al. Mar 2011 A1
20110055729 Mason et al. Mar 2011 A1
20110055753 Horodezky et al. Mar 2011 A1
20110055760 Drayton et al. Mar 2011 A1
20110066981 Chmielewski et al. Mar 2011 A1
20110072036 Agsen et al. Mar 2011 A1
20110093815 Gobeil Apr 2011 A1
20110107220 Perlman May 2011 A1
20110115735 Lev et al. May 2011 A1
20110117526 Wigdor et al. May 2011 A1
20110126094 Horodezky et al. May 2011 A1
20110143769 Jones et al. Jun 2011 A1
20110157046 Lee et al. Jun 2011 A1
20110159915 Yano et al. Jun 2011 A1
20110167092 Subramaniam et al. Jul 2011 A1
20110167336 Aitken et al. Jul 2011 A1
20110167391 Momeyer et al. Jul 2011 A1
20110169749 Ganey et al. Jul 2011 A1
20110181524 Hinckley Jul 2011 A1
20110185299 Hinckley Jul 2011 A1
20110185300 Hinckley Jul 2011 A1
20110185318 Hinckley Jul 2011 A1
20110185320 Hinckley Jul 2011 A1
20110191704 Hinckley Aug 2011 A1
20110191718 Hinckley Aug 2011 A1
20110191719 Hinckley Aug 2011 A1
20110199386 Dharwada et al. Aug 2011 A1
20110202879 Stovicek et al. Aug 2011 A1
20110205163 Hinckley Aug 2011 A1
20110209039 Hinckley Aug 2011 A1
20110209057 Hinckley Aug 2011 A1
20110209058 Hinckley Aug 2011 A1
20110209088 Hinckley Aug 2011 A1
20110209089 Hinckley Aug 2011 A1
20110209097 Hinckley Aug 2011 A1
20110209098 Hinckley Aug 2011 A1
20110209099 Hinckley Aug 2011 A1
20110209100 Hinckley Aug 2011 A1
20110209101 Hinckley Aug 2011 A1
20110209102 Hinckley Aug 2011 A1
20110209103 Hinckley Aug 2011 A1
20110209104 Hinckley Aug 2011 A1
20110231796 Vigil Sep 2011 A1
20110242039 Kalis et al. Oct 2011 A1
20110242138 Tribble Oct 2011 A1
20110246943 Fujibayashi Oct 2011 A1
20110248928 Michaelraj Oct 2011 A1
20110252334 Verma et al. Oct 2011 A1
20110252344 Van Os Oct 2011 A1
20110261058 Luo Oct 2011 A1
20110291948 Stewart et al. Dec 2011 A1
20110291964 Chambers et al. Dec 2011 A1
20110310459 Gates et al. Dec 2011 A1
20120001861 Townsend et al. Jan 2012 A1
20120036434 Oberstein Feb 2012 A1
20120036480 Warner et al. Feb 2012 A1
20120042006 Kiley et al. Feb 2012 A1
20120075194 Ferren Mar 2012 A1
20120084705 Lee et al. Apr 2012 A1
20120096411 Nash Apr 2012 A1
20120113007 Koch et al. May 2012 A1
20120131454 Shah May 2012 A1
20120154303 Lazaridis et al. Jun 2012 A1
20120158629 Hinckley et al. Jun 2012 A1
20120212445 Heikkinen et al. Aug 2012 A1
20120218282 Choboter et al. Aug 2012 A1
20120236026 Hinckley Sep 2012 A1
20120242590 Baccichet et al. Sep 2012 A1
20120272144 Radakovitz et al. Oct 2012 A1
20120262407 Hinckley et al. Nov 2012 A1
20120287076 Dao et al. Nov 2012 A1
20120304133 Nan et al. Nov 2012 A1
20120306788 Chen et al. Dec 2012 A1
20120311476 Campbell Dec 2012 A1
20120324384 Cohen et al. Dec 2012 A1
20130019203 Kotler et al. Jan 2013 A1
20130019204 Kotler et al. Jan 2013 A1
20130019206 Kotler et al. Jan 2013 A1
20130019208 Kotler et al. Jan 2013 A1
20130038564 Ho Feb 2013 A1
20130044070 Townsend et al. Feb 2013 A1
20130047123 May et al. Feb 2013 A1
20130063891 Martisauskas Mar 2013 A1
20130088434 Masuda et al. Apr 2013 A1
20130093691 Moosavi Apr 2013 A1
20130117715 Williams et al. May 2013 A1
20130154999 Guard Jun 2013 A1
20130181902 Hinckley Jul 2013 A1
20130212503 Zalewski et al. Aug 2013 A1
20130212504 Zalewski et al. Aug 2013 A1
20130222287 Bae et al. Aug 2013 A1
20130265269 Sharma et al. Oct 2013 A1
20130271447 Setlur et al. Oct 2013 A1
20130275914 Zhuo Oct 2013 A1
20130300668 Churikov Nov 2013 A1
20130335453 Lim et al. Dec 2013 A1
20140022183 Ayoub et al. Jan 2014 A1
20140033134 Pimmel et al. Jan 2014 A1
20140043265 Chang et al. Feb 2014 A1
20140043277 Saukko et al. Feb 2014 A1
20140092041 Ih Apr 2014 A1
20140111462 Townsend et al. Apr 2014 A1
20140132551 Bathiche May 2014 A1
20140192019 Fukushima Jul 2014 A1
20140195957 Bang Jul 2014 A1
20140289668 Mavrody Sep 2014 A1
20140293145 Jones et al. Oct 2014 A1
20140337791 Agnetta et al. Nov 2014 A1
20150042588 Park Feb 2015 A1
20150145797 Corrion May 2015 A1
20150160849 Weiss et al. Jun 2015 A1
20150261362 King Sep 2015 A1
20150261364 Cady et al. Sep 2015 A1
20160370958 Tsuji et al. Dec 2016 A1
20160378318 Tsuju et al. Dec 2016 A1
Foreign Referenced Citations (57)
Number Date Country
1326564 Dec 2001 CN
1578430 Feb 2005 CN
1704888 Dec 2005 CN
1766824 May 2006 CN
1936799 Jan 2007 CN
101198925 Jun 2008 CN
101263443 Sep 2008 CN
201181467 Jan 2009 CN
101404687 Apr 2009 CN
101410781 Apr 2009 CN
101432677 May 2009 CN
01496404 Jul 2009 CN
101482790 Jul 2009 CN
201298220 Aug 2009 CN
101551728 Oct 2009 CN
101566865 Oct 2009 CN
101576789 Nov 2009 CN
101609383 Dec 2009 CN
101627361 Jan 2010 CN
101636711 Jan 2010 CN
1505483 Feb 2005 EP
1942401 Jul 2008 EP
2081107 Jul 2009 EP
2148268 Jan 2010 EP
2466442 Jun 2012 EP
2560088 Feb 2013 EP
6282368 Oct 1994 JP
7281810 Oct 1995 JP
2000163031 Jun 2000 JP
2001265523 Sep 2001 JP
2001290585 Oct 2001 JP
2002055753 Feb 2002 JP
2003195998 Jul 2003 JP
2005004690 Jan 2005 JP
2005026834 Jan 2005 JP
2005122271 May 2005 JP
2005149279 Jun 2005 JP
2005267034 Sep 2005 JP
2007240964 Sep 2007 JP
3143462 Jul 2008 JP
2008532185 Aug 2008 JP
2008217742 Sep 2008 JP
2008305087 Dec 2008 JP
2009097724 Apr 2009 JP
2010019643 Jan 2010 JP
2010026834 Feb 2010 JP
2010250465 Nov 2010 JP
20090013927 Feb 2009 KR
1020090088501 Aug 2009 KR
20090106755 Oct 2009 KR
200921478 May 2009 TW
200947297 Aug 2009 TW
200951783 Dec 2009 TW
9928812 Jan 1999 WO
2009086628 Jul 2009 WO
2011106467 Sep 2011 WO
2011106468 Sep 2011 WO
Non-Patent Literature Citations (326)
Entry
Chinese Patent Office, Author unknown, CN Notice on Grant of Patent Right for Invention for Application No. 201110046519.X, pp. 1-2, dated Aug. 2, 2016, China.
CN Notice on the First Office Action for Application No. 201180009635.2, dated Jul. 28, 2014.
CN Notice on the Third Office Action for Application No. 20111 0046519.X, dated Sep. 21, 2015.
CN Notice on Reexamination for Application No. 201110044285.5, dated Jul. 23, 2014.
CN Notice on Reexamination for Application No. 201110044285.5, dated Dec. 22, 2014.
CN Notice on the Third Office Action for Application No. 201180009579.2, dated Sep. 6, 2015.
Hotelling, “Multi-functional hand-held device”, U.S. Appl. No. 60/658,777, filed Mar. 4, 2005.
Hotelling, “Multi-functional hand-held device”, U.S. Appl. No. 60/663,345, filed Mar. 16, 2005.
“Supplemental Notice of Allowance”, U.S. Appl. No. 12/713,133, dated Feb. 3, 2014, 2 pages.
Fonseca, “New Apple Patent Hints at Touch Enabled Bezels for Future Devices”, Jul. 3, 2013, 6 Pages.
Kim, et al., “Hand Grip Pattern Recognition for Mobile User Interfaces”, Interaction Lab / Samsung Advanced Institute of Technology, 2006, pp. 1789-1794.
“Ex Parte Mewherter, PTAB precedential decision”, U.S. Appl. No. 10/685,192, May 8, 2013, 22 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,204, dated Nov. 20, 2013, 31 pages.
“Notice of Allowance”, U.S. Appl. No. 12/713,133, dated Jan. 17, 2014, 4 pages.
“Foreign Office Action”, Chinese Application No. 201110050852.8,dated Nov. 1, 2013, 8 Pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,133, dated Dec. 10, 2013, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,348, dated Dec. 20, 2013, 9 pages.
“UI Guidelines”, Retrieved from: http://na.blackberry.com/eng/deliverables/6622/BlackBerry Smartphones-US.pdf., 76 Pages, Oct. 22, 2009.
Nordgren, Peder “Development of a Touch Screen Interface for Scania Interactor”, Master's Thesis in Computing Science, UMEA University, Available at <http://www.cs.umu.se/education/examina/Rapporter/PederNordgren.pdf>, (Apr. 10, 2007), pp. 1-59.
Vallerio, Keith S., et al., “Energy-Efficient Graphical User Interface Design”, Retrieved from: <http://www.cc.gatech.edu/classes/AY2007/cs7470_fall/zhong-energy-efficient-user-interface.pdf>, (Jun. 10, 2004), pp. 1-13.
“Decision on Reexamination”, CN Application No. 201110044285.5, dated Mar. 26, 2015, 14 Pages.
“Decision on Reexamination”, CN Application No. 201110046519.x, dated May 28, 2015, 9 Pages.
“Final Office Action”, U.S. Appl. No. 12/695,976, dated Aug. 5, 2015, 12 pages.
“Final Office Action”, U.S. Appl. No. 12/713,113, dated Aug. 5, 2015, 26 pages.
“Final Office Action”, U.S. Appl. No. 12/713,127, dated Jul. 31, 2015,19 pages.
Roth, Volker et al. “Bezel Swipe: Conflict Free Scrolling and Multiple Selection on Mobile touch Screen Devices” (Apr. 2009); pp. 1-4.
“Final Office Action”, U.S. Appl. No. 13/484,075, dated Jul. 16, 2015, 10 pages.
“Foreign Notice of Allowance”, CN Application No. 201180010769.6, dated Apr. 30, 2015, 4 Pages.
“Foreign Office Action”, CN Application No. 201180007100.1, dated May 15, 2015, 20 Pages.
“Foreign Office Action”, CN Application No. 201180009579.2, dated Apr. 21, 2015, 16 Pages.
“Foreign Office Action”, JP Application No. 2012-554008, dated Jun. 25, 2015,13 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2015/019811, dated Jul. 8, 2015, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2014/067804, dated Jul. 24, 2015, 19 Pages.
“Non-Final Office Action”, U.S. Appl. No. 13/674,357, dated Jun. 4, 2015,10 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/099,798, dated Jun. 9, 2015,15 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/212,916, dated Aug. 7, 2015,10 pages.
“Search Report”, TW Application No. 099142890, dated Jun. 30, 2015, 1 page.
“Advisory Action”, U.S. Appl. No. 12/695,842, dated May 12, 2015, 3 pages.
“Final Office Action”, U.S. Appl. No. 12/695,937, dated Apr. 2, 2015, 14 pages.
“Final Office Action”, U.S. Appl. No. 13/898,452, dated Mar. 27, 2015, 23 pages.
“Foreign Notice of Allowance”, CN Application No. 201110046510.9, dated Feb. 12, 2015, 6 Pages.
“Foreign Notice of Allowance”, JP Application No. 2012-555062, dated Mar. 3, 2015, 4 Pages.
“Foreign Office Action”, CN Application No. 201180010692.2, dated Mar. 10, 2015, 9 Pages.
“Non-Final Office Action”, U.S. Appl. No. 12/700,357, dated Apr. 2, 2015, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,204, dated May 7, 2015, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,127, dated Mar. 26, 2015, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/484,075, dated Apr. 29, 2015, 8 pages.
“Final Office Action”, U.S. Appl. No. 12/695,842, dated Feb. 12, 2015, 20 pages.
“Final Office Action”, U.S. Appl. No. 12/700,357,dated Nov. 20, 2014, 12 pages.
“Final Office Action”, U.S. Appl. No. 12/700,510, dated Feb. 3, 2015, 28 pages.
“Final Office Action”, U.S. Appl. No. 12/709,204, dated Jan. 12, 2015, 29 pages.
“Final Office Action”, U.S. Appl. No. 13/484,075, dated Feb. 4, 2015, 12 pages.
“Final Office Action”, U.S. Appl. No. 13/674,357, dated Jan. 29, 2015, 10 pages.
“Final Office Action”, U.S. Appl. No. 14/145,204, dated Nov. 12, 2014, 10 pages.
“Foreign Notice of Allowance”, CN Application No. 20111 0050506.X, dated Nov. 2, 2014, 4 Pages.
“Foreign Office Action”, CN Application No. 201180011020.3, dated Jan. 15, 2015, 9 Pages.
“Foreign Office Action”, CN Application No. 201180011039.8, dated Feb. 17, 2015, 17 Pages.
“Foreign Office Action”, JP Application No. 2012-554008, dated Nov. 25, 2014, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/695,976, dated Mar. 25, 2015, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,282, dated Jan. 29, 2015, 22 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,113, dated Feb. 12, 2015, 24 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,118, dated Jan. 29, 2015, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/145,204, dated Feb. 24, 2015, 9 Pages.
“Notice of Allowance”, U.S. Appl. No. 12/709,245, dated Jan. 30, 2015, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 12/713,096, dated Jan. 9, 2015, 14 pages.
“Notice of Allowance”, U.S. Appl. No. 12/713,118, dated Mar. 5, 2015, 4 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 12/713,118, dated Mar. 19, 2015, 2 pages.
“Final Office Action”, U.S. Appl. No. 12/695,937, dated Nov. 10, 2014, 13 pages.
“Final Office Action”, U.S. Appl. No. 12/709,245, dated Nov. 14, 2014, 6 pages.
“Foreign Office Action”, CN Application No. 201180007100.1, dated Sep. 10, 2014, 22 pages.
“Foreign Office Action”, CN Application No. 201180009579.2, dated Nov. 4, 2014, 16 pages.
“Foreign Office Action”, CN Application No. 201180010692.2, dated Jun. 26, 2014, 13 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 12/713,096, dated Nov. 4, 2014, 2 pages.
“Final Office Action”, U.S. Appl. No. 12/713,113, dated Oct. 8, 2014, 12 pages.
“Foreign Office Action”, CN Application No. 201110046510.9, dated Jul. 25, 2014, 11 Pages.
“Foreign Office Action”, CN Application No. 201180010769.6, dated Sep. 3, 2014, 12 Pages.
“Foreign Office Action”, CN Application No. 201180011039.8, dated Jun. 5, 2014, 16 Pages.
“Non-Final Office Action”, U.S. Appl. No. 13/898,452, dated Sep. 26, 2014, 21 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,245, dated Mar. 20, 2014, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,113, dated Jun. 4, 2014, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/484,075, dated Sep. 5, 2014, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/674,357, dated Aug. 4, 2014, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 13/230,700, dated May 15, 2012, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 13/230,700, dated Jun. 21, 2012, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 13/657,662, dated Oct. 11, 2013, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 12/472,699, dated May 2, 2014, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 12/709,376, dated Mar. 17, 2014, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 12/713,096, dated Aug. 29, 2014, 14 pages.
Roudaut, et al., “Leaf Menus: Linear Menus with Stroke Shortcuts for Small Handheld Devices”, Proceedings of the 12th IFIP TC 13 International Conference on Human-Computer Interaction: Part I, Aug. 2009, 4 pages.
Roth, Volker et al., “Bezel Swipe: Conflict-Free Scrolling and Multiple Selection on Mobile Touch Screen Devices”, In 27th International Conference on Human Factors in Computing Systems, Retrieved from <http://www.volkerroth.com/download/Roth2009a.pdf>, (Apr. 4, 2009), 4 pages.
Saini, Kalpana et al., “Designing of a Virtual System with Fingerprint Security by considering many Security Threats”, International Journal of Computer Applications, vol. 3—No. 2, available at <http://www.ijcaonline.org/volume3/number2/pxc387995.pdf>,(Jun. 2010), pp. 25-31.
Sajid, Uzair “Microsoft Patent a Futuristic Virtual Multitouch Keyboard”, Retrieved from <http://thetechnopath.com/microsoft-patent-futu ristic-virtual-multitouchkeyboard/857/> on Mar. 6, 2013 (Sep. 27, 2009),8 pages.
Sax, et al., “LiquidKeyboard: An Ergonomic, Adaptive QWERTY Keyboard for Touchscreens”, Proceedings of Fifth International Conference on Digital Society, (Feb. 23, 2011), pp. 117-122.
Serrano, et al., “Bezel-Tap Gestures: Quick Activation of Commands from Sleep Mode on Tablets”, n Proceedings of the SIGCHI Conference on Human Factors in Computinq Systems (Apr. 27, 2013),10 pages.
T., Nick “Smartphone displays need a bezel. Here's why”, Retrieved from <http://www.phonearena.com/news/Smartphone-displays-need-a-bezel.-Hereswhyid27670> on Aug. 29, 2012 (Mar. 12, 2012),4 pages.
“3M TouchWare TM Software for Windows User Guide”, In White Paper of 3M Touch Systems Aug. 9, 2013, 65 pages.
“Final Office Action”, U.S. Appl. No. 12/695,976, dated Jul. 23, 2014, 12 pages.
“Final Office Action”, U.S. Appl. No. 12/709,204, dated Apr. 11, 2014, 24 pages.
“Final Office Action”, U.S. Appl. No. 12/709,282, dated May 9, 2014, 17 pages.
“Final Office Action”, U.S. Appl. No. 12/713,127, dated Aug. 14, 2014, 15 pages.
“Final Office Action”, U.S. Appl. No. 13/898,452, dated Jun. 9, 2014, 26 pages.
“Foreign Office Action”, CN Application No. 201180011020.3, dated May 4, 2014, 12 Pages.
“Non-Final Office Action”, U.S. Appl. No. 13/657,662, dated Apr. 5, 2013, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/145,204, dated Feb. 5, 2014, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/695,842, dated Aug. 18, 2014, 21 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/695,937, dated May 7, 2014, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/700,357, dated Jun. 26, 2014, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/700,510, dated Jun. 12, 2014, 26 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,204, dated Aug. 13, 2014, 25 pages.
“Foreign Office Action”, CN Application No. 201110046510.9, dated Feb. 12, 2014, 9 Pages.
“Foreign Office Action”, CN Application No. 20111 0050506.X, dated Feb. 26, 2014, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,096, dated Jan. 30, 2014, 23 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,127, dated Jan. 31, 2014, 21 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/898,452, dated Feb. 24, 2014, 24 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,282, dated Oct. 10, 2013, 12 pages.
“Notice of Allowance”, U.S. Appl. No. 12/713,110, dated Dec. 4, 2013, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/472,699, dated Oct. 23, 2013, 14 pages.
“Final Office Action”, U.S. Appl. No. 12/695,842, dated Dec. 2, 2013, 17 pages.
“Advisory Action”, U.S. Appl. No. 12/709,376, dated Dec. 19, 2013, 2 pages.
“3M Touch Systems, Inc. Announces Shipment of Dispersive Signal Technology Product”, Datasheet, 3M Corporation, (Sep. 6, 2005), 3 pages.
“AccuScribe Touchscreens”, Datasheet Elo TouchSvstem (Aug. 2005), 2 pages.
“Final Office Action”, U.S. Appl. No. 11/324,157 (dated Jun. 24, 2009), 14 pages.
“Final Office Action”, U.S. Appl. No. 11/324,157, (dated Oct. 15, 2010), 18 pages.
“Final Office Action”, U.S. Appl. No. 12/472,699, (dated Jul. 29, 2013), 12 pages.
“Final Office Action”, U.S. Appl. No. 12/709,204, (dated Sep. 12, 2013), 24 pages.
“Final Office Action”, U.S. Appl. No. 12/709,282, (dated Jul. 16, 2013), 11 pages.
“Final Office Action”, U.S. Appl. No. 12/709,348, (dated Sep. 12, 2013),12 pages.
“Final Office Action”, U.S. Appl. No. 12/709,376 (dated Sep. 10, 2013),12 pages.
“Final Office Action”, U.S. Appl. No. 12/713,113, (dated Oct. 8, 2013), 21 pages.
“Foreign Office Action”, Chinese Application No. 201110046519.x, (dated Aug. 6, 2013), 11 pages.
“Foreign Office Action”, Chinese Application No. 201110046529.3, (dated Aug. 6, 2013), 11 pages.
“In touch with new opportunities—Dispersive Signal Technology”, DataSheet, NXT, (2005),1 page.
“Non-Final Office Action”, U.S. Appl. No. 11/324,157, (dated Apr. 28, 2010), 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 11/324,157, (dated Sep. 28, 2009),17 pages.
“Non-Final Office Action”, U.S. Appl. No. 11/324,157 (dated Dec. 11, 2008),12 pages.
“Touch Systems—Innovation Touch Screen Solution”, Retrieved from <http://www.touchsystems.com/article.aspx?id=16> on Aug. 30, 2012 (Aug. 14, 2012),1 page.
“Notice of Allowance”, U.S. Appl. No. 11/324,157 (dated May 9, 2011), 8 pages.
“PCT Search Report and Written Opinion”, Application No. PCT/US2013/020413, (dated Apr. 8, 2013),10 pages.
“Touch Screen is available in .36-50.8 mm thickness”, ThomasNet Industrial News Room, (Jul. 29, 2003), 2 pages.
Boudreaux, Toby “Touch Patterns: Chapter 6—Programming the iPhone User Experience”, Oct. 25, 2011,12 pages.
Findlater, et al., “Personalized Input: Improving Ten-Finger Touchscreen Typing through Automatic Adaptation”, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, (May 5, 2012), 10 pages.
Goel, et al., “GripSense: Using Built-In Sensors to Detect Hand Posture and Pressure on Commodity Mobile Phones”, Proceedings of the 25th Annual ACM Symposium on User Interface Software and Technology, (Oct. 7, 2012), pp. 545-554.
Hinckley, Ken et al., “Sensor Synaesthesia: Touch in Motion, and Motion in Touch”, CHI 2011, May 7-12, 2011, (May 7, 2011), 10 pages.
Hirche, et al., “Adaptive Interface for Text Input on Large-Scale Interactive Surfaces”, 3rd IEEE International Workshop on Horizontal Interactive Human Computer Systems, (Oct. 1, 2008), pp. 153-156.
Lee, Tyler “The TypeWay iPad app is an adaptive on-screen keyboard”, (Feb. 1, 2012), 2 pages.
Maxwell, Kenneth G., “Writing drivers for common touch-screen interface hardware”, Industrial Control Design Line, (Jun. 15, 2005), 9 pages.
Moore, Charles “TypeWay Adaptive Keyboard for iPad Review”, (Feb. 5, 2012), 10 pages.
Panzarino, Matthew “Apple's iPad Mini Should have a Widescreen Display”, Retrieved from <http://thenextweb.com/apple/2012/08/15/what-ipad-mini-169-instead-43/> (Aug. 15, 2012), 6 pages.
“Final Office Action”, U.S. Appl. No. 12/472,699, (dated Feb. 15, 2012),12 pages.
“Foreign Office Action”, Chinese Application 201110044285.5 (dated Apr. 24, 2013), 8 pages.
“Final Office Action”, U.S. Appl. No. 12/709,282, (dated Dec. 24, 2012), 11 pages.
“Final Office Action”, U.S. Appl. No. 12/709,376, (dated Nov. 8, 2012), 20 pages.
“Final Office Action”, U.S. Appl. No. 12/713,110 (dated Jan. 17, 2013), 10 pages.
“Final Office Action”, U.S. Appl. No. 12/713,118, (dated Oct. 26, 2012), 10 pages.
“Foreign Office Action”, Chinese Application No. 201110050499.3, (dated Nov. 27, 2012), 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,053 (dated Nov. 23, 2012), 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,081, (dated Nov. 29, 2012), 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,130 (dated Jan. 16, 2013), 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,133, (dated Jan. 14, 2013), 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/484,075 (dated Jan. 15, 2013), 9 pages.
“Supplementary European Search Report”, European Patent Application No. 11747907.1, (dated Nov. 7, 2012), 3 pages.
“Supplementary European Search Report”, European Patent Application No. 11748028.5, (dated Nov. 7, 2012), 3 pages.
“Supplementary European Search Report”, European Patent Application No. 11748027.7, (dated Nov. 29, 2012), 3 pages.
“Final Office Action”, U.S. Appl. No. 12/695,937, (dated Jul. 26, 2012), 13 pages.
“Final Office Action”, U.S. Appl. No. 12/700,460 (dated Aug. 28, 2012), 26 pages.
“Final Office Action”, U.S. Appl. No. 12/700,510, (dated Oct. 10, 2012), 23 pages.
“Final Office Action”, U.S. Appl. No. 12/709,204 (dated Oct. 3, 2012), 24 pages.
“Final Office Action”, U.S. Appl. No. 12/713,053, (dated Aug. 17, 2012), 10 pages.
“Final Office Action”, U.S. Appl. No. 12/713,130 (dated Jun. 29, 2012), 8 pages.
“Final Office Action”, U.S. Appl. No. 12/709,348, (dated Feb. 17, 2012), 13 pages.
“Final Office Action”, U.S. Appl. No. 12/709,376 (dated Mar. 30, 2012), 16 pages.
“Final Office Action”, U.S. Appl. No. 12/713,081, (dated May 9, 2012), 19 pages.
“Final Office Action”, U.S. Appl. No. 12/713,113 (dated Jun. 4, 2012), 18 pages.
“Final Office Action”, U.S. Appl. No. 12/713,127, (dated Jun. 6, 2012),18 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/695,937, (dated Apr. 25, 2012), 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/700,510, (dated Feb. 7, 2012), 20 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,204, (dated May 10, 2012), 20 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,245, (dated Mar. 21, 2012), 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,282, (dated Apr. 12, 2012), 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,053 (dated Feb. 3, 2012), 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,096, (dated Jun. 6, 2012), 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,110 (dated Jun. 21, 2012), 21 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,118, (dated Jun. 8, 2012), 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,133 (dated Jan. 31, 2012), 7 pages.
“Notice of Allowance”, U.S. Appl. No. 12/695,064, (dated Mar. 28, 2012), 12 pages.
“Notice of Allowance”, U.S. Appl. No. 12/695,959 (dated Apr. 17, 2012), 13 pages.
“PCT Search Report and Written Opinion”, Application No. PCT/US2011/025132, (dated Oct. 26, 2011), 10 pages.
“PCT Search Report and Written Opinion”, Application No. PCT/US2011/025575, (dated Sep. 30, 2011), 14 pages.
Gross, Mark D., “Stretch-A-Sketch: A Dynamic Diagrammer”, IEEE Symposium on Visual Languages, Available at <http://depts.washington.edu/dmachine/PAPERNL94/v1.html>,(Oct. 1994),11 pages.
Hinckley, Ken et al., “Codex: A Dual Screen Tablet Computer”, Conference on Human Factors in Computing Systems, Available at <http://research.microsoft.com/en-us/um/people/kenh/codex-chi-2009-withauthors.pdf>, (2009), 10 pages.
Hinckley, Ken et al., “Stitching: Pen Gestures that Span Multiple Displays”, CHI 2004, Available at <http://www.cs.cornell.edu/~francois/Papers/2004-Hinckley-AVI04-Stitching.>, (2004), pp. 1-8.
Krazit, Tom “Has Apple Found the Magic Touch?”, Retrieved from: <http://news.cnet.com/8301-135793-9879471-37.html> on Nov. 10, 2009, (Feb. 26, 2008), 2 pages.
Minsky, Margaret R., “Manipulating Simulated Objects with Real-world Gestures using a Force and Position Sensitive Screen”, Computer Graphics, vol. 18, No. 3, Available at <http://delivery.acm.org/10.1145/810000/808598/p195-minsky.pdf?key1=808598&key2=2244955521&coll=GUIDE&dl=GUIDE&CFID=57828830&CFTOKEN=43421964>, (Jul. 1984), pp. 195-203.
Olwal, Alex et al., “Rubbing and Tapping for Precise and Rapid Selection on Touch-Screen Displays”, Conference on Human Factors in Computing Systems, Available at <http://www.csc.kth.se/alx/projects/research/rubbing/olwal_rubbing_tapping_chi2008.pdf>, (Apr. 2008), 10 pages.
Yee, Ka-Ping “Two-Handed Interaction on a Tablet Display”, Retrieved from: <http://zesty.ca/tht/yee-tht-chi2004-short.pdf>, Conference on Human Factors in Computing Systems,(Apr. 2004),4 pages.
“Final Office Action”, U.S. Appl. No. 12/709,245 (dated Jan. 6, 2012), 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/700,460, (dated Jan. 13, 2012), 21 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,376 (dated Jan. 23, 2012), 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,081, (dated Dec. 23, 2011), 18 pages.
“Foreign Office Action”, Chinese Application 201110044285.5, (dated Jun. 20, 2012),12 pages.
“Foreign Office Action”, Chinese Application No. 201110044285.5, (dated Jan. 4, 2013), 13 pages.
“Foreign Office Action”, Chinese Application No. 201110046510.9, (dated May 31, 2013), 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/472,699 (dated Mar. 28, 2013),10 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/472,699, (dated Sep. 12, 2011),12 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,204 (dated Jun. 6, 2013), 27 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,096, (dated Jun. 26, 2013), 8 pages.
“Notice of Allowance”, U.S. Appl. No. 12/713,053 (dated Jun. 7, 2013), 7 pages.
“Final Office Action”, U.S. Appl. No. 12/713,133, (dated May 20, 2013),10 pages.
“Final Office Action”, U.S. Appl. No. 13/484,075 (dated May 21, 2013),10 pages.
“Foreign Office Action”, Chinese Application No. 201110046519.x, (dated Mar. 19, 2013), 11 pages.
“Foreign Office Action”, Chinese Application No. 201110050506.x, (dated Apr. 2, 2013), 11 pages.
“Foreign Office Action”, Chinese Application No. 201110050508.9, (dated Mar. 7, 2013), 8 pages.
“Foreign Office Action”, Chinese Application No. 201110050852.8, (dated Mar. 26, 2013), 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/695,842, (dated May 22, 2013), 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,245 (dated May 30, 2013), 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,348, (dated Apr. 25, 2013),15 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,376 (dated May 23, 2013), 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,110, (dated May 3, 2013), 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,113 (dated Apr. 23, 2013),18 pages.
“PCT Search Report and Written Opinion”, Application No. PCT/US2011/025971, (dated Oct. 31, 2011),15 pages.
“PCT Search Report and Written Opinion”, Application No. PCT/US2011/025972, (dated Sep. 30, 2011), 14 pages.
“PCT Search Report and Written Opinion”, Application No. PCT/US2011/020412, (dated Aug. 31, 2011), 9 pages.
“PCT Search Report and Written Opinion”, Application No. PCT/US2011/020410, (dated Sep. 27, 2011), 9 pages.
“PCT Search Report and Written Opinion”, Application No. PCT/US2011/020417, (dated Oct. 20, 2011), 8 pages.
“Foreign Office Action”, European Patent Application No. 11748027.7, (dated Jan. 18, 2013), 5 pages.
“Foreign Office Action”, European Patent Application No. 11748026.9, (dated Jan. 16, 2013), 5 pages.
“Foreign Office Action”, European Patent Application No. 11748029.3, (dated Jan. 16, 2013), 5 pages.
“Foreign Office Action”, European Patent Application No. 11748028.5, (dated Jan. 28, 2013), 5 pages.
“Foreign Office Action”, European Patent Application No. 11747907.1, (dated Jan. 28, 2013), 5 pages.
“Foreign Office Action”, Chinese Application No. 201110046529.3, (dated Feb. 4, 2013), 8 pages.
“Final Office Action”, U.S. Appl. No. 12/713,096 (dated Feb. 15, 2013),7 pages.
“Notice of Allowance”, U.S. Appl. No. 12/713,130, (dated Feb. 19, 2013), 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,282 (dated Feb. 28, 2013),11 pages.
“Final Office Action”, U.S. Appl. No. 12/709,245, (dated Mar. 15, 2013),16 pages.
“Final Office Action”, U.S. Appl. No. 12/713,133 (dated Jul. 2, 2012),8 pages.
“Foreign Office Action”, Chinese Application No. 201110046519.X, (dated Aug. 2, 2012),12 pages.
“Foreign Office Action”, Chinese Application No. 201110046529.3, (dated Aug. 16, 2012),13 pages.
“Foreign Office Action”, Chinese Application No. 201110050499.3, (dated Aug. 3, 2012), 8 pages.
“Foreign Office Action”, Chinese Application No. 201110050508.9, (dated Aug. 3, 2012), 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/695,842, (dated Oct. 3, 2012),16 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/695,976, (dated Sep. 11, 2012), 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/700,357 (dated Jul. 2, 2012),10 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,348, (dated Aug. 2, 2012),14 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,376, (dated Aug. 17, 2012),17 pages.
“jQuery & CSS Example—Dropdown Menu”, DesignReviver, Retrieved from: <http://designreviver.com/tutorials/jquery-css-example-dropdown-menu/> on Nov. 22, 2011, (Oct. 7, 2008), 30 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,245, (dated Nov. 30, 2011),11 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,348, (dated Dec. 7, 2011), 12 pages.
“Final Office Action”, U.S. Appl. No. 12/695,976, (dated Nov. 21, 2012), 10 pages.
“Final Office Action”, U.S. Appl. No. 12/700,357 (dated Oct. 24, 2012), 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,113 (dated Dec. 22, 2011), 20 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,127, (dated Dec. 27, 2011), 22 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,130 (dated Jan. 23, 2012), 7 pages.
“PCT Search Report and Written Opinion”, Application No. PCT/US2011/025131, (dated Oct. 31, 2011),10 pages.
“PCT Search Report and Written Opinion”, Application No. PCT/US2011/025974, (dated Oct. 26, 2011), 8 pages.
“PCT Search Report and Written Opinion”, Application No. PCT/US2011/025973, (dated Oct. 27, 2011), 13 pages.
Vigil, Jose M., “Methods for Controlling a Floating Cursor on a Multi-touch Mobile Phone or Tablet in Conjunction with Selection Gestures and Content Gestures”, U.S. Appl. No. 61/304,972, (Feb. 16, 2010), 54 pages.
Elliott, Matthew “First Dell, Then HP: What's Next for N-trig's Multitouch Screen Technology”, (Nov. 25, 2008),5 pages.
Appleinsider, “Special Report: Apple's Touch-Sensitive iPod Ambitions Disclosed in Filing”, (Oct. 26, 2006),10 pages.
Emigh, Jacqueline “Lenovo Launches Windows 7 ThinkPads with Multitouch and Outdoor Screens”, (Sep. 15, 2009), 3 pages.
Roth, Volker et al., “Bezel Swipe: Conflict-Free Scrolling and Multiple Selection on Mobile Touch Screen Devices”, CHI 2009, (Apr. 2009),4 pages.
Pierce, Jeffrey S., et al., “Toolspaces and Glances: Storing, Accessing, and Retrieving Objects in 3D Desktop Applications”, 1999 Symposium on Interactive 3D Graphics, (Apr. 1999), pp. 163-168.
“Apple Unibody MacBook Pro #MB991LL/A 2.53 GHz Intel Core 2 Duo”, Nov. 10, 2009, (2009), 12 pages.
“Dell and Windows 7—The Wait Is Over”, (Oct. 22, 2009), 2 pages.
“New MS Courier Leak Details Multi-Touch Interface”, (Nov. 4, 2009), 9 pages.
Brandl, Peter et al., “Combining and Measuring the Benefits of Bimanual Pen and Direct-Touch Interaction on Horizontal Interfaces”, Mitsubishi Electric Research Laboratories,(May 2008), 10 pages.
Daniels, Martyn “Brave New World”, (Mar. 2009), 54 pages.
“Apple Granted a Major Radial Menus Patent for iOS and OS X”, Retrieved From: http://www.patentlyapple.com/patently-apple/2012/08/apple-granted-a-major-radial-menus-patent-for-ios-and-os-x.html, Aug. 14, 2012, 12 Pages.
“ATOK for Android”, Retrieved From: http://www.youtube.com/watch?v=bZiDbz0aJKk, Jun. 9, 2012, 2 Pages.
“Autodesk Upgrades SketchBook Pro for iPad”, Retrieved From: http://www.macworld.co.uk/digitallifestyle/news/index.cfm?newsid=3278495, May 9, 2011, 2 Pages.
“Colorful Ink Drops”, Retrieved From: https://web.archive.org/web/20110527192625/http://www.123rf.com/photo_9060642_colorful-ink-drops.html, May 27, 2011, 2 Pages.
“Google Reveals Possible Radial Styled Menus Coming to Android”, Retrieved From: https://web.archive.org/web/20120803020224/http://www.patentbolt.com/2012/07/google-reveals-possible-radial-styled-menus-coming-to-android.html, Jul. 31, 2012, 9 Pages.
“Using Context Menus”, Retrieved From: https://docs.microsoft.com/en-us/previous-versions/windows/desktop/ms701740(v=vs.85), Aug. 29, 2011, 3 Pages.
“Wacom Tablets. The basics”, Retrieved From: https://jelphotoretouch.wordpress.com/category/wacom-tips-and-tricks/, Feb. 25, 2011, 11 Pages.
“Office Action Issued in European Patent Application No. 11737428.0”, dated Nov. 18, 2013, 4 Pages.
“Office Action Issued in European Patent Application No. 11745194.8”, dated Nov. 22, 2016, 6 Pages.
“Search Report Issued in European Patent Application No. 11745194.8”, dated Nov. 3, 2016, 4 Pages.
“Final Office Action Issued in U.S. Appl. No. 12/695,842”, dated Feb. 2, 2016, 11 Pages.
“Final Office Action Issued in U.S. Appl. No. 12/700,510”, dated Mar. 14, 2016, 36 Pages.
“Final Office Action Issued in U.S. Appl. No. 12/709,282”, dated Aug. 24, 2015, 25 Pages.
“Final Office Action Issued in U.S. Appl. No. 12/709,282”, dated Jun. 27, 2017, 17 Pages.
“Non Final Office Action Issued in U.S. Appl. No. 12/709,282”, dated May 10, 2016, 23 Pages.
“Non Final Office Action Issued in U.S. Appl. No. 12/709,282”, dated Dec. 12, 2016, 23 Pages.
“Final Office Action Issued in U.S. Appl. No. 12/709,301”, dated Jan. 7, 2013, 15 Pages.
“Final Office Action Issued in U.S. Appl. No. 12/709,301”, dated Sep. 3, 2013, 13 Pages.
“Final Office Action Issued in U.S. Appl. No. 12/709,301”, dated Mar. 1, 2012, 12 Pages.
“Non-Final Office Action Issued in U.S. Appl. No. 12/709,301”, dated May 14, 2013, 14 Pages.
“Non-Final Office Action Issued in U.S. Appl. No. 12/709,301”, dated Nov. 28, 2011, 10 Pages.
“Non-Final Office Action Issued in U.S. Appl. No. 12/709,301”, dated Sep. 13, 2012, 14 Pages.
“Non-Final Office Action Issued in U.S. Appl. No. 12/709,301”, dated May 23, 2014, 13 Pages.
“Non-Final Office Action Issued in U.S. Appl. No. 12/709,301”, dated Oct. 24, 2013, 12 Pages.
“Non-Final Office Action Issued in U.S. Appl. No. 12/709,301”, dated Jan. 16, 2015, 5 Pages.
“Non-Final Office Action Issued in U.S. Appl. No. 12/709,301”, dated Jul. 14, 2015, 12 Pages.
“Non-Final Office Action Issued in U.S. Appl. No. 12/709,301”, dated Nov. 19, 2015, 8 Pages.
“Final Office Action Issued in U.S. Appl. No. 12/709,348”, dated Jan. 7, 2013, 15 Pages.
“Final Office Action Issued in U.S. Appl. No. 13/349,691”, dated Sep. 27, 2013, 32 Pages.
“Non-Final Office Action Issued in U.S. Appl. No. 13/349,691”, dated Jun. 20, 2014, 18 Pages.
“Non-Final Office Action Issued in U.S. Appl. No. 13/349,691”, dated May 8, 2013, 24 Pages.
“Final Office Action Issued in U.S. Appl. No. 13/542,962”, dated Jun. 20, 2014, 18 Pages.
“Non Final Office Action Issued in U.S. Appl. No. 13/542,962”, dated Feb. 26, 2015, 19 Pages.
“Non Final Office Action Issued in U.S. Appl. No. 13/542,962”, dated Jan. 15, 2014, 17 Pages.
“Final Office Action Issued in U.S. Appl. No. 13/543,976”, dated Jun. 30, 2015, 20 Pages.
“Final Office Action Issued in U.S. Appl. No. 13/543,976”, dated Jun. 19, 2014, 19 Pages.
“Non Final Office Action Issued in U.S. Appl. No. 13/543,976”, dated Mar. 12, 2015, 24 Pages.
“Non Final Office Action Issued in U.S. Appl. No. 13/543,976”, dated Jan. 31, 2014, 16 Pages.
“Final Office Action Issued in U.S. Appl. No. 13/549,397”, dated Sep. 11, 2014, 21 Pages.
“Non-Final Office Action Issued in U.S. Appl. No. 13/549,397”, dated Mar. 3, 2014, 21 Pages.
“Notice of Allowance Issued in U.S. Appl. No. 13/549,397”, dated Apr. 24, 2015, 11 Pages.
“Final Office Action Issued in U.S. Appl. No. 13/674,357”, dated Sep. 17, 2015, 13 Pages.
“Non-Final Office Action Issued in U.S. Appl. No. 13/674,357”, dated Feb. 17, 2016, 13 Pages.
“Final Office Action Issued in U.S. Appl. No. 13/898,452”, dated Mar. 10, 2016, 25 Pages.
“Non-Final Office Action issued in U.S. Appl. No. 13/898,452”, dated Sep. 14, 2015, 23 Pages.
“Final Office Action Issued in U.S. Appl. No. 14/099,798”, dated Nov. 25, 2015, 19 Pages.
“Non-Final Office Action Issued in U.S. Appl. No. 14/099,798”, dated Mar. 31, 2016, 19 Pages.
“Fourth Office Action and Search Report Issued in Chinese Patent Application No. 201110046519.X”, dated Apr. 1, 2016, 13 Pages.
“Office Action Issued in Chinese Patent Application No. 201180010692.2”, dated Mar. 28, 2016, 7 Pages.
“Third Office Action Issued in Chinese Patent Application No. 201180010692.2”, dated Sep. 15, 2015, 10 Pages.
“Third Office Action Issued in Chinese Patent Application No. 201180011039.8”, dated Sep. 6, 2015, 7 Pages.
“Notice of Allowance Issued in Japanese Patent Application No. 2012-554009”, dated Dec. 12, 2014, 3 Pages.
Arsenault, Simon, “Contributing Actions to the Eclipse Workbench”, Retrieved From: http://www.eclipse.org/articles/article.php?file=Article-action-contribution/index.html, Oct. 18, 2001, 15 Pages.
Francone, et al., “Wavelet Menus on Handheld Devices: Stacking Metaphor for Novice Mode and Eyes-Free Selection for Expert Mode”, In Proceedings of the International Conference on Advanced Visual Interfaces, May 26, 2010, 8 Pages.
Gaudioso, Victor, “How is Blend Extensible?”, Retrieved From: https://web.archive.org/web/20111025091841/http://www.windowspresentationfoundation.com/?p=743, Oct. 10, 2010, 5 Pages.
Montreuil, “Home Design 3D by LiveCAD : Dream Homes At Your Fingertips”, Retrieved From: https://web.archive.org/web/20110529094413/http://livecad.net/EN/Products/Home-Design-3D-by-LiveCAD-Press-release.php, Feb. 10, 2011, 2 Pages.
“International Preliminary Report on Patentability Issued in PCT Application No. PCT/US2014/067804”, dated Feb. 22, 2016, 9 Pages.
“Second Written Opinion Issued in PCT Application No. PCT/US2014/067804”, dated Nov. 24, 2015, 8 Pages.
Rice, Frank, “Customizing Context Menus in Office 2010”, Retrieved From: https://docs.microsoft.com/en-us/previous-versions/office/developer/office-2010/ee691832(v=office.14), Nov. 2009, 7 Pages.
Tikku, Nirvana, “jQuery Radmenu Plugin”, Retrieved From: https://web.archive.org/web/20110814094113/http://www.tikku.com/jquery-radmenu-plugin, May 14, 2010, 18 Pages.
Related Publications (1)
Number Date Country
20160283104 A1 Sep 2016 US
Continuations (1)
Number Date Country
Parent 12709301 Feb 2010 US
Child 15179660 US