Parametric inertia and APIs

Information

  • Patent Grant
  • Patent Number
    10,642,365
  • Date Filed
    Tuesday, September 9, 2014
  • Date Issued
    Tuesday, May 5, 2020
Abstract
Parametric inertia and API techniques are described. In one or more implementations, functionality configured to calculate an effect of inertia for movement in a user interface is exposed via an application programming interface by an operating system of a computing device to one or more applications. The calculated effect of inertia for the movement on the user interface is managed by the operating system based on one or more rest points specified, using one or more parametric curves, by the one or more applications via interaction with the application programming interface.
Description
BACKGROUND

Development of user interfaces continues to strive toward supporting a natural user experience between a user and the user interface. One such way to achieve this natural user experience is to have the user interface mimic real world user interaction with physical objects. In this way, a user's interactions with objects in the user interface may be performed in an intuitive manner that leverages the user's experience with objects in the real world, thereby improving the efficiency of this user interaction.


One way in which the user interface may mimic real world interaction with objects is through the use of inertia. The user, for instance, may make a swipe gesture that is recognized through touchscreen functionality of the computing device. Even after input of the swipe gesture ceases, the user interface may continue to move in a manner that mimics the effect of inertia on an object in the real world, such as a pushed page. However, conventional techniques that are utilized to calculate the effect of inertia on an object in a user interface are static and thus are limited to a single expression of inertia on an object.


SUMMARY

Parametric inertia and API techniques are described. In one or more implementations, functionality configured to calculate an effect of inertia for movement in a user interface is exposed via an application programming interface by an operating system of a computing device to one or more applications. The calculated effect of inertia for the movement on the user interface is managed by the operating system based on one or more rest points specified, using one or more parametric curves, by the one or more applications via interaction with the application programming interface.


In one or more implementations, a system includes one or more modules implemented at least partially in hardware. The one or more modules are configured to perform operations including calculating an inertia rest position of an effect of inertia by an operating system of a computing device using one or more of a plurality of phases based on one or more rest points specified by an application via an application programming interface of the operating system. The operations also include exposing the calculated inertia rest position by the operating system as applied to a user interface output by the computing device for display by a display device.


In one or more implementations, a computing device includes a processing system and memory configured to maintain instructions that are executable by the processing system to perform operations. The operations include exposing, by an operating system to one or more applications, an application programming interface that is configured to calculate an inertia rest point for movement in a user interface. The operations also include managing the calculated effect of inertia for the inertia rest point for the movement on the user interface by the operating system based on one or more rest points specified by the one or more applications via interaction with the application programming interface.


This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.





BRIEF DESCRIPTION OF THE DRAWINGS

The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items. Entities represented in the figures may be indicative of one or more entities and thus reference may be made interchangeably to single or plural forms of the entities in the discussion.



FIG. 1 is an illustration of an environment in an example implementation that is operable to perform parametric inertia and application programming interface techniques.



FIG. 2 depicts a system in an example implementation in which interaction of an inertia module and applications of FIG. 1 is shown in greater detail.



FIG. 3 depicts an example implementation showing a workflow diagram in which an inertia module of FIG. 2 utilizes default, position, and range phases to calculate an inertia rest position for an effect of inertia on a subject of movement.



FIG. 4 is a flow diagram depicting a procedure in an example implementation in which an operating system exposes functionality to one or more applications to calculate an effect of inertia on a subject of movement in a user interface.



FIG. 5 is a flow diagram depicting a procedure in an example implementation in which an inertia rest position is calculated by an operating system based on one or more rest points specified by an application via an application programming interface.



FIG. 6 illustrates an example system including various components of an example device that can be implemented as any type of computing device as described with reference to FIGS. 1-5 to implement embodiments of the techniques described herein.





DETAILED DESCRIPTION

Overview


User interfaces may be configured to mimic an effect of inertia on objects in the user interface, such as to continue movement of a page in a user interface responsive to a swipe gesture even after completion of the gesture. However, conventional techniques that are utilized to support this effect are generally static and thus supply a single defined effect that is not alterable by applications.


Parametric inertia and application programming interface techniques are described. In one or more implementations, an operating system is configured to expose application programming interfaces via which applications may specify rest points and associated parametric curves for use in calculating an effect of inertia on an object. The rest points and associated parametric curves, for instance, may be utilized by the operating system in calculating an inertia rest position at which the effect of inertia is to cease.


Phases used to calculate the inertia rest point may include a default phase, in which rest points specified by the applications that take positions and velocities at a start of inertia as input are evaluated. A position phase may also be included, in which rest points placed between the inertia start point and the proposed inertia rest point of the default phase are evaluated and thus may be utilized to adjust the inertia rest position proposed by the default phase. A range phase may also be employed, in which ranges specified by rest points of the application are evaluated in proximity to a proposed inertia rest point, such as to “snap to” a rest point described by the application. In this way, a parametric description through use of parametric curves specified by the applications may be used to produce a proposed location of an inertia rest position. Further discussion of these and other techniques may be found in the following sections.


In the following discussion, an example environment is first described that may employ the parametric inertia and API techniques described herein. Example procedures are then described which may be performed in the example environment as well as other environments. Consequently, performance of the example procedures is not limited to the example environment and the example environment is not limited to performance of the example procedures.


Example Environment



FIG. 1 is an illustration of an environment 100 in an example implementation that is operable to employ parametric techniques described herein. The illustrated environment 100 includes a computing device 102, which may be configured in a variety of ways.


For example, a computing device may be configured as a computer that is capable of communicating over a network, such as a desktop computer, a mobile station, an entertainment appliance, a set-top box communicatively coupled to a display device, a wireless phone, a game console, and so forth. Thus, the computing device 102 may range from full-resource devices with substantial memory and processor resources (e.g., personal computers, game consoles) to low-resource devices with limited memory and/or processing resources (e.g., traditional set-top boxes, hand-held game consoles). Additionally, although a single computing device 102 is shown, the computing device 102 may be representative of a plurality of different devices, such as multiple servers utilized by a business to perform operations of a web service, a remote control and set-top box combination, an image capture device and a game console configured to capture gestures, and so on.


The computing device 102 is illustrated as including a variety of hardware components, examples of which include a processing system 104, an example of a computer-readable storage medium illustrated as memory 106, a display device 108, and so on. The processing system 104 is representative of functionality to perform operations through execution of instructions stored in the memory 106. Although illustrated separately, functionality of these components may be further divided, combined (e.g., on an application specific integrated circuit), and so forth.


The computing device 102 is further illustrated as including an operating system 110. The operating system 110 is configured to abstract underlying functionality of the computing device 102 to applications 112 that are executable on the computing device 102. For example, the operating system 110 may abstract processing system 104, memory 106, network, and/or display device 108 functionality of the computing device 102 such that the applications 112 may be written without knowing “how” this underlying functionality is implemented. The applications 112, for instance, may provide data to the operating system 110 to be rendered and displayed by the display device 108 or a printer without understanding how this rendering will be performed. The operating system 110 may also represent a variety of other functionality, such as to manage a file system and a user interface that is navigable by a user of the computing device 102.


An example of abstracted functionality of the operating system 110 is illustrated in FIG. 1 as an inertia module 114. The inertia module 114 is representative of functionality that is exposed to applications 112 to calculate an effect of inertia on a user interface 116 displayed by the display device 108 of the computing device 102. For example, a finger of a user's hand 118 may be placed proximal to the display device 108 and moved 120 to the left. This proximity and subsequent movement 120 may be recognized as a swipe gesture by the operating system 110 to cause the user interface 116 to scroll to the left in this example.


Inertia may be included as part of this gesture such that motion of the user interface 116 continues even after input of the gesture ceases. In this way, the movement of the user interface may mimic movement of an object in the real world. Other examples are also contemplated, such as movement of a subject (e.g., an object) within the user interface 116. Further, other examples of inputs are also contemplated, such as use of a cursor control device, keys of a keyboard, and so forth.


Conventional techniques utilized to employ inertia in a user interface, however, rely on predetermined behaviors and logic that are tightly coupled with the internal workings of software of a computing device. Thus, these conventional techniques are not customizable by applications 112 and are not generic enough to meet the needs of evolving user experiences.


Accordingly, the inertia module 114 may be utilized to expose functionality to the applications 112 that is usable to customize an effect of inertia in a user interface 116. This customization supports a variety of different features, such as determining an inertia rest position that describes a point at which output of an animation showing inertia is to end, further discussion of which may be found in the following and is shown in a corresponding figure.



FIG. 2 depicts a system 200 in an example implementation in which interaction of the inertia module 114 and the applications 112 is shown in greater detail. As illustrated, the operating system 110 includes the inertia module 114 as previously described, which is representative of functionality to calculate an effect of inertia in a user interface. The inertia module 114 in this example includes application programming interfaces (APIs) 202 via which this functionality is exposed to the applications 112. The applications 112 may interact with the application programming interfaces 202 in a variety of ways to specify constraints to be utilized by the inertia module 114 in calculating the effect of inertia in the user interface.


The applications 112, for instance, may communicate one or more rest points 204 and corresponding parametric curves 206 that are utilized to describe functionality associated with the rest points 204 in calculating the effect of inertia. The parametric curves 206 provide a parametric description of a mathematical function that is usable to calculate a proposed location of an inertia rest position, e.g., a stopping point of an animation to be used to display the effect of inertia in the user interface. Rest points 204 may be configured in a variety of ways, such as a natural end point of inertia, a snap point, a content boundary, or any other concept that indicates a position where the effect of inertia may stop.
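
The general shape of the data an application might hand across such an interface can be sketched as follows; this is a minimal, hypothetical Python representation whose names and fields are illustrative only and are not drawn from any actual API.

```python
# Hypothetical sketch only: illustrative names and fields, not an actual API surface.
from dataclasses import dataclass
from typing import Callable, Optional, Tuple


@dataclass
class RestPoint:
    kind: str                        # "default", "position", or "range"
    position: float                  # content-space location of the rest point
    # Parametric curve: maps an input measured from the interaction state (start
    # velocity, a distance, ...) to an output distance or offset used to propose
    # where the effect of inertia should stop.
    curve: Callable[[float], float]
    # For range rest points: extents of attraction on the negative and positive
    # sides within which a candidate rest position "snaps to" this position.
    attraction: Optional[Tuple[float, float]] = None
```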


The inertia module 114 may utilize one or more phases 208 in the calculation of the effect of inertia on the user interface, such as to determine the inertia rest position and how movement to that inertia rest position is to be accomplished. Examples of phases 208 are illustrated as a default phase 210, a position phase 212, and a range phase 214. Thus, in this example inertia rest point calculation may employ one or more of the phases, and may do so in the listed order. For example, an input to the default phase 210 may be taken from a state of a subject of the movement (and consequently the effect of the inertia) at the start of inertia, the result from each phase is fed to the next phase, and the output of the last phase is the final location of the inertia rest point.
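
Read as a pipeline, the phases might be chained as in the following minimal Python sketch; the function names, the deceleration constant, and the pass-through placeholders are illustrative assumptions rather than an actual implementation, and each phase is optional as described below.

```python
def default_phase(start_position, start_velocity, deceleration=0.002):
    """Propose a rest position from the state at inertia start alone, assuming a
    constant-deceleration stop: distance traveled = v0^2 / (2 * a)."""
    distance = (start_velocity ** 2) / (2.0 * deceleration)
    return start_position + (distance if start_velocity >= 0 else -distance)


def position_phase(start_position, candidate, rest_points=()):
    """Placeholder: with no applicable position rest points, the candidate from
    the default phase is left unchanged (a fuller sketch appears later)."""
    return candidate


def range_phase(candidate, rest_points=()):
    """Placeholder: with no applicable range rest points, the candidate from the
    position phase is left unchanged (a fuller sketch appears later)."""
    return candidate


def inertia_rest_point(start_position, start_velocity, rest_points=()):
    """Each phase feeds its result to the next; the last output is the final
    location of the inertia rest point."""
    candidate = default_phase(start_position, start_velocity)
    candidate = position_phase(start_position, candidate, rest_points)
    return range_phase(candidate, rest_points)
```

With no rest points supplied, for example, inertia_rest_point(0.0, 2.0) proposes a stop 1,000 units past the start under the assumed deceleration of 0.002 units per millisecond squared.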


The default phase 210 is a phase in which parametric rest points 204 specified by the applications 112 that take positions and/or velocities at the start of inertia are evaluated. The position phase 212 is a phase in which rest points 204 placed between the inertia start point and the proposed inertia rest point of the default phase 210 are evaluated. The range phase 214 is a phase in which a limited number of range rest points 204 in proximity to the currently proposed inertia rest point are evaluated.


A rest point 204 may be used to determine a proposed rest point of the effect of inertia by specifying a position of the proposed rest point itself. Rest points 204 may be specified in terms of a variety of different parameters measured from the state of the interaction. For example, rest points 204 may be specified relative to a position of a subject of movement of the effect of inertia at a start of inertia. Rest points 204 may also be specified as a distance from a position of the inertia rest point from the previous phase to the parametric rest point, or as a distance derived from the velocity of the subject at the start of inertia. The rest points 204 may also be specified based on a velocity of the subject when passing through the rest points 204, or based on an extrapolated velocity of the subject as it comes to rest on an inertia rest point from a previous phase.


Rest points 204 may also be specified in terms of absolute values that bear no relationship to a state of interaction. For example, rest points 204 may be specified as a constant numerical value. Direct specification of rest points 204 may also be performed using a position a set distance away from an absolute value regardless of where the content is located. A rest point 204 that specifies the position of the proposed rest point directly may be referred to as a position-based rest point.


A rest point's 204 position may also be utilized to determine a proposed rest point of the effect of inertia by specifying ranges of attraction, e.g., within which the position “snaps to” the rest point 204. The ranges may be delimited using a variety of values, the rest point's 204 position, and so on. For example, the ranges may be specified centering at a position a set distance away from the subject's position at inertia start, on either the negative or positive side of the rest point 204. The ranges may also be specified centering at a position a set distance away from an absolute position value, on either the negative or positive side of the rest point. Therefore, rest points 204 providing these range specifications may be referred to as range-based rest points, or simply range rest points in the following.


Selection of a rest point position in the position phase 212 may be performed based on the inertia start point. For example, an inertia rest position calculated by the default phase 210 may be set as a candidate. The first position rest point 204 encountered going in the direction of the previous inertia rest point candidate is evaluated, and that rest point's position is used as the new inertia rest point candidate. This process may be repeated in the position phase 212 until no more position rest points are encountered.
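
One way the walk just described could be written, continuing the hypothetical Python sketch above and assuming each rest point's parametric curve maps the remaining distance to the candidate into a new distance from that rest point:

```python
def position_phase(start_position, candidate, rest_points=()):
    """Hypothetical position-phase walk: evaluate position rest points in the
    order they are encountered from inertia start toward the candidate."""
    direction = 1.0 if candidate >= start_position else -1.0
    lo, hi = sorted((start_position, candidate))
    pending = sorted(
        (p for p in rest_points if p.kind == "position" and lo <= p.position <= hi),
        key=lambda p: direction * p.position,   # nearest to inertia start first
    )
    for point in pending:
        # Stop once a rest point lies past the (possibly pulled-in) candidate.
        if direction * (point.position - candidate) > 0:
            break
        # The curve turns the remaining distance from this rest point to the
        # current candidate into a new distance, yielding the next candidate.
        candidate = point.position + point.curve(candidate - point.position)
    return candidate
```

A curve of lambda d: 0.0, for instance, would clamp the candidate to the rest point (the “single-step” behavior described later), while lambda d: d would leave the candidate untouched.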


Selection of an inertia rest position in the range phase 214 may be started from a rest point candidate from the position phase 212 as follows. First, the closest range rest point on the negative side of the inertia rest point candidate that has a range overlapping it is found. Second, the closest such range rest point on the positive side is found. Out of the two, the range rest point closest to the rest point candidate is chosen, and the rest point candidate is then changed to the position of the chosen range rest point.
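
Under the same assumptions, the range-phase selection just described could be sketched as:

```python
def range_phase(candidate, rest_points=()):
    """Hypothetical range-phase selection: snap the candidate to the closest
    range rest point whose range of attraction covers it."""
    covering = []
    for point in (p for p in rest_points if p.kind == "range"):
        negative_extent, positive_extent = point.attraction or (0.0, 0.0)
        if point.position - negative_extent <= candidate <= point.position + positive_extent:
            covering.append(point)
    if covering:
        # When several ranges overlap the candidate, the closest rest point wins,
        # effectively cutting off the range of the farther one.
        candidate = min(covering, key=lambda p: abs(p.position - candidate)).position
    return candidate
```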


As previously described, an initial position and velocity of the subject of movement in the user interface is referred to as inertia start. A position of an inertia rest point is calculated using phases 208 that refine the position of the rest point until each of the relevant rest points 204 has been considered.


Each of the phases 208, e.g., the default, position, and range phases 210, 212, 214, is optional. A default value is chosen for the rest point candidate at the end of a phase 208 if no rest points are applicable: the default phase 210 outputs an inertia rest position based on the initial velocity of the subject and a predefined deceleration, while the position phase 212 and the range phase 214 each leave the rest point at the same position. In this way, the default inertia rest point is based on the default inertia deceleration to give an effect that the subject of the movement is freely moving without restrictions, with minimal work on the part of the application.


For each phase, particular kinds of rest points 204 may be considered based on the purpose of the phase as described above. Rest points that derive input and produce output in a certain way are applied in a phase 208, and zero or more phases 208 combine to produce a result for scenarios such as a content boundary, snap points, or other behaviors desired by the applications 112.


The default phase 210 may be evaluated by the inertia module 114 first to obtain a clear starting point in determining the inertia rest point. For example, the default phase 210 may be considered the simplest and most basic step in computing an inertia rest point, without which the behavior would be an instant stop for the subject.


The position phase 212 may then follow the default phase 210 in the processing performed by the inertia module 114. For example, the position phase 212 may be utilized to implement a predictable order in which to evaluate the rest points, rather than the order in which the rest points 204 are specified, which could otherwise complicate processing performed by the inertia module 114 (and thus decrease efficiency) as well as verification of its correctness.


The position phase 212 is utilized to put constraints on a rest point candidate, such as to leave it at a point that may be short of an original position proposed for the effect of the inertia to stop. Also, this may be performed before applying range rest points of the range phase 214 such that the subject of the movement is not allowed to go beyond more than one point, even if the inertia would have brought the subject into the range of a rest point that is more than one rest point away.


In the default phase 210, applications 112 may customize the inertia rest position of the subject to be different from a default employed by the inertia module 114. The rest points 204 that participate in this phase provide answers to the question: “how far would the content travel given its initial position and velocity?” These rest points 204 take their input from the state of the subject of the movement at inertia start, measuring either its position or velocity.


This value is evaluated using the one or more parametric curves 206 associated with the rest point 204, producing an output value. The parametric curves 206, for instance, may represent a physics equation that gives distance traveled based on an initial velocity and constant deceleration of a moving body. The output value is then applied to the subject's position at inertia start to indicate how far from the starting position the effect of inertia is to stop.
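
As an illustration of such an equation (one possibility, not the only curve an application might supply), constant-deceleration kinematics relate velocity and distance by

    v^2 = v_0^2 - 2ad,

so setting the final velocity v to zero gives a stopping distance of d = v_0^2 / (2a), where v_0 is the velocity at inertia start and a is the deceleration; the curve's output d is then applied as the distance from the starting position at which the effect of inertia is proposed to stop.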


The values may also be relative such that the values are applicable regardless of where the subject is actually located at inertia start, meaning the application 112 may specify a single rest point 204 for this phase. Although the applications 112 may specify more than one rest point 204 for the default phase 210, a result from a single one of the rest points 204 is used.


The position phase 212 is provided for the application to alter the course of the subject as it moves from its position at inertia start to the position given by the default phase 210. The rest points 204 that participate in this phase adjust the position of the rest point location as the points are evaluated in turn, starting with the first encountered from the inertia start position, in the direction of movement of the inertia.


The rest points 204 and corresponding parametric curves 206 involved in the position phase 212 may be configured in a variety of ways. For example, the rest points 204 may be based on the position of the subject of movement at inertia start, which gives the parametric curve 206 the distance between that position and the position of the rest point as input. The rest point 204 may also be based on the velocity of the subject at inertia start which gives the parametric curve 206 the value of the velocity as input. Additionally, the rest point 204 may be based on the candidate rest position from the default phase 210, which gives the parametric curve 206 the distance between that position and the position of the rest point as input. Further, the rest point 204 may be based on the velocity of the subject as it passes the rest point, extrapolated from the velocity at inertia start and the velocity when the subject reaches the candidate rest position, which is zero.
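
One concrete way to perform that extrapolation, assuming the same constant-deceleration profile used for illustration above (an assumption, not a requirement of these techniques): if the subject starts at velocity v_0 and would come to rest after traveling a distance d to the candidate rest position, then at a rest point located a distance x along that path,

    v(x)^2 = v_0^2 - 2ax  with  a = v_0^2 / (2d),  so  v(x) = v_0 * sqrt(1 - x/d),

which interpolates between v_0 at inertia start (x = 0) and zero at the candidate rest position (x = d); this pass-through velocity is what the parametric curve receives as its input in this case.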


The parametric curves 206 associated with the rest point 204 are then evaluated with this input and produce an output value through processing by the inertia module 114. The value may then be applied to the position of the rest point as the distance between it and the proposed rest position of the subject for the next encountered rest point. This may be utilized to support a variety of functionality, such as a snap point that captures the subject if it is moving below a certain velocity threshold when it passes the point, or a snap point that does not allow the subject to move past it, thereby creating a “single-step” behavior for a subject of movement that is allowed to be moved one section at a time, and so forth.


The range phase 214 is provided for the applications 112 to force the subject of the movement to come to a stop at a specific location, from the position given by the position phase 212. The rest points 204 that participate in this phase provide ranges of attraction that “pull” or “push” the candidate rest position toward or away from them.


The rest points 204 that participate in the range phase 214 may be configured in a variety of ways. The rest point 204 may be based on the position of the subject of the movement at inertia start, which gives the parametric curve 206 the distance between that position and the position of the rest point as input. The rest point 204 may be based on the velocity of the subject at inertia start which gives the parametric curve 206 the value of the velocity as input. The rest point may also be based on the candidate rest position from the default inertia case, which gives the parametric curve 206 the distance between that position and the position of the rest point as input. Additionally, the rest point 204 may be based on the velocity of the subject as it passes the rest point 204, extrapolated from the velocity at inertia start and the velocity when the subject reaches the candidate rest position, which is zero.


Two sets of parametric curves 206 may be evaluated with this input, one set for the range extending towards the negative direction of the rest point 204, the other set for the range extending towards the positive direction (e.g., along a direction of the movement) of the rest point 204. The closest rest point that covers the candidate rest position from the previous phase in its range replaces it as the candidate rest position of the range phase 214.


If there are a plurality of range rest points that have ranges that overlap the candidate rest position, the closest one is selected. This means that the rest point farther away has its range effectively “cut off” at the position of the closer rest point. However, the closer rest point may also produce a range that does not overlap the candidate rest position, and the rest point farther away may be considered, making its range effectively “go through” the closer rest point.


Examples of behaviors supported by the range phase 214 include a content boundary, at which point the content can move no further and thus comes to rest at this position if it was positioned beyond the boundary at inertia start. This behavior is not implemented by a position rest point because such a rest point might not be applicable, since the search goes in the direction of the inertia, e.g., the content may be moving away from the boundary by user interaction at inertia start.


In another example, the range phase 214 may be utilized to specify a mandatory snap point, where the subject of the movement is forced to stop at one of such rest points at the end of inertia. In a further example, the range phase 214 may implement an optional snap point, where if the subject of the movement is close enough to the rest point as well as moving under a velocity threshold, the subject is stopped at the rest point. An example showing implementation of the default phase 210, position phase 212, and range phase 214 by the inertia module 114 is described as follows and shown in a corresponding figure.
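
Before turning to that example, an optional snap point of the kind just described could hypothetically be expressed in the earlier Python sketch as a range of attraction that collapses to zero when the extrapolated pass-through velocity exceeds a threshold; the names and numbers below are illustrative only.

```python
def optional_snap_attraction(pass_velocity, capture_radius=40.0, velocity_threshold=300.0):
    """Hypothetical range curves for an optional snap point: attract the candidate
    only when the subject would pass the point slowly enough (illustrative units)."""
    if abs(pass_velocity) < velocity_threshold:
        return (capture_radius, capture_radius)  # negative- and positive-side extents
    return (0.0, 0.0)                            # moving too fast: no attraction
```

In the terms used above, the two returned extents play the role of the negative-side and positive-side parametric curves, evaluated with the pass-through velocity as their input.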



FIG. 3 depicts an example implementation 300 showing a workflow diagram in which the inertia module 114 of FIG. 2 utilizes default, position, and range phases 210, 212, 214 to calculate an inertia rest position for an effect of inertia on a subject of movement. This example implementation 300 is illustrated using first, second, third, fourth, fifth, and sixth stages 302, 304, 306, 308, 310, 312. The first stage 302 corresponds to a start state, the second stage 304 illustrates a default phase 210, the third and fourth stages 306, 308 correspond to the position phase 212, the fifth stage 310 corresponds to the range phase 214, and the sixth stage 312 illustrates an end state which is a result of the stages calculated by the inertia module 114.


At the first stage 302, a start state is shown that includes an inertia start 314 followed by a range rest point 316, a position rest point 318, another position rest point 320, and a range rest point 322. Thus, position rest points 318, 320 correspond to the position phase 212 and range rest points 316, 322 correspond to the range phase 214 and will be evaluated in their respective phases by the inertia module 114.


At the second stage 304, an inertia rest point 324 is calculated in the default phase 210. The default phase 210, for instance, may calculate the inertia rest point 324 based on position and velocity of a subject of movement at the inertia start 314. For example, the default phase 210 may calculate the inertia rest position 324 based on an initial velocity of the subject of the movement and a predefined deceleration and thus answers the question “how far would the subject travel given its initial position and velocity?” In this way, the default phase 210 gives a clear starting point in determining the inertia rest point.


At the third stage 306, the course of the subject of the movement is altered as it moves from its position at inertia start to the position given by the default phase 210. The rest points that participate in this phase adjust the rest point candidate's location as the points are evaluated in turn, starting with the first encountered from the inertia start position, in the direction of movement of the inertia. Accordingly, position rest point 320 is evaluated first, which results in a rest point candidate 326. An effect of position rest point 318 is evaluated at the fourth stage 308, which results in rest point candidate 328. As there are no more applicable position rest points, evaluation involved in the position phase 212 is completed with the result being rest point candidate 328.


At the fifth stage 310, the range phase 214 is evaluated by the inertia module 114. This evaluates an effect of range rest points 316, 322 on the rest point candidate 328 from the position phase 212. For example, range rest point 316 may have a range 332 as illustrated by one bracket and range rest point 322 may have a range 334 illustrated by another bracket. Because the position of the rest point candidate 328 from the position phase 212 falls within the range 332 of range rest point 316, range rest point 316 is set as the inertia rest point, as illustrated for the end state at the sixth stage 312. In this way, the inertia module 114 may use a sequence of phases to calculate an effect of inertia on movement of a subject in a user interface, further discussion of which may be found in relation to the following procedures.


Example Procedures


The following discussion describes parametric inertia and API techniques that may be implemented utilizing the previously described systems and devices. Aspects of each of the procedures may be implemented in hardware, firmware, or software, or a combination thereof. The procedures are shown as a set of blocks that specify operations performed by one or more devices and are not necessarily limited to the orders shown for performing the operations by the respective blocks. In portions of the following discussion, reference will be made to the figures described above.


Functionality, features, and concepts described in relation to the examples of FIGS. 1-3 may be employed in the context of the procedures described herein. Further, functionality, features, and concepts described in relation to different procedures below may be interchanged among the different procedures and are not limited to implementation in the context of an individual procedure. Moreover, blocks associated with different representative procedures and corresponding figures herein may be applied together and/or combined in different ways. Thus, individual functionality, features, and concepts described in relation to different example environments, devices, components, and procedures herein may be used in any suitable combinations and are not limited to the particular combinations represented by the enumerated examples.



FIG. 4 depicts a procedure 400 in an example implementation in which an operating system exposes functionality to one or more applications to calculate an effect of inertia on a subject of movement in a user interface. Functionality is exposed via an application programming interface by an operating system of a computing device to one or more applications that is configured to calculate an effect of inertia for movement in a user interface (block 402). The operating system 110, for instance, may include functionality represented by an inertia module 114 to calculate an effect of inertia. The inertia module 114 may include application programming interfaces 202 that support interaction with applications 112.


The calculated effect of inertia for the movement on the user interface is managed by the operating system based on one or more rest points specified using one or more parametric curves by the one or more applications via interaction with the application programming interface (block 404). Continuing with the previous examples, applications 112 may specify rest points 204 and corresponding parametric curves 206 that are usable as part of the calculation of the effect of inertia, e.g., to calculate an inertia rest position as well as the movement involved in an animation that displays inertia-related movement to the inertia rest position. In this way, the application programming interfaces 202 may support a rich description of inertia to be applied to a subject of movement in a user interface, e.g., the user interface as a whole such as when scrolling, movement of an object within the user interface, and so on.
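
As a purely hypothetical illustration of what such a specification could look like, continuing the sketch from the example environment (the operating system interface is not named in this document, so every identifier below is invented):

```python
# Invented names for illustration only; not an actual operating-system API.
content_boundary = RestPoint(
    kind="range",
    position=0.0,                      # e.g., the top edge of scrollable content
    curve=lambda _: 0.0,               # always propose resting exactly at the edge
    attraction=(float("inf"), 0.0),    # attract anything overscrolled past it
)

default_stop = RestPoint(
    kind="default",
    position=0.0,
    curve=lambda v0: (v0 ** 2) / (2.0 * 0.002),  # constant-deceleration stopping distance
)

# An application would hand these to the inertia functionality exposed by the
# operating system, conceptually something like:
#   inertia_api.set_rest_points(viewport, [default_stop, content_boundary])
```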



FIG. 5 depicts a procedure 500 in an example implementation in which an inertia rest position is calculated by an operating system based on one or more rest points specified by an application via an application programming interface. An inertia rest position of an effect of inertia is calculated by an operating system of a computing device using one or more of a plurality of phases based on one or more rest points specified by an application via an application programming interface of the operating system (block 502). The plurality of phases may include a default phase in which the inertia rest position is based at least in part on an initial velocity of an input and a predefined deceleration (block 504). The plurality of phases may also include a position phase in which constraints are applied as specified by the application to one or more rest points to determine the inertia rest position (block 506). The plurality of phases may also include a range phase in which ranges are applied as specified by the application to one or more rest points to determine the inertia rest position (block 508). These phases may be employed singly, in succession, and so on as described above.


The calculated inertia rest position is exposed by the operating system as applied to a user interface output by the computing device for display by a display device (block 510). For example, the inertia module 114 may employ these phases to arrive at a resulting inertia rest position. This position may then be used as a destination to be employed in an animation involving inertia in a user interface as previously described. A variety of other examples are also contemplated without departing from the spirit and scope herein.


Example System and Device



FIG. 6 illustrates an example system generally at 600 that includes an example computing device 602 that is representative of one or more computing systems and/or devices that may implement the various techniques described herein. An example of this is illustrated through inclusion of the inertia module 114. The computing device 602 may be, for example, a server of a service provider, a device associated with a client (e.g., a client device), an on-chip system, and/or any other suitable computing device or computing system.


The example computing device 602 as illustrated includes a processing system 604, one or more computer-readable media 606, and one or more I/O interfaces 608 that are communicatively coupled, one to another. Although not shown, the computing device 602 may further include a system bus or other data and command transfer system that couples the various components, one to another. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures. A variety of other examples are also contemplated, such as control and data lines.


The processing system 604 is representative of functionality to perform one or more operations using hardware. Accordingly, the processing system 604 is illustrated as including hardware elements 610 that may be configured as processors, functional blocks, and so forth. This may include implementation in hardware as an application specific integrated circuit or other logic device formed using one or more semiconductors. The hardware elements 610 are not limited by the materials from which they are formed or the processing mechanisms employed therein. For example, processors may be comprised of semiconductor(s) and/or transistors (e.g., electronic integrated circuits (ICs)). In such a context, processor-executable instructions may be electronically-executable instructions.


The computer-readable storage media 606 is illustrated as including memory/storage 612. The memory/storage 612 represents memory/storage capacity associated with one or more computer-readable media. The memory/storage component 612 may include volatile media (such as random access memory (RAM)) and/or nonvolatile media (such as read only memory (ROM), Flash memory, optical disks, magnetic disks, and so forth). The memory/storage component 612 may include fixed media (e.g., RAM, ROM, a fixed hard drive, and so on) as well as removable media (e.g., Flash memory, a removable hard drive, an optical disc, and so forth). The computer-readable media 606 may be configured in a variety of other ways as further described below.


Input/output interface(s) 608 are representative of functionality to allow a user to enter commands and information to computing device 602, and also allow information to be presented to the user and/or other components or devices using various input/output devices. Examples of input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone, a scanner, touch functionality (e.g., capacitive or other sensors that are configured to detect physical touch), a camera (e.g., which may employ visible or non-visible wavelengths such as infrared frequencies to recognize movement as gestures that do not involve touch), and so forth. Examples of output devices include a display device (e.g., a monitor or projector), speakers, a printer, a network card, tactile-response device, and so forth. Thus, the computing device 602 may be configured in a variety of ways as further described below to support user interaction.


Various techniques may be described herein in the general context of software, hardware elements, or program modules. Generally, such modules include routines, programs, objects, elements, components, data structures, and so forth that perform particular tasks or implement particular abstract data types. The terms “module,” “functionality,” and “component” as used herein generally represent software, firmware, hardware, or a combination thereof. The features of the techniques described herein are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.


An implementation of the described modules and techniques may be stored on or transmitted across some form of computer-readable media. The computer-readable media may include a variety of media that may be accessed by the computing device 602. By way of example, and not limitation, computer-readable media may include “computer-readable storage media” and “computer-readable signal media.”


“Computer-readable storage media” may refer to media and/or devices that enable persistent and/or non-transitory storage of information in contrast to mere signal transmission, carrier waves, or signals per se. Thus, computer-readable storage media refers to non-signal bearing media. The computer-readable storage media includes hardware such as volatile and non-volatile, removable and non-removable media and/or storage devices implemented in a method or technology suitable for storage of information such as computer readable instructions, data structures, program modules, logic elements/circuits, or other data. Examples of computer-readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, hard disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other storage device, tangible media, or article of manufacture suitable to store the desired information and which may be accessed by a computer.


“Computer-readable signal media” may refer to a signal-bearing medium that is configured to transmit instructions to the hardware of the computing device 602, such as via a network. Signal media typically may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as carrier waves, data signals, or other transport mechanism. Signal media also include any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media.


As previously described, hardware elements 610 and computer-readable media 606 are representative of modules, programmable device logic and/or fixed device logic implemented in a hardware form that may be employed in some embodiments to implement at least some aspects of the techniques described herein, such as to perform one or more instructions. Hardware may include components of an integrated circuit or on-chip system, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a complex programmable logic device (CPLD), and other implementations in silicon or other hardware. In this context, hardware may operate as a processing device that performs program tasks defined by instructions and/or logic embodied by the hardware as well as hardware utilized to store instructions for execution, e.g., the computer-readable storage media described previously.


Combinations of the foregoing may also be employed to implement various techniques described herein. Accordingly, software, hardware, or executable modules may be implemented as one or more instructions and/or logic embodied on some form of computer-readable storage media and/or by one or more hardware elements 610. The computing device 602 may be configured to implement particular instructions and/or functions corresponding to the software and/or hardware modules. Accordingly, implementation of a module that is executable by the computing device 602 as software may be achieved at least partially in hardware, e.g., through use of computer-readable storage media and/or hardware elements 610 of the processing system 604. The instructions and/or functions may be executable/operable by one or more articles of manufacture (for example, one or more computing devices 602 and/or processing systems 604) to implement techniques, modules, and examples described herein.


As further illustrated in FIG. 6, the example system 600 enables ubiquitous environments for a seamless user experience when running applications on a personal computer (PC), a television device, and/or a mobile device. Services and applications run substantially similar in all three environments for a common user experience when transitioning from one device to the next while utilizing an application, playing a video game, watching a video, and so on.


In the example system 600, multiple devices are interconnected through a central computing device. The central computing device may be local to the multiple devices or may be located remotely from the multiple devices. In one embodiment, the central computing device may be a cloud of one or more server computers that are connected to the multiple devices through a network, the Internet, or other data communication link.


In one embodiment, this interconnection architecture enables functionality to be delivered across multiple devices to provide a common and seamless experience to a user of the multiple devices. Each of the multiple devices may have different physical requirements and capabilities, and the central computing device uses a platform to enable the delivery of an experience to the device that is both tailored to the device and yet common to all devices. In one embodiment, a class of target devices is created and experiences are tailored to the generic class of devices. A class of devices may be defined by physical features, types of usage, or other common characteristics of the devices.


In various implementations, the computing device 602 may assume a variety of different configurations, such as for computer 614, mobile 616, and television 618 uses. Each of these configurations includes devices that may have generally different constructs and capabilities, and thus the computing device 602 may be configured according to one or more of the different device classes. For instance, the computing device 602 may be implemented as the computer 614 class of device that includes a personal computer, desktop computer, a multi-screen computer, laptop computer, netbook, and so on.


The computing device 602 may also be implemented as the mobile 616 class of device that includes mobile devices, such as a mobile phone, portable music player, portable gaming device, a tablet computer, a multi-screen computer, and so on. The computing device 602 may also be implemented as the television 618 class of device that includes devices having or connected to generally larger screens in casual viewing environments. These devices include televisions, set-top boxes, gaming consoles, and so on.


The techniques described herein may be supported by these various configurations of the computing device 602 and are not limited to the specific examples of the techniques described herein. This functionality may also be implemented all or in part through use of a distributed system, such as over a “cloud” 620 via a platform 622 as described below.


The cloud 620 includes and/or is representative of a platform 622 for resources 624. The platform 622 abstracts underlying functionality of hardware (e.g., servers) and software resources of the cloud 620. The resources 624 may include applications and/or data that can be utilized while computer processing is executed on servers that are remote from the computing device 602. Resources 624 can also include services provided over the Internet and/or through a subscriber network, such as a cellular or Wi-Fi network.


The platform 622 may abstract resources and functions to connect the computing device 602 with other computing devices. The platform 622 may also serve to abstract scaling of resources to provide a corresponding level of scale to encountered demand for the resources 624 that are implemented via the platform 622. Accordingly, in an interconnected device embodiment, implementation of functionality described herein may be distributed throughout the system 600. For example, the functionality may be implemented in part on the computing device 602 as well as via the platform 622 that abstracts the functionality of the cloud 620.


CONCLUSION

Although the example implementations have been described in language specific to structural features and/or methodological acts, it is to be understood that the implementations defined in the appended claims are not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claimed features.

Claims
  • 1. A method comprising:
    exposing functionality via an application programming interface by an operating system of a computing device to one or more applications that is configured to calculate an effect of inertia for movement in a user interface in a default phase, a position phase, and a range phase based on one or more parameters specified by the one or more applications via the application programming interface;
    determining, during the default phase, a proposed inertia rest position by determining, based at least in part on the one or more parameters, two or more rest points that are based on a position or velocity corresponding to a state of a subject movement at an inertia start position;
    updating, during the position phase, the proposed inertia rest position output by the default phase by using the two or more rest points to adjust the proposed inertia rest position as evaluated in turn, starting with a first point of the two or more rest points closest to the inertial start position and continuing in a direction of the movement, to generate a range inertia rest position;
    updating, during the range phase, the range inertia rest position by forcing the range inertia rest position to coincide with a specific location when falling within a predefined range around the specific location to generate a final location of an inertia rest position; and
    managing the calculated effect of inertia for the movement on the user interface by the operating system based on the two or more rest points specified using one or more parametric curves by the one or more applications via interaction with the application programming interface.
  • 2. The method of claim 1, wherein the effect of the inertia corresponds to the inertia rest position.
  • 3. The method of claim 1, wherein the calculated effect of inertia on the user interface is applied to movement of the user interface as a whole or applied to the subject movement of a subject within the user interface.
  • 4. The method of claim 1, wherein forcing the range inertia rest position to coincide with the specific location further includes locating a closest rest point of the two or more rest points to the range inertia rest position and setting the range inertia rest position to coincide with the closest rest point.
  • 5. The method of claim 1, wherein determining the two or more rest points comprises determining the two or more rest points as the one or more parameters specified by the one or more applications via the application programming interface.
  • 6. The method of claim 1, wherein determining the two or more rest points comprises calculating the two or more rest points based on the position or velocity corresponding to the state of the subject movement at the inertia start position.
  • 7. A system comprising:
    a display device; and
    one or more modules implemented at least partially in hardware, the one or more modules configured to perform operations comprising:
    calculating an inertia rest position of an effect of inertia by an operating system of a computing device using a default phase, a position phase, and a range phase based on two or more rest points, wherein the two or more rest points are determined based on one or more parameters specified by an application via an application programming interface of the operating system, wherein the calculating includes determining, during the default phase, a proposed inertia rest position by determining the two or more rest points that are based on a position or velocity corresponding to a state of a subject movement at an inertia start position;
    updating, during the position phase, the proposed inertia rest position output by the default phase by using the two or more rest points to adjust the proposed inertia rest position as evaluated in turn, starting with a first point of the two or more rest points closest to the inertial start position and continuing in a direction of the movement, to generate a range inertia rest position;
    updating, during the range phase, the range inertia rest position by forcing the range inertia rest position to coincide with a specific location when falling within a predefined range around the specific location to generate a final location of the inertia rest position; and
    exposing the calculated inertia rest position by the operating system as applied to a user interface output by the computing device for display by the display device.
  • 8. The system of claim 7, wherein calculating the inertia rest position is based at least in part on an initial velocity of an input and a predefined deceleration.
  • 9. The system of claim 7, wherein the calculating of the inertia rest position in the position phase:
    is based on a position of the subject movement of a subject at an inertia start and a position rest point;
    is based on a distance between the inertia rest position to a position rest point;
    is based on the initial velocity of the subject movement of the subject at inertia start, which defines a velocity of the subject between the inertia start and a position of the inertia rest position to a parametric curve; or
    is based on a velocity of the subject when passing a corresponding said rest point.
  • 10. The system of claim 7, wherein updating, during the range phase, includes applying ranges as specified by the application to the two or more rest points to determine the inertia rest position.
  • 11. The system of claim 7, wherein the inertia rest position is applied to the subject movement of the user interface as a whole or applied to movement of an object within the user interface.
  • 12. The system of claim 7, wherein the one or more modules are configured to determine the two or more rest points as the one or more parameters specified by the one or more applications via the application programming interface.
  • 13. The system of claim 7, wherein the one or more modules are configured to calculate the two or more rest points based on the position or velocity corresponding to the state of the subject movement at the inertia start position.
  • 14. A non-transitory computer readable medium having instructions stored therein that, when executed by one or more processors, cause the one or more processors to:
    expose an application programming interface by an operating system to one or more applications that is configured to calculate an inertia rest position for movement in a user interface in a default phase, a position phase, and a range phase based on one or more parameters specified by the one or more applications via the application programming interface;
    determine, during the default phase, a proposed inertia rest position by determining, based at least in part on the one or more parameters, two or more rest points that are based on a position or velocity corresponding to a state of a subject movement at an inertia start position;
    update, during the position phase, the proposed inertia rest position output by the default phase by using the two or more rest points to adjust the proposed inertia rest position as evaluated in turn, starting with a first point of the two or more rest points closest to the inertial start position and continuing in a direction of the movement, to generate a range inertia rest position;
    update, during the range phase, the range inertia rest position by forcing the range inertia rest position to coincide with a specific location when falling within a predefined range around the specific location to generate a final location of the inertia rest position; and
    manage the calculated effect of inertia for the inertia rest position for the movement on the user interface by the operating system based on the two or more rest points specified by the one or more applications via interaction with the application programming interface.
  • 15. The non-transitory computer readable medium of claim 14, wherein the inertia rest position is applied to movement of the user interface as a whole or applied to movement of a subject within the user interface.
  • 16. The non-transitory computer readable medium of claim 14, wherein the effect of the inertia corresponds to the inertia rest position.
  • 17. The non-transitory computer readable medium of claim 14, wherein the calculated effect of inertia on the user interface is applied to the subject movement of the user interface as a whole or applied to the subject movement of a subject within the user interface.
  • 18. The non-transitory computer readable medium of claim 14, wherein the instructions that cause the one or more processors to determine the proposed inertia rest position further determine the two or more rest points as the one or more parameters specified by the one or more applications via the application programming interface.
  • 19. The non-transitory computer readable medium of claim 14, wherein the instructions that cause the one or more processors to determine the proposed inertia rest position further calculate the two or more rest points based on the position or velocity corresponding to the state of the subject movement at the inertia start position.
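
For readers following the claims, the sketch below is a minimal, hypothetical illustration of the three-phase rest-position calculation recited in claims 7, 10, and 14: a default phase that proposes a rest position from the state of the movement at the inertia start (using claim 8's initial velocity and predefined deceleration), a position phase that adjusts the proposal using the application-specified rest points evaluated in the direction of the movement, and a range phase that forces the result to coincide with a specific location when it falls within a predefined range around that location. All names here (computeRestPosition, RestPoint, snapRange) and the specific snapping rules are assumptions made for illustration, not the claimed API.

    // Hypothetical TypeScript sketch of a default / position / range phase pipeline.
    // The types, names, and snapping rules are assumptions, not the claimed API.
    interface RestPoint {
      position: number;    // location of the rest point along the movement axis
      snapRange?: number;  // optional: snap to this point when the result falls within this distance
    }

    // Default phase: propose a rest position from the position and velocity at the
    // inertia start, using the stopping distance v^2 / (2 * deceleration).
    function defaultPhase(startPosition: number, velocity: number, deceleration: number): number {
      const direction = Math.sign(velocity);
      const stoppingDistance = (velocity * velocity) / (2 * deceleration);
      return startPosition + direction * stoppingDistance;
    }

    // Position phase: evaluate the rest points in turn, starting with the point closest
    // to the inertia start position and continuing in the direction of the movement,
    // letting them adjust the proposed rest position.
    function positionPhase(proposed: number, startPosition: number, direction: number, restPoints: RestPoint[]): number {
      const ordered = restPoints
        .filter(p => direction !== 0 && Math.sign(p.position - startPosition) === direction)
        .sort((a, b) => direction * (a.position - b.position));
      let result = proposed;
      for (const point of ordered) {
        result = point.position;
        // Stop once a rest point at or beyond the proposed position has been reached.
        if (direction * (point.position - proposed) >= 0) {
          break;
        }
      }
      return result;
    }

    // Range phase: force the result to coincide with a specific location when it falls
    // within the predefined range around that location.
    function rangePhase(position: number, restPoints: RestPoint[]): number {
      for (const point of restPoints) {
        if (point.snapRange !== undefined && Math.abs(position - point.position) <= point.snapRange) {
          return point.position;
        }
      }
      return position;
    }

    function computeRestPosition(startPosition: number, velocity: number, deceleration: number, restPoints: RestPoint[]): number {
      const proposed = defaultPhase(startPosition, velocity, deceleration);
      const adjusted = positionPhase(proposed, startPosition, Math.sign(velocity), restPoints);
      return rangePhase(adjusted, restPoints);
    }

As a worked example under these assumptions, a flick starting at 0 px with an initial velocity of 2000 px/s and a predefined deceleration of 4000 px/s² yields a default-phase proposal of 2000² / (2 × 4000) = 500 px, which the position and range phases may then pull to an application-specified rest point.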
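
Claim 9 also recites a position-phase variant in which the initial velocity at the inertia start defines the velocity of the subject between the inertia start and the inertia rest position according to a parametric curve, including the velocity of the subject when it passes a given rest point. The sketch below assumes a simple quadratic ease-out curve purely for illustration; the curve, names, and numbers are not taken from the patent.

    // Hypothetical sketch: velocity defined along a parametric ease-out curve between
    // the inertia start and the inertia rest position. The quadratic curve is an assumption.
    function progressAt(position: number, startPosition: number, restPosition: number): number {
      const total = restPosition - startPosition;
      if (total === 0) {
        return 1;
      }
      // Normalized progress clamped to [0, 1].
      return Math.min(Math.max((position - startPosition) / total, 0), 1);
    }

    // Velocity as a function of progress: equals the initial velocity at the inertia
    // start and eases out to zero at the inertia rest position.
    function velocityAt(position: number, startPosition: number, restPosition: number, initialVelocity: number): number {
      const p = progressAt(position, startPosition, restPosition);
      return initialVelocity * Math.pow(1 - p, 2);
    }

    // Example: velocity when the subject passes a rest point at 600 px while moving
    // from 0 px toward a rest position at 1000 px with an initial velocity of 2000 px/s:
    // progress = 0.6, so velocity = 2000 * (1 - 0.6)^2 = 320 px/s.
    const velocityAtRestPoint = velocityAt(600, 0, 1000, 2000);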
US Referenced Citations (736)
Number Name Date Kind
4823283 Diehm et al. Apr 1989 A
5045997 Watanabe Sep 1991 A
5046001 Barker et al. Sep 1991 A
5189732 Kondo Feb 1993 A
5258748 Jones Nov 1993 A
5297032 Trojan et al. Mar 1994 A
5321750 Nadan Jun 1994 A
5339392 Risberg et al. Aug 1994 A
5432932 Chen et al. Jul 1995 A
5463725 Henckel et al. Oct 1995 A
5485197 Hoarty Jan 1996 A
5495566 Kwatinetz Feb 1996 A
5515495 Ikemoto May 1996 A
5574836 Broemmelsiek Nov 1996 A
5598523 Fujita Jan 1997 A
5611060 Belfiore et al. Mar 1997 A
5623613 Rowe et al. Apr 1997 A
5640176 Mundt et al. Jun 1997 A
5650827 Tsumori et al. Jul 1997 A
5657049 Ludolph et al. Aug 1997 A
5675329 Barker Oct 1997 A
5687331 Volk et al. Nov 1997 A
5712995 Cohn Jan 1998 A
5771042 Santos-Gomez Jun 1998 A
5793415 Gregory et al. Aug 1998 A
5819284 Farber et al. Oct 1998 A
5844547 Minakuchi et al. Dec 1998 A
5860073 Ferrel et al. Jan 1999 A
5905492 Straub et al. May 1999 A
5914720 Maples et al. Jun 1999 A
5940076 Sommers et al. Aug 1999 A
5959621 Nawaz et al. Sep 1999 A
5963204 Ikeda et al. Oct 1999 A
6008809 Brooks Dec 1999 A
6008816 Eisler Dec 1999 A
6009519 Jones et al. Dec 1999 A
6011542 Durrani et al. Jan 2000 A
6028600 Rosin et al. Feb 2000 A
6057839 Advani et al. May 2000 A
6064383 Skelly May 2000 A
6104418 Tanaka et al. Aug 2000 A
6108003 Hall, Jr. et al. Aug 2000 A
6111585 Choi Aug 2000 A
6115040 Bladow et al. Sep 2000 A
6166736 Hugh Dec 2000 A
6188405 Czerwinski et al. Feb 2001 B1
6211921 Cherian et al. Apr 2001 B1
6212564 Harter et al. Apr 2001 B1
6216141 Straub et al. Apr 2001 B1
6266098 Cove et al. Jul 2001 B1
6278448 Brown et al. Aug 2001 B1
6281940 Sciammarella Aug 2001 B1
6311058 Wecker et al. Oct 2001 B1
6369837 Schirmer Apr 2002 B1
6385630 Ejerhed May 2002 B1
6396963 Shaffer May 2002 B2
6411307 Rosin et al. Jun 2002 B1
6424338 Andersone Jul 2002 B1
6426753 Migdal Jul 2002 B1
6433789 Rosman Aug 2002 B1
6448987 Easty et al. Sep 2002 B1
6449638 Wecker et al. Sep 2002 B1
6456334 Duhault Sep 2002 B1
6489977 Sone Dec 2002 B2
6505243 Lortz Jan 2003 B1
6507643 Groner Jan 2003 B1
6510144 Dommety et al. Jan 2003 B1
6510466 Cox et al. Jan 2003 B1
6510553 Hazra Jan 2003 B1
6538635 Ringot Mar 2003 B1
6570597 Seki et al. May 2003 B1
6577323 Jamieson et al. Jun 2003 B1
6577350 Proehl et al. Jun 2003 B1
6591244 Jim et al. Jul 2003 B2
6597374 Baker et al. Jul 2003 B1
6628309 Dodson et al. Sep 2003 B1
6636246 Gallo et al. Oct 2003 B1
6662023 Helle Dec 2003 B1
6675387 Boucher et al. Jan 2004 B1
6690387 Zimmerman et al. Feb 2004 B2
6697825 Underwood et al. Feb 2004 B1
6707449 Hinckley et al. Mar 2004 B2
6710771 Yamaguchi et al. Mar 2004 B1
6721958 Dureau Apr 2004 B1
6724403 Santoro et al. Apr 2004 B1
6784925 Tomat et al. Aug 2004 B1
6798421 Baldwin Sep 2004 B2
6801203 Hussain Oct 2004 B1
6807558 Hassett et al. Oct 2004 B1
6832355 Duperrouzel et al. Dec 2004 B1
6857104 Cahn Feb 2005 B1
6865297 Loui Mar 2005 B2
6873329 Cohen et al. Mar 2005 B2
6876312 Yu Apr 2005 B2
6885974 Holle Apr 2005 B2
6904597 Jin Jun 2005 B2
6920445 Bae Jul 2005 B2
6938101 Hayes et al. Aug 2005 B2
6961731 Holbrook Nov 2005 B2
6971067 Karson et al. Nov 2005 B1
6972776 Davis et al. Dec 2005 B2
6975306 Hinckley Dec 2005 B2
6976210 Silva et al. Dec 2005 B1
6978303 McCreesh et al. Dec 2005 B1
6983310 Rouse Jan 2006 B2
6987991 Nelson Jan 2006 B2
7013041 Miyamoto Mar 2006 B2
7017119 Johnston et al. Mar 2006 B1
7019757 Brown et al. Mar 2006 B2
7028264 Santoro et al. Apr 2006 B2
7032187 Keely, Jr. et al. Apr 2006 B2
7036090 Nguyen Apr 2006 B1
7036091 Nguyen Apr 2006 B1
7042460 Hussain et al. May 2006 B2
7051291 Sciammarella et al. May 2006 B2
7058955 Porkka Jun 2006 B2
7065385 Jarrad et al. Jun 2006 B2
7065386 Smethers Jun 2006 B1
7075535 Aguera y Arcas Jul 2006 B2
7089507 Lection et al. Aug 2006 B2
7091998 Miller-Smith Aug 2006 B2
7093201 Duarte Aug 2006 B2
7106349 Baar et al. Sep 2006 B2
7111044 Lee Sep 2006 B2
7133707 Rak Nov 2006 B1
7133859 Wong Nov 2006 B1
7139800 Bellotti et al. Nov 2006 B2
7146573 Brown et al. Dec 2006 B2
7155729 Andrew et al. Dec 2006 B1
7158123 Myers et al. Jan 2007 B2
7158135 Santodomingo et al. Jan 2007 B2
7178111 Glein et al. Feb 2007 B2
7194506 White et al. Mar 2007 B1
7210099 Rohrabaugh et al. Apr 2007 B2
7216588 Suess May 2007 B2
7249326 Stoakley et al. Jul 2007 B2
7262775 Calkins et al. Aug 2007 B2
7263668 Lentz Aug 2007 B1
7280097 Chen Oct 2007 B2
7283620 Adamczyk Oct 2007 B2
7289806 Morris et al. Oct 2007 B2
7296184 Derks et al. Nov 2007 B2
7296242 Agata et al. Nov 2007 B2
7310100 Hussain Dec 2007 B2
7333092 Zadesky et al. Feb 2008 B2
7333120 Venolia Feb 2008 B2
7336263 Valikangas Feb 2008 B2
7369647 Gao et al. May 2008 B2
7376907 Santoro et al. May 2008 B2
7386807 Cummins et al. Jun 2008 B2
7388578 Tao Jun 2008 B2
7403191 Sinclair Jul 2008 B2
7408538 Hinckley et al. Aug 2008 B2
7412663 Lindsay et al. Aug 2008 B2
7433920 Blagsvedt et al. Oct 2008 B2
7447520 Scott Nov 2008 B2
7461151 Colson et al. Dec 2008 B2
7469380 Wessling et al. Dec 2008 B2
7469381 Ording Dec 2008 B2
7478326 Holecek et al. Jan 2009 B2
7479949 Jobs Jan 2009 B2
7480870 Anzures Jan 2009 B2
7483418 Maurer Jan 2009 B2
7487467 Kawahara et al. Feb 2009 B1
7496830 Rubin Feb 2009 B2
7500175 Colle et al. Mar 2009 B2
7512966 Lyons, Jr. et al. Mar 2009 B2
7577918 Lindsay Aug 2009 B2
7581034 Polivy et al. Aug 2009 B2
7593995 He et al. Sep 2009 B1
7595810 Louch Sep 2009 B2
7599790 Rasmussen et al. Oct 2009 B2
7600189 Fujisawa Oct 2009 B2
7600234 Dobrowski et al. Oct 2009 B2
7606714 Williams et al. Oct 2009 B2
7607106 Ernst et al. Oct 2009 B2
7610563 Nelson et al. Oct 2009 B2
7619615 Donoghue Nov 2009 B1
7640518 Forlines et al. Dec 2009 B2
7653883 Hotelling et al. Jan 2010 B2
7657849 Chaudhri et al. Feb 2010 B2
7663607 Hotelling et al. Feb 2010 B2
7664067 Pointer Feb 2010 B2
7671756 Herz et al. Mar 2010 B2
7702683 Kirshenbaum Apr 2010 B1
7755674 Kaminaga Jul 2010 B2
7834861 Lee Nov 2010 B2
7877707 Westerman et al. Jan 2011 B2
7880728 De Los Reyes et al. Feb 2011 B2
7889180 Byun et al. Feb 2011 B2
7895309 Belali et al. Feb 2011 B2
7924271 Christie et al. Apr 2011 B2
7933632 Flynt et al. Apr 2011 B2
7962281 Rasmussen et al. Jun 2011 B2
7983718 Roka Jul 2011 B1
7987431 Santoro et al. Jul 2011 B2
8006276 Nakagawa et al. Aug 2011 B2
8086275 Wykes Dec 2011 B2
8108781 Laansoo et al. Jan 2012 B2
8131808 Aoki et al. Mar 2012 B2
8150924 Buchheit et al. Apr 2012 B2
8171431 Grossman et al. May 2012 B2
8175653 Smuga May 2012 B2
8176438 Zaman et al. May 2012 B2
8209623 Barletta et al. Jun 2012 B2
8225193 Kleinschnitz et al. Jul 2012 B1
8238876 Teng Aug 2012 B2
8245152 Brunner et al. Aug 2012 B2
8250494 Butcher Aug 2012 B2
8255473 Eren et al. Aug 2012 B2
8255812 Parparita et al. Aug 2012 B1
8269736 Wilairat Sep 2012 B2
8271898 Mattos Sep 2012 B1
8307279 Fioravanti et al. Nov 2012 B1
8384726 Grabowski et al. Feb 2013 B1
8429565 Agarawala et al. Apr 2013 B2
8448083 Migos et al. May 2013 B1
8473870 Hinckley et al. Jun 2013 B2
8525808 Buening Sep 2013 B1
8539384 Hinckley et al. Sep 2013 B2
8548431 Teng et al. Oct 2013 B2
8560959 Zaman et al. Oct 2013 B2
8589815 Fong et al. Nov 2013 B2
8612874 Zaman et al. Dec 2013 B2
8624933 Marr et al. Jan 2014 B2
8627227 Matthews et al. Jan 2014 B2
8687023 Markiewicz et al. Apr 2014 B2
8689123 Zaman et al. Apr 2014 B2
8830270 Zaman et al. Sep 2014 B2
8902163 Sawai Dec 2014 B2
20010022621 Squibbs Sep 2001 A1
20020000963 Yoshida et al. Jan 2002 A1
20020018051 Singh Feb 2002 A1
20020035607 Checkoway Mar 2002 A1
20020054117 van Dantzich et al. May 2002 A1
20020060701 Naughton et al. May 2002 A1
20020070961 Xu et al. Jun 2002 A1
20020077156 Smethers Jun 2002 A1
20020091755 Narin Jul 2002 A1
20020097264 Dutta et al. Jul 2002 A1
20020105531 Niemi Aug 2002 A1
20020115476 Padawer et al. Aug 2002 A1
20020128036 Yach et al. Sep 2002 A1
20020129061 Swart et al. Sep 2002 A1
20020138248 Corston-Oliver et al. Sep 2002 A1
20020142762 Chmaytelli et al. Oct 2002 A1
20020145631 Arbab et al. Oct 2002 A1
20020152305 Jackson et al. Oct 2002 A1
20020154176 Barksdale et al. Oct 2002 A1
20020161634 Kaars Oct 2002 A1
20020186251 Himmel et al. Dec 2002 A1
20020194385 Linder et al. Dec 2002 A1
20030003899 Tashiro et al. Jan 2003 A1
20030008686 Park et al. Jan 2003 A1
20030011643 Nishihihata Jan 2003 A1
20030020671 Santoro et al. Jan 2003 A1
20030040300 Bodic et al. Feb 2003 A1
20030046396 Richter et al. Mar 2003 A1
20030073414 Capps Apr 2003 A1
20030096604 Vollandt May 2003 A1
20030105827 Tan et al. Jun 2003 A1
20030135582 Allen et al. Jul 2003 A1
20030187996 Cardina et al. Oct 2003 A1
20030222907 Heikes et al. Dec 2003 A1
20030225846 Heikes et al. Dec 2003 A1
20040066414 Czerwinski et al. Apr 2004 A1
20040068543 Seifert Apr 2004 A1
20040078299 Down-Logan Apr 2004 A1
20040111673 Bowman et al. Jun 2004 A1
20040185883 Rukman Sep 2004 A1
20040212586 Denny Oct 2004 A1
20040217954 O'Gorman et al. Nov 2004 A1
20040217980 Radburn et al. Nov 2004 A1
20040237048 Tojo et al. Nov 2004 A1
20040250217 Tojo et al. Dec 2004 A1
20050005241 Hunleth et al. Jan 2005 A1
20050028208 Ellis Feb 2005 A1
20050044058 Matthews et al. Feb 2005 A1
20050054384 Pasquale et al. Mar 2005 A1
20050060647 Doan et al. Mar 2005 A1
20050060665 Rekimoto Mar 2005 A1
20050079896 Kokko et al. Apr 2005 A1
20050085215 Kokko Apr 2005 A1
20050085272 Anderson et al. Apr 2005 A1
20050108655 Andrea et al. May 2005 A1
20050114788 Fabritius May 2005 A1
20050120306 Klassen et al. Jun 2005 A1
20050143138 Lee et al. Jun 2005 A1
20050149879 Jobs et al. Jul 2005 A1
20050182798 Todd et al. Aug 2005 A1
20050183021 Allen et al. Aug 2005 A1
20050184999 Daioku Aug 2005 A1
20050198159 Kirsch Sep 2005 A1
20050198584 Matthews et al. Sep 2005 A1
20050200762 Barletta et al. Sep 2005 A1
20050216300 Appelman et al. Sep 2005 A1
20050223057 Buchheit et al. Oct 2005 A1
20050223069 Cooperman et al. Oct 2005 A1
20050232166 Nierhaus Oct 2005 A1
20050250547 Salman et al. Nov 2005 A1
20050268237 Crane et al. Dec 2005 A1
20050273614 Ahuja Dec 2005 A1
20050280719 Kim Dec 2005 A1
20060004685 Pyhalammi et al. Jan 2006 A1
20060010394 Chaudhri et al. Jan 2006 A1
20060015736 Callas et al. Jan 2006 A1
20060015812 Cunningham Jan 2006 A1
20060026013 Kraft Feb 2006 A1
20060026521 Hotelling et al. Feb 2006 A1
20060036425 Le Cocq et al. Feb 2006 A1
20060048073 Jarrett et al. Mar 2006 A1
20060048101 Krassovsky et al. Mar 2006 A1
20060059430 Bells Mar 2006 A1
20060061597 Hui Mar 2006 A1
20060070005 Gilbert et al. Mar 2006 A1
20060074735 Shukla et al. Apr 2006 A1
20060074771 Kim Apr 2006 A1
20060075360 Bixler Apr 2006 A1
20060103623 Davis May 2006 A1
20060107231 Matthews et al. May 2006 A1
20060112354 Park et al. May 2006 A1
20060129543 Bates et al. Jun 2006 A1
20060135220 Kim et al. Jun 2006 A1
20060136773 Kespohl et al. Jun 2006 A1
20060152803 Provitola Jul 2006 A1
20060172724 Linkert et al. Aug 2006 A1
20060173911 Levin et al. Aug 2006 A1
20060184901 Dietz Aug 2006 A1
20060190833 SanGiovanni et al. Aug 2006 A1
20060199598 Lee et al. Sep 2006 A1
20060212806 Griffin et al. Sep 2006 A1
20060218234 Deng et al. Sep 2006 A1
20060218501 Wilson et al. Sep 2006 A1
20060224993 Wong et al. Oct 2006 A1
20060246955 Nirhamo Nov 2006 A1
20060253801 Okaro et al. Nov 2006 A1
20060259870 Hewitt et al. Nov 2006 A1
20060259873 Mister Nov 2006 A1
20060262134 Hamiter et al. Nov 2006 A1
20060268100 Karukka et al. Nov 2006 A1
20060271520 Ragan Nov 2006 A1
20060281448 Plestid et al. Dec 2006 A1
20060293088 Kokubo Dec 2006 A1
20060294063 Ali et al. Dec 2006 A1
20060294396 Witman Dec 2006 A1
20070005716 LeVasseur et al. Jan 2007 A1
20070006094 Canfield et al. Jan 2007 A1
20070011610 Sethi et al. Jan 2007 A1
20070015532 Deelman Jan 2007 A1
20070024646 Saarinen Feb 2007 A1
20070035513 Sherrard et al. Feb 2007 A1
20070038567 Allaire et al. Feb 2007 A1
20070050724 Lee et al. Mar 2007 A1
20070054679 Cho et al. Mar 2007 A1
20070061488 Alagappan et al. Mar 2007 A1
20070061714 Stuple et al. Mar 2007 A1
20070063995 Bailey et al. Mar 2007 A1
20070067272 Flynt Mar 2007 A1
20070067737 Zielinski et al. Mar 2007 A1
20070073718 Ramer Mar 2007 A1
20070076013 Campbell Apr 2007 A1
20070080954 Griffin Apr 2007 A1
20070082707 Flynt et al. Apr 2007 A1
20070082708 Griffin Apr 2007 A1
20070083746 Fallon et al. Apr 2007 A1
20070083821 Garbow et al. Apr 2007 A1
20070106635 Frieden et al. May 2007 A1
20070120835 Sato May 2007 A1
20070127638 Doulton Jun 2007 A1
20070157089 Van Os et al. Jul 2007 A1
20070171192 Seo et al. Jul 2007 A1
20070182595 Ghasabian Aug 2007 A1
20070182999 Anthony et al. Aug 2007 A1
20070185847 Budzik et al. Aug 2007 A1
20070192707 Maeda et al. Aug 2007 A1
20070192730 Simila et al. Aug 2007 A1
20070192733 Horiuchi Aug 2007 A1
20070192739 Hunleth et al. Aug 2007 A1
20070197196 Shenfield et al. Aug 2007 A1
20070198420 Goldstein Aug 2007 A1
20070208840 Mcconville et al. Sep 2007 A1
20070211034 Griffin et al. Sep 2007 A1
20070214429 Lyudovyk et al. Sep 2007 A1
20070216651 Patel Sep 2007 A1
20070216661 Chen et al. Sep 2007 A1
20070225022 Satake Sep 2007 A1
20070233654 Karlson Oct 2007 A1
20070236468 Tuli Oct 2007 A1
20070238488 Scott Oct 2007 A1
20070247435 Benko et al. Oct 2007 A1
20070250583 Hardy Oct 2007 A1
20070250787 Kawahara et al. Oct 2007 A1
20070253758 Suess Nov 2007 A1
20070256029 Maxwell Nov 2007 A1
20070257891 Esenther et al. Nov 2007 A1
20070257933 Klassen Nov 2007 A1
20070260674 Shenfield Nov 2007 A1
20070262964 Zotov et al. Nov 2007 A1
20070263843 Foxenland Nov 2007 A1
20070273663 Park et al. Nov 2007 A1
20070273668 Park et al. Nov 2007 A1
20070280457 Aberethy Dec 2007 A1
20070281747 Pletikosa Dec 2007 A1
20080005668 Mavinkurve Jan 2008 A1
20080028294 Sell et al. Jan 2008 A1
20080032681 West Feb 2008 A1
20080036743 Westerman et al. Feb 2008 A1
20080040692 Sunday et al. Feb 2008 A1
20080048986 Khoo Feb 2008 A1
20080052370 Snyder Feb 2008 A1
20080057910 Thoresson et al. Mar 2008 A1
20080057926 Forstall et al. Mar 2008 A1
20080065607 Weber Mar 2008 A1
20080072173 Brunner et al. Mar 2008 A1
20080076472 Hyatt Mar 2008 A1
20080082911 Sorotokin et al. Apr 2008 A1
20080082934 Kocienda et al. Apr 2008 A1
20080085700 Arora Apr 2008 A1
20080092054 Bhumkar et al. Apr 2008 A1
20080094368 Ording et al. Apr 2008 A1
20080095100 Cleveland et al. Apr 2008 A1
20080102863 Hardy May 2008 A1
20080104544 Collins et al. May 2008 A1
20080107057 Kannan et al. May 2008 A1
20080113656 Lee et al. May 2008 A1
20080114535 Nesbitt May 2008 A1
20080122796 Jobs May 2008 A1
20080132252 Altman et al. Jun 2008 A1
20080141153 Samson et al. Jun 2008 A1
20080153551 Baek et al. Jun 2008 A1
20080155425 Murthy et al. Jun 2008 A1
20080162651 Madnani Jul 2008 A1
20080163104 Haug Jul 2008 A1
20080165132 Weiss Jul 2008 A1
20080165136 Christie et al. Jul 2008 A1
20080165141 Christie Jul 2008 A1
20080165163 Bathiche Jul 2008 A1
20080167058 Lee et al. Jul 2008 A1
20080168349 Lamiraux et al. Jul 2008 A1
20080168379 Forstall et al. Jul 2008 A1
20080168382 Louch et al. Jul 2008 A1
20080168402 Blumenberg Jul 2008 A1
20080168403 Westerman et al. Jul 2008 A1
20080172609 Rytivaara Jul 2008 A1
20080174570 Jobs et al. Jul 2008 A1
20080180399 Cheng Jul 2008 A1
20080182628 Lee et al. Jul 2008 A1
20080184112 Chiang et al. Jul 2008 A1
20080189653 Taylor et al. Aug 2008 A1
20080189658 Jeong et al. Aug 2008 A1
20080192056 Robertson et al. Aug 2008 A1
20080198141 Lee et al. Aug 2008 A1
20080200142 Adbel-Kader et al. Aug 2008 A1
20080208973 Hayashi Aug 2008 A1
20080222273 Lakshmanan Sep 2008 A1
20080222545 Lemay et al. Sep 2008 A1
20080222547 Wong et al. Sep 2008 A1
20080222560 Harrison Sep 2008 A1
20080222569 Champion Sep 2008 A1
20080225014 Kim Sep 2008 A1
20080242362 Duarte Oct 2008 A1
20080259042 Thorn Oct 2008 A1
20080261513 Shin et al. Oct 2008 A1
20080261660 Huh et al. Oct 2008 A1
20080263457 Kim et al. Oct 2008 A1
20080270558 Ma Oct 2008 A1
20080297475 Woolf et al. Dec 2008 A1
20080299999 Lockhart et al. Dec 2008 A1
20080301046 Martinez Dec 2008 A1
20080301575 Fermon Dec 2008 A1
20080307351 Louch et al. Dec 2008 A1
20080309626 Westerman et al. Dec 2008 A1
20080316177 Tseng Dec 2008 A1
20080317240 Chang et al. Dec 2008 A1
20080320413 Oshiro Dec 2008 A1
20090007009 Luneau et al. Jan 2009 A1
20090007017 Anzures et al. Jan 2009 A1
20090012952 Fredriksson Jan 2009 A1
20090029736 Kim et al. Jan 2009 A1
20090031247 Walter et al. Jan 2009 A1
20090037469 Kirsch Feb 2009 A1
20090037846 Spalink et al. Feb 2009 A1
20090051671 Konstas Feb 2009 A1
20090061837 Chaudhri et al. Mar 2009 A1
20090061948 Lee et al. Mar 2009 A1
20090064055 Chaudhri Mar 2009 A1
20090070673 Barkan et al. Mar 2009 A1
20090077649 Lockhart Mar 2009 A1
20090083656 Dukhon Mar 2009 A1
20090085851 Lim Apr 2009 A1
20090085878 Heubel Apr 2009 A1
20090089215 Newton Apr 2009 A1
20090089459 Jeyaseelan et al. Apr 2009 A1
20090089704 Makela Apr 2009 A1
20090094562 Jeong et al. Apr 2009 A1
20090103515 Pointer Apr 2009 A1
20090106696 Duarte Apr 2009 A1
20090109243 Kraft Apr 2009 A1
20090117942 Boningue et al. May 2009 A1
20090125844 Weir et al. May 2009 A1
20090140061 Schultz et al. Jun 2009 A1
20090140986 Karkkainen et al. Jun 2009 A1
20090144642 Crystal Jun 2009 A1
20090144653 Ubillos Jun 2009 A1
20090144753 Morris Jun 2009 A1
20090146962 Ahonen et al. Jun 2009 A1
20090153492 Popp Jun 2009 A1
20090160809 Yang Jun 2009 A1
20090163182 Gatti et al. Jun 2009 A1
20090164888 Phan Jun 2009 A1
20090164928 Brown et al. Jun 2009 A1
20090164936 Kawaguchi Jun 2009 A1
20090178007 Matas et al. Jul 2009 A1
20090182788 Chung et al. Jul 2009 A1
20090184939 Wohlstadter et al. Jul 2009 A1
20090199122 Deutsch et al. Aug 2009 A1
20090199128 Matthews et al. Aug 2009 A1
20090199130 Tsern et al. Aug 2009 A1
20090205041 Michalske Aug 2009 A1
20090215504 Lando Aug 2009 A1
20090225038 Bolsinga et al. Sep 2009 A1
20090228825 Van Os et al. Sep 2009 A1
20090228841 Hildreth Sep 2009 A1
20090235200 Deutsch et al. Sep 2009 A1
20090235203 Iizuka Sep 2009 A1
20090248421 Michaelis et al. Oct 2009 A1
20090249257 Bove et al. Oct 2009 A1
20090265662 Bamford Oct 2009 A1
20090271778 Mandyam et al. Oct 2009 A1
20090284482 Chin Nov 2009 A1
20090284657 Roberts et al. Nov 2009 A1
20090288044 Matthews et al. Nov 2009 A1
20090292989 Matthews et al. Nov 2009 A1
20090293007 Duarte et al. Nov 2009 A1
20090298547 Kim et al. Dec 2009 A1
20090303231 Robinet et al. Dec 2009 A1
20090305732 Marcellino et al. Dec 2009 A1
20090307105 Lemay et al. Dec 2009 A1
20090307589 Inose et al. Dec 2009 A1
20090307623 Agarawala et al. Dec 2009 A1
20090313584 Kerr et al. Dec 2009 A1
20090315839 Wilson et al. Dec 2009 A1
20090315847 Fujii Dec 2009 A1
20090322760 Kwiatkowski Dec 2009 A1
20090327969 Estrada Dec 2009 A1
20100005420 Schneider Jan 2010 A1
20100008490 Gharachorloo et al. Jan 2010 A1
20100013782 Liu et al. Jan 2010 A1
20100020025 Lemort et al. Jan 2010 A1
20100020091 Rasmussen et al. Jan 2010 A1
20100031186 Tseng Feb 2010 A1
20100042911 Wormald et al. Feb 2010 A1
20100050076 Roth Feb 2010 A1
20100058248 Park Mar 2010 A1
20100066698 Seo Mar 2010 A1
20100070931 Nichols Mar 2010 A1
20100073380 Kaplan et al. Mar 2010 A1
20100075628 Ye Mar 2010 A1
20100077058 Messer Mar 2010 A1
20100077310 Karachale et al. Mar 2010 A1
20100077330 Kaplan et al. Mar 2010 A1
20100079392 Chiang et al. Apr 2010 A1
20100079413 Kawashima et al. Apr 2010 A1
20100081475 Chiang et al. Apr 2010 A1
20100086022 Hunleth et al. Apr 2010 A1
20100087169 Lin Apr 2010 A1
20100087173 Lin Apr 2010 A1
20100088635 Louch Apr 2010 A1
20100100839 Tseng et al. Apr 2010 A1
20100102998 Fux Apr 2010 A1
20100103118 Townsend et al. Apr 2010 A1
20100103124 Kruzeniski Apr 2010 A1
20100105370 Kruzeniski Apr 2010 A1
20100105424 Smuga Apr 2010 A1
20100105438 Wykes Apr 2010 A1
20100105439 Friedman Apr 2010 A1
20100105440 Kruzeniski Apr 2010 A1
20100105441 Voss Apr 2010 A1
20100106915 Krishnaprasad et al. Apr 2010 A1
20100107067 Vaisanen Apr 2010 A1
20100107068 Butcher Apr 2010 A1
20100107100 Schneekloth Apr 2010 A1
20100122110 Ordogh May 2010 A1
20100138767 Wang et al. Jun 2010 A1
20100145675 Lloyd et al. Jun 2010 A1
20100146437 Woodcock et al. Jun 2010 A1
20100159966 Friedman Jun 2010 A1
20100159994 Stallings et al. Jun 2010 A1
20100159995 Stallings et al. Jun 2010 A1
20100162180 Dunnam et al. Jun 2010 A1
20100167699 Sigmund et al. Jul 2010 A1
20100169766 Duarte et al. Jul 2010 A1
20100169772 Stallings et al. Jul 2010 A1
20100169819 Bestle et al. Jul 2010 A1
20100175018 Petschnigg et al. Jul 2010 A1
20100175029 Williams Jul 2010 A1
20100180233 Kruzeniski Jul 2010 A1
20100185932 Coffman et al. Jul 2010 A1
20100216491 Winkler et al. Aug 2010 A1
20100223569 Vuong et al. Sep 2010 A1
20100248688 Teng Sep 2010 A1
20100248689 Teng Sep 2010 A1
20100248741 Setlur et al. Sep 2010 A1
20100248787 Smuga Sep 2010 A1
20100248788 Yook et al. Sep 2010 A1
20100251153 SanGiovanni et al. Sep 2010 A1
20100265196 Lee et al. Oct 2010 A1
20100281402 Staikos et al. Nov 2010 A1
20100281409 Rainisto et al. Nov 2010 A1
20100283743 Coddington et al. Nov 2010 A1
20100289806 Lao et al. Nov 2010 A1
20100293056 Flynt et al. Nov 2010 A1
20100295795 Wilairat Nov 2010 A1
20100298034 Shin et al. Nov 2010 A1
20100302172 Wilairat Dec 2010 A1
20100302176 Nikula et al. Dec 2010 A1
20100302278 Shaffer et al. Dec 2010 A1
20100302712 Wilairat Dec 2010 A1
20100311470 Seo et al. Dec 2010 A1
20100313165 Louch et al. Dec 2010 A1
20100321403 Inadome Dec 2010 A1
20100328431 Kim et al. Dec 2010 A1
20100329642 Kam et al. Dec 2010 A1
20100333008 Taylor Dec 2010 A1
20110004839 Cha et al. Jan 2011 A1
20110004845 Ciabarra Jan 2011 A1
20110018806 Yano Jan 2011 A1
20110029598 Arnold Feb 2011 A1
20110029904 Smith et al. Feb 2011 A1
20110029927 Lietzke et al. Feb 2011 A1
20110029934 Locker et al. Feb 2011 A1
20110035702 Williams et al. Feb 2011 A1
20110043527 Ording et al. Feb 2011 A1
20110055773 Agarawala et al. Mar 2011 A1
20110074699 Marr Mar 2011 A1
20110074710 Weeldreyer et al. Mar 2011 A1
20110074719 Yeh et al. Mar 2011 A1
20110087988 Ray et al. Apr 2011 A1
20110093778 Kim et al. Apr 2011 A1
20110093816 Chang et al. Apr 2011 A1
20110093821 Wigdor et al. Apr 2011 A1
20110107272 Aguilar May 2011 A1
20110113337 Liu et al. May 2011 A1
20110113486 Hunt et al. May 2011 A1
20110119586 Blinnikka et al. May 2011 A1
20110126156 Krishnaraj et al. May 2011 A1
20110138313 Decker et al. Jun 2011 A1
20110154235 Min et al. Jun 2011 A1
20110157027 Rissa Jun 2011 A1
20110161845 Stallings et al. Jun 2011 A1
20110163968 Hogan Jul 2011 A1
20110173556 Czerwinski et al. Jul 2011 A1
20110173568 Royal, Jr. et al. Jul 2011 A1
20110173569 Howes et al. Jul 2011 A1
20110175930 Hwang et al. Jul 2011 A1
20110202834 Mandryk et al. Aug 2011 A1
20110202866 Huang et al. Aug 2011 A1
20110209039 Hinckley et al. Aug 2011 A1
20110209089 Hinckley et al. Aug 2011 A1
20110209100 Hinckley et al. Aug 2011 A1
20110209101 Hinckley et al. Aug 2011 A1
20110209102 Hinckley et al. Aug 2011 A1
20110209103 Hinckley et al. Aug 2011 A1
20110209104 Hinckley et al. Aug 2011 A1
20110225547 Fong et al. Sep 2011 A1
20110231796 Vigil Sep 2011 A1
20110252346 Chaudhri Oct 2011 A1
20110252380 Chaudhri Oct 2011 A1
20110276864 Oules Nov 2011 A1
20110316884 Giambalvo et al. Dec 2011 A1
20120005584 Seago et al. Jan 2012 A1
20120009903 Schultz et al. Jan 2012 A1
20120028687 Wykes Feb 2012 A1
20120050185 Davydov et al. Mar 2012 A1
20120050332 Nikara et al. Mar 2012 A1
20120062604 Lobo et al. Mar 2012 A1
20120089950 Tseng Apr 2012 A1
20120102433 Falkenburg Apr 2012 A1
20120151397 Oberstein et al. Jun 2012 A1
20120159395 Deutsch et al. Jun 2012 A1
20120159402 Nurmi et al. Jun 2012 A1
20120167008 Zaman Jun 2012 A1
20120167011 Zaman Jun 2012 A1
20120174005 Deutsch Jul 2012 A1
20120174029 Bastide et al. Jul 2012 A1
20120176401 Hayward et al. Jul 2012 A1
20120179992 Smuga Jul 2012 A1
20120210265 Delia et al. Aug 2012 A1
20120212495 Butcher Aug 2012 A1
20120216139 Ording et al. Aug 2012 A1
20120233571 Wever et al. Sep 2012 A1
20120244841 Teng Sep 2012 A1
20120254780 Mouton et al. Oct 2012 A1
20120265644 Roa et al. Oct 2012 A1
20120272181 Rogers Oct 2012 A1
20120290962 Zielinski et al. Nov 2012 A1
20120299968 Wong et al. Nov 2012 A1
20120304068 Zaman et al. Nov 2012 A1
20120304092 Jarrett et al. Nov 2012 A1
20120304108 Jarrett et al. Nov 2012 A1
20120304113 Patten et al. Nov 2012 A1
20120304114 Wong et al. Nov 2012 A1
20120304116 Donahue et al. Nov 2012 A1
20120304117 Donahue et al. Nov 2012 A1
20120304118 Donahue et al. Nov 2012 A1
20120311485 Caliendo, Jr. et al. Dec 2012 A1
20120323992 Brobst et al. Dec 2012 A1
20130033525 Markiewicz Feb 2013 A1
20130042203 Wong et al. Feb 2013 A1
20130042206 Zaman et al. Feb 2013 A1
20130044141 Markiewicz Feb 2013 A1
20130047079 Kroeger et al. Feb 2013 A1
20130047105 Jarrett Feb 2013 A1
20130047117 Deutsch Feb 2013 A1
20130057587 Leonard et al. Mar 2013 A1
20130057588 Leonard Mar 2013 A1
20130063442 Zaman Mar 2013 A1
20130063443 Garside Mar 2013 A1
20130063465 Zaman Mar 2013 A1
20130063490 Zaman Mar 2013 A1
20130067381 Yalovsky Mar 2013 A1
20130067390 Kwiatkowski Mar 2013 A1
20130067391 Pittappilly Mar 2013 A1
20130067398 Pittappilly Mar 2013 A1
20130067399 Elliott Mar 2013 A1
20130067412 Leonard et al. Mar 2013 A1
20130067420 Pittappilly Mar 2013 A1
20130093757 Cornell Apr 2013 A1
20130111396 Brid May 2013 A1
20130169649 Bates et al. Jul 2013 A1
20130176316 Bates et al. Jul 2013 A1
20140082552 Zaman Mar 2014 A1
20140109008 Zaman Apr 2014 A1
20140208260 Kawahara et al. Jul 2014 A1
20150193109 Takahashi Jul 2015 A1
20160070429 Clark Mar 2016 A1
Foreign Referenced Citations (62)
Number Date Country
1734440 Feb 2006 CN
1902575 Jan 2007 CN
101114303 Jan 2008 CN
101809531 Aug 2010 CN
101819498 Sep 2010 CN
102004603 Apr 2011 CN
102033698 Apr 2011 CN
102197377 Sep 2011 CN
102197702 Sep 2011 CN
102591579 Jul 2012 CN
103809891 May 2014 CN
103984500 Aug 2014 CN
0583060 Feb 1994 EP
1752868 Feb 2007 EP
2004227393 Aug 2004 JP
2004264990 Sep 2004 JP
2004357257 Dec 2004 JP
2012507077 Mar 2012 JP
2012230571 Nov 2012 JP
2013519952 May 2013 JP
2013186726 Sep 2013 JP
2014137792 Jul 2014 JP
2014149590 Aug 2014 JP
200303655 Feb 2003 KR
20060019198 Mar 2006 KR
1020070036114 Apr 2007 KR
1020070098337 Oct 2007 KR
20070120368 Dec 2007 KR
1020080025951 Mar 2008 KR
1020080041809 May 2008 KR
1020080076390 Aug 2008 KR
100854333 Sep 2008 KR
1020080084156 Sep 2008 KR
1020080113913 Dec 2008 KR
1020090041635 Apr 2009 KR
20100010072 Feb 2010 KR
20100048375 May 2010 KR
20100056369 May 2010 KR
1020100056369 May 2010 KR
2011116315 Oct 2012 RU
201023026 Jun 2010 TW
WO-9926127 May 1999 WO
WO-0129976 Apr 2001 WO
WO-2005026931 Mar 2005 WO
WO-2005027506 Mar 2005 WO
WO-2006019639 Feb 2006 WO
WO-2007121557 Nov 2007 WO
WO-2007134623 Nov 2007 WO
WO-2008030608 Mar 2008 WO
WO-2008031871 Mar 2008 WO
WO-2008035831 Mar 2008 WO
WO-2009000043 Dec 2008 WO
WO-2009012398 Jan 2009 WO
WO-2009049331 Apr 2009 WO
WO-2010024969 Mar 2010 WO
WO-2010048229 Apr 2010 WO
WO-2010048448 Apr 2010 WO
WO-2010048519 Apr 2010 WO
WO-2010117643 Oct 2010 WO
WO-2010125451 Nov 2010 WO
WO-2010135155 Nov 2010 WO
WO-2011041885 Apr 2011 WO
Non-Patent Literature Citations (329)
Entry
“Adobe Acrobat 8 Standard User Guide”, Adobe Systems Incorporated, 2007, pp. 34 & 36.
“Advisory Action”, U.S. Appl. No. 12/414,382, dated Jan. 20, 2012, 3 pages.
“Advisory Action”, U.S. Appl. No. 12/433,605, dated Apr. 5, 2012, 3 pages.
“Alltel Adds Dedicated Search Key to Phones”, Retrieved from: <http://www.phonescoop.com/news/item.php?n=2159> on Nov. 26, 2008., Apr. 12, 2007, 2 Pages.
“Android 2.3 User's Guide”, AUG-2.3-103, Android mobile technology platform 2.3, Dec. 13, 2010, 380 pages.
“Apple iPhone—8GB AT&T”, Retrieved from: <http://nytimes.com.com/smartphones/apple-iphone-8gb-at/4515-6452_7-32309245.html> on Nov. 20, 2008, Jun. 29, 2007, 11 pages.
“Application User Model IDs”, Retrieved from: <http://msdn.microsoft.com/en-us/library/dd378459(VS.85).aspx> on Sep. 28, 2010, 2010, 6 pages.
“Ask Web Hosting”, Retrieved from: <http://www.askwebhosting.com/story/18501/HTC_FUZE_From_ATandampT_Fuses_Fun_and_Function_With_the_One-Touch_Power_of_TouchFLO_3D.html> on May 5, 2009., Nov. 11, 2008, 3 pages.
“Basics of Your Device: Get Familiar with the Home Screen”, Nokia USA—How to—retrieved from <http://www.nokia.ca/get-support-and-software/product-support/c6-01/how-to#> on May 11, 2011, 3 pages.
“Blackberry office tools: Qwerty Convert”, Retrieved from: <http://blackberrysoftwarelist.net/blackberry/download-software/blackberry-office/qwerty_convert.aspx> on Nov. 20, 2008, Nov. 20, 2008, 1 page.
“Calc4M”, Retrieved from: <http://www.hellebo.com/Calc4M.html> on Dec. 11, 2008, Sep. 10, 2008, 4 Pages.
“Class ScrollView”, Retrieved from: <http://www.blackberry.com/developers/docs/6.0.0api/net/rim/device/api/ui/ScrollView.html> on Sep. 28, 2010, 13 pages.
“Content-Centric E-Mail Message Analysis in Litigation Document Reviews”, Retrieved from: <http://www.busmanagement.com/article/Issue-14/Data-Management/Content-Centric-E-Mail-Message-Analysis-in-Litigation-Document-Reviews/> on May 6, 2009, 2009, 5 Pages.
“Dial a number”, Retrieved from: <http://www.phonespell.org/ialhelp.html> on Nov. 20, 2008, Nov. 20, 2008, 1 page.
“DuoSense™ Multi-Touch Gestures”, Retrieved from: <http://www.n-trig.com/Data/Uploads/Misc/DuoSenseMTG_final.pdf>, Jul. 2008, 4 pages.
“Elecont Quick Desktop 1.0.43”, Retrieved from: <http://handheld.softpedia.com/get/System-Utilities/Launcher-Applications/Elecont-Quick-Desktop-72131.shtml> on May 5, 2009., Mar. 13, 2009, 2 pages.
“Email Notification for Microsoft Outlook and Outlook Express”, Retrieved from: <http://www.contextmagic.com/express-notification/> on Sep. 29, 2010, Jul. 21, 2004, 3 pages.
“Enhanced IBM Power Systems Software and PowerVM Restructuring”, IBM United States Announcement 208-082, dated Apr. 8, 2008, available at <http://www.ibm.com/common/ssi/rep_ca/2/897/ENUS208-082/ENUS208082.PDF>,Apr. 8, 2008, pp. 1-19.
“Exclusive: Windows Mobile 7 to Focus on Touch and Motion Gestures”, Retrieved from: <http://anti-linux.blogspot.com/2008/08/exclusive-windows-mobile-7-to-focus-on.html> on May 6, 2009, Aug. 1, 2008, 14 pages.
“Extended European Search Report”, EP Application No. 09818253.8, dated Apr. 10, 2012, 7 pages.
“EXtreme Energy Conservation: Advanced Power-Saving Software for Wireless Devices”, White Paper, Freescale Semiconductor, Inc., Document No. XTMENRGYCNSVWP, Rev #0, available at <http://www.freescale.com/files/32bit/doc/white_paper/XTMENRGYCNSVWP.pdf>,Feb. 2006, 15 pages.
“Final Office Action”, U.S. Appl. No. 11/305,789, dated Apr. 1, 2009, 10 pages.
“Final Office Action”, U.S. Appl. No. 11/502,264, dated Feb. 4, 2010, 15 pages.
“Final Office Action”, U.S. Appl. No. 11/502,264, dated Mar. 29, 2013, 16 pages.
“Final Office Action”, U.S. Appl. No. 11/502,264, dated Apr. 3, 2009, 9 pages.
“Final Office Action”, U.S. Appl. No. 12/244,545, dated Dec. 7, 2011, 16 pages.
“Final Office Action”, U.S. Appl. No. 12/244,545, dated Sep. 7, 2012, 23 pages.
“Final Office Action”, U.S. Appl. No. 12/413,977, dated Nov. 17, 2011, 16 pages.
“Final Office Action”, U.S. Appl. No. 12/414,382, dated Dec. 23, 2011, 7 pages.
“Final Office Action”, U.S. Appl. No. 12/414,476, dated Dec. 1, 2011, 20 pages.
“Final Office Action”, U.S. Appl. No. 12/433,605, dated Feb. 3, 2012, 11 pages.
“Final Office Action”, U.S. Appl. No. 12/433,667, dated Sep. 13, 2011, 17 pages.
“Final Office Action”, U.S. Appl. No. 12/469,458, dated Nov. 17, 2011, 15 pages.
“Final Office Action”, U.S. Appl. No. 12/469,480, dated Feb. 9, 2012, 17 pages.
“Final Office Action”, U.S. Appl. No. 12/484,799, dated Apr. 30, 2012, 13 pages.
“Final Office Action”, U.S. Appl. No. 12/560,081, dated Mar. 14, 2012, 16 pages.
“Final Office Action”, U.S. Appl. No. 12/721,422, dated Mar. 7, 2013, 10 pages.
“Final Office Action”, U.S. Appl. No. 12/972,967, dated Oct. 11, 2013, 21 pages.
“Final Office Action”, U.S. Appl. No. 12/983,106, dated Oct. 7, 2013, 19 pages.
“Final Office Action”, U.S. Appl. No. 13/073,300, dated Apr. 1, 2014, 14 pages.
“Final Office Action”, U.S. Appl. No. 13/118,204, dated Nov. 21, 2013, 24 pages.
“Final Office Action”, U.S. Appl. No. 13/118,321, dated Dec. 19, 2013, 30 pages.
“Final Office Action”, U.S. Appl. No. 13/118,333, dated Apr. 23, 2014, 22 pages.
“Final Office Action”, U.S. Appl. No. 13/118,339, dated Aug. 22, 2013, 21 pages.
“Final Office Action”, U.S. Appl. No. 13/118,347, dated Aug. 15, 2013, 25 pages.
“Final Office Action”, U.S. Appl. No. 13/224,258, dated Jul. 18, 2014, 39 pages.
“Final Office Action”, U.S. Appl. No. 13/224,258, dated Sep. 11, 2013, 37 pages.
“Final Office Action”, U.S. Appl. No. 13/228,707, dated May 21, 2014, 16 pages.
“Final Office Action”, U.S. Appl. No. 13/228,876, dated Jul. 18, 2014, 15 pages.
“Final Office Action”, U.S. Appl. No. 13/229,155, dated Jun. 12, 2014, 15 pages.
“Final Office Action”, U.S. Appl. No. 13/229,693, dated Sep. 4, 2013, 23 pages.
“Final Office Action”, U.S. Appl. No. 13/655,386, dated Jun. 6, 2013, 34 pages.
“Final Office Action”, U.S. Appl. No. 13/656,354, dated Jun. 17, 2013, 14 pages.
“Final Office Action”, U.S. Appl. No. 13/656,574, dated Aug. 23, 2013, 20 pages.
“Final Office Action”, U.S. Appl. No. 13/657,621, dated Sep. 10, 2013, 18 pages.
“Final Office Action”, U.S. Appl. No. 13/657,646, dated May 6, 2013, 12 pages.
“Final Office Action”, U.S. Appl. No. 13/657,789, dated Jun. 21, 2013, 35 pages.
“First Examination Report”, NZ Application No. 618269, dated May 20, 2014, 2 pages.
“First Examination Report”, NZ Application No. 618284, dated May 20, 2014, 2 pages.
“Foreign Office Action”, CN Application No. 201110429183.5, dated Jan. 6, 2014, 10 Pages.
“Foreign Office Action”, CN Application No. 201110437542.1, dated Jan. 6, 2014, 10 Pages.
“Foreign Office Action”, CN Application No. 201110437572.2, dated Dec. 3, 2013, 7 pages.
“Foreign Office Action”, CN Application No. 201110454251.3, dated Dec. 27, 2013, 12 Pages.
“Foreign Office Action”, CN Application No. 201180071186.4, dated Jun. 13, 2014, 12 pages.
“Freeware.mobi”, Retrieved from: <http://www.palmfreeware.mobi/download-palette.html> on Nov. 6, 2008, Oct. 9, 2001, 2 pages.
“Gestures Programming”, Retrieved from <http://doc.qt.digia.com/4.6/gestures-overview.html> on May 28, 2014, 2010, 3 pages.
“GnomeCanvas”, Retrieved from: <http://library.gnome.org/devel/libgnomecanvas/unstable/GnomeCanvas.html> on Sep. 28, 2010, 11 pages.
“Guidelines for Panning”, Retrieved From: <http://msdn.microsoft.com/en-in/library/windows/apps/hh465310.aspx> Aug. 19, 2014, Dec. 9, 2012, 5 Pages.
“How Do I Cancel a “Drag” Motion on an Android Seekbar?”, retrieved from <http://stackoverflow.com/questions/2917969/how-do-i-cancel-a-drag-motion-on-an-android-seekbar> on Jun. 20, 2011, May 28, 2010, 1 page.
“How do I use Categories with my Weblog?”, Retrieved from: <http://tpsupport.mtcs.sixapart.com/tp/us-tp1/how_do_i_use_categories_with_my_weblog.html> on Sep. 28, 2010, Sep. 16, 2009, 3 pages
“How do you dial 1-800-Flowers”, Retrieved from: <http://blogs.msdn.com/windowsmobile/archive/2007/02/06/how-do-you-dial-1-800-flowers.aspx> on Nov. 20, 2008, Feb. 6, 2007, 24 pages.
“HTC Shows HTC Snap with Snappy Email Feature”, Retrieved from: <http://www.wirelessandmobilenews.com/smartphones/ on May 5, 2009>, May 4, 2009, 10 Pages.
“Image Gestures Example”, Retrieved from <http://doc.qt.digia.com/4.6/gestures-imagegestures.html> on May 28, 2014, 2010, 3 pages.
“IntelliScreen—New iPhone App Shows Today Screen Type Info in Lock Screen”, Retrieved from: <http://justanotheriphoneblog.com/wordpress//2008/05/13/intelliscreen-new-iphone-app-shows-today-screen-type-info-on-lock-screen/> on Nov. 12, 2008, May 13, 2008, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2010/028555, dated Oct. 12, 2010, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2010/028699, dated Oct. 4, 2010, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/067075, dated Dec. 12, 2012, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2009/061864, dated May 14, 2010, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2009/061382, dated May 26, 2010, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/055725, dated Sep. 27, 2012, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2009/061735, dated Jun. 7, 2010, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2010/034772, dated Dec. 29, 2010, 12 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2012/047091, dated Dec. 27, 2012, 15 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2010/038730, dated Jan. 19, 2011, 8 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/055513, dated Mar. 27, 2012, 8 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/055514, dated May 22, 2012, 8 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/055512, dated May 24, 2012, 8 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/055520, dated May 9, 2012, 8 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/055524, dated Jun. 1, 2012, 8 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/065702, dated Aug. 29, 2012, 8 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/055736, dated Sep. 17, 2012, 8 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/067073, dated Sep. 17, 2012, 8 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/055511, dated Apr. 24, 2012, 9 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/055523, dated May 10, 2012, 9 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/055521, dated May 15, 2012, 9 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/055522, dated May 15, 2012, 9 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/055496, dated Sep. 12, 2012, 9 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/055712, dated Sep. 21, 2012, 9 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/055493, 9/26/212, 9 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/055478, dated Sep. 27, 2012, 9 pages
“International Search Report and Written Opinion”, Application No. PCT/US2011/05576, dated Sep. 27, 2012, 9 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2010/028553, Application Filing Date: Mar. 24, 2010, dated Nov. 9, 2010, 9 pages.
“Internet Explorer Window Restrictions”, Retrieved from: http://technet.microsoft.com/en-us/library/cc759517(WS.10).aspx on Jun. 28, 2011, Microsoft TechNet, 5 pages.
“Introduction to Windows Touch”, Retrieved from: <http://download.microsoft.com/download/a/d/f/adf1347d-08dc-41a4-9084-623b1194d4b2/Win7_touch.docx>, Dec. 18, 2008, pp. 1-7.
“IPad User Guide”, retrieved from <http://cyndidannerkuhn.info/CDK/iPads_Resources_files/iPad_User_Guide.pdf> on Jun. 17, 2011, 154 pages.
“IPod touch User Guide for iPhone OS 3.0 Software”, Apple Inc., 2009, 153 pages.
“Keyboard (5)”, Retrieved from: <http://landru.uwaterloo.ca/cgi-bin/man.cgi?section=5&topic=keyboard> on Dec. 11, 2008., Aug. 11, 1997, 8 Pages.
“Keyboard Shortcuts”, Retrieved from: <http://www.pctoday.com/editorial/article.asp?article=articles%2F2005%2Ft0311%2F26t11%2F26t11.asp> on Aug. 3, 2009., Nov. 2005, 5 pages.
“Kiosk Browser Chrome Customization Firefox 2.x”, Retrieved from: <http://stlouis-shopper.com/cgi-bin/mozdev-wiki/,pl?ChromeCustomization> on Oct. 22, 2008 Making a new chrome for the kiosk browser, Kiosk Project Kiosk Browser Chrome Customization Firefox-2.x,Aug. 16, 2007, 2 pages.
“Live Photo Gallery—Getting Started—from Camera to Panorama”, Retrieved from: <http://webdotwiz.spaces.live.com/blog/cns!2782760752B93233!1729.entry> on May 5, 2009., Sep. 2008, 7 Pages.
“Magic mouse”, Retrieved from: <http://www.apple.com/magicmouse/> on May 10, 2011, 3 pages.
“MIDTB Tip Sheet: Book Courier”, Retrieved from: <http://www.midtb.org/tipsbookcourier.htm> on Dec. 11, 2008., Sep. 26, 2005, 6 Pages.
“Mobile/UI/Designs/TouchScreen/workingUl”, Retrieved from: <https://wiki.mozilla.org/Mobile/UI/Designs/TouchScreen/workingUI> on Oct. 26, 2009, 2009, 30 pages.
“MoGo beta v.0.4”, Retrieved from: <http://forum.xda-developers.com/showthread.php?t=375196> on Sep. 27, 2010, Mar. 7, 2008, 10 pages.
“MS-Content-Zoom-Snap-Points Property”, Retrieved From: <http://msdn.microsoft.com/en-us/library/windows/apps/hh441259.aspx> Aug. 22, 2014, 2 Pages.
“MS-Scroll-Snap-Type Property”, Retrieved From: <http://msdn.microsoft.com/en-in/library/windows/apps/hh466057.aspx> Aug. 19, 2014, 1 Page.
“Multi-touch”, Retrieved from <http://en.wikipedia.org/wiki/Multi-touch#Microsoft_Surface> on Apr. 24, 2009, Apr. 17, 2009, 8 pages.
“My Favorite Gadgets, System Monitor II”, Retrieved from <http://www.myfavoritegadgets.info/monitors/SystemMonitorll/systemmonitorll.html> on Mar. 12, 2013, Jun. 8, 2010, 5 pages.
“New Features in WhatsUp Gold v12.0”, retrieved from <http://www.netbright.co.th/?name=product&file=readproduct&id=12> on Jun. 10, 2011, 4 pages.
“Nokia E61 Tips and Tricks for Keyboard Shortcuts”, Retrieved from: <http://www.mobiletopsoft.com/board/1810/nokia-e61-tips-and-tricks-for-keyboard-shortcuts.html> on Dec. 17, 2008., Jan. 27, 2006, 2 Pages.
“Non-Final Office Action”, U.S. Appl. No. 13/228,707, dated Oct. 25, 2013, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/228,888, dated Feb. 10, 2014, 21 pages.
“Non-Final Office Action”, U.S. Appl. No. 11/215,052, dated Jun. 23, 2011, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 11/305,789, dated Sep. 21, 2009, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 11/502,264, dated Sep. 30, 2009, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 11/502,264, dated Sep. 14, 2012, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/244,545, dated Mar. 27, 2012, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/244,545, dated Aug. 17, 2011, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/413,977, dated Jul. 19, 2011, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/413,977, dated Jul. 20, 2012, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/414,382, dated Jul. 26, 2011, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/414,434, dated Jan. 17, 2012, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/414,434, dated May 31, 2012, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/414,434, dated Aug. 2, 2011, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/414,455, dated Aug. 29, 2011, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/414,458, dated Jul. 6, 2011, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/414,476, dated Nov. 9, 2012, 22 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/414,476, dated Aug. 3, 2011, 21 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/433,605, dated Jun. 24, 2011, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/433,667, dated Jun. 7, 2011, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/433,667, dated Feb. 3, 2012, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/469,419, dated Nov. 9, 2011, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/469,419, dated May 23, 2012, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/469,458, dated Jul. 1, 2011, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/469,458, dated Sep. 21, 2012, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/469,480, dated Oct. 17, 2012, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/469,480, dated Sep. 22, 2011, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/470,558, dated Nov. 22, 2011, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/480,969, dated Aug. 7, 2012, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/484,799, dated Aug. 11, 2011, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/484,799, dated Aug. 7, 2012, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/484,845, dated Dec. 7, 2011, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/560,081, dated Dec. 7, 2011, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/721,422, dated Oct. 1, 2012, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/972,967, dated Jan. 30, 2013, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/977,584, dated Dec. 7, 2012, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/978,184, dated Jan. 23, 2013, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/983,106, dated Sep. 10, 2014, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/983,106, dated Nov. 9, 2012, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/073,300, dated Jul. 25, 2013, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/118,204, dated Feb. 28, 2013, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/118,257, dated Mar. 5, 2013, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/118,265, dated Jun. 10, 2014, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/118,288, dated Jul. 2, 2014, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/118,292, dated Jun. 6, 2014, 25 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/118,321, dated Jun. 10, 2013, 32 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/118,333, dated Jul. 5, 2013, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/118,339, dated Feb. 11, 2013, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/118,347, dated Feb. 12, 2013, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/196,272, dated Feb. 6, 2013, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/196,272, dated Sep. 3, 2013, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/224,258, dated Jan. 8, 2013, 35 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/228,876, dated Nov. 22, 2013, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/228,931, dated Apr. 7, 2014, 21 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/228,945, dated Apr. 14, 2014, 22 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/229,155, dated Nov. 18, 2013, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/229,556, dated Mar. 28, 2014, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/229,693, dated Mar. 12, 2013, 21 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/229,693, dated Jun. 20, 2014, 25 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/229,702, dated Jul. 3, 2014, 28 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/229,709, dated Apr. 7, 2014, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/492,495, dated Dec. 19, 2012, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/492,495, dated Sep. 17, 2012, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/655,386, dated Dec. 26, 2012, 23 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/655,390, dated Dec. 17, 2012, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/656,354, dated Feb. 6, 2013, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/656,574, dated Jan. 31, 2013, 21 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/657,621, dated Feb. 7, 2013, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/657,621, dated Jul. 18, 2014, 22 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/657,646, dated Aug. 12, 2014, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/657,646, dated Jan. 3, 2013, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/657,789, dated Jan. 9, 2013, 38 pages.
“Normalizing Text: A Java Tutorial by Oracle”, Retrieved from: <http://docs.oracle.com/javase/tutorial/i18n/text/normalizerapi.html> on Apr. 8, 2014, Nov. 11, 2006, 3 pages.
“Notice of Allowance”, U.S. Appl. No. 11/215,052, dated Mar. 14, 2012, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 11/305,789, dated Nov. 23, 2009, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 12/414,382, dated Apr. 4, 2012, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 12/414,434, dated Aug. 17, 2012, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 12/414,455, dated Jan. 4, 2012, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 12/414,458, dated Oct. 31, 2011, 2 pages.
“Notice of Allowance”, U.S. Appl. No. 12/414,458, dated Nov. 29, 2011, 2 pages.
“Notice of Allowance”, U.S. Appl. No. 12/414,458, dated Aug. 10, 2011, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 12/470,558, dated Apr. 2, 2012, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 12/470,558, dated Aug. 23, 2012, 2 pages.
“Notice of Allowance”, U.S. Appl. No. 12/484,799, dated Oct. 22, 2012, 10 pages.
“Notice of Allowance”, U.S. Appl. No. 12/484,845, dated Mar. 16, 2012, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 12/721,422, dated Jul. 11, 2013, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 12/977,584, dated Jun. 19, 2013, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 12/978,184, dated Nov. 6, 2013, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 12/978,184, dated Aug. 2, 2013, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/118,204, dated Jul. 8, 2014, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 13/196,272, dated Nov. 8, 2013, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 13/492,495, dated Apr. 26, 2013, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/655,386, dated Apr. 25, 2014, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 13/655,390, dated May 24, 2013, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/657,789, dated Aug. 4, 2014, 16 pages.
“Notifications”, retrieved from <http://msdn.microsoft.com/en-us/library/aa511497.aspx> on May 10, 2011, 16 pages.
“OmneMon™ System Resource Metrics”, retrieved from <http://www.omnesys.com/documents/OmneMonSRM_Brochure.pdf> on Jun. 10, 2011, 3 pages.
“ONYX Graphics Announces New ONYX Prepedge Job Preparation Software”, retrieved from <http://www.largeformatreview.com/rip-software/433-onyx-graphics-announces-new-onyx-> on May 10, 2011, 2 pages.
“Oracle8i Application Developer's Guide—Advanced Queuing Release 2 (8.1.6)”, Retrieved from: http://www.cs.otago.ac.nz/oradocs/appdev.817/a76938/adq01in5.htm on May 6, 2009., Dec. 1999, 8 pages.
“Oracle8i Application Developer's Guide—Advanced Queuing”, Retrieved from: http://www.cs.umbc.edu/help/oracle8/server.815/a68005/03_adq1i.htm on May 6, 2009., Feb. 1999, 29 Pages.
“Oracle8i Concepts Release 8.1.5”, Retrieved from: http://www.cs.umbc.edu/help/oracle8/server.815/a67781/c16queue.htm on May 6, 2009., Feb. 1999, 10 Pages.
“Palette Extender 1.0.2”, Retrieved from: <http://palette-extender.en.softonic.com/symbian> on Nov. 6, 2008, Jan. 21, 2003, 2 pages.
“Parallax Scrolling”, Retrieved from: <http://en.wikipedia.org/wiki/Parallax_scrolling> on May 5, 2009., May 4, 2009, 3 Pages.
“Push Notifications Overview for Windows Phone”, Retrieved from: <http://msdn.microsoft.com/en-us/library/ff402558%28VS.92%29.aspx> on Sep. 30, 2010, Sep. 3, 2010, 1 page.
“QPinchGesture Class Reference”, Retrieved from <http://doc.qt.digia.com/4.6/qpinchgesture.html> on May 28, 2014, 2010, 6 pages.
“Remapping the Keyboard”, Retrieved from: <http://publib.boulder.ibm.com/infocenter/hodhelp/v9r0/index.jsp?topic=/com.ibm.hod9.doc/help/assignkey.html> on Dec. 11, 2008., Jul. 15, 2005, 5 Pages.
“Restriction Requirement”, U.S. Appl. No. 13/118,265, dated Feb. 27, 2014, 7 pages.
“Restriction Requirement”, U.S. Appl. No. 13/118,288, dated Mar. 4, 2014, 7 pages.
“SecureMe—Anti-Theft Security Application for S60 3rd”, Retrieved from: <http:/www.killermobile.com/newsite/mobile-software/s60-applications/secureme-%11-anti%11theft-security-application-for-s60-3rd.htm> on Jun. 28, 2011, Dec. 15, 2008, 3 pages.
“Snap”, Windows 7 Features—retrieved from <http://windows.microsoft.com/en-US/windows7/products/features/snap> on Sep. 23, 2011, 2 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 12/977,584, dated Sep. 16, 2013, 2 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 12/977,584, dated Oct. 11, 2013, 2 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 12/978,184, dated Feb. 25, 2014, 2 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/655,390, dated Sep. 19, 2013, 2 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/655,390, dated Jul. 25, 2013, 2 pages.
“Symbian Applications”, Retrieved from: <http://symbianfullversion.blogspot.com/2008_12_01_archive.html> on May 5, 2009., Jan. 2009, 51 Pages.
“The Map Screen”, retrieved from <http://www.symbianos.org/whereamiusersguide> on Jun. 17, 2011, 3 pages.
“Top 3 Task Switchers for Android”, TechCredo—retrieved from <http://www.techcredo.com/android/top-3-task-switchers-for-android> on May 11, 2011, Mar. 9, 2011, 5 pages.
“Top Android App: Swipepad”, Best Android Apps Review—retrieved from <http://www.bestandroidappsreview.com/2011/01/top-android-app-swipepad-launcher.html> on May 11, 2011, 4 pages.
“Touch Shell Free”, Retrieved from: <http://www.pocketpcfreeware.mobi/download-touch-shell-free.html> on May 5, 2009., Feb. 23, 2009, 2 Pages.
“User Guide”, retrieved from <http://wireframesketcher.com/help/help.html> on Jun. 17, 2011, 19 pages.
“Windows 8 Is Gorgeous, but Is It More Than Just a Shell? (Video)”, retrieved from <http://techcrunch.com/2011/06/02/windows-8-gorgeous-shell-video/> on Jun. 20, 2011, Jun. 2, 2011, 6 pages.
“Windows Phone 7 (Push Notification)”, retrieved from <http://unknownerror.net/2011-06/windows-phone-7-push-notification-36520> on Jul. 6, 2011, 4 pages.
“Windows Phone 7 Live Tiles”, Retrieved from: <http://www.knowyourmobile.com/microsoft/windowsphone7/startscreen/640737/windows_phone_7_live_tiles.html> on May 11, 2011, Oct. 20, 2010, 3 pages.
“Winterface Review”, Retrieved from: <http://www.mytodayscreen.com/winterface-review/> on Nov. 12, 2008, Jul. 9, 2008, 42 pages.
“Womma”, Retrieved from: <http://www.womma.org/blog/links/wom-trends/> on May 5, 2009., 2007, 70 Pages.
“Working with Multiple Windows”, MSOffice tutorial!—retrieved from <http://www.msoffice-tutorial.com/working-with-multiple-windows.php> on Sep. 23, 2011, 3 pages.
“You've Got Mail 1.4 Build”, retrieved from <http://www.fileshome.com/Shows_Animation_Plays_Sound_Automatic_N . . . > on Jan. 6, 2010, Jun. 18, 2007, 2 pages.
“YUI 3: ScrollView [beta]”, Retrieved from: <http://developer.yahoo.com/yui/3/scrollview/> on Sep. 28, 2010, 5 pages.
Anson, “Pining for Windows Phone 7 controls? We got ya covered! [Announcing the first release of the Silverlight for Windows Phone Toolkit!]”, Retrieved from <http://blogs.msdn.com/b/delay/archive/2010/09/16/pining-for-windows-phone-7-controls-we-got-ya-covered-announcing-the-first-release-of-the-silverlight-for-windows-phone-toolkit.aspx> on May 30, 2014, Sep. 16, 2010, 17 pages.
Bates, “A Framework to Support Large-Scale Active Applications”, University of Cambridge Computer Laboratory—Available at <http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.48.1690&rep=rep1&type=pdf>, 1996, 8 pages.
Beiber, et al., “Screen Coverage: A Pen-Interaction Problem for PDA's and Touch Screen Computers”, In Proceedings of ICWMC 2007, Mar. 2007, 6 pages.
Bjork, et al., “Redefining the Focus and Context of Focus+Context Visualizations”, In Proceedings of INFOVIS 2000—Available at <http://www.johan.redstrom.se/papers/redefining.pdf>, Oct. 2000, 9 pages.
Bowes, et al., “Transparency for Item Highlighting”, Faculty of Computing Science, Dalhousie University—Available at <http://torch.cs.dal.ca/˜dearman/pubs/GI2003-bowes,dearman,perkins-paper.pdf>, 2003, 2 pages.
Bruzzese, “Using Windows 7, Managing and Monitoring Windows 7—Chapter 11”, Que Publishing, May 5, 2010, 33 pages.
Buring, “User Interaction with Scatterplots on Small Screens—A Comparative Evaluation of Geometric-Semantic Zoom and Fisheye Distortion”, IEEE Transactions on Visualization and Computer Graphics, vol. 12, Issue 5, Available at <http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.134.4568&rep=rep1&type=pdf>, Sep. 2006, pp. 829-836.
Carrera, et al., “Conserving Disk Energy in Network Servers”, available at <http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.6.8301&rep=rep1&type=ps>, Nov. 2002, 15 pages.
Cawley, “How to Customize Your Windows Phone 7”, Retrieved from: <http://www.brighthub.com/mobile/windows-mobile-platform/articles/95213.aspx> on May 10, 2011, Nov. 12, 2010, 3 pages.
Cawley, “Windows Phone 7 Customization Tips and Tricks”, retrieved from <http://www.brighthub.com/mobile/windows-mobile-platform/articles/95213.aspx> on Jun. 20, 2011, May 16, 2011, 2 pages.
Cohen, et al., “Wang Tiles for Image and Texture Generation”, In Proceedings of SIGGRAPH 2003—Available <http://research.microsoft.com/en-us/um/people/cohen/WangFinal.pdf>, 2003, 8 pages.
Damien, “7 Ways to Supercharge Multitasking in Android”, retrieved from <http://maketecheasier.com/7-ways-to-supercharge-multitasking-in-android/2011/01/22/> on May 11, 2011, Jan. 22, 2011, 5 pages.
Davis, “A WPF Custom Control for Zooming and Panning”, Retrieved from: <http://www.codeproject.com/KB/WPF/zoomandpancontrol.aspx> on Sep. 28, 2010, Jun. 29, 2010, 21 pages.
Delimarsky, “Sending Tile Push Notifications on Windows Phone 7”, retrieved from <http://mobile.dzone.com/articles/sending-tile-push> on May 10, 2011, Aug. 25, 2010, 2 pages.
Denoue, et al., “WebNC: Efficient Sharing of Web Applications”, In Proceedings of WWW 2009—Available at <http://www.fxpal.com/publications/FXPAL-PR-09-495.pdf>, 2009, 2 pages.
Dolcourt, “Webware”, Retrieved from: <http://news.cnet.com/webware/?categoryId=2010> on May 5, 2009, May 5, 2009, 13 pages.
Dunsmuir, “Selective Semantic Zoom of a Document Collection”, Available at <http://www.cs.ubc.ca/˜tmm/courses/533/projects/dustin/proposal.pdf>, Oct. 30, 2009, pp. 1-9.
Farrugia, et al., “Cell Phone Mini Challenge: Node-Link Animation Award Animating Multivariate Dynamic Social Networks”, IEEE Symposium on Visual Analytics Science and Technology, Columbus, OH, USA, Oct. 21-23, 2008, Oct. 21, 2008, 2 pages.
Fisher, “Cool Discussion of Push Notifications—Toast and Tile—on Windows Phone”, Retrieved from: <http://www.windowsphoneexpert.com/Connection/forums/p/4153/18399.aspx> on Sep. 29, 2010, May 3, 2010, 3 pages.
Foley, “The JavaScript Behind Touch-Friendly Sliders”, Retrieved From: <http://css-tricks.com/the-javascript-behind-touch-friendly-sliders/> Aug. 19, 2014, Jun. 13, 2013, 14 Pages.
Gade, “Samsung Alias u740”, Retrieved from: <http://www.mobiletechreview.com/phones/Samsung-U740.htm> on Nov. 20, 2008, Mar. 14, 2007, 6 pages.
Gao, “A General Logging Service for Symbian based Mobile Phones”, Retrieved from: <http://www.nada.kth.se/utbildning/grukth/exjobb/rapportlistor/2007/rapporter07/gao_rui_07132.pdf.> on Jul. 17, 2008, Feb. 2007, pp. 1-42.
Gralla, “Windows XP Hacks, Chapter 13—Hardware Hacks”, O'Reilly Publishing, Feb. 23, 2005, 25 pages.
Ha, et al., “SIMKEYS: An Efficient Keypad Configuration for Mobile Communications”, Retrieved from: <http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=01362557> on Dec. 17, 2008, Nov. 2004, 7 pages.
“Symbian OS C++ for Mobile Phones vol. 3”, Retrieved from: <http://www.amazon.co.uk/Symbian-OS-Mobile-Phones-Press/dp/productdescription/0470066415> on Oct. 23, 2008, Symbian Press, Jun. 16, 2003, 4 pages.
Hickey, “Google Android has Landed; T-Mobile, HTC Unveil G1”, Retrieved from: <http://www.crn.com/retail/210603348> on Nov. 26, 2008., Sep. 23, 2008, 4 pages.
Horowitz, “Installing and Tweaking Process Explorer part 2”, Retrieved from <http://web.archive.org/web/20110510093838/http://blogs.computerworld.com/16165/installing_and_tweaking_process_explorer_part_2> on Mar. 12, 2013, May 23, 2010, 7 pages.
Janecek, et al., “An Evaluation of Semantic Fisheye Views for Opportunistic Search in an Annotated Image Collection”, Available at <http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.67.3084&rep=rep1&type=pdf>, Feb. 15, 2005, pp. 1-15.
Kcholi, “Windows CE .NET Interprocess Communication”, Retrieved from http://msdn.microsoft.com/en-us/library/ms836784.aspx on Jul. 17, 2008., Jan. 2004, 15 Pages.
Keranen, “OpenGL-based User Interface Toolkit for Symbian Mobile Devices”, Master of Science Thesis, Tampere University of Technology, Department of Information Technology, Apr. 6, 2005, 88 pages.
Kurdi, “Acer GridVista: snap your windows to pre-defined sections on your screen(s)”, Retrieved from <http://www.freewaregenius.com/acer-gridvista-snap-your-windows-to-pre-defined-sections-of-your-screens/> on Jun. 30, 2013, Jan. 19, 2010, 6 pages.
Kurdi, “WinSplit Revolution”, Retrieved from <http://www.freewaregenius.com/winsplit-revolution/> on Jun. 30, 2013, Aug. 22, 2007, 4 Pages.
La, “Parallax Gallery”, Available at <http://webdesignerwall.com/tutorials/parallax-gallery/comment-page-1>, Apr. 25, 2008, 16 pages.
Livingston, et al., “Windows 95 Secrets”, IDG Books Worldwide, 3rd Edition, 1995, pp. 121-127.
Long, “Gmail Manager 0.6”, Retrieved from: <https://addons.mozilla.org/en-US/firefox/addon/1320/> on Sep. 29, 2010, Jan. 27, 2010, 4 pages.
Mann, et al., “Spectrum Analysis of Motion Parallax in a 3D Cluttered Scene and Application to Egomotion”, Journal of the Optical Society of America A, vol. 22, No. 9—Available at <http://www.cs.uwaterloo.ca/˜mannr/snow/josa-mann-langer.pdf>, Sep. 2005, pp. 1717-1731.
Mantia, “Multitasking: What Does It Mean?”, retrieved from <http://mantia.me/blog/multitasking/> on Sep. 23, 2011, 3 pages.
Mao, “Comments of Verizon Wireless Messaging Services, LLC”, Retrieved from: <http://www.ntia.doc.gov/osmhome/warnings/comments/verizon.htm> on May 6, 2009, Aug. 18, 2000, 5 pages.
Marie, “MacBook Trackpad Four Fingers Swipe Left/Right to Switch Applications”, MacBook Junkie—retrieved from <http://www.macbookjunkie.com/macbook-trackpad-four-fingers-swipe-left-right-to-switch-applications/> on May 11, 2011, Nov. 13, 2010, 4 pages.
Mei, et al., “Probabilistic Multimodality Fusion for Event Based Home Photo Clustering”, Retrieved from: <http://ieeexplore.ieee.org//stamp/stamp.jsp?tp=&arnumber=04036960.>, Dec. 26, 2006, pp. 1757-1760.
Nordgren, “Development of a Touch Screen Interface for Scania Interactor”, Master's Thesis in Computing Science, UMEA University—Available at <http://www.cs.umu.se/education/examina/Rapporter/PederNordgren.pdf>, Apr. 10, 2007, pp. 1-59.
Oliver, “Potential iPhone Usability and Interface Improvements”, Retrieved from: <http://www.appleinsider.com/articles/08/09/18/potential_iphone_usability_and_interface_improvements.html> on Nov. 12, 2008, AppleInsider, Sep. 18, 2008, 4 pages.
Oryl, “Review: Asus P527 Smartphone for North America”, Retrieved from: <http://www.mobileburn.com/review.jsp?Id=4257> on Dec. 17, 2008, Mar. 5, 2008, 1 page.
Padilla, “Palm Treo 750 Cell Phone Review—Hardware”, Retrieved from: <http://www.wirelessinfo.com/content/palm-Treo-750-Cell-Phone-Review/Hardware.htm> on Dec. 11, 2008, Mar. 17, 2007, 4 pages.
Paul, “Hands-on: KDE 4.5 Launches with Tiling, New Notifications”, Retrieved from: <http://arstechnica.com/open-source/reviews/2010/08/hands-on-kde-45-launches-with-tiling-new-notifications.ars> on Sep. 29, 2010, Aug. 2010, 3 pages.
Perry, “Teach Yourself Windows 95 in 24 Hours”, Sams Publishing, 2nd Edition, 1997, pp. 193-198.
“Scrollsnap”, Retrieved From: <http://benoit.pointet.info/stuff/jquery-scrollsnap-plugin/> Aug. 19, 2014, Jun. 29, 2013, 3 Pages.
Raghaven, et al., “Model Based Estimation and Verification of Mobile Device Performance”, Available at <http://alumni.cs.ucsb.edu/˜raimisl/emsoft04_12.pdf>, Sep. 27-29, 2004, 10 pages.
Rakow, et al., “CSS Scroll Snap Points Module Level 1”, Retrieved From: <http://dev.w3.org/csswg/css-snappoints/> Aug. 19, 2014, Mar. 5, 2014, 18 Pages.
Ray, “Microsoft Re-Tiles Mobile Platform for Windows 7 Era”, retrieved from <http://www.theregister.co.uk/2010/02/15/windows_phone_7_series/> on May 11, 2011, Feb. 15, 2010, 2 pages.
Reed, “Microsoft Demos Windows Mobile 6.1 at CTIA”, Retrieved from: <http://www.networkworld.com/news/2008/040208-ctia-microsoft-windows-mobile.html> on Jul. 18, 2008, Apr. 2, 2008, 1 page.
Remond, “Mobile Marketing Solutions”, Retrieved from: <http://www.mobilemarketingmagazine.co.uk/mobile_social_networking/> on May 5, 2009, Apr. 28, 2009, 16 pages.
Rice, et al., “A System for Searching Sound Palettes”, Proceedings of the Eleventh Biennial Symposium on Arts and Technology, Available at <http://www.comparisonics.com/FindSoundsPalettePaper.pdf>, Feb. 2008, 6 pages.
Ritchie, “iOS 4 features: iPod touch Wi-Fi stays connected when asleep—iPhone too?”, Retrieved from: <http://www.goip.com/2010/06/ios-4-features-ipod-touch-wi-fi-stays-connected-when-asleep-%E2%80%94-iphone-too/> on Sep. 30, 2010, Jun. 14, 2010, 2 pages.
Ritscher, “Using Surface APIs in your WPF application—Part 1”, Retrieved from: <http://blog.wpfwonderland.com/2009/06/30/using-surface-apis-in-your-wpf-application/> on Sep. 28, 2010, Jun. 30, 2009, 7 pages.
Roberts, “Touching and Gesturing on the iPhone”, Available at <http://www.sitepen.com/blog/2008/07/10/touching-and-gesturing-on-the-iphone/comments-pare-1>, Jul. 10, 2008, 16 pages.
Rossi, et al., “Enabling New Interoperable Panning Experiences Through the CSS Scrolling Snap Points Specification”, Retrieved From: <http://blogs.msdn.com/b/ie/archive/2013/10/22/enabling-new-interoperable-panning-experiences-through-the-css-scrolling-snap-points-specification.aspx> Aug. 22, 2014, Oct. 22, 2013, 4 Pages.
Sandoval, “A development platform and execution environment for mobile applications”, Universidad Autónoma de Baja California, School of Chemical Sciences and Engineering, Available at <http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.86.7989&rep=rep1&type=pdf>, 2004, 18 pages.
Singh, et al., “CINEMA: Columbia InterNet Extensible Multimedia Architecture”, Available at <http://www1.cs.columbia.edu/˜library/TR-repository/reports/reports-2002/cucs-011-02.pdf>, Sep. 3, 2002, 83 Pages.
Smith, et al., “GroupBar: The TaskBar Evolved”, Proceedings of OZCHI 2003—Available at <http://research.microsoft.com/pubs/64316/ozchi2003-groupbar.pdf>, Nov. 2003, pp. 1-10.
Steinicke, et al., “Multi-Touching 3D Data: Towards Direct Interaction in Stereoscopic Display Environments coupled with Mobile Devices”, Advanced Visual Interfaces (AVI) Workshop on Designing Multi-Touch Interaction Techniques for Coupled Public, Available at <http://viscg.uni-muenster.de/publications/2008/SHSK08/ppd-workshop.pdf>, Jun. 15, 2008, 4 pages.
Storey, “Setting Native-Like Scrolling Offsets in CSS with Scrolling Snap Points”, Retrieved From: <http://generatedcontent.org/post/66817675443/setting-native-like-scrolling-offsets-in-css-with> Aug. 19, 2014, Nov. 18, 2013, 9 pages.
Suror, “PocketShield—New Screenlock App for the HTC Diamond and Pro”, Retrieved from: <http://wmpoweruser.com/?tag=htc-touch-diamond> on Jun. 28, 2011, Oct. 23, 2008, 2 pages.
Terpstra, “Beta Beat: Grape, a New Way to Manage Your Desktop Clutter”, Retrieved from: http://www.tuaw.com/2009/04/14/beta-beat-grape-a-new-way-to-manage-your-desktop-clutter/, Apr. 14, 2009, 4 pages.
Vallerio, et al., “Energy-Efficient Graphical User Interface Design”, Retrieved from: <http://www.cc.gatech.edu/classes/AY2007/cs7470_fall/zhong-energy-efficient-user-interface.pdf>, Jun. 10, 2004, pp. 1-13.
Vermeulen, “BlackBerry PlayBook Hands-on”, retrieved from <http://mybroadband.co.za/news/gadgets/20104-BlackBerry-PlayBook-hands-.html> on May 11, 2011, May 8, 2011, 4 pages.
Viticci, “Growl 1.3 to Be Released on Mac App Store, Introduce Lion Support and Drop GrowlMail Support”, Retrieved from: <http://www.macstories.net/stories/growl-1-3-to-be-released-on-mac-app-store-introduce-lion-support-and-drop-growlmail-support/> on Jul. 22, 2011, Jul. 6, 2011, 6 pages.
Vornberger, “Bluetile”, Retrieved from: <http://www.bluetile.org> on Sep. 29, 2010, 5 pages.
Wilson, “How the iPhone Works”, Retrieved from: <http://electronics.howstuffworks.com/iphone2.htm> on Apr. 24, 2009, Jan. 2007, 9 pages.
Wilson, “Robust Computer Vision-Based Detection of Pinching for One and Two-Handed Gesture Input”, In Proceedings of UIST 2006—Available at <http://research.microsoft.com/en-us/um/people/awilson/publications/wilsonuist2006/uist%202006%20taffi.pdf>, Oct. 2006, 4 pages.
Wobbrock, et al., “User-Defined Gestures for Surface Computing”, CHI 2009, Apr. 4-9, 2009, Boston, MA—available at <http://faculty.washington.edu/wobbrock/pubs/chi-09.2.pdf>, Apr. 4, 2009, 10 pages.
Wu, et al., “Achieving a Superior Ownership Experience in Manageability and Quality for Siebel CRM”, available at <http://www.oracle.com/us/products/enterprise-manager/superior-exp-for-siebel-crm-068962.pdf>, Aug. 2008, 25 pages.
Wyatt, “/Flash/the art of parallax scrolling”, .net Magazine, Aug. 1, 2007, pp. 74-76.
Yang, et al., “Semantic Photo Album Based on MPEG-4 Compatible Application Format”, Retrieved from: <http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=04146254.>, 2007, 2 Pages.
“International Search Report and Written Opinion Issued in PCT Application No. PCT/US2015/048751”, dated Nov. 27, 2015, 13 Pages.
The PCT International Preliminary Report on Patentability dated Sep. 9, 2016 for PCT application No. PCT/US2015/048751, 16 pages.
The PCT Written Opinion of the International Preliminary Examining Authority for PCT application No. PCT/US2015/048751, dated Jul. 12, 2016, 5 pages.
“Office Action Issued in Japanese Patent Application No. 2017-511294”, dated Jun. 24, 2019, 6 Pages.
“First Office Action and Search Report Issued in Chinese Patent Application No. 201580048605.0”, dated May 20, 2019, 41 Pages.
“Office Action and Search Report Issued in Russian Patent Application No. 2017107164”, dated Apr. 12, 2019, 17 Pages.
Related Publications (1)
Number: 20160070357 A1; Date: Mar. 2016; Country: US