Haptic feedback for a touch input device

Information

  • Patent Grant
  • Patent Number
    10,061,385
  • Date Filed
    Friday, January 22, 2016
  • Date Issued
    Tuesday, August 28, 2018
Abstract
Techniques for haptic feedback for a touch input device are described. Generally, haptic feedback is provided for different user interactions with a touch input device, such as interactions with applications, services, and so forth. According to various embodiments, how haptic feedback is initiated depends on whether different functionalities directly support haptic feedback. For instance, techniques described herein enable haptic feedback to be provided whether or not a particular functionality directly supports haptic feedback.
Description
BACKGROUND

Modern computing devices utilize a variety of different types of feedback to indicate to users that certain functionalities are available and that certain actions are occurring or about to occur. For instance, when a user hovers a cursor over a hyperlink, visual feedback can be presented that indicates that the hyperlink is selectable to navigate to a particular network location. In another example, audio feedback can be presented to indicate an incoming communication, such as a new instant message.


One particularly useful type of feedback is haptic feedback, which provides tactilely-perceptible feedback via various mechanisms. For instance, a touchscreen may employ a tactile device (e.g., a piezo-electric device) to provide a localized vibration when a user presses a virtual button displayed on the touchscreen. Such haptic feedback represents a tactile reinforcement that the user has successfully selected the virtual button, and may be combined with other types of feedback (e.g., visual and audio feedback) to increase the perceptibility of certain actions and functionalities. While haptic feedback can be leveraged in a variety of scenarios, it can be difficult to comprehensively incorporate across different applications and services that may not have the ability to invoke haptic mechanisms.


SUMMARY

This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.


Techniques for haptic feedback for a touch input device are described. Generally, haptic feedback is provided for different user interactions with a touch input device, such as interactions with applications, services, and so forth. According to various embodiments, how haptic feedback is initiated depends on whether different functionalities directly support haptic feedback. For instance, techniques described herein enable haptic feedback to be provided whether or not a particular functionality directly supports haptic feedback.





BRIEF DESCRIPTION OF THE DRAWINGS

The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items.



FIG. 1 is an illustration of an environment in an example implementation that is operable to employ techniques discussed herein.



FIG. 2 illustrates an example implementation scenario for an application that directly supports haptic feedback in accordance with one or more embodiments.



FIG. 3 depicts an example implementation scenario for an application that does not directly support haptic feedback in accordance with one or more embodiments.



FIG. 4 is a flow diagram that describes steps in a method for causing output of haptic feedback in accordance with one or more embodiments.



FIG. 5 is a flow diagram that describes steps in a method for determining whether haptic feedback is to be generated based on an external haptic event or an internal haptic event in accordance with one or more embodiments.



FIG. 6 is a flow diagram that describes steps in a method for determining attributes of haptic feedback in accordance with one or more embodiments.



FIG. 7 illustrates an example system and computing device as described with reference to FIG. 1, which are configured to implement embodiments of techniques described herein.





DETAILED DESCRIPTION

Overview


Techniques for haptic feedback for a touch input device are described. Generally, haptic feedback is provided for different user interactions with a touch input device, such as interactions with applications, services, and so forth. According to various implementations, how haptic feedback is initiated depends on whether different functionalities directly support haptic feedback.


For instance, consider a first scenario where a user is providing a touch gesture to a haptic-enabled touch input device to provide input to a graphical user interface (GUI) of an application. Further, consider that the application directly supports haptic feedback. An application that directly supports haptic feedback, for instance, represents an application that includes logic to recognize different types of user input and to initiate specific haptic feedback based on the user input and application context. Generally, application context refers to various application-specific scenarios, such as GUI context, application state, and so forth. Accordingly, in this particular scenario the application directly supports haptic feedback, and thus recognizes the user input to the GUI and causes the touch input device to generate haptic feedback based on attributes of the user input.


Consider now a second scenario where a user is providing a touch gesture to the haptic-enabled touch input device to provide input to a GUI of a different application. Further, consider that the different application does not directly support haptic feedback. An application that does not directly support haptic feedback, for instance, represents an application that does not include direct logic to initiate specific haptic feedback based on user input and/or application context. Accordingly, techniques discussed herein enable haptic feedback to be provided on the touch input device even though the different application does not directly support haptic feedback. For example, haptic functionality of the touch input device (e.g., firmware, a device driver, and so forth) recognizes attributes of the touch gesture and generates predefined haptic feedback based on the attributes. Thus, haptic feedback can be provided even though a particular functionality does not directly support haptic feedback, such as a particular application, a particular operating system, and so forth.


Accordingly, techniques described herein enable haptic feedback to be provided across a variety of different systems and functionalities, and in scenarios where particular systems and/or functionalities do not directly support haptic feedback.


In the following discussion, an example environment is first described that is operable to employ techniques described herein. Next, a section entitled “Example Implementation Scenarios” describes some example implementation scenarios in accordance with one or more embodiments. Following this, a section entitled “Example Procedures” describes some example procedures in accordance with one or more embodiments. Finally, a section entitled “Example System and Device” describes an example system and device that are operable to employ techniques discussed herein in accordance with one or more embodiments.


Having presented an overview of example implementations in accordance with one or more embodiments, consider now an example environment in which example implementations may be employed.


Example Environment



FIG. 1 is an illustration of an environment 100 in an example implementation that is operable to employ techniques for haptic feedback for a touch input device described herein. The environment 100 includes a client device 102, which may be configured in a variety of ways, such as a traditional computer (e.g., a desktop personal computer, laptop computer, and so on), a mobile device, an entertainment appliance, a smartphone, a wearable device, a netbook, a game console, a handheld device (e.g., a tablet), and so forth.


The client device 102 includes a variety of different functionalities that enable various activities and tasks to be performed. For instance, the client device 102 includes an operating system 104, applications 106, input/output (“I/O”) devices 108, and a haptic module 110. Generally, the operating system 104 is representative of functionality for abstracting various system components of the client device 102, such as hardware, kernel-level modules and services, and so forth. The operating system 104, for instance, can abstract various components of the client device 102 to the applications 106 to enable interaction between the components and the applications 106.


The applications 106 represent functionalities for performing different tasks via the client device 102. Examples of the applications 106 include a word processing application, a spreadsheet application, a web browser, a gaming application, a communication application, and so forth. The applications 106 may be installed locally on the client device 102 to be executed via a local runtime environment, and/or may represent portals to remote functionality, such as cloud-based services, web apps, and so forth. Thus, the applications 106 may take a variety of forms, such as locally-executed code, portals to remotely hosted services, and so forth.


The I/O devices 108 are representative of different functionalities for receiving input to the client device 102 and/or for providing output from the client device 102. Particular instances of the I/O devices 108, for example, represent a dedicated input device, a dedicated output device, or a device that both receives input and provides output. The I/O devices 108 include haptic input/output (“I/O”) devices 112, which are representative of devices that are configured to provide haptic output. For instance, the haptic I/O devices 112 include a touchscreen 114 and a trackpad 116, which are configured to provide haptic feedback which is tactilely-perceptible. The touchscreen 114, for example, is not only configured to provide visual output, but can also receive touch input and provide haptic output. Further, the trackpad 116 can not only receive touch input for the client device 102, but can provide various types of haptic output. Generally, the haptic I/O devices 112 may utilize a variety of different haptic-generating mechanisms to generate haptic feedback, such as motors, magnets, linear resonant actuators (LRAs) (magnetic and piezo based), piezo-electric bars, and so forth.


The haptic module 110 is representative of functionality for enabling the client device 102 to provide various types of haptic output. For instance, the haptic module 110 represents hardware and logic for enabling the haptic I/O devices to output various types of haptic feedback. The haptic module 110, for example, includes a haptic application programming interface (API) 118, a haptic driver 120, and gesture mapping 122. Generally, the haptic API 118 and the haptic driver 120 are representative of functionalities to enable various other functionalities to invoke the haptic I/O devices. For instance, the operating system 104 and the applications 106 may call the haptic API 118 to request that a particular haptic I/O device 112 generate haptic feedback. The haptic API 118 then interfaces with the haptic driver 120, which in turn interfaces with the haptic I/O devices 112 to cause the haptic I/O devices 112 to generate haptic feedback. Example interactions between the various entities included in the environment 100 are described below.
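

As a non-limiting illustration only, the following Python sketch shows one way the layering described above (a calling functionality, the haptic API 118, the haptic driver 120, and a haptic I/O device 112) might fit together. The class and method names (HapticAPI, HapticDriver, HapticDevice, play_feedback) are hypothetical and are not part of the described implementation.

# Minimal sketch of the API -> driver -> device layering described above.
# All names here (HapticDevice, HapticDriver, HapticAPI) are hypothetical.

class HapticDevice:
    """Stands in for a haptic I/O device such as a trackpad or touchscreen."""
    def __init__(self, name):
        self.name = name

    def play_feedback(self, frequency_hz, amplitude, duration_ms):
        # A real device would drive an actuator (LRA, piezo bar, etc.).
        print(f"{self.name}: {frequency_hz} Hz, amp {amplitude}, {duration_ms} ms")


class HapticDriver:
    """Stands in for the haptic driver that commands the device."""
    def __init__(self, device):
        self.device = device

    def handle_event(self, event):
        self.device.play_feedback(event["frequency_hz"],
                                  event["amplitude"],
                                  event["duration_ms"])


class HapticAPI:
    """Entry point that applications or the operating system would call."""
    def __init__(self, driver):
        self.driver = driver

    def request_feedback(self, event):
        # Forward the caller's haptic event to the driver.
        self.driver.handle_event(event)


# Usage: an application requests feedback on the trackpad.
api = HapticAPI(HapticDriver(HapticDevice("trackpad")))
api.request_feedback({"frequency_hz": 170, "amplitude": 0.6, "duration_ms": 20})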


The gesture mapping 122 represents mappings of different gestures to different respective types of haptic feedback. For instance, different gesture attributes can cause different respective types of haptic feedback to be generated. As further detailed below, in an event that a functionality external to the haptic I/O devices 112 (e.g., an application 106, the operating system 104, and so forth) does not directly support haptic feedback, the haptic module 110 can detect attributes of a gesture applied to a haptic I/O device 112 and cause a particular type of haptic feedback to be output by the haptic I/O device 112 based on the attributes.
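

The following sketch illustrates, purely by way of example, what a gesture mapping such as the gesture mapping 122 might look like: a simple table keyed by gesture type, with illustrative (invented) feedback parameters.

# Hypothetical sketch of a gesture mapping: gesture types keyed to
# predefined haptic feedback parameters. The specific values are illustrative.

GESTURE_MAPPING = {
    "tap":   {"frequency_hz": 200, "amplitude": 0.8, "duration_ms": 10},
    "swipe": {"frequency_hz": 150, "amplitude": 0.5, "duration_ms": 25},
    "drag":  {"frequency_hz": 120, "amplitude": 0.4, "duration_ms": 40},
}

def feedback_for_gesture(gesture_type, default=None):
    """Return the predefined feedback for a gesture type, if one is mapped."""
    return GESTURE_MAPPING.get(gesture_type, default)

print(feedback_for_gesture("drag"))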


In at least some implementations, the haptic module 110 can be implemented as part of the haptic I/O devices 112, such as in firmware of the haptic I/O devices 112. Alternatively or additionally, the haptic module 110 may be implemented as part of system resources of the client device 102, such as part of the operating system 104.


The client device 102 further includes haptic data 124, which represents information about whether different functionalities directly support haptic feedback. For instance, the haptic data 124 includes identifiers for individual applications of the applications 106, and indicates whether each of the individual applications supports haptic feedback. The haptic data 124 may also indicate whether other functionalities directly support haptic feedback, such as the operating system 104, other services that reside on the client device 102, and so forth. Generally, the haptic data 124 may be implemented as part of the haptic module 110, as part of the operating system 104, and/or as a standalone set of haptic data that is accessible by different functionalities of the client device 102.


Further illustrated as part of the environment 100 is a haptic-enabled pen 126, which is representative of an instance of the haptic I/O devices 112. Generally, the haptic-enabled pen 126 represents a handheld input apparatus that includes various internal components that can generate haptic feedback in various scenarios. For instance, the haptic-enabled pen 126 can provide input to the touchscreen 114, and based on various events can generate haptic feedback. The various implementations and scenarios discussed below, for example, may apply to haptic feedback generated by various haptic-enabled devices, such as the trackpad 116, the touchscreen 114, and the haptic-enabled pen 126.


Having described an example environment in which the techniques described herein may operate, consider now a discussion of an example implementation scenario for haptic feedback for a touch input device in accordance with one or more embodiments.


Example Implementation Scenarios


The following section describes some example implementation scenarios for haptic feedback for a touch input device in accordance with one or more implementations. The implementation scenarios may be implemented in the environment 100 discussed above, and/or any other suitable environment.



FIG. 2 depicts an example implementation scenario 200 for an application that directly supports haptic feedback in accordance with one or more implementations. The scenario 200 includes various entities and components introduced above with reference to the environment 100.


In the scenario 200, an application 106a is active and a graphical user interface (GUI) 202 for the application 106a is displayed on the touchscreen 114 of the client device 102. Further, the application 106a is configured to initiate haptic feedback based on various application-related events. For instance, the application 106a includes logic for interacting with the haptic module 110, such as via the haptic API 118. Alternatively or additionally, the application 106a is configured to initiate haptic feedback via interaction with the operating system 104. For instance, the operating system 104 may serve as an intermediary between the application 106a and the haptic module 110.


Continuing with the scenario 200, a user provides input to the trackpad 116 to interact with the GUI 202. For instance, the user's finger 204 moves across the surface of the trackpad 116 to move a cursor 206 within the GUI 202. In this particular example, the user moves the cursor 206 within proximity to an action region 208. Generally, an action region refers to a region of the GUI 202 associated with a particular available action. For instance, the action region 208 is configured to receive user input specifying a particular location for retrieving and displaying weather-related information.


In response to detecting the cursor 206 in proximity to (e.g., touching and/or overlapping) the action region 208, the application 106a fires a haptic event 210 to the haptic module 110. For instance, the haptic event 210 is communicated directly from the application 106a to the haptic module 110 via the haptic API 118. Alternatively, the application 106a communicates the haptic event 210 to the operating system 104, and the operating system 104 forwards the haptic event 210 to the haptic module 110. Generally, the haptic event 210 represents an “external” haptic event since the haptic event 210 is generated by an external functionality that is external to the haptic module 110 and the haptic I/O devices 112.


According to various implementations, the haptic event 210 specifies a particular type of haptic feedback to be generated by the trackpad 116. For instance, different action regions of the GUI 202 can be linked to different types of haptic feedback. Accordingly, in response to receiving the haptic event 210, the haptic module 110 causes the trackpad 116 to generate haptic feedback 212. For instance, the haptic module 110 instructs the haptic driver 120 to cause the trackpad 116 to generate the haptic feedback 212. The haptic feedback 212, for example, is produced by a haptic mechanism of the trackpad 116, and is tactilely perceptible on the surface of the trackpad 116, such as by the user's finger 204.
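

As a hedged illustration of this external path, the sketch below shows an application-side check that fires a haptic event when the cursor enters an action region. The geometry, event fields, and function names are hypothetical.

# Sketch of the "external" path in scenario 200: an application that directly
# supports haptic feedback fires a haptic event when the cursor nears an
# action region. Names and geometry here are illustrative only.

def cursor_in_region(cursor, region):
    """True if the cursor (x, y) falls inside the region's bounding box."""
    x, y = cursor
    left, top, right, bottom = region
    return left <= x <= right and top <= y <= bottom

def on_cursor_moved(cursor, action_region, fire_haptic_event):
    # The application's own logic decides when and what feedback to request.
    if cursor_in_region(cursor, action_region):
        fire_haptic_event({"source": "application",      # external haptic event
                           "feedback_type": "action_region_enter",
                           "frequency_hz": 180,
                           "amplitude": 0.7,
                           "duration_ms": 15})

# Usage: simulate the cursor entering the action region.
on_cursor_moved(cursor=(120, 80),
                action_region=(100, 60, 200, 100),
                fire_haptic_event=print)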


In at least some implementations, the haptic module 110 is configured to track which applications 106 directly support haptic feedback, and which applications 106 do not. For instance, a particular application 106 that directly supports haptic feedback represents an application 106 that is configured to generate haptic events to notify the haptic module 110 to generate haptic feedback. However, a different application 106 that does not directly support haptic feedback represents an application 106 that is not configured to generate haptic events. Thus, the scenario 200 represents an implementation where the application 106a directly supports haptic feedback and is thus configured to generate the haptic event 210 to cause the haptic feedback 212 to be generated.



FIG. 3 depicts an example implementation scenario 300 for an application that does not directly support haptic feedback in accordance with one or more implementations. The scenario 300 includes various entities and components introduced above with reference to the environment 100. In at least some implementations, the scenario 300 represents an extension and/or variation on the scenario 200, described above.


In the scenario 300, an application 106b is active and a graphical user interface (GUI) 302 for the application 106b is displayed on the touchscreen 114 of the client device 102. Further, the application 106b is not configured to initiate haptic feedback based on various application-related events. For instance, the application 106b does not include logic for interacting with the haptic module 110. The application 106b, for example, does not directly support generating haptic events.


Continuing with the scenario 300, a user provides input to the trackpad 116 to interact with the GUI 302. For instance, the user's finger 204 moves across the surface of the trackpad 116 to move the cursor 206 within the GUI 302. In this particular example, the user provides a gesture 304 to the trackpad 116 to move the cursor 206 and drag a scroll bar 306 downward. Since the application 106b does not directly support haptic feedback, the haptic module 110 detects the gesture 304 and fires a haptic event 308 to the haptic driver 120. The haptic module 110, for instance, fires the haptic event 308 without direct interaction with the application 106b. Alternatively or additionally, the haptic module 110 queries the operating system 104 for permission to generate haptic feedback (e.g., fire the haptic event 308) while the application 106b is active, e.g., has focus on the touchscreen 114.


Responsive to receiving the haptic event 308, the haptic driver 120 causes the trackpad 116 to generate haptic feedback 310. For instance, the operating system 104 detects that the application 106b has focus and that the application 106b does not directly support haptic feedback, such as based on an entry in the haptic data 124 that indicates that the application 106b does not directly support haptic feedback. Accordingly, the operating system 104 notifies the haptic module 110 (e.g., via the haptic API 118) that an application currently in focus does not directly support haptic feedback. Alternatively or additionally, the operating system 104 notifies the haptic module 110 that the application 106b has focus, and the haptic module 110 looks up the application 106b in the haptic data 124 to determine that the application 106b does not directly support haptic feedback.


Responsive to detecting the gesture 304 and ascertaining that the application 106b does not directly support haptic feedback, the haptic module 110 determines that the haptic feedback 310 is to be generated by the trackpad 116. In an example implementation, the haptic module 110 determines a gesture type for the gesture 304, and determines the haptic feedback 310 based on the gesture type. The haptic module 110, for instance, determines the gesture type based on attributes of the gesture 304. Examples of such gesture attributes include direction of movement of the gesture relative to the trackpad 116 (e.g., up, down, left, right, and so forth), distance of movement, velocity of movement, acceleration and/or deceleration, an amount of pressure applied while generating the gesture 304, and so forth. One or more of such gesture attributes can be considered in characterizing a gesture type for the gesture 304.


For example, different sets of gesture attributes can correspond to different respective gesture types. Further, different gesture types can be mapped to different respective types of haptic feedback, such as in the gesture mapping 122. For instance, a tap gesture can be mapped to one type of haptic feedback, a swipe gesture to another type of haptic feedback, a drag gesture to still another type of haptic feedback, and so on. In the particular example presented in scenario 300, the haptic module 110 ascertains that the gesture 304 is a downward dragging gesture on the trackpad 116, maps the gesture 304 to haptic feedback identified for the gesture 304 in the gesture mapping 122, and generates the haptic event 308 to identify the haptic feedback 310. Based on information included in the haptic event 308, the haptic driver 120 initiates the haptic feedback 310 on the trackpad 116.
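

The following sketch illustrates this internal path under invented thresholds: the haptic module classifies a gesture from its raw attributes, looks the gesture type up in a mapping, and builds an internal haptic event without any input from the application. All names and values are illustrative assumptions.

# Sketch of the "internal" path in scenario 300: the haptic module classifies
# the gesture from its raw attributes and looks the result up in a gesture
# mapping, without any input from the application. Thresholds are invented.

GESTURE_MAPPING = {
    "tap":  {"frequency_hz": 200, "amplitude": 0.8, "duration_ms": 10},
    "drag": {"frequency_hz": 120, "amplitude": 0.4, "duration_ms": 40},
}

def classify_gesture(distance_mm, velocity_mm_s, pressure):
    """Very rough classification from a few gesture attributes."""
    if distance_mm < 1.0 and pressure > 0.5:
        return "tap"
    if distance_mm >= 1.0 and velocity_mm_s < 50:
        return "drag"
    return "swipe"

def internal_haptic_event(distance_mm, velocity_mm_s, pressure):
    gesture_type = classify_gesture(distance_mm, velocity_mm_s, pressure)
    feedback = GESTURE_MAPPING.get(gesture_type)
    if feedback is None:
        return None                      # no predefined feedback for this gesture
    return {"source": "touch_input_device", **feedback}

# A slow downward drag maps to the "drag" feedback.
print(internal_haptic_event(distance_mm=12.0, velocity_mm_s=30.0, pressure=0.3))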


According to various implementations, the haptic module 110 causes the haptic feedback 310 to be generated by the trackpad 116 independent of a notification from the application 106b to generate haptic feedback, and independent of any information concerning an input context of the application 106b. For instance, the haptic module 110 causes the haptic feedback 310 to be generated based on attributes of the gesture 304 itself and without any input (e.g., context and/or instructions) from the application 106b. Thus, the haptic event 308 represents an “internal” haptic event since the haptic event 308 is generated internally to the haptic module 110 and/or the trackpad 116 and independent of direct interaction with the application 106b.


Accordingly, the scenarios described above illustrate that implementations for haptic feedback for a touch input device described herein can differentiate between functionalities that directly support haptic feedback and functionalities that do not directly support haptic feedback, and can enable haptic feedback to be generated in both cases. While these scenarios are discussed with reference to different applications, it is to be appreciated that implementations discussed herein can be employed with a wide variety of different functionalities, such as different applications, services, operating systems, and so forth. For instance, techniques described herein can be employed to generate haptic feedback on a device with an operating system that does not directly support haptic feedback.


Further, while the scenarios described above are discussed with reference to the trackpad 116, it is to be appreciated that the scenarios may be implemented with any haptic-enabled device, such as the touchscreen 114, the haptic-enabled pen 126, and so forth.


Having discussed some example implementation scenarios, consider now a discussion of some example procedures in accordance with one or more embodiments.


Example Procedures


The following discussion describes some example procedures for haptic feedback for a touch input device in accordance with one or more embodiments. The example procedures may be employed in the environment 100 of FIG. 1, the system 700 of FIG. 7, and/or any other suitable environment. The procedures, for instance, represent example procedures for implementing the implementation scenarios described above. In at least some implementations, the steps described for the various procedures are implemented automatically and independent of user interaction.



FIG. 4 is a flow diagram that describes steps in a method in accordance with one or more implementations. The method describes an example procedure for causing output of haptic feedback in accordance with one or more implementations. In at least some implementations, the method may be performed at least in part by the haptic module 110 and/or by the operating system 104.


Step 400 receives an indication of input to a touch surface of a touch input device. The haptic module 110, for instance, detects that user input is provided to a touch surface of a haptic I/O device 112, such as one of the trackpad 116 or the touchscreen 114.


Step 402 ascertains whether haptic feedback for the input is to be initiated by an external haptic event or an internal haptic event. Generally, an external haptic event represents a haptic event received by the haptic module 110 from an external functionality that is external to the touch input device, such as an application 106 that directly supports haptic feedback, the operating system 104, and so forth. One example implementation of an external haptic event is the haptic event 210 discussed above. An internal haptic event represents a haptic event generated by the touch input device in response to the input. One example implementation of an internal haptic event is the haptic event 308 discussed above. An example way of determining whether haptic feedback is to be generated based on an external haptic event or an internal haptic event is discussed below.


In an event that the haptic feedback for the input is to be initiated by an external haptic event (“External”), step 404 receives the external haptic event and causes the touch input device to output haptic feedback based on the external haptic event. For example, the haptic module 110 receives a haptic event from an application 106 and/or the operating system 104. Generally, the haptic event includes information describing attributes of the haptic feedback to be output by the touch input device. Examples of attributes of haptic feedback include vibration frequency, vibration amplitude, feedback duration, haptic pulse information, variations in frequency and/or amplitude, and so forth.


In an event that the haptic feedback for the input is to be initiated by an internal haptic event (“Internal”), step 406 causes the touch input device to output haptic feedback based on the internal haptic event. For example, the haptic module 110 communicates the internal haptic event to the haptic driver 120 to cause the touch input device (e.g., one of the haptic I/O devices 112) to output haptic feedback. The internal haptic event, for instance, includes information describing attributes of the haptic feedback to be output by the touch input device, examples of which are described above. In at least some implementations, attributes of haptic feedback are determined based on attributes of a gesture applied to the touch input device to generate the input to the touch surface. An example way of determining attributes of haptic feedback is discussed below.
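

As a compact, non-authoritative summary of the branch described in steps 402-406, the following sketch routes feedback through either an externally supplied event or an internally generated one. The function names are hypothetical.

# Sketch of the decision in FIG. 4: feedback is driven by an external haptic
# event when the functionality in focus supports haptics directly, and by an
# internally generated event otherwise. All function names are hypothetical.

def handle_touch_input(functionality_supports_haptics,
                       external_event,
                       build_internal_event,
                       output_feedback):
    if functionality_supports_haptics:
        # External path: the application or OS supplies the event (step 404).
        output_feedback(external_event)
    else:
        # Internal path: the touch input device builds its own event (step 406).
        output_feedback(build_internal_event())

# Usage with stand-in callables.
handle_touch_input(
    functionality_supports_haptics=False,
    external_event={"frequency_hz": 180, "amplitude": 0.7, "duration_ms": 15},
    build_internal_event=lambda: {"frequency_hz": 120, "amplitude": 0.4,
                                  "duration_ms": 40},
    output_feedback=print)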



FIG. 5 is a flow diagram that describes steps in a method in accordance with one or more implementations. The method describes an example procedure for determining whether haptic feedback is to be generated based on an external haptic event or an internal haptic event in accordance with one or more implementations. In at least some implementations, the method may be performed at least in part by the haptic module 110 and/or by the operating system 104.


Step 500 determines whether a functionality external to a touch input device directly supports haptic feedback. The haptic module 110, for instance, determines whether an application 106 that currently has focus on the client device 102 directly supports haptic feedback, and/or whether the operating system 104 directly supports haptic feedback. The applications 106 and the operating system 104, for example, represent functionalities that are external to the touch input device, i.e., external to the haptic I/O devices 112.


If the functionality external to the touch input device directly supports haptic feedback (“Yes”), step 502 determines that haptic feedback is to be generated in response to an external haptic event. An application 106 that currently has focus, for instance, notifies the haptic module 110 that the application directly supports haptic feedback. Alternatively or additionally, the operating system 104 notifies the haptic module 110 that an application 106 that currently has focus directly supports haptic feedback, and/or that the operating system 104 itself directly supports haptic feedback. In at least some implementations, an external functionality interacts with the haptic module 110 via calls to the haptic API 118.


In yet another implementation, the haptic module 110 determines from the haptic data 124 whether a particular application 106 and/or the operating system 104 directly support haptic feedback. For instance, the haptic module 110 can determine whether an external functionality directly supports haptic feedback by ascertaining whether the haptic data 124 indicates that the external functionality directly supports/doesn't directly support haptic feedback.


If the functionality external to the touch input device does not directly support haptic feedback (“No”), step 504 determines that haptic feedback is to be generated in response to an internal haptic event. For instance, the operating system 104 notifies the haptic module 110 that an application 106 that currently has focus does not directly support haptic feedback. Alternatively or additionally, and as discussed above, the haptic module 110 can determine whether an external functionality directly supports haptic feedback by ascertaining whether the haptic data 124 indicates that the external functionality directly supports/doesn't directly support haptic feedback.
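

The determination of steps 500-504 can be pictured with the following sketch, which assumes (hypothetically) that support for haptic feedback is recorded in haptic data keyed by a functionality identifier; the table contents are invented.

# Sketch of the determination in FIG. 5 using a haptic-data lookup: the record
# of which functionalities directly support haptic feedback decides whether an
# external or internal haptic event is used. The table contents are invented.

HAPTIC_DATA = {
    "weather_app": True,     # directly supports haptic feedback
    "legacy_app": False,     # does not directly support haptic feedback
}

def feedback_source(functionality_id):
    """Return 'external' or 'internal' for the functionality in focus."""
    supports_haptics = HAPTIC_DATA.get(functionality_id, False)
    return "external" if supports_haptics else "internal"

print(feedback_source("weather_app"))   # external
print(feedback_source("legacy_app"))    # internal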



FIG. 6 is a flow diagram that describes steps in a method in accordance with one or more implementations. The method describes an example procedure for determining attributes of haptic feedback in accordance with one or more implementations. In at least some implementations, the method may be performed at least in part by the haptic module 110 and/or by the operating system 104. The method, for instance, represents an implementation of step 406 discussed above with reference to FIG. 4.


Step 600 ascertains attributes of a gesture used to provide input to a touch input device. Examples of gesture attributes include direction relative to a surface to which the gesture is applied (e.g., up, down, left, right, and so forth), distance of movement, velocity of movement, acceleration and/or deceleration, an amount of pressure applied while generating the gesture, and so forth.


Step 602 maps the attributes of the gesture to haptic feedback. For instance, different gesture attributes can be mapped to different types of haptic feedback. In at least some implementations, the haptic module 110 maps the attributes of the gesture to a particular type of haptic feedback specified for the attributes in the gesture mapping 122.


Step 604 causes output of the haptic feedback. The haptic module 110, for instance, instructs the haptic driver 120 to output the haptic feedback.
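

As an illustrative sketch of this procedure, the following example derives feedback attributes from a few gesture attributes; the specific scaling rules are invented and serve only to show the mapping step.

# Sketch of FIG. 6: gesture attributes are mapped to haptic feedback
# attributes before output. The scaling rules below are purely illustrative.

def feedback_from_gesture(velocity_mm_s, pressure, distance_mm):
    """Derive example feedback attributes from a few gesture attributes."""
    return {
        # Faster gestures get a higher vibration frequency (capped).
        "frequency_hz": min(100 + velocity_mm_s, 250),
        # Harder presses get a stronger vibration.
        "amplitude": min(0.2 + pressure, 1.0),
        # Longer gestures get slightly longer feedback.
        "duration_ms": 10 + min(distance_mm, 30),
    }

print(feedback_from_gesture(velocity_mm_s=60.0, pressure=0.5, distance_mm=20.0))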


Accordingly, techniques discussed herein enable haptic feedback to be provided in a wide variety of scenarios and across a wide variety of different device configurations.


Having discussed some example procedures, consider now a discussion of an example system and device in accordance with one or more embodiments.


Example System and Device



FIG. 7 illustrates an example system generally at 700 that includes an example computing device 702 that is representative of one or more computing systems and/or devices that may implement various techniques described herein. For example, the client device 102 discussed above with reference to FIG. 1 can be embodied as the computing device 702. The computing device 702 may be, for example, a server of a service provider, a device associated with the client (e.g., a client device), an on-chip system, and/or any other suitable computing device or computing system.


The example computing device 702 as illustrated includes a processing system 704, one or more computer-readable media 706, and one or more Input/Output (I/O) Interfaces 708 that are communicatively coupled, one to another. Although not shown, the computing device 702 may further include a system bus or other data and command transfer system that couples the various components, one to another. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures. A variety of other examples are also contemplated, such as control and data lines.


The processing system 704 is representative of functionality to perform one or more operations using hardware. Accordingly, the processing system 704 is illustrated as including hardware element 710 that may be configured as processors, functional blocks, and so forth. This may include implementation in hardware as an application specific integrated circuit or other logic device formed using one or more semiconductors. The hardware elements 710 are not limited by the materials from which they are formed or the processing mechanisms employed therein. For example, processors may be comprised of semiconductor(s) and/or transistors (e.g., electronic integrated circuits (ICs)). In such a context, processor-executable instructions may be electronically-executable instructions.


The computer-readable media 706 is illustrated as including memory/storage 712. The memory/storage 712 represents memory/storage capacity associated with one or more computer-readable media. The memory/storage 712 may include volatile media (such as random access memory (RAM)) and/or nonvolatile media (such as read only memory (ROM), Flash memory, optical disks, magnetic disks, and so forth). The memory/storage 712 may include fixed media (e.g., RAM, ROM, a fixed hard drive, and so on) as well as removable media (e.g., Flash memory, a removable hard drive, an optical disc, and so forth). The computer-readable media 706 may be configured in a variety of other ways as further described below.


Input/output interface(s) 708 are representative of functionality to allow a user to enter commands and information to computing device 702, and also allow information to be presented to the user and/or other components or devices using various input/output devices. Examples of input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone (e.g., for voice recognition and/or spoken input), a scanner, touch functionality (e.g., capacitive or other sensors that are configured to detect physical touch), a camera (e.g., which may employ visible or non-visible wavelengths such as infrared frequencies to detect movement that does not involve touch as gestures), and so forth. Examples of output devices include a display device (e.g., a monitor or projector), speakers, a printer, a network card, tactile-response device, and so forth. Thus, the computing device 702 may be configured in a variety of ways as further described below to support user interaction.


Various techniques may be described herein in the general context of software, hardware elements, or program modules. Generally, such modules include routines, programs, objects, elements, components, data structures, and so forth that perform particular tasks or implement particular abstract data types. The terms “module,” “functionality,” “entity,” and “component” as used herein generally represent software, firmware, hardware, or a combination thereof. The features of the techniques described herein are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.


An implementation of the described modules and techniques may be stored on or transmitted across some form of computer-readable media. The computer-readable media may include a variety of media that may be accessed by the computing device 702. By way of example, and not limitation, computer-readable media may include “computer-readable storage media” and “computer-readable signal media.”


“Computer-readable storage media” may refer to media and/or devices that enable persistent storage of information in contrast to mere signal transmission, carrier waves, or signals per se. Computer-readable storage media do not include signals per se. The computer-readable storage media includes hardware such as volatile and non-volatile, removable and non-removable media and/or storage devices implemented in a method or technology suitable for storage of information such as computer readable instructions, data structures, program modules, logic elements/circuits, or other data. Examples of computer-readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, hard disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other storage device, tangible media, or article of manufacture suitable to store the desired information and which may be accessed by a computer.


“Computer-readable signal media” may refer to a signal-bearing medium that is configured to transmit instructions to the hardware of the computing device 702, such as via a network. Signal media typically may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as carrier waves, data signals, or other transport mechanism. Signal media also include any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media.


As previously described, hardware elements 710 and computer-readable media 706 are representative of instructions, modules, programmable device logic and/or fixed device logic implemented in a hardware form that may be employed in some embodiments to implement at least some aspects of the techniques described herein. Hardware elements may include components of an integrated circuit or on-chip system, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a complex programmable logic device (CPLD), and other implementations in silicon or other hardware devices. In this context, a hardware element may operate as a processing device that performs program tasks defined by instructions, modules, and/or logic embodied by the hardware element as well as a hardware device utilized to store instructions for execution, e.g., the computer-readable storage media described previously.


Combinations of the foregoing may also be employed to implement various techniques and modules described herein. Accordingly, software, hardware, or program modules and other program modules may be implemented as one or more instructions and/or logic embodied on some form of computer-readable storage media and/or by one or more hardware elements 710. The computing device 702 may be configured to implement particular instructions and/or functions corresponding to the software and/or hardware modules. Accordingly, implementation of modules that are executable by the computing device 702 as software may be achieved at least partially in hardware, e.g., through use of computer-readable storage media and/or hardware elements 710 of the processing system. The instructions and/or functions may be executable/operable by one or more articles of manufacture (for example, one or more computing devices 702 and/or processing systems 704) to implement techniques, modules, and examples described herein.


As further illustrated in FIG. 7, the example system 700 enables ubiquitous environments for a seamless user experience when running applications on a personal computer (PC), a television device, and/or a mobile device. Services and applications run substantially similarly in all three environments for a common user experience when transitioning from one device to the next while utilizing an application, playing a video game, watching a video, and so on.


In the example system 700, multiple devices are interconnected through a central computing device. The central computing device may be local to the multiple devices or may be located remotely from the multiple devices. In one embodiment, the central computing device may be a cloud of one or more server computers that are connected to the multiple devices through a network, the Internet, or other data communication link.


In one embodiment, this interconnection architecture enables functionality to be delivered across multiple devices to provide a common and seamless experience to a user of the multiple devices. Each of the multiple devices may have different physical requirements and capabilities, and the central computing device uses a platform to enable the delivery of an experience to the device that is both tailored to the device and yet common to all devices. In one embodiment, a class of target devices is created and experiences are tailored to the generic class of devices. A class of devices may be defined by physical features, types of usage, or other common characteristics of the devices.


In various implementations, the computing device 702 may assume a variety of different configurations, such as for computer 714, mobile 716, and television 718 uses. Each of these configurations includes devices that may have generally different constructs and capabilities, and thus the computing device 702 may be configured according to one or more of the different device classes. For instance, the computing device 702 may be implemented as the computer 714 class of a device that includes a personal computer, desktop computer, a multi-screen computer, laptop computer, netbook, and so on.


The computing device 702 may also be implemented as the mobile 716 class of device that includes mobile devices, such as a mobile phone, portable music player, portable gaming device, a tablet computer, a wearable device, a multi-screen computer, and so on. The computing device 702 may also be implemented as the television 718 class of device that includes devices having or connected to generally larger screens in casual viewing environments. These devices include televisions, set-top boxes, gaming consoles, and so on.


The techniques described herein may be supported by these various configurations of the computing device 702 and are not limited to the specific examples of the techniques described herein. For example, functionalities discussed with reference to the haptic module 110 may be implemented all or in part through use of a distributed system, such as over a “cloud” 720 via a platform 722 as described below.


The cloud 720 includes and/or is representative of a platform 722 for resources 724. The platform 722 abstracts underlying functionality of hardware (e.g., servers) and software resources of the cloud 720. The resources 724 may include applications and/or data that can be utilized while computer processing is executed on servers that are remote from the computing device 702. Resources 724 can also include services provided over the Internet and/or through a subscriber network, such as a cellular or Wi-Fi network.


The platform 722 may abstract resources and functions to connect the computing device 702 with other computing devices. The platform 722 may also serve to abstract scaling of resources to provide a corresponding level of scale to encountered demand for the resources 724 that are implemented via the platform 722. Accordingly, in an interconnected device embodiment, implementation of functionality described herein may be distributed throughout the system 700. For example, the functionality may be implemented in part on the computing device 702 as well as via the platform 722 that abstracts the functionality of the cloud 720.


Discussed herein are a number of methods that may be implemented to perform techniques discussed herein. Aspects of the methods may be implemented in hardware, firmware, or software, or a combination thereof. The methods are shown as a set of steps that specify operations performed by one or more devices and are not necessarily limited to the orders shown for performing the operations by the respective blocks. Further, an operation shown with respect to a particular method may be combined and/or interchanged with an operation of a different method in accordance with one or more implementations. Aspects of the methods can be implemented via interaction between various entities discussed above with reference to the environment 100.


Implementations discussed herein include:


Example 1

A system for causing haptic feedback, the system including: a haptic-enabled touch input device; at least one processor; and one or more computer-readable storage media including instructions stored thereon that, responsive to execution by the at least one processor, cause the system to perform operations including: receiving an indication of input via the touch input device; ascertaining whether haptic feedback for the input is to be initiated by an external haptic event received from an external functionality that is external to the touch input device, or whether haptic feedback is to be initiated by an internal haptic event generated by the touch input device in response to the input; and causing the touch input device to output haptic feedback based on one of the external haptic event or the internal haptic event.


Example 2

A system as described in example 1, wherein the touch input device includes one or more of a haptic-enabled trackpad, a haptic-enabled touchscreen, or a haptic-enabled pen.


Example 3

A system as described in one or more of examples 1 or 2, wherein the one or more computer-readable storage media includes firmware of the touch input device.


Example 4

A system as described in one or more of examples 1-3, wherein the operations further include determining that the external functionality directly supports haptic feedback, wherein said ascertaining includes ascertaining that the haptic feedback for the input is to be initiated by an external haptic event, and said causing includes causing the touch input device to output haptic feedback based on the external haptic event.


Example 5

A system as described in one or more of examples 1-4, wherein the external functionality includes an application that currently has focus, the operations further include determining that the application directly supports haptic feedback, and wherein said ascertaining includes ascertaining that the haptic feedback for the input is to be initiated by an external haptic event received from the application, and said causing includes causing the touch input device to output haptic feedback based on the external haptic event.


Example 6

A system as described in one or more of examples 1-5, wherein the external functionality includes an operating system, the operations further include determining that the operating system directly supports haptic feedback, and wherein said ascertaining includes ascertaining that the haptic feedback for the input is to be initiated by an external haptic event received from the operating system, and said causing includes causing the touch input device to output haptic feedback based on the external haptic event.


Example 7

A system as described in one or more of examples 1-6, wherein the operations further include determining that the external functionality does not directly support haptic feedback, wherein said ascertaining includes ascertaining that the haptic feedback for the input is to be initiated by an internal haptic event, and said causing includes causing the touch input device to output haptic feedback based on the internal haptic event.


Example 8

A system as described in one or more of examples 1-7, wherein the external functionality includes an application that currently has focus, the operations further include determining that the application does not directly support haptic feedback, and wherein said ascertaining includes ascertaining that the haptic feedback for the input is to be initiated by an internal haptic event, and said causing includes causing the touch input device to output haptic feedback based on the internal haptic event.


Example 9

A system as described in one or more of examples 1-8, wherein the external functionality includes an operating system, the operations further include determining that the operating system does not directly support haptic feedback, and wherein said ascertaining includes ascertaining that the haptic feedback for the input is to be initiated by an internal haptic event, and said causing includes causing the touch input device to output haptic feedback based on the internal haptic event.


Example 10

A system as described in one or more of examples 1-9, wherein said ascertaining includes ascertaining that the haptic feedback for the input is to be initiated by an internal haptic event, and said causing includes: ascertaining one or more attributes of a gesture used to provide the input to the touch input device; and causing output of the haptic feedback based on the one or more attributes.


Example 11

A computer-implemented method for causing output of haptic feedback, the method including: receiving an indication of input to a touch surface of a touch input device; ascertaining whether haptic feedback for the input is to be initiated by an external haptic event received from an external functionality that is external to the touch input device, or whether haptic feedback is to be initiated by an internal haptic event generated by the touch input device in response to the input; and causing the touch input device to output haptic feedback based on one of the external haptic event or the internal haptic event.


Example 12

A method as described in example 11, further including determining based on haptic data whether the external functionality directly supports haptic feedback, and wherein said ascertaining includes one of: in an event that the haptic data indicates that the external functionality directly supports haptic feedback, ascertaining that the haptic feedback for the input is to be initiated by the external haptic event received from an external functionality; or in an event that the haptic data indicates that the external functionality does not directly support haptic feedback, ascertaining that the haptic feedback is to be initiated by the internal haptic event generated by the touch input device in response to the input.


Example 13

A method as described in one or more of examples 11 or 12, further including receiving a notification that the external functionality directly supports haptic feedback, wherein said ascertaining includes ascertaining that the haptic feedback for the input is to be initiated by an external haptic event, and said causing includes causing the touch input device to output haptic feedback based on the external haptic event.


Example 14

A method as described in one or more of examples 11-13, wherein said ascertaining includes ascertaining that the haptic feedback for the input is to be initiated by an internal haptic event, and said causing includes: ascertaining one or more attributes of a gesture used to provide the input to the touch input device; mapping the attributes of the gesture to haptic feedback; and causing output of the haptic feedback.


Example 15

A computer-implemented method for causing output of haptic feedback, the method including: receiving an indication of input to a touch input device; determining that an external functionality external to the touch input device does not directly support haptic feedback; ascertaining, responsive to said determining, that haptic feedback for the input is to be initiated by an internal haptic event generated by the touch input device in response to the input; ascertaining one or more attributes of a gesture that caused the input; and causing the touch input device to output haptic feedback in response to the internal haptic event and based on the one or more attributes of the gesture.


Example 16

A method as described in example 15, wherein said determining includes receiving a notification that the external functionality does not directly support haptic feedback.


Example 17

A method as described in one or more of examples 15 or 16, wherein the external functionality includes an application that does not directly support haptic feedback, and wherein the input includes input to a graphical user interface of the application.


Example 18

A method as described in one or more of examples 15-17, wherein the external functionality includes an operating system that does not directly support haptic feedback.


Example 19

A method as described in one or more of examples 15-18, wherein said ascertaining one or more attributes of the gesture includes ascertaining one or more of a direction of movement of the gesture relative to the touch input device, distance of movement of the gesture, velocity of movement of the gesture, acceleration of the gesture, deceleration of the gesture, or an amount of pressure applied to the touch input device while generating the gesture.


Example 20

A method as described in one or more of examples 15-19, wherein the external functionality includes an application, and wherein said causing is performed independent of information concerning an input context of the application.


CONCLUSION

Techniques for haptic feedback for a touch input device are described. Although embodiments are described in language specific to structural features and/or methodological acts, it is to be understood that the embodiments defined in the appended claims are not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claimed embodiments.

Claims
  • 1. A system comprising: a haptic-enabled touch input device; at least one processor; and one or more computer-readable storage media including instructions stored thereon that, responsive to execution by the at least one processor, cause the system to perform operations including: receiving an indication of input via the touch input device indicating a user interaction with a functionality, the functionality representing one of an application or an operating system; ascertaining whether haptic feedback for the input is to be initiated by an external haptic event received from the functionality, or whether haptic feedback is to be initiated by an internal haptic event generated by the touch input device in response to the input, said ascertaining based at least in part on a determination of whether the functionality supports haptic feedback; and causing the touch input device to output haptic feedback based on one of the external haptic event or the internal haptic event, including at least one of: in an event that a determination is made that the functionality directly supports haptic feedback, ascertaining that the haptic feedback for the input is to be initiated by the external haptic event, and causing the touch input device to output haptic feedback based on the external haptic event; or in an event that a determination is made that the functionality does not directly support haptic feedback, ascertaining that the haptic feedback for the input is to be initiated by the internal haptic event, and causing the touch input device to output haptic feedback based on the internal haptic event.
  • 2. A system as recited in claim 1, wherein the touch input device comprises one or more of a haptic-enabled trackpad, a haptic-enabled touchscreen, or a haptic-enabled pen.
  • 3. A system as recited in claim 1, wherein the one or more computer-readable storage media comprises firmware of the touch input device.
  • 4. A system as recited in claim 1, wherein the operations further include determining that the functionality directly supports haptic feedback, wherein said ascertaining comprises ascertaining that the haptic feedback for the input is to be initiated by the external haptic event, and said causing comprises causing the touch input device to output haptic feedback based on the external haptic event.
  • 5. A system as recited in claim 1, wherein the functionality comprises the application, the application currently has focus, the operations further include determining that the application directly supports haptic feedback, and wherein said ascertaining comprises ascertaining that the haptic feedback for the input is to be initiated by an external haptic event received from the application, and said causing comprises causing the touch input device to output haptic feedback based on the external haptic event received from the application.
  • 6. A system as recited in claim 1, wherein the functionality comprises the operating system, the operations further include determining that the operating system directly supports haptic feedback, and wherein said ascertaining comprises ascertaining that the haptic feedback for the input is to be initiated by an external haptic event received from the operating system, and said causing comprises causing the touch input device to output haptic feedback based on the external haptic event received from the operating system.
  • 7. A system as recited in claim 1, wherein the operations further include determining that the functionality does not directly support haptic feedback, wherein said ascertaining comprises ascertaining that the haptic feedback for the input is to be initiated by the internal haptic event, and said causing comprises causing the touch input device to output haptic feedback based on the internal haptic event.
  • 8. A system as recited in claim 1, wherein the functionality comprises the application, the application currently has focus, the operations further include determining that the application does not directly support haptic feedback, and wherein said ascertaining comprises ascertaining that the haptic feedback for the input is to be initiated by the internal haptic event, and said causing comprises causing the touch input device to output haptic feedback based on the internal haptic event.
  • 9. A system as recited in claim 1, wherein the functionality comprises the operating system, the operations further include determining that the operating system does not directly support haptic feedback, and wherein said ascertaining comprises ascertaining that the haptic feedback for the input is to be initiated by the internal haptic event, and said causing comprises causing the touch input device to output haptic feedback based on the internal haptic event.
  • 10. A system as recited in claim 1, wherein said ascertaining comprises ascertaining that the haptic feedback for the input is to be initiated by the internal haptic event, and said causing comprises: ascertaining one or more attributes of a gesture used to provide the input to the touch input device; and causing output of the haptic feedback based on the one or more attributes.
  • 11. A computer-implemented method, comprising: receiving an indication of input to a touch surface of a touch input device; ascertaining whether haptic feedback for the input is to be initiated by an external haptic event received from a functionality representing one of an application or an operating system, or whether haptic feedback is to be initiated by an internal haptic event generated by the touch input device in response to the input, said ascertaining based at least in part on a determination of whether the functionality supports haptic feedback; and causing the touch input device to output haptic feedback based on one of the external haptic event or the internal haptic event, including at least one of: in an event that a determination is made that the functionality directly supports haptic feedback, ascertaining that the haptic feedback for the input is to be initiated by the external haptic event received from the functionality; or in an event that a determination is made that the functionality does not directly support haptic feedback, ascertaining that the haptic feedback is to be initiated by the internal haptic event generated by the touch input device in response to the input.
  • 12. A method as described in claim 11, further comprising determining based on haptic data whether the functionality directly supports haptic feedback, and wherein said ascertaining comprises one of: in an event that the haptic data indicates that the functionality directly supports haptic feedback, ascertaining that the haptic feedback for the input is to be initiated by the external haptic event received from the functionality; or in an event that the haptic data indicates that the functionality does not directly support haptic feedback, ascertaining that the haptic feedback is to be initiated by the internal haptic event generated by the touch input device in response to the input.
  • 13. A method as described in claim 11, further comprising receiving a notification that the functionality directly supports haptic feedback, wherein said ascertaining comprises ascertaining that the haptic feedback for the input is to be initiated by the external haptic event, and said causing comprises causing the touch input device to output haptic feedback based on the external haptic event.
  • 14. A method as described in claim 11, wherein said ascertaining comprises ascertaining that the haptic feedback for the input is to be initiated by the internal haptic event, and said causing comprises: ascertaining one or more attributes of a gesture used to provide the input to the touch input device; mapping the attributes of the gesture to haptic feedback; and causing output of the haptic feedback.
  • 15. A computer-implemented method, comprising: receiving an indication of input to a touch input device; determining that a functionality external to the touch input device does not directly support haptic feedback, the functionality representing one of an application or an operating system; ascertaining, responsive to said determining, that haptic feedback for the input is to be initiated by an internal haptic event generated by the touch input device in response to the input; ascertaining one or more attributes of a gesture that caused the input; and causing the touch input device to output haptic feedback in response to the internal haptic event and based on the one or more attributes of the gesture.
  • 16. A method as described in claim 15, wherein said determining comprises receiving a notification that the functionality does not directly support haptic feedback.
  • 17. A method as described in claim 15, wherein the functionality comprises the application, the application does not directly support haptic feedback, and wherein the input comprises input to a graphical user interface of the application.
  • 18. A method as described in claim 15, wherein the functionality comprises the operating system, and the operating system does not directly support haptic feedback.
  • 19. A method as described in claim 15, wherein said ascertaining one or more attributes of the gesture comprises ascertaining one or more of a direction of movement of the gesture relative to the touch input device, distance of movement of the gesture, velocity of movement of the gesture, acceleration of the gesture, deceleration of the gesture, or an amount of pressure applied to the touch input device while generating the gesture.
  • 20. A method as described in claim 15, wherein the functionality comprises the application, and wherein said causing is performed independent of information concerning an input context of the application.
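
By way of illustration only, the routing recited in claims 1, 11, and 12 (an external haptic event drives the feedback when the functionality directly supports haptic feedback; an internally generated haptic event is used otherwise) might be sketched as follows. The haptic-data registry and all identifiers below are hypothetical and are not part of the claimed subject matter.

    HAPTIC_AWARE = {"drawing_app", "shell"}   # hypothetical haptic data naming haptic-aware functionalities

    def route_haptic_event(functionality: str, external_event, make_internal_event):
        """Return the haptic event that should drive feedback for input directed at the functionality."""
        if functionality in HAPTIC_AWARE and external_event is not None:
            # The functionality directly supports haptic feedback: honor its external event.
            return external_event
        # Otherwise the touch input device initiates feedback via an internal haptic event.
        return make_internal_event()

    event = route_haptic_event("legacy_app", None, lambda: {"intensity": 0.5, "duration_ms": 12})
    print(event)   # the internally generated event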
US Referenced Citations (655)
Number Name Date Kind
578325 Fleming Mar 1897 A
4046975 Seeger, Jr. Sep 1977 A
4065649 Carter et al. Dec 1977 A
4243861 Strandwitz Jan 1981 A
4279021 See et al. Jul 1981 A
4302648 Sado et al. Nov 1981 A
4317013 Larson Feb 1982 A
4326193 Markley et al. Apr 1982 A
4365130 Christensen Dec 1982 A
4492829 Rodrique Jan 1985 A
4527021 Morikawa et al. Jul 1985 A
4559426 Van Zeeland et al. Dec 1985 A
4577822 Wilkerson Mar 1986 A
4588187 Dell May 1986 A
4607147 Ono et al. Aug 1986 A
4651133 Ganesan et al. Mar 1987 A
4735394 Facco Apr 1988 A
4890832 Komaki Jan 1990 A
5149923 Demeo Sep 1992 A
5220521 Kikinis Jun 1993 A
5283559 Kalendra et al. Feb 1994 A
5331443 Stanisci Jul 1994 A
5480118 Cross Jan 1996 A
5489900 Cali et al. Feb 1996 A
5510783 Findlater et al. Apr 1996 A
5546271 Gut et al. Aug 1996 A
5548477 Kumar et al. Aug 1996 A
5558577 Kato Sep 1996 A
5576981 Parker et al. Nov 1996 A
5612719 Beernink et al. Mar 1997 A
5618232 Martin Apr 1997 A
5681220 Bertram et al. Oct 1997 A
5745376 Barker et al. Apr 1998 A
5748114 Koehn May 1998 A
5781406 Hunte Jul 1998 A
5807175 Davis et al. Sep 1998 A
5818361 Acevedo Oct 1998 A
5828770 Leis et al. Oct 1998 A
5842027 Oprescu et al. Nov 1998 A
5859642 Jones Jan 1999 A
5874697 Selker et al. Feb 1999 A
5909211 Combs et al. Jun 1999 A
5926170 Oba Jul 1999 A
5942733 Allen et al. Aug 1999 A
5971635 Wise Oct 1999 A
6002389 Kasser Dec 1999 A
6005209 Burleson et al. Dec 1999 A
6012714 Worley et al. Jan 2000 A
6040823 Seffernick et al. Mar 2000 A
6044717 Biegelsen et al. Apr 2000 A
6061644 Leis May 2000 A
6112797 Colson et al. Sep 2000 A
6147859 Abboud Nov 2000 A
6177926 Kunert Jan 2001 B1
6178443 Lin Jan 2001 B1
6239786 Burry et al. May 2001 B1
6254105 Rinde et al. Jul 2001 B1
6279060 Luke et al. Aug 2001 B1
6329617 Burgess Dec 2001 B1
6344791 Armstrong Feb 2002 B1
6380497 Hashimoto et al. Apr 2002 B1
6429846 Rosenberg et al. Aug 2002 B2
6437682 Vance Aug 2002 B1
6506983 Babb et al. Jan 2003 B1
6511378 Bhatt et al. Jan 2003 B1
6532147 Christ, Jr. Mar 2003 B1
6543949 Ritchey et al. Apr 2003 B1
6565439 Shinohara et al. May 2003 B2
6597347 Yasutake Jul 2003 B1
6600121 Olodort et al. Jul 2003 B1
6603408 Gaba Aug 2003 B1
6617536 Kawaguchi Sep 2003 B2
6651943 Cho et al. Nov 2003 B2
6685369 Lien Feb 2004 B2
6695273 Iguchi Feb 2004 B2
6704864 Philyaw Mar 2004 B1
6721019 Kono et al. Apr 2004 B2
6725318 Sherman et al. Apr 2004 B1
6738049 Kiser et al. May 2004 B2
6758615 Monney et al. Jul 2004 B2
6774888 Genduso Aug 2004 B1
6776546 Kraus et al. Aug 2004 B2
6781819 Yang et al. Aug 2004 B2
6784869 Clark et al. Aug 2004 B1
6813143 Makela Aarre Nov 2004 B2
6819316 Schulz et al. Nov 2004 B2
6822635 Shahoian Nov 2004 B2
6856506 Doherty et al. Feb 2005 B2
6861961 Sandbach et al. Mar 2005 B2
6864573 Robertson et al. Mar 2005 B2
6898315 Guha May 2005 B2
6914197 Doherty et al. Jul 2005 B2
6950950 Sawyers et al. Sep 2005 B2
6970957 Oshins et al. Nov 2005 B1
6976799 Kim et al. Dec 2005 B2
7051149 Wang et al. May 2006 B2
7083295 Hanna Aug 2006 B1
7091436 Serban Aug 2006 B2
7091955 Kramer Aug 2006 B2
7095404 Vincent et al. Aug 2006 B2
7106222 Ward et al. Sep 2006 B2
7116309 Kimura et al. Oct 2006 B1
7123292 Seeger et al. Oct 2006 B1
7194662 Do et al. Mar 2007 B2
7202837 Ihara Apr 2007 B2
7213991 Chapman et al. May 2007 B2
7224830 Nefian et al. May 2007 B2
7245292 Custy Jul 2007 B1
7277087 Hill et al. Oct 2007 B2
7301759 Hsiung Nov 2007 B2
7374312 Feng et al. May 2008 B2
7401992 Lin Jul 2008 B1
7423557 Kang Sep 2008 B2
7446276 Piesko Nov 2008 B2
7447934 Dasari et al. Nov 2008 B2
7469386 Bear et al. Dec 2008 B2
7486165 Ligtenberg et al. Feb 2009 B2
7499037 Lube Mar 2009 B2
7502803 Cutter et al. Mar 2009 B2
7542052 Solomon et al. Jun 2009 B2
7557312 Clark et al. Jul 2009 B2
7558594 Wilson Jul 2009 B2
7559834 York Jul 2009 B1
RE40891 Yasutake Sep 2009 E
7602384 Rosenberg et al. Oct 2009 B2
7620244 Collier Nov 2009 B1
7622907 Vranish Nov 2009 B2
7636921 Louie Dec 2009 B2
7639876 Clary et al. Dec 2009 B2
7656392 Bolender Feb 2010 B2
7686694 Cole Mar 2010 B2
7728820 Rosenberg et al. Jun 2010 B2
7728923 Kim et al. Jun 2010 B2
7731147 Rha Jun 2010 B2
7733326 Adiseshan Jun 2010 B1
7736042 Park et al. Jun 2010 B2
7773076 Pittel et al. Aug 2010 B2
7773121 Huntsberger et al. Aug 2010 B1
7774155 Sato et al. Aug 2010 B2
7777972 Chen et al. Aug 2010 B1
7782342 Koh Aug 2010 B2
7813715 McKillop et al. Oct 2010 B2
7815358 Inditsky Oct 2010 B2
7817428 Greer, Jr. et al. Oct 2010 B2
7865639 McCoy et al. Jan 2011 B2
7880727 Abanami et al. Feb 2011 B2
7884807 Hovden et al. Feb 2011 B2
7890863 Grant et al. Feb 2011 B2
7907394 Richardson et al. Mar 2011 B2
636397 Green Apr 2011 A1
7928964 Kolmykov-Zotov et al. Apr 2011 B2
7936501 Smith et al. May 2011 B2
7945717 Rivalsi May 2011 B2
7952566 Poupyrev et al. May 2011 B2
7970246 Travis et al. Jun 2011 B2
7973771 Geaghan Jul 2011 B2
7976393 Haga et al. Jul 2011 B2
7978281 Vergith et al. Jul 2011 B2
8016255 Lin Sep 2011 B2
8018386 Qi et al. Sep 2011 B2
8018579 Krah Sep 2011 B1
8022939 Hinata Sep 2011 B2
8026904 Westerman Sep 2011 B2
8053688 Conzola et al. Nov 2011 B2
8063886 Serban et al. Nov 2011 B2
8065624 Morin et al. Nov 2011 B2
8069356 Rathi et al. Nov 2011 B2
8077160 Land et al. Dec 2011 B2
8090885 Callaghan et al. Jan 2012 B2
8094134 Suzuki et al. Jan 2012 B2
8098233 Hotelling et al. Jan 2012 B2
8115499 Osoinach et al. Feb 2012 B2
8117362 Rodriguez et al. Feb 2012 B2
8118274 McClure et al. Feb 2012 B2
8118681 Mattice et al. Feb 2012 B2
8130203 Westerman Mar 2012 B2
8154524 Wilson et al. Apr 2012 B2
8162282 Hu et al. Apr 2012 B2
659139 Gengler May 2012 A1
8169421 Wright et al. May 2012 B2
8189973 Travis et al. May 2012 B2
8216074 Sakuma Jul 2012 B2
8229509 Paek et al. Jul 2012 B2
8229522 Kim et al. Jul 2012 B2
8232963 Orsley et al. Jul 2012 B2
8267368 Torii et al. Sep 2012 B2
8269093 Naik et al. Sep 2012 B2
8274784 Franz et al. Sep 2012 B2
8279589 Kim Oct 2012 B2
8279623 Idzik et al. Oct 2012 B2
8322290 Mignano Dec 2012 B1
8325144 Tierling et al. Dec 2012 B1
8330061 Rothkopf et al. Dec 2012 B2
8330742 Reynolds et al. Dec 2012 B2
8378972 Pance et al. Feb 2013 B2
8395587 Cauwels et al. Mar 2013 B2
8403576 Merz Mar 2013 B2
8416559 Agata et al. Apr 2013 B2
8421757 Suzuki et al. Apr 2013 B2
8487751 Laitinen et al. Jul 2013 B2
8498100 Whitt, III et al. Jul 2013 B1
8607651 Eventoff Dec 2013 B2
8633916 Bernstein et al. Jan 2014 B2
8638315 Algreatly Jan 2014 B2
8659555 Pihlaja Feb 2014 B2
8661363 Platzer et al. Feb 2014 B2
8674961 Posamentier Mar 2014 B2
8757374 Kaiser Jun 2014 B1
8766925 Perlin et al. Jul 2014 B2
8836664 Colgate et al. Sep 2014 B2
8847895 Lim et al. Sep 2014 B2
8854331 Heubel et al. Oct 2014 B2
8907871 Orsley Dec 2014 B2
8928581 Braun et al. Jan 2015 B2
8970525 D Los Reyes Mar 2015 B1
9047012 Bringert et al. Jun 2015 B1
9448631 Winter et al. Sep 2016 B2
9459160 Shaw et al. Oct 2016 B2
20010035697 Rueger et al. Nov 2001 A1
20010035859 Kiser Nov 2001 A1
20020000977 Vranish Jan 2002 A1
20020126445 Minaguchi et al. Sep 2002 A1
20020134828 Sandbach et al. Sep 2002 A1
20020154099 Oh Oct 2002 A1
20020188721 Lemel et al. Dec 2002 A1
20030016282 Koizumi Jan 2003 A1
20030044215 Monney et al. Mar 2003 A1
20030083131 Armstrong May 2003 A1
20030107557 Liebenow Jun 2003 A1
20030132916 Kramer Jul 2003 A1
20030163611 Nagao Aug 2003 A1
20030197687 Shetter Oct 2003 A1
20030201982 Iesaka Oct 2003 A1
20040005184 Kim et al. Jan 2004 A1
20040100457 Mandle May 2004 A1
20040174670 Huang et al. Sep 2004 A1
20040190239 Weng et al. Sep 2004 A1
20040212598 Kraus et al. Oct 2004 A1
20040227721 Moilanen et al. Nov 2004 A1
20040258924 Berger et al. Dec 2004 A1
20040268000 Barker et al. Dec 2004 A1
20050030728 Kawashima et al. Feb 2005 A1
20050057515 Bathiche Mar 2005 A1
20050057521 Aull et al. Mar 2005 A1
20050059441 Miyashita Mar 2005 A1
20050059489 Kim Mar 2005 A1
20050146512 Hill et al. Jul 2005 A1
20050190159 Skarine Sep 2005 A1
20050240949 Liu et al. Oct 2005 A1
20050264653 Starkweather et al. Dec 2005 A1
20050264988 Nicolosi Dec 2005 A1
20050285703 Wheeler et al. Dec 2005 A1
20060020903 Wang et al. Jan 2006 A1
20060028095 Maruyama et al. Feb 2006 A1
20060049993 Lin et al. Mar 2006 A1
20060082973 Egbert et al. Apr 2006 A1
20060085658 Allen et al. Apr 2006 A1
20060102914 Smits et al. May 2006 A1
20060103633 Gioeli May 2006 A1
20060125799 Hillis et al. Jun 2006 A1
20060132423 Travis Jun 2006 A1
20060154725 Glaser et al. Jul 2006 A1
20060156415 Rubinstein et al. Jul 2006 A1
20060181514 Newman Aug 2006 A1
20060181521 Perreault et al. Aug 2006 A1
20060187216 Trent, Jr. et al. Aug 2006 A1
20060195522 Miyazaki Aug 2006 A1
20060197753 Hotelling Sep 2006 A1
20060197754 Keely Sep 2006 A1
20060197755 Bawany Sep 2006 A1
20060238510 Panotopoulos et al. Oct 2006 A1
20060248597 Keneman Nov 2006 A1
20070043725 Hotelling et al. Feb 2007 A1
20070047221 Park Mar 2007 A1
20070051792 Wheeler et al. Mar 2007 A1
20070056385 Lorenz Mar 2007 A1
20070062089 Homer et al. Mar 2007 A1
20070069153 Pai-Paranjape et al. Mar 2007 A1
20070072474 Beasley et al. Mar 2007 A1
20070145945 McGinley et al. Jun 2007 A1
20070152983 McKillop et al. Jul 2007 A1
20070182663 Biech Aug 2007 A1
20070182722 Hotelling et al. Aug 2007 A1
20070200830 Yamamoto Aug 2007 A1
20070205995 Woolley Sep 2007 A1
20070220708 Lewis Sep 2007 A1
20070234420 Novotney et al. Oct 2007 A1
20070236408 Yamaguchi et al. Oct 2007 A1
20070236472 Bentsen Oct 2007 A1
20070236475 Wherry Oct 2007 A1
20070247338 Marchetto Oct 2007 A1
20070247432 Oakley Oct 2007 A1
20070257821 Son et al. Nov 2007 A1
20070260892 Paul et al. Nov 2007 A1
20070274094 Schultz et al. Nov 2007 A1
20070274095 Destain Nov 2007 A1
20070283179 Burnett et al. Dec 2007 A1
20080005423 Jacobs et al. Jan 2008 A1
20080013809 Zhu et al. Jan 2008 A1
20080018611 Serban et al. Jan 2008 A1
20080024459 Poupyrev et al. Jan 2008 A1
20080042994 Gillespie et al. Feb 2008 A1
20080094367 Van De Ven et al. Apr 2008 A1
20080104437 Lee May 2008 A1
20080151478 Chern Jun 2008 A1
20080158185 Westerman Jul 2008 A1
20080167832 Soss Jul 2008 A1
20080180411 Solomon et al. Jul 2008 A1
20080202824 Philipp et al. Aug 2008 A1
20080219025 Spitzer et al. Sep 2008 A1
20080228969 Cheah et al. Sep 2008 A1
20080232061 Wang et al. Sep 2008 A1
20080238884 Harish Oct 2008 A1
20080253822 Matias Oct 2008 A1
20080297878 Brown et al. Dec 2008 A1
20080303646 Elwell et al. Dec 2008 A1
20080309636 Feng et al. Dec 2008 A1
20080316002 Brunet et al. Dec 2008 A1
20080316066 Minato et al. Dec 2008 A1
20080320190 Lydon et al. Dec 2008 A1
20090002218 Rigazio et al. Jan 2009 A1
20090007001 Morin et al. Jan 2009 A1
20090009476 Daley, III Jan 2009 A1
20090046416 Daley, III Feb 2009 A1
20090049979 Naik et al. Feb 2009 A1
20090065267 Sato Mar 2009 A1
20090073060 Shimasaki et al. Mar 2009 A1
20090073957 Newland et al. Mar 2009 A1
20090079639 Hotta et al. Mar 2009 A1
20090083562 Park et al. Mar 2009 A1
20090085878 Heubel et al. Apr 2009 A1
20090090568 Min Apr 2009 A1
20090101417 Suzuki et al. Apr 2009 A1
20090106655 Grant et al. Apr 2009 A1
20090117955 Lo May 2009 A1
20090127005 Zachut et al. May 2009 A1
20090128374 Reynolds May 2009 A1
20090135142 Fu et al. May 2009 A1
20090140985 Liu Jun 2009 A1
20090160529 Lamborghini Jun 2009 A1
20090163147 Steigerwald et al. Jun 2009 A1
20090167704 Terlizzi et al. Jul 2009 A1
20090174679 Westerman Jul 2009 A1
20090182901 Callaghan et al. Jul 2009 A1
20090195497 Fitzgerald et al. Aug 2009 A1
20090219250 Ure Sep 2009 A1
20090231019 Yeh Sep 2009 A1
20090231275 Odgers Sep 2009 A1
20090250267 Heubel et al. Oct 2009 A1
20090251008 Sugaya Oct 2009 A1
20090259865 Sheynblat et al. Oct 2009 A1
20090262492 Whitchurch et al. Oct 2009 A1
20090265670 Kim et al. Oct 2009 A1
20090267892 Faubert Oct 2009 A1
20090284397 Lee et al. Nov 2009 A1
20090295739 Nagara et al. Dec 2009 A1
20090303137 Kusaka et al. Dec 2009 A1
20090303204 Nasiri et al. Dec 2009 A1
20090320244 Lin Dec 2009 A1
20090321490 Groene et al. Dec 2009 A1
20100001963 Doray et al. Jan 2010 A1
20100013319 Kamiyama et al. Jan 2010 A1
20100013613 Weston Jan 2010 A1
20100026656 Hotelling et al. Feb 2010 A1
20100038821 Jenkins et al. Feb 2010 A1
20100039764 Locker et al. Feb 2010 A1
20100045609 Do et al. Feb 2010 A1
20100045633 Gettemy Feb 2010 A1
20100051356 Stern et al. Mar 2010 A1
20100051432 Lin et al. Mar 2010 A1
20100053087 Dai et al. Mar 2010 A1
20100053534 Hsieh et al. Mar 2010 A1
20100075517 Ni et al. Mar 2010 A1
20100077237 Sawyers Mar 2010 A1
20100079398 Shen et al. Apr 2010 A1
20100081377 Chatterjee et al. Apr 2010 A1
20100085321 Pundsack Apr 2010 A1
20100097198 Suzuki Apr 2010 A1
20100102182 Lin Apr 2010 A1
20100103112 Yoo et al. Apr 2010 A1
20100103131 Segal et al. Apr 2010 A1
20100123686 Klinghult et al. May 2010 A1
20100133398 Chiu et al. Jun 2010 A1
20100137033 Lee Jun 2010 A1
20100141588 Kimura et al. Jun 2010 A1
20100142130 Wang et al. Jun 2010 A1
20100149111 Olien Jun 2010 A1
20100149134 Westerman et al. Jun 2010 A1
20100156798 Archer Jun 2010 A1
20100161522 Tirpak et al. Jun 2010 A1
20100162109 Chatterjee et al. Jun 2010 A1
20100162179 Porat Jun 2010 A1
20100164857 Liu et al. Jul 2010 A1
20100171708 Chuang Jul 2010 A1
20100171891 Kaji et al. Jul 2010 A1
20100174421 Tsai et al. Jul 2010 A1
20100180063 Ananny et al. Jul 2010 A1
20100182263 Aunio et al. Jul 2010 A1
20100188299 Rinehart et al. Jul 2010 A1
20100188338 Longe Jul 2010 A1
20100206614 Park et al. Aug 2010 A1
20100206644 Yeh Aug 2010 A1
20100214239 Wu et al. Aug 2010 A1
20100214257 Wussler et al. Aug 2010 A1
20100222110 Kim et al. Sep 2010 A1
20100231498 Large et al. Sep 2010 A1
20100231510 Sampsell et al. Sep 2010 A1
20100231556 Mines et al. Sep 2010 A1
20100238075 Pourseyed Sep 2010 A1
20100238119 Dubrovsky et al. Sep 2010 A1
20100238138 Goertz et al. Sep 2010 A1
20100245221 Khan Sep 2010 A1
20100250988 Okuda et al. Sep 2010 A1
20100274932 Kose Oct 2010 A1
20100279768 Huang et al. Nov 2010 A1
20100289457 Onnerud et al. Nov 2010 A1
20100289508 Joguet et al. Nov 2010 A1
20100295812 Burns et al. Nov 2010 A1
20100302378 Marks et al. Dec 2010 A1
20100304793 Kim Dec 2010 A1
20100306538 Thomas et al. Dec 2010 A1
20100308778 Yamazaki et al. Dec 2010 A1
20100308844 Day et al. Dec 2010 A1
20100315267 Chung Dec 2010 A1
20100315348 Jellicoe et al. Dec 2010 A1
20100315373 Steinhauser et al. Dec 2010 A1
20100321299 Shelley et al. Dec 2010 A1
20100321301 Casparian et al. Dec 2010 A1
20100321330 Lim et al. Dec 2010 A1
20100321339 Kimmel Dec 2010 A1
20100325155 Skinner et al. Dec 2010 A1
20100328230 Faubert et al. Dec 2010 A1
20100331059 Apgar et al. Dec 2010 A1
20110007008 Algreatly Jan 2011 A1
20110012873 Prest et al. Jan 2011 A1
20110018556 Le et al. Jan 2011 A1
20110019123 Prest et al. Jan 2011 A1
20110031287 Le Gette et al. Feb 2011 A1
20110036965 Zhang et al. Feb 2011 A1
20110037721 Cranfill et al. Feb 2011 A1
20110043454 Modarres et al. Feb 2011 A1
20110043990 Mickey et al. Feb 2011 A1
20110049094 Wu Mar 2011 A1
20110050037 Rinner et al. Mar 2011 A1
20110050587 Natanzon et al. Mar 2011 A1
20110050630 Ikeda Mar 2011 A1
20110055407 Lydon et al. Mar 2011 A1
20110057899 Sleeman et al. Mar 2011 A1
20110060926 Brooks et al. Mar 2011 A1
20110069148 Jones et al. Mar 2011 A1
20110074688 Hull et al. Mar 2011 A1
20110074702 Pertuit et al. Mar 2011 A1
20110080347 Steeves et al. Apr 2011 A1
20110080367 Marchand et al. Apr 2011 A1
20110084909 Hsieh et al. Apr 2011 A1
20110095994 Birnbaum Apr 2011 A1
20110096513 Kim Apr 2011 A1
20110102326 Casparian et al. May 2011 A1
20110102356 Kemppinen et al. May 2011 A1
20110115712 Han et al. May 2011 A1
20110115747 Powell et al. May 2011 A1
20110118025 Lukas et al. May 2011 A1
20110128227 Theimer Jun 2011 A1
20110134032 Chiu et al. Jun 2011 A1
20110134112 Koh et al. Jun 2011 A1
20110141052 Bernstein Jun 2011 A1
20110147398 Ahee et al. Jun 2011 A1
20110148793 Ciesla et al. Jun 2011 A1
20110157087 Kanehira et al. Jun 2011 A1
20110163955 Nasiri et al. Jul 2011 A1
20110164370 McClure et al. Jul 2011 A1
20110167181 Minoo et al. Jul 2011 A1
20110167287 Walsh et al. Jul 2011 A1
20110167391 Momeyer et al. Jul 2011 A1
20110167992 Eventoff et al. Jul 2011 A1
20110179864 Raasch et al. Jul 2011 A1
20110184646 Wong et al. Jul 2011 A1
20110193787 Morishige et al. Aug 2011 A1
20110193938 Oderwald et al. Aug 2011 A1
20110202878 Park et al. Aug 2011 A1
20110205161 Myers et al. Aug 2011 A1
20110205372 Miramontes Aug 2011 A1
20110216266 Travis Sep 2011 A1
20110227872 Huska et al. Sep 2011 A1
20110227913 Hyndman Sep 2011 A1
20110231682 Kakish et al. Sep 2011 A1
20110234502 Yun et al. Sep 2011 A1
20110241999 Thier Oct 2011 A1
20110242138 Tribble Oct 2011 A1
20110248152 Svajda et al. Oct 2011 A1
20110248920 Larsen Oct 2011 A1
20110248930 Kwok et al. Oct 2011 A1
20110248941 Abdo et al. Oct 2011 A1
20110261001 Liu Oct 2011 A1
20110261021 Modarres et al. Oct 2011 A1
20110261083 Wilson Oct 2011 A1
20110267294 Kildal Nov 2011 A1
20110267300 Serban et al. Nov 2011 A1
20110267757 Probst Nov 2011 A1
20110290686 Huang Dec 2011 A1
20110291922 Stewart et al. Dec 2011 A1
20110291951 Tong Dec 2011 A1
20110295697 Boston et al. Dec 2011 A1
20110297566 Gallagher et al. Dec 2011 A1
20110304577 Brown et al. Dec 2011 A1
20110304962 Su Dec 2011 A1
20110306424 Kazama et al. Dec 2011 A1
20110316807 Corrion Dec 2011 A1
20120007821 Zaliva Jan 2012 A1
20120011462 Westerman et al. Jan 2012 A1
20120013519 Hakansson et al. Jan 2012 A1
20120023459 Westerman Jan 2012 A1
20120024682 Huang et al. Feb 2012 A1
20120026048 Vazquez et al. Feb 2012 A1
20120044179 Hudson Feb 2012 A1
20120047368 Chinn et al. Feb 2012 A1
20120050975 Garelli et al. Mar 2012 A1
20120055770 Chen Mar 2012 A1
20120068933 Larsen Mar 2012 A1
20120068957 Puskarich et al. Mar 2012 A1
20120072167 Cretella, Jr. et al. Mar 2012 A1
20120075198 Sulem et al. Mar 2012 A1
20120075221 Yasuda Mar 2012 A1
20120075249 Hoch Mar 2012 A1
20120081316 Sirpal et al. Apr 2012 A1
20120087078 Medica et al. Apr 2012 A1
20120092279 Martin Apr 2012 A1
20120092350 Ganapathi et al. Apr 2012 A1
20120094257 Pillischer et al. Apr 2012 A1
20120098751 Lin Apr 2012 A1
20120099263 Lin Apr 2012 A1
20120099749 Rubin et al. Apr 2012 A1
20120105481 Baek et al. May 2012 A1
20120106082 Wu et al. May 2012 A1
20120113579 Agata et al. May 2012 A1
20120115553 Mahe et al. May 2012 A1
20120117409 Lee et al. May 2012 A1
20120127071 Jitkoff et al. May 2012 A1
20120127118 Nolting et al. May 2012 A1
20120139844 Ramstein et al. Jun 2012 A1
20120140396 Zeliff et al. Jun 2012 A1
20120145525 Ishikawa Jun 2012 A1
20120155015 Govindasamy et al. Jun 2012 A1
20120162693 Ito Jun 2012 A1
20120175487 Goto Jul 2012 A1
20120182242 Lindahl et al. Jul 2012 A1
20120188180 Yang et al. Jul 2012 A1
20120194393 Uttermann et al. Aug 2012 A1
20120194448 Rothkopf Aug 2012 A1
20120200532 Powell et al. Aug 2012 A1
20120200802 Large Aug 2012 A1
20120206401 Lin et al. Aug 2012 A1
20120206937 Travis et al. Aug 2012 A1
20120223866 Ayala Vazquez et al. Sep 2012 A1
20120224073 Miyahara Sep 2012 A1
20120229401 Birnbaum et al. Sep 2012 A1
20120235635 Sato Sep 2012 A1
20120235921 Laubach Sep 2012 A1
20120235942 Shahoian et al. Sep 2012 A1
20120242588 Myers et al. Sep 2012 A1
20120246377 Bhesania Sep 2012 A1
20120249459 Sashida et al. Oct 2012 A1
20120249474 Pratt et al. Oct 2012 A1
20120256848 Madabusi Srinivasan Oct 2012 A1
20120256959 Ye et al. Oct 2012 A1
20120268412 Cruz-Hernandez et al. Oct 2012 A1
20120268911 Lin Oct 2012 A1
20120274578 Snow et al. Nov 2012 A1
20120274811 Bakin Nov 2012 A1
20120287562 Wu et al. Nov 2012 A1
20120299866 Pao et al. Nov 2012 A1
20120300275 Vilardell et al. Nov 2012 A1
20120304199 Homma et al. Nov 2012 A1
20120312955 Randolph Dec 2012 A1
20120327025 Huska et al. Dec 2012 A1
20120328349 Isaac et al. Dec 2012 A1
20130009892 Salmela et al. Jan 2013 A1
20130044059 Fu Feb 2013 A1
20130047747 Joung Feb 2013 A1
20130063364 Moore Mar 2013 A1
20130063873 Wodrich et al. Mar 2013 A1
20130076646 Krah et al. Mar 2013 A1
20130076652 Leung Mar 2013 A1
20130088431 Ballagas et al. Apr 2013 A1
20130088442 Lee Apr 2013 A1
20130094131 O'Donnell et al. Apr 2013 A1
20130097534 Lewin et al. Apr 2013 A1
20130106766 Yilmaz et al. May 2013 A1
20130107144 Marhefka et al. May 2013 A1
20130127735 Motoyama May 2013 A1
20130141370 Wang et al. Jun 2013 A1
20130167663 Eventoff Jul 2013 A1
20130194235 Zanone et al. Aug 2013 A1
20130201115 Heubel Aug 2013 A1
20130207917 Cruz-Hernandez et al. Aug 2013 A1
20130222286 Kang et al. Aug 2013 A1
20130227836 Whitt, III Sep 2013 A1
20130228433 Shaw Sep 2013 A1
20130229273 Nodar Cortizo et al. Sep 2013 A1
20130229356 Marwah et al. Sep 2013 A1
20130229386 Bathiche Sep 2013 A1
20130249802 Yasutake Sep 2013 A1
20130275058 Awad Oct 2013 A1
20130278552 Kamin-Lyndgaard Oct 2013 A1
20130300683 Birnbaum et al. Nov 2013 A1
20130304941 Drasnin Nov 2013 A1
20130304944 Young Nov 2013 A1
20130311881 Birnbaum et al. Nov 2013 A1
20130314341 Lee et al. Nov 2013 A1
20130321291 Sim Dec 2013 A1
20130335209 Cruz-Hernandez et al. Dec 2013 A1
20130335330 Lane Dec 2013 A1
20130335902 Campbell Dec 2013 A1
20130335903 Raken Dec 2013 A1
20130342464 Bathiche et al. Dec 2013 A1
20130342465 Bathiche Dec 2013 A1
20130346636 Bathiche Dec 2013 A1
20140009429 Verweg et al. Jan 2014 A1
20140020484 Shaw et al. Jan 2014 A1
20140022177 Shaw Jan 2014 A1
20140028624 Marsden et al. Jan 2014 A1
20140055375 Kim et al. Feb 2014 A1
20140083207 Eventoff Mar 2014 A1
20140092003 Liu Apr 2014 A1
20140098058 Baharav et al. Apr 2014 A1
20140104189 Marshall et al. Apr 2014 A1
20140139436 Ramstein et al. May 2014 A1
20140139452 Levesque May 2014 A1
20140139472 Takenaka May 2014 A1
20140198072 Schuele et al. Jul 2014 A1
20140210742 Delattre et al. Jul 2014 A1
20140221098 Boulanger Aug 2014 A1
20140225821 Kim et al. Aug 2014 A1
20140225857 Ma Aug 2014 A1
20140230575 Picciotto et al. Aug 2014 A1
20140232657 Aviles et al. Aug 2014 A1
20140232679 Whitman et al. Aug 2014 A1
20140306914 Kagayama Oct 2014 A1
20140320393 Modarres et al. Oct 2014 A1
20140354587 Mohindra et al. Dec 2014 A1
20140370937 Park et al. Dec 2014 A1
20150084865 Shaw et al. Mar 2015 A1
20150097786 Behles et al. Apr 2015 A1
20150116205 Westerman et al. Apr 2015 A1
20150185842 Picciotto et al. Jul 2015 A1
20150185950 Watanabe et al. Jul 2015 A1
20150227207 Winter et al. Aug 2015 A1
20150253872 Reyes Sep 2015 A1
20150293592 Cheong et al. Oct 2015 A1
20150370376 Harley et al. Dec 2015 A1
20160018894 Yliaho et al. Jan 2016 A1
20160063828 Moussette Mar 2016 A1
20160135742 Cobbett May 2016 A1
20170023418 Shaw et al. Jan 2017 A1
20170102770 Winter et al. Apr 2017 A1
Foreign Referenced Citations (18)
Number Date Country
1722073 Jan 2006 CN
101763166 Jun 2010 CN
1223722 Jul 2002 EP
1591891 Nov 2005 EP
2353978 Aug 2011 EP
2381340 Oct 2011 EP
2584432 Apr 2013 EP
2178570 Feb 1987 GB
10326124 Dec 1998 JP
1173239 Mar 1999 JP
11345041 Dec 1999 JP
1020110087178 Aug 2011 KR
1038411 May 2012 NL
WO-2010011983 Jan 2010 WO
WO-2012036717 Mar 2012 WO
WO-2012173305 Dec 2012 WO
WO-2013169299 Nov 2013 WO
WO-2014098946 Jun 2014 WO
Non-Patent Literature Citations (230)
Entry
“Accessing Device Sensors”, retrieved from <https://developer.palm.com/content/api/dev-guide/pdk/accessing-device-sensors.html> on May 25, 2012, 2011, 4 pages.
“ACPI Docking for Windows Operating Systems”, Retrieved from: <http://www.scritube.com/limba/engleza/software/ACPI-Docking-for-Windows-Opera331824193.php> on Jul. 6, 2012, 2012, 10 pages.
“Advanced Configuration and Power Management Specification”, Intel Corporation, Microsoft Corporation, Toshiba Corp. Revision 1, Dec. 22, 1996, 364 pages.
“Capacitive Touch Sensors—Application Fields, Technology Overview and Implementation Example”, Fujitsu Microelectronics Europe GmbH; retrieved from http://www.fujitsu.com/downloads/MICRO/fme/articles/fujitsu-whitepaper-capacitive-touch-sensors.pdf on Jul. 20, 2011, Jan. 12, 2010, 12 pages.
“Cholesteric Liquid Crystal”, Retrieved from: <http://en.wikipedia.org/wiki/Cholesteric_liquid_crystal> on Aug. 6, 2012, Jun. 10, 2012, 2 pages.
“Cirago Slim Case®—Protective case with built-in kickstand for your iPhone 5®”, Retrieved from <http://cirago.com/wordpress/wp-content/uploads/2012/10/ipc1500brochure1.pdf> on Jan. 29, 2013, Jan. 2013, 1 page.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/470,633, dated Apr. 9, 2013, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/470,633, dated Jul. 2, 2013, 2 pages.
“Developing Next-Generation Human Interfaces using Capacitive and Infrared Proximity Sensing”, Silicon Laboratories, Inc., Available at <http://www.silabs.com/pages/DownloadDoc.aspx?FILEURL=support%20documents/technicaldocs/capacitive%20and%20proximity%20sensing_wp.pdf&src=SearchResults>, Aug. 30, 2010, pp. 1-10.
“Directional Backlighting for Display Panels”, U.S. Appl. No. 13/021,448, filed Feb. 4, 2011, 38 pages.
“DR2PA”, retrieved from <http://www.architainment.co.uk/wp-content/uploads/2012/08/DR2PA-AU-US-size-Data-Sheet-Rev-H_LOGO.pdf> on Sep. 17, 2012, Jan. 2012, 4 pages.
“Ex Parte Quayle Action”, U.S. Appl. No. 13/599,763, filed Nov. 14, 2014, 6 pages.
“Final Office Action”, U.S. Appl. No. 13/471,001, dated Jul. 25, 2013, 20 pages.
“Final Office Action”, U.S. Appl. No. 13/527,263, dated Jan. 27, 2015, 7 pages.
“Final Office Action”, U.S. Appl. No. 13/603,918, dated Mar. 21, 2014, 14 pages.
“Final Office Action”, U.S. Appl. No. 13/647,479, dated Dec. 12, 2014, 12 pages.
“Final Office Action”, U.S. Appl. No. 13/651,195, dated Apr. 18, 2013, 13 pages.
“Final Office Action”, U.S. Appl. No. 13/651,232, dated May 21, 2013, 21 pages.
“Final Office Action”, U.S. Appl. No. 13/651,287, dated May 3, 2013, 16 pages.
“Final Office Action”, U.S. Appl. No. 13/651,976, dated Jul. 25, 2013, 21 pages.
“Final Office Action”, U.S. Appl. No. 13/653,321, dated Aug. 2, 2013, 17 pages.
“Final Office Action”, U.S. Appl. No. 13/655,065, dated Apr. 2, 2015, 23 pages.
“Final Office Action”, U.S. Appl. No. 13/655,065, dated Aug. 8, 2014, 20 pages.
“Final Office Action”, U.S. Appl. No. 13/769,356, dated Apr. 10, 2015, 9 pages.
“Final Office Action”, U.S. Appl. No. 13/782,137, dated May 8, 2015, 19 pages.
“Final Office Action”, U.S. Appl. No. 13/974,749, dated May 21, 2015, 19 pages.
“Final Office Action”, U.S. Appl. No. 13/974,749, dated Sep. 5, 2014, 18 pages.
“Final Office Action”, U.S. Appl. No. 13/974,994, dated Jun. 10, 2015, 28 pages.
“Final Office Action”, U.S. Appl. No. 13/974,994, dated Oct. 6, 2014, 26 pages.
“Final Office Action”, U.S. Appl. No. 13/975,087, dated Aug. 7, 2015, 16 pages.
“Final Office Action”, U.S. Appl. No. 13/975,087, dated Sep. 10, 2014, 19 pages.
“Final Office Action”, U.S. Appl. No. 14/033,510, dated Jun. 5, 2015, 24 pages.
“Final Office Action”, U.S. Appl. No. 14/033,510, dated Aug. 21, 2014, 18 pages.
“First One Handed Fabric Keyboard with Bluetooth Wireless Technology”, Retrieved from: <http://press.xtvworld.com/article3817.html> on May 8, 2012, Jan. 6, 2005, 2 pages.
“Force and Position Sensing Resistors: An Emerging Technology”, Interlink Electronics, Available at <http://staff.science.uva.nl/˜vlaander/docu/FSR/An_Exploring_Technology.pdf>, Feb. 1990, pp. 1-6.
“Frogpad Introduces Weareable Fabric Keyboard with Bluetooth Technology”, Retrieved from: <http://www.geekzone.co.nz/content.asp?contentid=3898> on May 7, 2012, Jan. 7, 2005, 3 pages.
“How to Use the iPad's Onscreen Keyboard”, Retrieved from <http://www.dummies.com/how-to/content/how-to-use-the-ipads-onscreen-keyboard.html> on Aug. 28, 2012, 2012, 3 pages.
“iControlPad 2—The open source controller”, Retrieved from <http://www.kickstarter.com/projects/1703567677/icontrolpad-2-the-open-source-controller> on Nov. 20, 2012, 2012, 15 pages.
“i-Interactor electronic pen”, Retrieved from: <http://www.alibaba.com/product-gs/331004878/i_Interactor_electronic_pen.html> on Jun. 19, 2012, 2012, 5 pages.
“Incipio LG G-Slate Premium Kickstand Case—Black Nylon”, Retrieved from: <http://www.amazon.com/Incipio-G-Slate-Premium-Kickstand-Case/dp/B004ZKP916> on May 8, 2012, 2012, 4 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2014/068687, dated Mar. 18, 2015, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2014/016151, dated May 16, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2014/016743, dated Jul. 24, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2014/056185, dated Dec. 4, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028948, dated Jun. 21, 2013, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/029461, dated Jun. 21, 2013, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/040968, dated Sep. 5, 2013, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/044871, dated Aug. 14, 2013, 12 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2014/014522, dated Jun. 6, 2014, 13 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/045283, dated Mar. 12, 2014, 19 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/044873, dated Nov. 22, 2013, 9 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/045049, dated Sep. 16, 2013, 9 pages.
“Membrane Keyboards & Membrane Keypads”, Retrieved from: <http://www.pannam.com/> on May 9, 2012, Mar. 4, 2009, 2 pages.
“Microsoft Tablet PC”, Retrieved from <http://web.archive.org/web/20120622064335/https://en.wikipedia.org/wiki/Microsoft_Tablet_PC> on Jun. 4, 2014, Jun. 21, 2012, 9 pages.
“Motion Sensors”, Android Developers—retrieved from <http://developer.android.com/guide/topics/sensors/sensors_motion.html> on May 25, 2012, 2012, 7 pages.
“MPC Fly Music Production Controller”, AKAI Professional, Retrieved from: <http://www.akaiprompc.com/mpc-fly> on Jul. 9, 2012, 4 pages.
“NI Releases New Maschine & Maschine Mikro”, Retrieved from <http://www.djbooth.net/index/dj-equipment/entry/ni-releases-new-maschine-mikro/> on Sep. 17, 2012, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/021,448, dated Dec. 13, 2012, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,001, dated Feb. 19, 2013, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,139, dated Mar. 21, 2013, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,202, dated Feb. 11, 2013, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,336, dated Jan. 18, 2013, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/527,263, dated Apr. 3, 2014, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/527,263, dated Jul. 19, 2013, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/563,435, dated Jun. 14, 2013, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/564,520, dated Jun. 19, 2013, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/565,124, dated Jun. 17, 2013, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/599,763, dated May 28, 2014, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/603,918, dated Sep. 2, 2014, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/603,918, dated Dec. 19, 2013, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/645,405, dated Jan. 31, 2014, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/645,405, dated Aug. 11, 2014, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/647,479, dated Jul. 3, 2014, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,195, dated Jan. 2, 2013, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,232, dated Jan. 17, 2013, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,272, dated Feb. 12, 2013, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,287, dated Jan. 29, 2013, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,304, dated Mar. 22, 2013, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,327, dated Mar. 22, 2013, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,726, dated Apr. 15, 2013, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,871, dated Mar. 18, 2013, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,871, dated Jul. 1, 2013, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,976, dated Feb. 22, 2013, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,321, dated Feb. 1, 2013, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,682, dated Feb. 7, 2013, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,682, dated Jun. 3, 2013, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/655,065, dated Apr. 24, 2014, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/655,065, dated Aug. 19, 2015, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/655,065, dated Dec. 19, 2014, 24 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/656,055, dated Apr. 23, 2013, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/656,520, dated Feb. 1, 2013, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/656,520, dated Jun. 5, 2013, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/759,875, dated Aug. 1, 2014, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/769,356, dated Oct. 19, 2015, 23 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/769,356, dated Nov. 20, 2014, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/782,137, dated Jan. 30, 2015, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/782,137, dated Oct. 6, 2015, 20 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/974,749, dated Feb. 12, 2015, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/974,749, dated May 8, 2014, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/974,994, dated Jan. 23, 2015, 26 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/974,994, dated Jun. 4, 2014, 24 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/975,087, dated Feb. 27, 2015, 20 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/975,087, dated May 8, 2014, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/033,510, dated Feb. 12, 2015, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/033,510, dated Jun. 5, 2014, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/144,876, dated Jun. 10, 2015, 23 pages.
“Notice of Allowance”, U.S. Appl. No. 13/470,633, dated Mar. 22, 2013, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,202, dated May 28, 2013, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/599,763, dated Feb. 18, 2015, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/603,918, dated Jan. 22, 2015, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,195, dated Jul. 8, 2013, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,272, dated May 2, 2013, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,304, dated Jul. 1, 2013, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,327, dated Jun. 11, 2013, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,726, dated May 31, 2013, 5 pages.
“On-Screen Keyboard for Windows 7, Vista, XP with Touchscreen”, Retrieved from <www.comfort-software.com/on-screen-keyboard.html> on Aug. 28, 2012, Feb. 2, 2011, 3 pages.
“Optical Sensors in Smart Mobile Devices”, ON Semiconductor, TND415/D, Available at <http://www.onsemi.jp/pub_link/Collateral/TND415-D.PDF>, Nov. 2010, pp. 1-13.
“Optics for Displays: Waveguide-based Wedge Creates Collimated Display Backlight”, OptoIQ, retrieved from <http://www.optoiq.com/index/photonics-technologies-applications/lfw-display/lfw-article-display.articles.laser-focus-world.volume-46.issue-1.world-news.optics-for_displays.html> on Nov. 2, 2010, Jan. 1, 2010, 3 pages.
“Position Sensors”, Android Developers—retrieved from <http://developer.android.com/guide/topics/sensors/sensors_position.html> on May 25, 2012, 5 pages.
“Reflex LCD Writing Tablets”, retrieved from <http://www.kentdisplays.com/products/lcdwritingtablets.html> on Jun. 27, 2012, 3 pages.
“Restriction Requirement”, U.S. Appl. No. 13/603,918, dated Nov. 27, 2013, 8 pages.
“Restriction Requirement”, U.S. Appl. No. 13/471,139, dated Jan. 17, 2013, 7 pages.
“Restriction Requirement”, U.S. Appl. No. 13/651,304, dated Jan. 18, 2013, 7 pages.
“Restriction Requirement”, U.S. Appl. No. 13/651,726, dated Feb. 22, 2013, 6 pages.
“Restriction Requirement”, U.S. Appl. No. 13/651,871, dated Feb. 7, 2013, 6 pages.
“Restriction Requirement”, U.S. Appl. No. 13/715,229, dated Aug. 13, 2013, 7 pages.
“Second Written Opinion”, Application No. PCT/US2014/068687, dated Nov. 12, 2015, 6 pages.
“Smart Board™ Interactive Display Frame Pencil Pack”, Available at <http://downloads01.smarttech.com/media/sitecore/en/support/product/sbfpd/400series(interactivedisplayframes)/guides/smartboardinteractivedisplayframepencilpackv12mar09.pdf>, 2009, 2 pages.
“Snugg iPad 3 Keyboard Case—Cover Ultra Slim Bluetooth Keyboard Case for the iPad 3 & iPad 2”, Retrieved from <https://web.archive.org/web/20120810202056/http://www.amazon.com/Snugg-iPad-Keyboard-Case-Bluetooth/dp/B008CCHXJE> on Jan. 23, 2015, Aug. 10, 2012, 4 pages.
“SolRxTM E-Series Multidirectional Phototherapy ExpandableTM 2-Bulb Full Body Panel System”, Retrieved from: < http://www.solarcsystems.com/us_multidirectional_uv_light_therapy_1_intro.html > on Jul. 25, 2012, 2011, 4 pages.
“Tactile Feedback Solutions Using Piezoelectric Actuators”, Available at: http://www.eetimes.com/document.asp?doc_id=1278418, Nov. 17, 2010, 6 pages.
“The Microsoft Surface Tablets Comes With Impressive Design and Specs”, Retrieved from <http://microsofttabletreview.com/the-microsoft-surface-tablets-comes-with-impressive-design-and-specs> on Jan. 30, 2013, Jun. 2012, 2 pages.
“Tilt Shift Lenses: Perspective Control”, retrieved from http://www.cambridgeincolour.com/tutorials/tilt-shift-lenses1.htm, Mar. 28, 2008, 11 Pages.
“Virtualization Getting Started Guide”, Red Hat Enterprise Linux 6, Edition 0.2—retrieved from <http://docs.redhat.com/docs/en-US/Red_Hat_Enterprise_Linux/6/html-single/Virtualization_Getting_Started_Guide/index.html> on Jun. 13, 2012, 24 pages.
“Visus Photonics—Visionary Technologies New Generation of Production Ready Keyboard-Keypad Illumination Systems”, Available at: <http://www.visusphotonics.com/pdf/appl_keypad_keyboard_backlights.pdf>, May 2006, pp. 1-22.
“What is Active Alignment?”, http://www.kasalis.com/active_alignment.html, retrieved on Nov. 22, 2012, Nov. 22, 2012, 2 Pages.
“Write & Learn Spellboard Advanced”, Available at <http://somemanuals.com/VTECH,WRITE%2526LEARN—SPELLBOARD—ADV—71000,JIDFHE.PDF>, 2006, 22 pages.
“Writer 1 for iPad 1 keyboard + Case (Aluminum Bluetooth Keyboard, Quick Eject and Easy Angle Function!)”, Retrieved from <https://web.archive.org/web/20120817053825/http://www.amazon.com/keyboard-Aluminum-Bluetooth-Keyboard-Function/dp/B004OQLSLG> on Jan. 23, 2015, Aug. 17, 2012, 5 pages.
Akamatsu,“Movement Characteristics Using a Mouse with Tactile and Force Feedback”, In Proceedings of International Journal of Human-Computer Studies 45, No. 4, Oct. 1996, 11 pages.
Bathiche,“Input Device with Interchangeable Surface”, U.S. Appl. No. 13/974,749, Aug. 23, 2013, 51 pages.
Block,“DeviceOrientation Event Specification”, W3C, Editor's Draft, retrieved from <https://developer.palm.com/content/api/dev-guide/pdk/accessing-device-sensors.html> on May 25, 2012, Jul. 12, 2011, 14 pages.
Brown,“Microsoft Shows Off Pressure-Sensitive Keyboard”, retrieved from <http://news.cnet.com/8301-17938_105-10304792-1.html> on May 7, 2012, Aug. 6, 2009, 2 pages.
Butler,“SideSight: Multi-“touch” Interaction around Small Devices”, In the proceedings of the 21st annual ACM symposium on User interface software and technology., retrieved from <http://research.microsoft.com/pubs/132534/sidesight_crv3.pdf> on May 29, 2012, Oct. 19, 2008, 4 pages.
Chu,“Design and Analysis of a Piezoelectric Material Based Touch Screen With Additional Pressure and Its Acceleration Measurement Functions”, In Proceedings of Smart Materials and Structures, vol. 22, Issue 12, Nov. 1, 2013, 2 pages.
Crider,“Sony Slate Concept Tablet “Grows” a Kickstand”, Retrieved from: <http://androidcommunity.com/sony-slate-concept-tablet-grows-a-kickstand-20120116/> on May 4, 2012, Jan. 16, 2012, 9 pages.
Das,“Study of Heat Transfer through Multilayer Clothing Assemblies: A Theoretical Prediction”, Retrieved from <http://www.autexrj.com/cms/zalaczone_pliki/5_013_11.pdf>, Jun. 2011, 7 pages.
Dietz,“A Practical Pressure Sensitive Computer Keyboard”, In Proceedings of UIST 2009, Oct. 2009, 4 pages.
Gaver,“A Virtual Window on Media Space”, retrieved from <http://www.gold.ac.uk/media/15gaver-smets-overbeeke.MediaSpaceWindow.chi95.pdf> on Jun. 1, 2012, retrieved from <http://www.gold.ac.uk/media/15gaver-smets-overbeeke.MediaSpaceWindow.chi95.pdf> on Jun. 1, 2012, May 7, 1995, 9 pages.
Glatt,“Channel and Key Pressure (Aftertouch).”, Retrieved from: <http://home.roadrunner.com/˜jgglatt/tutr/touch.htm> on Jun. 11, 2012, 2012, 2 pages.
Gong,“PrintSense: A Versatile Sensing Technique to Support Multimodal Flexible Surface Interaction”, In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems; retrieved from: http://dl.acm.org/citation.cfm?id=2556288.2557173&coll=DL&dl=ACM&CFID=571580473 &CFTOKEN=89752233 on Sep. 19, 2014, Apr. 26, 2014, 4 pages.
Hanlon,“ElekTex Smart Fabric Keyboard Goes Wireless”, Retrieved from: <http://www.gizmag.com/go/5048/ > on May 7, 2012, Jan. 15, 2006, 5 pages.
Harada,“VoiceDraw: A Hands-Free Voice-Driven Drawing Application for People With Motor Impairments”, In Proceedings of Ninth International ACM SIGACCESS Conference on Computers and Accessibility, retrieved from <http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.113.7211&rep=rep1&type=pdf> on Jun. 1, 2012, Oct. 15, 2007, 8 pages.
Hinckley,“Touch-Sensing Input Devices”, In Proceedings of ACM SIGCHI 1999, May 15, 1999, 8 pages.
Hughes,“Apple's haptic touch feedback concept uses actuators, senses force on iPhone, iPad”, Retrieved from: http://appleinsider.com/articles/12/03/22/apples_haptic_touch_feedback_concept_uses_actuators_senses_force_on _iphone_ipad, Mar. 22, 2012, 5 pages.
Iwase,“Multistep Sequential Batch Assembly of Three-Dimensional Ferromagnetic Microstructures with Elastic Hinges”, Retrieved at <<http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1549861>> Proceedings: Journal of Microelectromechanical Systems, Dec. 2005, 7 pages.
Kaufmann,“Hand Posture Recognition Using Real-time Artificial Evolution”, EvoApplications'09, retrieved from <http://evelyne.lutton.free.fr/Papers/KaufmannEvolASP2010.pdf> on Jan. 5, 2012, Apr. 3, 2010, 10 pages.
Kaur,“Vincent Liew's redesigned laptop satisfies ergonomic needs”, Retrieved from: <http://www.designbuzz.com/entry/vincent-liew-s-redesigned-laptop-satisfies-ergonomic-needs/> on Jul. 27, 2012, Jun. 21, 2010, 4 pages.
Khuntontong,“Fabrication of Molded Interconnection Devices by Ultrasonic Hot Embossing on Thin Polymer Films”, IEEE Transactions on Electronics Packaging Manufacturing, vol. 32, No. 3, Jul. 2009, pp. 152-156.
Kyung,“TAXEL: Initial Progress Toward Self-Morphing Visio-Haptic Interface”, Proceedings: In IEEE World Haptics Conference, Jun. 21, 2011, 6 pages.
Lane,“Media Processing Input Device”, U.S. Appl. No. 13/655,065, dated Oct. 18, 2012, 43 pages.
Li,“Characteristic Mode Based Tradeoff Analysis of Antenna-Chassis Interactions for Multiple Antenna Terminals”, In IEEE Transactions on Antennas and Propagation, Retrieved from <http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6060882>, Feb. 2012, 13 pages.
Linderholm,“Logitech Shows Cloth Keyboard for PDAs”, Retrieved from: <http://www.pcworld.com/article/89084/logitech_shows_cloth_keyboard_for_pdas.html> on May 7, 2012, Mar. 15, 2002, 5 pages.
Mackenzie,“The Tactile Touchpad”, In Proceedings of the ACM CHI Human Factors in Computing Systems Conference Available at: <http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.150.4780&rep=rep1&type=pdf >, Mar. 22, 1997, 2 pages.
Manresa-Yee,“Experiences Using a Hands-Free Interface”, In Proceedings of the 10th International ACM SIGACCESS Conference on Computers and Accessibility, retrieved from <http://dmi.uib.es/˜cmanresay/Research/%5BMan08%5DAssets08.pdf> on Jun. 1, 2012, Oct. 13, 2008, pp. 261-262.
McLellan,“Eleksen Wireless Fabric Keyboard: a first look”, Retrieved from: <http://www.zdnetasia.com/eleksen-wireless-fabric-keyboard-a-first-look-40278954.htm> on May 7, 2012, Jul. 17, 2006, 9 pages.
McPherson,“TouchKeys: Capacitive Multi-Touch Sensing on a Physical Keyboard”, In Proceedings of NIME 2012, May 2012, 4 pages.
Miller,“MOGA gaming controller enhances the Android gaming experience”, Retrieved from <http://www.zdnet.com/moga-gaming-controller-enhances-the-android-gaming-experience-7000007550/> on Nov. 20, 2012, Nov. 18, 2012, 9 pages.
Nakanishi,“Movable Cameras Enhance Social Telepresence in Media Spaces”, In Proceedings of the 27th International Conference on Human Factors in Computing Systems, retrieved from <http://smg.ams.eng.osaka-u.ac.jp/˜nakanishi/hnp_2009_chi.pdf> on Jun. 1, 2012, Apr. 6, 2009, 10 pages.
Picciotto,“Piezo-Actuated Virtual Buttons for Touch Surfaces”, U.S. Appl. No. 13/769,356, Feb. 17, 2013, 31 pages.
Piltch,“ASUS Eee Pad Slider SL101 Review”, Retrieved from <http://www.laptopmag.com/review/tablets/asus-eee-pad-slider-sl101.aspx>, Sep. 22, 2011, 5 pages.
Post,“E-Broidery: Design and Fabrication of Textile-Based Computing”, IBM Systems Journal, vol. 39, Issue 3 & 4, Jul. 2000, pp. 840-860.
Poupyrev,“Ambient Touch: Designing Tactile Interfaces for Handheld Devices”, In Proceedings of the 15th Annual ACM Symposium on User Interface Software and Technology Available at: <http://www.ivanpoupyrev.com/e-library/2002/uist2002_ambientouch.pdf>, Oct. 27, 2002, 10 pages.
Poupyrev,“Tactile Interfaces for Small Touch Screens”, In Proceedings of the 16th Annual ACM Symposium on User Interface Softward and Technology, Nov. 2, 2003, 4 pages.
Purcher,“Apple is Paving the Way for a New 3D GUI for IOS Devices”, Retrieved from: <http://www.patentlyapple.com/patently-apple/2012/01/apple-is-paving-the-way-for-a-new-3d-gui-for-ios-devices.html> on Jun. 4, 2012, Retrieved from: <http://www.patentlyapple.com/patently-apple/2012/01/apple-is-paving-the-way-for-a-new-3d-gui-for-ios-devices.html> on Jun. 4, 2012, Jan. 12, 2012, 15 pages.
Qin,“pPen: Enabling Authenticated Pen and Touch Interaction on Tabletop Surfaces”, In Proceedings of ITS 2010, Available at <http://www.dfki.de/its2010/papers/pdf/po172.pdf>, Nov. 2010, pp. 283-284.
Reilink,“Endoscopic Camera Control by Head Movements for Thoracic Surgery”, In Proceedings of 3rd IEEE RAS & EMBS International Conference of Biomedical Robotics and Biomechatronics, retrieved from <http://doc.utwente.nl/74929/1/biorob_online.pdf> on Jun. 1, 2012, Sep. 26, 2010, pp. 510-515.
Rendl,“PyzoFlex: Printed Piezoelectric Pressure Sensing Foil”, In Proceedings of the 25th Annual ACM Symposium on User Interface Software and Technology, Oct. 7, 2012, 10 pages.
Rubin,“Switched on: The Bedeviled Bezel”, Retrieved from: http://www.engadget.com/2011/07/17/switched-on-the-bedeviled-bezel/—on Nov. 19, 2015, Jul. 17, 2011, 4 pages.
Shaw,“Input Device Configuration having Capacitive and Pressure Sensors”, U.S. Appl. No. 14/033,510, Sep. 22, 2013, 55 pages.
Staff,“Gametel Android controller turns tablets, phones into portable gaming devices”, Retrieved from <http://www.mobiletor.com/2011/11/18/gametel-android-controller-turns-tablets-phones-into-portable-gaming-devices/#> on Nov. 20, 2012, Nov. 18, 2011, 5 pages.
Sumimoto,“Touch & Write: Surface Computing With Touch and Pen Input”, Retrieved from: <http://www.gottabemobile.com/2009/08/07/touch-write-surface-computing-with-touch-and-pen-input/> on Jun. 19, 2012, Aug. 7, 2009, 4 pages.
Sundstedt,“Gazing at Games: Using Eye Tracking to Control Virtual Characters”, In ACM SIGGRAPH 2010 Courses, retrieved from <http://www.tobii.com/Global/Analysis/Training/EyeTrackAwards/veronica_sundstedt.pdf> on Jun. 1, 2012, Jul. 28, 2010, 85 pages.
Takamatsu,“Flexible Fabric Keyboard with Conductive Polymer-Coated Fibers”, In Proceedings of Sensors 2011, Oct. 28, 2011, 4 pages.
Titus,“Give Sensors a Gentle Touch”, http://www.ecnmag.com/articles/2010/01/give-sensors-gentle-touch, Jan. 13, 2010, 6 pages.
Travis,“Collimated Light from a Waveguide for a Display Backlight”, Optics Express, 19714, vol. 17, No. 22, retrieved from <http://download.microsoft.com/download/D/2/E/D2E425F8-CF3C-4C71-A4A2-70F9D4081007/OpticsExpressbacklightpaper.pdf> on Oct. 15, 2009, Oct. 15, 2009, 6 pages.
Travis,“The Design of Backlights for View-Sequential 3D”, retrieved from <http://download.microsoft.com/download/D/2/E/D2E425F8-CF3C-4C71-A4A2-70F9D4081007/Backlightforviewsequentialautostereo.docx> on Nov. 1, 2010, 4 pages.
Tuite, "Haptic Feedback Chips Make Virtual-Button Applications on Handheld Devices a Snap", Retrieved from: <http://electronicdesign.com/analog/haptic-feedback-chips-make-virtual-button-applications-handheld-devices-snap>, Sep. 10, 2009, 7 pages.
Valli, "Notes on Natural Interaction", retrieved from <http://www.idemployee.id.tue.nl/g.w.m.rauterberg/lecturenotes/valli-2004.pdf> on Jan. 5, 2012, Sep. 2005, 80 pages.
Valliath, "Design of Hologram for Brightness Enhancement in Color LCDs", Retrieved from <http://www.loreti.it/Download/PDF/LCD/44_05.pdf> on Sep. 17, 2012, May 1998, 5 pages.
Vaucelle, "Scopemate, A Robotic Microscope!", Architectradure, retrieved from <http://architectradure.blogspot.com/2011/10/at-uist-this-monday-scopemate-robotic.html> on Jun. 6, 2012, Oct. 17, 2011, 2 pages.
Williams, "A Fourth Generation of LCD Backlight Technology", Retrieved from <http://cds.linear.com/docs/Application%20Note/an65f.pdf>, Nov. 1995, 124 pages.
Xu, "Hand Gesture Recognition and Virtual Game Control Based on 3D Accelerometer and EMG Sensors", IUI'09, Feb. 8-11, 2009, retrieved from <http://sclab.yonsei.ac.kr/courses/10TPR/10TPR.files/Hand%20Gesture%20Recognition%20and%20Virtual%20Game%20Control%20based%20on%203d%20accelerometer%20and%20EMG%20sensors.pdf> on Jan. 5, 2012, Feb. 8, 2009, 5 pages.
Xu, "Vision-based Detection of Dynamic Gesture", ICTM'09, Dec. 5-6, 2009, retrieved from <http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=5412956> on Jan. 5, 2012, Dec. 5, 2009, pp. 223-226.
Zhang, "Model-Based Development of Dynamically Adaptive Software", In Proceedings of ICSE 2006, Available at <http://www.irisa.fr/lande/lande/icse-proceedings/icse/p371.pdf>, May 20, 2006, pp. 371-380.
Zhu, "Keyboard before Head Tracking Depresses User Success in Remote Camera Control", In Proceedings of 12th IFIP TC 13 International Conference on Human-Computer Interaction, Part II, retrieved from <http://csiro.academia.edu/Departments/CSIRO_ICT_Centre/Papers?page=5> on Jun. 1, 2012, Aug. 24, 2009, 14 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/698,318, dated Jun. 9, 2016, 2 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/025966, dated Jun. 15, 2016, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/144,876, dated Jul. 6, 2016, 33 pages.
“Notice of Allowance”, U.S. Appl. No. 14/698,318, dated May 6, 2016, 13 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 14/698,318, dated Aug. 15, 2016, 2 pages.
“International Search Report and Written Opinion Issued in PCT Application No. PCT/US2017/013583”, dated Apr. 19, 2017, 13 pages.
“Second Written Opinion Issued in PCT Application No. PCT/US2016/025966”, dated Mar. 14, 2017, 7 pages.
“Using a Force Touch trackpad”, Retrieved on: Nov. 17, 2015 Available at: https://support.apple.com/en-in/HT204352.
Betters, Elyse, “What is Force Touch? Apple's Haptic Feedback Technology Explained”, Published on: Mar. 11, 2015 Available at: http://www.pocket-lint.com/news/133176-what-is-force-touch-apple-s-haptic-feedback-technology-explained.
De Rosa, Aurelio, “HTML5: Vibration API”, Published on: Mar. 10, 2014 Available at: http://code.tutsplus.com/tutorials/html5-vibration-api-mobile-22585.
Rendl, et al., “Presstures: Exploring Pressure-Sensitive Multi-Touch Gestures on Trackpads”, In Proceedings of SIGCHI Conference on Human Factors in Computing Systems, Apr. 26, 2014, pp. 431-434.
Kadlecek, Petr, “Overview of Current Developments in Haptic APIs”, In Proceedings of 15th Central European Seminar on Computer Graphics, May 2, 2011, 8 pages.
U.S. Appl. No. 14/298,658, Boulanger, et al., “Method and System for Controlling of an Ambient Multiple Zones Haptic Feedback on Mobile Devices (W231)”, filed Jun. 6, 2014.
“Final Office Action”, U.S. Appl. No. 14/144,876, dated Feb. 3, 2016, 27 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2014/068687, dated Mar. 11, 2016, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/269,594, dated Jun. 7, 2017, 27 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2016/025966, dated May 22, 2017, 8 pages.
“Advisory Action”, U.S. Appl. No. 13/769,356, dated May 30, 2017, 2 pages.
“Advisory Action”, U.S. Appl. No. 13/769,356, dated Dec. 16, 2016, 3 pages.
“Advisory Action”, U.S. Appl. No. 14/729,793, dated Feb. 13, 2018, 2 pages.
“Final Office Action”, U.S. Appl. No. 13/769,356, dated Mar. 23, 2016, 15 pages.
“Final Office Action”, U.S. Appl. No. 13/769,356, dated Sep. 30, 2016, 15 pages.
“Final Office Action”, U.S. Appl. No. 14/729,793, dated Dec. 1, 2017, 17 pages.
“Foreign Office Action”, CN Application No. 201480009165.3, dated Apr. 12, 2017, 16 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2016/031699, dated Feb. 22, 2017, 6 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/031699, dated Nov. 11, 2016, 23 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/769,356, dated Jun. 30, 2016, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/033,508, dated Dec. 3, 2015, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/729,793, dated Mar. 31, 2017, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/283,913, dated Feb. 10, 2017, 20 pages.
“Notice of Allowance”, U.S. Appl. No. 14/033,508, dated May 6, 2016, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 15/269,594, dated Jan. 31, 2018, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 15/283,913, dated Mar. 19, 2018, 10 pages.
“Notice of Allowance”, U.S. Appl. No. 15/283,913, dated Sep. 6, 2017, 9 pages.
“Second Written Opinion”, Application No. PCT/US2014/016151, dated Jan. 29, 2015, 6 pages.
Related Publications (1)
Publication Number: 20170212591 A1; Date: Jul. 2017; Country: US