Computer device and an apparatus having sensors configured for measuring spatial information indicative of a position of the computing device

Information

  • Patent Grant
  • Patent Number
    8,903,517
  • Date Filed
    Wednesday, September 4, 2013
  • Date Issued
    Tuesday, December 2, 2014
Abstract
Sensor fusion algorithm techniques are described. In one or more embodiments, behaviors of a host device and accessory devices are controlled based upon an orientation of the host device and accessory devices, relative to one another. A combined spatial position and/or orientation for the host device may be obtained based on raw measurements that are obtained from at least two different types of sensors. In addition, a spatial position and/or orientation for an accessory device is ascertained using one or more sensors of the accessory device. An orientation (or position) of the accessory device relative to the host computing device may then be computed based on the combined spatial position/orientation for the host computing device and the ascertained spatial position/orientation for the accessory device. The relative orientation that is computed may then be used in various ways to control behaviors of the host computing device and/or accessory device.
Description
BACKGROUND

Mobile computing devices have been developed to increase the functionality that is made available to users in a mobile setting. For example, a user may interact with a mobile phone, tablet computer, or other mobile computing device to check email, surf the web, compose texts, interact with applications, and so on. Some mobile computing devices may connect to and interact with various accessory devices to provide different input techniques, extend functionality, and so forth. One challenge that faces developers of mobile computing devices is managing behaviors and interaction with accessory devices. For instance, a host computing device may have limited control over how an accessory device behaves and thus actions of the accessory may sometimes interfere with operation of the host computing device. Moreover, the user experience may be adversely affected by accessory devices that do not respond in a manner that is consistent with the host computing device. Thus, integrated management of behaviors and interaction for accessory devices may be a challenging consideration for developers of mobile computing devices.


SUMMARY

Sensor fusion algorithm techniques are described. In one or more embodiments, behaviors of a host device and accessory devices are controlled based upon an orientation of the host device and accessory devices, relative to one another. A combined spatial position and/or orientation for the host device may be obtained based on raw measurements that are obtained from at least two different types of sensors. In addition, a spatial position and/or orientation for an accessory device is ascertained using one or more sensors of the accessory device. An orientation (or position) of the accessory device relative to the host computing device may then be computed based on the combined spatial position/orientation for the host computing device and the ascertained spatial position/orientation for the accessory device. The relative orientation that is computed may then be used in various ways to control behaviors of the host computing device and/or accessory device.


This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.





BRIEF DESCRIPTION OF THE DRAWINGS

The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items. Entities represented in the figures may be indicative of one or more entities and thus reference may be made interchangeably to single or plural forms of the entities in the discussion.



FIG. 1 is an illustration of an environment in an example implementation that is operable to employ the techniques described herein.



FIG. 2 depicts an example implementation of a computing device of FIG. 1 in greater detail.



FIG. 3 depicts an example implementation of an accessory device of FIG. 1, showing a flexible hinge in greater detail.



FIG. 4 depicts an example orientation of the accessory device in relation to the computing device in accordance with one or more embodiments.



FIG. 5 depicts an example orientation of the accessory device in relation to the computing device in accordance with one or more embodiments.



FIG. 6 depicts an example orientation of the accessory device in relation to the computing device in accordance with one or more embodiments.



FIG. 7 depicts an example orientation of the accessory device in relation to the computing device in accordance with one or more embodiments.



FIG. 8 depicts an example orientation of the accessory device in relation to the computing device in accordance with one or more embodiments.



FIG. 9 depicts an example orientation of the accessory device in relation to the computing device in accordance with one or more embodiments.



FIG. 10 illustrates some example rotational orientations of the computing device in relation to the accessory device in accordance with one or more embodiments.



FIG. 11 is a flow diagram that describes an example procedure in accordance with one or more embodiments.



FIG. 12 is a flow diagram that describes an example procedure in accordance with one or more embodiments.



FIG. 13 illustrates an example system including various components of an example device that can be implemented as any type of computing device as described with reference to FIGS. 1-12 to implement embodiments of the techniques described herein.





DETAILED DESCRIPTION

Overview


Traditionally, a host computing device may have limited control over how an associated accessory device behaves. Thus actions of the accessory may sometimes interfere with operation of the host computing device, which may detract from the user experience. Accordingly, integrated management of behaviors and interaction for accessory devices may be a consideration for developers of mobile computing devices.


Sensor fusion algorithm techniques are described. In one or more embodiments, behaviors of a host device and accessory devices are controlled based upon an orientation of the host device and accessory devices, relative to one another. A combined spatial position and/or orientation for the host device may be obtained based on raw measurements that are obtained from at least two different types of sensors. In addition, a spatial position and/or orientation for an accessory device is ascertained using one or more sensors of the accessory device. An orientation (or position) of the accessory device relative to the host computing device may then be computed based on the combined spatial position/orientation for the host computing device and the ascertained spatial position/orientation for the accessory device. The relative orientation that is computed may then be used in various ways to control behaviors of the host computing device and/or accessory device.


In the following discussion, an example environment and devices are first described that may employ the techniques described herein. Example procedures are then described which may be performed in the example environment and by the devices as well as in other environments and by other devices. Consequently, performance of the example procedures is not limited to the example environment/devices and the example environment/devices are not limited to performance of the example procedures.


Example Operating Environment



FIG. 1 is an illustration of an environment 100 in an example implementation that is operable to employ the techniques described herein. The illustrated environment 100 includes an example of a computing device 102 that is physically and communicatively coupled to an accessory device 104 via a flexible hinge 106. The computing device 102 may be configured in a variety of ways. For example, the computing device 102 may be configured for mobile use, such as a mobile phone, a tablet computer as illustrated, and so on. Thus, the computing device 102 may range from full-resource devices with substantial memory and processor resources to low-resource devices with limited memory and/or processing resources. The computing device 102 may also relate to software that causes the computing device 102 to perform one or more operations.


The computing device 102, for instance, is illustrated as including an input/output module 108. The input/output module 108 is representative of functionality relating to processing of inputs and rendering outputs of the computing device 102. A variety of different inputs may be processed by the input/output module 108, such as inputs relating to functions that correspond to keys of the accessory device or keys of a virtual keyboard displayed by the display device 110, as well as gestures recognized through the accessory device 104 and/or touchscreen functionality of the display device 110 that cause operations corresponding to the gestures to be performed, and so forth. Thus, the input/output module 108 may support a variety of different input techniques by recognizing and leveraging a division between types of inputs including key presses, gestures, and so on.


In the illustrated example, the accessory device 104 is a device configured as a keyboard having a QWERTY arrangement of keys, although other arrangements of keys are also contemplated. Further, other non-conventional configurations for an accessory device 104 are also contemplated, such as a game controller, a configuration that mimics a musical instrument, a power adapter, and so forth. Thus, the accessory device 104 may assume a variety of different configurations to support a variety of different functionality. Different accessory devices may be connected to the computing device at different times. Moreover, functionality of a particular accessory device may also be adapted to assume different configurations and capabilities, such as through different selectable modes, software/firmware updates, modular add-on devices/components, and so forth. This may change the way keys or other controls for an accessory are laid out and also the way in which inputs from the accessory are handled by the host and applications. For example, an accessory device may be operable as a keyboard and as a game controller by adaptively switching the kinds of keys/controls, displayed labels, and positions of controls to assume different configurations at different times.


As previously described, the accessory device 104 is physically and communicatively coupled to the computing device 102 in this example through use of a flexible hinge 106. The flexible hinge 106 represents one illustrative example of an interface that is suitable to connect and/or attach an accessory device to a host computing device 102. The flexible hinge 106 is flexible in that rotational movement supported by the hinge is achieved through flexing (e.g., bending) of the material forming the hinge as opposed to mechanical rotation as supported by a pin, although that embodiment is also contemplated. Further, this flexible rotation may be configured to support movement in one direction (e.g., vertically in the figure) yet restrict movement in other directions, such as lateral movement of the accessory device 104 in relation to the computing device 102. This may be used to support consistent alignment of the accessory device 104 in relation to the computing device 102, such as to align sensors used to change power states, application states, and so on.


The flexible hinge 106, for instance, may be formed using one or more layers of fabric and include conductors formed as flexible traces to communicatively couple the accessory device 104 to the computing device 102 and vice versa. This communication, for instance, may be used to communicate a result of a key press to the computing device 102, receive power from the computing device, perform authentication, provide supplemental power to the computing device 102, and so on. The flexible hinge 106 or other interface may be configured in a variety of ways to support multiple different accessory devices 104, further discussion of which may be found in relation to the following figure.


As further illustrated in FIG. 1, the computing device 102 may include various applications 112 that provide different functionality to the device. A variety of applications 112 typically associated with computing devices are contemplated including, but not limited to, an operating system, a productivity suite that integrates multiple office productivity modules, a web browser, games, a multi-media player, a word processor, a spreadsheet program, a photo manager, and so forth. The computing device 102 further includes multiple host sensors 114 that are configured to sense corresponding inputs responsive to manipulation of the computing device 102. Likewise, the accessory device 104 includes one or more accessory sensors 116 that are configured to sense corresponding inputs generated responsive to manipulation of the accessory device 104.


In accordance with techniques described herein, input obtained from the host sensors 114 and accessory sensors 116 may be processed and/or combined according to a suitable sensor fusion algorithm to resolve an orientation of the accessory device 104 and computing device 102 one to another. In general, input regarding position and/or orientation from multiple different types of sensors is processed in combination to compute the orientation. The computed orientation may then be used to control behaviors of the host and accessory and perform various corresponding operations. A variety of different types of sensors and algorithms suitable to resolve the orientation may be employed as discussed in greater detail in relation to the following figures.


To further illustrate, consider FIG. 2 which depicts generally at 200 an example computing device 102 of FIG. 1 in greater detail. In the depicted example, the computing device 102 is shown in a stand-alone configuration without an accessory device 104 being attached. In addition to the components discussed in relation to FIG. 1, the example computing device of FIG. 2 further includes a processing system 202 and computer-readable media 204 that are representative of various different types and combinations of processing components, media, memory, and storage components and/or devices that may be associated with a computing device and employed to provide a wide range of device functionality. In at least some embodiments, the processing system 202 and computer-readable media 204 represent processing power and memory/storage that may be employed for general purpose computing operations. More generally, the computing device 102 may be configured as any suitable computing system and/or device that employs various processing systems and computer-readable media, additional details and examples of which are discussed in relation to the example computing system of FIG. 13.


The computing device 102 may also implement selected device functionality through one or more microcontrollers 206. The microcontrollers 206 represent hardware devices/systems that are designed to perform a predefined set of designated tasks. The microcontrollers 206 may represent respective on-chip systems/circuits having self-contained resources such as processing components, I/O devices/peripherals, various types of memory (ROM, RAM, Flash, EEPROM), programmable logic, and so forth. Different microcontrollers may be configured to provide different embedded applications/functionality that are implemented at least partially in hardware and perform corresponding tasks. The microcontrollers 206 enable performance of some tasks outside of operation of a general purpose processing system and other applications/components of the computing device or accessory device. Generally, power consumption of the microcontrollers is low in comparison with operating a general purpose processing system for a device.


As further depicted, the computing device 102 may further include a sensor fusion module 208, a behavior module 210, and a sensor fusion application programming interface (API) 212 to implement aspects of sensor fusion algorithm techniques described herein. The sensor fusion module 208 generally represents functionality to apply a suitable sensor fusion algorithm as described above and below to derive an orientation that is based on input from multiple sensors. The sensor fusion module 208 may operate to collect inputs regarding position/orientation supplied via the various sensors, process the inputs, and compute a corresponding orientation that describes the spatial relationship of the computing device 102 and an accessory device 104.


The behavior module 210 represents functionality to control and/or modify a variety of different behaviors associated with the computing device 102 and/or accessory devices 104 based on the computed orientation. This may include but is not limited to managing power states/consumption, selecting operational modes or device states, adjusting sensitivity of one or more sensors, controlling interaction between the host, accessory, and/or peripheral devices, modifying device functionality, enabling/disabling network connections, activating/deactivating applications, and/or setting application states, to name a few examples. These and other examples of behaviors that may be controlled based on a computed orientation are described in greater detail in relation to the example procedures discussed herein below.


The sensor fusion application programming interface (API) 212 represents functionality to expose information regarding the computed orientation for use by applications 112. For example, applications 112 may utilize the sensor fusion API to request orientation information on demand and/or subscribe to orientation updates from the sensor fusion module 208 and/or an associated notification system. The sensor fusion API may then interact with the sensor fusion module 208 on behalf of the application 112 to cause orientation information to be conveyed to the application 112. Applications 112 may use orientation information in various ways, examples of which may be found in the discussion of an example procedure 1200 of FIG. 12 below.


As previously mentioned, various different types of sensors may be employed to implement the techniques described herein. A host computing device may include an array of sensors used to provide orientation information. By way of example and not limitation, the host sensors 114 for the example computing device 102 of FIG. 2 are depicted as including a gyroscope 214, an accelerometer 216, a magnetometer 218, and a Hall Effect sensor 220. Various other sensors 222 suitable to derive information regarding the position and/or orientation may also be employed.



FIG. 3 depicts an example implementation 300 of the accessory device 104 of FIG. 1, showing the flexible hinge 106 in greater detail. In this example, the accessory device 104 is depicted as being detached from the computing device. Here, a connection portion 302 of the accessory device is shown that is configured to provide a communicative and physical connection between the accessory device 104 and the computing device 102. In this example, the connection portion 302 has a height and cross section configured to be received in a channel in the housing of the computing device 102, although this arrangement may also be reversed without departing from the spirit and scope thereof. The connection portion 302 provides an interface through which attachment/connection of the accessory device 104 to the computing device may be detected. In at least some embodiments, this interface also enables communications for interaction and/or control of the accessory device 104 as described herein. For example, the computing device 102, sensor fusion module 208, and/or behavior module 210 may communicate with the accessory device through the interface to obtain input from various accessory sensors 116 and to direct behaviors of the accessory device.


The connection portion 302 is flexibly connected to a portion of the accessory device 104 that includes the keys through use of the flexible hinge 106. Thus, when the connection portion 302 is physically connected to the computing device the combination of the connection portion 302 and the flexible hinge 106 supports movement of the accessory device 104 in relation to the computing device 102 that is similar to a hinge of a book. Naturally, a variety of orientations may be supported some examples of which are described in the following section.


The connection portion 302 is illustrated in this example as including magnetic coupling devices 304, 306, mechanical coupling protrusions 308, 310, and a plurality of communication contacts 312. The magnetic coupling devices 304, 306 are configured to magnetically couple to complementary magnetic coupling devices of the computing device 102 through use of one or more magnets. In this way, the accessory device 104 may be physically secured to the computing device 102 through use of magnetic attraction. The connection portion 302 also includes mechanical coupling protrusions 308, 310 to form a mechanical physical connection between the accessory device 104 and the computing device 102. The communication contacts 312 are configured to contact corresponding communication contacts of the computing device 102 to form a communicative coupling between the devices to facilitate various kinds of communications.


Having discussed an example environment in which embodiments may operate, consider now some example device orientations in accordance with one or more embodiments.


Example Device Orientations


The following discussion presents some example device orientations. As detailed, different device orientations can be associated with different device power states and different application states, can trigger different behaviors, and so forth. The example orientations as well as other orientations may be determined using sensor fusion algorithm techniques described above and below. A determined orientation may then be used to drive different behaviors for the host and/or the accessory.



FIG. 4 illustrates that the accessory device 104 may be rotated such that the accessory device 104 is placed against the display device 110 of the computing device 102 to assume an orientation 400. In the orientation 400, the accessory device 104 may act as a cover such that the accessory device 104 can protect the display device 110 from harm. In implementations, the orientation 400 can correspond to a closed position of the computing device 102.



FIG. 5 illustrates that the accessory device 104 has rotated away from the computing device 102 such that the computing device assumes an orientation 500. The orientation 500 includes a gap 502 that is introduced between the computing device 102 and the accessory device 104. In implementations, the orientation 500 can be caused unintentionally by a user, such as by inadvertent contact with the computing device 102 and/or the accessory device 104 that causes the computing device 102 to sag slightly away from the accessory device 104 such that the gap 502 is introduced.



FIG. 6 illustrates an example orientation 600 of the computing device 102. In the orientation 600, the accessory device 104 is laid flat against a surface and the computing device 102 is disposed at an angle to permit viewing of the display device 110, e.g., such as through use of a kickstand 602 disposed on a rear surface of the computing device 102. The orientation 600 can correspond to a typing arrangement whereby input can be received via the accessory device 104, such as using keys of a keyboard, a track pad, and so forth.



FIG. 7 illustrates a further example orientation of the computing device 102, generally at 700. In the orientation 700, the computing device 102 is oriented such that the display device 110 faces away from the accessory device 104. In this example, the kickstand 602 can support the computing device 102, such as via contact with a back surface of the accessory device 104. Although not expressly illustrated here, a cover can be employed to cover and protect a front surface of the accessory device 104. In the depicted orientation, an angle 702 between the accessory device and the host is established. Various different angles corresponding to different positions/orientations may be established, as discussed above and below.



FIG. 8 illustrates an example orientation 800, in which the accessory device 104 may also be rotated so as to be disposed against a back of the computing device 102, e.g., against a rear housing of the computing device 102 that is disposed opposite the display device 110 on the computing device 102. In this example, through orientation of the connection portion 302 to the computing device 102, the flexible hinge 106 is caused to “wrap around” the connection portion 302 to position the accessory device 104 at the rear of the computing device 102.


This wrapping causes a portion of a rear of the computing device 102 to remain exposed. This may be leveraged for a variety of functionality, such as to permit a camera 802 positioned on the rear of the computing device 102 to be used even though a significant portion of the rear of the computing device 102 is covered by the accessory device 104 in the example orientation 800. Further to the example illustrated in FIG. 8, the display device 110 of the computing device 102 may be determined to be oriented at an angle 804 relative to the accessory device 104. In general, the angle 804 may change as the accessory device 104 is manipulated into different positions. For example, the angle 804 as shown in FIG. 8 can be determined to be approximately 360 degrees. Other orientations may correspond to other angles, and angle ranges may be established and associated with defined modes or states that may trigger different behaviors. Thus, behaviors may be controlled based on the particular mode/state that corresponds to the current angle between the host and accessory.



FIG. 9 illustrates a further example orientation of the computing device 102, generally at 900. In the orientation 900, the computing device 102 is rotated sideways, e.g., in a portrait orientation relative to a surface 902 on which the computing device 102 is disposed. The display device 110 is visible, with the accessory device 104 rotated away from the display device 110. In at least some implementations, a width of the accessory device 104 can be narrower than a width of the computing device 102. Additionally or alternatively, the width of the accessory device 104 can be tapered such that the edge closest to the hinge 106 is wider than the outermost edge. This can enable the face of the display device 110 to recline back in the orientation 900, to provide for a suitable viewing angle.



FIG. 10 illustrates that the computing device 102 may be rotated within a variety of different angle ranges with respect to the accessory device 104. As detailed herein, different angle ranges can be associated with different power states, different application states, and so on.


An angle range 1000 is illustrated, which corresponds to a closed position for the computing device 102. Thus, if the computing device 102 is positioned at an angle within the angle range 1000 relative to the accessory device 104, the computing device 102 can be determined to be in a closed position. A closed position can include an associated closed state where various functionalities/behaviors for the computing device 102 and accessory device 104 can be modified accordingly based on the closed state.


Further illustrated is an angle range 1002, which may correspond to a typing orientation for the computing device 102. Thus, if the computing device 102 is positioned at an angle within the angle range 1002 relative to the accessory device 104, the computing device 102 can be determined to be in a typing orientation. Within this orientation, the computing device 102 and/or the accessory device 104 can be placed in a typing power state where functionalities/behaviors for the computing device 102 and accessory device 104 can be customized accordingly based on the typing state.



FIG. 10 further illustrates an angle range 1004, which corresponds to a viewing position for the computing device 102. Thus, if the computing device 102 is positioned at an angle within the angle range 1004 relative to the accessory device 104, the computing device 102 can be determined to be in a viewing orientation. In this orientation, functionalities/behaviors for the computing device 102 and accessory device 104 can be controlled accordingly based on the viewing state.
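
By way of illustration only, the following Python sketch maps a computed host/accessory angle into one of these named states. The numeric thresholds are invented for illustration, since the discussion deliberately leaves the specific angle ranges open.

```python
def orientation_state(angle_degrees: float) -> str:
    """Map the host/accessory angle to a named interaction state.
    All thresholds below are hypothetical examples."""
    angle = angle_degrees % 360.0
    if angle < 15.0:
        return "closed"    # angle range 1000: accessory against display
    if 60.0 <= angle <= 150.0:
        return "typing"    # angle range 1002: typing orientation
    if 150.0 < angle <= 210.0:
        return "viewing"   # angle range 1004: viewing orientation
    return "transitional"  # no defined state for the current angle

print(orientation_state(110.0))  # -> "typing"
```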


The orientations, angle ranges, power states, and so forth discussed above are presented for purposes of illustration only. It is contemplated that a wide variety of different orientations, device states, and angle ranges may be implemented within the spirit and scope of the claimed embodiments.


Having discussed some example device orientations, consider now some example procedures in accordance with one or more embodiments.


Example Procedures


The following discussion describes sensor fusion algorithm techniques that may be implemented utilizing the previously described systems and devices. Aspects of each of the procedures may be implemented in hardware, firmware, software, or a combination thereof. The procedures are shown as a set of blocks that specify operations performed by one or more devices and are not necessarily limited to the orders shown for performing the operations by the respective blocks. In portions of the following discussion, reference may be made to the example operating environment 100 of FIG. 1, the example devices of FIGS. 2-3, and the example orientations shown in FIGS. 4-10.



FIG. 11 depicts an example procedure 1100 in which an orientation of an accessory relative to a host is computed. In at least some embodiments, the procedure may be performed by a suitably configured computing device, such as the example computing device 102 of FIG. 2 that includes or otherwise makes use of a sensor fusion module 208 and/or behavior module 210.


Raw spatial positions for a host computing device are calculated independently using at least two different types of sensors (block 1102). The raw spatial positions are processed to obtain a combined spatial position for the host computing device (block 1104).


For example, the sensor fusion module 208 may be configured to implement a designated sensor fusion algorithm. Generally, the sensor fusion algorithm is configured to aggregate information from an array of different kinds of host sensors 114 employed by a computing device 102. The aggregation of multiple different sensing techniques and types of sensors may provide improved resolution of positions and may smooth errors that may be introduced by individual techniques and sensors. In at least some embodiments, the sensor fusion algorithm is configured to calculate at least two independent computations of the raw spatial position of the computing device 102 using different respective sensors. Multiple independent computations of the raw position may then be used to produce a combined spatial position. Each of the independent computations may employ one or more of the various types of host sensors 114 described above and below. At least some of the sensors used for different independent computations are of different types. Thus, the sensor fusion algorithm obtains input from a variety of different host sensors 114 and combines this information to resolve the position of the computing device 102.


In one approach, the computing device 102 includes a gyroscope 214 that may be used to obtain one of the independent computations of the raw position. Generally, a gyroscope uses principles of angular momentum to calculate orientation and rotation. The gyroscope 214 can be used to recognize movement within three-dimensional space and may enable determination of position with respect to a reference object/point, such as the earth. Using input obtained from the gyroscope 214, the sensor fusion module 208 may operate to compute a raw spatial position for the computing device. The raw spatial position may be expressed as coordinates in a three dimensional coordinate system defined with x, y, and z axes relative to the reference object/point (e.g., the earth).


In particular, the angular velocity input obtained from the gyroscope can be processed to determine angular positioning of the computing device. Initially, the input from the gyroscope may be filtered to remove a low pass constant offset of the gyroscope. Such a low pass constant offset may be created if the gyroscope is stuck in a non-zero position and is removed to prevent inaccuracy in the computation. The algorithm may integrate over multiple axes of the gyroscope (e.g., x, y, and z axes) to obtain a transform that describes a raw spatial position for the computing device. This processing may involve integrating angular velocity input from the gyroscope through a Runge-Kutta integration algorithm (or other suitable algorithm) to obtain corresponding impulse data. The impulse data may be expressed as quaternions for the different axes, which when multiplied together produce a quaternion that describes a transformation between the computing device 102 and the earth (or other selected reference object/point) with respect to their respective axes/coordinate systems. This provides one independent version of the raw spatial position for the computing device 102.
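
A minimal Python sketch of this integration step follows. It uses a simple first-order axis-angle update rather than the Runge-Kutta scheme named above (the difference is in how the step is evaluated, not in the quaternion bookkeeping), and the (w, x, y, z) quaternion layout and bias-subtraction placement are assumed conventions.

```python
import math

def quat_mul(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def integrate_gyro(q, omega, dt, bias=(0.0, 0.0, 0.0)):
    """Advance the device-to-earth quaternion q by one gyroscope sample.

    omega is angular velocity (rad/s) about the device x, y, z axes, and
    bias is the low-pass constant offset removed per the filtering step
    described above."""
    wx, wy, wz = (w - b for w, b in zip(omega, bias))
    rate = math.sqrt(wx*wx + wy*wy + wz*wz)
    if rate < 1e-12:
        return q                      # no rotation this step
    half = rate * dt / 2.0
    s = math.sin(half) / rate
    dq = (math.cos(half), wx * s, wy * s, wz * s)
    return quat_mul(q, dq)

# Example: 90 deg/s about x for one second of 100 Hz samples -> ~90 degrees.
q = (1.0, 0.0, 0.0, 0.0)
for _ in range(100):
    q = integrate_gyro(q, (math.radians(90), 0.0, 0.0), 0.01)
print(q)  # approximately (cos 45deg, sin 45deg, 0, 0)
```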


Another independent computation of the raw spatial position may be obtained using an accelerometer 216 and a magnetometer 218 in combination. Here, the accelerometer 216 is configured as a three-axis accelerometer that may be employed to derive two of the degrees of freedom of the device (e.g., position with respect to the x-axis and y-axis). In the low pass, the vector of acceleration is approximately 1 g down pointing to the center of the earth. The components of acceleration measured via the accelerometer 216 may be obtained as distributed across each of the three axes. The components of acceleration can in turn be used to compute angles of the accelerometer/device axes with respect to the low pass vector that points to the center of the earth. This provides two of the three degrees of freedom with respect to tilt or orientation of the device. In particular, the accelerometer processing just described is used to resolve the tilt/orientation of the x-axis and y-axis of the computing device 102.
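
The tilt recovery can be sketched as below, assuming the accelerometer reports units of g and the common convention that pitch is rotation about the y-axis and roll about the x-axis; these axis conventions are assumptions, since they vary with sensor mounting.

```python
import math

def tilt_from_accel(ax, ay, az):
    """Recover pitch and roll (radians) from the low-pass acceleration
    vector, which points approximately 1 g toward the center of the
    earth when the device is at rest. Rotation about the gravity vector
    (the third degree of freedom) is unobservable from the accelerometer
    alone."""
    pitch = math.atan2(-ax, math.sqrt(ay*ay + az*az))
    roll = math.atan2(ay, az)
    return pitch, roll

# Device lying flat (z-axis up): both tilt angles are zero.
print(tilt_from_accel(0.0, 0.0, 1.0))  # (0.0, 0.0)
```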


Now, the magnetometer 218 may be employed to resolve the remaining degree of freedom with respect to tilt/orientation of the device. The magnetometer 218 may be initialized/configured to act like a compass. In this approach, the magnetometer 218 can be used to compute a vector that is parallel to the ground (e.g., the earth's surface). This vector points to magnetic north and can be used to determine rotation of the device with respect to the z-axis. Now, the tilt/orientation of the x-axis and y-axis from the accelerometer and the rotation of the device with respect to the z-axis from the magnetometer 218 may be used to construct another quaternion that describes a transformation between the computing device 102 and the earth (or other selected reference object/point) with respect to their respective axes/coordinate systems. This provides another independent way in which a raw spatial position for the computing device 102 may be obtained. Other examples using different sensors and combinations of sensors are contemplated. For example, a global positioning satellite (GPS) radio may be used to provide some positioning data that may be used alone or in combination with other kinds of sensor data to compute the position/orientation of the computing device 102.
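
Combining the two accelerometer angles with a tilt-compensated compass heading yields the second quaternion. The sketch below uses standard tilt-compensation and Euler-to-quaternion formulas; the specific axis and sign conventions are assumptions (they vary with sensor mounting), so this is illustrative rather than the described computation verbatim.

```python
import math

def yaw_from_mag(mx, my, mz, pitch, roll):
    """Tilt-compensated compass heading (rotation about the z-axis).
    The magnetic reading is rotated back into the horizontal plane using
    the accelerometer-derived pitch and roll, giving a vector parallel
    to the ground that points toward magnetic north."""
    xh = mx * math.cos(pitch) + mz * math.sin(pitch)
    yh = (mx * math.sin(roll) * math.sin(pitch)
          + my * math.cos(roll)
          - mz * math.sin(roll) * math.cos(pitch))
    return math.atan2(-yh, xh)

def quat_from_angles(yaw, pitch, roll):
    """Device-to-earth quaternion (w, x, y, z) from the three recovered
    angles, using intrinsic z-y-x rotation order."""
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    return (cr*cp*cy + sr*sp*sy,
            sr*cp*cy - cr*sp*sy,
            cr*sp*cy + sr*cp*sy,
            cr*cp*sy - sr*sp*cy)

# Flat device with the horizontal magnetic component along +x: zero heading.
pitch, roll = 0.0, 0.0
print(quat_from_angles(yaw_from_mag(0.4, 0.0, -0.5, pitch, roll), pitch, roll))
```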


Accordingly, at least two different results for the raw spatial position are computed using the foregoing example techniques or other suitable techniques. The sensor fusion algorithm may be further configured to combine multiple independent computations of raw spatial position in various ways. The combining generally involves interpolating between two or more raw spatial positions to reduce or eliminate inaccuracies and/or smooth the results. The interpolation produces a combined spatial position for the computing device that is based on two or more independently obtained raw spatial positions.


By way of example and not limitation, results obtained using a gyroscope may be more precise in the short term relative to other sensors and position determination techniques. However, small integration errors associated with the gyroscope computations may build up over time creating an increasingly larger offset that may result in inaccurate results in the long term. Thus, interpolating the gyroscope results with other independently obtained results can effectively adjust for expected integration errors in the gyroscope results. In one approach, a normalized linear interpolation is employed that may be biased towards the gyroscope results since these results are initially more precise and subject to less noise. Other independent results, such as the results from the accelerometer/magnetometer, may be included in the interpolation to keep the gyroscope results in check and slowly adjust the bias for the combined result away from the gyroscope results and towards the other results over time. This produces a mathematically smooth transformation as the combined result.
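
One reading of this interpolation is the normalized linear blend sketched below. The blend factor and its schedule are assumptions; the text specifies only a bias toward the gyroscope result that relaxes toward the other estimate over time.

```python
import math

def nlerp(q_gyro, q_accmag, alpha):
    """Normalized linear interpolation between the two independent
    orientation estimates. A small alpha biases the result toward the
    gyroscope, which is precise short-term; the accelerometer/
    magnetometer estimate then slowly corrects accumulated drift."""
    # Flip one estimate if needed so the blend takes the shorter arc.
    if sum(g * m for g, m in zip(q_gyro, q_accmag)) < 0.0:
        q_accmag = tuple(-c for c in q_accmag)
    blended = tuple((1.0 - alpha) * g + alpha * m
                    for g, m in zip(q_gyro, q_accmag))
    norm = math.sqrt(sum(c * c for c in blended))
    return tuple(c / norm for c in blended)

# Pulling a slightly drifted gyro estimate 2% toward the absolute estimate.
q_gyro_est = (0.9992, 0.0399, 0.0, 0.0)
q_accmag_est = (1.0, 0.0, 0.0, 0.0)
print(nlerp(q_gyro_est, q_accmag_est, alpha=0.02))
```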


A spatial position for an accessory device connected to the host computing device is ascertained using one or more sensors of the accessory device (block 1106). The spatial position for the accessory device 104 may be computed in any suitable way, including but not limited to the techniques described in relation to the computing device 102. Accessory sensors 116 for different accessories may include any of the various types of sensors described herein. Accordingly, different corresponding techniques may be used to ascertain spatial position of the accessory based on appropriate input from one or more accessory sensors 116. Different techniques may also be employed for different accessories based on the types of sensors that are included with the accessory. In general, the sensor fusion module 208 may be configured to obtain input from different sensors of the accessory over a suitable interface with the accessory and compute a corresponding spatial position based on the input.


In one particular example, the sensor fusion module 208 may compute a spatial position using an accelerometer 216 associated with the accessory device 104. In this approach, the accelerometer 216 may be employed to resolve the tilt/orientation with respect to the x-axis and y-axis of the accessory device 104. This may occur in a manner that is comparable to the computation of the same kind of information for the computing device 102 using an associated accelerometer as described above.


In some arrangements, the accessory device 104 may be configured to connect to the computing device 102 using a connection portion 302 that is connectable to an interface of the computing device via a known location. For instance, in the hinge example previously described, at least some information regarding the position of the accessory device may be established based upon the known location and nature of the connection to the host device. Thus, it may be sufficient to use the two degrees of freedom (e.g., x-axis and y-axis position/pitch and roll) for the accessory device 104 in such cases to resolve the position of the accessory relative to the host. It should be noted though that rotation with respect to the z-axis may also be computed for the accessory device 104 in some embodiments, using a magnetometer 218 as discussed previously or using other sensors and techniques. This may be employed in configurations in which an accessory may still be manipulated in three dimensions even when connected to a host device, such as by way of a ball and socket type connection.


An orientation of the accessory device relative to the host computing device is computed based on the combined spatial position for the host computing device and the ascertained spatial position for the accessory device (block 1108). The computed orientation may correspond to any of the different orientations discussed in relation to FIGS. 4-10 as well as other possible orientations. Here, a comparison can be made between the combined spatial position for the computing device 102 and the ascertained spatial position of the accessory device 104 to derive information regarding the orientation of the devices one to another. In particular, the combined spatial position indicates a transformation between how axes in a coordinate system for the computing device 102 are oriented relative to axes associated with a reference coordinate system for the earth or other reference. Similarly, the ascertained spatial position of the accessory device 104 indicates a transformation between how axes in a coordinate system for the accessory device are oriented relative to axes of the reference coordinate system. Accordingly, these two positions may be used to compute a transformation of the accessory device 104 relative to the computing device 102 that is independent of the reference coordinate system.


By way of example, in some cases, the orientation may be defined as an angle of the accessory device 104 with respect to the computing device 102 as represented in FIG. 10. As also discussed previously, different angles may be associated with different interaction states, such as the closed state, typing state, and viewing state examples given above. The orientation may alternatively be expressed in another suitable manner, such as using x, y, z coordinates.
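
The relative transform and the angle it encodes might be computed as in the sketch below, assuming both inputs are device-to-earth quaternions in (w, x, y, z) form. With the hinge constraining motion to a single axis, the rotation angle of the relative quaternion is the host/accessory angle; sign and wrap handling for a full 0-360 degree range is omitted here.

```python
import math

def quat_mul(a, b):
    """Hamilton product, as in the earlier integration sketch."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def quat_conj(q):
    """Conjugate, which inverts a unit quaternion."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def relative_orientation(q_host, q_accessory):
    """Accessory-relative-to-host transform: composing the inverse of one
    device-to-earth quaternion with the other cancels the shared earth
    reference frame."""
    return quat_mul(quat_conj(q_host), q_accessory)

def hinge_angle_degrees(q_rel):
    """Rotation angle encoded by the relative quaternion (0-180 here)."""
    w = max(-1.0, min(1.0, q_rel[0]))
    return math.degrees(2.0 * math.acos(abs(w)))

# Host tilted 110 degrees about the hinge axis, accessory lying flat.
half = math.radians(110.0) / 2.0
q_host = (math.cos(half), math.sin(half), 0.0, 0.0)
q_accessory = (1.0, 0.0, 0.0, 0.0)
print(hinge_angle_degrees(relative_orientation(q_host, q_accessory)))  # ~110
```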


Optionally, the computed orientation may be verified using a Hall Effect sensor 220 of the computing device 102. The Hall Effect sensor 220 may be configured to utilize magnetic force to detect proximity between the computing device 102 and the accessory device 104. For example, the Hall Effect sensor 220 may measure proximity based upon one or more magnets that are included with the computing device 102 and/or the accessory device 104. When the computing device 102 is rotated to a closed position, the Hall Effect sensor 220 may be configured to align with and detect a magnet of the accessory device 104. When the computing device 102 is positioned away from the accessory device 104 in an open position, the Hall Effect sensor 220 may be unable to detect the magnet or the detected magnetic force may change as the computing device 102 is rotated at different angles relative to the accessory device 104. The Hall Effect sensor 220 provides another way in which the orientation may be determined. Thus, the Hall Effect sensor 220 may be used as an additional check on whether the orientation computed using other sensors is accurate. This additional check may be made before causing and/or controlling some kinds of behaviors, such as powering down the devices or switching off different components based on orientation.
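
As a sketch of this additional check, the gate below only trusts a closed determination when the Hall Effect sensor agrees; the policy and the state name are assumptions for illustration.

```python
def verified_closed(computed_state: str, hall_detects_magnet: bool) -> bool:
    """Gate disruptive behaviors (e.g., powering down) on agreement
    between the fused orientation and the Hall Effect sensor reading."""
    if computed_state == "closed":
        # Closed per sensor fusion but no magnet detected: distrust it.
        return hall_detects_magnet
    return False  # only the closed state triggers the power-down path here

if verified_closed("closed", hall_detects_magnet=True):
    print("orientation confirmed: safe to enter the closed power state")
```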


One or more behaviors of the host computing device and accessory device are controlled based on the orientation that is computed (block 1110). Various behaviors and responsive actions may be driven based on a computed orientation of an accessory with respect to the host. The behavior module 210 may be configured to obtain orientation results from the sensor fusion module 208 and control various behaviors accordingly.


Controlling the behaviors may include at least power management operations for the computing device 102 and/or accessory device 104. Generally, power management operations are configured to control power consumption and prolong battery life. For example, the behavior module 210 may cause changes in power modes/states to occur based on particular orientations. This may include toggling the devices and/or selected components on/off according to a determined orientation. For example, in a closed state both the host and accessory may be powered down or placed into a sleep mode. In another example, the accessory may be powered down when the orientation corresponds to a viewing state. The accessory device 104 may also automatically wake up in particular orientations, such as when a typing state is detected. A variety of other power management examples are also contemplated that may occur in response to a computed orientation.
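
A hypothetical policy table for these power behaviors might look like the following; the states, modes, and the set_power callback are illustrative stand-ins for platform facilities the description does not name.

```python
# Illustrative orientation-state to power-mode policy (all values assumed).
POWER_POLICY = {
    "closed": {"host": "sleep", "accessory": "off"},
    "typing": {"host": "on", "accessory": "on"},    # accessory wakes up
    "viewing": {"host": "on", "accessory": "off"},  # accessory powered down
}

def apply_power_policy(state, set_power):
    """Drive host and accessory power modes from the computed state."""
    for device, mode in POWER_POLICY.get(state, {}).items():
        set_power(device, mode)

# Stub callback that real power-management plumbing would replace.
apply_power_policy("viewing", lambda device, mode: print(device, "->", mode))
```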


In another example, controlling the behaviors may include selectively adjusting and/or enabling/disabling different sensors for the device according to the orientation. By way of example, rotation of the accessory fully around to cover the backside of the host may be indicative of a game play state. In this arrangement, the accelerometer 216 is likely to be used for gameplay, whereas touch functionality for keyboard/typing input from the accessory is unlikely to be used. Accordingly, in this arrangement sensitivity of an accelerometer 216 may be increased/turned-on and touch sensitivity may be decreased or disabled. In a typing state, the opposite may be true: the accelerometer 216 may be disabled or adjusted to a lower sensitivity and the touch sensitivity may be increased or re-enabled. Thus, sensitivity of sensors may be adjusted and particular sensors may be turned on/off based on orientation. It should be noted that sensors that are controlled may include sensors involved in computation of the orientation as well as other sensors of the host or accessory.


In yet another example, functionality that is activated for the accessory and/or host may be modified based on the orientation. For example, an accessory may be configured to act as a game controller when wrapped around to the backside and transform to provide keyboard-type inputs when in a typing orientation. In a further example, reading gestures to scroll or turn pages via the accessory may be enabled by input across the accessory device in a viewing orientation and may be disabled for other states/orientations. These kinds of changes in the functionality provided by an accessory may occur by selectively exposing, enabling, configuring or otherwise activating different controls, functions, and gestures according to different orientations.


Comparable changes to activate gestures, touch keys, and other functionality of the host computing device based on the orientation may also occur. For example, gestures for manipulation of media content on the display device 110 may be active in some orientations (e.g., viewing state or gaming state) and deactivated in other scenarios. Some additional examples of modifications that may be made to functionality that is activated/available for the computing device based on orientation include selectively enabling/disabling network connections and/or controlling interactions of the host with accessory devices and/or peripheral devices (e.g., printers, streaming media devices, storage devices) based upon the computed orientation.


Additionally, behaviors of applications 112 may also be controlled based on a computed orientation. For example, the behavior module 210 may be configured to selectively activate or deactivate different applications 112 based on the orientation. This may include toggling between applications operating in foreground and background processes, launching and closing particular applications, minimizing/maximizing, and so forth. Applications 112 may also retrieve and/or subscribe to receive updates of computed orientation that the applications may make use of in various ways, some details of which are provided in relation to the following figure. Accordingly, a wide variety of behaviors may be controlled based on a computed orientation, of which the particular behaviors enumerated above are but a few illustrative examples.



FIG. 12 depicts an example procedure 1200 in which a computed orientation is exposed for use by applications. In at least some embodiments, the procedure may be performed by a suitably configured computing device, such as the example computing device 102 of FIG. 2 that includes or otherwise makes use of a sensor fusion application programming interface (API) 212.


An orientation of an accessory device relative to a host computing device is computed based on a combined spatial position for the host computing device and an ascertained spatial position for the accessory device (block 1202). This may occur in accordance with a designated sensor fusion algorithm as discussed in relation to the example procedure 1100 of FIG. 11 above.


An interface is exposed that is operable by one or more applications to obtain the computed orientation (block 1204). The computed orientation is supplied to an application in response to receiving a request from the application via the interface (block 1206). In particular, a computing device 102 may include a sensor fusion application programming interface (API) 212 that is operable to supply computed orientation information to applications 112. In one approach, the sensor fusion API may provide orientation information on demand responsive to individual requests. In addition or alternatively, the sensor fusion API may be configured to facilitate registration of applications 112 to subscribe to receive orientation updates. In response to a request to subscribe, the API may register an application with the sensor fusion module 208 and/or an associated notification system configured to supply notification messages to registered applications when orientation changes occur. The applications 112 may then receive notification messages sent via the notification system that describe updates to the orientation.
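
The on-demand and subscription paths might be shaped as in the sketch below. The class and method names are invented for illustration, as the description specifies the capabilities rather than a concrete interface.

```python
class SensorFusionAPI:
    """Illustrative surface for on-demand reads and update subscriptions."""

    def __init__(self, fusion_module):
        self._fusion = fusion_module   # supplies computed orientations
        self._subscribers = []

    def get_orientation(self):
        """On-demand request for the current computed orientation."""
        return self._fusion.current_orientation()

    def subscribe(self, callback):
        """Register an application callback for orientation updates."""
        self._subscribers.append(callback)

    def notify(self, orientation):
        """Invoked by the fusion module/notification system on changes."""
        for callback in self._subscribers:
            callback(orientation)

class FakeFusionModule:
    """Stand-in for the sensor fusion module 208."""
    def current_orientation(self):
        return {"angle": 112.0, "state": "typing"}

api = SensorFusionAPI(FakeFusionModule())
api.subscribe(lambda update: print("orientation update:", update))
api.notify(api.get_orientation())
```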


The sensor fusion API may supply the orientation and/or related information to applications in various formats. For example, the orientation may be in the form of a transform of the accessory device 104 relative to the computing device 102 as computed in the manner described above. In this case, an application may process the supplied orientation information to obtain information in an appropriate format for the application, such as an orientation angle or a defined orientation state corresponding to the computed orientation. In addition or alternatively, the sensor fusion module 208 may operate to compute an orientation state on behalf of applications. Thus, information supplied via the sensor fusion API may include a state name or identifier that may be directly usable by the applications.


Applications 112 may make use of orientation information supplied through the API in various ways. For instance, an application 112 may selectively modify a user interface and/or functionality of the user interface for the application based on the orientation. This may include activating different controls, menus, gestures, and/or input modes for different respective orientations. For example, a navigation menu that appears in one orientation (typing/keyboard input orientation) may disappear in a viewing orientation. Further, an application 112 may be configured to include various modes and switch between the modes based on orientation. For example, a messaging application may switch from a text input mode to a video mode in accordance with the computed orientation. In another example, the application may modify the manner in which particular inputs are interpreted in different orientations. For instance, a button press in a typing orientation may be used for alphanumeric entry whereas the same button may be used for content control functions in a viewing orientation. Other buttons, keys, and other controls may also be selectively enabled or disabled as the orientation changes. A variety of other examples are also contemplated.
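
An application-side handler consuming such updates might switch input modes as sketched below; the state names, modes, and user-interface flags are assumptions drawn from the examples above.

```python
def on_orientation_update(state, ui):
    """Remap the same physical controls as the orientation changes."""
    if state == "typing":
        ui["input_mode"] = "alphanumeric"     # button presses enter text
        ui["navigation_menu"] = True          # menu shown while typing
    elif state == "viewing":
        ui["input_mode"] = "content_control"  # same buttons drive playback
        ui["navigation_menu"] = False         # menu hidden for viewing

ui = {"input_mode": "alphanumeric", "navigation_menu": True}
on_orientation_update("viewing", ui)
print(ui)  # {'input_mode': 'content_control', 'navigation_menu': False}
```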


Having considered the foregoing example procedures, consider now a discussion of example systems and devices that may be employed to implement aspects of techniques in one or more embodiments.


Example System and Device



FIG. 13 illustrates an example system generally at 1300 that includes an example computing device 1302 that is representative of one or more computing systems and/or devices that may implement the various techniques described herein. The computing device 1302 may, for example, be configured to assume a mobile configuration through use of a housing formed and sized to be grasped and carried by one or more hands of a user, illustrated examples of which include a mobile phone, mobile game and music device, and tablet computer, although other examples are also contemplated.


The example computing device 1302 as illustrated includes a processing system 1304, one or more computer-readable media 1306, and one or more I/O interface 1308 that are communicatively coupled, one to another. Although not shown, the computing device 1302 may further include a system bus or other data and command transfer system that couples the various components, one to another. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures. A variety of other examples are also contemplated, such as control and data lines.


The processing system 1304 is representative of functionality to perform one or more operations using hardware. Accordingly, the processing system 1304 is illustrated as including hardware elements 1310 that may be configured as processors, functional blocks, and so forth. This may include implementation in hardware as an application specific integrated circuit or other logic device formed using one or more semiconductors. The hardware elements 1310 are not limited by the materials from which they are formed or the processing mechanisms employed therein. For example, processors may be comprised of semiconductor(s) and/or transistors (e.g., electronic integrated circuits (ICs)). In such a context, processor-executable instructions may be electronically-executable instructions.


The computer-readable storage media 1306 is illustrated as including memory/storage 1312. The memory/storage 1312 represents memory/storage capacity associated with one or more computer-readable media. The memory/storage component 1312 may include volatile media (such as random access memory (RAM)) and/or nonvolatile media (such as read only memory (ROM), Flash memory, optical disks, magnetic disks, and so forth). The memory/storage component 1312 may include fixed media (e.g., RAM, ROM, a fixed hard drive, and so on) as well as removable media (e.g., Flash memory, a removable hard drive, an optical disc, and so forth). The computer-readable media 1306 may be configured in a variety of other ways as further described below.


Input/output interface(s) 1308 are representative of functionality to allow a user to enter commands and information to computing device 1302, and also allow information to be presented to the user and/or other components or devices using various input/output devices. Examples of input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone, a scanner, touch functionality (e.g., capacitive or other sensors that are configured to detect physical touch), a camera (e.g., which may employ visible or non-visible wavelengths such as infrared frequencies to recognize movement as gestures that do not involve touch), and so forth. Examples of output devices include a display device (e.g., a monitor or projector), speakers, a printer, a network card, tactile-response device, and so forth. Thus, the computing device 1302 may be configured in a variety of ways to support user interaction.


The computing device 1302 is further illustrated as being communicatively and physically coupled to an accessory device 1314 that is physically and communicatively removable from the computing device 1302. In this way, a variety of different accessory devices may be coupled to the computing device 1302 having a wide variety of configurations to support a wide variety of functionality. In this example, the accessory device 1314 includes one or more controls 1316, which may be configured as press-sensitive keys, mechanically switched keys, buttons, and so forth.


The accessory device 1314 is further illustrated as including one or more modules 1318 that may be configured to support a variety of functionality. The one or more modules 1318, for instance, may be configured to process analog and/or digital signals received from the controls 1316 to determine whether an input was intended, determine whether an input is indicative of resting pressure, support authentication of the accessory device 1314 for operation with the computing device 1302, and so on.
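

By way of illustration only, and not as a description of any particular claimed implementation, the following minimal sketch shows one way a module such as the one or more modules 1318 might separate an intended key press from mere resting pressure on a press-sensitive control. The threshold values, the normalized-pressure representation, and the function name are assumptions made for the sketch.

# Illustrative sketch only: classify a window of normalized pressure
# samples from a press-sensitive control as an intended press, resting
# pressure, or noise. The thresholds below are assumed values, not
# values taken from the described devices.

REST_THRESHOLD = 0.15   # assumed level typical of a finger resting on a key
PRESS_THRESHOLD = 0.45  # assumed level indicating a deliberate press

def classify_control_input(pressure_samples):
    """Return 'press', 'rest', or 'none' for a window of samples in [0, 1]."""
    if not pressure_samples:
        return "none"
    peak = max(pressure_samples)
    if peak >= PRESS_THRESHOLD:
        return "press"  # report as input to the computing device 1302
    if peak >= REST_THRESHOLD:
        return "rest"   # suppress: likely resting pressure, not intent
    return "none"       # below the noise floor; ignore

# A brief spike above the press threshold is classified as a press.
print(classify_control_input([0.05, 0.20, 0.50, 0.30]))  # -> press

In a scheme along these lines, only samples classified as a press would be forwarded as input, while resting pressure could be filtered out by the accessory device 1314 itself.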


Various techniques may be described herein in the general context of software, hardware elements, or program modules. Generally, such modules include routines, programs, objects, elements, components, data structures, and so forth that perform particular tasks or implement particular abstract data types. The terms “module,” “functionality,” and “component” as used herein generally represent software, firmware, hardware, or a combination thereof. The features of the techniques described herein are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.


An implementation of the described modules and techniques may be stored on or transmitted across some form of computer-readable media. The computer-readable media may include a variety of media that may be accessed by the computing device 1302. By way of example, and not limitation, computer-readable media may include “computer-readable storage media” and “computer-readable signal media.”


“Computer-readable storage media” may refer to media and/or devices that enable persistent and/or non-transitory storage of information in contrast to mere signal transmission, carrier waves, or signals per se. Thus, computer-readable storage media refers to non-signal bearing media. The computer-readable storage media includes hardware such as volatile and non-volatile, removable and non-removable media and/or storage devices implemented in a method or technology suitable for storage of information such as computer readable instructions, data structures, program modules, logic elements/circuits, or other data. Examples of computer-readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, hard disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other storage devices, tangible media, or articles of manufacture suitable to store the desired information and which may be accessed by a computer.


“Computer-readable signal media” may refer to a signal-bearing medium that is configured to transmit instructions to the hardware of the computing device 1302, such as via a network. Signal media typically embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as carrier waves, data signals, or other transport mechanism. Signal media also include any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media.


As previously described, hardware elements 1310 and computer-readable media 1306 are representative of modules, programmable device logic and/or fixed device logic implemented in a hardware form that may be employed in some embodiments to implement at least some aspects of the techniques described herein, such as to perform one or more instructions. Hardware may include components of an integrated circuit or on-chip system, microcontroller devices, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a complex programmable logic device (CPLD), and other implementations in silicon or other hardware. In this context, hardware may operate as a processing device that performs program tasks defined by instructions and/or logic embodied by the hardware as well as hardware utilized to store instructions for execution, e.g., the computer-readable storage media described previously.


Combinations of the foregoing may also be employed to implement various techniques described herein. Accordingly, software, hardware, or executable modules may be implemented as one or more instructions and/or logic embodied on some form of computer-readable storage media and/or by one or more hardware elements 1310. The computing device 1302 may be configured to implement particular instructions and/or functions corresponding to the software and/or hardware modules. Accordingly, implementation of a module that is executable by the computing device 1302 as software may be achieved at least partially in hardware, e.g., through use of computer-readable storage media and/or hardware elements 1310 of the processing system 1304. The instructions and/or functions may be executable/operable by one or more articles of manufacture (for example, one or more computing devices 1302 and/or processing systems 1304) to implement techniques, modules, and examples described herein.
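

As a purely illustrative software realization of such a module, the sketch below fuses gyroscope and accelerometer measurements into a tilt estimate for a host device, does the same for an accessory, and derives the accessory's orientation relative to the host. The single-axis complementary filter, the filter weight, and all identifiers are assumptions made for the sketch, not the algorithm of any particular embodiment or claim.

import math

# Illustrative sketch only: combine raw measurements from two different
# sensor types (gyroscope and accelerometer) into a fused tilt angle,
# then compute a relative orientation between two devices.

ALPHA = 0.98  # assumed filter weight: trust the gyro short-term, the accelerometer long-term

def accel_tilt(ax, az):
    """Tilt angle (radians) implied by gravity in the accelerometer frame."""
    return math.atan2(ax, az)

def fuse(prev_angle, gyro_rate, ax, az, dt):
    """One complementary-filter step combining the two sensor types."""
    gyro_angle = prev_angle + gyro_rate * dt  # integrate the angular rate
    return ALPHA * gyro_angle + (1 - ALPHA) * accel_tilt(ax, az)

def relative_orientation(host_angle, accessory_angle):
    """Orientation of the accessory relative to the host (e.g., a hinge angle)."""
    return accessory_angle - host_angle

# Example: host held flat, accessory tilted roughly thirty degrees.
host = fuse(0.0, 0.0, 0.0, 1.0, 0.01)
accessory = fuse(0.5, 0.1, 0.5, 0.87, 0.01)
print(math.degrees(relative_orientation(host, accessory)))

A relative orientation computed along these lines could then be exposed to applications or used to adjust device behaviors, with the fusion step itself implemented in software, in fixed or programmable logic, or in some combination of the two, consistent with the discussion above.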


CONCLUSION

Although the example implementations have been described in language specific to structural features and/or methodological acts, it is to be understood that the implementations defined in the appended claims are not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claimed features.

Claims
  • 1. A computing device comprising: one or more sensors configured to measure first spatial information indicative of a position of the computing device; an interface configured to receive second spatial information indicative of a position of an accessory device in communication with the computing device; and a controller in communication with the sensor and the interface, the controller configured to cause modification of behaviors of the computing device based on the first spatial information and the second spatial information.
  • 2. A computing device as recited in claim 1, wherein the one or more sensors comprise at least one of a gyroscope, an accelerometer, a magnetometer, or a Hall effect sensor.
  • 3. A computing device as recited in claim 1, wherein the one or more sensors comprise an array of multiple sensors used to provide multiple raw measurements that are combined to compute the first spatial information indicative of the position of the computing device.
  • 4. A computing device as recited in claim 1, further comprising a sensor fusion module operable to compute a relative orientation from the first spatial information and the second spatial information that describes a spatial relationship between the computing device and the accessory device.
  • 5. A computing device as recited in claim 4, wherein the sensor fusion module is implemented via the controller.
  • 6. A computing device as recited in claim 4, further comprising a sensor fusion application programming interface (API) operable to expose information regarding the relative orientation for use by applications of the computing device.
  • 7. A computing device as recited in claim 1, further comprising a behavior module operable under the direction of the controller to perform operations to control one or more behaviors of the computing device based at least in part upon an orientation of the computing device relative to the accessory device.
  • 8. A computing device as recited in claim 1, wherein the interface is configured to provide a communicative and physical connection between the accessory device and the computing device.
  • 9. An apparatus comprising: an interface configured to receive first spatial information indicative of a position of a first device and second spatial information indicative of a position of a second device; and a controller configured to control behaviors of the first device based on the first spatial information and the second spatial information.
  • 10. An apparatus as recited in claim 9, wherein the interface is configured to obtain the first spatial information and the second spatial information from multiple sensors associated with the first device and the second device.
  • 11. An apparatus as recited in claim 9, wherein the interface is configured as a flexible hinge that provides a communicative and physical connection between the first device and the second device.
  • 12. An apparatus as recited in claim 9, wherein the controller is further configured to control behaviors of the second device based on the first spatial information and the second spatial information.
  • 13. An apparatus as recited in claim 9, wherein control of the behaviors by the controller includes one or more of: managing power states; selecting operational modes or device states; adjusting sensitivity of one or more sensors associated with the first device and the second device; controlling interaction between the first device and the second device; modifying device functionality; enabling and disabling network connections; activating and deactivating applications; or setting application states.
  • 14. An apparatus as recited in claim 9, wherein the controller is further configured to compute a relative orientation of the first device relative to the second device based upon the first spatial information and the second spatial information.
  • 15. An apparatus comprising: an interface configured to receive first spatial information indicative of a position of a first device and second spatial information indicative of a position of a second device; and a controller configured to utilize the first spatial information and the second spatial information to: compute a relative orientation of the first device relative to the second device; and modify behaviors of the first device based on the relative orientation.
  • 16. An apparatus as recited in claim 15, further comprising an application programming interface (API) operable to expose information regarding the relative orientation that is computed for use by applications associated with the first device and the second device.
  • 17. An apparatus as recited in claim 15, wherein: the first spatial information and the second spatial information include multiple measurements of position obtained from sensors associated with the first device and the second device; and computation of the relative orientation by the controller comprises applying a sensor fusion algorithm to derive the relative orientation by combining the multiple measurements of position.
  • 18. An apparatus as recited in claim 15, wherein modification of the behaviors by the controller comprises at least managing power states of the first device based on the relative orientation.
  • 19. An apparatus as recited in claim 15, wherein modification of the behaviors by the controller comprises at least controlling interaction between the first device and the second device based on the relative orientation.
  • 20. An apparatus as recited in claim 15, wherein modification of the behaviors by the controller includes at least one of: enabling or disabling network connections based on the relative orientation; or activating or deactivating applications based on the relative orientation.
RELATED APPLICATIONS

This application is a continuation of and claims priority under 35 U.S.C. §120 to U.S. patent application Ser. No. 13/651,272, filed Oct. 12, 2012, titled “Sensor Fusion Algorithm,” which is a continuation of and claims priority under 35 U.S.C. §120 to U.S. patent application Ser. No. 13/471,202, filed May 14, 2012, titled “Sensor Fusion Algorithm,” which claims priority under 35 U.S.C. §119(e) to the following U.S. Provisional Patent Applications, the entire disclosures of each of these applications being incorporated by reference in their entirety: U.S. Provisional Patent Application No. 61/606,321, filed Mar. 2, 2012, and titled “Screen Edge;” U.S. Provisional Patent Application No. 61/606,301, filed Mar. 2, 2012, and titled “Input Device Functionality;” U.S. Provisional Patent Application No. 61/606,313, filed Mar. 2, 2012, and titled “Functional Hinge;” U.S. Provisional Patent Application No. 61/606,333, filed Mar. 2, 2012, and titled “Usage and Authentication;” U.S. Provisional Patent Application No. 61/613,745, filed Mar. 21, 2012, and titled “Usage and Authentication;” U.S. Provisional Patent Application No. 61/606,336, filed Mar. 2, 2012, and titled “Kickstand and Camera;” and U.S. Provisional Patent Application No. 61/607,451, filed Mar. 6, 2012, and titled “Spanaway Provisional.”

US Referenced Citations (735)
Number Name Date Kind
578325 Fleming Mar 1897 A
3879586 DuRocher et al. Apr 1975 A
4046975 Seeger, Jr. Sep 1977 A
4065649 Carter et al. Dec 1977 A
4086451 Boulanger Apr 1978 A
4243861 Strandwitz Jan 1981 A
4302648 Sado et al. Nov 1981 A
4317013 Larson Feb 1982 A
4365130 Christensen Dec 1982 A
4492829 Rodrique Jan 1985 A
4527021 Morikawa et al. Jul 1985 A
4559426 Van Zeeland et al. Dec 1985 A
4576436 Daniel Mar 1986 A
4577822 Wilkerson Mar 1986 A
4588187 Dell May 1986 A
4607147 Ono et al. Aug 1986 A
4615579 Whitehead Oct 1986 A
4651133 Ganesan et al. Mar 1987 A
4735394 Facco Apr 1988 A
4735495 Henkes Apr 1988 A
5008497 Asher Apr 1991 A
5128829 Loew Jul 1992 A
5220521 Kikinis Jun 1993 A
5283559 Kalendra et al. Feb 1994 A
5319455 Hoarty et al. Jun 1994 A
5331443 Stanisci Jul 1994 A
5339382 Whitehead Aug 1994 A
5363075 Fanucchi Nov 1994 A
5375076 Goodrich et al. Dec 1994 A
5406415 Kelly Apr 1995 A
5480118 Cross Jan 1996 A
5546271 Gut et al. Aug 1996 A
5548477 Kumar et al. Aug 1996 A
5558577 Kato Sep 1996 A
5618232 Martin Apr 1997 A
5621494 Kazumi et al. Apr 1997 A
5666112 Crowley et al. Sep 1997 A
5681220 Bertram et al. Oct 1997 A
5737183 Kobayashi et al. Apr 1998 A
5745376 Barker et al. Apr 1998 A
5748114 Koehn May 1998 A
5781406 Hunte Jul 1998 A
5806955 Parkyn, Jr. et al. Sep 1998 A
5807175 Davis et al. Sep 1998 A
5808713 Broer et al. Sep 1998 A
5818361 Acevedo Oct 1998 A
5828770 Leis et al. Oct 1998 A
5838403 Jannson et al. Nov 1998 A
5842027 Oprescu et al. Nov 1998 A
5861990 Tedesco Jan 1999 A
5874697 Selker et al. Feb 1999 A
5905485 Podoloff May 1999 A
5924555 Sadamori et al. Jul 1999 A
5926170 Oba Jul 1999 A
5929946 Sharp et al. Jul 1999 A
5971635 Wise Oct 1999 A
5999147 Teitel Dec 1999 A
6002389 Kasser Dec 1999 A
6005209 Burleson et al. Dec 1999 A
6012714 Worley et al. Jan 2000 A
6040823 Seffernick et al. Mar 2000 A
6042075 Burch, Jr. Mar 2000 A
6044717 Biegelsen et al. Apr 2000 A
6046857 Morishima Apr 2000 A
6061644 Leis May 2000 A
6072551 Jannson et al. Jun 2000 A
6108200 Fullerton Aug 2000 A
6112797 Colson et al. Sep 2000 A
6124906 Kawada et al. Sep 2000 A
6128007 Seybold Oct 2000 A
6129444 Tognoni Oct 2000 A
6178443 Lin Jan 2001 B1
6232934 Heacock et al. May 2001 B1
6234820 Perino et al. May 2001 B1
6254105 Rinde et al. Jul 2001 B1
6256447 Laine Jul 2001 B1
6279060 Luke et al. Aug 2001 B1
6300986 Travis Oct 2001 B1
6329617 Burgess Dec 2001 B1
6344791 Armstrong Feb 2002 B1
6353503 Spitzer et al. Mar 2002 B1
6366440 Kung Apr 2002 B1
6380497 Hashimoto et al. Apr 2002 B1
6437682 Vance Aug 2002 B1
6506983 Babb et al. Jan 2003 B1
6511378 Bhatt et al. Jan 2003 B1
6529179 Hashimoto et al. Mar 2003 B1
6532147 Christ, Jr. Mar 2003 B1
6543949 Ritchey et al. Apr 2003 B1
6565439 Shinohara et al. May 2003 B2
6597347 Yasutake Jul 2003 B1
6600121 Olodort et al. Jul 2003 B1
6603408 Gaba Aug 2003 B1
6608664 Hasegawa Aug 2003 B1
6617536 Kawaguchi Sep 2003 B2
6648485 Colgan et al. Nov 2003 B1
6651943 Cho et al. Nov 2003 B2
6685369 Lien Feb 2004 B2
6695273 Iguchi Feb 2004 B2
6704864 Philyaw Mar 2004 B1
6721019 Kono et al. Apr 2004 B2
6725318 Sherman et al. Apr 2004 B1
6774888 Genduso Aug 2004 B1
6776546 Kraus et al. Aug 2004 B2
6780019 Ghosh et al. Aug 2004 B1
6781819 Yang et al. Aug 2004 B2
6784869 Clark et al. Aug 2004 B1
6795146 Dozov et al. Sep 2004 B2
6813143 Makela Nov 2004 B2
6819316 Schulz et al. Nov 2004 B2
6847488 Travis Jan 2005 B2
6856506 Doherty et al. Feb 2005 B2
6861961 Sandbach et al. Mar 2005 B2
6870671 Travis Mar 2005 B2
6895164 Saccomanno May 2005 B2
6898315 Guha May 2005 B2
6909354 Baker et al. Jun 2005 B2
6914197 Doherty et al. Jul 2005 B2
6950950 Sawyers et al. Sep 2005 B2
6970957 Oshins et al. Nov 2005 B1
6976799 Kim et al. Dec 2005 B2
6980177 Struyk Dec 2005 B2
7006080 Gettemy Feb 2006 B2
7007238 Glaser Feb 2006 B2
7025908 Hayashi et al. Apr 2006 B1
7051149 Wang et al. May 2006 B2
7068496 Wong et al. Jun 2006 B2
7083295 Hanna Aug 2006 B1
7091436 Serban Aug 2006 B2
7095404 Vincent et al. Aug 2006 B2
7099149 Krieger et al. Aug 2006 B2
7101048 Travis Sep 2006 B2
7104679 Shin et al. Sep 2006 B2
7106222 Ward et al. Sep 2006 B2
7116309 Kimura et al. Oct 2006 B1
7123292 Seeger et al. Oct 2006 B1
7129979 Lee Oct 2006 B1
7136282 Rebeske Nov 2006 B1
7152985 Benitez et al. Dec 2006 B2
7153017 Yamashita et al. Dec 2006 B2
D535292 Shi et al. Jan 2007 S
7194662 Do et al. Mar 2007 B2
7199931 Boettiger et al. Apr 2007 B2
7202837 Ihara Apr 2007 B2
7213991 Chapman et al. May 2007 B2
7224830 Nefian et al. May 2007 B2
7260221 Atsmon Aug 2007 B1
7260823 Schlack et al. Aug 2007 B2
7277087 Hill et al. Oct 2007 B2
7301759 Hsiung Nov 2007 B2
7370342 Ismail et al. May 2008 B2
7374312 Feng et al. May 2008 B2
7375885 Ijzerman et al. May 2008 B2
7400377 Evans et al. Jul 2008 B2
7431489 Yeo et al. Oct 2008 B2
7447934 Dasari et al. Nov 2008 B2
7457108 Ghosh Nov 2008 B2
7469386 Bear et al. Dec 2008 B2
7486165 Ligtenberg et al. Feb 2009 B2
7499037 Lube Mar 2009 B2
7499216 Niv et al. Mar 2009 B2
7502803 Culter et al. Mar 2009 B2
7503684 Ueno et al. Mar 2009 B2
7515143 Keam et al. Apr 2009 B2
7528374 Smitt et al. May 2009 B2
7542052 Solomon et al. Jun 2009 B2
7558594 Wilson Jul 2009 B2
7559834 York Jul 2009 B1
7561131 Ijzerman et al. Jul 2009 B2
7572045 Hoelen et al. Aug 2009 B2
RE40891 Yasutake Sep 2009 E
7620244 Collier Nov 2009 B1
7622907 Vranish Nov 2009 B2
7631327 Dempski et al. Dec 2009 B2
7636921 Louie Dec 2009 B2
7639876 Clary et al. Dec 2009 B2
7643213 Boettiger et al. Jan 2010 B2
7656392 Bolender Feb 2010 B2
7675598 Hong Mar 2010 B2
7686694 Cole Mar 2010 B2
7728923 Kim et al. Jun 2010 B2
7729493 Krieger et al. Jun 2010 B2
7731147 Rha Jun 2010 B2
7733326 Adiseshan Jun 2010 B1
7773076 Pittel et al. Aug 2010 B2
7773121 Huntsberger et al. Aug 2010 B1
7774155 Sato et al. Aug 2010 B2
7777972 Chen et al. Aug 2010 B1
7782341 Kothandaraman Aug 2010 B2
7782342 Koh Aug 2010 B2
7813715 McKillop et al. Oct 2010 B2
7815358 Inditsky Oct 2010 B2
7822338 Wernersson Oct 2010 B2
7844985 Hendricks et al. Nov 2010 B2
7855716 McCreary et al. Dec 2010 B2
7865639 McCoy et al. Jan 2011 B2
7884807 Hovden et al. Feb 2011 B2
7893921 Sato Feb 2011 B2
D636397 Green Apr 2011 S
7918559 Tesar Apr 2011 B2
7927654 Hagood et al. Apr 2011 B2
7928964 Kolmykov-Zotov et al. Apr 2011 B2
7936501 Smith et al. May 2011 B2
7944520 Ichioka et al. May 2011 B2
7945717 Rivalsi May 2011 B2
7957082 Mi et al. Jun 2011 B2
7965268 Gass et al. Jun 2011 B2
7967462 Ogiro et al. Jun 2011 B2
7970246 Travis et al. Jun 2011 B2
7973771 Geaghan Jul 2011 B2
7976393 Haga et al. Jul 2011 B2
7978281 Vergith et al. Jul 2011 B2
8016255 Lin Sep 2011 B2
8018386 Qi et al. Sep 2011 B2
8018579 Krah Sep 2011 B1
8026904 Westerman Sep 2011 B2
8053688 Conzola et al. Nov 2011 B2
8065624 Morin et al. Nov 2011 B2
8069356 Rathi et al. Nov 2011 B2
RE42992 David Dec 2011 E
8077160 Land et al. Dec 2011 B2
8090885 Callaghan et al. Jan 2012 B2
8098233 Hotelling et al. Jan 2012 B2
8102362 Ricks et al. Jan 2012 B2
8115499 Osoinach et al. Feb 2012 B2
8115718 Chen et al. Feb 2012 B2
8117362 Rodriguez et al. Feb 2012 B2
8118274 McClure et al. Feb 2012 B2
8118681 Mattice et al. Feb 2012 B2
8120166 Koizumi et al. Feb 2012 B2
8130203 Westerman Mar 2012 B2
8154524 Wilson et al. Apr 2012 B2
8162282 Hu et al. Apr 2012 B2
D659139 Gengler May 2012 S
8169421 Wright et al. May 2012 B2
8179236 Weller et al. May 2012 B2
8184190 Dosluoglu May 2012 B2
8189973 Travis et al. May 2012 B2
8216074 Sakuma Jul 2012 B2
8229509 Paek et al. Jul 2012 B2
8229522 Kim et al. Jul 2012 B2
8231099 Chen Jul 2012 B2
8248791 Wang et al. Aug 2012 B2
8255708 Zhang Aug 2012 B1
8264310 Lauder et al. Sep 2012 B2
8267368 Torii et al. Sep 2012 B2
8274784 Franz et al. Sep 2012 B2
8279589 Kim Oct 2012 B2
8310508 Hekstra et al. Nov 2012 B2
8310768 Lin et al. Nov 2012 B2
8322290 Mignano Dec 2012 B1
8325416 Lesage et al. Dec 2012 B2
8354806 Travis et al. Jan 2013 B2
8362975 Uehara Jan 2013 B2
8387078 Memmott Feb 2013 B2
8416559 Agata et al. Apr 2013 B2
8498100 Whitt, III et al. Jul 2013 B1
8543227 Perek et al. Sep 2013 B1
8548608 Perek et al. Oct 2013 B2
8564944 Whitt, III et al. Oct 2013 B2
8570725 Whitt, III et al. Oct 2013 B2
8599542 Healey et al. Dec 2013 B1
8610015 Whitt et al. Dec 2013 B2
8614666 Whitman et al. Dec 2013 B2
8646999 Shaw et al. Feb 2014 B2
8699215 Whitt, III et al. Apr 2014 B2
8719603 Belesiu May 2014 B2
8724302 Whitt, III et al. May 2014 B2
8780541 Whitt et al. Jul 2014 B2
8791382 Whitt, III et al. Jul 2014 B2
20010023818 Masaru et al. Sep 2001 A1
20020000977 Vranish Jan 2002 A1
20020044216 Cha Apr 2002 A1
20020134828 Sandbach et al. Sep 2002 A1
20020135457 Sandbach et al. Sep 2002 A1
20020154099 Oh Oct 2002 A1
20020163510 Williams et al. Nov 2002 A1
20030011576 Sandbach et al. Jan 2003 A1
20030016282 Koizumi Jan 2003 A1
20030036365 Kuroda Feb 2003 A1
20030044215 Monney et al. Mar 2003 A1
20030051983 Lahr Mar 2003 A1
20030108720 Kashino Jun 2003 A1
20030163611 Nagao Aug 2003 A1
20030165017 Amitai Sep 2003 A1
20030197687 Shetter Oct 2003 A1
20030198008 Leapman et al. Oct 2003 A1
20030231243 Shibutani Dec 2003 A1
20040005184 Kim et al. Jan 2004 A1
20040056843 Lin et al. Mar 2004 A1
20040115994 Wulff et al. Jun 2004 A1
20040156168 LeVasseur et al. Aug 2004 A1
20040160734 Yim Aug 2004 A1
20040169641 Bean et al. Sep 2004 A1
20040212598 Kraus et al. Oct 2004 A1
20040212601 Cake et al. Oct 2004 A1
20040258924 Berger et al. Dec 2004 A1
20040268000 Barker et al. Dec 2004 A1
20050030728 Kawashima et al. Feb 2005 A1
20050052831 Chen Mar 2005 A1
20050055498 Beckert et al. Mar 2005 A1
20050057515 Bathiche Mar 2005 A1
20050059489 Kim Mar 2005 A1
20050062715 Tsuji et al. Mar 2005 A1
20050099400 Lee May 2005 A1
20050100690 Mayer et al. May 2005 A1
20050134717 Misawa Jun 2005 A1
20050146512 Hill et al. Jul 2005 A1
20050236848 Kim et al. Oct 2005 A1
20050264653 Starkweather et al. Dec 2005 A1
20050264988 Nicolosi Dec 2005 A1
20050285703 Wheeler et al. Dec 2005 A1
20060010400 Dehlin et al. Jan 2006 A1
20060028400 Lapstun et al. Feb 2006 A1
20060028838 Imade Feb 2006 A1
20060049993 Lin et al. Mar 2006 A1
20060083004 Cok Apr 2006 A1
20060085658 Allen et al. Apr 2006 A1
20060092139 Sharma May 2006 A1
20060096392 Inkster et al. May 2006 A1
20060102914 Smits et al. May 2006 A1
20060125799 Hillis et al. Jun 2006 A1
20060132423 Travis Jun 2006 A1
20060146573 Iwauchi et al. Jul 2006 A1
20060154725 Glaser et al. Jul 2006 A1
20060156415 Rubinstein et al. Jul 2006 A1
20060181514 Newman Aug 2006 A1
20060187216 Trent, Jr. et al. Aug 2006 A1
20060195522 Miyazaki Aug 2006 A1
20060215244 Yosha et al. Sep 2006 A1
20060227393 Herloski Oct 2006 A1
20060238550 Page Oct 2006 A1
20060265617 Priborsky Nov 2006 A1
20060272429 Ganapathi et al. Dec 2006 A1
20060279501 Lu et al. Dec 2006 A1
20070003267 Shibutani Jan 2007 A1
20070019181 Sinclair et al. Jan 2007 A1
20070046625 Yee Mar 2007 A1
20070047221 Park Mar 2007 A1
20070056385 Lorenz Mar 2007 A1
20070062089 Homer et al. Mar 2007 A1
20070069153 Pai-Paranjape et al. Mar 2007 A1
20070072474 Beasley et al. Mar 2007 A1
20070076434 Uehara et al. Apr 2007 A1
20070081091 Pan et al. Apr 2007 A1
20070117600 Robertson et al. May 2007 A1
20070145945 McGinley et al. Jun 2007 A1
20070176902 Newman et al. Aug 2007 A1
20070182663 Biech Aug 2007 A1
20070182722 Hotelling et al. Aug 2007 A1
20070185590 Reindel et al. Aug 2007 A1
20070188478 Silverstein et al. Aug 2007 A1
20070200830 Yamamoto Aug 2007 A1
20070201246 Yeo et al. Aug 2007 A1
20070201859 Sarrat Aug 2007 A1
20070220708 Lewis Sep 2007 A1
20070230227 Palmer Oct 2007 A1
20070234420 Novotney et al. Oct 2007 A1
20070236408 Yamaguchi et al. Oct 2007 A1
20070236475 Wherry Oct 2007 A1
20070236873 Yukawa et al. Oct 2007 A1
20070247432 Oakley Oct 2007 A1
20070260892 Paul et al. Nov 2007 A1
20070274094 Schultz et al. Nov 2007 A1
20070274095 Destain Nov 2007 A1
20070274099 Tai et al. Nov 2007 A1
20070283179 Burnett et al. Dec 2007 A1
20070296709 Guanghai Dec 2007 A1
20080005423 Jacobs et al. Jan 2008 A1
20080013809 Zhu et al. Jan 2008 A1
20080018611 Serban et al. Jan 2008 A1
20080019150 Park et al. Jan 2008 A1
20080037284 Rudisill Feb 2008 A1
20080053222 Ehrensvard et al. Mar 2008 A1
20080059888 Dunko Mar 2008 A1
20080068451 Hyatt Mar 2008 A1
20080104437 Lee May 2008 A1
20080106592 Mikami May 2008 A1
20080122803 Izadi et al. May 2008 A1
20080150913 Bell et al. Jun 2008 A1
20080151478 Chern Jun 2008 A1
20080158185 Westerman Jul 2008 A1
20080167832 Soss Jul 2008 A1
20080174570 Jobs et al. Jul 2008 A1
20080186660 Yang Aug 2008 A1
20080211787 Nakao et al. Sep 2008 A1
20080219025 Spitzer et al. Sep 2008 A1
20080225205 Travis Sep 2008 A1
20080228969 Cheah et al. Sep 2008 A1
20080238884 Harish Oct 2008 A1
20080253822 Matias Oct 2008 A1
20080297878 Brown et al. Dec 2008 A1
20080307242 Qu Dec 2008 A1
20080309636 Feng et al. Dec 2008 A1
20080316002 Brunet et al. Dec 2008 A1
20080316183 Westerman et al. Dec 2008 A1
20080316768 Travis Dec 2008 A1
20080320190 Lydon et al. Dec 2008 A1
20090007001 Morin et al. Jan 2009 A1
20090009476 Daley, III Jan 2009 A1
20090033623 Lin Feb 2009 A1
20090065267 Sato Mar 2009 A1
20090073060 Shimasaki et al. Mar 2009 A1
20090073957 Newland et al. Mar 2009 A1
20090079639 Hotta et al. Mar 2009 A1
20090083562 Park et al. Mar 2009 A1
20090089600 Nousiainen Apr 2009 A1
20090096738 Chen et al. Apr 2009 A1
20090127005 Zachut et al. May 2009 A1
20090135142 Fu et al. May 2009 A1
20090140985 Liu Jun 2009 A1
20090146992 Fukunaga et al. Jun 2009 A1
20090161385 Parker et al. Jun 2009 A1
20090163147 Steigerwald et al. Jun 2009 A1
20090167728 Geaghan et al. Jul 2009 A1
20090167930 Safaee-Rad et al. Jul 2009 A1
20090174759 Yeh et al. Jul 2009 A1
20090189873 Peterson et al. Jul 2009 A1
20090189974 Deering Jul 2009 A1
20090195497 Fitzgerald et al. Aug 2009 A1
20090195518 Mattice et al. Aug 2009 A1
20090207144 Bridger Aug 2009 A1
20090219250 Ure Sep 2009 A1
20090231275 Odgers Sep 2009 A1
20090239586 Boeve et al. Sep 2009 A1
20090244832 Behar et al. Oct 2009 A1
20090251008 Sugaya Oct 2009 A1
20090259865 Sheynblat et al. Oct 2009 A1
20090262492 Whitchurch et al. Oct 2009 A1
20090265670 Kim et al. Oct 2009 A1
20090285491 Ravenscroft et al. Nov 2009 A1
20090296331 Choy Dec 2009 A1
20090303137 Kusaka et al. Dec 2009 A1
20090303204 Nasiri et al. Dec 2009 A1
20090315830 Westerman Dec 2009 A1
20090316072 Okumura et al. Dec 2009 A1
20090320244 Lin Dec 2009 A1
20090321490 Groene et al. Dec 2009 A1
20100001963 Doray et al. Jan 2010 A1
20100013319 Kamiyama et al. Jan 2010 A1
20100026656 Hotelling et al. Feb 2010 A1
20100038821 Jenkins et al. Feb 2010 A1
20100039081 Sip Feb 2010 A1
20100045609 Do et al. Feb 2010 A1
20100045633 Gettemy Feb 2010 A1
20100051356 Stern et al. Mar 2010 A1
20100051432 Lin et al. Mar 2010 A1
20100052880 Laitinen et al. Mar 2010 A1
20100053534 Hsieh et al. Mar 2010 A1
20100053771 Travis et al. Mar 2010 A1
20100072351 Mahowald Mar 2010 A1
20100075517 Ni et al. Mar 2010 A1
20100077237 Sawyers Mar 2010 A1
20100081377 Chatterjee et al. Apr 2010 A1
20100085321 Pundsack Apr 2010 A1
20100102182 Lin Apr 2010 A1
20100102206 Cazaux et al. Apr 2010 A1
20100103112 Yoo et al. Apr 2010 A1
20100105443 Vaisanen Apr 2010 A1
20100123686 Klinghult et al. May 2010 A1
20100133398 Chiu et al. Jun 2010 A1
20100135036 Matsuba et al. Jun 2010 A1
20100142130 Wang et al. Jun 2010 A1
20100148995 Elias Jun 2010 A1
20100148999 Casparian et al. Jun 2010 A1
20100149073 Chaum et al. Jun 2010 A1
20100149104 Sim et al. Jun 2010 A1
20100149111 Olien Jun 2010 A1
20100149117 Chien et al. Jun 2010 A1
20100149134 Westerman et al. Jun 2010 A1
20100149377 Shintani et al. Jun 2010 A1
20100156798 Archer Jun 2010 A1
20100156913 Ortega et al. Jun 2010 A1
20100161522 Tirpak et al. Jun 2010 A1
20100164857 Liu et al. Jul 2010 A1
20100164897 Morin et al. Jul 2010 A1
20100171891 Kaji et al. Jul 2010 A1
20100174421 Tsai et al. Jul 2010 A1
20100177388 Cohen et al. Jul 2010 A1
20100180063 Ananny et al. Jul 2010 A1
20100188299 Rinehart et al. Jul 2010 A1
20100188338 Longe Jul 2010 A1
20100205472 Tupman et al. Aug 2010 A1
20100206614 Park et al. Aug 2010 A1
20100206644 Yeh Aug 2010 A1
20100214214 Corson et al. Aug 2010 A1
20100214257 Wussler et al. Aug 2010 A1
20100222110 Kim et al. Sep 2010 A1
20100231498 Large et al. Sep 2010 A1
20100231510 Sampsell et al. Sep 2010 A1
20100231556 Mines et al. Sep 2010 A1
20100235546 Terlizzi et al. Sep 2010 A1
20100238075 Pourseyed Sep 2010 A1
20100238138 Goertz et al. Sep 2010 A1
20100238620 Fish Sep 2010 A1
20100245221 Khan Sep 2010 A1
20100245289 Svajda Sep 2010 A1
20100250988 Okuda et al. Sep 2010 A1
20100259482 Ball Oct 2010 A1
20100265182 Ball et al. Oct 2010 A1
20100271771 Wu et al. Oct 2010 A1
20100274932 Kose Oct 2010 A1
20100279768 Huang et al. Nov 2010 A1
20100282953 Tam Nov 2010 A1
20100289457 Onnerud et al. Nov 2010 A1
20100295812 Burns et al. Nov 2010 A1
20100296163 Saarikko Nov 2010 A1
20100299642 Merrell et al. Nov 2010 A1
20100302378 Marks et al. Dec 2010 A1
20100304793 Kim et al. Dec 2010 A1
20100306538 Thomas et al. Dec 2010 A1
20100308778 Yamazaki et al. Dec 2010 A1
20100308844 Day et al. Dec 2010 A1
20100309617 Wang et al. Dec 2010 A1
20100313680 Joung et al. Dec 2010 A1
20100315348 Jellicoe et al. Dec 2010 A1
20100315373 Steinhauser et al. Dec 2010 A1
20100321339 Kimmel Dec 2010 A1
20100321877 Moser Dec 2010 A1
20100324457 Bean et al. Dec 2010 A1
20100325155 Skinner et al. Dec 2010 A1
20100331059 Apgar et al. Dec 2010 A1
20110007047 Fujioka et al. Jan 2011 A1
20110012873 Prest et al. Jan 2011 A1
20110019123 Prest et al. Jan 2011 A1
20110031287 Le Gette et al. Feb 2011 A1
20110032127 Roush Feb 2011 A1
20110032215 Sirotich et al. Feb 2011 A1
20110036965 Zhang et al. Feb 2011 A1
20110037721 Cranfill et al. Feb 2011 A1
20110043990 Mickey et al. Feb 2011 A1
20110044582 Travis et al. Feb 2011 A1
20110050576 Forutanpour et al. Mar 2011 A1
20110050626 Porter et al. Mar 2011 A1
20110055407 Lydon et al. Mar 2011 A1
20110057724 Pabon Mar 2011 A1
20110057899 Sleeman et al. Mar 2011 A1
20110060926 Brooks et al. Mar 2011 A1
20110069148 Jones et al. Mar 2011 A1
20110072391 Hanggie et al. Mar 2011 A1
20110074688 Hull et al. Mar 2011 A1
20110081946 Singh et al. Apr 2011 A1
20110096035 Shen Apr 2011 A1
20110096513 Kim Apr 2011 A1
20110102326 Casparian et al. May 2011 A1
20110102356 Kemppinen et al. May 2011 A1
20110115738 Suzuki et al. May 2011 A1
20110115747 Powell et al. May 2011 A1
20110117970 Choi May 2011 A1
20110118025 Lukas et al. May 2011 A1
20110122071 Powell May 2011 A1
20110134032 Chiu et al. Jun 2011 A1
20110134112 Koh et al. Jun 2011 A1
20110157046 Lee et al. Jun 2011 A1
20110157087 Kanehira et al. Jun 2011 A1
20110163955 Nasiri et al. Jul 2011 A1
20110164370 McClure et al. Jul 2011 A1
20110167181 Minoo et al. Jul 2011 A1
20110167287 Walsh et al. Jul 2011 A1
20110167391 Momeyer et al. Jul 2011 A1
20110167992 Eventoff et al. Jul 2011 A1
20110169762 Weiss Jul 2011 A1
20110169778 Nungester et al. Jul 2011 A1
20110170289 Allen et al. Jul 2011 A1
20110176035 Poulsen Jul 2011 A1
20110179864 Raasch et al. Jul 2011 A1
20110184646 Wong et al. Jul 2011 A1
20110184824 George et al. Jul 2011 A1
20110188199 Pan Aug 2011 A1
20110193787 Morishige et al. Aug 2011 A1
20110193938 Oderwald et al. Aug 2011 A1
20110202878 Park et al. Aug 2011 A1
20110205372 Miramontes Aug 2011 A1
20110216266 Travis Sep 2011 A1
20110221659 King et al. Sep 2011 A1
20110227913 Hyndman Sep 2011 A1
20110231682 Kakish et al. Sep 2011 A1
20110235179 Simmonds Sep 2011 A1
20110242138 Tribble Oct 2011 A1
20110242298 Bathiche et al. Oct 2011 A1
20110242440 Noma et al. Oct 2011 A1
20110242670 Simmonds Oct 2011 A1
20110248152 Svajda et al. Oct 2011 A1
20110248920 Larsen Oct 2011 A1
20110248941 Abdo et al. Oct 2011 A1
20110261001 Liu Oct 2011 A1
20110261083 Wilson Oct 2011 A1
20110267272 Meyer et al. Nov 2011 A1
20110267300 Serban et al. Nov 2011 A1
20110273475 Herz et al. Nov 2011 A1
20110290686 Huang Dec 2011 A1
20110295697 Boston et al. Dec 2011 A1
20110297566 Gallagher et al. Dec 2011 A1
20110298919 Maglaque Dec 2011 A1
20110302518 Zhang Dec 2011 A1
20110304577 Brown Dec 2011 A1
20110304815 Newell Dec 2011 A1
20110304962 Su Dec 2011 A1
20110305875 Sanford et al. Dec 2011 A1
20110306424 Kazama et al. Dec 2011 A1
20110316807 Corrion Dec 2011 A1
20120007821 Zaliva Jan 2012 A1
20120011462 Westerman et al. Jan 2012 A1
20120013519 Hakansson et al. Jan 2012 A1
20120019165 Igaki et al. Jan 2012 A1
20120020112 Fisher et al. Jan 2012 A1
20120023459 Westerman Jan 2012 A1
20120024682 Huang et al. Feb 2012 A1
20120026048 Vazquez et al. Feb 2012 A1
20120026096 Ku Feb 2012 A1
20120032887 Chiu et al. Feb 2012 A1
20120032891 Parivar Feb 2012 A1
20120038495 Ishikawa Feb 2012 A1
20120044179 Hudson Feb 2012 A1
20120047368 Chinn et al. Feb 2012 A1
20120050975 Garelli et al. Mar 2012 A1
20120062850 Travis Mar 2012 A1
20120068919 Lauder et al. Mar 2012 A1
20120069540 Lauder et al. Mar 2012 A1
20120072167 Cretella, Jr. et al. Mar 2012 A1
20120075249 Hoch Mar 2012 A1
20120077384 Bar-Niv et al. Mar 2012 A1
20120081316 Sirpal et al. Apr 2012 A1
20120092279 Martin Apr 2012 A1
20120094257 Pillischer et al. Apr 2012 A1
20120099263 Lin Apr 2012 A1
20120099749 Rubin et al. Apr 2012 A1
20120113579 Agata et al. May 2012 A1
20120115553 Mahe et al. May 2012 A1
20120117409 Lee et al. May 2012 A1
20120127118 Nolting et al. May 2012 A1
20120140396 Zeliff et al. Jun 2012 A1
20120145525 Ishikawa Jun 2012 A1
20120162693 Ito Jun 2012 A1
20120175487 Goto Jul 2012 A1
20120182242 Lindahl et al. Jul 2012 A1
20120182249 Endo et al. Jul 2012 A1
20120182743 Chou Jul 2012 A1
20120194393 Uttermann et al. Aug 2012 A1
20120194448 Rothkopf Aug 2012 A1
20120200802 Large Aug 2012 A1
20120206937 Travis et al. Aug 2012 A1
20120212438 Vaisanen Aug 2012 A1
20120223866 Ayala et al. Sep 2012 A1
20120224073 Miyahara Sep 2012 A1
20120229634 Laett et al. Sep 2012 A1
20120235635 Sato Sep 2012 A1
20120243165 Chang et al. Sep 2012 A1
20120246377 Bhesania et al. Sep 2012 A1
20120249443 Anderson et al. Oct 2012 A1
20120256959 Ye et al. Oct 2012 A1
20120274811 Bakin Nov 2012 A1
20120287562 Wu et al. Nov 2012 A1
20120299872 Nishikawa et al. Nov 2012 A1
20120300275 Vilardell et al. Nov 2012 A1
20120312955 Randolph Dec 2012 A1
20120328349 Isaac et al. Dec 2012 A1
20130009413 Chiu et al. Jan 2013 A1
20130016468 Oh Jan 2013 A1
20130027867 Lauder et al. Jan 2013 A1
20130044074 Park et al. Feb 2013 A1
20130063873 Wodrich et al. Mar 2013 A1
20130067126 Casparian et al. Mar 2013 A1
20130073877 Radke Mar 2013 A1
20130076617 Csaszar et al. Mar 2013 A1
20130088431 Ballagas et al. Apr 2013 A1
20130106766 Yilmaz et al. May 2013 A1
20130107144 Marhefka et al. May 2013 A1
20130120466 Chen et al. May 2013 A1
20130162554 Lauder et al. Jun 2013 A1
20130172906 Olson et al. Jul 2013 A1
20130201094 Travis Aug 2013 A1
20130207937 Lutian Aug 2013 A1
20130217451 Komiyama et al. Aug 2013 A1
20130227836 Whitt, III Sep 2013 A1
20130228023 Drasnin Sep 2013 A1
20130228433 Shaw Sep 2013 A1
20130228434 Whitt, III Sep 2013 A1
20130228435 Whitt, III Sep 2013 A1
20130228439 Whitt, III Sep 2013 A1
20130229100 Siddiqui Sep 2013 A1
20130229335 Whitman Sep 2013 A1
20130229347 Lutz, III Sep 2013 A1
20130229350 Shaw Sep 2013 A1
20130229351 Whitt, III Sep 2013 A1
20130229354 Whitt, III Sep 2013 A1
20130229356 Marwah Sep 2013 A1
20130229363 Whitman Sep 2013 A1
20130229366 Dighde Sep 2013 A1
20130229380 Lutz, III Sep 2013 A1
20130229386 Bathiche Sep 2013 A1
20130229534 Panay Sep 2013 A1
20130229568 Belesiu Sep 2013 A1
20130229570 Beck et al. Sep 2013 A1
20130229756 Whitt, III Sep 2013 A1
20130229757 Whitt, III Sep 2013 A1
20130229758 Belesiu Sep 2013 A1
20130229759 Whitt, III Sep 2013 A1
20130229760 Whitt, III Sep 2013 A1
20130229761 Shaw Sep 2013 A1
20130229762 Whitt, III Sep 2013 A1
20130229773 Siddiqui Sep 2013 A1
20130230346 Shaw Sep 2013 A1
20130231755 Perek Sep 2013 A1
20130232280 Perek Sep 2013 A1
20130232348 Oler Sep 2013 A1
20130232349 Oler Sep 2013 A1
20130232350 Belesiu et al. Sep 2013 A1
20130232353 Belesiu Sep 2013 A1
20130232571 Belesiu Sep 2013 A1
20130242495 Bathiche et al. Sep 2013 A1
20130262886 Nishimura Oct 2013 A1
20130300590 Dietz Nov 2013 A1
20130300647 Drasnin Nov 2013 A1
20130301199 Whitt Nov 2013 A1
20130301206 Whitt Nov 2013 A1
20130304941 Drasnin Nov 2013 A1
20130322000 Whitt Dec 2013 A1
20130322001 Whitt Dec 2013 A1
20130328761 Boulanger Dec 2013 A1
20130329360 Aldana Dec 2013 A1
20130332628 Panay Dec 2013 A1
20130335387 Emerton Dec 2013 A1
20130339757 Reddy Dec 2013 A1
20140043275 Whitman et al. Feb 2014 A1
20140048399 Whitt, III Feb 2014 A1
20140049894 Rihn Feb 2014 A1
20140063198 Boulanger Mar 2014 A1
20140078063 Bathiche Mar 2014 A1
20140119802 Shaw May 2014 A1
20140132550 McCracken et al. May 2014 A1
20140185215 Whitt Jul 2014 A1
20140185220 Whitt Jul 2014 A1
20140204514 Whitt Jul 2014 A1
20140204515 Whitt Jul 2014 A1
Foreign Referenced Citations (53)
Number Date Country
990023 Jun 1976 CA
1440513 Sep 2003 CN
103455149 Dec 2013 CN
1223722 Jul 2002 EP
1591891 Nov 2005 EP
2026178 Feb 2009 EP
2353978 Aug 2011 EP
2381290 Oct 2011 EP
2123213 Jan 1984 GB
56108127 Aug 1981 JP
10234057 Sep 1998 JP
10301055 Nov 1998 JP
10326124 Dec 1998 JP
1173239 Mar 1999 JP
11338575 Dec 1999 JP
2000010654 Jan 2000 JP
2000106021 Apr 2000 JP
2001142564 May 2001 JP
2001174746 Jun 2001 JP
2002162912 Jun 2002 JP
2004038950 Feb 2004 JP
2005331565 Dec 2005 JP
2006163459 Jun 2006 JP
2006294361 Oct 2006 JP
2009122551 Jun 2009 JP
2010244514 Oct 2010 JP
20010039013 May 2001 KR
20010107055 Dec 2001 KR
20040066647 Jul 2004 KR
20050014299 Feb 2005 KR
20060003093 Jan 2006 KR
20080006404 Jan 2008 KR
20080009490 Jan 2008 KR
20080055051 Jun 2008 KR
20090029411 Mar 2009 KR
20100022059 Feb 2010 KR
20100067366 Jun 2010 KR
20100115675 Oct 2010 KR
20110064265 Jun 2011 KR
102011008717 Aug 2011 KR
20110109791 Oct 2011 KR
20110120002 Nov 2011 KR
20110122333 Nov 2011 KR
101113530 Feb 2012 KR
WO-9964784 Dec 1999 WO
WO-0079327 Dec 2000 WO
WO-0128309 Apr 2001 WO
WO-2005059874 Jun 2005 WO
WO-2006044818 Apr 2006 WO
WO 2007112172 Oct 2007 WO
WO 2009034484 Mar 2009 WO
WO 2011049609 Apr 2011 WO
WO 2012036717 Mar 2012 WO
Non-Patent Literature Citations (286)
Entry
“Advisory Action”, U.S. Appl. No. 13/939,032, Feb. 24, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/563,435, Jan. 14, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/563,435, Jan. 22, 2014, 2 pages.
“Final Office Action”, U.S. Appl. No. 13/021,448, Jan. 16, 2014, 33 Pages.
“Final Office Action”, U.S. Appl. No. 13/564,520, Jan. 15, 2014, 7 pages.
“Foreign Office Action”, CN Application No. 201320097066.8, Oct. 24, 2013, 5 Pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/055679, Nov. 18, 2013, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/599,635, Feb. 25, 2014, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,186, Feb. 27, 2014, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,405, Feb. 20, 2014, 37 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/494,651, Feb. 4, 2014, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/525,070, Jan. 17, 2014, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/564,520, Feb. 14, 2014, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,682, Feb. 26, 2014, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/063,912, Jan. 2, 2014, 10 pages.
“Notice of Allowance”, U.S. Appl. No. 13/938,930, Feb. 20, 2014, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/939,002, Mar. 3, 2014, 4 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/563,435, Mar. 20, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/565,124, Apr. 3, 2014, 4 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/565,124, Mar. 10, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/565,124, Apr. 14, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/938,930, May 6, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/939,002, May 5, 2014, 2 pages.
“Final Office Action”, U.S. Appl. No. 13/371,725, Apr. 2, 2014, 22 pages.
“Final Office Action”, U.S. Appl. No. 13/525,070, Apr. 24, 2014, 21 pages.
“Final Office Action”, U.S. Appl. No. 13/780,228, Mar. 28, 2014, 13 pages.
“Final Office Action”, U.S. Appl. No. 14/063,912, Apr. 29, 2014, 10 pages.
“Final Office Action”, U.S. Appl. No. 14/199,924, May 6, 2014, 5 pages.
“Foreign Office Action”, CN Application No. 201320328022.1, Feb. 17, 2014, 4 Pages.
“Foreign Office Action”, CN Application No. 201320328022.1, Oct. 18, 2013, 3 Pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,237, Mar. 24, 2014, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,336, May 7, 2014, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,376, Apr. 2, 2014, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/492,232, Apr. 30, 2014, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/527,263, Apr. 3, 2014, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/656,055, Mar. 12, 2014, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/199,924, Apr. 10, 2014, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/200,595, Apr. 11, 2014, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,139, Mar. 17, 2014, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,232, Apr. 25, 2014, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,287, May 2, 2014, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 13/939,032, Apr. 3, 2014, 4 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/653,321, Mar. 28, 2014, 4 pages.
“Accessing Device Sensors”, retrieved from <https://developer.palm.com/content/api/dev-guide/pdk/accessing-device-sensors.html> on May 25, 2012, 4 pages.
“ACPI Docking for Windows Operating Systems”, Retrieved from: <http://www.scritube.com/limba/engleza/software/ACPI-Docking-for-Windows-Opera331824193.php> on Jun. 6, 2012,10 pages.
“Chinese Search Report”, Application No. 201110272868.3, (Apr. 1, 2013),10 pages.
“Cholesteric Liquid Crystal”, Retrieved from: <http://en.wikipedia.org/wiki/Cholesteric_liquid_crystal> on Aug. 6, 2012,(Jun. 10, 2012), 2 pages.
“Cirago Slim Case®—Protective case with built-in kickstand for your iPhone 5®”, Retrieved from <http://cirago.com/wordpress/wp-content/uploads/2012/10/ipc1500brochure1.pdf> on Jan. 29, 2013, 1 page.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/470,633, (Apr. 9, 2013), 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/470,633, (Jul. 2, 2013), 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/651,327, (Sep. 12, 2013), 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/651,327, (Sep. 23, 2013), 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/651,726, (Sep. 17, 2013), 2 pages.
“Developing Next-Generation Human Interfaces using Capacitive and Infrared Proximity Sensing”, Silicon Laboratories, Inc., Available at <http://www.silabs.com/pages/DownloadDoc.aspx?FILEURL=support%20documents/technicaldocs/capacitive%20and%20sensing_wp.pdf&src=SearchResults>,(Aug. 30, 2010), pp. 1-10.
“Directional Backlighting for Display Panels”, U.S. Appl. No. 13/021,448, (Feb. 4, 2011), 38 pages.
“DR2PA”, retrieved from <http://www.architainment.co.uk/wp-content/uploads/2012/08/DR2PA-AU-US-size-Data-Sheet-Rev-H_LOGO.pdf> on Sep. 17, 2012, 4 pages.
“Final Office Action”, U.S. Appl. No. 13/471,001, (Jul. 25, 2013), 20 pages.
“Final Office Action”, U.S. Appl. No. 13/471,139, (Sep. 16, 2013),13 pages.
“Final Office Action”, U.S. Appl. No. 13/471,336, (Aug. 28, 2013),18 pages.
“Final Office Action”, U.S. Appl. No. 13/651,195, (Apr. 18, 2013),13 pages.
“Final Office Action”, U.S. Appl. No. 13/651,232, (May 21, 2013), 21 pages.
“Final Office Action”, U.S. Appl. No. 13/651,287, (May 3, 2013), 16 pages.
“Final Office Action”, U.S. Appl. No. 13/651,976, (Jul. 25, 2013), 21 pages.
“Final Office Action”, U.S. Appl. No. 13/653,321, (Aug. 2, 2013),17 pages.
“Final Office Action”, U.S. Appl. No. 13/653,682, (Oct. 18, 2013),16 pages.
“First One Handed Fabric Keyboard with Bluetooth Wireless Technology”, Retrieved from: <http://press.xtvworld.com/article3817.html> on May 18, 2012,(Jan. 6, 2005), 2 pages.
“For Any Kind of Proceeding 2011 Springtime as Well as Coil Nailers as Well as Hotter Summer Season”, Lady Shoe Worlds, retrieved from <http://www.ladyshoesworld.com/2011/09/18/for-any-kind-of-proceeding-2011-springtime-as-well-as-coil-nailers-as-well-as-hotter-summer-season/> on Nov. 3, 2011,(Sep. 8, 2011), 2 pages.
“Force and Position Sensing Resistors: An Emerging Technology”, Interlink Electronics, Available at <http://staff.science.uva.nl/~vlaander/docu/FSR/An_Exploring_Technology.pdf>, (Feb. 1990), pp. 1-6.
“Frogpad Introduces Wearable Fabric Keyboard with Bluetooth Technology”, Retrieved from: <http://www.geekzone.co.nz/content.asp?contentid=3898> on May 7, 2012,(Jan. 7, 2005), 3 pages.
“How to Use the iPad's Onscreen Keyboard”, Retrieved from <http://www.dummies.com/how-to/content/how-to-use-the-ipads-onscreen-keyboard.html> on Aug. 28, 2012, 3 pages.
“iControlPad 2—The open source controller”, Retrieved from <http://www.kickstarter.com/projects/1703567677/icontrolpad-2-the-open-source-controller> on Nov. 20, 2012, (2012),15 pages.
“i-Interactor electronic pen”, Retrieved from: <http://www.alibaba.com/product-gs/331004878/i_Interactor_electronic_pen.html> on Jun. 19, 2012, 5 pages.
“Incipio LG G-Slate Premium Kickstand Case—Black Nylon”, Retrieved from: <http://www.amazon.com/Incipio-G-Slate-Premium-Kickstand-Case/dp/B004ZKP916> on May 18, 2012, 4 pages.
“International Search Report and Written Opinion”, International Application No. PCT/US2011/050471, (Apr. 9, 2012), 8 pages.
“Membrane Keyboards & Membrane Keypads”, Retrieved from: <http://www.pannam.com/> on May 9, 2012,(Mar. 4, 2009), 2 pages.
“Microsoft Develops Glasses-Free Eye-Tracking 3D Display”, Tech-FAQ, retrieved from <http://www.tech-faq.com/microsoft-develops-glasses-free-eye-tracking-3d-display.html> on Nov. 2, 2011, 3 pages.
“Microsoft Reveals Futuristic 3D Virtual HoloDesk Patent”, Retrieved from <http://www.patentbolt.com/2012/05/microsoft-reveals-futuristic-3d-virtual-holodesk-patent.html> on May 28, 2012, (May 23, 2012), 9 pages.
“Motion Sensors”, Android Developers, retrieved from <http://developer.android.com/guide/topics/sensors/sensors_motion.html> on May 25, 2012, 7 pages.
“MPC Fly Music Production Controller”, AKAI Professional, Retrieved from: <http://www.akaiprompc.com/mpc-fly> on Jul. 9, 2012, 4 pages.
“NI Releases New Maschine & Maschine Mikro”, Retrieved from <http://www.djbooth.net/index/dj-equipment/entry/ni-releases-new-maschine-mikro/> on Sep. 17, 2012, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/882,994, (Feb. 1, 2013),17 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/021,448, (Dec. 13, 2012), 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/021,448, (Aug. 16, 2013), 25 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,001, (Feb. 19, 2013),15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,139, (Mar. 21, 2013),12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,202, (Feb. 11, 2013),10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,336, (Jan. 18, 2013),14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/527,263, (Jul. 19, 2013), 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/563,435, (Jun. 14, 2013), 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/564,520, (Jun. 19, 2013), 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/565,124, (Jun. 17, 2013), 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,195, (Jan. 2, 2013),14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,232, (Jan. 17, 2013),15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,272, (Feb. 12, 2013),10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,287, (Jan. 29, 2013),13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,304, (Mar. 22, 2013), 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,327, (Mar. 22, 2013), 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,726, (Apr. 15, 2013), 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,871, (Mar. 18, 2013),14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,871, (Jul. 1, 2013), 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,976, (Feb. 22, 2013),16 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,321, (Feb. 1, 2013),13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,682, (Feb. 7, 2013),11 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,682, (Jun. 3, 2013),14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/656,055, (Apr. 23, 2013),11 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/656,520, (Feb. 1, 2013),15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/656,520, (Jun. 5, 2013), 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/938,930, (Aug. 29, 2013), 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/939,002, (Aug. 28, 2013), 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/939,032, (Aug. 29, 2013), 7 pages.
“Notice of Allowance”, U.S. Appl. No. 12/882,994, (Jul. 12, 2013), 9 pages.
“Notice of Allowance”, U.S. Appl. No. 13/470,633, (Mar. 22, 2013), 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,202, (May 28, 2013), 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,195, (Jul. 8, 2013), 9 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,272, (May 2, 2013), 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,304, (Jul. 1, 2013), 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,327, (Jun. 11, 2013), 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,726, (May 31, 2013), 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,871, (Oct. 2, 2013), 7 pages.
“Notice to Grant”, Chinese Application No. 201320097089.9 (Sep. 29, 2013), 2 Pages.
“On-Screen Keyboard for Windows 7, Vista, XP with Touchscreen”, Retrieved from <www.comfort-software.com/on-screen-keyboard.html> on Aug. 28, 2012, (Feb. 2, 2011), 3 pages.
“Optical Sensors in Smart Mobile Devices”, ON Semiconductor, TND415/D, Available at <http://www.onsemi.jp/pub_link/Collateral/TND415-D.PDF>,(Nov. 2010), pp. 1-13.
“Optics for Displays: Waveguide-based Wedge Creates Collimated Display Backlight”, OptoIQ, retrieved from <http://www.optoiq.com/index/photonics-technologies-applications/lfw-display/lfw-article-display.articles.laser-focus-world.volume-46.issue-1.world-news.optics-for_displays.html> on Nov. 2, 2010,(Jan. 1, 2010), 3 pages.
“PCT Search Report and Written Opinion”, Application No. PCT/US2013/028479, (Jun. 17, 2013), 10 pages.
“PCT Search Report and Written Opinion”, Application No. PCT/US2013/029461, (Jun. 21, 2013), 11 pages.
“PCT Search Report and Written Opinion”, Application No. PCT/US2013/028948, (Jun. 21, 2013), 11 pages.
“PCT Search Report”, Application No. PCT/US2013/042790, (Aug. 8, 2013), 9 pages.
“Position Sensors”, Android Developers, retrieved from <http://developer.android.com/guide/topics/sensors/sensors_position.html> on May 25, 2012, 5 pages.
“Reflex LCD Writing Tablets”, retrieved from <http://www.kentdisplays.com/products/lcdwritingtablets.html> on Jun. 27, 2012, 3 pages.
“Restriction Requirement”, U.S. Appl. No. 13/471,139, (Jan. 17, 2013), 7 pages.
“Restriction Requirement”, U.S. Appl. No. 13/651,304, (Jan. 18, 2013), 7 pages.
“Restriction Requirement”, U.S. Appl. No. 13/651,726, (Feb. 22, 2013), 6 pages.
“Restriction Requirement”, U.S. Appl. No. 13/651,871, (Feb. 7, 2013), 6 pages.
“Restriction Requirement”, U.S. Appl. No. 13/715,229, (Aug. 13, 2013), 7 pages.
“Smart Board™ Interactive Display Frame Pencil Pack”, Available at <http://downloads01.smarttech.com/media/sitecore/en/support/product/sbfpd/400series(interactivedisplayframes)/guides/smartboardinteractivedisplayframepencilpackv12mar09.pdf>,(2009), 2 pages.
“SolRx™ E-Series Multidirectional Phototherapy Expandable™ 2-Bulb Full Body Panel System”, Retrieved from: <http://www.solarcsystems.com/us_multidirectional_uv_light_therapy_1_intro.html> on Jul. 25, 2012,(2011), 4 pages.
“The Microsoft Surface Tablets Comes With Impressive Design and Specs”, Retrieved from <http://microsofttabletreview.com/the-microsoft-surface-tablets-comes-with-impressive-design-and-specs> on Jan. 30, 2013, (Jun. 2012), 2 pages.
“Tilt Shift Lenses: Perspective Control”, retrieved from http://www.cambridgeincolour.com/tutorials/tilt-shift-lenses1.htm, (Mar. 28, 2008),11 Pages.
“Virtualization Getting Started Guide”, Red Hat Enterprise Linux 6, Edition 0.2, retrieved from <http://docs.redhat.com/docs/en-US/Red_Hat_Enterprise_Linux/6/html-single/Virtualization_Getting_Started_Guide/index.html> on Jun. 13, 2012, 24 pages.
“What is Active Alignment?”, http://www.kasalis.com/active_alignment.html, retrieved on Nov. 22, 2012, 2 Pages.
Bert, et al., “Passive Matrix Addressing of Electrophoretic Image Display”, Conference on International Display Research Conference, Retrieved from <http://www.cmst.be/publi/eurodisplay2002_s14-1.pdf>, (Oct. 1, 2002), 4 pages.
Block, Steve et al., “DeviceOrientation Event Specification”, W3C, Editor's Draft, retrieved from <https://developer.palm.com/content/api/dev-guide/pdk/accessing-device-sensors.html> on May 25, 2012, (Jul. 12, 2011), 14 pages.
Brown, Rich “Microsoft Shows Off Pressure-Sensitive Keyboard”, retrieved from <http://news.cnet.com/8301-17938_105-10304792-1.html> on May 7, 2012, (Aug. 6, 2009), 2 pages.
Burge, et al., “Determination of off-axis aberrations of imaging systems using on-axis measurements”, SPIE Proceeding, Retrieved from <http://www.loft.optics.arizona.edu/documents/journal_articles/Jim_Burge_Determination_of_off-axis_aberrations_of_imaging_systems_using_on-axis_measurements.pdf>,(Sep. 21, 2011),10 pages.
Butler, Alex et al., “SideSight: Multi-“touch” Interaction around Small Devices”, In the proceedings of the 21st annual ACM symposium on User interface software and technology, retrieved from <http://research.microsoft.com/pubs/132534/sidesight_crv3.pdf> on May 29, 2012,(Oct. 19, 2008), 4 pages.
Chang, Jee-Gong et al., “Optical Design and Analysis of LCD Backlight Units Using ASAP”, Optical Engineering, Available at <http://www.opticsvalley.com/resources/kbasePDF/ma_oe_001_optical_design.pdf>,(Jun. 2, 2003),15 pages.
Crider, Michael “Sony Slate Concept Tablet “Grows” a Kickstand”, Retrieved from: <http://androidcommunity.com/sony-slate-concept-tablet-grows-a-kickstand-20120116/> on May 4, 2012,(Jan. 16, 2012), 9 pages.
Das, Apurba et al., “Study of Heat Transfer through Multilayer Clothing Assemblies: A Theoretical Prediction”, Retrieved from <http://www.autexrj.com/cms/zalaczone_pliki/5_013_11.pdf>, (Jun. 2011), 7 pages.
Dietz, Paul H., et al., “A Practical Pressure Sensitive Computer Keyboard”, In Proceedings of UIST 2009,(Oct. 2009), 4 pages.
DiVerdi, et al., “An Immaterial Pseudo-3D Display with 3D Interaction”, In the proceedings of Three-Dimensional Television: Capture, Transmission, and Display, Springer, Retrieved from <http://www.cs.ucsb.edu/~holl/pubs/DiVerdi-2007-3DTV.pdf>,(Feb. 6, 2007), 26 pages.
Gaver, William W., et al., “A Virtual Window on Media Space”, retrieved from <http://www.gold.ac.uk/media/15gaver-smets-overbeeke.MediaSpaceWindow.chi95.pdf> on Jun. 1, 2012,(May 7, 1995), 9 pages.
Glatt, Jeff “Channel and Key Pressure (Aftertouch).”, Retrieved from: <http://home.roadrunner.com/~jgglatt/tutr/touch.htm> on Jun. 11, 2012, 2 pages.
Grossman, et al., “Multi-Finger Gestural Interaction with 3D Volumetric Display”, In the proceedings of the 17th annual ACM symposium on User interface software and technology, Retrieved from <http://www.dgp.toronto.edu/papers/tgrossman_UIST2004.pdf>,(Oct. 24, 2004), pp. 61-70.
Hanlon, Mike “ElekTex Smart Fabric Keyboard Goes Wireless”, Retrieved from: <http://www.gizmag.com/go/5048/> on May 17, 2012,(Jan. 15, 2006), 5 pages.
Harada, Susumu et al., “VoiceDraw: A Hands-Free Voice-Driven Drawing Application for People with Motor Impairments”, In Proceedings of Ninth International ACM SIGACCESS Conference on Computers and Accessibility, retrieved from <http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.113.7211&rep=rep1&type=pdf> on Jun. 1, 2012,(Oct. 15, 2007), 8 pages.
Hinckley, Ken et al., “Codex: A Dual Screen Tablet Computer”, Conference on Human Factors in Computing Systems, (Apr. 9, 2009), 10 pages.
Iwase, Eiji “Multistep Sequential Batch Assembly of Three-Dimensional Ferromagnetic Microstructures with Elastic Hinges”, Journal of Microelectromechanical Systems, Retrieved at <http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1549861>, (Dec. 2005), 7 pages.
Izadi, Shahram et al., “ThinSight: A Thin Form-Factor Interactive Surface Technology”, Communications of the ACM, vol. 52, No. 12, retrieved from <http://research.microsoft.com/pubs/132532/p90-izadi.pdf> on Jan. 5, 2012, (Dec. 2009), pp. 90-98.
Jacobs, et al., “2D/3D Switchable Displays”, Sharp Technical Journal (4), Available at <https://cgi.sharp.co.jp/corporate/rd/journal-85/pdf/85-04.pdf>, (Apr. 2003), pp. 15-18.
Kaufmann, Benoit et al., “Hand Posture Recognition Using Real-time Artificial Evolution”, EvoApplications'09, retrieved from <http://evelyne.lutton.free.fr/Papers/KaufmannEvolASP2010.pdf> on Jan. 5, 2012, (Apr. 3, 2010), 10 pages.
Kaur, Sukhmani “Vincent Liew's redesigned laptop satisfies ergonomic needs”, Retrieved from: <http://www.designbuzz.com/entry/vincent-liew-s-redesigned-laptop-satisfies-ergonomic-needs/> on Jul. 27, 2012, (Jun. 21, 2010), 4 pages.
Khuntontong, Puttachat et al., “Fabrication of Molded Interconnection Devices by Ultrasonic Hot Embossing on Thin Polymer Films”, IEEE Transactions on Electronics Packaging Manufacturing, vol. 32, No. 3, (Jul. 2009), pp. 152-156.
Lee, C.M.G. “Flat-Panel Autostereoscopic 3D Display”, Optoelectronics, IET, Available at <http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=04455550>, (Feb. 2008), pp. 24-28.
Lee, et al., “Depth-Fused 3D Imagery on an Immaterial Display”, IEEE Transactions on Visualization and Computer Graphics, vol. 15, No. 1, Retrieved from <http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=04540094>, (Jan. 2009), pp. 20-33.
Lee, et al., “LED Light Coupler Design for a Ultra-Thin Light Guide”, Journal of the Optical Society of Korea, vol. 11, Issue 3, Retrieved from <http://opticslab.kongju.ac.kr/pdf/06.pdf>, (Sep. 2007), 5 pages.
Li, et al., “Characteristic Mode Based Tradeoff Analysis of Antenna-Chassis Interactions for Multiple Antenna Terminals”, In IEEE Transactions on Antennas and Propagation, Retrieved from <http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6060882>, (Feb. 2012), 13 pages.
Linderholm, Owen “Logitech Shows Cloth Keyboard for PDAs”, Retrieved from: <http://www.pcworld.com/article/89084/logitech_shows_cloth_keyboard_for_pdas.html> on May 7, 2012, (Mar. 15, 2002), 5 pages.
Liu, et al., “Three-dimensional PC: toward novel forms of human-computer interaction”, In the proceedings of Three-Dimensional Video and Display: Devices and Systems, vol. CR76, Retrieved from <http://www.google.com/in/url?sa=t&rct=j&q=Three-dimensional+PC:+toward+novel+forms+of+human-computer+interaction&source=web&cd=1&ved=0CFoQFjAA&url=http%3A%2F%2F%2Fciteserx.ist.psu.edu%2Fviewdoc%2Fdownload%3Fdoi%3D10.1.1.32.9469%26rep%3Drep1%26, (Nov. 5, 2000), pp. 250-281.
Manresa-Yee, Cristina et al., “Experiences Using a Hands-Free Interface”, In Proceedings of the 10th International ACM SIGACCESS Conference on Computers and Accessibility, retrieved from <http://dmi.uib.es/~cmanresay/Research/%5BMan08%5DAssets08.pdf> on Jun. 1, 2012, (Oct. 13, 2008), pp. 261-262.
McLellan, Charles “Eleksen Wireless Fabric Keyboard: a first look”, Retrieved from: <http://www.zdnetasia.com/eleksen-wireless-fabric-keyboard-a-first-look-40278954.htm> on May 7, 2012, (Jul. 17, 2006), 9 pages.
Miller, Matthew “MOGA gaming controller enhances the Android gaming experience”, Retrieved from <http://www.zdnet.com/moga-gaming-controller-enhances-the-android-gaming-experience-7000007550/> on Nov. 20, 2012, (Nov. 18, 2012), 9 pages.
Morookian, et al., “Ambient-Light-Canceling Camera Using Subtraction of Frames”, NASA Tech Briefs, Retrieved from <http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20110016693_2011017808.pdf>, (May 2004), 2 pages.
Nakanishi, Hideyuki et al., “Movable Cameras Enhance Social Telepresence in Media Spaces”, In Proceedings of the 27th International Conference on Human Factors in Computing Systems, retrieved from <http://smg.ams.eng.osaka-u.ac.jp/~nakanishi/hnp_2009_chi.pdf> on Jun. 1, 2012, (Apr. 6, 2009), 10 pages.
Peli, Eli “Visual and Optometric Issues with Head-Mounted Displays”, IS & T/OSA Optics & Imaging in the Information Age, The Society for Imaging Science and Technology, available at <http://www.u.arizona.edu/~zrui3/zhang_pHMPD_spie07.pdf>, (1996), pp. 364-369.
Piltch, Avram “ASUS Eee Pad Slider SL101 Review”, Retrieved from <http://www.laptopmag.com/review/tablets/asus-eee-pad-slider-sl101.aspx>, (Sep. 22, 2011), 5 pages.
Post, E.R. et al., “E-Broidery: Design and Fabrication of Textile-Based Computing”, IBM Systems Journal, vol. 39, Issue 3 & 4, (Jul. 2000), pp. 840-860.
Purcher, Jack “Apple is Paving the Way for a New 3D GUI for IOS Devices”, Retrieved from: <http://www.patentlyapple.com/patently-apple/2012/01/apple-is-paving-the-way-for-a-new-3d-gui-for-ios-devices.html> on Jun. 4, 2012, (Jan. 12, 2012), 15 pages.
Qin, Yongqiang et al., “pPen: Enabling Authenticated Pen and Touch Interaction on Tabletop Surfaces”, In Proceedings of ITS 2010, Available at <http://www.dfki.de/its2010/papers/pdf/po172.pdf>, (Nov. 2010), pp. 283-284.
Reilink, Rob et al., “Endoscopic Camera Control by Head Movements for Thoracic Surgery”, In Proceedings of 3rd IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics, retrieved from <http://doc.utwente.nl/74929/1/biorob_online.pdf> on Jun. 1, 2012, (Sep. 26, 2010), pp. 510-515.
Reisman, et al., “A Screen-Space Formulation for 2D and 3D Direct Manipulation”, In the proceedings of the 22nd annual ACM symposium on User interface software and technology, Retrieved from <http://innovis.cpsc.ucalgary.ca/innovis/uploads/Courses/TableTopDetails2009/Reisman2009.pdf>, (Oct. 4, 2009), pp. 69-78.
Schoning, Johannes et al., “Building Interactive Multi-Touch Surfaces”, Journal of Graphics, GPU, and Game Tools, vol. 14, No. 3, available at <http://www.libavg.com/raw-attachment/wiki/Multitouch/Multitouchguide_draft.pdf>, (Nov. 2009), pp. 35-55.
Staff, “Gametel Android controller turns tablets, phones into portable gaming devices”, Retrieved from <http://www.mobiletor.com/2011/11/18/gametel-android-controller-turns-tablets-phones-into-portable-gaming-devices/#> on Nov. 20, 2012, (Nov. 18, 2011), 5 pages.
Sumimoto, Mark “Touch & Write: Surface Computing With Touch and Pen Input”, Retrieved from: <http://www.gottabemobile.com/2009/08/07/touch-write-surface-computing-with-touch-and-pen-input/> on Jun. 19, 2012, (Aug. 7, 2009), 4 pages.
Sundstedt, Veronica “Gazing at Games: Using Eye Tracking to Control Virtual Characters”, In ACM SIGGRAPH 2010 Courses, retrieved from <http://www.tobii.com/Global/Analysis/Training/EyeTrackAwards/veronica_sundstedt.pdf> on Jun. 1, 2012, (Jul. 28, 2010), 85 pages.
Takamatsu, Seiichi et al., “Flexible Fabric Keyboard with Conductive Polymer-Coated Fibers”, In Proceedings of Sensors 2011, (Oct. 28, 2011), 4 pages.
Travis, Adrian et al., “Collimated Light from a Waveguide for a Display Backlight”, Optics Express, 19714, vol. 17, No. 22, retrieved from <http://download.microsoft.com/download/D/2/E/D2E425F8-CF3C-4C71-A4A2-70F9D4081007/OpticsExpressbacklightpaper.pdf> on Oct. 15, 2009, 6 pages.
Travis, Adrian et al., “The Design of Backlights for View-Sequential 3D”, retrieved from <http://download.microsoft.com/download/D/2/E/D2E425F8-CF3C-4C71-A4A2-70F9D4081007/Backlightforviewsequentialautostereo.docx> on Nov. 1, 2010, 4 pages.
Travis, Adrian R., et al., “Flat Projection for 3-D”, In Proceedings of the IEEE, vol. 94, Issue 3, Available at <http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1605201>, (Mar. 2006), pp. 539-549.
Valli, Alessandro “Notes on Natural Interaction”, retrieved from <http://www.idemployee.id.tue.nl/g.w.m.rauterberg/lecturenotes/valli-2004.pdf> on Jan. 5, 2012, (Sep. 2005), 80 pages.
Valliath, G. T., “Design of Hologram for Brightness Enhancement in Color LCDs”, Retrieved from <http://www.loreti.it/Download/PDF/LCD/44_05.pdf> on Sep. 17, 2012, 5 pages.
Vaucelle, Cati “Scopemate, A Robotic Microscope!”, Architectradure, retrieved from <http://architectradure.blogspot.com/2011/10/at-uist-this-monday-scopemate-robotic.html> on Jun. 6, 2012, (Oct. 17, 2011), 2 pages.
Williams, Jim “A Fourth Generation of LCD Backlight Technology”, Retrieved from <http://cds.linear.com/docs/Application%20Note/an65f.pdf>, (Nov. 1995), 124 pages.
Xu, Zhang et al., “Hand Gesture Recognition and Virtual Game Control Based on 3D Accelerometer and EMG Sensors”, IUI'09, Feb. 8-11, 2009, retrieved from <http://sclab.yonsei.ac.kr/courses/10TPR/10TPR.files/Hand%20Gestur%20Recognition%20and%20Virtual%20Game%20Control%20based%20on%203d%20accelerometer%20and%20EMG%20sensors.pdf> on Jan. 5, 2012, (Feb. 8, 2009), 5 pages.
Xu, Zhi-Gang et al., “Vision-based Detection of Dynamic Gesture”, ICTM'09, Dec. 5-6, 2009, retrieved from <http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=5412956> on Jan. 5, 2012, (Dec. 5, 2009), pp. 223-226.
Yan, Jin-Ren et al., “Edge-Lighting Light Guide Plate Based on Micro-Prism for Liquid Crystal Display”, Journal of Display Technology, vol. 5, No. 9, Available at <http://ieeexplore.ieee.org/ielx5/9425/5196834/05196835.pdf?tp=&arnumber=5196835&isnumber=5196834>, (Sep. 2009), pp. 355-357.
Yu, et al., “A New Driving Scheme for Reflective Bistable Cholesteric Liquid Crystal Displays”, Society for Information Display International Symposium Digest of Technical Papers, Retrieved from <http://www.ee.ust.hk/~eekwok/publications/1997/bcd_sid.pdf>, (May 1997), 4 pages.
Zhang, et al., “Model-Based Development of Dynamically Adaptive Software”, In Proceedings of ICSE 2006, Available at <http://www.irisa.fr/lande/lande/icse-proceedings/icse/p371.pdf>, (May 20, 2006), pp. 371-380.
Zhang, Rui “Design of Head Mounted Displays”, Retrieved at <http://www.optics.arizona.edu/optomech/student%20reports/2007/Desigen%20of%20mounteddisplays%20Zhang.pdf>, (Dec. 12, 2007), 6 pages.
Zhu, Dingyun et al., “Keyboard before Head Tracking Depresses User Success in Remote Camera Control”, In Proceedings of 12th IFIP TC 13 International Conference on Human-Computer Interaction, Part II, retrieved from <http://csiro.academia.edu/Departments/CSIRO_ICT_Centre/Papers?page=5> on Jun. 1, 2012, (Aug. 24, 2009), 14 pages.
“Advanced Configuration and Power Management Specification”, Intel Corporation, Microsoft Corporation, Toshiba Corp., Revision 1, (Dec. 22, 1996), 364 pages.
“Final Office Action”, U.S. Appl. No. 13/656,055, (Oct. 23, 2013), 14 pages.
“Final Office Action”, U.S. Appl. No. 13/938,930, (Nov. 8, 2013), 10 pages.
“Final Office Action”, U.S. Appl. No. 13/939,002, (Nov. 8, 2013), 7 pages.
“Final Office Action”, U.S. Appl. No. 13/939,032, (Dec. 20, 2013), 5 pages.
“FingerWorks Installation and Operation Guide for the TouchStream ST and TouchStream LP”, FingerWorks, Inc., Retrieved from <http://ec1.images-amazon.com/media/i3d/01/A/man-migrate/MANUAL000049862.pdf>, (2002), 14 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/040968, (Sep. 5, 2013), 12 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/045049, (Sep. 16, 2013), 9 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/042550, (Sep. 24, 2013), 14 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/043961, (Oct. 17, 2013), 11 pages.
“International Search Report”, Application No. PCT/US2010/045676, (Apr. 28, 2011), 2 pages.
“International Search Report”, Application No. PCT/US2010/046129, (Mar. 2, 2011), 3 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/371,725, (Nov. 7, 2013), 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/468,918, (Dec. 26, 2013), 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,232, (Dec. 5, 2013), 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/780,228, (Oct. 30, 2013), 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/939,002, (Dec. 20, 2013), 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/563,435, (Nov. 12, 2013), 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/565,124, (Dec. 24, 2013), 6 pages.
“Notice of Allowance”, U.S. Appl. No. 13/653,321, (Dec. 18, 2013), 41 pages.
“Notice to Grant”, CN Application No. 201320097124.7, (Oct. 8, 2013), 2 pages.
“Real-Time Television Content Platform”, retrieved from <http://www.accenture.com/us-en/pages/insight-real-time-television-platform.aspx> on Mar. 10, 2011, (May 28, 2002), 3 pages.
“Restriction Requirement”, U.S. Appl. No. 13/468,918, (Nov. 29, 2013), 6 pages.
“Welcome to Windows 7”, Retrieved from: <http://www.microsoft.com/en-us/download/confirmation.aspx?id=4984> on Aug. 1, 2013, (Sep. 16, 2009), 3 pages.
“What is the PD-Net Project About?”, retrieved from <http://pd-net.org/about/> on Mar. 10, 2011, 3 pages.
Kim, Min Su et al., “A Controllable Viewing Angle LCD with an Optically Isotropic Liquid Crystal”, Journal of Physics D: Applied Physics, vol. 43, No. 14, (Mar. 23, 2010), 7 pages.
Lee, C.M.G. “Flat-panel Backlight for View-sequential 3D Display”, Optoelectronics, IEE Proceedings, vol. 151, No. 6, IET, (Dec. 2004), 4 pages.
Prospero, Michael “Samsung Outs Series 5 Hybrid PC Tablet”, Retrieved from: <http://blog.laptopmag.com/samsung-outs-series-5-hybrid-pc-tablet-running-windows-8> on Oct. 31, 2013, (Jun. 4, 2012), 7 pages.
Travis, Adrian et al., “P-127: Linearity in Flat Panel Wedge Projection”, SID 03 Digest, retrieved from <http://www2.eng.cam.ac.uk/~arlt1/Linearity%20in%20flat%20panel%20wedge%20projection.pdf>, (May 12, 2005), pp. 716-719.
Yagi, Nobuyuki “The Concept of “AdapTV””, Series: The Challenge of “AdapTV”, Broadcast Technology, No. 28, (2006), pp. 16-17.
“Advisory Action”, U.S. Appl. No. 14/199,924, May 28, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/938,930, Jun. 6, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/939,002, May 22, 2014, 2 pages.
“Final Office Action”, U.S. Appl. No. 13/494,651, Jun. 11, 2014, 19 pages.
“Final Office Action”, U.S. Appl. No. 13/603,918, Mar. 21, 2014, 14 pages.
“Final Office Action”, U.S. Appl. No. 13/653,682, Jun. 11, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2014/013928, May 12, 2014, 17 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/045283, Mar. 12, 2014, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,030, May 15, 2014, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,054, Jun. 3, 2014, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/655,065, Apr. 24, 2014, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/975,087, May 8, 2014, 18 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,237, May 12, 2014, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 14/199,924, Jun. 10, 2014, 4 pages.
“Restriction Requirement”, U.S. Appl. No. 13/595,700, May 28, 2014, 6 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/651,232, Jul. 31, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/939,002, Jun. 19, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/939,032, Jun. 26, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/939,032, Jul. 15, 2014, 2 pages.
“Final Office Action”, U.S. Appl. No. 13/471,376, Aug. 18, 2014, 24 pages.
“Final Office Action”, U.S. Appl. No. 13/599,635, Aug. 8, 2014, 16 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2014/031531, Jun. 20, 2014, 10 Pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028483, Jun. 24, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028484, Jun. 24, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028485, Jun. 25, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028769, Jun. 26, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028771, Jun. 19, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028486, Jun. 20, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/041017, Jul. 17, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028489, Jun. 20, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028488, Jun. 24, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028767, Jun. 24, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028481, Jun. 19, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028490, Jun. 24, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028766, Jun. 26, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028772, Jun. 30, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028768, Jun. 24, 2014, 12 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028482, Jun. 20, 2014, 13 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028487, May 27, 2014, 9 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028770, Jun. 26, 2014, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/468,882, Jul. 9, 2014, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/468,949, Jun. 20, 2014, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/470,951, Jul. 2, 2014, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,001, Jun. 17, 2014, 23 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,412, Jul. 11, 2014, 22 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/525,070, Aug. 14, 2014, 24 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/564,520, Jun. 16, 2014, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/595,700, Jun. 18, 2014, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/647,479, Jul. 3, 2014, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,976, Jun. 16, 2014, 23 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/225,250, Jun. 17, 2014, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/225,276, Jun. 13, 2014, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/277,240, Jun. 13, 2014, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 13/468,918, Jun. 17, 2014, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,186, Jul. 3, 2014, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,405, Jun. 24, 2014, 9 pages.
“Final Office Action”, U.S. Appl. No. 13/595,700, Aug. 15, 2014, 6 pages.
Related Publications (1)
Number Date Country
20140012401 A1 Jan 2014 US
Provisional Applications (7)
Number Date Country
61606301 Mar 2012 US
61606313 Mar 2012 US
61606333 Mar 2012 US
61613745 Mar 2012 US
61606336 Mar 2012 US
61607451 Mar 2012 US
61606321 Mar 2012 US
Continuations (2)
Number Date Country
Parent 13651272 Oct 2012 US
Child 14018286 US
Parent 13471202 May 2012 US
Child 13651272 US