Device camera angle

Information

  • Patent Grant
  • Patent Number
    9,275,809
  • Date Filed
    Monday, May 14, 2012
  • Date Issued
    Tuesday, March 1, 2016
Abstract
Techniques for device camera angle are described. In one or more implementations, a camera is mounted in a computing device at an angle based on an orientation of the computing device. For example, when the computing device is positioned on a surface and at an angle to the surface (such as when supported by a kickstand), the mounting angle of the camera is such that an optical axis of the camera points forward, and not towards the surface. In at least some implementations, a computing device includes a camera that is physically adjustable to support different orientations of the computing device. In at least some implementations, images that are captured via a camera on a computing device can be manipulated based on an orientation of the computing device.
Description
BACKGROUND

Mobile computing devices have been developed to increase the functionality that is made available to users in a mobile setting. For example, a user may interact with a mobile phone, tablet computer, or other mobile computing device to check email, surf the web, compose texts, interact with applications, and so on.


Many mobile computing devices include an integrated camera. Such devices are typically held at a particular angle in order for an integrated camera to capture an image. Thus, images can be cut-off or out-of-focus if the device is not held or positioned at the correct angle relative to an object being photographed.


SUMMARY

This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.


Techniques for device camera angle are described. In one or more implementations, a computing device includes a kickstand that can support the computing device on a surface. For example, the kickstand can be opened to a particular position, and the computing device can be positioned on a surface (e.g., a table, a desk, and so on) such that a user can interact with the computing device. A user, for instance, can provide input to the computing device via an attached input device. Further, a user can view and/or interact with a display device included on the computing device.


In at least some embodiments, a camera is mounted in a computing device at an angle based on an orientation of the computing device. For example, when the computing device is positioned on a surface and at an angle to the surface (such as when supported by a kickstand), the mounting angle of the camera is such that the camera points forward, and not towards the surface. For instance, consider a scenario where the computing device is placed on a table in a room at a preset angle supported by a kickstand, such that a user sitting at the table can view a display on the computing device. The camera can be mounted in the computing device on a surface opposite the display device, such that the field of view of the camera points away from the display device. Further, the camera is mounted at an angle in the computing device such that the user can capture images (e.g., still images, video, and so on) of objects in the room, such as other persons sitting at the table, a whiteboard on a wall, and so forth. Thus, the field of view of the camera can be perpendicular to the table such that the camera is not simply pointing down at the table. In implementations, this can provide a “tripod experience” whereby a computing device that includes a camera can be supported by a kickstand, and the camera is angled such that images of surrounding objects can be captured, e.g., recorded.


In at least some implementations, a computing device includes a camera that is physically adjustable independent of the computing device to support different orientations of the computing device. Components of the camera, for instance, can be tilted, rotated, and/or panned based on a detected orientation of the computing device. This can enable a field of view of the camera to be adjusted to enable images of objects to be captured in different orientations of the computing device.


In at least some implementations, images that are captured via a camera on a computing device can be manipulated based on an orientation of the computing device. For example, various types of image enhancement and/or image correction can be applied to image data to account for phenomena that may arise when images are captured at particular angles, such as low light, image distortion, and so on.





BRIEF DESCRIPTION OF THE DRAWINGS

The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items. Entities represented in the figures may be indicative of one or more entities and thus reference may be made interchangeably to single or plural forms of the entities in the discussion.



FIG. 1 is an illustration of an environment in an example implementation that is operable to employ the techniques described herein.



FIG. 2 depicts an example implementation of an input device of FIG. 1 as showing a flexible hinge in greater detail.



FIG. 3 depicts an example orientation of the computing device in accordance with one or more embodiments.



FIG. 4 depicts an example orientation of the computing device in accordance with one or more embodiments.



FIG. 5 depicts an example orientation of the computing device in accordance with one or more embodiments.



FIG. 6 depicts an example orientation of the computing device in accordance with one or more embodiments.



FIG. 7 depicts an example orientation of the computing device in accordance with one or more embodiments.



FIG. 8 depicts an example camera assembly in accordance with one or more embodiments.



FIG. 9 depicts an example camera assembly in accordance with one or more embodiments.



FIG. 10 depicts an example implementation scenario in accordance with one or more embodiments.



FIG. 11 depicts an example implementation scenario in accordance with one or more embodiments.



FIG. 12 is a flow diagram that describes steps in a method in accordance with one or more embodiments.



FIG. 13 illustrates an example system including various components of an example device that can be implemented as any type of computing device as described with reference to FIGS. 1-12 to implement embodiments of the techniques described herein.





DETAILED DESCRIPTION

Overview


Techniques for device camera angle are described. In one or more implementations, a computing device includes a kickstand that can support the computing device on a surface. For example, the kickstand can be opened to a particular position, and the computing device can be positioned on a surface (e.g., a table, a desk, and so on) such that a user can interact with the computing device. A user, for instance, can provide input to the computing device via an attached input device. Further, a user can view and/or interact with a display device included on the computing device.


In at least some embodiments, a camera is mounted in a computing device at an angle based on an orientation of the computing device. For example, when the computing device is positioned on a surface and at an angle to the surface (such as when supported by a kickstand), the mounting angle of the camera is such that the camera points forward, and not towards the surface. For instance, consider a scenario where the computing device is placed on a table in a room at a preset angle supported by a kickstand, such that a user sitting at the table can view a display on the computing device. The camera can be mounted in the computing device on a surface opposite the display device, such that the field of view of the camera points away from the display device. Further, the camera is mounted at an angle in the computing device such that the user can capture images (e.g., still images, video, and so on) of objects in the room, such as other persons sitting at the table, a whiteboard on a wall, and so forth. Thus, the field of view of the camera can be perpendicular to the table such that the camera is not simply pointing down at the table. In implementations, this can provide a “tripod experience” whereby a computing device that includes a camera can be supported by a kickstand, and the camera is angled such that images of surrounding objects can be captured, e.g., recorded.


In at least some implementations, a computing device includes a camera that is physically adjustable to support different orientations of the computing device. Components of the camera, for instance, can be tilted, rotated, and/or panned based on a detected orientation of the computing device. This can enable a field of view of the camera to be adjusted to enable images of objects to be captured in different orientations of the computing device.


In at least some implementations, images that are captured via a camera on a computing device can be manipulated based on an orientation of the computing device. For example, various types of image enhancement and/or correction can be applied to image data to account for phenomena that may arise when images are captured at particular angles, such as low light, image distortion, and so on.


In the following discussion, an example environment is first described that may employ techniques described herein. Next, a section entitled “Example Device Orientations” describes some example mobile device orientations in accordance with one or more embodiments. Following this, a section entitled “Example Camera Assembly” describes some example camera assemblies and camera components in accordance with one or more embodiments. Next, an example procedure is described which may be performed in the example environment as well as other environments. Consequently, performance of the example procedure is not limited to the example environment and the example environment is not limited to performance of the example procedure. Finally, an example system and device are described in which embodiments may be implemented in accordance with one or more embodiments. Further, although an input device is described, other devices are also contemplated that do not include input functionality, such as covers.


Example Environment



FIG. 1 is an illustration of an environment 100 in an example implementation that is operable to employ techniques described herein. The illustrated environment 100 includes an example of a computing device 102 that is physically and communicatively coupled to an input device 104 via a flexible hinge 106. The computing device 102 may be configured in a variety of ways. For example, the computing device 102 may be configured for mobile use, such as a mobile phone, a tablet computer as illustrated, and so on. Thus, the computing device 102 may range from full-resource devices with substantial memory and processor resources to low-resource devices with limited memory and/or processing resources. The computing device 102 may also relate to software that causes the computing device 102 to perform one or more operations.


The computing device 102, for instance, is illustrated as including an input/output module 108. The input/output module 108 is representative of functionality relating to processing of inputs and rendering outputs of the computing device 102. A variety of different inputs may be processed by the input/output module 108, such as inputs relating to functions that correspond to keys of the input device 104 or keys of a virtual keyboard displayed by the display device 110, as well as gestures that may be recognized through the input device 104 and/or touchscreen functionality of the display device 110 and that cause corresponding operations to be performed. Thus, the input/output module 108 may support a variety of different input techniques by recognizing and leveraging a division between types of inputs including key presses, gestures, and so on.


In the illustrated example, the input device 104 is configured as having an input portion that includes a keyboard having a QWERTY arrangement of keys and a track pad, although other arrangements of keys are also contemplated. Further, other non-conventional configurations are also contemplated, such as a game controller, a configuration that mimics a musical instrument, and so forth. Thus, the input device 104 and keys incorporated by the input device 104 may assume a variety of different configurations to support a variety of different functionality.


As previously described, the input device 104 is physically and communicatively coupled to the computing device 102 in this example through use of a flexible hinge 106. The flexible hinge 106 is flexible in that rotational movement supported by the hinge is achieved through flexing (e.g., bending) of the material forming the hinge as opposed to mechanical rotation as supported by a pin, although that embodiment is also contemplated. Further, this flexible rotation may be configured to support movement in one or more directions (e.g., vertically in the figure) yet restrict movement in other directions, such as lateral movement of the input device 104 in relation to the computing device 102. This may be used to support consistent alignment of the input device 104 in relation to the computing device 102, such as to align sensors used to change power states, application states, and so on.


The flexible hinge 106, for instance, may be formed using one or more layers of fabric and include conductors formed as flexible traces to communicatively couple the input device 104 to the computing device 102 and vice versa. This communication, for instance, may be used to communicate a result of a key press to the computing device 102, receive power from the computing device, perform authentication, provide supplemental power to the computing device 102, and so on. The flexible hinge 106 may be configured in a variety of ways in accordance with one or more embodiments.


The computing device 102 further includes an orientation module 112, which is representative of functionality to determine a positional orientation of the computing device 102. For example, the orientation module 112 can utilize orientation information received from one or more orientation sensors 114. The orientation sensors 114 are representative of functionality to detect types of orientation information for the computing device 102, such as angles relative to gravity, relative tilt, angle relative to earth's magnetic field, and so forth. Examples of the orientation sensors 114 include an accelerometer, magnetometer, tilt sensor, inclinometer, and so on. A variety of other types of orientation sensors may additionally or alternatively be employed, however.


The orientation module 112 can utilize the orientation information to determine a relative orientation of the computing device 102. The relative orientation, for instance, can indicate an angle at which the computing device 102 is tilted, such as with reference to the ground, e.g., earth's gravitational field. Orientation information can be leveraged to perform various tasks, examples of which are discussed above and below.
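The tilt determination described above can be sketched as a short example. This is an illustration only, not the described implementation: the function name and the sensor-axis convention (z axis normal to the display) are assumptions.

```python
import math

def tilt_angle_from_accelerometer(ax, ay, az):
    """Estimate device tilt relative to gravity from a 3-axis
    accelerometer reading (assumed axes: z normal to the display).

    Returns the angle in degrees between the display normal and the
    gravitational vector: 0 means the device lies flat on its back,
    90 means the display stands upright.
    """
    # Magnitude of the measured acceleration; at rest this is ~1 g.
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0:
        raise ValueError("no acceleration signal")
    # Angle between the display normal (z axis) and gravity,
    # clamped to acos's valid input range to guard rounding.
    return math.degrees(math.acos(max(-1.0, min(1.0, az / g))))
```

For instance, a reading of (0, 0, 1) g yields 0 degrees (device flat), while (0, 1, 0) g yields 90 degrees (device upright).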


A camera assembly 116 is included, which is representative of functionality to record images, such as still images, video, and so on. The camera assembly 116 can include various image capture components, such as a lens, a mirror, an electronic image sensor, and so on. The camera assembly 116 can also include structural components employed to mount image capture components into the computing device 102, such as a component carrier in which the image capture components can be installed. The component carrier can enable the image capture components to be securely mounted in the computing device 102. In at least some embodiments, the component carrier can also enable various adjustments to be made to angles at which images are captured, as detailed below.


The computing device 102 also includes a camera module 118, which is representative of functionality to perform various operations related to image capture and image adjustment. The camera module 118 can also cause adjustments to be made to various components of the camera assembly 116. The camera module 118, for instance, can utilize orientation information received from the orientation module 112 and/or the orientation sensors 114. The camera module 118 can leverage the orientation information to perform various operations, such as adjusting components of the camera assembly 116 to account for orientation of the computing device 102, image manipulation based on orientation of the computing device 102, and so forth. Examples of such operations are detailed below.



FIG. 2 depicts an example implementation 200 of the input device 104 of FIG. 1 as showing the flexible hinge 106 in greater detail. In this example, a connection portion 202 of the input device is shown that is configured to provide a communicative and physical connection between the input device 104 and the computing device 102. The connection portion 202 as illustrated has a height and cross section configured to be received in a channel in the housing of the computing device 102, although this arrangement may also be reversed without departing from the spirit and scope thereof.


The connection portion 202 is flexibly connected to a portion of the input device 104 that includes the keys through use of the flexible hinge 106. Thus, when the connection portion 202 is physically connected to the computing device 102, the combination of the connection portion 202 and the flexible hinge 106 supports movement of the input device 104 in relation to the computing device 102 that is similar to a hinge of a book.


The connection portion 202 is illustrated in this example as including magnetic coupling devices 204, 206, mechanical coupling protrusions 208, 210, and communication contacts 212. The magnetic coupling devices 204, 206 are configured to magnetically couple to complementary magnetic coupling devices of the computing device 102 through use of one or more magnets. In this way, the input device 104 may be physically secured to the computing device 102 through use of magnetic attraction.


The connection portion 202 also includes mechanical coupling protrusions 208, 210 to form a mechanical physical connection between the input device 104 and the computing device 102. The communication contacts 212 are configured to contact corresponding communication contacts of the computing device 102 to form a communicative coupling between the devices as shown.


Having discussed an example environment in which embodiments may operate, consider now some example device orientations in accordance with one or more embodiments.


Example Device Orientations


The following discussion presents some example device orientations in accordance with various embodiments.



FIG. 3 illustrates an example orientation 300 of the computing device 102. In the orientation 300, the input device 104 is laid flat against a surface 302 and the computing device 102 is disposed at an angle to permit viewing of the display device 110, e.g., such as through use of a kickstand 304 disposed on a rear surface of the computing device 102. The orientation 300 can correspond to a typing arrangement whereby input can be received via the input device 104, such as using keys of the keyboard, a track pad, and so forth. For instance, the surface 302 can correspond to any suitable surface on which the computing device 102 and/or the input device 104 can be placed, such as a desk, a table, a floor, and so forth.


In at least some embodiments, the kickstand 304 can be configured to open to various preset positions. The preset positions, for instance, can correspond to angles with reference to a rear surface 306 of the computing device 102. In the illustrated example, the kickstand 304 is open to a preset position that corresponds to an angle 308 with reference to the rear surface 306. The angle 308 can be selected from a range of different angles. The angle 308, for instance, can include an angle between 20 degrees and 30 degrees (20°-30°).


Further to the example illustrated in FIG. 3, the computing device 102 includes the camera assembly 116. As mentioned above, the camera assembly 116 can include various components, such as a lens, a sensor, mirrors, a prism, and so forth. In at least some implementations, a field of view of the camera assembly faces away from the display device 110, such that a user who is interacting with the computing device 102 and/or the input device 104 can capture images of objects that the user is facing.


In at least some implementations, components of the camera assembly 116 can be mounted in the computing device 102 at an angle based on a tilt angle of the computing device 102. For instance, components of the camera assembly 116 can be mounted at an angle such that when the computing device is placed in the orientation 300, a field of view of the camera assembly 116 is substantially perpendicular to the surface 302, e.g., within 10 degrees (10°). The angle of the camera assembly 116, for example, can be such that in the orientation 300, an optical axis 310 of the camera assembly 116 is substantially parallel (e.g., within 10 degrees (10°)) to the surface 302.


For example, consider that the angle 308 of the kickstand 304 is such that the rear surface 306 is at an angle of 65 degrees (65°) to the surface 302. In this example, the camera assembly 116 can be angled in the computing device 102 such that the optical axis 310 is at an angle of 115 degrees to the rear surface 306 to enable the optical axis to be substantially parallel to the surface 302. Thus, in at least some embodiments, the camera assembly 116 can be mounted at an angle such that an angle of the optical axis 310 with respect to the rear surface 306 is supplementary to an angle of the rear surface 306 with respect to the surface 302.
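The supplementary-angle relationship above can be expressed as a small helper. This is an illustrative sketch; the function name is hypothetical and not part of the described implementations.

```python
def camera_mount_angle(rear_surface_angle_deg):
    """Angle of the optical axis with respect to the rear surface that
    keeps the axis parallel to the support surface.

    rear_surface_angle_deg: angle of the rear surface to the support
    surface (65 degrees in the example above). The two angles are
    supplementary, i.e. they sum to 180 degrees.
    """
    return 180.0 - rear_surface_angle_deg
```

With the 65-degree rear-surface angle from the example, this yields the 115-degree mounting angle for the optical axis.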


Additionally or alternatively, the camera assembly 116 can be adjustable to compensate for various orientations and/or angles of the computing device 102. For instance, consider the example illustrated in FIG. 4, where the computing device 102 is positioned in the orientation 300 discussed above.


In this example, an orientation of the computing device 102 is determined. For example, the orientation sensors 114 can detect that the computing device 102 is tilted at an angle 400 with reference to gravity, e.g., a gravitational vector 402. The orientation module 112 can receive this orientation information from the orientation sensors 114, and can perform various operations based on the orientation information. For instance, the orientation module 112 can cause one or more components of the camera assembly 116 to be physically adjusted based on the angle 400. The orientation module 112, for example, can cause one or more components of the camera assembly 116 to be tilted, panned, and so forth, such that the optical axis 310 is perpendicular to the gravitational vector 402. Additionally or alternatively, a variety of other adjustments can be made as well within the spirit and scope of the disclosed embodiments.
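The adjustment described above can be sketched as follows. This is an illustration under stated assumptions: the function names and the actuator travel limit are hypothetical, and a real motor assembly would involve additional control logic.

```python
def camera_tilt_correction(optical_axis_to_gravity_deg):
    """Degrees to tilt the camera carrier so the optical axis ends up
    perpendicular (90 degrees) to the gravitational vector, i.e.
    level with the ground. Positive means tilt the axis upward."""
    return 90.0 - optical_axis_to_gravity_deg

def clamp_to_actuator_range(correction_deg, max_travel_deg=15.0):
    """A physical tilt mechanism has limited travel (the 15-degree
    limit here is a hypothetical value); saturate the request."""
    return max(-max_travel_deg, min(max_travel_deg, correction_deg))
```

For example, if the optical axis is measured at 80 degrees to the gravitational vector, a correction of +10 degrees levels it; a requested correction of 25 degrees would be clamped to the assumed 15-degree travel limit.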


Components of the camera assembly 116 may also be adjustable based on an angle of the kickstand 304. For instance, the orientation module 112 can detect that the kickstand 304 is opened to a particular position. A hinge assembly that enables rotation of the kickstand 304, for example, can include a sensor mechanism that can detect an angle at which the kickstand 304 is disposed. Based on position of the kickstand 304, components of the camera assembly 116 can be tilted, panned, and so forth.


Orientation information can also be leveraged to perform various types of image processing. For instance, the camera module 118 can receive orientation information from the orientation module 112 and/or the orientation sensors 114. The camera module 118 can use the orientation information to perform image processing on a captured image, such as image correction to compensate for image distortion caused by an angle of the camera assembly 116 to an object being captured.
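One kind of angle-dependent distortion is keystoning, where the edge of the frame farther from the subject appears smaller. The sketch below approximates a per-row magnification under a simplified pinhole model; it is an illustration only, and the function name, field-of-view default, and geometric model are assumptions rather than the described image-correction functionality.

```python
import math

def keystone_row_scales(num_rows, tilt_deg, fov_deg=60.0):
    """Approximate horizontal magnification per sensor row for a
    camera whose optical axis is tilted tilt_deg away from a
    fronto-parallel subject plane (simplified pinhole model).

    Rows imaging the 'far' part of the subject appear smaller and
    must be scaled up to compensate. Returns one scale factor per
    row, normalized so the center row has scale 1.0.
    """
    half_fov = math.radians(fov_deg) / 2.0
    tilt = math.radians(tilt_deg)
    scales = []
    for r in range(num_rows):
        # Ray angle of this row relative to the optical axis, -half_fov..+half_fov.
        frac = (r / (num_rows - 1)) * 2.0 - 1.0
        ray = frac * half_fov
        # Relative subject distance along this ray vs. the center ray.
        scales.append(math.cos(ray) / math.cos(ray + tilt) * math.cos(tilt))
    return scales
```

With zero tilt every row has scale 1.0; with the camera tilted, rows on one side of center are scaled up and rows on the other side scaled down, which a correction pass could use to resample each row.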



FIG. 5 illustrates that the input device 104 may be rotated such that the input device 104 is placed against the display device 110 of the computing device 102 to assume an orientation 500. In the orientation 500, the input device 104 may act as a cover such that the input device 104 can protect the display device 110 from harm. In implementations, the orientation 500 can correspond to a closed position of the computing device 102.


In the orientation 500, while the display device 110 may not be visible, the camera assembly 116 may nonetheless be used to capture images of objects. Further, techniques discussed herein may be employed to determine an orientation of the computing device 102, and to adjust the camera assembly 116 and/or images based on the orientation.



FIG. 6 illustrates a further example orientation of the computing device 102, generally at 600. In the orientation 600, the computing device 102 is placed on a surface 602 and is oriented such that the display device 110 faces away from the input device 104. In this example, the kickstand 304 can support the computing device 102, such as via contact with a back surface of the input device 104. Although not expressly illustrated here, a cover can be employed to cover and protect a front surface of the input device 104 from the surface 602.


Further to the example illustrated in FIG. 6, the camera assembly 116 can be angled as discussed above. For example, the camera assembly 116 can be angled such that the optical axis 310 is parallel to the surface 602. Additionally or alternatively, an orientation of the computing device 102 can be determined and leveraged to adjust components of the camera assembly 116, to perform image processing, and so forth.



FIG. 7 illustrates an example orientation 700, in which the input device 104 may also be rotated so as to be disposed against a back of the computing device 102, e.g., against a rear housing of the computing device 102 that is disposed opposite the display device 110 on the computing device 102. In this example, the flexible hinge 106 is caused to “wrap around” to position the input device 104 at the rear of the computing device 102.


This wrapping causes a portion of a rear of the computing device 102 to remain exposed. This may be leveraged for a variety of functionality, such as to permit the camera assembly 116 to be used even though a significant portion of the rear of the computing device 102 is covered by the input device 104.


The orientation 700 can enable a variety of uses for the computing device 102. For instance, the orientation 700 can correspond to a handheld position of the computing device. In the handheld position, a user can grasp the computing device 102 in the orientation 700, and use the computing device to capture images of objects via the camera assembly 116. Thus, a user can point the camera assembly 116 toward an object to cause an image of the object to be displayed via the display device 110. The user can then activate functionality of the camera assembly 116 to capture an image of the object, such as by actuating a touch screen button displayed on the display device 110, pressing a button on the computing device 102 and/or the input device 104, and so on. Thus, the display device 110 can function as a preview display for images that can be captured via the camera assembly 116.


Further to the example illustrated in FIG. 7, the camera assembly 116 can be angled as discussed above. For example, the camera assembly 116 can be angled such that the optical axis 310 is parallel to the ground, perpendicular to the gravitational vector 402, and so on. Additionally or alternatively, an orientation of the computing device 102 can be determined and leveraged to adjust components of the camera assembly 116, to perform image processing, and so forth.


The example orientations discussed above are presented for purpose of example only, and techniques discussed herein can be implemented to enable images to be captured in a wide variety of different device orientations. Further, although the camera assembly 116 is illustrated in a particular position and orientation with reference to the computing device 102, this is not intended to be limiting. The camera assembly 116 can be oriented in a wide variety of different positions on the computing device 102 within the spirit and scope of the claimed embodiments. In at least some embodiments, for instance, the camera assembly 116 can include a front facing camera, e.g., a camera whose field of view faces the same direction as the display device 110. Further, the computing device 102 can employ multiple cameras that can capture different fields of view, e.g., multiple implementations of the camera assembly 116. For instance, both a front facing and a rear facing camera can be employed.


Having discussed some example device orientations, consider now some example camera assemblies in accordance with one or more embodiments.


Example Camera Assembly



FIG. 8 illustrates an example implementation of the camera assembly 116. Included as part of the camera assembly 116 is a carrier 800, which contains a sensor 802 and an optical intake 804. The carrier 800 is a mechanism that contains components of the camera assembly 116 and enables the components to be mounted in various configurations in the computing device 102. In implementations, the carrier 800 can be adjustably mounted in the computing device 102, such that the carrier 800 can be tilted, panned, rotated, and so forth. For example, the carrier 800 can be attached to a motor assembly that enables adjustment of the carrier 800 and/or components of the camera assembly 116 within the computing device 102.


The sensor 802 is representative of a device that can receive an optical image, and can convert the optical image into an electronic signal. Examples of the sensor 802 include a digital charge-coupled device (CCD), a complementary metal-oxide-semiconductor (CMOS) active pixel sensor, and so forth. Images converted by the sensor 802 can be utilized by other components and/or functionalities of the computing device 102, such as displayed via the display device 110, stored in memory, and so forth.


The optical intake 804 receives light externally from the computing device 102, and focuses the light on the sensor 802 to form an optical image on the sensor 802. The optical intake 804 can include a variety of components, such as different configurations and/or combinations of a lens, a prism, a mirror, and so forth. In at least some embodiments, the optical intake 804 is configured to focus light on particular portions of the sensor 802. Which portion of the sensor 802 receives the light can depend on an angle at which the computing device 102 is tilted, an angle at which the carrier 800 is tilted, and so forth.



FIG. 9 illustrates the camera assembly 116 in a partial view of the computing device 102. As shown in FIG. 9, the camera assembly 116 can be mounted at an angle in the computing device 102, such as with respect to the display device 110, the rear surface 306, and so on. Additionally or alternatively, the camera assembly can be physically adjustable in the computing device 102, such as via tilting, panning, rotating, and so on. For instance, the carrier 800 can be mounted on one or more axes, about which the carrier 800 can be manipulated to cause the camera assembly 116 to be angled in different directions.



FIG. 10 illustrates an example scenario 1000, in which a region of the sensor 802 that is utilized to capture an image is based on a tilt angle of the computing device 102. In the upper portion of the scenario 1000, the computing device 102 is tilted at an angle 1002. The angle 1002, for instance, can be an angle of a plane formed by the display device 110, with reference to a gravitational vector 1004 detected via the orientation sensors 114.


In the lower portion of the scenario 1000, an image tile 1006 is defined for the sensor 802 based on the angle 1002. In at least some implementations, the sensor 802 can be mapped to determine which portion(s) of the sensor 802 to use to generate image data based on tilt angles of the computing device 102, the camera assembly 116, and so forth. In some orientations, for instance, the angle of incident light on the optical intake 804 can be such that light that passes through the optical intake 804 can focus on sub-portions of the sensor 802. This can enable a sensor to be divided into sub-portions (e.g., the image tile 1006) that are used to generate images based on determined angles of orientation. Additionally or alternatively, a sub-portion of the sensor 802 to be used to capture an image can be calculated on the fly, such as based on an angle of orientation, external light levels, resolution settings for the camera assembly 116, and so forth.
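The mapping from tilt angle to sensor sub-portion described above can be sketched as a simple lookup, where each range of tilt angles selects a predefined rectangular image tile. This is a minimal illustrative sketch, not the patented implementation; the sensor resolution, angle ranges, and tile geometry are all assumptions.

```python
# Hypothetical sketch: select a sensor sub-region ("image tile") based on
# the device's tilt angle. All numeric values are illustrative assumptions.

SENSOR_WIDTH, SENSOR_HEIGHT = 3264, 2448  # example sensor resolution

# Predetermined mapping from tilt-angle ranges (degrees from vertical)
# to (x, y, width, height) sub-regions of the sensor.
ANGLE_TILE_MAP = [
    ((0, 15),  (0, 0, SENSOR_WIDTH, SENSOR_HEIGHT)),  # near vertical: full sensor
    ((15, 35), (0, 306, SENSOR_WIDTH, 1836)),         # moderate tilt: central band
    ((35, 60), (0, 612, SENSOR_WIDTH, 1224)),         # steep tilt: narrower band
]

def select_image_tile(tilt_deg):
    """Return the sensor sub-region to use at the given tilt angle."""
    for (lo, hi), tile in ANGLE_TILE_MAP:
        if lo <= tilt_deg < hi:
            return tile
    return ANGLE_TILE_MAP[-1][1]  # fall back to the steepest-tilt tile
```

A tile could equally be computed on the fly from the measured angle rather than looked up, matching the alternative noted above.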


Mapping the sensor 802, for instance, can include determining a threshold optical signal-to-noise ratio (SNR) to be used to capture images. For example, image data received from the sensor 802 that exceeds the threshold SNR can be utilized to capture an image, while image data that does not exceed the threshold SNR can be ignored. Alternatively, image data that does not exceed the threshold SNR can be processed to increase the quality of a resulting image, such as using noise reduction techniques, light enhancement techniques, and so on.
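The SNR-threshold test described above can be sketched as follows. How the signal and noise are estimated is not specified in the text, so this sketch assumes a simple mean-over-standard-deviation estimate with a noise floor; the function names and the 20 dB threshold are illustrative assumptions.

```python
import math

def region_snr(pixels, noise_floor=2.0):
    """Estimate a simple SNR (in dB) for a region's pixel samples.

    Signal is taken as the mean pixel value; noise as the standard
    deviation, floored to avoid division by zero. Both choices are
    illustrative assumptions, not the patented method.
    """
    mean = sum(pixels) / len(pixels)
    var = sum((p - mean) ** 2 for p in pixels) / len(pixels)
    noise = max(math.sqrt(var), noise_floor)
    return 20 * math.log10(max(mean, 1e-9) / noise)

def usable(pixels, threshold_db=20.0):
    """True if the region's estimated SNR exceeds the threshold."""
    return region_snr(pixels) > threshold_db
```

Regions that fail the test would then either be ignored or routed through noise-reduction processing, per the alternatives above.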


Further to mapping the sensor 802, focus regions (e.g., image tiles) of the sensor 802 that correspond to particular orientation angles can be predetermined by measuring light intensity (e.g., signal intensity) on different regions of the sensor 802 when the computing device 102 is oriented at different angles. Regions that exceed a threshold light intensity can be used to capture an image, such as by defining image tiles within regions of the sensor 802 that receive focused light at and/or above the threshold light intensity.
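The calibration step described above can be sketched as building a lookup from measured intensities: orient the device at each angle, sample light intensity per sensor region, and record the regions that meet the threshold. The data shapes and names here are illustrative assumptions.

```python
def calibrate_tiles(measurements, intensity_threshold):
    """Build an angle -> usable-region map from calibration data.

    `measurements` maps an orientation angle to a dict of
    {region_id: measured_light_intensity}, as gathered by orienting
    the device at each angle and sampling the sensor. Regions at or
    above the threshold become the image tile for that angle.
    """
    tile_map = {}
    for angle, regions in measurements.items():
        tile_map[angle] = sorted(
            rid for rid, intensity in regions.items()
            if intensity >= intensity_threshold
        )
    return tile_map
```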


Thus, the image tile 1006 corresponds to a portion of the sensor 802 that is used to capture an image when the computing device 102 is positioned at the angle 1002. Further, data generated from regions of the sensor 802 that are external to the image tile 1006 can be ignored, or processed to enhance image quality. If the computing device 102 is tilted to a different angle, a different image tile can be determined. For instance, consider the following example.



FIG. 11 illustrates an example scenario 1100, in which an image tile 1102 is defined based on an angle of orientation of the computing device 102. The computing device 102, for instance, can be positioned at an angle 1104 with reference to a gravitational vector 1106. The computing device 102 is thus positioned at a different orientation than described above with reference to FIG. 10, and the image tile 1102 is accordingly defined at a different region of the sensor 802 than was the image tile 1006 described in FIG. 10. Thus, different portions of the sensor 802 can be used to capture images based on an angle of orientation of the computing device 102, of the camera assembly 116, and so forth.


Example Procedure



FIG. 12 is a flow diagram that describes steps in a method in accordance with one or more embodiments. In at least some embodiments, the method can be employed to adjust camera functionality of a computing device based on an orientation of the computing device.


Step 1200 ascertains an orientation of a computing device. For example, an orientation of the computing device 102 relative to earth's gravity (e.g., a gravitational vector) can be determined. In implementations, this can include determining an angle at which the computing device 102 is oriented with reference to earth's gravity. As referenced above, however, a variety of different techniques can be employed to ascertain an orientation of a computing device.
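One common way to ascertain orientation relative to gravity is from an accelerometer reading, which approximates the gravitational vector in the device's frame. The sketch below computes the angle between an assumed display normal (the z axis) and that vector; the axis convention and function name are illustrative assumptions.

```python
import math

def tilt_from_gravity(ax, ay, az):
    """Estimate device tilt (degrees) from an accelerometer reading.

    The reading (ax, ay, az) approximates the gravitational vector in
    the device's frame. Tilt is taken as the angle between the display
    normal (assumed to be the z axis) and gravity; this convention is
    an illustrative assumption.
    """
    mag = math.sqrt(ax * ax + ay * ay + az * az)
    if mag == 0:
        raise ValueError("no gravity signal")
    # Angle between the z axis and the gravity vector, clamped for safety.
    return math.degrees(math.acos(max(-1.0, min(1.0, az / mag))))
```

A device lying flat (gravity along z) yields 0 degrees; a device standing upright yields 90 degrees.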


Step 1202 adjusts a camera component of the computing device based on the orientation. For instance, one or more of the carrier 800, the sensor 802, and/or the optical intake 804 can be physically tilted, panned, rotated, and so forth, based on an angle of orientation of the computing device 102. As referenced above, a variety of different types of mechanisms can be used to accomplish such adjustment. For instance, a motor can be attached to an axis of the carrier 800, and can rotate the carrier 800 to enable various components of the camera assembly 116 to be positioned at different angles.


Step 1204 manipulates image data for an image captured via the camera component based on the orientation. For instance, various types of image corrections and/or image enhancements can be applied to image data based on the orientation. In an example implementation, for instance, a specific region of the sensor 802 can be associated with low light levels at particular orientations of the computing device 102. Thus, when the computing device 102 is in such orientations, light enhancement and/or light correction techniques can be applied to image data received from the region. As another example, a specific region of the sensor 802 can be associated with image distortion (e.g., barrel distortion, pincushion distortion, and so forth) at particular orientations of the computing device 102. Thus, when the computing device 102 is in such orientations, image data correction techniques can be applied to image data received from the region to correct for the image distortion.
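The region- and orientation-dependent corrections described above can be sketched as a dispatch table: each entry pairs an orientation range and a sensor region with a correction routine to apply. The table structure, names, and the gain-boost stand-in for light enhancement are all illustrative assumptions.

```python
def correct_image(image_regions, orientation_deg, correction_table):
    """Apply per-region corrections selected by device orientation.

    `image_regions` maps region ids to pixel data. `correction_table`
    maps ((lo, hi), region_id) keys to a correction function, e.g. a
    gain boost standing in for light enhancement, or a distortion-
    correction routine. All names here are illustrative.
    """
    corrected = dict(image_regions)
    for ((lo, hi), region_id), fix in correction_table.items():
        if lo <= orientation_deg < hi and region_id in corrected:
            corrected[region_id] = fix(corrected[region_id])
    return corrected

# Example: boost a low-light bottom region only at steep orientations.
boost = lambda px: [min(255, p * 2) for p in px]
table = {((30, 60), "bottom"): boost}
```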


In implementations, steps 1200, 1202, and 1204 can occur together, sequentially, alternatively, and so on.


Example System and Device



FIG. 13 illustrates an example system generally at 1300 that includes an example computing device 1302 that is representative of one or more computing systems and/or devices that may implement the various techniques described herein. The computing device 1302 may, for example, be configured to assume a mobile configuration through use of a housing formed and sized to be grasped and carried by one or more hands of a user, illustrated examples of which include a mobile phone, mobile game and music device, and tablet computer, although other examples are also contemplated.


The example computing device 1302 as illustrated includes a processing system 1304, one or more computer-readable media 1306, and one or more I/O interfaces 1308 that are communicatively coupled, one to another. Although not shown, the computing device 1302 may further include a system bus or other data and command transfer system that couples the various components, one to another. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures. A variety of other examples are also contemplated, such as control and data lines.


The processing system 1304 is representative of functionality to perform one or more operations using hardware. Accordingly, the processing system 1304 is illustrated as including hardware elements 1310 that may be configured as processors, functional blocks, and so forth. This may include implementation in hardware as an application specific integrated circuit or other logic device formed using one or more semiconductors. The hardware elements 1310 are not limited by the materials from which they are formed or the processing mechanisms employed therein. For example, processors may be comprised of semiconductor(s) and/or transistors (e.g., electronic integrated circuits (ICs)). In such a context, processor-executable instructions may be electronically-executable instructions.


The computer-readable storage media 1306 is illustrated as including memory/storage 1312. The memory/storage 1312 represents memory/storage capacity associated with one or more computer-readable media. The memory/storage component 1312 may include volatile media (such as random access memory (RAM)) and/or nonvolatile media (such as read only memory (ROM), Flash memory, optical disks, magnetic disks, and so forth). The memory/storage component 1312 may include fixed media (e.g., RAM, ROM, a fixed hard drive, and so on) as well as removable media (e.g., Flash memory, a removable hard drive, an optical disc, and so forth). The computer-readable media 1306 may be configured in a variety of other ways as further described below.


Input/output interface(s) 1308 are representative of functionality to allow a user to enter commands and information to computing device 1302, and also allow information to be presented to the user and/or other components or devices using various input/output devices. Examples of input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone, a scanner, touch functionality (e.g., capacitive or other sensors that are configured to detect physical touch), a camera (e.g., which may employ visible or non-visible wavelengths such as infrared frequencies to recognize movement as gestures that do not involve touch), and so forth. Examples of output devices include a display device (e.g., a monitor or projector), speakers, a printer, a network card, tactile-response device, and so forth. Thus, the computing device 1302 may be configured in a variety of ways to support user interaction.


The computing device 1302 is further illustrated as being communicatively and physically coupled to an input device 1314 that is physically and communicatively removable from the computing device 1302. In this way, a variety of different input devices may be coupled to the computing device 1302 having a wide variety of configurations to support a wide variety of functionality. In this example, the input device 1314 includes one or more keys 1316, which may be configured as pressure sensitive keys, mechanically switched keys, and so forth.


The input device 1314 is further illustrated as including one or more modules 1318 that may be configured to support a variety of functionality. The one or more modules 1318, for instance, may be configured to process analog and/or digital signals received from the keys 1316 to determine whether a keystroke was intended, determine whether an input is indicative of resting pressure, support authentication of the input device 1314 for operation with the computing device 1302, and so on.


Various techniques may be described herein in the general context of software, hardware elements, or program modules. Generally, such modules include routines, programs, objects, elements, components, data structures, and so forth that perform particular tasks or implement particular abstract data types. The terms “module,” “functionality,” and “component” as used herein generally represent software, firmware, hardware, or a combination thereof. The features of the techniques described herein are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.


Techniques may further be implemented in a network environment, such as utilizing various cloud-based resources. For instance, methods, procedures, and so forth discussed above may leverage network resources to enable various functionalities.


An implementation of the described modules and techniques may be stored on or transmitted across some form of computer-readable media. The computer-readable media may include a variety of media that may be accessed by the computing device 1302. By way of example, and not limitation, computer-readable media may include “computer-readable storage media” and “computer-readable signal media.”


“Computer-readable storage media” may refer to media and/or devices that enable persistent and/or non-transitory storage of information in contrast to mere signal transmission, carrier waves, or signals per se. Thus, computer-readable storage media refers to non-signal bearing media. The computer-readable storage media includes hardware such as volatile and nonvolatile, removable and non-removable media and/or storage devices implemented in a method or technology suitable for storage of information such as computer readable instructions, data structures, program modules, logic elements/circuits, or other data. Examples of computer-readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, hard disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other storage device, tangible media, or article of manufacture suitable to store the desired information and which may be accessed by a computer.


“Computer-readable signal media” may refer to a signal-bearing medium that is configured to transmit instructions to the hardware of the computing device 1302, such as via a network. Signal media typically may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as carrier waves, data signals, or other transport mechanism. Signal media also include any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media.


As previously described, hardware elements 1310 and computer-readable media 1306 are representative of modules, programmable device logic and/or fixed device logic implemented in a hardware form that may be employed in some embodiments to implement at least some aspects of the techniques described herein, such as to perform one or more instructions. Hardware may include components of an integrated circuit or on-chip system, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a complex programmable logic device (CPLD), and other implementations in silicon or other hardware. In this context, hardware may operate as a processing device that performs program tasks defined by instructions and/or logic embodied by the hardware, as well as hardware utilized to store instructions for execution, e.g., the computer-readable storage media described previously.


Combinations of the foregoing may also be employed to implement various techniques described herein. Accordingly, software, hardware, or executable modules may be implemented as one or more instructions and/or logic embodied on some form of computer-readable storage media and/or by one or more hardware elements 1310. The computing device 1302 may be configured to implement particular instructions and/or functions corresponding to the software and/or hardware modules. Accordingly, implementation of a module that is executable by the computing device 1302 as software may be achieved at least partially in hardware, e.g., through use of computer-readable storage media and/or hardware elements 1310 of the processing system 1304. The instructions and/or functions may be executable/operable by one or more articles of manufacture (for example, one or more computing devices 1302 and/or processing systems 1304) to implement techniques, modules, and examples described herein.


Discussed herein are a number of methods that may be implemented to perform techniques discussed herein. Aspects of the methods may be implemented in hardware, firmware, or software, or a combination thereof. The methods are shown as a set of blocks that specify operations performed by one or more devices and are not necessarily limited to the orders shown for performing the operations by the respective blocks. Further, an operation shown with respect to a particular method may be combined and/or interchanged with an operation of a different method in accordance with one or more implementations. Aspects of the methods can be implemented via interaction between various entities discussed above with reference to the environment 100.


CONCLUSION

Although the example implementations have been described in language specific to structural features and/or methodological acts, it is to be understood that the implementations defined in the appended claims are not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claimed features.

Claims
  • 1. A computer-implemented method, comprising: ascertaining an orientation of a computing device that includes a kickstand, the orientation being based at least in part on an angle of the kickstand relative to the computing device; and manipulating image data captured via a sensor of a camera component of the computing device, including: determining, based on the orientation, an image region that corresponds to a portion of the sensor; utilizing a portion of the image data from the image region to capture an image; and ignoring a different portion of the image data captured outside of the image region.
  • 2. A method as described in claim 1, wherein the orientation comprises an angle at which the computing device is positioned relative to gravity.
  • 3. A method as described in claim 1, further comprising changing an angle of orientation of one or more portions of the camera component with respect to the computing device based on the orientation of the computing device.
  • 4. A method as described in claim 3, wherein said changing comprises one or more of tilting, rotating, or panning the one or more portions of the camera component with reference to the computing device.
  • 5. A method as described in claim 1, wherein said manipulating comprises applying one or more image data correction techniques to the image data.
  • 6. A method as described in claim 1, wherein said ignoring is in response to determining that the different portion of the image data does not exceed a threshold signal-to-noise ratio.
  • 7. A method as described in claim 1, wherein the image region corresponds to a portion of the sensor that receives light at a threshold light intensity.
  • 8. A method as described in claim 1, further comprising: ascertaining a change in orientation of the computing device; and determining, based on the change in orientation, a different image region that corresponds to a different portion of the sensor to be used for capturing a different image.
  • 9. An apparatus comprising: a computing device including a kickstand that is positionable to enable the computing device to be supported on a surface; a camera assembly operably attached at a fixed angle within the computing device such that when the computing device is supported via the kickstand on the surface, an optical axis of one or more components of the camera assembly is substantially parallel to the surface, the camera assembly including an image sensor for capturing an image; and a camera module configured to select a portion of the image sensor to be used to capture an image based on one or more of an angle of orientation of the computing device or an external light level.
  • 10. An apparatus as described in claim 9, wherein the angle is such that when the computing device is supported via the kickstand on the surface, a field of view of one or more components of the camera assembly is substantially perpendicular to the surface.
  • 11. An apparatus as described in claim 9, further comprising an input device that is attached to the computing device and that is rotatable to different positions with respect to the computing device, at least one of the positions corresponding to a typing position in which the computing device is supported by the kickstand and input can be provided to the computing device via the input device.
  • 12. An apparatus as described in claim 9, further comprising: a display device disposed on a front surface of the computing device; and an input device that is attached to the computing device and that is rotatable to different positions with respect to the computing device, at least one of the positions corresponding to a handheld position in which the input device is rotated against a rear surface of the computing device such that images in a field of view of one or more components of the camera assembly can be viewed via the display device and captured.
  • 13. An apparatus as described in claim 9, wherein the computing device includes functionality to: determine an orientation of the computing device; and apply one or more of light enhancement or light correction to a portion of the image data based on the orientation.
  • 14. An apparatus as described in claim 9, wherein the computing device includes functionality to: determine an orientation of the computing device; and perform image manipulation of an image captured via the camera assembly based on the orientation.
  • 15. An apparatus as described in claim 9, wherein the one or more components of the camera assembly comprises a lens.
  • 16. An apparatus comprising: a computing device; a display device disposed on a front surface of the computing device; an input device operably attached to the computing device and that is rotatable to support different orientations of the computing device, at least one of the orientations enabling the display device to be covered by the input device; a kickstand disposed on a rear surface of the computing device and configured to support the computing device on a surface; a camera assembly mounted to the computing device such that a field of view of the camera assembly faces away from the display device, one or more components of the camera assembly being angled on the computing device based on an angle at which the kickstand is configured to support the computing device; a sensor mechanism configured to detect an angle of the kickstand relative to the computing device; and a camera module configured to adjust one or more components of the camera assembly by one or more of panning or tilting the one or more components relative to the computing device and based on the detected angle of the kickstand relative to the computing device.
  • 17. An apparatus as recited in claim 16, wherein the camera assembly is physically adjustable via one or more of tilting, rotating, or panning, to support image capture in different orientations of the computing device.
  • 18. An apparatus as recited in claim 16, wherein the one or more components of the camera assembly are angled on the computing device such that when the kickstand supports the computing device on the surface, a field of view of the one or more components is substantially perpendicular to the surface.
  • 19. An apparatus as recited in claim 16, wherein the computing device includes: functionality to ascertain an orientation of the computing device; and functionality to perform one or more of: adjusting the one or more components of the camera assembly based on the orientation of the computing device; or manipulating image data captured via the camera component based on the orientation of the computing device.
  • 20. An apparatus as recited in claim 19, wherein the functionality to ascertain an orientation of the computing device comprises functionality to determine an angle of the computing device with reference to gravity.
RELATED APPLICATIONS

This application claims priority under 35 U.S.C. §119(e) to the following U.S. Provisional Patent Applications, the entire disclosures of each of these applications being incorporated by reference in their entirety: U.S. Provisional Patent Application No. 61/606,321, filed Mar. 2, 2012, and titled “Screen Edge;” U.S. Provisional Patent Application No. 61/606,301, filed Mar. 2, 2012, and titled “Input Device Functionality;” U.S. Provisional Patent Application No. 61/606,311, filed Mar. 2, 2012, and titled “Functional Hinge;” U.S. Provisional Patent Application No. 61/606,333, filed Mar. 2, 2012, and titled “Usage and Authentication;” U.S. Provisional Patent Application No. 61/613,745, filed Mar. 21, 2012, and titled “Usage and Authentication;” U.S. Provisional Patent Application No. 61/606,336, filed Mar. 2, 2012, and titled “Kickstand and Camera;” and U.S. Provisional Patent Application No. 61/607,451, filed Mar. 6, 2012, and titled “Spanaway Provisional.”

9134807 Shaw et al. Sep 2015 B2
9134808 Siddiqui et al. Sep 2015 B2
9146620 Whitt et al. Sep 2015 B2
9158383 Shaw et al. Oct 2015 B2
9158384 Whitt, III et al. Oct 2015 B2
9176900 Whitt, III et al. Nov 2015 B2
9176901 Whitt, III et al. Nov 2015 B2
20010023818 Masaru et al. Sep 2001 A1
20020005108 Ludwig Jan 2002 A1
20020044216 Cha Apr 2002 A1
20020070883 Dosch Jun 2002 A1
20020113882 Pollard et al. Aug 2002 A1
20020134828 Sandbach et al. Sep 2002 A1
20020135457 Sandbach et al. Sep 2002 A1
20020195177 Hinkley et al. Dec 2002 A1
20030000821 Takahashi et al. Jan 2003 A1
20030007648 Currell Jan 2003 A1
20030011576 Sandbach et al. Jan 2003 A1
20030036365 Kuroda Feb 2003 A1
20030044216 Fang Mar 2003 A1
20030051983 Lahr Mar 2003 A1
20030067450 Thursfield et al. Apr 2003 A1
20030108720 Kashino Jun 2003 A1
20030128285 Itoh Jul 2003 A1
20030160712 Levy Aug 2003 A1
20030163611 Nagao Aug 2003 A1
20030197687 Shetter Oct 2003 A1
20030197806 Perry et al. Oct 2003 A1
20030231243 Shibutani Dec 2003 A1
20040005184 Kim et al. Jan 2004 A1
20040046796 Fujita Mar 2004 A1
20040056843 Lin et al. Mar 2004 A1
20040113956 Bellwood et al. Jun 2004 A1
20040156168 LeVasseur et al. Aug 2004 A1
20040160734 Yim Aug 2004 A1
20040169641 Bean et al. Sep 2004 A1
20040189822 Shimada Sep 2004 A1
20040212598 Kraus et al. Oct 2004 A1
20040212601 Cake et al. Oct 2004 A1
20040258924 Berger et al. Dec 2004 A1
20040268000 Barker et al. Dec 2004 A1
20050030728 Kawashima et al. Feb 2005 A1
20050047773 Satake et al. Mar 2005 A1
20050052831 Chen Mar 2005 A1
20050055498 Beckert et al. Mar 2005 A1
20050057515 Bathiche Mar 2005 A1
20050059489 Kim Mar 2005 A1
20050062715 Tsuji et al. Mar 2005 A1
20050068460 Lin Mar 2005 A1
20050094895 Baron May 2005 A1
20050099400 Lee May 2005 A1
20050134717 Misawa Jun 2005 A1
20050146512 Hill et al. Jul 2005 A1
20050236848 Kim et al. Oct 2005 A1
20050264653 Starkweather et al. Dec 2005 A1
20050264988 Nicolosi Dec 2005 A1
20050283731 Saint-Hilaire et al. Dec 2005 A1
20060049920 Sadler et al. Mar 2006 A1
20060085658 Allen et al. Apr 2006 A1
20060092139 Sharma May 2006 A1
20060096392 Inkster et al. May 2006 A1
20060102020 Takada et al. May 2006 A1
20060125799 Hillis et al. Jun 2006 A1
20060154725 Glaser et al. Jul 2006 A1
20060155391 Pistemaa et al. Jul 2006 A1
20060156415 Rubinstein et al. Jul 2006 A1
20060174143 Sawyers et al. Aug 2006 A1
20060176377 Miyasaka Aug 2006 A1
20060181514 Newman Aug 2006 A1
20060187216 Trent, Jr. et al. Aug 2006 A1
20060192763 Ziemkowski Aug 2006 A1
20060195522 Miyazaki Aug 2006 A1
20060220465 Kingsmore et al. Oct 2006 A1
20060265617 Priborsky Nov 2006 A1
20060267931 Vainio et al. Nov 2006 A1
20060272429 Ganapathi et al. Dec 2006 A1
20070003267 Shibutani Jan 2007 A1
20070024742 Raskar et al. Feb 2007 A1
20070056385 Lorenz Mar 2007 A1
20070062089 Homer et al. Mar 2007 A1
20070069153 Pai-Paranjape et al. Mar 2007 A1
20070072474 Beasley et al. Mar 2007 A1
20070081091 Pan et al. Apr 2007 A1
20070117600 Robertson et al. May 2007 A1
20070121956 Bai et al. May 2007 A1
20070127205 Kuo Jun 2007 A1
20070145945 McGinley et al. Jun 2007 A1
20070172229 Wernersson Jul 2007 A1
20070176902 Newman et al. Aug 2007 A1
20070178891 Louch et al. Aug 2007 A1
20070182663 Biech Aug 2007 A1
20070182722 Hotelling et al. Aug 2007 A1
20070185590 Reindel et al. Aug 2007 A1
20070200830 Yamamoto Aug 2007 A1
20070201859 Sarrat Aug 2007 A1
20070220708 Lewis Sep 2007 A1
20070222766 Bolender Sep 2007 A1
20070230227 Palmer Oct 2007 A1
20070234420 Novotney et al. Oct 2007 A1
20070236408 Yamaguchi et al. Oct 2007 A1
20070236475 Wherry Oct 2007 A1
20070236873 Yukawa et al. Oct 2007 A1
20070247432 Oakley Oct 2007 A1
20070252674 Nelson et al. Nov 2007 A1
20070260892 Paul et al. Nov 2007 A1
20070263119 Shum et al. Nov 2007 A1
20070283179 Burnett et al. Dec 2007 A1
20070296709 Guanghai Dec 2007 A1
20070297625 Hjort et al. Dec 2007 A1
20080001924 de los Reyes et al. Jan 2008 A1
20080019684 Shyu et al. Jan 2008 A1
20080042978 Perez-Noguera Feb 2008 A1
20080053222 Ehrensvard et al. Mar 2008 A1
20080059888 Dunko Mar 2008 A1
20080068451 Hyatt Mar 2008 A1
20080074398 Wright Mar 2008 A1
20080084499 Kisacanin et al. Apr 2008 A1
20080104437 Lee May 2008 A1
20080106592 Mikami May 2008 A1
20080129520 Lee Jun 2008 A1
20080151478 Chern Jun 2008 A1
20080158185 Westerman Jul 2008 A1
20080174570 Jobs et al. Jul 2008 A1
20080177185 Nakao et al. Jul 2008 A1
20080186660 Yang Aug 2008 A1
20080203277 Warszauer et al. Aug 2008 A1
20080228969 Cheah et al. Sep 2008 A1
20080238884 Harish Oct 2008 A1
20080253822 Matias Oct 2008 A1
20080307242 Qu Dec 2008 A1
20080316002 Brunet et al. Dec 2008 A1
20080316183 Westerman et al. Dec 2008 A1
20080320190 Lydon et al. Dec 2008 A1
20090009476 Daley, III Jan 2009 A1
20090073957 Newland et al. Mar 2009 A1
20090083562 Park et al. Mar 2009 A1
20090089600 Nousiainen Apr 2009 A1
20090096756 Lube Apr 2009 A1
20090102805 Meijer et al. Apr 2009 A1
20090131134 Baerlocher et al. May 2009 A1
20090140985 Liu Jun 2009 A1
20090147102 Kakinuma et al. Jun 2009 A1
20090158221 Nielsen et al. Jun 2009 A1
20090160944 Trevelyan et al. Jun 2009 A1
20090167930 Safaee-Rad et al. Jul 2009 A1
20090174759 Yeh et al. Jul 2009 A1
20090177906 Paniagua, Jr. et al. Jul 2009 A1
20090189873 Peterson Jul 2009 A1
20090195497 Fitzgerald et al. Aug 2009 A1
20090195518 Mattice et al. Aug 2009 A1
20090207144 Bridger Aug 2009 A1
20090231275 Odgers Sep 2009 A1
20090231465 Senba Sep 2009 A1
20090239586 Boeve et al. Sep 2009 A1
20090244832 Behar et al. Oct 2009 A1
20090244872 Yan Oct 2009 A1
20090251008 Sugaya Oct 2009 A1
20090259865 Sheynblat et al. Oct 2009 A1
20090262492 Whitchurch et al. Oct 2009 A1
20090265670 Kim et al. Oct 2009 A1
20090284613 Kim Nov 2009 A1
20090285491 Ravenscroft et al. Nov 2009 A1
20090296331 Choy Dec 2009 A1
20090303204 Nasiri et al. Dec 2009 A1
20090315830 Westerman Dec 2009 A1
20090320244 Lin Dec 2009 A1
20090321490 Groene et al. Dec 2009 A1
20100006412 Wang et al. Jan 2010 A1
20100013319 Kamiyama et al. Jan 2010 A1
20100023869 Saint-Hilaire et al. Jan 2010 A1
20100026656 Hotelling et al. Feb 2010 A1
20100038821 Jenkins et al. Feb 2010 A1
20100039081 Sip Feb 2010 A1
20100039764 Locker et al. Feb 2010 A1
20100045633 Gettemy et al. Feb 2010 A1
20100051432 Lin et al. Mar 2010 A1
20100052880 Laitinen et al. Mar 2010 A1
20100053534 Hsieh et al. Mar 2010 A1
20100054435 Louch et al. Mar 2010 A1
20100056130 Louch et al. Mar 2010 A1
20100073329 Raman et al. Mar 2010 A1
20100077237 Sawyers Mar 2010 A1
20100079379 Demuynck et al. Apr 2010 A1
20100083108 Rider et al. Apr 2010 A1
20100085321 Pundsack Apr 2010 A1
20100100752 Chueh et al. Apr 2010 A1
20100102182 Lin Apr 2010 A1
20100103112 Yoo et al. Apr 2010 A1
20100103332 Li et al. Apr 2010 A1
20100105443 Vaisanen Apr 2010 A1
20100106983 Kasprzak et al. Apr 2010 A1
20100115309 Carvalho et al. May 2010 A1
20100117993 Kent May 2010 A1
20100123686 Klinghult et al. May 2010 A1
20100128427 Iso May 2010 A1
20100133398 Chiu et al. Jun 2010 A1
20100142130 Wang et al. Jun 2010 A1
20100146317 Challener et al. Jun 2010 A1
20100148995 Elias Jun 2010 A1
20100148999 Casparian et al. Jun 2010 A1
20100149104 Sim et al. Jun 2010 A1
20100149111 Olien Jun 2010 A1
20100149377 Shintani et al. Jun 2010 A1
20100156913 Ortega et al. Jun 2010 A1
20100157085 Sasaki Jun 2010 A1
20100161522 Tirpak et al. Jun 2010 A1
20100164857 Liu et al. Jul 2010 A1
20100164897 Morin et al. Jul 2010 A1
20100171875 Yamamoto Jul 2010 A1
20100171891 Kaji et al. Jul 2010 A1
20100174421 Tsai et al. Jul 2010 A1
20100180063 Ananny et al. Jul 2010 A1
20100185877 Chueh et al. Jul 2010 A1
20100188299 Rinehart et al. Jul 2010 A1
20100201308 Lindholm Aug 2010 A1
20100205472 Tupman et al. Aug 2010 A1
20100206614 Park et al. Aug 2010 A1
20100207774 Song Aug 2010 A1
20100220205 Lee et al. Sep 2010 A1
20100222110 Kim et al. Sep 2010 A1
20100231522 Li Sep 2010 A1
20100235546 Terlizzi et al. Sep 2010 A1
20100238320 Washisu Sep 2010 A1
20100238620 Fish Sep 2010 A1
20100250975 Gill et al. Sep 2010 A1
20100250988 Okuda et al. Sep 2010 A1
20100259482 Ball Oct 2010 A1
20100259876 Kim Oct 2010 A1
20100265182 Ball et al. Oct 2010 A1
20100271771 Wu et al. Oct 2010 A1
20100274932 Kose Oct 2010 A1
20100279768 Huang et al. Nov 2010 A1
20100289457 Onnerud et al. Nov 2010 A1
20100295812 Burns et al. Nov 2010 A1
20100302378 Marks et al. Dec 2010 A1
20100306538 Thomas et al. Dec 2010 A1
20100308778 Yamazaki et al. Dec 2010 A1
20100308844 Day et al. Dec 2010 A1
20100309617 Wang et al. Dec 2010 A1
20100313680 Joung et al. Dec 2010 A1
20100315345 Laitinen Dec 2010 A1
20100315348 Jellicoe et al. Dec 2010 A1
20100315373 Steinhauser et al. Dec 2010 A1
20100321877 Moser Dec 2010 A1
20100324457 Bean et al. Dec 2010 A1
20100325155 Skinner et al. Dec 2010 A1
20110012866 Keam Jan 2011 A1
20110012873 Prest et al. Jan 2011 A1
20110019123 Prest et al. Jan 2011 A1
20110031287 Le Gette et al. Feb 2011 A1
20110032127 Roush Feb 2011 A1
20110036965 Zhang et al. Feb 2011 A1
20110037721 Cranfill et al. Feb 2011 A1
20110043990 Mickey et al. Feb 2011 A1
20110050576 Forutanpour et al. Mar 2011 A1
20110050626 Porter et al. Mar 2011 A1
20110050946 Lee et al. Mar 2011 A1
20110055407 Lydon et al. Mar 2011 A1
20110057724 Pabon Mar 2011 A1
20110060926 Brooks et al. Mar 2011 A1
20110069148 Jones et al. Mar 2011 A1
20110074688 Hull et al. Mar 2011 A1
20110081946 Singh et al. Apr 2011 A1
20110102326 Casparian et al. May 2011 A1
20110102752 Chen et al. May 2011 A1
20110107958 Pance et al. May 2011 A1
20110108401 Yamada et al. May 2011 A1
20110113368 Carvajal et al. May 2011 A1
20110115738 Suzuki et al. May 2011 A1
20110117970 Choi May 2011 A1
20110134032 Chiu et al. Jun 2011 A1
20110134043 Chen Jun 2011 A1
20110157037 Shamir et al. Jun 2011 A1
20110157046 Lee et al. Jun 2011 A1
20110157087 Kanehira et al. Jun 2011 A1
20110163955 Nasiri et al. Jul 2011 A1
20110164370 McClure et al. Jul 2011 A1
20110167181 Minoo et al. Jul 2011 A1
20110167287 Walsh et al. Jul 2011 A1
20110167391 Momeyer et al. Jul 2011 A1
20110169762 Weiss Jul 2011 A1
20110176035 Poulsen Jul 2011 A1
20110179864 Raasch et al. Jul 2011 A1
20110181754 Iwasaki Jul 2011 A1
20110184646 Wong et al. Jul 2011 A1
20110184824 George et al. Jul 2011 A1
20110188199 Pan Aug 2011 A1
20110191480 Kobayashi Aug 2011 A1
20110193787 Morishige et al. Aug 2011 A1
20110199389 Lu et al. Aug 2011 A1
20110205372 Miramontes Aug 2011 A1
20110221659 King et al. Sep 2011 A1
20110221678 Davydov Sep 2011 A1
20110227913 Hyndman Sep 2011 A1
20110231682 Kakish et al. Sep 2011 A1
20110234494 Peterson et al. Sep 2011 A1
20110234881 Wakabayashi et al. Sep 2011 A1
20110241999 Thier Oct 2011 A1
20110248152 Svajda et al. Oct 2011 A1
20110248920 Larsen Oct 2011 A1
20110248941 Abdo et al. Oct 2011 A1
20110261001 Liu Oct 2011 A1
20110261209 Wu Oct 2011 A1
20110265287 Li et al. Nov 2011 A1
20110266672 Sylvester Nov 2011 A1
20110267272 Meyer et al. Nov 2011 A1
20110273475 Herz et al. Nov 2011 A1
20110285555 Bocirnea Nov 2011 A1
20110290686 Huang Dec 2011 A1
20110295697 Boston et al. Dec 2011 A1
20110297566 Gallagher et al. Dec 2011 A1
20110298919 Maglaque Dec 2011 A1
20110302518 Zhang Dec 2011 A1
20110304577 Brown et al. Dec 2011 A1
20110305875 Sanford et al. Dec 2011 A1
20110316807 Corrion Dec 2011 A1
20110320204 Locker et al. Dec 2011 A1
20120002052 Muramatsu et al. Jan 2012 A1
20120002820 Leichter Jan 2012 A1
20120007821 Zaliva Jan 2012 A1
20120008015 Manabe Jan 2012 A1
20120019686 Manabe Jan 2012 A1
20120020490 Leichter Jan 2012 A1
20120020556 Manabe Jan 2012 A1
20120023401 Arscott et al. Jan 2012 A1
20120023459 Westerman Jan 2012 A1
20120024682 Huang et al. Feb 2012 A1
20120026096 Ku Feb 2012 A1
20120026110 Yamano Feb 2012 A1
20120032887 Chiu et al. Feb 2012 A1
20120032891 Parivar Feb 2012 A1
20120032901 Kwon Feb 2012 A1
20120038495 Ishikawa Feb 2012 A1
20120044179 Hudson Feb 2012 A1
20120044379 Manabe Feb 2012 A1
20120047368 Chinn et al. Feb 2012 A1
20120050975 Garelli et al. Mar 2012 A1
20120062564 Miyashita Mar 2012 A1
20120062736 Xiong Mar 2012 A1
20120068919 Lauder et al. Mar 2012 A1
20120069540 Lauder et al. Mar 2012 A1
20120075249 Hoch Mar 2012 A1
20120077384 Bar-Niv et al. Mar 2012 A1
20120092279 Martin Apr 2012 A1
20120094257 Pillischer et al. Apr 2012 A1
20120099749 Rubin et al. Apr 2012 A1
20120113137 Nomoto May 2012 A1
20120113579 Agata et al. May 2012 A1
20120117409 Lee et al. May 2012 A1
20120127118 Nolting et al. May 2012 A1
20120127126 Mattice et al. May 2012 A1
20120133797 Sato et al. May 2012 A1
20120139727 Houvener et al. Jun 2012 A1
20120140396 Zeliff et al. Jun 2012 A1
20120145525 Ishikawa Jun 2012 A1
20120156875 Srinivas et al. Jun 2012 A1
20120162693 Ito Jun 2012 A1
20120175487 Goto Jul 2012 A1
20120182242 Lindahl et al. Jul 2012 A1
20120182249 Endo et al. Jul 2012 A1
20120194448 Rothkopf Aug 2012 A1
20120212438 Vaisanen Aug 2012 A1
20120218194 Silverman Aug 2012 A1
20120221877 Prabu Aug 2012 A1
20120224073 Miyahara Sep 2012 A1
20120229634 Laett et al. Sep 2012 A1
20120242584 Tuli Sep 2012 A1
20120246377 Bhesania Sep 2012 A1
20120249443 Anderson et al. Oct 2012 A1
20120250873 Bakalos et al. Oct 2012 A1
20120256829 Dodge Oct 2012 A1
20120256959 Ye et al. Oct 2012 A1
20120260177 Sehrer Oct 2012 A1
20120274811 Bakin Nov 2012 A1
20120281129 Wang et al. Nov 2012 A1
20120287218 Ok Nov 2012 A1
20120299872 Nishikawa et al. Nov 2012 A1
20120300275 Vilardell et al. Nov 2012 A1
20120312955 Randolph Dec 2012 A1
20120330162 Rajan et al. Dec 2012 A1
20130009413 Chiu et al. Jan 2013 A1
20130015311 Kim Jan 2013 A1
20130021289 Chen et al. Jan 2013 A1
20130027867 Lauder et al. Jan 2013 A1
20130031353 Noro Jan 2013 A1
20130038541 Bakker Feb 2013 A1
20130044074 Park et al. Feb 2013 A1
20130046397 Fadell et al. Feb 2013 A1
20130063873 Wodrich et al. Mar 2013 A1
20130067126 Casparian et al. Mar 2013 A1
20130067259 Freiwald et al. Mar 2013 A1
20130073877 Radke Mar 2013 A1
20130076617 Csaszar et al. Mar 2013 A1
20130082824 Colley Apr 2013 A1
20130088431 Ballagas et al. Apr 2013 A1
20130100030 Los et al. Apr 2013 A1
20130100082 Bakin et al. Apr 2013 A1
20130106766 Yilmaz et al. May 2013 A1
20130128102 Yano May 2013 A1
20130135214 Li et al. May 2013 A1
20130151944 Lin Jun 2013 A1
20130154959 Lindsay et al. Jun 2013 A1
20130159749 Moeglein et al. Jun 2013 A1
20130162554 Lauder et al. Jun 2013 A1
20130172906 Olson et al. Jul 2013 A1
20130191741 Dickinson et al. Jul 2013 A1
20130212483 Brakensiek et al. Aug 2013 A1
20130215035 Guard Aug 2013 A1
20130217451 Komiyama et al. Aug 2013 A1
20130222272 Martin, Jr. Aug 2013 A1
20130222274 Mori et al. Aug 2013 A1
20130222275 Byrd et al. Aug 2013 A1
20130222323 McKenzie Aug 2013 A1
20130222681 Wan Aug 2013 A1
20130226794 Englebardt Aug 2013 A1
20130227836 Whitt, III Sep 2013 A1
20130228023 Drasnin Sep 2013 A1
20130228433 Shaw Sep 2013 A1
20130228434 Whitt, III Sep 2013 A1
20130228439 Whitt, III Sep 2013 A1
20130229100 Siddiqui Sep 2013 A1
20130229335 Whitman Sep 2013 A1
20130229347 Lutz, III Sep 2013 A1
20130229350 Shaw Sep 2013 A1
20130229351 Whitt, III Sep 2013 A1
20130229354 Whitt, III et al. Sep 2013 A1
20130229363 Whitman Sep 2013 A1
20130229366 Dighde Sep 2013 A1
20130229380 Lutz, III Sep 2013 A1
20130229568 Belesiu Sep 2013 A1
20130229570 Beck et al. Sep 2013 A1
20130229756 Whitt, III Sep 2013 A1
20130229757 Whitt, III et al. Sep 2013 A1
20130229758 Belesiu Sep 2013 A1
20130229759 Whitt, III Sep 2013 A1
20130229760 Whitt, III Sep 2013 A1
20130229761 Shaw Sep 2013 A1
20130229762 Whitt, III Sep 2013 A1
20130229773 Siddiqui Sep 2013 A1
20130230346 Shaw Sep 2013 A1
20130231755 Perek Sep 2013 A1
20130232280 Perek Sep 2013 A1
20130232348 Oler Sep 2013 A1
20130232349 Oler Sep 2013 A1
20130232350 Belesiu et al. Sep 2013 A1
20130232353 Belesiu Sep 2013 A1
20130232571 Belesiu Sep 2013 A1
20130232742 Burnett et al. Sep 2013 A1
20130262886 Nishimura Oct 2013 A1
20130268897 Li et al. Oct 2013 A1
20130285922 Alberth, Jr. et al. Oct 2013 A1
20130300590 Dietz Nov 2013 A1
20130300647 Drasnin Nov 2013 A1
20130301199 Whitt Nov 2013 A1
20130301206 Whitt Nov 2013 A1
20130304941 Drasnin Nov 2013 A1
20130321992 Liu et al. Dec 2013 A1
20130322000 Whitt Dec 2013 A1
20130322001 Whitt Dec 2013 A1
20130329360 Aldana Dec 2013 A1
20130332628 Panay Dec 2013 A1
20130339757 Reddy Dec 2013 A1
20130342976 Chung Dec 2013 A1
20140012401 Perek Jan 2014 A1
20140043275 Whitman et al. Feb 2014 A1
20140048399 Whitt, III Feb 2014 A1
20140055624 Gaines Feb 2014 A1
20140085814 Kielland Mar 2014 A1
20140119802 Shaw May 2014 A1
20140125864 Rihn May 2014 A1
20140167585 Kuan et al. Jun 2014 A1
20140185215 Whitt Jul 2014 A1
20140185220 Whitt Jul 2014 A1
20140204514 Whitt Jul 2014 A1
20140204515 Whitt Jul 2014 A1
20140247546 Whitt Sep 2014 A1
20140291134 Whitt Oct 2014 A1
20140293534 Siddiqui Oct 2014 A1
20140313401 Rihn et al. Oct 2014 A1
20140362506 Whitt, III et al. Dec 2014 A1
20140372914 Byrd et al. Dec 2014 A1
20140379942 Perek et al. Dec 2014 A1
20150005953 Fadell et al. Jan 2015 A1
20150036274 Belesiu et al. Feb 2015 A1
20150227212 Whitt, III et al. Aug 2015 A1
20150234478 Belesiu et al. Aug 2015 A1
20150261262 Whitt, III et al. Sep 2015 A1
20150311014 Shaw et al. Oct 2015 A1
20150312453 Gaines et al. Oct 2015 A1
Foreign Referenced Citations (75)
Number Date Country
990023 Jun 1976 CA
1352767 Jun 2002 CN
1537223 Oct 2004 CN
1787605 Jun 2006 CN
1808362 Jul 2006 CN
101366001 Feb 2009 CN
101452334 Jun 2009 CN
101644979 Feb 2010 CN
101675406 Mar 2010 CN
101681189 Mar 2010 CN
102004559 Apr 2011 CN
1102012763 Apr 2011 CN
102112947 Jun 2011 CN
102117121 Jul 2011 CN
102138113 Jul 2011 CN
102147643 Aug 2011 CN
102214040 Oct 2011 CN
102292687 Dec 2011 CN
102356624 Feb 2012 CN
103455097 Dec 2013 CN
103455149 Dec 2013 CN
10116556 Oct 2002 DE
645726 Mar 1995 EP
1003188 May 2000 EP
1223722 Jul 2002 EP
1480029 Nov 2004 EP
1591891 Nov 2005 EP
1983411 Oct 2008 EP
2006869 Dec 2008 EP
2026178 Feb 2009 EP
2353978 Aug 2011 EP
2410408 Jan 2012 EP
2068643 Aug 1981 GB
2123213 Jan 1984 GB
2305780 Apr 1997 GB
2381584 May 2003 GB
2482932 Feb 2012 GB
52107722 Sep 1977 JP
56108127 Aug 1981 JP
H104540 Jan 1998 JP
10326124 Dec 1998 JP
1173239 Mar 1999 JP
11338575 Dec 1999 JP
2000010654 Jan 2000 JP
2001142564 May 2001 JP
2002170458 Jun 2002 JP
2002300438 Oct 2002 JP
2004038950 Feb 2004 JP
3602207 Dec 2004 JP
2006160155 Jun 2006 JP
2006163459 Jun 2006 JP
2006294361 Oct 2006 JP
2010244514 Oct 2010 JP
2003077368 Mar 2014 JP
20010107055 Dec 2001 KR
20050014299 Feb 2005 KR
20060003093 Jan 2006 KR
20080006404 Jan 2008 KR
20090029411 Mar 2009 KR
20100022059 Feb 2010 KR
20100067366 Jun 2010 KR
20100115675 Oct 2010 KR
102011008717 Aug 2011 KR
20110109791 Oct 2011 KR
20110120002 Nov 2011 KR
20110122333 Nov 2011 KR
101113530 Feb 2012 KR
WO-9919995 Apr 1999 WO
WO-0072079 Nov 2000 WO
WO-2006044818 Apr 2006 WO
WO-2007112172 Oct 2007 WO
WO-2009034484 Mar 2009 WO
WO-2010147609 Dec 2010 WO
WO-2011049609 Apr 2011 WO
WO-2014084872 Jun 2014 WO
Non-Patent Literature Citations (342)
Entry
“Corrected Notice of Allowance”, U.S. Appl. No. 13/470,633, (Jul. 2, 2013), 2 pages.
“Final Office Action”, U.S. Appl. No. 13/471,001, (Jul. 25, 2013), 20 pages.
“Final Office Action”, U.S. Appl. No. 13/471,336, (Aug. 28, 2013), 18 pages.
“Final Office Action”, U.S. Appl. No. 13/651,976, (Jul. 25, 2013), 21 pages.
“Final Office Action”, U.S. Appl. No. 13/653,321, (Aug. 2, 2013), 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/527,263, (Jul. 19, 2013), 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,871, (Jul. 1, 2013), 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/938,930, (Aug. 29, 2013), 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/939,002, (Aug. 28, 2013), 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/939,032, (Aug. 29, 2013), 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,195, (Jul. 8, 2013), 9 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,304, (Jul. 1, 2013), 5 pages.
“PCT Search Report and Written Opinion”, Application No. PCT/US2013/029461, (Jun. 21, 2013), 11 pages.
“PCT Search Report and Written Opinion”, Application No. PCT/US2013/028948, (Jun. 21, 2013), 11 pages.
“Advisory Action”, U.S. Appl. No. 13/939,032, Feb. 24, 2014, 2 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/053683, Nov. 28, 2013, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/599,635, Feb. 25, 2014, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,186, Feb. 27, 2014, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,405, Feb. 20, 2014, 37 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/564,520, Feb. 14, 2014, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,682, Feb. 26, 2014, 10 pages.
“Notice of Allowance”, U.S. Appl. No. 13/938,930, Feb. 20, 2014, 4 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/563,435, Mar. 20, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/565,124, Apr. 3, 2014, 4 pages.
“Final Office Action”, U.S. Appl. No. 13/780,228, Mar. 28, 2014, 13 pages.
“Foreign Office Action”, CN Application No. 201320328022.1, Feb. 17, 2014, 4 Pages.
“Foreign Office Action”, CN Application No. 201320328022.1, Oct. 18, 2013, 3 Pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/067912, Feb. 13, 2014, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,237, Mar. 24, 2014, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,376, Apr. 2, 2014, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/527,263, Apr. 3, 2014, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/656,055, Mar. 12, 2014, 17 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,139, Mar. 17, 2014, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/667,408, Mar. 13, 2014, 11 pages.
“Notice of Allowance”, U.S. Appl. No. 13/939,002, Mar. 3, 2014, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/939,032, Apr. 3, 2014, 4 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/653,321, Mar. 28, 2014, 4 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/063,912, Jan. 2, 2014, 10 pages.
“FingerWorks Installation and Operation Guide for the TouchStream ST and TouchStream LP”, FingerWorks, Inc. Retrieved from <http://ec1.images-amazon.com/media/i3d/01/A/man-migrate/MANUAL000049862.pdf>, 2002, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,232, Dec. 5, 2013, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/468,918, Dec. 26, 2013, 18 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/563,435, Jan. 14, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/563,435, Jan. 22, 2014, 2 pages.
“Notice of Allowance”, U.S. Appl. No. 13/653,321, Dec. 18, 2013, 4 pages.
“Foreign Office Action”, CN Application No. 201320097066.8, Oct. 24, 2013, 5 Pages.
“Non-Final Office Action”, U.S. Appl. No. 13/939,002, Dec. 20, 2013, 5 pages.
“Final Office Action”, U.S. Appl. No. 13/939,032, Dec. 20, 2013, 5 pages.
“Restriction Requirement”, U.S. Appl. No. 13/468,918, Nov. 29, 2013, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 13/565,124, Dec. 24, 2013, 6 pages.
“Final Office Action”, U.S. Appl. No. 13/564,520, Jan. 15, 2014, 7 pages.
“Cirago Slim Case®—Protective case with built-in kickstand for your iPhone 5®”, Retrieved from <http://cirago.com/wordpress/wp-content/uploads/2012/10/ipc1500brochure1.pdf> on Jan. 29, 2013, 1 page.
“Non-Final Office Action”, U.S. Appl. No. 13/471,001, (Feb. 19, 2013), 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,139, (Mar. 21, 2013), 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,202, (Feb. 11, 2013), 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,336, (Jan. 18, 2013), 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,195, (Jan. 2, 2013), 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,232, (Jan. 17, 2013), 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,272, (Feb. 12, 2013), 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,287, (Jan. 29, 2013), 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,304, (Mar. 22, 2013), 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,327, (Mar. 22, 2013), 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,871, (Mar. 18, 2013), 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,976, (Feb. 22, 2013), 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,321, (Feb. 1, 2013), 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,682, (Feb. 7, 2013), 11 pages.
“Notice of Allowance”, U.S. Appl. No. 13/470,633, (Mar. 22, 2013), 7 pages.
“Restriction Requirement”, U.S. Appl. No. 13/471,139, (Jan. 17, 2013), 7 pages.
“Restriction Requirement”, U.S. Appl. No. 13/651,304, (Jan. 18, 2013), 7 pages.
“Restriction Requirement”, U.S. Appl. No. 13/651,726, (Feb. 22, 2013), 6 pages.
“Restriction Requirement”, U.S. Appl. No. 13/651,871, (Feb. 7, 2013), 6 pages.
“The Microsoft Surface Tablets Comes With Impressive Design and Specs”, Retrieved from <http://microsofttabletreview.com/the-microsoft-surface-tablets-comes-with-impressive-desiqn-and-specs> on Jan. 30, 2013, (Jun. 2012), 2 pages.
“Tilt Shift Lenses: Perspective Control”, retrieved from <http://www.cambridgeincolour.com/tutorials/tilt-shift-lenses1.htm>, (Mar. 28, 2008), 11 Pages.
“What is Active Alignment?”, <http://www.kasalis.com/active_alignment.html>, retrieved on Nov. 22, 2012, 2 Pages.
“Advanced Configuration and Power Management Specification”, Intel Corporation, Microsoft Corporation, Toshiba Corp. Revision 1, (Dec. 22, 1996), 364 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/651,327, (Sep. 12, 2013), 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/651,327, (Sep. 23, 2013), 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/651,726, (Sep. 17, 2013), 2 pages.
“Final Office Action”, U.S. Appl. No. 13/471,139, (Sep. 16, 2013), 13 pages.
“Final Office Action”, U.S. Appl. No. 13/653,682, (Oct. 18, 2013), 16 pages.
“Final Office Action”, U.S. Appl. No. 13/656,055, (Oct. 23, 2013), 14 pages.
“Final Office Action”, U.S. Appl. No. 13/938,930, (Nov. 8, 2013), 10 pages.
“Final Office Action”, U.S. Appl. No. 13/939,002, (Nov. 8, 2013), 7 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/040968, (Sep. 5, 2013), 12 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/042550, (Sep. 24, 2013), 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/780,228, (Oct. 30, 2013), 12 pages.
“Notice of Allowance”, U.S. Appl. No. 13/563,435, (Nov. 12, 2013), 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,871, (Oct. 2, 2013), 7 pages.
“Notice to Grant”, CN Application No. 201320097089.9, (Sep. 29, 2013), 2 Pages.
“Notice to Grant”, CN Application No. 201320097124.7, (Oct. 8, 2013), 2 pages.
“Welcome to Windows 7”, Retrieved from: <http://www.microsoft.com/en-us/download/confirmation.aspx?id=4984> on Aug. 1, 2013, (Sep. 16, 2009), 3 pages.
Prospero, Michael “Samsung Outs Series 5 Hybrid PC Tablet”, Retrieved from:<http://blog.laptopmag.com/samsung-outs-series-5-hybrid-pc-tablet-running-windows-8> on Oct. 31, 2013, (Jun. 4, 2012), 7 pages.
“Accessing Device Sensors”, retrieved from <https://developer.palm.com/content/api/dev-guide/pdk/accessing-device-sensors.html> on May 25, 2012, 4 pages.
“First One Handed Fabric Keyboard with Bluetooth Wireless Technology”, Retrieved from: <http://press.xtvworld.com/article3817.html> on May 8, 2012,(Jan. 6, 2005), 2 pages.
“Force and Position Sensing Resistors: An Emerging Technology”, Interlink Electronics, Available at <http://staff.science.uva.nl/~vlaander/docu/FSR/An_Exploring_Technology.pdf>,(Feb. 1990), pp. 1-6.
“Frogpad Introduces Wearable Fabric Keyboard with Bluetooth Technology”, Retrieved from: <http://www.geekzone.co.nz/content.asp?contentid=3898> on May 7, 2012,(Jan. 7, 2005), 3 pages.
“Incipio LG G-Slate Premium Kickstand Case—Black Nylon”, Retrieved from: <http://www.amazon.com/Incipio-G-Slate-Premium-Kickstand-Case/dp/B004ZKP916> on May 8, 2012, 4 pages.
“Membrane Keyboards & Membrane Keypads”, Retrieved from: <http://www.pannam.com/> on May 9, 2012,(Mar. 4, 2009), 2 pages.
“Motion Sensors”, Android Developers, retrieved from <http://developer.android.com/guide/topics/sensors_motion.html> on May 25, 2012, 7 pages.
“Position Sensors”, Android Developers, retrieved from <http://developer.android.com/guide/topics/sensors_position.html> on May 25, 2012, 5 pages.
“SolRx™ E-Series Multidirectional Phototherapy Expandable™ 2-Bulb Full Body Panel System”, Retrieved from: <http://www.solarcsystems.com/us_multidirectional_uv_light_therapy_1_intro.html> on Jul. 25, 2012,(2011), 4 pages.
“Virtualization Getting Started Guide”, Red Hat Enterprise Linux 6, Edition 0.2, retrieved from <http://docs.redhat.com/docs/en-US/Red_Hat_Enterprise_Linux/6/html-single/Virtualization_Getting_Started_Guide/index.html> on Jun. 13, 2012, 24 pages.
Block, Steve et al., “DeviceOrientation Event Specification”, W3C, Editors Draft, retrieved from <https://developer.palm.com/content/api/dev-guide/pdk/accessing-device-sensors.html> on May 25, 2012,(Jul. 12, 2011), 14 pages.
Brown, Rich “Microsoft Shows Off Pressure-Sensitive Keyboard”, retrieved from <http://news.cnet.com/9301-17938_105-10304792-1.html> on May 7, 2012, (Aug. 6, 2009), 2 pages.
Butler, Alex et al., “SideSight: Multi-“touch” Interaction around Small Devices”, In the proceedings of the 21st annual ACM symposium on User Interface software and technology, retrieved from <http://research.microsoft.com/pubs/132534/sidesight_crv3.pdf> on May 29, 2012,(Oct. 19, 2008), 4 pages.
Crider, Michael “Sony Slate Concept Tablet “Grows” a Kickstand”, Retrieved from: <http://androidcommunity.com/sony-slate-concept-tablet-grows-a-kickstand-20120116/> on May 4, 2012,(Jan. 16, 2012), 9 pages.
Dietz, Paul H., et al., “A Practical Pressure Sensitive Computer Keyboard”, In Proceedings of UIST 2009,(Oct. 2009), 4 pages.
Glatt, Jeff “Channel and Key Pressure (Aftertouch)”, Retrieved from: <http://home.roadrunner.com/~jgglatt/tutr/touch.htm> on Jun. 11, 2012, 2 pages.
Hanlon, Mike “ElekTex Smart Fabric Keyboard Goes Wireless”, Retrieved from: <http://www.gizmag.com/go/5048/ > on May 7, 2012,(Jan. 15, 2006), 5 pages.
Kaur, Sukhmani “Vincent Liew's redesigned laptop satisfies ergonomic needs”, Retrieved from: <http://www.designbuzz.com/entry/vincent-liew-s-redesigned-laptop-satisfies-ergonomic-needs/> on Jul. 27, 2012,(Jun. 21, 2010), 4 pages.
Khuntontong, Puttachat et al., “Fabrication of Molded Interconnection Devices by Ultrasonic Hot Embossing on Thin Polymer Films”, IEEE Transactions on Electronics Packaging Manufacturing, vol. 32, No. 3,(Jul. 2009), pp. 152-156.
Linderholm, Owen “Logitech Shows Cloth Keyboard for PDAs”, Retrieved from: <http://www.pcworld.com/article/89084/logitech_shows_cloth_keyboard_for_pdas.html> on May 7, 2012,(Mar. 15, 2002), 5 pages.
McLellan, Charles “Eleksen Wireless Fabric Keyboard: a first look”, Retrieved from: <http://www.zdnetasia.com/eleksen-wireless-fabric-keyboard-a-first-look-40278954.htm> on May 7, 2012,(Jul. 17, 2006), 9 pages.
Post, E.R. et al., “E-Broidery: Design and Fabrication of Textile-Based Computing”, IBM Systems Journal, vol. 39, Issue 3 & 4,(Jul. 2000), pp. 840-860.
Purcher, Jack “Apple is Paving the Way for a New 3D GUI for IOS Devices”, Retrieved from: <http://www.patentlyapple.com/patently-apple/2012/01/apple-is-paving-the-way-for-a-new-3d-gui-for-ios-devices.html> on Jun. 4, 2012,(Jan. 12, 2012), 15 pages.
Takamatsu, Seiichi et al., “Flexible Fabric Keyboard with Conductive Polymer-Coated Fibers”, In Proceedings of Sensors 2011,(Oct. 28, 2011), 4 pages.
Zhang, et al., “Model-Based Development of Dynamically Adaptive Software”, In Proceedings of ICSE 2006, Available at <http://www.irisa.fr/lande/lande/icse-proceedings/icse/p371.pdf>, (May 20, 2006), pp. 371-380.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/470,633, (Apr. 9, 2013), 2 pages.
“Final Office Action”, U.S. Appl. No. 13/651,195, (Apr. 18, 2013), 13 pages.
“Final Office Action”, U.S. Appl. No. 13/651,232, (May 21, 2013), 21 pages.
“Final Office Action”, U.S. Appl. No. 13/651,287, (May 3, 2013), 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/563,435, (Jun. 14, 2013), 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/564,520, (Jun. 19, 2013), 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/565,124, (Jun. 18, 2013), 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,726, (Apr. 15, 2013), 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,682, (Jun. 3, 2013), 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/656,055, (Apr. 23, 2013), 11 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,202, (May 28, 2013), 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,272, (May 2, 2013), 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,327, (Jun. 11, 2013), 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,726, (May 31, 2013), 5 pages.
Jacobs, et al., “2D/3D Switchable Displays”, In the proceedings of Sharp Technical Journal (4), Available at <https://cgi.sharp.co.jp/corporate/rd/journal-85/pdf/85-04.pdf>,(Apr. 2003), pp. 15-18.
Morookian, et al., “Ambient-Light-Canceling Camera Using Subtraction of Frames”, NASA Tech Briefs, Retrieved from <http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20110016693_2011017808.pdf>,(May 2004), 2 pages.
“Advisory Action”, U.S. Appl. No. 14/199,924, May 28, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/565,124, Mar. 10, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/565,124, Apr. 14, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/651,232, Jul. 31, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/651,287, Aug. 21, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/667,408, Jun. 24, 2014, 9 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/938,930, May 6, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/938,930, Jun. 6, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/939,002, May 22, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/939,002, Jun. 19, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/939,002, May 5, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/939,032, Jun. 26, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/939,032, Jul. 15, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/199,924, Aug. 29, 2014, 2 pages.
“Final Office Action”, U.S. Appl. No. 13/471,376, Aug. 18, 2014, 24 pages.
“Final Office Action”, U.S. Appl. No. 13/595,700, Aug. 15, 2014, 6 pages.
“Final Office Action”, U.S. Appl. No. 13/599,635, Aug. 8, 2014, 16 pages.
“Final Office Action”, U.S. Appl. No. 13/653,682, Jun. 11, 2014, 11 pages.
“Final Office Action”, U.S. Appl. No. 14/063,912, Apr. 29, 2014, 10 pages.
“Final Office Action”, U.S. Appl. No. 14/199,924, May 6, 2014, 5 pages.
“Foreign Notice of Allowance”, CN Application No. 201320096755.7, Jan. 27, 2014, 2 pages.
“Foreign Office Action”, CN Application No. 201320097065.3, Jun. 18, 2013, 2 pages.
“Foreign Office Action”, CN Application No. 201320097079.5, Sep. 26, 2013, 4 pages.
“Interlink Electronics FSR™ Force Sensing Resistors™”, Retrieved at <<http://akizukidenshi.com/download/ds/interlinkelec/94-00004+Rev+B%20FSR%20Integration%20Guide.pdf>> on Mar. 21, 2013, 36 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2014/031531, Jun. 20, 2014, 10 Pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028483, Jun. 24, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028484, Jun. 24, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028485, Jun. 25, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028769, Jun. 26, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028771, Jun. 19, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028486, Jun. 20, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/041017, Jul. 17, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028489, Jun. 20, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028488, Jun. 24, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028767, Jun. 24, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028481, Jun. 19, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028490, Jun. 24, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028766, Jun. 26, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028772, Jun. 30, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028768, Jun. 24, 2014, 12 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028482, Jun. 20, 2014, 13 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028487, May 27, 2014, 9 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028770, Jun. 26, 2014, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/468,882, Jul. 9, 2014, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/468,949, Jun. 20, 2014, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/470,951, Jul. 2, 2014, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,001, Jun. 17, 2014, 23 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,030, May 15, 2014, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,282, Sep. 3, 2014, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,336, May 7, 2014, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,412, Jul. 11, 2014, 22 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/492,232, Apr. 30, 2014, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/564,520, Jun. 16, 2014, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/595,700, Jun. 18, 2014, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,976, Jun. 16, 2014, 23 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/063,912, Sep. 2, 2014, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/199,924, Apr. 10, 2014, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/200,595, Apr. 11, 2014, 4 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/225,250, Jun. 17, 2014, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/225,276, Jun. 13, 2014, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/277,240, Jun. 13, 2014, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 13/468,918, Jun. 17, 2014, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,186, Jul. 3, 2014, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,237, May 12, 2014, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,405, Jun. 24, 2014, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,232, Apr. 25, 2014, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,287, May 2, 2014, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 14/018,286, May 23, 2014, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 14/199,924, Jun. 10, 2014, 4 pages.
“Restriction Requirement”, U.S. Appl. No. 13/595,700, May 28, 2014, 6 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/471,405, Aug. 29, 2014, 5 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 14/018,286, Jun. 11, 2014, 5 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/471,030, Sep. 30, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/199,924, Sep. 5, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/199,924, Sep. 19, 2014, 2 pages.
“Final Office Action”, U.S. Appl. No. 13/468,949, Oct. 6, 2014, 11 pages.
“Final Office Action”, U.S. Appl. No. 13/471,336, Oct. 6, 2014, 13 pages.
“Final Office Action”, U.S. Appl. No. 13/492,232, Nov. 17, 2014, 13 pages.
“Final Office Action”, U.S. Appl. No. 13/595,700, Oct. 9, 2014, 8 pages.
“Final Office Action”, U.S. Appl. No. 13/656,055, Sep. 17, 2014, 10 pages.
“Final Office Action”, U.S. Appl. No. 14/200,595, Nov. 19, 2014, 5 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2014/043546, Oct. 9, 2014, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,393, Oct. 20, 2014, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/525,614, Nov. 24, 2014, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/780,228, Sep. 15, 2014, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/325,247, Nov. 17, 2014, 15 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,030, Sep. 5, 2014, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 13/653,682, Sep. 24, 2014, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 14/277,240, Sep. 16, 2014, 4 pages.
“Restriction Requirement”, U.S. Appl. No. 13/593,066, Oct. 8, 2014, 8 pages.
“Restriction Requirement”, U.S. Appl. No. 14/147,252, Dec. 1, 2014, 6 pages.
“Restriction Requirement”, U.S. Appl. No. 14/325,247, Oct. 6, 2014, 6 pages.
Harrison, “UIST 2009 Student Innovation Contest—Demo Video”, Retrieved From: <https://www.youtube.com/watch?v=PDI8eYIASf0> Sep. 16, 2014, Jul. 23, 2009, 1 page.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/277,240, Jan. 8, 2015, 2 pages.
“Final Office Action”, U.S. Appl. No. 13/470,951, Jan. 12, 2015, 20 pages.
“Final Office Action”, U.S. Appl. No. 13/471,412, Dec. 15, 2014, 11 pages.
“Final Office Action”, U.S. Appl. No. 13/527,263, Jan. 27, 2015, 7 pages.
“Final Office Action”, U.S. Appl. No. 14/063,912, Jan. 12, 2015, 12 pages.
“Final Office Action”, U.S. Appl. No. 14/225,276, Dec. 17, 2014, 6 pages.
“First Examination Report”, NZ Application No. 628690, Nov. 27, 2014, 2 pages.
“Foreign Office Action”, CN Application No. 201320097079.5, Jul. 28, 2014, 4 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,030, Jan. 15, 2015, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/564,520, Jan. 26, 2015, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/593,066, Jan. 2, 2015, 11 pages.
“Notice of Allowance”, U.S. Appl. No. 13/595,700, Jan. 21, 2015, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,976, Jan. 21, 2015, 10 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/471,405, Dec. 17, 2014, 5 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/651,232, Apr. 24, 2015, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/656,055, Apr. 13, 2015, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/200,595, Jun. 4, 2015, 3 pages.
“Final Office Action”, U.S. Appl. No. 13/468,882, Feb. 12, 2015, 9 pages.
“Final Office Action”, U.S. Appl. No. 13/525,614, Apr. 29, 2015, 20 pages.
“Final Office Action”, U.S. Appl. No. 13/780,228, Apr. 10, 2015, 19 pages.
“Final Office Action”, U.S. Appl. No. 14/225,250, Mar. 13, 2015, 7 pages.
“Final Office Action”, U.S. Appl. No. 14/325,247, Apr. 16, 2015, 21 pages.
“Foreign Notice on Reexamination”, CN Application No. 201320097066.8, Apr. 3, 2015, 7 Pages.
“Foreign Office Action”, CN Application No. 201310067808.7, May 28, 2015, 14 Pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,376, Mar. 27, 2015, 28 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,393, Mar. 26, 2015, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,412, Jun. 1, 2015, 31 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/492,232, Feb. 24, 2015, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/599,635, Feb. 12, 2015, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/852,848, Mar. 26, 2015, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/059,280, Mar. 3, 2015, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/063,912, May 7, 2015, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/147,252, Feb. 23, 2015, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/225,276, Apr. 23, 2015, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/468,949, Apr. 24, 2015, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 13/468,918, Apr. 8, 2015, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/468,949, Apr. 24, 2015, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,030, Apr. 6, 2015, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,282, Apr. 30, 2015, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 13/564,520, May 8, 2015, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,232, Mar. 30, 2015, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/656,055, Mar. 4, 2015, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 14/200,595, Feb. 17, 2015, 2 pages.
“Notice of Allowance”, U.S. Appl. No. 14/200,595, Feb. 25, 2015, 4 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/468,918, Jun. 4, 2015, 2 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/468,949, Jun. 5, 2015, 2 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/595,700, Apr. 10, 2015, 2 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/595,700, May 4, 2015, 2 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/595,700, May 22, 2015, 2 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/656,055, May 15, 2015, 2 pages.
Schafer,“Using Interactive Maps for Navigation and Collaboration”, CHI '01 Extended Abstracts on Human Factors in Computing Systems, Mar. 31, 2001, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/651,232, Jun. 10, 2015, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/651,232, Jul. 6, 2015, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/656,055, Jul. 1, 2015, 2 pages.
“Final Office Action”, U.S. Appl. No. 13/471,376, Jul. 28, 2015, 35 pages.
“Final Office Action”, U.S. Appl. No. 13/492,232, Jul. 10, 2015, 11 pages.
“Final Office Action”, U.S. Appl. No. 13/599,635, Jul. 30, 2015, 23 pages.
“Final Office Action”, U.S. Appl. No. 13/852,848, Jul. 20, 2015, 9 pages.
“Final Office Action”, U.S. Appl. No. 14/059,280, Jul. 22, 2015, 25 pages.
“Final Office Action”, U.S. Appl. No. 14/147,252, Jun. 25, 2015, 11 pages.
“Foreign Office Action”, CN Application No. 201310067335.0, Jun. 12, 2015, 15 Pages.
“Foreign Office Action”, CN Application No. 201310225788.1, Jun. 23, 2015, 14 Pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2014/031531, Jun. 9, 2015, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,336, Jun. 24, 2015, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/525,614, Jul. 31, 2015, 20 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/727,001, Jul. 10, 2015, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 14/225,276, Jun. 22, 2015, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 14/457,881, Jul. 22, 2015, 7 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/656,055, Jun. 10, 2015, 2 pages.
Cunningham,“Software Infrastructure for Natural Language Processing”, In Proceedings of the fifth conference on Applied natural language processing, Mar. 31, 1997, pp. 237-244.
“Advisory Action”, U.S. Appl. No. 13/471,376, Sep. 23, 2015, 7 pages.
“Advisory Action”, U.S. Appl. No. 14/059,280, Sep. 25, 2015, 7 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/471,030, Aug. 10, 2015, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/564,520, Aug. 14, 2015, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/564,520, Sep. 17, 2015, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/225,276, Aug. 27, 2015, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/225,276, Sep. 29, 2015, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/457,881, Aug. 20, 2015, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/457,881, Oct. 2, 2015, 2 pages.
“Decision on Reexamination”, CN Application No. 201320097079.5, Sep. 7, 2015, 8 Pages.
“Extended European Search Report”, EP Application No. 13858620.1, Sep. 18, 2015, 6 pages.
“Extended European Search Report”, EP Application No. 13859280.3, Sep. 7, 2015, 6 pages.
“Extended European Search Report”, EP Application No. 13859406.4, Sep. 8, 2015, 6 pages.
“Final Office Action”, U.S. Appl. No. 13/689,541, Nov. 2, 2015, 21 pages.
“Final Office Action”, U.S. Appl. No. 14/063,912, Sep. 3, 2015, 13 pages.
“Foreign Office Action”, CN Application No. 201310067385.9, Aug. 6, 2015, 16 pages.
“Foreign Office Action”, CN Application No. 201310067592.4, Oct. 23, 2015, 12 Pages.
“Foreign Office Action”, CN Application No. 201310067627.4, Sep. 28, 2015, 14 pages.
“Foreign Office Action”, CN Application No. 201310096345.7, Oct. 19, 2015, 16 Pages.
“Foreign Office Action”, CN Application No. 201310316114.2, Sep. 29, 2015, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/470,951, Oct. 1, 2015, 29 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,393, Sep. 30, 2015, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/780,228, Sep. 18, 2015, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/162,529, Sep. 18, 2015, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/225,250, Aug. 19, 2015, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/225,276, Aug. 19, 2015, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 14/727,001, Oct. 2, 2015, 4 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/468,918, Aug. 7, 2015, 4 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/468,949, Sep. 14, 2015, 2 pages.
“Foreign Office Action”, CN Application No. 201310067631.0, Dec. 10, 2015, 10 Pages.
“Extended European Search Report”, EP Application No. 13858283.8, Nov. 23, 2015, 10 pages.
“Extended European Search Report”, EP Application No. 13858397.6, Nov. 30, 2015, 7 pages.
“Extended European Search Report”, EP Application No. 13858674.8, Nov. 27, 2015, 6 pages.
“Extended European Search Report”, EP Application No. 13858834.8, Oct. 29, 2015, 8 pages.
“Extended European Search Report”, EP Application No. 13860272.7, Dec. 14, 2015, 9 pages.
“Extended European Search Report”, EP Application No. 13861292.4, Nov. 23, 2015, 7 pages.
“Final Office Action”, U.S. Appl. No. 13/471,336, Dec. 10, 2015, 17 pages.
“Foreign Office Action”, CN Application No. 201310065273.X, Oct. 28, 2015, 14 pages.
“Foreign Office Action”, CN Application No. 201310067429.8, Nov. 25, 2015, 12 Pages.
“Foreign Office Action”, CN Application No. 201310067622.1, Oct. 27, 2015, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/468,882, Nov. 13, 2015, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,376, Nov. 23, 2015, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,412, Nov. 20, 2015, 10 pages.
“Notice of Allowance”, U.S. Appl. No. 13/527,263, Dec. 9, 2015, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 13/852,848, Nov. 19, 2015, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 14/059,280, Nov. 23, 2015, 9 pages.
“Supplementary European Search Report”, EP Application No. 13728568.0, Oct. 30, 2015, 7 pages.
Related Publications (1)
Number Date Country
20130229534 A1 Sep 2013 US
Provisional Applications (7)
Number Date Country
61606321 Mar 2012 US
61606301 Mar 2012 US
61606313 Mar 2012 US
61606333 Mar 2012 US
61613745 Mar 2012 US
61606336 Mar 2012 US
61607451 Mar 2012 US