Input device sensor configuration

Information

  • Patent Grant
  • Patent Number
    9,952,106
  • Date Filed
    Monday, October 3, 2016
  • Date Issued
    Tuesday, April 24, 2018
Abstract
Input device configurations are described. In implementations, an input device includes a sensor substrate having a capacitive sensor to detect proximity of an object that contacts the input device. The input device also includes a flexible contact layer spaced apart from the sensor substrate, where the flexible contact layer is implemented to flex and contact the sensor substrate responsive to pressure applied by the object to initiate an input to a computing device. The flexible contact layer can include a force concentrator pad that is designed to cause pressure to be channeled through the force concentrator pad to cause the flexible contact layer to contact the sensor substrate to initiate the input.
Description
BACKGROUND

Mobile computing devices have been developed to increase the functionality that is made available to users in a mobile setting. For example, a user may interact with a mobile phone, tablet computer, or other mobile computing device to check email, surf the web, compose texts, interact with applications, and so on. Because mobile computing devices are configured to be mobile, however, they may be ill-suited for intensive data entry operations.


For example, some mobile computing devices provide a virtual keyboard that is accessible using touchscreen functionality of the device. However, it may be difficult to perform some tasks using a virtual keyboard, such as inputting a significant amount of text, composing a document, and so forth. Moreover, virtual keyboards consume screen real estate that may otherwise be used to display content. Thus, use of traditional virtual keyboards may be frustrating when confronted with some input scenarios.


SUMMARY

Input device configurations are described. In one or more implementations, an input device includes a sensor substrate having one or more conductors and a flexible contact layer spaced apart from the sensor substrate. The flexible contact layer is configured to flex to contact the sensor substrate to initiate an input of a computing device. The flexible contact layer includes a force concentrator pad that is configured to cause pressure to be channeled through the force concentrator pad to cause the flexible contact layer to contact the sensor substrate to initiate the input.


In one or more implementations, an input device includes a plurality of indications that are selectable to initiate corresponding inputs and pressure sensitive sensor nodes formed in an array such that each of the indications corresponds to a plurality of the pressure sensitive sensor nodes to initiate the corresponding inputs. The plurality of pressure sensitive sensor nodes is formed from a sensor substrate having one or more conductors and a flexible contact layer spaced apart from the sensor substrate that is configured to flex to contact the sensor substrate to initiate the corresponding input of a computing device.


In one or more implementations, an input device includes a sensor substrate having one or more conductors and a flexible contact layer spaced apart from the sensor substrate that is configured to flex to contact the sensor substrate to initiate an input of a computing device. The flexible contact layer includes a surface having a force sensitive ink configured to contact the one or more conductors of the sensor substrate to initiate the input and a plurality of spacers formed on the surface.


In one or more implementations, an input device includes a capacitive sensor assembly arranged in an array that is configured to detect a location of an object that is proximal to a respective capacitive sensor of the capacitive sensor assembly and a pressure sensitive sensor assembly including a plurality of pressure sensitive sensor nodes that are configured to detect an amount of pressure applied by the object against a respective pressure sensitive sensor node of the pressure sensitive sensor assembly.


In one or more implementations, an object is detected that is located proximal to one or more capacitive sensors of an input device. The input device is configured to communicate one or more inputs to a computing device. Responsive to the detection, functionality of the input device that is not related to the capacitive sensors is caused to be placed in an operational state.


In one or more implementations, an input device includes a capacitive sensor array configured to detect proximity of an object and a plurality of pressure sensitive sensor nodes embedded as nodes in the capacitive sensor array. The plurality of pressure sensitive sensor nodes are configured to initiate corresponding inputs of a computing device, each of the plurality of pressure sensitive sensor nodes formed from a flexible contact layer spaced apart from a sensor substrate.


This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.





BRIEF DESCRIPTION OF THE DRAWINGS

The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items.



FIG. 1 is an illustration of an environment in an example implementation that is operable to employ the sensor configuration techniques described herein.



FIG. 2 depicts an example implementation of an input device of FIG. 1 as showing a flexible hinge in greater detail.



FIG. 3 depicts an example implementation showing a perspective view of a connection portion of FIG. 2 that includes mechanical coupling protrusions and a plurality of communication contacts.



FIG. 4 depicts an example implementation showing a cross section of the input device of FIG. 1.



FIG. 5 depicts an example implementation of a backlight mechanism as including the light guide of FIG. 4 and a light source.



FIG. 6 depicts an example implementation in which a layer of the sensor assembly is shown in a cross section, the layer configured to support implementation of pressure sensitive sensor nodes.



FIG. 7 depicts an example implementation in which a flexible contact layer of FIG. 6 is shown in cross section as combined with a sensor substrate to form a pressure sensitive sensor node assembly.



FIG. 8 depicts an example implementation of a pressure sensitive sensor node of FIGS. 6 and 7 as employing a force concentrator pad.



FIG. 9 depicts an example of the pressure sensitive sensor node of FIG. 8 as having pressure applied at a plurality of different locations of the flexible contact layer to cause contact with the sensor substrate.



FIG. 10 depicts an example implementation in which pressure sensitive sensor nodes are arranged in an array that is configured to support gesture detection and use for a plurality of different input configurations.



FIG. 11 depicts an implementation showing examples of indications of inputs of the outer layer of FIG. 4 as corresponding to a plurality of underlying pressure sensitive sensor nodes of an array of FIG. 10.



FIG. 12 depicts an example implementation of the input device as including a pressure sensitive sensor assembly and a capacitive sensor assembly.



FIG. 13 depicts an example implementation in which pressure sensitive sensor nodes of a pressure sensitive sensor assembly are interspersed with capacitive sensors of a capacitive sensor assembly.



FIG. 14 depicts an example implementation in which the capacitive sensor assembly is configured as a layer and disposed proximal to a layer of a pressure sensitive sensor assembly.



FIG. 15 illustrates an example system generally at 1500 that includes an example computing device that is representative of one or more computing systems and/or devices that may implement the various techniques described herein.





DETAILED DESCRIPTION

Overview


Mobile computing devices may be utilized in a wide variety of different scenarios due to their mobile construction, e.g., configured to be held by one or more hands of a user. As previously described, however, conventional techniques that were utilized to interact with these mobile computing devices could be limited when restricted solely to a virtual keyboard. Although supplemental input devices have been developed (e.g., an external keyboard), these devices could be unwieldy and difficult to interact with in mobile scenarios, including limitations in inputs that are recognized by the input device, difficulty in transporting the devices, and so forth.


Input device configurations are described. In one or more implementations, an input device is configured to include a generally uniform array of pressure sensitive sensor nodes. The pressure sensitive sensor nodes may have a size and pitch that is sufficient to recognize gestures, e.g., made by a finger of a user's hand, a stylus, and so on, by detecting an input as involving motion across a plurality of the sensor nodes. Additionally, the array may also be configured such that collections of the pressure sensitive sensor nodes are mapped to particular inputs (e.g., keys of a keyboard) to also function as a keyboard, track pad, and so on. The input device may be configured in a variety of ways to implement pressure sensitive sensor nodes having this functionality. The input device may also be configured to promote a relatively thin form factor for the input device overall, e.g., less than three millimeters. This may be performed through use of force concentrator pads, integrated spacers, and so on. In this way, the input device may be configured to support a variety of different types of input functionality and may do so in a manner that maintains mobility of the mobile computing device to which it may be attached.


Additionally, the input device may also be configured to incorporate a capacitive sensor assembly. For instance, the capacitive sensor assembly may be configured to detect proximity of an object, and when so detected, wake other functionality of the input device (e.g., backlighting, operation of the pressure sensitive sensor nodes, and so on) and/or a computing device communicatively coupled to the input device. The capacitive sensor assembly may also operate in conjunction with the pressure sensitive sensor nodes to expose inputs having an increased richness to a computing device. The capacitive sensor assembly, for instance, may be employed to provide a location of an object and the pressure sensitive sensor nodes may be utilized to indicate an amount of pressure (i.e., a “z” indication). These inputs may be leveraged by the computing device to recognize gestures, gaming inputs, and so on and thus may provide increased input functionality to a user. A variety of other examples are also contemplated, further discussion of which may be found in relation to the following sections.


In the following discussion, an example environment is first described that may employ the input device configuration techniques described herein. Examples of layers that are usable in the example environment (i.e., the input device) are then described which may be performed in the example environment as well as other environments. Consequently, use of the example layers is not limited to the example environment and the example environment is not limited to use of the example layers.


Example Environment



FIG. 1 is an illustration of an environment 100 in an example implementation that is operable to employ the techniques described herein. The illustrated environment 100 includes an example of a computing device 102 that is physically and communicatively coupled to an input device 104 via a flexible hinge 106. The computing device 102 may be configured in a variety of ways. For example, the computing device 102 may be configured for mobile use, such as a mobile phone, a tablet computer as illustrated, and so on, that is configured to be held by one or more hands of a user. Thus, the computing device 102 may range from full-resource devices with substantial memory and processor resources to low-resource devices with limited memory and/or processing resources. The computing device 102 may also relate to software that causes the computing device 102 to perform one or more operations.


The computing device 102, for instance, is illustrated as including an input/output module 108. The input/output module 108 is representative of functionality relating to processing of inputs and rendering outputs of the computing device 102. A variety of different inputs may be processed by the input/output module 108, such as inputs relating to functions that correspond to keys of the input device 104 or keys of a virtual keyboard displayed by the display device 110, as well as gestures that may be recognized through the input device 104 and/or touchscreen functionality of the display device 110 and that cause corresponding operations to be performed, and so forth. Thus, the input/output module 108 may support a variety of different input techniques by recognizing and leveraging a division between types of inputs including key presses, gestures, and so on.


In the illustrated example, the input device 104 is configured as having an input portion that includes a keyboard having a QWERTY arrangement of keys and a track pad, although other arrangements of keys are also contemplated. Further, other non-conventional configurations are also contemplated, such as a game controller, a configuration to mimic a musical instrument, and so forth. Thus, the input device 104 and keys incorporated by the input device 104 may assume a variety of different configurations to support a variety of different functionality.


As previously described, the input device 104 is physically and communicatively coupled to the computing device 102 in this example through use of a flexible hinge 106. The flexible hinge 106 is flexible in that rotational movement supported by the hinge is achieved through flexing (e.g., bending) of the material forming the hinge as opposed to mechanical rotation as supported by a pin, although that embodiment is also contemplated. Further, this flexible rotation may be configured to support movement in one or more directions (e.g., vertically in the figure) yet restrict movement in other directions, such as lateral movement of the input device 104 in relation to the computing device 102. This may be used to support consistent alignment of the input device 104 in relation to the computing device 102, such as to align sensors used to change power states, application states, and so on.


The flexible hinge 106, for instance, may be formed using one or more layers of fabric and include conductors formed as flexible traces to communicatively couple the input device 104 to the computing device 102 and vice versa. This communication, for instance, may be used to communicate a result of a key press to the computing device 102, receive power from the computing device, perform authentication, provide supplemental power to the computing device 102, and so on.



FIG. 2 depicts an example implementation 200 of the input device 104 of FIG. 1 as showing the flexible hinge 106 in greater detail. In this example, a connection portion 202 of the input device is shown that is configured to provide a communicative and physical connection between the input device 104 and the computing device 102. The connection portion 202 as illustrated has a height and cross section configured to be received in a channel in the housing of the computing device 102, although this arrangement may also be reversed without departing from the spirit and scope thereof.


The connection portion 202 is flexibly connected to a portion of the input device 104 that includes the keys through use of the flexible hinge 106. Thus, when the connection portion 202 is physically connected to the computing device 102 the combination of the connection portion 202 and the flexible hinge 106 supports movement of the input device 104 in relation to the computing device 102 that is similar to a hinge of a book.


Through this rotational movement, a variety of different orientations of the input device 104 in relation to the computing device 102 may be supported. For example, rotational movement may be supported by the flexible hinge 106 such that the input device 104 may be placed against the display device 110 of the computing device 102 and thereby act as a cover. Thus, the input device 104 may act to protect the display device 110 of the computing device 102 from harm.


The connection portion 202 may be secured to the computing device in a variety of ways, an example of which is illustrated as including magnetic coupling devices 204, 206 (e.g., flux fountains), mechanical coupling protrusions 208, 210, and a plurality of communication contacts 212. The magnetic coupling devices 204, 206 are configured to magnetically couple to complementary magnetic coupling devices of the computing device 102 through use of one or more magnets. In this way, the input device 104 may be physically secured to the computing device 102 through use of magnetic attraction.


The connection portion 202 also includes mechanical coupling protrusions 208, 210 to form a mechanical physical connection between the input device 104 and the computing device 102. The mechanical coupling protrusions 208, 210 are shown in greater detail in relation to FIG. 3, which is discussed below.



FIG. 3 depicts an example implementation 300 showing a perspective view of the connection portion 202 of FIG. 2 that includes the mechanical coupling protrusions 208, 210 and the plurality of communication contacts 212. As illustrated, the mechanical coupling protrusions 208, 210 are configured to extend away from a surface of the connection portion 202, which in this case is perpendicular although other angles are also contemplated.


The mechanical coupling protrusions 208, 210 are configured to be received within complementary cavities within the channel of the computing device 102. When so received, the mechanical coupling protrusions 208, 210 promote a mechanical binding between the devices when forces are applied that are not aligned with an axis defined as corresponding to the height of the protrusions and the depth of the cavity.


The connection portion 202 is also illustrated as including a plurality of communication contacts 212. The plurality of communication contacts 212 is configured to contact corresponding communication contacts of the computing device 102 to form a communicative coupling between the devices as shown. The connection portion 202 may be configured in a variety of other ways, including use of a rotational hinge, mechanical securing device, and so on. In the following, an example of a docking apparatus 112 is described and shown in a corresponding figure.



FIG. 4 depicts an example implementation 400 showing a cross section of the input device 104 of FIG. 1. The outer layer 402 is configured to supply an outer surface of the input device 104 that a user may touch and interact with. The outer layer 402 may be formed in a variety of ways, such as from a fabric material, e.g., a backlight-compatible polyurethane with a heat emboss for key formation, use of a laser to form indications of inputs, and so on.


Beneath the outer layer is a smoothing layer 404. The smoothing layer 404 may be configured to support a variety of different functionality. This may include use as a support to reduce wrinkling of the outer layer 402, such as through formation as a thin plastic sheet, e.g., approximately 0.125 millimeters of polyethylene terephthalate (PET), to which the outer layer 402 is secured through use of an adhesive. The smoothing layer 404 may also be configured to include masking functionality to reduce and even eliminate unwanted light transmission, e.g., “bleeding” of light through the smoothing layer 404 and through a fabric outer layer 402. The smoothing layer also provides a continuous surface under the outer layer, such that it hides any discontinuities or transitions between the inner layers.


A light guide 406 is also illustrated, which may be included as part of a backlight mechanism to support backlighting of indications (e.g., legends) of inputs of the input device 104. This may include illumination of keys of a keyboard, game controls, gesture indications, and so on. The light guide 406 may be formed in a variety of ways, such as from a 250 micron thick sheet of a plastic, e.g., a clear polycarbonate material with etched texturing. Additional discussion of the light guide 406 may be found beginning in relation to FIG. 5.


A sensor assembly 408 is also depicted. Thus, as illustrated the light guide 406 and the smoothing layer 404 are disposed between the outer layer 402 and the sensor assembly 408. The sensor assembly 408 is configured to detect proximity of an object to initiate an input. The detected input may then be communicated to the computing device 102 (e.g., via the connection portion 202) to initiate one or more operations of the computing device 102. The sensor assembly 408 may be configured in a variety of ways to detect proximity of inputs, such as a capacitive sensor array, a plurality of pressure sensitive sensor nodes (e.g., membrane switches using a force sensitive ink), mechanical switches, a combination thereof, and so on.


A structure assembly 410 is also illustrated. The structure assembly 410 may be configured in a variety of ways, such as a trace board and backer that are configured to provide rigidity to the input device 104, e.g., resistance to bending and flexing. A backing layer 412 is also illustrated as providing a rear surface to the input device 104. The backing layer 412, for instance, may be formed from a fabric similar to the outer layer 402 that omits one or more sub-layers of the outer layer 402, e.g., a 0.38 millimeter thick fabric made of wet and dry layers of polyurethane. Although examples of layers have been described, it should be readily apparent that a variety of other implementations are also contemplated, including removal of one or more of the layers, addition of other layers (e.g., a dedicated force concentrator layer, mechanical switch layer), and so forth. Thus, the following discussion of examples of layers is not limited to incorporation of those layers in this example implementation 400 and vice versa.



FIG. 5 depicts an example implementation 500 of a backlight mechanism as including a light guide 406 of FIG. 4 and a light source. As previously described, the light guide 406 may be configured in a variety of ways to support transmission of light that is to act as a backlight for the input device 104. For example, the light guide 406 may be configured from a clear plastic or other material that supports transmission of light from a light source 502, which may be implemented using one or more light emitting diodes (LEDs). The light guide 406 is positioned to receive the emitted light from the light source 502 through a side of the light guide 406 and emit the light through one or more other sides and/or surface regions of the light guide 406.


The light guide 406, for instance, may be configured to output light at specific locations through use of etching, embossing, contact by another material having a different refractive index (e.g., an adhesive disposed on the plastic of the light guide 406), and so on. In another example, the light guide 406 may be configured as a universal light guide such that a majority (and even an entirety) of a surface of the light guide 406 may be configured to output light, e.g., through etching of a majority of a surface 504 of the light guide 406. Thus, instead of specially configuring the light guide 406 in this example, the same light guide may be used to output different indications of inputs, which may be used to support different languages, arrangements of inputs, and so on by the input device 104.


As previously described, however, this could cause bleeding of light through adjacent surfaces to the light guide in conventional techniques, such as through an outer layer 402 of fabric to give a “galaxy” effect, pinholes, and so on. Accordingly, one or more of these adjacent layers may be configured to reduce and even prevent transmission of light in undesirable locations. For example, the outer layer may include sub-layers having progressively darker shades of a color to enable use of a light surface color yet restrict transmission of light through the fabric, a mask of ink may be printed (e.g., to the smoothing layer 404) to absorb light at particular locations (e.g., near the light source), and so on. A variety of other examples are also contemplated.



FIG. 6 depicts an example implementation 600 in which a layer of the sensor assembly 408 is shown in a cross section, the layer configured to support implementation of pressure sensitive sensor nodes. A flexible contact layer 602 (e.g., Mylar) of a pressure sensitive sensor node is illustrated in this example that is configured to flex to initiate contact and thus an input. In this example, the flexible contact layer 602 does not perform this contact absent application of pressure against the flexible contact layer 602 as further described in relation to FIG. 7.


The flexible contact layer 602 in this example includes a force sensitive ink 604 disposed on a surface of the flexible contact layer 602. The force sensitive ink 604 is configured such that an amount of resistance of the ink varies inversely with an amount of pressure applied. The force sensitive ink 604, for instance, may be configured with a relatively rough surface that is compressed against another surface (e.g., a conductor as shown in FIG. 7) upon an application of pressure against the flexible contact layer 602. The greater the amount of pressure, the more the force sensitive ink 604 is compressed, thereby increasing conductivity and decreasing resistance of the force sensitive ink 604. Other conductors may also be disposed on the flexible contact layer 602 without departing from the spirit and scope thereof, including other types of pressure sensitive and non-pressure sensitive conductors.
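
As an illustration of the inverse pressure/resistance relationship just described, the following Python sketch converts a 12-bit reading taken across a force sensitive node into a rough force estimate. The divider topology, component values, and the F ≈ k/R model are assumptions made for illustration only and are not specified by the patent.

```python
# Minimal sketch: infer a force estimate from a 12-bit ADC reading taken at the
# junction of a voltage divider in which the force sensitive node is the upper
# leg (supply to ADC input) and a fixed resistor is the lower leg (ADC input to
# ground). All component values and the force/resistance model are assumed.

ADC_MAX = 4095          # 12-bit converter, matching the resolution described
V_SUPPLY = 3.3          # assumed supply voltage, volts
R_FIXED = 10_000.0      # assumed fixed pull-down resistor, ohms

def node_resistance(adc_counts: int) -> float:
    """Resistance of the force sensitive ink inferred from the divider voltage."""
    v_node = (adc_counts / ADC_MAX) * V_SUPPLY
    if v_node <= 0.0:
        return float("inf")  # no contact between the ink and the conductors
    return R_FIXED * (V_SUPPLY - v_node) / v_node

def estimated_force_grams(adc_counts: int, k: float = 2.5e6) -> float:
    """Assumed inverse model: force grows as the ink resistance falls (F ~ k / R)."""
    r = node_resistance(adc_counts)
    return 0.0 if r == float("inf") else k / r

if __name__ == "__main__":
    for counts in (0, 512, 2048, 3500):
        print(counts, "counts ->", round(estimated_force_grams(counts), 1), "g (approx.)")
```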


The flexible contact layer 602 is also illustrated as including spacers 606 formed on the same surface of the flexible contact layer 602 as the force sensitive ink 604. The spacers 606 define openings through which the flexible contact layer 602 is to flex to initiate inputs. The spacers 606 may be configured in a variety of ways, such as through use of a dielectric spacer material having a height of approximately 6.5 um. The flexible contact layer 602 is also illustrated as including a securing mechanism 608 (e.g., 25 um of adhesive) to secure the flexible contact layer 602 to an adjacent layer of the pressure sensitive sensor node assembly.


Force concentrator pads 610 are also illustrated as disposed on an opposing side of the flexible contact layer 602 in relation to the side of flexible contact layer 602 that includes the force sensitive ink 604. The force concentrator pads 610 are illustrated as secured to and/or a part of the flexible contact layer 602, such as formed from a material to have a height of approximately 50 um. The force concentrator pads have a cross section along an axis of the flexible contact layer 602 that approximates a cross section of the force sensitive ink 604 disposed on an opposing side of the flexible contact layer 602. The force concentrator pads 610 may be configured to channel pressure applied to the input device 104 to promote consistent contact of the force sensitive ink 604, further discussion of which may be found beginning in relation to the discussion of FIG. 8.



FIG. 7 depicts an example implementation 700 in which the flexible contact layer 602 of FIG. 6 is shown in cross section as combined with a sensor substrate 702 to form a pressure sensitive sensor node assembly. The sensor substrate 702 may be configured in a variety of ways, such as a printed circuit board (PCB) having conductors 704 disposed thereon.


The conductors 704 are configured to be contacted by the force sensitive ink 604 of the flexible contact layer 602. When contacted, an analog signal may be generated for processing by the input device 104 and/or the computing device 102, e.g., to recognize whether the signal is likely intended by a user to provide an input for the computing device 102. A variety of different types of conductors 704 may be disposed on the sensor substrate 702, such as formed from a variety of conductive materials (e.g., silver, copper), disposed in a variety of different configurations as further described in relation to FIG. 10, and so on.


The sensor substrate 702 is also illustrated as including spacers 706. The spacers 706 are disposed on the same surface as the conductors 704 on the sensor substrate 702 in an area between the conductors. The spacers 706 of the sensor substrate 702 and the spacers 606 of the flexible contact layer 602 may be positioned to form a spacer assembly as shown in the figure, e.g., having a total height of 41 um. This height may thus cause the force sensitive ink 604 of the flexible contact layer 602 to be spaced apart from the conductors 704 of the sensor substrate 702.


Application of a pressure against the flexible contact layer 602 may cause the flexible contact layer 602 to flex through an opening formed by the spacer assembly to contact the conductors 704 of the sensor substrate 702. As previously described, the amount of pressure may be communicated through different resistances of the force sensitive ink 604 to provide an output that indicates an amount of pressure that was applied, e.g., with twelve bits of resolution as further described below.


The securing mechanism 608 (e.g., the adhesive described in relation to FIG. 6) may be used to secure the flexible contact layer 602 to the sensor substrate 702. This may be performed directly to a surface of the sensor substrate 702, include use of a solder mask 708 having a height approximating the force sensitive ink 604 to increase a height of a gap between the ink and the conductors 704, and so on.


The flexible contact layer 602 is also illustrated as secured to the light guide 406 through use of the previously described adhesives 612. As this may cause light to bleed from the light guide 406, the flexible contact layer 602 may be configured to promote reflectance of this light (e.g., by being colored white). Additionally, to reduce an amount of light bleed “upward” through the smoothing layer 404 and outer surface 402 of FIG. 4, a lesser surface area of adhesive 710 may be used to secure the smoothing layer 404 to the light guide 406 than is used to secure the light guide 406 to the flexible contact layer 602. Other techniques may also be utilized to reduce this light bleed, such as to include a mask 712 printed as a black ink to portions of the smoothing layer 404 that are secured to the light guide 406, e.g., that have adhesive disposed thereon.



FIG. 8 depicts an example implementation 800 of a pressure sensitive sensor node of FIGS. 6 and 7 as employing a force concentrator pad 610. In this example, the flexible contact layer 602 is spaced apart from the sensor substrate 702 through use of a spacer assembly 802, which may employ the spacers 606, 706 of FIGS. 6 and 7. The force concentrator pad 610 may be implemented in a variety of ways, such as part of the flexible contact layer 602 as illustrated, as a stand-alone layer, as part of a smoothing layer 404, and so on.


As previously described, the flexible contact layer 602 may be configured from a variety of materials, such as a flexible material (e.g., Mylar) that is capable of flexing to contact a sensor substrate 702. The flexible contact layer 602 in this instance includes a force concentrator pad 610 disposed thereon that is raised from a surface of the flexible contact layer 602. Thus, the force concentrator pad 610 is configured as a protrusion to contact another layer of the input device 104, such as the light guide 406, smoothing layer 404, outer surface 402, and so on. The force concentrator pad 610 may be formed in a variety of ways, such as formation as a layer (e.g., printing, deposition, forming, etc.) on a substrate of the flexible contact layer 602 (e.g., Mylar), as an integral part of the substrate itself, and so on.



FIG. 9 depicts an example 900 of the pressure sensitive sensor node 800 of FIG. 8 as having pressure applied at a plurality of different locations of the flexible contact layer 602 to cause contact with the sensor substrate 702. The pressure is illustrated through use of arrows, which in this instance include first, second, and third locations 902, 904, 906 which are positioned at distances that are respectively closer to an edge of the sensor, e.g., an edge defined by the spacer assemblies 802.


As illustrated, the force concentrator pad 610 is sized so as to permit the flexible contact layer 602 to flex between the spacer assemblies 802. The force concentrator pad 610 is configured to provide increased mechanical stiffness and thus improved resistance to localized bending and flexing around a single sensor, e.g., in comparison with a substrate (e.g., Mylar) of the flexible contact layer 602 alone. Therefore, when the force concentrator pad 610 receives pressure (e.g., is “pressed”), the flexible contact layer 602 has a smaller bend radius than would otherwise be the case.


Thus, the bending of a substrate of the flexible contact layer 602 around the force concentrator pad 610 may promote a relatively consistent contact area between the force sensitive ink 604 and the conductors 704 of the sensor substrate 702. This may promote normalization of a signal produced by the key, e.g., to address “off center” contact.


The force concentrator pad 610 may also act to spread a contact area of a source of the pressure. The flexible contact layer 602, for instance, may receive a pressure caused by a fingernail, a tip of a stylus, pen, or other object that has a relatively small contact area. This could result in a correspondingly small contact area of the flexible contact layer 602 that contacts the sensor substrate 702, and thus a corresponding decrease in signal strength.


However, due to the mechanical stiffness of the force concentrator pad 610, this pressure may be spread across an area of the force concentrator pad 610, which is then spread across an area of the flexible contact layer 602 that correspondingly bends around the spacer assemblies 802 to contact the sensor substrate 702. In this way, the force concentrator pad 610 may be used to distribute and normalize a contact area between the flexible contact layer 602 and the sensor substrate 702 that is used to generate a signal by the pressure sensitive sensor node.


The force concentrator pad 610 may also act to channel pressure, even if this pressure is applied “off center.” For example, the flexibility of the flexible contact layer 602 may depend at least partially on a distance from an edge of the pressure sensitive sensor node, e.g., an edge defined by the spacer assembly 802 in this instance.


The force concentrator pad 610, however, may be used to channel pressure to the flexible contact layer 602 to promote relatively consistent contact. For example, pressure applied at a first location 902 that is positioned at a general center region of the flexible contact layer 602 may cause contact that is similar to contact achieved when pressure is applied at a second location 904 that is positioned at an edge of the force concentrator pad 610. Pressures applied outside of a region defined by the force concentrator pad 610 may also be channeled through use of the force concentrator pad 610, such as at a third location 906 that is located outside of the region defined by the force concentrator pad 610 but within an edge of the sensor node. Pressure applied at a position that is located outside of a region of the flexible contact layer 602 defined by the spacer assembly 802 may also be channeled to cause the flexible contact layer 602 to contact the sensor substrate 702 as illustrated. A variety of different configurations of pressure sensitive sensor assemblies may leverage the pressure sensitive sensor nodes previously described, an example of which is described as follows and shown in a corresponding figure.



FIG. 10 depicts an example implementation 1000 in which pressure sensitive sensor nodes are arranged in an array (e.g., a pool) that is configured to support gesture detection and use for a plurality of different input configurations. In this example, a pressure sensitive sensor assembly 1002 of the input device 104 includes a plurality of pressure sensitive sensor nodes, which are illustrated as having a generally uniform size and spacing in a square shape, although other sizes, spacings, and arrangements are also contemplated without departing from the spirit and scope thereof.


An enlarged example of a conductor 704 of the pressure sensitive sensor assembly 1002 is also shown. In this example, conductors 704 of the sensor substrate 702 are configured as first and second portions of inter-digitated trace fingers. Thus, a pressure applied to the flexible contact layer 602 may cause the force sensitive ink 604 to contact the conductors 704 and act as a shunt to permit a flow of electricity between the first and second portions of inter-digitated trace fingers. Other examples are also contemplated, such as disposing the first portion on the flexible contact layer 602 and the second portion on the sensor substrate 702, with the force sensitive ink disposed between the layers having the portions.


In the illustrated example, the input device 104 includes an array of sensors spaced in a generally uniform manner, e.g., individual sensors placed approximately five millimeters apart on center in a grid arrangement. The sensors are illustrated as squares in the example, although other sizes and arrangements are also contemplated, such as staggered, generally circular sensors, and so on. Further, the sensors may be configured in a variety of ways, such as pressure sensitive sensors, a capacitive grid as described in relation to FIG. 13, and so on.


The size and spacing of the sensors may be configured in a variety of ways. For example, a sensor may be configured to have a surface area of approximately 25 square millimeters (e.g., a 5×5 millimeter square or less), a surface area of approximately nine square millimeters (e.g., a 3×3 millimeter square), or a surface area of approximately 2.25 square millimeters (e.g., a 1.5×1.5 millimeter square configured to detect a fingernail, stylus, and so on), may have a pitch of approximately five millimeters or less, and so on. Additionally, a variety of different detection and sampling rates may be supported, such as a one kilohertz sampling rate with twelve bits of resolution (e.g., to indicate pressure) that is responsive to twenty-five grams of pressure. In this way, the array may be configured to detect gestures across a sequence of the sensors, may support dynamic mapping of key presses to corresponding indications as described in relation to FIG. 11, and so on. It should be readily apparent that a wide variety of other examples are also contemplated. Regardless of how implemented, sensors of the array may thus correspond to indications of inputs on the outer surface 402 of the input device 104, further discussion of which is described as follows and shown in a corresponding figure.
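
The following Python sketch is a hedged illustration of how an array of this kind might be scanned and a simple swipe recognized from motion across a sequence of nodes; the grid size, press threshold, and frame format are assumed for illustration and are not taken from the description above.

```python
# Minimal sketch: scan a uniform grid of pressure sensitive sensor nodes and
# report a left-to-right swipe when successive frames show the pressed region
# moving across columns. Grid size, threshold, and timing are assumed values.

PRESS_THRESHOLD = 200      # counts out of 4095 (12-bit) treated as a press

def pressed_columns(frame):
    """Columns that contain at least one pressed node in a frame (rows x cols)."""
    cols = set()
    for row in frame:
        for col, value in enumerate(row):
            if value >= PRESS_THRESHOLD:
                cols.add(col)
    return cols

def is_left_to_right_swipe(frames):
    """True if the pressed region advances monotonically across columns."""
    centers = []
    for frame in frames:
        cols = pressed_columns(frame)
        if cols:
            centers.append(sum(cols) / len(cols))
    return len(centers) >= 3 and all(b > a for a, b in zip(centers, centers[1:]))

if __name__ == "__main__":
    # Three 4x6 frames in which the contact moves from column 1 to column 4.
    blank = [[0] * 6 for _ in range(4)]
    def frame_with(col):
        f = [row[:] for row in blank]
        f[2][col] = 900
        return f
    print(is_left_to_right_swipe([frame_with(1), frame_with(2), frame_with(4)]))  # True
```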



FIG. 11 depicts an implementation 1100 showing examples of indications 1102, 1104 of inputs of the outer layer 402 of FIG. 4 as corresponding to a plurality of underlying pressure sensitive sensor nodes of an array of FIG. 10. The first indication 1102 is taken from the outer layer 402 and shows the letter "A" of a QWERTY keyboard that, once selected (e.g., pressed), is to cause a corresponding input to be provided by the input device 104 to the computing device 102.


The indication 1102 is disposed over four sensors of the array of the pressure sensitive sensor assembly 1002 of FIG. 10, which are illustrated in phantom. Accordingly, a mapping may be employed such that an output from any, all, or a combination of these sensors is recognized by the computing device 102 as the indicated input, e.g., a key press of the letter "A."
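
A minimal sketch of such a mapping, assuming hypothetical node coordinates and an "any node in the group" rule, might look like the following; none of these names or groupings come from the patent.

```python
# Minimal sketch: map groups of underlying sensor nodes to key indications, so
# that a press reported by any node in a group is treated as that key. The
# node coordinates, grouping, and "any node" rule are assumed for illustration.

KEY_MAP = {
    "A": {(3, 2), (3, 3), (4, 2), (4, 3)},   # four nodes under the "A" legend
    "S": {(3, 4), (3, 5), (4, 4), (4, 5)},
}

def keys_pressed(pressed_nodes):
    """Return the key indications whose node groups overlap the pressed nodes."""
    return {key for key, nodes in KEY_MAP.items() if nodes & set(pressed_nodes)}

if __name__ == "__main__":
    print(keys_pressed([(4, 3)]))          # {'A'}
    print(keys_pressed([(3, 5), (4, 2)]))  # {'A', 'S'}
```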


Likewise, indication 1104 is taken from a game controller configuration and is of an input for a rocker control, such as to provide inputs to control a direction of an object in a game. The indication 1104 is also disposed over a plurality of sensors (e.g., the pressure sensitive sensor nodes of the array of FIG. 10), which are also illustrated in phantom. Accordingly, a mapping may be employed such that an output from any, all, or a combination of these sensors is recognized by the computing device 102 as the indicated input, e.g., different directions dependent on which part of the rocker control receives contact.


Additionally, techniques may be employed to detect a centroid of a contact to determine a likely intent of a contact received from a user. For the indication 1104 of the rocker control, for instance, a centroid of a user's finger may be detected to determine a likely direction. This technique may also be employed to determine which of a plurality of indications likely corresponds to an input; when a user contacts a border between multiple indications, for instance, the centroid may be used to determine which indication and corresponding sensor is likely intended as an input by the user. Capacitive sensors may also be incorporated to aid this detection as further described beginning in relation to FIG. 13. Although a generally uniform array of sensors was described, other arrangements may also be employed that are not uniform, e.g., to follow a configuration of a QWERTY keyboard and map other functionality such as a game controller over this configuration.
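
One way a pressure-weighted centroid could be computed and resolved to the nearest indication is sketched below; the node readings, key centers, and distance rule are assumptions for illustration only, not details from the patent.

```python
# Minimal sketch: estimate the centroid of a contact from per-node pressure
# readings and resolve it to the nearest key indication. The node pitch, key
# centers, and weighting scheme are assumed for illustration only.

def pressure_centroid(readings):
    """Pressure-weighted centroid of {(row, col): pressure} readings."""
    total = sum(readings.values())
    if total == 0:
        return None
    r = sum(row * p for (row, _), p in readings.items()) / total
    c = sum(col * p for (_, col), p in readings.items()) / total
    return (r, c)

def nearest_indication(centroid, key_centers):
    """Key whose center is closest to the contact centroid."""
    (r, c) = centroid
    return min(key_centers, key=lambda k: (key_centers[k][0] - r) ** 2
                                          + (key_centers[k][1] - c) ** 2)

if __name__ == "__main__":
    readings = {(3, 2): 300, (3, 3): 900, (4, 3): 600}   # press near a key border
    centers = {"A": (3.5, 2.5), "S": (3.5, 4.5)}
    centroid = pressure_centroid(readings)
    print(centroid, nearest_indication(centroid, centers))   # resolves to "A"
```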



FIG. 12 depicts an example implementation 1200 of the input device 104 as including a pressure sensitive sensor assembly 1202 and a capacitive sensor assembly 1204. The pressure sensitive sensor assembly 1202 may be configured in a variety of ways, such as an array as described in relation to FIG. 10, one to one correspondence between indications of keys of FIG. 1 and underlying sensors, and so on. In the illustrated example, the pressure sensitive sensor assembly 1202 is illustrated in phantom as disposed underneath indications of keys of a keyboard of FIG. 1.


The input device 104 also includes capacitive sensor assemblies 1204, which are illustrated as disposed beneath palm rests of the input device 104, although other configurations are also contemplated as further described below. The capacitive sensor assemblies 1204 are configured to detect proximity of an object, such as a finger of a user's hand 1206 as illustrated, a stylus, or another object. This detection may be leveraged to support a wide variety of different functionality. For example, the capacitive sensor assemblies 1204 may remain operational (e.g., awake) while other functionality of the input device 104 (e.g., the pressure sensitive sensor assembly 1202, backlighting, and so on), the computing device 102, and so on are in a non-operational state, e.g., a sleep state, hibernation state, “off,” and so on. This may be performed to reduce power consumption by these devices.


Responsive to detection of an object (e.g., the finger of the user's hand 1206), the input device 104 may cause this other functionality of the input device 104 and/or computing device 102 to “wake.” For example, this may cause examination of an output of a camera of the computing device 102 and/or input device 104 to determine whether to turn the backlighting of the input device 104 “on” based on an amount of light detected in the surroundings of the device, place the pressure sensitive sensor assembly 1202 in an operational state to detect pressure, and so on. In this way, the capacitive sensor assembly 1204 may be utilized to conserve power consumed by the input device 104 and/or the computing device 102 as well as increase responsiveness of these devices, e.g., by waking before contact is even received by the pressure sensitive sensor node assembly. Additionally, this may be utilized to protect against inadvertent presses of the keys of the input device 104, such as in combination with detection of a location of the input device 104 in relation to the computing device 102, e.g., positioned at a rear of the computing device 102 through use of a Hall Effect sensor, accelerometers, magnetometers, and so forth.
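
The following sketch illustrates one possible wake/sleep policy of this kind, with the always-on capacitive detection placing the other functionality in an operational state; the state names, timeout, and hardware hooks are hypothetical and not drawn from the patent.

```python
# Minimal sketch: keep only the capacitive sensors powered, and wake the other
# input device functionality (pressure scanning, backlight) when proximity is
# detected. The timeout, state handling, and hooks are assumed for illustration.

import time

class InputDevicePower:
    def __init__(self, idle_timeout_s=30.0):
        self.awake = False
        self.idle_timeout_s = idle_timeout_s
        self.last_activity = 0.0

    def on_capacitive_proximity(self, now=None):
        """Called when the always-on capacitive assembly detects an object."""
        now = time.monotonic() if now is None else now
        self.last_activity = now
        if not self.awake:
            self.awake = True
            self._enable_pressure_scanning()
            self._enable_backlight_if_dark()

    def tick(self, now=None):
        """Return to the low-power state after a period with no proximity."""
        now = time.monotonic() if now is None else now
        if self.awake and now - self.last_activity > self.idle_timeout_s:
            self.awake = False
            self._disable_pressure_scanning()
            self._disable_backlight()

    # Placeholder hooks; real implementations are device specific.
    def _enable_pressure_scanning(self): print("pressure scanning on")
    def _disable_pressure_scanning(self): print("pressure scanning off")
    def _enable_backlight_if_dark(self): print("backlight on (if ambient light is low)")
    def _disable_backlight(self): print("backlight off")

if __name__ == "__main__":
    power = InputDevicePower(idle_timeout_s=0.1)
    power.on_capacitive_proximity(now=0.0)
    power.tick(now=0.05)   # still awake
    power.tick(now=0.5)    # times out, back to low power
```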


The capacitive sensor assembly 1204 may be configured in a variety of ways to perform this object detection. For example, the capacitive sensor assembly 1204 may be configured in a portion of the input device 104 that is separate and non-overlapping from a portion including the pressure sensitive sensor assembly 1202 as shown in FIG. 12. In this example, the capacitive sensor assembly 1204 may be configured to detect presence of an object but not a specific location of the object beyond the presence of the object in a sensing range of capacitive sensors of the capacitive sensor assembly 1204. The capacitive sensor assembly 1204 may also be configured to be interspersed between the pressure sensitive sensor nodes of the pressure sensitive sensor node assembly, an example of which is described in relation to FIG. 13. In yet another example, the capacitive sensor assembly 1204 may be formed as a layer that is disposed proximal to a layer of the pressure sensitive sensor node assembly, an example of which is described in relation to FIG. 14.



FIG. 13 depicts an example implementation 1300 in which pressure sensitive sensor nodes of a pressure sensitive sensor assembly 1202 are interspersed with capacitive sensors of a capacitive sensor assembly 1204. In this example, the conductors 704 of the pressure sensitive sensor nodes are formed on the sensor substrate 702 as inter-digitated trace fingers as described above. Capacitive trace lines of the capacitive sensor assembly 1204 are also illustrated.


Thus, in this example the pressure sensitive sensor nodes are embedded into a capacitive sensor array to support both capacitive location and pressure inputs being reported by the input device 104. The accuracy and linearity of the capacitive sensor assembly 1204 may support a high degree of positional accuracy for virtually non-contact user inputs to support gesturing, mousing movements, and so on.


Additionally, integration of the pressure sensitive sensor nodes of the pressure sensitive sensor assembly 1202 supports pressure readings at each discrete location of capacitive touch events detected by the capacitive sensor assembly 1204. This may also be utilized to support a thin form factor of the input device 104 as a whole that is configured to detect position and pressure at multiple, discrete user inputs. In one or more implementations, conductors of the capacitive sensors and the conductors 704 of the pressure sensitive sensor nodes may be incorporated at the same layer of the input device 104, e.g., the sensor substrate 702 of FIG. 7. Other examples are also contemplated, one such example of which is described as follows and shown in a corresponding figure.



FIG. 14 depicts an example implementation 1400 in which the capacitive sensor assembly 1204 is configured as a layer and disposed proximal to a layer of a pressure sensitive sensor assembly 1202. In this example, the layers are disposed adjacent to each other and are incorporated as part of a track pad, although other configurations are also contemplated. For instance, the capacitive sensor assembly 1204 may be leveraged to detect location as before, whereas the pressure sensitive sensor assembly 1202 may be utilized to detect pressure, e.g., a “z” axis input in addition to the “x” and “y” location inputs of the capacitive sensor assembly 1204. In another example, this may be utilized to implement a plurality of keys as part of the track pad that are usable to initiate different inputs in addition to the mousing input supported by the track pad. A variety of other examples are also contemplated without departing from the spirit and scope thereof.
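
A minimal sketch of combining a capacitive (x, y) location with a z pressure reading from the nearest pressure sensitive sensor node follows; the five millimeter pitch echoes the earlier example, while the report format, rounding rule, and units are assumptions for illustration.

```python
# Minimal sketch: fuse an (x, y) location from the capacitive sensor assembly
# with a z (pressure) reading from the nearest pressure sensitive sensor node.
# The node pitch, coordinate convention, and report format are assumed.

NODE_PITCH_MM = 5.0   # nodes roughly five millimeters apart on center

def nearest_node(x_mm, y_mm):
    """Index of the pressure sensitive node closest to a capacitive location."""
    return (round(y_mm / NODE_PITCH_MM), round(x_mm / NODE_PITCH_MM))

def touch_report(x_mm, y_mm, pressure_frame):
    """Combine capacitive (x, y) with pressure (z) into a single input report."""
    row, col = nearest_node(x_mm, y_mm)
    z = pressure_frame.get((row, col), 0)
    return {"x_mm": x_mm, "y_mm": y_mm, "z_counts": z}

if __name__ == "__main__":
    frame = {(2, 3): 1500}                      # one pressed node, 12-bit counts
    print(touch_report(16.0, 11.0, frame))      # maps to node (2, 3)
```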


Example System and Device



FIG. 15 illustrates an example system generally at 1500 that includes an example computing device 1502 that is representative of one or more computing systems and/or devices that may implement the various techniques described herein. The computing device 1502 may, for example, be configured to assume a mobile configuration through use of a housing formed and sized to be grasped and carried by one or more hands of a user, illustrated examples of which include a mobile phone, mobile game and music device, and tablet computer, although other examples are also contemplated. The input device 1514 may also be configured to incorporate a pressure sensitive sensor assembly 1202 and a capacitive sensor assembly 1204 as previously described.


The example computing device 1502 as illustrated includes a processing system 1504, one or more computer-readable media 1506, and one or more I/O interfaces 1508 that are communicatively coupled, one to another. Although not shown, the computing device 1502 may further include a system bus or other data and command transfer system that couples the various components, one to another. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures. A variety of other examples are also contemplated, such as control and data lines.


The processing system 1504 is representative of functionality to perform one or more operations using hardware. Accordingly, the processing system 1504 is illustrated as including hardware element 1510 that may be configured as processors, functional blocks, and so forth. This may include implementation in hardware as an application specific integrated circuit or other logic device formed using one or more semiconductors. The hardware elements 1510 are not limited by the materials from which they are formed or the processing mechanisms employed therein. For example, processors may be comprised of semiconductor(s) and/or transistors (e.g., electronic integrated circuits (ICs)). In such a context, processor-executable instructions may be electronically-executable instructions.


The computer-readable storage media 1506 is illustrated as including memory/storage 1512. The memory/storage 1512 represents memory/storage capacity associated with one or more computer-readable media. The memory/storage component 1512 may include volatile media (such as random access memory (RAM)) and/or nonvolatile media (such as read only memory (ROM), Flash memory, optical disks, magnetic disks, and so forth). The memory/storage component 1512 may include fixed media (e.g., RAM, ROM, a fixed hard drive, and so on) as well as removable media (e.g., Flash memory, a removable hard drive, an optical disc, and so forth). The computer-readable media 1506 may be configured in a variety of other ways as further described below.


Input/output interface(s) 1508 are representative of functionality to allow a user to enter commands and information to computing device 1502, and also allow information to be presented to the user and/or other components or devices using various input/output devices. Examples of input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone, a scanner, touch functionality (e.g., capacitive or other sensors that are configured to detect physical touch), a camera (e.g., which may employ visible or non-visible wavelengths such as infrared frequencies to recognize movement as gestures that do not involve touch), and so forth. Examples of output devices include a display device (e.g., a monitor or projector), speakers, a printer, a network card, tactile-response device, and so forth. Thus, the computing device 1502 may be configured in a variety of ways to support user interaction.


The computing device 1502 is further illustrated as being communicatively and physically coupled to an input device 1514 that is physically and communicatively removable from the computing device 1502. In this way, a variety of different input devices may be coupled to the computing device 1502 having a wide variety of configurations to support a wide variety of functionality. In this example, the input device 1514 includes one or more keys 1516, which may be configured as pressure sensitive sensor nodes, mechanically switched keys, and so forth.


The input device 1514 is further illustrated as including one or more modules 1518 that may be configured to support a variety of functionality. The one or more modules 1518, for instance, may be configured to process analog and/or digital signals received from the keys 1516 to determine whether a keystroke was intended, determine whether an input is indicative of resting pressure, support authentication of the input device 1514 for operation with the computing device 1502, and so on.
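
As a hedged illustration of the kind of processing such a module might perform, the following sketch classifies a short window of pressure samples as an intended keystroke, a resting finger, or neither; the thresholds and debounce window are assumed values, not figures from the patent.

```python
# Minimal sketch: decide whether a pressure signal from a node represents an
# intended keystroke or a finger merely resting on the key. The thresholds and
# debounce window are assumptions for illustration only.

REST_THRESHOLD = 150      # counts: light, sustained pressure treated as resting
PRESS_THRESHOLD = 600     # counts: firm pressure treated as an intended press
DEBOUNCE_SAMPLES = 3      # consecutive-style requirement: samples above press level

def classify(samples):
    """Classify a short window of 12-bit pressure samples from one node."""
    firm = sum(1 for s in samples if s >= PRESS_THRESHOLD)
    if firm >= DEBOUNCE_SAMPLES:
        return "keystroke"
    if samples and all(REST_THRESHOLD <= s < PRESS_THRESHOLD for s in samples):
        return "resting"
    return "none"

if __name__ == "__main__":
    print(classify([700, 750, 720, 300]))   # keystroke
    print(classify([180, 200, 190, 210]))   # resting
    print(classify([0, 40, 10, 0]))         # none
```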


Various techniques may be described herein in the general context of software, hardware elements, or program modules. Generally, such modules include routines, programs, objects, elements, components, data structures, and so forth that perform particular tasks or implement particular abstract data types. The terms “module,” “functionality,” and “component” as used herein generally represent software, firmware, hardware, or a combination thereof. The features of the techniques described herein are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.


An implementation of the described modules and techniques may be stored on or transmitted across some form of computer-readable media. The computer-readable media may include a variety of media that may be accessed by the computing device 1502. By way of example, and not limitation, computer-readable media may include “computer-readable storage media” and “computer-readable signal media.”


“Computer-readable storage media” may refer to media and/or devices that enable persistent and/or non-transitory storage of information in contrast to mere signal transmission, carrier waves, or signals per se. Thus, computer-readable storage media refers to non-signal bearing media. The computer-readable storage media includes hardware such as volatile and non-volatile, removable and non-removable media and/or storage devices implemented in a method or technology suitable for storage of information such as computer readable instructions, data structures, program modules, logic elements/circuits, or other data. Examples of computer-readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, hard disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other storage device, tangible media, or article of manufacture suitable to store the desired information and which may be accessed by a computer.


“Computer-readable signal media” may refer to a signal-bearing medium that is configured to transmit instructions to the hardware of the computing device 1502, such as via a network. Signal media typically may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as carrier waves, data signals, or other transport mechanism. Signal media also include any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media.


As previously described, hardware elements 1510 and computer-readable media 1506 are representative of modules, programmable device logic and/or fixed device logic implemented in a hardware form that may be employed in some embodiments to implement at least some aspects of the techniques described herein, such as to perform one or more instructions. Hardware may include components of an integrated circuit or on-chip system, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a complex programmable logic device (CPLD), and other implementations in silicon or other hardware. In this context, hardware may operate as a processing device that performs program tasks defined by instructions and/or logic embodied by the hardware, as well as hardware utilized to store instructions for execution, e.g., the computer-readable storage media described previously.


Combinations of the foregoing may also be employed to implement various techniques described herein. Accordingly, software, hardware, or executable modules may be implemented as one or more instructions and/or logic embodied on some form of computer-readable storage media and/or by one or more hardware elements 1510. The computing device 1502 may be configured to implement particular instructions and/or functions corresponding to the software and/or hardware modules. Accordingly, implementation of a module that is executable by the computing device 1502 as software may be achieved at least partially in hardware, e.g., through use of computer-readable storage media and/or hardware elements 1510 of the processing system 1504. The instructions and/or functions may be executable/operable by one or more articles of manufacture (for example, one or more computing devices 1502 and/or processing systems 1504) to implement techniques, modules, and examples described herein.


CONCLUSION

Although the example implementations have been described in language specific to structural features and/or methodological acts, it is to be understood that the implementations defined in the appended claims are not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claimed features.

Claims
  • 1. An input device comprising: a sensor substrate having pressure sensor nodes and capacitive sensor nodes configured to detect proximity of an object that contacts the input device, the pressure sensor nodes interspersed with the capacitive sensor nodes, the pressure sensor nodes configured to conserve power in a non-operational state prior to being placed in an operational state responsive to the capacitive sensor nodes detecting the proximity of the object; and a flexible contact layer spaced apart from the sensor substrate, the flexible contact layer configured to flex and contact the sensor substrate responsive to pressure applied by the object to initiate an input to a computing device.
  • 2. The input device as recited in claim 1, wherein the flexible contact layer includes a force concentrator pad configured to cause pressure to be channeled through the force concentrator pad to cause the flexible contact layer to contact the sensor substrate to initiate the input.
  • 3. The input device as recited in claim 1, wherein the capacitive sensor nodes are configured to detect the proximity of the object before the object contacts the input device to initiate the input.
  • 4. The input device as recited in claim 1, wherein the input to the computing device is initiated based on at least one of a capacitive sensor node detecting the proximity of the object, the pressure applied by the object on the sensor substrate, or an input force applied to the flexible contact layer.
  • 5. The input device as recited in claim 1, further comprising at least two conductors that include one of the capacitive sensor nodes and one of the pressure sensor nodes, wherein the one pressure sensor node is configured to detect a force of the object that contacts the input device to initiate the input to the computing device.
  • 6. The input device as recited in claim 5, wherein the one pressure sensor node of the at least two conductors is placed in the operational state in response to the one capacitive sensor node detecting the proximity of the object.
  • 7. The input device as recited in claim 6, wherein the sensor substrate includes the at least two conductors formed as inter-digitated trace fingers that form the input based on the pressure applied by the object.
  • 8. The input device as recited in claim 1, wherein the flexible contact layer includes a force concentrator pad configured to transfer pressure causing the flexible contact layer to contact the sensor substrate with a uniformly distributed contact pressure.
  • 9. The input device as recited in claim 8, wherein the pressure is applied to the force concentrator pad through an off-center key strike and the force concentrator pad is configured to channel the pressure to initiate the input.
  • 10. An input device comprising: a sensor assembly including a plurality of sensor nodes formed in an array, formation of the plurality of sensor nodes including: a sensor substrate having conductors that include pressure sensors and capacitive sensors formed in the array of the pressure sensors interspersed with the capacitive sensors, the capacitive sensors configured to detect proximity of an object that contacts the input device while the pressure sensors conserve power in a non-operational state; and a flexible contact layer spaced apart from the sensor substrate, the flexible contact layer configured to flex and contact the sensor substrate to initiate an input to a computing device.
  • 11. The input device as recited in claim 10, wherein the sensor nodes include force concentrator pads each configured to cause pressure to be channeled through the force concentrator pad to cause the flexible contact layer to contact the sensor substrate to initiate the input.
  • 12. The input device as recited in claim 10, wherein the capacitive sensors are configured to detect the proximity of the object before the object contacts the input device to initiate the input.
  • 13. The input device as recited in claim 10, wherein the input to the computing device is initiated based on at least one of the capacitive sensors detecting the proximity of the object, a pressure applied by the object on the sensor substrate, or an input force applied to the flexible contact layer.
  • 14. The input device as recited in claim 13, wherein the flexible contact layer includes force concentrator pads configured to transfer pressure causing the flexible contact layer to contact the sensor substrate with a uniformly distributed contact pressure.
  • 15. The input device as recited in claim 10, wherein the pressure sensors are placed in an operational state in response to the capacitive sensors detecting the proximity of the object.
  • 16. The input device as recited in claim 10, wherein each of the conductors is a combination of a pressure sensor and a capacitive sensor.
  • 17. A method comprising: detecting proximity of an object with capacitive sensor nodes of a sensor substrate of an input device, the sensor substrate including pressure sensor nodes interspersed with the capacitive sensor nodes; contacting the sensor substrate with a flexible contact layer of the input device, the flexible contact layer spaced apart from the sensor substrate and configured to flex and contact the sensor substrate responsive to pressure applied by the object to initiate an input to a computing device; and conserving power with the pressure sensor nodes of the sensor substrate in a non-operational state prior to the capacitive sensor nodes said detecting the proximity of the object.
  • 18. The method as recited in claim 17, further comprising channeling pressure through a force concentrator pad of the flexible contact layer, the channeling the pressure causing the flexible contact layer said contacting the sensor substrate initiating the input.
  • 19. The method as recited in claim 17, further comprising placing the pressure sensor nodes in an operational state responsive to said detecting the proximity of the object with the capacitive sensor nodes.
  • 20. The method as recited in claim 17, wherein said detecting the proximity of the object comprises detecting the proximity before the object contacts the input device to initiate the input.
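
By way of illustration only, the proximity-gated power management recited in the claims above, in which the pressure sensor nodes conserve power in a non-operational state until the capacitive sensor nodes detect proximity of an object, might be modeled as in the following minimal sketch; the function, thresholds, and sample values are hypothetical and form no part of the claimed subject matter.

```python
# Minimal sketch: pressure sensing stays in a power-conserving, non-operational state until
# the capacitive sensor nodes detect proximity of an object, after which a sufficient force
# on the flexible contact layer initiates an input. Thresholds and samples are illustrative.
def proximity_gated_scan(cap_samples, force_samples,
                         proximity_threshold=0.3, force_threshold=0.4):
    """Yield (pressure_nodes_operational, input_initiated) for each scan cycle."""
    for cap, force in zip(cap_samples, force_samples):
        operational = cap >= proximity_threshold        # wake pressure nodes on detected proximity
        initiated = operational and force >= force_threshold
        yield operational, initiated

# Example: an object approaches on the second cycle, presses on the third, and withdraws.
cap_samples = (0.0, 0.5, 0.6, 0.6, 0.0)
force_samples = (0.0, 0.1, 0.7, 0.2, 0.0)
for cycle in proximity_gated_scan(cap_samples, force_samples):
    print(cycle)   # (False, False), (True, False), (True, True), (True, False), (False, False)
```
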
RELATED APPLICATIONS

This application is a continuation of and claims priority to U.S. patent application Ser. No. 14/033,508 filed Sep. 22, 2013 and titled “Input Device Sensor Configuration,” which claims priority as a continuation-in-part to U.S. patent application Ser. No. 13/974,749 filed Aug. 23, 2013 and titled “Input Device with Interchangeable Surface,” which claims priority as a continuation-in-part of U.S. patent application Ser. No. 13/655,065 filed Oct. 18, 2012, and titled “Media Processing Input Device,” which claims priority to U.S. Provisional Patent Application No. 61/659,364 filed Jun. 13, 2012, and titled “Music Blade,” the disclosures of each of which are incorporated by reference herein in their entirety.

US Referenced Citations (610)
Number Name Date Kind
578325 Fleming Mar 1897 A
4046975 Seeger, Jr. Sep 1977 A
4065649 Carter et al. Dec 1977 A
4243861 Strandwitz Jan 1981 A
4279021 See et al. Jul 1981 A
4302648 Sado et al. Nov 1981 A
4317013 Larson Feb 1982 A
4326193 Markley et al. Apr 1982 A
4365130 Christensen Dec 1982 A
4492829 Rodrique Jan 1985 A
4527021 Morikawa et al. Jul 1985 A
4559426 Van Zeeland et al. Dec 1985 A
4577822 Wilkerson Mar 1986 A
4588187 Dell May 1986 A
4607147 Ono et al. Aug 1986 A
4651133 Ganesan et al. Mar 1987 A
4735394 Facco Apr 1988 A
4890832 Komaki Jan 1990 A
5149923 Demeo Sep 1992 A
5220521 Kikinis Jun 1993 A
5283559 Kalendra et al. Feb 1994 A
5331443 Stanisci Jul 1994 A
5480118 Cross Jan 1996 A
5489900 Cali et al. Feb 1996 A
5510783 Findlater et al. Apr 1996 A
5546271 Gut et al. Aug 1996 A
5548477 Kumar et al. Aug 1996 A
5558577 Kato Sep 1996 A
5576981 Parker et al. Nov 1996 A
5618232 Martin Apr 1997 A
5681220 Bertram et al. Oct 1997 A
5745376 Barker et al. Apr 1998 A
5748114 Koehn May 1998 A
5781406 Hunte Jul 1998 A
5807175 Davis et al. Sep 1998 A
5818361 Acevedo Oct 1998 A
5828770 Leis et al. Oct 1998 A
5842027 Oprescu et al. Nov 1998 A
5859642 Jones Jan 1999 A
5874697 Selker et al. Feb 1999 A
5909211 Combs et al. Jun 1999 A
5926170 Oba Jul 1999 A
5942733 Allen et al. Aug 1999 A
5971635 Wise Oct 1999 A
6002389 Kasser Dec 1999 A
6005209 Burleson et al. Dec 1999 A
6012714 Worley et al. Jan 2000 A
6040823 Seffernick et al. Mar 2000 A
6044717 Biegelsen et al. Apr 2000 A
6061644 Leis May 2000 A
6112797 Colson et al. Sep 2000 A
6147859 Abboud Nov 2000 A
6177926 Kunert Jan 2001 B1
6178443 Lin Jan 2001 B1
6239786 Burry et al. May 2001 B1
6254105 Rinde et al. Jul 2001 B1
6279060 Luke et al. Aug 2001 B1
6329617 Burgess Dec 2001 B1
6344791 Armstrong Feb 2002 B1
6380497 Hashimoto et al. Apr 2002 B1
6429846 Rosenberg et al. Aug 2002 B2
6437682 Vance Aug 2002 B1
6506983 Babb et al. Jan 2003 B1
6511378 Bhatt et al. Jan 2003 B1
6532147 Christ, Jr. Mar 2003 B1
6543949 Ritchey et al. Apr 2003 B1
6565439 Shinohara et al. May 2003 B2
6597347 Yasutake Jul 2003 B1
6600121 Olodort et al. Jul 2003 B1
6603408 Gaba Aug 2003 B1
6617536 Kawaguchi Sep 2003 B2
6651943 Cho et al. Nov 2003 B2
6685369 Lien Feb 2004 B2
6695273 Iguchi Feb 2004 B2
6704864 Philyaw Mar 2004 B1
6721019 Kono et al. Apr 2004 B2
6725318 Sherman et al. Apr 2004 B1
6738049 Kiser et al. May 2004 B2
6758615 Monney et al. Jul 2004 B2
6774888 Genduso Aug 2004 B1
6776546 Kraus et al. Aug 2004 B2
6781819 Yang et al. Aug 2004 B2
6784869 Clark et al. Aug 2004 B1
6813143 Makela Nov 2004 B2
6819316 Schulz et al. Nov 2004 B2
6856506 Doherty et al. Feb 2005 B2
6861961 Sandbach et al. Mar 2005 B2
6864573 Robertson et al. Mar 2005 B2
6898315 Guha May 2005 B2
6914197 Doherty et al. Jul 2005 B2
6950950 Sawyers et al. Sep 2005 B2
6970957 Oshins et al. Nov 2005 B1
6976799 Kim et al. Dec 2005 B2
6977352 Oosawa Dec 2005 B2
7051149 Wang et al. May 2006 B2
7083295 Hanna Aug 2006 B1
7091436 Serban Aug 2006 B2
7091955 Kramer Aug 2006 B2
7095404 Vincent et al. Aug 2006 B2
7106222 Ward et al. Sep 2006 B2
7116309 Kimura et al. Oct 2006 B1
7123292 Seeger et al. Oct 2006 B1
7194662 Do et al. Mar 2007 B2
7202837 Ihara Apr 2007 B2
7213991 Chapman et al. May 2007 B2
7224830 Nefian et al. May 2007 B2
7245292 Custy Jul 2007 B1
7277087 Hill et al. Oct 2007 B2
7301759 Hsiung Nov 2007 B2
7374312 Feng et al. May 2008 B2
7401992 Lin Jul 2008 B1
7423557 Kang Sep 2008 B2
7446276 Piesko Nov 2008 B2
7447934 Dasari et al. Nov 2008 B2
7469386 Bear et al. Dec 2008 B2
7486165 Ligtenberg et al. Feb 2009 B2
7499037 Lube Mar 2009 B2
7502803 Culter et al. Mar 2009 B2
7542052 Solomon et al. Jun 2009 B2
7557312 Clark et al. Jul 2009 B2
7558594 Wilson Jul 2009 B2
7559834 York Jul 2009 B1
RE40891 Yasutake Sep 2009 E
7602384 Rosenberg et al. Oct 2009 B2
7620244 Collier Nov 2009 B1
7622907 Vranish Nov 2009 B2
7636921 Louie Dec 2009 B2
7639876 Clary et al. Dec 2009 B2
7656392 Bolender Feb 2010 B2
7686694 Cole Mar 2010 B2
7728820 Rosenberg et al. Jun 2010 B2
7728923 Kim et al. Jun 2010 B2
7731147 Rha Jun 2010 B2
7733326 Adiseshan Jun 2010 B1
7736042 Park et al. Jun 2010 B2
7773076 Pittel et al. Aug 2010 B2
7773121 Huntsberger et al. Aug 2010 B1
7774155 Sato et al. Aug 2010 B2
7777972 Chen et al. Aug 2010 B1
7782342 Koh Aug 2010 B2
7813715 McKillop et al. Oct 2010 B2
7815358 Inditsky Oct 2010 B2
7817428 Greer, Jr. et al. Oct 2010 B2
7865639 McCoy et al. Jan 2011 B2
7884807 Hovden et al. Feb 2011 B2
7907394 Richardson et al. Mar 2011 B2
D636397 Green Apr 2011 S
7928964 Kolmykov-Zotov et al. Apr 2011 B2
7936501 Smith et al. May 2011 B2
7945717 Rivalsi May 2011 B2
7952566 Poupyrev et al. May 2011 B2
7970246 Travis et al. Jun 2011 B2
7973771 Geaghan Jul 2011 B2
7976393 Haga et al. Jul 2011 B2
7978281 Vergith et al. Jul 2011 B2
8016255 Lin Sep 2011 B2
8018386 Qi et al. Sep 2011 B2
8018579 Krah Sep 2011 B1
8022939 Hinata Sep 2011 B2
8026904 Westerman Sep 2011 B2
8053688 Conzola et al. Nov 2011 B2
8063886 Serban et al. Nov 2011 B2
8065624 Morin et al. Nov 2011 B2
8069356 Rathi et al. Nov 2011 B2
8077160 Land et al. Dec 2011 B2
8090885 Callaghan et al. Jan 2012 B2
8094134 Suzuki et al. Jan 2012 B2
8098233 Hotelling et al. Jan 2012 B2
8115499 Osoinach et al. Feb 2012 B2
8117362 Rodriguez et al. Feb 2012 B2
8118274 McClure et al. Feb 2012 B2
8118681 Mattice et al. Feb 2012 B2
8130203 Westerman Mar 2012 B2
8154524 Wilson et al. Apr 2012 B2
8162282 Hu et al. Apr 2012 B2
D659139 Gengler May 2012 S
8169421 Wright et al. May 2012 B2
8189973 Travis et al. May 2012 B2
8216074 Sakuma Jul 2012 B2
8229509 Paek et al. Jul 2012 B2
8229522 Kim et al. Jul 2012 B2
8232963 Orsley et al. Jul 2012 B2
8267368 Torii et al. Sep 2012 B2
8269093 Naik et al. Sep 2012 B2
8274784 Franz et al. Sep 2012 B2
8279589 Kim Oct 2012 B2
8279623 Idzik et al. Oct 2012 B2
8322290 Mignano Dec 2012 B1
8330061 Rothkopf et al. Dec 2012 B2
8330742 Reynolds et al. Dec 2012 B2
8378972 Pance et al. Feb 2013 B2
8403576 Merz Mar 2013 B2
8416559 Agata et al. Apr 2013 B2
8487751 Laitinen et al. Jul 2013 B2
8498100 Whitt, III et al. Jul 2013 B1
8607651 Eventoff Dec 2013 B2
8633916 Bernstein et al. Jan 2014 B2
8638315 Algreatly Jan 2014 B2
8659555 Pihlaja Feb 2014 B2
8674961 Posamentier Mar 2014 B2
8757374 Kaiser Jun 2014 B1
8766925 Perlin et al. Jul 2014 B2
8831677 Villa-Real Sep 2014 B2
8836664 Colgate et al. Sep 2014 B2
8847895 Lim et al. Sep 2014 B2
8847897 Sakai et al. Sep 2014 B2
8854331 Heubel et al. Oct 2014 B2
8928581 Braun et al. Jan 2015 B2
8970525 D Los Reyes Mar 2015 B1
9047012 Bringert et al. Jun 2015 B1
9411436 Shaw et al. Aug 2016 B2
9448631 Winter et al. Sep 2016 B2
9459160 Shaw et al. Oct 2016 B2
9684382 Shaw et al. Jun 2017 B2
20010035859 Kiser Nov 2001 A1
20020000977 Vranish Jan 2002 A1
20020126445 Minaguchi et al. Sep 2002 A1
20020134828 Sandbach et al. Sep 2002 A1
20020154099 Oh Oct 2002 A1
20020188721 Lemel et al. Dec 2002 A1
20030016282 Koizumi Jan 2003 A1
20030044215 Monney et al. Mar 2003 A1
20030083131 Armstrong May 2003 A1
20030132916 Kramer Jul 2003 A1
20030163611 Nagao Aug 2003 A1
20030197687 Shetter Oct 2003 A1
20030201982 Iesaka Oct 2003 A1
20040005184 Kim et al. Jan 2004 A1
20040100457 Mandle May 2004 A1
20040140998 Gravina et al. Jul 2004 A1
20040174670 Huang et al. Sep 2004 A1
20040190239 Weng et al. Sep 2004 A1
20040212598 Kraus et al. Oct 2004 A1
20040258924 Berger et al. Dec 2004 A1
20040268000 Barker et al. Dec 2004 A1
20050030728 Kawashima et al. Feb 2005 A1
20050057515 Bathiche Mar 2005 A1
20050057521 Aull et al. Mar 2005 A1
20050059441 Miyashita Mar 2005 A1
20050059489 Kim Mar 2005 A1
20050146512 Hill et al. Jul 2005 A1
20050190159 Skarine Sep 2005 A1
20050240949 Liu et al. Oct 2005 A1
20050264653 Starkweather et al. Dec 2005 A1
20050264988 Nicolosi Dec 2005 A1
20050285703 Wheeler et al. Dec 2005 A1
20060028095 Maruyama et al. Feb 2006 A1
20060049993 Lin et al. Mar 2006 A1
20060082973 Egbert et al. Apr 2006 A1
20060085658 Allen et al. Apr 2006 A1
20060102914 Smits et al. May 2006 A1
20060103633 Gioeli May 2006 A1
20060125799 Hillis et al. Jun 2006 A1
20060132423 Travis Jun 2006 A1
20060154725 Glaser et al. Jul 2006 A1
20060156415 Rubinstein et al. Jul 2006 A1
20060158433 Serban et al. Jul 2006 A1
20060181514 Newman Aug 2006 A1
20060181521 Perreault et al. Aug 2006 A1
20060187216 Trent, Jr. et al. Aug 2006 A1
20060195522 Miyazaki Aug 2006 A1
20060197753 Hotelling Sep 2006 A1
20060197755 Bawany Sep 2006 A1
20060209050 Serban Sep 2006 A1
20060238510 Panotopoulos et al. Oct 2006 A1
20060248597 Keneman Nov 2006 A1
20070043725 Hotelling et al. Feb 2007 A1
20070047221 Park Mar 2007 A1
20070051792 Wheeler et al. Mar 2007 A1
20070056385 Lorenz Mar 2007 A1
20070057922 Schultz et al. Mar 2007 A1
20070062089 Homer et al. Mar 2007 A1
20070069153 Pai-Paranjape et al. Mar 2007 A1
20070072474 Beasley et al. Mar 2007 A1
20070145945 McGinley et al. Jun 2007 A1
20070152983 McKillop et al. Jul 2007 A1
20070182663 Biech Aug 2007 A1
20070182722 Hotelling et al. Aug 2007 A1
20070200830 Yamamoto Aug 2007 A1
20070205995 Woolley Sep 2007 A1
20070220708 Lewis Sep 2007 A1
20070234420 Novotney et al. Oct 2007 A1
20070236408 Yamaguchi et al. Oct 2007 A1
20070236475 Wherry Oct 2007 A1
20070247338 Marchetto Oct 2007 A1
20070247432 Oakley Oct 2007 A1
20070257821 Son et al. Nov 2007 A1
20070260892 Paul et al. Nov 2007 A1
20070274094 Schultz et al. Nov 2007 A1
20070274095 Destain Nov 2007 A1
20070283179 Burnett et al. Dec 2007 A1
20080005423 Jacobs et al. Jan 2008 A1
20080013809 Zhu et al. Jan 2008 A1
20080018608 Serban et al. Jan 2008 A1
20080018611 Serban et al. Jan 2008 A1
20080024459 Poupyrev et al. Jan 2008 A1
20080042994 Gillespie et al. Feb 2008 A1
20080094367 Van De Ven et al. Apr 2008 A1
20080104437 Lee May 2008 A1
20080151478 Chern Jun 2008 A1
20080158185 Westerman Jul 2008 A1
20080167832 Soss Jul 2008 A1
20080180411 Solomon et al. Jul 2008 A1
20080202251 Serban et al. Aug 2008 A1
20080202824 Philipp et al. Aug 2008 A1
20080219025 Spitzer et al. Sep 2008 A1
20080228969 Cheah et al. Sep 2008 A1
20080232061 Wang et al. Sep 2008 A1
20080238884 Harish Oct 2008 A1
20080253822 Matias Oct 2008 A1
20080297878 Brown et al. Dec 2008 A1
20080303646 Elwell et al. Dec 2008 A1
20080309636 Feng et al. Dec 2008 A1
20080316002 Brunet et al. Dec 2008 A1
20080320190 Lydon et al. Dec 2008 A1
20090002218 Rigazio et al. Jan 2009 A1
20090007001 Morin et al. Jan 2009 A1
20090009476 Daley, III Jan 2009 A1
20090046416 Daley, III Feb 2009 A1
20090049979 Naik et al. Feb 2009 A1
20090065267 Sato Mar 2009 A1
20090073060 Shimasaki et al. Mar 2009 A1
20090073957 Newland et al. Mar 2009 A1
20090079639 Hotta et al. Mar 2009 A1
20090083562 Park et al. Mar 2009 A1
20090090568 Min Apr 2009 A1
20090117955 Lo May 2009 A1
20090127005 Zachut et al. May 2009 A1
20090128374 Reynolds May 2009 A1
20090135142 Fu et al. May 2009 A1
20090140985 Liu Jun 2009 A1
20090163147 Steigerwald et al. Jun 2009 A1
20090182901 Callaghan et al. Jul 2009 A1
20090195497 Fitzgerald et al. Aug 2009 A1
20090200148 Honmatsu et al. Aug 2009 A1
20090219250 Ure Sep 2009 A1
20090231019 Yeh Sep 2009 A1
20090231275 Odgers Sep 2009 A1
20090250267 Heubel et al. Oct 2009 A1
20090251008 Sugaya Oct 2009 A1
20090259865 Sheynblat et al. Oct 2009 A1
20090262492 Whitchurch et al. Oct 2009 A1
20090265670 Kim et al. Oct 2009 A1
20090267892 Faubert Oct 2009 A1
20090284397 Lee et al. Nov 2009 A1
20090303137 Kusaka et al. Dec 2009 A1
20090303204 Nasiri et al. Dec 2009 A1
20090320244 Lin Dec 2009 A1
20090321490 Groene et al. Dec 2009 A1
20100001963 Doray et al. Jan 2010 A1
20100013319 Kamiyama et al. Jan 2010 A1
20100026656 Hotelling et al. Feb 2010 A1
20100038821 Jenkins et al. Feb 2010 A1
20100039764 Locker et al. Feb 2010 A1
20100045609 Do et al. Feb 2010 A1
20100045633 Gettemy Feb 2010 A1
20100051356 Stern et al. Mar 2010 A1
20100051432 Lin et al. Mar 2010 A1
20100053087 Dai et al. Mar 2010 A1
20100053534 Hsieh et al. Mar 2010 A1
20100075517 Ni et al. Mar 2010 A1
20100077237 Sawyers Mar 2010 A1
20100079398 Shen et al. Apr 2010 A1
20100081377 Chatterjee et al. Apr 2010 A1
20100085321 Pundsack Apr 2010 A1
20100097198 Suzuki Apr 2010 A1
20100102182 Lin Apr 2010 A1
20100103112 Yoo et al. Apr 2010 A1
20100103131 Segal et al. Apr 2010 A1
20100123686 Klinghult et al. May 2010 A1
20100133398 Chiu et al. Jun 2010 A1
20100137033 Lee Jun 2010 A1
20100141588 Kimura et al. Jun 2010 A1
20100142130 Wang et al. Jun 2010 A1
20100149111 Olien Jun 2010 A1
20100149134 Westerman et al. Jun 2010 A1
20100156798 Archer Jun 2010 A1
20100161522 Tirpak et al. Jun 2010 A1
20100162109 Chatterjee et al. Jun 2010 A1
20100162179 Porat Jun 2010 A1
20100164857 Liu et al. Jul 2010 A1
20100171708 Chuang Jul 2010 A1
20100171891 Kaji et al. Jul 2010 A1
20100174421 Tsai et al. Jul 2010 A1
20100180063 Ananny et al. Jul 2010 A1
20100182263 Aunio et al. Jul 2010 A1
20100188299 Rinehart et al. Jul 2010 A1
20100188338 Longe Jul 2010 A1
20100206614 Park et al. Aug 2010 A1
20100206644 Yeh Aug 2010 A1
20100214257 Wussler et al. Aug 2010 A1
20100222110 Kim et al. Sep 2010 A1
20100231498 Large et al. Sep 2010 A1
20100231510 Sampsell et al. Sep 2010 A1
20100231556 Mines et al. Sep 2010 A1
20100238075 Pourseyed Sep 2010 A1
20100238119 Dubrovsky et al. Sep 2010 A1
20100238138 Goertz et al. Sep 2010 A1
20100245221 Khan Sep 2010 A1
20100250988 Okuda et al. Sep 2010 A1
20100274932 Kose Oct 2010 A1
20100279768 Huang et al. Nov 2010 A1
20100289457 Onnerud et al. Nov 2010 A1
20100289508 Joguet et al. Nov 2010 A1
20100295812 Burns et al. Nov 2010 A1
20100302378 Marks et al. Dec 2010 A1
20100304793 Kim et al. Dec 2010 A1
20100306538 Thomas et al. Dec 2010 A1
20100308778 Yamazaki et al. Dec 2010 A1
20100308844 Day et al. Dec 2010 A1
20100315348 Jellicoe et al. Dec 2010 A1
20100315373 Steinhauser et al. Dec 2010 A1
20100321299 Shelley et al. Dec 2010 A1
20100321301 Casparian et al. Dec 2010 A1
20100321330 Lim et al. Dec 2010 A1
20100321339 Kimmel Dec 2010 A1
20100325155 Skinner et al. Dec 2010 A1
20100331059 Apgar et al. Dec 2010 A1
20110007008 Algreatly Jan 2011 A1
20110012873 Prest et al. Jan 2011 A1
20110018556 Le et al. Jan 2011 A1
20110019123 Prest et al. Jan 2011 A1
20110031287 Le Gette et al. Feb 2011 A1
20110036965 Zhang et al. Feb 2011 A1
20110037379 Lecamp et al. Feb 2011 A1
20110037721 Cranfill et al. Feb 2011 A1
20110043454 Modarres et al. Feb 2011 A1
20110043990 Mickey et al. Feb 2011 A1
20110050587 Natanzon et al. Mar 2011 A1
20110050630 Ikeda Mar 2011 A1
20110055407 Lydon et al. Mar 2011 A1
20110057899 Sleeman et al. Mar 2011 A1
20110060926 Brooks et al. Mar 2011 A1
20110069148 Jones et al. Mar 2011 A1
20110074688 Hull et al. Mar 2011 A1
20110080347 Steeves et al. Apr 2011 A1
20110084909 Hsieh et al. Apr 2011 A1
20110095994 Birnbaum Apr 2011 A1
20110096513 Kim Apr 2011 A1
20110102326 Casparian et al. May 2011 A1
20110102356 Kemppinen et al. May 2011 A1
20110115712 Han et al. May 2011 A1
20110115747 Powell et al. May 2011 A1
20110118025 Lukas et al. May 2011 A1
20110128227 Theimer Jun 2011 A1
20110134032 Chiu et al. Jun 2011 A1
20110134112 Koh et al. Jun 2011 A1
20110141052 Bernstein Jun 2011 A1
20110147398 Ahee et al. Jun 2011 A1
20110148793 Ciesla et al. Jun 2011 A1
20110157087 Kanehira et al. Jun 2011 A1
20110163955 Nasiri et al. Jul 2011 A1
20110164370 McClure et al. Jul 2011 A1
20110167181 Minoo et al. Jul 2011 A1
20110167287 Walsh et al. Jul 2011 A1
20110167391 Momeyer et al. Jul 2011 A1
20110167992 Eventoff et al. Jul 2011 A1
20110179864 Raasch et al. Jul 2011 A1
20110184646 Wong et al. Jul 2011 A1
20110193787 Morishige et al. Aug 2011 A1
20110193938 Oderwald et al. Aug 2011 A1
20110202878 Park et al. Aug 2011 A1
20110205372 Miramontes Aug 2011 A1
20110216266 Travis Sep 2011 A1
20110227913 Hyndman Sep 2011 A1
20110231682 Kakish et al. Sep 2011 A1
20110234502 Yun et al. Sep 2011 A1
20110241999 Thier Oct 2011 A1
20110242138 Tribble Oct 2011 A1
20110248152 Svajda et al. Oct 2011 A1
20110248920 Larsen Oct 2011 A1
20110248930 Kwok et al. Oct 2011 A1
20110248941 Abdo Oct 2011 A1
20110261001 Liu Oct 2011 A1
20110261021 Modarres et al. Oct 2011 A1
20110261083 Wilson Oct 2011 A1
20110267294 Kildal Nov 2011 A1
20110267300 Serban et al. Nov 2011 A1
20110267757 Probst Nov 2011 A1
20110290686 Huang Dec 2011 A1
20110295697 Boston et al. Dec 2011 A1
20110297566 Gallagher et al. Dec 2011 A1
20110304577 Brown et al. Dec 2011 A1
20110304962 Su Dec 2011 A1
20110306424 Kazama et al. Dec 2011 A1
20110316807 Corrion Dec 2011 A1
20120007821 Zaliva Jan 2012 A1
20120011462 Westerman et al. Jan 2012 A1
20120013519 Hakansson et al. Jan 2012 A1
20120023459 Westerman Jan 2012 A1
20120024682 Huang et al. Feb 2012 A1
20120026048 Vazquez et al. Feb 2012 A1
20120044179 Hudson Feb 2012 A1
20120047368 Chinn et al. Feb 2012 A1
20120050975 Garelli et al. Mar 2012 A1
20120055770 Chen Mar 2012 A1
20120062245 Bao et al. Mar 2012 A1
20120068933 Larsen Mar 2012 A1
20120068957 Puskarich et al. Mar 2012 A1
20120072167 Cretella, Jr. et al. Mar 2012 A1
20120075249 Hoch Mar 2012 A1
20120081316 Sirpal et al. Apr 2012 A1
20120087078 Medica et al. Apr 2012 A1
20120092279 Martin Apr 2012 A1
20120092350 Ganapathi et al. Apr 2012 A1
20120094257 Pillischer et al. Apr 2012 A1
20120098751 Lin Apr 2012 A1
20120099263 Lin Apr 2012 A1
20120099749 Rubin et al. Apr 2012 A1
20120106082 Wu et al. May 2012 A1
20120113579 Agata et al. May 2012 A1
20120115553 Mahe et al. May 2012 A1
20120117409 Lee et al. May 2012 A1
20120127118 Nolting et al. May 2012 A1
20120140396 Zeliff et al. Jun 2012 A1
20120145525 Ishikawa Jun 2012 A1
20120155015 Govindasamy et al. Jun 2012 A1
20120162693 Ito Jun 2012 A1
20120175487 Goto Jul 2012 A1
20120182242 Lindahl et al. Jul 2012 A1
20120194393 Utterman et al. Aug 2012 A1
20120194448 Rothkopf Aug 2012 A1
20120200532 Powell et al. Aug 2012 A1
20120200802 Large Aug 2012 A1
20120206401 Lin et al. Aug 2012 A1
20120206937 Travis et al. Aug 2012 A1
20120223866 Ayala Vazquez et al. Sep 2012 A1
20120224073 Miyahara Sep 2012 A1
20120235635 Sato Sep 2012 A1
20120235921 Laubach Sep 2012 A1
20120235942 Shahoian et al. Sep 2012 A1
20120242588 Myers et al. Sep 2012 A1
20120246377 Bhesania Sep 2012 A1
20120249459 Sashida et al. Oct 2012 A1
20120256959 Ye et al. Oct 2012 A1
20120268911 Lin Oct 2012 A1
20120274811 Bakin Nov 2012 A1
20120287562 Wu et al. Nov 2012 A1
20120299866 Pao et al. Nov 2012 A1
20120300275 Vilardell et al. Nov 2012 A1
20120304199 Homma et al. Nov 2012 A1
20120312955 Randolph Dec 2012 A1
20120328349 Isaac et al. Dec 2012 A1
20130009892 Salmela et al. Jan 2013 A1
20130044059 Fu Feb 2013 A1
20130047747 Joung, II Feb 2013 A1
20130063364 Moore Mar 2013 A1
20130063873 Wodrich et al. Mar 2013 A1
20130076646 Krah et al. Mar 2013 A1
20130088431 Ballagas et al. Apr 2013 A1
20130088442 Lee Apr 2013 A1
20130094131 O'Donnell et al. Apr 2013 A1
20130097534 Lewin et al. Apr 2013 A1
20130106766 Yilmaz et al. May 2013 A1
20130107144 Marhefka et al. May 2013 A1
20130141370 Wang et al. Jun 2013 A1
20130167663 Eventoff Jul 2013 A1
20130194235 Zanone et al. Aug 2013 A1
20130201115 Heubel Aug 2013 A1
20130227836 Whitt, III et al. Sep 2013 A1
20130228433 Shaw Sep 2013 A1
20130229273 Nodar Cortizo et al. Sep 2013 A1
20130229356 Marwah et al. Sep 2013 A1
20130229386 Bathiche Sep 2013 A1
20130275058 Awad Oct 2013 A1
20130278542 Stephanou et al. Oct 2013 A1
20130278552 Kamin-Lyndgaard Oct 2013 A1
20130300683 Binbaum et al. Nov 2013 A1
20130304941 Drasnin Nov 2013 A1
20130304944 Young Nov 2013 A1
20130335330 Lane Dec 2013 A1
20130335902 Campbell Dec 2013 A1
20130335903 Raken Dec 2013 A1
20130342464 Bathiche et al. Dec 2013 A1
20130342465 Bathiche Dec 2013 A1
20130346636 Bathiche Dec 2013 A1
20140008203 Nathan et al. Jan 2014 A1
20140020484 Shaw et al. Jan 2014 A1
20140022177 Shaw Jan 2014 A1
20140028624 Marsden et al. Jan 2014 A1
20140055375 Kim et al. Feb 2014 A1
20140062933 Coulson et al. Mar 2014 A1
20140062934 Coulson et al. Mar 2014 A1
20140083207 Eventoff Mar 2014 A1
20140085247 Leung et al. Mar 2014 A1
20140098058 Baharav et al. Apr 2014 A1
20140139436 Ramstein et al. May 2014 A1
20140139472 Takenaka May 2014 A1
20140210742 Delattre et al. Jul 2014 A1
20140221098 Boulanger Aug 2014 A1
20140230575 Picciotto et al. Aug 2014 A1
20140232679 Whitman et al. Aug 2014 A1
20140253305 Rosenberg et al. Sep 2014 A1
20140306914 Kagayama Oct 2014 A1
20150084865 Shaw et al. Mar 2015 A1
20150084909 Worfolk et al. Mar 2015 A1
20150160778 Kim et al. Jun 2015 A1
20150185842 Picciotto et al. Jul 2015 A1
20150193034 Jeong et al. Jul 2015 A1
20150227207 Winter et al. Aug 2015 A1
20150242012 Petcavich et al. Aug 2015 A1
20150301642 Hanauer et al. Oct 2015 A1
20150370376 Harley et al. Dec 2015 A1
20160018894 Yliaho et al. Jan 2016 A1
20160070398 Worfolk Mar 2016 A1
20160195955 Picciotto et al. Jul 2016 A1
20160313832 Shaw et al. Oct 2016 A1
20160357296 Picciotto et al. Dec 2016 A1
20170102770 Winter et al. Apr 2017 A1
20170255276 Shaw et al. Sep 2017 A1
Foreign Referenced Citations (16)
Number Date Country
1223722 Jul 2002 EP
1591891 Nov 2005 EP
2353978 Aug 2011 EP
2381340 Oct 2011 EP
2584432 Apr 2013 EP
2178570 Feb 1987 GB
10326124 Dec 1998 JP
1173239 Mar 1999 JP
11345041 Dec 1999 JP
1020110087178 Aug 2011 KR
1038411 May 2012 NL
WO-2010011983 Jan 2010 WO
WO-2012036717 Mar 2012 WO
WO-2012173305 Dec 2012 WO
WO-2013169299 Nov 2013 WO
WO-2014098946 Jun 2014 WO
Non-Patent Literature Citations (227)
Entry
“Final Office Action”, U.S. Appl. No. 13/975,087, dated Nov. 4, 2016, 23 pages.
“Final Office Action”, U.S. Appl. No. 14/591,704, dated Nov. 25, 2016, 33 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/031699, dated Nov. 11, 2016, 23 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/974,749, dated Jan. 20, 2017, 23 pages.
“Second Written Opinion”, Application No. PCT/US2015/067754, dated Nov. 25, 2016, 8 pages.
“Accessing Device Sensors”, retrieved from <https://developer.palm.com/content/api/dev-guide/pdk/accessing-device-sensors.html> on May 25, 2012, 2011, 4 pages.
“ACPI Docking for Windows Operating Systems”, Retrieved from: <http://www.scritube.com/limba/engleza/software/ACPI-Docking-for-Windows-Opera331824193.php> on Jul. 6, 2012, 2012, 10 pages.
“Advanced Configuration and Power Management Specification”, Intel Corporation, Microsoft Corporation, Toshiba Corp. Revision 1, Dec. 22, 1996, 364 pages.
“Advisory Action”, U.S. Appl. No. 13/975,087, dated Nov. 16, 2015, 3 pages.
“Capacitive Touch Sensors—Application Fields, Technology Overview and Implementation Example”, Fujitsu Microelectronics Europe GmbH; retrieved from http://www.fujitsu.com/downloads/MICRO/fme/articles/fujitsu-whitepaper-capacitive-touch-sensors.pdf on Jul. 20, 2011, Jan. 12, 2010, 12 pages.
“Cholesteric Liquid Crystal”, Retrieved from: <http://en.wikipedia.org/wiki/Cholesteric_liquid_crystal> on Aug. 6, 2012, Jun. 10, 2012, 2 pages.
“Cirago Slim Case®—Protective case with built-in kickstand for your iPhone 5®”, Retrieved from <http://cirago.com/wordpress/wp-content/uploads/2012/10/ipc1500brochure1.pdf> on Jan. 29, 2013, Jan. 2013, 1 page.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/470,633, dated Apr. 9, 2013, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/470,633, dated Jul. 2, 2013, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/033,290, dated Jul. 13, 2016, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/033,508, dated Jun. 16, 2016, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/033,508, dated Sep. 9, 2016, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/698,318, dated Jun. 9, 2016, 2 pages.
“Developing Next-Generation Human Interfaces using Capacitive and Infrared Proximity Sensing”, Silicon Laboratories, Inc., Available at <http://www.silabs.com/pages/DownloadDoc.aspx?FILEURL=support%20documents/technicaldocs/capacitive%20and%20proximity%20sensing_wp.pdf&src=SearchResults>, Aug. 30, 2010, pp. 1-10.
“Directional Backlighting for Display Panels”, U.S. Appl. No. 13/021,448, dated Feb. 4, 2011, 38 pages.
“DR2PA”, retrieved from <http://www.architainment.co.uk/wp-content/uploads/2012/08/DR2PA-AU-US-size-Data-Sheet-Rev-H_LOGO.pdf> on Sep. 17, 2012, Jan. 2012, 4 pages.
“Ex Parte Quayle Action”, U.S. Appl. No. 13/599,763, Nov. 14, 2014, 6 pages.
“Examiner's Answer to Appeal Brief”, U.S. Appl. No. 13/974,994, dated May 18, 2016, 40 pages.
“Final Office Action”, U.S. Appl. No. 13/471,001, dated Jul. 25, 2013, 20 pages.
“Final Office Action”, U.S. Appl. No. 13/527,263, dated Jan. 27, 2015, 7 pages.
“Final Office Action”, U.S. Appl. No. 13/603,918, dated Mar. 21, 2014, 14 pages.
“Final Office Action”, U.S. Appl. No. 13/647,479, dated Dec. 12, 2014, 12 pages.
“Final Office Action”, U.S. Appl. No. 13/651,195, dated Apr. 18, 2013, 13 pages.
“Final Office Action”, U.S. Appl. No. 13/651,232, dated May 21, 2013, 21 pages.
“Final Office Action”, U.S. Appl. No. 13/651,287, dated May 3, 2013, 16 pages.
“Final Office Action”, U.S. Appl. No. 13/651,976, dated Jul. 25, 2013, 21 pages.
“Final Office Action”, U.S. Appl. No. 13/653,321, dated Aug. 2, 2013, 17 pages.
“Final Office Action”, U.S. Appl. No. 13/655,065, dated Apr. 2, 2015, 23 pages.
“Final Office Action”, U.S. Appl. No. 13/655,065, dated Aug. 8, 2014, 20 pages.
“Final Office Action”, U.S. Appl. No. 13/655,065, dated Nov. 17, 2015, 25 pages.
“Final Office Action”, U.S. Appl. No. 13/769,356, dated Apr. 10, 2015, 9 pages.
“Final Office Action”, U.S. Appl. No. 13/974,749, dated Mar. 23, 2016, 22 pages.
“Final Office Action”, U.S. Appl. No. 13/974,749, dated May 21, 2015, 19 pages.
“Final Office Action”, U.S. Appl. No. 13/974,749, dated Sep. 5, 2014, 18 pages.
“Final Office Action”, U.S. Appl. No. 13/974,994, dated Jun. 10, 2015, 28 pages.
“Final Office Action”, U.S. Appl. No. 13/974,994, dated Oct. 6, 2014, 26 pages.
“Final Office Action”, U.S. Appl. No. 13/975,087, dated Aug. 7, 2015, 16 pages.
“Final Office Action”, U.S. Appl. No. 13/975,087, dated Sep. 10, 2014, 19 pages.
“Final Office Action”, U.S. Appl. No. 14/033,510, dated Feb. 8, 2016, 27 pages.
“Final Office Action”, U.S. Appl. No. 14/033,510, dated Jun. 5, 2015, 24 pages.
“Final Office Action”, U.S. Appl. No. 14/033,510, dated Aug. 21, 2014, 18 pages.
“Final Office Action”, U.S. Appl. No. 14/033,510, dated Sep. 22, 2016, 21 pages.
“First One Handed Fabric Keyboard with Bluetooth Wireless Technology”, Retrieved from: <http://press.xtvworld.com/article3817.html> on May 8, 2012, Jan. 6, 2005, 2 pages.
“Force and Position Sensing Resistors: an Emerging Technology”, Interlink Electronics, Available at <http://staff.science.uva.nl/˜vlaander/docu/FSR/An_Exploring_Technology.pdf>, Feb. 1990, pp. 1-6.
“Frogpad Introduces Weareable Fabric Keyboard with Bluetooth Technology”, Retrieved from: <http://www.geekzone.co.nz/content.asp?contentid=3898> on May 7, 2012, Jan. 7, 2005, 3 pages.
“How to Use the iPad's Onscreen Keyboard”, Retrieved from <http://www.dummies.com/how-to/content/how-to-use-the-i pads-onscreen-keyboard.html> on Aug. 28, 2012, 2012, 3 pages.
“iControlPad 2—The open source controller”, Retrieved from <http://www.kickstarter.com/projects/1703567677/icontrolpad-2-the-open-source-controller> on Nov. 20, 2012, 2012, 15 pages.
“i-Interactor electronic pen”, Retrieved from: <http://www.alibaba.com/product-gs/331004878/i_Interactor_electronic_pen.html> on Jun. 19, 2012, 2012, 5 pages.
“Incipio LG G-Slate Premium Kickstand Case—Black Nylon”, Retrieved from: <http://www.amazon.com/Incipio-G-Slate-Premium-Kickstand-Case/dp/B004ZKP916> on May 8, 2012, 2012, 4 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2014/056185, dated Dec. 23, 2015, 7 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2014/068687, dated Mar. 18, 2015, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2014/016151, dated May 16, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2014/056185, dated Dec. 4, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028948, dated Jun. 21, 2013, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/029461, dated Jun. 21, 2013, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/040968, dated Sep. 5, 2013, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/044871, dated Aug. 14, 2013, 12 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2015/067754, dated Apr. 7, 2016, 13 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2014/014522, dated Jun. 6, 2014, 13 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/025966, dated Jun. 15, 2016, 15 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/045283, dated Mar. 12, 2014, 19 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/044873, dated Nov. 22, 2013, 9 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/045049, dated Sep. 16, 2013, 9 pages.
“Membrane Keyboards & Membrane Keypads”, Retrieved from: <http:www.pannam.com/> on May 9, 2012, Mar. 4, 2009, 2 pages.
“Microsoft Tablet PC”, Retrieved from <http://web.archive.org/web/20120622064335/https://en.wikipedia.org/wiki/Microsoft_Tablet_PC> on Jun. 4, 2014, Jun. 21, 2012, 9 pages.
“Motion Sensors”, Android Developers—retrieved from <http://developer.android.com/guide/topics/sensors/sensors_motion.html> on May 25, 2012, 2012, 7 pages.
“MPC Fly Music Production Controller”, AKAI Professional, Retrieved from: <http://www.akaiprompc.com/mpc-fly> on Jul. 9, 2012, 4 pages.
“NI Releases New Maschine & Maschine Mikro”, Retrieved from <http://www.djbooth.net/index/dj-equipment/entry/ni-releases-new-maschine-mikro/> on Sep. 17, 2012, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/021,448, dated Dec. 13, 2012, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,001, dated Feb. 19, 2013, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,139, dated Mar. 21, 2013, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,202, dated Feb. 11, 2013, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,336, dated Jan. 18, 2013, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/527,263, dated Apr. 3, 2014, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/527,263, dated Jul. 19, 2013, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/563,435, dated Jun. 14, 2013, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/564,520, dated Jun. 19, 2013, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/565,124, dated Jun. 17, 2013, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/599,763, dated May 28, 2014, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/603,918, dated Sep. 2, 2014, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/603,918, dated Dec. 19, 2013, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/645,405, dated Jan. 31, 2014, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/645,405, dated Aug. 11, 2014, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/647,479, dated Jul. 3, 2014, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,195, dated Jan. 2, 2013, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,232, dated Jan. 17, 2013, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,272, dated Feb. 12, 2013, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,287, dated Jan. 29, 2013, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,304, dated Mar. 22, 2013, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,327, dated Mar. 22, 2013, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,726, dated Apr. 15, 2013, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,871, dated Mar. 18, 2013, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,871, dated Jul. 1, 2013, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,976, dated Feb. 22, 2013, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,321, dated Feb. 1, 2013, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,682, dated Feb. 7, 2013, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,682, dated Jun. 3, 2013, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/655,065, dated Apr. 24, 2014, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/655,065, dated Aug. 19, 2015, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/655,065, dated Dec. 19, 2014, 24 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/656,055, dated Apr. 23, 2013, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/656,520, dated Feb. 1, 2013, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/656,520, dated Jun. 5, 2013, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/759,875, dated Aug. 1, 2014, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/769,356, dated Nov. 20, 2014, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/974,749, dated Feb. 12, 2015, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/974,749, dated May 8, 2014, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/974,749, dated Feb. 3, 2015, 20 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/974,994, dated Jan. 23, 2015, 26 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/974,994, dated Jun. 4, 2014, 24 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/975,087, dated Feb. 27, 2015, 20 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/975,087, dated May 8, 2014, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/975,087, dated May 10, 2016, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/033,290, dated Dec. 3, 2015, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/033,508, dated Dec. 3, 2015, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/033,510, dated Feb. 12, 2015, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/033,510, dated Jun. 5, 2014, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/033,510, dated Oct. 7, 2015, 23 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/144,876, dated Jun. 10, 2015, 23 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/591,704, dated Jun. 7, 2016, 32 pages.
“Notice of Allowance”, U.S. Appl. No. 13/470,633, dated Mar. 22, 2013, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,202, dated May 28, 2013, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/599,763, dated Feb. 18, 2015, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/603,918, dated Jan. 22, 2015, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,195, dated Jul. 8, 2013, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,272, dated May 2, 2013, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,304, dated Jul. 1, 2013, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,327, dated Jun. 11, 2013, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,726, dated May 31, 2013, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 14/033,290, dated Mar. 30, 2016, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 14/033,508, dated May 6, 2016, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 14/698,318, dated May 6, 2016, 13 pages.
“On-Screen Keyboard for Windows 7, Vista, XP with Touchscreen”, Retrieved from <www.comfort-software.com/on-screen-keyboard.html> on Aug. 28, 2012, Feb. 2, 2011, 3 pages.
“Optical Sensors in Smart Mobile Devices”, ON Semiconductor, TND415/D, Available at <http://www.onsemi.jp/pub_link/Collateral/TND415-D.PDF>, Nov. 2010, pp. 1-13.
“Optics for Displays: Waveguide-based Wedge Creates Collimated Display Backlight”, OptoIQ, retrieved from <http://www.optoiq.com/index/photonics-technologies-applications/lfw-display/lfw-article-display.articles.laser-focus-world.volume-46.issue-1.world-news.optics-for_displays.html> on Nov. 2, 2010, Jan. 1, 2010, 3 pages.
“Position Sensors”, Android Developers—retrieved from <http://developer.android.com/guide/topics/sensors/sensors_position.html> on May 25, 2012, 5 pages.
“Reflex LCD Writing Tablets”, retrieved from <http://www.kentdisplays.com/products/lcdwritingtablets.html> on Jun. 27, 2012, 3 pages.
“Restriction Requirement”, U.S. Appl. No. 13/603,918, dated Nov. 27, 2013, 8 pages.
“Restriction Requirement”, U.S. Appl. No. 13/471,139, dated Jan. 17, 2013, 7 pages.
“Restriction Requirement”, U.S. Appl. No. 13/651,726, dated Feb. 22, 2013, 6 pages.
“Restriction Requirement”, U.S. Appl. No. 13/651,871, dated Feb. 7, 2013, 6 pages.
“Restriction Requirement”, U.S. Appl. No. 13/715,229, dated Aug. 13, 2013, 7 pages.
“Second Written Opinion”, Application No. PCT/US2014/056185, dated Sep. 15, 2015, 5 pages.
“SMART Board™ Interactive Display Frame Pencil Pack”, Available at <http://downloads01.smarttech.com/media/sitecore/en/support/product/sbfpd/400series(interactivedisplayframes)/guides/smartboardinteractivedisplayframepencilpackv12mar09.pdf>, 2009, 2 pages.
“Snugg iPad 3 Keyboard Case—Cover Ultra Slim Bluetooth Keyboard Case for the iPad 3 & iPad 2”, Retrieved from <https://web.archive.org/web/20120810202056/http://www.amazon.com/Snugg-iPad-Keyboard-Case-Bluetooth/dp/B008CCHXJE> on Jan. 23, 2015, Aug. 10, 2012, 4 pages.
“SolRxTM E-Series Multidirectional Phototherapy ExpandableTM 2-Bulb Full Body Panel System”, Retrieved from: <http://www.solarcsystems.com/us_multidirectional_uv_light_therapy_1_intro.html > on Jul. 25, 2012, 2011, 4 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 14/698,318, dated Aug. 15, 2016, 2 pages.
“Tactile Feedback Solutions Using Piezoelectric Actuators”, Available at: http://www.eetimes.com/document.asp?doc_id=1278418, Nov. 17, 2010, 6 pages.
“The Microsoft Surface Tablets Comes With Impressive Design and Specs”, Retrieved from <http://microsofttabletreview.com/the-microsoft-surface-tablets-comes-with-impressive-design-and-specs> on Jan. 30, 2013, Jun. 2012, 2 pages.
“Tilt Shift Lenses: Perspective Control”, retrieved from http://www.cambridgeincolour.com/tutorials/tilt-shift-lenses1.htm, Mar. 28, 2008, 11 Pages.
“Virtualization Getting Started Guide”, Red Hat Enterprise Linux 6, Edition 0.2—retrieved from <http://docs.redhat.com/docs/en-US/Red_Hat_Enterprise_Linux/6/html-single/Virtualization_Getting_Started_Guide/index.html> on Jun. 13, 2012, 24 pages.
“Visus Photonics—Visionary Technologies New Generation of Production Ready Keyboard—Keypad Illumination Systems”, Available at: <http://www.visusphotonics.com/pdf/appl_keypad_keyboard_backlights.pdf>, May 2006, pp. 1-22.
“What is Active Alignment?”, http://www.kasalis.com/active_alignment.html, retrieved on Nov. 22, 2012, Nov. 22, 2012, 2 Pages.
“Write & Learn Spellboard Advanced”, Available at <http://somemanuals.com/VTECH,WRITE%2526LEARN--SPELLBOARD--ADV--71000,JIDFHE.PDF>, 2006, 22 pages.
“Writer 1 for iPad 1 keyboard + Case (Aluminum Bluetooth Keyboard, Quick Eject and Easy Angle Function!)”, Retrieved from <https://web.archive.org/web/20120817053825/http://www.amazon.com/keyboard-Aluminum-Bluetooth-Keyboard-Function/dp/B0040QLSLG> on Jan. 23, 2015, Aug. 17, 2012, 5 pages.
Akamatsu,“Movement Characteristics Using a Mouse with Tactile and Force Feedback”, In Proceedings of International Journal of Human-Computer Studies 45, No. 4, Oct. 1996, 11 pages.
Bathiche,“Input Device with Interchangeable Surface”, U.S. Appl. No. 13/974,749, dated Aug. 23, 2013, 51 pages.
Block,“DeviceOrientation Event Specification”, W3C, Editor's Draft, retrieved from <https://developer.palm.com/content/api/dev-guide/pdk/accessing-device-sensors.html> on May 25, 2012, Jul. 12, 2011, 14 pages.
Brown,“Microsoft Shows Off Pressure-Sensitive Keyboard”, retrieved from <http://news.cnet.com/8301-17938_105-10304792-1.html> on May 7, 2012, Aug. 6, 2009, 2 pages.
Butler,“SideSight: Multi-“touch” Interaction around Small Devices”, In the proceedings of the 21st annual ACM symposium on User interface software and technology., retrieved from <http://research.microsoft.com/pubs/132534/sidesight_crv3.pdf> on May 29, 2012, Oct. 19, 2008, 4 pages.
Chu,“Design and Analysis of a Piezoelectric Material Based Touch Screen With Additional Pressure and Its Acceleration Measurement Functions”, In Proceedings of Smart Materials and Structures, vol. 22, Issue 12, Nov. 1, 2013, 2 pages.
Crider,“Sony Slate Concept Tablet “Grows” a Kickstand”, Retrieved from: <http://androidcommunity.com/sony-slate-concept-tablet-grows-a-kickstand-20120116/> on May 4, 2012, Jan. 16, 2012, 9 pages.
Das,“Study of Heat Transfer through Multilayer Clothing Assemblies: A Theoretical Prediction”, Retrieved from <http://www.autexrj.com/cms/zalaczone_pliki/5_013_11.pdf>, Jun. 2011, 7 pages.
Dietz,“A Practical Pressure Sensitive Computer Keyboard”, In Proceedings of UIST 2009, Oct. 2009, 4 pages.
Gaver,“A Virtual Window on Media Space”, retrieved from <http://www.gold.ac.uk/media/15gaver-smets-overbeeke.MediaSpaceWindow.chi95.pdf> on Jun. 1, 2012, retrieved from <http://www.gold.ac.uk/media/15gaver-smets-overbeeke.MediaSpaceWindow.chi95.pdf> on Jun. 1, 2012, May 7, 1995, 9 pages.
Glatt,“Channel and Key Pressure (Aftertouch).”, Retrieved from: <http://home.roadrunner.com/˜jgglatt/tutr/touch.htm> on Jun. 11, 2012, 2012, 2 pages.
Gong,“PrintSense: A Versatile Sensing Technique to Support Multimodal Flexible Surface Interaction”, In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems; retrieved from: http://dl.acm.org/citation.cfm?id=2556288.2557173&coll=DL&dl=ACM&CFID=571580473&CFTOKEN=89752233 on Sep. 19, 2014, Apr. 26, 2014, 4 pages.
Hanlon,“ElekTex Smart Fabric Keyboard Goes Wireless”, Retrieved from: <http://www.gizmag.com/go/5048/ > on May 7, 2012, Jan. 15, 2006, 5 pages.
Harada,“VoiceDraw: A Hands-Free Voice-Driven Drawing Application for People With Motor Impairments”, In Proceedings of Ninth International ACM SIGACCESS Conference on Computers and Accessibility, retrieved from <http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.113.7211&rep=rep1&type=pdf> on Jun. 1, 2012, Oct. 15, 2007, 8 pages.
Hinckley,“Touch-Sensing Input Devices”, In Proceedings of ACM SIGCHI 1999, May 15, 1999, 8 pages.
Hughes,“Apple's haptic touch feedback concept uses actuators, senses force on iPhone, iPad”, Retrieved from: http://appleinsider.com/articles/12/03/22/apples_haptic_touch_feedback_concept_uses_actuators_senses_force_on_iphone_ipad, Mar. 22, 2012, 5 pages.
Iwase,“Multistep Sequential Batch Assembly of Three-Dimensional Ferromagnetic Microstructures with Elastic Hinges”, Retrieved at <<http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1549861>> Proceedings: Journal of Microelectromechanical Systems, Dec. 2005, 7 pages.
Kaufmann,“Hand Posture Recognition Using Real-time Artificial Evolution”, EvoApplications'09, retrieved from <http://evelyne.lutton.free.fr/Papers/KaufmannEvolASP2010.pdf> on Jan. 5, 2012, Apr. 3, 2010, 10 pages.
Kaur,“Vincent Liew's redesigned laptop satisfies ergonomic needs”, Retrieved from: <http://www.designbuzz.com/entry/vincent-liew-s-redesigned-laptop-satisfies-ergonomic-needs/> on Jul. 27, 2012, Jun. 21, 2010, 4 pages.
Khuntontong,“Fabrication of Molded Interconnection Devices by Ultrasonic Hot Embossing on Thin Polymer Films”, IEEE Transactions on Electronics Packaging Manufacturing, vol. 32, No. 3, Jul. 2009, pp. 152-156.
Kyung,“TAXEL: Initial Progress Toward Self-Morphing Visio-Haptic Interface”, Proceedings: In IEEE World Haptics Conference, Jun. 21, 2011, 6 pages.
Lane,“Media Processing Input Device”, U.S. Appl. No. 13/655,065, Oct. 18, 2012, 43 pages.
Li,“Characteristic Mode Based Tradeoff Analysis of Antenna-Chassis Interactions for Multiple Antenna Terminals”, In IEEE Transactions on Antennas and Propagation, Retrieved from <http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6060882>, Feb. 2012, 13 pages.
Linderholm,“Logitech Shows Cloth Keyboard for PDAs”, Retrieved from: <http://www.pcworld.com/article/89084/logitech_shows_cloth_keyboard_for_pdas.html> on May 7, 2012, Mar. 15, 2002, 5 pages.
Mackenzie,“The Tactile Touchpad”, In Proceedings of the ACM CHI Human Factors in Computing Systems Conference Available at: <http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.150.4780&rep=rep1&type=pdf>, Mar. 22, 1997, 2 pages.
Manresa-Yee,“Experiences Using a Hands-Free Interface”, In Proceedings of the 10th International ACM SIGACCESS Conference on Computers and Accessibility, retrieved from <http://dmi.uib.es/˜cmanresay/Research/%5BMan08%5DAssets08.pdf> on Jun. 1, 2012, Oct. 13, 2008, pp. 261-262.
McLellan,“Eleksen Wireless Fabric Keyboard: a first look”, Retrieved from: <http://www.zdnetasia.com/eleksen-wireless-fabric-keyboard-a-first-look-40278954.htm> on May 7, 2012, Jul. 17, 2006, 9 pages.
McPherson,“TouchKeys: Capacitive Multi-Touch Sensing on a Physical Keyboard”, In Proceedings of NIME 2012, May 2012, 4 pages.
Miller,“MOGA gaming controller enhances the Android gaming experience”, Retrieved from <http://www.zdnet.com/moga-gaming-controller-enhances-the-android-gaming-experience-7000007550/> on Nov. 20, 2012, Nov. 18, 2012, 9 pages.
Nakanishi,“Movable Cameras Enhance Social Telepresence in Media Spaces”, In Proceedings of the 27th International Conference on Human Factors in Computing Systems, retrieved from <http://smg.ams.eng.osaka-u.ac.jp/˜nakanishi/hnp_2009_chi.pdf> on Jun. 1, 2012, Apr. 6, 2009, 10 pages.
Picciotto,“Piezo-Actuated Virtual Buttons for Touch Surfaces”, U.S. Appl. No. 13/769,356, filed Feb. 17, 2013, 31 pages.
Piltch,“ASUS Eee Pad Slider SL101 Review”, Retrieved from <http://www.laptopmag.com/review/tablets/asus-eee-pad-slider-sl101.aspx>, Sep. 22, 2011, 5 pages.
Post,“E-Broidery: Design and Fabrication of Textile-Based Computing”, IBM Systems Journal, vol. 39, Issue 3 & 4, Jul. 2000, pp. 840-860.
Poupyrev,“Ambient Touch: Designing Tactile Interfaces for Handheld Devices”, In Proceedings of the 15th Annual ACM Symposium on User Interface Software and Technology Available at: <http://www.ivanpoupyrev.com/e-library/2002/uist2002_ambientouch.pdf>, Oct. 27, 2002, 10 pages.
Purcher,“Apple is Paving the Way for a New 3D GUI for IOS Devices”, Retrieved from: <http://www.patentlyapple.com/patently-apple/2012/01/apple-is-paving-the-way-for-a-new-3d-gui-for-ios-devices.html> on Jun. 4, 2012, Retrieved from: <http://www.patentlyapple.com/patently-apple/2012/01/apple-is-paving-the-way-for-a-new-3d-gui-for-ios-devices.html> on Jun. 4, 2012, Jan. 12, 2012, 15 pages.
Qin,“pPen: Enabling Authenticated Pen and Touch Interaction on Tabletop Surfaces”, In Proceedings of ITS 2010—Available at <http://www.dfki.de/its2010/papers/pdf/po172.pdf>, Nov. 2010, pp. 283-284.
Reilink,“Endoscopic Camera Control by Head Movements for Thoracic Surgery”, In Proceedings of 3rd IEEE RAS & EMBS International Conference of Biomedical Robotics and Biomechatronics, retrieved from <http://doc.utwente.nl/74929/1/biorob_online.pdf> on Jun. 1, 2012, Sep. 26, 2010, pp. 510-515.
Rendl,“PyzoFlex: Printed Piezoelectric Pressure Sensing Foil”, In Proceedings of the 25th Annual ACM Symposium on User Interface Software and Technology, Oct. 7, 2012, 10 pages.
Shaw,“Input Device Configuration having Capacitive and Pressure Sensors”, U.S. Appl. No. 14/033,510, dated Sep. 22, 2013, 55 pages.
Staff,“Gametel Android controller turns tablets, phones into portable gaming devices”, Retrieved from <http://www.mobiletor.com/2011/11/18/gametel-android-controller-turns-tablets-phones-into-portable-gaming-devices/#> on Nov. 20, 2012, Nov. 18, 2011, 5 pages.
Sumimoto,“Touch & Write: Surface Computing With Touch and Pen Input”, Retrieved from: <http://www.gottabemobile.com/2009/08/07/touch-write-surface-computing-with-touch-and-pen-input/> on Jun. 19, 2012, Aug. 7, 2009, 4 pages.
Sundstedt,“Gazing at Games: Using Eye Tracking to Control Virtual Characters”, In ACM SIGGRAPH 2010 Courses, retrieved from <http://www.tobii.com/Global/Analysis/Training/EyeTrackAwards/veronica_sundstedt.pdf> on Jun. 1, 2012, Jul. 28, 2010, 85 pages.
Takamatsu,“Flexible Fabric Keyboard with Conductive Polymer-Coated Fibers”, In Proceedings of Sensors 2011, Oct. 28, 2011, 4 pages.
Travis,“Collimated Light from a Waveguide for a Display Backlight”, Optics Express, 19714, vol. 17, No. 22, retrieved from <http://download.microsoft.com/download/D/2/E/D2E425F8-CF3C-4C71-A4A2-70F9D4081007/OpticsExpressbacklightpaper.pdf> on Oct. 15, 2009, Oct. 15, 2009, 6 pages.
Travis,“The Design of Backlights for View-Sequential 3D”, retrieved from <http://download.microsoft.com/download/D/2/E/D2E425F8-CF3C-4C71-A4A2-70F9D4081007/Backlightforviewsequentialautostereo.docx> on Nov. 1, 2010, 4 pages.
Tuite,“Haptic Feedback Chips Make Virtual-Button Applications on Handheld Devices A Snap”, Retrieved at: http://electronicdesign.com/analog/haptic-feedback-chips-make-virtual-button-applications-handheld-devices-snap, Sep. 10, 2009, 7 pages.
Valli,“Notes on Natural Interaction”, retrieved from <http://www.idemployee.id.tue.nl/g.w.m.rauterberg/lecturenotes/valli-2004.pdf> on Jan. 5, 2012, Sep. 2005, 80 pages.
Valliath,“Design of Hologram for Brightness Enhancement in Color LCDs”, Retrieved from <http://www.loreti.it/Download/PDF/LCD/44_05.pdf> on Sep. 17, 2012, May 1998, 5 pages.
Vaucelle,“Scopemate, A Robotic Microscope!”, Architectradure, retrieved from <http://architectradure.blogspot.com/2011/10/at-uist-this-monday-scopemate-robotic.html> on Jun. 6, 2012, Oct. 17, 2011, 2 pages.
Williams,“A Fourth Generation of LCD Backlight Technology”, Retrieved from <http://cds.linear.com/docs/Application%20Note/an65f.pdf>, Nov. 1995, 124 pages.
Xu,“Hand Gesture Recognition and Virtual Game Control Based on 3D Accelerometer and EMG Sensors”, IUI'09, Feb. 8-11, 2009, retrieved from <http://sclab.yonsei.ac.kr/courses/10TPR/10TPR.files/Hand%20Gesture%20Recognition%20and%20Virtual%20Game%20Control%20based%20on%203d%20accelerometer%20and%20EMG%20sensors.pdf> on Jan. 5, 2012, Feb. 8, 2009, 5 pages.
Xu,“Vision-based Detection of Dynamic Gesture”, ICTM'09, Dec. 5-6, 2009, retrieved from <http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=5412956> on Jan. 5, 2012, Dec. 5, 2009, pp. 223-226.
Zhang,“Model-Based Development of Dynamically Adaptive Software”, In Proceedings of ICSE 2006, Available at <http://www.irisa.fr/lande/lande/icse-proceedings/icse/p371.pdf>, May 20, 2006, pp. 371-380.
Zhu,“Keyboard before Head Tracking Depresses User Success in Remote Camera Control”, In Proceedings of 12th IFIP TC 13 International Conference on Human-Computer Interaction, Part II, retrieved from <http://csiro.academia.edu/Departments/CSIRO_ICT_Centre/Papers?page=5> on Jun. 1, 2012, Aug. 24, 2009, 14 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/033,510, dated May 5, 2017, 2 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2015/067754, dated Jan. 10, 2017, 10 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2016/031699, dated Feb. 22, 2017, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/591,704, dated Mar. 10, 2017, 26 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/729,793, dated Mar. 31, 2017, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/203,636, dated Apr. 13, 2017, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/269,594, dated Jun. 7, 2017, 27 pages.
“Notice of Allowance”, U.S. Appl. No. 14/033,510, dated Feb. 15, 2017, 10 pages.
“PTAB Decision”, U.S. Appl. No. 13/974,994, dated May 16, 2017, 16 pages.
“Second Written Opinion”, Application No. PCT/US2016/025966, dated Mar. 14, 2017, 7 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2016/025966, dated May 22, 2017, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 14/591,704, dated Aug. 21, 2017, 15 pages.
“Notice of Allowance”, U.S. Appl. No. 15/203,636, dated Jul. 20, 2017, 7 pages.
Related Publications (1)
Number: US 2017/0023418 A1; Date: Jan. 2017; Country: US
Provisional Applications (1)
Number: 61/659,364; Date: Jun. 2012; Country: US
Continuations (1)
Parent: U.S. Appl. No. 14/033,508, Sep. 2013; Child: U.S. Appl. No. 15/283,913
Continuations in Part (2)
Parent: U.S. Appl. No. 13/974,749, Aug. 2013; Child: U.S. Appl. No. 14/033,508
Parent: U.S. Appl. No. 13/655,065, Oct. 2012; Child: U.S. Appl. No. 13/974,749