Sensing user input at display area edge

Information

  • Patent Grant
  • Patent Number
    9,304,949
  • Date Filed
    Monday, October 21, 2013
  • Date Issued
    Tuesday, April 5, 2016
Abstract
One or more sensors are disposed to sense user inputs in an active display area as well as user inputs in an extended area that is outside of the active display area. Data for user inputs, such as gestures, may include data from user inputs sensed in both the active display area and outside of the active display area. The user inputs can begin and/or end outside of the active display area.
Description
BACKGROUND

Mobile computing devices have been developed to increase the functionality that is made available to users in a mobile setting. For example, a user may interact with a mobile phone, tablet computer, or other mobile computing device to check email, surf the web, compose texts, interact with applications, and so on. Traditional mobile computing devices often employ displays with touchscreen functionality to allow users to input various data or requests to the computing device. However, it can be difficult to recognize certain user inputs with such traditional mobile computing devices, resulting in frustrating and unfriendly experiences for users.


SUMMARY

Sensing user input at display area edge techniques are described.


In one or more implementations, input data for a user input is received, the input data having been sensed by one or more sensors. The input data includes data for locations touched by an object in an active display area of an interactive display device as well as data for locations touched by the object in an area outside of the active display area of the interactive display device. Based on the data for the locations touched by the object in the active display area as well as the locations touched by the object in the area outside of the active display area, the user input is determined.


In one or more implementations, a computing device includes a housing and a display device supported by the housing and having an active display area. The display device has one or more sensors disposed for sensing locations of the active display area that are touched by an object while inputting a user input to the computing device as well as locations outside of the active display area that are touched by the object while inputting the user input. The computing device determines the user input based on both the locations of the active display area that are touched by the object and the locations outside of the active display area that are touched by the object.


This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.





BRIEF DESCRIPTION OF THE DRAWINGS

The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items. Entities represented in the figures may be indicative of one or more entities and thus reference may be made interchangeably to single or plural forms of the entities in the discussion.



FIG. 1 is an illustration of an environment in an example implementation that is operable to employ the techniques described herein.



FIG. 2 is an illustration of an environment in another example implementation that is operable to employ the techniques described herein.



FIG. 3 depicts an example implementation of an input device of FIG. 2 as showing a flexible hinge in greater detail.



FIG. 4 depicts an example implementation showing a perspective view of a connecting portion of FIG. 3 that includes mechanical coupling protrusions and a plurality of communication contacts.



FIG. 5 illustrates an example display device implementing the sensing user input at display area edge techniques.



FIG. 6 illustrates a cross section view of an example display device implementing the sensing user input at display area edge techniques.



FIG. 7 illustrates a cross section view of another example display device implementing the sensing user input at display area edge techniques.



FIG. 8 is an illustration of a system in an example implementation that is operable to employ the techniques described herein.



FIG. 9 illustrates the example display device of FIG. 5 with an example user input.



FIG. 10 illustrates the example display device of FIG. 5 with another example user input.



FIG. 11 illustrates the example display device of FIG. 5 with another example user input.



FIG. 12 illustrates the example display device of FIG. 5 with another example user input.



FIG. 13 is a flowchart illustrating an example process for implementing the techniques described herein in accordance with one or more embodiments.



FIG. 14 illustrates an example system including various components of an example device that can be implemented as any type of computing device as described with reference to FIGS. 1-13 to implement embodiments of the techniques described herein.





DETAILED DESCRIPTION
Overview

Sensing user input at display area edge techniques are described. One or more sensors are disposed to sense user inputs in an active display area as well as to sense user inputs in an extended area that is outside of the active display area. Data for user inputs, such as gestures, may include data from user inputs sensed in both the active display area and outside of the active display area. Thus, user inputs can begin and/or end outside of the active display area.


In the following discussion, an example environment is first described that may employ the techniques described herein. Example procedures are then described which may be performed in the example environment as well as other environments. Consequently, performance of the example procedures is not limited to the example environment and the example environment is not limited to performance of the example procedures.


Example Environment and Procedures


FIG. 1 is an illustration of an environment 100 in an example implementation that is operable to employ the techniques described herein. The illustrated environment 100 includes an example of a computing device 102, which may be configured in a variety of ways. For example, the computing device 102 may be configured for mobile use, such as a mobile phone, a tablet computer, and so on. However, the techniques discussed herein are also applicable to multiple types of devices other than those for mobile use, and may be used with any of a variety of different devices that use an input sensor over or in a display area. For example, the computing device 102 may be a desktop computer, a point of sale kiosk, an interactive display or monitor (e.g., in a hospital, airport, mall, etc.), and so forth. The computing device 102 may range from full-resource devices with substantial memory and processor resources to low-resource devices with limited memory and/or processing resources. The computing device 102 may also relate to software that causes the computing device 102 to perform one or more operations.


The computing device 102, for instance, is illustrated as including an input/output module 108. The input/output module 108 is representative of functionality relating to processing of inputs and rendering outputs of the computing device 102. A variety of different inputs may be processed by the input/output module 108, such as inputs relating to functions that correspond to keys of an input device coupled to computing device 102 or keys of a virtual keyboard displayed by the display device 110, inputs that are gestures recognized through touchscreen functionality of the display device 110 and that cause operations to be performed that correspond to the gestures, and so forth. The display device 110 is thus also referred to as an interactive display device due to the ability of the display device to receive user inputs via any of various input sensing technologies. The input/output module 108 may support a variety of different input techniques by recognizing and leveraging a division between types of inputs including key presses, gestures, and so on.



FIG. 2 is an illustration of an environment 200 in another example implementation that is operable to employ the techniques described herein. The illustrated environment 200 includes an example of a computing device 202 that is physically and communicatively coupled to an input device 204 via a flexible hinge 206. The computing device 202 may be configured in a variety of ways, analogous to computing device 102 of FIG. 1. The computing device 202 may also relate to software that causes the computing device 202 to perform one or more operations.


The computing device 202, for instance, is illustrated as including an input/output module 208. The input/output module 208 is representative of functionality relating to processing of inputs and rendering outputs of the computing device 202. A variety of different inputs may be processed by the input/output module 208, such as inputs relating to functions that correspond to keys of the input device 204 or keys of a virtual keyboard displayed by the display device 210, inputs that are gestures recognized through touchscreen functionality of the display device 210 and that cause operations to be performed that correspond to the gestures, and so forth. The display device 210 is thus also referred to as an interactive display device due to the ability of the display device to receive user inputs via any of various input sensing technologies. The input/output module 208 may support a variety of different input techniques by recognizing and leveraging a division between types of inputs including key presses, gestures, and so on.


In the illustrated example, the input device 204 is configured as a keyboard having a QWERTY arrangement of keys, although other arrangements of keys are also contemplated. Further, other non-conventional configurations are also contemplated, such as a game controller, a configuration that mimics a musical instrument, and so forth. Thus, the input device 204 and keys incorporated by the input device 204 may assume a variety of different configurations to support a variety of different functionality.


As previously described, the input device 204 is physically and communicatively coupled to the computing device 202 in this example through use of a flexible hinge 206. The flexible hinge 206 is flexible in that rotational movement supported by the hinge is achieved through flexing (e.g., bending) of the material forming the hinge as opposed to mechanical rotation as supported by a pin, although that embodiment is also contemplated. Further, this flexible rotation may be configured to support movement in one direction (e.g., vertically in the figure) yet restrict movement in other directions, such as lateral movement of the input device 204 in relation to the computing device 202. This may be used to support consistent alignment of the input device 204 in relation to the computing device 202, such as to align sensors used to change power states, application states, and so on.


The flexible hinge 206, for instance, may be formed using one or more layers of fabric and include conductors formed as flexible traces to communicatively couple the input device 204 to the computing device 202 and vice versa. This communication, for instance, may be used to communicate a result of a key press to the computing device 202, receive power from the computing device, perform authentication, provide supplemental power to the computing device 202, and so on. The flexible hinge 206 may be configured in a variety of ways, further discussion of which may be found in relation to the following figure.



FIG. 3 depicts an example implementation 300 of the input device 204 of FIG. 2 showing the flexible hinge 206 in greater detail. In this example, a connecting portion 302 of the input device is shown that is configured to provide a communicative and physical connection between the input device 204 and the computing device 202. In this example, the connecting portion 302 has a height and cross section configured to be received in a channel in the housing of the computing device 202, although this arrangement may also be reversed without departing from the spirit and scope thereof.


The connecting portion 302 is flexibly connected to a portion of the input device 204 that includes the keys through use of the flexible hinge 206. Thus, when the connecting portion 302 is physically connected to the computing device, the combination of the connecting portion 302 and the flexible hinge 206 supports movement of the input device 204 in relation to the computing device 202 that is similar to a hinge of a book.


For example, rotational movement may be supported by the flexible hinge 206 such that the input device 204 may be placed against the display device 210 of the computing device 202 and thereby act as a cover. The input device 204 may also be rotated so as to be disposed against a back of the computing device 202, e.g., against a rear housing of the computing device 202 that is disposed opposite the display device 210 on the computing device 202.


Naturally, a variety of other orientations are also supported. For instance, the computing device 202 and input device 204 may assume an arrangement such that both are laid flat against a surface as shown in FIG. 2. In another instance, a typing arrangement may be supported in which the input device 204 is laid flat against a surface and the computing device 202 is disposed at an angle to permit viewing of the display device 210, e.g., such as through use of a kickstand disposed on a rear surface of the computing device 202. Other instances are also contemplated, such as a tripod arrangement, meeting arrangement, presentation arrangement, and so forth.


The connecting portion 302 is illustrated in this example as including magnetic coupling devices 304, 306, mechanical coupling protrusions 308, 310, and a plurality of communication contacts 312. The magnetic coupling devices 304, 306 are configured to magnetically couple to complementary magnetic coupling devices of the computing device 202 through use of one or more magnets. In this way, the input device 204 may be physically secured to the computing device 202 through use of magnetic attraction.


The connecting portion 302 also includes mechanical coupling protrusions 308, 310 to form a mechanical physical connection between the input device 204 and the computing device 202. The mechanical coupling protrusions 308, 310 are shown in greater detail in the following figure.



FIG. 4 depicts an example implementation 400 showing a perspective view of the connecting portion 302 of FIG. 3 that includes the mechanical coupling protrusions 308, 310 and the plurality of communication contacts 312. As illustrated, the mechanical coupling protrusions 308, 310 are configured to extend away from a surface of the connecting portion 302, in this case perpendicularly, although other angles are also contemplated.


The mechanical coupling protrusions 308, 310 are configured to be received within complementary cavities within the channel of the computing device 202. When so received, the mechanical coupling protrusions 308, 310 promote a mechanical binding between the devices when forces are applied that are not aligned with an axis defined as corresponding to the height of the protrusions and the depth of the cavities.


For example, when a force is applied that coincides with the longitudinal axis described previously, which follows the height of the protrusions and the depth of the cavities, a user need overcome only the force applied by the magnets to separate the input device 204 from the computing device 202. However, at other angles the mechanical coupling protrusions 308, 310 are configured to mechanically bind within the cavities, thereby creating a force to resist removal of the input device 204 from the computing device 202 in addition to the magnetic force of the magnetic coupling devices 304, 306. In this way, the mechanical coupling protrusions 308, 310 may bias the removal of the input device 204 from the computing device 202 to mimic tearing a page from a book and restrict other attempts to separate the devices.


The connecting portion 302 is also illustrated as including a plurality of communication contacts 312. The plurality of communication contacts 312 is configured to contact corresponding communication contacts of the computing device 202 to form a communicative coupling between the devices. The communication contacts 312 may be configured in a variety of ways, such as through formation using a plurality of spring loaded pins that are configured to provide a consistent communication contact between the input device 204 and the computing device 202. Therefore, the communication contact may be configured to remain during minor movement or jostling of the devices. A variety of other examples are also contemplated, including placement of the pins on the computing device 202 and contacts on the input device 204.


The sensing user input at display area edge techniques use one or more sensors disposed in an extended sensor area to sense user input outside of an active display area. One or more sensors are also disposed to sense user inputs in the active display area. The extended sensor area is in close proximity to (e.g., within 5 millimeters of) the active display area, and typically is adjacent to the active display area.



FIG. 5 illustrates an example display device 500 implementing the sensing user input at display area edge techniques. The display device 500 is an interactive display device that includes an active display area 502 in which various data and information may be displayed by the computing device. The display area 502 is referred to as an active display area because the data and information displayed can be changed over time by the computing device, optionally in response to user inputs received by the computing device. The display device 500 also includes an extended sensor area 504, surrounding and adjacent to the active display area 502, illustrated with cross-hatching. User inputs can be received when an object, such as a finger of a user's hand, a stylus, a pen, and so forth, is touching and/or in close proximity to the surface of the active display area 502 and/or the surface of the extended sensor area 504. The extended sensor area 504 facilitates sensing user inputs along the edge of the active display area 502. The edge of the active display area 502 refers to the outer perimeter of the active display area 502, which is the portion of the active display area 502 that is closest to the extended sensor area 504.


The extended sensor area 504 can extend, for example, 2 millimeters beyond the active display area 502, although other amounts of extension are contemplated. The extended sensor area 504 can extend the same amount beyond the active display area 502 all around the active display area 502, or alternatively can extend by different amounts. For example, the extended sensor area 504 can extend beyond the active display area 502 by 2 millimeters in the vertical direction and by 4 millimeters in the horizontal direction. The extended sensor area 504 can also vary for different types of devices and be customized to the particular type of device. For example, interactive devices that can receive input from farther away (e.g., point of sale kiosks and interactive displays that can sense input as far away as 10 centimeters) may have extended sensor areas that extend beyond the display area farther (e.g., 10-15 centimeters rather than 2-4 millimeters) than devices that receive input from closer interactions (e.g., a tablet that senses touch).
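The geometry described above can be sketched in code. The display dimensions below are hypothetical, and the function name is illustrative; the text gives only the 2-4 millimeter extensions as example values:

```python
# Sketch: classifying a sensed location as inside the active display area,
# in the surrounding extended sensor area, or out of sensing range.
# Coordinates are in millimeters with the origin at the top-left corner
# of the active display area. All dimensions here are hypothetical.

def classify_location(x_mm, y_mm, display_w_mm=150.0, display_h_mm=100.0,
                      extend_x_mm=4.0, extend_y_mm=2.0):
    """Return 'active', 'extended', or 'outside' for a sensed point.

    The extension amounts may differ per axis, mirroring the example of
    2 mm vertically and 4 mm horizontally.
    """
    if 0.0 <= x_mm <= display_w_mm and 0.0 <= y_mm <= display_h_mm:
        return "active"
    if (-extend_x_mm <= x_mm <= display_w_mm + extend_x_mm and
            -extend_y_mm <= y_mm <= display_h_mm + extend_y_mm):
        return "extended"
    return "outside"
```

A point a few millimeters past the display edge is thus still attributable to a user input, which is what allows gestures to begin or end outside the active display area.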


Display devices implementing the sensing user input at display area edge techniques can use a variety of active display technologies. These active display technologies may include, for example, flexible display technologies, e-reader display technologies, liquid crystal display (LCD) technologies, light-emitting diode (LED) display technologies, organic light-emitting diode (OLED) display technologies, plasma display technologies, and so forth. Although examples of display technologies are discussed herein, other display technologies are also contemplated.


Display devices implementing the sensing user input at display area edge techniques can use a variety of different input sensing technologies. These input sensing technologies may include capacitive systems and/or resistive systems that sense touch. These input sensing technologies may also include inductive systems that sense pen (or other object) inputs. These input sensing technologies may also include optical-based systems that sense reflection or disruption of light from objects touching (or close to) the surface of the display device, such as Sensor in Pixel (SIP) systems, infrared systems, optical imaging systems, and so forth. Other types of input sensing technologies can also be used, such as surface acoustic wave systems, acoustic pulse recognition systems, dispersive signal systems, and so forth. Although examples of input sensing technologies are discussed herein, other input sensing technologies are also contemplated. Furthermore, these input sensing technologies may be combined together, such as a piezoelectric sensor combined with an extended capacitive sensor to provide other tactile input.


Depending on the input sensing technology that is used for a display device, user inputs can be received when an object (such as a finger of a user's hand, a stylus, a pen, and so forth) is touching and/or in close proximity to the surface of the display device. This close proximity can be, for example, 5 millimeters, although different proximities are contemplated and can vary depending on the manner in which the display device is implemented. The proximity of an object to the display device refers to the distance the object is from the display device along a direction perpendicular to a plane of the display device.
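As a minimal sketch of this proximity test, assuming the example 5 millimeter threshold and a hover distance reported along the axis perpendicular to the display plane (the function name is illustrative):

```python
# Sketch: deciding whether a hovering object is "in close proximity" to the
# display surface. The 5 mm default is the example value from the text;
# real thresholds vary with the input sensing technology used.

def is_in_proximity(z_mm, threshold_mm=5.0):
    """True if an object z_mm above the display plane counts as input."""
    return 0.0 <= z_mm <= threshold_mm
```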



FIG. 6 illustrates a cross section view of an example display device 600 implementing the sensing user input at display area edge techniques. The display device 600 includes an active display layer 602 on top of which is disposed an input sensing layer 604. Although the layers 602 and 604 are illustrated as being individual layers, it should be noted that each of the layers 602 and 604 may itself be made up of multiple layers. The input sensing layer 604 and the active display layer 602 can be implemented using a variety of different technologies, as discussed above. Although not illustrated in FIG. 6, it should be noted that any number of additional layers can be included in the display device 600. For example, an additional protective layer made of glass or plastic can be disposed on top of the input sensing layer 604.


A user's finger 606 (or other object) touching or in close proximity to the input sensing layer 604 is sensed by the input sensing layer 604. The locations at which the user's finger 606 (or other object) is sensed by the layer 604 are provided by the layer 604 as sensed object locations and are used to identify the user input, as discussed in more detail below.


The input sensing layer 604 includes multiple sensors, and extends beyond the active display layer 602 to the extended sensor area 608, 610. The number of sensors and the manner in which the sensors are disposed may vary based on the implementation and the input sensing technology used for the input sensing layer 604. The input sensing layer 604 includes a portion 612 as well as portions 614 and 616.


One or more sensors may be disposed in the input sensing layer 604 above active display layer 602, in portion 612. These sensors disposed above the layer 602 sense the user's finger 606 (or other object) touching or in close proximity to the layer 604 above the active display layer 602, and thus are also referred to as sensing user input in and/or above the active display area as well as being disposed in the active display area.


One or more sensors may also be disposed in the input sensing layer 604 above extended sensor area 608, 610, in portions 614, 616, respectively. The extended sensor area 608, 610 is not above the active display layer 602, as illustrated in FIG. 6. These sensors disposed above the extended sensor area 608, 610 sense the user's finger 606 (or other object) touching or in close proximity to the layer 604 above the extended sensor area 608, 610, and thus are also referred to as sensing user input in and/or above the extended sensor area 608, 610. Because the extended sensor area 608, 610 is not above the active display layer 602, these sensors disposed above the extended sensor area 608, 610 are also referred to as sensing user input in an area outside of the active display area as well as being disposed in the area outside of the active display area.


Alternatively, sensors may be disposed in the input sensing layer 604 in other manners, such as along the outer edge (the perimeter) of the input sensing layer 604, at corners of the input sensing layer 604, and so forth. Such sensors may still sense user input in and/or above the active display area, as well as user input in an area outside of the active display area.



FIG. 7 illustrates a cross section view of another example display device 700 implementing the sensing user input at display area edge techniques. The display device 700 includes an active display layer 702 on top of which is disposed an input sensing layer 704. The input sensing layer 704 and the active display layer 702 can be implemented using a variety of different technologies, as discussed above. The layers 702 and 704 are disposed between a lower panel layer 706 and an upper panel layer 708. The panel layers 706, 708 may be made of various materials, such as glass, plastic, and so forth. Although the layers 702, 704, 706, and 708 are illustrated as being individual layers, it should be noted that each of the layers 702, 704, 706, and 708 may itself be made up of multiple layers. These layers may also be flexible layers, applicable to three-dimensional (3D) interactive devices.


Additional support material 714, 716 is optionally included between the panel layers 706, 708, illustrated with cross-hatching in FIG. 7. The support material 714, 716 provides additional support for areas between the panel layers to which the layers 702 and 704 do not extend. The support material 714, 716 can be various materials, such as glass, plastic, bonding adhesive, and so forth.


A user's finger 606 (or other object) touching or in close proximity to the input sensing layer 704 is sensed by the input sensing layer 704. The locations at which the user's finger 606 (or other object) is sensed by the layer 704 are provided by the layer 704 as sensed object locations and are used to identify the user input, as discussed in more detail below.


The input sensing layer 704 includes multiple sensors, and extends beyond the active display layer 702 to the extended sensor area 710, 712. The input sensing layer 704 need not, however, extend as far as the panel layers 706, 708, as illustrated. The number of sensors included in the input sensing layer 704 and the manner in which the sensors are disposed may vary based on the implementation and the input sensing technology used for the input sensing layer 704. The input sensing layer 704 includes a portion 718 as well as portions 720 and 722.


One or more sensors are disposed in the input sensing layer 704 above active display layer 702, in portion 718. These sensors disposed above the layer 702 sense the user's finger 606 (or other object) touching or in close proximity to the panel layer 708 above the active display layer 702, and thus are also referred to as sensing user input in and/or above the active display area as well as being disposed in the active display area.


One or more sensors are also disposed in the input sensing layer 704 above the extended sensor area 710, 712, in portions 720, 722, respectively. The extended sensor area 710, 712 is not above the active display layer 702, as illustrated in FIG. 7. These sensors disposed above the extended sensor area 710, 712 sense the user's finger 606 (or other object) touching or in close proximity to the panel layer 708 above the extended sensor area 710, 712, and thus are also referred to as sensing user input in and/or above the extended sensor area 710, 712. Because the extended sensor area 710, 712 is not above the active display layer 702, these sensors disposed above the extended sensor area 710, 712 are also referred to as sensing user input in an area outside of the active display area as well as being disposed in the area outside of the active display area.


Alternatively, sensors may be disposed in the input sensing layer 704 in other manners, such as along the outer edge (the perimeter) of the input sensing layer 704, at corners of the input sensing layer 704, and so forth. Such sensors may still sense user input in and/or above the active display area, as well as user input in an area outside of the active display area.


It should be noted that, although the input sensing layers in FIGS. 6 and 7 are illustrated as being disposed above the active display layers, other arrangements are contemplated. For example, the input sensing layer can be within or below the active display layer. The input sensing layer can also take multiple configurations. The input sensing layer may be on both sides of a plastic and/or glass substrate, or on the same side as plastic, glass, and/or other optically clear layers.



FIG. 8 is an illustration of a system 800 in an example implementation that is operable to employ the techniques described herein. The system 800 includes an input data collection module 802 and an input handler module 804. System 800 may be implemented, for example, in the computing device 102 of FIG. 1 or the computing device 202 of FIG. 2. Although the modules 802 and 804 are illustrated in the system 800, it should be noted that one or more additional modules may be included in the system 800. It should also be noted that the functionality of the module 802 and/or the module 804 can be separated into multiple modules.


The input data collection module 802 receives indications of sensed object locations 806. These sensed object location indications 806 are indications of locations of an object (e.g., the user's finger or a pen) that were sensed by an input sensing layer of a display device. Timing information associated with the locations that were sensed by the input sensing layer can also optionally be included as part of the sensed object location indications 806. This timing information indicates when a particular location was sensed, and may take different forms. For example, this timing information may be relative to a fixed timeframe or clock, or may be an amount of time since the previous location was sensed. Alternatively, the timing information may be generated by the input data collection module 802 based on the timing of receipt of the sensed object location indications 806.
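One plausible shape for these sensed object location indications, including the optional timing information and the fallback of generating timing from order of receipt, is sketched below. The record and function names are assumptions for illustration, not taken from the text:

```python
# Sketch: a "sensed object location indication" as a position plus optional
# timing, and a helper that mimics the input data collection module
# generating timing when the sensor does not report timestamps.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class SensedLocation:
    x: float                               # position in sensor coordinates
    y: float
    timestamp_ms: Optional[float] = None   # relative to a fixed clock, if known

def with_generated_timing(locations: List[SensedLocation],
                          start_ms: float = 0.0,
                          interval_ms: float = 10.0) -> List[SensedLocation]:
    """Assign evenly spaced timestamps based on order of receipt.

    The fixed interval is a hypothetical stand-in for the module's own
    timing of when each indication arrived.
    """
    return [SensedLocation(loc.x, loc.y, start_ms + i * interval_ms)
            for i, loc in enumerate(locations)]
```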


The input data collection module 802 uses the sensed object location indications 806 to generate input data 808. The input data 808 describes the location and the movement of the user input. The input data 808 can be the sensed object location indications 806, as well as any associated timing information for the locations as received and/or generated by the module 802.


Additionally, a user input can have an associated lifetime, which refers to a time duration that begins when an object touching (or in close proximity to) the surface is sensed and ends when the object is no longer sensed as touching (or in close proximity to) the surface of the display device. This associated lifetime may be identified by the input data collection module 802 and included as part of the input data 808.


A user input can also have an associated velocity, which refers to a velocity at which the object that is sensed is moving. This velocity is a particular distance divided by a particular amount of time, such as a particular number of inches per second, a particular number of millimeters per millisecond, and so forth. This associated velocity may be identified by the input data collection module 802 and included as part of the input data 808, or used in other manners (e.g., to determine when to provide input data 808 to the input handler module 804, as discussed in more detail below).
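A minimal sketch of how a lifetime and an average velocity could be derived from a sequence of sensed samples, assuming each sample is an `(x, y, timestamp_ms)` tuple (the representation and units are assumptions for illustration, not the described implementation):

```python
import math

def lifetime_ms(samples):
    """Duration from the first sensed sample to the last, in milliseconds."""
    return samples[-1][2] - samples[0][2]

def velocity(samples):
    """Average speed across the trace, in distance units per millisecond."""
    dist = sum(math.hypot(b[0] - a[0], b[1] - a[1])
               for a, b in zip(samples, samples[1:]))
    elapsed = lifetime_ms(samples)
    return dist / elapsed if elapsed else 0.0

trace = [(0.0, 0.0, 0.0), (3.0, 4.0, 10.0), (6.0, 8.0, 20.0)]
print(lifetime_ms(trace))  # 20.0
print(velocity(trace))     # 0.5
```

An instantaneous velocity over the most recent pair of samples could be computed the same way, which is useful for the prediction rules discussed below.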


The input data collection module 802 provides the input data 808 to the input handler module 804, which determines what the user input is. The user input can take various forms, such as a gesture or mouse movement. A gesture refers to a motion or path taken by an object (e.g., the user's finger) to initiate one or more functions of a computing device. For example, a gesture may be sliding of the user's finger in a particular direction, the user's finger tracing a particular character or symbol (e.g., a circle, a letter “Z”, etc.), and so forth. A gesture may also include a multi-touch input in which multiple objects (e.g., multiple of the user's fingers) take particular motions or paths to initiate one or more functions of the computing device. A mouse movement refers to a motion or path taken by an object (e.g., the user's finger) to move something (e.g., a cursor or pointer, an object being dragged and dropped, etc.) on the display device. Although gestures and mouse movements are discussed herein, various other types of user inputs are contemplated.
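To make the gesture notion concrete, here is a toy classifier for left/right swipes based on net horizontal movement. This is an illustrative sketch, not the patent's technique; the threshold and labels are arbitrary assumptions:

```python
# Classify a trace of (x, y) locations as a swipe gesture or a tap,
# based only on the net horizontal displacement of the trace.
def classify(locations, swipe_threshold=50.0):
    dx = locations[-1][0] - locations[0][0]
    if abs(dx) < swipe_threshold:
        return "tap"
    return "swipe-right" if dx > 0 else "swipe-left"

print(classify([(10, 40), (80, 42), (160, 41)]))  # swipe-right
print(classify([(160, 40), (80, 42), (10, 41)]))  # swipe-left
print(classify([(100, 100), (102, 101)]))         # tap
```

A production input handler would also consider timing, curvature, and multi-touch paths, and may rely on any of a variety of public and/or proprietary recognition techniques.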


The input handler module 804 may use any of a variety of public and/or proprietary techniques to determine what the user input is based on the input data 808. For example, the input handler module 804 can determine that the user input is a particular gesture, a particular mouse movement, and so forth. The input handler 804 may also be configured to analyze characteristics of the input (e.g., the size and/or velocity of the input) to configure the display or other output for a customized user experience. For example, a small input made with a small finger can be processed to adjust the font, color, application, and so forth to be suitable for children.


The input handler module 804 may also, based on the determined user input, take various actions. For example, the input handler module 804 may provide an indication of the determined user input to one or more other modules of the computing device to carry out the requested function or movement. By way of another example, the input handler module 804 itself may carry out the requested function or movement.


The input data collection module 802 may provide the input data 808 to the input handler module 804 at various times. For example, the input data collection module 802 may provide the input data 808 to the input handler module 804 as the input data 808 is generated. By way of another example, the input data collection module 802 may provide the input data 808 to the input handler after the user input has finished (e.g., after the lifetime associated with the user input has elapsed and the object is no longer sensed as touching (or in close proximity to) the surface of the display device).


Alternatively, the input data collection module 802 may maintain the input data 808 for a user input but not provide the input data 808 to the input handler module 804 until a particular event occurs. Various different events can cause the module 802 to provide the input data 808 to the module 804. One event that may cause the module 802 to provide the input data 808 to the module 804 is the user input, as indicated by the location of the object, being in the active display area. Thus, in response to the user input being in the active display area, the module 802 provides the input data 808 to the module 804.


Another event that may cause the module 802 to provide the input data 808 to the module 804 is the user input being outside of the active display area but predicted to be in the active display area in the future (e.g., during an associated lifetime of the user input). The user input can be predicted to be in the active display area in the future based on various rules or criteria, such as based on the velocity of the user input and/or the direction of the user input. For example, if the user input is outside of the active display area and the direction of the user input is towards the active display area, then the user input is predicted to be in the active display area in the future. By way of another example, if the user input is outside of the active display area, the direction of the user input is towards the active display area, and the velocity of the user input is greater than a threshold amount, then the user input is predicted to be in the active display area in the future. This threshold amount can be, for example, 4 inches per second, although other threshold amounts are contemplated. Thus, in response to the user input being predicted to be in the active display area in the future, the module 802 provides the input data 808 to the module 804.
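The direction-plus-velocity prediction rule described above can be sketched as follows. This is an illustrative example under simplifying assumptions (positions in inches, active area to the right of a vertical edge, names invented for illustration), not the described implementation:

```python
# Forward input data to the input handler only when a user input outside
# the active display area is headed toward the active area faster than a
# threshold (the 4 inches/second example value from the text).
INCHES_PER_SECOND_THRESHOLD = 4.0

def predicted_to_enter(last, current, active_left_edge_x, elapsed_s):
    """last/current are (x, y) positions in inches; the active display
    area is assumed to lie at x >= active_left_edge_x."""
    dx = current[0] - last[0]
    toward_active = dx > 0 and current[0] < active_left_edge_x
    speed = abs(dx) / elapsed_s if elapsed_s > 0 else 0.0
    return toward_active and speed > INCHES_PER_SECOND_THRESHOLD

# Finger in the extended sensor area moving right toward the active area:
print(predicted_to_enter((-0.3, 2.0), (-0.1, 2.0), 0.0, 0.02))  # True
# Same movement, but too slow to cross the velocity threshold:
print(predicted_to_enter((-0.3, 2.0), (-0.1, 2.0), 0.0, 0.10))  # False
```

The simpler direction-only rule corresponds to dropping the velocity comparison and forwarding whenever `toward_active` holds.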



FIG. 9 illustrates the example display device 500 of FIG. 5 with an example user input. The display device 500 includes an active display area 502 surrounded by an extended sensor area 504, illustrated with cross-hatching, as discussed above. A user input is received via a user's finger 606.


The user input in FIG. 9 is illustrated as a movement from right to left, with the user input beginning in the extended sensor area 504 and moving into the active display area 502. The ending position of the user's finger is illustrated using a dashed outline of a hand. Sensing of the user input begins in the extended sensor area 504, prior to the user's finger 606 moving into the active display area 502. The user input indicated by the movement of the user's finger 606 in FIG. 9 may be identified more quickly than if the extended sensor area 504 were not included in the display device 500. The user input may be identified more quickly because, without the extended sensor area 504, locations of the user's finger 606 would not begin to be sensed until the user's finger 606 reached the edge of the active display area 502.


The user input in FIG. 9 is illustrated as beginning in the extended sensor area 504. However, it should be noted that the user input can begin outside of both the active display area 502 and the extended sensor area 504 (e.g., along an edge of the display device 500). The user input may still be identified more quickly than if the extended sensor area 504 were not included in the display device 500 because the movement will begin to be sensed when the extended sensor area 504 is reached by the user's finger 606 (rather than waiting until the user's finger 606 reaches the active display area 502).



FIG. 10 illustrates the example display device 500 of FIG. 5 with another example user input. The display device 500 includes an active display area 502 surrounded by an extended sensor area 504, illustrated with cross-hatching, as discussed above. A user input is received via a user's finger 606.


The user input in FIG. 10 is illustrated as a movement from left to right, with the user input beginning in the active display area 502 and ending in the extended sensor area 504. The ending position of the user's finger is illustrated using a dashed outline of a hand. Alternatively, the ending position of the movement may be outside of both the active display area 502 and the extended sensor area 504 (e.g., along an edge of the display device 500). Sensing of the user input begins in the active display area, prior to the user's finger 606 moving into the extended sensor area 504. By ending movement of the user's finger 606 in (or having the movement of the user's finger pass through) the extended sensor area 504, the location of the user input in the extended sensor area 504 can be used in identifying the user input. For example, the input handler module 804 of FIG. 8 may determine that the user input is a swipe or gesture from left to right across the display device as opposed to an input that was intended by the user to stop over a particular icon or object displayed near the edge of the display area.



FIG. 11 illustrates the example display device 500 of FIG. 5 with another example user input. The display device 500 includes an active display area 502 surrounded by an extended sensor area 504, illustrated with cross-hatching, as discussed above. A user input is received via a user's finger 606.


The user input in FIG. 11 is illustrated as moving from right to left and from top to bottom in a “&lt;” shape. The user input in FIG. 11 begins and ends in the active display area 502, but passes through the extended sensor area 504. The ending position of the user's finger is illustrated using a dashed outline of a hand. Sensing of the user input in the extended sensor area 504 allows the user input illustrated in FIG. 11 to be input along the edge of the active display area 502. Even though the user input passes outside the edge of the active display area 502, the user input is sensed in the extended sensor area 504.



FIG. 12 illustrates the example display device 500 of FIG. 5 with another example user input. The display device 500 includes an active display area 502 surrounded by an extended sensor area 504, illustrated with cross-hatching, as discussed above. A user input is received via a user's finger 606.


The user input in FIG. 12 is illustrated as a movement from left to right, with the user input beginning and ending in the extended sensor area 504 without moving into the active display area 502. The ending position of the user's finger is illustrated using a dashed outline of a hand. Sensing of the user input begins in the extended sensor area 504. However, as the user's finger 606 is not moved into the active display area 502, and the direction of movement of the user's finger 606 is not towards the active display area 502, the input data for the user input need not be provided to the input handler module 804 of FIG. 8. Thus, as the user input remains in the extended sensor area 504, no action based on a user input need be taken.



FIG. 13 is a flowchart illustrating an example process 1300 for implementing the techniques described herein in accordance with one or more embodiments. Process 1300 is carried out by a computing device, such as computing device 102 of FIG. 1 or computing device 202 of FIG. 2, and can be implemented in software, firmware, hardware, or combinations thereof. Process 1300 is shown as a set of acts and is not limited to the order shown for performing the operations of the various acts. Process 1300 is an example process for implementing the techniques described herein; additional discussions of implementing the techniques described herein are included herein with reference to different figures.


In process 1300, input data is received (act 1302). The input data includes data for at least part of the user input in an active display area of a device and data for at least part of the user input in an area outside of the active display area of the device, as discussed above.


Based on the input data, the user input is determined (act 1304). Any of a variety of public and/or proprietary techniques may be used to determine what the user input is, as discussed above.


The action indicated by the user input is performed (act 1306). This action may be the performance of various functions or movements, as discussed above.


Example System and Device


FIG. 14 illustrates an example system generally at 1400 that includes an example computing device 1402 that is representative of one or more computing systems and/or devices that may implement the various techniques described herein. The computing device 1402 may, for example, be configured to assume a mobile configuration through use of a housing formed and sized to be grasped and carried by one or more hands of a user, illustrated examples of which include a mobile phone, mobile game and music device, and tablet computer, although other examples and configurations are also contemplated.


The example computing device 1402 as illustrated includes a processing system 1404, one or more computer-readable media 1406, and one or more I/O interfaces 1408 that are communicatively coupled, one to another. Although not shown, the computing device 1402 may further include a system bus or other data and command transfer system that couples the various components, one to another. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures. A variety of other examples are also contemplated, such as control and data lines.


The processing system 1404 is representative of functionality to perform one or more operations using hardware. Accordingly, the processing system 1404 is illustrated as including hardware elements 1410 that may be configured as processors, functional blocks, and so forth. This may include implementation in hardware as an application specific integrated circuit or other logic device formed using one or more semiconductors. The hardware elements 1410 are not limited by the materials from which they are formed or the processing mechanisms employed therein. For example, processors may be comprised of semiconductor(s) and/or transistors (e.g., electronic integrated circuits (ICs)). In such a context, processor-executable instructions may be electronically-executable instructions.


The computer-readable storage media 1406 is illustrated as including memory/storage 1412. The memory/storage 1412 represents memory/storage capacity associated with one or more computer-readable media. The memory/storage component 1412 may include volatile media (such as random access memory (RAM)) and/or nonvolatile media (such as read only memory (ROM), Flash memory, optical disks, magnetic disks, and so forth). The memory/storage component 1412 may include fixed media (e.g., RAM, ROM, a fixed hard drive, and so on) as well as removable media (e.g., Flash memory, a removable hard drive, an optical disc, and so forth). The computer-readable media 1406 may be configured in a variety of other ways as further described below.


Input/output interface(s) 1408 are representative of functionality to allow a user to enter commands and information to computing device 1402, and also allow information to be presented to the user and/or other components or devices using various input/output devices. Examples of input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone, a scanner, touch functionality (e.g., capacitive or other sensors that are configured to detect physical touch), a camera (e.g., which may employ visible or non-visible wavelengths such as infrared frequencies to recognize movement as gestures that do not involve touch), and so forth. Examples of output devices include a display device (e.g., a monitor or projector), speakers, a printer, a network card, tactile-response device, and so forth. Thus, the computing device 1402 may be configured in a variety of ways to support user interaction.


The computing device 1402 is further illustrated as including one or more modules 1418 that may be configured to support a variety of functionality. The one or more modules 1418, for instance, may be configured to generate input data based on indications of sensed object locations, to determine what a user input is based on the input data, and so forth. The modules 1418 may include, for example, the input data collection module 802 and/or the input handler module 804 of FIG. 8.


Various techniques may be described herein in the general context of software, hardware elements, or program modules. Generally, such modules include routines, programs, objects, elements, components, data structures, and so forth that perform particular tasks or implement particular abstract data types. The terms “module,” “functionality,” and “component” as used herein generally represent software, firmware, hardware, or a combination thereof. The features of the techniques described herein are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.


An implementation of the described modules and techniques may be stored on or transmitted across some form of computer-readable media. The computer-readable media may include a variety of media that may be accessed by the computing device 1402. By way of example, and not limitation, computer-readable media may include “computer-readable storage media” and “computer-readable signal media.”


“Computer-readable storage media” may refer to media and/or devices that enable persistent and/or non-transitory storage of information in contrast to mere signal transmission, carrier waves, or signals per se. Thus, computer-readable storage media refers to non-signal bearing media. The computer-readable storage media includes hardware such as volatile and nonvolatile, removable and non-removable media and/or storage devices implemented in a method or technology suitable for storage of information such as computer readable instructions, data structures, program modules, logic elements/circuits, or other data. Examples of computer-readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, hard disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other storage device, tangible media, or article of manufacture suitable to store the desired information and which may be accessed by a computer.


“Computer-readable signal media” may refer to a signal-bearing medium that is configured to transmit instructions to the hardware of the computing device 1402, such as via a network. Signal media typically may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as carrier waves, data signals, or other transport mechanism. Signal media also include any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media.


As previously described, hardware elements 1410 and computer-readable media 1406 are representative of modules, programmable device logic and/or fixed device logic implemented in a hardware form that may be employed in some embodiments to implement at least some aspects of the techniques described herein, such as to perform one or more instructions. Hardware may include components of an integrated circuit or on-chip system, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a complex programmable logic device (CPLD), and other implementations in silicon or other hardware. In this context, hardware may operate as a processing device that performs program tasks defined by instructions and/or logic embodied by the hardware, as well as hardware utilized to store instructions for execution, e.g., the computer-readable storage media described previously.


Combinations of the foregoing may also be employed to implement various techniques described herein. Accordingly, software, hardware, or executable modules may be implemented as one or more instructions and/or logic embodied on some form of computer-readable storage media and/or by one or more hardware elements 1410. The computing device 1402 may be configured to implement particular instructions and/or functions corresponding to the software and/or hardware modules. Accordingly, implementation of a module that is executable by the computing device 1402 as software may be achieved at least partially in hardware, e.g., through use of computer-readable storage media and/or hardware elements 1410 of the processing system 1404. The instructions and/or functions may be executable/operable by one or more articles of manufacture (for example, one or more computing devices 1402 and/or processing systems 1404) to implement techniques, modules, and examples described herein.


CONCLUSION

Although the example implementations have been described in language specific to structural features and/or methodological acts, it is to be understood that the implementations defined in the appended claims are not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claimed features.

Claims
  • 1. A method comprising: receiving input data for a user input, the input data having been sensed by one or more sensors, the input data including data indicating first multiple locations touched by an object in an active display area of an interactive display device in which data or information is displayed as well as first timing information indicating when each of the first multiple locations was sensed, the input data further including data indicating second multiple locations touched by the object in an area outside of the active display area of the interactive display device as well as second timing information indicating when each of the second multiple locations was sensed, the area outside of the active display area extending beyond the active display area where there are no display pixels; anddetermining the user input based on the indicated first multiple locations and the indicated first timing information as well as the indicated second multiple locations and the indicated second timing information, the user input comprising a gesture described by a path that includes the first multiple locations and the second multiple locations.
  • 2. A method as recited in claim 1, the user input beginning in the area outside of the active display area and ending in the active display area.
  • 3. A method as recited in claim 1, the user input beginning in the active display area and ending in the area outside of the active display area.
  • 4. A method as recited in claim 1, the user input beginning in the active display area and ending in the active display area, and passing through the area outside of the active display area after beginning the user input and prior to ending the user input.
  • 5. A method as recited in claim 1, the receiving comprising receiving the input data in response to the user input being predicted to be in the active display area in the future.
  • 6. A method as recited in claim 5, the user input being predicted to be in the active display area in the future in response to a direction of the user input being towards the active display area.
  • 7. A method as recited in claim 5, the user input being predicted to be in the active display area in the future in response to both a direction of the user input being towards the active display area and a velocity of the user input being greater than a threshold amount.
  • 8. A method as recited in claim 1, the interactive display device being included in a computing device, and the user input comprising a gesture indicating one or more functions of the computing device to initiate.
  • 9. A method comprising: sensing, in a display device having one or more sensors, input data for a user input, the input data including first multiple locations of an active display area of the display device in which data or information is displayed that are touched by an object as well as timing information indicating when each of the first multiple locations was sensed, the input data further including second multiple locations of an area outside of the active display area that are touched by the object as well as timing information indicating when each of the second multiple locations was sensed, the area outside of the active display area extending beyond the active display area where there is no display layer; andusing the first multiple locations and timing information indicating when each of the first multiple locations was sensed as well as the second multiple locations and timing information indicating when each of the second multiple locations was sensed to determine the user input.
  • 10. A method as recited in claim 9, the user input beginning in the area outside of the active display area and ending in the active display area.
  • 11. A method as recited in claim 9, the user input beginning in the active display area and ending in the area outside of the active display area.
  • 12. A method as recited in claim 9, the user input beginning in the active display area and ending in the active display area, and passing through the area outside of the active display area after beginning the user input and prior to ending the user input.
  • 13. A method as recited in claim 9, the object comprising a finger of a user providing the user input.
  • 14. A method as recited in claim 9, the area outside of the active display area comprising an extended sensor area surrounding and adjacent to the active display area.
  • 15. A computing device comprising a housing and a display device supported by the housing and having an active display area in which data or information is displayed, and an area outside of the active display area extending beyond the active display area where there are no display pixels, the display device having one or more sensors disposed for sensing first multiple locations of the active display area that are touched by an object while inputting a user input to the computing device as well as second multiple locations outside of the active display area that are touched by the object while inputting the user input, the computing device determining the user input based on both the first multiple locations of the active display area that are touched by the object and the second multiple locations outside of the active display area that are touched by the object, as well as based on timing information indicating when each of the first multiple locations was sensed by the one or more sensors and when each of the second multiple locations was sensed by the one or more sensors.
  • 16. A computing device as recited in claim 15, at least one of the one or more sensors being disposed in an extended sensor area surrounding the active display area such that locations of the object are sensed by the computing device along the edge of the active display area before locations of the object are sensed by the computing device in the active display area.
  • 17. A computing device as recited in claim 15, the one or more sensors being included in an input sensor layer of the display device extending beyond an active display layer of the display device.
  • 18. A computing device as recited in claim 15, the area outside of the active display area comprising an extended sensor area surrounding and adjacent to the active display area.
  • 19. A computing device as recited in claim 18, the extended sensor area extending approximately two millimeters outside the active display area.
  • 20. A computing device as recited in claim 15, the user input comprising a gesture indicating one or more functions of the computing device to initiate.
RELATED APPLICATIONS

This application is a continuation of and claims priority to U.S. patent application Ser. No. 13/651,195, filed Oct. 12, 2012, entitled “Sensing User Input At Display Area Edge”, which is a continuation of and claims priority to U.S. patent application Ser. No. 13/471,376, filed May 14, 2012, entitled “Sensing User Input At Display Area Edge”, and which further claims priority under 35 U.S.C. §119(e) to the following U.S. Provisional patent applications, the entire disclosures of each of these applications being incorporated by reference in their entirety: U.S. Provisional Patent Application No. 61/606,321, filed Mar. 2, 2012, and titled “Screen Edge;” U.S. Provisional Patent Application No. 61/606,301, filed Mar. 2, 2012, and titled “Input Device Functionality;” U.S. Provisional Patent Application No. 61/606,313, filed Mar. 2, 2012, and titled “Functional Hinge;” U.S. Provisional Patent Application No. 61/606,333, filed Mar. 2, 2012, and titled “Usage and Authentication;” U.S. Provisional Patent Application No. 61/613,745, filed Mar. 21, 2012, and titled “Usage and Authentication;” U.S. Provisional Patent Application No. 61/606,336, filed Mar. 2, 2012, and titled “Kickstand and Camera;” and U.S. Provisional Patent Application No. 61/607,451, filed Mar. 6, 2012, and titled “Spanaway Provisional.”

6695273 Iguchi Feb 2004 B2
6704864 Philyaw Mar 2004 B1
6721019 Kono et al. Apr 2004 B2
6725318 Sherman et al. Apr 2004 B1
6774888 Genduso Aug 2004 B1
6776546 Kraus et al. Aug 2004 B2
6780019 Ghosh et al. Aug 2004 B1
6781819 Yang et al. Aug 2004 B2
6784869 Clark et al. Aug 2004 B1
6795146 Dozov et al. Sep 2004 B2
6798887 Andre Sep 2004 B1
6813143 Makela Nov 2004 B2
6819316 Schulz et al. Nov 2004 B2
6847488 Travis Jan 2005 B2
6856506 Doherty et al. Feb 2005 B2
6856789 Pattabiraman et al. Feb 2005 B2
6861961 Sandbach et al. Mar 2005 B2
6867828 Taira et al. Mar 2005 B2
6870671 Travis Mar 2005 B2
6895164 Saccomanno May 2005 B2
6898315 Guha May 2005 B2
6909354 Baker et al. Jun 2005 B2
6914197 Doherty et al. Jul 2005 B2
6950950 Sawyers et al. Sep 2005 B2
6962454 Costello Nov 2005 B1
6970957 Oshins et al. Nov 2005 B1
6976799 Kim et al. Dec 2005 B2
6977643 Wilbrink Dec 2005 B2
6980177 Struyk Dec 2005 B2
6981792 Nagakubo et al. Jan 2006 B2
7006080 Gettemy Feb 2006 B2
7007238 Glaser Feb 2006 B2
7025908 Hayashi et al. Apr 2006 B1
7051149 Wang et al. May 2006 B2
7068496 Wong et al. Jun 2006 B2
7073933 Gotoh et al. Jul 2006 B2
7083295 Hanna Aug 2006 B1
7091436 Serban Aug 2006 B2
7095404 Vincent et al. Aug 2006 B2
7099149 Krieger et al. Aug 2006 B2
7101048 Travis Sep 2006 B2
7104679 Shin et al. Sep 2006 B2
7106222 Ward et al. Sep 2006 B2
7116309 Kimura et al. Oct 2006 B1
7123292 Seeger et al. Oct 2006 B1
7129979 Lee Oct 2006 B1
7136282 Rebeske Nov 2006 B1
7151635 Bidnyk et al. Dec 2006 B2
7152985 Benitez et al. Dec 2006 B2
7153017 Yamashita et al. Dec 2006 B2
D535292 Shi et al. Jan 2007 S
7159132 Takahashi et al. Jan 2007 B2
7194662 Do et al. Mar 2007 B2
7199931 Boettiger et al. Apr 2007 B2
7213323 Baker et al. May 2007 B2
7213991 Chapman et al. May 2007 B2
7224830 Nefian et al. May 2007 B2
7252512 Tai et al. Aug 2007 B2
7260221 Atsmon Aug 2007 B1
7260823 Schlack et al. Aug 2007 B2
7277087 Hill et al. Oct 2007 B2
7280348 Ghosh Oct 2007 B2
7301759 Hsiung Nov 2007 B2
7364343 Keuper et al. Apr 2008 B2
7365967 Zheng Apr 2008 B2
7370342 Ismail et al. May 2008 B2
7374312 Feng et al. May 2008 B2
7375885 Ijzerman et al. May 2008 B2
7384178 Sumida et al. Jun 2008 B2
7400377 Evans et al. Jul 2008 B2
7400817 Lee et al. Jul 2008 B2
7410286 Travis Aug 2008 B2
7415676 Fujita Aug 2008 B2
7431489 Yeo et al. Oct 2008 B2
7447922 Asbury et al. Nov 2008 B1
7447934 Dasari et al. Nov 2008 B2
7457108 Ghosh Nov 2008 B2
7469386 Bear et al. Dec 2008 B2
7486165 Ligtenberg et al. Feb 2009 B2
7499037 Lube Mar 2009 B2
7499216 Niv et al. Mar 2009 B2
7502803 Culter et al. Mar 2009 B2
7503684 Ueno et al. Mar 2009 B2
7515143 Keam et al. Apr 2009 B2
7528374 Smitt et al. May 2009 B2
7539882 Jessup et al. May 2009 B2
7542052 Solomon et al. Jun 2009 B2
7545429 Travis Jun 2009 B2
7558594 Wilson Jul 2009 B2
7559834 York Jul 2009 B1
7561131 Ijzerman et al. Jul 2009 B2
7572045 Hoelen et al. Aug 2009 B2
RE40891 Yasutake Sep 2009 E
7594638 Chan et al. Sep 2009 B2
7620244 Collier Nov 2009 B1
7631327 Dempski et al. Dec 2009 B2
7636921 Louie Dec 2009 B2
7639329 Takeda et al. Dec 2009 B2
7639876 Clary et al. Dec 2009 B2
7643213 Boettiger et al. Jan 2010 B2
7656392 Bolender Feb 2010 B2
7660047 Travis et al. Feb 2010 B1
7675598 Hong Mar 2010 B2
7686694 Cole Mar 2010 B2
7728923 Kim et al. Jun 2010 B2
7729493 Krieger et al. Jun 2010 B2
7731147 Rha Jun 2010 B2
7733326 Adiseshan Jun 2010 B1
7761119 Patel Jul 2010 B2
7773076 Pittel et al. Aug 2010 B2
7773121 Huntsberger et al. Aug 2010 B1
7774155 Sato et al. Aug 2010 B2
7777972 Chen et al. Aug 2010 B1
7782341 Kothandaraman Aug 2010 B2
7782342 Koh Aug 2010 B2
7813715 McKillop et al. Oct 2010 B2
7815358 Inditsky Oct 2010 B2
7822338 Wernersson Oct 2010 B2
7844985 Hendricks et al. Nov 2010 B2
7855716 McCreary et al. Dec 2010 B2
7865639 McCoy et al. Jan 2011 B2
7884807 Hovden et al. Feb 2011 B2
7893921 Sato Feb 2011 B2
D636397 Green Apr 2011 S
7918559 Tesar Apr 2011 B2
7927654 Hagood et al. Apr 2011 B2
7928964 Kolmykov-Zotov et al. Apr 2011 B2
7932890 Onikiri et al. Apr 2011 B2
7936501 Smith et al. May 2011 B2
7944520 Ichioka et al. May 2011 B2
7945717 Rivalsi May 2011 B2
7957082 Mi et al. Jun 2011 B2
7965268 Gass et al. Jun 2011 B2
7967462 Ogiro et al. Jun 2011 B2
7970246 Travis et al. Jun 2011 B2
7973771 Geaghan Jul 2011 B2
7976393 Haga et al. Jul 2011 B2
7978281 Vergith et al. Jul 2011 B2
7991257 Coleman Aug 2011 B1
8007158 Woo et al. Aug 2011 B2
8016255 Lin Sep 2011 B2
8018386 Qi et al. Sep 2011 B2
8018579 Krah Sep 2011 B1
8026904 Westerman Sep 2011 B2
8053688 Conzola et al. Nov 2011 B2
8059384 Park et al. Nov 2011 B2
8065624 Morin et al. Nov 2011 B2
8069356 Rathi et al. Nov 2011 B2
RE42992 David Dec 2011 E
8077160 Land et al. Dec 2011 B2
8090885 Callaghan et al. Jan 2012 B2
8098233 Hotelling et al. Jan 2012 B2
8102362 Ricks et al. Jan 2012 B2
8115499 Osoinach et al. Feb 2012 B2
8115718 Chen et al. Feb 2012 B2
8117362 Rodriguez et al. Feb 2012 B2
8118274 McClure et al. Feb 2012 B2
8118681 Mattice et al. Feb 2012 B2
8120166 Koizumi et al. Feb 2012 B2
8130203 Westerman Mar 2012 B2
8149219 Lii et al. Apr 2012 B2
8149272 Evans et al. Apr 2012 B2
8154524 Wilson et al. Apr 2012 B2
8159372 Sherman Apr 2012 B2
8162282 Hu et al. Apr 2012 B2
D659139 Gengler May 2012 S
8169421 Wright et al. May 2012 B2
8179236 Weller et al. May 2012 B2
8184190 Dosluoglu May 2012 B2
8189973 Travis et al. May 2012 B2
8216074 Sakuma Jul 2012 B2
8229509 Paek et al. Jul 2012 B2
8229522 Kim et al. Jul 2012 B2
8231099 Chen Jul 2012 B2
8243432 Duan et al. Aug 2012 B2
8248791 Wang et al. Aug 2012 B2
8251563 Papakonstantinou et al. Aug 2012 B2
8255708 Zhang Aug 2012 B1
8264310 Lauder et al. Sep 2012 B2
8267368 Torii et al. Sep 2012 B2
8269731 Molne Sep 2012 B2
8274784 Franz et al. Sep 2012 B2
8279589 Kim Oct 2012 B2
8310508 Hekstra et al. Nov 2012 B2
8310768 Lin et al. Nov 2012 B2
8322290 Mignano Dec 2012 B1
8325416 Lesage et al. Dec 2012 B2
8346206 Andrus et al. Jan 2013 B1
8354806 Adrian et al. Jan 2013 B2
8362975 Uehara Jan 2013 B2
8373664 Wright Feb 2013 B2
8384566 Bocirnea Feb 2013 B2
8387078 Memmott Feb 2013 B2
8387938 Lin Mar 2013 B2
8403576 Merz Mar 2013 B2
8416559 Agata et al. Apr 2013 B2
8424160 Chen Apr 2013 B2
8464079 Chueh et al. Jun 2013 B2
8466954 Ko et al. Jun 2013 B2
8467133 Miller Jun 2013 B2
8498100 Whitt, III et al. Jul 2013 B1
8513547 Ooi Aug 2013 B2
8514568 Qiao et al. Aug 2013 B2
8520371 Peng et al. Aug 2013 B2
8543227 Perek et al. Sep 2013 B1
8548608 Perek et al. Oct 2013 B2
8564944 Whitt, III et al. Oct 2013 B2
8565560 Popovich et al. Oct 2013 B2
8569640 Yamada et al. Oct 2013 B2
8570725 Whitt, III et al. Oct 2013 B2
8576031 Lauder et al. Nov 2013 B2
8582280 Ryu Nov 2013 B2
8587701 Tatsuzawa Nov 2013 B2
8599542 Healey et al. Dec 2013 B1
8610015 Whitt et al. Dec 2013 B2
8614666 Whitman et al. Dec 2013 B2
8633898 Westerman et al. Jan 2014 B2
8646999 Shaw et al. Feb 2014 B2
8674941 Casparian et al. Mar 2014 B2
8692212 Craft Apr 2014 B1
8699215 Whitt, III et al. Apr 2014 B2
8719603 Belesiu et al. May 2014 B2
8724302 Whitt et al. May 2014 B2
8744070 Zhang et al. Jun 2014 B2
8744391 Tenbrook et al. Jun 2014 B2
8762746 Lachwani et al. Jun 2014 B1
8767388 Ahn et al. Jul 2014 B2
8780540 Whitt, III et al. Jul 2014 B2
8780541 Whitt et al. Jul 2014 B2
8791382 Whitt, III et al. Jul 2014 B2
8797765 Lin et al. Aug 2014 B2
8825187 Hamrick et al. Sep 2014 B1
8830668 Whitt, III et al. Sep 2014 B2
8850241 Oler et al. Sep 2014 B2
8854799 Whitt, III et al. Oct 2014 B2
8873227 Whitt et al. Oct 2014 B2
8891232 Wang Nov 2014 B2
8896993 Belesiu et al. Nov 2014 B2
8903517 Perek et al. Dec 2014 B2
8908858 Chiu et al. Dec 2014 B2
8934221 Guo Jan 2015 B2
8939422 Liu et al. Jan 2015 B2
8947353 Boulanger et al. Feb 2015 B2
8947864 Whitt, III et al. Feb 2015 B2
8949477 Drasnin Feb 2015 B2
8964376 Chen Feb 2015 B2
9001028 Baker Apr 2015 B2
9047207 Belesiu et al. Jun 2015 B2
9064654 Whitt, III et al. Jun 2015 B2
9075566 Whitt, III et al. Jul 2015 B2
9098117 Lutz, III et al. Aug 2015 B2
9111703 Whitt, III et al. Aug 2015 B2
9146620 Whitt et al. Sep 2015 B2
9158383 Shaw et al. Oct 2015 B2
9158384 Whitt, III et al. Oct 2015 B2
9176900 Whitt, III et al. Nov 2015 B2
9176901 Whitt, III et al. Nov 2015 B2
9201185 Large Dec 2015 B2
9256089 Emerton et al. Feb 2016 B2
9268373 Whitt et al. Feb 2016 B2
20010023818 Masaru et al. Sep 2001 A1
20020005108 Ludwig Jan 2002 A1
20020008854 Travis et al. Jan 2002 A1
20020044216 Cha Apr 2002 A1
20020070883 Dosch Jun 2002 A1
20020134828 Sandbach et al. Sep 2002 A1
20020135457 Sandbach et al. Sep 2002 A1
20020163510 Williams et al. Nov 2002 A1
20020195177 Hinkley et al. Dec 2002 A1
20030000821 Takahashi et al. Jan 2003 A1
20030007648 Currell Jan 2003 A1
20030009518 Harrow et al. Jan 2003 A1
20030011576 Sandbach et al. Jan 2003 A1
20030016282 Koizumi Jan 2003 A1
20030028688 Tiphane et al. Feb 2003 A1
20030036365 Kuroda Feb 2003 A1
20030044216 Fang Mar 2003 A1
20030051983 Lahr Mar 2003 A1
20030067450 Thursfield et al. Apr 2003 A1
20030108720 Kashino Jun 2003 A1
20030132916 Kramer Jul 2003 A1
20030137821 Gotoh et al. Jul 2003 A1
20030160712 Levy Aug 2003 A1
20030163611 Nagao Aug 2003 A1
20030165017 Amitai Sep 2003 A1
20030173195 Federspiel Sep 2003 A1
20030197687 Shetter Oct 2003 A1
20030198008 Leapman et al. Oct 2003 A1
20030231243 Shibutani Dec 2003 A1
20040005184 Kim et al. Jan 2004 A1
20040046796 Fujita Mar 2004 A1
20040056843 Lin et al. Mar 2004 A1
20040100457 Mandle May 2004 A1
20040113956 Bellwood et al. Jun 2004 A1
20040160734 Yim Aug 2004 A1
20040169641 Bean et al. Sep 2004 A1
20040212553 Wang et al. Oct 2004 A1
20040212598 Kraus et al. Oct 2004 A1
20040212601 Cake et al. Oct 2004 A1
20040258924 Berger et al. Dec 2004 A1
20040268000 Barker et al. Dec 2004 A1
20050030728 Kawashima et al. Feb 2005 A1
20050047773 Satake et al. Mar 2005 A1
20050052831 Chen Mar 2005 A1
20050055498 Beckert et al. Mar 2005 A1
20050057515 Bathiche Mar 2005 A1
20050059489 Kim Mar 2005 A1
20050062715 Tsuji et al. Mar 2005 A1
20050099400 Lee May 2005 A1
20050100690 Mayer et al. May 2005 A1
20050134717 Misawa Jun 2005 A1
20050146512 Hill et al. Jul 2005 A1
20050236848 Kim et al. Oct 2005 A1
20050264653 Starkweather et al. Dec 2005 A1
20050264988 Nicolosi Dec 2005 A1
20050283731 Saint-Hilaire et al. Dec 2005 A1
20050285703 Wheeler et al. Dec 2005 A1
20060010400 Dehlin et al. Jan 2006 A1
20060012767 Komatsuda et al. Jan 2006 A1
20060028400 Lapstun et al. Feb 2006 A1
20060028476 Sobel Feb 2006 A1
20060028838 Imade Feb 2006 A1
20060049920 Sadler et al. Mar 2006 A1
20060049993 Lin et al. Mar 2006 A1
20060061555 Mullen Mar 2006 A1
20060083004 Cok Apr 2006 A1
20060085658 Allen et al. Apr 2006 A1
20060092139 Sharma May 2006 A1
20060096392 Inkster et al. May 2006 A1
20060102020 Takada et al. May 2006 A1
20060102914 Smits et al. May 2006 A1
20060125799 Hillis et al. Jun 2006 A1
20060132423 Travis Jun 2006 A1
20060146573 Iwauchi et al. Jul 2006 A1
20060154725 Glaser et al. Jul 2006 A1
20060155391 Pistemaa et al. Jul 2006 A1
20060156415 Rubinstein et al. Jul 2006 A1
20060174143 Sawyers et al. Aug 2006 A1
20060176377 Miyasaka Aug 2006 A1
20060181514 Newman Aug 2006 A1
20060187216 Trent, Jr. et al. Aug 2006 A1
20060192763 Ziemkowski Aug 2006 A1
20060195522 Miyazaki Aug 2006 A1
20060215244 Yosha et al. Sep 2006 A1
20060220465 Kingsmore et al. Oct 2006 A1
20060227393 Herloski Oct 2006 A1
20060238510 Panotopoulos et al. Oct 2006 A1
20060238550 Page Oct 2006 A1
20060250381 Geaghan Nov 2006 A1
20060262185 Cha et al. Nov 2006 A1
20060265617 Priborsky Nov 2006 A1
20060267931 Vainio et al. Nov 2006 A1
20060272429 Ganapathi et al. Dec 2006 A1
20060279501 Lu et al. Dec 2006 A1
20070003267 Shibutani Jan 2007 A1
20070019181 Sinclair et al. Jan 2007 A1
20070046625 Yee Mar 2007 A1
20070047221 Park Mar 2007 A1
20070056385 Lorenz Mar 2007 A1
20070062089 Homer et al. Mar 2007 A1
20070069153 Pai-Paranjape et al. Mar 2007 A1
20070072474 Beasley et al. Mar 2007 A1
20070076434 Uehara et al. Apr 2007 A1
20070080813 Melvin Apr 2007 A1
20070081091 Pan et al. Apr 2007 A1
20070091638 Ijzerman et al. Apr 2007 A1
20070117600 Robertson et al. May 2007 A1
20070121956 Bai et al. May 2007 A1
20070122027 Kunita et al. May 2007 A1
20070127205 Kuo Jun 2007 A1
20070145945 McGinley et al. Jun 2007 A1
20070172229 Wernersson Jul 2007 A1
20070176902 Newman et al. Aug 2007 A1
20070178891 Louch et al. Aug 2007 A1
20070182663 Biech Aug 2007 A1
20070182722 Hotelling et al. Aug 2007 A1
20070185590 Reindel et al. Aug 2007 A1
20070188478 Silverstein et al. Aug 2007 A1
20070200830 Yamamoto Aug 2007 A1
20070201246 Yeo et al. Aug 2007 A1
20070201859 Sarrat Aug 2007 A1
20070217224 Kao et al. Sep 2007 A1
20070220708 Lewis Sep 2007 A1
20070222766 Bolender Sep 2007 A1
20070230227 Palmer Oct 2007 A1
20070234420 Novotney et al. Oct 2007 A1
20070236408 Yamaguchi et al. Oct 2007 A1
20070236475 Wherry Oct 2007 A1
20070236873 Yukawa et al. Oct 2007 A1
20070247338 Marchetto Oct 2007 A1
20070247432 Oakley Oct 2007 A1
20070252674 Nelson et al. Nov 2007 A1
20070260892 Paul et al. Nov 2007 A1
20070274094 Schultz et al. Nov 2007 A1
20070274095 Destain Nov 2007 A1
20070274099 Tai et al. Nov 2007 A1
20070283179 Burnett et al. Dec 2007 A1
20070296709 Guanghai Dec 2007 A1
20070297625 Hjort et al. Dec 2007 A1
20080005423 Jacobs et al. Jan 2008 A1
20080013809 Zhu et al. Jan 2008 A1
20080019150 Park et al. Jan 2008 A1
20080037284 Rudisill Feb 2008 A1
20080053222 Ehrensvard et al. Mar 2008 A1
20080059888 Dunko Mar 2008 A1
20080068451 Hyatt Mar 2008 A1
20080074398 Wright Mar 2008 A1
20080104437 Lee May 2008 A1
20080106592 Mikami May 2008 A1
20080122803 Izadi et al. May 2008 A1
20080129520 Lee Jun 2008 A1
20080150913 Bell et al. Jun 2008 A1
20080151478 Chern Jun 2008 A1
20080158185 Westerman Jul 2008 A1
20080167832 Soss Jul 2008 A1
20080174570 Jobs et al. Jul 2008 A1
20080180411 Solomon et al. Jul 2008 A1
20080186660 Yang Aug 2008 A1
20080211787 Nakao et al. Sep 2008 A1
20080219025 Spitzer et al. Sep 2008 A1
20080225205 Travis Sep 2008 A1
20080228969 Cheah et al. Sep 2008 A1
20080238884 Harish Oct 2008 A1
20080253822 Matias Oct 2008 A1
20080297878 Brown et al. Dec 2008 A1
20080307242 Qu Dec 2008 A1
20080309636 Feng et al. Dec 2008 A1
20080316002 Brunet et al. Dec 2008 A1
20080316183 Westerman et al. Dec 2008 A1
20080316768 Travis Dec 2008 A1
20080320190 Lydon et al. Dec 2008 A1
20090007001 Morin et al. Jan 2009 A1
20090009476 Daley, III Jan 2009 A1
20090033623 Lin Feb 2009 A1
20090040426 Mather et al. Feb 2009 A1
20090073060 Shimasaki et al. Mar 2009 A1
20090073957 Newland et al. Mar 2009 A1
20090079639 Hotta et al. Mar 2009 A1
20090083562 Park et al. Mar 2009 A1
20090083710 Best et al. Mar 2009 A1
20090089600 Nousiainen Apr 2009 A1
20090096738 Chen et al. Apr 2009 A1
20090096756 Lube Apr 2009 A1
20090102805 Meijer et al. Apr 2009 A1
20090127005 Zachut et al. May 2009 A1
20090131134 Baerlocher et al. May 2009 A1
20090135318 Tateuchi et al. May 2009 A1
20090140985 Liu Jun 2009 A1
20090146975 Chang Jun 2009 A1
20090146992 Fukunaga et al. Jun 2009 A1
20090152748 Wang et al. Jun 2009 A1
20090158221 Nielsen et al. Jun 2009 A1
20090161385 Parker et al. Jun 2009 A1
20090163147 Steigerwald et al. Jun 2009 A1
20090167728 Geaghan et al. Jul 2009 A1
20090167930 Safaee-Rad et al. Jul 2009 A1
20090174687 Ciesla et al. Jul 2009 A1
20090174759 Yeh et al. Jul 2009 A1
20090177906 Paniagua, Jr. et al. Jul 2009 A1
20090189873 Peterson Jul 2009 A1
20090189974 Deering Jul 2009 A1
20090195497 Fitzgerald et al. Aug 2009 A1
20090195518 Mattice et al. Aug 2009 A1
20090207144 Bridger Aug 2009 A1
20090231275 Odgers Sep 2009 A1
20090239586 Boeve et al. Sep 2009 A1
20090244009 Staats et al. Oct 2009 A1
20090244832 Behar et al. Oct 2009 A1
20090244872 Yan Oct 2009 A1
20090251008 Sugaya Oct 2009 A1
20090251623 Koyama Oct 2009 A1
20090259865 Sheynblat et al. Oct 2009 A1
20090262492 Whitchurch et al. Oct 2009 A1
20090265670 Kim et al. Oct 2009 A1
20090269943 Palli et al. Oct 2009 A1
20090276734 Taylor et al. Nov 2009 A1
20090285491 Ravenscroft et al. Nov 2009 A1
20090296331 Choy Dec 2009 A1
20090303137 Kusaka et al. Dec 2009 A1
20090303204 Nasiri et al. Dec 2009 A1
20090315830 Westerman Dec 2009 A1
20090316072 Okumura et al. Dec 2009 A1
20090320244 Lin Dec 2009 A1
20090321490 Groene et al. Dec 2009 A1
20100001963 Doray et al. Jan 2010 A1
20100006412 Wang et al. Jan 2010 A1
20100013319 Kamiyama et al. Jan 2010 A1
20100013738 Covannon et al. Jan 2010 A1
20100023869 Saint-Hilaire et al. Jan 2010 A1
20100026656 Hotelling et al. Feb 2010 A1
20100038821 Jenkins et al. Feb 2010 A1
20100039081 Sip Feb 2010 A1
20100039764 Locker et al. Feb 2010 A1
20100045609 Do et al. Feb 2010 A1
20100045633 Gettemy Feb 2010 A1
20100051356 Stern et al. Mar 2010 A1
20100051432 Lin et al. Mar 2010 A1
20100052880 Laitinen et al. Mar 2010 A1
20100053534 Hsieh et al. Mar 2010 A1
20100053771 Travis et al. Mar 2010 A1
20100054435 Louch et al. Mar 2010 A1
20100056130 Louch et al. Mar 2010 A1
20100072351 Mahowald Mar 2010 A1
20100073329 Raman et al. Mar 2010 A1
20100077237 Sawyers Mar 2010 A1
20100079379 Demuynck et al. Apr 2010 A1
20100079861 Powell Apr 2010 A1
20100081377 Chatterjee et al. Apr 2010 A1
20100083108 Rider et al. Apr 2010 A1
20100085321 Pundsack Apr 2010 A1
20100100752 Chueh et al. Apr 2010 A1
20100102182 Lin Apr 2010 A1
20100102206 Cazaux et al. Apr 2010 A1
20100103112 Yoo et al. Apr 2010 A1
20100103611 Yang et al. Apr 2010 A1
20100105443 Vaisanen Apr 2010 A1
20100106983 Kasprzak et al. Apr 2010 A1
20100115309 Carvalho et al. May 2010 A1
20100117993 Kent May 2010 A1
20100123686 Klinghult et al. May 2010 A1
20100128427 Iso May 2010 A1
20100133398 Chiu et al. Jun 2010 A1
20100135036 Matsuba et al. Jun 2010 A1
20100142130 Wang et al. Jun 2010 A1
20100146317 Challener et al. Jun 2010 A1
20100148995 Elias Jun 2010 A1
20100148999 Casparian et al. Jun 2010 A1
20100149073 Chaum et al. Jun 2010 A1
20100149104 Sim et al. Jun 2010 A1
20100149111 Olien Jun 2010 A1
20100149117 Chien et al. Jun 2010 A1
20100149134 Westerman et al. Jun 2010 A1
20100149377 Shintani et al. Jun 2010 A1
20100156798 Archer Jun 2010 A1
20100156913 Ortega et al. Jun 2010 A1
20100161522 Tirpak et al. Jun 2010 A1
20100164857 Liu et al. Jul 2010 A1
20100164897 Morin et al. Jul 2010 A1
20100171891 Kaji et al. Jul 2010 A1
20100174421 Tsai et al. Jul 2010 A1
20100177388 Cohen et al. Jul 2010 A1
20100180063 Ananny et al. Jul 2010 A1
20100185877 Chueh et al. Jul 2010 A1
20100188299 Rinehart et al. Jul 2010 A1
20100188338 Longe Jul 2010 A1
20100201308 Lindholm Aug 2010 A1
20100205472 Tupman et al. Aug 2010 A1
20100206614 Park et al. Aug 2010 A1
20100206644 Yeh Aug 2010 A1
20100207774 Song Aug 2010 A1
20100214214 Corson et al. Aug 2010 A1
20100214257 Wussler et al. Aug 2010 A1
20100220205 Lee et al. Sep 2010 A1
20100222110 Kim et al. Sep 2010 A1
20100231498 Large et al. Sep 2010 A1
20100231510 Sampsell et al. Sep 2010 A1
20100231522 Li Sep 2010 A1
20100231556 Mines et al. Sep 2010 A1
20100235546 Terlizzi et al. Sep 2010 A1
20100238075 Pourseyed Sep 2010 A1
20100238138 Goertz et al. Sep 2010 A1
20100238620 Fish Sep 2010 A1
20100245221 Khan Sep 2010 A1
20100245289 Svajda Sep 2010 A1
20100250975 Gill et al. Sep 2010 A1
20100250988 Okuda et al. Sep 2010 A1
20100259482 Ball Oct 2010 A1
20100259876 Kim Oct 2010 A1
20100265182 Ball et al. Oct 2010 A1
20100271771 Wu et al. Oct 2010 A1
20100274932 Kose Oct 2010 A1
20100279768 Huang et al. Nov 2010 A1
20100282953 Tam Nov 2010 A1
20100289457 Onnerud et al. Nov 2010 A1
20100295812 Burns et al. Nov 2010 A1
20100296163 Saarikko Nov 2010 A1
20100299642 Merrell et al. Nov 2010 A1
20100302378 Marks et al. Dec 2010 A1
20100304793 Kim Dec 2010 A1
20100306538 Thomas et al. Dec 2010 A1
20100308778 Yamazaki et al. Dec 2010 A1
20100308844 Day et al. Dec 2010 A1
20100309617 Wang et al. Dec 2010 A1
20100313680 Joung et al. Dec 2010 A1
20100315345 Laitinen Dec 2010 A1
20100315348 Jellicoe et al. Dec 2010 A1
20100315373 Steinhauser et al. Dec 2010 A1
20100321339 Kimmel Dec 2010 A1
20100321482 Cleveland Dec 2010 A1
20100321877 Moser Dec 2010 A1
20100322479 Cleveland Dec 2010 A1
20100324457 Bean et al. Dec 2010 A1
20100325155 Skinner et al. Dec 2010 A1
20100331059 Apgar et al. Dec 2010 A1
20110007047 Fujioka et al. Jan 2011 A1
20110012873 Prest et al. Jan 2011 A1
20110018799 Lin Jan 2011 A1
20110019123 Prest et al. Jan 2011 A1
20110031287 Le Gette et al. Feb 2011 A1
20110032127 Roush Feb 2011 A1
20110032215 Sirotich et al. Feb 2011 A1
20110036965 Zhang et al. Feb 2011 A1
20110037721 Cranfill et al. Feb 2011 A1
20110043990 Mickey et al. Feb 2011 A1
20110044582 Travis et al. Feb 2011 A1
20110050576 Forutanpour et al. Mar 2011 A1
20110050626 Porter et al. Mar 2011 A1
20110055407 Lydon et al. Mar 2011 A1
20110057724 Pabon Mar 2011 A1
20110060926 Brooks et al. Mar 2011 A1
20110069148 Jones et al. Mar 2011 A1
20110072391 Hanggie et al. Mar 2011 A1
20110074688 Hull et al. Mar 2011 A1
20110075440 Wang Mar 2011 A1
20110081946 Singh et al. Apr 2011 A1
20110096035 Shen Apr 2011 A1
20110102326 Casparian et al. May 2011 A1
20110102356 Kemppinen et al. May 2011 A1
20110102752 Chen et al. May 2011 A1
20110107958 Pance et al. May 2011 A1
20110108401 Yamada et al. May 2011 A1
20110113368 Carvajal et al. May 2011 A1
20110115738 Suzuki et al. May 2011 A1
20110115747 Powell et al. May 2011 A1
20110117970 Choi May 2011 A1
20110118025 Lukas et al. May 2011 A1
20110122071 Powell May 2011 A1
20110134032 Chiu et al. Jun 2011 A1
20110134043 Chen Jun 2011 A1
20110134112 Koh et al. Jun 2011 A1
20110157037 Shamir et al. Jun 2011 A1
20110157046 Lee et al. Jun 2011 A1
20110157087 Kanehira et al. Jun 2011 A1
20110157101 Chang Jun 2011 A1
20110163955 Nasiri et al. Jul 2011 A1
20110164370 McClure et al. Jul 2011 A1
20110167181 Minoo et al. Jul 2011 A1
20110167287 Walsh et al. Jul 2011 A1
20110167391 Momeyer et al. Jul 2011 A1
20110167992 Eventoff et al. Jul 2011 A1
20110169762 Weiss Jul 2011 A1
20110169778 Nungester et al. Jul 2011 A1
20110170289 Allen et al. Jul 2011 A1
20110176035 Poulsen Jul 2011 A1
20110179864 Raasch et al. Jul 2011 A1
20110184646 Wong et al. Jul 2011 A1
20110184824 George et al. Jul 2011 A1
20110188199 Pan Aug 2011 A1
20110191480 Kobayashi Aug 2011 A1
20110193787 Morishige et al. Aug 2011 A1
20110193938 Oderwald et al. Aug 2011 A1
20110199389 Lu et al. Aug 2011 A1
20110202878 Park et al. Aug 2011 A1
20110205372 Miramontes Aug 2011 A1
20110216266 Travis Sep 2011 A1
20110221659 King et al. Sep 2011 A1
20110221678 Davydov Sep 2011 A1
20110227913 Hyndman Sep 2011 A1
20110231682 Kakish et al. Sep 2011 A1
20110234494 Peterson et al. Sep 2011 A1
20110234502 Yun et al. Sep 2011 A1
20110235179 Simmonds Sep 2011 A1
20110241999 Thier Oct 2011 A1
20110242063 Li et al. Oct 2011 A1
20110242138 Tribble Oct 2011 A1
20110242298 Bathiche et al. Oct 2011 A1
20110242440 Noma et al. Oct 2011 A1
20110242670 Simmonds Oct 2011 A1
20110248152 Svajda et al. Oct 2011 A1
20110248920 Larsen Oct 2011 A1
20110248941 Abdo et al. Oct 2011 A1
20110261001 Liu Oct 2011 A1
20110261083 Wilson Oct 2011 A1
20110262001 Bi et al. Oct 2011 A1
20110265287 Li et al. Nov 2011 A1
20110266672 Sylvester Nov 2011 A1
20110273475 Herz et al. Nov 2011 A1
20110285555 Bocirnea Nov 2011 A1
20110290686 Huang Dec 2011 A1
20110295697 Boston et al. Dec 2011 A1
20110297566 Gallagher et al. Dec 2011 A1
20110298919 Maglaque Dec 2011 A1
20110302518 Zhang Dec 2011 A1
20110304577 Brown Dec 2011 A1
20110304815 Newell Dec 2011 A1
20110305875 Sanford et al. Dec 2011 A1
20110306424 Kazama et al. Dec 2011 A1
20110314425 Chiang Dec 2011 A1
20110316807 Corrion Dec 2011 A1
20110320204 Locker et al. Dec 2011 A1
20120002820 Leichter Jan 2012 A1
20120007821 Zaliva Jan 2012 A1
20120011462 Westerman et al. Jan 2012 A1
20120013519 Hakansson et al. Jan 2012 A1
20120019165 Igaki et al. Jan 2012 A1
20120020112 Fisher et al. Jan 2012 A1
20120020490 Leichter Jan 2012 A1
20120023401 Arscott et al. Jan 2012 A1
20120023459 Westerman Jan 2012 A1
20120024682 Huang et al. Feb 2012 A1
20120026048 Vazquez et al. Feb 2012 A1
20120026096 Ku Feb 2012 A1
20120026110 Yamano Feb 2012 A1
20120032887 Chiu et al. Feb 2012 A1
20120032891 Parivar Feb 2012 A1
20120032901 Kwon Feb 2012 A1
20120032917 Yamaguchi Feb 2012 A1
20120038495 Ishikawa Feb 2012 A1
20120044140 Koyama et al. Feb 2012 A1
20120044179 Hudson Feb 2012 A1
20120047368 Chinn et al. Feb 2012 A1
20120050975 Garelli et al. Mar 2012 A1
20120062564 Miyashita Mar 2012 A1
20120062850 Travis Mar 2012 A1
20120068919 Lauder et al. Mar 2012 A1
20120069540 Lauder et al. Mar 2012 A1
20120075249 Hoch Mar 2012 A1
20120077384 Bar-Niv et al. Mar 2012 A1
20120081316 Sirpal et al. Apr 2012 A1
20120092279 Martin Apr 2012 A1
20120094257 Pillischer et al. Apr 2012 A1
20120099749 Rubin et al. Apr 2012 A1
20120103778 Obata et al. May 2012 A1
20120105321 Wang et al. May 2012 A1
20120113137 Nomoto May 2012 A1
20120113579 Agata et al. May 2012 A1
20120115553 Mahe et al. May 2012 A1
20120117409 Lee et al. May 2012 A1
20120127118 Nolting et al. May 2012 A1
20120127126 Mattice et al. May 2012 A1
20120127573 Robinson et al. May 2012 A1
20120139727 Houvener et al. Jun 2012 A1
20120140396 Zeliff et al. Jun 2012 A1
20120145525 Ishikawa Jun 2012 A1
20120156875 Srinivas et al. Jun 2012 A1
20120162693 Ito Jun 2012 A1
20120175487 Goto Jul 2012 A1
20120182242 Lindahl et al. Jul 2012 A1
20120182249 Endo et al. Jul 2012 A1
20120182743 Chou Jul 2012 A1
20120188791 Voloschenko et al. Jul 2012 A1
20120194393 Utterman et al. Aug 2012 A1
20120194448 Rothkopf Aug 2012 A1
20120195063 Kim et al. Aug 2012 A1
20120200802 Large Aug 2012 A1
20120206937 Travis et al. Aug 2012 A1
20120212438 Vaisanen Aug 2012 A1
20120218194 Silverman Aug 2012 A1
20120221877 Prabu Aug 2012 A1
20120223866 Ayala et al. Sep 2012 A1
20120224073 Miyahara Sep 2012 A1
20120227259 Badaye et al. Sep 2012 A1
20120229634 Laett et al. Sep 2012 A1
20120235635 Sato Sep 2012 A1
20120242584 Tuli Sep 2012 A1
20120243165 Chang et al. Sep 2012 A1
20120246377 Bhesania Sep 2012 A1
20120249443 Anderson et al. Oct 2012 A1
20120250873 Bakalos et al. Oct 2012 A1
20120256829 Dodge Oct 2012 A1
20120256959 Ye et al. Oct 2012 A1
20120260177 Sehrer Oct 2012 A1
20120274811 Bakin Nov 2012 A1
20120298491 Ozias et al. Nov 2012 A1
20120299872 Nishikawa et al. Nov 2012 A1
20120300275 Vilardell et al. Nov 2012 A1
20120312955 Randolph Dec 2012 A1
20130009413 Chiu et al. Jan 2013 A1
20130015311 Kim Jan 2013 A1
20130016468 Oh Jan 2013 A1
20130021289 Chen et al. Jan 2013 A1
20130027356 Nishida Jan 2013 A1
20130027867 Lauder et al. Jan 2013 A1
20130031353 Noro Jan 2013 A1
20130038541 Bakker Feb 2013 A1
20130044059 Fu Feb 2013 A1
20130044074 Park et al. Feb 2013 A1
20130046397 Fadell et al. Feb 2013 A1
20130063873 Wodrich et al. Mar 2013 A1
20130067126 Casparian et al. Mar 2013 A1
20130067259 Freiwald et al. Mar 2013 A1
20130073877 Radke Mar 2013 A1
20130076617 Csaszar et al. Mar 2013 A1
20130076635 Lin Mar 2013 A1
20130082824 Colley Apr 2013 A1
20130088431 Ballagas et al. Apr 2013 A1
20130100030 Los et al. Apr 2013 A1
20130100082 Bakin et al. Apr 2013 A1
20130106766 Yilmaz et al. May 2013 A1
20130107144 Marhefka et al. May 2013 A1
20130120466 Chen et al. May 2013 A1
20130127980 Haddik et al. May 2013 A1
20130135214 Li et al. May 2013 A1
20130154959 Lindsay Jun 2013 A1
20130155723 Coleman Jun 2013 A1
20130159749 Moeglein et al. Jun 2013 A1
20130162554 Lauder et al. Jun 2013 A1
20130172906 Olson et al. Jul 2013 A1
20130182246 Tanase Jul 2013 A1
20130191741 Dickinson et al. Jul 2013 A1
20130201094 Travis Aug 2013 A1
20130207937 Lutian et al. Aug 2013 A1
20130212483 Brakensiek et al. Aug 2013 A1
20130215035 Guard Aug 2013 A1
20130217451 Komiyama et al. Aug 2013 A1
20130222272 Martin, Jr. Aug 2013 A1
20130222274 Mori Aug 2013 A1
20130222275 Byrd et al. Aug 2013 A1
20130222323 McKenzie Aug 2013 A1
20130222353 Large Aug 2013 A1
20130226794 Englebardt Aug 2013 A1
20130227836 Whitt, III Sep 2013 A1
20130228023 Drasnin Sep 2013 A1
20130228433 Shaw Sep 2013 A1
20130228434 Whitt, III Sep 2013 A1
20130228435 Whitt, III Sep 2013 A1
20130228439 Whitt, III Sep 2013 A1
20130229100 Siddiqui Sep 2013 A1
20130229335 Whitman Sep 2013 A1
20130229347 Lutz, III Sep 2013 A1
20130229350 Shaw Sep 2013 A1
20130229351 Whitt, III Sep 2013 A1
20130229354 Whitt, III et al. Sep 2013 A1
20130229356 Marwah Sep 2013 A1
20130229357 Powell Sep 2013 A1
20130229363 Whitman Sep 2013 A1
20130229366 Dighde Sep 2013 A1
20130229380 Lutz, III Sep 2013 A1
20130229386 Bathiche Sep 2013 A1
20130229534 Panay Sep 2013 A1
20130229568 Belesiu Sep 2013 A1
20130229570 Beck et al. Sep 2013 A1
20130229756 Whitt, III Sep 2013 A1
20130229757 Whitt, III et al. Sep 2013 A1
20130229758 Belesiu Sep 2013 A1
20130229759 Whitt, III Sep 2013 A1
20130229760 Whitt, III Sep 2013 A1
20130229761 Shaw Sep 2013 A1
20130229762 Whitt, III Sep 2013 A1
20130229773 Siddiqui Sep 2013 A1
20130230346 Shaw Sep 2013 A1
20130231755 Perek Sep 2013 A1
20130232280 Perek Sep 2013 A1
20130232348 Oler Sep 2013 A1
20130232349 Oler Sep 2013 A1
20130232350 Belesiu et al. Sep 2013 A1
20130232353 Belesiu Sep 2013 A1
20130232571 Belesiu Sep 2013 A1
20130232742 Burnett et al. Sep 2013 A1
20130241860 Ciesla et al. Sep 2013 A1
20130242495 Bathiche et al. Sep 2013 A1
20130262886 Nishimura Oct 2013 A1
20130268897 Li et al. Oct 2013 A1
20130285922 Alberth, Jr. et al. Oct 2013 A1
20130300590 Dietz Nov 2013 A1
20130300647 Drasnin Nov 2013 A1
20130301199 Whitt Nov 2013 A1
20130301206 Whitt Nov 2013 A1
20130304941 Drasnin Nov 2013 A1
20130308339 Woodgate et al. Nov 2013 A1
20130321992 Liu et al. Dec 2013 A1
20130322000 Whitt Dec 2013 A1
20130322001 Whitt Dec 2013 A1
20130329360 Aldana Dec 2013 A1
20130339757 Reddy Dec 2013 A1
20130342976 Chung Dec 2013 A1
20140012401 Perek Jan 2014 A1
20140048399 Whitt, III Feb 2014 A1
20140063198 Boulanger Mar 2014 A1
20140085814 Kielland Mar 2014 A1
20140119802 Shaw May 2014 A1
20140167585 Kuan et al. Jun 2014 A1
20140185215 Whitt Jul 2014 A1
20140185220 Whitt Jul 2014 A1
20140204514 Whitt Jul 2014 A1
20140204515 Whitt Jul 2014 A1
20140247546 Whitt Sep 2014 A1
20140291134 Whitt Oct 2014 A1
20140293534 Siddiqui Oct 2014 A1
20140362506 Whitt, III et al. Dec 2014 A1
20140372914 Byrd et al. Dec 2014 A1
20140379942 Perek et al. Dec 2014 A1
20150005953 Fadell et al. Jan 2015 A1
20150036274 Belesiu et al. Feb 2015 A1
20150261262 Whitt, III et al. Sep 2015 A1
20150311014 Shaw et al. Oct 2015 A1
20150378392 Siddiqui et al. Dec 2015 A1
Foreign Referenced Citations (138)
Number Date Country
990023 Jun 1976 CA
1352767 Jun 2002 CN
1440513 Sep 2003 CN
1515937 Jul 2004 CN
1537223 Oct 2004 CN
1650202 Aug 2005 CN
1653411 Aug 2005 CN
1700072 Nov 2005 CN
1787605 Jun 2006 CN
1808362 Jul 2006 CN
1920642 Feb 2007 CN
101038401 Sep 2007 CN
101198925 Jun 2008 CN
101366001 Feb 2009 CN
101410781 Apr 2009 CN
101452334 Jun 2009 CN
101464750 Jun 2009 CN
101473167 Jul 2009 CN
101490642 Jul 2009 CN
101500388 Aug 2009 CN
101512403 Aug 2009 CN
101644979 Feb 2010 CN
101675406 Mar 2010 CN
101681189 Mar 2010 CN
101688991 Mar 2010 CN
101889225 Nov 2010 CN
101893785 Nov 2010 CN
101908428 Dec 2010 CN
102004559 Apr 2011 CN
102012763 Apr 2011 CN
102096494 Jun 2011 CN
102112947 Jun 2011 CN
201853163 Jun 2011 CN
102117121 Jul 2011 CN
102124532 Jul 2011 CN
102138113 Jul 2011 CN
102147643 Aug 2011 CN
102214040 Oct 2011 CN
102292687 Dec 2011 CN
102356624 Feb 2012 CN
103455149 Dec 2013 CN
10116556 Oct 2002 DE
645726 Mar 1995 EP
1003188 May 2000 EP
1223722 Jul 2002 EP
1480029 Nov 2004 EP
1591891 Nov 2005 EP
1983411 Oct 2008 EP
2006869 Dec 2008 EP
2026178 Feb 2009 EP
2353978 Aug 2011 EP
2381290 Oct 2011 EP
2410408 Jan 2012 EP
2068643 Aug 1981 GB
2178570 Feb 1987 GB
2305780 Apr 1997 GB
2381584 May 2003 GB
2402460 Dec 2004 GB
2410116 Jul 2005 GB
2428101 Jan 2007 GB
2482932 Feb 2012 GB
52107722 Sep 1977 JP
H07218865 Aug 1995 JP
H0980354 Mar 1997 JP
H09178949 Jul 1997 JP
H10234057 Sep 1998 JP
10301055 Nov 1998 JP
10326124 Dec 1998 JP
1173239 Mar 1999 JP
11338575 Dec 1999 JP
2000010654 Jan 2000 JP
2000106021 Apr 2000 JP
2001142564 May 2001 JP
2001174746 Jun 2001 JP
2002100226 Apr 2002 JP
2002162912 Jun 2002 JP
2002170458 Jun 2002 JP
2003215349 Jul 2003 JP
2004038950 Feb 2004 JP
2004171948 Jun 2004 JP
2005077437 Mar 2005 JP
2005156932 May 2005 JP
2005331565 Dec 2005 JP
2006004877 Jan 2006 JP
2006163459 Jun 2006 JP
2006278251 Oct 2006 JP
2006294361 Oct 2006 JP
2006310269 Nov 2006 JP
2007184286 Jul 2007 JP
2007273288 Oct 2007 JP
2008066152 Mar 2008 JP
2008286874 Jul 2008 JP
2008529251 Jul 2008 JP
2009059583 Mar 2009 JP
2009122551 Jun 2009 JP
2010151951 Jul 2010 JP
2010244514 Oct 2010 JP
2003077368 Mar 2014 JP
20010039013 May 2001 KR
20010107055 Dec 2001 KR
20050014299 Feb 2005 KR
20060003093 Jan 2006 KR
20080006404 Jan 2008 KR
20080009490 Jan 2008 KR
20080055051 Jun 2008 KR
20090029411 Mar 2009 KR
20100022059 Feb 2010 KR
20100067366 Jun 2010 KR
20100115675 Oct 2010 KR
20110064265 Jun 2011 KR
1020110087178 Aug 2011 KR
20110109791 Oct 2011 KR
20110120002 Nov 2011 KR
20110122333 Nov 2011 KR
101113530 Feb 2012 KR
1038411 May 2012 NL
WO-9919995 Apr 1999 WO
WO-9964784 Dec 1999 WO
WO-0079327 Dec 2000 WO
WO-0128309 Apr 2001 WO
WO-0172037 Sep 2001 WO
WO-03048635 Jun 2003 WO
WO-03083530 Sep 2003 WO
WO-2005059874 Jun 2005 WO
WO-2006044818 Apr 2006 WO
WO-2006082444 Aug 2006 WO
WO-2007094304 Aug 2007 WO
WO-2007103631 Sep 2007 WO
WO-2007112172 Oct 2007 WO
WO-2007123202 Nov 2007 WO
WO-2008013146 Jan 2008 WO
WO-2008038016 Apr 2008 WO
WO-2009034484 Mar 2009 WO
WO-2010011983 Jan 2010 WO
WO-2010074116 Jul 2010 WO
WO-2011049609 Apr 2011 WO
WO-2013033274 Mar 2013 WO
WO-2013163347 Oct 2013 WO
Non-Patent Literature Citations (498)
Entry
Williams, Jim “A Fourth Generation of LCD Backlight Technology”, Retrieved from <http://cds.linear.com/docs/Application%20Note/an65f.pdf>, (Nov. 1995),124 pages.
Xu, Zhang et al., “Hand Gesture Recognition and Virtual Game Control Based on 3D Accelerometer and EMG Sensors”, IUI'09, Feb. 8-11, 2009, retrieved from <http://sclab.yonsei.ac.kr/courses/10TPR/10TPR.files/Hand%20Gesture%20Recognition%20and%20Virtual%20Game%20Control%20based%20on%203d%20accelerometer%20and%20EMG%20sensors.pdf> on Jan. 5, 2012,(Feb. 8, 2009), 5 pages.
Xu, Zhi-Gang et al., “Vision-based Detection of Dynamic Gesture”, ICTM'09, Dec. 5-6, 2009, retrieved from <http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=5412956> on Jan. 5, 2012,(Dec. 5, 2009), pp. 223-226.
Yagi, Nobuyuki “The Concept of “AdapTV””, Series: The Challenge of “AdapTV”, Broadcast Technology, No. 28, (2006), pp. 16-17.
Yan, Jin-Ren et al., “Edge-Lighting Light Guide Plate Based on Micro-Prism for Liquid Crystal Display”, Journal of Display Technology, vol. 5, No. 9, Available at <http://ieeexplore.ieee.org/ielx5/9425/5196834/05196835.pdf?tp=&arnumber=5196835&isnumber=5196834>,(Sep. 2009), pp. 355-357.
Yu, et al., “A New Driving Scheme for Reflective Bistable Cholesteric Liquid Crystal Displays”, Society for Information Display International Symposium Digest of Technical Papers, Retrieved from <http://www.ee.ust.hk/~eekwok/publications/1997/bcd_sid.pdf>,(May 1997), 4 pages.
Zhang, et al., “Model-Based Development of Dynamically Adaptive Software”, In Proceedings of ICSE 2006, Available at <http://www.irisa.fr/lande/lande/icse-proceedings/icse/p371.pdf>,(May 20, 2006), pp. 371-380.
Zhang, Rui “Design of Head Mounted Displays”, Retrieved at <<http://www.optics.arizona.edu/optomech/student%20reports/2007/Design%20of%20mounteddisplays%20Zhang.pdf>>, (Dec. 12, 2007), 6 pages.
Zhu, Dingyun et al., “Keyboard before Head Tracking Depresses User Success in Remote Camera Control”, In Proceedings of the 12th IFIP TC 13 International Conference on Human-Computer Interaction, Part II, retrieved from <http://csiro.academia.edu/Departmemts/CSIRO_ICT_Centre/Papers?page=5> on Jun. 1, 2012,(Aug. 24, 2009), 14 pages.
“Advisory Action”, U.S. Appl. No. 14/199,924, May 28, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/938,930, Jun. 6, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/939,002, May 22, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/939,002, Jun. 19, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/939,032, Jun. 26, 2014, 2 pages.
“Final Office Action”, U.S. Appl. No. 13/494,651, Jun. 11, 2014, 19 pages.
“Final Office Action”, U.S. Appl. No. 13/653,682, Jun. 11, 2014, 11 pages.
“Foreign Notice of Allowance”, CN Application No. 201320096755.7, Jan. 27, 2014, 2 pages.
“Foreign Office Action”, CN Application No. 201320097079.5, Sep. 26, 2013, 4 pages.
“Interlink Electronics FSR (TM) Force Sensing Resistors (TM)”, Retrieved at <<http://akizukidenshi.com/download/ds/interlinkelec/94-00004+Rev+B%20FSR%20Integration%20Guide.pdf>> on Mar. 21, 2013, 36 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2014/031531, Jun. 20, 2014, 10 Pages.
“Non-Final Office Action”, U.S. Appl. No. 13/468,882, Jul. 9, 2014, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/468,949, Jun. 20, 2014, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/470,951, Jul. 2, 2014, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,001, Jun. 17, 2014, 23 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,030, May 15, 2014, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,054, Jun. 3, 2014, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,412, Jul. 11, 2014, 22 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/564,520, Jun. 16, 2014, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/595,700, Jun. 18, 2014, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/647,479, Jul. 3, 2014, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,976, Jun. 16, 2014, 23 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/225,250, Jun. 17, 2014, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/225,276, Jun. 13, 2014, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/277,240, Jun. 13, 2014, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 13/468,918, Jun. 17, 2014, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,186, Jul. 3, 2014, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,237, May 12, 2014, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,405, Jun. 24, 2014, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 14/018,286, May 23, 2014, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 14/199,924, Jun. 10, 2014, 4 pages.
“Restriction Requirement”, U.S. Appl. No. 13/595,700, May 28, 2014, 6 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 14/018,286, Jun. 11, 2014, 5 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/563,435, Mar. 20, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/565,124, Apr. 3, 2014, 4 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/565,124, Mar. 10, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/565,124, Apr. 14, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/938,930, May 6, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/939,002, May 5, 2014, 2 pages.
“Final Office Action”, U.S. Appl. No. 13/371,725, Apr. 2, 2014, 22 pages.
“Final Office Action”, U.S. Appl. No. 13/525,070, Apr. 24, 2014, 21 pages.
“Final Office Action”, U.S. Appl. No. 13/780,228, Mar. 28, 2014, 13 pages.
“Final Office Action”, U.S. Appl. No. 14/063,912, Apr. 29, 2014, 10 pages.
“Final Office Action”, U.S. Appl. No. 14/199,924, May 6, 2014, 5 pages.
“Foreign Office Action”, CN Application No. 201320328022.1, Feb. 17, 2014, 4 Pages.
“Foreign Office Action”, CN Application No. 201320328022.1, Oct. 18, 2013, 3 Pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,186, Feb. 27, 2014, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,237, Mar. 24, 2014, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,336, May 7, 2014, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,376, Apr. 2, 2014, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/492,232, Apr. 30, 2014, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/527,263, Apr. 3, 2014, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,682, Feb. 26, 2014, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/656,055, Mar. 12, 2014, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/199,924, Apr. 10, 2014, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/200,595, Apr. 11, 2014, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,139, Mar. 17, 2014, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,232, Apr. 25, 2014, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,287, May 2, 2014, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 13/939,002, Mar. 3, 2014, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/939,032, Apr. 3, 2014, 4 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/653,321, Mar. 28, 2014, 4 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/471,030, Sep. 30, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/651,287, Aug. 21, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/199,924, Aug. 29, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/199,924, Sep. 5, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/199,924, Sep. 19, 2014, 2 pages.
“EP Search Report”, EP Application No. 09812072.8, Apr. 5, 2012, 6 Pages.
“Final Office Action”, U.S. Appl. No. 13/468,949, Oct. 6, 2014, 11 pages.
“Final Office Action”, U.S. Appl. No. 13/471,336, Oct. 6, 2014, 13 pages.
“Final Office Action”, U.S. Appl. No. 13/595,700, Oct. 9, 2014, 8 pages.
“Final Office Action”, U.S. Appl. No. 13/656,055, Sep. 17, 2014, 10 pages.
“Foreign Notice of Allowance”, CN Application No. 201320097065.3, Nov. 21, 2013, 2 pages.
“Foreign Office Action”, CN Application No. 200980134848, May 13, 2013, 7 Pages.
“Foreign Office Action”, CN Application No. 200980134848, May 31, 2012, 7 Pages.
“Foreign Office Action”, CN Application No. 200980134848, Dec. 4, 2013, 8 Pages.
“Foreign Office Action”, CN Application No. 200980134848, Dec. 19, 2012, 8 Pages.
“Foreign Office Action”, CN Application No. 201080037117.7, Jul. 1, 2014, 9 Pages.
“Foreign Office Action”, CN Application No. 201210023945.6, Jun. 25, 2014, 6 Pages.
“Foreign Office Action”, CN Application No. 201320097065.3, Jun. 18, 2013, 2 pages.
“Foreign Office Action”, JP Application No. 2011-526118, Aug. 16, 2013, 8 Pages.
“Foreign Office Action”, JP Application No. 2012-525632, May 2, 2014, 10 Pages.
“Foreign Office Action”, JP Application No. 2012-525722, Apr. 22, 2014, 15 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2009/055250, Mar. 2, 2014, 10 Pages.
“Non-Final Office Action”, U.S. Appl. No. 13/021,448, Jul. 22, 2014, 35 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,282, Sep. 3, 2014, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/780,228, Sep. 15, 2014, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/063,912, Sep. 2, 2014, 11 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,030, Sep. 5, 2014, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 13/494,651, Oct. 2, 2014, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/653,682, Sep. 24, 2014, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 14/277,240, Sep. 16, 2014, 4 pages.
“Restriction Requirement”, U.S. Appl. No. 13/653,184, Sep. 5, 2014, 6 pages.
“Search Report”, EP Application No. 09812072.8, Apr. 17, 2013, 5 Pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/471,405, Aug. 29, 2014, 5 pages.
Boual, et al., “Wedge Displays as Cameras”, Retrieved From: http://www.camfpd.com/72-3.pdf, SID Symposium Digest of Technical Papers, vol. 37, Issue 1, pp. 1999-2002, Jun. 2006, 4 Pages.
Chen, et al., “Design of a Novel Hybrid Light Guide Plate for Viewing Angle Switchable Backlight Module”, Institute of Photonic Systems, National Chiao Tung University, Tainan, Taiwan., Jul. 1, 2013, 4 Pages.
Chou, et al., “Imaging and Chromatic Behavior Analysis of a Wedge-Plate Display”, Retrieved From: http://www.di.nctu.edu.tw/2006TDC/papers/Flexible/06-012.doc, SID Symposium Digest of Technical Papers vol. 37, Issue 1, pp. 1031-1034,Jun. 2006, 4 Pages.
Ishida, et al., “A Novel Ultra Thin Backlight System without Optical Sheets Using a Newly Developed Multi-Layered Light-guide”, SID 10 Digest, Jul. 5, 2012, 4 Pages.
Nishizawa, et al., “Investigation of Novel Diffuser Films for 2D Light-Distribution Control”, Tohoku University, Aramaki Aoba, Aoba-ku, Sendai 980-8579, Japan, LINTEC Corporation, 23-23 Honcho, Itabashi-ku, Tokyo 173-0001, Japan., Dec. 2011, 4 Pages.
Phillips, et al., “Links Between Holography and Lithography”, Fifth International Symposium on Display Holography, 206., Feb. 17, 1995, 9 Pages.
Powell, “High-Efficiency Projection Screen”, U.S. Appl. No. 14/243,501, Apr. 2, 2014, 26 Pages.
Travis, “P-60: LCD Smear Elimination by Scanning Ray Angle into a Light Guide”, Retrieved From: http://www2.eng.cam.ac.uk/~arlt1/P_60.pdf, SID Symposium Digest of Technical Papers vol. 35, Issue 1, pp. 474-477, May 2004, 4 Pages.
Travis, et al., “Optical Design of a Flat Panel Projection Wedge Display”, 9th International Display Workshops, paper FMC6-3, Dec. 4-6, 2002, Hiroshima, Japan., Dec. 2002, 4 Pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/651,232, Jul. 31, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/939,032, Jul. 15, 2014, 2 pages.
“Final Office Action”, U.S. Appl. No. 13/471,376, Aug. 18, 2014, 24 pages.
“Final Office Action”, U.S. Appl. No. 13/595,700, Aug. 15, 2014, 6 pages.
“Final Office Action”, U.S. Appl. No. 13/599,635, Aug. 8, 2014, 16 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028483, Jun. 24, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028484, Jun. 24, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028485, Jun. 25, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028769, Jun. 26, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028771, Jun. 19, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028486, Jun. 20, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/041017, Jul. 17, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028489, Jun. 20, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028488, Jun. 24, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028767, Jun. 24, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028481, Jun. 19, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028490, Jun. 24, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028766, Jun. 26, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028772, Jun. 30, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028768, Jun. 24, 2014, 12 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028482, Jun. 20, 2014, 13 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028487, May 27, 2014, 9 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028770, Jun. 26, 2014, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/525,070, Aug. 14, 2014, 24 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/494,651, Oct. 24, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/494,651, Dec. 29, 2014, 2 pages.
“Final Office Action”, U.S. Appl. No. 13/471,054, Oct. 23, 2014, 17 pages.
“Final Office Action”, U.S. Appl. No. 13/471,412, Dec. 15, 2014, 11 pages.
“Final Office Action”, U.S. Appl. No. 13/492,232, Nov. 17, 2014, 13 pages.
“Final Office Action”, U.S. Appl. No. 13/647,479, Dec. 12, 2014, 12 pages.
“Final Office Action”, U.S. Appl. No. 14/200,595, Nov. 19, 2014, 5 pages.
“Final Office Action”, U.S. Appl. No. 14/225,276, Dec. 17, 2014, 6 pages.
“Foreign Office Action”, CN Application No. 201320097079.5, Jul. 28, 2014, 4 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2014/043546, Oct. 9, 2014, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/371,725, Nov. 3, 2014, 27 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,393, Oct. 20, 2014, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/525,614, Nov. 24, 2014, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,184, Dec. 1, 2014, 7 pages.
“Restriction Requirement”, U.S. Appl. No. 13/653,218, Nov. 7, 2014, 6 pages.
“Restriction Requirement”, U.S. Appl. No. 14/147,252, Dec. 1, 2014, 6 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/471,405, Dec. 17, 2014, 5 pages.
Harrison, “UIST 2009 Student Innovation Contest—Demo Video”, Retrieved From: <https://www.youtube.com/watch?v=PDI8eYIASf0> Sep. 16, 2014, Jul. 23, 2009, 1 page.
“Accessing Device Sensors”, retrieved from <https://developer.palm.com/content/api/dev-guide/pdk/accessing-device-sensors.html> on May 25, 2012, 4 pages.
“ACPI Docking for Windows Operating Systems”, Retrieved from: <http://www.scritube.com/limba/engleza/software/ACPI-Docking-for-Windows-Opera331824193.php> on Jul. 6, 2012,10 pages.
“Advanced Configuration and Power Management Specification”, Intel Corporation, Microsoft Corporation, Toshiba Corp. Revision 1, (Dec. 22, 1996),364 pages.
“Chinese Search Report”, Application No. 201110272868.3, (Apr. 1, 2013),10 pages.
“Cholesteric Liquid Crystal”, Retrieved from: <http://en.wikipedia.org/wiki/Cholesteric_liquid_crystal> on Aug. 6, 2012, (Jun. 10, 2012), 2 pages.
“Cirago Slim Case®—Protective case with built-in kickstand for your iPhone 5®”, Retrieved from <http://cirago.com/wordpress/wp-content/uploads/2012/10/ipc1500brochure1.pdf> on Jan. 29, 2013, (Jan. 2013), 1 page.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/470,633, (Apr. 9, 2013), 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/470,633, (Jul. 2, 2013), 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/651,327, (Sep. 12, 2013), 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/651,327 (Sep. 23, 2013), 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/651,726, (Sep. 17, 2013), 2 pages.
“Developing Next-Generation Human Interfaces using Capacitive and Infrared Proximity Sensing”, Silicon Laboratories, Inc., Available at <http://www.silabs.com/pages/DownloadDoc.aspx?FILEURL=support%20documents/technicaldocs/capacitive%20and%20proximity%20sensing_wp.pdf&src=SearchResults>,(Aug. 30, 2010), pp. 1-10.
“Directional Backlighting for Display Panels”, U.S. Appl. No. 13/021,448, (Feb. 4, 2011), 38 pages.
“DR2PA”, retrieved from <http://www.architainment.co.uk/wp-content/uploads/2012/08/DR2PA-AU-US-size-Data-Sheet-Rev-H_LOGO.pdf> on Sep. 17, 2012, (Jan. 2012),4 pages.
“Final Office Action”, U.S. Appl. No. 13/471,001, (Jul. 25, 2013), 20 pages.
“Final Office Action”, U.S. Appl. No. 13/471,139, (Sep. 16, 2013),13 pages.
“Final Office Action”, U.S. Appl. No. 13/471,336, (Aug. 28, 2013),18 pages.
“Final Office Action”, U.S. Appl. No. 13/651,195, (Apr. 18, 2013),13 pages.
“Final Office Action”, U.S. Appl. No. 13/651,232, (May 21, 2013), 21 pages.
“Final Office Action”, U.S. Appl. No. 13/651,287, (May 3, 2013),16 pages.
“Final Office Action”, U.S. Appl. No. 13/651,976, (Jul. 25, 2013), 21 pages.
“Final Office Action”, U.S. Appl. No. 13/653,321, (Aug. 2, 2013),17 pages.
“Final Office Action”, U.S. Appl. No. 13/653,682, (Oct. 18, 2013),16 pages.
“Final Office Action”, U.S. Appl. No. 13/656,055, (Oct. 23, 2013),14 pages.
“Final Office Action”, U.S. Appl. No. 13/938,930, (Nov. 8, 2013),10 pages.
“Final Office Action”, U.S. Appl. No. 13/939,002, (Nov. 8, 2013), 7 pages.
“Final Office Action”, U.S. Appl. No. 13/939,032, (Dec. 20, 2013), 5 pages.
“FingerWorks Installation and Operation Guide for the TouchStream ST and TouchStream LP”, FingerWorks, Inc. Retrieved from <http://ec1.images-amazon.com/media/i3d/01/A/man-migrate/MANUAL000049862.pdf>, (2002),14 pages.
“First One Handed Fabric Keyboard with Bluetooth Wireless Technology”, Retrieved from: <http://press.xtvworld.com/article3817.html> on May 8, 2012,(Jan. 6, 2005), 2 pages.
“For Any Kind of Proceeding 2011 Springtime as Well as Coil Nailers as Well as Hotter Summer Season”,Lady Shoe Worlds, retrieved from <http://www.ladyshoesworld.com/2011/09/18/for-any-kind-of-proceeding-2011-springtime-as-well-as-coil-nailers-as-well-as-hotter-summer-season/> on Nov. 3, 2011,(Sep. 8, 2011), 2 pages.
“Force and Position Sensing Resistors: An Emerging Technology”, Interlink Electronics, Available at <http://staff.science.uva.nl/~vlaander/docu/FSR/An_Exploring_Technology.pdf>,(Feb. 1990), pp. 1-6.
“Frogpad Introduces Weareable Fabric Keyboard with Bluetooth Technology”, Retrieved from: <http://www.geekzone.co.nz/content.asp?contentid=3898> on May 7, 2012,(Jan. 7, 2005), 3 pages.
“How to Use the iPad's Onscreen Keyboard”, Retrieved from <http://www.dummies.com/how-to/content/how-to-use-the-ipads-onscreen-keyboard.html> on Aug. 28, 2012, 3 pages.
“iControlPad 2—The open source controller”, Retrieved from <http://www.kickstarter.com/projects/1703567677/icontrolpad-2-the-open-source-controller> on Nov. 20, 2012, (2012),15 pages.
“i-Interactor electronic pen”, Retrieved from: <http://www.alibaba.com/product-gs/331004878/i_Interactor_electronic_pen.html> on Jun. 19, 2012, 5 pages.
“Incipio LG G-Slate Premium Kickstand Case—Black Nylon”, Retrieved from: <http://www.amazon.com/Incipio-G-Slate-Premium-Kickstand-Case/dp/B004ZKP916> on May 8, 2012, 4 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/050471,(Apr. 9, 2012), 8 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028479, (Jun. 17, 2013), 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/029461, (Jun. 21, 2013),11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028948, (Jun. 21, 2013),11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/044871, (Aug. 14, 2013),12 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/040968, (Sep. 5, 2013),12 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/045049, (Sep. 16, 2013), 9 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/042550, (Sep. 24, 2013),14 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/043961, (Oct. 17, 2013),11 pages.
“Membrane Keyboards & Membrane Keypads”, Retrieved from: <http://www.pannam.com/> on May 9, 2012,(Mar. 4, 2009), 2 pages.
“Microsoft Develops Glasses-Free Eye-Tracking 3D Display”, Tech-FAQ—retrieved from <http://www.tech-faq.com/microsoft-develops-glasses-free-eye-tracking-3d-display.html> on Nov. 2, 2011, (Nov. 2, 2011),3 pages.
“Microsoft Reveals Futuristic 3D Virtual HoloDesk Patent”, Retrieved from <http://www.patentbolt.com/2012/05/microsoft-reveals-futuristic-3d-virtual-holodesk-patent.html> on May 28, 2012, (May 23, 2012), 9 pages.
“Motion Sensors”, Android Developers, retrieved from <http://developer.android.com/guide/topics/sensors/sensors_motion.html> on May 25, 2012, 7 pages.
“MPC Fly Music Production Controller”, AKAI Professional, Retrieved from: <http://www.akaiprompc.com/mpc-fly> on Jul. 9, 2012, 4 pages.
“NI Releases New Maschine & Maschine Mikro”, Retrieved from <http://www.djbooth.net/index/dj-equipment/entry/ni-releases-new-maschine-mikro/> on Sep. 17, 2012, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/882,994, (Feb. 1, 2013),17 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/021,448, (Dec. 13, 2012), 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/021,448, (Aug. 16, 2013), 25 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/371,725, (Nov. 7, 2013),19 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,001, (Feb. 19, 2013),15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,139, (Mar. 21, 2013),12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,202, (Feb. 11, 2013),10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,336, (Jan. 18, 2013),14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/527,263, (Jul. 19, 2013), 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/563,435, (Jun. 14, 2013), 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/564,520, (Jun. 19, 2013), 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/565,124, (Jun. 17, 2013), 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,195, (Jan. 2, 2013),14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,232, (Jan. 17, 2013),15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,232, (Dec. 5, 2013),15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,272, (Feb. 12, 2013),10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,287, (Jan. 29, 2013),13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,304, (Mar. 22, 2013), 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,327, (Mar. 22, 2013), 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,726, (Apr. 15, 2013), 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,871, (Mar. 18, 2013),14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,871, (Jul. 1, 2013),5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,976, (Feb. 22, 2013),16 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,321, (Feb. 1, 2013),13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,682, (Feb. 7, 2013),11 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,682, (Jun. 3, 2013),14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/656,055, (Apr. 23, 2013),11 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/656,520, (Feb. 1, 2013),15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/656,520, (Jun. 5, 2013), 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/780,228, (Oct. 30, 2013),12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/938,930, (Aug. 29, 2013), 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/939,002, (Aug. 28, 2013), 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/939,002, (Dec. 20, 2013), 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/939,032, (Aug. 29, 2013), 7 pages.
“Notice of Allowance”, U.S. Appl. No. 12/882,994, (Jul. 12, 2013), 9 pages.
“Notice of Allowance”, U.S. Appl. No. 13/470,633, (Mar. 22, 2013),7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,202, (May 28, 2013),7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/563,435, (Nov. 12, 2013), 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/565,124, (Dec. 24, 2013), 6 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,195, (Jul. 8, 2013), 9 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,272, (May 2, 2013), 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,304, (Jul. 1, 2013), 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,327, (Jun. 11, 2013), 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,726, (May 31, 2013), 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,871, (Oct. 2, 2013), 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/653,321, (Dec. 18, 2013),41 pages.
“Notice of Allowance”, U.S. Appl. No. 13/656,520, (Oct. 2, 2013), 5 pages.
“Notice to Grant”, CN Application No. 201320097089.9, (Sep. 29, 2013), 2 Pages.
“Notice to Grant”, CN Application No. 201320097124.7, (Oct. 8, 2013), 2 pages.
“On-Screen Keyboard for Windows 7, Vista, XP with Touchscreen”, Retrieved from <www.comfort-software.com/on-screen-keyboard.html> on Aug. 28, 2012, (Feb. 2, 2011), 3 pages.
“Optical Sensors in Smart Mobile Devices”, ON Semiconductor, TND415/D, Available at <http://www.onsemi.jp/pub_link/Collateral/TND415-D.PDF>,(Nov. 2010), pp. 1-13.
“Optics for Displays: Waveguide-based Wedge Creates Collimated Display Backlight”, OptoIQ, retrieved from <http://www.optoiq.com/index/photonics-technologies-applications/lfw-display/lfw-article-display.articles.laser-focus-world.volume-46.issue-1.world-news.optics-for_displays.html> on Nov. 2, 2010,(Jan. 1, 2010), 3 pages.
“PCT Search Report”, Application No. PCT/US2013/042790, (Aug. 8, 2013), 9 pages.
“Position Sensors”, Android Developers, retrieved from <http://developer.android.com/guide/topics/sensors/sensors_position.html> on May 25, 2012, 5 pages.
“Real-Time Television Content Platform”, retrieved from <http://www.accenture.com/us-en/pages/insight-real-time-television-platform.aspx> on Mar. 10, 2011, (May 28, 2002), 3 pages.
“Reflex LCD Writing Tablets”, retrieved from <http://www.kentdisplays.com/products/lcdwritingtablets.html> on Jun. 27, 2012, 3 pages.
“Restriction Requirement”, U.S. Appl. No. 13/468,918, (Nov. 29, 2013), 6 pages.
“Restriction Requirement”, U.S. Appl. No. 13/471,139, (Jan. 17, 2013), 7 pages.
“Restriction Requirement”, U.S. Appl. No. 13/651,304, (Jan. 18, 2013), 7 pages.
“Restriction Requirement”, U.S. Appl. No. 13/651,726, (Feb. 22, 2013), 6 pages.
“Restriction Requirement”, U.S. Appl. No. 13/651,871, (Feb. 7, 2013), 6 pages.
“Restriction Requirement”, U.S. Appl. No. 13/715,229, (Aug. 13, 2013), 7 pages.
“SMART Board™ Interactive Display Frame Pencil Pack”, Available at <http://downloads01.smarttech.com/media/sitecore/en/support/product/sbfpd/400series(interactivedisplayframes)/guides/smartboardinteractivedisplayframepencilpackv12mar09.pdf>, (2009), 2 pages.
“SolRx™ E-Series Multidirectional Phototherapy Expandable™ 2-Bulb Full Body Panel System”, Retrieved from: <http://www.solarcsystems.com/us_multidirectional_uv_light_therapy_1_intro.html> on Jul. 25, 2012,(2011), 4 pages.
“The Microsoft Surface Tablets Comes With Impressive Design and Specs”, Retrieved from <http://microsofttablereview.com/the-microsoft-surface-tablets-comes-with-impressive-design-and-specs> on Jan. 30, 2013, (Jun. 2012), 2 pages.
“Tilt Shift Lenses: Perspective Control”, retrieved from http://www.cambridgeincolour.com/tutorials/tilt-shift-lenses1.htm, (Mar. 28, 2008),11 Pages.
“Virtualization Getting Started Guide”, Red Hat Enterprise Linux 6, Edition 0.2, retrieved from <http://docs.redhat.com/docs/en-US/Red_Hat_Enterprise_Linux/6/html-single/Virtualization_Getting_Started_Guide/index.html> on Jun. 13, 2012, 24 pages.
“Welcome to Windows 7”, Retrieved from: <http://www.microsoft.com/en-us/download/confirmation.aspx?id=4984> on Aug. 1, 2013, (Sep. 16, 2009), 3 pages.
“What is Active Alignment?”, http://www.kasalis.com/active_alignment.html, retrieved on Nov. 22, 2012, 2 Pages.
“What is the PD-Net Project About?”, retrieved from <http://pd-net.org/about/> on Mar. 10, 2011, (Mar. 10, 2011), 3 pages.
“Write & Learn Spellboard Advanced”, Available at <http://somemanuals.com/VTECH,WRITE%2526LEARN--SPELLBOARD--ADV--71000,JIDFHE.PDF>, (2006), 22 pages.
Bathiche, Steven N., et al., “Input Device with Interchangeable Surface”, U.S. Appl. No. 13/974,749, (Aug. 23, 2013), 51 pages.
Bert, et al., “Passive Matrix Addressing of Electrophoretic Image Display”, International Display Research Conference, Retrieved from <http://www.cmst.be/publi/eurodisplay2002_s14-1.pdf>,(Oct. 1, 2002), 4 pages.
Block, Steve et al., “DeviceOrientation Event Specification”, W3C, Editor's Draft, retrieved from <https://developer.palm.com/content/api/dev-guide/pdk/accessing-device-sensors.html> on May 25, 2012,(Jul. 12, 2011), 14 pages.
Brown, Rich “Microsoft Shows Off Pressure-Sensitive Keyboard”, retrieved from <http://news.cnet.com/8301-17938_105-10304792-1.html> on May 7, 2012, (Aug. 6, 2009), 2 pages.
Burge, et al., “Determination of off-axis aberrations of imaging systems using on-axis measurements”, SPIE Proceeding, Retrieved from <http://www.loft.optics.arizona.edu/documents/journal_articles/Jim_Burge_Determination_axis_measurements.pdf>,(Sep. 21, 2011),10 pages.
Butler, Alex et al., “SideSight: Multi-“touch” Interaction around Small Devices”, In the proceedings of the 21st annual ACM symposium on User interface software and technology, retrieved from <http://research.microsoft.com/pubs/132534/sidesight_crv3.pdf> on May 29, 2012,(Oct. 19, 2008), 4 pages.
Chang, Jee-Gong et al., “Optical Design and Analysis of LCD Backlight Units Using ASAP”, Optical Engineering, Available at <http://www.opticsvalley.com/resources/kbasePDF/ma_oe_001_optical_design.pdf>,(Jun. 2003),15 pages.
Crider, Michael “Sony Slate Concept Tablet “Grows” a Kickstand”, Retrieved from: <http://androidcommunity.com/sony-slate-concept-tablet-grows-a-kickstand-20120116/> on May 4, 2012,(Jan. 16, 2012), 9 pages.
Das, Apurba et al., “Study of Heat Transfer through Multilayer Clothing Assemblies: A Theoretical Prediction”, Retrieved from <http://www.autexrj.com/cms/zalaczone_pliki/5_013_11.pdf>, (Jun. 2011), 7 pages.
Dietz, Paul H., et al., “A Practical Pressure Sensitive Computer Keyboard”, In Proceedings of UIST 2009,(Oct. 2009), 4 pages.
Diverdi, et al., “An Immaterial Pseudo-3D Display with 3D Interaction”, In the proceedings of Three-Dimensional Television: Capture, Transmission, and Display, Springer, Retrieved from <http://www.cs.ucsb.edu/~holl/pubs/DiVerdi-2007-3DTV.pdf>,(Feb. 6, 2007), 26 pages.
Gaver, William W., et al., “A Virtual Window on Media Space”, retrieved from <http://www.gold.ac.uk/media/15gaver-smets-overbeeke.MediaSpaceWindow.chi95.pdf> on Jun. 1, 2012,(May 7, 1995), 9 pages.
Glatt, Jeff “Channel and Key Pressure (Aftertouch)”, Retrieved from: <http://home.roadrunner.com/~jgglatt/tutr/touch.htm> on Jun. 11, 2012, 2 pages.
Grossman, et al., “Multi-Finger Gestural Interaction with 3D Volumetric Displays”, In the proceedings of the 17th annual ACM symposium on User interface software and technology, Retrieved from <http://www.dgp.toronto.edu/papers/tgrossman_UIST2004.pdf>,(Oct. 24, 2004), pp. 61-70.
Hanlon, Mike “ElekTex Smart Fabric Keyboard Goes Wireless”, Retrieved from: <http://www.gizmag.com/go/5048/ > on May 7, 2012,(Jan. 15, 2006), 5 pages.
Harada, Susumu et al., “VoiceDraw: A Hands-Free Voice-Driven Drawing Application for People With Motor Impairments”, In Proceedings of Ninth International ACM SIGACCESS on Computers and Accessibility, retrieved from <http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.113.7211&rep=rep1&type=pdf> on Jun. 1, 2012,(Oct. 15, 2007), 8 pages.
Hinckley, Ken et al., “Codex: A Dual Screen Tablet Computer”, Conference on Human Factors in Computing Systems, (Apr. 9, 2009), 10 pages.
Iwase, Eiji, “Multistep Sequential Batch Assembly of Three-Dimensional Ferromagnetic Microstructures with Elastic Hinges”, Journal of Microelectromechanical Systems, Retrieved from <http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1549861>, (Dec. 2005), 7 pages.
Izadi, Shahram et al., “ThinSight: A Thin Form-Factor Interactive Surface Technology”, Communications of the ACM, vol. 52, No. 12, retrieved from <http://research.microsoft.com/pubs/132532/p90-izadi.pdf> on Jan. 5, 2012,(Dec. 2009), pp. 90-98.
Jacobs, et al., “2D/3D Switchable Displays”, In the proceedings of Sharp Technical Journal (4), Available at <https://cgi.sharp.co.jp/corporate/rd/journal-85/pdf/85-04.pdf>,(Apr. 2003), pp. 15-18.
Kaufmann, Benoit et al., “Hand Posture Recognition Using Real-time Artificial Evolution”, EvoApplications'09, retrieved from <http://evelyne.lutton.free.fr/Papers/KaufmannEvolASP2010.pdf> Jan. 5, 2012,(Apr. 3, 2010),10 pages.
Kaur, Sukhmani “Vincent Liew's redesigned laptop satisfies ergonomic needs”, Retrieved from: <http://www.designbuzz.com/entry/vincent-liew-s-redesigned-laptop-satisfies-ergonomic-needs/> on Jul. 27, 2012,(Jun. 21, 2010), 4 pages.
Khuntontong, Puttachat et al., “Fabrication of Molded Interconnection Devices by Ultrasonic Hot Embossing on Thin Polymer Films”, IEEE Transactions on Electronics Packaging Manufacturing, vol. 32, No. 3,(Jul. 2009), pp. 152-156.
Kim, Min Su et al., “A Controllable Viewing Angle LCD with an Optically isotropic liquid crystal”, Journal of Physics D: Applied Physics, vol. 43, No. 14, (Mar. 23, 2010),7 Pages.
Lance, David M., et al., “Media Processing Input Device”, U.S. Appl. No. 13/655,065, filed Oct. 18, 2012, 43 pages.
Lee, C.M.G “Flat-Panel Autostereoscopic 3D Display”, Optoelectronics, IET, Available at <http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=04455550>,(Feb. 2008), pp. 24-28.
Lee, et al., “Depth-Fused 3D Imagery on an Immaterial Display”, In the proceedings of IEEE Transactions on Visualization and Computer Graphics, vol. 15, No. 1, Retrieved from <http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=04540094>,(Jan. 2009), pp. 20-33.
Lee, et al., “LED Light Coupler Design for a Ultra Thin Light Guide”, Journal of the Optical Society of Korea, vol. 11, Issue.3, Retrieved from <http://opticslab.kongju.ac.kr/pdf/06.pdf>,(Sep. 2007), 5 pages.
Li, et al., “Characteristic Mode Based Tradeoff Analysis of Antenna-Chassis Interactions for Multiple Antenna Terminals”, In IEEE Transactions on Antennas and Propagation, Retrieved from <http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6060882>,(Feb. 2012),13 pages.
Linderholm, Owen “Logitech Shows Cloth Keyboard for PDAs”, Retrieved from: <http://www.pcworld.com/article/89084/logitech_shows_cloth_keyboard_for_pdas.html> on May 7, 2012,(Mar. 15, 2002),5 pages.
Liu, et al., “Three-dimensional PC: toward novel forms of human-computer interaction”, In the proceedings of Three-Dimensional Video and Display: Devices and Systems vol. CR76, Retrieved from <http://www.google.co.in/url?sa=t&rct=j&q=Three-dimensional+PC:+toward+novel+forms+of+human-computer+interaction&source=web&cd=1&ved=0CFoQFjAA&url=http%3A%2F%2Fciteseerx.ist.psu.edu%2Fviewdoc%2Fdownload%3Fdoi%3D10.1.1.32.9469%26rep%3Drep1%26,(Nov. 5, 2000), pp. 250-281.
Manresa-Yee, Cristina et al., “Experiences Using a Hands-Free Interface”, In Proceedings of the 10th International ACM SIGACCESS Conference on Computers and Accessibility, retrieved from <http://dmi.uib.es/~cmanresay/Research%5BMan08%5DAssets08.pdf> on Jun. 1, 2012,(Oct. 13, 2008), pp. 261-262.
McLellan, Charles “Eleksen Wireless Fabric Keyboard: a first look”, Retrieved from: <http://www.zdnetasia.com/eleksen-wireless-fabric-keyboard-a-first-look-40278954.htm> on May 7, 2012,(Jul. 17, 2006), 9 pages.
Miller, Matthew “MOGA gaming controller enhances the Android gaming experience”, Retrieved from <http://www.zdnet.com/moga-gaming-controller-enhances-the-android-gaming-experience-7000007550/> on Nov. 20, 2012, (Nov. 18, 2012), 9 pages.
Morookian, et al., “Ambient-Light-Canceling Camera Using Subtraction of Frames”, NASA Tech Briefs, Retrieved from <http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20110016693_2011017808.pdf>,(May 2004), 2 pages.
Nakanishi, Hideyuki et al., “Movable Cameras Enhance Social Telepresence in Media Spaces”, In Proceedings of the 27th International Conference on Human Factors in Computing Systems, retrieved from <http://smg.ams.eng.osaka-u.ac.jp/~nakanishi/hnp_2009_chi.pdf> on Jun. 1, 2012,(Apr. 6, 2009), 10 pages.
Peli, Eli “Visual and Optometric Issues with Head-Mounted Displays”, IS & T/OSA Optics & Imaging in the Information Age, The Society for Imaging Science and Technology, available at <http://www.u.arizona.edu/~zrui3/zhang_pHMPD_spie07.pdf>,(1996), pp. 364-369.
Piltch, Avram “ASUS Eee Pad Slider SL101 Review”, Retrieved from <http://www.laptopmag.com/review/tablets/asus-eee-pad-slider-sl101.aspx>, (Sep. 22, 2011), 5 pages.
Post, E.R. et al., “E-Broidery: Design and Fabrication of Textile-Based Computing”, IBM Systems Journal, vol. 39, Issue 3 & 4,(Jul. 2000), pp. 840-860.
Prospero, Michael “Samsung Outs Series 5 Hybrid PC Tablet”, Retrieved from: <http://blog.laptopmag.com/samsung-outs-series-5-hybrid-pc-tablet-running-windows-8> on Oct. 31, 2013, (Jun. 4, 2012), 7 pages.
Purcher, Jack “Apple is Paving the Way for a New 3D GUI for IOS Devices”, Retrieved from: <http://www.patentlyapple.com/patently-apple/2012/01/apple-is-paving-the-way-for-a-new-3d-gui-for-ios-devices.html> on Jun. 4, 2012,(Jan. 12, 2012),15 pages.
Qin, Yongqiang et al., “pPen: Enabling Authenticated Pen and Touch Interaction on Tabletop Surfaces”, In Proceedings of ITS 2010, Available at <http://www.dfki.de/its2010/papers/pdf/po172.pdf>,(Nov. 2010), pp. 283-284.
Reilink, Rob et al., “Endoscopic Camera Control by Head Movements for Thoracic Surgery”, In Proceedings of 3rd IEEE RAS & EMBS International Conference of Biomedical Robotics and Biomechatronics, retrieved from <http://doc.utwente.nl/74929/1/biorob—online.pdf> on Jun. 1, 2012,(Sep. 26, 2010), pp. 510-515.
Reisman, et al., “A Screen-Space Formulation for 2D and 3D Direct Manipulation”, In the proceedings of the 22nd annual ACM symposium on User interface, Retrieved from <http://innovis.cpsc.ucalgary.ca/innovis/uploads/Courses/TableTopDetails2009/Reisman2009.pdf>,(Oct. 4, 2009), 69-78.
Schöning, Johannes et al., “Building Interactive Multi-Touch Surfaces”, Journal of Graphics, GPU, and Game Tools, vol. 14, No. 3, available at <http://www.libavg.com/raw-attachment/wiki/Multitouch/Multitouchguide_draft.pdf>,(Nov. 2009), pp. 35-55.
Staff, “Gametel Android controller turns tablets, phones into portable gaming devices”, Retrieved from <http://www.mobiletor.com/2011/11/18/gametel-android-controller-turns-tablets-phones-into-portable-gaming-devices/#> on Nov. 20, 2012, (Nov. 18, 2011), 5 pages.
Sumimoto, Mark “Touch & Write: Surface Computing With Touch and Pen Input”, Retrieved from: <http://www.gottabemobile.com/2009/08/07/touch-write-surface-computing-with-touch-and-pen-input/> on Jun. 19, 2012 (Aug. 7, 2009), 4 pages.
Sundstedt, Veronica “Gazing at Games: Using Eye Tracking to Control Virtual Characters”, In ACM SIGGRAPH 2010 Courses, retrieved from <http://www.tobii.com/Global/Analysis/Training/EyeTrackAwards/veronica_sundstedt.pdf> on Jun. 1, 2012,(Jul. 28, 2010), 85 pages.
Takamatsu, Seiichi et al., “Flexible Fabric Keyboard with Conductive Polymer-Coated Fibers”, In Proceedings of Sensors 2011,(Oct. 28, 2011), 4 pages.
Travis, Adrian et al., “Collimated Light from a Waveguide for a Display Backlight”, Optics Express, 19714, vol. 17, No. 22, retrieved from <http://download.microsoft.com/download/D/2/E/D2E425F8-CF3C-4C71-A4A2-70F9D4081007/OpticsExpressbacklightpaper.pdf> on Oct. 15, 2009,(Oct. 15, 2009), 6 pages.
Travis, Adrian et al., “P-127: Linearity in Flat Panel Wedge Projection”, SID 03 Digest, retrieved from <http://www2.eng.cam.ac.uk/~arlt1/Linearity%20in%20flat%20panel%20wedge%20projection.pdf>,(May 12, 2005), pp. 716-719.
Travis, Adrian et al., “The Design of Backlights for View-Sequential 3D”, retrieved from <http://download.microsoft.com/download/D/2/E/D2E425F8-CF3C-4C71-A4A2-70F9D4081007/Backlightforviewsequentialautostereo.docx> on Nov. 1, 2010,4 pages.
Travis, Adrian R., et al., “Flat Projection for 3-D”, In Proceedings of the IEEE, vol. 94, Issue: 3, Available at <http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1605201>,(Mar. 13, 2006), pp. 539-549.
Valli, Alessandro “Notes on Natural Interaction”, retrieved from <http://www.idemployee.id.tue.nl/g.w.m.rauterberg/lecturenotes/valli-2004.pdf> on Jan. 5, 2012,(Sep. 2005), 80 pages.
Valliath, G T., “Design of Hologram for Brightness Enhancement in Color LCDs”, Retrieved from <http://www.loreti.it/Download/PDF/LCD/44_05.pdf> on Sep. 17, 2012, (May 1998), 5 pages.
Vaucelle, Cati “Scopemate, A Robotic Microscope!”, Architectradure, retrieved from <http://architectradure.blogspot.com/2011/10/at-uist-this-monday-scopemate-robotic.html> on Jun. 6, 2012,(Oct. 17, 2011), 2 pages.
“Advisory Action”, U.S. Appl. No. 13/939,032, Feb. 24, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/563,435, Jan. 14, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/563,435, Jan. 22, 2014, 2 pages.
“Final Office Action”, U.S. Appl. No. 13/021,448, Jan. 16, 2014, 33 Pages.
“Final Office Action”, U.S. Appl. No. 13/564,520, Jan. 15, 2014, 7 pages.
“Foreign Office Action”, CN Application No. 201320097066.8, Oct. 24, 2013, 5 Pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/055679, Nov. 18, 2013, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/599,635, Feb. 25, 2014, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,405, Feb. 20, 2014, 37 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/494,651, Feb. 4, 2014, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/525,070, Jan. 17, 2014, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/564,520, Feb. 14, 2014, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/063,912, Jan. 2, 2014, 10 pages.
“Notice of Allowance”, U.S. Appl. No. 13/938,930, Feb. 20, 2014, 4 pages.
Lee, “Flat-panel Backlight for View-sequential 3D Display”, Optoelectronics, IEE Proceedings, vol. 151, No. 6, IET, Dec. 2004, 4 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/277,240, Jan. 8, 2015, 2 pages.
“Final Office Action”, U.S. Appl. No. 13/021,448, Jan. 2, 2015, 19 pages.
“Final Office Action”, U.S. Appl. No. 13/468,882, Feb. 12, 2015, 9 pages.
“Final Office Action”, U.S. Appl. No. 13/470,951, Jan. 12, 2015, 20 pages.
“Final Office Action”, U.S. Appl. No. 13/525,070, Jan. 29, 2015, 30 pages.
“Final Office Action”, U.S. Appl. No. 13/527,263, Jan. 27, 2015, 7 pages.
“Final Office Action”, U.S. Appl. No. 14/063,912, Jan. 12, 2015, 12 pages.
“First Examination Report”, NZ Application No. 628690, Nov. 27, 2014, 2 pages.
“Foreign Office Action”, CN Application No. 201080037117.7, Aug. 20, 2013, 10 pages.
“Foreign Office Action”, CN Application No. 201210023945.6, Dec. 3, 2013, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,030, Jan. 15, 2015, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/492,232, Feb. 24, 2015, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/564,520, Jan. 26, 2015, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/599,635, Feb. 12, 2015, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/147,252, Feb. 23, 2015, 11 pages.
“Notice of Allowance”, U.S. Appl. No. 13/595,700, Jan. 21, 2015, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,976, Jan. 21, 2015, 10 pages.
“Notice of Allowance”, U.S. Appl. No. 14/200,595, Feb. 17, 2015, 2 pages.
“Notice of Allowance”, U.S. Appl. No. 14/200,595, Feb. 25, 2015, 4 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/651,232, Apr. 24, 2015, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/656,055, Apr. 13, 2015, 2 pages.
“Final Office Action”, U.S. Appl. No. 13/371,725, Mar. 3, 2015, 30 pages.
“Final Office Action”, U.S. Appl. No. 13/525,614, Apr. 29, 2015, 20 pages.
“Final Office Action”, U.S. Appl. No. 13/780,228, Apr. 10, 2015, 19 pages.
“Final Office Action”, U.S. Appl. No. 14/225,250, Mar. 13, 2015, 7 pages.
“Foreign Notice on Reexamination”, CN Application No. 201320097066.8, Apr. 3, 2015, 7 Pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,054, Mar. 13, 2015, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,376, Mar. 27, 2015, 28 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,393, Mar. 26, 2015, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,412, Jun. 1, 2015, 31 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/525,070, May 18, 2015, 32 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/647,479, Apr. 28, 2015, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,218, Mar. 4, 2015, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/852,848, Mar. 26, 2015, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/063,912, May 7, 2015, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/225,276, Apr. 23, 2015, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/468,949, Apr. 24, 2015, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 13/468,918, Apr. 8, 2015, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/468,949, Apr. 24, 2015, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,030, Apr. 6, 2015, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,282, Apr. 30, 2015, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 13/564,520, May 8, 2015, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,232, Mar. 30, 2015, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/653,184, Mar. 10, 2015, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 13/656,055, Mar. 4, 2015, 7 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/595,700, Apr. 10, 2015, 2 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/595,700, May 4, 2015, 2 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/595,700, May 22, 2015, 2 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/656,055, May 15, 2015, 2 pages.
Schafer, “Using Interactive Maps for Navigation and Collaboration”, CHI '01 Extended Abstracts.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/651,232, Jun. 10, 2015, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/651,232, Jul. 6, 2015, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/656,055, Jul. 1, 2015, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/200,595, Jun. 4, 2015, 3 pages.
“Final Office Action”, U.S. Appl. No. 13/471,376, Jul. 28, 2015, 35 pages.
“Final Office Action”, U.S. Appl. No. 13/492,232, Jul. 10, 2015, 11 pages.
“Final Office Action”, U.S. Appl. No. 13/599,635, Jul. 30, 2015, 23 pages.
“Final Office Action”, U.S. Appl. No. 13/852,848, Jul. 20, 2015, 9 pages.
“Final Office Action”, U.S. Appl. No. 14/147,252, Jun. 25, 2015, 11 pages.
“Foreign Office Action”, CN Application No. 201310067335.0, Jun. 12, 2015, 15 Pages.
“Foreign Office Action”, CN Application No. 201310067808.7, May 28, 2015, 14 Pages.
“Foreign Office Action”, CN Application No. 201310225788.1, Jun. 23, 2015, 14 Pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2014/031531, Jun. 9, 2015, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,336, Jun. 24, 2015, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/525,614, Jul. 31, 2015, 20 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/727,001, Jul. 10, 2015, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 14/225,276, Jun. 22, 2015, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 14/457,881, Jul. 22, 2015, 7 pages.
“Restriction Requirement”, U.S. Appl. No. 13/598,898, Jul. 17, 2015, 6 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/468,918, Jun. 4, 2015, 2 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/468,949, Jun. 5, 2015, 2 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/653,184, Jun. 24, 2015, 2 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/656,055, Jun. 10, 2015, 2 pages.
Cunningham, “Software Infrastructure for Natural Language Processing”, In Proceedings of the Fifth Conference on Applied Natural Language Processing, Mar. 31, 1997, pp. 237-244.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/021,448, Aug. 17, 2015, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/471,030, Aug. 10, 2015, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/564,520, Aug. 14, 2015, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/225,276, Aug. 27, 2015, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/457,881, Aug. 20, 2015, 2 pages.
“Final Office Action”, U.S. Appl. No. 14/063,912, Sep. 3, 2015, 13 pages.
“Foreign Office Action”, CN Application No. 201280029520.4, Jun. 30, 2015, 11 pages.
“Foreign Office Action”, CN Application No. 201310067385.9, Aug. 6, 2015, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/225,250, Aug. 19, 2015, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/225,276, Aug. 19, 2015, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 13/021,448, Jul. 30, 2015, 11 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/468,918, Aug. 7, 2015, 4 pages.
“Advisory Action”, U.S. Appl. No. 13/471,376, Sep. 23, 2015, 7 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/564,520, Sep. 17, 2015, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/225,276, Sep. 29, 2015, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/457,881, Oct. 2, 2015, 2 pages.
“Decision on Reexamination”, CN Application No. 201320097079.5, Sep. 7, 2015, 8 Pages.
“Extended European Search Report”, EP Application No. 13858620.1, Sep. 18, 2015, 6 pages.
“Extended European Search Report”, EP Application No. 13858834.8, Oct. 29, 2015, 8 pages.
“Extended European Search Report”, EP Application No. 13859280.3, Sep. 7, 2015, 6 pages.
“Extended European Search Report”, EP Application No. 13859406.4, Sep. 8, 2015, 6 pages.
“Final Office Action”, U.S. Appl. No. 13/647,479, Sep. 17, 2015, 11 pages.
“Final Office Action”, U.S. Appl. No. 13/653,218, Oct. 5, 2015, 16 pages.
“Final Office Action”, U.S. Appl. No. 13/689,541, Nov. 2, 2015, 21 pages.
“Foreign Office Action”, CN Application No. 201310065273.X, Oct. 28, 2015, 14 pages.
“Foreign Office Action”, CN Application No. 201310067592.4, Oct. 23, 2015, 12 Pages.
“Foreign Office Action”, CN Application No. 201310067622.1, Oct. 27, 2015, 14 pages.
“Foreign Office Action”, CN Application No. 201310067627.4, Sep. 28, 2015, 14 pages.
“Foreign Office Action”, CN Application No. 201310096345.7, Oct. 19, 2015, 16 Pages.
“Foreign Office Action”, CN Application No. 201310316114.2, Sep. 29, 2015, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/468,882, Nov. 13, 2015, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/470,951, Oct. 1, 2015, 29 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,393, Sep. 30, 2015, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/598,898, Oct. 23, 2015, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/780,228, Sep. 18, 2015, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/162,529, Sep. 18, 2015, 13 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,054, Sep. 25, 2015, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,376, Nov. 23, 2015, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,412, Nov. 20, 2015, 10 pages.
“Notice of Allowance”, U.S. Appl. No. 13/525,070, Sep. 25, 2015, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/852,848, Nov. 19, 2015, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 14/727,001, Oct. 2, 2015, 4 pages.
“Restriction Requirement”, U.S. Appl. No. 13/891,109, Sep. 22, 2015, 6 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/468,949, Sep. 14, 2015, 2 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/471,054, Nov. 19, 2015, 2 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/525,070, Oct. 19, 2015, 2 pages.
“Supplementary European Search Report”, EP Application No. 13728568.0, Oct. 30, 2015, 7 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/471,054, Jan. 11, 2016, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/525,070, Jan. 13, 2016, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/527,263, Jan. 4, 2016, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/527,263, Jan. 11, 2016, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/727,001, Dec. 15, 2015, 2 pages.
“Extended European Search Report”, EP Application No. 13857958.6, Dec. 18, 2015, 8 pages.
“Extended European Search Report”, EP Application No. 13858283.8, Nov. 23, 2015, 10 pages.
“Extended European Search Report”, EP Application No. 13858397.6, Nov. 30, 2015, 7 pages.
“Extended European Search Report”, EP Application No. 13858674.8, Nov. 27, 2015, 6 pages.
“Extended European Search Report”, EP Application No. 13860272.7, Dec. 14, 2015, 9 pages.
“Extended European Search Report”, EP Application No. 13860836.9, Nov. 27, 2015, 9 pages.
“Extended European Search Report”, EP Application No. 13861292.4, Nov. 23, 2015, 7 pages.
“Final Office Action”, U.S. Appl. No. 13/471,336, Dec. 10, 2015, 17 pages.
“Foreign Office Action”, CN Application No. 201310067373.6, Dec. 23, 2015, 15 Pages.
“Foreign Office Action”, CN Application No. 201310067429.8, Nov. 25, 2015, 12 Pages.
“Foreign Office Action”, CN Application No. 201310067631.0, Dec. 10, 2015, 11 Pages.
“Foreign Office Action”, CN Application No. 201310067641.4, Dec. 30, 2015, 12 Pages.
“Foreign Office Action”, CN Application No. 201310067808.7, Jan. 7, 2016, 6 Pages.
“Non-Final Office Action”, U.S. Appl. No. 13/492,232, Dec. 17, 2015, 11 pages.
“Notice of Allowance”, U.S. Appl. No. 13/527,263, Dec. 9, 2015, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 13/647,479, Jan. 14, 2016, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 14/727,001, Dec. 15, 2015, 2 pages.
“Restriction Requirement”, U.S. Appl. No. 14/794,182, Dec. 22, 2015, 6 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/471,412, Feb. 16, 2016, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/852,848, Jan. 29, 2016, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/727,001, Jan. 25, 2016, 2 pages.
“Final Office Action”, U.S. Appl. No. 14/225,250, Jan. 29, 2016, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/891,109, Jan. 29, 2016, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 13/371,725, Jan. 29, 2016, 10 pages.
Related Publications (1)
Number Date Country
20140043275 A1 Feb 2014 US
Provisional Applications (7)
Number Date Country
61606321 Mar 2012 US
61606301 Mar 2012 US
61606313 Mar 2012 US
61606333 Mar 2012 US
61613745 Mar 2012 US
61606336 Mar 2012 US
61607451 Mar 2012 US
Continuations (2)
Number Date Country
Parent 13651195 Oct 2012 US
Child 14059280 US
Parent 13471376 May 2012 US
Child 13651195 US