Optical stylus interaction

Information

  • Patent Grant
  • Patent Number
    9,354,748
  • Date Filed
    Monday, February 13, 2012
  • Date Issued
    Tuesday, May 31, 2016
Abstract
Optical stylus interaction techniques are described. In an implementation, a display of a computing device includes optical sensors capable of detecting images projected by a stylus. A stylus may include a projection system to project various images used to convey interaction information that may be interpreted by the computing device. Based on recognition of different projected images, a context for interaction of the stylus may be ascertained and corresponding operations may be performed by the computing device. This may include resolving a spatial position of the stylus relative to the display device as well as movement of the stylus and images that define various stylus-based gestures. In addition, the environment for optical stylus interaction enables a writing mode that emulates natural writing by mapping different projectable images to changes in pressure applied to the stylus when in contact with the display.
Description
BACKGROUND

One way in which a user may interact with a computing device is via a stylus. A stylus is a pen-like device that may facilitate digital handwriting and drawing as well as interactions with a touchscreen display. A stylus may be used as an alternative to direct input by a user's hand. Traditional styluses, though, typically rely upon near-surface sensors such as proximity or capacitive sensors to provide input. Accordingly, functionality provided by a traditional stylus is limited at distances beyond an inch or so from a computing device. Moreover, writing and drawing with a traditional stylus may feel unnatural because digital ink traces made via a stylus typically are not applied in a manner comparable to physical markings made by a pen, paintbrush, or other writing instrument.


SUMMARY

Optical stylus interaction techniques are described. In an implementation, a display of a computing device includes optical sensors capable of detecting images projected by a stylus. A stylus may be configured with a projection system to project various images used to convey interaction information that may be decoded and recognized by the computing device. Based on recognition of different projected images, a context for interaction of the stylus may be ascertained and corresponding operations may be performed by the computing device. The decoding may include resolving a spatial position of the stylus relative to the display device as well as movement of the stylus that defines various stylus-based gestures. In addition, the environment for optical stylus interaction enables a writing mode that emulates natural writing by mapping different available images to changes in pressure applied to the stylus when in contact with the display.


This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.





BRIEF DESCRIPTION OF THE DRAWINGS

The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items.



FIG. 1 is an illustration of an environment in an example implementation that is operable to employ optical stylus interaction techniques.



FIG. 2 illustrates an example system showing a stylus of FIG. 1 in greater detail.



FIG. 3 depicts an illustration of an example sequence of images to implement optical stylus interaction techniques.



FIG. 4 is a flow diagram depicting a procedure in an example implementation in which projected images are decoded to trigger corresponding actions.



FIG. 5 is a flow diagram depicting a procedure in an example implementation in which projected images are interpreted to determine a context for interaction with a computing device using a stylus.



FIG. 6 is a flow diagram depicting a procedure in an example implementation in which a stylus projects images to convey corresponding interaction information.



FIG. 7 is a flow diagram depicting a procedure in an example implementation in which a stylus projects images indicative of an activated mode.



FIG. 8 illustrates an example system and components of the system that can be employed to implement embodiments of the techniques described herein.





DETAILED DESCRIPTION

Overview


Conventional styluses have limited functionality when used at a distance from a display device of a computing device. In addition, digital ink traces made by a traditional stylus are not applied in a manner comparable to physical markings made by a pen, paintbrush, or other writing instrument, which may make writing with a traditional stylus feel unnatural.


Optical stylus interaction techniques are described. In an implementation, a display of a computing device includes optical sensors capable of detecting images projected by a stylus. A stylus may be configured with a projection system to project various images used to convey interaction information that may be decoded and recognized by the computing device. Based on interpretation of different projected images, a context for interaction of the stylus may be ascertained and corresponding operations may be performed by the computing device. This may include resolving a spatial position of the stylus relative to the display device as well as movement of the stylus and/or the images that define various stylus-based gestures. The spatial position may be determined for any or all of six degrees of freedom of the stylus. Thus, a variety of different kinds of stylus-based gestures may be enabled and the stylus may be used for cursor control, gesture input, and other control functions at some distance away from the display surface.


In addition, the environment described for optical stylus interaction enables a writing mode that emulates natural writing by mapping different interpretable images to changes in pressure applied to the stylus when in contact with the display. For instance, a stylus may include a pressure switch in its tip that measures discrete pressure levels applied to the tip. The stylus may be configured to project different images for different pressure levels. Accordingly, the computing device may adjust attributes of digital ink traces in response to writing pressure changes in a manner that emulates natural writing.
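

To make the mapping concrete, the following Python sketch shows one way a device-side handler could translate a decoded pressure level into ink-trace attributes. The function name, level count, and attribute ranges are illustrative assumptions rather than details taken from the described implementation.

    # Hypothetical device-side helper: translate the pressure level decoded
    # from the projected image into attributes for the digital ink trace.
    PRESSURE_LEVELS = 7  # e.g., one projectable image per discrete level

    def ink_attributes(level):
        """Map a decoded pressure level (0 = lightest) to stroke attributes."""
        t = level / (PRESSURE_LEVELS - 1)     # normalize to the range 0..1
        return {
            "width_px": 1.0 + 4.0 * t,        # heavier pressure -> thicker trace
            "opacity": 0.4 + 0.6 * t,         # heavier pressure -> darker trace
        }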


In the following discussion, an example operating environment is first described that is operable to employ various optical stylus interaction techniques described herein. In the course of discussing the example operating environment, some implementation details regarding an example optically enabled stylus are also discussed. Example procedures involving the various techniques are then described, which may be employed in the example environment as well as in other environments. Accordingly, the example environment is not limited to performing the example procedures. Likewise, the example procedures are not limited to implementation in the example environment. Thereafter, an example system and components of the system suitable to implement optical stylus interaction techniques in accordance with one or more embodiments are described.


Example Operating Environment



FIG. 1 is an illustration of an example operating environment 100 that is operable to employ optical stylus interaction techniques described herein. The operating environment includes a computing device 102 having a processing system 104 and computer-readable media 106 that is representative of various different types and combinations of media, memory, and storage components and/or devices that may be associated with a computing device. The computing device 102 is further illustrated as including an operating system 108 and one or more device applications 110 that may reside on the computer-readable media (as shown), may be implemented at least partially by one or more hardware elements, and/or may be executed via the processing system 104. Computer-readable media 106 may include both “computer-readable storage media” and “communication media,” examples of which can be found in the discussion of the example computing system of FIG. 8. The computing device 102 may be configured as any suitable computing system and/or device that employs various processing systems 104, examples of which are also discussed in relation to the example computing system of FIG. 8.


The computing device 102 is also illustrated as including a display device 112 and an input/output module 114. The display device 112 may be configured as a touchscreen to enable touchscreen and gesture functionality. In some embodiments, the display device 112 is configured to include a variety of sensors configured to sense different kinds of interaction with the display device 112. For example, the display device 112 may be configured with optical sensors (e.g., a camera array, photo sensors, complementary metal-oxide-semiconductor (CMOS) sensors, charge-coupled device (CCD) imaging sensors, infrared sensors, light detecting LEDs, photodiodes, and/or other optical sensors) through which optical interaction with the display device 112 may be detected and processed. In addition or alternatively, a display device 112 may also include capacitive sensors, thermal sensors, pressure sensors, proximity sensors and the like for detection of touch input, gestures, and other interaction with the display. In at least some embodiments, the display device 112 is configured as a Sensor-In-Pixel (SIP) panel for which sensors may be associated with respective individual pixels or groups of pixels of the display device 112. A SIP panel may sense differences in light incident upon (emanating from) its surface using an array of sensor elements within the layers of the panel. Illumination may transmit through the panel and reflect off objects or images at or near the surface thereby providing a mechanism for generating optical signals that may be used to detect the objects or images.
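

As a rough illustration of how optical sensing of a projection might look in software, the sketch below compares a sensor frame against a baseline to find pixels whose illumination changed. It assumes frames are available as NumPy arrays and uses an arbitrary threshold; it is not the panel's actual detection pipeline.

    import numpy as np

    def detect_projection(frame, baseline, threshold=0.15):
        """Return a mask of pixels whose illumination differs enough from the
        baseline to be attributed to a projected image, or None if none do."""
        delta = np.abs(frame.astype(float) - baseline.astype(float))
        mask = delta > threshold * float(baseline.max())
        return mask if mask.any() else None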


The input/output module 114 is representative of functionality to identify various kinds of inputs and cause operations to be performed that correspond to the inputs. Inputs identifiable/interpretable by the input/output module 114 include touch, gestures, stylus input, and/or optical input that is detected via the display device 112 as well as keystrokes, mouse input, motion of the device captured via inertial sensors, input and gestures detected by a camera of the device, and/or operation of controls such as physical buttons, switches, levers, and keys provided by a device. For example, the input/output module 114 may be configured to recognize a touch input, such as a finger of a user's hand 116 as proximal to a display device 112 of the computing device 102 using touchscreen functionality.


Various input may also be recognized by the input/output module 114 as including attributes (e.g., movement, selection point, size, etc.) that are usable to differentiate between different inputs recognized by the input/output module 114. This differentiation may then serve as a basis to identify a gesture from the inputs and consequently an operation that is to be performed based on identification of the gesture. A variety of different types of gestures may be recognized by the input/output module 114, such as gestures that are recognized from a single type of input (e.g., touch gestures such as a drag-and-drop gesture) as well as gestures involving multiple types and combinations of inputs. The input/output module 114 may further be configured to detect stylus input and/or stylus gestures provided by a stylus 118 including but not limited to optical interaction of the stylus 118 with the display device 112.


To handle optical interactions, the input/output module 114 may include or otherwise make use of an image decoder module 120 that represents functionality of the computing device 102 to perform processing to decode images projected onto the display device 112 by a suitably configured stylus 118 (or other optically enabled device). Thus, in at least some embodiments, a stylus is configured to enable optical interaction with the display device 112. For instance, the example stylus 118 is depicted as including a projection system 122 that may be used to project various images 124. For example, the projection system 122 may be configured as a laser projection system that projects holographs provided by optical elements, or as a conventional lens projection system that uses a spatial light modulator (SLM), a liquid crystal display (LCD) array, and so forth. Images 124 that are projected to the display device 112 may be detected and recognized by the computing device 102. Based on recognition of different images, a context for interaction of the stylus 118 may be ascertained and the image decoder module 120 may cause operations/actions corresponding to the context to be performed by the computing device. The decoding may include resolving a spatial position of the stylus 118 relative to the display device based on processing of the images 124 as well as movement of the stylus and/or the images that define various stylus-based gestures.


Further, different modes of operation for a stylus may be conveyed using different images 124 that are projected and recognized via the display device 112. A mode manager module 126 of the stylus represents functionality operable to cause different images to be projected for different modes such as a hover mode (e.g., cursor control mode), pointer mode, writing mode, gesture mode, game mode, and so forth. Different modes may be triggered and controlled by a mode switch of the stylus, such as a mode trigger button, a selector switch, a pressure switch in the stylus tip, and so forth. In one example, a writing mode that emulates pressure sensitive writing is implemented by using different images that are mapped to discrete pressure levels that may be measured using a pressure switch in the stylus tip or otherwise.


Additionally, different styluses may be configured to project different respective images that may be configured as or include identifiers for the different styluses. Accordingly, different identifying images and/or identifiers provided as part of an image may be employed to determine stylus identity and differentiate between multiple styluses and corresponding input. Thus, using optical techniques described above and below, various information regarding a stylus may be conveyed from the stylus to a device including position, spatial orientation, mode of operation, stylus identification, movement, input commands, gestures, and so forth. Details regarding these and other aspects of optical stylus interaction techniques are described in relation to the following figures.



FIG. 2 is a schematic illustration showing implementation details of an example stylus, generally at 200. In the depicted example, the stylus 118 includes a projection system 122 that may be configured in various ways to project images 124 for detection and interpretation by a computing device 102. As mentioned, in at least some embodiments, an image 124 that is projected onto a display device 112 having optical sensors (such as a Sensor-In-Pixel (SIP) panel) can be optically detected based upon differences in illumination at the surface of the display. As represented in FIG. 2, the projection system 122 may include a light source 202 and one or more image elements 204 that can be employed to form a projection 206 of an image 124. Images 124 may include holographic images, digital images, glyphs, icons, patterns, pictures, arc and line combinations, and/or other suitable images that may be projected via a projection system 122 of a stylus 118.


A variety of different types of light sources are contemplated including laser diode sources, light emitting diode (LED) sources, an LCD array, SLMs, and so forth. The image elements 204 are representative of various components and techniques that may be employed to capture, contain, record, or otherwise embody image information/data used by the projection system 122 to project suitable images and/or patterns. For example, a beam 208 from the light source 202 may be used to illuminate different image elements 204 and form a corresponding projection 206 of the image 124. As described in greater detail below, one or more beams from one or more light sources may be configured to scan across different image elements 204 to produce different corresponding images for detection by the display in different scenarios. In another example, image elements 204 represent digital patterns, graphics, or spatial light modulators that may be projected via the projection system 122 using digital image projection techniques. Different images projected by a stylus are interpretable by a target computing device to recognize interaction information for the stylus conveyed by the images including but not limited to information such as a spatial position of the stylus, an activated mode for the stylus, a gesture produced by manipulation of the stylus, and/or an identity of the stylus. Thus, interpretation of projected images by the target computing device (e.g., image decoder module 120) enables the device to selectively perform corresponding operations and actions in response to the optical input from the stylus.


In one particular example, the projection system 122 is configured as a laser projection system that employs laser sources and diffractive optical elements (DOEs) to project holographic images recorded via the DOEs. For instance, image elements 204 may be configured as DOEs having image information used to reproduce corresponding holographs when appropriately illuminated via the light source 202. Accordingly, the light source 202 may be a laser, such as a vertical-cavity surface-emitting laser (VCSEL) or other suitable semiconductor laser diode.


In another example, the projection system 122 may be configured to form projections via an image forming device and a lens system that may be arranged in various ways. For instance, the image forming device represents functionality and devices used to form an image including a spatial light modulator (SLM), an LCD array, and/or a digital image processor, and so forth. The image forming device may supply a formed image to the lens system for projection onto the target surface. The lens system therefore represents functionality to direct image content to a target. For instance, an SLM may be configured to provide light gating of a dot or pixel pattern for an image to control output of the image with content directed to a surface by a corresponding lens system. A variety of other configurations are also contemplated.


Other types of projection systems are also contemplated, including laser systems, LED systems, and typical digital or analog projection systems suitable to project images from a stylus to a target. In at least some embodiments, the projected images are invisible to the human eye, such as infrared (IR) projections, although visible projections and images may also be employed.


The stylus 118 further includes a stylus tip 210 that facilitates touch interaction and touch gestures through contact of the stylus tip 210 with the display device. As mentioned, different modes of operation for a stylus may also be conveyed using different projected images. To facilitate mode switching by the mode manager module 126, the stylus 118 may include at least one mode switch, examples of which include a mode button 212 and/or a pressure switch 214 as depicted in FIG. 2. Other suitable components and mechanisms configured to toggle between different modes are also contemplated, such as a sliding switch, a touch-sensitive button, and a stylus-generated gesture, to name a few examples.


In general, the projection system 122 enables optical interaction with a device via projection of images 124 that are detected in some manner by the device. By so doing, interaction information regarding the stylus can be communicated via an optical channel between the stylus and the computing device. This may occur without establishing radio frequency (RF) connections or involving RF communication between the stylus and computing device. Thus, optical channels for conveying stylus interaction information may be used independently of and/or without RF communications between the stylus and computing device.


Interaction information can include information regarding the spatial orientation and location (e.g., spatial position) of the stylus. By decoding projected images, a computing device may be able to resolve six degrees of freedom for the stylus. This includes determination of x, y, and z coordinate positions of the stylus relative to the display and angular/rotational position of the stylus about the x, y, and z axes. For the example coordinate system, the x-y plane is defined by the display surface and the z-axis perpendicular to the display surface defines height relative to the x-y plane.


In particular, an image decoder module 120 may be configured to ascertain spatial orientation and location (e.g. spatial position) of the stylus based upon orientation, size, distortion, and/or size relationships of elements in projected images. Naturally, the point at which the image is projected onto a display device (e.g., intersection of the optical z-axis along which the image is projected with the display surface) may be employed to determine the “cursor” position (e.g., x and y coordinates) of the stylus. A height of the stylus above (or relative to) the display (e.g., z-height) may be determined based upon size calculations for projected images. Orientation of asymmetric elements of an image may be analyzed to identify a rotational position around the “optical” z-axis of the stylus, which is also referred to as “clocking” and/or a “clock” position of the stylus. Likewise, angular orientation of the stylus on the x-axis and y-axis may be determined based upon processing of asymmetric elements, image size, size relationships, and/or distortions of various image elements that occur as the angles of image projection change. For instance, ratios of different arc elements of a projected image may be used to compute corresponding angles of image projection.
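

The following sketch illustrates the kind of geometry involved, assuming for simplicity that the stylus points straight down at the display and that the projection cone's half-angle is known by design. The names and the simple size-to-height calculation are illustrative, not the patented decoding method.

    import math

    def decode_pose(centroid_xy, image_radius_px, fiducial_xy, half_angle_rad):
        """Recover cursor position, z-height, and clock angle from one image."""
        x, y = centroid_xy                    # cursor position on the display
        # The projection cone widens with distance, so apparent size gives height.
        z = image_radius_px / math.tan(half_angle_rad)
        # The fiducial's direction relative to the image center gives clocking.
        fx, fy = fiducial_xy
        clock = math.atan2(fy - y, fx - x)
        return {"x": x, "y": y, "z": z, "clock_rad": clock}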


Thus, image projections that convey stylus interaction information and decoding of such images as described herein enable an optical interaction environment for a device that can support different operational modes, commands, and stylus gestures, some examples of which are discussed in this document. A variety of different configurations and operational modes are contemplated to create a stylus suitable to implement optical stylus interaction techniques described herein. For example, styluses may range from relatively simple styluses designed to project a single image to more complex styluses designed to project multiple images, beams, and/or sequences of images. One or multiple lasers and/or other light sources may be employed for different stylus configurations.


Different operational modes may generally include at least a hover mode (e.g., cursor control mode) for interaction with a device from a distance and a writing mode that may be activated to apply digital ink traces for writing, painting, and drawing in some scenarios. In hover mode, the stylus may perform control functions through optical interaction to manipulate operations of a device including controlling applications and user interfaces of the device. Some examples of control functions include controlling a cursor, menu navigation, input of stylus-based gestures, and so forth. The control functions may be determined at least in part based upon decoding of projected images and corresponding spatial positions and/or gestures. For instance, movement of the stylus between successive spatial positions may define a gesture and/or trigger a corresponding control function associated with the movement/gesture. Thus, the stylus in hover mode acts as sort of a “magic wand” that a user may point, wave, flick, rotate, and otherwise manipulate to cause corresponding actions by a computing device based upon decoding of one or more images projected from the stylus in response to manipulation of the stylus. The hover mode may be supported in a defined zone extending out from the display surface and at a distance from the display. For example, hover mode and corresponding optical interactions may be enabled at a range of distances from an inch or so above the display surface to several feet from the display, and even at greater distances, such as from across a room.


At or near the surface of a display, a writing mode may be activated in response to projection of corresponding images that trigger the writing mode. For instance, detection of a particular image for the writing mode via optical sensors of a display may trigger writing mode. It should be noted that optical detection techniques may be used alone or in conjunction with other techniques. For example, proximity sensors, capacitive sensors, and the like may be combined with optical sensing techniques to detect positioning of the stylus tip at or near the display surface and/or to toggle between modes accordingly. Writing mode may also be activated when the stylus is positioned at a distance from the display in some embodiments. For instance, writing mode may be triggered responsive to operation of a mode switch that causes images for the writing mode to be directed to the display. Generally, in writing mode digital ink traces are applied to emulate writing, drawing, painting, and so forth. Writing mode may also emulate natural pressure-sensitive writing as described in greater detail below.


To further illustrate, consider now a discussion of a few example configurations for a stylus in accordance with one or more embodiments. In one approach, a stylus may be configured to project a single fixed image/pattern. Such a fixed pattern system may be implemented using a single DOE and laser diode configured to project the single fixed image/pattern. Here, the single fixed image may be used to determine any or all of the six degrees of freedom (6DOF) of the stylus as mentioned previously. Toggling projection of the image on and off via a mode switch may also be employed to switch between different modes. For instance, a mode button 212 may be selectively depressed to toggle selected stylus modes on or off in a binary fashion. In one example, a mode button 212 may be configured to toggle between hover mode and writing mode.


In another configuration, a stylus may be designed to project multiple different images that may be used to convey different kinds of stylus interaction information. By way of example, one image projection may be used to convey spatial position while a separate projection is used to convey mode information. Other kinds of projections are also contemplated, such as separate projections used to convey a stylus identification image and/or to implement a visible beam/image for a pointer function. Multiple images that are used for different purposes may be projected concurrently as well as at different times, such as by using different respective light sources, elements, and beams to form multiple different image projection streams. Thus, multiple independent image projection streams may be employed in some scenarios.


In another approach, the same image projection stream may be configured to project different images at different times. By way of example and not limitation, one image projection stream may be used to project a stylus identification image during initialization/calibration of the stylus, one or more images suitable for resolving spatial orientation (e.g., 6DOF) in a hover mode, and/or one or more images suitable to convey different writing/drawing attributes in a writing mode. This may occur by illuminating different image elements 204 that encode different images at different times. In particular, the mode manager module 126 may operate to cause the projection system 122 to form different images 124 in different contexts and/or in response to mode changes determined via a mode switch or otherwise.


In at least some embodiments, a pressure switch 214 may also be employed to trigger a writing mode based at least in part upon contact of a stylus tip 210 with a display. The pressure switch 214 represents functionality to enable measurement of different pressure levels applied to the stylus tip 210. For example, a resistive pressure transducer may be employed to sense pressure applied to the tip. In another approach, pressure may be sensed optically using a prism or photodiode to sense variable amounts of light that are mapped to different pressure levels. Further, different images and/or corresponding modes of operation may be mapped to the different pressure levels. Thus, contact of stylus tip 210 may cause pressure measurement by the pressure switch 214 that activates the writing mode through projection of corresponding images. A sequence of images corresponding to different writing pressure levels may be projected as the measured pressure level changes. In this manner, attributes of digital ink traces applied for writing, painting, and/or drawing may be adjusted to respond in a natural, pressure-sensitive way to changes in pressure on the stylus tip 210.
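

On the stylus side, the selection could be as simple as quantizing the raw pressure reading into the number of available images, as in the sketch below (a hypothetical helper; the pressure units and image handles are placeholders).

    def select_pressure_image(raw_pressure, max_pressure, images):
        """Quantize a tip-pressure reading and pick the image mapped to that level."""
        levels = len(images)
        level = int(raw_pressure / max_pressure * levels)
        level = max(0, min(level, levels - 1))   # clamp to the valid range
        return images[level]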


In particular, digital ink traces generally may be adjusted to become darker and/or thicker as pressure placed on the stylus tip increases to emulate the way in which physical markings change with increased pressure on a pen, paintbrush, or other instrument. In other words, pressure sensitive writing is emulated by adjusting digital ink traces applied via the stylus to match a pressure level defined by a corresponding projected image. Other writing/drawing attributes may also be conveyed in a comparable manner by mapping the attributes to different images and patterns. By way of example and not limitation, selections of drawing tools, font types, colors, line styles, brush/pen type, and other attributes may be conveyed by projection of images that are mapped to the attributes. In at least some embodiments selections of various attributes may be accomplished using corresponding stylus-based gestures.


More generally, the environment described herein for optical stylus interaction enables definition and recognition of various stylus-based gestures. This may include gestures for selection of different writing/drawing attributes noted above as well as other gestures for device control, content navigation, user interface manipulation, and so forth. In addition, different gestures may be implemented in different modes. Gestures may be defined based upon measurement of spatial position of the stylus and/or sequential changes in the spatial position that are ascertained using optically conveyed information.


By way of example and not limitation, in hover mode a twisting or rotating gesture in which the stylus is rotated around the z-axis may drive forward and back navigation between different pages for a browser, presentation, or electronic book; a wrist flick gesture may be configured to select an item and/or open an item; waving the stylus up/down or left/right may cause a scrolling-like response; moving the stylus up and down in the z-direction may cause zooming in and out, and so forth. In writing mode, a twisting or rotating gesture may facilitate selection of different writing/drawing attributes, a wrist flick gesture may cause a spattering pattern of ink traces to appear (like spattering paint), a sweeping gesture may operate an erase function, and so forth. Thus, a variety of stylus-based gestures may be implemented, of which the gestures enumerated above are but a few illustrative examples.
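

One way to organize such mappings is a per-mode dispatch table, sketched below. The gesture and action names mirror the examples above but are otherwise illustrative assumptions.

    GESTURE_ACTIONS = {
        "hover": {
            "rotate_z": "navigate_pages",      # twist to page forward/back
            "wrist_flick": "open_item",
            "wave": "scroll",
            "move_z": "zoom",
        },
        "writing": {
            "rotate_z": "cycle_drawing_attribute",
            "wrist_flick": "spatter_ink",
            "sweep": "erase",
        },
    }

    def handle_gesture(mode, gesture):
        """Look up the action, if any, mapped to a gesture in the active mode."""
        return GESTURE_ACTIONS.get(mode, {}).get(gesture)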


In some cases, a display device 112 may be configured to employ some form of ambient light cancellation. In such scenarios, an image projected by a stylus 118 runs the risk of being interpreted as ambient light and canceled out. As such, a projection system 122 may be configured to account for ambient light cancellation and adjust projections accordingly. This may include configuring image projections for particular wavelengths or ranges that are not canceled and/or pulsing projections in sync with a display. For instance, illumination light for the display may be pulsed in order to differentiate between local light and ambient light and cancel out ambient light. Accordingly, to avoid interpretation of image projections as ambient light, a projection system 122 may be configured to include a photonic detector/receiver, such as a photodiode, that may be used to detect pulsing and sync projections with pulsed illumination light used for a display. Pulsing projections in this manner may additionally serve to improve battery life since projections may occur intermittently at short time intervals in which sensors of the display are integrating the image content.
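

A simplified control loop for such synchronization is sketched below. The photodiode and projector interfaces are assumed objects standing in for stylus hardware; the timing value is arbitrary.

    def project_in_sync(photodiode, projector, frame_count):
        """Emit the projection only while the display's own illumination pulse
        is active, so the image is not removed as ambient light."""
        for _ in range(frame_count):
            photodiode.wait_for_rising_edge()   # display illumination pulse begins
            projector.emit(duration_ms=1)       # project while sensors integrate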



FIG. 3 depicts generally at 300 an example sequence of images that may be mapped to different pressure levels to implement pressure-sensitive writing. Naturally, different images may be mapped to other writing/drawing attributes as well as different operational modes in a comparable manner. In this example, the image projected by a stylus is configured to change based on changes in writing pressure applied to the stylus.


In particular, scanning of a beam 208 across a series of image elements 204 is represented by the series of different images shown in FIG. 3. As mentioned, the image elements 204 may be configured as DOEs encoding respective holograms that may be illuminated by a suitable laser source. Other equivalent imaging techniques may also be employed. The beam location 302 of the beam 208 may be controlled and adjusted based upon changes in pressure detected via a pressure switch 214 (or other mode switch input). This causes the beam to illuminate different elements and/or output different projected images 304 for different pressure levels. In this example, four image elements 204 are employed to form seven different images labeled by letters A through G. Here, when the beam intersects boundaries of two elements, content from both of the two elements may be illuminated, thus giving the seven different example images. Relatively simple patterns, glyphs, icons, or other images may be employed to convey different levels or modes. Additionally or alternatively, images may also be configured to contain asymmetric content and combinations of arcs and lines as mentioned previously to facilitate decoding of spatial position in appropriate situations.
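

The arithmetic behind the seven images can be summarized in a few lines: beam positions centered on an element illuminate one element, while positions on a boundary illuminate two adjacent elements. The sketch below uses illustrative element labels.

    ELEMENTS = ["E1", "E2", "E3", "E4"]   # four image elements (e.g., DOEs)

    def illuminated_elements(beam_position):
        """Map a beam position 0..6 to the element(s) it illuminates."""
        if beam_position % 2 == 0:        # centered on a single element
            return [ELEMENTS[beam_position // 2]]
        left = beam_position // 2         # straddling a boundary
        return [ELEMENTS[left], ELEMENTS[left + 1]]

    # Positions 0..6 yield: [E1], [E1, E2], [E2], [E2, E3], [E3], [E3, E4], [E4],
    # i.e., the seven images labeled A through G in FIG. 3.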


Although four image elements 204 are shown in this example, a fewer or greater number of image elements 204 may be employed in different scenarios. Generally, the number of different elements and therefore the number of levels or modes that may be accommodated by an image sequence is dependent upon the size of the holograms and the beam footprint as well as consideration of the range of pressure input (or other mode switch input) and practical constraints on stylus size.


Having considered an example environment and details for an optically enabled stylus, consider now some example procedures for optical stylus interaction in accordance with one or more embodiments.


Example Procedures


The following discussion describes optical stylus interaction techniques that may be implemented utilizing the previously described systems and devices. Aspects of each of the procedures may be implemented in hardware, firmware, or software, or a combination thereof. The procedures are shown as a set of blocks that specify operations performed by one or more devices and are not necessarily limited to the orders shown for performing the operations by the respective blocks. In portions of the following discussion, reference may be made to the example operating environment of FIG. 1 and example stylus details discussed in relation to FIGS. 2 and 3.



FIG. 4 depicts a procedure 400 in an example implementation in which optical techniques are used to resolve a spatial position of a stylus. One or more images that are projected onto a display using a stylus are detected (block 402). For example, a SIP panel display or other display device 112 having optical sensors may detect images produced via a projection system 122 of a stylus 118 via an input/output module 114 or otherwise. Various different kinds of images 124 may be projected in different scenarios. Optical sensors generally detect the images projected onto the display by measuring changes in illumination caused by the image projection.


The one or more images are decoded to resolve a spatial position of the stylus relative to the display device (block 404). For example, the input/output module 114 may include or make use of an image decoder module 120 to decode detected images. Here, the image decoder module 120 is configured to analyze image attributes (e.g., size, position, distortion, rotation, and so forth) to ascertain the spatial position of the stylus that corresponds to the projected image. The image decoder module 120 may resolve spatial position of the stylus for six degrees of freedom as discussed in relation to FIG. 2. In general, optical stylus interaction techniques including detecting and decoding of images by a computing device are based upon optical projections of the one or more images by the stylus. Such optical projections do not rely upon or involve radio frequency (RF) communications between the stylus and target device.


Certain features of an image may be selected by design to facilitate resolving spatial position of the stylus for six degrees of freedom. By way of example, the origin, pointing, and angular extent of the projected image may be known by design, or may be assessed by measurement. Accordingly, projected content may be designed to contain circular and/or annular elements that, when projected, may be analyzed relative to a known pattern for the origin, pointing, and angular content/extent of the image to recognize changes as the position of the stylus changes. Likewise, changes in the size of the image may also be detected. Changes in image size and various image elements can be used to determine the z-height of the stylus.


Additionally, line width around the perimeter or borders of a projected image may be used to ascertain an angular direction pointing back to a stylus. In turn, the angular direction can be employed to determine x-y position of the stylus in relation to the display surface. Line width variations may be included around the border/perimeter of an image to facilitate position assessment and avoid ambiguities. A circular or annular image with line width variation may be designed to enable resolution of five degrees of freedom. Further, addition of a fiducial marker and/or asymmetric elements within the projected image, such as a line perpendicular to the perimeter, or a dot, enables assessment of rotational or ‘clocking’ position. Therefore, generally speaking, an image that combines circular or annular elements with line width variation and a fiducial marker can enable resolution of six degrees of freedom for the stylus.


One or more actions are performed corresponding to the spatial position that is resolved by decoding of the one or more images (block 406). For example, the image decoder module 120 may provide information on spatial position that may be used by the input/output module 114 to recognize various stylus gestures, some examples of which were described previously. Accordingly, actions corresponding to a recognized stylus gesture may be performed. For instance, movement of the stylus to different spatial positions may be detected and used for cursor control and/or to recognize gestures. A particular recognized gesture, such as rotating or “clocking” the stylus around the z-axis, may be detected based on successive spatial positions of the stylus. For instance, gestures may be recognized by determining relative changes in orientation of the projected content and/or a portion of the projected content based on a fiducial marker or asymmetric elements. The particular recognized gesture may cause a corresponding action, such as turning a page or navigating a user interface menu. Position information regarding the stylus may also be used to drive changes between different operational modes.
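

For example, a clocking gesture could be recognized from the change in decoded clock angle across successive poses, as in the sketch below (the threshold and field names are illustrative; angle wrap-around is ignored for brevity).

    def detect_rotation_gesture(poses, min_rotation_rad=1.0):
        """poses: decoded stylus poses with a 'clock_rad' field, oldest first."""
        if len(poses) < 2:
            return None
        total = poses[-1]["clock_rad"] - poses[0]["clock_rad"]
        if total >= min_rotation_rad:
            return "rotate_positive"
        if total <= -min_rotation_rad:
            return "rotate_negative"
        return None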



FIG. 5 depicts a procedure 500 in an example implementation in which images projected by a stylus are recognized to drive corresponding actions by a device. One or more images projected onto a display device via a stylus are recognized (block 502). As before, this may include a wide variety of images 124 used to convey different interaction information as described in relation to FIGS. 1-3. For example, different images may be projected to convey a stylus identifier, spatial position, operational modes, writing pressure levels, and so forth. An image decoder module 120 or other comparable functionality of a device may operate to decode the images.


A context for interaction with a computing device using the stylus is ascertained based on the recognition of the one or more images (block 504) and actions associated with the ascertained context are performed (block 506). For example, different images may be indicative of gestures, commands, stylus position and movement, and other interaction conducted with the stylus. The images may be mapped to corresponding actions to be performed when the images or stylus-based gestures are recognized. A context for interaction of the stylus may be defined based on stylus information such as a mode for the stylus, the particular image projected, current spatial position of the stylus, and movement of the stylus. The context may also include device information such as a cursor position, an active interface and/or application, item or window focus/selection, and so forth. Thus, a particular image that is projected may represent a corresponding context that can be identified based on recognition of the particular image. Based on the ascertained context, the input/output module 114 or other comparable functionality of a device may determine appropriate responsive actions to take, such as cursor movement, action associated with a gesture, a switch between modes, navigation functions, launching or closing an application, and so forth.
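

A minimal sketch of assembling such a context and resolving it to an action follows; the field names and table keys are assumptions for illustration.

    def ascertain_context(decoded, device_state):
        """Combine stylus-derived information with device information."""
        return {
            "mode": decoded.get("mode"),           # e.g., "hover" or "writing"
            "gesture": decoded.get("gesture"),     # e.g., "rotate_z", or None
            "pose": decoded.get("pose"),           # spatial position / movement
            "focus": device_state.get("focus"),    # active window or application
        }

    def actions_for_context(context, action_table):
        """Look up the action(s) mapped to the ascertained context."""
        return action_table.get((context["mode"], context["gesture"]), [])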



FIG. 6 depicts a procedure 600 in an example implementation in which a stylus projects images to convey interaction information regarding the stylus. Images available for projection by a stylus are mapped to different states for a stylus (block 602). For example, different images 124 and/or patterns that may be projected by a stylus 118 may be mapped to different states including different pressure levels, different modes, and/or different spatial positions as discussed previously. A mode manager module 126 of a stylus may include a table, database, file or other suitable mapping information to map the different states to corresponding images.


A current state of the different states for the stylus is determined (block 604). For instance, the mode manager module 126 of the stylus may obtain information regarding the current state from various mode switches and/or other sensors of the stylus. This enables the mode manager module 126 to compute the current state and cause projection of an image corresponding to the current state.


An image indicative of the current state is projected by the stylus for decoding at a computing device to ascertain the current state (block 606). Here, the mode manager module 126 may reference mapping information to look up an image that is to be projected based on the current state or otherwise select between different images that a stylus is capable of projecting based on the current state. The mode manager module 126 may then cause the projection system of the stylus to project an appropriate image. Once projected, the image may be interpreted by a target computing device as described previously and corresponding operations and actions may be triggered.
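

Blocks 602 through 606 could be realized with a small lookup table on the stylus, as sketched below. The state keys, image handles, and projector interface are hypothetical.

    STATE_TO_IMAGE = {
        ("identify", None): "stylus_id_image",
        ("hover", None): "hover_image",
        ("writing", 0): "pressure_level_0_image",
        ("writing", 1): "pressure_level_1_image",
        # further writing-pressure levels would follow the same pattern
    }

    def project_current_state(mode, pressure_level, projector):
        """Select the image mapped to the current state and project it."""
        image = STATE_TO_IMAGE.get((mode, pressure_level))
        if image is not None:
            projector.project(image)   # assumed projection-system interface
        return image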



FIG. 7 depicts a procedure 700 in an example implementation in which a stylus projects an image indicative of an active mode. Activation of a mode for a stylus is detected (block 702). The detection may occur in any suitable way. For example, activation of a mode may be detected by a mode manager module 126 based on operation of a mode switch as described previously. In another example, a mode (such as hover mode or writing mode) may be triggered based upon a gesture or spatial positioning of a stylus. Additionally, a mode (such as a stylus identification mode) may be triggered by an initialization sequence when a stylus is powered on. Detectable modes may also include different writing pressure modes corresponding to discrete pressure levels measured by a pressure switch.


An image indicative of the activated mode is projected from the stylus for decoding by a computing device to ascertain the active mode (block 704). In other words, an image corresponding to the activated mode is projected for detection and processing by the computing device 102. For instance, a stylus identification image incorporating an identifier may be used to identify a particular stylus and/or distinguish between different styluses in a stylus identification mode. Images mapped to pressure levels may be projected in a writing mode as described previously. In addition or alternatively, images suitable to resolve spatial positioning, control a cursor, and/or recognize gestures may be projected for activated modes as appropriate. Thus, the computing device 102 is able to determine the mode through optically conveyed information and take appropriate responsive action to toggle between modes, perform commands and operations designated by the optically conveyed information, and so forth.


Having considered some example procedures for optical stylus interaction techniques, consider now a discussion of an example system and components of the system that may be employed to implement aspects of the described techniques in one or more embodiments.


Example System



FIG. 8 illustrates an example system 800 that includes an example computing device 802 that is representative of one or more computing systems and/or devices that may implement the various techniques described herein. The computing device 802 may be, for example, a server of a service provider, a device associated with the client (e.g., a client device), an on-chip system, and/or any other suitable computing device or computing system.


The example computing device 802 as illustrated includes a processing system 804, one or more computer-readable media 806, and one or more I/O interfaces 808 that are communicatively coupled, one to another. Although not shown, the computing device 802 may further include a system bus or other data and command transfer system that couples the various components, one to another. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures. A variety of other examples are also contemplated, such as control and data lines.


The processing system 804 is representative of functionality to perform one or more operations using hardware. Accordingly, the processing system 804 is illustrated as including hardware elements 810 that may be configured as processors, functional blocks, and so forth. This may include implementation in hardware as an application specific integrated circuit or other logic device formed using one or more semiconductors. The hardware elements 810 are not limited by the materials from which they are formed or the processing mechanisms employed therein. For example, processors may be comprised of semiconductor(s) and/or transistors (e.g., electronic integrated circuits (ICs)). In such a context, processor-executable instructions may be electronically-executable instructions.


The computer-readable media 806 is illustrated as including memory/storage 812. The memory/storage 812 represents memory/storage capacity associated with one or more computer-readable media. The memory/storage 812 may include volatile media (such as random access memory (RAM)) and/or nonvolatile media (such as read only memory (ROM), Flash memory, optical disks, magnetic disks, and so forth). The memory/storage 812 may include fixed media (e.g., RAM, ROM, a fixed hard drive, and so on) as well as removable media (e.g., Flash memory, a removable hard drive, an optical disc, and so forth). The computer-readable media 806 may be configured in a variety of other ways as further described below.


Input/output interface(s) 808 are representative of functionality to allow a user to enter commands and information to computing device 802, and also allow information to be presented to the user and/or other components or devices using various input/output devices. Examples of input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone for voice input/control, a scanner, touch functionality (e.g., capacitive or other sensors that are configured to detect physical touch), a camera (e.g., which may employ visible or non-visible wavelengths such as infrared frequencies to detect movement that does not involve touch as gestures), and so forth. Examples of output devices include a display device (e.g., a monitor or projector), speakers, a printer, a network card, tactile-response device, and so forth. Thus, the computing device 802 may be configured in a variety of ways as further described below to support user interaction.


Various techniques may be described herein in the general context of software, hardware elements, or program modules. Generally, such modules include routines, programs, objects, elements, components, data structures, and so forth that perform particular tasks or implement particular abstract data types. The terms “module,” “functionality,” and “component” as used herein generally represent software, firmware, hardware, or a combination thereof. The features of the techniques described herein are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.


An implementation of the described modules and techniques may be stored on or transmitted across some form of computer-readable media. The computer-readable media may include a variety of media that may be accessed by the computing device 802. By way of example, and not limitation, computer-readable media may include “computer-readable storage media” and “communication media.”


“Computer-readable storage media” may refer to media and/or devices that enable persistent and/or non-transitory storage of information in contrast to mere signal transmission, carrier waves, or signals per se. Thus, computer-readable storage media refers to non-signal bearing media. The computer-readable storage media includes hardware such as volatile and non-volatile, removable and non-removable media and/or storage devices implemented in a method or technology suitable for storage of information such as computer readable instructions, data structures, program modules, logic elements/circuits, or other data. Examples of computer-readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, hard disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other storage device, tangible media, or article of manufacture suitable to store the desired information and which may be accessed by a computer.


“Communication media” may refer to a signal-bearing medium that is configured to transmit instructions to the hardware of the computing device 802, such as via a network. Communication media typically may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as carrier waves, data signals, or other transport mechanism. Communication media also include any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media.


As previously described, hardware elements 810 and computer-readable media 806 are representative of instructions, modules, programmable device logic and/or fixed device logic implemented in a hardware form that may be employed in some embodiments to implement at least some aspects of the techniques described herein. Hardware elements may include components of an integrated circuit or on-chip system, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a complex programmable logic device (CPLD), and other implementations in silicon or other hardware devices. In this context, a hardware element may operate as a processing device that performs program tasks defined by instructions, modules, and/or logic embodied by the hardware element as well as a hardware device utilized to store instructions for execution, e.g., the computer-readable storage media described previously.


Combinations of the foregoing may also be employed to implement various techniques and modules described herein. Accordingly, software, hardware, or program modules including device applications 110, input/output module 114, image decoder module 120, mode manager module 126 and other program modules may be implemented as one or more instructions and/or logic embodied on some form of computer-readable storage media and/or by one or more hardware elements 810. The computing device 802 may be configured to implement particular instructions and/or functions corresponding to the software and/or hardware modules. Accordingly, implementation of modules as a module that is executable by the computing device 802 as software may be achieved at least partially in hardware, e.g., through use of computer-readable storage media and/or hardware elements 810 of the processing system. The instructions and/or functions may be executable/operable by one or more articles of manufacture (for example, one or more computing devices 802 and/or processing systems 804) to implement techniques, modules, and examples described herein.


As further illustrated in FIG. 8, the example system 800 enables ubiquitous environments for a seamless user experience when running applications on a personal computer (PC), a television device, and/or a mobile device. Services and applications run substantially similar in all three environments for a common user experience when transitioning from one device to the next while utilizing an application, playing a video game, watching a video, and so on.


In the example system 800, multiple devices are interconnected through a central computing device. The central computing device may be local to the multiple devices or may be located remotely from the multiple devices. In one embodiment, the central computing device may be a cloud of one or more server computers that are connected to the multiple devices through a network, the Internet, or other data communication link.


In one embodiment, this interconnection architecture enables functionality to be delivered across multiple devices to provide a common and seamless experience to a user of the multiple devices. Each of the multiple devices may have different physical requirements and capabilities, and the central computing device uses a platform to enable the delivery of an experience to the device that is both tailored to the device and yet common to all devices. In one embodiment, a class of target devices is created and experiences are tailored to the generic class of devices. A class of devices may be defined by physical features, types of usage, or other common characteristics of the devices.


In various implementations, the computing device 802 may assume a variety of different configurations, such as for computer 814, mobile 816, and television 818 uses. Each of these configurations includes devices that may have generally different constructs and capabilities, and thus the computing device 802 may be configured according to one or more of the different device classes. For instance, the computing device 802 may be implemented as the computer 814 class of device that includes a personal computer, a desktop computer, a multi-screen computer, a laptop computer, a netbook, and so on.


The computing device 802 may also be implemented as the mobile 816 class of device that includes mobile devices, such as a mobile phone, portable music player, portable gaming device, a tablet computer, a multi-screen computer, and so on. The computing device 802 may also be implemented as the television 818 class of device that includes devices having or connected to generally larger screens in casual viewing environments. These devices include televisions, set-top boxes, gaming consoles, and so on.


The techniques described herein may be supported by these various configurations of the computing device 802 and are not limited to the specific examples described herein. This is illustrated through inclusion of the image decoder module 120 on the computing device 802. The functionality of the image decoder module 120 and other modules may also be implemented in whole or in part through use of a distributed system, such as over a “cloud” 820 via a platform 822 as described below.


The cloud 820 includes and/or is representative of a platform 822 for resources 824. The platform 822 abstracts underlying functionality of hardware (e.g., servers) and software resources of the cloud 820. The resources 824 may include applications and/or data that can be utilized while computer processing is executed on servers that are remote from the computing device 802. Resources 824 can also include services provided over the Internet and/or through a subscriber network, such as a cellular or Wi-Fi network.


The platform 822 may abstract resources and functions to connect the computing device 802 with other computing devices. The platform 822 may also serve to abstract scaling of resources to provide a corresponding level of scale to encountered demand for the resources 824 that are implemented via the platform 822. Accordingly, in an interconnected device embodiment, implementation of functionality described herein may be distributed throughout the system 800. For example, the functionality may be implemented in part on the computing device 802 as well as via the platform 822 that abstracts the functionality of the cloud 820.
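
As a minimal sketch of such a distribution, and assuming a hypothetical remote recognition service reachable through the platform 822 (the local/remote split and the function names here are illustrative only, not part of the described implementation), decoding might first be attempted on the device and heavier recognition handed off to cloud-hosted resources:

# Minimal sketch; the local/remote split and all names here are assumptions.

def recognize_projected_image(detected_image, local_decoder, remote_decode=None):
    """Resolve a detected image to an interaction context.

    Lightweight matching runs on the computing device; when the local decoder
    cannot resolve the image and a remote decoder is supplied (for example, a
    service hosted on resources 824 behind platform 822), the image is handed
    off to it.
    """
    context = local_decoder.decode(detected_image)
    if context is not None:
        return context
    if remote_decode is not None:
        return remote_decode(detected_image)  # offload heavier recognition to the cloud
    return None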


CONCLUSION

Although the invention has been described in language specific to structural features and/or methodological acts, it is to be understood that the invention defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claimed invention.

Claims
  • 1. A computing device comprising: a display device including one or more optical sensors configured to detect images projected onto a surface of the display device by a stylus; an image decoder module configured to: decode a detected image to ascertain a corresponding context for interaction with the computing device using the stylus by recognizing the detected image from multiple different images that each represent different contexts, the multiple different images each corresponding to a graphic that results from illuminating a series of image elements of the stylus differently than the series of image elements is illuminated for other graphics; and cause performance of one or more actions by the computing device that are associated with the ascertained context.
  • 2. A computing device as described in claim 1, wherein the display device comprises a sensor-in-pixel (SIP) panel having the optical sensors associated with respective pixels of the display device.
  • 3. A computing device as described in claim 1, wherein the image decoder module is further configured to resolve the spatial position of the stylus based upon decoding of the detected image.
  • 4. A computing device as described in claim 1, wherein: the image decoder module is further configured to recognize a gesture input via the stylus based upon decoding of the detected image; and the one or more actions correspond to the recognized gesture.
  • 5. A computing device as described in claim 1, wherein: the context comprises an operational mode for interaction with the computing device using the stylus that is defined by the detected image; and performance of one or more actions comprises switching to the operational mode.
  • 6. A computing device as described in claim 1, wherein the image decoder module is further configured to detect movement of the stylus based upon decoding of the detected image.
  • 7. A computing device as described in claim 1, wherein: the detected image defines a pressure level for a writing mode of the stylus; and the one or more actions comprise emulating pressure sensitive writing by adjusting attributes of digital ink traces applied via the stylus to match the pressure level defined by the detected image.
  • 8. A computing device as described in claim 1, wherein the image decoder module is further configured to determine an identity of the stylus based upon an identifier for the stylus that is projected as part of the detected image.
  • 9. A method implemented by one or more modules at least partially in hardware of a computing device, the method comprising: detecting one or more images projected onto a display device associated with the computing device using a stylus; resolving a spatial position of the stylus relative to the display device by: decoding the one or more projected images that, when decoded, reveal information that is indicative of the spatial position relative to the display device, the decoding including analyzing the one or more projected images relative to known patterns of a mapping of multiple different projectable images that each represent different information indicative of the spatial position, at least one of the multiple different projectable images corresponding to a graphic containing asymmetric content that is analyzable to facilitate resolution of the spatial position; and ascertaining projection characteristics of the one or more projected images that are further indicative of the spatial position relative to the display device; and performing one or more actions corresponding to the spatial position that is resolved by the decoding of the one or more projected images and the ascertaining of the projection characteristics of the one or more projected images.
  • 10. A method as described in claim 9, further comprising: recognizing an image projected by the stylus that is indicative of a writing mode of the stylus and conveys information regarding a pressure level to apply to digital ink traces input via the stylus; and in response to recognition of the image indicative of the writing mode, activating the writing mode and causing digital ink traces input via the stylus to have attributes defined for the pressure level conveyed by the image.
  • 11. A method as described in claim 9, wherein the images are detected via one or more optical sensors incorporated with the display device.
  • 12. A method as described in claim 9, wherein the detecting and the decoding of the one or more projected images by the computing device are based on optical projections of the one or more projected images by the stylus that do not involve radio frequency (RF) communications.
  • 13. A method as described in claim 9, wherein ascertaining the projection characteristics of the one or more projected images to resolve the spatial position comprises determining a height of the stylus relative to the display based at least in part upon size calculations for the one or more projected images.
  • 14. A method as described in claim 9, wherein ascertaining the projection characteristics of the one or more projected images to resolve the spatial position comprises processing the asymmetric content of the one or more projected images to identify a rotational position of the stylus around an optical axis along which the one or more projected images are projected.
  • 15. A method as described in claim 9, wherein ascertaining the projection characteristics of the one or more projected images to resolve the spatial position comprises resolving the spatial position for six degrees of freedom of the stylus based on analysis of image size, distortion of image elements, and orientation of the asymmetric content for the one or more projected images.
  • 16. A method as described in claim 9, wherein decoding the one or more projected images to resolve the spatial position comprises recognizing a gesture based upon a detection of movement of the stylus in accordance with the spatial position.
  • 17. A method as described in claim 9, wherein the one or more actions comprise control functions to manipulate operations of the computing device determined based in part upon the spatial position.
  • 18. A stylus comprising: a projection system; one or more image elements embodying image information for corresponding images to enable projection of the images via the projection system to a target computing device; and a mode manager module configured to cause the projection system to project different images using the image information embodied by the one or more image elements in response to manipulation of the stylus, the different images each comprising a different combination of graphical content and projected to convey different interaction information for the stylus optically for decoding by the target computing device, the different projected images interpretable by the target computing device with projection characteristics of detected projected images to recognize one or more of a spatial position of the stylus, an activated mode for the stylus, or a gesture produced by manipulation of the stylus.
  • 19. A stylus as described in claim 18, wherein: the one or more image elements comprise diffractive optical elements having image information to reproduce recorded holographic images; and the projection system comprises a laser projection system that employs a laser diode to illuminate the diffractive optical elements and form the corresponding images to convey the interaction information optically to the target computing device.
  • 20. A stylus as described in claim 18, further comprising a pressure switch to measure pressure applied to a tip of the stylus, wherein the mode manager module is further configured to: determine a pressure level measured by the pressure switch; and cause the projection system to project a particular image indicative of the determined pressure level to enable the target computing device to adjust attributes of digital ink traces input via the stylus to match the determined pressure level that is conveyed optically by the particular image.
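
By way of an illustrative, non-limiting sketch of the kinds of computations implied by the claimed techniques (resolving stylus height from the size of a projected image, rotation about the optical axis from the orientation of asymmetric content, and a conveyed pressure level mapped to digital ink attributes), the following assumes a simple linear projection model and an arbitrary pressure-to-stroke-width mapping; neither the formulas nor the names are taken from the described implementation.

import math

# Illustrative sketch only; a simple linear projection model and an arbitrary
# pressure-to-width mapping are assumed for explanation.

def stylus_height(observed_width, reference_width, reference_height):
    """Estimate stylus height above the display from projected-image size.

    reference_width is the image width observed with the stylus tip at
    reference_height; under a linear projection model the observed width grows
    in proportion to the distance from the display surface.
    """
    return reference_height * (observed_width / reference_width)


def stylus_rotation_degrees(marker_xy, center_xy):
    """Rotation of the stylus about its optical axis.

    The angle of an asymmetric feature of the projected image relative to the
    image center indicates how far the stylus has been rotated.
    """
    dx = marker_xy[0] - center_xy[0]
    dy = marker_xy[1] - center_xy[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0


def ink_width_for_pressure(level, num_levels=4, min_width=1.0, max_width=6.0):
    """Map a discrete pressure level conveyed by a projected image to a stroke width."""
    level = max(0, min(level, num_levels - 1))
    return min_width + (max_width - min_width) * level / (num_levels - 1)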
US Referenced Citations (1069)
Number Name Date Kind
578325 Fleming Mar 1897 A
4046975 Seeger, Jr. Sep 1977 A
4065649 Carter et al. Dec 1977 A
4086451 Boulanger Apr 1978 A
4237347 Burundukov et al. Dec 1980 A
4239338 Borrelli et al. Dec 1980 A
4243861 Strandwitz Jan 1981 A
4279021 See et al. Jul 1981 A
4302648 Sado et al. Nov 1981 A
4317013 Larson Feb 1982 A
4326193 Markley et al. Apr 1982 A
4365130 Christensen Dec 1982 A
4492829 Rodrique Jan 1985 A
4527021 Morikawa et al. Jul 1985 A
4559426 Van Zeeland et al. Dec 1985 A
4576436 Daniel Mar 1986 A
4577822 Wilkerson Mar 1986 A
4588187 Dell May 1986 A
4607147 Ono et al. Aug 1986 A
4615579 Whitehead Oct 1986 A
4643604 Enrico Feb 1987 A
4651133 Ganesan et al. Mar 1987 A
4735394 Facco Apr 1988 A
4735495 Henkes Apr 1988 A
5008497 Asher Apr 1991 A
5067573 Uchida Nov 1991 A
5111223 Omura May 1992 A
5128829 Loew Jul 1992 A
5220521 Kikinis Jun 1993 A
5249978 Gazda et al. Oct 1993 A
5283559 Kalendra et al. Feb 1994 A
5319455 Hoarty et al. Jun 1994 A
5331443 Stanisci Jul 1994 A
5339382 Whitehead Aug 1994 A
5340528 Machida et al. Aug 1994 A
5349403 Lo Sep 1994 A
5363075 Fanucchi Nov 1994 A
5375076 Goodrich et al. Dec 1994 A
5406415 Kelly Apr 1995 A
5480118 Cross Jan 1996 A
5510783 Findlater et al. Apr 1996 A
5546271 Gut et al. Aug 1996 A
5548477 Kumar et al. Aug 1996 A
5558577 Kato Sep 1996 A
5576981 Parker et al. Nov 1996 A
5618232 John Apr 1997 A
5621494 Kazumi et al. Apr 1997 A
5681220 Bertram et al. Oct 1997 A
5737183 Kobayashi et al. Apr 1998 A
5745376 Barker et al. Apr 1998 A
5748114 Koehn May 1998 A
5750939 Makinwa et al. May 1998 A
5781406 Hunte Jul 1998 A
5806955 Parkyn, Jr. et al. Sep 1998 A
5807175 Davis et al. Sep 1998 A
5808713 Broer et al. Sep 1998 A
5818361 Acevedo Oct 1998 A
5825982 Wright et al. Oct 1998 A
5828770 Leis et al. Oct 1998 A
5838403 Jannson et al. Nov 1998 A
5842027 Oprescu et al. Nov 1998 A
5850135 Kuki et al. Dec 1998 A
5861990 Tedesco Jan 1999 A
5874697 Selker et al. Feb 1999 A
5886675 Aye et al. Mar 1999 A
5905485 Podoloff May 1999 A
5921652 Parker et al. Jul 1999 A
5924555 Sadamori et al. Jul 1999 A
5926170 Oba Jul 1999 A
5929946 Sharp et al. Jul 1999 A
5957191 Okada et al. Sep 1999 A
5967637 Ishikawa et al. Oct 1999 A
5971635 Wise Oct 1999 A
5973677 Gibbons Oct 1999 A
5999147 Teitel Dec 1999 A
6002389 Kasser Dec 1999 A
6005209 Burleson et al. Dec 1999 A
6012714 Worley et al. Jan 2000 A
6040823 Seffernick et al. Mar 2000 A
6044717 Biegelsen et al. Apr 2000 A
6046857 Morishima et al. Apr 2000 A
6061644 Leis May 2000 A
6072551 Jannson et al. Jun 2000 A
6108200 Fullerton Aug 2000 A
6112797 Colson et al. Sep 2000 A
6124906 Kawada et al. Sep 2000 A
6128007 Seybold Oct 2000 A
6129444 Tognoni Oct 2000 A
6147859 Abboud Nov 2000 A
6172807 Akamatsu Jan 2001 B1
6178443 Lin Jan 2001 B1
6188391 Seely et al. Feb 2001 B1
6215590 Okano Apr 2001 B1
6228926 Golumbic May 2001 B1
6232934 Heacock et al. May 2001 B1
6234820 Perino et al. May 2001 B1
6254105 Rinde et al. Jul 2001 B1
6256447 Laine Jul 2001 B1
6278490 Fukuda et al. Aug 2001 B1
6279060 Luke et al. Aug 2001 B1
6300986 Travis Oct 2001 B1
6329617 Burgess Dec 2001 B1
6344791 Armstrong Feb 2002 B1
6351273 Lemelson et al. Feb 2002 B1
6353503 Spitzer et al. Mar 2002 B1
6366440 Kung Apr 2002 B1
6380497 Hashimoto et al. Apr 2002 B1
6411266 Maguire, Jr. Jun 2002 B1
6437682 Vance Aug 2002 B1
6441362 Ogawa Aug 2002 B1
6469755 Adachi et al. Oct 2002 B1
6506983 Babb et al. Jan 2003 B1
6511378 Bhatt et al. Jan 2003 B1
6529179 Hashimoto et al. Mar 2003 B1
6532147 Christ, Jr. Mar 2003 B1
6543949 Ritchey et al. Apr 2003 B1
6545577 Yap Apr 2003 B2
6565439 Shinohara et al. May 2003 B2
6574030 Mosier Jun 2003 B1
6597347 Yasutake Jul 2003 B1
6600121 Olodort et al. Jul 2003 B1
6603408 Gaba Aug 2003 B1
6608664 Hasegawa Aug 2003 B1
6617536 Kawaguchi Sep 2003 B2
6648485 Colgan et al. Nov 2003 B1
6651943 Cho et al. Nov 2003 B2
6681333 Cho Jan 2004 B1
6685369 Lien Feb 2004 B2
6695273 Iguchi Feb 2004 B2
6700617 Hamamura et al. Mar 2004 B1
6704864 Philyaw Mar 2004 B1
6721019 Kono et al. Apr 2004 B2
6725318 Sherman et al. Apr 2004 B1
6738049 Kiser et al. May 2004 B2
6774888 Genduso Aug 2004 B1
6776546 Kraus et al. Aug 2004 B2
6781819 Yang et al. Aug 2004 B2
6784869 Clark et al. Aug 2004 B1
6790054 Boonsue Sep 2004 B1
6795146 Dozov et al. Sep 2004 B2
6813143 Makela Nov 2004 B2
6819082 Yang Nov 2004 B2
6819316 Schulz et al. Nov 2004 B2
6847488 Travis Jan 2005 B2
6856506 Doherty et al. Feb 2005 B2
6859565 Baron Feb 2005 B2
6861961 Sandbach et al. Mar 2005 B2
6864573 Robertson et al. Mar 2005 B2
6867828 Taira et al. Mar 2005 B2
6870671 Travis Mar 2005 B2
6895164 Saccomanno May 2005 B2
6898315 Guha May 2005 B2
6902214 Smith Jun 2005 B2
6914197 Doherty et al. Jul 2005 B2
6922333 Weng et al. Jul 2005 B2
6929291 Chen Aug 2005 B2
6950950 Sawyers et al. Sep 2005 B2
6970957 Oshins et al. Nov 2005 B1
6976799 Kim et al. Dec 2005 B2
6980177 Struyk Dec 2005 B2
6981792 Nagakubo et al. Jan 2006 B2
7002624 Uchino et al. Feb 2006 B1
7006080 Gettemy Feb 2006 B2
7007238 Glaser Feb 2006 B2
7018678 Gronbeck et al. Mar 2006 B2
7019491 Bozzone et al. Mar 2006 B2
7023430 Liu et al. Apr 2006 B2
7025908 Hayashi et al. Apr 2006 B1
7051149 Wang et al. May 2006 B2
7068496 Wong et al. Jun 2006 B2
7073933 Gotoh et al. Jul 2006 B2
7083295 Hanna Aug 2006 B1
7091436 Serban Aug 2006 B2
7095404 Vincent et al. Aug 2006 B2
7099149 Krieger et al. Aug 2006 B2
7101048 Travis Sep 2006 B2
7102683 Perry et al. Sep 2006 B2
7104679 Shin et al. Sep 2006 B2
7106222 Ward et al. Sep 2006 B2
7116309 Kimura et al. Oct 2006 B1
7123292 Seeger et al. Oct 2006 B1
7129979 Lee Oct 2006 B1
7136282 Rebeske Nov 2006 B1
7151635 Bidnyk et al. Dec 2006 B2
7152985 Benitez et al. Dec 2006 B2
7153017 Yamashita et al. Dec 2006 B2
D535292 Shi et al. Jan 2007 S
7162153 Harter, Jr. et al. Jan 2007 B2
7169460 Chen et al. Jan 2007 B1
7194662 Do et al. Mar 2007 B2
7199554 Kim et al. Apr 2007 B2
7199931 Boettiger et al. Apr 2007 B2
7201508 Misaras Apr 2007 B2
7202837 Ihara Apr 2007 B2
7213323 Baker et al. May 2007 B2
7213991 Chapman et al. May 2007 B2
7224830 Nefian et al. May 2007 B2
7239505 Keely et al. Jul 2007 B2
7260221 Atsmon Aug 2007 B1
7260823 Schlack et al. Aug 2007 B2
7277087 Hill et al. Oct 2007 B2
7287738 Pitlor Oct 2007 B2
7295720 Raskar Nov 2007 B2
7301759 Hsiung Nov 2007 B2
7311526 Rohrbach et al. Dec 2007 B2
7331793 Hernandez et al. Feb 2008 B2
7364343 Keuper et al. Apr 2008 B2
7370342 Ismail et al. May 2008 B2
7374312 Feng et al. May 2008 B2
7375885 Ijzerman et al. May 2008 B2
7379094 Yoshida et al. May 2008 B2
7384178 Sumida et al. Jun 2008 B2
7400377 Evans et al. Jul 2008 B2
7400817 Lee et al. Jul 2008 B2
7410286 Travis Aug 2008 B2
7423557 Kang Sep 2008 B2
7431489 Yeo et al. Oct 2008 B2
7443443 Raskar et al. Oct 2008 B2
7447934 Dasari et al. Nov 2008 B2
7457108 Ghosh Nov 2008 B2
7467948 Lindberg et al. Dec 2008 B2
7469386 Bear et al. Dec 2008 B2
7486165 Ligtenberg et al. Feb 2009 B2
7499037 Lube Mar 2009 B2
7499216 Niv et al. Mar 2009 B2
7502803 Culter et al. Mar 2009 B2
7503684 Ueno et al. Mar 2009 B2
7509042 Mori et al. Mar 2009 B2
7515143 Keam et al. Apr 2009 B2
7528374 Smitt et al. May 2009 B2
7542052 Solomon et al. Jun 2009 B2
7545429 Travis Jun 2009 B2
7558594 Wilson Jul 2009 B2
7559834 York Jul 2009 B1
7561131 Ijzerman et al. Jul 2009 B2
7572045 Hoelen et al. Aug 2009 B2
RE40891 Yasutake Sep 2009 E
7620244 Collier Nov 2009 B1
7622907 Vranish Nov 2009 B2
7626582 Nicolas et al. Dec 2009 B1
7631327 Dempski et al. Dec 2009 B2
7636921 Louie Dec 2009 B2
7639876 Clary et al. Dec 2009 B2
7643213 Boettiger et al. Jan 2010 B2
7656392 Bolender Feb 2010 B2
7660047 Travis et al. Feb 2010 B1
7675598 Hong Mar 2010 B2
7686694 Cole Mar 2010 B2
7705558 Silverman Apr 2010 B2
7715187 Hotelling et al. May 2010 B2
7722792 Uezaki et al. May 2010 B2
7724952 Shum et al. May 2010 B2
7728923 Kim et al. Jun 2010 B2
7729493 Krieger et al. Jun 2010 B2
7731147 Rha Jun 2010 B2
7733326 Adiseshan Jun 2010 B1
7773076 Pittel et al. Aug 2010 B2
7773121 Huntsberger et al. Aug 2010 B1
7774155 Sato et al. Aug 2010 B2
7775567 Ligtenberg et al. Aug 2010 B2
7777972 Chen et al. Aug 2010 B1
7782341 Kothandaraman Aug 2010 B2
7782342 Koh Aug 2010 B2
7788474 Switzer et al. Aug 2010 B2
7813715 McKillop et al. Oct 2010 B2
7815358 Inditsky Oct 2010 B2
7817428 Greer, Jr. et al. Oct 2010 B2
7822338 Wernersson Oct 2010 B2
7844985 Hendricks et al. Nov 2010 B2
7852621 Lin et al. Dec 2010 B2
7855716 McCreary et al. Dec 2010 B2
7865639 McCoy et al. Jan 2011 B2
7884807 Hovden et al. Feb 2011 B2
7893921 Sato Feb 2011 B2
7898797 Fan et al. Mar 2011 B2
7907394 Richardson et al. Mar 2011 B2
D636397 Green Apr 2011 S
7918559 Tesar Apr 2011 B2
7927654 Hagood et al. Apr 2011 B2
7928964 Kolmykov-Zotov et al. Apr 2011 B2
7936501 Smith et al. May 2011 B2
7944520 Ichioka et al. May 2011 B2
7945717 Rivalsi May 2011 B2
7957082 Mi et al. Jun 2011 B2
7965268 Gass et al. Jun 2011 B2
7967462 Ogiro et al. Jun 2011 B2
7970246 Travis et al. Jun 2011 B2
7973771 Geaghan Jul 2011 B2
7976393 Haga et al. Jul 2011 B2
7978281 Vergith et al. Jul 2011 B2
7991257 Coleman Aug 2011 B1
8007158 Woo et al. Aug 2011 B2
8016255 Lin Sep 2011 B2
8018386 Qi et al. Sep 2011 B2
8018579 Krah Sep 2011 B1
8026904 Westerman Sep 2011 B2
8035614 Bell et al. Oct 2011 B2
8035624 Bell et al. Oct 2011 B2
8053688 Conzola et al. Nov 2011 B2
8059391 Chang et al. Nov 2011 B2
8065624 Morin et al. Nov 2011 B2
8069356 Rathi et al. Nov 2011 B2
RE42992 David Dec 2011 E
8077160 Land et al. Dec 2011 B2
8090885 Callaghan et al. Jan 2012 B2
8098233 Hotelling et al. Jan 2012 B2
8102362 Ricks et al. Jan 2012 B2
8115499 Osoinach et al. Feb 2012 B2
8115718 Chen et al. Feb 2012 B2
8117362 Rodriguez et al. Feb 2012 B2
8118274 McClure et al. Feb 2012 B2
8118681 Mattice et al. Feb 2012 B2
8120166 Koizumi et al. Feb 2012 B2
8130203 Westerman Mar 2012 B2
8149219 Lii et al. Apr 2012 B2
8149272 Evans et al. Apr 2012 B2
8154524 Wilson et al. Apr 2012 B2
8162282 Hu et al. Apr 2012 B2
D659139 Gengler May 2012 S
8169185 Partovi et al. May 2012 B2
8169421 Wright et al. May 2012 B2
8179236 Weller et al. May 2012 B2
8184190 Dosluoglu May 2012 B2
8189973 Travis et al. May 2012 B2
8216074 Sakuma Jul 2012 B2
8223489 Shih Jul 2012 B2
8229509 Paek et al. Jul 2012 B2
8229522 Kim et al. Jul 2012 B2
8231099 Chen Jul 2012 B2
8248791 Wang et al. Aug 2012 B2
8251563 Papakonstantinou et al. Aug 2012 B2
8255708 Zhang Aug 2012 B1
8259091 Yeh Sep 2012 B2
8264310 Lauder et al. Sep 2012 B2
8267368 Torii et al. Sep 2012 B2
8269731 Molne Sep 2012 B2
8274784 Franz et al. Sep 2012 B2
8279589 Kim Oct 2012 B2
8310508 Hekstra et al. Nov 2012 B2
8310768 Lin et al. Nov 2012 B2
8322290 Mignano Dec 2012 B1
8325416 Lesage et al. Dec 2012 B2
8342857 Palli et al. Jan 2013 B2
8345920 Ferren et al. Jan 2013 B2
8354806 Travis et al. Jan 2013 B2
8362975 Uehara Jan 2013 B2
8373664 Wright Feb 2013 B2
8387078 Memmott Feb 2013 B2
8389078 Lin et al. Mar 2013 B2
8416206 Carpendale et al. Apr 2013 B2
8416559 Agata et al. Apr 2013 B2
8466902 Boer et al. Jun 2013 B2
8466954 Ko et al. Jun 2013 B2
8467133 Miller Jun 2013 B2
8497657 Franks et al. Jul 2013 B2
8498100 Whitt, III et al. Jul 2013 B1
8513547 Ooi Aug 2013 B2
8515501 Lee et al. Aug 2013 B2
8543227 Perek et al. Sep 2013 B1
8548608 Perek et al. Oct 2013 B2
8560004 Tsvetkov et al. Oct 2013 B1
8564944 Whitt, III et al. Oct 2013 B2
8565560 Popovich et al. Oct 2013 B2
8570725 Whitt, III et al. Oct 2013 B2
8571539 Ranganathan et al. Oct 2013 B1
8582206 Travis Nov 2013 B2
8599542 Healey et al. Dec 2013 B1
8600120 Gonion et al. Dec 2013 B2
8600526 Nielsen et al. Dec 2013 B2
8610015 Whitt et al. Dec 2013 B2
8614666 Whitman et al. Dec 2013 B2
8646999 Shaw et al. Feb 2014 B2
8654030 Mercer Feb 2014 B1
8692212 Craft Apr 2014 B1
8699215 Whitt, III et al. Apr 2014 B2
8700931 Gudlavenkatasiva et al. Apr 2014 B2
8705229 Ashcraft et al. Apr 2014 B2
8719603 Belesiu May 2014 B2
8723842 Kaneda et al. May 2014 B2
8724302 Whitt et al. May 2014 B2
8738090 Kanda May 2014 B2
8749529 Powell et al. Jun 2014 B2
8780540 Whitt, III et al. Jul 2014 B2
8780541 Whitt et al. Jul 2014 B2
8854799 Whitt, III et al. Oct 2014 B2
8873227 Whitt et al. Oct 2014 B2
8903517 Perek et al. Dec 2014 B2
8947353 Boulanger et al. Feb 2015 B2
8964379 Rihn et al. Feb 2015 B2
9001028 Baker Apr 2015 B2
9075566 Whitt, III et al. Jul 2015 B2
9158384 Whitt, III et al. Oct 2015 B2
9176901 Whitt, III et al. Nov 2015 B2
9201185 Large Dec 2015 B2
9256089 Emerton et al. Feb 2016 B2
9268373 Whitt et al. Feb 2016 B2
9304949 Whitman et al. Apr 2016 B2
20010020455 Schifferl Sep 2001 A1
20010023818 Masaru et al. Sep 2001 A1
20010035859 Kiser Nov 2001 A1
20020000977 Vranish Jan 2002 A1
20020008854 Travis et al. Jan 2002 A1
20020044216 Cha Apr 2002 A1
20020103616 Park et al. Aug 2002 A1
20020113882 Pollard et al. Aug 2002 A1
20020126445 Minaguchi et al. Sep 2002 A1
20020134828 Sandbach et al. Sep 2002 A1
20020138772 Crawford et al. Sep 2002 A1
20020154099 Oh Oct 2002 A1
20020163510 Williams et al. Nov 2002 A1
20020190823 Yap Dec 2002 A1
20030016282 Koizumi Jan 2003 A1
20030028688 Tiphane Feb 2003 A1
20030036365 Kuroda Feb 2003 A1
20030044215 Monney et al. Mar 2003 A1
20030108720 Kashino Jun 2003 A1
20030128285 Itoh Jul 2003 A1
20030132916 Kramer Jul 2003 A1
20030137821 Gotoh et al. Jul 2003 A1
20030148740 Yau et al. Aug 2003 A1
20030163611 Nagao Aug 2003 A1
20030165017 Amitai Sep 2003 A1
20030195937 Kircher, Jr. Oct 2003 A1
20030197687 Shetter Oct 2003 A1
20030197806 Perry et al. Oct 2003 A1
20030198008 Leapman et al. Oct 2003 A1
20040005184 Kim et al. Jan 2004 A1
20040048941 Raffel et al. Mar 2004 A1
20040056843 Lin et al. Mar 2004 A1
20040095333 Morag et al. May 2004 A1
20040100457 Mandle May 2004 A1
20040115994 Wulff et al. Jun 2004 A1
20040156168 LeVasseur et al. Aug 2004 A1
20040169641 Bean et al. Sep 2004 A1
20040174709 Buelow, II et al. Sep 2004 A1
20040189822 Shimada Sep 2004 A1
20040212553 Wang Oct 2004 A1
20040212598 Kraus et al. Oct 2004 A1
20040212601 Cake et al. Oct 2004 A1
20040258924 Berger et al. Dec 2004 A1
20040268000 Barker et al. Dec 2004 A1
20050002073 Nakamura et al. Jan 2005 A1
20050030728 Kawashima et al. Feb 2005 A1
20050052831 Chen Mar 2005 A1
20050055498 Beckert et al. Mar 2005 A1
20050057515 Bathiche Mar 2005 A1
20050057521 Aull et al. Mar 2005 A1
20050059489 Kim Mar 2005 A1
20050062715 Tsuji et al. Mar 2005 A1
20050068460 Lin Mar 2005 A1
20050094895 Baron May 2005 A1
20050099400 Lee May 2005 A1
20050100690 Mayer et al. May 2005 A1
20050134717 Misawa Jun 2005 A1
20050146512 Hill et al. Jul 2005 A1
20050231156 Yan Oct 2005 A1
20050236848 Kim et al. Oct 2005 A1
20050240949 Liu et al. Oct 2005 A1
20050264653 Starkweather et al. Dec 2005 A1
20050264988 Nicolosi Dec 2005 A1
20050265035 Brass et al. Dec 2005 A1
20050285703 Wheeler et al. Dec 2005 A1
20060010400 Dehlin et al. Jan 2006 A1
20060012767 Komatsuda et al. Jan 2006 A1
20060028400 Lapstun et al. Feb 2006 A1
20060028476 Sobel Feb 2006 A1
20060028838 Imade Feb 2006 A1
20060030295 Adams et al. Feb 2006 A1
20060049993 Lin et al. Mar 2006 A1
20060070384 Ertel Apr 2006 A1
20060082973 Egbert et al. Apr 2006 A1
20060083004 Cok Apr 2006 A1
20060085658 Allen et al. Apr 2006 A1
20060092379 Cho et al. May 2006 A1
20060102914 Smits et al. May 2006 A1
20060103633 Gioeli May 2006 A1
20060125799 Hillis et al. Jun 2006 A1
20060132423 Travis Jun 2006 A1
20060146573 Iwauchi et al. Jul 2006 A1
20060154725 Glaser et al. Jul 2006 A1
20060156415 Rubinstein et al. Jul 2006 A1
20060181514 Newman Aug 2006 A1
20060181521 Perreault et al. Aug 2006 A1
20060187216 Trent, Jr. et al. Aug 2006 A1
20060195522 Miyazaki Aug 2006 A1
20060197755 Bawany Sep 2006 A1
20060215244 Yosha et al. Sep 2006 A1
20060227393 Herloski Oct 2006 A1
20060238510 Panotopoulos et al. Oct 2006 A1
20060238550 Page Oct 2006 A1
20060250381 Geaghan Nov 2006 A1
20060254042 Chou et al. Nov 2006 A1
20060261778 Elizalde Rodarte Nov 2006 A1
20060262185 Cha et al. Nov 2006 A1
20060279501 Lu et al. Dec 2006 A1
20060287982 Sheldon et al. Dec 2006 A1
20070002587 Miyashita Jan 2007 A1
20070003267 Shibutani Jan 2007 A1
20070019181 Sinclair et al. Jan 2007 A1
20070024742 Raskar et al. Feb 2007 A1
20070046625 Yee Mar 2007 A1
20070047221 Park Mar 2007 A1
20070051792 Wheeler et al. Mar 2007 A1
20070056385 Lorenz Mar 2007 A1
20070062089 Homer et al. Mar 2007 A1
20070069153 Pai-Paranjape et al. Mar 2007 A1
20070072474 Beasley et al. Mar 2007 A1
20070076434 Uehara et al. Apr 2007 A1
20070080813 Melvin Apr 2007 A1
20070081091 Pan et al. Apr 2007 A1
20070091638 Ijzerman et al. Apr 2007 A1
20070114967 Peng May 2007 A1
20070116929 Fujimori et al. May 2007 A1
20070122027 Kunita et al. May 2007 A1
20070126393 Bersenev Jun 2007 A1
20070133156 Ligtenberg et al. Jun 2007 A1
20070145945 McGinley et al. Jun 2007 A1
20070161262 Lloyd Jul 2007 A1
20070176902 Newman et al. Aug 2007 A1
20070182663 Biech Aug 2007 A1
20070182722 Hotelling et al. Aug 2007 A1
20070185590 Reindel et al. Aug 2007 A1
20070188478 Silverstein Aug 2007 A1
20070189667 Wakita et al. Aug 2007 A1
20070194752 McBurney Aug 2007 A1
20070200830 Yamamoto Aug 2007 A1
20070201246 Yeo et al. Aug 2007 A1
20070201859 Sarrat Aug 2007 A1
20070217224 Kao et al. Sep 2007 A1
20070220708 Lewis Sep 2007 A1
20070222766 Bolender Sep 2007 A1
20070234420 Novotney et al. Oct 2007 A1
20070236408 Yamaguchi et al. Oct 2007 A1
20070236467 Marshall et al. Oct 2007 A1
20070236475 Wherry Oct 2007 A1
20070236873 Yukawa et al. Oct 2007 A1
20070247338 Marchetto Oct 2007 A1
20070247432 Oakley Oct 2007 A1
20070247800 Smith et al. Oct 2007 A1
20070257821 Son et al. Nov 2007 A1
20070260892 Paul et al. Nov 2007 A1
20070263119 Shum et al. Nov 2007 A1
20070271527 Paas et al. Nov 2007 A1
20070274094 Schultz et al. Nov 2007 A1
20070274095 Destain Nov 2007 A1
20070274099 Tai et al. Nov 2007 A1
20070283179 Burnett et al. Dec 2007 A1
20080001924 de los Reyes et al. Jan 2008 A1
20080002350 Farrugia Jan 2008 A1
20080005423 Jacobs et al. Jan 2008 A1
20080013809 Zhu et al. Jan 2008 A1
20080018611 Serban et al. Jan 2008 A1
20080019150 Park et al. Jan 2008 A1
20080019684 Shyu et al. Jan 2008 A1
20080030937 Russo et al. Feb 2008 A1
20080037284 Rudisill Feb 2008 A1
20080048654 Takahashi et al. Feb 2008 A1
20080053222 Ehrensvard et al. Mar 2008 A1
20080059888 Dunko Mar 2008 A1
20080061565 Lee et al. Mar 2008 A1
20080068451 Hyatt Mar 2008 A1
20080074398 Wright Mar 2008 A1
20080084499 Kisacanin et al. Apr 2008 A1
20080088593 Smoot Apr 2008 A1
20080090626 Griffin et al. Apr 2008 A1
20080104437 Lee May 2008 A1
20080106592 Mikami May 2008 A1
20080111518 Toya May 2008 A1
20080122803 Izadi et al. May 2008 A1
20080150913 Bell et al. Jun 2008 A1
20080151478 Chern Jun 2008 A1
20080158185 Westerman Jul 2008 A1
20080167832 Soss Jul 2008 A1
20080174570 Jobs et al. Jul 2008 A1
20080177185 Nakao et al. Jul 2008 A1
20080179507 Han Jul 2008 A2
20080180411 Solomon et al. Jul 2008 A1
20080182622 Makarowski et al. Jul 2008 A1
20080186660 Yang Aug 2008 A1
20080186683 Ligtenberg et al. Aug 2008 A1
20080203277 Warszauer et al. Aug 2008 A1
20080211787 Nakao et al. Sep 2008 A1
20080219025 Spitzer et al. Sep 2008 A1
20080225205 Travis Sep 2008 A1
20080228969 Cheah et al. Sep 2008 A1
20080232061 Wang et al. Sep 2008 A1
20080233326 Hegemier et al. Sep 2008 A1
20080238884 Harish Oct 2008 A1
20080253822 Matias Oct 2008 A1
20080258679 Manico et al. Oct 2008 A1
20080297878 Brown et al. Dec 2008 A1
20080303479 Park et al. Dec 2008 A1
20080309636 Feng et al. Dec 2008 A1
20080316002 Brunet et al. Dec 2008 A1
20080316768 Travis Dec 2008 A1
20080320190 Lydon et al. Dec 2008 A1
20090002218 Rigazio et al. Jan 2009 A1
20090007001 Morin et al. Jan 2009 A1
20090009476 Daley, III Jan 2009 A1
20090013275 May et al. Jan 2009 A1
20090033623 Lin Feb 2009 A1
20090040426 Mather et al. Feb 2009 A1
20090065267 Sato Mar 2009 A1
20090073060 Shimasaki et al. Mar 2009 A1
20090073957 Newland et al. Mar 2009 A1
20090079639 Hotta et al. Mar 2009 A1
20090083562 Park et al. Mar 2009 A1
20090096738 Chen et al. Apr 2009 A1
20090102419 Gwon et al. Apr 2009 A1
20090127005 Zachut et al. May 2009 A1
20090131134 Baerlocher et al. May 2009 A1
20090134838 Raghuprasad May 2009 A1
20090135142 Fu et al. May 2009 A1
20090135318 Tateuchi et al. May 2009 A1
20090140985 Liu Jun 2009 A1
20090142020 Van Ostrand et al. Jun 2009 A1
20090146975 Chang Jun 2009 A1
20090146992 Fukunaga et al. Jun 2009 A1
20090147102 Kakinuma et al. Jun 2009 A1
20090152748 Wang et al. Jun 2009 A1
20090158221 Nielsen et al. Jun 2009 A1
20090160944 Trevelyan et al. Jun 2009 A1
20090161385 Parker et al. Jun 2009 A1
20090163147 Steigerwald et al. Jun 2009 A1
20090167728 Geaghan et al. Jul 2009 A1
20090167930 Safaee-Rad et al. Jul 2009 A1
20090182901 Callaghan et al. Jul 2009 A1
20090189974 Deering Jul 2009 A1
20090195497 Fitzgerald et al. Aug 2009 A1
20090200384 Masalkar Aug 2009 A1
20090219250 Ure Sep 2009 A1
20090231275 Odgers Sep 2009 A1
20090231465 Senba Sep 2009 A1
20090239586 Boeve et al. Sep 2009 A1
20090244832 Behar et al. Oct 2009 A1
20090251008 Sugaya Oct 2009 A1
20090251623 Koyama Oct 2009 A1
20090259865 Sheynblat et al. Oct 2009 A1
20090262492 Whitchurch et al. Oct 2009 A1
20090265670 Kim et al. Oct 2009 A1
20090268386 Lin Oct 2009 A1
20090276734 Taylor et al. Nov 2009 A1
20090284613 Kim Nov 2009 A1
20090285491 Ravenscroft et al. Nov 2009 A1
20090296331 Choy Dec 2009 A1
20090303137 Kusaka et al. Dec 2009 A1
20090303204 Nasiri et al. Dec 2009 A1
20090316072 Okumura et al. Dec 2009 A1
20090320244 Lin Dec 2009 A1
20090321490 Groene et al. Dec 2009 A1
20090322278 Franks et al. Dec 2009 A1
20100001963 Doray et al. Jan 2010 A1
20100013319 Kamiyama et al. Jan 2010 A1
20100013738 Covannon et al. Jan 2010 A1
20100026656 Hotelling et al. Feb 2010 A1
20100038821 Jenkins et al. Feb 2010 A1
20100045540 Lai et al. Feb 2010 A1
20100045609 Do et al. Feb 2010 A1
20100045633 Gettemy Feb 2010 A1
20100051356 Stern et al. Mar 2010 A1
20100051432 Lin et al. Mar 2010 A1
20100053534 Hsieh et al. Mar 2010 A1
20100053771 Travis et al. Mar 2010 A1
20100072351 Mahowald Mar 2010 A1
20100075517 Ni et al. Mar 2010 A1
20100077237 Sawyers Mar 2010 A1
20100079861 Powell Apr 2010 A1
20100081377 Chatterjee et al. Apr 2010 A1
20100083108 Rider et al. Apr 2010 A1
20100085321 Pundsack Apr 2010 A1
20100102182 Lin Apr 2010 A1
20100102206 Cazaux et al. Apr 2010 A1
20100103112 Yoo et al. Apr 2010 A1
20100103131 Segal et al. Apr 2010 A1
20100103332 Li et al. Apr 2010 A1
20100117993 Kent May 2010 A1
20100123686 Klinghult et al. May 2010 A1
20100128112 Marti et al. May 2010 A1
20100133398 Chiu et al. Jun 2010 A1
20100135036 Matsuba et al. Jun 2010 A1
20100142130 Wang et al. Jun 2010 A1
20100149073 Chaum et al. Jun 2010 A1
20100149111 Olien Jun 2010 A1
20100149117 Chien et al. Jun 2010 A1
20100149134 Westerman et al. Jun 2010 A1
20100149377 Shintani et al. Jun 2010 A1
20100154171 Lombardi et al. Jun 2010 A1
20100156798 Archer Jun 2010 A1
20100156913 Ortega et al. Jun 2010 A1
20100157085 Sasaki Jun 2010 A1
20100161522 Tirpak et al. Jun 2010 A1
20100162109 Chatterjee et al. Jun 2010 A1
20100164857 Liu et al. Jul 2010 A1
20100164897 Morin et al. Jul 2010 A1
20100171891 Kaji et al. Jul 2010 A1
20100174421 Tsai et al. Jul 2010 A1
20100177388 Cohen et al. Jul 2010 A1
20100180063 Ananny et al. Jul 2010 A1
20100188299 Rinehart et al. Jul 2010 A1
20100188338 Longe Jul 2010 A1
20100206614 Park et al. Aug 2010 A1
20100206644 Yeh Aug 2010 A1
20100214214 Corson et al. Aug 2010 A1
20100214257 Wussler et al. Aug 2010 A1
20100222110 Kim et al. Sep 2010 A1
20100231498 Large et al. Sep 2010 A1
20100231510 Sampsell et al. Sep 2010 A1
20100231556 Mines et al. Sep 2010 A1
20100235546 Terlizzi et al. Sep 2010 A1
20100237970 Liu Sep 2010 A1
20100238075 Pourseyed Sep 2010 A1
20100238138 Goertz et al. Sep 2010 A1
20100238270 Bjelkhagen et al. Sep 2010 A1
20100238320 Washisu Sep 2010 A1
20100238620 Fish Sep 2010 A1
20100245221 Khan Sep 2010 A1
20100245289 Svajda Sep 2010 A1
20100250988 Okuda et al. Sep 2010 A1
20100271771 Wu et al. Oct 2010 A1
20100274932 Kose Oct 2010 A1
20100279768 Huang et al. Nov 2010 A1
20100282953 Tam Nov 2010 A1
20100289457 Onnerud et al. Nov 2010 A1
20100291331 Schaefer Nov 2010 A1
20100295812 Burns et al. Nov 2010 A1
20100296163 Saarikko Nov 2010 A1
20100299642 Merrell et al. Nov 2010 A1
20100302378 Marks et al. Dec 2010 A1
20100304793 Kim Dec 2010 A1
20100306538 Thomas et al. Dec 2010 A1
20100308778 Yamazaki et al. Dec 2010 A1
20100308844 Day et al. Dec 2010 A1
20100315348 Jellicoe et al. Dec 2010 A1
20100315774 Walker et al. Dec 2010 A1
20100321301 Casparian et al. Dec 2010 A1
20100321339 Kimmel Dec 2010 A1
20100321482 Cleveland Dec 2010 A1
20100321877 Moser Dec 2010 A1
20100322479 Cleveland Dec 2010 A1
20100324457 Bean et al. Dec 2010 A1
20100325155 Skinner et al. Dec 2010 A1
20100331059 Apgar et al. Dec 2010 A1
20110002577 Van Ostrand Jan 2011 A1
20110007047 Fujioka et al. Jan 2011 A1
20110012866 Keam Jan 2011 A1
20110012873 Prest et al. Jan 2011 A1
20110018799 Lin Jan 2011 A1
20110019123 Prest et al. Jan 2011 A1
20110031287 Le Gette et al. Feb 2011 A1
20110032215 Sirotich et al. Feb 2011 A1
20110036965 Zhang et al. Feb 2011 A1
20110037721 Cranfill et al. Feb 2011 A1
20110043142 Travis Feb 2011 A1
20110043479 van Aerle et al. Feb 2011 A1
20110043990 Mickey et al. Feb 2011 A1
20110044579 Travis et al. Feb 2011 A1
20110044582 Travis et al. Feb 2011 A1
20110050946 Lee et al. Mar 2011 A1
20110055407 Lydon et al. Mar 2011 A1
20110057899 Sleeman et al. Mar 2011 A1
20110060926 Brooks et al. Mar 2011 A1
20110069148 Jones et al. Mar 2011 A1
20110072391 Hanggie et al. Mar 2011 A1
20110074688 Hull et al. Mar 2011 A1
20110075440 Wang Mar 2011 A1
20110081946 Singh et al. Apr 2011 A1
20110095994 Birnbaum Apr 2011 A1
20110096035 Shen Apr 2011 A1
20110096513 Kim Apr 2011 A1
20110102326 Casparian et al. May 2011 A1
20110102356 Kemppinen et al. May 2011 A1
20110115747 Powell et al. May 2011 A1
20110118025 Lukas et al. May 2011 A1
20110122071 Powell May 2011 A1
20110134032 Chiu et al. Jun 2011 A1
20110134112 Koh et al. Jun 2011 A1
20110157046 Lee et al. Jun 2011 A1
20110157087 Kanehira et al. Jun 2011 A1
20110157101 Chang Jun 2011 A1
20110163955 Nasiri et al. Jul 2011 A1
20110164370 McClure et al. Jul 2011 A1
20110167181 Minoo et al. Jul 2011 A1
20110167287 Walsh et al. Jul 2011 A1
20110167391 Momeyer et al. Jul 2011 A1
20110167992 Eventoff et al. Jul 2011 A1
20110169762 Weiss Jul 2011 A1
20110169778 Nungester et al. Jul 2011 A1
20110170289 Allen et al. Jul 2011 A1
20110176035 Poulsen Jul 2011 A1
20110179864 Raasch et al. Jul 2011 A1
20110181754 Iwasaki Jul 2011 A1
20110183120 Sharygin et al. Jul 2011 A1
20110184646 Wong et al. Jul 2011 A1
20110184824 George et al. Jul 2011 A1
20110193787 Morishige et al. Aug 2011 A1
20110193938 Oderwald et al. Aug 2011 A1
20110197156 Strait et al. Aug 2011 A1
20110199389 Lu et al. Aug 2011 A1
20110202878 Park et al. Aug 2011 A1
20110205372 Miramontes Aug 2011 A1
20110216039 Chen et al. Sep 2011 A1
20110216266 Travis Sep 2011 A1
20110221659 King et al. Sep 2011 A1
20110227913 Hyndman Sep 2011 A1
20110228462 Dang Sep 2011 A1
20110231682 Kakish et al. Sep 2011 A1
20110234502 Yun et al. Sep 2011 A1
20110234535 Hung et al. Sep 2011 A1
20110234881 Wakabayashi et al. Sep 2011 A1
20110235179 Simmonds Sep 2011 A1
20110242063 Li Oct 2011 A1
20110242138 Tribble Oct 2011 A1
20110242298 Bathiche et al. Oct 2011 A1
20110242440 Noma et al. Oct 2011 A1
20110242670 Simmonds Oct 2011 A1
20110248152 Svajda et al. Oct 2011 A1
20110248920 Larsen Oct 2011 A1
20110248941 Abdo et al. Oct 2011 A1
20110261001 Liu Oct 2011 A1
20110261083 Wilson Oct 2011 A1
20110262001 Bi et al. Oct 2011 A1
20110267272 Meyer et al. Nov 2011 A1
20110267300 Serban et al. Nov 2011 A1
20110273475 Herz et al. Nov 2011 A1
20110290686 Huang Dec 2011 A1
20110291993 Miyazaki Dec 2011 A1
20110295697 Boston et al. Dec 2011 A1
20110297566 Gallagher et al. Dec 2011 A1
20110298919 Maglaque Dec 2011 A1
20110304577 Brown Dec 2011 A1
20110304815 Newell Dec 2011 A1
20110304962 Su Dec 2011 A1
20110306424 Kazama et al. Dec 2011 A1
20110310038 Park et al. Dec 2011 A1
20110314425 Chiang Dec 2011 A1
20110316807 Corrion Dec 2011 A1
20120002052 Muramatsu et al. Jan 2012 A1
20120007821 Zaliva Jan 2012 A1
20120008015 Manabe Jan 2012 A1
20120011462 Westerman et al. Jan 2012 A1
20120013519 Hakansson et al. Jan 2012 A1
20120019165 Igaki et al. Jan 2012 A1
20120019686 Manabe Jan 2012 A1
20120020112 Fisher et al. Jan 2012 A1
20120020556 Manabe Jan 2012 A1
20120021618 Schultz Jan 2012 A1
20120023459 Westerman Jan 2012 A1
20120024682 Huang et al. Feb 2012 A1
20120026048 Vazquez et al. Feb 2012 A1
20120032891 Parivar Feb 2012 A1
20120032917 Yamaguchi Feb 2012 A1
20120033369 Wu et al. Feb 2012 A1
20120044140 Koyama Feb 2012 A1
20120044179 Hudson Feb 2012 A1
20120044379 Manabe Feb 2012 A1
20120047368 Chinn et al. Feb 2012 A1
20120050975 Garelli et al. Mar 2012 A1
20120062850 Travis Mar 2012 A1
20120068919 Lauder et al. Mar 2012 A1
20120069540 Lauder et al. Mar 2012 A1
20120071008 Sessford Mar 2012 A1
20120072167 Cretella, Jr. et al. Mar 2012 A1
20120075249 Hoch Mar 2012 A1
20120081316 Sirpal et al. Apr 2012 A1
20120087078 Medica et al. Apr 2012 A1
20120092279 Martin Apr 2012 A1
20120094257 Pillischer et al. Apr 2012 A1
20120099263 Lin Apr 2012 A1
20120099749 Rubin et al. Apr 2012 A1
20120102438 Robinson et al. Apr 2012 A1
20120105321 Wang May 2012 A1
20120106082 Wu et al. May 2012 A1
20120113031 Lee et al. May 2012 A1
20120113223 Hilliges et al. May 2012 A1
20120113579 Agata et al. May 2012 A1
20120115553 Mahe et al. May 2012 A1
20120117409 Lee et al. May 2012 A1
20120127118 Nolting et al. May 2012 A1
20120127126 Mattice et al. May 2012 A1
20120127573 Robinson et al. May 2012 A1
20120133561 Konanur et al. May 2012 A1
20120133797 Sato et al. May 2012 A1
20120140396 Zeliff et al. Jun 2012 A1
20120145525 Ishikawa Jun 2012 A1
20120146943 Fairley et al. Jun 2012 A1
20120155015 Govindasamy et al. Jun 2012 A1
20120161406 Mersky Jun 2012 A1
20120162126 Yuan et al. Jun 2012 A1
20120162693 Ito Jun 2012 A1
20120170284 Shedletsky Jul 2012 A1
20120175487 Goto Jul 2012 A1
20120182242 Lindahl et al. Jul 2012 A1
20120182249 Endo et al. Jul 2012 A1
20120182743 Chou Jul 2012 A1
20120185803 Wang et al. Jul 2012 A1
20120188791 Voloschenko et al. Jul 2012 A1
20120194393 Uttermann et al. Aug 2012 A1
20120194448 Rothkopf Aug 2012 A1
20120195063 Kim et al. Aug 2012 A1
20120200532 Powell et al. Aug 2012 A1
20120200802 Large Aug 2012 A1
20120206937 Travis et al. Aug 2012 A1
20120223866 Ayala Vazquez et al. Sep 2012 A1
20120224073 Miyahara Sep 2012 A1
20120229634 Laett et al. Sep 2012 A1
20120235635 Sato Sep 2012 A1
20120235790 Zhao et al. Sep 2012 A1
20120235921 Laubach Sep 2012 A1
20120243102 Takeda et al. Sep 2012 A1
20120243165 Chang et al. Sep 2012 A1
20120246377 Bhesania Sep 2012 A1
20120249443 Anderson et al. Oct 2012 A1
20120256829 Dodge Oct 2012 A1
20120256959 Ye et al. Oct 2012 A1
20120268912 Minami et al. Oct 2012 A1
20120274811 Bakin Nov 2012 A1
20120278744 Kozitsyn et al. Nov 2012 A1
20120284297 Aguera-Arcas et al. Nov 2012 A1
20120287562 Wu et al. Nov 2012 A1
20120300275 Vilardell et al. Nov 2012 A1
20120312955 Randolph Dec 2012 A1
20120323933 He et al. Dec 2012 A1
20120326003 Solow et al. Dec 2012 A1
20120328349 Isaac et al. Dec 2012 A1
20130009413 Chiu et al. Jan 2013 A1
20130016468 Oh Jan 2013 A1
20130017696 Alvarez Rivera Jan 2013 A1
20130021289 Chen et al. Jan 2013 A1
20130027354 Yabuta et al. Jan 2013 A1
20130027356 Nishida Jan 2013 A1
20130027867 Lauder et al. Jan 2013 A1
20130044074 Park et al. Feb 2013 A1
20130046397 Fadell et al. Feb 2013 A1
20130063873 Wodrich et al. Mar 2013 A1
20130067126 Casparian et al. Mar 2013 A1
20130076617 Csaszar et al. Mar 2013 A1
20130083466 Becze et al. Apr 2013 A1
20130088431 Ballagas et al. Apr 2013 A1
20130100008 Marti et al. Apr 2013 A1
20130100082 Bakin et al. Apr 2013 A1
20130106766 Yilmaz et al. May 2013 A1
20130106813 Hotelling et al. May 2013 A1
20130107144 Marhefka et al. May 2013 A1
20130120466 Chen et al. May 2013 A1
20130120760 Raguin et al. May 2013 A1
20130127980 Haddick et al. May 2013 A1
20130128102 Yano May 2013 A1
20130154959 Lindsay et al. Jun 2013 A1
20130155723 Coleman Jun 2013 A1
20130162554 Lauder et al. Jun 2013 A1
20130172906 Olson et al. Jul 2013 A1
20130182246 Tanase Jul 2013 A1
20130187753 Chiriyankandath Jul 2013 A1
20130201094 Travis Aug 2013 A1
20130212483 Brakensiek et al. Aug 2013 A1
20130216108 Hwang et al. Aug 2013 A1
20130217451 Komiyama et al. Aug 2013 A1
20130222272 Martin, Jr. Aug 2013 A1
20130222274 Mori et al. Aug 2013 A1
20130222323 McKenzie Aug 2013 A1
20130222353 Large Aug 2013 A1
20130222681 Wan Aug 2013 A1
20130227836 Whitt, III Sep 2013 A1
20130228023 Drasnin Sep 2013 A1
20130228433 Shaw Sep 2013 A1
20130228434 Whitt, III Sep 2013 A1
20130228435 Whitt, III Sep 2013 A1
20130228439 Whitt, III Sep 2013 A1
20130229100 Siddiqui et al. Sep 2013 A1
20130229335 Whitman Sep 2013 A1
20130229347 Lutz, III Sep 2013 A1
20130229350 Shaw et al. Sep 2013 A1
20130229351 Whitt, III Sep 2013 A1
20130229354 Whitt, III et al. Sep 2013 A1
20130229356 Marwah Sep 2013 A1
20130229357 Powell Sep 2013 A1
20130229363 Whitman Sep 2013 A1
20130229366 Dighde Sep 2013 A1
20130229380 Lutz, III Sep 2013 A1
20130229386 Bathiche Sep 2013 A1
20130229534 Panay Sep 2013 A1
20130229568 Belesiu et al. Sep 2013 A1
20130229570 Beck et al. Sep 2013 A1
20130229756 Whitt, III Sep 2013 A1
20130229757 Whitt, III et al. Sep 2013 A1
20130229758 Belesiu Sep 2013 A1
20130229759 Whitt, III Sep 2013 A1
20130229760 Whitt, III Sep 2013 A1
20130229761 Shaw Sep 2013 A1
20130229762 Whitt, III Sep 2013 A1
20130229773 Siddiqui et al. Sep 2013 A1
20130230346 Shaw Sep 2013 A1
20130231755 Perek Sep 2013 A1
20130232280 Perek Sep 2013 A1
20130232348 Oler Sep 2013 A1
20130232349 Oler et al. Sep 2013 A1
20130232350 Belesiu et al. Sep 2013 A1
20130232353 Belesiu Sep 2013 A1
20130232571 Belesiu Sep 2013 A1
20130242495 Bathiche et al. Sep 2013 A1
20130262886 Nishimura Oct 2013 A1
20130278552 Kamin-Lyndgaard Oct 2013 A1
20130283212 Zhu et al. Oct 2013 A1
20130300590 Dietz Nov 2013 A1
20130300647 Drasnin Nov 2013 A1
20130301199 Whitt Nov 2013 A1
20130301206 Whitt Nov 2013 A1
20130304941 Drasnin Nov 2013 A1
20130304944 Young Nov 2013 A1
20130307935 Rappel et al. Nov 2013 A1
20130308339 Woodgate et al. Nov 2013 A1
20130322000 Whitt Dec 2013 A1
20130322001 Whitt Dec 2013 A1
20130328761 Boulanger Dec 2013 A1
20130329301 Travis Dec 2013 A1
20130329360 Aldana Dec 2013 A1
20130332628 Panay Dec 2013 A1
20130335330 Lane Dec 2013 A1
20130335387 Emerton Dec 2013 A1
20130335902 Campbell Dec 2013 A1
20130335903 Raken Dec 2013 A1
20130339757 Reddy Dec 2013 A1
20130342464 Bathiche et al. Dec 2013 A1
20130342465 Bathiche Dec 2013 A1
20130346636 Bathiche Dec 2013 A1
20140012401 Perek Jan 2014 A1
20140022629 Powell Jan 2014 A1
20140043275 Whitman Feb 2014 A1
20140048399 Whitt, III Feb 2014 A1
20140049894 Rihn Feb 2014 A1
20140053108 Johansson Feb 2014 A1
20140055624 Gaines Feb 2014 A1
20140063198 Boulanger Mar 2014 A1
20140078063 Bathiche Mar 2014 A1
20140098085 Lee Apr 2014 A1
20140118241 Chai May 2014 A1
20140119802 Shaw May 2014 A1
20140123273 Matus May 2014 A1
20140125864 Rihn May 2014 A1
20140131000 Bornemann et al. May 2014 A1
20140132550 McCracken et al. May 2014 A1
20140135060 Mercer May 2014 A1
20140148938 Zhang May 2014 A1
20140155031 Lee et al. Jun 2014 A1
20140155123 Lee et al. Jun 2014 A1
20140185215 Whitt Jul 2014 A1
20140185220 Whitt Jul 2014 A1
20140194095 Wynne et al. Jul 2014 A1
20140196143 Fliderman et al. Jul 2014 A1
20140258937 Lee Sep 2014 A1
20140283142 Shepherd et al. Sep 2014 A1
20140362506 Whitt, III et al. Dec 2014 A1
20140372914 Byrd et al. Dec 2014 A1
20140378099 Huang et al. Dec 2014 A1
20140379942 Perek et al. Dec 2014 A1
20150005953 Fadell et al. Jan 2015 A1
20150020122 Shin et al. Jan 2015 A1
20150026092 Abboud et al. Jan 2015 A1
20150070119 Rihn et al. Mar 2015 A1
20150086174 Abecassis et al. Mar 2015 A1
20150117444 Sandblad et al. Apr 2015 A1
20150161834 Spahl et al. Jun 2015 A1
20150172264 Hardy Jun 2015 A1
20150243236 Jain et al. Aug 2015 A1
20150261262 Whitt, III et al. Sep 2015 A1
20160034284 Won et al. Feb 2016 A1
20160034424 Won Feb 2016 A1
20160034695 Won et al. Feb 2016 A1
20160037481 Won et al. Feb 2016 A1
Foreign Referenced Citations (101)
Number Date Country
990023 Jun 1976 CA
1440513 Sep 2003 CN
1515937 Jul 2004 CN
1650202 Aug 2005 CN
1700072 Nov 2005 CN
1787605 Jun 2006 CN
1920642 Feb 2007 CN
101038401 Sep 2007 CN
101366001 Feb 2009 CN
101473167 Jul 2009 CN
101512403 Aug 2009 CN
101644979 Feb 2010 CN
101688991 Mar 2010 CN
101889225 Nov 2010 CN
101893785 Nov 2010 CN
202441167 Sep 2012 CN
103455149 Dec 2013 CN
0271956 Jun 1988 EP
1223722 Jul 2002 EP
1425763 Jun 2004 EP
1591891 Nov 2005 EP
2353978 Aug 2011 EP
2378607 Oct 2011 EP
2381290 Oct 2011 EP
2618247 Jul 2013 EP
2123213 Jan 1984 GB
2178570 Feb 1987 GB
2410116 Jul 2005 GB
2428101 Jan 2007 GB
56108127 Aug 1981 JP
H07218865 Aug 1995 JP
H0980354 Mar 1997 JP
H09178949 Jul 1997 JP
H104540 Jan 1998 JP
H10234057 Sep 1998 JP
10301055 Nov 1998 JP
10326124 Dec 1998 JP
1173239 Mar 1999 JP
2000106021 Apr 2000 JP
2001174746 Jun 2001 JP
2002100226 Apr 2002 JP
2002162912 Jun 2002 JP
2002300438 Oct 2002 JP
2003215349 Jul 2003 JP
2004171948 Jun 2004 JP
3602207 Dec 2004 JP
2005077437 Mar 2005 JP
2005156932 May 2005 JP
2005331565 Dec 2005 JP
2006004877 Jan 2006 JP
2006160155 Jun 2006 JP
2006278251 Oct 2006 JP
2006294361 Oct 2006 JP
2006310269 Nov 2006 JP
2007184286 Jul 2007 JP
2007273288 Oct 2007 JP
2008066152 Mar 2008 JP
2008286874 Jul 2008 JP
2008529251 Jul 2008 JP
2009003053 Jan 2009 JP
2009059583 Mar 2009 JP
2009122551 Jun 2009 JP
2010151951 Jul 2010 JP
20010039013 May 2001 KR
20040066647 Jul 2004 KR
20080006404 Jan 2008 KR
20080009490 Jan 2008 KR
20080055051 Jun 2008 KR
20110064265 Jun 2011 KR
1020110087178 Aug 2011 KR
1038411 May 2012 NL
WO-9108915 Jun 1991 WO
WO-9964784 Dec 1999 WO
WO-0079327 Dec 2000 WO
WO-0128309 Apr 2001 WO
WO-0172037 Sep 2001 WO
WO-03048635 Jun 2003 WO
WO-03083530 Sep 2003 WO
WO-03106134 Dec 2003 WO
WO-2005027696 Mar 2005 WO
WO-2005059874 Jun 2005 WO
WO-2006044818 Apr 2006 WO
WO-2006082444 Aug 2006 WO
WO-2007094304 Aug 2007 WO
WO-2007103631 Sep 2007 WO
WO-2007123202 Nov 2007 WO
WO-2008013146 Jan 2008 WO
WO-2008038016 Apr 2008 WO
WO-2008055039 May 2008 WO
WO-2009034484 Mar 2009 WO
WO-2010011983 Jan 2010 WO
WO-2010105272 Sep 2010 WO
WO-2010147609 Dec 2010 WO
WO-2011016200 Feb 2011 WO
WO-2012036717 Mar 2012 WO
WO-2012063410 May 2012 WO
WO-2012174364 Dec 2012 WO
WO-2013012699 Jan 2013 WO
WO-2013033067 Mar 2013 WO
WO-2013033274 Mar 2013 WO
WO-2013163347 Oct 2013 WO
Non-Patent Literature Citations (421)
Entry
“Accessing Device Sensors”, retrieved from <https://developer.palm.com/content/api/dev-guide/pdk/accessing-device-sensors.html> on May 25, 2012, 4 pages.
“ACPI Docking for Windows Operating Systems”, Retrieved from: <http://www.scritube.com/limba/engleza/software/ACPI-Docking-for-Windows-Opera331824193.php> on Jul. 6, 2012, 10 pages.
“Cholesteric Liquid Crystal”, Retrieved from: <http://en.wikipedia.org/wiki/Cholesteric—liquid—crystal> on Aug. 6, 2012,(Jun. 10, 2012), 2 pages.
“Cirago Slim Case®—Protective case with built-in kickstand for your iPhone 5®”, Retrieved from <http://cirago.com/wordpress/wp-content/uploads/2012/10/ipc1500brochure1.pdf> on Jan. 29, 2013, (Jan. 2013), 1 page.
“Developing Next-Generation Human Interfaces using Capacitive and Infrared Proximity Sensing”, Silicon Laboratories, Inc., Available at <http://www.silabs.com/pages/DownloadDoc.aspx?FILEURL=support%20documents/techinicaldocs/capacitive%20and%20proximity%20sensing—wp.pdf&src=SearchResults>,(Aug. 30, 2010), pp. 1-10.
“Directional Backlighting for Display Panels”, U.S. Appl. No. 13/021,448, (Feb. 4, 2011), 38 pages.
“DR2PA”, retrieved from <http://www.architainment.co.uk/wp-content/uploads/2012/08/DR2PA-AU-US-size-Data-Sheet-Rev-H—LOGO.pdf> on Sep. 17, 2012, 4 pages.
“First One Handed Fabric Keyboard with Bluetooth Wireless Technology”, Retrieved from: <http://press.xtvworld.com/article3817.html> on May 8, 2012,(Jan. 6, 2005), 2 pages.
“Force and Position Sensing Resistors: An Emerging Technology”, Interlink Electronics, Available at <http://staff.science.uva.nl/˜vlaander/docu/FSR/An—Exploring—Technology.pdf>,(Feb. 1990), pp. 1-6.
“Frogpad Introduces Weareable Fabric Keyboard with Bluetooth Technology”, Retrieved from: <http://www.geekzone.co.nz/content.asp?contentid=3898> on May 7, 2012,(Jan. 7, 2005), 3 pages.
“How to Use the iPad's Onscreen Keyboard”, Retrieved from <http://www.dummies.com/how-to/content/how-to-use-the-ipads-onscreen-keyboard.html> on Aug. 28, 2012, 3 pages.
“i-Interactor electronic pen”, Retrieved from: <http://www.alibaba.com/product-gs/331004878/i—Interactor—electronic—pen.html> on Jun. 19, 2012, 5 pages.
“Incipio LG G-Slate Premium Kickstand Case—Black Nylon”, Retrieved from: <http://www.amazon.com/Incipio-G-Slate-Premium-Kickstand-Case/dp/B004ZKP916> on May 8, 2012, 4 pages.
“Membrane Keyboards & Membrane Keypads”, Retrieved from: <http://www.pannam.com/> on May 9, 2012,(Mar. 4, 2009), 2 pages.
“Motion Sensors”, Android Developers, retrieved from <http://developer.android.com/guide/topics/sensors/sensors—motion.html< on May 25, 2012, 7 pages.
“MPC Fly Music Production Controller”, AKAI Professional, Retrieved from: <http://www.akaiprompc.com/mpc-fly> Jul. 9, 12, 4 pages.
“NI Releases New Machine & Maschine Mikro”, Retrieved from <http://www.djbooth.net/index/dj-equipment/entry/ni-releases-new-maschine-mikro/> on Sep. 17, 2012, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/021,448, (Dec. 13, 2012), 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,001, (Feb. 19, 2013), 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,139, (Mar. 21, 2013), 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,202, (Feb. 11, 2013), 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,336, (Jan. 18, 2013), 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,195, (Jan. 2, 2013), 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,232, (Jan. 17, 2013), 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,272, (Feb. 12, 2013), 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,287, (Jan. 29, 2013), 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,304, (Mar. 22, 2013), 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,327, (Mar. 22, 2013), 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,871, (Mar. 18, 2013), 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,976, (Feb. 22, 2013), 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,321, (Feb. 1, 2013), 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,682, (Feb. 7, 2013), 11 pages.
“Notice of Allowance”, U.S. Appl. No. 13/470,633, (Mar. 22, 2013), 7 pages.
“On-Screen Keyboard for Windows 7, Vista, XP with Touchscreen”, Retrieved from <www.comfort-software.com/on-screen-keyboard.html> on Aug. 28, 2012, (Feb. 2, 2011), 3 pages.
“Optical Sensors in Smart Mobile Devices”, ON Semiconductor, TND415/D, Available at <http://www.onsemi.jp/pub—link/Collateral/TND415-D.PDF>,(Nov. 2010), pp. 1-13.
“Optics for Displays: Waveguide-based Wedge Creates Collimated Display Backlight”, OptoIQ, retrieved from <http://www.optoiq.com/index/photonics-technologies-applications/lfw-display/lfw-article-display-articles.laser-focus-world.volume-46.issue-1.world-news.optics-for—displays.html> Nov. 2, 2010,(Jan. 1, 2010), 3 pages.
“Position Sensors”, Android Developers, retrieved from <http://developer.android.com/guide/topics/sensors/sensors—position.html> on May 25, 2012, 5 pages.
“Reflex LCD Writing Tablets”, retrieved from <http://www.kentdisplays.com/products/lcdwritingtablets.html> on Jun. 27, 2012, 3 pages
“Restriction Requirement”, U.S. Appl. No. 13/471,139, (Jan. 17, 2013), 7 pages.
“Restriction Requirement”, U.S. Appl. No. 13/651,304, (Jan. 18, 2013), 7 pages.
“Restriction Requirement”, U.S. Appl. No. 13/651,726, (Feb. 22, 2013), 6 pages.
“Restriction Requirement”, U.S. Appl. No. 13/651,871, (Feb. 7, 2013),6 pages.
“SMART Board™ Interactive Display Frame Pencil Pack”, Available at <http://downloads01.smarttech.com/media/sitecore/en/support/product/sbfpd/400series(interactivedisplayframes)/guides/smartboardinteractivedisplayframepencilpackv12mar09.pdf>,(2009), 2 pages.
“SolRxTM E-Series Multidirectional Phototherapy ExpandableTM 2-Bulb Full Body Panel System”, Retrieved from: <http://www.solarcsystems.com/us—multidirectional—uv—light—therapy—1—intro.html > on Jul. 25, 2012,(2011), 4 pages.
“The Microsoft Surface Tablets Comes With Impressive Design and Specs”, Retrieved from <http://microsofttabletreview.com/the-microsoft-surface-tablets-comes-with-impressive-design-and-specs> on Jan. 30, 2013, (Jun. 2012), 2 pages.
“Tilt Shift Lenses: Perspective Control”, retrieved from http://www.cambridgeincolour.com/tutorials/tilt-shift-lenses1.htm, (Mar. 28, 2008), 11 Pages.
“Virtualization Getting Started Guide”, Red Hat Enterprise Linux 6, Edition 0.2, retrieved from <http://docs.redhat.com/docs/en-US/Red—Hat—Enterprise—Linux/6/html-single/Virtualization—Getting—Started—Guide/index.html> on Jun. 13, 2012, 24 pages.
“What is Active Alignment?”, http://www.kasalis.com/active—alignment.html, retrieved on Nov. 22, 2012, 2 Pages.
Block, Steve et al., “DeviceOrientation Event Specification”, W3C, Editor's Draft, retrieved from <https://developer.palm.com/content/api/dev-guide/pdk/accessing-device-sensors.html> on May 25, 2012,(Jul. 12, 2011), 14 pages.
Brown, Rich “Microsoft Shows Off Pressure-Sensitive Keyboard”, retrieved from <http://news.cnet.com/8301-17938—105-10304792-1.html> on May 7, 2012, (Aug. 6, 2009), 2 pages.
Butler, Alex et al., “SideSight: Multi-“touch” Interaction around Small Devices”, In the proceedings of the 21st annual ACM symposium on User interface software and technology., retrieved from <http://research.microsoft.com/pubs/132534/sidesight—crv3.pdf> on May 29, 2012,(Oct. 19, 2008), 4 pages.
Crider, Michael “Sony Slate Concept Tablet “Grows” a Kickstand”, Retrieved from: <http://androidcommunity.com/sony-slate-concept-tablet-grows-a-kickstand-20120116/> on May 4, 2012,(Jan. 16, 2012), 9 pages.
Das, Apurba et al., “Study of Heat Transfer through Multilayer Clothing Assemblies: A Theoretical Prediction”, Retrieved from <http://www.autexrj.com/cms/zalaczone—pliki/5—013—11.pdf>, (Jun. 2011), 7 pages.
Dietz, Paul H., et al., “A Practical Pressure Sensitive Computer Keyboard”, In Proceedings of UIST 2009,(Oct. 2009), 4 pages.
Gaver, William W., et al., “A Virtual Window on Media Space”, retrieved from <http://www.gold.ac.uk/media/15gaver-smets-overbeeke.MediaSpaceWindow.chi95.pdf> on Jun. 1, 2012, (May 7, 1995), 9 pages.
Glatt, Jeff “Channel and Key Pressure (Aftertouch).”, Retrieved from: <http://home.roadrunner.com/˜jgglatt/tutr/touch.htm> on Jun. 11, 2012, 2 pages.
Hanlon, Mike “ElekTex Smart Fabric Keyboard Goes Wireless”, Retrieved from: <http://www.gizmag.com/go/5048/ > on May 7, 2012,(Jan. 15, 2006), 5 pages.
Harada, Susumu et al., “VoiceDraw: A Hands-Free Voice-Driven Drawing Application for People With Motor Impairments”, In Proceedings of Ninth International ACM SIGACCESS Conference on Computers and Accessibility, retrieved from <http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.113.7211&rep=rep1&type=pdf> on Jun. 1, 2012,(Oct. 15, 2007), 8 pages.
Iwase, Eiji “Multistep Sequential Batch Assembly of Three-Dimensional Ferromagnetic Microstructures with Elastic Hinges”, Retrieved at <<http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1549861>> Proceedings: Journal of Microelectromechanical Systems, (Dec. 2005), 7 pages.
Kaufmann, Benoit et al., “Hand Posture Recognition Using Real-time Artificial Evolution”, EvoApplications'09, retrieved from <http://evelyne.lutton.free.fr/Papers/KaufmannEvolASP2010.pdf> on Jan. 5, 2012, (Apr. 3, 2010), 10 pages.
Kaur, Sukhmani “Vincent Liew's redesigned laptop satisfies ergonomic needs”, Retrieved from: <http://www.designbuzz.com/entry/vincent-liew-s-redesigned-laptop-satisfies-ergonomic-needs/> on Jul. 27, 2012,(Jun. 21, 2010), 4 pages.
Khuntontong, Puttachat et al., “Fabrication of Molded Interconnection Devices by Ultrasonic Hot Embossing on Thin Polymer Films”, IEEE Transactions on Electronics Packaging Manufacturing, vol. 32, No. 3,(Jul. 2009), pp. 152-156.
Linderholm, Owen “Logitech Shows Cloth Keyboard for PDAs”, Retrieved from: <http://www.pcworld.com/article/89084/logitech—shows—cloth—keyboard—for—pdas.html> on May 7, 2012,(Mar. 15, 2002), 5 pages.
Manresa-Yee, Cristina et al., “Experiences Using a Hands-Free Interface”, In Proceedings of the 10th International ACM SIGACCESS Conference on Computers and Accessibility, retrieved from <http://dmi.uib.es/˜cmanresay/Research/%5BMan08%5DAssets08.pdf> on Jun. 1, 2012,(Oct. 13, 2008), pp. 261-262.
McLellan, Charles “Eleksen Wireless Fabric Keyboard: a first look”, Retrieved from: <http://www.zdnetasia.com/eleksen-wireless-fabric-keyboard-a-first-look-40278954.htm> on May 7, 2012,(Jul. 17, 2006), 9 pages.
Nakanishi, Hideyuki et al., “Movable Cameras Enhance Social Telepresence in Media Spaces”, In Proceedings of the 27th International Conference on Human Factors in Computing Systems, retrieved from <http://smg.ams.eng.osaka-u.ac.jp/˜nakanishi/hnp—2009—chi.pdf> on Jun. 1, 2012,(Apr. 6, 2009), 10 pages.
Piltch, Avram “ASUS Eee Pad Slider SL101 Review”, Retrieved from <http://www.laptopmag.com/review/tablets/asus-eee-pad-slider-sl101.aspx>, (Sep. 22, 2011), 5 pages.
Post, E.R. et al., “E-Broidery: Design and Fabrication of Textile-Based Computing”, IBM Systems Journal, vol. 39, Issue 3 & 4,(Jul. 2000), pp. 840-860.
Purcher, Jack “Apple is Paving the Way for a New 3D GUI for IOS Devices”, Retrieved from: <http://www.patentlyapple.com/patently-apple/2012/01/apple-is-paving-the-way-for-a-new-3d-gui-for-ios-devices.html> on Jun. 4, 2012,(Jan. 12, 2012),15 pages.
Qin, Yongqiang et al., “pPen: Enabling Authenticated Pen and Touch Interaction on Tabletop Surfaces”, In Proceedings of ITS 2010, Available at <http://www.dfki.de/its2010/papers/pdf/po172.pdf>,(Nov. 2010), pp. 283-284.
Reilink, Rob et al., “Endoscopic Camera Control by Head Movements for Thoracic Surgery”, In Proceedings of 3rd IEEE RAS & EMBS International Conference of Biomedical Robotics and Biomechatronics, retrieved from <http://doc.utwente.nl/74929/1/biorob—online.pdf> on Jun. 1, 2012,(Sep. 26, 2010), pp. 510-515.
Sumimoto, Mark “Touch & Write: Surface Computing With Touch and Pen Input”, Retrieved from: <http://www.gottabemobile.com/2009/08/07/touch-write-surface-computing-with-touch-and-pen-input/> on Jun. 19, 2012,(Aug. 7, 2009), 4 pages.
Sundstedt, Veronica “Gazing at Games: Using Eye Tracking to Control Virtual Characters”, In ACM SIGGRAPH 2010 Courses, retrieved from <http://www.tobii.com/Global/Analysis/Training/EyeTrackAwards/veronica—sundstedt.pdf> Jun. 1, 2012,(Jul. 28, 2010), 85 pages.
Takamatsu, Seiichi et al., “Flexible Fabric Keyboard with Conductive Polymer-Coated Fibers”, In Proceedings of Sensors 2011,(Oct. 28, 2011), 4 pages.
Travis, Adrian et al., “Collimated Light from a Waveguide for a Display Backlight”, Optics Express, 19714, vol. 17, No. 22, retrieved from <http://download.microsoft.com/download/D/2/E/D2E425F8-CF3C-4C71-A4A2-70F9D4081007/OpticsExpressbacklightpaper.pdf> on Oct. 15, 2009, 6 pages.
Travis, Adrian et al., “The Design of Backlights for View-Sequential 3D”, retrieved from <http://download.microsoft.com/download/D/2/E/D2E425F8-CF3C-4C71-A4A2-70F9D4081007/Backlightforviewsequentialautostereo.docx> on Nov. 1, 2010, 4 pages.
Valli, Alessandro “Notes on Natural Interaction”, retrieved from <http://www.idemployee.id.tue.nl/g.w.m.rauterberg/lecturenotes/valli-2004.pdf> on Jan. 5, 2012,(Sep. 2005), 80 pages.
Valliath, G T., “Design of Hologram for Brightness Enhancement in Color LCDs”, Retrieved from <http://www.loreti.it/Download/PDF/LCD/44—05.pdf> on Sep. 17, 2012, 5 pages.
Vaucelle, Cati “Scopemate, A Robotic Microscope!”, Architectradure, retrieved from <http://architectradure.blogspot.com/2011/10/at-uist-this-monday-scopemate-robotic.html> on Jun. 6, 2012,(Oct. 17, 2011), 2 pages.
Williams, Jim “A Fourth Generation of LCD Backlight Technology”, Retrieved from <http://cds.linear.com/docs/Application%20Note/an65f.pdf>, (Nov. 1995), 124 pages.
Xu, Zhang et al., “Hand Gesture Recognition and Virtual Game Control Based on 3D Accelerometer and EMG Sensors”, IUI'09, Feb. 8-11, 2009, retrieved from <http://sclab.yonsei.ac.kr/courses/10TPR/10TPR.files/Hand%20Gesture%20Recognition%20and%20Virtual%20Game%20Control%20based%20on%203d%/20accelerometer%20and%20EMG%20sensors.pdf> on Jan. 5, 2012,(Feb. 8, 2009), 5 pages.
Xu, Zhi-Gang et al., “Vision-based Detection of Dynamic Gesture”, ICTM'09, Dec. 5-6, 2009, retrieved from <http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=5412956> on Jan. 5, 2012,(Dec. 5, 2009), pp. 223-226.
Zhang, et al., “Model-Based Development of Dynamically Adaptive Software”, In Proceedings of ICSE 2006, Available at <http://www.irisa.fr/lande/lande/icse-proceedings/icse/p371.pdf>,(May 20, 2006), pp. 371-380.
Zhu, Dingyun et al., “Keyboard before Head Tracking Depresses User Success in Remote Camera Control”, In Proceedings of 12th IFIP TC 13 International Conference on Human-Computer Interaction, Part II, retrieved from <http://csiro.academia.edu/Departments/CSIRO—ICT—Centre/Papers?page=5> on Jun. 1, 2012,(Aug. 24, 2009), 14 pages.
“Final Office Action”, U.S. Appl. No. 13/651,195, (Apr. 18, 2013), 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/563,435, (Jun. 14, 2013), 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/564,520, (Jun. 19, 2013), 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/565,124, (Jun. 17, 2013), 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,202, (May 28, 2013), 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,195, (Jul. 8, 2013), 9 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/043961, Oct. 17, 2013, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/780,228, Oct. 30, 2013, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/494,651, Feb. 4, 2014, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/525,070, Jan. 17, 2014, 19 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/563,435, Jan. 14, 2014, 2 pages.
“International Search Report”, Application No. PCT/US2010/045676, Apr. 28, 2011, 2 Pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/563,435, Jan. 22, 2014, 2 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/021,448, Aug. 16, 2013, 25 pages.
“International Search Report”, Application No. PCT/US2010/046129, Mar. 2, 2011, 3 Pages.
“What is the PD-Net Project About?”, retrieved from <http://pd-net.org/about/> on Mar. 10, 2011, 3 pages.
“Real-Time Television Content Platform”, retrieved from <http://www.accenture.com/us-en/pages/insight-real-time-television-platform.aspx> on Mar. 10, 2011, May 28, 2002, 3 pages.
“Notice of Allowance”, U.S. Appl. No. 13/563,435, Nov. 12, 2013, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/565,124, Dec. 24, 2013, 6 pages.
“Final Office Action”, U.S. Appl. No. 13/564,520, Jan. 15, 2014, 7 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/055679, Nov. 18, 2013, 8 pages.
Kim, et al., “A Controllable Viewing Angle LCD with an Optically isotropic liquid crystal”, Journal of Physics D: Applied Physics, vol. 43, No. 14, Mar. 23, 2010, 7 Pages.
Lee, “Flat-panel Backlight for View-sequential 3D Display”, Optoelectronics, IEE Proceedings, vol. 151, No. 6, IET, Dec. 2004, 4 pages.
Travis, et al., “Flat Projection for 3-D”, In Proceedings of the IEEE, vol. 94 Issue: 3, Available at <http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1605201>,Mar. 13, 2006, pp. 539-549.
Travis, et al., “P-127: Linearity in Flat Panel Wedge Projection”, SID 03 Digest, retrieved from <http://www2.eng.cam.ac.uk/˜arlt1/Linearity%20in%20flat%20panel%20wedge%20projection.pdf>, May 12, 2005, pp. 716-719.
Yagi, “The Concept of “AdapTV””, Series: The Challenge of “AdapTV”, Broadcast Technology, No. 28, 2006, pp. 16-17.
“Advisory Action”, U.S. Appl. No. 14/199,924, May 28, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/565,124, Mar. 10, 2014, 2 pages.
“Final Office Action”, U.S. Appl. No. 13/494,651, Jun. 11, 2014, 19 pages.
“Final Office Action”, U.S. Appl. No. 13/525,070, Apr. 24, 2014, 21 pages.
“Final Office Action”, U.S. Appl. No. 14/199,924, May 6, 2014, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/564,520, Jun. 16, 2014, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,237, May 12, 2014, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 14/018,286, May 23, 2014, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 14/199,924, Jun. 10, 2014, 4 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 14/018,286, Jun. 11, 2014, 5 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/563,435, Mar. 20, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/565,124, Apr. 3, 2014, 4 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/565,124, Apr. 14, 2014, 2 pages.
“Final Office Action”, U.S. Appl. No. 13/021,448, Jan. 16, 2014, 33 Pages.
“Final Office Action”, U.S. Appl. No. 13/780,228, Mar. 28, 2014, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,237, Mar. 24, 2014, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/564,520, Feb. 14, 2014, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/199,924, Apr. 10, 2014, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/200,595, Apr. 11, 2014, 4 pages.
“Advanced Configuration and Power Management Specification”, Intel Corporation, Microsoft Corporation, Toshiba Corp. Revision 1, Dec. 22, 1996, 364 pages.
“Advisory Action”, U.S. Appl. No. 13/939,032, Feb. 24, 2014, 2 pages.
“Apple®—45W MagSafe 2 Power Adapter with Magnetic DC Connector-”, Retrieved from <http://www.bestbuy.com/site/Apple%26%23174%3B---45W-MagSafe-2-Power-Adapter-with-Magnetic-DC-Connector/5856526.p?id=1218696408860&skuld=5856526#tab=overview> on May 14, 2013, 2013, 4 Pages.
“Basic Cam Motion Curves”, Retrieved From: <http://ocw.metu.edu.tr/pluginfile.php/6886/mod—resource/content/1/ch8/8-3.htm> Nov. 22, 2013, Middle East Technical University,1999, 14 Pages.
“Can I Customize my Samsung Galaxy S® 4 Lock Screen? Which Features can I Access When the Device is Locked?”, Retrieved From: <http://www.samsung.com/us/support/howtoguide/N0000006/10632/127767> Jul. 3, 2014, May 16, 2014, 12 Pages.
“Controlling Your Desktop's Power Management”, Retrieved From: <http://www.vorkon.de/SU1210.001/drittanbieter/Dokumentation/openSUSE—11.2/manual/sec.gnomeuser.start.power—mgmt.html> Jul. 7, 2014, 6 Pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/470,633, Apr. 9, 2013, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/470,633, Jul. 2, 2013, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/651,327, Sep. 12, 2013, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/651,327, Sep. 23, 2013, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/651,726, Sep. 17, 2013, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/656,520, Jan. 16, 2014, 3 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/715,133, Apr. 2, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/938,930, May 6, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/938,930, Jun. 6, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/939,002, May 22, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/939,002, Jun. 19, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/939,002, May 5, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/939,032, Jun. 26, 2014, 2 pages.
“Edwards 1508 Series Surface Wall Mount Electromagnetic Door Holder”, Edwards Signaling, retrieved from <http://www.thesignalsource.com/documents/1508.pdf>, 2000, 1 page.
“Final Office Action”, U.S. Appl. No. 12/163,614, Nov. 8, 2012, 15 pages.
“Final Office Action”, U.S. Appl. No. 12/163,614, Aug. 19, 2011, 15 pages.
“Final Office Action”, U.S. Appl. No. 13/408,257, Mar. 28, 2014, 17 pages.
“Final Office Action”, U.S. Appl. No. 13/471,001, Jul. 25, 2013, 20 pages.
“Final Office Action”, U.S. Appl. No. 13/471,139, Sep. 16, 2013, 13 pages.
“Final Office Action”, U.S. Appl. No. 13/471,336, Aug. 28, 2013, 18 pages.
“Final Office Action”, U.S. Appl. No. 13/603,918, Mar. 21, 2014, 14 pages.
“Final Office Action”, U.S. Appl. No. 13/651,232, May 21, 2013, 21 pages.
“Final Office Action”, U.S. Appl. No. 13/651,287, May 3, 2013, 16 pages.
“Final Office Action”, U.S. Appl. No. 13/651,976, Jul. 25, 2013, 21 pages.
“Final Office Action”, U.S. Appl. No. 13/653,321, Aug. 2, 2013, 17 pages.
“Final Office Action”, U.S. Appl. No. 13/653,682, Jun. 11, 2014, 11 pages.
“Final Office Action”, U.S. Appl. No. 13/653,682, Oct. 18, 2013, 16 pages.
“Final Office Action”, U.S. Appl. No. 13/656,055, Oct. 23, 2013, 14 pages.
“Final Office Action”, U.S. Appl. No. 13/938,930, Nov. 8, 2013, 10 pages.
“Final Office Action”, U.S. Appl. No. 13/939,002, Nov. 8, 2013, 7 pages.
“Final Office Action”, U.S. Appl. No. 13/939,032, Dec. 20, 2013, 5 pages.
“Final Office Action”, U.S. Appl. No. 14/063,912, Apr. 29, 2014, 10 pages.
“FingerWorks Installation and Operation Guide for the TouchStream ST and TouchStream LP”, FingerWorks, Inc. Retrieved from <http://ec1.images-amazon.com/media/i3d/01/A/man-migrate/MANUAL000049862.pdf>, 2002, 14 pages.
“For Any Kind of Proceeding 2011 Springtime as Well as Coil Nailers as Well as Hotter Summer Season”, Lady Shoe Worlds, retrieved from <http://www.ladyshoesworld.com/2011/09/18/for-any-kind-of-proceeding-2011-springtime-as-well-as-coil-nailers-as-well-as-hotter-summer-season/> on Nov. 3, 2011,Sep. 8, 2011, 2 pages.
“Foreign Notice of Allowance”, CN Application No. 201320096755.7, Jan. 27, 2014, 2 pages.
“Foreign Office Action”, CN Application No. 201110272868.3, Apr. 1, 2013, 10 pages.
“Foreign Office Action”, CN Application No. 201320097066.8, Oct. 24, 2013, 5 Pages.
“Foreign Office Action”, CN Application No. 201320097079.5, Sep. 26, 2013, 4 pages.
“Foreign Office Action”, CN Application No. 201320328022.1, Feb. 17, 2014, 4 Pages.
“Foreign Office Action”, CN Application No. 201320328022.1, Oct. 18, 2013, 3 Pages.
“iControlPad 2—The open source controller”, Retrieved from <http://www.kickstarter.com/projects/1703567677/icontrolpad-2-the-open-source-controller> on Nov. 20, 2012, 2012, 15 pages.
“Interlink Electronics FSR (TM) Force Sensing Resistors (TM)”, Retrieved at <<http://akizukidenshi.com/download/ds/interlinkelec/94-00004+Rev+B%20FSR%201ntegration%20Guide.pdf>> on Mar. 21, 2013, 36 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028479, Jun. 17, 2013, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2014/031531, Jun. 20, 2014, 10 Pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/051421, Dec. 6, 2013, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/065154, Feb. 5, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2014/020050, May 9, 2014, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028488, Jun. 24, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/053683, Nov. 28, 2013, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2014/016654, May 16, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028481, Jun. 19, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/028948, Jun. 21, 2013, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/029461, Jun. 21, 2013, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/040968, Sep. 5, 2013, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/067912, Feb. 13, 2014, 12 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/075180, May 6, 2014, 12 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/044871, Aug. 14, 2013, 12 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/042550, Sep. 24, 2013, 14 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2014/013928, May 12, 2014, 17 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/045283, Mar. 12, 2014, 19 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/050471, Apr. 9, 2012, 8 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/044873, Nov. 22, 2013, 9 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/063156, Dec. 5, 2013, 9 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/067905, Apr. 15, 2014, 9 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/042790, Aug. 8, 2013, 9 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/045049, Sep. 16, 2013, 9 pages.
“Lock Screen Overview (Windows Runtime Apps)”, Retrieved From: <http://msdn.microsoft.com/en-in/library/windows/apps/hh779720.aspx> Jul. 8, 2014, Dec. 31, 2012, 5 Pages.
“Magnetic Cell Phone Holder”, Extreme Computing, retrieved from <http://www.extremecomputing.com/magnetholder.html> on May 7, 2008, 1 page.
“Microsoft Develops Glasses-Free Eye-Tracking 3D Display”, Tech-FAQ—retrieved from <http://www.tech-faq.com/microsoft-develops-glasses-free-eye-tracking-3d-display.html> on Nov. 2, 2011, Nov. 2, 2011, 3 pages.
“Microsoft Reveals Futuristic 3D Virtual HoloDesk Patent”, Retrieved from <http://www.patentbolt.com/2012/05/microsoft-reveals-futuristic-3d-virtual-holodesk-patent.htmlt> on May 28, 2012, May 23, 2012, 9 pages.
“Microsoft Tablet PC”, Retrieved from <http://web.archive.org/web/20120622064335/https://en.wikipedia.org/wiki/Microsoft—Tablet—PC> on Jun. 4, 2014, Jun. 21, 2012, 9 pages.
“Molex:PCI Express Mini Card Connector, Right Angle, Low-Profile, Mid-Mount 0.80mm (.031″) Pitch”, Retrieved from <http://rhu004.sma-promail.com/SQLImages/kelmscott/Molex/PDF—Images/987650-4441.PDF> on Feb. 6, 2013, 2010, 3 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/409,967, Dec. 10, 2013, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/599,635, Feb. 25, 2014, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/163,614, Apr. 27, 2011, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/163,614, May 24, 2012, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/882,994, Feb. 1, 2013, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/408,257, Dec. 5, 2013, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/468,918, Dec. 26, 2013, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/468,949, Jun. 20, 2014, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/470,951, Jul. 2, 2014, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,001, Jun. 17, 2014, 23 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,030, May 15, 2014, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,054, Jun. 3, 2014, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,186, Feb. 27, 2014, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,336, May 7, 2014, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,376, Apr. 2, 2014, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/471,405, Feb. 20, 2014, 37 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/492,232, Apr. 30, 2014, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/494,722, May 9, 2014, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/527,263, Apr. 3, 2014, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/527,263, Jul. 19, 2013, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/595,700, Jun. 18, 2014, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/599,763, May 28, 2014, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/603,918, Dec. 19, 2013, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/645,405, Jan. 31, 2014, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/647,479, Jul. 3, 2014, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,232, Dec. 5, 2013, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,726, Apr. 15, 2013, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,871, Jul. 1, 2013, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/651,976, Jun. 16, 2014, 23 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,682, Feb. 26, 2014, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/653,682, Jun. 3, 2013, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/655,065, Apr. 24, 2014, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/656,055, Mar. 12, 2014, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/656,055, Apr. 23, 2013, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/656,520, Feb. 1, 2013, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/656,520, Jun. 5, 2013, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/938,930, Aug. 29, 2013, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/939,002, Aug. 28, 2013, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/939,002, Dec. 20, 2013, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/939,032, Aug. 29, 2013, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/974,994, Jun. 4, 2014, 24 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/975,087, May 8, 2014, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/063,912, Jan. 2, 2014, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/225,250, Jun. 17, 2014, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/225,276, Jun. 13, 2014, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/277,240, Jun. 13, 2014, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 12/163,614, Apr. 3, 2013, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 12/882,994, Jul. 12, 2013, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 13/409,967, Feb. 14, 2014, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/468,918, Jun. 17, 2014, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,139, Mar. 17, 2014, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,186, Jul. 3, 2014, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/471,405, Jun. 24, 2014, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,232, Apr. 25, 2014, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,272, May 2, 2013, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,287, May 2, 2014, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,304, Jul. 1, 2013, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,327, Jun. 11, 2013, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,726, May 31, 2013, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/651,871, Oct. 2, 2013, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/653,321, Dec. 18, 2013, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/656,520, Oct. 2, 2013, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/667,408, Mar. 13, 2014, 11 pages.
“Notice of Allowance”, U.S. Appl. No. 13/715,133, Jan. 6, 2014, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/938,930, Feb. 20, 2014, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/939,002, Mar. 3, 2014, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/939,032, Apr. 3, 2014, 4 pages.
“Notice to Grant”, CN Application No. 201320097089.9, Sep. 29, 2013, 2 Pages.
“Notice to Grant”, CN Application No. 201320097124.7, Oct. 8, 2013, 2 pages.
“PCI Express® SMT Connector | FCI”, Retrieved from <http://www.ttiinc.com/object/fp—fci—PCISMT> on Feb. 6, 2013, Feb. 2013, 1 page.
“Restriction Requirement”, U.S. Appl. No. 13/468,918, Nov. 29, 2013, 6 pages.
“Restriction Requirement”, U.S. Appl. No. 13/603,918, Nov. 27, 2013, 8 pages.
“Restriction Requirement”, U.S. Appl. No. 13/715,133, Oct. 28, 2013, 6 pages.
“Restriction Requirement”, U.S. Appl. No. 13/367,812, Mar. 11, 2014, 6 pages.
“Restriction Requirement”, U.S. Appl. No. 13/494,722, Dec. 20, 2013, 6 pages.
“Restriction Requirement”, U.S. Appl. No. 13/589,773, Aug. 6, 2014, 6 pages.
“Restriction Requirement”, U.S. Appl. No. 13/595,700, May 28, 2014, 6 pages.
“Restriction Requirement”, U.S. Appl. No. 13/715,133, Dec. 3, 2013, 6 pages.
“Restriction Requirement”, U.S. Appl. No. 13/715,229, Aug. 13, 2013, 7 pages.
“RoPD® Connectors”, Retrieved from <http://www.rosenberger.de/documents/headquarters—de—en/ba—automotive/AUTO—RoPD—Flyer—2012.pdf> on May 14, 2013, Jun. 2012, 6 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/653,321, Mar. 28, 2014, 4 pages.
“Surface”, Retrieved from <http://www.microsoft.com/surface/en-us/support/hardware-and-drivers/type-cover> on Dec. 24, 2013, 6 pages.
“Teach Me Simply”, Retrieved From: <http://techmesimply.blogspot.in/2013/05/yugatech—3.html> on Nov. 22, 2013, May 3, 2013, pp. 1-6.
“Welcome to Windows 7”, Retrieved from: <http://www.microsoft.com/en-us/download/confirmation.aspx?id=4984> on Aug. 1, 2013, Sep. 16, 2009, 3 pages.
“Windows 7: Display Reminder When Click on Shutdown?”, Retrieved From: <http://www.sevenforums.com/customization/118688-display-reminder-when-click-shutdown.html> Jul. 8, 2014, Oct. 18, 2010, 5 Pages.
“Write & Learn Spellboard Advanced”, Available at <http://somemanuals.com/VTECH,WRITE%2526LEARN--SPELLBOARD--ADV--71000,JIDFHE.PDF>, 2006, 22 pages.
Bathiche, et al., “Input Device with Interchangeable Surface”, U.S. Appl. No. 13/974,749, Aug. 23, 2013, 51 pages.
Bert, et al., “Passive Matrix Addressing of Electrophoretic Image Display”, Conference on International Display Research Conference, Retrieved from <http://www.cmst.be/publi/eurodisplay2002—s14-1.pdf>, Oct. 1, 2002, 4 pages.
Breath, “ThinkSafe: A Magnetic Power Connector for Thinkpads”, Retrieved from <http://www.instructables.com/id/ThinkSafe%3A-A-Magnetic-Power-Connector-for-Thinkpad/> on May 14, 2013, Oct. 26, 2006, 9 pages.
Burge, et al., “Determination of off-axis aberrations of imaging systems using on-axis measurements”, SPIE Proceeding, Retrieved from <http://www.loft.optics.arizona.edu/documents/journal—articles/Jim—Burge—Determination—of—off-axis—aberrations—of—imaging—systems—using—on-axis—measurements.pdf>, Sep. 21, 2011, 10 pages.
Campbell, “Future iPhones May Unlock, Hide Messages based on a User's Face”, Retrieved From:<http://appleinsider.com/articles/13/12/03/future-iphones-may-unlock-hide-messages-based-on-a-users-face> Jul. 3, 2014, Dec. 3, 2013, 11 Pages.
Caprio, “Enabling Notification Badges for Whatsapp and Other Android Apps”, Retrieved From: <http://geek.ng/2013/05/enabling-notification-badges-for-whatsapp-and-other-android-apps.html> Jul. 3, 2014, May 20, 2014, 7 Pages.
Carlon, “How to Add a WhatsApp Widget to your Lock Screen”, Retrieved From: <http://www.androidpit.com/how-to-add-a-whatsapp-widget-to-your-lock-screen> Jul. 3, 2014, Apr. 9, 2014, 6 Pages.
Chang, et al., “Optical Design and Analysis of LCD Backlight Units Using ASAP”, Optical Engineering, Available at <http://www.opticsvalley.com/resources/kbasePDF/ma—oe—001—optical—design.pdf>, Jun. 2003, 15 pages.
Chavan, et al., “Synthesis, Design and Analysis of a Novel Variable Lift Cam Follower System”, In Proceedings: International Journal of Design Engineering, vol. 3, Issue 4, Inderscience Publishers, Jun. 3, 2010, 1 Page.
Constine, “Cover is an Android-Only Lockscreen that Shows Apps When You Need Them”, Retrieved From: <http://techcrunch.com/2013/10/24/cover-android/> Jul. 2, 2014, Oct. 24, 2013, 15 pages.
Diverdi, et al., “An Immaterial Pseudo-3D Display with 3D Interaction”, In the proceedings of Three-Dimensional Television: Capture, Transmission, and Display, Springer, Retrieved from <http://www.cs.ucsb.edu/˜holl/pubs/DiVerdi-2007-3DTV.pdf>, Feb. 6, 2007, 26 pages.
Eckel, “Personalize Alerts with the Help of OS X Mavericks Notifications”, Retrieved From: <http://www.techrepublic.com/article/customize-os-x-mavericks-notifications-to-personalize-alerts/> Jul. 8, 2014, Mar. 10, 2014, 7 Pages.
Grossman, et al., “Multi-Finger Gestural Interaction with 3D Volumetric Displays”, In the proceedings of the 17th annual ACM symposium on User interface software and technology, Retrieved from <http://www.dgp.toronto.edu/papers/tgrossman—UIST2004.pdf>, Oct. 24, 2004, pp. 61-70.
Haslam, “This App for Android Customizes your Lock Screen Automatically Depending on Time of Day or Situation”, Retrieved From: <http://www.redmondpie.com/this-app-for-android-customizes-your-lock-screen-automatically-depending-on-time-of-day-or-situation/> Jul. 8, 2014, Jun. 1, 2012, 6 Pages.
Henry, “Supercharge Your Lock Screen with DashClock and These Add-Ons”, Retrieved From: <http://lifehacker.com/supercharge-your-lock-screen-with-dashclock-and-these-a-493206006> Jul. 3, 2014, May 7, 2013, 12 Pages.
Hinckley, et al., “Codex: A Dual Screen Tablet Computer”, Conference on Human Factors in Computing Systems, Apr. 9, 2009, 10 pages.
Izadi, et al., “ThinSight: A Thin Form-Factor Interactive Surface Technology”, Communications of the ACM, vol. 52, No. 12, retrieved from <http://research.microsoft.com/pubs/132532/p90-izadi.pdf> on Jan. 5, 2012, Dec. 2009, pp. 90-98.
Jacobs, et al., “2D/3D Switchable Displays”, In the proceedings of Sharp Technical Journal (4), Available at <https://cgi.sharp.co.jp/corporate/rd/journal-85/pdf/85-04.pdf>, Apr. 2003, pp. 15-18.
Justin, “Seidio Active with Kickstand for the Galaxy SIII”, Retrieved From: <http://www.t3chniq.com/seidio-active-with-kickstand-gs3/> on Nov. 22, 2013, Jan. 3, 2013, 5 Pages.
Lahr, “Development of a Novel Cam-based Infinitely Variable Transmission”, Proceedings: In Thesis of Master of Science in Mechanical Engineering, Virginia Polytechnic Institute and State University,Nov. 6, 2009, 91 pages.
Lambert, “Cam Design”, In Proceedings: Kinematics and dynamics of Machine, University of Waterloo Department of Mechanical Engineering,Jul. 2, 2002, pp. 51-60.
Lane, et al., “Media Processing Input Device”, U.S. Appl. No. 13/655,065, Oct. 18, 2012, 43 pages.
Lee, “Flat-Panel Autostereoscopic 3D Display”, Optoelectronics, IET, Available at <http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=04455550>,Feb. 2008, pp. 24-28.
Lee, et al., “Depth-Fused 3D Imagery on an Immaterial Display”, In the proceedings of IEEE Transactions on Visualization and Computer Graphics, vol. 15, No. 1, Retrieved from <http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=04540094>, Jan. 2009, pp. 20-33.
Lee, et al., “LED Light Coupler Design for a Ultra Thin Light Guide”, Journal of the Optical Society of Korea, vol. 11, Issue 3, Retrieved from <http://opticslab.kongju.ac.kr/pdf/06.pdf>, Sep. 2007, 5 pages.
Li, et al., “Characteristic Mode Based Tradeoff Analysis of Antenna-Chassis Interactions for Multiple Antenna Terminals”, In IEEE Transactions on Antennas and Propagation, Retrieved from <http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6060882>, Feb. 2012, 13 pages.
Liu, et al., “Three-dimensional PC: toward novel forms of human-computer interaction”, In the proceedings of Three-Dimensional Video and Display: Devices and Systems, vol. CR76, Retrieved from <http://www.google.co.in/url?sa=t&rct=j&q=Three-dimensional+PC:+toward+novel+forms+of+human-computer+interaction&source=web&cd=1&ved=0CFoQFjAA&url=http%3A%2Fciteseerx.ist.psu.edu%/2Fviewdoc%2Fdownload%3Fdoi%3D10.1.1.32.9469%26rep%3Drep1%26, Nov. 5, 2000, pp. 250-281.
Mack, “Moto X: The First Two Weeks”, Retrieved From: <http://www.gizmag.com/two-weeks-motorola-google-moto-x-review/28722/> Jul. 8, 2014, Aug. 16, 2013, 8 pages.
McLellan, “Microsoft Surface Review”, Retrieved from <http://www.zdnet.com/microsoft-surface-review-7000006968/> on May 13, 2013, Nov. 6, 2012, 17 pages.
Miller, “MOGA gaming controller enhances the Android gaming experience”, Retrieved from <http://www.zdnet.com/moga-gaming-controller-enhances-the-android-gaming-experience-7000007550/> on Nov. 20, 2012, Nov. 18, 2012, 9 pages.
Morookian, et al., “Ambient-Light-Canceling Camera Using Subtraction of Frames”, NASA Tech Briefs, Retrieved from <http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20110016693—2011017808.pdf>, May 2004, 2 pages.
Patterson, “iOS 7 Tip: Alerts, Banners, and Badges—What's the Difference?”, Retrieved From: <http://heresthethingblog.com/2014/01/22/ios-7-tip-whats-difference-alert/>, Jan. 22, 2014, 6 Pages.
Peli, “Visual and Optometric Issues with Head-Mounted Displays”, IS & T/OSA Optics & Imaging in the Information Age, The Society for Imaging Science and Technology, available at <http://www.u.arizona.edu/˜zrui3/zhang—pHMPD—spie07.pdf>,1996, pp. 364-369.
Prospero, “Samsung Outs Series 5 Hybrid PC Tablet”, Retrieved from: <http://blog.laptopmag.com/samsung-outs-series-5-hybrid-pc-tablet-running-windows-8> on Oct. 31, 2013, Jun. 4, 2012, 7 pages.
Ramirez, “Applying Solventless Elastomeric Polyurethanes on Concrete in Wastewater Service”, In Proceedings: Journal of Protective Coatings and Linings, May 1995, 13 pages.
Reisman, et al., “A Screen-Space Formulation for 2D and 3D Direct Manipulation”, In the proceedings of the 22nd annual ACM symposium on User interface, Retrieved from <http://innovis.cpsc.ucalgary.ca/innovis/uploads/Courses/TableTopDetails2009/Reisman2009.pdf>, Oct. 4, 2009, pp. 69-78.
Ritchie, “How to Use Lock Screen, Today, Popups, and Banners in Notification Center for iPhone and iPad”, Retrieved From: <http://www.imore.com/how-use-notification-center-iphone-ipad> Jul. 3, 2014, Apr. 30, 2014, 8 pages.
Royman, “NiLS Lockscreen Notifications”, Retrieved From: <https://play.google.com/store/apps/details?id=com.roymam.android.notificationswidget&hl=en> Jul. 3, 2014, Jun. 28, 2014, 3 Pages.
Salman, “Create a Minimal Lock Screen With WidgetLocker”, Retrieved From: <http://android.appstorm.net/how-to/create-a-minimal-lock-screen-with-widgetlocker/> Jul. 3, 2014, Dec. 26, 2011, 12 Pages.
Sanap, et al., “Design and Analysis of Globoidal Cam Index Drive”, In Proceedings: International Journal of Scientific Research Engineering & Technology, Jun. 2013, 6 Pages.
Schoning, et al., “Building Interactive Multi-Touch Surfaces”, Journal of Graphics, GPU, and Game Tools, vol. 14, No. 3, available at <http://www.libavg.com/raw-attachment/wiki/Multitouch/Multitouchguide—draft.pdf>, Nov. 2009, pp. 35-55.
Siddiqui, “Hinge Mechanism for Rotatable Component Attachment”, U.S. Appl. No. 13/852,848, Mar. 28, 2013, 51 pages.
Staff, “Gametel Android controller turns tablets, phones into portable gaming devices”, Retrieved from <http://www.mobiletor.com/2011/11/18/gametel-android-controller-turns-tablets-phones-into-portable-gaming-devices/#> on Nov. 20, 2012, Nov. 18, 2011, 5 pages.
Thurrott, “Nokia Lumia “Black”: Glance 2.0”, Retrieved From:<http://winsupersite.com/windows-phone/nokia-lumia-black-glance-20> Jul. 8, 2014, Jan. 11, 2014, 3 Pages.
Whitwam, “How to Tweak Android's Lock Screen and Notifications”, Retrieved From: <http://www.tested.com/tech/android/457766-tips-and-tricks-make-androids-lock-screen-and-notifications-even-better/?icid=pets%7Chat%7Ctestedlink%7C457766-how-to-tweak-androids-lock-screen-and-notifications> Jul. 3, 2014, Sep. 18, 2013, 4 Pages.
Yan, et al., “Edge-Lighting Light Guide Plate Based on Micro-Prism for Liquid Crystal Display”, Journal of Display Technology, vol. 5, No. 9, Available at <http://ieeexplore.ieee.org/ielx5/9425/5196834/05196835.pdf?tp=&arnumber=5196835&isnumber=5196834>, Sep. 2009, pp. 355-357.
Yu, et al., “A New Driving Scheme for Reflective Bistable Cholesteric Liquid Crystal Displays”, Society for Information Display International Symposium Digest of Technical Papers, Retrieved from <http://www.ee.ust.hk/˜eekwok/publications/1997/bcd—sid.pdf>, May 1997, 4 pages.
Zhang, “Design of Head Mounted Displays”, Retrieved at <<http://www.optics.arizona.edu/optomech/student%20reports/2007/Design%20of%20mounteddisplays%20Zhang.pdf>>, Dec. 12, 2007, 6 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/199,924, Aug. 29, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/199,924, Sep. 5, 2014, 2 pages.
“EP Search Report”, Application No. 09812072.8, Apr. 5, 2012, 6 Pages.
“EP Search Report”, Application No. 09812072.8, Apr. 17, 2013, 5 Pages.
“Foreign Notice of Allowance”, CN Application No. 201320097065.3, Nov. 21, 2013, 2 pages.
“Foreign Office Action”, Application No. 200980134848, May 13, 2013, 7 Pages.
“Foreign Office Action”, Application No. 200980134848, Dec. 4, 2013, 8 Pages.
“Foreign Office Action”, Application No. 200980134848, Dec. 19, 2012, 8 Pages.
“Foreign Office Action”, Application No. 201080037117.7, Jul. 1, 2014, 9 Pages.
“Foreign Office Action”, Application No. 2011-526118, Aug. 16, 2013, 8 Pages.
“Foreign Office Action”, Application No. 201210023945.6, Jun. 25, 2014, 6 Pages.
“Foreign Office Action”, CN Application No. 200980134848, May 31, 2012, 7 Pages.
“Foreign Office Action”, CN Application No. 201320097065.3, Jun. 18, 2013, 2 pages.
“Foreign Office Action”, JP Application No. 2012-525632, May 2, 2014, 10 Pages.
“Foreign Office Action”, JP Application No. 2012-525722, Apr. 22, 2014, 15 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2009/055250, Mar. 2, 2014, 10 Pages.
“Non-Final Office Action”, U.S. Appl. No. 13/021,448, Jul. 22, 2014, 35 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/525,070, Aug. 14, 2014, 24 pages.
Boual, et al., “Wedge Displays as Cameras”, Retrieved From: http://www.camfpd.com/72-3.pdf, SID Symposium Digest of Technical Papers, vol. 37, Issue 1, pp. 1999-2002, Jun. 2006, 4 Pages.
Chen, et al., “Design of a Novel Hybrid Light Guide Plate for Viewing Angle Switchable Backlight Module”, Institute of Photonic Systems, National Chiao Tung University, Tainan, Taiwan, Jul. 1, 2013, 4 Pages.
Chou, et al., “Imaging and Chromatic Behavior Analysis of a Wedge-Plate Display”, Retrieved From: http://www.di.nctu.edu.tw/2006TDC/papers/Flexible/06-012.doc, SID Symposium Digest of Technical Papers, vol. 37, Issue 1, pp. 1031-1034, Jun. 2006, 4 Pages.
Ishida, et al., “A Novel Ultra Thin Backlight System without Optical Sheets Using a Newly Developed Multi-Layered Light-guide”, SID 10 Digest, Jul. 5, 2012, 4 Pages.
Nishizawa, et al., “Investigation of Novel Diffuser Films for 2D Light-Distribution Control”, Tohoku University, Aramaki Aoba, Aoba-ku, Sendai 980-8579, Japan, LINTEC Corporation, 23-23 Honcho, Itabashi-ku, Tokyo 173-0001, Japan, Dec. 2011, 4 Pages.
Phillips, et al., “Links Between Holography and Lithography”, Fifth International Symposium on Display Holography, 206., Feb. 17, 1995, 9 Pages.
Powell, “High-Efficiency Projection Screen”, U.S. Appl. No. 14/243,501, filed Apr. 2, 2014, 26 Pages.
Travis, “P-60: LCD Smear Elimination by Scanning Ray Angle into a Light Guide”, Retrieved From: http://www2.eng.cam.ac.uk/˜arlt1/P—60.pdf, SID Symposium Digest of Technical Papers, vol. 35, Issue 1, pp. 474-477, May 2004, 4 Pages.
Travis, et al., “Optical Design of a Flat Panel Projection Wedge Display”, 9th International Display Workshops, paper FMC6-3, Dec. 4-6, 2002, Hiroshima, Japan, Dec. 2002, 4 Pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/021,448, Aug. 17, 2015, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/564,520, Aug. 14, 2015, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/200,595, Jun. 4, 2015, 3 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/457,881, Aug. 20, 2015, 2 pages.
“Extended European Search Report”, EP Application No. 12800433.0, Oct. 28, 2014, 10 pages.
“Final Office Action”, U.S. Appl. No. 14/059,280, Jul. 22, 2015, 25 pages.
“Foreign Office Action”, CN Application No. 201280029520.4, Jun. 30, 2015, 11 pages.
“Foreign Office Action”, JP Application No. 2012-525722, Aug. 13, 2014, 17 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2014/066248, Mar. 12, 2015, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/727,001, Jul. 10, 2015, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/021,448, Jul. 30, 2015, 11 pages.
“Notice of Allowance”, U.S. Appl. No. 14/457,881, Jul. 22, 2015, 7 pages.
“Restriction Requirement”, U.S. Appl. No. 13/598,898, Jul. 17, 2015, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/525,070, May 18, 2015, 32 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/059,280, Mar. 3, 2015, 18 pages.
“Notice of Allowance”, U.S. Appl. No. 13/564,520, May 8, 2015, 4 pages.
“Advisory Action”, U.S. Appl. No. 14/059,280, Sep. 25, 2015, 7 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/564,520, Sep. 17, 2015, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/457,881, Oct. 2, 2015, 2 pages.
“Extended European Search Report”, EP Application No. 13859406.4, Sep. 8, 2015, 6 pages.
“Foreign Office Action”, CN Application No. 201310067592.4, Oct. 23, 2015, 12 Pages.
“Foreign Office Action”, CN Application No. 201310067622.1, Oct. 27, 2015, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/598,898, Oct. 23, 2015, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/162,529, Sep. 18, 2015, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/447,306, Oct. 1, 2015, 16 pages.
“Notice of Allowance”, U.S. Appl. No. 13/525,070, Sep. 25, 2015, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 14/059,280, Nov. 23, 2015, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 14/727,001, Oct. 2, 2015, 4 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/525,070, Oct. 19, 2015, 2 pages.
“Final Office Action”, U.S. Appl. No. 13/021,448, Jan. 2, 2015, 19 pages.
“Final Office Action”, U.S. Appl. No. 13/525,070, Jan. 29, 2015, 30 pages.
“First Examination Report”, NZ Application No. 628690, Nov. 27, 2014, 2 pages.
“Foreign Office Action”, CN Application No. 201080037117.7, Aug. 20, 2013, 10 pages.
“Foreign Office Action”, CN Application No. 201210023945.6, Dec. 3, 2013, 13 pages.
“Notice of Allowance”, U.S. Appl. No. 14/200,595, Feb. 17, 2015, 2 pages.
“Notice of Allowance”, U.S. Appl. No. 14/200,595, Feb. 25, 2015, 4 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/494,651, Oct. 24, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/494,651, Dec. 29, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/199,924, Sep. 19, 2014, 2 pages.
“Final Office Action”, U.S. Appl. No. 14/200,595, Nov. 19, 2014, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/564,520, Jan. 26, 2015, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 13/494,651, Oct. 2, 2014, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/589,773, Sep. 16, 2014, 8 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/589,773, Jan. 27, 2015, 2 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/589,773, Nov. 5, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/525,070, Jan. 13, 2016, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/727,001, Jan. 25, 2016, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/727,001, Dec. 15, 2015, 2 pages.
“Extended European Search Report”, EP Application No. 13857958.6, Dec. 18, 2015, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/447,109, Feb. 11, 2016, 15 pages.
“Notice of Allowance”, U.S. Appl. No. 14/727,001, Dec. 15, 2015, 2 pages.
“Final Office Action”, U.S. Appl. No. 13/598,898, Apr. 1, 2016, 20 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/994,737, Apr. 5, 2016, 6 pages.
Related Publications (1)
Number: 20130207937 A1; Date: Aug. 2013; Country: US