Conductive trace routing for display and bezel sensors

Information

  • Patent Grant
  • Patent Number
    9,946,383
  • Date Filed
    Friday, April 29, 2016
  • Date Issued
    Tuesday, April 17, 2018
Abstract
Conductive trace routing techniques for display and bezel sensors are described. In one or more implementations, an apparatus includes display sensors, bezel sensors, and a plurality of conductive traces. The display sensors are configured to detect proximity of an object and are arranged in conjunction with a display area of a display device to support interaction with a user interface displayed by the display device. The bezel sensors are configured to detect proximity of an object and are disposed in a bezel that at least partially surrounds the display device and is outside the display area. The plurality of conductive traces are disposed between the display and bezel sensors and communicatively couple the display sensors and the bezel sensors to one or more computing components that are configured to process inputs received from the display sensors and the bezel sensors.
Description
BACKGROUND

Touchscreen functionality has expanded the ways in which a user may interact with a device. One example of such functionality is the recognition of gestures, which may be performed to initiate corresponding operations of the computing device.


However, conventional techniques that were employed to support this interaction were often limited in how the gestures were detected, such as by use of touchscreen functionality incorporated directly over a display portion of a display device. Additionally, these conventional techniques were often static and thus did not address how the computing device was being used.


Consequently, even though gestures could expand the techniques via which a user may interact with a computing device, conventional implementations of these techniques often did not address how a user interacted with a device to perform these gestures, which could be frustrating to a user as well as inefficient.


SUMMARY

Conductive trace routing techniques for display and bezel sensors are described. In one or more implementations, an apparatus includes display sensors, bezel sensors, and a plurality of conductive traces. The display sensors are configured to detect proximity of an object and are arranged in conjunction with a display area of a display device to support interaction with a user interface displayed by the display device. The bezel sensors are configured to detect proximity of an object and are disposed in a bezel that at least partially surrounds the display device and is outside the display area. The plurality of conductive traces are disposed between the display and bezel sensors and communicatively couple the display sensors and the bezel sensors to one or more computing components that are configured to process inputs received from the display sensors and the bezel sensors.


In one or more implementations, a computing device includes a housing, a touch panel, one or more computing components implemented at least partially in hardware, and a plurality of conductive traces. The housing assumes a handheld form factor that is configured to be held by one or more hands of a user. The touch panel is secured to the housing and includes a display device, display sensors configured to detect proximity of an object and arranged in conjunction with a display area of the display device, and bezel sensors disposed in a bezel of the touch panel that are also configured to detect proximity of an object. The one or more computing components are configured to process inputs received from the display and bezel sensors to identify gestures. The plurality of conductive traces are routed between the display and bezel sensors and communicatively couple the display sensors and the bezel sensors to the one or more computing components.


In one or more implementations, a plurality of inputs are received from display and bezel sensors of a touch panel of a computing device that are communicatively coupled to one or more computing components of the computing device using a plurality of conductive traces that are routed between the display and bezel sensors. The inputs are distinguished between inputs received that are indicative of a user's hand as holding a housing of the computing device and inputs that are indicative of a gesture. Performance of one or more operations by the one or more computing components is initiated that correspond to the indicated gesture.


This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.





BRIEF DESCRIPTION OF THE DRAWINGS

The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items.



FIG. 1 is an illustration of an environment in an example implementation that is operable to employ the conductive trace routing and bezel sensor techniques described herein.



FIG. 2 depicts a system showing an example of a communicative coupling of the display and bezel sensors to the object detection module using a plurality of conductive traces.



FIG. 3 depicts an example implementation in which an object detection module is configured to distinguish and leverage inputs provided by display and bezel sensors to support interaction with the computing device of FIG. 1.



FIG. 4 depicts an example system showing a cut-away view of the display and bezel sensors along with the plurality of conductive traces in a coplanar relationship.



FIG. 5 depicts an implementation showing first and second examples of arrangements of the display sensors, bezel sensors, and plurality of conductive traces in relation to each other.



FIG. 6 is a flow diagram depicting a procedure in an example implementation in which inputs are distinguished based on a likelihood of being indicative of a user's hand as holding a housing of a computing device and inputs that are likely indicative of a gesture.



FIG. 7 illustrates various components of an example device that can be implemented as any type of portable and/or computer device as described with reference to FIGS. 1-6 to implement embodiments of the conductive trace routing and bezel sensor techniques described herein.





DETAILED DESCRIPTION

Overview


Object detection sensors are configured to detect proximity of an object, such as a finger of a user's hand. These sensors may be incorporated as part of a display device to form a touch panel such that a user may interact “directly” with user interface elements displayed by a display device of the touch panel, e.g., by touching over the display of an element. Conventional techniques that were utilized to route conductive traces of the sensors involved routing of the traces along a perimeter of the touch panel. As such, these traces could prevent the extension of the sensors to an edge of the touch panel.


Conductive trace routing techniques for display and bezel sensors are described. In one or more implementations, a touch panel includes display sensors and bezel sensors. The display sensors are configured to support interaction with a display area of a display device, such as with elements of a user interface as previously described. Bezel sensors are also included that may be configured to detect proximity of an object. The bezel sensors are disposed in a bezel that at least partially surrounds the display area of the display device.


Conductive traces, which communicatively couple the display and bezel sensors to computing components of a computing device that includes the touch panel, are disposed in an area between the display and bezel sensors. In this way, the bezel sensors may be disposed near an edge of a housing of a computing device to support detection of an object near this edge and/or another display device. For example, the bezel sensors may be located in an area between two display devices to support touch functionality between the devices. Thus, the bezel sensors may be utilized to support a variety of functionality, such as to detect whether the computing device is being held by one or more hands of a user, for use of bezel gestures, specific absorption rate (SAR) management techniques, and so on as further described below.


In the following discussion, an example environment is first described that is operable to employ the conductive trace routing and bezel sensor techniques described herein. Example illustrations of the techniques and procedures are then described, which may be employed in the example environment as well as in other environments. Accordingly, the example environment is not limited to performing the example techniques and procedures. Likewise, the example techniques and procedures are not limited to implementation in the example environment.


Example Environment



FIG. 1 is an illustration of an environment 100 in an example implementation that is operable to employ the conductive trace routing and bezel sensor techniques described herein. The illustrated environment 100 includes a computing device 102. In this example, the computing device 102 includes one or more computing components 104 that are implemented at least partially in hardware and are configured to perform and/or assist in performance of one or more operations of the computing device 102, e.g., in execution of instructions specified by software. Examples of computing components 104 include a processing system 106, memory 108, a display device 110, object detection sensors 112 that include display and bezel sensors 114, 116 and an object detection module 118. Examples of software that are executable on the processing system 106 and are storable in memory 108 include an operating system 120 and applications 122.


The computing device 102 may be configured in a variety of ways. For example, a computing device may be configured as a computer that is capable of communicating over a network, such as a desktop computer, a mobile station, an entertainment appliance, a set-top box communicatively coupled to a display device, a wireless phone, a game console, portable music or game device, remote control, and so forth.


Thus, the computing device 102 may range from full resource devices with substantial memory and processor resources (e.g., personal computers, game consoles) to low-resource devices with limited memory and/or processing resources (e.g., traditional set-top boxes, hand-held game consoles). Additionally, although a single computing device 102 is shown, the computing device 102 may be representative of a plurality of different devices, such as a remote control and set-top box combination, a game console and game controller configuration, a device that includes peripheral devices, a dedicated touchpad, and so on.


For example, the computing device 102 is illustrated as including a housing 124 in which the computing components 104 are disposed. The housing 124 is configured to support a handheld form factor such that the housing 124 may be held by a hand 126 of a user while supporting interaction with another hand 128 of the user, may be grasped by both hands 126, 128 of a user, and so on.


In the illustrated example, a finger of a user's hand 128 is illustrated as interacting with a user interface displayed in a display area 130 of the display device 110. This may be utilized to support a variety of different functionality, which may include interaction with user interface elements displayed on the display device 110, recognition of gestures, and so on that include processing of inputs by an object detection module 118 to determine a location at which the object is detected.


For instance, proximity of an object, e.g., the finger of the user's hand 128, may be detected by display sensors 114 that are disposed as proximal to the display area 132. The display sensors 114 may be configured in a variety of different ways to detect proximity of an object, such as capacitive sensors, resistive sensors, acoustic sensors, image capture devices (e.g., sensor-in-a-pixel), and so forth such that the display sensors 114 do not obstruct a view by a user of the user interface in the display area 132 in this instance. A variety of other objects may also be detected, such as a stylus, and so on. Thus, in this example the display sensors 114 and the display device 110 are configured to form a touch panel to support touchscreen functionality.


The computing device 102 also includes bezel sensors 116 that are disposed in a bezel 134 that at least partially surrounds the display area 132 of the display device 110. The bezel 134 and corresponding bezel sensors 116 may be included as part of the touch panel described earlier. The bezel 134, however, may be configured such that a user interface is not displayable through the bezel 134, which is illustrated in black in the figure. Other examples are also contemplated in which the bezel 134 may be utilized to display parts of a user interface, e.g., to indicate a position of a user's hand, include user interface elements, notifications, and so on.


The bezel sensors 116 of the bezel 134 may also be configured to detect proximity of an object, such as parts of a user's hand 128 as illustrated that are disposed over the bezel sensors 116. Like the display sensors 114, the bezel sensors 116 may be configured in a variety of ways, such as capacitive sensors, resistive sensors, acoustic sensors, image capture devices (e.g., sensor-in-a-pixel), thermal sensors, strain sensors, and so on.


Inputs from the bezel sensors 116 may also be processed by the object detection module 118 to determine a location at which the object is detected as proximal to the bezel 134. For example, the object detection module 118 may include a single controller implemented in hardware that is configured to process inputs received from the bezel sensors 116. In one or more implementations, this single controller of the object detection module 118 may also be configured to process inputs received from the display sensors 114, which may be utilized to reduce overall cost and improve efficiency of the computing device 102.
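
By way of illustration only, the following minimal Python sketch (not part of the original disclosure) models a single controller that services both sensor groups over the conductive traces; the names TouchController, read_display, and read_bezel are hypothetical stand-ins for controller firmware.

    class TouchController:
        """A single controller servicing both display and bezel sensors.

        read_display and read_bezel are hypothetical callables standing in
        for the two sensor groups wired to the controller through the
        conductive traces; scan() merges their readings into one stream of
        events for consumption by an object detection module.
        """

        def __init__(self, read_display, read_bezel):
            self.read_display = read_display
            self.read_bezel = read_bezel

        def scan(self):
            events = []
            for pos, delta in self.read_display():           # grid touches as ((row, col), delta)
                events.append(("display", pos, delta))
            for idx, delta in enumerate(self.read_bezel()):  # discrete bezel nodes
                if delta > 0:
                    events.append(("bezel", idx, delta))
            return events

    # Simulated sensor reads: one display touch and one active bezel node.
    ctrl = TouchController(read_display=lambda: [((1, 1), 34)],
                           read_bezel=lambda: [0, 12, 0])
    print(ctrl.scan())  # [('display', (1, 1), 34), ('bezel', 1, 12)]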


Other multi-controller examples are also contemplated, such as to reduce power consumption as further described below by keeping the bezel sensors 116 “active” while other computing components 104 are in a sleep state to initiate a “wake” of those components. The inputs from the bezel sensors 116 may be processed singly or in combination with inputs received from the display sensors 114 as further described below.


A variety of different types of gestures may be recognized by the object detection module 118, such as gestures that are recognized from a single type of input as well as gestures involving multiple types of inputs. For example, the computing device 102 may be configured to detect and differentiate proximity of an object to the display sensors 114 of the display device 110 from proximity to one or more bezel sensors 116 utilized to detect proximity of an object at a bezel 134 of the display device 110. The differentiation may be performed in a variety of ways, such as by detecting a location at which the object is positioned, use of different sensors, and so on.


Thus, the object detection module 118 may support a variety of different gesture techniques by recognizing and leveraging a division between inputs received via a display portion 132 of the display device 110 and the bezel 134. Consequently, the combination of display and bezel inputs may serve as a basis to indicate a variety of different gestures.


For instance, primitives of touch (e.g., tap, hold, two-finger hold, grab, cross, pinch, hand or finger postures, and so on) may be composed to create a space of intuitive and semantically rich gestures that are dependent on “where” these inputs are detected as well as which sensors were utilized in this detection. It should be noted that by differentiating between display and bezel inputs, the number of gestures that are made possible by each of these inputs alone is also increased. For example, although the movements may be the same, different gestures (or different parameters to analogous commands) may be indicated using inputs detected via the display 132 versus a bezel 134.
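
By way of illustration only, the following minimal Python sketch (not part of the original disclosure) shows how the same movement might be mapped to different gestures depending on whether it was detected via the display sensors or the bezel sensors; the Source enumeration and the gesture names are invented for the example.

    from enum import Enum

    class Source(Enum):
        DISPLAY = "display"  # reported by the display sensors 114
        BEZEL = "bezel"      # reported by the bezel sensors 116

    def classify_drag(source, dx):
        """Map the same leftward drag to different gestures by source.

        A drag reported by the bezel sensors is treated here as an edge
        swipe (e.g., to reveal a menu), while the identical motion reported
        by the display sensors is treated as ordinary content panning.
        """
        if source is Source.BEZEL and dx < 0:
            return "edge_swipe_in"
        if source is Source.DISPLAY:
            return "pan_content"
        return "unrecognized"

    # The same movement yields different gestures depending on the sensors involved.
    print(classify_drag(Source.BEZEL, dx=-40))    # edge_swipe_in
    print(classify_drag(Source.DISPLAY, dx=-40))  # pan_content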


Although the following discussion may describe specific examples of inputs, in some instances the types of inputs may be switched (e.g., display inputs may be used to replace bezel inputs and vice versa) or even removed (e.g., both inputs may be provided using either portion) without departing from the spirit and scope of the discussion.



FIG. 2 depicts a system 200 showing an example of a communicative coupling of the display and bezel sensors to the object detection module 118. In this example, a section of a touch panel is shown in greater detail as including the display sensors 114 and the bezel sensors 116 from the display portion 132 and bezel of the display device 110, respectively. The display and bezel sensors 114, 116 may be configured as previously described to include matching techniques to detect proximity of an object, e.g., such that the display sensors 114 “extend into” the bezel 134 to support use as bezel sensors 116.


The display sensors 114 may also be configured to detect proximity of an object in a manner that is different than that used by the bezel sensors 116, e.g., different types, patterns, and so on. In the illustration, for instance, the display sensors 114 may be configured as a grid (e.g., using indium tin oxide or “ITO”) that is configured to detect proximity of an object at a variety of different locations using mutual capacitance. Mutual capacitance occurs between charge-holding objects such that current passes between the objects. For example, lines of the grid of the display sensors 114 may act as capacitor plates with a material disposed between the lines acting as a dielectric of a capacitor.


The bezel sensors 116, however, may be configured to support direct capacitance that is discrete for individual ones of the bezel sensors 116. Thus, in this example individual sensors of the bezel sensors 116 may be utilized to detect proximity of an object to the respective sensors. Detection by the bezel sensors 116 using direct capacitance may support a greater range of detection than that supported using mutual capacitance by the display sensors 114. This may be utilized to support a variety of different functionality, such as to detect proximity of an object as it approaches but does not contact a surface that includes the bezel sensors 116, further discussion of which may be found in relation to FIG. 4.
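
By way of illustration only, the following minimal Python sketch (not part of the original disclosure) contrasts the two sensing schemes: locating touches at intersections of a mutual-capacitance grid versus classifying discrete direct-capacitance bezel nodes, whose longer range allows a "hover" state before contact. The thresholds and readings are invented for the example.

    def scan_mutual_grid(cap_deltas, touch_threshold=30):
        """Locate touches on a mutual-capacitance grid (display sensors).

        cap_deltas is a rows x cols matrix of capacitance changes at each
        row/column intersection; a touch appears as a localized change that
        exceeds the threshold.  Values are illustrative only.
        """
        touches = []
        for r, row in enumerate(cap_deltas):
            for c, delta in enumerate(row):
                if delta >= touch_threshold:
                    touches.append((r, c))
        return touches

    def scan_bezel_nodes(node_deltas, hover_threshold=8, touch_threshold=25):
        """Classify discrete direct-capacitance bezel sensors.

        Direct capacitance supports a longer range, so a node can report
        'hover' before the object actually contacts the surface.
        """
        states = []
        for delta in node_deltas:
            if delta >= touch_threshold:
                states.append("touch")
            elif delta >= hover_threshold:
                states.append("hover")
            else:
                states.append("idle")
        return states

    grid = [[0, 2, 1], [1, 34, 3], [0, 2, 0]]    # simulated display scan
    print(scan_mutual_grid(grid))                # [(1, 1)]
    print(scan_bezel_nodes([3, 12, 40, 5]))      # ['idle', 'hover', 'touch', 'idle']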


The display sensors 114 and the bezel sensors 116 are illustrated as being communicatively coupled to the object detection module 118 using a plurality of conductive traces 202. By routing the conductive traces 202 between the display and bezel sensors 114, 116, the bezel sensors 116 may be positioned adjacent to an edge of the housing 124, as opposed to routing of the traces between the sensors and the housing as was conventionally performed. In this way, the bezel sensors 116 may be positioned relatively close to the edge of the housing 124 (e.g., within one millimeter), which was not feasible using conventional techniques. The “to the edge” location of the bezel sensors 116 along with the display sensors 114 may thus support an extended detection area that may be leveraged to support a variety of different functionality, further discussion of which may be found below and shown in a corresponding figure.



FIG. 3 depicts an example implementation 300 in which the object detection module 118 is configured to distinguish between and leverage inputs provided by the display and bezel sensors 114, 116 to support interaction with the computing device 102. The object detection module 118 may leverage the bezel sensors 116 along with the display sensors 114 to make a determination as to how interaction with the computing device 102 is performed.


In the illustrated example, for instance, inputs from the display and bezel sensors 114, 116 may be processed by the object detection module 118 to make a determination that the housing 124 of the computing device 102 is likely held in the user's left hand 126. Additionally, inputs from the display and bezel sensors 114, 116 may also be processed to indicate that a fingertip of a user's hand 128 is detected by the display sensors 114 along with flat fingers and a palm of that user's right hand 128 by the display and bezel sensors 114, 116.


Thus, in this example the determination may cause inputs to be ignored that correspond to the left hand 126. A determination may also be made to permit inputs that correspond to a fingertip of the user's right hand 128 but reject inputs from the palm and flat finger surfaces of that hand. Although this example describes restriction of inputs that are not likely related to a gesture, a variety of other examples are also contemplated, such as to orient a user interface based on how the computing device 102 is likely held, cause output of user interface elements at positions based on “where” the computing device is likely held, to manage wireless devices in accordance with specific absorption rate (SAR) considerations, and so forth as further described below.
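
By way of illustration only, the following minimal Python sketch (not part of the original disclosure) shows one crude, rule-based way such a determination might reject grip and palm contacts while permitting a fingertip; the actual techniques described herein may also accept bezel inputs for bezel gestures and may rely on learned data rather than fixed thresholds.

    def filter_contacts(contacts, palm_area_mm2=150.0):
        """Separate likely gesture contacts from likely grip/palm contacts.

        Each contact is a dict with a 'zone' ('display' or 'bezel') and an
        approximate contact 'area' in square millimeters.  This crude rule
        rejects bezel contacts and large-area contacts as the holding hand
        or a resting palm, and keeps small display contacts as candidate
        gesture input.  The threshold is invented for the example.
        """
        accepted, rejected = [], []
        for contact in contacts:
            if contact["zone"] == "bezel" or contact["area"] >= palm_area_mm2:
                rejected.append(contact)   # likely holding hand or palm
            else:
                accepted.append(contact)   # likely an intentional fingertip
        return accepted, rejected

    contacts = [
        {"zone": "bezel", "area": 220.0},    # thumb of the holding hand
        {"zone": "display", "area": 45.0},   # fingertip performing a gesture
        {"zone": "display", "area": 300.0},  # resting palm
    ]
    kept, dropped = filter_contacts(contacts)
    print(len(kept), len(dropped))  # 1 2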


A determination of which inputs to leverage (e.g., for recognition of a gesture, orientation of a user interface, and so on) may be made in a variety of ways. For example, a plurality of samples may be collected that involve different usage scenarios. These samples may then be processed using machine learning or other techniques to generate data that describes likely hand positions, orientations, and so on for inputs received from the sensors. Accuracy of a likely determination may be improved based on resolution of the inputs, e.g., the more bezel sensors 116 utilized in the bezel 134 the greater the likelihood of an accurate determination using inputs from these sensors.
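
By way of illustration only, the following minimal Python sketch (not part of the original disclosure) shows one simple form such machine learning could take: a nearest-centroid classifier trained on labeled snapshots of sensor activations. The feature vectors and labels are invented, and no particular model is specified by the description above.

    import math

    def train_centroids(samples):
        """Compute a per-label centroid from labeled sensor snapshots.

        Each sample is (features, label), where features might be a coarse
        vector of bezel-sensor activations captured while users held the
        device in a known way.
        """
        sums, counts = {}, {}
        for features, label in samples:
            acc = sums.setdefault(label, [0.0] * len(features))
            for i, value in enumerate(features):
                acc[i] += value
            counts[label] = counts.get(label, 0) + 1
        return {lbl: [v / counts[lbl] for v in vec] for lbl, vec in sums.items()}

    def classify(features, centroids):
        """Return the label whose centroid is nearest to the new snapshot."""
        def dist(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
        return min(centroids, key=lambda lbl: dist(features, centroids[lbl]))

    training = [
        ([9, 8, 1, 0], "left_grip"), ([8, 9, 0, 1], "left_grip"),
        ([1, 0, 9, 8], "right_grip"), ([0, 1, 8, 9], "right_grip"),
    ]
    model = train_centroids(training)
    print(classify([7, 9, 1, 0], model))  # left_grip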


Further, inputs from additional sensors may be leveraged to improve this determination, such as from sensors disposed on the sides and/or back of the housing 124. In the illustrated example, for instance, sensors may be disposed on the sides of the housing 124 that are generally perpendicular to a plane of a surface of the display device 110 (e.g., at an angle of twenty-two degrees from a perpendicular plane to that of the display device), the back of the housing 124 that is defined as a plane generally parallel to the plane of the surface of the display device 110 (e.g., on an opposing side), and so on. Other examples of use of inputs received from the display and bezel sensors 114, 116 are also contemplated as further described below.



FIG. 4 depicts an example system 400 showing a cut-away view of the display and bezel sensors along with the plurality of conductive traces 202 in a co-planar relationship. In this example, the display and bezel sensors 114, 116 are formed along with the plurality of conductive traces 202 in a co-planar relationship on a substrate 402, e.g., glass, plastic, and so on.


A grid of ITO, for instance, may be utilized to form the display sensors 114 as shown in FIG. 2. Discrete sensing elements may also be formed from the ITO for the bezel sensors 116 on the substrate 402 along with the plurality of conductive traces 202. The conductive traces 202 may be formed in a channel between the display and bezel sensors 114, 116 such that these traces are also coplanar, one to another, and are formed without using “jumpers.” In this way, overall thinness of the computing device 102 may be promoted by avoiding use of additional layers, thereby preserving a handheld form factor of the computing device 102.


As previously described, the bezel sensors 116 may be configured to support an increased sensing range in comparison with the display sensors 114 in one or more implementations. This increased sensing range may be utilized in a variety of ways, such as to detect an object 404 (e.g., a palm of a user's hand) and thereby reject inputs that correspond to detection of that object before contact with a surface of the computing device 102 and/or detection by the display sensors 114.


Additionally, the machine learning techniques may also be employed to manage computing device 102 operation based on detection of the object 404, which may include movement of the object in relation to the object detection sensors 112. For example, the bezel sensors 116 may be configured to consume less power than the display sensors 114. As such, the bezel sensors 116 may operate in a polling fashion by “waking” at periodic intervals to determine if an object 404 is proximal while other computing components 104 are in a sleep state. If so, this detection by the object detection module 118 may then cause other computing components 104 that are in the sleep state (e.g., hibernation state) to “wake” to support user interaction. Movement of the object 404 may also assist in this determination, such as toward or away from the bezel sensors 116 as illustrated by the phantom line in the figure.
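
By way of illustration only, the following minimal Python sketch (not part of the original disclosure) shows a polling loop of the kind described, in which low-power bezel sensors are sampled at intervals and a wake of the sleeping components is requested when proximity is detected; read_bezel_nodes and wake_system are hypothetical stand-ins for firmware hooks.

    import time

    def bezel_wake_loop(read_bezel_nodes, wake_system,
                        poll_interval_s=0.5, proximity_threshold=10):
        """Poll low-power bezel sensors while other components sleep.

        The loop wakes at a fixed interval, checks whether any bezel node
        reports proximity, and if so asks the sleeping computing components
        to resume.  Thresholds and timings are invented for the example.
        """
        while True:
            readings = read_bezel_nodes()
            if any(delta >= proximity_threshold for delta in readings):
                wake_system()
                return
            time.sleep(poll_interval_s)

    # Simulated hardware: the third poll sees a hand approaching the bezel.
    scans = iter([[0, 1, 0], [2, 0, 1], [3, 14, 6]])
    bezel_wake_loop(lambda: next(scans), lambda: print("waking components"),
                    poll_interval_s=0.0)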


Thus, in this example, the object detection sensors 112 may be utilized to determine a likely location of the object 404 as well as orientation of the computing device 102 itself in relation to the object 404. This may also be leveraged as part of specific absorption rate (SAR) management of computing components 104 of the computing device 102 that emit radiation, e.g., wireless communication components. For instance, the object detection module 118 may indicate that an object 404 is disposed proximal to the computing device 102, antennas of the computing device 102, and so on. This indication may then be leveraged by the computing device 102 (e.g., operating system 120, applications 122, and so on) to reduce an amount of radiation emitted by a Wi-Fi® network connection device, Bluetooth® wireless connection device, and so on. Further, this indication may be leveraged with the indication of movement to support further functionality, e.g., to permit higher emissions as the object 404 is moved away as opposed to when the object 404 is moved toward the sensors, and so forth. Although a coplanar relationship was described in this example 400, non-coplanar relationships between the display sensors 114, bezel sensors 116, and plurality of conductive traces 202 are also contemplated without departing from the spirit and scope thereof, an example of which is described as follows and shown in a corresponding figure.
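
By way of illustration only, the following minimal Python sketch (not part of the original disclosure) shows a toy SAR policy of the kind described, reducing transmit power while an object is near and approaching, and permitting higher emissions as it moves away; the distances, thresholds, and power levels are invented.

    def select_tx_power(distance_mm, prev_distance_mm, max_dbm=20, reduced_dbm=10):
        """Pick a transmit power level from object proximity and motion.

        Transmit power is reduced while an object is near the antennas and
        not moving away, and raised again once it is farther away or is
        detected moving away.
        """
        NEAR_MM = 30
        moving_away = distance_mm > prev_distance_mm
        if distance_mm <= NEAR_MM and not moving_away:
            return reduced_dbm
        return max_dbm

    print(select_tx_power(distance_mm=15, prev_distance_mm=25))  # 10 (object near and approaching)
    print(select_tx_power(distance_mm=45, prev_distance_mm=15))  # 20 (object moving away)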



FIG. 5 depicts an implementation 500 showing first and second examples 502, 504 of arrangements of the display sensors 114, bezel sensors 116, and plurality of conductive traces 202 in relation to each other. In the first example 502, an overlapping arrangement of the display sensors 114 and the bezel sensors 116 is shown in which the plurality of conductive traces 202 are also disposed between these sensors. In this example, the display sensors 114 are disposed on a plane that is closer to an object 404 to be detected than a plane that includes the bezel sensors 116.


The second example 504 also includes an overlapping arrangement, but in this instance the bezel sensors 116 are disposed on a plane that is closer to an object 404 to be detected than a plane that includes the display sensors 114. A variety of other examples of arrangements are also contemplated as further described in relation to the following procedures.


Example Procedures


The following discussion describes bezel sensor and conductive trace routing techniques that may be implemented utilizing the previously described systems and devices. Aspects of each of the procedures may be implemented in hardware, firmware, or software, or a combination thereof. The procedures are shown as a set of blocks that specify operations performed by one or more devices and are not necessarily limited to the orders shown for performing the operations by the respective blocks. In portions of the following discussion, reference will be made to FIGS. 1-5.



FIG. 6 depicts a procedure 600 in an example implementation in which inputs are distinguished based on a likelihood of being indicative of a user's hand as holding a housing of a computing device and inputs that are likely indicative of a gesture. A plurality of inputs are received from display and bezel sensors of a touch panel of a computing device that are communicatively coupled to one or more computing components of the computing device using a plurality of conductive traces that are routed between the display and bezel sensors (block 602). As shown in FIGS. 4 and 5, for instance, the conductive traces 202 may be routed between the display and bezel sensors 114, 116 in a coplanar relationship, multi-planar relationship, and so on. In this way, the bezel sensors 116 may be positioned closer to an edge of the housing 124 than would otherwise be possible if the traces were routed along the “outside” of the sensors.


Inputs are distinguished between inputs received that are indicative of a user's hand as holding a housing of the computing device and inputs that are indicative of a gesture (block 604). Data generated from machine learning, for instance, may be leveraged by an object detection module 118 to determine a likelihood that inputs correspond to gestures versus those caused by a user holding the device and thus not intended by the user to initiate an operation of the device. Performance of one or more operations is then initiated by the one or more computing components that correspond to the indicated gesture (block 606), such as to navigate through a user interface, select particular items, and so on.
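
By way of illustration only, the following minimal Python sketch (not part of the original disclosure) strings blocks 602-606 together as a single pass over one frame of sensor input; classify_grip, recognize_gesture, and dispatch are hypothetical stand-ins for the object detection module 118 and the components that consume the recognized gesture.

    def handle_touch_frame(display_inputs, bezel_inputs,
                           classify_grip, recognize_gesture, dispatch):
        """One pass of the procedure: receive, distinguish, then act."""
        inputs = display_inputs + bezel_inputs                # block 602: receive inputs
        grip_inputs, gesture_inputs = classify_grip(inputs)   # block 604: distinguish
        gesture = recognize_gesture(gesture_inputs)           # grip_inputs are simply ignored here
        if gesture is not None:
            dispatch(gesture)                                 # block 606: initiate operation

    handle_touch_frame(
        display_inputs=[("display", (120, 80))],
        bezel_inputs=[("bezel", (2, 300))],
        classify_grip=lambda ins: ([i for i in ins if i[0] == "bezel"],
                                   [i for i in ins if i[0] == "display"]),
        recognize_gesture=lambda ins: "tap" if ins else None,
        dispatch=lambda gesture: print("performing:", gesture),
    )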


Example System and Device



FIG. 7 illustrates an example system generally at 700 that includes an example computing device 702 that is representative of one or more computing systems and/or devices that may implement the various techniques described herein. The computing device 702 may be, for example, a server of a service provider, a device associated with a client (e.g., a client device), an on-chip system, and/or any other suitable computing device or computing system that includes computing components 104 as described above.


The example computing device 702 as illustrated includes a processing system 704, one or more computer-readable media 706, and one or more I/O interfaces 708 that are communicatively coupled, one to another. Although not shown, the computing device 702 may further include a system bus or other data and command transfer system that couples the various components, one to another. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures. A variety of other examples are also contemplated, such as control and data lines.


The processing system 704 is representative of functionality to perform one or more operations using hardware. Accordingly, the processing system 704 is illustrated as including hardware element 710 that may be configured as processors, functional blocks, and so forth. This may include implementation in hardware as an application specific integrated circuit or other logic device formed using one or more semiconductors. The hardware elements 710 are not limited by the materials from which they are formed or the processing mechanisms employed therein. For example, processors may be comprised of semiconductor(s) and/or transistors (e.g., electronic integrated circuits (ICs)). In such a context, processor-executable instructions may be electronically-executable instructions.


The computer-readable storage media 706 is illustrated as including memory/storage 712. The memory/storage 712 represents memory/storage capacity associated with one or more computer-readable media. The memory/storage component 712 may include volatile media (such as random access memory (RAM)) and/or nonvolatile media (such as read only memory (ROM), Flash memory, optical disks, magnetic disks, and so forth). The memory/storage component 712 may include fixed media (e.g., RAM, ROM, a fixed hard drive, and so on) as well as removable media (e.g., Flash memory, a removable hard drive, an optical disc, and so forth). The computer-readable media 706 may be configured in a variety of other ways as further described below.


Input/output interface(s) 708 are representative of functionality to allow a user to enter commands and information to computing device 702, and also allow information to be presented to the user and/or other components or devices using various input/output devices. Examples of input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone, a scanner, touch functionality (e.g., capacitive or other sensors that are configured to detect physical touch), a camera (e.g., which may employ visible or non-visible wavelengths such as infrared frequencies to recognize movement as gestures that do not involve touch), and so forth. Examples of output devices include a display device (e.g., a monitor or projector), speakers, a printer, a network card, tactile-response device, and so forth. Thus, the computing device 702 may be configured in a variety of ways as further described below to support user interaction.


Various techniques may be described herein in the general context of software, hardware elements, or program modules. Generally, such modules include routines, programs, objects, elements, components, data structures, and so forth that perform particular tasks or implement particular abstract data types. The terms “module,” “functionality,” and “component” as used herein generally represent software, firmware, hardware, or a combination thereof. The features of the techniques described herein are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.


An implementation of the described modules and techniques may be stored on or transmitted across some form of computer-readable media. The computer-readable media may include a variety of media that may be accessed by the computing device 702. By way of example, and not limitation, computer-readable media may include “computer-readable storage media” and “computer-readable signal media.”


“Computer-readable storage media” may refer to media and/or devices that enable persistent and/or non-transitory storage of information in contrast to mere signal transmission, carrier waves, or signals per se. Thus, computer-readable storage media refers to non-signal bearing media. The computer-readable storage media includes hardware such as volatile and non-volatile, removable and non-removable media and/or storage devices implemented in a method or technology suitable for storage of information such as computer readable instructions, data structures, program modules, logic elements/circuits, or other data. Examples of computer-readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, hard disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other storage device, tangible media, or article of manufacture suitable to store the desired information and which may be accessed by a computer.


“Computer-readable signal media” may refer to a signal-bearing medium that is configured to transmit instructions to the hardware of the computing device 702, such as via a network. Signal media typically may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as carrier waves, data signals, or other transport mechanism. Signal media also include any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media.


As previously described, hardware elements 710 and computer-readable media 706 are representative of modules, programmable device logic and/or fixed device logic implemented in a hardware form that may be employed in some embodiments to implement at least some aspects of the techniques described herein, such as to perform one or more instructions. Hardware may include components of an integrated circuit or on-chip system, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a complex programmable logic device (CPLD), and other implementations in silicon or other hardware. In this context, hardware may operate as a processing device that performs program tasks defined by instructions and/or logic embodied by the hardware as well as hardware utilized to store instructions for execution, e.g., the computer-readable storage media described previously.


Combinations of the foregoing may also be employed to implement various techniques described herein. Accordingly, software, hardware, or executable modules may be implemented as one or more instructions and/or logic embodied on some form of computer-readable storage media and/or by one or more hardware elements 710. The computing device 702 may be configured to implement particular instructions and/or functions corresponding to the software and/or hardware modules. Accordingly, implementation of a module that is executable by the computing device 702 as software may be achieved at least partially in hardware, e.g., through use of computer-readable storage media and/or hardware elements 710 of the processing system 704. The instructions and/or functions may be executable/operable by one or more articles of manufacture (for example, one or more computing devices 702 and/or processing systems 704) to implement techniques, modules, and examples described herein.


As further illustrated in FIG. 7, the example system 700 enables ubiquitous environments for a seamless user experience when running applications on a personal computer (PC), a television device, and/or a mobile device. Services and applications run substantially similar in all three environments for a common user experience when transitioning from one device to the next while utilizing an application, playing a video game, watching a video, and so on.


In the example system 700, multiple devices are interconnected through a central computing device. The central computing device may be local to the multiple devices or may be located remotely from the multiple devices. In one embodiment, the central computing device may be a cloud of one or more server computers that are connected to the multiple devices through a network, the Internet, or other data communication link.


In one embodiment, this interconnection architecture enables functionality to be delivered across multiple devices to provide a common and seamless experience to a user of the multiple devices. Each of the multiple devices may have different physical requirements and capabilities, and the central computing device uses a platform to enable the delivery of an experience to the device that is both tailored to the device and yet common to all devices. In one embodiment, a class of target devices is created and experiences are tailored to the generic class of devices. A class of devices may be defined by physical features, types of usage, or other common characteristics of the devices.


In various implementations, the computing device 702 may assume a variety of different configurations, such as for computer 714, mobile 716, and television 718 uses. Each of these configurations includes devices that may have generally different constructs and capabilities, and thus the computing device 702 may be configured according to one or more of the different device classes. For instance, the computing device 702 may be implemented as the computer 714 class of a device that includes a personal computer, desktop computer, a multi-screen computer, laptop computer, netbook, and so on.


The computing device 702 may also be implemented as the mobile 716 class of device that includes mobile devices, such as a mobile phone, portable music player, portable gaming device, a tablet computer, a multi-screen computer, and so on. The computing device 702 may also be implemented as the television 718 class of device that includes devices having or connected to generally larger screens in casual viewing environments. These devices include televisions, set-top boxes, gaming consoles, and so on.


The techniques described herein may be supported by these various configurations of the computing device 702 and are not limited to the specific examples of the techniques described herein. This functionality may also be implemented all or in part through use of a distributed system, such as over a “cloud” 720 via a platform 722 as described below.


The cloud 720 includes and/or is representative of a platform 722 for resources 724. The platform 722 abstracts underlying functionality of hardware (e.g., servers) and software resources of the cloud 720. The resources 724 may include applications and/or data that can be utilized while computer processing is executed on servers that are remote from the computing device 702. Resources 724 can also include services provided over the Internet and/or through a subscriber network, such as a cellular or Wi-Fi network.


The platform 722 may abstract resources and functions to connect the computing device 702 with other computing devices. The platform 722 may also serve to abstract scaling of resources to provide a corresponding level of scale to encountered demand for the resources 724 that are implemented via the platform 722. Accordingly, in an interconnected device embodiment, implementation of functionality described herein may be distributed throughout the system 700. For example, the functionality may be implemented in part on the computing device 702 as well as via the platform 722 that abstracts the functionality of the cloud 720.


CONCLUSION

Although the invention has been described in language specific to structural features and/or methodological acts, it is to be understood that the invention defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claimed invention.

Claims
  • 1. A system comprising: a display panel including: display sensors disposed in a display area of the display panel and configured to detect proximity of objects relative to a surface of the display panel; and bezel sensors disposed in a bezel of the display panel that are also configured to detect proximity of the objects relative to a surface of the display panel; one or more computing components configured to process inputs received from the display sensors and the bezel sensors to identify gestures; and a plurality of conductive traces, routed between the display sensors and the bezel sensors, the conductive traces communicatively coupling the display sensors and the bezel sensors to the one or more computing components.
  • 2. A system as described in claim 1, wherein the display sensors are configured to perform detection of the object using a different sensing technology than the bezel sensors.
  • 3. A system apparatus as described in claim 2, wherein the display sensors are configured to detect the proximity of the objects using mutual capacitance and the bezel sensors are configured to detect the proximity of the objects using direct capacitance.
  • 4. A system as described in claim 1, wherein the display sensors, the bezel sensors, and the plurality of conductive traces are at least partially coplanar.
  • 5. A system as described in claim 1, wherein the plurality of conductive traces is routed in a channel between the display sensors and the bezel sensors.
  • 6. A system as described in claim 1, wherein the bezel sensors have a detection range that is different than a detection range for the display sensors.
  • 7. A system as described in claim 6, wherein the detection range of the bezel sensors is greater than the detection range for the display sensors.
  • 8. A system as described in claim 1, wherein the one or more computing components include an object detection module.
  • 9. A system as described in claim 8, wherein the object detection module is configured to recognize a gesture from the inputs received from one or more of the display sensors or the bezel sensors and initiate actions corresponding to the gesture.
  • 10. A system as described in claim 1, wherein the one or more computing components include a component to manage radiation emission for specific absorption rate (SAR) management responsive to the inputs received from one or more of the display sensors or the bezel sensors indicating proximity of a user to the system.
  • 11. A system as described in claim 1, wherein the display sensors, plurality of conductive traces, and bezel sensors are formed on a surface of a single substrate.
  • 12. A tablet computing device comprising: display sensors disposed in a display area of the tablet computing device configured to detect proximity of objects relative to a surface of a display panel; bezel sensors disposed in a bezel of the tablet computing device that are also configured to detect proximity of the objects relative to the surface of the display panel, the display sensors configured to perform detection of the objects using a different sensing technology than the bezel sensors; and a plurality of conductive traces, routed in a channel between the display sensors and the bezel sensors, the conductive traces communicatively coupling the display sensors and the bezel sensors to an object detection component configured to process inputs received via the display sensors and the bezel sensors, the object detection component further configured to analyze the inputs received from the display sensors and the bezel sensors to identify gestures.
  • 13. A tablet computing device as described in claim 12, wherein the display sensors, the bezel sensors, and the plurality of conductive traces are at least partially coplanar.
  • 14. A tablet computing device as described in claim 12, wherein the bezel sensors have a detection range that is different than a detection range for the display sensors.
  • 15. A tablet computing device as described in claim 12, wherein the display sensors are configured to detect the proximity of the objects using mutual capacitance and the bezel sensors are configured to detect the proximity of the objects using direct capacitance.
  • 16. A computing device comprising: a touch panel having display sensors configured to detect proximity of an object, and bezel sensors disposed in a bezel of the touch panel also configured to detect proximity of the object, the bezel sensors disposed along one or more sides of the computing device within a distance of one millimeter to the edge of a housing of the computing device and the bezel sensors having a detection range that is different than a detection range for the display sensors; and a plurality of conductive traces, routed between the display sensors and the bezel sensors, that communicatively couple the display sensors and the bezel sensors to one or more computing components configured to analyze inputs received from the display sensors and the bezel sensors.
  • 17. A computing device as described in claim 16, wherein the display sensors are configured to perform detection of the object using a different sensing technology than the bezel sensors.
  • 18. A computing device as described in claim 17, wherein the one or more computing components include an object detection module configured to recognize a gesture from the inputs received from one or more of the display sensors or the bezel sensors and initiate actions corresponding to the gesture.
  • 19. A tablet computing device as described in claim 17, wherein the plurality of conductive traces are routed in a channel between the display sensors and the bezel sensors.
  • 20. A computing device as described in claim 16, wherein the one or more computing components include an object detection module configured to recognize a gesture from the inputs received from one or more of the display sensors or the bezel sensors and initiate at least one action corresponding to the gesture.
RELATED APPLICATIONS

This application is a continuation of and claims priority under 35 U.S.C. § 120 to U.S. patent application Ser. No. 14/212,916, filed Mar. 14, 2014, entitled “Conductive Trace Routing for Display and Bezel Sensors,” the disclosure of which is incorporated by reference herein in its entirety.

20100251112 Hinckley et al. Sep 2010 A1
20100251189 Jaeger Sep 2010 A1
20100262928 Abbott Oct 2010 A1
20100283748 Hsieh et al. Nov 2010 A1
20100295795 Wilairat Nov 2010 A1
20100302172 Wilairat Dec 2010 A1
20100302712 Zednicek et al. Dec 2010 A1
20100306702 Warner Dec 2010 A1
20100313124 Privault et al. Dec 2010 A1
20100321326 Grunthaner et al. Dec 2010 A1
20110012841 Lin Jan 2011 A1
20110018821 Kii Jan 2011 A1
20110041096 Larco et al. Feb 2011 A1
20110043472 Hada Feb 2011 A1
20110043475 Rigazio et al. Feb 2011 A1
20110050594 Kim et al. Mar 2011 A1
20110055729 Mason et al. Mar 2011 A1
20110055753 Horodezky et al. Mar 2011 A1
20110072036 Agsen et al. Mar 2011 A1
20110107220 Perlman May 2011 A1
20110115735 Lev et al. May 2011 A1
20110115784 Tartz et al. May 2011 A1
20110117526 Wigdor et al. May 2011 A1
20110126094 Horodezky et al. May 2011 A1
20110143769 Jones et al. Jun 2011 A1
20110159915 Yano et al. Jun 2011 A1
20110167092 Subramaniam et al. Jul 2011 A1
20110167336 Aitken et al. Jul 2011 A1
20110167350 Hoellwarth Jul 2011 A1
20110167391 Momeyer et al. Jul 2011 A1
20110169749 Ganey et al. Jul 2011 A1
20110181524 Hinckley et al. Jul 2011 A1
20110185299 Hinckley Jul 2011 A1
20110185300 Hinckley Jul 2011 A1
20110185318 Hinckley et al. Jul 2011 A1
20110185320 Hinckley Jul 2011 A1
20110187647 Woloszynski et al. Aug 2011 A1
20110191704 Hinckley Aug 2011 A1
20110191718 Hinckley Aug 2011 A1
20110191719 Hinckley Aug 2011 A1
20110199386 Dharwada et al. Aug 2011 A1
20110205163 Hinckley Aug 2011 A1
20110209039 Hinckley et al. Aug 2011 A1
20110209057 Hinckley Aug 2011 A1
20110209058 Hinckley Aug 2011 A1
20110209088 Hinckley Aug 2011 A1
20110209089 Hinckley et al. Aug 2011 A1
20110209093 Hinckley Aug 2011 A1
20110209097 Hinckley Aug 2011 A1
20110209098 Hinckley et al. Aug 2011 A1
20110209099 Hinckley Aug 2011 A1
20110209100 Hinckley et al. Aug 2011 A1
20110209101 Hinckley et al. Aug 2011 A1
20110209102 Hinckley et al. Aug 2011 A1
20110209103 Hinckley et al. Aug 2011 A1
20110209104 Hinckley et al. Aug 2011 A1
20110231796 Vigil Sep 2011 A1
20110242039 Kalis et al. Oct 2011 A1
20110242138 Tribble Oct 2011 A1
20110261058 Luo Oct 2011 A1
20110291948 Stewart et al. Dec 2011 A1
20110291964 Chambers et al. Dec 2011 A1
20110310459 Gates Dec 2011 A1
20120001861 Townsend et al. Jan 2012 A1
20120032979 Blow et al. Feb 2012 A1
20120075194 Ferren Mar 2012 A1
20120084705 Lee et al. Apr 2012 A1
20120096411 Nash Apr 2012 A1
20120113007 Koch et al. May 2012 A1
20120131454 Shah May 2012 A1
20120154303 Lazaridis et al. Jun 2012 A1
20120158629 Hinckley et al. Jun 2012 A1
20120212445 Heikkinen Aug 2012 A1
20120236026 Hinckley Sep 2012 A1
20120262407 Hinckley et al. Oct 2012 A1
20120287076 Dao et al. Nov 2012 A1
20120304133 Nan et al. Nov 2012 A1
20120306788 Chen et al. Dec 2012 A1
20120311476 Campbell Dec 2012 A1
20120324384 Cohen et al. Dec 2012 A1
20130038564 Ho Feb 2013 A1
20130044070 Townsend et al. Feb 2013 A1
20130063891 Martisauskas Mar 2013 A1
20130088434 Masuda Apr 2013 A1
20130093691 Moosavi Apr 2013 A1
20130117715 Williams et al. May 2013 A1
20130154999 Guard Jun 2013 A1
20130181902 Hinckley et al. Jul 2013 A1
20130222287 Bae et al. Aug 2013 A1
20130257768 Lee Oct 2013 A1
20130265269 Sharma et al. Oct 2013 A1
20130271447 Setlur et al. Oct 2013 A1
20130275914 Zhuo Oct 2013 A1
20130300668 Churikov Nov 2013 A1
20130335453 Lim Dec 2013 A1
20140022183 Ayoub Jan 2014 A1
20140043265 Chang et al. Feb 2014 A1
20140043277 Saukko et al. Feb 2014 A1
20140092041 Ih Apr 2014 A1
20140111462 Townsend et al. Apr 2014 A1
20140132551 Bathiche May 2014 A1
20140192019 Fukushima Jul 2014 A1
20140195957 Bang Jul 2014 A1
20140253477 Shim Sep 2014 A1
20140289668 Mavrody Sep 2014 A1
20140293145 Jones Oct 2014 A1
20140337791 Agnetta et al. Nov 2014 A1
20150042588 Park Feb 2015 A1
20150145797 Corrion May 2015 A1
20150160849 Weiss Jun 2015 A1
20150227166 Lee Aug 2015 A1
20150261362 King Sep 2015 A1
20150261364 Cady et al. Sep 2015 A1
20160110024 Townsend et al. Apr 2016 A1
20160283104 Hinckley et al. Sep 2016 A1
20170131835 Bathiche May 2017 A1
20170147148 Townsend et al. May 2017 A1
20170177100 Townsend et al. Jun 2017 A1
20170177101 Townsend et al. Jun 2017 A1
Foreign Referenced Citations (59)
Number Date Country
1326564 Dec 2001 CN
1578430 Feb 2005 CN
1704888 Dec 2005 CN
1766824 May 2006 CN
1936799 Mar 2007 CN
101198925 Jun 2008 CN
201181467 Jan 2009 CN
101404687 Apr 2009 CN
101410781 Apr 2009 CN
101432677 May 2009 CN
101482790 Jul 2009 CN
101496404 Jul 2009 CN
201298220 Aug 2009 CN
101551728 Oct 2009 CN
101566865 Oct 2009 CN
101576789 Nov 2009 CN
101609383 Dec 2009 CN
101627361 Jan 2010 CN
101636711 Jan 2010 CN
101668056 Mar 2010 CN
102207788 Oct 2011 CN
0388344 Feb 1995 EP
1942401 Jul 2008 EP
2081107 Jul 2009 EP
2148268 Jan 2010 EP
2466442 Jun 2012 EP
2261781 Oct 2012 EP
2560088 Feb 2013 EP
2634678 Sep 2013 EP
6282368 Oct 1994 JP
7281810 Oct 1995 JP
2001265523 Sep 2001 JP
2001290585 Oct 2001 JP
2002055753 Feb 2002 JP
2003195998 Jul 2003 JP
2005004690 Jan 2005 JP
2005026834 Jan 2005 JP
2005122271 May 2005 JP
2005149279 Jun 2005 JP
2007240964 Sep 2007 JP
3143462 Jul 2008 JP
2008532185 Aug 2008 JP
2008217742 Sep 2008 JP
2008305087 Dec 2008 JP
2009097724 Apr 2009 JP
2010019643 Jan 2010 JP
2010026834 Feb 2010 JP
2010250465 Nov 2010 JP
20090013927 Feb 2009 KR
1020090088501 Aug 2009 KR
20090106755 Oct 2009 KR
200921478 May 2009 TW
200947297 Nov 2009 TW
200951783 Dec 2009 TW
WO-9928812 Jan 1999 WO
WO-2009086628 Jul 2009 WO
WO-2009131987 Oct 2009 WO
WO-2011106467 Sep 2011 WO
WO-2011106468 Sep 2011 WO
Non-Patent Literature Citations (342)
Entry
“Search Report”, TW Application No. 099142890, dated Jun. 30, 2015, 1 page.
“In touch with new opportunities—Dispersive Signal Technology”, DataSheet, NXT, 2005, 1 page.
“TouchSystems—Innovation Touch Screen Solution”, Retrieved from <http://www.touchsystems.com/article.aspx?id=16> on Aug. 30, 2012, Aug. 14, 2012, 1 page.
“Final Office Action”, U.S. Appl. No. 12/713,110, dated Jan. 17, 2013, 10 pages.
“Final Office Action”, U.S. Appl. No. 13/674,357, dated Jan. 29, 2015, 10 pages.
“Final Office Action”, U.S. Appl. No. 12/695,842, dated Feb. 2, 2016, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/695,976, dated Mar. 25, 2015, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/472,699, dated Mar. 28, 2013, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,110, dated May 3, 2013, 10 pages.
“Final Office Action”, U.S. Appl. No. 12/713,133, dated May 20, 2013, 10 pages.
“Final Office Action”, U.S. Appl. No. 13/484,075, dated May 21, 2013, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/674,357, dated Jun. 4, 2015, 10 pages.
“Final Office Action”, U.S. Appl. No. 13/484,075, dated Jul. 16, 2015, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/212,916, dated Aug. 7, 2015, 10 pages.
“Foreign Office Action”, CN Application No. 201180010692.2, dated Sep. 15, 2015, 10 Pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/025132, dated Oct. 26, 2011, 10 pages.
“Final Office Action”, U.S. Appl. No. 12/713,118, dated Oct. 26, 2012, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/025131, dated Oct. 31, 2011, 10 pages.
“Final Office Action”, U.S. Appl. No. 14/145,204, dated Nov. 12, 2014, 10 pages.
“Final Office Action”, U.S. Appl. No. 12/695,976, dated Nov. 21, 2012, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/700,357, dated Dec. 16, 2015, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/657,662, dated Apr. 5, 2013, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/020413, dated Apr. 8, 2013, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/700,357, dated Jul. 2, 2012, 10 pages.
“Final Office Action”, U.S. Appl. No. 12/713,053, dated Aug. 17, 2012, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,118, dated Jan. 29, 2015, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,282, dated Feb. 28, 2013, 11 pages.
“Foreign Office Action”, CN Application No. 201110050852.8, dated Mar. 26, 2013, 11 pages.
“Foreign Office Action”, CN Application No. 201110050506.X, dated Apr. 2, 2013, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/695,937, dated May 7, 2014, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,113, dated Jun. 4, 2014, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/700,357, dated Jun. 26, 2014, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2015/019811, dated Jul. 8, 2015, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,301, dated Jul. 14, 2015, 11 pages.
“Foreign Office Action”, CN Application No. 201110046510.9, dated Jul. 25, 2014, 11 Pages.
“Non-Final Office Action”, U.S. Appl. No. 13/484,075, dated Sep. 5, 2014, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,301, dated Oct. 24, 2013, 11 pages.
“Final Office Action”, U.S. Appl. No. 13/484,075, dated Nov. 10, 2015, 11 pages.
“Final Office Action”, U.S. Appl. No. 12/695,976, dated Nov. 27, 2015, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,245, dated Nov. 30, 2011, 11 pages.
“Final Office Action”, U.S. Appl. No. 12/709,282, dated Dec. 24, 2012, 11 pages.
“Final Office Action”, U.S. Appl. No. 12/709,301, dated Mar. 1, 2012, 11 pages.
“Foreign Office Action”, CN Application No. 201110046510.9, dated May 31, 2013, 11 pages.
“Final Office Action”, U.S. Appl. No. 12/709,282, dated Jul. 16, 2013, 11 pages.
“Foreign Office Action”, CN Application No. 201110046519.X, dated Aug. 6, 2013, 11 pages.
“Foreign Office Action”, CN Application No. 201110046529.3, dated Aug. 6, 2013, 11 pages.
“Final Office Action”, U.S. Appl. No. 13/484,075, dated Feb. 4, 2015, 12 pages.
“Foreign Office Action”, CN Application No. 201110046519.X, dated Mar. 19, 2013, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/700,357, dated Apr. 2, 2015, 12 pages.
“Foreign Office Action”, CN Application No. 201180011020.3, dated May 4, 2014, 12 Pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,301, dated May 23, 2014, 12 pages.
“Foreign Office Action”, CN Application No. 201110044285.5, dated Jun. 20, 2012, 12 pages.
“Final Office Action”, U.S. Appl. No. 12/695,976, dated Jul. 23, 2014, 12 pages.
“Final Office Action”, U.S. Appl. No. 12/472,699, dated Jul. 29, 2013, 12 pages.
“Final Office Action”, U.S. Appl. No. 12/695,976, dated Aug. 5, 2015, 12 pages.
“Final Office Action”, U.S. Appl. No. 12/709,301, dated Sep. 3, 2013, 12 pages.
“Foreign Office Action”, CN Application No. 201180010769.6, dated Sep. 3, 2014, 12 Pages.
“Final Office Action”, U.S. Appl. No. 12/709,376, dated Sep. 10, 2013, 12 pages.
“Final Office Action”, U.S. Appl. No. 12/709,348, dated Sep. 12, 2013, 12 pages.
“Final Office Action”, U.S. Appl. No. 13/674,357, dated Sep. 17, 2015, 12 pages.
“Final Office Action”, U.S. Appl. No. 12/713,113, dated Oct. 8, 2014, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,282, dated Oct. 10, 2013, 12 pages.
“Final Office Action”, U.S. Appl. No. 12/700,357, dated Nov. 20, 2014, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 11/324,157, dated Dec. 11, 2008, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,348, dated Dec. 7, 2011, 12 pages.
“Final Office Action”, U.S. Appl. No. 12/472,699, dated Feb. 15, 2012, 12 pages.
“Apple Unibody MacBook Pro #MB991LL/A 2.53 GHz Intel Core 2 Duo”, Retrieved from: <http://www.themacstore.com/parts/show/c-nmb3-mb991II_a> on Nov. 10, 2009, 2009, 12 pages.
“Notice of Allowance”, U.S. Appl. No. 12/695,064, dated Mar. 28, 2012, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/472,699, dated Sep. 12, 2011, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/674,357, dated Feb. 17, 2016, 13 pages.
“Foreign Office Action”, CN Application No. 201380005804.4, dated Mar. 1, 2016, 13 Pages.
“Foreign Office Action”, JP Application No. 2012-554008, dated Jun. 25, 2015, 13 pages.
“Foreign Office Action”, CN Application No. 201180010692.2, dated Jun. 26, 2014, 13 pages.
“Foreign Office Action”, CN Application No. 201180009635.2, dated Jul. 28, 2014, 13 pages.
“Foreign Office Action”, CN Application No. 201110044285.5, dated Jan. 4, 2013, 13 pages.
“Final Office Action”, U.S. Appl. No. 12/709,245, dated Jan. 6, 2012, 13 pages.
“Final Office Action”, U.S. Appl. No. 12/700,357, dated Oct. 24, 2012, 13 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/025973, dated Oct. 27, 2011, 13 pages.
“Final Office Action”, U.S. Appl. No. 12/695,937, dated Nov. 10, 2014, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,053, dated Nov. 23, 2012, 13 pages.
“Notice of Allowance”, U.S. Appl. No. 12/713,110, dated Dec. 4, 2013, 13 pages.
“Final Office Action”, U.S. Appl. No. 12/709,348, dated Feb. 17, 2012, 13 pages.
“Notice of Allowance”, U.S. Appl. No. 12/695,959, dated Apr. 17, 2012, 13 pages.
“Final Office Action”, U.S. Appl. No. 12/695,937, dated Jul. 26, 2012, 13 pages.
“Foreign Office Action”, CN Application No. 201110046519.X, dated Aug. 2, 2012, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,301, dated Sep. 13, 2012, 13 pages.
“Final Office Action”, U.S. Appl. No. 12/709,301, dated Jan. 7, 2013, 14 pages.
“Notice of Allowance”, U.S. Appl. No. 12/713,096, dated Jan. 9, 2015, 14 pages.
“Decision on Reexamination”, CN Application No. 201110044285.5, dated Mar. 26, 2015, 14 Pages.
“Final Office Action”, U.S. Appl. No. 12/695,937, dated Apr. 2, 2015, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,204, dated May 7, 2015, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,301, dated May 14, 2013, 14 pages.
“Final Office Action”, U.S. Appl. No. 11/324,157, dated Jun. 24, 2009, 14 pages.
“Notice of Allowance”, U.S. Appl. No. 12/713,096, dated Aug. 29, 2014, 14 pages.
“Final Office Action”, U.S. Appl. No. 12/700,357, dated Aug. 31, 2015, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,376, dated Jan. 23, 2012, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/472,699, dated Oct. 23, 2013, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,245, dated Mar. 21, 2012, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/695,937, dated Apr. 25, 2012, 14 pages.
“Foreign Office Action”, CN Application No. 201110046529.3, dated Aug. 16, 2012, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,348, dated Aug. 2, 2012, 14 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/025972, dated Sep. 30, 2011, 14 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/025575, dated Sep. 30, 2011, 14 pages.
“Final Office Action”, U.S. Appl. No. 12/709,348, dated Jan. 7, 2013, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,348, dated Apr. 25, 2013, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,245, dated May 30, 2013, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/099,798, dated Jun. 9, 2015, 15 pages.
“Final Office Action”, U.S. Appl. No. 12/713,127, dated Aug. 14, 2014, 15 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/025971, dated Oct. 31, 2011, 15 pages.
“Final Office Action”, U.S. Appl. No. 12/709,245, dated Mar. 15, 2013, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,245, dated Mar. 20, 2014, 16 pages.
“Foreign Office Action”, CN Application No. 201180009579.2, dated Apr. 21, 2015, 16 Pages.
“Non-Final Office Action”, U.S. Appl. No. 11/324,157, dated Apr. 28, 2010, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/695,842, dated May 22, 2013, 16 pages.
“Foreign Office Action”, CN Application No. 201180011039.8, dated Jun. 5, 2014, 16 Pages.
“Foreign Office Action”, CN Application No. 201110046519.X, dated Sep. 21, 2015, 16 Pages.
“Non-Final Office Action”, U.S. Appl. No. 12/695,842, dated Oct. 3, 2012, 16 pages.
“Foreign Office Action”, CN Application No. 201180009579.2, dated Nov. 4, 2014, 16 pages.
“Final Office Action”, U.S. Appl. No. 12/709,376, dated Mar. 30, 2012, 16 pages.
“Foreign Office Action”, CN Application No. 201180011039.8, dated Feb. 17, 2015, 17 Pages.
“Final Office Action”, U.S. Appl. No. 12/709,282, dated May 9, 2014, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,376, dated May 23, 2013, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 11/324,157, dated Sep. 28, 2009, 17 pages.
“Final Office Action”, U.S. Appl. No. 12/695,842, dated Dec. 2, 2013, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,376, dated Aug. 17, 2012, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,127, dated Mar. 26, 2015, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/099,798, dated Mar. 31, 2016, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,113, dated Apr. 23, 2013, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/695,842, dated Aug. 13, 2015, 18 pages.
“Final Office Action”, U.S. Appl. No. 11/324,157, dated Oct. 15, 2010, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,081, dated Dec. 23, 2011, 18 pages.
“Final Office Action”, U.S. Appl. No. 12/713,113, dated Jun. 4, 2012, 18 pages.
“Final Office Action”, U.S. Appl. No. 12/713,127, dated Jun. 6, 2012, 18 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2014/067804, dated Jul. 24, 2015, 19 Pages.
“Final Office Action”, U.S. Appl. No. 12/713,127, dated Jul. 31, 2015, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/352,193, dated Aug. 20, 2014, 19 pages.
“Foreign Office Action”, CN Application No. 201180009579.2, dated Sep. 6, 2015, 19 pages.
“Final Office Action”, U.S. Appl. No. 14/099,798, dated Nov. 25, 2015, 19 pages.
“Final Office Action”, U.S. Appl. No. 12/713,081, dated May 9, 2012, 19 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 12/713,133, dated Feb. 3, 2014, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/212,916, dated Mar. 3, 2016, 2 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 12/713,118, dated Mar. 19, 2015, 2 pages.
“Touch Screen is available in .36-50.8 mm thickness”, ThomasNet Industrial News Room, Jul. 29, 2003, 2 pages.
“AccuScribe Touchscreens”, Datasheet, Elo TouchSystem, Aug. 2005, 2 pages.
“Dell and Windows 7—The Wait is Over”, Retrieved from: <http://content.dell.com/us/en/corp/d/press-releases/2009-10-22-Dell-and-Windows-7.aspx> on Nov. 10, 2009, Oct. 22, 2009, 2 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 12/713,096, dated Nov. 4, 2014, 2 pages.
“Advisory Action”, U.S. Appl. No. 12/709,376, dated Dec. 19, 2013, 2 pages.
“Final Office Action”, U.S. Appl. No. 12/695,842, dated Feb. 12, 2015, 20 pages.
“Foreign Office Action”, CN Application No. 201180007100.1, dated May 15, 2015, 20 Pages.
“Final Office Action”, U.S. Appl. No. 12/709,376, dated Nov. 8, 2012, 20 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,113, dated Dec. 22, 2011, 20 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/700,510, dated Feb. 7, 2012, 20 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,204, dated May 10, 2012, 20 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,127, dated Jan. 31, 2014, 21 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/352,193, dated Apr. 9, 2015, 21 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,110, dated Jun. 21, 2012, 21 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/695,842, dated Aug. 18, 2014, 21 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/898,452, dated Sep. 26, 2014, 21 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/700,460, dated Jan. 13, 2012, 21 pages.
“Final Office Action”, U.S. Appl. No. 12/713,113, dated Oct. 8, 2013, 21 pages.
“Final Office Action”, U.S. Appl. No. 13/352,193, dated Jan. 12, 2015, 22 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,282, dated Jan. 29, 2015, 22 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/352,193, dated Jan. 31, 2014, 22 pages.
“Ex Parte Mewherter, PTAB precedential decision”, U.S. Appl. No. 10/685,192, filed May 8, 2013, 22 pages.
“Foreign Office Action”, CN Application No. 201180007100.1, dated Sep. 10, 2014, 22 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,127, dated Dec. 27, 2011, 22 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,096, dated Jan. 30, 2014, 23 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/352,193, dated Mar. 22, 2016, 23 pages.
“Final Office Action”, U.S. Appl. No. 13/898,452, dated Mar. 27, 2015, 23 pages.
“Final Office Action”, U.S. Appl. No. 13/352,193, dated May 23, 2014, 23 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/898,452, dated Sep. 14, 2015, 23 pages.
“Final Office Action”, U.S. Appl. No. 13/352,193, dated Oct. 1, 2015, 23 pages.
“Final Office Action”, U.S. Appl. No. 12/700,510, dated Oct. 10, 2012, 23 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,113, dated Feb. 12, 2015, 24 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/898,452, dated Feb. 24, 2014, 24 pages.
“Final Office Action”, U.S. Appl. No. 12/709,204, dated Apr. 11, 2014, 24 pages.
“Final Office Action”, U.S. Appl. No. 12/709,282, dated Aug. 24, 2015, 24 pages.
“Final Office Action”, U.S. Appl. No. 12/709,204, dated Sep. 12, 2013, 24 pages.
“Final Office Action”, U.S. Appl. No. 12/709,204, dated Oct. 3, 2012, 24 pages.
“Final Office Action”, U.S. Appl. No. 13/898,452, dated Mar. 10, 2016, 25 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,204, dated Aug. 13, 2014, 25 pages.
“Final Office Action”, U.S. Appl. No. 13/898,452, dated Jun. 9, 2014, 26 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/700,510, dated Jun. 12, 2014, 26 pages.
“Final Office Action”, U.S. Appl. No. 12/713,113, dated Aug. 5, 2015, 26 pages.
“Final Office Action”, U.S. Appl. No. 12/700,460, dated Aug. 28, 2012, 26 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,204, dated Jun. 6, 2013, 27 pages.
“Final Office Action”, U.S. Appl. No. 12/700,510, dated Feb. 3, 2015, 28 pages.
“Final Office Action”, U.S. Appl. No. 12/709,204, dated Jan. 12, 2015, 29 pages.
“Advisory Action”, U.S. Appl. No. 12/695,842, dated Mar. 28, 2016, 3 pages.
“Advisory Action”, U.S. Appl. No. 12/695,842, dated May 12, 2015, 3 pages.
“3M Touch Systems, Inc. Announces Shipment of Dispersive Signal Technology Product”, Datasheet, 3M Corporation, retrieved from <http://solutions.3m.com/wps/portal/3M/en_US/TouchSystems/TouchScreen/Informatio/Media/PressReleases/Archive/?PC_7_RJH9U52300FA602N9RSR991OI3000000_assetId=1114287537178>, Sep. 6, 2005, 3 pages.
“Supplementary European Search Report”, EP Application No. 11737428.0, dated Nov. 13, 2004, 3 pages.
“Supplementary European Search Report”, EP Application No. 11748027.7, dated Nov. 29, 2012, 3 pages.
“Supplementary European Search Report”, EP Application No. 11747907.1, dated Nov. 7, 2012, 3 pages.
“Supplementary European Search Report”, EP Application No. 11748028.5, dated Nov. 7, 2012, 3 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,204, dated Nov. 20, 2013, 31 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/700,510, dated Aug. 28, 2015, 34 pages.
“Final Office Action”, U.S. Appl. No. 12/700,510, dated Mar. 14, 2016, 36 pages.
“Foreign Notice of Allowance”, CN Application No. 201180011039.8, dated Jan. 13, 2016, 4 Pages.
“Notice of Allowance”, U.S. Appl. No. 12/713,133, dated Jan. 17, 2014, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 12/709,245, dated Jan. 30, 2015, 4 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/979,910, dated Feb. 22, 2016, 4 pages.
“Foreign Notice of Allowance”, JP Application No. 2012-555062, dated Mar. 3, 2015, 4 Pages.
“Notice of Allowance”, U.S. Appl. No. 12/713,118, dated Mar. 5, 2015, 4 pages.
“Foreign Notice of Allowance”, CN Application No. 201180009579.2, dated Mar. 7, 2016, 4 Pages.
“Notice of Allowance”, U.S. Appl. No. 12/709,245, dated Apr. 28, 2015, 4 pages.
“Foreign Notice of Allowance”, CN Application No. 201180010769.6, dated Apr. 30, 2015, 4 Pages.
“Foreign Notice of Allowance”, CN Application No. 201110050506.X, dated Nov. 2, 2014, 4 Pages.
“Foreign Office Action”, EP Application No. 11737428.0, dated Nov. 18, 2013, 4 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,130, dated Jan. 16, 2013, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,301, dated Jan. 16, 2015, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 12/713,130, dated Feb. 19, 2013, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 14/145,204, dated Sep. 25, 2015, 5 pages.
“Foreign Office Action”, EP Application No. 11748026.9, dated Jan. 16, 2013, 5 pages.
“Foreign Office Action”, EP Application No. 11748029.3, dated Jan. 16, 2013, 5 pages.
“Foreign Office Action”, EP Application No. 11748027.7, dated Jan. 18, 2013, 5 pages.
“Foreign Office Action”, EP Application No. 11747907.1, dated Jan. 28, 2013, 5 pages.
“Foreign Office Action”, EP Application No. 11748028.5, dated Jan. 28, 2013, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,096, dated Dec. 30, 2015, 5 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/145,204, dated Feb. 5, 2014, 5 pages.
“Foreign Notice of Allowance”, CN Application No. 201110046510.9, dated Feb. 12, 2015, 6 Pages.
“Notice of Allowance”, U.S. Appl. No. 12/709,376, dated Mar. 17, 2014, 6 pages.
“Foreign Office Action”, CN Application No. 201180010692.2, dated Mar. 28, 2016, 6 Pages.
“Notice of Allowance”, U.S. Appl. No. 12/472,699, dated May 2, 2014, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 12/709,301, dated Sep. 8, 2015, 6 pages.
“Final Office Action”, U.S. Appl. No. 12/709,245, dated Nov. 14, 2014, 6 pages.
“Foreign Office Action”, CN Application No. 201110050506.X, dated Feb. 26, 2014, 6 pages.
“3M TouchWare TM Software for Windows User Guide”, In White Paper of 3M Touch Systems—Retrieved at: <<http://multimedia.3m.com/mws/mediawebserver?6666660Zjcf6IVs6EVs66SS0LCOrrrrQ—>>, Aug. 9, 2013, 65 pages.
“Notice of Allowance”, U.S. Appl. No. 12/709,301, dated Feb. 24, 2016, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 12/713,053, dated Jun. 7, 2013, 7 pages.
“Extended European Search Report”, EP Application No. 13738283.4, dated Aug. 4, 2015, 7 pages.
“Foreign Office Action”, CN Application No. 201180011039.8, dated Sep. 6, 2015, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 12/709,204, dated Sep. 25, 2015, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,130, dated Jan. 23, 2012, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,133, dated Jan. 31, 2012, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/657,662, dated Oct. 11, 2013, 7 pages.
“Foreign Office Action”, JP Application No. 2012-554008, dated Nov. 25, 2014, 7 pages.
“Final Office Action”, U.S. Appl. No. 12/713,096, dated Feb. 15, 2013, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,118, dated Jun. 8, 2012, 7 pages.
“UI Guidelines Version 2.1”, Retrieved from: http://na.blackberry.com/eng/deliverables/6622/BlackBerry_Smartphones-US.pdf., 76 Pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,133, dated Jan. 14, 2013, 8 pages.
“Foreign Office Action”, CN Application No. 201110044285.5, dated Apr. 24, 2013, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/484,075, dated Apr. 29, 2015, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 11/324,157, dated May 9, 2011, 8 pages.
“Notice on Reexamination”, CN Application No. 201110044285.5, dated Jul. 23, 2014, 8 Pages.
“Non-Final Office Action”, U.S. Appl. No. 13/674,357, dated Aug. 4, 2014, 8 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/020417, dated Oct. 20, 2011, 8 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/025974, dated Oct. 26, 2011, 8 pages.
“Foreign Office Action”, CN Application No. 201110050852.8, dated Nov. 1, 2013, 8 Pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,301, dated Nov. 19, 2015, 8 pages.
“Second Written Opinion”, Application No. PCT/US2014/067804, dated Nov. 24, 2015, 8 Pages.
“Foreign Office Action”, CN Application No. 201110050499.3, dated Nov. 27, 2012, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,133, dated Dec. 10, 2013, 8 pages.
“Foreign Office Action”, CN Application No. 201110044285.5, dated Dec. 22, 2014, 8 Pages.
“Notice of Allowance”, U.S. Appl. No. 14/212,916, dated Dec. 24, 2015, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,053, dated Feb. 3, 2012, 8 pages.
“Foreign Office Action”, CN Application No. 201110050508.9, dated Mar. 7, 2013, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 13/230,700, dated May 15, 2012, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 13/230,700, dated Jun. 21, 2012, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,096, dated Jun. 26, 2013, 8 pages.
“Final Office Action”, U.S. Appl. No. 12/713,130, dated Jun. 29, 2012, 8 pages.
“Final Office Action”, U.S. Appl. No. 12/713,133, dated Jul. 2, 2012, 8 pages.
“Foreign Office Action”, CN Application No. 201110050499.3, dated Aug. 3, 2012, 8 pages.
“Foreign Office Action”, CN Application No. 201110050508.9, dated Aug. 3, 2012, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/695,976, dated Sep. 11, 2012, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/484,075, dated Jan. 15, 2013, 9 pages.
“Foreign Office Action”, CN Application No. 201180011020.3, dated Jan. 15, 2015, 9 Pages.
“Foreign Office Action”, CN Application No. 201110046510.9, dated Feb. 12, 2014, 9 Pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2014/067804, dated Feb. 22, 2016, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/145,204, dated Feb. 24, 2015, 9 Pages.
“Foreign Office Action”, CN Application No. 201180010692.2, dated Mar. 10, 2015, 9 Pages.
“Decision on Reexamination”, CN Application No. 201110046519.X, dated May 28, 2015, 9 Pages.
“New MS Courier Leak Details Multi-Touch Interface”, Retrieved from: <http://www.electronista.com/articles/09/11/04/courier.gestures.ui.explained/> on Nov. 10, 2009, Nov. 4, 2009, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,301, dated Nov. 28, 2011, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,081, dated Nov. 29, 2012, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,348, dated Dec. 20, 2013, 9 pages.
“Foreign Office Action”, CN Application No. 201110046529.3, dated Feb. 4, 2013, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/709,282, dated Apr. 12, 2012, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/713,096, dated Jun. 6, 2012, 9 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/020412, dated Aug. 31, 2011, 9 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2011/020410, dated Sep. 27, 2011, 9 pages.
“jQuery & CSS Example—Dropdown Menu”, DesignReviver, Retrieved from: <http://designreviver.com/tutorials/jquery-css-example-dropdown-menu/> on Nov. 22, 2011, Oct. 7, 2008, 30 pages.
Appleinsider,“Special Report: Apple's Touch-Sensitive iPod Ambitions Disclosed in Filing”, Retrieved from: <http://www.appleinsider.com/articles/06/10/26/special_report_apples_touch_sensitive_ipod_ambitions_disclosed_in_filing.html> on Nov. 11, 2009, Oct. 26, 2006, 10 pages.
Boudreaux,“Touch Patterns: Chapter 6—Programming the iPhone User Experience”, retrieved from <http://oreilly.com/iphone/excerpts/iphone-programming-user/touch-patterns.html> on Oct. 25, 2011, 12 pages.
Brandl,“Combining and Measuring the Benefits of Bimanual Pen and Direct-Touch Interaction on Horizontal Interfaces”, Retrieved from: <http://www.merl.com/papers/docs/TR2008-054.pdf> on Nov. 5, 2009, Mitsubishi Electric Research Laboratories, May 2008, 10 pages.
Daniels,“Brave New World”, Retrieved from: <http://bookseller-association.blogspot.com/2009_03_01_archive.html> on Nov. 10, 2009, Mar. 2009, 54 pages.
Elliott,“First Dell, Then HP: What's Next for N-trig's Multitouch Screen Technology”, Retrieved from: <http://news.cnet.com/8301-17938_105-10107886-1.html> on Nov. 11, 2009, Nov. 25, 2008, 5 pages.
Emigh,“Lenovo Launches Windows 7 ThinkPads with Multitouch and Outdoor Screens”, Retrieved from: <http://www.betanews.com/article/Lenovo-launches-Windows-7-ThinkPads-with-multitouch-and-outdoor-screens/1253017166> on Nov. 11, 2009, Sep. 15, 2009, 3 pages.
Findlater,“Personalized Input: Improving Ten-Finger Touchscreen Typing through Automatic Adaptation”, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Available at <http://terpconnect.umd.edu/˜leahkf/pubs/CHI2012-findlater-PersonalizedTyping.pdf>, May 5, 2012, 10 pages.
Fonseca,“New Apple Patent Hints at Touch Enabled Bezels for Future Devices”, Retrieved from: <http://vr-zone.com/articles/new-apple-patent-hints-at-touch-enabled-bezels-for-future-devices/42928.html?utm_source=rss&utm_medium=rss&utm_campaign=new-apple-patent-hints-at-touch-enabled-bezels-for-future-devices> Jan. 31, 2014, Jul. 3, 2013, 6 Pages.
Goel,“GripSense: Using Built-In Sensors to Detect Hand Posture and Pressure on Commodity Mobile Phones”, Proceedings of the 25th Annual ACM Symposium on User Interface Software and Technology, Oct. 7, 2012, pp. 545-554.
Gross,“Stretch-A-Sketch: A Dynamic Diagrammer”, IEEE Symposium on Visual Languages, Available at <http://depts.washington.edu/dmachine/PAPER/VL94/vl.html>, Oct. 1994, 11 pages.
Hinckley,“Codex: A Dual Screen Tablet Computer”, Conference on Human Factors in Computing Systems, Apr. 9, 2009, 10 pages.
Hinckley,“Sensor Synaesthesia: Touch in Motion, and Motion in Touch”, CHI 2011, May 7-12, 2011, available at <http://research.microsoft.com/en-us/um/people/kenh/papers/touch-motion-camera-ready-final.pdf>, May 7, 2011, 10 pages.
Hinckley,“Stitching: Pen Gestures that Span Multiple Displays”, CHI 2004, Available at <http://www.cs.cornell.edu/˜francois/Papers/2004-Hinckley-AVI04-Stitching.>, 2004, pp. 1-8.
Hirche,“Adaptive Interface for Text Input on Large-Scale Interactive Surfaces”, 3rd IEEE International Workshop on Horizontal Interactive Human Computer System, Oct. 1, 2008, pp. 153-156.
Hotelling,“Multi-functional hand-held device”, U.S. Appl. No. 60/658,777, filed Mar. 4, 2005, 117 pages.
Hotelling,“Multi-functional hand-held device”, U.S. Appl. No. 60/663,345, filed Mar. 16, 2005, 76 pages.
Kim,“Hand Grip Pattern Recognition for Mobile User Interfaces”, Interaction Lab / Samsung Advanced Institute of Technology, Available at <http://www.alice.org/stage3/pubs/uistsensing.pdf>, 2006, pp. 1789-1794.
Krazit,“Has Apple Found the Magic Touch?”, Retrieved from: <http://news.cnet.com/8301-13579_3-9879471-37.html> on Nov. 10, 2009, Feb. 26, 2008, 2 pages.
Lee,“The TypeWay iPad app is an adaptive on-screen keyboard”, Retrieved from <http://www.ubergizmo.com/2012/02/the-typeway-ipad-app-is-an-adaptive-on-screen-keyboard/> on Mar. 7, 2013, Feb. 1, 2012, 2 pages.
Maxwell,“Writing drivers for common touch-screen interface hardware”, Industrial Control Design Line, Jun. 15, 2005, 9 pages.
Minsky,“Manipulating Simulated Objects with Real-world Gestures using a Force and Position Sensitive Screen”, Computer Graphics, vol. 18, No. 3, Available at <http://delivery.acm.org/10.1145/810000/808598/p195-minsky.pdf?key1=808598&key2=2244955521&coll=GUIDE&dl=GUIDE&CFID=57828830&CFTOKEN=43421964>, Jul. 1984, pp. 195-203.
Moore,“TypeWay Adaptive Keyboard for iPad Review”, Retrieved from <http://www.technologytell.com/apple/89378/typeway-adaptive-keyboard-for-ipad-review/> on Mar. 6, 2013, Feb. 5, 2012, 10 pages.
Nordgren,“Development of a Touch Screen Interface for Scania Interactor”, Master's Thesis in Computing Science, UMEA University, Available at <http://www.cs.umu.se/education/examina/Rapporter/PederNordgren.pdf>, Apr. 10, 2007, pp. 1-59.
Olwal,“Rubbing and Tapping for Precise and Rapid Selection on Touch-Screen Displays”, Conference on Human Factors in Computing Systems, Available at <http://www.csc.kth.se/˜alx/projects/research/rubbing/olwal_rubbing_tapping_chi_2008.pdf>, Apr. 2008, 10 pages.
Panzarino,“Apple's iPad Mini Should have a Widescreen Display”, Retrieved from <http://thenextweb.com/apple/2012/08/15/what-ipad-mini-169-instead-43/> on Aug. 29, 2012, Aug. 15, 2012, 6 pages.
Pierce,“Toolspaces and Glances: Storing, Accessing, and Retrieving Objects in 3D Desktop Applications”, 1999 Symposium on Interactive 3D Graphics, Available at <http://delivery.acm.org/10.1145/310000/300545/p163-pierce.pdf?key1=300545&key2=8792497521&coll=GUIDE&dl=GUIDE&CFID=61004073&CFTOKEN=28819248>, Apr. 1999, pp. 163-168.
Roth,“Bezel Swipe: Conflict-Free Scrolling and Multiple Selection on Mobile Touch Screen Devices”, CHI 2009, Available at <http://www.volkerroth.com/download/Roth2009a.pdf>, Apr. 2009, 4 pages.
Roth,“Bezel Swipe: Conflict-Free Scrolling and Multiple Selection on Mobile Touch Screen Devices”, In 27th International Conference on Human Factors in Computing Systems, Retrieved from <http://www.volkerroth.com/download/Roth2009a.pdf>, Apr. 4, 2009, 4 pages.
Roudaut,“Leaf Menus: Linear Menus with Stroke Shortcuts for Small Handheld Devices”, Proceedings of the 12th IFIP TC 13 International Conference on Human-Computer Interaction: Part I, Aug. 2009, 4 pages.
Saini,“Designing of a Virtual System with Fingerprint Security by considering many Security Threats”, International Journal of Computer Applications, vol. 3—No. 2, available at <http://www.ijcaonline.org/volume3/number2/pxc387995.pdf>, Jun. 2010, pp. 25-31.
Sajid,“Microsoft Patent a Futuristic Virtual Multitouch Keyboard”, Retrieved from <http://thetechnopath.com/microsoft-patent-futuristic-virtual-multitouch-keyboard/857/> on Mar. 6, 2013, Sep. 27, 2009, 8 pages.
Sax,“Liquid Keyboard: an Ergonomic, Adaptive QWERTY Keyboard for Touchscreens and Surfaces”, ICDS 2011, The Fifth International Conference on Digital Society, Feb. 23, 2011, 6 pages.
Sax,“Liquid Keyboard: An Ergonomic, Adaptive QWERTY Keyboard for Touchscreens”, Proceedings of Fifth International Conference on Digital Society, Feb. 23, 2011, pp. 117-122.
Serrano,“Bezel-Tap Gestures: Quick Activation of Commands from Sleep Mode on Tablets”, In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Apr. 27, 2013, 10 pages.
T.,“Smartphone displays need a bezel. Here's why”, Retrieved from <http://www.phonearena.com/news/Smartphone-displays-need-a-bezel.-Heres-why_id27670> on Aug. 29, 2012, Mar. 12, 2012, 4 pages.
Vallerio,“Energy-Efficient Graphical User Interface Design”, Retrieved from: <http://www.cc.gatech.edu/classes/AY2007/cs7470_fall/zhong-energy-efficient-user-interface.pdf>, Jun. 10, 2004, pp. 1-13.
Vigil,“Methods for Controlling a Floating Cursor on a Multi-touch Mobile Phone or Tablet in Conjunction with Selection Gestures and Content Gestures”, U.S. Appl. No. 61/304,972, filed Feb. 16, 2010, 54 pages.
Yee,“Two-Handed Interaction on a Tablet Display”, Retrieved from: <http://zesty.ca/tht/yee-tht-chi2004-short.pdf>, Conference on Human Factors in Computing Systems, Apr. 2004, 4 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/212,916, dated May 9, 2016, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/212,916, dated May 26, 2016, 6 pages.
“Final Office Action”, U.S. Appl. No. 13/674,357, dated Jul. 27, 2016, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/898,452, dated Jul. 28, 2016, 26 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/674,357, dated Jan. 26, 2017, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/674,357, dated Nov. 14, 2016, 2 pages.
“Foreign Office Action”, CN Application No. 201380059094.3, dated Dec. 1, 2016, 15 pages.
“Foreign Office Action”, EP Application No. 15713073.3, dated Mar. 16, 2017, 4 pages.
“Foreign Notice of Allowance”, CN Application No. 201110046519.X, dated Aug. 2, 2016, 4 pages.
“Notice of Allowance”, U.S. Appl. No. 13/674,357, dated Oct. 13, 2016, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 14/979,910, dated Aug. 31, 2016, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 14/979,910, dated Nov. 1, 2016, 7 pages.
“Foreign Office Action”, CN Application No. 201380059094.3, dated Aug. 17, 2017, 21 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/453,403, dated Jun. 30, 2017, 6 pages.
“Final Office Action”, U.S. Appl. No. 13/898,452, dated May 19, 2017, 24 pages.
“Foreign Notice of Allowance”, CN Application No. 201380005804.4, dated Sep. 30, 2016, 4 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/069644, dated Jan. 8, 2014, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/426,548, dated Jun. 30, 2017, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/453,464, dated Jun. 30, 2017, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/543,403, dated Jun. 30, 2017, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 13/352,193, dated Jul. 29, 2016, 10 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/352,193, dated Nov. 14, 2016, 2 pages.
Related Publications (1)
Number Date Country
20160291787 A1 Oct 2016 US
Continuations (1)
Number Date Country
Parent 14212916 Mar 2014 US
Child 15142758 US