Hover detection on a touch sensor panel

Information

  • Patent Grant
  • 11662867
  • Patent Number
    11,662,867
  • Date Filed
    Monday, May 3, 2021
  • Date Issued
    Tuesday, May 30, 2023
  • CPC
    • G06F3/0446
    • G06F3/0445
  • Field of Search
    • CPC
    • G06F3/0446
    • G06F3/0445
    • G06F3/041661
    • G06F3/0443
    • G06F2203/04108
  • International Classifications
    • G06F3/041
    • G06F3/044
Abstract
Some touch screens can be formed with rows and columns of touch electrodes. In some examples, during a first time period, a first set of row electrodes are driven while a second set of row electrodes are sensed. In some examples, during a second time period, the first set of row electrodes are sensed while the second set of row electrodes are driven. In some examples, during a third time period, a first set of column electrodes are driven while a second set of column electrodes are sensed. In some examples, during a fourth time period, the first set of column electrodes are sensed while the second set of column electrodes are driven. In some examples, a touch image can be generated based on the data sensed from the first, second, third, and fourth time periods.
Description
FIELD OF THE DISCLOSURE

This relates generally to methods and systems for performing hover detection on a touch sensor panel.


BACKGROUND OF THE DISCLOSURE

Many types of input devices are presently available for performing operations in a computing system, such as buttons or keys, mice, trackballs, joysticks, touch sensor panels, touch screens and the like. Touch screens, in particular, are popular because of their ease and versatility of operation as well as their declining price. Touch screens can include a touch sensor panel, which can be a clear panel with a touch-sensitive surface, and a display device such as a liquid crystal display (LCD), light emitting diode (LED) display or organic light emitting diode (OLED) display that can be positioned partially or fully behind the panel so that the touch-sensitive surface can cover at least a portion of the viewable area of the display device. Touch screens can allow a user to perform various functions by touching the touch sensor panel using a finger, stylus or other object at a location often dictated by a user interface (UI) being displayed by the display device. In general, touch screens can recognize a touch and the position of the touch on the touch sensor panel, and the computing system can then interpret the touch in accordance with the display appearing at the time of the touch, and thereafter can perform one or more actions based on the touch. In the case of some touch sensing systems, a physical touch on the display is not needed to detect a touch. For example, in some capacitive-type touch sensing systems, fringing electric fields used to detect touch can extend beyond the surface of the display, and objects approaching the surface may be detected without actually touching the surface.


Capacitive touch sensor panels can be formed by a matrix of partially or fully transparent or non-transparent conductive plates (e.g., touch electrodes or sensing electrodes) made of materials such as Indium Tin Oxide (ITO). In some examples, the conductive plates can be formed from other materials including conductive polymers, metal mesh, graphene, nanowires (e.g., silver nanowires) or nanotubes (e.g., carbon nanotubes). It is due in part to their substantial transparency that some capacitive touch sensor panels can be overlaid on a display to form a touch screen, as described above. Some touch screens can be formed by at least partially integrating touch sensing circuitry into a display pixel stackup (i.e., the stacked material layers forming the display pixels). In some cases, capacitive touch sensor panels can operate in a mutual capacitance or self-capacitance mode.


SUMMARY OF THE DISCLOSURE

This relates to systems and methods of improving detection sensitivity of touch sensor panels that are operating in mutual capacitance mode, such as to detect proximity (e.g., hover) events, for example. In some examples, a touch sensor panel can be arranged in rows and columns of touch electrodes. In a mutual capacitance sensing mode, the intersections of the rows and columns form capacitances that can be measured by a touch sensing circuit. In some examples, the rows and/or columns can be driven by a known drive signal and other rows and/or columns can be sensed to determine the capacitance at the respective intersections. When an object such as a finger or stylus approaches and/or contacts the touch sensor panel, the capacitance at respective intersections changes due to the object's interference with the electromagnetic fields between the touch electrodes. In some examples, multiple rows and/or columns can be driven simultaneously (optionally with the same drive signal) to increase the field penetration of the generated electromagnetic fields. In some examples, the sensed change in capacitance due to the object's interaction with the touch sensor panel can be small relative to the overall capacitance formed by the intersection of the touch electrodes. In some examples, an offset signal can be injected into touch sensing circuits to offset baseline capacitance.





BRIEF DESCRIPTION OF THE DRAWINGS


FIGS. 1A-1E illustrate example systems that can implement touch sensing according to examples of the disclosure.



FIG. 2 illustrates a block diagram of an example computing system that can implement touch sensing according to examples of the disclosure.



FIG. 3 illustrates an example touch screen including touch sensing circuitry configured as drive and sense regions or lines according to examples of the disclosure.



FIG. 4 illustrates an example touch screen including touch sensing circuitry configured as pixelated electrodes according to examples of the disclosure.



FIG. 5 illustrates an example mutual capacitance scan of an example row-column touch sensor panel.



FIG. 6 illustrates an example touch screen system according to examples of the disclosure.



FIGS. 7A-7D illustrate an exemplary method of operating a touch panel according to examples of the disclosure.



FIGS. 8A-8B illustrate exemplary touch sensor circuits according to examples of the disclosure.



FIGS. 9A-9B illustrate signal graphs of exemplary driving schemes for touch sensor circuits according to examples of the disclosure.



FIGS. 10A-10C illustrate a method of demodulating a touch sense signal.



FIG. 11 illustrates an example multi-stim scan sequence.





DETAILED DESCRIPTION

In the following description of examples, reference is made to the accompanying drawings which form a part hereof, and in which it is shown by way of illustration specific examples that can be practiced. It is to be understood that other examples can be used and structural changes can be made without departing from the scope of the disclosed examples.


This relates to systems and methods of improving detection sensitivity of touch sensor panels that are operating in mutual capacitance mode, such as to detect proximity (e.g., hover) events, for example. In some examples, a touch sensor panel can be arranged in rows and columns of touch electrodes. In a mutual capacitance sensing mode, the intersections of the rows and columns form capacitances that can be measured by a touch sensing circuit. In some examples, the rows and/or columns can be driven by a known drive signal and other rows and/or columns can be sensed to determine the capacitance at the respective intersections. When an object such as a finger or stylus approaches and/or contacts the touch sensor panel, the capacitance at respective intersections changes due to the object's interference with the electromagnetic fields between the touch electrodes. In some examples, the change in capacitance can be small relative to the overall capacitance formed by the intersection of the touch electrodes. In some examples, multiple rows and/or columns can be driven simultaneously to increase the field penetration of the generated electromagnetic fields. In some examples, an offset signal can be injected into touch sensing circuits to offset baseline capacitance.



FIGS. 1A-1E illustrate example systems that can implement touch sensing (e.g., hover sensing) according to examples of the disclosure. FIG. 1A illustrates an example mobile telephone 136 that includes a touch screen 124 and a computing system that can implement touch sensing according to examples of the disclosure. FIG. 1B illustrates an example digital media player 140 that includes a touch screen 126 and a computing system that can implement touch sensing according to examples of the disclosure. FIG. 1C illustrates an example personal computer 144 that includes a touch screen 128 and a computing system that can implement touch sensing according to examples of the disclosure. FIG. 1D illustrates an example tablet computing device 148 that includes a touch screen 130 and a computing system that can implement touch sensing according to examples of the disclosure. FIG. 1E illustrates an example wearable device 150 that includes a touch screen 152 and a computing system, that can be attached to a user using a strap 154, and that can implement touch sensing according to examples of the disclosure. The touch screen and computing system that can implement touch sensing can be implemented in other devices.


Touch screens 124, 126, 128, 130 and 152 can be based on, for example, self-capacitance or mutual capacitance sensing technology, or another touch sensing technology. For example, a self-capacitance based touch system can include a matrix of small, individual plates of conductive material that can be referred to as touch node electrodes (as described below with reference to touch screen 420 in FIG. 4). For example, a touch screen can include a plurality of individual touch node electrodes, each touch node electrode identifying or representing a unique location on the touch screen at which touch or proximity (i.e., a touch or proximity event) is to be sensed, and each touch node electrode being electrically isolated from the other touch node electrodes in the touch screen/panel. Such a touch screen can be referred to as a pixelated self-capacitance touch screen, though it is understood that in some examples, the touch node electrodes on the touch screen can be used to perform scans other than self-capacitance scans on the touch screen (e.g., mutual capacitance scans). During operation, a touch node electrode can be stimulated with an AC waveform, and the self-capacitance to ground of the touch node electrode can be measured. As an object approaches the touch node electrode, the self-capacitance to ground of the touch node electrode can change (e.g., increase). This change in the self-capacitance of the touch node electrode can be detected and measured by the touch sensing system to determine the positions of multiple objects when they touch, or come in proximity to, the touch screen. In some examples, the electrodes of a self-capacitance based touch system can be formed from rows and columns of conductive material (as described below with reference to touch screen 320 in FIG. 3), and changes in the self-capacitance to ground of the rows and columns can be detected, similar to above. In some examples, a touch screen can be multi-touch, single touch, projection scan, full-imaging multi-touch, capacitive touch, etc.


In some examples, touch screens 124, 126, 128, 130 and 152 can be based on mutual capacitance. A mutual capacitance based touch system can include drive and sense lines that may cross over each other on different layers, or may be adjacent to each other on the same layer (e.g., as illustrated in touch screen 320 in FIG. 3). The crossing or adjacent locations can be referred to as touch nodes. During operation, the drive line can be stimulated with an AC waveform and the mutual capacitance of the touch node can be measured. As an object approaches the touch node, the mutual capacitance of the touch node can change (e.g., decrease). This change in the mutual capacitance of the touch node can be detected and measured by the touch sensing system to determine the positions of multiple objects when they touch, or come in proximity to, the touch screen. In some examples, the electrodes of a mutual-capacitance based touch system can be formed from a matrix of small, individual plates of conductive material, and changes in the mutual capacitance between plates of conductive material can be detected, similar to above.
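
To make the two sensing conventions above concrete, the following is a minimal sketch (not drawn from the patent, and with hypothetical function names) of how a touch sensing system could convert a raw capacitance measurement into a touch signal by comparing it against a stored no-touch baseline: self-capacitance rises as an object approaches, while mutual capacitance falls.

```python
# Hypothetical helper names; the baselines and thresholds would come from a
# calibration of the actual panel.
def self_cap_signal(measured_f: float, baseline_f: float) -> float:
    """Self-capacitance mode: the signal is the rise above the no-touch baseline."""
    return measured_f - baseline_f

def mutual_cap_signal(measured_f: float, baseline_f: float) -> float:
    """Mutual-capacitance mode: the signal is the drop below the no-touch baseline."""
    return baseline_f - measured_f

def is_touch_or_hover(signal_f: float, threshold_f: float) -> bool:
    # A node is reported as touched or hovered over when its signal exceeds a
    # chosen detection threshold.
    return signal_f > threshold_f
```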


In some examples, touch screens 124, 126, 128, 130 and 152 can be based on mutual capacitance and/or self-capacitance. The electrodes can be arranged as a matrix of small, individual plates of conductive material (e.g., as in touch screen 420 in FIG. 4) or as drive lines and sense lines (e.g., as in touch screen 320 in FIG. 3), or in another pattern. The electrodes can be configurable for mutual capacitance or self-capacitance sensing or a combination of mutual and self-capacitance sensing. For example, in one mode of operation electrodes can be configured to sense mutual capacitance between electrodes and in a different mode of operation electrodes can be configured to sense self-capacitance of electrodes. In some examples, some of the electrodes can be configured to sense mutual capacitance therebetween and some of the electrodes can be configured to sense self-capacitance thereof.



FIG. 2 illustrates a block diagram of an example computing system that can implement touch sensing (e.g., hover sensing) according to examples of the disclosure. Computing system 200 could be included in, for example, mobile telephone 136, digital media player 140, personal computer 144, tablet computing device 148, wearable device 150, or any mobile or non-mobile computing device that includes a touch screen. Computing system 200 can include an integrated touch screen 220 to display images and to detect touch and/or proximity (e.g., hover) events from an object (e.g., finger 203 or active or passive stylus 205) at or proximate to the surface of the touch screen 220. Computing system 200 can also include an application specific integrated circuit (“ASIC”) illustrated as touch ASIC 201 to perform touch and/or stylus sensing operations. Touch ASIC 201 can include one or more touch processors 202, peripherals 204, and touch controller 206. Touch ASIC 201 can be coupled to touch sensing circuitry of touch screen 220 to perform touch and/or stylus sensing operations (described in more detail below). Peripherals 204 can include, but are not limited to, random access memory (RAM) or other types of memory or storage, watchdog timers and the like. Touch controller 206 can include, but is not limited to, one or more sense channels in receive circuitry 208, panel scan engine 210 (which can include channel scan logic) and transmit circuitry 214 (which can include analog or digital driver logic). In some examples, the transmit circuitry 214 and receive circuitry 208 can be reconfigurable by the panel scan engine 210 based on the scan event to be executed (e.g., mutual capacitance row-column scan, mutual capacitance row-row scan, differential mutual capacitance scan, mutual capacitance column-column scan, row self-capacitance scan, column self-capacitance scan, touch spectral analysis scan, stylus spectral analysis scan, stylus scan, etc.). Panel scan engine 210 can access RAM 212, autonomously read data from the sense channels and provide control for the sense channels (e.g., described in more detail with respect to sense channel 780 in FIG. 7E). The touch controller 206 can also include a scan plan (e.g., stored in RAM 212) which can define a sequence of scan events to be performed at the touch screen. The scan plan can include information necessary for configuring or reconfiguring the transmit circuitry and receive circuitry for the specific scan event to be performed. Results (e.g., touch signals or touch data) from the various scans can also be stored in RAM 212. In addition, panel scan engine 210 can provide control for transmit circuitry 214 to generate stimulation signals at various frequencies and/or phases that can be selectively applied to drive regions of the touch sensing circuitry of touch screen 220. Touch controller 206 can also include a spectral analyzer to determine low noise frequencies for touch and stylus scanning. The spectral analyzer can perform spectral analysis on the scan results from an unstimulated touch screen. Although illustrated in FIG. 2 as a single ASIC, the various components and/or functionality of the touch ASIC 201 can be implemented with multiple circuits, elements, chips, and/or discrete components.
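
As a rough illustration of the scan plan concept described above, the sketch below models a scan plan as an ordered list of scan events, each naming a scan type and the electrodes coupled to the transmit and receive circuitry. The data structures, field names, and example frequency are assumptions for illustration only, not the touch ASIC's actual interfaces.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import List

class ScanType(Enum):
    # A few of the scan events named above; the enumeration is illustrative.
    MUTUAL_ROW_COLUMN = auto()
    MUTUAL_ROW_ROW = auto()
    MUTUAL_COLUMN_COLUMN = auto()
    SELF_ROW = auto()
    SELF_COLUMN = auto()
    SPECTRAL_ANALYSIS = auto()
    STYLUS = auto()

@dataclass
class ScanEvent:
    scan_type: ScanType
    drive_electrodes: List[int]   # electrodes coupled to the transmit circuitry
    sense_electrodes: List[int]   # electrodes coupled to the receive circuitry
    stim_frequency_hz: float      # stimulation frequency selected for this event

# A scan plan is simply the ordered sequence of scan events executed each frame.
scan_plan: List[ScanEvent] = [
    ScanEvent(ScanType.MUTUAL_ROW_COLUMN, drive_electrodes=[0, 1, 2, 3],
              sense_electrodes=[0, 1, 2, 3], stim_frequency_hz=300e3),
    ScanEvent(ScanType.SPECTRAL_ANALYSIS, drive_electrodes=[],
              sense_electrodes=[0, 1, 2, 3], stim_frequency_hz=0.0),
]
```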


Computing system 200 can also include an application specific integrated circuit illustrated as display ASIC 216 to perform display operations. Display ASIC 216 can include hardware to process one or more still images and/or one or more video sequences for display on touch screen 220. Display ASIC 216 can be configured to generate read memory operations to read the data representing the frame/video sequence from a memory (not shown) through a memory controller (not shown), for example. Display ASIC 216 can be configured to perform various processing on the image data (e.g., still images, video sequences, etc.). In some examples, display ASIC 216 can be configured to scale still images and to dither, scale and/or perform color space conversion on the frames of a video sequence. Display ASIC 216 can be configured to blend the still image frames and the video sequence frames to produce output frames for display. Display ASIC 216 can also be more generally referred to as a display controller, display pipe, display control unit, or display pipeline. The display control unit can be generally any hardware and/or firmware configured to prepare a frame for display from one or more sources (e.g., still images and/or video sequences). More particularly, display ASIC 216 can be configured to retrieve source frames from one or more source buffers stored in memory, composite frames from the source buffers, and display the resulting frames on touch screen 220. Accordingly, display ASIC 216 can be configured to read one or more source buffers and composite the image data to generate the output frame.


Display ASIC 216 can provide various control and data signals to the display, including timing signals (e.g., one or more clock signals) and/or vertical blanking period and horizontal blanking interval controls. The timing signals can include a pixel clock that can indicate transmission of a pixel. The data signals can include color signals (e.g., red, green, blue). The display ASIC 216 can control the touch screen 220 in real-time, providing the data indicating the pixels to be displayed as the touch screen is displaying the image indicated by the frame. The interface to such a touch screen 220 can be, for example, a video graphics array (VGA) interface, a high definition multimedia interface (HDMI), a digital video interface (DVI), a LCD interface, an LED display interface, an OLED display interface, a plasma interface, or any other suitable interface.


In some examples, a handoff module 218 can also be included in computing system 200. Handoff module 218 can be coupled to the touch ASIC 201, display ASIC 216, and touch screen 220, and can be configured to interface the touch ASIC 201 and display ASIC 216 with touch screen 220. The handoff module 218 can appropriately operate the touch screen 220 according to the scanning/sensing and display instructions from the touch ASIC 201 and the display ASIC 216. In other examples, the display ASIC 216 can be coupled to display circuitry of touch screen 220 and touch ASIC 201 can be coupled to touch sensing circuitry of touch screen 220 without handoff module 218.


Touch screen 220 can use liquid crystal display (LCD) technology, light emitting polymer display (LPD) technology, light emitting diode (LED) technology, organic LED (OLED) technology, or organic electro luminescence (OEL) technology, although other display technologies can be used in other examples. In some examples, the touch sensing circuitry and display circuitry of touch screen 220 can be stacked on top of one another. For example, a touch sensor panel can cover some or all of a surface of the display (e.g., fabricated one on top of the next in a single stack-up or formed from adhering together a touch sensor panel stack-up with a display stack-up). In other examples, the touch sensing circuitry and display circuitry of touch screen 220 can be partially or wholly integrated with one another. The integration can be structural and/or functional. For example, some or all of the touch sensing circuitry can be structurally in between the substrate layers of the display (e.g., between two substrates of a display pixel cell). Portions of the touch sensing circuitry formed outside of the display pixel cell can be referred to as “on-cell” portions or layers, whereas portions of the touch sensing circuitry formed inside of the display pixel cell can be referred to as “in cell” portions or layers. Additionally, some electronic components can be shared, and used at times as touch sensing circuitry and at other times as display circuitry. For example, in some examples, common electrodes can be used for display functions during active display refresh and can be used to perform touch sensing functions during touch sensing periods. A touch screen stack-up sharing components between sensing functions and display functions can be referred to as an in-cell touch screen.


Computing system 200 can also include a host processor 228 coupled to the touch ASIC 201, and can receive outputs from touch ASIC 201 (e.g., from touch processor 202 via a communication bus, such as a serial peripheral interface (SPI) bus, for example) and perform actions based on the outputs. Host processor 228 can also be connected to program storage 232 and display ASIC 216. Host processor 228 can, for example, communicate with display ASIC 216 to generate an image on touch screen 220, such as an image of a user interface (UI), and can use touch ASIC 201 (including touch processor 202 and touch controller 206) to detect a touch on or near touch screen 220, such as a touch input to the displayed UI. The touch input can be used by computer programs stored in program storage 232 to perform actions that can include, but are not limited to, moving an object such as a cursor or pointer, scrolling or panning, adjusting control settings, opening a file or document, viewing a menu, making a selection, executing instructions, operating a peripheral device connected to the host device, answering a telephone call, placing a telephone call, terminating a telephone call, changing the volume or audio settings, storing information related to telephone communications such as addresses, frequently dialed numbers, received calls, missed calls, logging onto a computer or a computer network, permitting authorized individuals access to restricted areas of the computer or computer network, loading a user profile associated with a user's preferred arrangement of the computer desktop, permitting access to web content, launching a particular program, encrypting or decoding a message, and/or the like. Host processor 228 can also perform additional functions that may not be related to touch processing.


Computing system 200 can include one or more processors, which can execute software or firmware implementing various functions. Specifically, for integrated touch screens which share components between touch and/or stylus sensing and display functions, the touch ASIC and display ASIC can be synchronized so as to properly share the circuitry of the touch sensor panel. The one or more processors can include one or more of the one or more touch processors 202, a processor in display ASIC 216, and/or host processor 228. In some examples, the display ASIC 216 and host processor 228 can be integrated into a single ASIC, though in other examples, the host processor 228 and display ASIC 216 can be separate circuits coupled together. In some examples, host processor 228 can act as a master circuit and can generate synchronization signals that can be used by one or more of the display ASIC 216, touch ASIC 201 and handoff module 218 to properly perform sensing and display functions for an in-cell touch screen. The synchronization signals can be communicated directly from the host processor 228 to one or more of the display ASIC 216, touch ASIC 201 and handoff module 218. Alternatively, the synchronization signals can be communicated indirectly (e.g., touch ASIC 201 or handoff module 218 can receive the synchronization signals via the display ASIC 216).


Computing system 200 can also include a wireless module (not shown). The wireless module can implement a wireless communication standard such as WiFi®, BLUETOOTH™ or the like. The wireless module can be coupled to the touch ASIC 201 and/or host processor 228. The touch ASIC 201 and/or host processor 228 can, for example, transmit scan plan information, timing information, and/or frequency information to the wireless module to enable the wireless module to transmit the information to an active stylus, for example (i.e., a stylus capable of generating and injecting a stimulation signal into a touch sensor panel). For example, the computing system 200 can transmit frequency information indicative of one or more low noise frequencies the stylus can use to generate a stimulation signal. Additionally or alternatively, timing information can be used to synchronize the stylus 205 with the computing system 200, and the scan plan information can be used to indicate to the stylus 205 when the computing system 200 performs a stylus scan and expects stylus stimulation signals (e.g., to save power by generating a stimulus only during a stylus scan period). In some examples, the wireless module can also receive information from peripheral devices, such as an active stylus 205, which can be transmitted to the touch ASIC 201 and/or host processor 228. In other examples, the wireless communication functionality can be incorporated in other components of computing system 200, rather than in a dedicated chip.


Note that one or more of the functions described herein can be performed by firmware stored in memory and executed by the touch processor in touch ASIC 201, or stored in program storage and executed by host processor 228. The firmware can also be stored and/or transported within any non-transitory computer-readable storage medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a “non-transitory computer-readable storage medium” can be any medium (excluding a signal) that can contain or store the program for use by or in connection with the instruction execution system, apparatus, or device. The non-transitory computer-readable storage medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, a portable computer diskette (magnetic), a random access memory (RAM) (magnetic), a read-only memory (ROM) (magnetic), an erasable programmable read-only memory (EPROM) (magnetic), a portable optical disc such as a CD, CD-R, CD-RW, DVD, DVD-R, or DVD-RW, or flash memory such as compact flash cards, secured digital cards, USB memory devices, memory sticks, and the like.


The firmware can also be propagated within any transport medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a “transport medium” can be any medium that can communicate, propagate or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The transport medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic or infrared wired or wireless propagation medium.


It is to be understood that the computing system 200 is not limited to the components and configuration of FIG. 2, but can include other or additional components in multiple configurations according to various examples. Additionally, the components of computing system 200 can be included within a single device, or can be distributed between multiple devices.


As discussed above, the touch screen 220 can include touch sensing circuitry. FIG. 3 illustrates an example touch screen including touch sensing circuitry configured as drive and sense regions or lines according to examples of the disclosure. Touch screen 320 can include touch sensing circuitry that can include a capacitive sensing medium having a plurality of drive lines 322 and a plurality of sense lines 323. It should be noted that the term “lines” is sometimes used herein to mean simply conductive pathways, as one skilled in the art will readily understand, and is not limited to elements that are strictly linear, but includes pathways that change direction, and includes pathways of different size, shape, materials, etc. Additionally, the drive lines 322 and sense lines 323 can be formed from smaller electrodes coupled together to form drive lines and sense lines. Drive lines 322 can be coupled to transmit circuitry and sense lines 323 can be coupled to receive circuitry. As used herein, an electrical component “coupled to” or “connected to” another electrical component encompasses a direct or indirect connection providing an electrical path for communication or operation between the coupled components. Thus, for example, drive lines 322 may be directly connected to transmit circuitry or indirectly connected to transmit circuitry via drive interface 324, but in either case an electrical path may be provided for driving stimulation signals to drive lines. Likewise, sense lines 323 may be directly connected to sense channels or indirectly connected to sense channels via sense interface 325, but in either case an electrical path may be provided for sensing the sense lines 323. Drive lines 322 can be driven by stimulation signals from the transmit circuitry 214 through a drive interface 324, and resulting sense signals generated in sense lines 323 can be transmitted through a sense interface 325 to sense channels in receive circuitry 208 in touch controller 206. In this way, drive lines and sense lines can be part of the touch sensing circuitry that can interact to form capacitive sensing nodes, which can be thought of as touch nodes, such as touch nodes 326 and 327. This way of understanding can be particularly useful when touch screen 320 is viewed as capturing an “image” of touch (or “touch image”). In other words, after touch controller 206 has determined whether a touch has been detected at each touch node in the touch screen, the pattern of touch pixels in the touch screen at which a touch occurred can be thought of as an “image” of touch (e.g., a pattern of fingers or other objects touching the touch screen).


It should be understood that the row/drive and column/sense associations can be exemplary, and in other examples, columns can be drive lines and rows can be sense lines. In some examples, row and column electrodes can be perpendicular such that touch nodes can have x and y coordinates, though other coordinate systems can also be used, and the coordinates of the touch nodes can be defined differently. It should be understood that touch screen 220 can include any number of row electrodes and column electrodes to form the desired number and pattern of touch nodes. The electrodes of the touch sensor panel can be configured to perform various scans including some or all of row-column and/or column-row mutual capacitance scans, differential mutual capacitance scans, self-capacitance row and/or column scans, row-row mutual capacitance scans, column-column mutual capacitance scans, and stylus scans.


Additionally or alternatively, the touch screen can include touch sensing circuitry including an array of touch node electrodes arranged in a pixelated touch node electrode configuration. FIG. 4 illustrates an example touch screen including touch sensing circuitry configured as pixelated touch node electrodes according to examples of the disclosure. Touch screen 420 can include touch sensing circuitry that can include a plurality of individual touch node electrodes 422, each touch node electrode identifying or representing a unique location on the touch screen at which touch or proximity (i.e., a touch or proximity event) is to be sensed, and each touch node electrode being electrically isolated from the other touch node electrodes in the touch screen/panel. Touch node electrodes 422 can be on the same or different material layers on touch screen 420. In some examples, touch screen 420 can sense the self-capacitance of touch node electrodes 422 to detect touch and/or proximity activity on touch screen 420. For example, in a self-capacitance configuration, touch node electrodes 422 can be coupled to sense channels in receive circuitry 208 in touch controller 206, can be driven by stimulation signals from the sense channels (or transmit circuitry 214) through drive/sense interface 425, and can be sensed by the sense channels through the drive/sense interface as well, as described above. Labeling the conductive plates used to detect touch (i.e., touch node electrodes 422) as “touch pixel” electrodes can be particularly useful when touch screen 420 is viewed as capturing an “image” of touch. In other words, after touch controller 206 has determined an amount of touch detected at each touch node electrode 422 in touch screen 420, the pattern of touch node electrodes in the touch screen at which a touch occurred can be thought of as an “image” of touch (e.g., a pattern of fingers or other objects touching the touch screen). In some examples, touch screen 420 can sense the mutual capacitance between touch node electrodes 422 to detect touch and/or proximity activity on touch screen 420. Although discussed herein primarily with reference to a row-column touch sensor panel (e.g., with reference to FIGS. 6 and 7A-7D), the principles of the touch sensing can be applied to a pixelated touch sensor panel configured to detect mutual capacitance. Additionally, although discussed herein primarily with reference to mutual capacitance based touch sensor panels, the principles of the touch sensing can be applied to other capacitance based touch sensor panels (e.g., self-capacitance based touch sensor panels), resistive touch sensor panels, and other types of touch sensor panels. Additionally, it should be understood that a force sensor panel can also be implemented using mutual capacitance sensing techniques. In some examples, a force sensor panel can measure mutual capacitance between electrodes mounted on the backplane of the display and electrodes mounted on a proximate flex circuit. As force is exerted, the distance between the electrodes mounted on the backplane of the display and the electrodes mounted on a proximate flex circuit can change, changing the mutual capacitance coupling therebetween. The change in mutual capacitance can be measured to detect force applied to the touch screen.
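
The force-sensing idea described above can be illustrated with a small, hedged sketch that assumes an idealized parallel-plate model (C = εA/d) and a linear compressible layer; the constants and function name below are hypothetical and are not taken from the patent.

```python
EPS0 = 8.854e-12      # vacuum permittivity, F/m
AREA = 1.0e-4         # assumed electrode overlap area, m^2
GAP0 = 200e-6         # assumed nominal electrode separation, m
K_SPRING = 5.0e3      # assumed stiffness of the compressible layer, N/m

def estimated_force(measured_capacitance_f: float) -> float:
    """Infer the applied force from the mutual capacitance measured between the
    display-backplane electrode and the flex-circuit electrode."""
    gap = EPS0 * AREA / measured_capacitance_f   # current separation from C = eps*A/d
    compression = max(GAP0 - gap, 0.0)           # how far the gap has closed
    return K_SPRING * compression                # Hooke's law: F = k * x
```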



FIG. 5 illustrates an example mutual capacitance scan of an example row-column touch sensor panel. Touch sensor panel 500 can include an array of touch nodes formed at the crossing points of row electrodes 510 and column electrodes 520. For example, touch node 506 can be formed at the crossing point of row electrode 501 and column electrode 502. During a single-stimulation mutual capacitance scan, a row electrode 501 (configured as a drive line) can be coupled to the transmit circuitry 214 which can stimulate the row electrode 501 with a drive signal (“Vstim”). One or more column electrodes (configured as sense lines) can be coupled to the receive circuitry 208 to sense mutual capacitance (or changes in mutual capacitance) between row electrode 501 and each of the one or more column electrodes. For each step of the single-stimulation mutual capacitance scan, one row electrode can be stimulated and the one or more column traces can be sensed. A touch node 506 can have a mutual capacitance Cm at the touch node 506 (between stimulated row electrode 501 and sensed column electrode 502) when there is no object touching or proximate to (e.g., within a threshold distance of) touch node 506. When an object touches or is proximate to the touch node 506 (e.g., a finger or stylus), the mutual capacitance Cm can be reduced by ΔCm, i.e., (Cm−ΔCm), corresponding to the amount of charge shunted through the object to ground. This mutual capacitance change can be sensed by sense amplifier 508 in the receive circuitry 208, which can be coupled to the column electrode 502 corresponding to touch node 506, to sense a touch signal that can be used to indicate the touch or proximity of an object at touch node 506. The sensing described with respect to touch node 506 can be repeated for the touch nodes of the touch sensor panel to generate an image of touch for the touch sensor panel (e.g., in subsequent single-stimulation mutual capacitance steps different row electrodes, such as row electrodes 503, 505, and 507, can be stimulated). In examples with a dedicated sense amplifier 508 for each column electrode (sense line) and N row electrodes (drive lines), the touch image for the touch sensor panel can be generated using N single-stimulation mutual capacitance scan steps.
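
The single-stimulation scan described above can be summarized with a short sketch: one row is driven per step while every column is sensed, and the drop in mutual capacitance at each node is accumulated into a touch image. The functions drive_row() and sense_columns() are stand-ins for the transmit and receive circuitry and are assumptions, not the patent's API.

```python
from typing import Callable, List

def single_stim_scan(num_rows: int, num_cols: int,
                     drive_row: Callable[[int], None],
                     sense_columns: Callable[[], List[float]],
                     baseline: List[List[float]]) -> List[List[float]]:
    """Return a touch image of delta-Cm values, one per touch node."""
    touch_image = [[0.0] * num_cols for _ in range(num_rows)]
    for row in range(num_rows):          # N scan steps, one driven row per step
        drive_row(row)                   # stimulate this row with Vstim
        measured = sense_columns()       # mutual capacitance seen at each column
        for col in range(num_cols):
            # An object near node (row, col) shunts charge to ground, reducing Cm.
            touch_image[row][col] = baseline[row][col] - measured[col]
    return touch_image
```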


In some examples, rather than using a single-stimulation mutual capacitance scan, the row-column touch sensor panel 500 can be stimulated using a multi-stimulation (“multi-stim”) mutual capacitance scan. In a multi-stim scan, multiple drive lines (e.g., row electrodes 510) can be simultaneously stimulated with different stimulation signals for multiple stimulation steps, and the sense signals generated at one or more sense lines (e.g., column electrodes 520) in response to the multiple stimulation steps can be processed to determine the presence and/or amount of touch for each touch node in the touch sensor panel (corresponding to the multiple drive lines). For example, FIG. 5 illustrates four row electrodes 510 and four column electrodes 520. In some examples, each of the four row electrodes 510 can be stimulated with a drive signal Vstim, but the phases of the drive signals applied to the drive lines can be different for four stimulation steps. In some examples, the drive signal can be in-phase (Vstim+, 0° phase) or out-of-phase (Vstim−, 180° phase). For example, the polarities of the stimulation signals (e.g., cosine of the phase) for two example multi-stim scans can be represented by Table 1 or Table 2:













TABLE 1

           Step 1    Step 2    Step 3    Step 4

Row 501      +         +         +         +
Row 503      +         +         −         −
Row 505      +         −         −         +
Row 507      +         −         +         −

TABLE 2

           Step 1    Step 2    Step 3    Step 4

Row 501      −         +         −         +
Row 503      +         +         −         −
Row 505      +         −         +         −
Row 507      −         −         +         +


For each sense line and for each step, the sensed signal can include contributions from the four drive lines (e.g., due to the capacitive coupling between the four drive lines and the sense line), encoded based on the polarity of the stimulation signal. At the end of the four steps, four sensed signals for a respective sense line can be decoded based on the stimulation phases to extract the capacitive signal for each touch node formed by one of the drive lines and the respective sense line. For example, assuming a linear system, the sensed signal for a sense line for each scan step can be proportional to the total signal charge, Qsig_tot, which can be equal to the sum of the product of the stimulation voltage and the touch node capacitance for each touch node of the sense line. Mathematically, this can be expressed for step S by equation (1) as:

Qsig_tot(S)=Σi=0M-1Vstimi(S)×Csigi  (1)

where Vstim can represent the stimulation voltage indexed for drive line (row electrode) i and step S, Csig can represent the capacitance at each touch node for the sense line indexed for the corresponding drive line (row electrode) i, and M is the total number of drive lines. In vector form, the above expression can be rewritten in equation (2) as:

{tilde over (Q)}sig_tot=Vstim·{tilde over (M)}·{tilde over (C)}sig  (2)

where {tilde over (Q)}sig_tot can represent a vector of the sensed signals from each scan step of the multi-stim scan, Vstim can represent a constant stimulation voltage, {tilde over (M)} can represent a matrix of polarities of the stimulation voltage (stimulation matrix) indexed by row and step (e.g., as shown in Table 1 or Table 2 above), and {tilde over (C)}sig can represent a vector of the capacitance at each touch node for the sense line. The capacitance value at each touch node of the sense line can be decoded using equation (3):

{tilde over (C)}sig=({tilde over (M)}−1/Vstim)·{tilde over (Q)}sig_tot  (3)

where {tilde over (M)}−1 can represent the inverse of the stimulation matrix. Repeating the measurements and calculations above for each sense line can determine a capacitance signal for each touch node of the touch sensor panel scanned during the multi-stim scan. Although the multi-stim scan described above with respect to FIG. 5 includes four scan steps, it should be understood that the total duration of all four scan steps of the multi-stimulation scan can be the same duration as each scan step of the single-stimulation scan without any reduction in the integration time for sensing the capacitive signal at each touch node. Additional discussion of multi-stimulation touch sensing can be found in U.S. Pat. No. 7,812,827 entitled “Simultaneous Sensing Arrangement” by Steve Hotelling, et al. (filed Jan. 3, 2007) and in U.S. Pat. No. 8,592,697 entitled “Single-Chip Multi-Stimulus Sensor Controller” by Steve Hotelling, et al. (filed Sep. 10, 2008), both of which are incorporated by reference herein.
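
The encode/decode relationship of equations (1)-(3) can be checked numerically with the short sketch below, which assumes an ideal, noiseless linear system, uses the Table 1 polarities as the stimulation matrix (its rows are mutually orthogonal, so the matrix is invertible), and picks arbitrary example capacitances for one sense line.

```python
import numpy as np

vstim = 1.0  # stimulation amplitude (arbitrary units)

# Table 1 polarities: each row corresponds to a drive line (rows 501, 503, 505,
# 507) and each column to a scan step.
table1 = np.array([[+1, +1, +1, +1],   # row 501
                   [+1, +1, -1, -1],   # row 503
                   [+1, -1, -1, +1],   # row 505
                   [+1, -1, +1, -1]],  # row 507
                  dtype=float)
M = table1.T  # stimulation matrix indexed [step, drive line], so Q = Vstim * M @ C

csig = np.array([1.00, 0.98, 0.80, 1.00])  # example node capacitances on one sense line

# Encoding (equation (2)): total signal charge sensed at each of the four scan steps.
q_tot = vstim * M @ csig

# Decoding (equation (3)): recover the per-node capacitances from the step measurements.
csig_decoded = np.linalg.inv(M) @ q_tot / vstim
assert np.allclose(csig_decoded, csig)
```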



FIG. 6 illustrates an example touch screen system 600 according to examples of the disclosure. In some embodiments, touch screen system 600 includes a touch screen 601, drive circuitry 602, sense circuitry 603, and one or more multiplexers 608 and 610. In some examples, touch screen 601 can include patterned touch electrodes (e.g., row touch electrodes and column touch electrodes) configured for measuring touch (or proximity) of an object to touch screen 601. In FIG. 6, touch screen 601 includes four rows of touch electrodes 604-1, 604-2, 604-3, and 604-4 and four columns of touch electrodes 606-1, 606-2, 606-3, and 606-4. As shown, the touch electrodes in each column are electrically coupled together and the touch electrodes in each row are electrically coupled together. It is understood that the number of rows and columns is merely exemplary and a touch screen can have more or fewer rows or columns of touch electrodes. It is also understood that the shapes of the touch electrodes shown in FIG. 6 are merely exemplary and the electrodes need not be diamond shaped.


In some examples, any of the rows and/or columns of touch electrodes can be configured as drive or sense electrodes based on the switching states of one or more multiplexers. In FIG. 6, each row and each column of touch electrodes has an associated multiplexer for controlling what circuitry is coupled to the corresponding row or column of touch electrodes. For example, row multiplexer 608-1 is coupled to row 604-1 such that row multiplexer 608-1 controls whether row 604-1 is coupled to drive circuitry 602 or sense circuitry 603. Similarly, column multiplexer 610-1 is coupled to column 606-1 and controls whether column 606-1 is coupled to drive circuitry 602 or sense circuitry 603.


As shown in FIG. 6, each multiplexer can be a bi-directional 2:1 multiplexer (e.g., an analog multiplexer, a multidirectional multiplexer, etc.). For example, multiplexer 608-1 has two input ports 609-1 and 609-2 and one output port 609-3. Input port 609-1 of row multiplexer 608-1 is coupled to drive circuitry 602 such that if multiplexer 608-1 is set to "select" input port 609-1, then output port 609-3 of row multiplexer 608-1, which is coupled to row 604-1, is coupled to drive circuitry 602 (e.g., drive circuitry 602 can then drive row 604-1). On the other hand, input port 609-2 of row multiplexer 608-1 is coupled to sense circuitry 603 such that if row multiplexer 608-1 is set to "select" the input port 609-2, then output port 609-3 of row multiplexer 608-1 is coupled to sense circuitry 603 (e.g., row 604-1 can provide signals to sense circuitry 603). Row multiplexers 608-2, 608-3, and 608-4 are similarly coupled to rows 604-2, 604-3, and 604-4 and to the drive and sense circuitries.


In some examples, certain rows and/or columns can share the same drive line and/or sense lines. For example, input port 611-1 of column multiplexer 610-1 is coupled to the same drive line as input port 609-1 of row multiplexer 608-1. Thus, in some embodiments, if row multiplexer 608-1 and column multiplexer 610-1 are both set to "select" the first input port (e.g., input port 609-1 and input port 611-1, respectively), then both row 604-1 and column 606-1 are driven by the same signal from drive circuitry 602. In some examples, input port 611-2 of column multiplexer 610-1 is coupled to sense circuitry 603 such that if column multiplexer 610-1 is set to "select" input port 611-2, then output port 611-3 of column multiplexer 610-1 is coupled to sense circuitry 603 (e.g., column 606-1 can provide signals to sense circuitry 603). In some examples, input port 611-2 of column multiplexer 610-1 is coupled to the same sense line as input port 609-2 of row multiplexer 608-1. Thus, in some embodiments, if row multiplexer 608-1 and column multiplexer 610-1 are both set to "select" the second port (e.g., input port 609-2 and input port 611-2, respectively), then both row 604-1 and column 606-1 provide sense signals to sense circuitry 603 on the same sense line. Column multiplexers 610-2, 610-3, and 610-4 are similarly coupled to columns 606-2, 606-3, and 606-4 and to the drive and sense circuitries. Thus, as shown above, each row of touch electrodes can have a corresponding column of touch electrodes that can be configured to be coupled to the same drive and sense lines. It is understood, however, that each row and/or column of touch electrodes can have its own dedicated sense and drive lines (e.g., not sharing drive or sense lines with another row or column). In some embodiments, sharing drive and/or sense lines can reduce the area requirements of touch screen system 600, while having dedicated drive and/or sense lines can increase flexibility in driving and/or sensing touch panel 601. Thus, as described above, touch screen system 600 allows any of the row or column electrodes to be configured as drive electrodes or sense electrodes.
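
One way to picture the multiplexer configuration described above is the sketch below, which models each bi-directional 2:1 multiplexer as a one-bit select that couples its row or column either to the drive circuitry or to the sense circuitry. The names and helper function are illustrative assumptions, not the touch controller's actual register interface.

```python
from enum import Enum
from typing import List, Set, Tuple

class MuxSelect(Enum):
    DRIVE = 0   # select the input port coupled to drive circuitry (e.g., port 609-1)
    SENSE = 1   # select the input port coupled to sense circuitry (e.g., port 609-2)

def configure_panel(num_rows: int, num_cols: int,
                    sensed_rows: Set[int], sensed_cols: Set[int]
                    ) -> Tuple[List[MuxSelect], List[MuxSelect]]:
    """Return (row_mux, col_mux) settings: sensed electrodes are coupled to the
    sense circuitry and all other electrodes are coupled to the drive circuitry."""
    row_mux = [MuxSelect.SENSE if r in sensed_rows else MuxSelect.DRIVE
               for r in range(num_rows)]
    col_mux = [MuxSelect.SENSE if c in sensed_cols else MuxSelect.DRIVE
               for c in range(num_cols)]
    return row_mux, col_mux

# Example: drive every column, sense rows 604-2 and 604-4, and drive rows 604-1
# and 604-3 (similar to the first scan step of FIGS. 7A-7D).
row_mux, col_mux = configure_panel(4, 4, sensed_rows={1, 3}, sensed_cols=set())
```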



FIGS. 7A-7D illustrate a method of operating touch panel 700 according to examples of the disclosure. In FIGS. 7A-7D, touch panel 700 is configured in a mutual capacitance sense mode. As described above, when detecting touch activity, the electromagnetic field generated at the intersection of a drive and sense electrode can be limited to the volume immediately around the respective intersection. As a result, the ability of conventional touch panels to reliably sense hovering objects can be limited. Thus, FIGS. 7A-7D illustrate a method of simultaneously driving multiple rows and/or columns to increase the electromagnetic field penetration to improve detection of hovering objects. FIGS. 7A-7D illustrate four sequential scan steps to generate a full touch image.


In FIG. 7A, touch panel 700 includes row electrodes 702-1, 702-2, 702-3, and 702-4, and column electrodes 704-1, 704-2, 704-3, and 704-4. In FIG. 7A, the row and column electrodes are illustrated as rectangular rows and columns for simplicity, but it is understood that the rows and columns can each be either a single element as shown, or multiple elements that are electrically coupled together (such as shown in FIG. 6). Furthermore, the row electrodes can be coupled along a horizontal direction (e.g., each electrode in a row can be coupled to the next adjacent electrode) and the column electrodes can be coupled together along a vertical direction (e.g., each electrode in a column can be coupled to the next adjacent electrode), as shown in FIG. 7A. Thus, touch panel 700 can be similar to touch panel 601 described above with respect to FIG. 6.


In FIG. 7A, row 702-1 is coupled to drive/sense node 706-1, row 702-2 is coupled to drive/sense node 706-2, row 702-3 is coupled to drive/sense node 706-3, and row 702-4 is coupled to drive/sense node 706-4. Column 704-1 is coupled to drive/sense node 708-1, column 704-2 is coupled to drive/sense node 708-2, column 704-3 is coupled to drive/sense node 708-3, and column 704-4 is coupled to drive/sense node 708-4. In some examples, the drive/sense nodes shown in FIG. 7A are a simplified illustration of the multiplexer circuitry described above with respect to FIG. 6. Thus, if a respective drive/sense node is described herein as being configured to be driven or sensed, it is understood that the corresponding multiplexers for the respective row or column are set to select the respective ports to enable the respective row or column to be driven or sensed (e.g., coupled to drive or sense circuitry), respectively.



FIG. 7A illustrates a first scan step of simultaneously driving multiple rows and/or columns to increase the electromagnetic field penetration to improve detection of hovering objects. In FIG. 7A, rows 702-1 and 702-3 and columns 704-1, 704-2, 704-3, and 704-4 are driven by drive/sense nodes 706-1, 706-3, 708-1, 708-2, 708-3, and 708-4, respectively, while rows 702-2 and 702-4 are sensed by drive/sense nodes 706-2 and 706-4, respectively. Thus, in the first scan step, all columns of touch electrodes are driven while every other row of touch electrodes (e.g., non-adjacent rows) is driven and the remaining rows are sensed. As a result, for each row of electrodes that is being sensed, two adjacent rows of electrodes are being simultaneously driven and the column electrodes are being driven. Thus, instead of only having a single drive element for each sense element, each sense element has multiple drive elements. In this way, the electromagnetic field penetration is increased due to the number of elements being driven, thus allowing an object hovering over touch panel 700 to be detected. In some embodiments, performing the above-described first scan step provides the system with a partial touch image (e.g., every other row, such as the odd rows).



FIG. 7B illustrates a second scan step of simultaneously driving multiple rows and/or columns to increase the electromagnetic field penetration to improve detection of hovering objects. In FIG. 7B, rows 702-2 and 702-4 and columns 704-1, 704-2, 704-3, and 704-4 are driven by drive/sense nodes 706-2, 706-4, 708-1, 708-2, 708-3, and 708-4, respectively, while rows 702-1 and 702-3 are sensed by drive/sense nodes 706-1 and 706-3, respectively. Thus, in the second scan step, all columns of touch electrodes remain driven while the rows that were driven in the first scan step are now sensed and the rows that were sensed are now driven. Thus, the second scan step senses the rows of touch electrodes that were not sensed in the first scan step. In this way, the partial touch image from the first scan step (e.g., the odd rows) and the partial touch image from the second scan step (e.g., the even rows) can be combined to generate a touch image in which every row has been scanned.



FIGS. 7C-7D illustrate a third and fourth scan step in which the drive/sense scheme of the rows and columns is swapped. FIG. 7C illustrates a third scan step of simultaneously driving multiple rows and/or columns to increase the electromagnetic field penetration to improve detection of hovering objects. In FIG. 7C, rows 702-1, 702-2, 702-3, and 702-4 are all driven (by drive/sense nodes 706-1, 706-2, 706-3, and 706-4, respectively), while columns 704-2 and 704-4 are sensed (by drive/sense nodes 708-2 and 708-4, respectively) and columns 704-1 and 704-3 are driven (by drive/sense nodes 708-1 and 708-3, respectively). Thus, similarly to the first scan step described above with respect to FIG. 7A, every row is driven while every other column (e.g., non-adjacent columns) is driven and the remaining columns are sensed. In some embodiments, performing the above-described third scan step provides the system with a partial touch image (e.g., the odd columns).



FIG. 7D illustrates a fourth scan step of simultaneously driving multiple rows and/or columns to increase the electromagnetic field penetration to improve detection of hovering objects. In FIG. 7D, rows 702-1, 702-2, 702-3, and 702-4 are all driven (by drive/sense nodes 706-1, 706-2, 706-3, and 706-4, respectively), while columns 704-1 and 704-3 are sensed (by drive/sense nodes 708-1 and 708-3, respectively) and columns 704-2 and 704-4 are driven (by drive/sense nodes 708-2 and 708-4, respectively). In some embodiments, performing the above-described fourth scan step provides the system with a partial touch image (e.g., the even columns). In this way, the partial touch image from the third scan step (e.g., the odd columns) and the partial touch image from the fourth scan step (e.g., the even columns) can be combined to generate a touch image in which every column has been scanned. In some embodiments, combining the partial touch images of all scan steps generates a full touch image in which every row and column is scanned. Performing the above four scan steps provides a full resolution touch image, and the increased number of drive elements increases the sensitivity of the touch electrodes (e.g., improving the signal to noise ratio), thereby enabling the touch panel to better detect touch and hover activities.
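
The four scan steps of FIGS. 7A-7D can be summarized as the configuration table in the sketch below, which records which rows and columns are driven or sensed in each step and collects one hover signal per sensed line; the two resulting profiles together cover every row and every column. The sense_line() callable and the idea of one scalar reading per sensed line are simplifying assumptions for illustration, not the patent's implementation.

```python
from typing import Callable, Dict, List, Tuple

N_ROWS, N_COLS = 4, 4

# (driven rows, sensed rows, driven columns, sensed columns) for each scan step,
# using 0-based indices (0 corresponds to row 702-1 / column 704-1).
SCAN_STEPS: List[Tuple[set, set, set, set]] = [
    ({0, 2}, {1, 3}, set(range(N_COLS)), set()),   # step 1 (FIG. 7A)
    ({1, 3}, {0, 2}, set(range(N_COLS)), set()),   # step 2 (FIG. 7B)
    (set(range(N_ROWS)), set(), {0, 2}, {1, 3}),   # step 3 (FIG. 7C)
    (set(range(N_ROWS)), set(), {1, 3}, {0, 2}),   # step 4 (FIG. 7D)
]

def hover_scan(sense_line: Callable[[str, int], float]) -> Dict[str, List[float]]:
    """Run the four scan steps and collect one hover signal per sensed row/column."""
    row_signal = [0.0] * N_ROWS
    col_signal = [0.0] * N_COLS
    for driven_rows, sensed_rows, driven_cols, sensed_cols in SCAN_STEPS:
        # The multiplexers would couple driven_rows/driven_cols to Vstim and the
        # sensed lines to the receive circuitry here (see FIG. 6).
        for r in sensed_rows:
            row_signal[r] = sense_line("row", r)
        for c in sensed_cols:
            col_signal[c] = sense_line("col", c)
    # After the four steps every row and every column has been sensed once; the
    # two profiles together form the combined (merged partial images) result.
    return {"rows": row_signal, "cols": col_signal}
```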


In some examples, instead of driving and sensing every other row and/or column of touch electrodes as described above with respect to FIGS. 7A-7D, a single row or column can be sensed while all other rows and columns can be driven. In some embodiments, driving every row and column except for one significantly increases the electromagnetic field penetration by maximizing the number of drive elements. In some embodiments, to achieve a full touch image, multiple scans are required such that each row and column is sensed (e.g., for a total of n+m scans, where n is the number of rows and m is the number of columns).
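
A brief sketch of the alternative sequencing just described, under the assumption that exactly one row or column is sensed per step while every other row and column is driven, giving n+m scan steps in total:

```python
def single_sense_steps(n_rows: int, n_cols: int):
    """Yield (sensed line, driven rows, driven columns) for each of the n + m steps."""
    for r in range(n_rows):
        yield (("row", r),
               [x for x in range(n_rows) if x != r],   # every other row driven
               list(range(n_cols)))                    # every column driven
    for c in range(n_cols):
        yield (("col", c),
               list(range(n_rows)),                    # every row driven
               [x for x in range(n_cols) if x != c])   # every other column driven

assert sum(1 for _ in single_sense_steps(4, 4)) == 4 + 4  # n + m scan steps
```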


It is understood that a full resolution touch image may not be necessary and, in some examples, scanning every other row or column, or scanning every third row or column, can achieve a touch image with a sufficiently high resolution to determine whether there is touch or hover activity. Similarly, it is understood that the every-other-row and every-other-column scheme discussed above can be modified to sense one element out of every three elements, or one out of every four, one out of every five, etc.


In some examples, as discussed above, rather than using a single-stimulation mutual capacitance scan (e.g., each driven row or column is driven by the same signal or signals that have the same phase and/or amplitude), the row-column touch sensor panel can be stimulated using a multi-stimulation (“multi-stim”) mutual capacitance scan. As discussed above, in a multi-stim scan, multiple drive lines (e.g., row electrodes or column electrodes) can be simultaneously stimulated with different stimulation signals for multiple stimulation steps, and the sense signals generated at one or more sense lines (e.g., the undriven row or column electrodes) in response to the multiple stimulation steps can be processed to determine the presence and/or amount of touch for each touch node in the touch sensor panel (corresponding to the multiple drive lines).


In some embodiments, because multi-stim scans inherently drive multiple rows or columns simultaneously, multi-stim scan techniques can be used in conjunction with the multiple drive technique discussed above with respect to FIGS. 7A-7D to improve the detection sensitivity of the touch panel. Table 3 in FIG. 11 illustrates an example multi-stim scan sequence in which columns of electrodes are driven by a drive signal that is in-phase (Vstim+, 0° phase) or out-of-phase (Vstim−, 180° phase). In the example illustrated in Table 3, the touch panel has ten columns of touch node electrodes and the entirety of each column is driven by an in-phase signal, an out-of-phase signal, or no signal (e.g., 0 voltage, reference voltage, ground voltage, guard voltage, etc.). In some embodiments, the row electrodes are sensed while the columns are driven as shown in Table 3.


As shown in Table 3 of FIG. 11, a touch panel can be divided evenly among a number of groups (e.g., “banks”) and the columns within a group can be “ganged” (e.g., coupled together such that they are driven by the same drive signal). As shown, a touch panel with ten column electrodes can be divided into three groups of three columns each, plus a fourth overflow group. In Table 3, each group is shaded (or not shaded, as the case may be) for ease of illustration. In some examples, each column electrode in each group is driven while the overflow group is not driven (e.g., 0 voltage, reference voltage, ground voltage, guard voltage, etc.). Because the touch panel is divided into three groups, each set includes three scan steps (e.g., equal to the number of groups). In some examples, during the first scan step, the first group is driven with Vstim− (e.g., an out-of-phase drive signal) while the second and third groups are driven with Vstim+ (e.g., an in-phase drive signal). During the second scan step, the first and third groups are driven with Vstim+ while the second group is driven with Vstim−. During the third scan step, the first and second groups are driven with Vstim+ while the third group is driven with Vstim−. Thus, each group is driven by Vstim− exactly once during the first set of scan steps.


During the second set of scan steps (e.g., scan steps 4-6), the groups of electrodes are shifted by one. For example, the first group is now composed of columns 2, 3, and 4; the second group is now composed of columns 5, 6, and 7; the third group is now composed of columns 8, 9, and 10; and the overflow group is composed of column 1. During the second set of scan steps, the groups are subject to the same sequence of drive signals as during the first set of scan steps (e.g., each group is driven by Vstim− once). As shown in Table 3, the groups “wrap” such that the overflow group shifted from including column 10 to including column 1.


During the third set of scan steps (e.g., scan steps 7-9), the groups of electrodes are shifted by one again. For example, the first group is now composed of columns 3, 4, and 5; the second group is now composed of columns 6, 7, and 8; the third group is now composed of columns 9, 10, and 1; and the overflow group is composed of column 2. During the third set of scan steps, the groups are subject to the same sequence of drive signals as during the first set of scan steps (e.g., each group is driven by Vstim− once). As shown in Table 3, the groups “wrap” such that the third group shifted from including columns 8-10 to including columns 9, 10, and 1, and the overflow group shifted from including column 1 to including column 2.


In the fourth set of scan steps (e.g., scan step 10), a common mode scan is performed in which each column of touch electrodes is driven by the Vstim+ signal. Thus, as described above, within each set, three scan steps are performed (e.g., one scan step per group), and the grouping is shifted through three positions, which together with the common mode scan results in four sets of scan steps (e.g., ten scan steps in total). As shown, if the number of columns does not divide evenly among the groups, then an overflow group of columns can be used which is not driven by either Vstim+ or Vstim−.
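The following sketch generates a schedule with the same structure as the one described above for Table 3 (ten columns, banks of three, a shifting overflow column, and a final common mode step); it is an illustrative reconstruction under those assumptions, not a reproduction of Table 3 itself.

# Illustrative sketch of a bank-shifted multi-stim schedule: banks of 3 columns,
# one bank driven out of phase per step, the leftover (overflow) column undriven,
# and a final common-mode step in which every column is driven in phase.
def multistim_schedule(n_cols=10, bank_size=3, n_banks=3):
    cols = list(range(1, n_cols + 1))
    steps = []
    for shift in range(n_banks):                       # three bank positions
        shifted = cols[shift:] + cols[:shift]          # groups "wrap" around
        banks = [shifted[i * bank_size:(i + 1) * bank_size] for i in range(n_banks)]
        overflow = shifted[n_banks * bank_size:]       # not driven during this set
        for neg_bank in range(n_banks):                # each bank gets Vstim- once per set
            drive = {}
            for b, bank in enumerate(banks):
                for c in bank:
                    drive[c] = "Vstim-" if b == neg_bank else "Vstim+"
            for c in overflow:
                drive[c] = "none"
            steps.append(drive)
    steps.append({c: "Vstim+" for c in cols})          # common-mode step (step 10)
    return steps

schedule = multistim_schedule()
print(len(schedule))   # 10 scan steps
print(schedule[0])     # columns 1-3 at Vstim-, 4-9 at Vstim+, column 10 undriven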


After performing the four sets of scan steps described above, the sensed signal generated at the row electrodes can be modeled using equation (4):

Q̃=B̃·C̃  (4)

where Q̃ is a vector with a length of four (e.g., the number of sets of scans) of the sensed signals from the scan steps of the multi-stim scan, B̃ is a 4×4 matrix of polarities of the stimulation voltage (e.g., a stimulation matrix) indexed by column and step (e.g., as shown in Table 3 in FIG. 11), and C̃ is a vector with a length of four (e.g., the number of sets of scans) of the capacitance values at the touch nodes of the sense line. The capacitance value at each touch node of the sense line can then be decoded using equation (5):

C̃=B̃⁻¹·Q̃  (5)

where B̃⁻¹ represents the inverse of the stimulation matrix. Repeating the measurements and calculations above for each sense line can determine a capacitance signal for each touch node of the touch sensor panel scanned during the multi-stim scan. Thus, as shown, the equations for decoding the multi-stim scan described herein are similar to those described above with respect to equations 1-3. In this way, the system is able to determine the amount of capacitance at each touch node while simultaneously being able to drive multiple adjacent columns with the same drive signal (e.g., each column in a group), thus increasing the penetration of the generated electromagnetic fields and improving the detection sensitivity of the touch panel. It is understood that the stimulation matrix illustrated in Table 3 of FIG. 11 is merely exemplary and any sized matrix can be used (e.g., any number of rows and/or any number of columns).
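A minimal numeric sketch of the decode in equations (4) and (5) is shown below; the 4×4 polarity matrix and the capacitance values are made up for illustration (the actual stimulation matrix follows Table 3 of FIG. 11).

# Numeric sketch of the decode in equations (4)-(5): Q = B·C, so C = B^{-1}·Q
# per sense line. The stimulation matrix B below is hypothetical.
import numpy as np

B = np.array([[-1,  1,  1, 1],     # one row of polarities per scan set (illustrative)
              [ 1, -1,  1, 1],
              [ 1,  1, -1, 1],
              [ 1,  1,  1, 1]], dtype=float)

C_true = np.array([0.5, 0.2, 0.0, 0.1])   # made-up touch-node capacitance signals
Q = B @ C_true                            # what the sense channel would measure

C_decoded = np.linalg.solve(B, Q)         # equivalent to B^{-1}·Q
print(np.allclose(C_decoded, C_true))     # True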



FIGS. 8A-8B illustrate exemplary touch sensor circuits according to examples of the disclosure. As discussed above, a touch sensor circuit is able to sense the total capacitance between one or more drive electrodes and a sense electrode. In some examples, the total capacitance comprises a baseline component and an interaction component. The baseline component is the amount of capacitance when the touch panel experiences no touch or hover activity. In some examples, the baseline capacitance is the amount of capacitance inherent in the design of the system. In some embodiments, the baseline capacitance can drift over time as changes in the environment cause the amount of baseline capacitance to change (e.g., temperature, humidity, noise from components of the device, noise in the environment, etc.). In some examples, the interaction component is the change in the capacitance due to a user's interaction with the touch panel (e.g., touching the touch sensor panel or hovering over the touch sensor panel). In some embodiments, the interaction component is dynamic and changes based on how close or far the object is from the touch panel, how well grounded the object is, the type of material of the object, and other such factors that affect the conductivity of the object. FIGS. 8A-8B illustrate exemplary systems and methods for injecting an offset signal to offset the effects of the baseline capacitance, thereby improving the signal to noise ratio and sensitivity of the touch sensor panel.



FIG. 8A illustrates an exemplary touch sensor circuit 800 according to examples of the disclosure. In some examples, touch sensor circuit 800 receives a touch sense signal from one or more touch electrodes on a touch panel, amplifies and/or buffers the touch sense signal, and transmits the touch sense signal for further processing, such as by touch processor 202. In some examples, components of touch sensor circuit 800 are integrated with touch controller 206. In some embodiments, touch sensor circuit 800 includes touch panel 801. In some examples, touch panel 801 is a touch sensitive surface similar to touch screen 220, touch screen 320, touch screen 420, and/or touch panel 500.


As shown in FIG. 8A, touch panel 801 is a circuit model of a touch sensor panel (e.g., touch screen 220, touch screen 320, touch screen 420, and/or touch panel 500). It is understood that the components in touch panel 801 are a simplified representation of a touch panel. In some examples, touch panel 801 includes a stimulation source 802 that generates stimulation signal VSTM. In some examples, stimulation source 802 provides a stimulation signal (e.g., drive signal) to one or more touch electrodes on a touch panel. In some examples, the one or more touch electrodes on the touch panel receive the stimulation signal from stimulation source 802. As shown in FIG. 8A, the one or more touch electrodes can be modeled as capacitor 804 having a capacitance CSIG (e.g., the baseline capacitance between one or more drive electrodes and one or more sense electrodes in a mutual capacitance sensing scheme and the capacitance caused by an object interacting with the touch panel). In some examples, touch panel 801 outputs a touch signal that is based on the total capacitance sensed by capacitor 804. In some embodiments, the touch panel outputs a touch sense current, IIN, which includes a baseline current component, IBASE, and a touch current component, ITOUCH. In some examples, the touch current component varies (e.g., increases or decreases) the total output current and can be modeled as the delta change in the current due to the touch or hover activity.


In FIG. 8A, touch sensor circuit 800 includes amplifier 806. In some embodiments, touch sensor circuit 800 includes feedback capacitor 808 and/or feedback resistor 810. As shown in FIG. 8A, amplifier 806 is an operational amplifier and/or a differential amplifier including an inverting input port, a noninverting input port, and an output port. Amplifier 806 amplifies the difference between the signal on the inverting input port and the signal on the noninverting input port and outputs the amplified difference to the output port. For example, if the noninverting input port of amplifier 806 is coupled to a reference voltage, such as system ground, then amplifier 806 amplifies the signal on the inverting input port with respect to ground. In FIG. 8A, the noninverting input port of amplifier 806 is coupled to a bias voltage VBIAS and the inverting input port of amplifier 806 is coupled to the output of touch panel 801. Thus, in some embodiments, amplifier 806 receives and amplifies a touch signal received from touch panel 801. In some embodiments, amplifier 806 outputs an amplified touch signal, which is forwarded for further processing (e.g., such as by touch processor 202, to determine whether there is any touch or hover activity, the magnitude of the activity, the location of the activity, or any other relevant characteristics).


In some embodiments, feedback capacitor 808 and feedback resistor 810 are coupled in parallel, forming a feedback network coupled between the output port of amplifier 806 and the inverting input port of amplifier 806. In some examples, the values of feedback capacitor 808 and feedback resistor 810 control the amplification characteristics of amplifier 806 (e.g., gain, frequency response, etc.). In some examples, feedback capacitor 808 and feedback resistor 810 are fixed components. In some examples, feedback capacitor 808 and feedback resistor 810 are variable components that can be adjusted at manufacture time (e.g., calibrated static components) or adjusted during runtime (e.g., dynamic components).


In some examples, touch sensor circuit 800 includes offset generator 812 and offset resistor 814. In some embodiments, offset resistor 814 is a variable resistor. In some embodiments, offset resistor 814 can be used to match the output impedance of offset generator 812 with the input impedance of amplifier 806. In some embodiments, offset generator 812 applies a signal (e.g., a voltage) that, when applied to offset resistor 814, generates an offset current that is equal and opposite to the baseline touch signal from touch panel 801 (e.g., IBASE, the signal generated by touch panel 801 due to stimulation source 802 when there is no touch or hover activity). In some embodiments, injecting a signal that is equal and opposite to the baseline touch signal from touch panel 801 into the inverting input port of amplifier 806 offsets or "cancels out" the touch signal resulting from the baseline capacitance, leaving the touch signal resulting from a touch or hover interaction (e.g., ITOUCH). In some embodiments, the signal generated by offset generator 812 is a large percentage of the baseline touch signal (e.g., a percentage greater than or equal to 50% of the baseline touch signal) and is able to cancel out that percentage of the baseline touch signal.


In some embodiments, cancelling a large percentage of the baseline touch signal decreases the proportion of the overall touch signal due to the baseline capacitance and thus increases the proportion of the overall touch signal due to the touch or hover activity. In some embodiments, increasing the ratio of the signal due to the touch or hover activity to the overall touch signal increases the system's sensitivity and ability to identify touch and hover events. In some examples, offset generator 812 decreases touch signal drift. In some examples, the baseline touch signal (e.g., the touch signal in the absence of touch) induces a touch baseline drift component, which is optionally equivalent to the product of the temperature coefficient of the touch sensing circuitry, the temperature of the touch sensing circuitry, and the baseline touch signal. In some examples, by performing offset compensation (e.g., by cancelling the baseline touch component via offset generator 812), the associated touch baseline drift component, and thus the associated touch baseline drift, can be eliminated or reduced.


In some embodiments, the circuit generated by the combination of capacitor 804, amplifier 806, feedback capacitor 808, and feedback resistor 810 forms a differentiator that performs a derivative function on the stimulation signal. In some embodiments (e.g., such as those illustrated in FIGS. 8A-8B), the amplifier's feedback network impedance is dominated by RFB and, therefore, the amplifier is configured in transimpedance mode. Thus, in some embodiments, the signal generated by offset generator 812 can have a waveform that is the same as or similar to the derivative of the waveform of the stimulation signal generated by stimulation source 802 (e.g., a square wave when the stimulation signal is a triangular wave).


For example, FIG. 9A illustrates signal graphs of an example driving scheme for exemplary touch sensor circuit 800 according to examples of the disclosure. In FIG. 9A, stimulation source 802 generates a triangular wave stimulation signal (e.g., signal 902). In some embodiments, driving capacitor 804 with a triangular wave causes a square wave output current profile, as shown by signal 904. In some embodiments, the current output of capacitor 804 is a square wave due to the touch sensor circuit forming a differentiator. In some examples, the output current can be modeled with equation (6):

IC(t)=CSIG·dV(t)/dt  (6)
where V(t) is the stimulation signal (e.g., signal 902) generated by stimulation source 802, CSIG is the capacitance of capacitor 804, and IC(t) is the output current (e.g., IIN). In some embodiments, triangular stimulation signal V(t) can have the form:

V(t)=Vstm0·2·FSTM·t  (7)

during time period t=0 to t=1/(2·FSTM),
where FSTM is the frequency of the stimulation signal and Vstm0 is the amplitude of stimulation signal Vstm. In some embodiments, triangular stimulation signal V(t) can have the form:

V(t)=2·Vstm0·(1−FSTM·t)  (8)

during time period t=1/(2·FSTM) to t=1/FSTM.
Thus, in some embodiments, during the first time period (e.g., t=0 to t=1/(2·FSTM)),
the input current into the amplifier can be derived from equation (7) (e.g., by differentiating equation (7)) to yield equation (9):

I(t)=CSIG·Vstm0·2·FSTM  (9)

and during the second time period, the input current into the amplifier can be derived from equation (8) (e.g., by differentiating equation (8)) to yield equation (10):

I(t)=−CSIG·Vstm0·2·FSTM  (10)


In some embodiments, when the amplifier is configured in a TIA mode (e.g., if RFB dominates the feedback impedance), during the first time period (e.g., t=0 to t=1/(2·FSTM)),
the output of the amplifier can be modeled by equation (11):

VOUT(t)=RFB·CSIG·Vstm0·2·FSTM  (11)

and during the second time period the output of the amplifier can be modeled by equation (12):

VOUT(t)=−RFB·CSIG·Vstm0·2·FSTM  (12)


As shown, providing a triangular stimulation signal results in a square wave current signal that is based on the slope of the triangular signal, as described above. In some examples, signal 904 includes a baseline current component, IBASE, and a touch current component, ITOUCH. In some examples, the touch current component can be significantly smaller than the baseline current component (e.g., small perturbations to signal 904 which are not perceptible due to the scale of the graph).
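The following short numeric check (with illustrative component values that are not taken from the disclosure) confirms that differentiating the triangular stimulation of equations (7) and (8) per equation (6) produces the square wave current levels of equations (9) and (10).

# Sketch under assumed values: a triangular drive into CSIG yields a square-wave
# current of magnitude CSIG·Vstm0·2·FSTM, per equations (6)-(10).
import numpy as np

CSIG, Vstm0, FSTM = 2e-12, 1.0, 100e3           # illustrative values: 2 pF, 1 V, 100 kHz
t = np.linspace(0.0, 2.0 / FSTM, 4001)          # two stimulation periods
phase = (t * FSTM) % 1.0
# Triangular wave: rising over the first half period, falling over the second.
V = np.where(phase < 0.5, 2.0 * Vstm0 * phase, 2.0 * Vstm0 * (1.0 - phase))
I = CSIG * np.gradient(V, t)                    # I_C(t) = CSIG * dV/dt, equation (6)

print(np.max(I), CSIG * Vstm0 * 2.0 * FSTM)     # both about +4.0e-7 A
print(np.min(I), -CSIG * Vstm0 * 2.0 * FSTM)    # both about -4.0e-7 A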


As shown in FIG. 9A, because touch sensor circuit 800 uses an offset resistor 814 with offset generator 812, touch sensor circuit 800 does not form a differentiator with offset generator 812. Thus, generating a square wave voltage signal (e.g., signal 906) causes a square wave current signal (e.g., signal 908) to be injected into amplifier 806. In some embodiments, offset resistor 814 can be selected (e.g., adjusted, tuned, etc.) to improve the cancellation (e.g., cancellation of the baseline touch signal as described above) based on equation (13):

|CSIG·Vstm0·2·FSTM|=Voff0/(2·ROFF)  (13)
where Voff0 is the amplitude of the offset signal (e.g., generated by offset generator 812) and ROFF is the resistance of offset resistor 814.
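As a worked example of equation (13), the sketch below solves for ROFF given assumed values of CSIG, Vstm0, FSTM, and Voff0; all numbers are illustrative.

# Choosing the offset resistor from equation (13): the square-wave offset
# current Voff0/(2·ROFF) is matched to the baseline current CSIG·Vstm0·2·FSTM.
CSIG, Vstm0, FSTM = 2e-12, 1.0, 100e3      # assumed baseline capacitance and drive
Voff0 = 0.8                                # assumed offset-generator amplitude (V)

I_base = abs(CSIG * Vstm0 * 2.0 * FSTM)    # baseline current to cancel (A)
R_off = Voff0 / (2.0 * I_base)             # solve equation (13) for ROFF
print(R_off)                               # 1.0e6 ohms for these example values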


In some examples, because signal 908 (e.g., the current injected by offset generator 812) has a similar or the same signal profile as signal 904 (e.g., the current output from touch panel 801, the current flowing through capacitor 804) and has a similar or the same amplitude as the baseline current component (IBASE) of signal 904, the resulting current flowing into the inverting input of amplifier 806 includes the signal due to touch or hover activity (ITOUCH) and little or none of the baseline current component, IBASE, which has been cancelled out or reduced by signal 908. Thus, the resulting output signal of amplifier 806 is an amplified signal based on ITOUCH, as illustrated by signal 910. In some embodiments, as illustrated by signal 910, the output signal has peaks and valleys (e.g., due to the sudden change in input current at the transitions and/or due to any phase misalignment between signals 904 and 908). In some embodiments, the output signal can be sampled on each cycle after the signal has settled to a steady state (e.g., after the peaks and valleys). In some embodiments, the output signal can be sampled on each cycle before the next half cycle begins (e.g., before the next peak or valley).



FIG. 8B illustrates an exemplary touch sensor circuit 820 according to examples of the disclosure. Touch sensor circuit 820 is similar to touch sensor circuit 800 except that, instead of an offset resistor coupling offset generator 812 to the inverting input of amplifier 806, touch sensor circuit 820 includes an offset capacitor 834 coupling offset generator 832 to the inverting input of amplifier 826. In some embodiments, offset capacitor 834 is a variable capacitor. In some embodiments, the use of offset capacitor 834 instead of an offset resistor causes touch sensor circuit 820 to form a differentiator that performs a derivative function on the offset signal. Thus, in some embodiments, in order to generate a current of equal but opposite magnitude as the current through capacitor 824, the signal generated by offset generator 832 can have a waveform that is the same as or similar to the waveform of the stimulation signal generated by stimulation source 822 (e.g., not the derivative of the stimulation signal waveform). For example, if the stimulation signal generated by stimulation source 822 is a triangular wave, then the waveform generated by offset generator 832 is also a triangular wave. In some embodiments, offset capacitor 834 can be set to have the same baseline capacitance as capacitor 824 and offset generator 832 can be set to generate the same signal (but opposite in polarity) as signal generator 822. In some embodiments, instead of using a separate offset generator 832, touch sensor circuit 820 can include an inverter that receives the stimulation signal from stimulation source 822, inverts the stimulation signal, and drives offset capacitor 834 with the inverted version of the stimulation signal. In some embodiments, using offset capacitor 834 (e.g., as opposed to offset resistor 814) can be advantageous because on-chip capacitors (e.g., integrated capacitors) have lower temperature coefficients than on-chip resistors.
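The following sketch illustrates, under assumed component values, why the offset-capacitor approach cancels the baseline: when COFF matches the baseline CSIG and is driven with an inverted copy of the triangular stimulation, the two differentiated currents cancel and only the touch-induced portion remains.

# Numeric sketch (assumed values only) of baseline cancellation with an offset
# capacitor driven by the inverted triangular stimulation.
import numpy as np

FSTM, Vstm0 = 100e3, 1.0
C_base, C_touch, C_off = 2e-12, 0.1e-12, 2e-12     # COFF set equal to baseline CSIG

t = np.linspace(0.0, 1.0 / FSTM, 2001)
phase = (t * FSTM) % 1.0
V = np.where(phase < 0.5, 2.0 * Vstm0 * phase, 2.0 * Vstm0 * (1.0 - phase))

i_panel = (C_base + C_touch) * np.gradient(V, t)   # current through capacitor 824
i_offset = C_off * np.gradient(-V, t)              # inverted stimulation into offset capacitor 834
i_amp = i_panel + i_offset                         # current summed at the inverting input

print(np.max(np.abs(i_amp)))                       # about C_touch*2*Vstm0*FSTM: only the touch term survives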



FIG. 9B illustrates signal graphs of an example driving scheme for exemplary touch sensor circuit 820 according to examples of the disclosure. In FIG. 9B, stimulation source 822 generates a triangular wave stimulation signal (e.g., signal 922). In some embodiments, driving capacitor 824 with a triangular wave causes a square wave output current profile, as shown by signal 924. In some embodiments, the current output of capacitor 824 is a square wave due to the touch sensor circuit forming a differentiator. In some examples, the output current can be modeled with equation (6) above.


In some examples, signal 924 includes a baseline current component, IBASE, and a touch current component, ITOUCH. In some examples, the touch current component can be significantly smaller than the baseline current component.


As shown in FIG. 9B, because touch sensor circuit 820 uses an offset capacitor 834 with offset generator 832, touch sensor circuit 820 forms a differentiator with offset generator 832. Thus, generating a triangle wave voltage signal (e.g., signal 926) causes a square wave current signal (e.g., signal 928) to be injected into amplifier 826.


In some examples, because signal 928 (e.g., the current injected by offset generator 832) has a similar or the same signal profile as signal 924 (e.g., the current output from the touch panel, the current flowing through capacitor 824) and has a similar or the same amplitude as the baseline current component (IBASE) of signal 924, the resulting current flowing into the inverting input of amplifier 826 includes the signal due to touch or hover activity (ITOUCH) and little or none of the baseline current component, IBASE, which has been cancelled out or reduced by signal 928. Thus, the resulting output signal of amplifier 826 is an amplified signal based on ITOUCH, as illustrated by signal 930. In some embodiments, as illustrated by signal 930, the output signal has peaks and valleys (e.g., due to the sudden change in input current at the transitions, any phase misalignment between signals 924 and 928, and/or the finite panel bandwidth). In some embodiments, the output signal can be sampled on each cycle after the signal has settled to a steady state (e.g., after the peaks and valleys). In some embodiments, the output signal can be sampled on each cycle before the next half cycle begins (e.g., before the next peak or valley).



FIGS. 10A-10C illustrate a method of demodulating a touch sense signal. In some examples, the output signal of a touch sensor circuit (e.g., such as the touch sensor circuits described above) is further post-processed before the signal is analyzed to determine whether or what type of touch activity occurred. In some examples, the post processing includes demodulating the touch sense signal. In some examples, the touch sense signal is converted to a digital signal before demodulation. In some examples, as will be described in further detail, demodulation includes multiplying the touch sense signal with one or more demodulation waveforms (e.g., predetermined waveforms that, when multiplied with the touch sense signal, perform one or more demodulation functions on the touch sense signal).



FIG. 10A illustrates an exemplary block diagram of a touch sensing system 1000. In some examples, touch sensing system 1000 includes touch electrode 1002 (e.g., such as touch node 422, touch electrode 604, touch electrode 606, row electrodes 702, and/or column electrodes 704 described above) that generates one or more touch sense signals indicative of touch activity detected on a touch panel. It is understood that touch electrode 1002 can represent a single touch electrode or a combination of multiple touch electrodes.


In FIG. 10A, touch electrode 1002 is coupled to touch amplifier 1004, which amplifies the output signal from touch electrode 1002 and optionally performs one or more preliminary filtering or post-processing steps (e.g., such as described above with respect to FIGS. 8A-8B). In some examples, touch sensing system 1000 includes an analog-to-digital converter 1006. Analog-to-digital converter 1006 can convert the analog output signal from amplifier 1004 into a digital signal representative of the amplified analog touch signal.


The touch sensing system 1000 in FIG. 10A includes demodulation block 1008. Demodulation block 1008 can perform one or more demodulation functions on the digitized touch signal. In some embodiments, demodulation block 1008 includes one or more multipliers that multiply the digitized touch signal with one or more demodulation waveforms. The demodulation waveforms can be selected and/or designed to perform particular demodulation functions. For example, demodulation block 1008 includes a square wave demodulator that filters the touch signal and generates a "squared" output signal (e.g., a signal in which the transitions are short and the steady state levels (the peaks and troughs) are relatively flat). Thus, a square wave demodulator can be used to clean up the touch signal into a form that is conducive to better sampling of the touch signal.


In FIG. 10A, demodulation block 1008 includes multiplier 1010, multiplier 1011, and integrator 1013. As illustrated in FIG. 10A, multiplier 1010 can multiply the touch signal with a square wave demodulation waveform, multiplier 1011 can multiply the touch signal with a window waveform, and integrator 1013 can perform an integrating function on the touch signal (e.g., integrate the touch signal over a threshold period and output a value representative of the integral of the touch signal over the threshold period). It is understood that although FIG. 10A illustrates demodulation block 1008 including multiplier 1010 and multiplier 1011, other multipliers can be additionally included in demodulation block 1008 for performing other functions. For example, demodulation block 1008 can include a multi-stimulation demodulator (e.g., a demodulator that decodes the multi-stimulation codes, such as those described above). It is also understood that certain functions can be combined. For example, a windowing function can be combined with the square wave demodulator.


Multiplier 1010 can multiply the touch signal with a gated square wave demodulation waveform (e.g., as opposed to an ungated square wave demodulation waveform), which performs a gated square wave demodulation function on the touch signal. A gated square wave demodulation can perform a square wave demodulation function (e.g., “squaring” the touch signal as described above), and a gating function. Performing a gating function can include applying a square wave demodulation waveform with certain portions of the square wave demodulation waveform zeroed out (e.g., as will be illustrated below in FIG. 10B). In some examples, the zeroed portions of the demodulation waveform zeroes out the respective portions of the touch signal. Configuring the demodulation waveform such that the zeroed portion aligns with the transitions in the touch signal (e.g., the transitions of the carrier waveform generated by the stimulation voltage source, such as stimulation sources 802 and 822) can eliminate (e.g., zero out) or reduce the transition periods in which the touch signal has not yet settled.
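A minimal sketch of constructing such a gated square wave demodulation waveform is shown below (the sample counts and the one-third gate fraction are illustrative; this is not the disclosure's implementation).

# Build a gated square-wave demodulation waveform: the first fraction of each
# half cycle is zeroed, and the remainder multiplies the signal by +1 or -1.
import numpy as np

def gated_square_wave(samples_per_period, n_periods, gate_fraction=1.0 / 3.0):
    k = np.arange(samples_per_period * n_periods)
    pos = k % samples_per_period                      # sample position within one period
    half = samples_per_period // 2
    polarity = np.where(pos < half, 1.0, -1.0)        # +1 on the positive half, -1 on the negative half
    gate = ((pos % half) >= int(round(gate_fraction * half))).astype(float)
    return polarity * gate

w = gated_square_wave(samples_per_period=60, n_periods=1)
print(w[:25])   # samples 0-9 are gated to 0; samples 10-24 are +1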


In some examples, gating the touch signal (e.g., performing the gating function described above) provides one or more benefits. One benefit includes reducing the touch signal drift by eliminating and/or reducing the unsettled portions of the touch signal (e.g., before the touch signal has reached 80%, 85%, 90%, 95%, 99% of the peak value, and/or the first 10%, 20%, 30%, 33% of the pulse), which is potentially more susceptible to temperature drift. Another benefit includes optimizing and/or improving the interference rejection of the demodulator. For example, the frequency response of a square wave demodulator is equivalent to the fast Fourier transform (FFT) of the square wave demodulation waveform. The FFT of the square wave demodulation waveform has a passband at the fundamental frequency of the square wave demodulation waveform and at the odd harmonics (e.g., 3rd, 5th, etc.). In some examples, setting the duty cycle of the gating (e.g., the duty cycle of the square wave) to 33% (e.g., ⅓ of the waveform is zero) causes the third harmonic to be eliminated, thus reducing the amount of interference that is coupled through the demodulator at the frequency of the third harmonic and improving the interference rejection. It is understood that the duty cycle can be set to any value and the frequency response of the demodulator can be adjusted accordingly.
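The third-harmonic null can be checked numerically, as in the sketch below, which builds a gated square wave with the first third of each half cycle zeroed and inspects its spectrum; the sample rate and fundamental frequency are assumed values chosen only for illustration.

# Spectrum of a gated square wave (first third of each half cycle zeroed):
# the fundamental and 5th harmonic remain, the 3rd harmonic is suppressed.
import numpy as np

fs, f0, n_periods = 6_000_000, 100_000, 200        # assumed sample rate and fundamental
spp = fs // f0                                     # 60 samples per stimulation period
pos = np.arange(spp * n_periods) % spp             # sample position within each period
half = spp // 2
w = np.where(pos < half, 1.0, -1.0) * ((pos % half) >= half // 3)

spectrum = np.abs(np.fft.rfft(w)) / len(w)
freqs = np.fft.rfftfreq(len(w), d=1.0 / fs)

def mag_at(f_hz):
    return spectrum[np.argmin(np.abs(freqs - f_hz))]

print(mag_at(f0), mag_at(3 * f0), mag_at(5 * f0))
# fundamental and 5th harmonic remain; the 3rd harmonic is essentially zero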


In some examples, additionally or alternatively, multiplier 1011 multiplies the touch signal with a window waveform, which performs a windowing function on the touch signal. A window waveform generally tapers to zero outside of a predetermined window of time, peaks at the center of the waveform, and is symmetric before and after the peak of the waveform. Examples of suitable window waveforms include a Taylor window, a Gaussian window, a Chebyshev window (e.g., a Dolph-Chebyshev window), etc. Other suitable window functions are possible.


In some examples, a window function can be applied to optimize and/or improve the filter properties of the demodulator, and/or optionally to improve interference rejection. The filter properties that can be optimized and/or improved include the stopband attenuation, passband ripple, stopband ripple, stopband roll-off, filter bandwidth, and/or any other suitable filter properties. Applying a windowing function to the touch signal can apply a stopband filtering function to the touch signal. In some examples, the windowing function is able to reduce or eliminate noise at certain frequencies while maintaining the touch signal frequencies (e.g., the fundamental frequency). Thus, applying a windowed demodulation function can perform stopband attenuation. The windowing function can also be configured to attenuate frequencies between the harmonics of the touch signal (e.g., the first harmonic, the fifth harmonic, etc.).
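The sketch below combines the gated square wave with a window across the integration burst; a Gaussian window and the chosen width are assumptions used only to illustrate the windowing idea, not the specific window used by the disclosure.

# Windowed, gated square-wave demodulation waveform: the window tapers the ends
# of the burst, improving attenuation between the touch-signal harmonics.
import numpy as np

fs, f0, n_periods = 6_000_000, 100_000, 200        # assumed values
spp = fs // f0
n = spp * n_periods
pos = np.arange(n) % spp
half = spp // 2
gated = np.where(pos < half, 1.0, -1.0) * ((pos % half) >= half // 3)

sigma = 0.2 * n                                    # window width: an arbitrary illustrative choice
window = np.exp(-0.5 * ((np.arange(n) - n / 2) / sigma) ** 2)
demod = gated * window                             # windowed, gated demodulation waveform

spectrum = np.abs(np.fft.rfft(demod))
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
print(freqs[np.argmax(spectrum)])                  # peak response remains at the fundamental (100 kHz)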


In some examples, after performing the one or more demodulation steps, the demodulated touch signal is integrated by integrator 1013. Integrator 1013 can integrate the demodulated touch signal over a period of time and output a representative output value. In some embodiments, the output of integrator 1013 is forwarded to touch processor 1012 (e.g., touch processor 202) for processing (e.g., to determine whether there was touch activity and respond accordingly). It is understood that additional multipliers and/or integrators can be included in demodulation block 1008 for performing other post-processing functions on the touch signal.



FIG. 10B illustrates signal graphs of an exemplary demodulation module according to examples of the disclosure. In FIG. 10B, touch amplifier 1004 (optionally via analog-to-digital converter 1006) outputs signal 1022 corresponding to a touch signal. As shown, signal 1022 has a shark fin shape (e.g., a square wave with an extended rise and fall transition) with a rise and fall time based on the characteristics of the touch sensing circuit. As discussed above, signal 1022 transitions from negative to positive and positive to negative over a certain period of time. In some examples, while signal 1022 is transitioning, energy (e.g., electrical energy) is being transferred from touch amplifier 1004 to downstream circuitry. In some examples, as signal 1022 approaches and/or reaches its steady state, the majority of energy has been transferred to the downstream circuitry and signal 1022 begins to settle. Thus, as discussed above, a square wave demodulation waveform (e.g., such as described above in FIG. 10A, represented by signal 1024) can be configured to gate (e.g., zero out) signal 1022 during the initial transition period (e.g., the first 10%, 20%, 25%, 33%, or 40% of a pulse) and to not gate (e.g., pass through, using a multiplier of 1) as signal 1022 approaches and/or reaches its steady state level (e.g., 80%, 85%, 90%, 95%, or 99% of the steady state level).


As described above, signal 1024 has a value of zero during the initial ⅓ of the width of the positive pulse of signal 1022 and a value of 1 during the subsequent ⅔ of the width of the positive pulse of signal 1022. Thus, signal 1024 gates the initial ⅓ of the positive pulse of signal 1022 and passes through the subsequent ⅔ of the width of the positive pulse of signal 1022. On the negative pulse, signal 1024 is zero for the first third of the negative pulse and −1 during the subsequent ⅔ of the width of the negative pulse of signal 1022. Thus, signal 1024 gates the initial ⅓ of the negative pulse of signal 1022 and passes through the subsequent ⅔ of the width of the negative pulse of signal 1022.



FIG. 10C includes graph 1030 and graph 1040, which illustrate an exemplary windowed gated square wave demodulation waveform and an exemplary frequency domain graph of the touch signal multiplied by a windowed gated square wave demodulation waveform, respectively. Graph 1030 is a time domain graph (e.g., amplitude against time) that includes signal 1032, which represents an exemplary window waveform (e.g., such as described above in FIG. 10A). In some embodiments, signal 1032 is the waveform of a windowed gated square wave demodulator (e.g., combining the window waveform with the gated square wave demodulation waveform). As shown, signal 1032 is a square wave waveform that increases in amplitude from zero, reaches a peak amplitude, and then reduces in amplitude (optionally symmetrically) to zero. In some examples, signal 1032 has a Taylor window waveform. As described above, multiplying a touch signal (e.g., such as signal 1022) with signal 1032 can result in a signal whose frequency content is represented by signal 1042. As shown in FIG. 10C, graph 1040 is a frequency domain graph (e.g., magnitude against frequency) of the touch signal after being demodulated by a windowed gated square wave demodulation waveform (e.g., such as signal 1032). In some examples, signal 1042 has a peak at 100 kHz (e.g., the first harmonic, the fundamental frequency, of the touch signal) and at 500 kHz (e.g., the fifth harmonic of the touch signal). As shown, the third harmonic (e.g., at 300 kHz) has been eliminated (optionally reduced) because the demodulation waveform has a 33% duty cycle, which eliminates or reduces the third harmonic of the touch signal. Thus, as shown in FIG. 10C, applying a windowed gated square wave demodulation waveform (e.g., as opposed to one that is not windowed) applies a bandpass filtering effect in which 100 kHz and 500 kHz are passed through (or only minimally attenuated) while frequencies other than 100 kHz and 500 kHz are attenuated. It is understood that the windowing illustrated in signal 1032 is merely illustrative and can be modified to provide the desired transfer function according to the characteristics of the frequency spectrum of the touch signal. It is also understood that the reference to the first, third, and fifth harmonics at 100 kHz, 300 kHz, and 500 kHz is merely exemplary and any suitable frequency is possible.


It is understood that the demodulation techniques described here (e.g., the use of a square wave demodulator, the use of a gated square wave demodulator, and/or the use of a windowed gated square wave demodulator) can be combined with any of the techniques described above (e.g., the driving and sensing patterns described in FIGS. 6 and 7A-7D, the multi-stimulation technique described with respect to Table 3, and/or the baseline offset techniques described with respect to FIGS. 8A-8B and 9A-9B).


Accordingly, some examples of this disclosure describe a method. Additionally or alternatively, in some examples, the method is performed at a touch sensor panel having a first plurality of electrodes electrically coupled together along a first direction and a second plurality of electrodes electrically coupled together along a second direction, different from the first direction. Additionally or alternatively, in some embodiments, the method includes, during a first time period, driving a first set of electrodes of the first plurality of electrodes. Additionally or alternatively, in some embodiments, the method includes, while driving the first set of electrodes, sensing a second set of electrodes of the first plurality of electrodes, wherein the second set of electrodes are different from the first set of electrodes.


Additionally or alternatively, in some examples, the first set of electrodes includes a first electrode, and the second set of electrodes includes a second electrode, adjacent to the first electrode. Additionally or alternatively, in some examples, the first set of electrodes comprises a first electrode. Additionally or alternatively, in some examples, the second set of electrodes comprises a plurality of electrodes other than the first electrode. Additionally or alternatively, in some examples, the first set of electrodes includes a first electrode and a second electrode. Additionally or alternatively, in some examples, the second set of electrodes includes a third electrode and a fourth electrode. Additionally or alternatively, in some examples, the third electrode is adjacent to the first electrode and the second electrode. Additionally or alternatively, in some examples, the second electrode is adjacent to the third electrode and the fourth electrode.


Additionally or alternatively, in some examples, the method includes, during the first time period, while driving the first set of electrodes and sensing the second set of electrodes, driving the second plurality of electrodes. Additionally or alternatively, in some examples, the method includes, during a second time period, after the first time period, driving the second set of electrodes and while driving the second set of electrodes, sensing the first set of electrodes.


Additionally or alternatively, in some examples, the method includes, during a third time period, different from the second time period and after the first time period, driving a first set of electrodes of the second plurality of electrodes. Additionally or alternatively, in some examples, the method includes, while driving the first set of electrodes of the second plurality of electrodes, sensing a second set of electrodes of the second plurality of electrodes, different from the first set of the second plurality of electrodes and driving the first plurality of electrodes.


Additionally or alternatively, in some examples, the method includes, during a fourth time period, different from the second and third time period and after the first time period, driving the second set of electrodes of the second plurality of electrodes. Additionally or alternatively, in some examples, the method includes, while driving the second set of electrodes of the second plurality of electrodes, sensing the first set of electrodes of the second plurality of electrodes and driving the first plurality of electrodes. Additionally or alternatively, in some examples, the method includes generating a touch image based on a sensed touch data from the first time period, the second time period, the third time period, and the fourth time period.


Some examples of the disclosure are directed to a non-transitory computer readable storage medium. The non-transitory computer readable storage medium can store instructions, which when executed by a device including one or more processors, can cause the device to perform any of the above methods.


Some examples of this disclosure describe a touch controller. Additionally or alternatively, in some examples, the touch controller includes a touch sensor panel having a first plurality of electrodes electrically coupled together along a first direction and a second plurality of electrodes electrically coupled together along a second direction, different from the first direction. Additionally or alternatively, in some examples, the touch controller includes switching circuitry coupled to the first plurality of electrodes and the second plurality of electrodes and including a first set of switching circuits and a second set of switching circuits. Additionally or alternatively, in some examples, each switching circuit of the first set of switching circuits is coupled to a respective electrode of the first plurality of electrodes and is configured to selectively couple the respective electrode to a drive circuitry or a sense circuitry. Additionally or alternatively, in some examples, each switching circuit of the second set of switching circuits is coupled to a respective electrode of the second plurality of electrodes and is configured to selectively couple the respective electrode to the drive circuitry or the sense circuitry. Additionally or alternatively, in some examples, the touch controller is configured to, during a first time period, drive a first set of electrodes of the first plurality of electrodes and, while driving the first set of electrodes, sense a second set of electrodes of the first plurality of electrodes, different from the first set of electrodes.


Additionally or alternatively, in some examples, driving the first set of electrodes includes configuring a first set of respective switching circuits corresponding to the first set of electrodes to couple the first set of electrodes to the drive circuitry. Additionally or alternatively, in some examples, sensing the second set of electrodes includes configuring a second set of respective switching circuits corresponding to the second set of electrodes to couple the second set of electrodes to the sense circuitry.


Additionally or alternatively, in some examples, the first set of electrodes includes a first electrode, and the second set of electrodes includes a second electrode, adjacent to the first electrode. Additionally or alternatively, in some examples, the first set of electrodes comprises a first electrode of the first plurality of electrodes. Additionally or alternatively, in some examples, the second set of electrodes comprises a plurality of electrodes other than the first electrode.


Additionally or alternatively, in some examples, the first set of electrodes includes a first electrode and a second electrode. Additionally or alternatively, in some examples, the second set of electrodes includes a third electrode and a fourth electrode. Additionally or alternatively, in some examples, the third electrode is adjacent to the first electrode and the second electrode. Additionally or alternatively, in some examples, the second electrode is adjacent to the third electrode and the fourth electrode. Additionally or alternatively, in some examples, the touch controller is configured to, during the first time period, while driving the first set of electrodes and sensing the second set of electrodes, drive the second plurality of electrodes.


Additionally or alternatively, in some examples, the touch controller is configured to, during a second time period, after the first time period, drive the second set of electrodes and while driving the second set of electrodes, sense the first set of electrodes. Additionally or alternatively, in some examples, the touch controller is configured to, during a third time period, different from the second time period and after the first time period, drive a first set of electrodes of the second plurality of electrodes. Additionally or alternatively, in some examples, the touch controller is configured to, during a third time period, different from the second time period and after the first time period, while driving the first set of electrodes of the second plurality of electrodes, sense a second set of electrodes of the second plurality of electrodes, different from the first set of the second plurality of electrodes and drive the first plurality of electrodes.


Additionally or alternatively, in some examples, the touch controller is configured to, during a fourth time period, different from the second and third time periods and after the first time period, drive the second set of electrodes of the second plurality of electrodes and, while driving the second set of electrodes of the second plurality of electrodes, sense the first set of electrodes of the second plurality of electrodes and drive the first plurality of electrodes. Additionally or alternatively, in some examples, the touch controller is configured to generate a touch image based on a sensed touch data from the first time period, the second time period, the third time period, and the fourth time period.


Although the disclosed examples have been fully described with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. Such changes and modifications are to be understood as being included within the scope of the disclosed examples as defined by the appended claims.

Claims
  • 1. A method comprising: at a touch sensor panel having a first plurality of electrodes electrically coupled together along a first direction and a second plurality of electrodes electrically coupled together along a second direction, different from the first direction:during a first time period: applying a drive signal to a first set of electrodes of the first plurality of electrodes corresponding to a set of odd electrodes of the first plurality of electrodes, wherein the first set of electrodes includes a first electrode and a second electrode; andwhile applying the drive signal to the first set of electrodes: sensing a second set of electrodes of the first plurality of electrodes corresponding to a set of even electrodes of the first plurality of electrodes; andapplying the drive signal to the second plurality of electrodes, wherein: the second set of electrodes is different from the first set of electrodes;the second set of electrodes includes a third electrode and a fourth electrode;the third electrode is positioned between the first electrode and the second electrode absent intervening electrodes; andthe second electrode is positioned between the third electrode and the fourth electrode absent intervening electrodes.
  • 2. The method of claim 1, further comprising: during a second time period, after the first time period: applying the drive signal to the second set of electrodes; andwhile applying the drive signal to the second set of electrodes, sensing the first set of electrodes.
  • 3. The method of claim 2, further comprising: during a third time period, different from the second time period and after the first time period: applying the drive signal to a first set of electrodes of the second plurality of electrodes; andwhile applying the drive signal to the first set of electrodes of the second plurality of electrodes: sensing a second set of electrodes of the second plurality of electrodes, different from the first set of the second plurality of electrodes; andapplying the drive signal to the first plurality of electrodes.
  • 4. The method of claim 3, further comprising: during a fourth time period, different from the second and third time period and after the first time period: applying the drive signal to the second set of electrodes of the second plurality of electrodes; andwhile applying the drive signal to the second set of electrodes of the second plurality of electrodes: sensing the first set of electrodes of the second plurality of electrodes; andapplying the drive signal to the first plurality of electrodes.
  • 5. The method of claim 4, further comprising generating a touch image based on a sensed touch data from the first time period, the second time period, the third time period, and the fourth time period.
  • 6. A touch controller, comprising: a touch sensor panel having a first plurality of electrodes electrically coupled together along a first direction and a second plurality of electrodes electrically coupled together along a second direction, different from the first direction;switching circuitry coupled to the first plurality of electrodes and the second plurality of electrodes and including a first set of switching circuits and a second set of switching circuits, wherein: each switching circuit of the first set of switching circuits is coupled to a respective electrode of the first plurality of electrodes and is configured to selectively couple the respective electrode to a drive circuitry or a sense circuitry; andeach switching circuit of the second set of switching circuits is coupled to a respective electrode of the second plurality of electrodes and is configured to selectively couple the respective electrode to the drive circuitry or the sense circuitry; and wherein:the touch controller is configured to: during a first time period: apply a drive signal to a first set of electrodes of the first plurality of electrodes corresponding to a set of odd electrodes of the first plurality of electrodes, wherein the first set of electrodes includes a first electrode and a second electrode; andwhile applying the drive signal to the first set of electrodes: sense a second set of electrodes of the first plurality of electrodes corresponding to a set of even electrodes of the first plurality of electrodes; andapply the drive signal to the second plurality of electrodes, wherein: the second set of electrodes is different from the first set of electrodes; the second set of electrodes includes a third electrode and a fourth electrode; the third electrode is positioned between the first electrode and the second electrode absent intervening electrodes; and the second electrode is positioned between the third electrode and the fourth electrode absent intervening electrodes.
  • 7. The touch controller of claim 6, wherein: applying the drive signal to the first set of electrodes includes configuring a first set of respective switching circuits corresponding to the first set of electrodes to couple the first set of electrodes to the drive circuitry; andsensing the second set of electrodes includes configuring a second set of respective switching circuits corresponding to the second set of electrodes to couple the second set of electrodes to the sense circuitry.
  • 8. The touch controller of claim 6, wherein the touch controller is further configured to: during a second time period, after the first time period: apply the drive signal to the second set of electrodes; andwhile applying the drive signal to the second set of electrodes, sense the first set of electrodes.
  • 9. The touch controller of claim 8, wherein the touch controller is further configured to: during a third time period, different from the second time period and after the first time period: apply the drive signal to a first set of electrodes of the second plurality of electrodes; andwhile applying the drive signal to the first set of electrodes of the second plurality of electrodes: sense a second set of electrodes of the second plurality of electrodes, different from the first set of the second plurality of electrodes; andapply the drive signal to the first plurality of electrodes.
  • 10. The touch controller of claim 9, wherein the touch controller is further configured to: during a fourth time period, different from the second and third time period and after the first time period: apply the drive signal to the second set of electrodes of the second plurality of electrodes; andwhile applying the drive signal to the second set of electrodes of the second plurality of electrodes: sense the first set of electrodes of the second plurality of electrodes; andapply the drive signal to the first plurality of electrodes.
  • 11. The touch controller of claim 10, wherein the touch controller is further configured to generate a touch image based on a sensed touch data from the first time period, the second time period, the third time period, and the fourth time period.
  • 12. A non-transitory computer-readable storage medium storing instructions, which when executed by an electronic device including one or more processors, cause the electronic device to: at a touch sensor panel having a first plurality of electrodes electrically coupled together along a first direction and a second plurality of electrodes electrically coupled together along a second direction, different from the first direction: during a first time period: apply a drive signal to a first set of electrodes of the first plurality of electrodes corresponding to a set of odd electrodes of the first plurality of electrodes, wherein the first set of electrodes includes a first electrode and a second electrode; andwhile applying the drive signal to the first set of electrodes: sense a second set of electrodes of the first plurality of electrodes corresponding to a set of even electrodes of the first plurality of electrodes; andapply the drive signal to the second plurality of electrodes, wherein: the second set of electrodes is different from the first set of electrodes; the second set of electrodes includes a third electrode and a fourth electrode; the third electrode is positioned between the first electrode and the second electrode absent intervening electrodes; and the second electrode is positioned between the third electrode and the fourth electrode absent intervening electrodes.
  • 13. The non-transitory computer-readable storage medium of claim 12, wherein the instructions, when executed by the electronic device, further cause the electronic device to: during a second time period, after the first time period: apply the drive signal to the second set of electrodes; andwhile applying the drive signal to the second set of electrodes, sense the first set of electrodes.
  • 14. The non-transitory computer-readable storage medium of claim 13, wherein the instructions, when executed by the electronic device, further cause the electronic device to: during a third time period, different from the second time period and after the first time period: apply the drive signal to a first set of electrodes of the second plurality of electrodes; andwhile applying the drive signal to the first set of electrodes of the second plurality of electrodes: sense a second set of electrodes of the second plurality of electrodes, different from the first set of the second plurality of electrodes; andapply the drive signal to the first plurality of electrodes.
  • 15. The non-transitory computer-readable storage medium of claim 14, wherein the instructions, when executed by the electronic device, further cause the electronic device to: during a fourth time period, different from the second and third time period and after the first time period: apply the drive signal to the second set of electrodes of the second plurality of electrodes; andwhile applying the drive signal to the second set of electrodes of the second plurality of electrodes: sense the first set of electrodes of the second plurality of electrodes; andapply the drive signal to the first plurality of electrodes.
  • 16. The non-transitory computer-readable storage medium of claim 15, wherein the instructions, when executed by the electronic device, further cause the electronic device to: generate a touch image based on a sensed touch data from the first time period, the second time period, the third time period, and the fourth time period.
  • 17. The non-transitory computer-readable storage medium of claim 12, wherein the instructions, when executed by the electronic device, further cause the electronic device to: apply the drive signal to the first set of electrodes by configuring a first set of respective switching circuits corresponding to the first set of electrodes to couple the first set of electrodes to drive circuitry of the electronic device; andsense the second set of electrodes by configuring a second set of respective switching circuits corresponding to the second set of electrodes to couple the second set of electrodes to sense circuitry of the electronic device.
CROSS-REFERENCE TO RELATED APPLICATION

This application claims the benefit of U.S. Provisional Application No. 63/032,575, filed May 30, 2020, the content of which is hereby incorporated by reference in its entirety for all purposes.

Provisional Applications (1)
Number Date Country
63032575 May 2020 US