This relates generally to touch sensitive devices and, more specifically, to touch sensitive devices that can have multiple scanning modes.
Touch sensitive devices have become popular as input devices to computing systems due to their ease and versatility of operation as well as their declining price. A touch sensitive device can include a touch sensor panel, which can be a clear panel with a touch sensitive surface, and a display device, such as a liquid crystal display (LCD), that can be positioned partially or fully behind the panel or integrated with the panel so that the touch sensitive surface can cover at least a portion of the viewable area of the display device. The touch sensitive device can allow a user to perform various functions by touching the touch sensor panel using a finger, stylus or other object at a location often dictated by a user interface (UI) being displayed by the display device. In general, the touch sensitive device can recognize a touch event and the position of the touch event on the touch sensor panel, and the computing system can then interpret the touch event in accordance with the display appearing at the time of the touch event, and thereafter can perform one or more actions based on the touch event. Touch sensing operations, however, can consume a significant amount of power and drain the battery of the touch sensitive device.
This relates to reducing power consumption due to touch sensing operations for touch sensitive devices. Power consumption can be reduced by implementing a coarse scan (e.g., a banked common mode scan) to coarsely detect the presence or absence of an object touching or proximate to a touch sensor panel, and the results of the coarse scan can be used to dynamically adjust the operation of the touch sensitive device to perform or not perform one or more fine scan steps (e.g., a targeted active mode scan). A coarse scan, such as a banked common mode scan, can be a relatively low power scan compared to a fine scan, such as a full panel scan, and can indicate the presence or absence of a touch event at a region of the touch sensor panel. In some examples, the results of the coarse scan can be used to program a touch controller of the touch sensitive device for the next touch sensing frame to idle when no touch event is detected, or to perform a fine scan (e.g., a full or partial panel scan) when one or more touch events are detected. In some examples, the results of the coarse scan can be used to abort one or more scheduled fine scan steps during a current touch sensing frame when no touch event is detected.
In the following description of examples, reference is made to the accompanying drawings which form a part hereof, and in which it is shown by way of illustration specific examples that can be practiced. It is to be understood that other examples can be used and structural changes can be made without departing from the scope of the disclosed examples.
This relates to reducing power consumption due to touch sensing operations for touch sensitive devices. Continuously scanning a touch sensor panel to detect touch or proximity events can waste a significant amount of power, especially when no objects are touching or proximate to the touch sensor panel for extended periods of time. In order to reduce power consumption, in some examples, a coarse scan can be performed to coarsely detect the presence or absence of a touch event, and the results of the coarse scan can be used to dynamically adjust the operation of the touch sensitive device to perform or not perform one or more fine scan steps (e.g., a targeted active mode scan). A coarse scan, such as a banked common mode scan, can be a relatively low power scan compared to a fine scan such as a full panel scan, and can indicate the presence or absence of a touch event at a region of the touch sensor panel. In some examples, the results of the coarse scan can be used to program a touch controller for the next touch sensing frame to idle when no touch event is detected, or to perform a fine scan (e.g., a full or partial panel scan) when one or more touch events are detected. In some examples, the results of the coarse scan can be used to abort one or more steps of a scheduled fine scan during a current touch sensing frame when no touch event is detected. The latter example can reduce power consumption (e.g., by powering down unused sense channels) without degrading the responsiveness of the touch sensitive device. Additionally or alternatively, the one or more aborted scan steps can be reallocated to other scans.
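The coarse-then-fine control flow described above can be sketched as follows. This is an illustrative model only; the function and threshold names (`plan_next_frame`, `TOUCH_THRESHOLD`) are hypothetical and not part of any particular touch controller API.

```python
# Hypothetical sketch of using coarse (banked common mode) scan results
# to program the next touch sensing frame. Names and units are assumptions.

TOUCH_THRESHOLD = 10  # assumed signal-change threshold for "touch present"

def plan_next_frame(bank_measurements, threshold=TOUCH_THRESHOLD):
    """Return a scan plan for the next frame based on the coarse scan of
    the current frame: idle when no bank shows a touch, otherwise perform
    a fine scan (possibly restricted to the banks with activity)."""
    touched_banks = [i for i, value in enumerate(bank_measurements)
                     if value >= threshold]
    if not touched_banks:
        # No touch detected anywhere: program the controller to idle,
        # saving the power of a full panel scan.
        return {"mode": "idle", "banks": []}
    # Touch detected: schedule a fine scan for the affected regions.
    return {"mode": "fine_scan", "banks": touched_banks}
```

A scheduler could call this once per frame with the latest coarse measurements and power down sense channels whenever the returned mode is `"idle"`.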
Although a full panel scan is discussed herein as an example of a fine scan, and a banked common mode scan is discussed herein as an example of a coarse scan, it should be understood that the coarse and fine scans are not limited to these examples. A coarse scan can be any scan that provides information about touch events with a lower resolution than a given fine scan. A fine scan can be any scan that provides information about touch events with a higher resolution than a given coarse scan. A full panel scan can be an example fine scan that can provide the highest resolution scan as it can provide the maximum touch information available for the panel (e.g., unique sensor measurements for the smallest sensing node). A scan of the entire panel as a single sense node can be an example coarse scan that can provide the lowest resolution scan as it can provide the minimum touch information available for the panel (e.g., only one measurement for the largest sensing node configuration).
This also relates to performing a coarse scan and fine scan for a pixelated touch sensor panel. In some examples, one or more objects (e.g., a finger or an active or passive stylus) touching or proximate to the pixelated touch sensor panel can be detected using a coarse scan. When detecting an active stylus, for example, a coarse scan can reduce the number of sense/receive channels necessary for a scan of the entire panel, thereby saving power and reducing hardware required for the touch sensitive device. It should be appreciated, however, that the object could be another input device. Once detected, the stylus location can be more accurately detected using a fine scan (e.g., a targeted active mode scan). For example, during a coarse scan, groups of individual pixel electrodes of a touch sensor panel can be coupled together to form super-pixel electrodes (or a bank of pixel electrodes). The super-pixel electrodes can be coupled to available sense/receive channels. In some examples, the coarse location can be indicated by the super-pixel electrode with a maximum detected touch value. In other examples, touch values from adjacent super-pixel electrodes can be used to provide additional resolution to the stylus location within the super-pixel electrode with the maximum detected touch value. After detecting the stylus and its coarse location, coupling between the sense/receive channels and pixel electrodes of the touch sensor panel can be dynamically adjusted and a fine scan can be performed for pixel electrodes proximate to the coarsely detected stylus location. The fine scan can include one or more of a per-pixel individual pixel scan, a row scan and a column scan.
This also relates to using coarse and fine scans to reduce power consumption for a pixelated touch sensor panel. One or more coarse scans can be performed to coarsely detect the presence or absence of a touch or hover event, and the results of the coarse scan can be used to dynamically adjust the operation of the touch sensitive device to perform or not perform one or more fine scan steps. A coarse scan, such as a banked common mode scan for a pixelated touch sensor panel, can be a relatively low power scan compared to a fine scan such as a full panel scan for a pixelated touch sensor panel, and can indicate the presence or absence of a touch event at one or more regions of the touch sensor panel. In some examples, the results of the coarse scan can be used to abort one or more steps of a scheduled fine scan (or reprogram the steps of an upcoming scheduled fine scan). Unused sense channels from aborted scan steps can be powered down to reduce power consumption. Additionally or alternatively, the one or more aborted scan steps can be reallocated to other scans. Although a full panel scan is discussed herein as an example of a fine scan, it should be understood that the fine scan is not so limited, and can be any scan of the panel providing information about touch events with a higher resolution than the coarse scan.
Touch screens 124, 126, 128 and 130 can be based on, for example, self-capacitance or mutual capacitance sensing technology, or another touch sensing technology. For example, in a self-capacitance based touch system, an individual electrode with a self-capacitance to ground can be used to form a touch pixel (touch node) for detecting touch. As an object approaches the touch pixel, an additional capacitance to ground can be formed between the object and the touch pixel. The additional capacitance to ground can result in a net increase in the self-capacitance seen by the touch pixel. This increase in self-capacitance can be detected and measured by a touch sensing system to determine the positions of multiple objects when they touch the touch screen. A mutual capacitance based touch system can include, for example, drive regions and sense regions, such as drive lines and sense lines. For example, drive lines can be formed in rows while sense lines can be formed in columns (i.e., orthogonal). Touch pixels (touch nodes) can be formed at the intersections or adjacencies (in single layer configurations) of the rows and columns. In a pixelated touch sensor panel, touch nodes for a mutual capacitance scan can be formed at the adjacencies of pixel electrodes configured as drive electrodes and pixel electrodes configured as sense electrodes. During operation, the rows can be stimulated with an alternating current (AC) waveform and a mutual capacitance can be formed between the row and the column of the touch pixel. As an object approaches the touch pixel, some of the charge being coupled between the row and column of the touch pixel can instead be coupled onto the object. This reduction in charge coupling across the touch pixel can result in a net decrease in the mutual capacitance between the row and the column and a reduction in the AC waveform being coupled across the touch pixel. 
This reduction in the charge-coupled AC waveform can be detected and measured by the touch sensing system to determine the positions of multiple objects when they touch the touch screen. In some examples, a touch screen can be multi-touch, single touch, projection scan, full-imaging multi-touch, or any capacitive touch.
It should be apparent that the architecture shown in
Computing system 200 can include a host processor 228 for receiving outputs from touch processor 202 and performing actions based on the outputs. For example, host processor 228 can be connected to program storage 232 and a display controller, such as a Liquid-Crystal Display (LCD) driver 234. It is understood that although the examples of the disclosure are described with reference to LCD displays, the scope of the disclosure is not so limited and can extend to other types of displays, such as Light-Emitting Diode (LED) displays, including Active-Matrix Organic LED (AMOLED) and Passive-Matrix Organic LED (PMOLED) displays.
Host processor 228 can use LCD driver 234 to generate a display image on touch screen 220, such as a display image of a user interface (UI), and can use touch processor 202 and touch controller 206 to detect a touch on or near touch screen 220, such as a touch input to the displayed UI. The touch input can be used by computer programs stored in program storage 232 to perform actions that can include, but are not limited to, moving an object such as a cursor or pointer, scrolling or panning, adjusting control settings, opening a file or document, viewing a menu, making a selection, executing instructions, operating a peripheral device connected to the host device, answering a telephone call, placing a telephone call, terminating a telephone call, changing the volume or audio settings, storing information related to telephone communications such as addresses, frequently dialed numbers, received calls, missed calls, logging onto a computer or a computer network, permitting authorized individuals access to restricted areas of the computer or computer network, loading a user profile associated with a user's preferred arrangement of the computer desktop, permitting access to web content, launching a particular program, encrypting or decoding a message, and/or the like. Host processor 228 can also perform additional functions that may not be related to touch processing.
In some examples, RAM 212, program storage 232, or both, can be non-transitory computer readable storage media. One or both of RAM 212 and program storage 232 can have stored therein instructions, which when executed by touch processor 202 or host processor 228 or both, can cause the device including computing system 200 to perform one or more functions and methods of one or more examples of this disclosure.
Touch screen 220 can include touch sensing circuitry that can include a capacitive sensing medium having a plurality of drive lines 222 and a plurality of sense lines 223. It should be noted that the term “lines” is sometimes used herein to mean simply conductive pathways, as one skilled in the art will readily understand, and is not limited to elements that are strictly linear, but includes pathways that change direction, and includes pathways of different size, shape, materials, etc. Drive lines 222 can be driven by stimulation signals 216 from driver logic 214 through a drive interface 224, and resulting sense signals 217 generated in sense lines 223 can be transmitted through a sense interface 225 to sense channels 208 (also referred to as an event detection and demodulation circuit) in touch controller 206. In this way, drive lines and sense lines can be part of the touch sensing circuitry that can interact to form capacitive sensing nodes, which can be thought of as touch picture elements (touch pixels), such as touch pixels 226 and 227. This way of understanding can be particularly useful when touch screen 220 is viewed as capturing an “image” of touch (“touch image”). In other words, after touch controller 206 has determined whether a touch has been detected at each touch pixel in the touch screen, the pattern of touch pixels in the touch screen at which a touch occurred can be thought of as an “image” of touch (e.g., a pattern of fingers touching the touch screen). Although not shown in the example of
In addition to performing a full panel touch sensor scan, the example touch sensor panel can also be configured to perform a banked common mode scan. The banked common mode scan can be performed for all the banks in the panel to provide touch information for the entire panel, though at a lower resolution than the full panel touch sensor scan described above. During a banked common mode scan, a bank of drive lines 302 can be simultaneously stimulated with a common stimulation signal, and the sense signals generated in one or more sense lines 304 in response to stimulation of the bank can be used to determine the presence and/or amount of touch at the region corresponding to the bank. Performing a common mode scan for multiple banks in the panel can provide coarse information about the presence or absence of a touch in one or more banks. In some examples, stimulation of each bank of the touch sensor panel can be time-multiplexed (single stimulation), though in other examples, the common mode stimulation of banks can be frequency-multiplexed or performed using bank-level multi-stimulation techniques. Performing a banked common mode scan (coarse scan) rather than a coarse scan of the drive lines 302 of the entire touch sensor panel (i.e., stimulating all drive lines simultaneously with a common mode voltage) can ensure proper sensitivity and signal-to-noise ratio properties for the sense channels. As the number of drive lines per sense channel increases, the large signal from common mode stimulation can reduce the sensitivity and signal-to-noise ratio properties of the sense channel.
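A banked common mode scan can be modeled, in simplified form, as producing one combined measurement per bank of drive lines. The sketch below assumes a purely additive signal model (the bank response is the sum of the per-line responses); actual sense channel behavior is more complex, and the names are illustrative.

```python
def banked_common_mode_scan(drive_line_signals, bank_size):
    """Coarse scan model: each bank of drive lines is stimulated with a
    common stimulation signal, yielding one combined measurement per bank.
    drive_line_signals is a hypothetical per-line touch signal list."""
    bank_measurements = []
    for start in range(0, len(drive_line_signals), bank_size):
        bank = drive_line_signals[start:start + bank_size]
        # Simplified model: common mode stimulation of a bank yields
        # roughly the summed response of the lines in that bank.
        bank_measurements.append(sum(bank))
    return bank_measurements
```

Note that keeping `bank_size` moderate reflects the sensitivity concern above: as more drive lines share one sense channel, the large common mode signal can degrade the channel's signal-to-noise ratio.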
Touch sensing scans can be arranged to take place during designated touch sensing frames. In some examples requiring continuous touch information, full panel touch sensor scanning can be repeated during each touch sensing frame in order to detect touch and/or hover events at the touch sensor panel. Continuous full panel touch sensor scanning can be advantageous in that it can provide complete touch information for each scanning frame and can allow the system to be immediately responsive to touches.
In some examples, rather than performing full panel touch sensor scanning during each touch frame, the system can skip full panel touch sensing scans during touch sensing frames after no touch events are detected at the touch sensor panel for a threshold period of time. For example, the system can have two modes. During a first mode (e.g., full-frequency mode), the system can perform continuous full panel touch sensor scanning during touch sensing frames. During a second mode (e.g., reduced-frequency mode), the system can reduce the frequency of full panel touch sensor scanning by dropping scans (e.g., by programming the touch controller to idle rather than scan). The system can switch from the full-frequency mode to the reduced-frequency mode when no touch events are detected on the panel for a threshold number of touch sensing frames, for example. The system can switch from the reduced-frequency mode to the full-frequency mode when a touch event is detected, for example.
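The two-mode behavior described above can be sketched as a small state machine. The mode names and the idle-frame threshold below are illustrative assumptions, not values from any particular device.

```python
class ScanModeController:
    """Sketch of the full-frequency / reduced-frequency mode switching
    described above. Threshold and mode names are hypothetical."""

    def __init__(self, idle_frame_threshold=3):
        self.idle_frame_threshold = idle_frame_threshold
        self.mode = "full_frequency"
        self.frames_without_touch = 0

    def on_frame(self, touch_detected):
        """Update the mode after each touch sensing frame."""
        if touch_detected:
            # A touch event switches the system back to full-frequency
            # scanning immediately.
            self.frames_without_touch = 0
            self.mode = "full_frequency"
        else:
            self.frames_without_touch += 1
            if self.frames_without_touch >= self.idle_frame_threshold:
                # No touches for the threshold number of frames: drop
                # scans (program the controller to idle rather than scan).
                self.mode = "reduced_frequency"
        return self.mode
```

In the reduced-frequency mode the controller could, for example, perform a full panel scan only every Nth frame and idle otherwise.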
In some examples, rather than changing the frequency of full panel touch sensor scanning, the system can dynamically alter the operation for the upcoming frame based on a banked common mode detection scan for a current frame.
Although
Performing a common mode detection scan during touch sensing frames rather than a full panel touch sensing scan (as illustrated in
In some examples, rather than dynamically altering the operation for the upcoming frame based on a banked common mode detection scan for a current frame (which can reduce responsiveness of the system), the system can be configured to execute a detection scan and a full panel scan during a touch sensing frame, and abort the full panel touch sensing scan based on the detection scan.
Although
Although illustrated in
In other examples, some features illustrated in
It should be understood that although not illustrated in the touch sensing frames of
Additionally, although not illustrated in
It is understood that the order of coarse detection scanning, programming, and fine scanning (or idling) illustrated in
In some examples, the touch sensor panel or touch screen of a device can include a pixelated touch sensor panel. A pixel based touch system can include a matrix of small, individual plates of conductive material that can be referred to as touch pixel electrodes. For example, a touch screen can include a plurality of individual touch pixel electrodes, each touch pixel electrode identifying or representing a unique location on the touch screen at which touch or proximity (i.e., a touch or proximity event) is to be sensed, and each touch pixel electrode being electrically isolated from the other touch pixel electrodes in the touch screen/panel. Such a touch screen can be referred to as a pixelated touch screen. A pixelated touch screen configured to detect touch or hover events (e.g., from a finger or passive stylus) by measuring the self-capacitance of each touch pixel electrode can be referred to as a pixelated self-capacitance touch screen. During operation, a touch pixel electrode can be stimulated with an AC waveform, and the self-capacitance to ground of the touch pixel electrode can be measured. As an object approaches the touch pixel electrode, the self-capacitance to ground of the touch pixel electrode can change. This change in the self-capacitance of the touch pixel electrode can be detected and measured by the touch sensing system to determine the positions of multiple objects when they touch, or come in proximity to, the touch screen. In some examples, the electrodes of a self-capacitance based touch system can be formed from rows and columns of conductive material, and changes in the self-capacitance to ground of the rows and columns can be detected, similar to above. A pixelated touch screen can also be configured to measure mutual capacitance formed between an active stylus electrode and each of the pixel electrodes when the stylus is touching or proximate to the touch screen to determine the location of the active stylus. 
In some examples, a touch screen can be multi-touch, single touch, projection scan, full-imaging multi-touch, capacitive touch, etc.
Although the touch sensor panel 700 illustrated in
During a self-capacitance coarse scan of the pixelated touch sensor panel or touch screen (i.e., common mode super-pixel scan), a super-pixel electrode can be stimulated with an AC waveform, and the self-capacitance to ground of the super-pixel electrode can be measured. As an object approaches the super-pixel electrode, the self-capacitance to ground of the super-pixel electrode can change. This change in the self-capacitance of the super-pixel electrode can be detected and measured by the touch sensing system to determine the positions of multiple objects when they touch or come in proximity to the touch screen. During a coarse scan, each super-pixel electrode can be sensed (e.g., single stimulation or multi-stimulation) and the change in self-capacitance measured at each of the super-pixel electrodes in the panel can be viewed as a coarse image of touch on the touch sensor panel. Stimulating super-pixels during a coarse scan rather than stimulating individual pixel electrodes can reduce the number of sense channels required to scan the entire touch sensor panel during a coarse scan. For example, a super-pixel containing 16 individual pixel electrodes can reduce the number of channels necessary by a factor of 16 when simultaneously sensing the entire panel. Thus, rather than needing 480 sense channels as required for a 24×20 array of individual pixel electrodes, a 6×5 configuration of super-pixel electrodes can require only 30 sense/receive channels. In addition to hardware savings, the coarse scan can also be completed faster than a full panel scan of individual pixel electrodes and consume less power than a full panel scan of individual pixel electrodes. In some cases the number of super-pixels and sense/receive channels can be the same, such that the entire panel can be scanned at once.
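The channel-count arithmetic in this example follows directly from the panel and super-pixel dimensions. A hypothetical helper makes it explicit, assuming the panel divides evenly into super-pixels:

```python
def channels_for_coarse_scan(rows, cols, super_pixel_rows, super_pixel_cols):
    """Number of sense/receive channels needed to sense the whole panel in
    one step when individual pixel electrodes are ganged into super-pixel
    electrodes. Assumes the panel divides evenly into super-pixels."""
    assert rows % super_pixel_rows == 0 and cols % super_pixel_cols == 0
    return (rows // super_pixel_rows) * (cols // super_pixel_cols)
```

For the 24×20 panel discussed above with 4×4 (16-pixel) super-pixels, this yields the 6×5 = 30 channels stated in the text, a 16-fold reduction from the 480 channels a per-pixel single-step scan would require.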
Similarly, a mutual capacitance coarse scan of the pixelated touch sensor panel can be used to detect an object, such as an active stylus or other input device. For brevity, the following discussion will address an active stylus as an exemplary object. The stylus can generate stimulation signals which can couple to super-pixel electrodes, forming a mutual capacitance therebetween. The change in mutual capacitance between the active stylus and the super-pixel electrodes can be detected and measured by the touch sensing system to determine the position of the active stylus in contact with or proximity to the touch screen. The use of super-pixel electrodes can also provide the benefits of and associated with reducing the number of sense/receive channels.
The common mode super-pixel scan can be used to detect the presence of a stylus, pen or other touching object, and provide coarse information about its location. The super-pixel electrode with the maximum touch value (corresponding to the largest change in self-capacitance due to a finger or passive stylus or mutual capacitance due to the active stylus) can be identified by the touch system. In some examples, additional location information can be estimated using touch values from adjacent super-pixel electrodes.
The touch values of adjacent super-pixel electrodes can be used to provide additional location information. For example, a centroid can be calculated using the touch values corresponding to the super-pixel having the maximum touch value and the super-pixels adjacent to the super-pixel electrode having the maximum value. A centroid can be calculated in both the horizontal (x-axis) and vertical (y-axis) axes. The centroids can indicate whether the stylus location corresponds to the top, bottom or middle of the super-pixel electrode with the maximum touch value and also whether the stylus location corresponds to the left, right or middle portion of the super-pixel electrode. For example as illustrated in
where Rx(n) and Ry(n) can correspond to the touch values corresponding to the super-pixels included in the centroid calculation (e.g., corresponding to super-pixels 904, 906 and 908 for the x-axis and super pixels 904, 910 and 912 for the y-axis). The centroid calculation can provide location information identifying the stylus location not only as corresponding to super-pixel electrode 904, but also as corresponding to an approximate sub-division or region of super-pixel electrode 904. As illustrated in
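Assuming the conventional weighted-average form of a centroid, the sum of R(n)·n divided by the sum of R(n), the per-axis calculation can be sketched as follows. The exact weighting used by a given touch controller may differ.

```python
def centroid(touch_values, positions):
    """Weighted centroid of touch values along one axis, e.g. the maximum
    super-pixel and its two neighbors. Assumes the conventional formula
    sum(R(n) * n) / sum(R(n)); returns None if all values are zero."""
    total = sum(touch_values)
    if total == 0:
        return None
    return sum(v * p for v, p in zip(touch_values, positions)) / total
```

Running this once with the x-axis neighbors and once with the y-axis neighbors gives the two centroids that place the stylus in a sub-region (e.g., top-left, center) of the maximum super-pixel.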
Although a centroid can be used to determine additional precision for the location of a stylus, other metrics can be used instead of or in addition to the centroid. For example, various ratios of the touch values or the relative magnitude of touch values can be used to determine additional location information. For example, a large magnitude of the touch value of super-pixel electrode 910 compared with the touch value of super-pixel electrode 912 can indicate that the location corresponds to the top of super-pixel electrode 904. Similarly, a ratio of touch values (e.g., maximum to adjacent or adjacent to adjacent) can be used to determine additional location information. Additionally, although the example illustrated in
After performing a coarse scan, a fine scan can be performed. The fine scan can use information from the coarse scan to minimize the number of electrodes scanned to reduce scan time and power consumption and make efficient use of the available sense/receive channel hardware. For example, if the coarse scan indicates a super-pixel electrode, the fine scan can focus on the individual pixel electrodes in the indicated super-pixel electrode and possibly some adjacent super-pixel electrodes. If the coarse scan provides additional location information (e.g., top left or center, etc.), the system can make use of the information to be more selective about which individual pixel electrodes to scan during the fine scan.
In some examples, the fine scan can be a per-pixel individual pixel scan. A per-pixel individual pixel scan can scan a plurality of pixels at or in proximity to the location of the stylus identified by the coarse scan. The system can reconfigure the connection between the electrodes of the touch sensor panel and sense/receive channels for a fine scan. For example, individual pixel electrodes can be coupled with distinct sense/receive channels.
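One possible way to choose the electrodes for a per-pixel fine scan is to take a window of individual pixel electrodes around the coarsely detected location. The window size and the edge-clamping behavior below are assumptions for illustration, not a prescribed implementation.

```python
def fine_scan_window(coarse_row, coarse_col, footprint, panel_rows, panel_cols):
    """Select the individual pixel electrodes to sense during a per-pixel
    fine scan: a footprint x footprint window centered on the coarsely
    detected location, shifted as needed to stay on the panel."""
    half = footprint // 2
    r0 = max(0, coarse_row - half)
    c0 = max(0, coarse_col - half)
    r1 = min(panel_rows, r0 + footprint)
    c1 = min(panel_cols, c0 + footprint)
    # Each selected pixel electrode would be coupled to a distinct
    # sense/receive channel (or scanned over multiple steps if there
    # are fewer channels than pixels in the window).
    return [(r, c) for r in range(r0, r1) for c in range(c0, c1)]
```

The `footprint` parameter corresponds to the physical footprint of the stylus discussed below: a larger stylus footprint would select a larger window of pixel electrodes.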
The number of individual pixel electrodes scanned can be based on the physical footprint of the stylus. For example, a larger number of individual pixel electrodes can be scanned for a larger stylus footprint, and a smaller number of individual pixel electrodes can be scanned for a smaller stylus footprint. In some cases, one or more individual pixel electrodes in one or more adjacent super-pixels in a super-column and/or super-row can be scanned (simultaneously if enough sense channels are available, or in multiple scan steps). In the example illustrated in
In some cases the number and arrangement of adjacent pixel electrodes to scan during a fine scan can be adjusted based on the detected type of stylus. In some examples, the stylus can transmit information indicating its type and/or physical dimensions (or the information can be detected by the touch sensor panel or entered into the touch sensitive device manually), and the information about the stylus and/or physical dimensions can be used to determine the number and arrangement of pixel electrodes selected for a fine scan. In some examples, the number and arrangement of pixel electrodes selected for the fine scan can be adjusted dynamically based on stylus information (type, dimension, orientation, etc.) and in other examples the number and arrangement can be fixed.
In some examples, additional individual pixel electrodes (or larger arrangements of electrodes such as super-pixel electrodes, row electrodes or column electrodes) can be sensed during a fine scan in order to measure common mode noise (local or global), and the noise signal measured at these electrodes can be removed from the stylus signals sensed by the fine scan. In some examples, the electrodes can be adjacent to the coarsely detected location. In other examples, the electrodes can be proximate, but not adjacent to the coarsely detected location to ensure that local common mode noise can be detected without capturing some of the stylus signal (which can cause some of the stylus signals to be subtracted during noise removal). In yet other examples, global common mode noise can be measured from electrodes distant from the coarsely detected location.
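A simple mean-subtraction model of this noise removal might look as follows. Real controllers may use more sophisticated noise estimation; this sketch only assumes the noise electrodes are far enough from the stylus that they carry noise but no stylus signal.

```python
def remove_common_mode_noise(stylus_samples, noise_samples):
    """Subtract an estimate of common mode noise, measured on electrodes
    away from the stylus location, from the fine scan samples. Uses a
    simple per-frame mean of the noise electrodes as the estimate."""
    noise_estimate = sum(noise_samples) / len(noise_samples)
    return [s - noise_estimate for s in stylus_samples]
```

If a noise electrode were placed too close to the stylus, `noise_samples` would include stylus signal and the subtraction would remove part of the signal of interest, which is why the text suggests proximate-but-not-adjacent electrodes for local noise sensing.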
In some examples, the fine scan can be a row scan and/or column scan. A row scan can scan a plurality of row electrodes formed from individual pixel electrodes at or in proximity to the location of the stylus identified by the coarse scan. A column scan can scan a plurality of column electrodes formed from individual pixel electrodes at or in proximity to the location of the stylus identified by the coarse scan. The system can reconfigure the connection between the electrodes of the touch sensor panel and sense/receive channels for a fine scan. For example, row electrodes or column electrodes can be coupled with distinct sense/receive channels.
The number of rows of individual pixel electrodes (row electrodes) scanned can be based on the physical footprint of the stylus. For example, a larger number of rows of individual pixel electrodes can be scanned for a larger stylus footprint, and a smaller number of rows of individual pixel electrodes can be scanned for a smaller stylus footprint. In some cases, one or more rows of individual pixel electrodes (or individual pixel electrodes, for example) in one or more adjacent super-pixels in a super-column and/or super-row can be scanned (simultaneously if enough sense channels are available, or in multiple scan steps). In the example illustrated in
The number of columns of individual pixel electrodes (column electrodes) scanned can be based on the physical footprint of the stylus. For example, a larger number of columns of individual pixel electrodes can be scanned for a larger stylus footprint and a smaller number of columns of individual pixel electrodes can be scanned for a smaller stylus footprint. In some cases, one or more columns of individual pixel electrodes (or individual pixel electrodes, for example) in one or more adjacent super-pixels in a super-column and/or super-row can be scanned (simultaneously if enough sense channels are available, or in multiple scan steps). In the example illustrated in
In some examples, the touch sensitive device can be configured to perform either a per-pixel scan or a row and column scan, though in other examples the touch sensitive device can dynamically select the type of fine scan to perform. For example, the per-pixel scan can provide increased accuracy over the row and column scans and can be better suited to applications requiring additional resolution or to detect multiple objects operating in close proximity to one another. In such a case, the device can dynamically select the per-pixel scan. In other cases, less resolution may be required and the row and column scan can be dynamically selected.
In some examples, multiple objects (e.g., multiple styli) can be used simultaneously. In some examples, the touch sensing device can time multiplex (e.g., scan for each stylus at a different time) the coarse and fine detection of the two stylus devices. In other examples, the touch sensing device can frequency multiplex (e.g., scan for each stylus at a different frequency) the coarse and fine detection of the two stylus devices, though additional sense/receive channels may be necessary to perform the scans for both styli in parallel.
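The time-multiplexed approach above can be sketched as a simple slot scheduler. This is a minimal illustration, not an implementation from the disclosure; the function name `schedule_frame` and the slot/phase layout are assumptions for illustration only.

```python
# Hypothetical sketch of time-multiplexing coarse and fine scans for two
# styli: each stylus is scanned at a different time slot within a frame.
def schedule_frame(styli, slots_per_frame):
    """Assign scan slots round-robin so each stylus is scanned at a different time."""
    schedule = []
    for slot in range(slots_per_frame):
        stylus = styli[slot % len(styli)]
        # Alternate coarse and fine phases for the stylus owning this slot.
        phase = "coarse" if (slot // len(styli)) % 2 == 0 else "fine"
        schedule.append((slot, stylus, phase))
    return schedule

print(schedule_frame(["stylus_A", "stylus_B"], 4))
```

A frequency-multiplexed variant would instead assign each stylus a distinct stimulation frequency and scan both in the same slot, at the cost of additional sense/receive channels.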
In the discussion of detecting a stylus using a pixelated touch sensor panel, the number of sense/receive channels could be limited based on the dimensions of the stylus. However, when using the pixelated touch sensor panel to detect multiple touches in multiple regions of the touch sensor panel, the number of sense/receive channels and/or the number of scan steps may need to be increased. In some examples, the number of sense/receive channels can be increased, but coarse and fine scans can be implemented to reduce power by powering down sense/receive channels used to detect touch in regions of the pixelated touch sensor panel without objects present. It is understood that powering down unused sense channels can also help reduce power consumption even when the number of sense/receive channels is not increased.
As discussed herein, the pixelated touch sensor panel can include an array of individual pixel electrodes configured to detect touch or hover/proximity events by measuring, for example, changes in the self-capacitance of each individual pixel electrode. In other examples, changes in mutual capacitance can be measured between an active stylus electrode and each individual pixel electrode. During a full panel scan, each touch pixel electrode can be sensed and the change in self-capacitance measured at each of the pixel electrodes in the panel can be viewed as an image of touch on the touch sensor panel. If enough sense/receive channels are available, the full panel scan can be completed in a single scan step. Alternatively, the full panel scan can sense multiple pixel electrodes (e.g., up to the number of sense/receive channels available) during each of multiple steps of the full panel scan. For example, the 24×20 array of pixel electrodes of the pixelated touch sensor panel in
The minimum number of self-capacitance scan steps can be a function of the number of pixel electrodes in the pixelated touch sensor panel, the number of available sense channels in the touch controller and the number of touch controller circuits available assuming that every node in the panel is to be scanned individually. The relationship can be represented mathematically as:

x = M / (N × Q)
where x can represent the number of scan steps for a full panel self-capacitance scan, M can represent the number of pixel electrodes in the pixelated touch sensor panel, N can represent the number of sense channels in a touch controller circuit, and Q can represent the number of touch controller circuits. When touch controller circuits have different numbers of sense channels, the denominator in the above equation can be replaced with the total number of sense channels in the touch system. When x is not an integer, the expression should be rounded up to the next integer.
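The relationship above can be computed directly. The 480-pixel example below corresponds to the 24×20 array mentioned in this disclosure; the 32-channel count is an assumption for illustration only.

```python
import math

# Number of scan steps for a full-panel self-capacitance scan:
# x = M / (N * Q), rounded up to the next integer when fractional.
def full_panel_scan_steps(num_pixels, channels_per_controller, num_controllers):
    return math.ceil(num_pixels / (channels_per_controller * num_controllers))

# A 24x20 array (480 pixel electrodes) with an assumed 32 sense channels
# in a single touch controller circuit: ceil(480 / 32) = 15 steps.
print(full_panel_scan_steps(480, 32, 1))  # 15
```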
Execution of the steps of a full panel scan can be performed based on scheduled scans (including scheduled scan steps), which can be part of a scan plan for the system. Execution of the full panel scan steps can be performed to generate high resolution scan images (touch images). It should be understood that a full panel scan of each pixel of the touch sensor is an example of a fine scan, but in other examples a fine scan could be a different resolution scan that is higher resolution than a coarse scan. Additionally, although the example scans in
In other examples, the touch system can sense mutual capacitance at the pixelated touch sensor panel (e.g., cross-coupling between two individual pixel electrodes) to generate an image of touch.
In the second configuration, pixel electrode 1608 can be configured as a drive electrode, pixel electrode 1602 can be configured as a sense electrode (e.g., held at a fixed DC voltage), and pixel electrodes 1604 and 1606 can be grounded or coupled to a DC voltage. During a mutual capacitance scan in the second configuration, the sense electrode can sense the signal coupling between the drive electrode and sense electrode to generate a measurement representative of the touch signal for pixel electrode 1602. In the third configuration, pixel electrode 1604 can be configured as a drive electrode, pixel electrode 1606 can be configured as a sense electrode, and pixel electrodes 1602 and 1608 can be grounded or coupled to a DC voltage. During a mutual capacitance scan in the third configuration, the sense electrode can sense the signal coupling between the drive electrode and sense electrode to generate a measurement representative of the touch signal for pixel electrode 1606. In the fourth configuration, pixel electrode 1606 can be configured as a drive electrode, pixel electrode 1604 can be configured as a sense electrode, and pixel electrodes 1602 and 1608 can be grounded or coupled to a DC voltage. During a mutual capacitance scan in the fourth configuration, the sense electrode can sense the signal coupling between the drive electrode and sense electrode to generate a measurement representative of the touch signal for pixel electrode 1604. By performing mutual capacitance scans in the four configurations for pixel electrodes 1602, 1604, 1606 and 1608, the system can generate a measurement for each pixel electrode in the group.
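The four drive/sense configurations for the 2×2 group of pixel electrodes can be tabulated as drive/sense pairs. The data layout below is a sketch; in hardware, each configuration grounds or DC-biases the two remaining electrodes, as described above.

```python
# The four configurations for the 2x2 pixel group (1602, 1604, 1606, 1608).
GROUP = (1602, 1604, 1606, 1608)

# (drive, sense) pairs; each configuration's measurement is associated
# with its sense electrode, per the description above.
CONFIGURATIONS = [
    (1602, 1608),  # first configuration: measurement for 1608
    (1608, 1602),  # second configuration: measurement for 1602
    (1604, 1606),  # third configuration: measurement for 1606
    (1606, 1604),  # fourth configuration: measurement for 1604
]

def grounded_electrodes(drive, sense, group=GROUP):
    """Electrodes neither driven nor sensed are grounded or DC-biased."""
    return tuple(e for e in group if e not in (drive, sense))

# Across the four configurations, every electrode in the group is
# sensed exactly once, yielding one measurement per pixel electrode.
print(sorted(sense for _, sense in CONFIGURATIONS))  # [1602, 1604, 1606, 1608]
```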
In some examples, rather than associating the measurement for each configuration with one of the individual pixel electrodes configured as a sense electrode, the measurements can be representative of the cross-coupling between the corresponding drive and sense electrode for each configuration, respectively. For example, rather than associating the mutual capacitance measurement of the first configuration described above with sense electrode 1608, the mutual capacitance measurement can be associated with drive electrode 1602 and sense electrode 1608.
The mutual capacitance scans described in
Although four configurations are described in
The full panel fine scans, whether mutual capacitance or self-capacitance scans, can waste considerable power when there are no objects touching or in proximity to the panel or when objects are coarsely detected in few regions of the pixelated touch sensor panel. In order to reduce power consumption by the system, the system can perform one or more coarse scans before performing a fine scan such as a full panel scan. Based on the coarse scan, the system can adjust the subsequent scanning (e.g., by reprogramming the subsequent scheduled scan or by aborting one or more steps of the scheduled scan) of the touch sensor panel. Sense channels that are unused due to the adjusted scanning can be powered down to save power. In other examples, rather than aborting scan steps or reprogramming the controller to idle for periods, some scan steps can be repeated (and the results averaged) to increase the signal-to-noise ratio (SNR) for the scan.
The one or more coarse scans can include banked common-mode self-capacitance and/or banked common-mode mutual capacitance scans. In some examples, one coarse scan can adequately provide coarse information to adjust the fine scan. In other examples, information from more than one coarse scan can be used together to identify poorly grounded or floating objects to correct measured touch signals.
The common mode self-capacitance scan can include coupling multiple pixel electrodes to a common sense channel for the scan, for example, as in the common mode super-pixel scan described above. The multiple pixel electrodes coupled to the common sense channels can be considered as a bank of electrodes.
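Forming banks for the common-mode coarse scan amounts to mapping each pixel electrode to one shared sense channel. The sketch below assumes an illustrative 8×8 pixel grid divided into 4×4 banks; the function name and grid dimensions are not from the disclosure.

```python
# Sketch of forming banks for a common-mode self-capacitance coarse scan:
# each bank of pixel electrodes is coupled to one shared sense channel, so
# the whole panel can be coarsely measured in a single scan step.
def assign_banks(rows, cols, bank_rows, bank_cols):
    """Map each pixel (r, c) to the bank/sense channel that will sense it."""
    banks_per_row = cols // bank_cols
    return {
        (r, c): (r // bank_rows) * banks_per_row + (c // bank_cols)
        for r in range(rows)
        for c in range(cols)
    }

banks = assign_banks(rows=8, cols=8, bank_rows=4, bank_cols=4)
# 64 pixel electrodes fold into 4 banks, i.e. only 4 sense channels
# are needed for the coarse step.
print(len(set(banks.values())))  # 4
```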
The common mode mutual capacitance scan can include coupling multiple sense electrodes to a common sense channel.
The bank of pixels for a mutual capacitance or self-capacitance coarse scan can define a region of a pixelated touch sensor panel. The measurement at each sense channel can coarsely represent the location of an object touching or proximate to the region. Although the regions defined by the banks illustrated in
Although
As discussed herein, information from the one or more coarse scans can be used to identify regions (banks) of the pixelated touch sensor panel that detect an object touching or hovering (i.e., touch events) or detect no object. The touch information can be used to determine which scan steps of the scheduled fine scan should be aborted. In some examples, the region (bank) defined by a scan step of a fine scan can correspond to the region (bank) defined by the coarse scan.
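In the simple case where each fine scan step covers the same region as one coarse scan bank, the abort decision reduces to a set-membership test. This is a minimal sketch under that assumption.

```python
# Minimal sketch: decide which scheduled fine scan steps to abort based on
# which coarse scan banks detected a touch event. Assumes each fine scan
# step corresponds to exactly one coarse scan bank, as described above.
def steps_to_abort(scheduled_steps, banks_with_touch):
    """Abort every scheduled step whose bank saw no touch event."""
    return [step for step in scheduled_steps if step not in banks_with_touch]

scheduled = [0, 1, 2, 3, 4, 5]
touched = {2, 4}
print(steps_to_abort(scheduled, touched))  # [0, 1, 3, 5]
```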
In some examples, the system can apply a border region around regions at which objects are detected before determining which scans to abort.
In some examples, the region defined by the scan step and the region defined by the coarse scan can be different.
As discussed above, the system can apply a border region around regions at which objects are detected before determining which scans to abort.
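One way to apply such a border region is to dilate each touched bank to include its immediate neighbors before the abort decision. The grid model and one-bank border width below are illustrative assumptions, not specifics from the disclosure.

```python
# Hedged sketch of applying a one-bank border region around banks at which
# objects were detected, before deciding which fine scan steps to abort.
# Banks are modeled as (row, col) positions in a grid of coarse regions.
def with_border(touched_banks, rows, cols):
    """Expand each touched bank to include its 8 neighbors (clipped to the grid)."""
    expanded = set()
    for r, c in touched_banks:
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    expanded.add((nr, nc))
    return expanded

# A touch in the corner bank keeps its three in-grid neighbors scanned too.
print(sorted(with_border({(0, 0)}, 4, 4)))  # [(0, 0), (0, 1), (1, 0), (1, 1)]
```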
The border region illustrated in
In
Similarly, referring back to
It should be understood that although described as performing part of a fine scan step corresponding to a coarse scan region, in other examples a portion of a fine scan region can be fine scanned during a step (and the sense channels corresponding to the rest of the fine scan region can be powered down) where the portion does not correspond to a coarse scan region. For example, in
As described herein, in some examples, rather than abort scans based on the coarse scan, the system can repeat or duplicate some scan steps to increase SNR for the scan. The SNR improvement can come from averaging results of repeated scan steps or from increasing the duration of a scan step by performing the scan for more than one scan step period. This can be referred to as reallocating scan time to increase SNR of the scans.
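Reallocating scan time can be sketched as follows: steps with no expected touch are dropped and their freed slots are redistributed as repeats of the remaining steps, whose results can then be averaged. The round-robin distribution is an illustrative choice, not a scheme stated in the disclosure.

```python
# Illustrative sketch of reallocating aborted scan-step time: the freed
# slots are reused to repeat the remaining steps for SNR improvement.
def reallocate(scheduled_steps, banks_with_touch):
    keep = [s for s in scheduled_steps if s in banks_with_touch]
    freed = len(scheduled_steps) - len(keep)
    if not keep:
        return []
    # Distribute freed slots round-robin as repeats of the kept steps.
    plan = list(keep)
    for i in range(freed):
        plan.append(keep[i % len(keep)])
    return plan

# 6 scheduled steps, touch only in banks 2 and 4: each kept step runs
# three times, and its three results can be averaged for higher SNR.
print(reallocate([0, 1, 2, 3, 4, 5], {2, 4}))  # [2, 4, 2, 4, 2, 4]
```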
In some examples, the scheduled scan steps can be reordered based on the results of the coarse scan.
Reallocating and/or reordering scheduled scans can increase complexity of the system. Additionally, as discussed herein, in some cases, scans cannot be reordered in the time before execution of steps of the fine scan. As a result, in some examples, the system can reorder and/or reallocate scans when one or more conditions are satisfied. For example, the system can reorder and/or reallocate scan time periods when the coarse scan detects few objects. In some examples, the system can reorder and/or reallocate scan time periods only when a single object (e.g., one finger or one stylus) is detected. In some examples reordering and/or reallocating scan steps can be performed when fewer than a threshold number of coarse or fine scan banks detect a touch event. For example, reordering and/or reallocating can be performed if fewer than 4 fine scan banks detect a touch event, or if fewer than 6 coarse scan banks detect a touch event. Alternatively, the system can reorder and/or reallocate scan steps when touch events are detected at less than a threshold percentage of the panel (e.g., touch events detected in less than 10% of the coarse banks).
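The gating conditions above can be combined into a single predicate. The threshold values (4 fine scan banks, 6 coarse scan banks, 10%) are taken from the examples in the preceding paragraph; combining them as alternatives (logical OR) is an assumption for illustration.

```python
# Sketch of the gating conditions for reordering/reallocating scan steps:
# proceed only when few touch events are detected.
def should_reallocate(fine_banks_touched, coarse_banks_touched, total_coarse_banks):
    if fine_banks_touched < 4:        # fewer than 4 fine scan banks touched
        return True
    if coarse_banks_touched < 6:      # fewer than 6 coarse scan banks touched
        return True
    # Alternative percentage-based condition: less than 10% of coarse banks.
    return coarse_banks_touched / total_coarse_banks < 0.10

print(should_reallocate(2, 8, 20))   # True (fewer than 4 fine banks)
print(should_reallocate(10, 9, 20))  # False (9/20 = 45%, all thresholds exceeded)
```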
The system can also measure SNR using one or more metrics. The system (e.g., hardware or firmware) can then evaluate power savings versus SNR tradeoffs to determine whether to reallocate scan time for improved SNR or to power down channels to save power. The evaluation of SNR can determine how many aborted scan steps to reallocate as duplicate scans in order to obtain sufficient SNR improvement. In this way, the system can increase integration time as needed to gain sufficient SNR while also saving power by abandoning scans.
The determination of which scan steps to execute (or abort) for a fine scan based on the coarse scan results can depend on various considerations beyond the presence or absence of touch in various regions of the touch sensor panel. For example, as discussed above, the determination can also be based on a boundary region around regions at which touch is detected during a coarse scan. Additionally, some regions of the pixelated touch sensor panel can be scanned during the fine scan even if no touches are detected during the coarse scan (i.e., irrespective of the coarse scan results indicating no touch events). For example, some regions of the touch sensor panel can be scanned at all times, periodically, and/or based on the user interface. For example, if a touch screen displays a keyboard, the system can fine scan the portions of the touch sensor panel corresponding to the keyboard, even when no touches are detected at those portions during the coarse scans. In another example, in an audio/video playback application, the system can scan portions of the touch screen corresponding to the playback controls (play/pause, fast forward, rewind, etc.) even if no touches are detected at those portions during the coarse scans.
Additionally, some regions of the touch sensor panel can be scanned, even when no touch is detected at the region, when a touch is detected in a related or linked region. For example, if a touch screen displays a keyboard, if a touch is detected at any portion of the keyboard during the coarse scan, other regions of the touch sensor panel corresponding to the keyboard can be scanned with a fine scan, even if no touch is detected at that region of the keyboard. The related or linked region can be context specific, such as linking regions based on the application in use, or can be based on the state of the device or the type of fine scan to be performed. The decisions regarding which regions to fine scan even without detecting a touch can also be determined based on the user interface, application in use, use of an input device such as a stylus, and a number of touch events detected at the panel in a coarse scan.
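The linked-region behavior above (a touch on any part of the keyboard triggers a fine scan of the whole keyboard) can be sketched with banks grouped into UI-defined sets. The bank numbers and group contents below are hypothetical.

```python
# Hypothetical sketch of UI-linked regions: if any bank in a linked group
# (e.g., an on-screen keyboard) detects a touch during the coarse scan,
# every bank in that group is fine scanned.
def banks_to_fine_scan(touched_banks, linked_groups):
    scan = set(touched_banks)
    for group in linked_groups:
        if scan & group:          # any member of the group was touched
            scan |= group         # fine scan the whole group
    return scan

keyboard = {4, 5, 6, 7}           # banks covering the keyboard UI
playback = {0, 1}                 # banks covering playback controls
print(sorted(banks_to_fine_scan({5}, [keyboard, playback])))  # [4, 5, 6, 7]
```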
In some examples, even if the fine scan step would otherwise be aborted based on the absence of touch events in one or more regions scheduled to be scanned during the fine scan step, the fine scan step can be performed and not aborted. For example, in some cases, when touch is detected in a first region during a coarse scan, a fine scan of a linked or related region can be performed and not aborted, even when no touch event is detected during the coarse scan of the linked or related region. Likewise, some regions of the panel can be scanned during each fine scan or periodically based, for example, on the user interface or state of the device. In such cases, the system can perform and not abort fine scan steps of those regions even when no touch event is detected during the coarse scan of the region.
After identifying which scheduled fine scan steps to abort and which fine scan steps to perform, the system can perform the scheduled fine scan steps that were identified to be performed at their respective scheduled times and abort the scheduled fine scan steps that were identified to be aborted (2310). The sense/receive channels assigned to sense during the aborted scan steps can be powered down during the time of the aborted fine scan steps (2315).
Rather than simply aborting scan steps as in the process described with respect to
Therefore, according to the above, some examples of the disclosure are directed to an apparatus comprising a touch sensor panel and processing circuitry. The processing circuitry can be capable of scheduling a coarse scan and a fine scan to be executed during a scan frame, executing the coarse scan, and aborting at least a portion of the fine scan when no touch event is detected during execution of the coarse scan. Additionally or alternatively to one or more of the examples disclosed above, the touch sensor panel can comprise drive lines and sense lines. Additionally or alternatively to one or more of the examples disclosed above, executing the coarse scan can comprise stimulating a bank comprising a plurality of the drive lines with a common mode stimulation signal, receiving sense signals at one or more of the sense lines, the sense signals generated in response to the common mode stimulation signal, and generating at least one touch value for the bank. Additionally or alternatively to one or more of the examples disclosed above, executing the fine scan can comprise stimulating, in one or more steps, the drive lines with stimulation signals, receiving sense signals at the sense lines, the sense signals generated in response to the stimulation signals applied to the drive lines, and generating a touch value for a plurality of touch sensing nodes of the touch sensor panel, each of the plurality of touch sensing nodes measuring the mutual capacitance between a drive line and a sense line. 
Additionally or alternatively to one or more of the examples disclosed above, executing the fine scan can comprise stimulating, in one or more steps, the drive lines of one or more banks, but fewer than all banks, with stimulation signals, each bank comprising a plurality of the drive lines, at least one of the one or more banks having detected a touch event during the coarse scan, receiving sense signals at the sense lines, the sense signals generated in response to the stimulation signals applied to the drive lines of the one or more banks, and generating a touch value for a plurality of touch sensing nodes corresponding to the one or more banks, each touch sensing node measuring the mutual capacitance between a drive line and a sense line. Additionally or alternatively to one or more of the examples disclosed above, the processing circuitry can be further capable of generating an abort command, the abort command preventing or terminating execution of the fine scan. Additionally or alternatively to one or more of the examples disclosed above, the processing circuitry can be further capable of generating one or more abort commands, the one or more abort commands preventing or terminating execution of one or more portions of the fine scan. Additionally or alternatively to one or more of the examples disclosed above, the processing circuitry can be further capable of discarding scan results from an aborted fine scan.
Other examples of the disclosure are directed to an apparatus comprising a touch sensor panel and processing circuitry. The processing circuitry can be capable of performing, during a first touch sensing frame, a first coarse scan and determining an operation for a second touch sensing frame based on a result of the first coarse scan. The processing circuitry can be further capable of performing, during the second touch sensing frame, a second coarse scan, and performing, during the second touch sensing frame, the determined operation. Additionally or alternatively to one or more of the examples disclosed above, the processing circuitry can comprise a touch controller capable of performing scanning operations, and a portion of the processing circuitry can be capable of programming the touch controller to perform the determined operation during the second touch sensing frame. Additionally or alternatively to one or more of the examples disclosed above, the first and second coarse scans can be banked common mode scans. Additionally or alternatively to one or more of the examples disclosed above, determining the operation for the second touch sensing frame based on the result of the first coarse scan can comprise determining an idle operation when the results of the first coarse scan indicate no touch event was detected. Additionally or alternatively to one or more of the examples disclosed above, determining the operation for the second touch sensing frame based on the result of the first coarse scan can comprise determining a full panel scan operation when the results of the first coarse scan indicate one or more touch events detected at at least a threshold number of banks of the touch sensor panel. 
Additionally or alternatively to one or more of the examples disclosed above, determining the operation for the second touch sensing frame based on the result of the first coarse scan can comprise determining a partial panel scan operation when the results of the first coarse scan indicate touch events detected at fewer than a threshold number of banks of the touch sensor panel.
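The idle/partial/full decision described in these examples can be sketched as a small policy function. The threshold value below is an illustrative assumption; the disclosure specifies only that it is some threshold number of banks.

```python
# Sketch of choosing the next frame's operation from coarse scan results:
# idle when nothing is touched, a partial panel scan when fewer than a
# threshold number of banks detect touch, a full panel scan otherwise.
def next_frame_operation(banks_with_touch, threshold=3):
    if not banks_with_touch:
        return "idle"
    if len(banks_with_touch) < threshold:
        return "partial_panel_scan"
    return "full_panel_scan"

print(next_frame_operation(set()))          # idle
print(next_frame_operation({1, 2}))         # partial_panel_scan
print(next_frame_operation({0, 1, 2, 3}))   # full_panel_scan
```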
Other examples of the disclosure are directed to an apparatus comprising a touch sensor panel and processing circuitry. The processing circuitry can be capable of performing a first detection scan to identify a location of an object touching or proximate to the touch sensor panel with a coarse resolution, and in response to identifying the coarse location of the object, reconfiguring a connection between electrodes of the touch sensor panel and sense channels of the processing circuitry and performing a second detection scan to identify the location of the object with a fine resolution. Additionally or alternatively to one or more of the examples disclosed above, during the first detection scan a plurality of super-pixel electrodes can be formed, each super-pixel formed by coupling together a plurality of pixel electrodes of the touch sensor panel. Touch values can be generated for the plurality of super-pixel electrodes. Additionally or alternatively to one or more of the examples disclosed above, identifying the location of the object with the coarse resolution can comprise identifying a super-pixel electrode corresponding to a maximum generated touch value. Additionally or alternatively to one or more of the examples disclosed above, identifying the location of the object with the coarse resolution can comprise identifying a region of the super-pixel electrode based on touch values of super-pixel electrodes adjacent to the super-pixel electrode corresponding to the maximum generated touch value. Additionally or alternatively to one or more of the examples disclosed above, the second detection scan can sense self-capacitance or mutual capacitance of individual pixel electrodes at the location identified with coarse resolution. 
Additionally or alternatively to one or more of the examples disclosed above, the second detection scan can sense self-capacitance or mutual capacitance of rows of individual pixel electrodes at the location identified with coarse resolution. Additionally or alternatively to one or more of the examples disclosed above, the second detection scan can sense self-capacitance or mutual capacitance of columns of individual pixel electrodes at the location identified with coarse resolution. Additionally or alternatively to one or more of the examples disclosed above, the second detection scan can sense self-capacitance or mutual capacitance of pixel electrodes adjacent to the location identified with coarse resolution. Additionally or alternatively to one or more of the examples disclosed above, the number of super-pixel electrodes can correspond to the number of available sense channels of the processing circuitry.
Other examples of the disclosure are directed to a method executed by one or more processing circuits. The method can comprise scheduling a coarse scan and a fine scan to be executed during a scan frame, executing the coarse scan, and aborting at least a portion of the fine scan when no touch event is detected during execution of the coarse scan. Additionally or alternatively to one or more of the examples disclosed above, the coarse scan can be a banked common mode scan, wherein drive lines of the touch sensor panel are divided between a plurality of banks, and banks are stimulated with a common mode stimulation signal. Additionally or alternatively to one or more of the examples disclosed above, the fine scan can be a full panel scan. Additionally or alternatively to one or more of the examples disclosed above, the scan results from the fine scan can be discarded or ignored.
Other examples of the disclosure are directed to a method executed by one or more processing circuits. The method can comprise performing, during a first touch sensing frame, a first coarse scan, determining an operation for a second touch sensing frame based on a result of the first coarse scan, performing, during the second touch sensing frame, a second coarse scan, and performing, during the second touch sensing frame, the determined operation. Additionally or alternatively to one or more of the examples disclosed above, the method can further comprise programming a touch controller to perform the determined operation during the second touch sensing frame. Additionally or alternatively to one or more of the examples disclosed above, the first and second coarse scans can be banked common mode scans. Additionally or alternatively to one or more of the examples disclosed above, determining the operation for the second touch sensing frame based on the result of the first coarse scan can comprise determining an idle operation when the results of the first coarse scan indicate no touch event was detected. Additionally or alternatively to one or more of the examples disclosed above, determining the operation for the second touch sensing frame based on the result of the first coarse scan can comprise determining a full panel scan operation when the results of the first coarse scan indicate one or more touch events detected at at least a threshold number of banks of a touch sensor panel. Additionally or alternatively to one or more of the examples disclosed above, determining the operation for the second touch sensing frame based on the result of the first coarse scan can comprise determining a partial panel scan operation when the results of the first coarse scan indicate touch events detected at fewer than a threshold number of banks of the touch sensor panel.
Other examples of the disclosure are directed to a method executed by one or more processing circuits. The method can comprise performing a first detection scan to identify a location of an object touching or proximate to the touch sensor panel with a coarse resolution, and in response to identifying the coarse location of the object, reconfiguring a connection between electrodes of the touch sensor panel and sense channels of the processing circuitry and performing a second detection scan to identify the location of the object with a fine resolution. Additionally or alternatively to one or more of the examples disclosed above, the method can further comprise: during the first detection scan, forming a plurality of super-pixel electrodes, each super-pixel formed by coupling together a plurality of pixel electrodes of the touch sensor panel, and generating touch values for the plurality of super-pixel electrodes. Additionally or alternatively to one or more of the examples disclosed above, identifying the location of the object with the coarse resolution can comprise identifying a super-pixel electrode corresponding to a maximum generated touch value. Additionally or alternatively to one or more of the examples disclosed above, identifying the location of the object with the coarse resolution can comprise identifying a region of the super-pixel electrode based on touch values of super-pixel electrodes adjacent to the super-pixel electrode corresponding to the maximum generated touch value. Additionally or alternatively to one or more of the examples disclosed above, the second detection scan can sense self-capacitance or mutual capacitance of individual pixel electrodes at the location identified with coarse resolution. Additionally or alternatively to one or more of the examples disclosed above, the second detection scan can sense self-capacitance or mutual capacitance of rows of individual pixel electrodes at the location identified with coarse resolution. 
Additionally or alternatively to one or more of the examples disclosed above, the second detection scan can sense self-capacitance or mutual capacitance of columns of individual pixel electrodes at the location identified with coarse resolution. Additionally or alternatively to one or more of the examples disclosed above, the second detection scan can sense self-capacitance or mutual capacitance of pixel electrodes adjacent to the location identified with coarse resolution. Additionally or alternatively to one or more of the examples disclosed above, the number of super-pixel electrodes can correspond to the number of available sense channels of the processing circuitry.
Some examples of the disclosure are directed to a non-transitory computer readable storage medium. The computer readable medium can contain instructions that, when executed by a processor, can perform a method. The method can comprise scheduling a coarse scan and a fine scan to be executed during a scan frame, executing the coarse scan, and aborting at least a portion of the fine scan when no touch event is detected during execution of the coarse scan. Additionally or alternatively to one or more of the examples disclosed above, the coarse scan can be a banked common mode scan, wherein drive lines of the touch sensor panel are divided between a plurality of banks, and banks are stimulated with a common mode stimulation signal. Additionally or alternatively to one or more of the examples disclosed above, the fine scan can be a full panel scan. Additionally or alternatively to one or more of the examples disclosed above, the scan results from the fine scan can be discarded or ignored.
Some examples of the disclosure are directed to a non-transitory computer readable storage medium. The computer readable medium can contain instructions that, when executed by a processor, can perform a method. The method can comprise performing, during a first touch sensing frame, a first coarse scan, determining an operation for a second touch sensing frame based on a result of the first coarse scan, performing, during the second touch sensing frame, a second coarse scan, and performing, during the second touch sensing frame, the determined operation. Additionally or alternatively to one or more of the examples disclosed above, the method can further comprise programming a touch controller to perform the determined operation during the second touch sensing frame. Additionally or alternatively to one or more of the examples disclosed above, the first and second coarse scans can be banked common mode scans. Additionally or alternatively to one or more of the examples disclosed above, determining the operation for the second touch sensing frame based on the result of the first coarse scan can comprise determining an idle operation when the results of the first coarse scan indicate no touch event was detected. Additionally or alternatively to one or more of the examples disclosed above, determining the operation for the second touch sensing frame based on the result of the first coarse scan can comprise determining a full panel scan operation when the results of the first coarse scan indicate one or more touch events detected at at least a threshold number of banks of a touch sensor panel. Additionally or alternatively to one or more of the examples disclosed above, determining the operation for the second touch sensing frame based on the result of the first coarse scan can comprise determining a partial panel scan operation when the results of the first coarse scan indicate touch events detected at fewer than a threshold number of banks of the touch sensor panel.
Some examples of the disclosure are directed to a non-transitory computer readable storage medium. The computer readable medium can contain instructions that, when executed by a processor, can perform a method. The method can comprise performing a first detection scan to identify a location of an object touching or proximate to a touch sensor panel with a coarse resolution, and in response to identifying the coarse location of the object, reconfiguring a connection between electrodes of the touch sensor panel and sense channels of processing circuitry and performing a second detection scan to identify the location of the object with a fine resolution.
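The two-stage detection scan can be modeled in a few lines. In this sketch (a simplified model under assumed names, not the disclosed circuitry), the first scan resolves the object only to a bank of electrodes; the sense channels are then notionally reconnected to that bank's individual electrodes, and the second scan resolves the touched electrode itself.

```python
# Sketch (hypothetical model) of a coarse-then-fine detection scan with
# sense-channel reconfiguration between the two stages.

def detect(electrode_signals, bank_size=4, threshold=10):
    """Returns the global index of the touched electrode, or None."""
    # Stage 1 (coarse): each sense channel reads one bank, i.e. the
    # summed signal of bank_size adjacent electrodes.
    banks = [electrode_signals[i:i + bank_size]
             for i in range(0, len(electrode_signals), bank_size)]
    coarse = next((i for i, b in enumerate(banks) if sum(b) > threshold), None)
    if coarse is None:
        return None  # no object detected; skip the fine stage
    # Stage 2 (fine): reconfigure the channels to the individual
    # electrodes of the identified bank and pick the strongest signal.
    fine = max(range(bank_size), key=lambda j: banks[coarse][j])
    return coarse * bank_size + fine
```

The design point is that only one bank's worth of electrodes is ever scanned at fine resolution, rather than the whole panel.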
Therefore, according to the above, some examples of the disclosure are directed to an apparatus (e.g., a touch controller). The apparatus can comprise sense channels and processing circuitry. The sense channels can be configured to be coupled to one or more electrodes of a pixelated touch sensor panel. The processing circuitry can be capable of scheduling one or more coarse scans of the pixelated touch sensor panel and a fine scan of the pixelated touch sensor panel, executing the one or more coarse scans, identifying one or more first banks of a plurality of first banks of electrodes of the pixelated touch sensor panel at which no touch event is detected by execution of the one or more coarse scans, determining one or more steps of the fine scan to abort based on at least the one or more first banks at which no touch event is detected by the execution of the one or more coarse scans, and aborting the determined one or more steps of the fine scan. Additionally or alternatively to one or more of the examples disclosed above, one or more of the sense channels unused during an aborted fine scan step can be powered down. Additionally or alternatively to one or more of the examples disclosed above, the one or more coarse scans can include a self-capacitance scan. During a self-capacitance scan each of a plurality of the sense channels can be coupled to one of the plurality of first banks of electrodes of the pixelated touch sensor panel, such that the self-capacitance scan can coarsely measure self-capacitance for the pixelated sensor panel in one scan step. Additionally or alternatively to one or more of the examples disclosed above, during each of the one or more steps of the fine scan, the sense channels can be coupled to individual electrodes of one of a plurality of second banks of electrodes. 
Additionally or alternatively to one or more of the examples disclosed above, the determining one or more steps of the fine scan to abort can include determining to abort a step of the fine scan when no touch event is detected during the execution of the one or more coarse scans at electrodes of a second bank to be scanned during the step of the fine scan. Additionally or alternatively to one or more of the examples disclosed above, the processing circuitry can be further capable of identifying one or more first banks of a plurality of first banks of electrodes of the pixelated touch sensor panel at which a touch event is detected by the execution of the one or more coarse scans, and identifying one or more electrodes proximate to and outside the one or more first banks at which the event is detected as having detected the touch event. Additionally or alternatively to one or more of the examples disclosed above, the processing circuitry can be further capable of identifying one or more first banks of a plurality of first banks of electrodes of the pixelated touch sensor panel at which a touch event is detected by the execution of the one or more coarse scans, and identifying one or more electrodes linked to the one or more first banks at which the event is detected as having detected the touch event. Additionally or alternatively to one or more of the examples disclosed above, determining the one or more steps of the fine scan to abort can include determining to not abort a step of the fine scan irrespective of the one or more coarse scans based on a state of the device. Additionally or alternatively to one or more of the examples disclosed above, the processing circuitry can be further capable of replacing an aborted step of the fine scan with a duplicate of a non-aborted scan step of the fine scan. 
Additionally or alternatively to one or more of the examples disclosed above, the processing circuitry can be further capable of replacing an aborted step of the fine scan with a non-aborted scan step of the fine scan.
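The per-step abort decision described across the paragraphs above can be sketched as follows. The names and the `forced_steps` device-state override are illustrative assumptions: a fine-scan step is aborted when the coarse scan detected no touch event at any electrode of the second bank that step would scan, unless the device state requires the step to run regardless.

```python
# Sketch (hypothetical names) of selecting which fine-scan steps to abort
# based on the coarse-scan result and a device-state override.

def steps_to_abort(fine_steps, touched_first_banks, bank_map, forced_steps=()):
    """fine_steps: list of second-bank ids, one per fine-scan step.
    touched_first_banks: first banks at which the coarse scan saw a touch.
    bank_map: second-bank id -> set of first-bank ids covering its electrodes.
    forced_steps: step indices the device state requires to run anyway."""
    aborted = []
    for step, second_bank in enumerate(fine_steps):
        if step in forced_steps:
            continue  # device state overrides the coarse-scan result
        if bank_map[second_bank].isdisjoint(touched_first_banks):
            # No overlap with any touched first bank: abort this step.
            # The sense channels this step would use can be powered down,
            # or the step slot reused for a duplicate of a kept step.
            aborted.append(step)
    return aborted
```

For instance, with three steps covering first banks {0}, {1}, and {2} and a touch detected only at first bank 1, steps 0 and 2 are aborted and their sense channels can be powered down.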
Some examples of the disclosure are directed to a method for reducing power consumption of touch scanning operations for a device including a pixelated touch sensor panel. The method can comprise scheduling one or more coarse scans of the pixelated touch sensor panel and a fine scan of the pixelated touch sensor panel, executing the one or more coarse scans, identifying one or more first banks of a plurality of first banks of electrodes of the pixelated touch sensor panel at which no touch event is detected by execution of the one or more coarse scans, determining one or more steps of the fine scan to abort based on at least the one or more first banks at which no touch event is detected by the execution of the one or more coarse scans, and aborting the determined one or more steps of the fine scan. Additionally or alternatively to one or more of the examples disclosed above, the method can further comprise powering down one or more sense channels unused during an aborted fine scan step. Additionally or alternatively to one or more of the examples disclosed above, the one or more coarse scans can include a self-capacitance scan that measures a self-capacitance for each of the plurality of first banks of electrodes such that the pixelated sensor panel can be coarsely scanned in one scan step. Additionally or alternatively to one or more of the examples disclosed above, each step of the fine scan scans electrodes of one of a plurality of second banks of electrodes. Additionally or alternatively to one or more of the examples disclosed above, determining the one or more steps of the fine scan to abort can include determining to abort a step of the fine scan when no touch event is detected during the execution of the one or more coarse scans at electrodes of a second bank to be scanned during the step of the fine scan. 
Additionally or alternatively to one or more of the examples disclosed above, the method can further comprise identifying one or more first banks of a plurality of first banks of electrodes of the pixelated touch sensor panel at which a touch event is detected by the execution of the one or more coarse scans, and identifying one or more electrodes proximate to and outside the one or more first banks at which the event is detected as having detected the touch event. Additionally or alternatively to one or more of the examples disclosed above, the method can further comprise identifying one or more first banks of a plurality of first banks of electrodes of the pixelated touch sensor panel at which a touch event is detected by the execution of the one or more coarse scans, and identifying one or more electrodes linked to the one or more first banks at which the event is detected as having detected the touch event. Additionally or alternatively to one or more of the examples disclosed above, the method can further comprise replacing an aborted step of the fine scan with a duplicate of a non-aborted scan step of the fine scan. Additionally or alternatively to one or more of the examples disclosed above, the method can further comprise replacing an aborted step of the fine scan with a non-aborted scan step of the fine scan. Additionally or alternatively to one or more of the examples disclosed above, determining the one or more steps of the fine scan to abort can include determining to not abort a step of the fine scan irrespective of the one or more coarse scans based on a state of the device.
Some examples of the disclosure are directed to a non-transitory computer readable storage medium. The computer readable medium can contain instructions that, when executed by a processor, can perform a method. The method can comprise scheduling one or more coarse scans of the pixelated touch sensor panel and a fine scan of the pixelated touch sensor panel, executing the one or more coarse scans, identifying one or more first banks of a plurality of first banks of electrodes of the pixelated touch sensor panel at which no touch event is detected by execution of the one or more coarse scans, determining one or more steps of the fine scan to abort based on at least the one or more first banks at which no touch event is detected by the execution of the one or more coarse scans, and aborting the determined one or more steps of the fine scan. Additionally or alternatively to one or more of the examples disclosed above, the method can further comprise powering down one or more sense channels unused during an aborted fine scan step. Additionally or alternatively to one or more of the examples disclosed above, the one or more coarse scans can include a self-capacitance scan that measures a self-capacitance for each of the plurality of first banks of electrodes such that the pixelated sensor panel can be coarsely scanned in one scan step. Additionally or alternatively to one or more of the examples disclosed above, each step of the fine scan scans electrodes of one of a plurality of second banks of electrodes. Additionally or alternatively to one or more of the examples disclosed above, determining the one or more steps of the fine scan to abort can include determining to abort a step of the fine scan when no touch event is detected during the execution of the one or more coarse scans at electrodes of a second bank to be scanned during the step of the fine scan. 
Additionally or alternatively to one or more of the examples disclosed above, the method can further comprise identifying one or more first banks of a plurality of first banks of electrodes of the pixelated touch sensor panel at which a touch event is detected by the execution of the one or more coarse scans, and identifying one or more electrodes proximate to and outside the one or more first banks at which the event is detected as having detected the touch event. Additionally or alternatively to one or more of the examples disclosed above, the method can further comprise identifying one or more first banks of a plurality of first banks of electrodes of the pixelated touch sensor panel at which a touch event is detected by the execution of the one or more coarse scans, and identifying one or more electrodes linked to the one or more first banks at which the event is detected as having detected the touch event. Additionally or alternatively to one or more of the examples disclosed above, the method can further comprise replacing an aborted step of the fine scan with a duplicate of a non-aborted scan step of the fine scan. Additionally or alternatively to one or more of the examples disclosed above, the method can further comprise replacing an aborted step of the fine scan with a non-aborted scan step of the fine scan. Additionally or alternatively to one or more of the examples disclosed above, determining the one or more steps of the fine scan to abort can include determining to not abort a step of the fine scan irrespective of the one or more coarse scans based on a state of the device.
Although the disclosed examples have been fully described with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. Such changes and modifications are to be understood as being included within the scope of the disclosed examples as defined by the appended claims.
This application claims benefit of U.S. Provisional Patent Application No. 62/087,792, filed Dec. 4, 2014, the entire disclosure of which is incorporated herein by reference for all purposes.
20110007029 | Ben-David | Jan 2011 | A1 |
20110043489 | Yoshimoto | Feb 2011 | A1 |
20110063993 | Wilson et al. | Mar 2011 | A1 |
20110084857 | Marino et al. | Apr 2011 | A1 |
20110084937 | Chang et al. | Apr 2011 | A1 |
20110090146 | Katsurahira | Apr 2011 | A1 |
20110090181 | Maridakis | Apr 2011 | A1 |
20110153263 | Oda et al. | Jun 2011 | A1 |
20110155479 | Oda et al. | Jun 2011 | A1 |
20110157068 | Parker et al. | Jun 2011 | A1 |
20110169771 | Fujioka et al. | Jul 2011 | A1 |
20110175834 | Han et al. | Jul 2011 | A1 |
20110193776 | Oda et al. | Aug 2011 | A1 |
20110216016 | Rosener | Sep 2011 | A1 |
20110216032 | Oda et al. | Sep 2011 | A1 |
20110254807 | Perski et al. | Oct 2011 | A1 |
20110273398 | Ho et al. | Nov 2011 | A1 |
20110304577 | Brown et al. | Dec 2011 | A1 |
20110304592 | Booth et al. | Dec 2011 | A1 |
20120013555 | Takami et al. | Jan 2012 | A1 |
20120019488 | McCarthy | Jan 2012 | A1 |
20120050207 | Westhues et al. | Mar 2012 | A1 |
20120050216 | Kremin et al. | Mar 2012 | A1 |
20120056822 | Wilson et al. | Mar 2012 | A1 |
20120062497 | Rebeschi et al. | Mar 2012 | A1 |
20120062500 | Miller et al. | Mar 2012 | A1 |
20120068964 | Wright et al. | Mar 2012 | A1 |
20120086664 | Leto | Apr 2012 | A1 |
20120105357 | Li et al. | May 2012 | A1 |
20120105361 | Kremin et al. | May 2012 | A1 |
20120105362 | Kremin et al. | May 2012 | A1 |
20120146958 | Oda et al. | Jun 2012 | A1 |
20120154295 | Hinckley et al. | Jun 2012 | A1 |
20120154340 | Vuppu et al. | Jun 2012 | A1 |
20120182259 | Han | Jul 2012 | A1 |
20120212421 | Honji | Aug 2012 | A1 |
20120242603 | Engelhardt et al. | Sep 2012 | A1 |
20120274580 | Sobel et al. | Nov 2012 | A1 |
20120293464 | Adhikari | Nov 2012 | A1 |
20120320000 | Takatsuka | Dec 2012 | A1 |
20120327040 | Simon | Dec 2012 | A1 |
20120327041 | Harley | Dec 2012 | A1 |
20130021294 | Maharyta et al. | Jan 2013 | A1 |
20130027361 | Perski et al. | Jan 2013 | A1 |
20130033461 | Silverbrook | Feb 2013 | A1 |
20130069905 | Krah et al. | Mar 2013 | A1 |
20130088465 | Geller et al. | Apr 2013 | A1 |
20130100071 | Wright | Apr 2013 | A1 |
20130106722 | Shahparnia et al. | May 2013 | A1 |
20130113707 | Perski et al. | May 2013 | A1 |
20130127757 | Mann et al. | May 2013 | A1 |
20130141342 | Bokma et al. | Jun 2013 | A1 |
20130155007 | Huang et al. | Jun 2013 | A1 |
20130176273 | Li et al. | Jul 2013 | A1 |
20130176274 | Sobel et al. | Jul 2013 | A1 |
20130207938 | Ryshtun et al. | Aug 2013 | A1 |
20130215049 | Lee | Aug 2013 | A1 |
20130257793 | Zeliff et al. | Oct 2013 | A1 |
20140028576 | Shahparnia | Jan 2014 | A1 |
20140028607 | Tan | Jan 2014 | A1 |
20140077827 | Seguine | Mar 2014 | A1 |
20140132556 | Huang | May 2014 | A1 |
20140146009 | Huang | May 2014 | A1 |
20140168142 | Sasselli et al. | Jun 2014 | A1 |
20140168143 | Hotelling et al. | Jun 2014 | A1 |
20140184554 | Walley | Jul 2014 | A1 |
20140253462 | Hicks | Sep 2014 | A1 |
20140253469 | Hicks et al. | Sep 2014 | A1 |
20140267071 | Shahparnia | Sep 2014 | A1 |
20140267075 | Shahparnia et al. | Sep 2014 | A1 |
20140267184 | Bathiche et al. | Sep 2014 | A1 |
20140347311 | Joharapurkar et al. | Nov 2014 | A1 |
20140375612 | Hotelling et al. | Dec 2014 | A1 |
20150022485 | Chen et al. | Jan 2015 | A1 |
20150035768 | Shahparnia et al. | Feb 2015 | A1 |
20150035769 | Shahparnia | Feb 2015 | A1 |
20150035797 | Shahparnia | Feb 2015 | A1 |
20150103049 | Harley et al. | Apr 2015 | A1 |
20150177868 | Morein et al. | Jun 2015 | A1 |
20150338950 | Ningrat et al. | Nov 2015 | A1 |
20160077667 | Chiang et al. | Mar 2016 | A1 |
20160162101 | Pant et al. | Jun 2016 | A1 |
20160179281 | Krah et al. | Jun 2016 | A1 |
20160357343 | Falkenburg et al. | Dec 2016 | A1 |
20160378220 | Westhues et al. | Dec 2016 | A1 |
20170097695 | Ribeiro et al. | Apr 2017 | A1 |
20170115816 | Chang | Apr 2017 | A1 |
20170344174 | Pant et al. | Nov 2017 | A1 |
Number | Date | Country |
---|---|---|
1243282 | Feb 2000 | CN |
1278348 | Dec 2000 | CN |
1518723 | Aug 2004 | CN |
201329722 | Oct 2009 | CN |
101393488 | Oct 2010 | CN |
201837984 | May 2011 | CN |
036 02 796 | Aug 1987 | DE |
197 20 925 | Dec 1997 | DE |
0 306 596 | Mar 1989 | EP |
0 366 913 | May 1990 | EP |
0 384 509 | Aug 1990 | EP |
0 426 362 | May 1991 | EP |
0 426 469 | May 1991 | EP |
0 464 908 | Jan 1992 | EP |
0 488 455 | Jun 1992 | EP |
0 490 683 | Jun 1992 | EP |
0 491 436 | Jun 1992 | EP |
0 509 589 | Oct 1992 | EP |
0 545 709 | Jun 1993 | EP |
0 572 009 | Dec 1993 | EP |
0 572 182 | Dec 1993 | EP |
0 587 236 | Mar 1994 | EP |
0 601 837 | Jun 1994 | EP |
0 618 527 | Oct 1994 | EP |
0 633 542 | Jan 1995 | EP |
0 762 319 | Mar 1997 | EP |
0 770 971 | May 1997 | EP |
0 962 881 | Dec 1999 | EP |
1 022 675 | Jul 2000 | EP |
1 128 170 | Aug 2001 | EP |
1 884 863 | Feb 2008 | EP |
2 040 149 | Mar 2009 | EP |
2 172 834 | Apr 2010 | EP |
2 221 659 | Aug 2010 | EP |
2 660 689 | Nov 2013 | EP |
55-074635 | Jun 1980 | JP |
57-203129 | Dec 1982 | JP |
60-179823 | Sep 1985 | JP |
64-006927 | Jan 1989 | JP |
64-040004 | Feb 1989 | JP |
1-196620 | Aug 1989 | JP |
2-182581 | Jul 1990 | JP |
2-211421 | Aug 1990 | JP |
5-019233 | Jan 1993 | JP |
5-173707 | Jul 1993 | JP |
05-243547 | Sep 1993 | JP |
8-166849 | Jun 1996 | JP |
9-001279 | Jan 1997 | JP |
9-185457 | Jul 1997 | JP |
9-231002 | Sep 1997 | JP |
9-274537 | Oct 1997 | JP |
10-027068 | Jan 1998 | JP |
10-040004 | Feb 1998 | JP |
10-133817 | May 1998 | JP |
10-133819 | May 1998 | JP |
10-186136 | Jul 1998 | JP |
10-198515 | Jul 1998 | JP |
11-110110 | Apr 1999 | JP |
11-242562 | Sep 1999 | JP |
2000-020241 | Jan 2000 | JP |
2000-163031 | Jun 2000 | JP |
2002-342033 | Nov 2002 | JP |
2005-129948 | May 2005 | JP |
2005-352490 | Dec 2005 | JP |
2009-054141 | Mar 2009 | JP |
10-2013-0028360 | Mar 2013 | KR |
10-2013-0109207 | Oct 2013 | KR |
200743986 | Dec 2007 | TW |
200925944 | Jun 2009 | TW |
201115414 | May 2011 | TW |
201118682 | Jun 2011 | TW |
201324242 | Jun 2013 | TW |
201419103 | May 2014 | TW |
201504874 | Feb 2015 | TW |
WO-9740488 | Oct 1997 | WO |
WO-9921160 | Apr 1999 | WO |
WO-9922338 | May 1999 | WO |
WO-0145283 | Jun 2001 | WO |
WO-2006104214 | Oct 2006 | WO |
WO-2007145346 | Dec 2007 | WO |
WO-2007145347 | Dec 2007 | WO |
WO-2008018201 | Feb 2008 | WO |
WO-2008044368 | Apr 2008 | WO |
WO-2008044369 | Apr 2008 | WO |
WO-2008044370 | Apr 2008 | WO |
WO-2008044371 | Apr 2008 | WO |
WO-2008047677 | Apr 2008 | WO |
WO-2009081810 | Jul 2009 | WO |
WO-2011008533 | Jan 2011 | WO |
WO-2012177567 | Dec 2012 | WO |
WO-2012177569 | Dec 2012 | WO |
WO-2012177571 | Dec 2012 | WO |
WO-2012177573 | Dec 2012 | WO |
WO-2014018233 | Jan 2014 | WO |
WO-2014143430 | Sep 2014 | WO |
WO-2015017196 | Feb 2015 | WO |
Entry |
---|
Non-Final Office Action dated Jul. 28, 2016, for U.S. Appl. No. 13/560,963, filed Jul. 27, 2012, twelve pages. |
Notice of Allowance dated Aug. 10, 2016, for U.S. Appl. No. 14/578,051, filed Dec. 19, 2014, seven pages. |
Lee, S.K. et al. (Apr. 1985). “A Multi-Touch Three Dimensional Touch-Sensitive Tablet,” Proceedings of CHI: ACM Conference on Human Factors in Computing Systems, pp. 21-25. |
Rubine, D.H. (Dec. 1991). “The Automatic Recognition of Gestures,” CMU-CS-91-202, Submitted in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy in Computer Science at Carnegie Mellon University, 285 pages. |
Rubine, D.H. (May 1992). “Combining Gestures and Direct Manipulation,” CHI '92, pp. 659-660. |
Westerman, W. (Spring 1999). “Hand Tracking, Finger Identification, and Chordic Manipulation on a Multi-Touch Surface,” A Dissertation Submitted to the Faculty of the University of Delaware in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy in Electrical Engineering, 364 pages. |
Abileah, A. et al. (2004). “59.3: Integrated Optical Touch Panel in a 14.1″ AMLCD,” SID '04 Digest (Seattle) pp. 1544-1547. |
Abileah, A. et al. (2006). “9.3: Optical Sensors Embedded within AMLCD Panel: Design and Applications,” ADEAC '06, SID (Atlanta) pp. 102-105. |
Abileah, A. et al. (2007). “Optical Sensors Embedded within AMLCD Panel: Design and Applications,” Siggraph-07, San Diego, 5 pages. |
Anonymous. (2002). “Biometric Smart Pen Project,” located at http://www.biometricsmartpen.de/ . . . , last visited Apr. 19, 2011, one page. |
Bobrov, Y. et al. (2002). “5.2 Manufacturing of a Thin-Film LCD,” Optiva, Inc., San Francisco, CA. 4 pages. |
Brown, C. et al. (2007). “7.2: A 2.6 inch VGA LCD with Optical Input Function using a 1-Transistor Active-Pixel Sensor,” ISSCC 2007 pp. 132-133, 592. |
Den Boer, W. et al. (2003). “56.3: Active Matrix LCD with Integrated Optical Touch Screen,” SID'03 Digest (Baltimore) pp. 1-4. |
Chinese Search Report dated Sep. 6, 2015, for CN Application No. CN 201280030349.9, with English translation, six pages. |
Chinese Search Report dated Oct. 23, 2015, for CN Application No. CN 201280030351.6, with English translation, four pages. |
Echtler, F. et al. (Jan. 2010). “An LED-based Multitouch Sensor for LCD Screens,” Cambridge, MA: ACM, 4 pages. |
Final Office Action dated Mar. 4, 2004, for U.S. Appl. No. 10/217,798, filed Aug. 12, 2002, 17 pages. |
Final Office Action dated Jan. 21, 2005, for U.S. Appl. No. 10/329,217, filed Dec. 23, 2002, 13 pages. |
Final Office Action dated Aug. 9, 2005, for U.S. Appl. No. 10/442,433, filed May 20, 2003, six pages. |
Final Office Action dated Aug. 23, 2005, for U.S. Appl. No. 10/217,798, filed Aug. 12, 2002, 10 pages. |
Final Office Action dated Dec. 13, 2005, for U.S. Appl. No. 10/371,413, filed Feb. 20, 2003, six pages. |
Final Office Action dated May 23, 2007, for U.S. Appl. No. 11/137,753, filed May 25, 2005, 11 pages. |
Final Office Action dated Oct. 18, 2007, for U.S. Appl. No. 11/351,098, filed Feb. 8, 2006, six pages. |
Final Office Action dated Oct. 31, 2007, for U.S. Appl. No. 10/217,798, filed Aug. 12, 2002, nine pages. |
Final Office Action dated Mar. 24, 2009, for U.S. Appl. No. 11/351,098, filed Feb. 8, 2006, 10 pages. |
Final Office Action dated Feb. 10, 2011, for U.S. Appl. No. 11/901,649, filed Sep. 18, 2007, 20 pages. |
Final Office Action dated May 18, 2011, for U.S. Appl. No. 11/978,031, filed Oct. 25, 2007, 17 pages. |
Final Office Action dated Jun. 15, 2011, for U.S. Appl. No. 11/595,071, filed Nov. 8, 2006, 9 pages. |
Final Office Action dated Jun. 24, 2011, for U.S. Appl. No. 11/978,006, filed Oct. 25, 2007, 12 pages. |
Final Office Action dated Jul. 5, 2011, for U.S. Appl. No. 11/977,279, filed Oct. 24, 2007, 12 pages. |
Final Office Action dated Sep. 29, 2011, for U.S. Appl. No. 11/977,911, filed Oct. 26, 2007, 22 pages. |
Final Office Action dated Oct. 11, 2012, for U.S. Appl. No. 12/566,455, filed Sep. 24, 2009, 8 pages. |
Final Office Action dated Oct. 25, 2012, for U.S. Appl. No. 12/568,302, filed Sep. 28, 2009, 13 pages. |
Final Office Action dated Oct. 25, 2012, for U.S. Appl. No. 12/568,316, filed Sep. 28, 2009, 15 pages. |
Final Office Action dated Jul. 26, 2013, for U.S. Appl. No. 13/166,726, filed Jun. 22, 2011, ten pages. |
Final Office Action dated Oct. 31, 2013, for U.S. Appl. No. 13/166,699, filed Jun. 22, 2011, 13 pages. |
Final Office Action dated Jan. 13, 2014, for U.S. Appl. No. 12/568,316, filed Sep. 28, 2009, 15 pages. |
Final Office Action dated Apr. 28, 2014, for U.S. Appl. No. 13/652,007, filed Oct. 15, 2012, 16 pages. |
Final Office Action dated Jul. 14, 2014, for U.S. Appl. No. 13/166,711, filed Jun. 22, 2011, 12 pages. |
Final Office Action dated Dec. 2, 2014, for U.S. Appl. No. 13/560,963, filed Jul. 27, 2012, ten pages. |
Final Office Action dated Dec. 16, 2014, for U.S. Appl. No. 13/560,958, filed Jul. 27, 2012, twelve pages. |
Final Office Action dated Jan. 12, 2015, for U.S. Appl. No. 13/560,973, filed Jul. 27, 2012, six pages. |
Final Office Action dated May 4, 2015, for U.S. Appl. No. 13/166,699, filed Jun. 22, 2011, 17 pages. |
Final Office Action dated Aug. 20, 2015, for U.S. Appl. No. 13/166,711, filed Jun. 22, 2011, six pages. |
Final Office Action dated Feb. 1, 2016, for U.S. Appl. No. 13/560,963, filed Jul. 27, 2012, 12 pages. |
Final Office Action dated Feb. 3, 2016, for U.S. Appl. No. 13/166,699, filed Jun. 22, 2011, 15 pages. |
Final Office Action dated Mar. 9, 2016, for U.S. Appl. No. 13/831,318, filed Mar. 14, 2013, nine pages. |
Final Office Action dated Jun. 3, 2016, for U.S. Appl. No. 14/333,461, filed Jul. 16, 2014, eight pages. |
Hong, S.J. et al. (2005). “Smart LCD Using a-Si Photo Sensor,” IMID'05 Digest pp. 280-283. |
International Preliminary Report on Patentability and Written Opinion dated Oct. 8, 2004, for PCT Application No. PCT/US03/05300, filed Feb. 20, 2003, 15 pages. |
International Preliminary Report on Patentability and Written Opinion dated Dec. 30, 2004, for PCT Application No. PCT/US02/25573, filed Aug. 12, 2002, 16 pages. |
International Preliminary Report on Patentability and Written Opinion dated May 14, 2008, for PCT Application No. PCT/US06/43741, filed Nov. 10, 2006, four pages. |
International Search Report dated Apr. 14, 2003, for PCT Application No. PCT/US02/25573, filed Aug. 12, 2002, two pages. |
International Search Report dated Jun. 16, 2003, for PCT Application No. PCT/US03/05300, filed Feb. 20, 2003, two pages. |
International Search Report dated Nov. 11, 2003, for PCT Application No. PCT/US03/03277, filed Feb. 4, 2003, three pages. |
International Search Report dated Sep. 21, 2007, for PCT Application No. PCT/US06/43741, filed Nov. 10, 2006, one page. |
International Search Report dated Oct. 17, 2012, for PCT Application No. PCT/US2012/043019, filed Jun. 18, 2012, five pages. |
International Search Report dated Oct. 17, 2012, for PCT Application No. PCT/US2012/043023, filed Jun. 18, 2012, six pages. |
International Search Report dated Jan. 16, 2013, for PCT Application No. PCT/US2012/043021, filed Jun. 18, 2012, six pages. |
International Search Report dated Sep. 12, 2013, for PCT Application No. PCT/US2013/048977, filed Jul. 1, 2013, six pages. |
International Search Report dated Apr. 23, 2014, for PCT Application No. PCT/US2014/013927, filed Jan. 30, 2014, four pages. |
International Search Report dated Oct. 30, 2014, for PCT Application No. PCT/US2014/047658, four pages. |
Kim, J.H. et al. (May 14, 2000). “24.1: Fingerprint Scanner Using a-Si: H TFT-Array,” SID '00 Digest pp. 353-355. |
Kis, A. (2006). “Tactile Sensing and Analogic Algorithms,” Ph.D. Dissertation, Péter Pázmány Catholic University, Budapest, Hungary, 122 pages. |
Non-Final Office Action dated Jun. 4, 2003, for U.S. Appl. No. 10/217,798, filed Aug. 12, 2002, 16 pages. |
Non-Final Office Action dated May 21, 2004, for U.S. Appl. No. 10/329,217, filed Dec. 23, 2002, 13 pages. |
Non-Final Office Action dated Sep. 21, 2004, for U.S. Appl. No. 10/442,433, filed May 20, 2003, six pages. |
Non-Final Office Action dated Nov. 26, 2004, for U.S. Appl. No. 10/307,106, filed Nov. 27, 2002, eight pages. |
Non-Final Office Action dated Dec. 10, 2004, for U.S. Appl. No. 10/217,798, filed Aug. 12, 2002, nine pages. |
Non-Final Office Action dated Jan. 21, 2005, for U.S. Appl. No. 10/347,149, filed Jan. 17, 2003, nine pages. |
Non-Final Office Action dated Apr. 15, 2005, for U.S. Appl. No. 10/371,413, filed Feb. 20, 2003, four pages. |
Non-Final Office Action dated Jun. 22, 2005, for U.S. Appl. No. 10/739,455, filed Dec. 17, 2003, 10 pages. |
Non-Final Office Action dated Jul. 12, 2005, for U.S. Appl. No. 10/347,149, filed Jan. 17, 2003, four pages. |
Non-Final Office Action dated Jan. 13, 2006, for U.S. Appl. No. 10/217,798, filed Aug. 12, 2002, nine pages. |
Non-Final Office Action dated May 12, 2006, for U.S. Appl. No. 10/217,798, filed Aug. 12, 2002, seven pages. |
Non-Final Office Action dated Aug. 28, 2006, for U.S. Appl. No. 10/371,413, filed Feb. 20, 2003, six pages. |
Non-Final Office Action dated Jun. 28, 2007, for U.S. Appl. No. 11/351,098, filed Feb. 8, 2006, 12 pages. |
Non-Final Office Action dated Jun. 29, 2007, for U.S. Appl. No. 10/217,798, filed Aug. 12, 2002, 10 pages. |
Non-Final Office Action dated Feb. 25, 2008, for U.S. Appl. No. 11/137,753, filed May 25, 2005, 15 pages. |
Non-Final Office Action dated Jun. 24, 2008, for U.S. Appl. No. 11/351,098, filed Feb. 8, 2006, 11 pages. |
Non-Final Office Action dated Jun. 25, 2009, for U.S. Appl. No. 11/980,029, filed Oct. 29, 2007, 9 pages. |
Non-Final Office Action dated Nov. 23, 2009, for U.S. Appl. No. 11/407,545, filed Apr. 19, 2006, five pages. |
Non-Final Office Action dated Jul. 29, 2010, for U.S. Appl. No. 11/901,649, filed Sep. 18, 2007, 20 pages. |
Non-Final Office Action dated Oct. 13, 2010, for U.S. Appl. No. 11/978,006, filed Oct. 25, 2007, eight pages. |
Non-Final Office Action dated Oct. 14, 2010, for U.S. Appl. No. 11/595,071, filed Nov. 8, 2006, seven pages. |
Non-Final Office Action dated Nov. 26, 2010, for U.S. Appl. No. 11/977,279, filed Oct. 24, 2007, nine pages. |
Non-Final Office Action dated Nov. 26, 2010, for U.S. Appl. No. 11/977,830, filed Oct. 26, 2007, seven pages. |
Non-Final Office Action dated Dec. 13, 2010, for U.S. Appl. No. 11/977,339, filed Oct. 24, 2007, eight pages. |
Non-Final Office Action dated Feb. 1, 2011, for U.S. Appl. No. 11/978,031, filed Oct. 25, 2007, 18 pages. |
Non-Final Office Action dated Apr. 29, 2011, for U.S. Appl. No. 11/977,911, filed Oct. 26, 2007, 19 pages. |
Non-Final Office Action dated Jun. 21, 2011, for U.S. Appl. No. 11/977,339, filed Oct. 24, 2007, 10 pages. |
Non-Final Office Action dated Jun. 28, 2011, for U.S. Appl. No. 12/852,883, filed Aug. 8, 2010, 16 pages. |
Non-Final Office Action dated Nov. 2, 2011, for U.S. Appl. No. 12/568,316, filed Sep. 28, 2009, 31 pages. |
Non-Final Office Action dated Nov. 4, 2011, for U.S. Appl. No. 12/568,302, filed Sep. 28, 2009, 29 pages. |
Non-Final Office Action dated Nov. 17, 2011, for U.S. Appl. No. 11/977,339, filed Oct. 24, 2007, nine pages. |
Non-Final Office Action dated Jan. 10, 2012, for U.S. Appl. No. 11/977,864, filed Oct. 26, 2007, six pages. |
Non-Final Office Action dated Jan. 31, 2012, for U.S. Appl. No. 12/566,477, filed Sep. 24, 2009, 11 pages. |
Non-Final Office Action dated Feb. 29, 2012, for U.S. Appl. No. 11/978,031, filed Oct. 25, 2007, 20 pages. |
Non-Final Office Action dated Apr. 20, 2012, for U.S. Appl. No. 12/566,455, filed Sep. 24, 2009, eight pages. |
Non-Final Office Action dated Jun. 5, 2012, for U.S. Appl. No. 11/595,071, filed Nov. 8, 2006, 14 pages. |
Non-Final Office Action dated Jun. 19, 2012, for U.S. Appl. No. 11/977,864, filed Oct. 26, 2007, seven pages. |
Non-Final Office Action dated Nov. 15, 2012, for U.S. Appl. No. 12/566,477, filed Sep. 24, 2009, 11 pages. |
Non-Final Office Action dated Mar. 5, 2013, for U.S. Appl. No. 13/166,726, filed Jun. 22, 2011, 14 pages. |
Non-Final Office Action dated Mar. 29, 2013, for U.S. Appl. No. 13/166,699, filed Jun. 22, 2011, 12 pages. |
Non-Final Office Action dated Jun. 17, 2013, for U.S. Appl. No. 13/166,711, filed Jun. 22, 2011, 8 pages. |
Non-Final Office Action dated Sep. 18, 2013, for U.S. Appl. No. 13/652,007, filed Oct. 15, 2012, 16 pages. |
Non-Final Office Action dated Dec. 16, 2013, for U.S. Appl. No. 13/166,711, filed Jun. 22, 2011, 12 pages. |
Non-Final Office Action dated Feb. 27, 2014, for U.S. Appl. No. 11/977,279, filed Oct. 24, 2007, 11 pages. |
Non-Final Office Action dated Mar. 14, 2014, for U.S. Appl. No. 11/977,339, filed Oct. 24, 2007, 10 pages. |
Non-Final Office Action dated Apr. 24, 2014, for U.S. Appl. No. 13/560,958, filed Jul. 27, 2012, nine pages. |
Non-Final Office Action dated May 8, 2014, for U.S. Appl. No. 13/560,973, filed Jul. 27, 2012, six pages. |
Non-Final Office Action dated Jun. 4, 2014, for U.S. Appl. No. 13/560,963, filed Jul. 27, 2012, nine pages. |
Non-Final Office Action dated Jun. 27, 2014, for U.S. Appl. No. 13/166,699, filed Jun. 22, 2011, 13 pages. |
Non-Final Office Action dated Jan. 30, 2015, for U.S. Appl. No. 13/166,711, filed Jun. 22, 2011, 12 pages. |
Non-Final Office Action dated May 14, 2015, for U.S. Appl. No. 13/560,963, filed Jul. 27, 2012, twelve pages. |
Non-Final Office Action dated May 22, 2015, for U.S. Appl. No. 13/831,318, filed Mar. 14, 2013, eight pages. |
Non-Final Office Action dated Aug. 28, 2015, for U.S. Appl. No. 13/560,958, filed Jul. 27, 2012, 11 pages. |
Non-Final Office Action dated Sep. 24, 2015, for U.S. Appl. No. 13/166,699, filed Jun. 22, 2011, 14 pages. |
Non-Final Office Action dated Dec. 4, 2015, for U.S. Appl. No. 14/333,461, filed Jul. 16, 2014, 15 pages. |
Non-Final Office Action dated Feb. 11, 2016, for U.S. Appl. No. 14/578,051, filed Dec. 19, 2014, nine pages. |
Non-Final Office Action dated May 13, 2016, for U.S. Appl. No. 15/057,035, filed Feb. 29, 2016, six pages. |
Non-Final Office Action dated May 17, 2016, for U.S. Appl. No. 14/333,382, filed Jul. 16, 2014, sixteen pages. |
Notice of Allowance dated Feb. 3, 2014, for U.S. Appl. No. 13/166,726, filed Jun. 22, 2011, nine pages. |
Notice of Allowance dated May 12, 2014, for U.S. Appl. No. 13/166,726, filed Jun. 22, 2011, nine pages. |
Notice of Allowance dated Sep. 4, 2014, for U.S. Appl. No. 13/166,726, filed Jun. 22, 2011, nine pages. |
Notice of Allowance dated Dec. 15, 2015, for U.S. Appl. No. 13/560,973, filed Jul. 27, 2012, nine pages. |
Notice of Allowance dated Jan. 14, 2016, for U.S. Appl. No. 13/166,711, filed Jun. 22, 2011, five pages. |
Notice of Allowance dated May 24, 2016, for U.S. Appl. No. 13/560,958, filed Jul. 27, 2012, ten pages. |
Notification of Reasons for Rejection dated Dec. 19, 2011, for JP Patent Application No. 2008-540205, with English Translation, six pages. |
Pye, A. (Mar. 2001). “Top Touch-Screen Options,” located at http://www.web.archive.org/web/20010627162135.http://www.industrialtechnology.co.uk/2001/mar/touch.html, last visited Apr. 29, 2004, two pages. |
Rossiter, J. et al. (2005). “A Novel Tactile Sensor Using a Matrix of LEDs Operating in Both Photoemitter and Photodetector Modes,” IEEE pp. 994-997. |
Search Report dated Jun. 12, 2014, for ROC (Taiwan) Patent Application No. 101122110, one page. |
TW Search Report dated Jul. 7, 2014, for TW Patent Application No. 101122109, filed Jun. 20, 2012, one page. |
TW Search Report dated Jul. 8, 2014, for TW Patent Application No. 101122107, filed Jun. 20, 2012, one page. |
TW Search Report dated Nov. 20, 2015, for TW Patent Application No. 103126285, one page. |
U.S. Appl. No. 60/359,263, filed Feb. 20, 2002, by den Boer et al. |
U.S. Appl. No. 60/383,040, filed May 23, 2002, by Abileah et al. |
U.S. Appl. No. 60/736,708, filed Nov. 14, 2005, by den Boer et al. |
U.S. Appl. No. 60/821,325, filed Aug. 3, 2006, by Abileah et al. |
Yamaguchi, M. et al. (Jan. 1993). “Two-Dimensional Contact-Type Image Sensor Using Amorphous Silicon Photo-Transistor,” Jpn. J. Appl. Phys. 32(Part 1, No. 1B):458-461. |
Non-Final Office Action dated Jul. 1, 2016, for U.S. Appl. No. 14/333,457, filed Jul. 16, 2014, 27 pages. |
TW Search Report dated Jun. 23, 2016, for TW Patent Application No. 104135140, with English Translation, two pages. |
Non-Final Office Action dated Sep. 27, 2016, for U.S. Appl. No. 15/144,615, filed May 2, 2016, five pages. |
Notice of Allowance dated Sep. 9, 2016, for U.S. Appl. No. 13/560,958, filed Jul. 27, 2012, eight pages. |
Non-Final Office Action dated Nov. 25, 2016, for U.S. Appl. No. 13/831,318, filed Mar. 14, 2013, eight pages. |
Non-Final Office Action dated Oct. 20, 2016, for U.S. Appl. No. 13/166,699, filed Jun. 22, 2011, 16 pages. |
Notice of Allowance dated Oct. 31, 2016, for U.S. Appl. No. 15/057,035, filed Feb. 29, 2016, ten pages. |
European Search Report dated May 2, 2016, for EP Application No. 15196245.3, twelve pages. |
Final Office Action dated May 4, 2017, for U.S. Appl. No. 15/144,615, filed May 2, 2016, five pages. |
Non-Final Office Action dated Jan. 12, 2017, for U.S. Appl. No. 14/869,980, filed Sep. 29, 2015, ten pages. |
Non-Final Office Action dated Jan. 23, 2017, for U.S. Appl. No. 14/333,382, filed Jul. 16, 2014, sixteen pages. |
Non-Final Office Action dated Apr. 6, 2017, for U.S. Appl. No. 14/333,461, filed Jul. 16, 2014, six pages. |
Notice of Allowance dated Feb. 14, 2017, for U.S. Appl. No. 13/560,963, filed Jul. 27, 2012, nine pages. |
Final Office Action dated Jun. 21, 2017, for U.S. Appl. No. 14/333,382, filed Jul. 16, 2014, 17 pages. |
Final Office Action dated May 31, 2017, for U.S. Appl. No. 13/166,699, filed Jun. 22, 2011, 16 pages. |
Final Office Action dated Aug. 7, 2017, for U.S. Appl. No. 14/869,980, filed Sep. 29, 2015, twelve pages. |
Final Office Action dated Aug. 21, 2017, for U.S. Appl. No. 13/831,318, filed Mar. 14, 2013, nine pages. |
Final Office Action dated Nov. 30, 2017, for U.S. Appl. No. 14/333,457, filed Jul. 16, 2014, 22 pages. |
Non-Final Office Action dated Dec. 14, 2017, for U.S. Appl. No. 15/169,679, filed May 31, 2016, 24 pages. |
Notice of Allowance dated Oct. 26, 2017, for U.S. Appl. No. 14/333,461, filed Jul. 16, 2014, seven pages. |
Notice of Allowance dated Nov. 9, 2017, for U.S. Appl. No. 14/333,382, filed Jul. 16, 2014, eight pages. |
Notice of Allowance dated Nov. 29, 2017, for U.S. Appl. No. 15/144,615, filed May 2, 2016, eight pages. |
Non-Final Office Action dated Jan. 2, 2018, for U.S. Appl. No. 14/869,980, filed Sep. 29, 2015, eleven pages. |
Non-Final Office Action dated Jan. 17, 2018, for U.S. Appl. No. 14/869,975, filed Sep. 29, 2015, 17 pages. |
Notice of Allowance dated Apr. 18, 2018, for U.S. Appl. No. 13/831,318, filed Mar. 14, 2013, ten pages. |
Notice of Allowance dated Jun. 6, 2018, for U.S. Appl. No. 14/869,980, filed Sep. 29, 2015, five pages. |
Number | Date | Country | |
---|---|---|---|
20160162102 A1 | Jun 2016 | US |
Number | Date | Country | |
---|---|---|---|
62087792 | Dec 2014 | US |