This invention is related to biometric image sensors. More particularly, this invention is related to systems for and methods of determining the presence of a finger on a fingerprint image sensor.
Fingerprint image sensors are used in many applications. In one system, a sensor and a host platform together form a fingerprint image processing system. The sensor reads the fingerprint image and converts it to a digital representation. This digital image is transmitted to the host. The host extracts the unique features of the fingerprint, which include minutiae (ridge endings and bifurcations), ridges, valleys, pores, and scars, and converts them into a template, a more compact digital representation of the image. In a first step, the host processes the image and extracts the template, and later compares the template to those stored for authorized users. If a match is found, in the second step the system authenticates the user or performs some other task.
As part of sensing a fingerprint image, the sensor continually scans its surface for any image and periodically transmits an interrupt to the host. After receiving the interrupt, the host reads and processes the corresponding digital image data. If the host determines that the image data correspond to a fingerprint, the host can authenticate the user or perform another task. Otherwise, the host ignores the image data and returns to the tasks it was performing when it was interrupted. In such systems, smudges or other residual material (films, perspiration, dirt, oils, etc.) are read by the sensor and sent to the host for processing. The host must then spend valuable processing time determining whether the image is of a usable print, or is a print at all. This use of processing resources is wasteful: it generates unnecessary interrupts to the host processor and diverts the processor from other tasks.
In accordance with embodiments of the present invention, a fingerprint sensor system interrupts a host system only after it has determined that a finger is present upon a fingerprint sensor. In this way, the host system is not unnecessarily interrupted when no finger is present upon the sensor and thus there is no digital fingerprint image data to process. The host system is thus able to use its processing time more efficiently, and to devote processing time to other tasks that are ready to be processed.
Also in accordance with embodiments of the present invention, a finger sensor system continually checks for the presence of a finger. The finger sensor system comprises a capacitive detector as well as other electronic components that draw power and generate heat. The electronic components are in a quiescent state, conserving power, until the capacitive detector detects a capacitance related to the presence of a near field object. When a threshold capacitance is detected, the electronic components are enabled so that an image can be read and processed.
In accordance with a first aspect of the present invention, an apparatus for determining the presence of a patterned object comprises a sensor system for sensing an image corresponding to the patterned object and translating the image into image data, a means for determining one or more values in response to the image data, and a means for generating a control signal in the event the one or more values exceed a pre-determined number of threshold levels. The sensor system is coupled to the means for determining, which is coupled to the means for generating. In a preferred embodiment, the sensor system comprises a biometric image sensor coupled to a signal converter. The sensor system further comprises a memory coupled to the signal converter and a processor. The memory contains one or more output lines for coupling to a host system. The processor is coupled to the biometric sensor, the signal converter, the memory, and the means for determining. Preferably, the signal converter comprises an analog-to-digital converter.
In another embodiment, the biometric image sensor is a fingerprint image sensor. Preferably, the fingerprint image sensor is a swipe sensor, but alternatively it is a placement sensor. Preferably, the sensor is a capacitive sensor, but it can alternatively be any fingerprint sensor, including, but not limited to, a thermal or an optical sensor.
In another embodiment, the means for determining comprises a first statistical engine coupled to the analog-to-digital converter and the processor; a second statistical engine coupled to the memory, the first statistical engine, and the processor; and an ac-energy-estimation engine coupled to the memory and the processor. The means for determining further comprises first, second, and third comparators. The first comparator is coupled to the first statistical engine and the processor and is configured to generate a first output when the output of the first statistical engine exceeds a first threshold value. The second comparator is coupled to the second statistical engine and the processor and is configured to generate a second output when the output of the second statistical engine exceeds a second threshold value. The third comparator is coupled to the ac-energy-estimation engine and the processor and is configured to generate a third output when the output of the ac-energy-estimation engine exceeds a third threshold value.
In another embodiment, the apparatus further comprises a software programmable detection engine coupled to and configured to enable the analog-to-digital converter, the first statistical engine, and the ac-energy-estimation engine. Preferably, the software programmable detection engine comprises a software programmable capacitive bridge. The software programmable capacitive bridge comprises one or more capacitors configured to be programmed to a detection threshold level.
Preferably, the first statistical engine is a mean engine and the first comparator is a mean comparator; the second statistical engine is a variance engine and the second comparator is a variance comparator; and the third comparator is an ac-energy-estimation comparator.
In this embodiment, the first threshold value is related to a mean of image data, the second threshold value is related to a variance of the image data, and the third threshold value is related to an ac-energy-estimate of the image data. Preferably, at least one of the first comparator, the second comparator, and the third comparator is configured to be programmed with corresponding threshold values.
In another embodiment, the sensor system comprises a software programmable capacitive detector to detect a change in capacitance caused by a near field object. Upon detection, this threshold-programmable capacitive detector enables predetermined components of the sensor system, including at least one of the analog-to-digital converter, the first statistical engine, the second statistical engine, and the ac-energy-estimation engine.
In another embodiment using threshold crossings, the thresholds in the ac-energy-estimation engine are dynamically set by a software programmable delta and are centered around the mean from the mean engine. Alternatively, the programmable delta can be tied to the variance from the variance engine.
In another embodiment, the capacitive detector can be self-adjusting, automatically configuring the threshold based on long-term history.
In another embodiment, the apparatus further comprises a host platform coupled to the means for determining, whereby the memory output lines couple the memory to the host system.
The host platform comprises an interrupt signal line coupled to at least one of the first comparator, the second comparator, and the third comparator. The means for generating comprises a voter logic circuit comprising the interrupt signal line. The voter logic circuit is coupled to outputs of each of the first comparator, the second comparator, and the third comparator.
In this embodiment, the host platform comprises one of a personal computer or a portable device selected from the group consisting of a telephone, a personal digital assistant, a personal music player, and a camera.
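The voting rule applied by the voter logic circuit is a design choice that is not fixed by this description. As a rough illustration only, the following Python sketch (all names hypothetical) shows a voter that asserts the interrupt line when a programmable number of the three comparator outputs are active:

```python
def voter(mean_ok: bool, variance_ok: bool, ac_energy_ok: bool, votes_required: int = 2) -> bool:
    """Hypothetical voter logic: assert the interrupt when at least
    votes_required of the three comparator outputs are active. The actual
    rule (any, majority, or all) is a design choice."""
    return sum((mean_ok, variance_ok, ac_energy_ok)) >= votes_required

# Example: the mean and variance comparators fire, the ac-energy comparator does not.
interrupt_line = voter(True, True, False)  # True under the default 2-of-3 rule
```

A one-of-three rule makes the detector most sensitive to a finger; a three-of-three rule makes it most resistant to false triggers from smudges or residue.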
In another embodiment, the host platform is configured to receive at least one of the output from the first comparator, the output from the second comparator, and the output from the third comparator and to use each of the outputs to set at least one of the first threshold value, the second threshold value, and the third threshold value.
In another embodiment, the fingerprint image sensor is logically divided into a plurality of regions.
In accordance with a second aspect of the present invention, a method of detecting the presence of a patterned object on a sensor comprises sensing an image of the patterned object on the sensor; generating data corresponding to the image; calculating one or more statistics related to the data; determining whether the statistics exceed corresponding threshold values; and making the data available to a host computer in the event that the statistics exceed the corresponding threshold values.
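A minimal sketch of this method is shown below, assuming hypothetical helper routines for the sensor read, the statistics, and the host hand-off (none of these names come from this description):

```python
def detect_patterned_object(read_sensor_image, compute_statistics, thresholds, publish_to_host):
    """Sketch of the method: sense an image, generate data, compute statistics,
    compare them to their thresholds, and make the data available to the host
    only when every computed statistic exceeds its corresponding threshold."""
    image_data = read_sensor_image()               # sensing and generating data
    stats = compute_statistics(image_data)         # e.g. {"mean": ..., "variance": ...}
    if all(stats[name] > limit for name, limit in thresholds.items()):
        publish_to_host(image_data)                # make the data available to the host
        return True
    return False
```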
Embodiments of the present invention allow host systems that process a number of tasks, including processing fingerprint image data, to operate more efficiently. In accordance with embodiments of the present invention, a host system is coupled to a fingerprint sensor system. The sensor system reads images of a fingerprint or portions of a fingerprint and translates the images into digital image data that are made available to a host, which performs a variety of tasks using image data including, but not limited to, authenticating a user, launching a computer program corresponding to the image data, and emulating a computer input device such as a mouse or joystick. Systems for and methods of fingerprint sensing are described in detail in U.S. patent application Ser. No. 10/194,994, filed Jul. 12, 2002, and titled “Method and System for Biometric Image Assembly from Multiple Partial Biometric Scans” and U.S. patent application Ser. No. 10/099,558, filed Mar. 13, 2002, and titled “Fingerprint Biometric Capture Device and Method with Integrated On-Chip data Buffering,” both of which are hereby incorporated by reference.
In accordance with embodiments of the present invention, the host system is interrupted by the fingerprint sensor system only when it is determined that a finger is present on a fingerprint sensor. Thus, the host system is not continually interrupted by the fingerprint sensor system, nor is it required to unnecessarily process data that does not correspond to a fingerprint. Thus, instead of unnecessarily handling interrupts when a finger is not present, requiring the processing of context switches and the like, the host system needs to service an interrupt only when there is fingerprint image data that need processing.
In accordance with other embodiments of the present invention, a host system forming part of the fingerprint sensor system and components of the fingerprint sensor system are able to be placed into extremely low-power discontinuous controlled operation. The host system is thus able to be powered down into sleep mode, awoken only when the fingerprint sensor has detected the presence of a finger. The fingerprint sensor system is able to process and detect the presence of a finger based on any sub-frame images of the complete finger image. The ability to process and detect the presence of a finger based on sub-images even further reduces the power consumption of the device.
In a preferred embodiment, the system comprises a software programmable pipelined image processor integrated into the data path of the sensor. The system further comprises a software-programmable threshold based capacitive detector, a mean image processing engine, a variance image processing engine, and an ac-energy-estimation engine, components collectively referred to as a programmable pipeline image processor. Preferably, each processor in the pipeline is software configurable to allow dynamic operation. A selectable architecture allows each stage of the pipeline image processor to be enabled or disabled to determine whether a finger is present. Furthermore, each pipeline stage of the image processor can be used during normal sensor operation. The data from the mean, variance and ac-energy-estimation engines is transferred to the host with the fingerprint image data to reduce the processing load on the host processor during subsequent image analysis.
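One way to picture the selectable pipeline is as a set of stages, each gated by a software-settable enable flag. The sketch below is illustrative only; the stage names and the simple swing count standing in for the ac-energy estimate are placeholders, not the sensor's actual register interface:

```python
from statistics import mean, pvariance

def swings_about_mean(pixels):
    # Crude stand-in for the ac-energy estimate: count swings about the mean.
    m = mean(pixels)
    above = [p > m for p in pixels]
    return sum(1 for a, b in zip(above, above[1:]) if a != b)

# Hypothetical stage table: each stage can be enabled or disabled in software.
STAGES = {"mean": mean, "variance": pvariance, "ac_energy": swings_about_mean}

def run_pipeline(pixels, enables):
    """Run only the enabled stages; their results travel to the host together
    with the image data, sparing the host from recomputing them."""
    return {name: fn(pixels) for name, fn in STAGES.items() if enables.get(name, False)}

frame = [3, 12, 4, 13, 2, 14, 3, 12]
print(run_pipeline(frame, enables={"mean": True, "variance": True, "ac_energy": True}))
```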
The programmable capacitive detector provides for the low-power discontinuous operation. As described in more detail below, in operation, the analog-to-digital converter, mean image processor, variance image processor, and ac-energy-estimation processor, referred to as the calculation components, are all placed in a sleep mode or powered off to conserve energy. It will be appreciated that components other than or in addition to the calculation components (collectively referred to as the controlled sensor components) can also be placed in a sleep mode or powered off. The software programmable capacitive detector checks, either constantly or periodically at a programmable interval, for the presence of an object on the fingerprint sensor of the present invention. Only after the programmable capacitive detector detects the presence of an object upon a surface of the fingerprint sensor are the controlled sensor components enabled. Thus, when the programmable capacitive detector has not detected an object on the surface of the finger image sensor, the controlled sensor components draw little or no current, conserving system energy and further decreasing the temperature of the system. When the programmable capacitive detector does detect an object on the surface of the finger sensor, the controlled sensor components are turned on and, in accordance with the present invention, used to determine whether the object is a finger.
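This discontinuous operation can be summarized as a small control loop in which only the capacitive detector stays awake and the controlled sensor components are enabled after it trips. The following sketch uses hypothetical component objects and a polling delay standing in for the programmable interval; it is not the actual firmware:

```python
import time

class SensorPowerController:
    """Illustrative control loop: the capacitive detector is polled at a
    programmable interval while the controlled sensor components sleep; those
    components are enabled only after the detector reports a near field object."""

    def __init__(self, capacitive_detector, controlled_components, poll_interval_s=0.05):
        self.detector = capacitive_detector       # callable returning True when an object is sensed
        self.components = controlled_components   # objects exposing enable() and sleep()
        self.poll_interval_s = poll_interval_s    # software programmable interval

    def wait_for_object(self):
        for component in self.components:
            component.sleep()                     # quiescent: little or no current draw
        while not self.detector():
            time.sleep(self.poll_interval_s)      # only the detector remains active
        for component in self.components:
            component.enable()                    # wake A/D converter, statistics engines, comparators
```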
The present invention makes use of quantifiable statistical properties associated with fingerprints and other patterned objects. Generally, a ridge of a fingerprint (because it directly contacts a surface of a sensor) will generate a high capacitance value on a capacitance sensor or appear as a dark portion on an optical sensor. Valleys will correspondingly generate a lower capacitance value on a capacitance sensor or appear as a light portion on an optical sensor. Because ridges and valleys alternate on a finger surface, pixel values corresponding to portions of the fingerprint also alternate. Thus, if a pixel value corresponding to a portion of a fingerprint is represented by an integer between 0 and 15 (representable by 4 bits), then the pixel values of all the portions will vary between smaller values (generally representing a valley) and larger values (representing a ridge). Taken as a whole, the pixel values related to an entire fingerprint (or at least a larger portion of one) will thus typically have associated statistics, such as a mean value larger than a first threshold, a variance larger than a second threshold, and a number of alternations between ridges and valleys (i.e., crossings) larger than a third threshold. These statistics, taken alone or in combination, can be used to identify a fingerprint and thus the presence of a finger on a sensor.
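Assuming the 4-bit pixel model described above, the three statistics and their threshold tests can be sketched as follows; the threshold values here are arbitrary placeholders chosen for the example, not values taken from this description:

```python
from statistics import mean, pvariance

def looks_like_fingerprint(pixels, mean_threshold=4, variance_threshold=6, crossing_threshold=5):
    """Apply the three statistical tests to 4-bit pixel values (0..15): a
    sufficiently large mean, a sufficiently large variance, and enough
    alternations between low (valley) and high (ridge) values."""
    m = mean(pixels)
    v = pvariance(pixels)
    above = [p > m for p in pixels]
    crossings = sum(1 for a, b in zip(above, above[1:]) if a != b)
    return m > mean_threshold and v > variance_threshold and crossings > crossing_threshold

ridge_valley_line = [2, 3, 13, 14, 3, 2, 12, 13, 3, 2, 14, 13, 2, 3, 12, 14]
smudge_line = [1, 1, 2, 1, 1, 2, 1, 1, 1, 2, 1, 1, 2, 1, 1, 1]
print(looks_like_fingerprint(ridge_valley_line), looks_like_fingerprint(smudge_line))  # True False
```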
It will be appreciated that other patterned objects also have expected mean, variance and ac-energy. Thus, their presence can also be detected in accordance with the present invention. Other patterned objects include, but are not limited to, other biometric objects such as palms and eyes (with veined patterns, etc.), as well as styluses or patterned styluses, such as those used to input data into electronic devices such as personal digital assistants.
In operation, when the fingerprint sensor system 110 detects the presence of a finger, it sends an interrupt signal on an interrupt line, which is one of the control and data lines 104. The fingerprint sensor system 110 can also make the fingerprint image data available on others of the control and data lines 104. When ready, the host system 103 handles the interrupt by reading fingerprint image data on one or more of the control and data lines 104. The host system 103 then processes the fingerprint image data in any number of ways, such as by (1) verifying that a template corresponding to the read print matches a template stored on the host system 103 and corresponding to an authorized user or (2) launching a program corresponding to the read template, among other tasks. Thus, in accordance with the present invention, the host system processes other tasks and interrupts those tasks to process fingerprint image data read from the fingerprint sensor system 110 only when the presence of a finger has been detected and there is fingerprint image data to process. Steps used to perform a method in accordance with one embodiment of the present invention are explained in more detail below.
The fingerprint sensor system 110 further comprises a software programmable capacitive detector 122 coupled to the sensor array 111 by the line 109 and to other electronic components, so that it can enable any one or more of them that are quiescent, that is, normally in sleep mode or powered off to conserve power, including the A/D converter 115, the mean engine 130, the variance engine 140, the ac-energy estimation engine 150, the mean comparator 161, the variance comparator 162, the ac-energy estimation comparator 163, and the voter logic circuit 127.
The capacitive detector 122 comprises a comparator 171 having a first input 171A and a second input 171B, both coupled to a capacitive bridge, described below. The first input 171A is coupled to a contact 177C of a switch 177 and to a contact 176C of a switch 176. The switch 177 also has a contact 177B coupled to a voltage source V2 and an input 177A coupled to a voltage source V2 by a sense capacitor CFINGER2 178B. The switch 176 also has a contact 176B coupled to a voltage source V1 and an input 176A coupled to a voltage source V2 by a capacitor CREF2 179B. The second input 171B of the comparator 171 is coupled to a contact 174C of a switch 174 and to a contact 175C of a switch 175. The switch 174 also has a contact 174B coupled to a voltage source V1 and an input 174A coupled to a voltage source V2 by a sense capacitor CFINGER1 178A. The switch 175 also has a contact 175B coupled to a voltage source V2 and an input 175A coupled to a voltage source V2 by a capacitor CREF1 179A.
For ease of discussion, when any of the switches 174-177 is switched so that its input (e.g., 174A) is coupled directly to a voltage source (e.g., 174B), the switch is said to be in a charging position; when the switch is switched so that its input (e.g., 174A) is coupled to a voltage source through one of the capacitors (e.g., 179A, 179B, 178A, and 178B), the switch is said to be in a detection position. When a switch 174-177 is in a charging position, the capacitor coupled to its input (178A, 179A, 179B, and 178B, respectively) is charged to the software programmable value (V1, V2, V2, and V2, respectively). Those skilled in the art will recognize that in the detection position, the capacitors 178A, 178B, 179A, and 179B form a capacitive bridge.
The capacitors 179A and 179B are not exposed to a finger. The sense capacitors 178A and 178B, in contrast, are exposed to a finger (that is, the finger and a sense capacitor are brought close enough to each other that a capacitance of the finger is detected by the sense capacitor) and are used to perform surface detection of a near field object, such as a finger. In a preferred embodiment, the sense capacitors 178A and 178B form part of the sensor array 111 and are thus shown coupled to the sensor array by the line 109.
In operation, each of the switches 174-177 is first placed in a charging position until each of the capacitors 178A, 178B, 179A, and 179B is fully charged. Next, each of the switches 174-177 is switched to a detection position so that the capacitive bridge is balanced. When a finger contacts a sense capacitor 178A or 178B, the effective capacitance of the contacted sense capacitor increases, and the capacitive bridge trips. In other words, when all of the switches are placed in a detection position, the difference between the values on the sense capacitors 178A and 178B and the capacitors 179A and 179B will generate a voltage difference between the inputs 171A and 171B of the comparator 171, thus generating a signal on the output line 122A of the comparator 171. Thus, by adjusting the programmable values of the capacitors CREF1 and CREF2, the values of CFINGER1 and CFINGER2 that will generate a signal on the output line 122A (indicating the presence of a finger and thus enabling the quiescent components) can be adjusted.
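The trip condition can be illustrated with a deliberately simplified model. The function below does not reproduce the switch-and-capacitor topology of the bridge; it only shows how the programmable reference capacitances set the point at which a finger-induced increase in sense capacitance produces a comparator output:

```python
def bridge_trips(c_finger1, c_finger2, c_ref1, c_ref2, comparator_sensitivity=0.01):
    """Simplified, illustrative model of the capacitive bridge: report a trip
    when the finger-induced imbalance between the sense capacitances and the
    programmed reference capacitances exceeds the comparator's sensitivity."""
    sense = c_finger1 + c_finger2
    reference = c_ref1 + c_ref2
    imbalance = (sense - reference) / reference
    return imbalance > comparator_sensitivity

# No finger: the sense capacitors roughly match the programmed references, so the bridge stays balanced.
assert not bridge_trips(1.00, 1.00, 1.0, 1.0)
# Finger present: the effective sense capacitance rises and the bridge trips.
assert bridge_trips(1.20, 1.25, 1.0, 1.0)
```

Raising the programmed reference values in this model corresponds to requiring a larger finger-induced capacitance before a detection signal is produced.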
The AC energy can be measured using a Fourier Transform or a band-pass filter, or it can be estimated using threshold crossings. Computing a Fourier Transform or applying a band-pass filter is computationally expensive. Although threshold crossings provide only an estimate of the AC energy, they can be computed with fewer computational resources.
Threshold crossings look at the gray-level intensity of a frame and count how many times the level moves from above the upper threshold 191 to below the lower threshold 193 and from below the lower threshold 193 to above the upper threshold 191. The upper and lower thresholds 191 and 193 are set at a positive delta (+delta) and a negative delta (−delta) relative to the center intensity value 192. The center value 192 and the deltas (+delta and −delta) are both software programmable. In an alternative embodiment, the center value 192 is set by the mean engine (e.g., the mean engine 130 described above).
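A sketch of this crossing count with a programmable center and delta follows; requiring the signal to traverse both thresholds before a crossing is counted keeps small ripples between the thresholds from inflating the estimate:

```python
def count_threshold_crossings(pixels, center, delta):
    """Estimate the AC energy of a scan line by counting full swings between the
    upper threshold (center + delta) and the lower threshold (center - delta).
    A crossing is counted only after the level has traversed both thresholds."""
    upper, lower = center + delta, center - delta
    crossings = 0
    state = None  # "high" after exceeding upper, "low" after dropping below lower
    for p in pixels:
        if p > upper:
            if state == "low":
                crossings += 1
            state = "high"
        elif p < lower:
            if state == "high":
                crossings += 1
            state = "low"
    return crossings

# Example with 4-bit intensities: ridges near 12-14, valleys near 2-4.
line = [3, 2, 13, 14, 3, 4, 12, 2, 13, 3]
print(count_threshold_crossings(line, center=8, delta=3))  # counts the ridge/valley swings
```

Here the center argument plays the role of the center intensity value 192 and delta plays the role of the programmable +delta/−delta; in the alternative embodiment the center would instead come from the mean engine.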
Using row V as an example, cells V1-8, V10-11, V13-15, V17-18, V20-28, and V30-50 are all depicted as light colored, thereby having a relatively small intensity value; cells V9, V12, V16, V19, and V29 are all depicted as dark colored, thereby having a relatively larger intensity value. Those skilled in the art will recognize how to compute the mean and variance of the cell values in row V. As long as the dark intensities lie above the upper threshold and the light intensities lie below the lower threshold, the number of threshold crossings is simply the total number of transitions from a light-colored cell to a dark-colored cell or from a dark-colored cell to a light-colored cell. This count is the value compared by the ac-energy estimation comparator 163.
In one embodiment, a transition is recognized when adjacent cell values change from a first value to another value below a first threshold or from a first value to another value above a second threshold.
Many variations can be made to the steps for generating statistics in accordance with the present invention. For example, rather than traveling row by row, the method can travel column by column. Furthermore, it will be appreciated that cells can have grey-scale values (shades) other than all dark or all light. Cells can also contain intermediate values reflecting intermediate shades corresponding, for example, to pores, scars, and other features. The system can be tuned by setting threshold values. For example, to ensure that a small scar is not recognized as a transition, an upper threshold can be increased so that only ridges, which have large intensity values, will be recognized as threshold crossings. These thresholds can be programmed into the components of the present invention or can be set dynamically by an executing software program, dynamic in that the system uses previous threshold crossings to generate new threshold values. In this way the system can be tuned to better distinguish between ridges and valleys.
The operation of the system 100 in accordance with the present invention is now described.
If in the step 309 it is determined that a finger is present on the sensor array 111, then in the step 313, an interrupt signal is sent on the line 164 and the digital image data is made available to a host system on the channel 104. The channel 104 can be a serial line, a multi-line parallel bus, or any type of line on which digital data can be transmitted.
If in the step 309 it is determined that the statistics do not indicate that a finger is present on the sensor array 111, then processing loops back to the step 301.
After the fingerprint sensor system 110 has interrupted the host, it enters a quiescent mode, in which the predetermined controlled sensor components enter sleep mode, thereby conserving power. The host can subsequently request the image data from the fingerprint sensor. The system is then reset by placing the switches 174-177 into a charging position, using software programmable values for the capacitors 179A and 179B.
When the host receives an interrupt and the signals representing the values of the statistics, such as the mean, variance, and threshold crossings, it can perform more extensive analysis on those statistics. If, after more extensive analysis, the host determines that no finger was actually present on the sensor, the host can be used to program any one or more of the mean comparator, the variance comparator, and the ac-energy estimation comparator so that interrupts are generated only when a finger is detected. This tuning is especially important in humid environments, for example, where sweat or other residue triggers the sensor's finger-detect logic.
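One hedged illustration of such host-side tuning is sketched below: after a false positive, each comparator threshold that was exceeded is nudged above the offending statistic. The margin value and the dictionary interface are assumptions made for the example, not part of this description:

```python
def retune_after_false_positive(thresholds, reported_stats, margin=1.10):
    """Hypothetical host-side tuning: when an interrupt turns out not to be a
    finger (e.g., sweat or residue in a humid environment), raise each comparator
    threshold just above the statistic that triggered it, so that a similar
    non-finger image no longer generates an interrupt."""
    new_thresholds = dict(thresholds)
    for name, value in reported_stats.items():
        if value > thresholds.get(name, float("inf")):
            new_thresholds[name] = value * margin   # nudge the threshold above the offender
    return new_thresholds

# Example: only the variance comparator fired on a sweaty smudge, so only its threshold is raised.
print(retune_after_false_positive({"mean": 4, "variance": 6, "crossings": 5},
                                  {"mean": 3.2, "variance": 7.5, "crossings": 2}))
```

The new threshold values would then be written back to the programmable comparators over the control and data lines.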
In accordance with the present invention, finger sensors can be divided into one or more logical regions and the presence of a finger detected in any region. In some embodiments, finger sensors are partitioned to emulate, for example, input devices such as computer mice, which recognize and process inputs based on the regions in which a finger is detected.
Embodiments of the present invention contemplate many modifications. For example, the system and method of the present invention can include logic to translate digital image data into a template, which is then read by the host when the interrupt signal is raised on the line 104.