The invention relates generally to technology for sensing and recording finger motion and fingerprints, and more particularly to systems, devices, and methods for finger motion tracking, both alone and in combination with fingerprint image processing and navigation operations.
Partial fingerprint scanners are becoming popular for a wide variety of security applications. In contrast to “all at once” fingerprint scanners, which capture an image of an entire fingerprint at the same time, partial fingerprint sensing devices use a sensing area that is smaller than the fingerprint area to be imaged. By imaging only a portion of a fingerprint at any given time, a partial fingerprint sensor can be made considerably smaller and cheaper than a full fingerprint sensor. However, to capture a full fingerprint image, the user must move his finger and “swipe” it across the sensing zone of the partial fingerprint sensor.
Various types of partial fingerprint readers exist. Some work by optical means, some by pressure sensing means, and others by capacitance or radio frequency sensing means.
For example, one common configuration used for a fingerprint sensor is a one or two dimensional array of CCD (charge-coupled device) or CMOS circuit sensor elements (pixels). These components are embedded in a sensing surface to form a matrix of pressure sensing elements that generate signals in response to pressure applied to the surface by a finger. These signals are read by a processor and used to reconstruct the fingerprint of a user and to verify identification.
Other devices include one or two dimensional arrays of optical sensors that read light reflected off of a person's finger onto an array of optical detectors. The reflected light is converted to a signal that defines the fingerprint of the finger being analyzed, and is used to reconstruct the fingerprint and to verify identification.
Many types of partial fingerprint scanners consist of linear (one dimensional) arrays of sensing elements (pixels). These one dimensional sensors create a two dimensional image of a fingerprint through the motion of the finger pad relative to the sensor array.
One class of partial fingerprint sensors that is particularly useful for small device applications is deep finger penetrating radio frequency (RF) based sensors. These are described in U.S. Pat. Nos. 7,099,496; 7,146,024; and patent application publications US 2005-0244038 A1; US 2005-0244039 A1; US 2006-0083411 A1; and US 2007-0031011 A1, and the contents of these patents and patent applications are incorporated herein by reference. These types of sensors are commercially produced by Validity Sensors, Inc., San Jose, Calif. This class of sensor mounts the sensing elements (usually arranged in a one dimensional array) on a thin, flexible, and environmentally robust support, and places the IC used to drive the sensor in a protected location some distance away from the sensing zone. Such sensors are particularly advantageous in applications where small sensor size and sensor robustness are critical.
The Validity fingerprint sensors use deep finger penetrating radio frequency (RF) based sensing technology to measure the intensity of electric fields conducted by finger ridges and valleys, and use this information to sense and create the fingerprint image. These devices form their sensing elements as a linear array composed of many miniature excitation electrodes, spaced at a high density, such as approximately 500 electrodes per inch. The tips of these electrodes are separated from a single sensing electrode by a small sensor gap. The electrodes are electrically excited in a progressive scan pattern, and the ridges and valleys of the finger pad alter the electrical properties (usually the capacitive properties) of the excitation electrode-sensing electrode interaction, which in turn creates a detectable electrical signal. The electrodes and sensors are mounted on a thin, flexible printed circuit support, and are usually excited, and the sensor read, by an integrated circuit chip (scanner chip, driver chip, scan IC) designed for this purpose. The end result is a one dimensional “image” of the portion of the finger pad immediately over the electrode array and sensor junction.
As the finger surface is moved across the sensor, portions of the fingerprint are sensed and captured by the device's one dimensional scanner, creating an array of one dimensional images indexed by order of data acquisition, and/or alternatively annotated with additional time and/or finger pad location information. Circuitry, such as a computer processor or microprocessor, then creates a full two-dimensional fingerprint image by creating a mosaic of these one dimensional partial fingerprint images.
Often the processor will then compare this recreated two dimensional full fingerprint, usually stored in working memory, with an authorized fingerprint stored in a fingerprint recognition memory, and determine whether there is a match. Software for fingerprint matching is disclosed in U.S. Pat. Nos. 7,020,591 and 7,194,392 by Wei et al., and is commercially available from sources such as Cogent Systems, Inc., South Pasadena, Calif.
If the scanned fingerprint matches the record of an authorized user, the processor then usually unlocks a secure area or computer system and allows the user access. This enables various types of sensitive areas and information (financial data, security codes, etc.) to be protected from unauthorized users, yet still be easily accessible to authorized users.
The main drawback of partial fingerprint sensors is that, in order to obtain a valid fingerprint scan, the user must swipe his or her finger across the sensor surface in a relatively uniform manner. Unfortunately, due to various human factors, this is rarely the case. In the real world, users will not swipe their fingers at a constant speed. Some will swipe more quickly than others, some may swipe at non-uniform speeds, and some may stop partway through a scan and then resume. In order to account for this type of variation, modern partial fingerprint sensors often incorporate finger position sensors to determine how the overall finger position and speed vary, relative to the fingerprint sensor, during a finger swipe.
One type of finger position indicator, represented by U.S. Pat. No. 7,146,024, and application publications US 2005-0244039 A1 and US 2005-0235470 A1 (the contents of which are incorporated herein by reference), detects relative finger position using a long array of electrical drive plate sensors. These plates sense the bulk of the finger (rather than the fine details of the fingerprint ridges), and thus sense the position of the finger relative to the linear array used for fingerprint sensing. A second type of finger position indicator, represented by US patent application publication US 2007-0031011 A1 (the contents of which are incorporated herein by reference), uses two linear partial fingerprint sensors, located about 400 microns apart.
Another device is described in U.S. Pat. No. 6,002,815 of Immega, et al. The technique used by the Immega device is based on the amount of time required for the finger to travel a fixed distance between two parallel image lines that are oriented perpendicular to the axis of motion.
Still another technique is described in U.S. Pat. No. 6,289,114 of Mainguet. A device utilizing this method reconstructs fingerprints based on sensing and recording images taken of rectangular slices of the fingerprint and piecing them together using an overlapping mosaic algorithm.
In general, both users and manufacturers of electronic devices find it desirable to incorporate as much functionality as possible into as limited a space as possible, and produce high functionality devices as cheaply as possible. Thus devices that perform multiple functions with minimal amounts of device “real estate”, power, and manufacturing costs are generally preferred.
Although some prior art devices have addressed issues of user fingerprint identification, and other prior art devices have addressed issues of how to efficiently control and navigate through electronic devices using hand and finger motion for such control purposes, no completely satisfactory devices that allow for both user identification and precise user control of an electronic device have yet been marketed. Prior art devices that attempt to unify such user identification and device control functions primarily consist of relatively large mouse devices or mouse pads with a fingerprint sensor added somewhere on the device. Such devices are represented by U.S. Pat. Nos. 6,337,919; 6,400,836; and 5,838,306. Unfortunately, these prior art devices have tended to be both large and expensive, and thus have achieved only limited use as premium optional extras in business desktop and laptop computers.
If this type of fingerprint recognition and easy user navigational control could be produced in an extremely small, low cost, and low power device, this type of device could be used in a much broader variety of applications, and would likely be well received by both manufacturers and users.
Therefore, there exists a need in the art to more accurately sense finger swiping motion across a fingerprint sensor and to accurately calculate the speed and location of the finger that is in motion across the sensor. There also exists a great need in the art for a more efficient means to accurately sense and capture fingerprints on portable microprocessor controlled devices (e.g. cell phones, smart cards, PDAs, laptop computers, MP3 players, and the like). There is also a need for more convenient and efficient ways to provide navigation and control operations on such portable devices. As will be seen, the invention provides for these multiple needs in an elegant manner.
Further improvements in the finger location and movement sensing technology previously disclosed in U.S. Pat. Nos. 7,099,496 and 7,146,024 and published patent applications US 2005-0235470 A1 and US 2005-0244039 A1 are possible, and some of these additional improvements are described herein. These improvements include deep finger penetrating radio frequency (RF) based partial fingerprint imagers that can be inexpensively printed or formed on flexible dielectrics, such as Kapton tape, and which produce robust combination fingerprint scanners and “finger mouse” devices.
These enhanced accuracy finger position and motion sensors can be used in a greater variety of different applications. These higher accuracy finger motion sensors may be used (either with or without a partial fingerprint imager) to control electronic devices. When several of these finger motion and position sensors are aligned in different directions, finger motion over a two dimensional surface may be detected. This allows a new type of finger controlled “mouse” electronic input device to be created. Motion of a finger along the surface of such sensors may allow a user to control the movement of an indicator on a display screen, and control an electronic or microprocessor controlled device.
Such sensors are particularly useful for small space constrained devices, such as cell phones, smart cards, music players, portable computers, personal digital accessories, and the like. Here ergonomic considerations and tactile feedback may facilitate use of such sensors, and use of textured surfaces and other techniques to achieve this end are discussed. Since such small space constrained devices are often designed to be both very robust and extremely low cost, techniques and methods to produce robust and cost effective sensors are also highly useful, and these techniques and methods are also discussed.
The invention uses multiple partial fingerprint readers (imagers), and algorithms capable of analyzing the output from these readers, to detect changes in fingerprint images as a finger moves over the imagers. This information can be used to provide finger motion information, or to produce fingerprint images, or both as desired by the particular application. By arranging the multiple partial fingerprint readers in different directions on a surface, finger motion in two dimensions may be detected, and this finger motion information used to control a wide variety of electronic devices.
Because, as will be discussed, these partial fingerprint readers or imagers are capable of detecting finger motion, they will often be referred to in this document in the alternative as “motion sensors” or “independent relative motion sensors”.
The partial fingerprint readers used as independent relative motion sensors are typically based on a linear array of sensing elements (pixels) that capture a narrow one dimensional array (string) of data that is indicative of fingerprint features along a linear portion of the underside of a finger. This is essentially a one dimensional slice of a two dimensional fingerprint. This string of data is used to determine the velocity of finger travel for use in navigation operations. By using multiple sensors arranged in different directions, finger motion and direction data can be computed, and this data can be used to determine the finger motion over the two dimensional surface of the sensor. Like a standard “mouse” pointing device, this two-dimensional finger motion data can be used to move a cursor on a monitor, or otherwise navigate over various menu and selection options on an electronic device.
In operation, the linear sensor array used in the partial fingerprint imager (motion sensor) senses and captures fingerprint features in the form of a string of data signals by first sensing the fingerprint features in an initial sensing and image capture step. A short time later (usually a fraction of a second), another image of the finger features is captured. If the finger is stationary, the two partial fingerprint images will be the same, but if the finger is in motion, a different portion of the fingerprint will usually be imaged and the two images will differ. The more the finger moves, the more the two images will differ.
To generalize, the first image capture operation or step is followed by one or more subsequent operations or steps where a sample of a subset of the fingerprint features is taken again after a known time period. This time period may be predetermined, or measured as time progresses between sensing and capturing of the samples. Once at least two samples are taken, the subsequent sample is compared against the previous sample to determine the amount of shift or change in the previous sample relative to the subsequent sample. In one embodiment, a single linear line of sensor pixels is used to sense a one-dimensional track of fingerprint features, and the signal sensed by the pixels is converted from an analog signal to a digital signal, where the features are then represented as a one dimensional array (string) of digital values. For example, the ridges of the fingerprint features may be represented as logical ones, and valleys represented as logical zeros. Often, the actual values will be expressed with 8 bits of feature height per pixel (0-255), but other levels of per-pixel fingerprint feature resolution (e.g. 1-bit, 2-bit, 4-bit, 12-bit, and 16-bit) are also quite feasible.
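By way of a purely illustrative, non-limiting sketch (written in Python, which forms no part of the disclosed hardware), the quantization described above might be expressed as follows. The threshold value and bit depths here are assumptions chosen only for the example:

```python
# Illustrative sketch: quantizing one linear scan of analog pixel readings.
# The threshold and bit depth are example assumptions, not values required
# by the invention.

def quantize_1bit(raw_scan, threshold=128):
    """Map each analog reading to 1 (ridge) or 0 (valley)."""
    return [1 if value >= threshold else 0 for value in raw_scan]

def quantize_nbit(raw_scan, bits=8):
    """Clamp each reading into an n-bit feature-height value (0-255 for 8 bits)."""
    max_val = (1 << bits) - 1
    return [max(0, min(max_val, int(value))) for value in raw_scan]

print(quantize_1bit([30, 200, 180, 15, 90, 220]))  # [0, 1, 1, 0, 0, 1]
```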
A variety of different partial fingerprint imagers may be used for this purpose. Although, to keep this disclosure to a manageable length, most of the examples of linear sensor arrays will be based on the deep finger penetrating radio frequency (RF) based sensors described in U.S. Pat. Nos. 7,099,496; 7,146,024; and published patent applications US 2005-0235470 A1; US 2005-0244039 A1; US 2006-0083411 A1; and US 2007-0031011 A1, it should be understood that this particular example is not intended to limit the scope of the invention in any way. Other methods, such as optical sensors, pressure sensors, etc., may also be used.
The first string of digital values from one sample can be compared to the second string in a one to one relationship (e.g. comparing, on a per pixel basis, the fingerprint feature data in the first string with the fingerprint feature data in the second string), and a similarity score can be produced that measures the number of matching values. If there is an immediate match, where both strings are substantially identical, this indicates that there was no finger movement during the time between which the two samples were taken. If there is not an immediate match, this indicates that there was some finger movement, and additional comparisons may be needed to determine the distance traveled. For each comparison, the strings of digital values can be shifted one or more pixels at a time. Once a good match is found, the distance traveled by the fingerprint is simply the number of pixels shifted times the distance between the pixels, which may be measured, for example, from the center point of one pixel to the center point of another pixel in the array of pixel sensors.
In one embodiment, a predetermined number of comparisons can be made, along with corresponding similarity scores. The process may then choose the highest score to determine the most accurate comparison. Since the size of and distance between the pixels are predetermined, the number of pixels shifted to obtain the best comparison can then be used to determine the distance traveled by the fingerprint across the motion sensor over the time period of the motion.
In another embodiment, the process could make comparisons and generate scores to measure against a predetermined threshold, rather than making a predetermined number of comparisons. In this embodiment, the similarity score from each comparison can be measured after the comparison is made. If the score is within the threshold, then it can be used to indicate the amount of shift from one sample to another. This can then be used to determine the distance traveled by the fingerprint across the linear motion sensor.
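The shift-and-compare logic of the three preceding paragraphs can be illustrated with the following non-limiting Python sketch, which covers both the fixed-number-of-comparisons strategy and the threshold strategy. The pixel pitch, maximum shift, and per-pixel scoring rule are assumptions made only for illustration:

```python
# Illustrative sketch of shift-and-compare matching between two samples
# taken by the same linear sensor array.  Pixel pitch (50.8 um, i.e. 500
# dpi), max_shift, and the scoring rule are example assumptions.

def similarity(a, b):
    """Fraction of positions at which two equal-length pixel strings match."""
    matches = [x == y for x, y in zip(a, b)]
    return sum(matches) / len(matches)

def distance_traveled(first, second, pixel_pitch_um=50.8,
                      max_shift=16, threshold=None):
    """Shift the samples against each other one pixel at a time.
    If `threshold` is given, stop at the first score within it;
    otherwise keep the best score over max_shift comparisons.
    Returns (distance_um, score)."""
    best_shift, best_score = 0, similarity(first, second)
    for shift in range(1, max_shift):
        score = similarity(first[shift:], second[:-shift])
        if threshold is not None and score >= threshold:
            return shift * pixel_pitch_um, score
        if score > best_score:
            best_shift, best_score = shift, score
    return best_shift * pixel_pitch_um, best_score
```

A zero-pixel best shift indicates a stationary finger; a nonzero shift, divided by the sample interval, yields the finger velocity used in the operations discussed below.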
In one embodiment, generally, the invention provides a fingerprint motion tracking system and method, where a single linear sensor array is configured to sense features of a fingerprint along an axis of finger motion. The linear sensor array includes a plurality of substantially contiguous sensing elements or pixels configured to capture a segment of image data that represents a series of fingerprint features passing over a sensor surface.
A memory buffer (usually random access memory (RAM), but alternatively flash memory, EEPROM, or other memory) is configured to receive and store image data from the linear sensor array. And, a data “processing element” (often a microprocessor and a relevant driver program) is configured to generate fingerprint motion data. The linear sensor array may be configured to repeatedly sense at least two substantially contiguous segments of fingerprint data, and the processor can generate motion data based on at least two sensed contiguous segments of fingerprint data. In operation, the linear sensor array is configured to sense a first set of features of a fingerprint along an axis of finger motion and to generate a first set of image data captured by a plurality of substantially contiguous pixels of the sensor array. The linear sensor array is also configured to subsequently sense a second set of features of the fingerprint along an axis of finger motion and to generate a second set of image data captured by a plurality of substantially contiguous pixels of the sensor array. The processing element can then compare first and second sets of image data to determine the distance traveled by the fingerprint over a time interval.
As used herein, linear sensor array is a generic term that relates to a portion of sensing elements, whether they are pixels in an optical reader, a static or radio frequency reader that reads electric field intensity to capture a fingerprint image, piezoelectric components in touch-sensitive circuit fingerprint readers, or other elements indicative of fingerprint readers, where the elements are used to sense a portion of the fingerprint, rather than the entire fingerprint. Such sensor arrays may be configured in a number of ways within a matrix of well known sensor devices. As previously discussed, several modern configurations are described and illustrated in pending U.S. Patent Application Publication Number US 2006-0083411 A1 entitled: Fingerprint Sensing Assemblies and Methods of Making; U.S. Patent Application Publication Number US 2005-0244039 A1 entitled: Methods and Apparatus for Acquiring a Swiped Fingerprint Image; U.S. Patent Application Publication Number US 2005-0235470 A1, entitled: Fingerprint Sensing Methods and Apparatus; U.S. Pat. No. 7,099,496 entitled: Swiped aperture capacitive fingerprint sensing systems and methods, and other applications that are all assigned to common assignee Validity, Inc. Also, many other types of sensor matrices exist in the art directed to capturing fingerprint images. The invention is directed to a novel system, device and method that is not limited in application to any particular sensor matrix or array configuration. In fact, the invention can be used in conjunction with or incorporated into such configurations to improve performance, and further to reduce the processing resources required to capture and reconstruct images.
According to the invention, the linear sensor is substantially contiguous, which is to say that the sensor elements are in relative proximity to each other so that a first reading of a portion of fingerprint features can be taken, followed by a second reading, after a short period of time, from another position. The two samples can be compared to determine the relative distance traveled by the fingerprint surface in relation to the sensor surface. The linear sensor is configured to merely take a relatively small sample of the fingerprint at one point in time, then another at a subsequent time. These two samples are used to determine movement of the fingerprint. Two or more samples may be compared in order to compute the direction and velocity of a fingerprint surface relative to the linear sensing elements. These samples may be linear, as described below and illustrated in the drawings, so that a linear array of fingerprint features can be recorded and easily compared to provide a basis for determining motion, i.e., distance traveled over time. If more than one sensor is employed, it is possible to determine direction of motion using vector addition with the different linear samples taken. Thus, some of the functions provided by the invention are a result of taking a linear sample to give a basis for vector analysis. However, those skilled in the art will understand that, given the description below and the related drawings, other embodiments are possible using other configurations of motion sensors, which would not depart from the spirit and scope of the invention, which is defined by the appended claims and their equivalents, as well as any claims and amendments presented in the future and their equivalents.
One useful feature of the invention is that ambiguity in results is substantially prevented. If properly configured, a system configured according to the invention can consistently produce a result, where at least two samples can be taken such that the features of one sample overlap with another sample. Then, comparisons can be made to determine the amount of shift, indicating the amount of movement of the fingerprint across the linear sensor. In prior art systems and methods, it is often the case that no result occurs, and a singularity results. Thus, a user would need to repeat sensing the fingerprint. In some systems, substantial predictor algorithms have been created in an attempt to compensate or resolve the singularity when it occurs. Such applications are very large and demand a good deal of computation and processing resources, which would greatly bog down a portable device. According to the invention, sensing motion of a fingerprint is substantially certain, where samples taken from the fingerprint surface are consistently reliable. This is particularly important in navigation applications, where relative movement of the finger translates to movement of an object such as a cursor on a graphical user interface (GUI), discussed further below.
In one embodiment, the linear sensor array may be used alone to determine linear movement of a fingerprint. In another embodiment, the single sensor array may be used in conjunction with one or more other linear sensor arrays to determine movement in two dimensions. In either embodiment, the linear sensor arrays are utilized solely for determining motion. If the motion of the analyzed fingerprint occurs generally along a predetermined axis of motion, the single linear sensor array can be utilized to sense the velocity of the fingerprint being analyzed. To capture and record the motion of a fingerprint that is not directed along a predetermined axis of motion, two or more linear arrays (a plurality of arrays) can be used together to sense and record such motion, and a processor can determine the direction and speed of the fingerprint using vector arithmetic.
In yet another embodiment, one or more such linear arrays may be used in conjunction with a fingerprint sensor matrix to more accurately capture and reconstruct a fingerprint image. The sensor matrix can be configured to sense and capture an image of a portion of a fingerprint being analyzed, and the one or more linear arrays can provide motion information for use in reconstructing a fingerprint image. A device so configured would be able to more accurately sense, capture, record and reconstruct a fingerprint image using less processing resources than conventional devices and methods.
One advantage that the invention has over the prior art (Immega and Mainguet, for example) is that the invention separates the analysis of motion from the capturing of the entire fingerprint image. The technique described in Immega, for example, requires the entire image to be captured and recorded line by line. The lines are used both to record the object being sensed and to calculate the speed of the object as it passes over the perpendicular slot. Immega requires immense processing and storage resources to sense, capture, record and reconstruct the image, and all of these functions are carried out by processing the entire lot of image data captured and recorded. Similarly, a device configured according to Mainguet must capture large portions of the fingerprint image and requires substantial processing and storage resources to overlap and match the image mosaics to reconstruct the image. In stark contrast, the invention provides a means for detecting motion of a fingerprint separately from the process of capturing a fingerprint image, and uses the motion information to more efficiently reconstruct the fingerprint image using less processing and storage resources. The invention further provides a means for generating navigation information using the same mechanism.
Alternatively, in yet another embodiment, one or more arrays can be used to generate motion information for use in accurate navigational operations, such as for use in navigating a cursor on a graphical user interface (GUI). Utilizing the improved processing functions of the invention, an improved navigation device can be constructed that is compatible with a portable device that has the power and processing restrictions discussed above. Examples of such embodiments are described and illustrated below.
A motion sensor configured according to the invention uses substantially less space and power compared to conventional configurations for motion sensing, navigation and fingerprint image reconstruction. Additionally, as will be discussed, sensors configured according to the present invention are very robust, and may be produced by an extremely low cost process.
Such a configuration can further aid conventional fingerprint reconstruction processes by better sensing the motion of a finger while it is being analyzed by a sensing device. This gives a fingerprint sensing device the ability to reconstruct an analyzed fingerprint with reduced power. Utilizing the invention, conventional processes that need to match and construct fragmented images of a fingerprint, particularly devices that sense and process a fingerprint in portions, can be optimized with information related to fingerprint motion that occurs while a fingerprint surface is being read. Also, using this unique motion detection technology, optimal navigation functions can be provided that are extremely small and robust, and that function with relatively little electrical power. Such navigation functions can enable miniaturized navigation devices to be integrated into a portable device system, such as a mouse pad used to move a cursor across a graphical user interface (GUI) on portable electronic devices including cellular phones, laptop computers, personal data assistants (PDAs), and other devices where low power navigation functions are desired. A novel system and method are provided that use minimal space and processing resources in providing accurate motion detection, from which fingerprint sensors as well as navigation systems can greatly benefit.
A device or system configured according to the invention can be implemented as a stand alone navigation device, or as a device to provide image reconstruction information for use with a line imaging device that matches and assembles a fingerprint image. Such a line imaging device may be any imaging device configured to sense and capture portions of a fingerprint, whether it captures individual perpendicular image lines of a fingerprint or multiple perpendicular lines. In operation, a motion detection device can operate as a separate motion detection and/or direction detection device. Alternatively, a motion detection device can be used in conjunction with a line imaging device to more accurately and efficiently sense, capture, store and reconstruct a fingerprint image. A device configured according to the invention may include a single array of finger ridge sensing pixels or data sensor points centrally located along the principal axis of motion to be detected, a sampling system to periodically sample the finger contact across the array, and a computational module or element that compares two sets of samples collected at different times to determine the distance traveled between the two sample times. According to the invention, the motion sensor pixels do not necessarily need to have the same resolution as the line imager. The motion sensor pixels may in fact use a different sensing technique than the imager.
Again, the invention provides separate operations for detecting motion and for sensing and capturing a fingerprint image. Thus, the techniques used for the separate processes can be the same or may be different depending on the application. Those skilled in the art will understand that different variations of the separate processes are possible using known techniques and techniques can be derived without any undue experimentation. Such variations would not depart from the spirit and scope of the invention.
In another embodiment, the invention provides the capability of multi-axis motion sensing with additional off-axis sensing arrays. In this embodiment, there are two or more (a plurality of) sensor arrays for detecting motion, and each axis is independently measured to determine the component of velocity in that axis. The velocity components from the individual axes are used to compute a vector sum to determine the actual direction and velocity of motion of the finger with respect to the sensor surface. According to the invention, it is not necessary to capture the full image of the fingerprint in order to determine the distance traveled and the velocity. It is only necessary to capture a linear sample of fingerprint features along the line of motion of the fingerprint. In one embodiment, a plurality of samples, such as two or three samples, are captured by motion sensor pixels and are used to determine the distance traveled across the axis of motion of the fingerprint relative to the sensor surface and the velocity at which the motion occurs. This information can also be used in navigational operations, and can further be used in combination with a fingerprint imager to aid in reconstructing a fingerprint image. Utilizing the invention, either application can be configured in an economical and useful manner. Moreover, the operation of such a sensor or navigational device can be optimized to consume substantially less power than conventional devices, which require excessive processor operations for reassembly of the fingerprint image. And, given the motion information generated by a system configured according to the invention, the distance traveled and velocity of the fingerprint can be used to more accurately and efficiently reconstruct a full fingerprint or to better represent relative motion information for use in navigation.
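As a minimal illustration of the vector summation just described, and assuming for brevity two orthogonal sensor axes (multi-axis ensembles at other angles are discussed below), the per-axis velocity components might be combined as follows:

```python
import math

# Illustrative sketch: combining velocity components measured along two
# orthogonal motion-sensor axes into a single speed and direction.
# The axis arrangement and units are example assumptions.

def combine_velocity(vx, vy):
    """Vector sum of two orthogonal axis velocities.
    Returns (speed, direction in degrees from the x-axis sensor)."""
    return math.hypot(vx, vy), math.degrees(math.atan2(vy, vx))

speed, angle = combine_velocity(vx=12.0, vy=5.0)  # e.g. mm/s per axis
# speed == 13.0 mm/s, angle ~= 22.6 degrees
```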
Aligning the pixels along the axis of motion, rather than perpendicular to it, enables the use of motion detection algorithms that can be both time-variant and distance-variant. This enables development of algorithms that utilize short distance measurements over long time periods for low speed motion, and longer distance measurements to more accurately measure higher speed motion, thus optimizing response time and accuracy. Both embodiments share the advantages gained by acquiring and comparing multiple spatial measurements of the fingerprint pattern at each sampling instance. Because multiple samples are taken and compared simultaneously, effects of sampling error, both due to noise and imprecision in the sampling of the finger pattern, are minimized. Also, because samples are taken at multiple locations along the axis of motion simultaneously at each sampling period, the images from two sampling periods can be compared to detect whether there has been any significant finger motion between the two sample times. One shared advantage is that both systems are capable of detecting under-sampling of the image being acquired by the line imager, as a consequence of their ability to detect motion of multiple pixels in a short time interval.
An embodiment using a single segmented motion sensor array offers the advantage of detecting motion over a shorter range of distance. This provides faster response time, particularly at low finger speeds that may be encountered in navigation applications. Because this embodiment is sensitive to single pixel motion, it provides unique features that may also reduce the memory requirements for the computational elements. In order to provide a navigation device, as well as to detect and correct for finger motion that is not completely aligned with the desired axis, either of the embodiments may be combined in ensembles such that one sensor is aligned on the axis of motion, and additional sensors aligned at an angle (such as 22.5 or 30 degrees) to the principal axis of finger motion. Examples of different embodiments are discussed below.
Referring to
The system further includes a sensor module (102) that is used to sense the fingerprint surface (106) of a user's finger (104) when the finger is moved across fingerprint sensing surface (108). As can be seen, the fingerprint sensing surface (108) is illustrated as a narrow surface that is designed to sense and capture portions of a fingerprint as it moves across the sensor. These portions can be subsequently reconstructed according to the invention using motion information from the motion sensors (110), (112). Thus, the sensor components illustrated in
Referring to
In different applications and devices, this interaction may take on many forms. A user may hold his or her finger stationary, and move the device that the sensor is attached to. In this mode, the fingerprint surface is stationary and sensor (102) moves relative to the fingerprint, not unlike a moving scanner in a photocopy machine. More typically, the sensor will be fixed to a surface, such as on the surface of a laptop computer or cellular phone, and the user will move his or her finger over the sensor, moving the fingerprint surface (106) by rubbing it against and along the fingerprint sensing surface (108). The net effect is the same, and again, the sensor (102), which is a partial fingerprint imager, can analyze and read a larger portion of the fingerprint.
As previously discussed, when used in conjunction with deep finger penetrating radio frequency (RF) based sensor technology, the techniques of the present invention lend themselves to robust sensors and a low cost manufacturing process. As discussed in U.S. Pat. Nos. 7,099,496; 7,146,024; and published patent applications US 2005-0235470 A1; US 2005-0244039 A1; US 2006-0083411 A1; and US 2007-0031011 A1, incorporated herein by reference, suitable linear partial fingerprint imagers or linear motion sensors can be produced by simply printing conducting circuit traces (electrodes, electrical traces or conducting traces) on the surface of a flexible dielectric tape or thin film substrate, such as a polyimide tape. Often the Kapton® tape produced by DuPont Corporation will be suitable for this purpose, and thus will be used throughout as a specific example of a suitable substrate or support for the present invention.
Often it will be convenient to bond one or more integrated circuit chips, used to drive the sensor, to the Kapton tape, producing a complete sensor.
Referring to
Since a single wide or full linear sensor (166) is all that is needed to create a fingerprint image for user authentication and verification purposes, the other sensors (162), (164) used for finger motion detection do not need to be as large as the “imaging” fingerprint sensor. Since these “finger motion detection” sensors can be smaller and still function adequately, it often will be convenient to pack a larger number of small motion detection sensors into a device in order to determine finger motion with higher accuracy. Using a larger number of motion sensors makes the system more robust to variations in user technique, and capable of more accurate finger motion determination.
Note that although
In this type of embodiment, the conductive elements (traces) (158) form a one dimensional array, where each trace corresponds to a pixel in the array. The traces are sequentially excited by the IC chip (168), (170), (172), which drives the traces with short bursts of oscillating electrical current. Topographic variations in the underside of the finger (the fingerprint) modulate this signal, which is picked up by a sensing pickup plate (166), (162), (164) mounted at a right angle to the traces, and separated from the traces by a small gap. Essentially the IC chip scans through the various traces, detects the corresponding signal at the pickup plates (166), (162), (164), and analyzes the result. A fingerprint ridge will return a different signal than a fingerprint valley, and a one dimensional image of the underside of the finger (the fingerprint) immediately on top of the pickup plate (166), (162), (164) results. By moving the finger, successive portions of the underside can be imaged.
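A notional sketch of this progressive readout follows. Here `excite_trace` and `read_pickup` are hypothetical stand-ins for the hardware access performed by the scan IC (the actual driver circuitry is described in the patent documents cited above), and the trace count is an assumed value:

```python
# Notional progressive-scan loop.  excite_trace() and read_pickup() are
# hypothetical placeholders for scan-IC hardware access, not a real API.

NUM_TRACES = 192  # assumed number of excitation electrodes in the array

def scan_line(excite_trace, read_pickup):
    """Excite each trace in turn and sample the pickup plate, yielding a
    one dimensional image of the finger surface over the sensor gap."""
    line = []
    for trace in range(NUM_TRACES):
        excite_trace(trace)         # short burst of oscillating current
        line.append(read_pickup())  # ridge vs. valley modulates the signal
    return line
```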
In
Referring to
Both
In both
As was discussed previously, and as will be discussed later in this disclosure as well, these miniaturized, robust, and low-cost sensors lend themselves well to incorporation into a variety of different low-cost electrical devices. Often the users of such devices (such as MP3 players, cell phones, and the like) will be using the devices under less than ideal conditions. Here, use may be facilitated by various ergonomic measures, such as texturing the surface of the sensor with various tactile cues designed to help guide the user's finger to the position on the “mouse” that is optimal for device control and/or fingerprint scanning. Often this can be done by providing various bumps or other texturing on the surface of the scanner, ideally positioned in useful locations. As an example, in the case where a two dimensional finger motion sensor (mouse) device consists of multiple partial fingerprint imagers arranged in a geometric pattern, placing a bump near the center of this pattern (176) will signal the user that placing a finger near this central bump will likely produce optimal results in terms of finger motion sensing and device control. Similarly, placing other textures or bumps near the circumference or perimeter of the geometric pattern of partial fingerprint imagers will signal to the user the boundaries beyond which the device is not likely to detect finger motion with optimal sensitivity. By use of such tactile feedback, users may rapidly become accustomed to the device, and obtain good results with a minimal learning curve.
Referring to
Referring to
Such surface textures may be produced either by putting a bump on the surface underneath the thin flexible Kapton tape support, or alternatively by depositing material on top of the Kapton tape support. When it is desired to give the sensors (166), (162) or (164) texture, it often will be useful to put the bump on a surface underneath the Kapton tape. The thin flexible Kapton tape can then be placed on top of this support and textured bump, so that the tape rides up over the bump and then down again, creating the final textured bump. This causes the electrical circuit traces or electrodes on the Kapton tape to press more firmly against the underside of the finger in these regions, and can improve sensitivity.
Referring to
Referring to
Although use of Kapton tape has a number of advantages, other substrates may also be used for the present invention. For example, a ceramic or plastic substrate, such as those frequently used in integrated circuit housings, may be used. In the illustration of
As discussed herein, the invention can be applied to either type of configuration, and is adaptable to any application where motion and direction information may be useful, such as for navigating objects such as cursors on a graphical user interface, or other applications.
Referring again to
Referring to
Referring to
Referring to
According to another embodiment (102(a)) of the invention illustrated in
Referring to
Referring to
Referring to
Referring to
If used for navigation purposes, any of the motion sensor configurations above can be utilized for different navigation operations. For example, referring again to
Navigation can be most useful in two dimensional space, where motion and direction information are required. In prior art motion sensors, only one-directional motion can be detected, and, as discussed above, even the most basic motion detection requires a large amount of computation and processing resources. According to the invention, a navigation sensor can be configured to sense motion and direction. The motion and direction information can then be processed for use in various navigation operations for devices, such as to operate as a computer mouse for example. Referring again to
Thus, if a user strokes a fingerprint surface against a motion sensor surface, the arrays can pick up the motion and direction information, and a processor can process the information to generate relative motion and direction information for use in navigation, such as for a computer mouse. In this example, a user can move a finger to control a cursor on a graphical user interface (GUI), such as that of a computer screen, a cellular phone, a personal data assistant (PDA) or other personal device. The navigation sensor can then cause the cursor to move relative to the fingerprint motion, and a user can navigate across the GUI to operate functions on a computer or other device. Since the motion of the cursor is relative to the movement of the fingerprint surface against the navigation sensor, relatively small movements can translate to equal, lesser or even greater distance movements of the cursor.
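A minimal sketch of this relative mapping, using an assumed fixed gain, follows; an actual device might tune the gain, or make it velocity dependent, according to the application:

```python
# Illustrative sketch: scaling finger displacement into cursor displacement.
# The gain value is an example assumption.

def cursor_delta(dx_mm, dy_mm, gain=8.0):
    """Scale finger motion (mm) into cursor motion (pixels).  A gain above
    1 lets small finger movements cover larger screen distances."""
    return round(dx_mm * gain), round(dy_mm * gain)

print(cursor_delta(2.5, -1.0))  # (20, -8): a small stroke moves the cursor farther
```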
One aspect of the invention that is very useful to navigation configurations is the ability to consistently generate a motion result. As discussed above, the invention provides a means to substantially ensure a result when a fingerprint moves across a motion sensor. This is true for single array motion sensors as well as multiple array sensors used for two-dimensional motion processing. In a navigation application, such a configuration can provide accurate and consistent motion and directional information that allows for smooth and reliable navigational operations.
Referring to
Referring to
Referring to
In one embodiment, in order to support motion at any arbitrary angle, sensor arrays may be oriented at approximately 0, 30, 60, 90, 120, and 150 degrees. Another more robust system might space them at 22.5 degree increments, rather than 30. Once motion reaches 180 degrees, the process can use reverse motion on the zero degree sensor array, and so on. A device configured in this way would have some of the properties of a navigation touchpad such as those used in laptop computers, with the relative motion sensing capability of a computer mouse.
Circuitry, Programs, and Algorithms Used in the Device
For detailed information pertaining to the principles by which the driver circuitry and electronics used in the deep finger penetrating radio frequency (RF) based embodiment of the present invention operate, please refer to U.S. Pat. Nos. 7,099,496; 7,146,024; and patent applications US 2005-0235470 A1; US 2005-0244039 A1; US 2006-0083411 A1; and US 2007-0031011 A1, incorporated herein by reference. The present discussion is not limited to such deep finger penetrating radio frequency (RF) based sensors, but is applicable to electrical sensing methods in general.
Referring to
Referring to
In one embodiment, fingerprint image data is converted into motion data according to the following scheme. In this scheme, a linear sensor array (such as a deep finger penetrating radio frequency (RF) based array, or an optical array, etc.) is disposed along a probable direction of finger motion. This sensor array is comprised of a number of imaging pixel elements arranged along the axis of motion of the finger with a sufficient pixel density to resolve fingerprint ridges and valleys, typically 250-500 dpi. The pixels may sense the presence or absence of a fingerprint ridge through a variety of techniques, such as capacitance, optical imaging, or mechanical pressure. The array of imaging pixels is sampled at a predetermined rate, sufficient to ensure that the finger will not travel more than about two pixels in a sample period. Any reasonable time period could be set, but one example is 500 μsec. In this embodiment, the pixels are configured as a single extended array, and software may subdivide the larger array into a number of potentially overlapping windows.
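The sampling constraint can be verified with simple arithmetic; the short sketch below merely works through the figures given in this paragraph (500 dpi pixels, a limit of about two pixels of travel per sample, and a 500 μsec sample period):

```python
# Worked arithmetic for the sampling constraint described above.
# All figures follow the text of this embodiment.

DPI = 500
PIXEL_PITCH_M = 25.4e-3 / DPI     # ~50.8e-6 m between adjacent pixels
SAMPLE_PERIOD_S = 500e-6          # 500 usec between samples
MAX_PIXELS_PER_SAMPLE = 2

max_finger_speed = MAX_PIXELS_PER_SAMPLE * PIXEL_PITCH_M / SAMPLE_PERIOD_S
print(max_finger_speed)  # ~0.2 m/s: the fastest swipe this rate can track
```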
At each sample time, the state of the sense elements is converted to a series of numerical values. Because earlier and later image samples are taken along the axis of finger motion at different times, the images will generally appear similar if they are sequentially shifted and compared against each other until a match is found. Depending upon the speed of the finger, the distance required for a match will differ, larger finger speeds requiring a greater distance shift. Thus for an absolute distance of motion D in the period between the samples T, the direct finger velocity measurement will be D/T.
Unlike prior art systems and methods, the system does not have to accumulate a large time history when no motion is detected between different fingerprint images taken at different times. The system can simply store or maintain the first fingerprint image (sample) and perform a new computation when the next sample is acquired. This is advantageous in the case where there is no prior knowledge of the approximate speed of the finger. In practice, the finger velocity relative to the sensor surface may vary greatly. The invention eliminates the need for a large buffer of samples to cover a wide dynamic range of finger speeds.
A further advantage offered by the invention is the ability to adjust the sample rate (i.e. the time lapse between acquiring successive fingerprint images) as a function of the rate of motion of the finger. Thus the measurement system may adjust the sample rate to optimize the distance traveled when looking for a match between two successive fingerprint images (frames). For example, if the criterion is that a displacement of 10 pixels between successive images is optimal for determining fast moving fingers, then the time period between acquiring successive images can be adjusted accordingly. This is similar to a “coarse” adjust in that if a mouse cursor on a display screen is coupled to this device, the mouse will move very quickly (i.e. from one side of the screen to the other) but not very precisely. Conversely, as the finger velocity decreases, it is likely that the user wishes to control the mouse very precisely, as for example when a cursor on a screen is approaching the final destination desired by the user. Here, the system can shift to a “fine adjust” mode by changing the time and pixel displacement criteria. For fine adjust mode, the system might determine that the optimum pixel displacement to look for is on the order of a single pixel, and adjust the time difference between measurements to optimize for this result.
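One purely illustrative implementation of this coarse/fine adjustment is sketched below; the displacement targets, the speed cutoff, and the proportional update rule are assumptions rather than requirements of the invention:

```python
# Illustrative sketch of adaptive sample-rate control.  The targets and
# update rule are example assumptions.

def next_sample_period(current_period_s, measured_shift_px,
                       coarse_target_px=10, fine_target_px=1,
                       slow_cutoff_px=2):
    """Pick a pixel-displacement target from the current finger speed,
    then scale the sample period so the next match lands near it."""
    if measured_shift_px == 0:
        return current_period_s  # no motion detected: leave the rate alone
    target = (fine_target_px if measured_shift_px <= slow_cutoff_px
              else coarse_target_px)
    return current_period_s * target / measured_shift_px
```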
Those skilled in the art will understand that there are various methods for changing the sample rate in order to achieve these and other objectives, and the invention is not limited to any particular method, and moreover is inclusive of the various known methods as well as methods readily ascertainable by one skilled in the art without undue experimentation.
Usually, finger motion will not exactly coincide with the direction of a particular linear array or partial fingerprint imager; usually it will be somewhat off the sensor axis. To capture this, multiple sensors, each aligned with a different axis, will be used. Consider a two motion sensor unit, where each sensor has an axis 90° off from the other. In the typical case where finger motion lies between the two axes, the distance a fingerprint feature travels along each sensor array will be less than the entire length of the sensor.
Often multiple (more than two) arrays will be used, each with an axis at a different angle. To detect motion across a range of angles, sensor arrays may be provided at a series of angles disposed so that a match will be found on at least two of the sensor arrays. For example, by arranging the arrays in 30 degree increments across the allowable range of motion axes, it is possible to ensure that even with worst case alignment (i.e. a 15 degree misalignment between the actual axis of motion and the two sensor arrays on either side of it), an image feature will still approximately follow the nearest sensor arrays for more than three pixels of travel. Thus, by sampling the sensor arrays fast enough to ensure that the finger has not traveled more than three pixels between samples, it is possible to determine the axis of motion by finding the adjacent pair of sensors with the highest correlation, and computing the vector sum of the distances traveled along each of them.
Referring to
In step (1210), a similarity score is generated, defining the amount of correlation between the two arrays. This may be in the form of a probability value, a percentage correlation value, or other mathematical value that can be used by the processor to determine the best similarity score among different comparisons. In step (1212), it is determined whether the similarity score falls within a threshold. In one embodiment, the threshold is a predetermined number that is decided according to a particular application. In practice, the invention can be configured to produce correlations that are of a high value, thus justifying a high threshold. Those skilled in the art will understand that such a threshold can be determined without undue experimentation, and that it depends on the application. If the score does not fall within the threshold, then the arrays are shifted to offset alignment in step (1214). The direction of the shifting may be chosen according to a predicted direction in which a user would be expected to move the fingerprint surface across the sensor. If the direction is not known, or if the design calls for either direction, then flexibility can be accommodated by shifting the arrays in multiple directions until an alignment is reached that is within the threshold. In either case, the process returns to step (1208), where the arrays are compared again. A new similarity score is generated in step (1210), and the new score is measured against the threshold. This process can be reiterated until a score passes the threshold, and could possibly register an error if one is not met over time or a predetermined number of cycles. In a practical application, the two arrays can be shifted and processed once for each pixel in one array, since they are equal in length, given that they were taken from the same array. If a score occurs that is within the threshold, then the distance is estimated in step (1216). This can be done by simply counting the number of pixels by which the arrays were shifted before a score occurred within the threshold, and multiplying this number by the distance between pixels, which can be estimated to be the distance between midpoints of two pixels. The distance can be measured more accurately by sampling distances between individual pixels and groups of pixels in an array, but the exact method of measurement will depend on the application. Then, the velocity can be estimated in step (1218) by dividing the distance traveled by the time expended during the travel. The process ends at step (1220), where an estimated velocity value can be generated.
Often the process shown in
Referring to
Continuing, in step (1308), the two arrays are compared. In an initial alignment, referring briefly to
Then the distance is estimated in step (1318). Again, this can be done by simply counting the number of pixels in which the arrays were shifted, and multiplying this number by the distance between pixels, which can be estimated to be the distance between midpoints of two pixels. The distance can be accurately measured by sampling distances between individual pixels and groups of pixels in an array, but the exact method of measurement would depend on the application. Then, the velocity can be estimated in step (1320) by dividing the distance traveled by the time expended during the travel. The process ends in step (1322) where a velocity value can be generated.
Referring to
Continuing, in step (1408), the two arrays are compared for each sensor. In an initial alignment, the digital representations of the arrays of features are compared side by side for each sensor array. If this initial comparison shows a high correlation, then it is indicative of no relative motion between the fingerprint and the motion sensor. In step (1410), a similarity score is generated for each array, defining the amount of correlation between the two arrays. This may be in the form of a probability value, a percentage correlation value, or other mathematical value that can be used by the processor to determine the best similarity score among different comparisons. In step (1412), it is determined whether the shift is the last shift in a predetermined number of shifts. In practice, it is practical to shift at least the number of pixels in each of the array sensors, since both image arrays from each sensor are sensed and sampled by the same sensor array. Again, similar to the process of the invention embodied in
Then the distance is estimated in step (1418). Again, this can be done by simply counting the number of pixels in which the arrays were shifted, and multiplying this number by the distance between pixels, which can be estimated to be the distance between midpoints of two pixels. The distance can be accurately measured by sampling distances between individual pixels and groups of pixels in an array, but the exact method of measurement would depend on the application.
If the similarity score for either of the adjacent arrays exceeds the threshold and this similarity score occurs at a distance less than the distance traveled on the predominant axis, then the principal axis of motion is assumed to lie between the predominant axis and this second axis. The angle of motion is then estimated by computing the ratio of distances along the predominant and secondary axes. The ratio of these distances is approximately equal to the ratio of the cosines of the angles between the actual axis of motion and the axes of the two sensor arrays.
The final estimated distance is computed by taking the distance measured on the predominant axis sensor and dividing it by the cosine of the difference between the estimated angle of motion and the angle of the sensor axis.
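The angle and distance correction can be made concrete with a short sketch. It assumes, as the text does not state explicitly, that the two sensor arrays lie on orthogonal axes; in that case the ratio of the per-axis distances is the tangent of the angle between the motion and the predominant axis, and the stated cosine division recovers the full path length. The function resolveMotion and its parameters are illustrative names.

```cpp
// Sketch of the angle-of-motion estimate and cosine distance correction,
// assuming orthogonal sensor axes. distPredominant is taken to be nonzero,
// since by definition it is the axis with the larger measured distance.
#include <cmath>

struct MotionEstimate {
    double angle;     // radians between the motion and the predominant axis
    double distance;  // total distance along the actual axis of motion
};

MotionEstimate resolveMotion(double distPredominant, double distSecondary) {
    // Ratio of per-axis distances gives the angle of motion.
    double angle = std::atan2(distSecondary, distPredominant);
    // Final distance: the predominant-axis distance divided by the cosine
    // of the angle between the motion axis and that sensor axis.
    double distance = distPredominant / std::cos(angle);
    return {angle, distance};
}
```

For example, with 3 mm measured along the predominant axis and 1 mm along the secondary axis, the estimated angle of motion is about 18.4 degrees and the corrected distance about 3.16 mm.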
Then, the velocity can be estimated in step (1420) by dividing the distance traveled by the time expended during the travel. The process ends in step (1422) where a velocity value can be generated.
Referring to
Referring to
As previously discussed, the invention may also involve a number of functions to be performed by a computer processor, such as a microprocessor. The microprocessor may be a specialized or dedicated microprocessor that is configured to perform particular tasks by executing machine-readable software code that defines the particular tasks. The microprocessor may also be configured to operate and communicate with other devices such as direct memory access modules, memory storage devices, Internet related hardware, and other devices that relate to the transmission of data in accordance with the invention. The software code may be configured using software formats such as Assembly, Java, C++, XML (Extensible Mark-up Language) and other languages that may be used to define functions that relate to operations of devices required to carry out the functional operations related to the invention. The code may be written in different forms and styles, many of which are known to those skilled in the art. Different code formats, code configurations, styles and forms of software programs and other means of configuring code to define the operations of a microprocessor in accordance with the invention will not depart from the spirit and scope of the invention.
Within the different types of computers, such as computer servers, that utilize the invention, there exist different types of memory devices for storing and retrieving information while performing functions according to the invention. Cache memory devices are often included in such computers for use by the central processing unit as a convenient storage location for information that is frequently stored and retrieved. Similarly, a persistent memory is also frequently used with such computers for maintaining information that is frequently retrieved by a central processing unit, but that is not often altered within the persistent memory, unlike the cache memory. Main memory is also usually included for storing and retrieving larger amounts of information such as data and software applications configured to perform functions according to the invention when executed by the central processing unit. These memory devices may be configured as random access memory (RAM), static random access memory (SRAM), dynamic random access memory (DRAM), flash memory, and other memory storage devices that may be accessed by a central processing unit to store and retrieve information. The invention is not limited to any particular type of memory device, or any commonly used protocol for storing and retrieving information to and from these memory devices respectively.
The apparatus and method described herein enable and control fingerprint sensors, fingerprint image data, and motion data in conjunction with the operation of an electronic device in which navigation and fingerprint verification processes are utilized. Although this embodiment is described and illustrated in the context of devices, systems, and related methods for imaging fingerprints and navigation features in a portable device, the scope of the invention extends to other applications where such functions are useful. Furthermore, while the foregoing description has been with reference to particular embodiments of the invention, it will be appreciated that these are only illustrative of the invention and that changes may be made to those embodiments without departing from the principles of the invention.
This application is a continuation of application Ser. No. 12/103,655 filed Apr. 15, 2008 now U.S. Pat. No. 8,175,345, entitled “Unitized Ergonomic Two-Dimensional Fingerprint Motion Tracking Device and Method,” by Lawrence C. Gardner, the contents of which are incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
4151512 | Riganati et al. | Apr 1979 | A |
4225850 | Chang et al. | Sep 1980 | A |
4310827 | Asi | Jan 1982 | A |
4353056 | Tsikos | Oct 1982 | A |
4405829 | Rivest et al. | Sep 1983 | A |
4525859 | Bowles et al. | Jun 1985 | A |
4550221 | Mabusth | Oct 1985 | A |
4580790 | Doose | Apr 1986 | A |
4758622 | Gosselin | Jul 1988 | A |
4817183 | Sparrow | Mar 1989 | A |
5076566 | Kriegel | Dec 1991 | A |
5109427 | Yang | Apr 1992 | A |
5140642 | Hau et al. | Aug 1992 | A |
5305017 | Gerpheide | Apr 1994 | A |
5319323 | Fong | Jun 1994 | A |
5325442 | Knapp | Jun 1994 | A |
5420936 | Fitzpatrick et al. | May 1995 | A |
5422807 | Mitra et al. | Jun 1995 | A |
5429006 | Tamori | Jul 1995 | A |
5456256 | Schneider et al. | Oct 1995 | A |
5543591 | Gillespie et al. | Aug 1996 | A |
5569901 | Bridgelall et al. | Oct 1996 | A |
5623552 | Lane | Apr 1997 | A |
5627316 | De Winter et al. | May 1997 | A |
5650842 | Maase et al. | Jul 1997 | A |
5717777 | Wong et al. | Feb 1998 | A |
5781651 | Hsiao et al. | Jul 1998 | A |
5801681 | Sayag | Sep 1998 | A |
5818956 | Tuli | Oct 1998 | A |
5838306 | O'Connor et al. | Nov 1998 | A |
5848176 | Harra et al. | Dec 1998 | A |
5850450 | Schweitzer et al. | Dec 1998 | A |
5852670 | Setlak et al. | Dec 1998 | A |
5864296 | Upton | Jan 1999 | A |
5887343 | Salatino et al. | Mar 1999 | A |
5892824 | Beatson et al. | Apr 1999 | A |
5903225 | Schmitt et al. | May 1999 | A |
5915757 | Tsuyama et al. | Jun 1999 | A |
5920384 | Borza | Jul 1999 | A |
5920640 | Salatino et al. | Jul 1999 | A |
5940526 | Setlak et al. | Aug 1999 | A |
5999637 | Toyoda et al. | Dec 1999 | A |
6002815 | Immega et al. | Dec 1999 | A |
6011859 | Kalnitsky et al. | Jan 2000 | A |
6016355 | Dickinson et al. | Jan 2000 | A |
6052475 | Upton | Apr 2000 | A |
6067368 | Setlak et al. | May 2000 | A |
6073343 | Petrick et al. | Jun 2000 | A |
6076566 | Lowe | Jun 2000 | A |
6088585 | Schmitt et al. | Jul 2000 | A |
6098175 | Lee | Aug 2000 | A |
6118318 | Fifield et al. | Sep 2000 | A |
6134340 | Hsu et al. | Oct 2000 | A |
6157722 | Lerner et al. | Dec 2000 | A |
6161213 | Lofstrom | Dec 2000 | A |
6175407 | Santor | Jan 2001 | B1 |
6182076 | Yu et al. | Jan 2001 | B1 |
6182892 | Angelo et al. | Feb 2001 | B1 |
6185318 | Jain et al. | Feb 2001 | B1 |
6234031 | Suga | May 2001 | B1 |
6241288 | Bergenek et al. | Jun 2001 | B1 |
6259108 | Antonelli et al. | Jul 2001 | B1 |
6289114 | Mainguet | Sep 2001 | B1 |
6317508 | Kramer et al. | Nov 2001 | B1 |
6320394 | Tartagni | Nov 2001 | B1 |
6332193 | Glass et al. | Dec 2001 | B1 |
6333989 | Borza | Dec 2001 | B1 |
6337919 | Dunton | Jan 2002 | B1 |
6346739 | Lepert et al. | Feb 2002 | B1 |
6347040 | Fries et al. | Feb 2002 | B1 |
6360004 | Akizuki | Mar 2002 | B1 |
6362633 | Tartagni | Mar 2002 | B1 |
6392636 | Ferrari et al. | May 2002 | B1 |
6399994 | Shobu | Jun 2002 | B2 |
6400836 | Senior | Jun 2002 | B2 |
6408087 | Kramer | Jun 2002 | B1 |
6473072 | Comiskey et al. | Oct 2002 | B1 |
6509501 | Eicken et al. | Jan 2003 | B2 |
6539101 | Black | Mar 2003 | B1 |
6580816 | Kramer et al. | Jun 2003 | B2 |
6597289 | Sabatini | Jul 2003 | B2 |
6643389 | Raynal et al. | Nov 2003 | B1 |
6672174 | Deconde et al. | Jan 2004 | B2 |
6710416 | Xu | Mar 2004 | B1 |
6738050 | Comiskey et al. | May 2004 | B2 |
6741729 | Bjorn et al. | May 2004 | B2 |
6757002 | Oross et al. | Jun 2004 | B1 |
6766040 | Catalano et al. | Jul 2004 | B1 |
6785407 | Tschudi et al. | Aug 2004 | B1 |
6838905 | Doyle | Jan 2005 | B1 |
6886104 | McClurg et al. | Apr 2005 | B1 |
6897002 | Teraoka et al. | May 2005 | B2 |
6898299 | Brooks | May 2005 | B1 |
6924496 | Manansala | Aug 2005 | B2 |
6937748 | Schneider et al. | Aug 2005 | B1 |
6941001 | Bolle et al. | Sep 2005 | B1 |
6941810 | Okada | Sep 2005 | B2 |
6950540 | Higuchi | Sep 2005 | B2 |
6959874 | Bardwell | Nov 2005 | B2 |
6963626 | Shaeffer et al. | Nov 2005 | B1 |
6970584 | O'Gorman et al. | Nov 2005 | B2 |
6980672 | Saito et al. | Dec 2005 | B2 |
6983882 | Cassone | Jan 2006 | B2 |
7013030 | Wong et al. | Mar 2006 | B2 |
7020591 | Wei et al. | Mar 2006 | B1 |
7030860 | Hsu et al. | Apr 2006 | B1 |
7035443 | Wong | Apr 2006 | B2 |
7042535 | Katoh et al. | May 2006 | B2 |
7043061 | Hamid et al. | May 2006 | B2 |
7043644 | DeBruine | May 2006 | B2 |
7046230 | Zadesky et al. | May 2006 | B2 |
7064743 | Nishikawa | Jun 2006 | B2 |
7099496 | Benkley | Aug 2006 | B2 |
7110577 | Tschudi | Sep 2006 | B1 |
7113622 | Hamid | Sep 2006 | B2 |
7126389 | McRae et al. | Oct 2006 | B1 |
7129926 | Mathiassen et al. | Oct 2006 | B2 |
7136514 | Wong | Nov 2006 | B1 |
7146024 | Benkley | Dec 2006 | B2 |
7146026 | Russon et al. | Dec 2006 | B2 |
7146029 | Manansala | Dec 2006 | B2 |
7190816 | Mitsuyu et al. | Mar 2007 | B2 |
7194392 | Tuken et al. | Mar 2007 | B2 |
7197168 | Russo | Mar 2007 | B2 |
7200250 | Chou | Apr 2007 | B2 |
7251351 | Mathiassen et al. | Jul 2007 | B2 |
7258279 | Schneider et al. | Aug 2007 | B2 |
7260246 | Fujii | Aug 2007 | B2 |
7263212 | Kawabe | Aug 2007 | B2 |
7263213 | Rowe | Aug 2007 | B2 |
7289649 | Walley et al. | Oct 2007 | B1 |
7290323 | Deconde et al. | Nov 2007 | B2 |
7308121 | Mathiassen et al. | Dec 2007 | B2 |
7308122 | McClurg et al. | Dec 2007 | B2 |
7321672 | Sasaki et al. | Jan 2008 | B2 |
7356169 | Hamid | Apr 2008 | B2 |
7360688 | Harris | Apr 2008 | B1 |
7369685 | DeLeon | May 2008 | B2 |
7379569 | Chikazawa et al. | May 2008 | B2 |
7409876 | Ganapathi et al. | Aug 2008 | B2 |
7412083 | Takahashi | Aug 2008 | B2 |
7424618 | Roy et al. | Sep 2008 | B2 |
7447339 | Mimura et al. | Nov 2008 | B2 |
7447911 | Chou et al. | Nov 2008 | B2 |
7460697 | Erhart et al. | Dec 2008 | B2 |
7463756 | Benkley | Dec 2008 | B2 |
7505611 | Fyke | Mar 2009 | B2 |
7505613 | Russo | Mar 2009 | B2 |
7565548 | Fiske et al. | Jul 2009 | B2 |
7574022 | Russo | Aug 2009 | B2 |
7643950 | Getzin et al. | Jan 2010 | B1 |
7646897 | Fyke | Jan 2010 | B2 |
7681232 | Nordentoft et al. | Mar 2010 | B2 |
7689013 | Shinzaki | Mar 2010 | B2 |
7706581 | Drews et al. | Apr 2010 | B2 |
7733697 | Picca et al. | Jun 2010 | B2 |
7751601 | Benkley | Jul 2010 | B2 |
7843438 | Onoda | Nov 2010 | B2 |
7899216 | Watanabe et al. | Mar 2011 | B2 |
7953258 | Dean et al. | May 2011 | B2 |
8005276 | Dean et al. | Aug 2011 | B2 |
8031916 | Abiko et al. | Oct 2011 | B2 |
8077935 | Geoffroy et al. | Dec 2011 | B2 |
8107212 | Nelson et al. | Jan 2012 | B2 |
8116540 | Dean et al. | Feb 2012 | B2 |
8131026 | Benkley et al. | Mar 2012 | B2 |
8165355 | Benkley et al. | Apr 2012 | B2 |
20010026636 | Mainguet | Oct 2001 | A1 |
20010030644 | Allport | Oct 2001 | A1 |
20010036299 | Senior | Nov 2001 | A1 |
20010043728 | Kramer et al. | Nov 2001 | A1 |
20020025062 | Black | Feb 2002 | A1 |
20020061125 | Fujii | May 2002 | A1 |
20020064892 | Lepert et al. | May 2002 | A1 |
20020067845 | Griffis | Jun 2002 | A1 |
20020073046 | David | Jun 2002 | A1 |
20020089044 | Simmons et al. | Jul 2002 | A1 |
20020089410 | Janiak et al. | Jul 2002 | A1 |
20020096731 | Wu et al. | Jul 2002 | A1 |
20020122026 | Bergstrom | Sep 2002 | A1 |
20020126516 | Jeon | Sep 2002 | A1 |
20020133725 | Roy et al. | Sep 2002 | A1 |
20020181749 | Matsumoto et al. | Dec 2002 | A1 |
20030002717 | Hamid | Jan 2003 | A1 |
20030002719 | Hamid et al. | Jan 2003 | A1 |
20030021495 | Cheng | Jan 2003 | A1 |
20030035570 | Benkley | Feb 2003 | A1 |
20030063782 | Acharya et al. | Apr 2003 | A1 |
20030068072 | Hamid | Apr 2003 | A1 |
20030076301 | Tsuk et al. | Apr 2003 | A1 |
20030076303 | Huppi | Apr 2003 | A1 |
20030095096 | Robbin et al. | May 2003 | A1 |
20030102874 | Lane et al. | Jun 2003 | A1 |
20030123714 | O'Gorman et al. | Jul 2003 | A1 |
20030123715 | Uchida | Jul 2003 | A1 |
20030141959 | Keogh et al. | Jul 2003 | A1 |
20030147015 | Katoh et al. | Aug 2003 | A1 |
20030161510 | Fujii | Aug 2003 | A1 |
20030161512 | Mathiassen | Aug 2003 | A1 |
20030169228 | Mathiassen et al. | Sep 2003 | A1 |
20030174871 | Yoshioka et al. | Sep 2003 | A1 |
20030186157 | Teraoka et al. | Oct 2003 | A1 |
20030209293 | Sako et al. | Nov 2003 | A1 |
20030224553 | Manansala | Dec 2003 | A1 |
20040012773 | Puttkammer | Jan 2004 | A1 |
20040022001 | Chu et al. | Feb 2004 | A1 |
20040042642 | Bolle et al. | Mar 2004 | A1 |
20040050930 | Rowe | Mar 2004 | A1 |
20040066613 | Leitao | Apr 2004 | A1 |
20040076313 | Bronstein et al. | Apr 2004 | A1 |
20040081339 | Benkley | Apr 2004 | A1 |
20040096086 | Miyasaka | May 2004 | A1 |
20040113956 | Bellwood et al. | Jun 2004 | A1 |
20040120400 | Linzer | Jun 2004 | A1 |
20040125993 | Zhao et al. | Jul 2004 | A1 |
20040129787 | Saito | Jul 2004 | A1 |
20040136612 | Meister et al. | Jul 2004 | A1 |
20040172339 | Snelgrove et al. | Sep 2004 | A1 |
20040179718 | Chou | Sep 2004 | A1 |
20040184641 | Nagasaka et al. | Sep 2004 | A1 |
20040190761 | Lee | Sep 2004 | A1 |
20040208346 | Baharav et al. | Oct 2004 | A1 |
20040208347 | Baharav et al. | Oct 2004 | A1 |
20040208348 | Baharav et al. | Oct 2004 | A1 |
20040213441 | Tschudi | Oct 2004 | A1 |
20040215689 | Dooley et al. | Oct 2004 | A1 |
20040228505 | Sugimoto | Nov 2004 | A1 |
20040228508 | Shigeta | Nov 2004 | A1 |
20040240712 | Rowe et al. | Dec 2004 | A1 |
20040252867 | Lan et al. | Dec 2004 | A1 |
20050031174 | Ryhanen et al. | Feb 2005 | A1 |
20050036665 | Higuchi | Feb 2005 | A1 |
20050047485 | Khayrallah et al. | Mar 2005 | A1 |
20050100196 | Scott et al. | May 2005 | A1 |
20050109835 | Jacoby et al. | May 2005 | A1 |
20050110103 | Setlak | May 2005 | A1 |
20050111708 | Chou | May 2005 | A1 |
20050123176 | Ishii et al. | Jun 2005 | A1 |
20050136200 | Durell et al. | Jun 2005 | A1 |
20050139656 | Arnouse | Jun 2005 | A1 |
20050139685 | Kozlay | Jun 2005 | A1 |
20050162402 | Watanachote | Jul 2005 | A1 |
20050169503 | Howell et al. | Aug 2005 | A1 |
20050210271 | Chou et al. | Sep 2005 | A1 |
20050219200 | Weng | Oct 2005 | A1 |
20050220329 | Payne et al. | Oct 2005 | A1 |
20050231213 | Chou et al. | Oct 2005 | A1 |
20050238212 | Du et al. | Oct 2005 | A1 |
20050244038 | Benkley | Nov 2005 | A1 |
20050244039 | Geoffroy et al. | Nov 2005 | A1 |
20050249386 | Juh | Nov 2005 | A1 |
20050258952 | Utter et al. | Nov 2005 | A1 |
20050269402 | Spitzer et al. | Dec 2005 | A1 |
20060006224 | Modi | Jan 2006 | A1 |
20060055500 | Burke et al. | Mar 2006 | A1 |
20060066572 | Yumoto et al. | Mar 2006 | A1 |
20060078176 | Abiko et al. | Apr 2006 | A1 |
20060083411 | Benkley | Apr 2006 | A1 |
20060110537 | Huang et al. | May 2006 | A1 |
20060140461 | Kim et al. | Jun 2006 | A1 |
20060144953 | Takao | Jul 2006 | A1 |
20060170528 | Funushige et al. | Aug 2006 | A1 |
20060187200 | Martin | Aug 2006 | A1 |
20060210082 | Devadas et al. | Sep 2006 | A1 |
20060214512 | Iwata | Sep 2006 | A1 |
20060239514 | Watanabe et al. | Oct 2006 | A1 |
20060249008 | Luther | Nov 2006 | A1 |
20060259873 | Mister | Nov 2006 | A1 |
20060261174 | Zellner et al. | Nov 2006 | A1 |
20060271793 | Devadas et al. | Nov 2006 | A1 |
20060287963 | Steeves et al. | Dec 2006 | A1 |
20070031011 | Erhart et al. | Feb 2007 | A1 |
20070036400 | Watanabe et al. | Feb 2007 | A1 |
20070057763 | Blattner et al. | Mar 2007 | A1 |
20070067828 | Bychkov | Mar 2007 | A1 |
20070076926 | Schneider et al. | Apr 2007 | A1 |
20070076951 | Tanaka et al. | Apr 2007 | A1 |
20070086634 | Setlak et al. | Apr 2007 | A1 |
20070090312 | Stallinga et al. | Apr 2007 | A1 |
20070138299 | Mitra | Jun 2007 | A1 |
20070180261 | Akkermans et al. | Aug 2007 | A1 |
20070198141 | Moore | Aug 2007 | A1 |
20070198435 | Siegal et al. | Aug 2007 | A1 |
20070228154 | Tran | Oct 2007 | A1 |
20070237366 | Maletsky | Oct 2007 | A1 |
20070248249 | Stoianov | Oct 2007 | A1 |
20080002867 | Mathiassen et al. | Jan 2008 | A1 |
20080013805 | Sengupta et al. | Jan 2008 | A1 |
20080019578 | Saito et al. | Jan 2008 | A1 |
20080049987 | Champagne et al. | Feb 2008 | A1 |
20080049989 | Iseri et al. | Feb 2008 | A1 |
20080063245 | Benkley et al. | Mar 2008 | A1 |
20080069412 | Champagne et al. | Mar 2008 | A1 |
20080126260 | Cox et al. | May 2008 | A1 |
20080169345 | Keane et al. | Jul 2008 | A1 |
20080170695 | Adler et al. | Jul 2008 | A1 |
20080175450 | Scott et al. | Jul 2008 | A1 |
20080178008 | Takahashi et al. | Jul 2008 | A1 |
20080179112 | Qin et al. | Jul 2008 | A1 |
20080185429 | Saville | Aug 2008 | A1 |
20080201265 | Hewton | Aug 2008 | A1 |
20080205714 | Benkley et al. | Aug 2008 | A1 |
20080219521 | Benkley et al. | Sep 2008 | A1 |
20080222049 | Loomis et al. | Sep 2008 | A1 |
20080223925 | Saito et al. | Sep 2008 | A1 |
20080226132 | Gardner | Sep 2008 | A1 |
20080240523 | Benkley et al. | Oct 2008 | A1 |
20080244277 | Orsini et al. | Oct 2008 | A1 |
20080267462 | Nelson et al. | Oct 2008 | A1 |
20080279373 | Erhart et al. | Nov 2008 | A1 |
20090130369 | Huang et al. | May 2009 | A1 |
20090153297 | Gardner | Jun 2009 | A1 |
20090154779 | Satyan et al. | Jun 2009 | A1 |
20090155456 | Benkley et al. | Jun 2009 | A1 |
20090169071 | Bond et al. | Jul 2009 | A1 |
20090174974 | Huang et al. | Jul 2009 | A1 |
20090237135 | Ramaraju et al. | Sep 2009 | A1 |
20090252384 | Dean et al. | Oct 2009 | A1 |
20090252385 | Dean et al. | Oct 2009 | A1 |
20090252386 | Dean et al. | Oct 2009 | A1 |
20090279742 | Abiko | Nov 2009 | A1 |
20090319435 | Little et al. | Dec 2009 | A1 |
20090324028 | Russo | Dec 2009 | A1 |
20100026451 | Erhart et al. | Feb 2010 | A1 |
20100045705 | Vertegaal et al. | Feb 2010 | A1 |
20100083000 | Kesanupalli | Apr 2010 | A1 |
20100119124 | Satyan | May 2010 | A1 |
20100123657 | Shimizu | May 2010 | A1 |
20100127366 | Bond et al. | May 2010 | A1 |
20100176823 | Thompson et al. | Jul 2010 | A1 |
20100176892 | Thompson et al. | Jul 2010 | A1 |
20100177940 | Dean et al. | Jul 2010 | A1 |
20100180136 | Thompson et al. | Jul 2010 | A1 |
20100189314 | Benkley et al. | Jul 2010 | A1 |
20100208953 | Gardner et al. | Aug 2010 | A1 |
20100244166 | Shibuta et al. | Sep 2010 | A1 |
20100272329 | Benkley | Oct 2010 | A1 |
20100284565 | Benkley et al. | Nov 2010 | A1 |
20110002461 | Erhart et al. | Jan 2011 | A1 |
20110018556 | Le et al. | Jan 2011 | A1 |
20110102567 | Erhart | May 2011 | A1 |
20110102569 | Erhart | May 2011 | A1 |
20110182486 | Valfridsson et al. | Jul 2011 | A1 |
20110214924 | Perezselsky et al. | Sep 2011 | A1 |
20110267298 | Erhart et al. | Nov 2011 | A1 |
20110298711 | Dean et al. | Dec 2011 | A1 |
20110304001 | Erhart et al. | Dec 2011 | A1 |
20120044639 | Garcia | Feb 2012 | A1 |
Number | Date | Country |
---|---|---|
2213813 | Oct 1973 | DE |
0929028 | Jan 1998 | EP |
0905646 | Mar 1999 | EP |
0973123 | Jan 2000 | EP |
1018697 | Jul 2000 | EP |
1139301 | Oct 2001 | EP |
1531419 | May 2005 | EP |
1533759 | May 2005 | EP |
1538548 | Jun 2005 | EP |
1624399 | Feb 2006 | EP |
1939788 | Jul 2008 | EP |
2331613 | May 1999 | GB |
2480919 | Dec 2011 | GB |
04158434 | Jun 1992 | JP |
2005242856 | Sep 2005 | JP |
WO 9003620 | Apr 1990 | WO |
WO 9858342 | Dec 1998 | WO |
WO 9928701 | Jun 1999 | WO |
WO 9943258 | Sep 1999 | WO |
WO 0122349 | Mar 2001 | WO |
WO 0194902 | Dec 2001 | WO |
WO 0195304 | Dec 2001 | WO |
WO 0211066 | Feb 2002 | WO |
WO 0247018 | Jun 2002 | WO |
WO 02061668 | Aug 2002 | WO |
WO 02077907 | Oct 2002 | WO |
WO 03063054 | Jul 2003 | WO |
WO 03075210 | Sep 2003 | WO |
WO 2004066194 | Aug 2004 | WO |
WO 2004066693 | Aug 2004 | WO |
WO 2005104012 | Nov 2005 | WO |
WO 2005106774 | Nov 2005 | WO |
WO 2006040724 | Apr 2006 | WO |
WO 2006041780 | Apr 2006 | WO |
WO 2007011607 | Jan 2007 | WO |
WO 2008033264 | Mar 2008 | WO |
WO 2008033265 | Jun 2008 | WO |
WO 2008137287 | Nov 2008 | WO |
WO 2009002599 | Dec 2008 | WO |
WO 2009029257 | Jun 2009 | WO |
WO 2009079219 | Jun 2009 | WO |
WO 2009079221 | Jun 2009 | WO |
WO 2009079257 | Jun 2009 | WO |
WO 2009079262 | Jun 2009 | WO |
WO 2010034036 | Mar 2010 | WO |
WO 2010036445 | Apr 2010 | WO |
WO 2010143597 | Dec 2010 | WO |
WO 2011053797 | May 2011 | WO |
Number | Date | Country | |
---|---|---|---|
20120206586 A1 | Aug 2012 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 12103655 | Apr 2008 | US |
Child | 13460330 | US |