The embodiments are generally related to electronic sensing devices, and, more particularly, to sensors for sensing objects located near or about the sensors for use in media navigation, fingerprint sensing and other operations of electronic devices and other products.
In the electronic sensing market, there are a wide variety of sensors for sensing objects at a given location. Such sensors are configured to sense electronic characteristics of an object in order to sense the presence of an object near or about the sensor, physical characteristics of the object, shapes, textures on surfaces of an object, material composition, biological information, and other features and characteristics of an object being sensed.
Sensors may be configured to passively detect characteristics of an object by measuring properties such as temperature, weight, or various emissions (photonic, magnetic, or atomic) of an object in close proximity to or in contact with the sensor. An example of this is a non-contact infrared thermometer that detects the black body radiation spectrum emitted from an object, from which its temperature can be computed.
Other sensors work by directly exciting an object with a stimulus such as voltage or current, then using the resultant signal to determine the physical or electrical characteristics of an object. An example of this is a fluid detector consisting of two terminals, one that excites the medium with a voltage source, while the second measures the current flow to determine the presence of a conductive fluid such as water.
Since a single point measurement of an object often does not provide enough information about an object for practical applications, it is often advantageous to collect a two-dimensional array of measurements. A two dimensional array of impedance may be created by moving a line sensing array over the surface of an object and then doing a line by line reconstruction of a two dimensional image like a fax machine does. An example of this is a swiped capacitive fingerprint sensor that measures differences in capacitance between fingerprint ridges and valleys as a finger is dragged across it. The swiping motion of the fingerprint by a user allows the one-dimensional line of sensor points to capture a large number of data points from the user's fingerprint surface. Such sensors reconstruct a two dimensional fingerprint image after the fact using individual lines of the captured data points. This reconstruction process requires a great deal of processing by a device, and is subject to failure if the swipe movement and conditions are not optimum.
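The line-by-line reconstruction described above can be illustrated with a minimal sketch. This is a hypothetical simplification in Python (the function name and the duplicate-dropping heuristic are illustrative, not the actual algorithm), assuming the sensor samples faster than the finger moves, so identical consecutive line scans indicate the finger has not yet advanced:

```python
def reconstruct(line_scans):
    """Assemble a 2D image from successive 1D line scans (sketch only).

    Consecutive scans that are identical are treated as duplicates and
    dropped; the remaining lines are stacked in capture order. A real
    reconstruction must also estimate swipe speed and correct for skew,
    which is omitted here.
    """
    image = [line_scans[0]]
    for scan in line_scans[1:]:
        if scan != image[-1]:  # keep only lines showing a new finger position
            image.append(scan)
    return image
```

As the text notes, this after-the-fact assembly is processing intensive in practice and fails when the swipe motion is not close to ideal.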
A more user friendly way to obtain a two dimensional image is to create a two dimensional sensing array that can capture a user's fingerprint data while the user holds the fingerprint surface still on the sensor surface, rather than swiping across a sensor. Such sensors, however, can be prohibitive in cost due to the large number of sensing points needed in the array. An example of this is a two dimensional capacitive fingerprint sensor, a number of which are currently manufactured. These sensors, however, use 150 mm² or more of silicon area and are therefore cost prohibitive for many applications. They are also delicate and fragile, sensitive to impact and even temperature changes, and thus simply not durable enough for most applications, such as smart phones and other mobile electronic devices that are handled and sometimes dropped by users.
These different types of electronic sensors have been used in various applications, such as biometric sensors for measuring biological features and characteristics of people such as fingerprints, medical applications such as medical monitoring devices, fluid measuring monitors, and many other sensor applications. Typically, the sensing elements of the various devices are connected to a processor configured to process object information and to enable interpretations for object features and characteristics. Examples include ridges and valleys of a fingerprint, temperature, bulk readings of presence or absence, and other features and characteristics.
There are many applications for two dimensional image sensors as a particular example, and innovators have struggled with state of the art technology that has fallen short of desired features and functions. Fingerprint sensors, for example, have been in existence for many years and used in many environments to verify identification, to provide access to restricted areas and information, and for many other uses. In this patent application, different types of fingerprint sensors will be highlighted, for simplicity of explanation, as examples of sensor applications where the embodiments are applicable, but other types of applications are also relevant to this background discussion and will also be addressed by the detailed description of the embodiments. Placement sensors may be configured to sense objects placed near or about the sensor, such as a fingerprint placement sensor that is configured to capture a full image of a fingerprint from a user's finger and compare the captured image with a stored image for authentication. Alternatively, sensors may be configured to sense the dynamic movement of an object about the sensor, such as a fingerprint swipe sensor that captures partial images of a fingerprint, reconstructs the fingerprint image, and compares the reconstructed image to a stored image for authentication.
In such applications, cost, though always a factor in commercial products, has not been so critical; accuracy and reliability have been and still remain the paramount factors. Typically, the placement sensor, a two-dimensional grid of sensors that senses a fingerprint image from a user's fingerprint surface all at once, was the obvious choice, and its many designs have become standard in most applications. Once the fingerprint image is sensed and reproduced in digital form in a device, it is compared against a prerecorded and stored image, and authentication is complete when there is a match between the captured fingerprint image and the stored image. In recent years, fingerprint sensors have been finding their way into portable devices such as laptop computers, hand held devices, cellular telephones, and other devices. Though accuracy and reliability are still important, the cost of the system components is very important. Conventional placement sensors were and still are very expensive for one primary reason: they all use silicon sensor surfaces (excluding optical sensors from this example, because they are simply too large and require more power than a portable device can afford to allocate, among other reasons, and thus are generally not found in commercially available devices). These silicon surfaces are very expensive, as the silicon material is as expensive as the material used to make a computer chip. Computer chips, of course, have become smaller over the years to reduce their cost and improve their performance. The reason the fingerprint silicon could not be made smaller is that it needs to remain the size of the average fingerprint, and the requirement for full scanning of the user's fingerprint simply cannot be compromised. Substantially the full print is required for adequate security in authentication.
Enter the fingerprint swipe sensor into the market. Swipe sensors are fundamentally designed with a line sensor configured to sense fingerprint features as a user swipes their finger in a perpendicular direction with respect to the sensor line. The cost saver: swipe sensors need much less silicon, only enough to configure a line sensor with an array of pixel sensors. The width is still fixed based on the average fingerprint width, but the depth is substantially smaller compared to the placement sensor. Some swipe sensors are capacitive sensors, where capacitance of the fingerprint surface is measured and recorded line by line. Others send a small signal pulse burst into the surface of the fingerprint surface and measure a response in a pickup line, again recording fingerprint features line by line. In either case, unlike the placement sensors, the full fingerprint image needs to be reconstructed after the user completes the swipe, and the individual lines are reassembled and rendered to produce a full fingerprint image. This image is compared with a fingerprint image stored in the laptop or other device, and a user will then be authenticated if there is an adequate match.
For capacitive swipe sensors, the first generation sensors were constructed with direct current (DC) switched capacitor technology (for example U.S. Pat. No. 6,011,859). This approach required using two plates per pixel, forming a capacitor between them and allowing the local presence of a finger ridge to change the value of that capacitor relative to air. These DC capacitive configurations took images from the fingerprint surface and did not penetrate below the finger surface. Thus, they were easy to spoof, that is, to fake a fingerprint with various deceptive techniques, and they also performed poorly when a user had dry fingers. RF (radio frequency) sensors were later introduced, because some were able to read past the surface and into inner layers of a user's finger to sense a fingerprint. Different radio frequencies have been utilized by various devices, along with different forms of detection including amplitude modulation (AM) and phase modulation (PM). There are also differing configurations of transmitters and receivers: one type (for example U.S. Pat. No. 5,963,679) uses a single transmitter ring and an array of multiple low quality receivers optimized for on chip sensing, while another type (for example U.S. Pat. No. 7,099,496) uses a large array of RF transmitters with only one very high quality receiver in a comb-like plate structure optimized for off chip sensing.
One key impediment to the development of low cost placement sensors has been the issue of pixel density, and the resultant requirement for a large number of interconnections between layers of the sensor device. A typical sensor for a fingerprint application will be on the order of 10 mm × 10 mm, with a resolution of 500 dpi. Such a sensor array would be approximately 200 rows by 200 columns, meaning there would need to be 200 via connections between layers in the device. While semiconductor vias can be quite small, the cost of implementing a sensor in silicon has proven to be prohibitive, as mentioned above.
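The figures above can be checked with a little arithmetic; the short Python sketch below (variable names illustrative) derives the pixel pitch and the roughly 200-line array size from the 10 mm sensor edge and 500 dpi resolution:

```python
# 500 dpi means 500 sensing points per inch (25.4 mm).
pitch_mm = 25.4 / 500            # pixel pitch = 0.0508 mm, i.e. ~50 um
rows = round(10 / pitch_mm)      # sensing lines along a 10 mm sensor edge
print(pitch_mm * 1000, rows)     # ~50.8 um pitch, 197 (approximately 200) rows
```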
In order to produce a placement sensor at a low enough cost for mass market adoption, lower cost processes such as circuit board etching must be employed. The current state of the art in circuit board via pitch is on the order of 200 μm, vs. the 50 μm pitch of the sensor array itself. Additionally, the added process steps required to form vias between layers of a circuit board significantly increase the tolerances for the minimum pitch of traces on each of the layers. Single-sided circuits may be readily fabricated with high yield with line pitch as low as 35 μm, whereas double sided circuits require a minimum line pitch on the order of 60 μm or more, which is too coarse to implement a full 500 dpi sensor array. One further consideration is that at similar line densities, double-sided circuits with vias are several times more expensive per unit area than single sided, making high-density double sided circuits too expensive for low cost sensor applications.
For laptop devices, adoption of the swipe sensor was driven by cost. The swipe sensor was substantially less expensive than placement sensors, and most laptop manufacturers adopted them based solely on price. The cost savings is a result of using less silicon area. More recently, a substitute for the silicon sensor arose, using plastic Kapton® tape with etched sensing plates on it, connected to a separate processor chip (for example U.S. Pat. No. 7,099,496). This allowed the silicon portion of the sensor to be separated from the sensing elements, and the silicon to follow Moore's law, shrinking to an optimal size in length, width and depth in proportion to advances in process technology. Although this advance in the art enabled cheap, durable swipe sensors, it did not overcome the basic image reconstruction and ergonomics issues resulting from the change from a simple two dimensional placement format. In addition to swipe sensors being cheaper, they take up less real estate in a host device, whether it is a laptop or a smaller device such as a cellular phone or personal data device.
In most swipe class sensors, the fingerprint reconstruction process turned out to be a greater ergonomic challenge to users, and more of a burden to quality control engineers, than initially expected. Users needed to be trained to swipe their finger in a substantially straight, linear direction perpendicular to the sensor line while also controlling contact pressure. Software training programs were written to help users become more proficient, but various environmental factors and the inability of some users to repeat the motion reliably gave swipe sensors a reputation for being difficult to use. Initial data from the field indicated that a large number of people were not regularly using the swipe sensors in the devices that they had purchased and opted back to using passwords. Quality control engineers who tried to achieve optimum accuracy and performance in the matching process between the captured and reconstructed images found that the false reject rate (FRR) and false acceptance rate (FAR) were much higher for swipe sensors than for placement sensors. Attempts to improve the reconstruction algorithms failed to produce statistical performance equivalent to placement sensors.
Development of sensors that take up less space on devices has been attempted without much success. Various ramps, wells and finger guides had to be incorporated into the surfaces of host devices to assist the user with finger placement and swiping. These structures ended up consuming significant space in addition to the actual sensor area. In the end, swipe sensors ended up taking up almost as much space as placement sensors. This was not a big problem for full size laptops, but it is currently a substantial problem for smaller laptops and netbooks, mobile phones, PDAs, and other small devices such as key fobs.
Real estate constraints have become even more of an issue with mobile device manufacturers, who now require that the fingerprint sensor also act as a navigation device, like a mouse or touch pad does in a laptop. The swipe sensor has proved to be a poor substitute for a mouse or touch pad because it is constructed with an asymmetric array of pixels. Swipe sensors do a good job of detecting motion along the normal axis of the finger swipe but have difficulty accurately tracking sideways motion. Off axis angular movements are even more difficult to sense, requiring significant processor resources to interpolate that movement with respect to the sensor line, and the sensors often have trouble resolving large angles. The byproduct of all this is cursor motion that is not fluid and is difficult to use.
It is clear that low cost two dimensional fingerprint sensor arrays would serve a market need, but present art has not been able to fill that need. Conventional capacitive fingerprint sensors typically use distinct electrode structures to form the sensing pixels array. These electrode structures are typically square or circular and can be configured in a parallel plate configuration (for example U.S. Pat. Nos. 5,325,442 and 5,963,679) or a coplanar configuration (for example U.S. Pat. No. 6,011,859 and U.S. Pat. No. 7,099,496).
These prior art approaches cannot be configured into a low cost two dimensional array of sensing elements. Many capacitive fingerprint sensors (for example U.S. Pat. Nos. 5,963,679 and 6,011,859) have plate structures that must be connected to the drive and sense electronics with an interconnect density that is not practical to implement other than with the fine line multilayer routing capabilities of silicon chips, and they therefore require a large amount of expensive silicon die area, as stated before. Other sensors (for example U.S. Pat. No. 7,099,496) use off chip sensing elements on a cheap polymer film, but the sensor cell architecture is inherently one dimensional and cannot be expanded into a two dimensional matrix.
Another application for capacitive sensing arrays has been in the area of touch pads and touch screens. Because touchpad and touch screen devices consist of arrays of drive and sense traces and distinct sense electrodes, they are incapable of resolutions below a few hundred microns, making this technology unsuitable for detailed imaging applications. These devices are capable of detecting finger contact or proximity, but they provide neither the spatial resolution nor the gray-scale resolution necessary to detect fine features within the body of the object being sensed, such as ridges or valleys.
Conventional art in the touchpad field utilizes a series of electrodes coupled either conductively (for example U.S. Pat. No. 5,495,077) or capacitively (for example US publication 2006/0097991) to the drive and sense traces. In operation, these devices produce a pixel that is significantly larger in scale than the interconnect traces themselves. Their purpose is generally to sense the presence and motion of an object to enable a user to navigate a cursor, to select an object on a screen, or to move a page illustrated on a screen. Thus, these devices operate at a low resolution when sensing adjacent objects.
Thus, there exists a need in the art for improved devices that can provide high quality and accurate placement sensors for use in different applications, such as fingerprint sensing and authentication for example, and that may also operate as a navigation device such as a mouse or touch pad in various applications. As will be seen, the embodiment provides such a device that addresses these and other needs in an elegant manner. Given the small size and functional demands of mobile devices, space savings are important. Thus, it would also be useful to be able to combine the functions of a sensor with that of other components, such as power switches, selector switches, and other components, so that multiple functions are available to a user without the need for more components that take up space.
Still further, it would be also useful for different embodiments of a touch sensor to provide various alternatives for providing biometric sensors that are easy to use and feasible in different applications.
Even further, it would be useful for sensors to not only act as image capturing components, but to also provide navigation operations for viewing and exploring various media, such as with touch-screens used in many smart phones, such as the iPad™, iPod™, iPhone™ and other touch-sensitive devices produced by Apple Corporation™, the Galaxy™ and its progeny by Samsung Corporation™, and other similar devices.
As discussed in the background, there are many applications for a two dimensional impedance sensor, and the embodiments described herein provide broad solutions to shortcomings in the prior art for many applications. The underlying technology finds application in many different sensor features for use in many types of products, including mobile phones, smart phones, flip phones, tablet computers such as Apple™ iPads™ and Samsung™ Galaxy™ devices, point of entry devices such as door knobs, fences, drug cabinets and automobiles, and almost any device, venue or thing that may be locked and require authentication to access.
Generally, one embodiment is directed to a two-dimensional sensor, and may also be referred to as a placement sensor, touch sensor, area sensor, or 2D sensor, where a substantial area of an object such as a user's fingerprint is sensed rather than a point or line-like portion of space that may or may not yield a characteristic sample that is adequate for identification. The sensor may have sensor lines located on one or more substrates, such as for example a flexible substrate that can be folded over on itself to form a grid array with separate sensor lines orthogonal to each other. The sensor lines may alternatively be formed on separate substrates. In either configuration, the crossover locations of different sensor lines create sensing locations for gathering information on the features and/or characteristics of an object, such as the pattern of ridges and valleys of a fingerprint.
Other embodiments provide a touch sensor having common electrical connections with a touch screen. For example, touch screen circuitry that resides under protective glass, such as Gorilla Glass™ used in many touch screen devices, may share common electrical connections with a two dimensional sensor used for navigation, and/or fingerprint sensing, or other operations. This provides benefits for manufacturing a device with both a touch screen and a fingerprint sensor, and may simplify the electrical layout of such a device. Exemplary configurations are described below and illustrated herein.
Other embodiments provide novel approaches to two-dimensional sensors integrated with a touch screen to provide the ability to capture a fingerprint image in one mode, and to operate as a conventional touch-screen when in another mode. In one example, a sensor grid may act as a touch screen by sensing presence of a user's finger or fingers and also movement of the fingers from one location to another together with speed to determine a swipe direction and speed. In another mode, the same sensor lines may act as drive lines and pickup lines, where a signal is transmitted from the screen to the user's finger or fingers, and the resulting signal is received by a pickup line and measured to determine the impedance of the fingerprint surface. Impedance values of fingerprint ridges differ from those of fingerprint valleys, and thus the fingerprint image may be mapped once the impedance values are captured across the two dimensional surface of a fingerprint. The resulting fingerprint image may then be compared to a stored fingerprint image to authenticate the user, much in the same way a simple password is compared to a stored password when users authenticate themselves with electronic devices using numerical and alphanumeric passwords. The difference is that the use of a fingerprint in place of a password is much more secure.
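The ridge/valley mapping described above can be illustrated with a minimal sketch (Python; the function name and threshold value are hypothetical, and a real device calibrates its measured impedance values per sensor):

```python
def impedance_to_ridge_map(grid, threshold):
    """Map a 2D grid of measured impedance values to a binary
    fingerprint image: 1 where the measurement exceeds the ridge
    threshold, 0 for valleys (illustrative sketch only)."""
    return [[1 if z > threshold else 0 for z in row] for row in grid]
```

The binary map stands in for the rendered fingerprint image that is then matched against the stored template.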
A two dimensional sensor may be configured in different ways, such as for example a component that may be integrated on a portable device, a sensor integrated with a touch-screen used to provide touch sensitive surfaces for navigation of electronic content and operations in a portable device, or as a stand-alone component that may be electrically connected to a system or device to transmit and receive information for authentication, activation, navigation and other operations.
In one embodiment, the drive lines and pickup lines do not electrically intersect, and are not connected in a manner in which they would conduct with each other; instead, they form an impedance sensing electrode pair with a separation that allows the drive lines to project an electric field and the pickup lines to receive an electric field, eliminating the need for distinct electrode structures. The two lines crossing with interspersed dielectric intrinsically create an impedance sensing electrode pair. Thus, the sensor is configured to activate two one-dimensional sensor lines to obtain one pixel of information that identifies features and/or characteristics of an object. Unlike conventional sensors, a sensor configured according to certain embodiments may provide a two dimensional grid that is capable of capturing multiple pixels of information from an object by activating individual pairs of drive and pickup lines and capturing the resultant signal. This signal can be processed with logic or processor circuitry to determine the presence or absence of an object and the features and/or characteristics of an object.
In yet another embodiment, a touch screen may operate as a sensor configured in one mode to capture information on a nearby object, such as information for forming an image of a fingerprint, and may operate in another mode to perform navigation or other operations. In one example, an OLED touch screen is configured to operate in at least two modes, one as a touch screen and another as a fingerprint sensor, where a fingerprint may be captured in any desired part of the OLED touch screen, and even multiple fingerprints from two or more of a user's fingers may be captured.
In examples described herein, these sensors may be configured to capture information of a nearby object, and the information may be used to produce renderings of an object, such as a fingerprint, and compare the renderings to secured information for authentication.
According to one embodiment, and in contrast to conventional approaches, a device can utilize the intrinsic impedance sensing electrode pair formed at the crossings between the drive and pickup lines. In operation, the electric fields may be further focused by grounding drive and pickup lines near or about the area being sensed by the particular crossover location at one time. This prevents interference that may occur if other drive and pickup lines were sensing electric fields simultaneously. More than one electrode pair may be sensed simultaneously. However, where resolution is an important factor, it may be preferred to avoid sensing electrode pairs that are too close to each other to avoid interference and maintain accuracy in sensing object features at a particular resolution. For purposes of this description, “intrinsic electrode pair” refers to the use of the impedance sensing electrode pairs that are formed at each of the drive and pickup line crossover locations. Due to the fact that the embodiments use each intrinsic electrode pair at each crossover as a sensing element, no differentiating geometric features exist at individual sensing nodes to distinguish them from the interconnect lines. As a result, the alignment between the drive layers and sense layers is non-critical, which significantly simplifies the manufacturing process.
Grounding the adjacent inactive drive and pickup lines restricts the pixel formed at each intrinsic electrode pair without requiring complex measures such as the dedicated guard rings employed in prior art (for example U.S. Pat. No. 5,963,679). Instead, guard grounds around the pixel are formed dynamically by switching adjacent inactive drive and pickup lines into ground potential. This allows the formation of high density pixel fields with relatively low resolution manufacturing processes, as the minimum pixel pitch for a given process is identical to the minimum feature spacing. This, in turn, enables the use of low cost manufacturing process and materials, which is the key to creating a low cost placement sensor.
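The dynamic guarding scheme above can be sketched as a simple per-line state assignment (Python; the function and state names are illustrative, assuming one active line per axis at a time with every inactive line switched to ground):

```python
def line_states(n_lines, active_index):
    """For one sensing step, keep only the active line driven (or
    sensed) and switch every inactive line to ground potential, so
    the grounded neighbours form a dynamic guard around the active
    crossover (sketch only)."""
    return ["active" if i == active_index else "ground"
            for i in range(n_lines)]
```

Applied independently to the drive-line axis and the pickup-line axis, the single active drive line and single active pickup line select exactly one crossover pixel while the rest of the grid serves as guard ground.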
In one example, the sensor lines may consist of drive lines on one layer and pickup lines on another layer, where the layers are located over each other in a manner that allows the separate sensor lines, the drive and pickup lines, to cross over each other to form impedance sensing electrode pairs at each crossover location. These crossover locations provide individually focused electrical pickup locations or pixels, or electrode pairs where a number of individual data points of features and/or characteristics of an object can be captured. The high degree of field focus is due to the small size of the intrinsic electrode pairs, as well as the high density of the neighboring ground provided by the inactive plates. The flexible substrate may have a second substrate configured with logic or processor circuitry for sending and receiving signals with the sensor lines to electronically capture information related to the object. Alternatively, there may be two separate substrates carrying the separate sensor lines and layered on each other, and yet connected to a third substrate for connection to logic or processor circuitry.
The utilization of the crossover locations between perpendicular lines on adjacent layers for the pickup cell greatly reduces the alignment requirements between the layers. Since there are no unique features at a sensor pixel location to align, the only real alignment requirement between the layers is maintaining perpendicularity. If the sense cell locations had specific features, such as the parallel plate features typical of prior art fingerprint sensors, the alignment requirements would include an X and Y position tolerance of less than one quarter of a pixel size, which translates to less than +/-12 μm in each axis for a 500 dpi resolution fingerprint application.
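The quarter-pixel tolerance figure follows directly from the 500 dpi pixel pitch; a quick arithmetic check (Python, variable names illustrative):

```python
pixel_um = 25.4 / 500 * 1000   # pixel pitch at 500 dpi, ~50.8 um
tolerance_um = pixel_um / 4    # quarter-pixel alignment budget
print(tolerance_um)            # ~12.7 um, i.e. roughly +/-12 um per axis
```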
In operation, a drive line is activated, with a current source for example, and a pickup line is connected to a receiving circuit, such as an amplifier/buffer circuit, so that the resulting electric field can be captured. An electric field extends from the drive line to the pickup line through the intermediate dielectric insulating layer. If an object is present, some or all of the electric field may be absorbed by the object, changing the manner in which the electric field is received by the pickup line. This changes the resulting signal that is captured and processed by the pickup line and receiving circuit, and thus is indicative of the presence of an object, and the features and characteristics of the object may be sensed and identified by processing the signal. This processing may be done by some form of logic or processing circuitry.
In other embodiments, the signal driving the drive line may be a complex signal, a signal of varying frequency and/or amplitude, or some other signal. This would enable a sensor to analyze the features and/or characteristics of an object from different perspectives utilizing a varying or complex signal. The signal may include simultaneous signals of different frequencies and/or amplitudes that would produce resultant signals that vary in different manners after being partially or fully absorbed by the object, indicating different features and characteristics of the object. The signal may include different tones, signals configured as chirp ramps, and other signals. Processing or logic circuitry may then be used to extract various information and data points from the resultant signal.
In operation, the varying or complex signal may be applied to the drive line, and the pickup line would receive the resulting electric field to be processed. Logic or processing circuitry may be configured to process the resulting signal, such as separating out different frequencies if simultaneous signals are used, so that features and/or characteristics of the object may be obtained from different perspectives.
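One way to separate simultaneous drive frequencies out of the resultant signal is correlation against reference sinusoids. The sketch below is a hypothetical illustration in Python (the function and its parameters are not from the source, and it assumes each frequency completes a whole number of cycles in the sample window):

```python
import math

def tone_amplitudes(samples, sample_rate, freqs):
    """Estimate the amplitude of each drive frequency present in a
    digitized resultant signal by correlating the samples with
    cosine/sine references at that frequency (sketch only)."""
    n = len(samples)
    amps = {}
    for f in freqs:
        re = sum(s * math.cos(2 * math.pi * f * i / sample_rate)
                 for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * f * i / sample_rate)
                 for i, s in enumerate(samples))
        amps[f] = 2 * math.hypot(re, im) / n  # recovered tone amplitude
    return amps
```

For example, a composite of a 1 kHz tone at amplitude 1.0 and a 2 kHz tone at amplitude 0.5 is resolved back into those two amplitudes, modeling how per-frequency responses could be read out separately.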
Given the grid of pixels that can be activated at individual pairs, each pixel may be captured in a number of ways. In one embodiment, a drive line may be activated, and pickup lines may be turned on and off in a sequence to capture a line of pixels. This sequencing may operate as a scanning sequence. Here, a first drive line is activated by connecting it to a signal source, and then one pickup line at a time is connected to amplifier/buffer circuitry; the information from the pixel formed at the crossing of the two lines is captured, and the pickup line is then disconnected. A next pixel is then processed in sequence, then another, and another, until the entire array of pickup lines has been processed. The drive line is then deactivated, another drive line is activated, and the pickup lines are again scanned with this active drive line. Pixels may be processed one at a time in sequence, several non-adjacent pixels may be processed simultaneously, or other variations are possible for a given application. After the grid of pixels is processed, a rendering of object information becomes possible.
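The scanning sequence above reduces to a pair of nested loops. This is a schematic Python sketch, where the `read_pixel` callback is a hypothetical stand-in for the analog step of driving line `d`, buffering pickup line `p`, and digitizing the result:

```python
def scan_grid(n_drive, n_pickup, read_pixel):
    """Sequentially scan every drive/pickup crossover and return a
    2D list of pixel values (one row per drive line)."""
    image = []
    for d in range(n_drive):        # activate one drive line at a time
        row = []
        for p in range(n_pickup):   # step through pickup lines under it
            row.append(read_pixel(d, p))
        image.append(row)           # deactivate; move to the next drive line
    return image
```

Variations such as reading several non-adjacent crossovers in parallel would change the loop structure but not the basic row-by-row acquisition.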
Referring to
Referring to
Referring to
In this configuration of
In general, in operation, each area over which a particular drive line overlaps a pickup line, separated by the insulating dielectric substrate, is an area that can establish a sensing location that defines characteristics or features of a nearby object about that area. Since multiple sensing locations exist over the area of the sensor grid, multiple data points defining features or characteristics of a nearby object can be captured by the sensor configuration. Thus, the sensor can operate as a planar two-dimensional sensor, where objects located on or about the sensor can be detected and their features and characteristics determined.
As described in the embodiments and examples below, the embodiments are not limited to any particular configuration or orientation described, but are limited only by the appended claims, their equivalents, and also future claims submitted in this and related applications and their equivalents. Also, many configurations, dimensions, geometries, and other features and physical and operational characteristics of any particular embodiment or example may vary in different applications without departing from the spirit and scope of the embodiments, which, again, are defined by the appended claims, their equivalents, and also future claims submitted in this and related applications and their equivalents.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the embodiments. However, it will be apparent to one skilled in the art that the embodiments can be practiced without these specific details. In other instances, well known circuits, components, algorithms, and processes have not been shown in detail or have been illustrated in schematic or block diagram form in order not to obscure the embodiments in unnecessary detail. Additionally, for the most part, details concerning materials, tooling, process timing, circuit layout, and die design have been omitted inasmuch as such details are not considered necessary to obtain a complete understanding of the embodiments and are considered to be within the understanding of persons of ordinary skill in the relevant art. Certain terms are used throughout the following description and claims to refer to particular system components. As one skilled in the art will appreciate, components may be referred to by different names. This document does not intend to distinguish between components that differ in name, but not function. In the following discussion and in the claims, the terms “including” and “comprising” are used in an open-ended fashion, and thus should be interpreted to mean “including, but not limited to . . . ”
Embodiments are described herein. Those of ordinary skill in the art will realize that the following detailed description is illustrative only and is not intended to be in any way limiting. Other embodiments will readily suggest themselves to such skilled persons having the benefit of this disclosure. Reference will be made in detail to implementations of the embodiments as illustrated in the accompanying drawings. The same reference indicators will be used throughout the drawings and the following detailed description to refer to the same or like parts.
In the interest of clarity, not all of the routine features of the implementations described herein are shown and described. It will, of course, be appreciated that in the development of any such actual implementation, numerous implementation-specific decisions must be made in order to achieve the developer's specific goals. Moreover, it will be appreciated that such a development effort might be complex and time-consuming, but would nevertheless be a routine undertaking of engineering for those of ordinary skill in the art having the benefit of this disclosure.
In one embodiment, a sensor device includes drive lines located on or about an insulating dielectric substrate and configured to transmit a signal onto a surface of an object being sensed. Pickup lines are located near or about the drive lines and configured to receive the transmitted signal from the surface of an object. In order to keep a separation between the drive lines and pickup lines, the substrate may act as an insulating dielectric or spacing layer. The substrate may be for example a flexible polymer based substrate. One example is Kapton® tape, which is widely used in flexible circuits such as those used in printer cartridges and other devices. The package may include such a flexible substrate, where the drive lines may be located on one side of the substrate, and the pickup lines may be located on an opposite side of the substrate.
The drive lines may be orthogonal in direction with respect to the pickup lines, and may be substantially perpendicular to the pickup lines. According to one embodiment, a device may be configured with drive lines and pickup lines located on or about opposite sides of an insulating dielectric substrate, where the combination of these three components provides capacitive properties. The drive lines may be activated to drive an electric field onto, into or about an object. The pickup lines can receive electric fields that originated from the drive lines, and these fields can be processed by logic or processing circuitry to interpret features or characteristics of the object being sensed.
Thus, in one embodiment the layer separating the drive lines from the pickup lines can provide a capacitive property to the assembly. If some or all of the drive lines are substantially perpendicular to the pickup lines, either entirely or in portions, then a grid may be formed. In such a configuration, from a three dimensional view, the drive lines are located and oriented substantially in parallel with respect to each other about a first plane. One surface of the substrate is located about the drive lines in a second plane that is substantially parallel relative to the drive lines. The pickup lines are located and oriented substantially in parallel with respect to each other about a third plane that is substantially parallel to the first and second planes and also located about another substrate surface that is opposite that of the drive lines, such that the substrate is located substantially between the drive lines and the pickup lines.
In this description, including descriptions of embodiments and examples, there will be references to the terms parallel, perpendicular, orthogonal and related terms and descriptions. It is not intended, nor would it be understood by those skilled in the art, that these descriptions are at all limiting. To the contrary, the embodiments extend to orientations and configurations of the drive lines, the pickup lines, the substrate or related structure, and also various combinations and permutations of components, their placement, distance from each other, and order in different assemblies of a sensor. Though the embodiments are directed to a sensor configured with a plurality of drive and pickup lines that generally cross over each other at a pixel location and are configured to detect presence and other features and characteristics of a nearby object, the embodiments are not limited to any particular configuration or orientation, but are limited only by the appended claims, their equivalents, and also future claims submitted in this and related applications and their equivalents.
Also, reference will be made to different orientations of the geometric planes on which various components will lie, such as the drive and pickup lines and the substrate that may be placed in between the sets of drive and pickup lines. If flexible substrates are used, for example, such a structure will allow the planes to change as the flexible structure is flexed or otherwise formed or configured. In such an embodiment, it will be understood that certain aspects of the embodiments are directed to the drive lines and pickup lines being configured on opposite sides of a substrate in a manner that enables the sensing of particular features and/or characteristics of a nearby object at each crossover location of a drive line and a pickup line. Thus, the orientation of the planes (which may be deformable, and thus may be sheets separated by a substantially uniform distance) of groups of components (such as drive lines or pickup lines, for example) or substrates may vary in different applications without departing from the spirit and scope of the embodiments.
Also, reference will be made to pickup lines, pickup plates, drive lines, drive plate, and the like, but it will be understood that the various references to lines or plates may be used interchangeably and do not limit the embodiment to any particular form, geometry, cross-sectional shape, varying diameter or cross-sectional dimensions, length, width, height, depth, or other physical dimension of such components. Also, more sophisticated components may be implemented to improve the performance of a device configured according to the embodiment, such as for example small 65, 45, 32 or 22 nanometer conduction lines or carbon nano-tubes that may make an assembly more easily adapted to applications where small size and shape as well as low power are desired characteristics and features. Those skilled in the art will understand that such dimensions can vary in different applications, and even possibly improve the performance or lower power consumption in some applications, without departing from the spirit and scope of the embodiment.
Reference will also be made to various components that are juxtaposed, layered, or otherwise placed on each other. In one example of an embodiment, a plurality of drive lines are juxtaposed on one surface of a generally planar substrate, and a plurality of pickup lines are juxtaposed on an opposite surface of the planar substrate. The drive lines are substantially orthogonal to the pickup lines, and may be described as substantially perpendicular to the pickup lines. The distance between the drive lines and pickup lines may be filled with a substrate or insulating material that will provide for a capacitive configuration. Here, the drive lines on one side of the substrate form one capacitive plate, and the pickup lines on the opposite side form the corresponding capacitive plate. In operation, when the drive plate is activated, an electrical field is generated between the drive lines and pickup lines and through the substrate to form a plurality of capacitive elements. These capacitive elements are located at the area of each crossover of a drive line and a pickup line, with a portion of the substrate located between them. This is a location where the respective drive lines and pickup lines overlap each other. In any particular application, these areas in which the three components interact during operation define a data location at which a sensor reading can be made.
Reference will also be made to sensor lines, such as sensor drive lines and sensor pickup lines, and their orientation amongst themselves and each other. For example, there will be described substantially parallel drive lines. These drive lines are intended to be described as parallel conductive lines made up of a conductive material, such as copper, tin, silver or gold, that is formed, etched, deposited or printed onto the surface. Those skilled in the art will understand that, with the inherent imperfections in most any manufacturing process, such conductive lines are seldom “perfect” in nature, and are thus not exactly parallel in practice. Therefore, they are described as “substantially parallel”. Different applications may even configure some of the drive lines to be non-parallel, such that a line may be parallel for a portion of its length and may necessarily deviate from parallel in order to connect with other components for the device to operate, or in order to be routed on or about the substrate on which it is formed or traced. Similarly, the separate array of lines may be described as orthogonal or perpendicular, where the drive lines are substantially orthogonal or perpendicular to the pickup lines. Those skilled in the art will understand that the various lines may not be perfectly perpendicular to each other, and they may be configured to be off-perpendicular or otherwise crossed-over at different angles in particular applications. They also may be partially perpendicular, where portions of drive lines may be substantially perpendicular to corresponding portions of pickup lines, and other portions of the different lines may deviate from perpendicular in order to be routed on or about the substrate or to be connected to other components for the device to operate.
These and other benefits will be described in connection with particular examples of embodiments and also descriptions of intended operational features and characteristics of devices and systems configured according to the embodiments.
In operation, generally, the drive lines can transmit an electromagnetic field toward an object that is proximal to the device. The pickup lines may receive a signal originating from the drive lines and then transmitted through the object and through the substrate and onto the pickup lines. The pickup lines may alternatively receive a signal originating from the drive lines that were then transmitted through the substrate and onto the pickup lines without passing through the object. This electric field can vary at different locations on the grid, giving a resultant signal that can be interpreted by some type of logic or processor circuitry to define features and/or characteristics of an object that is proximate the assembly.
The drive lines and pickup lines may be controlled by one or more processors to enable the transmission of the signal to an object via the drive lines, to receive a resultant signal from an object via the pickup lines, and to process the resultant signal to define an object image. One or more processors may be connected in one monolithic component, where the drive lines and pickup lines are incorporated in a package that includes the processor. In another embodiment, the drive lines, pickup lines and substrate may be assembled in a package by itself, where the package can be connected to a system processor that controls general system functions. This way, the package can be made part of the system by connecting with a system's input/output connections in order to communicate with the system. This would be similar in nature for example to a microphone connected to a laptop, where the audio signals are received by the system processor for use by the laptop in receiving sounds from a user. According to this embodiment, the sensor can be connected as a stand-alone component that communicates with the system processor to perform sensor operations in concert with the system processor.
In another embodiment, a sensor may be configured to drive signals at different frequencies, since the impedance of most objects, especially human tissue and organs, will vary greatly with frequency. In order to measure the complex impedance of a sensed object at one or more frequencies, the receiver must be able to measure phase as well as amplitude. In one embodiment, the resulting signal generated from a given impedance sensing electrode pair may result from varying frequencies, known in the art as frequency hopping, where the receiver is designed to track a random, pseudo-random or non-random sequence of frequencies. A variation of this embodiment could be a linear or non-linear frequency sweep, known as a chirp. In such an embodiment one could measure the impedance over a continuous range of frequencies very efficiently.
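One standard way to recover both amplitude and phase at a given frequency, as the receiver above must, is quadrature (I/Q) demodulation: multiplying the received signal by in-phase and quadrature copies of the carrier. The sketch below is a software model of that principle, not the patented receiver; all numeric values are assumed.

```python
import numpy as np

# Model: the sensed object returns the carrier with amplitude 0.6 and
# phase shift 0.9 rad (both made-up values); I/Q demodulation recovers them.
fs = 1_000_000                 # sample rate, Hz (assumed)
f0 = 50_000                    # carrier frequency, Hz (assumed)
t = np.arange(4000) / fs       # an exact integer number of carrier cycles
amp_true, phase_true = 0.6, 0.9
rx = amp_true * np.cos(2 * np.pi * f0 * t + phase_true)

i = 2 * np.mean(rx * np.cos(2 * np.pi * f0 * t))    # in-phase component
q = -2 * np.mean(rx * np.sin(2 * np.pi * f0 * t))   # quadrature component
amp_est = np.hypot(i, q)        # recovered amplitude
phase_est = np.arctan2(q, i)    # recovered phase, radians
```

Sweeping `f0` across a range (the chirp case) and repeating this measurement at each step would yield complex impedance over a continuous band of frequencies.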
In yet another embodiment, a grid sensor as described above may be configured to also operate as a pointing device. Such a device could perform such functions as well known touch pads, track balls or mice used in desktops and laptop computers.
In one example of this embodiment, a two dimensional impedance sensor that can measure the ridges and valleys of a fingertip may be configured to track the motion of the fingerprint patterns. Prior art swiped fingerprint sensors can perform this function, but the physical asymmetry of the array and the need to speed-correct, or “reconstruct,” the image in real time make these implementations awkward at best. The sensor described here could instead double as both a fingerprint sensor and a high quality pointing device.
One device configured according to the embodiment includes a first array of sensor lines on a flexible substrate, a second array of sensor lines on a flexible substrate, and a processor configured to process fingerprint data from the first and second arrays of sensor lines. When folded upon itself in the case of a single flexible substrate, or when juxtaposed in the case of separate substrates, the separate sensor lines cross each other without electrically shorting to form a grid with cross-over locations that act as pixels from which fingerprint features can be sensed. In one embodiment, an array of substantially parallel sensor drive lines is located on a surface of the flexible substrate. These drive lines are configured to sequentially transmit a signal into a surface of a user's finger, activating one line at a time. A second array of sensor lines is similar to the first, consisting of substantially parallel sensor pickup lines that are substantially perpendicular to the drive lines. These pickup lines are configured to pick up the signal transmitted from the first array.
In the configuration where the first and second set of sensor lines, the drive and the pickup lines for example, are located on different sections of an extended surface of the flexible substrate, the flexible substrate is further configured to be folded onto itself to form a dual layer configuration. Here, the first array of sensor drive lines becomes substantially perpendicular to the second array of pickup sensor lines when the flexible substrate is folded onto itself. This folding process creates crossover locations between these separate arrays of sensor lines, though they must not make direct electrical contact, so that they operate independently. These crossover locations represent impedance sensing electrode pairs configured to sense pixels of an object and its sub-features juxtaposed relative to a surface of the flexible substrate. The scanning of these pixels is accomplished by activating individual rows and columns sequentially. Once a drive column is activated with the drive signal, the perpendicular pickup rows are scanned one at a time over the entire length of the selected driver. Only one row is electrically active (high impedance) at a time; the non-active rows are either shorted to ground or multiplexed to a state where they do not cross-couple signal. When a finger ridge is placed above an array crossover location that is active, it interrupts a portion of the electric field that otherwise would be radiated through the surface film from the active drive column to the selected row pickup. The placement of an object's sub-feature, such as a ridge or valley in the case of a fingerprint sensor, over an impedance sensing electrode pair results in a net signal decrease, since some of the electric field is conducted to ground through the human body.
In the case of a fingerprint sensor, when a fingerprint ridge or valley is placed over an impedance-sensing electrode pair, a valley affects the radiation of the electric field from the selected drive line to the selected pickup line much less than a ridge does. By comparing the relative intensity of signals between pixels under ridges and valleys, a two dimensional image of a finger surface can be created.
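The ridge/valley comparison above amounts to thresholding relative pixel intensities. A minimal sketch of that post-processing step, with entirely made-up raw values and an assumed threshold, might look like this:

```python
# Illustrative post-processing: ridges couple less signal than valleys
# (some field is shunted to ground through the body), so thresholding the
# raw grid readings yields a binary ridge/valley image.
raw = [
    [0.90, 0.30, 0.85, 0.25],   # made-up pixel intensities for the sketch
    [0.88, 0.28, 0.83, 0.27],
]
threshold = 0.5   # assumed midpoint between typical ridge and valley levels
# 1 = ridge (low signal), 0 = valley (high signal)
image = [[1 if v < threshold else 0 for v in row] for row in raw]
```

A practical implementation would normalize per-pixel gain and offset before thresholding, but the comparison itself is this simple.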
Referring again to
In operation, the sensor can be configured to detect the presence of a finger surface located proximate to the sensor surface, where the drive lines can drive an active electromagnetic field onto the finger surface, and the pickup lines can receive the resulting electromagnetic field signal. In operation, the drive lines can generate an electric field that is passed onto the surface of the finger, and the different features of the fingerprint, such as ridges and valleys of the fingerprint surface and possibly human skin characteristics, would cause the resulting signal to change, providing a basis to interpret the signals to produce information related to the fingerprint features.
In one embodiment of a fingerprint sensor, referring again to
Referring to
Referring to
As another example of a sensor that can benefit from the embodiment, a reduced cost fingerprint swipe sensor could be configured using the same innovation. In this embodiment, a reduced number of pickup lines could be configured with a full number of orthogonal drive lines. Such a configuration would create a multi-line swipe sensor taking the form of a pseudo two-dimensional sensor that, when a finger is swiped over it, would create a mosaic of partial images or slices. The benefit of this would be to reduce the complexity of the image reconstruction task, which is problematic for current non-contact silicon sensors that rely on one full image line and a second partial one to do speed detection.
The tradeoff would be that this pseudo two dimensional array would have to be scanned at a much faster rate in order to keep up with the varying speeds of fingers swiped across it.
Referring to
Also illustrated in
In the snapshot shown in
The scanning process continues beyond the snapshot shown in
Referring again to
The bottom plates 706a, 706b, 706c, etc. are driven one at a time by AC signal source 716 via switch matrix 740a-n.
A single row remains active only as long as it takes the entire number of pickup plates/columns to be scanned. Scan time per pixel will depend on the frequency of operation and the settling time of the detection electronics, but there is no need to scan unduly fast as is typical with prior art swipe sensors. On the other hand, prior art swipe sensors must scan at a very high pixel rate in order not to lose information due to undersampling relative to the finger speed, which can be greater than 20 cm/sec. This reduction in capture speed relative to a swipe sensor relaxes the requirements of the analog electronics and greatly reduces the data rate that a host processor must capture in real time. This not only reduces system cost but allows operation by a host device with much less CPU power and memory, which is especially critical for mobile devices.
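A back-of-envelope comparison makes the data-rate relief concrete. The 20 cm/sec finger speed and 500 dpi (≈50 µm pitch, 200 pickups) come from the text; the frame time and line width for the grid sensor are assumed for illustration.

```python
# Swipe sensor: must capture one full line per 50 µm of finger travel.
finger_speed_um_s = 200_000      # 20 cm/s, from the text, in µm/s
row_pitch_um = 50                # 500 dpi pitch, per the text
line_rate_hz = finger_speed_um_s / row_pitch_um   # required lines/second
swipe_pixel_rate = line_rate_hz * 200             # 200 pixels/line (assumed)

# Grid sensor: one full-array capture per frame, at a relaxed frame time.
grid_pixels = 200 * 200          # assumed array size
capture_time_s = 0.5             # assumed half-second full capture
grid_pixel_rate = grid_pixels / capture_time_s
```

Under these assumptions the swipe sensor needs 800,000 pixels/second while the grid sensor needs only 80,000, an order-of-magnitude relaxation in the analog front end and host data rate.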
Once an entire row 706b has been scanned by all or substantially all of its corresponding pickup plates 702a-n, then the next row in the sequence is activated through switch matrix 740. This process continues until all or substantially all of the rows and columns are scanned.
The amount of signal that is coupled into the buffer amplifier 717 is a function of how much capacitance is formed by the insulating layer and the finger ridge or valley in close proximity. The detailed operation of how these electric fields radiate is shown in
Reference plate 805 is intentionally located outside of the finger contact area of the sensor, separated from pickup plates 802a-n by gap 885, which is much larger than the nominal gap between the pickup plates, typically 50 μm. In a real world embodiment, plate 805 would be positioned under the plastic of a bezel to prevent finger contact, placing it at least 500 μm from the other pickup plates.
Each one of the pickup plates 802a-n is scanned sequentially, being switched through pickup switches 830a-n, which connect them to differential amplifier 880. During the scanning of an entire pickup row, the positive leg of the differential amplifier remains connected to reference plate 805 to provide the same signal reference for all of the pickup plates.
In
Each SPDT switch has a parasitic capacitance 945, due to the fact that real world switches do not provide perfect isolation. In fact, the amount of isolation decreases with frequency, typically modeled by a parallel capacitor across the switch poles. By using an SPDT switch, this capacitance can be shunted to ground when an individual plate is not active. Since there is a large array of switches equal to the number of pickup plates, typically 200 for a 500 dpi sensor, the effective shunt capacitance to ground is multiplied by that number. So if a given switch has 0.5 picofarads of parasitic capacitance and there were 200 pickups, that would add up to 100 picofarads of total shunt capacitance.
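The shunt-capacitance arithmetic from the text is simply a per-switch figure multiplied by the plate count:

```python
# Worked example from the text: 200 pickup switches (typical for a
# 500 dpi sensor) at 0.5 pF of parasitic capacitance each.
parasitic_pf_per_switch = 0.5
num_pickups = 200
total_shunt_pf = parasitic_pf_per_switch * num_pickups   # total shunt load
```

This 100 pF total is what motivates the compensating (resonating) circuit described next.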
In order to prevent this large capacitance from diverting most of the received signal from the active pickup to ground, it is desirable in this example to use a compensating circuit. This is accomplished by using resonating inductor 939, forming a classic bandpass filter circuit in conjunction with parasitic capacitors 945 (one per switch) and tuning capacitors 934 and 937. A two-step null-and-peak tuning calibration procedure is used, where tuning capacitors 934 and 937 are individually tuned with inductor 939 using the same drive signal on both the plus and minus inputs to differential amplifier 980. The two bandpass filters formed with inductor 939 and resonating capacitors 934 and 937, respectively, will be tuned to the same center frequency when there is zero signal out of differential amplifier 980. Next, capacitors 934 and 937 and inductor 939 are tuned together using a differential input signal with opposite 180 degree phases on the plus and minus inputs to the differential amplifier 980. They are incremented in lock step until the exact drive carrier frequency is reached; this occurs when the output of differential amplifier 980 is at its peak, making the center frequency equal to the exact frequency of the carrier drive signal 916.
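The resonating inductor and the parasitic/tuning capacitance obey the standard LC resonance relation, f₀ = 1/(2π√(LC)). As a sketch with an assumed 1 MHz carrier and the 100 pF total shunt capacitance from the earlier example, the required inductance works out as follows:

```python
import math

# LC resonance: the compensating inductor must resonate the total shunt
# capacitance at the carrier frequency. Carrier value is assumed.
f_carrier = 1e6          # Hz (assumed carrier frequency)
c_shunt = 100e-12        # F, total shunt capacitance from the text's example
l_needed = 1 / ((2 * math.pi * f_carrier) ** 2 * c_shunt)   # henries
# Sanity check: the resonant frequency of the chosen L and C.
f_check = 1 / (2 * math.pi * math.sqrt(l_needed * c_shunt))
```

Under these assumptions roughly 253 µH is needed; as the text notes, the inductor's Q must also be high enough (at least 10) to give a usefully narrow passband.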
In a systems implementation, a calibration routine would be performed before each fingerprint scan to minimize drift of this filter with time and temperature. The resonating inductor 939 needs to have a Q or Quality Factor of at least 10 to give the filter the proper bandwidth characteristics necessary to optimize the signal to noise ratio.
Alternately, carrier source 916 may be a variable frequency source, and capacitors 937 and 934 may be fixed values. In this embodiment, tuning is accomplished by varying the frequency of source 916 until peak output is obtained from differential amplifier 980.
Dividing up the large number of parallel pickup plates into groups each containing a smaller number of plates is an alternate architecture that would not require the use of a tuned bandpass filter in the front end because the parasitic switch capacitances would be greatly reduced. This would have two possible advantages, first lower cost, and second the ability to have a frequency agile front end. In this Figure we have a snapshot of the front end where the first switch 944a of bank 907a is active. All other switch banks 907b-907n are shown inactive, shorting their respective plates to ground. Therefore, only voltage or current differential amplifier 980a has any plate signal conducted into it, voltage or current differential amplifiers 980b-980n have both their positive and negative inputs shorted to ground through their respective switches 945a-n and 945r, preventing any signal from those banks making a contribution to the overall output.
Each of the differential amplifiers 980a-980n is summed through resistors 987a-987n into summing amplifier 985. Only the differential amplifier 980a in this snapshot has plate signal routed into it, so it independently produces signal to the input of summing amplifier 985. This process is repeated sequentially until all or substantially all of the switch banks 907a-n, and switch plates 944a-n, 945a-n, etc., of the entire array are fully scanned. In different embodiments, all or substantially all of the array may be scanned, or less than the entire array may be scanned in different applications. In some applications, a lower resolution may be desired, so all of the array may not need to be scanned. In other applications, a full image may not be necessary, such as a navigation application, where limited images may be used to detect movement speed, distance and/or direction to use as input for a pointing device, such as directing a cursor on a display similar to a computer touch-pad or a mouse.
By splitting the pickup array up, the capacitive input load on each plate is reduced from that of the full array of switches to the number of switches within a given plate group. If we were to divide, for example, 196 potential pickup plates into 14 banks of 14 plates, the result would be a capacitance load equal to the parasitic capacitance of 14 switches (944), plus the capacitive load of the differential amplifier. If analog switches 944 are constructed with very low parasitic capacitance, then the overall input load would be small enough not to need a bandpass circuit in the front end in order to resonate out the load capacitance. As integrated circuit fabrication techniques improve, it will be possible to design smaller switches with less parasitic capacitance, making this approach more attractive.
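The bank-partitioning arithmetic from the text can be laid out directly; the 0.5 pF per-switch figure is carried over from the earlier example.

```python
# Partitioning example from the text: 196 pickup plates in 14 banks of 14.
plates = 196
banks = 14
plates_per_bank = plates // banks
parasitic_pf_per_switch = 0.5   # same assumed figure as the earlier example
load_full_array_pf = plates * parasitic_pf_per_switch      # unpartitioned load
load_per_bank_pf = plates_per_bank * parasitic_pf_per_switch  # per-bank load
```

Under these assumptions each plate sees 7 pF of switch parasitics instead of 98 pF, a 14-fold reduction that can obviate the tuned bandpass front end.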
Buffers 982a through 982n as illustrated are special buffers that are designed to have very low input capacitance. In one embodiment, these buffers could be configured as single-stage cascode amplifiers in order to minimize drain-to-gate Miller capacitance and die area. To better maximize plate-to-plate isolation, two sets of switches could be used for each input. Analog switches 930a-930n are included in this example to multiplex each selected buffer into differential amplifier 980. Switches 932a-n are included to simultaneously shut down the power to all the other input buffers that are not selected, effectively putting them at ground potential. An alternate embodiment would be to put input analog switches in front of each amplifier to allow a short of the unused plates directly to ground. One effect of this approach may be an increase in input load capacitance for each plate.
The positive input to differential amplifier 980 is always connected to the reference plate 902r (through low input capacitance buffer 982r), providing an “air” signal reference to the amp. The differential amplifier 980 serves to subtract out noise and common mode carrier signal in addition to providing an “air” reference carrier value.
Control processor 1030 orchestrates the scanning of the two dimensional sensor plate array. Drive plates/columns 1006a-1006n are scanned sequentially by the bottom plate scanning logic 1040 in the control processor 1030 (via drive control lines 1042 connected to switches coupled to the drive plates 1006a-n). When a selected drive plate is activated it is connected to carrier signal source 1016; all inactive drive plates are connected to ground. Before activating the next drive plate in the sequence, the active drive plate remains on long enough for the entire row of pickup plates 1002a-n to be scanned by top plate logic 1045 controlling switches 1030a-n.
Analog mixer 1074 multiplies the gained up plate signal against the reference carrier 1013. The result is a classic spectrum of base band plus harmonic products at multiples of the carrier frequency. An analog low pass filter 1025 is employed to filter out the unwanted harmonics and must have a sharp enough roll off to attenuate the information associated with the second harmonic without losing base band information.
Following the low pass filter are an amplifier 1077 and an A/D Converter 1076 which must sample at at least twice the pixel rate to satisfy the Nyquist criteria. Memory buffer 1032 stores the A/D samples locally with sufficient size to keep up with the worst case latency of the host controller. The A/D Sample Control Line 1078 provides a sample clock for the converter to acquire the sequential pixel information that is created by the sequencing of the plate rows and columns.
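The mixer-plus-low-pass stage just described can be modeled in a few lines: multiplying the carrier-borne pixel signal by the reference carrier produces the baseband value plus a component at twice the carrier frequency, and low-pass filtering (here, a crude average over whole carrier cycles) leaves the baseband. All parameters are illustrative.

```python
import numpy as np

# Synchronous demodulation sketch: recover a pixel's envelope from the
# carrier-modulated pickup signal. Frequencies and envelope are assumed.
fs, fc = 1_000_000, 50_000
t = np.arange(2000) / fs
envelope = 0.4                              # pixel value riding on the carrier
rx = envelope * np.cos(2 * np.pi * fc * t)
mixed = rx * np.cos(2 * np.pi * fc * t)     # = env/2 + (env/2)*cos(2*wc*t)

# Crude low-pass: average over an integer number of carrier cycles,
# which nulls the double-frequency term exactly in this sketch.
cycle = int(fs / fc)                        # 20 samples per carrier period
baseband = np.mean(mixed[: 100 * cycle]) * 2
```

A real low-pass filter 1025 must of course have a sharp enough roll-off to pass the full pixel bandwidth while rejecting the second-harmonic products, as the text notes.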
A programmable gain stage, or PGA, 1190 follows the differential amplifier; the two could easily be combined into a single differential amplifier with programmable gain, as is commonly done in modern integrated circuit design. PGA 1190 is designed to have a gain range wide enough to compensate for production variations in plate etching and solder mask thickness between the layers.
Control processor 1130 orchestrates the scanning of the two-dimensional sensor plate array. Drive plates/columns 1106a-1106n are scanned sequentially by the bottom plate scanning logic 1140 in the Control Processor 1130 (via drive control lines 1142 connected to switches coupled to drive plates 1106a-n). When a selected drive plate is activated, it is connected to carrier signal source 1116; all inactive drive plates are connected to ground. Before the next drive plate in the sequence is activated, the active drive plate remains on long enough for the entire row of pickup plates 1102a-n to be scanned by top plate logic 1145 controlling switches 1130a, 1130b, etc., and captured by the A/D converter 1125.
The A/D Converter 1125 is sampled at a rate of at least twice the carrier frequency to satisfy the Nyquist criterion. The A/D Sample Control Line 1107 provides a sample clock for the converter to acquire the sequential pixel information created by the sequencing of the plate rows and columns.
Following the A/D converter 1125 is a digital mixer 1118 that digitally multiplies the A/D output, which is at the carrier frequency, against the reference carrier generated by the Digitally Controlled Oscillator 1110. The result is that the signal is down-converted to base band with the carrier removed. This process creates other unwanted spectral components, namely a sideband at twice the carrier frequency, but these can easily be filtered out.
A combination decimator and digital filter 1120 follows the digital mixer 1118. This block performs sample-rate down-conversion, reducing the sample rate from at least twice the carrier frequency to at least twice the much lower pixel rate. The digital filter would typically include a cascaded integrator-comb, or CIC, filter, which removes the unwanted spectral byproducts of mixing and improves the receiver signal-to-noise ratio. A CIC filter provides a highly efficient way to create a narrow passband filter after the signal has been mixed down to base band by the digital mixer. The CIC filter may be followed by an FIR filter running at the slower decimated rate to correct passband droop.
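A minimal sketch of the digital mixer and a single-stage CIC decimator may make the flow concrete. This is only an illustration under assumed names: a real design would cascade several integrator/comb stages and add the FIR droop compensator mentioned above.

```python
def digital_mix(samples, carrier):
    """Digital mixer: element-wise multiply against the reference carrier."""
    return [s * c for s, c in zip(samples, carrier)]

def cic_decimate(samples, rate):
    """Single-stage CIC decimator: integrator at the input rate,
    down-sample by `rate`, comb (differencer) at the output rate.
    Equivalent to a moving sum of `rate` samples, i.e. a crude
    low-pass with a sinc-shaped response."""
    out, integ, prev = [], 0, 0
    for i, x in enumerate(samples):
        integ += x                     # integrator, runs at the fast rate
        if (i + 1) % rate == 0:        # keep one sample in every `rate`
            out.append(integ - prev)   # comb, runs at the decimated rate
            prev = integ
    return out
```

The CIC structure is attractive here because it needs only adders and delays at the fast carrier-rate clock; the multiplier-heavy FIR correction runs only at the slow decimated rate.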
With a reduction of sample rate on the order of 100:1, a relatively small Control Processor Buffer (1132) could be used to capture an entire fingerprint. For example, a 200×200 array producing 40,000 pixels could be stored in a 40 kB buffer. This is in contrast to a swipe sensor, which must scan partial image frames at a rate fast enough to keep up with the fastest allowable swipe, usually around 200 ms. At the same time, a slow swipe of two seconds must also be accommodated, requiring ten times the amount of memory as the fastest swipe. Various techniques have been developed to discard redundant sample lines before storage, but even so the real-time storage requirements are much greater for swipe sensors. This is a critical factor in Match on Chip applications where memory capacity is limited. In addition, a placement sensor imposes no real-time data acquisition or processing requirements on the host processor beyond the patience of the user holding a finger in place.
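The buffer sizing in the example above works out as follows, assuming one byte per pixel (the sample width is an assumption, not stated in the text):

```python
def placement_buffer_bytes(rows, cols, bytes_per_pixel=1):
    """Buffer needed to hold one full placement-sensor frame.

    bytes_per_pixel=1 is an assumed sample width for illustration.
    """
    return rows * cols * bytes_per_pixel

# 200 x 200 pixels at 1 byte each -> 40,000 bytes, i.e. a 40 kB buffer.
```

A swipe sensor, by contrast, must size its buffering for the slowest permitted swipe while scanning fast enough for the quickest one, which is why its real-time storage demands grow even after redundant-line rejection.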
Referring to
If configured as a fingerprint sensor or other placement sensor, integrated circuit 1210 may be a mixed signal chip that enables all or some of the functions described in
Flexible substrate based connector 1235 routes power, ground and interface signals out to an external host or onto another substrate that contains system level components, such as those illustrated in
Referring to
As will be appreciated by those skilled in the art, given these examples, different designs may be accomplished to optimize different aspects of the invention, including the size of the substrate used for a device, and both the size and pixel density of the sensing area. The invention, however, is not limited to a particular optimization done by others; indeed, the invention should inspire others to improve upon the design using well-known and alternative processes, materials, and know-how available to them. The scope of the invention is limited only by claims that are either appended or submitted for examination in the future, and not by information that is extraneous to this specification.
Referring to
Referring to
Referring to
Bottom plate processing circuitry 1414 is configured to produce a drive signal for use by drive plates or lines located on the bottom layer 1408 of the sensor substrate 1404, and includes drivers and scanning logic 1462 for producing the signal, and programmable frequency generator 1426 for programmably setting the frequency at which the drive signal is set. The bottom plate processing circuitry 1414 includes communication link 1428; likewise, the top plate circuitry has communication link 1430 for communicating with system bus 1432, for sending and receiving communications among the system, such as to processors, memory modules, and other components. System bus 1432 communicates with persistent memory 1434 via communication link 1436 for storing algorithms 1438, application software 1440, templates 1442, and other code for persistent and frequent use by processor 1444. Processor 1444 includes processor logic 1448 and other circuitry for processing signals received from the system bus and originating from the sensor 1402, and also includes arithmetic logic unit 1450 configured with logical circuits for performing basic and complex arithmetic operations in conjunction with the processor. Processor memory 1452 provides local storage for the processor 1444, for example for storing results of calculations and retrieving them for further calculations.
In operation, drive signals are controlled by processor 1444, and parameters for the drive signals originating from bottom plate processing circuitry 1414 are set in that circuitry by the processor 1444. Drive signals are generated by logic 1462 within the parameters set in generator 1426 and sent to bottom plate 1408 via communication link 1516. These signals generate electromagnetic fields that extend to pickup lines on top layer 1406 about the sensing area 1411. The signals are cycled through different pixel electrode pairs on the sensor grid (not shown here, but described above), and some of these electromagnetic fields are absorbed by the object 1410 (such as a finger, for example). The resultant signal is picked up by the pickup plates or lines located on top layer 1406 about the sensing area (not shown here, but described above), then transmitted to top plate processing circuitry 1409 via communication line 1412, where it is processed and transmitted to storage or to processor 1444 for further processing. Once the drivers and scanning logic have cycled through the pixels on the grid sensor, data related to features and characteristics of the object can be defined and utilized by the system. For example, in a fingerprint sensor system, the image may be a fingerprint image that can be compared to a stored fingerprint image and, if there is a match, used to validate a user.
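The final compare-and-validate step can be sketched as a simple pixel-agreement score against a stored template. This is only an illustrative stand-in under assumed names; a production matcher would compare minutiae or ridge features rather than raw pixels.

```python
def match_score(captured, template):
    """Fraction of binarized pixels on which the captured image and the
    stored template agree -- a toy stand-in for a real fingerprint matcher."""
    assert len(captured) == len(template)
    agree = sum(1 for a, b in zip(captured, template) if (a > 0) == (b > 0))
    return agree / len(captured)

def validate_user(captured, template, threshold=0.9):
    """Validate the user when the match score clears a threshold
    (the 0.9 default is an assumed value for illustration)."""
    return match_score(captured, template) >= threshold
```
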
Those skilled in the art will recognize that row and column scanning order may not correspond directly to physical position in the array, as some implementations may be more optimally sampled in an interleaved fashion.
In
In an authentication system such as described by
Embodiments described herein support both of these requirements by providing variable captured-image resolution and matching-algorithm security levels. In one example, when operating in high security mode (such as when enrolling a user or validating a high-value transaction) the image capture procedure described in
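One way such a mode selection might be organized is sketched below. The resolutions and thresholds are assumed values for illustration only, not figures taken from the specification:

```python
# Hypothetical mapping from security mode to capture resolution and
# match strictness, illustrating the variable-resolution idea above.
MODES = {
    "high": {"dpi": 500, "match_threshold": 0.98},  # e.g. enrollment, payments
    "low":  {"dpi": 250, "match_threshold": 0.90},  # e.g. device wake, navigation
}

def capture_settings(security_mode):
    """Return the capture resolution and matcher threshold for a mode."""
    return MODES[security_mode]
```

A lower-resolution capture scans and matches faster and consumes less power, which is why relaxing both knobs together suits low-stakes operations.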
Referring to
Still referring to
Referring to
Referring to
Referring to
In this particular example, the common substrate (2201) is a two-layer rigid circuit board, which also provides a mechanical base for the sensor. The drive circuitry is implemented in integrated circuit die (2204), which is mounted on rigid drive substrate (2201). The die is connected to the circuit on the rigid substrate by a number of bonding pads (2206) using standard flip-chip mounting processes. A large number of drive lines (typically more than 100) are connected to the drive plates (2209), which are formed on the top side of the rigid substrate.
A dielectric layer (2208) separates drive plates (2209) from pickup plates (2210). In this instance dielectric layer (2208) is provided by a solder mask layer applied to the drive plates (2209) and rigid substrate (2201).
Pickup substrate assembly (2202) with pre-attached pickup circuit die (2205) is mounted on top of drive substrate (2201). The die is connected to the circuit on the flexible substrate by a number of bonding pads (2216) using standard flip-chip mounting processes. Because substrate (2202) is flexible, attach pads (2211) can mate with their corresponding pads (2212) on base substrate (2201). A cutout (2203) is provided in base substrate (2201) to accommodate pickup chip (2205) so the assembly lies flat. Attach pads (2211) provide interconnect to the mating pads (2212) on the substrate layer (2201).
Interconnect traces (2214) formed on the top layer of base substrate (2201) provide synchronizing signals between the integrated circuits (2204) and (2205).
Interconnect traces (2215) in the base substrate (2201) route signals to interconnect pads (2213) for power, ground, and communications interconnect to the host system.
Rigid base (2201) could be fabricated from standard circuit board materials, such as FR4, in which case plates (2209), interconnect traces (2214 and 2215), and pads (2212 and 2213) would typically be formed from copper by use of circuit board etching techniques. Rigid base (2201) could also be formed from glass, in which case the plates, interconnect traces, and pads would typically be formed from a transparent conductive material such as Indium-Tin Oxide (ITO).
User motion tracking is required in touchscreen devices for a number of functions, including icon selection and movement, control selection, gesture recognition, text selection, and so on. Many motion tracking functions only require coarse position determination, but may be done at high speed. This is especially true of gesture recognition. Other functions may require much finer position determination, but these generally are performed at low speeds of motion to allow the user more precise control. Such functions include text selection and drawing tasks.
While it is difficult to track motion precisely at high speeds, due to the large number of positions that must be sampled, in practice high speed and high precision are not needed simultaneously.
Given the relative large size of fingers compared to the size of pixels on a typical touchscreen, it is difficult to accurately determine the precise position of a finger. In practice, users actually do not rely on exact finger placement determination to perform precise tasks. Instead, they place their finger on the touchscreen at the approximate location of interest, and then rely on visual feedback from the screen to complete fine positioning tasks.
The important characteristics for a touchscreen display input, then, are good high speed coarse absolute position determination, and highly responsive high resolution low-speed relative motion determination.
This invention addresses the need for high speed coarse absolute positioning and responsive high resolution slow-speed motion tracking by providing a dual-resolution sensing system. A primary grid is formed at a spacing equal to that used by commercially available touchscreens, 5-10 lines per inch, while a secondary grid is formed about the primary lines with a much finer resolution equal to that of a commercial fingerprint sensor, 500 lines per inch. The result is a sensor that is capable of detecting macro finger movements using the primary grid, as well as small incremental movements using the secondary grid, which tracks the movement of fingerprint ridge and valley features.
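Combining the two grids amounts to a coarse absolute reading plus a fine relative offset. A hypothetical sketch, assuming a 10 line-per-inch primary grid and a 500 line-per-inch secondary grid (the interface names are illustrative, not from the specification):

```python
def fused_position(coarse_index, coarse_pitch_in, fine_delta_counts, fine_pitch_in):
    """Absolute position from the dual-resolution scheme above.

    coarse_index: which primary (coarse) grid line the finger sits on
    fine_delta_counts: ridge-feature displacement counted on the fine grid
    Pitches are in inches per grid line.
    """
    return coarse_index * coarse_pitch_in + fine_delta_counts * fine_pitch_in
```

With a coarse pitch of 0.1 inch (10 lines per inch) and a fine pitch of 1/500 inch, a finger on coarse line 3 that has drifted 25 fine counts resolves to 0.3 + 0.05 = 0.35 inch.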
If the coarse finger position has not changed, then it is possible that the user is performing a fine positioning task.
It should be noted that many operating modes of a device may only require coarse location information from the touch sensor. In these cases the system can advantageously omit the fine motion tracking operations of the position sensor in order to save power.
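The gating implied by the preceding two paragraphs might look like the following sketch (hypothetical names and return values):

```python
def tracker_step(prev_coarse, new_coarse, fine_tracking_needed):
    """Decide whether to run the power-hungry fine (500 lpi) scan.

    Per the text: fine tracking runs only when the operating mode requires
    it and the coarse position is unchanged, which suggests the user is
    performing a fine positioning task.
    """
    if not fine_tracking_needed:
        return "coarse_only"        # mode needs only coarse data: save power
    if new_coarse == prev_coarse:
        return "fine_scan"          # likely fine positioning: engage fine grid
    return "coarse_only"            # finger moving fast: coarse data suffices
```
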
While certain exemplary embodiments have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad embodiment, and that this embodiment is not limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those ordinarily skilled in the art. Hence, alternative arrangements and/or quantities of, connections of various sorts, arrangements and quantities of transistors to form circuits, and other features and functions can occur without departing from the spirit and scope of the embodiment. Similarly, components not explicitly mentioned in this specification can be included in various embodiments of this embodiment without departing from the spirit and scope of the embodiment. Also, different process steps and integrated circuit manufacture operations described as being performed to make certain components in various embodiments of this embodiment can, as would be apparent to one skilled in the art, be readily performed in whole or in part to make different components or in different configurations of components not explicitly mentioned in this specification without departing from the spirit and scope of the embodiment. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
Again, the embodiment has application in many areas, particularly in biometric sensors. Fingerprint sensors, for example, and other biometric sensors are gaining increasing acceptance for use in a wide variety of applications for security and convenience reasons. Devices, systems and methods configured according to the embodiment will have improved security of the biometric verification process without increasing the cost of the system. Furthermore, the embodiment may extend to devices, systems and methods that would benefit from validation of components. As discussed above, the embodiment includes the ability for the host and sensor to include any combination or subset of the above components, which may be arranged and configured in the manner most appropriate for the system's intended application. Those skilled in the art will understand that different combinations and permutations of the components described herein are possible within the spirit and scope of the embodiment, which is defined by the appended Claims, their equivalents, and also Claims presented in related applications in the future and their equivalents.
The embodiment may also involve a number of functions to be performed by a computer processor, such as a microprocessor. The microprocessor may be a specialized or dedicated microprocessor that is configured to perform particular tasks according to the embodiment, by executing machine-readable software code that defines the particular tasks embodied by the embodiment. The microprocessor may also be configured to operate and communicate with other devices such as direct memory access modules, memory storage devices, Internet related hardware, and other devices that relate to the transmission of data in accordance with the embodiment. The software code may be configured using software formats such as Java, C++, XML (Extensible Mark-up Language) and other languages that may be used to define functions that relate to operations of devices required to carry out the functional operations related to the embodiment. The code may be written in different forms and styles, many of which are known to those skilled in the art. Different code formats, code configurations, styles and forms of software programs and other means of configuring code to define the operations of a microprocessor in accordance with the embodiment will not depart from the spirit and scope of the embodiment.
Within the different types of devices, such as laptop or desktop computers, hand held devices with processors or processing logic, and also possibly computer servers or other devices that utilize the embodiment, there exist different types of memory devices for storing and retrieving information while performing functions according to the embodiment. Cache memory devices are often included in such computers for use by the central processing unit as a convenient storage location for information that is frequently stored and retrieved. Similarly, a persistent memory is also frequently used with such computers for maintaining information that is frequently retrieved by the central processing unit, but that is not often altered within the persistent memory, unlike the cache memory. Main memory is also usually included for storing and retrieving larger amounts of information such as data and software applications configured to perform functions according to the embodiment when executed by the central processing unit. These memory devices may be configured as random access memory (RAM), static random access memory (SRAM), dynamic random access memory (DRAM), flash memory, and other memory storage devices that may be accessed by a central processing unit to store and retrieve information. During data storage and retrieval operations, these memory devices are transformed to have different states, such as different electrical charges, different magnetic polarity, and the like. Thus, systems and methods configured according to the embodiment as described herein enable the physical transformation of these memory devices. Accordingly, the embodiment as described herein is directed to novel and useful systems and methods that, in one or more embodiments, are able to transform the memory device into a different state. 
The embodiment is not limited to any particular type of memory device, or any commonly used protocol for storing and retrieving information to and from these memory devices, respectively.
The term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “machine-readable medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that causes the machine to perform any one or more of the methodologies of the present embodiment. The machine-readable medium includes any mechanism that provides (i.e., stores and/or transmits) information in a form readable by a machine (e.g., a computer, PDA, cellular telephone, etc.). For example, a machine-readable medium includes memory (such as described above); magnetic disk storage media; optical storage media; flash memory devices; biological, electrical, or mechanical systems; and electrical, optical, acoustical or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.). The device or machine-readable medium may include a micro-electromechanical system (MEMS), nanotechnology devices, organic, holographic, or solid-state memory devices, and/or a rotating magnetic or optical disk. The device or machine-readable medium may be distributed when partitions of instructions have been separated into different machines, such as across an interconnection of computers or as different virtual machines.
Reference in the specification to “an embodiment,” “one embodiment,” “some embodiments,” or “other embodiments” means that a particular feature, structure, or characteristic described in connection with the embodiments is included in at least some embodiments, but not necessarily all embodiments. The various appearances of “an embodiment,” “one embodiment,” or “some embodiments” are not necessarily all referring to the same embodiments. If the specification states a component, feature, structure, or characteristic “may,” “might,” or “could” be included, that particular component, feature, structure, or characteristic is not required to be included. If the specification or Claims refer to “a” or “an” element, that does not mean there is only one of the element. If the specification or Claims refer to “an additional” element, that does not preclude there being more than one of the additional element.
The methods, systems and devices include improved security operations and configurations with a novel approach to biometric systems. Such systems would greatly benefit from increased security features, particularly in financial transactions. Although this embodiment is described and illustrated in the context of devices, systems and related methods of validating biometric devices such as fingerprint sensors, the scope of the embodiment extends to other applications where such functions are useful. Furthermore, while the foregoing description has been with reference to particular embodiments of the embodiment, it will be appreciated that these are only illustrative of the embodiment and that changes may be made to those embodiments without departing from the principles of the embodiment, the scope of which is defined by the appended Claims and their equivalents.
This application is a divisional application under 35 U.S.C. § 121 of U.S. patent application Ser. No. 14/243,122 filed Apr. 2, 2014, which claims the benefit under 35 U.S.C. § 120 of the filing date of non-provisional patent application Ser. No. 13/860,494 filed Apr. 10, 2013, which claims priority under 35 U.S.C. § 119(e) of provisional application Ser. No. 61/622,474 filed Apr. 10, 2012, the respective disclosures of which are hereby incorporated herein by reference.
7146024 | Benkley | Dec 2006 | B2 |
7146026 | Russon et al. | Dec 2006 | B2 |
7146029 | Manansala | Dec 2006 | B2 |
7184581 | Johansen et al. | Feb 2007 | B2 |
7190816 | Mitsuyu et al. | Mar 2007 | B2 |
7194392 | Tuken et al. | Mar 2007 | B2 |
7194393 | Wei et al. | Mar 2007 | B2 |
7197168 | Russo | Mar 2007 | B2 |
7200250 | Chou | Apr 2007 | B2 |
7236616 | Scott | Jun 2007 | B1 |
7239153 | Nysaether | Jul 2007 | B2 |
7251351 | Mathiassen et al. | Jul 2007 | B2 |
7258279 | Schneider et al. | Aug 2007 | B2 |
7260246 | Fujii | Aug 2007 | B2 |
7263212 | Kawabe | Aug 2007 | B2 |
7263213 | Rowe | Aug 2007 | B2 |
7269256 | Rosen | Sep 2007 | B2 |
7280679 | Russo | Oct 2007 | B2 |
7283534 | Kelly et al. | Oct 2007 | B1 |
7289649 | Walley et al. | Oct 2007 | B1 |
7290323 | Deconde et al. | Nov 2007 | B2 |
7299360 | Russo | Nov 2007 | B2 |
7308121 | Mathiassen et al. | Dec 2007 | B2 |
7308122 | McClurg et al. | Dec 2007 | B2 |
7321672 | Sasaki et al. | Jan 2008 | B2 |
7356169 | Hamid | Apr 2008 | B2 |
7360688 | Harris | Apr 2008 | B1 |
7369658 | DeLeon | May 2008 | B2 |
7369688 | Ser et al. | May 2008 | B2 |
7379569 | Chikazawa et al. | May 2008 | B2 |
7398390 | Hyser | Jul 2008 | B2 |
7403644 | Bohn et al. | Jul 2008 | B2 |
7409876 | Ganapathi et al. | Aug 2008 | B2 |
7412083 | Takahashi | Aug 2008 | B2 |
7417536 | Lakshmanan et al. | Aug 2008 | B2 |
7424618 | Roy et al. | Sep 2008 | B2 |
7447339 | Mimura et al. | Nov 2008 | B2 |
7447911 | Chou et al. | Nov 2008 | B2 |
7460697 | Erhart et al. | Dec 2008 | B2 |
7463756 | Benkley | Dec 2008 | B2 |
7474772 | Russo et al. | Jan 2009 | B2 |
7505611 | Fyke | Mar 2009 | B2 |
7505613 | Russo | Mar 2009 | B2 |
7518382 | Vermesan et al. | Apr 2009 | B2 |
7543737 | Bensimon et al. | Jun 2009 | B2 |
7565548 | Fiske et al. | Jul 2009 | B2 |
7574022 | Russo | Aug 2009 | B2 |
7590269 | Creasey et al. | Sep 2009 | B2 |
7623659 | Huang et al. | Nov 2009 | B2 |
7643950 | Getzin et al. | Jan 2010 | B1 |
7646897 | Fyke | Jan 2010 | B2 |
7681232 | Nordentoft et al. | Mar 2010 | B2 |
7685629 | White et al. | Mar 2010 | B1 |
7689013 | Shinzaki | Mar 2010 | B2 |
7706581 | Drews et al. | Apr 2010 | B2 |
7733697 | Picca et al. | Jun 2010 | B2 |
7751595 | Russo | Jul 2010 | B2 |
7751601 | Benkley | Jul 2010 | B2 |
7754022 | Barnhill et al. | Jul 2010 | B2 |
7768273 | Kalnitsky et al. | Aug 2010 | B1 |
7821501 | Felder | Oct 2010 | B2 |
7831840 | Love et al. | Nov 2010 | B1 |
7843438 | Onoda | Nov 2010 | B2 |
7844579 | Peterson et al. | Nov 2010 | B2 |
7864992 | Riedijk et al. | Jan 2011 | B2 |
7899216 | Watanabe et al. | Mar 2011 | B2 |
7930812 | Curnalia et al. | Apr 2011 | B2 |
7953258 | Dean et al. | May 2011 | B2 |
7986193 | Krah | Jul 2011 | B2 |
8005276 | Dean et al. | Aug 2011 | B2 |
8023700 | Riionheimo | Sep 2011 | B2 |
8031916 | Abiko et al. | Oct 2011 | B2 |
8077935 | Geoffroy et al. | Dec 2011 | B2 |
8107212 | Nelson et al. | Jan 2012 | B2 |
8116540 | Dean et al. | Feb 2012 | B2 |
8525799 | Grivina et al. | Sep 2013 | B1 |
8619057 | Kobayashi et al. | Dec 2013 | B2 |
8711105 | Gray et al. | Apr 2014 | B2 |
20010012036 | Giere et al. | Aug 2001 | A1 |
20010017934 | Paloniemi et al. | Aug 2001 | A1 |
20010026636 | Mainguet | Oct 2001 | A1 |
20010029527 | Goshen | Oct 2001 | A1 |
20010030644 | Allport | Oct 2001 | A1 |
20010032319 | Setlak | Oct 2001 | A1 |
20010036299 | Senior | Nov 2001 | A1 |
20010043728 | Kramer et al. | Nov 2001 | A1 |
20020025062 | Black | Feb 2002 | A1 |
20020054695 | Bjorn et al. | May 2002 | A1 |
20020061125 | Fujii | May 2002 | A1 |
20020064892 | Lepert et al. | May 2002 | A1 |
20020067845 | Griffis | Jun 2002 | A1 |
20020073046 | David | Jun 2002 | A1 |
20020089044 | Simmons et al. | Jul 2002 | A1 |
20020089410 | Janiak et al. | Jul 2002 | A1 |
20020096731 | Wu et al. | Jul 2002 | A1 |
20020097231 | Satoh et al. | Jul 2002 | A1 |
20020109671 | Kawasome | Aug 2002 | A1 |
20020122026 | Bergstrom | Sep 2002 | A1 |
20020126516 | Jeon | Sep 2002 | A1 |
20020130673 | Pelrine et al. | Sep 2002 | A1 |
20020133725 | Roy et al. | Sep 2002 | A1 |
20020150282 | Kinsella | Oct 2002 | A1 |
20020152048 | Hayes | Oct 2002 | A1 |
20020156726 | Kleckner et al. | Oct 2002 | A1 |
20020164057 | Kramer et al. | Nov 2002 | A1 |
20020181749 | Matsumoto et al. | Dec 2002 | A1 |
20020186203 | Huang | Dec 2002 | A1 |
20020188854 | Heaven et al. | Dec 2002 | A1 |
20030002717 | Hamid | Jan 2003 | A1 |
20030002718 | Hamid | Jan 2003 | A1 |
20030002719 | Hamid et al. | Jan 2003 | A1 |
20030016849 | Andrade | Jan 2003 | A1 |
20030021495 | Cheng | Jan 2003 | A1 |
20030025606 | Sabatini | Feb 2003 | A1 |
20030028811 | Walker et al. | Feb 2003 | A1 |
20030035568 | Mitev et al. | Feb 2003 | A1 |
20030035570 | Benkley | Feb 2003 | A1 |
20030035572 | Kalnitsky et al. | Feb 2003 | A1 |
20030044051 | Fujieda | Mar 2003 | A1 |
20030063782 | Acharya et al. | Apr 2003 | A1 |
20030068072 | Hamid | Apr 2003 | A1 |
20030074559 | Riggs | Apr 2003 | A1 |
20030076301 | Tsuk et al. | Apr 2003 | A1 |
20030076303 | Huppi | Apr 2003 | A1 |
20030095096 | Robbin et al. | May 2003 | A1 |
20030095691 | Nobuhara et al. | May 2003 | A1 |
20030101348 | Russo et al. | May 2003 | A1 |
20030102874 | Lane et al. | Jun 2003 | A1 |
20030107608 | Hong et al. | Jun 2003 | A1 |
20030108226 | Goodman et al. | Jun 2003 | A1 |
20030108227 | Philomin et al. | Jun 2003 | A1 |
20030115475 | Russo et al. | Jun 2003 | A1 |
20030115490 | Russo et al. | Jun 2003 | A1 |
20030123714 | O'Gorman et al. | Jul 2003 | A1 |
20030123715 | Uchida | Jul 2003 | A1 |
20030126448 | Russo | Jul 2003 | A1 |
20030135764 | Lu | Jul 2003 | A1 |
20030141959 | Keogh et al. | Jul 2003 | A1 |
20030147015 | Katoh et al. | Aug 2003 | A1 |
20030161510 | Fujii | Aug 2003 | A1 |
20030161512 | Mathiassen et al. | Aug 2003 | A1 |
20030169228 | Mathiassen et al. | Sep 2003 | A1 |
20030174256 | Kim | Sep 2003 | A1 |
20030174871 | Yoshioka et al. | Sep 2003 | A1 |
20030186157 | Teraoka et al. | Oct 2003 | A1 |
20030209293 | Sako et al. | Nov 2003 | A1 |
20030214481 | Xiong | Nov 2003 | A1 |
20030215116 | Brandt et al. | Nov 2003 | A1 |
20030224553 | Manansala | Dec 2003 | A1 |
20040012773 | Puttkammer | Jan 2004 | A1 |
20040014457 | Stevens | Jan 2004 | A1 |
20040022001 | Chu et al. | Feb 2004 | A1 |
20040042642 | Bolle et al. | Mar 2004 | A1 |
20040050930 | Rowe | Mar 2004 | A1 |
20040066613 | Leitao | Apr 2004 | A1 |
20040076313 | Bronstein et al. | Apr 2004 | A1 |
20040076314 | Cheng | Apr 2004 | A1 |
20040081339 | Benkley | Apr 2004 | A1 |
20040096086 | Miyasaka et al. | May 2004 | A1 |
20040113956 | Bellwood et al. | Jun 2004 | A1 |
20040120400 | Linzer | Jun 2004 | A1 |
20040125990 | Goodman et al. | Jul 2004 | A1 |
20040125993 | Zhao et al. | Jul 2004 | A1 |
20040128521 | Russo | Jul 2004 | A1 |
20040129787 | Saito et al. | Jul 2004 | A1 |
20040136612 | Meister et al. | Jul 2004 | A1 |
20040148526 | Sands et al. | Jul 2004 | A1 |
20040156538 | Greschitz et al. | Aug 2004 | A1 |
20040172339 | Snelgrove et al. | Sep 2004 | A1 |
20040179718 | Chou | Sep 2004 | A1 |
20040184641 | Nagasaka et al. | Sep 2004 | A1 |
20040186882 | Ting | Sep 2004 | A1 |
20040190761 | Lee | Sep 2004 | A1 |
20040208346 | Baharav et al. | Oct 2004 | A1 |
20040208347 | Baharav et al. | Oct 2004 | A1 |
20040208348 | Baharav et al. | Oct 2004 | A1 |
20040213441 | Tschudi | Oct 2004 | A1 |
20040215689 | Dooley et al. | Oct 2004 | A1 |
20040228505 | Sugimoto | Nov 2004 | A1 |
20040228508 | Shigeta | Nov 2004 | A1 |
20040230536 | Fung et al. | Nov 2004 | A1 |
20040240712 | Rowe et al. | Dec 2004 | A1 |
20040252867 | Lan et al. | Dec 2004 | A1 |
20040257196 | Kotzin | Dec 2004 | A1 |
20040258282 | Bjorn et al. | Dec 2004 | A1 |
20040263479 | Shkolnikov | Dec 2004 | A1 |
20050012714 | Russo et al. | Jan 2005 | A1 |
20050031174 | Ryhanen et al. | Feb 2005 | A1 |
20050036665 | Higuchi | Feb 2005 | A1 |
20050041841 | Yoo et al. | Feb 2005 | A1 |
20050041885 | Russo | Feb 2005 | A1 |
20050122785 | Umeda et al. | Feb 2005 | A1 |
20050047485 | Khayrallah et al. | Mar 2005 | A1 |
20050089200 | Nysaether | Apr 2005 | A1 |
20050100196 | Scott et al. | May 2005 | A1 |
20050100938 | Hofmann et al. | May 2005 | A1 |
20050109835 | Jacoby et al. | May 2005 | A1 |
20050110103 | Setlak | May 2005 | A1 |
20050111708 | Chou | May 2005 | A1 |
20050123176 | Ishii et al. | Jun 2005 | A1 |
20050136200 | Durell et al. | Jun 2005 | A1 |
20050139656 | Arnouse | Jun 2005 | A1 |
20050149386 | Agura | Jul 2005 | A1 |
20050151065 | Min | Jul 2005 | A1 |
20050162402 | Watanachote | Jul 2005 | A1 |
20050169503 | Howell et al. | Aug 2005 | A1 |
20050178827 | Shatford | Aug 2005 | A1 |
20050179657 | Russo et al. | Aug 2005 | A1 |
20050198377 | Ferguson et al. | Sep 2005 | A1 |
20050205985 | Smith et al. | Sep 2005 | A1 |
20050210271 | Chou et al. | Sep 2005 | A1 |
20050219200 | Weng | Oct 2005 | A1 |
20050220329 | Payne et al. | Oct 2005 | A1 |
20050221798 | Sengupta et al. | Oct 2005 | A1 |
20050231213 | Chou et al. | Oct 2005 | A1 |
20050238212 | Du et al. | Oct 2005 | A1 |
20050244038 | Benkley | Nov 2005 | A1 |
20050244039 | Geoffroy et al. | Nov 2005 | A1 |
20050247559 | Frey et al. | Nov 2005 | A1 |
20050249386 | Juh | Nov 2005 | A1 |
20050254694 | Goodman et al. | Nov 2005 | A1 |
20050258952 | Utter et al. | Nov 2005 | A1 |
20050259851 | Fyke | Nov 2005 | A1 |
20050259852 | Russo | Nov 2005 | A1 |
20050269402 | Spitzer et al. | Dec 2005 | A1 |
20050281441 | Martinsen et al. | Dec 2005 | A1 |
20060002597 | Rowe | Jan 2006 | A1 |
20060006224 | Modi | Jan 2006 | A1 |
20060034043 | Hisano et al. | Feb 2006 | A1 |
20060055500 | Burke et al. | Mar 2006 | A1 |
20060066572 | Yumoto et al. | Mar 2006 | A1 |
20060076926 | Lee | Apr 2006 | A1 |
20060078174 | Russo | Apr 2006 | A1 |
20060078176 | Abiko et al. | Apr 2006 | A1 |
20060083411 | Benkley | Apr 2006 | A1 |
20060097991 | Hotelling et al. | May 2006 | A1 |
20060103633 | Gioeli | May 2006 | A1 |
20060110013 | Lee | May 2006 | A1 |
20060110537 | Huang et al. | May 2006 | A1 |
20060140461 | Kim et al. | Jun 2006 | A1 |
20060141960 | Fernandez et al. | Jun 2006 | A1 |
20060144953 | Takao | Jul 2006 | A1 |
20060170528 | Fukushige et al. | Aug 2006 | A1 |
20060187200 | Martin | Aug 2006 | A1 |
20060190737 | Miyasaka | Aug 2006 | A1 |
20060210082 | Devadas et al. | Sep 2006 | A1 |
20060214512 | Iwata | Sep 2006 | A1 |
20060214921 | Takahashi et al. | Sep 2006 | A1 |
20060239514 | Watanabe et al. | Oct 2006 | A1 |
20060242268 | Omernick et al. | Oct 2006 | A1 |
20060244722 | Gust | Nov 2006 | A1 |
20060249008 | Luther | Nov 2006 | A1 |
20060259873 | Mister | Nov 2006 | A1 |
20060261174 | Zellner et al. | Nov 2006 | A1 |
20060261449 | Rapport et al. | Nov 2006 | A1 |
20060267385 | Steenwyk et al. | Nov 2006 | A1 |
20060271793 | Devadas et al. | Nov 2006 | A1 |
20060280346 | Machida | Dec 2006 | A1 |
20060287963 | Steeves et al. | Dec 2006 | A1 |
20070014443 | Russo | Jan 2007 | A1 |
20070021198 | Muir et al. | Jan 2007 | A1 |
20070031011 | Erhart et al. | Feb 2007 | A1 |
20070034783 | Eliasson et al. | Feb 2007 | A1 |
20070036400 | Watanabe et al. | Feb 2007 | A1 |
20070057763 | Blattner et al. | Mar 2007 | A1 |
20070061126 | Russo et al. | Mar 2007 | A1 |
20070067828 | Bychkov | Mar 2007 | A1 |
20070072631 | Mock et al. | Mar 2007 | A1 |
20070076923 | Chiu | Apr 2007 | A1 |
20070076926 | Schneider et al. | Apr 2007 | A1 |
20070076951 | Tanaka et al. | Apr 2007 | A1 |
20070086634 | Setlak et al. | Apr 2007 | A1 |
20070090312 | Stallinga et al. | Apr 2007 | A1 |
20070125937 | Eliasson et al. | Jun 2007 | A1 |
20070138299 | Mitra | Jun 2007 | A1 |
20070180261 | Akkermans et al. | Aug 2007 | A1 |
20070196002 | Choi et al. | Aug 2007 | A1 |
20070198141 | Moore | Aug 2007 | A1 |
20070198435 | Siegal et al. | Aug 2007 | A1 |
20070210895 | Kuhlman | Sep 2007 | A1 |
20070211923 | Kuhlman | Sep 2007 | A1 |
20070228154 | Tran | Oct 2007 | A1 |
20070236330 | Cho et al. | Oct 2007 | A1 |
20070237366 | Maletsky | Oct 2007 | A1 |
20070248249 | Stoianov | Oct 2007 | A1 |
20070274575 | Russo | Nov 2007 | A1 |
20070292005 | Lo et al. | Dec 2007 | A1 |
20080002867 | Mathiassen et al. | Jan 2008 | A1 |
20080013803 | Lo et al. | Jan 2008 | A1 |
20080013805 | Sengupta et al. | Jan 2008 | A1 |
20080013808 | Russo et al. | Jan 2008 | A1 |
20080019578 | Saito et al. | Jan 2008 | A1 |
20080030207 | Vermesan et al. | Feb 2008 | A1 |
20080042813 | Wheatley | Feb 2008 | A1 |
20080042814 | Hurwitz | Feb 2008 | A1 |
20080049980 | Castaneda et al. | Feb 2008 | A1 |
20080049987 | Champagne et al. | Feb 2008 | A1 |
20080049989 | Iseri et al. | Feb 2008 | A1 |
20080062140 | Hotelling et al. | Mar 2008 | A1 |
20080063245 | Benkley et al. | Mar 2008 | A1 |
20080069412 | Champagne et al. | Mar 2008 | A1 |
20080101662 | Lo et al. | May 2008 | A1 |
20080101663 | Lo et al. | May 2008 | A1 |
20080101705 | Mohamed et al. | May 2008 | A1 |
20080103995 | Mohamed et al. | May 2008 | A1 |
20080126260 | Cox et al. | May 2008 | A1 |
20080133373 | Perdomo et al. | Jun 2008 | A1 |
20080138078 | Alameh et al. | Jun 2008 | A1 |
20080138079 | Mui et al. | Jun 2008 | A1 |
20080154816 | Xiao et al. | Jun 2008 | A1 |
20080158178 | Hotelling et al. | Jul 2008 | A1 |
20080159688 | Schellinger et al. | Jul 2008 | A1 |
20080159698 | Alameh et al. | Jul 2008 | A1 |
20080159699 | Zeiger et al. | Jul 2008 | A1 |
20080165139 | Hotelling et al. | Jul 2008 | A1 |
20080165158 | Hotelling et al. | Jul 2008 | A1 |
20080166026 | Turek et al. | Jul 2008 | A1 |
20080169345 | Keane et al. | Jul 2008 | A1 |
20080170695 | Adler et al. | Jul 2008 | A1 |
20080174570 | Jobs et al. | Jul 2008 | A1 |
20080175450 | Scott | Jul 2008 | A1 |
20080178008 | Takahashi et al. | Jul 2008 | A1 |
20080179112 | Qin et al. | Jul 2008 | A1 |
20080185193 | Lin | Aug 2008 | A1 |
20080185429 | Saville | Aug 2008 | A1 |
20080201265 | Hewton | Aug 2008 | A1 |
20080205714 | Benkley et al. | Aug 2008 | A1 |
20080219521 | Benkley et al. | Sep 2008 | A1 |
20080222049 | Loomis et al. | Sep 2008 | A1 |
20080223925 | Saito et al. | Sep 2008 | A1 |
20080226132 | Gardner | Sep 2008 | A1 |
20080238878 | Wang | Oct 2008 | A1 |
20080240523 | Benkley et al. | Oct 2008 | A1 |
20080244277 | Orsini et al. | Oct 2008 | A1 |
20080247652 | Mohamed et al. | Oct 2008 | A1 |
20080265751 | Smith | Oct 2008 | A1 |
20080267462 | Nelson et al. | Oct 2008 | A1 |
20080273767 | Lo et al. | Nov 2008 | A1 |
20080273769 | Lo et al. | Nov 2008 | A1 |
20080279373 | Erhart et al. | Nov 2008 | A1 |
20080279416 | Lo et al. | Nov 2008 | A1 |
20080285813 | Holm | Nov 2008 | A1 |
20080298648 | Lo et al. | Dec 2008 | A1 |
20080309633 | Hotelling et al. | Dec 2008 | A1 |
20080316183 | Westerman et al. | Dec 2008 | A1 |
20090009486 | Sato et al. | Jan 2009 | A1 |
20090016913 | Smits | Jan 2009 | A1 |
20090056124 | Krebs et al. | Mar 2009 | A1 |
20090060728 | Grimes et al. | Mar 2009 | A1 |
20090074255 | Holm | Mar 2009 | A1 |
20090083850 | Fadell et al. | Mar 2009 | A1 |
20090130369 | Huang et al. | May 2009 | A1 |
20090153297 | Gardner | Jun 2009 | A1 |
20090154779 | Satyan et al. | Jun 2009 | A1 |
20090155456 | Benkley et al. | Jun 2009 | A1 |
20090024499 | Ribble | Jul 2009 | A1 |
20090169071 | Bond et al. | Jul 2009 | A1 |
20090169072 | Lo et al. | Jul 2009 | A1 |
20090174974 | Huang et al. | Jul 2009 | A1 |
20090210722 | Russo | Aug 2009 | A1 |
20090228952 | Gillig et al. | Sep 2009 | A1 |
20090237135 | Ramaraju et al. | Sep 2009 | A1 |
20090244820 | Kusaka et al. | Oct 2009 | A1 |
20090252384 | Dean et al. | Oct 2009 | A1 |
20090252385 | Dean et al. | Oct 2009 | A1 |
20090252386 | Dean et al. | Oct 2009 | A1 |
20090273573 | Hotelling | Nov 2009 | A1 |
20090273577 | Chen et al. | Nov 2009 | A1 |
20090279742 | Abiko | Nov 2009 | A1 |
20090314621 | Hotelling | Dec 2009 | A1 |
20090319435 | Little et al. | Dec 2009 | A1 |
20090324028 | Russo | Dec 2009 | A1 |
20100019032 | Kim | Jan 2010 | A1 |
20100026451 | Erhart et al. | Feb 2010 | A1 |
20100026453 | Yamamoto et al. | Feb 2010 | A1 |
20100030921 | Kim | Feb 2010 | A1 |
20100045705 | Vertegaal et al. | Feb 2010 | A1 |
20100050175 | Jung et al. | Feb 2010 | A1 |
20100059295 | Hotelling et al. | Mar 2010 | A1 |
20100083000 | Kesanupalli | Apr 2010 | A1 |
20100085325 | King-Smith et al. | Apr 2010 | A1 |
20100119124 | Satyan | May 2010 | A1 |
20100123675 | Ippel | May 2010 | A1 |
20100127366 | Bond et al. | May 2010 | A1 |
20100146275 | Slick et al. | Jun 2010 | A1 |
20100162386 | Li et al. | Jun 2010 | A1 |
20100176823 | Thompson et al. | Jul 2010 | A1 |
20100176892 | Thompson et al. | Jul 2010 | A1 |
20100177940 | Dean et al. | Jul 2010 | A1 |
20100180127 | Li et al. | Jul 2010 | A1 |
20100180136 | Thompson et al. | Jul 2010 | A1 |
20100189314 | Benkley et al. | Jul 2010 | A1 |
20100191634 | Macy et al. | Jul 2010 | A1 |
20100193257 | Hotelling | Aug 2010 | A1 |
20100208953 | Gardner et al. | Aug 2010 | A1 |
20100244166 | Shibuta et al. | Sep 2010 | A1 |
20100245553 | Schuler et al. | Sep 2010 | A1 |
20100272329 | Benkley | Oct 2010 | A1 |
20100284565 | Benkley | Nov 2010 | A1 |
20100318515 | Ramanathan et al. | Dec 2010 | A1 |
20110002461 | Erhart et al. | Jan 2011 | A1 |
20110018556 | Le et al. | Jan 2011 | A1 |
20110060913 | Hird et al. | Mar 2011 | A1 |
20110074734 | Wassvik et al. | Mar 2011 | A1 |
20110080370 | Wu | Apr 2011 | A1 |
20110082791 | Baghdasaryan et al. | Apr 2011 | A1 |
20110082800 | Baghdasaryan et al. | Apr 2011 | A1 |
20110082801 | Baghdasaryan et al. | Apr 2011 | A1 |
20110082802 | Baghdasaryan et al. | Apr 2011 | A1 |
20110083018 | Kesanupalli et al. | Apr 2011 | A1 |
20110083170 | Kesanupalli et al. | Apr 2011 | A1 |
20110083173 | Baghdasaryan et al. | Apr 2011 | A1 |
20110102567 | Erhart | May 2011 | A1 |
20110102569 | Erhart | May 2011 | A1 |
20110138450 | Kesanupalli et al. | Jun 2011 | A1 |
20110175703 | Benkley | Jul 2011 | A1 |
20110176037 | Benkley | Jul 2011 | A1 |
20110182486 | Valfridsson et al. | Jul 2011 | A1 |
20110193799 | Jun et al. | Aug 2011 | A1 |
20110214924 | Perezselsky et al. | Sep 2011 | A1 |
20110215150 | Schneider et al. | Sep 2011 | A1 |
20110242021 | Jun et al. | Oct 2011 | A1 |
20110261003 | Lee et al. | Oct 2011 | A1 |
20110267298 | Erhart et al. | Nov 2011 | A1 |
20110285640 | Park et al. | Nov 2011 | A1 |
20110298711 | Dean et al. | Dec 2011 | A1 |
20110304001 | Erhart et al. | Dec 2011 | A1 |
20120012652 | Couper et al. | Jan 2012 | A1 |
20120044639 | Garcia | Feb 2012 | A1 |
20120069042 | Ogita et al. | Mar 2012 | A1 |
20120075252 | Dighde et al. | Mar 2012 | A1 |
20120105081 | Shaikh | May 2012 | A1 |
20120162099 | You et al. | Jun 2012 | A1 |
20120182259 | Han | Jul 2012 | A1 |
20120189166 | Russo | Jul 2012 | A1 |
20120189172 | Russo | Jul 2012 | A1 |
20120242635 | Erhart et al. | Sep 2012 | A1 |
20120299863 | Yilmaz | Nov 2012 | A1 |
20130069894 | Chen | Mar 2013 | A1 |
20130106759 | Fredriksen et al. | May 2013 | A1 |
20130135247 | Na et al. | May 2013 | A1 |
20130181911 | Yilmaz et al. | Jul 2013 | A1 |
20130181949 | Setlak | Jul 2013 | A1 |
20130229379 | Joguet et al. | Sep 2013 | A1 |
20130265137 | Nelson et al. | Oct 2013 | A1 |
20140036168 | Ludwig | Feb 2014 | A1 |
20140184525 | Kim et al. | Jul 2014 | A1 |
Number | Date | Country |
---|---|---|
2011100746 | Apr 2011 | AU |
1247856 | Mar 2000 | CN |
1251650 | Apr 2000 | CN |
101383704 | Oct 2001 | CN |
1538154 | Oct 2004 | CN |
1841280 | Oct 2006 | CN |
1889103 | Jan 2007 | CN |
1942853 | Apr 2007 | CN |
101046458 | Oct 2007 | CN |
101515214 | Aug 2009 | CN |
101582002 | Nov 2009 | CN |
201936289 | Aug 2011 | CN |
2213813 | Oct 1973 | DE |
3216389 | Nov 1983 | DE |
19606408 | Jun 1997 | DE |
102005051530 | Oct 2005 | DE |
0905646 | Mar 1999 | EP |
0929028 | Jul 1999 | EP |
0973123 | Jan 2000 | EP |
1018697 | Jul 2000 | EP |
1113405 | Dec 2000 | EP |
1113383 | Jul 2001 | EP |
1139301 | Oct 2001 | EP |
1148446 | Oct 2001 | EP |
1289239 | Mar 2003 | EP |
1289239 | Mar 2005 | EP |
1531419 | May 2005 | EP |
1533759 | May 2005 | EP |
1538548 | Jun 2005 | EP |
1624399 | Oct 2007 | EP |
1939788 | Jul 2008 | EP |
2343677 | Jul 2011 | EP |
2343679 | Jul 2011 | EP |
2348472 | Jul 2011 | EP |
2839173 | Oct 2003 | FR |
2331613 | May 1999 | GB |
2480919 | Dec 2011 | GB |
62-226030 | Oct 1987 | JP |
04158434 | Jun 1992 | JP |
10-020992 | Jan 1998 | JP |
11-164824 | Jun 1999 | JP |
2000-57328 | Feb 2000 | JP |
2001-077342 | Mar 2001 | JP |
2003-511799 | Mar 2003 | JP |
2005242856 | Sep 2005 | JP |
2006-271028 | Oct 2005 | JP |
2006-184104 | Jul 2006 | JP |
2006-517023 | Jul 2006 | JP |
2007-010338 | Jan 2007 | JP |
2007-018168 | Jan 2007 | JP |
2008-134836 | Jun 2008 | JP |
2009-516295 | Apr 2009 | JP |
09071135 | Apr 2009 | JP |
2011022788 | Feb 2011 | JP |
2011059793 | Mar 2011 | JP |
20050080628 | Aug 2005 | KR |
201112068 | Jan 2011 | TW |
201120507 | Jun 2011 | TW |
WO 9003620 | Apr 1990 | WO |
WO 9815225 | Apr 1998 | WO |
WO 9852145 | Nov 1998 | WO |
WO 9852146 | Nov 1998 | WO |
WO 9852147 | Nov 1998 | WO |
WO 98521517 | Nov 1998 | WO |
WO 9858342 | Dec 1998 | WO |
WO 9928701 | Jun 1999 | WO |
WO 9943258 | Sep 1999 | WO |
WO 0068873 | Nov 2000 | WO |
WO 0068874 | Nov 2000 | WO |
WO 0072507 | Nov 2000 | WO |
WO 0109819 | Feb 2001 | WO |
WO 0109936 | Feb 2001 | WO |
WO 0122349 | Mar 2001 | WO |
WO 01027868 | Apr 2001 | WO |
WO 0129731 | Apr 2001 | WO |
WO 0139134 | May 2001 | WO |
WO 0194966 | Jun 2001 | WO |
WO 0169520 | Sep 2001 | WO |
WO 0173678 | Oct 2001 | WO |
WO 0177994 | Oct 2001 | WO |
WO 0180166 | Oct 2001 | WO |
WO 0194902 | Dec 2001 | WO |
WO 0195304 | Dec 2001 | WO |
WO 0195305 | Dec 2001 | WO |
WO 0199035 | Dec 2001 | WO |
WO 0199036 | Dec 2001 | WO |
WO 0394892 | Dec 2001 | WO |
WO 0211066 | Feb 2002 | WO |
WO 0215209 | Feb 2002 | WO |
WO 0215267 | Feb 2002 | WO |
WO 0244998 | Jun 2002 | WO |
WO 0247018 | Jun 2002 | WO |
WO 0261668 | Aug 2002 | WO |
WO 02069386 | Sep 2002 | WO |
WO 02071313 | Sep 2002 | WO |
WO 02073375 | Sep 2002 | WO |
WO 02077907 | Oct 2002 | WO |
WO 02086800 | Oct 2002 | WO |
WO 02093462 | Oct 2002 | WO |
WO 02095349 | Nov 2002 | WO |
WO 03007127 | Jan 2003 | WO |
WO 03017211 | Feb 2003 | WO |
WO 03049011 | Jun 2003 | WO |
WO 03049012 | Jun 2003 | WO |
WO 03049016 | Jun 2003 | WO |
WO 03049104 | Jun 2003 | WO |
WO 03050963 | Jun 2003 | WO |
WO 03063054 | Jul 2003 | WO |
WO 03075210 | Sep 2003 | WO |
WO 03075210 | Dec 2003 | WO |
WO 2004066194 | Aug 2004 | WO |
WO 2004066693 | Aug 2004 | WO |
WO 2005104012 | Nov 2005 | WO |
WO 2005106774 | Nov 2005 | WO |
WO 2006040724 | Apr 2006 | WO |
WO 2006041780 | Apr 2006 | WO |
WO 2007011607 | Jan 2007 | WO |
WO 2007048395 | May 2007 | WO |
WO 2007058727 | May 2007 | WO |
WO 0165470 | Sep 2007 | WO |
WO 2008033264 | Mar 2008 | WO |
WO 2008033265 | Mar 2008 | WO |
WO 2008137287 | Nov 2008 | WO |
WO 2009002599 | Dec 2008 | WO |
WO 2009079219 | Jun 2009 | WO |
WO 2009079221 | Jun 2009 | WO |
WO 2009079257 | Jun 2009 | WO |
WO 2009079262 | Jun 2009 | WO |
WO 2010034036 | Mar 2010 | WO |
WO 2010036445 | Apr 2010 | WO |
WO 2010143597 | Dec 2010 | WO |
WO 2011035491 | Mar 2011 | WO |
WO 2011053797 | May 2011 | WO |
WO 2011088252 | Jul 2011 | WO |
WO 2012014206 | Feb 2012 | WO |
Entry |
---|
Biometrics, “A Responsive Supply Chain”, By Andrew Conry-Murray, 5 pages, posted Jul. 7, 2002, as printed Aug. 23, 2004, at http://www.networkmagazine.com/shared/article/ID-17601104. |
Ballard and Brown, “Computer Vision”, Prentice Hall, 1982, pp. 65-69. |
S. Shigematsu et al., “A Single-Chip Fingerprint Sensor and Identifier”, IEEE Journal of Solid-State Circuits, vol. 34, No. 12, Dec. 1999, pp. 1852-1859. |
“Fingernail Touch Sensors: Spatially Distributed Measurement and Hemodynamic Modeling”, Stephen Mascaro and H. Harry Asada, 2000 IEEE, pp. 3422-3427. |
Jacob O. Wobbrock, Brad A. Myers, Htet Htet Aung, and Edmund F. LoPresti, Text Entry from Power Wheelchairs: EdgeWrite for Joysticks and Touchpads, pp. 110-117, Human-Computer Interaction Institute, School of Computer Science, Carnegie Mellon University, Pittsburgh, PA 15213 USA. |
Bartholomew J. Kane, “A High Resolution Traction Stress Sensor Array For Use In Robotic Tactile Determination”, A Dissertation Submitted to the Department of Mechanical Engineering and the Committee on Graduate Studies of Stanford University in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy, Sep. 1999. |
Choonwoo Ryu et al. “Super-template Generation Using Successive Bayesian Estimation for Fingerprint Enrollment”, Jan. 2005 Springer-Verlag Berlin Heidelberg, pp. 710-719. |
Dongjae Lee et al. “Fingerprint Fusion Based on Minutiae and Ridge for Enrollment”, Jan. 2000 Springer-Verlag Berlin Heidelberg, pp. 478-485. |
Koichi Sasakawa et al. “Personal Verification System with High Tolerance of Poor Quality Fingerprints”, May 1990 Machine Vision Systems Integration in Industry, pp. 265-272. |
Michal Irani et al., “Improving Resolution by Image Registration”, May 1991 by Academic Press, Inc., pp. 231-239. |
Qinfen Zheng et al. “A Computational Vision Approach to Image Registration”, Aug. 1992 IEEE, pp. 311-326. |
Wei-Yun Yau et al. “Fingerprint Templates Combination”, Jan. 2004 Springer-Verlag Berlin Heidelberg, pp. 449-460. |
Xudong Jiang et al. “Detecting the Fingerprint Minutiae by Adaptive Tracing the Gray-Level Ridge”, May 2001, pp. 999-1013, Published by Elsevier Science Ltd. |
Xudong Jiang et al. “Fingerprint Minutiae Matching Based on the Local Global Structures”, Sep. 2000 IEEE, pp. 1038-1041. |
I-Control, PDS 3000 TM Product Brief “Mobile Finger-Based Control Sensor”, 12 pages, Jul. 2003. |
Saunvil Pandya et al. “CORAL: Miniature Acoustic Communication Subsystem Architecture for Underwater Wireless Sensor Networks” University of Illinois at Urbana-Champaign Micro and Nanotechnology Laboratory. |
Search Report dated Sep. 26, 2002 for International Application No. PCT/US2001/46525. |
Search Report dated Aug. 9, 2005 for International Application No. PCT/US2005/012792. |
Search Report dated Dec. 12, 2005 for International Application No. PCT/US2005/013943. |
Search Report dated Dec. 22, 2005 for European Application No. 05021634.0-2218. |
Davide Maltoni, “Handbook of Fingerprint Recognition”, XP002355942 Springer, New York, USA, Jun. 2003, pp. 65-69. |
Vermesan et al., “A 500-dpi AC Capacitive Hybrid Flip-Chip CMOS ASIC/Sensor Module for Fingerprint, Navigation, and Pointer Detection With On-Chip Data Processing”, IEEE Journal of Solid State Circuits, vol. 38, No. 12, Dec. 2003, pp. 2288-2296. |
International Search Report and Written Opinion dated Jan. 30, 2006 for Application No. PCT/US2005/035504. |
Matsumoto et al., Impact of Artificial “Gummy” Fingers on Fingerprint Systems, SPIE 4677 (2002), reprinted from cryptome.org. |
Maltoni, “Handbook of Fingerprint Recognition”, XP002355942, Springer, New York, USA, Jun. 2003, pp. 65-69. |
Ratha, et al. “Adaptive Flow Orientation Based Feature Extraction in Fingerprint Images,” Pattern Recognition, vol. 28, No. 11, 1657-1672, Nov. 1995. |
Ratha, et al., “A Real Time Matching System for Large Fingerprint Databases,” IEEE, Aug. 1996. |
Suh, et al., “Design and Implementation of the AEGIS Single-Chip Secure Processor Using Physical Random Functions”, Computer Architecture, 2005, ISCA '05, Proceedings, 32nd International Symposium, Jun. 2005 (MIT Technical Report CSAIL CSG-TR-843, 2004). |
Rivest, et al., “A Method for Obtaining Digital Signatures and Public-Key Cryptosystems”, Communication of the ACM, vol. 21, (2), pp. 120-126. (1978). |
Hiltgen, et al., “Secure Internet Banking Authentication”, IEEE Security and Privacy, IEEE Computer Society, New York, NY, US, Mar. 1, 2006, pp. 24-31, XP007908655, ISSN: 1540-7993. |
Hegt, “Analysis of Current and Future Phishing Attacks on Internet Banking Services”, Master Thesis, Technische Universiteit Eindhoven—Department of Mathematics and Computer Science, May 31, 2008, pp. 1-149, XP002630374, Retrieved from the Internet: URL:http://alexandria.tue.nl/extra1/afstversl/wsk-i/hegt2008.pdf [retrieved on Mar. 29, 2011] pp. 127-137, paragraph 6.2. |
Gassend, et al., “Controlled Physical Random Functions”, In Proceedings of the 18th Annual Computer Security Conference Las Vegas, Nevada, Dec. 12, 2002. |
bellagiodesigns.com (Internet Archive Wayback Machine, www.bellagiodesigns.com date: Oct. 29, 2005). |
Wikipedia (Mar. 2012). “Integrated Circuit,” http://en.wikipedia.org/wiki/integrated_circuit. Revision as of Mar. 23, 2012. |
Closed Loop Systems, The Free Dictionary, http://www.thefreedictionary.com/closed-loop+system (downloaded Mar. 28, 2012). |
Feedback: Electronic Engineering, Wikipedia, p. 11 http://en.wikipedia.org/wiki/Feedback#Electronic_engineering (downloaded Mar. 28, 2012). |
ITD, “Anti-Money Laundering”, ITD, Jan. 22, 2009. |
Galy et al. (Jul. 2007) “A full fingerprint verification system for a single-line sweep sensor.” IEEE Sensors J., vol. 7 No. 7, pp. 1054-1065. |
Final Office Action issued in U.S. Appl. No. 13/860,494, 35 pages (dated May 4, 2017). |
Office Action (with English translation) issued in Korean Patent Application No. 10-2012-7021324, 20 pages (dated May 30, 2017). |
Office Action (and English language summary) with Search Report issued in Chinese Patent Application No. 201380030318.8, 12 pages (dated May 25, 2017). |
Non-final Office Action issued in U.S. Appl. No. 14/243,116, 24 pages (dated Jun. 15, 2017). |
Final Office Action issued in U.S. Appl. No. 14/243,122, 24 pages (dated Jun. 16, 2017). |
Notice of Allowance issued in U.S. Appl. No. 15/357,019, 92 pages (dated Jun. 9, 2017). |
Advisory Action issued in U.S. Appl. No. 13/860,494, 5 pages (dated Jun. 24, 2016). |
Chinese Office Action dated Dec. 1, 2014, CN Application No. 201180014263.2, 7 pages. |
Chinese Office Action dated Dec. 3, 2014, CN Application No. 201180014237.X, 9 pages. |
Chinese Office Action dated Mar. 2, 2016, CN Application No. 201180014263.2, 8 pages. |
Chinese Office Action dated Nov. 11, 2015, CN Application No. 201180014237.X, 6 pages. |
Corrected Notice of Allowability issued in U.S. Appl. No. 13/620,271, 12 pages (dated Sep. 18, 2015). |
Corrected Notice of Allowability issued in U.S. Appl. No. 13/620,271, 6 pages (dated Oct. 2, 2015). |
Extended European Search Report dated Mar. 2, 2016, Application No. 15179001.1 12 pages. |
Extended European Search Report issued in EP Patent Application No. 15179015.1, 7 pages (dated Nov. 24, 2015). |
Extended European Search Report issued in European Patent Application No. 13775647.4, 7 pages (dated Oct. 1, 2015). |
Extended European Search Report issued in European Patent Application No. 15179001, 7 pages (dated Nov. 9, 2015). |
Final Office Action dated Nov. 18, 2015, U.S. Appl. No. 13/801,991, 12 pages. |
Final Office Action issued in U.S. Appl. No. 13/860,494, 42 pages (dated Apr. 15, 2016). |
Final Office Action issued in U.S. Appl. No. 14/243,116, 22 pages (dated Feb. 10, 2017). |
Japanese Office Action (with English translation attached) dated Feb. 24, 2016, JP Application No. 2012-549090, 5 pages. |
Japanese Office Action dated Dec. 2, 2014, JP Application No. 2012-549090, 15 pages. |
Japanese Office Action dated Sep. 2, 2014, JP Application No. 2012-549092, 8 pages. |
Non-final Office Action issued in U.S. Appl. No. 13/801,991, 24 pages (dated Apr. 9, 2015). |
Notice of Allowance issued in U.S. Appl. No. 13/620,271, 21 pages (dated Apr. 21, 2015). |
Notice of Allowance issued in U.S. Appl. No. 13/620,271, 127 pages (dated Aug. 3, 2015). |
Office Action issued in U.S. Appl. No. 13/860,494, 98 pages (dated Oct. 7, 2015). |
Office Action issued in U.S. Appl. No. 13/860,494, 25 pages (dated Sep. 8, 2016). |
Office Action issued in U.S. Appl. No. 14/243,116, 112 pages (dated Aug. 10, 2016). |
Office Action issued in U.S. Appl. No. 14/243,122 dated Dec. 2, 2016, 16 pages. |
Taiwanese Office Action dated Nov. 16, 2015 (with English translation attached), TW Application No. 100101376, 10 pages. |
Taiwanese Search Report dated Jan. 16, 2015, TW Application No. 099146099, Filed: Dec. 27, 2010, 1 page. |
Examination Report issued in European Patent Application No. 13775647.4, 4 pages (dated Nov. 16, 2017). |
U.S. Office Action dated Jan. 29, 2018 issued in U.S. Appl. No. 13/860,494. |
Chinese Office Action dated Jan. 31, 2018 issued in Chinese Patent Application No. 201380030318.8 (with Explanation of Relevance). |
Number | Date | Country
---|---|---
20170308228 A1 | Oct 2017 | US

Number | Date | Country
---|---|---
61622474 | Apr 2012 | US

| Number | Date | Country
---|---|---|---
Parent | 14243122 | Apr 2014 | US
Child | 15590454 | | US

| Number | Date | Country
---|---|---|---
Parent | 13860494 | Apr 2013 | US
Child | 14243122 | | US