Display with integrated touch screen and fingerprint sensor

Information

  • Patent Grant
  • Patent Number
    10,101,851
  • Date Filed
    Tuesday, May 9, 2017
  • Date Issued
    Tuesday, October 16, 2018
Abstract
A display is integrated with a touch screen and a fingerprint sensor having a plurality of substantially parallel drive lines configured to transmit a signal into a surface of a proximally located object and a plurality of substantially parallel pickup lines oriented proximate the drive lines and electrically separated from the drive lines to form intrinsic electrode pairs that are impedance sensitive at each of the drive and pickup proximal locations. The display may comprise an OLED display with an OLED injector layer and an OLED emissive layer disposed between the drive lines and pickup lines.
Description
BACKGROUND OF INVENTION

The embodiments are generally related to electronic sensing devices, and, more particularly, to sensors for sensing objects located near or about the sensors for use in media navigation, fingerprint sensing and other operations of electronic devices and other products.


In the electronic sensing market, there are a wide variety of sensors for sensing objects at a given location. Such sensors are configured to sense electronic characteristics of an object in order to detect the presence of an object near or about the sensor, physical characteristics of the object, shapes, textures on the object's surfaces, material composition, biological information, and other features and characteristics of the object being sensed.


Sensors may be configured to passively detect characteristics of an object, for example by measuring the temperature, weight, or various emissions (photonic, magnetic or atomic) of an object in close proximity to or in contact with the sensor, or some other characteristic. An example of this is a non-contact infrared thermometer that detects the black body radiation spectrum emitted from an object, from which its temperature can be computed.


Other sensors work by directly exciting an object with a stimulus such as voltage or current, then using the resultant signal to determine the physical or electrical characteristics of an object. An example of this is a fluid detector consisting of two terminals, one that excites the medium with a voltage source, while the second measures the current flow to determine the presence of a conductive fluid such as water.


Since a single point measurement of an object often does not provide enough information for practical applications, it is often advantageous to collect a two-dimensional array of measurements. A two dimensional array of impedance measurements may be created by moving a line sensing array over the surface of an object and then doing a line by line reconstruction of a two dimensional image, much as a fax machine does. An example of this is a swiped capacitive fingerprint sensor that measures differences in capacitance between fingerprint ridges and valleys as a finger is dragged across it. The swiping motion by the user allows the one-dimensional line of sensor points to capture a large number of data points from the user's fingerprint surface. Such sensors reconstruct a two dimensional fingerprint image after the fact using the individual lines of captured data points. This reconstruction process requires a great deal of processing by a device, and is subject to failure if the swipe movement and conditions are not optimal.


A more user friendly way to obtain a two dimensional image is to create a two dimensional sensing array that can capture a user's fingerprint data while the user holds the fingerprint surface still on the sensor surface, rather than swiping it across a sensor. Such sensors, however, can be cost prohibitive due to the large number of sensing points needed in the array. An example of this is a two dimensional capacitive fingerprint sensor, a number of which are currently manufactured. These sensors, however, use 150 mm² or more of silicon area and are therefore cost prohibitive for many applications. They are also delicate and fragile: they are sensitive to impact and even temperature changes, and thus are simply not durable enough for most applications, such as smart phones and other mobile electronic devices that are handled and sometimes dropped by users.


These different types of electronic sensors have been used in various applications, such as biometric sensors for measuring biological features and characteristics of people such as fingerprints, medical applications such as medical monitoring devices, fluid measuring monitors, and many other sensor applications. Typically, the sensing elements of the various devices are connected to a processor configured to process object information and to enable interpretations for object features and characteristics. Examples include ridges and valleys of a fingerprint, temperature, bulk readings of presence or absence, and other features and characteristics.


There are many applications for two dimensional image sensors in particular, and innovators have struggled with state of the art technology that has fallen short of desired features and functions. Fingerprint sensors, for example, have been in existence for many years and used in many environments to verify identification, to provide access to restricted areas and information, and for many other purposes. In this patent application, for simplicity of explanation, different types of fingerprint sensors will be highlighted as examples of sensor applications to which the embodiments are applicable, but other types of applications are also relevant to this background discussion and will also be addressed by the detailed description of the embodiments. Placement sensors may be configured to sense objects placed near or about the sensor, such as a fingerprint placement sensor that is configured to capture a full image of a fingerprint from a user's finger and compare the captured image with a stored image for authentication. Alternatively, sensors may be configured to sense the dynamic movement of an object about the sensor, such as a fingerprint swipe sensor that captures partial images of a fingerprint, reconstructs the fingerprint image, and compares the captured image to a stored image for authentication.


In such applications, cost, though always a factor in commercial products, has not been so critical; accuracy and reliability have been and still remain the paramount factors. Typically, the placement sensor, a two-dimensional grid of sensors that senses a fingerprint image from a user's fingerprint surface all at once, was the obvious choice, and its many designs have become standard in most applications. Once the fingerprint image is sensed and reproduced in digital form in a device, it is compared against a prerecorded and stored image, and authentication is complete when there is a match between the captured fingerprint image and the stored image. In recent years, fingerprint sensors have been finding their way into portable devices such as laptop computers, hand held devices, cellular telephones, and other devices. Though accuracy and reliability are still important, the cost of the system components is very important as well. Conventional placement sensors were and still are very expensive for one primary reason: they all used silicon sensor surfaces (excluding optical sensors from this example, because they are simply too large and require more power than a portable device can afford to allocate, among other reasons, and thus are generally not available in most commercially available devices). These silicon surfaces are very expensive, as the silicon material is as expensive as the material used to make a computer chip. Computer chips, of course, have become smaller over the years to reduce their cost and improve their performance. The reason the fingerprint silicon could not be made smaller is that the sensor must remain the size of the average fingerprint, and the requirement for full scanning of the user's fingerprint simply cannot be compromised. Substantially the full print is required for adequate security in authentication.


Enter the fingerprint swipe sensor into the market. Swipe sensors are fundamentally designed with a line sensor configured to sense fingerprint features as a user swipes a finger in a direction perpendicular to the sensor line. The cost saver: swipe sensors need much less silicon, only enough to configure a line sensor with an array of pixel sensors. The width is still fixed based on the average fingerprint width, but the depth is substantially smaller compared to a placement sensor. Some swipe sensors are capacitive sensors, where the capacitance of the fingerprint surface is measured and recorded line by line. Others send a small signal pulse burst into the fingerprint surface and measure a response on a pickup line, again recording fingerprint features line by line. In either case, unlike with placement sensors, the full fingerprint image needs to be reconstructed after the user completes the swipe, and the individual lines are reassembled and rendered to produce a full fingerprint image. This image is compared with a fingerprint image stored in the laptop or other device, and the user is then authenticated if there is an adequate match.


For the capacitive swipe sensors, the first generation of sensors was constructed with direct current (DC) switched capacitor technology (for example U.S. Pat. No. 6,011,859). This approach required using two plates per pixel, forming a capacitor between them, allowing the local presence of a finger ridge to change the value of that capacitor relative to air. These DC capacitive configurations took images from the fingerprint surface and did not penetrate below the finger surface. Thus, they were easy to spoof, or fake a fingerprint with different deceptive techniques, and they also performed poorly when a user had dry fingers. RF (Radio Frequency) sensors were later introduced because some were able to read past the surface and into inner layers of a user's finger to sense a fingerprint. Different radio frequencies have been utilized by various devices, along with different forms of detection including amplitude modulation (AM) and phase modulation (PM). There are also differing configurations of transmitters and receivers: one type (for example U.S. Pat. No. 5,963,679) uses a single transmitter ring and an array of multiple low quality receivers that are optimized for on-chip sensing. In contrast, another type (for example U.S. Pat. No. 7,099,496) uses a large array of RF transmitters with only one very high quality receiver in a comb-like plate structure optimized for off-chip sensing.


One key impediment to the development of low cost placement sensors has been the issue of pixel density, and the resultant requirement for a large number of interconnections between layers of the sensor device. A typical sensor for a fingerprint application will be on the order of 10 mm×10 mm, with a resolution of 500 dpi. Such a sensor array would be approximately 200 rows by 200 columns, meaning there would need to be 200 via connections between layers in the device. While semiconductor vias can be quite small, the cost for implementing a sensor in silicon has proven to be prohibitive, as mentioned above.
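
As a rough check on the figures above, the pixel count follows directly from the sensor dimensions and the resolution. The short sketch below uses assumed nominal values (a 10 mm × 10 mm active area at 500 dpi, not taken from any particular device) and reproduces the approximately 200 × 200 array cited:

```python
# Rough sizing check for the figures quoted above (assumed nominal values).
MM_PER_INCH = 25.4

def grid_dimensions(width_mm: float, height_mm: float, dpi: int) -> tuple[int, int]:
    """Return (rows, columns) for a sensor of the given size and resolution."""
    pitch_mm = MM_PER_INCH / dpi          # ~0.0508 mm, i.e. about 50 um per pixel
    rows = round(height_mm / pitch_mm)
    cols = round(width_mm / pitch_mm)
    return rows, cols

rows, cols = grid_dimensions(10.0, 10.0, 500)
print(rows, cols)   # 197 197 -- roughly the 200 x 200 array cited above
```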


In order to produce a placement sensor at a low enough cost for mass market adoption, lower cost processes such as circuit board etching must be employed. The current state of the art in circuit board via pitch is on the order of 200 μm, vs. the 50 μm pitch of the sensor array itself. Additionally, the added process steps required to form vias between layers of a circuit board significantly increase the tolerances for the minimum pitch of traces on each of the layers. Single-sided circuits may be readily fabricated with high yield with line pitch as low as 35 μm, whereas double sided circuits require a minimum line pitch on the order of 60 μm or more, which is too coarse to implement a full 500 dpi sensor array. One further consideration is that at similar line densities, double-sided circuits with vias are several times more expensive per unit area than single sided, making high-density double sided circuits too expensive for low cost sensor applications.


For laptop devices, adoption of the swipe sensor was driven by cost. The swipe sensor was substantially less expensive compared to the placement sensors, and most manufacturers of laptops adopted them based solely on price. The cost savings is a result of using less silicon area. More recently a substitute for the silicon sensor arose, using plastic Kapton® tape with etched sensing plates on it, connected to a separate processor chip (for example U.S. Pat. No. 7,099,496). This allowed the silicon portion of the sensor to be separated from the sensing elements and the silicon to follow Moore's law, shrinking to an optimal size, in length, width and depth in proportion to advances in process technology. Although this advance in the art enabled cheap durable Swipe Sensors, it did not overcome the basic image reconstruction and ergonomics issues resulting from changing from a simple two dimensional placement format. In addition to Swipe Sensors being cheaper, they take up less real estate in a host device, whether it is a laptop or a smaller device, such as a cellular phone or personal data device.


In most swipe class sensors, the fingerprint reconstruction process turned out to be a greater ergonomic challenge to users and more of a burden to quality control engineers than initially expected. Users needed to be trained to swipe their finger in a substantially straight and linear direction perpendicular to the sensor line as well as to control contact pressure. Software training programs were written to help the user become more proficient, but different environmental factors and the inability of some users to repeat the motion reliably gave swipe sensors a reputation for being difficult to use. Initial data from the field indicated that a large number of people were not regularly using the swipe sensors in the devices that they had purchased and opted back to using passwords. Quality control engineers who tried to achieve optimum accuracy and performance in the matching process between the captured and reconstructed image found that the false reject rate (FRR) and false accept rate (FAR) were much higher for swipe sensors than for placement sensors. Attempts to improve these reconstruction algorithms failed to produce statistical performance equivalent to placement sensors.


Development of sensors that take up less space on devices has been attempted without much success. Various ramps, wells and finger guides had to be incorporated into the surfaces of the host devices to assist the user with finger placement and swiping. These structures ended up consuming significant space in addition to the actual sensor area. In the end, swipe sensors ended up taking almost as much space as the placement sensors. This was not a big problem for full size laptops, but it is currently a substantial problem for smaller laptops and netbooks, mobile phones, PDAs, and other small devices like key fobs.


Real estate constraints have become even more of an issue with mobile device manufacturers, who now require that the fingerprint sensor also act as a navigation device, like a mouse or touch-pad does in a laptop. The swipe sensor has proved to be a poor substitute for a mouse or touch pad because it is constructed with an asymmetric array of pixels. Swipe sensors do a good job of detecting motion along the normal axis of the finger swipe but have difficulty accurately tracking sideways motion. Off-axis angular movements are even more difficult to sense, require significant processor resources to interpolate that movement with respect to the sensor line, and often cannot be resolved at large angles. The byproduct of all this is motion tracking that is not fluid and is difficult to use.


It is clear that low cost two dimensional fingerprint sensor arrays would serve a market need, but present art has not been able to fill that need. Conventional capacitive fingerprint sensors typically use distinct electrode structures to form the sensing pixels array. These electrode structures are typically square or circular and can be configured in a parallel plate configuration (for example U.S. Pat. Nos. 5,325,442 and 5,963,679) or a coplanar configuration (for example U.S. Pat. No. 6,011,859 and U.S. Pat. No. 7,099,496).


These prior art approaches cannot be configured into a low cost two dimensional array of sensing elements. Many capacitive fingerprint sensors (for example U.S. Pat. Nos. 5,963,679 and 6,011,859) have plate structures that must be connected to the drive and sense electronics with an interconnect density that is not practical to implement other than by using the fine line multilayer routing capabilities of silicon chips, and they therefore require a large amount of expensive silicon die area, as stated before. Other sensors (for example U.S. Pat. No. 7,099,496) use off-chip sensing elements on a cheap polymer film, but the sensor cell architecture is inherently one dimensional and cannot be expanded into a two dimensional matrix.


Another application for capacitive sensing arrays has been in the area of touch pads and touch screens. Because touchpad and touch screen devices consist of arrays of drive and sense traces and distinct sense electrodes, they are incapable of resolutions finer than a few hundred microns, making this technology unsuitable for detailed imaging applications. These devices are capable of detecting finger contact or proximity, but they provide neither the spatial resolution nor the gray-scale resolution necessary to detect fine features, such as ridges or valleys, within the body of the object being sensed.


Conventional art in the touchpad field utilizes a series of electrodes coupled to the drive and sense traces either conductively (for example U.S. Pat. No. 5,495,077) or capacitively (for example US publication 2006/0097991). In operation these devices produce a pixel that is significantly larger in scale than the interconnect traces themselves. The purpose is generally to sense the presence and motion of an object to enable a user to navigate a cursor, to select an object on a screen, or to move a page illustrated on a screen. Thus, these devices operate at a low resolution when sensing adjacent objects.


Thus, there exists a need in the art for improved devices that can provide high quality and accurate placement sensors for use in different applications, such as fingerprint sensing and authentication for example, and that may also operate as a navigation device such as a mouse or touch pad in various applications. As will be seen, the embodiment provides such a device that addresses these and other needs in an elegant manner. Given the small size and functional demands of mobile devices, space savings are important. Thus, it would also be useful to be able to combine the functions of a sensor with that of other components, such as power switches, selector switches, and other components, so that multiple functions are available to a user without the need for more components that take up space.


Still further, it would also be useful for different embodiments of a touch sensor to provide various alternative biometric sensors that are easy to use and feasible in different applications.


Even further, it would be useful for sensors to not only act as image capturing components, but to also provide navigation operations for viewing and exploring various media, such as with touch-screens used in many smart phones, such as the iPad™, iPod™, iPhone™ and other touch-sensitive devices produced by Apple Corporation™, the Galaxy™ and its progeny by Samsung Corporation™, and other similar devices.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 shows a diagrammatic view of one embodiment showing the drive and pickup plate structures with an insulating dielectric layer separating the drive and pickup lines.



FIG. 2 shows a basic diagrammatic view of one embodiment showing the basic electrical field operation without an object in close proximity to the drive and pickup plate structures with one drive plate excited by a voltage source.



FIG. 3 shows a basic diagrammatic view of one embodiment showing the basic electrical field operation with an object in close proximity to the drive and pickup plate structures with one drive plate excited by a voltage source.



FIG. 4 shows a basic diagrammatic view of one embodiment of the sensor showing the differences in field intensity with and without an object in close proximity to the drive and pickup plate structures with one drive plate excited by a voltage source.



FIG. 5 shows a basic diagrammatic view of one embodiment showing the basic electrical field operation with an object in close proximity of the drive and pickup plate structures with the selected pickup plate amplified and all inactive drive and pickup plates grounded.



FIG. 6a shows a basic diagrammatic view of one embodiment showing the basic electrical field operation with a finger or object containing a ridge surface feature in close proximity to the active electrode pair.



FIG. 6b shows a basic diagrammatic view of one embodiment showing the basic electrical field operation with a finger or object containing a valley surface feature in close proximity to the active electrode pair.



FIG. 7 shows a diagrammatic view of an x-y grid of plate rows and columns depicted by lumped circuit components that represent the electric field couplings of the sensor at each drive/pickup crossover.



FIG. 8 shows an example of an embodiment of the placement sensor using a differential amplifier to take the signal from the selected pickup plate and subtract it from a reference signal plate for noise reduction purposes.



FIG. 9a shows the drive and sense multiplexing circuitry of an embodiment that incorporates a tank circuit to compensate for input loading effects.



FIG. 9b shows the drive and sense multiplexing circuitry of an embodiment that incorporates cascaded buffers to minimize input loading effects.



FIG. 9c shows the drive and sense multiplexing circuitry of an embodiment that incorporates dedicated buffers for each sense line to minimize loading effects.



FIG. 10 shows an embodiment that incorporates an analog receiver to process the sensed signal, and processing circuitry to perform the drive and sense line scanning function.



FIG. 11 shows an embodiment that incorporates a direct digital conversion receiver to process the sensed signal, and processing circuitry to perform the drive and sense line scanning function.



FIG. 12A shows one example of a layout of the drive and sense traces for an embodiment that incorporates the folded aspect of the embodiment laid out flat prior to folding.



FIG. 12B shows one example of a layout of the drive and sense traces for an embodiment that incorporates the folded aspect of the embodiment laid out flat prior to folding.



FIG. 13a shows the layer stack-up of an embodiment that incorporates the folding aspect subsequent to folding.



FIG. 13b shows an embodiment that incorporates the folding aspect subsequent to folding and assembly into a rigid module.



FIG. 14 shows a sensor system configured according to the embodiment for the purpose of sensing features of an object.



FIG. 15 shows an example of the sensing of fingerprint features.



FIG. 16 shows the process flow steps required to collect a 2-dimensional image with a sensor system configured according to one embodiment.



FIG. 17a shows the process flow steps required to authenticate a user with a fingerprint sensor system configured according to one embodiment.



FIG. 17b shows the process of template extraction from a fingerprint image typically utilized in user authentication applications.



FIGS. 18A-18D show an example of a fingerprint sensor system having an integrated switch to allow a user to contact a fingerprint sensor and to actuate a switch simultaneously.



FIGS. 19A-J show another example of a fingerprint sensor system having an integrated switch, a dome switch in this example, to allow a user to contact a fingerprint sensor and to actuate a switch simultaneously.



FIG. 20 shows a top view of an embodiment of a switch formed on the same substrate as the fingerprint sensor.



FIGS. 21A and B are detailed views which show the operation of the embedded switch depicted in FIG. 20.



FIGS. 22A-26C illustrate other embodiments of the invention.



FIGS. 27-29 illustrate a method for integrating a folded flex fingerprint sensor directly onto a touch-screen device.



FIGS. 30-32 illustrate a fingerprint sensor integrated onto the same substrate layers as a conventional touch-screen.



FIGS. 33 and 34 illustrate a novel “Dual Grid” touch-screen.



FIGS. 35-37 illustrate a fully integrated Dual Grid touch-screen and fingerprint sensor, which advantageously share a common drive and sense circuit.



FIGS. 38-40 illustrate a fully integrated display with integral touch-screen and fingerprint sensing over the entire display area.



FIGS. 41-50 illustrate how the dual grid finger motion tracking process operates.



FIG. 51 illustrates a flexible fingerprint sensor integrated with a touchscreen.



FIG. 52 illustrates an embodiment of a fingerprint sensor sharing a substrate layer with a touchscreen.



FIG. 53 illustrates a sensor and a touchscreen implemented on common substrate layers utilizing a common controller chip.





DETAILED DESCRIPTION OF THE INVENTION

As discussed in the background, there are many applications for a two dimensional impedance sensor, and the embodiments described herein provide broad solutions to shortcomings in the prior art for many applications. The underlying technology finds application in many different sensor features for use in many types of products, including mobile phones, smart phones, flip phones, tablet computers such as Apple™ iPads™ and Samsung™ Galaxy™ devices, point of entry devices such as door knobs, fences, drug cabinets, and automobiles, and almost any device, venue or thing that may be locked and require authentication for access.


Generally, one embodiment is directed to a two-dimensional sensor, which may also be referred to as a placement sensor, touch sensor, area sensor, or 2D sensor, where a substantial area of an object such as a user's fingerprint is sensed rather than a point or line-like portion of space that may or may not yield a characteristic sample adequate for identification. The sensor may have sensor lines located on one or more substrates, such as a flexible substrate that can be folded over on itself to form a grid array with separate sensor lines orthogonal to each other. The sensor lines may alternatively be formed on separate substrates. In any configuration, the crossover locations of different sensor lines create sensing locations for gathering information on the features and/or characteristics of an object, such as the patterns of ridges and valleys of a fingerprint.


Other embodiments provide a touch sensor having common electrical connections with a touch screen. For example, touch screen circuitry that resides under protective glass, such as Gorilla Glass™ used in many touch screen devices, may share common electrical connections with a two dimensional sensor used for navigation, and/or fingerprint sensing, or other operations. This provides benefits for manufacturing a device with both a touch screen and a fingerprint sensor, and may simplify the electrical layout of such a device. Exemplary configurations are described below and illustrated herein.


Other embodiments provide novel approaches to two-dimensional sensors integrated with a touch screen to provide the ability to capture a fingerprint image in one mode and to operate as a conventional touch-screen in another mode. In one example, a sensor grid may act as a touch screen by sensing the presence of a user's finger or fingers and also the movement of the fingers from one location to another, together with speed, to determine a swipe direction and speed. In another mode, the same sensor lines may act as drive lines and pickup lines, where a signal is transmitted from the screen to the user's finger or fingers, and the resulting signal is received by a pickup line and measured to determine the impedance of the fingerprint surface. Impedance values of fingerprint ridges are different from those of fingerprint valleys, and thus the fingerprint image may be mapped once the impedance values are captured over a two dimensional area of a fingerprint surface. The resulting fingerprint image may then be compared to a stored fingerprint image to authenticate the user, much in the same way a simple password is compared to a stored password when users authenticate themselves with electronic devices using numerical and alphanumeric passwords. The difference is that the use of a fingerprint in place of a password is much more secure.
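
As a loose illustration of the mapping step described above, the sketch below turns a grid of captured impedance readings into a ridge/valley map and compares it against an enrolled map. The thresholding strategy and the simple pixel-agreement score are assumptions made for illustration only, not the matching method of this disclosure.

```python
# Illustrative sketch only: one simple way to turn a grid of captured impedance
# readings into a ridge/valley map and compare it to an enrolled map.
import numpy as np

def ridge_valley_map(impedance: np.ndarray) -> np.ndarray:
    """Classify each pixel as ridge (1) or valley (0) relative to the frame mean."""
    # Ridges in contact divert more of the field, so their readings sit on the
    # low side of the distribution; a per-frame threshold separates the two.
    threshold = impedance.mean()
    return (impedance < threshold).astype(np.uint8)

def match_score(candidate: np.ndarray, stored: np.ndarray) -> float:
    """Fraction of pixels that agree with the enrolled ridge/valley map."""
    return float((candidate == stored).mean())

# Example use: authenticate when the captured map agrees closely with the stored one,
# e.g. match_score(ridge_valley_map(frame), stored_map) > 0.9
```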


A two dimensional sensor may be configured in different ways, such as for example a component that may be integrated on a portable device, a sensor integrated with a touch-screen used to provide touch sensitive surfaces for navigation of electronic content and operations in a portable device, or as a stand-alone component that may be electrically connected to a system or device to transmit and receive information for authentication, activation, navigation and other operations.


In one embodiment, the drive lines and pickup lines do not electrically intersect and are not connected in a manner in which they would conduct with each other; instead, they form an impedance sensing electrode pair with a separation that allows the drive lines to project an electrical field and the pickup lines to receive an electrical field, eliminating the need for distinct electrode structures. The crossing of the two lines with interposed dielectric intrinsically creates an impedance sensing electrode pair. Thus, the sensor is configured to activate two one-dimensional sensor lines to obtain one pixel of information that identifies features and/or characteristics of an object. Unlike conventional sensors, a sensor configured according to certain embodiments may provide a two dimensional grid that is capable of capturing multiple pixels of information from an object by activating individual pairs of drive and pickup lines and capturing the resultant signal. This signal can be processed with logic or processor circuitry to determine the presence or absence of an object and the features and/or characteristics of an object.


In yet another embodiment, a touch screen may operate as a sensor configured in one mode to capture information on a nearby object, such as information for forming an image of a fingerprint, and may operate in another mode to perform navigation or other operations. In one example, an OLED touch screen is configured to operate in at least two modes, one as a touch screen and another as a fingerprint sensor, where a fingerprint may be captured in any desired part of the OLED touch screen, and even multiple fingerprints from two or more user fingers may be captured.


In examples described herein, these sensors may be configured to capture information of a nearby object, and the information may be used to produce renderings of an object, such as a fingerprint, and compare the renderings to secured information for authentication.


According to one embodiment, and in contrast to conventional approaches, a device can utilize the intrinsic impedance sensing electrode pair formed at the crossings between the drive and pickup lines. In operation, the electric fields may be further focused by grounding drive and pickup lines near or about the area being sensed by the particular crossover location at one time. This prevents interference that could occur if other drive and pickup lines were sensing electric fields simultaneously. More than one electrode pair may be sensed simultaneously; however, where resolution is an important factor, it may be preferable to avoid simultaneously sensing electrode pairs that are too close to each other, in order to avoid interference and maintain accuracy in sensing object features at a particular resolution. For purposes of this description, "intrinsic electrode pair" refers to the impedance sensing electrode pairs that are formed at each of the drive and pickup line crossover locations. Because the embodiments use the intrinsic electrode pair at each crossover as a sensing element, no differentiating geometric features exist at individual sensing nodes to distinguish them from the interconnect lines. As a result, the alignment between the drive layers and sense layers is non-critical, which significantly simplifies the manufacturing process.


Grounding the adjacent inactive drive and pickup lines restricts the pixel formed at each intrinsic electrode pair without requiring complex measures such as the dedicated guard rings employed in the prior art (for example U.S. Pat. No. 5,963,679). Instead, guard grounds around the pixel are formed dynamically by switching adjacent inactive drive and pickup lines to ground potential. This allows the formation of high density pixel fields with relatively low resolution manufacturing processes, as the minimum pixel pitch for a given process is identical to the minimum feature spacing. This, in turn, enables the use of low cost manufacturing processes and materials, which is the key to creating a low cost placement sensor.
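
The dynamic guarding just described can be summarized in a few lines. The sketch below is a minimal illustration with hypothetical names, not part of this disclosure: for one pixel read, one drive line is driven, one pickup line is sensed, and every other line is switched to ground.

```python
# Minimal sketch of dynamic guard grounding for a single pixel read (names
# are illustrative only).
from enum import Enum

class LineState(Enum):
    DRIVEN = "driven"      # carries the excitation signal
    SENSED = "sensed"      # routed to the receive amplifier/buffer
    GROUNDED = "grounded"  # inactive line acting as a dynamic guard ground

def line_states(active_drive: int, active_pickup: int,
                n_drive: int, n_pickup: int):
    """Return the state of every drive and pickup line for one pixel read."""
    drive = [LineState.DRIVEN if i == active_drive else LineState.GROUNDED
             for i in range(n_drive)]
    pickup = [LineState.SENSED if j == active_pickup else LineState.GROUNDED
              for j in range(n_pickup)]
    return drive, pickup
```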


In one example, the sensor lines may consist of drive lines on one layer and pickup lines on another layer, where the layers are located over each other in a manner that allows the separate sensor lines, the drive and pickup lines, to cross over each other to form impedance sensing electrode pairs at each crossover location. These crossover locations provide individually focused electrical pickup locations or pixels, or electrode pairs where a number of individual data points of features and/or characteristics of an object can be captured. The high degree of field focus is due to the small size of the intrinsic electrode pairs, as well as the high density of the neighboring ground provided by the inactive plates. The flexible substrate may have a second substrate configured with logic or processor circuitry for sending and receiving signals with the sensor lines to electronically capture information related to the object. Alternatively, there may be two separate substrates carrying the separate sensor lines and layered on each other, and yet connected to a third substrate for connection to logic or processor circuitry.


The utilization of the crossover locations between perpendicular lines on adjacent layers for the pickup cell greatly reduces the alignment requirements between the layers. Since there are no unique features at a sensor pixel location to align, the only real alignment requirement between the layers is maintaining perpendicularity. If the sense cell locations had specific features, such as the parallel plate features typical of prior art fingerprint sensors, the alignment requirements would include an X and Y position tolerance of less than one quarter of a pixel, which would translate to less than +/−12 μm in each axis for a 500 dpi resolution fingerprint application.


In operation, a drive line is activated, with a current source for example, and a pickup line is connected to a receiving circuit, such as an amplifier/buffer circuit, so that the resulting electric field can be captured. An electric field extends from the drive line to the pickup line through the intermediate dielectric insulating layer. If an object is present, some or all of the electric field may be absorbed by the object, changing the manner in which the electric field is received by the pickup line. This changes the resulting signal that is captured and processed by the pickup line and receiving circuit, and thus is indicative of the presence of an object, and the features and characteristics of the object may be sensed and identified by processing the signal. This processing may be done by some form of logic or processing circuitry.
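
A toy numerical model may help make the preceding paragraph concrete. In the hypothetical sketch below, the coupling factor and absorbed fractions are invented illustrative values, not device parameters; the point is only that an object absorbing part of the field lowers the amplitude seen at the pickup line, and a ridge in contact lowers it more than a valley does.

```python
# Toy model of the single-pixel measurement described above (illustration only).
def pixel_response(drive_amplitude_v: float,
                   coupling: float,
                   absorbed_fraction: float) -> float:
    """Amplitude seen at the pickup line after part of the field is absorbed.

    coupling          -- fraction of the drive field reaching the pickup with no
                         object present (set by the dielectric and geometry)
    absorbed_fraction -- portion of that field diverted into a nearby object
                         (higher for a ridge in contact than for a valley)
    """
    return drive_amplitude_v * coupling * (1.0 - absorbed_fraction)

no_object = pixel_response(1.0, coupling=0.2, absorbed_fraction=0.0)
ridge = pixel_response(1.0, coupling=0.2, absorbed_fraction=0.5)
valley = pixel_response(1.0, coupling=0.2, absorbed_fraction=0.1)
print(no_object, ridge, valley)   # the ridge reads lower than the valley
```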


In other embodiments, the signal driving the drive line may be a complex signal, may vary in frequency and/or amplitude, or may be some other signal. This would enable a sensor to analyze the features and/or characteristics of an object from different perspectives utilizing a varying or complex signal. The signal may include simultaneous signals of different frequencies and/or amplitudes that would produce resultant signals that vary in different manners after being partially or fully absorbed by the object, indicating different features and characteristics of the object. The signal may include different tones, signals configured as chirp ramps, and other signals. Processing or logic circuitry may then be used to extract various information and data points from the resultant signal.


In operation, the varying or complex signal may be applied to the drive line, and the pickup line would receive the resulting electric field to be processed. Logic or processing circuitry may be configured to process the resulting signal, such as separating out different frequencies if simultaneous signals are used, so that features and/or characteristics of the object may be obtained from different perspectives.
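
One minimal way to realize the frequency separation mentioned above is sketched below. The sample rate, tone frequencies, and attenuation factors are assumptions chosen only for illustration; the idea is simply that two tones applied at once can be recovered individually from the pickup signal with an FFT, giving a per-frequency view of the object.

```python
# Sketch of separating a two-tone drive signal at the receiver (assumed values).
import numpy as np

FS = 1_000_000                 # sample rate in Hz
N = 1024                       # samples per measurement
TONES_HZ = (50_000, 125_000)   # two tones applied to the drive line at once

t = np.arange(N) / FS
drive = sum(np.sin(2 * np.pi * f * t) for f in TONES_HZ)   # applied to the drive line

# Stand-in for the sensed pickup signal: the object attenuates each tone
# differently (0.6 and 0.3 are arbitrary illustrative factors).
pickup = 0.6 * np.sin(2 * np.pi * TONES_HZ[0] * t) \
       + 0.3 * np.sin(2 * np.pi * TONES_HZ[1] * t)

spectrum = np.abs(np.fft.rfft(pickup * np.hanning(N)))
freqs = np.fft.rfftfreq(N, d=1 / FS)
for f in TONES_HZ:
    bin_idx = int(np.argmin(np.abs(freqs - f)))
    print(f, spectrum[bin_idx])   # per-tone response, one value per frequency
```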


Given the grid of pixels that can be activated at individual pairs, each pixel may be captured in a number of ways. In one embodiment, a drive line may be activated, and pickup lines may be turned on and off in a sequence to capture a line of pixels. This sequencing may operate as a scanning sequence. Here, a first drive line is activated by connecting it to a signal source; one pickup line at a time is then connected to amplifier/buffer circuitry, the information from the pixel formed at the crossing of the two lines is captured, and the pickup line is disconnected. Then the next pixel is processed in sequence, then another, and another, until the entire array of pickup lines has been processed. The drive line is then deactivated, another drive line is activated, and the pickup lines are again scanned with this active drive line. This may be done one pixel at a time in sequence, several non-adjacent pixels may be processed simultaneously, or other variations are possible for a given application. After the grid of pixels is processed, a rendering of the object information is possible.
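
The scanning sequence described above reduces to two nested loops. The sketch below assumes a read_pixel(drive, pickup) helper (hypothetical, such as the toy model shown earlier) that returns one measurement for the selected crossover; the line switching itself is left to hardware and is only noted in comments.

```python
# Sketch of the row/column scanning sequence as two nested loops.
import numpy as np

def scan_frame(n_drive: int, n_pickup: int, read_pixel) -> np.ndarray:
    """Capture one full frame by stepping through every drive/pickup pair."""
    frame = np.zeros((n_drive, n_pickup))
    for d in range(n_drive):
        # connect drive line d to the signal source; hold the other drives at ground
        for p in range(n_pickup):
            # connect pickup line p to the amplifier/buffer; ground the other pickups
            frame[d, p] = read_pixel(d, p)
            # disconnect pickup line p before selecting the next one
        # deactivate drive line d before activating the next one
    return frame

# Example with a dummy reader: frame = scan_frame(200, 200, lambda d, p: 0.0)
```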


Referring to FIG. 1, a diagrammatic view of a sensor 100 configured according to one embodiment is illustrated. In this configuration, pickup lines or top plates 102a[m], 102b[m+1] are located on an insulating dielectric substrate layer 104. Drive lines or bottom plates 106a[n], 106b[n+1] are juxtaposed substantially perpendicular to the pickup lines or top plates and are located on the opposite side of the insulating dielectric substrate to form a type of grid; they are configured to transmit a signal into the surface of an object located in close proximity to the sensor lines. The pickup lines are configured to receive the transmitted electromagnetic fields as modified by the impedance characteristics of an object placed within the range of those electric fields.


Referring to FIG. 2, a diagrammatic view of a sensor 200 is shown having pickup lines or top plates 202a, 202b and insulating layer 204, and drive lines or bottom plates 206a, 206b. The Figure further illustrates how electromagnetic fields 208a, 208b extend between the drive lines and pickup plates through the substrate. Without an object within proximity, the electric field lines are uniform within the sensor structure and between the different lines. When an object is present, a portion of the electric field lines are absorbed by the object and do not return to the pickup plates through the insulating layer.


Referring to FIG. 3, an object 310 is illustrated proximate the sensor 300. The sensor 300 has pickup lines or top plates 302a, 302b, an insulating dielectric layer 304, and drive lines or bottom plates 306a, 306b. In operation, the drive lines and pickup lines of this device example may be individually activated, where a drive line/pickup line pair is activated to produce an active circuit. The result is a circuit that transmits an electric field from the active drive plate 316 into the combined dielectric of the insulating layer 304 and the object 310 via electric field lines 306a, 306b, and the field is received by the active pickup plate. As the illustration shows, some of the field lines are captured by the object when it is placed about the active electrode pair. The variations in an object, such as peaks and valleys and other features of an object surface, can be detected and captured electronically by capturing and recording the resulting electric field variations occurring at different crossover locations of the drive and pickup lines. Similar to common capacitance based placement sensors, the sensor can capture a type of image of the object surface electronically and generate a representation of the features and characteristics of an object, such as the features and characteristics of a fingerprint in the fingerprint sensor example described below.


In this configuration of FIG. 3, only one active electrode pair is illustrated. However, the embodiment is not limited to this particular configuration, where one single electrode pair, several electrode pairs, or even all electrode pairs may be active at one time for different operations. In practice, it may be desirable for less than all of the electrode pairs to be active at a given time, so that any interference that may occur between close-by pixels would be minimized. In one embodiment, a drive line may be activated, and the pickup lines may be scanned one or more at a time so that a line of pixels can be captured along the drive line and pickup lines as they are paired along a line at the crossover locations. This is discussed in more detail below in connection with FIG. 5.


In general, in operation, each area over which a particular drive line overlaps a pickup line, with a separation provided by the insulating dielectric substrate, is an area that can establish a sensing location that captures characteristics or features of a nearby object about that area. Since there exist multiple sensing locations over the area of the sensor grid, multiple data points defining features or characteristics of a nearby object can be captured by the sensor configuration. Thus, the sensor can operate as a planar two-dimensional sensor, where objects located on or about the sensor can be detected and their features and characteristics determined.


As described in the embodiments and examples below, the embodiment is not limited to any particular configuration or orientation described, but is only limited to the appended claims, their equivalents, and also future claims submitted in this and related applications and their equivalents. Also, many configurations, dimensions, geometries, and other features and physical and operational characteristics of any particular embodiment or example may vary in different applications without departing from the spirit and scope of the embodiment, which, again, are defined by the appended claims, their equivalents, and also future claims submitted in this and related applications and their equivalents.


In the following description, numerous specific details are set forth in order to provide a thorough understanding of the embodiment. However, it will be apparent to one skilled in the art that the embodiment can be practiced without these specific details. In other instances, well known circuits, components, algorithms, and processes have not been shown in detail or have been illustrated in schematic or block diagram form in order not to obscure the embodiment in unnecessary detail. Additionally, for the most part, details concerning materials, tooling, process timing, circuit layout, and die design have been omitted inasmuch as such details are not considered necessary to obtain a complete understanding of the embodiment and are considered to be within the understanding of persons of ordinary skill in the relevant art. Certain terms are used throughout the following description and claims to refer to particular system components. As one skilled in the art will appreciate, components may be referred to by different names. This document does not intend to distinguish between components that differ in name, but not function. In the following discussion and in the claims, the terms “including” and “comprising” are used in an open-ended fashion, and thus should be interpreted to mean “including, but not limited to . . . ”


Embodiments of the invention are described herein. Those of ordinary skill in the art will realize that the following detailed description is illustrative only and is not intended to be in any way limiting. Other embodiments will readily suggest themselves to such skilled persons having the benefit of this disclosure. Reference will be made in detail to implementations of the embodiments as illustrated in the accompanying drawings. The same reference indicators will be used throughout the drawings and the following detailed description to refer to the same or like parts.


In the interest of clarity, not all of the routine features of the implementations described herein are shown and described. It will, of course, be appreciated that in the development of any such actual implementation, numerous implementation-specific decisions must be made in order to achieve the developer's specific goals. Moreover, it will be appreciated that such a development effort might be complex and time-consuming, but would nevertheless be a routine undertaking of engineering for those of ordinary skill in the art having the benefit of this disclosure.


In one embodiment, a sensor device includes drive lines located on or about an insulating dielectric substrate and configured to transmit a signal onto a surface of an object being sensed. Pickup lines are located near or about the drive lines and configured to receive the transmitted signal from the surface of an object. In order to keep a separation between the drive lines and pickup lines, the substrate may act as an insulating dielectric or spacing layer. The substrate may be for example a flexible polymer based substrate. One example is Kapton® tape, which is widely used in flexible circuits such as those used in printer cartridges and other devices. The package may include such a flexible substrate, where the drive lines may be located on one side of the substrate, and the pickup lines may be located on an opposite side of the substrate.


The drive lines may be orthogonal in direction with respect to the pickup lines, and may be substantially perpendicular to the pickup lines. According to one embodiment, a device may be configured with drive lines and pickup lines located on or about opposite sides of an insulating dielectric substrate, where the combination of these three components provides capacitive properties. The drive lines may be activated to drive an electric field onto, into or about an object. The pickup lines can receive electric fields that originated from the drive lines, and these fields can be interpreted by processing or logic circuitry to determine features or characteristics of the object being sensed.


Thus, in one embodiment the layer separating the drive lines from the pickup lines can provide a capacitive property to the assembly. If some or all of the drive lines are substantially perpendicular to the pickup lines, either entirely or in portions, then a grid may be formed. In such a configuration, from a three dimensional view, the drive lines are located and oriented substantially in parallel with respect to each other about a first plane. One surface of the substrate is located about the drive lines in a second plane that is substantially parallel relative to the drive lines. The pickup lines are located and oriented substantially in parallel with respect to each other about a third plane that is substantially parallel to the first and second planes and also located about another substrate surface that is opposite that of the drive lines, such that the substrate is located substantially between the drive lines and the pickup lines.


In this description, including descriptions of embodiments and examples, there will be references to the terms parallel, perpendicular, orthogonal and related terms and descriptions. It is not intended, nor would it be understood by those skilled in the art, that these descriptions are at all limiting. To the contrary, the embodiments extend to orientations and configurations of the drive lines, the pickup lines, the substrate or related structure, and also various combinations and permutations of components, their placement, their distance from each other, and their order in different assemblies of a sensor. Though the embodiment is directed to a sensor configured with a plurality of drive and pickup lines that generally cross over each other at a pixel location and are configured to detect the presence and other features and characteristics of a nearby object, the embodiment is not limited to any particular configuration or orientation, but is only limited to the appended claims, their equivalents, and also future claims submitted in this and related applications and their equivalents.


Also, reference will be made to different orientations of the geometric planes on which various components lie, such as the drive and pickup lines and the substrate that may be placed between the sets of drive and pickup lines. If flexible substrates are used, for example, such a structure will allow the planes to change as the flexible structure is flexed or otherwise formed or configured. In such embodiments it will be understood that certain aspects of the embodiment are directed to the drive lines and pickup lines being configured on opposite sides of a substrate and in a manner that enables the sensing of particular features and/or characteristics of a nearby object at each crossover location of a drive line and a pickup line. Thus, the orientation of the planes (which may be deformable, and thus may be sheets separated by a substantially uniform distance) of groups of components (such as drive lines or pickup lines) or substrates may vary in different applications without departing from the spirit and scope of the embodiment.


Also, reference will be made to pickup lines, pickup plates, drive lines, drive plates, and the like, but it will be understood that the various references to lines or plates may be used interchangeably and do not limit the embodiment to any particular form, geometry, cross-sectional shape, varying diameter or cross-sectional dimensions, length, width, height, depth, or other physical dimension of such components. Also, more sophisticated components may be implemented to improve the performance of a device configured according to the embodiment, such as small 65, 45, 32 or 22 nanometer conduction lines or carbon nano-tubes that may make an assembly more easily adapted to applications where small size and shape as well as low power are desired characteristics and features. Those skilled in the art will understand that such dimensions can vary in different applications, and may even improve the performance or lower the power consumption in some applications, without departing from the spirit and scope of the embodiment.


Reference will also be made to various components that are juxtaposed, layered, or otherwise placed on each other. In one example of an embodiment, a plurality of drive lines are juxtaposed on one surface of a generally planar substrate, and a plurality of pickup lines are juxtaposed on the opposite surface of the planar substrate. The drive lines are substantially orthogonal to the pickup lines, and may be described as substantially perpendicular to the pickup lines. The space between the drive lines and pickup lines may be filled with a substrate or insulating material that provides a capacitive configuration. Here, the drive lines on one side of the substrate form one capacitive plate, and the pickup lines on the opposite side form the corresponding capacitive plate. In operation, when the drive plate is activated, an electrical field is generated between the drive lines and pickup lines and through the substrate to form a plurality of capacitive elements. These capacitive elements are located at the area of each crossing of a drive line and a pickup line, with a portion of the substrate located between them. This is a location where the respective drive lines and pickup lines overlap each other. In any particular application, these areas in which the three components interact during operation define a data location at which a sensor reading can be made.


Reference will also be made to sensor lines, such as sensor drive lines and sensor pickup lines, and their orientation among themselves and each other. For example, substantially parallel drive lines will be described. These drive lines are intended to be described as parallel conductive lines made up of a conductive material, such as copper, tin, silver or gold, formed, etched, deposited or printed onto the surface. Those skilled in the art will understand that, with the inherent imperfections in most any manufacturing process, such conductive lines are seldom "perfect" in nature, and are thus not exactly parallel in practice. Therefore, they are described as "substantially parallel". Different applications may even configure some of the drive lines to be non-parallel, such that a line may be parallel for a portion of its length and may necessarily deviate from parallel in order to connect with other components for the device to operate, or in order to be routed on or about the substrate on which it is formed or traced. Similarly, the separate array of lines may be described as orthogonal or perpendicular, where the drive lines are substantially orthogonal or perpendicular to the pickup lines. Those skilled in the art will understand that the various lines may not be perfectly perpendicular to each other, and they may be configured to be off-perpendicular or otherwise crossed over at different angles in particular applications. They may also be partially perpendicular, where portions of drive lines may be substantially perpendicular to corresponding portions of pickup lines, and other portions of the different lines may deviate from perpendicular in order to be routed on or about the substrate or to be connected to other components for the device to operate.


These and other benefits provided by the embodiments will be described in connection with particular examples of embodiments and also descriptions of intended operational features and characteristics of devices and systems configured according to the embodiments.


In operation, generally, the drive lines can transmit an electromagnetic field toward an object that is proximal to the device. The pickup lines may receive a signal originating from the drive lines and then transmitted through the object and through the substrate and onto the pickup lines. The pickup lines may alternatively receive a signal originating from the drive lines that were then transmitted through the substrate and onto the pickup lines without passing through the object. This electric field can vary at different locations on the grid, giving a resultant signal that can be interpreted by some type of logic or processor circuitry to define features and/or characteristics of an object that is proximate the assembly.


The drive lines and pickup lines may be controlled by one or more processors to enable the transmission of the signal to an object via the drive lines, to receive a resultant signal from an object via the pickup lines, and to process the resultant signal to define an object image. One or more processors may be connected in one monolithic component, where the drive lines and pickup lines are incorporated in a package that includes the processor. In another embodiment, the drive lines, pickup lines and substrate may be assembled in a package by itself, where the package can be connected to a system processor that controls general system functions. This way, the package can be made part of the system by connecting with a system's input/output connections in order to communicate with the system. This would be similar in nature for example to a microphone connected to a laptop, where the audio signals are received by the system processor for use by the laptop in receiving sounds from a user. According to this embodiment, the sensor can be connected as a stand-alone component that communicates with the system processor to perform sensor operations in concert with the system processor.


In another embodiment, a sensor may be configured to drive signals at different frequencies, since the impedance of most objects, especially human tissue and organs, will vary greatly with frequency. In order to measure the complex impedance of a sensed object at one or more frequencies, the receiver must be able to measure phase as well as amplitude. In one embodiment, the signal applied to a given impedance sensing electrode pair may be generated at varying frequencies, known in the art as frequency hopping, where the receiver is designed to track a random, pseudo-random or non-random sequence of frequencies. A variation of this embodiment could be a linear or non-linear frequency sweep known as a chirp. In such an embodiment, the impedance over a continuous range of frequencies could be measured very efficiently.
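
Purely as an illustrative sketch (not part of the disclosed circuitry), the following Python fragment shows one way a hop schedule and a chirp sweep could be generated; the frequency range, step count and the measure_impedance() hook are assumptions introduced here for illustration.

    import random

    def hop_sequence(f_min_hz, f_max_hz, steps, seed=0):
        # Pseudo-random hopping schedule over evenly spaced frequency bins;
        # a receiver can reproduce the same order from the shared seed.
        bins = [f_min_hz + i * (f_max_hz - f_min_hz) / (steps - 1) for i in range(steps)]
        random.Random(seed).shuffle(bins)
        return bins

    def chirp_sequence(f_min_hz, f_max_hz, steps):
        # Linear sweep (chirp) covering the same range.
        return [f_min_hz + i * (f_max_hz - f_min_hz) / (steps - 1) for i in range(steps)]

    def measure_impedance(freq_hz):
        # Placeholder for the real front end: would return the (amplitude, phase)
        # measured at the selected electrode pair for this drive frequency.
        return 1.0, 0.0

    # Drive one electrode pair at each hop frequency and record amplitude and phase.
    sweep = [(f, *measure_impedance(f)) for f in hop_sequence(100e3, 10e6, 16)]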


In yet another embodiment, a grid sensor as described above may be configured to also operate as a pointing device. Such a device could perform the same functions as the well-known touch pads, track balls or mice used with desktop and laptop computers.


In one example of this embodiment, a two dimensional impedance sensor that can measure the ridges and valleys of a fingertip may be configured to track the motion of the fingerprint patterns. Prior art swiped fingerprint sensors can perform this function, but the physical asymmetry of the array and the need to speed correct, or "reconstruct", the image in real time make those implementations awkward at best. A sensor configured according to the embodiment could thus double as both a fingerprint sensor and a high quality pointing device.
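
The tracking algorithm itself is not specified here; one common approach is to estimate displacement by correlating successive small frames. The Python sketch below illustrates that idea under the assumption that frames are lists of equal-length rows of pixel values; it is illustrative only and not the claimed implementation.

    def track_motion(prev_frame, curr_frame, max_shift=4):
        # Estimate finger displacement (dx, dy) by finding the integer shift that
        # minimizes the mean squared difference between two successive frames.
        rows, cols = len(prev_frame), len(prev_frame[0])
        best, best_err = (0, 0), float("inf")
        for dy in range(-max_shift, max_shift + 1):
            for dx in range(-max_shift, max_shift + 1):
                err, count = 0.0, 0
                for y in range(rows):
                    for x in range(cols):
                        yy, xx = y + dy, x + dx
                        if 0 <= yy < rows and 0 <= xx < cols:
                            d = prev_frame[y][x] - curr_frame[yy][xx]
                            err += d * d
                            count += 1
                if count and err / count < best_err:
                    best_err, best = err / count, (dx, dy)
        return best  # cursor movement can be made proportional to this displacement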


One device configured according to the embodiment includes a first array of sensor lines on a flexible substrate, a second array of sensor lines on a flexible substrate, and a processor configured to process fingerprint data from the first and second arrays of sensor lines. When folded upon itself in the case of a single flexible substrate, or when juxtaposed in the case of separate substrates, the separate sensor lines cross each other without electrically shorting to form a grid with crossover locations that act as pixels from which fingerprint features can be sensed. In one embodiment, an array of substantially parallel sensor drive lines is located on a surface of the flexible substrate. These drive lines are configured to sequentially transmit a signal into a surface of a user's finger, activating one line at a time. A second array of sensor lines is similar to the first, consisting of substantially parallel sensor pickup lines that are substantially perpendicular to the drive lines. These pickup lines are configured to pick up the signal transmitted from the first array.


In the configuration where the first and second sets of sensor lines, the drive and the pickup lines for example, are located on different sections of an extended surface of the flexible substrate, the flexible substrate is further configured to be folded onto itself to form a dual layer configuration. Here, the first array of sensor drive lines becomes substantially perpendicular to the second array of pickup sensor lines when the flexible substrate is folded onto itself. This folding process creates crossover locations between these separate arrays of sensor lines, though the arrays must not make direct electrical contact so that they operate independently. These crossover locations represent impedance sensing electrode pairs configured to sense pixels of an object and its sub-features juxtaposed relative to a surface of the flexible substrate. The scanning of these pixels is accomplished by activating individual rows and columns sequentially. Once a drive column is activated with the drive signal, the perpendicular pickup rows are scanned one at a time over the entire length of the selected driver. Only one row is electrically active (high impedance) at a time; the non-active rows are either shorted to ground or multiplexed to a state where they do not cross-couple signal. When a finger ridge is placed above an array crossover location that is active, it interrupts a portion of the electric field that otherwise would be radiated through the surface film from the active drive column to the selected row pickup. The placement of an object's sub-feature, such as a ridge or valley in the case of a fingerprint sensor, over an impedance sensing electrode pair results in a net signal decrease since some of the electric field is conducted to ground through the human body. In the case of a fingerprint sensor, when a fingerprint ridge or valley is placed over an impedance-sensing electrode pair, the valley affects the radiation of the electric field from the selected drive line to the selected pickup line much less than a ridge would. By comparing the relative intensity of the signals between pixels under ridges and pixels under valleys, a two dimensional image of a finger surface can be created.
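
As a minimal sketch of the scanning sequence just described, the following Python fragment assumes hypothetical hardware hooks set_drive() and read_pickup() that stand in for the switching and receive circuitry; it is illustrative only.

    def scan_grid(num_drive, num_pickup, set_drive, read_pickup):
        # Sequentially activate one drive column at a time, scan every pickup row
        # for that column, and assemble the readings into a 2D array of pixels.
        # set_drive(i, active) and read_pickup(j) are hypothetical hardware hooks;
        # lines that are not selected are assumed to be grounded by the hardware.
        image = []
        for d in range(num_drive):
            set_drive(d, True)                                  # energize one drive column
            row = [read_pickup(p) for p in range(num_pickup)]   # scan pickups one at a time
            set_drive(d, False)                                 # ground it before moving on
            image.append(row)
        # In the grounded-ridge model described above, higher readings correspond
        # to valleys and lower readings to ridges.
        return image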


Referring again to FIG. 1, this general example of the grid sensor will now be used to illustrate how such a sensor configured according to the embodiment can be implemented as a fingerprint sensor, where the object would simply be the surface of the fingerprint on the user's finger. This example will be carried through the following Figures to illustrate the benefits and novel features of the impedance sensor configured according to the embodiment. It will be appreciated by those skilled in the art, however, that any object may be sensed by a device configured according to the embodiment. Again, the example and description are intended only for illustration purposes.


In operation, the sensor can be configured to detect the presence of a finger surface located proximate to the sensor surface, where the drive lines can drive an active electromagnetic field onto the finger surface, and the pickup lines can receive the resulting electromagnetic field signal. The drive lines can generate an electric field that is passed onto the surface of the finger, and the different features of the fingerprint, such as ridges and valleys of the fingerprint surface and possibly human skin characteristics, cause the resulting signal to change, providing a basis to interpret the signals to produce information related to the fingerprint features.


In one embodiment of a fingerprint sensor, referring again to FIG. 1 in the context of a fingerprint sensor, a flexible substrate is used as the insulating dielectric layer 104, to allow for beneficial properties of durability, low cost, and flexibility. The drive lines or plates 106a, 106b, are located on the flexible substrate and configured to transmit a signal into a surface of a user's fingerprint features and structures, such as ridges and valleys, placed on or about the sensor lines. The pickup lines 102a, 102b are configured to receive the transmitted signal from the user's finger surface. A processor (not shown) can be configured to collect and store a fingerprint image based on the received signal from the pickup lines.


Referring to FIG. 4, an example of a sensor 400 configured as an object sensor is illustrated, where the top plates or pickup lines 402a, 402b, . . . , 402n are located on one side of insulating dielectric layer or substrate 404. Bottom plates or drive lines 406a, 406b, . . . , 406n are located on an opposite side of the substrate 404. Electric fields 408a, 408b extend from bottom plates or drive lines 406a, 406b through the insulating layer or substrate 404 and onto active top plate 402a. According to the embodiment, these drive lines may be activated one at a time to reduce any interference effects, but the electric field results illustrated here are intended to contrast electric fields that are partially or fully absorbed by the object with electric fields that are not absorbed by the object at all. This information may be collected from drive and pickup plate electrode pairs at each crossover location to sense features and characteristics of the object that is proximate the sensor lines. Partially covered top plate or pickup line 402b is connected to voltmeter 417, and uncovered top plate 402a is connected to voltmeter 418. Active drive line or bottom plate 406b is connected to AC signal source 416, causing an electric field to radiate from active plate 406b. The number of drive lines and pickup lines can vary depending on the particular application, and may depend on the cost and resolution desired. As can be seen, the electric field 408a is partially captured by the pickup lines 402a and 402b, and part is captured by the object, in this case finger 410. Also, in order to illustrate that the pickup lines will exhibit different readings depending on whether an object or object feature is present at or proximate to a given crossover location, voltmeter 417 illustrates the response of top plate or pickup line 402b, and voltmeter 418 illustrates the response of top plate or pickup line 402a. The difference in the deflections of voltmeter 417 in comparison to voltmeter 418 shows the difference in electric field intensity between the two electrode pair locations, one with a finger present and the other without.


Referring to FIG. 5, another example of a sensor configured according to the embodiment is illustrated for the drive and pickup configuration when detecting the presence of an object. The sensor 500 is illustrated, where the top plates or pickup lines 502a, 502b, . . . , 502n are located on one side of insulating layer or substrate 504, and bottom plates or drive lines 506a, 506b, . . . , 506n are located on an opposite side of the substrate 504. Again, the pickup lines are shown on the layer closest to the object being sensed for maximum sensitivity, and the drive lines are shown on the opposite side of the substrate. Electric fields 508a, 508b extend from bottom plates or drive lines 506a, 506b through the insulating layer or substrate 504 and onto active top plate 502b. Other configurations are possible, for example having drive plates on the top and pickup plates on the bottom. The embodiment, however, is not limited to any particular configuration, and configurations that are insubstantially different from the examples and embodiments disclosed and claimed herein are within its scope.



FIG. 5 further shows a snapshot of one selected individual electrode pair located at the crossover of pickup line 502b and drive line 506b, where the remaining pickup and drive lines are not active, shown grounded in FIG. 5. Drive line 506b is connected to AC voltage source 516, and pickup line 502b is connected to amplifier/buffer 514. Once activated as shown here, electric field lines 508a, 508b are generated; they radiate from drive line 506b and are sensed by pickup line 502b, sending the resultant signal into amplifier/buffer 514 to be processed later by analog and digital circuit functions. Grounding the inactive adjacent drive and pickup lines focuses the electric fields 508a and 508b at the crossover location between the active drive and pickup plates, limiting crosstalk from adjacent areas on the object being sensed. As the sensing operation proceeds in this embodiment, different drive line/pickup line crossover pairings may be activated to capture different pixels of information from the object. In the case of an object sensor, it can capture information on the shape of the object and, if the electrical characteristics are non-uniform across its surface, its composition. Again, the embodiment is not limited to this particular configuration, and one single electrode pair, several electrode pairs, or even all electrode pairs may be active at one time for different operations. In practice, it may be preferable for fewer than all of the impedance sensing electrode pairs to be active at a given time, so that any interference that may occur between nearby pixels is minimized. In one embodiment, a drive line may be activated, and the pickup lines may be scanned one or more at a time so that a line of pixels can be captured along the drive line and pickup lines as they are paired along a line at the crossover locations. Thus, and still referring to FIG. 5, the AC voltage source 516 may remain connected to drive line 506b, and the connection of the amplifier/buffer 514 may cycle or scan over sequential pickup lines, so that another pixel of information can be captured from another pickup line crossover paired with drive line 506b. Once substantially all the pickup lines 502a-n have been scanned, drive line 506b can be deactivated, then another drive line in sequence can be activated with the AC voltage source, and a new scan can commence through the pickup lines. Once substantially all drive line/pickup line pairings have been scanned to capture the full two-dimensional array of pixels, a two dimensional image or rendering of the object features and characteristics can be made, such as a rendering of the shape of the object, and potentially a composition map.


As another example of a sensor that can benefit from the embodiment, a reduced cost fingerprint swipe sensor could be configured using the same innovation provided by the embodiment. In this embodiment, a reduced number of pickup lines could be configured with a full number of orthogonal drive lines. Such a configuration would create a multi-line swipe sensor that would take the form of a pseudo two-dimensional sensor and, when a finger was swiped over it, would create a mosaic of partial images or slices. The benefit of this would be to reduce the complexity of the image reconstruction task, which is problematic for current non-contact silicon sensors that rely on one full image line and a second partial one to do speed detection.


The tradeoff would be that this pseudo two dimensional array would have to be scanned at a much faster rate in order to keep up with the varying speeds of fingers swiped across it.



FIGS. 6a and 6b illustrate the operation of the sensor when detecting surface features of an object such as fingerprint ridges and valleys. The sensor is configured identically to the previous example in FIG. 5, but in this case is interacting with a textured surface such as a fingerprint.


Referring to FIGS. 6a and 6b, another example of a sensor configured according to the embodiment is illustrated. The sensor 600 is illustrated, where the top plates or pickup lines 602a-n are located on one side of insulating layer or substrate 604, and bottom plates or drive lines 606a-n are located on an opposite side of the substrate 604. For maximum sensitivity, the pickup lines are shown on the layer closest to the object being sensed, and the drive lines are shown on the opposite side of the substrate. FIG. 6a shows electric field lines 620 as they interact with a proximally located object's valley, and FIG. 6b shows electric field lines 621 as they interact with a proximally located object's peaks, extending from bottom plate drive line 606b through the insulating layer or substrate 604 and onto active top pickup line 602b. In the case of sensing a fingerprint, the corresponding ridges and valleys over the fingerprint surface can be captured by the grid of drive line/pickup line crossover points, and the resulting data can be used to render an image of the fingerprint. A stored fingerprint can then be compared to the captured fingerprint for authentication. This is accomplished using any one of many fingerprint matching algorithms which are available from vendors as stand-alone products. Such vendors include Digital Persona, BioKey, and Cogent Systems, to name just a few.


Also illustrated in FIGS. 6a and 6b is the individual sensor line pairing of pickup line 602b and drive line 606b. Their crossover forms the active electrode pair, and the remaining pickup and drive lines are not active and will nominally be grounded by electronic switches. Drive line 606b is connected to AC voltage source 616, and pickup line 602b is connected to amplifier/buffer 605. Once activated as shown here, electric field lines 620 and 621 are created as shown in FIGS. 6a and 6b respectively; they emanate between the drive line 606b and pickup line 602b, and the resultant signal radiated onto pickup line 602b is fed into amplifier/buffer 605 and later processed by analog and digital processing circuitry. As the sensing operation proceeds in this embodiment, different drive line/pickup line crossover pairs may be activated to capture different pixels of information from the object. In the case of a fingerprint, it can capture information on different features and characteristics of the fingerprint and even the body of the finger itself. Again, the embodiment is not limited to this particular configuration, and one electrode pair, several electrode pairs, or even all electrode pairs may be active at one time for different operations. In practice, it may be preferable for fewer than all of the electrode pairs to be active at a given time, so that any interference that may occur between nearby pixels is minimized. In one embodiment, a drive line may be activated, and the pickup lines may be scanned one or more at a time so that a line of pixels can be captured along the drive line and pickup lines as they are paired along a line at the crossover points. Thus, and still referring to FIG. 6a, the voltage source 616 may remain connected to drive line 606b, and the connection to buffer/amplifier 605 may cycle or scan over to another pickup line, so that another pixel of information can be captured from another electrode pair using drive line 606b.


In the snapshot shown in FIGS. 6a and 6b, drive plate 606b remains excited by AC signal source 616 until an entire column of pixels is scanned, while the unused drive plates (606a, 606c-n, etc.) are switched to ground for isolation purposes. Likewise, in one embodiment only one pickup plate is active at a time, and some or substantially all other pickup plates are switched to ground to minimize crosstalk.


The scanning process continues beyond the snapshot shown in FIGS. 6a and 6b, with the next column in sequence, 606c, being activated (although the sequence could be arbitrary). Once the entire sequence of pickup plates 602a-n is scanned, the next drive line 606d would be activated, and so on until all, or substantially all, of the drive lines 606a-n have been sequenced. Once all the drive columns have been activated and the pickup plates scanned for each column, a complete two dimensional array of pixels equal to the number of driver rows times the number of pickup columns will have been collected. For a 500 DPI sensor that would create a 10×10 mm array, or 100 mm2, consisting of 40,000 individual pixels. Depending on the application, all of the drive lines may be sequenced, or possibly only some or most of them may be sequenced.
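
The arithmetic behind these figures can be checked as follows; the rounding of the roughly 197 lines per axis at exactly 500 DPI up to the 200 lines implied by the 40,000-pixel figure is made explicit.

    MM_PER_INCH = 25.4
    dpi = 500
    side_mm = 10

    lines_exact = dpi * side_mm / MM_PER_INCH   # about 196.9 lines per axis at exactly 500 DPI
    lines = 200                                 # the 40,000-pixel figure implies a 200 x 200 grid
    total_pixels = lines * lines                # 40,000 individual pixels
    area_mm2 = side_mm * side_mm                # 100 mm^2 of sensing area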


Referring again to FIGS. 6a and 6b, the two conductive layers, drive layer 606 and pickup layer 602, are separated by an electrically insulating layer 604. This insulating layer 604 has high DC resistance and has a dielectric constant greater than one to allow the transmission of high frequency electric fields through it. In one embodiment this layer 604 is created by folding a single sided flexible printed circuit board back onto itself. In another embodiment it is created by placing a dielectric layer between two electrically active layers to form a double sided circuit board.



FIG. 7 shows an example of an x-y grid of plate rows and columns depicted by lumped circuit components that represent the electric field couplings of the sensor at each drive/pickup crossover.


The bottom plates 706a, b, c, etc. are driven one at a time by AC signal source 716 via switch matrix 740a-n. FIG. 7 shows a scan snapshot where one drive switch 740b is in the on condition, connecting the corresponding plate to the signal source. This activates one entire row 706b with AC signal over the entire length of the plate, which is equal to the sensor width in one dimension. Correspondingly, each top plate 702a, b, c, etc. will pick up AC signal through insulating layer 704 and coupling capacitors 761a, b, c, . . . , n. Only one pickup plate at a time is active, being switched into the buffer amplifier 717. Top plate 702b is shown as the active plate in FIG. 7, while all or substantially all other pickups are shorted to ground via switch matrix 730a-n; thus the information from one x-y pixel is captured.


A single row remains active only as long as it takes the entire number of pickup plates/columns to be scanned. Scan time per pixel will depend on the frequency of operation and the settling time of the detection electronics, but there is no need to scan unduly fast as is typical with prior art swipe sensors. Prior art swipe sensors, on the other hand, must scan at a very high pixel rate in order not to lose information due to undersampling relative to the finger speed, which can be greater than 20 cm/sec. This reduction in capture speed relative to a swipe sensor relaxes the requirements of the analog electronics and greatly reduces the data rate that a host processor must capture in real time. This not only reduces system cost but allows operation by a host device with much less CPU power and memory, which is especially critical for mobile devices.
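
To make the data-rate contrast concrete, a rough calculation is sketched below; apart from the 500 DPI, 200-line and 20 cm/sec figures above, the per-pixel settling time is an illustrative assumption and not a value taken from this specification.

    pixels = 200 * 200                    # full placement image
    settle_s = 20e-6                      # assumed per-pixel settling time (illustrative)
    placement_frame_s = pixels * settle_s # about 0.8 s to capture one placement image
    placement_rate_px_s = 1 / settle_s    # 50,000 pixels/s during that capture

    # A swipe sensor must not undersample a finger moving at up to ~20 cm/s.
    swipe_speed_mm_s = 200
    row_pitch_mm = 25.4 / 500             # one 500 DPI row is ~0.051 mm
    rows_per_s = swipe_speed_mm_s / row_pitch_mm   # ~3,900 rows/s minimum
    swipe_rate_px_s = rows_per_s * 200             # ~790,000 pixels/s just to keep up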


Once an entire row 706b has been scanned by all or substantially all of its corresponding pickup plates 702a-n, then the next row in the sequence is activated through switch matrix 740. This process continues until all or substantially all of the rows and columns are scanned.


The amount of signal that is coupled into the buffer amplifier 717 is a function of how much capacitance is formed by the insulating layer and the finger ridge or valley in close proximity. The detailed operation of how these electric fields radiate is shown in FIGS. 6a and b. The total coupling capacitance is a series combination of the insulating layer capacitance 704 that is fixed for a given thickness, and the variable topological capacitance of the object being sensed. The variable portion of this is shown in FIG. 7 as a series of variable capacitors numbered 760a-n, 761a-n, 762a-n etc., forming a two dimensional array.
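
The series-capacitance model can be sketched numerically as follows; the capacitance values are arbitrary illustrative assumptions, and whether a ridge raises or lowers the net coupling in practice depends on the grounding path through the finger described earlier.

    def series_c(c1, c2):
        # Two capacitors in series: C = C1*C2 / (C1 + C2)
        return c1 * c2 / (c1 + c2)

    C_INSULATOR   = 1.0e-12   # fixed capacitance of the insulating layer (assumed value)
    C_TOPO_RIDGE  = 0.8e-12   # assumed topological capacitance with a ridge in contact
    C_TOPO_VALLEY = 0.2e-12   # assumed topological capacitance across a valley's air gap

    c_pixel_ridge  = series_c(C_INSULATOR, C_TOPO_RIDGE)    # ~0.44 pF
    c_pixel_valley = series_c(C_INSULATOR, C_TOPO_VALLEY)   # ~0.17 pF
    # The contrast between these two series combinations is what the receiver
    # resolves into ridge/valley pixel values at each crossover.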



FIG. 8 shows an example of an embodiment of the placement sensor using a differential amplifier 880 to take the signal from the selected pickup plate (802a-n) and subtract it from the reference signal of plate 805. The electrical subtraction of these signals performs several functions: first, wide band common mode noise is subtracted out; second, subtracting against reference plate 805 provides a relative reference signal equivalent to an ideal ridge valley; third, common mode carrier signal that couples into both plates other than through a finger is also subtracted out. First order carrier cancellation of etch variation in the pickup plates also occurs when carrier that is coupled in by means other than through fingers placed on the sensor is subtracted out. This is critical for high volume manufacturing at a low cost.


Reference plate 805 is intentionally located outside of the finger contact area of the sensor, separated from pickup plates 802a-n by gap 885. Gap 885 is much larger than the nominal gap between the pickup plates, which is typically 50 μm. In a real world embodiment, plate 805 would be positioned under the plastic of a bezel to prevent finger contact, placing it at least 500 μm apart from the other pickup plates.


Each one of the pickup plates 802a-n is scanned sequentially, being switched through pickup switches 830a-n that connect them to differential amplifier 880. During the scanning process of an entire pickup row, the positive leg of the differential amplifier remains connected to reference plate 805 to provide the same signal reference for all of the pickup plates.



FIG. 9a shows a circuit diagram of an example of a front end circuit 900a for the placement sensor in a topology that uses a bank of Single Pole Double Throw Switches or SPDTs to scan the pickup plate rows and a bank of Single Pole Single Throw switches to multiplex the drive plate columns.


In FIG. 9a we see a snapshot of the analog switches as the scanning process begins. Only the first SPDT switch 944a is shown in the “on” position, which allows pickup plate 902a to conduct its plate signal into Differential Amplifier 980. The remaining pickup plates are shorted to ground via switches 944, preventing any pickup signal received by them from entering into differential amplifier 980.


Each SPDT has a parasitic capacitance 945, due to the fact that real world switches do not give perfect isolation. In fact, the amount of isolation decreases with frequency, and is typically modeled by a parallel capacitor across the switch poles. By using an SPDT switch, this capacitance can be shunted to ground when an individual plate is not active. Since there is a large array of switches equal to the number of pickup plates, typically 200 for a 500 dpi sensor, the effective shunt capacitance to ground is multiplied by that number. So if a given switch has 0.5 picofarads of parasitic capacitance and there were 200 pickups, that would add up to 100 picofarads of total shunt capacitance.
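
The arithmetic stated above is simply:

    c_switch_off = 0.5e-12     # parasitic capacitance of one open switch, per the text
    num_pickups  = 200         # typical 500 dpi placement sensor, per the text
    total_shunt_c = c_switch_off * num_pickups   # 1.0e-10 F, i.e. 100 pF of shunt capacitance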


In order to prevent this large capacitance from diverting most of the received signal from the active pickup to ground, it is desirable in this example to use a compensating circuit. This is accomplished by using resonating inductor 939, forming a classic bandpass filter circuit in conjunction with parasitic capacitors 945 (one per switch) and tuning capacitors 934 and 937. A two-step null and peak tuning calibration procedure is used, where tuning capacitors 934 and 937 are individually tuned with inductor 939 using the same drive signal on both the plus and minus inputs to differential amplifier 980. The two bandpass filters formed with inductor 939 and resonating capacitors 934 and 937, respectively, will be tuned to the same center frequency when there is zero signal out of differential amplifier 980. Next, capacitors 934 and 937 and inductor 939 are tuned together using a differential input signal with opposite, 180 degree, phases on the plus and minus inputs to the differential amplifier 980. They are incremented in lock step until the exact drive carrier frequency is reached; this occurs when the output of differential amplifier 980 is at its peak, making the center frequency equal to the exact frequency of the carrier drive signal 916.


In a systems implementation, a calibration routine would be performed before each fingerprint scan to minimize drift of this filter with time and temperature. The resonating inductor 939 needs to have a Q or Quality Factor of at least 10 to give the filter the proper bandwidth characteristics necessary to optimize the signal to noise ratio.
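
The underlying relation for this tuning is the LC resonance f0 = 1/(2π√(LC)). The sketch below uses the 100 pF load and minimum Q of 10 from the surrounding text; the carrier frequency is an illustrative assumption only.

    import math

    def resonant_frequency(l_henry, c_farad):
        # Center frequency of an LC bandpass: f0 = 1 / (2*pi*sqrt(L*C))
        return 1.0 / (2.0 * math.pi * math.sqrt(l_henry * c_farad))

    def inductance_for(f0_hz, c_farad):
        # Inductor needed to resonate a given capacitance at the carrier frequency.
        return 1.0 / (((2.0 * math.pi * f0_hz) ** 2) * c_farad)

    C_LOAD = 100e-12            # total parasitic shunt capacitance from the example above
    F_CARRIER = 2e6             # assumed carrier frequency, for illustration only

    L = inductance_for(F_CARRIER, C_LOAD)     # ~63 uH to resonate 100 pF at 2 MHz
    f_check = resonant_frequency(L, C_LOAD)   # ~2 MHz, confirming the tuning target
    Q = 10                                    # minimum quality factor per the text
    bandwidth_hz = f_check / Q                # ~200 kHz passband at Q = 10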


Alternately, carrier source 916 may be a variable frequency source, and capacitors 937 and 934 may be fixed values. In this embodiment, tuning is accomplished by varying the frequency of source 916 until peak output is obtained from differential amplifier 980.



FIG. 9b shows an alternate example of a front end circuit 900b employing multiple banks of plates grouped together, each with their own differential amplifiers.


Dividing the large number of parallel pickup plates into groups, each containing a smaller number of plates, is an alternate architecture that would not require the use of a tuned bandpass filter in the front end, because the parasitic switch capacitances would be greatly reduced. This would have two possible advantages: first, lower cost; and second, the ability to have a frequency agile front end. In this Figure we have a snapshot of the front end where the first switch 944a of bank 907a is active. All other switch banks 907b-907n are shown inactive, shorting their respective plates to ground. Therefore, only voltage or current differential amplifier 980a has any plate signal conducted into it; voltage or current differential amplifiers 980b-980n have both their positive and negative inputs shorted to ground through their respective switches 945a-n and 945r, preventing any signal from those banks from making a contribution to the overall output.


Each of the differential amplifiers 980a-980n is summed through resistors 987a-987n into summing amplifier 985. Only the differential amplifier 980a in this snapshot has plate signal routed into it, so it independently produces signal at the input of summing amplifier 985. This process is repeated sequentially until all or substantially all of the switch banks 907a-n, and switch plates 944a-n, 945a-n, etc., of the entire array are fully scanned. In different embodiments, all or substantially all of the array may be scanned, or less than the entire array may be scanned for different applications. In some applications, a lower resolution may be desired, so all of the array may not need to be scanned. In other applications, a full image may not be necessary, such as in a navigation application, where limited images may be used to detect speed, distance and/or direction of movement to use as input for a pointing device, such as directing a cursor on a display similar to a computer touch-pad or a mouse.


By splitting the pickup array up, the capacitive input load on each plate is reduced from that of the full array of switches to the number of switches within a given plate group. If, for example, 196 potential pickup plates were divided into 14 banks of 14 plates, the result would be a capacitance load equal to the parasitic capacitance of 14 switches (944), plus the capacitive load of the differential amplifier. If analog switches 944 are constructed with very low parasitic capacitance, then the overall input load would be small enough not to need a bandpass circuit in the front end in order to resonate out the load capacitance. As integrated circuit fabrication techniques improve, it becomes possible to design smaller switches with less parasitic capacitance, making this approach more attractive.
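
The banked-load arithmetic can be made explicit as follows; the per-switch value reuses the 0.5 pF figure from earlier, and the amplifier input capacitance is an illustrative assumption.

    c_switch = 0.5e-12        # parasitic capacitance per analog switch, as above
    c_amp_in = 1.0e-12        # assumed differential-amplifier input capacitance

    full_array_load = 196 * c_switch + c_amp_in   # ~99 pF with one 196-plate bank
    banked_load     = 14 * c_switch + c_amp_in    # ~8 pF with 14 banks of 14 plates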



FIG. 9c illustrates another example of a front end circuit 900c using individual plate buffers that are multiplexed into a second stage differential amplifier.


Buffers 982a through 982n as illustrated are special buffers that are designed to have very low input capacitance. In one embodiment, these buffers could be configured as single stage cascode amplifiers in order to minimize drain-to-gate Miller capacitance and die area. To maximize plate to plate isolation, two sets of switches could be used for each input. Analog switches 930a-930n are included in this example to multiplex each selected buffer into differential amplifier 980. Switches 932a-n are included to simultaneously shut down the power to all the other input buffers that are not selected, which effectively puts them at ground potential. An alternate embodiment would be to put input analog switches in front of each amplifier to allow the unused plates to be shorted directly to ground. One effect of this approach may be an increase in input load capacitance for each plate.



FIG. 9c shows a snapshot of the scanning process where top plate 902a is being sensed through buffer 982a, which has power supplied to it through switch 932a. Analog switch 930a is closed, routing it to differential amplifier 980. All other buffer outputs are disconnected from the differential amplifier 980 via analog switches 930b-n and power switches 932b-n.


The positive input to differential amplifier 980 is always connected to the reference plate 902r (through low input capacitance buffer 982r), providing an "air" signal reference to the amplifier. The differential amplifier 980 serves to subtract out noise and common mode carrier signal in addition to providing an "air" reference carrier value.



FIG. 10 shows a particular embodiment of a placement sensor implemented with traditional analog receiver 1000 technology. The analog front end begins with Differential Amplifier 1080 where selected Pickup Plate 1002a-n is subtracted from reference pickup plate 1005, which is located outside the finger contact area providing a reference signal equivalent to an ideal finger valley. A programmable gain stage or PGA 1090 follows the Differential Amplifier 1080, but could be integrated into the same block providing both gain and subtraction in a single stage. PGA 1090 is designed to have a gain range wide enough to compensate for production variations in plate etching and solder mask thickness between the layers.


Control processor 1030 orchestrates the scanning of the two dimensional sensor plate array. Drive plates/columns 1006a-1006n are scanned sequentially by the Bottom Plate Scanning Logic 1040 in the Control Processor 1030 (via drive control lines 1042 connected to switches coupled to the drive plates 1006a-n). When a selected drive plate is activated, it is connected to carrier signal source 1016, while all inactive drive plates are connected to ground. Before activating the next drive plate in the sequence, the active drive plate remains on long enough for the entire row of pickup plates 1002a-n to be scanned by top plate logic 1045 controlling switches 1030a-n.


Analog mixer 1074 multiplies the gained-up plate signal against the reference carrier 1013. The result is a classic spectrum of base band plus harmonic products at multiples of the carrier frequency. An analog low pass filter 1025 is employed to filter out the unwanted harmonics, and must have a sharp enough roll-off to attenuate the content associated with the second harmonic without losing base band information.
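
Numerically, this mix-and-filter operation amounts to multiplying the received carrier by the reference and low-pass filtering the product. The small simulation below, with an assumed carrier frequency and pixel amplitude, shows the base band value being recovered; it is illustrative only and not the disclosed analog circuit.

    import math

    F_CARRIER = 2e6             # assumed carrier frequency (illustration only)
    F_SAMPLE  = 20e6            # simulation sample rate, well above the carrier
    N         = 2000

    amplitude = 0.7             # pretend this pixel's coupling scales the carrier by 0.7
    t = [n / F_SAMPLE for n in range(N)]
    received  = [amplitude * math.cos(2 * math.pi * F_CARRIER * x) for x in t]
    reference = [math.cos(2 * math.pi * F_CARRIER * x) for x in t]

    mixed = [r * c for r, c in zip(received, reference)]   # base band plus 2x-carrier term
    baseband = sum(mixed) / len(mixed)                      # crude low-pass: averages to amplitude/2
    pixel_value = 2 * baseband                              # recovers ~0.7 for this pixel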


Following the low pass filter are an amplifier 1077 and an A/D converter 1076, which must sample at at least twice the pixel rate to satisfy the Nyquist criterion. Memory buffer 1032 stores the A/D samples locally and is sufficiently sized to keep up with the worst case latency of the host controller. The A/D sample control line 1078 provides a sample clock for the converter to acquire the sequential pixel information that is created by the sequencing of the plate rows and columns.



FIG. 11 shows an example of one embodiment of a placement sensor implemented with direct digital conversion receiver 1100 technology. In this example, the analog front end begins with differential amplifier 1180, where selected pickup plate 1102a-n is subtracted from reference pickup plate 1105, which is located outside the finger contact area, providing a reference signal equivalent to an ideal finger valley. The electrical subtraction of these signals performs several functions: first, wide band common mode noise is subtracted out; second, subtracting against reference plate 1105 provides a relative reference signal equivalent to an ideal ridge valley; third, common mode carrier signal that couples into both plates other than through a finger is also subtracted out. Elimination of common mode is particularly important in high RF noise environments. First order carrier cancellation of etch variation in the pickup plates also occurs when carrier that is coupled in by means other than through fingers placed on the sensor is subtracted out. This is critical for high volume manufacturing at a low cost.


A programmable gain stage or PGA 1190 follows the differential amplifier; the two could easily be combined into a single differential amplifier including programmable gain, as is commonly done in modern integrated circuit design. PGA 1190 is designed to have a gain range wide enough to compensate for production variations in plate etching and solder mask thickness between the layers.


Control processor 1130 orchestrates the scanning of the two dimensional sensor plate array. Drive plates/columns 1106a-1106n are scanned sequentially by the bottom plate scanning logic 1140 in the Control Processor 1130 (via drive control lines 1142 connected to switches coupled to drive plates 1106a-n). When a selected drive plate is activated, it is connected to carrier signal source 1116, while all inactive drive plates are connected to ground. Before activating the next drive plate in the sequence, the active drive plate remains on long enough for the entire row of pickup plates 1102a-n to be scanned by top plate logic 1145 controlling switches 1130a, 1130b, etc., and captured by the A/D converter 1125.


The A/D converter 1125 samples at a rate of at least twice the carrier frequency to satisfy the Nyquist criterion. The A/D sample control line 1107 provides a sample clock for the converter to acquire the sequential pixel information that is created by the sequencing of the plate rows and columns.


Following the A/D converter 1125 is a digital mixer 1118 that digitally multiplies the A/D output, which is at the carrier frequency, against the reference carrier generated by the digitally controlled oscillator 1110. The result is that the signal is down converted to the base band with the carrier removed. There are other unwanted spectral components created by this process, namely a sideband at twice the carrier frequency, but these can easily be filtered out.


A combination decimator and digital filter 1120 follows the digital mixer 1118. This block performs sample rate down conversion, reducing the sample rate from at least twice the carrier frequency to at least twice the much lower pixel rate. The digital filter would typically include a Cascaded Integrator Comb, or CIC, filter, which removes the unwanted spectral byproducts of mixing as well as improving the receiver signal to noise ratio. A CIC filter provides a highly efficient way to create a narrow passband filter after mixing the signal down to baseband with the digital mixer. The CIC filter may be followed by an FIR filter running at the slower decimated rate to correct passband droop.
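
As an illustrative sketch only, a first-order CIC stage behaves, up to a gain factor, like an integrate-and-decimate (boxcar) filter; the fragment below shows that behavior and is not the production filter chain.

    def cic_decimate(samples, factor):
        # First-order CIC: integrate (running sum) then decimate, which, up to a
        # gain factor, is equivalent to averaging non-overlapping blocks of
        # `factor` samples. Higher-order CICs cascade this structure for a
        # steeper roll-off.
        out = []
        for i in range(0, len(samples) - factor + 1, factor):
            block = samples[i:i + factor]
            out.append(sum(block) / factor)
        return out

    # Example: reduce the sample rate by ~100:1 as described above, e.g.
    # decimated = cic_decimate(baseband_samples, 100)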


With a reduction of sample rate on the order of 100:1, a relatively small control processor buffer (1132) could be used to capture an entire fingerprint. For example, a 200×200 array producing 40 k pixels could be stored in a 40 kB buffer. This is in contrast to a swipe sensor that must scan partial image frames at a rate fast enough to keep up with the fastest allowable swipe, usually around 200 ms. At the same time, a slow swipe of two seconds must also be accommodated, requiring ten times the amount of memory as the fastest one. Various techniques have been developed to throw away redundant sample lines before storage, but even with that, the real time storage requirements are much greater for swipe sensors. This is a critical factor in Match on Chip applications where memory capacity is limited. In addition, a placement sensor has no real-time data acquisition or processing requirements on the host processor beyond the patience of the user for holding their finger in place.
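
The buffer sizing works out as follows, assuming one stored byte per pixel (the byte-per-pixel figure is an assumption introduced here for illustration).

    rows, cols = 200, 200
    pixels = rows * cols                       # 40,000 pixels for a full placement image
    bytes_per_pixel = 1                        # assumed 8-bit sample after decimation
    buffer_bytes = pixels * bytes_per_pixel    # roughly 40 kB of local buffer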


Referring to FIG. 12A, an example of a sensor layout 1200 configured according to one embodiment is illustrated in a configuration that is known in the semiconductor industry as a Chip on Flex (CoF). Chip on Flex is a configuration where a processor chip is attached to a flexible substrate, such as Kapton™ tape, and is electrically connected to conductive lines and possibly other components located on the flexible substrate. In this example, the sensor layout 1200 is shown within the borders of Kapton tape having pitch rails 1202, 1204 with slots 1206 located along both rails. These slots are used in the manufacturing process to feed the tape through the process while lines and possibly components are formed on the tape. The pitch of a device refers to the length of Kapton tape required to form a device on the CoF. The distance "d" 1208, measured here between slots 1207 and 1209, is substantially constant throughout each rail, and the pitch is a shorthand method of determining the length of flex that a device covers. For the device shown in this example, the pitch 1212 shows a span of eight slots between slots 1207 and 1214, and thus the device would be characterized as an 8-pitch device. The example sensor device shown, which may be a fingerprint sensor or other type of placement, 2D or area sensor, illustrates an integrated circuit 1210, which may be a logic circuit formed on a silicon substrate, a microprocessor, or another circuit for processing pixel information captured from a sensor circuit. The example may also be formed or otherwise manufactured on a substrate other than a flexible substrate or Kapton tape; in fact, it may be formed on a silicon substrate, rigid board, or other substrate configured for various applications.


If configured as a fingerprint sensor or other placement sensor, integrated circuit 1210 may be a mixed signal chip that enables all or some of the functions described in FIG. 16 below. In one embodiment, it has enough inputs and outputs to drive a 200 by 200 line array of drive and pickup lines, and may have more or fewer of either type of line. The top layer 1220 is formed by an array of pickup lines connected directly to integrated circuit 1210, which may be a flip chip mounted directly to the flex substrate without bond wires. In this example, the bottom layer is formed by folding the single layer back onto itself along the folding axis 1230 to create a double layer active sensor area. The drive lines fold to create the bottom layer 1225. The drive lines in this example are split into left and right groups 1240 and 1242, respectively, for the sake of layout balance, but could be all on the left or right side of the sensing area. The left drive plate bundles 1240 and right drive plate bundles 1242 are inter-digitated with alternating left and right feeds to form a continuous line array on bottom layer 1225.


Flexible substrate based connector 1235 routes power, ground and interface signals out to an external host or onto another substrate that contains system level components, such as those illustrated in FIG. 16 and described below. These components may include, but are not limited to, a processor with memory, logic enabling embedded matching algorithm(s), and encrypting/decrypting functions. In an alternative example, connector 1235 may be attached to the host substrate using conductive adhesive, otherwise known as anisotropic conductive film (ACF attach), which may be labeled as "high density" in some products.


Referring to FIG. 12B, another example of a sensor 1250 is shown having a different orientation and configuration on a substrate. Similar to the above example, the sensor 1250 is a placement sensor, and is configured to be folded onto itself along the folding axis 1251 to create two layers: the bottom layer 1252 with drive lines 1256, and the top layer 1254 with pickup lines 1257, integrated circuit 1258, flex external connections 1262, and processor connections 1260 that may be used to connect the integrated circuit to external devices, such as for manufacturing testing for example. This configuration, however, has a much smaller pitch, again where the distance "d" is the distance between each pair of slots 1206; the pitch in this example spans slots 1270 to 1272, making this device a 5-pitch device. This example device takes up 5 pitches of Kapton tape, compared to the other example device 1200 (FIG. 12A) taking up 8 pitches of Kapton tape. This device performs substantially the same function as that of example 1200, FIG. 12A, yet takes up less Kapton tape, saving in material costs. The device may take up even fewer pitches of tape if the size of the resulting sensor surface were reduced, allowing the space needed to accommodate the pickup lines, drive lines, and other components to be reduced. In this example, the effective sensor surface may be ten millimeters on a side, and could be reduced to nine or even eight millimeters, and the structures could be reduced accordingly to reduce the overall area of the device, likewise reducing the area of substrate required to accommodate the overall device.


As will be appreciated by those skilled in the art, given these examples, different designs may be accomplished to optimize different aspects of the invention, including the size of the substrate used for a device, and both the size and pixel density of the sensing area. The invention, however, is not limited to a particular optimization done by others, and indeed the invention should inspire others to improve upon the design using well known and alternative processes, materials, and know-how available to them. The scope of the invention is limited only by claims that are either appended or submitted for examination in the future, and not by information that is extraneous to this specification.


Referring to FIG. 13a, a flex layer structure 1300 is illustrated. As shown, the flex layer structure 1300 includes an imaging area 1350, in which drive lines form crossover locations with pickup lines, where the crossover locations are formed by folding the top layer 1371 over bottom layer 1375, folding the flexible substrate upon itself about flex bend radius 1374. From a side view, the top flex 1364 is layered over top soldermask 1362, which is layered upon top copper or pickup lines 1360. Bottom layer solder mask 1370 is folded under top copper 1360, and bottom copper 1372 is formed under solder mask 1370 and over bottom flex 1375.


Referring to FIG. 13b, an example of a module structure 1301 is shown for mounting the flex layer structure 1300 of FIG. 13a. Those skilled in the art will understand that the structure of a particular module may vary according to the embodiment, and that this example, though it shows a substantially complete module that can serve as the basis of a practical implementation, is but one example and is not intended to, and should not be considered to, limit the embodiment in any way. The example structure 1301 includes rigid substrate 1330 that receives the flex top layer 1371 on its top surface, with flex locating pins or plastic frame 1337 configured to ensure alignment of the drive plates with the pickup plates. Because the sensing electrode pairs are formed by crossovers of the drive and pickup lines on the two layers, the x-y alignment tolerance requirements may be on the order of several pixels, rather than the sub-pixel alignment tolerances that would be required if there were features to be matched between the two layers. The four mounting holes (1337) on each layer are sufficient to ensure angular and x-y alignment. Also illustrated are driver chip 1310 and imaging area 1350.


Referring to FIG. 14, an illustration is provided of an example system 1400 incorporating a sensor system 1402 generally configured according to the embodiment. A sensor device may be incorporated into a system, or may be configured as a stand-alone product. As a stand-alone product, the sensor components may be encased in a housing (not shown), and electrical connections exposed for connection to a device or system that would utilize such a device. Those skilled in the art will immediately see how a sensor configured according to the embodiment as described herein can be incorporated into a housing such as those widely used in different industry sectors. Thus, for example, in a system, the mechanical connections, designs and structures may necessarily vary according to a particular application. For example, if incorporated into a laptop for use as a fingerprint sensor, a surface mounting module would need to be employed to expose the sensor grid lines to a user. If incorporated into a mobile phone, personal data assistant (PDA) or the like, another type of mounting module would be needed to conform to the particular device design while providing the operational capability of the sensor. Again, FIG. 14 illustrates a diagrammatic representation of a system 1400 that incorporates a sensor 1402 configured according to the embodiment with the folded flexible or rigid substrate 1404 having a top layer 1406 and a bottom layer 1408, and each having either pickup lines or plates and drive lines or plates respectively depending on the application, though not shown here. The two-dimensional sensing area 1411 is shown with an object 1410 on top, which may be a finger in the case of a fingerprint sensor, or another object in another application. The top layer's pickup plates or lines (not shown) communicate with top plate processing circuitry 1409 via communication link 1412 to send resultant signals received. Drive lines or plates are located but not shown here on bottom layer 1408, and receive drive signals from bottom plate processing circuitry 1414 via communication line 1416. The top plate processing circuitry 1409 includes front end buffers and amplifiers 1416 configured to receive, amplify and/or buffer or store a resultant signal received from the pickup plates or lines. A switch array 1418 such as illustrated in FIGS. 9A-9C is configured to receive the signal from the front end 1416 and send the switched signal to analog to digital (A/D) converter 1420 for conversion to a digital signal. Digital signal processor (DSP) 1422 is configured to receive the digital signal from A/D converter 1420 and process the signal for transmission.


Bottom plate processing circuitry 1414 is configured to produce a drive signal for use by drive plates or lines located on the bottom layer 1408 of the sensor substrate 1404, and includes drivers and scanning logic 1462 for producing the signal, and programmable frequency generator 1426 for programmably setting the frequency at which the drive signal is set. The bottom plate processing circuitry 1414 includes communication link 1428; likewise, the top plate circuitry has communication link 1430 for communicating with system bus 1432 for sending and receiving communications among the system, such as to processors, memory modules, and other components. System bus 1432 communicates with persistent memory 1434 via communication link 1436 for storing algorithms 1438, application software 1440, templates 1442, and other code for persistent and frequent use by processor 1444. Processor 1444 includes processor logic having logic 1448 and other circuitry for processing signals received from the system bus and originating from the sensor 1402, and also includes arithmetic logic unit 1450 configured with logical circuits for performing basic and complex arithmetic operations in conjunction with the processor. Processor memory 1452 is configured for local storage for the processor 1444, for example for storing results of calculations and retrieving them for further calculations.


In operation, drive signals are controlled by processor 1444, and parameters for the drive signals originating from bottom plate processing circuitry 1414 are set in the bottom plate processing circuitry 1414 by the processor 1444. Drive signals are generated by logic 1462 within the parameters set in generator 1426 and sent to bottom layer 1408 via communication link 1416. These signals generate electromagnetic fields that extend to pickup lines on top layer 1406 about the sensing area 1411. These signals are cycled through different pixel electrode pairs on the sensor grid (not shown here, but described above), and some of these electromagnetic fields are absorbed by the object 1410 (such as a finger, for example). The resultant signal is picked up by the pickup plates or lines located on top layer 1406 about the sensing area (not shown here, but described above). The resultant signal is then transmitted to top plate processing circuitry 1409 via communication line 1412, and the signal is processed and transmitted to storage or to processor 1444 for further processing. Once the drivers and scanning logic have cycled through the pixels on the grid sensor, data related to features and characteristics of the object can be defined and utilized by the system. For example, in a fingerprint sensor system, the image may be a fingerprint image that can be compared to a stored fingerprint image, and, if there is a match, it can be used to validate a user.



FIG. 15 illustrates how a device configured according to the embodiment may be applied to a fingerprint sensing application. A user places a finger with fingerprint (1510) over the sensor grid, which is formed by the crossover locations of the drive plates (1506a-1506n) and the pickup plates (1502a-1502m). Image pixel 1561a senses the fingerprint area above the electrode pair of drive plate 1506a and pickup plate 1502a, another pixel senses the area above the crossover of drive 1506n and pickup 1502a, and pixel 1562n senses the area above the crossover of drive 1506n and pickup 1502m.



FIG. 16 illustrates the steps required to collect the fingerprint image as shown in FIG. 15, using the embodiment shown in FIGS. 11 and 14. Image capture begins at step 1601. As part of the initialization a row counter is initialized to 1 at step 1602. Step 1603 is the beginning of a row scan sequence. At the beginning of each row, a column counter is set to 1 at step 1603. In step 1604, the top plate scanning logic 1145 activates the appropriate analog switch (one of 1103a through 1103n) for the selected row. In Step 1605 the sense of an individual pixel begins when the bottom plate scanning logic 1140 activates the appropriate drive plate (one of 1106a through 1106n) with the carrier signal 1116. At step 1606 the signal from differential amplifier 1180 is sampled repeatedly by A/D converter 1125 after processing through programmable gain amplifier 1190. Digital mixer 1118 mixes the samples down to the baseband frequency set by digital oscillator 1110. The baseband signal is then filtered by digital decimating filter 1120 to produce a signal level value for the current pixel. The functions performed for this step in the embodiment of FIG. 11 could alternatively be performed by the corresponding analog receiver shown in FIG. 10, or other functionally similar arrangements. In step 1607 the signal level value derived in step 1606 is stored in the appropriate position in memory buffer 1132 that corresponds to the currently selected row and column. In step 1608 the column number is incremented, and in step 1609 the column number is tested to determine whether the current row collection has been completed. If the row has not been completed, we return to step 1605 to collect the next pixel in the row. If the row is complete, we proceed to step 1610 and increment the row number. In step 1611, we test the row number to determine if all the rows have been scanned. If not, flow returns to 1603 to start the next row back at the first column. Once all the rows have been scanned, image capture is complete, and we proceed to step 1612, at which point the image is ready for further processing or transfer to long term storage.
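
The flow of FIG. 16 can be summarized in a short sketch; select_pickup_row(), drive_column() and sample_pixel() are hypothetical stand-ins for the top plate scanning logic 1145, the bottom plate scanning logic 1140 and the mixer/filter chain of FIG. 11, and the sketch is illustrative rather than the claimed implementation.

    def capture_image(num_rows, num_cols, select_pickup_row, drive_column, sample_pixel):
        # Mirrors steps 1601-1612: for each pickup row, step through every drive
        # column, sample and filter one pixel, and store it at (row, col).
        image = [[0] * num_cols for _ in range(num_rows)]
        for row in range(num_rows):                    # steps 1602, 1610, 1611
            select_pickup_row(row)                     # step 1604
            for col in range(num_cols):                # steps 1603, 1608, 1609
                drive_column(col)                      # step 1605
                image[row][col] = sample_pixel()       # steps 1606, 1607
        return image                                   # step 1612: ready for further processing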


Those skilled in the art will recognize that row and column scanning order may not correspond directly to physical position in the array, as some implementations may more optimally be sampled in interleaved fashions.


FIG. 17A shows an example of the system shown in FIG. 14 used in a user authentication application. In step 1701, a system level application 1440 on processor 1444 requires user authentication. At step 1702 the user is prompted to provide a finger for verification. The system waits for finger presence to be detected in step 1703. This can be performed by collecting a reduced size image as described in FIG. 16 and testing for a finger image, or via other dedicated hardware. Once finger presence is detected, a complete image is collected in step 1704, using the method described in FIG. 16 or another substantially similar method. This image is then stored and in step 1705 converted into a template 1712 as shown in FIG. 17B, typically consisting of a map of minutiae point locations and types (such as bifurcations 1710 and terminations 1711), or possibly of ridge frequency and orientation, or some combination of both. In step 1707 the template is then compared against one or more enrollment templates that were retrieved from persistent template storage 1442 in step 1706. If a match is found, the user is authenticated in step 1708 and granted access to the application. If no match is found, the user is rejected in step 1709, and access is denied.
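
The authentication flow of steps 1701-1709 can be sketched as follows; the helper callables are hypothetical stand-ins for the image capture of FIG. 16, the template extraction of step 1705 and the third-party matchers mentioned above, and the sketch is illustrative only.

    def authenticate(capture_image, extract_template, match_score,
                     enrolled_templates, threshold):
        # Mirrors steps 1701-1709: capture an image, build a candidate template,
        # and compare it against each enrolled template; any score at or above
        # the threshold authenticates the user.
        image = capture_image()                         # steps 1703-1704
        candidate = extract_template(image)             # step 1705
        for enrolled in enrolled_templates:             # steps 1706-1707
            if match_score(candidate, enrolled) >= threshold:
                return True                             # step 1708: access granted
        return False                                    # step 1709: access denied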


In an authentication system such as described by FIGS. 16 and 17A-B, there can be a tradeoff between security and operational speed. A device such as a smartphone may have differing security and convenience (speed) requirements for differing operating modes as well. These tradeoffs may be governed by the value of security for different types of information. As an example, users may place a low value on security for simply powering up their smartphone or other device, but they may place a much higher value on performing a financial transaction or other sensitive transfer. They may want to lock out the ability to access personal contact information or customer lists, or may want to lock out the ability for others to make local calls or long distance calls, access personal photos, access social networking websites, or send and receive text messages or emails, and they may want different security protocols for access to different information. Users having conventional systems without the benefit of biometrics will typically lock their telephone handset with a four-digit PIN, which is a fairly low level of security. Securing a financial transaction over the same device, a new development that is desired in the industry, would cause a user to desire a much higher level of security. Conversely, the amount of time a user would find acceptable to unlock the phone for a simple call would be much shorter than the time they would be willing to wait for authentication of a high value financial transaction.


Embodiments described herein facilitate supporting both of these requirements by providing variable captured image resolution and matching algorithm security level. In one example, when operating in high security mode (such as when enrolling a user or validating a high-value transaction) the image capture procedure described in FIG. 16 and the match procedure described in FIGS. 17A-B may operate in full resolution mode. When operating in ‘convenience’ mode (such as unlocking the phone, looking at photos, surfing the internet or switching users), the fingerprint image may be acquired in a half-resolution mode by skipping every other column and every other row, for example by having steps 1608 and 1610 increment the column and row counters, respectively, by two instead of one. This may result in an image with half the resolution in each axis compared to the high security mode, and one-fourth the size. This could cut by a factor of four both the time required to acquire the image (FIG. 16) and the time required to extract the template from the image (step 1705). Due to the reduced resolution of the image, and the relaxed security requirements for this convenience mode, the matching threshold applied in step 1707 may be accordingly reduced.
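
As a rough illustration of the two operating modes, the sketch below parameterizes the scan of FIG. 16 by a step size and pairs it with a matching threshold. The step values follow the description above (1 for high security, 2 for convenience mode); the threshold numbers are purely illustrative assumptions, not values from the specification.

def scan_parameters(high_security):
    """Step size for the row/column counters and an illustrative match threshold."""
    step = 1 if high_security else 2              # steps 1608/1610 increment by 1 or 2
    threshold = 0.65 if high_security else 0.45   # assumed values for the step 1707 threshold
    return step, threshold

def capture_subsampled(num_rows, num_cols, read_pixel, step):
    """Collect every step-th row and column; step=2 yields a quarter-size image."""
    return [[read_pixel(r, c) for c in range(0, num_cols, step)]
            for r in range(0, num_rows, step)]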


Referring to FIGS. 18A-D, another embodiment of a sensor module or assembly is illustrated as a sensor 1800, first shown in an expanded view in FIG. 18A, made up of a folded flex sensor 1802 having a sensing area 1803, a module folding base 1804 and mounting board 1806. In this embodiment, a switch having a plunger 1812 and base 1813 is incorporated into a sensor assembly that allows the integration of the sensor operations, such as fingerprint sensor operations, together with other operations of a device. Still further, this assembly allows for the configuration of a personalization switch for use on a device, such as a mobile telephone or smart phone for example, that has extended functions including biometric operations. If used together with a power or selector switch, such as for example a modular replacement for the main selection switch on an iPhone™ manufactured by Apple Computer Corporation or a navigation selection switch used on a BlackBerry™ smartphone manufactured by Research in Motion (RIM™) next to the display screen of these devices, a fingerprint sensor can be used for authentication while using these personal devices. The authentication can be used to access the device entirely, access different levels of information such as different information that a user wishes to protect, or could be used for authentication of the user for financial transactions that require a higher level of security. These settings may be preset by the manufacturer, may be reset by the user, may be set by a financial institution associated with the user or the device, or may be configurable by anyone with an interest in protecting the information.


Still referring to FIG. 18A, the folded flex sensor 1802 may be folded at 1805 and 1807 to fit about the module folding base 1804 at mounting locations loop brace 1805-A and folding edge 1807-A respectively, with placement holes 1808 to aid in placing the flex about the module and holding it in place. If the flex sensor circuit is formed or otherwise configured on a substrate according to the examples of FIG. 12A or 12B, different mounting operations may be required to accommodate these or other designs that require a different folding or forming of the substrate. The sensor 1802 may include processor 1810 as described in similar embodiments above. Mounting board 1806 includes the switch having the plunger 1812 and base 1813 mounted about switch opening 1811 to accommodate plunger 1812, and may also have a processor opening 1814 configured to accommodate processor 1810.


Referring to FIG. 18B, another expanded view of the sensor of FIG. 18A is shown from another angle, where one side of the flex sensor 1802 shows more clearly the openings 1808 and processor 1810. The openings 1808 are configured to receive placement or mounting pegs 1816 for holding the sensor 1802 substrate in place, the pegs then being received by mounting openings 1818. The placement or mounting pegs 1820 are received by mounting openings 1822. A switch base opening 1824 is configured to receive switch base 1813. In another embodiment, the opening for the plunger 1812 and the base 1813 may be a single sized opening that will accept the entire switch, or the switch may have a base with the same diameter as the plunger so that a single cylindrical, rectangular or other shaped opening may be sufficient to accommodate the switch.



FIG. 18C shows a side cutaway view of the assembled sensor with the sensor substrate 1802 mounted on module folding base 1804, which is in turn mounted on base 1806, and with the openings 1811 and 1824 accommodating the switch plunger 1812 and switch base 1813 respectively. FIG. 18D shows a close-up of the side view of FIG. 18C.



FIGS. 19A-J show an alternative sensor/switch assembly where a dome switch is used as the underlying switch integrated in the assembly. Referring to FIG. 19A, the assembly 1900 includes a dome switch 1912 disposed within an opening 1911 of a module folding base 1904 mounted on mounting board 1906. Opening 1911 may be beveled as at 1915 and may be covered by a cover 1903 (FIG. 19B). In this embodiment, a switch having a dome-shaped plunger 1912 and base 1913 (FIG. 19D) is incorporated into a sensor assembly that allows the integration of the sensor operations, such as fingerprint sensor operations, together with other operations of a device.


Referring to FIGS. 19A, B and C, a folded flex sensor 1902 may be folded at 1905 and 1907 to fit about the module folding base 1904 at mounting locations loop brace 1905-A and folding edge 1907-A respectively, with placement holes 1908 (FIG. 19D) to aid in placing the flex about the module and holding it in place. If the flex sensor circuit is formed or otherwise configured on a substrate according to the examples of FIG. 12A or 12B, different mounting operations may be required to accommodate these or other designs that require a different folding or forming of the substrate. The sensor 1902 may include processor 1910 as described in similar embodiments above. Mounting board 1906 includes a switch base 1913 mounted below plunger 1912, and may also have a processor opening 1914 configured to accommodate processor 1910 (FIG. 19D).


Referring to FIGS. 19G, H and I, side views of the sensor are shown, showing the flex sensor 1902, the openings 1908 and processor 1910, where the openings 1908 are configured to receive placement or mounting pegs 1916 for holding the sensor 1902 substrate in place, the pegs then being received by mounting openings 1918. The placement or mounting pegs 1920 are received by mounting openings 1922. Switch base opening 1924 is configured to receive switch base 1913. In another embodiment, the opening for the plunger 1912 and the base 1913 may be a single sized opening that will accept the entire switch, or the switch may have a base with the same diameter as the plunger so that a single cylindrical, rectangular or other shaped opening may be sufficient to accommodate the switch.



FIGS. 19E, F, and J show normal, cutaway, and expanded cutaway views of the sensor assembly mounted in a device such as a smartphone, with the sensor substrate 1902 mounted on module folding base 1904, which is in turn mounted on base 1906, and with the opening 1911 accommodating the switch dome plunger 1912 and switch base 1913. A sensor area 1901 is accessed through a bezel opening 1909 which is incorporated into the finished case 1925 of the device. When the user places a finger on the surface of the sensor area 1901, the user simultaneously depresses switch plunger 1912.



FIG. 20 illustrates a perspective view of one embodiment of an embedded switch 2000 that can electrically connect a top conductive layer 2002 through insulating layer 2004 to conductive layer 2006 upon the touch of a user on the surface of the sensor (not shown here, but which may be a layer above conductor 2002). The three layers may be embedded within a fingerprint sensor as described above, allowing a switch to be located within the double-layered fingerprint sensor so that a user can activate a function upon touch, such as power, select, initiate, enter, or other switch functions of a device. The three layers may be placed on a surface 2006 of module 2008, where the module is located on the surface 2010 of a substrate 2012.



FIGS. 21A and 21B show an embodiment where a switch is formed on the same substrate as the sensor. The figures show the folded flex stack-up consisting of flex substrates 2102 and 2103, typically but not limited to Kapton®; metalized layers 2104 and 2105, which are typically etched or formed copper traces; and insulating layers 2106 and 2107, which are typically solder mask. Insulating layers 2106 and 2107 have a cutout section 2110 exposing the conductive layers 2104 and 2105. When no vertical pressure is applied over the gap 2110, conductive layers 2104 and 2105 are not electrically in contact with each other and the switch is in the off position.



FIG. 21B shows the flex top layer 2103 and its conductive layer 2105 mechanically depressed by a contacting object such as a finger. Top conductive layer 2105 can be pushed physically into electrical contact with conductive layer 2104 at pressure focal point 2112. This forms an embedded flex switch, shown here in the on position.



FIGS. 22-26 illustrate alternative embodiments and further examples. These examples may be configured using different materials and structures, and they may further be oriented or integrated in different structures such as power buttons in mobile devices, stationary devices, computers, laptops, and access devices (doorknobs, entryways, or the like). Note that in these figures the number of plates is greatly reduced to simplify the drawings, and the size of individual drive and pickup plates is increased for simplicity as well. In practice, both drive and pickup plates may be formed at fixed or variable pitch, and unlike the drawings, the spacing between plates may be greater or less than the individual plate size.



FIGS. 22A-B depict an embodiment where the drive and detection electronics are implemented on separate silicon components. This configuration minimizes interconnect between layers by directly mounting the drive die on the substrate layer for the drive lines, and directly mounting the pickup die on the substrate for the pickup plates. The rigid substrate for the drive plates also serves as a common base layer which provides interconnect for synchronizing signals between the two subsystems, as well as power and communications to the host device.


In this particular example, the common substrate (2201) is a two layer rigid circuit board, which also provides a mechanical base for the sensor. The drive circuitry is implemented in integrated circuit die (2204) which is mounted on rigid drive substrate (2201). The die is connected to the circuit on the rigid substrate by a number of bonding pads (2206) using standard flip-chip mounting processes. A large number of drive lines (typically more than 100) are connected to the drive plates (2209), which are formed on the top side of the rigid substrate.


A dielectric layer (2208) separates drive plates (2209) from pickup plates (2210). In this instance dielectric layer (2208) is provided by a solder mask layer applied to the drive plates (2209) and rigid substrate (2201).


Pickup substrate assembly (2202) with pre-attached pickup circuit die (2205) is mounted on top of drive substrate (2201). The die is connected to the circuit on the flexible substrate by a number of bonding pads (2216) using standard flip-chip mounting processes. Because substrate (2202) is flexible, attach pads (2211) can mate with their corresponding pads (2212) on base substrate (2201). A cutout (2203) is provided in base substrate (2201) to accommodate pickup chip (2205) so the assembly lies flat. Attach pads (2211) provide interconnect to the mating pads (2212) on the substrate layer (2201).


Interconnect traces (2214) formed on the top layer of base substrate (2201) provide synchronizing signals between the integrated circuits (2204) and (2205).


Interconnect traces (2215) in the base substrate (2201) route signals to interconnect pads (2213) for power, ground, and communications interconnect to the host system.



FIGS. 23a-g illustrate an example of an assembly stackup of the two-chip embodiment. FIG. 23a shows the rigid base (2201) with the drive plates (2209), host interconnect traces (2215) and contact pads (2213), and pickup communications traces (2214) and contact pads (2212). Cutout (2203) is made in base (2201) to accommodate the pickup IC, which will be attached in a subsequent step.


Rigid base (2201) could be fabricated from standard circuit board materials, such as FR4, in which case plates (2209), interconnect traces (2214 and 2215) and pads (2212 and 2213) would typically be formed from copper by use of circuit board etching techniques. Rigid base (2201) could also be formed from glass, in which case plates (2209), interconnect traces (2214 and 2215), and pads (2212 and 2213) would typically be formed from a transparent conductive material such as Indium Tin Oxide (ITO).



FIG. 23b shows drive electronics die (2204) attached to the traces on the assembly from FIG. 23a. The die is shown attached to the traces via standard flip-chip mounting processes.



FIG. 23c shows the exemplary assembly after the addition of dielectric layer 2208. This dielectric layer may be formed by a standard soldermask process, such as LPI, or by applying a piece of dielectric such as Kapton film.



FIG. 23d shows a cutaway view of the exemplary flexible substrate (2202) with pickup plates (2210) and pickup communications pads (2211) formed on it. Flexible substrate (2202) may be formed from a Kapton film, in which case the plates (2210), traces, and pads (2211) would likely be formed of copper by standard etching techniques. Flexible substrate (2202) could also be made of a transparent material, such as polyester, with plates, traces, and pads formed by depositing a film of a transparent conductive material such as ITO.



FIG. 23f shows the cutaway view of the exemplary flexible substrate with the addition of pickup electronics die (2205). Electrical connections between the die and elements on the flex are made by bonding interconnect bumps (2216) on the die to contacts (2217) on the flex assembly, as shown in FIG. 23e. Interconnect bumps (2216) are typically made of gold, while contacts (2217) are features formed of the same material as the plates and traces.



FIG. 23g shows a cutaway view of the exemplary completed assembly, as the flex assembly is mounted onto the rigid assembly. Electrical connection between the two sub-assemblies is made by mating flex assembly pads (2211) to rigid assembly pads (2212).



FIG. 24 shows an example of steps required to assemble the exemplary embodiment shown in FIGS. 22A-B and 23A-G. In Step 2401 traces 2214 and 2215, host contact pads 2213, layer interconnect pads 2212, and drive plates 2209 are all formed by an etching process on substrate 2201. A number of instances of the substrate assemblies may be formed at the same time by repeating the pattern across a large panel of base material. In Step 2402 cutout 2203 is formed in substrate 2201 by a standard circuit board routing process. This may take place at the same time that the multiple instances of substrate 2201 are separated by cutting the substrate outline out of the common panel. In Step 2403 dielectric layer 2208 is created by applying a layer of material such as LPI solder mask to substrate 2201 and drive plates 2209. In Step 2404 pickup plates 2210, interconnect pads 2211, and bonding pads 2217 are formed on substrate 2202 by an etching process. In Step 2405, drive electronics die 2204 is mounted onto substrate assembly 2201 using a standard chip-on-board flip-chip bonding process. In Step 2406, pickup electronics die 2205 is mounted onto substrate assembly 2202 using a standard flip-chip chip-on-flex bonding process. In Step 2407, flex substrate assembly 2202 is mounted onto base substrate assembly 2201. In Step 2408 pads 2211 and 2212 are electrically connected using an anisotropic conductive film (ACF) attach process.



FIGS. 25A-F show an embodiment where the drive and detection electronics are implemented on separate structures, such as separate silicon components for example. This configuration minimizes interconnect between layers by directly mounting the drive die on the substrate layer for the drive lines, and directly mounting the pickup die on the substrate for the pickup plates. The drive and pickup layers may both be connected to a common base layer which provides interconnect for synchronizing signals between the two subsystems, as well as power and communications to the host device. In this particular example, the common substrate (2500) may be a two layer rigid circuit board, which may also provide a mechanical base for the sensor. The drive circuitry may be implemented in integrated circuit die (2504) that is mounted on flexible drive substrate (2501). The die may be connected to the circuit on the flexible substrate by a number of bonding pads (2506) using standard flip-chip mounting processes or other mounting processes known in the art. A large number of drive lines (possibly 100 or more) may be connected to the drive plates (2509), which may or may not be formed on the same flexible substrate. Attach pads (2511) can provide interconnect to the mating pads (2512) on the substrate layer (2500). Substrate (2500) may incorporate a cutout (2513). In one example, the cutout may be configured so that when the drive substrate (2501) is attached, drive electronics chip (2504) will not contact substrate (2500), and the assembly lies flat or planar. In another embodiment, where a surface may not be entirely planar or may even be molded over an object such as a power button, the different layers may have a cutout to accommodate different structures such as the drive electronics. Pickup substrate assembly (2502) with pre-attached pickup circuit die (2505) may be mounted on top of both drive substrate (2501) and base substrate (2500). In this embodiment, drive substrate (2501) provides the dielectric layer between the drive and pickup plates, without the need for a separate dielectric layer as in previously discussed embodiments. If substrate (2502) is flexible, attach pads (2507a) may be able to mate with their corresponding pads (2507b) on base substrate (2500). A cutout (2503) may be provided in base substrate (2500) to accommodate pickup chip (2505) so the assembly lies flat. Interconnect traces (2514) formed on the top layer of base substrate (2500) may be included to provide synchronizing signals between the integrated circuits (2504) and (2505). Vias (2507c) or other openings in the base substrate (2500) may be used to route signals to the bottom layer, where lower layer traces (2509) may connect the signals to interconnect pads (2508) for power, ground, communications interconnect to the host system, and other connections.



FIGS. 26A-C show an exemplary embodiment where the drive and detection electronics are both provided by a single integrated circuit. In one example, substrate (2601) may be composed of a dielectric material which separates the drive (2602) and pickup (2603) plates. Substrate (2601) may be a flexible material, such as Kapton, a thin rigid material, such as an aramid laminate layer in a FlipChip package, or another material. Integrated circuit die (2604) incorporates contact pads (2611) which are mounted onto bonding pads (2605) on the bottom layer of the substrate. The bonding pads provide connections from die (2604) to interconnect traces (2606), drive plates (2602), and pickup interconnect traces (2607). A number of vias (2609) electrically connect pickup interconnect (2607) on the bottom layer to pickup plates (2603), which may be located on the top layer. As shown in FIG. 26A, conductive traces (2612) on the top layer fan out from the ends of the pickup plates (2603) to the vias (2609). Each conductive trace (2612) connects an associated pickup plate (2603) to an associated via (2609). In the illustrated embodiment, the pickup plates (2603) are straight and parallel to each other along their entire length, while the conductive traces (2612) are not straight or parallel to each other along their entire length. The spacing between adjacent vias (2609) is greater than the spacing between adjacent pickup plates (2603). Interconnect traces (2606) may connect die (2604) to host connector pads (2608). A dielectric layer (2610) may be formed atop pickup plates (2603) to prevent direct contact of the finger with the pickup plates. The dielectric layer (2610) may be formed from a number of materials, including but not limited to an LPI soldermask material, an ink, or a top Kapton coversheet. In another embodiment, where a surface may not be entirely planar or may even be molded over an object such as a power button, the layers may include cutouts to accommodate such structures.



FIGS. 27-29 illustrate a method for integrating a folded flex fingerprint sensor directly onto a touch-screen device. An unfolded Chip on Flex (COF) substrate is shown in FIG. 28 and is folded back onto itself as shown in FIG. 27, callouts (2701-2705). Host interface tab (2705) connects the sensor to the host using a connector that accepts the etched flex tab directly, or is attached to the host circuit board using an industry standard anisotropic conductive film (ACF) adhesive. FIG. 28 shows a COF layout where the substrate is folded end-over-end, but it could also be reconfigured to fold on another axis, such as side-by-side.



FIGS. 30-32 illustrate a fingerprint sensor integrated onto the same substrate layers as a conventional touch-screen.



FIG. 51 illustrates a method for integrating a folded flex fingerprint sensor with a touchscreen device that utilizes a protective cover layer. Display substrate (5106) is a conventional display assembly, which may optionally incorporate a touchscreen. Flexible substrate (5120) has a driver receiver chip (5104) mounted on it and provides a substrate for both the drive and pickup lines of the fingerprint sensor. Substrate (5120) is then folded and mounted on top of display substrate (5106). Host interface connector (5105) provides data and control interconnects between the fingerprint sensor assembly and the host device. An optically transparent dielectric spacer (5130) is placed on top of the display substrate. This dielectric spacer is the same thickness as the stacked fingerprint sensor substrate (5112). A cutout (5108) in the dielectric spacer allows the fingerprint sensor sensing area (5112) to sit flush with the top of the dielectric spacer. Top layer (5107) may be made of a protective material such as Corning Gorilla Glass or a protective polymer film. This top layer has a cutout (5109) which allows direct finger contact with finger sensing area (5112).



FIG. 52 illustrates a method for integrating a fingerprint sensor on the same substrate layers as a touchscreen. Display substrate (5206) contains a standard LCD module. The lower layer plates (5211) of the fingerprint sensor are formed or etched on top of the display substrate from a conductive material such as Indium Tin Oxide (ITO). Flexible substrate (5220) has driver receiver chip (5204) mounted on it. Connections from the driver receiver chip (5204) are made to the lower layer plates (5211) through a lower layer edge connector (5210). This connector may be attached through anisotropic conductive film (ACF) or other means. An optically transparent dielectric spacer (5230) is placed on top of the display substrate. The spacer has a cutout to accommodate lower edge connector (5210) and upper edge connector (5221). Top layer (5207) is fabricated from a thin glass or transparent polymer film. This layer (5207) has upper layer sensor plates (5212) formed or etched on its lower surface from a conductive material such as Indium Tin Oxide (ITO). Flexible substrate (5220) connects signals from driver receiver chip (5204) through upper edge connector (5221), again through ACF or other means.



FIG. 53 illustrates a configuration where the fingerprint sensor and a touchscreen are implemented on common substrate layers utilizing a common controller chip. Driver receiver chip (5304) is mounted to flexible substrate (5320), which provides interconnects between the chip and the host through a host interface connector (5305). Display substrate (5306) is the upper surface of a display module. Both the lower touchscreen plates (5332) and the lower layer fingerprint sensor plates (5311) are formed or etched on top of display substrate (5306). These plates may be etched or formed from a transparent material such as Indium Tin Oxide (ITO) or other suitable conductive material. Upper fingerprint sensor plates (5312) and upper touchscreen plates (5331) are both etched or formed on the lower side of top layer (5307). A dielectric layer (5330) separates the lower sets of plates for the two sensors from the upper sets of plates. Flexible substrate (5320) provides signal interconnect from driver/receiver chip (5304) to the lower sets of plates through lower edge connector (5310) and to the upper plate sets through upper edge connector (5321). These edge connectors may be electrically connected through an anisotropic conductive film (ACF) bond or other suitable means.



FIGS. 33 and 34 illustrate a novel “Dual Grid” touch-screen.


User motion tracking is required in touchscreen devices for a number of functions, including icon selection and movement, control selection, gesture recognition, text selection, and so on. Many motion tracking functions require only coarse position determination but must be performed at high speed; this is especially true of gesture recognition. Other functions may require much finer position determination, but these are generally performed at low speeds of motion to allow the user more precise control. Such functions include text selection and drawing tasks.


While it may be difficult to precisely track motion at high speeds, due to the large number of positions that must be sampled, in practice high speed and high precision are not needed simultaneously.


Given the relatively large size of fingers compared to the size of pixels on a typical touchscreen, it is difficult to accurately determine the precise position of a finger. In practice, users do not actually rely on exact finger placement to perform precise tasks. Instead, they place their finger on the touchscreen at the approximate location of interest, and then rely on visual feedback from the screen to complete fine positioning tasks.


The important characteristics for a touchscreen display input, then, are good high speed coarse absolute position determination, and highly responsive high resolution low-speed relative motion determination.


This invention addresses the need for high speed coarse absolute positioning and responsive high resolution slow speed motion tracking by providing a dual resolution sensing system. A primary grid is formed at a spacing equal to that used by commercially available touch screens, with a pitch of 5-10 lines per inch, while a secondary grid is formed about the primary lines with a much finer resolution, equal to that of a commercial fingerprint sensor at 500 lines per inch. The result is a sensor that is capable of detecting macro finger movements using the primary grid as well as small incremental movements using the secondary grid, which tracks the movements of fingerprint ridge and valley features.



FIG. 33 shows one embodiment of this dual grid concept in a touchscreen application where the cover layer has been omitted. The touchscreen consists of a matrix of passive capacitance sensing junctions formed at the crossover points of each row and column, arranged in a pattern to facilitate both coarse absolute position detection and high resolution relative motion tracking. The illustration shows a simplified embodiment where the sense and drive lines are configured to provide a regularly spaced series of 3×3 pixel high resolution patches, where the high resolution pixels are spaced in a manner to detect localized features of a fingerprint. The pixels within a cluster would typically be spaced approximately 50 to 100 microns apart. The clusters of pixels are spaced in a manner to provide several contact points within a finger, leading to a typical cluster spacing of 1 to 3 millimeters. Active circuitry is located on the periphery of the array outside the sensing area and is used to make the individual crossovers impedance sensitive to a proximally located object as well as to scan the entire array.
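
The dual-grid layout can be summarized as clusters of fine pixels repeated at a coarse pitch. The following sketch simply enumerates pixel centers for such a layout; the default pitches (2 mm between clusters, 75 microns between pixels, 3×3 pixels per cluster) are illustrative values chosen within the ranges given above, not dimensions taken from the drawings.

def dual_grid_positions(clusters_x, clusters_y,
                        cluster_pitch_um=2000,   # coarse pitch, within the 1-3 mm range above
                        pixel_pitch_um=75,       # fine pitch, within the 50-100 micron range above
                        cluster_size=3):         # 3x3 high resolution patch per cluster
    """Return (x, y) centers, in microns, of every fine pixel in the dual grid."""
    positions = []
    for cy in range(clusters_y):
        for cx in range(clusters_x):
            for py in range(cluster_size):
                for px in range(cluster_size):
                    positions.append((cx * cluster_pitch_um + px * pixel_pitch_um,
                                      cy * cluster_pitch_um + py * pixel_pitch_um))
    return positions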



FIG. 34 is an exploded view of the embodiment shown in FIG. 33, showing the stack-up: the substrate, the drive lines atop it, and the chip with the processing electronics mounted to routing lines which feed the drive plates, as well as to feed lines which connect to the pickup lines on the upper layers and to the host interconnect. A dielectric layer separates the drive and pickup lines. The dielectric has cutouts to allow the interconnect pads to feed the pickup signals back to the processing electronics. A thin protective layer, typically of a polymer film or very thin glass, is mounted on top of the sense plates.



FIGS. 35-37 illustrate a fully integrated Dual Grid touch-screen and fingerprint sensor, which advantageously share a common drive and sense circuit. In this embodiment, a dedicated area of high density pixels is provided to create a fingerprint sense area for user authentication or other similar purposes, adjacent to a dual-grid touchscreen array configured as in FIG. 33. FIGS. 36 and 37 are exploded views of the embodiment shown in FIG. 35.



FIGS. 38-40 illustrate a fully integrated display with integral touch-screen and fingerprint sensing over the entire display area. In this embodiment, the drive plates are formed below the OLED emissive pixels, while the pickup plates are formed of transparent conductors, such as ITO, in a layer above the OLED emissive layer.



FIGS. 41-46 illustrate how the dual grid finger motion tracking process operates.



FIGS. 41 and 42 show the location of a finger on the dual grid sensor before and after a coarse position change. Coarse position finding and fast motion tracking are performed by activating all the pixels in a cluster simultaneously and taking a single measurement to determine if the cluster is covered by a finger. FIG. 43 is a flowchart for this coarse scan process. FIG. 44 illustrates the computation of the centroid locations for the finger in the two samples collected before and after the move. If the coarse coverage changes between two scans, then the finger is moving quickly, and fine position tracking is not needed. In this case the device will report a new absolute position for each sensed finger to the host system.
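
A coarse scan of this kind amounts to marking which clusters are covered and comparing the centroids of two successive scans. The sketch below assumes a hypothetical measure_cluster callable that activates all pixels in one cluster and reports whether a finger covers it; it illustrates only the coarse flow of FIGS. 43-44.

def coarse_scan(clusters_x, clusters_y, measure_cluster):
    """Return the set of (x, y) cluster indices currently covered by the finger."""
    return {(x, y) for y in range(clusters_y) for x in range(clusters_x)
            if measure_cluster(x, y)}

def centroid(covered):
    """Centroid of the covered clusters, or None if no cluster is covered."""
    if not covered:
        return None
    n = len(covered)
    return (sum(x for x, _ in covered) / n, sum(y for _, y in covered) / n)

def coarse_motion(before, after):
    """Centroid shift between two coarse scans; little or no shift means fine tracking may be needed."""
    c0, c1 = centroid(before), centroid(after)
    if c0 is None or c1 is None:
        return None
    return (c1[0] - c0[0], c1[1] - c0[1])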


If the coarse finger position has not changed, then it is possible that the user is performing a fine positioning task. FIG. 45 illustrates a sequence of images taken at a single high resolution cluster as a finger moves over it. It can be seen that a coarse centroid measurement would not detect the motion of the finger, but examination of the ridge pattern at a fine pixel resolution can detect this relative motion. FIG. 46 shows the local images collected at three different high resolution clusters before and after a very fine motion. For this example, we examine a 4×4 cluster of high resolution pixels at each location on the coarse grid. The advantage provided by taking samples at multiple locations is demonstrated by examining the results from each cell individually. Assuming a sample rate sufficient to capture any motion of one pixel or less, there are nine possible relative locations that must be examined when comparing two sequential samples to determine the motion of the finger between the samples. For each possible adjacent move, we compute a match score by shifting the image to the adjacent location and then counting the pixels which match in the overlap region. The number of pixels matched is divided by the total number of pixels in the overlap region between the shifted and unshifted images to produce a relative match score; a score of 1 indicates a perfect match. When we apply this method to cell #1 in the example (FIG. 47) and to cell #2 (FIG. 48), a single location receives a strong score relative to the other possibilities. When we apply the method to cell #3 (FIG. 49), however, there are three equally high scoring possible final positions. This outcome for a localized area can be quite common when tracking regular patterns such as fingerprints. The example in FIG. 49 demonstrates that a ridge tracking system can be locally insensitive to movement that is parallel to the local ridge orientation. In order to reliably track motion of fingerprints, therefore, it is strongly advantageous to sample the finger at several different locations. The dual grid configuration enables this by dispersing the pixel clusters widely about the finger, so that it samples the finger at locations with a variety of local ridge orientations, without requiring the resources to cover the entire finger area at high resolution. FIG. 50 shows the combination of results from the three measurement cells to produce a single robust score to determine whether the pattern has remained stationary or moved. If motion is detected, the previously computed position (initially from the coarse position) is adjusted by the detected motion, and the updated position is transmitted to the host.
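
The per-cell match score and its combination across cells can be sketched as follows. Cluster images are taken here to be small 2-D lists of binary (ridge/valley) pixel values; this representation, and the use of summation to combine the per-cell scores of FIG. 50, are assumptions made for illustration only.

SHIFTS = [(dx, dy) for dy in (-1, 0, 1) for dx in (-1, 0, 1)]   # the nine possible one-pixel moves

def shift_score(prev, curr, dx, dy):
    """Fraction of matching pixels in the overlap region when curr is shifted by (dx, dy)."""
    rows, cols = len(prev), len(prev[0])
    matched = total = 0
    for y in range(rows):
        for x in range(cols):
            sx, sy = x + dx, y + dy
            if 0 <= sx < cols and 0 <= sy < rows:   # count only the overlap region
                total += 1
                if prev[y][x] == curr[sy][sx]:
                    matched += 1
    return matched / total if total else 0.0        # 1.0 indicates a perfect match

def estimate_motion(cluster_pairs):
    """Sum scores across clusters (e.g. cells #1-#3) and return the best (dx, dy) shift."""
    combined = {s: 0.0 for s in SHIFTS}
    for prev, curr in cluster_pairs:
        for dx, dy in SHIFTS:
            combined[(dx, dy)] += shift_score(prev, curr, dx, dy)
    return max(combined, key=combined.get)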


It should be noted that many operating modes of a device may only require coarse location information from the touch sensor. In these cases the system can advantageously omit the fine motion tracking operations of the position sensor in order to save power.


While certain exemplary embodiments have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad embodiment, and that this embodiment is not limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those ordinarily skilled in the art. Hence, alternative arrangements and/or quantities of, connections of various sorts, arrangements and quantities of transistors to form circuits, and other features and functions can occur without departing from the spirit and scope of the embodiment. Similarly, components not explicitly mentioned in this specification can be included in various embodiments of this embodiment without departing from the spirit and scope of the embodiment. Also, different process steps and integrated circuit manufacture operations described as being performed to make certain components in various embodiments of this embodiment can, as would be apparent to one skilled in the art, be readily performed in whole or in part to make different components or in different configurations of components not explicitly mentioned in this specification without departing from the spirit and scope of the embodiment. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.




Again, the embodiment has application in many areas, particularly in biometric sensors. Fingerprint sensors, for example, and other biometric sensors are gaining increasing acceptance for use in a wide variety of applications for security and convenience reasons. Devices, systems and methods configured according to the embodiment will have improved security of the biometric verification process without increasing the cost of the system. Furthermore, the embodiment may extend to devices, systems and methods that would benefit from validation of components. As discussed above, the embodiment includes the ability for the host and sensor to include any combination or subset of the above components, which may be arranged and configured in the manner most appropriate for the system's intended application. Those skilled in the art will understand that different combinations and permutations of the components described herein are possible within the spirit and scope of the embodiment, which is defined by the appended Claims, their equivalents, and also Claims presented in related applications in the future and their equivalents.


The embodiment may also involve a number of functions to be performed by a computer processor, such as a microprocessor. The microprocessor may be a specialized or dedicated microprocessor that is configured to perform particular tasks according to the embodiment, by executing machine-readable software code that defines the particular tasks embodied by the embodiment. The microprocessor may also be configured to operate and communicate with other devices such as direct memory access modules, memory storage devices, Internet related hardware, and other devices that relate to the transmission of data in accordance with the embodiment. The software code may be configured using software formats such as Java, C++, XML (Extensible Mark-up Language) and other languages that may be used to define functions that relate to operations of devices required to carry out the functional operations related to the embodiment. The code may be written in different forms and styles, many of which are known to those skilled in the art. Different code formats, code configurations, styles and forms of software programs and other means of configuring code to define the operations of a microprocessor in accordance with the embodiment will not depart from the spirit and scope of the embodiment.


Within the different types of devices, such as laptop or desktop computers, hand held devices with processors or processing logic, and also possibly computer servers or other devices that utilize the embodiment, there exist different types of memory devices for storing and retrieving information while performing functions according to the embodiment. Cache memory devices are often included in such computers for use by the central processing unit as a convenient storage location for information that is frequently stored and retrieved. Similarly, a persistent memory is also frequently used with such computers for maintaining information that is frequently retrieved by the central processing unit, but that is not often altered within the persistent memory, unlike the cache memory. Main memory is also usually included for storing and retrieving larger amounts of information such as data and software applications configured to perform functions according to the embodiment when executed by the central processing unit. These memory devices may be configured as random access memory (RAM), static random access memory (SRAM), dynamic random access memory (DRAM), flash memory, and other memory storage devices that may be accessed by a central processing unit to store and retrieve information. During data storage and retrieval operations, these memory devices are transformed to have different states, such as different electrical charges, different magnetic polarity, and the like. Thus, systems and methods configured according to the embodiment as described herein enable the physical transformation of these memory devices. Accordingly, the embodiment as described herein is directed to novel and useful systems and methods that, in one or more embodiments, are able to transform the memory device into a different state. The embodiment is not limited to any particular type of memory device, or any commonly used protocol for storing and retrieving information to and from these memory devices, respectively.


The term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “machine-readable medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that causes the machine to perform any one or more of the methodologies of the present embodiment. The machine-readable medium includes any mechanism that provides (i.e., stores and/or transmits) information in a form readable by a machine (e.g., a computer, PDA, cellular telephone, etc.). For example, a machine-readable medium includes memory (such as described above); magnetic disk storage media; optical storage media; flash memory devices; biological electrical, mechanical systems; electrical, optical, acoustical or other form of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.). The device or machine-readable medium may include a micro-electromechanical system (MEMS), nanotechnology devices, organic, holographic, solid-state memory device and/or a rotating magnetic or optical disk. The device or machine-readable medium may be distributed when partitions of instructions have been separated into different machines, such as across an interconnection of computers or as different virtual machines.


While certain exemplary embodiments have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad embodiment, and that this embodiment not be limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those ordinarily skilled in the art. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.


Reference in the specification to “an embodiment,” “one embodiment,” “some embodiments,” or “other embodiments” means that a particular feature, structure, or characteristic described in connection with the embodiments is included in at least some embodiments, but not necessarily all embodiments. The various appearances “an embodiment,” “one embodiment,” or “some embodiments” are not necessarily all referring to the same embodiments. If the specification states a component, feature, structure, or characteristic “may”, “might”, or “could” be included, that particular component, feature, structure, or characteristic is not required to be included. If the specification or Claim refers to “a” or “an” element, that does not mean there is only one of the element. If the specification or Claims refer to “an additional” element, that does not preclude there being more than one of the additional element.


The methods, systems and devices include improved security operations and configurations with a novel approach to biometric systems. Such systems would greatly benefit from increased security features, particularly in financial transactions. Although this embodiment is described and illustrated in the context of devices, systems and related methods of validating biometric devices such as fingerprint sensors, the scope of the embodiment extends to other applications where such functions are useful. Furthermore, while the foregoing description has been with reference to particular embodiments of the embodiment, it will be appreciated that these are only illustrative of the embodiment and that changes may be made to those embodiments without departing from the principles of the embodiment, the scope of which is defined by the appended Claims and their equivalents.

Claims
  • 1. A display device comprising an integral touch screen and fingerprint sensor over the display and comprising: a substrate; a driver chip; a bottom layer above the substrate and comprising a plurality of substantially parallel drive lines that are coupled to the driver chip; an OLED injector layer above the bottom layer; an OLED emissive layer above the OLED injector layer; and a top layer above the OLED emissive layer and comprising a plurality of substantially parallel pickup lines formed from transparent conductors, wherein the OLED emissive layer is in contact with the pickup lines, wherein each location where a pickup line overlaps a drive line forms a fingerprint sensor pixel, and wherein the touch screen comprises a high resolution touch screen and each individual pixel of the display device corresponds to an individual fingerprint sensor pixel.
  • 2. The display device of claim 1, wherein the drive lines and the pickup lines are arranged substantially perpendicularly to each other.
  • 3. The display device of claim 1, wherein the pickup lines are formed from indium tin oxide.
  • 4. The display device of claim 1, wherein the substrate comprises glass.
  • 5. The display device of claim 1, further comprising connectors configured for connecting the driver chip to a host device.
  • 6. The display device of claim 1, wherein the touch screen and the fingerprint sensor share a common drive and sense circuit.
  • 7. The display of claim 6, wherein the OLED injector layer and the OLED emissive layer are coupled to the common drive and sense circuit by the drive lines and the pickup lines.
  • 8. The display of claim 1, wherein the OLED injector layer is in contact with the drive lines.
  • 9. An OLED display device comprising an integral touch screen and a fingerprint sensor over the display and comprising: a substrate; a driver chip; a bottom layer above the substrate and comprising a plurality of substantially parallel drive lines that are coupled to the driver chip; an OLED emissive layer; and a top layer above the bottom layer and separated from the bottom layer by a dielectric layer and comprising a plurality of substantially parallel pickup lines formed of transparent conductors, wherein the OLED emissive layer is in contact with the pickup lines, wherein the pickup lines are oriented transversely to the drive lines to form a two dimensional array of intrinsic electrode pairs configured as a touch screen that detects contact with a portion of the array and configured as a fingerprint sensor that detects ridge and valley features of a finger proximally located with respect to at least a portion of the array, and wherein the touch screen and the fingerprint sensor cover an entire display area of the display device, wherein each location where a pickup line overlaps a drive line forms a fingerprint sensor pixel and wherein the touch screen comprises a high resolution touch screen and each individual pixel of the OLED display device corresponds to an individual fingerprint sensor pixel.
  • 10. The display device of claim 9, wherein the OLED display device further comprises an OLED injector layer above the bottom layer, wherein the OLED emissive layer is above the OLED injector layer.
  • 11. The display device of claim 10, wherein the touch screen and the fingerprint sensor share a common drive and sense circuit, and wherein the OLED injector layer and the OLED emissive layer are coupled to the common drive and sense circuit by the drive lines and the pickup lines.
  • 12. The display of claim 10, wherein the OLED injector layer is in contact with the drive lines.
  • 13. The display device of claim 9, wherein the driver chip is interconnected to the top layer.
  • 14. The display device of claim 9, further comprising connectors configured for connecting the driver chip to a host device.
  • 15. The display device of claim 9, wherein the substrate is formed from glass.
  • 16. The display device of claim 9, wherein the touch screen and the fingerprint sensor share a common drive and sense circuit.
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a divisional application under 35 U.S.C. § 121 of U.S. patent application Ser. No. 14/243,122 filed Apr. 2, 2014, which claims the benefit under 35 U.S.C. § 120 of the filing date of non-provisional patent application Ser. No. 13/860,494 filed Apr. 10, 2013, which claims priority under 35 U.S.C. § 119(e) of provisional application Ser. No. 61/622,474 filed Apr. 10, 2012, the respective disclosures of which are hereby incorporated herein by reference.

20020096731 Wu et al. Jul 2002 A1
20020097231 Satoh et al. Jul 2002 A1
20020109671 Kawasome Aug 2002 A1
20020122026 Bergstrom Sep 2002 A1
20020126516 Jeon Sep 2002 A1
20020130673 Pelrine et al. Sep 2002 A1
20020133725 Roy et al. Sep 2002 A1
20020150282 Kinsella Oct 2002 A1
20020152048 Hayes Oct 2002 A1
20020156726 Kleckner et al. Oct 2002 A1
20020164057 Kramer et al. Nov 2002 A1
20020181749 Matsumoto et al. Dec 2002 A1
20020186203 Huang Dec 2002 A1
20020188854 Heaven et al. Dec 2002 A1
20030002717 Hamid Jan 2003 A1
20030002718 Hamid Jan 2003 A1
20030002719 Hamid et al. Jan 2003 A1
20030016849 Andrade Jan 2003 A1
20030021495 Cheng Jan 2003 A1
20030025606 Sabatini Feb 2003 A1
20030028811 Walker et al. Feb 2003 A1
20030035568 Mitev et al. Feb 2003 A1
20030035570 Benkley Feb 2003 A1
20030035572 Kalnitsky et al. Feb 2003 A1
20030044051 Fujieda Mar 2003 A1
20030063782 Acharya et al. Apr 2003 A1
20030068072 Hamid Apr 2003 A1
20030074559 Riggs Apr 2003 A1
20030076301 Tsuk et al. Apr 2003 A1
20030076303 Huppi Apr 2003 A1
20030095096 Robbin et al. May 2003 A1
20030095691 Nobuhara et al. May 2003 A1
20030101348 Russo et al. May 2003 A1
20030102874 Lane et al. Jun 2003 A1
20030107608 Hong et al. Jun 2003 A1
20030108226 Goodman et al. Jun 2003 A1
20030108227 Philomin et al. Jun 2003 A1
20030115475 Russo et al. Jun 2003 A1
20030115490 Russo et al. Jun 2003 A1
20030123714 O'Gorman et al. Jul 2003 A1
20030123715 Uchida Jul 2003 A1
20030126448 Russo Jul 2003 A1
20030135764 Lu Jul 2003 A1
20030141959 Keogh et al. Jul 2003 A1
20030147015 Katoh et al. Aug 2003 A1
20030161510 Fujii Aug 2003 A1
20030161512 Mathiassen et al. Aug 2003 A1
20030169228 Mathiassen et al. Sep 2003 A1
20030174256 Kim Sep 2003 A1
20030174871 Yoshioka et al. Sep 2003 A1
20030186157 Teraoka et al. Oct 2003 A1
20030209293 Sako et al. Nov 2003 A1
20030214481 Xiong Nov 2003 A1
20030215116 Brandt et al. Nov 2003 A1
20030224553 Manansala Dec 2003 A1
20040012773 Puttkammer Jan 2004 A1
20040014457 Stevens Jan 2004 A1
20040022001 Chu et al. Feb 2004 A1
20040042642 Bolle et al. Mar 2004 A1
20040050930 Rowe Mar 2004 A1
20040066613 Leitao Apr 2004 A1
20040076313 Bronstein et al. Apr 2004 A1
20040076314 Cheng Apr 2004 A1
20040081339 Benkley Apr 2004 A1
20040096086 Miyasaka et al. May 2004 A1
20040113956 Bellwood et al. Jun 2004 A1
20040120400 Linzer Jun 2004 A1
20040125990 Goodman et al. Jul 2004 A1
20040125993 Zhao et al. Jul 2004 A1
20040128521 Russo Jul 2004 A1
20040129787 Saito et al. Jul 2004 A1
20040136612 Meister et al. Jul 2004 A1
20040148526 Sands et al. Jul 2004 A1
20040156538 Greschitz et al. Aug 2004 A1
20040172339 Snelgrove et al. Sep 2004 A1
20040179718 Chou Sep 2004 A1
20040184641 Nagasaka et al. Sep 2004 A1
20040186882 Ting Sep 2004 A1
20040190761 Lee Sep 2004 A1
20040208346 Baharav et al. Oct 2004 A1
20040208347 Baharav et al. Oct 2004 A1
20040208348 Baharav et al. Oct 2004 A1
20040213441 Tschudi Oct 2004 A1
20040215689 Dooley et al. Oct 2004 A1
20040228505 Sugimoto Nov 2004 A1
20040228508 Shigeta Nov 2004 A1
20040230536 Fung et al. Nov 2004 A1
20040240712 Rowe et al. Dec 2004 A1
20040252867 Lan et al. Dec 2004 A1
20040257196 Kotzin Dec 2004 A1
20040258282 Bjorn et al. Dec 2004 A1
20040263479 Shkolnikov Dec 2004 A1
20050012714 Russo et al. Jan 2005 A1
20050031174 Ryhanen et al. Feb 2005 A1
20050036665 Higuchi Feb 2005 A1
20050041841 Yoo et al. Feb 2005 A1
20050041885 Russo Feb 2005 A1
20050122785 Umeda et al. Feb 2005 A1
20050047485 Khayrallah et al. Mar 2005 A1
20050089200 Nysaether Apr 2005 A1
20050100196 Scott et al. May 2005 A1
20050100938 Hofmann et al. May 2005 A1
20050109835 Jacoby et al. May 2005 A1
20050110103 Setlak May 2005 A1
20050111708 Chou May 2005 A1
20050123176 Ishii et al. Jun 2005 A1
20050136200 Durell et al. Jun 2005 A1
20050139656 Arnouse Jun 2005 A1
20050149386 Agura Jul 2005 A1
20050151065 Min Jul 2005 A1
20050162402 Watanachote Jul 2005 A1
20050169503 Howell et al. Aug 2005 A1
20050178827 Shatford Aug 2005 A1
20050179657 Russo et al. Aug 2005 A1
20050198377 Ferguson et al. Sep 2005 A1
20050205985 Smith et al. Sep 2005 A1
20050210271 Chou et al. Sep 2005 A1
20050219200 Weng Oct 2005 A1
20050220329 Payne et al. Oct 2005 A1
20050221798 Sengupta et al. Oct 2005 A1
20050231213 Chou et al. Oct 2005 A1
20050238212 Du et al. Oct 2005 A1
20050244038 Benkley Nov 2005 A1
20050244039 Geoffroy et al. Nov 2005 A1
20050247559 Frey et al. Nov 2005 A1
20050249386 Juh Nov 2005 A1
20050254694 Goodman et al. Nov 2005 A1
20050258952 Utter et al. Nov 2005 A1
20050259851 Fyke Nov 2005 A1
20050259852 Russo Nov 2005 A1
20050269402 Spitzer et al. Dec 2005 A1
20050281441 Martinsen et al. Dec 2005 A1
20060002597 Rowe Jan 2006 A1
20060006224 Modi Jan 2006 A1
20060034043 Hisano et al. Feb 2006 A1
20060055500 Burke et al. Mar 2006 A1
20060066572 Yumoto et al. Mar 2006 A1
20060076926 Lee Apr 2006 A1
20060078174 Russo Apr 2006 A1
20060078176 Abiko et al. Apr 2006 A1
20060083411 Benkley Apr 2006 A1
20060097991 Hotelling et al. May 2006 A1
20060103633 Gioeli May 2006 A1
20060110013 Lee May 2006 A1
20060110537 Huang et al. May 2006 A1
20060140461 Kim et al. Jun 2006 A1
20060141960 Fernandez et al. Jun 2006 A1
20060144953 Takao Jul 2006 A1
20060170528 Fukushige et al. Aug 2006 A1
20060187200 Martin Aug 2006 A1
20060190737 Miyasaka Aug 2006 A1
20060210082 Devadas et al. Sep 2006 A1
20060214512 Iwata Sep 2006 A1
20060214921 Takahashi et al. Sep 2006 A1
20060239514 Watanabe et al. Oct 2006 A1
20060242268 Omernick et al. Oct 2006 A1
20060244722 Gust Nov 2006 A1
20060249008 Luther Nov 2006 A1
20060259873 Mister Nov 2006 A1
20060261174 Zellner et al. Nov 2006 A1
20060261449 Rapport et al. Nov 2006 A1
20060267385 Steenwyk et al. Nov 2006 A1
20060271793 Devadas et al. Nov 2006 A1
20060280346 Machida Dec 2006 A1
20060287963 Steeves et al. Dec 2006 A1
20070014443 Russo Jan 2007 A1
20070021198 Muir et al. Jan 2007 A1
20070031011 Erhart et al. Feb 2007 A1
20070034783 Eliasson et al. Feb 2007 A1
20070036400 Watanabe et al. Feb 2007 A1
20070057763 Blattner et al. Mar 2007 A1
20070061126 Russo et al. Mar 2007 A1
20070067828 Bychkov Mar 2007 A1
20070072631 Mock et al. Mar 2007 A1
20070076923 Chiu Apr 2007 A1
20070076926 Schneider et al. Apr 2007 A1
20070076951 Tanaka et al. Apr 2007 A1
20070086634 Setlak et al. Apr 2007 A1
20070090312 Stallinga et al. Apr 2007 A1
20070125937 Eliasson et al. Jun 2007 A1
20070138299 Mitra Jun 2007 A1
20070180261 Akkermans et al. Aug 2007 A1
20070196002 Choi et al. Aug 2007 A1
20070198141 Moore Aug 2007 A1
20070198435 Siegal et al. Aug 2007 A1
20070210895 Kuhlman Sep 2007 A1
20070211923 Kuhlman Sep 2007 A1
20070228154 Tran Oct 2007 A1
20070236330 Cho et al. Oct 2007 A1
20070237366 Maletsky Oct 2007 A1
20070248249 Stoianov Oct 2007 A1
20070274575 Russo Nov 2007 A1
20070292005 Lo et al. Dec 2007 A1
20080002867 Mathiassen et al. Jan 2008 A1
20080013803 Lo et al. Jan 2008 A1
20080013805 Sengupta et al. Jan 2008 A1
20080013808 Russo et al. Jan 2008 A1
20080019578 Saito et al. Jan 2008 A1
20080030207 Vermesan et al. Feb 2008 A1
20080042813 Wheatley Feb 2008 A1
20080042814 Hurwitz Feb 2008 A1
20080049980 Castaneda et al. Feb 2008 A1
20080049987 Champagne et al. Feb 2008 A1
20080049989 Iseri et al. Feb 2008 A1
20080062140 Hotelling et al. Mar 2008 A1
20080063245 Benkley et al. Mar 2008 A1
20080069412 Champagne et al. Mar 2008 A1
20080101662 Lo et al. May 2008 A1
20080101663 Lo et al. May 2008 A1
20080101705 Mohamed et al. May 2008 A1
20080103995 Mohamed et al. May 2008 A1
20080126260 Cox et al. May 2008 A1
20080133373 Perdomo et al. Jun 2008 A1
20080138078 Alameh et al. Jun 2008 A1
20080138079 Mui et al. Jun 2008 A1
20080154816 Xiao et al. Jun 2008 A1
20080158178 Hotelling et al. Jul 2008 A1
20080159688 Schellinger et al. Jul 2008 A1
20080159698 Alameh et al. Jul 2008 A1
20080159699 Zeiger et al. Jul 2008 A1
20080165139 Hotelling et al. Jul 2008 A1
20080165158 Hotelling et al. Jul 2008 A1
20080166026 Turek et al. Jul 2008 A1
20080169345 Keane et al. Jul 2008 A1
20080170695 Adler et al. Jul 2008 A1
20080174570 Jobs et al. Jul 2008 A1
20080175450 Scott Jul 2008 A1
20080178008 Takahashi et al. Jul 2008 A1
20080179112 Qin et al. Jul 2008 A1
20080185193 Lin Aug 2008 A1
20080185429 Saville Aug 2008 A1
20080201265 Hewton Aug 2008 A1
20080205714 Benkley et al. Aug 2008 A1
20080219521 Benkley et al. Sep 2008 A1
20080222049 Loomis et al. Sep 2008 A1
20080223925 Saito et al. Sep 2008 A1
20080226132 Gardner Sep 2008 A1
20080238878 Wang Oct 2008 A1
20080240523 Benkley et al. Oct 2008 A1
20080244277 Orsini et al. Oct 2008 A1
20080247652 Mohamed et al. Oct 2008 A1
20080265751 Smith Oct 2008 A1
20080267462 Nelson et al. Oct 2008 A1
20080273767 Lo et al. Nov 2008 A1
20080273769 Lo et al. Nov 2008 A1
20080279373 Erhart et al. Nov 2008 A1
20080279416 Lo et al. Nov 2008 A1
20080285813 Holm Nov 2008 A1
20080298648 Lo et al. Dec 2008 A1
20080309633 Hotelling et al. Dec 2008 A1
20080316183 Westerman et al. Dec 2008 A1
20090009486 Sato et al. Jan 2009 A1
20090016913 Smits Jan 2009 A1
20090056124 Krebs et al. Mar 2009 A1
20090060728 Grimes et al. Mar 2009 A1
20090074255 Holm Mar 2009 A1
20090083850 Fadell et al. Mar 2009 A1
20090130369 Huang et al. May 2009 A1
20090153297 Gardner Jun 2009 A1
20090154779 Satyan et al. Jun 2009 A1
20090155456 Benkley et al. Jun 2009 A1
20090024499 Ribble Jul 2009 A1
20090169071 Bond et al. Jul 2009 A1
20090169072 Lo et al. Jul 2009 A1
20090174974 Huang et al. Jul 2009 A1
20090210722 Russo Aug 2009 A1
20090228952 Gillig et al. Sep 2009 A1
20090237135 Ramaraju et al. Sep 2009 A1
20090244820 Kusaka et al. Oct 2009 A1
20090252384 Dean et al. Oct 2009 A1
20090252385 Dean et al. Oct 2009 A1
20090252386 Dean et al. Oct 2009 A1
20090273573 Hotelling Nov 2009 A1
20090273577 Chen et al. Nov 2009 A1
20090279742 Abiko Nov 2009 A1
20090314621 Hotelling Dec 2009 A1
20090319435 Little et al. Dec 2009 A1
20090324028 Russo Dec 2009 A1
20100019032 Kim Jan 2010 A1
20100026451 Erhart et al. Feb 2010 A1
20100026453 Yamamoto et al. Feb 2010 A1
20100030921 Kim Feb 2010 A1
20100045705 Vertegaal et al. Feb 2010 A1
20100050175 Jung et al. Feb 2010 A1
20100059295 Hotelling et al. Mar 2010 A1
20100083000 Kesanupalli Apr 2010 A1
20100085325 King-Smith et al. Apr 2010 A1
20100119124 Satyan May 2010 A1
20100123675 Ippel May 2010 A1
20100127366 Bond et al. May 2010 A1
20100146275 Slick et al. Jun 2010 A1
20100162386 Li et al. Jun 2010 A1
20100176823 Thompson et al. Jul 2010 A1
20100176892 Thompson et al. Jul 2010 A1
20100177940 Dean et al. Jul 2010 A1
20100180127 Li et al. Jul 2010 A1
20100180136 Thompson et al. Jul 2010 A1
20100189314 Benkley et al. Jul 2010 A1
20100191634 Macy et al. Jul 2010 A1
20100193257 Hotelling Aug 2010 A1
20100208953 Gardner et al. Aug 2010 A1
20100244166 Shibuta et al. Sep 2010 A1
20100245553 Schuler et al. Sep 2010 A1
20100272329 Benkley Oct 2010 A1
20100284565 Benkley Nov 2010 A1
20100318515 Ramanathan et al. Dec 2010 A1
20110002461 Erhart et al. Jan 2011 A1
20110018556 Le et al. Jan 2011 A1
20110060913 Hird et al. Mar 2011 A1
20110074734 Wassvik et al. Mar 2011 A1
20110080370 Wu Apr 2011 A1
20110082791 Baghdasaryan et al. Apr 2011 A1
20110082800 Baghdasaryan et al. Apr 2011 A1
20110082801 Baghdasaryan et al. Apr 2011 A1
20110082802 Baghdasaryan et al. Apr 2011 A1
20110083018 Kesanupalli et al. Apr 2011 A1
20110083170 Kesanupalli et al. Apr 2011 A1
20110083173 Baghdasaryan et al. Apr 2011 A1
20110102567 Erhart May 2011 A1
20110102569 Erhart May 2011 A1
20110138450 Kesanupalli et al. Jun 2011 A1
20110175703 Benkley Jul 2011 A1
20110176037 Benkley Jul 2011 A1
20110182486 Valfridsson et al. Jul 2011 A1
20110193799 Jun et al. Aug 2011 A1
20110214924 Perezselsky et al. Sep 2011 A1
20110215150 Schneider et al. Sep 2011 A1
20110242021 Jun et al. Oct 2011 A1
20110261003 Lee et al. Oct 2011 A1
20110267298 Erhart et al. Nov 2011 A1
20110285640 Park et al. Nov 2011 A1
20110298711 Dean et al. Dec 2011 A1
20110304001 Erhart et al. Dec 2011 A1
20120012652 Couper et al. Jan 2012 A1
20120044639 Garcia Feb 2012 A1
20120069042 Ogita et al. Mar 2012 A1
20120075252 Dighde et al. Mar 2012 A1
20120105081 Shaikh May 2012 A1
20120162099 You et al. Jun 2012 A1
20120182259 Han Jul 2012 A1
20120189166 Russo Jul 2012 A1
20120189172 Russo Jul 2012 A1
20120242635 Erhart et al. Sep 2012 A1
20120299863 Yilmaz Nov 2012 A1
20130069894 Chen Mar 2013 A1
20130106759 Fredriksen et al. May 2013 A1
20130135247 Na et al. May 2013 A1
20130181911 Yilmaz et al. Jul 2013 A1
20130181949 Setlak Jul 2013 A1
20130229379 Joguet et al. Sep 2013 A1
20130265137 Nelson et al. Oct 2013 A1
20140036168 Ludwig Feb 2014 A1
20140184525 Kim et al. Jul 2014 A1
Foreign Referenced Citations (141)
Number Date Country
2011100746 Apr 2011 AU
1247856 Mar 2000 CN
1251650 Apr 2000 CN
101383704 Oct 2001 CN
1538154 Oct 2004 CN
1841280 Oct 2006 CN
1889103 Jan 2007 CN
1942853 Apr 2007 CN
101046458 Oct 2007 CN
101515214 Aug 2009 CN
101582002 Nov 2009 CN
201936289 Aug 2011 CN
2213813 Oct 1973 DE
3216389 Nov 1983 DE
19606408 Jun 1997 DE
102005051530 Oct 2005 DE
0905646 Mar 1999 EP
0929028 Jul 1999 EP
0973123 Jan 2000 EP
1018697 Jul 2000 EP
1113405 Dec 2000 EP
1113383 Jul 2001 EP
1139301 Oct 2001 EP
1148446 Oct 2001 EP
1289239 Mar 2003 EP
1289239 Mar 2005 EP
1531419 May 2005 EP
1533759 May 2005 EP
1538548 Jun 2005 EP
1624399 Oct 2007 EP
1939788 Jul 2008 EP
2343677 Jul 2011 EP
2343677 Jul 2011 EP
2343679 Jul 2011 EP
2343679 Jul 2011 EP
2348472 Jul 2011 EP
2348472 Jul 2011 EP
2839173 Oct 2003 FR
2331613 May 1999 GB
2480919 Dec 2011 GB
62-226030 Oct 1987 JP
04158434 Jun 1992 JP
10-020992 Jan 1998 JP
11-164824 Jun 1999 JP
2000-57328 Feb 2000 JP
2001-077342 Mar 2001 JP
2003-511799 Mar 2003 JP
2005242856 Sep 2005 JP
2006-271028 Oct 2005 JP
2006-184104 Jul 2006 JP
2006-517023 Jul 2006 JP
2007-010338 Jan 2007 JP
2007-018168 Jan 2007 JP
2008-134836 Jun 2008 JP
2009-516295 Apr 2009 JP
09071135 Apr 2009 JP
2011022788 Feb 2011 JP
2011059793 Mar 2011 JP
20050080628 Aug 2005 KR
201112068 Jan 2011 TW
201120507 Jun 2011 TW
WO 9003620 Apr 1990 WO
WO 9815225 Apr 1998 WO
WO 9852145 Nov 1998 WO
WO 9852146 Nov 1998 WO
WO 9852147 Nov 1998 WO
WO 98521517 Nov 1998 WO
WO 9858342 Dec 1998 WO
WO 9928701 Jun 1999 WO
WO 9928701 Jun 1999 WO
WO 9943258 Sep 1999 WO
WO 0068873 Nov 2000 WO
WO 0068874 Nov 2000 WO
WO 0072507 Nov 2000 WO
WO 0109819 Feb 2001 WO
WO 0109936 Feb 2001 WO
WO 0122349 Mar 2001 WO
WO 01027868 Apr 2001 WO
WO 0129731 Apr 2001 WO
WO 0139134 May 2001 WO
WO 0194966 Jun 2001 WO
WO 0169520 Sep 2001 WO
WO 0173678 Oct 2001 WO
WO 0177994 Oct 2001 WO
WO 0180166 Oct 2001 WO
WO 0194902 Dec 2001 WO
WO 0195304 Dec 2001 WO
WO 0195305 Dec 2001 WO
WO 0199035 Dec 2001 WO
WO 0199036 Dec 2001 WO
WO 0394892 Dec 2001 WO
WO 0211066 Feb 2002 WO
WO 0215209 Feb 2002 WO
WO 0215267 Feb 2002 WO
WO 0244998 Jun 2002 WO
WO 0247018 Jun 2002 WO
WO 0261668 Aug 2002 WO
WO 02069386 Sep 2002 WO
WO 02071313 Sep 2002 WO
WO 02073375 Sep 2002 WO
WO 02077907 Oct 2002 WO
WO 02086800 Oct 2002 WO
WO 02093462 Oct 2002 WO
WO 02095349 Nov 2002 WO
WO 03007127 Jan 2003 WO
WO 03017211 Feb 2003 WO
WO 03049011 Jun 2003 WO
WO 03049012 Jun 2003 WO
WO 03049016 Jun 2003 WO
WO 03049104 Jun 2003 WO
WO 03050963 Jun 2003 WO
WO 03063054 Jul 2003 WO
WO 03075210 Sep 2003 WO
WO 03075210 Dec 2003 WO
WO 2004066194 Aug 2004 WO
WO 2004066693 Aug 2004 WO
WO 2004066693 Aug 2004 WO
WO 2005104012 Nov 2005 WO
WO 2005104012 Nov 2005 WO
WO 2005106774 Nov 2005 WO
WO 2006040724 Apr 2006 WO
WO 2006041780 Apr 2006 WO
WO 2007011607 Jan 2007 WO
WO 2007048395 May 2007 WO
WO 2007058727 May 2007 WO
WO 0165470 Sep 2007 WO
WO 2008033264 Mar 2008 WO
WO 2008033265 Mar 2008 WO
WO 2008137287 Nov 2008 WO
WO 2009002599 Dec 2008 WO
WO 2009079219 Jun 2009 WO
WO 2009079221 Jun 2009 WO
WO 2009079257 Jun 2009 WO
WO 2009079262 Jun 2009 WO
WO 2010034036 Mar 2010 WO
WO 2010036445 Apr 2010 WO
WO 2010143597 Dec 2010 WO
WO 2011035491 Mar 2011 WO
WO 2011053797 May 2011 WO
WO 2011088252 Jul 2011 WO
WO 2012014206 Feb 2012 WO
Non-Patent Literature Citations (74)
Entry
Biometrics, “A Responsive Supply Chain”, by Andrew Conry-Murray, 5 pages, posted Jul. 7, 2002, as printed Aug. 23, 2004, at http://www.networkmagazine.com/shared/article/ID-17601104.
Ballard and Brown, “Computer Vision”, Prentice Hall, 1982, pp. 65-69.
S. Shigematsu et al., “A Single-Chip Fingerprint Sensor and Identifier”, IEEE Journal of Solid-State Circuits, vol. 34, No. 12, Dec. 1999, pp. 1852-1859.
“Fingernail Touch Sensors: Spatially Distributed Measurement and Hemodynamic Modeling”, Stephen Mascaro and H. Harry Asada, 2000 IEEE, pp. 3422-3427.
Jacob O. Wobbrock, Brad A. Myers, Htet Htet Aung, and Edmond F. LoPresti, Text Entry from Power Wheelchairs: EdgeWrite for Joysticks and Touchpads, pp. 110-117, Human-Computer Interaction Institute, School of Computer Science, Carnegie Mellon University, Pittsburgh, PA 15213 USA.
Bartholomew J. Kane, “A High Resolution Traction Stress Sensor Array For Use In Robotic Tactile Determination”, A Dissertation Submitted to the Department of Mechanical Engineering and the Committee on Graduate Studies of Stanford University in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy, Sep. 1999.
Choonwoo Ryu et al. “Super-template Generation Using Successive Bayesian Estimation for Fingerprint Enrollment”, Jan. 2005 Springer-Verlag Berlin Heidelberg, pp. 710-719.
Dongjae Lee et al. “Fingerprint Fusion Based on Minutiae and Ridge for Enrollment”, Jan. 2000 Springer-Verlag Berlin Heidelberg, pp. 478-485.
Koichi Sasakawa et al. “Personal Verification System with High Tolerance of Poor Quality Fingerprints”, May 1990 Machine Vision Systems Integration in Industry, pp. 265-272.
Michal Irani et al., “Improving Resolution by Image Registration”, May 1991 by Academic Press, Inc., pp. 231-239.
Qinfen Zheng et al. “A Computational Vision Approach to Image Registration”, Aug. 1992 IEEE, pp. 311-326.
Wei-Yun Yau et al. “Fingerprint Templates Combination”, Jan. 2004 Springer-Verlag Berlin Heidelberg, pp. 449-460.
Xudong Jiang et al. “Detecting the Fingerprint Minutiae by Adaptive Tracing the Gray-Level Ridge”, May 2001, pp. 999-1013, Published by Elsevier Science Ltd.
Xudong Jiang et al. “Fingerprint Minutiae Matching Based on the Local Global Structures”, Sep. 2000 IEEE, pp. 1038-1041.
I-Control, PDS 3000 TM Product Brief “Mobile Finger-Based Control Sensor”, 12 pages, Jul. 2003.
Saunvil Pandya et al. “CORAL: Miniature Acoustic Communication Subsystem Architecture for Underwater Wireless Sensor Networks” University of Illinois at Urbana-Champaign Micro and Nanotechnology Laboratory.
Search Report dated Sep. 26, 2002 for International Application No. PCT/US2001/46525.
Search Report dated Aug. 9, 2005 for International Application No. PCT/US2005/012792.
Search Report dated Dec. 12, 2005 for International Application No. PCT/US2005/013943.
Search Report dated Dec. 22, 2005 for European Application No. 05021634.0-2218.
Davide Maltoni, “Handbook of Fingerprint Recognition”, XP002355942 Springer, New York, USA, Jun. 2003, pp. 65-69.
Vermesan et al., “A 500-dpi AC Capacitive Hybrid Flip-Chip CMOS ASIC/Sensor Module for Fingerprint, Navigation, and Pointer Detection With On-Chip Data Processing”, IEEE Journal of Solid State Circuits, vol. 38, No. 12, Dec. 2003, pp. 2288-2296.
International Search Report and Written Opinion dated Jan. 30, 2006 for Application No. PCT/US2005/035504.
Matsumoto et al., Impact of Artificial “Gummy” Fingers on Fingerprint Systems, SPIE 4677 (2002), reprinted from cryptome.org.
Maltoni, “Handbook of Fingerprint Recognition”, XP002355942 Springer, New York, USA, Jun. 2003, pp. 65-69.
Ratha, et al. “Adaptive Flow Orientation Based Feature Extraction in Fingerprint Images,” Pattern Recognition, vol. 28, No. 11, 1657-1672, Nov. 1995.
Ratha, et al., “A Real Time Matching System for Large Fingerprint Databases,” IEEE, Aug. 1996.
Suh, et al., “Design and Implementation of the AEGIS Single-Chip Secure Processor Using Physical Random Functions”, Computer Architecture, 2005, ISCA '05, Proceedings, 32nd International Symposium, Jun. 2005 (MIT Technical Report CSAIL CSG-TR-843, 2004).
Rivest, et al., “A Method for Obtaining Digital Signatures and Public-Key Cryptosystems”, Communications of the ACM, vol. 21, No. 2, pp. 120-126 (1978).
Hiltgen, et al., “Secure Internet Banking Authentication”, IEEE Security and Privacy, IEEE Computer Society, New York, NY, US, Mar. 1, 2006, pp. 24-31, XP007908655, ISSN: 1540-7993.
Hegt, “Analysis of Current and Future Phishing Attacks on Internet Banking Services”, Master Thesis, Technische Universiteit Eindhoven—Department of Mathematics and Computer Science, May 31, 2008, pp. 1-149, XP002630374, Retrieved from the Internet: URL:http://alexandria.tue.nl/extra1/afstversl/wsk-i/hegt2008.pdf [retrieved on Mar. 29, 2011] pp. 127-137, paragraph 6.2.
Gassend, et al., “Controlled Physical Random Functions”, In Proceedings of the 18th Annual Computer Security Applications Conference, Las Vegas, Nevada, Dec. 12, 2002.
bellagiodesigns.com (Internet Archive Wayback Machine, www.bellagiodesigns.com date: Oct. 29, 2005).
Hegt, “Analysis of Current and Future Phishing Attacks on Internet Banking Services”, Master Thesis, Technische Universiteit Eindhoven—Department of Mathematics and Computer Science, May 31, 2008, pp. 1-149, XP002630374, Retrieved from the Internet: URL:http://alexandria.tue.nl/extral/afstversl/wsk-i/hgt2008.pdf [retrieved on Mar. 3, 2011] pp. 127-134, paragraph 6.2.
Wikipedia (Mar. 2012). “Integrated Circuit,” http://en.wikipedia.org/wiki/integrated_circuit. Revision as of Mar. 23, 2012.
Closed Loop Systems, The Free Dictionary, http://www.thefreedictionary.com/closed-loop+system (downloaded Mar. 28, 2012).
Feedback: Electronic Engineering, Wikipedia, p. 11 http://en.wikipedia.org/wiki/Feedback#Electronic_engineering (downloaded Mar. 28, 2012).
ITD, “Anti-Money Laundering”, ITD, Jan. 22, 2009.
Galy et al. (Jul. 2007) “A full fingerprint verification system for a single-line sweep sensor.” IEEE Sensors J., vol. 7, No. 7, pp. 1054-1065.
Final Office Action issued in U.S. Appl. No. 13/860,494, 35 pages (dated May 4, 2017).
Office Action (with English translation) issued in Korean Patent Application No. 10-2012-7021324, 20 pages (dated May 30, 2017).
Office Action (and English language summary) with Search Report issued in Chinese Patent Application No. 201380030318.8, 12 pages (dated May 25, 2017).
Non-final Office Action issued in U.S. Appl. No. 14/243,116, 24 pages (dated Jun. 15, 2017).
Final Office Action issued in U.S. Appl. No. 14/243,122, 24 pages (dated Jun. 16, 2017).
Notice of Allowance issued in U.S. Appl. No. 15/357,019, 92 pages (dated Jun. 9, 2017).
Advisory Action issued in U.S. Appl. No. 13/860,494, 5 pages (dated Jun. 24, 2016).
Chinese Office Action dated Dec. 1, 2014, CN Application No. 201180014263.2, 7 pages.
Chinese Office Action dated Dec. 3, 2014, CN Application No. 201180014237.X, 9 pages.
Chinese Office Action dated Mar. 2, 2016, CN Application No. 201180014263.2, 8 pages.
Chinese Office Action dated Nov. 11, 2015, CN Application No. 201180014237.X, 6 pages.
Corrected Notice of Allowability issued in U.S. Appl. No. 13/620,271, 12 pages (dated Sep. 18, 2015).
Corrected Notice of Allowability issued in U.S. Appl. No. 13/620,271, 6 pages (dated Oct. 2, 2015).
Extended European Search Report dated Mar. 2, 2016, Application No. 15179001.1 12 pages.
Extended European Search Report issued in EP Patent Application No. 15179015.1, 7 pages (dated Nov. 24, 2015).
Extended European Search Report issued in European Patent Application No. 13775647.4, 7 pages (dated Oct. 1, 2015).
Extended European Search Report issued in European Patent Application No. 15179001, 7 pages (dated Nov. 9, 2015).
Final Office Action dated Nov. 18, 2015, U.S. Appl. No. 13/801,991, 12 pages.
Final Office Action issued in U.S. Appl. No. 13/860,494, 42 pages (dated Apr. 15, 2016).
Final Office Action issued in U.S. Appl. No. 14/243,116, 22 pages (dated Feb. 10, 2017).
Japanese Office Action (with English translation attached) dated Feb. 24, 2016, JP Application No. 2012-549090, 5 pages.
Japanese Office Action dated Dec. 2, 2014, JP Application No. 2012-549090, 15 pages.
Japanese Office Action dated Sep. 2, 2014, JP Application No. 2012-549092, 8 pages.
Non-final Office Action issued in U.S. Appl. No. 13/801,991, 24 pages (dated Apr. 9, 2015).
Notice of Allowance issued in U.S. Appl. No. 13/620,271, 21 pages (dated Apr. 21, 2015).
Notice of Allowance issued in U.S. Appl. No. 13/620,271, 127 pages (dated Aug. 3, 2015).
Office Action issued in U.S. Appl. No. 13/860,494, 98 pages (dated Oct. 7, 2015).
Office Action issued in U.S. Appl. No. 13/860,494, 25 pages (dated Sep. 8, 2016).
Office Action issued in U.S. Appl. No. 14/243,116, 112 pages (dated Aug. 10, 2016).
Office Action issued in U.S. Appl. No. 14/243,122 dated Dec. 2, 2016, 16 pages.
Taiwanese Office Action dated Nov. 16, 2015 (with English translation attached), TW Application No. 100101376, 10 pages.
Taiwanese Search Report dated Jan. 16, 2015, TW Application No. 099146099, Filed: Dec. 27, 2010, 1 page.
Examination Report issued in European Patent Application No. 13775647.4, 4 pages (dated Nov. 16, 2017).
U.S. Office Action dated Jan. 29, 2018 issued in U.S. Appl. No. 13/860,494.
Chinese Office Action dated Jan. 31, 2018 issued in Chinese Patent Application No. 201380030318.8 (with Explanation of Relevance).
Related Publications (1)
Number Date Country
20170308228 A1 Oct 2017 US
Provisional Applications (1)
Number Date Country
61622474 Apr 2012 US
Divisions (1)
Number Date Country
Parent 14243122 Apr 2014 US
Child 15590454 US
Continuations (1)
Number Date Country
Parent 13860494 Apr 2013 US
Child 14243122 US