1. Field of the Invention
The present invention relates generally to biometrics.
2. Related Art
Biometrics are a group of technologies that provide a high level of security. Fingerprint capture and recognition is an important biometric technology. Law enforcement, banking, voting, and other industries increasingly rely upon fingerprints as a biometric to recognize or verify identity. See, Biometrics Explained, v. 2.0, G. Roethenbaugh, International Computer Society Assn., Carlisle, Pa., 1998, pages 1-34 (incorporated herein by reference in its entirety). Handheld, mobile print scanners have been used to capture a fingerprint. Such a print scanner can be held by an individual and carried around to allow fingerprint capture at a variety of locations. Conventional fingerprint scanners, however, have been limited to fingerprint capture. Oftentimes, captured prints were stored for later download. An official using such a handheld scanner could capture a print of another person, but could not readily verify that person's identity in real time. In those situations, prints were downloaded and transmitted to another device for extracting and matching processes.
Therefore, a handheld system is needed that allows a user to readily verify a person's identity in real-time without having to forward captured print information to a remote device for extract and match processing.
Embodiments of the present invention provide a system including a handheld identification device including a code and a handheld, mobile reading and scanning device, said reading and scanning device including a code reader that reads said code and a biometric reader that captures a biometric image of an individual holding said handheld device.
Embodiments of the present invention include a mobile handheld biometric security device including a code reader that reads a code representation of biometric data provided on a handheld device and a biometric reader that captures biometric images of an individual holding the handheld device.
Embodiments of the present invention provide a method including reading and storing live biometric data of an individual. The method also includes reading code data associated with a handheld device held by said individual, storing said read code data, and decoding said code data and storing said decoded data. The method further includes extracting live biometric data from said stored live biometric data and determining whether said extracted live biometric data matches with said decoded data.
Embodiments of the present invention provide a method for using biometric and other data for identity verification with a handheld, mobile reading and scanning device and a handheld device.
Embodiments of the present invention include a method of confirming an identity of an individual using a reader/scanner device. The method includes reading fingerprint data of the individual, extracting minutia from said fingerprint data, and saving said minutia in a memory. The method also includes reading a bar code on a handheld device held by the individual, storing said bar code data in said memory, converting said code data into decoded data, and storing said decoded data in said memory. The method further includes determining whether said minutia matches with said decoded data.
Further embodiments, features, and advantages of the invention, as well as the structure and operation of the various embodiments of invention, are described in detail below with reference to the accompanying drawings.
The accompanying drawings, which are incorporated herein and form a part of the specification, illustrate the present invention and, together with the description, further serve to explain the principles of the invention and to enable a person skilled in the pertinent art to make and use the invention.
The present invention will now be described with reference to the accompanying drawings. In the drawings, like reference numbers indicate identical or functionally similar elements. Additionally, the left-most digit(s) of a reference number identifies the drawing in which the reference number first appears.
Embodiments of the present invention provide a reading and scanning device that scans and stores biometric data of an individual with a biometric scanner section. A code reading section reads and stores machine readable code data. The machine readable code can be associated with a handheld device held by the individual whose biometric data is scanned. The reading and scanning device then decodes the code data and stores decoded data. The biometric data is extracted, either within the reading and scanning device or remotely after being transmitted to a remote device. The extracted biometric data is compared to the decoded data, either within the reading and scanning device or remotely after the decoded data is transmitted to a remote device, to determine if the individual is the person listed on the handheld device. If not, the biometric data can be used to determine the person's identity if he/she is in a main database.
Through use of the handheld device, an individual can be identified or determined not to be the person shown on the handheld device. The code on the handheld device represents previously read and extracted biometric data of the individual. This allows the reading and scanning device, and possibly a remote system, to confirm the identity of a person quickly based on the handheld device alone. If the handheld device does not belong to the person, then an additional step can be taken to check the live read biometric data against the main database to determine the individual's identity.
Terminology
The use of “biometric data” throughout the specification is meant to include all types of data that is imaged from a part of an individual, such as taking an image of a fingerprint with a camera as a bit map. The use of “extracted biometric data” is meant to include all types of characteristics extracted from the image or bit map, such as minutia of a fingerprint determined from the fingerprint image. Other examples can be retina imaging and extraction, limb length imaging and extraction, etc.
As used herein, the terms “biometric scanner” and “fingerprint scanner” are used to refer to a scanner that scans a biometric or fingerprint and then processes the image data or transmits the image data to a host processor. Such a fingerprint scanner can be a remote fingerprint scanner, where “remote” is meant to imply that the fingerprint scanning can take place at a location physically separate from the host processor. A remote fingerprint scanner and a host processor may be considered physically separate even though they may be connected through a data interface, permanent or otherwise. It is to be understood that any aspect of an individual's body that allows for biometric scanning can be scanned; thus, the invention is not limited to fingerprints.
As used herein, the term “fingerprint capture event” is used to refer to a single act of capturing a fingerprint image with a fingerprint scanner. This term is not meant to imply any temporal limitations but is instead intended to refer to the event along with the particular characteristics of the event that can change from event to event. Such characteristics include the particular finger and its physical characteristics as well as other factors like the cleanliness of the image capture surface that can affect fingerprint capture.
As used herein, the term “fingerprint image” is used to refer to any type of detected fingerprint image including, but not limited to, an image of all or part of one or more fingerprints, toe prints, foot prints, a rolled fingerprint, a flat stationary fingerprint, a palm print, a hand print, and/or prints of multiple fingers or toes.
As used herein, the term “acceptable fingerprint image” is used to refer to a fingerprint image that has both acceptable darkness as well as acceptable definition. The particular acceptable darkness and definition levels are not critical and can be determined by one skilled in the relevant art given this disclosure, as discussed herein.
As used herein, the term “code” is used to refer to any type of machine readable code. Some examples are one- and two-dimensional bar codes. Also, the term “handheld device” is used to refer to any type of smartcard device or identification device, such as a driver's license, a government issued identification card, a privately issued identification card, a privately or publicly issued badge, etc.
Auto-Capture System
As discussed above, the light reflected towards camera 215 by fingerprint capture surface 210 is representative of the contact of a finger with fingerprint capture surface 210. Specifically, contact of ridges on a finger with fingerprint capture surface 210 results in light being reflected in areas corresponding to that contact. Thus, the quality of the contact plays a role in the quantity of reflected light. This contact quality is affected by the dryness of the subject's skin, the cleanliness of the fingerprint capture surface 210, the pressure applied by the subject, and the like. Camera 215 captures the reflected light within, for example, an array of photo-sensitive pixels. The image is then stored in a memory 220. Memory 220 can include both non-volatile and volatile memory. In one example, memory 220 includes non-volatile memory that stores the executable code necessary for device operation and volatile memory for storing data representative of the captured image. Any type of non-volatile memory may be used, for example an electrically-erasable programmable read-only memory (EEPROM) or a flash erasable programmable read-only memory (Flash-EPROM), though the invention is not limited to these specific types of non-volatile memory. Volatile memory can be a random-access memory for storing detected fingerprint images. For example, the image can be stored as an array of values representing a gray-scale value associated with each pixel. Other types of memory (flash memory, floppy drives, disks, mini-floppy drives, etc.) can be used in alternative embodiments of the present invention. Memory 220 can also include mini-floppy drives (such as those available from Sandisk Corp. or Intel Corp.). In this way, multiple prints can be stored locally. This is especially important in border control, crime scene, and accident site applications.
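The gray-scale storage scheme described above can be sketched as follows. This is a minimal illustration, not the device's actual firmware; the 8-bit depth and the clamping of out-of-range intensities are assumptions.

```python
# Minimal sketch: storing a captured frame in memory 220 as an array
# of gray-scale values, one value per pixel (8-bit depth assumed).
def store_frame(pixels):
    """Clamp raw pixel intensities into 0-255 gray-scale values."""
    return [[min(max(int(p), 0), 255) for p in row] for row in pixels]

# A 2x3 toy frame; out-of-range intensities are clamped to the 8-bit range.
frame = store_frame([[0, 128, 300], [-5, 64, 255]])
```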
While camera 215 is responsive to light reflected from fingerprint capture surface 210, pixel light intensity is converted into a darkness level so that the stored image is like those appearing in
The fingerprint images illustrated in
Two points should be noted about the images of
Returning to the code reader and fingerprint scanner 200 of
As would be apparent to a person skilled in the art, other types of memory, circuitry and/or processing capability may be included within code reader and fingerprint scanner 200, examples of which include a frame grabber and an analog/digital converter. Also included in the code reader and fingerprint scanner 200 shown in
Similar to the discussion above in relation to the fingerprint capturing, a code image can be captured based on reflecting light from the light source 205 off a handheld device 250 (
It is to be appreciated that the reading and scanning device 200 may include separate light sources, cameras, memories, and controllers for the fingerprint system and the code reader system, which may be arranged based on the placement of the code reader window 212 or the fingerprint capture surface 210.
A discussion follows for a print and code capture and matching apparatus and routine according to embodiments of the present invention, as is shown in
Print and Code Capture and Matching Apparatus and Routine
Print and Code Capture and Matching Apparatus
Reading and scanning device 200 is ergonomically designed to fit the hand naturally. The oblong, cylindrical shape (similar to a flashlight) does not contain sharp edges. The device is small enough to be gripped by large or small hands without awkward or unnatural movement. The device is comfortable to use without muscle strain on the operator or subject. In one example, reading and scanning device 200 is 1.5×8.0×1.5 inches (height×length×width), weighs about 340 grams (12 oz.), and has an image capture surface 210 size of about 1″×1″.
The non-intimidating appearance of the reading and scanning device 200 is designed to resemble a typical flashlight—a device that is not generally threatening to the public. Reading and scanning device 200 has no sharp edges and is constructed of a light-weight aluminum housing that is coated with a polymer to give the device a “rubberized” feel. Because reading and scanning device 200 is small and lightweight, it may be carried on the officer's utility belt upon exiting a vehicle. The device is designed for one hand use, allowing the officer to have a free hand for protective actions. Reading and scanning device 200 is designed for harsh environments, withstanding dramatic temperature changes and non-intentional abuse.
As seen in
With continuing reference to FIG. 2B, if a FINGER button is depressed, the fingerprint scanning function is activated. Then, the reading and scanning device 200 awaits a finger to be introduced to the fingerprint capture surface 210. The digital (or analog) image is automatically captured when an acceptable image is detected. The image is then tested for quality of data prior to notifying the operator with an indication (e.g., visual indication and/or audible tone) for acceptance. A routine for automatically capturing an acceptable fingerprint image can be performed in accordance with the present invention, as is discussed elsewhere herein. The unit emits a tone to indicate a completed process.
Still with reference to
After performing the finger and code readings, an officer may introduce the unit 200 to a docking station, which can include the processor 242 and database 244, blindly, maintaining his eyes on the subject for safety. Once seated in the docking station, the fingerprint and code are automatically transferred to a mobile computer, which can include the processor 242 and the database 244, without operator intervention.
The detected image is scalable to conform to FBI provided software (cropped or padded to 512 pixels by 512 pixels), although the standard image size is 1″×1″, 500 dpi, 256 levels of gray-scale (ANSI-NIST). Other details of an example reading and scanning device 200 can be found in U.S. Pat. No. 6,111,977 to Scott et al. and U.S. Published Application 2002/0021827-A1, to Scott et al., which are both incorporated herein by reference in their entirety.
Thus, reading and scanning device 200 can be held in either hand of a user, and used to capture both code data and a person's fingerprint to verify the person's identity. The fingerprint is captured from a cooperative individual (frontal approach) or an uncooperative individual (handcuffed subject—most commonly face down). Reading and scanning device 200 can be operated with one hand, allowing the officer to have a hand ready for protective actions. The officer need not have fingerprinting knowledge to capture the fingerprint. If the individual's handheld device 250 is available, then the code 252 can also be easily read and transmitted.
As discussed above, the integration time of camera 215 within code reader and fingerprint scanner 200 can be adjusted to compensate for light level changes introduced by variations in the contact quality between a finger and the fingerprint capture surface during any particular fingerprint capture event. Such compensation can be done automatically, i.e. without operator input, within the code reader and fingerprint scanner 200 according to a method that will next be described.
Print and Code Capture and Matching Method
Turning to
Another routine 800 for reading and matching biometric data and code data according to an embodiment of the present invention is shown in FIG. 8. Biometric data is read via the biometric reading surface 210 at step 802, extracted by the processor 225 at step 804, and stored in the memory 220 at step 806. Code data is read via the code reading window 212 and stored in the memory 220 at step 808. The code data image is converted to decoded data by the processor 225 at step 810. The decoded data is stored in memory 220 at step 812. A determination is made by processor 225 whether the extracted biometric data matches the decoded data at step 814. If they match, a PASS signal is sent at step 816. If they do not match, a FAIL signal is sent at step 818, and the extracted live biometric data is compared to all data in a database 244 to determine if it matches any of the data in the database at step 820. If a match is found, a MATCH signal is transmitted at step 822. If no match is found, a FAIL signal is sent at step 824.
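Routine 800 can be sketched schematically as follows. This is an outline under stated assumptions, not the device firmware: `extract`, `decode`, and `match` stand in for the processor 225 operations, the returned strings stand in for the PASS/FAIL/MATCH signals, and the main database 244 is modeled as a simple list of records.

```python
# Schematic sketch of routine 800 (assumed hooks, not actual firmware).
def verify_identity(live_biometric, code_image, extract, decode, match, database):
    extracted = extract(live_biometric)   # steps 802-806: read, extract, store
    decoded = decode(code_image)          # steps 808-812: read and decode code
    if match(extracted, decoded):         # step 814: compare live vs. decoded
        return "PASS"                     # step 816
    # Steps 818-820: on a mismatch, fall back to searching the main database.
    for record in database:
        if match(extracted, record):
            return "MATCH"                # step 822: identity found in database
    return "FAIL"                         # steps 818/824: no match anywhere
```

For example, with identity hooks and equality matching, a person holding their own handheld device yields "PASS", a stranger who is in the database yields "MATCH", and an unknown stranger yields "FAIL".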
Auto-Capture Method
In a next step 310 of the routine 300 shown in
In a next step 315 of the routine 300 of
Depending on the outcome of the image darkness test performed in step 315, a next step 325 or 330 is performed as shown in
If the image darkness test of step 315 results in an unacceptable darkness level, then a next step 325 of incrementing the image integration time and capturing another intermediate image at the incremented integration time is performed. The only exception to this step is when the integration time cannot be incremented because the intermediate fingerprint image was captured at the highest integration time. In such a case, the routine returns to step 305.
If the image integration time has been incremented and another intermediate image captured, the routine returns to step 315 to perform the darkness test again. Thus, routine 300 includes a loop with steps 315, 320, and 325 repeating until an intermediate image with an acceptable darkness level has been captured.
Once an intermediate fingerprint image with an acceptable darkness level has been captured, an image definition test is performed at a step 330. The image definition test used can be an image definition test according to the present invention and discussed below in connection with
Once the image definition test has been performed in step 330, one of two different steps is conducted based on the outcome of that test, as shown at step 335.
If the image definition test 330 indicated that the intermediate fingerprint image was of unacceptable definition, then the routine returns to step 325, discussed above. As with the above description of step 325, if the integration time cannot be incremented because the captured image was a result of the maximum integration time, routine 300 returns to step 305 to await a new initial fingerprint image.
If the image definition test 330 indicated that the intermediate fingerprint image was of acceptable definition, then the intermediate fingerprint image is acceptable in terms of both darkness and definition. Thus, in a final step 340, the intermediate fingerprint image that has passed both tests is taken as the acceptable fingerprint image and the routine is complete. In this way, routine 300 has automatically captured an acceptable fingerprint image. Step 340 can include a step of providing a signal that an acceptable fingerprint image has been captured. This signal can be audible, visible, or both.
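The auto-capture loop of routine 300 can be sketched as follows. This is a schematic outline, not firmware: `capture`, `darkness_ok`, and `definition_ok` are assumed hooks for the hardware and the two tests, and the discrete ladder of integration times is an assumption.

```python
# Schematic sketch of auto-capture routine 300 (assumed hooks).
def auto_capture(capture, darkness_ok, definition_ok, integration_times):
    """Return (image, time) for the first capture passing both tests,
    or None when the maximum integration time is exhausted (step 305)."""
    for t in integration_times:            # steps 310/325: step up the time
        image = capture(t)
        if not darkness_ok(image):         # steps 315/320: darkness test
            continue                       # increment and recapture
        if definition_ok(image):           # steps 330/335: definition test
            return image, t                # step 340: acceptable image
        # unacceptable definition: increment integration time and retry
    return None                            # await a new initial image
```

Note that after each increment the new intermediate image is re-tested for darkness before definition, matching the loop of steps 315, 320, and 325 described above.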
Details of an image darkness test and an image definition test in accordance with the present invention will now be described in terms of
In a next step 402 of the routine 400 shown in
In a next step 403, acceptable overall image darkness is verified. This verification can be done, for example, by verifying that a predetermined number of image darkness test lines have an associated average image darkness level above a threshold darkness level. In an embodiment, the predetermined number (or percentage) of image darkness test lines is eight (or 80% of the image darkness test lines). If eight image darkness test lines have an average image darkness level above the threshold darkness level, the overall image darkness is considered acceptable. Other numbers (or percentages) of image darkness lines can be used without departing from the scope of the present invention. Likewise, the particular threshold darkness level chosen is not critical and could be determined by one skilled in the relevant art given this disclosure. The acceptable darkness level can be based on the specific environment in which the fingerprint scanner is used as well as requirements associated with the field in which the fingerprint scanner is used and thus can be set by the manufacturer or user, as appropriate.
Once overall image darkness has been verified as acceptable in step 403, a next step 404 of verifying acceptability of image darkness distribution is performed. It should be noted that if the previous step 403 resulted in a determination that overall image darkness was not acceptable for the tested image, it is not necessary that routine 400 continue, but could instead stop at step 403. In step 404, image darkness distribution is tested. Despite the determination in step 403 that overall image darkness was acceptable, this darkness may have been concentrated in a particular region. For example, if all image darkness test lines in pairs 430-433, as shown in
Because step 404 of the routine 400 shown in
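The darkness test of routine 400 can be sketched as follows. The ten horizontal test lines and the 8-of-10 overall criterion follow the text above; the darkness threshold value and the specific distribution criterion in step 404 (requiring acceptably dark lines in both halves of the image) are assumptions for illustration.

```python
# Sketch of the image darkness test (routine 400); thresholds assumed.
def darkness_test(image, threshold=128, required=8, num_lines=10):
    # Step 402: pick evenly spaced horizontal test lines from the image.
    step = len(image) // (num_lines + 1)
    lines = [image[(i + 1) * step] for i in range(num_lines)]
    dark = [sum(line) / len(line) >= threshold for line in lines]
    if sum(dark) < required:        # step 403: overall image darkness
        return False
    # Step 404: darkness must not be concentrated in one region --
    # here, require dark test lines in both halves of the image (assumed).
    half = num_lines // 2
    return any(dark[:half]) and any(dark[half:])
```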
In a next step 502 of the routine 500 shown in
In a final step 503, the ridge counts of the image definition test lines determined in step 502 are used to verify image definition acceptability. This can be done, for example, by verifying that the ridge count for each image definition test line is greater than a threshold ridge count value associated with that image definition test line. The particular threshold ridge count values used are not critical and could be determined by one skilled in the relevant art given this disclosure. Rather than having a threshold ridge count value for each image definition test line, a single threshold ridge count value could be used for all the image definition test lines. As with acceptable image darkness, the acceptable image definition level can be based on the specific environment in which the fingerprint scanner is used as well as requirements associated with the field in which the fingerprint scanner is used, and thus can be set by the manufacturer or user, as appropriate.
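The definition test of routine 500 can be sketched as follows. Counting a ridge as a light-to-dark transition along a test line, and the particular threshold ridge count, are assumptions for illustration.

```python
# Sketch of the image definition test (routine 500); thresholds assumed.
def ridge_count(line, dark_level=128):
    """Count distinct dark runs (ridge crossings) along one test line."""
    count, in_ridge = 0, False
    for pixel in line:
        dark = pixel >= dark_level
        if dark and not in_ridge:
            count += 1              # entered a new ridge
        in_ridge = dark
    return count

def definition_test(test_lines, min_ridges=5):
    # Step 503: every test line must cross at least min_ridges ridges.
    return all(ridge_count(line) >= min_ridges for line in test_lines)
```

A single `min_ridges` value is used here for all test lines; per-line thresholds, as described above, would replace the scalar with a list.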
Companding Curves
In an embodiment, darkness level may be further changed based on a camera setting. The camera setting can be varied to adjust the integration time over a range of piecewise linear functions. The camera includes a set of look up tables that define the set of piecewise linear functions. For example, the set of piecewise linear functions may be companding curves, as used in a Motorola camera model number SCM20014. Companding curves allow for coring of lower order bits of captured image data. In effect, companding curves expand the value of lower signal levels, and compress higher signal levels, allowing for on-chip contrast adjustments. Furthermore, a companding function may perform data transformations, such as performing an 8-bit transformation on an incoming 10-bit data stream.
For example, as shown in
According to the present embodiment, one or more of steps 305 through 335 are performed on a first companding curve. The same steps are then performed on a second companding curve. This routine is repeated until the desired set of steps has been performed on all desired companding curves. For example, steps 305, 310, 315, 320, and 325 may be performed on all desired companding curves. Alternatively, steps 305, 310, 315, 320, 325, 330, 335 may be performed on all desired companding curves.
A user may select the set of companding curves to be used in a particular fingerprint image capturing system application, or a set of companding curves may be determined automatically, such as by a computer system. In this way, an acceptable fingerprint image is captured, having an image integration time and a companding curve selected to capture an optimum acceptable fingerprint image.
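A companding look-up table of the kind described above can be sketched as a piecewise linear mapping from 10-bit input levels to 8-bit output levels, with a steeper slope at low signal levels (expansion) than at high levels (compression). The specific knee points are assumptions for illustration, not values from the camera's data sheet.

```python
# Sketch of a companding look-up table: piecewise linear 10-bit -> 8-bit
# mapping that expands low signal levels and compresses high ones.
# The knee points are assumed values for illustration.
def build_companding_lut(knees=((0, 0), (256, 128), (1023, 255))):
    """Return a 1024-entry 10-bit -> 8-bit companding table."""
    lut = []
    for x in range(1024):
        for (x0, y0), (x1, y1) in zip(knees, knees[1:]):
            if x <= x1:             # find the segment containing x
                lut.append(y0 + (x - x0) * (y1 - y0) // (x1 - x0))
                break
    return lut

lut = build_companding_lut()
# Low input levels get more output codes per input step than high levels,
# which is the on-chip contrast adjustment described above.
```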
Conclusion
While various embodiments of the present invention have been described above, it should be understood that they have been presented by way of example only, and not limitation. It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined in the appended claims. Thus, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.
This application claims priority under 35 U.S.C. 119(e) to U.S. Prov. Appl. No. 60/373,606, filed Apr. 19, 2002, which is incorporated by reference herein in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
2500017 | Altman | Mar 1950 | A |
3200701 | White | Aug 1965 | A |
3475588 | McMaster | Oct 1969 | A |
3482498 | Becker | Dec 1969 | A |
3495259 | Rocholl et al. | Feb 1970 | A |
3527535 | Monroe | Sep 1970 | A |
3540025 | Levin et al. | Nov 1970 | A |
3617120 | Roka | Nov 1971 | A |
3699519 | Campbell | Oct 1972 | A |
3906520 | Phillips | Sep 1975 | A |
3947128 | Weinberger et al. | Mar 1976 | A |
3968476 | McMahon | Jul 1976 | A |
3975711 | McMahon | Aug 1976 | A |
4032975 | Malueg et al. | Jun 1977 | A |
4063226 | Kozma et al. | Dec 1977 | A |
4120585 | DePalma et al. | Oct 1978 | A |
4152056 | Swonger et al. | May 1979 | A |
4209481 | Kashiro et al. | Jun 1980 | A |
4210889 | Holce | Jul 1980 | A |
4253086 | Szwarcbier | Feb 1981 | A |
4322163 | Schiller | Mar 1982 | A |
4414684 | Blonder | Nov 1983 | A |
4537484 | Fowler et al. | Aug 1985 | A |
4544267 | Schiller | Oct 1985 | A |
4553837 | Marcus | Nov 1985 | A |
4601195 | Garritano | Jul 1986 | A |
4669487 | Frieling | Jun 1987 | A |
4681435 | Kubota et al. | Jul 1987 | A |
4684802 | Hakenewerth et al. | Aug 1987 | A |
4701772 | Anderson et al. | Oct 1987 | A |
4783823 | Tasaki et al. | Nov 1988 | A |
4784484 | Jensen | Nov 1988 | A |
4792226 | Fishbine et al. | Dec 1988 | A |
4811414 | Fishbine et al. | Mar 1989 | A |
4876726 | Capello et al. | Oct 1989 | A |
4905293 | Asai et al. | Feb 1990 | A |
4924085 | Kato et al. | May 1990 | A |
4933976 | Fishbine et al. | Jun 1990 | A |
4942482 | Kakinuma et al. | Jul 1990 | A |
4946276 | Chilcott | Aug 1990 | A |
4995086 | Lilley et al. | Feb 1991 | A |
5054090 | Knight et al. | Oct 1991 | A |
5067162 | Driscoll, Jr. et al. | Nov 1991 | A |
5067749 | Land | Nov 1991 | A |
5131038 | Puhl et al. | Jul 1992 | A |
5146102 | Higuchi et al. | Sep 1992 | A |
5157497 | Topper et al. | Oct 1992 | A |
5185673 | Sobol | Feb 1993 | A |
5187747 | Capello et al. | Feb 1993 | A |
5210588 | Lee | May 1993 | A |
5222152 | Fishbine et al. | Jun 1993 | A |
5222153 | Beiswenger | Jun 1993 | A |
5230025 | Fishbine et al. | Jul 1993 | A |
5233404 | Lougheed et al. | Aug 1993 | A |
5249370 | Stanger et al. | Oct 1993 | A |
5253085 | Maruo et al. | Oct 1993 | A |
5261266 | Lorenz et al. | Nov 1993 | A |
5285293 | Webb et al. | Feb 1994 | A |
5291318 | Genovese | Mar 1994 | A |
D348445 | Fishbine et al. | Jul 1994 | S |
5337043 | Gokcebay | Aug 1994 | A |
5351127 | King et al. | Sep 1994 | A |
D351144 | Fishbine et al. | Oct 1994 | S |
5363318 | McCauley | Nov 1994 | A |
5384621 | Hatch et al. | Jan 1995 | A |
5412463 | Sibbald et al. | May 1995 | A |
5416573 | Sartor, Jr. | May 1995 | A |
5448649 | Chen et al. | Sep 1995 | A |
5467403 | Fishbine et al. | Nov 1995 | A |
5469506 | Berson et al. | Nov 1995 | A |
5471240 | Prager et al. | Nov 1995 | A |
5473144 | Mathurin, Jr. | Dec 1995 | A |
5483601 | Faulkner | Jan 1996 | A |
5509083 | Abtahi et al. | Apr 1996 | A |
5517528 | Johnson | May 1996 | A |
5528355 | Maase et al. | Jun 1996 | A |
5548394 | Giles et al. | Aug 1996 | A |
5591949 | Bernstein | Jan 1997 | A |
5596454 | Hebert | Jan 1997 | A |
5598474 | Johnson | Jan 1997 | A |
5613014 | Eshera et al. | Mar 1997 | A |
5615277 | Hoffman | Mar 1997 | A |
5625448 | Ranalli et al. | Apr 1997 | A |
5640422 | Johnson | Jun 1997 | A |
5649128 | Hartley | Jul 1997 | A |
5650842 | Maase et al. | Jul 1997 | A |
5661451 | Pollag | Aug 1997 | A |
5680205 | Borza | Oct 1997 | A |
5689529 | Johnson | Nov 1997 | A |
5717777 | Wong et al. | Feb 1998 | A |
5729334 | Van Ruyven | Mar 1998 | A |
5736734 | Marcus et al. | Apr 1998 | A |
5745684 | Oskouy et al. | Apr 1998 | A |
5748766 | Maase et al. | May 1998 | A |
5748768 | Sivers et al. | May 1998 | A |
5755748 | Borza | May 1998 | A |
5757278 | Itsumi | May 1998 | A |
5767989 | Sakaguchi | Jun 1998 | A |
5778089 | Borza | Jul 1998 | A |
5781647 | Fishbine et al. | Jul 1998 | A |
5793218 | Oster et al. | Aug 1998 | A |
5801681 | Sayag | Sep 1998 | A |
5805777 | Kuchta | Sep 1998 | A |
5809172 | Melen | Sep 1998 | A |
5812067 | Bergholz et al. | Sep 1998 | A |
5815252 | Price-Francis | Sep 1998 | A |
5818956 | Tuli | Oct 1998 | A |
5822445 | Wong | Oct 1998 | A |
5825005 | Behnke | Oct 1998 | A |
5825474 | Maase | Oct 1998 | A |
5828773 | Setlak et al. | Oct 1998 | A |
5832244 | Jolley et al. | Nov 1998 | A |
5848231 | Teitelbaum et al. | Dec 1998 | A |
5855433 | Velho et al. | Jan 1999 | A |
5859420 | Borza | Jan 1999 | A |
5859710 | Hannah | Jan 1999 | A |
5862247 | Fisun et al. | Jan 1999 | A |
5867802 | Borza | Feb 1999 | A |
5869822 | Meadows, II et al. | Feb 1999 | A |
5872834 | Teitelbaum | Feb 1999 | A |
5892599 | Bahuguna | Apr 1999 | A |
5900993 | Betensky | May 1999 | A |
5907627 | Borza | May 1999 | A |
5920384 | Borza | Jul 1999 | A |
5920640 | Salatino et al. | Jul 1999 | A |
5928347 | Jones | Jul 1999 | A |
5942761 | Tuli | Aug 1999 | A |
5946135 | Auerswald et al. | Aug 1999 | A |
5960100 | Hargrove | Sep 1999 | A |
5973731 | Schwab | Oct 1999 | A |
5974162 | Metz et al. | Oct 1999 | A |
5987155 | Dunn et al. | Nov 1999 | A |
5991467 | Kamiko | Nov 1999 | A |
5995014 | DiMaria | Nov 1999 | A |
5999307 | Whitehead et al. | Dec 1999 | A |
6018739 | McCoy et al. | Jan 2000 | A |
6023522 | Draganoff et al. | Feb 2000 | A |
6038332 | Fishbine et al. | Mar 2000 | A |
6041372 | Hart et al. | Mar 2000 | A |
6055071 | Kuwata et al. | Apr 2000 | A |
6064398 | Ellenby et al. | May 2000 | A |
6064753 | Bolle et al. | May 2000 | A |
6064779 | Neukermans et al. | May 2000 | A |
6072891 | Hamid et al. | Jun 2000 | A |
6075876 | Draganoff | Jun 2000 | A |
6078265 | Bonder et al. | Jun 2000 | A |
6088585 | Schmitt et al. | Jul 2000 | A |
6097873 | Filas et al. | Aug 2000 | A |
6104809 | Berson et al. | Aug 2000 | A |
6115484 | Bowker et al. | Sep 2000 | A |
6122394 | Neukermans et al. | Sep 2000 | A |
6144408 | MacLean | Nov 2000 | A |
6150665 | Suga | Nov 2000 | A |
6154285 | Teng et al. | Nov 2000 | A |
6162486 | Samouilhan et al. | Dec 2000 | A |
6166787 | Akins et al. | Dec 2000 | A |
6178255 | Scott et al. | Jan 2001 | B1 |
6195447 | Ross | Feb 2001 | B1 |
6198836 | Hauke | Mar 2001 | B1 |
6204331 | Sullivan et al. | Mar 2001 | B1 |
6219439 | Burger | Apr 2001 | B1 |
6246751 | Bergl et al. | Jun 2001 | B1 |
6259108 | Antonelli et al. | Jul 2001 | B1 |
6272562 | Scott et al. | Aug 2001 | B1 |
6281931 | Tsao et al. | Aug 2001 | B1 |
6327047 | Motamed | Dec 2001 | B1 |
6347162 | Suzuki | Feb 2002 | B1 |
6404862 | Holt | Jun 2002 | B1 |
6505193 | Musgrave et al. | Jan 2003 | B1 |
6744910 | McClurg et al. | Jun 2004 | B1 |
20020021827 | Smith | Feb 2002 | A1 |
20020030581 | Janiak et al. | Mar 2002 | A1 |
20020030668 | Hoshino et al. | Mar 2002 | A1 |
Number | Date | Country |
---|---|---|
0 101 772 | Mar 1984 | EP |
0 308 162 A2 | Mar 1989 | EP |
0 379 331 | Jul 1990 | EP |
0 623 890 A2 | Nov 1994 | EP |
0 653 882 | May 1995 | EP |
0 379 333 | Jul 1995 | EP |
0 889 432 A2 | Jan 1999 | EP |
0 905 646 | Mar 1999 | EP |
0 785 750 | Jun 1999 | EP |
0 924 656 | Jun 1999 | EP |
0 623 890 | Aug 2001 | EP |
2 089 545 | Jun 1982 | GB |
2 313 441 | Nov 1997 | GB |
62-212892 | Sep 1987 | JP |
1-205392 | Aug 1989 | JP |
3-161884 | Jul 1991 | JP |
3-194674 | Aug 1991 | JP |
3-194675 | Aug 1991 | JP |
11-225272 | Aug 1999 | JP |
11-289421 | Oct 1999 | JP |
WO 8702491 | Apr 1987 | WO |
WO 9003620 | Apr 1990 | WO |
WO 9211608 | Jul 1992 | WO |
WO 9422371 A2 | Oct 1994 | WO |
WO 9617480 A2 | Jun 1996 | WO |
WO 9729477 | Aug 1997 | WO |
WO 9741528 | Nov 1997 | WO |
WO 9809246 | Mar 1998 | WO |
WO 9812670 | Mar 1998 | WO |
WO 9912123 | Mar 1999 | WO |
WO 9926187 | May 1999 | WO |
WO 9940535 | Aug 1999 | WO |
Number | Date | Country | |
---|---|---|---|
20040016811 A1 | Jan 2004 | US |
Number | Date | Country | |
---|---|---|---|
60373606 | Apr 2002 | US |