Methods combining multiple frames of image data

Information

  • Patent Grant
  • Patent Number
    8,126,272
  • Date Filed
    Monday, March 17, 2008
  • Date Issued
    Tuesday, February 28, 2012
Abstract
In one embodiment a document authentication station, for use with passports or the like, includes a 2D image sensor (e.g., CCD- or CMOS-based video camera), and a computer device. The image sensor produces image data corresponding to a presented document. From this image data, the computer extracts two or more identification data. One is a digital watermark. The other can be a bar code, data glyphs, OCR data, etc. The processor then proceeds to check that the two identification data correspond in an expected fashion. If not, the document is flagged as suspect or fake. Reliability of detection can be enhanced by processing plural frames of data from the image sensor before issuing a result.
Description
TECHNICAL FIELD

The present technology concerns techniques in which data from multiple image frames are combined into a composite image, allowing subsequent image processing to yield more reliable results. The technology is particularly detailed in the context of document authentication methods.


BACKGROUND

Digital watermarking technology, a form of steganography, encompasses a great variety of techniques by which plural bits of digital data are hidden in some other object without leaving human-apparent evidence of alteration. Many such techniques are detailed in the cited documents.


In U.S. Pat. No. 5,841,886, the present assignee disclosed an identity card that includes digitally watermarked data, e.g., hidden in a photograph of a person depicted on the card. The '886 patent noted that a passport inspector, or the like, could compare the data resulting from decoding of the watermark with other data derived from the card (e.g., text printing, bar codes, etc.). If the data did not match, then photo-swapping or other alteration of the card could be inferred.


In one particular implementation detailed below, the arrangement in the '886 patent is improved upon by providing an authentication station that includes a 2D image sensor (e.g., CCD- or CMOS-based video camera), and a computer device. The image sensor produces image data corresponding to the presented document. From this image data, the computer extracts two or more identification data. One is a digital watermark. The other can be represented in the form of a bar code, data glyphs, OCR data, etc. The processor then proceeds to check that the two identification data correspond in an expected fashion. If not, the document is flagged as suspect or fake. Detection of barcode, data glyphs, OCR printing, and the like is enhanced by processing plural frames of image data obtained by the image sensor.


The features of the present technology will be more readily apparent from the following detailed description, which proceeds with reference to the accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 shows a system according to an illustrative embodiment.





DETAILED DESCRIPTION

Referring to FIG. 1, in an arrangement 10, a document 12 includes plural-bit digital data steganographically encoded therein (e.g., by digital watermarking). The document can take any form; the following discussion particularly considers photo IDs, such as passports and drivers' licenses.


The encoding of the document can encompass artwork or printing on the document, the document's background, a laminate layer applied to the document, surface texture, etc. If a photograph is present, it too can be encoded. A variety of watermark encoding techniques are detailed in the cited patents and applications; many more are known to artisans in the field.


In an illustrative embodiment, the card is encoded with a payload of 32 bits. This payload is processed before encoding, using known techniques (e.g., convolutional coding, turbo codes, etc.), to improve its reliable detection in adverse conditions. In other embodiments, a payload larger or smaller than 32 bits can naturally be used (e.g., 8-256 bits).
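

The following sketch illustrates such payload expansion, assuming a rate-1/2 convolutional code with constraint length 3 (generator taps 7 and 5, octal). The patent names no particular code, so the generator choice and the conv_encode helper are illustrative only.

```python
# A minimal sketch of payload coding before embedding (assumed rate-1/2
# convolutional code, constraint length 3); not the patent's own scheme.
def conv_encode(bits, g1=0b111, g2=0b101, k=3):
    state = 0
    out = []
    for b in bits + [0] * (k - 1):              # flush with zero tail bits
        state = ((state << 1) | b) & ((1 << k) - 1)
        out.append(bin(state & g1).count("1") % 2)   # parity over taps g1
        out.append(bin(state & g2).count("1") % 2)   # parity over taps g2
    return out

payload = [1, 0, 1, 1] * 8      # an illustrative 32-bit payload
coded = conv_encode(payload)    # 2 * (32 + 2) = 68 coded bits
```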


The encoded card is presented to a reader station 14 for reading. The reader station includes an input device 16 and a processor 18.


The illustrated input device 16 is a video camera, including an image sensor comprising plural rows and columns of image sensing elements. Popular video cameras presently employ CCD or CMOS image sensors, but other technologies can likewise be employed.


The processor 18 can be a general purpose or dedicated computer, incorporating a CPU 20, memory 22, an interface 24 to the input device, a display screen or other output device 26, and optionally a network connection 28. The network connection can be used to connect, through an intranet, the internet, or otherwise, to a remote computer 30.


Suitable software programming instructions, stored in memory 22 of processor 18, or in a memory of remote computer 30, can be used to effect various types of functionality for embodiment 10.


In one embodiment, image data obtained by the camera 16 is stored in the memory of the processor 18. There it is analyzed to decode plural bits of steganographically encoded watermark data. Additionally, the frame of image data is processed to extract a second type of identification data. The second type of identification data may be encoded in bar code, data glyph, or OCR form.


Once the processor has obtained both data, the two are cross-checked to determine whether they correspond in the expected manner. This checking can take many different forms, depending on the application.


In one application, the watermark conveys textual information that is encoded in the second identification data, e.g., a bearer's name, a passport number, a social security number, etc. In such case, the processor checks that the information represented by the decoded bar code/data glyph/OCR matches the information represented by the watermark. If they do not match, the document is flagged as a likely forgery.
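

The sketch below illustrates this check on a single frame. The decode_watermark_text callable is a hypothetical stand-in for a watermark reader (no particular SDK is implied), and pytesseract is assumed as one possible OCR engine; field normalization is simplified to whitespace and case folding.

```python
# A hedged sketch of the text cross-check, assuming `frame` is an image
# array from camera 16. `decode_watermark_text` is a hypothetical helper.
import pytesseract

def authenticate(frame, decode_watermark_text):
    wm_text = decode_watermark_text(frame)           # e.g., bearer's name
    ocr_text = pytesseract.image_to_string(frame)    # printed text on card
    normalize = lambda s: "".join(s.split()).upper()
    # Flag the document if the watermark text is absent from the printing.
    if normalize(wm_text) not in normalize(ocr_text):
        return "flagged: likely forgery"
    return "pass"
```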


In another application, the watermark conveys a unique identifier (UID), or an index number. With this number, the processor 18 can query a database, e.g., resident on the remote computer 30, for additional information. In an illustrative case, the UID may be a passport number. The remote database may have a record corresponding to each valid passport number. In response to the query, the database may return some or all of the record data to the station 14. The returned data (e.g., bearer's name, or social security number) can be compared with counterpart information represented in barcode/data glyph/OCR form. Again, if they do not match, the document is flagged as a likely forgery. (The database may also return data that is used by a human inspector. For example, the database may provide a reference photograph image of the ID document holder, which the inspector can compare with the person presenting the ID document.)
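

A minimal sketch of the UID lookup follows, with a local SQLite table standing in for the database on remote computer 30; the table name, columns, and field comparison are assumptions for illustration.

```python
import sqlite3

def check_uid(uid, ocr_name, db_path="passports.db"):
    # `passports(uid, bearer_name)` is a hypothetical schema standing in
    # for the remote database; a UID with no record is itself suspicious.
    con = sqlite3.connect(db_path)
    row = con.execute(
        "SELECT bearer_name FROM passports WHERE uid = ?", (uid,)
    ).fetchone()
    con.close()
    if row is None:
        return "flagged: no record for UID"
    # Compare the database record with the barcode/glyph/OCR counterpart.
    if row[0].strip().upper() != ocr_name.strip().upper():
        return "flagged: likely forgery"
    return "pass"
```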


In other applications, the data represented by the watermark or the other indicia is processed in some manner prior to comparison. For example, the watermark may encode a 16-bit hash value derived from the bearer's name or passport number. This latter data is represented in barcode/data glyph/OCR form. To determine document authenticity, the station 14 decodes this latter data from the image data, hashes it, and compares the resulting data with the 16-bit watermark data. If they do not match, the document is again flagged as a likely forgery.
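

A sketch of such a hash check appears below. The patent does not name a hash function; truncating SHA-256 to 16 bits is purely an assumption.

```python
import hashlib

def hash16(text: str) -> int:
    # Illustrative 16-bit hash: the first two bytes of SHA-256.
    return int.from_bytes(hashlib.sha256(text.encode()).digest()[:2], "big")

def check_hash(watermark_hash: int, ocr_passport_number: str) -> bool:
    # True when the hashed OCR data matches the 16-bit watermark value.
    return hash16(ocr_passport_number) == watermark_hash
```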


In a particular embodiment, plural frames of image data from the camera 16 are employed in detecting the watermark data, the other data (i.e., barcode/glyph/OCR), or both. This can be effected in various ways.


To illustrate, consider the watermark data. In the watermarking technology detailed in cited application Ser. No. 09/503,881, the document is encoded both with unknown payload bits and also with known reference bits. Only if the reference bits are detected with a suitable degree of confidence are the payload bits taken as trustworthy. If, on examining a first frame of image data, the reference bits are not detected with sufficient confidence, the entire frame can be discarded, and a subsequent frame can be processed instead. Or, the results from the first frame can be accumulated with results from second or succeeding frames. The reference bits in the accumulated results are tested after each frame to determine whether they exhibit the requisite degree of trustworthiness. Accumulation continues until this test is met. The payload bits are then interpreted to yield the watermark payload.
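

The loop below sketches this accumulate-over-time logic. The extract_soft_bits callable is a hypothetical helper returning signed per-bit correlations (sign encodes the bit, magnitude encodes confidence); the reference-bit positions, signs, and threshold are illustrative, not taken from Ser. No. 09/503,881.

```python
import numpy as np

def decode_over_time(frames, extract_soft_bits,
                     ref_pos, ref_signs, payload_pos, threshold=4.0):
    acc = None
    for frame in frames:
        soft = extract_soft_bits(frame)              # signed correlations
        acc = soft if acc is None else acc + soft    # accumulate frames
        ref_ok = np.sign(acc[ref_pos]) == ref_signs  # reference bits agree?
        if ref_ok.all() and np.abs(acc[ref_pos]).mean() > threshold:
            # Reference bits are trustworthy; interpret the payload bits.
            return (acc[payload_pos] > 0).astype(int)
    return None  # requisite confidence never reached in the supplied frames
```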


Instead of accumulating watermark results over time, another approach is to accumulate the watermark results over space. In such an arrangement, a first frame of image data may have one portion that gives reliable watermark data, and a second portion that is unreliable (e.g., due to glare from the imaged object, positioning of the object partially out of the focal zone of the imaging device, etc.). In such case, second portion data from second or succeeding image frames can be checked and used to augment the usable data from the first image frame until a sufficiently large patchwork of data is assembled for reliable watermark decoding.
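

The sketch below illustrates such spatial compositing, assuming a per-pixel reliability score is available for each frame (here, an illustrative score that penalizes saturated, glare-prone pixels). The same routine applies equally to the barcode/glyph/OCR counterpart discussed next.

```python
import numpy as np

def glare_reliability(frame):
    # Illustrative reliability score: mid-tone pixels rate high; pixels
    # near saturation (glare) or black (shadow) rate low.
    return 255.0 - np.abs(frame.astype(float) - 128.0)

def composite_over_space(frames, reliability_maps):
    stack = np.stack(frames)             # (N, H, W) image stack
    rel = np.stack(reliability_maps)     # matching (N, H, W) scores
    best = rel.argmax(axis=0)            # most reliable frame per pixel
    rows, cols = np.indices(best.shape)
    return stack[best, rows, cols]       # patchwork composite image
```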


Counterparts to these accumulate-over-time and accumulate-over-space approaches can likewise be used with the imaged barcode/glyph/OCR data to compile a composite set of data from which decoding can reliably proceed.


To provide a comprehensive disclosure without unduly lengthening this specification, the above-detailed patents and applications are incorporated herein by reference.


The particular combinations of elements and features in the above-detailed embodiments are exemplary only; the interchanging and substitution of these teachings with other teachings in this and the incorporated-by-reference patents/applications are also contemplated.


In view of the wide variety of embodiments to which the principles and features discussed above can be applied, it should be apparent that the detailed embodiments are illustrative only and should not be taken as limiting the scope of the technology. Rather, we claim as our invention all such modifications as may come within the scope and spirit of the following claims and equivalents thereof.

Claims
  • 1. A method for processing printed subjects, the method comprising: capturing, via an optical capture device, a first set of image data that represents a text-bearing physical object; capturing, via the optical capture device, a second set of image data that represents the text-bearing physical object, wherein the second set of image data is different than the first set of image data; identifying a portion of one of the first or second sets of image data that is unreliable for recognizing text from the text-bearing physical object; and processing, via a software-configured processor, the first and second sets of image data to recognize text therefrom; wherein the processing takes into consideration that a portion of one of the first or second sets of image data is unreliable.
  • 2. The method of claim 1, wherein the identifying comprises identifying a portion of at least one of the first or second sets of image data that suffers from glare.
  • 3. The method of claim 1, wherein the identifying comprises identifying a portion of at least one of the first or second sets of image data that is not in focus.
  • 4. The method of claim 1, wherein the processing comprises generating a new set of image data from the first and second sets of image data, taking into consideration that a portion of one of the first or second sets of image data is unreliable.
  • 5. The method of claim 4, further comprising generating a composite set of image data, wherein the composite set of image data omits data from the portion of one of the first or second sets of image data identified as unreliable.
  • 6. The method of claim 1, further comprising sensing digital watermark data from the text-bearing physical object, and using the digital watermark data in conjunction with the recognized text to make a decision concerning the text-bearing physical object, wherein both the capturing and sensing make use of a digital video camera that captures successive frames of image data, and wherein the text-bearing physical object comprises a photo identification document.
  • 7. The method of claim 6, further comprising determining if the digital watermark data and the recognized text correspond in an expected fashion and, if not, flagging the photo identification document as suspect.
  • 8. A method for processing printed subjects, the method comprising: capturing, via an optical capture device, a first set of image data that represents a text-bearing physical object; capturing, via the optical capture device, a second set of image data that represents the text-bearing physical object, wherein the second set of image data is different than the first set of image data; identifying a region in the first set of image data that is relatively more reliable for recognizing text than a corresponding region of the second set of image data; identifying a region in the second set of image data that is relatively more reliable for recognizing text than another corresponding region of the first set of image data; combining image data from the reliable region in the first set of image data with image data from the reliable region in the second set of image data; and performing, via a processor, an optical character recognition operation to recognize text from the combined image data.
  • 9. The method of claim 8, further comprising sensing digital watermark data from the text-bearing physical object, and using the digital watermark data in conjunction with the recognized text to make a decision concerning the text-bearing object.
  • 10. The method of claim 9, further comprising determining if the digital watermark data and the recognized text correspond in an expected fashion and, if not, flagging a document associated with the text-bearing physical object as suspect.
  • 11. The method of claim 9, further comprising consulting a database to obtain additional data corresponding to at least a portion of the digital watermark data, and comparing the additional data with data corresponding to the recognized text.
  • 12. The method of claim 1, wherein the identifying a portion of one of the sets of image data that is unreliable is performed by a processor.
  • 13. The method of claim 8, wherein the identifying a portion of one of the sets of image data that is unreliable is performed by a processor.
  • 14. The method of claim 8, wherein the identifying a region in the first set of image data comprises identifying a region in the first set of data for which a corresponding region in the second set of data suffers from glare.
  • 15. The method of claim 8, wherein the identifying a region in the first set of image data comprises identifying a region in the first set of data for which a corresponding region in the second set of data is out of focus.
  • 16. The method of claim 1, wherein the capturing a first set of image data is performed at a first time, and wherein the capturing a second set of image data is performed at a second time different from the first time.
  • 17. The method of claim 8, wherein the capturing a first set of image data is performed at a first time, and wherein the capturing a second set of image data is performed at a second time different from the first time.
  • 18. A non-transitory computer-readable medium having instructions stored thereon, the instructions comprising: instructions for capturing a first set of image data that represents a text-bearing physical object; instructions for capturing a second set of image data that represents the text-bearing physical object, wherein the second set of image data is different than the first set of image data; instructions for identifying a portion of one of the first or second sets of image data that is unreliable for recognizing text from the text-bearing physical object; and instructions for processing the first and second sets of image data to recognize text therefrom, wherein the processing takes into consideration that a portion of one of the first or second sets of image data is unreliable.
  • 19. A non-transitory computer-readable medium having instructions stored thereon, the instructions comprising: instructions for capturing a first set of image data that represents a text-bearing physical object; instructions for capturing a second set of image data that represents the text-bearing physical object, wherein the second set of image data is different than the first set of image data; instructions for identifying a region in the first set of image data that is relatively more reliable for recognizing text than a corresponding region of the second set of image data; instructions for identifying a region in the second set of image data that is relatively more reliable for recognizing text than another corresponding region of the first set of image data; instructions for combining image data from the reliable region in the first set of image data with image data from the reliable region in the second set of image data; and instructions for performing an optical character recognition operation to recognize text from the combined image data.
  • 20. An apparatus comprising: an optical capture device configured to: capture a first set of image data that represents a text-bearing physical object; and capture a second set of image data that represents the text-bearing physical object, wherein the second set of image data is different than the first set of image data; and a processor coupled to the optical capture device, wherein the processor is configured to: identify a portion of one of the first or second sets of image data that is unreliable for recognizing text from the text-bearing physical object; and process the first and second sets of image data to recognize text therefrom, wherein the processing takes into consideration that a portion of one of the first or second sets of image data is unreliable.
  • 21. An apparatus comprising: an optical capture device configured to: capture a first set of image data that represents a text-bearing physical object; and capture a second set of image data that represents the text-bearing physical object, wherein the second set of image data is different than the first set of image data; and a processor coupled to the optical capture device, wherein the processor is configured to: identify a region in the first set of image data that is relatively more reliable for recognizing text than a corresponding region of the second set of image data; identify a region in the second set of image data that is relatively more reliable for recognizing text than another corresponding region of the first set of image data; combine image data from the reliable region in the first set of image data with image data from the reliable region in the second set of image data; and perform an optical character recognition operation to recognize text from the combined image data.
RELATED APPLICATION DATA

This application is a continuation of application Ser. No. 09/563,663, filed May 2, 2000 (now U.S. Pat. No. 7,346,184). The subject matter of the present application is also related to that disclosed in applications Ser. No. 09/127,502, filed Jul. 31, 1998 (now U.S. Pat. No. 6,345,104); Ser. No. 09/074,034, filed May 6, 1998 (now U.S. Pat. No. 6,449,377); Ser. No. 09/343,104, filed Jun. 29, 1999; Ser. No. 09/503,881, filed Feb. 14, 2000 (now U.S. Pat. No. 6,614,914); Ser. No. 09/547,664, filed Apr. 12, 2000 (now U.S. Pat. No. 7,206,820); and in U.S. Pat. Nos. 5,841,886 and 5,862,260.

US Referenced Citations (285)
Number Name Date Kind
3805238 Rothfjell Apr 1974 A
4396903 Habicht et al. Aug 1983 A
4590366 Rothfjell May 1986 A
4675746 Tetrick et al. Jun 1987 A
4689477 Goldman Aug 1987 A
4876617 Best et al. Oct 1989 A
4939674 Price Jul 1990 A
4949391 Faulkerson et al. Aug 1990 A
4972476 Nathans Nov 1990 A
4993068 Piosenka et al. Feb 1991 A
5079648 Maufe Jan 1992 A
5237164 Takada Aug 1993 A
5262860 Fitzpatrick et al. Nov 1993 A
5284364 Jain Feb 1994 A
5319453 Copriviza et al. Jun 1994 A
5337361 Wang et al. Aug 1994 A
5354097 Tel Oct 1994 A
5379345 Greenberg Jan 1995 A
5384846 Berson et al. Jan 1995 A
5434403 Amir et al. Jul 1995 A
5436970 Ray et al. Jul 1995 A
5438188 Surka Aug 1995 A
5446273 Leslie Aug 1995 A
5457308 Spitz et al. Oct 1995 A
5469506 Berson et al. Nov 1995 A
5471533 Wang et al. Nov 1995 A
5485554 Lowitz et al. Jan 1996 A
5490217 Wang et al. Feb 1996 A
5499294 Friedman Mar 1996 A
5509083 Abtahi et al. Apr 1996 A
5513264 Wang Apr 1996 A
5625720 Miyaza et al. Apr 1997 A
5636292 Rhoads Jun 1997 A
5646997 Barton Jul 1997 A
5652626 Kawakami et al. Jul 1997 A
5659726 Sandford et al. Aug 1997 A
5664018 Leighton Sep 1997 A
5682030 Kubon Oct 1997 A
5694471 Chen et al. Dec 1997 A
5708717 Alasia Jan 1998 A
5710834 Rhoads Jan 1998 A
5721788 Powell et al. Feb 1998 A
5745604 Rhoads Apr 1998 A
5748763 Rhoads May 1998 A
5748783 Rhoads May 1998 A
5760386 Ward Jun 1998 A
5767987 Wolff et al. Jun 1998 A
5768426 Rhoads Jun 1998 A
5787186 Schroeder Jul 1998 A
5799092 Kristol et al. Aug 1998 A
5811779 Gaylord et al. Sep 1998 A
5822436 Rhoads Oct 1998 A
5832119 Rhoads Nov 1998 A
5841886 Rhoads Nov 1998 A
5850481 Rhoads Dec 1998 A
5859920 Daly et al. Jan 1999 A
5862260 Rhoads Jan 1999 A
5864622 Marcus Jan 1999 A
5864623 Messina et al. Jan 1999 A
5890742 Waller Apr 1999 A
5907149 Marckini May 1999 A
5912934 Acks et al. Jun 1999 A
5912974 Holloway et al. Jun 1999 A
5995105 Reber et al. Nov 1999 A
5995625 Sudia et al. Nov 1999 A
6000612 Xu Dec 1999 A
6024287 Takai et al. Feb 2000 A
6026193 Rhoads Feb 2000 A
6031914 Tewfik et al. Feb 2000 A
6031930 Bacus Feb 2000 A
6075905 Herman Jun 2000 A
6086707 Waller Jul 2000 A
6088612 Blair Jul 2000 A
6095566 Yamamoto et al. Aug 2000 A
6122392 Rhoads Sep 2000 A
6122403 Rhoads Sep 2000 A
6185312 Nakamura et al. Feb 2001 B1
6185316 Buffam Feb 2001 B1
6208765 Bergen Mar 2001 B1
6226387 Tewfik et al. May 2001 B1
6233684 Stefik et al. May 2001 B1
6240219 Gregory May 2001 B1
6243480 Zhao et al. Jun 2001 B1
6266430 Rhoads Jul 2001 B1
6277067 Blair Aug 2001 B1
6285776 Rhoads Sep 2001 B1
6289108 Rhoads Sep 2001 B1
6292092 Chow et al. Sep 2001 B1
6307949 Rhoads Oct 2001 B1
6311214 Rhoads Oct 2001 B1
6330335 Rhoads Dec 2001 B1
6332693 Dove Dec 2001 B1
6343138 Rhoads Jan 2002 B1
6353672 Rhoads Mar 2002 B1
6363159 Rhoads Mar 2002 B1
6369904 Bhattacharjya et al. Apr 2002 B1
6385329 Sharma et al. May 2002 B1
6389151 Carr et al. May 2002 B1
6394352 De Renzis May 2002 B1
6400827 Rhoads Jun 2002 B1
6404898 Rhoads Jun 2002 B1
6411392 Bender et al. Jun 2002 B1
6430302 Rhoads Aug 2002 B2
6449379 Rhoads Sep 2002 B1
6463416 Messina Oct 2002 B1
6466253 Honjoh Oct 2002 B1
6466618 Messing Oct 2002 B1
6493469 Taylor Dec 2002 B1
6496591 Rhoads Dec 2002 B1
6496933 Nunally Dec 2002 B1
6519352 Rhoads Feb 2003 B2
6522771 Rhoads Feb 2003 B2
6535617 Hannigan et al. Mar 2003 B1
6535618 Rhoads Mar 2003 B1
6539095 Rhoads Mar 2003 B1
6542618 Rhoads Apr 2003 B1
6542620 Rhoads Apr 2003 B1
6553127 Kurowski Apr 2003 B1
6560349 Rhoads May 2003 B1
6567534 Rhoads May 2003 B1
6567535 Rhoads May 2003 B2
6567780 Rhoads May 2003 B2
6570613 Howell May 2003 B1
6574350 Rhoads et al. Jun 2003 B1
6580819 Rhoads Jun 2003 B1
6587821 Rhoads Jul 2003 B1
6590996 Reed et al. Jul 2003 B1
6590997 Rhoads Jul 2003 B2
6594403 Bozdagi Jul 2003 B1
6608930 Agnihotri Aug 2003 B1
6614914 Rhoads et al. Sep 2003 B1
6614930 Agnihotri Sep 2003 B1
6621524 Iijima et al. Sep 2003 B1
6625297 Bradley Sep 2003 B1
6636551 Ikeda et al. Oct 2003 B1
6647129 Rhoads Nov 2003 B2
6654480 Rhoads Nov 2003 B2
6654887 Rhoads Nov 2003 B2
6665454 Silverbrook et al. Dec 2003 B1
6675146 Rhoads Jan 2004 B2
6683966 Tian et al. Jan 2004 B1
6694041 Brunk Feb 2004 B1
6724912 Carr et al. Apr 2004 B1
6738495 Rhoads et al. May 2004 B2
6744907 Rhoads Jun 2004 B2
6750985 Rhoads et al. Jun 2004 B2
6754377 Rhoads Jun 2004 B2
6757406 Rhoads Jun 2004 B2
6760464 Brunk Jul 2004 B2
6768808 Rhoads Jul 2004 B2
6771796 Rhoads Aug 2004 B2
6778682 Rhoads Aug 2004 B2
6782116 Zhao et al. Aug 2004 B1
6804377 Reed Oct 2004 B2
6804379 Rhoads Oct 2004 B2
6843566 Mihara Jan 2005 B2
6862054 Kawakami Mar 2005 B2
6882738 Davis et al. Apr 2005 B2
6944298 Rhoads Sep 2005 B1
6959100 Rhoads Oct 2005 B2
6959386 Rhoads Oct 2005 B2
6961444 Levy Nov 2005 B2
6965873 Rhoads Nov 2005 B1
6970573 Carr et al. Nov 2005 B2
6971011 Maes Nov 2005 B1
6978036 Alattar et al. Dec 2005 B2
6983051 Rhoads Jan 2006 B1
6987862 Rhoads Jan 2006 B2
6993152 Patterson et al. Jan 2006 B2
6993154 Brunk Jan 2006 B2
7003132 Rhoads Feb 2006 B2
7015954 Foote Mar 2006 B1
7016516 Rhoads Mar 2006 B2
7020303 Levy et al. Mar 2006 B2
7020349 Brunk Mar 2006 B2
7027612 Patterson et al. Apr 2006 B2
7054462 Rhoads et al. May 2006 B2
7054463 Rhoads et al. May 2006 B2
7076084 Davis et al. Jul 2006 B2
7113569 Okumura et al. Sep 2006 B2
7113615 Rhoads et al. Sep 2006 B2
7116781 Rhoads Oct 2006 B2
7130087 Rhoads Oct 2006 B2
7142691 Levy Nov 2006 B2
7158099 Berube et al. Jan 2007 B1
7181022 Rhoads Feb 2007 B2
7184570 Rhoads Feb 2007 B2
7197164 Levy Mar 2007 B2
7218751 Reed May 2007 B2
7239734 Alattar et al. Jul 2007 B2
7242790 Rhoads Jul 2007 B2
7246239 Rodriguez et al. Jul 2007 B2
7248715 Levy Jul 2007 B2
7263203 Rhoads et al. Aug 2007 B2
7266217 Rhoads et al. Sep 2007 B2
7269275 Carr et al. Sep 2007 B2
7277468 Tian et al. Oct 2007 B2
7286684 Rhoads et al. Oct 2007 B2
7305117 Davis et al. Dec 2007 B2
7313253 Davis et al. Dec 2007 B2
7321667 Stach Jan 2008 B2
7340076 Stach et al. Mar 2008 B2
7346184 Carr et al. Mar 2008 B1
7346776 Levy et al. Mar 2008 B2
7349555 Rhoads Mar 2008 B2
7359528 Rhoads Apr 2008 B2
7372976 Rhoads et al. May 2008 B2
7415129 Rhoads Aug 2008 B2
7418111 Rhoads Aug 2008 B2
7424132 Rhoads Sep 2008 B2
7499564 Rhoads Mar 2009 B2
7532741 Stach May 2009 B2
7536555 Rhoads May 2009 B2
7539325 Rhoads et al. May 2009 B2
7548643 Davis et al. Jun 2009 B2
7555139 Rhoads et al. Jun 2009 B2
7567686 Rhoads Jul 2009 B2
7570784 Alattar Aug 2009 B2
7602940 Rhoads et al. Oct 2009 B2
7602977 Rhoads et al. Oct 2009 B2
7606390 Rhoads Oct 2009 B2
7607016 Brunk et al. Oct 2009 B2
7620200 Rhoads Nov 2009 B2
7639837 Carr et al. Dec 2009 B2
7643649 Davis Jan 2010 B2
7656930 Tian et al. Feb 2010 B2
7672477 Rhoads Mar 2010 B2
7676059 Rhoads Mar 2010 B2
7693965 Rhoads Apr 2010 B2
7697719 Rhoads Apr 2010 B2
7702511 Rhoads Apr 2010 B2
7711143 Rhoads May 2010 B2
7720249 Rhoads May 2010 B2
7720255 Rhoads May 2010 B2
7724919 Rhoads May 2010 B2
7760902 Rhoads Jul 2010 B2
7763179 Levy Jul 2010 B2
7796826 Rhoads et al. Sep 2010 B2
7831062 Stach Nov 2010 B2
20010022848 Rhoads Sep 2001 A1
20020041761 Glotzbach Apr 2002 A1
20020048282 Kawamae et al. Apr 2002 A1
20020059880 Klinefelter et al. May 2002 A1
20020071905 Akedo Jun 2002 A1
20020080995 Rhoads Jun 2002 A1
20020090110 Braudaway et al. Jul 2002 A1
20020136429 Stach et al. Sep 2002 A1
20030021440 Rhoads Jan 2003 A1
20030025814 Hunter et al. Feb 2003 A1
20030071905 Yamasaki Apr 2003 A1
20030133592 Rhoads Jul 2003 A1
20030138128 Rhoads Jul 2003 A1
20040008866 Rhoads et al. Jan 2004 A1
20040057581 Rhoads Mar 2004 A1
20040128514 Rhoads Jul 2004 A1
20040181671 Brundage et al. Sep 2004 A1
20040263911 Rodriguez Dec 2004 A1
20060028689 Perry et al. Feb 2006 A1
20060062386 Rhoads Mar 2006 A1
20070016790 Brundage et al. Jan 2007 A1
20070172098 Rhoads Jul 2007 A1
20070180251 Carr Aug 2007 A1
20070201835 Rhoads Aug 2007 A1
20080016360 Rhoads et al. Jan 2008 A1
20080131083 Rhoads Jun 2008 A1
20080131084 Rhoads Jun 2008 A1
20080149713 Brundage Jun 2008 A1
20080253740 Rhoads Oct 2008 A1
20080270801 Levy et al. Oct 2008 A1
20080275906 Brundage Nov 2008 A1
20090252401 Davis et al. Oct 2009 A1
20100008534 Rhoads Jan 2010 A1
20100008536 Rhoads Jan 2010 A1
20100008537 Rhoads Jan 2010 A1
20100021004 Rhoads Jan 2010 A1
20100027969 Alattar Feb 2010 A1
20100040255 Rhoads Feb 2010 A1
20100042843 Brunk et al. Feb 2010 A1
20100119108 Rhoads May 2010 A1
20100131767 Rhoads May 2010 A1
20100142752 Rhoads et al. Jun 2010 A1
20100146285 Rhoads et al. Jun 2010 A1
20100163629 Rhoads et al. Jul 2010 A1
20100172538 Rhoads Jul 2010 A1
20110013802 Rhoads Jan 2011 A1
Foreign Referenced Citations (14)
Number Date Country
0 629 972 Dec 1994 EP
0 642 060 Mar 1995 EP
0 650 146 Apr 1995 EP
0 730 242 Sep 1996 EP
03-185585 Aug 1991 JP
WO-9513597 May 1995 WO
WO-9603286 Feb 1996 WO
WO-9626494 Aug 1996 WO
WO-9636163 Nov 1996 WO
WO-9843152 Oct 1998 WO
WO-9913391 Mar 1999 WO
WO-9936876 Jul 1999 WO
WO0139106 May 2001 WO
WO-0141056 Jun 2001 WO
Related Publications (1)
Number Date Country
20080181450 A1 Jul 2008 US
Continuations (1)
Number Date Country
Parent 09563663 May 2000 US
Child 12050000 US