Modular biometrics collection system architecture

Information

  • Patent Grant
  • Patent Number
    8,085,993
  • Date Filed
    Friday, March 2, 2007
  • Date Issued
    Tuesday, December 27, 2011
Abstract
A modular biometrics collection system with an architecture having application to a combined features recognition system. The system may be a self-organizing mesh of collaborative independent components. Each component may have inputs, outputs, and local prioritization management. Each component may operate autonomously. Federated behavior of the components may be achieved by subscribing to content that influences local prioritization. An example of the system may have application to combined face and iris recognition.
Description
BACKGROUND

The present invention pertains to recognition systems and particularly to biometric recognition systems. More particularly, the invention may pertain to collection architecture of the recognition systems.


U.S. patent application Ser. No. 10/979,129, filed Nov. 3, 2004, is hereby incorporated by reference.


U.S. patent application Ser. No. 10/655,124, filed Sep. 5, 2003, is hereby incorporated by reference.


U.S. patent application Ser. No. 11/382,373, filed May 9, 2006, is hereby incorporated by reference.


U.S. Provisional Application No. 60/778,770, filed Mar. 3, 2006, is hereby incorporated by reference.


U.S. patent application Ser. No. 11/275,703, filed Jan. 25, 2006, is hereby incorporated by reference.


U.S. Provisional Application No. 60/647,270, filed Jan. 26, 2005, is hereby incorporated by reference.


U.S. patent application Ser. No. 11/043,366, filed Jan. 26, 2005, is hereby incorporated by reference.


U.S. patent application Ser. No. 11/372,854, filed Mar. 10, 2006, is hereby incorporated by reference.


U.S. patent application Ser. No. 11/672,108, filed Feb. 7, 2007, is hereby incorporated by reference.


U.S. patent application Ser. No. 11/675,424, filed Feb. 15, 2007, is hereby incorporated by reference.


SUMMARY

The present invention is modular biometrics collection system architecture applicable to systems such as sensing and acquisition systems.





BRIEF DESCRIPTION OF THE DRAWING


FIG. 1 is a diagram of an example system utilizing biometrics collection system architecture;



FIG. 2 is a diagram of a different scale system than the system in the diagram of FIG. 1, having fewer biometrics sensors; and



FIG. 3 is a diagram of a system in which multiple system scopes utilize a common sensor as compared to the system in the diagram of FIG. 2.





DESCRIPTION

The present system architecture may relate to biometrics, iris recognition systems, image quality metrics, authentication, access control, monitoring, identification, and security and surveillance systems.


A biometric information acquisition system that incorporates multiple acquisition components, maximizing the amount and quality of the collected content, should be very flexible and adaptive, allowing essentially any collected content to influence the collection behavior of any collection component. A conventional design approach for this kind of system may be to perform an extensive static analysis a priori, and design an architecture and system organization for implementing the analysis results. This more generalized information acquisition problem space may be better suited for a collaborative architecture that exhibits local autonomy and federated behavior.


The present system may be constructed as a self-organizing mesh of collaborative independent components. Each component may have inputs, outputs, and local prioritization/management. An input may be physical such as a sensor or content produced by another component. An output may be content that could be used as input to another component or a physical surface such as a display. The local prioritization may determine how and/or when inputs are used. Then operations may be performed and outputs produced based on locally relevant evaluation of local values and evaluation of external values received as inputs. Inputs and outputs may be shared using a producer/consumer paradigm. That is, the producer of outputs may have no knowledge of the number of consumers or how the outputs will be used, and the consumer may subscribe to content abstractly by a content identifier and quality of service, potentially without knowledge of the producer. Each component may operate autonomously, acting on its inputs upon their arrival and producing outputs from them as dictated by its own prioritization. Federated behavior may be achieved by incorporating a component of larger scope that subscribes to content produced by smaller scope components, and produces guidance content that the smaller scope components may subscribe to and use to influence their local prioritization.
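The producer/consumer exchange described above can be sketched as a minimal content bus, in which consumers subscribe abstractly by content identifier and producers publish without knowledge of their consumers. The class and content identifier below are illustrative assumptions, not elements of the specification.

```python
from collections import defaultdict

class ContentBus:
    """Minimal broker: producers publish by content identifier;
    consumers subscribe by the same identifier. Neither side knows
    the other, as in the producer/consumer paradigm described above."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, content_id, callback):
        self._subscribers[content_id].append(callback)

    def publish(self, content_id, payload):
        # The producer has no knowledge of the number of consumers.
        for callback in self._subscribers[content_id]:
            callback(payload)

bus = ContentBus()
received = []
# A smaller-scope component subscribes to guidance from a larger scope.
bus.subscribe("prioritizing-guidance", received.append)
# The larger-scope component publishes without knowing its consumers.
bus.publish("prioritizing-guidance", {"object_id": 7, "priority": "high"})
```

Quality of service, mentioned above as part of a subscription, could be layered onto the same interface, for instance as a filter applied before delivery.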


One may analyze a problem space. The system may be partitioned into components identified by their inputs, their outputs, their independence from other components, and the general reusability of defined content. Content may be defined using an abstraction applicable to a broad application scope.


One may identify types of information needed for collaboration guidance that might not otherwise be produced in the system, identify candidate producers, and define them as produced content.


One may define identification, characteristics, and interchange encoding for each type of content. To the extent feasible, use content descriptions that already exist, or add backward compatible extensions to existing content definitions, to permit newer producers to be used with existing systems.


A coordinate system may be utilized that is relative within the immediate scope. For example, information within a zone may be exchanged in zone scoped rectangular coordinates. An aiming component within a zone may be configured to know its location and orientation, and it can determine how to aim from zone coordinates. Information within a site may be exchanged in site scoped rectangular coordinates, and any entity exchanging information outside its zone must know the zone's site location and orientation, and perform the coordinate scoping transformation. The parameters necessary to make these transformations must be available from the next level up in the hierarchy.
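The zone-to-site coordinate scoping transformation described above amounts to a rotation by the zone's orientation followed by a translation by the zone's site location. A minimal two-dimensional sketch, with illustrative function and parameter names:

```python
import math

def zone_to_site(point_zone, zone_origin_site, zone_heading_deg):
    """Transform a zone-scoped (x, y) point into site-scoped rectangular
    coordinates, given the zone's location and orientation in the site.
    These parameters correspond to what the next level up in the
    hierarchy must make available."""
    theta = math.radians(zone_heading_deg)
    x, y = point_zone
    # Rotate by the zone's orientation, then translate to its site location.
    x_site = x * math.cos(theta) - y * math.sin(theta) + zone_origin_site[0]
    y_site = x * math.sin(theta) + y * math.cos(theta) + zone_origin_site[1]
    return (x_site, y_site)

# A zone located at site coordinates (100, 50) and rotated 90 degrees:
print(zone_to_site((10, 0), (100, 50), 90.0))
```

An aiming component exchanging information outside its zone would apply this transformation before publishing, so that all site-scoped consumers see one consistent coordinate frame.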


For description purposes, one may consider a multiple biometric system acquiring both face and iris recognition biometrics from fields of view, or zones (see FIG. 1). This system may operate at three levels—a site level, a zone level, and a biometrics level.


At the site level may be surveillance and/or management entities that serve as aggregators of multiple zones. For an example system, a site surveillance module may typically subscribe to zone images and reconciled match results to present an overview of the scene activities, but when “zooming in”, it could temporarily subscribe to any content to present a detailed view of the current zone state/activity. The site management module may subscribe to reconciled match results, and provide functionality such as coordination across multiple zones. The coordination may be materialized as prioritizing guidance content.


At the zone level, a zone object detector may identify and publish coordinates of objects of interest, using the same identifier for objects that are likely to be the same entity, and the zone object reconciler may fuse biometric content taking object identifiers into account. The results may be published as reconciled match results.


At the biometrics level, acquisition and evaluation/matching modules may obtain suitable quality images from the coordinates specified with objects in a zone, perform matching, and publish the evaluation and match results. These modules or components may operate autonomously, prioritizing the objects in the zone to optimize the number and quality of biometrics as appropriate for the current set of detected objects in the scene. External content including reconciled match results and prioritizing guidance may be factored into the prioritization.
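The autonomous prioritization described above, in which locally relevant values are adjusted by externally supplied guidance, might be sketched as follows. The scoring scheme and field names are assumptions for illustration only.

```python
def prioritize(objects, guidance=None):
    """Order detected objects for acquisition. Each object's local value
    (here, its image quality) is adjusted by any prioritizing guidance
    received from a larger scope of the system."""
    guidance = guidance or {}
    def score(obj):
        s = obj.get("quality", 0.0)
        # Guidance keyed by object identifier raises or lowers priority.
        s += guidance.get(obj["id"], 0.0)
        return s
    return sorted(objects, key=score, reverse=True)

objs = [{"id": 1, "quality": 0.4}, {"id": 2, "quality": 0.9}]
# With no guidance, the module acts on local values alone.
print(prioritize(objs))
# Site-level guidance boosts object 1 despite its lower image quality.
print(prioritize(objs, guidance={1: 0.6}))
```

Because the guidance arrives as ordinary subscribed content, the module remains autonomous: it continues operating on local values when no guidance is present.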


An example biometric system may produce/consume multiple kinds of content. There may be camera and/or sensor inputs. There may be a zone image which is an image of a region being observed. There may be objects in a zone, each having coordinates and the size of a region in a zone image identified as being of interest. A face image may be correlated to a zone object. It may be an image identified as a face of sufficient quality for recognition at object coordinates. Face match results may be a report on results of a matching operation using a specified face image. An iris image may be correlated to a zone object. It may be an image identified as an iris of sufficient quality for recognition at object coordinates. Iris match results may be a report on results of a matching operation using a specified iris image. Reconciled match results may be an assessment of identification and relationship between biometric artifacts with respect to objects in a zone. Prioritizing guidance may be information from a larger scope of the system intended to influence local prioritization of potential objects in a zone.
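The content kinds enumerated above could be encoded along these lines; the type and field names are illustrative assumptions, not definitions taken from the specification.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ZoneObject:
    """An object of interest detected in a zone image."""
    object_id: int
    coords: Tuple[float, float]   # location within the zone image
    size: Tuple[int, int]         # extent of the region of interest

@dataclass
class FaceImage:
    """A face image of sufficient quality, correlated to a zone object."""
    object_id: int                # ties the image back to a ZoneObject
    pixels: bytes

@dataclass
class MatchResult:
    """Report on a matching operation for one biometric artifact."""
    object_id: int
    biometric: str                # e.g. "face" or "iris"
    identity: Optional[str]       # None when no database match was found
```

Carrying the object identifier in every content kind is what lets the zone object reconciler later fuse face and iris results that refer to the same entity.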


The example system may have various components, modules, and/or the like. There may be a zone object detector which has an input from a camera sensor. Its activity may include collecting images of the zone, analyzing the images, and identifying objects of interest. An identification (ID) may be assigned to each new object. The output content may include zone image and object information. There may be face image acquisition which has inputs of a camera sensor, objects in a zone, reconciled match results, and prioritizing guidance. Its activity may include prioritizing known objects, zooming to a face at object coordinates, and obtaining a sufficient quality image for a face match. Its output content may include a face image correlated to a zone object.


There may be iris image acquisition having inputs of a camera sensor, objects in a zone, reconciled match results, and prioritizing guidance. Its activity may include prioritizing known objects, zooming to a face (iris) at object coordinates, and obtaining a sufficient quality image for an iris match. Its output content may include iris biometric data correlated to a zone object. There may be face evaluation/matching having an input of face biometric data correlated to a zone object. Its activity may include processing the biometric data to extract recognition criteria (biometric artifact) and looking up a face biometric artifact in a recognition database. Its output content may include biometric artifacts and face match results. There may be iris matching having an input of iris biometric data correlated to a zone object. Its activity may include processing the biometric data to extract recognition criteria (biometric artifact) and looking up an iris biometric artifact in a recognition database. Its output content may include iris match results.


There may be a zone object reconciler having inputs of objects in a zone, face match results, and iris match results. Its activity may include analyzing match results, matching conflicts, using affinity measures to resolve conflicts, auto enrolling as appropriate, and so forth. Its output content may include biometric artifacts, multi-biometric match results and image acquisition guidance.


There may be zone surveillance having input content selected for display such as zone images, biometric artifacts, and multi-biometric match results. Its activity may include displaying a zone image, match results, and so forth, as appropriate for the zone scope of an observation. Its output may include visual display. There may be site surveillance/management having input content from multiple zones selected for display such as zone images and reconciled match results. Its activity may include displaying multiple individual/consolidated zone images, match results, and so forth, as appropriate for the site scope of observation. Its output may include visual display and prioritizing guidance (to zone scope).


A collection system may have a biometric scope, a zone scope and a site scope. The zone scope may produce guidance content for the biometric scope and consume content produced by the biometric scope. The site scope may produce guidance content for the zone scope and consume content produced by the zone scope. The biometric scope may have a sensing module, an acquisition module that consumes content produced by the sensing module, and an evaluation module that consumes content produced by the acquisition module.
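The consume/produce chain across the three scopes described above might be sketched as follows; the module behavior and message shapes are illustrative assumptions.

```python
def biometric_scope(frame):
    """Sensing -> acquisition -> evaluation within the biometric scope."""
    acquired = {"image": frame, "quality": 0.8}   # acquisition module
    return {"match": acquired["quality"] > 0.5}   # evaluation module

def zone_scope(biometric_content):
    """Consumes content from the biometric scope and produces guidance
    back down to it."""
    ok = biometric_content["match"]
    return {"guidance": "raise-priority" if ok else "rescan"}

def site_scope(zone_content):
    """Consumes content from the zone scope and produces guidance for it,
    e.g. coordination across multiple zones."""
    return {"guidance": "coordinate-zones", "based_on": zone_content}

content = biometric_scope(frame="raw-pixels")
zone_guidance = zone_scope(content)       # zone consumes biometric content
site_guidance = site_scope(zone_guidance) # site consumes zone content
print(zone_guidance["guidance"])
```

Each function here stands in for an autonomous component; in the mesh described earlier these exchanges would flow through subscriptions rather than direct calls.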



FIG. 1 shows an example system 10 implementing the present system architecture. An image sensor 11 may output an image to a face image acquisition module 12. The image may have a face which is acquired by module 12. Known objects may be prioritized by module 12. The face at object coordinates may be zoomed to by the module. A sufficient quality image for a match may be obtained. The image may be correlated and published. Input 13 content may include objects in a zone. Input 14 content may include reconciled match results and prioritizing guidance. An output 15 may have content of a face biometric data correlated to a zone object. Output 15 may go to a face evaluation/matching module 16. Module 16 may extract a face biometric artifact from the biometric data and look up the face biometric artifact in a recognition database. Biometric artifact and face match results may be published as content in an input 17 to a zone object reconciler 18.


An image sensor 19 may provide images 21 of its visible region to a zone object detector 22. Detector 22 may receive images 21, analyze them, and identify objects of interest. An identification label (ID) may be assigned to each new object. The zone image and object information may be published by the detector 22. Content of output 23 from detector 22 may include information locating objects in the zone which may go to reconciler 18. The same content may be provided as the input 13 to the face image acquisition module 12. Also, this content may be provided as an input 24 to an iris image acquisition module 25.


An image sensor 26 may provide images 27 of a face including images of an eye and iris to the iris image acquisition module 25. Module 25 may prioritize known objects, zoom to the face at object coordinates, obtain a sufficient quality image for a match and produce a correlated image as biometric data content. The content of an output 28 as an input to an iris evaluation/matching module 29 may include iris biometric data correlated to a zone object. The iris matching module 29 may extract an iris biometric artifact from the biometric data and look up the iris biometric artifact in a recognition database and produce the iris match results. An output 31 of module 29 may provide iris biometric artifacts and iris match results as content to the zone object reconciler 18. Reconciler 18 may provide an output as an input 32 having multi-biometric match results and prioritizing guidance as content to the iris image acquisition module 25.


Image sensor 19 may provide zone image content as an input 33 to a zone surveillance module 34 and as an input 35 to a site surveillance/management module 36. The zone object reconciler 18 may provide reconciled match results as content for an input 37 to the zone surveillance module 34 and for an input 38 to the site surveillance/management module 36. Module 36 may be a site of numerous zones, for example, zones A, B, C, D, E, F, G, H, and so on. These zones may be areas of surveillance, for instance. Content from site scope module 36 may consist of prioritizing guidance which is an input 39 to the zone object reconciler 18.


The zone surveillance module 34 may have a display output. A zone image may be displayed, match results, and so forth, as appropriate for the present scope of observation.


System 10 may have a biometric scope 41, a zone scope 42 and a site scope 43, which could be modules. Biometric scope 41 may be interactively connected to zone scope 42. Zone scope 42 may be interactively connected to site scope 43. Image sensor 11, face acquisition module 12, face matching module 16, image sensor 26, iris image acquisition module 25 and iris matching module 29 may be of the biometric scope 41. Image sensor 19, zone object detector 22, zone object reconciler 18 and zone surveillance module 34 may be of the zone scope 42. Site surveillance/management module 36 may be of the site scope 43. Modules may be at various levels.



FIG. 2 is a diagram of a system 30 which may be a more general level instantiation of the present invention than the diagram of FIG. 1. A sensing module 44 may be connected to an acquisition module 45. The acquisition module 45 may be connected to an evaluation module 46. A biometric scope 41 may include modules 44, 45 and 46. A sensing module 47 may be connected to a zone object detector/reconciler 48 and to a zone surveillance module 34. The zone object detector/reconciler 48 may be connected to the acquisition module 45, the evaluation module 46 and the zone surveillance module 34. A zone scope may include sensing module 47, zone object detector/reconciler 48 and zone surveillance module 34. A site surveillance/management module 36 may be connected to the sensing module 47 and the zone object detector/reconciler 48. A site scope 43 may include the site surveillance/management module 36.


Sensing module 44 may have one or more image or other sensors for detecting various kinds of items such as people, animals, inanimate objects, and features of those people, animals, inanimate objects, and so forth. The acquisition module 45 may acquire or extract features, activities, and/or patterns of activities, of the sensed items. The evaluation module 46 may provide such things as matching, grading, thresholding, and so forth, of those items of module 45. Sensing module 47 may have one or more image or other sensors to detect various items like those detected by module 44. The zone object detector/reconciler 48 may include a zone object detector and a zone object reconciler, which may be similar, but not necessarily, to the detector 22 and reconciler 18 of FIG. 1, respectively.



FIG. 3 is a diagram of a system 40 which is similar to system 30. Biometric scope 41 may overlap zone scope 42. A reason for the overlap is that the operations, functions, mechanisms and the like of modules 44 and 47 may be combined into a sensing module 49.


In the present specification, some of the matter may be of a hypothetical or prophetic nature although stated in another manner or tense.


Although the invention has been described with respect to at least one illustrative example, many variations and modifications will become apparent to those skilled in the art upon reading the present specification. It is therefore the intention that the appended claims be interpreted as broadly as possible in view of the prior art to include all such variations and modifications.

Claims
  • 1. A collection system, comprising: an object detector for collecting a plurality of images of a full field of view, for analyzing the collected plurality of images, for identifying objects of interest in the analyzed plurality of images, for publishing the plurality of images, and for publishing respective coordinates in each image for each of the identified objects; a face image extractor for receiving the published plurality of images and the respective coordinates in each image for each of the identified objects, for recognizing at least one identified object as a face, for zooming into the published plurality of images at the respective coordinates to the face, and for publishing a zoomed image of the face as a published face image; a face image identifier for receiving the published face image, for determining if the published face image is present in a face image database, and for publishing a face identity result that indicates the identity of the face if the published face image is present in the face image database; an iris image extractor for receiving the published plurality of images and the respective coordinates in each image for each of the identified objects, for recognizing at least one identified object as an iris, for zooming into the published plurality of images at the respective coordinates to the iris, and for publishing a zoomed image of the iris as a published iris image; an iris image identifier for receiving the published iris image, for determining if the published iris image is present in an iris image database, and for publishing an iris identity result that indicates the identity of the iris if the published iris image is present in the iris image database; and a reconciler for receiving the published plurality of images, the published respective coordinates in each image for each of the identified objects, the published face identity result and the published iris identity result, for matching at least one of the face and iris identity results to a respective identified object, and for publishing a person identity.
  • 2. The collection system of claim 1, wherein the object detector, the face image extractor, the face image identifier, the iris image extractor, the iris image identifier and the reconciler run independently of each other.
  • 3. The collection system of claim 1, wherein the reconciler further receives multiple pluralities of images, each plurality originating from a different sensor, each sensor having its own full field of view.
  • 4. The collection system of claim 1, wherein the face image extractor and the iris image extractor both further receive multiple pluralities of images, each plurality originating from a different sensor, each sensor having its own full field of view.
  • 5. The collection system of claim 1, wherein the zoom of the face image extractor is different from the zoom of the iris image extractor.
  • 6. The collection system of claim 1, wherein the zooms of the face and iris image extractors comprise selecting a portion of image pixels from the respective image of the full field of view.
  • 7. The collection system of claim 1, wherein all the images in the collected plurality of images are produced by a stationary sensor and use the same coordinate system.
  • 8. The collection system of claim 1, wherein at least some of the images in the collected plurality of images are produced by a moving sensor; and wherein at least some of the images in the collected plurality of images cover a moving full field of view.
  • 9. The collection system of claim 8, wherein the published respective coordinates in each image for each of the identified objects accounts for the movement of the sensor and the moving full field of view.
  • 10. A collection method, comprising: collecting a plurality of images of a full field of view; analyzing the collected plurality of images; identifying objects of interest in the analyzed plurality of images; publishing the plurality of images; publishing respective coordinates in each image for each of the identified objects; receiving the published plurality of images and the respective coordinates in each image for each of the identified objects; recognizing at least one identified object as a face; zooming into the published plurality of images at the respective coordinates to the face; publishing a zoomed image of the face as a published face image; receiving the published face image; determining if the published face image is present in a face image database; publishing a face identity result that indicates the identity of the face if the published face image is present in the face image database; receiving the published plurality of images and the respective coordinates in each image for each of the identified objects; recognizing at least one identified object as an iris; zooming into the published plurality of images at the respective coordinates to the iris; publishing a zoomed image of the iris as a published iris image; receiving the published iris image; determining if the published iris image is present in an iris image database; publishing an iris identity result that indicates the identity of the iris if the published iris image is present in the iris image database; receiving the published plurality of images, the published respective coordinates in each image for each of the identified objects, the published face identity result and the published iris identity result; matching at least one of the face and iris identity results to a respective identified object; and publishing a person identity.
  • 11. The collection method of claim 10, further comprising receiving multiple pluralities of images, each plurality originating from a different sensor, each sensor having its own full field of view.
  • 12. The collection method of claim 10, wherein the zooms are different for the face and iris resolutions.
  • 13. The collection method of claim 10, wherein the zooms comprise selecting a portion of image pixels from the respective image of the full field of view.
  • 14. The collection method of claim 10, wherein all the images in the collected plurality of images are produced by a stationary sensor and use the same coordinate system.
  • 15. The collection method of claim 10, wherein at least some of the images in the collected plurality of images are produced by a moving sensor; and wherein at least some of the images in the collected plurality of images cover a moving full field of view.
  • 16. The collection method of claim 15, wherein the published respective coordinates in each image for each of the identified objects accounts for the movement of the sensor and the moving full field of view.
Parent Case Info

This application claims the benefit of U.S. Provisional Application No. 60/778,770, filed Mar. 3, 2006.

Government Interests

The government may have rights in the present invention.

20030125057 Pesola Jul 2003 A1
20030126560 Kurapati et al. Jul 2003 A1
20030131245 Linderman Jul 2003 A1
20030131265 Bhakta Jul 2003 A1
20030133597 Moore et al. Jul 2003 A1
20030140235 Immega et al. Jul 2003 A1
20030140928 Bui et al. Jul 2003 A1
20030141411 Pandya et al. Jul 2003 A1
20030149881 Patel et al. Aug 2003 A1
20030152251 Ike Aug 2003 A1
20030152252 Kondo et al. Aug 2003 A1
20030156741 Lee et al. Aug 2003 A1
20030158762 Wu Aug 2003 A1
20030158821 Maia Aug 2003 A1
20030159051 Hollnagel Aug 2003 A1
20030163739 Armington et al. Aug 2003 A1
20030169334 Braithwaite et al. Sep 2003 A1
20030169901 Pavlidis et al. Sep 2003 A1
20030169907 Edwards et al. Sep 2003 A1
20030173408 Mosher, Jr. et al. Sep 2003 A1
20030174049 Beigel et al. Sep 2003 A1
20030177051 Driscoll et al. Sep 2003 A1
20030182151 Taslitz Sep 2003 A1
20030182182 Kocher Sep 2003 A1
20030189480 Hamid Oct 2003 A1
20030189481 Hamid Oct 2003 A1
20030191949 Odagawa Oct 2003 A1
20030194112 Lee Oct 2003 A1
20030195935 Leeper Oct 2003 A1
20030198368 Kee Oct 2003 A1
20030200180 Phelan, III et al. Oct 2003 A1
20030210139 Brooks et al. Nov 2003 A1
20030210802 Schuessier Nov 2003 A1
20030218719 Abourizk et al. Nov 2003 A1
20030225711 Paping Dec 2003 A1
20030228898 Rowe Dec 2003 A1
20030233556 Angelo et al. Dec 2003 A1
20030235326 Morikawa et al. Dec 2003 A1
20030235411 Morikawa et al. Dec 2003 A1
20030236120 Reece et al. Dec 2003 A1
20040001614 Russon et al. Jan 2004 A1
20040002894 Kocher Jan 2004 A1
20040005078 Tillotson Jan 2004 A1
20040006553 de Vries et al. Jan 2004 A1
20040010462 Moon et al. Jan 2004 A1
20040012760 Mihashi et al. Jan 2004 A1
20040019570 Bolle et al. Jan 2004 A1
20040023664 Mirouze et al. Feb 2004 A1
20040023709 Beaulieu et al. Feb 2004 A1
20040025030 Corbett-Clark et al. Feb 2004 A1
20040025031 Ooi et al. Feb 2004 A1
20040025053 Hayward Feb 2004 A1
20040029564 Hodge Feb 2004 A1
20040030930 Nomura Feb 2004 A1
20040035123 Kim et al. Feb 2004 A1
20040037450 Bradski Feb 2004 A1
20040039914 Barr et al. Feb 2004 A1
20040042641 Jakubowski Mar 2004 A1
20040044627 Russell et al. Mar 2004 A1
20040046640 Jourdain et al. Mar 2004 A1
20040049687 Orsini et al. Mar 2004 A1
20040050924 Mletzko et al. Mar 2004 A1
20040050930 Rowe Mar 2004 A1
20040052405 Walfridsson Mar 2004 A1
20040052418 DeLean Mar 2004 A1
20040059590 Mercredi et al. Mar 2004 A1
20040059953 Purnell Mar 2004 A1
20040104266 Bolle et al. Jun 2004 A1
20040117636 Cheng Jun 2004 A1
20040133804 Smith et al. Jul 2004 A1
20040146187 Jeng Jul 2004 A1
20040148526 Sands et al. Jul 2004 A1
20040160518 Park Aug 2004 A1
20040162870 Matsuzaki et al. Aug 2004 A1
20040162984 Freeman et al. Aug 2004 A1
20040169817 Grotehusmann et al. Sep 2004 A1
20040172541 Ando et al. Sep 2004 A1
20040174070 Voda et al. Sep 2004 A1
20040190759 Caldwell Sep 2004 A1
20040193893 Braithwaite et al. Sep 2004 A1
20040219902 Lee et al. Nov 2004 A1
20040233038 Beenau et al. Nov 2004 A1
20040240711 Hamza et al. Dec 2004 A1
20040252866 Tisse et al. Dec 2004 A1
20040255168 Murashita et al. Dec 2004 A1
20050008200 Azuma et al. Jan 2005 A1
20050008201 Lee et al. Jan 2005 A1
20050012817 Hampapur et al. Jan 2005 A1
20050029353 Isemura et al. Feb 2005 A1
20050052566 Kato Mar 2005 A1
20050055582 Bazakos et al. Mar 2005 A1
20050063567 Saitoh et al. Mar 2005 A1
20050084137 Kim et al. Apr 2005 A1
20050084179 Hanna et al. Apr 2005 A1
20050099288 Spitz et al. May 2005 A1
20050102502 Sagen May 2005 A1
20050110610 Bazakos et al. May 2005 A1
20050125258 Yellin et al. Jun 2005 A1
20050127161 Smith et al. Jun 2005 A1
20050129286 Hekimian Jun 2005 A1
20050134796 Zelvin et al. Jun 2005 A1
20050138385 Friedli et al. Jun 2005 A1
20050138387 Lam et al. Jun 2005 A1
20050146640 Shibata Jul 2005 A1
20050151620 Neumann Jul 2005 A1
20050152583 Kondo et al. Jul 2005 A1
20050193212 Yuhara Sep 2005 A1
20050199708 Friedman Sep 2005 A1
20050206501 Farhat Sep 2005 A1
20050206502 Bernitz Sep 2005 A1
20050207614 Schonberg et al. Sep 2005 A1
20050210267 Sugano et al. Sep 2005 A1
20050210270 Rohatgi et al. Sep 2005 A1
20050210271 Chou et al. Sep 2005 A1
20050212654 Yoda Sep 2005 A1
20050238214 Matsuda et al. Oct 2005 A1
20050240778 Saito Oct 2005 A1
20050248725 Ikoma et al. Nov 2005 A1
20050249385 Kondo et al. Nov 2005 A1
20050255840 Markham Nov 2005 A1
20060093190 Cheng et al. May 2006 A1
20060147094 Yoo Jul 2006 A1
20060165266 Hamza Jul 2006 A1
20060210119 Willis et al. Sep 2006 A1
20060274919 LoIacono et al. Dec 2006 A1
20070036397 Hamza Feb 2007 A1
20070140531 Hamza Jun 2007 A1
20070160266 Jones et al. Jul 2007 A1
20070189582 Hamza Aug 2007 A1
20070211924 Hamza Sep 2007 A1
20070274570 Hamza Nov 2007 A1
20070274571 Hamza Nov 2007 A1
20070286590 Terashima Dec 2007 A1
20080005578 Shafir Jan 2008 A1
20080075334 Determan et al. Mar 2008 A1
20080075441 Jelinek et al. Mar 2008 A1
20080104415 Palti-Wasserman et al. May 2008 A1
20080148030 Goffin Jun 2008 A1
20080211347 Wright et al. Sep 2008 A1
20080252412 Larsson et al. Oct 2008 A1
20080267456 Anderson Oct 2008 A1
20090046899 Northcott et al. Feb 2009 A1
20090092283 Whillock et al. Apr 2009 A1
20090316993 Brasnett et al. Dec 2009 A1
20100002913 Hamza Jan 2010 A1
20100033677 Jelinek Feb 2010 A1
20100034529 Jelinek Feb 2010 A1
20100142765 Hamza Jun 2010 A1
20100182440 McCloskey Jul 2010 A1
20100239119 Bazakos et al. Sep 2010 A1
Foreign Referenced Citations (189)
Number Date Country
0484076 May 1992 EP
0593386 Apr 1994 EP
0878780 Nov 1998 EP
0899680 Mar 1999 EP
0910986 Apr 1999 EP
0962894 Dec 1999 EP
1018297 Jul 2000 EP
1024463 Aug 2000 EP
1028398 Aug 2000 EP
1041506 Oct 2000 EP
1041523 Oct 2000 EP
1126403 Aug 2001 EP
1139270 Oct 2001 EP
1237117 Sep 2002 EP
1477925 Nov 2004 EP
1635307 Mar 2006 EP
2369205 May 2002 GB
2371396 Jul 2002 GB
2375913 Nov 2002 GB
2402840 Dec 2004 GB
2411980 Sep 2005 GB
9161135 Jun 1997 JP
9198545 Jul 1997 JP
9201348 Aug 1997 JP
9147233 Sep 1997 JP
9234264 Sep 1997 JP
9305765 Nov 1997 JP
9319927 Dec 1997 JP
10021392 Jan 1998 JP
10040386 Feb 1998 JP
10049728 Feb 1998 JP
10137219 May 1998 JP
10137221 May 1998 JP
10137222 May 1998 JP
10137223 May 1998 JP
10248827 Sep 1998 JP
10269183 Oct 1998 JP
11047117 Feb 1999 JP
11089820 Apr 1999 JP
11200684 Jul 1999 JP
11203478 Jul 1999 JP
11213047 Aug 1999 JP
11339037 Dec 1999 JP
2000005149 Jan 2000 JP
2000005150 Jan 2000 JP
2000011163 Jan 2000 JP
2000023946 Jan 2000 JP
2000083930 Mar 2000 JP
2000102510 Apr 2000 JP
2000102524 Apr 2000 JP
2000105830 Apr 2000 JP
2000107156 Apr 2000 JP
2000139878 May 2000 JP
2000155863 Jun 2000 JP
2000182050 Jun 2000 JP
2000185031 Jul 2000 JP
2000194972 Jul 2000 JP
2000237167 Sep 2000 JP
2000242788 Sep 2000 JP
2000259817 Sep 2000 JP
2000356059 Dec 2000 JP
2000357232 Dec 2000 JP
2001005948 Jan 2001 JP
2001067399 Mar 2001 JP
2001101429 Apr 2001 JP
2001167275 Jun 2001 JP
2001222661 Aug 2001 JP
2001292981 Oct 2001 JP
2001297177 Oct 2001 JP
2001358987 Dec 2001 JP
2002119477 Apr 2002 JP
2002133415 May 2002 JP
2002153444 May 2002 JP
2002153445 May 2002 JP
2002260071 Sep 2002 JP
2002271689 Sep 2002 JP
2002286650 Oct 2002 JP
2002312772 Oct 2002 JP
2002329204 Nov 2002 JP
2003006628 Jan 2003 JP
2003036434 Feb 2003 JP
2003108720 Apr 2003 JP
2003108983 Apr 2003 JP
2003132355 May 2003 JP
2003150942 May 2003 JP
2003153880 May 2003 JP
2003242125 Aug 2003 JP
2003271565 Sep 2003 JP
2003271940 Sep 2003 JP
2003308522 Oct 2003 JP
2003308523 Oct 2003 JP
2003317102 Nov 2003 JP
2003331265 Nov 2003 JP
2004005167 Jan 2004 JP
2004021406 Jan 2004 JP
2004030334 Jan 2004 JP
2004038305 Feb 2004 JP
2004094575 Mar 2004 JP
2004152046 May 2004 JP
2004163356 Jun 2004 JP
2004164483 Jun 2004 JP
2004171350 Jun 2004 JP
2004171602 Jun 2004 JP
2004206444 Jul 2004 JP
2004220376 Aug 2004 JP
2004261515 Sep 2004 JP
2004280221 Oct 2004 JP
2004280547 Oct 2004 JP
2004287621 Oct 2004 JP
2004315127 Nov 2004 JP
2004318248 Nov 2004 JP
2005004524 Jan 2005 JP
2005011207 Jan 2005 JP
2005025577 Jan 2005 JP
2005038257 Feb 2005 JP
2005062990 Mar 2005 JP
2005115961 Apr 2005 JP
2005148883 Jun 2005 JP
2005242677 Sep 2005 JP
WO 9717674 May 1997 WO
WO 9721188 Jun 1997 WO
WO 9802083 Jan 1998 WO
WO 9808439 Mar 1998 WO
WO 9932317 Jul 1999 WO
WO 9952422 Oct 1999 WO
WO 9965175 Dec 1999 WO
WO 0028484 May 2000 WO
WO 0029986 May 2000 WO
WO 0031677 Jun 2000 WO
WO 0036605 Jun 2000 WO
WO 0062239 Oct 2000 WO
WO 0101329 Jan 2001 WO
WO 0103100 Jan 2001 WO
WO 0128476 Apr 2001 WO
WO 0135348 May 2001 WO
WO 0135349 May 2001 WO
WO 0140982 Jun 2001 WO
WO 0163994 Aug 2001 WO
WO 0169490 Sep 2001 WO
WO 0186599 Nov 2001 WO
WO 0201451 Jan 2002 WO
WO 0219030 Mar 2002 WO
WO 0235452 May 2002 WO
WO 0235480 May 2002 WO
WO 02091735 Nov 2002 WO
WO 02095657 Nov 2002 WO
WO 03002387 Jan 2003 WO
WO 03003910 Jan 2003 WO
WO 03054777 Jul 2003 WO
WO 03077077 Sep 2003 WO
WO 2004029863 Apr 2004 WO
WO 2004042646 May 2004 WO
WO 2004055737 Jul 2004 WO
WO 2004089214 Oct 2004 WO
WO 2004097743 Nov 2004 WO
WO 2005008567 Jan 2005 WO
WO 2005013181 Feb 2005 WO
WO 2005024698 Mar 2005 WO
WO 2005024708 Mar 2005 WO
WO 2005024709 Mar 2005 WO
WO 2005029388 Mar 2005 WO
WO 2005062235 Jul 2005 WO
WO 2005069252 Jul 2005 WO
WO 2005093510 Oct 2005 WO
WO 2005093681 Oct 2005 WO
WO 2005096962 Oct 2005 WO
WO 2005098531 Oct 2005 WO
WO 2005104704 Nov 2005 WO
WO 2005109344 Nov 2005 WO
WO 2006012645 Feb 2006 WO
WO 2006023046 Mar 2006 WO
WO 2006051462 May 2006 WO
WO 2006063076 Jun 2006 WO
WO 2006081209 Aug 2006 WO
WO 2006081505 Aug 2006 WO
WO 2007101269 Sep 2007 WO
WO 2007101275 Sep 2007 WO
WO 2007101276 Sep 2007 WO
WO 2007103698 Sep 2007 WO
WO 2007103701 Sep 2007 WO
WO 2007103833 Sep 2007 WO
WO 2007103834 Sep 2007 WO
WO 2008016724 Feb 2008 WO
WO 2008019168 Feb 2008 WO
WO 2008019169 Feb 2008 WO
WO 2008021584 Feb 2008 WO
WO 2008031089 Mar 2008 WO
WO 2008040026 Apr 2008 WO
Related Publications (1)
Number Date Country
20070206840 A1 Sep 2007 US
Provisional Applications (1)
Number Date Country
60778770 Mar 2006 US