The present invention relates to image processing and identity verification technologies, and more specifically to systems and methods directed to a mobile platform for biometric acquisition and processing.
Typical systems for acquiring iris images for biometric matching are not optimized for compactness. Many different form factors of biometric systems exist. For iris biometrics, such devices are typically hand-held, desk-top or fixed installations. One of the problems with hand-held iris biometric devices is that they are bulky and have been designed only for an operator to carry as a special-purpose piece of equipment. Desk-top devices are easy to remove and steal, and fixed installations limit where biometric authentication can be performed.
In addition, biometric systems are typically designed to acquire optimal images by considering specific constraints of the type of biometric in question. If other data is to be acquired (e.g., face or background imagery), then typically different sensors are used, since the requirements for different types of imagery are very different. However, such an approach can add cost to the overall solution and may also increase the size or footprint of the system. In addition, the number of components required for image acquisition and processing (e.g., illuminators, sensors, positioning systems, storage for images, etc.) creates complexity in the design of an integrated device that is both mobile and compact. Moreover, the ability to acquire high quality biometric images in a compact device for efficient processing presents further challenges.
Certain aspects of the design of an embedded iris image acquisition device can optimize performance. These may include a means for positioning the user in the camera field of view; a means for ensuring that the illumination is pointed optimally at the user; a means for enabling the acquisition of both high-quality visible and infra-red images using the same sensor; a means for pointing the embedded device at the face of the user without having to pick up the device; an optimal configuration of components that maximizes the likelihood of acquiring high quality imagery when the user holds the device; a means to acquire successively higher-quality images of the iris in order to reduce memory requirements on a small embedded device where memory is limited, while at the same time ensuring that poorer quality images are matched when the accuracy requirements of the application permit it; a means to use the successively higher-quality acquired images to perform matching on a limited number of images on the device; a means to use the successively higher-quality acquired images to perform successive matching on a limited number of images on the device while at the same time encrypting the results; and a means to use the successively higher-quality acquired images in order to send a reduced number of images over a network or other connection, encrypted or un-encrypted, in order to reduce bandwidth across the connection.
In one aspect, the present systems and methods are directed to a compact, mobile apparatus for iris image acquisition. The apparatus may be adapted to address effects of ocular dominance in the subject and to guide positioning of the subject's iris for the image acquisition. The apparatus may include a sensor for acquiring an iris image from a subject. A compact mirror may be oriented relative to a dominant eye of the subject. The mirror may be sized to present an image of a single iris to the subject when the apparatus is positioned at a suitable distance for image acquisition. The mirror may assist the subject in positioning the iris for iris image acquisition. The mirror may be positioned between the sensor and the iris during iris image acquisition. The mirror may transmit a portion of light reflected off the iris to the sensor.
In some embodiments, the apparatus includes a connector for connecting to a computing device. The connector may extend from a lower end of the apparatus below the grasp of the subject's hand when operating the apparatus. In certain embodiments, the apparatus includes an articulated connector for connecting the apparatus to a computing device. The articulated connector may adjustably maintain a position of the apparatus for iris image acquisition. In some embodiments, the apparatus includes an infra-red illuminator integrated with a display screen of the apparatus. In certain embodiments, the apparatus includes a second mirror to present an image of a second iris to the subject when positioned at the suitable distance for image acquisition. In certain embodiments, the apparatus includes a contact region or button for a thumb or finger of the subject to initiate image acquisition while holding the apparatus.
In some embodiments, the mirror is located near one end of a body of the apparatus, the subject holding at least a portion of the other end while operating the apparatus. The mirror may include an adjustable or rotatable mount for tilting the mirror with respect to the dominant eye of the subject. In some embodiments, the apparatus includes at least one illuminator that may provide at least one of: infra-red illumination and visible illumination to illuminate a feature of the subject. The illuminator may be oriented to focus illumination primarily on the iris. In certain embodiments, the sensor is used to acquire an infra-red image of the iris and a non-infra-red image of a feature of the subject. The apparatus may include a filter array for filtering light to the sensor. The filter array may include a plurality of infra-red cut regions and a plurality of infra-red pass regions.
The sensor may acquire a plurality of images within a period of time. The apparatus may include an image processing module for selecting an iris image from the plurality of images. The selected iris image may be of better quality for biometric matching than at least some of the other images in the plurality of images. The image processing module may store the selected iris image in a buffer while acquiring or processing additional images. In some embodiments, the image processing module overwrites a previously-selected image stored in a buffer with the selected iris image. The image processing module may perform biometric matching on the selected iris image. The image processing module may send the selected iris image to a computer via a physical or wireless connection. In certain embodiments, the apparatus includes a display having an image of a portion of the subject's face. The display may move an image of the subject's eye towards a physical location of the sensor to guide the subject's gaze towards the sensor.
In another aspect, the present systems and methods are directed to a compact, mobile apparatus for iris image acquisition. The apparatus may be adapted to address effects of ocular dominance in the subject and to guide positioning of the subject's iris for the image acquisition. The apparatus may include a sensor. The apparatus may include a display for displaying an image of a portion of the subject's face. The display may move or shift an image of the subject's eye towards a physical location of the sensor to draw the subject's gaze towards the sensor. The sensor may acquire an image of the subject's iris for biometric matching when the subject's gaze is drawn to or near the sensor.
In yet another aspect, the present systems and methods are directed to a compact, mobile apparatus for iris image acquisition. The apparatus may include a sensor for acquiring a plurality of images of a subject over a period of time. The apparatus may include an image processing module for selecting an image of the subject's iris from the plurality of acquired images for further processing. The selected image may be of better quality for biometric matching than at least some of the other images in the plurality of acquired images.
In some embodiments, the image processing module selects the iris image based at least in part on a predetermined image quality threshold. The image processing module may store the selected iris image in a buffer while acquiring or processing additional images. The image processing module may overwrite a previously-selected image stored in a buffer with the selected iris image. The image processing module may perform biometric matching on the selected iris image. In certain embodiments, the image processing module encrypts the selected iris image. The image processing module may send the selected iris image to a computer via a physical or wireless connection.
In still another aspect, the present systems and methods are directed to a compact apparatus for iris image acquisition. The apparatus may include a sensor for acquiring an infra-red image of a subject's iris and a non-infra-red image of a feature of the subject. The apparatus may include a filter array for selectively filtering light to the sensor. The filter array may include a plurality of infra-red cut regions for sampling non-infra-red data for the non-infra-red image. The filter array may include a plurality of infra-red pass regions for sampling infra-red data for the infra-red image. The plurality of infra-red pass regions may sample infra-red data substantially at or below a corresponding Nyquist limit for the infra-red pass regions. In some embodiments, the plurality of infra-red pass regions sample infra-red data substantially at or below a corresponding Nyquist limit for the infra-red pass regions, by de-focusing the light being filtered.
In some embodiments, the plurality of infra-red cut regions samples visible data substantially at or below a corresponding Nyquist limit for the infra-red cut regions. The apparatus may include a look-up table or calculator for determining pixels of the sensor exposed to infra-red light passing through the filter array. In certain embodiments, the apparatus may include an interpolator for interpolating the sampled infra-red data to produce the infra-red image. The apparatus may include an interpolator for interpolating the sampled non-infra-red data to produce the non-infra-red image.
The following figures depict certain illustrative embodiments of the methods and systems described herein, where like reference numerals refer to like elements. Each depicted embodiment is illustrative of these methods and systems and not limiting.
Before addressing other aspects of the mobile identity platform, a description of system components and features suitable for use in the present systems and methods may be helpful.
In one embodiment, the computing environment 101 can include an appliance installed between the server(s) 106 and client machine(s) 102. This appliance can manage client/server connections, and in some cases can load balance client connections amongst a plurality of backend servers. The client machine(s) 102 can in some embodiments be referred to as a single client machine 102 or a single group of client machines 102, while server(s) 106 may be referred to as a single server 106 or a single group of servers 106. In one embodiment a single client machine 102 communicates with more than one server 106, while in another embodiment a single server 106 communicates with more than one client machine 102. In yet another embodiment, a single client machine 102 communicates with a single server 106.
A client machine 102 can, in some embodiments, be referenced by any one of the following terms: client machine(s) 102; client(s); client computer(s); client device(s); client computing device(s); local machine; remote machine; client node(s); endpoint(s); endpoint node(s); or a second machine. The server 106, in some embodiments, may be referenced by any one of the following terms: server(s), local machine; remote machine; server farm(s), host computing device(s), or a first machine(s).
The client machine 102 can in some embodiments execute, operate or otherwise provide an application that can be any one of the following: software; a program; executable instructions; a virtual machine; a hypervisor; a web browser; a web-based client; a client-server application; a thin-client computing client; an ActiveX control; a Java applet; software related to voice over internet protocol (VoIP) communications like a soft IP telephone; an application for streaming video and/or audio; an application for facilitating real-time-data communications; an HTTP client; an FTP client; an Oscar client; a Telnet client; or any other set of executable instructions. Still other embodiments include a client device 102 that displays application output generated by an application remotely executing on a server 106 or other remotely located machine. In these embodiments, the client device 102 can display the application output in an application window, a browser, or other output window. In one embodiment, the application is a desktop, while in other embodiments the application is an application that generates a desktop.
The computing environment 101 can include more than one server 106A-106N such that the servers 106A-106N are logically grouped together into a server farm 106. The server farm 106 can include servers 106 that are geographically dispersed and logically grouped together in a server farm 106, or servers 106 that are located proximate to each other and logically grouped together in a server farm 106. Geographically dispersed servers 106A-106N within a server farm 106 can, in some embodiments, communicate using a WAN, MAN, or LAN, where different geographic regions can be characterized as: different continents; different regions of a continent; different countries; different states; different cities; different campuses; different rooms; or any combination of the preceding geographical locations. In some embodiments the server farm 106 may be administered as a single entity, while in other embodiments the server farm 106 can include multiple server farms 106.
In some embodiments, a server farm 106 can include servers 106 that execute a substantially similar type of operating system platform (e.g., WINDOWS NT, manufactured by Microsoft Corp. of Redmond, Wash., UNIX, LINUX, or SNOW LEOPARD.) In other embodiments, the server farm 106 can include a first group of servers 106 that execute a first type of operating system platform, and a second group of servers 106 that execute a second type of operating system platform. The server farm 106, in other embodiments, can include servers 106 that execute different types of operating system platforms.
The server 106, in some embodiments, can be any server type. In other embodiments, the server 106 can be any of the following server types: a file server; an application server; a web server; a proxy server; an appliance; a network appliance; a gateway; an application gateway; a gateway server; a virtualization server; a deployment server; a SSL VPN server; a firewall; a web server; an application server or a master application server; a server 106 executing an active directory; or a server 106 executing an application acceleration program that provides firewall functionality, application functionality, or load balancing functionality. In some embodiments, a server 106 may be a RADIUS server that includes a remote authentication dial-in user service. Some embodiments include a first server 106A that receives requests from a client machine 102, forwards the request to a second server 106B, and responds to the request generated by the client machine 102 with a response from the second server 106B. The first server 106A can acquire an enumeration of applications available to the client machine 102 as well as address information associated with an application server 106 hosting an application identified within the enumeration of applications. The first server 106A can then present a response to the client's request using a web interface, and communicate directly with the client 102 to provide the client 102 with access to an identified application.
Client machines 102 can, in some embodiments, be a client node that seeks access to resources provided by a server 106. In other embodiments, the server 106 may provide clients 102 or client nodes with access to hosted resources. The server 106, in some embodiments, functions as a master node such that it communicates with one or more clients 102 or servers 106. In some embodiments, the master node can identify and provide address information associated with a server 106 hosting a requested application, to one or more clients 102 or servers 106. In still other embodiments, the master node can be a server farm 106, a client 102, a cluster of client nodes 102, or an appliance.
One or more clients 102 and/or one or more servers 106 can transmit data over a network 104 installed between machines and appliances within the computing environment 101. The network 104 can comprise one or more sub-networks, and can be installed between any combination of the clients 102, servers 106, computing machines and appliances included within the computing environment 101. In some embodiments, the network 104 can be: a local-area network (LAN); a metropolitan area network (MAN); a wide area network (WAN); a primary network 104 comprised of multiple sub-networks 104 located between the client machines 102 and the servers 106; a primary public network 104 with a private sub-network 104; a primary private network 104 with a public sub-network 104; or a primary private network 104 with a private sub-network 104. Still further embodiments include a network 104 that can be any of the following network types: a point to point network; a broadcast network; a telecommunications network; a data communication network; a computer network; an ATM (Asynchronous Transfer Mode) network; a SONET (Synchronous Optical Network) network; a SDH (Synchronous Digital Hierarchy) network; a wireless network; a wireline network; or a network 104 that includes a wireless link where the wireless link can be an infrared channel or satellite band. The network topology of the network 104 can differ within different embodiments; possible network topologies include: a bus network topology; a star network topology; a ring network topology; a repeater-based network topology; or a tiered-star network topology. Additional embodiments may include a network 104 of mobile telephone networks that use a protocol to communicate among mobile devices, where the protocol can be any one of the following: AMPS; TDMA; CDMA; GSM; GPRS; UMTS; 3G; 4G; or any other protocol able to transmit data among mobile devices.
Illustrated in
Embodiments of the computing machine 100 can include a central processing unit 121 characterized by any one of the following component configurations: logic circuits that respond to and process instructions fetched from the main memory unit 122; a microprocessor unit, such as: those manufactured by Intel Corporation; those manufactured by Motorola Corporation; those manufactured by Transmeta Corporation of Santa Clara, Calif.; the RS/6000 processor such as those manufactured by International Business Machines; a processor such as those manufactured by Advanced Micro Devices; or any other combination of logic circuits. Still other embodiments of the central processing unit 121 may include any combination of the following: a microprocessor, a microcontroller, a central processing unit with a single processing core, a central processing unit with two processing cores, or a central processing unit with more than one processing core.
While
In some embodiments, the processing unit 121 can include one or more processing cores. For example, the processing unit 121 may have two cores, four cores, eight cores, etc. In one embodiment, the processing unit 121 may comprise one or more parallel processing cores. The processing cores of the processing unit 121 may in some embodiments access available memory as a global address space, or in other embodiments, memory within the computing device 100 can be segmented and assigned to a particular core within the processing unit 121. In one embodiment, the one or more processing cores or processors in the computing device 100 can each access local memory. In still another embodiment, memory within the computing device 100 can be shared amongst one or more processors or processing cores, while other memory can be accessed by particular processors or subsets of processors. In embodiments where the computing device 100 includes more than one processing unit, the multiple processing units can be included in a single integrated circuit (IC). These multiple processors, in some embodiments, can be linked together by an internal high speed bus, which may be referred to as an element interconnect bus.
In embodiments where the computing device 100 includes one or more processing units 121, or a processing unit 121 including one or more processing cores, the processors can execute a single instruction simultaneously on multiple pieces of data (SIMD), or in other embodiments can execute multiple instructions simultaneously on multiple pieces of data (MIMD). In some embodiments, the computing device 100 can include any number of SIMD and MIMD processors.
The computing device 100, in some embodiments, can include an image processor, a graphics processor or a graphics processing unit. The graphics processing unit can include any combination of software and hardware, and can further input graphics data and graphics instructions, render a graphic from the inputted data and instructions, and output the rendered graphic. In some embodiments, the graphics processing unit can be included within the processing unit 121. In other embodiments, the computing device 100 can include one or more processing units 121, where at least one processing unit 121 is dedicated to processing and rendering graphics.
One embodiment of the computing machine 100 includes a central processing unit 121 that communicates with cache memory 140 via a secondary bus also known as a backside bus, while another embodiment of the computing machine 100 includes a central processing unit 121 that communicates with cache memory via the system bus 150. The local system bus 150 can, in some embodiments, also be used by the central processing unit to communicate with more than one type of I/O device 130A-130N. In some embodiments, the local system bus 150 can be any one of the following types of buses: a VESA VL bus; an ISA bus; an EISA bus; a MicroChannel Architecture (MCA) bus; a PCI bus; a PCI-X bus; a PCI-Express bus; or a NuBus. Other embodiments of the computing machine 100 include an I/O device 130A-130N that is a video display 124 that communicates with the central processing unit 121. Still other versions of the computing machine 100 include a processor 121 connected to an I/O device 130A-130N via any one of the following connections: HyperTransport, Rapid I/O, or InfiniBand. Further embodiments of the computing machine 100 include a processor 121 that communicates with one I/O device 130A using a local interconnect bus and a second I/O device 130B using a direct connection.
The computing device 100, in some embodiments, includes a main memory unit 122 and cache memory 140. The cache memory 140 can be any memory type, and in some embodiments can be any one of the following types of memory: SRAM; BSRAM; or EDRAM. Other embodiments include cache memory 140 and a main memory unit 122 that can be any one of the following types of memory: Static random access memory (SRAM), Burst SRAM or SynchBurst SRAM (BSRAM); Dynamic random access memory (DRAM); Fast Page Mode DRAM (FPM DRAM); Enhanced DRAM (EDRAM), Extended Data Output RAM (EDO RAM); Extended Data Output DRAM (EDO DRAM); Burst Extended Data Output DRAM (BEDO DRAM); Enhanced DRAM (EDRAM); synchronous DRAM (SDRAM); JEDEC SRAM; PC100 SDRAM; Double Data Rate SDRAM (DDR SDRAM); Enhanced SDRAM (ESDRAM); SyncLink DRAM (SLDRAM); Direct Rambus DRAM (DRDRAM); Ferroelectric RAM (FRAM); or any other type of memory. Further embodiments include a central processing unit 121 that can access the main memory 122 via: a system bus 150; a memory port 103; or any other connection, bus or port that allows the processor 121 to access memory 122.
One embodiment of the computing device 100 provides support for any one of the following installation devices 116: a CD-ROM drive, a CD-R/RW drive, a DVD-ROM drive, tape drives of various formats, a USB device, a bootable medium, a bootable CD, a bootable CD for a GNU/Linux distribution such as KNOPPIX®, a hard-drive or any other device suitable for installing applications or software. Applications can in some embodiments include a client agent 120, or any portion of a client agent 120. The computing device 100 may further include a storage device 128 that can be either one or more hard disk drives, or one or more redundant arrays of independent disks; where the storage device is configured to store an operating system, software, programs, applications, or at least a portion of the client agent 120. A further embodiment of the computing device 100 includes an installation device 116 that is used as the storage device 128.
The computing device 100 may further include a network interface 118 to interface to a Local Area Network (LAN), Wide Area Network (WAN) or the Internet through a variety of connections including, but not limited to, standard telephone lines, LAN or WAN links (e.g., 802.11, T1, T3, 56 kb, X.25, SNA, DECNET), broadband connections (e.g., ISDN, Frame Relay, ATM, Gigabit Ethernet, Ethernet-over-SONET), wireless connections, or some combination of any or all of the above. Connections can also be established using a variety of communication protocols (e.g., TCP/IP, IPX, SPX, NetBIOS, Ethernet, ARCNET, SONET, SDH, Fiber Distributed Data Interface (FDDI), RS232, RS485, IEEE 802.11, IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, CDMA, GSM, WiMax and direct asynchronous connections). One version of the computing device 100 includes a network interface 118 able to communicate with additional computing devices 100′ via any type and/or form of gateway or tunneling protocol such as Secure Socket Layer (SSL) or Transport Layer Security (TLS), or the Citrix Gateway Protocol manufactured by Citrix Systems, Inc. Versions of the network interface 118 can comprise any one of: a built-in network adapter; a network interface card; a PCMCIA network card; a card bus network adapter; a wireless network adapter; a USB network adapter; a modem; or any other device suitable for interfacing the computing device 100 to a network capable of communicating and performing the methods and systems described herein.
Embodiments of the computing device 100 include any one of the following I/O devices 130A-130N: a keyboard 126; a pointing device 127; mice; trackpads; an optical pen; trackballs; microphones; drawing tablets; video displays; speakers; inkjet printers; laser printers; dye-sublimation printers; or any other input/output device able to perform the methods and systems described herein. An I/O controller 123 may in some embodiments connect to multiple I/O devices 130A-130N to control the one or more I/O devices. Some embodiments of the I/O devices 130A-130N may be configured to provide storage or an installation medium 116, while others may provide a universal serial bus (USB) interface for receiving USB storage devices such as the USB Flash Drive line of devices manufactured by Twintech Industry, Inc. Still other embodiments include an I/O device 130 that may be a bridge between the system bus 150 and an external communication bus, such as: a USB bus; an Apple Desktop Bus; an RS-232 serial connection; a SCSI bus; a FireWire bus; a FireWire 800 bus; an Ethernet bus; an AppleTalk bus; a Gigabit Ethernet bus; an Asynchronous Transfer Mode bus; a HIPPI bus; a Super HIPPI bus; a SerialPlus bus; a SCI/LAMP bus; a FibreChannel bus; or a Serial Attached small computer system interface bus.
In some embodiments, the computing machine 100 can execute any operating system, while in other embodiments the computing machine 100 can execute any of the following operating systems: versions of the MICROSOFT WINDOWS operating systems; the different releases of the Unix and Linux operating systems; any version of the MAC OS manufactured by Apple Computer; OS/2, manufactured by International Business Machines; Android by Google; any embedded operating system; any real-time operating system; any open source operating system; any proprietary operating system; any operating systems for mobile computing devices; or any other operating system. In still another embodiment, the computing machine 100 can execute multiple operating systems. For example, the computing machine 100 can execute PARALLELS or another virtualization platform that can execute or manage a virtual machine executing a first operating system, while the computing machine 100 executes a second operating system different from the first operating system.
The computing machine 100 can be embodied in any one of the following computing devices: a computing workstation; a desktop computer; a laptop or notebook computer; a server; a handheld computer; a mobile telephone; a portable telecommunication device; a media playing device; a gaming system; a mobile computing device; a netbook, a tablet; a device of the IPOD or IPAD family of devices manufactured by Apple Computer; any one of the PLAYSTATION family of devices manufactured by the Sony Corporation; any one of the Nintendo family of devices manufactured by Nintendo Co; any one of the XBOX family of devices manufactured by the Microsoft Corporation; or any other type and/or form of computing, telecommunications or media device that is capable of communication and that has sufficient processor power and memory capacity to perform the methods and systems described herein. In other embodiments the computing machine 100 can be a mobile device such as any one of the following mobile devices: a JAVA-enabled cellular telephone or personal digital assistant (PDA); any computing device that has different processors, operating systems, and input devices consistent with the device; or any other mobile computing device capable of performing the methods and systems described herein. In still other embodiments, the computing device 100 can be any one of the following mobile computing devices: any one series of Blackberry, or other handheld device manufactured by Research In Motion Limited; the iPhone manufactured by Apple Computer; Palm Pre; a Pocket PC; a Pocket PC Phone; an Android phone; or any other handheld mobile device. Having described certain system components and features that may be suitable for use in the present systems and methods, further aspects are addressed below.
Another type of noise is anisotropic, or systematic/periodic noise. Periodic noise can be caused, for example, by differences in amplifier gains in the read-out path of the image sensor. For example, different rows and columns may pass through different amplifiers with slightly different gains. This type of systematic noise is depicted in
Problems arising from noise are typically addressed by performing noise reduction in an image processing module 220. The image processing module 220 may employ any type of spatial median filtering or region-selective averaging, as depicted in
Frequency characteristics of the iris signal "texture" have been characterized to some degree in NIST standards [ANSI/INCITS 379-2004, Iris Image Interchange Format]; for example, minimum resolution values corresponding to line-pairs per millimeter (mm) may be designated for different iris diameter ranges. The iris diameter may be dependent on a particular optical configuration. By way of illustration, for an iris diameter between 100-149 pixels, the defined pixel resolution may be a minimum of 8.3 pixels per mm, with an optical resolution at 60% modulation of a minimum of 2.0 line-pairs per mm. For an iris diameter between 150-199 pixels, the defined pixel resolution may be a minimum of 12.5 pixels per mm, with an optical resolution at 60% modulation of a minimum of 3.0 line-pairs per mm. For an iris diameter of 200 or more pixels, the defined pixel resolution may be a minimum of 16.7 pixels per mm, with an optical resolution at 60% modulation of a minimum of 4.0 line-pairs per mm. Other combinations of diameter, defined pixel resolution and/or optical resolution may be suitable in certain embodiments.
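For illustration only, the minimum values quoted above can be expressed as a simple configuration check; the following Python sketch encodes just the three diameter ranges and thresholds summarized in this paragraph and is not a reproduction of the standard.

```python
def meets_iris_resolution_spec(iris_diameter_px, pixels_per_mm, lp_per_mm_at_60pct):
    """Check a configuration against the minimum resolutions summarized above
    (ANSI/INCITS 379-2004 ranges as quoted in the text, not the full standard)."""
    if iris_diameter_px >= 200:
        return pixels_per_mm >= 16.7 and lp_per_mm_at_60pct >= 4.0
    if iris_diameter_px >= 150:
        return pixels_per_mm >= 12.5 and lp_per_mm_at_60pct >= 3.0
    if iris_diameter_px >= 100:
        return pixels_per_mm >= 8.3 and lp_per_mm_at_60pct >= 2.0
    return False  # below the smallest diameter range covered by the quoted values

# Example: a 160-pixel iris imaged at 13.0 px/mm and 3.2 lp/mm at 60% modulation
print(meets_iris_resolution_spec(160, 13.0, 3.2))  # True
```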
The present systems and methods can address this problem by recognizing particular characteristics related to iris recognition.
Adding systematic noise, however, to the pristine iris signal, as shown in
Another challenge relating to acquiring optimal standard scene imagery and iris imagery on the same sensor relates to the wavelength of the illumination required for standard imagery and for iris imagery. Iris imagery typically requires infra-red illumination, while standard imagery typically requires visible illumination, so the constraints sometimes conflict. Some embodiments of the present systems may be configured to address this by interleaving filters having different responses to infra-red and visible light. These systems may use one of a plurality of different configurations of such filters against an image sensor when capturing an image. One example of a filter that may be incorporated or modified to produce an interleaved filter is one having a Bayer RGB (red, green, blue) filter pattern (see, e.g., U.S. Pat. No. 3,971,065). Filters that (primarily, significantly or only) pass infra-red may be interleaved with other filters that (primarily, significantly or only) pass colored or visible light. Some embodiments of filters that provide selected filtering are described in U.S. Pat. Pub. 20070145273, and U.S. Pat. Pub. 20070024931. Some embodiments of the present systems and methods use an R,G,(G+I),B interleaved array instead. Some of these systems have the ability to maintain full (or substantially full) resolution of the G (green) signal, to which the human visual system is typically most sensitive.
In iris recognition mode, the magnitude of the G (green) response is typically much less than that of the infra-red response due to incident infra-red illumination. In some embodiments, an estimate of the infra-red signal response (I) in iris recognition mode can be recovered by subtracting the (G) signal from the adjacent (G+I) signal. In standard image acquisition mode, the R,G,(G+I),B signal may be processed to recover an estimate G′ of G in the pixel in which G+I was recovered. Various methods may be used for generating such estimates, such as when an R,G,T,B pixel array is used, where T is totally transparent. The T pixel in such an implementation may contain the R,G,B and I signals accumulated or superimposed together. This can be problematic. If the T pixel filter is truly transparent, then for effective performance, the sum of the R,G,B,I responses must still lie within the dynamic range of the pixel. For a given integration time and pixel area throughout an entire imager, this means that the dynamic range of the R,G,B pixels cannot be fully utilized, since saturation of the T pixel (R+G+B+I) could occur. Setting different pixel areas or gain for the T pixel compared to the other R,G,B pixels may be possible but may be expensive to implement. One improvement, which may be incorporated into the present systems, is to use a neutral density filter in place of the transparent filter. The neutral density filter may reduce the magnitude of the illumination of all wavelengths (R,G,B and I) at that pixel, and may allow a full or wide range of pixel capacities to be exploited in the R,G,B pixels, thereby reducing noise. A neutral density filter with a value of 0.5 to 0.6 can be selected as an example. A green signal typically contributes approximately 60% of a luminance signal comprised of R, G and B combined together.
If a T filter is truly transparent, the overall dynamic range of the sensor will typically need to be reduced to accommodate the range of the T pixel and keep it within a linear range, at the expense of the signal-to-noise ratio of the R,G,B pixels. By incorporating an R,G,(G+I),B filter array in some embodiments of our systems, and since red and blue signals are not present in the (G+I) pixel, the overall dynamic range of the sensor may be increased compared to that of an R,G,T,B array, thereby increasing the signal-to-noise ratio.
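As an illustration of recovering the infra-red estimate by subtracting the adjacent (G) signal from the (G+I) signal, the following Python sketch assumes, purely for illustration, a 2×2 repeating cell with R and G on one row and (G+I) and B on the next; the actual interleaving pattern and any gain corrections would depend on the particular sensor.

```python
import numpy as np

def recover_ir_estimate(raw):
    """Estimate the infra-red component at each (G+I) site by subtracting the
    green value sampled at the neighbouring plain-G site.

    `raw` is a single-channel mosaic with even dimensions, assumed to repeat
    in 2x2 cells laid out as:
        R      G
        (G+I)  B
    The layout, and the absence of per-channel gain correction, are
    simplifying assumptions for illustration only.
    """
    r      = raw[0::2, 0::2].astype(np.float32)
    g      = raw[0::2, 1::2].astype(np.float32)
    g_plus = raw[1::2, 0::2].astype(np.float32)
    b      = raw[1::2, 1::2].astype(np.float32)
    ir_est = np.clip(g_plus - g, 0.0, None)  # I ~ (G+I) - G from the adjacent cell
    return ir_est, (r, g, b)
```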
Another approach incorporated in some embodiments of our methods and systems for acquiring optimal standard scene imagery and iris imagery on the same sensor, relating to the wavelength of the illumination, involves multiplexing or positioning an infra-red cut filter over a standard image sensor or lens. In one embodiment, a portion of the sensor (for example, 20% of the sensor or sensor nodes) may be designated primarily for iris recognition, while the remaining (e.g., 80%) portion may be used for standard image acquisition, for example as shown in
Another approach incorporated within some embodiments of the present systems and methods uses a dual band-pass filter over the entire or a substantial portion of a color imager or sensor. Such a filter may pass both R,G,B signals and infra-red signals within select bands, such as bands around 850 nm or 940 nm, and may yield a frequency response as depicted in
In some embodiments, the image acquisition system may interleave infra-red cut and infra-red pass filters across the sensor, for example as shown in
In some embodiments, an image acquired by an image sensor may be affected or corrupted by ambient illumination. For example, in some embodiments, where infra-red filtering and/or illumination is not optimal, images of a scene can be reflected off a surface of an eye (e.g., the cornea) during acquisition of iris imagery. An example of this is shown in
Another embodiment of the present methods manages corruption of images by exploiting particular geometrical constraints of the position of the user, the image-capturing device and the source of the corruption or artifacts. The image processing module may be configured to recognize that as the user holds the image-capturing device in front of the user's face during iris acquisition mode, the image-capturing device may reduce or even block sources of corrupting ambient illumination within one sector of the acquired iris imagery, for example as shown in
In some embodiments, infra-red illumination is not readily available or guaranteed during image capture. The image acquisition system 200 may be configured to control and/or provide infra-red illumination. The image acquisition system may reduce power usage by illuminating the infra-red source (e.g., LEDs) when the device is in iris recognition mode, as shown in
In some embodiments, the image acquisition system 200 may include infra-red illuminators embedded into a screen of the image acquisition system 200, for illuminating a user's eye with infra-red illumination. Screens and displays typically use white LED illumination under an LCD matrix. By adding to or replacing some portion of the visible light LEDs with near infra-red illuminators, a source of IR illumination may be provided by the display itself. In such an embodiment, the image acquisition system 200 may not require an additional fixture or area on the image acquisition system 200 to provide infra-red illumination, thereby saving space.
In certain embodiments, the image acquisition system 200 may include a visible illuminator, for example with two illumination strengths. The visible illuminator may be turned on at low power during iris image acquisition mode. The low power illumination may be chosen so as to not distract or cause discomfort to the user. In some embodiments, the brightness level in the low power mode can be at least a factor of 2 darker than the full brightness of the visible illuminator. The latter brightness level may, for example, be used to illuminate a wider scene. The low power visible illuminator may be used to constrict the pupil and increase the visible iris area, regardless of whether the user is in the dark or not. However, since the visible illuminator may be close to the eye, some of the filters described above may still pass significant visible light into the sensor. Therefore, in some embodiments, the visible light is turned off before images of the iris are acquired, while the near infra-red illuminator is turned on. In an alternate embodiment, the screen itself can be used as a source of visible illumination.
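A minimal sketch of this illumination sequencing is given below; `camera`, `visible_led` and `nir_led` are hypothetical driver objects standing in for whatever hardware interface a given embodiment provides, and the power levels and timing are illustrative only.

```python
import time

def acquire_iris_frames(camera, visible_led, nir_led, n_frames=4,
                        constrict_seconds=0.5):
    """Illumination sequencing sketch: low-power visible light first to
    constrict the pupil (enlarging the visible iris area), then visible off
    and near-IR on for the actual captures."""
    visible_led.set_level(0.3)      # low power: enough to constrict the pupil
    time.sleep(constrict_seconds)   # give the pupil time to respond
    visible_led.set_level(0.0)      # avoid visible leakage through the filters
    nir_led.set_level(1.0)
    frames = [camera.capture() for _ in range(n_frames)]
    nir_led.set_level(0.0)
    return frames
```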
In some embodiments, one advantage of using a single sensor in the image acquisition system 200 is that the space occupied by the system can be minimized compared to the use of dual sensors. However, in either case, an important consideration is the ability of the user and/or operator to use the single-sensor or dual-sensor device effectively.
In some embodiments, a mirrored surface may be used to help guide a user in aligning the user's iris with a suitable capture zone of the image sensor. A mirrored surface can provide feedback to the user of the user's position, as depicted in
Ocular dominance is a tendency to prefer visual input from one eye or the other. It occurs in most individuals, with ⅔ of individuals having right-eyed dominance and ⅓ of individuals having left-eyed dominance. The present systems and methods address ocular dominance and combine properties of ocular dominance with constraints of iris recognition in order to maximize the size of recovered iris imagery while minimizing the size of a mirror used to guide the user.
Illustrated in
Further referring to
In some embodiments, the image acquisition system 200 may comprise an iris capturing mode and a picture (e.g., non-iris) capturing mode. The image sensor may capture an image of the view of the scene in picture capturing mode. The image sensor may capture an image of the view of the iris in iris capturing mode. In certain embodiments, the image acquisition system 200 may perform concurrent capture of iris and non-iris imagery in another mode. A user may select a mode for image acquisition, for example, via an application executing on the image acquisition device 200. In some embodiments, the image acquisition system may capture the view of the scene and the view of the iris as separable components within a single image. The image acquisition system may capture the view of the scene and/or the view of the iris using any embodiment and/or combination of the interleaved filter, IR-cut filter, IR-pass filter, and other types of filters described herein.
In some embodiments, the image sensor comprises a plurality of sensor nodes of the image sensor. The image sensor may activate a first subset of the sensor nodes adapted primarily for capturing an image of the iris suitable for biometric identification. The image sensor may activate a second subset of the sensor nodes adapted primarily for capturing a non-iris image. An IR-pass, (G+I) filter (e.g., allowing G+I to pass), or other filter may be applied over a sensor node adapted primarily for capturing an image of the iris. An IR-cut, visible-pass, specific bandpass or color filter may be applied over a sensor node adapted primarily for capturing a non-iris image.
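The determination of which sensor nodes sit behind infra-red pass filters can be as simple as a boolean look-up table; the following sketch assumes, purely for illustration, that one node in every 4×4 block is IR-pass and the remainder are IR-cut.

```python
import numpy as np

def ir_pass_mask(rows, cols, period=4):
    """Boolean look-up table marking the sensor nodes behind IR-pass filter
    sites; all other nodes are treated as IR-cut (visible)."""
    mask = np.zeros((rows, cols), dtype=bool)
    mask[::period, ::period] = True  # one IR-pass node per period x period block
    return mask

def split_channels(raw, mask):
    """Separate the sparse IR samples from the visible samples of a raw frame."""
    ir_samples = np.where(mask, raw, 0)
    visible    = np.where(~mask, raw, 0)
    return ir_samples, visible
```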
In some embodiments, the image sensor captures at least one image of the iris while illuminating the iris with infra-red illumination. The image sensor may capture at least one image of the iris without infra-red illumination. The image sensor may capture at least one image of the iris upon turning off a visible light illuminator. The image sensor may capture at least one image of the iris using illumination from a screen of the image acquisition system 200. The image sensor may capture at least one image of the iris when the iris is aligned with a portion of the sensor using a mirror of the image acquisition system 200 for guidance. The image sensor may capture at least one image of the iris when the iris is aligned with a portion of the sensor by an operator using a see-through guidance channel and/or markers.
Further referring to (384), an image processing module may apply a level of noise reduction to a first portion of the at least one image to produce an image of the scene. The image acquisition system 200 may apply noise reduction on an image captured by the image sensor. The image acquisition system 200 may apply noise reduction on an image stored in the image acquisition system 200, e.g., in a storage device or buffer. The image acquisition system 200 may apply noise reduction comprising applying an averaging or median function or filter over some pixels of an image, e.g., over a 3×3 pixel window. The image acquisition system 200 may apply noise reduction comprising reduction of one of, or both of, time-varying and time-invariant noise from a captured image. The image acquisition system 200 may account for or exclude a known faulty pixel while performing image processing and/or noise reduction. The image acquisition system 200 may apply noise reduction using an image processing module which may include one or more image signal processors 206 and/or other processors 208. The image acquisition system 200 may apply noise reduction by identifying, accounting for and/or compensating for the presence of systematic noise.
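As an illustration of such region-selective noise reduction, the sketch below applies a median filter only outside an iris region of interest; the `iris_mask` input is assumed to come from whatever eye-locating step a given embodiment uses.

```python
import numpy as np
from scipy.ndimage import median_filter

def region_selective_denoise(image, iris_mask, window=3):
    """Apply a window x window median filter everywhere except the iris
    region, so that fine iris texture is not smoothed away."""
    smoothed = median_filter(image, size=window)
    return np.where(iris_mask, image, smoothed)
```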
In some embodiments, the image processing module may apply noise reduction on an image captured in non-iris capturing mode. The image processing module may apply a level of noise reduction to a portion of an image not for iris biometric identification, e.g., a portion corresponding to an IR-cut filter. The image processing module may apply noise reduction or filtering on a general or non-iris image. The image processing module may generate an image of a general scene that is perceptibly better (e.g., to a human) than an image before noise reduction.
Further referring to (386), the image processing module may apply a reduced level of noise reduction to a second portion of the at least one image to produce an image of the iris for use in biometric identification. In some embodiments, the image processing module may disable noise reduction on an image for use in iris biometric identification. The image processing module may determine that the noise level does not overwhelm the captured iris texture. The image processing module may perform iris biometric identification based on a raw or unprocessed image captured by the image sensor. The image processing module may perform iris biometric identification based on an image captured by the image sensor after some processing, e.g., removal of artifacts, sporadic noise and/or systematic noise.
In some embodiments, the image processing module may apply a reduced level of noise reduction to an image for use in iris biometric identification. The image processing module may apply a reduced level of noise reduction to an image captured while in iris capturing mode. The image processing module may perform noise reduction for systematic and/or sporadic noise. The image processing module may disable noise reduction for non-systematic noise. The image processing module may apply a reduced level of noise reduction to a portion of an image extracted for iris biometric identification, e.g., a portion corresponding to an IR-pass filter. The image processing module may apply reduction of systematic noise to a portion of an image extracted for iris biometric identification, e.g., a portion corresponding to an IR-pass filter.
In some embodiments, the image processing module 220 subtracts noise from one image of the iris with noise from another image of the iris. Such subtraction may result in reduced systematic noise and/or sporadic noise. The image processing module 220 may align two images together to perform the subtraction. The image processing module 220 may align two images using common points of reference (e.g., edge of shapes). The image processing module 220 may align two images by using pattern recognition/matching, correlation and/or other algorithms. The image processing module 220 may subtract noise corresponding to overlapping portion of two images. The image processing module 220 may reduce ambient noise in one image using ambient noise from another image. Ambient noise may comprise signals from ambient light or illumination. Ambient noise may comprise artifacts from surrounding illumination sources or reflections of surrounding objects off a surface of the eye. In some embodiments, the image processing module 220 may reduce ambient noise from one image captured in the presence of infra-red illumination, using ambient noise from another image captured without infra-red illumination.
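A coarse sketch of this ambient-noise subtraction is given below; the frame captured without infra-red illumination is aligned to the infra-red frame by an exhaustive search over small translations (a normalized cross-correlation or feature-based alignment would be more robust) and then subtracted.

```python
import numpy as np

def subtract_ambient(ir_on, ir_off, shift_search=5):
    """Subtract an ambient-only frame (IR illumination off) from a frame
    captured with IR illumination on, after a coarse translational alignment
    found by exhaustive search over small shifts."""
    best_shift, best_score = (0, 0), -np.inf
    a = ir_on.astype(np.float32)
    b = ir_off.astype(np.float32)
    for dy in range(-shift_search, shift_search + 1):
        for dx in range(-shift_search, shift_search + 1):
            score = np.sum(a * np.roll(b, (dy, dx), axis=(0, 1)))  # correlation score
            if score > best_score:
                best_score, best_shift = score, (dy, dx)
    aligned = np.roll(b, best_shift, axis=(0, 1))
    return np.clip(a - aligned, 0, 255).astype(np.uint8)
```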
In certain embodiments, the image processing module 220 may recover an infra-red component from one or more (G+I) pixels imaged on a sensor node array. The image processing module 220 may subtract the G component from (G+I) using a G intensity value in a neighboring pixel. In some embodiments, the image processing module 220 may subtract the G component using an estimated G intensity value. The image processing module 220 may use the estimated G intensity value in processing a non-iris (e.g., general scene) portion of an image. In some embodiments, the image processing module 220 may perform gain or brightness control or adjustment on a portion of the at least one image, to produce an image of the iris for use in biometric identification. In some embodiments, the amount of infra-red illumination may be insufficient or sub-optimal, so that gain or brightness control or adjustment can improve iris image quality. In certain embodiments, gain or brightness control or adjustment may be preferable to adding infra-red illuminators, drawing power to provide infra-red illumination, and/or controlling infra-red illumination (e.g., under different conditions). Since infra-red signals are captured by a fraction of the sensor nodes/pixels (e.g., in a RGB(G+I) array), compensation via gain or brightness control or adjustment may be appropriate.
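Such gain or brightness adjustment can be as simple as a capped digital gain that brings the iris region to a target mean intensity, as in the sketch below; the target value and gain cap are arbitrary illustrative choices.

```python
import numpy as np

def normalize_brightness(iris_region, target_mean=128.0, max_gain=8.0):
    """Apply a capped digital gain to a dim infra-red iris crop; the cap
    limits how much sensor noise is amplified along with the signal."""
    current = max(float(iris_region.mean()), 1e-3)
    gain = min(target_mean / current, max_gain)
    return np.clip(iris_region.astype(np.float32) * gain, 0, 255).astype(np.uint8)
```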
In some aspects, the present systems and methods are directed to a compact, mobile biometric system. The compact, mobile biometric system may acquire imagery primarily to determine or verify the identity of an individual person. The mobile biometric system may perform biometric recognition using the iris. The mobile biometric system may capture pictures of faces for subsequent biometric recognition or viewing by a human.
In some embodiments, the biometric device is designed so that a user can not only use it without help from an operator, but can also carry it with them. In this way there is no equipment at a location for a thief to steal, and also no equipment to be maintained at a location. Compact, mobile devices exist for the fingerprint biometric, whereby a fingerprint reader is embedded onto a USB key device. A user places their finger on the device and the fingerprint is read. However, we have found that this approach is not suitable for the iris or face biometric. For example, if the mobile fingerprint reader device is plugged into a USB (or other) socket of a computer, a user can easily contort their finger to swipe the reader, but it is much more difficult and very inconvenient to orientate their eye and body towards an iris or face biometric acquisition device. We have developed a mobile biometric device that enables acquisition of the iris and/or face by a user very easily and effectively, and in a platform that is simple to carry.
The first aspect of the invention is a means to easily orientate the device towards the user. Methods exist for this, including cradles and other mechanical orientating assemblies. However, these are bulky and not easily carried by a user. We have developed a method that allows the orientation of the device to be adjusted and then maintained, even while the device is rigidly plugged into a USB or other socket of a computer. One preferred means we use to perform this is to articulate the device into two parts. The first part contains at least the USB (or other) plug and the second part contains at least the optical component, as shown in
A further aspect of the invention relates to improving the user experience while enhancing performance at the same time. More specifically, many iris biometric devices have a mirror on them to guide the user towards them. We have found that when such centering mechanisms have approximately the size of the distance between the eyes (approximately 5-7 cm, "Variation and extrema of human interpupillary distance", Neil A Dodgson, Proc. SPIE Vol. 5291, Stereoscopic Displays and Virtual Reality Systems XI, 19-22 Jan. 2004, San Jose, Calif., ISSN 0277-786X, pp 36-46), and the distance to the device is at least 50 cm, a user is typically able to center themselves in front of the device. However, we have found that as the device gets smaller than the eye separation, and as the distance to the device decreases to 50 cm or less, then for a device used by a user (as opposed to an operator) an effect resulting from ocular dominance becomes very typical. Approximately ⅔ of the population has ocular dominance (or eyedness) in the right eye, and the remaining ⅓ have ocular dominance in the left eye (Chaurasia B D, Mathur B B (1976). "Eyedness". Acta Anat (Basel) 96 (2): 301-5). Under the conditions just described, the user naturally begins to bring the device not towards the middle of the face, but towards one eye or the other depending on their ocular dominance. The user experience can therefore be very confusing, since the feedback from the mirror is encouraging the user to center themselves, but the natural human response due to ocular dominance is to do exactly the opposite.
Rather than overcoming ocular dominance, we take advantage of it in our invention. We do this in two parts. First, we still use a centering device such as a mirror that a user can use for feedback (other centering mechanisms, such as an inverted cone with different colored circles in the cone, can also be used). In the case of the mirror, we actually reduce its size so that it is impossible for two eyes to appear in the field of view and therefore confuse the user. In the case of a hand-held device, the length of the arm limits the distance of the device from the user. With these parameters we have found that a mirrored surface of 0.5 cm-5 cm in diameter, depending on the device, achieves this purpose. The second component of this aspect of the invention is that we choose a lens and imager such that even though the user is holding up the device centered on one eye, we are sure to capture imagery of the second eye even though the device is not in front of it. Therefore, while the user believes that the device is acquiring data only from one eye, in fact data is being acquired from the second eye also. There are many means for detecting two eyes from a single acquired image. For example, the specular reflections of an illuminator reflected off the cornea can be detected in the image and used to mark the locations of the eyes.
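One simple way to detect such specular reflections is to threshold the image for small, near-saturated blobs; the sketch below does this with connected-component labelling, with the intensity threshold and blob-size limit chosen purely for illustration.

```python
import numpy as np
from scipy import ndimage

def find_specular_highlights(ir_image, intensity_thresh=240, max_blob_area=50):
    """Locate small, near-saturated blobs, i.e. candidate corneal glints from
    the illuminator; large bright regions are rejected as non-glints."""
    bright = ir_image >= intensity_thresh
    labels, n = ndimage.label(bright)
    centers = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        if ys.size <= max_blob_area:
            centers.append((float(ys.mean()), float(xs.mean())))
    return centers  # (row, col) centres; two well-separated glints suggest two eyes
```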
While the present systems and methods are capable of many embodiments, only a few illustrative embodiments are described herein. Referring first to
Upon initiating the acquisition, a local list of successively better images from the prior subject is cleared 1101 in preparation for the next subject. An image is then acquired 1102 using a camera system. A camera system is used that can either capture images synchronously at a constant rate, or asynchronously on request by a computer-controlled trigger signal. As discussed later, the camera may be operated at a variable acquisition rate depending on the results of previous processing.
A Quality Metric module comprising, for example, one or more of the following sub-modules: a face detector, an eye detector, a focus measurement, and an iris area detector, is used 1103 to measure the quality of each acquired image in sequence, when sufficient computing capacity is available but not necessarily simultaneously with image acquisition. As discussed later, one or all of these modules may be performed at a particular time instant depending on the results of previous processing. The quality analysis and selection system of Martin et al. in US 2008/0075335, supra, which is hereby incorporated by reference in its entirety, is one suitable Quality Metric system 1103 for the purposes of the current invention, with the additional feature of the present invention wherein only the best, or a small, limited number of the highest-quality, acquired images are stored in memory.
An Acquisition Stopped module 1104 performs an Acquisition Stopped routine. This module 1104 ensures that the overall process is not performed unnecessarily if, for example, the subject has walked away without any data being acquired. The Acquisition Stopped module may consist of a time-out counter that compares to a threshold the difference between the current time and the time that the Acquisition process was started. The process for a particular subject can be terminated 1109, or the last image can be stored 1107 if its quality 1103 is calculated to be better than that of the best-quality image stored at 1110.
A Comparator module 1105 then compares the results of the Quality Metric Module with the results stored in a Local List in storage module 1110. In the first iteration of the process, there will be no data in the Local List in storage module 1110. However, after several iterations, some data may be present within the Local List 1110. If the results of the Quality Metric Module 1103 are greater than any of those on the Local List 1110, then the imagery data is stored on the Local List. Storage may comprise appending the imagery data to the Local List 1110, or may comprise replacing 1107 imagery data on the Local List that has a lower Quality Metric 1103 value.
Step 1108 is optional, as indicated by the box shown with broken lines. In certain embodiments where step 1108 is absent, additional imagery is acquired automatically without changing focus values; it is instead acquired at a fixed focus, with the quality of the imagery depending on the exact location of a moving subject within the capture volume at the time successive images are acquired. In certain other embodiments, where module 1108 is present, the focus setting of the camera acquisition system is independently modified prior to acquiring the next image. Several methods for modifying the focus setting can be employed, as discussed later.
After the focus has been modified, imagery is once again acquired 1102 in the next iteration of the process. The process continues until 1109 either the timeout condition described above occurs, or the Quality Metric 1103 exceeds a threshold value. In some embodiments, images are acquired and selected until a biometric match is found, or until a probability of a correct match or identification is met. In certain embodiments, images are acquired and selected until a predefined number of images have been selected, e.g., a predefined number of images that each exceed a certain quality threshold.
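The loop of steps 1101-1109 can be summarized, purely as an illustrative sketch and not as the claimed implementation, as follows. The camera.grab() and quality_metric() names are hypothetical stand-ins for the camera system and Quality Metric module described above, and the local list here holds a single best image, mirroring the memory-saving variant discussed below.

```python
import time

def acquire_best_image(camera, quality_metric, timeout_s=10.0, quality_target=0.9):
    local_list = []           # cleared for the next subject (step 1101)
    best_quality = -1.0
    start = time.time()
    while True:
        frame = camera.grab()                 # step 1102: acquire an image
        q = quality_metric(frame)             # step 1103: score the image
        if q > best_quality:                  # step 1105: compare against the Local List
            local_list = [frame]              # step 1107: replace the lower-quality entry
            best_quality = q
        if best_quality >= quality_target:    # stop once the quality exceeds a threshold
            break
        if time.time() - start > timeout_s:   # step 1104: Acquisition Stopped time-out
            break                             # step 1109: terminate for this subject
    return local_list, best_quality
```

In a variant consistent with the description above, the local list could instead hold a small, limited number of the highest-quality images rather than a single entry.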
Referring now to
Referring now to
Referring to
The method is highly effective in many respects. A first advantage of the invention is that if the disposition of the subject is immediately amenable to successful data acquisition (e.g., the eyes are open and the face is facing the system), then the system will acquire iris imagery very rapidly. There are many methods for detecting the presence of an eye. For example, the Hough Transform disclosed in U.S. Pat. No. 3,069,654 can be configured to locate circular segments of the eye due to the iris/sclera boundary and the pupil/iris boundary.
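As one hedged illustration of such eye detection (not the method of the cited patent itself), a circular Hough transform as implemented in OpenCV can be used to locate the roughly circular pupil/iris and iris/sclera boundaries; the parameter values below are assumptions for illustration only.

```python
import cv2

def detect_iris_circles(gray_frame):
    """Find circular pupil/iris and iris/sclera boundary candidates in a grayscale frame."""
    blurred = cv2.GaussianBlur(gray_frame, (7, 7), 1.5)      # suppress noise before edge voting
    circles = cv2.HoughCircles(
        blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=40,
        param1=100, param2=30, minRadius=20, maxRadius=150)  # illustrative parameters
    return [] if circles is None else circles[0]             # each entry is (x, y, radius)
```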
However, if the subject is fidgeting or unable to remain stationary, or is distracted by baggage or children for example, then the acquisition system will still acquire imagery, although it might take a slightly longer period of time. However, the acquisition time for an amenable subject will not be penalized by the system's delays in acquiring data in the case of a less amenable subject. This is crucial when subject throughput is considered. This is to be contrasted with systems that may acquire and store a large number of images and then perform processing on the images to select imagery.
A second advantage of the invention is the ability to acquire successively better iris imagery. In the current art, iris image acquisition systems typically output one image of the iris deemed to have a quality suitable for matching, usually exceeding a threshold. If such an image is not found, then no iris data is captured. The problem with the current art is that in some applications there will not be a second chance to acquire better data, since the subject has gone elsewhere or is fed up with using the system. Ironically, however, the iris imagery the subject presented may have had plenty of information for the particular application at hand. For example, if the image acquisition system is to be used to gain entry into a house with only 100 subjects, then some of the iris imagery acquired earlier in the acquisition process may be sufficient.
A third advantage of the invention is the efficient use of memory, which is significant especially when an embedded device is used. The Local List contains only iris imagery that is successively of better quality than the prior imagery, and does not contain the imagery that was originally acquired. In addition, depending on the application, the Local List can comprise a single image which is replaced each time imagery of a better quality is detected. After processing is complete, then the resultant image remaining in the Local List is the imagery acquired of the best quality.
In one embodiment, the invention obtains in-focus images by using a focus controller component that controls the lens to focus at successively different points within a focus range, such scan control being performed without any input from a measurement of whether the image is in focus or out of focus, whether based on measurements of the image or on other distance metrics to the subject. The relationships between focus scan speed, frame rate and exposure time, and the related algorithms, are known to those skilled in this art.
Even when a subject is trying to stand still, there will be residual motion. The system in some embodiments can increase or decrease the rate of image capture at different focuses in view of the degree of motion of the subject.
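A minimal sketch of such rate adaptation is shown below; the mean-absolute-difference motion measure, the threshold and the two frame rates are illustrative assumptions rather than values taken from this disclosure.

```python
import numpy as np

def choose_frame_rate(prev_frame, curr_frame, base_fps=15, fast_fps=30, motion_thresh=4.0):
    """Pick a capture rate from the inter-frame motion of the subject."""
    diff = np.abs(curr_frame.astype(np.float32) - prev_frame.astype(np.float32))
    motion = float(np.mean(diff))                    # crude residual-motion estimate
    return fast_fps if motion > motion_thresh else base_fps
```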
The system acquires a varying number of images to account for the fact that in some cases a good image may be acquired on the first image acquisition, while in other cases the system may have to wait for 10 or 20 image acquisitions or more. If the system simply fixed the number of image acquisitions at 10 or 20, then the average time it takes to use the device would be dramatically increased, and the throughput of people using the device therefore reduced, since the number of image acquisitions would be set at the worst case rather than being adaptive based on the quality of the iris. It may not be good enough to have the focus set at the correct focal distance opportunistically since, for example, the subject may blink or turn away even though the image is in focus.
If 10 or 20 or more images are being acquired, storing them can take up a lot of memory, which can be expensive in an embedded device. The system of the invention successively checks whether the current iris image quality is better than that of the best iris image stored previously, and only in that case does the system store it. Alternatively, the system can overwrite the best iris image acquired so far, replacing it with the better image. In this way, the system always has the best possible iris image stored without having to use extensive memory. If the subject turns away and the system loses its opportunity to ever again acquire iris data from the subject, the best possible image, even if not of high quality, will have been stored, and such an image may have sufficient quality for biometric identification under the circumstances.
In addition to controlling the area to which the camera is pointed, we can also control a focus control system such that a capture volume is swept through. Unlike autofocus, which requires settling time and many discontinuous stop/start steps that can eventually wear down components and take time to respond, we simply sweep through a focus volume rapidly in order to opportunistically acquire biometric imagery.
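An open-loop sweep of this kind might be sketched as follows, assuming hypothetical lens.set_focus() and camera.grab() interfaces; each frame captured during the sweep is handed to the quality-driven selection loop described earlier.

```python
def sweep_focus_and_capture(camera, lens, focus_min, focus_max, steps, on_frame):
    """Step the lens through the capture volume with no focus feedback, scoring every frame."""
    step = (focus_max - focus_min) / float(max(steps - 1, 1))
    for i in range(steps):
        lens.set_focus(focus_min + i * step)   # open loop: no autofocus settling or hunting
        on_frame(camera.grab())                # opportunistically retain the best frames
```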
In certain embodiments, the biometric or image acquisition system may comprise a device, such as an embedded device, that may be portable, mobile, compact, lightweight and/or attachable to a computer or other computing device. The device may incorporate various design aspects that improves performance in embedded applications such as biometric acquisition and/or processing. Some of these design aspects may also make the device portable, mobile, compact, lightweight and/or suitable for connecting to a computing device.
For example, a first aspect may include a positioning system that enables simple and/or rapid positioning and/or alignment of a user's eye with a sensor. An illuminator may enable acquisition of well-illuminated images of an iris of the user. The positioning system and/or illuminator may improve performance by enabling acquisition of non-foreshortened and/or non-occluded images of a user's iris. Another aspect may include means whereby a single compact sensor can be used to acquire both high quality images of a scene under visible illumination as well as high quality images of irises under infra-red illumination. Yet another aspect may relate to optimizing processing inside an embedded image acquisition or processing device, as well as optimizing the interface between the device and external devices, in order to improve usage of limited resources available, such as memory and power. These aspects are discussed in detail below and elsewhere in the present disclosure.
In some embodiments, the biometric or image acquisition system includes a positioning system that enables simple and rapid alignment of the user's eye with a camera, which may include one or more illuminators adapted to enable the system to acquire well-illuminated, non-foreshortened and/or non-occluded images of a user's iris. In some contexts, foreshortening may describe acquisition of an image of an iris that significantly deviates from facing directly into an image acquisition sensor, or which is not optimally oriented towards the sensor. For example, foreshortening may have occurred if an image of an iris region is elliptical in shape, rather than circular. In certain contexts, occlusion may refer to blockage or obstruction of an object by another object. For example, an iris may be occluded by an eyelid or eyelashes in an acquired image. An image may show occlusion by a finger that partially covered a sensor and/or an illuminator when the image is captured.
In some embodiments, when the mirror and sensor are co-located, such as when the sensor is positioned behind the mirror, the user may look at the mirror and sensor at the same time, perhaps without even realizing that his/her iris is facing the sensor. Thus, images of the iris acquired under such conditions are less likely to be foreshortened. In some embodiments, the mirror may transmit a portion of the light directed into the mirror, to a sensor behind the mirror. The mirror may transmit a portion of the light reflected off an iris to the sensor. In some embodiments, the mirror may be translucent or semi-transparent. In certain embodiments, the mirror may be a one-way mirror, or a cold mirror (e.g., allowing infra-red light to pass, and reflecting visible light).
As discussed earlier, for example in connection with at least
Referring to
In addition to, or as an alternative to the positioning mirror(s), an eye finder module may be connected to or incorporated into the device to detect if a subject's eyes are present at the expected regions and guide the user accordingly. For example, the device may audibly alert the user using a loudspeaker, or visually using LEDs or a screen, if the user's eye(s) are positioned incorrectly.
Another challenge created by such a sensor/screen configuration is that user alignment may be more complex, since the position of the eye of the user on the screen may be a function of the distance of the user relative to the sensor, as well as their horizontal and/or vertical position(s). In some embodiments, these issues may be addressed by electronically shifting or moving the coordinate system of the screen such that, at the nominal/optimal operating distance of the device from the subject, the image of the subject acquired by the camera is electronically shifted such that the subject's left eye is positioned near, or as near as possible to, the camera. The degree of shift may be predetermined (e.g., calibrated prior), and may be fixed for a particular device (e.g., with a fixed focus, zoom and/or camera/sensor position). The area just below the sensor on the screen may be highlighted with a different color or marking to guide or instruct the user to direct the user's eye at or near that area. Since the eye-image to camera distance can be shortened or minimized, foreshortening of the iris can be reduced or minimized. The complexity of alignment, by a user using the screen for guidance, is also reduced.
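A simple sketch of such an electronic shift is given below, under the assumption that the pixel offset (dx, dy) has been calibrated beforehand for the device's nominal operating distance; the preview frame is translated by that fixed offset before display.

```python
import numpy as np

def shift_preview(frame, dx, dy):
    """Translate the preview frame by a calibrated (dx, dy) pixel offset, padding with zeros."""
    shifted = np.zeros_like(frame)
    h, w = frame.shape[:2]
    xs, xd = (dx, 0) if dx >= 0 else (0, -dx)   # source/destination column offsets
    ys, yd = (dy, 0) if dy >= 0 else (0, -dy)   # source/destination row offsets
    shifted[yd:h - ys, xd:w - xs] = frame[ys:h - yd, xs:w - xd]
    return shifted
```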
Referring again to
Referring now to
The positioning mechanisms discussed above may assume that the user is able to easily face or orient the device. However, in some identity verification applications, it may be intrusive or undesirable for a user to pick up an image acquisition device, on a regular basis for example, to verify oneself. In other identity verification applications, such as a fingerprint biometric sensor, it is possible to articulate a wrist and/or finger so that a fingerprint is presented to the sensor. However, with an iris sensor, it may be key for an eye of the user to face directly towards the sensor. An embedded device incorporating the iris sensor can allow the sensor to be oriented towards an iris. Such an embedded device may be compactly built and sufficiently streamlined to be inserted into and/or carried comfortably in a pocket of a user. As discussed earlier,
In some embodiments, the positioning mechanism may include a portion that can connect or plug to a host computer or other computing device (for example, via a USB jack). The positioning mechanism may include a second portion that can be articulated by the user, substantially unencumbered by the first portion. For example, a cable, which may be flexible and/or retractable, may connect the two portions. In some embodiments, the second portion may be articulated relative to the first portion. For example, the two portions may be linked by an articulated arm or stiff wire, which may be bent or twisted into a particular shape. For example, the second portion may be positioned towards a user, away from a host computer. In this way, a user can move or dip towards the embedded device in a hands-free fashion, for the user's biometric data to be acquired. The articulated arm or mount may be straightened or twisted into shape, e.g., for portability. The articulated arm or mount may be reshaped to fit in a pocket of the user, e.g., so as to avoid discomfort. Other positioning mechanisms may include a cradle that can be placed on a desk or platform, or a cradle that attaches to or hangs over a screen of a computing device.
In some embodiments, the image acquisition device may include a mechanism that suitably illuminates an iris when the iris is aligned with a sensor for image acquisition. In some embodiments, this includes a narrow-beam illuminator, such as an infra-red LED. In some embodiments, the illuminator is positioned to point towards an iris of the user when aligned using any of the positioning mechanisms described earlier. The beam of the illuminator may be sufficiently narrow and shaped to only cover an iris area. Such an implementation may be more energy efficient, and may help guide positioning of the user using infra-red and iris detection, for example.
In some embodiments, the infra-red LEDs may be grouped or concentrated within an area of the screen, for example as shown in
In some embodiments, the infra-red LEDs may be independently controlled by the device so that a rudimentary infra-red display is created. In this case, basic shapes in infra-red can be projected from the screen, reflected off the cornea and collected by the camera imager or sensor. These shapes may be changed by computer control via the device, detected, and used as a means to determine liveness of the user in front of the device. This serves as a way to determine that biometrics acquired are from an actual live person instead of a recorded image of an eye, for example.
In some embodiments, imagery from the sensor is collected both with the infra-red shapes turned on and then turned off. Both sets of imagery may then be stored and subtracted to reduce effects of contamination and unpredictable visible illumination (e.g., from the screen, and/or ambient light). For example, embodiments of such methods are discussed above in connection with at least
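As a minimal sketch of this differencing step (assuming two aligned frames captured in quick succession), the IR-on and IR-off images can simply be subtracted and clipped, leaving primarily the projected infra-red pattern reflected off the cornea.

```python
import numpy as np

def ir_difference_image(frame_ir_on, frame_ir_off):
    """Subtract the IR-off frame from the IR-on frame to suppress ambient/screen light."""
    on = frame_ir_on.astype(np.int16)
    off = frame_ir_off.astype(np.int16)
    return np.clip(on - off, 0, 255).astype(np.uint8)   # residual is the projected IR pattern
```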
In certain embodiments, the image acquisition device incorporates a positioning system that achieves well-lit and un-occluded acquired iris imagery while providing simple and rapid alignment of the user's eye with a sensor. Biometric features may not be well-lit if the user places a finger over LEDs on the device. Occlusion in iris imagery may be caused by improper positioning of fingers over a sensor of the device. This can be a significant problem as the size of an image acquisition device becomes smaller, where it is more likely that a user's hand will cover a larger portion of the device when held.
In some embodiments, the camera/sensor may be located near one end of the device, e.g., near the top of the device. The infra-red illuminator(s) may be positioned below the camera, e.g., to avoid casting shadows on a user's eyelashes. The thumb-rest or marker may be located beneath the area occupied by the infra-red illuminator(s). Optionally, a contact detector on the thumb-rest/marker area can be used to detect if a thumb is present, and can be used to avoid acquisition of imagery that may be sub-optimal if the device is not held properly during the time of acquisition.
In some embodiments, the biometric/image acquisition device uses a single compact sensor to acquire high quality images of a scene in visible illumination as well as high quality images of an iris with infra-red illumination.
In some embodiments, significant sub-sampling of infra-red-pass imagery can create two problems. First, severe sensor sub-sampling can create aliasing, thereby making the acquired iris imagery useless. Second, the resulting resolution of the acquired iris imagery may be lower, thereby making iris recognition unreliable. ISO standards for iris recognition, for example, recommend acquisition of iris imagery that is 100-200 pixels in diameter. The present systems and methods take advantage of a phenomenon whereby the optical Modulation Transfer Function (MTF) of lenses in embedded devices is often unable to resolve to the resolution of the sensor itself. In other words, the quality of lenses has not kept up with the ever-increasing resolution of sensors. This is especially so as sensors become smaller and support higher resolutions. The device can therefore sub-sample at high values and still not get much aliasing. That enables the device to acquire non-aliased images of irises in infra-red light while at the same time being able to acquire high quality images under visible light. The filter array may be configured such that the IR-pass component of the filter is significantly sub-sampled compared to the pixel spacing of the sensor. The rate of sub-sampling can, for example, be a factor of 2-6 in each coordinate axis of the sensor array. Such large sub-sampling factors mean that the IR-cut portions of the filter can sample more of the visible imagery, so that signal to noise and resolution of the visible components are not compromised.
In contrast, U.S. Pat. No. 7,872,234 discloses a method that uses very densely sampled infra-red arrays to prevent aliasing. For example, with a sub-sampling factor of 2, (½)*(½)=25% of visible pixels are affected. This has the drawback of compromising the quality (signal to noise and resolution) of the visible imagery since a large portion of the data used to construct the visible imagery is lost. U.S. Pat. No. 7,872,234 mitigates this by performing color compensation at each visible pixel cluster. However such color compensation can introduce artifacts at each pixel, and the signal-to-noise value of the basic visible image signal is compromised since less incident photons are being used to create the visible image, which is especially significant when the sensor size is small and has low signal to noise properties.
By using larger sub-sampling of the infra-red component in some embodiments of the present image acquisition devices, a smaller proportion of visible pixels are affected. For example, with a sub-sampling factor of 4, only (¼)×(¼) = 6.25% of visible pixels are affected by the IR sampling. With a sub-sampling factor of 6, 1−(⅙)×(⅙) = approximately 97% of visible pixels remain untouched. Typically, such large pixel arrays already have dead or high/low-gain pixels, and interpolation methods are well-developed and tested to fill in such anomalies, which occupy increasingly smaller percentages of the sensor area. For example, such small anomalies can be mitigated by applying median filtering/averaging over pixels in the surrounding area. Thus, small amounts of IR sampling affecting a small number of visible pixels may be compensated using such approaches. Median filtering is, however, not suitable when few uncontaminated visible pixels remain (e.g., with high IR sampling, as in U.S. Pat. No. 7,872,234), since median filtering can cause artifacts if such compensation has to be performed at every pixel cluster: there are few if any surrounding uncontaminated visible pixel data to use as input to the median filtering/averaging algorithm. The signal to noise value of the acquired visible imagery is also not significantly compromised, since the vast majority of pixels in the sensor array are still being used to produce the visible imagery.
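The arithmetic above, and one simple interpolation strategy of the kind referred to (a local median fill over the sparse IR-pass locations), are sketched below for illustration; the window size is an assumption and the routine operates on a single-channel image.

```python
import numpy as np

def fraction_of_visible_pixels_affected(subsample_factor):
    # factor 2 -> 0.25, factor 4 -> 0.0625, factor 6 -> ~0.028 (i.e., ~97% untouched)
    return (1.0 / subsample_factor) ** 2

def fill_ir_locations_with_median(visible_image, ir_mask, window=3):
    """Replace pixels under the IR-pass filter (ir_mask == True) with the local median."""
    filled = visible_image.copy()
    half = window // 2
    padded = np.pad(visible_image, half, mode='edge')
    for y, x in zip(*np.nonzero(ir_mask)):
        patch = padded[y:y + window, x:x + window]   # neighbourhood centred on (y, x)
        filled[y, x] = np.median(patch)              # neighbours are mostly uncontaminated
    return filled
```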
The iris data acquired by the IR-pass component of the array may be small due to the large sub-sampling, and may be so even after interpolation. In some embodiments, the image acquisition device may resolve this by acquiring the iris imagery at a higher resolution than typical iris acquisition resolutions. This may be achieved by acquiring a larger image of the iris, e.g., by positioning the iris closer to the sensor. For example, if an interpolated iris imagery of 100 pixels in diameter is needed, and a filter array is used with an IR-pass sub-sample value of 4, the image acquisition device may be configured to acquire iris images at a distance such that the diameter of the iris with respect to the original sampling of the sensor is 400 pixels. The size of the acquired iris image, Iris_diameter_acquired, may be given by Iris_diameter_acquired=Iris_diameter_required×IR_pass_sample_value, where Iris_diameter_required is the interpolated iris diameter required for iris recognition, and IR_pass_sample_value is the sampling of the IR-pass component of the filter array.
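The sizing relationship can be expressed directly in code; the example values are the ones used in the text (a 100-pixel required diameter and an IR-pass sub-sample value of 4).

```python
def acquired_iris_diameter(required_diameter_px, ir_pass_sample_value):
    """Iris_diameter_acquired = Iris_diameter_required x IR_pass_sample_value."""
    return required_diameter_px * ir_pass_sample_value

# Example from the text: a 100-pixel interpolated diameter with a sub-sample factor of 4
# implies acquiring the iris at roughly 400 pixels across on the raw sensor.
assert acquired_iris_diameter(100, 4) == 400
```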
As discussed above, the sampling interval for the sparsely sampled regions may correspond to every 2nd-6th pixel in the sensor array that lies underneath the filter array. In some embodiments, a look up table or pixel calculator may be used to determine or predict which pixels of the sensor are or are not exposed to infra-red light, for example as shown in
In some embodiments, if the filter is not aligned carefully to the sensor pixels, then an initial and/or rapid calibration step can be performed to determine which pixels are exposed to infra-red light. The calibration step may, for example, comprise pointing the sensor, filter and lens combination towards an infra-red light source (such as a matte reflective surface illuminated by an infra-red light source) and recording the resultant imagery. Pixels underneath the IR-cut portion of the filter may appear dark, and pixels underneath the IR-pass portion of the filter may appear bright. These bright and dark variations may be detected by comparison against a threshold, for example, and pixels that exceed the threshold may be identified as being under the IR-pass portion of the filter and can be inserted into a lookup table, for example.
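A sketch of this calibration step is given below, assuming a single grayscale frame captured while the assembly views a matte surface under infra-red illumination; the threshold value is an assumption and would in practice be tuned to the sensor.

```python
import numpy as np

def calibrate_ir_pass_pixels(calibration_frame, threshold=128):
    """Identify IR-pass pixel locations from a frame of a matte surface under IR illumination."""
    ir_mask = calibration_frame > threshold     # bright pixels sit under the IR-pass filter
    lookup_table = np.argwhere(ir_mask)         # (row, col) coordinates for later use
    return ir_mask, lookup_table
```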
In some embodiments, in which higher sampling of the filter array by the IR-pass component may result in aliasing, the image acquisition device's lens can be intentionally de-focused in order to blur the imagery incident on the sensor. This can reduce the impact of aliasing, as shown in
In some embodiments, the array of sensor pixels of the device's sensor may be built to correspond to the filter array. For example, RGB pixels may be located at visible sample locations, and IR sensor pixels located beneath IR-pass sampling points. This may reduce cost and/or processing of data (e.g., collected from unsampled regions).
In some embodiments, the image acquisition device optimizes processing within the device, and may optimize the interface between the device and external devices, so as to make efficient use of limited resources available, such as memory and power. In an embedded device, memory is typically limited due to the physical size of the device, as well as cost limitations. Power for the device is usually limited, due to battery limitations or access to a low-power link such as a USB interface. At the same time, however, we may wish to acquire many iris images, of both low and high quality, since depending on the application and the user it is uncertain whether and when an optimal image may be acquired. Moreover, images that have already been acquired may provide sufficient information for biometric matching at the accuracy required for a particular application. For efficient processing, the image acquisition device may acquire a set of biometric images over a period of time, and select an image of better quality than some of the others for storage or further processing. For example, the image acquisition device may select an image of better quality for biometric matching, rather than performing matching on all images acquired. Thus, a reduced number of images may be buffered and/or processed on the embedded device.
In some embodiments, one or more of the selected images may be over-written by newly acquired imagery that is of better quality for biometric matching. In this way, memory requirements for buffering or storing images may be reduced, and fewer images can be subsequently processed or transmitted. As discussed above in connection with
In certain embodiments, the reduced number of images may be subsequently processed such that the eye or iris regions are cropped into smaller images. In some embodiments, iris data may be extracted from cropped or uncropped images, and stored or transmitted in a form more efficient than an iris image. In some embodiments, the image acquisition device may perform biometric matching locally on the device, using either the cropped or uncropped images, or the extracted data. Biometric matching may be performed using reference templates that are stored on or loaded onto the device. In some embodiments, a result of the biometric matching may be displayed on the device or transferred either in raw or encrypted format to a host computer.
Having described certain embodiments of the methods and systems, it will now become apparent to one of skill in the art that other embodiments incorporating the concepts of the invention may be used. It should be understood that the systems described above may provide multiple ones of any or each of those components and these components may be provided on either a standalone machine or, in some embodiments, on multiple machines in a distributed system. The systems and methods described above may be implemented as a method, apparatus or article of manufacture using programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof. In addition, the systems and methods described above may be provided as one or more computer-readable programs embodied on or in one or more articles of manufacture. The term “article of manufacture” as used herein is intended to encompass code or logic accessible from and embedded in one or more computer-readable devices, firmware, programmable logic, memory devices (e.g., EEPROMs, ROMs, PROMs, RAMs, SRAMs, etc.), hardware (e.g., integrated circuit chip, Field Programmable Gate Array (FPGA), Application Specific Integrated Circuit (ASIC), etc.), electronic devices, a computer readable non-volatile storage unit (e.g., CD-ROM, floppy disk, hard disk drive, etc.). The article of manufacture may be accessible from a file server providing access to the computer-readable programs via a network transmission line, wireless transmission media, signals propagating through space, radio waves, infrared signals, etc. The article of manufacture may be a flash memory card or a magnetic tape. The article of manufacture includes hardware logic as well as software or programmable code embedded in a computer readable medium that is executed by a processor. In general, the computer-readable programs may be implemented in any programming language, such as LISP, PERL, C, C++, C#, PROLOG, or in any byte code language such as JAVA. The software programs may be stored on or in one or more articles of manufacture as object code.
The present application is a continuation of, and claims priority to, U.S. application Ser. No. 15/783,827, filed Oct. 13, 2017, entitled “Mobile Identity Platform”, which is a continuation of, and claims priority to, U.S. application Ser. No. 15/487,923, filed Apr. 14, 2017, entitled “Mobile Identity Platform”, which is a continuation of, and claims priority to, U.S. application Ser. No. 14/830,366, filed Aug. 19, 2015, entitled “Mobile Identity Platform”, which is a continuation of, and claims priority to, U.S. patent application Ser. No. 13/440,707, entitled “Mobile Identity Platform”, filed Apr. 5, 2012, issued as U.S. Pat. No. 9,117,119 on Aug. 25, 2015, which itself claims the benefit of and priority to U.S. Provisional Patent Application No. 61/472,270, entitled “Mobile Identity Platform”, filed Apr. 6, 2011, and claims the benefit of and priority to U.S. Provisional Patent Application No. 61/472,279, entitled “Efficient Method and System for the Acquisition of Face and Iris Imagery”, filed Apr. 6, 2011, and is a continuation-in-part of and claims priority to U.S. patent application Ser. No. 13/398,562, entitled “Efficient Method and System for the Acquisition of Scene Imagery and Iris Imagery using a Single Sensor”, filed Feb. 16, 2012, issued as U.S. Pat. No. 9,280,706 on Mar. 8, 2016, which itself claims the benefit of and priority to U.S. Provisional Patent Application No. 61/443,757, entitled “Method and System for Iris Recognition and Face Acquisition”, filed Feb. 17, 2011, and is a continuation-in-part of and claims priority to U.S. patent application Ser. No. 12/675,189, entitled “System and Method for Iris Data Acquisition for Biometric Identification”, filed Feb. 25, 2010, and issued as U.S. Pat. No. 8,553,948 on Oct. 8, 2013, which itself is a National Stage Entry of PCT Application Number PCT/US08/74737, entitled “System and Method for Iris Data Acquisition for Biometric Identification”, filed Aug. 29, 2008, which itself claims priority to U.S. Provisional Patent Application No. 60/969,607, entitled “Methodology for Acquiring Biometric Data Large Volumes”, filed Sep. 1, 2007, all of which are incorporated herein by reference in their entireties for all purposes.
Number | Name | Date | Kind |
---|---|---|---|
4231661 | Walsh et al. | Nov 1980 | A |
4641349 | Flom et al. | Feb 1987 | A |
4910725 | Drexler et al. | Mar 1990 | A |
4923263 | Johnson | May 1990 | A |
5140469 | Lamarre et al. | Aug 1992 | A |
5259040 | Hanna | Nov 1993 | A |
5291560 | Daugman | Mar 1994 | A |
5488675 | Hanna | Jan 1996 | A |
5572596 | Wildes et al. | Nov 1996 | A |
5581629 | Hanna et al. | Dec 1996 | A |
5613012 | Hoffman et al. | Mar 1997 | A |
5615277 | Hoffman | Mar 1997 | A |
5737439 | Lapsley et al. | Apr 1998 | A |
5751836 | Wildes et al. | May 1998 | A |
5764789 | Pare et al. | Jun 1998 | A |
5802199 | Pare et al. | Sep 1998 | A |
5805719 | Pare et al. | Sep 1998 | A |
5838812 | Pare et al. | Nov 1998 | A |
5878156 | Okumura | Mar 1999 | A |
5901238 | Matsushita | May 1999 | A |
5953440 | Zhang et al. | Sep 1999 | A |
5978494 | Zhang | Nov 1999 | A |
6021210 | Camus et al. | Feb 2000 | A |
6028949 | McKendall | Feb 2000 | A |
6055322 | Salganicoff et al. | Apr 2000 | A |
6064752 | Rozmus et al. | May 2000 | A |
6069967 | Rozmus et al. | May 2000 | A |
6088470 | Camus et al. | Jul 2000 | A |
6144754 | Okano et al. | Nov 2000 | A |
6149061 | Massieu et al. | Nov 2000 | A |
6192142 | Pare et al. | Feb 2001 | B1 |
6222903 | Kim et al. | Apr 2001 | B1 |
6246751 | Bergl et al. | Jun 2001 | B1 |
6247813 | Kim et al. | Jun 2001 | B1 |
6252977 | Salganicoff et al. | Jun 2001 | B1 |
6289113 | McHugh et al. | Sep 2001 | B1 |
6301375 | Choi | Oct 2001 | B1 |
6320610 | Van Sant et al. | Nov 2001 | B1 |
6366682 | Hoffman et al. | Apr 2002 | B1 |
6373968 | Okano et al. | Apr 2002 | B2 |
6377699 | Musgrave et al. | Apr 2002 | B1 |
6424727 | Musgrave et al. | Jul 2002 | B1 |
6483930 | Musgrave et al. | Nov 2002 | B1 |
6532298 | Cambier et al. | Mar 2003 | B1 |
6542624 | Oda | Apr 2003 | B1 |
6545810 | Takada et al. | Apr 2003 | B1 |
6546121 | Oda | Apr 2003 | B1 |
6554705 | Cumbers | Apr 2003 | B1 |
6587597 | Nakao et al. | Jul 2003 | B1 |
6594376 | Hoffman et al. | Jul 2003 | B2 |
6594377 | Kim et al. | Jul 2003 | B1 |
6652099 | Chae et al. | Nov 2003 | B2 |
6700998 | Murata | Mar 2004 | B1 |
6701029 | Berfanger et al. | Mar 2004 | B1 |
6714665 | Hanna et al. | Mar 2004 | B1 |
6760467 | Min et al. | Jul 2004 | B1 |
6763148 | Sternberg et al. | Jul 2004 | B1 |
6819219 | Bolle et al. | Nov 2004 | B1 |
6832044 | Doi et al. | Dec 2004 | B2 |
6850631 | Oda et al. | Feb 2005 | B1 |
6917695 | Teng et al. | Jul 2005 | B2 |
6920236 | Prokoski | Jul 2005 | B2 |
6930707 | Bates et al. | Aug 2005 | B2 |
6944318 | Takata et al. | Sep 2005 | B1 |
6950536 | Houvener | Sep 2005 | B2 |
6980670 | Hoffman et al. | Dec 2005 | B1 |
6985608 | Hoffman et al. | Jan 2006 | B2 |
7007298 | Shinzaki et al. | Feb 2006 | B1 |
7020351 | Kumar et al. | Mar 2006 | B1 |
7047418 | Ferren et al. | May 2006 | B1 |
7095901 | Lee et al. | Aug 2006 | B2 |
7106366 | Parker et al. | Sep 2006 | B2 |
7146027 | Kim et al. | Dec 2006 | B2 |
7152782 | Shenker et al. | Dec 2006 | B2 |
7209271 | Lewis et al. | Apr 2007 | B2 |
7212330 | Seo et al. | May 2007 | B2 |
7221486 | Makihira et al. | May 2007 | B2 |
7236534 | Morejon et al. | Jun 2007 | B1 |
7248719 | Hoffman et al. | Jul 2007 | B2 |
7271939 | Kono | Sep 2007 | B2 |
7272265 | Kouri et al. | Sep 2007 | B2 |
7346472 | Moskowitz et al. | Mar 2008 | B1 |
7385626 | Aggarwal et al. | Jun 2008 | B2 |
7398925 | Tidwell et al. | Jul 2008 | B2 |
7414737 | Cottard et al. | Aug 2008 | B2 |
7418115 | Northcott et al. | Aug 2008 | B2 |
7428320 | Northcott et al. | Sep 2008 | B2 |
7542590 | Robinson et al. | Jun 2009 | B1 |
7545962 | Peirce et al. | Jun 2009 | B2 |
7558406 | Robinson et al. | Jul 2009 | B1 |
7558407 | Hoffman et al. | Jul 2009 | B2 |
7574021 | Matey | Aug 2009 | B2 |
7583822 | Guillemot et al. | Sep 2009 | B2 |
7606401 | Hoffman et al. | Oct 2009 | B2 |
7616788 | Hsieh et al. | Nov 2009 | B2 |
7639840 | Hanna et al. | Dec 2009 | B2 |
7652695 | Halpern | Jan 2010 | B2 |
7660700 | Moskowitz et al. | Feb 2010 | B2 |
7693307 | Rieul et al. | Apr 2010 | B2 |
7697786 | Camus et al. | Apr 2010 | B2 |
7715595 | Kim et al. | May 2010 | B2 |
7719566 | Guichard | May 2010 | B2 |
7760919 | Namgoong | Jul 2010 | B2 |
7770019 | Ferren et al. | Aug 2010 | B2 |
7797606 | Chabanne | Sep 2010 | B2 |
7801335 | Hanna et al. | Sep 2010 | B2 |
7847688 | Bernard et al. | Dec 2010 | B2 |
7869627 | Northcott et al. | Jan 2011 | B2 |
7912252 | Ren et al. | Mar 2011 | B2 |
7916908 | Thomas | Mar 2011 | B1 |
7925059 | Hoyos et al. | Apr 2011 | B2 |
7929017 | Aggarwal et al. | Apr 2011 | B2 |
7929732 | Bringer et al. | Apr 2011 | B2 |
7949295 | Kumar et al. | May 2011 | B2 |
7949494 | Moskowitz et al. | May 2011 | B2 |
7978883 | Rouh et al. | Jul 2011 | B2 |
8009876 | Kim et al. | Aug 2011 | B2 |
8025399 | Northcott et al. | Sep 2011 | B2 |
8028896 | Carter et al. | Oct 2011 | B2 |
8090246 | Jelinek | Jan 2012 | B2 |
8092021 | Northcott et al. | Jan 2012 | B1 |
8132912 | Northcott et al. | Mar 2012 | B1 |
8159328 | Luckhardt | Apr 2012 | B2 |
8170295 | Fujii et al. | May 2012 | B2 |
8181858 | Carter et al. | May 2012 | B2 |
8195044 | Hanna et al. | Jun 2012 | B2 |
8212870 | Hanna et al. | Jul 2012 | B2 |
8214175 | Moskowitz et al. | Jul 2012 | B2 |
8233680 | Bringer et al. | Jul 2012 | B2 |
8243133 | Northcott et al. | Aug 2012 | B1 |
8260008 | Hanna et al. | Sep 2012 | B2 |
8279042 | Beenau et al. | Oct 2012 | B2 |
8280120 | Hoyos et al. | Oct 2012 | B2 |
8289390 | Aggarwal et al. | Oct 2012 | B2 |
8306279 | Hanna | Nov 2012 | B2 |
8317325 | Raguin et al. | Nov 2012 | B2 |
8364646 | Hanna et al. | Jan 2013 | B2 |
8411909 | Zhao et al. | Apr 2013 | B1 |
8442339 | Martin et al. | May 2013 | B2 |
8443202 | White et al. | May 2013 | B2 |
8553948 | Hanna | Oct 2013 | B2 |
8604901 | Hoyos et al. | Dec 2013 | B2 |
8606097 | Hanna et al. | Dec 2013 | B2 |
8719584 | Mullin | May 2014 | B2 |
9002073 | Hanna et al. | Apr 2015 | B2 |
9633260 | Hanna | Apr 2017 | B2 |
9940516 | Hanna | Apr 2018 | B2 |
20010028730 | Nahata | Oct 2001 | A1 |
20020110286 | Cheatle et al. | Aug 2002 | A1 |
20020131623 | Musgrave et al. | Sep 2002 | A1 |
20020136435 | Prokoski | Sep 2002 | A1 |
20030103212 | Westphal et al. | Jun 2003 | A1 |
20030151674 | Lin | Aug 2003 | A1 |
20030169334 | Braithwaite et al. | Sep 2003 | A1 |
20030208125 | Watkins | Nov 2003 | A1 |
20040013288 | Svensson et al. | Jan 2004 | A1 |
20040042643 | Yeh | Mar 2004 | A1 |
20040071363 | Kouri et al. | Apr 2004 | A1 |
20040228505 | Sugimoto | Nov 2004 | A1 |
20050084137 | Kim et al. | Apr 2005 | A1 |
20050084179 | Hanna et al. | Apr 2005 | A1 |
20050105778 | Sung et al. | May 2005 | A1 |
20050168321 | Fitzgibbon | Aug 2005 | A1 |
20050226471 | Singh et al. | Oct 2005 | A1 |
20050264758 | Wakamori | Dec 2005 | A1 |
20050270386 | Saitoh et al. | Dec 2005 | A1 |
20050285943 | Cutler | Dec 2005 | A1 |
20060028552 | Aggarwal et al. | Feb 2006 | A1 |
20060029262 | Fujimatsu et al. | Feb 2006 | A1 |
20060073449 | Kumar et al. | Apr 2006 | A1 |
20060074986 | Mallalieu et al. | Apr 2006 | A1 |
20060097172 | Park | May 2006 | A1 |
20060120707 | Kusakari et al. | Jun 2006 | A1 |
20060140454 | Northcott et al. | Jun 2006 | A1 |
20060170813 | Morofuji | Aug 2006 | A1 |
20060188169 | Tener et al. | Aug 2006 | A1 |
20060204121 | Bryll | Sep 2006 | A1 |
20060279630 | Aggarwal et al. | Dec 2006 | A1 |
20070040903 | Kawaguchi | Feb 2007 | A1 |
20070098229 | Wu et al. | May 2007 | A1 |
20070110284 | Rieul et al. | May 2007 | A1 |
20070110285 | Hanna et al. | May 2007 | A1 |
20070145273 | Chang | Jun 2007 | A1 |
20070160265 | Wakiyama | Jul 2007 | A1 |
20070188613 | Nobori et al. | Aug 2007 | A1 |
20070206839 | Hanna et al. | Sep 2007 | A1 |
20070211922 | Crowley et al. | Sep 2007 | A1 |
20070253596 | Murata et al. | Nov 2007 | A1 |
20070286462 | Usher et al. | Dec 2007 | A1 |
20070286524 | Song | Dec 2007 | A1 |
20080031610 | Border et al. | Feb 2008 | A1 |
20080044063 | Friedman et al. | Feb 2008 | A1 |
20080075334 | Determan et al. | Mar 2008 | A1 |
20080075335 | Martin et al. | Mar 2008 | A1 |
20080089554 | Tabankin et al. | Apr 2008 | A1 |
20080122578 | Hoyos et al. | May 2008 | A1 |
20080259161 | Hellman et al. | Oct 2008 | A1 |
20080291279 | Samarasekera et al. | Nov 2008 | A1 |
20090046899 | Northcott et al. | Feb 2009 | A1 |
20090047010 | Yoshida et al. | Feb 2009 | A1 |
20090074256 | Haddad | Mar 2009 | A1 |
20090097715 | Cottard et al. | Apr 2009 | A1 |
20090161925 | Cottard et al. | Jun 2009 | A1 |
20090207251 | Kobayashi et al. | Aug 2009 | A1 |
20090219405 | Kaneda et al. | Sep 2009 | A1 |
20090231096 | Bringer et al. | Sep 2009 | A1 |
20090232418 | Lolacono et al. | Sep 2009 | A1 |
20090268045 | Sur et al. | Oct 2009 | A1 |
20090273562 | Baliga et al. | Nov 2009 | A1 |
20090274345 | Hanna et al. | Nov 2009 | A1 |
20090278922 | Tinker et al. | Nov 2009 | A1 |
20100014720 | Hoyos et al. | Jan 2010 | A1 |
20100021016 | Cottard et al. | Jan 2010 | A1 |
20100033677 | Jelinek | Feb 2010 | A1 |
20100074477 | Fujii et al. | Mar 2010 | A1 |
20100127826 | Saliba et al. | May 2010 | A1 |
20100201853 | Ishiga | Aug 2010 | A1 |
20100232655 | Hanna | Sep 2010 | A1 |
20100238407 | Dai | Sep 2010 | A1 |
20100246903 | Cottard | Sep 2010 | A1 |
20100253816 | Hanna | Oct 2010 | A1 |
20100278394 | Raguin et al. | Nov 2010 | A1 |
20100310070 | Bringer et al. | Dec 2010 | A1 |
20110002510 | Hanna | Jan 2011 | A1 |
20110007949 | Hanna et al. | Jan 2011 | A1 |
20110119111 | Hanna | May 2011 | A1 |
20110119141 | Hoyos et al. | May 2011 | A1 |
20110150293 | Bower et al. | Jun 2011 | A1 |
20110158486 | Bringer et al. | Jun 2011 | A1 |
20110160576 | Bower et al. | Jun 2011 | A1 |
20110194738 | Choi et al. | Aug 2011 | A1 |
20110211054 | Hanna et al. | Sep 2011 | A1 |
20110277518 | Lais et al. | Nov 2011 | A1 |
20120127295 | Hanna et al. | May 2012 | A9 |
20120187838 | Hanna | Jul 2012 | A1 |
20120212597 | Hanna | Aug 2012 | A1 |
20120219279 | Hanna et al. | Aug 2012 | A1 |
20120229617 | Yates et al. | Sep 2012 | A1 |
20120239458 | Hanna | Sep 2012 | A9 |
20120240223 | Tu | Sep 2012 | A1 |
20120242820 | Hanna et al. | Sep 2012 | A1 |
20120242821 | Hanna et al. | Sep 2012 | A1 |
20120243749 | Hanna et al. | Sep 2012 | A1 |
20120257797 | Leyvand et al. | Oct 2012 | A1 |
20120268241 | Hanna et al. | Oct 2012 | A1 |
20120293643 | Hanna | Nov 2012 | A1 |
20120300052 | Hanna et al. | Nov 2012 | A1 |
20120300990 | Hanna et al. | Nov 2012 | A1 |
20120321141 | Hoyos et al. | Dec 2012 | A1 |
20120328164 | Hoyos et al. | Dec 2012 | A1 |
20130051631 | Hanna | Feb 2013 | A1 |
20130093838 | Tan et al. | Apr 2013 | A1 |
20130108125 | Storm et al. | May 2013 | A1 |
20130110859 | Hanna et al. | May 2013 | A1 |
20130162798 | Hanna et al. | Jun 2013 | A1 |
20130162799 | Hanna et al. | Jun 2013 | A1 |
20130182093 | Hanna | Jul 2013 | A1 |
20130182094 | Hanna | Jul 2013 | A1 |
20130182095 | Hanna | Jul 2013 | A1 |
20130182913 | Hoyos et al. | Jul 2013 | A1 |
20130182915 | Hanna | Jul 2013 | A1 |
20130194408 | Hanna et al. | Aug 2013 | A1 |
20130212655 | Hoyos et al. | Aug 2013 | A1 |
20130223840 | Chang | Aug 2013 | A1 |
20130251215 | Coons | Sep 2013 | A1 |
20130294659 | Hanna et al. | Nov 2013 | A1 |
20130329079 | Florea et al. | Dec 2013 | A1 |
20140064574 | Hanna et al. | Mar 2014 | A1 |
20140072183 | Hanna et al. | Mar 2014 | A1 |
Number | Date | Country |
---|---|---|
101027678 | Aug 2007 | CN |
2007-249556 | Sep 2007 | JP |
10-2002-0078225 | Oct 2002 | KR |
10-2003-0005113 | Jan 2003 | KR |
100373850000 | Feb 2003 | KR |
10-2003-0034258 | May 2003 | KR |
10-2003-0051970 | Jun 2003 | KR |
200321670000 | Jul 2003 | KR |
100416065000 | Jan 2004 | KR |
200340273000 | Jan 2004 | KR |
200341137000 | Jan 2004 | KR |
200352669000 | May 2004 | KR |
200355279000 | Jun 2004 | KR |
200362032000 | Sep 2004 | KR |
200367917000 | Nov 2004 | KR |
10-2005-0005336 | Jan 2005 | KR |
200383808000 | May 2005 | KR |
10-2005-0051861 | Jun 2005 | KR |
200404650000 | Dec 2005 | KR |
100572626000 | Apr 2006 | KR |
10-2009-0086891 | Aug 2009 | KR |
10-2009-0106791 | Oct 2009 | KR |
10-2010-0049407 | May 2010 | KR |
10-119767800 | Jan 2012 | KR |
10-136674800 | Feb 2014 | KR |
10-2014-0028950 | Mar 2014 | KR |
10-137404900 | Mar 2014 | KR |
10-2014-0039803 | Apr 2014 | KR |
10-2014-0050501 | Apr 2014 | KR |
2318438 | Mar 2008 | RU |
97839 | Sep 2010 | RU |
WO-2008054396 | May 2008 | WO |
WO-2009029757 | Mar 2009 | WO |
WO-2009029765 | Mar 2009 | WO |
WO-2010062371 | Jun 2010 | WO |
WO-2011093538 | Aug 2011 | WO |
WO-2012112788 | Aug 2012 | WO |
WO-2013109295 | Jul 2013 | WO |
Entry |
---|
Chinese Office Action on Appl. 201280017539.7 dated Jun. 27, 2018. |
Communication pursuant to Article 94(3) EPC re EP Application No. 12747311.4. |
Korean Office Action on Appl. 10-2013-7024678 dated Aug. 7, 2018. |
Korean Office Action on Appl. 10-2013-7029383 dated Jul. 2, 2018. |
Notice of Allowance on U.S. Appl. No. 15/061,482 dated Oct. 3, 2018. |
Office Action on U.S. Appl. No. 14/830,366 dated Jul. 18, 2016. |
Al-Zubi R T et al: Automated personal identification system based on human iris analysis, Pattern Analysis and Applications, Springer-Verlag, LO, vol. 10, No. 2, Nov. 29, 2006 (Nov. 29, 2006), pp. 147-164, XP019493841, ISSN: 1433-755X, sectionI-5, abstract; figures 1-17. |
B. Galvin, et al., Recovering Motion Fields: An Evaluation of Eight Optical Flow Algorithms, Proc. of the British Machine Vision Conf. (1998) (pp. 195-204). |
Belcher et al, “A Selective Feature Information Approach for Iris Image-Quality Measure”, IEEE, 3(3):572-577 (2008). |
Chen Y et al: A highly accurate and computationally efficient approach for unconstrained iris segmentation, Image and Vision Computing, Elsevier, Guildford, GB, vol. 28, No. 2, Feb. 1, 2010 (Feb. 1, 2010), pp. 261-269, XP026777056, ISSN: 0262-8856, DOI:10.1016/J.IMAVIS.2009.04.017 [retrieved on May 13, 2009] section on 1-7, abstract; figures 1-16. |
Daugman, John, “How Iris Recognition Works,” IEEE Transaction on Circuits and Systems for Video Technology, 14(1):21-30 (2004). |
EP Communication for EP Appl. No. 12747311.4-1901 dated Nov. 20, 2017. |
Extended European Search Report on 12866256.6 dated Aug. 1, 2014. |
Extended European Search Report on 12747311.4 dated Jul. 4, 2016. |
First Chinese Office Action on 201280017539.7 dated Mar. 14, 2016. |
He, Xiaofu et al., “Contactless Autofeedback Iris Capture Design”, IEEE Transactions on Instrumentation and Measurement, IEEE Service Center, Piscataway, NJ, U.S. 57(7):1369-1375 (2008). |
He, Y. et al, “A fast iris image quality evaluation method based on weighted entropy”, SPIE, 6623:1-8 (2007). |
International Preliminary Report on Patentability in PCT/US2008/074737 dated Mar. 2, 2010, 7 pages. |
International Preliminary Report on Patentability in PCT/US2008/074751 dated Mar. 2, 2010, 5 pages. |
International Preliminary Report on Patentability in PCT/US2012/025468 dated Aug. 21, 2013, 4 pages. |
International Preliminary Report on Patentability in PCT/US2012/032391, dated Oct. 8, 2013, 8 pages. |
International Search Report in PCT/US2008/074737, dated Jan. 23, 2009, 4 pages. |
International Search Report in PCT/US2008/074751, dated Jan. 28, 2009, 2 pages. |
International Search Report in PCT/US2012/032391, dated Jul. 25, 2013, 3 pages. |
International Search Report on PCT/US2012/025468 dated Sep. 14, 2012. |
J. R. Bergen, et al., Hierarchical Model-Based Motion Estimation, European Conf. on Computer Vision (1993) (pp. 237-252). |
K. Nishino, et al., The World in an Eye, IEEE Conf. on Pattern Recognition, vol. 1, at pp. 444-451 (Jun. 2004). |
Lu, Huiqi et al., “Iris Recognition on Low Computational Power Mobile Devices”, 23 pages, (2011). Retrieved from the Internet: URL: http://cdn.intechopen.com/pdfs-wm/14646.pdf [retrieved on Jul. 23, 2014]. |
Ma, L. et al, “Personal Identification Based on Iris Texture Analysis”, IEEE: Pattern Analysis and Machine Intelligence, 25(12):1519-1533 (2003). |
Notice of Allowance dated May 28, 2013 in U.S. Appl. No. 12/675,189. |
Notice of Allowance dated Oct. 27, 2014 in U.S. Appl. No. 13/493,462. |
Notice of Allowance dated Oct. 9, 2014 in U.S. Appl. No. 13/773,159. |
Notice of Allowance on U.S. Appl. No. 12/658,706 dated Feb. 24, 2012. |
Notice of Allowance on U.S. Appl. No. 13/398,562 dated Nov. 2, 2015. |
Notice of Allowance on U.S. Appl. No. 13/440,707 dated Apr. 20, 2015. |
Notice of Allowance on U.S. Appl. No. 13/493,455 dated Feb. 10, 2015. |
Notice of Allowance on U.S. Appl. No. 13/493,455 dated Jul. 18, 2014. |
Notice of Allowance on U.S. Appl. No. 13/773,168 dated Jan. 23, 2015. |
Notice of Allowance on U.S. Appl. No. 13/786,079 dated Apr. 2, 2015. |
Notice of Allowance on U.S. Appl. No. 13/786,093 dated Jul. 21, 2015. |
Notice of Allowance on U.S. Appl. No. 15/061,482 dated Jun. 12, 2018. |
Notice of Preliminary Rejection From the Korean Intellectual Property Office dated Feb. 23, 2018 (pp. 1-13). |
Office Action in U.S. Appl. No. 13/398,562, dated May 21, 2014. |
Office Action in U.S. Appl. No. 13/440,707, dated Jan. 14, 2014. |
Office Action in U.S. Appl. No. 13/773,159, dated Jun. 18, 2014, 26 pages. |
Office Action in U.S. Appl. No. 13/773,159, dated Oct. 31, 2013, 16 pages. |
Office Action in U.S. Appl. No. 13/773,168, dated Jul. 16, 2014. |
Office Action in U.S. Appl. No. 13/773,168, dated Oct. 8, 2013, 16 pages. |
Office Action in U.S. Appl. No. 13/807,256, dated Jan. 29, 2014, 16 pages. |
Office Action on U.S. Appl. No. 12/675,189 dated Dec. 7, 2012. |
Office Action on U.S. Appl. No. 13/398,562 dated Nov. 17, 2014. |
Office Action on U.S. Appl. No. 13/440,707 dated Sep. 30, 2014. |
Office Action on U.S. Appl. No. 13/493,455 dated Apr. 9, 2014. |
Office Action on U.S. Appl. No. 13/493,455 dated Sep. 19, 2013. |
Office Action on U.S. Appl. No. 13/493,462 dated Jul. 1, 2014. |
Office Action on U.S. Appl. No. 13/786,079 dated Sep. 26, 2014. |
Office Action on U.S. Appl. No. 13/786,093 dated Nov. 28, 2014. |
Office Action on U.S. Appl. No. 13/786,102 dated Nov. 25, 2014. |
Office Action on U.S. Appl. No. 14/946,956 dated Jul. 11, 2016. |
Peters, Tanya H. et al., “Effects of segmentation routine and acquisition environment on iris recognition”, 97 pages, (2009). Retrieved from the Internet: URL: http://etd.nd.edu/ETD-db/theses/available/etd-12112009-103348/ [retrieved on Jul. 21, 2014]. |
R. Kumar, et al., Direct recovery of shape from multiple views: a parallax based approach, 12th IAPR Int'l Conf. on Pattern Recognition (1994)(pp. 1-5). |
R. P. Wildes, Iris Recognition: An Emerging Biometric Technology, Proc. IEEE 85(9) at pp. 1348-1363 (Sep. 1997). |
Rejection Decision on 201280017539.7 dated Mar. 9, 2017. |
Russian Decision on Grant on 2013142254 dated Jan. 12, 2016. |
Second Chinese Office Action on 201280017539.7 dated Oct. 11, 2016. |
U.S. Notice of Allowance on U.S. Appl. No. 14/830,366 dated Feb. 27, 2017. |
U.S. Notice of Allowance on U.S. Appl. No. 14/946,956 dated Mar. 1, 2017. |
U.S. Notice of Allowance on U.S. Appl. No. 14/946,956 dated Mar. 23, 2017. |
U.S. Notice of Allowance on U.S. Appl. No. 15/487,923 dated Aug. 7, 2017. |
U.S. Notice of Allowance on U.S. Appl. No. 15/783,827 dated Nov. 27, 2017. |
U.S. Office Action on U.S. Appl. No. 14/830,366 dated Dec. 16, 2016. |
U.S. Office Action on U.S. Appl. No. 14/946,956 dated Nov. 23, 2016. |
U.S. Office Action on U.S. Appl. No. 15/487,923 dated Jun. 6, 2017. |
U.S. Office Action on U.S. Appl. No. 15/495,782 dated Jun. 12, 2017. |
Written Opinion of the International Searching Authority in PCT/US2008/074737, dated Jan. 23, 2009, 6 pages. |
Written Opinion of the International Searching Authority in PCT/US2008/074751 dated Jan. 28, 2009, 4 pages. |
Written Opinion of the International Searching Authority in PCT/US2012/032391, dated Jul. 25, 2013, 7 pages. |
Written Opinion on PCT/US2012/025468 dated Sep. 14, 2012. |
Yingzi Du et al: “Video-Based Noncooperative Iris Image Segmentation”, IEEE Transactions on Systems, Man and Cybernetics. Part B:Cybernetics, IEEE Service Center, Piscataway, NJ, US, vol. 41 , No. 1, Feb. 1, 2011 (Feb. 1, 2011), pp. 64-74, XP011373393, ISSN: 1083-4419, DOI: 10.1109/TSMCB.2010.2045371, section I-IV, abstract; figures 1-17. |
Number | Date | Country | |
---|---|---|---|
20180300547 A1 | Oct 2018 | US |
Number | Date | Country | |
---|---|---|---|
61472279 | Apr 2011 | US | |
61472270 | Apr 2011 | US | |
60969607 | Sep 2007 | US | |
61443757 | Feb 2011 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15783827 | Oct 2017 | US |
Child | 15947624 | US | |
Parent | 15487923 | Apr 2017 | US |
Child | 15783827 | US | |
Parent | 14830366 | Aug 2015 | US |
Child | 15487923 | US | |
Parent | 13440707 | Apr 2012 | US |
Child | 14830366 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 12675189 | US | |
Child | 13440707 | US | |
Parent | 13398562 | Feb 2012 | US |
Child | 13440707 | US |