The present disclosure relates to electronic systems and devices. More specifically, the present disclosure is generally directed to systems, devices, and methods for modifying an optical pathway of a camera of an electronic system or device.
Many mobile devices, including smartphones, tablets, and digital cameras, are equipped, in addition to other sensors, with a single front-facing camera arranged on the same side as the display or touch screen of the device. In some applications, the front-facing camera can be utilized for user identification, generating biometric information that allows the device to be unlocked or operated in a secure fashion. In addition, biometric information may also be used to carry out functions that are specific to an identified user. For example, devices with front-facing cameras can be utilized for iris recognition. However, many biometric devices impose strict operating guidelines in order to meet the needs of biometric analysis. For instance, present applications for iris recognition require captured images to have a clear, straight-on view of the iris. Hence, a subject needs to be stationary, as well as located very near and directly in front of the device camera.
In general, suitable biometric information may be readily acquired using devices fitted with a rear-facing camera, since an operator would be able to view the display and adjust the device so that the camera can properly acquire imagery of the subject. However, for those devices that do not possess a rear-facing camera, operation can be difficult or awkward, particularly for applications in which a subject or scenery other than the device operator is being viewed. Hence, many present devices are not suitable or desirable for use in biometric applications, such as iris, retinal, or facial recognition.
Devices lacking specific capabilities or features can be upgraded with the addition of new hardware. However, full integration of the new hardware often requires complex and costly modifications to design and manufacturing processes. After-market add-ons, on the other hand, offer an alternative, cost-effective way of reversibly expanding mobile device capabilities without directly changing the mobile device hardware. Add-ons act as modifiers to the available inputs or outputs, using support applications that can run under the existing device operating systems. For example, commercially available add-on products for mobile device cameras include telephoto and fish-eye lenses, which are designed to provide enhanced zoom, or wider angle imaging, respectively, beyond the capabilities of the as-designed cameras.
Given the above, there is a need for modifiers or adaptations that can enhance the capabilities of presently limited devices without directly modifying the existing hardware. In particular, there is a need for modifiers or adaptations directed to devices for use in biometric applications.
This disclosure is illustrated by way of example and not by way of limitation in the accompanying figures. The figures may, alone or in combination, illustrate one or more embodiments of the disclosure. Elements illustrated in the figures are not necessarily drawn to scale. Reference labels may be repeated among the figures to indicate corresponding or analogous elements.
The present disclosure describes a novel approach for expanding the capabilities of currently limited devices. In particular, the functionality of mobile and other devices can be enhanced using add-on elements and features, in accordance with embodiments described herein, without the need to change their existing hardware. Specifically, in some aspects, the optical pathways of components fitted on a mobile or other device can be modified using one or more passive optical elements. This allows sensors, illuminators, and other mobile device components to operate in directions beyond their as-designed capabilities, increasing device flexibility and applicability.
The present approach includes a broad range of applications, including biometric enrollment and recognition. Advantageously, the provided approach, in the form of various add-ons, can be produced at low cost and installed on standard, mass-produced mobile devices. For instance, a handheld mobile device configured to acquire biometric information from a subject is beneficial due to the small size, weight, and cost of mobile device platforms. However, if such a mobile device is equipped with only front-facing biometric capabilities, its range of applicability would be limited. Hence, a holder with an optical assembly configured to modify the optical pathway of the front-facing camera, in accordance with embodiments of the disclosure, would extend use of the device to include rear-facing directions. That is, by redirecting the front-facing camera and illuminators of the device towards a rear-facing direction, imagery and subjects other than the operator can also be imaged. Other benefits and advantages may be readily apparent from the descriptions below.
While the concepts of the present disclosure are susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and are described in detail below. It should be understood that there is no intent to limit the concepts of the present disclosure to the particular forms disclosed. On the contrary, the intent is to cover all modifications, equivalents, and alternatives consistent with the present disclosure and the appended claims.
Referring now to
In accordance with aspects of the present disclosure, the above-described limitation of the device 100 can be overcome by utilizing an optical assembly configured to reversibly modify the optical path of the device components fitted on the device 100. As will be described, in some implementations, the optical assembly may be reversibly coupled to the device 100 using a device holder, or placed proximate to the device 100. In addition, the optical assembly may be configured to modify the optical paths of the device components using a number of passive optical elements included therein. In this manner, increased functionality can be achieved for the device 100, for instance, by allowing sensing or illumination of a subject not directly in front of the sensing or illumination component of the device 100.
Specifically referring to
While the specific embodiments herein are discussed in relation to changing the path of a device component from a front-facing direction to a rear-facing direction, the present specification is intended to cover changing the path of a device component from a first direction to a second direction. In this manner, the first and second directions may be at an angle of about 180 degrees with respect to one another, or at any other suitable angle with respect to one another. In illustrative embodiments, the first and second directions may be at an angle of between about 30 degrees and about 330 degrees, or between about 45 degrees and about 180 degrees. In some embodiments, the first and second directions may be at an angle of about 45 degrees.
By way of example, a TabPro device equipped with a front-facing near infrared (“NIR”) light emitting diode (“LED”) emits a diverging beam 120 that propagates along an initial, forward-facing direction while diverging with a half-angle of approximately 12° half-width at half maximum (“HWHM”), as shown in
As appreciated from the descriptions above, the nature of the optical path modification of various device components, including sensors and other elements, can depend on geometrical factors, the characteristics of the sensors or elements themselves, the optical properties of the optical assembly, as well as the requirements of the particular applications of the device 100. For instance, in some aspects, it may be desirable to minimize the size of the optical assembly, due to cost, design, or aesthetic considerations.
For example, referring again to
where θg represents the angle between the normal direction of the device 100 and the ray 122, a is the distance between the bottom of the prism 110 and the vertical position 126 of the illuminator, and 2h is the length of the hypotenuse 124. For an LED illuminator with 12° divergence in air, for example, the distance a can then be computed to be at least 14% of the length of the hypotenuse 124 according to Eqn. 1. In addition, in order for the retro-reflected diverging beam 120 to clear the top 128 of the device 100, the following condition needs to be satisfied:
4h·tan θg + t·tan θa ≤ 2h − a − b  (2)
where θa represents the refracted angle in air, b is the distance from the top 128 of the device 100 to the vertical position 126 of the illuminator, and t is the thickness of the device 100. Converting Eqn. 1 to an equality and substituting it into Eqn. 2 gives a minimum length for the hypotenuse 124 of the prism 110, namely
For instance, a TabPro device has dimensions t = 7.3 mm and b = 10 mm. Hence, using Eqn. 3 gives 2h ≥ 16 mm, or roughly a prism 110 with a hypotenuse 124 of approximately 20 mm in length, after adding a margin of about 20%.
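As a numerical check of the relationships above, the 14% figure associated with Eqn. 1 and the roughly 20 mm hypotenuse can be reproduced with a short calculation. The sketch below assumes a prism refractive index of about 1.5, which is not specified above, applies Snell's law to the 12° beam divergence, and then adds the quoted 20% margin to the 16 mm result from Eqn. 3.

```python
import math

# Hedged numerical check, assuming a prism refractive index of n = 1.5 (not stated above).
n = 1.5
theta_air = math.radians(12.0)                 # LED half-angle (HWHM) in air
theta_g = math.asin(math.sin(theta_air) / n)   # refracted half-angle in the glass (Snell's law)

# Eqn. 1 taken as an equality gives a = 2h*tan(theta_g), i.e. a/(2h) = tan(theta_g)
print(f"a / 2h = {math.tan(theta_g):.2f}")     # ~0.14, i.e. a is ~14% of the hypotenuse 124

# TabPro example: Eqn. 3 yields 2h >= 16 mm; adding the ~20% margin quoted above
print(f"hypotenuse with margin = {16 * 1.2:.0f} mm")   # ~19 mm, i.e. roughly 20 mm
```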
It may be appreciated that concepts illustrated in the configurations shown in
In an example demonstrating the concepts above, two Edmund Optics 15 mm prisms (part number 32332, hypotenuse = 21.2 mm) and a TabPro were used to generate clear, focused, and well-lit images of irises, for example, for use in biometric recognition. Referring to
Test imagery taken using the TabPro (omitted from the drawings for privacy reasons) in this configuration provided good focus and reasonable illumination. To obtain enrollment images, the TabPro operator held the tablet approximately 8 inches from the subject while centering the subject's eye in the target box. Using the rear-facing mode, the enrolled irises were all quickly matched in 4/4 subjects (8 irises). Aiming with flipped and mirror-reversed imagery necessitated some training; hence, it is envisioned that software modifications might be utilized to provide more intuitive views of the subjects to the operator. Also, the test images showed specular reflections either centered or slightly to the lower right of the pupil. In some aspects, robust matching to standard enrollment images might necessitate an accommodation to allow for wider or different specular reflection positions in flipped and reversed eye images. Nonetheless, initial attempts showed that eye-finding in the raw image followed by image reversal and inversion (accomplished in one step by a 180-degree rotation, for example) produced images matchable to standard enrollments.
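By way of illustration, the “one step” reversal and inversion mentioned above amounts to a 180-degree rotation of the raw image. The snippet below is a minimal sketch of that correction using NumPy; it is not the specific software used in the tests described here.

```python
import numpy as np

# Minimal sketch: a 180-degree rotation flips the raw prism image both left-right and
# up-down in one step, producing an upright, non-mirrored eye image for matching.
def correct_prism_image(raw: np.ndarray) -> np.ndarray:
    return np.rot90(raw, 2)   # equivalent to np.flipud(np.fliplr(raw))

raw_eye_crop = np.arange(12).reshape(3, 4)   # stand-in for an eye crop from the raw image
assert np.array_equal(correct_prism_image(raw_eye_crop),
                      np.flipud(np.fliplr(raw_eye_crop)))
```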
Referring to
The holder 302 may be shaped and dimensioned in any manner, depending upon the particular application and mobile device 300 being utilized. In some aspects, the holder 302 may be fabricated from acrylonitrile butadiene styrene (ABS) plastic using a 3D printing technique. However, it may be appreciated that the holder 302 may be manufactured in any manner, and using any materials. In addition, it may be preferable that the holder 302 is designed to withstand the moderate handling associated with repeated attachment to and detachment from the mobile device 300. In some implementations, the holder 302 may be configured to take advantage of proximity sensors fitted on a mobile device 300, as indicated by arrow 310 in
The embodiment described with reference
Referring now to
In particular, the base 404 can be configured in any manner and include a variety of features, in accordance with the features and design of the mobile device 400. Specifically, the base 404 may be designed to be attachable to the mobile device 400 while avoiding interference with operation of the mobile device. For instance, as shown in
The optical assembly 406 may include a number of passive optical elements 410 configured to modify the optical pathways of various device components, such as a camera or camera illuminator, and a housing 412 for the optical assembly 406, which may or may not enclose the passive optical elements 410 completely. As shown in
The above-described holder is not limited to the specific implementation detailed with respect to
As noted above, the add-ons disclosed herein may be utilized for any number of functions, for example, for biometric recognition.
According to exemplary embodiments of the present invention, the input image 501 is an infrared image, and is captured by an infrared capture device (not shown in
Initially, the input image 501 is processed by the pre-processor 502. The pre-processor 502 segments and normalizes the iris in the input image 501, where input image 501 may have variable iris/pupil and iris/sclera contrast, small eyelid openings, and non-frontal iris presentations. The result of the pre-processor 502 is a modified iris image with clearly delineated iris boundaries and synthesized quasi-frontal presentation. For example, if the iris in the input image 501 is rotated towards the left, right, up or down, the pre-processor 502 will synthesize an iris on the input image 501 as if it was positioned directly frontally. Similarly, a frontally positioned pupil will be synthesized on the skewed or rotated pupil of the input image 501.
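For context, one common way to produce the kind of normalized iris image described above is to “unwrap” the annulus between the pupil and iris boundaries into a rectangular grid. The sketch below shows only this basic polar remapping and assumes the boundary circles are already known; the pre-processor 502 additionally handles tilt, corneal distortion, and non-frontal presentations, which are not reproduced here.

```python
import numpy as np

# Hedged sketch of polar "unwrapping" between the pupil and iris boundaries. The circle
# parameters are assumed to be known; eyelid masking and distortion correction are omitted.
def unwrap_iris(eye: np.ndarray, cx: float, cy: float, r_pupil: float, r_iris: float,
                n_radial: int = 64, n_angular: int = 256) -> np.ndarray:
    thetas = np.linspace(0.0, 2.0 * np.pi, n_angular, endpoint=False)
    out = np.zeros((n_radial, n_angular), dtype=eye.dtype)
    for i, rho in enumerate(np.linspace(0.0, 1.0, n_radial)):
        r = r_pupil + rho * (r_iris - r_pupil)                       # radius between the boundaries
        xs = np.clip(np.round(cx + r * np.cos(thetas)).astype(int), 0, eye.shape[1] - 1)
        ys = np.clip(np.round(cy + r * np.sin(thetas)).astype(int), 0, eye.shape[0] - 1)
        out[i] = eye[ys, xs]                                         # nearest-neighbour sampling
    return out
```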
The coding processor 504 analyzes and encodes iris information from the iris image generated by the pre-processor 502 at a range of spatial scales so that structural iris information contained in the input image 501 of varying resolution, quality, and state of focus can be robustly represented. The information content of the resulting code will vary depending on the characteristics of the input image 501. The code generated by the coding processor 504 representing the input image 501 allows spatial interpolation to facilitate iris code alignment by the matching processor 506.
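As a rough illustration of coding at multiple spatial scales, the sketch below thresholds band-pass (difference-of-Gaussians) responses of a normalized iris image at several scales to form one bit per pixel per scale. This is a toy stand-in, not the transform actually used by the coding processor 504.

```python
import numpy as np
from scipy import ndimage

# Toy multi-scale binary code: the sign of a difference-of-Gaussians response at each scale.
def encode_iris(normalized: np.ndarray, scales=(2, 4, 8)) -> np.ndarray:
    img = normalized.astype(float)                    # avoid wrap-around for integer inputs
    planes = []
    for s in scales:
        band = ndimage.gaussian_filter(img, s) - ndimage.gaussian_filter(img, 2 * s)
        planes.append(band > 0)                       # one bit per pixel at this scale
    return np.stack(planes).astype(np.uint8)          # shape: (n_scales, rows, cols)
```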
The output code from the coding processor 504 is coupled to the matching processor 506. The matching processor 506 incorporates constrained active alignment of iris structure information between stored iris images and captured iris codes generated from the input image 501 to compensate for limitations in iris image normalization by the pre-processor 502. The matching processor 506 performs alignment by performing local shifting or warping of the code to match the generated code with a stored iris code template based on estimated residual distortion of the code generated by the coding processor 504. According to some embodiments, a “barrel shift” algorithm is employed to perform the alignment. Accordingly, structural correspondences are registered and the matching processor 506 compares the aligned codes to determine whether a match exists. If a match is found, the matching processor 506 returns matched iris data 508.
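The barrel-shift idea can be illustrated with a simple comparison of binary codes: the probe code is circularly shifted along the angular axis and the smallest fractional Hamming distance is kept. The sketch below assumes the toy code layout from the previous example and omits the local warping and masking used by the matching processor 506.

```python
import numpy as np

# Hedged sketch of barrel-shift matching: circularly shift the probe code along the angular
# (last) axis and keep the smallest fractional Hamming distance over all shifts.
def barrel_shift_distance(enrolled: np.ndarray, probe: np.ndarray, max_shift: int = 8) -> float:
    best = 1.0
    for s in range(-max_shift, max_shift + 1):
        shifted = np.roll(probe, s, axis=-1)
        best = min(best, np.count_nonzero(enrolled != shifted) / enrolled.size)
    return best

rng = np.random.default_rng(0)
enrolled = rng.integers(0, 2, size=(3, 64, 256), dtype=np.uint8)
probe = np.roll(enrolled, 5, axis=-1)          # same code, rotated by 5 angular samples
print(barrel_shift_distance(enrolled, probe))  # ~0.0 indicates a match
```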
The matched iris data 508 may be used in many instances, for example, to authorize transactions such as financial transactions. The pre-processor 502 may be an application executing on a mobile device, such as a mobile phone, camera, tablet, or the like. The pre-processor 502 on the mobile device may capture an image of a user's eye using the camera of the device, perform the pre-processing steps on the mobile device, and then transmit a bundled and encrypted request to the coding processor 504, which may be accessed via a cloud service on a remote server of, for example, a financial institution. In other embodiments, the application on the mobile device may also comprise the coding processor 504, and the iris coding is performed on the mobile device. In some embodiments, the pre-processor 502 may be used in conjunction with an automated teller machine (“ATM”), where a user is authorized by having their iris scanned and processed by the pre-processor 502. The pre-processor 502 may then reside in the software of the ATM, or the ATM may supply the image captured by the camera to a server where the pre-processor 502 is executed for pre-processing.
The coding processor 504 produces an iris code that is transmitted to the matching processor 506. The matching processor 506 may be hosted on a server of a financial institution, or be a remote third party service available to multiple financial institutions for authenticating the user based on their iris image. Once a user is authenticated, financial transactions may be carried out between the user and the financial institutions. Similarly, the iris processor 500 may be used to authenticate a user in any context, such as signing in to a social network, a messaging service or the like.
The iris processor 500 may also be used to authorize a cellular device user, or to determine whether the device has been stolen, in conjunction with geo-location data or the like. In this embodiment, upon purchase of a cellular device, the user may “imprint” their identity on the device based on their iris information so that others can be prevented from using the device if it is reported stolen. Authorization can also be extended to office or personal environments, where the iris processor 500 may be used to determine whether an authorized or detected user has access to a particular location. For example, in a secure office environment, taking photographs may be prohibited for the majority of employees, but authorized employees may override this prohibition and enable the camera. The employee's mobile device may be used to capture an image of the employee, and the iris processor 500 may match the iris of the employee to retrieve an employee profile, which delineates the authorizations for that employee.
The pre-processor 600 comprises a segmentation module 602 and a correction module 604. The segmentation module 602 further comprises a pupil segmentation module 606, an iris segmentation module 608 and an edge detection module 609. The segmentation module 602 corrects an input image for low-contrast pupil and iris boundaries. The image produced by the segmentation module 602 is then coupled to the correction module 604 for further correction. The correction module 604 comprises a tilt correction module 610 and a corneal correction module 612.
The segmentation module 602 and the correction module 604 may be used, for example, in the medical field, in targeted marketing, customer tracking in a store, or the like. For example, in the medical field, pupil and iris insertion may be performed by the pre-processor 600 as a diagnostic tool for diagnosing diseases that a person might have based on their iris profiles.
Turning to
Then, at process block 704, an optical assembly that is movably coupled to the mobile device may be operated. That is, the optical assembly may be positioned in an orientation such that the optical pathways of components, such as the front camera or front illuminator, are modified to be oriented in the direction of the subject, thus allowing imaging or illumination of the subject. As described, illumination and imaging may also be provided by a camera or illuminator fitted on the rear portion of the mobile device.
Then, at process block 706, the mobile device may be operated to acquire imaging data from the subject, for instance using the front camera. Such imaging data may then be analyzed, in accordance with aspects of the present disclosure, to generate biometric information corresponding to the subject, as indicated by process block 708. For example, imaging data of the subject may be processed and analyzed using an iris recognition process. In some aspects, the generated biometric information may be utilized to identify a subject, thus providing access to, or modifying functionality of, the device or another system or apparatus based on the identified subject. A report, of any form, may then be generated using the biometric information, as indicated by process block 710. For example, the report may include an indication of a successful or unsuccessful subject identification.
Illustrative examples of the technologies disclosed herein are provided below. An embodiment of the technologies may include any one or more, and any combination of, the examples described below.
In an example 1, a method for operating a device to obtain biometric information from a subject includes orienting a device relative to a subject with a first portion of the device facing the subject, wherein a second portion of the device comprises a camera and the first and second portions are different sides of the device, and operating an optical assembly that is movably coupled to the device, the optical assembly being configured to modify an optical pathway of at least the camera in a direction of the subject. The method also includes acquiring imaging data of the subject using the camera, and analyzing the acquired imaging data to generate biometric information corresponding to the subject. The method further includes generating a report using the biometric information.
An example 2 includes the subject matter of example 1, wherein the method further comprises activating an illuminator located on the second portion of the device.
An example 3 includes the subject matter of any of examples 1 and 2, wherein the illuminator comprises at least one near infrared light source.
An example 4 includes the subject matter of any of examples 1, 2, and 3, wherein the illuminator is configured to produce a pulsing or strobe illumination operation above continuous-wave eye-safe limits.
An example 5 includes the subject matter of any of examples 1, 2, 3, and 4, wherein the optical assembly comprises one or more passive optical elements.
An example 6 includes the subject matter of any of examples 1, 2, 3, 4, and 5, wherein the method further comprises performing a subject identification using the biometric information.
An example 7 includes the subject matter of any of examples 1, 2, 3, 4, 5, and 6, wherein the subject identification includes comparing the acquired imaging data with a reference.
An example 8 includes the subject matter of any of examples 1, 2, 3, 4, 5, 6, and 7, wherein the method further comprises determining an access based on the subject identification.
An example 9 includes the subject matter of any of examples 1, 2, 3, 4, 5, 6, 7, and 8, wherein the method further comprises receiving a signal from one or more proximity sensors configured on the device identifying a position of the optical assembly relative to the device.
An example 10 includes the subject matter of any of examples 1, 2, 3, 4, 5, 6, 7, 8, and 9, wherein the device is a mobile device.
An example 11 includes the subject matter of any of examples 1, 2, 3, 4, 5, 6, 7, 8, 9, and 10, wherein the first portion is a back portion of the device and the second portion is a front portion of the device and the optical pathway is modified by about 180 degrees.
In an example 12, a holder for a mobile device includes a base attachable to the mobile device, and an optical assembly movably coupled to the base and configured to pivot about an axis relative to the base. The optical assembly includes one or more passive optical elements configured to modify an optical pathway of at least a camera affixed to a front portion of the mobile device, and a housing at least partially enclosing the one or more passive optical elements.
An example 13 includes the subject matter of example 12, wherein the one or more passive optical elements includes a prism.
An example 14 includes the subject matter of any of examples 12 and 13, wherein the base is attachable along a top portion of the mobile device.
An example 15 includes the subject matter of any of examples 12, 13, and 14, wherein the base is attachable along at least one of a periphery or a peripheral surface of the mobile device.
An example 16 includes the subject matter of any of examples 12, 13, 14, and 15, wherein the optical assembly is configured to bring into contact the one or more passive optical elements and the mobile device.
An example 17 includes the subject matter of any of examples 12, 13, 14, 15, and 16, wherein the holder further comprises a locking mechanism for maintaining the contact.
An example 18 includes the subject matter of any of examples 12, 13, 14, 15, 16, and 17, wherein the housing further includes at least one sensor surface configured for affecting signals generated by one or more proximity sensors configured on the mobile device.
In an example 19, a method for operating a mobile device having a camera includes the steps of orienting a mobile device relative to a subject such that a first portion of the mobile device faces the subject, wherein a second portion of the device comprises a camera, the first and second portions being on opposite sides of the mobile device, operating an optical assembly that is movably coupled to the mobile device, the optical assembly being configured to modify an optical pathway of at least the camera in a direction of the subject, and acquiring imaging data of the subject using the camera.
An example 20 includes the subject matter of example 19, wherein the method further comprises activating an illuminator located on the second portion of the device and the illuminator includes at least one near infrared light source.
In the foregoing description, numerous specific details, examples, and scenarios are set forth in order to provide a more thorough understanding of the present disclosure. It will be appreciated, however, that embodiments of the disclosure may be practiced without such specific details. Further, such examples and scenarios are provided for illustration, and are not intended to limit the disclosure in any way. Those of ordinary skill in the art, with the included descriptions, should be able to implement appropriate functionality without undue experimentation.
References in the specification to “an embodiment,” etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is believed to be within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly indicated.
Embodiments in accordance with the disclosure may be implemented in hardware, firmware, software, or any combination thereof. Embodiments may also be implemented as instructions stored using one or more machine-readable media, which may be read and executed by one or more processors. A machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computing device or a “virtual machine” running on one or more computing devices). For example, a machine-readable medium may include any suitable form of volatile or non-volatile memory.
Modules, data structures, blocks, and the like are referred to as such for ease of discussion, and are not intended to imply that any specific implementation details are required. For example, any of the described modules and/or data structures may be combined or divided into sub-modules, sub-processes or other units of computer code or data as may be required by a particular design or implementation. In the drawings, specific arrangements or orderings of schematic elements may be shown for ease of description. However, the specific ordering or arrangement of such elements is not meant to imply that a particular order or sequence of processing, or separation of processes, is required in all embodiments. In general, schematic elements used to represent instruction blocks or modules may be implemented using any suitable form of machine-readable instruction, and each such instruction may be implemented using any suitable programming language, library, application-programming interface (API), and/or other software development tools or frameworks. Similarly, schematic elements used to represent data or information may be implemented using any suitable electronic arrangement or data structure. Further, some connections, relationships or associations between elements may be simplified or not shown in the drawings so as not to obscure the disclosure. This disclosure is to be considered as exemplary and not restrictive in character, and all changes and modifications that come within the spirit of the disclosure are desired to be protected.
This application claims priority to and the benefit of U.S. Provisional Patent Application Ser. No. 62/086,867, filed Dec. 3, 2014, and entitled “System and Method for Mobile Device Biometric Add-On”, the disclosure of which is incorporated herein in its entirety.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/US2015/061024 | 11/17/2015 | WO | 00 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2016/089592 | 6/9/2016 | WO | A |
Number | Date | Country | |
---|---|---|---|
20170347000 A1 | Nov 2017 | US |
Number | Date | Country | |
---|---|---|---|
62086867 | Dec 2014 | US |