The present invention relates to electronic devices, and more particularly to a biometric sensing device included in, or connected to, an electronic device. Still more particularly, the present invention relates to the use of biometric data in online commerce.
Passwords are a common security tool for applications, websites, and devices. A user-entered password must match a reference password before the user is given access or allowed to interact with an application, website, or device. But passwords can have a number of limitations. The number of characters in a password can be limited to a maximum, such as eight or twelve characters. Additionally, a user can be prohibited from using certain types of characters in the password. For example, a password may not be allowed to include symbols such as a pound or hash symbol (#), an exclamation point (!), or a percent sign (%). Randomly generated passwords can be more secure than passwords selected by a user, but randomly generated passwords can be difficult to remember. Some users therefore prefer to select passwords that are easier to remember at the expense of security. For example, a password that includes a complete word, the user's birthday, or a company name may be easier to remember, but such passwords can also be easier to guess or discover.
The use of biometric data can provide a greater level of security to a device or application compared to passwords. Biometric data can also be easier to enter than passwords, especially randomly generated and long passwords. Biometric sensing devices can detect or image a unique physical or behavioral trait of a person and produce biometric data that can reliably identify the person. For example, a fingerprint includes a unique pattern of ridges and valleys that can be imaged by a fingerprint sensor. The image of the fingerprint, or the unique characteristics of the fingerprint, is compared to previously captured reference data, such as a reference fingerprint image. The identity of the person is obtained or verified when the newly captured fingerprint image matches the reference fingerprint image.
Embodiments described herein provide methods for authenticating a user with one or more biometric images and for permitting the user to make purchases from an online store using one or more biometric images. The terms “image” and “biometric image” are meant to encompass an image, a composite image, and other types of data that can be captured by a biometric sensing device. In one aspect, a method for completing a purchase on an online store can include a processing device determining whether a biometric image matches a reference biometric image. If the biometric image matches the reference biometric image, the processing device can countersign, with user identifier data, an online account token that is associated with an account of the user on the online store. The countersigned online account token indicates the purchase on the online store can be completed. The countersigned token can then be transmitted to the online store, where the user is permitted to make one or more purchases based on the countersigned online account token.
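By way of illustration only, the countersigning operation described above can be sketched in a few lines of Python. The function names, the use of an HMAC keyed with a hash of the user identifier data, and the byte-equality test for the biometric match are assumptions made for this sketch; the embodiments do not require any particular signature scheme or matching algorithm.

    import hashlib
    import hmac

    def countersign_token(online_account_token: bytes, user_identifier_data: bytes) -> bytes:
        # Hypothetical countersignature: an HMAC over the online account token,
        # keyed with a hash of the user identifier data.
        key = hashlib.sha256(user_identifier_data).digest()
        return hmac.new(key, online_account_token, hashlib.sha256).digest()

    def complete_purchase(biometric_image: bytes, reference_image: bytes,
                          online_account_token: bytes, user_identifier_data: bytes):
        # Countersign only when the captured biometric image matches the reference
        # biometric image; matching is reduced to byte equality for this sketch.
        if biometric_image != reference_image:
            return None
        return countersign_token(online_account_token, user_identifier_data)

A real implementation would compare extracted biometric features rather than raw bytes; equality is used here only to keep the example self-contained.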
In another aspect, a system can include a processing device, a biometric sensing device operatively connected to the processing device, and one or more memories operatively connected to the processing device. An online account token and user identifier data can be stored in the memory or memories. The processing device can be configured to countersign the online account token with at least some of the user identifier data when a biometric image captured by the biometric sensing device matches a reference biometric image.
In another aspect, a network communications interface can be operatively connected to the processing device. The processing device can then transmit the countersigned online account token to the online store using a network connection established with the network communications interface.
In yet another aspect, a method for authenticating a user having an account on an online store can include the online store transmitting an online account token associated with the account to an electronic device, and the online store receiving a countersigned online account token from the electronic device. The countersigned online account token can indicate the identity of the user has been authenticated based on a biometric image and can indicate the biometric image is associated with the account.
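To illustrate the store-side aspect, the following Python sketch mirrors the countersigning example above: the online store recomputes the expected countersignature from the token it issued and the user identifier data registered for the account. This verification scheme is an assumption made for the sake of the example; the embodiments do not specify how the online store checks a countersigned token.

    import hashlib
    import hmac

    def verify_countersigned_token(online_account_token: bytes,
                                   countersignature: bytes,
                                   registered_user_identifier_data: bytes) -> bool:
        # Recompute the expected countersignature and compare in constant time.
        # Purchases are permitted only when the comparison succeeds.
        key = hashlib.sha256(registered_user_identifier_data).digest()
        expected = hmac.new(key, online_account_token, hashlib.sha256).digest()
        return hmac.compare_digest(expected, countersignature)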
Embodiments of the invention are better understood with reference to the following drawings. The elements of the drawings are not necessarily to scale relative to each other. Identical reference numerals have been used, where possible, to designate identical features that are common to the figures.
Embodiments described herein permit a user to make purchases on an online store using one or more biometric images. The online store can transmit an online account token to an electronic device and/or to a biometric sensing device after the user successfully enters his or her account password. The electronic device or the biometric sensing device can countersign the online account token when the one or more biometric images match respective reference biometric images and the account password matches user identifier data stored in the electronic device or in the biometric sensing device. The countersigned online account token can then be transmitted to the online store. The user can make one or more purchases after the online store receives the countersigned online account token.
Any suitable type of biometric sensing device can be included in, or connected to an electronic device. A person's fingerprint, eye, DNA, vein patterns, typing speed or patterns, gait, voice, face, and heart or brain signals are examples of a physical characteristic or a behavioral trait that can be detected or imaged by a biometric sensing device. A biometric sensing device can employ capacitance, ultrasonic, optical, resistive, thermal, or other sensing technologies to detect or image a biometric attribute. The term “biometric attribute” is meant to encompass a physical or behavioral trait that can be detected by a biometric sensing device.
Directional terminology, such as “top”, “bottom”, “front”, “back”, “leading”, “trailing”, etc., is used with reference to the orientation of the Figure(s) being described. Because components of embodiments described herein can be positioned in a number of different orientations, the directional terminology is used for purposes of illustration only and is in no way limiting. When used in conjunction with layers of a display or device, the directional terminology is intended to be construed broadly, and therefore should not be interpreted to preclude the presence of one or more intervening layers or other intervening features or elements. Thus, a given layer that is described as being formed, positioned, disposed on or over another layer, or that is described as being formed, positioned, disposed below or under another layer may be separated from the latter layer by one or more additional layers or elements.
Referring now to FIG. 1, one example of an electronic device 100 that can include, or be connected to, one or more biometric sensing devices is shown.
The electronic device 100 includes an enclosure 102 at least partially surrounding a display 104 and one or more buttons 106 or input devices. The enclosure 102 can form an outer surface or partial outer surface and protective case for the internal components of the electronic device 100, and may at least partially surround the display 104. The enclosure 102 can be formed of one or more components operably connected together, such as a front piece and a back piece. Alternatively, the enclosure 102 can be formed of a single piece operably connected to the display 104.
The display 104 can be implemented with any suitable technology, including, but not limited to, a multi-touch sensing touchscreen that uses liquid crystal display (LCD) technology, light emitting diode (LED) technology, organic light-emitting diode (OLED) technology, organic electroluminescence (OEL) technology, or another type of display technology. The button 106 can take the form of a home button, which may be a mechanical button, a soft button (e.g., a button that does not physically move but still accepts inputs), an icon or image on a display, and so on. Further, in some embodiments, the button 106 can be integrated as part of a cover glass of the electronic device.
One or more biometric sensing devices can be included in, or connected to the electronic device 100. In one embodiment, the button 106 can include a biometric sensing device. As one example, a fingerprint sensor can be integrated in the button. Additionally or alternatively, a biometric sensing device can be included in a portion of the display, or in the entire display. And in some embodiments, the enclosure 102 can include one or more biometric sensing devices, such as a fingerprint sensor, a thermal sensor, and a microphone that can be used in conjunction with a voice recognition application.
The processing device 200 can control some or all of the operations of the electronic device 100. The processing device 200 can communicate, either directly or indirectly, with substantially all of the components of the electronic device 100. For example, a system bus or signal line 214 or other communication mechanisms can provide communication between the processing device 200, the memory 202, the I/O device 204, the sensor 206, the power source 208, the network communications interface 210, and/or the biometric sensing device 212. The processing device 200 can be implemented as any electronic device capable of processing, receiving, or transmitting data or instructions. For example, the processing device 200 can be a microprocessor, a central processing unit (CPU), an application-specific integrated circuit (ASIC), a digital signal processor (DSP), or combinations of such devices. As described herein, the term “processing device” is meant to encompass a single processor or processing unit, multiple processors, multiple processing units, or other suitably configured computing element or elements.
The memory 202 can store electronic data that can be used by the electronic device 100. For example, a memory can store electronic data or content such as audio and video files, documents and applications, device settings and user preferences, timing signals, biometric images, data structures or databases, and so on. The memory 202 can be configured as any type of memory. By way of example only, the memory can be implemented as random access memory, read-only memory, Flash memory, removable memory, or other types of storage elements, or combinations of such devices.
The I/O device 204 can transmit and/or receive data to and from a user or another electronic device. One example of an I/O device is the button 106 in FIG. 1.
The electronic device 100 may also include one or more sensors 206 positioned substantially anywhere on the electronic device 100. The sensor or sensors 206 may be configured to sense substantially any type of characteristic, such as but not limited to, images, pressure, light, touch, heat, movement, relative motion, biometric data, and so on. For example, the sensor(s) 206 may be an image sensor, a heat sensor, a light or optical sensor, an accelerometer, a pressure transducer, a gyroscope, a magnet, a health monitoring sensor, and so on.
The power source 208 can be implemented with any device capable of providing energy to the electronic device 100. For example, the power source 208 can be one or more batteries or rechargeable batteries, or a connection cable that connects the electronic device 100 to another power source such as a wall outlet.
The network communication interface 210 can facilitate transmission of data to or from other electronic devices. For example, a network communication interface can transmit electronic signals via a wireless and/or wired network connection. Examples of wireless and wired network connections include, but are not limited to, cellular, Wi-Fi, Bluetooth, IR, and Ethernet.
The biometric sensing device 212 can be implemented as any suitable biometric sensor, scanner, and/or system. For example, the biometric sensing device can be a facial recognition device, an iris or retina scanner, a vein recognition device that can image the veins in a finger or palm, a facial biometrics scanner, and/or a thermal imaging scanner. Additionally, the biometric sensing device 212 can be implemented with any suitable sensing technology, including, but not limited to, capacitive, resistive, ultrasound, piezoelectric, and thermal sensing technology.
The biometric sensing device 212 can be connected to a secure processing system 216. The secure processing system can be included in the electronic device or in the biometric sensing device. The secure processing system 216 can receive biometric images captured by the biometric sensing device. The secure processing system 216 is generally used to store and manipulate secure data, including the biometric images, reference biometric images, and user identifier data associated with a user and his or her online account for an online store. The processing device 200 can be prohibited from accessing the secure data and the biometric images received from the biometric sensing device, which increases the security of the data and biometric images. For example, the secure data and biometric images are inaccessible or less accessible to other programs that may be running on the processing device 200.
In one embodiment, the secure processing system can include one or more secure processors, a secure persistent memory, and a secure non-persistent memory. Any suitable processing device and memory can be used in the secure processing system 216. Other components can be included in the secure processing system in some embodiments. Additionally or alternatively, a secure processing system can include only one memory. The secure processing system 216 is described in more detail below.
In embodiments described herein, the biometric sensing device can be one or more fingerprint sensors. A fingerprint sensor can capture images of one or more fingers, a portion of one or more fingers, and/or some or all of a palm or of a hand. In some embodiments, the fingerprint sensor is positioned at a location that a user's finger, fingers and/or hands are naturally in contact with as the user interacts with the electronic device. For example, as described earlier, an electronic device can include a fingerprint sensor in the display 104, the button 106, the enclosure 102, and/or as a separate electronic device that is connected to the electronic device 100.
As used herein, the terms “image” and “biometric image” include an image, a composite image formed from multiple images, and other types of data that can be captured by a biometric sensing device. The term “fingerprint image” includes an image, a composite image, and other types of data that can be captured by a fingerprint sensor. By way of example only, a fingerprint sensor can produce a data structure that defines the features in a fingerprint. Additionally, the term “fingerprint image” is meant to encompass an image or other data relating to a fingerprint of some or all of one or more fingers, some or all of a palm, some or all of a hand, and various combinations thereof. The term “finger” is meant to encompass one or more fingers, some or all of a palm, some or all of a hand, and various combinations thereof.
It should be noted that the components and configurations described above are illustrative only; other embodiments can include fewer, additional, or different components.
An example construction of a capacitive fingerprint sensor and how the capacitive fingerprint sensor captures a fingerprint will now be briefly described.
The capacitive fingerprint sensor 300 can capture a fingerprint image of at least a portion of the finger 302 by measuring capacitance differences between the finger 302 and the electrodes 314. A fingerprint is generally formed from ridges 304 and valleys 306 arranged in a unique pattern. Typically, the capacitance measured between a ridge 304 and one or more electrodes 314 varies from the capacitance measured between a valley 306 and one or more electrodes 314. The measured capacitance between a ridge and an electrode can be greater than the measured capacitance between a valley and an electrode because the ridge is closer to the electrode. The differences in the measured capacitances can be used to distinguish between ridges and valleys and produce a fingerprint image.
The skin on the finger 302 includes a dead skin layer 316 disposed over a live skin layer 318. The capacitive fingerprint sensor 300 typically images the dead skin layer 316 to obtain an image of the fingerprint. However, if a portion of the dead skin layer 316 is damaged or missing, the capacitive fingerprint sensor can obtain an image of the fingerprint by imaging the live skin layer 318 by itself, or by imaging both the remaining dead skin layer 316 and the exposed live skin layer 318.
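The ridge/valley discrimination described above can be approximated with a simple threshold over the per-electrode capacitance measurements. The grid shape, the normalized units, and the threshold value in the following Python sketch are arbitrary assumptions; an actual fingerprint sensor applies considerably more signal processing.

    def fingerprint_map(capacitances, threshold=0.5):
        # Readings above the threshold are treated as ridges (closer to the
        # electrode, higher capacitance); the remainder as valleys.
        return [[1 if c > threshold else 0 for c in row] for row in capacitances]

    # Example: a 3x4 grid of normalized capacitance readings.
    readings = [
        [0.8, 0.2, 0.7, 0.3],
        [0.9, 0.1, 0.6, 0.4],
        [0.7, 0.2, 0.8, 0.2],
    ]
    print(fingerprint_map(readings))  # 1 = ridge, 0 = valley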
In some embodiments, a user can determine a level of security when accessing the online store with an electronic device. For example, a user can require that a sequence of fingerprints be captured and matched to a sequence of reference fingerprint images before the user can access and/or make purchases on the online store. The user can specify the number of fingerprints in the sequence, which fingerprints are included in the sequence, and/or the order of the fingerprints in the sequence. For example, a user can require that two fingerprints be captured in a particular order, such as the right index finger followed by the left ring finger.
Additionally, a user can require that a password be entered and matched to a reference password. Access to the online store, or purchases on it, are then allowed only when a fingerprint or a sequence of fingerprints matches the respective reference fingerprint images and only after the password matches the reference password.
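The combined check, an ordered fingerprint sequence plus a password, can be sketched as follows. The finger labels and the label-equality match are placeholders for this illustration; in practice each captured fingerprint image would be matched against its reference image.

    def access_allowed(captured_sequence, reference_sequence, password, reference_password):
        # Both the password and the ordered fingerprint sequence must match.
        if password != reference_password:
            return False
        if len(captured_sequence) != len(reference_sequence):
            return False
        return all(c == r for c, r in zip(captured_sequence, reference_sequence))

    print(access_allowed(["right_index", "left_ring"],
                         ["right_index", "left_ring"],
                         "s3cret", "s3cret"))  # True
    print(access_allowed(["left_ring", "right_index"],
                         ["right_index", "left_ring"],
                         "s3cret", "s3cret"))  # False: wrong order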
Referring now to FIGS. 4 and 5, a method for setting up a biometric sensing device for use with a user's account on an online store is shown.
Initially, an account password is received at block 400. In one embodiment, the account password can be entered by a user on an electronic device operatively connected to an online store. The account password can be associated with the online store. The account password can be entered through a dialog box in a user interface. In one embodiment, the account password can be transmitted by the online store to an online payment service that matches the account password to an account in the online store (see step 500 in FIG. 5).
When the entered account password matches the password associated with the account, the process continues at block 404 where an online account token can be received from the online payment service (step 502 in FIG. 5). A determination can then be made at block 406 as to whether a biometric sensing device included in, or connected to, the electronic device is to be used.
When the biometric sensing device is to be used, the method continues at block 408 where a user can set a passcode for the biometric sensing device. A biometric enrollment process can then be performed on the electronic device at block 410. Generally, an enrollment process can include capturing one or more biometric images of a biometric attribute and storing at least one biometric image in memory. At least one of the captured biometric images can then be used as a reference biometric image. As one example, when the biometric sensing device is a fingerprint sensor, a fingerprint image can be acquired and stored in memory during the enrollment process.
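A minimal enrollment sketch, assuming a hypothetical sensor interface with a capture() method, is shown below. The number of captures and the choice to keep the last capture as the reference are arbitrary; an actual enrollment process may build a composite image from many captures.

    class FakeFingerprintSensor:
        # Stand-in for a biometric sensing device; returns canned image bytes.
        def __init__(self):
            self._count = 0

        def capture(self) -> bytes:
            self._count += 1
            return ("fingerprint-image-%d" % self._count).encode()

    def enroll(sensor, num_captures=3) -> bytes:
        # Capture several images of the biometric attribute and keep at least
        # one (here, simply the last capture) as the reference biometric image.
        images = [sensor.capture() for _ in range(num_captures)]
        return images[-1]

    reference_image = enroll(FakeFingerprintSensor())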
A determination can then be made at block 412 as to whether or not the biometric sensing device is to be used for purchases from the online store. As one example, a user can be prompted to approve or reject the use of the biometric sensing device with a dialog box or menu. The method ends if the biometric sensing device will not be used to make purchases on the online store.
When the biometric sensing device will be used to make purchases, the process passes to block 414 where the online account token and user identifier data are transmitted to a secure processing system (e.g., 216 in FIG. 2), where the online account token and the user identifier data can be stored.
In some embodiments, a secure processing system 216 can include a non-persistent secure memory and a persistent secure memory. The online account token can be transmitted to the secure processing system 216 and stored in the non-persistent secure memory. Thus, the online account token may be cleared automatically from the non-persistent secure memory each time the non-persistent memory loses power, such as when the electronic device is turned off. The user identifier data can be transmitted to the secure processing system 216 and stored in the persistent secure memory.
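The division between the two secure memories can be modeled with the short Python sketch below. The class and method names are invented for this illustration; the point is only that the online account token lives in memory that is cleared on power loss, while the user identifier data persists.

    class SecureProcessingSystem:
        def __init__(self):
            self.persistent = {}      # e.g., user identifier data
            self.non_persistent = {}  # e.g., online account token

        def store_token(self, token: bytes):
            self.non_persistent["online_account_token"] = token

        def store_user_identifier_data(self, data: bytes):
            self.persistent["user_identifier_data"] = data

        def power_off(self):
            # Only the non-persistent secure memory is cleared when power is lost.
            self.non_persistent.clear()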
Referring now to FIGS. 6 and 7, a method for completing a purchase on the online store using a biometric image is shown.
Initially, as shown in block 600, a determination can be made as to whether the biometric sensing device is to be used to complete a purchase from the online store. If so, the process continues at block 602 where a biometric image can be captured and transmitted to a secure processing system (step 700 in FIG. 7). A determination can then be made at block 604 as to whether the reference biometric image has expired. The method ends if the reference biometric image has expired.
In some embodiments, a user can set an expiration date for his or her reference biometric image. The user can allow the reference biometric image or images to be used only for a set period of time. After the reference biometric image expires, a user can perform another enrollment process to create another reference biometric image. Additionally or alternatively, an online store can require a reference biometric image to expire after a given amount of time. For example, the online store may require reference biometric images to expire as part of a fraud prevention program.
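An expiration check of this kind reduces to comparing the enrollment time of the reference biometric image against a lifetime. The one-year default in the following sketch is an arbitrary assumption; the lifetime could be set by the user or required by the online store.

    from datetime import datetime, timedelta

    def reference_image_expired(enrolled_at, lifetime=timedelta(days=365), now=None):
        # True once the reference biometric image is older than the selected lifetime.
        now = now or datetime.now()
        return now - enrolled_at > lifetime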
When the reference biometric image has not expired, the method passes to block 606 where a determination is made as to whether the biometric image received at block 602 matches the reference biometric image. The method ends if the biometric image does not match the reference image. When the biometric image matches the reference image, the process continues at block 608 where a user can complete the purchase on the online store.
In one embodiment, a purchase can be completed by having a processing device, such as a secure processing device, countersign the online account token stored in the non-persistent secure memory with the hash of the DSID (an identifier associated with the user's account) and transmit the countersigned online account token to the online store (step 702 in FIG. 7). The online store can then receive the countersigned online account token (step 704 in FIG. 7) and permit the purchase to be completed (step 706 in FIG. 7).
In some embodiments, a window of time can be set in which a user can make purchases repeatedly without having to reenter a biometric image. The online account token can include a timestamp that indicates a start time for the window. As one example, when the biometric image matches the reference biometric image at block 606, a fifteen minute window can be created where a user can make multiple purchases. The window can then close after fifteen minutes and the user will have to re-enter his or her biometric image to complete any other purchases.
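The purchase window can be checked against the timestamp carried by the online account token, as in the following sketch. The fifteen-minute default simply mirrors the example above; any window length could be used.

    from datetime import datetime, timedelta

    def purchase_window_open(token_timestamp, window=timedelta(minutes=15), now=None):
        # Additional purchases are allowed until the window measured from the
        # token's timestamp elapses; afterwards a new biometric image is required.
        now = now or datetime.now()
        return now - token_timestamp <= window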
Referring now to FIG. 8, a method for making purchases on the online store using an online account password is shown. Initially, the user can enter his or her online account password, and a determination can be made as to whether the entered password matches the user identifier data stored in the secure processing system. The method ends if the password does not match the user identifier data.
When the online account password matches the user identifier data, the process continues at block 804 where the user is permitted to make purchases on the online store. A purchase can be completed using the steps 702, 704, 706 described in conjunction with FIG. 7.
When the passcode matches the user identifier data, the process can continue at block 908 where the online account token and user identifier data can be transmitted to a processing system, such as the secure processing system 216 (step 506 in FIG. 5).
Referring now to FIG. 10, a method for changing one or more settings of the biometric sensing device is shown. Initially, the user can enter the passcode for the biometric sensing device, and a determination can be made as to whether the entered passcode matches the passcode in the user identifier data. The method ends if the passcodes do not match.
When the passcode matches the passcode in the user identifier data, the process continues at block 1004 where one or more settings for the biometric sensing device can be changed. For example, a user can add a biometric image of a new biometric attribute, such as an image of a new finger. Similarly, a user can delete a biometric image.
Next, as shown in block 1006, the online account token can be deleted from the secure processing system and the user identifier data invalidated. The method ends after block 1006. In some embodiments, a UUID is associated with each new biometric image. Thus the user identifier data may be invalidated because the UUID can change based on the modified setting or settings.
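The effect of associating a fresh UUID with each biometric image can be illustrated as follows. The dictionary layout is an assumption for this sketch; it shows only that changing the enrolled images yields new UUIDs, so data derived from the old UUIDs no longer matches.

    import uuid

    def rebuild_user_identifier_data(enrolled_image_names):
        # A fresh UUID is generated for each enrolled biometric image.
        return {name: uuid.uuid4() for name in enrolled_image_names}

    before = rebuild_user_identifier_data(["right_index"])
    after = rebuild_user_identifier_data(["right_index", "left_ring"])
    print(before["right_index"] == after["right_index"])  # False: the UUID changed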
As shown in block 1102, the user enters his or her online account password for the online store. A determination can then be made at block 1104 as to whether the entered account password matches the password stored in the user identifier data (e.g., user identifier data stored in persistent secure memory). The method ends if the password does not match the user identifier data. When the entered account password matches the user identifier data, the process passes to block 1106 where an online account token can be transmitted to a secure processing device. In some embodiments, the user identifier data does not have to be remapped because the same account password is associated with the user identifier data. The user is then permitted to make purchases based on a biometric image (block 1108), and the method ends.
In some embodiments, the online account password can be deleted from the secure processing system when a user signs out of the online store or logs off the electronic device. The user identifier data, however, can still be stored in the secure processing system when the user identifier data is stored in a persistent memory.
Various embodiments have been described in detail with particular reference to certain features thereof, but it will be understood that variations and modifications can be effected within the spirit and scope of the disclosure. And even though specific embodiments have been described herein, it should be noted that the application is not limited to these embodiments. In particular, any features described with respect to one embodiment may also be used in other embodiments, where compatible. Likewise, the features of the different embodiments may be exchanged, where compatible.
This application is a continuation of U.S. patent application Ser. No. 14/022,104, filed Sep. 9, 2013, and entitled “Use of a Biometric Image in Online Commerce,” the contents of which are incorporated herein by reference as if fully disclosed herein.