Systems and methods for cross coupling risk analytics and one-time-passcodes

Information

  • Patent Grant
  • Patent Number
    10,949,520
  • Date Filed
    Tuesday, November 5, 2019
  • Date Issued
    Tuesday, March 16, 2021
Abstract
Example embodiments provide systems and methods for validating an action using a physical token, such as a near-field-communications (NFC)-capable chip. A server may receive a request to perform the action, and may require validation from the holder of the physical token. The holder of the physical token may log into an application using their login credentials, providing a first tier of authentication. The holder may then scan the physical token with a reader on their mobile device, which provides a second tier of authentication. The scan may reveal a value for a counter on the physical token, which may be compared to a counter at the server in order to validate that the physical token has been used as expected. If the server deems it appropriate, a third (or further) tier may be required, such as scanning a photographic identification of the holder.
Description
FIELD OF THE INVENTION

The present disclosure relates to cryptography, and more particularly, to systems and methods for cryptographic authentication of contactless cards.


BACKGROUND

Data security and transaction integrity are of critical importance to businesses and consumers. This need continues to grow as electronic transactions constitute an increasingly large share of commercial activity.


When a questionable or suspicious transaction is processed, verification of the transaction may be required. Conventionally, this may involve sending a message to a user via email or short message service (SMS), requesting that the user confirm their intent to engage in the transaction.


In addition to being cumbersome, these services are vulnerable to attack and may therefore not provide a sufficient level of security. Moreover, if a user keeps their card with their mobile device (e.g., storing both in a purse, or keeping the card in a wallet that is often co-located with the mobile device), a fraudulent actor may be in possession of the device that is used to authenticate the transaction.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1A depicts an environment suitable for use with exemplary embodiments.



FIG. 1B depicts an example of a contactless card having a physical token.



FIG. 1C depicts the structure of an exemplary physical token.



FIG. 2A depicts an exemplary interface for a mobile application associated with an owner of a contactless card.



FIG. 2B depicts an exemplary interface when the physical token is read by a reader on the owner's mobile device.



FIG. 2C depicts an example of data exchange between a contactless card and a client device.



FIG. 3 depicts an exemplary data structure for a message between the contactless card and the client device, or between the client device and a remote server, according to an embodiment.



FIG. 4 depicts exemplary data exchanges between the client device and one or more remote servers.



FIG. 5 is a flowchart depicting exemplary client-side transaction validation logic.



FIG. 6 is a flowchart depicting exemplary server-side transaction validation logic.



FIG. 7 depicts an exemplary computing system suitable for use with exemplary embodiments.



FIG. 8 depicts an exemplary network environment suitable for use with exemplary embodiments.





DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS

Exemplary embodiments provide techniques for increasing the security of contactless cards while allowing transactions to be carried out in a more efficient and user-friendly way. These embodiments may be used to ensure that the card is physically present with the card owner (which may prevent transactions when the card has been stolen for a fraudulent in-person transaction) and ensure that the card owner authorizes the transaction (which can be used to verify either an in-person or a remote transaction). Moreover, the process of validating the transaction involves interaction with a physical token on the card. Because of this interaction, the card must be used to authorize both in-person and remote transactions, and unauthorized transactions can be quickly identified and denied.


More specifically, when a user wishes to access their account (e.g., on a mobile device), a system requests authorization of the card owner. The card owner is asked to sign into an application executing on their mobile device, and to scan a physical token associated with their card (e.g., a contactless chip capable of wireless communication with the mobile device, such as NFC, Bluetooth, WiFi, etc., or wired communication, such as by a USB connection). These steps establish that the card is in the physical possession of the card owner; in theory, only the card owner should be able to sign into the mobile application, and if the card is scanned by the local NFC reader, then the card's presence in the possession of the card owner may be confirmed.


Each time the card is used, a counter value stored on the physical token may be incremented and transmitted to a remote server for validation. As part of the scanning process to validate the transaction, the counter present on the card may be checked against the remote copy stored on the server. If the counter value read from the card is not the value expected by the server, this may indicate that the card or the owner's account has been used for unauthorized transactions: either the card was used and the transaction was not recorded, or an attacker is replaying a captured previous session in an attempt to repeat a transaction.


In many cases, the counter value on the card may not perfectly match the counter value stored on the server. For instance, in the case of a partial read (which might occur if the user keeps their card near their phone without intending to actually read the value on the physical token), the remote server might not be updated with an increased counter value, although the counter on the token may be locally updated. The degree to which the counter values must agree may depend on the risk level of the transaction and/or the current risk profile of the environment (e.g., whether a banking institution is currently under attack). Thus, for low-risk transactions, the counter value on the card must match the counter value on the server within a certain predetermined range (which allows the system to account for accidental reads of the card). For high-risk transactions, the counter values must match exactly, or within a narrower predetermined range. The ranges may be determined dynamically based on what is known of the user's regular interactions with the card (e.g., if the user has historically been susceptible to accidental reads of the token, then the range may be set wider as compared to a user whose card has not generally been susceptible to such reads). If the system determines that a discrepancy exists, a first course of action may be to request that the user re-validate their card with the application on their mobile device. In this case, the counter value for this additional authentication should have advanced beyond the value contained in the suspect authentication request. If the system still cannot validate the counter value, or for especially high-risk requests, further validation may be required (e.g., the application may request that the user provide biometric validation, a picture of the user, or a scan of the user's identification; alternatively, the user may be asked to physically present themselves at a location, such as a bank, for in-person validation). These actions allow the validation process to be adapted to the risk profile.


Similarly, the risk profile may be altered based on information gleaned during the validation process. For instance, if the transaction was originally flagged as low-risk, but the counter values read during the process indicate fraudulent activity may have occurred, then the risk associated with the transaction may be elevated. In another example, if the validation of the counter value triggered a re-validation process, the risk level associated with this user and/or transaction may be elevated.


Furthermore, these two options (adjusting the authentication strength based on the risk profile, and adjusting the risk profile based on the authentication results) can be combined and used in tandem.


The following description of embodiments provides non-limiting representative examples referencing numerals to particularly describe features and teachings of different aspects of the invention. The embodiments described should be recognized as capable of implementation separately, or in combination, with other embodiments from the description of the embodiments. The description of embodiments should facilitate understanding of the invention to such an extent that other implementations, not specifically covered but within the knowledge of a person of skill in the art having read the description of embodiments, would be understood to be consistent with an application of the invention.



FIG. 1A illustrates a data transmission environment 100 according to an example embodiment. As further discussed below, the environment 100 may include a contactless card including a physical token 106, a client device 104, a network 114, and a number of servers 116, 128. Although FIG. 1A illustrates a particular configuration of components, one of ordinary skill in the art will understand that other configurations including more or fewer components, or components in another configuration, may be used.


The environment 100 may include one or more contactless cards, which are further explained below with reference to FIG. 1B. In some examples, a contactless card may be in wireless communication, for example NFC communication, with the client device 104. The contactless card may include a physical token 106, such as a contactless chip (see FIG. 1C). The physical token 106 may maintain a copy of the above-noted counter value 108, which may be incremented each time the physical token is read by a reader (such as the NFC reader 110).


The environment 100 may include a client device 104, which may be a network-enabled computer. As referred to herein, a network-enabled computer may include, but is not limited to: e.g., a computer device, or communications device including, e.g., a server, a network appliance, a personal computer (PC), a workstation, a mobile device, a phone, a handheld PC, a personal digital assistant (PDA), a thin client, a fat client, an Internet browser, or other device. The client device 104 also may be a mobile device; for example, a mobile device may include an iPhone, iPod, iPad from Apple® or any other mobile device running Apple's iOS operating system, any device running Microsoft's Windows® Mobile operating system, and/or any other smartphone or like wearable mobile device.


The client device 104 and/or the contactless card including the physical token 106 may be associated with a user 102, which may be the owner of the contactless card. The user 102 may define credentials for accessing a mobile application on the client device 104, which may be an application associated with a service provider of the contactless card.


The client device 104 may include a near-field communications reader 110 suitable for communicating with the physical token 106; for example, the NFC reader 110 may be used to read the counter value 108 from the physical token 106.


In various examples according to the present disclosure, the client device 104 of the environment 100 may execute one or more applications, such as software applications. The software applications may enable network communications with one or more components of the environment 100 and may transmit and/or receive data. Among other computer-executable logic, the client device 104 may include client-side validation logic 112 (such as the logic depicted in more detail in connection with FIG. 5).


The client device 104 may be in communication with one or more servers 116, 128 via one or more networks 114, and may operate as a respective front-end to back-end pair with a transaction validation server 116. The client device 104 may transmit, for example from a mobile device application executing on client device 104, one or more requests to the server 116. The one or more requests may be associated with retrieving data from the server 116. The server 116 may receive the one or more requests from client device 104. Based on the one or more requests from the client device 104, the server 116 may be configured to retrieve the requested data from one or more databases (not shown). Based on receipt of the requested data from the one or more databases, the server 116 may be configured to transmit the received data to the client device 104, the received data being responsive to one or more requests.


The environment 100 may include one or more servers 116, 128. In some examples, the servers 116, 128 may include one or more processors, which are coupled to memory. The servers 116, 128 may be configured as a central system, server or platform to control and call various data at different times to execute a plurality of workflow actions. The servers 116, 128 may be configured to connect to the one or more databases. The client device 104 may be connected to at least one server 116, 128.


In one embodiment, a third-party server 128 may request that a transaction be validated. For instance, the third-party server 128 may be a server associated with a vendor selling a product or service, for which a purchase request is submitted in the name of the user 102. The third-party server 128 may request that the purchase be validated with the service provider.


To that end, the third-party server 128 may communicate, via the network 114, with a transaction validation server 116 affiliated with the service provider. To validate the transaction, the server 116 may execute server-side validation logic 118 (such as the logic depicted in FIG. 6). The logic 118 may maintain a counter window 120 defining a range of acceptable counter values (which, as noted above, account for accidental reads and other unintentional incrementing of the counter value 108). The counter window 120 may include several different ranges associated with different risk levels, such as a relatively wide range for low-risk transactions, and a relatively narrow range (which may require an exact match) for high-risk transactions.


A counter value 126 may be stored in the user database 122 and may be indexed to a record 124 associated with the physical token 106. The logic 118 may apply the counter window 120 when evaluating the counter value 126 stored in the user database 122. For example, upon receiving a new counter value 108, the logic 118 may compare the new counter value 108 to the stored counter value 126, to see if the new value 108 exceeds the stored value 126. If so, the logic 118 may determine whether the new value 108 exceeds the stored value 126 by more than the maximum window value (e.g., the sum of the stored value 126 and the window 120). If the new value is less than the combination of the stored value 126 and the window 120, then the new value 108 may be determined to be acceptable. If not, the new value 108 may be rejected and further action may be taken (as described herein). The user database 122 need not necessarily be a database, but may be any data structure suitable for storing a counter value 126 associated with the physical token 106 of the user 102.
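As a concrete illustration of the window check described in this paragraph, the following Python sketch compares a newly received counter value against the stored value plus a risk-dependent window. The function and the example values are hypothetical; the patent does not prescribe a particular implementation.

```python
def counter_within_window(new_value: int, stored_value: int, window: int) -> bool:
    """Return True if the counter read from the physical token is acceptable.

    The new value must have advanced past the stored value (a stale or replayed
    counter is rejected), but may not exceed the stored value by more than the
    allowed window, which absorbs accidental or partial reads of the token.
    """
    if new_value <= stored_value:
        return False  # stale counter: possible replay of a captured session
    return new_value <= stored_value + window


# Hypothetical usage: a low-risk transaction tolerates a wider window.
assert counter_within_window(new_value=105, stored_value=100, window=10)
assert not counter_within_window(new_value=120, stored_value=100, window=10)
assert not counter_within_window(new_value=100, stored_value=100, window=10)
```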



FIG. 1B illustrates one or more contactless cards 130, which may comprise a payment card, such as a credit card, debit card, or gift card, issued by a service provider 132 displayed on the front or back of the card 130. In some examples, the contactless card 130 is not related to a payment card, and may comprise, without limitation, an identification card. In some examples, the payment card may comprise a dual interface contactless payment card. The contactless card 130 may comprise a substrate 134, which may include a single layer or one or more laminated layers composed of plastics, metals, and other materials. Exemplary substrate materials include polyvinyl chloride, polyvinyl chloride acetate, acrylonitrile butadiene styrene, polycarbonate, polyesters, anodized titanium, palladium, gold, carbon, paper, and biodegradable materials. In some examples, the contactless card 130 may have physical characteristics compliant with the ID-1 format of the ISO/IEC 7810 standard, and the contactless card may otherwise be compliant with the ISO/IEC 14443 standard. However, it is understood that the contactless card 130 according to the present disclosure may have different characteristics, and the present disclosure does not require a contactless card to be implemented in a payment card.


The contactless card 130 may also include identification information 136 displayed on the front and/or back of the card, and a contact pad 138 representing a physical token. The contact pad 138 may be configured to establish contact with another communication device, such as a user device, smart phone, laptop, desktop, or tablet computer. The contactless card 130 may also include processing circuitry, an antenna, and other components not shown in FIG. 1B. These components may be located behind the contact pad 138 or elsewhere on the substrate 134. The contactless card 130 may also include a magnetic strip or tape, which may be located on the back of the card (not shown in FIG. 1B).


As illustrated in FIG. 1C, the contact pad 138 of FIG. 1B may include processing circuitry 140 for storing and processing information, including a microprocessor 142 and a memory 144. It is understood that the processing circuitry 140 may contain additional components, including processors, memories, error and parity/CRC checkers, data encoders, anticollision algorithms, controllers, command decoders, security primitives and tamperproofing hardware, as necessary to perform the functions described herein.


The memory 144 may be a read-only memory, write-once read-multiple memory or read/write memory, e.g., RAM, ROM, and EEPROM, and the contactless card 130 may include one or more of these memories. A read-only memory may be factory programmable as read-only or one-time programmable. One-time programmability provides the opportunity to write once then read many times. A write once/read-multiple memory may be programmed at a point in time after the memory chip has left the factory. Once the memory is programmed, it may not be rewritten, but it may be read many times. A read/write memory may be programmed and re-programmed many times after leaving the factory. It may also be read many times.


The memory 144 may be configured to store one or more applets 146, one or more counters 108, and a customer identifier 148. The one or more applets 146 may comprise one or more software applications configured to execute on one or more contactless cards, such as a Java Card applet. However, it is understood that the applets 146 are not limited to Java Card applets, and instead may be any software application operable on contactless cards or other devices having limited memory. The one or more counters 108 may comprise a numeric counter sufficient to store an integer. The customer identifier 148 may comprise a unique alphanumeric identifier assigned to a user of the contactless card 130, and the identifier may distinguish the user of the contactless card from other contactless card users. In some examples, the customer identifier 148 may identify both a customer and an account assigned to that customer and may further identify the contactless card associated with the customer's account.
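For readers who prefer a concrete picture of the contents of memory 144, the following minimal Python sketch models the stored items (applet identifiers, counter 108, and customer identifier 148). The field names and the read-time increment behavior shown here are illustrative assumptions, not language from the claims.

```python
from dataclasses import dataclass, field


@dataclass
class TokenMemory:
    """Illustrative model of the data held in the memory of the physical token."""
    applet_ids: list = field(default_factory=lambda: ["NDEF_OTP_APPLET"])  # hypothetical applet name
    counter: int = 0       # numeric counter incremented on each read of the token
    customer_id: str = ""  # unique alphanumeric identifier for the card holder

    def on_read(self) -> int:
        """Increment and return the counter, as may happen on each NFC read."""
        self.counter += 1
        return self.counter
```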


The processor and memory elements of the foregoing exemplary embodiments are described with reference to the contact pad, but the present disclosure is not limited thereto. It is understood that these elements may be implemented outside of the pad 138 or entirely separate from it, or as further elements in addition to processor 142 and memory 144 elements located within the contact pad 138.


In some examples, the contactless card 130 may comprise one or more antennas 150. The one or more antennas 150 may be placed within the contactless card 130 and around the processing circuitry 140 of the contact pad 138. For example, the one or more antennas 150 may be integral with the processing circuitry 140 and the one or more antennas 150 may be used with an external booster coil. As another example, the one or more antennas 150 may be external to the contact pad 138 and the processing circuitry 140.


In an embodiment, the coil of contactless card 130 may act as the secondary of an air core transformer. The terminal may communicate with the contactless card 130 by cutting power or amplitude modulation. The contactless card 130 may infer the data transmitted from the terminal using the gaps in the contactless card's power connection, which may be functionally maintained through one or more capacitors. The contactless card 130 may communicate back by switching a load on the contactless card's coil or load modulation. Load modulation may be detected in the terminal's coil through interference.


As explained above, the contactless cards 130 may be built on a software platform operable on smart cards or other devices having limited memory, such as JavaCard, and one or more applications or applets may be securely executed. Applets may be added to contactless cards to provide a one-time password (OTP) for multifactor authentication (MFA) in various mobile application-based use cases. Applets may be configured to respond to one or more requests, such as near field data exchange (NDEF) requests, from a reader, such as a mobile NFC reader, and produce an NDEF message that comprises a cryptographically secure OTP encoded as an NDEF text tag.


As noted above, exemplary transactions may validate a transaction requested of an account associated with the contactless card via the logic 112 executing on the client device 104. FIGS. 2A-2B depict exemplary interfaces that may be presented on the client device 104 in response to the logic.


Prior to displaying the interfaces, the user of the client 104 may be notified that a transaction requires validation. For instance, the user may receive an SMS message from the service provider, may receive a notification through the service provider's application, may receive a call or an email, etc.


Upon receiving the notification, the user may log into the service provider's application. The user may, for instance, supply a username and password, which may validate the user's identity. In other embodiments, the user may be validated in other ways, such as through biometric data. In some embodiments, login may utilize two-factor authentication (2FA).


When the user logs into the application, they may be presented with an interface, such as the interface 200 depicted in FIG. 2A. In the interface, a message 202 may be displayed indicating that a questionable transaction has been received and requires validation. The message 202 may include details of the transaction, such as the value of the transaction, the name of the vendor attempting to validate the transaction, etc.


The interface 200 may include an interactable element 204 allowing the user to flag the transaction as fraudulent, if the user did not authorize the transaction. Upon selecting the interactable element 204, the application may transmit a fraud alert message to the transaction validation server indicating that the transaction in question is not approved.


The message 202 may also include instructions for validating the transaction, if the user did authorize the transaction. In one embodiment, validating the transaction may involve tapping the card 130 to a reader on the back of the client device 104, as shown in FIG. 2B. The reader may read the counter value from the physical token on the card 130, and may generate a message 300 (see FIG. 3) including the counter value 304 and an authentication cryptogram 306. The message 300 may be encrypted.


The counter value 304 may correspond to the counter value most recently read from the card, and the authentication cryptogram 306 may be generated based on cryptographic keys stored on the physical token 138 and may be used to authenticate the card with the transaction validation server and ensure that the message 300 has not been tampered with or corrupted.


The message 300 may also include a token identifier 302, which may identify the card 130 and/or the user associated with the card. For instance, the token identifier 302 may correspond to the unique customer identifier 148 stored in the physical token 138.
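A minimal sketch of message 300 follows, showing only the three fields named above (token identifier 302, counter value 304, cryptogram 306). The JSON serialization and field names are assumptions made for illustration; as noted, the actual message may be encrypted and may contain additional components.

```python
from dataclasses import dataclass
import json


@dataclass
class ValidationMessage:
    """Illustrative structure for message 300."""
    token_identifier: str  # maps the message to a card and/or customer record
    counter_value: int     # counter value most recently read from the token
    cryptogram: str        # authentication cryptogram (hex-encoded in this sketch)

    def to_bytes(self) -> bytes:
        # Encryption of the message is omitted from this sketch.
        return json.dumps(self.__dict__).encode("utf-8")
```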


Upon receiving the message 300, the transaction validation server may decrypt the message 300, validate the card and the message based on the cryptogram 306, match the message to a user account based on the token identifier 302, and retrieve a user record 124 (see FIG. 1A) from the transaction validation server corresponding to the user account. The transaction validation server may then compare the counter value 304 to the corresponding counter value 126 stored in the user database 122 to verify that the number of reads or transactions on the card matches the expected counter value stored on the server. This may validate that the user is in possession of the card (i.e., that the message 300 is not forged) and that the number of transactions performed by the user matches the service provider's expectation. If the counter values are not in sync, this may indicate that unauthorized transactions have been attempted and may result in the present transaction being declined (or may result in additional validation actions being required).
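The server-side handling described in this paragraph can be sketched as follows. The cryptogram check shown here uses HMAC-SHA256 purely as a stand-in for whatever cryptographic scheme the card and server share, and the `user_db` structure, key names, and window handling are assumptions made for the example.

```python
import hashlib
import hmac


def validate_message(message: dict, user_db: dict, window: int) -> bool:
    """Hypothetical server-side validation of message 300.

    `message` carries 'token_identifier', 'counter_value', and 'cryptogram';
    `user_db` maps token identifiers to records holding the server-side
    counter value and a per-card MAC key (bytes).
    """
    record = user_db.get(message["token_identifier"])
    if record is None:
        return False  # unknown token: cannot match the message to a user account

    # Verify the cryptogram so forged or corrupted messages are rejected.
    expected = hmac.new(
        record["mac_key"],
        f'{message["token_identifier"]}:{message["counter_value"]}'.encode(),
        hashlib.sha256,
    ).hexdigest()
    if not hmac.compare_digest(expected, message["cryptogram"]):
        return False

    # The counter must have advanced, but only within the allowed window.
    stored = record["counter"]
    if not (stored < message["counter_value"] <= stored + window):
        return False

    record["counter"] = message["counter_value"]  # re-sync the server-side copy
    return True
```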


One of ordinary skill in the art will understand that the message 300 is depicted in a simplified format. In some embodiments, other components may be present in the message, or the depicted components may be combined or modified.



FIG. 2C is a timing diagram illustrating an example sequence for providing authenticated access according to one or more embodiments of the present disclosure. A system may include a contactless card 130 and a client device 104, which may include an application (which may include the logic 112) and a processor.


At 202, the application communicates with the contactless card 130 (e.g., after being brought near the contactless card 130). Communication between the application and the contactless card 130 may involve the contactless card 130 being sufficiently close to a card reader (not shown) of the client device 104 to enable NFC data transfer between the application and the contactless card 130.


At step 204, after communication has been established between client device 104 and contactless card 130, the contactless card 130 generates a message authentication code (MAC) cryptogram. In some examples, this may occur when the contactless card 130 is read by an application (e.g., on the client 104). In particular, this may occur upon a read, such as an NFC read, of a near field data exchange (NDEF) tag, which may be created in accordance with the NFC Data Exchange Format. For example, a reader, such as the application, may transmit a message, such as an applet select message, with the applet ID of an NDEF producing applet. Upon confirmation of the selection, a sequence of select file messages followed by read file messages may be transmitted. For example, the sequence may include “Select Capabilities file”, “Read Capabilities file”, and “Select NDEF file”. At this point, a counter value maintained by the contactless card 130 may be updated or incremented, which may be followed by “Read NDEF file.” At this point, the message may be generated which may include a header and a shared secret. Session keys may then be generated. The MAC cryptogram may be created from the message, which may include the header and the shared secret. The MAC cryptogram may then be concatenated with one or more blocks of random data, and the MAC cryptogram and a random number (RND) may be encrypted with the session key. Thereafter, the cryptogram and the header may be concatenated, and encoded as ASCII hex and returned in NDEF message format (responsive to the “Read NDEF file” message).
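To make the card-side sequence easier to follow, the sketch below condenses it into a single function: derive a session key from the card key and the just-incremented counter, compute a MAC over the header and shared secret, append random data, and return an ASCII-hex payload. The HMAC-SHA256 construction and the parameter names are assumptions for illustration only; they are not the specific cryptogram algorithm used by the card.

```python
import hashlib
import hmac
import os


def generate_ndef_payload(card_key: bytes, counter: int,
                          header: bytes, shared_secret: bytes) -> str:
    """Illustrative card-side cryptogram generation (not the patented algorithm)."""
    # Session key derived from the card key and the current counter value.
    session_key = hmac.new(card_key, counter.to_bytes(4, "big"), hashlib.sha256).digest()

    # MAC cryptogram computed over the message (header plus shared secret).
    mac = hmac.new(session_key, header + shared_secret, hashlib.sha256).digest()

    # Concatenate with random data; encryption of this block is omitted here.
    payload = header + mac + os.urandom(8)
    return payload.hex().upper()  # returned as ASCII hex in the NDEF response
```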


In some examples, the MAC cryptogram may be transmitted as an NDEF tag, and in other examples the MAC cryptogram may be included with a uniform resource indicator (e.g., as a formatted string).


In some examples, the application may be configured to transmit a request to contactless card 130, the request comprising an instruction to generate a MAC cryptogram.


At 206, the contactless card 130 sends the MAC cryptogram to the application responsive to the instruction from the client device 104.


At 208, the application communicates the MAC cryptogram with the processor.


At 210, the processor verifies the MAC cryptogram. For example, the MAC cryptogram may be decrypted. In some examples, verifying the MAC cryptogram may be performed by a device other than client device 104, such as a server connected to the client device 104. For example, processor may output the MAC cryptogram for transmission to the server, which may verify the MAC cryptogram.



FIG. 4 is a timing diagram depicting exemplary data exchanges between the operating system on the client device, an application on the client device, a transaction validation server, and a third-party server that processes a transaction.


At 402, a third-party server (e.g., a server associated with a vendor for which a credit transaction is requested) may submit a transaction request to a transaction validation server associated with a service provider. The transaction request may be generated in response to scanning a credit card, entering a credit card number into a vendor's payment system, an online transaction with the vendor, etc. The service provider may be identified as part of the process of receiving information relating to the card.


The transaction request may be transmitted to the transaction validation server, which may apply risk analytics 404 to the requested transaction. The risk analytics 404 may identify a risk level associated with the transaction. For instance, the risk analytics 404 may consider the amount of the purchase, the location of the purchase, the user's previous purchase history, the overall risk environment (including such factors as whether an institution such as a bank that issued the contactless card 130 is currently under attack, or whether other institutions have reported a recent increase in fraud), etc. in determining whether the transaction is typical of the user's activity (and therefore low-risk) or is atypical (and is therefore high-risk).


Based on the risk analytics 404, an initial risk score may be assigned to the transaction. A set of risk tiers may be defined, each risk tier associated with a range of risk scores and a required validation action. For instance, for low risk scores, a low risk tier may require no validation action. For moderate risk scores, an intermediate risk tier may require validation by a user by scanning their physical token with their mobile client (in conjunction with logging into an application on the mobile client). For high risk scores, a high risk tier may require the validation actions of the intermediate tier with additional validation actions required. For extremely high risk scores, the transaction may be denied outright.
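One way to express the tiering described above is a simple lookup from risk score to required validation action, as in the Python sketch below. The score scale, tier boundaries, and action names are hypothetical; the patent leaves these choices to the implementation.

```python
from enum import Enum


class ValidationAction(Enum):
    NONE = "no additional validation"
    TOKEN_SCAN = "log in and scan the physical token"
    TOKEN_SCAN_ESCALATED = "token scan plus escalated validation (e.g., photo ID)"
    DENY = "deny the transaction outright"


# Hypothetical tier boundaries on a 0.0-1.0 risk score.
RISK_TIERS = [
    (0.25, ValidationAction.NONE),                  # low risk
    (0.50, ValidationAction.TOKEN_SCAN),            # moderate risk
    (0.85, ValidationAction.TOKEN_SCAN_ESCALATED),  # high risk
    (1.00, ValidationAction.DENY),                  # extremely high risk
]


def action_for_score(risk_score: float) -> ValidationAction:
    """Map an initial risk score to the validation action of its risk tier."""
    for upper_bound, action in RISK_TIERS:
        if risk_score <= upper_bound:
            return action
    return ValidationAction.DENY
```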


The initial risk score may be compared to the range of risk scores for the risk tiers and assigned to a particular risk tier. Based on the validation action(s) associated with the risk tier, the associated validation action may be retrieved and performed.


The example of FIG. 4 depicts a situation that occurs when the initial risk score is associated with a moderate risk (i.e., requiring validation by scanning the physical token). Accordingly, at 406, a validation request is generated by the server and sent to the client app. The validation request may cause a notification to be generated informing the user that validation of their recent transaction is required.


In response to the notification, the user may log into the client application using any suitable means (e.g., a username/password combination, biometric authentication, etc.). The user may then be presented with an interface (such as the one depicted in FIG. 2A) and may scan the physical token on their card. Accordingly, at 408, the client application may request access to the physical token reader (e.g., an NFC reader) from the operating system of the client device. At 410, the client OS may receive a response from the reader (e.g., including the counter value) and may forward the response to the client application. Actions 408 and 410 may involve actions similar to those described above in connection with FIG. 2C.


At 412, the client application may generate a validation response (e.g., the message 300) and transmit the response to the transaction validation server.


At 414, the transaction validation server may perform validation analytics. The validation analytics may include validating the cryptogram included in the validation response 412 and comparing the counter value received from the client to the corresponding counter value stored at the server.


As noted above, a difference between the counter value stored on the physical token and the counter value stored on the transaction validation server may indicate the presence of fraudulent transactions. However, the counter value stored on the physical token can become out of sync with the counter value stored on the server for legitimate reasons (e.g., partial reads that are not transmitted to the server, an initial read that occurs on OS startup, etc.). The risk tier associated with the risk analytics may define an acceptable range of variance between the counter value received from the client and the counter value stored on the server. For instance, a relatively low risk tier may provide a relatively broad range of variance, whereas a relatively higher risk tier may provide relatively narrow range (or no range, requiring an exact match).


If the counter value is within the acceptable range, processing may proceed directly to 426 and an approval of the transaction may be transmitted to the third party server.


In addition to the acceptable counter value range, the risk tiers may define various escalation ranges. For example, if the counter value is not within the acceptable range, but is within a secondary range, further validation actions may be required to validate the transaction. Alternatively, the initial risk score may be re-evaluated in light of the mismatch between the counter value from the client and the counter value stored at the server, and the transaction may be elevated to a higher risk tier based on the newly calculated risk score.


If the counter value is outside the secondary range, the transaction may be denied at 426. If further validation actions are required due to the counter value being outside the acceptable range, the dashed actions depicted in FIG. 4 may be performed.


To this end, an escalated validation request may be transmitted to the client application at 416. The escalated validation request may include requested validation actions to be performed based on the escalated risk tier or the escalated risk actions required by the validation analytics. For instance, escalated validation actions may involve answering security questions, providing biometric authentication, taking a picture of the user's identification, or having the user present themselves in person at a defined location.


In this example, the escalated validation request 416 requests that the user take a picture of their identification, such as a driver's license. Accordingly, at 418 the application may request access to the device camera from the client operating system. A picture may be captured, and at 420 the photograph may be transmitted to the client application. Based on the photograph captured, an escalated validation response may be generated at 422 and transmitted to the transaction validation server.


At 424, the server may perform escalated validation analytics on the escalated validation response. For instance, the server may compare the user's photograph in their identification with a photograph stored at the server, or may compare the user's signature on their identification to a stored signature, or any other suitable action based on the escalated validation response (e.g., comparing biometrics to biometrics stored at the server, or receiving an indication that the client device has confirmed the biometrics, etc.).


Optionally, at 428 the server may update the current risk score based upon information determined during the validation process. For example, if the counter value was such that no additional escalated validation was required, the server may update the risk score to indicate that the risk was decreasing. If, however, additional escalated validation was required, and the additional validation succeeded, then the risk score may be updated to maintain the current risk level. If the authentication failed, the risk level may be updated to indicate a higher level of risk going forward.
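The feedback rule just described might be expressed, under assumed score adjustments, as the small sketch below: an easy pass lowers the score, a passed escalation keeps it, and a failed escalation raises it.

```python
def update_risk_score(current_score: float,
                      escalation_needed: bool,
                      escalation_passed: bool) -> float:
    """Hypothetical post-validation risk-score update (adjustment amounts are illustrative)."""
    if not escalation_needed:
        return max(0.0, current_score - 0.10)  # counter was acceptable: risk decreasing
    if escalation_passed:
        return current_score                   # escalation succeeded: maintain current level
    return min(1.0, current_score + 0.25)      # escalation failed: higher risk going forward
```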


The above-described actions may be performed by client-side validation logic 500 (FIG. 5) in cooperation with server-side validation logic 600 (FIG. 6).


The client-side validation logic 500 may include logic for, at block 502, authenticating the user to the client-side service provider application. For example, the logic may involve instructions for validating a user name and password combination, validating biometric login information, etc.


At block 504, a validation request may be received from the transaction validation server. The validation request may specify details of the transaction being validated and/or validation actions required (such as scanning the card's physical token).


Blocks 502 and 504 may be performed in reverse, so that the validation request is received before authenticating to the application.


In response to the validation request, the client application may call on the client device's short-range (e.g., NFC) reader at block 506. At block 508, the reader may be used to exchange or read information (including the counter value cryptographically encoded at the token using one or more security keys) with the physical token.


At block 510, the device may generate a validation response. This may include the cryptographically encoded counter value read from the token at 508. At block 512, the validation response may be transmitted to the transaction validation server.


If the server determines that escalated validation is required, at block 514 the client may receive an escalated validation request and may perform the escalated validation actions specified (e.g., capturing a photograph of the user's identification). The client may respond to the escalated validation request using the information captured in response to the escalated validation actions.
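The client-side flow of blocks 502-514 can be summarized as the following sketch. The callables passed in (`authenticate_user`, `receive_request`, `read_token`, `send_response`) are placeholders standing in for the application's login, notification, NFC-reader, and network code; they are assumptions made so the ordering of the steps is explicit.

```python
def run_client_validation(authenticate_user, receive_request, read_token, send_response) -> str:
    """Illustrative ordering of the client-side validation steps of FIG. 5."""
    if not authenticate_user():                  # block 502: log into the service provider app
        return "authentication_failed"

    request = receive_request()                  # block 504: validation request from the server
    token_data = read_token()                    # blocks 506-508: NFC read of the physical token

    result = send_response({                     # blocks 510-512: build and send the response
        "transaction_id": request.get("transaction_id"),
        "counter_value": token_data["counter_value"],
        "cryptogram": token_data["cryptogram"],
    })

    if result == "escalate":                     # block 514: escalated validation, if required
        return "escalation_requested"
    return result
```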



FIG. 6 depicts the corresponding logic 600 performed by the validation server.


At block 602, the validation server may receive a transaction request from a vendor server. The transaction request may specify an identity of the vendor, an amount of the transaction, and any other relevant details that may be used by risk analytics performed at block 604.


Based on the risk analytics, an initial risk score may be calculated, and an associated validation action may be retrieved. In some cases, no validation action may be required. The system may determine whether this is the case at block 606 and, if no validation is required (e.g., due to the risk score being below a predefined low threshold), processing may proceed to block 608 and the transaction may be approved. Accordingly, an approval message may be generated and transmitted to the vendor server.


If validation is required, then at block 610, the server may transmit a validation request to the client device associated with the user account assigned to the transaction (e.g., based on information retrieved from the user database 122 of FIG. 1A). The server may receive a validation response with the requested information from the client at block 612.


The validation response may be processed to, for instance, authenticate the cryptogram in the validation response and retrieve the counter value. The server may identify, at block 614, a risk tier as determined by the risk analytics performed in block 604.


In this example, two risk tiers are defined (high and low). Based on the risk tier, a counter value range may be defined (e.g., a narrow window for the high risk tier, or a wide window for the low risk tier). In some cases, the counter value range may be a predetermined range associated with the risk score. In others, the counter value range may be dynamically determined based on the risk score or risk factors (such as the current risk level of the environment). There may be multiple different risk tiers, each with their own window size.


If the received counter value is within the specified range for the risk tier (a “yes” at block 618 or 616), processing may proceed to block 608 and the transaction may be approved.


If, on the other hand, the counter value is not within the specified range for the risk tier (“no” at block 616 or 618), then processing may proceed to block 620. Optionally, escalated validation may be performed at this block. As part of the escalated validation procedure, an updated risk score may be calculated and the risk score may be matched to a new risk tier. Alternatively, escalated validation actions defined for the current risk tier may be performed.


If the escalated validation is successful, then processing may proceed to block 608 and the transaction may be approved. If the escalated validation is not successful, or if escalated validation is not performed at this stage, processing may proceed to block 622 and the transaction may be denied. Alternatively, if escalated validation is not successful, processing may return to block 620 and a further updated risk score may be calculated. The process may repeat until a predetermined number of maximum iterations have occurred, until the risk score exceeds a predetermined maximum threshold, or until a predefined stopping condition is met.
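A compact sketch of this escalation loop is given below. The iteration cap, score cutoff, and injected callables are assumptions chosen to mirror the stopping conditions named in the text.

```python
def escalation_loop(initial_score: float, perform_escalation, rescore,
                    max_iterations: int = 3, max_score: float = 0.95) -> str:
    """Illustrative server-side escalation loop for blocks 620 and 622."""
    score = initial_score
    for _ in range(max_iterations):
        if perform_escalation(score):  # e.g., the photo-ID comparison succeeded
            return "approve"           # proceed to block 608
        score = rescore(score)         # failed attempt feeds back into the risk score
        if score > max_score:
            break                      # risk now exceeds the allowed maximum
    return "deny"                      # block 622
```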


At any point during this process (e.g., during approval or denial at blocks 608 and/or 622), data from the authentication process may be fed back to the system to be used as part of the risk computation process. Thus, the risk computation may be updated based on the authentication/verification procedure, and vice versa. This may allow the system to generate a feedback loop in which the authentication process influences the risk assessment, and the risk assessment influences the authentication process.


The above-described methods may be embodied as instructions on a computer readable medium or as part of a computing architecture. FIG. 7 illustrates an embodiment of an exemplary computing architecture 700 suitable for implementing various embodiments as previously described. In one embodiment, the computing architecture 700 may comprise or be implemented as part of an electronic device, such as a computer 701. The embodiments are not limited in this context.


As used in this application, the terms “system” and “component” are intended to refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution, examples of which are provided by the exemplary computing architecture 700. For example, a component can be, but is not limited to being, a process running on a processor, a processor, a hard disk drive, multiple storage drives (of optical and/or magnetic storage medium), an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a server and the server can be a component. One or more components can reside within a process and/or thread of execution, and a component can be localized on one computer and/or distributed between two or more computers. Further, components may be communicatively coupled to each other by various types of communications media to coordinate operations. The coordination may involve the uni-directional or bi-directional exchange of information. For instance, the components may communicate information in the form of signals communicated over the communications media. The information can be implemented as signals allocated to various signal lines. In such allocations, each message is a signal. Further embodiments, however, may alternatively employ data messages. Such data messages may be sent across various connections. Exemplary connections include parallel interfaces, serial interfaces, and bus interfaces.


The computing architecture 700 includes various common computing elements, such as one or more processors, multi-core processors, co-processors, memory units, chipsets, controllers, peripherals, interfaces, oscillators, timing devices, video cards, audio cards, multimedia input/output (I/O) components, power supplies, and so forth. The embodiments, however, are not limited to implementation by the computing architecture 700.


As shown in FIG. 7, the computing architecture 700 comprises a processing unit 702, a system memory 704 and a system bus 706. The processing unit 702 can be any of various commercially available processors, including without limitation an AMD® Athlon®, Duron® and Opteron® processors; ARM® application, embedded and secure processors; IBM® and Motorola® DragonBall® and PowerPC® processors; IBM and Sony® Cell processors; Intel® Celeron®, Core (2) Duo®, Itanium®, Pentium®, Xeon®, and XScale® processors; and similar processors. Dual microprocessors, multi-core processors, and other multi-processor architectures may also be employed as the processing unit 702.


The system bus 706 provides an interface for system components including, but not limited to, the system memory 704 to the processing unit 702. The system bus 706 can be any of several types of bus structure that may further interconnect to a memory bus (with or without a memory controller), a peripheral bus, and a local bus using any of a variety of commercially available bus architectures. Interface adapters may connect to the system bus 706 via a slot architecture. Example slot architectures may include without limitation Accelerated Graphics Port (AGP), Card Bus, (Extended) Industry Standard Architecture ((E)ISA), Micro Channel Architecture (MCA), NuBus, Peripheral Component Interconnect (Extended) (PCI(X)), PCI Express, Personal Computer Memory Card International Association (PCMCIA), and the like.


The computing architecture 700 may comprise or implement various articles of manufacture. An article of manufacture may comprise a computer-readable storage medium to store logic. Examples of a computer-readable storage medium may include any tangible media capable of storing electronic data, including volatile memory or non-volatile memory, removable or non-removable memory, erasable or non-erasable memory, writeable or re-writeable memory, and so forth. Examples of logic may include executable computer program instructions implemented using any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, object-oriented code, visual code, and the like. Embodiments may also be at least partly implemented as instructions contained in or on a non-transitory computer-readable medium, which may be read and executed by one or more processors to enable performance of the operations described herein.


The system memory 704 may include various types of computer-readable storage media in the form of one or more higher speed memory units, such as read-only memory (ROM), random-access memory (RAM), dynamic RAM (DRAM), Double-Data-Rate DRAM (DDRAM), synchronous DRAM (SDRAM), static RAM (SRAM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory, polymer memory such as ferroelectric polymer memory, ovonic memory, phase change or ferroelectric memory, silicon-oxide-nitride-oxide-silicon (SONOS) memory, magnetic or optical cards, an array of devices such as Redundant Array of Independent Disks (RAID) drives, solid state memory devices (e.g., USB memory, solid state drives (SSD)), and any other type of storage media suitable for storing information. In the illustrated embodiment shown in FIG. 7, the system memory 704 can include non-volatile memory 708 and/or volatile memory 710. A basic input/output system (BIOS) can be stored in the non-volatile memory 708.


The computing architecture 700 may include various types of computer-readable storage media in the form of one or more lower speed memory units, including an internal (or external) hard disk drive (HDD) 712, a magnetic floppy disk drive (FDD) 714 to read from or write to a removable magnetic disk 716, and an optical disk drive 718 to read from or write to a removable optical disk 720 (e.g., a CD-ROM or DVD). The HDD 712, FDD 714 and optical disk drive 718 can be connected to the system bus 706 by an HDD interface 722, an FDD interface 724 and an optical drive interface 726, respectively. The HDD interface 722 for external drive implementations can include at least one or both of Universal Serial Bus (USB) and IEEE 1394 interface technologies.


The drives and associated computer-readable media provide volatile and/or nonvolatile storage of data, data structures, computer-executable instructions, and so forth. For example, a number of program modules can be stored in the drives and memory units 708, 712, including an operating system 728, one or more application programs 730, other program modules 732, and program data 734. In one embodiment, the one or more application programs 730, other program modules 732, and program data 734 can include, for example, the various applications and/or components of the systems described herein.


A user can enter commands and information into the computer 701 through one or more wire/wireless input devices, for example, a keyboard 736 and a pointing device, such as a mouse 738. Other input devices may include microphones, infra-red (IR) remote controls, radio-frequency (RF) remote controls, game pads, stylus pens, card readers, dongles, finger print readers, gloves, graphics tablets, joysticks, keyboards, retina readers, touch screens (e.g., capacitive, resistive, etc.), trackballs, trackpads, sensors, styluses, and the like. These and other input devices are often connected to the processing unit 702 through an input device interface 740 that is coupled to the system bus 706, but can be connected by other interfaces such as a parallel port, IEEE 1394 serial port, a game port, a USB port, an IR interface, and so forth.


A monitor 742 or other type of display device is also connected to the system bus 706 via an interface, such as a video adaptor 744. The monitor 742 may be internal or external to the computer 701. In addition to the monitor 742, a computer typically includes other peripheral output devices, such as speakers, printers, and so forth.


The computer 701 may operate in a networked environment using logical connections via wire and/or wireless communications to one or more remote computers, such as a remote computer 744. The remote computer 744 can be a workstation, a server computer, a router, a personal computer, portable computer, microprocessor-based entertainment appliance, a peer device or other common network node, and typically includes many or all of the elements described relative to the computer 701, although, for purposes of brevity, only a memory/storage device 746 is illustrated. The logical connections depicted include wire/wireless connectivity to a local area network (LAN) 748 and/or larger networks, for example, a wide area network (WAN) 750. Such LAN and WAN networking environments are commonplace in offices and companies, and facilitate enterprise-wide computer networks, such as intranets, all of which may connect to a global communications network, for example, the Internet.


When used in a LAN networking environment, the computer 701 is connected to the LAN 748 through a wire and/or wireless communication network interface or adaptor 752. The adaptor 752 can facilitate wire and/or wireless communications to the LAN 748, which may also include a wireless access point disposed thereon for communicating with the wireless functionality of the adaptor 752.


When used in a WAN networking environment, the computer 701 can include a modem 754, or is connected to a communications server on the WAN 750, or has other means for establishing communications over the WAN 750, such as by way of the Internet. The modem 754, which can be internal or external and a wire and/or wireless device, connects to the system bus 706 via the input device interface 740. In a networked environment, program modules depicted relative to the computer 701, or portions thereof, can be stored in the remote memory/storage device 746. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers can be used.


The computer 701 is operable to communicate with wire and wireless devices or entities using the IEEE 802 family of standards, such as wireless devices operatively disposed in wireless communication (e.g., IEEE 802.11 over-the-air modulation techniques). This includes at least Wi-Fi (or Wireless Fidelity), WiMax, and Bluetooth™ wireless technologies, among others. Thus, the communication can be a predefined structure as with a conventional network or simply an ad hoc communication between at least two devices. Wi-Fi networks use radio technologies called IEEE 802.11 (a, b, g, n, etc.) to provide secure, reliable, fast wireless connectivity. A Wi-Fi network can be used to connect computers to each other, to the Internet, and to wire networks (which use IEEE 802.3-related media and functions).



FIG. 8 is a block diagram depicting an exemplary communications architecture 800 suitable for implementing various embodiments as previously described. The communications architecture 800 includes various common communications elements, such as a transmitter, receiver, transceiver, radio, network interface, baseband processor, antenna, amplifiers, filters, power supplies, and so forth. The embodiments, however, are not limited to implementation by the communications architecture 800.


As shown in FIG. 8, the communications architecture 800 includes one or more clients 802 and servers 804. The clients 802 may implement the client device 104. The servers 804 may implement the servers 116, 128. The clients 802 and the servers 804 are operatively connected to one or more respective client data stores 806 and server data stores 808 that can be employed to store information local to the respective clients 802 and servers 804, such as cookies and/or associated contextual information.


The clients 802 and the servers 804 may communicate information between each other using a communication framework 810. The communications framework 810 may implement any well-known communications techniques and protocols. The communications framework 810 may be implemented as a packet-switched network (e.g., public networks such as the Internet, private networks such as an enterprise intranet, and so forth), a circuit-switched network (e.g., the public switched telephone network), or a combination of a packet-switched network and a circuit-switched network (with suitable gateways and translators).


The communications framework 810 may implement various network interfaces arranged to accept, communicate, and connect to a communications network. A network interface may be regarded as a specialized form of an input output interface. Network interfaces may employ connection protocols including without limitation direct connect, Ethernet (e.g., thick, thin, twisted pair 10/100/1000 Base T, and the like), token ring, wireless network interfaces, cellular network interfaces, IEEE 802.11a-x network interfaces, IEEE 802.16 network interfaces, IEEE 802.20 network interfaces, and the like. Further, multiple network interfaces may be used to engage with various communications network types. For example, multiple network interfaces may be employed to allow for communication over broadcast, multicast, and unicast networks. Should processing requirements dictate a greater amount of speed and capacity, distributed network controller architectures may similarly be employed to pool, load balance, and otherwise increase the communicative bandwidth required by the clients 802 and the servers 804. A communications network may be any one or a combination of wired and/or wireless networks including without limitation a direct interconnection, a secured custom connection, a private network (e.g., an enterprise intranet), a public network (e.g., the Internet), a Personal Area Network (PAN), a Local Area Network (LAN), a Metropolitan Area Network (MAN), an Operating Missions as Nodes on the Internet (OMNI), a Wide Area Network (WAN), a wireless network, a cellular network, and other communications networks.


The components and features of the devices described above may be implemented using any combination of discrete circuitry, application specific integrated circuits (ASICs), logic gates and/or single chip architectures. Further, the features of the devices may be implemented using microcontrollers, programmable logic arrays and/or microprocessors, or any combination of the foregoing where suitable and appropriate. It is noted that hardware, firmware and/or software elements may be collectively or individually referred to herein as “logic” or “circuit.”


It will be appreciated that the exemplary devices shown in the block diagrams described above may represent one functionally descriptive example of many potential implementations. Accordingly, division, omission or inclusion of block functions depicted in the accompanying figures does not imply that the hardware components, circuits, software and/or elements for implementing these functions would necessarily be divided, omitted, or included in embodiments.


At least one computer-readable storage medium may include instructions that, when executed, cause a system to perform any of the computer-implemented methods described herein.


Some embodiments may be described using the expression “one embodiment” or “an embodiment” along with their derivatives. These terms mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment. Moreover, unless otherwise noted, the features described above are recognized to be usable together in any combination. Thus, any features discussed separately may be employed in combination with each other unless it is noted that the features are incompatible with each other.


With general reference to notations and nomenclature used herein, the detailed descriptions herein may be presented in terms of program procedures executed on a computer or network of computers. These procedural descriptions and representations are used by those skilled in the art to most effectively convey the substance of their work to others skilled in the art.


A procedure is here, and generally, conceived to be a self-consistent sequence of operations leading to a desired result. These operations are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical, magnetic or optical signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It proves convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like. It should be noted, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to those quantities.


Further, the manipulations performed are often referred to in terms, such as adding or comparing, which are commonly associated with mental operations performed by a human operator. No such capability of a human operator is necessary, or desirable in most cases, in any of the operations described herein, which form part of one or more embodiments. Rather, the operations are machine operations. Useful machines for performing operations of various embodiments include general purpose digital computers or similar devices.


Some embodiments may be described using the expression “coupled” and “connected” along with their derivatives. These terms are not necessarily intended as synonyms for each other. For example, some embodiments may be described using the terms “connected” and/or “coupled” to indicate that two or more elements are in direct physical or electrical contact with each other. The term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other.


Various embodiments also relate to apparatus or systems for performing these operations. This apparatus may be specially constructed for the required purpose or it may comprise a general purpose computer as selectively activated or reconfigured by a computer program stored in the computer. The procedures presented herein are not inherently related to a particular computer or other apparatus. Various general purpose machines may be used with programs written in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these machines will appear from the description given.


It is emphasized that the Abstract of the Disclosure is provided to allow a reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment. In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the terms “comprising” and “wherein,” respectively. Moreover, the terms “first,” “second,” “third,” and so forth, are used merely as labels, and are not intended to impose numerical requirements on their objects.


What has been described above includes examples of the disclosed architecture. It is, of course, not possible to describe every conceivable combination of components and/or methodologies, but one of ordinary skill in the art may recognize that many further combinations and permutations are possible. Accordingly, the novel architecture is intended to embrace all such alterations, modifications and variations that fall within the spirit and scope of the appended claims.

Claims
  • 1. A method comprising: receiving a request to confirm a transaction associated with a service; receiving a cumulative value from a physical token associated with a user representing a number of read operations performed; accessing a log, the log mapping users of the service to last-known values of respective physical tokens of the users; retrieving, from the log, the last-known value of the physical token associated with the user; identifying a window of acceptable values around the last-known value of the physical token, wherein the window is a dynamic range based on user profile information that indicates historical regular interactions for the user compared to other historical regular interactions for other users; determining that the cumulative value received from the physical token is within the window of acceptable values; and causing the transaction to be executed when the cumulative value is within the window.
  • 2. The method of claim 1, wherein the physical token is a credit card.
  • 3. The method of claim 1, further comprising computing a risk value associated with the transaction.
  • 4. The method of claim 3, wherein a size of the window varies according to the risk value.
  • 5. The method of claim 1, comprising: determining that the cumulative value is not within the window of acceptable values; and sending a request to re-validate the physical token by the user to a device associated with the user.
  • 6. The method of claim 1, wherein each read operation of the read operations is performed by a near-field communication (NFC) reader.
  • 7. The method of claim 1, wherein at least one of the read operations is an accidental read operation performed by a near-field communication (NFC) reader.
  • 8. A non-transitory computer-readable medium storing instructions configured to cause a processor to: receive a request to confirm a transaction associated with a service; receive a cumulative value from a physical token associated with a user representing a number of read operations performed; access a log, the log mapping users of the service to last-known values of respective physical tokens of the users; retrieve, from the log, the last-known value of the physical token associated with the user; identify a window of acceptable values around the last-known value of the physical token, wherein the window is a dynamic range based on user profile information that indicates historical regular interactions for the user compared to other historical regular interactions for other users; determine that the cumulative value received from the physical token is within the window of acceptable values; and cause the transaction to be executed when the cumulative value is within the window.
  • 9. The medium of claim 8, wherein the physical token is a credit card.
  • 10. The medium of claim 8, further storing instructions for computing a risk value associated with the transaction.
  • 11. The medium of claim 10, wherein a size of the window varies according to the risk value.
  • 12. The medium of claim 8, further storing instructions for: determining that the cumulative value is not within the window of acceptable values; and sending a request to re-validate the physical token by the user to a device associated with the user.
  • 13. The medium of claim 8, wherein each read operation of the read operations is performed by a near-field communication (NFC) reader.
  • 14. The medium of claim 8, wherein at least one of the read operations is an accidental read operation performed by a near-field communication (NFC) reader.
  • 15. An apparatus comprising: a hardware interface configured to receive a request to confirm a transaction associated with a service and receive a cumulative value from a physical token associated with a user representing a number of read operations performed; a non-transitory computer-readable medium storing a log mapping users of the service to last-known values of respective physical tokens of the users; and a hardware processor circuit configured to: retrieve, from the log, the last-known value of the physical token associated with the user, identify a window of acceptable values around the last-known value of the physical token, wherein the window is a dynamic range based on user profile information that indicates historical regular interactions for the user compared to other historical regular interactions for other users, determine that the cumulative value received from the physical token is within the window of acceptable values, and cause the transaction to be executed when the cumulative value is within the window.
  • 16. The apparatus of claim 15, wherein the physical token is a credit card.
  • 17. The apparatus of claim 15, the hardware processor configured to compute a risk value associated with the transaction, wherein a size of the window varies according to the risk value.
  • 18. The apparatus of claim 17, wherein a size of the window varies according to the risk value.
  • 19. The apparatus of claim 15, the hardware processor configured to: determine that the cumulative value is not within the window of acceptable values; and send a request to re-validate the physical token by the user to a device associated with the user.
  • 20. The apparatus of claim 15, wherein at least one of the read operations is an accidental read operation performed by a near-field communication (NFC) reader.
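
The counter-window validation recited in claims 1, 8, and 15 can be illustrated with the following minimal sketch, in Python and for illustration only. The window-sizing heuristic, the constants BASE_WINDOW and RISK_SCALE, and the helper names are assumptions introduced for this sketch rather than limitations drawn from the claims; the log object is assumed to expose get and update operations, as in the illustrative CounterLog sketched above.

    BASE_WINDOW = 5   # assumed baseline number of acceptable intervening reads
    RISK_SCALE = 3    # assumed scaling applied to a computed risk value (claims 3-4, 17)

    def window_for(user_profile, risk_value):
        # Claim 1: the window is a dynamic range based on the user's historical
        # interactions compared with those of other users; claims 3-4 and 17:
        # its size may vary according to a computed risk value.
        activity_ratio = user_profile["reads_per_week"] / max(
            user_profile["population_reads_per_week"], 1)
        dynamic = int(BASE_WINDOW * max(activity_ratio, 0.2))
        return max(dynamic - int(RISK_SCALE * risk_value), 1)

    def validate_transaction(log, user_id, cumulative_value, user_profile, risk_value):
        last_known = log.get(user_id)                  # retrieve the last-known value from the log
        window = window_for(user_profile, risk_value)  # identify the window of acceptable values
        if last_known <= cumulative_value <= last_known + window:
            log.update(user_id, cumulative_value)      # token counter is plausible
            return "execute_transaction"               # cause the transaction to be executed
        # Claims 5, 12, and 19: otherwise request re-validation of the physical
        # token from a device associated with the user.
        return "request_revalidation"
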
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation-in-part of U.S. patent application Ser. No. 16/205,119, filed on Nov. 29, 2018, which claims priority from U.S. Provisional Application No. 62/740,352, filed on Oct. 2, 2018, the disclosures of which are incorporated herein by reference in their entirety.

Related Publications (1)
Number Date Country
20200104474 A1 Apr 2020 US
Provisional Applications (1)
Number Date Country
62740352 Oct 2018 US
Continuation in Parts (1)
Number Date Country
Parent 16205119 Nov 2018 US
Child 16675172 US