The present disclosure is generally related to cloud computing, and is more specifically related to implementing multi-factor system-to-system authentication using secure execution environments.
Multi-factor authentication is an authentication method that requires the party seeking to be authenticated (e.g., a computer system user) to present two or more of several categories of authentication factors, e.g., the knowledge factor (“something only the user knows”), the possession factor (“something only the user has”), and the inherence factor (“something only the user is”). The more factors the authentication process employs, the higher the probability that the party indeed possesses the asserted identity.
The present disclosure is illustrated by way of examples, and not by way of limitation, and may be more fully understood with references to the following detailed description when considered in connection with the figures, in which:
Described herein are methods and systems for implementing multi-factor system-to-system authentication using secure execution environments.
Multi-factor authentication may be employed for human person authentication, where the human user may be required to supply a set of credentials covering, e.g., the knowledge factor (such as a password), the possession factor (such as a one-time code generated by a portable token-generating device), and/or the inherence factor (such as a biometric input). However, utilizing those factors for system-to-system authentication may present certain challenges, since the difference between the knowledge and possession factors is blurred for systems, while the inherence factor may be difficult to prove due to the lack of reliable “biometric” factors that may be applied to systems.
The present disclosure alleviates these and other deficiencies of various common implementations by providing methods and systems for performing multi-factor system-to-system authentication using secure execution environments. “Secure execution environment” herein refers to a hardware platform architecture or implementation that is capable of implementing an isolated execution environment providing integrity of the applications executing therein and confidentiality of their data. Examples of secure execution environments that may be employed for implementing the systems and methods described herein include Trusted Execution Environments (TEEs), hardware security modules (HSMs), and field-programmable gate arrays (FPGAs); however, other secure execution environments may be suitable for implementing the systems and methods of the present disclosure.
In accordance with aspects of the present disclosure, a computing system to be authenticated (“the authentication client”) may employ one or more processes running on a secure execution environment to establish two or more authentication factors to be supplied to another computing system (“the authentication server”) which requires the authentication client to be authenticated. The “authentication client” and “authentication server” designations are purely functional designations utilized in this disclosure to indicate the relative roles of the two systems in the authentication process, and may not reflect other aspects of the interactions of the two systems with each other and/or with other systems.
In some implementations, the first authentication factor may be represented by the inherence factor, which may reflect a measurement of one or more secure execution environment-resident processes and/or their respective data before the execution. The measurement may be performed by computing a cryptographic hash of the executable images of one or more secure execution environment-resident processes and their respective data sets and/or by cryptographically signing the executable images and data sets. The authentication server may validate the first authentication factor presented by the authentication client by comparing the first authentication factor to a stored measurement value.
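The measurement scheme described above can be sketched as follows. This is a minimal illustrative sketch, not the disclosed implementation: SHA-256 is assumed as the cryptographic hash, and the function names, placeholder byte strings, and in-memory storage of the validation value are all hypothetical.

```python
import hashlib

def measure(executable_image: bytes, data_items: list[bytes]) -> str:
    # Fold the executable image and every associated data item into a
    # single SHA-256 digest, serving as the pre-execution measurement.
    h = hashlib.sha256()
    h.update(executable_image)
    for item in data_items:
        h.update(item)
    return h.hexdigest()

# Validation value the authentication server is assumed to hold for the
# expected process image and data set (illustrative placeholder bytes).
stored_validation_value = measure(b"process-image-bytes", [b"data-set-bytes"])

def validate_first_factor(presented_factor: str) -> bool:
    # The server accepts the first factor only if it matches the stored value.
    return presented_factor == stored_validation_value
```

In this sketch, any change to the executable image or its data set yields a different digest, so a tampered client cannot reproduce the stored validation value.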
The second authentication factor, which may be considered as a possession factor, a knowledge factor, or their combination, may be represented by an outcome of executing a secure execution environment-resident process that processes a combination of authentication server-supplied data, public data, and/or confidential (i.e., non-public) data provided by the authentication server and/or by third parties over a period of time preceding the time of presenting the authentication factor. In some implementations, the second authentication factor may be represented by an outcome of executing a secure execution environment-resident process that generates a response to the challenge data presented by the authentication server, such that the valid response may only be generated based on confidential data that has been supplied to the authentication client by the authentication server and/or third parties over a period of time before presenting the challenge or based on a combination of confidential and public data.
The authentication server may validate the second authentication factor presented by the authentication client by either comparing it to a known response value or by performing computations that are similar to the computations that have been performed by the authentication client in order to compute the second authentication factor, as described in more detail herein below.
While examples of the present disclosure are concerned with one system being authenticated by another system, the authentication methods described herein may be applied to two or more peer systems authenticating each other based on two or more authentication factors presented by each peer system. In some implementations, the authentication process may be symmetric, i.e., system A presents its authentication factors to system B, while system B presents its authentication factors to system A, thus allowing the two systems to mutually authenticate each other. This approach may be further extended to three or more systems, each of which may present its authentication factors to one or more peer systems.
The multi-factor authentication methods described herein may be employed in a wide variety of scenarios, e.g., for secure content distribution or software provisioning. In an illustrative example, the multi-factor authentication methods described herein may be employed for secure distribution of protected content by a content distribution system to one or more content consuming devices via an optional proxy device, such that the content consuming devices may be authenticated by the content distribution system and/or by the content distribution proxy. In another illustrative example, the multi-factor authentication methods described herein may be employed for performing software provisioning in cloud computing environments by authenticating the target hosts (on which the software is being installed) to the software provisioning controller (which manages the software provisioning activities in the cloud) and/or by authenticating the software provisioning controller to the target hosts.
Various aspects of the methods and systems are described herein by way of examples, rather than by way of limitation. The methods described herein may be implemented by hardware (e.g., general purpose and/or specialized processing devices, and/or other devices and associated circuitry), software (e.g., instructions executable by a processing device), or a combination thereof.
In some implementations, the secure execution environment 115 may be represented by a TEE implemented by a processor of the authentication client 110 or by a processor of a computing device that is reachable by the authentication client 110 over a secure communication channel (e.g., encrypted communication channel). In an illustrative example, the TEE may be implemented by Intel® Software Guard Extensions (SGX) secure enclave, which is a private region of encrypted memory, the contents of which would only be decrypted for access by the process running within the enclave. In another illustrative example, the TEE may be implemented by a virtual machine running in the Intel® Trust Domain Extension (TDX) environment. In another illustrative example, the TEE may be implemented by the AMD® Secure Encrypted Virtualization (SEV), which encrypts the memory state of each virtual machine using a respective encryption key inaccessible by other virtual machines. Various other TEE implementations for the above-referenced and/or other processor architectures may be compatible with the systems and methods of the present disclosure.
Alternatively, the secure execution environment 115 may be represented by a hardware security module (HSM). The HSM may be a plug-in card attached to an internal interface of the authentication client 110 or a peripheral device attached to an external interface of the authentication client 110. The HSM may include one or more general purpose or specialized microprocessors, which may be utilized to compute the requisite authentication factors to be presented by the authentication client 110 to the authentication server 120 or to compute intermediate values from which the authentication factors may be derived by the authentication client 110.
Alternatively, the secure execution environment 115 may be represented by a field-programmable gate array (FPGA) attached to an internal or external interface of the authentication client 110. In various other implementations, other secure execution environments may be suitable for implementing the systems and methods of the present disclosure.
In some implementations, the first authentication factor 130 may be represented by the inherence factor, which may reflect a pre-execution measurement of one or more computing processes 132A-132N residing in the secure execution environment 115 and/or a measurement of one or more data items 134A-134M to be utilized by the computing processes 132A-132N. The measurement may be performed by computing a cryptographic hash of the executable images of the computing processes 132A-132N and data items 134A-134M and/or by cryptographically signing the executable images and data items.
The authentication server 120 may validate the first authentication factor 130 presented by the authentication client 110 by comparing the first authentication factor 130 to a stored validation value 135. In particular, the authentication server 120 may determine that the first authentication factor is valid responsive to determining that the first authentication factor presented by the authentication client 110 matches a corresponding validation value 135 stored in persistent or non-persistent memory 150 of the authentication server 120. Conversely, responsive to determining that the first authentication factor presented by the authentication client 110 does not match the stored validation value 135, the authentication server 120 may return an authentication error to the authentication client 110.
The second authentication factor 140 may be considered as a possession factor, a knowledge factor, or their combination. Computing the second authentication factor 140 requires knowledge of confidential data that has been supplied to the authentication client 110 by the authentication server 120 and/or third parties (possibly, over a relatively long period of time). In some implementations, computing the second authentication factor may further require knowledge of non-confidential data received from various sources.
In order to determine the second authentication factor 140, the authentication client 110 may run, in the secure execution environment 115, one or more computing processes 132A-132N that receive a combination of challenge data 142 provided by the authentication server 120, confidential data 144 provided by the authentication server 120 and/or third party systems (not shown in
In an illustrative example, the challenge data 142 may specify an argument of a mathematical function (e.g., a cryptographic hash function) of two or more arguments, while the remaining arguments may be derived from the confidential data 144 or from a combination of the confidential data 144 and the public data 146.
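One way to realize a cryptographic hash function of two arguments of this kind is a keyed hash (HMAC), in which the server-supplied challenge data serves as one argument and a key derived from the accumulated confidential data serves as the other. This is a hedged sketch of one possible instantiation; the key value below is a hypothetical stand-in for the confidential data 144.

```python
import hashlib
import hmac

# Hypothetical stand-in for confidential data 144 supplied to the client
# by the authentication server and/or third parties over a period of time.
confidential_key = b"confidential-material-accumulated-over-time"

def compute_second_factor(challenge: bytes) -> str:
    # The challenge data supplies one argument of the keyed hash; the
    # other argument (the key) is derived from the confidential data.
    return hmac.new(confidential_key, challenge, hashlib.sha256).hexdigest()
```

A client that never received the confidential data cannot compute a valid response, which is what makes this factor a possession/knowledge factor.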
In another illustrative example, the challenge data 142 may specify a category of objects of a chosen type. For example, the challenge data 142 may specify a category of graphic images (e.g., images depicting a specified thing, place, natural phenomenon, animal, etc.). Alternatively, the challenge data 142 may specify a category of audio streams (e.g., audio streams containing sounds of a specified natural phenomenon or voice of a specified person). Alternatively, the challenge data 142 may specify a custom-defined category of arbitrary numerical data. Accordingly, the expected valid response to the challenge data 142 would be an object of the specified category, such that the object is selected from a set of objects derived from the confidential data 144 or from a combination of the confidential data 144 and the public data 146.
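The category-based challenge above can be sketched as a lookup over a set of objects derived from the confidential data: the challenge names a category, and a valid response is any object belonging to it. The category names and object byte strings below are hypothetical placeholders.

```python
# Hypothetical mapping from challenge categories to objects derived from
# the confidential data 144 (and/or the public data 146).
objects_by_category = {
    "waterfall-audio": [b"audio-stream-017", b"audio-stream-042"],
    "mountain-image": [b"graphic-image-003"],
}

def respond_to_category_challenge(category: str) -> bytes:
    # A valid response is any object of the specified category; a client
    # lacking the underlying data set cannot produce one.
    candidates = objects_by_category.get(category)
    if not candidates:
        raise ValueError(f"no objects available for category {category!r}")
    return candidates[0]
```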
In yet another illustrative example, the challenge data 142 may specify a category of objects of a chosen type (e.g., a category of graphic images, a category of audio streams, or a custom-defined category of arbitrary numerical data) selected from a set of categories derived from a combination of the confidential data 144 and public data 146. In this scenario, the expected valid response to the challenge data 142 would be an object of the specified category, such that the object has been generated by the authentication client 110 or selected from a set of objects derived from the confidential data 144, the public data 146, or their combination.
In yet another illustrative example, the challenge data 142 may represent one or more objects (e.g., graphic images, audio streams, or arbitrary numerical data), and the expected valid response would be their respective categories selected from a list of categories derived from the confidential data 144 or from a combination of the confidential data 144 and the public data 146.
In various implementations, the authentication client 110 may utilize various computation technologies for computing the second authentication factor 140, e.g., utilize trainable classifiers for determining a category of a given object or selecting an object of a specified category from a set of objects. The classifiers may be trained using the confidential data 144, the public data 146, or their combination, which may be received from the authentication server 120 and/or third party systems over a period of time.
The authentication server 120 may validate the second authentication factor 140 presented by the authentication client 110 by either comparing it to a known or computed validation value 145 stored in persistent or non-persistent memory 150 or by performing computations that are the same as or similar to the computations that have been performed by the authentication client 110 in order to compute the second authentication factor 140.
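For challenge-response factors, the second validation path described above can be sketched as a server-side recomputation: the server repeats the client's keyed-hash computation over the shared confidential data and compares the results in constant time. This sketch assumes the same hypothetical HMAC construction and key stand-in as the client-side examples.

```python
import hashlib
import hmac

# The same hypothetical confidential material the client is assumed to
# hold; on the server side it serves as the validation input.
confidential_key = b"confidential-material-accumulated-over-time"

def validate_second_factor(challenge: bytes, presented_factor: str) -> bool:
    # Repeat the client-side computation rather than looking up a stored
    # validation value, and compare in constant time to avoid leaking
    # information through timing differences.
    expected = hmac.new(confidential_key, challenge, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, presented_factor)
```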
In some implementations, the authentication client 110 may request the attestation of the secure execution environment and utilize the obtained attestation data to derive the first and/or the second authentication factors. “Attestation” herein refers to a platform-specific mechanism of proving the identity of a computing process running within a secure execution environment, as well as proving that the computing process has not been tampered with and is running on a secure hardware platform.
It should be noted that the “first” and “second” qualifiers of the authentication factors are only utilized in this disclosure to distinguish between the factors, and should not be interpreted as bearing any further semantic meaning, e.g., indicating the order of the factors being computed, their respective priorities, etc.
While the illustrative example of
Responsive to completing the authentication procedure described with references to
In the illustrative example of
In the illustrative example of
At block 410, the computing system implementing the method determines, using a secure execution environment, a measure of one or more computing processes running on the computing system (e.g., within the secure execution environment). In an illustrative example, the secure execution environment may be represented by a TEE implemented by a processor of the computing system implementing the method or by a processor of a computing device that is reachable by the computing system over a secure communication channel (e.g., encrypted communication channel). In another illustrative example, the secure execution environment may be represented by an HSM, which may be attached to an internal or external interface of the computing system. In yet another illustrative example, the secure execution environment may be represented by an FPGA attached to an internal or external interface of the computing system, as described in more detail herein above.
The measure computed by the secure execution environment may reflect a pre-execution measurement of one or more computing processes residing in the secure execution environment and/or a measurement of one or more data items to be utilized by those computing processes. The measurement may be performed by computing a cryptographic hash of the executable images of the computing processes and the data items and/or by cryptographically signing the executable images and data items.
At block 420, the computing system presents, to an authentication server, the first authentication factor (e.g., the inherence factor) derived from the measure. The first authentication factor may be the measure itself, or a result of applying a known mathematical transformation to the measure.
At block 430, the computing system determines, using the secure execution environment, the second authentication factor (e.g., the knowledge factor, the possession factor, or a combination thereof). The second authentication factor may be derived from one or more first data items received from the second computing system, one or more confidential second data items received from one or more third computing systems, and/or one or more public data items received from one or more fourth computing systems. In some implementations, computing the second authentication factor may involve receiving authentication challenge data from the authentication server and computing, by one or more computing processes running within the secure execution environment, an authentication response to the authentication challenge. The second authentication factor may then be derived from the authentication response, as described in more detail herein above.
At block 440, the computing system presents the second authentication factor to the authentication server.
Upon receiving an authentication response from the authentication server, the computing system may transmit, to the authentication server or to another computing system in communication with the authentication server, a request to perform certain functions (e.g., a digital content request or a software provisioning request), as described in more detail herein above.
At block 510, the computing system implementing the method determines, using a secure execution environment, a measure of a computing process running on the computing system (e.g., within the secure execution environment). In an illustrative example, the secure execution environment may be represented by a TEE implemented by a processor of the computing system implementing the method or by a processor of a computing device that is reachable by the computing system over a secure communication channel (e.g., encrypted communication channel). In another illustrative example, the secure execution environment may be represented by an HSM, which may be attached to an internal or external interface of the computing system. In yet another illustrative example, the secure execution environment may be represented by an FPGA attached to an internal or external interface of the computing system, as described in more detail herein above.
The measure computed by the secure execution environment may reflect a pre-execution measurement of one or more computing processes residing in the secure execution environment and/or a measurement of one or more data items to be utilized by those computing processes. The measurement may be performed by computing a cryptographic hash of the executable images of the computing processes and the data items and/or by cryptographically signing the executable images and data items.
At block 520, the computing system presents, to an authentication server, the first authentication factor (e.g., the inherence factor) derived from the measure. The first authentication factor may be the measure itself, or a result of applying a known mathematical transformation to the measure.
At block 530, the computing system receives authentication challenge data from the authentication server.
At block 540, the computing system determines, by a computing process running within the secure execution environment, an authentication response to the authentication challenge. The response may be represented by an outcome of executing a secure execution environment-resident process based on confidential data that has been supplied to the authentication client by the authentication server and/or third parties over a period of time before presenting the challenge or based on a combination of confidential and public data, as described in more detail herein above.
At block 550, the computing system presents, to the authentication server, the second authentication factor (e.g., the knowledge factor, the possession factor, or a combination thereof) derived from the authentication response.
Upon receiving an authentication response from the authentication server, the computing system may transmit, to the authentication server or to another computing system in communication with the authentication server, a request to perform certain functions (e.g., a digital content request or a software provisioning request), as described in more detail herein above.
In a further aspect, the computer system 1000 may include a processing device 1002, a volatile memory 1004 (e.g., random access memory (RAM)), a non-volatile memory 1009 (e.g., read-only memory (ROM) or electrically-erasable programmable ROM (EEPROM)), and a data storage device 1016, which may communicate with each other via a bus 1008.
Processing device 1002 may be provided by one or more processors such as a general purpose processor (such as, for example, a complex instruction set computing (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, a very long instruction word (VLIW) microprocessor, a microprocessor implementing other types of instruction sets, or a microprocessor implementing a combination of types of instruction sets) or a specialized processor (such as, for example, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), or a network processor).
Computer system 1000 may further include a network interface device 1022. Computer system 1000 also may include a video display unit 1010 (e.g., an LCD), an alphanumeric input device 1012 (e.g., a keyboard), a cursor control device 1014 (e.g., a mouse), and a signal generation device 1020.
Data storage device 1016 may include a non-transitory computer-readable storage medium 1024 on which may store instructions 1026 encoding any one or more of the methods or functions described herein, including instructions for implementing methods 400 and/or 500 of multi-factor system-to-system authentication, in accordance with aspects of the present disclosure.
Instructions 1026 may also reside, completely or partially, within volatile memory 1004 and/or within processing device 1002 during execution thereof by computer system 1000, hence, volatile memory 1004 and processing device 1002 may also constitute machine-readable storage media.
While computer-readable storage medium 1024 is shown in the illustrative examples as a single medium, the term “computer-readable storage medium” shall include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of executable instructions. The term “computer-readable storage medium” shall also include any tangible medium that is capable of storing or encoding a set of instructions for execution by a computer that cause the computer to perform any one or more of the methods described herein. The term “computer-readable storage medium” shall include, but not be limited to, solid-state memories, optical media, and magnetic media.
Other computer system designs and configurations may also be suitable to implement the system and methods described herein. The following examples illustrate various implementations in accordance with one or more aspects of the present disclosure.
Example 1 is a method comprising: determining, by a first computing system, using a secure execution environment, a measure of one or more computing processes running on the first computing system; presenting, to a second computing system, a first authentication factor derived from the measure; computing, using the secure execution environment, a second authentication factor derived from at least one of: one or more first data items received from the second computing system, one or more confidential second data items received from one or more third computing systems, or one or more public data items received from one or more fourth computing systems; and presenting the second authentication factor to the second computing system.
Example 2 is the method of Example 1, wherein the first computing system performs authentication client functionality, and wherein the second computing system performs authentication server functionality.
Example 3 is the method of Example 1, wherein computing the second authentication factor further comprises: receiving, from the second computing system, authentication challenge data; determining, using the secure execution environment, an authentication response to the authentication challenge data; and deriving the second authentication factor from the authentication response.
Example 4 is the method of Example 1, wherein the secure execution environment is provided by at least one of: a trusted execution environment (TEE) implemented by a general purpose processor, a hardware security module (HSM) or a field-programmable gate array (FPGA).
Example 5 is the method of Example 1, wherein the one or more computing processes run in the secure execution environment.
Example 6 is the method of Example 1, wherein the measure of the one or more computing processes comprises at least one of: a first value of a first hash function of a memory-resident executable code implementing the one or more computing processes or a second value of a second hash function of a memory-resident data set associated with the one or more computing processes.
Example 7 is the method of Example 1, wherein the first authentication factor represents an inherence factor.
Example 8 is the method of Example 1, wherein the second authentication factor represents at least one of: a knowledge factor or a possession factor.
Example 9 is the method of Example 1, further comprising: responsive to receiving an authentication response from the second computing system, transmitting to the second computing system at least one of: a digital content request or a software provisioning request.
Example 10 is a method, comprising: determining, by an authentication client, a measure of a first computing process running in a trusted execution environment (TEE); presenting, to an authentication server, a first authentication factor derived from the measure; receiving, from the authentication server, an authentication challenge data item; determining, by a second computing process running in the TEE, an authentication response to the authentication challenge data; and presenting, to the authentication server, a second authentication factor derived from the authentication response.
Example 11 is the method of Example 10, wherein the measure of the first computing process comprises at least one of: a first value of a first hash function of a memory-resident executable code implementing the first computing process or a second value of a second hash function of a memory-resident data set associated with the first computing process.
Example 12 is the method of Example 10, wherein the first authentication factor represents an inherence factor.
Example 13 is the method of Example 10, wherein the second authentication factor represents at least one of: a knowledge factor or a possession factor.
Example 14 is the method of Example 10, further comprising: responsive to receiving an authentication response from the authentication server, transmitting to the authentication server at least one of: a digital content request or a software provisioning request.
Example 15 is a system, comprising: a memory; and a processing device, coupled to the memory, to implement an authentication client residing in a trusted execution environment (TEE), wherein the authentication client is to: determine a measure of a first computing process running in the TEE; present, to an authentication server, a first authentication factor derived from the measure; receive, from the authentication server, an authentication challenge data item; determine, by a second computing process running in the TEE, an authentication response to the authentication challenge data; and present, to the authentication server, a second authentication factor derived from the authentication response.
Example 16 is the system of example 15, wherein the measure of the first computing process comprises at least one of: a first value of a first hash function of a memory-resident executable code implementing the first computing process or a second value of a second hash function of a memory-resident data set associated with the first computing process.
Example 17 is the system of example 15, wherein the first authentication factor represents an inherence factor.
Example 18 is the system of example 15, wherein the second authentication factor represents at least one of: a knowledge factor or a possession factor.
Example 19 is the system of example 15, wherein the processing device is further to: responsive to receiving an authentication response from the authentication server, transmit to the authentication server at least one of: a digital content request or a software provisioning request.
Example 20 is a computing system, comprising: a means for determining, using a secure execution environment, a measure of one or more computing processes running on the first computing system; a means for presenting, to an authentication server, a first authentication factor derived from the measure; a means for computing, using the secure execution environment, a second authentication factor derived from at least one of: one or more first data items received from the authentication server, one or more confidential second data items received from one or more third computing systems, or one or more public data items received from one or more fourth computing systems; and a means for presenting the second authentication factor to the authentication server.
Example 21 is the computing system of Example 20, wherein computing the second authentication factor further comprises: receiving, from the authentication server, authentication challenge data; determining, using the secure execution environment, an authentication response to the authentication challenge data; and deriving the second authentication factor from the authentication response.
Example 22 is the computing system of Example 20, wherein the secure execution environment is provided by at least one of: a trusted execution environment (TEE) implemented by a general purpose processor, a hardware security module (HSM) or a field-programmable gate array (FPGA).
Example 23 is the computing system of Example 20, wherein the one or more computing processes run in the secure execution environment.
Example 24 is the computing system of Example 20, wherein the measure of the one or more computing processes comprises at least one of: a first value of a first hash function of a memory-resident executable code implementing the one or more computing processes or a second value of a second hash function of a memory-resident data set associated with the one or more computing processes.
Example 25 is the computing system of Example 20, wherein the first authentication factor represents an inherence factor.
Example 26 is the computing system of Example 20, wherein the second authentication factor represents at least one of: a knowledge factor or a possession factor.
Example 27 is the computing system of Example 20, further comprising: a means for, responsive to receiving an authentication response from the authentication server, transmitting to the authentication server at least one of: a digital content request or a software provisioning request.
Example 28 is a non-transitory computer-readable storage medium comprising executable instructions that, when executed by a computing system, cause the computing system to: determine, using a secure execution environment, a measure of one or more computing processes running on the first computing system; present, to an authentication server, a first authentication factor derived from the measure; compute, using the secure execution environment, a second authentication factor derived from at least one of: one or more first data items received from the authentication server, one or more confidential second data items received from one or more third computing systems, or one or more public data items received from one or more fourth computing systems; and present the second authentication factor to the authentication server.
Example 29 is the non-transitory computer-readable storage medium of Example 28, wherein computing the second authentication factor further comprises: receiving, from the authentication server, an authentication challenge data; determining, using the secure execution environment, an authentication response to the authentication challenge data; and deriving the second authentication factor from the authentication response.
Example 30 is the non-transitory computer-readable storage medium of Example 28, wherein the secure execution environment is provided by at least one of: a trusted execution environment (TEE) implemented by a general purpose processor, a hardware security module (HSM) or a field-programmable gate array (FPGA).
Example 31 is the non-transitory computer-readable storage medium of Example 28, wherein the measure of the one or more computing processes comprises at least one of: a first value of a first hash function of a memory-resident executable code implementing the one or more computing processes or a second value of a second hash function of a memory-resident data set associated with the one or more computing processes.
Example 32 is the non-transitory computer-readable storage medium of Example 28, further comprising executable instructions that, when executed by the computing system, cause the computing system to: responsive to receiving an authentication response from the authentication server, transmit a digital content request.
Example 33 is the non-transitory computer-readable storage medium of Example 28, further comprising executable instructions that, when executed by the computing system, cause the computing system to: responsive to receiving an authentication response from the authentication server, transmit a software provisioning request.
The methods, components, and features described herein may be implemented by discrete hardware components or may be integrated in the functionality of other hardware components such as ASICS, FPGAs, DSPs or similar devices. In addition, the methods, components, and features may be implemented by firmware modules or functional circuitry within hardware devices. Further, the methods, components, and features may be implemented in any combination of hardware devices and software components, or only in software.
Unless specifically stated otherwise, terms such as “updating”, “identifying”, “determining”, “sending”, “assigning”, or the like, refer to actions and processes performed or implemented by computer systems that manipulates and transforms data represented as physical (electronic) quantities within the computer system registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
Examples described herein also relate to an apparatus for performing the methods described herein. This apparatus may be specially constructed for performing the methods described herein, or it may comprise a general purpose computer system selectively programmed by a computer program stored in the computer system. Such a computer program may be stored in a computer-readable tangible storage medium.
The methods and illustrative examples described herein are not inherently related to any particular computer or other apparatus. Various general purpose systems may be used in accordance with the teachings described herein, or it may prove convenient to construct more specialized apparatus to perform methods 400, 500 and/or each of their individual functions, routines, subroutines, or operations. Examples of the structure for a variety of these systems are set forth in the description above.
The above description is intended to be illustrative, and not restrictive. Although the present disclosure has been described with references to specific illustrative examples and implementations, it will be recognized that the present disclosure is not limited to the examples and implementations described. The scope of the disclosure should be determined with reference to the following claims, along with the full scope of equivalents to which the claims are entitled.
Number | Name | Date | Kind |
---|---|---|---|
6161185 | Guthrie | Dec 2000 | A |
6377691 | Swift | Apr 2002 | B1 |
7024695 | Kumar | Apr 2006 | B1 |
8443187 | Orr | May 2013 | B1 |
8966021 | Allen | Feb 2015 | B1 |
9032414 | Dalal | May 2015 | B1 |
9106645 | Vadlamani | Aug 2015 | B1 |
9219611 | Naik | Dec 2015 | B1 |
9225735 | Banerjee | Dec 2015 | B1 |
9544287 | Sokolov | Jan 2017 | B1 |
9609000 | Karame et al. | Mar 2017 | B2 |
9692599 | Krahn | Jun 2017 | B1 |
9813402 | Chen | Nov 2017 | B1 |
9876823 | Smith et al. | Jan 2018 | B2 |
9948681 | Kruse | Apr 2018 | B1 |
9992018 | Tjew | Jun 2018 | B1 |
10044695 | Cahill | Aug 2018 | B1 |
10055561 | Moore | Aug 2018 | B2 |
10390222 | Khosravi | Aug 2019 | B2 |
10447663 | Sun et al. | Oct 2019 | B2 |
10454975 | Sharifi Mehr | Oct 2019 | B1 |
10558812 | Thom et al. | Feb 2020 | B2 |
10650003 | Rubin | May 2020 | B1 |
10666643 | Mathew | May 2020 | B2 |
10740466 | BShara | Aug 2020 | B1 |
10880283 | Roth | Dec 2020 | B1 |
10979430 | Hitchcock | Apr 2021 | B1 |
11051163 | Smith | Jun 2021 | B1 |
11055273 | Meduri | Jul 2021 | B1 |
11115423 | DiAcetis | Sep 2021 | B2 |
11190517 | Drake, II | Nov 2021 | B2 |
11509658 | Kulkarni | Nov 2022 | B1 |
20020034302 | Moriai et al. | Mar 2002 | A1 |
20030191943 | Poisner | Oct 2003 | A1 |
20050086419 | Neble et al. | Apr 2005 | A1 |
20070294376 | Ayachitula | Dec 2007 | A1 |
20080098464 | Mizrah | Apr 2008 | A1 |
20080229103 | Mutka | Sep 2008 | A1 |
20080256598 | Diab | Oct 2008 | A1 |
20080298588 | Shakkarwar | Dec 2008 | A1 |
20100268747 | Kern | Oct 2010 | A1 |
20110035577 | Lin et al. | Feb 2011 | A1 |
20110213985 | Miller | Sep 2011 | A1 |
20110302415 | Ahmad et al. | Dec 2011 | A1 |
20120084835 | Thomas | Apr 2012 | A1 |
20120131653 | Pasquero | May 2012 | A1 |
20120159591 | Payne | Jun 2012 | A1 |
20120166141 | Watkins | Jun 2012 | A1 |
20130018941 | Olaru | Jan 2013 | A1 |
20130036462 | Krishnamurthi | Feb 2013 | A1 |
20130152183 | Plewnia | Jun 2013 | A1 |
20140019577 | Lobo | Jan 2014 | A1 |
20140109190 | Cam-Winget | Apr 2014 | A1 |
20140205099 | Christodorescu | Jul 2014 | A1 |
20140245396 | Oberheide | Aug 2014 | A1 |
20140282964 | Stubblefield | Sep 2014 | A1 |
20140359290 | McCusker | Dec 2014 | A1 |
20150113618 | Sinha | Apr 2015 | A1 |
20150227744 | Horovitz | Aug 2015 | A1 |
20150318998 | Erlikhman | Nov 2015 | A1 |
20150326398 | Modarresi | Nov 2015 | A1 |
20160065376 | Smith et al. | Mar 2016 | A1 |
20160087792 | Smith et al. | Mar 2016 | A1 |
20160088021 | Jayanti Venkata | Mar 2016 | A1 |
20160180068 | Das | Jun 2016 | A1 |
20160180078 | Chhabra | Jun 2016 | A1 |
20160248752 | Blinn | Aug 2016 | A1 |
20160294562 | Oberheide | Oct 2016 | A1 |
20160316364 | Blanco | Oct 2016 | A1 |
20160342774 | Henkel-Wallace et al. | Nov 2016 | A1 |
20160371475 | Zhao | Dec 2016 | A1 |
20170012959 | Sierra | Jan 2017 | A1 |
20170032111 | Johansson | Feb 2017 | A1 |
20170126661 | Brannon | May 2017 | A1 |
20170195457 | Smith, II | Jul 2017 | A1 |
20170244709 | Jhingran | Aug 2017 | A1 |
20170257363 | Franke | Sep 2017 | A1 |
20170279795 | Redberg | Sep 2017 | A1 |
20170289118 | Khosravi | Oct 2017 | A1 |
20170300683 | Movsisyan | Oct 2017 | A1 |
20170302445 | Kobayashi et al. | Oct 2017 | A1 |
20170329966 | Koganti | Nov 2017 | A1 |
20170351536 | Kamalakantha | Dec 2017 | A1 |
20170366532 | Garfinkle | Dec 2017 | A1 |
20170374014 | Sastri | Dec 2017 | A1 |
20170374049 | Ateniese et al. | Dec 2017 | A1 |
20180004930 | Csinger et al. | Jan 2018 | A1 |
20180007060 | Leblang | Jan 2018 | A1 |
20180026940 | Sastri | Jan 2018 | A1 |
20180083932 | Adams | Mar 2018 | A1 |
20180097787 | Murthy | Apr 2018 | A1 |
20180097789 | Murthy | Apr 2018 | A1 |
20180101847 | Pisut, IV | Apr 2018 | A1 |
20180109504 | Poffenbarger | Apr 2018 | A1 |
20180114000 | Taylor | Apr 2018 | A1 |
20180136943 | Chew | May 2018 | A1 |
20180150331 | Chen | May 2018 | A1 |
20180176212 | Nair | Jun 2018 | A1 |
20180183586 | Bhargav-Spantzel | Jun 2018 | A1 |
20180212769 | Novak | Jul 2018 | A1 |
20180254898 | Sprague et al. | Sep 2018 | A1 |
20180270068 | Innis et al. | Sep 2018 | A1 |
20180288060 | Jackson | Oct 2018 | A1 |
20180309567 | Wooden | Oct 2018 | A1 |
20180351944 | Cho | Dec 2018 | A1 |
20180367542 | Wolf | Dec 2018 | A1 |
20180375659 | Kozma | Dec 2018 | A1 |
20180375852 | Thom et al. | Dec 2018 | A1 |
20190007384 | Maaroufi | Jan 2019 | A1 |
20190028456 | Kurian | Jan 2019 | A1 |
20190036957 | Smith et al. | Jan 2019 | A1 |
20190050557 | Martin | Feb 2019 | A1 |
20190065731 | Brocious | Feb 2019 | A1 |
20190068633 | Tsirkin | Feb 2019 | A1 |
20190097790 | Li | Mar 2019 | A1 |
20190097987 | Talur | Mar 2019 | A1 |
20190109839 | Reston | Apr 2019 | A1 |
20190139148 | Piel | May 2019 | A1 |
20190156301 | Bentov et al. | May 2019 | A1 |
20190176753 | Suzuki et al. | Jun 2019 | A1 |
20190188368 | Hastings | Jun 2019 | A1 |
20190190903 | Chen | Jun 2019 | A1 |
20190199725 | Pularikkal | Jun 2019 | A1 |
20190208009 | Prabhakaran | Jul 2019 | A1 |
20190222575 | Oberhauser | Jul 2019 | A1 |
20190243963 | Soriente et al. | Aug 2019 | A1 |
20190280863 | Meyer | Sep 2019 | A1 |
20190289017 | Agarwal | Sep 2019 | A1 |
20190334921 | Pattar | Oct 2019 | A1 |
20190342080 | Vakili | Nov 2019 | A1 |
20190364034 | Alexander | Nov 2019 | A1 |
20190392305 | Gu et al. | Dec 2019 | A1 |
20200007531 | Koottayi | Jan 2020 | A1 |
20200007536 | Piel | Jan 2020 | A1 |
20200007576 | Buhacoff | Jan 2020 | A1 |
20200027022 | Jha et al. | Jan 2020 | A1 |
20200028693 | Wu et al. | Jan 2020 | A1 |
20200057664 | Durham et al. | Feb 2020 | A1 |
20200067716 | Camenisch | Feb 2020 | A1 |
20200076829 | Wentz | Mar 2020 | A1 |
20200092284 | Zhu | Mar 2020 | A1 |
20200125717 | Wang | Apr 2020 | A1 |
20200125772 | Volos | Apr 2020 | A1 |
20200137031 | Pappachan et al. | Apr 2020 | A1 |
20200162454 | Jain | May 2020 | A1 |
20200175208 | Yu | Jun 2020 | A1 |
20200184467 | Segaran | Jun 2020 | A1 |
20200195639 | Chien | Jun 2020 | A1 |
20200244636 | Varanasi | Jul 2020 | A1 |
20200259799 | Li et al. | Aug 2020 | A1 |
20200304488 | Mimis | Sep 2020 | A1 |
20200304543 | Hamlin | Sep 2020 | A1 |
20200382323 | Keselman | Dec 2020 | A1 |
20200403994 | Bitterfeld | Dec 2020 | A1 |
20200404003 | Alameh | Dec 2020 | A1 |
20210004454 | Chester | Jan 2021 | A1 |
20210004469 | Chisnall | Jan 2021 | A1 |
20210150044 | Christofferson et al. | May 2021 | A1 |
20210152371 | Fletcher et al. | May 2021 | A1 |
20210218734 | Kapinos | Jul 2021 | A1 |
20210243180 | Beale | Aug 2021 | A1 |
20210243206 | Shivanna | Aug 2021 | A1 |
20210281577 | Sasaki | Sep 2021 | A1 |
20210297412 | Thayyilsubramanian | Sep 2021 | A1 |
20210312047 | Chen | Oct 2021 | A1 |
20210314149 | Yee | Oct 2021 | A1 |
20210314298 | Chen | Oct 2021 | A1 |
20210374232 | Bursell | Dec 2021 | A1 |
20210374233 | Bursell | Dec 2021 | A1 |
20210374234 | Bursell | Dec 2021 | A1 |
20210385183 | Henao Mota | Dec 2021 | A1 |
20220012725 | Rutter | Jan 2022 | A1 |
20220050896 | Ahmed | Feb 2022 | A1 |
20220109663 | Swain | Apr 2022 | A1 |
20220116375 | Bursell | Apr 2022 | A1 |
20220206764 | Scarlata | Jun 2022 | A1 |
Number | Date | Country |
---|---|---|
109218260 | Jan 2019 | CN |
110851231 | Feb 2020 | CN |
2019075234 | Apr 2019 | WO |
2019179543 | Sep 2019 | WO |
2020104032 | May 2020 | WO |
2020125942 | Jun 2020 | WO |
Entry |
---|
Radboud University Nijmegen, The Netherlands; SURFnet bv, Utrecht, The Netherlands, van Rijswijk-DU, Roland and Poll, Erik, “Using Trusted Execution Environments in Two-factor Authentication: Comparing Approaches”, https://dl.gi.de/bilstream/handle/20.500.12116/17195/20.pdf?sequence=1, 12 pages. |
Arfaoui et al. “Practical and Privacy-Preserving TEE Migration”, 9th Workshop on Information Security Theory and Practice (WISTP), Aug. 2015, Heraklion, Greece. pp. 153-168, 17 pages. |
Xiao, et al., “Enforcing Private Data Usage Control with Blockchain and Attested Off-Chain Contract Execution”, Virginia Polytechnic Institute and State University, Apr. 15, 2019, 16 pages. |
Number | Date | Country | |
---|---|---|---|
20220116375 A1 | Apr 2022 | US |