A computing device, such as a device including a processor, may interact with secret or otherwise sensitive information during operation. As such, some computing devices may operate to protect the sensitive information. For example, a computing device may encrypt sensitive information using a security parameter, such as an encryption key, stored on the device.
The following detailed description references the drawings, wherein:
As noted above, a computing device may operate to protect sensitive information using a security parameter stored on the computing device. For example, a computing device may encrypt information using security parameters such as secret encryption keys. If the security parameters of the computing device were to be discovered, the security parameters may subsequently be used to discover sensitive information used by the computing device or another device utilizing the same security parameters.
Accordingly, a computing device utilizing security parameters may operate to protect the security parameters stored on the device. For example, the computing device may zeroize (e.g., erase, overwrite, etc.) security parameters stored on the device in response to the detection of a security incident. By removing the security parameters from the computing device in response to such a detection, the device may eliminate the parameters before they become vulnerable to discovery as a result of the security incident. In this manner, the secrecy of the parameters may be maintained despite the security incident. For example, in response to an attempted or actual attack on the device, the device may zeroize the security parameters before any security parameter is retrieved from the device, thereby maintaining the secrecy of the parameters despite the attack.
In some computing devices, the security parameters may be stored on a processor within the computing device, and the detection of a security incident may be made by the processor itself. In such devices, the processor may zeroize security parameters stored on the processor in response to its own security incident detection functionalities. Such a computing device may also include security monitors external to the processor. For example, a computing device may include a case or other enclosure in which the processor is disposed, along with other components of the device, and may monitor physical conditions of the enclosure with a security monitor external to the processor. In such a device, the security monitor may provide a zeroization command to the processor in response to detecting a security incident involving the enclosure. The processor may then zeroize its security parameters in response to the zeroization command to maintain the secrecy of the processor's security parameters.
However, the connection between the security monitor and the processor may be vulnerable. For example, the connection may become disconnected or otherwise corrupted such that zeroize commands output by the security monitor either do not reach the processor or are not recognizable by the processor. As one example, an attacker may tamper with the connection through an opening in the device enclosure, or when the enclosure security is disabled, such as during device maintenance. In other examples, the connection may become disconnected or corrupted as a result of movement, wear and tear, and the like. When the connection is disconnected or corrupted as described above, the processor will fail to zeroize its security parameters when the security monitor detects a security incident, which may increase the vulnerability of information on the computing device.
To address these issues, examples disclosed herein may zeroize a security parameter stored on a processor if the processor does not receive a periodic idle signal from a security monitor remote from the processor. In some examples, the remote security monitor may periodically provide an idle signal to the processor if the security monitor detects no security incidents and provide a zeroize signal to the processor when a security incident is detected. In such examples, the processor may determine whether a received signal is an idle signal from the security monitor and zeroize a security parameter stored on the processor if a threshold amount of time elapses without receiving an idle signal from the security monitor.
In such examples, the processor may use the periodic idle signal to confirm that the connection between the remote security monitor and the processor is connected and not corrupted. For example, if the remote security monitor and the processor are disconnected, the idle signal output by the security monitor will not reach the processor. Additionally, if the connection is corrupted, a valid idle signal may not reach the processor. In such examples, after a threshold amount of time passes without receiving a valid idle signal, the processor may determine that the remote security monitor is disconnected or malfunctioning and may zeroize a security parameter stored on the processor. In this manner, examples disclosed herein may provide security for the connection between a remote security monitor and the processor and reduce the likelihood of security parameters being left vulnerable by disconnection or corruption of a connection between a processor and a remote security monitor.
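As a non-limiting sketch of the timeout logic described above, the following Python illustrates a processor-side watchdog that is reset by each valid idle signal and that indicates zeroization once the threshold elapses without one. The class name `IdleWatchdog`, the threshold value, and the injectable clock are assumptions for illustration, not details from this disclosure.

```python
import time

# Assumed threshold; a real design would derive this from the
# monitor's idle-signal rate (e.g., several idle periods).
THRESHOLD_SECONDS = 0.5

class IdleWatchdog:
    """Tracks time elapsed since the last valid idle signal."""

    def __init__(self, threshold=THRESHOLD_SECONDS, clock=time.monotonic):
        self._threshold = threshold
        self._clock = clock
        self._last_idle = clock()

    def record_idle(self):
        # Reset the elapsed-time tracking each time a valid idle
        # signal is received from the remote security monitor.
        self._last_idle = self._clock()

    def should_zeroize(self):
        # True once the threshold elapses with no valid idle signal,
        # e.g., because the monitor was disconnected or disabled or
        # the connection was corrupted.
        return self._clock() - self._last_idle > self._threshold
```

A deterministic clock can be injected for testing, so the timeout decision can be exercised without real delays.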
Referring now to the drawings,
In the example of
In some examples, processor 110 also includes a machine-readable storage medium 120 including instructions 122, 124, and 132. In some examples, storage medium 120 may also include additional instructions. In other examples, the functionality of any of instructions 122, 124, and 132 described below may be implemented in the form of electronic circuitry, in the form of executable instructions encoded on a machine-readable storage medium, or a combination thereof. As used herein, a “machine-readable storage medium” may be any electronic, magnetic, optical, or other physical storage device to contain or store information such as executable instructions, data, and the like. For example, any machine-readable storage medium described herein may be any of Random Access Memory (RAM), flash memory, a storage drive (e.g., a hard disk), a Compact Disc Read Only Memory (CD-ROM), and the like, or a combination thereof. Further, any machine-readable storage medium described herein may be non-transitory.
In the example of
In some examples, remote security monitor 140 is remote from processor 110. As used herein, a security monitor is “remote” from a processor if the security monitor is disposed outside of a package of the processor and is to communicate with the processor via at least one external pin of the processor and/or via a wireless communication interface of the processor. In some examples, remote security monitor 140 and processor 110 may be disposed on different printed circuit boards (PCBs) within computing device 100. In other examples, remote security monitor 140 may be disposed on a chassis of computing device 100, and/or on (e.g., inside or outside) an enclosure of computing device 100 in which processor 110 is disposed. Additionally, in some examples, remote security monitor 140 may be disposed outside of and separate from an enclosure or chassis of computing device 100.
In the example of
In some examples, security monitor 140 may have an idle state and a zeroize state. As used herein, an “idle state” of a security monitor is a state in which no security incident has been detected (e.g., since the monitor was placed in the idle state, reset, turned on, etc.) and the security monitor periodically outputs an idle signal. In some examples, when security monitor 140 is in the idle state, security monitor 140 may periodically provide an idle signal to processor 110 as a monitor signal 182. In some examples, monitor 140 may provide idle signals at a rate of between about 10 Hz and 100 kHz. In other examples, monitor 140 may provide idle signals at a greater or lesser rate. Additionally, as used herein, a “zeroize state” of a security monitor is a state entered by the security monitor after detecting a security incident and in which the security monitor does not output any idle signal. In some examples, in response to detecting a security incident, security monitor 140 may indicate the detection of the security incident to processor 110 by providing a zeroize signal to processor 110 as monitor signal 182.
In the example of
After instructions 122 receive monitor signal 182, signal determination instructions 124 may determine if monitor signal 182 is an idle signal from remote security monitor 140, a zeroize signal from remote security monitor 140, or an invalid signal. In some examples, instructions 124 may determine that monitor signal 182 is an idle signal from remote security monitor 140 if signal 182 includes information identifying the signal as an idle signal from monitor 140. Similarly, instructions 124 may determine that monitor signal 182 is a zeroize signal from remote security monitor 140 if signal 182 includes information identifying the signal as a zeroize signal from monitor 140.
In some examples, the information identifying signal 182 as an idle or a zeroize signal may be any form of information that may be included in a signal 182. For example, instructions 124 may determine that signal 182 is an idle signal if signal 182 includes a bit pattern identifying signal 182 as an idle signal, and may determine that signal 182 is a zeroize signal if signal 182 includes a bit pattern identifying signal 182 as a zeroize signal. In other examples, remote security monitor 140 may periodically receive information from processor 110 and selectively modify and return the information in signal 182 as an idle or zeroize monitor signal 182. Additionally, in some examples, security monitor 140 may include information identifying itself in each monitor signal 182. In such examples, instructions 124 may determine from information in signal 182 if signal 182 is an idle signal from monitor 140 or a zeroize signal 182 from monitor 140.
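The bit-pattern identification described above may be sketched as follows; the specific pattern values (`0xA5`, `0x5A`) are assumed for illustration only, and a received value matching neither pattern is treated as invalid.

```python
# Hypothetical bit patterns; a real monitor and processor would agree
# on their own implementation-specific values.
IDLE_PATTERN = 0xA5
ZEROIZE_PATTERN = 0x5A

def classify_signal(value: int) -> str:
    """Map a received monitor-signal byte to idle, zeroize, or invalid."""
    if value == IDLE_PATTERN:
        return "idle"
    if value == ZEROIZE_PATTERN:
        return "zeroize"
    return "invalid"
```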
In some examples, instructions 124 may determine that a monitor signal 182 received by instructions 122 is an invalid signal by determining that the received signal 182 is neither an idle signal nor a zeroize signal from monitor 140. For example, an attacker may replace monitor 140 with a false monitor to pose as monitor 140. In such examples, instructions 124 may determine that a signal received from the false monitor is an invalid signal when the signal does not contain information correctly identifying the signal as an idle or zeroize signal from remote security monitor 140. In other examples, computing device 100 may include a plurality of remote security monitors 140. In such examples, instructions 124 may determine that a monitor signal 182 received by instructions 122 is an invalid signal by determining that the received signal 182 is not an idle signal from any of the security monitors 140 of computing device 100 or a zeroize signal from any of the security monitors 140 of computing device 100. In other examples, remote security monitor 140 may communicate valid signals other than an idle or a zeroize signal. In such examples, instructions 124 may determine that a received signal is invalid if it determines that the received signal is not any type of valid signal of any remote security monitor 140 of computing device 100.
Additionally, in the example of
In some examples, instructions 132 may track an amount of time elapsed since instructions 122 last received an idle signal from remote security monitor 140 as monitor signal 182. For example, instructions 132 may reset a timer each time instructions 122 receive the idle signal. In such examples, the timer may be reset if instructions 124 determine that the signal received by instructions 122 is an idle signal from monitor 140. In some examples, instructions 132 may monitor the timer to determine when a threshold amount of time has been reached without receiving an idle signal from monitor 140. In other examples, instructions 132 may determine when the threshold amount of time has been reached in other ways. For example, instructions 132 may compare a current time against the time at which the last idle signal was received from monitor 140.
By receiving periodic idle signals from monitor 140, processor 110 may determine both that it is connected to monitor 140 and that monitor 140 has not detected a security incident. For example, if monitor 140 is disabled or disconnected from processor 110, then instructions 122 may not receive the periodic idle signal from monitor 140. In such examples, by monitoring an amount of time elapsed without receiving an idle signal from remote security monitor 140, processor 110 may determine whether remote security monitor 140, or a connection between monitor 140 and processor 110, has been compromised.
After determining that the threshold amount of time has elapsed without receiving an idle signal from remote security monitor 140, instructions 132 may zeroize at least security parameter 117. As used herein, to “zeroize” information is to at least one of erase and overwrite the information at least once. In some examples, instructions 132 may zeroize security parameter 117 by overwriting each bit of security parameter 117 at least once. For example, instructions 132 may overwrite each bit of security parameter 117 with a first logic value (e.g., 0), then with a second logic value (e.g., 1), and then overwrite security parameter 117 with a combination of logic 1's and logic 0's. In other examples, instructions 132 may erase security parameter 117 and then take further action to prevent the recovery of the erased parameter 117, such as overwriting the erased parameter at least once, as described above, to complete the zeroization of security parameter 117.
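The multi-pass overwrite described above may be sketched as follows; the three-pass sequence (all 0's, all 1's, then a mixed pattern) mirrors the example, with the final pattern `0x55` chosen arbitrarily for illustration.

```python
def zeroize(storage: bytearray) -> None:
    """Overwrite every byte of the buffer in three passes: all zeros,
    all ones, then a mixed pattern of alternating 1's and 0's."""
    for pattern in (0x00, 0xFF, 0x55):
        for i in range(len(storage)):
            storage[i] = pattern
```

A mutable buffer is required so the overwrite happens in place rather than producing a copy that leaves the original bytes behind.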
In some examples, instructions 132 may zeroize a plurality of security parameters 117 stored in secure parameter storage 115 in response to determining that the threshold amount of time has elapsed since receiving an idle signal from remote security monitor 140. Additionally, in some examples, instructions 132 may zeroize all of secure parameter storage 115, or a portion thereof, in response to determining that the threshold amount of time has elapsed since receiving an idle signal from remote security monitor 140. In the example of
In examples described above, a processor may determine that a remote security monitor has been disabled, or that a connection between the processor and the monitor has been disconnected or corrupted if a threshold amount of time elapses without receiving an idle signal from a remote security monitor. In such examples, the processor may zeroize security parameters stored thereon if it determines that the remote security monitor, or a connection between the monitor and processor, has been compromised. In this manner, examples described herein may protect security parameters stored on a processor of a computing device even if a remote security monitor of the computing device is removed, disconnected, or otherwise disabled. Such examples may be able to protect security parameters stored on a processor even when a remote security monitor is not able to inform the processor of security incidents.
In the example of
In the example of
In examples including a plurality of security monitors 240, instructions 124 may include signal identification instructions 226. In such examples, for each monitor signal 182 received by instructions 122, instructions 226 may determine if the received monitor signal 182 is an idle signal from one of remote security monitors 240, a zeroize signal from one of remote security monitors 240, or an invalid signal, as described above in relation to
In the example of
In the example of
In such examples, instructions 226 may utilize the unique idle and zeroize information to identify idle and zeroize signals of remote security monitors 240. In some examples, instructions 226 may include unique information determination instructions 228. In such examples, for each monitor signal 182 received by instructions 122, instructions 228 may determine if the monitor signal includes unique idle information associated with one of remote security monitors 240, includes unique zeroize information associated with one of remote security monitors 240, or is an invalid signal. In some examples, instructions 228 may determine that a monitor signal 182 is an invalid signal if it does not include the unique idle or zeroize information of any of security monitors 240. In other examples, instructions 228 may determine that a monitor signal 182 is an invalid signal if it does not include the unique idle or zeroize information of any of security monitors 240, and is not any other type of valid signal from a security monitor 240. As used herein, idle or zeroize information of a security monitor of a computing device may be “unique” if it is different from all other idle information and zeroize information of the security monitors of the computing device.
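One way to realize the unique per-monitor information described above is a lookup table that maps each pattern to a (monitor, signal type) pair, so a single received value both identifies the sending monitor and classifies the signal. The monitor names and pattern values below are hypothetical.

```python
# Hypothetical per-monitor patterns: each monitor's idle and zeroize
# values differ from every other monitor's, so a received value can be
# attributed to a specific monitor.
MONITOR_SIGNALS = {
    0xA1: ("monitor_1", "idle"),
    0x1A: ("monitor_1", "zeroize"),
    0xB2: ("monitor_2", "idle"),
    0x2B: ("monitor_2", "zeroize"),
}

def identify(value: int):
    """Return (monitor id, signal type); unknown values are invalid."""
    return MONITOR_SIGNALS.get(value, (None, "invalid"))
```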
In examples described herein, by using unique idle and zeroize information, a processor may be able to distinguish the idle and zeroize signals of different security monitors. In this manner, a processor may be able to detect that a security monitor has been removed, even if it is replaced with another device that outputs idle signals (e.g., by an attacker). In some examples described herein, when the processor does not receive an idle signal including the idle information of the removed monitor, the security parameters may be zeroized. Additionally, in examples described herein, a processor of a computing device may ignore a zeroize signal sent by an attacker if it does not include the zeroize information of any of the security monitors of the computing device. In this manner, examples described herein may prevent denial of service attacks in which an attacker attempts to zeroize the security parameters of the processor to disrupt the operation of the computing device.
Additionally, in the example of
For example, monitor signals 182, or portions thereof, may be encrypted. In such examples, instructions 223 may decrypt at least a portion of each of monitor signals 182 received by instructions 122. In other examples, monitor signals 182, or portions thereof, may be compressed or otherwise encoded. In such examples, instructions 223 may decompress or otherwise decode at least a portion of each of monitor signals 182 received by instructions 122. Additionally, in some examples, instructions 124 may determine whether received monitor signals 182 are idle, zeroize, or invalid signals based on the monitor signals 182 as wholly or partially reformatted by instructions 223. In such examples, for each received monitor signal 182, instructions 124 may make the determinations described above after instructions 223 wholly or partially reformat the signal.
In the example of
In some examples, signaling module 360 may periodically provide a manager signal 386 to remote security monitor 340. For example, module 360 may provide manager signals 386 at a rate of between about 10 Hz and 100 kHz. In other examples, module 360 may provide manager signals 386 at a greater or lesser rate. In some examples, module 360 may provide manager signals 386, with communication interface 314, to communication interface 342 of security monitor 340. In some examples, module 360 may include manager information in each manager signal 386. In such examples, the manager information may be, for example, a particular bit pattern, or any other type of information. In some examples, determination module 358 of monitor 340 may determine the manager information from received manager signals 386.
In the example of
In some examples, module 352 may include information in signal 182 to identify the signal as either an idle signal or a zeroize signal. For example, if module 350 has not detected a security incident (i.e., is still in the idle state) then module 352 may provide to parameter manager 312 a monitor signal 182 including the received manager information, without modification, to indicate an idle state to parameter manager 312. In such examples, module 360 may receive the monitor signal 182 including the unmodified manager information, and determination module 316 of parameter manager 312 may determine that signal 182 is an idle signal indicating an idle state of monitor 340. Based on the receipt of the idle signal, determination module 316 may determine that monitor 340 is functioning, connected, and has not detected a security incident. Parameter manager 312 may periodically provide manager signals 386 to monitor 340 while monitor 340 indicates that it is in the idle state.
In some examples, parameter manager 312 also includes a time monitoring module 364 to track an amount of time elapsed after outputting a manager signal 386 including the manager information. In such examples, module 364 may track the time using a timer, by comparing a current time to a time at which signal 386 was sent, as described above in relation to
In this manner, parameter manager 312 may continually test the connection of processor 310 and monitor 340 and the functioning of monitor 340. If monitor 340 is disconnected, the connection is corrupted, or monitor 340 has been disabled, then parameter manager 312 may not receive an idle signal including the manager information as signal 182 in response to manager signal 386. In such examples, parameter manager 312 may detect the problem and zeroize at least one security parameter 117 to protect the parameters. In some examples, module 360 may reduce the likely success of replay attacks by including different manager information in successive manager signals 386. In such examples, each signal 386 may include different manager information, module 360 may cycle through a set of manager information, periodically change the manager information, or the like.
If module 350 detects a security incident (i.e., transitions to the zeroize state) then modifying module 359 may modify the manager information received in manager signal 386 to request zeroization of at least one security parameter 117 of parameter storage 115. In some examples, module 359 may modify the manager information in a manner known to parameter manager 312 to indicate a zeroization request. After modifying the manager information, module 352 may provide to parameter manager 312 a monitor signal 182 including the modified manager information to request zeroization of at least one security parameter 117. In such examples, module 360 may receive the monitor signal 182 including the modified manager information, and determination module 316 may determine that signal 182 is a zeroization signal indicating that monitor 340 has detected a security incident.
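The echo-or-modify exchange described above may be sketched as follows: the monitor echoes the manager information unmodified to indicate the idle state, or modifies it in a manner known to the parameter manager to request zeroization. Flipping the low byte is an assumed modification convention for illustration, not the disclosure's actual scheme.

```python
def monitor_respond(manager_info: int, incident_detected: bool) -> int:
    """Monitor side: echo the manager information when idle, or modify
    it (here, flip the low byte) to request zeroization."""
    return manager_info if not incident_detected else manager_info ^ 0xFF

def manager_classify(sent_info: int, received: int) -> str:
    """Manager side: compare the returned value against the information
    it sent to decide idle, zeroize, or invalid."""
    if received == sent_info:
        return "idle"
    if received == sent_info ^ 0xFF:
        return "zeroize"
    return "invalid"
```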
In such examples, module 360 may receive the monitor signal 182, and determination module 316 may determine that signal 182 includes the manager information modified to request zeroization, and thus determine that signal 182 is a zeroize signal. In some examples, module 316 may determine to zeroize at least one security parameter 117 stored in secure parameter storage 115 in response to receiving a signal 182 including the manager information modified to request zeroization. In response to the determination to zeroize, zeroization module 366 may zeroize at least one security parameter 117 stored in secure parameter storage 115. In some examples, zeroization module 366 may zeroize all or part of secure parameter storage 115 in response to the determination to zeroize.
In some examples, determination module 316 may also determine to zeroize security parameters 117 if module 316 determines that a received monitor signal 182 is an invalid signal. For example, module 316 may determine that monitor signal 182 is an invalid signal if it includes neither the unmodified manager information provided in manager signal 386, nor the manager information modified to request zeroization. In response to a determination to zeroize made by module 316, zeroization module 366 may zeroize at least one security parameter 117, as described above. In some examples, determination module 316 may also determine that the invalid signal indicates that a connection between processor 310 and remote security monitor 340 is unreliable (e.g., loose, noisy, partially disconnected, etc.). For example, a received signal 182 including asynchronous oscillations between logic 1 and logic 0 may indicate a poor connection between processor 310 and monitor 340. In such examples, module 316 may determine that the invalid signal with asynchronous oscillations indicates a poor connection. In some examples, module 316 may determine to zeroize security parameters 117 in response.
In addition, the manager information may be periodically changed, so that a signal 182 including an old version of the unmodified or modified manager information may be considered invalid. In this manner, examples disclosed herein may prevent a false security monitor from successfully replaying previously valid monitor signals 182. For example, parameter manager 312 may provide to monitor 340 a first manager signal 386 including first manager information, and receive back from monitor 340 a monitor signal 182 including the unmodified first manager information indicating the idle state. In such examples, parameter manager 312 may subsequently provide to monitor 340 a second manager signal 386 including second manager information different than the first manager information. In some examples, time monitoring module 364 may track the time elapsed since outputting the second manager signal 386. If, after outputting the second manager signal, module 364 determines that the threshold amount of time has elapsed without receiving a signal including the second manager information, then determination module 316 may determine to zeroize security parameters 117 of secure parameter storage 115. In such examples, zeroization module 366 may zeroize the parameters 117 in response to the determination.
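The replay protection described above may be sketched by generating fresh, unpredictable manager information for each manager signal, so that a replayed response to an earlier signal no longer matches the current expectation. The class and method names are illustrative, and bitwise inversion again stands in for the zeroize modification.

```python
import secrets

class ManagerSession:
    """Issues fresh manager information per signal; stale (replayed)
    responses no longer match and are classified as invalid."""

    def __init__(self):
        self._current = None

    def next_challenge(self) -> int:
        # Unpredictable 32-bit manager information for the next signal.
        self._current = secrets.randbits(32)
        return self._current

    def classify_response(self, received: int) -> str:
        if received == self._current:
            return "idle"
        if received == self._current ^ 0xFFFFFFFF:
            return "zeroize"
        return "invalid"  # stale (replayed) or forged
```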
In the example of
In some examples, signaling module 360 of parameter manager 312 may include an encryption module 361 and a decryption module 362, and signaling module 352 of monitor 340 may include an encryption module 354 and a decryption module 356. In such examples, encryption module 361 may encrypt manager information to be included in manager signal 386 by signaling module 360. In some examples, signaling module 352 may receive the manager signal 386 including the encrypted manager information, and decryption module 356 may decrypt all or a portion (e.g., the manager information) of the received manager signal 386. The manager information may then be determined from the decrypted signal 386. In some examples, determination module 358 may determine manager information from manager signals 386.
If no security incident is detected, then module 352 may provide parameter manager 312 with a monitor signal 182 including the unmodified manager information, as described above. In such examples, encryption module 354 may encrypt at least the manager information to be included in monitor signal 182. In other examples, the entire monitor signal 182, including the manager information, may be encrypted by module 354 before being provided to parameter manager 312.
In other examples, if module 350 has detected a security incident, then module 352 may provide parameter manager 312 with a monitor signal 182 including modified manager information, as described above. In such examples, encryption module 354 may encrypt at least the modified manager information to be included in monitor signal 182. In other examples, the entire monitor signal 182, including the modified manager information, may be encrypted by module 354 before being provided to parameter manager 312. In some examples, encryption modules 361 and 354 may encrypt information in the same way. In other examples, encryption modules 361 and 354 may encrypt information differently. For example, modules 361 and 354 may use different keys in the same encryption process, or may use different encryption processes.
Signaling module 360 may receive monitor signal 182 including the encrypted modified or unmodified manager information. In some examples, decryption module 362 may decrypt at least a portion of the received monitor signal 182 (e.g., at least a portion potentially including manager information). After decrypting at least a portion of the received signal 182, determination module 316 may determine whether the decrypted signal 182 includes the unmodified manager information or the modified manager information.
By encrypting at least the manager information portion of manager signals 386 and monitor signals 182, examples described herein may protect the secrecy of the manager information, as well as the manner in which manager information may be modified to indicate a zeroize state, for example. In this manner, examples described herein may make it difficult to forge manager and monitor signals. In some examples, the encryption and decryption functionalities of signaling modules 360 and 352 may be utilized in combination with any other functionalities described herein in relation to
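As a purely illustrative sketch of encrypting the manager-information portion of a signal, the following uses a toy XOR keystream derived with SHA-256; a real design would use an authenticated cipher (e.g., AES-GCM), and the key and nonce handling shown here are assumptions for illustration.

```python
import hashlib

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR the data with a SHA-256-derived
    keystream. Because XOR is its own inverse, the same call both
    encrypts and decrypts. Illustration only -- not a real cipher."""
    stream = hashlib.sha256(key + nonce).digest()
    return bytes(b ^ stream[i % len(stream)] for i, b in enumerate(data))
```

Because both endpoints share the key, the same function serves encryption module 361/354 and decryption module 362/356 in this sketch; using a fresh nonce per signal keeps successive ciphertexts distinct.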
In some examples, communication interface 314 of parameter manager 312 may be a wireless communication interface, and communication interface 342 of remote security monitor 340 may be a wireless communication interface. In such examples, the wireless communication interface 314 may provide manager signals 386 to communication interface 342 of monitor 340 wirelessly, and receive monitor signals 182 from monitor 340 wirelessly. In such examples, wireless communication interface 342 may receive manager signals 386 wirelessly from parameter manager 312 and may provide monitor signals 182 to parameter manager 312 wirelessly.
Additionally, computing device 300 may include a plurality of remote security monitors 340, and parameter manager 312 may interact with each as described above in relation to monitor 340. In some examples, monitor 340 may also monitor the time elapsed between receiving manager signals 386 from processor 310. In such examples, if a threshold amount of time elapses without receiving a manager signal 386, monitor 340 may determine that processor 310 or a connection with processor 310 has been compromised. In some examples, monitor 340 may disable itself (e.g., zeroize its memory) so that an attacker may not use monitor 340 to attack another computing device. Also, in some examples, any of the functionalities described above in relation to
At 405 of method 400, remote security monitor 340 may perform monitoring to detect security incidents, as described above in relation to
After receiving the manager signal, remote security monitor 340 may determine, at 415 of method 400, whether a security incident has been detected. If no security incident is detected, method 400 may proceed to 430, where security monitor 340 may modify the manager information included in the manager signal to indicate an idle state of monitor 340. After modifying the manager information to indicate the idle state, monitor 340 may, at 435 of method 400, provide to processor 310 a monitor signal including the manager information modified to indicate the idle state.
If it is determined at 415 that a security incident is detected, method 400 may proceed to 420, where security monitor 340 may modify the manager information included in the manager signal to indicate a zeroize state of monitor 340 and/or request zeroization of at least one security parameter of secure parameter storage of processor 310. After modifying the manager information to request zeroization, monitor 340 may provide to processor 310 a monitor signal including the manager information modified to request zeroization to cause processor 310 to zeroize the security parameters stored in processor 310. In such examples, by providing the modified manager information to the processor, the manager signal including the modified manager information may cause the processor to zeroize at least one security parameter 117 of parameter storage 115 if a security incident is detected. By modifying manager information both to indicate the idle state and to request zeroization, a processor in examples described herein may determine that a previously valid idle signal is invalid if replayed later. In this manner, examples described herein may be less vulnerable to an attack in which a previously valid idle signal is replayed to the processor.
At 505 of method 500, remote security monitor 340 may perform monitoring to detect security incidents, as described above in relation to
After receiving the first manager signal, remote security monitor 340 may determine, at 515 of method 500, whether a security incident has been detected. If it is determined at 515 that a security incident is detected, method 500 may proceed to 520, where security monitor 340 may modify the first manager information included in the first manager signal to indicate a zeroize state of monitor 340 and/or request zeroization of at least one security parameter of secure parameter storage of processor 310. In some examples, computing device 300 may include a plurality of remote security monitors 340, and each may modify manager information in a different manner to indicate its identity. For example, each monitor 340 may flip a different bit of the manager information. Additionally, any of the remote security monitors of the examples described above in relation to
After modifying the first manager information, monitor 340 may, at 525 of method 500, provide to processor 310 a first monitor signal including the first manager information modified to request zeroization to cause processor 310 to zeroize the security parameters stored in processor 310. In such examples, by providing the first modified manager information to the processor, the first manager signal including the modified first manager information may cause the processor to zeroize at least one security parameter 117 of parameter storage 115 if a security incident is detected. In examples disclosed herein, a processor receiving monitor signals may determine from the modifications to the manager information both the identity of the sending monitor and whether the monitor has detected a security incident.
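The per-monitor bit flip described above may be sketched as follows. Assigning monitor `i` the bit `1 << i`, and reserving a separate high bit to request zeroization, are illustrative assumptions; the description requires only that each monitor modify the manager information in a different manner.

```python
ZEROIZE_BIT = 0x80  # hypothetical flag bit reserved for zeroization requests

def monitor_respond(manager_info: int, monitor_id: int, incident: bool) -> int:
    """Each monitor flips its own bit to indicate its identity (520/530),
    and additionally flips the zeroize bit if an incident is detected."""
    info = manager_info ^ (1 << monitor_id)
    if incident:
        info ^= ZEROIZE_BIT
    return info

def processor_decode(sent_info: int, received_info: int, num_monitors: int):
    """Recover which monitor responded and whether it requests zeroization,
    by comparing the received signal against each expected modification."""
    for monitor_id in range(num_monitors):
        expected_idle = sent_info ^ (1 << monitor_id)
        if received_info == expected_idle:
            return monitor_id, "idle"
        if received_info == expected_idle ^ ZEROIZE_BIT:
            return monitor_id, "zeroize"
    return None, "invalid"
```

In this sketch the processor recovers both the sending monitor's identity and its detection state from the modification alone, as the paragraph above describes.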
Alternatively, if no security incident is detected at 515, method 500 may proceed to 530, where security monitor 340 may modify the first manager information included in the first manager signal to indicate an idle state of monitor 340 and to indicate an identity of the remote security monitor 340. After modifying the first manager information to indicate the idle state and the identity of monitor 340, monitor 340 may, at 535 of method 500, provide to processor 310 a first monitor signal including the first manager information modified to indicate the idle state and the identity of monitor 340.
After providing the first manager information modified to indicate the idle state to processor 310, method 500 may proceed to 505 to again perform monitoring to detect security incidents. At 510, security monitor 340 may receive, from processor 310, a second manager signal including second manager information.
After receiving the second manager signal, remote security monitor 340 may determine, at 515 of method 500, whether a security incident has been detected. If it is determined at 515 that a security incident is detected, method 500 may proceed to 520, where security monitor 340 may modify the second manager information included in the second manager signal to indicate a zeroize state of monitor 340 and/or request zeroization of at least one security parameter of secure parameter storage of processor 310.
After modifying the second manager information, monitor 340 may, at 525 of method 500, provide to processor 310 a second monitor signal including the second manager information modified to request zeroization to cause processor 310 to zeroize the security parameters stored in processor 310. In such examples, by providing the second modified manager information to the processor, the second manager signal including the modified second manager information may cause the processor to zeroize at least one security parameter 117 of parameter storage 115 if a security incident is detected.
Alternatively, if no security incident is detected at 515, method 500 may proceed to 530, where security monitor 340 may modify the second manager information included in the second manager signal to indicate an idle state of monitor 340 and to indicate an identity of the remote security monitor 340. After modifying the second manager information to indicate the idle state and the identity of monitor 340, monitor 340 may, at 535 of method 500, provide to processor 310 a second monitor signal including the second manager information modified to indicate the idle state and the identity of monitor 340. After providing the second monitor signal indicating the idle state, method 500 may return to 505.
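The repeating exchange of method 500, together with the elapsed-time check described earlier (in which a monitor treats a missing manager signal as a sign that processor 310 or its connection has been compromised and disables itself), may be sketched as follows. The timeout threshold and the `disabled` flag are illustrative assumptions, not values from the description.

```python
import time

class RemoteSecurityMonitor:
    """Sketch of one remote security monitor's signal handling.
    The 5-second default timeout is an assumed threshold."""

    def __init__(self, monitor_id: int, timeout_s: float = 5.0):
        self.monitor_id = monitor_id
        self.timeout_s = timeout_s
        self.last_signal = time.monotonic()
        self.disabled = False

    def on_manager_signal(self, manager_info: int, incident: bool) -> int:
        """One round of method 500: receive a manager signal, modify the
        manager information, and return the monitor signal."""
        self.last_signal = time.monotonic()
        info = manager_info ^ (1 << self.monitor_id)  # indicate identity
        if incident:
            info ^= 0x80  # request zeroization (hypothetical flag bit)
        return info

    def check_heartbeat(self) -> bool:
        """If the threshold elapses with no manager signal, assume the
        processor or its connection is compromised and self-disable
        (e.g., zeroize the monitor's own memory)."""
        if time.monotonic() - self.last_signal > self.timeout_s:
            self.disabled = True
        return self.disabled
```

Self-disabling on a missed heartbeat prevents an attacker who has severed or compromised the connection from later repurposing the monitor against another computing device.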
This application claims priority to U.S. provisional patent application No. 61/509,078, filed on Jul. 18, 2011, which is hereby incorporated by reference herein in its entirety.
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---|
PCT/US2011/065066 | 12/15/2011 | WO | 00 | 1/16/2014 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2013/012435 | 1/24/2013 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
3183498 | Howe et al. | May 1965 | A |
5131040 | Knapczyk | Jul 1992 | A |
5214760 | Hammond et al. | May 1993 | A |
5249286 | Alpert et al. | Sep 1993 | A |
5379378 | Peters et al. | Jan 1995 | A |
5406630 | Piosenka et al. | Apr 1995 | A |
5469564 | Junya | Nov 1995 | A |
5497497 | Miller et al. | Mar 1996 | A |
5568529 | Masuda | Oct 1996 | A |
5600576 | Broadwater et al. | Feb 1997 | A |
5682328 | Roeber et al. | Oct 1997 | A |
5825878 | Takahashi et al. | Oct 1998 | A |
5872967 | DeRoo et al. | Feb 1999 | A |
5937063 | Davis | Aug 1999 | A |
6078873 | Shutty et al. | Jun 2000 | A |
6148362 | Sagi | Nov 2000 | A |
6188603 | Takeda | Feb 2001 | B1 |
6243812 | Matyas et al. | Jun 2001 | B1 |
6292898 | Sutherland | Sep 2001 | B1 |
6377691 | Swift et al. | Apr 2002 | B1 |
6424143 | Blossfeld et al. | Jul 2002 | B1 |
6466048 | Goodman | Oct 2002 | B1 |
6553492 | Hosoe | Apr 2003 | B1 |
6553496 | Buer | Apr 2003 | B1 |
6625727 | Moyer et al. | Sep 2003 | B1 |
6687140 | Kitamura | Feb 2004 | B2 |
6704865 | Duff | Mar 2004 | B1 |
6789182 | Brothers et al. | Sep 2004 | B1 |
6835579 | Elward | Dec 2004 | B2 |
6836548 | Anderson et al. | Dec 2004 | B1 |
6859876 | Dykes et al. | Feb 2005 | B2 |
6910094 | Eslinger et al. | Jun 2005 | B1 |
6928551 | Lee et al. | Aug 2005 | B1 |
7039816 | Kocher et al. | May 2006 | B2 |
7057396 | Nagase | Jun 2006 | B2 |
7062615 | Miller et al. | Jun 2006 | B2 |
7107459 | Caronni et al. | Sep 2006 | B2 |
7218567 | Trimberger et al. | May 2007 | B1 |
7222053 | Snyder et al. | May 2007 | B2 |
7237121 | Cammack et al. | Jun 2007 | B2 |
7299365 | Evans | Nov 2007 | B2 |
7305534 | Watt et al. | Dec 2007 | B2 |
7360073 | Billstrom et al. | Apr 2008 | B1 |
7398441 | Gee | Jul 2008 | B1 |
7423529 | Singer et al. | Sep 2008 | B2 |
7424398 | Booth et al. | Sep 2008 | B2 |
7457960 | Kablotsky | Nov 2008 | B2 |
7512719 | Gillespie | Mar 2009 | B1 |
7525836 | Backus et al. | Apr 2009 | B2 |
7549064 | Elbert et al. | Jun 2009 | B2 |
7568112 | Yamaguchi | Jul 2009 | B2 |
7571475 | Moon | Aug 2009 | B2 |
7580919 | Hannel et al. | Aug 2009 | B1 |
7657760 | Teramoto et al. | Feb 2010 | B2 |
7667997 | Rodriguez | Feb 2010 | B2 |
7681024 | Kwon | Mar 2010 | B2 |
7729156 | Rodriguez et al. | Jun 2010 | B2 |
7733250 | Tsyrganovich | Jun 2010 | B1 |
7757098 | Brannock et al. | Jul 2010 | B2 |
7761904 | Hessel et al. | Jul 2010 | B2 |
7774619 | Paaske et al. | Aug 2010 | B2 |
7831839 | Hatakeyama | Nov 2010 | B2 |
7844835 | Ginter et al. | Nov 2010 | B2 |
7937596 | Mackey et al. | May 2011 | B2 |
7949912 | Trimberger | May 2011 | B1 |
7954153 | Bancel et al. | May 2011 | B2 |
7966467 | Ludloff et al. | Jun 2011 | B1 |
8027927 | Ogg et al. | Sep 2011 | B2 |
8046574 | Dale et al. | Oct 2011 | B2 |
8621597 | Jenkins, IV | Dec 2013 | B1 |
20010010086 | Katayama et al. | Jul 2001 | A1 |
20020120851 | Clarke | Aug 2002 | A1 |
20020129195 | Hongo et al. | Sep 2002 | A1 |
20030133574 | Caronni et al. | Jul 2003 | A1 |
20030140228 | Binder | Jul 2003 | A1 |
20030200453 | Foster et al. | Oct 2003 | A1 |
20030200454 | Foster et al. | Oct 2003 | A1 |
20040078664 | Takahashi | Apr 2004 | A1 |
20040088333 | Sidman | May 2004 | A1 |
20040153593 | Watt et al. | Aug 2004 | A1 |
20040210764 | McGrath et al. | Oct 2004 | A1 |
20040267847 | Harper | Dec 2004 | A1 |
20050091554 | Loukianov | Apr 2005 | A1 |
20050144358 | Conley et al. | Jun 2005 | A1 |
20050235166 | England et al. | Oct 2005 | A1 |
20060010356 | Snyder et al. | Jan 2006 | A1 |
20060023486 | Furusawa et al. | Feb 2006 | A1 |
20060031685 | Chen et al. | Feb 2006 | A1 |
20060059373 | Fayad et al. | Mar 2006 | A1 |
20060090084 | Buer | Apr 2006 | A1 |
20060095726 | Zaabab et al. | May 2006 | A1 |
20060101241 | Curran et al. | May 2006 | A1 |
20060168212 | Parsons et al. | Jul 2006 | A1 |
20060179302 | Hatakeyama | Aug 2006 | A1 |
20060179324 | Hatakeyama | Aug 2006 | A1 |
20060184791 | Schain et al. | Aug 2006 | A1 |
20060208884 | Diamant | Sep 2006 | A1 |
20060215437 | Trika et al. | Sep 2006 | A1 |
20060225142 | Moon | Oct 2006 | A1 |
20070067644 | Flynn et al. | Mar 2007 | A1 |
20070136606 | Mizuno | Jun 2007 | A1 |
20070140477 | Wise | Jun 2007 | A1 |
20070174909 | Curchett et al. | Jul 2007 | A1 |
20070192610 | Chun et al. | Aug 2007 | A1 |
20070204170 | Oren et al. | Aug 2007 | A1 |
20070237325 | Gershowitz et al. | Oct 2007 | A1 |
20070283140 | Jones et al. | Dec 2007 | A1 |
20080005586 | Munguia | Jan 2008 | A1 |
20080010567 | Hughes et al. | Jan 2008 | A1 |
20080072018 | Le et al. | Mar 2008 | A1 |
20080112405 | Cholas et al. | May 2008 | A1 |
20080137848 | Kocher et al. | Jun 2008 | A1 |
20080162848 | Broyles et al. | Jul 2008 | A1 |
20080165952 | Smith et al. | Jul 2008 | A1 |
20080172538 | Dice et al. | Jul 2008 | A1 |
20080184038 | Fitton | Jul 2008 | A1 |
20080276092 | Eberhardt et al. | Nov 2008 | A1 |
20080282345 | Beals | Nov 2008 | A1 |
20090031135 | Kothandaraman | Jan 2009 | A1 |
20090055637 | Holm et al. | Feb 2009 | A1 |
20090138699 | Miyazaki et al. | May 2009 | A1 |
20090150546 | Ryan | Jun 2009 | A1 |
20090150662 | Desselle et al. | Jun 2009 | A1 |
20090154705 | Price et al. | Jun 2009 | A1 |
20090172496 | Roine | Jul 2009 | A1 |
20090196418 | Tkacik et al. | Aug 2009 | A1 |
20090259854 | Cox et al. | Oct 2009 | A1 |
20090262940 | Lim | Oct 2009 | A1 |
20090271619 | Fujii et al. | Oct 2009 | A1 |
20090290712 | Henry et al. | Nov 2009 | A1 |
20090292732 | Manolescu et al. | Nov 2009 | A1 |
20090293130 | Henry et al. | Nov 2009 | A1 |
20090328201 | Jin et al. | Dec 2009 | A1 |
20100057960 | Renno | Mar 2010 | A1 |
20100064125 | Liu et al. | Mar 2010 | A1 |
20100088739 | Hall et al. | Apr 2010 | A1 |
20100268942 | Hernandez-Ardieta et al. | Oct 2010 | A1 |
20100312940 | Shinohara | Dec 2010 | A1 |
20110012709 | Payson et al. | Jan 2011 | A1 |
20110026831 | Perronnin et al. | Feb 2011 | A1 |
20110095776 | Yunoki | Apr 2011 | A1 |
20110116635 | Bar-el | May 2011 | A1 |
20110154501 | Banginwar | Jun 2011 | A1 |
20120185636 | Leon et al. | Jul 2012 | A1 |
20120224691 | Purohit | Sep 2012 | A1 |
20120246432 | Hadley et al. | Sep 2012 | A1 |
20130024637 | Hadley | Jan 2013 | A1 |
20130024716 | Hadley | Jan 2013 | A1 |
20130031290 | Schwartz et al. | Jan 2013 | A1 |
20130305380 | Diehl et al. | Nov 2013 | A1 |
20140130189 | Hadley | May 2014 | A1 |
20140140512 | Hadley | May 2014 | A1 |
20140149729 | Hadley | May 2014 | A1 |
20140156961 | Hadley | Jun 2014 | A1 |
20140165206 | Hadley | Jun 2014 | A1 |
20140358949 | Hu | Dec 2014 | A1 |
Number | Date | Country |
---|---|---|
1650183 | Aug 2005 | CN |
101995301 | Mar 2011 | CN |
0987625 | Mar 2000 | EP |
1201762 | Aug 1989 | JP |
06-028885 | Feb 1994 | JP |
08-069697 | Mar 1996 | JP |
1131068 | Feb 1999 | JP |
2008192036 | Aug 2008 | JP |
WO-9745960 | Dec 1997 | WO |
WO-9931665 | Jun 1999 | WO |
Entry |
---|
“ARM Security Technology Building a Secure System Using TrustZone® Technology”, <http://infocenter.arm.com/help/topic/com.arm.doc.prd29-genc-009492c/PRD29-GENC-009492C_trustzone_security_whitepaper.pdf> Issue: C, 2009. |
Anderson, R. et al., “Cryptographic Processors—A Survey,” Proceedings of the IEEE, vol. 94, No. 2, Feb. 2006, pp. 357-369. |
Bialas; “Intelligent Sensors Security”, Sensors, Institute of Innovative Technologies EMAG, 40-189 Katowice, ul. Leopolda 31, Poland, ISSN 1424-8220, Jan. 22, 2010. <www.mdpi.com/journal/sensors>. |
Datta et al.; “Calibration of On-Chip Thermal Sensors using Process Monitoring Circuits”, University of Massachusetts, Amherst, MA USA, IEEE 978-1-4244-6455-5/10, 2010. |
Fields, et al; “Cryptographic Key Protection Module in Hardware for the Need2know System”, < http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=1594225 > on pp. 814-817; vol. 1, Aug 7-10, 2005. |
Gilmont, et al; “An Architecture of Security Management Unit for Safe Hosting of Multiple Agents”, < http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.121.3663 > On pp. 79-82, 1998. |
International Search Report and Written Opinion received in PCT Application No. PCT/US2011/065081, mailed Jul. 25, 2012, 9 pgs. |
International Search Report and Written Opinion received in PCT Application No. PCT/US2011/066750, mailed on Sep. 20, 2012, 10 pgs. |
International Search Report and Written Opinion received in PCT Application No. PCT/US2012/020528, mailed Aug. 22, 2012, 9 pgs. |
International Search Report and Written Opinion received in PCT Application No. PCT/US2012/023385, mailed May 22, 2012, 10 pgs. |
International Search Report and Written Opinion received in PCT Application No. PCT/US2012/023794, mailed Sep. 24, 2012, 9 pgs. |
International Search Report and Written Opinion received in PCT Application No. PCT/US2012/024367, mailed Jul. 18, 2012, 10 pgs. |
International Search Report and Written Opinion received in PCT Application No. PCT/US2012/031542, mailed Sep. 27, 2012, 9 pgs. |
International Search Report and Written Opinion received in PCT Application No. PCT/US2011/065066, mailed Jul. 16, 2012, 9 pgs. |
Sun Microsystems, “Sun Cryptographic Accelerator 4000”, Firmware Version 1.1, FIPS 140-2 Non-Proprietary, Security Policy, Level 3 Validation, Aug. 6, 2004, pp. 1-20, <oracle.com/technetwork/topics/security/140sp457-160924.pdf>. |
Yang, et al; “Improving Memory Encryption Performance in Secure Processors”, < http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=1407851 > On pp. 630-640; vol. 54; Issue: 5, May 2005. |
Yao, et al.; “Calibrating On-chip Thermal Sensors in Integrated Circuits: A Design-for-Calibration Approach”, Springer Science+Business Media, LLC 2011, Sep. 21, 2011. |
Cengage Learning, “The Hexadecimal Number System and Memory Addressing,” May 16, 2011, <http://web.archive.org/web/20110516210838/http://college.cengage.com/coursemate/technology_education/andrews_9781435497788/unprotected/book_level/The_Hexadecimial_Number_System_and_Memory_Addressing.pdf>. |
D. Ibrahim, “Design of a multichannel temperature data logger with SD card storage,” Electronics World, Feb. 2009, <http://www.mikroe.com/downloads/get/789/data_logger_ew_02_09.pdf>. |
Dedrick et al., “An inexpensive, microprocessor-based, data logging system,” Computers & Geosciences, 2000, vol. 26, pp. 1059-1066. |
Hobbizine, “adding memory with i2c eeproms,” May 16, 2010, (web page), <http://picaxe.hobbizine.com/eeprom.html>. |
Limor, “Logger Shield: Datalogging for Arduino,” (web page), May 17, 2011, <http://www.ladyada.net/make/logshield/index.html>. |
Maxim Integrated Products, “DS1678 Real-Time Event Recorder,” 2005, <http://datasheets.maximintegrated.com/en/ds/DS1678.pdf>. |
Microsoft Corp., “BitLocker Drive Encryption: Scenarios, User Experience, and Flow,” May 16, 2006, available at: <http://msdn.microsoft.com/en-us/library/windows/hardware/gg463165.aspx>. |
Microsoft Corp., “BitLocker Drive Encryption: Technical Overview,” May 16, 2006. |
Microsoft, “How To: Configure MachineKey in ASP.Net 2.0,” available Mar. 9, 2012, <http://msdn.microsoft.com/en-us/library/ff649308.aspx>. |
National Institute of Standards and Technology, “Security Requirements for Cryptographic Modules,” FIPS PUB 140-2, May 25, 2001, <http://csrc.nist.gov/publications/fips/fips140-2/fips1402.pdf>. |
National Institute of Standards and Technology, “Security Requirements for Cryptographic Modules,” FIPS PUB 140-3, Draft, p. 16, Sep. 11, 2009, and Annexes A-G <http://csrc.nist.gov/publications/PubsDrafts.html#FIPS-140--3>. |
Raafat. S. Habeeb, “Design a Programmable Sequence Controller Utilizing I2C BUS,” 2011, Journal of Madenat Alelem College, vol. 3, iss. 2, pp. 5-25, <http://www.iasj.net/iasj?func=fulltext&ald=60778>. |
Revolution Education Ltd., “Picaxe Datalogger (AXE110P),” version 2.0, (web page), Dec. 2010, <http://www.picaxe.com/docs/axe110.pdf>. |
Rick Smith, “Authentication,” (excerpt), Feb. 2002, <http://www.visi.com/crypto/>. |
ViaSat, Inc., “Requirements Description for an Advanced Cryptographic Module (ACM) to Support the High Capacity Communications Capability (HC3),” Technical Report, Oct. 18, 2005, <http://cryptome.org/acm-hc3.htm>. |
Supplementary Partial European Search Report, European Patent Application No. 12814434.2, Jan. 28, 2016, 10 pages. |
Number | Date | Country | |
---|---|---|---|
20140165206 A1 | Jun 2014 | US |
Number | Date | Country | |
---|---|---|---|
61509078 | Jul 2011 | US |