Programmable logic devices that store their configuration data in static random access memory (“SRAM”) storage are prevalent. SRAM storage is volatile; it does not retain its contents when power is lost. Therefore, programmable logic devices based on SRAM technology are used with nonvolatile storage to retain the configuration programming data during times that the device is switched off or otherwise not provided with power. However, a competitor may monitor the data flowing out of the nonvolatile storage on power-up, and thereby determine the programming configuration of the programmable logic device. Indeed, the competitor need not even analyze the data stream, but need only record it and store it in its own devices.
Some devices that store configuration programming data in nonvolatile memory components have security features (e.g., encryption/decryption) to protect sensitive data. Unfortunately, an unauthorized user may sometimes circumvent these security features and access the sensitive information.
The present disclosure relates to securing programmable devices (e.g., programmable logic devices (PLDs)) using a kill switch. A kill switch may be implemented using, for example, a kill fuse, a non-volatile memory bit, or a volatile battery-backed memory bit. In response to a detected security threat, for example, the kill switch may be blown, causing at least part of the device to be reset, disabled, or both. In some embodiments, the device's configuration circuitry and/or decryption circuitry is permanently disabled in response to a detected security threat.
In some embodiments, blowing the kill switch (i.e., switching the kill switch to a particular state) may cause a particular kill sequence to execute. A kill sequence may be defined by a user design implemented in the respective PLD. For example, for PLDs that store decryption keys, a kill sequence may cause the keys to be cleared or zeroed. A kill sequence may cause any configuration data stored in the PLD configuration memory (e.g., CRAM) to be removed, as well as resetting some or all of the logical connections in the PLD that implement a current user design. It will be understood that the kill sequence may fully or partially reset, clear, zeroize, or disable any suitable portion of the PLD. Multiple kill sequences may be defined, and the particular kill sequence executed may be based on the particular trigger associated with each respective kill sequence.
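As a purely illustrative sketch (not part of the disclosure), the association between triggers and kill sequences may be pictured as a simple dispatch table in C; every identifier below is a hypothetical name chosen only to show that different triggering events select different kill sequences.

```c
#include <stddef.h>

/* Hypothetical trigger types a user design might define. */
typedef enum {
    TRIGGER_TAMPER_DETECTED,
    TRIGGER_SECURITY_BIT_MISMATCH,
    TRIGGER_KEY_TIMEOUT,
    TRIGGER_COUNT
} kill_trigger_t;

typedef void (*kill_sequence_fn)(void);

/* Hypothetical kill sequences; each stub would clear, zeroize, or disable
 * a different portion of the device. */
void kill_clear_keys_only(void)     { /* zeroize decryption keys */ }
void kill_clear_fabric(void)        { /* clear CRAM and reset routing */ }
void kill_full_device_disable(void) { /* clear keys and CRAM, disable I/O */ }

/* One kill sequence per trigger; the sequence executed depends on which
 * trigger fired, as described above. */
static const kill_sequence_fn kill_table[TRIGGER_COUNT] = {
    [TRIGGER_TAMPER_DETECTED]       = kill_full_device_disable,
    [TRIGGER_SECURITY_BIT_MISMATCH] = kill_clear_keys_only,
    [TRIGGER_KEY_TIMEOUT]           = kill_clear_fabric,
};

void on_kill_trigger(kill_trigger_t trigger)
{
    if (trigger < TRIGGER_COUNT && kill_table[trigger] != NULL)
        kill_table[trigger]();   /* run the kill sequence bound to this trigger */
}
```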
Further features of the disclosure, its nature and various advantages will be apparent upon consideration of the following detailed description, taken in conjunction with the accompanying drawings, in which like reference characters refer to like parts throughout, and in which:
I/O circuitry 102 may include one or more ports that provide data input, output, or both for PLD 100. For example, I/O circuitry 102 may include one or more JTAG ports and one or more ports made available for inputting programming object file (POF) bit-streams. I/O circuitry 102 may be coupled to core configuration fabric 106, to control block 108, to any other suitable component of PLD 100, or to any combination thereof.
Control block 108 may include any suitable software, circuitry, or both for coordinating and controlling the configuration of PLD 100, and for maintaining security of PLD 100. Control block 108 may be coupled to I/O circuitry 102, kill switch 104, any other suitable component of PLD 100 (e.g., core configuration fabric 106 or a component thereof, such as programmable routing 110), or any combination thereof.
In some embodiments, in order to avoid unauthorized access or use of PLD 100, such as access to the user design implemented in core configuration fabric 106 (and stored in memory 114 (e.g., CRAM)), PLD 100 may include security features. For example, as illustrated in
Security module 112 may allow the user to define a trigger that causes kill switch 104 to be activated. Kill switch 104 may be a non-volatile element, mechanism, or both, or a volatile battery-backed element, for disabling functionality of a programmable device. For example, in some embodiments, a user design, via security module 112, may be given the capability to detect a security threat and to initiate a kill sequence by activating kill switch 104. A security threat may be detected based on any suitable criteria. Kill switch 104 and any of the elements of the kill sequence (e.g., resetting, altering, clearing, zeroizing, etc.) may be user-programmable.
In one suitable approach, kill switch 104 may be at least partially implemented using a physical kill fuse. A kill fuse may be any suitable fuse, such as a non-volatile (NV) fuse. In another approach, kill switch 104 may be at least partially implemented using battery-backed volatile memory. Kill switch 104 may provide one or more connections necessary for PLD 100 to be configured, to be reconfigured, to function properly, or any combination thereof. For example, when PLD 100 is in configuration mode, control block 108 (or any other suitable component) may check kill switch 104 continuously to ensure that kill switch 104 has not been blown. If a blown kill switch 104 is detected, control block 108 may prevent any configuration or reconfiguration of PLD 100. A blown kill switch 104 may also, or in the alternative, cause any suitable reset, clearing, or zeroization (i.e., clearing and verification of clearing) of certain or all memory and/or logic components of PLD 100. For example, some or all of the contents of memory 114, which may hold user configuration data, may be removed, replaced with random data, or zeroed, to name a few possibilities.
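The control-block behavior just described may be pictured with the following minimal C sketch. The register reads are modeled with stub variables and functions whose names are assumptions made only for illustration; they are not part of the disclosure.

```c
#include <stdbool.h>

static bool kill_switch_blown_flag = false;  /* stand-in for a fuse or NV sense line */
static int  words_remaining = 0;             /* stand-in for the POF bit-stream */

static bool kill_switch_blown(void)            { return kill_switch_blown_flag; }
static bool configuration_in_progress(void)    { return words_remaining > 0; }
static void load_next_configuration_word(void) { words_remaining--; }
static void abort_configuration(void)          { words_remaining = 0; }
static void zeroize_keys_and_cram(void)        { /* clear or zeroize memory and logic */ }

/* While in configuration mode, check the kill switch continuously; a blown
 * switch prevents any (re)configuration and clears sensitive state. */
void control_block_configuration_loop(void)
{
    while (configuration_in_progress()) {
        if (kill_switch_blown()) {
            abort_configuration();
            zeroize_keys_and_cram();
            return;
        }
        load_next_configuration_word();
    }
}
```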
PLD 100 may be killed (e.g., kill switch 104 blown) when, for example, a security threat is detected by a user design element of core configuration fabric 106 (e.g., security module 112) or by control block 108. For purposes of brevity and clarity and not by way of limitation, the present disclosure is described in terms of the security module 112 detecting a security threat and blowing kill switch 104. It will be understood that any other suitable elements or combination of elements may detect a security threat and kill PLD 100.
In some embodiments, a programming object file (POF) bit-stream or any other suitable configuration data used to configure or reconfigure PLD 100 may be encrypted (e.g., using an AES encryption protocol). In order to decrypt the configuration data, PLD 100 may use a decryption key. As illustrated in
In some embodiments, when security module 112 detects a threat and activates kill switch 104, the subsequent kill sequence may involve clearing the contents of volatile memory 200. Thus, the decryption key may be cleared or zeroed (i.e., memory content replaced with known and non-classified content) upon the detection of a security threat. This will prevent configuration or reconfiguration of PLD 100. A decryption key may also or alternatively be stored in non-volatile memory.
At step 304, in response to the detection of a kill trigger, security module 112, control block 108, or both may zero or clear a volatile decryption key stored in volatile memory (e.g., volatile memory 200). It will be appreciated that the terms “clearing” and “zeroing” refer to replacing the content with some other data (e.g., random data or known data) other than the current data, and not necessarily replacing the current data with the value zero.
At step 306, in response to the detection of a kill trigger, security module 112, control block 108, or both may blow kill switch 104.
At step 308, control block 108 may initiate reconfiguration of PLD 100. That is, PLD 100 is set to configuration mode (as opposed to user mode). As part of the reconfiguration sequence, control block 108 may continuously check kill switch 104. Under normal circumstances, when no kill trigger has been detected and PLD 100 is in configuration mode, control block 108 may continue the normal configuration sequence in which configuration data is loaded from the POF bit-stream and core configuration fabric 106 is programmed according to the configuration data.
However, if, at step 310, kill switch 104 is determined to have been blown, then control block 108 may clear core configuration fabric 106 at step 312. This clearing may effectively initialize the configuration fabric to, for example, a default state (e.g., logic array blocks and corresponding programmable connections in core configuration fabric 106 may be initialized, memory 114 may be cleared, and any other configuration data or user design data may be removed). Either as part of step 304, step 308, or step 312 (or as one or more separate steps), control block 108 may, in response to the kill trigger, tri-state I/O circuitry 102 (or otherwise disable I/O circuitry 102) and disable any other suitable functionality of PLD 100, including, for example, any configuration engine that may be used to configure the device. In some embodiments, a decryption engine included in PLD 100 is disabled in response to the kill trigger. One advantage of disabling the decryption engine is that side-channel attacks on the decryption engine (such as simple power analysis (SPA) or differential power analysis (DPA)) may be prevented, since such attacks typically require the decryption engine to be exercised. In particular, even if the main power supply of PLD 100 is interrupted before the volatile battery-backed key is fully cleared, disabling the decryption engine prevents an attacker from accessing the key using side-channel attacks.
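For illustration only, the steps of kill sequence 300 described above (steps 304 through 312) may be summarized by the following C sketch. The helper functions are stubs standing in for device-specific operations, and their names are assumptions rather than terminology from the disclosure.

```c
#include <stdbool.h>

static bool kill_switch_state = false;

void zero_volatile_key(void)               { /* step 304: clear the battery-backed key */ }
void blow_kill_switch(void)                { kill_switch_state = true; /* step 306 */ }
void enter_configuration_mode(void)        { /* step 308: leave user mode */ }
bool kill_switch_is_blown(void)            { return kill_switch_state; /* step 310 */ }
void clear_core_configuration_fabric(void) { /* step 312: initialize CRAM and routing */ }
void tristate_io(void)                     { /* disable I/O circuitry */ }
void disable_decryption_engine(void)       { /* also blocks SPA/DPA side channels */ }

void kill_sequence_300(void)
{
    zero_volatile_key();
    blow_kill_switch();
    enter_configuration_mode();
    if (kill_switch_is_blown()) {
        clear_core_configuration_fabric();
        tristate_io();
        disable_decryption_engine();
    }
}
```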
When kill switch 104 is blown or when a kill sequence (e.g., kill sequence 300) is executed, PLD 100 may be permanently deactivated by virtue of the terminated connection of kill switch 104, or by disablement of other components or features of PLD 100. In some embodiments, PLD 100 may be configured to allow a reset by an authorized user to be functional even following the execution of a kill sequence (for example, when kill switch 104 is implemented as battery-backed volatile memory). In some such embodiments, kill switch 104 is implemented as a “sticky bit,” which is discussed in detail below. To re-enable PLD 100 after a volatile kill switch is blown, the kill switch may be fully cleared. Data stored in PLD 100, including the key, may also be cleared.
It will be understood that kill sequence 300 is merely illustrative. Any other suitable steps may be performed in addition to or in place of those shown. Any suitable modifications may be made to kill sequence 300 and it will be understood that the order presented is merely illustrative. It will be further understood that PLD 100 may implement multiple kill sequences each associated with a respective trigger. Therefore, different triggering events may cause one or more different kill sequences to execute.
In some embodiments, fuse bits may be cleared as part of the kill sequence by blowing corresponding fuses. For example, in some designs of PLD 100, instead of using volatile decryption keys (i.e., keys stored in volatile memory 200), the decryption key may be stored in non-volatile fuse bits; such fuse-based key bits may likewise be cleared as part of the kill sequence.
As discussed above, any suitable security threats may be identified and used to trigger the kill switch and associated kill sequence. In one suitable approach, security option bits representative of particular security threats may be used to identify a security threat and to trigger a device clear or kill sequence. The security option bits may take the form of, for example, sticky bits. Sticky bits may be stored in PLD 100 using volatile memory powered by a battery. For example, referring to
In one suitable approach, sticky bits may be stored in multiple copies on PLD 100 (e.g., in triplicate) and may be backed up using a shadow register powered by core configuration fabric 106. If any of the sticky bits are set high, the corresponding other bits would be forced high as well. Cycling only one of the power supplies would restore the value in the register that was cycled from a corresponding register that was not cycled.
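A possible way to picture the redundant sticky-bit storage described above is the following C sketch, in which ordinary variables stand in for the battery-backed and shadow registers; the names and the copy count are assumptions made only for illustration.

```c
#include <stdbool.h>

#define STICKY_COPIES 3   /* e.g., stored in triplicate, plus a shadow register */

static bool sticky_copy[STICKY_COPIES];   /* stand-ins for the redundant registers */

/* Reconcile the copies: a bit set high in any copy forces every copy high, so
 * cycling only one power supply restores the value from an uncycled copy. */
bool sticky_bit_read_and_restore(void)
{
    bool any_set = false;
    for (int i = 0; i < STICKY_COPIES; i++)
        any_set = any_set || sticky_copy[i];

    if (any_set)
        for (int i = 0; i < STICKY_COPIES; i++)
            sticky_copy[i] = true;

    return any_set;
}
```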
Sticky bits may be used to set any of a multitude of security options. They may also be used to test whether particular security threats exist. For example, in one suitable approach, sticky bits stored in PLD 100 may be compared to corresponding security option bits being transmitted in the POF. Because the sticky bits stored in PLD 100 are of a fixed pattern (i.e., they cannot be modified except when reconfiguring PLD 100), one suitable security threat check may involve comparing the sticky bit pattern to a security bit pattern in the POF. If there is a mismatch, then the decryption key may be cleared or zeroed, or a device kill sequence may be triggered.
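The mismatch check described above may be sketched as follows; the bit widths and function names are assumptions chosen only to illustrate comparing the device's fixed sticky-bit pattern against the security option bits carried in the POF.

```c
#include <stdint.h>
#include <stdbool.h>

void zeroize_decryption_key(void) { /* clear or zero the stored key */ }
void trigger_kill_sequence(void)  { /* start a device kill sequence  */ }

/* Compare the fixed sticky-bit pattern held in the device with the security
 * option bits in the POF; on mismatch, clear the key and/or kill the device. */
bool check_pof_security_bits(uint32_t device_sticky_bits, uint32_t pof_security_bits)
{
    if (device_sticky_bits != pof_security_bits) {
        zeroize_decryption_key();
        trigger_kill_sequence();
        return false;
    }
    return true;   /* patterns match: configuration may proceed */
}
```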
In some embodiments, security option bits, such as sticky bits or bits received from the POF, stored in PLD 100 may be checked or monitored to ensure that their respective values have not been modified (e.g., as a result of radiation, a security threat, any other suitable cause, or any combination thereof). If one or more of the security option bits stored in PLD 100 is determined to have been altered, then the decryption key may be cleared or zeroed, or a device kill sequence may be triggered. A determination of whether any of the security option bits may have been altered may be made using any suitable technique, such as, for example, by computing a checksum (e.g., CRC) value and comparing it to a stored (i.e., expected) value.
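By way of illustration only, such a checksum comparison might look like the following C sketch; the CRC-32 polynomial used here is a common choice and is an assumption, not one mandated by the disclosure.

```c
#include <stdint.h>
#include <stddef.h>
#include <stdbool.h>

/* Conventional bitwise CRC-32 (reflected polynomial 0xEDB88320). */
static uint32_t crc32(const uint8_t *data, size_t len)
{
    uint32_t crc = 0xFFFFFFFFu;
    for (size_t i = 0; i < len; i++) {
        crc ^= data[i];
        for (int b = 0; b < 8; b++)
            crc = (crc >> 1) ^ (0xEDB88320u & (0u - (crc & 1u)));
    }
    return ~crc;
}

/* Compare the CRC of the stored security option bits with the expected value;
 * a false result would trigger a key clear or a kill sequence. */
bool security_option_bits_intact(const uint8_t *bits, size_t len, uint32_t expected_crc)
{
    return crc32(bits, len) == expected_crc;
}
```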
Security option bits, including sticky bits, are further discussed in U.S. patent application Ser. Nos. 13/098,074 and 13/098,316 filed on even date herewith and which are both hereby incorporated by reference herein in their entireties.
In some embodiments, a volatile key (or any other type of key) may be cleared or zeroed after a particular period of time has elapsed. For example, when programming the key, the user design may indicate that the volatile key is to be automatically zeroed after a certain period of time from the programming of the key. The time may be determined from a real-time clock powered by a battery. This may prevent any subsequent attempts to reconfigure the PLD (e.g., if only a single configuration operation of the PLD is intended). The time may be calculated from any suitable start point and need not be measured from the time the key is initially programmed. For example, the zeroing may automatically occur after a certain period of time elapses since a security threat was identified by security module 112, control block 108, or both.
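A minimal C sketch of this timed zeroization, assuming a hypothetical battery-backed real-time clock accessor and key-zeroize routine (neither of which is named in the disclosure), is given below.

```c
#include <stdint.h>
#include <stdbool.h>

static uint64_t rtc_seconds_now(void)  { return 0; /* stand-in for a battery-backed RTC */ }
static void zeroize_volatile_key(void) { /* replace the key with known content */ }

typedef struct {
    uint64_t start_seconds;      /* e.g., key programming time, or threat-detection time */
    uint64_t lifetime_seconds;   /* user-selected validity window */
    bool     key_present;
} key_timer_t;

/* Called periodically; clears the key once its window has elapsed. */
void key_timer_poll(key_timer_t *t)
{
    if (t->key_present &&
        rtc_seconds_now() - t->start_seconds >= t->lifetime_seconds) {
        zeroize_volatile_key();
        t->key_present = false;
    }
}
```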
A PLD 90 programmed according to any embodiment of the present disclosure may be used in many kinds of electronic devices. One possible use is in a data processing system 900 shown in
System 900 can be used in a wide variety of applications, such as computer networking, data networking, instrumentation, video processing, digital signal processing, or any other application where the advantage of using programmable or reprogrammable logic is desirable. PLD 90 can be used to perform a variety of different logic functions. For example, PLD 90 can be configured as a processor or controller that works in cooperation with processor 901. PLD 90 may also be used as an arbiter for arbitrating access to a shared resource in system 900. In yet another example, PLD 90 can be configured as an interface between processor 901 and one of the other components in system 900. It should be noted that system 900 is only exemplary, and that the true scope and spirit of the disclosure should be indicated by the following claims.
Various technologies can be used to implement PLDs 90 as described above and incorporating this disclosure.
It will be understood that the foregoing is only illustrative of the principles of the invention, and that various modifications can be made by those skilled in the art without departing from the scope and spirit of the invention. For example, the various elements of this invention can be provided on a PLD in any desired number and/or arrangement. One skilled in the art will appreciate that the present invention can be practiced by other than the described embodiments, which are presented for purposes of illustration and not of limitation, and the present invention is limited only by the claims that follow.
This application is a continuation of U.S. patent application Ser. No. 13/097,816, filed Apr. 29, 2011 (now U.S. Pat. No. 8,461,863), the disclosure of which is hereby incorporated by reference herein in its entirety.
Prior Publication Data

| Number | Date | Country |
| --- | --- | --- |
| 20130271178 A1 | Oct 2013 | US |

Related U.S. Application Data

| Relation | Application No. | Filed | Country |
| --- | --- | --- | --- |
| Parent | 13/097,816 | Apr 2011 | US |
| Child | 13/913,355 | | US |