The present invention relates generally to computer systems and, more particularly, to providing a trusted and secure computing platform.
With the advent of personal computer system use in everyday business transactions, the issue of computer security has become critical. Unsecured personal computers inhibit electronic business (e-business) because users are reluctant, justifiably so, to transmit highly personal and sensitive information to systems that may be vulnerable to intruders or viruses. While many personal computer (PC) manufacturers have made individual strides toward increasing security by adding “smart cards” or embedded security chips to their new models, the lack of a concerted effort by the PC industry to develop security technology could prevent this technology from evolving in a consistent and compatible way across manufacturers.
Recognizing this potential risk and the inhibiting effect it could have on electronic commerce, an open alliance of major PC manufacturers was formed to develop and propose a standard that would adopt hardware and software technologies to strengthen security at the platform level. The open alliance, known as the Trusted Computing Platform Alliance (TCPA), has proposed a standard, including new hardware, BIOS and operating system specifications, so that PC manufacturers can provide a more trusted and secure PC platform based on common industry standards, the details of which are provided in the TCPA PC Specific Implementation Specification, 1.00 RC1 (Aug. 16, 2001), hereby incorporated by reference.
The motherboard 30 is provided by the manufacturer and includes one or more CPUs 32 and all primary peripheral devices 34, i.e., devices which directly attach to and directly interact with the CPU 32. In addition, the motherboard 30 includes all BIOSes 36 and the TBB 40. The TBB 40 is the center of the trusted platform, and includes a Core Root of Trust for Measurement (CRTM) 42, a Trusted Platform Module (TPM) 44, and a trusted connection 46 of the CRTM 42 and TPM 44 to the motherboard 30.
According to the TCPA specification, the CRTM 42 and the TPM 44 are the only trusted components on the motherboard 30, i.e., they are presumed secure and isolated from tampering by a third party vendor or software. Only the authorized platform manufacturer (or an agent thereof) can update or modify the code contained therein. The CRTM 42 is the executable component of the TBB 40 that gains control of the platform 20 upon a platform reset. Thus, for all types of platform resets, the CPU 32 always begins execution with the CRTM's 42 platform initialization code. Trust in the platform is based on the CRTM 42, and trust in all measurements is based on its integrity.
The basic premise underlying the trusted platform is ensuring that untrusted devices or software have not been loaded onto the system. Trust is established during a pre-boot state that is initiated by a platform reset. The platform reset can be a cold boot (power-on), a hardware reset, or a warm boot, typically caused by a user keyboard input. Following a platform reset, the CPU 32 begins executing the CRTM's 42 platform initialization code. The chain of trust thus begins at the CRTM 42.
In this architecture, the BIOS includes a Boot Block 50 and a POST BIOS 36. The Boot Block 50 and the POST BIOS 36 are independent components, and each can be updated independently of the other. The Boot Block 50 is located in the CRTM 42, while the POST BIOS 36 is located outside the TBB 40. Thus, while the manufacturer or a third party supplier may update, modify or maintain the POST BIOS 36, only the manufacturer can modify or update the Boot Block 50. In a variation of the architecture, the entire BIOS is a single entity located entirely within the CRTM 42.
As stated above, the CRTM 42 and TPM 44 are presumptively trusted. Thus, following a platform reset, code in the Boot Block 50 is executed, which measures the entity to which it will transfer control, in this case, the POST BIOS 36. “Measuring an entity” means hashing the code in the entity to produce a measurement of the code, which is then extended into a platform configuration register (PCR) 48 in the TPM 44. The TPM 44 includes a plurality of PCRs 48a, 48b, 48c, and 48d, a portion of which are designated for the pre-boot environment and referred to collectively as boot PCRs 48a. Each boot PCR 48a is dedicated to collecting specific information related to a particular stage of the boot sequence. For example, one boot PCR 48a (PCR[0]) stores measurements from the CRTM 42, the POST BIOS 36, and all firmware 38 physically bound to the motherboard 30.
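By way of illustration, the measure-and-extend operation described above can be sketched as follows. This is a minimal Python sketch assuming SHA-1 as the hash algorithm and 20-byte PCR values, consistent with TCPA 1.x conventions; the function names and the placeholder firmware image are illustrative only and do not correspond to an actual TPM interface.

```python
import hashlib

PCR_SIZE = 20  # TCPA 1.x PCRs hold a 160-bit (SHA-1 sized) value


def measure(entity_code: bytes) -> bytes:
    """Hash the code of an entity to produce its measurement digest."""
    return hashlib.sha1(entity_code).digest()


def extend(pcr_value: bytes, measurement: bytes) -> bytes:
    """Extend a PCR: new value = SHA-1(old PCR value || measurement).

    Extending is one-way; a PCR cannot be set to an arbitrary value,
    only folded forward, which is what makes the record tamper-evident.
    """
    return hashlib.sha1(pcr_value + measurement).digest()


# Example: the Boot Block measures the POST BIOS image into PCR[0].
pcr0 = bytes(PCR_SIZE)                      # PCRs start at zero on platform reset
post_bios_image = b"...POST BIOS code..."   # placeholder for the real firmware image
pcr0 = extend(pcr0, measure(post_bios_image))
```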
Once the POST BIOS 36 has been measured, control is transferred to the POST BIOS 36, which then continues to boot the system by ensuring that the hardware devices are functional. Once the POST BIOS 36 gains control, it is responsible for measuring any entity to which it will transfer control. As the POST BIOS 36 progresses through the boot sequence, the values in the boot PCRs 48a are extended whenever an entity is measured.
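The progression through the boot sequence can be pictured as each entity being measured into its designated boot PCR 48a before control is transferred to it. The following Python sketch assumes a hypothetical boot order and PCR assignments chosen for illustration; the actual assignments are defined by the TCPA PC Specific Implementation Specification.

```python
import hashlib


def extend(pcr: bytes, entity_code: bytes) -> bytes:
    """Extend a boot PCR with the SHA-1 measurement of an entity's code."""
    return hashlib.sha1(pcr + hashlib.sha1(entity_code).digest()).digest()


# Hypothetical boot order and PCR assignments, for illustration only.
BOOT_SEQUENCE = [
    ("POST BIOS",   0, b"...POST BIOS image..."),
    ("option ROM",  2, b"...option ROM code..."),
    ("boot loader", 4, b"...IPL code..."),
]

boot_pcrs = {index: bytes(20) for index in range(8)}   # boot PCRs start at zero

for name, pcr_index, code in BOOT_SEQUENCE:
    # Each entity is measured *before* control is transferred to it,
    # so the boot PCR values reflect the boot sequence from beginning to end.
    boot_pcrs[pcr_index] = extend(boot_pcrs[pcr_index], code)
```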
Upon booting to the operating system (OS) 14, the operating system 14 verifies the trustworthiness of the platform 20 by comparing the values in the boot PCRs 48a with precalculated values known by the operating system 14. If the values match, the operating system 14 is assured of a secure boot and that the platform is trusted. If the values do not match, the operating system 14 is alerted of a possible breach, and the operating system 14 can take measures to reestablish trust.
If the device is a bootable device (step 126), an operating system 14 has presumably been booted, and the process 100 continues at number C. This part of the process verifies the trustworthiness of the boot sequence. As explained above, each component is measured, i.e., the code in each device is hashed and extended to the appropriate boot PCR 48a. Thus, the values in the boot PCRs 48a reflect the boot sequence from beginning to end. In step 134, the operating system 14 compares the value in each boot PCR 48a to a precalculated value that reflects a trustworthy boot sequence. The precalculated value is typically calculated by the operating system 14, which is aware of all trusted components.
If the boot PCR 48a values equal the precalculated values (step 136), the boot sequence finishes in step 138. If, on the other hand, the boot PCR 48a values do not equal the precalculated values (step 136), the operating system 14 initiates operations to restore trust in step 140. The operating system 14 may examine the boot process to determine whether a security breach has occurred, for instance by launching a virus detection program.
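A sketch of the comparison of steps 134 through 140 is shown below, assuming the trusted operating system 14 holds a table of precalculated values indexed by boot PCR; the function names are illustrative and not part of any TCPA-defined interface.

```python
import hmac


def verify_boot(boot_pcrs: dict, precalculated: dict) -> bool:
    """Steps 134/136: compare each boot PCR against the value for a trustworthy boot."""
    return all(
        hmac.compare_digest(boot_pcrs.get(index, b""), expected)
        for index, expected in precalculated.items()
    )


def finish_boot(boot_pcrs: dict, precalculated: dict) -> str:
    if verify_boot(boot_pcrs, precalculated):
        return "boot sequence finished"          # step 138
    # Step 140: initiate operations to restore trust, for example by
    # launching a virus detection program or alerting the administrator.
    return "restoring trust"
```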
While the TCPA compliant system described above ensures that rogue applications or devices do not contaminate the trusted platform, it provides no protection against a physical intrusion, i.e., an intruder removing the physical casing or cover of the computer system and physically tampering with the system. Most computer systems utilize tamper circuits to detect a tamper event, e.g., removal of the cover. The tamper event triggers a response from the system, such as an alert to the administrator or a shutdown during booting. Nevertheless, these measures can be circumvented if the intruder boots to a non-system operating system, which can clear any indication that a tamper event occurred.
Accordingly, a need exists for a method and system for detecting a tamper event in a TCPA compliant system. The detection method and system should be secure and private so that a non-TCPA operating system cannot clear the tamper signal. The present invention addresses such a need.
The present invention provides a method, system and computer readable medium containing programming instructions for detecting a tamper event in a trusted computer system having an embedded security system (ESS), a trusted operating system, and a plurality of devices. The method, system and computer readable medium of the present invention include receiving a tamper signal in the ESS, and locking the tamper signal in the ESS. According to the method, system and computer readable medium of the present invention, the trusted operating system is capable of detecting the tamper signal in the ESS.
Through aspects of the present invention, the tamper signal is locked in a secure, tamper-proof embedded security system. The tamper signal is hashed and extended to one of the plurality of PCRs. Thus, following a boot sequence, the trusted operating system detects the tamper signal by comparing the value of the one PCR to a precalculated value representing a trustworthy boot. Because the tamper signal is locked in the ESS, an intruder or rogue application cannot clear the tamper signal.
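The effect of extending the tamper signal can be pictured as follows: once the hashed tamper signal has been folded into the dedicated PCR, that PCR can no longer equal the precalculated value for a trustworthy boot, so the comparison performed after booting necessarily fails. The PCR contents and tamper payload in this Python sketch are chosen purely for illustration.

```python
import hashlib


def extend(pcr: bytes, data: bytes) -> bytes:
    """Fold a hashed value into a 20-byte PCR, as in the pre-boot measurements."""
    return hashlib.sha1(pcr + hashlib.sha1(data).digest()).digest()


# Precalculated value representing a trustworthy boot (illustrative payload).
precalculated = extend(bytes(20), b"trusted pre-boot measurements")

# Trustworthy boot: the dedicated PCR matches the precalculated value.
pcr = extend(bytes(20), b"trusted pre-boot measurements")
assert pcr == precalculated

# Tamper event: the hashed tamper signal is extended into the same PCR,
# so the trusted operating system's comparison after boot necessarily fails.
pcr = extend(pcr, b"tamper signal")
assert pcr != precalculated
```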
The present invention relates generally to computer systems and, more particularly, to a method and system for providing a trusted and secure computing platform. The following description is presented to enable one of ordinary skill in the art to make and use the invention and is provided in the context of a patent application and its requirements. Various modifications to the preferred embodiment and the generic principles and features described herein will be readily apparent to those skilled in the art. Thus, the present invention is not intended to be limited to the embodiment shown but is to be accorded the widest scope consistent with the principles and features described herein.
Thus, in step 240, a platform reset is initiated, which boots the computer system. As stated above, all of the boot PCRs 48a are reset to zero except the one PCR 48a′ that contains the hashed tamper signal 62, via step 250. The normal boot sequence then proceeds via step 260.
Once trust has been restored in the platform, the tamper signal 62 in the TPM 44′ is cleared and the one boot PCR 48a′ containing the hashed tamper signal 62 is reset to zero. In one preferred embodiment of the present invention, only an authorized entity, e.g., the system administrator or the trusted operating system 14, is allowed to clear the tamper signal 62 in the TPM 44′. This can be accomplished using an encrypted key known only to the authorized entity or by any other means well known to those skilled in the art. Thus, an intruder or rogue software is prevented from clearing the tamper signal 62.
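One possible form of the authorization check is sketched below. The HMAC-based proof of possession shown here is merely one of the means well known to those skilled in the art that the description alludes to; the class and attribute names are hypothetical and do not reflect an actual ESS or TPM interface.

```python
import hashlib
import hmac
import os


class EmbeddedSecuritySystem:
    """Toy model of the ESS state relevant to the locked tamper signal."""

    def __init__(self, authorization_secret: bytes):
        self._auth_secret = authorization_secret   # known only to the authorized entity
        self.tamper_signal_set = True              # a tamper event has been locked in
        self.tamper_pcr = b"\x01" * 20             # holds the hashed tamper signal

    def clear_tamper_signal(self, nonce: bytes, proof: bytes) -> bool:
        """Clear the tamper signal only for a caller that proves knowledge of the secret."""
        expected = hmac.new(self._auth_secret, nonce, hashlib.sha1).digest()
        if not hmac.compare_digest(proof, expected):
            return False                           # intruder or rogue software: request refused
        self.tamper_signal_set = False
        self.tamper_pcr = bytes(20)                # reset the dedicated boot PCR to zero
        return True


# Usage: the system administrator (or trusted operating system) computes the
# proof over a fresh nonce using the shared authorization secret.
ess = EmbeddedSecuritySystem(authorization_secret=b"administrator secret")
nonce = os.urandom(20)
proof = hmac.new(b"administrator secret", nonce, hashlib.sha1).digest()
assert ess.clear_tamper_signal(nonce, proof)
```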
Through the method and system of the present invention, the trusted operating system 14 can detect a tamper event in the computer system. By locking the tamper signal 62 in the TPM 44′, the computer system is able to track and record tamper events in a secure and private manner. Thus, the method and system of the present invention enhance security in the TCPA compliant platform.
Although the present invention has been described in accordance with the embodiments shown, one of ordinary skill in the art will readily recognize that there could be variations to the embodiments and those variations would be within the spirit and scope of the present invention. Accordingly, many modifications may be made by one of ordinary skill in the art without departing from the spirit and scope of the appended claims.