Method of and apparatus for ascertaining the status of a data processing environment

Information

  • Patent Grant
  • Patent Number
    8,219,496
  • Date Filed
    Friday, February 22, 2002
  • Date Issued
    Tuesday, July 10, 2012
Abstract
In order to facilitate a user's ability to trust a computing environment, a trusted computing device (2) is arranged to challenge other devices in the computing environment and to record a log of the facilities available within the computing environment and an indication of whether those facilities are trustworthy. A new user (40) entering the computing environment can obtain the log from the trusted computing device in order to ascertain the status of the environment. Alternatively any device can hold data concerning platforms in its vicinity and its operation can be authenticated by the trusted device.
Description
CROSS REFERENCE TO RELATED APPLICATIONS

The subject matter of the present application may also be related to the following U.S. Patent Applications: “Data Event Logging in Computing Platform,” Ser. No. 09/979,902, filed Nov. 27, 2001; “Data Integrity Monitoring in Trusted Computing Entity,” Ser. No. 09/979,903, filed Nov. 27, 2001; “Information System,” Ser. No. 10/080,476, filed Feb. 22, 2002; “Trusted Computing Environment,” Ser. No. 10/080,477, filed Feb. 22, 2002; “Method of and Apparatus for Investigating Transactions in a Data Processing Environment,” Ser. No. 10/080,478, filed Feb. 22, 2002; “Trusted Platform Evaluation,” Ser. No. 10/194,831, filed Jul. 11, 2002; “Privacy of Data on a Computer Platform,” Ser. No. 10/206,812, filed Jul. 26, 2002; and “Method and Apparatus for Locking an Application Within a Trusted Environment,” Ser. No. 10/208,718, filed Jul. 29, 2002.


TECHNICAL FIELD

The present invention relates to a method of and apparatus for determining status of a data processing environment. The information concerning the status of the environment may include an indication of what devices are operating within the environment, what facilities they offer and whether the devices can be trusted.


BACKGROUND ART

The issues of security and ease of use of a computing platform are often in conflict. For commercial applications, a client computing platform typically operates in an environment where its behaviour is vulnerable to modification. Such modification can be made by local or remote entities. This has given rise to concerns, especially in the field of e-commerce, that transactions conducted on a computer might be subject to some form of misdemeanour, such as theft of credit card details. These perceived insecurities may limit the willingness of users to undertake e-commerce transactions on either local or remote computer systems.


The data processing environment (or environment) of a computer platform or other data processing appliance consists of the other computing entities (computer platforms or any other data processing appliance) that are discrete from the computer platform and are in communication with it through one or more data networks. For a computer entity to form part of the environment of a computer platform, the computer platform must be able to interact with the entity but must not be constrained to do so—at some level, interaction must be voluntary. The boundary of an environment will generally be considered in terms of network connections (or other network “distance”) between one point and another—for some purposes, the data processing environment of a computer platform may be considered its local network and no more, whereas for other purposes, the data processing environment of a computer platform may extend much further (for example, across a company intranet).


There are existing security applications, such as virus checkers and firewalls, which can be installed in computer systems in order to limit their vulnerability to viruses or to malicious users seeking to take control of the machine remotely. However, these security applications execute on computing platforms under the assumption that the platform is operating as intended and that the platform itself will not subvert the processes used by these applications.


Users engaging in communication with a remote or unfamiliar data processing environment may nevertheless be concerned about the security of that environment as a whole rather than just the security of the computing device with which they have made initial contact. Thus users seek reassurance that the computing environment can be trusted.


As used herein, the word “trust” means that something can be trusted, in the sense that it is working in the way that it is intended and expected to work and is not, and has not been, tampered with in order to run malicious operations.


DISCLOSURE OF THE INVENTION

According to a first aspect of the present invention, there is provided an apparatus for ascertaining the status of a data processing environment, comprising at least one trusted computing device which is arranged to challenge other devices within a data processing environment, to keep a record of the responses and to make the record available.


It is thus possible to use a trusted computing device to keep an audit of the status of a data processing network. The trusted device can challenge new devices as and when it discovers them within the data processing environment and can also re-challenge known devices from time to time in order to ascertain that they are functioning correctly. In such a challenge, a trusted computing device extracts from the challenged device a response to the challenge. Preferably the challenged device enters into a predefined challenge-response protocol, such as a TCPA challenge-response protocol, in order to return trusted integrity and identity information about the challenged device. Thus the response may include information which can be analysed to determine the integrity of the challenged device, such as at least one integrity metric. The trusted device, upon receiving the response, extracts the integrity information from the response and compares it with an authenticated metric for the challenged device. As a result of the comparison, the trusted device can indicate whether the challenged device can be trusted or not. By keeping a record of the time that a device is challenged, the response received from the device and the result of the comparison of the integrity metrics, the trusted computing device can maintain a log of the status of the data processing environment.
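

By way of illustration only, a minimal sketch (in Python) of how such a challenge result might be compared against an authenticated metric and entered into a timestamped log is given below. The class and function names, and the use of a SHA-1 digest as the integrity metric, are assumptions made for the example rather than part of the TCPA protocol.

    import hashlib
    import hmac
    import time
    from dataclasses import dataclass, field

    @dataclass
    class ChallengeLogEntry:
        device_id: str
        challenged_at: float     # time at which the challenge was made
        reported_metric: str     # integrity metric returned by the challenged device
        trusted: bool            # result of comparing against the authenticated metric

    @dataclass
    class TrustedDeviceLog:
        expected_metrics: dict                           # authenticated metrics per device
        entries: list = field(default_factory=list)      # the log of the environment

        def challenge(self, device_id, reported_metric):
            """Compare the reported metric with the authenticated value and record the result."""
            expected = self.expected_metrics.get(device_id)
            trusted = expected is not None and hmac.compare_digest(expected, reported_metric)
            self.entries.append(ChallengeLogEntry(device_id, time.time(), reported_metric, trusted))
            return trusted

    # Example: the challenged printer reports a metric matching its certified value.
    certified = {"printer-30": hashlib.sha1(b"printer firmware image").hexdigest()}
    log = TrustedDeviceLog(expected_metrics=certified)
    log.challenge("printer-30", hashlib.sha1(b"printer firmware image").hexdigest())  # -> True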


The integrity information would normally include a cryptographic representation of at least one platform component. This may, for example, include the BIOS, operating system or an application. Preferably the integrity information is of the form described in the TCPA specification (www.trustedpc.org) and has been measured and stored in the form described in the TCPA specification.


Advantageously other devices within the data processing environment can also issue challenges. The responses to those challenges and conclusions concerning trustworthiness can be recorded by the at least one trusted computing device. An indication of which devices acted as challenger and challengee can also be recorded, together with an indication of whether the challenger is itself established as trustworthy.


Preferably the challenges to known devices are made on a periodic basis in order to maintain an up to date record of the integrity of the data processing environment. However, additional challenges may also be made when a device attempts to perform an operation within the environment which requires that device to be trusted. Thus attempts to read, create or modify sensitive data, whether this data be user data, application data or system data (as defined by a system administrator) may require re-authentication of the trustworthiness of the device before it is enabled to have such access.


Advantageously the record held by the trusted device includes historical data representing the status of the network. Such data may be of use during investigations of system performance in order to indicate when a data processing environment could be regarded as trustworthy and/or what devices or processes may have entered or been executed in that environment. This information may be of use to administrators or investigators when seeking data concerning fraudulent transactions or attempts to subvert the operation of one or more components within the system.


In order to maintain a record of the devices within the computing environment, the trusted computing device needs to ascertain what devices are there. It can do this by sending query messages into the environment. The queries may be specific, that is directed to a known device in order to ascertain that it is still there and functioning. However, the queries may also be more general. Thus, for example, the trusted computing device could issue a query to a class of devices, for example printers, to identify what printers are available within the data processing environment. Such a query could then be repeated for a different class of device in order to build up a picture of the data processing environment. Alternatively, the trusted computing device could issue a general query asking devices to respond by identifying themselves, and optionally their functionality and/or integrity metrics. Advantageously the query also includes a generation identifier such that the age of the query can be ascertained by devices receiving the query. In this context, age can be measured in either or both the temporal sense or the number of retransmissions that the message has undergone. It is particularly desirable to limit the number of retransmissions that the query message may undergo as in extreme cases the message could propagate out of a local computing environment via, for example, a communications link to the internet and then computing devices receiving that query message could then attempt to respond. If this were the case, the trusted computing device could be overwhelmed by responses from computers which do not really constitute part of the data processing environment but which nevertheless have managed to receive the query message.
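

A sketch of how such a query might be composed is given below. The field names, the device class strings and the thirty-second age limit are illustrative assumptions, not part of any defined message format.

    import time
    import uuid
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class DiscoveryQuery:
        query_id: str                # identifier for correlating responses
        device_class: Optional[str]  # e.g. "printer"; None asks all devices to identify themselves
        issued_at: float             # temporal age of the query
        hops_remaining: int          # number of further retransmissions permitted

    def make_query(device_class=None, max_hops=2):
        """Build a discovery query whose propagation is bounded by max_hops."""
        return DiscoveryQuery(str(uuid.uuid4()), device_class, time.time(), max_hops)

    def should_forward(query, max_age_seconds=30.0):
        """A receiving device retransmits the query only while it is 'young' in both senses."""
        return query.hops_remaining > 0 and (time.time() - query.issued_at) < max_age_seconds

    # Example: ask only printers to identify themselves, with at most two retransmissions.
    printer_query = make_query("printer", max_hops=2)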


The trusted computing device can also listen to data communications on a network, or between devices local to it, in order to ascertain what devices are operating. The need to listen for devices entering and leaving the data processing network is likely to become more prevalent with the general increase in the number of portable computing devices and the ease with which these can enter or leave data processing environments as a result of the growing use of wireless communication links, for example Bluetooth, to join computing devices.


When a user with a portable computing device or a remote user using a telecommunications link wishes to interact with a data processing environment, the user may seek to challenge the integrity of that environment. The functionality of the user's computing device and/or the communications bandwidth between the user's device and the data processing network may limit the ability of the user to make individual challenges to each device in the data processing environment. However, the user may still seek confirmation that the data processing environment is secure, or at least an indication of the trust which he should place in that data processing environment (for a user may still decide to use a data processing environment even if that data processing environment is not deemed to be trustworthy—this is the user's choice depending on the type of transaction that the user wishes to undertake and the urgency ascribed to that transaction). With the present invention, a user does not need to make individual challenges to each device, but instead can seek this data from the trusted computing device. The user can trust the data from the trusted computing device because the user can challenge the trusted computing device and analyse the response to the challenge, comparing the integrity metrics received from the trusted computing device with those which are certificated as being the correct metrics, thereby enabling the user to determine whether the trusted computing device is itself trustworthy. The user can also determine what facilities are available in the computing environment.
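

The following sketch illustrates this two-stage interaction, verifying the trusted computing device before requesting its record. The connection object and its challenge and request_record methods are hypothetical placeholders for whatever transport and protocol are actually used.

    import hmac

    def verify_trusted_device(reported_metric, certified_metric):
        """Compare the metric returned by the trusted device with the certificated value."""
        return hmac.compare_digest(reported_metric, certified_metric)

    def obtain_environment_status(connection, certified_metric):
        """Challenge the trusted device and, only if it verifies, fetch its record of the environment."""
        reported = connection.challenge()
        if not verify_trusted_device(reported, certified_metric):
            raise RuntimeError("trusted computing device failed verification")
        return connection.request_record()

    class FakeConnection:
        """Stand-in for a real communications channel, for illustration only."""
        def __init__(self, metric, record):
            self._metric, self._record = metric, record
        def challenge(self):
            return self._metric
        def request_record(self):
            return self._record

    conn = FakeConnection("abc123", [{"device": "printer-30", "trusted": True}])
    status = obtain_environment_status(conn, certified_metric="abc123")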


According to a second aspect of the present invention, there is provided a computing device including a communications device and a data processor wherein the data processor is arranged to establish communication with a trusted computing device via the communication device, to receive at least part of the record of responses and to establish from an internal rules base whether the data processing environment is trustworthy enough to enable a class of transaction or task to be carried out in that environment.


According to a third aspect of the present invention, there is provided a computing device including a communications device and a data processor, wherein the computing device uses the communication device to establish communication with at least one device within a data processing system, and in which the data processor is arranged to identify challenges from at least one trusted computing device, to apply response rules to the challenge and, if a response is indicated, to respond to the challenge in accordance with the rules.


Advantageously, when the computing device receives a challenge from the trusted device it examines a generation identifier in order to see whether the message is still valid. The generation identifier may be a skip number. Thus, each time the challenge is retransmitted the skip number is modified, and every time a device receives a challenge, it checks the skip number to see if the challenge is valid. For convenience, the trusted computing device may set the skip number to an integer value greater than zero, and the skip number may be decremented at each retransmission. Any device receiving a challenge with a skip number of zero ignores the challenge and does not retransmit the challenge. This prevents the challenge from propagating too widely.


According to a fourth aspect of the present invention, there is provided a method of ascertaining the status of the data processing environment, the method comprising the steps of using a trusted computing device to challenge other devices within a data processing environment, keeping a record of the responses made to the challenges and making the record available.


Preferably the trusted computing device will itself respond to a challenge such that the integrity of the trusted computing device can be verified by the device which challenged it.


According to a further aspect of the present invention, there is provided a method of conducting a transaction in a data processing environment comprising a user device and at least a trusted computing device each having respective communications capabilities, wherein the trusted computing device keeps a record of computing devices that it has identified within the data processing environment, and wherein the user device is arranged to establish communications with the trusted computing device, to receive therefrom at least a portion of the record of computing devices within the data processing environment, and to analyse the record to establish what facilities the user device may access.


Preferably the user device further analyses the record in accordance with a set of security rules contained therein to determine what level of trust the user device can place on the integrity of the data processing environment.


It is thus possible to provide a trusted record of the status and trustworthiness of devices within a data processing network such that a computing device can be spared the task of challenging each device in the computer network in order to ascertain its trustworthiness, but instead can obtain a record of the challenges from a trusted computing device.





BRIEF DESCRIPTION OF THE DRAWINGS

The present invention will further be described, by way of example, with reference to the accompanying drawings, in which:



FIG. 1 is a schematic diagram illustrating the functional components within a trusted computing device;



FIG. 2 schematically illustrates the structure of a first data processing environment including a trusted computing device;



FIG. 3 schematically illustrates a second data processing environment including a trusted computing device;



FIG. 4 is a flow chart illustrating the steps undertaken by a trusted computing device in order to maintain a record of the local data processing environment;



FIG. 5 illustrates the data layout of the challenge issued by the trusted computing device; and



FIG. 6 is a flow chart illustrating the response of a device in the data processing environment upon receiving a challenge.





BEST MODE FOR CARRYING OUT THE INVENTION

Many of the concepts underlying trusted computing have already been published. In particular, a specification concerning the functionality of a trusted computing environment has been published by the “trusted computing platform alliance” on their web site at www.trustedpc.org. A trusted computing device of this general type is described in the applicant's co-pending International Patent Application Publication No. PCT/GB00/00528 entitled “Trusted Computing Platform”, filed on 15 Feb. 2000, the contents of which are incorporated by reference herein.


In essence, it is desirable to ensure that a computer is trustworthy. In broad terms, this can be achieved by attaching to or embedding in a computer a trusted physical device whose function in relation to that computer is to obtain measurements of data from the computer which provide an integrity metric of the computer platform. The identity of the computing platform and the integrity metric are compared with expected values that are provided by a trusted party whose role it is to vouch for the trustworthiness of the platform. If the identities and metrics match, then there is a strong implication that the platform is operating correctly. The trusted physical device is normally provided as a tamper-evident component such that attempts to interfere with its operation will result in its performance being modified when it is next used.


A trusted platform is illustrated in FIG. 1. The platform 2 includes a central processing unit 4, which is in communication with BIOS memory 6, Random Access Memory 8, Read Only Memory 10, a removable data storage device 12, an internal mass storage device 16, at least one communications device 18, a trusted device 20, a video board and associated display 22, and a user input device such as a keyboard 24, via a data bus 26. In a conventional computing device, at power-up or reset the CPU initially reads instructions from the BIOS 6. In the early days of computing the BIOS memory, which is non-volatile, was hard wired and therefore it was not possible to modify its contents. However, with the development of EEPROM it has become possible to modify the BIOS of a computer system. In a trusted computing environment, the boot-up sequence is modified such that the CPU first looks to the trusted device 20 for instructions after reset or power-up. The trusted device 20, having gained initial control of the CPU, then enables the CPU to execute the BIOS program held in the BIOS memory 6. The trusted device can investigate integrity metrics in the BIOS program, such as checksums for the whole or a specific part of the BIOS or data at specific addresses, in order to determine that the BIOS has not been interfered with or modified. It can compare these values against values certified as being correct by the trusted party. The BIOS 6 may advantageously be contained within the trusted device, thereby ensuring that the BIOS builds the correct environment for the operating system. The trusted device can also, through its interaction with the BIOS, enforce the operation of various security policies. After the BIOS has been loaded, the CPU can then seek to load its operating system from the storage device 16. Once again, the trusted device 20 can challenge the operating system to extract integrity metrics from it and to compare these with metrics provided by the trusted party. Thus, as the system builds up from power-up or reboot, the BIOS is first confirmed as trustworthy, and once this has been established tests are made to see that the operating system is trustworthy, and once this has been established further tests may be made to ensure that applications executing on the trusted computer platform are also trustworthy. The trusted computing platform need not be a general purpose PC wherein applications are loaded from the mass storage device 16 to the random access memory 8, and indeed the trusted device could be an embedded system, in which case it is likely that application data may also be held in read-only memory 10. The trusted computing device may have a reader 12 for removable media, such as floppy discs, or for interfacing with smart cards which may be used to help authenticate the identity of a local user who seeks to operate the trusted computing device 2 via its keyboard 24. An interface to the local user is provided via the keyboard 24 and the video display 22, as is well known. The trusted computing device 2 also has a communications device 18 which may support one or more of direct connection with a local area or wide area network, wireless communications, infrared or ultrasonic communication and/or communications with a telecommunication network via a data link.
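

A much simplified sketch of this measure-then-compare sequence is given below. The component names, images and certified digests are invented for the example; in practice the measurements are taken by the trusted device 20 itself before control is released to the next stage.

    import hashlib

    # Certified digests for each boot stage, as might be supplied by the trusted party.
    CERTIFIED = {
        "bios": hashlib.sha1(b"bios image v1").hexdigest(),
        "os": hashlib.sha1(b"operating system image").hexdigest(),
    }

    def measure(component_image):
        """Compute a cryptographic digest (integrity metric) of a platform component."""
        return hashlib.sha1(component_image).hexdigest()

    def boot_sequence(images):
        """Verify each stage in order; stop extending trust at the first mismatch."""
        trusted_stages = []
        for stage in ("bios", "os"):
            if measure(images[stage]) != CERTIFIED[stage]:
                break
            trusted_stages.append(stage)
        return trusted_stages

    # Both stages match their certified metrics, so both are reported as trusted.
    print(boot_sequence({"bios": b"bios image v1", "os": b"operating system image"}))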


As shown in FIG. 2, the trusted computing device 2 can be a component within a relatively well defined network. Other devices, such as a printer 30, a scanner 32, and user devices 34 and 36, are each connected via a local area network 38. In this environment, the communications medium between devices is well defined and the number of devices on the network can be expected to change only relatively slowly.


A user's device 40, for example in the form of a personal digital assistant, can establish communications with the trusted computing device 2 via the communications device 18 of the trusted computing device 2 and also a communications port 42 of the personal digital assistant. The personal digital assistant 40 also includes a central processing unit 44 and a memory 46 holding a set of security rules which the processor 44 uses to decide whether or not it should regard the data processing environment as trustworthy.


In use, the trusted computing device 2 challenges the devices 30 to 36 in its computing environment 38 and keeps a record of their responses, including integrity metrics, which enables it to work out whether the devices 30 to 36 can be trusted, that is that they are behaving in an expected manner and have not been subverted by external or malicious processes. The trusted device 2 keeps a record of the results of the challenges in its memory 8 and on the mass storage device 16 (see FIG. 1).


When the user's device wishes to use the facilities available on the computing network 39, it establishes communication with the trusted computing device 2 and challenges it in order to assure itself that the trusted computing device is indeed a trusted computing device and not some rogue device masquerading or spoofing as one. Once the user device 40 has completed the challenge and compared the integrity metric retrieved from the trusted computing device with an expected integrity metric as certified by a trusted authority, the user device 40 may then query the trusted computing device 2 in order to obtain a list of the devices available in the local computing area, the functions that they can perform and whether or not they are trustworthy. Having received this data, the user device 40 then applies the security rules held in the memory 46 in order to determine, in accordance with those rules (which are themselves defined by the device owner, an administrator or some other person given the rights to modify them), whether the local computing environment is trustworthy. If so, the user is informed. Alternatively, the user may be informed if the computing environment is not trustworthy. The user may also be informed which classes of transactions should or should not be undertaken in this environment.
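

One way in which the rules held in the memory 46 might be expressed and applied to the received record is sketched below. The rule structure, the device types and the transaction classes shown are assumptions made purely for illustration.

    from dataclasses import dataclass

    @dataclass
    class SecurityRule:
        transaction_class: str   # e.g. "payment" or "printing"
        required_devices: set    # device types that must be present and trustworthy

    def permitted_transactions(record, rules):
        """Return the transaction classes the environment supports under the given rules."""
        trusted_types = {entry["type"] for entry in record if entry["trusted"]}
        return [rule.transaction_class for rule in rules if rule.required_devices <= trusted_types]

    # The record is the list received from the trusted computing device 2.
    record = [
        {"type": "printer", "trusted": True},
        {"type": "gateway", "trusted": False},
    ]
    rules = [
        SecurityRule("printing", {"printer"}),
        SecurityRule("payment", {"gateway"}),
    ]
    print(permitted_transactions(record, rules))  # -> ['printing']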


Not all computing environments are as well defined as that shown in FIG. 2. Companies may wish to offer computing facilities in publicly accessible environments where it can be expected that most users will establish local wireless communications with some form of gateway or server whilst they are in the environment. Thus, as shown in FIG. 3 a trusted computing device 2 may be situated in an environment with a display device 60. Users 62 and 64 having portable computing devices such as personal digital assistants or palm top computers may enter the computing area around the devices 2 and 60 and may establish wireless communications with the trusted computing device 2 in order to enquire what facilities are available in the computing area. The computing device 2 may then inform the devices 62 and 64 about the existence of the display device 60 such that these devices can then interface with it for example to display data which they are not capable of displaying on their own inbuilt display devices. The trusted computing device 2 may also seek to inform each mobile device 62 and 64 about the presence of the other. The trusted computing device may also provide details about access to a telecommunications network 66.


As indicated above, the arrangements of FIGS. 2 and 3 are simply examples of (relatively simple) data processing environments. Far more complex arrangements involving multiple network connections may nonetheless be considered data processing environments. In most practical cases, it will be necessary to define a boundary or a way to determine the extent of the environment, but this will generally not be problematic to the person skilled in the art (for example, such a boundary may be formed by a firewall).


FIG. 4 schematically illustrates a sequence by which the trusted computing device 2 can maintain its record of devices in the data processing environment. Starting at step 70, the trusted computing device 2 listens to data traffic in the computing environment to identify the presence of any new devices. Once a new device has been identified, control is passed to step 72 where the trusted computing device issues a challenge to the new device. Any response made by that device is recorded at step 74, together with the time at which the response was received, and control is then passed to step 76 where an analysis of any integrity metric returned by the challenged device is made in order to ascertain whether the device is trustworthy. The result of the analysis is recorded at step 78. The challenged device, or indeed any other device on the network, may issue a request to the trusted device seeking information from the record; a test for any such request is made at step 80. If a request has been received, that data is transmitted at step 82; otherwise control is returned to step 70 where the integrity check can be repeated.
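

The sequence of FIG. 4 might be rendered in outline as follows. The helper names (listen_for_new_device, analyse_metric and so on) are placeholders standing in for whatever discovery, challenge and transport mechanisms the trusted computing device actually provides.

    import time

    def maintenance_loop(env, record):
        """One possible rendering of steps 70 to 82 of FIG. 4 (illustrative only)."""
        while True:
            device = env.listen_for_new_device()                          # step 70
            if device is not None:
                response = env.challenge(device)                          # step 72
                record.append({"device": device, "time": time.time(),     # step 74
                               "response": response,
                               "trusted": env.analyse_metric(response)})  # steps 76 and 78
            requester = env.pending_record_request()                      # step 80
            if requester is not None:
                env.send(requester, record)                               # step 82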


The status of the computing environment can be held by the trusted computing device, and consequently displayed by the user devices, in any manner convenient and appropriate for retaining such data. The status information may be provided in the form of a list of computing entities within the computing environment, together with a trust status for each of the computing entities. Most simply, these statuses could be “trusted” (indicating that the computing entity concerned is verified to have the status of a trusted computing device) or “untrusted” (where no such verification has occurred). The “untrusted” status could be broken down further into “not trusted” (where verification has failed) and “untested” (where verification has either not yet occurred or has not taken place sufficiently recently). A further possibility is for there to be multiple levels of trust available for a computing entity, such that the computing entity is able or permitted to perform different functions at the different levels of trust—more and different integrity metrics may be required to determine that a computing entity is at a higher trust level. Such an arrangement is further described in the applicant's copending International Patent Application Publication No. WO 01/27722, the contents of which are incorporated by reference to the fullest extent permitted by law. In this case, the trusted device 2 should be adapted to determine when these different trust levels are attained, and to indicate these trust levels appropriately when indicating the status of the computing environment.
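

Such statuses could, for example, be represented along the following lines; the enumeration values and the numeric trust levels attached to each entity are illustrative assumptions rather than values taken from the cited specification.

    from enum import Enum

    class TrustStatus(Enum):
        TRUSTED = "trusted"          # verified as having the status of a trusted computing device
        NOT_TRUSTED = "not trusted"  # verification was attempted and failed
        UNTESTED = "untested"        # not yet verified, or not verified sufficiently recently

    # A finer-grained scheme might attach a numeric trust level to each entity,
    # with higher levels requiring more, and different, integrity metrics.
    environment_record = {
        "printer-30": {"status": TrustStatus.TRUSTED, "trust_level": 2},
        "scanner-32": {"status": TrustStatus.UNTESTED, "trust_level": 0},
    }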


In a modification of the flow chart shown in FIG. 4, steps 70 and 72 may be replaced by a single broadcast challenge. Such a broadcast challenge is schematically shown in FIG. 5. The broadcast challenge comprises two parts, the first part being the challenge message 90 and the second part being a generation identifier 92.


Devices receiving the challenge shown in FIG. 5 may execute the procedure shown in FIG. 6. The procedure starts at step 100 where the device receives the challenge. Control is then passed to step 102 where the generation identifier is examined. In a preferred embodiment of the invention, the generation identifier is set to a positive integer which controls the number of retransmissions that the challenge may undergo. Each time the challenge is retransmitted, the generation identifier is decremented by the device that retransmits the challenge. Thus, once the generation identifier reaches zero the challenge has become “old” and is no longer valid. At step 104 a test is made to see if the generation identifier is greater than zero; if not, control is passed to step 106 where the challenge handling routine is terminated. If the generation identifier is greater than zero, control is passed to step 108 where, if the device is programmed to participate in these challenges, it responds to the challenge. From step 108 control is passed to step 110 where the generation identifier is decremented, and then to step 112 where the challenge is retransmitted with the decremented generation identifier. Control is then passed to step 114, which represents the end of this routine.
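

An illustrative rendering of the procedure of FIG. 6 is given below. The two-part message structure follows FIG. 5, while the respond and retransmit callables are assumed stand-ins for the device's own response and retransmission mechanisms.

    from dataclasses import dataclass

    @dataclass
    class BroadcastChallenge:
        payload: bytes    # the challenge message 90
        generation: int   # the generation identifier 92

    def handle_challenge(challenge, respond, retransmit, participate=True):
        """Steps 100 to 114 of FIG. 6, rendered as an illustrative sketch."""
        if challenge.generation <= 0:   # steps 102 to 106: the challenge is "old", ignore it
            return
        if participate:                 # step 108: answer the challenge if so programmed
            respond(challenge.payload)
        # Steps 110 and 112: decrement the generation identifier and retransmit.
        retransmit(BroadcastChallenge(challenge.payload, challenge.generation - 1))

    # Example: a challenge whose generation identifier still permits further retransmission.
    handle_challenge(BroadcastChallenge(b"identify yourselves", 3),
                     respond=lambda payload: None, retransmit=lambda forwarded: None)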


It is thus possible to provide a measure of the integrity and facilities available within a local, and possibly varying data processing network.

Claims
  • 1. A computing device including: a communication device and a data processor, wherein the computing device uses the communication device to establish communication with at least one device within a data processing system, and in which the data processor is arranged to identify a challenge from at least one trusted computing device, to search for a generation identifier within the challenge, to apply response rules to the generation identifier to see if the challenge is still valid and, if it is not, to disregard the challenge and, if the challenge is valid, to apply response rules to the challenge and, if a response is indicated, to respond to the challenge in accordance with the rules.
  • 2. A computing device as claimed in claim 1, in which the computing device retransmits the challenge with a modified generation identifier if the challenge is valid.
  • 3. A method of ascertaining the status of a data processing environment, comprising the following steps: a trusted computing device challenges other devices within a data processing environment, keeps a record of responses made to the challenges and makes the record available, in which a challenge of the challenges generated by the trusted device includes a generation identifier such that any device receiving the challenge can examine the generation identifier in order to establish whether the challenge is directly received from the trusted computing device or whether it has been retransmitted.
  • 4. A method as claimed in claim 3, in which the trusted computing device continues to challenge the devices in the data processing environment so as to maintain an evolving record of the status of the data processing environment.
  • 5. A method as claimed in claim 3, in which the record includes a historical status of the data processing environment.
  • 6. A method as claimed in claim 3, in which the at least one trusted computing device listens to communications within the data processing environment so as to identify the presence of new devices.
  • 7. A method of conducting a transaction in a data processing environment comprising a user device and at least a trusted computing device, each having respective communication capabilities, comprising: a. the trusted computing device keeps a record of computing devices that it has identified within the data processing environment; b. the user device establishes communication with the trusted computing device; c. the trusted computing device sends to the user device at least a portion of the record of computing devices within the data processing environment; d. the user device analyses the record to establish what facilities the user device may access; e. wherein the user device further analyses the record in accordance with a set of security rules to determine what level of trust can be placed on the integrity of the data processing environment.
Priority Claims (1)
Number Date Country Kind
0104673.9 Feb 2001 GB national
US Referenced Citations (205)
Number Name Date Kind
4747040 Blanset et al. May 1988 A
4799156 Shavit et al. Jan 1989 A
4926476 Covey May 1990 A
4933969 Marshall et al. Jun 1990 A
4962533 Kruger et al. Oct 1990 A
4984272 McIlroy et al. Jan 1991 A
5029206 Marino et al. Jul 1991 A
5032979 Hecht et al. Jul 1991 A
5038281 Peters Aug 1991 A
5136711 Hugard et al. Aug 1992 A
5144660 Rose Sep 1992 A
5261104 Bertram et al. Nov 1993 A
5278973 O'Brien et al. Jan 1994 A
5283828 Saunders et al. Feb 1994 A
5325529 Brown et al. Jun 1994 A
5341422 Blackledge et al. Aug 1994 A
5359659 Rosenthal Oct 1994 A
5361359 Tajalli et al. Nov 1994 A
5379342 Arnold et al. Jan 1995 A
5404532 Allen et al. Apr 1995 A
5410707 Bell Apr 1995 A
5414860 Canova et al. May 1995 A
5421006 Jablon et al. May 1995 A
5440723 Arnold et al. Aug 1995 A
5444850 Chang Aug 1995 A
5448045 Clark Sep 1995 A
5454110 Kannan et al. Sep 1995 A
5473692 Davis Dec 1995 A
5483649 Kuznetsov et al. Jan 1996 A
5491750 Bellare et al. Feb 1996 A
5495569 Kotzur Feb 1996 A
5497490 Harada et al. Mar 1996 A
5497494 Combs et al. Mar 1996 A
5504814 Miyahara Apr 1996 A
5504910 Wisor et al. Apr 1996 A
5511184 Lin Apr 1996 A
5530758 Marino et al. Jun 1996 A
5535411 Speed et al. Jul 1996 A
5548763 Combs et al. Aug 1996 A
5555373 Dayan et al. Sep 1996 A
5572590 Chess Nov 1996 A
5619571 Sandstrom et al. Apr 1997 A
5680452 Shanton Oct 1997 A
5680547 Chang Oct 1997 A
5692124 Holden et al. Nov 1997 A
5694590 Thuraisingham et al. Dec 1997 A
5701343 Takashima et al. Dec 1997 A
5706431 Otto Jan 1998 A
5768382 Schneier et al. Jun 1998 A
5771354 Crawford Jun 1998 A
5774717 Porcaro Jun 1998 A
5787175 Carter Jul 1998 A
5809145 Slik et al. Sep 1998 A
5815665 Teper et al. Sep 1998 A
5815702 Kannan et al. Sep 1998 A
5819261 Takahashi et al. Oct 1998 A
5841868 Helbig Nov 1998 A
5841869 Merkling et al. Nov 1998 A
5844986 Davis Dec 1998 A
5845068 Winiger Dec 1998 A
5867646 Benson et al. Feb 1999 A
5887163 Nguyen et al. Mar 1999 A
5889989 Robertazzi et al. Mar 1999 A
5890142 Tanimura et al. Mar 1999 A
5892900 Ginter et al. Apr 1999 A
5892902 Clark Apr 1999 A
5903732 Reed et al. May 1999 A
5917360 Yasutake Jun 1999 A
5922074 Richard et al. Jul 1999 A
5933498 Schneck et al. Aug 1999 A
5937066 Gennaro et al. Aug 1999 A
5937159 Meyers Aug 1999 A
5940513 Aucsmith et al. Aug 1999 A
5958016 Chang et al. Sep 1999 A
5960177 Tanno Sep 1999 A
5966732 Assaf Oct 1999 A
5987605 Hill et al. Nov 1999 A
5987608 Roskind Nov 1999 A
6006332 Rabne et al. Dec 1999 A
6012080 Ozden et al. Jan 2000 A
6021510 Nachenberg Feb 2000 A
6023765 Kuhn Feb 2000 A
6038667 Helbig Mar 2000 A
6067559 Allard et al. May 2000 A
6078948 Podgorny et al. Jun 2000 A
6079016 Park Jun 2000 A
6081830 Schindler Jun 2000 A
6081894 Mann Jun 2000 A
6091956 Hollenberg Jul 2000 A
6098133 Summers et al. Aug 2000 A
6100738 Illegems Aug 2000 A
6115819 Anderson Sep 2000 A
6125114 Blanc et al. Sep 2000 A
6138239 Veil Oct 2000 A
6148342 Ho Nov 2000 A
6154838 Le et al. Nov 2000 A
6157719 Wasilewski et al. Dec 2000 A
6175917 Arrow et al. Jan 2001 B1
6185678 Arbaugh et al. Feb 2001 B1
6203101 Chou et al. Mar 2001 B1
6211583 Humphreys Apr 2001 B1
6253193 Ginter et al. Jun 2001 B1
6253324 Field et al. Jun 2001 B1
6253349 Maeda et al. Jun 2001 B1
6266774 Sampath et al. Jul 2001 B1
6272631 Thomlinson et al. Aug 2001 B1
6275848 Arnold Aug 2001 B1
6289462 McNabb et al. Sep 2001 B1
6292900 Ngo et al. Sep 2001 B1
6304970 Bizzaro et al. Oct 2001 B1
6327533 Chou Dec 2001 B1
6327579 Crawford Dec 2001 B1
6327652 England et al. Dec 2001 B1
6330669 McKeeth Dec 2001 B1
6330670 England et al. Dec 2001 B1
6334118 Benson Dec 2001 B1
6367012 Atkinson et al. Apr 2002 B1
6374250 Ajtai et al. Apr 2002 B2
6389536 Nakatsuyama May 2002 B1
6393412 Deep May 2002 B1
6393556 Arora May 2002 B1
6405318 Rowland Jun 2002 B1
6414635 Stewart et al. Jul 2002 B1
6446203 Aguilar et al. Sep 2002 B1
6449716 Rickey Sep 2002 B1
6477702 Yellin et al. Nov 2002 B1
6487601 Hubacher et al. Nov 2002 B1
6496847 Bugnion et al. Dec 2002 B1
6505300 Chan et al. Jan 2003 B2
6507909 Zurko et al. Jan 2003 B1
6510418 Case et al. Jan 2003 B1
6513156 Bak et al. Jan 2003 B2
6519623 Mancisidor Feb 2003 B1
6529143 Mikkola et al. Mar 2003 B2
6529728 Pfeffer et al. Mar 2003 B1
6530024 Proctor Mar 2003 B1
6539425 Stevens et al. Mar 2003 B1
6604089 Van Horn et al. Aug 2003 B1
6609199 DeTreville Aug 2003 B1
6609248 Srivastava et al. Aug 2003 B1
6622018 Erekson Sep 2003 B1
6650902 Richton Nov 2003 B1
6654800 Rieger, III Nov 2003 B1
6671716 Diedrichsen et al. Dec 2003 B1
6678827 Rothermel et al. Jan 2004 B1
6678833 Grawrock Jan 2004 B1
6681304 Vogt et al. Jan 2004 B1
6684196 Mini et al. Jan 2004 B1
6694434 McGee et al. Feb 2004 B1
6697944 Jones et al. Feb 2004 B1
6701440 Kim et al. Mar 2004 B1
6732276 Cofler et al. May 2004 B1
6751680 Langerman et al. Jun 2004 B2
6757710 Reed Jun 2004 B2
6757824 England Jun 2004 B1
6757830 Tarbotton et al. Jun 2004 B1
6772331 Hind et al. Aug 2004 B1
6775779 England et al. Aug 2004 B1
6778968 Gulati Aug 2004 B1
6785015 Smith et al. Aug 2004 B1
6799270 Bull et al. Sep 2004 B1
6837229 Mizutani Jan 2005 B2
6853988 Dickinson et al. Feb 2005 B1
6868406 Ogg et al. Mar 2005 B1
6889325 Sipman et al. May 2005 B1
6892307 Wood et al. May 2005 B1
6931545 Ta et al. Aug 2005 B1
6948069 Teppler Sep 2005 B1
6948073 England et al. Sep 2005 B2
6965816 Walker Nov 2005 B2
6988250 Proudler et al. Jan 2006 B1
7051343 Bracha et al. May 2006 B2
7058807 Grawrock et al. Jun 2006 B2
7076655 Griffin et al. Jul 2006 B2
7076804 Kershenbaum et al. Jul 2006 B2
7159210 Griffin et al. Jan 2007 B2
7181608 Fallon et al. Feb 2007 B2
7194623 Proudler et al. Mar 2007 B1
7302698 Proudler et al. Nov 2007 B1
7376974 Proudler et al. May 2008 B2
7457951 Proudler et al. Nov 2008 B1
7529919 Lampson et al. May 2009 B2
7669238 Fee et al. Feb 2010 B2
7865876 Griffin et al. Jan 2011 B2
7877799 Proudler Jan 2011 B2
20010037450 Metlitski et al. Nov 2001 A1
20010051515 Rygaard Dec 2001 A1
20020012432 England et al. Jan 2002 A1
20020023212 Proudler Feb 2002 A1
20020089528 Hay et al. Jul 2002 A1
20020120575 Pearson et al. Aug 2002 A1
20020120876 Pearson et al. Aug 2002 A1
20020184488 Amini et al. Dec 2002 A1
20020184520 Bush et al. Dec 2002 A1
20030009685 Choo et al. Jan 2003 A1
20030014466 Berger et al. Jan 2003 A1
20030018892 Tello Jan 2003 A1
20030037237 Abgrall et al. Feb 2003 A1
20030041250 Proudler Feb 2003 A1
20030084436 Berger et al. May 2003 A1
20030145235 Choo Jul 2003 A1
20030191957 Hypponen et al. Oct 2003 A1
20030221124 Curran et al. Nov 2003 A1
20040073617 Milliken et al. Apr 2004 A1
20050256799 Warsaw et al. Nov 2005 A1
Foreign Referenced Citations (58)
Number Date Country
2187855 Jun 1997 CA
0 304 033 Feb 1989 EP
0421409 Apr 1991 EP
0510224 Oct 1992 EP
0 580 350 Jan 1994 EP
0 825 511 Feb 1998 EP
0 849 657 Jun 1998 EP
0849680 Jun 1998 EP
0 465 016 Dec 1998 EP
0893751 Jan 1999 EP
0 895 148 Feb 1999 EP
0926605 Jun 1999 EP
0992958 Apr 2000 EP
1 030 237 Aug 2000 EP
1 056 014 Nov 2000 EP
1049036 Nov 2000 EP
1055990 Nov 2000 EP
1056010 Nov 2000 EP
1076279 Feb 2001 EP
1085396 Mar 2001 EP
1107137 Jun 2001 EP
2317476 Mar 1998 GB
2 336 918 Nov 1999 GB
2 353 885 Mar 2001 GB
2361153 Oct 2001 GB
9214493 Aug 1997 JP
10083382 Mar 1998 JP
10293704 Oct 1998 JP
10510647 Oct 1998 JP
10293705 Nov 1998 JP
11003248 Jan 1999 JP
2001-0016655 Jan 2001 JP
2001-0076655 Jan 2001 JP
9325024 Dec 1993 WO
9411967 May 1994 WO
9524696 Sep 1995 WO
9527249 Oct 1995 WO
9729416 Aug 1997 WO
9826529 Jun 1998 WO
9836517 Aug 1998 WO
PCTUS9844402 Aug 1998 WO
9840809 Sep 1998 WO
9845778 Oct 1998 WO
PCTUS9815082 Feb 1999 WO
0031644 Jun 2000 WO
0048062 Aug 2000 WO
0048063 Aug 2000 WO
0054125 Sep 2000 WO
0054126 Sep 2000 WO
PCTUS0052900 Sep 2000 WO
PCTUS0058859 Oct 2000 WO
0073913 Dec 2000 WO
PCTUS0073904 Dec 2000 WO
PCTUS0113198 Feb 2001 WO
0123980 Apr 2001 WO
PCTUS0127722 Apr 2001 WO
PCTUS0165334 Sep 2001 WO
PCTUS0165366 Sep 2001 WO
Related Publications (1)
Number Date Country
20020120575 A1 Aug 2002 US