1. Field of the Invention
The present invention is related to anti-malware technology, and more particularly, to measuring a static and dynamic security rating of processes.
2. Description of the Related Art
Detection of viruses has been a concern throughout the era of the personal computer. With the growth of communication networks, such as the Internet, and increasing interchange of data, including the rapid growth in the use of e-mail for communications, the infection of computers through communications or file exchange is an increasingly significant consideration. Infections take various forms, but are typically related to computer viruses, trojan programs, or other forms of malicious code. Recent incidents of e-mail mediated virus attacks have been dramatic both for the speed of propagation and for the extent of damage, with Internet service providers (ISPs) and companies suffering service problems and a loss of e-mail capability. In many instances, attempts to adequately prevent file exchange or e-mail mediated infections significantly inconvenience computer users. Improved strategies for detecting and dealing with virus attacks are desired.
One conventional technique for detecting viruses is signature scanning. Signature scanning systems use sample code patterns extracted from known malicious code and scan for the occurrence of these patterns in other program code. In some cases, the program code being scanned is first decrypted through emulation, and the resulting code is scanned for virus signatures or function signatures. A primary limitation of this signature scanning method is that only known malicious code is detected, that is, only code that matches the stored sample signatures of known malicious code is identified as infected. Any virus or malicious code not previously identified, and any virus or malicious code created after the last update of the signature database, will not be detected. Thus, newly released viruses are not detected by this method, nor are viruses in which the code containing the previously extracted signature has been overwritten.
In addition, signature analysis fails to identify the presence of a virus if the signature is not aligned in the code in the expected fashion. Alternatively, the authors of a virus may obscure its identity by opcode substitution or by inserting dummy or random code into virus functions. Nonsense code can be inserted that alters the signature of the virus enough to render it undetectable by a signature scanning program, without diminishing the ability of the virus to propagate and deliver its payload.
Another virus detection strategy is integrity checking. Integrity checking systems extract a code sample from known benign application program code. The code sample is stored, together with information from the program file, such as the executable program header and the file length, as well as the date and time of the sample. The program file is checked at regular intervals against this database to ensure that the program file has not been modified. Integrity checking programs generate long lists of modified files when a user upgrades the operating system of the computer or installs or upgrades application software. The main disadvantage of an integrity check-based virus detection system is that many warnings of virus activity are issued when any modification of an application program is performed. It is difficult for a user to determine whether a warning represents a legitimate attack on the computer system.
Checksum monitoring systems detect viruses by generating a cyclic redundancy check (CRC) value for each program file. Modification of the program file is detected by a variation in the CRC value. Checksum monitors improve on integrity check systems in that it becomes difficult for malicious code to defeat the monitoring. On the other hand, checksum monitors exhibit the same limitations as integrity checking systems: false warnings are issued, and it is difficult to identify which warnings represent actual viruses or infections.
Behavior interception systems detect virus activity by interacting with the operating system of the target computer and monitoring for potentially malicious behavior. When malicious behavior is detected, the action is blocked and the user is informed that a potentially dangerous action is about to take place. The potentially malicious code can be allowed to perform this action by the user, which makes the behavior interception system somewhat unreliable, because the effectiveness of the system depends on the user input. In addition, resident behavior interception systems are sometimes detected and disabled by malicious code.
Another conventional strategy for detecting infections is the use of bait files. This strategy is typically used in combination with other virus detection strategies to detect an existing and active infection, that is, malicious code that is already running on the target computer and modifying files. The virus is detected the moment the bait file is modified. However, many viruses are aware of bait files and do not modify files that are too small, that have predetermined content in the file name, or that are recognizable by their structure.
One of the problems in the field of anti-virus software is that many users are unwilling to wait long for the anti-virus software to do its work. In fact, most users will wait a fraction of a second, perhaps a second or two, when starting an application, but not longer than that. On the other hand, in such a relatively short period of time only fairly rudimentary anti-virus checks are possible, which is a problem for the anti-virus software vendor, because the need to check the executable file being launched for viruses must be balanced against the time that a thorough anti-virus check takes.
It is apparent that improved techniques for detecting viruses and other malicious types of code are desirable.
The present invention is intended as a system and a method for security rating of processes that substantially obviates one or several of the disadvantages of the related art.
In one aspect of the invention there is provided a system, method, and computer program product for security rating of processes for malware presence, including (a) detecting an attempt to execute a file on a computer; (b) performing an initial risk assessment of the file and assigning an initial (static) security rating S; (c) analyzing the initial risk pertaining to the file and, if it exceeds a predetermined threshold, notifying the user; (d) starting the process from code in the file; (e) continuing to monitor the process for any suspicious activities; (f) once the process executes a suspicious operation, dynamically changing the security rating to D; (g) if the security rating D is, for example, 50% or less, continuing to execute the process; (h) if the security rating D is greater than, for example, 50%, notifying the user, with the process most likely continuing to execute unless the user instructs otherwise; (i) if the security rating D is, for example, 75% or less, notifying the user and, depending on the user response, probably continuing the process execution, but restricting access to most computer resources; (j) if the security rating D is greater than 75%, blocking the actions of the process; (k) terminating the process, optionally notifying the user, and curing the process; (l) after the process has been terminated, dealing with the corrupted file; (m) if the corrupted file is a system component or a 'useful' file, attempting to cure the file; (n) if the corrupted file is an independent (non-system) file, deleting the file. The percentages are user-set and changeable, either at installation or during normal use. The rules used to determine the rating are also modifiable and changeable dynamically, either by the user directly, or by the software accessing a server, for example, for updates, and downloading new/additional rules, modified old rules, the threat ratings associated with particular events, rules regarding new threats, and so on.
The process risk analysis is based on the security rating R. The security rating R varies from 'safe' to 'dangerous' (high) and is calculated from 0 to 100 percent, where 0 is the safest level and 100 is the most dangerous level. The security rating R is a number that can be viewed as a combination of a static rating and a dynamic rating.
Additional features and advantages of the invention will be set forth in the description that follows, and in part will be apparent from the description, or may be learned by practice of the invention. The advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are intended to provide further explanation of the invention as claimed.
The accompanying drawings, which are included to provide further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
In the drawings:
Reference will now be made in detail to the preferred embodiment of the present invention, examples of which are illustrated in the accompanying drawings.
In one embodiment of the invention, a system and method use process risk analysis based on the security rating R. The security rating R varies from 'safe' to 'dangerous' (high) and is calculated from 0 to 100 percent, where 0 is the safest level and 100% is the most dangerous level. As an example, a process with a rating of 0-25% is rated 'safe', a process with a rating of 25-75% is rated 'moderately dangerous' or 'suspicious', and a process with a rating in excess of 75% is rated 'dangerous' (high). As will be appreciated, these numbers are exemplary and subject to change.
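As an illustration only, the banding just described can be expressed as a short helper. The 25% and 75% boundaries are the exemplary values from the preceding paragraph and, as noted, would be configurable in a real system:

```python
# A minimal sketch of the exemplary rating bands; the 25% and 75%
# boundaries are the example values from the text, not fixed constants.

def classify_rating(r: float) -> str:
    """Map a security rating R (0-100 percent) to its verbal category."""
    if r <= 25:
        return 'safe'
    if r <= 75:
        return 'moderately dangerous'  # also called 'suspicious'
    return 'dangerous'

print(classify_rating(10))  # -> safe
print(classify_rating(50))  # -> moderately dangerous
print(classify_rating(90))  # -> dangerous
```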
The security rating R is a number that can be divided into two parts: part one is a static rating and part two is a dynamic rating. Before the file's execution is invoked, certain criteria of the file are analyzed, such as the name of the file, the file size, the file's location, compression, whether the file is packed, whether the file was received from a CD-ROM, etc. These criteria determine the static rating S of the file.
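A minimal sketch of how such criteria might be folded into the static rating S follows. The individual weights are hypothetical assumptions, since the description does not fix them; only the small-file heuristic and the 5% attribute increment, both discussed further below, come from the text itself:

```python
# Hypothetical weights for the static rating S; the specific numbers are
# assumptions, not values prescribed by the description.

def static_rating(file_info: dict) -> float:
    """Combine file criteria into a static rating S (0-100 percent)."""
    s = 0.0
    if file_info.get('source') == 'internet':
        s += 20.0   # downloaded files are riskier than CD-ROM files
    if file_info.get('packed'):
        s += 15.0   # packing is often used to defeat signature scanning
    if file_info.get('size_bytes', 0) < 50 * 1024:
        s += 10.0   # small executables are more suspicious (see below)
    if file_info.get('hidden') or file_info.get('system'):
        s += 5.0    # rarely used attributes, per the description
    if not file_info.get('signed', False):
        s += 10.0   # unsigned files are presumably more suspicious
    return min(s, 100.0)

print(static_rating({'source': 'internet', 'packed': True, 'size_bytes': 30000}))
```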
After that, an emulator is launched and the behavior of the executable file is emulated, producing a result for each event that takes place during the initial emulation. Each event is compared against a stored list of factors and weights, and an individual safety or danger rating is assigned to each event and process. The final rating is the sum of the ratings of all events and processes. The file is also subjected to an anti-virus scan using signature analysis, to detect signatures that can only be attributed to certain classes of viruses. This generates a dynamic rating D.
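The summation just described might be sketched as follows, assuming a hypothetical table that maps emulated events to danger weights:

```python
# Hypothetical event weights; the description only says that each event is
# compared against a stored list of factors and weights and the results summed.
EVENT_WEIGHTS = {
    'writes_other_process_memory': 70,
    'copies_itself': 70,
    'creates_executable_in_system_folder': 100,
    'checks_antivirus_service_status': 10,
}

def dynamic_rating(events: list) -> float:
    """Sum the per-event ratings observed during emulation, capped at 100."""
    return float(min(sum(EVENT_WEIGHTS.get(e, 0) for e in events), 100))

print(dynamic_rating(['copies_itself', 'checks_antivirus_service_status']))  # 80.0
```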
Every executed process gets assigned an initial security rating R comprising a static rating S and a dynamic rating D, R=R(D,S).
Even if an initial security rating S is very low (e.g., from 0-25%), this rating could change anytime the process performs any ‘suspicious’ or ‘dangerous’ operations.
For example, the source of the file can change the rating of the process. A file that 'arrived' on the computer on a CD-ROM is much less likely to be infected than a file downloaded from the Internet. For downloaded files, the source of the download, such as the URL, should preferably be considered. Whether the file is packed can also be a risk factor, since malware files are often packed in order to defeat signature-based virus detection. The current location and/or path to the file can also be considered, since some executable files install themselves in particular directories, especially directories that are infrequently used.
Another factor to consider is the size of the file. For example, a relatively small executable file executed for the first time is much more suspicious than a large executable file. This is due to the fact that transmission of large, multi-megabyte infected files is frequently impractical, particularly from unsuspecting 'zombified' computers. If a zombified computer sends out very large infected files, it will not be able to send very many of them, if only due to hardware and bandwidth limitations. On the other hand, sending a large number of emails with a relatively small attachment is much more practical. Typical malware files sent out in this manner are on the order of 50-100 kilobytes (which, if packed, reduces to something on the order of 20-50 kilobytes). A file smaller than 50 kilobytes can therefore be considered a candidate for being a 'suspicious' file.
Another factor of risk that can be considered is whether a relatively small executable file is an installer file itself. In other words, the file, although small, triggers downloading a larger executable file from a web server or a file server on the Internet. Still another possible factor in the risk analysis is how the file was created, e.g., which process created this file, whether another file had been downloaded prior to the file being created on the disk—thus, knowing the URL of the downloaded file, the risk of the file that was just created can be assessed. Also, which directory/folder the file was created in (e.g., Temporary Internet Files is a higher risk than My Documents, root directory is higher risk than a lower-level directory) can be another factor, etc.
Another factor is whether or not a file is digitally signed, with unsigned files presumably being more suspicious than signed ones.
File attributes such as 'Archived' and 'Read Only' are used often, but attributes such as 'Hidden' or 'System' are used rarely, and their presence on an executed file is an indication that the file is suspicious. These attributes add, e.g., 5% to the process security rating.
As another example, if a process writes into the memory of other processes, or tries to manipulate a system service, the process is assigned a 'dangerous' security rating. If the process copies itself anywhere, the process is assigned a 70% security rating. If the process creates executable files in folders such as WINDOWS, SYSTEM32 or DRIVERS, the process is assigned a 100% security rating as an extremely dangerous process. Creation of Alternate Data Streams (ADS) in executable files and folders causes a 100% rating to be assigned to the process. Creation of certain files, such as autorun.inf and runtime.sys, causes a 'dangerous' rating to be assigned to the process. Deletion and modification of system files causes a 'dangerous' rating to be assigned to the process. Access to files that contain passwords and other confidential information causes a 'dangerous' rating to be assigned to the process. Deletion of any files outside the program's folder causes at least a 'suspicious' or 'moderately dangerous' rating to be assigned to the process. A search for EXE, DLL or SYS files in the system folders causes a 'suspicious' or 'moderately dangerous' rating to be assigned to the process. Access to C:\WINDOWS\system32\drivers\hosts or C:\boot.ini assigns a 'dangerous' rating to the process.
Registration of a driver or service causes a 'suspicious' or 'moderately dangerous' rating to be assigned to the process. Deletion or manipulation of antivirus or firewall services causes a 'dangerous' rating to be assigned to the process, for example, ChangeServiceConfig(BITS) or DeleteService('McShield').
Access to keys that store passwords causes a 'dangerous' rating to be assigned to the process, for example, Software\Ghisler\Total Commander, Software\CoffeeCup Software\Internet\Profiles, Software\Mail.Ru\Agent\mra_logins, SOFTWARE\RIT\The Bat!, SOFTWARE\Far\Plugins\FTP\Hosts. Creation of keys in the service registration area causes a 'suspicious' or 'moderately dangerous' rating to be assigned to the process. However, deletion of existing keys causes a 'dangerous' rating to be assigned to the process, for example, deletion of the key \Registry\Machine\SYSTEM\ControlSet001\Services\SymEvent or \Registry\Machine\SYSTEM\ControlSet001\Services\SYMTDI.
The total security rating assigned to the process allows the system to analyze only the processes with a 'dangerous' rating, such as, e.g., greater than 75%. For dangerous processes, the system can block access to the Internet, permit access to the Internet (optionally upon user approval), and restrict the usage of memory and other computer resources. The system, in one exemplary embodiment, uses a Host-based Intrusion Prevention System (HIPS) method, which limits the availability of resources to potentially 'dangerous' processes. HIPS can be used with virtualization: for example, if the process tries to create a file in a system folder, the system does not allow the process to do so, but at the same time gives the process a higher rating and informs the process that the file has been created (even though, in reality, the file was not created). This allows the system to search for more complex viruses.
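A conceptual sketch of this virtualized handling appears below. The interception mechanism itself (hooking the file-creation API) is outside the scope of the sketch, and the rating increment is an assumption, since the text says only that the process receives a higher rating:

```python
class MonitoredProcess:
    """Minimal stand-in for a monitored process and its virtualized view."""
    def __init__(self):
        self.rating = 0.0
        self.virtual_files = set()

def on_create_file_in_system_folder(proc: MonitoredProcess, path: str) -> bool:
    """Deny the real operation, raise the rating, and report success."""
    proc.rating = min(proc.rating + 25.0, 100.0)  # hypothetical increment
    proc.virtual_files.add(path)  # the process will 'see' this file later
    return True                   # success is reported; no file is created

p = MonitoredProcess()
on_create_file_in_system_folder(p, r'C:\WINDOWS\SYSTEM32\example.sys')
print(p.rating, p.virtual_files)
```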
The processes can be grouped into different groups. For example, if the process uses a system password, it can be placed in the group Password. If the process has access to the Internet, it can be placed in the group Internet. At the same time, each group has access only to certain resources. If the process is placed in the group Password, it should have only restricted access to hardware resources; for example, it is forbidden from accessing the network through the network card.
There is a set of rules according to which the security rating is calculated. The rules are divided into the following exemplary categories: services and drivers, processes, system registry, file operations, Internet, system privileges, rootkits, and defense from anti-rootkits. Each rule is characterized by its own parameters, such as a unique identifier, an API function, a decision, and whether it is acceptable to recognize the process as malware according to this rule.
The following are exemplary rules (note that in a real system, there are typically more rules than illustrated below):
Rule ‘loading a malware driver through the low-level API ntdll.dll’
Rule identifier: 84
API function: loading a driver (NtLoadDriver)
Condition for argument 1: Includes as input <services/drivers of malware>
Condition for argument 2: *
Conditions for argument 3 . . . N: *
Rating: 100% for a single operation, 100% for 2-3 operations, 100% for >3 operations
Based on this rule, can the process be regarded as malware? Yes
Rule ‘Checking status of antivirus service’
Rule identifier: 8
API function: Checking the status of antivirus services (QueryServiceStatus)
Condition for argument 1: Includes as input <antivirus services>
Condition for argument 2: *
Conditions for argument 3 . . . N: *
Rating: 10% for a single operation, 30% for 2-3 operations, 60% for >3 operations
Based on this rule, can the process be regarded as malware? No
Rule ‘Manipulation of Autorun.inf file (creation)’
Rule identifier: 44
API function: Creation/opening of file (CreateFile)
Condition for argument 1: Has as input ‘autorun.inf’
Condition for argument 2: *
Conditions for argument 3 . . . N: *
Rating: 100% for a single operation, 100% for 2-3 operations, 100% for >3 operations
Based on this rule, can the process be regarded as malware? Yes
Rule ‘start of service BITS (use of Downloader)’
Rule identifier: 18
API function: Start of service/driver (StartService)
Condition for argument 1: Includes BITS as input
Condition for argument 2: *
Conditions for argument 3 . . . N: *
Rating: 90% for a single operation, 90% for 2-3 operations, 90% for >3 operations
Based on this rule, can the process be regarded as malware? Yes
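One plausible way to represent such rules in code is as records keyed by rule identifier, with an escalation table that maps the observed operation count to a rating. The fields below mirror the four exemplary rules above; the representation itself is an assumption, not a format prescribed by the description:

```python
from dataclasses import dataclass

@dataclass
class Rule:
    rule_id: int
    api_function: str               # e.g., 'NtLoadDriver', 'CreateFile'
    arg1_condition: str             # match condition for argument 1
    ratings: tuple                  # (single op, 2-3 ops, >3 ops), in percent
    malware_verdict_allowed: bool   # may the process be deemed malware?

RULES = [
    Rule(84, 'NtLoadDriver',       '<services/drivers of malware>', (100, 100, 100), True),
    Rule(8,  'QueryServiceStatus', '<antivirus services>',          (10, 30, 60),    False),
    Rule(44, 'CreateFile',         'autorun.inf',                   (100, 100, 100), True),
    Rule(18, 'StartService',       'BITS',                          (90, 90, 90),    True),
]

def rule_rating(rule: Rule, op_count: int) -> int:
    """Select the rating that corresponds to the observed operation count."""
    if op_count <= 1:
        return rule.ratings[0]
    if op_count <= 3:
        return rule.ratings[1]
    return rule.ratings[2]

print(rule_rating(RULES[1], 2))  # -> 30
```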
Groups of processes and services can be divided into categories such as antivirus processes, antivirus services, windows and elements of antivirus interfaces, system services, system processes, etc.
The security rating can also be increased based on the properties of a system call made by the process. Depending on the system call and its properties, the risk level for the computer system can change. For example, the properties of the system call can include how many times the system call was made by the process. Another example is a match between a system call parameter and, for example, the name of one of the system services or the name of a system registry key. The names of critical system objects, names of system services, names of antivirus objects, etc. can be combined into various groups, and for most system calls there will be a corresponding group. Thus, if the process makes a system call with parameters matching those in a group, the security rating of the process will increase.
As noted earlier, the process can be placed into a group, where all the processes in the group have the same permissions for related activities. If a process attempts an activity permitted to another group, but not to its group, this attempt would be blocked, and the process' rating can be raised. Examples of such activities are local network access, Internet access, file access, system registry access, password-related activities, activities that require system privileges and activities that require OS kernel privileges.
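A sketch of this group-based restriction is shown below. The group names follow the Password and Internet examples above; the permission sets and the rating increment are assumptions:

```python
# Hypothetical permission sets per group; e.g., the Password group is
# forbidden from reaching the network, per the example above.
GROUP_PERMISSIONS = {
    'Password': {'file_access', 'registry_access'},
    'Internet': {'internet_access', 'file_access'},
}

def attempt_activity(process: dict, activity: str) -> bool:
    """Block activities not permitted to the process's group; raise its rating."""
    allowed = GROUP_PERMISSIONS.get(process['group'], set())
    if activity not in allowed:
        process['rating'] = min(process['rating'] + 10.0, 100.0)  # assumed step
        return False  # the attempt is blocked
    return True

p = {'group': 'Password', 'rating': 20.0}
print(attempt_activity(p, 'internet_access'), p['rating'])  # False 30.0
```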
In step 110, the system continues to monitor the process for any suspicious activities. If the system detects any suspicious activities performed by the process (step 112), the system automatically updates the security rating of the process to D (step 116).
In step 118, if the security rating D is greater than 50%, the process is considered ‘suspicious’ or ‘moderately dangerous’. The system will notify the user (step 120) and continue to execute the process (step 114), unless the user instructs the system otherwise.
In step 122, if the security rating D is greater than 75%, the process is considered 'dangerous', which indicates that malware is present, or is likely to be present. Then, in step 124, execution of the file is blocked, and the process is terminated in step 126. In step 128, the user may optionally be notified of the problem. The system may try to cure the process in step 130; if the process is cured, process execution is permitted in step 114. The process can be cured, for example, by downloading replacement code from the Internet and replacing the damaged code, or by restoring the file from a trusted backup, and then relaunching the process.
Once the system terminates the process in step 126, the system may have to deal with the corrupted file. If the corrupted file is a system component or a 'useful' file (an example of a 'useful' file is any user application, such as Microsoft Word or Microsoft Excel) (see step 132), then the system can try to cure the file the same way it cured the process, and execution of the file will be continued (step 114). However, if the corrupted file is an independent executable file, the system will try to cure it and, if that is impossible, quarantine or delete the file and continue to monitor the process for suspicious activities (step 110).
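Putting steps 110-132 together, a minimal sketch of the decision loop might look as follows. The helper functions are hypothetical placeholders for the notification, termination, and curing behavior described above:

```python
# Hypothetical helper stubs; a real system would hook into the OS and the UI.
def notify_user(proc):
    print(f"notifying user about process {proc}")

def block_and_terminate(proc):
    print(f"blocking and terminating process {proc}")

def try_cure(proc) -> bool:
    return False  # e.g., replace damaged code or restore from a trusted backup

def handle_rating_update(proc, d: float, user_allows=lambda p: True) -> str:
    """Exemplary decision flow for an updated dynamic rating D (steps 118-130)."""
    if d > 75:                      # 'dangerous' (step 122)
        block_and_terminate(proc)   # steps 124 and 126
        notify_user(proc)           # step 128, optional
        if try_cure(proc):          # step 130
            return 'relaunched'     # step 114
        return 'quarantined or deleted'
    if d > 50:                      # 'suspicious' (step 118)
        notify_user(proc)           # step 120
        if not user_allows(proc):
            block_and_terminate(proc)
            return 'terminated'
    return 'running'                # step 114

print(handle_rating_update('example.exe', 60))
```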
With reference to
The personal computer 20 may further include a hard disk drive 27 for reading from and writing to a hard disk, not shown, a magnetic disk drive 28 for reading from or writing to a removable magnetic disk 29, and an optical disk drive 30 for reading from or writing to a removable optical disk 31 such as a CD-ROM, DVD-ROM or other optical media. The hard disk drive 27, magnetic disk drive 28, and optical disk drive 30 are connected to the system bus 23 by a hard disk drive interface 32, a magnetic disk drive interface 33, and an optical drive interface 34, respectively. The drives and their associated computer-readable media provide non-volatile storage of computer readable instructions, data structures, program modules/subroutines, where each of the steps described above can be a separate module, or several steps can be aggregated into a single module, and other data for the personal computer 20. Although the exemplary environment described herein employs a hard disk, a removable magnetic disk 29 and a removable optical disk 31, it should be appreciated by those skilled in the art that other types of computer readable media that can store data accessible by a computer, such as magnetic cassettes, flash memory cards, digital video disks, Bernoulli cartridges, random access memories (RAMs), read-only memories (ROMs) and the like may also be used in the exemplary operating environment.
A number of program modules may be stored on the hard disk, magnetic disk 29, optical disk 31, ROM 24 or RAM 25, including an operating system 35. The computer 20 includes a file system 36 associated with or included within the operating system 35, one or more application programs 37, other program modules 38 and program data 39. A user may enter commands and information into the personal computer 20 through input devices such as a keyboard 40 and pointing device 42. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner or the like. These and other input devices are often connected to the processing unit 21 through a serial port interface 46 coupled to the system bus, and can be connected by other interfaces, such as a parallel port, game port or universal serial bus (USB). A monitor 47 or some other type of display device is also connected to the system bus 23 via an interface, such as a video adapter 48. In addition to the monitor 47, personal computers typically include other peripheral output devices (not shown), such as speakers and printers.
The personal computer 20 may operate in a networked environment using logical connections to one or more remote computers 49. The remote computer (or computers) 49 may be represented by another personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the personal computer 20, although only a memory storage device 50 has been illustrated. The logical connections include a local area network (LAN) 51 and a wide area network (WAN) 52. Such networking environments are common in offices, enterprise-wide computer networks, Intranets and the Internet.
When used in a LAN networking environment, the personal computer 20 is connected to the local network 51 through a network interface or adapter 53. When used in a WAN networking environment, the personal computer 20 typically includes a modem 54 or other means for establishing communications over the wide area network 52, such as the Internet. The modem 54, which may be internal or external, is connected to the system bus 23 via the serial port interface 46. In a networked environment, program modules depicted relative to the personal computer 20, or portions thereof, may be stored in the remote memory storage device. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.
Having thus described a preferred embodiment, it should be apparent to those skilled in the art that certain advantages of the described method and apparatus can be achieved. It should also be appreciated that various modifications, adaptations and alternative embodiments thereof may be made within the scope and spirit of the present invention. The invention is further defined by the following claims.