1. Field of the Invention
The present invention relates generally to computer security and, more particularly, to methods of and systems for accurately identifying devices that have behaved maliciously and for proactively denying such devices access to services and data through computer networks.
2. Description of the Related Art
In this Information Revolution, nearly all information ever available in any form is being made available through the Internet. At the same time, access to the Internet and all the information it holds is growing, not just organically as individuals seek access to the Internet but also through organized, motivated campaigns to “bridge the digital divide” and to bring Internet access to all who do not yet have it. Thus, the trend is toward everyone having access to all information in the world.
The infrastructure of the Internet is both widely adopted and highly convenient. Accordingly, some use the Internet to provide services that would otherwise be provided by legacy infrastructure, such as physical “brick and mortar” locations where activities such as shopping, banking, telephone conversations, and postal transactions have traditionally taken place. In addition, the Internet is increasingly used to share information between limited groups of geographically separated parties.
One of the greatest challenges in ubiquitous data access is security. Data is often personal, confidential, and of high value. Accordingly, security is of paramount importance for much of the information and many of the services that are accessible through the Internet.
Security failures, in which a computer controlled by a person with malicious intent gains access to resources that the person is not authorized to obtain, are all too common. What is needed is a way to more effectively stop unauthorized intrusions into secure systems.
In accordance with the present invention, a device reputation server can recognize malicious devices used in prior attacks and prevent further attacks by the malicious devices, even attacks on systems that have not previously interacted with the malicious devices. Server computers require a digital fingerprint of any client devices prior to providing any service to such client devices. Logging of network activity by such servers includes digital fingerprints of all logged events associated with each client device to thereby identify a remotely located, physical computing device associated with each logged event.
Accordingly, logs of events of an attack include the digital fingerprint of the device perpetrating the attack. When an attack is detected or discovered, the attacked server reports the attack and the digital fingerprint of the perpetrating device to a device reputation server. The device reputation server uses the report to improve future assessments of the reputation of the device associated with the reported digital fingerprint.
The device reputation server stores data representing attacks reported by numerous servers, including digital fingerprints of perpetrating client devices. Using such data, the device reputation server assesses trustworthiness of a given device based on the number, recency, frequency, and severity, for example, of attacks that have been perpetrated by the given device.
The attacks reported to the device reputation server are from a large number of servers and the device reputation server serves requests for device reputations by any of a large number of servers. As a result, an attack on one server computer can affect the reputation of the device for, and therefore prevent future attacks on, a large number of other servers. Compared to other forms of device identification such as IP (Internet Protocol) and MAC (Media Access Control) addresses, digital fingerprints are complex, very tightly coupled to a particular computing device, and extremely difficult to discover or spoof. Accordingly, it is extremely difficult for a computing device to have access to the digital fingerprint of another computing device or to alter its own digital fingerprint.
Many believe that a very large majority of attacks on networked computers are perpetrated by a very small minority of users of networked computers. Most such users use a single computer that has been modified with expensive tools for hiding real IP and MAC addresses and otherwise obscuring any digital trails that might identify the user. Once the device reputation server has determined that the single computer has been used to perpetrate fraud, the single computer is no longer of any use for malicious activity among a very large number of servers.
Other systems, methods, features and advantages of the invention will be or will become apparent to one with skill in the art upon examination of the following figures and detailed description. It is intended that all such additional systems, methods, features and advantages be included within this description, be within the scope of the invention, and be protected by the accompanying claims. Component parts shown in the drawings are not necessarily to scale, and may be exaggerated to better illustrate the important features of the invention. In the drawings, like reference numerals may designate like parts throughout the different views, wherein:
In accordance with the present invention, a device reputation server 108 can recognize malicious devices used in prior attacks and prevent further attacks by the malicious devices, even attacks on systems or other devices that have never interacted with any of the malicious devices before. Briefly, and in a manner described more completely below, a server computer 104 requires a digital fingerprint of any client devices, such as client computer 102, during authentication. Device reputation server 108 stores data representing attacks that have been detected and digital fingerprints associated with those attacks. By querying device reputation server 108, server computer 104 can determine whether client computer 102 is trustworthy for the particular services provided by server computer 104.
In providing a service, server computer 104 logs network events, including the digital fingerprint of devices involved in the network events. Accordingly, logs of events of an attack include the digital fingerprint of the device perpetrating the attack. Server computer 104 reports the attack and the digital fingerprint of the perpetrating device to device reputation server 108. Device reputation server 108 uses the report to improve future assessments of the reputation of the device associated with the reported digital fingerprint. It should be noted that device reputation server 108 provides a similar reputation assessment and reporting service to a large number of other server computers. As a result, an attack on one server computer can affect the reputation of the device for, and therefore prevent future attacks on, a large number of other servers.
Briefly, digital fingerprints are unique identifiers of individual devices based on hardware and system configuration data of each device. Compared to other forms of device identification such as IP (Internet Protocol) and MAC (Media Access Control) addresses, digital fingerprints are complex, very tightly coupled to a particular computing device, and extremely difficult to discover or spoof. In addition, and perhaps most significant, an advanced class of digital fingerprint is not predetermined by any single manufacturing entity or device supplier. Instead, the advanced digital fingerprint is derived or generated from multiple non-user configurable data strings that originate from various component manufacturers, and/or from user-configurable data entered or created by a user of the device being fingerprinted. In this sense, the advanced digital fingerprint is an “after-market” unique identifier that is derived or generated by a special fingerprinting application that is stored on the device, or that has access to data stored in memory locations on the target device. Accordingly, it is extremely difficult for a computing device to have access to the digital fingerprint of another computing device or to alter its own digital fingerprint. Digital fingerprints are known and are described, e.g., in U.S. Pat. No. 5,490,216 (sometimes referred to herein as the '216 Patent), and in U.S. Patent Application Publications 2007/0143073, 2007/0126550, 2011/0093920, and 2011/0093701 (collectively, “the related U.S. Patent Applications”), the descriptions of which are fully incorporated herein by reference.
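By way of illustration only, the derivation of an advanced digital fingerprint from multiple component-level data strings can be sketched as follows. This is a minimal sketch, not the method of the '216 Patent or the related U.S. Patent Applications; the function name and the example component identifiers are hypothetical.

```python
import hashlib

def generate_fingerprint(component_ids, user_data=""):
    # Combine multiple non-user-configurable data strings originating
    # from various component manufacturers (e.g., CPU serial, disk
    # serial, network adapter identifiers) with optional
    # user-configurable data, then hash the result so the fingerprint
    # is extremely difficult to discover or spoof.
    combined = "|".join(sorted(component_ids)) + "|" + user_data
    return hashlib.sha256(combined.encode("utf-8")).hexdigest()
```

Because the fingerprint is derived after market from many independent data sources, no single manufacturing entity predetermines it, and altering any one component identifier yields an entirely different value.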
Many believe that a very large majority of attacks on networked computers are perpetrated by a very small minority of users of networked computers. Most such users use a single computer that has been modified with expensive tools for hiding real IP and MAC addresses and otherwise obscuring any digital trails that might identify the user. Once device reputation server 108 has determined that the single computer has been used to perpetrate fraud, the fingerprint of the single computer is blacklisted and no longer of any use for malicious activity among a very large number of servers. The single computer will be immediately identifiable by its blacklisted fingerprint as a malicious device whenever it attempts to access a secure system.
Server computer 104 requires a digital fingerprint of a client device before server computer 104 will provide one or more services to the client device. In some embodiments, server computer 104 requires a digital fingerprint of a client device before server computer 104 will provide any services to the client device. Server computer 104 can be any type of server computing device that provides services to other computing devices, e.g., through a network such as wide area network 106. In this illustrative embodiment, wide area network 106 is the Internet.
Client computer 102 is a computing device that requests services of one or more servers, including server computer 104, through wide area network 106. Client computer 102 can be any type of networked computing device. In fact, designations of computers as servers and clients are largely arbitrary, as many personal computing devices include server logic and many dedicated servers request services of other servers, thereby acting as clients. As an example, server computer 104 requests a service of device reputation server 108 in a manner described below and is therefore a client of device reputation server 108.
Device reputation server 108 is a server that aggregates reports of network-based attacks on other computers and assesses and reports reputations of devices based on those aggregated reports. It should be appreciated that the attack report aggregation and reputation assessment can be performed by server computer 104 for itself, and perhaps for other servers as well. However, the reputation management is described herein as being performed by device reputation server 108 for clarity of illustration and to describe the ability to manage reputations on a network-wide basis.
Transaction flow diagram 200 (
In step 204 (
In step 206, device reputation server 108 assesses the reputation of client computer 102 using the received digital fingerprint of client computer 102. As described below, device reputation server 108 includes digital fingerprint reputation data 616 (
There are a wide variety of ways to assess reputations based on data regarding attacks. In one embodiment, any device that has perpetrated even a single attack is labeled as malicious. In other embodiments, a device must have perpetrated a predetermined minimum number of attacks before the device is considered malicious. For example, if a computer is compromised and is used as a zombie to assist in perpetrating just a few attacks before the compromise is detected and repaired, e.g., by anti-virus software executing in the computer, the computer might not represent an ongoing security risk.
In yet other embodiments, the reputation is not a binary result of “malicious” or “trustworthy” but is instead a numerical reputation of trustworthiness along a scale from entirely malicious to entirely trustworthy—e.g., from 0.0 for malicious to 1.0 for trustworthy. Other factors used by device reputation assessment logic 614 beyond a number of attacks can include frequency of attacks, recency of attacks, severity of attacks, and correlation to attacks by other devices, for example. Correlation to attacks by other devices can be an important factor for large attacks by many zombie devices acting on behalf of a single entity because such attacks can be very damaging.
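One hypothetical way such a numerical reputation could be computed is sketched below. The weighting scheme (exponential decay of attack age, severity-scaled penalties) is an illustrative assumption, not a requirement of device reputation assessment logic 614.

```python
import time

def assess_reputation(attacks, now=None, half_life_days=90.0):
    # Return a trustworthiness score in (0.0, 1.0]: 1.0 for a device
    # with no recorded attacks, decreasing toward 0.0 as attacks
    # accumulate. Each attack is a (timestamp, severity) pair with
    # severity in [0.0, 1.0]; recent and severe attacks weigh more.
    if now is None:
        now = time.time()
    penalty = 0.0
    for timestamp, severity in attacks:
        age_days = max(0.0, (now - timestamp) / 86400.0)
        decay = 0.5 ** (age_days / half_life_days)  # recency weighting
        penalty += severity * decay
    return 1.0 / (1.0 + penalty)  # maps accumulated penalty onto (0, 1]
```

Under this sketch, a single severe attack yesterday lowers the score far more than the same attack a year ago, capturing the number, recency, and severity factors described above.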
Once device reputation server 108 has assessed the reputation of client computer 102 in step 206 (
In step 210, server computer 104 determines an appropriate response given the reputation of client computer 102 received from device reputation server 108. One possible response is denial of service to client computer 102 if the reputation of client computer 102 is insufficiently trustworthy. Another possible response is to provide the requested service to client computer 102 if the reputation of client computer 102 is sufficiently trustworthy.
For some types of network-based services, a more complex response may be appropriate. For example, one possible response for a device known to be used repeatedly for identity theft and theft of credit card data is to provide the requested service but with data that can later be used to identify the person using the device for malicious purposes. For example, such a device can be permitted to retrieve credit card data but the credit card data can be used to detect improper purchases and to note the delivery address. The purchase and the delivery address can be provided to law enforcement personnel to effect an arrest of the person.
The response determined by server computer 104 in step 210 can also be influenced by other activity. For example, during extremely heavy request traffic for server computer 104, server computer 104 can deny service to devices with even slightly untrustworthy reputations as they can be zombies engaged in a denial of service attack. Accordingly, service would not be interrupted for trustworthy client devices.
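The threshold-based response determination of step 210, including the stricter threshold applied during heavy traffic, can be sketched as follows. The function name and threshold values are illustrative assumptions only.

```python
def choose_response(reputation, threshold=0.5, heavy_traffic=False):
    # Decide how to respond to a client given its numerical reputation
    # in [0.0, 1.0]. Under extremely heavy request traffic, the
    # threshold is raised so that even slightly untrustworthy devices
    # (possible zombies in a denial-of-service attack) are denied,
    # leaving service uninterrupted for trustworthy client devices.
    effective = 0.9 if heavy_traffic else threshold
    return "serve" if reputation >= effective else "deny"
```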
In step 212 (
Server computer 104 logs the digital fingerprints of all client devices regardless of reputation in step 212. Accordingly, if an attack on server computer 104 is later discovered, the logs can indicate that a previously trustworthy device perpetrated the attack and device reputation server 108 can be made aware of the attack in a manner described more completely below.
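A minimal sketch of such fingerprint-inclusive logging follows; the entry format is a hypothetical example, since the patent specifies only that each logged event is associated with a digital fingerprint.

```python
import json
import time

def log_event(log, digital_fingerprint, event_type, detail):
    # Append a service-log entry that always records the client's
    # digital fingerprint, so that an attack discovered later can be
    # traced back to the remotely located, physical device that
    # perpetrated it, regardless of the device's reputation at the time.
    entry = {
        "time": time.time(),
        "fingerprint": digital_fingerprint,
        "event": event_type,
        "detail": detail,
    }
    log.append(json.dumps(entry))
    return entry
```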
In step 214, server computer 104 effects the response determined to be appropriate in step 210. After step 214, processing according to transaction flow diagram 200 completes.
Transaction flow diagram 300 (
In step 304 (
In step 306 (
Digital fingerprint 802 is the digital fingerprint associated with the attack detected in step 302 and identified in step 304, which is sometimes referred to as “the subject attack” in the context of digital fingerprint 800 and transaction flow diagram 300 (
Time stamp 804 represents the date and time of the subject attack. Device reputation assessment logic 614 (
Attack description 806 (
Network addresses 808 identify any network addresses used in the subject attack, including IP and MAC addresses, for example.
Log excerpt 810 includes portions of service logs 516 (
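The attack report described above, with its fields 802 through 810, can be sketched as a simple record. The field types below are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AttackReport:
    # Report sent to the device reputation server, mirroring the
    # described fields: the perpetrator's digital fingerprint (802),
    # when the attack occurred (804), a description of the attack (806),
    # network addresses used (808), and relevant log excerpts (810).
    digital_fingerprint: str
    time_stamp: float
    attack_description: str
    network_addresses: List[str] = field(default_factory=list)
    log_excerpt: List[str] = field(default_factory=list)
```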
In step 308 (
In step 310 (
Client computer 102 is shown in greater detail in
CPU 408 and memory 406 are connected to one another through a conventional interconnect 410, which is a bus in this illustrative embodiment and which connects CPU 408 and memory 406 to one or more input devices 402, output devices 404, and network access circuitry 422. Input devices 402 can include, for example, a keyboard, a keypad, a touch-sensitive screen, a mouse, and a microphone. Output devices 404 can include, for example, a display—such as a liquid crystal display (LCD)—and one or more loudspeakers. Network access circuitry 422 sends and receives data through a wide area network 106 (
A number of components of client computer 102 are stored in memory 406. In particular, remote data access logic 414 and secure networking logic 416 are each all or part of one or more computer processes executing within CPU 408 from memory 406 in this illustrative embodiment but can also be implemented using digital logic circuitry. As used herein, “logic” refers to (i) logic implemented as computer instructions and/or data within one or more computer processes and/or (ii) logic implemented in electronic circuitry. Digital fingerprint 418 is data stored persistently in memory 406.
Remote data access logic 414 can implement any of a number of remote data access protocols, such as HTTP (Hypertext Transport protocol), FTP (File Transport Protocol), NFS (Network File System) and CIFS (Common Internet File System) protocols for example, all of which are known and not described herein in further detail. In addition, secure networking logic 416 can implement any of a number of known Virtual Private Network (VPN) protocols.
Server computer 104 (
A number of components of server computer 104 are stored in memory 506. In particular, service logic 512, including authentication logic 514, is all or part of one or more computer processes executing within CPU 508 from memory 506 in this illustrative embodiment but can also be implemented using digital logic circuitry. Service logs 516 are data stored persistently in memory 506. Except as otherwise described herein, service logs 516 are conventional. For example, Linux-based servers log events, including network and system events, in various logs stored in a ‘/var/log’ directory.
Service logic 512 specifies the one or more services provided by server computer 104 and can include a web server, an FTP server, remote data access protocols such as SMB and CIFS, and VPN protocols. To ensure all client devices are properly identified by their respective digital fingerprints, service logic 512 includes authentication logic 514 that causes server computer 104 to behave in the manner described herein.
Device reputation server 108 (
A number of components of device reputation server 108 are stored in memory 606. In particular, device reputation management logic 612 and device reputation assessment logic 614 are each all or part of one or more computer processes executing within CPU 608 from memory 606 in this illustrative embodiment but can also be implemented using digital logic circuitry. Digital fingerprint reputation data 616 is data stored persistently in memory 606. In this illustrative embodiment, digital fingerprint reputation data 616 is organized as a database.
Device reputation management logic 612 serves requests for device reputations and accepts and records reports of attacks in the manner described herein. Device reputation assessment logic 614 assesses reputations of devices using data stored in digital fingerprint reputation data 616 in the manner described herein.
Transaction flow diagram 202 (
In step 702 (
In test step 704 (
Conversely, if the request of step 702 does not include a digital fingerprint, processing by authentication logic 514 transfers to step 706, in which authentication logic 514 requests a digital fingerprint from client computer 102.
In response to such a request and in step 708, client computer 102 generates a digital fingerprint of itself. In some embodiments, client computer 102 creates the digital fingerprint of itself using logic independently and previously installed in client computer 102. In other embodiments, server computer 104 directs client computer 102 to obtain digital fingerprint generation logic, e.g., a fingerprinting algorithm in the form of an applet obtained from device reputation server 108, and to then execute that logic to thereby generate a digital fingerprint of client computer 102. The applet may encode a generated digital fingerprint with an authenticating certificate or other binary code that authenticates the digital fingerprint as a fingerprint that was generated by an authorized algorithm. The particular manner in which server computer 104 specifies the logic to be obtained by client computer 102 and the particular manner in which client computer 102 executes the logic are unimportant, and there are many known ways of accomplishing each. The generation of a digital fingerprint is described in the '216 Patent and the related U.S. Patent Applications, and those descriptions are incorporated herein by reference.
Thus, according to transaction flow diagram 202, server computer 104 ensures that client computer 102 has provided its digital fingerprint as a precondition for providing services requested by client computer 102.
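The precondition check of transaction flow diagram 202 can be sketched as follows; the request representation and callback are hypothetical simplifications of the client interaction described in steps 702 through 708.

```python
def handle_request(request, generate_fingerprint_cb):
    # If the incoming request already carries a digital fingerprint,
    # proceed with it; otherwise direct the client to generate one
    # (as in steps 706-708) before any service is provided.
    fingerprint = request.get("fingerprint")
    if fingerprint is None:
        fingerprint = generate_fingerprint_cb()
        request["fingerprint"] = fingerprint
    return fingerprint
```

Only after this function returns a fingerprint would the server proceed to query the device reputation server and determine its response.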
The above description is illustrative only and is not limiting. The present invention is defined solely by the claims which follow and their full range of equivalents. It is intended that the following appended claims be interpreted as including all such alterations, modifications, permutations, and substitute equivalents as fall within the true spirit and scope of the present invention.
This application is a continuation of U.S. patent application Ser. No. 13/692,857, which was filed on Dec. 3, 2012, the entire disclosure of which is incorporated herein by reference. The benefit of such earlier filing date is claimed by applicant under 35 U.S.C. §120.
Number | Name | Date | Kind |
---|---|---|---|
5884272 | Walker et al. | Mar 1999 | A |
5991735 | Gerace | Nov 1999 | A |
6138155 | Davis et al. | Oct 2000 | A |
6167517 | Gilchrist et al. | Dec 2000 | A |
6173283 | Kasso et al. | Jan 2001 | B1 |
6195447 | Ross | Feb 2001 | B1 |
6754665 | Futagami et al. | Jun 2004 | B1 |
6985953 | Sandhu et al. | Jan 2006 | B1 |
6993580 | Isherwood et al. | Jan 2006 | B2 |
7272728 | Pierson et al. | Sep 2007 | B2 |
7319987 | Hoffman et al. | Jan 2008 | B1 |
7523860 | Bonelle et al. | Apr 2009 | B2 |
7590852 | Hatter et al. | Sep 2009 | B2 |
7739402 | Roese | Jun 2010 | B2 |
8190475 | Merrill | May 2012 | B1 |
8255948 | Black et al. | Aug 2012 | B1 |
8326001 | Free | Dec 2012 | B2 |
8441548 | Nechyba et al. | May 2013 | B1 |
8483450 | Derakhshani et al. | Jul 2013 | B1 |
8635087 | Igoe et al. | Jan 2014 | B1 |
20010049620 | Blasko | Dec 2001 | A1 |
20020038235 | Musafia et al. | Mar 2002 | A1 |
20020118813 | Brehm et al. | Aug 2002 | A1 |
20030163483 | Zingher et al. | Aug 2003 | A1 |
20040236649 | Yip et al. | Nov 2004 | A1 |
20050010780 | Kane et al. | Jan 2005 | A1 |
20050086498 | Hulick | Apr 2005 | A1 |
20050187890 | Sullivan | Aug 2005 | A1 |
20050278542 | Pierson et al. | Dec 2005 | A1 |
20060123101 | Buccella et al. | Jun 2006 | A1 |
20060212930 | Shull et al. | Sep 2006 | A1 |
20060222212 | Du et al. | Oct 2006 | A1 |
20060282660 | Varghese et al. | Dec 2006 | A1 |
20070050238 | Carr et al. | Mar 2007 | A1 |
20070050638 | Rasti | Mar 2007 | A1 |
20070113090 | Villela | May 2007 | A1 |
20070136792 | Ting et al. | Jun 2007 | A1 |
20070169181 | Roskind | Jul 2007 | A1 |
20070234409 | Eisen | Oct 2007 | A1 |
20070239606 | Eisen | Oct 2007 | A1 |
20070294403 | Verona | Dec 2007 | A1 |
20080027858 | Benson | Jan 2008 | A1 |
20080028455 | Hatter et al. | Jan 2008 | A1 |
20080040802 | Pierson et al. | Feb 2008 | A1 |
20080080750 | Bee et al. | Apr 2008 | A1 |
20080092058 | Afergan et al. | Apr 2008 | A1 |
20080102435 | Rogers et al. | May 2008 | A1 |
20080109491 | Gupta | May 2008 | A1 |
20080120195 | Shakkarwar | May 2008 | A1 |
20080212846 | Yamamoto et al. | Sep 2008 | A1 |
20080235375 | Reynolds et al. | Sep 2008 | A1 |
20080242279 | Ramer et al. | Oct 2008 | A1 |
20090089869 | Varghese | Apr 2009 | A1 |
20090150330 | Gobeyn | Jun 2009 | A1 |
20090178125 | Barber et al. | Jul 2009 | A1 |
20090186328 | Robinson et al. | Jul 2009 | A1 |
20090254476 | Sharma et al. | Oct 2009 | A1 |
20090292743 | Bigus et al. | Nov 2009 | A1 |
20090320096 | Nolan et al. | Dec 2009 | A1 |
20100125911 | Bhaskaran | May 2010 | A1 |
20100185871 | Scherrer et al. | Jul 2010 | A1 |
20100235241 | Wang et al. | Sep 2010 | A1 |
20100305989 | Mu et al. | Dec 2010 | A1 |
20110040825 | Ramzan et al. | Feb 2011 | A1 |
20110264644 | Grant et al. | Oct 2011 | A1 |
20110302003 | Shirish et al. | Dec 2011 | A1 |
20110319060 | Gentemann | Dec 2011 | A1 |
20120030771 | Pierson et al. | Feb 2012 | A1 |
20120041969 | Priyadarshan et al. | Feb 2012 | A1 |
20120063427 | Kandekar et al. | Mar 2012 | A1 |
20120167208 | Buford et al. | Jun 2012 | A1 |
20120185921 | Wechsler et al. | Jul 2012 | A1 |
20120233665 | Ranganathan et al. | Sep 2012 | A1 |
20120324581 | Economos et al. | Dec 2012 | A1 |
Number | Date | Country |
---|---|---|
WO 2010104928 | Sep 2010 | WO |
Entry |
---|
Clarke et al. “Secure Hardware Processors Using Silicon One-Way Functions,” MIT Laboratory for Computer Science, Mar. 2002, p. 141. |
Fraga, David, “Information Technology, Regime Stability and Democratic Meaningfulness: A Normative Evaluation of Present and Potential Trends,” Honor's Thesis for a Degree for College Undergraduate Research, University of Pennsylvania, Mar. 30, 2007, 73 pages. |
Gassend, et al., Silicon Physical Unknown Functions and Secure Smartcards, May 2002. |
“German Stores Put Money at Your Fingertips”, Independent On-line, Sep. 4, 2007. |
Gupta et al., “Efficient Fingerprint-based User Authentication for Embedded Systems,” Proceedings of the 42nd Annual Design Automation Conference, New York City, New York, Jun. 13, 2005. |
Johnson et al. “Dimensions of Online Behavior: Toward a User Typology,” Cyberpsychology and Behavior, vol. 10, No. 6, pp. 773-779, Dec. 2007. XP002617349. |
Keane et al. “Transistor Aging,” IEEE Spectrum, Apr. 25, 2011. |
Kurchak, Kent, “Notes Application Strategies: User Activity Tracking,” Mar. 14, 2004, 14 pages. |
Lazanu et al., Modelling spatial distribution of defects and estimation of electrical degradation of silicon detectors in radiation fields at high luminosity, Oct. 10, 2006, 5 pages. |
Lee et al., “Analogous Content Selection Mechanism Using Device Profile and Content Profile for U-Learning Environments,” Jul. 15, 2009, Ninth IEEE International Conference on Advanced Learning Technologies, IEEE Computer Society. |
Lemos, Robert, “Fingerprint Payments Taking Off Despite Security Concerns,” Security Focus, Oct. 10, 2007, 3 pages. |
“Lowes Foods Brings Biometric Payments and Check Cashing to Its Customers,” Banking & Financial Solutions, Bioguard Components & Technologies Ltd, Feb. 7, 2005. |
“Pay by Touch,” From Wikipedia, Feb. 22, 2011, 2 pages. |
Sim et al. “Continuous Verification Using Multimodal Biometrics”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 29, No. 4, Apr. 1, 2007, IEEE Service Center, Los Alamitos, CA, pp. 687-700. XP011168507. |
Soto, Lucy, “Not-so-Private Web: Information Leaks on Social Networks Can Leave Users Vulnerable,” The Atlanta Journal-Constitution, Feb. 14, 2010, 3 pages. |
Transcript from CBS Corp New, UBS Global Media Conference on Dec. 3, 2007 with Dave Poltrack by Matt Coppett, 9 pages. |
Agbinya et al., “Development of Digital Environment Identity (DEITY) System for Online Access,” Third International Conference on Broadband Communications, Information Technology & Biomedical Applications, Third International Conference on IEEE, Piscataway, New Jersey, Nov. 23, 2008, 8 pages. XP031368250. |
Alstete, Jeffrey W., “Measurement Benchmarks of Real Benchmarking?” Hagan School of Business, Iona College, New Rochelle NY, USA, Benchmarking: An International Journal, vol. 15, No. 2, 2008. |
Urbach, Nils and Smolnik, Stefan, “A Conceptual Model for Measuring the Effectiveness of Employee Portals,” Proceedings of the Fifteenth Americas Conference on Information Systems, San Francisco, CA, Aug. 6-9, 2009. |
Number | Date | Country | |
---|---|---|---|
20150026805 A1 | Jan 2015 | US |
Number | Date | Country | |
---|---|---|---|
61566516 | Dec 2011 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13692857 | Dec 2012 | US |
Child | 14510965 | US |