The invention relates generally to accessory authentication in personal electronic devices and more specifically to the automatic management and enforcement of blacklists of counterfeited, cloned or otherwise unauthenticated devices.
The use of encryption for authentication of devices is generally known. Conventionally, a message, or “challenge,” is sent from a system or device to an object to be authenticated, and a message-dependent response is sent by the object to the system in reply. The system then evaluates the response to determine whether the response was sufficient to authenticate the object.
Such a method may be used, for example, to verify components of a system or device, including components that are removable, replaceable or available after-market. For example, an ink cartridge for an ink jet printer can be authenticated to determine whether it is an authorized and compatible cartridge for the particular printer. If the cartridge is successfully authenticated, normal printer operation utilizing that cartridge ensues. If the cartridge is not successfully authenticated, no operation, or only limited operation, may be authorized as a result of the failed authentication procedure.
Counterfeiters, however, can attempt to circumvent authentication procedures through brute force cloning of the authentication chip, producing great numbers of devices with seemingly authentic though identical authentication chips. In online and networked applications, blacklisting is often used to detect these clones, with a database of blacklisted devices available for checking. Many devices for which authentication and/or counterfeit and cloning prevention is desired, however, are not networked, providing no opportunity for automatic comparison against such a database. Therefore, techniques for preventing this and other types of counterfeiting, and for applying blacklisting, for example in low-cost, high-volume non-networked devices, are desired.
Embodiments relate to systems and methods for the management and enforcement of blacklists of counterfeited, cloned or otherwise unauthenticated devices. In an embodiment, a system comprises an accessory comprising an authentication chip including data signed by a private verification key, the data including a unique identifier related to the accessory, and a device comprising a public verification key forming a verification key pair with the private verification key and an identifier list, the device configured to read the data from the authentication chip, compare the unique identifier with the identifier list, and reject the accessory if the unique identifier is found in the identifier list.
In another embodiment, a method comprises reading signed data from a first device by a second device, extracting a unique identifier from the data, comparing the unique identifier with a unique identifier blacklist stored in the second device, rejecting the first device for use with the second device if the unique identifier is found in the unique identifier blacklist, and accepting the first device for use with the second device and adding the unique identifier to the unique identifier blacklist if the unique identifier is not found in the unique identifier blacklist.
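By way of a minimal sketch, the check-and-learn flow of this method can be illustrated as follows, assuming the unique identifier has already been extracted from verified signed data; the function name and example identifiers are illustrative only.

```python
def check_and_learn(unique_id: str, id_blacklist: set) -> bool:
    """Reject a first device whose unique identifier was already seen;
    otherwise accept it and remember its identifier."""
    if unique_id in id_blacklist:
        return False             # identifier already known: reject as a likely clone
    id_blacklist.add(unique_id)  # first sighting: accept and add to the blacklist
    return True

# Example: a second accessory presenting the same identifier is rejected.
ids_seen = set()
assert check_and_learn("ID-0001", ids_seen) is True
assert check_and_learn("ID-0001", ids_seen) is False
```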
In another embodiment, a semiconductor chip adapted to be embedded in a first device comprises a memory comprising data signed by a private verification key, wherein the data includes a unique identifier related to the semiconductor chip and a global blacklist of unique identifiers, and wherein a private authentication key is stored in a secure portion of the memory, and a communication interface configured to communicate with a second device comprising a public verification key using an asymmetric cryptographic technique, wherein the communication interface is configured to communicate the signed data to the second device.
In another embodiment, a microcontroller comprises circuitry configured to store a private authentication key, a public authentication key and data signed by a private verification key, the data including a unique identifier and a global blacklist, and communication circuitry configured to communicate the public authentication key and the data, to receive a challenge encrypted with the public authentication key, and to communicate a response related to the encrypted challenge decrypted with the private authentication key.
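An illustrative arrangement of the data held by such a chip or microcontroller is sketched below; the class and field names are assumptions chosen for readability rather than part of any particular implementation.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SignedCertificate:
    unique_id: str               # identifier related to this chip
    global_blacklist: List[str]  # identifiers known to be cloned at production time
    signature: bytes             # created with the private verification key (off-chip)

@dataclass
class AuthenticationChip:
    private_auth_key: bytes      # held in a secure, non-readable portion of memory
    public_auth_key: bytes       # communicated to the host device
    certificate: SignedCertificate

    def read_signed_data(self) -> SignedCertificate:
        # Only the signed public data is ever sent over the communication interface;
        # the private authentication key never leaves the chip.
        return self.certificate
```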
The invention may be more completely understood in consideration of the following detailed description of various embodiments of the invention in connection with the accompanying drawings, in which:
While the invention is amenable to various modifications and alternative forms, specifics thereof have been shown by way of example in the drawings and will be described in detail. It should be understood, however, that the intention is not to limit the invention to the particular embodiments described. On the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention as defined by the appended claims.
Embodiments of the invention utilize systems and methods for asymmetric cryptographic accessory authentication, such as those described in commonly owned U.S. patent application Ser. No. 12/582,362, entitled “SYSTEMS AND METHODS FOR ASYMMETRIC CRYPTOGRAPHIC ACCESSORY AUTHENTICATION” and filed Oct. 19, 2009, which is hereby incorporated by reference in its entirety. As discussed in the aforementioned patent application, secure authentication of accessories, batteries, parts and other objects can be provided at a lower cost suitable for price-sensitive applications using signed certificates and unique public and private key pairs.
For example,
Object 104 is depicted in
Referring also to
In an embodiment, the functionality and features of authentication chip 106 are realized as one or more system-on-chip components of object 104 to achieve cost or size savings. For example, object 104 can comprise a BLUETOOTH headset, which often is of small size and therefore may not be able to accommodate an additional chip 106. Instead, the features and functionality are integrated on an existing chip in the headset, saving space and possibly also cost. In such an embodiment, a manufacturer of the headset or other device comprising object 104 can be provided with, for example, a VHDL netlist for integration into an existing controller or processor of the headset or other device in place of a discrete authentication chip 106, with little or no change in the features, functions and security thereby provided.
Referring to
Before using public authentication key 111, however, device 102 determines whether public authentication key 111 is verified or genuine. In a conventional system using global or constant public and private key pairs for devices, verification can be accomplished by simply comparing the global key (public authentication key 111 received from object 104) with the same global key or a hash thereof stored on device 102. Use of global keys, however, does not provide the highest levels of security, as the global keys are vulnerable to hacking or other corruption. In some embodiments, therefore, unique public and private keys are used for each device, while in other embodiments public keys may be reused. For example, the first one million objects 104 can be manufactured with unique public keys, after which the public keys are repeated. In these embodiments, an additional unique identifier is used. Various embodiments are described in more detail below.
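The conventional global-key check mentioned above can be as simple as comparing a hash of the received key with a hash provisioned in device 102, as in the sketch below; the choice of SHA-256 and the placeholder key bytes are assumptions for illustration.

```python
import hashlib

# Hash of the expected global public key, provisioned in device 102 at build time.
EXPECTED_KEY_HASH = hashlib.sha256(b"global public key bytes").hexdigest()

def global_key_matches(received_key: bytes) -> bool:
    """Return True if the key read from object 104 matches the provisioned hash."""
    return hashlib.sha256(received_key).hexdigest() == EXPECTED_KEY_HASH
```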
At 302, and after verifying public authentication key 111, device 102 uses public authentication key 111 to encrypt a challenge. In an embodiment, the challenge comprises a random number. In another embodiment, the challenge also includes additional data. In embodiments, the encryption is carried out according to an asymmetric encryption methodology, for example an elliptic curve cryptographic algorithm. In another embodiment, an RSA cryptographic algorithm or some other cryptographic algorithm is used.
At 304, the encrypted challenge is transmitted from device 102 to object 104. In embodiments, the challenge can be transmitted wirelessly, such as by radio frequency (RF), or by wire, such as by a power line or other wire connection between device 102 and object 104. At 306, object 104 decrypts the received encrypted challenge using private authentication key 110. At 308, object 104 sends the decrypted challenge as a response to device 102, and device 102 determines whether the response is appropriate such that object 104 can be authenticated.
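A sketch of the challenge-response exchange at 302 through 308 is shown below using the RSA option mentioned above, since RSA supports direct public-key encryption of the challenge; in practice the decryption at 306 would take place inside authentication chip 106 rather than in host software, and the key sizes shown are assumptions.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Object side: key pair whose private half stays on authentication chip 106.
object_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
object_public_key = object_private_key.public_key()  # public authentication key 111

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# 302/304: device 102 encrypts a random challenge and transmits it.
challenge = os.urandom(32)
encrypted_challenge = object_public_key.encrypt(challenge, oaep)

# 306/308: object 104 decrypts with the private authentication key and responds.
response = object_private_key.decrypt(encrypted_challenge, oaep)

# Device 102 authenticates object 104 if the response matches the challenge.
assert response == challenge
```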
After method 300, device 102 can retain both public keys 103 and 111, or device 102 can delete public key 111 that was read from object 104. Retaining both keys can save time and calculations in the future, while deleting one key can free memory space.
In an embodiment, and referring to
Creation of the digest by the certificate authority is shown in more detail in
Digest 508 is signed using private verification key 510 of the certificate holder to create a signature 512. In an embodiment, an elliptic curve cryptographic algorithm is used to sign digest 508. Advantages of an elliptic curve cryptographic algorithm include shorter keys and fewer calculations because of the shorter keys, which can be beneficial in small, low-cost and/or embedded objects having less processing capacity. In another embodiment, an RSA cryptographic algorithm or some other cryptographic algorithm is used.
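The signing step can be sketched as follows with an ECDSA key over the NIST P-256 curve; the curve, hash function and message content here are assumptions for illustration.

```python
import hashlib
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec, utils

# Certificate-authority side: private verification key 510 held by the certificate holder.
private_verification_key = ec.generate_private_key(ec.SECP256R1())

# Message 507: the data to be certified (unique identifier, public authentication key, etc.).
message = b"unique-id=ID-0001;public-authentication-key=...;other-data"

# Digest 508: hash of the message.
digest = hashlib.sha256(message).digest()

# Signature 512: the digest signed with the private verification key.
signature = private_verification_key.sign(
    digest, ec.ECDSA(utils.Prehashed(hashes.SHA256())))
```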
Referring to
When use of object 104 with a device 102 is first attempted, device 102 must authenticate object 104 and verify that object 104 itself, and any data, information, content, media or other quantity originating from object 104, is legitimate. Accordingly, device 102 reads signature 512 and other data 520 from object 104 at 406. As part of this read, device 102 receives public authentication key 111 from object 104 as previously described, but device 102 cannot know whether public key 111 is corrupted or has been compromised and thus must verify the key.
This can be done using signature 512. Device 102 first recreates message 507 from data 520 and hashes message 507 according to the same algorithm used to create digest 508, thereby creating digest′ (508′) at 408. At 410, device 102 then extracts the original digest 508 from signature 512 read from object 104 using public verification key 103, which is intended, absent tampering or corruption, to correspond to private verification key 510 used to originally create signature 512. If the extraction is successful, device 102 compares digest′ (508′) with digest 508 at 412. If digest 508 and digest′ (508′) match, device 102 has verified that the data and information received from object 104 is uncorrupted and can use public authentication key 111 received from object 104 to authenticate object 104 according to process 300.
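The corresponding verification in device 102 is sketched below; standard ECDSA libraries fold the extract-and-compare steps at 410 and 412 into a single verify call that simply succeeds or fails, which is assumed here in place of an explicit digest extraction.

```python
import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec, utils

def certificate_is_valid(data_520: bytes, signature_512: bytes,
                         public_verification_key: ec.EllipticCurvePublicKey) -> bool:
    """Recreate message 507 from the data read off object 104, hash it to digest',
    and check it against signature 512 using public verification key 103."""
    digest_prime = hashlib.sha256(data_520).digest()
    try:
        public_verification_key.verify(
            signature_512, digest_prime, ec.ECDSA(utils.Prehashed(hashes.SHA256())))
        return True   # data uncorrupted: public authentication key 111 can be trusted
    except InvalidSignature:
        return False  # tampered or corrupted certificate: do not authenticate object 104
```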
As previously mentioned, one possible way to circumvent authentication procedures, such as authentication system 100, is to clone an authentic authentication chip 106 and use the clones in counterfeit objects. A challenge for counterfeiters implementing brute force cloning of authentication chip 106 is that it is difficult to create new key pairs and signature certifications using cryptographic processes for each object 104. This is due in part to the level of computing capability needed to generate the signed certificates, a cost counterfeiters are unlikely to bear for high-volume but relatively low-cost objects like printer ink cartridges and other devices and accessories. While the signature could instead be obtained by theft, such as spying, counterfeiters generally cannot reliably depend on obtaining signed certificates in such a manner. Therefore, counterfeit objects that include cloned authentication chips, while appearing to be authentic when examined or used individually, usually all have identical certificates because the certificates are simply copied as part of the cloning.
Embodiments relate to detecting, preventing use of and blacklisting these and other counterfeit devices.
In an embodiment, when use of object 104 with device 102 is first attempted and device 102 reads data from object 104 (refer, for example, to
If a counterfeiter clones many objects, each object will have the same unique identifier. Therefore, in an embodiment, device 102 retains each unique identifier in object ID list 904 stored in memory 902. Then, when a new object 104 is attempted to be used with device 102, device 102 first checks whether the unique identifier of that object 104 is already included in object ID list 904. If not, object 104 can be authenticated. If the unique identifier is found in list 904, the object 104 will not be authenticated.
Referring to
If the unique cartridge identifier is not found in the ID list, the unique cartridge identifier is added to the ID list at 1010, creating a self-learned local blacklist, and the cartridge is authenticated for use at 1012. In an embodiment, the printer retains a plurality or all of the most recent unique identifiers in the list, such that each subsequent cartridge can be compared against a more extensive list. The number of unique identifiers retained by the printer is limited only by the memory available. Thus, in one embodiment the printer retains the unique identifiers of all cartridges attempted for use. In other embodiments, the printer memory may be limited such that the printer retains only a number of the most recent unique identifiers, for example the fifty most recent unique identifiers.
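A sketch of such a memory-limited, self-learned local blacklist is given below; the class name and the default limit of fifty entries follow the example above, and the eviction of the oldest entry is an assumption.

```python
from collections import OrderedDict

class LocalBlacklist:
    """Self-learned local blacklist retaining only the most recent unique identifiers."""

    def __init__(self, max_entries: int = 50):
        self.max_entries = max_entries
        self._ids = OrderedDict()            # insertion-ordered set of identifiers

    def check_and_learn(self, unique_id: str) -> bool:
        if unique_id in self._ids:
            return False                     # 1008: reject a previously seen identifier
        self._ids[unique_id] = None          # 1010: add the identifier to the list
        if len(self._ids) > self.max_entries:
            self._ids.popitem(last=False)    # drop the oldest entry when memory is full
        return True                          # 1012: authenticate the cartridge for use
```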
Returning to rejection of the cartridge at 1008, the rejected unique identifiers can also be communicated back to a manufacturer or distributor for addition to a global blacklist. Additionally or alternatively, a manufacturer can use market intelligence and other information to build or add to a global blacklist of rejected identifiers. In an embodiment, and referring also to
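Folding a manufacturer-supplied global blacklist into the device's local list can then be as simple as the sketch below, assuming the signed certificate carrying the global blacklist has already been verified as described above; the identifiers shown are illustrative.

```python
from typing import Iterable, Set

def merge_global_blacklist(local_ids: Set[str], global_blacklist: Iterable[str]) -> None:
    """Add the identifiers from a verified, signed global blacklist to the local list."""
    local_ids.update(global_blacklist)

# Example: identifiers learned locally so far, plus a blacklist from a new object's certificate.
local_ids = {"ID-0001"}
merge_global_blacklist(local_ids, ["ID-7F3A", "ID-9B21"])
```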
Embodiments also include a unidirectional counter 906 (referring again to
Method 1200 of
In embodiments of method 1200 in which a blacklist is provided by object 104 to device 102 as part of the signed certificate, and an object 104 with a blacklisted ID is later attempted to be used with device 102, device 102 will reject that object 104 regardless of the unidirectional counter comparison. In further embodiments, unidirectional counter 906 also prevents use of piggybacked counterfeit objects, such as when a counterfeit object is coupled to an authentic object so as to use the authentic object to obtain authentication with the device. Because unidirectional counter 906 is an increment-only or decrement-only set-value counter, the object will be considered exhausted when the set value is reached, regardless of the presence of the piggybacked device.
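The effect of the set-value counter can be sketched as below; the decrement-only variant is shown, and the class and method names are illustrative.

```python
class UnidirectionalCounter:
    """Decrement-only set-value counter: once the set value is consumed the object
    is treated as exhausted, and the count can never be moved back up."""

    def __init__(self, set_value: int):
        self._remaining = set_value

    def consume(self, units: int = 1) -> None:
        # The counter only ever moves toward exhaustion (e.g., as ink is used).
        self._remaining = max(0, self._remaining - units)

    @property
    def exhausted(self) -> bool:
        return self._remaining == 0
```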
Embodiments provide secure authentication of accessories, batteries, parts and other objects at a lower cost suitable for price-sensitive applications. Additionally, embodiments provide recovery options, by way of key blacklisting, in the event of hacking or key misuse. Thus, if hacking of a public key is discovered, that key can be revoked or “blacklisted” and disabled globally, rather than having to block each individual key as in conventional approaches. This provides enhanced security and more efficient key management. Logistical improvements and efficiencies are also realized in that the device need not be preconfigured with the correct public key for a particular object, as the public key is extracted from the certificate stored in the object upon first use according to an embodiment. The overall security level is thereby enhanced, providing cost-effective authentication. Further, the use of local and global blacklisting and unidirectional counters provides additional security against cloned and other counterfeit accessories.
Various embodiments of systems, devices and methods have been described herein. These embodiments are given only by way of example and are not intended to limit the scope of the invention. It should be appreciated, moreover, that the various features of the embodiments that have been described may be combined in various ways to produce numerous additional embodiments. Moreover, while various materials, dimensions, shapes, implantation locations, etc. have been described for use with disclosed embodiments, others besides those disclosed may be utilized without exceeding the scope of the invention.
Persons of ordinary skill in the relevant arts will recognize that the invention may comprise fewer features than illustrated in any individual embodiment described above. The embodiments described herein are not meant to be an exhaustive presentation of the ways in which the various features of the invention may be combined. Accordingly, the embodiments are not mutually exclusive combinations of features; rather, the invention may comprise a combination of different individual features selected from different individual embodiments, as understood by persons of ordinary skill in the art.
Any incorporation by reference of documents above is limited such that no subject matter is incorporated that is contrary to the explicit disclosure herein. Any incorporation by reference of documents above is further limited such that no claims included in the documents are incorporated by reference herein. Any incorporation by reference of documents above is yet further limited such that any definitions provided in the documents are not incorporated by reference herein unless expressly included herein.
For purposes of interpreting the claims for the present invention, it is expressly intended that the provisions of Section 112, sixth paragraph of 35 U.S.C. are not to be invoked unless the specific terms “means for” or “step for” are recited in a claim.
Number | Name | Date | Kind |
---|---|---|---|
4640138 | Morris | Feb 1987 | A |
6105027 | Schneider | Aug 2000 | A |
6285985 | Horstmann | Sep 2001 | B1 |
6356529 | Zarom | Mar 2002 | B1 |
6664969 | Emerson | Dec 2003 | B1 |
6678821 | Waugh | Jan 2004 | B1 |
6952475 | Horn | Oct 2005 | B1 |
6968453 | Doyle | Nov 2005 | B2 |
7047408 | Boyko | May 2006 | B1 |
7194629 | Silverbrook | Mar 2007 | B2 |
7243232 | Vanstone | Jul 2007 | B2 |
7313697 | Meyer | Dec 2007 | B2 |
7613924 | Shankar | Nov 2009 | B2 |
7823214 | Rubinstein | Oct 2010 | B2 |
20020194476 | Lewis | Dec 2002 | A1 |
20040243474 | Vu | Dec 2004 | A1 |
20050018841 | Girault | Jan 2005 | A1 |
20050052661 | Lapstun et al. | Mar 2005 | A1 |
20050105884 | Satoh et al. | May 2005 | A1 |
20050160277 | Sciupac | Jul 2005 | A1 |
20050216724 | Isozaki et al. | Sep 2005 | A1 |
20050243116 | Ward et al. | Nov 2005 | A1 |
20050246763 | Corcoran et al. | Nov 2005 | A1 |
20060031790 | Proudler | Feb 2006 | A1 |
20060107060 | Lewis | May 2006 | A1 |
20060146081 | Vandermeulen et al. | Jul 2006 | A1 |
20060161571 | Neill et al. | Jul 2006 | A1 |
20060161976 | Kahn et al. | Jul 2006 | A1 |
20060230276 | Nochta | Oct 2006 | A1 |
20070050631 | Shimizu et al. | Mar 2007 | A1 |
20080024268 | Wong | Jan 2008 | A1 |
20080165955 | Ibrahim | Jul 2008 | A1 |
20090013381 | Torvinen | Jan 2009 | A1 |
20090013410 | Kaler | Jan 2009 | A1 |
20090019282 | Arditti | Jan 2009 | A1 |
20090024352 | Braun | Jan 2009 | A1 |
20090070506 | Furtner | Mar 2009 | A1 |
20090083834 | Rubinstein | Mar 2009 | A1 |
20090235073 | Braun | Sep 2009 | A1 |
20100011218 | Shankar | Jan 2010 | A1 |
20100069086 | Ahlin | Mar 2010 | A1 |
20100226495 | Kelly | Sep 2010 | A1 |
20110093714 | Schaecher | Apr 2011 | A1 |
20110154043 | Lim | Jun 2011 | A1 |
Number | Date | Country |
---|---|---|
10161137 | Oct 2003 | DE |
10161138 | Feb 2008 | DE |
1773018 | Apr 2007 | EP |
WO2007107450 | Sep 2007 | WO |
WO2007112791 | Oct 2007 | WO |
WO2008040655 | Apr 2008 | WO |
Entry |
---|
Bock et al., A Milestone Toward RFID Products Offering Asymmetric Authentication Based on Elliptic Curve Cryptography, 14 pages, not dated. |
Technology Media, Technology Innovation: Infineon Helps Protect Consumers from Counterfeit Batteries and other Electronic Accessories with World First Authentication Chip Featuring Elliptic Curve Algorithms and Integrated Temperature Sensor, 2 pages, Sep. 17, 2008. |
Hammerschmidt, Christoph, EE Times, Peripherals authentication could change landscape, Sep. 28, 2009, 2 pages. |
Thomson Reuters, Infineon Demonstrates Remote PC Peripherals Authentication Capability With ORIGA™ Authentication Chip Using Intel, 2 pages, Sep. 22, 2009. |
Infineon Origa, Original Product Authentication Solution SLE95050F1, Published by Infineon Technologies North America, © 2009, 3 pages. |
Origa™ SLE95050 Original Product Authentication and Brand Protection Solution, Short Product Information, www.infineon.com/ORIGA, Version 1.50, 19 pages, Jun. 2009. |
Origa™—Original Product Authentication & Brand Protection Solution-SLE 95050, 2 pages, © 1999-2009. |
Anderson, Ross, Cryptography and Competition Policy - Issues with ‘Trusted Computing’, 21 pages, presented at 2nd Annual Workshop on Economics & Information Security on May 29, 2003. |
IBM, IBM eServer Cryptographic Coprocessor Security Module, Aug. 29, 2007, pp. 1-32. |
Krhovjak, Jan, EMV: Integrated Circuit Card Specifications for Payment Systems; Feb. 20, 2006; Faculty of Informatics, Masaryk University; pp. 1-13. |
Better Protection from Client to Data Center Made Possible with New Trusted Computing Group Storage Device Specifications, Jan. 27, 2009, www.wikipedia.com. |
Texas Instruments, Battery Authentication and Security Schemes, Jul. 2005, SLUA346 (Application Report), pp. 1-7, Retrieved date: Jan. 10, 2012. |
Texas Instruments, Battery Pack Security and Authentication IC for Portable Applications (bqSecure™) (bq26150), SLUS641B, Jan. 2005, Revised Nov. 2009; Retrieved date: Jan. 10, 2012. |
Duncan Standing, “Biometric ID ePassports: Everything's Changed and Nothing's Changed”, 2007, www.SecurityWorldMag.com; pp. 1-6; Retrieved Date: Jan. 10, 2012. |
RSA Laboratories, 3.6.1 What is Diffie-Hellman?, available at www.rsa.com/rsalabs/node.asp?id=2248 as of Feb. 16, 2011, © 2010, 2 pages. |
Application and File History of U.S. Appl. No. 13/185,825, filed Jul. 19, 2011, Inventors: Lim et al. |
Application and File History of U.S. Appl. No. 12/582,362, Inventors Schaecher et al., filed Oct. 20, 2009. |
Number | Date | Country | |
---|---|---|---|
20110154043 A1 | Jun 2011 | US |