This invention relates to computers, data security, and more specifically to controlling use of time-related digital content using Digital Rights Management.
This disclosure relates to Digital Rights Management (DRM) for protection of audio and/or video data in a playback device such as a computer or computing device or audio or video media player. DRM refers to standards and proprietary systems in which a content item has associated data that specifies user rights. The protection of digital content transferred between computers over a network, and transferred from a computer or other host device or a server to an associated playback device, is important for many organizations. The DRM process often involves encrypting the pieces of content (e.g., encrypting the binary form of the content) to restrict usage to those who have been granted a right to the content, the content typically being pieces of music or video programs.
Cryptography is the traditional protection method, but in many digital file transfer situations, a party that legitimately receives the content may try to break the DRM protection scheme, so as to gain illicit access for himself or to give such access to third parties. Hence, the identified weak link in DRM security is the overall process, rather than the encryption scheme itself. For instance, one of the more successful DRM schemes distributes music and video content via the Internet. The DRM system distributes content to a user's computer in encrypted form. The user's computer then decrypts the received content, generates local keys for encrypting the content, and uses the local keys to re-encrypt the content. Typically, the content in encrypted form may also be downloaded, via a local connection such as a USB (universal serial bus) connection, to an associated playback-only device such as an audio or video media player, which similarly decrypts and re-encrypts the content before playing same.
One of the major challenges in Digital Rights Management is the implementation of time enforcement. There is a time-dependent factor in many applications of DRM, notably in the protection of rental content (such as video programs, movies, etc.). In this scenario, a user purchases the rights to access an item of content for a given period of time, such as a month. Typically the content is distributed as a digital file over the Internet, but distribution also includes tangible media such as DVDs, CDs, etc. The role of DRM software in this situation is to allow the content to be accessed by the user during the agreed upon period of time and to disallow access at any other time. See Vataja, U.S. Publication No. 2005/0204209, published Sep. 15, 2005, and McKune, U.S. Publication No. 2002/0169974, published Nov. 14, 2002, both incorporated herein by reference in their entirety.
Many current DRM schemes have this functionality and enforce it by using a secure time server or hardware clock. When the user tries to access the content using his playback device, the DRM software connects across a network (e.g., the Internet) to a secure time server and queries the current time. If the time returned over the network from the server is within the correct period, the DRM software allows playback. This scheme has a major drawback: the user must be connected to the network at the time of content playback. This requirement makes the scheme unworkable for mobile playback devices without network connectivity, for situations where content needs to be accessed offline, and for platforms (playback devices) with no internal secure hardware clock.
The present inventors have found that a new solution is needed for enforcing content time restrictions on platforms that are not connected to a network and do not have a secure hardware clock. The present solution is a process and system that checks the status of the host system (e.g., a user media playback device) with respect to the system clock. The expectation is that if the system's clock has been set backwards by a “hacker” (a person who wants to defeat the DRM), the associated host's file system will exhibit some telltale signs of this manipulation. One problem with looking at the current state of a system is that the state of a system tends to realign itself. This is especially true if, during the attack, the system clock is changed for only a short period of time. An attack here generally refers to the altering of time to bypass the time protection associated with a content item.
The host system is e.g., a computer, personal digital assistant, media playback device, cellular telephone device with media playback capability, or other type of computing device.
The usual clock roll-forward attack involves the hacker choosing a specific time far in the future as the active time period for an asset (item of content protected by DRM). Access to the content is granted by moving the system clock forward to that date. Once the asset has been accessed by the hacker, the clock is returned by the hacker to the actual date and the system will realign itself. To access the asset again at a later date, the hacker sets the clock to a future date within the active time period of the asset.
The present approach detects this attack by an event detection mechanism using particular stored state information about the host system. Every time an asset is opened, time information is stored about a subset of system files. The present method then determines (detects) if the system has been subjected to a roll-forward attack by comparing the current time information of those files with the previously recorded time and the last time the files were scanned. If there has been a roll-forward attack, then access to the keys needed to decrypt the content item is denied by the DRM system. McKune, referred to above, also detects clock tampering by doing a comparison of current time with a stored time.
There are two possible results when checking a time stamp of a file on a computer system with a currently valid (non-tampered with) forward-moving clock: the time stamp was not modified since the last time it was checked, or the time stamp was modified after the last time it was checked. If a file's time is older than the last scan time but different from the time previously recorded for it, something has happened out of order. In this case, the clock likely has earlier been rolled back to before the previous scan time, or the clock was set to a future date when the last scan occurred. The present method detects when a file has been modified while the system clock has been rolled forward, and also detects if the file's modification time has been tampered with to hide this modification.
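The ordering check just described can be sketched as a single comparison function. This is an illustrative sketch only, not the claimed implementation; the function name and the use of plain `time_t` values are assumptions made for clarity.

```c
#include <time.h>

/* Sketch of the time-stamp sanity check described above. A file's
   current modification time passes if it is unchanged since it was
   recorded, or if it is newer than the last scan time; any other
   combination is "out of order" and suggests clock tampering. */
static int timestamp_is_sane(time_t current_mtime,
                             time_t recorded_mtime,
                             time_t last_scan_time)
{
    if (current_mtime == recorded_mtime)
        return 1;  /* not modified since the last check */
    if (current_mtime > last_scan_time)
        return 1;  /* legitimately modified after the last scan */
    return 0;      /* modified "in the past": likely clock manipulation */
}
```

For example, a file whose time changed from 100 to 150 while the last scan was at 200 fails the check, since a valid forward-moving clock could not have produced that modification.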
The first time the present method is called by the DRM system in the host system, the present method records the current file system time (the system clock time) of several specifically selected files and stores each of the file system times in a secure database associated with the DRM in the host system. These files are typically content files, system files, user preference files or cache files. Secure here means difficult for a hacker to tamper with, such as encrypted. A digest is taken of the content of each of these files and also stored in the database. This digest may be computed using a hash function, or otherwise. Hash functions are well known in the data security field; they are one-way functions which generate a value from digital data, where the data cannot be (practically) generated from its hash function value. Hash functions are often used for verification and authentication of data. This establishes an initial state for the host system.
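Recording the initial state per file, as described above, amounts to storing the file's modification time and a digest of its content. The sketch below uses FNV-1a as a stand-in for the MD5/SHA-family hash the disclosure suggests, and the structure and function names are illustrative assumptions, not the patented design.

```c
#include <stdio.h>
#include <stdint.h>
#include <sys/stat.h>
#include <time.h>

#define FNV1A_INIT 1469598103934665603ULL

/* Streaming FNV-1a update; FNV-1a stands in here for a cryptographic
   hash such as MD5 or SHA-1/SHA-2. */
static uint64_t fnv1a_update(uint64_t h, const unsigned char *buf, size_t len)
{
    for (size_t i = 0; i < len; i++) {
        h ^= buf[i];
        h *= 1099511628211ULL;
    }
    return h;
}

/* One entry of the secure database: the file's recorded modification
   time and a digest of its content (names are illustrative). */
struct db_entry {
    char     path[256];
    time_t   mtime;
    uint64_t digest;
};

/* Record the current state of one file: its modification time from the
   file system, and a digest of its entire content. Returns 0 on success. */
static int record_file_state(const char *path, struct db_entry *e)
{
    struct stat st;
    FILE *f;
    unsigned char buf[4096];
    size_t n;
    uint64_t h = FNV1A_INIT;

    if (stat(path, &st) != 0)
        return -1;
    if ((f = fopen(path, "rb")) == NULL)
        return -1;
    while ((n = fread(buf, 1, sizeof buf, f)) > 0)
        h = fnv1a_update(h, buf, n);
    fclose(f);

    snprintf(e->path, sizeof e->path, "%s", path);
    e->mtime  = st.st_mtime;
    e->digest = h;
    return 0;
}
```

In a real system the resulting entries would be written to the encrypted database rather than held in memory.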
Subsequently, when the present method is again called by the DRM system, the secure database is accessed and the system times of the files listed in the database are queried. The method is called when verifying system integrity, usually at playback time of an item of content. The present method is typically implemented in software (as is the rest of the DRM system) and called (invoked) as a routine or software module by the DRM system. A “sanity” check of the returned values from the query is performed by the method. This means that the time of a file is judged valid if it has not changed since the last such check or it has changed to a value more recent than that of the last such check. Failing this check indicates that the file has been modified, accessed or changed by the hacker while the system clock was in an unnatural state as a result of the hacker's tampering.
Furthermore, if the file itself has been modified, then its digest by definition will be different. If the time of last modification of the file has not changed, but the file's digest is different, this indicates that the last modification time of the file has been tampered with. If this check is passed, then the current time is added to the database under “last scan time” and the modification time and current digest of each file is added to the database.
If “yes” at 40, control passes to 44 where the current digest (e.g., hash) value for that file is matched to the digest (e.g., hash) value for that file in the database. If “no” at 44, the check is failed at 48, as is the case for “no” at 40. If the test is failed, the current content item access is denied. No digest check is done if the file time is more recent than the last scan time, since then the file may have been altered for a valid reason since the time stamp has moved forward.
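The decision flow described above and in the preceding paragraphs (pass with no digest check when the file time is newer than the last scan, otherwise compare the recorded time and then the digest) can be sketched as one function. Names and integer digest types are assumptions for illustration; the comments reference the steps 40 and 44 mentioned above.

```c
#include <stdint.h>
#include <time.h>

/* Sketch of the per-file check described above. A file modified after
   the last scan passes with no digest check, since the change may be
   legitimate. Otherwise its modification time must match the recorded
   one, and if it does, its digest must match too: a matching time with
   a differing digest indicates the modification time was tampered with. */
static int file_check_passes(time_t   cur_mtime, uint64_t cur_digest,
                             time_t   rec_mtime, uint64_t rec_digest,
                             time_t   last_scan_time)
{
    if (cur_mtime > last_scan_time)
        return 1;  /* modified since last scan: digest check skipped */
    if (cur_mtime != rec_mtime)
        return 0;  /* "no" at 40: out-of-order modification time */
    if (cur_digest != rec_digest)
        return 0;  /* "no" at 44: content changed but time stamp did not */
    return 1;      /* "yes" at both 40 and 44: check passed */
}
```

If the check fails, the DRM system would deny access to the decryption keys for the current content item, as described above.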
Digests of files (using, e.g., hash functions such as MD5, SHA-1, SHA-2 or others) may be replaced here by any file content identification technique. A weak such technique samples the first and last bytes of the file together with a set of bytes at random offsets (of fixed length); it would improve performance over use of a hash by reducing computation time, at the expense of weaker security. A strong technique uses known hash algorithms as described above, applied to the entire file content.
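The weak sampling technique can be sketched as follows over an in-memory buffer. The offsets must be reproducible between scans, so a fixed seed drives a simple pseudo-random generator here; the sample count, seed, and function names are all illustrative assumptions.

```c
#include <stdint.h>
#include <stddef.h>

#define SAMPLE_COUNT 16

/* Sketch of the "weak" content-identification technique described
   above: instead of digesting the whole file, mix the first byte, the
   last byte, and SAMPLE_COUNT bytes at pseudo-random offsets into an
   FNV-1a accumulator. A fixed seed keeps the offsets identical from
   one scan to the next. */
static uint64_t sampled_digest(const unsigned char *data, size_t len)
{
    uint64_t h   = 1469598103934665603ULL;  /* FNV-1a basis */
    uint32_t rng = 0x12345678u;             /* fixed seed: reproducible offsets */

    if (len == 0)
        return h;
    h ^= data[0];        h *= 1099511628211ULL;  /* first byte */
    h ^= data[len - 1];  h *= 1099511628211ULL;  /* last byte  */
    for (int i = 0; i < SAMPLE_COUNT; i++) {
        rng = rng * 1664525u + 1013904223u;      /* simple LCG step */
        h ^= data[rng % len];
        h *= 1099511628211ULL;
    }
    return h;
}
```

The trade-off stated above is visible here: only a fixed number of bytes is read regardless of file size, but a modification that avoids all sampled offsets goes undetected.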
The file database would contain, e.g., entries for either all the files listed in a given directory, or for a sample of random files located in these directories. This sample approach would be useful for directories containing a large number of files (such as logs, mail cache, or a web browser cache).
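One way to pick such a random sample from a large directory is single-pass reservoir sampling over the directory listing. The sketch below works on an in-memory array of file names rather than a live directory handle, and all names are illustrative assumptions.

```c
#include <stdlib.h>
#include <stddef.h>

/* Sketch of selecting up to k random files from a directory listing
   for inclusion in the database, using reservoir sampling: one pass,
   uniform probability for each entry. The caller supplies the listing
   as an array of n name pointers; out must hold k pointers. */
static size_t sample_files(const char **names, size_t n,
                           const char **out, size_t k)
{
    size_t kept = 0;
    for (size_t i = 0; i < n; i++) {
        if (kept < k) {
            out[kept++] = names[i];              /* fill the reservoir */
        } else {
            size_t j = (size_t)rand() % (i + 1); /* keep with prob. k/(i+1) */
            if (j < k)
                out[j] = names[i];
        }
    }
    return kept;  /* number actually selected: the smaller of n and k */
}
```

In practice the listing would come from the host's file storage directory, and the selected names would be recorded in the secure database alongside their times and digests.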
Coding the software to carry out the present method would be routine in light of this disclosure. A typical computer language for the source code software would be C, although usually only the compiled version (object code) of the software would be installed in the host device in a memory. Hence contemplated here is the method, the memory medium in the host system holding the associated code, and the resulting host system programmed with the code so as to carry out the method.
Device 64 also has conventionally its system clock 80 and file storage directory 86. This directory (or set of directories) is a directory to the files in storage 72 and is maintained conventionally by the host system's operating system 14 (shown in
Then when database 96 is accessed at 26 in
The present method and apparatus are applicable in a number of contexts, including a host device coupled to a conventional computer network such as the Internet, and a personal area network (PAN), where a PAN is a computer network operating over a very short distance such as a few meters.
This disclosure is illustrative but not limiting; further modifications will be apparent to those skilled in the art in light of this disclosure and are intended to fall within the scope of the appended claims.
Number | Name | Date | Kind |
---|---|---|---|
5757908 | Cooper et al. | May 1998 | A |
5883954 | Ronning | Mar 1999 | A |
5892900 | Ginter et al. | Apr 1999 | A |
6098054 | McCollom et al. | Aug 2000 | A |
6230064 | Nakase et al. | May 2001 | B1 |
6282175 | Steele et al. | Aug 2001 | B1 |
6393126 | van der Kaay et al. | May 2002 | B1 |
7111026 | Sato | Sep 2006 | B2 |
7124302 | Ginter et al. | Oct 2006 | B2 |
7552148 | Liu et al. | Jun 2009 | B2 |
20020019814 | Ganesan | Feb 2002 | A1 |
20020120465 | Mori et al. | Aug 2002 | A1 |
20020157002 | Messerges et al. | Oct 2002 | A1 |
20020169974 | McKune | Nov 2002 | A1 |
20020196940 | Isaacson et al. | Dec 2002 | A1 |
20030120939 | Hughes et al. | Jun 2003 | A1 |
20040024688 | Bi et al. | Feb 2004 | A1 |
20040054894 | Lambert | Mar 2004 | A1 |
20040059813 | Bolder et al. | Mar 2004 | A1 |
20040162787 | Madison et al. | Aug 2004 | A1 |
20040187014 | Molaro | Sep 2004 | A1 |
20050132122 | Rozas | Jun 2005 | A1 |
20050188222 | Motsinger et al. | Aug 2005 | A1 |
20050204209 | Vataja | Sep 2005 | A1 |
20050289072 | Sabharwal | Dec 2005 | A1 |
20060008256 | Khedouri et al. | Jan 2006 | A1 |
20060015717 | Liu et al. | Jan 2006 | A1 |
20060064762 | Kayashima et al. | Mar 2006 | A1 |
20060190535 | Kaitaniemi et al. | Aug 2006 | A1 |
20070143844 | Richardson et al. | Jun 2007 | A1 |
20070168484 | Koelle et al. | Jul 2007 | A1 |
20070183742 | Cowgill | Aug 2007 | A1 |
20070204064 | Mail et al. | Aug 2007 | A1 |
20080126773 | Martinez et al. | May 2008 | A1 |
20080134297 | Clinick et al. | Jun 2008 | A1 |
20080152146 | Conrado et al. | Jun 2008 | A1 |
20080229113 | Yagawa | Sep 2008 | A1 |
20080235666 | Bhandari et al. | Sep 2008 | A1 |
20080301457 | Uesugi et al. | Dec 2008 | A1 |
20100031049 | Shima et al. | Feb 2010 | A1 |
Number | Date | Country |
---|---|---|
1055990 | Nov 2000 | EP |
1056010 | Nov 2000 | EP |
WO 0054128 | Sep 2000 | WO |
WO 2004032329 | Apr 2004 | WO |
WO 2005104426 | Nov 2005 | WO |
Entry |
---|
Herberg, “Integrity Check Value and Timestamp TLV Definitions for Mobile Ad Hoc Networks (MANETs)”, May 1, 2012, Internet Society, p. 1-21. |
Author Unknown, “Mechanism and apparatus for determining time on a disconnected machine,” Research Disclosure, Dec. 2000, 2 pages, vol. 440. |
Lie, David, et al., “Specifying and Verifying Hardware for Tamper-Resistant Software,” Proceedings of the 2003 IEEE Symposium on Security and Privacy (SP'03), May, 2003, 12 pages, IEEE. |
Number | Date | Country | |
---|---|---|---|
20090287942 A1 | Nov 2009 | US |