This invention relates to Digital Rights Management (DRM) for protection of audio and video data in a playback device such as a computer or audio or video media player, and especially to protection of such data from illicit copying.
The protection of digital content transferred between computers over a network, and from a computer to an associated playback device, is important to many organizations. The DRM process often involves encrypting the content (e.g., encrypting its binary form) to restrict usage to those who have been granted a right to it, the content typically being music or video programs.
Cryptography is the traditional protection method, but in many digital file transfer situations a party that legitimately receives the content may try to break the DRM protection scheme so as to give illicit access to third parties. An identified weak link in DRM security is the overall process, rather than the encryption scheme itself. For instance, one of the more successful DRM systems distributes music and video content via the Internet. The DRM system distributes encrypted content to a user's computer. The user's computer then decrypts the received content, generates local keys for encrypting the content, and uses these local keys to re-encrypt it. Typically the content in encrypted form may also be downloaded, via a local connection, to an associated playback-only device such as an audio or video media player, which similarly decrypts and re-encrypts the content before playing same.
The present inventors have identified a potential security issue with this approach. As indicated above, during the playback process (which can also occur in the user's computer) the decrypted data is resident for a time in memory in the host computer and/or other playback device. One such memory location is a buffer memory which temporarily stores one or more packets or frames of the decrypted content bitstream. For audio content, the usual encoded format of the data is referred to as AAC (Advanced Audio Coding, ISO/IEC 13818-7), a compression standard with associated codecs (coder-decoders) commercially available in hardware and software. For video, the usual encoded format is H.264, a similar compression standard, also with commercially available codecs.
During the playback process, portions of the decrypted content (packets for audio, frames for video) are temporarily stored in a buffer prior to being supplied to the AAC or H.264 or other decoder. Hackers (illicit copiers) may be able to access the decrypted data in this buffer using a standard software tool known as a debugger attachment. The hacker can then copy the accessed data out of the buffer, store it, and, as the content is played, thereby obtain a decrypted copy of the entire content without having to break the actual encryption.
This process is illustrated in
The hacker's attack uses a debugger attachment 22 to access the data in buffer 18 and transfer it to another memory, such as in a computer (not shown), where he can effectively copy the entire content, free of any encryption, as it is played, for later illicit distribution or use (it is assumed the content is copyrighted, so the use is illicit, but that is not relevant technically).
In accordance with this invention, the above hacker attack is prevented or defeated by “poisoning” the decrypted bit stream so that it is unusable by the hacker when he extracts it from the buffer memory. This poisoning process and the associated hardware and/or software to carry it out are resident in the playback device, such as a computer, dedicated media player, cell phone, PDA, etc. The poisoning process in one embodiment involves slightly modifying at least one portion of the decrypted data (such as one or more bits therein, the data being digital and in standard binary form), such as an AAC packet or H.264 frame, before transferring it to the decoder portion of the codec in the playback device. Thus the data when later intercepted by the hacker is unusable, since the nature of the modification is not known to him.
Of course, the associated decoder is then also modified so it can recover either the original data or some other piece of information that provides accurate decoding of the modified data. The first option is believed to be less secure, since the original data must then reappear during the decoding process, making it available to the hacker. The second option is believed to be more secure, avoiding this, but requires that the part of the decoding process using the other piece of information be executed (e.g., in software) in a secure manner or a secure location in the playback device (not accessible to the hacker) so the hacker cannot guess the original data.
The present method and associated apparatus has two phases. The first phase is to change the data, so it is modified (poisoned) to make it useless when later decoded by a standard decoder as used by a hacker. This usually happens when the data is first downloaded into the playback device, and stored in its transitory (e.g., volatile) memory. The second phase is to decode the data, typically during playback, to recover the data so it can be played.
The encoding operation is as follows:
1. The encrypted data (e.g. a song or video program) is downloaded from a server or host computer to the playback device, by conventional means.
2. The data is conventionally decrypted.
3. The resulting decrypted bitstream is conventionally parsed into packets (each packet being a number of bits of predetermined length, with header information).
4. The values of certain specific bit locations in each packet (or certain packets) are modified. The modifications are such that the resulting packet is parseable by a standard decoder, but not playable. Thus in one embodiment the bit modifications do not corrupt the structure of the audio packet (or video frame) itself.
5. In one embodiment, the original values of the modified bits are stored in a secure memory location in the playback device readable only by the DRM scheme associated with the device, and hence not accessible to the hacker. In a second embodiment, there is no storage of the modified bits.
6. In one embodiment, not every packet/frame (data portion) is so modified. This is accomplished by providing a set of rules (logic) which determines which portions are subject to modification. For instance, the AAC standard defines certain required blocks and optional blocks of bits. Required blocks include the section_data block, the scale_factor_data block, and the spectral_data block, while the pulse_data and tns_data blocks are optional. Hence in one embodiment the bits modified are those in blocks most likely to appear, so poisoning the required blocks is preferred. Also, to speed up the encoding (enhance performance), in another embodiment a particular block may be poisoned (modified) only in certain cases. An example is to poison (in an AAC packet) the sect_cb bits in the section_data block only if the particular current window is of a certain length. Advantageously this may further confuse a hacker. Note the tradeoff here between performance (less poisoning) and security (more poisoning), to be selected for various implementations.
7. For the actual bit modification process, the value of each selected bit is replaced with a specific bit value, which preferably is a reserved (normally unused) value according to the relevant encoding standard. This enhances the later decoding, but has the drawback of weakening security since the hacker can more readily identify a poisoned portion of the data, although he would not know how to operate on such a bit even if he identified it.
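The poisoning steps above (rule-based selection, bit modification, and storage of the original values) can be sketched as follows. This is a minimal illustrative sketch, not the claimed implementation: the bit positions, the reserved value, and the window-length rule are all hypothetical stand-ins, and a real implementation would operate on the specific AAC or H.264 syntax elements named above (e.g., the sect_cb bits in the section_data block).

```python
# Illustrative sketch of the poisoning phase (steps 4-7).
# POISON_POSITIONS, RESERVED_VALUE, and the window-length rule are
# hypothetical; real code would target actual bitstream syntax elements.

POISON_POSITIONS = [17, 23, 41]  # hypothetical bit offsets within a packet
RESERVED_VALUE = 1               # hypothetical "reserved" bit value (step 7)

def get_bit(packet: bytearray, pos: int) -> int:
    return (packet[pos // 8] >> (7 - pos % 8)) & 1

def set_bit(packet: bytearray, pos: int, value: int) -> None:
    mask = 1 << (7 - pos % 8)
    packet[pos // 8] = (packet[pos // 8] | mask) if value else (packet[pos // 8] & ~mask)

def should_poison(window_length: int) -> bool:
    # Rule-based selection (step 6): poison a block only in certain cases,
    # e.g., only when the current window is of a certain length.
    return window_length == 1024  # hypothetical criterion

def poison_packet(packet: bytearray, secure_store: dict, window_length: int) -> bool:
    """Modify selected bits, saving the originals in a secure memory
    location (step 5, first embodiment). Returns True if poisoned."""
    if not should_poison(window_length):
        return False
    for pos in POISON_POSITIONS:
        secure_store[pos] = get_bit(packet, pos)
        set_bit(packet, pos, RESERVED_VALUE)
    return True
```

Note that the modified packet remains the same length and keeps its header intact, consistent with step 4's requirement that the result stay parseable by a standard decoder while being unplayable.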
This encoding process is illustrated in
If the logic 14 determines to poison a particular portion of the data, logic 15 as described above determines which bits m thereof are to be modified, and to what values. As described above, in one embodiment the original values of bits m are stored in secure memory location 36. Logic 32 then carries out the actual bit modifications and reassembles the data portion (packet or frame, for instance). The reassembled data portion is then transferred to encoder 20 for further conventional processing.
The decoding/playback process and associated apparatus 23′ are as follows, and are complementary to the encoding process and apparatus, see
1. The previously stored and encrypted song or video program is conventionally downloaded from nonvolatile content memory 12.
2. The encrypted data is decrypted by decryptor 16, conventionally, and passed to frame parser 18.
3. Decoding logic 17, for each data portion, detects the reserved bit values at the predetermined bit locations. If no such reserved bit values are detected, processing defaults to the conventional decoding at coder 20 and playing. This means no bit poisoning was done on that frame/packet or other data entity.
4. If the reserved bit values are detected or any other pertinent information is detected, this means poisoning has taken place. The logic 17 then invokes (e.g. in software, but not so limited) a secure de-poisoning routine, which is stored in secure memory location 36. (Such a secure location is known in the DRM field and readily implemented by those skilled in the art.)
5. At this point, there are two options (embodiments). In the first (path a in
Both the above processes and associated apparatuses of
This disclosure is illustrative and not limiting; further modifications will be apparent to one skilled in the art in light of this disclosure and are intended to fall within the scope of the appended claims.
Number | Name | Date | Kind |
---|---|---|---|
6477179 | Fujii et al. | Nov 2002 | B1 |
6507950 | Tsukidate et al. | Jan 2003 | B1 |
6636222 | Valmiki et al. | Oct 2003 | B1 |
6721710 | Lueck et al. | Apr 2004 | B1 |
6842522 | Downing | Jan 2005 | B1 |
7120250 | Candelore | Oct 2006 | B2 |
7133845 | Ginter et al. | Nov 2006 | B1 |
7356144 | Nishimoto et al. | Apr 2008 | B2 |
7420482 | Henry et al. | Sep 2008 | B2 |
20020080282 | Rumreich et al. | Jun 2002 | A1 |
20020141577 | Ripley et al. | Oct 2002 | A1 |
20020154775 | Yang | Oct 2002 | A1 |
20020188570 | Holliman et al. | Dec 2002 | A1 |
20020194613 | Unger | Dec 2002 | A1 |
20030035543 | Gillon et al. | Feb 2003 | A1 |
20030081773 | Sugahara et al. | May 2003 | A1 |
20030200452 | Tagawa et al. | Oct 2003 | A1 |
20030229489 | Mes | Dec 2003 | A1 |
20040001693 | Cavallerano et al. | Jan 2004 | A1 |
20040049687 | Orsini et al. | Mar 2004 | A1 |
20040161045 | Mizobata | Aug 2004 | A1 |
20040190872 | Loisel | Sep 2004 | A1 |
20050089164 | Lang et al. | Apr 2005 | A1 |
20050105809 | Abe et al. | May 2005 | A1 |
20050108612 | Downing | May 2005 | A1 |
20050193205 | Jacobs et al. | Sep 2005 | A1 |
20050232595 | Hirai | Oct 2005 | A1 |
20050285975 | Spektor et al. | Dec 2005 | A1 |
20060041576 | Ito | Feb 2006 | A1 |
20060069550 | Todd et al. | Mar 2006 | A1 |
20060093142 | Schneier et al. | May 2006 | A1 |
20060200846 | Phan | Sep 2006 | A1 |
20060222178 | Kuwabara et al. | Oct 2006 | A1 |
20060222330 | Lankford et al. | Oct 2006 | A1 |
20060269222 | Horii | Nov 2006 | A1 |
20070091873 | LeBlanc et al. | Apr 2007 | A1 |
20070110237 | Tehranchi et al. | May 2007 | A1 |
20070116128 | Evans et al. | May 2007 | A1 |
20070166002 | Mamidwar et al. | Jul 2007 | A1 |
20070217519 | Murayama et al. | Sep 2007 | A1 |
20070219934 | Wang et al. | Sep 2007 | A1 |
20070277039 | Zhao | Nov 2007 | A1 |
20080052516 | Tachibana et al. | Feb 2008 | A1 |
20080069340 | Vaughn | Mar 2008 | A1 |
20080247728 | Witham | Oct 2008 | A1 |
Number | Date | Country |
---|---|---|
WO-03010722 | Feb 2003 | WO |
WO-03060905 | Jul 2003 | WO |
WO-2005057535 | Jun 2005 | WO |
Number | Date | Country | |
---|---|---|---|
20080165961 A1 | Jul 2008 | US |