Methods utilizing steganography

Information

  • Patent Grant
  • Patent Number
    8,290,202
  • Date Filed
    Thursday, January 11, 2007
  • Date Issued
    Tuesday, October 16, 2012
Abstract
The present invention relates generally to steganography, digital watermarking and data hiding. In one embodiment a method is provided including: processing data representing content; analyzing the data to determine whether a plural-bit identifier is steganographically encoded therein, the plural-bit identifier encoded in the data through modifications to the data, the modifications steganographically hiding the presence of the plural-bit identifier; and upon detection of the plural-bit identifier, redundantly carrying out an action while attempting to conceal the action from one or more users of the content. Of course, other embodiments are described and claimed as well.
Description
FIELD OF THE INVENTION

The present invention relates to computer systems, and more particularly relates to techniques for establishing persistent evidence of a computer's use for possibly illicit purposes (e.g. counterfeiting).


BACKGROUND AND SUMMARY OF THE INVENTION

Fifty years ago, counterfeiting was a rare art practiced by a small number of skilled engravers using esoteric equipment. Today, counterfeiting is a rampant problem practiced by thousands of criminals using ubiquitous computer equipment.


Statistics from the U.S. Secret Service illustrate the magnitude of the problem in the United States. In a recent report, the Secret Service stated:

    • The amount of counterfeit currency passed in the United States over the last three fiscal years has remained fairly consistent; however, 1998 has seen a significant increase, largely due to inkjet produced counterfeits. Inkjet produced counterfeit currency comprised only 0.5% of the total counterfeit currency passed in fiscal year 1995. In comparison, 19% of the total counterfeit currency passed in the United States during fiscal year 1997 was inkjet produced, and 43% of the counterfeit currency passed through August 1998 has been ink jet counterfeit currency.
    • This trend is attributed to rapid improvements in technology, and the ever-increasing availability and affordability of scanners, high-resolution inkjet and other output devices, and computer systems. Digital counterfeiting is likely to continue to increase as the capabilities of systems and devices continue to improve, and as these capabilities become more readily understood by the criminal element.


Accompanying the Secret Service report was a table identifying the number of domestic counterfeiting plants raided, by type. Again, the explosive growth of inkjet counterfeiting is evident:


Type of Counterfeiting Plant     FY95    FY96    FY97    FY98 (through July)
Offset Counterfeiting              60      29      23      10
Toner-Based Counterfeiting         59      62      87      47
Inkjet-Based Counterfeiting        29     101     321     477


The problem is not limited to the United States; statistics from other countries show the above-detailed trends are worldwide.


Various means have been deployed over the years to deter the counterfeiting of banknotes and similar financial instruments. One is to incorporate design features in banknotes that are difficult to replicate. Another is to equip color photocopiers with the capability to recognize banknotes. If such a photocopier is presented with a banknote for duplication, copying is disabled or impaired.


Yet another approach is for color photocopiers to imperceptibly write their serial number on all output sheets, e.g. using small, light yellow lettering. (Such an arrangement is shown, e.g., in European laid-open application EP 554,115 and in U.S. Pat. No. 5,557,742.) While unknown to most of the public, the majority of color photocopiers employ this, or similar means, to mark all output copies with covert tracing data.


The inclusion of covert tracing data in all printed output from color photocopiers (and some color printers) brings into play the balancing of law enforcement needs versus the widely recognized users' rights of privacy and freedom of expression. Unbounded use of such covert marking techniques can raise the spectre of an Orwellian “Big Brother.”


In accordance with a preferred embodiment of the present invention, tracer data is selectively generated to assist law enforcement agencies in prosecuting counterfeiters. However, instead of rotely incorporating such data into all printed output, it is secretly stored in the counterfeiter's computer. If the computer is later searched or seized, the tracer data can be recovered and employed as evidence of the computer's use in counterfeiting.


The foregoing and additional features and advantages of the present invention will be more readily apparent from the following detailed description, which proceeds with reference to the accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a diagram of a computer system according to one embodiment of the present invention.



FIG. 2 is a diagram illustrating certain of the principles used in the FIG. 1 embodiment.





DETAILED DESCRIPTION

Referring to FIG. 1, a computer system 10 employed in one embodiment of the present invention includes a processor 11, a non-volatile store 12, volatile memory 14, an external interface 16, and various peripherals (e.g. a scanner 18, a printer 20, etc.).


The processor 11 typically comprises a CPU, such as one of the microprocessors available from Intel, Sun, AMD, Cyrix, Motorola, MIPS, etc. Alternatively, the processor can take other forms, including hardwired logic circuitry, programmable logic (e.g. FPGAs), or yet-to-be-devised processing arrangements.


The non-volatile store 12 typically comprises a magnetic disk, but can also include other writeable media, including optical disks, flash memory, EEPROMS, ROMBIOS, etc. The non-volatile store can be physically located with the processor 11 (e.g. hard disk, CMOS memory with system setup data, etc), and/or can be remote (e.g. a networked drive, storage accessible over the Internet, etc.).


The volatile memory 14 typically comprises RAM, either integrated with the CPU (e.g. cache), and/or separate.


The external interface 16 can take various forms, including a modem, a network interface, a USB port, etc. Any link to a remote resource other than common peripherals is generally considered to employ an external interface.


Stored in the non-volatile store 12 is various software. This includes operating system software, applications software, and various user files (word processing documents, image files, etc.). The operating system software typically includes a thousand or more files, including a registry database (detailing the resources available in the system, etc.) and various device drivers (which serve as software interfaces between the CPU and peripheral devices, such as scanner 18 and printer 20). The applications software includes executable code and data. Both the operating system software and the applications software may employ shared files (e.g. DLLs) which can be utilized by different executables and/or operating system components to provide desired functionality.


While illustrated as resident in the non-volatile store 12, the foregoing software is generally loaded into the volatile memory 14 for execution.


The peripherals 18, 20 are typically connected to the computer system through a port 22 (e.g. serial, parallel, USB, SCSI, etc.) which permits bi-directional data exchange. Each peripheral typically includes its own processor circuitry 24 that operates in conjunction with firmware 26 (software resident in memory within the peripheral) to perform peripheral-specific processing and control functions. In addition to the memory in which the firmware is stored (e.g. EEPROM, flash memory, etc.), some peripherals have other data storage. For example, the disposable “consumables” in printers increasingly include their own non-volatile memories 28 in which various calibration and/or usage data is stored.


In one embodiment of the present invention, the computer system writes forensic tracer data (sometimes termed an “audit trail”) to a non-volatile store if it detects a possibly illicit action, e.g. the processing of image data corresponding to a banknote. (For expository convenience, the term “banknote” is used to refer to all manner of value documents, including paper currency, travelers checks, money orders, stamps, university transcripts, stock certificates, passports, visas, concert or sporting event tickets, etc.) The data is written in a manner(s), and/or to a location(s), chosen to minimize its possible detection by a cautious perpetrator. If the computer is later inspected pursuant to a lawful search and seizure, it can be analyzed for the presence of incriminating tracer data.


There is considerable prior work in the field of detecting security documents from image data. Published European application EP 649,114, for example, describes banknote detection techniques based on the use of fuzzy inferencing to detect geometrical arrays of certain patterns (sometimes termed “common marks”) that are characteristic of banknotes. U.S. Pat. Nos. 5,515,451, 5,533,144, 5,629,990, and 5,796,869 describe banknote detection techniques based on different pattern matching techniques (e.g. to recognize the Federal Reserve seal). Xerox has also proposed its data glyph technology (detailed, e.g., in U.S. Pat. Nos. 5,706,364, 5,689,620, 5,684,885, 5,680,223, 5,668,636, 5,640,647, 5,594,809) as a means to mark security documents for later machine-identification.


Another means for detecting security documents is by use of Hough-based pattern matching techniques as described, e.g., in Hough's U.S. Pat. No. 3,069,654, and Ballard, “Generalizing the Hough Transform to Detect Arbitrary Shapes,” Pattern Recognition, Vol. 13, No. 2, pp. 111-122, 1981. One embodiment of such a system follows the approach outlined in the Ballard paper, and employs plural tables corresponding to different patterns found on banknotes, with different confidence levels. Gross Hough processing is first performed using one or more rotationally-invariant features (e.g. the U.S. Federal Reserve Seal) to quickly identify most image sets as not banknote-related. Any data that looks to be potentially banknote-related after the first check is subjected to successively more selective, higher-confidence tests (some stepping through plural rotational states) to weed out more and more non-banknote image sets. Finally, any image data passing all the screens is concluded to be, to a very high degree of certainty, a banknote. An appropriate signal is then generated (e.g. a change in state of a binary signal) to indicate detection of a banknote.
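The staged screening just described — a cheap, rotation-invariant first pass followed by successively more selective tests — can be sketched as a detector cascade. The screen functions below are placeholder assumptions standing in for the actual Hough-based pattern matchers, which the patent does not specify in code:

```python
from typing import Callable, List, Tuple

# Each screen is (name, test); a test returns True if the data is still
# a banknote suspect. Cheap screens come first so most images exit early.
Screen = Tuple[str, Callable[[bytes], bool]]

def classify(image_data: bytes, screens: List[Screen]) -> bool:
    """Return True only if the data passes every screen in order."""
    for name, test in screens:
        if not test(image_data):
            return False  # rejected early, at low cost
    return True  # survived all screens: banknote to a high degree of certainty

# Toy placeholder screens (assumptions, for illustration only).
screens = [
    ("gross_rotation_invariant", lambda d: b"seal" in d),    # fast first pass
    ("fine_rotational_search",   lambda d: b"seal#7" in d),  # slower, selective
]

print(classify(b"photo of a cat", screens))         # most images fail fast
print(classify(b"scan with seal#7 mark", screens))  # passes every screen
```

The design point is that the expensive, rotation-stepping tests only ever see the small fraction of images that survive the gross check.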


Neural networks and related algorithms are also suitable for detecting patterns characteristic of banknotes, as illustrated by European patent EP 731,961, etc.


In the present assignee's prior applications (e.g. 08/649,419, 09/074,034, 09/127,502, 60/082,228) techniques are disclosed for marking security documents with generally imperceptible, or steganographic, watermark data, so as to facilitate later identification of such documents. By employing digital watermark-based banknote detection in combination with visible feature-based banknote detection, very high confidence recognition of banknote data can be achieved.


The artisan is presumed to be familiar with the various approaches for recognizing banknotes from image data, of which the foregoing is just a sampling.


While such banknote-detection techniques are commonly implemented in resource-intensive form, using sophisticated processing units (e.g. the main CPU of a copier), this need not be the case. To reduce the resource requirements, the detection algorithm can be tailored to operate on parts of scan-line data, without buffering the entire set of image data for analysis. The algorithm can be implemented on less-sophisticated processors, such as those used in the scanner 18 or the printer 20. The processors can be programmed, by appropriate firmware, to perform such processing on any image data scanned by, or printed by, such devices. And as modems and other interfaces (SCSI, FireWire, IDE, ATAPI, etc.) continue their evolution from dedicated hardware to software-based implementations, their data processing capabilities increase commensurately. Thus, for example, software-implemented modems, network interfaces, UARTs, etc., can monitor the data traffic passing therethrough and flag any that appears to be banknote-related. The full analysis operation can be performed by the interface, or the data can be copied and passed to the main processor for further analysis.


In the preferred embodiment of the present invention, when banknote image data is detected, storage of forensic data is triggered. The forensic data typically includes at least the date (and optionally the time) at which the possibly illicit action occurred. Additionally, the forensic data can include the file name of the banknote image data (if available), and a code indicating the nature of the event noted (e.g., banknote data detected by the printer; banknote data detected passing through the modem on COM2; banknote data detected written to removable media having volume ID 01FF38; banknote data detected in file opened by Adobe Photoshop, etc.) The forensic data can additionally detail the source from which the data came, and/or the destination to which it was sent (e.g. IP/email addresses). In operating systems requiring user login, the stored forensic data will typically include the user ID. System status data can also be included, e.g. identifying peripheral devices attached to the system, code loaded into RAM memory, the amount of time the user spent working on the illicit data, etc. Selected data from any operating system registry database (e.g. identifying the registered owner of certain applications software then-loaded on the computer, software serial numbers, operational parameters, etc.) can likewise be included. If the computer is on a network or on the Internet, the network address, Ethernet MAC address, AppleTalk name and zone, TraceRoute information, or IP address information can be stored. If the illicit action has been detected by reference to a watermark or other embedded data, payload data recovered from the watermark can be included in the forensic tracer data.
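The forensic record described above might be modeled as a small structure carrying the event date, event code, file name, user ID, and any recovered watermark payload. The field names and serialization below are illustrative assumptions, not the patent's actual record format:

```python
import json
import time
from dataclasses import dataclass, asdict, field

@dataclass
class TracerRecord:
    event_code: str              # e.g. "BANKNOTE_PRINT", "BANKNOTE_MODEM_COM2"
    timestamp: float = field(default_factory=time.time)  # date/time of event
    file_name: str = ""          # name of the offending image file, if known
    user_id: str = ""            # login name, where the OS requires one
    watermark_payload: str = ""  # plural-bit data recovered from a watermark

    def serialize(self) -> bytes:
        """Flatten to JSON bytes, ready for encryption and covert storage."""
        return json.dumps(asdict(self), sort_keys=True).encode("utf-8")

rec = TracerRecord(event_code="BANKNOTE_PRINT",
                   file_name="note.tif", user_id="alice")
print(rec.serialize())
```

At the single-bit extreme mentioned next, the whole record collapses to one flag; the structure above corresponds to the detailed end of the spectrum.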


On one extreme, the foregoing (and possibly more) information can be stored in detailed forensic tracer records. At the other extreme, the forensic tracer record can comprise a single bit indicating that the computer system has been used—at least once—for a possibly illicit action.


Expecting that savvy counterfeiters will attempt to defeat such forensic tracer data, such data is desirably generated, transmitted, and stored redundantly, transparently, and inconspicuously.


Redundant generation of the tracer data refers to detection of possibly illicit activity at various points in the computer system, and/or during various operations. Referring to FIG. 2, possibly illicit activity can be detected, e.g., during scanning of an image, printing of a document, receiving or transmitting a file through a modem connection, opening a file with an application program, saving a file with an application program, copying data to a clipboard, etc. By providing multiple opportunities for detection of possibly illicit activities, the robustness of the system is increased.


Redundant transmission of the tracer data refers to its transmission to storage media several times. When a possibly illicit activity is detected, it is desirable to send tracer data to storage both immediately and on a delayed basis (e.g. five minutes after detection of banknote data, and every two minutes thereafter for a period of M minutes). By sending the data to storage repeatedly, the robustness of the system is again increased.
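The immediate-plus-delayed schedule described above can be sketched as a pure function that computes write times. The five-minute and two-minute figures come from the text; the function itself, and the bounded window parameter, are illustrative assumptions:

```python
def transmission_schedule(detect_time: float,
                          initial_delay: float = 300.0,  # five minutes
                          interval: float = 120.0,       # every two minutes
                          window: float = 600.0) -> list:
    """Return the times (in seconds) at which tracer data should be written:
    once immediately, then periodically after an initial delay."""
    times = [detect_time]                 # immediate write
    t = detect_time + initial_delay       # first delayed write
    end = t + window                      # stop after a bounded period
    while t <= end:
        times.append(t)
        t += interval
    return times

print(transmission_schedule(0.0, window=240.0))
# → [0.0, 300.0, 420.0, 540.0]
```

A real implementation would feed these times to a timer or task queue; spreading the writes out in time is what lets at least one copy survive an interrupted cover-up.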


Redundant storage of the tracer data refers to its storage at several different locations (simultaneously or sequentially).


If even one instance of the redundantly generated/transmitted/stored tracer data survives the counterfeiter's attempts to redact incriminating data, it will be useful evidence in any prosecution.


Transparent generation/transmission/storage means that the acts associated with these operations will not arouse the counterfeiter's suspicion.


Various software tools are available to trace program execution. A savvy counterfeiter may employ such tools to monitor all disk writes performed by his system. Consider, for example, a counterfeiter using an image processing program in aid of his counterfeiting. The person may monitor the sequence of files opened and closed (and/or the data read/written) during use of the program for image processing with non-banknote data, and then be suspicious if different files, or in different orders, are opened and closed when performing the same image processing operations on banknote data. Thus, at least some of the forensic data should be stored using routine operations and routine files (e.g. writes to files that are used during normal program execution). Of course, such tracer data should be written in a manner assuring that the data will persist—either in the location originally written, or by copying during subsequent machine operation (e.g. on closing the application program, shutting down the operating system, etc.) to a location assuring longer-term availability.


Program-tracing tools typically monitor just the computer's main CPU so—where possible—at least some of the tracer data should be stored under the control of a different processing element, or in a location to which the tool's capabilities do not extend. Another option is to keep at least some of the tracer data in RAM memory for a period after the illicit action has been detected, and store it later.


Yet another option is to store at least some forensic tracer records in the operating system registry database. This resource is commonly accessed during system operation, so references to the database may not give rise to suspicion.


Inconspicuous storage covers a wide range of options. One is that the data be encrypted. This assures that simple disk-scanning operations attempting to find byte strings likely associated with tracer data will be unsuccessful. (Numerous encryption techniques are known, e.g. RSA, PGP, various private key techniques, etc., any of which can be used.)
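To make the defeat of byte-string scans concrete, here is a toy symmetric scheme that XORs a record against a SHA-256 counter-mode keystream. This is a stand-in for illustration only; the text contemplates vetted techniques (RSA, PGP, various private-key ciphers), and a real system should use one of those rather than this sketch:

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive `length` pseudo-random bytes from `key` via SHA-256 in counter mode."""
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def xor_crypt(key: bytes, data: bytes) -> bytes:
    """Symmetric: the same call both encrypts and decrypts."""
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

record = b'{"event_code": "BANKNOTE_PRINT"}'
blob = xor_crypt(b"secret", record)
assert xor_crypt(b"secret", blob) == record  # round-trips
assert blob != record                        # unrecognizable to a disk scan
```

The stored blob contains no recognizable byte strings, which is exactly the property the paragraph above is after.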


Encrypted tracer data can be stored with other encrypted system data, such as in a password file. Due to its encrypted nature, a hacker may not be able to discern what part of the stored data is tracer data and what part is, e.g., password data. Attempts to redact the tracer data risk corrupting the password data, jeopardizing the counterfeiter's later ability to login to the machine.


Another possibility is to steganographically encode the tracer data, e.g. by randomizing/obfuscating same and inconspicuously hiding it amidst other data (e.g. within graphic or audio files associated with start-up or shut-down of the computer operating system, or wherever else noise-like data can be introduced without alerting the user to its presence). Still another possibility is to create null code that resembles normal instructions or data, but instead serves as a forensic tracer record.


To avoid creation of telltale new files in the non-volatile memory, the tracer data can be patched into existing files, by appending to the end or otherwise. Or, rather than storing the tracer data as the content of a file, the data can be stored among a file's “properties.”


Another way to avoid creating new files is to avoid using the computer's “file system” altogether, and instead use low-level programming to effect direct writes to typically-unused or reserved physical areas on the disk. By such techniques, the data is resident on the disk, but does not appear in any directory listing. (While such data may be lost if disk optimization tools are subsequently used, those skilled in the art will recognize that steps can be taken to minimize such risks.)


Yet another way to avoid creating new files is to relay at least some of the tracer data outside the computer. One expedient is to use an external interface to transmit the data for remote storage. Again, a great variety of techniques can be employed to reliably, yet inconspicuously, effect such transmission. And the data transmission need not occur at the moment the possibly illicit action is occurring. Instead, such data can be queued and relayed away from the machine at a later time.


Still another way to avoid creating new files is to make use of deadwood files that commonly exist on most computers. For example, application programs typically employ installation utilities which copy compressed files onto the disk, together with code to decompress and install the software. These compressed files and installation programs are usually not deleted, providing opportunities for their use as repositories of tracer data. Similarly, many computers include dozens or hundreds of duplicate files—only one of which is commonly used. By converting one or more of these files to use as a repository for tracer data, additional inconspicuous storage can be achieved.


Some application programs include hundreds of files, various of which are provided just for the occasional use of the rare super-user. Files that pass some litmus test of inactivity (e.g. not ever used, or not accessed for at least two years) might serve as tracer data repositories. (Disk utilities are available to determine when a given file was last accessed.) Yet another option is to append data to an application's Help files, or other binary data files used to save program state information for the application.
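The "litmus test of inactivity" mentioned above can be sketched with `os.stat`, whose `st_atime` field reports a file's last-access time. The two-year threshold comes from the text; note the assumption that the filesystem actually tracks access times (some are mounted with access-time updates disabled, in which case this probe is unreliable):

```python
import os
import time

TWO_YEARS = 2 * 365 * 24 * 3600  # threshold from the text, in seconds

def is_inactive(path: str, threshold: float = TWO_YEARS,
                now: float = None) -> bool:
    """True if the file has not been accessed within `threshold` seconds,
    making it a candidate repository for tracer data."""
    now = time.time() if now is None else now
    return (now - os.stat(path).st_atime) >= threshold

# Usage (hypothetical path):
# if is_inactive("/some/old/install/archive"):
#     ...  # candidate repository
```

Each write to such a file would itself refresh the access time, so an implementation would also need to restore the old timestamps to stay inconspicuous.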


Resort may also be made to various of the known techniques employed in computer viruses to generate, transmit, store and disseminate/replicate the forensic tracer data in manners that escape common detection. Moreover, such virus techniques can be used to initially spread and install the functionality detailed above (i.e. pattern recognition, and tracer data generation/transmission/storage) onto computers without such capabilities.


Some embodiments may perform self-integrity checks of old tracer records each time a new banknote is encountered, and repair any damage encountered. Similarly, old tracer records can be expanded to detail new illicit acts, in addition to (or in lieu of) creating independent records for each illicit act.


Various tools can be used to replicate/propagate forensic tracer records to further infest the system with incriminating evidence. Utility software such as disk defragmenters, disk integrity checks, virus checkers, and other periodically-executed system maintenance tools can be written/patched to look in some of the places where forensic tracer records may be found and, if any are encountered, copy them to additional locations. Similar operations can be performed upon termination of selected application programs (e.g. image processing programs).


The foregoing is just the tip of the iceberg. Those skilled in the arts of computer programming, operating system design, disk utilities, peripheral firmware development, packet data transport, data compression, etc., etc., will each recognize many different opportunities that might be exploited to effect surreptitious, reliable banknote detection, and transmission, storage, and/or replication of tracer data. Again, if even one tracer record persists when the computer is searched by suitably-authorized law enforcement officials, incriminating evidence may be obtained. The high odds against ridding a computer of all incriminating data should serve as a deterrent against the computer's use for illegal purposes in the first place.


As noted, the computer system desirably includes several checkpoints for detecting illicit actions. In the case of banknote image processing, for example, detectors can be implemented in some or all of the following: in image processing software applications, in DLLs commonly used with image processing, in printer drivers, in printer firmware, in scanner drivers, in scanner firmware, in modem or other external interface drivers and software, in email software, in FTP software, in the operating system (looking at the clipboard, etc.), etc., etc. Similarly, where practical, the checking should be done by several different processors (e.g. main CPU, programmable interface chips, scanner microcontroller, printer microprocessor, etc.).
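The multiple-checkpoint idea above can be sketched as a registry in which each subsystem (printer driver, scanner firmware, clipboard monitor, and so on) installs a detector callback; any one of them firing would trigger the tracer-data response. The registry API and the toy detectors are illustrative assumptions:

```python
from typing import Callable, Dict, List

_checkpoints: Dict[str, Callable[[bytes], bool]] = {}

def register(name: str, detector: Callable[[bytes], bool]) -> None:
    """Install a detector at a named checkpoint."""
    _checkpoints[name] = detector

def scan(data: bytes) -> List[str]:
    """Return the names of every checkpoint whose detector fires on `data`."""
    return [name for name, det in _checkpoints.items() if det(data)]

# Toy detectors standing in for the real banknote recognizers.
register("printer_driver", lambda d: b"banknote" in d)
register("clipboard",      lambda d: b"banknote" in d)

print(scan(b"banknote image"))   # both checkpoints fire
print(scan(b"vacation photo"))   # neither fires
```

Because the checkpoints are independent, disabling any one of them (say, patching the printer driver) leaves the others intact, which is the robustness argument the paragraph makes.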


From the foregoing, it will be recognized that techniques according to the present invention can be used to discourage counterfeiting, and to aid in its prosecution when encountered. Moreover, this approach obviates the prior art approach of marking all color photocopies with tracer data, with its accompanying privacy and first amendment entanglements.


Having described and illustrated the principles of our invention with reference to an illustrative embodiment and several variations thereon, it should be recognized that the invention can be modified in arrangement and detail without departing from such principles.


For example, while the detailed embodiment has focused on a computer system, the same techniques can likewise be employed in stand-alone color copiers, etc.


Similarly, while the detailed embodiment has focused on counterfeiting, it will be recognized that computers can be employed in various other illicit or unauthorized activities. Each generally is susceptible to computer-detection (e.g. threats against the president may be detected by specialized natural language analysis programs; computer-aided synthesis of illegal drugs may be indicated by certain chemical modeling instructions in software specific to that industry; unauthorized duplication of copyrighted works may be flagged by the presence of embedded watermark data in the copyrighted work; unauthorized distribution of classified or confidential business documents may be detected using known techniques, etc.). The storage of forensic tracer data along the lines detailed above is equally applicable in such other contexts.


In the future, support for illicit activity detection may be routinely provided in a wide variety of software and peripherals. In one embodiment, the software and peripherals may include generic services supporting the compilation of forensic tracer data, its encryption, transmission, storage, etc. These generic services can be invoked by detector modules that are customized to the particular illicit/unauthorized activity of concern. Some of the detector modules can be fairly generic too, e.g. generic pattern recognition or watermark detection services. These can be customized by data loaded into the computer (either at manufacture, or surreptitiously accompanying new or updated software) identifying particular images whose reproduction is unauthorized/illicit. As new banknotes are issued, updated customization data can be distributed. (Naturally, such detector customization data will need to be loaded and stored in a manner that is resistant against attack, e.g. using the approaches outlined above for the covert tracer data.)


While the invention is described in the context of an end-user computer, the principles are equally applicable in other contexts, e.g. in server computers. Moreover, the principles are not limited to use in general purpose personal computers but can also be applied in other computer devices, e.g. digital cameras, personal digital assistants, set-top boxes, handheld devices, firewalls, routers, etc.


Although not belabored above, it will be understood that law enforcement agencies will have software recovery tools that can be employed on suspect computer systems to recover whatever forensic tracer data may persist. Briefly, such tools know where to look for tracer data and, when encountered, know how to interpret the stored records. After analyzing the non-volatile stores associated with a suspect computer system, the recovery software will report the results. The implementation of such tools is well within the capabilities of an artisan.


While the foregoing disclosure has focused exclusively on the storage of forensic tracer data as the response to a possibly-illicit action, more typically this is just one of several responses that would occur. Others are detailed in the previously-referenced documents (e.g. disabling output, hiding tracer data (e.g. as in U.S. Pat. No. 5,557,742, or using steganographically encoded digital watermark data) in the output, telephoning law enforcement officials, etc.).


To provide a comprehensive disclosure without unduly lengthening this specification, applicants incorporate by reference the patent applications and documents referenced above. By so doing, applicants mean to teach that the systems, elements, and methods taught in such documents find application in combination with the techniques disclosed herein. The particular implementation details of such combinations are not belabored here, being within the skill of the routineer in the relevant arts.


In view of the many possible embodiments in which the principles of our invention may be realized, it should be recognized that the detailed embodiments are illustrative only and should not be taken as limiting the scope of our invention. Rather, we claim as our invention all such modifications, combinations, and implementations as may come within the scope and spirit of the following claims, and equivalents thereof.

Claims
  • 1. A method comprising: obtaining access to a set of data corresponding to an image or video; controlling a steganographic encoding detector to examine only a portion of the data, without examining the entire set of data; detecting plural-bit data from the portion of the data; and upon detection of the plural-bit data, carrying out an action based on the detected plural-bit data, wherein the plural-bit data indicates a possible use of a device for an illicit activity and wherein the action comprises storing or generating forensic data to identify the possible use or the illicit activity.
  • 2. The method of claim 1, wherein the set of data comprises scan-line data, and wherein the detecting operates on only a portion of the scan-line data, without buffering the entire set of data for analysis.
  • 3. A method comprising: obtaining data representing content; analyzing the data to determine whether a steganographic signal is encoded therein through modifications to the data, wherein the modifications are made so as to conceal the presence of the steganographic signal to a casual human observer of the content; and upon detection of the steganographic signal, redundantly carrying out an action that is concealed from a user of the content, wherein the steganographic signal indicates a possible use of a device for an illicit activity, and wherein the action comprises storing or generating forensic data to identify the possible use or the illicit activity.
  • 4. The method of claim 3, wherein the content comprises an image or video.
  • 5. The method of claim 3, wherein the steganographic signal comprises an identifier, and wherein the action comprises transmitting at least the identifier.
  • 6. The method of claim 5, wherein the identifier is transmitted to a remote device.
  • 7. The method of claim 3, wherein the steganographic signal comprises an identifier, and wherein the action comprises generating data.
  • 8. The method of claim 3, wherein the steganographic signal comprises an identifier, and wherein the action comprises storing data.
  • 9. The method of claim 8, wherein the data is remotely stored.
  • 10. A method comprising: processing data representing content; analyzing the data to determine whether a plural-bit identifier is steganographically encoded therein through modifications to the data, wherein the modifications steganographically hide the plural-bit identifier; upon detection of the plural-bit identifier, redundantly carrying out an action while attempting to conceal the action from one or more users of the content, wherein the plural-bit identifier indicates a possible use of a device for an illicit activity and wherein the action comprises storing or generating forensic data to identify the possible use or the illicit activity.
  • 11. The method of claim 10, wherein the action comprises transmitting at least the plural-bit identifier.
  • 12. The method of claim 11, wherein the plural-bit identifier is transmitted to a remote device.
  • 13. The method of claim 10, wherein the action comprises generating data.
  • 14. The method of claim 10, wherein the action comprises storing data.
  • 15. The method of claim 10, wherein the content comprises an image or video.
  • 16. A device comprising: a memory configured to store data representing content; and a processor operatively coupled to the memory and configured to: analyze the data to determine whether a steganographic signal is encoded therein through modifications to the data, wherein the modifications are made so as to conceal a presence of the steganographic signal; and upon detection of the steganographic signal, carry out an action that is concealed from a user of the content, wherein the steganographic signal indicates a possible use of the device for an illicit activity, and wherein the action comprises storing or generating forensic data to identify the possible use or the illicit activity.
  • 17. The device of claim 16, wherein the steganographic signal comprises an identifier, and further comprising a transmitter configured to transmit at least the identifier to a remote device.
  • 18. A tangible computer-readable medium having instructions stored thereon, the instructions comprising: instructions to obtain data representing content; and instructions to analyze the data to determine whether a steganographic signal is encoded therein through modifications to the data, wherein the modifications are made so as to conceal a presence of the steganographic signal; and upon detection of the steganographic signal, instructions to carry out an action that is concealed from a user of the content, wherein the steganographic signal indicates a possible use of the device for an illicit activity, and wherein the action comprises storing or generating forensic data to identify the possible use or the illicit activity.
  • 19. The tangible computer-readable medium of claim 18, wherein the steganographic signal comprises an identifier, and further comprising instructions to transmit at least the identifier to a remote device.
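The detect-then-act flow recited in the claims above (analyze content data for a steganographically hidden plural-bit identifier; upon detection, quietly store forensic data) can be sketched as follows. This is a minimal illustrative sketch only: the 16-bit identifier, the least-significant-bit encoding, and all function names are assumptions for the example, not the detection scheme disclosed in the patent, which would use far more robust watermark-detection techniques.

```python
import hashlib

# Hypothetical 16-bit identifier assumed to be hidden in the least
# significant bits of the first 16 pixel values (illustrative only).
SUSPECT_ID = 0b1011001110001101


def extract_lsb_identifier(pixels, n_bits=16):
    """Recover a plural-bit identifier from pixel LSBs (toy scheme)."""
    bits = 0
    for p in pixels[:n_bits]:
        bits = (bits << 1) | (p & 1)  # first pixel supplies the MSB
    return bits


def detect_and_log(pixels, forensic_log):
    """If the identifier is present, silently record forensic data.

    Returns the pixels unchanged, so a caller (e.g. a scanner or
    printer driver) exhibits no visible difference to the user.
    """
    if extract_lsb_identifier(pixels) == SUSPECT_ID:
        digest = hashlib.sha256(bytes(p & 0xFF for p in pixels)).hexdigest()
        forensic_log.append({"event": "suspect-content", "digest": digest})
    return pixels
```

Note that the claims require the action to be concealed: the detector returns its input unmodified and records evidence out of band, rather than interrupting or alerting the user.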
RELATED APPLICATION DATA

This application is a continuation-in-part of U.S. patent application Ser. No. 10/639,598, filed Aug. 11, 2003 (published as US 2006-0028689 A1). The U.S. patent application Ser. No. 10/639,598 is a continuation-in-part of U.S. patent application Ser. No. 10/367,092, filed Feb. 13, 2003 (now U.S. Pat. No. 7,113,615), which is a continuation-in-part of U.S. patent application Ser. No. 09/185,380, filed Nov. 3, 1998 (now U.S. Pat. No. 6,549,638). The U.S. patent application Ser. No. 10/639,598 is also a continuation-in-part of U.S. patent application Ser. No. 09/465,418, filed Dec. 16, 1999 (abandoned), which claims the benefit of U.S. Provisional Patent Application No. 60/112,955, filed Dec. 18, 1998. The above U.S. patent documents are hereby incorporated by reference.

US Referenced Citations (257)
Number Name Date Kind
4297729 Steynor et al. Oct 1981 A
4528588 Lofberg Jul 1985 A
4547804 Greenberg Oct 1985 A
4807031 Broughton Feb 1989 A
4908873 Philibert et al. Mar 1990 A
4930160 Vogel et al. May 1990 A
4969041 O'Grady et al. Nov 1990 A
5040059 Leberl Aug 1991 A
5351287 Bhattacharyya et al. Sep 1994 A
5377269 Heptig et al. Dec 1994 A
5416307 Danek et al. May 1995 A
5436974 Kovanen Jul 1995 A
5444779 Daniele Aug 1995 A
5450490 Jensen et al. Sep 1995 A
5469222 Sprague Nov 1995 A
5481377 Udagawa et al. Jan 1996 A
5483602 Stenzel et al. Jan 1996 A
5483658 Grube et al. Jan 1996 A
5485213 Murashita et al. Jan 1996 A
5499294 Friedman Mar 1996 A
5515451 Tsuji et al. May 1996 A
5533144 Fan Jul 1996 A
5557742 Smaha et al. Sep 1996 A
5568550 Ur Oct 1996 A
5574962 Fardeau et al. Nov 1996 A
5579124 Aijala et al. Nov 1996 A
5602906 Phelps Feb 1997 A
5610688 Inamoto et al. Mar 1997 A
5613004 Cooperman et al. Mar 1997 A
5629990 Tsuji et al. May 1997 A
5636292 Rhoads Jun 1997 A
5646997 Barton Jul 1997 A
5652626 Kawakami et al. Jul 1997 A
5652802 Graves et al. Jul 1997 A
5671277 Ikenoue et al. Sep 1997 A
5678155 Miyaza Oct 1997 A
5687236 Moskowitz et al. Nov 1997 A
5710636 Curry Jan 1998 A
5727092 Sandford, II et al. Mar 1998 A
5739864 Copeland Apr 1998 A
5745604 Rhoads Apr 1998 A
5748763 Rhoads May 1998 A
5751854 Saitoh et al. May 1998 A
5752152 Gasper et al. May 1998 A
5761686 Bloomberg Jun 1998 A
5768426 Rhoads Jun 1998 A
5774452 Wolosewicz Jun 1998 A
5790693 Graves et al. Aug 1998 A
5790697 Munro et al. Aug 1998 A
5790932 Komaki et al. Aug 1998 A
5796869 Tsuji et al. Aug 1998 A
5822360 Lee et al. Oct 1998 A
5822436 Rhoads Oct 1998 A
5838814 Moore Nov 1998 A
5850481 Rhoads Dec 1998 A
5892900 Ginter et al. Apr 1999 A
5901224 Hecht May 1999 A
5905800 Moskowitz et al. May 1999 A
5905810 Jones et al. May 1999 A
5974548 Adams Oct 1999 A
5982956 Lahmi Nov 1999 A
5987127 Ikenoue et al. Nov 1999 A
6026193 Rhoads Feb 2000 A
6073123 Staley Jun 2000 A
6122392 Rhoads Sep 2000 A
6131162 Yoshiura et al. Oct 2000 A
6182218 Saito Jan 2001 B1
6185312 Nakamura et al. Feb 2001 B1
6185321 Fukushima et al. Feb 2001 B1
6216228 Chapman Apr 2001 B1
6233684 Stefik et al. May 2001 B1
6246775 Nakamura et al. Jun 2001 B1
6249870 Kobayashi et al. Jun 2001 B1
6256110 Yoshitani Jul 2001 B1
6266430 Rhoads Jul 2001 B1
6278807 Ito et al. Aug 2001 B1
6282654 Ikeda et al. Aug 2001 B1
6285774 Schumann et al. Sep 2001 B1
6285776 Rhoads Sep 2001 B1
6289108 Rhoads Sep 2001 B1
6314518 Linnartz Nov 2001 B1
6330335 Rhoads Dec 2001 B1
6343138 Rhoads Jan 2002 B1
6345104 Rhoads Feb 2002 B1
6353672 Rhoads Mar 2002 B1
6363159 Rhoads Mar 2002 B1
6384935 Yamazaki May 2002 B1
6389055 August et al. May 2002 B1
6389151 Carr et al. May 2002 B1
6400827 Rhoads Jun 2002 B1
6404898 Rhoads Jun 2002 B1
6408082 Rhoads et al. Jun 2002 B1
6424726 Nakano et al. Jul 2002 B2
6427020 Rhoads Jul 2002 B1
6430302 Rhoads Aug 2002 B2
6442284 Gustafson et al. Aug 2002 B1
6449377 Rhoads Sep 2002 B1
6449379 Rhoads Sep 2002 B1
6456393 Bhattacharjya et al. Sep 2002 B1
6496591 Rhoads Dec 2002 B1
6499105 Yoshiura et al. Dec 2002 B1
6519352 Rhoads Feb 2003 B2
6522771 Rhoads Feb 2003 B2
6535618 Rhoads Mar 2003 B1
6539095 Rhoads Mar 2003 B1
6542618 Rhoads Apr 2003 B1
6542620 Rhoads Apr 2003 B1
6549638 Davis Apr 2003 B2
6556688 Ratnakar et al. Apr 2003 B1
6560349 Rhoads May 2003 B1
6567534 Rhoads May 2003 B1
6567535 Rhoads May 2003 B2
6567780 Rhoads May 2003 B2
6574350 Rhoads et al. Jun 2003 B1
6580819 Rhoads Jun 2003 B1
6587821 Rhoads Jul 2003 B1
6590997 Rhoads Jul 2003 B2
6647129 Rhoads Nov 2003 B2
6654480 Rhoads Nov 2003 B2
6654887 Rhoads Nov 2003 B2
6675146 Rhoads Jan 2004 B2
6724912 Carr et al. Apr 2004 B1
6738495 Rhoads et al. May 2004 B2
6744907 Rhoads Jun 2004 B2
6750985 Rhoads et al. Jun 2004 B2
6754377 Rhoads Jun 2004 B2
6757406 Rhoads Jun 2004 B2
6768808 Rhoads Jul 2004 B2
6771796 Rhoads Aug 2004 B2
6778682 Rhoads Aug 2004 B2
6804379 Rhoads Oct 2004 B2
6882738 Davis et al. Apr 2005 B2
6915481 Tewfik et al. Jul 2005 B1
6922480 Rhoads Jul 2005 B2
6944298 Rhoads Sep 2005 B1
6952485 Davidson et al. Oct 2005 B1
6959100 Rhoads Oct 2005 B2
6959386 Rhoads Oct 2005 B2
6968072 Tian Nov 2005 B1
6970573 Carr et al. Nov 2005 B2
6978036 Alattar et al. Dec 2005 B2
6983051 Rhoads Jan 2006 B1
6987862 Rhoads Jan 2006 B2
6993152 Patterson et al. Jan 2006 B2
7003132 Rhoads Feb 2006 B2
7020304 Alattar et al. Mar 2006 B2
7054462 Rhoads et al. May 2006 B2
7054463 Rhoads et al. May 2006 B2
7076084 Davis et al. Jul 2006 B2
7113615 Rhoads et al. Sep 2006 B2
7130087 Rhoads Oct 2006 B2
7181022 Rhoads Feb 2007 B2
7184570 Rhoads Feb 2007 B2
7224819 Levy et al. May 2007 B2
7228427 Fransdonk Jun 2007 B2
7239734 Alattar et al. Jul 2007 B2
7242790 Rhoads Jul 2007 B2
7263203 Rhoads et al. Aug 2007 B2
7266217 Rhoads et al. Sep 2007 B2
7269275 Carr et al. Sep 2007 B2
7286684 Rhoads et al. Oct 2007 B2
7305117 Davis et al. Dec 2007 B2
7313253 Davis et al. Dec 2007 B2
7321667 Stach Jan 2008 B2
7340076 Stach et al. Mar 2008 B2
7359528 Rhoads Apr 2008 B2
7372976 Rhoads et al. May 2008 B2
7415129 Rhoads Aug 2008 B2
7418111 Rhoads Aug 2008 B2
7424132 Rhoads Sep 2008 B2
7499564 Rhoads Mar 2009 B2
7499566 Rhoads Mar 2009 B2
7532741 Stach May 2009 B2
7536555 Rhoads May 2009 B2
7539325 Rhoads et al. May 2009 B2
7548643 Davis et al. Jun 2009 B2
7555139 Rhoads et al. Jun 2009 B2
7567686 Rhoads Jul 2009 B2
7567721 Alattar et al. Jul 2009 B2
7570784 Alattar Aug 2009 B2
7587601 Levy et al. Sep 2009 B2
7602940 Rhoads et al. Oct 2009 B2
7602977 Rhoads et al. Oct 2009 B2
7606390 Rhoads Oct 2009 B2
7639837 Carr et al. Dec 2009 B2
7672477 Rhoads Mar 2010 B2
7676059 Rhoads Mar 2010 B2
7702511 Rhoads Apr 2010 B2
7711564 Levy et al. May 2010 B2
7720249 Rhoads May 2010 B2
7720255 Rhoads May 2010 B2
7724919 Rhoads May 2010 B2
7769202 Bradley et al. Aug 2010 B2
7787653 Rhoads Aug 2010 B2
7796826 Rhoads et al. Sep 2010 B2
7831062 Stach Nov 2010 B2
20010002827 Yamazaki et al. Jun 2001 A1
20010017709 Murakami et al. Aug 2001 A1
20010021144 Oshima et al. Sep 2001 A1
20010021260 Chung et al. Sep 2001 A1
20010022848 Rhoads Sep 2001 A1
20010024510 Iwamura Sep 2001 A1
20010026618 Van Wie et al. Oct 2001 A1
20010042043 Shear et al. Nov 2001 A1
20020021824 Reed et al. Feb 2002 A1
20020021825 Rhoads Feb 2002 A1
20020039314 Yoshimura et al. Apr 2002 A1
20020041686 Moriyama et al. Apr 2002 A1
20020054317 Matsunoshita et al. May 2002 A1
20020054356 Kurita et al. May 2002 A1
20020054692 Suzuki et al. May 2002 A1
20020056081 Morley et al. May 2002 A1
20020056118 Hunter et al. May 2002 A1
20020059238 Saito May 2002 A1
20020080995 Rhoads Jun 2002 A1
20020085238 Umenda Jul 2002 A1
20020097420 Takaragi et al. Jul 2002 A1
20020107803 Lisanke et al. Aug 2002 A1
20020136429 Stach et al. Sep 2002 A1
20020191810 Fudge et al. Dec 2002 A1
20030021440 Rhoads Jan 2003 A1
20030084284 Ando et al. May 2003 A1
20030138128 Rhoads Jul 2003 A1
20040057581 Rhoads Mar 2004 A1
20040181671 Brundage et al. Sep 2004 A1
20040263911 Rodriguez Dec 2004 A1
20050025463 Bloom et al. Feb 2005 A1
20050111047 Rhoads May 2005 A1
20060028689 Perry et al. Feb 2006 A1
20060031684 Sharma et al. Feb 2006 A1
20060062386 Rhoads Mar 2006 A1
20060075244 Schumann et al. Apr 2006 A1
20070016790 Brundage et al. Jan 2007 A1
20070047766 Rhoads Mar 2007 A1
20070172098 Rhoads Jul 2007 A1
20070201835 Rhoads Aug 2007 A1
20080131083 Rhoads Jun 2008 A1
20080131084 Rhoads Jun 2008 A1
20080149713 Rhoads et al. Jun 2008 A1
20080240490 Finkelstein et al. Oct 2008 A1
20080253740 Rhoads Oct 2008 A1
20080275906 Brundage Nov 2008 A1
20090252401 Davis et al. Oct 2009 A1
20100008534 Rhoads Jan 2010 A1
20100008536 Rhoads Jan 2010 A1
20100008537 Rhoads Jan 2010 A1
20100021004 Rhoads Jan 2010 A1
20100027969 Alattar Feb 2010 A1
20100040255 Rhoads Feb 2010 A1
20100054529 Rhoads Mar 2010 A1
20100119108 Rhoads May 2010 A1
20100131767 Rhoads May 2010 A1
20100142752 Rhoads et al. Jun 2010 A1
20100146285 Rhoads et al. Jun 2010 A1
20100163629 Rhoads et al. Jul 2010 A1
20100172538 Rhoads Jul 2010 A1
20100226529 Rhoads Sep 2010 A1
Foreign Referenced Citations (10)
Number Date Country
0789480 Aug 1997 EP
1152592 Nov 2001 EP
1223742 Jul 2002 EP
WO 9504665 Feb 1995 WO
WO 9743736 Nov 1997 WO
WO 0106703 Jan 2001 WO
WO 0174053 Oct 2001 WO
WO 0203385 Jan 2002 WO
WO 0207150 Jan 2002 WO
WO 0229510 Apr 2002 WO
Related Publications (1)
Number Date Country
20070180251 A1 Aug 2007 US
Provisional Applications (1)
Number Date Country
60112955 Dec 1998 US
Continuation in Parts (4)
Number Date Country
Parent 10639598 Aug 2003 US
Child 11622373 US
Parent 10367092 Feb 2003 US
Child 10639598 US
Parent 09185380 Nov 1998 US
Child 10367092 US
Parent 09465418 Dec 1999 US
Child 10639598 US