Arrangement of objects in images or graphics to convey a machine-readable signal

Abstract
The disclosure provides apparatus related to arranging objects (e.g., circles, dots and other shapes) in images and graphics to convey a machine-readable signal. One claim recites an apparatus including: electronic memory for storing image or graphic data; and an electronic processor programmed for: integrating a plurality of objects in stored image or graphic data, the objects being arranged in a pattern that is machine-readable, the plurality of objects being integrated in the image or graphic so that the pattern is hidden in the image or graphic through cooperation with design elements of the image or graphic; and providing a visible structure for aiding in machine-reading of the pattern. Of course, other combinations are provided and claimed as well.
Description
FIELD OF THE INVENTION

The present invention relates to steganography and data hiding.


BACKGROUND AND SUMMARY OF THE INVENTION

Digital watermarking is a process for modifying physical or electronic media to embed a machine-readable code into the media. The media may be modified such that the embedded code is imperceptible or nearly imperceptible to the user, yet may be detected through an automated detection process. Most commonly, digital watermarking is applied to media signals such as images, audio signals, and video signals. However, it may also be applied to other types of media objects, including documents (e.g., through line, word or character shifting), software, multi-dimensional graphics models, and surface textures of objects.


Digital watermarking systems typically have two primary components: an encoder that embeds the watermark in a host media signal, and a decoder that detects and reads the embedded watermark from a signal suspected of containing a watermark (a suspect signal). The encoder embeds a watermark by altering the host media signal. The reading component analyzes a suspect signal to detect whether a watermark is present. In applications where the watermark encodes information, the reader extracts this information from the detected watermark.


Several particular watermarking techniques have been developed. The reader is presumed to be familiar with the literature in this field. Particular techniques for embedding and detecting imperceptible watermarks in media signals are detailed in the assignee's co-pending U.S. patent application Ser. No. 09/503,881 (now U.S. Pat. No. 6,614,914) and U.S. Pat. No. 6,122,403, which are each herein incorporated by reference.


In parent application Ser. No. 09/127,502 (now U.S. Pat. No. 6,345,104) we disclose the following: Many security documents are still designed largely by hand. A designer works at a drafting table or computer workstation, and spends many hours laying-out minute (e.g. 5 mm×5 mm) excerpts of the design. To aid integration of watermark and/or calibration pattern data in this process, an accessory layout grid can be provided, identifying the watermark “bias” (e.g. −3 to +3) that is to be included in each 250 micron cell of the security document. If the accessory grid indicates that the luminance should be slightly increased in a cell (e.g. 1%), the designer can bear this bias in mind when defining the composition of the cell and include a touch less ink than might otherwise be included. Similarly, if the accessory grid indicates that the luminance should be somewhat strongly decreased in a cell (e.g. 5%), the designer can again bear this in mind and try to include more ink than might otherwise be included. Due to the substantial redundancy of most watermark encoding techniques, strict compliance by the designer to these guidelines is not required. Even loose compliance can result in artwork that requires little, if any, further modification to reliably convey watermark and/or calibration information.


Such “designing-in” of embedded information in security documents is facilitated by the number of arbitrary design choices made by security document designers. A few examples from U.S. banknotes include the curls in the presidents' hair, the drape of clothing, the clouds in the skies, the shrubbery in the landscaping, the bricks in the pyramid, the fill patterns in the lettering, and the great number of arbitrary guilloche patterns and other fanciful designs, etc. All include curves, folds, wrinkles, shadow effects, etc., about which the designer has wide discretion in selecting local luminance, etc. Instead of making such choices arbitrarily, the designer can make these choices deliberately so as to serve an informational—as well as an aesthetic—function.


To further aid the security document designer, data defining several different information-carrying patterns (both watermark and/or calibration pattern) can be stored on mass storage of a computer workstation and serve as a library of design elements for future designs. The same user-interface techniques that are employed to pick colors in image-editing software (e.g. Adobe Photoshop) and fill textures in presentation programs (e.g. Microsoft PowerPoint) can similarly be used to present a palette of information patterns to a security document designer. Clicking on a visual representation of the desired pattern makes the pattern available for inclusion in a security document being designed (e.g. filling a desired area).


In the embodiment earlier-described, the calibration pattern is printed as a visible artistic element of the security document. However, the same calibration effect can be provided subliminally if desired. That is, instead of generating artwork mimicking the gray-scale pattern of the reference calibration block, the reference calibration block can itself be encoded into the security document as small changes in local luminance. In many such embodiments, the bias to localized document luminance due to the calibration pattern is simply added to the bias due to the watermark data, and encoded like the watermark data (e.g. as localized changes to the width or position of component line-art lines, as inserted ink droplets, etc.).


The present invention continues these inventive ideas. According to one aspect of the disclosure, message objects are included in an image. The message objects preferably have characteristics that distinguish them from the image background or other image objects. Such distinguishing characteristics may include color or gray-scale values, luminance values, and contrast in comparison to other objects or to a background. The distinguishing characteristics can be subtle and need not be perceptible by a human viewer. For example, a message object may be slightly lighter than the image background or other image objects. Or a message object may be darker than its background.


Message objects are arranged within the image to convey (or hide) information, such as a steganographic message or signal. The message is typically imperceptible to a human viewer. However, computer software can analyze the arrangement to determine the hidden information. In one embodiment, a digital watermark signal is reduced to a set of spatial positions. The set of spatial positions sufficiently conveys the digital watermark signal. Message objects are positioned according to the set of spatial positions. Non-message objects are combined with the message objects to form an image or design. The message objects include distinguishable characteristics, e.g., via color, contrast, gray-scale level or luminance, in comparison to the non-message objects. The digital watermark signal is detected by distinguishing the message objects from the non-message objects (e.g., via color or contrast differences) and analyzing the relative placement of the message objects within the image or design.
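
By way of a concrete, non-limiting sketch, the following Python/NumPy fragment walks this embodiment end to end; the 64×64 grid, the point count, and the +8 luminance offset are illustrative assumptions rather than the disclosure's actual signal:

```python
import numpy as np

rng = np.random.default_rng(seed=7)

# 1. Stand-in "watermark signal": a pseudo-random spatial pattern.
signal = rng.standard_normal((64, 64))

# 2. Reduce the signal to a set of spatial positions: keep the k
#    strongest values as message-object locations.
k = 40
flat_idx = np.argsort(signal, axis=None)[-k:]
points = set(zip(*np.unravel_index(flat_idx, signal.shape)))

# 3. Form the image: a uniform background stands in for non-message
#    content; message objects are rendered slightly lighter.
image = np.full((64, 64), 128, dtype=np.uint8)
for r, c in points:
    image[r, c] = 136                # subtle +8 luminance offset

# 4. Detection: distinguish message objects by their contrast with
#    the background and compare against the expected arrangement.
recovered = set(zip(*np.where(image > 128)))
assert recovered == points
```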


Additional features and advantages of the present disclosure will become even more apparent with reference to the following detailed description and accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a flow diagram illustrating an image construction method according to one aspect of the present disclosure.



FIG. 2 illustrates a digital watermark signal.



FIG. 3 illustrates a representation of the FIG. 2 watermark signal after thresholding.



FIG. 4 illustrates a gray-scale image including message objects.





DETAILED DESCRIPTION

We have found that the arrangement of objects within an image can be used to convey information—which is otherwise imperceptible to a human viewer. We arrange so-called “message objects” to convey or represent a steganographic signal (e.g., a digital watermark). We define a message object broadly herein as including an information carrier, an image object, a shape, an object or collection of objects, a pixel or group of pixels, a contrast or color/gray-scale area, etc. A set of message objects is arranged within an image or area to form a steganographic message. A few examples are provided below.


Consider a drawing illustrating a Dalmatian puppy. The puppy has a white coat complemented with black spots. An artist (or digital editor) can arrange the spots—an example of a message object—so as to convey a hidden or steganographic message. More practical, however, is to align the spots according to a predetermined steganographic signal and then sculpt or design the puppy around the aligned spots.


Now consider an image or picture depicting hundreds of marbles strewn across a surface. Certain of the marbles have an offsetting color (or luminance, gray-scale level, contrast, etc.) when compared to the surface or to other marbles. We call these offset color marbles our “message marbles.” The message marbles are arranged to represent (or to form) a steganographic signal. More practical, perhaps, is to initially arrange the message marbles according to a predetermined signal (e.g., a digital watermark signal, an orientation signal or various combinations of both) and then “fill-in” non-message marbles to complete the image.


Another example is particularly appealing to our astronomy friends. Consider an image or graphic depicting a nighttime sky. The sky is populated with “message stars.” The message stars are arranged in the nighttime sky in a predetermined manner according to a steganographic signal. The sky is further populated with non-message stars. The message stars are preferably distinguishable from the non-message stars. The distinguishing characteristics need not be visibly perceptible, and may be based on subtle differences, e.g., as measured in luminance, color levels, brightness, contrast, etc.


A steganographic decoder, analyzing a digital version (e.g., an optically captured image) of the Dalmatian puppy, marble image or populated nighttime sky, decodes the steganographic message.


While the above signal-conveying techniques rely on the arrangement of message objects in a spatial domain (e.g., in an image), message formation or detection can be based in either a spatial or transform (e.g., Fourier or frequency) domain. For example, the arrangement of the message objects in the spatial domain may have significance in a frequency domain (e.g., may correspond to a pattern of peaks, etc.). Message detection can be accordingly facilitated, e.g., as discussed in Assignee's U.S. patent application Ser. Nos. 09/940,872 (published as US 2003-0039376 A1) and 09/503,881 (now U.S. Pat. No. 6,614,914), each of which is herein incorporated by reference. The incorporated-by-reference patent documents detail many techniques for signal hiding and message detection.
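
For instance, the spatial-to-frequency correspondence can be illustrated with a short NumPy sketch; the periodic impulse grid and the peak threshold below are invented for illustration and are not a disclosed signal:

```python
import numpy as np

# A regular spatial arrangement of impulses ("message objects" reduced
# to single pixels) concentrates energy at a lattice of peaks in the
# Fourier magnitude plane.
size, period = 128, 8
spatial = np.zeros((size, size))
spatial[::period, ::period] = 1.0

magnitude = np.abs(np.fft.fftshift(np.fft.fft2(spatial)))
center = size // 2
magnitude[center, center] = 0.0          # ignore the DC term

# A detector can search for this constellation of peaks to confirm the
# signal's presence (and estimate rotation/scale from its geometry).
peaks = np.argwhere(magnitude > 0.5 * magnitude.max())
print(len(peaks), "strong frequency-domain peaks")
```

In practice the peak constellation would be chosen so that it remains recognizable after rotation and scaling of the image.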


(Applicant notes that since the Fourier transform is a linear operation, adjustments to pixels in a region may be made in the frequency or spatial domain. For example, a digital watermark embedder can adjust the frequency domain representation of the host signal according to the watermark definition to form a frequency domain representation of the watermarked signal region, and then take the inverse Fourier transform of the watermarked region to produce the watermarked signal in the spatial domain. Alternatively, the embedder can compute a difference signal to effect the desired changes to the region in the frequency domain, and then compute the inverse transform of the difference signal into the spatial domain, where corresponding pixels of the difference signal and host signal region are summed. Either way, the result is a watermarked signal in the original domain of the host signal.).
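
This equivalence can be checked numerically. In the sketch below (arbitrary frequency bins and magnitudes, chosen conjugate-symmetric so the spatial result is real), route A adjusts the host's spectrum directly while route B inverts only the difference signal:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
host = rng.uniform(0, 255, (64, 64))

# Watermark defined in the frequency domain: small bumps at chosen bins.
wm_freq = np.zeros((64, 64), dtype=complex)
for u, v in [(5, 9), (12, 3), (20, 20)]:
    wm_freq[u, v] = 50.0
    wm_freq[-u, -v] = 50.0   # conjugate-symmetric, so the result is real

# Route A: adjust the host's spectrum, then invert the whole signal.
marked_a = np.fft.ifft2(np.fft.fft2(host) + wm_freq).real

# Route B: invert only the difference signal and sum it with the host
# in the spatial domain. Linearity of the transform makes A equal B.
marked_b = host + np.fft.ifft2(wm_freq).real

assert np.allclose(marked_a, marked_b)
```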


Steganographic Image Construction


We now describe our preferred image construction with reference to FIGS. 1-4. FIG. 1 illustrates a flow diagram depicting method steps for a first embodiment of the present disclosure. A digital watermark signal is provided in step 10. The watermark signal preferably includes a message component, e.g., a payload or identifier, and/or an orientation signal. An orientation signal is helpful to resolve image distortion such as scale and rotation. FIG. 2 illustrates an example of a watermark signal shown in a spatial domain. (We note that the FIG. 2 representation is exaggerated to help simplify the discussion.). Although not required, the digital watermark signal is preferably a pure (or “raw”) signal in that it does not include image data.
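
One hypothetical way to synthesize such a pure signal, with both a message component and an orientation component, is sketched below; the carriers, sinusoid frequencies, and weights are invented for illustration and do not reproduce any particular commercial watermark format:

```python
import numpy as np

def make_watermark(payload_bits, size=64, seed=42):
    """Illustrative watermark: a pseudo-random spread of the payload
    plus a sinusoidal orientation component (no image data)."""
    rng = np.random.default_rng(seed)
    signal = np.zeros((size, size))

    # Message component: each payload bit modulates the sign of its
    # own pseudo-random carrier pattern.
    for bit in payload_bits:
        carrier = rng.standard_normal((size, size))
        signal += carrier if bit else -carrier

    # Orientation component: known sinusoids whose rotation and scale
    # can be re-estimated at read time to undo image distortion.
    y, x = np.mgrid[0:size, 0:size]
    for fx, fy in [(3, 7), (8, 2)]:
        signal += 0.5 * np.cos(2 * np.pi * (fx * x + fy * y) / size)

    return signal / np.abs(signal).max()   # normalize to [-1, 1]

wm = make_watermark([1, 0, 1, 1])
```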


Thresholding is applied to the FIG. 2 watermark signal (step 12). Thresholding preferably identifies (or reduces the watermark signal to) a set of relevant points. The relevant points may correspond to or represent a wide range of features, such as signal or frequency peak levels, magnitude peaks, watermark message components, watermark orientation references, spatial domain signal characteristics, etc. Regardless of the relevant features used to determine a relevant point, the set of relevant points is preferably sufficient to represent (or convey) the watermark signal. (We use the term “thresholding” generally herein to include a process to identify a set and location of spatial points for placement of message objects. Alternatively, the thresholding may identify relevant frequency domain points, which can be mapped or transformed into a spatial domain representation.) The thresholding procedure can also be adjusted to provide a more or less robust watermark signal. For example, the spacing of relevant points can be increased, but at a cost of robustness.
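
A minimal sketch of one such thresholding step follows (greedy peak-picking; the point count and spacing parameter are illustrative choices). Increasing min_spacing spreads the points apart, trading robustness for subtlety, and each point's strength is retained for use as the relative contrast indicator discussed below:

```python
import numpy as np

def threshold_to_placement_map(signal, count=50, min_spacing=4):
    """Reduce a spatial watermark signal to (row, col, strength)
    placement points by repeatedly picking the strongest remaining
    value and suppressing its neighborhood."""
    work = signal.copy()
    points = []
    while len(points) < count:
        r, c = np.unravel_index(np.argmax(np.abs(work)), work.shape)
        if work[r, c] == 0:            # signal exhausted
            break
        points.append((int(r), int(c), float(work[r, c])))
        # Zero a neighborhood so points keep a minimum spacing.
        work[max(0, r - min_spacing):r + min_spacing + 1,
             max(0, c - min_spacing):c + min_spacing + 1] = 0
    return points

rng = np.random.default_rng(0)
placement_map = threshold_to_placement_map(rng.standard_normal((64, 64)))
```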


While the term “thresholding” is used as an illustrative technique, the present invention is not so limited. Indeed, there are other ways to refine a watermark signal into a set or map of relevant points. For example, a frequency analysis can be used to identify significant frequency characteristics (e.g., peaks). These characteristics can be mapped to the spatial domain to identify a placement point (e.g., a spatial location). Or, as in another implementation, the digital watermark signal is quantized, e.g., via a root mean square measurement. Of course, other techniques can be used to reduce a watermark signal to a set of relevant spatial points sufficient to convey the signal.


The set of relevant points comprises a placement map as shown in FIG. 3. (It should be appreciated that there are many possible placement map patterns, including patterns with more or fewer positions of varying significance. FIG. 3 illustrates but one such possible placement map.). The FIG. 3 placement map includes a plurality of spatial positions or points (e.g., 20 and 21). These points guide the placement of message objects.


In one alternative implementation, the placement map includes a relative contrast indicator. For example, a placement map point may include a relatively darker or larger point (20), indicating a need for a stronger contrast level (or color, gray-scale, etc.) of a message object, in comparison to a relatively lighter or smaller point (21). A higher contrast may signify a predetermined frequency domain characteristic, such as peak magnitude or frequency response, etc.


Returning to FIG. 1, message objects are arranged according to the placement mapping (step 14). For example, message marbles (or message stars, black Dalmatian spots, etc.) are placed on or over (or otherwise placed corresponding to) the placement map points. The message objects thus convey the steganographic signal. Other image objects can be placed in the image, e.g., to fill in or otherwise populate the image.
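
Continuing the sketch, message “marbles” can be rendered as disks at the mapped points, with the remaining canvas standing in for non-message objects (the radius, background level, and luminance offsets are again illustrative assumptions):

```python
import numpy as np

def render_objects(points, size=256, radius=3, background=120):
    """Draw message objects as disks at the mapped points; the rest
    of the canvas stands in for non-message content."""
    image = np.full((size, size), background, dtype=np.uint8)
    yy, xx = np.mgrid[0:size, 0:size]
    for r, c, strength in points:
        disk = (yy - r) ** 2 + (xx - c) ** 2 <= radius ** 2
        # Stronger placement-map points get a larger luminance offset
        # (the map's relative contrast indicator).
        image[disk] = background + (12 if strength > 1.0 else 6)
    return image

sample_points = [(40, 60, 1.8), (120, 200, 0.6), (200, 96, 1.2)]
img = render_objects(sample_points)
```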


Alternative Steganographic Images


There are many alternative ways to implement our inventive steganographic techniques. Consider the following illustrative examples.


Off-the-shelf digital image editing tools (e.g., as provided by Adobe) can be used to place message objects in an image. The operator selects a message object, adjusts the object contrast (or color/gray-level, etc.) to sufficiently set the object apart from other image objects or the image background, and then places the objects according to a digital watermark placement map. Of course this placement process can be automated.


A farmer's field can be plowed or crops can be planted or cut to represent message objects, all arranged according to a steganographic placement map. An aerial image of the farmer's field then includes the steganographic message.


Different color tulips (or other flowers) can be planted according to a placement map. For example, groups of white tulips (e.g., message tulips) can be planted in an otherwise purple tulip field. An aerial image captures the field—including the steganographic message.


Captured images can be used in advertisements, e.g., when the steganographic message carries a link (e.g., a URL or an identifier used to obtain a link). Assignee's U.S. patent application Ser. No. 09/571,422, filed May 15, 2000 (U.S. Pat. No. 6,947,571), discloses many suitable linking techniques that are expressly contemplated as being combined with the present invention. This patent application is herein incorporated by reference.


Our techniques can even be advantageously employed in the photo-mosaic field. (Photo-mosaic processes are further discussed, e.g., in U.S. Pat. Nos. 6,137,498 and 5,649,032, which are each incorporated herein by reference.). As disclosed in U.S. Pat. No. 6,137,498, a mosaic image is formed from a database (or collection) of source images. Source images are analyzed, selected and organized to produce the mosaic image. A target image is divided into tile regions, each of which is compared with individual source image portions to determine the best available matching source image. Positioning respective best-matching source images at the respective tile regions forms the mosaic image.


An improvement to a photo-mosaic process is to arrange message source photos (e.g., representing message objects) according to a watermark placement map. Preferably, the message source photos are subtly distinguishable from other mosaic photos via a gray-scale value, a color value, contrast or luminance, etc. The message source photos form (or convey) a steganographic signal. In one implementation, the arrangement of message source photos is carried out via the “best available matching” discussed above with respect to U.S. Pat. No. 6,137,498. In a first implementation, the process determines whether a selected best available photo is to be tiled over a placement map position. If so, the photo characteristics (luminance, contrast, gray-scale, etc.) are subtly altered to create a message source photo. In a second implementation, the “best available matching” algorithm includes selection criteria, e.g., if selecting a photo for a placement map position, the algorithm selects a photo with sufficient distinguishing characteristics to qualify as a message object. The distinguishing characteristics can be measured in terms of a photo's neighbors (e.g., a message photograph may include an overall different contrast, color or gray-scale level from its neighboring photographs) or in terms of non-message photographs. In a third implementation, message source photos are arranged according to a placement map, and then other source photos are used to fill in or complete the photo mosaic. In a variation of this third implementation, the other, non-message source photos are selected and arranged according to a best available matching technique.
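
A compressed sketch of the first implementation follows, with each source photo reduced to a single mean-luminance value for brevity; the tile grid, photo library, and +8 offset are assumptions for illustration, not the incorporated patent's algorithm:

```python
import numpy as np

rng = np.random.default_rng(3)

target = rng.uniform(0, 255, (8, 8))      # mean luminance per tile region
library = rng.uniform(0, 255, 200)        # mean luminance of source photos
message_tiles = {(1, 2), (4, 4), (6, 1)}  # placement-map tile positions

mosaic = np.empty_like(target)
for (r, c), want in np.ndenumerate(target):
    # "Best available matching": pick the source photo closest in
    # luminance to this tile region of the target image.
    best = library[np.argmin(np.abs(library - want))]
    if (r, c) in message_tiles:
        # Subtly alter the selected photo so the tile reads as a
        # message object relative to its neighbors.
        best += 8.0
    mosaic[r, c] = best
```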


Similar embedded designs can be made using coins, bills, umbrellas, coffee mugs, opened or closed blinds in an apartment building, shapes, snowflakes, groups of pixels, etc.


With reference to FIG. 4, message objects of different colors (or gray-levels, tones, contrasts, luminance, etc.) can be used in the same image. For example, the FIG. 4 image includes a gray background (shown by the diagonal hashes). A message object can be distinguished from the background if it is either lighter 30 or darker 31 than the background. Or if the background is blue, message objects can be red or green, etc.


Message Detection


An image created according to our inventive techniques can be read using steganographic or digital watermarking decoding techniques, e.g., as described in assignee's Ser. No. 09/571,422 (U.S. Pat. No. 6,947,571) and/or Ser. No. 09/503,881 (U.S. Pat. No. 6,614,914) applications. In one implementation, Digimarc MediaBridge watermark reading software, available from Digimarc Corporation headquartered in Tualatin, Oreg., is used to read an image including a corresponding MediaBridge digital watermark signal represented through our message object arranging techniques. Of course, other decoding techniques can be used, particularly when they correspond to the techniques used to generate the original watermark signal. (For example, when using a Digimarc MediaBridge reader, the watermark signal is preferably created using a MediaBridge signal generator or embedder.). Most commonly, the reader identifies the message objects from the different levels of contrast (or color, gray-scale, luminance, etc.) between a message object and other objects or background.
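
The contrast-based identification step might look like the following toy sketch (the thresholds and positions are invented; a real reader additionally resolves orientation and decodes the payload). Note that both lighter and darker objects register, consistent with FIG. 4:

```python
import numpy as np

def detect_message_points(image, background, margin=4):
    """Pixels sufficiently lighter OR darker than the background are
    treated as message objects; both offsets are distinguishable."""
    contrast = np.abs(image.astype(int) - int(background))
    return np.column_stack(np.where(contrast > margin))

# Toy image: gray background, three lighter and one darker object.
image = np.full((32, 32), 128, dtype=np.uint8)
expected = [(5, 7), (12, 20), (25, 3), (18, 18)]
for i, (r, c) in enumerate(expected):
    image[r, c] = 140 if i < 3 else 116

found = detect_message_points(image, background=128)
assert sorted(map(tuple, found)) == sorted(expected)
# A full decoder would next correlate the recovered point pattern
# against the reference watermark signal to read the message.
```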


CONCLUSION

The foregoing are just exemplary implementations of the present disclosure. It will be recognized that there are a great number of variations on these basic themes. The foregoing illustrates but a few applications of the detailed technology. There are many others.


To provide a comprehensive disclosure without unduly lengthening this specification, applicants incorporate by reference, in their entireties, the disclosures of the above-cited patents and applications. The particular combinations of elements and features in the above-detailed embodiments are exemplary only; the interchanging and substitution of these teachings with other teachings in this application and the incorporated-by-reference patents/applications are expressly contemplated.


One application uses our inventive embedding techniques for “long-distance” or aerial digital watermark reading, e.g., for some of the traffic monitoring examples disclosed in Assignee's U.S. Provisional Patent Application No. 60/327,687, filed Oct. 5, 2001. (In one experiment we created a digitally watermarked image using our inventive object placement techniques disclosed herein. We then optically captured an image of the watermarked image with a digital camera augmented with a telescope. The watermarked image was about 100 feet away from the camera/telescope. The watermark was successfully detected.).


Although not belabored herein, artisans will understand that the systems and methods described above can be implemented using a variety of hardware and software systems. Alternatively, dedicated hardware, or programmable logic circuits, can be employed for such operations.


The various section headings in this application are provided for the reader's convenience and provide no substantive limitations. The features found in one section may be readily combined with those features in another section.


In view of the wide variety of embodiments to which the principles and features discussed above can be applied, it should be apparent that the detailed embodiments are illustrative only and should not be taken as limiting the scope of the invention. Rather, we claim as our invention all such modifications as may come within the scope and spirit of the following claims and equivalents thereof.

Claims
  • 1. An apparatus comprising: electronic memory configured to store image or graphic data; and an electronic processor configured to: integrate a plurality of objects in the image or graphic data, wherein the objects are arranged in a pattern that is machine-readable, and wherein the plurality of objects are integrated in the image or graphic data so that the pattern is hidden in the image or graphic data through cooperation with design elements of the image or graphic data; and provide a visible structure for aiding in machine-reading of the pattern.
  • 2. The apparatus of claim 1, wherein the plurality of objects comprise dots or circles.
  • 3. The apparatus of claim 1, wherein the cooperation comprises a relationship of color characteristics of the pattern relative to color characteristics of the design elements.
  • 4. An apparatus comprising: electronic memory configured to store an image or graphic; an electronic processor configured to: provide a visible structure for association with the image or graphic, wherein the visible structure aids machine-recognition of a pattern; and integrate a plurality of objects in the image or graphic, wherein the objects are arranged in the pattern for machine-recognition, and wherein the plurality of objects are integrated in the image or graphic such that the pattern is concealed therein through association with design elements of the image or graphic.
  • 5. The apparatus of claim 4, wherein the objects comprise dots or circles.
  • 6. The apparatus of claim 4, wherein the pattern is hidden in the image or graphic through cooperation with design elements of the image or graphic, and wherein the cooperation comprises a relationship of color characteristics of the pattern relative to color characteristics of the design elements.
  • 7. An apparatus comprising: electronic memory configured to store data representing an image or graphic, and data representing a visible structure, wherein the visible structure is associated with the image or graphic, wherein a plurality of objects are integrated in the image or graphic, wherein the objects are arranged in a machine-readable pattern, wherein the plurality of objects are integrated in the image or graphic so that the pattern is hidden in the image or graphic through cooperation with design elements of the image or graphic, and wherein the visible structure aids in machine-reading of the pattern; and an electronic processor configured to read the pattern utilizing the data representing the visible structure.
  • 8. The apparatus of claim 7, wherein the objects comprise dots or circles.
  • 9. The apparatus of claim 7, wherein the cooperation comprises a relationship of color characteristics of the pattern relative to color characteristics of the design elements.
  • 10. A method comprising: storing, in electronic memory, data representing an image or graphic, and data representing a visible structure associated with the image or graphic, wherein the visible structure aids machine-recognition of a pattern, wherein the image or graphic comprises a plurality of objects integrated therein that are arranged in the pattern for machine-recognition, and wherein the plurality of objects are integrated in the image or graphic such that the pattern is concealed therein through association with design elements of the image or graphic; and recognizing, using an electronic processor, the pattern using the data representing the visible structure.
  • 11. The method of claim 10, wherein the plurality of objects comprises dots or circles.
  • 12. The method of claim 10, wherein the pattern is hidden in the image or graphic through cooperation with design elements of the image or graphic, and wherein the cooperation comprises a relationship of color characteristics of the pattern relative to color characteristics of the design elements.
RELATED APPLICATION DATA

The present application is a continuation of U.S. patent application Ser. No. 12/464,679 filed May 12, 2009 (U.S. Pat. No. 7,831,062), which is a continuation of U.S. patent application Ser. No. 12/017,636, filed Jan. 22, 2008 (U.S. Pat. No. 7,532,741), which is a continuation of U.S. patent application Ser. No. 11/127,442, filed May 11, 2005 (U.S. Pat. No. 7,321,667), which is a continuation of U.S. patent application Ser. No. 10/074,680, filed Feb. 11, 2002 (published as US 2002-0136429 A1), which claims the benefit of U.S. Provisional Patent Application No. 60/350,505, filed Jan. 18, 2002, titled “Data Hiding Through Arrangement of Objects.” The present application is also related to U.S. patent application Ser. No. 09/127,502, filed Jul. 31, 1998 (now U.S. Pat. No. 6,345,104), which is a continuation-in-part of U.S. patent application Ser. No. 09/074,034, filed May 6, 1998 (now U.S. Pat. No. 6,449,377). The Ser. No. 09/127,502 application is also a continuation-in-part of U.S. patent application Ser. No. 08/967,693, filed Nov. 12, 1997 (now U.S. Pat. No. 6,122,392), which is a continuation of application Ser. No. 08/614,521, filed Mar. 15, 1996 (now U.S. Pat. No. 5,745,604), which is a continuation of application Ser. No. 08/215,289, filed Mar. 17, 1994 (now abandoned). The Ser. No. 09/127,502 application is also a continuation-in-part of application Ser. No. 08/649,419, filed May 16, 1996 (now U.S. Pat. No. 5,862,260). The Ser. No. 09/127,502 application also claims the benefit of U.S. Provisional application 60/082,228, filed Apr. 16, 1998. The present application is also related to U.S. patent application Ser. No. 09/940,872, filed Aug. 27, 2001 (published as US 2003-0039376 A1). Each of the above-mentioned patent documents is hereby incorporated herein by reference.

US Referenced Citations (264)
Number Name Date Kind
3859633 Ho et al. Jan 1975 A
3893080 Ho et al. Jul 1975 A
4748679 Gold et al. May 1988 A
4876617 Best et al. Oct 1989 A
5010405 Schreiber et al. Apr 1991 A
5091966 Bloomberg et al. Feb 1992 A
5278400 Appel Jan 1994 A
5329108 Lamoure Jul 1994 A
5337361 Wang et al. Aug 1994 A
5374976 Spannenburg Dec 1994 A
5444779 Daniele Aug 1995 A
5530759 Braudaway et al. Jun 1996 A
5568570 Rabbani Oct 1996 A
5572010 Petrie Nov 1996 A
5581800 Fardeau et al. Dec 1996 A
5636292 Rhoads Jun 1997 A
5649032 Burt et al. Jul 1997 A
5664018 Leighton Sep 1997 A
5710834 Rhoads Jan 1998 A
5721788 Powell et al. Feb 1998 A
5745604 Rhoads Apr 1998 A
5748763 Rhoads May 1998 A
5752152 Gasper et al. May 1998 A
5765176 Bloomberg Jun 1998 A
5768426 Rhoads Jun 1998 A
5772250 Gasper Jun 1998 A
5809160 Powell et al. Sep 1998 A
5832119 Rhoads Nov 1998 A
5843564 Gasper et al. Dec 1998 A
5850481 Rhoads Dec 1998 A
5862260 Rhoads Jan 1999 A
5905800 Moskowitz et al. May 1999 A
5930377 Powell et al. Jul 1999 A
5949055 Fleet et al. Sep 1999 A
6026193 Rhoads Feb 2000 A
6101602 Fridrich Aug 2000 A
6104812 Koltai et al. Aug 2000 A
6121530 Sonoda Sep 2000 A
6122392 Rhoads Sep 2000 A
6122403 Rhoads Sep 2000 A
6131161 Linnartz Oct 2000 A
6137498 Silvers Oct 2000 A
6154571 Cox et al. Nov 2000 A
6181802 Todd Jan 2001 B1
6185683 Ginter et al. Feb 2001 B1
6198832 Maes et al. Mar 2001 B1
6254007 Mowry, Jr. Jul 2001 B1
6266430 Rhoads Jul 2001 B1
6272176 Srinivasan Aug 2001 B1
6278385 Kondo et al. Aug 2001 B1
6285776 Rhoads Sep 2001 B1
6286100 Morimoto et al. Sep 2001 B1
6289108 Rhoads Sep 2001 B1
6317505 Powell et al. Nov 2001 B1
6330335 Rhoads Dec 2001 B1
6334187 Kadono Dec 2001 B1
6343138 Rhoads Jan 2002 B1
6345104 Rhoads Feb 2002 B1
5832119 Rhoads Mar 2002 C1
6353672 Rhoads Mar 2002 B1
6363159 Rhoads Mar 2002 B1
6385330 Powell et al. May 2002 B1
6389151 Carr et al. May 2002 B1
5636292 Rhoads Jun 2002 C1
6400827 Rhoads Jun 2002 B1
6404898 Rhoads Jun 2002 B1
6415040 Linnartz et al. Jul 2002 B1
6418232 Nakano et al. Jul 2002 B1
6427012 Petrovic Jul 2002 B1
6430302 Rhoads Aug 2002 B2
6449367 Van Wie et al. Sep 2002 B2
6449377 Rhoads Sep 2002 B1
6449379 Rhoads Sep 2002 B1
6459803 Powell et al. Oct 2002 B1
6463162 Vora Oct 2002 B1
6477276 Inoue et al. Nov 2002 B1
6493457 Quackenbush et al. Dec 2002 B1
6496591 Rhoads Dec 2002 B1
6505160 Levy et al. Jan 2003 B1
6519352 Rhoads Feb 2003 B2
6522771 Rhoads Feb 2003 B2
6535618 Rhoads Mar 2003 B1
6539095 Rhoads Mar 2003 B1
6542618 Rhoads Apr 2003 B1
6542620 Rhoads Apr 2003 B1
6560349 Rhoads May 2003 B1
6563936 Brill et al. May 2003 B2
6567101 Thomas May 2003 B1
6567534 Rhoads May 2003 B1
6567535 Rhoads May 2003 B2
6567780 Rhoads May 2003 B2
6574350 Rhoads et al. Jun 2003 B1
6580819 Rhoads Jun 2003 B1
6587821 Rhoads Jul 2003 B1
6590997 Rhoads Jul 2003 B2
6614914 Rhoads et al. Sep 2003 B1
6647129 Rhoads Nov 2003 B2
6654480 Rhoads Nov 2003 B2
6654887 Rhoads Nov 2003 B2
6675146 Rhoads Jan 2004 B2
6694041 Brunk Feb 2004 B1
6700994 Maes et al. Mar 2004 B2
6724912 Carr et al. Apr 2004 B1
6738495 Rhoads et al. May 2004 B2
6744906 Rhoads et al. Jun 2004 B2
6744907 Rhoads Jun 2004 B2
6750985 Rhoads Jun 2004 B2
6754377 Rhoads Jun 2004 B2
6757406 Rhoads Jun 2004 B2
6760464 Brunk Jul 2004 B2
6768808 Rhoads Jul 2004 B2
6771796 Rhoads Aug 2004 B2
6778682 Rhoads Aug 2004 B2
6804377 Reed et al. Oct 2004 B2
6804379 Rhoads Oct 2004 B2
6856977 Adelsbach et al. Feb 2005 B1
6871789 Hilton et al. Mar 2005 B2
6882738 Davis et al. Apr 2005 B2
6912295 Reed et al. Jun 2005 B2
6922480 Rhoads Jul 2005 B2
6944298 Rhoads Sep 2005 B1
6947571 Rhoads et al. Sep 2005 B1
6959100 Rhoads Oct 2005 B2
6959386 Rhoads Oct 2005 B2
6961444 Levy Nov 2005 B2
6965873 Rhoads Nov 2005 B1
6968337 Wold Nov 2005 B2
6970573 Carr et al. Nov 2005 B2
6978036 Alattar et al. Dec 2005 B2
6983051 Rhoads Jan 2006 B1
6987862 Rhoads Jan 2006 B2
6993152 Patterson et al. Jan 2006 B2
6993154 Brunk Jan 2006 B2
7003132 Rhoads Feb 2006 B2
7016516 Rhoads Mar 2006 B2
7020303 Levy et al. Mar 2006 B2
7020349 Brunk Mar 2006 B2
7027612 Patterson et al. Apr 2006 B2
7054462 Rhoads et al. May 2006 B2
7054463 Rhoads et al. May 2006 B2
7062070 Powell et al. Jun 2006 B2
7068812 Powell et al. Jun 2006 B2
7076084 Davis et al. Jul 2006 B2
7113569 Okumura et al. Sep 2006 B2
7113615 Rhoads et al. Sep 2006 B2
7116781 Rhoads Oct 2006 B2
7130087 Rhoads Oct 2006 B2
7142691 Levy Nov 2006 B2
7162052 Brundage et al. Jan 2007 B2
7162146 Cookson et al. Jan 2007 B2
7181022 Rhoads Feb 2007 B2
7184570 Rhoads Feb 2007 B2
7197164 Levy Mar 2007 B2
7239734 Alattar et al. Jul 2007 B2
7240849 Floriach et al. Jul 2007 B2
7242790 Rhoads Jul 2007 B2
7246239 Rodriguez et al. Jul 2007 B2
7248715 Levy Jul 2007 B2
7263203 Rhoads et al. Aug 2007 B2
7266217 Rhoads et al. Sep 2007 B2
7269275 Carr et al. Sep 2007 B2
7277468 Tian et al. Oct 2007 B2
7286684 Rhoads et al. Oct 2007 B2
7305117 Davis et al. Dec 2007 B2
7313253 Davis et al. Dec 2007 B2
7321667 Stach Jan 2008 B2
7340076 Stach et al. Mar 2008 B2
7346776 Levy et al. Mar 2008 B2
7349555 Rhoads Mar 2008 B2
7359528 Rhoads Apr 2008 B2
7372976 Rhoads et al. May 2008 B2
7415129 Rhoads Aug 2008 B2
7418111 Rhoads Aug 2008 B2
7424132 Rhoads Sep 2008 B2
7499564 Rhoads Mar 2009 B2
7532741 Stach May 2009 B2
7536555 Rhoads May 2009 B2
7539325 Rhoads et al. May 2009 B2
7548643 Davis et al. Jun 2009 B2
7555139 Rhoads et al. Jun 2009 B2
7567686 Rhoads Jul 2009 B2
7570784 Alattar Aug 2009 B2
7602940 Rhoads et al. Oct 2009 B2
7602977 Rhoads et al. Oct 2009 B2
7606390 Rhoads Oct 2009 B2
7607016 Brunk et al. Oct 2009 B2
7620200 Rhoads Nov 2009 B2
7639837 Carr et al. Dec 2009 B2
7643649 Davis et al. Jan 2010 B2
7656930 Tian et al. Feb 2010 B2
7672477 Rhoads Mar 2010 B2
7676059 Rhoads Mar 2010 B2
7693965 Rhoads Apr 2010 B2
7697719 Rhoads Apr 2010 B2
7702511 Rhoads Apr 2010 B2
7711143 Rhoads May 2010 B2
7720249 Rhoads May 2010 B2
7720255 Rhoads May 2010 B2
7724919 Rhoads May 2010 B2
7724920 Rhoads May 2010 B2
7760902 Rhoads Jul 2010 B2
7763179 Levy et al. Jul 2010 B2
7796826 Rhoads et al. Sep 2010 B2
7831062 Stach Nov 2010 B2
20010019611 Hilton Sep 2001 A1
20010022848 Rhoads Sep 2001 A1
20010049788 Shur Dec 2001 A1
20010052076 Kadono Dec 2001 A1
20020009209 Inoue et al. Jan 2002 A1
20020016916 Natarajan Feb 2002 A1
20020021808 Iwamura Feb 2002 A1
20020051559 Noda et al. May 2002 A1
20020054355 Brunk May 2002 A1
20020080995 Rhoads Jun 2002 A1
20020106104 Brunk et al. Aug 2002 A1
20020126869 Wang et al. Sep 2002 A1
20020136429 Stach et al. Sep 2002 A1
20020172394 Venkatesan et al. Nov 2002 A1
20020186861 Echizen et al. Dec 2002 A1
20020191811 Kamijo Dec 2002 A1
20030021439 Lubin et al. Jan 2003 A1
20030021440 Rhoads Jan 2003 A1
20030039376 Stach Feb 2003 A1
20030053654 Patterson et al. Mar 2003 A1
20030118208 Epstein Jun 2003 A1
20030133592 Rhoads Jul 2003 A1
20030138128 Rhoads Jul 2003 A1
20030215112 Rhoads et al. Nov 2003 A1
20040032972 Stach et al. Feb 2004 A1
20040057581 Rhoads Mar 2004 A1
20040128514 Rhoads Jul 2004 A1
20040181671 Brundage et al. Sep 2004 A1
20040263911 Rodriguez et al. Dec 2004 A1
20050063562 Brunk et al. Mar 2005 A1
20050105760 Eggers et al. May 2005 A1
20060028689 Perry et al. Feb 2006 A1
20060062386 Rhoads Mar 2006 A1
20070016790 Brundage et al. Jan 2007 A1
20070088953 Hilton et al. Apr 2007 A1
20070172098 Rhoads et al. Jul 2007 A1
20070180251 Carr et al. Aug 2007 A1
20070201835 Rhoads Aug 2007 A1
20080016360 Rodriguez et al. Jan 2008 A1
20080131083 Rhoads Jun 2008 A1
20080131084 Rhoads Jun 2008 A1
20080149713 Brundage Jun 2008 A1
20080253740 Rhoads Oct 2008 A1
20080270801 Levy et al. Oct 2008 A1
20080275906 Rhoads et al. Nov 2008 A1
20090252401 Davis et al. Oct 2009 A1
20100008534 Rhoads Jan 2010 A1
20100008536 Rhoads Jan 2010 A1
20100008537 Rhoads Jan 2010 A1
20100021004 Rhoads Jan 2010 A1
20100027969 Alattar Feb 2010 A1
20100040255 Rhoads Feb 2010 A1
20100042843 Brunk et al. Feb 2010 A1
20100119108 Rhoads May 2010 A1
20100131767 Rhoads May 2010 A1
20100142752 Rhoads et al. Jun 2010 A1
20100146285 Rhoads et al. Jun 2010 A1
20100163629 Rhoads et al. Jul 2010 A1
20100172538 Rhoads Jul 2010 A1
20110013802 Rhoads Jan 2011 A1
Foreign Referenced Citations (9)
Number Date Country
2943436 Jul 1981 DE
493091 Jul 1992 EP
838050 Apr 2000 EP
966837 Jul 2002 EP
1147495 Jan 2003 EP
WO 0173997 Oct 2001 WO
WO 0203328 Jan 2002 WO
WO 0219269 Mar 2002 WO
WO 2005027056 Mar 2005 WO
Non-Patent Literature Citations (20)
Entry
U.S. Appl. No. 08/154,866, filed Nov. 18, 1993, Geoffrey B. Rhoads.
U.S. Appl. No. 08/215,289, filed Mar. 17, 1994, Geoffrey B. Rhoads.
U.S. Appl. No. 09/150,147, filed Sep. 9, 1998, Geoffrey B. Rhoads.
U.S. Appl. No. 09/151,492, filed Sep. 11, 1998, Bruce L. Davis et al.
U.S. Appl. No. 09/496,380, filed Feb. 2, 2000, Geoffrey B. Rhoads.
U.S. Appl. No. 12/881,911, filed Sep. 14, 2010, Geoffrey B. Rhoads et al.
U.S. Appl. No. 12/692,470, filed Jan. 22, 2010, Jun Tian et al.
U.S. Appl. No. 60/082,228, Apr. 16, 1998, Rhoads.
U.S. Appl. No. 60/350,505, Jan. 18, 2002, Stach et al.
“Access Control and COpyright Protection for Images, WorkPackage 8: Watermarking,” Jun. 30, 1995, 46 pages.
Bender et al., “Techniques for Data Hiding,” Proc. SPIE, vol. 2420, Feb. 9, 1995, pp. 164-173.
Brassil et al., “Electronic Marking and Identification Techniques to Discourage Document Copying,” Proceedings of INFOCOM '94 Conference on Computer Communications, IEEE Commun. Soc., Jun. 12-16, 1994, pp. 1278-1287.
Brassil et al., “Hiding Information in Document Images,” Nov. 1995, 7 pages.
Caldelli et al., “Geometric-Invariant Robust Watermarking Through Constellation Matching in the Frequency Domain,” IEEE Proc. Int. Conf. on Image Processing, vol. 2, Sep. 2000, pp. 65-68.
Ding et al., “A Novel Digital Image Hiding Technology Based on Tangram and Conway's Game,” IEEE Proc. Int. Conf. on Image Processing, vol. 1, Sep. 2000, pp. 601-604.
Komatsu et al., “A Proposal on Digital Watermark in Document Image Communication and Its Application to Realizing a Signature,” Electronics and Communications in Japan, Part 1, vol. 73, No. 5, 1990, pp. 22-33.
Low et al., “Document Marking and Identification using Both Line and Word Shifting,” IEEE Proc. INFOCOM'95, Apr. 1995, pp. 853-860.
Maes et al., “Digital Watermarking by Geometric Warping,” IEEE Proc. Int. Conf. on Image Processing, vol. 2, Oct. 1998, pp. 424-426.
Rongen et al., “Digital Image Watermarking by Salient Point Modification Practical Results,” Proc. SPIE vol. 3657: Security and Watermarking of Multimedia Contents, Jan. 1999, pp. 273-282.
Szepanski, “A Signal Theoretic Method for Creating Forgery-Proof Documents for Automatic Verification,” Proceedings 1979 Carnahan Conference on Crime Countermeasures, May 16, 1979, 9 pages.
Related Publications (1)
Number Date Country
20110110555 A1 May 2011 US
Provisional Applications (1)
Number Date Country
60350505 Jan 2002 US
Continuations (4)
Number Date Country
Parent 12464679 May 2009 US
Child 12942735 US
Parent 12017636 Jan 2008 US
Child 12464679 US
Parent 11127442 May 2005 US
Child 12017636 US
Parent 10074680 Feb 2002 US
Child 11127442 US