The present invention relates to steganography and data hiding.
Digital watermarking is a process for modifying physical or electronic media to embed a machine-readable code into the media. The media may be modified such that the embedded code is imperceptible or nearly imperceptible to the user, yet may be detected through an automated detection process. Most commonly, digital watermarking is applied to media signals such as images, audio signals, and video signals. However, it may also be applied to other types of media objects, including documents (e.g., through line, word or character shifting), software, multi-dimensional graphics models, and surface textures of objects.
Digital watermarking systems typically have two primary components: an encoder that embeds the watermark in a host media signal, and a decoder that detects and reads the embedded watermark from a signal suspected of containing a watermark (a suspect signal). The encoder embeds a watermark by altering the host media signal. The reading component analyzes a suspect signal to detect whether a watermark is present. In applications where the watermark encodes information, the reader extracts this information from the detected watermark.
Several particular watermarking techniques have been developed. The reader is presumed to be familiar with the literature in this field. Particular techniques for embedding and detecting imperceptible watermarks in media signals are detailed in the assignee's co-pending U.S. patent application Ser. No. 09/503,881 and U.S. Pat. No. 6,122,403, which are each herein incorporated by reference.
In U.S. parent application Ser. No. 09/127,502 we disclose the following:
Many security documents are still designed largely by hand. A designer works at a drafting table or computer workstation, and spends many hours laying-out minute (e.g. 5 mm×5 mm) excerpts of the design. To aid integration of watermark and/or calibration pattern data in this process, an accessory layout grid can be provided, identifying the watermark “bias” (e.g. −3 to +3) that is to be included in each 250 micron cell of the security document. If the accessory grid indicates that the luminance should be slightly increased in a cell (e.g. 1%), the designer can bear this bias in mind when defining the composition of the cell and include a touch less ink than might otherwise be included. Similarly, if the accessory grid indicates that the luminance should be somewhat strongly decreased in a cell (e.g. 5%), the designer can again bear this in mind and try to include more ink than might otherwise be included. Due to the substantial redundancy of most watermark encoding techniques, strict compliance by the designer with these guidelines is not required. Even loose compliance can result in artwork that requires little, if any, further modification to reliably convey watermark and/or calibration information.
Such “designing-in” of embedded information in security documents is facilitated by the number of arbitrary design choices made by security document designers. A few examples from U.S. banknotes include the curls in the presidents' hair, the drape of clothing, the clouds in the skies, the shrubbery in the landscaping, the bricks in the pyramid, the fill patterns in the lettering, and the great number of arbitrary guilloche patterns and other fanciful designs, etc. All include curves, folds, wrinkles, shadow effects, etc., about which the designer has wide discretion in selecting local luminance, etc. Instead of making such choices arbitrarily, the designer can make these choices deliberately so as to serve an informational—as well as an aesthetic—function.
To further aid the security document designer, data defining several different information-carrying patterns (both watermark and/or calibration pattern) can be stored on the mass storage of a computer workstation and serve as a library of design elements for future designs. The same user-interface techniques that are employed to pick colors in image-editing software (e.g. Adobe Photoshop) and fill textures in presentation programs (e.g. Microsoft PowerPoint) can similarly be used to present a palette of information patterns to a security document designer. Clicking on a visual representation of the desired pattern makes the pattern available for inclusion in a security document being designed (e.g. filling a desired area).
In the earlier-described embodiment, the calibration pattern is printed as a visible artistic element of the security document. However, the same calibration effect can be provided subliminally if desired. That is, instead of generating artwork mimicking the gray-scale pattern of the reference calibration block, the reference calibration block can itself be encoded into the security document as small changes in local luminance. In many such embodiments, the bias to localized document luminance due to the calibration pattern is simply added to the bias due to the watermark data, and encoded like the watermark data (e.g. as localized changes to the width or position of component line-art lines, as inserted ink droplets, etc.).
The present invention continues these inventive ideas. According to one aspect of the present invention, message objects are included in an image. The message objects preferably have characteristics that distinguish them from the image background or from other image objects. Such distinguishing characteristics may include color or gray-scale values, luminance values, and contrast relative to other objects or to a background. The distinguishing characteristics can be subtle and need not be perceptible to a human viewer. For example, a message object may be slightly lighter than the image background or other image objects. Or a message object may be darker than its background.
Message objects are arranged within the image to convey (or hide) information, such as a steganographic message or signal. The message is typically indistinguishable to a human viewer. However, computer software can analyze the arrangement to determine the hidden information. In one embodiment, a digital watermark signal is reduced to a set of spatial positions that sufficiently conveys the digital watermark signal. Message objects are positioned according to this set of spatial positions. Non-message objects are combined with the message objects to form an image or design. The message objects include distinguishable characteristics, e.g., via color, contrast, gray-scale level or luminance, in comparison to the non-message objects. The digital watermark signal is detected by distinguishing the message objects from the non-message objects (e.g., via color or contrast differences) and analyzing the relative placement of the message objects within the image or design. Such techniques can even be used to mark physical structures like a building, road or bridge.
Additional features and advantages of the present invention will become even more apparent with reference to the following detailed description and accompanying drawings.
FIGS. 5a and 5b illustrate a physical structure including a signal hidden on a top surface thereof through arrangement of message objects.
We have found that the arrangement of objects within an image can be used to convey information—which is otherwise imperceptible to a human viewer. We arrange so-called “message objects” to convey or represent a steganographic signal (e.g., a digital watermark). We define a message object broadly herein as including an information carrier, an image object, a shape, an object or collection of objects, a pixel or group of pixels, a physical object, paint or other covering, surface texture, a contrast or color/gray-scale area, etc. A set of message objects is arranged within an image or area to form a steganographic message. A few examples are provided below.
Consider a drawing illustrating a Dalmatian puppy. The puppy has a white coat complemented with black spots. An artist (or digital editor) can arrange a set of spots—an example of our message objects—so as to convey a hidden or steganographic message. More practical, however, is to align the spots according to a predetermined steganographic signal and then sculpt or design the puppy around the aligned spots.
Now consider an image or picture depicting hundreds of marbles strewn across a surface. Certain of the marbles have an offsetting color (or luminance, gray-scale level, contrast, etc.) when compared to the surface or to other marbles. We call these offset color marbles our “message marbles.” The message marbles are arranged to represent (or to form) a steganographic signal. More practical, perhaps, is to initially arrange the message marbles according to a predetermined signal (e.g., a digital watermark signal, an orientation signal or various combinations of both) and then “fill-in” non-message marbles to complete the image.
Another example may be appealing to astronomers. Consider an image or graphic depicting a nighttime sky. The sky is populated with “message stars.” The message stars are arranged in the nighttime sky in a predetermined manner according to a steganographic signal. The sky is further populated with non-message stars. The message stars are preferably distinguishable from the non-message stars. The distinguishing characteristics need not be visibly perceptible, and may be based on subtle differences, e.g., as measured in luminance, color levels, brightness, contrast, etc.
A steganographic decoder, analyzing a digital version (e.g., an optically captured image) of the Dalmatian puppy, marble image or populated nighttime sky, decodes the steganographic message.
While the above signal-conveying techniques rely on the arrangement of message objects in a spatial domain (e.g., in an image), message formation or detection can be based in either a spatial or transform (e.g., Fourier or frequency) domain. For example, the arrangement of the message objects in the spatial domain may have significance in a frequency domain (e.g., may correspond to a pattern of peaks, etc.). Message detection can accordingly be facilitated, e.g., as discussed in Assignee's U.S. patent application Ser. Nos. 09/940,872 and 09/503,881, each of which is herein incorporated by reference. The incorporated-by-reference patent documents detail many techniques for signal hiding and message detection.
(Applicant notes by way of example that since the Fourier transform is a linear operation, adjustments to pixels in a region may be made in the frequency or spatial domain. For example, a digital watermark embedder can adjust the frequency domain representation of the host signal according to the watermark definition to form a frequency domain representation of the watermarked signal region, and then take the inverse Fourier transform of the watermarked regions to produce the watermarked signal in the spatial domain. Alternatively, the embedder can compute a difference signal to effect the desired changes to the region in the frequency domain, and then compute the inverse transform of the difference signal into the spatial domain, where corresponding pixels of the difference signal and host signal region are summed. Either way, the result is a watermarked signal in the original domain of the host signal.).
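The linearity noted above can be checked numerically. The following sketch (using numpy, with an arbitrary made-up pair of frequency-domain peaks standing in for a real watermark definition) shows that adjusting the host in the frequency domain and inverse-transforming yields the same watermarked region as inverse-transforming the difference signal alone and adding it in the spatial domain:

```python
import numpy as np

rng = np.random.default_rng(7)
host = rng.random((32, 32))             # host image region (spatial domain)
wm = np.zeros((32, 32), dtype=complex)  # watermark defined in the frequency domain
wm[3, 5] = wm[-3, -5] = 4.0             # a conjugate-symmetric pair of peaks

# Route 1: adjust the frequency-domain representation, then inverse-transform.
marked_a = np.real(np.fft.ifft2(np.fft.fft2(host) + wm))

# Route 2: inverse-transform the difference signal alone, then sum with the
# host in the spatial domain. By linearity, the result is identical.
marked_b = host + np.real(np.fft.ifft2(wm))

assert np.allclose(marked_a, marked_b)
```

Either route produces the same watermarked signal in the original (spatial) domain of the host, up to floating-point error.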
Steganographic Image Construction
We now describe our preferred image construction with reference to
Thresholding is applied to the
While the term “thresholding” is used as an illustrative technique, the present invention is not so limited. Indeed, there are other ways to map a watermark signal into a set of relevant points. For example, a frequency analysis can be used to identify significant frequency characteristics (e.g., peaks). These characteristics can be mapped to the spatial domain to identify a placement point (e.g., a spatial location). Or, as in another implementation, the digital watermark signal is quantized, e.g., via a root mean square measurement. Of course, other techniques can be used to map a watermark signal to a set of relevant spatial points that is sufficient to convey the signal. We also note that some digital watermark signals can be conveyed in terms of a plurality of spatial positions and points.
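By way of illustration only, such a thresholding pass can be sketched as follows; the random tile stands in for an actual watermark signal, and the 5% keep-fraction is an arbitrary choice for this sketch, not a value taken from the specification:

```python
import numpy as np

def watermark_to_placement_map(signal, keep_fraction=0.05):
    """Keep only the strongest samples of a spatial watermark signal,
    returning their (row, col) coordinates as message-object positions."""
    flat = np.abs(signal).ravel()
    cutoff = np.quantile(flat, 1.0 - keep_fraction)   # threshold level
    rows, cols = np.nonzero(np.abs(signal) >= cutoff)
    return list(zip(rows.tolist(), cols.tolist()))

rng = np.random.default_rng(1)
wm_signal = rng.standard_normal((64, 64))   # stand-in watermark tile
points = watermark_to_placement_map(wm_signal, keep_fraction=0.05)
print(len(points))  # roughly 5% of the 4096 samples survive the threshold
```

The surviving coordinates play the role of the placement map: message objects are later positioned at (or near) these points.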
The set of relevant points comprises a placement map as shown in FIG. 3. (It should be appreciated that there are many possible placement map patterns depending on message and orientation components, protocol, etc., including many more or fewer significant positions.)
In one alternative implementation, the placement map includes a relative contrast indicator. For example, a placement map point may include a relatively darker or larger point (20), indicating a need for a stronger contrast level (or color, gray-scale, etc.) of a message object, in comparison to a relatively lighter or smaller point (21). A higher contrast may signify a predetermined frequency domain characteristic, such as peak magnitude or frequency response, etc.
Returning to
With reference to
Alternative Steganographic Images
There are many alternative ways to utilize our inventive steganographic techniques. Consider the following illustrative examples.
Digital Images
Off-the-shelf digital image editing tools (e.g., as provided by Adobe) can be used to place message objects in an image. The operator selects a message object, adjusts the object contrast (or color/gray-level, etc.) to sufficiently set the object apart from other image objects or the image background, and then places the objects according to a digital watermark placement map. Of course this placement process can be automated.
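A minimal sketch of such an automated placement step might look like the following; the +3 luminance bias is a hypothetical choice for illustration, not a prescribed value:

```python
import numpy as np

def place_message_objects(image, points, delta=3):
    """Subtly lighten the image at each placement-map point so the message
    objects stand slightly apart from the background (illustrative sketch)."""
    out = image.astype(np.int16).copy()
    for r, c in points:
        out[r, c] = min(255, out[r, c] + delta)  # small, near-imperceptible bias
    return out.astype(np.uint8)

bg = np.full((8, 8), 128, dtype=np.uint8)          # flat gray background
marked = place_message_objects(bg, [(2, 3), (4, 4)])
# marked differs from bg only at the two placement-map points
```

A real tool would of course place larger objects (spots, marbles, stars) rather than single pixels, but the contrast-bias principle is the same.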
Marking Fields
A farmer's field can be plowed or crops can be planted or cut to represent message objects, all arranged according to a steganographic placement map. An aerial image of the farmer's field then includes the steganographic message.
Vegetation
Different color tulips (or other flowers, trees or vegetation) can be planted according to a placement map. For example, groups of white tulips (e.g., message tulips) can be planted in an otherwise purple tulip field. An aerial image captures the field—including the steganographic message conveyed through the strategic placement of the white tulips.
Marking Buildings, Airports and Roads
With reference to
Our building-marking process is further described with reference to
Marking a building with a geo-location indicator offers many advantages. A captured image of an area that depicts such a marked building provides a fixed geo-reference point via the hidden signal on the building. Other image locations can be determined based on a frame of reference provided by the geo-location indicator. For example, once a geo-location is known for the building, other geo-locations in the image can be interpolated. (We note that a steganographic signal or digital watermark signal may include an orientation component. An orientation component is useful in resolving issues of rotation and scale or other image distortion. Thus, precise image locations can be determined in relation to the geo-referenced building location by accounting for image rotation and scale.) The arduous process of assigning geo-coordinates to a captured image is significantly simplified with our techniques.
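For illustration, interpolation from a single geo-referenced point might be sketched as below under a flat-earth approximation; the function name, the rotation and scale parameters (assumed recovered, e.g., from the watermark's orientation component), and the meters-per-degree constant are all assumptions of this sketch:

```python
import math

def pixel_to_geo(px, py, ref_px, ref_py, ref_lat, ref_lon,
                 meters_per_pixel, rotation_deg, meters_per_deg=111_320.0):
    """Interpolate a geo-location for any pixel given one geo-referenced
    pixel (the marked building) plus image scale and rotation.
    Flat-earth approximation; image y grows downward (south at 0 rotation)."""
    theta = math.radians(rotation_deg)
    dx, dy = px - ref_px, py - ref_py
    # rotate image-plane offsets into east/north ground offsets, then scale
    east = (dx * math.cos(theta) - dy * math.sin(theta)) * meters_per_pixel
    north = -(dx * math.sin(theta) + dy * math.cos(theta)) * meters_per_pixel
    lat = ref_lat + north / meters_per_deg
    lon = ref_lon + east / (meters_per_deg * math.cos(math.radians(ref_lat)))
    return lat, lon

# a pixel 10 right and 10 up from the marked building lies north-east of it
lat, lon = pixel_to_geo(110, 90, 100, 100, 45.0, -122.0, 1.0, 0.0)
```

With the reference point fixed by the hidden signal, every other pixel in the frame inherits an approximate geo-coordinate through this transform.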
When analyzing an image that includes a marked structure, a hidden signal can be isolated to a particular structure by including a signal payload field that identifies the structure's boundaries, and perhaps a center point of the structure. In another implementation, the hidden signal is isolated to a particular depicted structure through the analysis of a signal detector. A signal detector sniffs (e.g., looks through) an image and determines where within the image it can detect the signal. Such analysis can be used to trace a boundary of the marked structure, since the signal is confined within such boundaries.
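The sniffing pass described above might be sketched as a sliding-window scan; the `detector` callable here is a hypothetical stand-in for a real watermark detector:

```python
import numpy as np

def trace_marked_region(image, detector, block=16, step=8):
    """Slide a window across the image, run the signal detector on each
    block, and return a boolean mask of where the hidden signal is found.
    `detector` is any callable mapping an image block to True/False."""
    h, w = image.shape
    mask = np.zeros((h, w), dtype=bool)
    for top in range(0, h - block + 1, step):
        for left in range(0, w - block + 1, step):
            patch = image[top:top + block, left:left + block]
            if detector(patch):
                mask[top:top + block, left:left + block] = True
    return mask

img = np.zeros((64, 64))
img[16:48, 16:48] = 200                        # the "marked" rooftop
mask = trace_marked_region(img, lambda p: p.mean() > 100)
# mask is True over the marked region and False over the background corners
```

The boundary of the marked structure then falls where the mask transitions from detections to non-detections.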
Returning to
One application of our inventive techniques is to provide a ground reference indicator for aircraft or satellites. Consider an aircraft, which captures an image of a marked building, and then decodes the signal to recover hidden geo-location information. The geo-location information can be used to provide reliable (as well as machine-readable) ground position verification for the aircraft. The aircraft can adjust or verify its position or flight path based on such geo-location information, or verify ground coordinates based upon such geo-location information.
While the above example has focused on buildings, this aspect of the present invention is not so limited. Indeed, bridges, houses, docks, warehouses, streets, parking lots, roads, buildings, water or fuel tanks, hospitals, power utilities, dams, etc. can be similarly marked—such structures are referred to as “physical structures.” In one implementation, we arrange naturally occurring objects, e.g., rocks, plants, etc., to mark an otherwise barren or rural area. Also, for some implementations, the marking need not be on top of a structure. Instead, the marking can be on a side of a structure, particularly when aerial imagery is captured from an angle.
Photo-mosaics
Our techniques can even be advantageously employed in the photo-mosaic field. (Photo-mosaic processes are further discussed, e.g., in U.S. Pat. Nos. 6,137,498 and 5,649,032, which are each incorporated herein by reference.) As disclosed in U.S. Pat. No. 6,137,498, a mosaic image is formed from a database (or collection) of source images. Source images are analyzed, selected and organized to produce the mosaic image. A target image is divided into tile regions, each of which is compared with individual source image portions to determine the best available matching source image. Positioning respective best-matching source images at the respective tile regions forms the mosaic image.
An improvement to a photo-mosaic process is to arrange message source photos (e.g., representing message objects) according to a steganographic placement map. Preferably, the message source photos are subtly distinguishable from other mosaic photos via a gray-scale value, a color value, contrast or luminance, etc. The message source photos form (or convey) a steganographic signal. In one implementation, the arrangement of message source photos is carried out via the “best available matching” discussed above with respect to U.S. Pat. No. 6,137,498. In a first implementation, the process determines whether a selected best available photo is to be tiled over a placement map position. If so, the photo's characteristics (luminance, contrast, gray-scale, etc.) are subtly altered to create a message source photo. In a second implementation, the “best available matching” algorithm includes selection criteria, e.g., when selecting a photo for a placement map position, the algorithm selects a photo with sufficient distinguishing characteristics to qualify as a message object. The distinguishing characteristics can be measured relative to a photo's neighbors (e.g., a message photograph may include an overall different contrast, color or gray-scale level from its neighboring photographs) or relative to non-message photographs. In a third implementation, message source photos are arranged according to a placement map, and then other source photos are used to fill in or complete the photo-mosaic. In a variation of this third implementation, the other, non-message source photos are selected and arranged according to a best available matching technique.
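The first implementation above (altering the best-matching photo's characteristics at placement-map positions) can be sketched as follows; tiles are modeled as small grayscale arrays, and the +4 luminance bias is an arbitrary illustrative value, not one drawn from the cited patents:

```python
import numpy as np

def build_mosaic_with_message(target_tiles, library, message_points, bias=4):
    """For each tile region, pick the best-matching library image (smallest
    mean absolute difference); at placement-map positions, subtly brighten
    the chosen tile so it doubles as a message source photo."""
    mosaic = {}
    for pos, tile in target_tiles.items():
        # best available match from the source-image library
        best = min(library,
                   key=lambda s: np.abs(s.astype(int) - tile.astype(int)).mean())
        chosen = best.copy()
        if pos in message_points:          # turn it into a message source photo
            chosen = np.clip(chosen.astype(int) + bias, 0, 255).astype(np.uint8)
        mosaic[pos] = chosen
    return mosaic

library = [np.full((4, 4), 50, np.uint8), np.full((4, 4), 200, np.uint8)]
tiles = {(0, 0): np.full((4, 4), 60, np.uint8),
         (0, 1): np.full((4, 4), 190, np.uint8)}
mosaic = build_mosaic_with_message(tiles, library, message_points={(0, 1)})
```

The tile at the placement-map position carries a slight luminance offset relative to its neighbors, which is what a reader later keys on.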
Other Alternatives
Similar embedded designs can be made using coins, bills, umbrellas, coffee mugs, opened or closed blinds in a building, window tints, shapes, snowflakes, groups of pixels, etc.
Advertisements
Captured images including our steganographic signal arranged with message objects can be used in advertisements, e.g., when the steganographic signal includes a message link (e.g., a URL or an identifier used to obtain a link). Assignee's U.S. patent application Ser. No. 09/571,422, filed May 15, 2000, discloses many suitable linking techniques that are contemplated as being combined with the present invention. The Ser. No. 09/571,422 application is herein incorporated by reference.
Message Detection
A steganographic signal created according to our inventive techniques can be read using steganographic or digital watermarking decoding techniques, e.g., as described in assignee's U.S. patent application Ser. Nos. 09/571,422 and/or 09/503,881. In one implementation, Digimarc MediaBridge watermark reading software, available from Digimarc Corporation, headquartered in Tualatin, Oreg., is used to read an image including a corresponding MediaBridge digital watermark signal represented through our message object arranging techniques. Of course, other decoding techniques can be used, particularly when they correspond to the techniques used to generate the original watermark signal. (For example, when using a Digimarc MediaBridge reader, the watermark signal is preferably created using a MediaBridge signal generator or embedder.) Most commonly, the reader identifies the message objects from the different levels of contrast (or color, gray-scale, luminance, etc.) between a message object and other objects or the background.
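As a toy illustration of this contrast-based identification (not the MediaBridge reader itself), message-object positions can be recovered from a roughly uniform background like this:

```python
import numpy as np

def recover_placement_points(image, background_level, tol=1):
    """Recover message-object positions by contrast: any pixel departing
    from the (roughly uniform) background by more than `tol` is treated as
    part of a message object. A toy stand-in for a full watermark reader."""
    diff = np.abs(image.astype(int) - int(background_level))
    rows, cols = np.nonzero(diff > tol)
    return set(zip(rows.tolist(), cols.tolist()))

img = np.full((8, 8), 128, np.uint8)
for r, c in [(1, 2), (5, 6)]:
    img[r, c] = 131                     # subtly lighter message objects
found = recover_placement_points(img, 128)
```

A full decoder would then interpret the recovered arrangement against the watermark protocol (message and orientation components) rather than simply returning coordinates.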
The foregoing are just exemplary implementations of the present invention. It will be recognized that there are a great number of variations on these basic themes. The foregoing illustrates but a few applications of the detailed technology. There are many others.
To provide a comprehensive disclosure without unduly lengthening this specification, applicants incorporate by reference, in their entireties, the disclosures of the above-cited patents and applications. The particular combinations of elements and features in the above-detailed embodiments are exemplary only; the interchanging and substitution of these teachings with other teachings in this application and the incorporated-by-reference patents/applications are contemplated. For example, while we have focused on providing hidden signals via object arrangement, the present invention is not so limited. Indeed, we envision that other digital watermarking techniques could be used to mark images or designs that are placed on physical structures.
One application uses our inventive embedding techniques for “long-distance” or aerial digital watermark reading, e.g., for some of the traffic monitoring examples disclosed in Assignee's U.S. Provisional Patent Application No. 60/327,687, filed Oct. 5, 2001. (In one experiment we created a digitally watermarked image using our inventive object placement techniques disclosed herein. We then optically captured an image of the watermarked image with a digital camera augmented with a telescope. The watermarked image was about 100 feet away from the camera/telescope. The watermark was successfully detected.).
In one alternative implementation of the present invention, we use our techniques to provide a “zero-emission” communications signal. In other words, our above-described techniques make it nearly impossible for an “eavesdropper” to intercept our message. We arrange message objects on a top surface of a structure to convey a message. An aerial platform collects imagery including our message. The imagery is routed to the intended recipient, who decodes the image to retrieve our message. In some cases we arrange a set of message objects on a structure to convey a signal orientation component. (The orientation component assists in resolving issues such as image rotation and scale, etc.) We can then change only select objects of another set of message objects, i.e., the objects that convey the message, to update or alter the message.
Although not belabored herein, artisans will understand that the systems and methods described above can be implemented using a variety of hardware and software systems. Alternatively, dedicated hardware, or programmable logic circuits, can be employed for such operations.
The various section headings in this application are provided for the reader's convenience and provide no substantive limitations. The features found in one section may be readily combined with those features in another section.
In view of the wide variety of embodiments to which the principles and features discussed above can be applied, it should be apparent that the detailed embodiments are illustrative only and should not be taken as limiting the scope of the invention. Rather, we claim as our invention all such modifications as may come within the scope and spirit of the following claims and equivalents thereof.
The present application is a continuation in part of U.S. patent application Ser. No. 10/074,680, filed Feb. 11, 2002. The present application is also a continuation in part of U.S. patent application Ser. No. 09/939,298, filed Aug. 24, 2001 (now U.S. Pat. No. 6,804,379), which is a continuation in part of U.S. patent application Ser. No. 09/127,502, filed Jul. 31, 1998 (now U.S. Pat. No. 6,345,104), which is a continuation-in-part of U.S. patent application Ser. No. 09/074,034, filed May 6, 1998 (now U.S. Pat. No. 6,449,377). The Ser. No. 09/127,502 application is also a continuation-in-part of U.S. patent application Ser. No. 08/967,693, filed Nov. 12, 1997 (now U.S. Pat. No. 6,122,392), which is a continuation of application Ser. No. 08/614,521, filed Mar. 15, 1996 (now U.S. Pat. No. 5,745,604), which is a continuation of application Ser. No. 08/215,289, filed Mar. 17, 1994 (now abandoned). The Ser. No. 09/127,502 application is also a continuation-in-part of application Ser. No. 08/649,419, filed May 16, 1996 (now U.S. Pat. No. 5,862,260). The Ser. No. 09/127,502 application also claims the benefit of U.S. Provisional application No. 60/082,228, filed Apr. 16, 1998. The present application also claims the benefit of assignee's U.S. Provisional Patent Application No. 60/350,505, filed Jan. 18, 2002, titled “Data Hiding Through Arrangement of Objects.” The present application is also related to U.S. patent application Ser. No. 09/940,872, filed Aug. 27, 2001, and PCT Patent Application No. PCT/US 02/06858, filed Mar. 5, 2002. Each of the above-mentioned patent documents is herein incorporated by reference.
Number | Name | Date | Kind |
---|---|---|---|
4084241 | Tsumura | Apr 1978 | A |
4271351 | Bloodworth | Jun 1981 | A |
4504910 | Araki et al. | Mar 1985 | A |
4631678 | Angermuller et al. | Dec 1986 | A |
5113445 | Wang | May 1992 | A |
5214757 | Mauney et al. | May 1993 | A |
5280537 | Sugiyama et al. | Jan 1994 | A |
5329108 | Lamoure | Jul 1994 | A |
5375058 | Bass | Dec 1994 | A |
5385371 | Izawa | Jan 1995 | A |
5469371 | Bass | Nov 1995 | A |
5499294 | Friedman | Mar 1996 | A |
5502576 | Ramsay et al. | Mar 1996 | A |
5664018 | Leighton | Sep 1997 | A |
5812962 | Kovac | Sep 1998 | A |
5825892 | Braudaway et al. | Oct 1998 | A |
5861841 | Gildea et al. | Jan 1999 | A |
5889868 | Moskowitz et al. | Mar 1999 | A |
5889898 | Koren et al. | Mar 1999 | A |
5901178 | Lee et al. | May 1999 | A |
5919730 | Gasper et al. | Jul 1999 | A |
5958051 | Renaud et al. | Sep 1999 | A |
5964821 | Brunts et al. | Oct 1999 | A |
5990826 | Mitchell | Nov 1999 | A |
6005936 | Shimizu et al. | Dec 1999 | A |
6031914 | Tewfik et al. | Feb 2000 | A |
6122403 | Rhoads | Sep 2000 | A |
6130741 | Wen et al. | Oct 2000 | A |
6175639 | Satoh et al. | Jan 2001 | B1 |
6181802 | Todd | Jan 2001 | B1 |
6185312 | Nakamura et al. | Feb 2001 | B1 |
6198832 | Maes et al. | Mar 2001 | B1 |
6205249 | Moskowitz | Mar 2001 | B1 |
6243480 | Zhao et al. | Jun 2001 | B1 |
6246777 | Agarwal et al. | Jun 2001 | B1 |
6249226 | Harrison et al. | Jun 2001 | B1 |
6263438 | Walker et al. | Jul 2001 | B1 |
6282362 | Murphy et al. | Aug 2001 | B1 |
6289453 | Walker et al. | Sep 2001 | B1 |
6301360 | Bocionek et al. | Oct 2001 | B1 |
6310956 | Morito et al. | Oct 2001 | B1 |
6311214 | Rhoads | Oct 2001 | B1 |
6320829 | Matsumoto et al. | Nov 2001 | B1 |
6324573 | Rhoads | Nov 2001 | B1 |
6332149 | Warmus et al. | Dec 2001 | B1 |
6332193 | Glass et al. | Dec 2001 | B1 |
6341350 | Miyahara et al. | Jan 2002 | B1 |
6343138 | Rhoads | Jan 2002 | B1 |
6351439 | Miwa et al. | Feb 2002 | B1 |
6401206 | Khan et al. | Jun 2002 | B1 |
6408082 | Rhoads et al. | Jun 2002 | B1 |
6408331 | Rhoads | Jun 2002 | B1 |
6411725 | Rhoads | Jun 2002 | B1 |
6427020 | Rhoads | Jul 2002 | B1 |
6448979 | Schena et al. | Sep 2002 | B1 |
6493514 | Stocks et al. | Dec 2002 | B1 |
6496802 | van Zoest et al. | Dec 2002 | B1 |
6498984 | Agnew et al. | Dec 2002 | B2 |
6504571 | Narayanaswami et al. | Jan 2003 | B1 |
6505160 | Levy et al. | Jan 2003 | B1 |
6512835 | Numao et al. | Jan 2003 | B1 |
6522770 | Seder et al. | Feb 2003 | B1 |
6532541 | Chang et al. | Mar 2003 | B1 |
6542927 | Rhoads | Apr 2003 | B2 |
6556688 | Ratnakar | Apr 2003 | B1 |
6614914 | Rhoads et al. | Sep 2003 | B1 |
6636249 | Rekimoto | Oct 2003 | B1 |
6700994 | Maes et al. | Mar 2004 | B2 |
20010001854 | Schena et al. | May 2001 | A1 |
20010019611 | Hilton | Sep 2001 | A1 |
20010022667 | Yoda | Sep 2001 | A1 |
20010026377 | Ikegami | Oct 2001 | A1 |
20010026616 | Tanaka | Oct 2001 | A1 |
20010026629 | Oki | Oct 2001 | A1 |
20010030759 | Hayashi et al | Oct 2001 | A1 |
20010031064 | Donescu et al. | Oct 2001 | A1 |
20010033674 | Chen et al. | Oct 2001 | A1 |
20010034835 | Smith | Oct 2001 | A1 |
20010039546 | Moore et al. | Nov 2001 | A1 |
20010046307 | Wong | Nov 2001 | A1 |
20010051964 | Warmus et al. | Dec 2001 | A1 |
20020001395 | Davis et al. | Jan 2002 | A1 |
20020002679 | Murakami et al. | Jan 2002 | A1 |
20020006212 | Rhoads et al. | Jan 2002 | A1 |
20020009209 | Inoue et al. | Jan 2002 | A1 |
20030032033 | Anglin et al. | Feb 2002 | A1 |
20020044690 | Burgess | Apr 2002 | A1 |
20020046178 | Morito et al. | Apr 2002 | A1 |
20020057340 | Fernandez et al. | May 2002 | A1 |
20020059520 | Murakami et al. | May 2002 | A1 |
20020065844 | Robinson et al. | May 2002 | A1 |
20020069370 | Mack | Jun 2002 | A1 |
20020075298 | Schena et al. | Jun 2002 | A1 |
20020080396 | Silverbrook et al. | Jun 2002 | A1 |
20020095586 | Doyle et al. | Jul 2002 | A1 |
20020095601 | Hind et al. | Jul 2002 | A1 |
20020106105 | Pelly et al. | Aug 2002 | A1 |
20020122564 | Rhoads et al. | Sep 2002 | A1 |
20020124171 | Rhoads | Sep 2002 | A1 |
20020124173 | Stone | Sep 2002 | A1 |
20020135600 | Rhoads | Sep 2002 | A1 |
20020136531 | Harradine et al. | Sep 2002 | A1 |
20020147910 | Brundage et al. | Oct 2002 | A1 |
20020159765 | Maruyama et al. | Oct 2002 | A1 |
20020168069 | Tehranchi et al. | Nov 2002 | A1 |
20020191810 | Fudge et al. | Dec 2002 | A1 |
20030011684 | Narayanaswami et al. | Jan 2003 | A1 |
20030012562 | Lawandy et al. | Jan 2003 | A1 |
20030021439 | Lubin et al. | Jan 2003 | A1 |
20030040326 | Levy et al. | Feb 2003 | A1 |
20030048908 | Hamilton | Mar 2003 | A1 |
20030053654 | Patterson et al. | Mar 2003 | A1 |
20030063319 | Umeda et al. | Apr 2003 | A1 |
20030069693 | Snapp et al. | Apr 2003 | A1 |
20030074556 | Chapman et al. | Apr 2003 | A1 |
20030083098 | Yamazaki et al. | May 2003 | A1 |
20030090690 | Katayama et al. | May 2003 | A1 |
20030215110 | Rhoads et al. | Nov 2003 | A1 |
20040162981 | Wong | Aug 2004 | A1 |
20040201676 | Needham | Oct 2004 | A1 |
20040221244 | Baldino | Nov 2004 | A1 |
Number | Date | Country |
---|---|---|
0 947 953 | Oct 1999 | EP |
0 953 938 | Nov 1999 | EP |
0 935 872 | Nov 2001 | EP |
1220152 | Jul 2002 | EP |
2371934 | Aug 2002 | GB |
2004-041144 | Feb 2000 | JP |
WO 9917537 | Apr 1999 | WO |
WO 0105075 | Jan 2001 | WO |
WO0124113 | Apr 2001 | WO |
WO0139121 | May 2001 | WO |
WO0176253 | Oct 2001 | WO |
WO0203328 | Jan 2002 | WO |
WO0233650 | Apr 2002 | WO |
Number | Date | Country | |
---|---|---|---|
20030053654 A1 | Mar 2003 | US |
Number | Date | Country | |
---|---|---|---|
60082228 | Apr 1998 | US | |
60350505 | Jan 2002 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 08614521 | Mar 1996 | US |
Child | 08649419 | US | |
Parent | 08215289 | Mar 1994 | US |
Child | 08614521 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 10074680 | Feb 2002 | US |
Child | 10218021 | US | |
Parent | 09939298 | Aug 2001 | US |
Child | 10074680 | US | |
Parent | 09127502 | Jul 1998 | US |
Child | 09939298 | US | |
Parent | 09074034 | May 1998 | US |
Child | 09127502 | US | |
Parent | 08967693 | Nov 1997 | US |
Child | 09074034 | US | |
Parent | 08649419 | May 1996 | US |
Child | 08967693 | US |