Methods and systems for user-association of visual stimuli with corresponding responses

Information

  • Patent Grant
  • Patent Number
    9,497,341
  • Date Filed
    Thursday, October 2, 2008
  • Date Issued
    Tuesday, November 15, 2016
Abstract
Methods and systems permit a user to decide what different responses are triggered when different visual stimuli are presented to the user's wireless communications device.
Description

The subject matter of the present application is related to that disclosed in U.S. Pat. No. 5,862,260, and in application Ser. No. 09/503,881, filed Feb. 14, 2000 (now U.S. Pat. No. 6,614,914), which are hereby incorporated by reference.


TECHNICAL FIELD

The present technology relates to signal processing, and in particular relates to arrangements for associating user-defined behaviors with different visual stimuli.


BACKGROUND AND SUMMARY

Digital watermarking is a process for modifying physical or electronic media to embed a machine-readable code into the media. The media may be modified such that the embedded code is imperceptible or nearly imperceptible to the user, yet may be detected through an automated detection process. Most commonly, digital watermarking is applied to media signals such as images, audio signals, and video signals. However, it may also be applied to other types of media objects, including documents (e.g., through line, word or character shifting), software, multi-dimensional graphics models, and surface textures of objects.


Digital watermarking systems typically have two primary components: an encoder that embeds the watermark in a host media signal, and a decoder that detects and reads the embedded watermark from a signal suspected of containing a watermark (a suspect signal). The encoder embeds a watermark by altering the host media signal. The reading component analyzes a suspect signal to detect whether a watermark is present. In applications where the watermark encodes information, the reader extracts this information from the detected watermark.
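The two-component structure can be made concrete with a toy sketch that embeds and reads a payload in the least significant bits of pixel values. This is not the patented watermarking technique (that is detailed in the incorporated patents), only a minimal illustration of the encoder and decoder roles; all names are illustrative.

```python
# Toy illustration of the encoder/decoder pair: least-significant-bit
# embedding in 8-bit pixel values. NOT the patented technique, only a
# sketch of the embed/detect structure described in the text.

def embed(pixels, bits):
    """Encoder: alter the host signal by overwriting pixel LSBs."""
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit  # replace LSB with payload bit
    return out

def decode(pixels, n_bits):
    """Decoder: read the payload back from a suspect signal."""
    return [p & 1 for p in pixels[:n_bits]]

watermarked = embed([120, 133, 97, 64], [1, 0, 1, 1])
payload = decode(watermarked, 4)  # recovers [1, 0, 1, 1]
```

Each pixel changes by at most one intensity level, which mimics (crudely) the imperceptibility requirement; a real embedder modulates the signal far more robustly.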


Several particular watermarking techniques have been developed. The reader is presumed to be familiar with the literature in this field. Particular techniques for embedding and detecting imperceptible watermarks in media signals are detailed in the assignee's application Ser. No. 09/503,881 (now U.S. Pat. No. 6,614,914) and U.S. Pat. No. 5,862,260, which are hereby incorporated by reference.


Content can also be processed to extract an identifier by techniques such as applying a hashing algorithm to the content data, yielding, e.g., a 128-bit identifier.
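As a sketch of this hashing alternative: MD5 is one hashing algorithm whose digest happens to be 128 bits, so it serves to illustrate the idea. The function name below is an assumption for illustration, not from the patent.

```python
import hashlib

def content_identifier(content: bytes) -> str:
    # MD5 produces a 128-bit digest; any stable hash of the content
    # bytes can serve as an identifier of that content.
    return hashlib.md5(content).hexdigest()

# Identical content always yields the same identifier.
ident = content_identifier(b"example media content")
```

Because the identifier is derived from the content itself, no embedding step is needed, though unlike a watermark the identifier changes if the content is modified.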


The present technology provides methods and systems for associating objects with machine behaviors. In this context, machine behaviors refer to actions by devices or systems in response to a triggering event. Examples of these behaviors include fetching a web page, opening an email client to send an email to a specific person, initiating a phone or video conference call, etc.


One illustrative embodiment is a method by which an end-user customizes behavior of a camera-equipped wireless communications device. The customization includes defining different user-desired behaviors that are associated with different visual stimuli, so that different behaviors are triggered when the user later presents different visual stimuli to the device.


Another illustrative embodiment is a computer including a user interface on which an image of an object is presented on the left side, and an image depicting a corresponding behavior is presented on the right side. Associated controls can permit the user to associate different objects with different behaviors.


Further features will become apparent with reference to the following detailed description and accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a system diagram depicting a system for associating watermark enabled objects with machine behaviors, and for triggering those behaviors in response to decoding watermarks.



FIG. 2 is a diagram depicting a variant of the system shown in FIG. 1, with system components that enable users to perform watermark identifier registration and embedding.



FIG. 3 is a flow diagram illustrating initiating an identified behavior and presenting output to a user.



FIG. 4 is a block diagram of a wireless telephone device.





DETAILED DESCRIPTION

The following description details a system and related methods for associating watermark enabled objects with machine behaviors. To illustrate the system, the description focuses on an example of watermark enabled stickers. As noted, the system applies more broadly to watermarking both physical and electronic objects. In particular, aspects of the system may be used for watermarking media signals like images, video and audio, as well as applying watermarks to physical objects. Watermarks may be applied to physical objects by placing a watermarked image on a physical object, by modulating the surface topology of the object, etc. See U.S. Pat. No. 5,862,260, for more information about watermark embedding of and decoding from physical and electronic objects.


Stickers in all their varieties have found an enduring place in our society. From the workplace (Post-it® brand message notes) to kids in a classroom, stickers have an inherent value, whether functional (seals, labels, etc.) or as a way to identify oneself with a particular affinity group (bumper stickers on cars). By placing a watermark on stickers, they can be used in novel ways. By encoding a set of stickers with a watermark during production, specific machine behaviors can be assigned to them. These behaviors can be assigned, or even changed, by anyone from the manufacturer through the distributor, all the way to the end-user. In addition, users can create their own watermark enabled stickers by creating an image, embedding a watermark in it, and associating the watermark with one or more machine behaviors.


These behaviors may include, but are not limited to the following:

    • Taking the user to a web site linked to the watermark via a network address of the web site or an index to the network address.
    • Opening an email client to email a specific person (e.g., a person whose email address is stored in the machine behavior description associated with the watermark).
    • Launching the user into an Internet Relay Chat (IRC) session that other people with the same sticker can participate in.
    • Authenticating the user as part of a process of accessing a network resource, such as account information or access to a computer network.
    • Authenticating the user in an electronic commerce transaction performed on a computer network.
    • Sending an electronic card.
    • Placing a phone or video-conference call.
    • Serving as props in a computer game. For example, the prop is a multi-sided, or multi-faceted, object, where each side or facet has a watermarked image conveying a different message used to control the game. The computer game includes a watermark decoder for extracting the messages from image frames captured of the prop. The watermark may directly carry the message or act as an index to a more detailed game instruction in a database, such as an instruction that changes over time based on changes to the corresponding database entry by the user or game manufacturer.
    • Serving as a visual aid for disabled users.
    • Operating anywhere machine vision is not feasible.


In each of the above applications, the watermark carries information that links the watermarked object (e.g., sticker) with a machine behavior. To trigger this behavior, a watermark decoder application captures an image or images of the watermarked sticker, extracts the watermark, and uses information embedded in the watermark to determine the associated machine behavior. The watermark decoder then takes action to initiate the machine behavior associated with the watermark.
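The decode-then-dispatch flow described above can be sketched as follows. The database contents and function names are hypothetical; in the described system the lookup happens in a remote database management system rather than a local dictionary.

```python
# Hypothetical behavior database mapping watermark identifiers to
# machine-behavior descriptions (stand-in for the remote database).
behavior_db = {
    "wm-0001": {"type": "web", "url": "http://example.com/promo"},
    "wm-0002": {"type": "email", "to": "friend@example.com"},
}

def handle_decoded_identifier(watermark_id):
    # Look up the behavior linked to the decoded watermark identifier
    # and return a description of the action to initiate.
    record = behavior_db.get(watermark_id)
    if record is None:
        return "no behavior registered"
    if record["type"] == "web":
        return "launch browser: " + record["url"]
    if record["type"] == "email":
        return "open email client to: " + record["to"]
    return "unknown behavior"
```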


For some applications, it is useful to enable the user to control the behavior associated with a watermarked object. This type of management may be handled by creating accounts for users and providing access to the accounts via some authentication method (email, passwords, etc.). For a number of reasons, these access methods can be problematic (losing passwords, asking kids for their email addresses, etc.). As an alternative, watermarks may be used to manage the process of associating behaviors with a watermarked object.


For example, in the scenario where a user wants to assign behaviors to a set of watermarked stickers he or she has received, the user can hold up the first sticker (or its packaging) and be taken to a registration application to activate the stickers in the pack.



FIG. 1 is a system diagram depicting a system for associating watermark enabled objects with machine behaviors, and for triggering those behaviors in response to decoding watermarks. The system depicted in FIG. 1 is implemented on a computer network, namely, the Internet. The user accesses the system via a computer 100 connected to the Internet. The term computer broadly encompasses a variety of devices, such as personal computers, set-top boxes, personal digital assistants, Internet appliances, telephones (including wireless devices), audio and video players, and imaging devices (CCD or CMOS cameras, camcorders, printers, fax machines, copiers, etc.). The computer is connected to an image capture device 102, such as a PC camera or scanner, and includes watermark decoder software for decoding watermarks from images captured from the image capture device.


The system architecture shown in FIG. 1 includes a system (106) for managing the process of assigning behaviors to watermarked objects as well as a database management system (108) for initiating behaviors in response to decoding watermarks from the objects. These two systems may be integrated or implemented separately. In the application depicted here, the registration system and database management system are accessible via a network interface using standard network technology, including HTML, XML, and TCP/IP. A watermark embedding system has previously embedded watermarks carrying watermark identifiers into stickers. The stickers (or packages of them) also include a registration identifier used to activate the behaviors associated with them.


The registration system maintains a registration database 110 of data records indicating the watermark identifiers associated with each registration identifier. The registration identifiers are serialized numbers corresponding to the watermarked stickers or packages of them; the watermark identifiers are object identifiers encoded into the watermarks on the corresponding stickers. When a user selects a behavior to be associated with a watermarked object via the registration system, the registration system sends an update 112 to a behavior database 114 specifying the behavior to be associated with a corresponding watermark identifier. In response, the database management system 108 updates its database to include a record that indicates the behavior associated with that watermark identifier.
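A minimal sketch of the two databases and the update path between them, assuming simple in-memory structures; the identifier formats and function names are assumptions for illustration.

```python
# registration identifier -> watermark identifiers covered by it
# (models the registration database 110; formats are illustrative)
registration_db = {
    "REG-1000": ["wm-0001", "wm-0002", "wm-0003"],
}
# watermark identifier -> behavior description (models behavior database 114)
behavior_db = {}

def register_behavior(registration_id, watermark_id, behavior):
    # Only watermark identifiers covered by the registration identifier
    # may be assigned a behavior; storing the record models the update 112.
    if watermark_id not in registration_db.get(registration_id, []):
        raise ValueError("watermark id not covered by this registration id")
    behavior_db[watermark_id] = behavior

register_behavior("REG-1000", "wm-0001",
                  {"type": "web", "url": "http://example.com"})
```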


The database management system 108 is also responsible for supporting machine behavior associated with a watermarked sticker in response to detection of the watermark on the sticker. It has a network interface for communicating with other computers over the Internet. In particular, it receives requests in the form of an XML packet from a watermark decoding computer, extracts a watermark identifier from the packet and looks up the associated behavior or behaviors in the behavior database. It then initiates the associated behavior. The details of how this behavior is carried out depend on the application and type of behavior.
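The patent does not specify the schema of the XML request packet, so the element names below are assumptions; the sketch only shows the step of extracting a watermark identifier from such a request before the behavior lookup.

```python
import xml.etree.ElementTree as ET

# The packet layout is assumed; the text says only that the request
# arrives as an XML packet carrying a watermark identifier.
def extract_watermark_id(xml_packet):
    root = ET.fromstring(xml_packet)
    return root.findtext("watermark_id")

packet = "<request><watermark_id>wm-0007</watermark_id></request>"
wm_id = extract_watermark_id(packet)  # "wm-0007"
```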


In a typical registration process, the user accesses the registration system via a registration web site, which presents an HTML interface to the users' computers. The user may fetch the HTML pages of this interface using an Internet browser or application program, like the watermark decoder application executing on the computer 100. This interface enables the user to enter a registration identifier to start a process of associating behaviors with watermark identifiers embedded in watermarked stickers. In response to a registration identifier, the registration system returns a page that enables the user to specify the behavior. In the case where the behavior is linking a watermarked sticker to a web site, the user specifies the network address of the web site, such as a URL or IP address. In the case where the behavior is linking a watermarked sticker to an email message, the user specifies the email address of the email recipient.


As noted above, there are many other types of watermark enabled behaviors. They can be classified as providing information to the watermark decoding computer, launching some software program or machine action, or a combination of both. Table 1 below gives some examples of behaviors, and the related information and actions.











TABLE 1

  • Linking to a web site. Information returned to the decoding computer: URL, web page. Associated machine or software actions: launching a browser on the client to fetch and render the web page at the URL.
  • Opening an email client. Information returned: email address of the target recipient. Actions: launching an email client and populating the address field with the target recipient.
  • Starting a chat session. Information returned: address of the chat session. Actions: launching a chat application (watermarks on the stickers can be designed such that only those holding the stickers can gain access to the chat session, each by showing the sticker to his or her watermark decoder enabled camera).
  • Accessing account information or other network resources. Information returned: address of the account information. Actions: launching a browser application to access the account information through a web interface; supplying user authentication information from the watermarked object and/or from the user (user password, user name, log on, etc.).
  • Sending an electronic card. Information returned: card template. Actions: launching a client application to enable the user to design the card and add a personal message, then launching an email application to send the electronic card (or a link to the electronic card).
  • Placing a phone or video conference call. Information returned: phone number or IP address of the destination. Actions: launching an application to initiate the call over the Internet or telephone network.
  • Props in an interactive computer game. Information returned: identifier of the prop, and possibly other context information, such as the game player holding the prop. Actions: the game application receives the prop and context information and responds accordingly.
  • Visual aid for disabled users. Information returned: information in the form of graphics, audio, or video (may provide the address of audio or video content at an audio or video server on the Internet). Actions: a browser or other media player application renders the information (such as the streaming media) on the decoding computer.
  • Machine control. Information returned: machine instruction. Actions: the machine or software executes the instruction.

For a given application, the registration system provides information to the user to enable the user to select the behavior and provide pertinent information, such as URL, IP address, phone number, email address, content file (e.g., audio, image or video file), etc. The registration system formulates a description of the behavior, associates it with the watermark identifier specified by the user, and creates an update 112 to the behavior database.


The user then uses the stickers or shares them with friends. To trigger the behavior of a sticker, a user captures an image of the sticker with an image capture device 102 using a watermark decoder application 104 executing on the computer 100. The watermark decoder extracts the watermark identifier from a watermark embedded in the image on the sticker. It then sends the watermark identifier to the database management system 108 via the Internet, which in turn, looks up the associated behavior. The database management system then triggers the associated behavior by sending information, or instructions back to the decoding computer. The decoding computer renders the information, and launches a software or other machine action associated with the instructions returned from the database. The database need not be implemented in a remote computer. For example, the database may be implemented in the watermark decoding computer or device.


As an enhancement to the registration process, objects may carry watermarks that automatically link the user to the registration web site. For example, one side of the sticker 116 or its packaging 118 may contain a watermark with the network address or an index to a network address of the registration web site. The user shows this part of the sticker or packaging to the image capture device. The watermark decoder extracts the watermark and looks up the network address in the behavior database, and launches a browser to fetch the registration web site. The watermark may also carry the registration identifier. In this case, the registration web site can tailor the web page returned to the user to be specific to the watermarked object. If the user or someone else previously associated a behavior with the sticker, the registration web site returns the current status associated with the registration identifier and the behaviors associated with the watermarked objects linked to that registration identifier. To get detailed information about particular watermarked objects during the registration process, the user can show the watermarked object to a camera, and use a watermark decoder to extract the watermark identifier and supply it to the registration system. In response, the registration system takes the watermark identifier, queries the behavior database via the database management system, and returns a description of the associated behaviors. This approach provides a simple and automated process of activating watermark enabled objects.


For more information about an object identifier registration system and system for linking objects with machine behaviors, see U.S. patent application Ser. No. 09/571,422 (now U.S. Pat. No. 6,947,571), which is hereby incorporated by reference.


In some applications, the user may wish to create his or her own watermarked objects. FIG. 2 illustrates a system that enables users to assign watermark identifiers to corresponding behaviors and objects and update the behavior database. In this particular system, the user's computer includes a watermark embedder application (120). However, the embedder application may be implemented on a separate computer, such as a server on the Internet accessible via a client application on the user's computer 100. In the former case, the user embeds the watermark into the desired image content on his computer. In the latter case, the client supplies the image content to the server, which performs watermark embedding and returns watermarked images to the client. In both cases, the watermarked objects are created by printing the watermarked images on objects.


The process begins when an embedder 120 creates a registration request. In the system shown in FIG. 2, the embedder 120 is a software application running on the computer 100. The embedder formulates the request in a request file. The system provides a template for the request file. The request file specifies the number of watermark identifiers requested and the names of the media files to be embedded. The file may also specify the behaviors to be associated with each watermark identifier. Alternatively, the user can specify the behaviors to be associated with the watermark identifier at a later time using the methods described in this document. In the case where embedding is performed on a server as opposed to the user's computer, the request file may also include the media file (e.g., an image file) carrying the content to be embedded with the watermark identifier.
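The request-file syntax is not given in the text, so the key/value layout below is an assumption; the sketch shows parsing the two fields the paragraph names (the number of identifiers requested and the media file names).

```python
# Assumed key/value layout for a registration request file; the patent
# states which fields the file specifies, not their syntax.
request_file = """\
identifiers_requested: 3
media_files: photo1.png, photo2.png, photo3.png
"""

def parse_request(text):
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(":")
        fields[key.strip()] = value.strip()
    return {
        "count": int(fields["identifiers_requested"]),
        "media_files": [f.strip() for f in fields["media_files"].split(",")],
    }

parsed = parse_request(request_file)
```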


Next, the embedder connects, via a network connection, to the registration system 106. In particular, it connects to a registration web site via an Internet connection. This web site requests the embedder's username and password to authenticate it.


The user enters his username and password via a user interface displayed on the PC 100 and submits them to the web site for authentication.


Upon authentication, the registration website 106 returns an HTML page presenting the embedder with a user interface screen that allows the user to locate the embedder's registration request file for uploading to the web site. The user then enters a command to instruct the embedder to upload the selected request file.


The embedder provides the information required to locate the file on the embedder's computer and submits it for upload.


The registration request file is uploaded into a registration loader program 122.


The registration loader 122 performs a quick scan of the uploaded registration request file and reports back to the embedder any errors in format that it detects. If there are errors, the file is not processed.


If the registration request file is properly formatted, the embedder receives a confirmation from the registration website 106 that the request file has been successfully uploaded and will be submitted for processing by the registration loader 122.


The embedder may now either submit a new registration request file or logoff of the registration web site 106.


The registration loader 122 uses the information contained in the embedder's uploaded registration request file to automatically allocate (register) watermark identifiers in a registration database 110. The identifiers are in the form of serial numbers. Once this process is completed, the registration loader 122 initiates a request to a registration extractor 124 for these new registration entries.
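Allocation of serialized watermark identifiers can be sketched as a simple monotonic counter; the "wm-NNNN" format is an assumption for illustration.

```python
import itertools

# Serialized identifier allocation; the identifier format is illustrative.
_serial = itertools.count(1)

def allocate_identifiers(n):
    # Each call registers (allocates) n new, never-reused serial numbers,
    # as the registration loader does for each uploaded request file.
    return ["wm-%04d" % next(_serial) for _ in range(n)]

batch = allocate_identifiers(3)
```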


Upon receipt of a request, the registration extractor 124 accesses the registration database 110 and creates embedder control files for each of these new registered watermark identifiers (e.g., serial numbers).


Upon completion of this process, the registration extractor 124 sends the embedder control file(s) back to the embedder via Internet e-mail. In the event that the embedder is server based, the extractor sends the control file(s) (or a pointer to them) to the embedder server 126, which may be integrated with the registration system or implemented at a different Internet site. The extractor 124 also sends an update 128 to the behavior database 114 to create database records associating each of the watermark identifiers with a behavior.


Once the embedder 120 has received the embedder control file(s), it uses them, along with the media file(s) (in this case, image files) and a set of embedding instructions, to automatically embed the list of watermark serial numbers included in the embedder control file(s) into the listed media files, producing a set of watermark-embedded media files. In the case where the embedder is server based, the client executing on the PC 100 uploads the media files to be embedded to the embedder server, either directly or as part of the registration process (e.g., as part of the request file). The embedder server then returns the watermarked files to the computer 100 via e-mail or another network file transfer protocol.
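A sketch of the batch-embedding step, with a placeholder in place of the real watermark embedder (which imperceptibly modulates the media signal, as described in the incorporated patents); the data shapes are assumptions.

```python
# Placeholder for the real embedder, which imperceptibly modulates the
# media signal rather than appending bytes to it.
def embed_watermark(media, serial):
    return media + serial.encode()

def embed_batch(control_entries, media_files):
    # control_entries: (serial_number, media_file_name) pairs taken from
    # the embedder control file; media_files: file name -> raw bytes.
    out = {}
    for serial, name in control_entries:
        out[name] = embed_watermark(media_files[name], serial)
    return out

embedded = embed_batch([("0001", "a.png"), ("0002", "b.png")],
                       {"a.png": b"IMG-A", "b.png": b"IMG-B"})
```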


For detailed disclosure describing how to embed watermarks in media signals, including images, audio, and video, see U.S. Pat. No. 5,862,260, and co-pending application Ser. No. 09/503,881, filed Feb. 14, 2000, incorporated above.


The embedder may create watermarked objects by printing watermarked images on objects, such as stickers, documents, etc. The embedder sends the watermarked image to a printer 128, which in turn, prints the image on an object.


The above system provides a mechanism for linking objects to machine behaviors. As noted previously, this mechanism applies to both physical objects, like stickers and packaging, and electronic objects, like image, audio and video signals. It also applies to other forms of machine readable signal carriers that can be applied to such objects, including bar codes, magnetic stripes, Radio Frequency tags, integrated circuit chips, organic transistors, etc. These machine readable carriers can be used in the same way that watermarks are used in the example of watermarked enabled stickers above.


While these technologies provide a mechanism for linking objects to machine behaviors, there is a need for a tool that explicitly facilitates the creative coordination between the object and the behavior linked to it. The linking process results in a potentially complex database structure, which embodies not only the fundamental object-to-behavior link but may also include a hierarchy of delivered responses as a function of implicit or explicit user requests. Said differently, linking an object to complex data-driven responses is a creative endeavor in its own right, involving thinking through the various reactions that users will want and expect when using an object as a portal. The artist tasked with creating the choreography between an object and a simple or complex machine behavior needs explicit assistance from well-designed tools, resulting in a database record that memorializes that creativity as an active element within the connectivity system described in this document and in U.S. patent application Ser. No. 09/571,422 (now U.S. Pat. No. 6,947,571). The immediate creative output is a database structure; the long-term creative output is the active use of that structure as a stimulus-response hub.


Whether a link design tool is web-server based or a stand-alone application similar in kind to Adobe Photoshop or QuarkXPress, it is possible to offer visual metaphors to a creative designer that literally present that designer with an image of the to-be-linked object along with explicit visual links to one or more data responses.


One embodiment of this tool for linking printed objects to web pages is a local computer application that presents an image of a printed object on the left side of the application's window pane and the image of a web page on the right side. The images of the printed objects may be stored locally or fetched from a remote device (e.g., a content database) and rendered to the left side of the screen. Similarly, the web pages may be stored locally or downloaded from web sites on the Internet or some other network. The user interface displays a control such as a button labeled "Connect", "Link", or some other active word representing the process of associating an object with a corresponding machine behavior. The user, having browsed through a series of objects to be linked and a series of potential web site destinations to find the best "matched pair", pushes the button, and the relational link goes into a queue waiting to "go live"; in other words, a temporary record is stored for a candidate link to be sent to the behavior database of the linking system described previously. A user can perform multiple links per session, queueing them up as they go, reviewing the queue at some point, and then directing the links to become active at the behavior database, as described previously and in the referenced documents.
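The queue-then-go-live workflow can be sketched as follows, with hypothetical names; "going live" here stands in for sending the queued candidate links to the behavior database.

```python
# Hypothetical link queue for the design tool described above.
pending_links = []
live_links = {}

def queue_link(object_id, destination_url):
    # Store a temporary record for a candidate link.
    pending_links.append((object_id, destination_url))

def go_live():
    # Activate all queued links (stand-in for the behavior-database update).
    while pending_links:
        obj, url = pending_links.pop(0)
        live_links[obj] = url

queue_link("sticker-1", "http://example.com/a")
queue_link("sticker-2", "http://example.com/b")
go_live()
```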


An extension begins by generalizing the single printed item into an icon or visual analogy for a related set of printed material. Graphical user interface methods can be employed to move, manipulate, view, and otherwise process this icon in a fashion familiar to creative professionals. Likewise, surrounding this generalized icon representing the object(s) to be printed can be a whole series of icons representing a variety of potential data-delivered responses that are possible links: existing web pages, placeholders for web pages to be designed, streaming media icons, Java application icons, and "links to links" icons wherein a given response may explicitly point to a menu of actions presented to the end user (here, the consumer doing the linking). This list of possible responses is incomplete but nevertheless representative of graphically displaying the possible relationships between printed material and data responses.


As in the baseline case, various relationships can be created between objects and responses, ultimately stored into a queue. The actual functionality and quality assurance of the links could be tested in the process. Once the creative artist is satisfied with their link or set of links, the queue can be sent to go live at the behavior database and further double checks on quality performed.


CONCLUDING REMARKS

Having described and illustrated the principles of the technology with reference to specific implementations, it will be recognized that the technology can be implemented in many other, different, forms. To provide a comprehensive disclosure without unduly lengthening the specification, applicants incorporate by reference the patents and patent applications referenced above.


While the technology is illustrated with reference to watermarked stickers, aspects of the technology apply to other object types, including media signals like audio and video. There are a number of different watermark embedding and decoding methods that may be used. The watermark embedding process may modulate features of a signal in the time, frequency, spatial, or some other transform domain of the signal to be watermarked.


In addition to an object identifier, the watermark may be used to convey other information, such as an index to related metadata, rendering control instructions, etc. For example, the watermark can carry a network address or index to a network address to link the watermarked signal to a network resource such as a related web site.


Other machine readable codes may be embedded in an object and used to link the object to a machine behavior. Some examples include bar codes, magnetic stripes, RF tags, etc. The devices and methods used to extract an identifier from the machine readable code differ, yet the process for registering identifiers and associating behavior with objects may be similar.


The methods, processes, and systems described above may be implemented in hardware, software or a combination of hardware and software. For example, the auxiliary data encoding processes may be implemented in a programmable computer or a special purpose digital circuit. Similarly, auxiliary data decoding may be implemented in software, firmware, hardware, or combinations of software, firmware and hardware. The methods and processes described above may be implemented in programs executed from a system's memory (a computer readable medium, such as an electronic, optical or magnetic storage device).


The particular combinations of elements and features in the above-detailed embodiments are exemplary only; the interchanging and substitution of these teachings with other teachings in this and the incorporated-by-reference patents/applications are also contemplated.

Claims
  • 1. A method comprising: receiving a first image frame of a first physical subject captured using a camera; processing, using a computing device, image information corresponding to the first image frame to derive first identifying data therefrom; storing, within a data structure coupled to the computing device, the derived first identifying data; assigning, using the computing device, a behavior to be associated with the first physical subject, and storing data relating to the behavior within the data structure in association with the first identifying data; receiving a second image frame of a second physical subject captured using the camera; processing, using the computing device, image information corresponding to the second image frame to derive second identifying data therefrom; and checking, using the computing device, the data structure for stored first identifying data corresponding to the second identifying data and if the stored first identifying data and the second identifying data match, then: identifying a requested behavior associated with the corresponding first identifying data; initiating the requested behavior; and presenting an output from the requested behavior to a user of the device.
  • 2. The method of claim 1, wherein each of the processing acts comprises applying a hashing algorithm.
  • 3. The method of claim 1, wherein processing image information corresponding to the first image frame comprises applying a digital watermark decoding algorithm.
  • 4. The method of claim 1, wherein processing image information corresponding to the first image frame comprises processing information within a wireless communications device associated with the camera to derive identifying data therefrom.
  • 5. The method of claim 1, wherein storing comprises storing within a data structure within a wireless communications device associated with the camera.
  • 6. The method of claim 1, wherein the first physical subject comprises at least a part of a human.
  • 7. The method of claim 1, wherein the behavior associated with the first physical subject comprises an operation corresponding to a particular physical aspect of the first physical subject.
  • 8. A non-transitory computer-readable medium having instructions stored thereon that, when executed by a computing device, cause the computing device to perform operations comprising: receiving a first image frame of a first physical subject captured using a camera; processing image information corresponding to the first image frame to derive first identifying data therefrom; storing the derived first identifying data within a data structure; assigning a behavior to be associated with the first physical subject, and storing data relating to the behavior within the data structure in association with the first identifying data; receiving a second image frame of a second physical subject captured using the camera; processing image information corresponding to the second image frame to derive a second identifying data therefrom; and checking the data structure for stored first identifying data corresponding to the second identifying data and if the stored first identifying data and the second identifying data match, then: identifying a requested behavior associated with the corresponding first identifying data; initiating the requested behavior; and presenting an output from the requested behavior to a user of the device.
  • 9. The computer-readable medium of claim 8, wherein each of the processing image information comprises applying a hashing algorithm.
  • 10. The computer-readable medium of claim 8, wherein processing image information corresponding to the first image frame comprises applying a digital watermark decoding algorithm.
  • 11. The computer-readable medium of claim 8, wherein processing image information corresponding to the first image frame comprises processing image information within a wireless communications device associated with the camera to derive the identifying data therefrom.
  • 12. The computer-readable medium of claim 8, wherein storing comprises storing within a wireless communications device associated with the camera.
  • 13. The computer-readable medium of claim 8, wherein the first physical subject comprises at least a part of a human.
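The flow recited in claim 1 can be paraphrased in code. The sketch below is a hypothetical illustration only: it uses a SHA-256 hash of the raw frame bytes as the "identifying data" (the hashing variant of claim 2) and plain Python callables as the stored "behaviors"; the class and method names are invented for the example.

```python
import hashlib

# Hypothetical illustration of the claim-1 flow: derive identifying
# data from a first frame, associate a user-assigned behavior with it,
# then trigger that behavior when a matching second frame is seen.

class StimulusResponder:
    def __init__(self):
        self._table = {}  # identifying data -> behavior callable

    @staticmethod
    def _identify(frame_bytes):
        # Claim 2 variant: derive identifying data by hashing.
        return hashlib.sha256(frame_bytes).hexdigest()

    def learn(self, frame_bytes, behavior):
        # First frame: store identifying data with its assigned behavior.
        self._table[self._identify(frame_bytes)] = behavior

    def respond(self, frame_bytes):
        # Second frame: check the data structure for a match and, if one
        # is found, initiate the requested behavior and return its output.
        behavior = self._table.get(self._identify(frame_bytes))
        return behavior() if behavior else None
```

Note that a raw byte hash only matches bit-identical frames; a deployed system would instead derive perceptually robust identifiers (e.g., decoded watermark payloads, per claim 3) so that re-captured images of the same subject still match.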
RELATED APPLICATION DATA

This patent application is a division of pending U.S. patent application Ser. No. 11/359,756, filed Feb. 21, 2006, which is a division of U.S. patent application Ser. No. 09/690,773, filed Oct. 17, 2000 (now U.S. Pat. No. 7,003,731), which is a continuation-in-part of U.S. patent application Ser. No. 09/633,587, filed Aug. 7, 2000 (now abandoned), which is a continuation-in-part of U.S. patent application Ser. No. 09/343,104, filed Jun. 29, 1999 (now abandoned, but a continuation of which is pending as U.S. patent application Ser. No. 10/764,430, filed Jan. 23, 2004). U.S. patent application Ser. No. 09/343,104 claims priority from U.S. Provisional Application No. 60/134,782, filed May 19, 1999. U.S. patent application Ser. No. 09/690,773 is also a continuation-in-part of U.S. patent application Ser. No. 09/571,422, filed May 15, 2000 (now U.S. Pat. No. 6,947,571), which claims priority to U.S. Provisional Application No. 60/134,782, filed May 19, 1999. These patent applications are hereby incorporated by reference.

US Referenced Citations (301)
Number Name Date Kind
4656603 Dunn Apr 1987 A
4780599 Baus Oct 1988 A
4907264 Seiler Mar 1990 A
4994987 Baldwin Feb 1991 A
5001696 Baldwin Mar 1991 A
5361871 Gupta Nov 1994 A
5382779 Gupta Jan 1995 A
5385371 Izawa Jan 1995 A
5415553 Szmidla May 1995 A
5424524 Ruppert Jun 1995 A
5444230 Baldwin Aug 1995 A
5474457 Bromley Dec 1995 A
5480306 Liu Jan 1996 A
5486686 Zdybel, Jr. et al. Jan 1996 A
5572653 DeTemple Nov 1996 A
5574519 Manico Nov 1996 A
5603054 Theimer et al. Feb 1997 A
5613004 Cooperman et al. Mar 1997 A
5640193 Wellner Jun 1997 A
5703795 Mankovitz Dec 1997 A
5745782 Conway Apr 1998 A
5754981 Veeneman et al. May 1998 A
5765176 Bloomberg Jun 1998 A
5774666 Portuesi Jun 1998 A
5781914 Stork Jul 1998 A
5832119 Rhoads Nov 1998 A
5838458 Tsai Nov 1998 A
5841978 Rhoads Nov 1998 A
5848413 Wolff Dec 1998 A
5862260 Rhoads Jan 1999 A
5878155 Heeter Mar 1999 A
5892892 Popat et al. Apr 1999 A
5899700 Williams et al. May 1999 A
5902353 Reber et al. May 1999 A
5905248 Russell et al. May 1999 A
5932863 Rathus Aug 1999 A
5933829 Durst Aug 1999 A
5938726 Reber et al. Aug 1999 A
5938727 Ikeda Aug 1999 A
5939695 Nelson Aug 1999 A
5948038 Daly et al. Sep 1999 A
5950173 Perkowski Sep 1999 A
5963916 Kaplan Oct 1999 A
5971277 Cragun et al. Oct 1999 A
5979757 Tracy et al. Nov 1999 A
5986651 Reber et al. Nov 1999 A
5991737 Chen Nov 1999 A
5995105 Reber et al. Nov 1999 A
6002946 Reber Dec 1999 A
6012102 Shachar Jan 2000 A
6032195 Reber Feb 2000 A
6064779 Neukermans et al. May 2000 A
6081629 Browning Jun 2000 A
6081827 Reber et al. Jun 2000 A
6084528 Beach et al. Jul 2000 A
6108656 Durst Aug 2000 A
6119944 Mulla Sep 2000 A
6121530 Sonoda Sep 2000 A
6122403 Rhoads Sep 2000 A
6138151 Reber et al. Oct 2000 A
6148331 Parry Nov 2000 A
6154738 Call Nov 2000 A
6164534 Rathus et al. Dec 2000 A
6167469 Safai et al. Dec 2000 A
6199048 Hudetz et al. Mar 2001 B1
6229924 Rhoads et al. May 2001 B1
6243480 Zhao et al. Jun 2001 B1
6249226 Harrison et al. Jun 2001 B1
6282362 Murphy et al. Aug 2001 B1
6286036 Rhoads Sep 2001 B1
6307949 Rhoads Oct 2001 B1
6311214 Rhoads Oct 2001 B1
6321992 Knowles Nov 2001 B1
6381341 Rhoads Apr 2002 B1
6385329 Sharma et al. May 2002 B1
6386453 Russell May 2002 B1
6389055 August May 2002 B1
6400272 Holtzman et al. Jun 2002 B1
6408082 Rhoads et al. Jun 2002 B1
6408331 Rhoads Jun 2002 B1
6408429 Marrion et al. Jun 2002 B1
6411725 Rhoads Jun 2002 B1
6411994 van Allen Jun 2002 B2
6421070 Ramos et al. Jul 2002 B1
6424725 Rhoads et al. Jul 2002 B1
6425525 Swaminathan Jul 2002 B1
6430554 Rothschild Aug 2002 B1
6434561 Durst, Jr. et al. Aug 2002 B1
6439465 Bloomberg Aug 2002 B1
6445460 Pavley Sep 2002 B1
6448979 Schena et al. Sep 2002 B1
6464503 Heit et al. Oct 2002 B1
6466329 Mukai Oct 2002 B1
6484198 Milovanovic Nov 2002 B1
6512919 Ogasawara Jan 2003 B2
6516079 Rhoads et al. Feb 2003 B1
6522770 Seder et al. Feb 2003 B1
6535617 Hannigan et al. Mar 2003 B1
6542927 Rhoads Apr 2003 B2
6542933 Durst, Jr. et al. Apr 2003 B1
6549933 Barrett Apr 2003 B1
6553129 Rhoads Apr 2003 B1
6567533 Rhoads May 2003 B1
6573916 Grossweiler Jun 2003 B1
6580808 Rhoads Jun 2003 B2
6590996 Reed et al. Jul 2003 B1
6611607 Davis et al. Aug 2003 B1
6614914 Rhoads et al. Sep 2003 B1
6636249 Rekimoto Oct 2003 B1
6647128 Rhoads Nov 2003 B1
6647130 Rhoads Nov 2003 B2
6650761 Rodriguez et al. Nov 2003 B1
6650877 Tarbouriech et al. Nov 2003 B1
6651053 Rothschild Nov 2003 B1
6678499 Silverbrook et al. Jan 2004 B1
6681028 Rodriguez et al. Jan 2004 B2
6681029 Rhoads Jan 2004 B1
6687345 Swartz Feb 2004 B1
6694042 Seder et al. Feb 2004 B2
6694043 Seder et al. Feb 2004 B2
6700990 Rhoads Mar 2004 B1
6700995 Reed Mar 2004 B2
6704869 Rhoads et al. Mar 2004 B2
6718046 Reed et al. Apr 2004 B2
6718047 Rhoads Apr 2004 B2
6721440 Reed et al. Apr 2004 B2
6721733 Lipson Apr 2004 B2
6738491 Ikenoue May 2004 B1
6745234 Philyaw Jun 2004 B1
6760463 Rhoads Jul 2004 B2
6763123 Reed et al. Jul 2004 B2
6766363 Rothschild Jul 2004 B1
6768809 Rhoads et al. Jul 2004 B2
6775392 Rhoads Aug 2004 B1
6798894 Rhoads Sep 2004 B2
6804379 Rhoads Oct 2004 B2
6807676 Robbins Oct 2004 B1
6813039 Silverbrook Nov 2004 B1
6813366 Rhoads Nov 2004 B1
6820062 Gupta Nov 2004 B1
6829368 Meyer et al. Dec 2004 B2
6832717 Silverbrook et al. Dec 2004 B1
6860424 Philyaw Mar 2005 B1
6868405 DeTreville Mar 2005 B1
6879701 Rhoads Apr 2005 B1
6917691 Evans Jul 2005 B2
6917724 Seder et al. Jul 2005 B2
6920232 Rhoads Jul 2005 B2
6935562 Hecht et al. Aug 2005 B2
6947571 Rhoads et al. Sep 2005 B1
6965682 Davis Nov 2005 B1
6965873 Rhoads Nov 2005 B1
6968057 Rhoads Nov 2005 B2
6970886 Levy Nov 2005 B1
6975746 Davis et al. Dec 2005 B2
6988202 Rhoads et al. Jan 2006 B1
6995859 Silverbrook Feb 2006 B1
6996252 Reed et al. Feb 2006 B2
7003731 Rhoads et al. Feb 2006 B1
7010144 Davis et al. Mar 2006 B1
7024016 Rhoads et al. Apr 2006 B2
7027614 Reed Apr 2006 B2
7035427 Rhoads Apr 2006 B2
7044395 Davis et al. May 2006 B1
7050603 Rhoads May 2006 B2
7051086 Rhoads et al. May 2006 B2
7054465 Rhoads May 2006 B2
7058223 Cox Jun 2006 B2
7058697 Rhoads Jun 2006 B2
7062069 Rhoads Jun 2006 B2
7065559 Weiss Jun 2006 B1
7095871 Jones et al. Aug 2006 B2
7111170 Rhoads et al. Sep 2006 B2
7113614 Rhoads Sep 2006 B2
7139408 Rhoads et al. Nov 2006 B2
7143949 Hannigan Dec 2006 B1
7158654 Rhoads Jan 2007 B2
7164780 Brundage et al. Jan 2007 B2
7171016 Rhoads Jan 2007 B1
7174031 Rhoads et al. Feb 2007 B2
7177443 Rhoads Feb 2007 B2
7213757 Jones et al. May 2007 B2
7224819 Levy et al. May 2007 B2
7224995 Rhoads May 2007 B2
7248717 Rhoads Jul 2007 B2
7261612 Hannigan et al. Aug 2007 B1
7305104 Carr et al. Dec 2007 B2
7308110 Rhoads Dec 2007 B2
7313251 Rhoads Dec 2007 B2
7319775 Sharma et al. Jan 2008 B2
7330564 Brundage et al. Feb 2008 B2
7349552 Levy Mar 2008 B2
7362879 Evans Apr 2008 B2
7369678 Rhoads May 2008 B2
7377421 Rhoads May 2008 B2
7391880 Reed et al. Jun 2008 B2
7406214 Rhoads et al. Jul 2008 B2
7421128 Venkatesan et al. Sep 2008 B2
7424131 Alattar et al. Sep 2008 B2
7427030 Jones et al. Sep 2008 B2
7433491 Rhoads Oct 2008 B2
7444000 Rhoads Oct 2008 B2
7444392 Rhoads et al. Oct 2008 B2
7450734 Rodriguez et al. Nov 2008 B2
7454035 Miller et al. Nov 2008 B2
7460726 Levy et al. Dec 2008 B2
7461136 Rhoads Dec 2008 B2
7466840 Rhoads Dec 2008 B2
7486799 Rhoads Feb 2009 B2
7502759 Hannigan et al. Mar 2009 B2
7508955 Carr et al. Mar 2009 B2
7515733 Rhoads Apr 2009 B2
7536034 Rhoads et al. May 2009 B2
7537170 Reed et al. May 2009 B2
7542587 Tian et al. Jun 2009 B2
7545952 Brundage et al. Jun 2009 B2
7548643 Davis et al. Jun 2009 B2
7564992 Rhoads Jul 2009 B2
7577273 Rhoads et al. Aug 2009 B2
RE40919 Rhoads Sep 2009 E
7602978 Levy et al. Oct 2009 B2
7628320 Rhoads Dec 2009 B2
7643649 Davis et al. Jan 2010 B2
7650009 Rhoads Jan 2010 B2
7653210 Rhoads Jan 2010 B2
7657058 Sharma Feb 2010 B2
7657064 Conwell Feb 2010 B1
7685426 Ramos et al. Mar 2010 B2
7693300 Reed et al. Apr 2010 B2
7697719 Rhoads Apr 2010 B2
7711143 Rhoads May 2010 B2
7738673 Reed Jun 2010 B2
7747038 Rhoads Jun 2010 B2
7751588 Rhoads Jul 2010 B2
7751596 Rhoads Jul 2010 B2
7756290 Rhoads Jul 2010 B2
7760905 Rhoads et al. Jul 2010 B2
7762468 Reed et al. Jul 2010 B2
7787653 Rhoads Aug 2010 B2
7792325 Rhoads et al. Sep 2010 B2
7822225 Alattar Oct 2010 B2
7837094 Rhoads Nov 2010 B2
7945781 Rhoads May 2011 B1
7949147 Rhoads et al. May 2011 B2
7953270 Rhoads May 2011 B2
7953824 Rhoads et al. May 2011 B2
7957553 Ellingson et al. Jun 2011 B2
7961949 Levy et al. Jun 2011 B2
7970166 Carr et al. Jun 2011 B2
7970167 Rhoads Jun 2011 B2
20010001854 Schena et al. May 2001 A1
20010003835 Watts Jun 2001 A1
20010011233 Narayanaswami Aug 2001 A1
20010034705 Rhoads et al. Oct 2001 A1
20010044824 Hunter et al. Nov 2001 A1
20010047428 Hunter Nov 2001 A1
20010055407 Rhoads Dec 2001 A1
20020009208 Alattar et al. Jan 2002 A1
20020010941 Johnson Jan 2002 A1
20020023148 Ritz et al. Feb 2002 A1
20020023218 Lawandy et al. Feb 2002 A1
20020032698 Cox Mar 2002 A1
20020131076 Davis Sep 2002 A1
20020176003 Seder et al. Nov 2002 A1
20020186886 Rhoads Dec 2002 A1
20020196272 Ramos et al. Dec 2002 A1
20030040957 Rhoads et al. Feb 2003 A1
20030105730 Davis et al. Jun 2003 A1
20030130954 Carr et al. Jul 2003 A1
20040005093 Rhoads Jan 2004 A1
20040190750 Rodriguez et al. Sep 2004 A1
20040199387 Wang Oct 2004 A1
20040240704 Reed Dec 2004 A1
20040264733 Rhoads et al. Dec 2004 A1
20050041835 Reed et al. Feb 2005 A1
20050058318 Rhoads Mar 2005 A1
20050192933 Rhoads et al. Sep 2005 A1
20050229107 Hull Oct 2005 A1
20060013435 Rhoads Jan 2006 A1
20060041591 Rhoads Feb 2006 A1
20060251291 Rhoads Nov 2006 A1
20070041667 Cox Feb 2007 A1
20070055884 Rhoads Mar 2007 A1
20070108287 Davis et al. May 2007 A1
20070276841 Rhoads et al. Nov 2007 A1
20070276928 Rhoads et al. Nov 2007 A1
20080121728 Rodriguez May 2008 A1
20080133555 Rhoads et al. Jun 2008 A1
20080292134 Sharma et al. Nov 2008 A1
20090012944 Rodriguez et al. Jan 2009 A1
20090125475 Rhoads et al. May 2009 A1
20090286572 Rhoads et al. Nov 2009 A1
20100045816 Rhoads Feb 2010 A1
20100062819 Hannigan et al. Mar 2010 A1
20100172540 Davis et al. Jul 2010 A1
20100198941 Rhoads Aug 2010 A1
20110007936 Rhoads Jan 2011 A1
20110026777 Rhoads et al. Feb 2011 A1
20110051998 Rhoads Mar 2011 A1
20110062229 Rhoads Mar 2011 A1
20110091066 Alattar Apr 2011 A1
Foreign Referenced Citations (4)
Number Date Country
WO 9803923 Jan 1998 WO
WO 0058883 Oct 2000 WO
WO 0115021 Mar 2001 WO
WO 0201379 Jan 2002 WO
Non-Patent Literature Citations (40)
Entry
Samsung, “A Mobile Phone that Doubles as a Camera”, Samsung press release Jun. 23, 2000, found at http://www.samsung.com/us/news/newsPreviewRead.do?news_seq=529.
Rekimoto et al., “CyberCode: Designing Augmented Reality Environments with Visual Tags”, Proceedings of DARE 2000 on Designing Augmented Reality Environments (DARE'00), Elsinore, Denmark, Apr. 12-14, 2000, copyright ACM 2000, pp. 1-10.
Waldherr et al., “A Gesture Based Interface for Human-Robot Interaction”, Autonomous Robots, vol. 9, No. 2, Sep. 2000, pp. 151-173.
Arai et al, “Retrieving Electronic Documents with Real-World Objects on InteractiveDESK,” UIST '95, Nov. 14, 1995.
Arai, “PaperLink: A Technique for Hyperlinking from Real Paper to Electronic Content,” Conference on Human Factors in Computing Systems, May 18, 1997.
Aust, D., “Augmenting Paper Documents with Digital Information in a Mobile Environment,” MS Thesis, University of Dortmund, Department of Computer Graphics, Sep. 3, 1996.
“D Marks the Spot,” Popular Mechanics, Aug. 1, 2000.
Digimarc press release, “Digimarc MediaBridge Reader Software Now Available for Mac Users,” 2 pp., Jul. 17, 2000.
Digimarc Stock Prospectus, Form S-1, Sep. 21, 1999, through p. 45.
Dymetman, M., and Copperman, M., “Intelligent Paper,” in Electronic Publishing, Artistic Imaging, and Digital Typography, Proceedings of EP '98, Mar./Apr. 1998, Springer Verlag LNCS 1375, pp. 392-406.
Foote, Content-Based Retrieval of Music and Audio, Proc. of SPIE, vol. 3229, pp. 138-147, 1997.
Heiner et al, Linking and messaging from real paper in the Paper PDA, ACM Symposium on User Interface Software and Technology, UIST '99, Nov. 7-10, 1999, 8 pp.
Holmquist, Token-Based Access to Digital Information, Proc. 1st Int'l Symp. on Handheld and Ubiquitous Computing 1999, pp. 234-245.
IBM Corp, “Universal Interactive Device,” Research Disclosure Database No. 410129, Jun. 1998.
Ishii, Tangible Bits: Towards Seamless Interfaces Between People, Bits and Atoms, Proc. of CHI '97, Mar. 22-27, 1997, 8 pp.
Lee, “Fingerprinting,” Chapter 8 in Information Hiding Techniques for Steganography and Digital Watermarking, Boston, MA, Artech House, pp. 175-190, 2000.
Ljungstrand et al, “WebStickers: Using Physical Tokens to Access, Manage and Share Bookmarks to the Web,” DARE 2000, Apr. 12, 2000.
Pascoe, The Stick-e Note Architecture: Extending the Interface Beyond the User, Proc. of the 2d Int'l Conf on Intelligent User Interfaces, pp. 261-264, Jan. 6-9, 1997.
Rekimoto, “Augment-able Reality: Situated Communication through Physical and Digital Spaces,” Proc. of 2d Int. Symp. on Wearable Computers, Oct. 1998.
Rekimoto, “CyberCode: Designing Augmented Reality Environments with Visual Tags,” DARE 2000, Apr. 12, 2000.
Rekimoto, Matrix: A Realtime Object Identification and Registration Method for Augmented Reality, Third Asian Pacific Computer and Human Interaction, Jul. 15, 1998.
Ullmer, Models and Mechanisms for Tangible User Interfaces, MS Thesis, MIT, Jun. 1997, 83 pp.
U.S. Appl. No. 09/343,101, filed Jun. 29, 1999, Bruce L. Davis et al.
U.S. Appl. No. 09/343,104, filed Jun. 29, 1999, Tony F. Rodriguez, et al.
U.S. Appl. No. 09/413,117, filed Oct. 6, 1999, Geoffrey B. Rhoads.
U.S. Appl. No. 09/482,749, filed Jan. 13, 2000, Geoffrey B. Rhoads.
U.S. Appl. No. 09/507,096, filed Feb. 17, 2000, Geoffrey B. Rhoads, et al.
U.S. Appl. No. 09/552,998, filed Apr. 19, 2000, Tony F. Rodriguez, et al.
U.S. Appl. No. 09/567,405, filed May 8, 2000, Geoffrey B. Rhoads, et al.
U.S. Appl. No. 09/629,649, filed Aug. 1, 2000, J. Scott Carr, et al.
U.S. Appl. No. 09/633,587, filed Aug. 7, 2000, Geoffrey B. Rhoads, et al.
U.S. Appl. No. 09/689,289, filed Oct. 11, 2000, Geoffrey B. Rhoads, et al.
U.S. Appl. No. 09/697,015, filed Oct. 25, 2000, Bruce L Davis, et al.
U.S. Appl. No. 09/697,009, filed Oct. 25, 2000, Bruce L. Davis, et al.
U.S. Appl. No. 13/084,981, filed Apr. 12, 2011, Geoffrey B. Rhoads.
Prosecution excerpts from U.S. Appl. No. 11/874,054, filed Oct. 17, 2007 (original claims; Non-final rejection dated Jan. 20, 2011; Interview summary dated Mar. 15, 2011; and Amendment filed Apr. 18, 2011).
Rekimoto, “The World through the Computer: Computer Augmented Interaction with Real World Environments,” Symposium on User Interface Software and Technology, 1995 (8 pages).
Venkatesan, Robust Image Hashing, Int'l Conf on Image Processing, Sep. 2000, pp. 664-666.
Wagner, Fingerprinting, IEEE Proc. Symp. on Security and Privacy, pp. 18-22, 1983.
Wold et al, Content-Based Classification, Search, and Retrieval of Audio, IEEE Multimedia Magazine, Fall, 1996, pp. 27-36.
Related Publications (1)
Number Date Country
20090125475 A1 May 2009 US
Provisional Applications (1)
Number Date Country
60134782 May 1999 US
Divisions (2)
Number Date Country
Parent 11359756 Feb 2006 US
Child 12244531 US
Parent 09690773 Oct 2000 US
Child 11359756 US
Continuation in Parts (3)
Number Date Country
Parent 09633587 Aug 2000 US
Child 09690773 US
Parent 09343104 Jun 1999 US
Child 09633587 US
Parent 09571422 May 2000 US
Child 09690773 US