System and method for associating content with an image bearing surface

Information

  • Patent Grant
  • Patent Number
    7,922,099
  • Date Filed
    Friday, December 30, 2005
  • Date Issued
    Tuesday, April 12, 2011
Abstract
Systems and methods for associating content with an image bearing surface. In accordance with a first embodiment of the present invention, a method, operable at an electronic interactive device, comprises accessing a first image, e.g., an image identifying the content, on a surface, wherein the surface comprises an encoded pattern of location information on the surface for providing location information to the electronic interactive device. The method further comprises decoding the first image to associate the location information with second image information, e.g., of the content, of a second image on the surface.
Description

This Application is related to co-pending, commonly owned U.S. patent application Ser. No. 10/803,806, filed Mar. 17, 2004, to James Marggraff et al., entitled “SCANNING APPARATUS,” which is hereby incorporated by reference herein in its entirety.


This Application is related to co-pending, commonly owned U.S. patent application Ser. No. 10/861,243, filed Jun. 3, 2004, to James Marggraff et al., entitled “USER CREATED INTERACTIVE INTERFACE,” which is hereby incorporated by reference herein in its entirety.


This application is related to co-pending, commonly owned U.S. patent application Ser. No. 11/034,491, filed Jan. 12, 2005, to James Marggraff et al., entitled “A METHOD AND SYSTEM FOR IMPLEMENTING A USER INTERFACE FOR A DEVICE EMPLOYING WRITTEN GRAPHICAL ELEMENTS,” which is hereby incorporated by reference herein in its entirety.


This application is related to co-pending, commonly owned U.S. patent application Ser. No. 11/035,155, filed Jan. 12, 2005, to James Marggraff et al., entitled “A METHOD AND SYSTEM FOR IMPLEMENTING A USER INTERFACE FOR A DEVICE THROUGH RECOGNIZED TEXT AND BOUNDED AREAS,” which is hereby incorporated by reference herein in its entirety.


This application is related to co-pending, commonly owned U.S. patent application Ser. No. 11/035,003, filed Jan. 12, 2005, to James Marggraff et al., entitled “TERMINATION EVENTS,” which is hereby incorporated by reference herein in its entirety.


This application is related to co-pending, commonly owned U.S. patent application Ser. No. 11/034,489, filed Jan. 12, 2005, by James Marggraff et al., entitled “PROVIDING A USER INTERFACE HAVING INTERACTIVE ELEMENTS ON A WRITABLE SURFACE,” which is hereby incorporated by reference herein in its entirety.


This application is related to co-pending, commonly owned U.S. patent application Ser. No. 11/267,785, filed Nov. 3, 2005, to James Marggraff, entitled “A REUSABLE IMAGE BEARING SURFACE AND METHOD OF MODIFYING MEMORY CONTENTS RELATED TO SAME,” which is hereby incorporated by reference herein in its entirety.


FIELD OF INVENTION

Embodiments of the present invention relate to the field of interactive devices and pen-based computing. More specifically, embodiments of the present invention relate to systems and methods for associating content with an image bearing surface and to interactions with pen-based computing.


BACKGROUND

In the last twenty years, the use of personal computing devices, such as desktop computer systems, laptop computer systems, handheld computer systems, and tablet computer systems, has grown tremendously. These personal computing devices provide users with a broad range of interactive applications, business utilities, communication abilities, and entertainment possibilities.


Current personal computing devices provide access to these interactive applications via a user interface. Typical computing devices have on-screen graphical interfaces that present information to a user using a display device, such as a monitor or display screen, and receive information from a user using an input device, such as a mouse, a keyboard, a joystick, or a stylus.


Even more so than computing systems, the use of pen and paper is ubiquitous among literate societies. While graphical user interfaces of current computing devices provide for effective interaction with many computing applications, typical on-screen graphical user interfaces have difficulty mimicking the common use of a pen or pencil and paper. For example, desktop and laptop computer systems typically do not have a pen-like interface. Moreover, input into a computer is shown on an electronic display, and is not tangible and accessible in the same manner as information written on paper or a physical surface.


Images and writings drawn with a pen-like interface on a paper surface have convenience, portability, permanence, and tangibility.


Today, interactive content, e.g., a web page, is available only through screen-based mediums such as graphical user interfaces that utilize display screens, e.g., a conventional computer display. It would be advantageous to expand the mediums over which interactive content is available for use.


SUMMARY OF THE INVENTION

Therefore, a need exists for systems and methods for associating content with an image bearing surface. A need also exists for systems and methods for associating content with an image bearing surface having qualities of paper that also satisfies the above need. A further need exists for systems and methods for associating content with an image bearing surface that is compatible and complementary with existing computers, computer peripherals and methods of web access. A need exists for using the above principles to provide paper-based interactive content usable with a pen-based computer system.


Accordingly, in one embodiment, a web page can be obtained, e.g., via the internet, and a copy of this web page can be printed on paper, the paper having a pre-printed dot pattern thereon providing spatial location information. Information pertaining to the web page is transferred onto a pen-based computer system. An identifier of the web page is printed on the paper copy of the web page. The identifier may be a bar code, for example. The identifier relates to the transferred information. A scan of the identifier by the pen-based computer system informs the pen-based computer system to use the transferred information for web page interaction.


Systems and methods for associating content with an image bearing surface are disclosed. In accordance with a first embodiment of the present invention, a method, operable at an electronic interactive device, comprises accessing a first image, e.g., a bar code, on a surface, wherein the surface comprises an encoded pattern of location information on the surface for providing location information to the electronic interactive device. The method further comprises decoding the first image to associate the location information with second image information, e.g., of a web page, of a second image on the surface.


In accordance with another embodiment of the present invention, a method, operable at an electronic interactive device, comprises accessing a first image, e.g., a bar code, on a surface, wherein the surface comprises an encoded pattern of location information on the surface for providing location information to the electronic interactive device. The method further comprises decoding the first image to associate the location information with second image information, e.g., of a web page, of a second image on the surface. Responsive to the electronic interactive device accessing a portion of the second image, an action associated with the portion of the second image is performed, for example, implementing an interaction, e.g., associated with a hyperlink as indicated on the web page, by the device.


In accordance with a system embodiment of the present invention, a pen-shaped device comprises an optical detector coupled to a housing, a processor coupled to the optical detector and a memory unit coupled to the processor. The memory unit comprises instructions that when executed implement a method, the method comprising accessing a first image on a surface, wherein the surface comprises an encoded pattern of location information on the surface for providing location information to the pen-shaped device and decoding the first image to associate the location information with second image information of a second image on the surface.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are incorporated in and form a part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.



FIG. 1 illustrates an exemplary interactive device, in accordance with embodiments of the present invention.



FIG. 2 shows an exemplary image bearing surface provided with a pattern of location determining marks, in accordance with embodiments of the present invention.



FIG. 3 shows an enlarged portion of the position code of FIG. 2, in accordance with embodiments of the present invention.



FIG. 4 is a screen capture of an exemplary web page image.



FIG. 5 is a rendering of an exemplary web page image printed on a piece of encoded paper, in accordance with an embodiment of the present invention.



FIG. 6 illustrates an image printed on a piece of encoded paper, in accordance with an embodiment of the present invention.



FIG. 7 is a flow diagram of a method, in accordance with embodiments of the present invention.



FIG. 8 is a flow diagram of a computer implemented method for preparing printed content for interaction with a pen computer, in accordance with embodiments of the present invention.





DETAILED DESCRIPTION

Reference will now be made in detail to various embodiments of the invention, systems and methods for associating content with an image bearing surface, examples of which are illustrated in the accompanying drawings. While the invention will be described in conjunction with these embodiments, it is understood that they are not intended to limit the invention to these embodiments. On the contrary, the invention is intended to cover alternatives, modifications and equivalents, which may be included within the spirit and scope of the invention as defined by the appended claims. Furthermore, in the following detailed description of the invention, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be recognized by one of ordinary skill in the art that the invention may be practiced without these specific details. In other instances, well known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the invention.


Some portions of the detailed descriptions, which follow, are presented in terms of procedures, steps, logic blocks, processing, and other symbolic representations of operations on data bits that can be performed on computer memory. These descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. A procedure, computer executed step, logic block, process, etc., is here, and generally, conceived to be a self-consistent sequence of steps or instructions leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated in a computer system. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.


It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. It is appreciated that throughout the present invention, discussions utilizing terms such as “recognizing” or “accessing” or “performing” or “decoding” or “recording” or “interfacing” or the like, often refer to the action and processes of an electronic system (e.g., interactive device 100 of FIG. 1), or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the electronic device's registers and memories into other data similarly represented as physical quantities within the electronic device memories or registers or other such information storage, transmission or display devices.


System and Method for Associating Content with an Image Bearing Surface


FIG. 1 illustrates an exemplary interactive device 100, in accordance with embodiments of the present invention. The use, operation, and composition of interactive device 100 are described briefly herein, and more comprehensively in the above referenced patent applications which are incorporated by reference. Interactive device 100 includes processor 112, memory unit 114, audio output device 116, writing element 118 and optical detector 120 within housing 130. In one embodiment, processor 112, memory unit 114, audio output device 116 and optical detector 120 are communicatively coupled over bus 122. In one embodiment, optical detector 120 may also include an optical emitter. In one embodiment, housing 130 may also contain a power supply operable to power circuits and functions of interactive device 100. In one embodiment, housing 130 may also include a display and/or input buttons communicatively coupled with bus 122.


In one embodiment, housing 130 is shaped in the form of a stylus or a writing instrument (e.g., pen-like). In this embodiment, device 100 is a pen-based computer system. A user may hold interactive device 100 in a similar manner as a stylus is held. Writing element 118 is located at one end of housing 130 such that a user can place writing element 118 in contact with a writable surface (not shown). Writing element 118 may include a pen, a pencil, a marker, a crayon, chalk, or any other marking material. It should be appreciated that writing element 118 may also include a non-marking writing element such as a stylus type tip. It should also be appreciated that writing element 118 may also have magnetic properties. During use, a user can hold interactive device 100 and use it in a similar manner as a writing instrument to write on a surface with writing element 118.


Interactive device 100 allows users to create user-written selectable items that represent different functions provided by interactive device 100. In one embodiment, the user-written selectable item includes a symbol representation of an application program executable by processor 112. Computer code for recognizing such functional user-written selectable items and distinguishing them from other non-functional user-written items can reside in memory unit 114 in interactive device 100. It should be appreciated that interactive device 100 is also operable to recognize and execute functions associated with pre-printed selectable items on the surface.


Optical detector 120 is at one end of the stylus-shaped interactive device 100. Optical detector 120 is operable to detect information on a surface. In one embodiment, interactive device 100 also comprises an optical emitter for illuminating a portion of a surface that is detected by optical detector 120. The information detected by optical detector 120 is transmitted to processor 112.


Processor 112 may include any suitable electronics to implement the functions of the interactive device 100. Processor 112 can recognize the user-written selectable items and pre-printed selectable items, and can identify the locations of those user-written and pre-printed selectable items so that interactive device 100 can perform various operations. In these embodiments, memory unit 114 may comprise computer code for correlating any user-written or pre-printed selectable items with their locations on the surface.


Memory unit 114 comprises computer code for performing any of the functions of the interactive device 100. In one embodiment, computer code stored in memory unit 114 and implemented on processor 112 is responsive to a user selection of a user-written or pre-printed selectable item and operable to execute a function associated with the user-written or pre-printed selectable item in response to the selection. Memory unit 114 is also operable to record information associated with user made markings on a surface.


In accordance with embodiments of the present invention, the interactive device 100 may optionally comprise wireless communications unit 121. Optional wireless communications unit 121 enables interactive device 100 to communicate wirelessly with another device, for example, a desktop or laptop computer, a handheld computer, a mobile phone and/or a wireless access point, e.g., a “hot spot.” Interactive device 100 may wirelessly access content on such another device, e.g., a nearby computer, or utilize such a device to access yet another device, e.g., via a network, for example, the Internet.



FIG. 2 shows an exemplary image bearing surface 15 provided with a pattern of location determining marks, in accordance with embodiments of the present invention. In the embodiment of FIG. 2, image bearing surface 15 is provided with a coding pattern in the form of optically readable position code 17 that consists of a pattern of marks 18. The marks 18 in FIG. 2 are greatly enlarged for the sake of clarity. In actuality, the marks 18 may not be easily discernible by the human visual system, and may appear as grayscale on image bearing surface 15. In one embodiment, the marks 18 are embodied as dots; however, the present invention is not so limited. In one embodiment, the dots are permanently printed on the writing surface.



FIG. 3 shows an enlarged portion 19 of the position code 17 of FIG. 2, in accordance with embodiments of the present invention. An interactive device such as interactive device 100 (FIG. 1) is positioned to record an image of a region of the position code 17. In one embodiment, the optical device fits the marks 18 to a reference system in the form of a raster with raster lines 21 that intersect at raster points 22. Each of the marks 18 is associated with a raster point 22. For example, mark 23 is associated with raster point 24. For the marks in an image/raster, the displacement of a mark from the raster point associated with the mark is determined. Using these displacements, the pattern in the image/raster is compared to patterns in the reference system. Each pattern in the reference system is associated with a particular location on the surface 15. Thus, by matching the pattern in the image/raster with a pattern in the reference system, the position of the pattern on the surface 15, and hence the position of the optical device relative to the surface 15, can be determined.
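To make the raster-displacement decoding described above concrete, the following sketch quantizes each detected mark to its nearest raster point and uses the offset direction as one symbol of a position pattern, which is then matched against a reference table. This is a hypothetical illustration only; the patent does not specify an algorithm, and names such as decode_position and the pitch value are assumptions.

    # Hypothetical sketch of the raster-displacement decoding described above.
    # Each mark is snapped to its nearest raster intersection; the direction of
    # its offset (left/right/up/down) forms one symbol of the position pattern.

    RASTER_PITCH = 0.3  # mm between raster lines (illustrative value)

    def displacement_symbol(mark_x, mark_y):
        """Return the displacement direction of a mark from its raster point."""
        rx = round(mark_x / RASTER_PITCH) * RASTER_PITCH  # nearest raster point
        ry = round(mark_y / RASTER_PITCH) * RASTER_PITCH
        dx, dy = mark_x - rx, mark_y - ry
        # The dominant axis of the offset selects one of four symbols
        if abs(dx) >= abs(dy):
            return "R" if dx >= 0 else "L"
        return "D" if dy >= 0 else "U"

    def decode_position(marks, reference_patterns):
        """Match the displacement pattern against known surface locations.

        reference_patterns maps a pattern string to an (x, y) page location.
        Returns the location, or None if the imaged region is not recognized.
        """
        pattern = "".join(displacement_symbol(x, y) for x, y in marks)
        return reference_patterns.get(pattern)

    # Example: a 2x2 window of marks decoding to one known location
    ref = {"RUDL": (120, 45)}
    print(decode_position([(0.31, 0.0), (0.6, -0.02), (0.58, 0.33), (0.28, 0.3)], ref))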


In one embodiment, the pattern of marks on image bearing surface 15 comprises substantially invisible codes. The codes are “substantially invisible” to the eye of the user and may correspond to the absolute or relative locations of the selectable items on the page. “Substantially invisible” also includes codes that are completely or slightly invisible to the user's eye. For example, if dot codes that are slightly invisible to the eye of a user are printed all over a sheet of paper, the sheet may appear to have a light gray shade when viewed at a normal viewing distance and/or without magnification. It should be appreciated that although dot patterned codes are specifically described herein, other types of substantially invisible codes may be used in other embodiments of the invention.


Anoto, a Swedish company, employs a technology that uses an algorithm to generate a pattern that enables a very large unique data space for non-conflicting use across a large set of documents. Their pattern, if fully printed, would cover 70 trillion 8.5″×11″ pages with unique recognition of any 2 cm square on any page. Paper containing the specific dot patterns is commercially available from Anoto. The following patents and patent applications are assigned to Anoto and describe this basic technology and are all herein incorporated by reference in their entirety for all purposes: U.S. Pat. No. 6,502,756, U.S. application Ser. No. 10/179,966, filed on Jun. 26, 2002, WO 01/95559, WO 01/71473, WO 01/75723, WO 01/26032, WO 01/75780, WO 01/01670, WO 01/75773, WO 01/71475, WO 00/73983, and WO 01/16691.


A particular instance of an image bearing surface, e.g., image bearing surface 15 of FIG. 2, may comprise an encoded pattern of location information as described previously. The specific location information encoded into the image bearing surface, or “dot-space” of the image bearing surface, may generally be known or unknown to an interactive device, e.g., interactive device 100 (FIG. 1).


For example, an interactive device may associate a first location encoding, or “dot-space,” with a first application, e.g., a game. Consequently, whenever that first location is detected, e.g., an interactive device scans a piece of paper with the first location encoding, the interactive device executes software associated with the game. In this case, the dot-space of the paper is known to the interactive device. The dot-space and/or surfaces encoding the dot-space may be known as “special purpose” surfaces.


In other cases, e.g., “general purpose” surfaces, the dot-space of the surface, e.g., paper, is not known, e.g., pre-associated, to the interactive device. For example, it is known for a user to use an interactive device to draw, e.g., a calculator, on encoded paper representing a second location. In association with the actions of a user drawing the calculator (or other types of commands), the interactive device associates the second dot-space with the calculator and is able to perform calculator functions in association with the second dot-space.


Such associations between a general purpose dot-space and an application, e.g., embodied on an interactive device, are generally not considered permanent. It is known, however, for the association to be retained, for example, until an interactive device is reset. It is to be appreciated that, in general, there are numerous instances of any given dot-space, e.g., a particular dot-space encoding can be printed on an effectively limitless number of different sheets of paper. For example, if a second piece of paper encoding the same second location information is scanned with the same interactive device, the interactive device may recall the previously established association between the second location and the calculator function, even if the second piece of paper has not been marked by a user. Such lasting associations unfortunately limit the usability of “general purpose” surfaces, e.g., encoded paper.
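As a rough illustration of the association behavior just described, a device might keep an in-memory table from a decoded dot-space identifier to an application, created on first use and cleared only on reset. This is a hypothetical sketch; the class and method names are invented for illustration.

    # Hypothetical sketch: non-permanent association of a general-purpose
    # dot-space with an application, retained until the device is reset.

    class InteractiveDevice:
        def __init__(self):
            self.dotspace_apps = {}  # dot-space id -> application name

        def on_scan(self, dotspace_id, user_drew=None):
            if user_drew is not None:
                # e.g., the user drew a calculator on this sheet
                self.dotspace_apps[dotspace_id] = user_drew
            # A later sheet encoding the same dot-space recalls the
            # association, even if that sheet itself is unmarked.
            return self.dotspace_apps.get(dotspace_id)

        def reset(self):
            self.dotspace_apps.clear()

    device = InteractiveDevice()
    device.on_scan("region-2", user_drew="calculator")
    print(device.on_scan("region-2"))  # "calculator", even on a fresh sheet
    device.reset()
    print(device.on_scan("region-2"))  # None after reset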


A vast amount of content has been and can be created in computerized databases, e.g., the world wide web. While most of this content was not originally intended for use with a pen-like interface, much of such content has application to such an interface, with numerous advantages over the traditional display screen and mouse interface.



FIG. 4 is a screen capture of an exemplary web page image 400. Image 400 may be obtained from the web, for instance, and comprises pictures, e.g., picture 410, text, e.g., text 420, and links, e.g., link 430. In this particular example, the links are primarily pointers to digital audio files, e.g., recordings of woodpecker songs. In a typical application of embodiments of the present invention, image 400 may be accessed and displayed on a conventional display-based computer coupled to the internet.



FIG. 5 is a rendering 500 of image 400 (FIG. 4) printed on a piece of encoded paper, in accordance with an embodiment of the present invention. For example, rendering 500 is printed via a printer attached to the computer described above. Rendering 500 comprises substantially all of the information of image 400, including, for example, pictures, text and links. For example, image 510 is a printed rendering of image 410 (FIG. 4), text 520 is a printed rendering of text 420 (FIG. 4) and link 530 is a printed rendering of link 430 (FIG. 4). It is to be appreciated that the content of rendering 500 includes location encoding, e.g., via a substantially invisible encoding, such that the position of, for example, text 520 can be differentiated from the position of link 530, by an appropriate device, e.g., interactive device 100 of FIG. 1.


Embodiments in accordance with the present invention provide a pen-like interface to interact with rendering 500, in much the same manner as a person could interact, via a conventional screen-based computer, with a website represented by image 400 (FIG. 4). For example, touching on link 530 with an appropriate interactive device, e.g., one capable of audio playback, should produce an analogous effect to “clicking” on link 430 (FIG. 4). In this case, such action should play the song of a red-bellied woodpecker from the pen-based computer.


In accordance with embodiments of the present invention, an interactive device can be provided with either the content of such links, e.g., the audio files, which may be transcoded, e.g., to reduce their size, or the pointers of the links, or both. However, to perform the proper action in response to touching a link, the interactive device needs to be aware of where in dot-space a particular link resides. For example, the link needs to be associated with its dot-space. Stated differently, the dot-space on which the web page is printed needs to be associated with the digital content that is related to the web page and typically transferred to the pen-based computer.


In general, most computer-attached printing devices cannot produce output of sufficient quality to encode location information, e.g., in substantially invisible codes, in a manner that is compatible with existing encoding and decoding devices. Consequently, in the general case, an association between content rendered on a computer printer and a particular dot-space is not typically established by printing encoding dots along with the content. Typically, it is commercially infeasible to associate such printed content with a particular dot-space prior to printing. For example, in the general case, an interactive device, e.g., interactive device 100 of FIG. 1, cannot be told where link 530 is out of all possible dot-spaces prior to printing.


Even in a case in which a desired computer-attached printing device is capable of printing encoding dots, it is often advantageous to reuse a particular, e.g., a general purpose, dot-space. Consequently, it is desirable to associate a particular content with a particular dot-space.


In general, such association is formed after printing the content onto pre-encoded paper, or other media. In accordance with an embodiment of the present invention, an interactive device can be provided partial information about printed content, for example, horizontal and vertical positions of links within a page, e.g., the horizontal and vertical position of link 530. It is to be appreciated that this information can be given or transferred to an interactive device prior to, during, or after printing of the content onto pre-encoded paper. This information can be provided, e.g., via wireless communication with an interactive device. Alternatively, this information can be provided via other means, for example, via a docking station that communicates via light pulses with optical detector 120 of interactive device 100 (FIG. 1). While information about the content of a page may be known, in general, the dot-space of the particular page is unknown.


In accordance with embodiments of the present invention, a web image, e.g., image 400 (FIG. 4) may be processed by a computer, and information, e.g., X-Y locations of links, of that web image may be sent to an interactive device. Subsequently, the interactive device associates the first location encoding that it detects with such web image. In this manner, a page or dot-space may be identified by the interactive device, and the links can be located within that dot-space by utilizing information provided by the processing computer.
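A minimal sketch of this bind-to-the-first-page-scanned behavior follows; the data layout, coordinates and action strings are assumptions made for illustration, not formats prescribed by the invention.

    # Hypothetical sketch: the host computer transfers link geometry for a
    # processed web image; the pen binds that geometry to the first unknown
    # dot-space it subsequently detects.

    pending_page = {
        "url": "http://example.com/woodpeckers",
        "links": [
            # (x, y, width, height, action) in page coordinates
            (40, 310, 180, 12, "play:red_bellied_woodpecker"),
        ],
    }

    bound_pages = {}  # dot-space id -> page information

    def on_pen_touch(dotspace_id, x, y):
        global pending_page
        if dotspace_id not in bound_pages and pending_page is not None:
            # First detection of an unknown dot-space: associate it with
            # the most recently transferred page information.
            bound_pages[dotspace_id] = pending_page
            pending_page = None
        page = bound_pages.get(dotspace_id)
        if page is None:
            return None
        for lx, ly, w, h, action in page["links"]:
            if lx <= x <= lx + w and ly <= y <= ly + h:
                return action  # e.g., play the stored audio clip
        return None

    print(on_pen_touch("page-7", 50, 315))  # binds page-7, then hits the link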


In accordance with other embodiments of the present invention, a user may be instructed to write identifying information in a particular area of a page and/or to trace some set of content, e.g., a title, of a page. In this manner, a page or dot-space may be identified by the interactive device, and the links can be located within that dot-space by utilizing information provided by the processing computer.


In accordance with still other embodiments of the present invention, a processing computer may add an image to rendering 500, for example, code mark 540. Code mark 540 identifies the content of rendering 500. For example, code mark 540 may signify the date and time at which image 400 (FIG. 4) was processed into rendering 500 (but not necessarily the time printed). Alternatively, code mark 540 may signify the sequence of the processed image, e.g., the nth processed image.


By reading code mark 540, a page or dot-space is identified by the interactive device and associated with its particular content. Code mark 540 is an image that can be rendered by the computer-attached printer and detected by an interactive device. Examples of such codes include one- and two-dimensional bar codes of a variety of formats. In accordance with an embodiment of the present invention, code mark 540 can encode information by obscuring substantially invisible codes pre-printed onto encoded paper.
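As one illustration of how such a code mark might work end to end, the sketch below encodes a sequence number and processing time into a payload string and resolves a scanned payload back to the transferred page information. The payload format is an assumption; the patent only requires that the mark be printable by the attached printer and readable by the interactive device.

    # Hypothetical sketch: encoding a processed-image identifier into a code
    # mark payload, then resolving it back to transferred page information.

    import time

    processed_pages = {}  # payload -> page information transferred to the pen

    def make_code_mark_payload(sequence_number):
        # e.g., "0007-1135900800": nth processed image plus processing time
        return f"{sequence_number:04d}-{int(time.time())}"

    def register_page(payload, page_info):
        processed_pages[payload] = page_info

    def on_code_mark_scanned(payload):
        # Reading the code mark identifies the page content for this dot-space
        return processed_pages.get(payload)

    payload = make_code_mark_payload(7)
    register_page(payload, {"url": "http://example.com/woodpeckers"})
    print(on_code_mark_scanned(payload))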


It is to be appreciated that, in accordance with embodiments of the present invention, the interactive device may not have knowledge of the content of an image prior to forming an association between an image and a particular dot-space. For example, code mark 540 may encode a universal resource locator (URL) corresponding to a printed page. Responsive to a touch in a particular location on that page, the interactive device may query that web page, e.g., wirelessly via a local computer or via a “hot spot.” Such a query may simulate a mouse click on the web page at the location corresponding to the touch. The interactive device may receive the web page response and react appropriately.
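The click-forwarding behavior might be sketched as follows. A real deployment would need a server-side convention for receiving click coordinates; the image-map-style "?x,y" query string below is an assumption, as are the function names.

    # Hypothetical sketch: the pen learns a URL from the code mark, then
    # reports touch coordinates to the server in the style of a server-side
    # image map and reacts to whatever the page returns.

    from urllib.request import urlopen

    def simulate_click(page_url, x, y):
        # Server-side image maps conventionally accept clicks as "?x,y"
        # appended to the URL; the exact convention is an assumption here.
        with urlopen(f"{page_url}?{x},{y}") as response:
            return response.read()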


For example, consider a copy of rendering 500 printed on encoded paper and interactive device 100 (FIG. 1), wirelessly coupled to the internet via an 802.11 access point. Interactive device 100 has not been preloaded with any bird calls or information about rendering 500.


A user scans code mark 540, which informs interactive device 100 that a particular 8.5 inch×11 inch portion of dot-space represents the web page located at a particular URL. Interactive device 100 may access the web page to obtain information about the links, e.g., their locations within the page and/or the content of the links. In this manner, interactive device 100 enables a user to interact with a web page utilizing a pen-like interface, without prior knowledge of the web page in the interactive device 100. For example, a user touch onto link 530 may cause the interactive device 100 to play the call of a red-bellied woodpecker from interactive device 100's memory.


Alternatively, interactive device 100 can simulate a conventional computer/mouse interaction with the website, e.g., sending information of mouse “clicks” in response to user touches of interactive device 100 onto the encoded paper. For example, a user touch onto link 530 may cause the interactive device 100 to send a mouse “click” indication to the website, and to play the call of a red-bellied woodpecker as the content is streamed from the website to interactive device 100.


Although the previous examples have illustrated interaction with content that is primarily the reception of information, e.g., accessing and playing back woodpecker calls, embodiments in accordance with the present invention are well suited to interactions that provide information as well.



FIG. 6 illustrates an image 600 printed on a piece of encoded paper, in accordance with an embodiment of the present invention. Image 600 may be rendered from a web image, and printed onto pre-encoded paper. Alternatively, image 600 may be printed contemporaneously with substantially invisible encoding, for example, via high quality commercial printing techniques.


Image 600 comprises a code mark 640, a user name input field 610, a password field 620 and a submit button, e.g., link, 630. If image 600 were a display of a conventional website, a user would understand to type a user name into field 610, type a password into field 620 and click on submit button 630. A pen-like interaction may comprise scanning code mark 640, writing a user name in field 610, writing a password in field 620 and touching submit button 630.


Responsive to these user actions, an interactive device 100 (FIG. 1) may perform a recognition function on the user generated stroke data entered into fields 610 and 620 and send the resulting character information to a destination, e.g., a website, specified in code mark 640. The sending operation may take place immediately or be queued until the interactive device 100 is able to communicate with the destination.
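A sketch of this recognize-then-send-or-queue behavior appears below; the recognition step is represented by a stub, and all names are invented for illustration.

    # Hypothetical sketch: recognize handwritten field strokes, then send the
    # text immediately if connected, otherwise queue it for later delivery.

    from collections import deque

    outbox = deque()

    def recognize(strokes):
        # Stand-in for the device's handwriting recognition function
        return "".join(strokes)

    def send(message):
        print("sending", message)

    def submit_form(destination, fields, connected):
        payload = {name: recognize(strokes) for name, strokes in fields.items()}
        if connected:
            send((destination, payload))
        else:
            outbox.append((destination, payload))  # held until connectivity

    def flush_outbox():
        while outbox:
            send(outbox.popleft())

    submit_form("http://example.com/login",
                {"user": list("user1"), "password": list("secret")},
                connected=False)
    flush_outbox()  # queued submission goes out once the device can communicate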



FIG. 7 is a flow diagram of a computer implemented method 700, in accordance with embodiments of the present invention. Method 700 may be performed at an electronic interactive device, for example, interactive device 100 of FIG. 1. In 710, a first image on a surface is accessed. For example, interactive device 100 accesses code mark 540 (FIG. 5). The surface comprises an encoded pattern of location information on the surface for providing location information to the electronic interactive device.


In 720, the first image is decoded to associate the location information with second image information of a second image, e.g., a printed rendering of a web page, on the surface. In optional 730, responsive to the electronic interactive device accessing a portion of the second image, an action associated with the portion of the second image is performed. For example, responsive to accessing link 530 (FIG. 5), the electronic interactive device may access the content associated with link 530 and play the song of the red-bellied woodpecker. The action may include a wide variety of actions, including accessing content associated with a link, e.g., a hyperlink.
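Tying steps 710 through 730 together, a compact sketch of method 700 might read as follows; the catalog structure and identifiers are hypothetical.

    # Hypothetical sketch of method 700: access the first image (code mark),
    # decode it to bind the page's dot-space to its content, then act on
    # later touches within that dot-space.

    associations = {}  # dot-space id -> second-image (page) information

    def steps_710_720(dotspace_id, code_mark_payload, page_catalog):
        # 710: the first image is accessed; 720: decoding it associates this
        # dot-space with the second image's information.
        associations[dotspace_id] = page_catalog[code_mark_payload]

    def step_730(dotspace_id, element):
        # Optional 730: perform the action tied to the touched portion
        return associations[dotspace_id]["actions"].get(element)

    catalog = {"mark-540": {"actions": {"link-530": "play red-bellied woodpecker song"}}}
    steps_710_720("page-7", "mark-540", catalog)
    print(step_730("page-7", "link-530"))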


In this novel manner, content printed on encoded paper may be associated with the encoded location, such that desirable interaction can be conducted utilizing a pen-like interface.



FIG. 8 is a flow diagram of a computer implemented method 800 for preparing printed content for interaction with a pen computer, in accordance with embodiments of the present invention. Method 800 may be performed at a conventional computer, e.g., a desktop computer system. In 810, content comprising an interactive item is accessed. For example, a web page is accessed. The interactive item may be a hyperlink, for example.


In 820, a graphical identifier is assigned to the content. The graphical identifier is printable by a desired printing device and readable by the pen computer. The graphical identifier may, for example, directly encode an address of the content within a database, e.g., a universal resource locator of a web page. Alternatively, for example, the graphical identifier may encode some other relation between a processing device and the content, e.g., an index relationship to information stored on the processing device.
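One way to picture step 820 is the choice between direct URL encoding and an index into information held on the processing device, as in this hypothetical sketch (identifier formats invented for illustration):

    # Hypothetical sketch of step 820: assigning a printable graphical
    # identifier to accessed content, either by encoding its address directly
    # or by an index into the processing device's local store.

    stored_content = []  # index -> content information on the processing device

    def assign_identifier(content_url, direct=True):
        if direct:
            # The identifier's payload is the address of the content itself
            return f"URL:{content_url}"
        # Otherwise the identifier only indexes the processing device's store
        stored_content.append(content_url)
        return f"IDX:{len(stored_content) - 1}"

    print(assign_identifier("http://example.com/woodpeckers"))         # URL:...
    print(assign_identifier("http://example.com/woodpeckers", False))  # IDX:0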


In 830, the content and the graphical identifier are printed. For example, the content of image 400 (FIG. 4) and the graphical identifier code mark 540 (FIG. 5) are printed onto paper encoded with a pre-printed substantially invisible code.


Embodiments in accordance with the present invention provide for associating content with an image bearing surface. Embodiments in accordance with the present invention also provide for systems and methods for associating content with an image bearing surface having qualities of paper. Further, embodiments in accordance with the present invention provide for systems and methods for associating content with an image bearing surface that is compatible and complementary with existing computers, computer peripherals and methods of web access.


Various embodiments of the invention, systems and methods for associating content with an image bearing surface, are thus described. While the present invention has been described in particular embodiments, it should be appreciated that the invention should not be construed as limited by such embodiments, but rather construed according to the below claims.

Claims
  • 1. A method comprising: accessing a first image on a surface, wherein said surface comprises an encoded pattern of location information on said surface for providing location information to an electronic interactive device; and wherein further said encoded pattern of location information defines a portion of a dot space; and decoding said first image to associate said location information with second image information of a plurality of second images on said surface; wherein said first image identifies a web page comprising said plurality of second images and a plurality of memory-stored content associated with said plurality of second images; and associating said portion of said dot space with said web page, wherein said accessing and said decoding and associating are performed by said electronic interactive device.
  • 2. The method of claim 1 wherein said encoded pattern is substantially invisible.
  • 3. The method of claim 1 wherein said second image information is resident on said electronic interactive device at a time of said decoding.
  • 4. The method of claim 1 wherein said second image information is not resident on said electronic interactive device at a time of said decoding.
  • 5. The method of claim 1 wherein said second image information comprises an address of a source of said plurality of second images.
  • 6. The method of claim 5 wherein said address is a universal resource locator.
  • 7. The method of claim 1 wherein said first image encodes information by obscuring a portion of said encoded pattern.
  • 8. The method of claim 1 wherein said second image information comprises positional information of elements of said plurality of second images.
  • 9. A method comprising: accessing a first image on a surface, wherein said surface comprises an encoded pattern of location information on said surface for providing location information to an electronic interactive device; and wherein further said encoded pattern of location information defines a portion of a dot space; decoding said first image to associate said location information with second image information of a plurality of second images on said surface; wherein said first image identifies a web page comprising said plurality of second images and a plurality of memory-stored content associated with said plurality of second images; associating said portion of said dot space with said web page; and responsive to said electronic interactive device accessing a portion of said plurality of second images, performing an action associated with said portion of said plurality of second images, wherein said accessing, decoding, associating and performing are performed by said electronic interactive device.
  • 10. The method of claim 9 wherein said action is to render an audio content associated with said portion of said plurality of second images.
  • 11. The method of claim 9 wherein said action is to render an audio content associated with accessing said portion of said plurality of second images.
  • 12. The method of claim 9 wherein said action comprises accessing content not resident on said electronic interactive device at a time of said accessing said portion of said plurality of second images.
  • 13. The method of claim 12 wherein said accessing content comprises accessing the internet.
  • 14. A pen-shaped device comprising: an optical detector coupled to a housing; a processor coupled to said optical detector; and a memory unit coupled to said processor, said memory unit comprising instructions that when executed implement a method, said method comprising: accessing a first image on a surface, wherein said surface comprises an encoded pattern of location information on said surface for providing location information to said pen-shaped device; and wherein further said encoded pattern of location information defines a portion of a dot space; and decoding said first image to associate said location information with second image information of a plurality of second images on said surface; wherein said first image identifies a web page comprising said plurality of second images and a plurality of memory-stored content associated with said plurality of second images; and associating said portion of said dot space with said web page.
  • 15. The pen-shaped device of claim 14 further comprising a wireless communications unit coupled to said processor for accessing said second image information.
  • 16. The pen-shaped device of claim 14 wherein said second image information is resident on said pen-shaped device at a time of said decoding.
  • 17. The pen-shaped device of claim 15 wherein said second image information is not present on said pen-shaped device at a time of said decoding.
  • 18. The pen-shaped device of claim 17 wherein said address is a universal resource locator.
  • 19. The pen-shaped device of claim 14 wherein said second image information comprises an address of a source of said plurality of second images.
  • 20. The pen-shaped device of claim 14 wherein said first image encodes information by obscuring a portion of said encoded pattern.
  • 21. The pen-shaped device of claim 14 wherein said second image information comprises positional information of elements of said plurality of second images.
  • 22. A method of identifying information, said method comprising: scanning a first image printed on a surface, wherein said surface has printed thereon an encoded pattern of location information for use by a pen computer device and wherein further said encoded pattern of location information defines a portion of a dot space; recognizing said first image wherein said first image identifies a web page comprising a plurality of second images and a plurality of memory-stored content associated with said plurality of second images; and associating said portion of said dot space with said web page, wherein said scanning, said recognizing and said associating are performed by said pen computer device.
  • 23. A method as described in claim 22 further comprising, in response to said pen computer device selecting a selected one of said second images, performing an action related to a memory-stored content that is associated with said selected one of said plurality of second images.
  • 24. A method as described in claim 23 wherein said action is audibly rendering a recording.
  • 25. A method as described in claim 22 further comprising, in response to said pen computer device selecting a selected one of said second images, accessing a memory-stored content that is associated with said selected one of said plurality of second images.
  • 26. A method as described in claim 25 wherein said memory-stored content that is associated with said selected one of said plurality of second images is a universal resource locator.
  • 27. A method as described in claim 22 wherein said first image is a bar code image.
  • 28. A method as described in claim 22 further comprising: printing said plurality of second images on said surface; andprinting said first image on said surface.
  • 29. A method as described in claim 28 further comprising loading said plurality of memory-stored content that is associated with said plurality of second images onto a memory of said pen computer device.
  • 30. A method as described in claim 29 wherein said printings and said loading occur substantially contemporaneously.
RELATED APPLICATIONS

This application is a Continuation In Part of co-pending, commonly owned U.S. patent application Ser. No. 11/194,020, filed Jul. 29, 2005, to Young et al., entitled “Image Bearing Surface,” which is hereby incorporated by reference herein in its entirety.

US Referenced Citations (397)
Number Name Date Kind
2182334 Crespo Dec 1939 A
2932907 Stieber et al. Apr 1960 A
3292489 Johnson et al. Dec 1966 A
3304612 Proctor et al. Feb 1967 A
3530241 Ellis Sep 1970 A
3591718 Asano et al. Jul 1971 A
3657812 Lee Apr 1972 A
3782734 Krainin Jan 1974 A
3798370 Hurst Mar 1974 A
3888311 Cooke, Jr. Jun 1975 A
3911215 Hurst et al. Oct 1975 A
3921165 Dym Nov 1975 A
4079194 Kley Mar 1978 A
4220815 Gibson et al. Sep 1980 A
4318096 Thornburg et al. Mar 1982 A
4337375 Freeman Jun 1982 A
4375058 Bouma et al. Feb 1983 A
4425099 Naden Jan 1984 A
4464118 Scott et al. Aug 1984 A
4492819 Rodgers et al. Jan 1985 A
4570149 Thornburg et al. Feb 1986 A
4603231 Reiffel et al. Jul 1986 A
4604058 Fisher et al. Aug 1986 A
4604065 Frazer et al. Aug 1986 A
4619539 Kageyama Oct 1986 A
4627819 Burrows Dec 1986 A
4630209 Saito et al. Dec 1986 A
4650926 Nakamura et al. Mar 1987 A
4686332 Greanias et al. Aug 1987 A
4706090 Hashiguchi et al. Nov 1987 A
4739299 Eventoff et al. Apr 1988 A
4748318 Bearden et al. May 1988 A
4787040 Ames et al. Nov 1988 A
4793810 Beasley, Jr. Dec 1988 A
4839634 More et al. Jun 1989 A
4841387 Rindfuss Jun 1989 A
4853494 Suzuki Aug 1989 A
4853498 Meadows et al. Aug 1989 A
4853499 Watson Aug 1989 A
4880968 Kwang-Chien Nov 1989 A
4913463 Tlapek et al. Apr 1990 A
4922061 Meadows et al. May 1990 A
4924387 Jeppesen May 1990 A
4964167 Kunizawa et al. Oct 1990 A
4972496 Sklarew Nov 1990 A
4990093 Frazer et al. Feb 1991 A
4991987 Holloway et al. Feb 1991 A
5007085 Greanias et al. Apr 1991 A
5030117 Delorme Jul 1991 A
5053585 Yaniger Oct 1991 A
5057024 Sprott et al. Oct 1991 A
5059126 Kimball Oct 1991 A
5113178 Yasuda et al. May 1992 A
5117071 Greanias et al. May 1992 A
5128525 Stearns et al. Jul 1992 A
5149919 Greanias et al. Sep 1992 A
5157384 Greanias et al. Oct 1992 A
5168147 Bloomberg Dec 1992 A
5184003 McMillin et al. Feb 1993 A
5194852 More et al. Mar 1993 A
5209665 Billings et al. May 1993 A
5217376 Gosselin Jun 1993 A
5217378 Donovan Jun 1993 A
5220136 Kent Jun 1993 A
5220649 Forcier Jun 1993 A
5221833 Hecht Jun 1993 A
5250930 Yoshida et al. Oct 1993 A
5260697 Barrett et al. Nov 1993 A
5294792 Lewis et al. Mar 1994 A
5301243 Olschafskie et al. Apr 1994 A
5314336 Diamond et al. May 1994 A
5356296 Pierce et al. Oct 1994 A
5401916 Crooks Mar 1995 A
5406307 Hirayama et al. Apr 1995 A
5409381 Sundberg et al. Apr 1995 A
5413486 Burrows et al. May 1995 A
5417575 McTaggart May 1995 A
5438168 Wolfe et al. Aug 1995 A
5438662 Randall Aug 1995 A
5466158 Smith, III Nov 1995 A
5474457 Bromley Dec 1995 A
5480306 Liu Jan 1996 A
5484292 McTaggart Jan 1996 A
5485176 Ohara et al. Jan 1996 A
5509087 Nagamine Apr 1996 A
5510606 Worthington et al. Apr 1996 A
5517579 Baron et al. May 1996 A
5520544 Manico et al. May 1996 A
5561446 Montlick Oct 1996 A
5572651 Weber et al. Nov 1996 A
5574519 Manico et al. Nov 1996 A
5574804 Olschafskie et al. Nov 1996 A
5575659 King et al. Nov 1996 A
5596698 Morgan Jan 1997 A
5604517 Filo Feb 1997 A
5624265 Redford et al. Apr 1997 A
5629499 Flickinger et al. May 1997 A
5635726 Zavislan et al. Jun 1997 A
5636995 Sharpe, III et al. Jun 1997 A
5640193 Wellner Jun 1997 A
5649023 Barbara et al. Jul 1997 A
5652412 Lazzouni et al. Jul 1997 A
5652714 Peterson et al. Jul 1997 A
5661506 Lazzouni et al. Aug 1997 A
5663748 Huffman et al. Sep 1997 A
5666214 MacKinlay et al. Sep 1997 A
5686705 Conroy et al. Nov 1997 A
5694102 Hecht Dec 1997 A
5697793 Huffman et al. Dec 1997 A
5698822 Haneda et al. Dec 1997 A
5717939 Bricklin et al. Feb 1998 A
5730602 Gierhart et al. Mar 1998 A
5739814 Ohara et al. Apr 1998 A
5757361 Hirshik May 1998 A
5760773 Berman et al. Jun 1998 A
5767457 Gerpheide et al. Jun 1998 A
5788508 Lee et al. Aug 1998 A
5790114 Geaghan et al. Aug 1998 A
5801687 Peterson et al. Sep 1998 A
5835726 Shwed et al. Nov 1998 A
5844483 Boley Dec 1998 A
5847698 Reavey et al. Dec 1998 A
5852434 Sekendur Dec 1998 A
5855483 Collins et al. Jan 1999 A
5877458 Flowers Mar 1999 A
5889506 Lopresti et al. Mar 1999 A
5896403 Nagasaki et al. Apr 1999 A
5902968 Sato et al. May 1999 A
5903729 Reber et al. May 1999 A
5910009 Leff et al. Jun 1999 A
5913629 Hazzard Jun 1999 A
5914707 Kono Jun 1999 A
5932863 Rathus et al. Aug 1999 A
5933829 Durst et al. Aug 1999 A
5945656 Lemelson et al. Aug 1999 A
5951298 Werzberger Sep 1999 A
5957697 Iggulden et al. Sep 1999 A
5960124 Taguchi et al. Sep 1999 A
5963199 Kato et al. Oct 1999 A
5963208 Dolan et al. Oct 1999 A
5973420 Kaiserman et al. Oct 1999 A
5974558 Cortopassi et al. Oct 1999 A
5978773 Hudetz et al. Nov 1999 A
5992817 Klitsner et al. Nov 1999 A
5997309 Metheny et al. Dec 1999 A
6000613 Hecht et al. Dec 1999 A
6000621 Hecht et al. Dec 1999 A
6002387 Ronkka et al. Dec 1999 A
6008799 Van Kleeck Dec 1999 A
6009393 Sasaki Dec 1999 A
6018656 Shirai Jan 2000 A
6020895 Azami Feb 2000 A
6021306 McTaggart Feb 2000 A
6041215 Maddrell et al. Mar 2000 A
6050735 Hazzard Apr 2000 A
6052117 Ohara et al. Apr 2000 A
6064855 Ho May 2000 A
6072476 Harada et al. Jun 2000 A
6076734 Dougherty et al. Jun 2000 A
6076738 Bloomberg et al. Jun 2000 A
6081261 Wolff et al. Jun 2000 A
6088023 Louis et al. Jul 2000 A
6089943 Lo Jul 2000 A
6094197 Buxton et al. Jul 2000 A
6100877 Chery et al. Aug 2000 A
6104387 Chery et al. Aug 2000 A
6104388 Nagai et al. Aug 2000 A
6119944 Mulla et al. Sep 2000 A
6124851 Jacobson Sep 2000 A
6130666 Persidsky Oct 2000 A
6144371 Clary et al. Nov 2000 A
6148173 Bell Nov 2000 A
6164534 Rathus et al. Dec 2000 A
6164541 Dougherty et al. Dec 2000 A
6181329 Stork et al. Jan 2001 B1
6183262 Tseng Feb 2001 B1
6188983 Hanson Feb 2001 B1
6199042 Kurzweil Mar 2001 B1
6199048 Hudetz et al. Mar 2001 B1
6201903 Wolff et al. Mar 2001 B1
6201947 Hur et al. Mar 2001 B1
6208771 Jared et al. Mar 2001 B1
6215476 Depew et al. Apr 2001 B1
6215901 Schwartz Apr 2001 B1
6218964 Ellis Apr 2001 B1
6239792 Yanagisawa et al. May 2001 B1
6241528 Myers Jun 2001 B1
6252564 Albert et al. Jun 2001 B1
6256638 Dougherty et al. Jul 2001 B1
6262711 Cohen et al. Jul 2001 B1
6262719 Bi et al. Jul 2001 B1
6275301 Bobrow et al. Aug 2001 B1
6295439 Bejar et al. Sep 2001 B1
6297812 Ohara et al. Oct 2001 B1
6297824 Hearst et al. Oct 2001 B1
6304667 Reitano Oct 2001 B1
6304898 Shiigi Oct 2001 B1
6304989 Kraus et al. Oct 2001 B1
6309122 Wang Oct 2001 B1
6313828 Chombo Nov 2001 B1
6322369 Patterson et al. Nov 2001 B1
6330976 Dymetman et al. Dec 2001 B1
6331865 Sachs et al. Dec 2001 B1
6331867 Eberhard et al. Dec 2001 B1
6335727 Morishita et al. Jan 2002 B1
6349194 Nozaki et al. Feb 2002 B1
6363239 Tutt et al. Mar 2002 B1
6388681 Nozaki May 2002 B1
6392632 Lee May 2002 B1
6396481 Challa et al. May 2002 B1
6405167 Cogliano Jun 2002 B1
6415108 Kamishima et al. Jul 2002 B1
6418326 Heinonen et al. Jul 2002 B1
6421524 Padgett Jul 2002 B1
6431439 Suer et al. Aug 2002 B1
6434561 Durst, Jr. et al. Aug 2002 B1
6441807 Yamaguchi Aug 2002 B1
6442350 Stephany et al. Aug 2002 B1
6456749 Kasabach et al. Sep 2002 B1
6460155 Nagasaki et al. Oct 2002 B1
6473072 Comiskey et al. Oct 2002 B1
6476834 Doval et al. Nov 2002 B1
6493734 Sachs et al. Dec 2002 B1
6502756 Fahraeus Jan 2003 B1
6509893 Akhlagi et al. Jan 2003 B1
6516181 Kirwan Feb 2003 B1
6529920 Arons et al. Mar 2003 B1
6532314 Plain et al. Mar 2003 B1
6535799 Levanoni et al. Mar 2003 B2
6556188 Cordner Apr 2003 B1
6564249 Shiigi May 2003 B2
6577299 Schiller et al. Jun 2003 B1
6584249 Gu et al. Jun 2003 B1
6587859 Dougherty et al. Jul 2003 B2
6592039 Smith et al. Jul 2003 B1
6593908 Borgstrom et al. Jul 2003 B1
6608618 Wood Aug 2003 B2
6609653 Lapstun et al. Aug 2003 B1
6627870 Lapstun et al. Sep 2003 B1
6628847 Kasabach et al. Sep 2003 B1
6641401 Wood et al. Nov 2003 B2
6644545 Lapstun et al. Nov 2003 B1
6647369 Silverbrook et al. Nov 2003 B1
6651879 Lapstun et al. Nov 2003 B2
6661405 Flowers Dec 2003 B1
6663008 Pettersson et al. Dec 2003 B1
6665490 Copperman et al. Dec 2003 B2
6668156 Lynch et al. Dec 2003 B2
6676411 Rehkemper et al. Jan 2004 B2
6678499 Silverbrook et al. Jan 2004 B1
6689966 Wiebe Feb 2004 B2
6724373 O'Neill, Jr. et al. Apr 2004 B1
6724374 Lapstun et al. Apr 2004 B1
6732927 Olsson et al. May 2004 B2
6738050 Comiskey et al. May 2004 B2
6738053 Borgstrom et al. May 2004 B1
6752557 Hsieh Jun 2004 B1
6755584 O'Brien et al. Jun 2004 B2
6763995 Song Jul 2004 B1
6771283 Carro Aug 2004 B2
6773185 Hsieh Aug 2004 B1
6798403 Kitada et al. Sep 2004 B2
6816702 Kuntz et al. Nov 2004 B2
6831632 Vardi Dec 2004 B2
6847883 Walmsley et al. Jan 2005 B1
6853293 Swartz et al. Feb 2005 B2
6874883 Shigemura et al. Apr 2005 B1
6885878 Borgstrom et al. Apr 2005 B1
6886036 Santamaki et al. Apr 2005 B1
6915103 Blume Jul 2005 B2
6933928 Lilienthal Aug 2005 B1
6938222 Hullender et al. Aug 2005 B2
6940491 Carro Sep 2005 B2
6947027 Lapstun et al. Sep 2005 B2
6956562 O'Hara et al. Oct 2005 B1
6965454 Silverbrook et al. Nov 2005 B1
6966495 Lynggaard et al. Nov 2005 B2
6966777 Robotham Nov 2005 B2
6982703 Lapstun et al. Jan 2006 B2
6985138 Charlier Jan 2006 B2
6989816 Dougherty et al. Jan 2006 B1
7006116 Meyers et al. Feb 2006 B1
7035583 Ferrigno et al. Apr 2006 B2
7068860 Kasabach et al. Jun 2006 B2
7080103 Womack Jul 2006 B2
7099019 Silverbrook et al. Aug 2006 B2
7134606 Chou Nov 2006 B2
7184592 Iga et al. Feb 2007 B2
7193618 Morehouse Mar 2007 B2
7202861 Lynggaard Apr 2007 B2
7239306 Fahraeus et al. Jul 2007 B2
7289110 Hansson Oct 2007 B2
7295193 Fahraeus Nov 2007 B2
7350996 Bielecki et al. Apr 2008 B2
7409089 Simmons et al. Aug 2008 B2
7421439 Wang et al. Sep 2008 B2
7453447 Marggraff et al. Nov 2008 B2
20010015721 Byun et al. Aug 2001 A1
20010024193 Fahraeus Sep 2001 A1
20010051329 Lynch et al. Dec 2001 A1
20020000468 Bansal Jan 2002 A1
20020001418 Fahraeus et al. Jan 2002 A1
20020011989 Ericson et al. Jan 2002 A1
20020021284 Wiebe Feb 2002 A1
20020023957 Michaelis et al. Feb 2002 A1
20020029146 Nir Mar 2002 A1
20020041290 LeKuch et al. Apr 2002 A1
20020044134 Ericson et al. Apr 2002 A1
20020060665 Sekiguchi et al. May 2002 A1
20020077902 Marcus Jun 2002 A1
20020083101 Card et al. Jun 2002 A1
20020087598 Carro Jul 2002 A1
20020113802 Card et al. Aug 2002 A1
20020113823 Card et al. Aug 2002 A1
20020118230 Card et al. Aug 2002 A1
20020120854 LeVine et al. Aug 2002 A1
20020193975 Zimmerman Dec 2002 A1
20020197589 Wood et al. Dec 2002 A1
20030001020 Kardach Jan 2003 A1
20030013073 Duncan et al. Jan 2003 A1
20030013483 Ausems et al. Jan 2003 A1
20030014615 Lynggaard Jan 2003 A1
20030016210 Soto et al. Jan 2003 A1
20030016212 Lynggaard Jan 2003 A1
20030020629 Swartz et al. Jan 2003 A1
20030024975 Rajasekharan Feb 2003 A1
20030025951 Pollard et al. Feb 2003 A1
20030028451 Ananian Feb 2003 A1
20030029919 Lynggaard et al. Feb 2003 A1
20030040310 Barakat et al. Feb 2003 A1
20030046256 Hugosson et al. Mar 2003 A1
20030052900 Card et al. Mar 2003 A1
20030067427 Comiskey et al. Apr 2003 A1
20030071850 Geidl Apr 2003 A1
20030080948 Lapstun et al. May 2003 A1
20030087219 Berger et al. May 2003 A1
20030089777 Rajasekharan et al. May 2003 A1
20030090477 Lapstun et al. May 2003 A1
20030095098 Paul et al. May 2003 A1
20030112220 Yang et al. Jun 2003 A1
20030133164 Tsai Jul 2003 A1
20030134257 Morsy et al. Jul 2003 A1
20030162162 Marggraff Aug 2003 A1
20030173405 Wilz et al. Sep 2003 A1
20030195820 Silverbrook et al. Oct 2003 A1
20030208410 Silverbrook et al. Nov 2003 A1
20030218604 Wood et al. Nov 2003 A1
20040012198 Brotzell et al. Jan 2004 A1
20040022454 Kasabach et al. Feb 2004 A1
20040023200 Blume Feb 2004 A1
20040029092 Orr et al. Feb 2004 A1
20040039750 Anderson et al. Feb 2004 A1
20040043365 Kelley et al. Apr 2004 A1
20040043371 Ernst et al. Apr 2004 A1
20040084190 Hill et al. May 2004 A1
20040091842 Carro May 2004 A1
20040104890 Caldwell et al. Jun 2004 A1
20040121298 Creamer et al. Jun 2004 A1
20040140966 Marggraff et al. Jul 2004 A1
20040164975 Ho et al. Aug 2004 A1
20040167895 Carro Aug 2004 A1
20040169695 Forman Sep 2004 A1
20040202987 Scheuring et al. Oct 2004 A1
20040219501 Small et al. Nov 2004 A1
20040229195 Marggraff et al. Nov 2004 A1
20040259067 Cody et al. Dec 2004 A1
20050002053 Meador et al. Jan 2005 A1
20050005246 Card et al. Jan 2005 A1
20050013487 Clary et al. Jan 2005 A1
20050022130 Fabritius Jan 2005 A1
20050024346 Dupraz et al. Feb 2005 A1
20050055628 Chen et al. Mar 2005 A1
20050060644 Patterson Mar 2005 A1
20050082359 Marggraff et al. Apr 2005 A1
20050083316 Brian et al. Apr 2005 A1
20050106547 Chiu May 2005 A1
20050131803 Lapstun et al. Jun 2005 A1
20050134926 Takezaki et al. Jun 2005 A1
20050135678 Wecker et al. Jun 2005 A1
20050138541 Euchner et al. Jun 2005 A1
20050165663 Razumov Jul 2005 A1
20050188306 Mackenzie Aug 2005 A1
20050208458 Smith et al. Sep 2005 A1
20050211783 Chou Sep 2005 A1
20060033725 Marggraff et al. Feb 2006 A1
20060067576 Marggraff et al. Mar 2006 A1
20060067577 Marggraff et al. Mar 2006 A1
20060080609 Marggraff Apr 2006 A1
20060125805 Marggraff Jun 2006 A1
20060126105 Sedky et al. Jun 2006 A1
20060127872 Marggraff Jun 2006 A1
20060146029 Diercks Jul 2006 A1
20060159345 Clary et al. Jul 2006 A1
20060168261 Serval et al. Jul 2006 A1
20060242562 Wang et al. Oct 2006 A1
20060269168 Kasabach et al. Nov 2006 A1
20070003168 Oliver Jan 2007 A1
Foreign Referenced Citations (58)
Number Date Country
1142471 Feb 1997 CN
1520542 Aug 2004 CN
1655184 Aug 2005 CN
0495618 Jul 1992 EP
0519714 Dec 1992 EP
0539053 Apr 1993 EP
0697780 Feb 1996 EP
0866397 Sep 1998 EP
0973314 Jan 2000 EP
1256090 Nov 2002 EP
1256091 Nov 2002 EP
1315085 May 2003 EP
1416426 May 2004 EP
2811130 Jan 2002 FR
2202664 Sep 1988 GB
57238486 Mar 1982 JP
5137846 Jun 1993 JP
5217688 Aug 1993 JP
6146516 May 1994 JP
H06231466 Aug 1994 JP
H08036452 Feb 1996 JP
1011639 Apr 1998 JP
11119790 Apr 1999 JP
2000247074 Sep 2000 JP
2000293303 Oct 2000 JP
2001184291 Jul 2001 JP
2002297308 Oct 2002 JP
2003528402 Sep 2003 JP
2004503840 Feb 2004 JP
2007296387 Nov 2007 JP
2002009615 Nov 2000 KR
20020033775 Jul 2002 KR
9957648 Nov 1999 WO
WO 0073983 Dec 2000 WO
WO 0101670 Jan 2001 WO
WO 0116691 Mar 2001 WO
WO 0126032 Apr 2001 WO
0148685 Jul 2001 WO
0161455 Aug 2001 WO
0167222 Sep 2001 WO
0169917 Sep 2001 WO
0171653 Sep 2001 WO
0171743 Sep 2001 WO
WO 0171473 Sep 2001 WO
WO 0171475 Sep 2001 WO
WO 0175723 Oct 2001 WO
WO 0175773 Oct 2001 WO
WO 0175780 Oct 2001 WO
0183213 Nov 2001 WO
0186612 Nov 2001 WO
WO 0195559 Dec 2001 WO
0242894 May 2002 WO
03001357 Jan 2003 WO
03001475 Jan 2003 WO
03067553 Aug 2003 WO
03083763 Oct 2003 WO
03094489 Nov 2003 WO
2004084190 Sep 2004 WO
Continuation in Parts (1)
Number Date Country
Parent 11194020 Jul 2005 US
Child 11322800 US