The present disclosure relates generally to information retrieval. In an example embodiment, the disclosure relates to identification of items depicted in images.
Online shopping and auction websites provide a number of publishing, listing, and price-setting mechanisms whereby a seller may list or publish information concerning items for sale. A buyer can express interest in or indicate a desire to purchase such items by, for example, submitting a query to the website for use in a search of the requested items.
The accurate matching of a query to relevant items is currently a major challenge in the field of information retrieval. An example of such a challenge is that item descriptions tend to be short and are uniquely defined by the sellers. Buyers seeking to purchase the items might use a different vocabulary from the vocabulary used by the sellers to describe the items. As an example, an item identified in the title as a “garnet” does not match a query “January birthstone” submitted by a buyer, although garnet is known as the birthstone for January. As a result, online shopping and auction websites that use a conventional search engine to locate items may not effectively connect the buyers to the sellers and vice versa.
The present disclosure is illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:
The description that follows includes illustrative systems, methods, techniques, instruction sequences, and computing machine program products that embody illustrative embodiments of the present invention. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide an understanding of various embodiments of the inventive subject matter. It will be evident, however, to those skilled in the art that embodiments of the inventive subject matter may be practiced without these specific details. In general, well-known instruction instances, protocols, structures, and techniques have not been shown in detail.
The embodiments described herein provide techniques for identifying items depicted in images. Images depicting a variety of items are stored in a repository of, for example, a network-based publication system (e.g., an online shopping website or an online auction website). Users may submit these images for inclusion in item postings, advertisements, or other publications in the network-based publication system. As explained in more detail below, an item depicted in an image may be identified by matching the image with user-submitted images stored in the repository. In some embodiments, as explained in more detail below, the match may be based on a comparison of the color histograms of the images.
It should be noted that the submission of an image of an item (e.g., image 102 of the painting) for identification may be used in a variety of different applications. As used herein, an “item” refers to any tangible or intangible thing and/or something that has a distinct, separate existence from other things (e.g., goods, services, electronic files, web pages, electronic documents, and land). For example, in addition to a sale of the item, a user may submit an image of the item to a price comparison service, in accordance with an embodiment of the invention. This price comparison service can identify the item depicted in the image and deliver shopping comparison results associated with the item. In another embodiment, a user can submit an image to a search engine (e.g., an Internet search engine or a website search engine), and the search engine can then retrieve websites or other information associated with the item depicted in the image. In yet another embodiment, a user can submit the image to an online auction website that can identify the item depicted in the image and return a template associated with the item to the user such that the user may then modify the template, if necessary, for use in auctioning the item on the online auction website. A template is an electronic file or document with descriptions and layout information. For example, a template may be a document with a predesigned, customized format and structure, such as a fax template, a letter template, or a sale template, which can be readily filled in with information.
In the example of
The item recognition module 314 accesses the image from the client processing systems and, as explained in more detail below, identifies the item 309 depicted in the image with an item identifier. An “item identifier,” as used herein, refers to a variety of values (e.g., alphanumeric characters and symbols) that establish the identity of or uniquely identify one or more items, such as item 309. For example, the item identifier can be a name assigned to the item 309. In another example, the item identifier can be a barcode value (e.g., a Universal Product Code (UPC)) assigned to the item 309. In yet another example, the item identifier can be a title or description assigned to the item 309.
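Purely for illustration, an item identifier could be modeled as a small record holding whichever of these values is available. The following is a minimal Python sketch; the class name and fields are assumptions rather than structures defined by the disclosure.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ItemIdentifier:
    """Hypothetical container for the kinds of values the disclosure lists
    as item identifiers: a name, a barcode value (e.g., UPC), or a title."""
    name: Optional[str] = None
    upc: Optional[str] = None
    title: Optional[str] = None

# Example: an identifier that could establish the identity of item 309
identifier_309 = ItemIdentifier(name="DVD player", upc="012345678905")
```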
In an embodiment, the item recognition module 314 may then transmit the item identifier to a service hosted on the server 310 to locate item data. The “item data,” as used herein, refers to a variety of data regarding one or more items depicted in an image, which may be posted with or otherwise associated with the image. Such item data, for example, may be stored with the images or at other locations. Examples of item data include titles included in item listings, descriptions of items included in item listings, locations of the items, prices of the items, quantities of the items, availability of the items, a count of the items, templates associated with the items, and other item data. The type of item data requested by the item recognition module 314 depends on the type of service being accessed. Examples of services include online auction websites, online shopping websites, and Internet search engines (or website search engines). It should be appreciated that the item recognition module 314 may access a variety of different services by way of, for example, a Web-exposed application program interface (API). In an alternate embodiment, the item recognition module 314 may be embodied with the service itself where, for example, the item recognition module 314 may be hosted in the server 310 with the other services.
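As a rough sketch of how such a request for item data over a Web-exposed API might look, consider the following Python snippet. The endpoint, parameter names, and returned fields are hypothetical, and the `requests` library is an assumed, commonly available HTTP client; none of these are specified by the disclosure.

```python
import requests  # third-party HTTP client, assumed available

def fetch_item_data(item_identifier: str, service_url: str) -> dict:
    """Request item data (e.g., title, description, price, quantity) for an
    identified item from a service; endpoint and field names are illustrative."""
    response = requests.get(
        f"{service_url}/items",
        params={"identifier": item_identifier},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()
```

The fields actually returned would depend on the type of service being accessed, such as an online auction website, an online shopping website, or a search engine.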
The system 300 may also include a global positioning system (not shown) that may be attached to or included in the client processing systems. The client processing systems can transmit the coordinates or location identified by the global positioning system to the services hosted on server 310 and, for example, the services can use the coordinates to locate nearby stores that sell the item 309 depicted in the image.
The processing system 402 is configured to execute an operating system 404 that manages the software processes and/or services executing on the processing system 402. As depicted in
The request handler module 410 is configured to interface with other processing systems, such as the client processing systems 304 and 306 of
The image recognition module 412 is configured to identify one or more items depicted in an image by comparing the received image with other images of items to identify a match, which is explained in more detail below. The hosting module 414 is configured to interface with other services, which are discussed above. As an example, the image recognition module 412 may transmit a request to a service by way of the hosting module 414 for item data associated with the identified items. This request may include an item identifier, global positioning coordinates, and other information. In turn, the item recognition module 314 receives the requested item data from the service by way of the hosting module 414. The request handler module 410 may then parse the item data from the service into, for example, a lightweight eXtensible Markup Language (XML) format for mobile devices and may transmit the response back to the processing systems that originally requested the item data regarding the items depicted in the image.
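A minimal sketch of the parsing-and-response step follows, assuming the item data has already been parsed into a Python dictionary and is reduced to a compact XML document for a mobile client. The element names are assumptions, not a schema from the disclosure.

```python
import xml.etree.ElementTree as ET

def item_data_to_xml(item_data: dict) -> str:
    """Serialize a few fields of parsed item data into a lightweight XML
    response suitable for transmission back to a mobile device."""
    root = ET.Element("item")
    for field in ("identifier", "title", "price", "quantity"):
        if field in item_data:
            ET.SubElement(root, field).text = str(item_data[field])
    return ET.tostring(root, encoding="unicode")

# item_data_to_xml({"identifier": "DVD player", "price": "49.99"})
# -> '<item><identifier>DVD player</identifier><price>49.99</price></item>'
```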
It should be appreciated that in other embodiments, the processing system 402 may include fewer, more, or different modules apart from those shown in
Generally, the neural network module 508 is configured to identify one or more items depicted in an image through learning and training. As an example, the neural network module 508 can identify matches between images based on learning algorithms. It should be appreciated that a neural network is a type of computer system that is based generally on the parallel architecture of animal brains and can learn by example. As explained in more detail below, the neural network module 508 gathers representative data and then invokes learning algorithms to learn automatically the structure of the data. A Java Object Oriented Neural Engine is one example of the neural network module 508. Other examples of neural network modules include Feed Forward Neural Networks, Recursive Neural Networks (e.g., Elman and Jordan), Time Delay Neural Networks, Standard Back-Propagation Neural Networks (e.g., Gradient Descent, on-line, and batch), Resilient Back-Propagation (RPROP) Neural Networks, Kohonen Self-Organizing Maps (with WTA or Gaussian output maps), Principal Component Analysis, and Modular Neural Networks.
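The module itself is described as a Java engine; purely as an illustration of "learning by example," the sketch below trains a small feed-forward network on toy histogram-difference features using scikit-learn, which is an assumed substitute rather than the engine named in the disclosure, and the feature values shown are made up.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier  # assumed stand-in library

# Toy training data: each row holds histogram differences between a query
# image and a stored image; the label marks whether that pair was a match.
X_train = np.array([
    [0.02, 0.01, 0.03],   # small differences -> previously confirmed match
    [0.40, 0.35, 0.50],   # large differences -> previously confirmed non-match
])
y_train = np.array([1, 0])

# A small back-propagation-trained feed-forward network that learns the
# structure of the representative data it is given.
net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
net.fit(X_train, y_train)

print(net.predict([[0.03, 0.02, 0.04]]))  # likely [1], i.e., a match
```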
The harvester module 504 is configured to request item data from a service by way of, for example, an API. As described in more detail below, the harvester module 504 may then parse the item data to identify item identifiers and associate the item identifiers with an image.
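A sketch of that harvesting step is shown below under the assumption that the service API returns listings as Python dictionaries; the field names (`title`, `upc`, `image_url`) are hypothetical.

```python
def harvest_item_identifiers(listings: list[dict]) -> dict[str, str]:
    """Walk item data returned by a service API, pull an item identifier out
    of each listing, and associate it with that listing's image."""
    image_to_identifier: dict[str, str] = {}
    for listing in listings:
        identifier = listing.get("title") or listing.get("upc")
        image_ref = listing.get("image_url")
        if identifier and image_ref:
            image_to_identifier[image_ref] = identifier
    return image_to_identifier
```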
A variety of image identification techniques may be applied to identify the item depicted in the image. As an example, the identification can be based on identifying a match of the image with one of the other images accessed from the repository. In this embodiment, the image is compared with other images at 606, and a match of the image with at least one of the other images is identified at 608 based on the comparison. Once a match is identified, the item identifier associated with the matched image is accessed and the submitted image is associated with the item identifier at 610. Because the item identifier identifies the item depicted in the matched image, this association effectively identifies the item depicted in the submitted image.
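The overall flow of operations 606-610 might look like the following skeleton, where `compare` is any pluggable similarity measure (such as the histogram comparison described below) that returns a smaller value for more similar images, and the repository is assumed to be an iterable of (stored image, item identifier) pairs.

```python
def identify_item(query_image, repository, compare):
    """Compare the submitted image against repository images (606), pick the
    best match (608), and return the item identifier stored with it (610)."""
    best_identifier, best_score = None, float("inf")
    for stored_image, item_identifier in repository:
        score = compare(query_image, stored_image)  # lower means more similar
        if score < best_score:
            best_identifier, best_score = item_identifier, score
    return best_identifier
```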
It should be appreciated that a single image may also include multiple items. Each item may be automatically identified or, to assist in the identification, a user may manually point to or designate an approximate location or region of each item in the image so that each item is treated separately, and the item recognition module can then focus on each designated location to identify a particular item. As a result, for example, if a user wants to list several items for sale, the user can simply take a single picture of all the items and submit the picture in the form of an image to a listing service. The listing service with the item recognition module may then automatically identify and list all the items in the submitted image for sale.
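If a single submitted picture contains several items, the designated regions could simply be cropped and run through the same identification step one at a time. The sketch below assumes Pillow images and (left, upper, right, lower) pixel boxes, and reuses the illustrative `identify_item` skeleton above; all of these are assumptions rather than elements of the disclosure.

```python
from PIL import Image  # Pillow, assumed available

def identify_items_in_regions(image: Image.Image, regions, repository, compare):
    """Identify each user-designated region of the image as a separate item."""
    identifiers = []
    for box in regions:                    # e.g., (10, 10, 200, 180)
        cropped = image.crop(box)          # focus on one designated location
        identifiers.append(identify_item(cropped, repository, compare))
    return identifiers
```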
In an embodiment, to enhance the accuracy of the subsequent item identification, a variety of different image algorithms can be applied to the images. An example is the application of an edge detection algorithm to the images at 706, in accordance with an alternative embodiment, to detect edges in the images. An image tool module included in the item recognition module, as discussed above, may apply an edge detection algorithm to detect, draw, enhance, or highlight lines, areas, or points of contrast in the image. An example is the application of a Canny edge detector algorithm to extract contrasts of the images. The contrasts effectively serve to highlight the lines, points, or areas that define the item, and the detection of these lines, points, or areas increases the probability of identifying a match between two or more images. Other examples of image algorithms that may be applied to the images include the Marching Squares algorithm and Haar wavelets.
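As one concrete (and assumed) way to carry out operation 706, OpenCV's Canny detector could be run after a light blur so that the surviving contrasts outline the item; the library choice and threshold values are illustrative.

```python
import cv2  # OpenCV, assumed available

def detect_edges(image_path: str):
    """Highlight the lines, points, and areas of contrast that define the item."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)   # suppress noise before detection
    return cv2.Canny(blurred, threshold1=100, threshold2=200)
```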
The identification of items depicted in the image can be based on identifying a match of the image with at least one of the other images accessed from the repository. In an embodiment, at 708, the images being compared are converted into color histograms, which are representations of distributions of colors in the images. The color histogram of the image is then compared with the color histograms of the other images at 710 to identify a match. As an example, a neural network module compares the color histograms to generate a statistical analysis of the comparison. The statistical analysis may identify a statistical difference or a statistical similarity between the compared color histograms, and the match is based on the resulting statistical analysis.
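A minimal sketch of operations 708 and 710 follows, again assuming OpenCV: each image is reduced to a normalized 8x8x8 color histogram, and pairs of histograms are scored with a chi-square distance, one of several reasonable statistical measures (the disclosure does not mandate a particular one).

```python
import cv2
import numpy as np

def color_histogram(image: np.ndarray) -> np.ndarray:
    """Represent the distribution of colors in an image as a normalized
    8x8x8 histogram over the three color channels (operation 708)."""
    hist = cv2.calcHist([image], [0, 1, 2], None, [8, 8, 8],
                        [0, 256, 0, 256, 0, 256])
    return cv2.normalize(hist, hist).flatten()

def histogram_difference(image_a: np.ndarray, image_b: np.ndarray) -> float:
    """Statistical difference between two color histograms (operation 710);
    a smaller value indicates a closer match."""
    return cv2.compareHist(color_histogram(image_a),
                           color_histogram(image_b),
                           cv2.HISTCMP_CHISQR)
```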
The neural network module may then return a set of statistical analyses and the associated item identifiers assigned to each set of comparisons. As an example, item identifiers can be correlated with statistical differences using name-value pairs, such as “DVD player: .00040040.” Here, the item identifier with the smallest correlated error may be the best match based, in part, on training data. As discussed previously, the neural network module can learn from training using examples from previous comparisons. As an example, if a match is identified, the image and its item identifier identified from the match may be warehoused or stored with a large group of images for training the neural network module to make the identification of items more accurate. In another example, a user can manually confirm that the identification of a particular item depicted in an image is accurate, and this confirmation may also be used to develop training for the neural network module.
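Selecting the winner from the returned name-value pairs is then a matter of taking the identifier with the smallest correlated error, as in this small sketch; the first score is the example value from the text, while the second entry is invented for illustration.

```python
def best_match(scores: dict[str, float]) -> str:
    """Given name-value pairs of item identifiers and statistical differences,
    return the identifier with the smallest correlated error."""
    return min(scores, key=scores.get)

# "DVD player" wins here; the second comparison value is hypothetical.
print(best_match({"DVD player": 0.00040040, "Blu-ray player": 0.02130000}))
```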
Once a match is identified, the item identifier associated with the matched image is accessed at 712 and associated with the image being submitted at 714. In the example above, if the item identifier “DVD player” is associated with the matched image from the repository, then the “DVD player” is associated with the image being submitted. It should be appreciated that in addition to the application of the edge detector algorithm and the comparison with other images as discussed above, other image identification processes may also be applied to identify items depicted in the image, in accordance with other embodiments of the invention.
Still referring to
An item recognition module hosted with the listing service receives, from the processing system (e.g., a mobile phone) used by the user, a request to identify the car depicted in the image. This item recognition module has the capability to identify the type of car depicted in the image 802 by identifying a match of the image 802 with at least one other image of a car. Before identification, an edge detection algorithm is applied to the image 802 to produce an image 804 that highlights the lines of the car depicted in the image 802.
As depicted in
The image 804 thereafter is compared with one or more images 851-855, which may, for example, be extracted from previous listings of cars. In this example, the image 804 is compared with each image 851, 852, 853, 854, and 855 and, for example, a statistical difference between each pair of images (e.g., 804 and 851 or 804 and 852) is generated for each comparison. In the example of
The item identifier associated with the image 852, which is identified from a parsing of the item data, is then associated with the image 802. The item recognition module then transmits the item identifier along with other requested item data (e.g., model and make) in a response to the earlier request back to the processing system used by the user. With a match, the listing service can also automatically place the listing of the car in an appropriate category and then list the car with its image 802 for sale on the website.
The machine is capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
The example processing system 900 includes a processor 902 (e.g., a central processing unit (CPU), a graphics processing unit (GPU) or both), a main memory 904, and static memory 906, which communicate with each other via bus 908. The processing system 900 may further include video display unit 910 (e.g., a plasma display, a liquid crystal display (LCD) or a cathode ray tube (CRT)). The processing system 900 also includes an alphanumeric input device 912 (e.g., a keyboard), a user interface (UI) navigation device 914 (e.g., a mouse), a disk drive unit 916, signal generation device 918 (e.g., a speaker), and network interface device 920.
The disk drive unit 916 includes machine-readable medium 922 on which is stored one or more sets of instructions and data structures 924 (e.g., software) embodying or utilized by any one or more of the methodologies or functions described herein. The instructions and data structures 924 may also reside, completely or at least partially, within main memory 904 and/or within processor 902 during execution thereof by processing system 900, with main memory 904 and processor 902 also constituting machine-readable, tangible media.
The instructions and data structures 924 may further be transmitted or received over network 926 via network interface device 920 utilizing any one of a number of well-known transfer protocols (e.g., HTTP).
While the invention(s) is (are) described with reference to various implementations and exploitations, it will be understood that these embodiments are illustrative and that the scope of the invention(s) is not limited to them. In general, techniques for identifying items depicted in images may be implemented with facilities consistent with any hardware system or hardware systems defined herein. Many variations, modifications, additions, and improvements are possible.
Plural instances may be provided for components, operations or structures described herein as a single instance. Finally, boundaries between various components, operations, and data stores are somewhat arbitrary, and particular operations are illustrated in the context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within the scope of the invention(s). In general, structures and functionality presented as separate components in the exemplary configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the invention(s).
This application claims the benefit of U.S. Provisional Application No. 61/106,916, filed Oct. 20, 2008, and of U.S. Provisional Application No. 61/033,940, filed Mar. 5, 2008, the disclosures of which are incorporated herein by reference.