Identification of items depicted in images

Information

  • Patent Grant
  • Patent Number
    11,694,427
  • Date Filed
    Wednesday, February 17, 2021
  • Date Issued
    Tuesday, July 4, 2023
Abstract
In an example embodiment, a method of identifying an item depicted in an image is provided. In this method, the image depicting the item is accessed; in addition, other images and their item identifiers are also accessed. A match of the image with one of the other images is identified. With a match, the image is then associated with an item identifier of the matched image.
Description
FIELD

The present disclosure relates generally to information retrieval. In an example embodiment, the disclosure relates to identification of items depicted in images.


BACKGROUND

Online shopping and auction websites provide a number of publishing, listing, and price-setting mechanisms whereby a seller may list or publish information concerning items for sale. A buyer can express interest in or indicate a desire to purchase such items by, for example, submitting a query to the website to search for the requested items.


The accurate matching of a query to relevant items is currently a major challenge in the field of information retrieval. An example of such a challenge is that item descriptions tend to be short and are uniquely defined by the sellers. Buyers seeking to purchase the items might use a different vocabulary from the vocabulary used by the sellers to describe the items. As an example, an item identified in the title as a “garnet” does not match a query “January birthstone” submitted by a buyer, although garnet is known as the birthstone for January. As a result, online shopping and auction websites that use a conventional search engine to locate items may not effectively connect the buyers to the sellers and vice versa.





BRIEF DESCRIPTION OF DRAWINGS

The present disclosure is illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:



FIG. 1 is a user interface diagram showing an image that depicts an item, in accordance with an embodiment, that may be submitted for identification;



FIG. 2 is a user interface diagram showing a listing of items, in accordance with an embodiment, that match the item depicted in the image of FIG. 1;



FIG. 3 is a diagram depicting a system, in accordance with an illustrative embodiment, for identifying items depicted in images;



FIG. 4 is a block diagram depicting an item recognition module, in accordance with an illustrative embodiment, included in a processing system that is configured to identify items depicted in images;



FIG. 5 is a block diagram depicting modules, in accordance with an embodiment, included in the image recognition module;



FIG. 6 is a flow diagram depicting a general overview of a method, in accordance with an embodiment, for identifying an item depicted in an image;



FIG. 7 is a flow diagram depicting a detailed method, in accordance with some embodiments, for identifying an item depicted in an image;



FIGS. 8 and 9 are diagrams depicting a method of identifying an item depicted in an image based on comparisons with other images, in accordance with an illustrative embodiment; and



FIG. 10 is a block diagram depicting a machine in the example form of a processing system within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed.





DETAILED DESCRIPTION

The description that follows includes illustrative systems, methods, techniques, instruction sequences, and computing machine program products that embody illustrative embodiments of the present invention. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide an understanding of various embodiments of the inventive subject matter. It will be evident, however, to those skilled in the art that embodiments of the inventive subject matter may be practiced without these specific details. In general, well-known instruction instances, protocols, structures and techniques have not been shown in detail.


The embodiments described herein provide techniques for identifying items depicted in images. Images depicting a variety of items are stored in a repository of, for example, a network-based publication system (e.g., an online shopping website or an online auction website). Users may submit these images for inclusion in item postings, advertisements, or other publications in the network-based publication system. As explained in more detail below, an item depicted in an image may be identified by matching the image with user-submitted images stored in the repository. In some embodiments, as explained in more detail below, the match may be based on a comparison of the color histograms of the images.



FIG. 1 is a user interface diagram showing an image 102 that depicts an item, in accordance with an embodiment, that may be submitted for identification. As depicted, the image 102 is of a painting, and a user can shop for this painting by submitting this image 102 to, for example, an online shopping website. This online shopping website can identify the particular painting depicted in the image 102 and search its inventory for the identified painting. As depicted in FIG. 2, the online shopping website finds several other paintings that match the painting depicted in the image 102 and lists these paintings for sale. As a result, rather than submitting the name or description of the painting depicted in the image 102, a user can simply submit the image 102 of the painting to, for example, the online shopping website for identification. The submission of the image 102 of the painting may therefore be faster because a user can effectively submit the painting for identification with just “one click” of a button instead of typing in a name or description of the painting. Furthermore, a user can locate the painting depicted in the image without even knowing the name of the painting. The submission process can also be more accurate because, for example, it does not depend on the user's knowledge of the painting's name, which can be erroneous.


It should be noted that the submission of an image of an item (e.g., image 102 of the painting) for identification may be used in a variety of different applications. As used herein, an “item” refers to any tangible or intangible thing and/or something that has a distinct, separate existence from other things (e.g., goods, services, electronic files, web pages, electronic documents, and land). For example, in addition to a sale of the item, a user may submit an image of the item to a price comparison service, in accordance with an embodiment of the invention. This price comparison service can identify the item depicted in the image and deliver shopping comparison results associated with the item. In another embodiment, a user can submit an image to a search engine (e.g., Internet search engine or website search engine) and the search engine can then retrieve websites or other information associated with the item depicted in the image. In yet another embodiment, a user can submit the image to an online auction website that can identify the item depicted in the image and return a template associated with the item to the user such that the user may then modify the template, if necessary, for use in auctioning the item on the online auction website. A template is an electronic file or document with descriptions and layout information. For example, a template may be a document with a predesigned, customized format and structure, such as a fax template, a letter template, or sale template, which can be readily filled in with information.



FIG. 3 is a diagram depicting a system 300, in accordance with an illustrative embodiment, for identifying items depicted in images. As depicted, the system 300 includes client processing systems (e.g., personal computer 304 and mobile phone 306), a server 310 hosting a variety of services, and another server 312 hosting an item recognition module 314, which are all interconnected by way of a computer network 302. The computer network 302 is a collection of interconnected processing systems that communicate utilizing wired or wireless mediums. Examples of computer networks, such as the computer network 302, include Local Area Networks (LANs) and/or Wide-Area Networks (WANs), such as the Internet.


In the example of FIG. 3, a client processing system (e.g., the personal computer 304 or the mobile phone 306) transmits an image of an item 309 to the item recognition module 314, which is hosted on the server 312. The image may be captured by a camera built into the mobile phone 306 or by a digital camera 308, which is configurable to download its stored images to the personal computer 304. Alternatively, the user may locate the image through, for example, the Internet or other image repositories.
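As a rough illustration of this submission step, the sketch below reads a captured image file and writes its bytes to a recognition service over HTTP. It is a minimal example only: the endpoint URL, class name, and single-POST wire format are assumptions made for illustration rather than details taken from this disclosure.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;

/** Minimal client-side sketch: read a captured image and POST its bytes to an
 *  item recognition service. The endpoint URL and payload layout are hypothetical. */
public class ImageSubmissionClient {

    public static int submitImage(Path imageFile, String serviceUrl) throws Exception {
        byte[] imageBytes = Files.readAllBytes(imageFile);   // e.g., a JPEG from the camera

        HttpURLConnection conn = (HttpURLConnection) new URL(serviceUrl).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/octet-stream");
        conn.setFixedLengthStreamingMode(imageBytes.length);

        try (OutputStream out = conn.getOutputStream()) {
            out.write(imageBytes);                            // byte array written to the server socket
        }
        return conn.getResponseCode();                        // e.g., 200 when the request was accepted
    }

    public static void main(String[] args) throws Exception {
        int status = submitImage(Path.of("item.jpg"),
                "http://example.com/item-recognition");       // hypothetical endpoint
        System.out.println("Server responded with HTTP " + status);
    }
}
```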


The item recognition module 314 accesses the image from the client processing systems and, as explained in more detail below, identifies the item 309 depicted in the image with an item identifier. An “item identifier,” as used herein, refers to a variety of values (e.g., alphanumeric characters and symbols) that establish the identity of or uniquely identify one or more items, such as the item 309. For example, the item identifier can be a name assigned to the item 309. In another example, the item identifier can be a barcode value (e.g., a Universal Product Code (UPC)) assigned to the item 309. In yet another example, the item identifier can be a title or description assigned to the item 309.


In an embodiment, the item recognition module 314 may then transmit the item identifier to a service hosted on the server 310 to locate item data. The “item data,” as used herein, refer to a variety of data regarding one or more items depicted in an image that are posted or associated with the image. Such item data, for example, may be stored with the images or at other locations. Examples of item data include titles included in item listings, descriptions of items included in item listings, locations of the items, prices of the items, quantities of the items, availability of the items, a count of the items, templates associated with the items, and other item data. The type of item data requested by the item recognition module 314 depends on the type of service being accessed. Examples of services include online auction websites, online shopping websites, and Internet search engines (or website search engines). It should be appreciated that the item recognition module 314 may access a variety of different services by way of, for example, a Web-exposed application program interface (API). In an alternate embodiment, the item recognition module 314 may be embodied with the service itself where, for example, the item recognition module 314 may be hosted in the server 310 with the other services.


The system 300 may also include a global positioning system (not shown) that may be attached to or included in the client processing systems. The client processing systems can transmit the coordinates or location identified by the global positioning system to the services hosted on server 310 and, for example, the services can use the coordinates to locate nearby stores that sell the item 309 depicted in the image.



FIG. 4 is a block diagram depicting an item recognition module 314, in accordance with an illustrative embodiment, included in a processing system 402 that is configured to identify items depicted in images. It should be appreciated that the processing system 402 may be deployed in the form of a variety of computing devices, such as personal computers, laptop computers, server computers, and other computing devices. For example, the processing system 402 may be the server 310 or 312 or the personal computer 304 depicted in FIG. 3. In various embodiments, the processing system 402 may be used to implement computer programs, logic, applications, methods, processes, or other software to identify items depicted in images, as described in more detail below.


The processing system 402 is configured to execute an operating system 404 that manages the software processes and/or services executing on the processing system 402. As depicted in FIG. 4, these software processes and/or services include the item recognition module 314. Generally, the item recognition module 314 is configured to identify one or more items depicted in an image. The item recognition module 314 may include a request handler module 410, an image recognition module 412, and a hosting module 414.


The request handler module 410 is configured to interface with other processing systems, such as the client processing systems 304 and 306 of FIG. 3. This interface may include the receipt of messages and data from other processing systems by way of the Hypertext Transfer Protocol (HTTP) or other protocols, as well as the transmission of messages and data from the item recognition module 314 to other processing systems by way of HTTP. Referring to FIG. 4, another processing system in communication with the item recognition module 314 may convert an image into a byte array and open a remote HTTP request to the item recognition module 314. The byte array is written to a server socket using, for example, an HTTP POST, and a separate HTTP GET request may be sent that includes the global positioning system coordinates of the processing system, if available. The request handler module 410 receives the byte array and converts it into, for example, a Java image object that is then processed by the image recognition module 412.
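A minimal sketch of the conversion step described above is shown below, assuming the payload arrives as a plain byte array; the class and method names are illustrative and are not defined in this disclosure.

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import javax.imageio.ImageIO;

/** Sketch of the request handler's conversion step: raw bytes received over HTTP
 *  are turned into a Java image object for the image recognition module. */
public final class RequestHandler {

    /** Decode the POSTed byte array into a BufferedImage, or throw if the payload
     *  is not a supported image format (JPEG, PNG, GIF, ...). */
    public static BufferedImage toImage(byte[] payload) throws IOException {
        BufferedImage image = ImageIO.read(new ByteArrayInputStream(payload));
        if (image == null) {
            throw new IOException("Payload is not a decodable image");
        }
        return image;  // handed off to the image recognition module
    }
}
```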


The image recognition module 412 is configured to identify one or more items depicted in an image by comparing the received image with other images of items to identify a match, which is explained in more detail below. The hosting module 414 is configured to interface with other services, which are discussed above. As an example, the image recognition module 412 may transmit a request to a service by way of the hosting module 414 for item data associated with the identified items. This request may include an item identifier, global positioning coordinates, and other information. In turn, the item recognition module 314 receives the requested item data from the service by way of the hosting module 414. The request handler module 410 may then parse the item data from the service into, for example, a lightweight eXtensible Markup Language (XML) for mobile devices and may transmit the response back to the processing systems that originally requested the item data regarding the items depicted in the image.


It should be appreciated that in other embodiments, the processing system 402 may include fewer, more, or different modules apart from those shown in FIG. 4. For example, the image recognition module 412 may be further split into an image tools module and a neural network module, which are explained in more detail below.



FIG. 5 is a block diagram depicting modules 502, 504, 506, and 508, in accordance with an embodiment, included in the image recognition module 412. As depicted, the image recognition module 412 includes another request handler module 502, a harvester module 504, an image tools module 506, and a neural network module 508. In general, this other request handler module 502 is configured to process requests made to the image recognition module 412. The image tools module 506 is configured to process the images using one or more image processing algorithms, such as an edge detection algorithm, which is described in more detail below.


Generally, the neural network module 508 is configured to identify one or more items depicted in an image through learning and training. As an example, the neural network module 508 can identify matches between images based on learning algorithms. It should be appreciated that a neural network is a type of computer system that is based generally on the parallel architecture of animal brains and can learn by example. As explained in more detail below, the neural network module 508 gathers representative data and then invokes learning algorithms to learn automatically the structure of the data. A Java Object Oriented Neural Engine is an example of a neural network module 508. Other examples of neural network modules include Feed-Forward Neural Networks, Recurrent Neural Networks (e.g., Elman and Jordan networks), Time Delay Neural Networks, Standard Back-Propagation Neural Networks (e.g., gradient descent, on-line, and batch), Resilient Back-Propagation (RPROP) Neural Networks, Kohonen Self-Organizing Maps (with WTA or Gaussian output maps), Principal Component Analysis, and Modular Neural Networks.


The harvester module 504 is configured to request item data from a service by way of, for example, an API. As described in more detail below, the harvester module 504 may then parse the item data to identify item identifiers and associate the item identifiers with an image.



FIG. 6 is a flow diagram depicting a general overview of a method 600, in accordance with an embodiment, for identifying an item depicted in an image. In an embodiment, the method 600 may be implemented by the item recognition module 314 and employed in the processing system 402 of FIG. 4. As depicted in FIG. 6, an image depicting an item is accessed at 602. This image may be submitted by a user to identify the item depicted in the image. Additionally, one or more other images and their associated item identifiers, which identify the items depicted in these other images, are accessed at 604. These images and item identifiers may be from user-submitted item postings and are stored in and accessed from a repository of, for example, a network-based publication system. For example, a large number of users place or sell items on an auction website and, when placing or selling these items, the users would submit images and descriptions of the items. All these images and their descriptions, which may be used as item identifiers, may be stored in the repository and are accessible by the item recognition module.


A variety of image identification techniques may be applied to identify the item depicted in the image. As an example, the identification can be based on identifying a match of the image with one of the other images accessed from the repository. In this embodiment, the image is compared with other images at 606, and a match of the image with at least one of the other images is identified at 608 based on the comparison. Once a match is identified, the item identifier associated with the matched image is accessed and the submitted image is associated with the item identifier at 610. Since the item identifier identifies the item depicted in the image, the association effectively results in the identification of the item depicted in the image.
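The following sketch outlines operations 602-610 as a simple nearest-match loop. The repository layout (a map from stored images to their item identifiers) and the pluggable difference measure are assumptions made for illustration; the disclosure does not prescribe a particular data structure or measure.

```java
import java.awt.image.BufferedImage;
import java.util.Map;
import java.util.function.ToDoubleBiFunction;

/** High-level sketch of method 600: compare a submitted image against stored images
 *  and adopt the item identifier of the closest match. */
public final class ItemIdentification {

    /**
     * @param submitted  image accessed at 602
     * @param repository stored images mapped to their item identifiers (604)
     * @param difference non-negative difference score; smaller means more similar
     * @return the item identifier associated with the best-matching image (610),
     *         or null if the repository is empty
     */
    public static String identify(BufferedImage submitted,
                                  Map<BufferedImage, String> repository,
                                  ToDoubleBiFunction<BufferedImage, BufferedImage> difference) {
        String bestIdentifier = null;
        double bestScore = Double.POSITIVE_INFINITY;
        for (Map.Entry<BufferedImage, String> entry : repository.entrySet()) {
            double score = difference.applyAsDouble(submitted, entry.getKey()); // comparison at 606
            if (score < bestScore) {                                            // match selection at 608
                bestScore = score;
                bestIdentifier = entry.getValue();
            }
        }
        return bestIdentifier;
    }
}
```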


It should be appreciated that a single image may also include multiple items. Each item may be automatically identified or, to assist in the identification, a user may manually point to or designate an approximate location or region of each item in the image as separate items, and the item recognition module can then focus on each designated location to identify a particular item. As a result, for example, if a user wants to list several items for sale, the user can simply take a single picture of all the items and submit the picture in the form of an image to a listing service. The listing service with the item recognition module may then automatically identify and list all the items in the submitted image for sale.



FIG. 7 is a flow diagram depicting a detailed method 700, in accordance with another embodiment, for identifying an item depicted in an image. In the method 700, a request is received to identify an item depicted in an image at 702. This request may, for example, be received from a client processing system and includes an image submitted by a user. Additionally, one or more other images and their associated item identifiers are accessed at 704 from, for example, a repository of a network-based publication system.


In an embodiment, to enhance the accuracy of the subsequent item identification, a variety of different image algorithms can be applied to the images. An example is the application of an edge detection algorithm to the images at 706, in accordance with an alternative embodiment, to detect edges in the images. An image tools module included in the item recognition module, as discussed above, may apply an edge detection algorithm to detect, draw, enhance, or highlight lines, areas, or points of contrast in the image. An example is the application of a Canny edge detector algorithm to extract contrasts of the images. The contrasts effectively serve to highlight the lines, points, or areas that define the item, and the detection of these lines, points, or areas increases the probability of identifying a match between two or more images. Other examples of image algorithms that may be applied to the images include the Marching Squares algorithm and Haar wavelets.
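For illustration, the sketch below highlights lines and areas of contrast with a plain Sobel gradient-magnitude pass. It is a simplified stand-in, not the Canny detector named above, which additionally applies smoothing, non-maximum suppression, and hysteresis thresholding on top of these gradients.

```java
import java.awt.image.BufferedImage;

/** Simplified edge highlighting in the spirit of operation 706. */
public final class EdgeTool {

    public static BufferedImage edges(BufferedImage src) {
        int w = src.getWidth(), h = src.getHeight();
        BufferedImage out = new BufferedImage(w, h, BufferedImage.TYPE_BYTE_GRAY);
        for (int y = 1; y < h - 1; y++) {
            for (int x = 1; x < w - 1; x++) {
                // Horizontal and vertical Sobel responses on the luma channel.
                int gx = -luma(src, x - 1, y - 1) + luma(src, x + 1, y - 1)
                       - 2 * luma(src, x - 1, y) + 2 * luma(src, x + 1, y)
                       - luma(src, x - 1, y + 1) + luma(src, x + 1, y + 1);
                int gy = -luma(src, x - 1, y - 1) - 2 * luma(src, x, y - 1) - luma(src, x + 1, y - 1)
                       + luma(src, x - 1, y + 1) + 2 * luma(src, x, y + 1) + luma(src, x + 1, y + 1);
                int mag = Math.min(255, (int) Math.hypot(gx, gy));
                out.setRGB(x, y, (mag << 16) | (mag << 8) | mag); // bright pixels mark strong contrast
            }
        }
        return out;
    }

    private static int luma(BufferedImage img, int x, int y) {
        int rgb = img.getRGB(x, y);
        int r = (rgb >> 16) & 0xFF, g = (rgb >> 8) & 0xFF, b = rgb & 0xFF;
        return (int) (0.299 * r + 0.587 * g + 0.114 * b);
    }
}
```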


The identification of items depicted in the image can be based on identifying a match of the image with at least one of the other images accessed from the repository. In an embodiment, at 708, the images being compared are converted into color histograms, which are representations of distributions of colors in the images. The color histogram of the image is then compared with the color histograms of the other images at 710 to identify a match. As an example, a neural network module compares the color histograms to generate a statistical analysis of the comparison. The statistical analysis may identify a statistical difference or a statistical similarity between the compared color histograms, and the match is based on the resulting statistical analysis.
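A minimal sketch of operations 708 and 710 follows. The 4x4x4 RGB binning and the L1 (sum-of-absolute-differences) measure are illustrative choices; the disclosure only requires that some statistical difference or similarity be computed between the histograms.

```java
import java.awt.image.BufferedImage;

/** Sketch of operations 708-710: reduce each image to a coarse RGB color histogram
 *  and score a pair of images by the L1 distance between their normalized histograms. */
public final class ColorHistograms {

    private static final int BINS = 4; // 4 bins per channel -> 64-dimensional histogram

    public static double[] histogram(BufferedImage img) {
        double[] hist = new double[BINS * BINS * BINS];
        int w = img.getWidth(), h = img.getHeight();
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int rgb = img.getRGB(x, y);
                int r = ((rgb >> 16) & 0xFF) * BINS / 256;
                int g = ((rgb >> 8) & 0xFF) * BINS / 256;
                int b = (rgb & 0xFF) * BINS / 256;
                hist[(r * BINS + g) * BINS + b]++;
            }
        }
        double total = (double) w * h;
        for (int i = 0; i < hist.length; i++) {
            hist[i] /= total; // normalize so differently sized images are comparable
        }
        return hist;
    }

    /** Smaller values mean more similar color distributions (0 = identical). */
    public static double l1Distance(double[] a, double[] b) {
        double sum = 0.0;
        for (int i = 0; i < a.length; i++) {
            sum += Math.abs(a[i] - b[i]);
        }
        return sum;
    }
}
```

A measure such as l1Distance, applied to the histograms of two images, could serve as the difference measure passed to the matching loop sketched earlier.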


The neural network module may then return a set of statistical analyses and the associated item identifiers assigned to each comparison. As an example, item identifiers can be correlated with statistical differences using name-value pairs, such as “DVD player: 0.00040040.” Here, the item identifier with the smallest correlated error may be the best match based, in part, on training data. As discussed previously, the neural network module can learn from training using examples from previous comparisons. As an example, if a match is identified, the image and the item identifier identified from the match may be warehoused or stored with a large group of images for training the neural network module to make the identification of items more accurate. In another example, a user can manually confirm that a particular item depicted in an image has been accurately identified, and this confirmation may also be used to develop training for the neural network module.


Once a match is identified, the item identifier associated with the matched image is accessed at 712 and associated with the image being submitted at 714. In the example above, if the item identifier “DVD player” is associated with the matched image from the repository, then the “DVD player” is associated with the image being submitted. It should be appreciated that in addition to the application of the edge detector algorithm and the comparison with other images as discussed above, other image identification processes may also be applied to identify items depicted in the image, in accordance with other embodiments of the invention.


Still referring to FIG. 7, a template associated with the item identifier is accessed at 716, in accordance with an embodiment of the invention. The template may be a pre-built template stored in a data structure and associated with a particular item or item identifier. For example, this template may already include descriptions and attributes of an associated item. The template is then transmitted at 718 in a response to the request. As an example, the template is included in a response and this response is transmitted back to the client processing system that initially requested the identification.



FIGS. 8 and 9 are diagrams depicting a method of identifying an item depicted in an image based on comparisons with other images, in accordance with an illustrative embodiment. As depicted in FIG. 8, a user takes a picture of a car using his mobile phone and submits this picture, in the form of an image 802, to, for example, a listing service that sells cars. Alternatively, the user may take a video of the car and submit one or more frames from the video to the listing service.


An item recognition module hosted with the listing service receives a request to identify the car depicted in the image from the processing system (e.g., a mobile phone) used by the user. This item recognition module has the capability to identify the type of car depicted in the image 802 by identifying a match of the image 802 with at least one other image of a car. Before identification, an edge detection algorithm is applied to the image 802 to produce an image 804 that highlights the lines of the car depicted in the image 802.


As depicted in FIG. 9, a number of other images 851-855 of cars and their associated item data are accessed. In this embodiment, the item identifiers associated with the images 851-855 are not immediately available and instead, the item identifiers are derived from item data associated with the images 851-855. In an embodiment, the item recognition module accesses the item data associated with one or more images 851-855 and then parses the item data to identify one or more item identifiers, which, for example, a user may define as a title or barcode value of an item.
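As a rough illustration of deriving an item identifier from item data, the sketch below pulls a listing title out of a small XML fragment. The XML layout and the sample listing are hypothetical; actual item data formats are service-specific.

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

/** Sketch of parsing item data to identify an item identifier, as described
 *  for images 851-855. The <item>/<title> layout is an assumption. */
public final class ItemDataParser {

    /** Returns the listing title to use as the item identifier, or null if absent. */
    public static String titleOf(String itemDataXml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(itemDataXml.getBytes(StandardCharsets.UTF_8)));
        NodeList titles = doc.getElementsByTagName("title");
        return titles.getLength() > 0 ? titles.item(0).getTextContent() : null;
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical item data from a previous listing.
        String itemData = "<item><title>2006 Honda Civic LX</title><price>6500</price></item>";
        System.out.println(titleOf(itemData)); // prints: 2006 Honda Civic LX
    }
}
```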


The image 804 thereafter is compared with one or more of the images 851-855, which may, for example, be extracted from previous listings of cars. In this example, the image 804 is compared with each image 851, 852, 853, 854, and 855 and, for example, a statistical difference between each pair of images (e.g., 804 and 851 or 804 and 852) is generated for each comparison. In the example of FIG. 9, the comparison of the image 804 with the image 852 yields the lowest statistical difference. As a result, a match of the image 804 with the image 852 is identified.


The item identifier associated with the image 852, which is identified from a parsing of the item data, is then associated with the image 802. The item recognition module then transmits the item identifier along with other requested item data (e.g., model and make) in a response to the earlier request back to the processing system used by the user. With a match, the listing service can also automatically place the listing of the car in an appropriate category and then list the car with its image 802 for sale on the website.



FIG. 10 is a block diagram of a machine in the example form of a processing system 900 within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed. In alternative embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. Embodiments may also, for example, be deployed by Software-as-a-Service (SaaS), Application Service Provider (ASP), or utility computing providers, in addition to being sold or licensed via traditional channels.


The machine is capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.


The example processing system 900 includes a processor 902 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), a main memory 904, and a static memory 906, which communicate with each other via a bus 908. The processing system 900 may further include a video display unit 910 (e.g., a plasma display, a liquid crystal display (LCD), or a cathode ray tube (CRT)). The processing system 900 also includes an alphanumeric input device 912 (e.g., a keyboard), a user interface (UI) navigation device 914 (e.g., a mouse), a disk drive unit 916, a signal generation device 918 (e.g., a speaker), and a network interface device 920.


The disk drive unit 916 includes a machine-readable medium 922 on which are stored one or more sets of instructions and data structures 924 (e.g., software) embodying or utilized by any one or more of the methodologies or functions described herein. The instructions and data structures 924 may also reside, completely or at least partially, within the main memory 904 and/or within the processor 902 during execution thereof by the processing system 900, with the main memory 904 and the processor 902 also constituting machine-readable, tangible media.


The instructions and data structures 924 may further be transmitted or received over a network 926 via the network interface device 920 utilizing any one of a number of well-known transfer protocols (e.g., HTTP).


While the invention(s) is (are) described with reference to various implementations and exploitations, it will be understood that these embodiments are illustrative and that the scope of the invention(s) is not limited to them. In general, techniques for identifying items depicted in images may be implemented with facilities consistent with any hardware system or hardware systems defined herein. Many variations, modifications, additions, and improvements are possible.


Plural instances may be provided for components, operations or structures described herein as a single instance. Finally, boundaries between various components, operations, and data stores are somewhat arbitrary, and particular operations are illustrated in the context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within the scope of the invention(s). In general, structures and functionality presented as separate components in the exemplary configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the invention(s).

Claims
  • 1. A method comprising: receiving a request to sell an item, the request including a first image of the item; determining, for a plurality of other images, a respective difference between a first color histogram of the first image and a plurality of second color histograms of one or more images of the plurality of other images; selecting, by one or more processors, one of the second color histograms most similar to the first color histogram based on the respective differences; identifying, by one or more processors, a second image of the plurality of other images that is associated with the selected second color histogram, the second image included in an item listing, the item listing further including an item description; determining an identifier associated with the identified second image; identifying a template based on the identifier; and transmitting the template to a client device based on identifying the template and in response to receiving the request to sell the item.
  • 2. The method of claim 1, further comprising determining a statistical difference between the first color histogram and one or more color histograms from the plurality of second color histograms, wherein the selected second color histogram is selected based upon the statistical difference.
  • 3. The method of claim 2, wherein the statistical difference is determined using a neural network.
  • 4. The method of claim 1, further comprising determining a statistical similarity between the first color histogram and one or more color histograms from the plurality of second color histograms.
  • 5. The method of claim 4, wherein the statistical similarity is determined using a neural network.
  • 6. The method of claim 5, wherein the selected second color histogram is selected based upon the statistical similarity.
  • 7. The method of claim 6, wherein the selected second color histogram has a highest statistical similarity with the first color histogram.
  • 8. A system comprising: at least one processor; and memory storing computer executable instructions that, when executed by the at least one processor, cause the system to perform operations comprising: receiving a request to sell an item, the request including a first image of the item; determining, for a plurality of other images, a respective difference between a first color histogram of the first image and a plurality of second color histograms of one or more images of the plurality of other images; selecting one of the second color histograms most similar to the first color histogram based on the respective differences; identifying a second image of the plurality of other images that is associated with the selected second color histogram, the second image included in an item listing, the item listing further including an item description; determining an identifier associated with the identified second image; identifying a template based on the identifier; and transmitting the template to a client device based on identifying the template and in response to receiving the request to sell the item.
  • 9. The system of claim 8, wherein the operations further comprise determining a statistical difference between the first color histogram and one or more color histograms from the plurality of second color histograms.
  • 10. The system of claim 9, wherein the statistical difference is determined using a neural network.
  • 11. The system of claim 10, wherein the selected second color histogram is selected based upon the statistical difference.
  • 12. The system of claim 11, wherein the selected second color histogram has a lowest statistical difference with the first color histogram.
  • 13. The system of claim 8, wherein the operations further comprise determining a statistical similarity between the first color histogram and one or more color histograms from the plurality of second color histograms.
  • 14. The system of claim 13, wherein the statistical similarity is determined using a neural network.
  • 15. The system of claim 14, wherein the selected second color histogram is selected based upon the statistical similarity.
  • 16. The system of claim 15, wherein the selected second color histogram has a highest statistical similarity with the first color histogram.
  • 17. A non-transitory machine-readable medium storing instructions that, when executed by one or more processors, cause a system to perform operations comprising: receiving a request to sell an item, the request including a first image of the item; determining, for a plurality of other images, a respective difference between a first color histogram of the first image and a plurality of second color histograms of one or more images of the plurality of other images; selecting one of the second color histograms most similar to the first color histogram based on the respective differences; identifying a second image of the plurality of other images that is associated with the selected second color histogram, the second image included in an item listing, the item listing further including an item description; determining an identifier associated with the identified second image; identifying a template based on the identifier; and transmitting the template to a client device based on identifying the template and in response to receiving the request to sell the item.
  • 18. The non-transitory machine-readable medium of claim 17, wherein the operations further comprise determining a statistical difference between the first color histogram and one or more color histograms from the plurality of second color histograms, wherein the selected second color histogram is selected based upon the statistical difference.
  • 19. The non-transitory machine-readable medium of claim 18, wherein the statistical difference is determined using a neural network.
  • 20. The non-transitory machine-readable medium of claim 17, wherein the operations further comprise determining a statistical similarity between the first color histogram and one or more color histograms from the plurality of second color histograms, wherein the selected second color histogram is selected based upon the statistical similarity.
CROSS-REFERENCES TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 15/337,899, filed on Oct. 28, 2016, which application is a continuation of U.S. patent application Ser. No. 12/371,882, filed on Feb. 16, 2009, issued as U.S. Pat. No. 9,495,386, which claims the benefit of priority to U.S. Provisional Patent Application Ser. No. 61/106,916, filed Oct. 20, 2008, and U.S. Provisional Patent Application Ser. No. 61/033,940, filed Mar. 5, 2008, the benefit of priority of each of which is claimed hereby, and each of which is incorporated by reference herein in its entirety.

US Referenced Citations (515)
Number Name Date Kind
3675215 Arnold et al. Jul 1972 A
4539585 Spackova et al. Sep 1985 A
4596144 Panton et al. Jun 1986 A
4753079 Sumitomo Jun 1988 A
5068723 Dixit et al. Nov 1991 A
5408417 Wilder Apr 1995 A
5546475 Bolle et al. Aug 1996 A
5579471 Barber et al. Nov 1996 A
5633678 Parulski et al. May 1997 A
5692012 Virtamo et al. Nov 1997 A
5724660 Kauser et al. Mar 1998 A
5727379 Cohn Mar 1998 A
5732354 MacDonald Mar 1998 A
5781899 Hirata Jul 1998 A
5802361 Wang et al. Sep 1998 A
5818964 Itoh Oct 1998 A
5848373 DeLorme et al. Dec 1998 A
5870149 Comroe et al. Feb 1999 A
5889896 Meshinsky et al. Mar 1999 A
5890068 Fattouche et al. Mar 1999 A
5949429 Bonneau et al. Sep 1999 A
6069570 Herring May 2000 A
6091956 Hollenberg Jul 2000 A
6097958 Bergen Aug 2000 A
6112226 Weaver et al. Aug 2000 A
6134548 Gottsman et al. Oct 2000 A
6134674 Akasheh Oct 2000 A
6151587 Matthias Nov 2000 A
6154738 Call Nov 2000 A
6157435 Slater et al. Dec 2000 A
6157841 Bolduc et al. Dec 2000 A
6167274 Smith Dec 2000 A
6198927 Wright et al. Mar 2001 B1
6204812 Fattouche Mar 2001 B1
6208297 Fattouche et al. Mar 2001 B1
6208857 Agre et al. Mar 2001 B1
6216134 Heckerman et al. Apr 2001 B1
6216227 Goldstein et al. Apr 2001 B1
6243588 Koorapaty et al. Jun 2001 B1
6246861 Messier et al. Jun 2001 B1
6246882 Lachance Jun 2001 B1
6259381 Small Jul 2001 B1
6259923 Lim et al. Jul 2001 B1
6266014 Fattouche et al. Jul 2001 B1
6278446 Liou et al. Aug 2001 B1
6292593 Nako et al. Sep 2001 B1
6314365 Smith Nov 2001 B1
6317684 Roeseler et al. Nov 2001 B1
6330452 Fattouche et al. Dec 2001 B1
6347230 Koshima et al. Feb 2002 B2
6356543 Hall et al. Mar 2002 B2
6404388 Sollenberger et al. Jun 2002 B1
6341255 Lapidot Jul 2002 B1
6424840 Fitch et al. Jul 2002 B1
6434530 Sloane et al. Aug 2002 B1
6456852 Bar et al. Sep 2002 B2
6463426 Lipson et al. Oct 2002 B1
6477269 Brechner Nov 2002 B1
6477363 Ayoub et al. Nov 2002 B1
6483570 Slater et al. Nov 2002 B1
6484130 Dwyer et al. Nov 2002 B2
6512919 Ogasawara Jan 2003 B2
6519463 Tendler Feb 2003 B2
6530521 Henry Mar 2003 B1
6549913 Murakawa Apr 2003 B1
6563459 Takenaga May 2003 B2
6563959 Troyanker May 2003 B1
6567797 Schuetze et al. May 2003 B1
6577946 Myr Jun 2003 B2
6580914 Smith Jun 2003 B1
6587835 Treyz et al. Jul 2003 B1
6589290 Maxwell et al. Jul 2003 B1
6590533 Sollenberger et al. Jul 2003 B2
6618593 Drutman et al. Sep 2003 B1
6625457 Raith Sep 2003 B1
6642929 Essafi et al. Nov 2003 B1
6690322 Shamoto et al. Feb 2004 B2
6714797 Rautila Mar 2004 B1
6714945 Foote et al. Mar 2004 B1
6724930 Kosaka et al. Apr 2004 B1
6732080 Blants May 2004 B1
6763148 Sternberg et al. Jul 2004 B1
6783148 Henderson Aug 2004 B2
6804662 Annau et al. Oct 2004 B1
6807479 Watanabe et al. Oct 2004 B2
6901379 Balter et al. May 2005 B1
6947571 Rhoads et al. Sep 2005 B1
7022281 Senft Apr 2006 B1
7023441 Choi et al. Apr 2006 B2
7062722 Carlin et al. Jun 2006 B1
7082365 Sheha et al. Jul 2006 B2
7092702 Cronin et al. Aug 2006 B2
7130466 Seeber Oct 2006 B2
7130622 Vanska et al. Oct 2006 B2
7138913 Mackenzie et al. Nov 2006 B2
7142858 Aoki et al. Nov 2006 B2
7149665 Feld et al. Dec 2006 B2
7162082 Edwards Jan 2007 B2
7164986 Humphries et al. Jan 2007 B2
7199815 Aoyama Apr 2007 B2
7240025 Stone et al. Jul 2007 B2
7254388 Nam et al. Aug 2007 B2
7254779 Rezvani et al. Aug 2007 B1
7257268 Eichhorn et al. Aug 2007 B2
7273172 Olsen, III et al. Sep 2007 B2
7281018 Begun et al. Oct 2007 B1
7346453 Matsuoka Mar 2008 B2
7346543 Edmark Mar 2008 B1
7347373 Singh Mar 2008 B2
7363214 Musgrove et al. Apr 2008 B2
7363252 Fujimoto Apr 2008 B2
7385499 Horton et al. Jun 2008 B2
7460735 Rowley et al. Dec 2008 B1
7478143 Friedman et al. Jan 2009 B1
7495674 Biagiotti et al. Feb 2009 B2
7502133 Fukunaga et al. Mar 2009 B2
7519562 Vander et al. Apr 2009 B1
7568004 Gottfried Jul 2009 B2
7587359 Levy et al. Sep 2009 B2
7593602 Stentiford Sep 2009 B2
7669759 Zettner Mar 2010 B1
7683858 Allen et al. Mar 2010 B2
7702185 Keating et al. Apr 2010 B2
7747259 Pande et al. Jun 2010 B2
7752082 Calabria Jul 2010 B2
7756757 Oakes, III Jul 2010 B1
7761339 Alivandi Jul 2010 B2
7761340 Yee et al. Jul 2010 B2
7801893 Gulli et al. Sep 2010 B2
7827074 Rolf Nov 2010 B1
7848764 Riise et al. Dec 2010 B2
7848765 Phillips et al. Dec 2010 B2
7881560 John Feb 2011 B2
7890386 Reber Feb 2011 B1
7916129 Lin et al. Mar 2011 B2
7921040 Reber Apr 2011 B2
7933811 Reber Apr 2011 B2
7948481 Vilcovsky May 2011 B2
7957510 Denney et al. Jun 2011 B2
7996282 Scott et al. Aug 2011 B1
8078498 Edmark Dec 2011 B2
8095428 Penagulur et al. Jan 2012 B2
8098894 Soderstrom Jan 2012 B2
8130242 Cohen Mar 2012 B2
8131118 Jing et al. Mar 2012 B1
8230016 Pattan et al. Jul 2012 B1
8233723 Sundaresan Jul 2012 B2
8239130 Upstill et al. Aug 2012 B1
8260846 Lahav Sep 2012 B2
8275590 Szymczyk et al. Sep 2012 B2
8326315 Phillips et al. Dec 2012 B2
8370062 Starenky et al. Feb 2013 B1
8385646 Lang et al. Feb 2013 B2
8411977 Baluja et al. Apr 2013 B1
8421872 Neven, Sr. Apr 2013 B2
8442871 Veres et al. May 2013 B2
8547401 Mallinson et al. Oct 2013 B2
8650072 Mason et al. Feb 2014 B2
8811957 Jovicic et al. Aug 2014 B2
8825660 Chittar Sep 2014 B2
8868443 Yankovich et al. Oct 2014 B2
8909248 Phillips et al. Dec 2014 B2
9037600 Garrigues et al. May 2015 B1
9058764 Persson et al. Jun 2015 B1
9164577 Tapley et al. Oct 2015 B2
9240059 Zises Jan 2016 B2
9251395 Botchen Feb 2016 B1
9336541 Pugazhendhi et al. May 2016 B2
9449342 Sacco Sep 2016 B2
9495386 Tapley et al. Nov 2016 B2
9530059 Zises Dec 2016 B2
9953350 Pugazhendhi et al. Apr 2018 B2
10127606 Tapley et al. Nov 2018 B2
10147134 Sacco Dec 2018 B2
10210659 Tapley et al. Feb 2019 B2
10846766 Govande et al. Nov 2020 B2
10936650 Grandhi et al. Mar 2021 B2
10956775 Tapley et al. Mar 2021 B2
20010034668 Whitworth Oct 2001 A1
20010049636 Hudda et al. Dec 2001 A1
20010055976 Crouch et al. Dec 2001 A1
20020002504 Engel et al. Jan 2002 A1
20020027694 Kim et al. Mar 2002 A1
20020052709 Akatsuka et al. May 2002 A1
20020072993 Sandus et al. Jun 2002 A1
20020094189 Navab et al. Jul 2002 A1
20020102967 Chang et al. Aug 2002 A1
20020107737 Kaneko et al. Aug 2002 A1
20020116286 Walker et al. Aug 2002 A1
20020143930 Babu et al. Oct 2002 A1
20020145984 Babu et al. Oct 2002 A1
20020146176 Meyers Oct 2002 A1
20020155844 Rankin et al. Oct 2002 A1
20020196333 Gorischek Dec 2002 A1
20030004802 Callegari Jan 2003 A1
20030018652 Heckerman et al. Jan 2003 A1
20030028873 Lemmons Feb 2003 A1
20030051255 Bulman et al. Mar 2003 A1
20030053706 Hong et al. Mar 2003 A1
20030063128 Salmimaa et al. Apr 2003 A1
20030065805 Melvin Apr 2003 A1
20030080978 Navab et al. May 2003 A1
20030085894 Tatsumi May 2003 A1
20030098892 Hiipakka May 2003 A1
20030101105 Vock May 2003 A1
20030112260 Gouzu Jun 2003 A1
20030123026 Abitbol et al. Jul 2003 A1
20030125043 Silvester Jul 2003 A1
20030126150 Chan Jul 2003 A1
20030130787 Clapper Jul 2003 A1
20030130910 Pickover et al. Jul 2003 A1
20030134645 Stern et al. Jul 2003 A1
20030139190 Steelberg et al. Jul 2003 A1
20030147623 Fletcher Aug 2003 A1
20030195044 Narita Oct 2003 A1
20030197740 Reponen Oct 2003 A1
20030208409 Mault Nov 2003 A1
20030216960 Postrel Nov 2003 A1
20030220835 Barnes, Jr. Nov 2003 A1
20030229537 Dunning et al. Dec 2003 A1
20030231806 Troyanker Dec 2003 A1
20040002359 Deas et al. Jan 2004 A1
20040019643 Robert Jan 2004 A1
20040021567 Dunn Feb 2004 A1
20040043773 Lee et al. Mar 2004 A1
20040046779 Asano et al. Mar 2004 A1
20040057627 Abe et al. Mar 2004 A1
20040075670 Bezine et al. Apr 2004 A1
20040096096 Huber May 2004 A1
20040128320 Grove Jul 2004 A1
20040133927 Sternberg et al. Jul 2004 A1
20040144338 Goldman Jul 2004 A1
20040153505 Verdi et al. Aug 2004 A1
20040192339 Wilson et al. Sep 2004 A1
20040192349 Reilly Sep 2004 A1
20040203901 Wilson et al. Oct 2004 A1
20040203931 Karaoguz Oct 2004 A1
20040205286 Bryant et al. Oct 2004 A1
20040220767 Tanaka et al. Nov 2004 A1
20040220821 Ericsson et al. Nov 2004 A1
20040230558 Tokunaka Nov 2004 A1
20050001852 Dengler et al. Jan 2005 A1
20050004850 Gutbrod Jan 2005 A1
20050010486 Pandhe Jan 2005 A1
20050015300 Smith et al. Jan 2005 A1
20050065655 Hong et al. Mar 2005 A1
20050081161 Macinnes et al. Apr 2005 A1
20050084154 Li et al. Apr 2005 A1
20050091597 Ackley Apr 2005 A1
20050151743 Sitrick Jul 2005 A1
20050151963 Pulla et al. Jul 2005 A1
20050159883 Humphries et al. Jul 2005 A1
20050162419 Kim et al. Jul 2005 A1
20050162523 Darrell et al. Jul 2005 A1
20050171864 Nakade et al. Aug 2005 A1
20050182792 Israel et al. Aug 2005 A1
20050193006 Bandas Sep 2005 A1
20050222987 Vadon Oct 2005 A1
20050240512 Quintero et al. Oct 2005 A1
20050250516 Shim Nov 2005 A1
20050278749 Ewert et al. Dec 2005 A1
20050283379 Reber Dec 2005 A1
20060004646 Schoen et al. Jan 2006 A1
20060004850 Chowdhury Jan 2006 A1
20060006238 Singh Jan 2006 A1
20060012677 Neven et al. Jan 2006 A1
20060013481 Park et al. Jan 2006 A1
20060015492 Keating et al. Jan 2006 A1
20060032916 Mueller et al. Feb 2006 A1
20060038833 Mallinson et al. Feb 2006 A1
20060047825 Steenstra et al. Mar 2006 A1
20060058948 Blass et al. Mar 2006 A1
20060059434 Boss et al. Mar 2006 A1
20060064346 Steenstra et al. Mar 2006 A1
20060071945 Anabuki Apr 2006 A1
20060071946 Anabuki et al. Apr 2006 A1
20060099959 Staton et al. May 2006 A1
20060116935 Evans Jun 2006 A1
20060120686 Liebenow Jun 2006 A1
20060145837 Horton et al. Jul 2006 A1
20060149625 Koningstein Jul 2006 A1
20060149638 Allen Jul 2006 A1
20060178782 Pechtl et al. Aug 2006 A1
20060184013 Emanuel et al. Aug 2006 A1
20060190293 Richards Aug 2006 A1
20060195428 Peckover Aug 2006 A1
20060211453 Schick Sep 2006 A1
20060218153 Voon et al. Sep 2006 A1
20060236257 Othmer et al. Oct 2006 A1
20060240862 Neven Oct 2006 A1
20060270421 Phillips et al. Nov 2006 A1
20070005576 Cutrell et al. Jan 2007 A1
20070015586 Huston Jan 2007 A1
20070024469 Chou Feb 2007 A1
20070038944 Carignano et al. Feb 2007 A1
20070060112 Reimer Mar 2007 A1
20070078846 Gulli et al. Apr 2007 A1
20070091125 Takemoto et al. Apr 2007 A1
20070098234 Fiala May 2007 A1
20070100740 Penagulur et al. May 2007 A1
20070104348 Cohen May 2007 A1
20070122947 Sakurai et al. May 2007 A1
20070133947 Armitage et al. Jun 2007 A1
20070143082 Degnan Jun 2007 A1
20070150403 Mock Jun 2007 A1
20070159522 Neven Jul 2007 A1
20070172155 Guckenberger Jul 2007 A1
20070198505 Fuller Aug 2007 A1
20070202844 Wilson et al. Aug 2007 A1
20070230817 Kokojima Oct 2007 A1
20070244633 Phillips et al. Oct 2007 A1
20070244924 Sadovsky et al. Oct 2007 A1
20070300161 Bhatia et al. Dec 2007 A1
20080003966 Magnusen Jan 2008 A1
20080005074 Flake et al. Jan 2008 A1
20080035725 Jambunathan et al. Feb 2008 A1
20080037877 Jia et al. Feb 2008 A1
20080046738 Galloway et al. Feb 2008 A1
20080046956 Kulas Feb 2008 A1
20080059055 Geelen et al. Mar 2008 A1
20080071559 Arrasvuori Mar 2008 A1
20080074424 Carignano Mar 2008 A1
20080082426 Gokturk et al. Apr 2008 A1
20080084429 Wissinger Apr 2008 A1
20080092551 Skowronski Apr 2008 A1
20080104054 Spangler May 2008 A1
20080109301 Yee et al. May 2008 A1
20080126193 Robinson May 2008 A1
20080126251 Wassingbo May 2008 A1
20080127647 Leitner Jun 2008 A1
20080142599 Benillouche et al. Jun 2008 A1
20080151092 Vilcovsky Jun 2008 A1
20080154710 Varma Jun 2008 A1
20080163311 St. John-Larkin Jul 2008 A1
20080163379 Robinson et al. Jul 2008 A1
20080165032 Lee Jul 2008 A1
20080170810 Wu et al. Jul 2008 A1
20080176545 Dicke et al. Jul 2008 A1
20080177640 Gokturk et al. Jul 2008 A1
20080186226 Ratnakar Aug 2008 A1
20080194323 Merkli et al. Aug 2008 A1
20080201241 Pecoraro Aug 2008 A1
20080205755 Jackson et al. Aug 2008 A1
20080205764 Iwai et al. Aug 2008 A1
20080207357 Savarese et al. Aug 2008 A1
20080225123 Osann et al. Sep 2008 A1
20080240575 Panda et al. Oct 2008 A1
20080248815 Busch Oct 2008 A1
20080255961 Livesey Oct 2008 A1
20080267521 Gao et al. Oct 2008 A1
20080268876 Gelfand et al. Oct 2008 A1
20080278778 Saino Nov 2008 A1
20080285940 Kulas Nov 2008 A1
20080288338 Wiseman Nov 2008 A1
20080288477 Kim et al. Nov 2008 A1
20080313078 Payne et al. Dec 2008 A1
20090006208 Grewal et al. Jan 2009 A1
20090019487 Kulas Jan 2009 A1
20090028435 Wu et al. Jan 2009 A1
20090028446 Wu et al. Jan 2009 A1
20090034260 Ziemkowski et al. Feb 2009 A1
20090076925 DeWitt et al. Mar 2009 A1
20090083096 Cao et al. Mar 2009 A1
20090094260 Cheng Apr 2009 A1
20090099951 Pandurangan Apr 2009 A1
20090106127 Purdy et al. Apr 2009 A1
20090109240 Englert et al. Apr 2009 A1
20090110241 Takemoto et al. Apr 2009 A1
20090141986 Boncyk et al. Jun 2009 A1
20090144624 Barnes Jun 2009 A1
20090148052 Sundaresan Jun 2009 A1
20090182810 Higgins et al. Jul 2009 A1
20090228342 Walker et al. Sep 2009 A1
20090232354 Camp et al. Sep 2009 A1
20090235181 Saliba et al. Sep 2009 A1
20090235187 Kim et al. Sep 2009 A1
20090240735 Grandhi et al. Sep 2009 A1
20090245638 Collier et al. Oct 2009 A1
20090262137 Walker et al. Oct 2009 A1
20090271293 Parkhurst et al. Oct 2009 A1
20090287587 Bloebaum et al. Nov 2009 A1
20090299824 Barnes, Jr. Dec 2009 A1
20090304267 Tapley et al. Dec 2009 A1
20090319373 Barrett Dec 2009 A1
20090319388 Yuan et al. Dec 2009 A1
20090319887 Waltman et al. Dec 2009 A1
20090324100 Kletter et al. Dec 2009 A1
20090324137 Stallings et al. Dec 2009 A1
20090325554 Reber Dec 2009 A1
20100015960 Reber Jan 2010 A1
20100015961 Reber Jan 2010 A1
20100015962 Reber Jan 2010 A1
20100034469 Thorpe et al. Feb 2010 A1
20100037177 Golsorkhi Feb 2010 A1
20100045701 Scott et al. Feb 2010 A1
20100046842 Conwell Feb 2010 A1
20100048290 Baseley et al. Feb 2010 A1
20100049663 Kane et al. Feb 2010 A1
20100070996 Liao et al. Mar 2010 A1
20100082927 Riou Apr 2010 A1
20100131714 Chandrasekaran May 2010 A1
20100153378 Sardesai Jun 2010 A1
20100161605 Gabrilovich et al. Jun 2010 A1
20100171758 Maassel et al. Jul 2010 A1
20100171999 Namikata et al. Jul 2010 A1
20100185529 Chesnut Jul 2010 A1
20100188510 Yoo et al. Jul 2010 A1
20100198684 Eraker et al. Aug 2010 A1
20100211900 Fujioka Aug 2010 A1
20100214284 Rieffel et al. Aug 2010 A1
20100235259 Farraro et al. Sep 2010 A1
20100241650 Chittar Sep 2010 A1
20100257024 Holmes et al. Oct 2010 A1
20100260426 Huang et al. Oct 2010 A1
20100281417 Yolleck et al. Nov 2010 A1
20100287511 Meier et al. Nov 2010 A1
20100289817 Meier et al. Nov 2010 A1
20100293068 Drakoulis et al. Nov 2010 A1
20100312596 Saffari et al. Dec 2010 A1
20100316288 Ip et al. Dec 2010 A1
20100332283 Ng et al. Dec 2010 A1
20100332304 Higgins et al. Dec 2010 A1
20110004517 Soto et al. Jan 2011 A1
20110016487 Chalozin et al. Jan 2011 A1
20110029334 Reber Feb 2011 A1
20110047075 Fourez Feb 2011 A1
20110053642 Lee Mar 2011 A1
20110055054 Glasson Mar 2011 A1
20110061011 Hoguet Mar 2011 A1
20110065496 Gagner et al. Mar 2011 A1
20110078305 Varela Mar 2011 A1
20110084983 Demaine Apr 2011 A1
20110090343 Alt et al. Apr 2011 A1
20110128288 Petrou et al. Jun 2011 A1
20110128300 Gay et al. Jun 2011 A1
20110131241 Petrou et al. Jun 2011 A1
20110143731 Ramer et al. Jun 2011 A1
20110148924 Tapley et al. Jun 2011 A1
20110153614 Solomon Jun 2011 A1
20110173191 Tsaparas et al. Jul 2011 A1
20110184780 Alderson et al. Jul 2011 A1
20110187306 Aarestrup et al. Aug 2011 A1
20110212717 Rhoads et al. Sep 2011 A1
20110214082 Osterhout et al. Sep 2011 A1
20110215138 Crum Sep 2011 A1
20110238472 Sunkada Sep 2011 A1
20110238476 Carr et al. Sep 2011 A1
20110246064 Nicholson Oct 2011 A1
20110277744 Gordon et al. Nov 2011 A1
20110307338 Carlson Dec 2011 A1
20110313874 Hardie et al. Dec 2011 A1
20120072233 Hanlon et al. Mar 2012 A1
20120084812 Thompson et al. Apr 2012 A1
20120097151 Matsuno et al. Apr 2012 A1
20120099800 Llano et al. Apr 2012 A1
20120105475 Tseng et al. May 2012 A1
20120113141 Zimmerman et al. May 2012 A1
20120120113 Hueso May 2012 A1
20120126974 Phillips et al. May 2012 A1
20120129553 Phillips et al. May 2012 A1
20120130796 Busch May 2012 A1
20120150619 Jacob et al. Jun 2012 A1
20120165046 Rhoads et al. Jun 2012 A1
20120179716 Takami Jul 2012 A1
20120185492 Israel et al. Jul 2012 A1
20120192235 Tapley et al. Jul 2012 A1
20120195464 Ahn Aug 2012 A1
20120197764 Nuzzi et al. Aug 2012 A1
20120215612 Ramer et al. Aug 2012 A1
20120230581 Miyashita et al. Sep 2012 A1
20120239483 Yankovich et al. Sep 2012 A1
20120239501 Yankovich et al. Sep 2012 A1
20120284105 Li Nov 2012 A1
20120308077 Tseng et al. Dec 2012 A1
20120323707 Urban Dec 2012 A1
20120327115 Chhetri et al. Dec 2012 A1
20130006735 Koenigsberg et al. Jan 2013 A1
20130019177 Schlossberg et al. Jan 2013 A1
20130036438 Kutaragi et al. Feb 2013 A1
20130050218 Beaver et al. Feb 2013 A1
20130073365 McCarthy Mar 2013 A1
20130086029 Hebert Apr 2013 A1
20130103306 Uetake Apr 2013 A1
20130106910 Sacco May 2013 A1
20130110624 Mitrovic May 2013 A1
20130116922 Cai et al. May 2013 A1
20130144701 Kulasooriya et al. Jun 2013 A1
20130151366 Godsey Jun 2013 A1
20130170697 Zises Jul 2013 A1
20130198002 Nuzzi et al. Aug 2013 A1
20130262231 Glasgow et al. Oct 2013 A1
20130325839 Goddard et al. Dec 2013 A1
20140000701 Korevaar et al. Jan 2014 A1
20140007012 Govande et al. Jan 2014 A1
20140085333 Pugazhendhi et al. Mar 2014 A1
20140237352 Sriganesh et al. Aug 2014 A1
20140372449 Chittar Dec 2014 A1
20150006291 Yankovich et al. Jan 2015 A1
20150032531 Yankovich et al. Jan 2015 A1
20150052171 Cheung Feb 2015 A1
20150065177 Phillips et al. Mar 2015 A1
20150148078 Phillips et al. May 2015 A1
20150163632 Phillips et al. Jun 2015 A1
20160019723 Tapley et al. Jan 2016 A1
20160034944 Raab et al. Feb 2016 A1
20160117863 Pugazhendhi et al. Apr 2016 A1
20160171305 Zises Jun 2016 A1
20160364793 Sacco Dec 2016 A1
20170046593 Tapley et al. Feb 2017 A1
20170091975 Zises Mar 2017 A1
20180124513 Kim et al. May 2018 A1
20180189863 Tapley et al. Jul 2018 A1
20180336734 Tapley et al. Nov 2018 A1
20190266614 Grandhi et al. Aug 2019 A1
20210027345 Govande et al. Jan 2021 A1
Foreign Referenced Citations (85)
Number Date Country
2012212601 May 2016 AU
2015264850 Apr 2017 AU
1255989 Jun 2000 CN
1750001 Mar 2006 CN
1802586 Jul 2006 CN
1865809 Nov 2006 CN
2881449 Mar 2007 CN
101153757 Apr 2008 CN
101515195 Aug 2009 CN
101515198 Aug 2009 CN
101520904 Sep 2009 CN
101541012 Sep 2009 CN
101764973 Jun 2010 CN
101772779 Jul 2010 CN
101893935 Nov 2010 CN
102084391 Jun 2011 CN
102156810 Aug 2011 CN
102194007 Sep 2011 CN
102667913 Sep 2012 CN
103443817 Dec 2013 CN
104081379 Oct 2014 CN
104656901 May 2015 CN
105787764 Jul 2016 CN
1365358 Nov 2003 EP
1710717 Oct 2006 EP
2015244 Jan 2009 EP
2034433 Mar 2009 EP
2499635 Oct 2019 EP
2418275 Mar 2006 GB
56-085650 Jul 1981 JP
57-164286 Oct 1982 JP
59-107144 Jun 1984 JP
59-196956 Nov 1984 JP
60-078250 May 1985 JP
63-013113 Mar 1988 JP
11191118 Jul 1999 JP
2942851 Aug 1999 JP
2000-110515 Apr 2000 JP
2000-279944 Oct 2000 JP
2001-283079 Oct 2001 JP
2001-309323 Nov 2001 JP
2001-344479 Dec 2001 JP
2002-004943 Jan 2002 JP
2002-099826 Apr 2002 JP
2003-014316 Jan 2003 JP
2003-022395 Jan 2003 JP
2004-326229 Nov 2004 JP
2005-337966 Dec 2005 JP
2006-209658 Aug 2006 JP
2006-351024 Dec 2006 JP
3886045 Feb 2007 JP
2007-172605 Jul 2007 JP
3143216 Jul 2008 JP
2010-039908 Feb 2010 JP
2010-141371 Jun 2010 JP
2010-524110 Jul 2010 JP
2011-209934 Oct 2011 JP
2012-529685 Nov 2012 JP
10-2006-0126717 Dec 2006 KR
10-2007-0014532 Feb 2007 KR
10-0805607 Feb 2008 KR
10-0856585 Sep 2008 KR
10-2009-0056792 Jun 2009 KR
10-2009-0070900 Jul 2009 KR
10-2010-0067921 Jun 2010 KR
10-2010-0071559 Jun 2010 KR
10-2011-0082690 Jul 2011 KR
9904153 Jan 1999 WO
1999044153 Sep 1999 WO
2001022326 Mar 2001 WO
2005072157 Aug 2005 WO
2005072157 Feb 2007 WO
2008003966 Jan 2008 WO
2008051538 May 2008 WO
2009111047 Sep 2009 WO
2009111047 Dec 2009 WO
2010084585 Jul 2010 WO
2010141939 Dec 2010 WO
2011070871 Jun 2011 WO
2011087797 Jul 2011 WO
2011087797 Oct 2011 WO
2012106096 Aug 2012 WO
2013063299 May 2013 WO
2013101903 Jul 2013 WO
2013101903 Jun 2014 WO
Non-Patent Literature Citations (201)
Entry
Non-Final Office Action received for U.S. Appl. No. 13/537,482, dated Jun. 28, 2017, 25 pages.
Corrected Notice of Allowability Received for U.S. Appl. No. 15/337,899, dated Feb. 24, 2021, 2 Pages.
Non-Final Office Action received for U.S. Appl. No. 15/337,899 dated Feb. 5, 2020, 11 pages.
Notice of Allowance received for U.S. Appl. No. 15/337,899, dated Jul. 30, 2020, 7 pages.
Corrected Notice of Allowability received for U.S. Appl. No. 15/337,899, dated Sep. 10, 2020, 2 pages.
Notice of Allowance received for U.S. Appl. No. 15/337,899, dated Nov. 17, 2020, 7 Pages.
Klemperer,“Auctions: Theory and Practice”, Princeton University Press, 2004, 15 pages.
Final Office Action Received for U.S. Appl. No. 15/337,899 dated Nov. 14, 2019, 20 pages.
First Action Interview—Office Action Summary received for U.S. Appl. No. 15/337,899, dated Jun. 25, 2019, 4 pages.
First Action Interview—Pre-Interview Communication received for U.S. Appl. No. 15/337,899, dated Mar. 19, 2019, 6 pages.
Communication Pursuant to Article 94(3) EPC received for European Patent Application No. 10803429.9, dated Aug. 30, 2018, 6 pages.
Corrected Notice of Allowance received for U.S. Appl. No. 14/868,105, dated Sep. 21, 2018, 9 pages.
Corrected Notice of Allowability received for U.S. Appl. No. 14/868,105, dated Oct. 11, 2018, 2 pages.
Office Action Received for Chinese Patent Application No. 201610108229.6 dated Nov. 15, 2018, 15 pages (6 pages Official Copy and 9 pages English Translation).
Final Office Action received for U.S. Appl. No. 13/537,482, dated Dec. 13, 2018, 18 pages.
Notice of Allowance Received For U.S. Appl. No. 12/398,957, dated Jan. 2, 2019, 10 pages.
Corrected Notice of Allowability Received for U.S. Appl. No. 14/868,105, dated Jan. 14, 2019, 2 pages.
Office Action received for Chinese patent Application No. 201610108229.6, dated May 17, 2019, 33 pages (20 pages of English Translation and 13 pages of Official copy).
Non-Final Office Action received for U.S. Appl. No. 13/537,482, dated Jun. 20, 2019, 22 pages.
Non-Final Office Action Received for U.S. Appl. No. 16/046,434, dated Aug. 21, 2019, 23 Pages.
Extended European Search Report Received for European Patent Application No. 19184977.7 dated Sep. 26, 2019, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 12/398,957, dated Oct. 17, 2019, 11 pages.
Non-Final Office Action Received for U.S. Appl. No. 12/398,957, dated Dec. 9, 2019, 10 pages.
Office Action Received for Chinese Patent Application No. 201610108229.6, dated Dec. 17, 2019, 23 Pages (9 pages of Official Copy and 14 pages of English Translation).
Final Office Action Received for U.S. Appl. No. 13/537,482, dated Jan. 7, 2020, 25 Pages.
Final Office Action received for U.S. Appl. No. 16/046,434, dated Jan. 17, 2020, 24 pages.
Communication Pursuant to Article 94(3) EPC received For European Patent Application No. 17171025.4, dated Feb. 7, 2020, 6 pages.
Notice of Allowance received for U.S. Appl. No. 13/537,482, dated Apr. 8, 2020, 13 pages.
Final Office Action received for U.S. Appl. No. 12/398,957, dated Jun. 24, 2020, 17 pages.
Notice of Allowance received for U.S. Appl. No. 13/537,482, dated Jul. 13, 2020, 12 pages.
Notice of Allowance received for U.S. Appl. No. 12/398,957, dated Oct. 30, 2020, 8 pages.
Notice of Allowance Received for Korean Patent Application No. 10-2016-7025254 dated Mar. 9, 2018, 5 pages (2 pages of English Translation and 3 pages of Official Copy).
Kraft, “Real Time Baseball Augmented Reality Retrieved from the Internet URL: <http://dx.doi.org/10.7936/K7HH6H84>” 11 pages.
Gonsalves, Antone, "Amazon Launches Experimental Mobile Shopping Feature", Retrieved from the Internet: <URL: http://www.informationweek.com/news/internet/retail/showArticle.jhtml?articleID=212201750&subSection=News>, Dec. 3, 2008, 1 page.
Terada, “New Cell Phone Services Tap Image-Recognition Technologies”, Retrieved from the Internet: <URL: http://search.japantimes.co.jp/cgi-bin/nb20070626a1.html>, Jun. 26, 2007, pp. 1-3.
Troaca, “S60 Camera Phones Get Image Recognition Technology”, http://news.softpedia.com/news/S60-Camera-Phones-Get-Image-Recognition-Technology-79666.shtml, Feb. 27, 2008, pp. 1-2.
“The ESP Game”, Retrieved from the Internet: <URL: http://www.espgame.org/instructions.html>, Accessed on Nov. 13, 2007, 2 pages.
Mello Jr., “Pongr Giving Cell Phone Users Way to Make Money”, Retrieved from the Internet URL; <https://www.pcworld.com/article/240209/pongr_giving_cell_phone_users_way_to_make_money.html>, Sep. 18, 2011, 2 pages.
Final Office Action received for U.S. Appl. No. 12/371,882, dated Mar. 13, 2013, 24 pages.
Non-Final Office Action received for U.S. Appl. No. 12/644,957, dated Mar. 18, 2013, 17 pages.
“SnapTell: Technology”, Retrieved from the Internet: <URL: http://web.archive.org/web/20071117023817 /http:/ /www.snaptell.com/technology/index. htm>, Nov. 17, 2007, 1 page.
Final Office Action received for U.S. Appl. No. 12/371,882, dated Nov. 14, 2011, 21 pages.
Non-Final Office Action received for U.S. Appl. No. 12/371,882, dated Jun. 8, 2011, 22 pages.
Non-Final Office Action received for U.S. Appl. No. 12/398,957, dated Mar. 29, 2012, 23 pages.
Grandhi, Roopnath, et al., “Image Recognition as a Service”, Application Serial No. 611033,940, Application filed Mar. 5, 2008, 56 pgs.
Ahn, et al., "Labeling Images with a Computer Game", Retrieved from the Internet URL: <http://ael.gatech.edu/cs6452f13/files/2013/08/labeling-images.pdf>, 2004, 8 pages.
Occipitalhq, "RedLaser 2.0: Realtime iPhone UPC barcode scanning", Available online at URL: <https://www.youtube.com/watch?v=9_hFGsmx_6k>, Jun. 16, 2009, 1 page.
Parker, J.R., et al., “Algorithms for Image Processing and Computer Vision”, Wiley Computer Publishing, 1997, pp. 23-29.
Patterson, “Amazon Iphone App Takes Snapshots, Looks for a Match”, Retrieved from the Internet: <URL: http://tech.yahoo.com/blogs/patterson/30983>,Dec. 3, 2008, 3 pages.
International Preliminary Report on Patentability received for PCT Application No. PCT/US2010/061628, dated Jul. 5, 2012, 6 pages.
Written Opinion received for PCT Application No. PCT/US2010/061628, dated Aug. 12, 2011, 4 pages.
Redlaser, “Redlaser—Impossibly Accurate Barcode Scanning”, Retrieved from the Internet URL: <http://redlaser.com/index.php>, Jul. 8, 2011, pp. 1-2.
Non-Final Office Action received for U.S. Appl. No. 12/398,957, dated Sep. 19, 2013, 21 pages.
Final Office Action received for U.S. Appl. No. 12/644,957, dated Aug. 26, 2013, 19 pages.
Office Action received for Korean Patent Application No. 10-2012-7019181, dated Nov. 18, 2013, 11 pages.
Final Office Action received for U.S. Appl. No. 12/371,882, dated Dec. 18, 2013, 26 pages.
Non-Final Office Action received for U.S. Appl. No. 12/644,957, dated Mar. 7, 2014, 21 pages.
Office Action received for Chinese Patent Application No. 201080059424.5, dated Apr. 21, 2014, 18 pages.
Non-Final Office Action received for U.S. Appl. No. 13/537,482, dated Jan. 6, 2014, 25 pages.
Final Office Action received for U.S. Appl. No. 12/644,957, dated Jul. 11, 2014, 25 pages.
Office Action received for Korean Patent Application No. 10-2012-7019181, dated Jun. 26, 2014, 5 pages.
Office Action received for Korean Patent Application No. 10-2012-7019181, dated Nov. 3, 2014, 7 pages.
Non-Final Office Action received for U.S. Appl. No. 12/644,957, dated Dec. 29, 2014, 20 pages.
Examiner's Answer to Appeal Brief received for U.S. Appl. No. 12/398,957, dated Jan. 14, 2015, 10 pages.
Final Office Action received for U.S. Appl. No. 12/398,957, dated Jul. 18, 2014, 27 pages.
Final Office Action received for U.S. Appl. No. 13/537,482, dated May 8, 2014, 20 pages.
Final Office Action received for U.S. Appl. No. 13/537,482, dated May 22, 2015, 32 pages.
Non-Final Office Action received for U.S. Appl. No. 13/537,482, dated Nov. 6, 2014, 24 pages.
Kan, et al., “Applying QR Code in Augmented Reality Applications”, pp. 253-258.
Notice of Allowance received for U.S. Appl. No. 12/644,957, dated Jun. 17, 2015, 20 pages.
Appeal Decision received for Korean Patent Application No. 10-2012-7019181, dated Jan. 29, 2016, 36 pages (16 pages of Official Copy and 20 pages of English Translation).
Non-Final Office Action received for U.S. Appl. No. 12/371,882, dated Feb. 8, 2016, 37 pages.
Office Action received for Korean Patent Application No. 10-2012-7019181, dated Feb. 23, 2016, 12 pages (5 pages of English Translation and 7 pages of Official Copy).
Office Action received for Korean Patent Application No. 10-2014-7004160, dated Mar. 2, 2016, 7 pages (2 pages of English Translation and 5 pages of Official Copy).
Extended European Search report received for European Patent Application No. 10803429.9, dated Jun. 17, 2015, 7 pages.
Notice of Allowance received for U.S. Appl. No. 12/371,882, dated Jul. 20, 2016, 5 pages.
Notice of Allowance Received for Korean Patent Application No. 10-2014-7004160, dated Jun. 15, 2016, 8 pgs.
Definition of Homogeneous Coordinates, Retrieved from the internet URL: <https://web.archive.org/web/20110305185824/http://en.wikipedia.org/wiki/Homogeneous_coordinates>, 8 pages.
Vassilios, et al.,“Archeoguide:An Augmented Reality Guide for Archaeological Sites”, IEEE Computer Graphics and application vol. 22, No. 5, Sep./Oct. 2002, pp. 52-60.
Communication Pursuant to Article 94(3) EPC received for European Patent Application No. 09717996.4, dated Jul. 23, 2013, 7 pages.
Extended European Search report received for European Patent Application No. 09717996.4, dated Feb. 17, 2011, 6 pages.
Office Action received for Korean Patent Application No. 10-2010-7022281, dated Feb. 28, 2012, 13 pgs.
Office Action received for Korean Patent Application No. 10-2010-7022281, dated Sep. 27, 2012, 12 pgs.
Trial Board Decision Filed on Mar. 25, 2014 For Korean Patent Application No. 10-2010-7022281, 21 pgs.
Appeal decision mailed for U.S. Appl. No. 12/398,957, dated Oct. 18, 2016, 7 pages.
Final Office Action received for U.S. Appl. No. 13/537,482, dated Nov. 7, 2016, 17 pages.
Non-Final Office Action received for U.S. Appl. No. 13/537,482, dated Jun. 24, 2016, 19 pages.
Decision of Reexamination received for Chinese Patent Application No. 200980107871.0, dated Nov. 30, 2015, 11 pages.
Office Action received for Chinese Patent Application No. 200980107871.0, dated Feb. 2, 2012, 17 pages.
Office Action received for Chinese Patent Application No. 200980107871.0, dated Jun. 5, 2014, 10 pages.
Office Action received for Chinese Patent Application No. 200980107871.0, dated May 3, 2013, 29 pages.
Office Action received for Chinese Patent Application No. 200980107871.0, dated Nov. 1, 2012, 13 pages.
Office Action received for Chinese Patent Application No. 200980107871.0, dated Nov. 5, 2013, 12 pages.
International Preliminary Report on Patentability received for PCT Application No. PCT/US2009/001419, dated Sep. 16, 2010, 5 pages.
International Search Report received for PCT Application No. PCT/US2009/001419, dated Sep. 30, 2009, 4 pages.
International Written Opinion received for PCT Application No. PCT/US2009/001419, dated Sep. 30, 2009, 4 pages.
Summons to Attend Oral Proceedings received for European Application No. 09717996.4, dated Nov. 28, 2016, 12 pages.
Non-Final Office Action received for U.S. Appl. No. 14/868,105, dated Dec. 12, 2016, 24 pages.
Office Action received for Korean Patent Application No. 10-2016-7025254, dated Oct. 13, 2016, 11 pages (6 pages of English Translation and 5 pages of Official copy).
Office Action received for Chinese Patent Application No. 201510088798.4, dated Mar. 17, 2017, 23 pages (14 pages of English Translation and 9 pages of Official Copy).
Final Office Action received for U.S. Appl. No. 14/868,105, dated Apr. 12, 2017, 22 pages.
Office Action received for Korean Patent Application No. 10-2016-7025254, dated May 2, 2017, 7 pages (3 pages of English Translation and 4 pages of Official Copy).
Non-Final Office Action received for U.S. Appl. No. 12/398,957, dated May 2, 2017, 19 pages.
Walther, et al., "Selective Visual Attention Enables Learning and Recognition of Multiple Objects in Cluttered Scenes", Jun. 15, 2005, 23 pages.
First Examiner Report received for Indian Patent Application No. 6557/DELNP/2010, dated Apr. 11, 2017, 11 pages.
Office Action received for Korean Patent Application No. 10-2016-7025254, dated Sep. 5, 2017, 12 pages (5 pages of English Translation and 7 pages of Official Copy).
Non-Final Office Action received for U.S. Appl. No. 14/868,105, dated Nov. 14, 2017, 14 pages.
Final Office Action received for U.S. Appl. No. 13/537,482, dated Nov. 24, 2017, 19 pages.
Final Office Action received for U.S. Appl. No. 12/398,957, dated Jan. 22, 2018, 20 pages.
Araki, et al., “Follow-The-Trial-Fitter: Real-Time Dressing without Undressing”, Retrieved from the Internet URL: https://dialog.proquest.com/professional/printviewfile?accountId=142257>, Dec. 1, 2008, 8 pages.
Final Office Action received for U.S. Appl. No. 12/371,882, dated Jun. 25, 2015, 27 pages.
Non-Final Office Action received for U.S. Appl. No. 12/371,882, dated Mar. 12, 2015, 29 pages.
Non-Final Office Action received for U.S. Appl. No. 12/371,882, dated Aug. 30, 2013, 20 pages.
Non-Final Office Action received for U.S. Appl. No. 12/371,882, dated Oct. 23, 2012, 21 pages.
Final Office Action received for U.S. Appl. No. 12/398,957, dated Nov. 7, 2012, 22 pages.
Non-Final Office Action received for U.S. Appl. No. 12/398,957, dated Jul. 29, 2011, 23 pages.
International Search Report received for PCT Application No. PCT/US2010/061628, dated Aug. 12, 2011, 4 pages.
Extended European Search Report received for European Patent Application No. 17171025.4, dated Sep. 4, 2017, 7 pages.
Non-Final Office Action Received for U.S. Appl. No. 13/537,482 dated May 16, 2018, 20 pages.
Notice of Allowance received for U.S. Appl. No. 14/868,105, dated May 21, 2018, 14 pages.
“Draw something”, Retrieved from the Internet URL: <http://omgpop.com/drawsomething>, Accessed on Feb. 16, 2018, 2 pages.
Duke University, “How to Write Advertisements that Sell”, Company: System, the magazine of Business, 1912, 66 pages.
Google Play, “AgingBooth”, Retrieved from the Internet URL: <https://play.google.com/store/apps/details?id=com.piviandco.agingbooth&hl=en_IN>, Jan. 7, 2019, 3 pages.
Madeleine, “Terminator 3 Rise of Jesus! Deutsch”, “Retrieved from the Internet URL: <https://www.youtube.com/watch?v:::Oj3o7HFcgzE>”, Jun. 12, 2012, 2 pages.
Mlobitv,“MobiTV”, Retrieved from the Internet: <URL: http://www.mobitv.com/>, Accessed on Mar. 30, 2015, 1 page.
Newby, “Facebook, Politico to measure sentiment of GOP candidates by collecting posts”, 2006-2012 Clarity Digital Group LLC d/b/a Examiner.com, Jun. 28, 2012, 3 pages.
Sifry, “Politico-Facebook Sentiment Analysis Will Generate “Bogus” Results, Expert Says”, Retrieved from the Internet: <http://techpresident.com/news/21618/politico-facebook-sentiment-analysis-bogus>, Jan. 13, 2012, pages.
Slingbox,“Sling Media, Inc.”, Retrieved from the Internet: <URL: http://www.slingbox.com/>, Accessed on Mar. 30, 2015, 1 page.
“MLB At Bat”,Retrieved from the Internet: <URL: http://texas.rangers.mlb.com/mobile/atbat/2c id=tex>, Accessed on Apr. 19, 2018, pp. 1-6.
Mulloni, et al., “Handheld Augmented Reality Indoor Navigation with Activity-Based Instructions”, Proceedings of the 13th International Conference on Human Computer Interaction with Mobile Devices and Services, Aug. 30-Sep. 2, 2011, 10 pages.
Vlahakis, et al., “Archeoguide: First Results of an Augmented Reality, Mobile Computing System in Cultural Heritage Sites”, Jan. 2001, 10 pages.
Wikipedia, “Polar Coordinate System”, Wikipedia on Oct. 11, 2011 via Internet Archive WayBackMachine, [Online]., Oct. 8, 2011, 12 pages.
Non Final Office Action received for U.S. Appl. No. 17/039,443, dated Feb. 1, 2022, 20 pages.
About the Eclipse Foundation, Retrieved from Internet URL: <http://www.eclipse.org/org/>, Accessed on Nov. 2, 2021, 2 Pages.
Apache Tomcat, The Apache Software Foundation, Retrieved from Internet URL: <http://tomcat.apache.org/>, Accessed on Nov. 2, 2021, 4 Pages.
GOCR, open-source character recognition, Retrieved from Internet URL: <https://www-e.ovgu.de/jschulen/ocr/download.html>, Accessed on Nov. 3, 2021, 2 Pages.
Communication under Rule 71(3) for European Patent Application No. 10803429.9, dated Jun. 6, 2019, 8 pages.
Non-Final Office Action received for U.S. Appl. No. 16/406,787, dated Oct. 8, 2021, 12 Pages.
Oracle,“Java Technical Details”, Retrieved from Internet URL: <https://www.oracle.com/java/technologies/>, Accessed on Nov. 3, 2021, 4 Pages.
Salesforce,“Custom Application Development Software for Business”, Retrieved from internet URL: <https://www.salesforce.com/products/platform/overview/?d=70130000000liBh&internal=true>, Accessed on Oct. 4, 2021, 9 Pages.
W3C, “URIs, Addressability, and the use of HTTP GET and POST”, Retrieved from Internet URL: <https://www.w3.org/2001/tag/doc/whenToUseGet.html>, Mar. 21, 2004, 9 Pages.
eBay Developers Program, Retrieved from the Internet URL: <https://developer.ebay.com/common/api/>, Accessed on Nov. 8, 2021, 3 Pages.
iPhone—Apple, Retrieved from the Internet URL: <http://www.apple.com/iphone/>, Accessed on Nov. 10, 2021, 12 pages.
WhatIs.com, Retrieved from the Internet URL: <http://searchexchange.techtarget.com/sDefinition/0,,sid43_gci212805,00.html>, Accessed on Nov. 8, 2021, 5 pages.
DS Development SOFTWARE,“Email Protocols: IMAP, POP3, SMTP and HTTP”, Retrieved from the Internet UR: :<http://www.emailaddressmanager.com/tips/protocol.html>, © 2004-2013 Digital Software Development, Accessed on Nov. 8, 2021, 1 Page.
W3C,“Extensible Markup Language (XML) 1.0 (Fourth Edition)”, Retrieved from the Internet URL :<http://www.w3.org/TR/2006/REC-xml-20060816/#sec-origin-goals>, Aug. 16, 2006, 30 pages.
W3 Schools, “Introduction to XML”, Retrieved from the Internet URL :<https://www.w3schools.com/xml/xml_whatis.asp>, Accessed on Nov. 8, 2021, 7 pages.
Gmail, Retrieved from Internet URL: https://www.gmail.com, Accessed on Nov. 10, 2021, 7 Pages.
Java Servlet Technology Overview, Retrieved from Internet URL: <https://www.oracle.com/java/technologies/servlet-technology.html>, Accessed on Nov. 10, 2021, 2 Pages.
Chinese Application Serial No. 200980107871.0, Notification of Reexamination dated Aug. 7, 2015, 22 pages (13 pages of Official Copy and 9 pages of English Translation).
Communication Pursuant to Article 94(3) EPC received for European Patent Application No. 10803429.9, dated Feb. 16, 2018, 8 pages.
Decision of Rejection Received for Chinese Patent Application No. 201610108229.6, dated Mar. 26, 2020, 11 pages (7 pages of Official Copy & 4 pages of English Translation of Claims).
Final Office Action received for U.S. Appl. No. 13/194,584, dated Jan. 22, 2016, 27 pages.
Final Office Action received for U.S. Appl. No. 13/194,584, dated Jul. 27, 2017, 35 pages.
Final Office Action received for U.S. Appl. No. 11/140,273, dated Dec. 13, 2007, 11 pages.
Final Office Action received for U.S. Appl. No. 11/140,273, dated Jul. 15, 2009, 11 pages.
Final Office Action received for U.S. Appl. No. 12/406,016, dated Feb. 29, 2012, 25 pages.
Final Office Action received for U.S. Appl. No. 13/194,584, dated Mar. 27, 2014, 22 pages.
Final Office Action received for U.S. Appl. No. 13/339,235, dated Jan. 13, 2014, 13 Pages.
Final Office Action received for U.S. Appl. No. 13/339,235, dated Aug. 29, 2012, 10 pages.
Final Office Action received for U.S. Appl. No. 13/339,235, dated Dec. 2, 2014, 7 pages.
Final Office Action received for U.S. Appl. No. 13/339,235, dated Jan. 27, 2017, 16 pages.
Final Office Action received for U.S. Appl. No. 14/512,350, dated Apr. 14, 2016, 23 pages.
Final Office Action received for U.S. Appl. No. 14/512,350, dated Aug. 23, 2017, 21 pages.
Final Office Action received for U.S. Appl. No. 14/512,350, dated Nov. 30, 2015, 7 pages.
Final Office Action received for U.S. Appl. No. 17/039,443, dated Aug. 18, 2022, 12 pages.
Final Office Action received for U.S. Appl. No. 17/039,443, dated May 27, 2022, 17 pages.
Final Office Action received for U.S. Appl. No. 14/486,518, dated Nov. 16, 2017, 20 pages.
First Action Interview Office Action Summary received for U.S. Appl. No. 14/534,797, dated Feb. 18, 2016, 5 pages.
Geekery, “Proposal for Free, Open Source Cell Phone Location Service”, Retrieved from the Internet URL: <//crud.blog/2004/03/06/proposal-for-free-open-source-cell-phone-location-service/>, Mar. 6, 2004, 8 pages.
Korean Application Serial No. 2012-7019181, Notice of Appeal filed Feb. 4, 2015, with English translation of claims, 24 pgs.
Non-Final Office Action received for U.S. Appl. No. 13/194,584, dated Jul. 16, 2015, 27 pages.
Non-Final Office Action received for U.S. Appl. No. 11/690,720, dated May 17, 2011, 11 pages.
Non-Final Office Action received for U.S. Appl. No. 13/361,196, dated Mar. 16, 2012, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 13/361,196, dated Aug. 23, 2012, 13 pages.
Non-Final Office Action received for U.S. Appl. No. 15/250,588, dated Sep. 22, 2017, 16 pages.
Non-Final Office Action received for U.S. Appl. No. 11/140,273, dated Jul. 3, 2008, 11 pages.
Non-Final Office Action received for U.S. Appl. No. 11/140,273, dated May 31, 2007, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 12/406,016, dated Jun. 21, 2011, 21 pages.
Non-Final Office Action received for U.S. Appl. No. 12/406,016, dated Oct. 2, 2013, 21 pages.
Non-Final Office Action received for U.S. Appl. No. 13/339,235, dated Aug. 18, 2014, 4 pages.
Non-Final Office Action received for U.S. Appl. No. 13/339,235, dated Aug. 28, 2017, 20 pages.
Non-Final Office Action received for U.S. Appl. No. 13/361,113, dated Feb. 13, 2014, 13 pages.
Non-Final Office Action received for U.S. Appl. No. 13/361,196, dated Mar. 29, 2012, 12 pages.
Non-Final Office Action received for U.S. Appl. No. 12/406,016, dated Mar. 11, 2016, 7 pages.
Notice of Allowability received for U.S. Appl. No. 12/406,016, dated Jun. 11, 2014, 19 pages.
Notice of Allowance received for U.S. Appl. No. 14/990,291, dated Dec. 13, 2017, 5 pages.
Notice of Allowance received for U.S. Appl. No. 13/339,235, dated Aug. 3, 2010, 6 pages.
Notice of Allowance received for U.S. Appl. No. 13/339,235, dated Apr. 25, 2014, 8 pages.
Notice of Allowance received for U.S. Appl. No. 17/039,443, dated Sep. 22, 2022, 10 pages.
Notification of Reexamination received for Chinese Patent Application No. 201610108229.6 dated May 9, 2022, 10 pages (2 pages English Translation, 8 pages Official Copy).
Office Action—First Action Interview received for U.S. Appl. No. 14/990,291, dated Oct. 18, 2017, 5 pages.
Preinterview First Office Action received for U.S. Appl. No. 14/990,291, dated Aug. 10, 2017, 4 pages.
U.S. Appl. No. 13/194,584, Non Final Office Action dated Sep. 19, 2013, 25 pgs.
U.S. Appl. No. 13/624,682, Non Final Office Action dated Jan. 22, 2015, 9 pgs.
U.S. Appl. No. 13/624,682, Notice of Allowance dated Jun. 8, 2015, 5 pgs.
U.S. Appl. No. 13/624,682, Notice of Allowance dated Oct. 1, 2015, 7 pgs.
U.S. Appl. No. 14/473,809, Non Final Office Action dated Aug. 13, 2015, 21 pgs.
Non-Final Office Action received for U.S. Appl. No. 16/406,787, dated Dec. 6, 2022, 11 pages.
Reexamination Decision received for Chinese Patent Application No. 201610108229.6 dated Nov. 4, 2022, 14 Pages (1 Page of English translation and 13 Pages of Official Copy).
Notice of Allowance received for U.S. Appl. No. 17/039,443, dated Jan. 26, 2023, 10 pages.
Related Publications (1)
Number Date Country
20210166061 A1 Jun 2021 US
Provisional Applications (2)
Number Date Country
61106916 Oct 2008 US
61033940 Mar 2008 US
Continuations (2)
Number Date Country
Parent 15337899 Oct 2016 US
Child 17177862 US
Parent 12371882 Feb 2009 US
Child 15337899 US