Contextual menus based on image recognition

Information

  • Patent Grant
  • Patent Number
    11,651,398
  • Date Filed
    Wednesday, September 30, 2020
  • Date Issued
    Tuesday, May 16, 2023
Abstract
Contextual menus based on images submitted to a network-based publication system are disclosed. Images depicting a variety of locales, such as businesses or other items, may be stored in an image repository in the system and used to identify images that users submit as photographs taken by cell phone, camera, webcam, or a laptop with camera capability. After identifying the submitted image, the system may categorize the image and provide the user a category-driven menu relating to the photograph, the menu based on both the submitted image and the user's intent at the moment he or she captures the image.
Description
FIELD

The present disclosure relates generally to information retrieval. In an example embodiment, the disclosure relates to providing contextual menus based on images.


BACKGROUND

Online shopping and other publication systems provide a number of publishing and shopping mechanisms whereby a seller may list or publish information concerning goods or services. A buyer can express interest in or indicate a desire to purchase such goods or services by, for example, responding to a menu presented as a user interface by the online shopping or publication system.


The accurate presentation of online menus that closely reflect a user's intent is a continuing challenge in the field of information retrieval. One example of such a challenge is that menus are usually static and are defined solely by sellers. Buyers seeking goods or services may be interested in a different good or service than those offered in a traditional online menu. As a result, publication systems such as online shopping systems that use a conventional search engine to locate goods and services may not effectively connect buyers to sellers and vice versa.





BRIEF DESCRIPTION OF DRAWINGS

The present disclosure is illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:



FIG. 1 is a photograph image that may, in accordance with an illustrative embodiment, be submitted for identification and for obtaining a menu related to the image for a good or service desired by a user;



FIG. 2 is a diagram depicting a system, in accordance with an illustrative embodiment, for providing the image of FIG. 1 to a publication system;



FIG. 3 is a diagram depicting a publication system, in accordance with an illustrative embodiment, that identifies items depicted in images and provides menus that relate to the items and are desired by a user;



FIG. 4 is a block diagram illustrating an example embodiment of a publication system;



FIG. 4A is an illustration of various modules of an implementation of an image identification module useful in an example embodiment;



FIG. 5 is an illustration of a contextual menu in accordance with an illustrative embodiment;



FIG. 6 is a flow chart illustrating a method useful in an example embodiment for providing to users contextual menus relating to the images provided to the system seen in FIGS. 2 and 3;



FIG. 7 is a flow chart further illustrating the method of FIG. 6; and



FIG. 8 is a block diagram depicting a machine in the example form of a processing system within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed.





DETAILED DESCRIPTION

The description that follows includes illustrative systems, methods, techniques, instruction sequences, and computing machine program products that embody illustrative embodiments. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide an understanding of various embodiments. It will be evident, however, to those skilled in the art that embodiments of the inventive subject matter may be practiced without these specific details. In general, well-known instruction instances, protocols, structures and techniques have not been shown in detail.


The embodiments described herein provide techniques for providing contextual menus based on images submitted to a network-based publication system by way of photographs. The submission may be by way of a network such as, in one embodiment, the Internet, either by wired coupling or wirelessly. Other networks, such as a LAN or other internal or external networks, may be used. As part of the identification functionality, images depicting a variety of locales, such as businesses or other items, may be stored in an image repository in, for example, the network-based publication system (e.g., an online shopping system). The stored images in the image repository may be used to identify images that users submit as photographs taken by cell phone, camera, webcam, or even a laptop with camera capability. Alternatively, the publication system may identify the submitted image by performing a location-based Internet search based on the submitted image, or based on the location of the camera that supplied the image. Identification of the image may be based on identifying a well-known logo or trademark, or on recognizing an image of the building stored in the system's image repository. The system may then, after categorizing the image, provide the user a menu relating to the photograph, the menu based on context rather than being a generic menu. That is, the submitted image, and what the user might want to do with respect to the image at the specific moment when he or she captures it, is a primary consideration for the menu.
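
As a concrete illustration of this flow, the following sketch receives an image, identifies it against a repository, categorizes it, and returns a category-driven menu. The repository contents, category table, menu options, and the trivial exact-byte matcher are all hypothetical stand-ins, not the implementation described in this disclosure.

```python
# Hypothetical end-to-end sketch: identify a submitted image, categorize it,
# and return a category-driven menu. Repository, categories, and menus are
# illustrative stand-ins only.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Submission:
    image_bytes: bytes
    gps: Optional[Tuple[float, float]] = None  # (lat, lon), if the device supplied it

REPOSITORY = {                      # stored images keyed by item identifier
    "ATT store": b"<stored storefront image>",
    "P.F. Chang's": b"<stored restaurant image>",
}
CATEGORIES = {"ATT store": "phone company", "P.F. Chang's": "restaurant"}
MENUS = {
    "phone company": ["View Bill", "Pay Bill", "View Upgrades", "Store Locator"],
    "restaurant": ["View the menu", "Call the restaurant", "Check take-out order"],
}

def identify(sub: Submission) -> Optional[str]:
    # Stand-in matcher; a real system would compare image features
    # (e.g., color histograms) or fall back to a location-based search.
    for item_id, stored in REPOSITORY.items():
        if stored == sub.image_bytes:
            return item_id
    return None

def contextual_menu(sub: Submission) -> dict:
    item_id = identify(sub)
    if item_id is None:
        return {"item": None, "options": []}  # nothing contextual to offer
    category = CATEGORIES[item_id]
    return {"item": item_id, "category": category, "options": MENUS[category]}

print(contextual_menu(Submission(b"<stored storefront image>")))
```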


A technique for image identification may be seen in U.S. patent application Ser. No. 12/371,882, filed Feb. 16, 2009, entitled “Identification of Items Depicted in Images,” and assigned to the assignee of the present application; that application is incorporated herein by reference in its entirety. The foregoing application explains in greater detail a technique by which an item depicted in an image may be identified by matching the image with user-submitted images stored in the repository. In some embodiments the match may be based on a comparison of the color histograms of the images.
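
A minimal sketch of such a color-histogram comparison follows. The 8×8×8 RGB binning and the histogram-intersection score are common choices assumed here for illustration; the incorporated application may use a different representation or metric.

```python
# Illustrative color-histogram comparison; binning and metric are assumptions.

def color_histogram(pixels, bins=8):
    """pixels: iterable of (r, g, b) tuples with channel values 0-255."""
    hist = [0] * (bins ** 3)
    step = 256 // bins
    for r, g, b in pixels:
        idx = (r // step) * bins * bins + (g // step) * bins + (b // step)
        hist[idx] += 1
    total = sum(hist) or 1
    return [h / total for h in hist]  # normalize so images of different sizes compare

def histogram_similarity(h1, h2):
    """Histogram intersection: 1.0 for identical distributions, 0.0 for disjoint."""
    return sum(min(a, b) for a, b in zip(h1, h2))

query = color_histogram([(200, 30, 30), (210, 40, 35), (20, 20, 20)])
stored = color_histogram([(205, 35, 32), (20, 25, 22), (198, 28, 28)])
print(f"similarity: {histogram_similarity(query, stored):.2f}")  # near 1.0 for similar palettes
```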


If, for example, a user takes a photograph of a restaurant at a given locale, the system attempts to determine what the user may want to do when taking the picture of the restaurant. One possibility is that the user wants to order something and would like to make that decision from a contextual menu based on that image rather than from a generic menu. Another is that the user wants to check on the status of a take-out order he has placed with the restaurant. In the latter case, the user may have been given an order number at the time of placing the take-out order, and the menu presented by the system could be a user interface that gives the user the opportunity to enter the order number; the system would then return, for example, the status of the order and when it would be ready for pickup. The system processes the image, identifies it as a restaurant, and presents a category-driven menu to the user with various selectable options such as “call the restaurant” or “view the menu from the restaurant.”


In another example, if the image is of, say, a local ATT® office, the system may filter and process the image, identify it as an ATT store, categorize it as an ATT store, and present a category-driven menu to the user with various selectable options. In this situation, since the image is that of an ATT office, information relating to ATT is available as public information on the Internet. The user might be presented with a menu drawn from the website relating to ATT, with options for the various services or upgrades offered by ATT. These might include viewing the bill, paying the bill, viewing upgrade options, technical support, new services, and the like. The menu may provide the user with the option of entering or selecting the user's account number in order to see the amount of the current phone bill, to pay the bill, and to perform similar functions. Options may also include paying the bill through a payment processing system; an example of such a system is PayPal™. Since the foregoing information can be obtained as public information, such as that available on the business's website, it is not necessary for the specific entity to be a subscriber to a service on the publication system in order for the function to be performed.


As an example of implementing the foregoing functions, if a user walking down the street uses a cell phone camera to take a picture of a sandwich shop, the system may identify the image by the search and comparisons discussed above. Alternatively, the system may determine by GPS or another geographical locating system which buildings are in the local area, compare the image with stored images of businesses in that area, and identify the image in that manner. Once the image is identified as a sandwich shop, if the sandwich shop has its own website, the system may provide a menu based on the website. If the sandwich shop does not have its own website, it may decide to register with the publication system and provide information for the publication system to present with the menu. If registered, the business could then provide the publication system with photographs of one or more of its business locations to store in the image repository discussed above for image identification purposes. The business that registers may also provide menu information, both at initial registration and on a continuing basis, for storage in a database and presentation to a user upon receipt of a photograph of the business location taken by or for the user. Again using a restaurant as an example, the registering restaurant may upload its menu, take-out options, hours of business, and other details to the publication system. Further, on a continuing basis, say daily, the restaurant could provide the specials of the day, which could be updated in the database for presentation to the user as indicated above.
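
The location-based narrowing described above might look like the following sketch, which filters stored business records to those near the camera before any image comparison. The business records, the haversine distance, and the 200-meter radius are illustrative assumptions.

```python
# Sketch of narrowing identification candidates by location before comparing
# images. Business records and the search radius are hypothetical.
from math import radians, sin, cos, asin, sqrt

BUSINESSES = [
    {"name": "Sam's Sandwiches", "lat": 37.7750, "lon": -122.4190},
    {"name": "Club One",         "lat": 37.7740, "lon": -122.4200},
    {"name": "P.F. Chang's",     "lat": 37.8044, "lon": -122.2711},
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def nearby_candidates(lat, lon, radius_m=200):
    """Only businesses within radius_m need their stored images compared."""
    return [b for b in BUSINESSES
            if haversine_m(lat, lon, b["lat"], b["lon"]) <= radius_m]

# A photo taken downtown: only the two nearby businesses are compared.
print([b["name"] for b in nearby_candidates(37.7749, -122.4194)])
```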


If the image is that of a popular restaurant, such as P.F. Chang's, that has its own website, the system could link, as discussed above, to the vendor's main website and present a menu or other information to the user based on the website. In addition, businesses such as restaurants (as only one example of many) may wish to register with the publication system even if the business is well known and has its own website. This would be done for the reasons outlined above for businesses that are not well known.


Further, the user may be provided with the category in which the publication system categorizes the business and may be given the opportunity to change the category if the user believes that doing so would yield a response from the system that more nearly meets the user's desires at the moment. As an example, consider a business called “Club One.” In California, this name corresponds to a fitness club/gym, but elsewhere it can be a dance club. Hence the user is provided with the option to change the original category into which the image is sorted. Further still, the publication system may also provide the business with information indicating the category in which the system placed the business, and provide the business with the opportunity to change its category if the business believes a different category would be beneficial. The same example applies here: if the “Club One” fitness club registers for the service and is categorized as a dance club, the business is also provided with the choice of changing its category from dance club to fitness club.



FIG. 1 is a photograph image that may be submitted for identification and for obtaining a menu related to the image for a good or service desired by a user, sometimes referred to herein as a contextual menu. The photograph may be taken by the user or by some other person. While the photograph is that of a well-known restaurant, it may be of any business, for example a local, non-nationally known restaurant, a phone company, or any other business, item, or locale.



FIG. 2 is a diagram depicting a system 200, in accordance with an illustrative embodiment, for identifying items depicted in images. As depicted, the system 200 includes client processing systems (e.g., personal computer 204, mobile phone 206, or similar device), a server 210 hosting a variety of services, and another server 212 hosting an item recognition module 214, all interconnected by way of a computer network 202. The computer network 202 is a collection of interconnected processing systems that communicate utilizing wired or wireless mediums. Examples of computer networks, such as the computer network 202, include Local Area Networks (LANs) and/or Wide Area Networks (WANs), such as the Internet.


In the example of FIG. 2, a client processing system (e.g., personal computer 204 or mobile phone 206) transmits an image of an item 209 to the item recognition module 214, which is hosted on the server 212. The image may be captured by a camera built into the mobile phone 206 or by a camera 208, which is configurable to download its stored images to the personal computer 204. Further, the submitted image could be an already existing photograph or another image capable of being submitted to the publication system by, for example, upload. Alternatively, the user may locate the image through, for example, the Internet or other image repositories and submit it to the system.


The item recognition module 214 accesses the image from the client processing systems and, as explained in more detail below, identifies the item 209 depicted in the image with an item identifier. The item 209 may be, in one embodiment, a business. An “item identifier,” as used herein, refers to a variety of values (e.g., alphanumeric characters and symbols) that establish the identity of or uniquely identify one or more items, such as item 209. For example, the item identifier can be a name assigned to the item 209. In another example, the item identifier can be a barcode value (e.g., Universal Product Code (UPC)) assigned to the item 209. In yet another example, the item identifier can be a title or description assigned to the item 209.


In an embodiment, the item recognition module 214, which may include a categorization module to categorize the identified image, may then transmit the item identifier to a service hosted on the server 210 to locate item data. “Item data,” as used herein, refers to a variety of data regarding one or more items (in one embodiment, a business) depicted in an image, the data being posted with or associated with the image. Such item data, for example, may be stored with the images or at other locations. Examples of item data include, in one embodiment, menus related to the business or item. The menus may include locations of the business, prices of the goods or services offered by the business, quantities of the items available at or through the business, availability of the items at the business, and other item data. It should be appreciated that the item recognition module 214 may access a variety of different services by way of, for example, a Web-exposed application program interface (API). In an alternate embodiment, the item recognition module 214 may be embodied with the service itself where, for example, the item recognition module 214 may be hosted in the server 210 with the other services.
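
The lookup from item identifier to item data might resemble the following sketch, in which an in-memory table stands in for the service hosted on the server 210; the record shape and field names are hypothetical.

```python
# Hypothetical identifier-to-item-data lookup; the in-memory table stands in
# for a Web-exposed service on the server 210.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ItemData:
    item_id: str          # a name, UPC/barcode value, or title, per the text above
    locations: List[str]
    menu_options: List[str]

SERVICE = {
    "Sam's Sandwiches": ItemData(
        item_id="Sam's Sandwiches",
        locations=["123 Main St"],
        menu_options=["View the menu", "Order take-out", "Hours of business"],
    ),
}

def locate_item_data(item_identifier: str) -> Optional[ItemData]:
    """What the item recognition module 214 would request from the service."""
    return SERVICE.get(item_identifier)

print(locate_item_data("Sam's Sandwiches"))
```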


The system 200 may also include a global positioning system (not shown) that may be attached to or included in the client processing systems. The client processing systems can transmit the coordinates or location identified by the global positioning system to the services hosted on server 210 and, for example, the services can use the coordinates to locate nearby stores that sell the item 209 depicted in the image.


With reference to FIG. 3, an example embodiment of a high-level client-server-based network architecture 300, more detailed than FIG. 2, is shown; it may include the servers 210 and 212 of FIG. 2. A networked system 302, in an example form of network-server-side functionality, is coupled via a communication network 304 (e.g., the Internet, a wireless network, a cellular network, or a Wide Area Network (WAN)) to one or more client devices 310 and 312. FIG. 3 illustrates, for example, a web client 306 operating via a browser (e.g., the INTERNET EXPLORER® browser developed by Microsoft® Corporation of Redmond, Wash. State), and a programmatic client 308 executing on respective client devices 310 and 312.


The client devices 310 and 312 may comprise a mobile phone, desktop computer, laptop, or any other communication device that a user may utilize to access the networked system 302. In some embodiments, the client device 310 may comprise or be connectable to an image capture device 313 (e.g., camera, camcorder). In further embodiments, the client device 310 may comprise one or more of a touch screen, accelerometer, microphone, and GPS device. The client devices 310 and 312 may each be a device of an individual user interested in visualizing an item within an environment.


An Application Program Interface (API) server 314 and a web server 316 are coupled to, and provide programmatic and web interfaces respectively to, one or more application servers 318. The application servers 318 host a publication system 320 and a payment processor, or payment system, 322, each of which may comprise one or more modules, applications, or engines, and each of which may be embodied as hardware, software, firmware, or any combination thereof. The application servers 318 are, in turn, coupled to one or more database servers 324 facilitating access to one or more information storage repositories or database(s) 326. The databases 326 may also store user account information of the networked system 302 in accordance with example embodiments.


In example embodiments, the publication system 320 publishes content on a network (e.g., Internet). As such, the publication system 320 provides a number of publication functions and services to users that access the networked system 302. The publication system 320 is discussed in more detail in connection with FIG. 4. In example embodiments, the publication system 320 is discussed in terms of a marketplace environment. However, it is noted that the publication system 320 may be associated with a non-marketplace environment such as an informational or social networking environment.


The payment system 322 provides a number of payment services and functions to users. The payment system 322 allows users to accumulate value (e.g., in a commercial currency, such as the U.S. dollar, or a proprietary currency, such as “points”) in their accounts, and then later to redeem the accumulated value for products (e.g., goods or services) that are made available via the publication system 320 or elsewhere on the network 304. The payment system 322 also facilitates payments from a payment mechanism (e.g., a bank account, PayPal™, or credit card) for purchases of items via any type and form of a network-based marketplace.


While the publication system 320 and the payment system 322 are shown in FIG. 3 to both form part of the networked system 302, it will be appreciated that, in alternative embodiments, the payment system 322 may form part of a payment service that is separate and distinct from the networked system 302. Additionally, while the example network architecture 300 of FIG. 3 employs a client-server architecture, a skilled artisan will recognize that the present disclosure is not limited to such an architecture. The example network architecture 300 can equally well find application in, for example, a distributed or peer-to-peer architecture system. The publication system 320 and payment system 322 may also be implemented as standalone systems or standalone software programs operating under separate hardware platforms, which do not necessarily have networking capabilities.


Referring now to FIG. 4, an example block diagram illustrating multiple components that, in one embodiment, are provided within the publication system 320 of the networked system 302 is shown. In one embodiment, the publication system 320 is a marketplace system where items (e.g., goods or services) may be offered for sale. In an alternative embodiment, the publication system 320 is a social networking system or information system. The publication system 320 may be hosted on dedicated or shared server machines (not shown) that are communicatively coupled to enable communications between the server machines. The multiple components themselves are communicatively coupled (e.g., via appropriate interfaces), either directly or indirectly, to each other and to various data sources, to allow information to be passed between the components or to allow the components to share and access common data. Furthermore, the components may access the one or more databases 326 via the one or more database servers 324.


In one embodiment, the publication system 320 provides a number of mechanisms whereby the system 320 may publish menus relating to goods or services of a seller or business, a buyer can express interest in or indicate a desire to purchase such goods or services based on an image, and a price can be set for a transaction pertaining to the goods or services. To this end, the publication system 320 may comprise at least one image receiving module 400, one or more image filtering and processing modules 402, one or more image identification modules 404, one or more image categorization modules 406, and one or more category driven menu modules 408.


The image receiving module 400 is an image receiver that receives images uploaded to the publication system by a user; the images are identified and categorized by the publication system and then used to retrieve menus that, based on the categorization, relate to the image and are desired by the user.


An image filtering and processing module 402 provides well-known functionality for filtering and processing image information in order to remove image defects such as, in one embodiment, defects that lead to undesired red-eye or other flash characteristics. This may allow more effective identification of the image.


An image identification module 404 allows identification of the image submitted by the user. As explained in more detail in the above-incorporated application, an item depicted in an image may be identified by matching the image with known images stored in an image repository. In some embodiments, also as explained in the foregoing application, the match may be based on a comparison of the color histograms of the images.


An image categorization module 406 allows categorization of images identified by image identification module 404. An example of such image categorization is disclosed in U.S. patent application Ser. No. 11/952,026 entitled “Image Categorization Based on Comparisons between Images” filed on Dec. 6, 2007 and assigned to the assignee of the present application. The foregoing application is hereby incorporated herein by reference in its entirety.


A category driven menu module 408 allows generation of category-specific menus. For example, if the image is of a restaurant, the module could generate, as one example, a link to the restaurant's main website, providing menus, directions to the business, hours of operation, specials of the day, take-out information, and the like. This is discussed in further detail below. The category driven menu module 408 may also deliver menus by electronic mail (e-mail), instant message (IM), Short Message Service (SMS), text, facsimile, or voice (e.g., Voice over IP (VoIP)) messages via wired networks (e.g., the Internet), a Plain Old Telephone Service (POTS) network, or wireless networks (e.g., mobile, cellular, WiFi, WiMAX).
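
As a sketch of such delivery, the following hypothetical code renders a generated menu as plain text and formats it for one of the channels named above; a real system would hand the payload to an e-mail or SMS gateway rather than return it.

```python
# Hypothetical menu-delivery sketch; render functions and channels are
# illustrative stand-ins for a real gateway integration.
def render_plain_text(menu: dict) -> str:
    return "\n".join(f"{i + 1}. {opt}" for i, opt in enumerate(menu["options"]))

def deliver(menu: dict, channel: str = "sms") -> str:
    payload = render_plain_text(menu)
    if channel in ("sms", "text"):
        return payload[:160]                      # fit an SMS-sized message
    if channel == "email":
        return f"Subject: Menu for {menu['item']}\n\n{payload}"
    raise ValueError(f"unsupported channel: {channel}")

menu = {"item": "Sam's Sandwiches", "options": ["View the menu", "Order take-out"]}
print(deliver(menu, "email"))
```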


Although the various components of the publication system 320 have been defined in terms of a variety of individual modules and engines, a skilled artisan will recognize that many of the items can be combined or organized in other ways. Alternatively, not all components of the publication system 320 of FIG. 4 may be utilized. Furthermore, not all components of the publication system 320 have been included in FIG. 4. In general, components, protocols, structures, and techniques not directly related to functions of exemplary embodiments (e.g., dispute resolution engine, loyalty promotion engine, personalization engines, etc.) have not been shown or discussed in detail. The description given herein simply provides a variety of exemplary embodiments to aid the reader in an understanding of the systems and methods used herein.



FIG. 4A is an illustration of various modules of an implementation of an image identification module useful in an example embodiment. The image identification module 404 comprises a database 464, which includes an image repository 466. Database 464 may be included as part of database 326 of FIG. 3. Image repository 466 may be used for storing images that are to be compared to an image which may be received from a user over line 452 at image receiving module 400. The image may be filtered and processed at image filtering and processing module 402 to remove or minimize defects. The filtered image enters image identification module 404 over line 462. The image received over line 462 and comparison images from repository 466 are compared in comparator 470. If there is a successful identification, the image identification information is provided over line 480. Alternatively, the image received over line 462 may be compared in comparator 470 with images obtained from network 472 over line 474, identification information again being provided over line 480.



FIG. 5 is an illustration of a contextual menu 500 in accordance with an exemplary embodiment. Consider, for example, an image relating to, say, a local ATT office: the system may filter and process the image, identify it as an ATT store, categorize it as an ATT store, and present a category-driven menu to the user with icons selectable by the user and space for the user to enter information. The possible options may be selectable icons such as “View Bill” 502, “View Upgrades” 504, “View Minutes” 506, and “Store Locator” 508, among others. User space 510 may, in the embodiment under discussion, include space 512 for a user name and space 514 for a user password. For example, if the user selects View Bill 502, he or she may be prompted to include the name and password for the appropriate account for transmission to the ATT organization via a transmit radio button such as 516.
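
A hypothetical data structure for the contextual menu 500 might look like the following; the option labels and reference numerals mirror FIG. 5, while the structure itself is an illustrative assumption.

```python
# Hypothetical representation of contextual menu 500: selectable option icons
# plus a user space with credential fields and a transmit button.
CONTEXTUAL_MENU_500 = {
    "category": "phone company",
    "options": [                       # selectable icons
        {"id": 502, "label": "View Bill"},
        {"id": 504, "label": "View Upgrades"},
        {"id": 506, "label": "View Minutes"},
        {"id": 508, "label": "Store Locator"},
    ],
    "user_space": {                    # user space 510
        "fields": [
            {"id": 512, "label": "User name", "value": ""},
            {"id": 514, "label": "Password", "value": "", "masked": True},
        ],
        "transmit_button": 516,        # sends credentials with the selection
    },
}

print([opt["label"] for opt in CONTEXTUAL_MENU_500["options"]])
```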



FIG. 6 is a flow chart illustrating a method useful in an example embodiment for providing to users contextual menus relating to the images provided to the system seen in FIGS. 2 and 3. Camera 602 may be in a cell phone, personal digital assistant, or laptop, or it may be another camera or a webcam. An image, such as from a photograph, is transmitted from the camera 602 for image filtering and processing at image filtering and processing module 402 of FIG. 4. This image may be transmitted as described above with respect to FIG. 2, FIG. 3, and FIG. 4A. Image filtering and processing module 402 includes filtering software to process the image and remove issues that militate against identification of the image. For example, issues such as red-eye and other anomalies caused by flash are removed by well-known means. The filtered image is transmitted over line 605 for image identification at image identification module 404 of FIG. 4A, which identifies the image transmitted by camera 602. For example, if the image is letters, it may be identified by comparison to stored letters, including trademarks and logos. If the image is a building, it may be identified by comparison with stored images of buildings. Comparison may be by comparator 470 of FIG. 4A, the comparison being to images stored in repository 466 of database 464 for identification, or to images from Internet 472 of FIG. 4A. The identified image, or information that represents the identified image, is transmitted over line 607 for categorization at image categorization module 406 of FIG. 4. Image categorization module 406 categorizes the identified image into a particular category based on the identity, such as, for example, ATT office 610, McDonalds® 612, bakery shop 614, and the like. Based on the categorization, the image categorization module sends an appropriate identifier, by way of one of lines 610, 612, . . . , 614, to category driven menu module 616, which provides the appropriate menu for rendering for the user at display 618. In the described embodiment, the category of the image determines what the menu options might be for the particular image. If the business is widely known, such as ATT, the business's website might be provided by category driven menu module 616. If the identified business is not widely known, and the business subscribes with the publication system for the service described herein, the category driven menu module 616 provides the menu determined by the subscribing business.


The links in the menu will be provided in the user interface. Links for specifications 620 may be, in one example in which the business is a restaurant, a link to the restaurant's website that displays a menu or other business information. As another example, if the image is of, say, a local ATT office, the system may filter and process the image, identify it as an ATT store, categorize it as an ATT store, and present a category-driven menu to the user with various selectable options. The possible options may be “View Bill,” “View Minutes,” “Store Locator,” and the like. A possible submenu for “View Bill” might be “Pay Bill” or “Schedule a Payment.” Submenus 622 may be, if the restaurant is a subscriber and, for example, has menus describing daily specials, a daily-specials submenu that can be linked to. Option to change category 624 provides the user an option to change the category in case the user believes that the category determined by image categorization module 406 may not be appropriate for the image. For example, if the user takes a picture of, say, a coffee shop like Starbucks, and the image is identified and categorized as a restaurant instead of a coffee shop, the menu the system determines is appropriate is forwarded by category driven menu module 616, accompanied by the selectable option to change category 624, which gives the user the option of changing the category from restaurant to coffee shop.

FIG. 7 is a flow chart further illustrating a method according to another embodiment. In operation 702, the system searches to detect, by image receiving module 400, an image submitted by the user. If an image is detected, at the Yes leg, the system at 704 identifies and categorizes the image, by use of image identification module 404 and image categorization module 406. If the No leg is taken, operation 702 continues. After image identification and categorization, operation 706 provides a menu, sometimes called a specification, based on the category. In operation 708 the system may receive information from the user for modifying the menu to make it relate more closely to the intent of the user who submitted the image. Consider the example of “Club One,” which can be either a health club or a dance club, since both share the same name. If the user submitted an image of Club One the dance club, but the system categorizes that image as a health club, the user can change the category of the image to dance club. By doing so, he or she can receive relevant menu options for a dance club, such as “View events,” “View DJ,” or “Buy tickets.” Operation 710 tests to detect whether the intent of the user is to conclude a business transaction based on the menu. If Yes then, optionally, operation 712 determines whether payment is required. If so, the system processes payment through a payment processor associated with the system, such as payment system 322. In the case of a No decision at steps 710 and 712, the system returns to the detection operation at 702.
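
The FIG. 7 loop of categorize, present, re-categorize, and optionally pay might be sketched as follows; the category table, the handle function, and process_payment are hypothetical stand-ins for operations 706 through 712.

```python
# Minimal sketch of the FIG. 7 flow: category-driven menu, optional user
# re-categorization, and optional payment. All names are hypothetical.
CATEGORY_MENUS = {
    "health club": ["View classes", "Join", "Hours"],
    "dance club":  ["View events", "View DJ", "Buy tickets"],
}

def process_payment(category):
    return f"payment processed via payment system 322 for {category}"

def handle(image, guessed_category, user_override=None, wants_transaction=False):
    category = user_override or guessed_category   # operation 708: user may correct
    menu = CATEGORY_MENUS[category]                # operation 706: category-driven menu
    if wants_transaction:                          # operation 710: transaction intended?
        return menu, process_payment(category)     # operation 712: payment required
    return menu, None

# "Club One" miscategorized as a health club; the user overrides to dance club.
menu, receipt = handle(b"club-one.jpg", "health club",
                       user_override="dance club", wants_transaction=True)
print(menu, receipt, sep="\n")
```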


Modules, Components, and Logic


Additionally, certain embodiments described herein may be implemented as logic or a number of modules, engines, components, or mechanisms. A module, engine, logic, component, or mechanism (collectively referred to as a “module”) may be a tangible unit capable of performing certain operations and configured or arranged in a certain manner. In certain example embodiments, one or more computer systems (e.g., a standalone, client, or server computer system) or one or more components of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) or firmware (note that software and firmware can generally be used interchangeably herein as is known by a skilled artisan) as a module that operates to perform certain operations described herein.


In various embodiments, a module may be implemented mechanically or electronically. For example, a module may comprise dedicated circuitry or logic that is permanently configured (e.g., within a special-purpose processor, application specific integrated circuit (ASIC), or array) to perform certain operations. A module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software or firmware to perform certain operations. It will be appreciated that a decision to implement a module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by, for example, cost, time, energy-usage, and package size considerations.


Accordingly, the term “module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. Considering embodiments in which modules or components are temporarily configured (e.g., programmed), each of the modules or components need not be configured or instantiated at any one instance in time. For example, where the modules or components comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different modules at different times. Software may accordingly configure the processor to constitute a particular module at one instance of time and to constitute a different module at a different instance of time.


Modules can provide information to, and receive information from, other modules. Accordingly, the described modules may be regarded as being communicatively coupled. Where multiples of such modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the modules. In embodiments in which multiple modules are configured or instantiated at different times, communications between such modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple modules have access. For example, one module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further module may then, at a later time, access the memory device to retrieve and process the stored output. Modules may also initiate communications with input or output devices and can operate on a resource (e.g., a collection of information).


Example Machine Architecture and Machine-Readable Medium


With reference to FIG. 8, an example embodiment extends to a machine in the example form of a computer system 800 within which instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed. In alternative example embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, a switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.


The example computer system 800 may include a processor 802 (e.g., a central processing unit (CPU), a graphics processing unit (GPU) or both), a main memory 804 and a static memory 806, which communicate with each other via a bus 808. The computer system 800 may further include a video display unit 810 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). In example embodiments, the computer system 800 also includes one or more of an alpha-numeric input device 812 (e.g., a keyboard), a user interface (UI) navigation device or cursor control device 814 (e.g., a mouse), a disk drive unit 816, a signal generation device 818 (e.g., a speaker), and a network interface device 820.


Machine-Readable Storage Medium


The disk drive unit 816 includes a machine-readable storage medium 822 on which is stored one or more sets of instructions 824 and data structures (e.g., software instructions) embodying or used by any one or more of the methodologies or functions described herein. The instructions 824 may also reside, completely or at least partially, within the main memory 804 or within the processor 802 during execution thereof by the computer system 800, with the main memory 804 and the processor 802 also constituting machine-readable media.


While the machine-readable storage medium 822 is shown in an example embodiment to be a single medium, the term “machine-readable storage medium” may include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) that store the one or more instructions. The term “machine-readable medium” shall also be taken to include any tangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of embodiments of the present invention, or that is capable of storing, encoding, or carrying data structures used by or associated with such instructions. The term “machine-readable storage medium” shall accordingly be taken to include, but not be limited to, solid-state memories and optical and magnetic media. Specific examples of machine-readable storage media include non-volatile memory, including by way of example semiconductor memory devices (e.g., Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), and flash memory devices); magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.


Transmission Medium


The instructions 824 may further be transmitted or received over a communications network 826 using a transmission medium via the network interface device 820 and utilizing any one of a number of well-known transfer protocols (e.g., HTTP). Examples of communication networks include a local area network (LAN), a wide area network (WAN), the Internet, mobile telephone networks, POTS networks, and wireless data networks (e.g., WiFi and WiMax networks). The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.


Although an overview of the inventive subject matter has been described with reference to specific example embodiments, various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of embodiments of the present invention. Such embodiments of the inventive subject matter may be referred to herein, individually or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is, in fact, disclosed.


The embodiments illustrated herein are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed. Other embodiments may be used and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. The Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.


Moreover, plural instances may be provided for resources, operations, or structures described herein as a single instance. Additionally, boundaries between various resources, operations, modules, engines, and data stores are somewhat arbitrary, and particular operations are illustrated in a context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within a scope of various embodiments of the present invention. In general, structures and functionality presented as separate resources in the example configurations may be implemented as a combined structure or resource. Similarly, structures and functionality presented as a single resource may be implemented as separate resources. These and other variations, modifications, additions, and improvements fall within a scope of embodiments of the present invention as represented by the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.

Claims
  • 1. A method comprising: receiving, from a client device, an image depicting a physical item; providing, to the client device, an initial menu associated with a first establishment, the initial menu including one or more options for the first establishment, and determined based on an initial categorization of a content of the image depicting the physical item; in response to providing the initial menu, receiving, from the client device, an indication to modify the initial menu including the one or more options, the indication providing additional category information for the image depicting the physical item; and providing, by at least one processor, a second menu for a second establishment to replace the initial menu at the client device and one or more second options for the second establishment to replace the one or more options for the first establishment, the one or more second options for the second establishment determined based on the additional category information provided by the client device.
  • 2. The method of claim 1, wherein the providing the initial menu further comprises determining the one or more options for the initial menu based on location information associated with the image depicting the physical item.
  • 3. The method of claim 2, wherein the providing the initial menu further comprises determining the one or more options for the initial menu based on an image detection of the image depicting the physical item.
  • 4. The method of claim 3, wherein the providing the initial menu further comprises determining the one or more options for the initial menu based on the location information associated with the image depicting the physical item and an image detection of characters depicted in the image of the physical item.
  • 5. The method of claim 4, wherein the initial categorization comprises an identity of the physical item, wherein the initial menu including the one or more options are selected based on the initial categorization including the identity and the location information of the physical item.
  • 6. The method of claim 1, wherein the initial menu includes a first graphical user interface menu for a first network site corresponding to the initial categorization.
  • 7. The method of claim 6, wherein the second menu includes a second graphical user interface menu for a second network site determined based on the additional category information provided by the client device.
  • 8. The method of claim 7, further comprising: transmitting at least a first data item to the second network site; and receiving at least a second data item from the second network site, wherein the first network site is a different site from the second network site.
  • 9. The method of claim 7, further comprising: receiving, from the first network site, a first set of content to include in the first graphical user interface menu; and receiving, from the second network site, a second set of content to include in the second graphical user interface menu.
  • 10. The method of claim 1, wherein the indication to modify the initial menu indicates to change the initial categorization from a first category to a second category.
  • 11. A system comprising: at least one processor; and at least one memory including program code which when executed by the at least one processor causes the system to provide operations comprising: receiving, from a client device, an image depicting a physical item; providing, to the client device, an initial menu associated with a first establishment, the initial menu including one or more options for the first establishment, and determined based on an initial categorization of a content of the image depicting the physical item; in response to providing the initial menu, receiving, from the client device, an indication to modify the initial menu including the one or more options, the indication providing additional category information for the image depicting the physical item; and providing a second menu for a second establishment to replace the initial menu at the client device and one or more second options for the second establishment to replace the one or more options for the first establishment, the one or more second options for the second establishment determined based on the additional category information provided by the client device.
  • 12. The system of claim 11, wherein the providing the initial menu further comprises determining the one or more options for the initial menu based on location information associated with the image depicting the physical item.
  • 13. The system of claim 12, wherein the providing the initial menu further comprises determining the one or more options for the initial menu based on an image detection of the image depicting the physical item.
  • 14. The system of claim 13, wherein the providing the initial menu further comprises determining the one or more options for the initial menu based on the location information associated with the image depicting the physical item and an image detection of characters depicted in the image of the physical item.
  • 15. The system of claim 14, wherein the initial categorization comprises an identity of the physical item, wherein the initial menu including the one or more options are selected based on the initial categorization including the identity and the location information of the physical item.
  • 16. The system of claim 15, wherein the initial menu includes a first graphical user interface menu for a first network site corresponding to the initial categorization.
  • 17. The system of claim 16, wherein the second menu includes a second graphical user interface menu for a second network site determined based on the additional category information provided by the client device.
  • 18. The system of claim 17, the operations further comprising: transmitting at least a first data item to the second network site; and receiving at least a second data item from the second network site, wherein the first network site is a different site from the second network site.
  • 19. The system of claim 17, the operations further comprising: receiving, from the first network site, a first set of content to include in the first graphical user interface menu.
  • 20. The system of claim 17, the operations further comprising: receiving, from the second network site, a second set of content to include in the second graphical user interface menu.
  • 21. A non-transitory computer-readable storage medium including program code which when executed by at least one processor causes a system to perform operations comprising: receiving, from a client device, an image depicting a physical item; providing, to the client device, an initial menu associated with a first establishment, the initial menu including one or more options for the first establishment, and determined based on an initial categorization of a content of the image depicting the physical item; in response to providing the initial menu, receiving, from the client device, an indication to modify the initial menu including the one or more options, the indication providing additional category information for the image depicting the physical item; and providing a second menu for a second establishment to replace the initial menu at the client device and one or more second options for the second establishment to replace the one or more options for the first establishment, the one or more second options for the second establishment determined based on the additional category information provided by the client device.
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a Continuation of application Ser. No. 13/537,482, filed on Jun. 29, 2012, entitled “CONTEXTUAL MENUS BASED ON IMAGE RECOGNITION,” the entire contents of which are incorporated herein by reference.

20070098234 Fiala May 2007 A1
20070100740 Penagulur et al. May 2007 A1
20070104348 Cohen May 2007 A1
20070122947 Sakurai et al. May 2007 A1
20070133947 Armitage et al. Jun 2007 A1
20070143082 Degnan Jun 2007 A1
20070150403 Mock et al. Jun 2007 A1
20070159522 Neven Jul 2007 A1
20070172155 Guckenberger Jul 2007 A1
20070198505 Fuller Aug 2007 A1
20070202844 Wilson et al. Aug 2007 A1
20070230817 Kokojima Oct 2007 A1
20070244633 Phillips et al. Oct 2007 A1
20070244924 Sadovsky et al. Oct 2007 A1
20070300161 Bhatia et al. Dec 2007 A1
20080003966 Magnusen Jan 2008 A1
20080005074 Flake et al. Jan 2008 A1
20080035725 Jambunathan et al. Feb 2008 A1
20080037877 Jia et al. Feb 2008 A1
20080046738 Galloway et al. Feb 2008 A1
20080046956 Kulas Feb 2008 A1
20080059055 Geelen et al. Mar 2008 A1
20080071559 Arrasvuori Mar 2008 A1
20080074424 Carignano Mar 2008 A1
20080082426 Gokturk et al. Apr 2008 A1
20080084429 Wissinger Apr 2008 A1
20080092551 Skowronski Apr 2008 A1
20080104054 Spangler May 2008 A1
20080109301 Yee et al. May 2008 A1
20080126193 Robinson May 2008 A1
20080126251 Wassingbo May 2008 A1
20080127647 Leitner Jun 2008 A1
20080142599 Benillouche et al. Jun 2008 A1
20080151092 Vilcovsky Jun 2008 A1
20080154710 Varma Jun 2008 A1
20080163311 St. John-Larkin Jul 2008 A1
20080163379 Robinson et al. Jul 2008 A1
20080165032 Lee et al. Jul 2008 A1
20080170810 Wu et al. Jul 2008 A1
20080176545 Dicke et al. Jul 2008 A1
20080177640 Gokturk et al. Jul 2008 A1
20080186226 Ratnakar Aug 2008 A1
20080194323 Merkli et al. Aug 2008 A1
20080201241 Pecoraro Aug 2008 A1
20080205755 Jackson et al. Aug 2008 A1
20080205764 Iwai et al. Aug 2008 A1
20080207357 Savarese et al. Aug 2008 A1
20080225123 Osann et al. Sep 2008 A1
20080240575 Panda et al. Oct 2008 A1
20080248815 Busch Oct 2008 A1
20080255961 Livesey Oct 2008 A1
20080267521 Gao et al. Oct 2008 A1
20080268876 Gelfand et al. Oct 2008 A1
20080278778 Saino Nov 2008 A1
20080285940 Kulas Nov 2008 A1
20080288338 Wiseman et al. Nov 2008 A1
20080288477 Kim et al. Nov 2008 A1
20080313078 Payne et al. Dec 2008 A1
20090006208 Grewal et al. Jan 2009 A1
20090019487 Kulas Jan 2009 A1
20090028435 Wu et al. Jan 2009 A1
20090028446 Wu et al. Jan 2009 A1
20090034260 Ziemkowski et al. Feb 2009 A1
20090076925 DeWitt et al. Mar 2009 A1
20090083096 Cao et al. Mar 2009 A1
20090094260 Cheng et al. Apr 2009 A1
20090099951 Pandurangan Apr 2009 A1
20090106127 Purdy et al. Apr 2009 A1
20090109240 Englert et al. Apr 2009 A1
20090110241 Takemoto et al. Apr 2009 A1
20090141986 Boncyk et al. Jun 2009 A1
20090144624 Barnes, Jr. Jun 2009 A1
20090148052 Sundaresan Jun 2009 A1
20090182810 Higgins et al. Jul 2009 A1
20090228342 Walker et al. Sep 2009 A1
20090232354 Camp et al. Sep 2009 A1
20090235181 Saliba et al. Sep 2009 A1
20090235187 Kim et al. Sep 2009 A1
20090240735 Grandhi et al. Sep 2009 A1
20090245638 Collier et al. Oct 2009 A1
20090262137 Walker et al. Oct 2009 A1
20090271293 Parkhurst et al. Oct 2009 A1
20090287587 Bloebaum et al. Nov 2009 A1
20090299824 Barnes, Jr. Dec 2009 A1
20090304267 Tapley et al. Dec 2009 A1
20090319373 Barrett Dec 2009 A1
20090319388 Yuan et al. Dec 2009 A1
20090319887 Waltman et al. Dec 2009 A1
20090324100 Kletter et al. Dec 2009 A1
20090324137 Stallings et al. Dec 2009 A1
20090325554 Reber Dec 2009 A1
20100015960 Reber Jan 2010 A1
20100015961 Reber Jan 2010 A1
20100015962 Reber Jan 2010 A1
20100034469 Thorpe et al. Feb 2010 A1
20100037177 Golsorkhi Feb 2010 A1
20100045701 Scott et al. Feb 2010 A1
20100046842 Conwell Feb 2010 A1
20100048290 Baseley et al. Feb 2010 A1
20100049663 Kane et al. Feb 2010 A1
20100070996 Liao et al. Mar 2010 A1
20100082927 Riou Apr 2010 A1
20100131714 Chandrasekaran May 2010 A1
20100153378 Sardesai Jun 2010 A1
20100161605 Gabrilovich et al. Jun 2010 A1
20100171758 Maassel et al. Jul 2010 A1
20100171999 Namikata et al. Jul 2010 A1
20100185529 Chesnut et al. Jul 2010 A1
20100188510 Yoo et al. Jul 2010 A1
20100198684 Eraker et al. Aug 2010 A1
20100211900 Fujioka Aug 2010 A1
20100214284 Rieffel et al. Aug 2010 A1
20100235259 Farraro et al. Sep 2010 A1
20100241650 Chittar Sep 2010 A1
20100257024 Holmes et al. Oct 2010 A1
20100260426 Huang et al. Oct 2010 A1
20100281417 Yolleck et al. Nov 2010 A1
20100287511 Meier et al. Nov 2010 A1
20100289817 Meier et al. Nov 2010 A1
20100293068 Drakoulis Nov 2010 A1
20100312596 Saffari et al. Dec 2010 A1
20100316288 Ip et al. Dec 2010 A1
20100332283 Ng et al. Dec 2010 A1
20100332304 Higgins et al. Dec 2010 A1
20110004517 Soto et al. Jan 2011 A1
20110016487 Chalozin et al. Jan 2011 A1
20110029334 Reber Feb 2011 A1
20110047075 Fourez Feb 2011 A1
20110053642 Lee Mar 2011 A1
20110055054 Glasson Mar 2011 A1
20110061011 Hoguet Mar 2011 A1
20110065496 Gagner et al. Mar 2011 A1
20110078305 Varela Mar 2011 A1
20110084983 Demaine Apr 2011 A1
20110090343 Alt et al. Apr 2011 A1
20110128288 Petrou et al. Jun 2011 A1
20110128300 Gay et al. Jun 2011 A1
20110131241 Petrou Jun 2011 A1
20110143731 Ramer et al. Jun 2011 A1
20110148924 Tapley et al. Jun 2011 A1
20110153614 Solomon Jun 2011 A1
20110173191 Tsaparas et al. Jul 2011 A1
20110184780 Alderson et al. Jul 2011 A1
20110187306 Aarestrup et al. Aug 2011 A1
20110212717 Rhoads et al. Sep 2011 A1
20110214082 Osterhout Sep 2011 A1
20110215138 Crum Sep 2011 A1
20110238472 Sunkada Sep 2011 A1
20110238476 Carr et al. Sep 2011 A1
20110246064 Nicholson Oct 2011 A1
20110277744 Gordon et al. Nov 2011 A1
20110307338 Carlson Dec 2011 A1
20110313874 Hardie et al. Dec 2011 A1
20120072233 Hanlon et al. Mar 2012 A1
20120084812 Thompson et al. Apr 2012 A1
20120097151 Matsuno et al. Apr 2012 A1
20120099800 Llano et al. Apr 2012 A1
20120105475 Tseng May 2012 A1
20120113141 Zimmerman et al. May 2012 A1
20120120113 Hueso May 2012 A1
20120126974 Phillips et al. May 2012 A1
20120129553 Phillips et al. May 2012 A1
20120130796 Busch May 2012 A1
20120150619 Jacob et al. Jun 2012 A1
20120165046 Rhoads et al. Jun 2012 A1
20120179716 Takami Jul 2012 A1
20120185492 Israel et al. Jul 2012 A1
20120192235 Tapley et al. Jul 2012 A1
20120195464 Ahn Aug 2012 A1
20120197764 Nuzzi et al. Aug 2012 A1
20120215612 Ramer et al. Aug 2012 A1
20120230581 Miyashita et al. Sep 2012 A1
20120239483 Yankovich et al. Sep 2012 A1
20120239501 Yankovich et al. Sep 2012 A1
20120284105 Li Nov 2012 A1
20120308077 Tseng Dec 2012 A1
20120323707 Urban Dec 2012 A1
20120327115 Chhetri et al. Dec 2012 A1
20130006735 Koenigsberg et al. Jan 2013 A1
20130019177 Schlossberg et al. Jan 2013 A1
20130036438 Kutaragi Feb 2013 A1
20130050218 Beaver et al. Feb 2013 A1
20130073365 McCarthy Mar 2013 A1
20130086029 Hebert Apr 2013 A1
20130103306 Uetake Apr 2013 A1
20130106910 Sacco May 2013 A1
20130110624 Mitrovic May 2013 A1
20130116922 Cai et al. May 2013 A1
20130144701 Kulasooriya Jun 2013 A1
20130151366 Godsey Jun 2013 A1
20130170697 Zises Jul 2013 A1
20130198002 Nuzzi et al. Aug 2013 A1
20130262231 Glasgow et al. Oct 2013 A1
20130325839 Goddard et al. Dec 2013 A1
20140000701 Govande et al. Jan 2014 A1
20140007012 Govande et al. Jan 2014 A1
20140085333 Pugazhendhi et al. Mar 2014 A1
20140237352 Sriganesh et al. Aug 2014 A1
20140372449 Chittar Dec 2014 A1
20150006291 Yankovich et al. Jan 2015 A1
20150032531 Yankovich et al. Jan 2015 A1
20150052171 Cheung Feb 2015 A1
20150065177 Phillips et al. Mar 2015 A1
20150148078 Phillips et al. May 2015 A1
20150163632 Phillips et al. Jun 2015 A1
20160019723 Tapley et al. Jan 2016 A1
20160034944 Raab et al. Feb 2016 A1
20160117863 Pugazhendhi et al. Apr 2016 A1
20160171305 Zises Jun 2016 A1
20160364793 Sacco Dec 2016 A1
20170046593 Tapley et al. Feb 2017 A1
20170091975 Zises Mar 2017 A1
20180124513 Kim et al. May 2018 A1
20180189863 Tapley et al. Jul 2018 A1
20180336734 Tapley et al. Nov 2018 A1
20190266614 Grandhi et al. Aug 2019 A1
20210166061 Tapley et al. Jun 2021 A1
Foreign Referenced Citations (81)
Number Date Country
2012212601 Oct 2013 AU
2015264850 Dec 2015 AU
1750001 Mar 2006 CN
1802586 Jul 2006 CN
1865809 Nov 2006 CN
2881449 Mar 2007 CN
1255989 Jun 2007 CN
101153757 Apr 2008 CN
101515195 Aug 2009 CN
101515198 Aug 2009 CN
101520904 Sep 2009 CN
101541012 Sep 2009 CN
101764973 Jun 2010 CN
101772779 Jul 2010 CN
101893935 Nov 2010 CN
102084391 Jun 2011 CN
102156810 Aug 2011 CN
102194007 Sep 2011 CN
102667913 Sep 2012 CN
103443817 Dec 2013 CN
104081379 Oct 2014 CN
104656901 May 2015 CN
105787764 Jul 2016 CN
1365358 Nov 2003 EP
1710717 Oct 2006 EP
2015244 Jan 2009 EP
2034433 Mar 2009 EP
2418275 Mar 2006 GB
56-085650 Jul 1981 JP
57-164286 Oct 1982 JP
59-107144 Jun 1984 JP
59-196956 Nov 1984 JP
59-196211 Dec 1984 JP
60-078250 May 1985 JP
61-115805 Jul 1986 JP
63-013113 Mar 1988 JP
11-191118 Jul 1999 JP
2942851 Aug 1999 JP
2000-110515 Apr 2000 JP
2000-279944 Oct 2000 JP
2001-283079 Oct 2001 JP
2001-309323 Nov 2001 JP
2001-344479 Dec 2001 JP
2002-004943 Jan 2002 JP
2002-099826 Apr 2002 JP
2003-014316 Jan 2003 JP
2003-022395 Jan 2003 JP
2004-326229 Nov 2004 JP
2005-337966 Dec 2005 JP
2006-209658 Aug 2006 JP
2006-351024 Dec 2006 JP
3886045 Feb 2007 JP
2007-172605 Jul 2007 JP
3143216 Jul 2008 JP
2010-039908 Feb 2010 JP
2010-141371 Jun 2010 JP
2010-524110 Jul 2010 JP
2011-209934 Oct 2011 JP
2012-529685 Nov 2012 JP
10-2006-0126717 Dec 2006 KR
10-2007-0014532 Feb 2007 KR
10-0805607 Feb 2008 KR
10-0856585 Sep 2008 KR
10-2009-0056792 Jul 2009 KR
10-2009-0070900 Jul 2009 KR
10-2010-0067921 Jun 2010 KR
10-2010-0071559 Jun 2010 KR
10-2011-0082690 Jul 2011 KR
9944153 Sep 1999 WO
0122326 Mar 2001 WO
2005072157 Aug 2005 WO
2008003966 Jan 2008 WO
2008051538 May 2008 WO
2009111047 Sep 2009 WO
2010084585 Dec 2010 WO
2010141939 Dec 2010 WO
2011070871 Jun 2011 WO
2011087797 Jul 2011 WO
2012106096 Aug 2012 WO
2013063299 May 2013 WO
2013101903 Jul 2013 WO
Non-Patent Literature Citations (271)
Response to Non-Final Office Action filed on Jun. 12, 2015, for U.S. Appl. No. 12/371,882, dated Mar. 12, 2015, 18 pages.
Amendment After Notice of Allowance Under 37 CFR 1.312 filed on Jul. 27, 2020, for U.S. Appl. No. 13/537,482, 10 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 13/537,482, dated Feb. 23, 2018, 3 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 13/537,482, dated Sep. 27, 2016, 3 pages.
Applicant Initiated Interview Summary Received for U.S. Appl. No. 13/537,482, dated Sep. 12, 2019, 3 pages.
Final Office Action received for U.S. Appl. No. 13/537,482, dated Dec. 13, 2018, 18 pages.
Final Office Action received for U.S. Appl. No. 13/537,482, dated May 8, 2014, 20 pages.
Final Office Action received for U.S. Appl. No. 13/537,482, dated May 22, 2015, 32 pages.
Final Office Action received for U.S. Appl. No. 13/537,482, dated Nov. 7, 2016, 17 pages.
Final Office Action received for U.S. Appl. No. 13/537,482, dated Nov. 24, 2017, 19 pages.
Final Office Action received for U.S. Appl. No. 13/537,482, dated Jan. 7, 2020, 25 pages.
Non-Final Office Action received for U.S. Appl. No. 13/537,482, dated Jan. 6, 2014, 25 pages.
Non-Final Office Action received for U.S. Appl. No. 13/537,482, dated Jun. 24, 2016, 19 pages.
Non-Final Office Action received for U.S. Appl. No. 13/537,482, dated Jun. 28, 2017, 25 pages.
Non-Final Office Action received for U.S. Appl. No. 13/537,482, dated Nov. 6, 2014, 24 pages.
Non-Final Office Action Received for U.S. Appl. No. 13/537,482 dated May 16, 2018, 20 pages.
Non-Final Office Action received for U.S. Appl. No. 13/537,482, dated Jun. 20, 2019, 22 pages.
Notice of Allowance received for U.S. Appl. No. 13/537,482, dated Apr. 8, 2020, 13 pages.
Notice of Allowance received for U.S. Appl. No. 13/537,482, dated Jul. 13, 2020, 12 pages.
Response to Final Office Action filed on Feb. 7, 2017 for U.S. Appl. No. 13/537,482, dated Nov. 7, 2016, 17 pages.
Response to Final Office Action filed on Feb. 19, 2018 for U.S. Appl. No. 13/537,482, dated Nov. 24, 2017, 18 pages.
Response to Final Office Action filed on Mar. 13, 2019, for U.S. Appl. No. 13/537,482, dated Dec. 13, 2018, 16 pages.
Response to Final Office Action filed on Mar. 9, 2020, for U.S. Appl. No. 13/537,482, dated Jan. 7, 2020, 15 pages.
Response to Final Office Action filed on Nov. 23, 2015 for U.S. Appl. No. 13/537,482, dated May 22, 2015, 10 pages.
Response to Final Office Action filed on Sep. 8, 2014 for U.S. Appl. No. 13/537,482, dated May 8, 2014, 10 pages.
Response to Non-Final Office Action filed on Apr. 6, 2015 for U.S. Appl. No. 13/537,482, dated Nov. 6, 2014, 8 pages.
Response to Non-Final Office Action filed on Apr. 22, 2014 for U.S. Appl. No. 13/537,482, dated Jan. 6, 2014, 10 pages.
Response to Non-Final Office Action filed on Nov. 13, 2019 for U.S. Appl. No. 13/537,482 dated Jun. 20, 2019, 16 pages.
Response to Non-Final Office Action filed on Sep. 17, 2018, for U.S. Appl. No. 13/537,482, dated May 16, 2018, 13 pages.
Response to Non-Final Office Action filed on Sep. 23, 2016 for U.S. Appl. No. 13/537,482, dated Jun. 24, 2016, 11 pages.
Response to Non-Final Office Action filed on Sep. 28, 2017 for U.S. Appl. No. 13/537,482, dated Jun. 28, 2017, 20 pages.
Response to Rule 312 Communication Received for U.S. Appl. No. 13/537,482, dated Aug. 31, 2020, 2 pages.
Applicant Initiated Interview Summary Received for U.S. Appl. No. 15/337,899, dated May 26, 2020, 3 pages.
Corrected Notice of Allowability received for U.S. Appl. No. 15/337,899, dated Sep. 10, 2020, 2 pages.
Notice of Allowance received for U.S. Appl. No. 15/337,899, dated Jul. 30, 2020, 7 pages.
Response to Non-Final Office Action Filed on May 21, 2020, for U.S. Appl. No. 15/337,899, dated Feb. 5, 2020, 12 pages.
Notice of Allowance received for U.S. Appl. No. 15/337,899, dated Nov. 17, 2020, 7 Pages.
Corrected Notice of Allowability received for U.S. Appl. No. 15/337,899, dated Feb. 24, 2021, 2 pages.
Troaca, “S60 Camera Phones Get Image Recognition Technology”, http://news.softpedia.com/news/S60-Camera-Phones-Get-Image-Recognition-Technology-79666.shtml, Feb. 27, 2008, pp. 1-2.
Final Office Action received for U.S. Appl. No. 15/337,899, dated Nov. 14, 2019, 20 pages.
Non-Final Office Action received for U.S. Appl. No. 17/177,862, dated Aug. 24, 2021, 10 pages.
U.S. Appl. No. 13/537,482, filed Jun. 29, 2012, Issued.
“SnapTell: Technology,” Retrieved from the Internet: <URL: http:/!web.archive.org/web/20071117023817/http://www.snaptell.com/technology/index.htm>, Nov. 17, 2007, 1 page.
“The ESP Game,” Retrieved from the Internet: <URL: http://www.espgame.org/instructions.html>. Accessed on Nov. 13, 2007, 2 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 12/371,882, dated Apr. 27, 2016, 3 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 12/371,882, dated Feb. 27, 2012, 3 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 12/371,882, dated Jul. 21, 2015, 3 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 12/371,882, dated Nov. 20, 2013, 3 pages.
Final Office Action received for U.S. Appl. No. 12/371,882, dated Jun. 25, 2015, 27 pages.
Final Office Action received for U.S. Appl. No. 12/371,882, dated Dec. 18, 2013, 26 pages.
Final Office Action received for U.S. Appl. No. 12/371,882, dated Nov. 14, 2011, 21 pages.
Final Office Action received for U.S. Appl. No. 12/371,882, dated Mar. 13, 2013, 24 pages.
Non-Final Office Action received for U.S. Appl. No. 12/371,882, dated Jun. 8, 2011, 22 pages.
Non-Final Office Action received for U.S. Appl. No. 12/371,882, dated Mar. 12, 2015, 29 pages.
Non-Final Office Action received for U.S. Appl. No. 12/371,882, dated Aug. 30, 2013, 20 pages.
Non-Final Office Action received for U.S. Appl. No. 12/371,882, dated Feb. 8, 2016, 37 pages.
Non-Final Office Action received for U.S. Appl. No. 12/371,882, dated Oct. 23, 2012, 21 pages.
Notice of Allowance received for U.S. Appl. No. 12/371,882, dated Jul. 20, 2016, 5 pages.
Preliminary amendment filed for U.S. Appl. No. 12/371,882, dated Feb. 16, 2009, 4 pages.
Preliminary Amendment received for U.S. Appl. No. 12/371,882, filed Jun. 19, 2009, 3 pages.
Response to Final Office Action filed on Jun. 13, 2013, for U.S. Appl. No. 12/371,882, dated Mar. 13, 2013, 14 pages.
Response to Final Office Action filed on Mar. 14, 2012, for U.S. Appl. No. 12/371,882, dated Nov. 14, 2011, 10 pages.
Response to Final Office Action filed on May 8, 2014, for U.S. Appl. No. 12/371,882, dated Dec. 18, 2013, 12 Pages.
Response to Final Office Action filed on Sep. 25, 2015, for U.S. Appl. No. 12/371,882, dated Jun. 25, 2015, 13 pages.
Response to Non-Final Office Action filed on Jan. 22, 2013, for U.S. Appl. No. 12/371,882, dated Oct. 23, 2012, 12 pages.
Response to Non-Final Office Action filed on May 9, 2016, for U.S. Appl. No. 12/371,882, dated Feb. 8, 2016, 14 pages.
Response to Non-Final Office Action filed on Sep. 8, 2011, for U.S. Appl. No. 12/371,882, dated Jun. 8, 2011, 13 pages.
Response to Non-Final Office Action filed on Dec. 2, 2013 for U.S. Appl. No. 12/371,882, dated Aug. 30, 2013, 13 pages.
Terada, "New Cell Phone Services Tap Image-Recognition Technologies", Retrieved from the Internet: <URL: http://search.japantimes.co.jp/cgi-bin/nb20070626a1.html>, Jun. 26, 2007, pp. 1-3.
Non-Final Office Action received for U.S. Appl. No. 12/398,957, dated Jul. 29, 2011, 23 pages.
Non-Final Office Action received for U.S. Appl. No. 12/398,957, dated Mar. 29, 2012, 23 pages.
Response to Non-Final Office Action filed on Dec. 29, 2011 for U.S. Appl. No. 12/398,957, dated Jul. 29, 2011, 15 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 12/406,016, dated May 15, 2012, 3 pages.
Final Office Action received for U.S. Appl. No. 12/406,016, dated Feb. 29, 2012, 25 pages.
Non-Final Office Action received for U.S. Appl. No. 12/406,016, dated Jun. 21, 2011, 21 pages.
Response to Final Office Action filed on May 17, 2012, for U.S. Appl. No. 12/406,016, dated Feb. 29, 2012, 16 pages.
Response to Non Final Office Action filed on Sep. 21, 2011, for U.S. Appl. No. 12/406,016, dated Jun. 21, 2011, 17 pages.
RedLaser, "RedLaser—Impossibly Accurate Barcode Scanning", Retrieved from the Internet URL: <http://redlaser.com/index.php>, Jul. 8, 2011, pp. 1-2.
Patterson, "Amazon iPhone App Takes Snapshots, Looks for a Match", Retrieved from the Internet: <URL: http://tech.yahoo.com/blogs/patterson/30983>, Dec. 3, 2008, 3 pages.
Parker, “Algorithms for Image Processing and Computer Vision”, Wiley Computer Publishing, 1997, pp. 23-29.
Occipitalhq, "RedLaser 2.0: Realtime iPhone UPC barcode scanning", Available online at URL: <https://www.youtube.com/watch?v=9_hFGsmx_6k>, Jun. 16, 2009, 2 pages.
Mello Jr., "Pongr Giving Cell Phone Users Way to Make Money", Retrieved from the Internet URL: <https://www.pcworld.com/article/240209/pongr_giving_cell_phone_users_way_to_make_money.html>, Sep. 18, 2011, 4 pages.
Gonsalves, "Amazon Launches Experimental Mobile Shopping Feature", Retrieved from the Internet: <URL: http://www.informationweek.com/news/internet/retail/showArticle.jhtml?articleID=212201750&subSection=News>, Dec. 3, 2008, 1 page.
Ahn et al., “Labeling Images with a Computer Game”, Retrieved from the Internet URL:<http://ael.gatech.edu/cs6452f13/files/2013/08/labeling-images.pdf>, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2004, 8 pages.
U.S. Appl. No. 61/033,940, “Image Recognition as a Service” filed on Mar. 5, 2008, 45 pages.
Response to First Action Interview Office Action Summary filed on Sep. 6, 2019, for U.S. Appl. No. 15/337,899, dated Jun. 25, 2019, 14 pages.
Response to First Action Interview—Pre-Interview Communication filed on May 16, 2019, for U.S. Appl. No. 15/337,899, dated Mar. 19, 2019, 10 pages.
Response to Final Office Action filed on Jan. 13, 2020 for U.S. Appl. No. 15/337,899, dated Nov. 14, 2019, 15 pages.
Preliminary Amendment for U.S. Appl. No. 15/337,899, filed Nov. 11, 2016, 8 pages.
Non-Final Office Action received for U.S. Appl. No. 15/337,899 dated Feb. 5, 2020, 11 pages.
First Action Interview-Office Action Summary received for U.S. Appl. No. 15/337,899, dated Jun. 25, 2019, 6 pages.
First Action Interview—Pre-Interview Communication received for U.S. Appl. No. 15/337,899, dated Mar. 19, 2019, 6 pages.
About the Eclipse Foundation, Retrieved from Internet URL: <http://www.eclipse.org/org/>, Accessed on Nov. 2, 2021, 2 pages.
Apache Tomcat, The Apache Software Foundation, Retrieved from the Internet URL: <http://tomcat.apache.org/>, Accessed on Nov. 2, 2021, 4 pages.
U.S. Appl. No. 11/690,720, Final Office Action dated Apr. 27, 2010, 10 pages.
U.S. Appl. No. 13/050,769, Final Office Action dated Jun. 17, 2013, 10 pages.
U.S. Appl. No. 13/050,769, Non-Final Office Action dated Jan. 11, 2013, 10 pages.
U.S. Appl. No. 14/611,210, Notice of Allowance dated Jun. 16, 2014, 8 pages.
U.S. Appl. No. 14/611,210, Pre-Interview First Office Action dated Mar. 22, 2016, 4 pages.
Araki et al., "Follow-The-Trial-Fitter: Real-Time Dressing without Undressing", Retrieved from the Internet URL: <https://dialog.proquest.com/professional/printviewfile?accountid=142257>, Dec. 1, 2008, 8 pages.
Chinese Application Serial No. 200980107871.0, Decision of Reexamination dated Nov. 30, 2015, 11 pages (English Translation only).
Chinese Application Serial No. 200980107871.0, Notification of Reexamination dated Aug. 7, 2015, 22 pages (13 pages of Official Copy and 9 pages of English Translation).
Chinese Application Serial No. 200980107871.0, Office Action dated Feb. 2, 2012, 11 pages (5 pages of Official Copy and 6 pages of English Translation).
Chinese Application Serial No. 200980107871.0, Office Action dated Jun. 5, 2014, 15 pages (9 pages of English translation of claims and 6 pages of official copy).
Chinese Application Serial No. 200980107871.0, Office Action dated May 3, 2013, 18 pages (10 pages of English translation and 8 pages of official copy).
Chinese Application Serial No. 200980107871.0, Office Action dated Nov. 1, 2012, 13 pages (5 pages of Official Copy and 8 pages of English Translation).
Chinese Application Serial No. 201080059424.5, Office Action dated Apr. 21, 2014, 19 pages (11 pages English translation and 8 pages official copy).
Chinese Application Serial No. 201510088798.4, Office Action dated Mar. 17, 2017, 23 pages (14 pages of English Translation and 9 pages of Official Copy).
Communication Pursuant to Article 94(3) EPC received for European Patent Application No. 09717996.4, dated Jul. 23, 2013, 7 pages.
Communication Pursuant to Article 94(3) EPC received for European Patent Application No. 10803429.9, dated Aug. 30, 2018, 6 pages.
Communication Pursuant to Article 94(3) EPC received for European Patent Application No. 10803429.9, dated Feb. 16, 2018, 8 pages.
Communication Pursuant to Article 94(3) EPC received for European Patent Application No. 17171025.4, dated Feb. 7, 2020, 6 pages.
Communication under Rule 71(3) for European Patent Application No. 10803429.9, dated Jun. 6, 2019, 7 pages.
Decision of Rejection received for Chinese Patent Application No. 201610108229.6, dated Mar. 26, 2020, 11 pages (7 pages of Official Copy and 4 pages of English Translation of Claims).
Draw Something, Retrieved from the Internet URL: <http://omgpop.com/drawsomething>, Accessed on Feb. 16, 2018, 2 pages.
DS Development Software, "Email Protocols: IMAP, POP3, SMTP and HTTP", Retrieved from the Internet URL: <http://www.emailaddressmanager.com/tips/protocol.html>, © 2004-2013 Digital Software Development, Accessed on Nov. 8, 2021, 1 page.
Duke University, “How to Write Advertisements that Sell”, Company: System, The Magazine of Business, 1912, 66 pages.
EBay Developers Program, Retrieved from the Internet URL: <https://developer.ebay.com/common/api/>, Accessed on Nov. 8, 2021, 3 pages.
European Application Serial No. 09717996.4, Extended European Search Report dated Feb. 17, 2011, 6 pages.
European Application Serial No. 10803429.9, Extended European Search Report dated Jun. 17, 2015, 7 pages.
Extended European Search Report received for European Patent Application No. 17171025.4, dated Sep. 4, 2017, 7 pages.
Extended European Search Report Received for European Patent Application No. 19184977.7 dated Sep. 26, 2019, 10 pages.
Final Office Action received for U.S. Appl. No. 13/194,584, dated Jan. 22, 2016, 27 pages.
Final Office Action received for U.S. Appl. No. 12/398,957, dated Jan. 17, 2020, 24 pages.
Final Office Action received for U.S. Appl. No. 12/398,957, dated Jan. 22, 2018, 20 pages.
Final Office Action received for U.S. Appl. No. 12/398,957, dated Jun. 24, 2020, 17 pages.
Final Office Action received for U.S. Appl. No. 13/194,584, dated Jul. 27, 2017, 35 pages.
Final Office Action received for U.S. Appl. No. 11/140,273, dated Dec. 13, 2007, 11 pages.
Final Office Action received for U.S. Appl. No. 11/140,273, dated Jul. 15, 2009, 11 pages.
Final Office Action received for U.S. Appl. No. 11/690,720, dated Nov. 9, 2011, 17 pages.
Final Office Action received for U.S. Appl. No. 12/398,957, dated Jul. 18, 2014, 27 pages.
Final Office Action received for U.S. Appl. No. 12/398,957, dated Nov. 7, 2012, 22 pages.
Final Office Action received for U.S. Appl. No. 12/644,957, dated Aug. 26, 2013, 19 pages.
Final Office Action received for U.S. Appl. No. 12/644,957, dated Jul. 11, 2014, 25 pages.
Final Office Action received for U.S. Appl. No. 13/324,834, dated Mar. 27, 2014, 22 pages.
Final Office Action received for U.S. Appl. No. 13/324,834, dated Apr. 28, 2015, 20 pages.
Final Office Action received for U.S. Appl. No. 13/324,834, dated Jan. 13, 2014, 13 Pages.
Final Office Action received for U.S. Appl. No. 13/339,235, dated Aug. 29, 2012, 10 pages.
Final Office Action received for U.S. Appl. No. 13/339,235, dated Dec. 2, 2014, 7 pages.
Final Office Action received for U.S. Appl. No. 13/339,235, dated Jan. 27, 2017, 16 pages.
Final Office Action received for U.S. Appl. No. 13/361,196, dated Jan. 22, 2013, 15 pages.
Final Office Action received for U.S. Appl. No. 13/436,370, dated Jun. 12, 2015, 18 pages.
Final Office Action received for U.S. Appl. No. 13/436,370, dated Oct. 13, 2016, 14 pages.
Final Office Action received for U.S. Appl. No. 14/473,809, dated Apr. 14, 2016, 23 pages.
Final Office Action received for U.S. Appl. No. 14/486,518, dated Dec. 8, 2015, 7 pages.
Final Office Action received for U.S. Appl. No. 14/512,350, dated Aug. 23, 2017, 21 pages.
Final Office Action received for U.S. Appl. No. 14/868,105, dated Apr. 12, 2017, 22 pages.
Final Office Action received for U.S. Appl. No. 16/406,787, dated Aug. 18, 2022, 12 pages.
Final Office Action received for U.S. Appl. No. 17/177,862, dated Mar. 21, 2022, 10 pages.
Final Office Action received for U.S. Appl. No. 14/512,350, dated Nov. 30, 2015, 7 pages.
Final Office Action received for U.S. Appl. No. 14/486,518, dated Nov. 16, 2017, 19 pages.
First Action Interview Office Action Summary received for U.S. Appl. No. 14/534,797, dated Feb. 18, 2016, 5 pages.
First Action Interview Office Action Summary received for U.S. Appl. No. 14/624,083, dated Apr. 8, 2016, 1 page.
First Examiner Report received for Indian Patent Application No. 6557/DELNP/2010, dated Apr. 11, 2017, 11 pages.
Geekery, "Proposal for Free, Open Source Cell Phone Location Service", Retrieved from the Internet URL: <http://crud.blog/2004/03/06/proposal-for-free-open-source-cell-phone-location-service/>, Mar. 6, 2004, 8 pages.
Gmail, Retrieved from the Internet URL: <https://www.gmail.com>, Accessed on Nov. 10, 2021, 7 pages.
GOCR, “Open-Source Character Recognition”, Retrieved from Internet URL: <https://www-e.ovgu.de/jschulen/ocr/download.html>, Accessed on Nov. 3, 2021, 2 pages.
Google Play, “AgingBooth”, Retrieved from the Internet URL: <https://play.google.com/store/apps/details?id=com.piviandco.agingbooth&hl=en_IN>, Jan. 7, 2019, 4 pages.
Halfbakery: Buddy Locator, [Online], Retrieved from the Internet: <URL: http://www.halfbakery.com/idea/Buddy_20Locator#1055455737>, (Jun. 11, 2003), 3 pages.
Halfbakery: Mobile Phone Utility, Retrieved from the Internet URL: <http://www.halfbakery.com/idea/mobile_20phone_20utility#1073585857>, Jan. 8, 2004, 2 pages.
Halfbakery: Mobile Proximity Link, [Online], Retrieved from the Internet: <URL: http://www.halfbakery.com/idea/Mobile_20Proximity_20Link#1001923289>, (Sep. 30, 2001).
International Search Report and Written Opinion of the International Searching Authority, issued in connection with Int'l Appl. No. PCT/US2009/001419, dated Sep. 30, 2009 (8 pages).
International Search Report and Written Opinion of the International Searching Authority, issued in connection with Int'l Appl. No. PCT/US2010/061628, dated Aug. 12, 2011 (6 pages).
iPhone—Apple, Oh.So.Pro, Retrieved from the Internet URL: <http://www.apple.com/iphone/>, Accessed on Nov. 10, 2021, 12 pages.
Java Servlet Technology Overview, Retrieved from the Internet URL: <https://www.oracle.com/java/technologies/servlet-technology.html>, Accessed on Nov. 10, 2021, 2 pages.
Kan et al., “Applying QR Code in Augmented Reality Applications”, VRCAI, Dec. 15, 2009, pp. 253-258.
Klemperer, “Auctions: Theory and Practice”, Princeton University Press, 2004, 15 pages.
Korean Application Serial No. 2012-7019181, Notice of Appeal filed Feb. 4, 2015, 24 pages (including English Translation of Claims).
Korean Application Serial No. 2012-7019181, Notice of Final Rejection dated Nov. 3, 2014, with English Translation of Claims, 6 pages.
Korean Application Serial No. 2012-7019181, Notice of Reason for Refusal dated Feb. 23, 2016, 12 pages (5 pages of English Translation and 7 pages of Official Copy).
Korean Application Serial No. 2012-7019181, Notification of Reason for Refusal dated Nov. 18, 2013, 12 pages (6 pages of English Translation and 6 pages of Official Copy).
Korean Application Serial No. 2012-7019181, Office Action dated Jun. 26, 2014, with English Translation, 5 pages.
Korean Application Serial No. 2014-7004160, Reasons for Rejection dated Mar. 2, 2016, with English Translation, 7 pages.
Kraft, "Real Time Baseball Augmented Reality", Washington University in St. Louis, Dec. 6, 2011, 11 pages.
Madeleine, "Terminator 3 Rise of Jesus! Deutsch", Retrieved from the Internet URL: <https://www.youtube.com/watch?v=Oj3o7HFcgzE>, Jun. 12, 2010, 2 pages.
MLB At Bat 11, Retrieved from the Internet: <URL: http://texas.rangers.mlb.com/mobile/atbat/?c_id=tex>, Accessed on Apr. 19, 2018, 6 pages.
MobiTV, "MobiTV", Retrieved from the Internet: <URL: http://www.mobitv.com/>, Accessed on Mar. 30, 2015, 1 page.
Mulloni et al., "Handheld Augmented Reality Indoor Navigation with Activity-Based Instructions", Proceedings of the 13th International Conference on Human Computer Interaction with Mobile Devices and Services, 2011, 10 pages.
Newby, "Facebook, Politico to measure sentiment of GOP candidates by collecting posts", 2006-2012 Clarity Digital Group LLC d/b/a Examiner.com, Jun. 28, 2012, 3 pages.
Non-Final Office Action received for U.S. Appl. No. 15/250,588, dated Sep. 22, 2017, 16 pages.
Non-Final Office Action received for U.S. Appl. No. 15/337,899, dated Feb. 5, 2020, 11 pages.
Non-Final Office Action received for U.S. Appl. No. 16/046,434, dated Aug. 21, 2019, 23 pages.
Non-Final Office Action received for U.S. Appl. No. 11/140,273, dated Feb. 26, 2010, 11 pages.
Non-Final Office Action received for U.S. Appl. No. 11/140,273, dated Jul. 3, 2008, 11 pages.
Non-Final Office Action received for U.S. Appl. No. 11/140,273, dated May 31, 2007, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 11/690,720, dated May 17, 2011, 11 pages.
Non-Final Office Action received for U.S. Appl. No. 11/690,720, dated Sep. 25, 2009, 8 pages.
Non-Final Office Action Received for U.S. Appl. No. 12/398,957, dated Dec. 9, 2019, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 12/398,957, dated May 2, 2017, 19 pages.
Non-Final Office Action received for U.S. Appl. No. 12/398,957, dated Oct. 17, 2019, 11 pages.
Non-Final Office Action received for U.S. Appl. No. 12/398,957, dated Sep. 19, 2013, 21 pages.
Non-Final Office Action received for U.S. Appl. No. 12/406,016, dated Oct. 2, 2013, 21 pages.
Non-Final Office Action received for U.S. Appl. No. 12/644,957, dated Dec. 29, 2014, 20 pages.
Non-Final Office Action received for U.S. Appl. No. 12/644,957, dated Mar. 7, 2014, 21 pages.
Non-Final Office Action received for U.S. Appl. No. 12/644,957, dated Mar. 18, 2013, 17 pages.
Non-Final Office Action received for U.S. Appl. No. 13/194,584, dated Jul. 16, 2015, 27 pages.
Non-Final Office Action received for U.S. Appl. No. 13/324,834, dated Aug. 14, 2013, 12 pages.
Non-Final Office Action received for U.S. Appl. No. 13/324,834, dated Aug. 27, 2014, 16 pages.
Non-Final Office Action received for U.S. Appl. No. 13/339,235, dated Aug. 18, 2014, 4 pages.
Non-Final Office Action received for U.S. Appl. No. 13/339,235, dated Aug. 28, 2017, 21 pages.
Non-Final Office Action received for U.S. Appl. No. 13/339,235, dated Feb. 12, 2015, 4 pages.
Non-Final Office Action received for U.S. Appl. No. 13/339,235, dated Mar. 16, 2012, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 13/339,235, dated May 16, 2016, 15 pages.
Non-Final Office Action received for U.S. Appl. No. 13/339,235, dated Sep. 18, 2015, 14 pages.
Non-Final Office Action received for U.S. Appl. No. 13/361,113, dated Feb. 13, 2014, 13 pages.
Non-Final Office Action received for U.S. Appl. No. 13/361,196, dated Aug. 23, 2012, 13 pages.
Non-Final Office Action received for U.S. Appl. No. 13/361,196, dated Jan. 3, 2014, 11 pages.
Non-Final Office Action received for U.S. Appl. No. 13/361,196, dated Mar. 29, 2012, 12 pages.
Non-Final Office Action received for U.S. Appl. No. 13/380,315, dated Mar. 26, 2014, 11 pages.
Non-Final Office Action received for U.S. Appl. No. 13/436,370, dated Mar. 25, 2016, 18 pages.
Non-Final Office Action received for U.S. Appl. No. 13/436,370, dated Nov. 5, 2014, 15 pages.
Non-Final Office Action received for U.S. Appl. No. 14/486,518, dated May 21, 2015, 9 pages.
Non-Final Office Action received for U.S. Appl. No. 14/486,518, dated Nov. 30, 2016, 13 pages.
Non-Final Office Action received for U.S. Appl. No. 14/512,350, dated Mar. 11, 2016, 7 pages.
Non-Final Office Action received for U.S. Appl. No. 14/512,350, dated May 22, 2015, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 14/512,350, dated Nov. 2, 2016, 14 pages.
Non-Final Office Action received for U.S. Appl. No. 14/868,105, dated Dec. 12, 2016, 24 pages.
Non-Final Office Action received for U.S. Appl. No. 14/868,105, dated Nov. 14, 2017, 14 pages.
Non-final Office Action received for U.S. Appl. No. 16/406,787, dated Oct. 8, 2021, 12 pages.
Notice of Allowance received for Korean Patent Application No. 10-2016-7025254, dated Mar. 9, 2018, 5 pages (2 pages of English Translation and 3 pages of Official Copy).
Notice of Allowance received for U.S. Appl. No. 14/990,291, dated Dec. 13, 2017, 5 pages.
Notice of Allowance received for U.S. Appl. No. 11/140,273, dated Aug. 3, 2008, 6 pages.
Notice of Allowance received for U.S. Appl. No. 11/690,720, dated Aug. 2, 2012, 7 pages.
Notice of Allowance received for U.S. Appl. No. 11/690,720, dated May 15, 2012, 7 pages.
Notice of Allowance received for U.S. Appl. No. 12/398,957, dated Jan. 2, 2019, 10 pages.
Notice of Allowance received for U.S. Appl. No. 12/398,957, dated Oct. 30, 2020, 8 pages.
Notice of Allowance received for U.S. Appl. No. 12/406,016, dated Jun. 11, 2014, 19 pages.
Notice of Allowance received for U.S. Appl. No. 12/644,957, dated Jun. 17, 2015, 20 pages.
Notice of Allowance received for U.S. Appl. No. 13/339,235, dated Apr. 25, 2014, 8 pages.
Notice of Allowance received for U.S. Appl. No. 13/361,113, dated Aug. 1, 2014, 8 pages.
Notice of Allowance received for U.S. Appl. No. 13/361,196, dated Jun. 10, 2014, 8 pages.
Notice of Allowance received for U.S. Appl. No. 14/868,105, dated May 21, 2018, 14 pages.
Notice of Allowance received for U.S. Appl. No. 17/177,862, dated Jul. 15, 2022, 7 pages.
Notification of Reexamination received for Chinese Patent Application No. 201610108229.6 dated May 9, 2022, 10 pages (2 pages English Translation, 8 pages Official Copy).
Office Action received for Chinese Patent Application No. 200980107871.0, dated Nov. 5, 2013, 12 pages.
Office Action received for Chinese Patent Application No. 201610108229.6, dated Nov. 15, 2018, 15 pages (6 pages of Official Copy and 9 pages of English Translation).
Office Action received for Chinese Patent Application No. 201610108229.6, dated Dec. 17, 2019, 23 Pages (9 pages of Official Copy and 14 pages of English Translation).
Office Action received for Chinese Patent Application No. 201610108229.6, dated May 17, 2019, 33 pages (20 pages of English Translation and 13 pages of Official Copy).
Office Action received for European Patent Application No. 10803429.9, dated Aug. 22, 2012, 2 pages.
Office Action received for Korean Patent Application No. 10-2010-7022281, dated Feb. 28, 2012, 13 pages (7 pages of Official Copy and 6 pages of English Translation).
Office Action received for Korean Patent Application No. 10-2010-7022281, dated Sep. 27, 2012, 7 pages (4 pages of Official Copy and 3 pages of English Translation).
Office Action received for Korean Patent Application No. 10-2016-7025254, dated May 2, 2017, 7 pages (3 pages of English Translation and 4 pages of Official Copy).
Office Action received for Korean Patent Application No. 10-2016-7025254, dated Oct. 13, 2016, 11 pages (6 pages of English Translation and 5 pages of Official Copy).
Office Action received for Korean Patent Application No. 10-2016-7025254, dated Sep. 5, 2017, 12 pages (5 pages of English Translation and 7 pages of Official Copy).
Office Action-First Action Interview received for U.S. Appl. No. 14/990,291, dated Oct. 13, 2017, 5 pages.
Oracle, "Java Technical Details", Retrieved from the Internet URL: <https://www.oracle.com/java/technologies/>, Accessed on Nov. 3, 2021, 4 pages.
Preinterview First Office Action received for U.S. Appl. No. 14/990,291, dated Aug. 10, 2017, 4 pages.
Salesforce, “Custom Application Development Software for Business”, Retrieved from Internet URL: <https://www.salesforce.com/products/platform/overview/?d=70130000000liBh&internal=true>, Accessed on Oct. 4, 2021, 9 pages.
Sifry, "Politico-Facebook Sentiment Analysis Will Generate "Bogus" Results, Expert Says", Retrieved from the Internet: <http://techpresident.com/news/21618/politico-facebook-sentiment-analysis-bogus>, Jan. 13, 2012, Accessed on May 18, 2018, 4 pages.
SignalSoft Corporation awarded location-based services patent, [Online]. Retrieved from the Internet: <URL: http://www.cellular.co.za/news_2001/04282001-signalsoft-patent.htm>, (Apr. 27, 2001), 1 page.
Slingbox, "Sling Media, Inc.", Retrieved from the Internet URL: <http://www.slingbox.com/>, Accessed on Mar. 30, 2015, 1 page.
Summons to Attend Oral Proceedings received for European Application No. 09717996.4, dated Nov. 28, 2016, 12 pages.
U.S. Appl. No. 13/194,584, Non-Final Office Action dated Sep. 19, 2013, 25 pages.
U.S. Appl. No. 13/624,682, Non-Final Office Action dated Jan. 22, 2015, 9 pages.
U.S. Appl. No. 13/624,682, Notice of Allowance dated Jun. 8, 2015, 5 pages.
U.S. Appl. No. 13/624,682, Notice of Allowance dated Oct. 1, 2015, 7 pages.
U.S. Appl. No. 14/473,809, Non-Final Office Action dated Aug. 13, 2015, 21 pages.
Usdatanow, Networks in Motion Named Semi-Finalist for Wireless LBS Challenge, [Online]. Retrieved from the Internet: <URL: http://tmcnet.com/usubmit/2004/Mar/1025200.htm>, (Mar. 18, 2004), 2 pages.
Vassilios et al., "Archeoguide: An Augmented Reality Guide for Archaeological Sites", IEEE Computer Graphics and Applications, vol. 22, No. 5, Sep./Oct. 2002, pp. 52-60.
Vlahakis et al., “Archeoguide: First Results of an Augmented Reality, Mobile Computing System in Cultural Heritage Sites”, Virtual Reality, Archeology, and Cultural Heritage, 2001, 10 pages.
W3 Schools, "Introduction to XML", Retrieved from the Internet URL: <https://www.w3schools.com/xml/xml_whatis.asp>, Accessed on Nov. 8, 2021, 7 pages.
W3C, "Extensible Markup Language (XML) 1.0 (Fourth Edition)", Retrieved from the Internet URL: <http://www.w3.org/TR/2006/REC-xml-20060816/#sec-origin-goals>, Aug. 16, 2006, 30 pages.
W3C, “URIs, Addressability, and the use of HTTP GET and POST”, Retrieved from Internet URL: <https://www.w3.prg/2001/tag/doc/whenToUseGet.html>, Mar. 21, 2004, 9 Pages.
Walther et al., “Selective Visual Attention Enables Learning and Recognition of Multiple Objects in Cluttered Scenes”, Accessed on Jun. 15, 2005, 23 pages.
WhatIs.com, Retrieved from the Internet URL: <http://searchexchange.techtarget.com/sDefinition/0,sid43_gci212805,00.html>, Accessed on Nov. 8, 2021, 5 pages.
Wikipedia, “Definition of Homogeneous Coordinates”, Retrieved from the internet URL: <https://web.archive.org/web/20110305185824/http://en.wikipedia.org/wi- ki/Homogeneous_coordinates>, Accessed on Apr. 18, 2018, 8 pages.
Wikipedia, “Polar Coordinate System”, Retrieved from the Intent URL: <http://en.wikipedia.org/wiki/Polar_coordinate_system>, Oct, 8, 2011, 12 pages.
YouTube, "RedLaser 2.0: Realtime iPhone UPC Barcode Scanning", Retrieved from the Internet URL: <https://www.youtube.com/watch?v=9_hFGsmx_6k>, Jun. 16, 2009, pp. 1-2.
Notice of Allowance received for U.S. Appl. No. 17/177,862, dated Nov. 2, 2022, 5 pages.
Non-Final Office Action received for U.S. Appl. No. 16/406,787, dated Dec. 6, 2022, 11 pages.
Reexamination Decision received for Chinese Patent Application No. 201610108229.6, dated Nov. 4, 2022, 14 pages (1 page of English Translation and 13 pages of Official Copy).
Related Publications (1)
Number Date Country
20210027345 A1 Jan 2021 US
Continuations (1)
Number Date Country
Parent 13537482 Jun 2012 US
Child 17039443 US