System and method for visualization of items in an environment using augmented reality

Information

  • Patent Grant
  • Patent Number
    10,147,134
  • Date Filed
    Monday, August 29, 2016
  • Date Issued
    Tuesday, December 4, 2018
Abstract
Systems and methods for visualization of an item in an environment using augmented reality are provided. Environment image data containing an image of an environment is received. A selection of an item for placement into an indicated location of the environment is received. An item image of the selected item is scaled based on dimensions determined from the environment image data for the environment. The scaled item image is augmented into the image of the environment at the indicated location to generate an augmented reality image. The augmented reality image is displayed on a device of a user, whereby the scaled item image in the augmented reality image is selectable to cause display of information. A selection of the scaled item image is received. In response to the selection of the scaled item image, the information is presented on the device of the user.
Description
FIELD

The present disclosure relates generally to image processing, and in a specific example embodiment, to visualization of items in an environment using augmented reality.


BACKGROUND

Conventionally, when an individual shops for an item, the individual must mentally visualize what the item will look like in the environment in which the individual intends to place the item. Often, the individual has difficulty imagining the item with proper dimensions and orientation. In some cases, the individual may purchase the item only to realize that the item does not fit ideally in the environment. As a result, the individual may end up returning the item or otherwise disposing of the item (e.g., selling, trading, or giving it away).





BRIEF DESCRIPTION OF DRAWINGS

Various ones of the appended drawings merely illustrate example embodiments of the present invention and cannot be considered as limiting its scope.



FIG. 1 is a block diagram illustrating an example embodiment of a network architecture of a system used to enable visualization of items in an environment using augmented reality.



FIG. 2 is a block diagram illustrating an example embodiment of a publication system.



FIG. 3 is a block diagram illustrating an example embodiment of an augmented reality engine.



FIG. 4 is a flow diagram of an example high-level method for visualization of an item in an environment using augmented reality.



FIG. 5 is a flow diagram of an example high-level method for generating an augmented reality image.



FIG. 6A is a screenshot of an example of an environment image.



FIG. 6B is a screenshot of the environment image with an augmented item image.



FIG. 6C illustrates an example screenshot displaying shopping information pertaining to the selected item.



FIG. 6D illustrates an example screenshot displaying a window providing additional information for the selected item.



FIG. 6E illustrates an example screenshot displaying a window having recommendations.



FIG. 7 is a simplified block diagram of a machine in an example form of a computing system within which a set of instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed.





DETAILED DESCRIPTION

The description that follows includes systems, methods, techniques, instruction sequences, and computing machine program products that embody illustrative embodiments of the present invention. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide an understanding of various embodiments of the inventive subject matter. It will be evident, however, to those skilled in the art that embodiments of the inventive subject matter may be practiced without these specific details. In general, well-known instruction instances, protocols, structures, and techniques have not been shown in detail.


As used herein, the term “or” may be construed in either an inclusive or exclusive sense. Additionally, although various example embodiments discussed below focus on a marketplace environment, the embodiments are given merely for clarity in disclosure. Thus, any type of electronic publication, electronic commerce, social networking, or electronic business system and method, including various system architectures, may employ various embodiments of the system and method described herein and may be considered as being within a scope of example embodiments. Each of a variety of example embodiments is discussed in detail below.


Example embodiments described herein provide systems and methods for visualization of an item in an environment using augmented reality. In example embodiments, environment image data containing an image of an environment is received from a client device. A selection of an item that is under consideration for purchase and placement into an indicated location of the environment is received. An item image of the selected item is scaled to a scale that is based on dimensions determined from the environment image data for the environment. The dimensions may be determined based on a calculated distance to a focal point of the indicated location in the environment and on a marker located in the image of the environment. The scaled item image is augmented into the image of the environment at the indicated location to generate an augmented reality image. In some embodiments, the scaled item image may be oriented to match an orientation of the indicated location in the environment.


By using embodiments of the present invention, a user may search for an item and augment an image of an environment with an image of the item. Because the user can create and view an augmented reality image of the environment including the selected item, the user can easily visualize the selected item in the environment without having to, for example, manually cut and paste or scale the image of the item into the image of the environment. Therefore, one or more of the methodologies discussed herein may obviate a need for time consuming data processing by the user. This may have the technical effect of reducing computing resources used by one or more devices within the system. Examples of such computing resources include, without limitation, processor cycles, network traffic, memory usage, storage space, and power consumption.


With reference to FIG. 1, an example embodiment of a high-level client-server-based network architecture 100 to enable visualization of items in an environment using augmented reality is shown. A networked system 102, in an example form of a network-server-side functionality, is coupled via a communication network 104 (e.g., the Internet, wireless network, cellular network, or a Wide Area Network (WAN)) to one or more client devices 110 and 112. FIG. 1 illustrates, for example, a web client 106 operating via a browser (e.g., such as the INTERNET EXPLORER® browser developed by Microsoft® Corporation of Redmond, Wash. State), and a programmatic client 108 executing on respective client devices 110 and 112.


The client devices 110 and 112 may comprise a mobile phone, desktop computer, laptop, or any other communication device that a user may utilize to access the networked system 102. In some embodiments, the client device 110 may comprise or be connectable to an image capture device 113 (e.g., camera, camcorder). In further embodiments, the client device 110 may comprise one or more of a touch screen, accelerometer, microphone, and GPS device. The client devices 110 and 112 may be a device of an individual user interested in visualizing an item within an environment.


An Application Program Interface (API) server 114 and a web server 116 are coupled to, and provide programmatic and web interfaces respectively to, one or more application servers 118. The application servers 118 host a publication system 120 and a payment system 122, each of which may comprise one or more modules, applications, or engines, and each of which may be embodied as hardware, software, firmware, or any combination thereof. The application servers 118 are, in turn, coupled to one or more database servers 124 facilitating access to one or more information storage repositories or database(s) 126. The databases 126 may also store user account information of the networked system 102 in accordance with example embodiments.


In example embodiments, the publication system 120 publishes content on a network (e.g., Internet). As such, the publication system 120 provides a number of publication functions and services to users that access the networked system 102. The publication system 120 is discussed in more detail in connection with FIG. 2. In example embodiments, the publication system 120 is discussed in terms of a marketplace environment. However, it is noted that the publication system 120 may be associated with a non-marketplace environment such as an informational or social networking environment.


The payment system 122 provides a number of payment services and functions to users. The payment system 122 allows users to accumulate value (e.g., in a commercial currency, such as the U.S. dollar, or a proprietary currency, such as “points”) in their accounts, and then later to redeem the accumulated value for products (e.g., goods or services) that are made available via the publication system 120 or elsewhere on the network 104. The payment system 122 also facilitates payments from a payment mechanism (e.g., a bank account, PayPal™, or credit card) for purchases of items via any type and form of a network-based marketplace.


While the publication system 120 and the payment system 122 are shown in FIG. 1 to both form part of the networked system 102, it will be appreciated that, in alternative embodiments, the payment system 122 may form part of a payment service that is separate and distinct from the networked system 102. Additionally, while the example network architecture 100 of FIG. 1 employs a client-server architecture, a skilled artisan will recognize that the present disclosure is not limited to such an architecture. The example network architecture 100 can equally well find application in, for example, a distributed or peer-to-peer architecture system. The publication system 120 and payment system 122 may also be implemented as standalone systems or standalone software programs operating under separate hardware platforms, which do not necessarily have networking capabilities.


Referring now to FIG. 2, an example block diagram illustrating multiple components that, in one embodiment, are provided within the publication system 120 of the networked system 102 is shown. In one embodiment, the publication system 120 is a marketplace system where items (e.g., goods or services) may be offered for sale. In an alternative embodiment, the publication system 120 is a social networking system or informational system. The publication system 120 may be hosted on dedicated or shared server machines (not shown) that are communicatively coupled to enable communications between the server machines. The multiple components themselves are communicatively coupled (e.g., via appropriate interfaces), either directly or indirectly, to each other and to various data sources, to allow information to be passed between the components or to allow the components to share and access common data. Furthermore, the components may access the one or more databases 126 via the one or more database servers 124.


In one embodiment, the publication system 120 provides a number of publishing, listing, and price-setting mechanisms whereby a seller may list (or publish information concerning) goods or services for sale, a buyer can express interest in or indicate a desire to purchase such goods or services, and a price can be set for a transaction pertaining to the goods or services. To this end, the publication system 120 may comprise at least one publication engine 202 and one or more shopping engines 204. In one embodiment, the shopping engines 204 may support auction-format listing and price setting mechanisms (e.g., English, Dutch, Chinese, Double, Reverse auctions, etc.).


A pricing engine 206 supports various price listing formats. One such format is a fixed-price listing format (e.g., the traditional classified advertisement-type listing or a catalog listing). Another format comprises a buyout-type listing. Buyout-type listings (e.g., the Buy-It-Now (BIN) technology developed by eBay Inc., of San Jose, Calif.) may be offered in conjunction with auction-format listings and allow a buyer to purchase goods or services, which are also being offered for sale via an auction, for a fixed price that is typically higher than a starting price of an auction for an item.


A store engine 208 allows a seller to group listings within a “virtual” store, which may be branded and otherwise personalized by and for the seller. Such a virtual store may also offer promotions, incentives, and features that are specific and personalized to the seller. In one example, the seller may offer a plurality of items as Buy-It-Now items in the virtual store, offer a plurality of items for auction, or a combination of both.


Navigation of the publication system 120 may be facilitated by a navigation engine 210. For example, a search module (not shown) of the navigation engine 210 enables, for example, keyword searches of listings or other information published via the publication system 120. In a further example, a browse module (not shown) of the navigation engine 210 allows users to browse various category, catalog, or data structures according to which listings or other information may be classified within the publication system 120. Various other navigation applications within the navigation engine 210 may be provided to supplement the searching and browsing applications. In one embodiment, the navigation engine 210 allows the user to search or browse for items in the publication system 120 (e.g., virtual stores, listings in a fixed-price or auction selling environment, listings in a social network or information system). In alternative embodiments, the navigation engine 210 may navigate (e.g., conduct a search on) a network at large (e.g., network 104). Based on a result of the navigation engine 210, the user may select an item that the user is interested in augmenting into an environment.


In order to make listings or posting of information available via the networked system 102 as visually informative and attractive as possible, the publication system 120 may include an imaging engine 212 that enables users to upload images for inclusion within listings and to incorporate images within viewed listings. In some embodiments, the imaging engine 212 also receives image data from a user and utilizes the image data to generate the augmented reality image. For example, the imaging engine 212 may receive an environment image (e.g., still image, video) of an environment within which the user wants to visualize an item. The imaging engine 212 may work in conjunction with the augmented reality engine 218 to generate the augmented reality image, as will be discussed in more detail below.


A listing engine 214 manages listings on the publication system 120. In example embodiments, the listing engine 214 allows users to author listings of items. The listing may comprise an image of an item along with a description of the item. In one embodiment, the listings pertain to goods or services that a user (e.g., a seller) wishes to transact via the publication system 120. As such, the listing may comprise an image of a good for sale and a description of the item such as, for example, dimensions, color, and an identifier (e.g., UPC code, ISBN code). In some embodiments, a user may create a listing that is an advertisement or other form of publication to the networked system 102. The listing engine 214 also allows the users to manage such listings by providing various management features (e.g., auto-relisting, inventory level monitors, etc.).


A messaging engine 216 is responsible for the generation and delivery of messages to users of the networked system 102. Such messages include, for example, advising users regarding the status of listings and best offers (e.g., providing an acceptance notice to a buyer who made a best offer to a seller) or providing recommendations. The messaging engine 216 may utilize any one of a number of message delivery networks and platforms to deliver messages to users. For example, the messaging engine 216 may deliver electronic mail (e-mail), an instant message (IM), a Short Message Service (SMS), text, facsimile, or voice (e.g., Voice over IP (VoIP)) messages via wired networks (e.g., the Internet), a Plain Old Telephone Service (POTS) network, or wireless networks (e.g., mobile, cellular, WiFi, WiMAX).


An augmented reality engine 218 manages the generation of an augmented reality image based on an environment image and an item specified by a user. The augmented reality engine 218 will be discussed in more detail in connection with FIG. 3 below.


Although the various components of the publication system 120 have been defined in terms of a variety of individual modules and engines, a skilled artisan will recognize that many of the items can be combined or organized in other ways. Alternatively, not all components of the publication system 120 of FIG. 2 may be utilized. Furthermore, not all components of the publication system 120 have been included in FIG. 2. In general, components, protocols, structures, and techniques not directly related to functions of exemplary embodiments (e.g., dispute resolution engine, loyalty promotion engine, personalization engines, etc.) have not been shown or discussed in detail. The description given herein simply provides a variety of exemplary embodiments to aid the reader in an understanding of the systems and methods used herein.



FIG. 3 is a block diagram illustrating an example embodiment of the augmented reality engine 218. In example embodiments, the augmented reality engine 218 comprises an access module 300, a distance module 302, a sizing module 304, a scaling module 306, an orientation module 308, an augmenting module 310, a recommendation module 312, a save module 314, and a purchase module 316. In alternative embodiments, functions of one or more of the modules of the augmented reality engine 218 may be combined together, one or more of the modules may be removed from the augmented reality engine 218, or one or more of the modules may be located elsewhere in the networked system 102 (e.g., the imaging engine 212, shopping engines 204) or at the client device 110.


In example embodiments, the imaging engine 212 may receive environment image data of an environment (e.g., still image, video) from the client device 110. The environment image data is then provided to the augmented reality engine 218 for processing. In some embodiments, the augmented reality engine 218 also receives item data for an item that the user is interested in visualizing in the environment and an indication of a location where the item is to be augmented in the environment. The item data may be provided by the navigation engine 210 based on a user selection of an item found using a search or browsing function of the navigation engine 210.


Alternatively, the item data may be received from the client device 110. For example, the user may capture an image of an item that the user is interested in augmenting into the environment (e.g., take a photo of an item at a store). The user may, in some cases, enter information regarding the item such as dimensions or an identifier (e.g., UPC code). The augmented reality engine 218 receives the item data from the client device 110.


The access module 300 accesses item data for a selected item. In some embodiments, an item to be augmented into the environment may be selected by a user at the client device and the selection is received, for example, by the navigation engine 210. In other embodiments, the selection is received by the access module 300. Based on the selection, the access module 300 may access information corresponding to the selection. If the selection is an item listing for the item, the access module 300 may access the item listing and extract item data (e.g., dimensions, images) from the listing. In other examples, if the selection is a user inputted name or other item identifier of an item (e.g., UPC code), the access module 300 may access a catalog (e.g., stored in the database 126) that stores item data using the item identifier.
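By way of illustration only, the catalog lookup performed by the access module 300 when the selection arrives as an item identifier might resemble the following sketch; the in-memory dictionary stands in for the database 126, and the field names and values are hypothetical.

```python
# Hypothetical in-memory stand-in for the item catalog stored in the database 126,
# keyed by item identifier (e.g., UPC code); entries are illustrative only.
ITEM_CATALOG = {
    "012345678905": {
        "title": "46-inch flat panel television",
        "dimensions_in": {"width": 40.9, "height": 24.8, "depth": 1.2},
        "image_file": "tv.png",
    },
}

def access_item_data(item_identifier: str) -> dict:
    """Look up item data (image, dimensions, description) for a selected item by its identifier."""
    try:
        return ITEM_CATALOG[item_identifier]
    except KeyError:
        raise LookupError(f"no catalog entry for item identifier {item_identifier!r}")
```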


The distance module 302 determines a distance to a focal point in an image of the environment. The focal point may be a user selected area (also referred to as an “indicated location”) where an item image is to be augmented. For example, if the environment is a room, the distance to a wall where the item image is to be augmented may be determined. In one embodiment, the distance module 302 may use a focus capability of the image capture device 113 of, or coupled to, the client device 110 to determine the distance. Alternatively, the distance module 302 may use an echo technique using the client device 110 as a sound generator to determine the distance. For example, the client device 110 may generate a sound in the direction of the wall and an amount of time is registered for an echo to be returned. The distance module 302 may use this amount of time to determine the distance. As such, the distance is from a point of view of the viewer or image capture device (e.g., camera) to the focal point.
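A minimal sketch of the echo-based distance estimate described above, assuming the round-trip time of the emitted sound has already been measured; the constant and function name are illustrative rather than part of the described system.

```python
SPEED_OF_SOUND_M_PER_S = 343.0  # approximate speed of sound in air at room temperature

def distance_from_echo(round_trip_seconds: float) -> float:
    """Estimate the distance from the client device to the focal point (e.g., a wall).
    The sound travels to the wall and back, so the one-way distance is half the round trip."""
    return SPEED_OF_SOUND_M_PER_S * round_trip_seconds / 2.0

# Example: an echo registered 20 milliseconds after the sound is generated
# implies a wall roughly 3.4 meters from the device.
print(distance_from_echo(0.020))
```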


The sizing module 304 determines sizing for the environment. In example embodiments, the sizing module 304 uses a marker (an object with known standard dimensions) in the environment image data to calculate the sizing. For example, if a door is shown in the environment image data, the sizing module 304 may assume that the door is a standard sized door (e.g., 36″×80″) or that a door knob is located at 36″ from the floor. Using these known standard dimensions, sizing for the environment may be determined. In another example, if the environment is an automobile, the marker may be a wheel well of the automobile. In this example, the user may specify a type of automobile when providing the environment image data.
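As an assumed illustration of the marker-based sizing, the sketch below derives a pixels-per-inch scale for the environment from a detected door of standard size (36″×80″); the Marker structure and function name are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Marker:
    """A detected object with known real-world dimensions (e.g., a standard door)."""
    pixel_width: float    # width of the marker as it appears in the environment image
    pixel_height: float   # height of the marker as it appears in the environment image
    real_width_in: float  # known real-world width in inches
    real_height_in: float # known real-world height in inches

def pixels_per_inch(marker: Marker) -> float:
    """Derive an image scale for the environment by averaging the horizontal and
    vertical ratios of the marker's pixel size to its known real-world size."""
    return (marker.pixel_width / marker.real_width_in +
            marker.pixel_height / marker.real_height_in) / 2.0

# Example: a standard 36" x 80" door appearing 180 x 400 pixels yields 5 pixels per inch.
door = Marker(pixel_width=180, pixel_height=400, real_width_in=36, real_height_in=80)
print(pixels_per_inch(door))  # 5.0
```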


The scaling module 306 scales an image of the item based on the distance and sizing determined by the distance module 302 and the sizing module 304, respectively. Accordingly, the scaling module 306 may receive (e.g., from the navigation engine 210) or retrieve the item data (e.g., from the database 126) for a selected item. The item data may include an item image, dimensions, or an item identifier. If the item image and dimensions are provided, then the scaling module 306 may use the item image and the dimensions to scale the item image to the environment based on the sizing determined by the sizing module 304. Alternatively, if the item image or the dimensions are not provided, the item identifier may be used to look up the item in an item catalog, which may contain an image and item information for the item (e.g., dimensions and description). In one embodiment, the scaling module 306 may look up and retrieve the item information from the item catalog.
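Continuing the same assumptions, a minimal sketch of the scaling step: the item image is resized so that its pixel dimensions correspond to its real-world dimensions at the environment's derived scale; Pillow is used here purely for illustration.

```python
from PIL import Image

def scale_item_image(item_image: Image.Image,
                     item_width_in: float,
                     item_height_in: float,
                     env_pixels_per_inch: float) -> Image.Image:
    """Resize the item image so that it occupies the number of pixels its
    real-world dimensions would span at the environment's scale."""
    target_w = max(1, round(item_width_in * env_pixels_per_inch))
    target_h = max(1, round(item_height_in * env_pixels_per_inch))
    return item_image.resize((target_w, target_h))

# Example: a 40.9" x 24.8" television at 5 pixels per inch becomes roughly a 204 x 124 pixel image.
```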


Once the item image is scaled, the scaled item image may be oriented to the environment by the orientation module 308. For example, if the environment image has a wall at a slight angle and the scaled item image is to be placed on the wall, the orientation module 308 orients the scaled item image to the angle of the wall. It is noted that functionality of any of the distance module 302, sizing module 304, scaling module 306, and orientation module 308 may be combined into one or more modules that can determine proper sizing and orientation for the item image. In some embodiments, these combined modules may comprise or make use of one or more gyroscopes or accelerometers.
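The orientation step might, for instance, be approximated by rotating the scaled item image toward the measured angle of the surface; the rotation-only sketch below is an assumed simplification, and a fuller treatment could use a perspective warp instead.

```python
from PIL import Image

def orient_item_image(scaled_item: Image.Image, wall_angle_degrees: float) -> Image.Image:
    """Rotate the scaled item image to match the apparent angle of the surface on
    which it will be placed (e.g., a slightly angled wall)."""
    # expand=True keeps the whole rotated image; areas outside the original
    # bounds are filled with transparent pixels.
    return scaled_item.convert("RGBA").rotate(wall_angle_degrees, expand=True)
```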


The augmenting module 310 augments the scaled and oriented item image with the environment image to create an augmented reality image. The augmenting module 310 then provides the augmented reality image to the client device 110.
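A minimal compositing sketch for the augmenting module, assuming the oriented item image carries an alpha channel and that the indicated location is expressed in pixel coordinates of the environment image.

```python
from PIL import Image

def augment(environment: Image.Image,
            oriented_item: Image.Image,
            indicated_location: tuple[int, int]) -> Image.Image:
    """Merge the scaled, oriented item image into the environment image at the
    indicated location to produce the augmented reality image."""
    augmented = environment.convert("RGBA").copy()
    augmented.alpha_composite(oriented_item.convert("RGBA"), dest=indicated_location)
    return augmented
```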


The recommendation module 312 optionally provides recommendations for alternative items for the environment. For example, if the scaled and oriented item image appears too large for an indicated area on the environment image (e.g., as determined by the augmenting module 310), the recommendation module 312 may suggest one or more alternative items that are smaller and will fit better in the indicated area. Accordingly, the recommendation module 312 may determine a dimension that is more appropriate for the indicated area and perform a search (e.g., provide instructions to the navigation engine 210 to perform a search) to find one or more alternative items. The recommendation module 312 may then retrieve the item information and provide the alternative items as a suggestion to the user. In one embodiment, the alternative items may be listed on a side of a display that is displaying the augmented reality image or on a pop-up window.
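A hedged sketch of how the recommendation step might search a catalog for alternatives that fit the indicated area; the catalog item structure and the ranking (least expensive first) are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class CatalogItem:
    item_id: str
    title: str
    width_in: float
    height_in: float
    price: float

def recommend_alternatives(catalog: list[CatalogItem],
                           max_width_in: float,
                           max_height_in: float,
                           limit: int = 3) -> list[CatalogItem]:
    """Return up to `limit` alternative items whose dimensions fit within the
    indicated area, least expensive first."""
    fitting = [item for item in catalog
               if item.width_in <= max_width_in and item.height_in <= max_height_in]
    return sorted(fitting, key=lambda item: item.price)[:limit]
```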


The save module 314 saves the environment image for later use. In one embodiment, the environment image may be stored to the database 126 of the networked system 102. Alternatively, the environment image may be stored to the client device 110. For example, the user may record the environment image for a room and save it. At a later time, the user may obtain an item image for an item that the user is interested in augmenting into the saved environment image. The save module 314 may access and retrieve the saved environment image.


The purchase module 316 allows the user to purchase the item that is augmented into the environment or an alternative item recommended by the recommendation module 312. In one embodiment, the purchase module 316 provides a selection on or near the augmented reality image that, when selected, takes the user to, for example, a purchase page for the item, a storefront for a store that sells the item, or a search page with search results for availability of the item for purchase. In another embodiment, an activation of the selection may initiate an automatic purchase of the item. Once selected, the purchase module 316 performs the corresponding actions to facilitate the purchase (e.g., send a search for the item to the navigation engine 210, provide one or more listings using the shopping engine 204, provide a webpage associated with the store engine 208).



FIG. 4 is a flow diagram of an example high-level method 400 for visualization of an item in an environment using augmented reality. In operation 402, environment image data is received. In example embodiments, the imaging engine 212 may receive the environment image data from a client device 110. The environment image data may comprise an image of an environment into which the user wants to augment an item image.


In operation 404, a selection of an item to be augmented into the environment is received. In some embodiments, the navigation engine 210 receives a selection of the item from the client device. In other embodiments, the imaging engine 212 receives an image of an item that the user is interested in augmenting into the environment.


Based on the received selection of the item, item data is accessed in operation 406. The access module 300 accesses item data for the selected item. The item data may be extracted from an item listing for the item, retrieved from an item catalog, or retrieved from a website of a manufacturer or reseller (e.g., using an item identifier of the item).


In operation 408, augmentation processing is performed. Augmentation processing takes the environment image data and the selected item and augments or merges an item image for the item into an environment image. The operations of the augmentation processing will be discussed in detail with respect to FIG. 5.


The result of the augmentation is provided in operation 410. The result may comprise a video of the environment with the selected item augmented into the environment (referred to as “the augmented reality image”). In example embodiments, the augmenting module 310 provides the augmented reality image to the client device 110 of the user that provided the environment image, the item selection, or both.


In operation 412, a determination is made as to whether a modification is received. In some embodiments, the modification may be caused by the movement of the image capture device 113. For example, if the image capture device 113 is a video camera, then the modification is the movement within the environment as captured by the video camera. In another embodiment, the user may select an alternative item based on a recommendation provided by the recommendation module 312. Based on the modification, the method 400 may return to either operation 406 to access item data for the new item or to operation 408 to perform augmentation processing based on, for example, the movement within the environment.



FIG. 5 is a flow diagram of an example high-level method (operation 408) for generating the augmented reality image. In operation 502, a distance is determined by the distance module 302. The distance module 302 determines a distance to a focal point in the environment. The focal point may be a user selected area where an item image is to be augmented. In one embodiment, the distance module 302 may use capabilities (e.g., focus, echo based on sound) of the image capture device 113 of, or coupled to, the client device 110 to determine the distance.


In operation 504, sizing for the environment is determined by the sizing module 304. In example embodiments, the sizing module 304 uses a marker in the environment image data to calculate the sizing. Using known standard dimensions of the marker, sizing for the environment may be determined by the sizing module 304.


The item image is scaled in operation 506. The scaling module 306 scales an image of the item based on the distance and sizing determined by the distance module 302 and the sizing module 304, respectively. Accordingly, the scaling module 306 may receive or retrieve the item data including an item image, dimensions, or an item identifier. The retrieved item data is then used in association with the determined distance and sizing data to scale the item image.


Once the item image is scaled, the scaled item image may be oriented to the environment, in operation 508, by the orientation module 308. For example, if the environment image has a wall at a slight angle and the scaled item image is to be placed on the wall, the orientation module 308 orients the scaled item image to the angle of the wall.


In operation 510, the scaled and oriented item image is merged into the environment image. The augmenting module 310 augments the scaled and oriented item image with the environment image to create an augmented reality image. It is noted that operations of FIG. 5 may be combined into fewer operations. Alternatively, some of the operations of FIG. 5 may be optional.
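Taken together, operations 506 through 510 could be composed as in the sketch below; it assumes the pixels-per-inch scale of operation 504 has already been derived (for example, as in the marker sketch above) and omits the distance of operation 502 from this simplified composition. All names and parameters are illustrative.

```python
from PIL import Image

def augmentation_processing(environment: Image.Image,
                            item_image: Image.Image,
                            item_width_in: float,
                            item_height_in: float,
                            env_pixels_per_inch: float,
                            wall_angle_degrees: float,
                            indicated_location: tuple[int, int]) -> Image.Image:
    """Illustrative composition of the augmentation processing of FIG. 5."""
    # Operation 506: scale the item image to the environment's derived scale.
    target = (max(1, round(item_width_in * env_pixels_per_inch)),
              max(1, round(item_height_in * env_pixels_per_inch)))
    scaled = item_image.resize(target)
    # Operation 508: orient the scaled image to the surface's apparent angle.
    oriented = scaled.convert("RGBA").rotate(wall_angle_degrees, expand=True)
    # Operation 510: merge the scaled, oriented item image into the environment image.
    augmented = environment.convert("RGBA").copy()
    augmented.alpha_composite(oriented, dest=indicated_location)
    return augmented
```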



FIG. 6A is a screenshot of an example of an environment image 600. The environment image 600 may be captured by the image capture device 113 or retrieved from a storage location (e.g., database 126). In the present example, the environment image 600 is an image of a room in which a user wants to augment an item. In the present case, the environment image 600 is taken from a location where the user may want to view the item. For example, if the item is a flat panel television, the environment image 600 may be taken from a location where the user will position a sofa to view the flat panel television.



FIG. 6B is a screenshot of the environment image 600 with an augmented item image. In the present example, an image of a flat panel television 602 selected by the user is positioned in a location indicated by the user in the environment image 600. In one embodiment, additional information may be obtained by activating a selection on a display displaying the screenshot. For example, the user may select the image of the flat panel television 602 on the screenshot to open up a new window (e.g., a new window over a portion of the screenshot) that provides purchase information (e.g., where to buy, links to online stores, a listing for the item, prices), item information (e.g., dimensions, description), alternative recommendations (e.g., smaller or larger items, comparable items, less expensive items, newer version of the item), or any combination of these.



FIG. 6C illustrates an example screenshot displaying shopping information in a new window pertaining to the selected item. In the present example, a window 604 provides shopping information including a lowest, highest, and average price along with links to various marketplaces where the item may be purchased. The window 604 is provided when the user makes a selection of the image of the flat panel or performs some other action to indicate a desire to receive additional information.
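The lowest, highest, and average prices shown in the window 604 could be derived from marketplace listings as in the small sketch below; the listing prices are, of course, illustrative.

```python
def price_summary(listing_prices: list[float]) -> dict:
    """Summarize marketplace listings for the selected item as the lowest,
    highest, and average price, as displayed in the shopping-information window."""
    return {
        "lowest": min(listing_prices),
        "highest": max(listing_prices),
        "average": sum(listing_prices) / len(listing_prices),
    }

# Example: lowest 499.99, highest 549.00, average roughly 526.31.
print(price_summary([499.99, 549.00, 529.95]))
```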



FIG. 6D illustrates an example screenshot displaying the window 604 providing additional information for the selected item. In the present example, the window 604 provides dimensions, weight, item identifiers, and product description of the selected item. Any information pertaining to the selected item may be provided in the window 604.



FIG. 6E illustrates an example screenshot displaying the window 604 having recommendations. The recommendations may be provided by the recommendation module 312 and include a name of each recommended item and an image of the recommended item. Other information, such as price, ratings, or dimensions, may also be provided in the window 604. The recommendations may be, for example, items that may fit in the user designated location better, items less expensive than the selected item, items that are a new model of the selected item, or items that rank higher based on other users of the system.


While the various examples of FIGS. 6C-6E provide the window 604 for displaying additional information, alternative embodiments may use other display mechanisms to provide the additional information. For example, the additional information may be displayed on a side of a display showing the environment image 600.


Modules, Components, and Logic


Certain embodiments described herein may be implemented as logic or a number of modules, engines, components, or mechanisms. A module, engine, logic, component, or mechanism (collectively referred to as a “module”) may be a tangible unit capable of performing certain operations and configured or arranged in a certain manner. In certain example embodiments, one or more computer systems (e.g., a standalone, client, or server computer system) or one or more components of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) or firmware (note that software and firmware can generally be used interchangeably herein as is known by a skilled artisan) as a module that operates to perform certain operations described herein.


In various embodiments, a module may be implemented mechanically or electronically. For example, a module may comprise dedicated circuitry or logic that is permanently configured (e.g., within a special-purpose processor, application specific integrated circuit (ASIC), or array) to perform certain operations. A module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software or firmware to perform certain operations. It will be appreciated that a decision to implement a module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by, for example, cost, time, energy-usage, and package size considerations.


Accordingly, the term “module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. Considering embodiments in which modules or components are temporarily configured (e.g., programmed), each of the modules or components need not be configured or instantiated at any one instance in time. For example, where the modules or components comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different modules at different times. Software may accordingly configure the processor to constitute a particular module at one instance of time and to constitute a different module at a different instance of time.


Modules can provide information to, and receive information from, other modules. Accordingly, the described modules may be regarded as being communicatively coupled. Where multiples of such modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the modules. In embodiments in which multiple modules are configured or instantiated at different times, communications between such modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple modules have access. For example, one module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further module may then, at a later time, access the memory device to retrieve and process the stored output. Modules may also initiate communications with input or output devices and can operate on a resource (e.g., a collection of information).


Example Machine Architecture and Machine Readable Medium


With reference to FIG. 7, an example embodiment extends to a machine in the example form of a computer system 700 within which instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed. In alternative example embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, a switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.


The example computer system 700 may include a processor 702 (e.g., a central processing unit (CPU), a graphics processing unit (GPU) or both), a main memory 704 and a static memory 706, which communicate with each other via a bus 708. The computer system 700 may further include a video display unit 710 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). In example embodiments, the computer system 700 also includes one or more of an alpha-numeric input device 712 (e.g., a keyboard), a user interface (UI) navigation device or cursor control device 714 (e.g., a mouse), a disk drive unit 716, a signal generation device 718 (e.g., a speaker), and a network interface device 720.


Machine-Readable Storage Medium


The disk drive unit 716 includes a machine-readable storage medium 722 on which is stored one or more sets of instructions 724 and data structures (e.g., software instructions) embodying or used by any one or more of the methodologies or functions described herein. The instructions 724 may also reside, completely or at least partially, within the main memory 704, the static memory 706 or within the processor 702 during execution thereof by the computer system 700, with the main memory 704 and the processor 702 also constituting machine-readable media.


While the machine-readable storage medium 722 is shown in an example embodiment to be a single medium, the term “machine-readable storage medium” may include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) that store the one or more instructions. The term “machine-readable medium” shall also be taken to include any tangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of embodiments of the present invention, or that is capable of storing, encoding, or carrying data structures used by or associated with such instructions. The term “machine-readable storage medium” shall accordingly be taken to include, but not be limited to, solid-state memories and optical and magnetic media. Specific examples of machine-readable storage media include non-volatile memory, including by way of example semiconductor memory devices (e.g., Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), and flash memory devices); magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.


Transmission Medium


The instructions 724 may further be transmitted or received over a communications network 726 using a transmission medium via the network interface device 720 and utilizing any one of a number of well-known transfer protocols (e.g., HTTP). Examples of communication networks include a local area network (LAN), a wide area network (WAN), the Internet, mobile telephone networks, POTS networks, and wireless data networks (e.g., WiFi and WiMax networks). The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.


Although an overview of the inventive subject matter has been described with reference to specific example embodiments, various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of embodiments of the present invention. Such embodiments of the inventive subject matter may be referred to herein, individually or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is, in fact, disclosed.


The embodiments illustrated herein are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed. Other embodiments may be used and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. The Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.


Moreover, plural instances may be provided for resources, operations, or structures described herein as a single instance. Additionally, boundaries between various resources, operations, modules, engines, and data stores are somewhat arbitrary, and particular operations are illustrated in a context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within a scope of various embodiments of the present invention. In general, structures and functionality presented as separate resources in the example configurations may be implemented as a combined structure or resource. Similarly, structures and functionality presented as a single resource may be implemented as separate resources. These and other variations, modifications, additions, and improvements fall within a scope of embodiments of the present invention as represented by the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.

Claims
  • 1. A method comprising: receiving environment image data containing an image of an environment; receiving a selection of an item for placement into an indicated location of the image of the environment; scaling an item image of the selected item based on dimensions determined from the environment image data for the environment; augmenting, using at least one processor, the scaled item image into the image of the environment at the indicated location to generate an augmented reality image; causing display of the augmented reality image on a device of a user, the scaled item image in the augmented reality image being selectable to cause display of information; receiving a selection of the scaled item image in the augmented reality image; and in response to the receiving the selection of the scaled item image, causing presentation of the information on the device of the user.
  • 2. The method of claim 1, further comprising determining a distance to the indicated location, the scaling of the item image of the selected item being based on the dimensions and the determined distance.
  • 3. The method of claim 2, wherein the determining the distance to the indicated location comprises using environment image data derived from a focus capability of an image capture device associated with the client device to determine the distance.
  • 4. The method of claim 2, wherein the determining the distance to the indicated location comprises using environment image data based on an echo technique performed by a sound generator of the client device to determine the distance.
  • 5. The method of claim 1, further comprising determining the dimensions for the environment using a marker located in the environment.
  • 6. The method of claim 1, further comprising: determining an orientation of the indicated location; and orienting the scaled item image to the determined orientation of the indicated location.
  • 7. The method of claim 1, wherein the image of the environment comprises a video of the environment, the scaling of the item image and the augmenting of the scaled item image being repeatedly performed for the video.
  • 8. The method of claim 1, wherein the receiving of the selection of the item comprises receiving the item image for the selected item.
  • 9. The method of claim 1, further comprising obtaining item data for the selected item from an item catalog, the item data including dimension data for the selected item.
  • 10. The method of claim 1, wherein the causing presentation of the additional information in response to the receiving the selection of the scaled item image comprises causing presentation of at least one alternative recommendation, the least one alternative recommendation being a smaller or larger item.
  • 11. The method of claim 1, wherein the causing the presentation of the additional information in response to the receiving the selection of the scaled item image comprises causing presentation of at least one alternative recommendation, the least one alternative recommendation being an item that is less expensive than the selected item.
  • 12. The method of claim 1, wherein the causing presentation of the additional information in response to the receiving the selection of the scaled item image comprises causing presentation of one or more links to various marketplaces where the selected item may be purchased.
  • 13. The method of claim 1, wherein the causing presentation of the additional information in response to the receiving the selection of the scaled item image comprises causing presentation of one or more of a dimension, weight, item identifier, or product description of the selected item.
  • 14. The method of claim 1, wherein the causing the presentation of the additional information in response to the receiving the selection of the scaled item image comprises causing presentation of at least one alternative recommendation, the least one alternative recommendation being a newer version of the selected item.
  • 15. The method of claim 1, wherein the causing the presentation of the additional information in response to the receiving the selection of the scaled item image comprises causing presentation of at least one alternative recommendation, the least one alternative recommendation being an item that ranks higher than the selected item.
  • 16. A system comprising: one or more hardware processors; and a storage device storing instructions that, when executed by the one or more hardware processors, cause the one or more hardware processors to perform operations comprising: receiving environment image data containing an image of an environment; receiving a selection of an item for placement into an indicated location of the image of the environment; scaling an item image of the selected item based on dimensions determined from the environment image data for the environment; augmenting the scaled item image into the image of the environment at the indicated location to generate an augmented reality image; causing display of the augmented reality image on a device of a user, the scaled item image in the augmented reality image being selectable to cause display of information; receiving a selection of the scaled item image in the augmented reality image; and in response to the receiving the selection of the scaled item image, causing presentation of the information on the device of the user.
  • 17. The system of claim 16, wherein the causing the presentation of the additional information in response to the receiving the selection of the scaled item image comprises causing presentation of at least one alternative recommendation.
  • 18. The system of claim 16, wherein the causing presentation of the additional information in response to the receiving the selection of the scaled item image comprises causing presentation of one or more links to various marketplaces where the selected item may be purchased.
  • 19. The system of claim 16, wherein the causing presentation of the additional information in response to the receiving the selection of the scaled item image comprises causing presentation of one or more of a dimension, weight, item identifier, or product description of the selected item.
  • 20. A machine-readable hardware device storing instructions which, when executed by at least one processor of a machine, cause the machine to perform operations comprising: receiving environment image data containing an image of an environment; receiving a selection of an item for placement into an indicated location of the image of the environment; scaling an item image of the selected item based on dimensions determined from the environment image data for the environment; augmenting the scaled item image into the image of the environment at the indicated location to generate an augmented reality image; causing display of the augmented reality image on a device of a user, the scaled item image in the augmented reality image being selectable to cause display of information; receiving a selection of the scaled item image in the augmented reality image; and in response to the receiving the selection of the scaled item image, causing presentation of the information on the device of the user.
PRIORITY

This application is a continuation of and claims the benefit of priority to U.S. patent application Ser. No. 13/283,416, filed on Oct. 27, 2011, which is hereby incorporated by reference herein in its entirety.

US Referenced Citations (319)
Number Name Date Kind
3675215 Arnold et al. Jul 1972 A
4539585 Spackova et al. Sep 1985 A
4596144 Panton et al. Jun 1986 A
5068723 Dixit et al. Nov 1991 A
5408417 Wilder Apr 1995 A
5546475 Bolle et al. Aug 1996 A
5579471 Barber et al. Nov 1996 A
5692012 Virtamo et al. Nov 1997 A
5781899 Hirata Jul 1998 A
5802361 Wang et al. Sep 1998 A
5818964 Itoh Oct 1998 A
5870149 Comroe et al. Feb 1999 A
5889896 Meshinsky et al. Mar 1999 A
5949429 Bonneau et al. Sep 1999 A
6112226 Weaver et al. Aug 2000 A
6134548 Gottsman et al. Oct 2000 A
6134674 Akasheh Oct 2000 A
6151587 Matthias Nov 2000 A
6154738 Call Nov 2000 A
6157435 Slater et al. Dec 2000 A
6216134 Heckerman et al. Apr 2001 B1
6216227 Goldstein et al. Apr 2001 B1
6278446 Liou et al. Aug 2001 B1
6292593 Nako et al. Sep 2001 B1
6434530 Sloane et al. Aug 2002 B1
6463426 Lipson et al. Oct 2002 B1
6477269 Brechner Nov 2002 B1
6483570 Slater et al. Nov 2002 B1
6484130 Dwyer et al. Nov 2002 B2
6512919 Ogasawara Jan 2003 B2
6530521 Henry Mar 2003 B1
6549913 Murakawa Apr 2003 B1
6563959 Troyanker May 2003 B1
6567797 Schuetze et al. May 2003 B1
6587835 Treyz et al. Jul 2003 B1
6589290 Maxwell et al. Jul 2003 B1
6642929 Essafi et al. Nov 2003 B1
6714945 Foote et al. Mar 2004 B1
6724930 Kosaka et al. Apr 2004 B1
6763148 Sternberg et al. Jul 2004 B1
6804662 Annau et al. Oct 2004 B1
6901379 Balter et al. May 2005 B1
6947571 Rhoads et al. Sep 2005 B1
7022281 Senff Apr 2006 B1
7023441 Choi et al. Apr 2006 B2
7062722 Carlin et al. Jun 2006 B1
7082365 Sheha et al. Jul 2006 B2
7130466 Seeber Oct 2006 B2
7149665 Feld et al. Dec 2006 B2
7162082 Edwards Jan 2007 B2
7240025 Stone et al. Jul 2007 B2
7254779 Rezvani et al. Aug 2007 B1
7257268 Eichhorn et al. Aug 2007 B2
7281018 Begun et al. Oct 2007 B1
7346453 Matsuoka Mar 2008 B2
7346543 Edmark Mar 2008 B1
7363214 Musgrove et al. Apr 2008 B2
7363252 Fujimoto Apr 2008 B2
7460735 Rowley et al. Dec 2008 B1
7478143 Friedman et al. Jan 2009 B1
7495674 Biagiotti et al. Feb 2009 B2
7519562 Vander Mey et al. Apr 2009 B1
7568004 Gottfried Jul 2009 B2
7587359 Levy et al. Sep 2009 B2
7593602 Stentiford Sep 2009 B2
7683858 Allen et al. Mar 2010 B2
7702185 Keating et al. Apr 2010 B2
7752082 Calabria Jul 2010 B2
7756757 Oakes, III Jul 2010 B1
7761339 Alivandi Jul 2010 B2
7801893 Gulli′ et al. Sep 2010 B2
7827074 Rolf Nov 2010 B1
7848764 Riise et al. Dec 2010 B2
7848765 Phillips et al. Dec 2010 B2
7890386 Reber Feb 2011 B1
7916129 Lin et al. Mar 2011 B2
7921040 Reber Apr 2011 B2
7933811 Reber Apr 2011 B2
7948481 Vilcovsky May 2011 B2
7957510 Denney et al. Jun 2011 B2
8078498 Edmark Dec 2011 B2
8130242 Cohen Mar 2012 B2
8230016 Pattan et al. Jul 2012 B1
8239130 Upstill et al. Aug 2012 B1
8260846 Lahav Sep 2012 B2
8275590 Szymczyk et al. Sep 2012 B2
8370062 Starenky et al. Feb 2013 B1
8385646 Lang et al. Feb 2013 B2
8547401 Mallinson et al. Oct 2013 B2
8825660 Chittar Sep 2014 B2
9058764 Persson et al. Jun 2015 B1
9164577 Tapley et al. Oct 2015 B2
9240059 Zises et al. Jan 2016 B2
9336541 Pugazhendhi et al. May 2016 B2
9449342 Sacco Sep 2016 B2
9495386 Tapley et al. Nov 2016 B2
9530059 Zises Dec 2016 B2
9953350 Pugazhendhi et al. Apr 2018 B2
20010034668 Whitworth Oct 2001 A1
20010049636 Hudda et al. Dec 2001 A1
20020002504 Engel et al. Jan 2002 A1
20020027694 Kim et al. Mar 2002 A1
20020052709 Akatsuka et al. May 2002 A1
20020072993 Sandus et al. Jun 2002 A1
20020094189 Navab et al. Jul 2002 A1
20020107737 Kaneko et al. Aug 2002 A1
20020116286 Walker et al. Aug 2002 A1
20020146176 Meyers Oct 2002 A1
20020196333 Gorischek Dec 2002 A1
20030018652 Heckerman et al. Jan 2003 A1
20030028873 Lemmons Feb 2003 A1
20030051255 Bulman et al. Mar 2003 A1
20030053706 Hong et al. Mar 2003 A1
20030085894 Tatsumi May 2003 A1
20030101105 Vock May 2003 A1
20030112260 Gouzu Jun 2003 A1
20030123026 Abitbol et al. Jul 2003 A1
20030130910 Pickover et al. Jul 2003 A1
20030147623 Fletcher Aug 2003 A1
20030208409 Mault Nov 2003 A1
20030229537 Dunning et al. Dec 2003 A1
20030231806 Troyanker Dec 2003 A1
20040019643 Zirnstein, Jr. Jan 2004 A1
20040046779 Asano et al. Mar 2004 A1
20040057627 Abe et al. Mar 2004 A1
20040075670 Bezine et al. Apr 2004 A1
20040096096 Huber May 2004 A1
20040133927 Sternberg et al. Jul 2004 A1
20040153505 Verdi et al. Aug 2004 A1
20040205286 Bryant et al. Oct 2004 A1
20040220767 Tanaka et al. Nov 2004 A1
20050001852 Dengler et al. Jan 2005 A1
20050004850 Gutbrod et al. Jan 2005 A1
20050081161 Macinnes et al. Apr 2005 A1
20050084154 Li et al. Apr 2005 A1
20050091597 Ackley Apr 2005 A1
20050151743 Sitrick Jul 2005 A1
20050151963 Pulla et al. Jul 2005 A1
20050162419 Kim et al. Jul 2005 A1
20050162523 Darrell et al. Jul 2005 A1
20050171864 Nakade et al. Aug 2005 A1
20050182792 Israel et al. Aug 2005 A1
20050193006 Bandas Sep 2005 A1
20050222987 Vadon Oct 2005 A1
20050283379 Reber Dec 2005 A1
20060004850 Chowdhury Jan 2006 A1
20060012677 Neven, Sr. et al. Jan 2006 A1
20060013481 Park et al. Jan 2006 A1
20060015492 Keating et al. Jan 2006 A1
20060032916 Mueller et al. Feb 2006 A1
20060038833 Mallinson et al. Feb 2006 A1
20060058948 Blass et al. Mar 2006 A1
20060071945 Anabuki Apr 2006 A1
20060071946 Anabuki et al. Apr 2006 A1
20060116935 Evans Jun 2006 A1
20060120686 Liebenow Jun 2006 A1
20060149625 Koningstein Jul 2006 A1
20060149638 Allen Jul 2006 A1
20060184013 Emanuel et al. Aug 2006 A1
20060190293 Richards Aug 2006 A1
20060218153 Voon et al. Sep 2006 A1
20060240862 Neven et al. Oct 2006 A1
20070005576 Cutrell et al. Jan 2007 A1
20070015586 Huston Jan 2007 A1
20070060112 Reimer Mar 2007 A1
20070078846 Gulli et al. Apr 2007 A1
20070104348 Cohen May 2007 A1
20070122947 Sakurai et al. May 2007 A1
20070133947 Armitage et al. Jun 2007 A1
20070143082 Degnan Jun 2007 A1
20070150403 Mock et al. Jun 2007 A1
20070172155 Guckenberger Jul 2007 A1
20070198505 Fuller Aug 2007 A1
20070230817 Kokojima Oct 2007 A1
20070244924 Sadovsky et al. Oct 2007 A1
20070300161 Bhatia Dec 2007 A1
20080003966 Magnusen Jan 2008 A1
20080037877 Jia et al. Feb 2008 A1
20080046738 Galloway et al. Feb 2008 A1
20080046956 Kulas Feb 2008 A1
20080059055 Geelen et al. Mar 2008 A1
20080071559 Arrasvuori Mar 2008 A1
20080074424 Carignano Mar 2008 A1
20080082426 Gokturk et al. Apr 2008 A1
20080084429 Wissinger Apr 2008 A1
20080104054 Spangler May 2008 A1
20080142599 Benillouche et al. Jun 2008 A1
20080151092 Vilcovsky Jun 2008 A1
20080154710 Varma Jun 2008 A1
20080163311 St. John-Larkin Jul 2008 A1
20080163379 Robinson Jul 2008 A1
20080165032 Lee et al. Jul 2008 A1
20080170810 Wu et al. Jul 2008 A1
20080176545 Dicke et al. Jul 2008 A1
20080177640 Gokturk et al. Jul 2008 A1
20080186226 Ratnakar Aug 2008 A1
20080194323 Merkli et al. Aug 2008 A1
20080201241 Pecoraro Aug 2008 A1
20080205755 Jackson et al. Aug 2008 A1
20080205764 Iwai et al. Aug 2008 A1
20080207357 Savarese et al. Aug 2008 A1
20080225123 Osann et al. Sep 2008 A1
20080240575 Panda et al. Oct 2008 A1
20080255961 Livesey Oct 2008 A1
20080268876 Gelfand et al. Oct 2008 A1
20080278778 Saino Nov 2008 A1
20080285940 Kulas Nov 2008 A1
20080288338 Wiseman et al. Nov 2008 A1
20080288477 Kim et al. Nov 2008 A1
20090006208 Grewal et al. Jan 2009 A1
20090019487 Kulas Jan 2009 A1
20090028435 Wu et al. Jan 2009 A1
20090028446 Wu et al. Jan 2009 A1
20090083096 Cao et al. Mar 2009 A1
20090094260 Cheng et al. Apr 2009 A1
20090106127 Purdy et al. Apr 2009 A1
20090109240 Englert et al. Apr 2009 A1
20090144624 Barnes, Jr. Jun 2009 A1
20090228342 Walker et al. Sep 2009 A1
20090232354 Camp, Jr. et al. Sep 2009 A1
20090235181 Saliba et al. Sep 2009 A1
20090235187 Kim et al. Sep 2009 A1
20090240735 Grandhi et al. Sep 2009 A1
20090245638 Collier et al. Oct 2009 A1
20090262137 Walker et al. Oct 2009 A1
20090271293 Parkhurst et al. Oct 2009 A1
20090287587 Bloebaum et al. Nov 2009 A1
20090299824 Barnes, Jr. Dec 2009 A1
20090304267 Tapley et al. Dec 2009 A1
20090319373 Barrett Dec 2009 A1
20090319388 Yuan et al. Dec 2009 A1
20090319887 Waltman et al. Dec 2009 A1
20090324100 Kletter et al. Dec 2009 A1
20090324137 Stallings et al. Dec 2009 A1
20090325554 Reber Dec 2009 A1
20100015960 Reber Jan 2010 A1
20100015961 Reber Jan 2010 A1
20100015962 Reber Jan 2010 A1
20100034469 Thorpe et al. Feb 2010 A1
20100037177 Golsorkhi Feb 2010 A1
20100045701 Scott et al. Feb 2010 A1
20100046842 Conwell Feb 2010 A1
20100048290 Baseley et al. Feb 2010 A1
20100049663 Kane, Jr. et al. Feb 2010 A1
20100070996 Liao et al. Mar 2010 A1
20100082927 Riou Apr 2010 A1
20100131714 Chandrasekaran May 2010 A1
20100153378 Sardesai Jun 2010 A1
20100161605 Gabrilovich et al. Jun 2010 A1
20100171758 Maassel et al. Jul 2010 A1
20100171999 Namikata et al. Jul 2010 A1
20100185529 Chesnut et al. Jul 2010 A1
20100198684 Eraker et al. Aug 2010 A1
20100211900 Fujioka Aug 2010 A1
20100214284 Rieffel et al. Aug 2010 A1
20100235259 Farraro et al. Sep 2010 A1
20100241650 Chittar Sep 2010 A1
20100257024 Holmes et al. Oct 2010 A1
20100260426 Huang et al. Oct 2010 A1
20100281417 Yolleck et al. Nov 2010 A1
20100287511 Meier et al. Nov 2010 A1
20100312596 Saffari et al. Dec 2010 A1
20100316288 Ip et al. Dec 2010 A1
20100332283 Ng et al. Dec 2010 A1
20100332304 Higgins et al. Dec 2010 A1
20110004517 Soto et al. Jan 2011 A1
20110016487 Chalozin et al. Jan 2011 A1
20110029334 Reber Feb 2011 A1
20110053642 Lee Mar 2011 A1
20110055054 Glasson Mar 2011 A1
20110061011 Hoguet Mar 2011 A1
20110078305 Varela Mar 2011 A1
20110084983 Demaine Apr 2011 A1
20110128300 Gay et al. Jun 2011 A1
20110143731 Ramer et al. Jun 2011 A1
20110148924 Tapley et al. Jun 2011 A1
20110153614 Solomon Jun 2011 A1
20110173191 Tsaparas et al. Jul 2011 A1
20110184780 Alderson et al. Jul 2011 A1
20110187306 Aarestrup et al. Aug 2011 A1
20110215138 Crum Sep 2011 A1
20110246064 Nicholson Oct 2011 A1
20120072233 Hanlon et al. Mar 2012 A1
20120084812 Thompson et al. Apr 2012 A1
20120099800 Llano et al. Apr 2012 A1
20120105475 Tseng May 2012 A1
20120113141 Zimmerman et al. May 2012 A1
20120120113 Hueso May 2012 A1
20120165046 Rhoads et al. Jun 2012 A1
20120185492 Israel et al. Jul 2012 A1
20120192235 Tapley et al. Jul 2012 A1
20120195464 Ahn Aug 2012 A1
20120197764 Nuzzi et al. Aug 2012 A1
20120215612 Ramer et al. Aug 2012 A1
20120230581 Miyashita Sep 2012 A1
20120284105 Li Nov 2012 A1
20120308077 Tseng Dec 2012 A1
20120327115 Chhetri et al. Dec 2012 A1
20130019177 Schlossberg et al. Jan 2013 A1
20130050218 Beaver, III et al. Feb 2013 A1
20130073365 Mccarthy Mar 2013 A1
20130086029 Hebert Apr 2013 A1
20130103306 Uetake Apr 2013 A1
20130106910 Sacco et al. May 2013 A1
20130116922 Cai et al. May 2013 A1
20130144701 Kulasooriya et al. Jun 2013 A1
20130170697 Zises Jul 2013 A1
20130198002 Nuzzi et al. Aug 2013 A1
20130325839 Goddard et al. Dec 2013 A1
20140007012 Govande et al. Jan 2014 A1
20140085333 Pugazhendhi et al. Mar 2014 A1
20140372449 Chittar Dec 2014 A1
20150052171 Cheung Feb 2015 A1
20160019723 Tapley et al. Jan 2016 A1
20160034944 Raab et al. Feb 2016 A1
20160117863 Pugazhendhi et al. Apr 2016 A1
20160171305 Zises Jun 2016 A1
20170046593 Tapley et al. Feb 2017 A1
20170091975 Zises Mar 2017 A1
Foreign Referenced Citations (61)
Number Date Country
2012212601 May 2016 AU
1255989 Jun 2000 CN
1750001 Mar 2006 CN
1802586 Jul 2006 CN
101515198 Aug 2009 CN
101520904 Sep 2009 CN
101541012 Sep 2009 CN
101764973 Jun 2010 CN
101772779 Jul 2010 CN
101893935 Nov 2010 CN
102084391 Jun 2011 CN
102156810 Aug 2011 CN
102194007 Sep 2011 CN
102667913 Sep 2012 CN
103443817 Dec 2013 CN
104081379 Oct 2014 CN
104656901 May 2015 CN
105787764 Jul 2016 CN
1365358 Nov 2003 EP
1710717 Oct 2006 EP
2015244 Jan 2009 EP
2034433 Mar 2009 EP
2418275 Mar 2006 GB
11-191118 Jul 1999 JP
2001283079 Oct 2001 JP
2001-309323 Nov 2001 JP
2001344479 Dec 2001 JP
2002-099826 Apr 2002 JP
2003-022395 Jan 2003 JP
2004-326229 Nov 2004 JP
2005-337966 Dec 2005 JP
2006-351024 Dec 2006 JP
2007-172605 Jul 2007 JP
2010-039908 Feb 2010 JP
2010-141371 Jun 2010 JP
2010-524110 Jul 2010 JP
2011-209934 Oct 2011 JP
2012-529685 Nov 2012 JP
10-2007-0014532 Feb 2007 KR
10-0805607 Feb 2008 KR
10-0856585 Sep 2008 KR
20090056792 Jun 2009 KR
10-2009-0070900 Jul 2009 KR
10-2010-0067921 Jun 2010 KR
10-2010-0071559 Jun 2010 KR
1020110082690 Jul 2011 KR
2015264850 Apr 2017 RU
WO-9944153 Sep 1999 WO
2008003966 Jan 2008 WO
2008051538 May 2008 WO
2009111047 Sep 2009 WO
2009111047 Dec 2009 WO
2010084585 Jul 2010 WO
2010141939 Dec 2010 WO
WO-2011070871 Jun 2011 WO
WO-2011087797 Jul 2011 WO
WO-2011087797 Jul 2011 WO
2012106096 Aug 2012 WO
WO-2013063299 May 2013 WO
WO-2013101903 Jul 2013 WO
WO-2013101903 Jul 2013 WO
Non-Patent Literature Citations (305)
Entry
“U.S. Appl. No. 12/644,957, Examiner Interview Summary dated Apr. 29, 2015”, 3 pgs.
“U.S. Appl. No. 12/644,957, Examiner Interview Summary dated Jun. 11, 2014”, 3 pgs.
“U.S. Appl. No. 12/644,957, Examiner Interview Summary dated Sep. 4, 2014”, 3 pgs.
“U.S. Appl. No. 12/644,957, Final Office Action dated Jul. 11, 2014”, 25 pgs.
“U.S. Appl. No. 12/644,957, Final Office Action dated Aug. 26, 2013”, 19 pgs.
“U.S. Appl. No. 12/644,957, Non Final Office Action dated Mar. 7, 2014”, 21 pgs.
“U.S. Appl. No. 12/644,957, Non Final Office Action dated Mar. 18, 2013”, 17 pgs.
“U.S. Appl. No. 12/644,957, Non Final Office Action dated Dec. 29, 2014”, 20 pgs.
“U.S. Appl. No. 12/644,957, Notice of Allowance dated Jun. 17, 2015”, 20 pgs.
“U.S. Appl. No. 12/644,957, Response filed Apr. 29, 2015 to Non Final Office Action dated Dec. 29, 2014”, 13 pgs.
“U.S. Appl. No. 12/644,957, Response filed Jun. 9, 2014 to Non Final Office Action dated Mar. 7, 2014”, 13 pgs.
“U.S. Appl. No. 12/644,957, Response filed Jun. 14, 2013 to Non Final Office Action dated Mar. 18, 2013”, 12 pgs.
“U.S. Appl. No. 12/644,957, Response filed Sep. 30, 2014 to Final Office Action dated Jul. 11, 2014”, 14 pgs.
“U.S. Appl. No. 12/644,957, Response filed Nov. 26, 2013 to Final Office Action dated Aug. 26, 2013”, 11 pgs.
“U.S. Appl. No. 13/283,416 Response Filed May 2, 2016 to Non-Final Office Action dated Feb. 2, 2016”, 12 pgs.
“U.S. Appl. No. 13/283,416, Final Office Action dated Aug. 7, 2015”, 25 pgs.
“U.S. Appl. No. 13/283,416, Final Office Action dated Nov. 25, 2014”, 26 pgs.
“U.S. Appl. No. 13/283,416, Non Final Office Action dated Feb. 2, 2016”, 32 pgs.
“U.S. Appl. No. 13/283,416, Non Final Office Action dated Apr. 2, 2015”, 31 pgs.
“U.S. Appl. No. 13/283,416, Non Final Office Action dated Jul. 10, 2014”, 29 pgs.
“U.S. Appl. No. 13/283,416, Notice of Allowance dated May 26, 2016”, 9 pgs.
“U.S. Appl. No. 13/283,416, Response filed Feb. 25, 2015 to Final Office Action dated Nov. 25, 2014”, 12 pgs.
“U.S. Appl. No. 13/283,416, Response filed Jul. 2, 2015 to Non Final Office Action dated Apr. 2, 2015”, 13 pgs.
“U.S. Appl. No. 13/283,416, Response filed Nov. 10, 2014 to Non Final Office Action dated Jul. 10, 2014”, 12 pgs.
“U.S. Appl. No. 13/283,416, Response filed Dec. 7, 2015 to Final Office Action dated Aug. 7, 2015”, 15 pgs.
“U.S. Appl. No. 13/340,141, Examiner Interview Summary dated Aug. 4, 2015” 3 pgs.
“U.S. Appl. No. 13/340,141, Examiner Interview Summary dated Dec. 11, 2014”, 3 pgs.
“U.S. Appl. No. 13/340,141, Final Office Action dated Feb. 6, 2014”, 19 pgs.
“U.S. Appl. No. 13/340,141, Final Office Action dated Sep. 26, 2014”, 22 pgs.
“U.S. Appl. No. 13/340,141, Non Final Office Action dated Apr. 9, 2015”, 13 pgs.
“U.S. Appl. No. 13/340,141, Non Final Office Action dated Jun. 5, 2014”, 18 pgs.
“U.S. Appl. No. 13/340,141, Non Final Office Action dated Aug. 29, 2013”, 15 pgs.
“U.S. Appl. No. 13/340,141, Notice of Allowance dated Sep. 10, 2015”, 9 pgs.
“U.S. Appl. No. 13/340,141, Response filed Feb. 26, 2015 to Final Office Action dated Sep. 26, 2014”, 12 pgs.
“U.S. Appl. No. 13/340,141, Response filed May 6, 2014 to Final Office Action dated Feb. 6, 2014”, 12 pgs.
“U.S. Appl. No. 13/340,141, Response filed Aug. 6, 2015 to Non Final Office Action dated Apr. 9, 2015”, 11 pgs.
“U.S. Appl. No. 13/340,141, Response filed Sep. 5, 2014 to Non Final Office Action dated Jun. 5, 2014”, 14 pgs.
“U.S. Appl. No. 13/340,141, Response filed Dec. 30, 2013 to Non Final Office Action dated Aug. 29, 2013”, 13 pgs.
“U.S. Appl. No. 14/963,706, Non Final Office Action dated Jul. 5, 2016”, 5 pgs.
“U.S. Appl. No. 14/963,706, Notice of Allowance dated Aug. 18, 2016”, 7 pgs.
“U.S. Appl. No. 14/963,706, Preliminary Amendment filed Mar. 11, 2016”, 8 pgs.
“U.S. Appl. No. 14/963,706, Response filed Aug. 3, 2016 to Non Final Office Action dated Jul. 5, 2016”, 8 pgs.
“Australian Application Serial No. 2012328754, Response filed Aug. 3, 2015 to Office Action dated Mar. 30, 2015”, 17 pgs.
“Australian Application Serial No. 2012328754, Office Action dated Mar. 30, 2015”.
“Australian Application Serial No. 2012362467, Office Action dated Mar. 24, 2015”, 3 pgs.
“Australian Application Serial No. 2012362467, Response filed Aug. 28, 2015 to Office Action dated Mar. 24, 2015”, 20 pgs.
“Canadian Application Serial No. 2,850,074, Office Action dated Sep. 29, 2015”, 6 pgs.
“Canadian Application Serial No. 2,856,869, Office Action dated Oct. 14, 2015”, 4 pgs.
“Canadian Application Serial No. 2,856,869, Response filed Apr. 11, 2016 to Office Action dated Oct. 14, 2015”, 20 pgs.
“Chinese Application Serial No. 201080059424.5, Office Action dated Apr. 21, 2014”, with English translation of claims, 18 pgs.
“Chinese Application Serial No. 201080059424.5, Response filed Sep. 4, 2014 to Office Action dated Apr. 21, 2014”, with English translation of claims, 10 pgs.
“Chinese Application Serial No. 201280052967.3, Office Action dated Mar. 2, 2016”, with English translation of claims, 18 pgs.
“Chinese Application Serial No. 201280052967.3, Response filed Jul. 18, 2016 to Office Action dated Mar. 2, 2016”, 8 pgs.
“Definition of Homogeneous Coordinates”, Wikipedia on Mar. 5, 2011 via Internet Archive WayBackMachine, [Online]. Retrieved from the Internet: <https://web.archive.org/web/20110305185824/http://en.wikipedia.org/wiki/Homogeneous_coordinates>, (Nov. 17, 2014), 7 pgs.
“Definition of Polar Coordinate System”, Wikipedia on Oct. 11, 2011 via Internet Archive WayBackMachine, [Online]. Retrieved from the Internet: <https://web.archive.org/web/20111008005218/http://en.wikipedia.org/wiki/Polar_coordinate_system>, (Nov. 17, 2014), 17 pgs.
“European Application Serial No. 10803429.9, Extended European Search Report dated Jun. 17, 2015”, 7 pgs.
“European Application Serial No. 10803429.9, Office Action dated Aug. 22, 2012”, 2 pgs.
“European Application Serial No. 10803429.9, Response filed Jan. 29, 2013 to Office Action dated Aug. 22, 2012”, 10 pgs.
“European Application Serial No. 12843046.9, Extended European Search Report dated Mar. 5, 2015”, 7 pgs.
“European Application Serial No. 12843046.9, Office Action dated Jun. 4, 2014”, 3 pgs.
“European Application Serial No. 12843046.9, Response filed Sep. 30, 2015”, 20 pgs.
“European Application Serial No. 12843046.9, Response filed Nov. 5, 2014 to Office Action dated Jun. 4, 2014”, 5 pgs.
“European Application Serial No. 12862340.2, Extended European Search Report dated Dec. 21, 2015”, 5 pgs.
“European Application Serial No. 12862340.2, Response filed Feb. 3, 2015”, 11 pgs.
“International Application Serial No. PCT/US2010/061628, International Preliminary Report on Patentability dated Jul. 5, 2012”, 6 pgs.
“International Application Serial No. PCT/US2010/061628, International Search Report dated Aug. 12, 2011”, 2 pgs.
“International Application Serial No. PCT/US2010/061628, Written Opinion dated Aug. 12, 2011”, 4 pgs.
“International Application Serial No. PCT/US2012/061966, International Preliminary Report on Patentability dated May 8, 2014”, 6 pgs.
“International Application Serial No. PCT/US2012/061966, International Search Report dated Jan. 18, 2013”, 2 pgs.
“International Application Serial No. PCT/US2012/061966, Written Opinion dated Jan. 18, 2013”, 4 pgs.
“International Application Serial No. PCT/US2012/071770, International Preliminary Report on Patentability dated Jul. 10, 2014”, 7 pgs.
“International Application Serial No. PCT/US2012/071770, International Search Report dated May 13, 2013”, 2 pgs.
“International Application Serial No. PCT/US2012/071770, Written Opinion dated May 13, 2013”, 5 pgs.
“Japanese Application Serial No. 2014-539013, Office Action dated May 31, 2016”, With English Translation, 4 pgs.
“Japanese Application Serial No. 2014-539013, Office Action dated Aug. 11, 2015”, with English translation of claims, 7 pgs.
“Japanese Application Serial No. 2014-539013, Response filed Aug. 19, 2016 to Office Action dated May 31, 2016”, (English Translation of Claims), 11 pgs.
“Japanese Application Serial No. 2014-539013, Response filed Dec. 9, 2015 to Office Action dated Aug. 11, 2015”, 16 pgs.
“Korean Application Serial No. 2012-7019181, Notice of Appeal filed Feb. 4, 2015”, with English translation of claims, 24 pgs.
“Korean Application Serial No. 2012-7019181, Notice of Final Rejection dated Nov. 3, 2014”, with English translation of claims, 7 pgs.
“Korean Application Serial No. 2012-7019181, Notice of Preliminary Rejection dated Nov. 18, 2013”, with English translation of claims, 11 pgs.
“Korean Application Serial No. 2012-7019181, Office Action dated Jun. 26, 2014”, with English translation of claims, 5 pgs.
“Korean Application Serial No. 2012-7019181, Response filed Feb. 18, 2014 to Notice of Preliminary Rejection dated Nov. 18, 2013”, with English translation of claims, 26 pgs.
“Korean Application Serial No. 2014-7014116, Final Office Action dated Jan. 29, 2016”, 8 pgs.
“Korean Application Serial No. 2014-7014116, Office Action dated Jun. 26, 2015”, w/ English Claims, 13 pgs.
“Korean Application Serial No. 2014-7014116, Request for Re-Examination filed May 2, 2016”, with English translation of claims, 22 pgs.
“Korean Application Serial No. 2014-7014116, Response filed Aug. 26, 2015”, with English translation of claims, 23 pgs.
“MLB At Bat 11”, [Online]. Retrieved from the Internet: <URL:http://texas.rangers.mlb.com/mobile/atbat/?c_id=tex>, (Accessed Dec. 22, 2011), 3 pgs.
Kan, et al., “Applying QR Code in Augmented Reality Applications”, VRCAI, (Dec. 15, 2009), 253-258.
Kraft, Adam, “Real Time Baseball Augmented Reality”, Washington University in St. Louis, (Dec. 6, 2011), 10 pgs.
Mulloni, Alessandro, et al., “Handheld augmented reality indoor navigation with activity-based instructions”, Proceedings of the 13th international conference on human computer interaction with mobile devices and services, (2011), 10 pgs.
Vlahakis, Vassilios, et al., “Archeoguide: An Augmented Reality Guide for Archaeological Sites”, IEEE Computer Graphics and Application vol. 22, No. 5, (2002), 52-60 pgs.
Vlahakis, Vassilios, et al., “Archeoguide: first results of an augmented reality, mobile computing system in cultural heritage sites”, Virtual Reality, Archeology, and Cultural Heritage, (2001), 9 pgs.
“U.S. Appl. No. 15/377,651, Preliminary Amendment filed Dec. 27, 2016”, 6 pgs.
“Australian Application Serial No. 2015264850, First Examiners Report dated Dec. 19, 2016”, 2 pgs.
“Canadian Application Serial No. 2,850,074, Office Action dated Nov. 28, 2016”, 11 pgs.
“Chinese Application Serial No. 201280052967.3, Office Action dated Aug. 24, 2016”, with English translation of claims, 14 pgs.
“European Application Serial No. 12843046.9, Communication Pursuant to Article 94(3) EPC dated Dec. 1, 2016”, 8 pgs.
“Japanese Application Serial No. 2014-539013, Examiners Decision of Final Refusal dated Dec. 6, 2016”, with English translation of claims, 5 pgs.
“Korean Application Serial No. 2016-7024912, Office Action dated Dec. 7, 2016”, with English translation of claims, 11 pgs.
“Australian Application Serial No. 2015264850, Response filed Mar. 20, 2017 to First Examiners Report dated Dec. 19, 2016”, 15 pgs.
“Chinese Application Serial No. 201280052967.3, Response filed Jan. 6, 2017 to Office Action dated Aug. 24, 2016”, 10 pgs.
“European Application Serial No. 12843046.9, Response filed Apr. 6, 2017 to Communication Pursuant to Article 94(3) EPC dated Dec. 1, 2016”, 11 pgs.
“Korean Application Serial No. 2016-7024912, Response filed Feb. 7, 2017 to Office Action dated Dec. 7, 2016”, (English Translation of Claims), 24 pgs.
“Canadian Application Serial No. 2,850,074, Office Action Response dated May 26, 2017”, 23 pgs.
“Canadian Application Serial No. 2,850,074, Office Action Response dated Mar. 29, 2016”, 9 pgs.
“Chinese Application Serial No. 201280052967.3, Office Action dated Mar. 23, 2017”, with English translation of claims, 22 pgs.
“Japanese Application Serial No. 2014-539013, Appeal with Amendment filed Apr. 6, 2017”, with English translation of claims, 19 pgs.
Request for Reexamination for Chinese Patent Application No. 201280052967.3, filed on Nov. 17, 2017, 11 pages (including English Translation of claims).
Rejection Decision received from Chinese Patent Application No. 201280052967.3, dated Aug. 4, 2017, 17 pages (in English).
Response to Final Office Action filed on Sep. 18, 2017, for Korean Patent Application No. 2016-7024912, dated Jun. 16, 2017, 23 pages (including English Translation).
Office Action received for Canadian Patent Application No. 2,850,074, dated Oct. 23, 2017, 6 pages.
Final Office Action received for Korean Patent Application No. 2016-7024912, dated Oct. 25, 2017, 7 pages (including English Translation of claims).
First Examiner Report received for Indian Patent Application No. 6557/DELNP/2010, dated Apr. 11, 2017, 11 pages.
“S60 Camera Phones Get Image Recognition Technology”, Retrieved from the Internet URL: <https://news.softpedia.com/news/S60-Camera-Phones-Get-Image-Recognition-Technology-79666.shtml>, Feb. 27, 2008, 2 pages.
“SnapTell: Technology”, Retrieved from the Internet URL: <http://web.archive.org/web/20071117023817/http://www.snaptell.com/technology/index.htm>, Nov. 17, 2007, 1 page.
“The ESP Game”, Retrieved from the Internet URL: <http://www.espgame.org/instructions.html>, Nov. 13, 2007, 2 pages.
Appeal Decision received for Korean Patent Application No. 10-2012-7019181, mailed on Feb. 1, 2016, 16 pages.
Preinterview First Office Action received for U.S. Appl. No. 14/990,291, dated Aug. 10, 2017, 4 pages.
Office Action—First Action Interview received for U.S. Appl. No. 14/990,291, dated Oct. 18, 2017, 5 pages.
Office Action received for Korean Patent Application No. 10-2012-7019181, dated Feb. 23, 2016, 12 pages (5 pages of English Translation and 7 pages of Official Copy).
Notice of Allowance received for U.S. Appl. No. 14/990,291, dated Dec. 13, 2017, 5 pages.
Response to Office Action filed on May 23, 2016 for Korean Patent Application No. 10-2012-7019181, dated Feb. 23, 2016, 26 pages (21 pages of Official Copy and 5 pages of English Pending Claims).
Final Office Action received for Korean Patent Application No. 10-2013-7023099, dated Jun. 10, 2014, 5 pages (2 pages of English Translation and 3 pages of Official Copy).
Office Action received for Korean Patent Application No. 10-2013-7023099, dated Jan. 10, 2014, 8 pages (3 pages of English Translation and 5 pages of Official copy).
Notice of Decision to Grant Received for Korean Patent Application No. 10-2014-7004160, dated Jun. 15, 2016, 8 pages (6 pages of English Translation and 2 pages of Official Copy).
Office Action received for Korean Patent Application No. 10-2014-7004160, dated Mar. 2, 2016, 7 pages (2 pages of English Translation and 5 pages of Official Copy).
Response to Office Action filed on Jun. 2, 2016 for Korean Patent Application No. 10-2014-7004160, dated Mar. 2, 2016, 39 pages (34 pages of Official Copy and 5 pages of English Pending Claims).
Final Office Action received for Korean Patent Application No. 10-2014-7009560, dated May 26, 2015, 6 pages.
Final Office Action received for Korean Patent Application No. 10-2014-7009560, dated Sep. 30, 2015, 4 pages (2 pages of English Translation and 3 pages of Official Copy).
Response to Office Action filed on Aug. 26, 2015 for Korean Patent Application No. 10-2014-7009560, dated May 26, 2015, 12 pages.
Response to Office Action filed on Jan. 7, 2015 for Korean Patent Application No. 10-2014-7009560, dated Oct. 8, 2014, 14 pages.
Office Action received for Korean Patent Application No. 10-2015-7037233, dated Mar. 30, 2016, 6 pages (2 pages of English Translation and 4 pages of Official Copy).
Response to Office Action filed on Jun. 30, 2016 for Korean Patent Application No. 10-2015-7037233, dated Mar. 30, 2016, 34 pages.
Appeal Brief Filed on Jan. 19, 2018 for Korean Patent Application No. 10-2016-7024912, 23 pages.
Notice of Allowance Received for Korean Patent Application No. 10-2016-7025254 dated Mar. 9, 2018, 5 pages (2 pages of English Translation and 3 pages of Official Copy).
Office Action received for Korean Patent Application No. 10-2016-7025254, dated May 2, 2017, 10 pages.
Office Action received for Korean Patent Application No. 10-2016-7025254, dated Oct. 13, 2016, 12 pages.
Office Action received for Korean Patent Application No. 10-2016-7025254, dated Sep. 5, 2017, 12 pages (5 pages of English Translation and 7 pages of Official Copy).
Response to Office Action filed on Dec. 27, 2016 for Korean Patent Application No. 10-2016-7025254, dated Oct. 13, 2016, 25 pages.
Response to Office Action filed on Nov. 3, 2017, for Korean Patent Application No. 10-2016-7025254, dated May 2, 2017, 22 pages (17 pages of Official Copy and 5 pages of English Pending Claims).
Office Action received for Korean Patent Application No. 10-2017-7036972, dated Jan. 30, 2018, 8 pages.
Response to Extended European Search report received filed on Dec. 15, 2015, for European Patent Application No. 10803429.9, dated Jun. 17, 2015, 24 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 12/371,882, dated Apr. 27, 2016, 3 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 12/371,882, dated Jul. 21, 2015, 3 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 12/371,882, dated Nov. 20, 2013, 3 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 12/371,882, dated Feb. 27, 2012, 3 pages.
Final Office Action received for U.S. Appl. No. 12/371,882 , dated Jun. 25, 2015, 26 pages.
Final Office Action received for U.S. Appl. No. 12/371,882, dated Mar. 13, 2013, 23 pages.
Final Office Action received for U.S. Appl. No. 12/371,882, dated Nov. 14, 2011, 21 pages.
Non-Final Office Action received for U.S. Appl. No. 12/371,882, dated Jun. 8, 2011, 21 pages.
Non-Final Office Action received for U.S. Appl. No. 12/371,882, dated Mar. 12, 2015, 29 pages.
Non-Final Office Action received for U.S. Appl. No. 12/371,882, dated Aug. 30, 2013, 19 pages.
Final Office Action received for U.S. Appl. No. 12/371,882, dated Dec. 18, 2013, 25 pages.
Non-Final Office Action received for U.S. Appl. No. 12/371,882, dated Feb. 8, 2016, 36 pages.
Non-Final Office Action received for U.S. Appl. No. 12/371,882, dated Oct. 23, 2012, 20 pages.
Notice of Allowance received for U.S. Appl. No. 12/371,882, dated Jul. 20, 2016, 5 pages.
Preliminary Amendment received for U.S. Appl. No. 12/371,882, filed on Feb. 16, 2009, 4 pages.
Preliminary Amendment received for U.S. Appl. No. 12/371,882, filed on Jun. 19, 2009, 3 pages.
Response to Final Office Action filed on Jun. 13, 2013, for U.S. Appl. No. 12/371,882, dated Mar. 13, 2013, 14 pages.
Response to Final Office Action filed on Mar. 14, 2012, for U.S. Appl. No. 12/371,882, dated Nov. 14, 2011, 10 pages.
Response to Final Office Action filed on May 8, 2014, for U.S. Appl. No. 12/371,882, dated Dec. 18, 2013, 12 pages.
Response to Final Office Action filed on Sep. 25, 2015, for U.S. Appl. No. 12/371,882, dated Jun. 25, 2015, 13 pages.
Response to Non-Final Office Action filed on Jan. 22, 2013, for U.S. Appl. No. 12/371,882, dated Oct. 23, 2012, 12 pages.
Response to Non-Final Office Action filed on May 9, 2016, for U.S. Appl. No. 12/371,882, dated Feb. 8, 2016, 14 pages.
Response to Non-Final Office Action filed on Sep. 8, 2011, for U.S. Appl. No. 12/371,882, dated Jun. 8, 2011, 13 pages.
Response to Non-Final Office Action filed on Dec. 2, 2013, for U.S. Appl. No. 12/371,882, dated Aug. 30, 2013, 13 pages.
Response to Non-Final Office Action filed on Jun. 12, 2015, for U.S. Appl. No. 12/371,882, dated Mar. 12, 2015, 18 pages.
Final Office Action received for U.S. Appl. No. 12/398,957, dated Jan. 22, 2018, 20 pages.
Final Office Action received for U.S. Appl. No. 12/398,957, dated Nov. 7, 2012, 22 pages.
Non-Final Office Action received for U.S. Appl. No. 12/398,957, dated Jul. 29, 2011, 22 pages.
Non-Final Office Action received for U.S. Appl. No. 12/398,957, dated Mar. 29, 2012, 22 pages.
Response to Non-Final Office Action filed on Dec. 29, 2011 for U.S. Appl. No. 12/398,957, dated Jul. 29, 2011, 15 pages.
Communication pursuant to Rules 94(3) EPC received for European Patent Application No. 12862340.2, dated Dec. 21, 2016, 4 pages.
Response to Communication Pursuant to Article 94(3) EPC filed on Feb. 6, 2017, for European Patent Application No. 12862340.2, dated Dec. 21, 2016, 6 pages.
Non-Final Office Action received for U.S. Appl. No. 13/011,324, dated Apr. 18, 2013, 17 pages.
Final Office Action received for U.S. Appl. No. 13/019,918, dated Aug. 6, 2015, 36 pages.
Final Office Action received for U.S. Appl. No. 13/019,918, dated Mar. 27, 2014, 27 pages.
Final Office Action received for U.S. Appl. No. 13/019,918, dated Nov. 30, 2016, 32 pages.
Non-Final Office Action received for U.S. Appl. No. 13/019,918, dated Aug. 29, 2013, 25 pages.
Non-Final Office Action received for U.S. Appl. No. 13/019,918, dated Jun. 2, 2016, 31 pages.
Response to Final Office Action filed on Apr. 28, 2017, for U.S. Appl. No. 13/019,918, dated Nov. 30, 2016, 22 pages.
Response to Final Office Action filed on Jun. 26, 2014, for U.S. Appl. No. 13/019,918, dated Mar. 27, 2014, 14 pages.
Response to Non-Final Office Action filed on Apr. 29, 2015, for U.S. Appl. No. 13/019,918, dated Dec. 29, 2014, 26 pages.
Response to Non-Final Office Action filed on Aug. 25, 2016, for U.S. Appl. No. 13/019,918, dated Jun. 2, 2016, 22 pages.
Response to Final Office Action filed on Nov. 6, 2015, for U.S. Appl. No. 13/019,918, dated Aug. 6, 2015, 21 pages.
Response to Non-Final Office Action filed on Nov. 27, 2013, for U.S. Appl. No. 13/019,918, dated Aug. 29, 2013, 9 pages.
International Preliminary Report on Patentability received for PCT Application No. PCT/US2012/021450, dated Jan. 22, 2013, 17 pages.
International Search Report received for PCT Patent Application No. PCT/US2012/021450, dated May 1, 2012, 2 pages.
Written Opinion received for PCT Patent Application No. PCT/US2012/021450, dated May 1, 2012, 6 pages.
Von Ahn et al., “Labeling Images with a Computer Game”, Retrieved from the Internet URL: <http://ael.gatech.edu/cs6452f13/files/2013/08/labeling-images.pdf>, 2004, 8 pages.
Walther et al., “Selective Visual Attention Enables Learning and Recognition of Multiple Objects in Cluttered Scenes”, 2005, 23 pages.
Youtube, “RedLaser 2.0: Realtime iPhone UPC Barcode Scanning”, Retrieved from the Internet URL: <https://www.youtube.com/watch?v=9_hFGsmx_6k>, 2017, 2 pages.
312 Amendment for U.S. Appl. No. 13/194,584, filed Feb. 27, 2018, 9 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 13/194,584, dated May 19, 2014, 3 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 13/194,584, dated Dec. 28, 2017, 3 pages.
Final Office Action received for U.S. Appl. No. 13/194,584, dated Jan. 22, 2016, 26 pages.
Final Office Action received for U.S. Appl. No. 13/194,584, dated Jul. 27, 2017, 34 pages.
Final Office Action received for U.S. Appl. No. 13/194,584, dated Mar. 27, 2014, 21 pages.
Non-Final Office Action received for U.S. Appl. No. 13/194,584, dated Jul. 16, 2015, 26 pages.
Extended European Search Report received for EP Application No. 17171025.4, dated Sep. 4, 2017, 8 pages.
Office Action received for Canadian Patent Application No. 2,826,580, dated Aug. 1, 2014, 2 pages.
Office Action received for Canadian Patent Application No. 2,826,580, dated Mar. 30, 2015, 3 pages.
Office Action received for Canadian Patent Application No. 2,826,580, dated Sep. 23, 2016, 4 pages.
Response to Office Action filed on Sep. 28, 2015 for Canadian Patent Application No. 2,826,580, dated on Mar. 30, 2015, 6 pages.
Response to Office Action filed on Feb. 13, 2018 for Canadian Patent Application No. 2,850,074, dated Oct. 23, 2017, 12 pages.
First Examiner Report received for Australian Patent Application No. 2012212601, dated Oct. 28, 2015, 3 pages.
Notice of Acceptance received for Australian Patent Application No. 2012212601, dated May 5, 2016, 3 pages.
Response to First Examiner Report filed on Mar. 23, 2016, for Australian Patent Application No. 2012212601, dated Oct. 28, 2015, 21 pages.
Office Action received for Chinese Patent Application No. 201280013881.X, dated Jan. 9, 2015, 17 pages.
Examiners Decision of Final Refusal received for Japanese Patent Application No. 2014-215914, dated Jun. 21, 2016, 5 pages (3 pages of English Translation and 2 pages of Official copy).
Office Action received for Japanese Patent Application No. 2014-215914, dated Nov. 4, 2015, 9 pages (5 pages of English Translation and 4 pages of Official Copy).
Response to Office Action filed on Mar. 4, 2016 for Japanese Patent Application No. 2014-215914, dated Nov. 4, 2015, 13 pages.
Office Action received for Chinese Patent Application No. 201510088798.4, dated Mar. 17, 2017, 23 pages (14 pages of English Translation and 9 pages of Official Copy).
Response to Office Action filed on Jul. 28, 2017 for Chinese Patent Application No. 201510088798.4, dated Mar. 17, 2017, 13 pages (Official copy only).
First Examiner Report received for Australian Patent Application No. 2015271902, dated May 22, 2017, 3 pages.
Response to First Examiner Report filed on Aug. 2, 2017, for Australian Patent Application No. 2015271902, dated May 22, 2017, 19 pages.
Response to Final Office Action filed on Apr. 14, 2016, for U.S. Appl. No. 13/194,584, dated Jan. 22, 2016, 10 pages.
Response to Final Office Action filed on Jun. 26, 2014, for U.S. Appl. No. 13/194,584, dated Mar. 27, 2014, 14 pages.
Office Action received for Japanese Patent Application No. 2017-075846, dated Mar. 20, 2018, 16 pages (9 pages of English Translation and 7 pages of Official copy).
Response to Final Office Action filed on Oct. 30, 2017 for U.S. Appl. No. 13/194,584, dated Jul. 27, 2017, 11 pages.
Non-Final Office Action received for U.S. Appl. No. 13/359,630, dated Oct. 29, 2014, 29 pages.
Notice of Non-Compliant Amendment received for U.S. Appl. No. 13/359,630, filed on Apr. 23, 2015, 3 pages.
Response to Final Office Action filed on Mar. 31, 2014, for U.S. Appl. No. 13/359,630, dated Nov. 29, 2013, 17 pages.
Response to Non-Final Office Action filed on Mar. 22, 2016, for U.S. Appl. No. 13/359,630, dated Sep. 22, 2015, 30 pages.
Response to Non-Final Office Action filed on Mar. 30, 2015, for U.S. Appl. No. 13/359,630, dated Oct. 29, 2014, 31 pages.
Response to Non-Final Office Action filed on Oct. 7, 2013, for U.S. Appl. No. 13/359,630, dated Jun. 7, 2013, 15 pages.
Response to Notice of Non-Compliant Amendment filed on Jun. 23, 2015, for U.S. Appl. No. 13/359,630, dated Apr. 23, 2015, 31 pages.
Response to Restriction Requirement filed on May 21, 2013, for U.S. Appl. No. 13/359,630, dated Apr. 29, 2013, 10 pages.
Restriction Requirement received for U.S. Appl. No. 13/359,630, dated Apr. 29, 2013, 6 pages.
Corrected Notice of Allowance received for U.S. Appl. No. 13/624,682, dated Jan. 15, 2016, 5 pages.
Non-Final Office Action received for U.S. Appl. No. 13/624,682, dated Jan. 22, 2015, 8 pages.
Response to Office Action filed on Dec. 4, 2013 for Australian Application No. 2012212601, 10 pages.
Office Action received for Chinese Patent Application No. 201280013881.X, dated Jun. 4, 2014, 15 pages (8 pages of Official copy and 7 pages of English Translation).
Office Action received for Chinese Patent Application No. 201280013881.X, dated Sep. 6, 2015, 18 pages (9 pages of English Translation and 9 pages of Official copy).
Office Action received for Chinese Patent Application No. 201280013881.X, dated Sep. 7, 2016, 22 pages (11 pages of English Translation and 11 pages of Official copy).
Response to Office Action filed on May 25, 2015 for Chinese Patent Application No. 201280013881.X, dated Jan. 9, 2015, 11 pages (8 pages of Official copy and 3 pages of English Pending Claims).
Response to Office Action filed on Oct. 20, 2014 for Chinese Patent Application No. 201280013881.X, dated Jun. 4, 2014, 13 pages (7 pages of Official copy and 6 pages of English Pending Claims).
Appeal Decision received for Japanese Patent Application No. 2013-552538, dated Dec. 1, 2015, 66 pages.
Notice of Appeal filed on Oct. 23, 2014, for Japanese Patent Application No. 2013-552538, 16 pages.
Office Action received for Japanese Patent Application No. 2013-552538, dated Jan. 14, 2014, 6 pages (3 pages of English Translation and 3 pages of Official copy).
Office Action received for Japanese Patent Application No. 2013-552538, dated Jun. 24, 2014, 4 pages (2 pages of English Translation and 2 pages of Official Copy).
Response to Office Action filed on Apr. 8, 2014 for Japanese Patent Application No. 2013-552538, dated Jan. 14, 2014, 11 pages (8 pages of Official copy and 3 pages of English Pending Claims).
U.S. Appl. No. 61/033,940, “Image Recognition as a Service” filed Mar. 5, 2008, 45 pages.
“Draw something”, Retrieved from the Internet URL: <http://omgpop.com/drawsomething>, Accessed May 3, 2013, 1 page.
AgingBooth, “Android Apps on Google Play”, Accessed on May 3, 2013, 3 pages.
Micah, “Politico-Facebook Sentiment Analysis Will Generate “Bogus” Results, Expert Says”, Retrieved from the Internet: <http://techpresident.com/news/21618/politico-facebook-sentiment-analysis-bogus>, Jan. 13, 2012, 3 pages.
Madeleine, “Terminator 3 Rise of Jesus! Deutsch”, Retrieved from the Internet URL: <https://www.youtube.com/watch?v=Oj3o7HFcgzE>, Jun. 12, 2012, 2 pages.
Newby, “Facebook, Politico to Measure Sentiment of GOP Candidates by Collecting Posts”, Accessed on Jun. 28, 2012, 3 pages.
Parker et al., “Algorithms for Image Processing and Computer Vision”, Wiley Computer Publishing, 1997, pp. 23-29.
Patterson, “Amazon Iphone App Takes Snapshots, Looks for a Match”, Available online at URL: <http://tech.yahoo.com/blogs/patterson/30983>, Dec. 3, 2008, 3 pages.
Mello, “Pongr Giving Cell Phone Users Way to Make Money”, Retrieved from the Internet URL: <https://www.pcworld.com/article/240209/pongr_giving_cell_phone_users_way_to_make_money.html>, Sep. 9, 2011, 3 pages.
“Mobitv”, MobiTV, Retrieved from the Internet: <URL: http://www.mobitv.com>, Accessed on Mar. 30, 2015, 1 page.
Natsuha et al., “Follow-the-trial-fitter: Real-time Dressing Without Undressing”, Dec. 1, 2008, 8 pages.
Redlaser, “Redlaser—Impossibly Accurate Barcode Scanning”, Available online at <URL: http://redlaser.com/index.php>, Accessed on Jul. 8, 2011, 2 pages.
Slingbox, “Sling Media, Inc.”, Retrieved from the Internet: <URL: http://www.slingbox.com/>, Accessed on Mar. 30, 2015, 3 pages.
Preliminary Amendment for U.S. Appl. No. 15/337,899, filed on Nov. 11, 2016, 8 pages.
Terada, “New Cell Phone Services Tap Image-recognition Technologies”, Retrieved from the Internet URL: <http://search.japantimes.co.jp/cgi-bin/nb20070628a1.html>, Jun. 26, 2007, 3 pages.
Gonsalves, “Amazon Launches Experimental Mobile Shopping Feature”, Retrieved from the Internet URL: <http://www.informationweek.com/news/internet/retail/showArticle.jhtml?articleID=212201750&subSection=News>, Dec. 3, 2008, 1 page.
Non-Final Office Action received for U.S. Appl. No. 13/194,584, dated Nov. 29, 2016, 28 pages.
Non-Final Office Action received for U.S. Appl. No. 13/194,584, dated Sep. 19, 2013, 24 pages.
Notice of Allowance received for U.S. Appl. No. 13/194,584, dated Jan. 23, 2018, 10 pages.
Response to Non-Final Office Action filed on Dec. 19, 2013, for U.S. Appl. No. 13/194,584, dated Sep. 19, 2013, 13 pages.
Response to Non-Final Office Action filed on May 1, 2017 for U.S. Appl. No. 13/194,584, dated Nov. 29, 2016, 10 pages.
Response to Non-Final Office Action filed on Oct. 16, 2015, for U.S. Appl. No. 13/194,584, dated Jul. 16, 2015, 15 pages.
Response to Rule 312 Communication for U.S. Appl. No. 13/194,584 dated Mar. 14, 2018, 2 pages.
Final Office Action received for U.S. Appl. No. 13/359,630, dated Nov. 29, 2013, 21 pages.
Final Office Action received for U.S. Appl. No. 13/359,630, dated Sep. 22, 2015, 29 pages.
Non-Final Office Action received for U.S. Appl. No. 13/359,630, dated Jun. 7, 2013, 23 pages.
Non-Final Office Action received for U.S. Appl. No. 13/359,630, dated Jun. 13, 2016, 40 pages.
Response to Office Action—First Action Interview filed on Oct. 31, 2017 for U.S. Appl. No. 14/990,291, dated Oct. 18, 2017, 7 pages.
Notice of Allowance received for U.S. Appl. No. 13/624,682, dated Jun. 8, 2015, 5 pages.
Notice of Allowance received for U.S. Appl. No. 13/624,682, dated Oct. 1, 2015, 7 pages.
Response to Non-Final Office Action filed on May 22, 2015, for U.S. Appl. No. 13/624,682, dated Jan. 22, 2015, 8 pages.
Advisory Action received for U.S. Appl. No. 14/067,795, dated Aug. 30, 2016, 5 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 14/067,795, dated Aug. 17, 2016, 3 pages.
Final Office Action received for U.S. Appl. No. 14/067,795, dated Jun. 1, 2016, 13 pages.
Non-Final Office Action received for U.S. Appl. No. 14/067,795, dated Sep. 25, 2015, 10 pages.
Response to Final Office Action filed on Jul. 26, 2016, for U.S. Appl. No. 14/067,795, dated Jun. 1, 2016, 12 pages.
Response to Non-Final Office Action filed on Feb. 25, 2016, for U.S. Appl. No. 14/067,795, dated Sep. 25, 2015, 9 pages.
Final Office Action received for U.S. Appl. No. 14/868,105, dated Apr. 12, 2017, 22 pages.
Non-Final Office Action received for U.S. Appl. No. 14/868,105, dated Dec. 12, 2016, 23 pages.
Non-Final Office Action received for U.S. Appl. No. 14/868,105, dated Nov. 14, 2017, 14 pages.
Preliminary Amendment filed for U.S. Appl. No. 14/868,105, dated Nov. 12, 2015, 8 pages.
Preliminary Amendment filed for U.S. Appl. No. 14/868,105, dated Oct. 20, 2015, 8 pages.
Response to Final Office Action filed on Jul. 12, 2017, for U.S. Appl. No. 14/868,105, dated Apr. 12, 2017, 12 pages.
Response to Non-Final Office Action filed on Feb. 22, 2017, for U.S. Appl. No. 14/868,105, dated Dec. 12, 2016, 15 pages.
Office Action received for European Application No. 10803429.9 dated Feb. 16, 2018, 8 pages.
Office Action received for Korean Application No. 2016-7025254 dated Sep. 5, 2017, 12 pages.
International Preliminary Report on Patentability issued in Application No. PCT/US2010/061628 dated Jun. 27, 2013, 6 pages.
International Search Report and Written Opinion issued in Application No. PCT/US2010/061628 dated Dec. 13, 2016, 6 pages.
Response to Non-Final Office Action filed on Jul. 30, 2012 for U.S. Appl. No. 12/398,957, dated Mar. 29, 2012, 13 pages.
Response to Final Office Action filed on Mar. 7, 2013 for U.S. Appl. No. 12/398,957, dated Nov. 7, 2012, 12 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 12/406,016, dated May 15, 2012, 3 pages.
Final Office Action received for U.S. Appl. No. 12/406,016 , dated Feb. 29, 2012, 25 pages.
Non-Final Office Action received for U.S. Appl. No. 12/406,016, dated Jun. 21, 2011, 20 pages.
Response to Final Office Action filed on May 17, 2012, for U.S. Appl. No. 12/406,016, dated Feb. 29, 2012, 16 pages.
Response to Non-Final Office Action filed on Sep. 21, 2011 for U.S. Appl. No. 12/406,016, dated Jun. 21, 2011, 17 pages.
Communication pursuant to Rules 94(3) EPC received for European Patent Application No. 12741860.6, dated Mar. 19, 2015, 8 pages.
Extended European Search Report received for European Patent Application No. 12741860.6, dated Apr. 30, 2014,7 pages.
Response to Communication Pursuant to Article 94(3) EPC filed on Jul. 23, 2015, for European Patent Application No. 12741860.6, dated Mar. 19, 2015, 8 pages.
Response to Extended European Search report received filed on Nov. 26, 2014, for European Patent Application No. 12741860.6, dated Apr. 30, 2014, 7 pages.
Response to Office Action filed on Jun. 20, 2018 for Japanese Patent Application No. 2017-075846, 17 pages.
Response to Office Action filed on Jul. 31, 2018 for Korean Patent Application No. 2017-7036972, 19 pages.
Office Action received for European Patent Application No. 10803429.9, dated Aug. 30, 2018, 6 pages.
Notice of Allowance received for U.S. Appl. No. 14/868,105, dated Sep. 21, 2018, 9 pages.
Related Publications (1)
Number Date Country
20160364793 A1 Dec 2016 US
Continuations (1)
Number Date Country
Parent 13283416 Oct 2011 US
Child 15250588 US