System and method for visualization of items in an environment using augmented reality

Information

  • Patent Grant
  • Patent Number
    11,113,755
  • Date Filed
    Monday, April 20, 2020
  • Date Issued
    Tuesday, September 7, 2021
Abstract
Systems and methods for visualization of an item in an environment using augmented reality are provided. Environment image data containing an image of an environment is received. A selection of an item for placement into an indicated location of the environment is received. An item image of the selected item is scaled based on dimensions determined from the environment image data for the environment. The scaled item image is augmented into the image of the environment at the indicated location to generate an augmented reality image. The augmented reality image is displayed on a device of a user, whereby the scaled item image in the augmented reality image is selectable to cause display of information. A selection of the scaled item image is received. In response to the selection of the scaled item image, the information is presented on the device of the user.
Description
FIELD

The present disclosure relates generally to image processing, and in a specific example embodiment, to visualization of items in an environment using augmented reality.


BACKGROUND

Conventionally, when an individual shops for an item, the individual must mentally visualize what the item will look like in the environment in which the individual intends to place the item. Often, the individual has difficulty imagining the item with proper dimensions and orientation. In some cases, the individual may purchase the item only to realize that the item does not fit well in the environment. As a result, the individual may end up returning the item or otherwise disposing of the item (e.g., selling, trading, or giving it away).





BRIEF DESCRIPTION OF DRAWINGS

Various ones of the appended drawings merely illustrate example embodiments of the present invention and cannot be considered as limiting its scope.



FIG. 1 is a block diagram illustrating an example embodiment of a network architecture of a system used to enable visualization of items in an environment using augmented reality.



FIG. 2 is a block diagram illustrating an example embodiment of a publication system.



FIG. 3 is a block diagram illustrating an example embodiment of an augmented reality engine.



FIG. 4 is a flow diagram of an example high-level method for visualization of an item in an environment using augmented reality.



FIG. 5 is a flow diagram of an example high-level method for generating an augmented reality image.



FIG. 6A is a screenshot of an example of an environment image.



FIG. 6B is a screenshot of the environment image with an augmented item image.



FIG. 6C illustrates an example screenshot displaying shopping information pertaining to the selected item.



FIG. 6D illustrates an example screenshot displaying a window providing additional information for the selected item.



FIG. 6E illustrates an example screenshot displaying a window having recommendations.



FIG. 7 is a simplified block diagram of a machine in an example form of a computing system within which a set of instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed.





DETAILED DESCRIPTION

The description that follows includes systems, methods, techniques, instruction sequences, and computing machine program products that embody illustrative embodiments of the present invention. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide an understanding of various embodiments of the inventive subject matter. It will be evident, however, to those skilled in the art that embodiments of the inventive subject matter may be practiced without these specific details. In general, well-known instruction instances, protocols, structures, and techniques have not been shown in detail.


As used herein, the term “or” may be construed in either an inclusive or exclusive sense. Additionally, although various example embodiments discussed below focus on a marketplace environment, the embodiments are given merely for clarity in disclosure. Thus, any type of electronic publication, electronic commerce, social networking, or electronic business system and method, including various system architectures, may employ various embodiments of the system and method described herein and may be considered as being within a scope of example embodiments. Each of a variety of example embodiments is discussed in detail below.


Example embodiments described herein provide systems and methods for visualizing an item in an environment using augmented reality. In example embodiments, environment image data containing an image of an environment is received from a client device. A selection of an item that is under consideration for purchase and placement into an indicated location of the environment is received. An item image of the selected item is scaled based on dimensions determined from the environment image data for the environment. The dimensions may be determined based on a calculated distance to a focal point of the indicated location in the environment and on a marker located in the image of the environment. The scaled item image is augmented into the image of the environment at the indicated location to generate an augmented reality image. In some embodiments, the scaled item image may be oriented to match an orientation of the indicated location in the environment.


By using embodiments of the present invention, a user may search for an item and augment an image of an environment with an image of the item. Because the user can create and view an augmented reality image of the environment including the selected item, the user can easily visualize the selected item in the environment without having to, for example, manually cut and paste or scale the image of the item into the image of the environment. Therefore, one or more of the methodologies discussed herein may obviate a need for time-consuming data processing by the user. This may have the technical effect of reducing computing resources used by one or more devices within the system. Examples of such computing resources include, without limitation, processor cycles, network traffic, memory usage, storage space, and power consumption.


With reference to FIG. 1, an example embodiment of a high-level client-server-based network architecture 100 to enable visualization of items in an environment using augmented reality is shown. A networked system 102, in an example form of network-based server-side functionality, is coupled via a communication network 104 (e.g., the Internet, a wireless network, a cellular network, or a Wide Area Network (WAN)) to one or more client devices 110 and 112. FIG. 1 illustrates, for example, a web client 106 operating via a browser (e.g., the INTERNET EXPLORER® browser developed by Microsoft Corporation of Redmond, Wash. State), and a programmatic client 108 executing on respective client devices 110 and 112.


The client devices 110 and 112 may comprise a mobile phone, desktop computer, laptop, or any other communication device that a user may utilize to access the networked system 102. In some embodiments, the client device 110 may comprise or be connectable to an image capture device 113 (e.g., camera, camcorder). In further embodiments, the client device 110 may comprise one or more of a touch screen, accelerometer, microphone, and GPS device. The client devices 110 and 112 may be a device of an individual user interested in visualizing an item within an environment.


An Application Program Interface (API) server 114 and a web server 116 are coupled to, and provide programmatic and web interfaces respectively to, one or more application servers 118. The application servers 118 host a publication system 120 and a payment system 122, each of which may comprise one or more modules, applications, or engines, and each of which may be embodied as hardware, software, firmware, or any combination thereof. The application servers 118 are, in turn, coupled to one or more database servers 124 facilitating access to one or more information storage repositories or database(s) 126. The databases 126 may also store user account information of the networked system 102 in accordance with example embodiments.


In example embodiments, the publication system 120 publishes content on a network (e.g., Internet). As such, the publication system 120 provides a number of publication functions and services to users that access the networked system 102. The publication system 120 is discussed in more detail in connection with FIG. 2. In example embodiments, the publication system 120 is discussed in terms of a marketplace environment. However, it is noted that the publication system 120 may be associated with a non-marketplace environment such as an informational or social networking environment.


The payment system 122 provides a number of payment services and functions to users. The payment system 122 allows users to accumulate value (e.g., in a commercial currency, such as the U.S. dollar, or a proprietary currency, such as “points”) in their accounts, and then later to redeem the accumulated value for products (e.g., goods or services) that are made available via the publication system 120 or elsewhere on the network 104. The payment system 122 also facilitates payments from a payment mechanism (e.g., a bank account, PayPal™, or credit card) for purchases of items via any type and form of a network-based marketplace.


While the publication system 120 and the payment system 122 are shown in FIG. 1 to both form part of the networked system 102, it will be appreciated that, in alternative embodiments, the payment system 122 may form part of a payment service that is separate and distinct from the networked system 102. Additionally, while the example network architecture 100 of FIG. 1 employs a client-server architecture, a skilled artisan will recognize that the present disclosure is not limited to such an architecture. The example network architecture 100 can equally well find application in, for example, a distributed or peer-to-peer architecture system. The publication system 120 and payment system 122 may also be implemented as standalone systems or standalone software programs operating under separate hardware platforms, which do not necessarily have networking capabilities.


Referring now to FIG. 2, an example block diagram illustrating multiple components that, in one embodiment, are provided within the publication system 120 of the networked system 102 is shown. In one embodiment, the publication system 120 is a marketplace system where items (e.g., goods or services) may be offered for sale. In an alternative embodiment, the publication system 120 is a social networking system or informational system. The publication system 120 may be hosted on dedicated or shared server machines (not shown) that are communicatively coupled to enable communications between the server machines. The multiple components themselves are communicatively coupled (e.g., via appropriate interfaces), either directly or indirectly, to each other and to various data sources, to allow information to be passed between the components or to allow the components to share and access common data. Furthermore, the components may access the one or more databases 126 via the one or more database servers 124.


In one embodiment, the publication system 120 provides a number of publishing, listing, and price-setting mechanisms whereby a seller may list (or publish information concerning) goods or services for sale, a buyer can express interest in or indicate a desire to purchase such goods or services, and a price can be set for a transaction pertaining to the goods or services. To this end, the publication system 120 may comprise at least one publication engine 202 and one or more shopping engines 204. In one embodiment, the shopping engines 204 may support auction-format listing and price setting mechanisms (e.g., English, Dutch, Chinese, Double, Reverse auctions, etc.).


A pricing engine 206 supports various price listing formats. One such format is a fixed-price listing format (e.g., the traditional classified advertisement-type listing or a catalog listing). Another format comprises a buyout-type listing. Buyout-type listings (e.g., the Buy-It-Now (BIN) technology developed by eBay Inc., of San Jose, Calif.) may be offered in conjunction with auction-format listings and allow a buyer to purchase goods or services, which are also being offered for sale via an auction, for a fixed price that is typically higher than a starting price of an auction for an item.


A store engine 208 allows a seller to group listings within a “virtual” store, which may be branded and otherwise personalized by and for the seller. Such a virtual store may also offer promotions, incentives, and features that are specific and personalized to the seller. In one example, the seller may offer a plurality of items as Buy-It-Now items in the virtual store, offer a plurality of items for auction, or a combination of both.


Navigation of the publication system 120 may be facilitated by a navigation engine 210. For example, a search module (not shown) of the navigation engine 210 enables, for example, keyword searches of listings or other information published via the publication system 120. In a further example, a browse module (not shown) of the navigation engine 210 allows users to browse various category, catalog, or data structures according to which listings or other information may be classified within the publication system 120. Various other navigation applications within the navigation engine 210 may be provided to supplement the searching and browsing applications. In one embodiment, the navigation engine 210 allows the user to search or browse for items in the publication system 120 (e.g., virtual stores, listings in a fixed-price or auction selling environment, listings in a social network or information system). In alternative embodiments, the navigation engine 210 may navigate (e.g., conduct a search on) a network at large (e.g., network 104). Based on a result of the navigation engine 210, the user may select an item that the user is interested in augmenting into an environment.


In order to make listings or postings of information available via the networked system 102 as visually informative and attractive as possible, the publication system 120 may include an imaging engine 212 that enables users to upload images for inclusion within listings and to incorporate images within viewed listings. In some embodiments, the imaging engine 212 also receives image data from a user and utilizes the image data to generate the augmented reality image. For example, the imaging engine 212 may receive an environment image (e.g., still image, video) of an environment within which the user wants to visualize an item. The imaging engine 212 may work in conjunction with the augmented reality engine 218 to generate the augmented reality image, as will be discussed in more detail below.


A listing engine 214 manages listings on the publication system 120. In example embodiments, the listing engine 214 allows users to author listings of items. The listing may comprise an image of an item along with a description of the item. In one embodiment, the listings pertain to goods or services that a user (e.g., a seller) wishes to transact via the publication system 120. As such, the listing may comprise an image of a good for sale and a description of the item such as, for example, dimensions, color, and an identifier (e.g., a UPC code or ISBN code). In some embodiments, a user may create a listing that is an advertisement or other form of publication to the networked system 102. The listing engine 214 also allows the users to manage such listings by providing various management features (e.g., auto-relisting, inventory level monitors, etc.).


A messaging engine 216 is responsible for the generation and delivery of messages to users of the networked system 102. Such messages include, for example, advising users regarding the status of listings and best offers (e.g., providing an acceptance notice to a buyer who made a best offer to a seller) or providing recommendations. The messaging engine 216 may utilize any one of a number of message delivery networks and platforms to deliver messages to users. For example, the messaging engine 216 may deliver electronic mail (e-mail), an instant message (IM), a Short Message Service (SMS), text, facsimile, or voice (e.g., Voice over IP (VoIP)) messages via wired networks (e.g., the Internet), a Plain Old Telephone Service (POTS) network, or wireless networks (e.g., mobile, cellular, WiFi, WiMAX).


An augmented reality engine 218 manages the generation of an augmented reality based on an environment image and item specified by a user. The augmented reality engine 218 will be discussed in more detail in connection with FIG. 3 below.


Although the various components of the publication system 120 have been defined in terms of a variety of individual modules and engines, a skilled artisan will recognize that many of the items can be combined or organized in other ways. Alternatively, not all components of the publication system 120 of FIG. 2 may be utilized. Furthermore, not all components of the publication system 120 have been included in FIG. 2. In general, components, protocols, structures, and techniques not directly related to functions of exemplary embodiments (e.g., dispute resolution engine, loyalty promotion engine, personalization engines, etc.) have not been shown or discussed in detail. The description given herein simply provides a variety of exemplary embodiments to aid the reader in an understanding of the systems and methods used herein.



FIG. 3 is a block diagram illustrating an example embodiment of the augmented reality engine 218. In example embodiments, the augmented reality engine 218 comprises an access module 300, a distance module 302, a sizing module 304, a scaling module 306, an orientation module 308, an augmenting module 310, a recommendation module 312, a save module 314, and a purchase module 316. In alternative embodiments, functions of one or more of the modules of the augmented reality engine 218 may be combined together, one or more of the modules may be removed from the augmented reality engine 218, or one or more of the modules may be located elsewhere in the networked system 102 (e.g., the imaging engine 212, the shopping engines 204) or at the client device 110.


In example embodiments, the imaging engine 212 may receive environment image data of an environment (e.g., still image, video) from the client device 110. The environment image data is then provided to the augmented reality engine 218 for processing. In some embodiments, the augmented reality engine 218 also receives item data for an item that the user is interested in visualizing in the environment and an indication of a location where the item is to be augmented in the environment. The item data may be provided by the navigation engine 210 based on a user selection of an item found using a search or browsing function of the navigation engine 210.


Alternatively, the item data may be received from the client device 110. For example, the user may capture an image of an item that the user is interested in augmenting into the environment (e.g., take a photo of an item at a store). The user may, in some cases, enter information regarding the item such as dimensions or an identifier (e.g., UPC code). The augmented reality engine 218 receives the item data from the client device 110.


The access module 300 accesses item data for a selected item. In some embodiments, an item to be augmented into the environment may be selected by a user at the client device and the selection is received, for example, by the navigation engine 210. In other embodiments, the selection is received by the access module 300. Based on the selection, the access module 300 may access information corresponding to the selection. If the selection is an item listing for the item, the access module 300 may access the item listing and extract item data (e.g., dimensions, images) from the listing. In other examples, if the selection is a user inputted name or other item identifier of an item (e.g., UPC code), the access module 300 may access a catalog (e.g., stored in the database 126) that stores item data using the item identifier.
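By way of illustration only, the following sketch (not part of the original disclosure) shows one way the item-data lookup described above might be organized, with item data resolved first from an item listing and otherwise from a catalog keyed by an item identifier; all names, fields, and sample values are assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ItemData:
    name: str
    width_in: float   # item width in inches
    height_in: float  # item height in inches
    image_url: str

# Hypothetical stand-ins for item listings and for a catalog stored in the database 126.
LISTINGS = {"listing-42": ItemData("Flat panel television", 48.0, 28.0, "https://example.com/tv.png")}
CATALOG = {"0012345678905": ItemData("Flat panel television", 48.0, 28.0, "https://example.com/tv.png")}

def access_item_data(selection: str) -> Optional[ItemData]:
    """Return item data for a selected listing or a user-entered item identifier."""
    if selection in LISTINGS:          # selection identifies an item listing
        return LISTINGS[selection]
    return CATALOG.get(selection)      # selection is an item identifier (e.g., a UPC code)

print(access_item_data("0012345678905"))
```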


The distance module 302 determines a distance to a focal point in an image of the environment. The focal point may be a user selected area (also referred to as an “indicated location”) where an item image is to be augmented. For example, if the environment is a room, the distance to a wall where the item image is to be augmented may be determined. In one embodiment, the distance module 302 may use a focus capability of the image capture device 113 of, or coupled to, the client device 110 to determine the distance. Alternatively, the distance module 302 may use an echo technique using the client device 110 as a sound generator to determine the distance. For example, the client device 110 may generate a sound in the direction of the wall and an amount of time is registered for an echo to be returned. The distance module 302 may use this amount of time to determine the distance. As such, the distance is from a point of view of the viewer or image capture device (e.g., camera) to the focal point.
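As a simple worked example of the echo technique described above (offered only as an illustration; the constant and helper name are not from the disclosure), the distance to the focal point is half the distance traveled by the sound during the measured round trip:

```python
SPEED_OF_SOUND_M_PER_S = 343.0  # approximate speed of sound in dry air at 20 degrees C

def distance_from_echo(round_trip_seconds: float) -> float:
    """Distance to the focal point is half the round-trip path of the echo."""
    return SPEED_OF_SOUND_M_PER_S * round_trip_seconds / 2.0

# A round trip of about 23.3 ms implies the wall is roughly 4 meters away.
print(f"{distance_from_echo(0.0233):.2f} m")
```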


The sizing module 304 determines sizing for the environment. In example embodiments, the sizing module 304 uses a marker (an object with known standard dimensions) in the environment image data to calculate the sizing. For example, if a door is shown in the environment image data, the sizing module 304 may assume that the door is a standard sized door (e.g., 36″×80″) or that a door knob is located at 36″ from the floor. Using these known standard dimensions, sizing for the environment may be determined. In another example, if the environment is an automobile, the marker may be a wheel well of the automobile. In this example, the user may specify a type of automobile when providing the environment image data.
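For illustration, a minimal sketch (not from the disclosure) of how a pixels-per-inch scale for the environment might be derived from a marker with known standard dimensions, here the standard 36-inch by 80-inch door mentioned above; the marker's bounding box in the image is assumed to have been located already.

```python
DOOR_WIDTH_IN, DOOR_HEIGHT_IN = 36.0, 80.0  # assumed standard door dimensions

def pixels_per_inch(marker_px_width: float, marker_px_height: float) -> float:
    """Average the horizontal and vertical scales implied by the marker."""
    return ((marker_px_width / DOOR_WIDTH_IN) + (marker_px_height / DOOR_HEIGHT_IN)) / 2.0

# A door spanning 180 x 400 pixels in the environment image implies 5 pixels per inch.
print(pixels_per_inch(180, 400))
```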


The scaling module 306 scales an image of the item based on the distance and sizing determined by the distance module 302 and the sizing module 304, respectively. Accordingly, the scaling module 306 may receive (e.g., from the navigation engine 210) or retrieve the item data (e.g., from the database 126) for a selected item. The item data may include an item image, dimensions, or an item identifier. If the item image and dimensions are provided, then the scaling module 306 may use the item image and the dimensions to scale the item image to the environment based on the sizing determined by the sizing module 304. Alternatively, if either the item image or the dimensions are not provided, the item identifier may be used to look up the item in an item catalog, which may contain an image and item information for the item (e.g., dimensions and description). In one embodiment, the scaling module 306 may look up and retrieve the item information from the item catalog.
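Continuing the illustration, one way the scaling step might be realized is to resize the item image to its real-world footprint expressed in environment pixels. Pillow is used here only as a convenient example library; the disclosure does not prescribe any particular imaging toolkit, and the function name is an assumption.

```python
from PIL import Image  # Pillow, used purely for illustration

def scale_item_image(item_image: Image.Image,
                     item_width_in: float,
                     item_height_in: float,
                     env_pixels_per_inch: float) -> Image.Image:
    """Resize the item image so its on-screen size matches the environment scale."""
    target = (round(item_width_in * env_pixels_per_inch),
              round(item_height_in * env_pixels_per_inch))
    return item_image.resize(target)

# A 48" x 28" flat panel television at 5 pixels per inch becomes a 240 x 140 pixel overlay.
tv_image = Image.new("RGBA", (1920, 1080))
print(scale_item_image(tv_image, 48.0, 28.0, 5.0).size)
```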


Once the item image is scaled, the scaled item image may be oriented to the environment by the orientation module 308. For example, if the environment image has a wall at a slight angle and the scaled item image is to be placed on the wall, the orientation module 308 orients the scaled item image to the angle of the wall. It is noted that functionality of any of the distance module 302, sizing module 304, scaling module 306, and orientation module 308 may be combined into one or more modules that can determine proper sizing and orientation for the item image. In some embodiments, these combined modules may comprise or make use of one or more gyroscopes or accelerometers.
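The orientation step could be approximated in many ways; the sketch below (an assumption, not the disclosed method) simply rotates the scaled item image by the apparent angle of the target surface, a simplification of the perspective warp a full implementation might apply.

```python
from PIL import Image

def orient_item_image(scaled_item: Image.Image, wall_angle_degrees: float) -> Image.Image:
    """Rotate the scaled item image to the apparent angle of the wall (simplified)."""
    return scaled_item.rotate(wall_angle_degrees, expand=True)

# A wall leaning a few degrees yields a slightly rotated overlay with a larger bounding box.
print(orient_item_image(Image.new("RGBA", (240, 140)), 3.5).size)
```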


The augmenting module 310 augments the scaled and oriented item image with the environment image to create an augmented reality image. The augmenting module 310 then provides the augmented reality image to the client device 110.
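A hedged sketch of the compositing step follows: the scaled and oriented item image is pasted into the environment image at the indicated location, using the overlay's own alpha channel as the mask. This is one plausible realization, not the patent's implementation, and Pillow is again used only for illustration.

```python
from PIL import Image

def augment(environment: Image.Image, item_overlay: Image.Image,
            location_xy: tuple) -> Image.Image:
    """Composite the item overlay onto the environment image at location_xy."""
    result = environment.convert("RGBA")
    overlay = item_overlay.convert("RGBA")
    result.paste(overlay, location_xy, mask=overlay)  # mask preserves transparency
    return result

environment = Image.new("RGBA", (1280, 720), (200, 200, 200, 255))
television = Image.new("RGBA", (240, 140), (10, 10, 10, 255))
print(augment(environment, television, (520, 290)).size)
```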


The recommendation module 312 optionally provides recommendations for alternative items for the environment. For example, if the scaled and oriented item image appears too large for an indicated area on the environment image (e.g., as determined by the augmenting module 310), the recommendation module 312 may suggest one or more alternative items that are smaller and will fit better in the indicated area. Accordingly, the recommendation module 312 may determine a dimension that is more appropriate for the indicated area and perform a search (e.g., provide instructions to the navigation engine 210 to perform a search) to find one or more alternative items. The recommendation module 312 may then retrieve the item information and provide the alternative items as a suggestion to the user. In one embodiment, the alternative items may be listed on a side of a display that is displaying the augmented reality image or on a pop-up window.
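Purely as an illustration of the recommendation idea above (catalog entries, field names, and the ranking choice are assumptions), alternative items can be filtered to those whose dimensions fit the indicated area and then ordered so the largest fitting items are suggested first:

```python
from dataclasses import dataclass

@dataclass
class CatalogItem:
    name: str
    width_in: float
    height_in: float

def recommend_alternatives(catalog, max_width_in, max_height_in, limit=3):
    """Return up to `limit` items that fit the indicated area, largest footprint first."""
    fitting = [item for item in catalog
               if item.width_in <= max_width_in and item.height_in <= max_height_in]
    fitting.sort(key=lambda item: item.width_in * item.height_in, reverse=True)
    return fitting[:limit]

catalog = [CatalogItem("65-inch television", 57.0, 33.0),
           CatalogItem("50-inch television", 44.0, 25.5),
           CatalogItem("42-inch television", 37.0, 22.0)]
print([item.name for item in recommend_alternatives(catalog, 48.0, 30.0)])
```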


The save module 314 saves the environment image for later use. In one embodiment, the environment image may be stored to the database 126 of the networked system 102. Alternatively, the environment image may be stored to the client device 110. For example, the user may record the environment image for a room and save the environment image. At a later time, the user may obtain an item image for an item that the user is interested in augmenting into the saved environment image. The save module 314 may access and retrieve the saved environment image.


The purchase module 316 allows the user to purchase the item that is augmented into the environment or an alternative item recommended by the recommendation module 312. In one embodiment, the purchase module 316 provides a selection on or near the augmented reality image that, when selected, takes the user to, for example, a purchase page for the item, a store front for a store that sells the item, or a search page with search results for availability of the item for purchase. In another embodiment, an activation of the selection may initiate an automatic purchase of the item. Once the selection is activated, the purchase module 316 performs the corresponding actions to facilitate the purchase (e.g., sends a search for the item to the navigation engine 210, provides one or more listings using the shopping engine 204, or provides a webpage associated with the store engine 208).



FIG. 4 is a flow diagram of an example high-level method 400 for visualization of an item in an environment using augmented reality. In operation 402, environment image data is received. In example embodiments, the imaging engine 212 may receive the environment image data from a client device 110. The environment image data may comprise an image of an environment into which the user wants to augment an item image.


In operation 404, a selection of an item to be augmented into the environment is received. In some embodiments, the navigation engine 210 receives a selection of the item from the client device. In other embodiments, the imaging engine 212 receives an image of an item that the user is interested in augmenting into the environment.


Based on the received selection of the item, item data is accessed in operation 406. The access module 300 accesses item data for the selected item. The item data may be extracted from an item listing for the item, retrieved from an item catalog, or retrieved from a website of a manufacturer or reseller (e.g., using an item identifier of the item).


In operation 408, augmentation processing is performed. Augmentation processing takes the environment image data and the selected item and augments or merges an item image for the item into an environment image. The operations of the augmentation processing will be discussed in detail with respect to FIG. 5.


The result of the augmentation is provided in operation 410. The result may comprise a video of the environment with the selected item augmented into the environment (referred to as “the augmented reality image”). In example embodiments, the augmenting module 310 provides the augmented reality image to the client device 110 of the user that provided the environment image, the item selection, or both.


In operation 412, a determination is made as to whether a modification is received. In some embodiments, the modification may be caused by the movement of the image capture device 113. For example, if the image capture device 113 is a video camera, then the modification is the movement within the environment as captured by the video camera. In another embodiment, the user may select an alternative item based on a recommendation provided by the recommendation module 312. Based on the modification, the method 400 may return to either operation 406 to access item data for the new item or to operation 408 to perform augmentation processing based on, for example, the movement within the environment.
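The control flow of method 400 can be summarized as a loop in which the modification check of operation 412 either triggers a fresh item-data access (when an alternative item is selected) or simply re-runs augmentation with the latest camera frame. The sketch below uses trivial stand-ins for the engines and modules discussed above and is illustrative only.

```python
def access_item_data(selection):                 # stand-in for operation 406
    return {"name": selection}

def perform_augmentation(frame, item):           # stand-in for operation 408 (FIG. 5)
    return f"{frame} + {item['name']}"

def run_method_400(frames, selections):
    """frames: environment frames over time; selections: new item chosen per frame, or None."""
    item = access_item_data(selections[0])       # initial selection (operations 404/406)
    for frame, selection in zip(frames, selections):
        if selection is not None:                # operation 412: a new item was selected
            item = access_item_data(selection)
        print(perform_augmentation(frame, item)) # operations 408 and 410

run_method_400(["frame-1", "frame-2", "frame-3"],
               ["50-inch television", None, "42-inch television"])
```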



FIG. 5 is a flow diagram of an example high-level method (operation 408) for generating the augmented reality image. In operation 502, a distance is determined by the distance module 302. The distance module 302 determines a distance to a focal point in the environment. The focal point may be a user selected area where an item image is to be augmented. In one embodiment, the distance module 302 may use capabilities (e.g., focus, echo based on sound) of the image capture device 113 of, or coupled to, the client device 110 to determine the distance.


In operation 504, sizing for the environment is determined by the sizing module 304. In example embodiments, the sizing module 304 uses a marker in the environment image data to calculate the sizing. Using known standard dimensions of the marker, sizing for the environment may be determined by the sizing module 304.


The item image is scaled in operation 506. The scaling module 306 scales an image of the item based on the distance and sizing determined by the distance module 302 and the sizing module 304, respectively. Accordingly, the scaling module 306 may receive or retrieve the item data including an item image, dimensions, or an item identifier. The retrieved item data is then used in association with the determined distance and sizing data to scale the item image.


Once the item image is scaled, the scaled item image may be oriented to the environment, in operation 508, by the orientation module 308. For example, if the environment image has a wall at a slight angle and the scaled item image is to be placed on the wall, the orientation module 308 orients the scaled item image to the angle of the wall.


In operation 510, the scaled and oriented item image is merged into the environment image. The augmenting module 310 augments the scaled and oriented item image with the environment image to create an augmented reality image. It is noted that operations of FIG. 5 may be combined into fewer operations. Alternatively, some of the operations of FIG. 5 may be optional.
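Tying the operations of FIG. 5 together, the sketch below reduces the pipeline to the arithmetic that positions and sizes the overlay rectangle: the marker yields the environment scale (operation 504), the item dimensions are scaled into pixels (operation 506), and the result is centered on the focal point for merging (operation 510). Distance determination and orientation are omitted for brevity, and all numbers and names are illustrative assumptions.

```python
def generate_overlay_rect(marker_px, marker_in, item_in, focal_point_px):
    """Return (x, y, width, height) of the scaled item overlay centered on the focal point."""
    env_pixels_per_inch = marker_px[0] / marker_in[0]          # operation 504
    width = item_in[0] * env_pixels_per_inch                   # operation 506
    height = item_in[1] * env_pixels_per_inch
    x = focal_point_px[0] - width / 2                          # center on the indicated location
    y = focal_point_px[1] - height / 2
    return (round(x), round(y), round(width), round(height))   # operation 510 merges at this box

# A 36-inch-wide door spanning 180 pixels gives 5 pixels per inch; a 48" x 28" item
# becomes a 240 x 140 pixel overlay centered at the focal point (640, 360).
print(generate_overlay_rect((180, 400), (36.0, 80.0), (48.0, 28.0), (640, 360)))
```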



FIG. 6A is a screenshot of an example of an environment image 600. The environment image 600 may be captured by the image capture device 113 or retrieved from a storage location (e.g., database 126). In the present example, the environment image 600 is an image of a room in which a user wants to augment an item. In the present case, the environment image 600 is taken from a location where the user may want to view the item. For example, if the item is a flat panel television, the environment image 600 may be taken from a location where the user will position a sofa to view the flat panel television.



FIG. 6B is a screenshot of the environment image 600 with an augmented item image. In the present example, an image of a flat panel television 602 selected by the user is positioned in a location indicated by the user in the environment image 600. In one embodiment, additional information may be obtained by activating a selection on a display displaying the screenshot. For example, the user may select the image of the flat panel television 602 on the screenshot to open up a new window (e.g., a new window over a portion of the screenshot) that provides purchase information (e.g., where to buy, links to online stores, a listing for the item, prices), item information (e.g., dimensions, description), alternative recommendations (e.g., smaller or larger items, comparable items, less expensive items, newer version of the item), or any combination of these.



FIG. 6C illustrates an example screenshot displaying, in a new window, shopping information pertaining to the selected item. In the present example, a window 604 provides shopping information including the lowest, highest, and average prices, along with links to various marketplaces where the item may be purchased. The window 604 is provided when the user makes a selection of the image of the flat panel television 602 or performs some other action to indicate a desire to receive additional information.



FIG. 6D illustrates an example screenshot displaying the window 604 providing additional information for the selected item. In the present example, the window 604 provides dimensions, weight, item identifiers, and product description of the selected item. Any information pertaining to the selected item may be provided in the window 604.



FIG. 6E illustrates an example screenshot displaying the window 604 having recommendations. The recommendations may be provided by the recommendation module 312 and include a name of each recommended item and an image of the recommended item. Other information, such as price, ratings, or dimensions, may also be provided in the window 604. The recommendations may be, for example, items that may fit in the user designated location better, items less expensive than the selected item, items that are a new model of the selected item, or items that rank higher based on other users of the system.


While the various examples of FIGS. 6C-6E provide the window 604 for displaying additional information, alternative embodiments may use other display mechanisms to provide the additional information. For example, the additional information may be displayed on a side of a display showing the environment image 600.


Modules, Components, and Logic


Additionally, certain embodiments described herein may be implemented as logic or a number of modules, engines, components, or mechanisms. A module, engine, logic, component, or mechanism (collectively referred to as a “module”) may be a tangible unit capable of performing certain operations and configured or arranged in a certain manner. In certain example embodiments, one or more computer systems (e.g., a standalone, client, or server computer system) or one or more components of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) or firmware (note that software and firmware can generally be used interchangeably herein as is known by a skilled artisan) as a module that operates to perform certain operations described herein.


In various embodiments, a module may be implemented mechanically or electronically. For example, a module may comprise dedicated circuitry or logic that is permanently configured (e.g., within a special-purpose processor, application specific integrated circuit (ASIC), or array) to perform certain operations. A module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software or firmware to perform certain operations. It will be appreciated that a decision to implement a module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by, for example, cost, time, energy-usage, and package size considerations.


Accordingly, the term “module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. Considering embodiments in which modules or components are temporarily configured (e.g., programmed), each of the modules or components need not be configured or instantiated at any one instance in time. For example, where the modules or components comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different modules at different times. Software may accordingly configure the processor to constitute a particular module at one instance of time and to constitute a different module at a different instance of time.


Modules can provide information to, and receive information from, other modules. Accordingly, the described modules may be regarded as being communicatively coupled. Where multiples of such modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the modules. In embodiments in which multiple modules are configured or instantiated at different times, communications between such modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple modules have access. For example, one module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further module may then, at a later time, access the memory device to retrieve and process the stored output. Modules may also initiate communications with input or output devices and can operate on a resource (e.g., a collection of information).


Example Machine Architecture and Machine-Readable Medium


With reference to FIG. 7, an example embodiment extends to a machine in the example form of a computer system 700 within which instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed. In alternative example embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, a switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.


The example computer system 700 may include a processor 702 (e.g., a central processing unit (CPU), a graphics processing unit (GPU) or both), a main memory 704 and a static memory 706, which communicate with each other via a bus 708. The computer system 700 may further include a video display unit 710 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). In example embodiments, the computer system 700 also includes one or more of an alpha-numeric input device 712 (e.g., a keyboard), a user interface (UI) navigation device or cursor control device 714 (e.g., a mouse), a disk drive unit 716, a signal generation device 718 (e.g., a speaker), and a network interface device 720.


Machine-Readable Storage Medium


The disk drive unit 716 includes a machine-readable storage medium 722 on which is stored one or more sets of instructions 724 and data structures (e.g., software instructions) embodying or used by any one or more of the methodologies or functions described herein. The instructions 724 may also reside, completely or at least partially, within the main memory 704, the static memory 706, or within the processor 702 during execution thereof by the computer system 700, with the main memory 704 and the processor 702 also constituting machine-readable media.


While the machine-readable storage medium 722 is shown in an example embodiment to be a single medium, the term “machine-readable storage medium” may include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) that store the one or more instructions. The term “machine-readable medium” shall also be taken to include any tangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of embodiments of the present invention, or that is capable of storing, encoding, or carrying data structures used by or associated with such instructions. The term “machine-readable storage medium” shall accordingly be taken to include, but not be limited to, solid-state memories and optical and magnetic media. Specific examples of machine-readable storage media include non-volatile memory, including by way of example semiconductor memory devices (e.g., Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), and flash memory devices); magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.


Transmission Medium


The instructions 724 may further be transmitted or received over a communications network 726 using a transmission medium via the network interface device 720 and utilizing any one of a number of well-known transfer protocols (e.g., HTTP). Examples of communication networks include a local area network (LAN), a wide area network (WAN), the Internet, mobile telephone networks, POTS networks, and wireless data networks (e.g., WiFi and WiMax networks). The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.


Although an overview of the inventive subject matter has been described with reference to specific example embodiments, various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of embodiments of the present invention. Such embodiments of the inventive subject matter may be referred to herein, individually or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is, in fact, disclosed.


The embodiments illustrated herein are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed. Other embodiments may be used and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. The Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.


Moreover, plural instances may be provided for resources, operations, or structures described herein as a single instance. Additionally, boundaries between various resources, operations, modules, engines, and data stores are somewhat arbitrary, and particular operations are illustrated in a context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within a scope of various embodiments of the present invention. In general, structures and functionality presented as separate resources in the example configurations may be implemented as a combined structure or resource. Similarly, structures and functionality presented as a single resource may be implemented as separate resources. These and other variations, modifications, additions, and improvements fall within a scope of embodiments of the present invention as represented by the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.

Claims
  • 1. A method implemented by a client device, the method comprising: receiving a selection of an item for placement into an image of an environment; receiving environment image data containing the image of the environment; scaling an item image for the selected item based on dimensions of the environment; generating an augmented reality image that depicts the scaled item image disposed in the environment; receiving a selection of the scaled item image in the augmented reality image; and displaying item information in response to receiving the selection of the scaled item image.
  • 2. The method of claim 1, wherein the dimensions of the environment are determined based on the environment image data.
  • 3. The method of claim 2, further comprising determining the dimensions of the environment using a focus capability of an image capture device of the client device.
  • 4. The method of claim 1, wherein the image of the environment comprises one of a plurality of frames of a video, the method further comprising performing the scaling and the generating for each of the plurality of frames of the video.
  • 5. The method of claim 1, wherein generating the augmented reality image comprises receiving an indication of a placement location for the item image relative to the environment and positioning the scaled item image at the placement location.
  • 6. The method of claim 1, wherein displaying the item information comprises displaying at least one dimension associated with the item.
  • 7. The method of claim 1, wherein displaying the item information comprises displaying a weight associated with the item.
  • 8. The method of claim 1, wherein displaying the item information comprises displaying a product description associated with the item.
  • 9. The method of claim 1, wherein displaying the item information comprises displaying a link that is selectable to initiate purchase of the item.
  • 10. The method of claim 1, wherein displaying the item information comprises displaying a recommendation for an alternative item.
  • 11. The method of claim 1, wherein displaying the item information comprises displaying an item identifier associated with the item.
  • 12. A method implemented by a client device, the method comprising: receiving a selection of an item for placement into an image of an environment; receiving environment image data containing the image of the environment; scaling an item image for the selected item based on dimensions of the environment; generating an augmented reality image that depicts the scaled item image disposed in the environment; receiving a selection of the scaled item image in the augmented reality image; and displaying at least one dimension associated with the item in the augmented reality image responsive to receiving the selection of the scaled item image.
  • 13. The method of claim 12, wherein the dimensions of the environment are determined based on the environment image data.
  • 14. The method of claim 13, further comprising determining the dimensions of the environment using a focus capability of an image capture device of the client device.
  • 15. The method of claim 12, wherein the image of the environment comprises one of a plurality of frames of a video, the method further comprising performing the scaling and the generating for each of the plurality of frames of the video.
  • 16. The method of claim 12, wherein generating the augmented reality image comprises receiving an indication of a placement location for the item image relative to the environment and positioning the scaled item image at the placement location.
  • 17. A system comprising: at least one processor; and a storage device comprising instructions that are executable by the at least one processor to perform operations comprising: receiving a selection of an item for placement into an image of an environment; receiving environment image data containing the image of the environment; scaling an item image for the selected item based on dimensions of the environment; generating an augmented reality image that depicts the scaled item image disposed in the environment; receiving a selection of the scaled item image in the augmented reality image; and displaying at least one dimension associated with the item in the augmented reality image responsive to receiving the selection of the scaled item image.
  • 18. The system of claim 17, wherein the dimensions of the environment are determined based on the environment image data, the operations further comprising determining the dimensions of the environment using a focus capability of an image capture device of the client device.
  • 19. The system of claim 17, wherein the image of the environment comprises one of a plurality of frames of a video, the operations further comprising performing the scaling and the generating for each of the plurality of frames of the video.
  • 20. The system of claim 17, wherein generating the augmented reality image comprises receiving an indication of a placement location for the item image relative to the environment and positioning the scaled item image at the placement location.
PRIORITY

This application is a continuation of and claims the benefit of priority to U.S. patent application Ser. No. 16/162,153, filed on Oct. 16, 2018; Ser. No. 16/162,153 is a continuation of and claims the benefit of priority to U.S. patent application Ser. No. 15/250,588, filed on Aug. 29, 2016; and Ser. No. 15/250,588 is a continuation of and claims the benefit of priority to U.S. patent application Ser. No. 13/283,416, filed on Oct. 27, 2011; all of which are hereby incorporated by reference herein in their entirety.

20100241650 Chittar Sep 2010 A1
20100257024 Holmes et al. Oct 2010 A1
20100260426 Huang et al. Oct 2010 A1
20100281417 Yolleck et al. Nov 2010 A1
20100283630 Alonso et al. Nov 2010 A1
20100287511 Meier et al. Nov 2010 A1
20100289817 Meier et al. Nov 2010 A1
20100312596 Saffari et al. Dec 2010 A1
20100316288 Ip et al. Dec 2010 A1
20100332283 Ng et al. Dec 2010 A1
20100332304 Higgins et al. Dec 2010 A1
20110004517 Soto et al. Jan 2011 A1
20110016487 Chalozin et al. Jan 2011 A1
20110029334 Reber Feb 2011 A1
20110053642 Lee Mar 2011 A1
20110055054 Glasson Mar 2011 A1
20110061011 Hoguet Mar 2011 A1
20110065496 Gagner et al. Mar 2011 A1
20110078305 Varela Mar 2011 A1
20110084983 Demaine Apr 2011 A1
20110090343 Alt et al. Apr 2011 A1
20110128300 Gay et al. Jun 2011 A1
20110143731 Ramer et al. Jun 2011 A1
20110148924 Tapley et al. Jun 2011 A1
20110153614 Solomon Jun 2011 A1
20110173191 Tsaparas et al. Jul 2011 A1
20110184780 Alderson et al. Jul 2011 A1
20110187306 Aarestrup et al. Aug 2011 A1
20110215138 Crum Sep 2011 A1
20110246064 Nicholson Oct 2011 A1
20120072233 Hanlon et al. Mar 2012 A1
20120084812 Thompson et al. Apr 2012 A1
20120099800 Llano et al. Apr 2012 A1
20120105475 Tseng et al. May 2012 A1
20120113141 Zimmerman et al. May 2012 A1
20120120113 Hueso May 2012 A1
20120165046 Rhoads et al. Jun 2012 A1
20120179716 Takami Jul 2012 A1
20120185492 Israel et al. Jul 2012 A1
20120192235 Tapley et al. Jul 2012 A1
20120195464 Ahn Aug 2012 A1
20120197764 Nuzzi et al. Aug 2012 A1
20120215612 Ramer et al. Aug 2012 A1
20120230581 Miyashita et al. Sep 2012 A1
20120284105 Li Nov 2012 A1
20120293548 Perez et al. Nov 2012 A1
20120308077 Tseng et al. Dec 2012 A1
20120327115 Chhetri et al. Dec 2012 A1
20130019177 Schlossberg et al. Jan 2013 A1
20130050218 Beaver et al. Feb 2013 A1
20130073365 Mccarthy Mar 2013 A1
20130086029 Hebert Apr 2013 A1
20130103306 Uetake Apr 2013 A1
20130106910 Sacco May 2013 A1
20130116922 Cai et al. May 2013 A1
20130144701 Kulasooriya et al. Jun 2013 A1
20130170697 Zises Jul 2013 A1
20130198002 Nuzzi et al. Aug 2013 A1
20130325839 Goddard et al. Dec 2013 A1
20140007012 Govande et al. Jan 2014 A1
20140063054 Osterhout et al. Mar 2014 A1
20140085333 Pugazhendhi et al. Mar 2014 A1
20140372449 Chittar Dec 2014 A1
20150052171 Cheung Feb 2015 A1
20160019723 Tapley et al. Jan 2016 A1
20160034944 Raab et al. Feb 2016 A1
20160117863 Pugazhendhi et al. Apr 2016 A1
20160171305 Zises Jun 2016 A1
20160364793 Sacco Dec 2016 A1
20170046593 Tapley et al. Feb 2017 A1
20170091975 Zises Mar 2017 A1
20180189863 Tapley et al. Jul 2018 A1
20180336734 Tapley et al. Nov 2018 A1
20190050939 Sacco Feb 2019 A1
20200193668 Zises Jun 2020 A1
Foreign Referenced Citations (75)
Number Date Country
2012212601 May 2016 AU
2015264850 Apr 2017 AU
1255989 Jun 2000 CN
1750001 Mar 2006 CN
1802586 Jul 2006 CN
101515195 Aug 2009 CN
101515198 Aug 2009 CN
101520904 Sep 2009 CN
101541012 Sep 2009 CN
101764973 Jun 2010 CN
101772779 Jul 2010 CN
101893935 Nov 2010 CN
102084391 Jun 2011 CN
102156810 Aug 2011 CN
102194007 Sep 2011 CN
102667913 Sep 2012 CN
103443817 Dec 2013 CN
104081379 Oct 2014 CN
104656901 May 2015 CN
105787764 Jul 2016 CN
1365358 Nov 2003 EP
1710717 Oct 2006 EP
2015244 Jan 2009 EP
2034433 Mar 2009 EP
2418275 Mar 2006 GB
11191118 Jul 1999 JP
2001283079 Oct 2001 JP
2001309323 Nov 2001 JP
2001344479 Dec 2001 JP
2002099826 Apr 2002 JP
2002183542 Jun 2002 JP
2002-207781 Jul 2002 JP
2002318926 Oct 2002 JP
2003022395 Jan 2003 JP
2004318359 Nov 2004 JP
2004326229 Nov 2004 JP
2005337966 Dec 2005 JP
2006209658 Aug 2006 JP
2006-244329 Sep 2006 JP
2006351024 Dec 2006 JP
2007172605 Jul 2007 JP
2008191751 Aug 2008 JP
2009545019 Dec 2009 JP
2010039908 Feb 2010 JP
2010141371 Jun 2010 JP
2010524110 Jul 2010 JP
2011209934 Oct 2011 JP
2012529685 Nov 2012 JP
1020060056369 May 2006 KR
1020060126717 Dec 2006 KR
1020070014532 Feb 2007 KR
100805607 Feb 2008 KR
100856585 Sep 2008 KR
1020090056792 Jun 2009 KR
1020090070900 Jul 2009 KR
1020100067921 Jun 2010 KR
1020100071559 Jun 2010 KR
1020110082690 Jul 2011 KR
1999044153 Sep 1999 WO
2005072157 Aug 2005 WO
2005072157 Feb 2007 WO
2008003966 Jan 2008 WO
2008015571 Feb 2008 WO
2008051538 May 2008 WO
2009111047 Sep 2009 WO
2009111047 Dec 2009 WO
2010084585 Jul 2010 WO
2010141939 Dec 2010 WO
2011070871 Jun 2011 WO
2011087797 Jul 2011 WO
2011087797 Oct 2011 WO
2012106096 Aug 2012 WO
2013063299 May 2013 WO
2013101903 Jul 2013 WO
2013101903 Jun 2014 WO
Non-Patent Literature Citations (392)
Definition of Homogeneous Coordinates, Retrieved from the Internet URL: <https://web.archive.org/web/20110305185824/http://en.wikipedia.org/wiki/Homogeneous_coordinates>, Wikipedia on Mar. 5, 2011 via Internet Archive WayBackMachine, Nov. 17, 2014, 10 pages.
Draw Something, Retrieved from the Internet URL: <http://omgpop.com/drawsomething>, Accessed on May 3, 2013, 1 page.
MLB at Bat 11, Retrieved from the Internet: <URL: http://texas.rangers.mlb.com/mobile/atbat/?c id=tex>, Accessed on Dec. 22, 2011 and Apr. 19, 2018, 3 pages.
SnapTell: Technology, Retrieved from the Internet: <URL: http://web.archive.org/web/20071117023817/http://www.snaptell.com/technology/index.htm>, Nov. 17, 2007, 1 page.
The ESP Game, Retrieved from the Internet: <URL: http://www.espgame.org/instructions.html>, Accessed on Nov. 13, 2007, 2 pages.
Appeal Decision received for Korean Patent Application No. 10-2012-7019181, mailed on Feb. 1, 2016, 16 pages.
Notice of Appeal for Korean Patent Application No. 10-2012-7019181, filed on Feb. 4, 2015, 24 pages (including English translation of claims).
Office Action received for Korean Patent Application No. 10-2012-7019181, dated Feb. 23, 2016, 12 pages (5 pages of English translation and 7 pages of official copy).
Office Action received for Korean Patent Application No. 10-2012-7019181, dated Jun. 26, 2014, 5 pages (with English translation of claims).
Notice of Final Rejection received for Korean Patent Application No. 10-2012-7019181, dated Nov. 3, 2014, 7 pages (with English translation of claims).
Office Action received for Korean Patent Application No. 10-2012-7019181, dated Nov. 18, 2013, 11 pages (with English translation of claims).
Response to Office Action filed on Feb. 18, 2014 for Korean Patent Application No. 10-2012-7019181, dated Nov. 18, 2013, 26 pages (with English translation of claims).
Response to Office Action filed on May 23, 2016 for Korean Patent Application No. 10-2012-7019181, dated Feb. 23, 2016, 26 pages (21 pages of official copy and 5 pages of English pending claims).
Final Office Action received for Korean Patent Application No. 10-2013-7023099, dated Jun. 10, 2014, 5 pages (2 pages of English translation and 3 pages of Official Copy).
Office Action received for Korean Patent Application No. 10-2013-7023099, dated Jan. 10, 2014, 8 pages (3 pages of English Translation and 5 pages of official copy).
Notice of Allowance Received for Korean Patent Application No. 10-2014-7004160, dated Jun. 15, 2016, 8 pages (2 pages of official copy and 6 pages of English translation).
Office Action received for Korean Patent Application No. 10-2014-7004160, dated Mar. 2, 2016, 7 pages (2 pages of English translation and 5 pages of official copy).
Response to Office Action filed on Jun. 2, 2016 for Korean Patent Application No. 10-2014-7004160, dated Mar. 2, 2016, 39 pages (34 pages of official copy and 15 pages of English pending claims).
Final Office Action received for Korean Patent Application No. 10-2014-7009560, dated May 26, 2015, 6 pages.
Final Office Action received for Korean Patent Application No. 10-2014-7009560, dated Sep. 30, 2015, 5 pages (2 pages of English translation and 3 pages of official copy).
Response to Office Action filed on Aug. 26, 2015 for Korean Patent Application No. 10-2014-7009560, dated May 26, 2015, 14 pages.
Response to Office Action filed on Jan. 7, 2015 for Korean Patent Application No. 10-2014-7009560, dated Oct. 8, 2014, 14 pages.
Final Office Action received for Korean Patent Application No. 10-2014-7014116, dated Jan. 29, 2016, 8 pages.
U.S. Appl. No. 61/447,962, “Shopping Experience Based on Concurrently Viewed Content”, filed Mar. 1, 2011, 53 pages.
Office Action received for Korean Patent Application No. 10-2014-7014116 dated Jun. 26, 2015, 13 pages (with English translation of claims).
Request for Re-Examination filed on May 2, 2016, for Korean Patent Application No. 10-2014-7014116, 22 pages (with English translation of claims).
Response to Office Action filed on Aug. 26, 2015, for Korean Patent Application No. 10-2014-7014116, dated Jun. 26, 2015, 23 pages (with English translation of claims).
Office Action received for Korean Patent Application No. 10-2015-7037233, dated Mar. 30, 2016, 6 pages (2 pages of English Translation and 4 pages of official copy).
Response to Office Action filed on Jun. 30, 2016 for Korean Patent Application No. 10-2015-7037233, dated Mar. 30, 2016, 34 pages.
Appeal Brief filed on Jan. 19, 2018, for Korean Patent Application No. 10-2016-7024912, 23 pages.
Final Office Action received for Korean Patent Application No. 10-2016-7024912, dated Jun. 16, 2017, 7 pages (3 pages of English translation and 4 pages of official copy).
Final Office Action received for Korean Patent Application No. 10-2016-7024912, dated Oct. 25, 2017, 7 pages.
Notice of Appeal filed on Dec. 22, 2017, for Korean Patent Application No. 10-2016-7024912, 2 pages.
Office Action received for Korean Patent Application No. 10-2016-7024912, dated Dec. 7, 2016, 11 pages.
Response to Final Office Action filed on Sep. 18, 2017, for Korean Patent Application No. 10-2016-7024912, dated Jun. 16, 2017, 23 pages.
Response to Office Action filed on Feb. 7, 2017, for Korean Patent Application No. 10-2016-7024912, dated Dec. 7, 2016, 15 pages.
Notice of Allowance Received for Korean Patent Application No. 10-2016-7025254 dated Mar. 9, 2018, 5 pages (2 pages of English translation and 3 pages of official copy).
Office Action received for Korean Patent Application No. 10-2016-7025254, dated May 2, 2017, 10 pages.
Office Action received for Korean Patent Application No. 10-2016-7025254, dated Oct. 13, 2016, 10 pages.
Office Action received for Korean Patent Application No. 10-2016-7025254, dated Sep. 5, 2017, 12 pages (5 pages of English translation and 7 pages of official copy).
Response to Office Action filed on Dec. 27, 2016 for Korean Patent Application No. 10-2016-7025254, dated Oct. 13, 2016, 25 pages.
Response to Office Action filed on Nov. 3, 2017, for Korean Patent Application No. 10-2016-7025254, dated Sep. 5, 2017, 22 pages (17 pages of official copy and 5 pages of English claims).
Final Office Action received for Korean Patent Application No. 10-2017-7036972, dated Mar. 18, 2019, 7 pages (4 pages of official copy and 3 pages of English translation).
Final Office Action received for Korean Patent Application No. 10-2017-7036972, dated Dec. 26, 2018, 6 pages (2 pages of English translation and 4 pages of official copy).
Office Action received for Korean Patent Application No. 10-2017-7036972, dated Jan. 30, 2018, 8 pages.
Response to Office Action filed on Feb. 22, 2019, for Korean Patent Application No. 10-2017-7036972, dated Dec. 26, 2018, 13 pages (3 pages of English translation and 10 pages of official copy).
Response to Office Action filed on Jul. 31, 2018, for Korean Patent Application No. 10-2017-7036972, dated Jan. 30, 2018, 19 pages.
Voluntary Amendment for Japanese Patent Application No. 2018-166862, filed on Oct. 5, 2018, 5 pages.
U.S. Appl. No. 61/033,940, “Image Recognition as a Service” filed Mar. 5, 2008, 45 pages.
Office Action received for Japanese Patent Application No. 2018-166862, dated Oct. 4, 2019, 6 pages.
Office Action received for Korean Patent Application No. 10-2020-7025366, dated Sep. 16, 2020, 11 pages (6 pages of official copy and 5 pages of English translation).
International Search Report received for PCT Application No. PCT/US2010/061628, dated Aug. 12, 2011, 4 pages.
Non-Final Office Action received for U.S. Appl. No. 13/359,630, dated Jun. 13, 2016, 40 pages.
Non-Final Office Action received for U.S. Appl. No. 13/359,630, dated Oct. 29, 2014, 29 pages.
Notice of Non-Compliant Amendment received for U.S. Appl. No. 13/359,630, dated Apr. 23, 2015, 3 pages.
Response to Final Office Action filed on Mar. 22, 2016, for U.S. Appl. No. 13/359,630 dated Sep. 22, 2015, 30 pages.
Response to Final Office Action filed on Mar. 31, 2014, for U.S. Appl. No. 13/359,630, dated Nov. 29, 2013, 17 pages.
Response to Non-Final Office Action filed on Mar. 30, 2015, for U.S. Appl. No. 13/359,630, dated Oct. 29, 2014, 31 pages.
Response to Non-Final Office Action filed on Oct. 7, 2013, for U.S. Appl. No. 13/359,630, dated Jun. 7, 2013, 15 pages.
Response to Notice of Non-Compliant Amendment filed on Jun. 23, 2015, for U.S. Appl. No. 13/359,630, dated Apr. 23, 2015, 31 pages.
Response to Restriction Requirement filed on May 21, 2013, for U.S. Appl. No. 13/359,630, dated Apr. 29, 2013, 10 pages.
Restriction Requirement received for U.S. Appl. No. 13/359,630, dated Apr. 29, 2013, 6 pages.
Corrected Notice of Allowance received for U.S. Appl. No. 13/624,682, dated Jan. 15, 2016, 5 pages.
Non-Final Office Action received for U.S. Appl. No. 13/624,682, dated Jan. 22, 2015, 8 pages.
Notice of Allowance received for U.S. Appl. No. 13/624,682, dated Jun. 8, 2015, 5 pages.
Notice of Allowance received for U.S. Appl. No. 13/624,682, dated Oct. 1, 2015, 7 pages.
Response to Non-Final Office Action filed on May 22, 2015, for U.S. Appl. No. 13/624,682, dated Jan. 22, 2015, 8 pages.
Advisory Action received for U.S. Appl. No. 14/067,795, dated Aug. 30, 2016, 5 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 14/067,795, dated Aug. 17, 2016, 3 pages.
Final Office Action received for U.S. Appl. No. 14/067,795, dated Jun. 1, 2016, 13 pages.
Non-Final Office Action received for U.S. Appl. No. 14/067,795, dated Sep. 25, 2015, 10 pages.
Response to Final Office action filed on Jul. 26, 2016, for U.S. Appl. No. 14/067,795, dated Jun. 1, 2016, 12 pages.
Response to Non-Final Office Action filed on Feb. 25, 2016, for U.S. Appl. No. 14/067,795, dated Sep. 25, 2015, 9 pages.
Corrected Notice of Allowability received for U.S. Appl. No. 14/868,105, dated Oct. 11, 2018, 2 pages.
Corrected Notice of Allowance received for U.S. Appl. No. 14/868,105, dated Sep. 21, 2018, 9 pages.
Final Office Action received for U.S. Appl. No. 14/868,105, dated Apr. 12, 2017, 22 pages.
Non-Final Office Action received for U.S. Appl. No. 14/868,105, dated Dec. 12, 2016, 24 pages.
Non-Final Office Action received for U.S. Appl. No. 14/868,105, dated Nov. 14, 2017, 14 pages.
International Search Report received for PCT Patent Application No. PCT/US2012/061966, dated Jan. 18, 2013, 2 pages.
Preliminary Amendment filed for U.S. Appl. No. 14/868,105, dated Nov. 12, 2015, 7 pages.
Preliminary Amendment filed for U.S. Appl. No. 14/868,105, dated Oct. 20, 2015, 8 pages.
Response to Final Office Action filed on Jul. 12, 2017, for U.S. Appl. No. 14/868,105, dated Apr. 12, 2017, 12 pages.
Response to Non-Final Office Action filed on Feb. 22, 2017, for U.S. Appl. No. 14/868,105, dated Dec. 12, 2016, 14 pages.
Response to Non-Final Office Action filed on Feb. 23, 2018, for U.S. Appl. No. 14/868,105, dated Nov. 14, 2017, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 14/963,706, dated Jul. 5, 2016, 5 pages.
Notice of Allowance received for U.S. Appl. No. 14/963,706, dated Aug. 18, 2016, 7 pages.
Preliminary Amendment filed for U.S. Appl. No. 14/963,706, dated Mar. 11, 2016, 8 pages.
Response to Non-Final Office Action filed on Aug. 3, 2016, for U.S. Appl. No. 14/963,706, dated Jul. 5, 2016, 8 pages.
Notice of Allowance received for U.S. Appl. No. 14/990,291, dated Dec. 13, 2017, 5 pages.
Office Action-First Action Interview received for U.S. Appl. No. 14/990,291, dated Oct. 18, 2017, 5 pages.
Preinterview First Office Action received for U.S. Appl. No. 14/990,291, dated Aug. 10, 2017, 4 pages.
Response to Office Action-First Action Interview filed on Oct. 31, 2017, for U.S. Appl. No. 14/990,291, dated Oct. 18, 2017, 7 pages.
Corrected Notice of Allowability received for U.S. Appl. No. 15/250,588, dated Oct. 19, 2018, 2 pages.
Corrected Notice of Allowability received for U.S. Appl. No. 15/250,588, dated Sep. 26, 2018, 2 pages.
Corrected Notice of Allowability received for U.S. Appl. No. 15/250,588, dated Sep. 20, 2018, 8 pages.
Non-Final Office Action received for U.S. Appl. No. 15/250,588, dated Sep. 22, 2017, 16 pages.
Notice of Allowance received for U.S. Appl. No. 15/250,588, dated Jul. 13, 2018, 9 pages.
Notice of Allowance received for U.S. Appl. No. 15/250,588 dated Mar. 21, 2018, 10 pages.
Preliminary Amendment received for U.S. Appl. No. 15/250,588, filed Aug. 30, 2016, 8 pages.
Response to Non-Final Office Action filed on Jan. 15, 2018, for U.S. Appl. No. 15/250,588, dated Sep. 22, 2017, 11 pages.
Preliminary Amendment for U.S. Appl. No. 15/337,899, filed Nov. 11, 2016, 8 pages.
Office Action received for Japanese Patent Application No. 2018-166862, dated Jan. 21, 2020, 6 pages.
Advisory Action received for U.S. Appl. No. 15/377,651, dated Jul. 30, 2019, 2 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 15/377,651, dated Apr. 1, 2019, 3 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 15/377,651, dated Jun. 28, 2019, 3 pages.
Final Office Action received for U.S. Appl. No. 15/377,651, dated May 15, 2019, 16 pages.
First Action Interview-Pre Interview Communication received for U.S. Appl. No. 15/377,651, dated Mar. 19, 2019, 5 pages.
Non-Final Office Action Received for U.S. Appl. No. 15/377,651, dated Aug. 15, 2019, 16 pages.
Notice of Allowance received for U.S. Appl. No. 15/377,651, dated Nov. 29, 2019, 7 pages.
Preliminary Amendment filed for U.S. Appl. No. 15/377,651, dated Dec. 27, 2016, 6 pages.
Response to Final Office Action filed on Jul. 15, 2019, for U.S. Appl. No. 15/377,651, dated May 15, 2019, 14 pages.
Response to First Action Interview—Pre-Interview Communication filed on Mar. 21, 2019, for U.S. Appl. No. 15/377,651, dated Mar. 19, 2019, 8 pages.
Response to Non-Final Office Action filed on Nov. 6, 2019, for U.S. Appl. No. 15/377,651, dated Aug. 15, 2019, 16 pages.
Response to Restriction Requirement filed on Jan. 14, 2019, for U.S. Appl. No. 15/377,651, dated Dec. 28, 2018, 9 pages.
Restriction Requirement received for U.S. Appl. No. 15/377,651 dated Dec. 28, 2018, 6 pages.
Supplemental Amendment filed on Apr. 29, 2019, for U.S. Appl. No. 15/377,651, 17 pages.
Response to Office Action filed on Jun. 20, 2018, for Japanese Patent Application No. 2017-075846, dated Mar. 20, 2018, 10 pages.
Office Action received for Japanese Patent Application No. 2017-075846, dated Mar. 20, 2018, 16 pages.
Notice of Decision to Grant received for Japan Patent Application No. 2017-075846 dated Aug. 7, 2018, 6 pages (3 pages of English translation and 3 pages of official copy).
Office Action received for Australian Patent Application No. 2016216535, dated Nov. 28, 2017, 3 pages.
Response to First Examiner Report filed on Aug. 2, 2017, for Australian Patent Application No. 2015271902, dated May 22, 2017, 19 pages.
First Examiner Report received for Australian Patent Application No. 2015271902, dated May 22, 2017, 3 pages.
First Examiner Report received for Indian Patent Application No. 6557/DELNP/2010, dated Apr. 11, 2017, 11 pages.
Notice of Acceptance received for Australian Patent Application No. 2015264850, dated Apr. 13, 2017, 3 pages.
First Examiner Report received for Australian Patent Application No. 2015264850, dated Dec. 19, 2016, 2 pages.
Response to Office Action filed on Jul. 28, 2017, for Chinese Patent Application No. 201510088798.4, dated Mar. 17, 2017, 13 pages (official copy only).
Extended European Search Report received for European Patent Application No. 17171025.4, dated Sep. 4, 2017, 7 pages.
Office Action received for Canadian Patent Application No. 2,826,580, dated Aug. 1, 2014, 2 pages.
Office Action received for Canadian Patent Application No. 2,826,580, dated Mar. 30, 2015, 3 pages.
Office Action received for Canadian Patent Application No. 2,826,580, dated Sep. 23, 2016, 4 pages.
Response to Office Action filed on Nov. 29, 2017, for Canadian Patent Application No. 2,826,580, dated May 30, 2017, 18 pages.
Response to Office Action filed on Aug. 18, 2014, for Canadian Patent Application No. 2,826,580, dated Aug. 1, 2014, 3 pages.
Response to Office Action filed on Sep. 28, 2015, for Canadian Patent Application No. 2,826,580, dated Mar. 30, 2015, 6 pages.
Notice of Allowance received for Canada Patent Application No. 2,850,074, dated Jul. 31, 2018, 1 page.
Office Action received for Canadian Patent Application No. 2,850,074, dated Nov. 28, 2016, 11 pages.
Office Action received for Canadian Patent Application No. 2,850,074, dated Sep. 29, 2015, 6 pages.
Office Action received for Canadian Patent Application No. 2,850,074, dated Oct. 23, 2017, 6 pages.
Response to Office Action filed on Feb. 13, 2018, for Canadian Patent Application No. 2,850,074, dated Oct. 23, 2017, 12 pages.
Response to Office Action filed on Mar. 24, 2016 for Canadian Patent Application No. 2,850,074, dated Sep. 29, 2015, 8 pages.
Response to Office Action filed on May 26, 2017, Canadian Patent Application No. 2,850,074, dated Nov. 28, 2016, 5 pages.
Office Action received for Canadian Patent Application No. 2,856,869, dated Oct. 14, 2015, 3 pages.
Response to Office Action filed on Apr. 11, 2016, for Canadian Patent Application No. 2,856,869, dated Oct. 14, 2015, 20 pages.
Office Action received for Chinese Patent Application No. 201080059424.5, dated Apr. 21, 2014, 18 pages (with English translation of claims).
Response to Office Action filed on Sep. 4, 2014, for Chinese Patent Application No. 201080059424.5, dated Apr. 21, 2014, 10 pages (with English translation of claims).
Amendment filed on Dec. 4, 2013, for Australian Patent Application No. 2012212601, 10 pages.
First Examiner Report received for Australian Patent Application No. 2012212601, dated Oct. 28, 2015, 3 pages.
Notice of Acceptance received for Australian Patent Application No. 2012212601, dated May 5, 2016, 3 pages.
Response to First Examiner Report filed on Mar. 23, 2016, for Australian Patent Application No. 2012212601, dated Oct. 28, 2015, 21 pages.
First Examiner Report received for Australian Patent Application No. 2012328754, dated Mar. 30, 2015, 3 pages.
Response to First Examiner Report filed on Aug. 3, 2015, for Australian Patent Application No. 2012328754, dated Mar. 30, 2015, 17 pages.
First Examiner Report received for Australian Patent Application No. 2012362467 dated Mar. 24, 2015, 3 pages.
Written Opinion received for PCT Application No. PCT/US2010/061628, dated Aug. 12, 2011, 4 pages.
Response to Final Office Action filed on Jun. 26, 2014, for U.S. Appl. No. 13/019,918, dated Mar. 27, 2014, 13 pages.
Response to Final Office Action filed on Nov. 6, 2015, for U.S. Appl. No. 13/019,918, dated Aug. 6, 2015, 21 pages.
Response to Non-Final Office Action filed on Apr. 29, 2015, for U.S. Appl. No. 13/019,918, dated Dec. 29, 2014, 26 pages.
Response to Non-Final Office Action filed on Aug. 25, 2016, for U.S. Appl. No. 13/019,918, dated Jun. 2, 2016, 22 pages.
Response to Non-Final Office Action filed on Nov. 27, 2013, for U.S. Appl. No. 13/019,918, dated Aug. 29, 2013, 9 pages.
312 Amendment for U.S. Appl. No. 13/194,584, filed Feb. 27, 2018, 9 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 13/194,584, dated May 19, 2014, 3 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 13/194,584, dated Dec. 28, 2017, 3 pages.
Google Play, “AgingBooth”, Retrieved from the Internet URL: <https://play.google.com/store/apps/details?id=com.piviandco.agingbooth&hl=en_IN>, Accessed on Jan. 7, 2019, 4 pages.
Gonsalves, “Amazon Launches Experimental Mobile Shopping Feature”, Retrieved from the Internet URL: <http://www.informationweek.com/news/internet/retail/showArticle.jhtml?articleID=212201750&subSection=News>, Dec. 3, 2008, 1 page.
Duke University, “How to Write Advertisements that Sell”, Company: System, the magazine of Business, 1912, 66 pages.
Araki et al., “Follow-The-Trial-Fitter: Real-Time Dressing without Undressing”, Retrieved from the Internet URL: <https://dialog.proquest.com/professional/printviewfile?accountId=142257>, Dec. 1, 2008, 8 pages.
Appelman, “Product Description for Fangraphs Baseball, An Iphone/ipad App”, Retrieved from the Internet URL: <https://blogs.fangraphs.com/fangraphs-iphone-app/>, Sep. 26, 2009, 3 pages.
Von et al., "Labeling Images with a Computer Game", Retrieved from the Internet URL: <http://ael.gatech.edu/cs6452f13/files/2013/08/labeling-images.pdf>, 2004, 8 pages.
Notice of Allowance received for U.S. Appl. No. 13/194,584, dated Jan. 23, 2018, 12 pages.
Response to Final Office Action filed on Apr. 14, 2016, for U.S. Appl. No. 13/194,584, dated Jan. 22, 2016, 10 pages.
Response to Final Office Action filed on Jun. 26, 2014, for U.S. Appl. No. 13/194,584, dated Mar. 27, 2014, 14 pages.
Response to Final Office Action filed on Oct. 30, 2017, for U.S. Appl. No. 13/194,584, dated Jul. 27, 2017, 11 pages.
Response to Non-Final Office Action filed on Dec. 19, 2013, for U.S. Appl. No. 13/194,584, dated Sep. 19, 2013, 13 pages.
Response to Non-Final Office Action filed on May 1, 2017, for U.S. Appl. No. 13/194,584, dated Nov. 29, 2016, 10 pages.
Response to Non-Final Office Action filed on Oct. 16, 2015, for U.S. Appl. No. 13/194,584, dated Jul. 16, 2015, 15 pages.
Response to Rule 312 Communication for U.S. Appl. No. 13/194,584 dated Mar. 14, 2018, 2 pages.
Final Office Action received for U.S. Appl. No. 13/283,416, dated Aug. 7, 2015, 25 pages.
Final Office Action received for U.S. Appl. No. 13/283,416, dated Nov. 25, 2014, 26 pages.
Non Final Office Action received for U.S. Appl. No. 13/283,416, dated Apr. 2, 2015, 31 pages.
Non Final Office Action received for U.S. Appl. No. 13/283,416, dated Feb. 2, 2016, 32 pages.
Non Final Office Action received for U.S. Appl. No. 13/283,416, dated Jul. 10, 2014, 29 pages.
Notice of Allowance received for U.S. Appl. No. 13/283,416, dated May 26, 2016, 9 pages.
Response to Final Office Action filed on Dec. 7, 2015, for U.S. Appl. No. 13/283,416, dated Aug. 7, 2015, 15 pages.
Response to Final Office Action filed on Feb. 25, 2015, for U.S. Appl. No. 13/283,416, dated Nov. 25, 2014, 12 pages.
Response to Non-Final Office Action filed on Jul. 2, 2015, for U.S. Appl. No. 13/283,416, dated Apr. 2, 2015, 13 pages.
Response to Non-Final Office Action filed on May 2, 2016, for U.S. Appl. No. 13/283,416, dated Feb. 2, 2016, 12 pages.
Response to Non-Final Office Action filed on Nov. 10, 2014, for U.S. Appl. No. 13/283,416, dated Jul. 10, 2014, 12 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 13/340,141, dated Aug. 4, 2015, 3 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 13/340,141, dated Dec. 11, 2014, 3 pages.
Final Office Action received for U.S. Appl. No. 13/340,141, dated Feb. 6, 2014, 19 pages.
Final Office Action received for U.S. Appl. No. 13/340,141, dated Sep. 26, 2014, 22 pages.
Non-Final Office Action received for U.S. Appl. No. 13/340,141, dated Apr. 9, 2015, 13 pages.
Non-Final Office Action received for U.S. Appl. No. 13/340,141, dated Aug. 29, 2013, 15 pages.
Non-Final Office Action received for U.S. Appl. No. 13/340,141, dated Jun. 5, 2014, 18 pages.
Notice of Allowance received for U.S. Appl. No. 13/340,141, dated Sep. 10, 2015, 9 pages.
Response to Final Office Action filed on Feb. 26, 2015, for U.S. Appl. No. 13/340,141, dated Sep. 26, 2014, 12 pages.
Response to Final Office Action filed on May 6, 2014, for U.S. Appl. No. 13/340,141, dated Feb. 6, 2014, 12 pages.
Response to Non-Final Office Action filed on Aug. 6, 2015, for U.S. Appl. No. 13/340,141, dated Apr. 9, 2015, 11 pages.
Response to Non-Final Office Action filed on Dec. 30, 2013, for U.S. Appl. No. 13/340,141, dated Aug. 29, 2013, 13 pages.
Response to Non-Final Office Action filed on Sep. 5, 2014, for U.S. Appl. No. 13/340,141, dated Jun. 5, 2014, 14 pages.
Final Office Action received for U.S. Appl. No. 13/359,630, dated Nov. 29, 2013, 21 pages.
Final Office Action received for U.S. Appl. No. 13/359,630, dated Sep. 22, 2015, 29 pages.
Non-Final Office Action received for U.S. Appl. No. 13/359,630, dated Jun. 7, 2013, 23 pages.
Response to Office Action filed on Jan. 6, 2020, for Japanese Patent Application No. 2018-166862, dated Oct. 1, 2019, 7 pages.
Office Action received for Korean Patent Application No. 10-2019-7017324, dated Sep. 16, 2019, 9 pages.
International Preliminary Report on Patentability received for PCT Application No. PCT/US2010/061628, dated Jul. 5, 2012, 6 pages.
Response to Office Action filed on Nov. 18, 2019, for Korean Patent Application No. 10-2019-7017324, dated Sep. 19, 2019, 12 pages.
Communication Pursuant to Article 94(3) EPC received for European Patent Application No. 10803429.9, dated Aug. 30, 2018, 6 pages.
Communication Pursuant to Article 94(3) EPC received for European Patent Application No. 10803429.9, dated Feb. 16, 2018, 8 pages.
Extended European Search report received for European Patent Application No. 10803429.9, dated Jun. 17, 2015, 7 pages.
Office Action received for European Patent Application No. 10803429.9, dated Aug. 22, 2012, 2 pages.
Response to Extended European Search report received filed on Dec. 15, 2015, for European Patent Application No. 10803429.9, dated Jun. 17, 2015, 24 pages.
Response to Office Action filed on Jan. 29, 2013, for European Application No. 10803429.9, dated Aug. 22, 2012, 10 pages.
Patterson, "Amazon iPhone App Takes Snapshots, Looks for a Match", Retrieved from the Internet URL: <http://tech.yahoo.com/blogs/patterson/30983>, Dec. 3, 2008, 3 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 12/371,882, dated Feb. 27, 2012, 3 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 12/371,882, dated Jul. 21, 2015, 3 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 12/371,882, dated Nov. 20, 2013, 3 pages.
Final Office Action received for U.S. Appl. No. 12/371,882, dated Jun. 25, 2015, 26 pages.
Parker, “Algorithms for Image Processing and Computer Vision”, Wiley Computer Publishing, 1997, pp. 23-29.
Final Office Action received for U.S. Appl. No. 12/371,882, dated Nov. 14, 2011, 21 pages.
Youtube, “RedLaser 2.0: Realtime iPhone UPC Barcode Scanning”, Available online on URL: <https://www.youtube.com/watch?v=9_hFGsmx_6k>, Jun. 16, 2009, 2 pages.
Newby, “Facebook, Politico to Measure Sentiment of GOP Candidates by Collecting Posts”, 2006-2012 Clarity Digital Group LLC d/b/a Examiner.com, Jun. 28, 2012, 3 pages.
Non-Final Office Action received for U.S. Appl. No. 12/371,882, dated Mar. 12, 2015, 29 pages.
Mulloni et al., “Handheld Augmented Reality Indoor Navigation with Activity-Based Instructions”, Proceedings of the 13th International Conference on Human Computer Interaction with Mobile Devices and Services, Aug. 30-Sep. 2, 2011, 10 pages.
Mobitv, “MobiTV”, Retrieved from the Internet URL: <http://www.mobitv.com/>, Accessed on Mar. 30, 2015, 1 page.
Mello Jr., “Pongr Giving Cell Phone Users Way to Make Money”, Retrieved from the Internet URL: <https://www.pcworld.com/article/240209/pongr_giving_cell_phone_users_way_to_make_money.html>, Sep. 18, 2011, 4 pages.
Notice of Allowance received for U.S. Appl. No. 12/371,882, dated Jul. 20, 2016, 5 pages.
Preliminary Amendment filed for U.S. Appl. No. 12/371,882, dated Feb. 16, 2009, 4 pages.
Preliminary Amendment filed for U.S. Appl. No. 12/371,882, filed on Jun. 19, 2009, 3 pages.
Response to Final Office Action filed on Jun. 13, 2013, for U.S. Appl. No. 12/371,882, dated Mar. 13, 2013, 14 pages.
Response to Final Office Action filed on Mar. 14, 2012, for U.S. Appl. No. 12/371,882, dated Nov. 14, 2011, 10 pages.
Response to Final Office Action filed on May 8, 2014, for U.S. Appl. No. 12/371,882, dated Dec. 18, 2013, 12 pages.
Response to Final Office Action filed on Sep. 25, 2015, for U.S. Appl. No. 12/371,882, dated Jun. 25, 2015, 13 pages.
Response to Non-Final Office Action filed on Jan. 22, 2013, for U.S. Appl. No. 12/371,882, dated Oct. 23, 2012, 12 pages.
Response to Non-Final Office Action filed on May 9, 2016, for U.S. Appl. No. 12/371,882, dated Feb. 8, 2016, 14 pages.
Response to Non-Final Office Action filed on Sep. 8, 2011, for U.S. Appl. No. 12/371,882, dated Jun. 8, 2011, 13 pages.
Response to Non-Final Office Action filed on Dec. 2, 2013, for U.S. Appl. No. 12/371,882, dated Aug. 30, 2013, 13 pages.
Response to Non-Final Office Action filed on Jun. 12, 2015, for U.S. Appl. No. 12/371,882, dated Mar. 12, 2015, 18 pages.
Appeal Brief filed on Oct. 27, 2014, for U.S. Appl. No. 12/398,957, mailed on Oct. 17, 2014, 32 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 12/398,957, dated Jan. 6, 2017, 4 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 12/398,957, dated Sep. 10, 2014, 4 pages.
Decision on Appeal filed on Dec. 12, 2016, for U.S. Appl. No. 12/398,957, 17 pages.
Examiner's Answer to Appeal Brief received for U.S. Appl. No. 12/398,957, mailed on Jan. 14, 2015, 10 pages.
Final Office Action received for U.S. Appl. No. 12/398,957, dated Jan. 22, 2018, 20 pages.
Final Office Action received for U.S. Appl. No. 12/398,957, dated Jul. 18, 2014, 27 pages.
Final Office Action received for U.S. Appl. No. 12/398,957, dated Nov. 7, 2012, 22 pages.
Non-Final Office Action received for U.S. Appl. No. 12/398,957, dated Jul. 29, 2011, 23 pages.
Non-Final Office Action received for U.S. Appl. No. 12/398,957, dated May 2, 2017, 19 pages.
Non-Final Office Action received for U.S. Appl. No. 12/398,957, dated Sep. 19, 2013, 21 pages.
Non-Final Office Action received for U.S. Appl. No. 12/398,957, dated Mar. 29, 2012, 23 pages.
International Search Report received for PCT Patent Application No. PCT/US2012/021450, dated May 1, 2012, 2 pages.
Reply Brief filed on Mar. 13, 2015, for U.S. Appl. No. 12/398,957, in response to Examiner's Answer to Appeal Brief mailed on Jan. 14, 2015, 9 pages.
Response to Final Office Action filed on Mar. 7, 2013, for U.S. Appl. No. 12/398,957, dated Nov. 7, 2012, 12 pages.
Notice of Allowance received for Korean Patent Application No. 10-2014-7014116, dated Jun. 10, 2016, 3 pages (2 pages of official copy and 1 page of English copy).
Amendment filed on Jul. 17, 2019, for Korean Patent Application No. 10-2019-7017324, 20 pages (11 pages of official copy and 9 pages of English translation).
Final Office Action received for Korean Patent Application No. 10-2019-7017324, dated Mar. 26, 2020, 6 pages (3 pages of official copy and 3 pages of English translation).
Supplemental Notice of Allowability received for U.S. Appl. No. 15/377,651, dated Feb. 26, 2020, 4 pages.
Response to Office Action filed on Mar. 25, 2020 for Japanese Application No. 2018-166862, dated Jan. 21, 2020, 8 pages (5 pages of official copy & 3 pages of English pending claims).
Amendment Under 37 CFR filed on Mar. 9, 2020, for U.S. Appl. No. 16/162,153, 8 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 16/162,153, dated Oct. 22, 2019, 4 pages.
Corrected Notice of Allowability received for U.S. Appl. No. 16/162,153, dated Feb. 21, 2020, 2 pages.
Corrected Notice of Allowability received for U.S. Appl. No. 16/162,153, dated Mar. 30, 2020, 2 pages.
Non-Final Office Action received for U.S. Appl. No. 16/162,153, dated Aug. 16, 2019, 25 pages.
Notice of Allowance received for U.S. Appl. No. 16/162,153, dated Dec. 11, 2019, 9 pages.
Response to Non-Final Office Action filed on Nov. 1, 2019, for U.S. Appl. No. 16/162,153, dated Aug. 16, 2019, 21 pages.
Response to Rule 312 Communication received for U.S. Appl. No. 16/162,153, dated Mar. 16, 2020, 2 pages.
Decision of Reexamination received for Chinese Patent Application No. 201280052967.3, mailed on Jan. 16, 2019, 18 pages (official copy only).
Reexamination Notification received for Chinese Patent Application No. 201280052967.3, mailed on Aug. 23, 2018, 20 pages (12 pages of official copy and 8 pages of English translation).
Response to Final Office Action filed on May 26, 2020, for Korean Patent Application No. 10-2019-7017324, dated Mar. 26, 2020, 21 pages (18 pages of official copy and 3 pages of English pending claims).
Response to First Examiner Report filed on Mar. 20, 2017, for Australian Patent Application No. 2015264850, dated Dec. 19, 2016, 15 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 12/371,882, dated Apr. 27, 2016, 5 pages.
Final Office Action received for U.S. Appl. No. 12/371,882, dated Dec. 18, 2013, 26 pages.
Final Office Action received for U.S. Appl. No. 12/371,882, dated Mar. 13, 2013, 24 pages.
Non-Final Office Action received for U.S. Appl. No. 12/371,882, dated Jun. 8, 2011, 22 pages.
Non-Final Office Action received for U.S. Appl. No. 12/371,882, dated Aug. 30, 2013, 20 pages.
Non-Final Office Action received for U.S. Appl. No. 12/371,882, dated Feb. 8, 2016, 37 pages.
Non-Final Office Action received for U.S. Appl. No. 12/371,882, dated Oct. 23, 2012, 21 pages.
Non-Final Office Action received for U.S. Appl. No. 12/406,016, dated Jun. 21, 2011, 21 pages.
Final Office Action received for U.S. Appl. No. 13/019,918, dated Mar. 27, 2014, 29 pages.
Notice of Allowance received for U.S. Appl. No. 12/644,957, dated Jun. 17, 2015, 21 pages.
Final Office Action received for U.S. Appl. No. 13/194,584, dated Jan. 22, 2016, 27 pages.
Final Office Action received for U.S. Appl. No. 13/194,584, dated Jul. 27, 2017, 35 pages.
Final Office Action received for U.S. Appl. No. 13/194,584, dated Mar. 27, 2014, 22 pages.
Non-Final Office Action received for U.S. Appl. No. 13/194,584, dated Jul. 16, 2015, 27 pages.
Non-Final Office Action received for U.S. Appl. No. 13/194,584, dated Nov. 29, 2016, 29 pages.
Non-Final Office Action received for U.S. Appl. No. 13/194,584, dated Sep. 19, 2013, 25 pages.
Communication under Rule 71(3) for European Patent Application No. 10803429.9, dated Jun. 6, 2019, 7 pages.
Response to Communication Pursuant to Article 94(3) EPC filed on Dec. 11, 2018, for European Patent Application No. 10803429.9, dated Aug. 30, 2018, 6 pages.
Applicant-Initiated Interview Summary received for U.S. Appl. No. 16/046,434, dated Oct. 25, 2019, 3 pages.
Final Office Action received for U.S. Appl. No. 16/046,434, dated Jan. 17, 2020, 24 pages.
Non-Final Office Action Received for U.S. Appl. No. 16/046,434, dated Aug. 21, 2019, 23 pages.
Response to Non-Final Office Action filed on Nov. 1, 2019, for U.S. Appl. No. 16/046,434, dated Aug. 21, 2019, 21 pages.
Extended European Search Report received for European Patent Application No. 19184977.7, dated Sep. 26, 2019, 10 pages.
Response to Extended European Search Report filed on Jan. 20, 2020, for European Patent Application No. 19184977.7, dated Sep. 26, 2019, 14 pages.
Notice of Allowance received for U.S. Appl. No. 14/868,105, dated May 21, 2018, 15 pages.
Response to Communication Pursuant to Article 94(3) EPC Filed on Jun. 4, 2018, for European Patent Application No. 10803429.9, dated Feb. 16, 2018, 11 pages.
Response to Office Action filed on Aug. 19, 2016, for Japanese Patent Application No. 2014-539013, dated May 31, 2016, 11 pages (with English translation of claims).
Response to First Examiner Report filed on Aug. 28, 2015, for Australian Patent Application No. 2012362467, dated Mar. 24, 2015, 18 pages.
Office Action received for Chinese Patent Application No. 201280013881.X, dated Jan. 9, 2015, 17 pages.
Office Action received for Chinese Patent Application No. 201280013881.X, dated Jun. 4, 2014, 15 pages (8 pages of official copy and 7 pages of English translation).
Office Action received for Chinese Patent Application No. 201280013881.X, dated Sep. 6, 2015, 18 pages (9 pages of English translation and 9 pages of official copy).
Office Action received for Chinese Patent Application No. 201280013881.X, dated Sep. 7, 2016, 22 pages (11 pages of English translation and 11 pages of official copy).
Response to Office Action filed on Dec. 21, 2015, for Chinese Patent Application No. 201280013881.X, dated Sep. 6, 2015, 16 pages (4 pages of English pending claims and 12 pages of official copy).
Response to Office Action filed on May 25, 2015, for Chinese Patent Application No. 201280013881.X, dated Jan. 9, 2015, 11 pages (8 pages of official copy and 3 pages of English pending claims).
Response to Office Action filed on Oct. 20, 2014, for Chinese Patent Application No. 201280013881.X, dated Jun. 4, 2014, 13 pages (7 pages of official copy and 6 pages of English pending claims).
Response to Office Action filed on Dec. 9, 2015, for Japanese Patent Application No. 2014-539013, dated Aug. 11, 2015, 16 pages.
Decision of Rejection received for Chinese Patent Application No. 201280052967.3, dated Aug. 4, 2017, 17 pages (in English).
Office Action received for Chinese Patent Application No. 201280052967.3, dated Aug. 24, 2016, 14 pages (with English translation of claims).
Office Action received for Chinese Patent Application No. 201280052967.3, dated Mar. 2, 2016, 18 pages (with English translation of claims).
Office Action received for Chinese Patent Application No. 201280052967.3, dated Mar. 23, 2017, 22 pages (with English translation of claims).
Office Action received for Chinese Patent Application No. 201510088798.4, dated Mar. 17, 2017, 23 pages (14 pages of English translation and 9 pages of official copy).
Request for Re-examination for Chinese Patent Application No. 201280052967.3, filed on Nov. 17, 2017, 11 pages (including English translation of claims).
Response to Office Action filed on Jan. 6, 2017, for Chinese Patent Application No. 201280052967.3, dated Aug. 24, 2016, 10 pages.
Response to Office Action filed on Jul. 18, 2016, for Chinese Patent Application No. 201280052967.3, dated Mar. 2, 2016, 8 pages.
Response to Office Action filed on Jun. 7, 2017, for Chinese Patent Application No. 201280052967.3, dated Mar. 23, 2017, 16 pages (5 pages of English pending claims and 11 pages of official copy).
Appeal Decision received for Japanese Patent Application No. 2013-552538, mailed on Dec. 1, 2015, 66 pages (Official copy only).
Notice of Appeal filed on Oct. 23, 2014, for Japanese Patent Application No. 2013-552538, 16 pages.
Office Action received for Japanese Patent Application No. 2013-552538, dated Jan. 14, 2014, 6 pages (3 pages of English translation and 3 pages of official copy).
Office Action received for Japanese Patent Application No. 2013-552538, dated Jun. 24, 2014, 4 pages (2 pages of English translation and 2 pages of official copy).
Response to Office Action filed on Apr. 8, 2014, for Japanese Patent Application No. 2013-552538, dated Jan. 14, 2014, 11 pages (8 pages of official copy and 3 pages of English pending claims).
Office Action received for Japanese Patent Application No. 2014-215914, dated Jun. 21, 2016, 5 pages (3 pages of English translation and 2 pages of official copy).
Office Action received for Japanese Patent Application No. 2014-215914, dated Nov. 4, 2015, 9 pages (5 pages of English translation and 4 pages of official copy).
Response to Office Action filed on Mar. 4, 2016, for Japanese Patent Application No. 2014-215914, dated Nov. 4, 2015, 13 pages.
Appeal with Amendment filed on Apr. 6, 2017, for Japanese Patent Application No. 2014-539013, 19 pages (with English translation of claims).
Office Action received for Japanese Patent Application No. 2014-539013, dated Aug. 11, 2015, 7 pages (with English translation of claims).
Office Action received for Japanese Patent Application No. 2014-539013, dated Dec. 6, 2016, 5 pages (with English translation of claims).
Office Action received for Japanese Patent Application No. 2014-539013, dated May 31, 2016, 4 pages (with English translation of claims).
Ye et al., “Jersey Number Detection in Sports Video for Athlete Identification”, Visual Communications and Image Processing 2005. vol. 5960, Jul. 2005, 9 pages.
Wikipedia, “Definition of Polar Coordinate System”, Wikipedia on Oct. 11, 2011 via Internet Archive WayBackMachine, [Online], Apr. 18, 2018, 12 pages.
Walther et al., “Selective Visual Attention Enables Learning and Recognition of Multiple Objects in Cluttered Scenes”, Jun. 15, 2005, 23 pages.
Vlahakis et al., “Archeoguide: First Results of an Augmented Reality, Mobile Computing System in Cultural Heritage Sites”, Jan. 2001, 10 pages.
Vassilios et al., “Archeoguide:An Augmented Reality Guide for Archaeological Sites”, IEEE Computer Graphics and application vol. 22, No. 5, Sep./Oct. 2002, pp. 52-60.
Troaca, “S60 Camera Phones Get Image Recognition Technology”, Retrieved from the Internet URL: <http://news.softpedia.com/news/S60-Camera-Phones-Get-Image-Recognition-Technology-79666.shtml>, Feb. 27, 2008, 2 pages.
Terada, “New Cell Phone Services Tap Image-Recognition Technologies”, Retrieved from the Internet URL: <http://search.japantimes.co.jp/cgi-bin/nb20070626a1.html>, Jun. 26, 2007, 3 pages.
Slingbox, “Sling Media, Inc.”, Retrieved from the Internet URL: <http://www.slingbox.com/>, Accessed on Mar. 30, 2015, 4 pages.
Sifry, “Politico-Facebook Sentiment Analysis Will Generate “Bogus” Results, Expert Says”, Retrieved from the Internet URL: <http://techpresident.com/news/21618/politico-facebook-sentiment-analysis-bogus>, Jan. 13, 2012, 4 pages.
Redlaser, “Redlaser—Impossibly Accurate Barcode Scanning”, Retrieved from the Internet URL: <http://redlaser.com/index.php>, Jul. 8, 2011, pp. 1-2.
Politicology, "Facebook Gives POLITICO Access to Your Political Beliefs", Ology, Retrieved from the Internet URL: <http://www.ology.com/post/51413/facebook-gives-politico-access-to-your-political-beliefs>, Accessed on Jun. 28, 2012, 4 pages.
Written Opinion received for PCT Patent Application No. PCT/US2012/071770, dated May 13, 2013, 5 pages.
International Search Report received for PCT Patent Application No. PCT/US2012/071770, dated May 13, 2013, 2 pages.
International Preliminary Report on Patentability received for PCT Application No. PCT/US2012/071770, dated Jul. 10, 2014, 7 pages.
Written Opinion received for PCT Patent Application No. PCT/US2012/061966, dated Jan. 18, 2013, 4 pages.
Supplemental Notice of Allowability received for U.S. Appl. No. 15/377,651, dated Jan. 28, 2020, 4 pages.
Response to Final Office Action filed on May 18, 2018, for U.S. Appl. No. 12/398,957, dated Jan. 22, 2018, 15 pages.
Response to Non-Final Office Action filed on Dec. 29, 2011, for U.S. Appl. No. 12/398,957, dated Jul. 29, 2011, 15 pages.
Response to Non-Final Office Action filed on Jan. 16, 2014, for U.S. Appl. No. 12/398,957, dated Sep. 19, 2013, 13 pages.
Response to Non-Final Office Action filed on Jul. 30, 2012, for U.S. Appl. No. 12/398,957, dated Mar. 29, 2012, 13 pages.
Response to Non-Final Office Action filed on Sep. 1, 2017, for U.S. Appl. No. 12/398,957, dated May 2, 2017, 13 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 12/406,016, dated May 15, 2012, 3 pages.
Final Office Action received for U.S. Appl. No. 12/406,016, dated Feb. 29, 2012, 25 pages.
Madeleine, “Terminator 3 Rise of Jesus! Deutsch”, Retrieved from the Internet URL: <https://www.youtube.com/watch?v=0j3o7HFcgzE>, Jun. 12, 2012, 2 pages.
Response to Final Office Action filed on May 17, 2012, for U.S. Appl. No. 12/406,016, dated Feb. 29, 2012, 16 pages.
Response to Non Final Office Action filed on Sep. 21, 2011, for U.S. Appl. No. 12/406,016, dated Jun. 21, 2011, 17 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 12/644,957, dated Apr. 29, 2015, 3 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 12/644,957, dated Jun. 11, 2014, 3 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 12/644,957, dated Sep. 4, 2014, 3 pages.
Final Office Action received for U.S. Appl. No. 12/644,957, dated Aug. 26, 2013, 19 pages.
Final Office Action received for U.S. Appl. No. 12/644,957, dated Jul. 11, 2014, 25 pages.
Non-Final Office Action received for U.S. Appl. No. 12/644,957, dated Dec. 29, 2014, 20 pages.
Non-Final Office Action received for U.S. Appl. No. 12/644,957, dated Mar. 7, 2014, 21 pages.
Non-Final Office Action received for U.S. Appl. No. 12/644,957, dated Mar. 18, 2013, 17 pages.
Kraft, “Real Time Baseball Augmented Reality”, Retrieved from the Internet URL: <http://dx.doi.org/10.7936/K7HH6H84>, Dec. 6, 2011, 11 pages.
Response to Final Office Action filed on Nov. 26, 2013 for U.S. Appl. No. 12/644,957 dated Aug. 26, 2013, 11 pages.
Response to Final Office Action filed on Sep. 30, 2014, for U.S. Appl. No. 12/644,957 dated Jul. 11, 2014, 14 pages.
Response to Non-Final Office Action filed on Apr. 29, 2015, for U.S. Appl. No. 12/644,957, dated Dec. 29, 2014, 13 pages.
Response to Non-Final Office Action filed Jun. 9, 2014, for U.S. Appl. No. 12/644,957, dated Mar. 7, 2014, 13 pages.
Response to Non-Final Office Action filed on Jun. 14, 2013, for U.S. Appl. No. 12/644,957, dated Mar. 18, 2013, 12 pages.
Communication Pursuant to Article 94(3) EPC received for European Patent Application No. 12741860.6, dated Mar. 19, 2015, 8 pages.
Extended European Search Report received for European Patent Application No. 12741860.6, dated Apr. 30, 2014, 7 pages.
Response to Communication Pursuant to Article 94(3) EPC filed on Jul. 23, 2015, for European Patent Application No. 12741860.6, dated Mar. 19, 2015, 8 pages.
Response to Extended European Search report filed on Nov. 26, 2014, for European Patent Application No. 12741860.6, dated Apr. 30, 2014, 8 pages.
Communication Pursuant to Article 94(3) EPC received for European Patent Application No. 12843046.9, dated Dec. 1, 2016, 8 pages.
Extended European Search report received for European Patent Application No. 12843046.9, dated Mar. 5, 2015, 7 pages.
Written Opinion received for PCT Patent Application No. PCT/US2012/021450, dated May 1, 2012, 2 pages.
Response to Communication pursuant to Articles 94(3) EPC filed on Apr. 5, 2017, for European Patent Application No. 12843046.9, dated Dec. 1, 2016, 11 pages.
Response to Extended European Search Report filed on Sep. 30, 2015, for European Patent Application No. 12843046.9, dated Mar. 5, 2015, 13 pages.
International Preliminary Report on Patentability received for PCT Application No. PCT/US2012/061966, dated May 8, 2014, 6 pages.
Summons to Attend Oral Proceedings received for European Patent Application No. 12843046.9, dated Jun. 21, 2018, 12 pages.
Communication Pursuant to Article 94(3) EPC received for European Patent Application No. 12862340.2, dated Dec. 21, 2016, 4 pages.
Extended European Search Report received for European Patent Application No. 12862340.2, dated Dec. 21, 2015, 4 pages.
Response to Communication Pursuant to Article 94(3) EPC filed on Feb. 6, 2017, for European Patent Application No. 12862340.2, dated Dec. 21, 2016, 6 pages.
International Preliminary Report on Patentability received for PCT Application No. PCT/US2012/021450, dated Jan. 22, 2013, 17 pages.
Response to Extended European Search Report filed on Jul. 8, 2016, for European Patent Application No. 12862340.2, dated Dec. 21, 2015, 16 pages.
Summons to Attend Oral Proceedings received for European Patent Application No. 12862340.2, dated Mar. 14, 2017, 6 pages.
Non-Final Office Action received for U.S. Appl. No. 13/011,324, dated Apr. 18, 2013, 17 pages.
Final Office Action received for U.S. Appl. No. 13/019,918, dated Aug. 6, 2015, 36 pages.
Kan et al., “Applying QR Code in Augmented Reality Applications”, VRCAI, Dec. 15, 2009, pp. 253-258.
Final Office Action received for U.S. Appl. No. 13/019,918, dated Nov. 30, 2016, 32 pages.
Non-Final Office Action received for U.S. Appl. No. 13/019,918, dated Aug. 29, 2013, 25 pages.
Non-Final Office Action received for U.S. Appl. No. 13/019,918, dated Dec. 29, 2014, 30 pages.
Non-Final Office Action received for U.S. Appl. No. 13/019,918, dated Jun. 2, 2016, 31 pages.
Response to Final Office Action filed on Apr. 28, 2017, for U.S. Appl. No. 13/019,918, dated Nov. 30, 2016, 22 pages.
Final Office Action received for Korean Patent Application No. 10-2020-7025366, dated Feb. 17, 2021, 8 pages (4 pages of official copy and 4 pages of English translation).
Non Final Office Action Received for U.S. Appl. No. 16/803,468, dated Apr. 26, 2021, 17 pages.
Related Publications (1)
Number Date Country
20200250741 A1 Aug 2020 US
Continuations (3)
Number Date Country
Parent 16162153 Oct 2018 US
Child 16852972 US
Parent 15250588 Aug 2016 US
Child 16162153 US
Parent 13283416 Oct 2011 US
Child 15250588 US