The present disclosure relates generally to image processing, and in a specific example embodiment, to visualization of items in an environment using augmented reality.
Conventionally, when an individual shops for an item, the individual must mentally visualize what the item will look like in the environment in which the individual intends to place the item. Often, the individual has difficulty imagining the item with proper dimensions and orientation. In some cases, the individual may purchase the item only to realize that the item does not ideally fit in the environment. As a result, the individual may end up returning the item or otherwise disposing of the item (e.g., selling, trading, or giving it away).
Various ones of the appended drawings merely illustrate example embodiments of the present invention and cannot be considered as limiting its scope.
The description that follows includes systems, methods, techniques, instruction sequences, and computing machine program products that embody illustrative embodiments of the present invention. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide an understanding of various embodiments of the inventive subject matter. It will be evident, however, to those skilled in the art that embodiments of the inventive subject matter may be practiced without these specific details. In general, well-known instruction instances, protocols, structures, and techniques have not been shown in detail.
As used herein, the term “or” may be construed in either an inclusive or exclusive sense. Additionally, although various example embodiments discussed below focus on a marketplace environment, the embodiments are given merely for clarity in disclosure. Thus, any type of electronic publication, electronic commerce, social networking, or electronic business system and method, including various system architectures, may employ various embodiments of the system and method described herein and may be considered as being within a scope of example embodiments. Each of a variety of example embodiments is discussed in detail below.
Example embodiments described herein provide systems and methods for visualizing an item in an environment using augmented reality. In example embodiments, environment image data containing an image of an environment is received from a client device. A selection of an item that is under consideration for purchase and placement into an indicated location of the environment is received. An item image of the selected item is scaled based on dimensions determined from the environment image data for the environment. The dimensions may be determined based on a calculated distance to a focal point of the indicated location in the environment and on a marker located in the image of the environment. The scaled item image is augmented into the image of the environment at the indicated location to generate an augmented reality image. In some embodiments, the scaled item image may be oriented to match an orientation of the indicated location in the environment.
By using embodiments of the present invention, a user may search for an item and augment an image of an environment with an image of the item. Because the user can create and view an augmented reality image of the environment including the selected item, the user can easily visualize the selected item in the environment without having to, for example, manually cut and paste or scale the image of the item into the image of the environment. Therefore, one or more of the methodologies discussed herein may obviate the need for time-consuming data processing by the user. This may have the technical effect of reducing computing resources used by one or more devices within the system. Examples of such computing resources include, without limitation, processor cycles, network traffic, memory usage, storage space, and power consumption.
With reference to
The client devices 110 and 112 may comprise a mobile phone, desktop computer, laptop, or any other communication device that a user may utilize to access the networked system 102. In some embodiments, the client device 110 may comprise or be connectable to an image capture device 113 (e.g., camera, camcorder). In further embodiments, the client device 110 may comprise one or more of a touch screen, accelerometer, microphone, and GPS device. Each of the client devices 110 and 112 may be a device of an individual user interested in visualizing an item within an environment.
An Application Program Interface (API) server 114 and a web server 116 are coupled to, and provide programmatic and web interfaces respectively to, one or more application servers 118. The application servers 118 host a publication system 120 and a payment system 122, each of which may comprise one or more modules, applications, or engines, and each of which may be embodied as hardware, software, firmware, or any combination thereof. The application servers 118 are, in turn, coupled to one or more database servers 124 facilitating access to one or more information storage repositories or database(s) 126. The databases 126 may also store user account information of the networked system 102 in accordance with example embodiments.
In example embodiments, the publication system 120 publishes content on a network (e.g., Internet). As such, the publication system 120 provides a number of publication functions and services to users that access the networked system 102. The publication system 120 is discussed in more detail in connection with
The payment system 122 provides a number of payment services and functions to users. The payment system 122 allows users to accumulate value (e.g., in a commercial currency, such as the U.S. dollar, or a proprietary currency, such as “points”) in their accounts, and then later to redeem the accumulated value for products (e.g., goods or services) that are made available via the publication system 120 or elsewhere on the network 104. The payment system 122 also facilitates payments from a payment mechanism (e.g., a bank account, PayPal™, or credit card) for purchases of items via any type and form of a network-based marketplace.
While the publication system 120 and the payment system 122 are shown in
Referring now to
In one embodiment, the publication system 120 provides a number of publishing, listing, and price-setting mechanisms whereby a seller may list (or publish information concerning) goods or services for sale, a buyer can express interest in or indicate a desire to purchase such goods or services, and a price can be set for a transaction pertaining to the goods or services. To this end, the publication system 120 may comprise at least one publication engine 202 and one or more shopping engines 204. In one embodiment, the shopping engines 204 may support auction-format listing and price setting mechanisms (e.g., English, Dutch, Chinese, Double, Reverse auctions, etc.).
A pricing engine 206 supports various price listing formats. One such format is a fixed-price listing format (e.g., the traditional classified advertisement-type listing or a catalog listing). Another format comprises a buyout-type listing. Buyout-type listings (e.g., the Buy-It-Now (BIN) technology developed by eBay Inc., of San Jose, Calif.) may be offered in conjunction with auction-format listings and allow a buyer to purchase goods or services, which are also being offered for sale via an auction, for a fixed price that is typically higher than a starting price of an auction for an item.
A store engine 208 allows a seller to group listings within a “virtual” store, which may be branded and otherwise personalized by and for the seller. Such a virtual store may also offer promotions, incentives, and features that are specific and personalized to the seller. In one example, the seller may offer a plurality of items as Buy-It-Now items in the virtual store, offer a plurality of items for auction, or a combination of both.
Navigation of the publication system 120 may be facilitated by a navigation engine 210. For example, a search module (not shown) of the navigation engine 210 enables, for example, keyword searches of listings or other information published via the publication system 120. In a further example, a browse module (not shown) of the navigation engine 210 allows users to browse various category, catalog, or data structures according to which listings or other information may be classified within the publication system 120. Various other navigation applications within the navigation engine 210 may be provided to supplement the searching and browsing applications. In one embodiment, the navigation engine 210 allows the user to search or browse for items in the publication system 120 (e.g., virtual stores, listings in a fixed-price or auction selling environment, listings in a social network or information system). In alternative embodiments, the navigation engine 210 may navigate (e.g., conduct a search on) a network at large (e.g., network 104). Based on a result of the navigation engine 210, the user may select an item that the user is interested in augmenting into an environment.
In order to make listings or posting of information available via the networked system 102 as visually informing and attractive as possible, the publication system 120 may include an imaging engine 212 that enables users to upload images for inclusion within listings and to incorporate images within viewed listings. In some embodiments, the imaging engine 212 also receives image data from a user and utilizes the image data to generate the augmented reality image. For example, the imaging engine 212 may receive an environment image (e.g., still image, video) of an environment within which the user wants to visualize an item. The imaging engine 212 may work in conjunction with the augmented reality engine 218 to generate the augmented reality image as will be discussed in more detail below.
A listing engine 214 manages listings on the publication system 120. In example embodiments, the listing engine 214 allows users to author listings of items. The listing may comprise an image of an item along with a description of the item. In one embodiment, the listings pertain to goods or services that a user (e.g., a seller) wishes to transact via the publication system 120. As such, the listing may comprise an image of a good for sale and a description of the item such as, for example, dimensions, color, and an identifier (e.g., UPC code, ISBN code). In some embodiments, a user may create a listing that is an advertisement or other form of publication to the networked system 102. The listing engine 214 also allows the users to manage such listings by providing various management features (e.g., auto-relisting, inventory level monitors, etc.).
A messaging engine 216 is responsible for the generation and delivery of messages to users of the networked system 102. Such messages include, for example, advising users regarding the status of listings and best offers (e.g., providing an acceptance notice to a buyer who made a best offer to a seller) or providing recommendations. The messaging engine 216 may utilize any one of a number of message delivery networks and platforms to deliver messages to users. For example, the messaging engine 216 may deliver electronic mail (e-mail), an instant message (IM), a Short Message Service (SMS) text, facsimile, or voice (e.g., Voice over IP (VoIP)) messages via wired networks (e.g., the Internet), a Plain Old Telephone Service (POTS) network, or wireless networks (e.g., mobile, cellular, WiFi, WiMAX).
An augmented reality engine 218 manages the generation of an augmented reality based on an environment image and item specified by a user. The augmented reality engine 218 will be discussed in more detail in connection with
Although the various components of the publication system 120 have been defined in terms of a variety of individual modules and engines, a skilled artisan will recognize that many of the items can be combined or organized in other ways. Alternatively, not all components of the publication system 120 of
In example embodiments, the imaging engine 212 may receive environment image data of an environment (e.g., still image, video) from the client device 110. The environment image data is then provided to the augmented reality engine 218 for processing. In some embodiments, the augmented reality engine 218 also receives item data for an item that the user is interested in visualizing in the environment and an indication of a location where the item is to be augmented in the environment. The item data may be provided by the navigation engine 210 based on a user selection of an item found using a search or browsing function of the navigation engine 210.
Alternatively, the item data may be received from the client device 110. For example, the user may capture an image of an item that the user is interested in augmenting into the environment (e.g., take a photo of an item at a store). The user may, in some cases, enter information regarding the item such as dimensions or an identifier (e.g., UPC code). The augmented reality engine 218 receives the item data from the client device 110.
The access module 300 accesses item data for a selected item. In some embodiments, an item to be augmented into the environment may be selected by a user at the client device and the selection is received, for example, by the navigation engine 210. In other embodiments, the selection is received by the access module 300. Based on the selection, the access module 300 may access information corresponding to the selection. If the selection is an item listing for the item, the access module 300 may access the item listing and extract item data (e.g., dimensions, images) from the listing. In other examples, if the selection is a user-inputted name or other item identifier of an item (e.g., UPC code), the access module 300 may use the item identifier to look up the item data in a catalog (e.g., stored in the database 126).
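By way of illustration, the lookup logic of the access module 300 might resemble the following minimal Python sketch; the catalog layout, field names, and helper function are assumptions made for illustration, not part of the disclosure:

```python
# Illustrative sketch of the access module 300's lookup logic.
# The catalog layout and field names are assumptions, not the
# disclosure's actual data model.
ITEM_CATALOG = {
    "012345678905": {  # keyed by item identifier (e.g., UPC code)
        "image": "bookshelf.png",
        "dimensions_in": (36.0, 72.0),  # width, height in inches
    },
}

def access_item_data(selection):
    """Extract item data from a listing, or look it up in a catalog."""
    if isinstance(selection, dict):
        # Selection is an item listing: extract the item data directly.
        return {"image": selection["image"],
                "dimensions_in": selection["dimensions_in"]}
    # Selection is a name or item identifier: consult the catalog.
    return ITEM_CATALOG.get(selection)
```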
The distance module 302 determines a distance to a focal point in an image of the environment. The focal point may be a user selected area (also referred to as an “indicated location”) where an item image is to be augmented. For example, if the environment is a room, the distance to a wall where the item image is to be augmented may be determined. In one embodiment, the distance module 302 may use a focus capability of the image capture device 113 of, or coupled to, the client device 110 to determine the distance. Alternatively, the distance module 302 may use an echo technique using the client device 110 as a sound generator to determine the distance. For example, the client device 110 may generate a sound in the direction of the wall and an amount of time is registered for an echo to be returned. The distance module 302 may use this amount of time to determine the distance. As such, the distance is from a point of view of the viewer or image capture device (e.g., camera) to the focal point.
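For the echo technique, the distance follows directly from the round-trip time of the generated sound. A minimal sketch, assuming the speed of sound in dry air at roughly room temperature (the function and constant names are illustrative):

```python
SPEED_OF_SOUND_M_PER_S = 343.0  # dry air at roughly 20 degrees Celsius

def distance_from_echo(round_trip_time_s: float) -> float:
    """Estimate the distance to the focal point from an echo.

    The sound travels to the surface and back, so the one-way
    distance is half of the total distance traveled.
    """
    return SPEED_OF_SOUND_M_PER_S * round_trip_time_s / 2.0

# Example: an echo registered after 20 ms implies about 3.4 meters.
print(distance_from_echo(0.020))
```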
The sizing module 304 determines sizing for the environment. In example embodiments, the sizing module 304 uses a marker (an object with known standard dimensions) in the environment image data to calculate the sizing. For example, if a door is shown in the environment image data, the sizing module 304 may assume that the door is a standard sized door (e.g., 36″×80″) or that a door knob is located at 36″ from the floor. Using these known standard dimensions, sizing for the environment may be determined. In another example, if the environment is an automobile, the marker may be a wheel well of the automobile. In this example, the user may specify a type of automobile when providing the environment image data.
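A minimal sketch of the marker-based sizing, assuming the marker is the standard 36″×80″ door mentioned above, measured in pixels in the environment image (the names are illustrative):

```python
def environment_scale(marker_px: float, marker_known_in: float = 80.0) -> float:
    """Derive a pixels-per-inch scale from a marker of known size.

    Defaults to the height of a standard interior door (80 inches);
    any object with known standard dimensions can serve as the marker.
    """
    return marker_px / marker_known_in

# Example: a door spanning 400 pixels yields 5.0 pixels per inch.
scale_px_per_in = environment_scale(400.0)
```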
The scaling module 306 scales an image of the item based on the distance and sizing determined by the distance module 302 and the sizing module 304, respectively. Accordingly, the scaling module 306 may receive (e.g., from the navigation engine 210) or retrieve the item data (e.g., from the database 126) for a selected item. The item data may include an item image, dimensions, or an item identifier. If the item image and dimensions are provided, the scaling module 306 may use them to scale the item image to the environment based on the sizing determined by the sizing module 304. Alternatively, if either the image or the dimensions are not provided, the item identifier may be used to look up the item in an item catalog, which may contain an image and item information for the item (e.g., dimensions and description). In one embodiment, the scaling module 306 may look up and retrieve the item information from the item catalog.
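A sketch of the scaling step using the Pillow imaging library (an implementation choice assumed here; the disclosure does not name a library):

```python
from PIL import Image

def scale_item_image(item_image: Image.Image,
                     item_dims_in: tuple,
                     scale_px_per_in: float) -> Image.Image:
    """Resize the item image so its pixel size matches the
    environment's pixels-per-inch scale."""
    width_in, height_in = item_dims_in
    target_size = (max(1, round(width_in * scale_px_per_in)),
                   max(1, round(height_in * scale_px_per_in)))
    return item_image.resize(target_size, Image.LANCZOS)
```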
Once the item image is scaled, the scaled item image may be oriented to the environment by the orientation module 308. For example, if the environment image has a wall at a slight angle and the scaled item image is to be placed on the wall, the orientation module 308 orients the scaled item image to the angle of the wall. It is noted that functionality of any of the distance module 302, sizing module 304, scaling module 306, and orientation module 308 may be combined into one or more modules that can determine proper sizing and orientation for the item image. In some embodiments, these combined modules may comprise or make use of one or more gyroscopes or accelerometers.
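The orientation step can be sketched as a planar rotation; a full implementation would more likely apply a perspective (homography) warp, but this hedged example conveys the idea:

```python
from PIL import Image

def orient_item_image(scaled_item: Image.Image,
                      surface_angle_deg: float) -> Image.Image:
    """Rotate the scaled item image to match the apparent angle of
    the surface (e.g., a wall) at the indicated location.

    expand=True enlarges the canvas so rotated corners are not
    clipped; a perspective warp would also handle foreshortening.
    """
    return scaled_item.rotate(surface_angle_deg, expand=True)
```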
The augmenting module 310 augments the scaled and oriented item image with the environment image to create an augmented reality image. The augmenting module 310 then provides the augmented reality image to the client device 110.
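The augmentation itself reduces to compositing the prepared item image onto the environment image at the indicated location. A minimal sketch, again using Pillow (the coordinate convention and names are illustrative assumptions):

```python
from PIL import Image

def augment(environment: Image.Image,
            prepared_item: Image.Image,
            location_px: tuple) -> Image.Image:
    """Paste the scaled, oriented item image into the environment image.

    location_px is the top-left pixel coordinate of the indicated
    location; the item's alpha channel masks the paste so transparent
    pixels leave the environment untouched.
    """
    result = environment.convert("RGBA")
    item_rgba = prepared_item.convert("RGBA")
    result.paste(item_rgba, location_px, mask=item_rgba)
    return result
```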
The recommendation module 312 optionally provides recommendations for alternative items for the environment. For example, if the scaled and oriented item image appears too large for an indicated area on the environment image (e.g., as determined by the augmenting module 310), the recommendation module 312 may suggest one or more alternative items that are smaller and will fit better in the indicated area. Accordingly, the recommendation module 312 may determine a dimension that is more appropriate for the indicated area and perform a search (e.g., provide instructions to the navigation engine 210 to perform a search) to find one or more alternative items. The recommendation module 312 may then retrieve the item information and provide the alternative items as a suggestion to the user. In one embodiment, the alternative items may be listed on a side of a display that is displaying the augmented reality image or on a pop-up window.
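The fit check behind such recommendations can be sketched as a simple dimensional filter over a catalog, a deliberately simplified stand-in for the search performed via the navigation engine 210:

```python
def recommend_alternatives(catalog: dict,
                           area_width_in: float,
                           area_height_in: float) -> list:
    """Return identifiers of catalog items that fit the indicated area."""
    return [item_id for item_id, item in catalog.items()
            if item["dimensions_in"][0] <= area_width_in
            and item["dimensions_in"][1] <= area_height_in]
```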
The save module 314 saves the environment image for later use. In one embodiment, the environment image may be stored to the database 126 of the networked system 102. Alternatively, the environment image may be stored to the client device 110. For example, the user may record the environment image for a room and save it. At a later time, the user may obtain an item image for an item that the user is interested in augmenting into the saved environment image. The save module 314 may access and retrieve the saved environment image.
The purchase module 316 allows the user to purchase the item that is augmented into the environment or an alternative item recommended by the recommendation module 312. In one embodiment, the purchase module 316 provides a selection on or near the augmented reality image that, when selected, takes the user to, for example, a purchase page for the item, a storefront for a store that sells the item, or a search page with search results for availability of the item for purchase. In another embodiment, an activation of the selection may initiate an automatic purchase of the item. Once selected, the purchase module 316 performs the corresponding actions to facilitate the purchase (e.g., send a search for the item to the navigation engine 210, provide one or more listings using the shopping engine 204, provide a webpage associated with the store engine 208).
In operation 404, a selection of an item to be augmented into the environment is received. In some embodiments, the navigation engine 210 receives a selection of the item from the client device. In other embodiments, the imaging engine 212 receives an image of an item that the user is interested in augmenting into the environment.
Based on the received selection of the item, item data is accessed in operation 406. The access module 300 accesses item data for the selected item. The item data may be extracted from an item listing for the item, retrieved from an item catalog, or retrieved from a website of a manufacturer or reseller (e.g., using an item identifier of the item).
In operation 408, augmentation processing is performed. Augmentation processing takes the environment image data and the selected item and augments or merges an item image for the item into an environment image. The operations of the augmentation processing will be discussed in detail with respect to
The result of the augmentation is provided in operation 410. The result may comprise a video of the environment with the selected item augmented into the environment (referred to as “the augmented reality image”). In example embodiments, the augmenting module 310 provides the augmented reality image to the client device 110 of the user that provided the environment image, the item selection, or both.
In operation 412, a determination is made as to whether a modification is received. In some embodiments, the modification may be caused by the movement of the image capture device 113. For example, if the image capture device 113 is a video camera, then the modification is the movement within the environment as captured by the video camera. In another embodiment, the user may select an alternative item based on a recommendation provided by the recommendation module 312. Based on the modification, the method 400 may return to either operation 406 to access item data for the new item or to operation 408 to perform augmentation processing based on, for example, the movement within the environment.
In operation 504, sizing for the environment is determined by the sizing module 304. In example embodiments, the sizing module 304 uses a marker in the environment image data to calculate the sizing. Using the known standard dimensions of the marker, the sizing for the environment may be determined.
The item image is scaled in operation 506. The scaling module 306 scales an image of the item based on the distance and sizing determined by the distance module 302 and the sizing module 304, respectively. Accordingly, the scaling module 306 may receive or retrieve the item data including an item image, dimensions, or an item identifier. The retrieved item data is then used in association with the determined distance and sizing data to scale the item image.
Once the item image is scaled, the scaled item image may be oriented to the environment, in operation 508, by the orientation module 308. For example, if the environment image has a wall at a slight angle and the scaled item image is to be placed on the wall, the orientation module 308 orients the scaled item image to the angle of the wall.
In operation 510, the scaled and oriented item image is merged into the environment image. The augmenting module 310 augments the scaled and oriented item image with the environment image to create an augmented reality image. It is noted that operations of
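Taken together, the operations above amount to the following condensed flow, reusing the illustrative helpers sketched earlier; all names remain assumptions, and item_data["image"] is assumed to be an already-loaded image object:

```python
def augmentation_processing(environment, item_data, location_px,
                            marker_px, echo_round_trip_s,
                            surface_angle_deg=0.0):
    """Condensed sketch of the augmentation processing flow."""
    # Distance to the focal point (distance module 302). This
    # simplified sketch derives scale from the marker alone; a full
    # implementation would combine the distance with the
    # marker-derived sizing when computing the scale.
    distance_m = distance_from_echo(echo_round_trip_s)
    scale_px_per_in = environment_scale(marker_px)        # operation 504
    scaled = scale_item_image(item_data["image"],
                              item_data["dimensions_in"],
                              scale_px_per_in)            # operation 506
    oriented = orient_item_image(scaled, surface_angle_deg)  # operation 508
    return augment(environment, oriented, location_px)    # operation 510
```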
While the various examples of
Modules, Components, and Logic
Additionally, certain embodiments described herein may be implemented as logic or a number of modules, engines, components, or mechanisms. A module, engine, logic, component, or mechanism (collectively referred to as a “module”) may be a tangible unit capable of performing certain operations and configured or arranged in a certain manner. In certain example embodiments, one or more computer systems (e.g., a standalone, client, or server computer system) or one or more components of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) or firmware (note that software and firmware can generally be used interchangeably herein as is known by a skilled artisan) as a module that operates to perform certain operations described herein.
In various embodiments, a module may be implemented mechanically or electronically. For example, a module may comprise dedicated circuitry or logic that is permanently configured (e.g., within a special-purpose processor, application specific integrated circuit (ASIC), or array) to perform certain operations. A module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software or firmware to perform certain operations. It will be appreciated that a decision to implement a module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by, for example, cost, time, energy-usage, and package size considerations.
Accordingly, the term “module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. Considering embodiments in which modules or components are temporarily configured (e.g., programmed), each of the modules or components need not be configured or instantiated at any one instance in time. For example, where the modules or components comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different modules at different times. Software may accordingly configure the processor to constitute a particular module at one instance of time and to constitute a different module at a different instance of time.
Modules can provide information to, and receive information from, other modules. Accordingly, the described modules may be regarded as being communicatively coupled. Where multiples of such modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the modules. In embodiments in which multiple modules are configured or instantiated at different times, communications between such modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple modules have access. For example, one module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further module may then, at a later time, access the memory device to retrieve and process the stored output. Modules may also initiate communications with input or output devices and can operate on a resource (e.g., a collection of information).
Example Machine Architecture and Machine-Readable Medium
With reference to
The example computer system 700 may include a processor 702 (e.g., a central processing unit (CPU), a graphics processing unit (GPU) or both), a main memory 704 and a static memory 706, which communicate with each other via a bus 708. The computer system 700 may further include a video display unit 710 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). In example embodiments, the computer system 700 also includes one or more of an alpha-numeric input device 712 (e.g., a keyboard), a user interface (UI) navigation device or cursor control device 714 (e.g., a mouse), a disk drive unit 716, a signal generation device 718 (e.g., a speaker), and a network interface device 720.
Machine-Readable Storage Medium
The disk drive unit 716 includes a machine-readable storage medium 722 on which is stored one or more sets of instructions 724 and data structures (e.g., software instructions) embodying or used by any one or more of the methodologies or functions described herein. The instructions 724 may also reside, completely or at least partially, within the main memory 704, the static memory 706, or within the processor 702 during execution thereof by the computer system 700, with the main memory 704 and the processor 702 also constituting machine-readable media.
While the machine-readable storage medium 722 is shown in an example embodiment to be a single medium, the term “machine-readable storage medium” may include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) that store the one or more instructions. The term “machine-readable storage medium” shall also be taken to include any tangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of embodiments of the present invention, or that is capable of storing, encoding, or carrying data structures used by or associated with such instructions. The term “machine-readable storage medium” shall accordingly be taken to include, but not be limited to, solid-state memories and optical and magnetic media. Specific examples of machine-readable storage media include non-volatile memory, including by way of example semiconductor memory devices (e.g., Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), and flash memory devices); magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
Transmission Medium
The instructions 724 may further be transmitted or received over a communications network 726 using a transmission medium via the network interface device 720 and utilizing any one of a number of well-known transfer protocols (e.g., HTTP). Examples of communication networks include a local area network (LAN), a wide area network (WAN), the Internet, mobile telephone networks, POTS networks, and wireless data networks (e.g., WiFi and WiMax networks). The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.
Although an overview of the inventive subject matter has been described with reference to specific example embodiments, various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of embodiments of the present invention. Such embodiments of the inventive subject matter may be referred to herein, individually or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is, in fact, disclosed.
The embodiments illustrated herein are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed. Other embodiments may be used and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. The Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.
Moreover, plural instances may be provided for resources, operations, or structures described herein as a single instance. Additionally, boundaries between various resources, operations, modules, engines, and data stores are somewhat arbitrary, and particular operations are illustrated in a context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within a scope of various embodiments of the present invention. In general, structures and functionality presented as separate resources in the example configurations may be implemented as a combined structure or resource. Similarly, structures and functionality presented as a single resource may be implemented as separate resources. These and other variations, modifications, additions, and improvements fall within a scope of embodiments of the present invention as represented by the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
This application is a continuation of and claims the benefit of priority to U.S. patent application Ser. No. 16/852,972, filed on Apr. 20, 2020; Ser. No. 16/852,972 is a continuation of and claims the benefit of priority to U.S. patent application Ser. No. 16/162,153, filed on Oct. 16, 2018; Ser. No. 16/162,153 is a continuation of and claims the benefit of priority to U.S. patent application Ser. No. 15/250,588, filed on Aug. 29, 2016; and Ser. No. 15/250,588 is a continuation of and claims the benefit of priority to U.S. patent application Ser. No. 13/283,416, filed on Oct. 27, 2011; each of which is hereby incorporated by reference herein in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
3675215 | Arnold et al. | Jul 1972 | A |
4539585 | Spackova et al. | Sep 1985 | A |
4596144 | Panton et al. | Jun 1986 | A |
5068723 | Dixit et al. | Nov 1991 | A |
5408417 | Wilder | Apr 1995 | A |
5546475 | Bolle et al. | Aug 1996 | A |
5579471 | Barber et al. | Nov 1996 | A |
5601431 | Howard | Feb 1997 | A |
5692012 | Virtamo et al. | Nov 1997 | A |
5781899 | Hirata | Jul 1998 | A |
5802361 | Wang et al. | Sep 1998 | A |
5818964 | Itoh | Oct 1998 | A |
5870149 | Comroe et al. | Feb 1999 | A |
5889896 | Meshinsky et al. | Mar 1999 | A |
5901379 | Hirata | May 1999 | A |
5949429 | Bonneau et al. | Sep 1999 | A |
6112226 | Weaver et al. | Aug 2000 | A |
6134548 | Gottsman et al. | Oct 2000 | A |
6134674 | Akasheh | Oct 2000 | A |
6151587 | Matthias | Nov 2000 | A |
6154738 | Call | Nov 2000 | A |
6157435 | Slater et al. | Dec 2000 | A |
6216134 | Heckerman et al. | Apr 2001 | B1 |
6216227 | Goldstein et al. | Apr 2001 | B1 |
6278446 | Liou et al. | Aug 2001 | B1 |
6292593 | Nako et al. | Sep 2001 | B1 |
6298330 | Gardenswartz et al. | Oct 2001 | B1 |
6434530 | Sloane et al. | Aug 2002 | B1 |
6463426 | Lipson et al. | Oct 2002 | B1 |
6477269 | Brechner | Nov 2002 | B1 |
6483570 | Slater et al. | Nov 2002 | B1 |
6484130 | Dwyer et al. | Nov 2002 | B2 |
6512919 | Ogasawara | Jan 2003 | B2 |
6530521 | Henry | Mar 2003 | B1 |
6549913 | Murakawa | Apr 2003 | B1 |
6563959 | Troyanker | May 2003 | B1 |
6567797 | Schuetze et al. | May 2003 | B1 |
6587835 | Treyz et al. | Jul 2003 | B1 |
6589290 | Maxwell et al. | Jul 2003 | B1 |
6642929 | Essafi et al. | Nov 2003 | B1 |
6714945 | Foote et al. | Mar 2004 | B1 |
6724930 | Kosaka et al. | Apr 2004 | B1 |
6763148 | Sternberg et al. | Jul 2004 | B1 |
6804662 | Annau et al. | Oct 2004 | B1 |
6901379 | Balter et al. | May 2005 | B1 |
6947571 | Rhoads et al. | Sep 2005 | B1 |
7022281 | Senff | Apr 2006 | B1 |
7023441 | Choi et al. | Apr 2006 | B2 |
7062722 | Carlin et al. | Jun 2006 | B1 |
7082365 | Sheha et al. | Jul 2006 | B2 |
7130466 | Seeber | Oct 2006 | B2 |
7149665 | Feld et al. | Dec 2006 | B2 |
7162082 | Edwards | Jan 2007 | B2 |
7240025 | Stone et al. | Jul 2007 | B2 |
7254779 | Rezvani et al. | Aug 2007 | B1 |
7257268 | Eichhorn et al. | Aug 2007 | B2 |
7281018 | Begun et al. | Oct 2007 | B1 |
7346453 | Matsuoka | Mar 2008 | B2 |
7346543 | Edmark | Mar 2008 | B1 |
7347373 | Singh | Mar 2008 | B2 |
7363214 | Musgrove et al. | Apr 2008 | B2 |
7363252 | Fujimoto | Apr 2008 | B2 |
7460735 | Rowley et al. | Dec 2008 | B1 |
7478143 | Friedman et al. | Jan 2009 | B1 |
7495674 | Biagiotti et al. | Feb 2009 | B2 |
7519562 | Vander et al. | Apr 2009 | B1 |
7568004 | Gottfried | Jul 2009 | B2 |
7587359 | Levy et al. | Sep 2009 | B2 |
7593602 | Stentiford | Sep 2009 | B2 |
7683858 | Allen et al. | Mar 2010 | B2 |
7702185 | Keating et al. | Apr 2010 | B2 |
7752082 | Calabria | Jul 2010 | B2 |
7756757 | Oakes, III | Jul 2010 | B1 |
7761339 | Alivandi | Jul 2010 | B2 |
7796155 | Neely et al. | Sep 2010 | B1 |
7801893 | Gulli et al. | Sep 2010 | B2 |
7827074 | Rolf | Nov 2010 | B1 |
7848764 | Riise et al. | Dec 2010 | B2 |
7848765 | Phillips et al. | Dec 2010 | B2 |
7881560 | John | Feb 2011 | B2 |
7890386 | Reber | Feb 2011 | B1 |
7916129 | Lin et al. | Mar 2011 | B2 |
7921040 | Reber | Apr 2011 | B2 |
7933811 | Reber | Apr 2011 | B2 |
7948481 | Vilcovsky | May 2011 | B2 |
7957510 | Denney et al. | Jun 2011 | B2 |
8078498 | Edmark | Dec 2011 | B2 |
8130242 | Cohen | Mar 2012 | B2 |
8230016 | Pattan et al. | Jul 2012 | B1 |
8239130 | Upstill et al. | Aug 2012 | B1 |
8260846 | Lahav | Sep 2012 | B2 |
8275590 | Szymczyk et al. | Sep 2012 | B2 |
8370062 | Starenky et al. | Feb 2013 | B1 |
8385646 | Lang et al. | Feb 2013 | B2 |
8547401 | Mallinson et al. | Oct 2013 | B2 |
8825660 | Chittar | Sep 2014 | B2 |
8868443 | Yankovich et al. | Oct 2014 | B2 |
9058764 | Persson et al. | Jun 2015 | B1 |
9164577 | Tapley et al. | Oct 2015 | B2 |
9240059 | Zises | Jan 2016 | B2 |
9251395 | Botchen | Feb 2016 | B1 |
9336541 | Pugazhendhi et al. | May 2016 | B2 |
9449342 | Sacco | Sep 2016 | B2 |
9495386 | Tapley et al. | Nov 2016 | B2 |
9530059 | Zises | Dec 2016 | B2 |
9953350 | Pugazhendhi et al. | Apr 2018 | B2 |
10127606 | Tapley et al. | Nov 2018 | B2 |
10147134 | Sacco | Dec 2018 | B2 |
10210659 | Tapley et al. | Feb 2019 | B2 |
10614602 | Zises | Apr 2020 | B2 |
10628877 | Sacco | Apr 2020 | B2 |
11113755 | Sacco | Sep 2021 | B2 |
20010034668 | Whitworth | Oct 2001 | A1 |
20010049636 | Hudda et al. | Dec 2001 | A1 |
20020002504 | Engel et al. | Jan 2002 | A1 |
20020027694 | Kim et al. | Mar 2002 | A1 |
20020052709 | Akatsuka et al. | May 2002 | A1 |
20020072993 | Sandus et al. | Jun 2002 | A1 |
20020094189 | Navab et al. | Jul 2002 | A1 |
20020107737 | Kaneko et al. | Aug 2002 | A1 |
20020111154 | Eldering et al. | Aug 2002 | A1 |
20020116286 | Walker et al. | Aug 2002 | A1 |
20020146176 | Meyers | Oct 2002 | A1 |
20020196333 | Gorischek | Dec 2002 | A1 |
20030018652 | Heckerman et al. | Jan 2003 | A1 |
20030028873 | Lemmons | Feb 2003 | A1 |
20030051255 | Bulman et al. | Mar 2003 | A1 |
20030053706 | Hong et al. | Mar 2003 | A1 |
20030080978 | Navab et al. | May 2003 | A1 |
20030085894 | Tatsumi | May 2003 | A1 |
20030101105 | Vock | May 2003 | A1 |
20030112260 | Gouzu | Jun 2003 | A1 |
20030123026 | Abitbol et al. | Jul 2003 | A1 |
20030130910 | Pickover et al. | Jul 2003 | A1 |
20030147623 | Fletcher | Aug 2003 | A1 |
20030208409 | Mault | Nov 2003 | A1 |
20030229537 | Dunning et al. | Dec 2003 | A1 |
20030231806 | Troyanker | Dec 2003 | A1 |
20040019643 | Robert | Jan 2004 | A1 |
20040046779 | Asano et al. | Mar 2004 | A1 |
20040057627 | Abe et al. | Mar 2004 | A1 |
20040075670 | Bezine et al. | Apr 2004 | A1 |
20040096096 | Huber | May 2004 | A1 |
20040128320 | Grove et al. | Jul 2004 | A1 |
20040133927 | Sternberg et al. | Jul 2004 | A1 |
20040153505 | Verdi et al. | Aug 2004 | A1 |
20040205286 | Bryant et al. | Oct 2004 | A1 |
20040220767 | Tanaka et al. | Nov 2004 | A1 |
20040230558 | Tokunaka | Nov 2004 | A1 |
20050001852 | Dengler et al. | Jan 2005 | A1 |
20050004850 | Gutbrod | Jan 2005 | A1 |
20050010486 | Pandhe | Jan 2005 | A1 |
20050065655 | Hong et al. | Mar 2005 | A1 |
20050081161 | Macinnes et al. | Apr 2005 | A1 |
20050084154 | Li et al. | Apr 2005 | A1 |
20050091597 | Ackley | Apr 2005 | A1 |
20050151743 | Sitrick | Jul 2005 | A1 |
20050151963 | Pulla et al. | Jul 2005 | A1 |
20050162419 | Kim et al. | Jul 2005 | A1 |
20050162523 | Darrell et al. | Jul 2005 | A1 |
20050171864 | Nakade et al. | Aug 2005 | A1 |
20050182792 | Israel et al. | Aug 2005 | A1 |
20050193006 | Bandas | Sep 2005 | A1 |
20050222987 | Vadon | Oct 2005 | A1 |
20050283379 | Reber | Dec 2005 | A1 |
20060004850 | Chowdhury | Jan 2006 | A1 |
20060012677 | Neven et al. | Jan 2006 | A1 |
20060013481 | Park et al. | Jan 2006 | A1 |
20060015492 | Keating et al. | Jan 2006 | A1 |
20060032916 | Mueller et al. | Feb 2006 | A1 |
20060038833 | Mallinson et al. | Feb 2006 | A1 |
20060058948 | Blass et al. | Mar 2006 | A1 |
20060071945 | Anabuki | Apr 2006 | A1 |
20060071946 | Anabuki et al. | Apr 2006 | A1 |
20060116935 | Evans | Jun 2006 | A1 |
20060120686 | Liebenow | Jun 2006 | A1 |
20060149625 | Koningstein | Jul 2006 | A1 |
20060149638 | Allen | Jul 2006 | A1 |
20060184013 | Emanuel et al. | Aug 2006 | A1 |
20060190293 | Richards | Aug 2006 | A1 |
20060218153 | Voon et al. | Sep 2006 | A1 |
20060240862 | Neven | Oct 2006 | A1 |
20070005576 | Cutrell et al. | Jan 2007 | A1 |
20070015586 | Huston | Jan 2007 | A1 |
20070038944 | Carignano et al. | Feb 2007 | A1 |
20070060112 | Reimer | Mar 2007 | A1 |
20070078846 | Gulli et al. | Apr 2007 | A1 |
20070091125 | Takemoto et al. | Apr 2007 | A1 |
20070098234 | Fiala | May 2007 | A1 |
20070104348 | Cohen | May 2007 | A1 |
20070122947 | Sakurai et al. | May 2007 | A1 |
20070133947 | Armitage et al. | Jun 2007 | A1 |
20070143082 | Degnan | Jun 2007 | A1 |
20070150403 | Mock | Jun 2007 | A1 |
20070159522 | Neven | Jul 2007 | A1 |
20070172155 | Guckenberger | Jul 2007 | A1 |
20070198505 | Fuller | Aug 2007 | A1 |
20070230817 | Kokojima | Oct 2007 | A1 |
20070244924 | Sadovsky et al. | Oct 2007 | A1 |
20070300161 | Bhatia et al. | Dec 2007 | A1 |
20080003966 | Magnusen | Jan 2008 | A1 |
20080005313 | Flake et al. | Jan 2008 | A1 |
20080037877 | Jia et al. | Feb 2008 | A1 |
20080046738 | Galloway et al. | Feb 2008 | A1 |
20080046956 | Kulas | Feb 2008 | A1 |
20080059055 | Geelen et al. | Mar 2008 | A1 |
20080071559 | Arrasvuori | Mar 2008 | A1 |
20080074424 | Carignano | Mar 2008 | A1 |
20080082426 | Gokturk et al. | Apr 2008 | A1 |
20080084429 | Wissinger | Apr 2008 | A1 |
20080097975 | Guay et al. | Apr 2008 | A1 |
20080104054 | Spangler | May 2008 | A1 |
20080126193 | Robinson | May 2008 | A1 |
20080142599 | Benillouche et al. | Jun 2008 | A1 |
20080151092 | Vilcovsky | Jun 2008 | A1 |
20080154710 | Varma | Jun 2008 | A1 |
20080163311 | St. John-Larkin | Jul 2008 | A1 |
20080163379 | Robinson et al. | Jul 2008 | A1 |
20080165032 | Lee | Jul 2008 | A1 |
20080170810 | Wu et al. | Jul 2008 | A1 |
20080176545 | Dicke et al. | Jul 2008 | A1 |
20080177640 | Gokturk et al. | Jul 2008 | A1 |
20080186226 | Ratnakar | Aug 2008 | A1 |
20080194323 | Merkli et al. | Aug 2008 | A1 |
20080201241 | Pecoraro | Aug 2008 | A1 |
20080205755 | Jackson et al. | Aug 2008 | A1 |
20080205764 | Iwai et al. | Aug 2008 | A1 |
20080207357 | Savarese et al. | Aug 2008 | A1 |
20080225123 | Osann et al. | Sep 2008 | A1 |
20080240575 | Panda et al. | Oct 2008 | A1 |
20080255961 | Livesey | Oct 2008 | A1 |
20080268876 | Gelfand et al. | Oct 2008 | A1 |
20080278778 | Saino | Nov 2008 | A1 |
20080285940 | Kulas | Nov 2008 | A1 |
20080288338 | Wiseman et al. | Nov 2008 | A1 |
20080288477 | Kim et al. | Nov 2008 | A1 |
20080318625 | Rofougaran | Dec 2008 | A1 |
20090006208 | Grewal et al. | Jan 2009 | A1 |
20090019487 | Kulas | Jan 2009 | A1 |
20090028435 | Wu et al. | Jan 2009 | A1 |
20090028446 | Wu et al. | Jan 2009 | A1 |
20090083096 | Cao et al. | Mar 2009 | A1 |
20090083134 | Burckart et al. | Mar 2009 | A1 |
20090094260 | Cheng et al. | Apr 2009 | A1 |
20090106127 | Purdy et al. | Apr 2009 | A1 |
20090109240 | Englert et al. | Apr 2009 | A1 |
20090110241 | Takemoto et al. | Apr 2009 | A1 |
20090144624 | Barnes | Jun 2009 | A1 |
20090182810 | Higgins et al. | Jul 2009 | A1 |
20090228342 | Walker et al. | Sep 2009 | A1 |
20090232354 | Camp et al. | Sep 2009 | A1 |
20090235181 | Saliba et al. | Sep 2009 | A1 |
20090235187 | Kim et al. | Sep 2009 | A1 |
20090240735 | Grandhi et al. | Sep 2009 | A1 |
20090245638 | Collier et al. | Oct 2009 | A1 |
20090262137 | Walker et al. | Oct 2009 | A1 |
20090271293 | Parkhurst et al. | Oct 2009 | A1 |
20090287587 | Bloebaum et al. | Nov 2009 | A1 |
20090299824 | Barnes, Jr. | Dec 2009 | A1 |
20090304267 | Tapley et al. | Dec 2009 | A1 |
20090319373 | Barrett | Dec 2009 | A1 |
20090319388 | Yuan et al. | Dec 2009 | A1 |
20090319887 | Waltman et al. | Dec 2009 | A1 |
20090324100 | Kletter et al. | Dec 2009 | A1 |
20090324137 | Stallings et al. | Dec 2009 | A1 |
20090325554 | Reber | Dec 2009 | A1 |
20100015960 | Reber | Jan 2010 | A1 |
20100015961 | Reber | Jan 2010 | A1 |
20100015962 | Reber | Jan 2010 | A1 |
20100026809 | Curry | Feb 2010 | A1 |
20100034469 | Thorpe et al. | Feb 2010 | A1 |
20100037177 | Golsorkhi | Feb 2010 | A1 |
20100045701 | Scott et al. | Feb 2010 | A1 |
20100046842 | Conwell | Feb 2010 | A1 |
20100048290 | Baseley et al. | Feb 2010 | A1 |
20100049663 | Kane et al. | Feb 2010 | A1 |
20100070996 | Liao et al. | Mar 2010 | A1 |
20100082927 | Riou | Apr 2010 | A1 |
20100131714 | Chandrasekaran | May 2010 | A1 |
20100153378 | Sardesai | Jun 2010 | A1 |
20100161605 | Gabrilovich et al. | Jun 2010 | A1 |
20100171758 | Maassel et al. | Jul 2010 | A1 |
20100171999 | Namikata et al. | Jul 2010 | A1 |
20100185529 | Chesnut | Jul 2010 | A1 |
20100188510 | Yoo et al. | Jul 2010 | A1 |
20100198684 | Eraker et al. | Aug 2010 | A1 |
20100211900 | Fujioka | Aug 2010 | A1 |
20100214284 | Rieffel et al. | Aug 2010 | A1 |
20100235259 | Farraro et al. | Sep 2010 | A1 |
20100241650 | Chittar | Sep 2010 | A1 |
20100257024 | Holmes et al. | Oct 2010 | A1 |
20100260426 | Huang et al. | Oct 2010 | A1 |
20100281417 | Yolleck et al. | Nov 2010 | A1 |
20100283630 | Alonso et al. | Nov 2010 | A1 |
20100287511 | Meier et al. | Nov 2010 | A1 |
20100289817 | Meier et al. | Nov 2010 | A1 |
20100312596 | Saffari et al. | Dec 2010 | A1 |
20100316288 | Ip et al. | Dec 2010 | A1 |
20100332283 | Ng et al. | Dec 2010 | A1 |
20100332304 | Higgins et al. | Dec 2010 | A1 |
20110004517 | Soto et al. | Jan 2011 | A1 |
20110016487 | Chalozin et al. | Jan 2011 | A1 |
20110029334 | Reber | Feb 2011 | A1 |
20110053642 | Lee | Mar 2011 | A1 |
20110055054 | Glasson | Mar 2011 | A1 |
20110061011 | Hoguet | Mar 2011 | A1 |
20110065496 | Gagner et al. | Mar 2011 | A1 |
20110078305 | Varela | Mar 2011 | A1 |
20110084983 | Demaine | Apr 2011 | A1 |
20110090343 | Alt et al. | Apr 2011 | A1 |
20110128300 | Gay et al. | Jun 2011 | A1 |
20110143731 | Ramer et al. | Jun 2011 | A1 |
20110148924 | Tapley et al. | Jun 2011 | A1 |
20110153614 | Solomon | Jun 2011 | A1 |
20110173191 | Tsaparas et al. | Jul 2011 | A1 |
20110184780 | Alderson et al. | Jul 2011 | A1 |
20110187306 | Aarestrup et al. | Aug 2011 | A1 |
20110215138 | Crum | Sep 2011 | A1 |
20110246064 | Nicholson | Oct 2011 | A1 |
20120072233 | Hanlon et al. | Mar 2012 | A1 |
20120084812 | Thompson et al. | Apr 2012 | A1 |
20120099800 | Llano et al. | Apr 2012 | A1 |
20120105475 | Tseng et al. | May 2012 | A1 |
20120113141 | Zimmerman et al. | May 2012 | A1 |
20120120113 | Hueso | May 2012 | A1 |
20120165046 | Rhoads et al. | Jun 2012 | A1 |
20120179716 | Takami | Jul 2012 | A1 |
20120185492 | Israel et al. | Jul 2012 | A1 |
20120192235 | Tapley et al. | Jul 2012 | A1 |
20120195464 | Ahn | Aug 2012 | A1 |
20120197764 | Nuzzi et al. | Aug 2012 | A1 |
20120215612 | Ramer et al. | Aug 2012 | A1 |
20120230581 | Miyashita et al. | Sep 2012 | A1 |
20120284105 | Li | Nov 2012 | A1 |
20120293548 | Perez et al. | Nov 2012 | A1 |
20120308077 | Tseng et al. | Dec 2012 | A1 |
20120327115 | Chhetri et al. | Dec 2012 | A1 |
20130019177 | Schlossberg et al. | Jan 2013 | A1 |
20130050218 | Beaver et al. | Feb 2013 | A1 |
20130073365 | Mccarthy | Mar 2013 | A1 |
20130086029 | Hebert | Apr 2013 | A1 |
20130103306 | Uetake | Apr 2013 | A1 |
20130106910 | Sacco | May 2013 | A1 |
20130116922 | Cai et al. | May 2013 | A1 |
20130144701 | Kulasooriya et al. | Jun 2013 | A1 |
20130170697 | Zises | Jul 2013 | A1 |
20130198002 | Nuzzi et al. | Aug 2013 | A1 |
20130325839 | Goddard et al. | Dec 2013 | A1 |
20140007012 | Govande et al. | Jan 2014 | A1 |
20140063054 | Osterhout et al. | Mar 2014 | A1 |
20140085333 | Pugazhendhi et al. | Mar 2014 | A1 |
20140372449 | Chittar | Dec 2014 | A1 |
20150052171 | Cheung | Feb 2015 | A1 |
20160019723 | Tapley et al. | Jan 2016 | A1 |
20160034944 | Raab et al. | Feb 2016 | A1 |
20160117863 | Pugazhendhi et al. | Apr 2016 | A1 |
20160171305 | Zises | Jun 2016 | A1 |
20160364793 | Sacco | Dec 2016 | A1 |
20170046593 | Tapley et al. | Feb 2017 | A1 |
20170091975 | Zises | Mar 2017 | A1 |
20180189863 | Tapley et al. | Jul 2018 | A1 |
20180336734 | Tapley et al. | Nov 2018 | A1 |
20190050939 | Sacco | Feb 2019 | A1 |
20200193668 | Zises | Jun 2020 | A1 |
20200250741 | Sacco | Aug 2020 | A1 |
Number | Date | Country |
---|---|---|
2012212601 | May 2016 | AU |
2015264850 | Apr 2017 | AU |
1255989 | Jun 2000 | CN |
1750001 | Mar 2006 | CN |
1802586 | Jul 2006 | CN |
101515195 | Aug 2009 | CN |
101515198 | Aug 2009 | CN |
101520904 | Sep 2009 | CN |
101541012 | Sep 2009 | CN |
101764973 | Jun 2010 | CN |
101772779 | Jul 2010 | CN |
101893935 | Nov 2010 | CN |
102084391 | Jun 2011 | CN |
102156810 | Aug 2011 | CN |
102194007 | Sep 2011 | CN |
102667913 | Sep 2012 | CN |
103443817 | Dec 2013 | CN |
104081379 | Oct 2014 | CN |
104656901 | May 2015 | CN |
105787764 | Jul 2016 | CN |
1365358 | Nov 2003 | EP |
1710717 | Oct 2006 | EP |
2015244 | Jan 2009 | EP |
2034433 | Mar 2009 | EP |
2418275 | Mar 2006 | GB |
11-191118 | Jul 1999 | JP |
2001-283079 | Oct 2001 | JP |
2001-309323 | Nov 2001 | JP |
2001-344479 | Dec 2001 | JP |
2002-99826 | Apr 2002 | JP |
2002-183542 | Jun 2002 | JP |
2002-207781 | Jul 2002 | JP |
2002-318926 | Oct 2002 | JP |
2003-22395 | Jan 2003 | JP |
2004-318359 | Nov 2004 | JP |
2004-326229 | Nov 2004 | JP |
2005-337966 | Dec 2005 | JP |
2006-209658 | Aug 2006 | JP |
2006-244329 | Sep 2006 | JP |
2006-351024 | Dec 2006 | JP |
2007-172605 | Jul 2007 | JP |
2008-191751 | Aug 2008 | JP |
2009-545019 | Dec 2009 | JP |
2010-39908 | Feb 2010 | JP |
2010-141371 | Jun 2010 | JP |
2010-524110 | Jul 2010 | JP |
2011-209934 | Oct 2011 | JP |
2012-529685 | Nov 2012 | JP |
10-2006-0056369 | May 2006 | KR |
10-2006-0126717 | Dec 2006 | KR |
10-2007-0014532 | Feb 2007 | KR |
10-0805607 | Feb 2008 | KR |
10-0856585 | Sep 2008 | KR |
10-2009-0056792 | Jun 2009 | KR |
10-2009-0070900 | Jul 2009 | KR |
10-2010-0067921 | Jun 2010 | KR |
10-2010-0071559 | Jun 2010 | KR |
10-2011-0082690 | Jul 2011 | KR |
1999044153 | Sep 1999 | WO |
2005072157 | Aug 2005 | WO |
2005072157 | Feb 2007 | WO |
2008003966 | Jan 2008 | WO |
2008015571 | Feb 2008 | WO |
2008051538 | May 2008 | WO |
2009111047 | Sep 2009 | WO |
2009111047 | Dec 2009 | WO |
2010084585 | Jul 2010 | WO |
2010141939 | Dec 2010 | WO |
2011070871 | Jun 2011 | WO |
2011087797 | Jul 2011 | WO |
2011087797 | Oct 2011 | WO |
2012106096 | Aug 2012 | WO |
2013063299 | May 2013 | WO |
2013101903 | Jul 2013 | WO |
2013101903 | Jun 2014 | WO |
Entry |
---|
Non-Final Office Action received for U.S. Appl. No. 13/340,141, dated Apr. 9, 2015, 13 pages. |
Non-Final Office Action received for U.S. Appl. No. 13/340,141, dated Aug. 29, 2013, 15 pages. |
Non-Final Office Action received for U.S. Appl. No. 13/340,141, dated Jun. 5, 2014, 18 pages. |
Notice of Allowance received for U.S. Appl. No. 13/340,141, dated Sep. 10, 2015, 9 pages. |
Appeal Decision received for Korean Patent Application No. 10-2012-7019181, mailed on Jan. 29, 2016, 36 pages. (16 pages of official copy and 20 pages of English translation). |
Corrected Notice of Allowability received for U.S. Appl. No. 14/868,105, dated Oct. 11, 2018, 2 pages. |
Corrected Notice of Allowance received for U.S. Appl. No. 14/868,105, dated Sep. 21, 2018, 9 pages. |
Final Office Action received for U.S. Appl. No. 14/868,105, dated Apr. 12, 2017, 22 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/868,105, dated Dec. 12, 2016, 24 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/868,105, dated Nov. 14, 2017, 14 pages. |
Notice of Allowance received for U.S. Appl. No. 14/868,105, dated May 21, 2018, 15 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/963,706, dated Jul. 5, 2016, 5 pages. |
Notice of Allowance received for U.S. Appl. No. 14/963,706, dated Aug. 18, 2016, 7 pages. |
Corrected Notice of Allowability received for U.S. Appl. No. 15/250,588, dated Oct. 19, 2018, 2 pages. |
Corrected Notice of Allowability received for U.S. Appl. No. 15/250,588, dated Sep. 26, 2018, 2 pages. |
Corrected Notice of Allowability received for U.S. Appl. No. 15/250,588, dated Sep. 20, 2018, 8 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/250,588, dated Sep. 22, 2017, 16 pages. |
Notice of Allowance received for U.S. Appl. No. 15/250,588, dated Jul. 13, 2018, 9 pages. |
Notice of Allowance Received for U.S. Appl. No. 15/250,588 dated Mar. 21, 2018, 10 pages. |
Advisory action received for U.S. Appl. No. 15/377,651 , dated Jul. 30, 2019, 2 pages. |
Final Office Action received for U.S. Appl. No. 15/377,651, dated May 15, 2019, 16 pages. |
First Action Interview-Pre Interview Communication received for U.S. Appl. No. 15/377,651, dated Mar. 19, 2019, 5 pages. |
Non-Final Office Action Received for U.S. Appl. No. 15/377,651, dated Aug. 15, 2019, 16 pages. |
Notice Of Allowance received for U.S. Appl. No. 15/377,651, dated Nov. 29, 2019, 7 pages. |
Restriction Requirement Received for U.S. Appl. No. 15/377,651 dated Dec. 28, 2018, 6 pages. |
Supplemental Notice of Allowability Received for U.S. Appl. No. 15/377,651, dated Feb. 26, 2020, 4 pages. |
Supplemental Notice of Allowability Received for U.S. Appl. No. 15/377,651, dated Jan. 28, 2020, 4 pages. |
Final Office Action received for U.S. Appl. No. 16/046,434, dated Jan. 17, 2020, 24 pages. |
Non-Final Office Action Received for U.S. Appl. No. 16/046,434, dated Aug. 21, 2019, 23 Pages. |
Corrected Notice Of Allowability received for U.S. Appl. No. 16/162,153, dated Feb. 21, 2020, 2 pages. |
Corrected Notice Of Allowability Received for U.S. Appl. No. 16/162,153, dated Mar. 30, 2020, 2 pages. |
Non-Final Office Action Received for U.S. Appl. No. 16/162,153, dated Aug. 16, 2019, 25 pages. |
Notice Of Allowance received for U.S. Appl. No. 16/162,153, dated Dec. 11, 2019, 9 pages. |
Non Final Office Action Received for U.S. Appl. No. 16/803,468, dated Apr. 26, 2021, 17 pages. |
Office Action received for Korean Patent Application No. 10-2012-7019181, dated Jun. 26, 2014, 5 pages. |
Extended European Search Report Received for European Patent Application No. 19184977.7 dated Sep. 26, 2019, 10 pages. |
Notice of Allowance received for Canada Patent Application No. 2,850,074, dated Jul. 31, 2018, 1 page. |
Office Action received for Canadian Patent Application No. 2,850,074, dated Nov. 28, 2016, 11 pages. |
Office Action received for Canadian Patent Application No. 2,850,074, dated Sep. 29, 2015, 6 pages. |
Office Action received for Canadian. Patent Application No. 2,850,074, dated Oct. 23, 2017, 6 Pages. |
Office Action received for Canadian Patent Application No. 2,856,869, dated Oct. 14, 2015, 3 pages. |
Office Action received for Chinese Patent Application No. 201080059424.5, dated Apr. 21, 2014, 18 Pages. |
First Examiner Report Received for Australian Patent Application No. 2012328754 dated Mar. 30, 2015, 3 pages. |
First Examiner Report Received for Australian Patent Application No. 2012362467 dated Mar. 24, 2015, 3 pages. |
Decision of Reexamination received for Chinese Patent Application No. 201280052967.3, dated Jan. 16, 2019, 18 pages. |
Decision of Rejection received for Chinese Patent Application No. 201280052967.3, dated Aug. 4, 2017, 21 pages (11 pages of Official copy and 10 pages of English Translation). |
Office Action received for Chinese Patent Application No. 201280052967.3, dated Aug. 24, 2016, 21 pages (10 pages of English Translation and 11 pages of Official Copy). |
Office Action received for Chinese Patent Application No. 201280052967.3, dated Mar. 2, 2016, 18 pages (9 pages of English Translation and 9 pages of Official copy). |
Office Action received for Chinese Patent Application No. 201280052967.3, dated Mar. 23, 2017, 22 pages (11 pages of Official copy and 11 pages of English Translation). |
Reexamination Notification received for Chinese Patent Application No. 201280052967.3, dated Aug. 23, 2018, 20 pages (12 pages of official copy and 8 pages of English translation). |
Final Office Action received for U.S. Appl. No. 16/803,468, dated Oct. 19, 2021, 14 pages. |
Non Final Office Action received for U.S. Appl. No. 16/803,468, dated Jan. 6, 2022, 16 pages. |
Corrected Notice of Allowability Received for U.S. Appl. No. 16/852,972, dated May 17, 2021, 2 Pages. |
Non Final Office Action Received for U.S. Appl. No. 16/852,972, dated Feb. 8, 2021, 14 Pages. |
Notice of Allowance Received for U.S. Appl. No. 16/852,972, dated May 6, 2021, 8 pages. |
Communication under Rule 71(3) received for European Patent Application No. 19184977.7, dated Apr. 14, 2021, 34 pages. |
Corrected Notice of Allowability Received For U.S. Appl. No. 14/868,105, dated Jan. 14, 2019, 2 pages. |
Office Action received for Korean Patent Application No. 10-2016-7025254, dated May 2, 2017, 7 pages (3 pages of English Translation and 4 pages of Official Copy). |
Office Action received for Korean Patent Application No. 10-2016-7025254, dated Oct. 13, 2016, 11 pages (6 pages of English Translation and 5 pages of Official copy). |
Definition of Homogeneous Coordinates, Retrieved from the Internet URL: <https://web.archive.org/web/20110305185824/http://en.wikipedia.org/wiki/Homogeneous_coordinates>, Wikipedia on Mar. 5, 2011 via Internet Archive WayBackMachine, [Online], 10 pages. |
Draw something, Retrieved from the Internet URL: <http://omgpop.com/drawsomething>, Accessed on Feb. 16, 2018, 2 pages. |
MLB At Bat 11, Retrieved from the Internet: <URL: http://texas.rangers.mlb.com/mobile/atbat/?c_id=tex>, Accessed on Apr. 19, 2018, 3 pages. |
SnapTell: Technology, Retrieved from the Internet: <URL: http://web.archive.org/web/20071117023817/http://www.snaptell.com/technology/index.htm>, Nov. 17, 2007, 1 page. |
The ESP Game, Retrieved from the Internet: <URL: http://www.espgame.org/instructions.html>, Accessed on Nov. 13, 2007, 2 pages. |
Notice of Allowance Received for Korean Patent Application No. 10-2014-7004160, dated Jun. 15, 2016, 3 pages (2 pages of Official Copy and 1 page of English Translation). |
Office Action received for Korean Patent Application No. 10-2014-7004160, dated Mar. 2, 2016, 7 pages (2 pages of English Translation and 5 pages of Official Copy). |
Final Office Action received for Korean Patent Application No. 10-2014-7014116, dated Jan. 29, 2016, 6 pages (2 pages of English Translation and 4 pages of Official Copy). |
Notice of Allowance received for Korean Patent Application No. 10-2014-7014116, dated Jun. 10, 2016, 8 pages (2 pages of Official Copy and 6 pages of English Translation). |
Office Action received for Korean Patent Application No. 10-2014-7014116, dated Jun. 26, 2015, 13 pages (6 pages of English Translation and 7 pages of Official Copy). |
Final Office Action received for Korean Patent Application No. 10-2016-7024912, dated Jun. 16, 2017, 7 pages (3 pages of English Translation and 4 pages of Official Copy). |
Final Office Action received for Korean Patent Application No. 10-2016-7024912, dated Oct. 25, 2017, 7 pages (3 pages of English Translation and 4 pages of Official Copy). |
Office Action received for Korean Patent Application No. 10-2016-7024912, dated Dec. 7, 2016, 11 pages (5 pages of English Translation and 6 pages of Official copy). |
Notice of Allowance Received for Korean Patent Application No. 10-2016-7025254, dated Mar. 9, 2018, 5 pages (2 pages of English Translation and 3 pages of Official Copy). |
Office Action received for Korean Patent Application No. 10-2012-7019181, dated Feb. 23, 2016, 12 pages (5 pages of English Translation and 7 pages of Official Copy). |
Ye et al., “Jersey number detection in sports video for athlete identification”, Visual Communications and Image Processing 2005, vol. 5960, International Society for Optics and Photonics, Jul. 2005, 9 pages. |
Office Action received for Korean Patent Application No. 10-2016-7025254, dated Sep. 5, 2017, 12 pages (5 pages of English Translation and 7 pages of Official Copy). |
Final Office Action received for Korean Patent Application No. 10-2017-7036972, dated Mar. 18, 2019, 7 pages (3 pages of English Translation and 4 pages of Official Copy). |
Final Office Action received for Korean Patent Application No. 10-2017-7036972, dated Dec. 26, 2018, 6 pages (2 pages of English Translation and 4 pages of Official Copy). |
Office Action received for Korean Patent Application No. 10-2017-7036972, dated Jan. 30, 2018, 8 pages (4 pages of English Translation and 4 pages of Official copy). |
Final Office Action Received for Korean Patent Application No. 10-2019-7017324, dated Mar. 26, 2020, 7 pages (4 pages of official copy and 3 pages of English translation). |
Office Action Received for Korean Patent Application No. 10-2019-7017324, dated Sep. 16, 2019, 9 pages (7 pages of Official copy and 2 pages of English Translation). |
Final Office Action received for Korean Patent Application No. 10-2020-7025366, dated Feb. 17, 2021, 8 Pages (4 pages of official Copy and 4 pages of English Translation). |
Office Action Received for Korean Patent Application No. 10-2020-7025366, dated Sep. 16, 2020, 11 pages (6 pages of official copy and 5 pages of English translation). |
Communication Pursuant to Article 94(3) EPC received for European Patent Application No. 10803429.9, dated Aug. 30, 2018, 6 pages. |
Communication Pursuant To Article 94(3) EPC received for European Patent Application No. 10803429.9, dated Feb. 16, 2018, 8 pages. |
Communication under Rule 71(3) received for European Patent Application No. 10803429.9, dated Jun. 6, 2019, 7 pages. |
Extended European Search report received for European Patent Application No. 10803429.9, dated Jun. 17, 2015, 7 pages. |
Office Action received for Korean Patent Application No. 10-2012-7019181, dated Nov. 18, 2013, 11 Pages. |
Final Office Action received for U.S. Appl. No. 12/644,957, dated Aug. 26, 2013, 19 pages. |
Final Office Action received for U.S. Appl. No. 12/644,957, dated Jul. 11, 2014, 25 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/644,957, dated Dec. 29, 2014, 20 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/644,957, dated Mar. 7, 2014, 21 pages. |
Non-Final Office Action received for U.S. Appl. No. 12/644,957, dated Mar. 18, 2013, 17 pages. |
Notice of Allowance received for U.S. Appl. No. 12/644,957, dated Jun. 17, 2015, 21 pages. |
Communication Pursuant to Article 94(3) EPC received for European Patent Application No. 12843046.9, dated Dec. 1, 2016, 8 pages. |
Extended European Search report received for European Patent Application No. 12843046.9, dated Mar. 5, 2015, 7 pages. |
Office Action received for Korean Patent Application No. 10-2012-7019181, dated Nov. 3, 2014, 7 Pages. |
Summons to Attend Oral Proceedings received for European Patent Application No. 12843046.9, dated Jun. 21, 2018, 12 pages. |
Communication Pursuant to Article 94(3) EPC received for European Patent Application No. 12862340.2, dated Dec. 21, 2016, 4 pages. |
Extended European Search Report received for European Patent Application No. 12862340.2, dated Dec. 21, 2015, 4 pages. |
Summons to Attend Oral Proceedings received for European Patent Application No. 12862340.2, mailed on Mar. 14, 2017, 5 pages. |
Final Office Action received for U.S. Appl. No. 13/283,416, dated Aug. 7, 2015, 25 pages. |
Final Office Action received for U.S. Appl. No. 13/283,416, dated Nov. 25, 2014, 26 pages. |
Non Final Office Action received for U.S. Appl. No. 13/283,416, dated Apr. 2, 2015, 31 pages. |
Non Final Office Action received for U.S. Appl. No. 13/283,416, dated Feb. 2, 2016, 32 pages. |
Non Final Office Action received for U.S. Appl. No. 13/283,416, dated Jul. 10, 2014, 29 pages. |
Notice of Allowance received for U.S. Appl. No. 13/283,416, dated May 26, 2016, 9 pages. |
Final Office Action received for U.S. Appl. No. 13/340,141, dated Feb. 6, 2014, 19 pages. |
Final Office Action received for U.S. Appl. No. 13/340,141, dated Sep. 26, 2014, 22 pages. |
Office Action received for Japanese Patent Application No. 2014-539013, dated Aug. 11, 2015, 7 pages (4 pages of English Translation and 3 pages of Official copy). |
Office Action received for Japanese Patent Application No. 2014-539013, dated Dec. 6, 2016, 5 pages (2 pages of English Translation and 3 pages of Official copy). |
Office Action received for Japanese Patent Application No. 2014-539013, dated May 31, 2016, 4 pages (2 pages of Official Copy and 2 pages of English Translation). |
Office Action received for Chinese Patent Application No. 201510088798.4, dated Mar. 17, 2017, 23 pages (14 pages of English Translation and 9 pages of Official Copy). |
First Examiner Report received for Australian Patent Application No. 2015264850, dated Dec. 19, 2016, 2 pages. |
Notice of Acceptance received for Australian Patent Application No. 2015264850, dated Apr. 13, 2017, 3 pages. |
First Examiner Report received for Australian Patent Application No. 2015271902, dated May 22, 2017, 3 pages. |
Notice of Decision to Grant received for Japanese Patent Application No. 2017-075846, dated Aug. 7, 2018, 6 pages (3 pages of English Translation and 3 pages of Official Copy). |
Office Action received for Japanese Patent Application No. 2017-075846, dated Mar. 20, 2018, 16 pages (9 pages of English Translation and 7 pages of Official copy). |
Office Action Received for Japanese Patent Application No. 2018-166862, dated Jan. 21, 2020, 6 pages (3 pages of Official Copy and 3 pages of English Translation). |
Office Action Received for Japanese Patent Application No. 2018-166862, dated Oct. 1, 2019, 6 pages (3 pages of English Translation and 3 pages of Official Copy). |
U.S. Appl. No. 61/033,940, “Image Recognition as a Service”, filed Mar. 5, 2008, 56 pages. |
U.S. Appl. No. 61/447,962, “Shopping Experience Based On Concurrently Viewed Content”, filed Mar. 1, 2011, 53 pages. |
Ahn et al., “Labeling Images with a Computer Game”, Retrieved from the Internet URL: <http://ael.gatech.edu/cs6452f13/files/2013/08/labeling-images.pdf>, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2004, 8 pages. |
Appelman, “Product Description For Fangraphs Baseball, An iPhone/iPad App”, Sep. 26, 2009, 3 pages. |
Araki et al., “Follow-The-Trial-Fitter: Real-Time Dressing without Undressing”, Retrieved from the Internet URL: <https://dialog.proquest.com/professional/printviewfile?accountId=142257>, 3rd International Conference on Digital Information Management, ICDIM 2008:33-38, IEEE Computer Society, Dec. 1, 2008, 8 pages. |
Duke University, “How to Write Advertisements that Sell”, System, the Magazine of Business, Retrieved from the Internet URL: <https://babel.hathitrust.org/cgi/pt?id=dul1.ark:/13960/t25b0d88r;view=1up;seq=5>, 1912, 66 pages. |
Gonsalves, “Amazon Launches Experimental Mobile Shopping Feature”, Retrieved from the Internet: <URL: http://www.informationweek.com/news/internet/retail/showArticle.jhtml?articleID=212201750&subSection=News>, Dec. 3, 2008, 1 page. |
Google Play, “AgingBooth”, Retrieved from the Internet URL: <https://play.google.com/store/apps/details?id=com.piviandco.agingbooth&hl=en_IN>, Jan. 7, 2019, 4 pages. |
Kan et al., “Applying QR Code in Augmented Reality Applications”, VRCAI, Dec. 15, 2009, pp. 253-258. |
Kraft, “Real Time Baseball Augmented Reality”, Retrieved from the Internet URL: <http://dx.doi.org/10.7936/K7HH6H84>, Washington University in St. Louis, Dec. 6, 2011, 11 pages. |
Madeleine, “Terminator 3 Rise of Jesus! Deutsch”, Retrieved from the Internet URL: <https://www.youtube.com/watch?v=Oj3o7HFcgzE>, Jun. 12, 2012, 2 pages. |
Mello Jr., “Pongr Giving Cell Phone Users Way to Make Money”, Retrieved from the Internet URL: <https://www.pcworld.com/article/240209/pongr_giving_cell_phone_users_way_to_make_money.html>, Sep. 18, 2011, 4 pages. |
MobiTV, “MobiTV”, Retrieved from the Internet: <URL: http://www.mobitv.com/>, Accessed on Mar. 30, 2015, 1 page. |
Mulloni et al., “Handheld Augmented Reality Indoor Navigation with Activity-Based Instructions”, Proceedings of the 13th International Conference on Human Computer Interaction with Mobile Devices and Services, Aug. 30-Sep. 2, 2011, 10 pages. |
Newby, “Facebook, Politico to measure sentiment of GOP candidates by collecting posts”, 2006-2012 Clarity Digital Group LLC d/b/a Examiner.com, Jun. 28, 2012, 3 pages. |
OccipitalHQ, “RedLaser 2.0: Realtime iPhone UPC barcode scanning”, Available online at URL: <https://www.youtube.com/watch?v=9_hFGsmx_6k>, Jun. 16, 2009, 2 pages. |
Parker, “Algorithms for Image Processing and Computer Vision”, Wiley Computer Publishing, 1997, pp. 23-29. |
Patterson, “Amazon iPhone App Takes Snapshots, Looks for a Match”, Retrieved from the Internet: <URL: http://tech.yahoo.com/blogs/patterson/30983>, Dec. 3, 2008, 3 pages. |
International Preliminary Report on Patentability received for PCT Application No. PCT/US2010/061628, dated Jul. 5, 2012, 6 pages. |
International Search Report received for PCT Application No. PCT/US2010/061628, dated Aug. 12, 2011, 2 pages. |
Written Opinion received for PCT Application No. PCT/US2010/061628, dated Aug. 12, 2011, 4 pages. |
International Preliminary Report on Patentability received for PCT Application No. PCT/US2012/061966, dated May 8, 2014, 6 pages. |
International Search Report received for PCT Patent Application No. PCT/US2012/061966, dated Jan. 18, 2013, 2 pages. |
Written Opinion received for PCT Patent Application No. PCT/US2012/061966, dated Jan. 18, 2013, 4 pages. |
International Preliminary Report on Patentability received for PCT Application No. PCT/US2012/071770, dated Jul. 10, 2014, 7 pages. |
International Search Report received for PCT Patent Application No. PCT/US2012/071770, dated May 13, 2013, 2 pages. |
Written Opinion received for PCT Patent Application No. PCT/US2012/071770, dated May 13, 2013, 5 pages. |
Politicology, “Facebook Gives Politico Access To Your Political Beliefs”, Ology, Retrieved from the Internet URL: <http://www.ology.com/post/51413/facebook-gives-politico-access-to-your-political-beliefs>, Accessed on Jun. 28, 2012, 4 pages. |
Redlaser, “Redlaser—Impossibly Accurate Barcode Scanning”, Retrieved from the Internet URL: <http://redlaser.com/index.php>, Jul. 8, 2011, pp. 1-2. |
Sifry, “Politico—Facebook Sentiment Analysis Will Generate “Bogus” Results, Expert Says”, Retrieved from the Internet: <http://techpresident.com/news/21618/politico-facebook-sentiment-analysis-bogus>, Jan. 13, 2012, 4 pages. |
Slingbox, “Sling Media, Inc.”, Retrieved from the Internet: <URL: http://www.slingbox.com/>, Accessed on Mar. 30, 2015, 4 pages. |
Terada, “New Cell Phone Services Tap Image-Recognition Technologies”, Retrieved from the Internet: <URL: http://search.japantimes.co.jp/cgi-bin/nb20070626a1.html>, Jun. 26, 2007, pp. 1-3. |
Troaca, “S60 Camera Phones Get Image Recognition Technology”, Retrieved from the Internet: <URL: http://news.softpedia.com/news/S60-Camera-Phones-Get-Image-Recognition-Technology-79666.shtml>, Feb. 27, 2008, pp. 1-2. |
Vassilios et al., “Archeoguide: An Augmented Reality Guide for Archaeological Sites”, IEEE Computer Graphics and Applications, vol. 22, No. 5, Sep./Oct. 2002, pp. 52-60. |
Vlahakis et al., “Archeoguide: First Results of an Augmented Reality, Mobile Computing System in Cultural Heritage Sites”, Virtual Reality, Archeology, and Cultural Heritage, Jan. 2001, 10 pages. |
Walther et al., “Selective Visual Attention Enables Learning and Recognition of Multiple Objects in Cluttered Scenes”, Jun. 15, 2005, 23 pages. |
Wikipedia, “Polar Coordinate System”, Wikipedia on Oct. 11, 2011 via Internet Archive WayBackMachine, [Online], Retrieved from the Internet: <https://web.archive.org/web/20111008005218/http://en.wikipedia.org/wiki/Polar_coordinate_system>, Oct. 8, 2011, 12 pages. |
Prior Publication Data
Number | Date | Country
---|---|---
20210334891 A1 | Oct. 2021 | US
Related U.S. Application Data
Relation | Number | Date | Country
---|---|---|---
Parent | 16852972 | Apr. 2020 | US
Child | 17368197 | | US
Parent | 16162153 | Oct. 2018 | US
Child | 16852972 | | US
Parent | 15250588 | Aug. 2016 | US
Child | 16162153 | | US
Parent | 13283416 | Oct. 2011 | US
Child | 15250588 | | US