The field of the invention is transaction infrastructure technologies.
The background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
Transaction systems provide primitive solutions for the ever-growing world of on-line transactions. Typically, existing transaction systems provide a single provider the ability to conduct a transaction with a single user. However, they lack the ability to offer providers (e.g., of services, goods, etc.) or consumers a system that can reconcile aspects of a transaction among multiple provider accounts or user accounts.
For example, Apple® EasyPay allows a user to make merchant (i.e., Apple®) specific payments in its retail stores in a closed loop system through the user's iTunes® account, thereby eliminating the need to wait in line at a physical counter; see URL: www.javelinstrategy.com/blog/2011/11/15/apple-iphone-easypay-mobile-payment-rollout-may-delay-nfc/. Another example includes Zoosh®, which uses ultrasound to perform certain near field transactions via two devices; see URL: venturebeat.com/2011/06/19/narattes-zoosh-enables-nfc-with-just-a-speaker-and-microphone/.
As another example, U.S. Patent Application Publication No. 2012/0252359 to Adams et al. teaches a mobile device having a motion sensing device and a processor configured to, among other things, recognize a movement pattern and, based upon that movement, determine a payment account from which access information for the payment account is sent to a transaction terminal via an NFC device of the mobile device.
Unfortunately, known existing transaction systems apparently fail to reconcile aspects of a transaction among multiple provider accounts or user accounts. Moreover, known existing transaction systems apparently fail to reconcile aspects of a transaction based at least in part on derived object attributes. Thus, there is still a need for transaction systems capable of reconciling aspects of a transaction among multiple provider or user accounts.
The inventive subject matter provides apparatus, systems and methods in which one can conduct a transaction involving multiple accounts (e.g., provider accounts or user accounts) that are involved with a payment (e.g., of cash, coupon, discount, loyalty points, etc.) or any other suitable transaction associated with an item (e.g., good, service, subscription, etc.). As used herein, the term “provider” is used very broadly to include providers of a service or good, whether real or virtual. One aspect of the inventive subject matter is considered to include a transaction system comprising a recognition engine, a transaction engine, and possibly one or more device engagement engines. The recognition engine can obtain a digital representation of a real-world object and derive attributes associated with the object (e.g., time, location, color, shape, model, cost, price, available coupons, messages, metadata, etc.) based on recognizing the object. A transaction engine can communicate with the recognition engine and can reconcile a transaction among multiple provider accounts or user accounts as a function of one or more of the derived object attributes. In some preferred embodiments, the transaction involves making a payment to one or more of the provider accounts or user accounts where the payment relates to an item. In some embodiments, an engagement engine can cause a computing device (e.g., cell phone, set-top box, kiosk, game console or system, etc.) to take action based on the transaction. Actions will be discussed in greater detail below.
In one aspect of the inventive subject matter, the transaction engine can comprise a memory storing one or more reconciliation matrices. In such embodiments, the transaction reconciliation engine can be configured to associate one or more object attributes (derived by the object recognition engine) with a recognition signature. The recognition signature can be stored in a signature database communicatively coupled to the transaction reconciliation engine and object recognition engine.
It is contemplated that a “recognition signature” can comprise a collection of values associated with certain attributes within an interaction space, which can either be manually generated by a user, provider or third party, or constructed based on historical actions of one or more users. Where a derived object attribute or attributes meet a set of criteria associated with the signature, the object attribute(s) can be associated with the signature. The criteria can comprise mandatory requirements (e.g., the presence of an attribute, a threshold, the presence of a first attribute and a second attribute, etc.) or optional conditions (e.g., the presence of a first attribute or a second attribute, etc.).
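By way of a non-limiting illustration, the following Python sketch shows one possible way such mandatory and optional criteria could be evaluated in software; the function name, criterion lambdas, and concrete attribute values are hypothetical assumptions rather than required elements of the disclosed systems.

```python
# Illustrative sketch only: testing whether derived object attributes satisfy
# a recognition signature's mandatory requirements and optional conditions.
# All names (matches_signature, the example criteria) are hypothetical.

def matches_signature(attributes, mandatory, optional, min_optional=0):
    """Return True when every mandatory criterion passes and at least
    `min_optional` optional criteria pass."""
    if not all(check(attributes) for check in mandatory):
        return False
    passed = sum(1 for check in optional if check(attributes))
    return passed >= min_optional

# Example criteria: presence of an attribute, a numeric threshold, and an
# either/or condition, mirroring the kinds of criteria described above.
mandatory = [
    lambda a: "location" in a,              # an attribute must be present
    lambda a: a.get("price", 0) <= 100.0,   # a threshold must be met
]
optional = [
    lambda a: "coupon_id" in a or "promo_code" in a,  # first OR second attribute
]

attrs = {"location": "store-17", "price": 49.99, "coupon_id": "ABC123"}
print(matches_signature(attrs, mandatory, optional, min_optional=1))  # True
```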
The transaction reconciliation engine can also be configured to map a recognition signature to a plurality of transaction accounts via one or more reconciliation matrices in a memory.
As used herein, the term “reconciliation matrix” can comprise an algorithm, a lookup table, or any other suitable rules set or data structure that can be utilized in mapping a recognition signature to one or more accounts.
Where, inter alia, the reconciliation matrix is utilized in mapping a recognition signature to multiple accounts, it is contemplated that the transaction reconciliation engine can be configured to reconcile a transaction among the multiple accounts.
From a methods perspective, a transaction can be initiated via a mobile device or other computing device. A mobile device can obtain a digital representation of a real-world object via a sensor. The digital representation can be transmitted by the mobile device to an object recognition engine configured to recognize the real world object and derive one or more attributes of the object. The mobile device can cause various actions to be performed by a transaction reconciliation engine, which can optionally be installed in the mobile device. Among the actions that can be performed by the transaction reconciliation engine are (a) associating an object attribute with a recognition signature, (b) mapping the signature to one or more transaction accounts via a reconciliation matrix, and (c) reconciling a transaction related to an item associated with the real-world object among the accounts.
Upon completion of the transaction, the mobile device can receive a notification associated with the completed transaction. It is contemplated that a notification can be received or presented to a user in any commercially suitable manner, e.g., via an email, a push notification, etc.
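Purely for illustration, the following sketch outlines how steps (a) through (c) and the closing notification might be orchestrated in code; the engine classes, method names, and placeholder data are assumptions made for this example and do not describe any particular implementation.

```python
# Hypothetical sketch of the method steps described above; all class names,
# method names, and data shapes are illustrative assumptions only.

class RecognitionEngine:
    def derive_attributes(self, digital_representation):
        # In practice this would run recognition on the digital representation;
        # here placeholder attributes are returned.
        return {"object": "movie poster", "location": "mall", "price": 12.00}

class ReconciliationEngine:
    def associate(self, attributes):
        # Step (a): associate derived attributes with a recognition signature.
        return ("poster-signature", tuple(sorted(attributes)))

    def map_to_accounts(self, signature):
        # Step (b): map the signature to transaction accounts via a
        # reconciliation matrix (here a trivial lookup).
        return ["theater-account", "consumer-loyalty-account"]

    def reconcile(self, item, accounts):
        # Step (c): reconcile the transaction among the mapped accounts.
        return {acct: f"updated for {item}" for acct in accounts}

def run_transaction(representation, item):
    recognition = RecognitionEngine()
    reconciliation = ReconciliationEngine()
    attributes = recognition.derive_attributes(representation)
    signature = reconciliation.associate(attributes)
    accounts = reconciliation.map_to_accounts(signature)
    result = reconciliation.reconcile(item, accounts)
    return result  # a notification (e.g., push or email) could follow here

print(run_transaction(b"raw-sensor-bytes", "movie ticket"))
```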
Another aspect of the inventive subject matter includes a method of reconciling payment of a coupon. Contemplated methods include using a recognition engine, possibly deployed on a mobile cell phone or other computing device, to recognize a real-world object related to a purchasable item. The purchasable item could be the same item as the real-world object (e.g., a product on a store shelf) or tangentially related (e.g., a ticket for a movie displayed on a movie poster). The method can further include creating a virtual coupon based on attributes derived from a digital representation of the real-world object where the digital representation could comprise image data, sound data, non-visible data, signals, or other modal types of data. Another step can include activating the virtual coupon upon a triggering event generated by a mobile device, a cell phone for example. The triggering event could be generated automatically, “zero touch” (e.g., via near field communications, recognition of an object, etc.), or manually via one, two or more user interactions with a user interface. With zero touch generation of a triggering event, it is contemplated that a device can be configured to recognize objects located as far as 1, 5, 10, 20, 50, 100 or even 1000 or more feet away. Fewer clicks or user interface interactions can be preferable over a larger number of interactions. The mobile device, or other electronic device involved in the transaction, can reconcile the transaction among two or more electronic accounts relating to the purchase of the purchasable item. For example, a coupon vendor account can be charged according to terms of the virtual coupon, a vendor account can be credited with payment, a consumer's account can be charged according to terms of the virtual coupon, or a consumer's account can be credited according to terms of the transaction or virtual coupon. In some embodiments, the accounts can be third party accounts associated with various loyalty programs possibly participating in a loyalty exchange system where one account currency (e.g., frequent flyer miles, free gasoline points, etc.) can be converted to another (e.g., virtual gold in an on-line game, mobile phone minutes, etc.) according to terms of a virtual coupon.
Yet another aspect of the inventive subject matter can include a transaction apparatus (e.g., cell phone, point of sales device, computer, server, kiosk, game console, etc.) that can include a sensor interface through which the apparatus can acquire a digital representation of a real-world object or scene having the object. Example sensor interfaces could include an actual sensor (e.g., GPS, microphone, camera, accelerometer, bio-metric, etc.), a hardware interface to a sensor platform (e.g., serial interface, parallel interface, network interface, USB, etc.), or even a set of application program interfaces (APIs) capable of making local or remote procedure calls. The transaction apparatus can also include a recognition module communicatively coupled with the sensor interface where the recognition module derives one or more object attributes from the digital representation and uses the object attributes to identify a purchasable item (e.g., the real-world object, a good, a product, a service, a subscription, etc.). Contemplated apparatus can further comprise a virtual coupon generator configured to generate a virtual coupon associated with the purchasable item as a function of the object attributes or other information available (e.g., customer ID, phone number, vendor ID, merchant ID, account numbers, prior consumer transactions, prior consumer scans, prior consumer interactions, etc.). The transaction apparatus can further include a transaction interface configured to electronically engage, directly or indirectly, one or more accounts possibly over a network. The transaction interface allows the apparatus to reconcile an account based on a payment for the purchasable item and based on use or authentication of the virtual coupon. In especially preferred embodiments, a user account is credited for purchasing the item. One especially preferred embodiment includes a mobile phone operating as the transaction apparatus where the virtual coupon is specifically generated according to a location and time.
Still another aspect of the inventive subject matter includes a method of mitigating risk of transaction fraud associated with a coupon. The method includes providing a virtual coupon on a mobile device (e.g., cell phone, vehicle, car, airplane, mobile kiosk, point of sales device, etc.) where the coupon relates to a purchasable item, possibly identified from object attributes derived from a digital representation of a scene. The virtual coupon can be provided via a text message, an email, a pop up, or any other commercially suitable method. The mobile device can further engage in an electronic transaction locally or even over a network with one or more account servers. The device can further allow the user to select at least one target account to be involved in the transaction from multiple accounts available for use with the transaction. The method can further include authenticating the transaction as a function of transaction attributes derived by the mobile device, possibly from the digital representation (e.g., location, time, make, model, consumer ID, device ID, etc.), as a function of the user selected target account, as a function of virtual coupon properties, or other factors. In more preferred embodiments, the method includes crediting the target account according to the terms of the virtual coupon. For example, a user might purchase a television at full price and might receive a virtual coupon from the television manufacturer that allows the user an option to credit a number of miles to a frequent flyer account or to credit their cell phone account.
Various objects, features, aspects and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.
The following discussion provides many example embodiments of the inventive subject matter. Although each embodiment represents a single combination of inventive elements, the inventive subject matter is considered to include all possible combinations of the disclosed elements. Thus if one embodiment comprises elements A, B, and C, and a second embodiment comprises elements B and D, then the inventive subject matter is also considered to include other remaining combinations of A, B, C, or D, even if not explicitly disclosed.
It should be noted that while the following description is drawn to computer/server based transaction/recognition systems, various alternative configurations are also deemed to represent computing devices including servers, interfaces, systems, databases, agents, peers, engines, controllers, or other types of computing devices operating individually or collectively. One should appreciate that the computing devices comprise a processor configured to or programmed to execute software instructions stored on a tangible, non-transitory computer readable storage medium (e.g., hard drive, solid state drive, RAM, flash, ROM, etc.). The software instructions preferably configure the computing device to provide the roles, responsibilities, or other functionality as discussed below with respect to the disclosed apparatus. In especially preferred embodiments, the various servers, systems, databases, or interfaces exchange data using standardized protocols or algorithms, possibly based on HTTP, HTTPS, AES, public-private key exchanges, web service APIs, known financial transaction protocols, or other electronic information exchanging methods. Data exchanges preferably are conducted over a packet-switched network, the Internet, LAN, WAN, VPN, or other type of packet switched network.
One should appreciate that the disclosed techniques provide many advantageous technical effects including providing an infrastructure capable of generating networked signals that cause remote devices to engage in reconciling accounts among one or more provider or user accounts.
As used herein, and unless the context dictates otherwise, the term “coupled to” is intended to include both direct coupling (in which two elements that are coupled to each other contact each other) and indirect coupling (in which at least one additional element is located between the two elements). Therefore, the terms “coupled to” and “coupled with” are used synonymously. Within this document, “coupled with” can also mean “communicatively coupled with”, possibly over a network.
As used in the description herein and throughout the claims that follow, the meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise.
Unless the context dictates the contrary, all ranges set forth herein should be interpreted as being inclusive of their endpoints, and open-ended ranges should be interpreted to include commercially practical values. Similarly, all lists of values should be considered as inclusive of intermediate values unless the context indicates the contrary.
It is contemplated that the inventive subject matter described herein can leverage one or more techniques, including object recognition techniques, disclosed in the following co-owned pending applications: U.S. provisional application No. 60/246,295 filed Nov. 6, 2000; U.S. provisional application No. 60/317,521 filed Sep. 5, 2001; U.S. application Ser. No. 11/510,009 titled “Interactivity Via Mobile Image Recognition” filed on Aug. 25, 2006; U.S. application Ser. No. 12/505,726 titled “Interactivity with a Mixed Reality” filed on Jul. 20, 2009; U.S. application Ser. No. 13/005,716 titled “Data Capture and Identification System and Process” filed on Jan. 13, 2011; U.S. application Ser. No. 13/037,317 titled “Image Capture and Identification System and Process” filed on Feb. 28, 2011; and U.S. application Ser. No. 13/037,330 titled “Object Information Derived from Object Images” filed on Feb. 28, 2011. For example, the inventive subject matter can utilize one or more of the techniques for presenting information such as Internet content to a user described in U.S. application Ser. No. 13/005,726.
Object recognition engine 110 is configured to recognize a real-world object and derive object attributes of the object. The “real-world object” can comprise any existing still or moving object, including for example, a poster, a magazine, a brochure, a billboard sign, a product, a purchasable product, a vehicle, a storefront, a food item, a logo, a trademark, an image on a screen (e.g., television or computer screen, etc.), a person providing a service, a person, an image of a good or service (e.g., an image in printed media, non-printed media, an image of an animated object, an image of an inanimate object, etc.), a street sign, a 2D object, a 3D object, a time-varying object (i.e., 4D object), or a service embodied in a tangible medium. In some preferred embodiments, the real-world object is related to a purchasable product or service. The object attributes can comprise any quality or feature of an object, including for example, an absolute time, a relative time, a response time, a location, a relative position or orientation, a color, a size, a quantity, a quality, a material, an availability, a depreciation rate, an appreciation rate, a supply relative to a demand, a shape, a make, a model, a cost, an available coupon, a message, metadata, and a price. Additional details regarding object recognition engines will be provided in the description of
Transaction reconciliation engine 120 can be communicatively coupled to object recognition engine 110 and configured to reconcile a transaction amongst two or more provider accounts or user accounts as a function of the attributes derived by object recognition engine 110. The transaction can include, among other things, a purchase, a download, an order, a pre-order, a reservation, a down payment, a pre-payment, a partial pre-payment, reconciling a coupon among at least one of a provider account associated with the transaction and a user account, or any other suitable transaction. Reconciling a coupon can include, among other things, at least one of the following steps: (1) identification of a coupon to be generated; (2) generation of a coupon; (3) providing a coupon to a user; (4) a user receiving a coupon; (5) a user presenting the coupon to a provider or point-of-sale terminal (e.g., via a text, email, a closeness of a device to a point of sale device, a presenting of a displayed virtual coupon, a presenting of a code or password associated with the virtual coupon, etc.); and (6) reconciling at least one account (e.g., user, provider, etc.) via a credit, debit, update, notification, etc.
In some embodiments, the transaction can involve some sort of payment to at least two different accounts, for the purpose of obtaining a good, product, service, or subscription. Moreover, the transaction can be completed in no more than 10, 5, 3, or even 2 or fewer physical interactions. For example, it is contemplated that an end user can capture an image of a real world object via a single click of a device, and that this single click can initiate some or all of the steps of a transaction related to the object, which can include reconciliation of a coupon. It is also contemplated that the transaction can occur automatically based on a real-time automatic ambient collection of sensor data forming a digital representation of the real-world object without requiring the end user to initiate the transaction. For example, the digital representation of the real-world object can automatically be compared with objects or items within a database. This database can be located on a server within the device collecting sensor data, on a remote server, or any combination thereof. Additional details regarding transaction reconciliation engines and the process of reconciling a transaction will be provided in the description of
Engagement engine 130 is communicatively coupled to object recognition engine 110, transaction engine 120 and computing device 140, and is configured to cause computing device 140 to take one or more actions as a function of a transaction. All commercially suitable computing devices are contemplated, including for example, a mobile phone, a gaming console or system (including 3-D gaming consoles and systems), a local server, a remote server, a general purpose computer, a tablet, a kiosk, an electronic device, a set-top box, a vehicle, or any other commercially suitable device. Examples of actions that can be caused by engagement engine 130 include, among other things: (1) a credit or a debit where the computing device comprises a bank server; (2) an initiation of a software process, posting data, reading data, logging on or off an account, registering an account, interacting with a game where the computing device comprises any one of several suitable computing devices; (3) an initiation of a transaction where the computing device comprises a point-of-sale device; or (4) making a phone call (e.g., to a doctor, store, etc.) where the computing device comprises telephony capabilities.
One contemplated type of transaction system can be configured to modify a subscription bill from a service or product provider (e.g., a utility provider, a magazine subscription provider, etc.). Modifying a subscription bill can include, among other things, at least one of a deletion of a charge, a reduction of a charge, an increase of a charge, an overall percentage discount, a credit, a free offer, a reduced price offer, or a coupon for a service or a good. The system can establish a universal framework of a virtual loyalty card usable by multiple providers based on object recognition of a real world object. A recognition engine of such systems can recognize the object, derive object attributes associated with the object, and recognize an ID of a virtual coupon in the process of recognizing both the object and the virtual coupon of the real world object. The recognition engine can then engage a transaction engine to initiate a transaction associated with the object (e.g., a purchase of the object, etc.), and redeem the coupon by crediting the provider's monthly statement, or other provider or user account. A final reconciliation among providers can occur on a weekly basis, monthly basis, yearly basis, one-time basis, or any other recurring or non-recurring time period.
For example, a user can be assigned a single loyalty ID, loyalty card, or account number. When the user interacts with an object such as a poster via his mobile phone, he can be provided with a link to a website selling the poster. Based on at least one of the user's clicking on the link, saving the link, or making a purchase at a website associated with the link, a credit can be made to a loyalty account associated with the link or website. If the same user interacts with an image of Disneyland, and is automatically redirected to a website where the user purchases airplane tickets to Florida, a credit can automatically be made to a loyalty account associated with the airline or a credit card used to purchase the airplane tickets (e.g., with frequent flyer miles, etc.).
Coupon redemption by providers can occur upon confirmation of payment even before a payment is received (e.g., during the period wherein a credit card payment has been authorized but funds have not yet been received by the provider(s)). One should appreciate that the reconciliation can occur based on a recognition of the real world object, generation of a virtual coupon, entry into a virtual loyalty card, and a purchase or redemption being complete in less than 2 steps. These steps can each occur on a single mobile device (or multiple devices), in real or substantially real time in the real world. In some embodiments the reconciliation can occur in real or substantially real time through collection of ambient sensor data or at a point of sales. Thus, various provider and user accounts can be credited or debited based on their loyalty activity.
An additional aspect of the inventive subject matter includes a system to allow users to pay a provider to receive a service (e.g., telephone, utility, magazine or newspaper subscription, car lease, etc.) by redemption of coupons obtained via a purchase of real world objects or services different from the service being paid for. This can be achieved by utilizing a recognition and engagement engine in at least one of a device and a server. For example, a user who purchases a telephone at AT&T™ can be sent a coupon, via text or email, which the user can redeem to receive a free month of telephone service through Verizon™.
Object recognition engine 200 comprises a derivation engine 220 that is configured to derive one or more object attributes (e.g., 221, 222, and 223) of real world object 210. Derivation engine 220 can be configured to derive 5, 10, 15, 20, 25, 50, or even 100 or more object attributes for every real world object it scans or recognizes. The derived attributes can be organized hierarchically by class, category, or even dimensions within an interaction space. As discussed above, the object attributes (e.g., 221, 222, and 223) can include, among other things, an absolute time, a relative time, a response time, a location, a relative position or orientation, a color, a size, a quantity, a quality, a material, an availability, a depreciation rate, an appreciation rate, a supply relative to a demand, a shape, a make, a model, a cost, an available coupon, a message, metadata, and a price.
Some or all of the derived attributes can be derived from visible data, and others derived from non-visible data. It is also contemplated that a single attribute can be derived from both visible and non-visible data. Examples of visible data include digital representations of a size, a color, a shape, a location, a brightness, a material, a shadow, a relative size, a relative color, a relative shape, a relative location, or a relative brightness. Examples of non-visible data include digital representations of a sound, texture, touch, sensed emotion, taste or smell.
Item 300 is a car associated with object 210, which is an image of a car very similar to item 300. User 310 captures a digital representation of object 210 via a camera enabled mobile phone. Object recognition engine 200 obtains the digital representation captured by user 310, recognizes object 210 as a real world object, and derives various attributes thereof, including a make, model, year of manufacture, and color of object 210. Object recognition engine 200 then causes information related to item 300 to be presented to the user, either via the mobile phone used to capture the digital representation or a remote device. The information related to item 300 can be a video showing item 300, an accident history, a repair history, miles driven, a location, a price, an availability, and a transaction link (e.g., a link to a website where the item can be purchased).
In this example, user 310 clicks on a link to a website where the item can be purchased, and engages in a transaction 330 that comprises a payment to four accounts, 321, 322, 323 and 324, related to item 300. This payment to the four accounts can be a lump or partial payment, and can be direct or indirect. For example, a user can make a lump (i.e. complete) or partial payment indirectly via a third party collector. A lump payment can be distributed amongst the various accounts, while a partial payment can be distributed to a specific account, or two or more accounts. It is also contemplated that the user can also make lump or partial payments directly to the accounts themselves. A direct payment can be a payment between a consumer and a provider, which is limited to the inclusion of two or more of the following: entities associated with the user account; entities associated with the provider account; the use of a transaction system of the inventive subject matter; and transaction reconciliation engine of the inventive subject matter. An indirect payment can comprise any other payment and can include a third party entity.
When user 310 purchases item 300 via the website, user 310 provides user account information to transaction reconciliation engine 320, either directly or via a third party payment processor (not shown). This user account information can include, for example, at least one of a name, address, expiration date, login username, login password, consumer ID associated with an application, account number, a name of a financial establishment, a type of account (e.g., Visa™, Mastercard™, Discover™, American Express™, etc.), or a confirmation of a payment or authorization of a payment made by a remote server or third party.
Upon at least one of the following: receipt of user account information; confirmation of correctness or completeness of user account information; processing a payment associated with user account information; authorization of a payment associated with user account information; and receipt of a payment associated with user account information, transaction reconciliation engine 320 can reconcile the transaction among accounts 321, 322, 323, and 324.
The following use case describes a reconciliation of a payment by the embodiment in
It is also contemplated that in some embodiments, the transaction reconciliation engine can perform some or all of the functions of a third party payment processor. Thus, transaction reconciliation engine 320 can also allow providers to receive their portions of a payment efficiently and immediately without having to transact directly with user 310.
In some embodiments, as shown in
On the client-side, image recognition services also can provide real-time tracking and augmented reality overlays on identified objects. These augmented reality overlays can comprise 2D content (e.g., HTML) or 3D animated models. In either case, the augmented reality overlays can be aligned with or track the recognized object, in real-time, on the mobile device screen. The overlay content can be accessed via network or can be resident in the handset to minimize latency. In some embodiments the overlay can include a virtual coupon constructed, possibly in real-time. In some embodiments the overlay can include an advertisement, informational text, a link to a website, or any other commercially suitable overlay.
The client-side image recognition services or modules can also provide real-time data describing the relative position or orientation of a tracked object. The position or orientation of objects 410-412 can be presented relative to the user, relative to mobile device 400, relative to each other, or according to other desired formats.
The client-side recognition technology can be optimized to produce a smooth and robust user experience. For example, on an iPhone® 4, the Applicant's object recognition system is able to operate easily at a rate of 10 video frames per second, while post-recognition augmented reality object tracking is able to operate at 30 frames per second.
Images in the client-side database can have an average size of approximately 30 KB; at that size, roughly 30,000 images occupy about 1 GB of storage. A typical smart-phone can therefore accommodate a database of thousands of images.
Several options are available for loading and updating the client-side database. These include: “User Pull”, wherein the user initiates a database update by performing an action such as checking in to a retail store, selecting a product category, and so forth. A client app, via the SDK 465, then downloads images related to the user's action from the SDK server 475; and “Automatic Push”, wherein the SDK server 475 automatically sends database updates to the app 485, based on any set of desired conditions, e.g., the user's location, time, actions taken within an app, local deals, coupons, or other promotions. If the database is running out of storage space, the application can be configured to automatically delete images or data from the database, preferably on a prioritized basis (discussed further below).
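As a non-limiting sketch of the update options just described, the following Python fragment models a prioritized client-side database supporting both “User Pull” and “Automatic Push” updates; the class name, server interface, capacity, and priority values are illustrative assumptions rather than elements of the SDK itself.

```python
# Illustrative sketch only: a client-side image database with prioritized
# eviction, fed by hypothetical "User Pull" and "Automatic Push" updates.

AVERAGE_IMAGE_KB = 30  # average image size noted above

class ClientImageDatabase:
    def __init__(self, capacity_kb):
        self.capacity_kb = capacity_kb
        self.images = {}  # image_id -> priority (higher = keep longer)

    def used_kb(self):
        return len(self.images) * AVERAGE_IMAGE_KB

    def add(self, image_id, priority):
        # Evict the lowest-priority images first when storage runs low.
        while self.used_kb() + AVERAGE_IMAGE_KB > self.capacity_kb and self.images:
            victim = min(self.images, key=self.images.get)
            del self.images[victim]
        self.images[image_id] = priority

def user_pull(db, server_images):
    # A user action (e.g., checking in to a store) triggers a download.
    for image_id, priority in server_images:
        db.add(image_id, priority)

def automatic_push(db, pushed_images):
    # A server-initiated update based on location, time, deals, etc.
    for image_id, priority in pushed_images:
        db.add(image_id, priority)

db = ClientImageDatabase(capacity_kb=120)  # tiny capacity for the example
user_pull(db, [("shoe-001", 5), ("shoe-002", 3)])
automatic_push(db, [("coupon-banner", 9), ("local-deal", 7), ("generic-ad", 1)])
print(sorted(db.images))  # the lowest-priority entry has been evicted
```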
In addition to server-side and client-side image recognition, the platform can also support a hybrid mode in which initial image recognition is performed by a server but augmented reality processing is performed by the client. This hybrid mode enables use of a virtually unlimited database size and eliminates the need to update the client-side database, while providing for augmented reality. This hybrid mode can cause initial delay of a few seconds between capturing a digital representation of a scene and the start of the augmented reality experience. During that delay the initial image is sent to the server, analyzed, and the augmented reality content can be sent to the client.
The client-side image recognition SDK can provide direct programmatic access to all object recognition and tracking functions (e.g., vSLAM, VIPR, SIFT, GPS information, time stamps, etc.). The SDK can include a complete detailed function library, for use by developers familiar with image recognition, and a simplified function library that is useful for those without an image recognition background.
The client SDK can also include support for the Unity® game development environment. 3D models developed in that environment are compatible with the augmented reality functionality in the SDK.
Example SDK Use Cases
The following use cases provide various examples of some aspects of some embodiments of the inventive subject matter.
Merchant and Consumer Applications for Redeeming a Coupon
In some embodiments, a merchant or other provider can utilize a merchant application developed via an SDK. A transaction system associated with the merchant can have certain templates, which the merchant can provision with pricing and coupons or other forms of benefits. The merchant can open the merchant application, log in with his or her merchant ID (which can be set up through a third party), and scan one or more objects by capturing an image, scanning a barcode or other symbol, via near field communications, etc. The merchant can also provision the system or application (if there are templates), or simply input pricing and coupon/benefits information, which can be saved to the system.
From a user perspective, the user can walk into the merchant's store and engage the system by opening a consumer application developed via an SDK. The user can log in to the consumer application with his or her consumer ID (e.g., account number, loyalty card number, etc.) and scan items to view coupons/savings. The consumer can select a coupon or saving, take an item associated with the coupon to a merchant checkout counter or person and redeem the coupon.
When a merchant is presented with a coupon, the merchant can open the application and log in with his or her merchant ID. The merchant can obtain a phone number or other information associated with the consumer and input it into the application. It is contemplated that the merchant will be able to view the coupons selected by the consumer in the merchant's application. The merchant can apply the coupon selected using various methods. First, the user can apply the coupon within the merchant application if POS is enabled and the consumer is set up for payment (e.g., via ClearXchange™). The merchant can receive confirmation of payment processing and obtain an electronic receipt, including a merchant service charge, if any. Second, the merchant can apply coupons to the consumer bill manually with his or her existing payment system. Third, the merchant can enter the total charge to an existing payment system and savings associated with the coupon(s) can be applied to the consumer's mobile phone carrier phone bill (e.g., Verizon™, AT&T™, etc.). The consumer can receive a text, app, or email notification that the coupon(s) were applied, along with what payment option was used. The merchant and consumer can have access to all transactions online, including details of coupons/savings received, and via which method(s), among other things.
Consumer Identification, Interests and Recommendations
A consumer utilizing a transaction system of the inventive subject matter (e.g., an SDK generated application installed in his mobile device) can obtain various recommendations, coupons, etc. based on objects or items associated with the consumer. It is contemplated that systems of the inventive subject matter can achieve at least one of the following: (1) identify a consumer, (2) identify the objects with which the consumer interacts or transacts, or (3) identify what objects and items would be of interest to the consumer.
The following use case shows some of the benefits the system can provide a consumer walking through a shopping center. Prior to entering the shopping center, the user can open up the consumer application and log in using his unique consumer ID and possibly a password. The system will identify the consumer and begin to store and prioritize data associated with the consumer.
For example, when a user walks into a specific store, sensor data can automatically be acquired and an interest in the store can be identified by the system. This identification can be based on one or more of the following, among other things: a recognition that the consumer entered the store; the amount of time the consumer spent in the store; the number of items purchased in the store; or the amount of money spent in the store.
The prioritization of actions or items based on acquired data can be based upon any suitable scoring or ranking system, which can be determined by a user of a device, a transaction system manager, or an SDK of the inventive subject matter. For example: a user can assign certain values to certain types of actions associated with an item (e.g., a download of a song can be worth 2 points being assigned to the musician, while a purchase of a physical product can be worth 4 points being assigned to a person associated with the product, etc.); a transaction system manager can assign a range of scores that determine a level of interest (e.g., basic level of interest between 1-10, moderate level of interest between 11-20, special level of interest between 21-30, superior level of interest between 31-40, etc.); and an SDK can modify the user assigned values of certain types of actions or data based on user activity.
One possible scenario where an SDK can modify the user assigned value of an action can be where a user assigns a high value to an action (e.g., 4), but then repeatedly declines offers presented to the user based on items associated with the action. In such a scenario, the SDK could lower the value assigned to the action (e.g., to 2).
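A minimal sketch of that adjustment logic follows; the decline threshold, minimum value, and action names are hypothetical assumptions chosen to mirror the 4-to-2 example above.

```python
# Hypothetical sketch: lowering a user-assigned action value after repeated
# declined offers. Thresholds, values, and names are assumptions only.

action_values = {"download_song": 2, "purchase_product": 4}
declined_offers = {"purchase_product": 0}

DECLINE_LIMIT = 3  # after this many declines, reduce the action's value
MIN_VALUE = 1

def record_decline(action):
    declined_offers[action] = declined_offers.get(action, 0) + 1
    if declined_offers[action] >= DECLINE_LIMIT:
        action_values[action] = max(MIN_VALUE, action_values[action] - 2)
        declined_offers[action] = 0  # reset the counter after adjusting

for _ in range(3):
    record_decline("purchase_product")
print(action_values["purchase_product"])  # 4 -> 2 after three declines
```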
For example, when a consumer walks by a movie advertisement in the store, the movie advertisement can automatically be scanned, and sensor data automatically obtained. The system can then identify that the movie is of the consumer's basic level of interest and prioritize the movie poster accordingly. Based at least in part on the priority given to the movie poster, the user's device can present information or options to the user (e.g., to purchase movie tickets, download a trailer, etc.), or initiate transactions related to the movie poster.
This identification of the movie as being of the consumer's basic level of interest can be based on a score of, for example, between 1-10 (e.g., 8) being associated with a director of the film. The system can also possibly recognize an actor, director, producer, filming location, or any other feature of the movie advertised and prioritize those features accordingly. If an actor in the poster is then recognized as being of the consumer's higher level of interest, it is contemplated that the scores of two identified objects of interest can be averaged. Thus, if the actor is recognized as being of the consumer's moderate level of interest (e.g., score between 11-20), a special level of interest (e.g., score between 21-30), or a superior level of interest (e.g., score between 31-40), then the average of the scores can be used to determine a level of interest of the movie.
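For illustration only, the following fragment works through the score bands and averaging just described; the helper function name and the concrete scores are assumptions.

```python
# Minimal worked example of the score-averaging approach above; the score
# bands (1-10 basic, 11-20 moderate, 21-30 special, 31-40 superior) follow
# the text, while the function name and sample scores are illustrative.

def level_of_interest(score):
    if score <= 10:
        return "basic"
    if score <= 20:
        return "moderate"
    if score <= 30:
        return "special"
    return "superior"

director_score = 8   # basic level of interest
actor_score = 25     # special level of interest
movie_score = (director_score + actor_score) / 2
print(movie_score, level_of_interest(movie_score))  # 16.5 -> "moderate"
```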
Alternatively or additionally to prioritization based on features of the object, it is also contemplated that a scoring or ranking system can be associated with actions of a user, and that these actions can be used to determine a type of desired transaction. For example, it is contemplated that a scoring or ranking system can be set in place that recognizes that the consumer wishes to initiate a certain type or types of transactions with the movie advertisement based on the consumer capturing an image of the object, or simply standing in front of the advertisement for a specified threshold time (e.g., 2, 5, 10, 15, or even 20 or more seconds). These actions of a user can be used to recognize a level of interest in the object (e.g., basic, moderate, special, superior, etc.). The consumer can then be provided with a link to a website that provides information related to the movie (e.g., show times, location of a DVD in the store itself, etc.) or allows the consumer to purchase movie tickets, a DVD, a movie poster, or a soundtrack.
If the consumer chooses to in fact engage in a transaction (e.g., downloading a trailer, etc.) associated with the object by clicking on this link, the system can identify that a transaction related to the movie occurred and assign a higher priority to the movie than other movies where an advertisement was photographed, but where no subsequent transaction related to the advertised movie was made. If the consumer chooses to engage in a further transaction related to the object (e.g., purchase the movie tickets, etc.), then the movie can be given an even higher priority and assigned an even higher level of interest.
As shown above, it is contemplated that based on the information recognized, stored or prioritized by the system, the system could provide a recommendation, coupon, or other information related to real world objects that have not yet been scanned or captured by the consumer. In some embodiments of the inventive subject matter, the scoring, ranking or prioritization of actions, objects or other information can be done in a manner where users are grouped based on a demographic, and an action of a user within that group can affect the scoring, ranking or prioritization associated with some members of, or even the entire, demographic.
As yet another example, where a class of goods or services, or a specific brand or type of goods or services is purchased by the consumer with increased frequency, those classes, brands or types of goods or services can be assigned a higher priority than those where fewer transactions occurred. Moreover, the system can identify one or more common attributes of objects that were scanned, captured, or purchased by a user and provide recommendations, coupons or information based on those attributes. Identifying the common attributes can allow the system to identify the interests and intent profile of the consumer.
It is also contemplated that the system provider can charge a fee to providers whose goods or services are mentioned in a recommendation, coupon, or other information provided to the consumer.
Once the system has identified, stored or prioritized the objects or attributes of interest to the consumer in a database associated with the system, it is contemplated that a transaction can occur automatically based on real time or substantially real time automatic ambient collection of sensor data forming a digital representation of the real-world object without requiring the end user to initiate the transaction. For example, when the user passes by Dodgers™ stadium, a mobile device carried by the user can automatically open the team's official website in an Internet browser based on stored data indicating an interest in baseball team websites.
In some contemplated embodiments, a system can be configured to identify a consumer based on location data or height data (e.g., a height of the user, location of a device relative to the ground, location of a device by street, city or state, etc.) obtained via a consumer device, objects scanned, objects with which the consumer interacts or transacts, or any combination thereof. For example, a consumer carrying a consumer mobile device may open an SDK generated application at any time. Regardless of whether the consumer enters a consumer ID, the device can begin scanning the objects within its field of view and allow the consumer to interact or transact in relation to the objects scanned. If the consumer has logged in, the consumer can be authenticated based on at least one of the location data, height data, scans, interactions or transactions. If the consumer has not logged in, the consumer can be identified based on at least one of the location data, height data, scans, interactions or transactions, thereby eliminating the need to log in manually.
As previously discussed, it is contemplated that the inventive subject matter described herein can leverage one or more techniques, possibly including object recognition techniques, disclosed in the following co-owned pending applications: U.S. provisional application No. 60/246,295 filed Nov. 6, 2000; U.S. provisional application No. 60/317,521 filed Sep. 5, 2001; U.S. application Ser. No. 11/510,009 titled “Interactivity Via Mobile Image Recognition” filed on Aug. 25, 2006; U.S. application Ser. No. 12/505,726 titled “Interactivity with a Mixed Reality” filed on Jul. 20, 2009; U.S. application Ser. No. 13/005,716 titled “Data Capture and Identification System and Process” filed on Jan. 13, 2011; U.S. application Ser. No. 13/037,317 titled “Image Capture and Identification System and Process” filed on Feb. 28, 2011; and U.S. application Ser. No. 13/037,330 titled “Object Information Derived from Object Images” filed on Feb. 28, 2011. For example, the inventive subject matter can utilize one or more of the techniques for presenting information such as Internet content to a user described in U.S. application Ser. No. 13/005,726.
The object recognition engine can transmit one or more of the derived object attributes 4016 to transaction reconciliation engine 4020. Transaction reconciliation engine 4020 comprises a memory 4021 storing one or more reconciliation matrices 4023.
Transaction reconciliation engine 4020 is configured to perform many actions including, among other things, associating an object attribute with a recognition signature (e.g., 4036, 4037, etc.), mapping the recognition signature to one or more accounts via the reconciliation matrix, and reconciling a transaction among the accounts. One should appreciate that the disclosed approach seeks to map the derived object attributes to reconciliation accounts without requiring an intermediary look-up of the recognized object. In fact, the disclosed approach lacks any requirement to recognize an object at all.
In order to associate object attribute(s) with a recognition signature, the transaction reconciliation engine 4020 can compare the attribute(s) with recognition signatures 4036 and 4037, which can be stored in signature database 4035. It is contemplated that an association between an object attribute or attributes and a recognition signature (e.g., 4036, 4037, etc.) can occur when certain parameters are met (e.g., a threshold match score, a complete match, a partial match, etc.). One should appreciate that recognition signatures 4036 and 4037 can be defined within the same attribute namespace or valuespace as used to derive object attributes. For example, the attributes space could include SIFT features; audio amplitudes, frequencies, envelopes, or phases; position; orientation; location; context; or other attributes.
Recognition signatures can comprise a collection of values associated with certain attributes, and can be generated in a number of ways. All commercially suitable digital signature schemes are contemplated. A suitable technique that could be adapted for use with the inventive subject matter includes those disclosed by U.S. Pat. No. 7,650,616 to Lee.
A signature can be represented in a number of ways including, for example, as an N-tuple, scalar, vector or matrix having n-dimensions of relevance (where n can equal 0, 1, 2, 3, 5, 10 dimensions in a recognition space, etc.). It is contemplated that a key value (e.g., hash value, etc.) can be generated from the values associated with each attribute of a signature. These key values can be used to query a hash table or other reconciliation matrix to map a recognition signature to one or more accounts.
One technique for generating a recognition signature can comprise a manual input into recognition signature generator 4065.
For example, a system manager or other user can generate a recognition signature to be mapped to both a Costco™ account and a user's Costco™ American Express™ card by including values representative of specific attributes. The attributes represented can comprise any suitable attributes, including for example, a word, a price, a location, a time, a logo, an account number, biometrics, an image, or audio. For this particular example, a recognition signature can be generated in the form of a vector having four values, wherein the first value is representative of an account holder's identity (e.g., driver's license photo, name, age, birthday, identification number, etc.), the second value is representative of an item to be purchased (e.g., barcode, QR code, image of product, etc.), the third value is representative of a Costco™ location, and the fourth value is representative of an authorization (e.g., a motion, fingerprint scan, signature, voice authorization, etc.).
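By way of a hypothetical sketch, the four-value signature above could be encoded as an ordered tuple and keyed (e.g., hashed, as suggested earlier) into a lookup-table style reconciliation matrix; the concrete values, hashing scheme, and account names below are placeholders rather than required elements of the disclosed system.

```python
# Illustrative sketch only: deriving a key value from a recognition signature's
# attribute values and using it to query a hash-table style reconciliation matrix.

import hashlib

def signature_key(values):
    """Derive a stable key (here a SHA-256 hash) from ordered attribute values."""
    return hashlib.sha256("|".join(str(v) for v in values).encode()).hexdigest()

# Recognition signature as a 4-value vector: identity, item, location, authorization.
signature = (
    "driver-license-D1234567",   # account holder's identity
    "upc-012345678905",          # item to be purchased (barcode)
    "Costco #441, San Diego",    # Costco location
    "fingerprint-confirmed",     # authorization
)

# Reconciliation matrix realized as a hash table: key -> mapped accounts.
reconciliation_matrix = {
    signature_key(signature): ["costco-merchant-account", "user-costco-amex-card"],
}

print(reconciliation_matrix[signature_key(signature)])
```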
Contemplated attributes can comprise audio data (e.g., tone, pitch, bass, rhythm, melody, etc.), textual data (e.g., words, names, handwriting, etc.), imagery data (e.g., pixelation, SIFT features, symbols, etc.), video data, a gesture, a brightness, a location, or any other suitable type of data.
Another technique for generating a recognition signature can be via recognition of historical data related to one or more users. In some embodiments, the recognized data related to a user can be used to represent a demographic. For example, a recognition signature generator 4065 can be communicatively coupled to a user device and configured to acquire transaction data related to the user or device. Where a user initiates a transaction via the user device, the recognition signature generator can be configured to acquire data related to the transaction and generate a signature. It is also contemplated that a single generated signature can be based on multiple transactions initiated via the user device.
For example, a recognition signature generator 4065 can acquire data indicating that a user device was inside of a Target™ when a user made a purchase utilizing her Target™ reward card, a Visa™ credit card for 50% of the purchase, and an American Express™ card for 50% of the purchase. Based on this data, the recognition signature generator 4065 can generate a signature related to Target™, the reward program, Visa™, and American Express™. When this signature is used to query a reconciliation matrix, alone or along with one or more other signatures, the signatures can then be mapped to a purchasable item, a user's Target™ reward card, the user's Visa™ card, and the user's American Express™ card. Thus, one should appreciate that recognition signatures 4036 and 4037 can be a priori created, possibly by the user or a third party, or could be dynamically created through observation of the user.
It is contemplated that the specificity of a recognition signature can vary depending on the amount of historical data available. Where there is not a sufficient amount of historical data available, a user or an account manager may be prompted to confirm or deny certain associations (e.g., between attribute and signature, between signature and account, etc.) prior to completion of a reconciled transaction.
A signature can be mapped to one or more accounts (e.g., 4040, 4050, 4060, etc.) via a query of a reconciliation matrix 4023. The reconciliation matrix or matrices 4023 can be utilized to map one or more of the recognition signatures to one or more accounts.
As discussed above, a reconciliation matrix can comprise an algorithm, a lookup table, or any other suitable rules set or data structure that can be utilized in mapping a recognition signature to one or more accounts.
One example of such a matrix can comprise a hash function, which can map a recognition signature to an account. It is contemplated that the recognition signature can be represented by a vector of attribute values and accounts can be represented by a vector of account numbers. The reconciliation matrix can accept the vector of attribute values as input and return the vector of account numbers as output. The mapping within the matrix operates as a function of a lookup table, a mathematical matrix operation, or other mapping algorithm. Further, the vector of account numbers could include weighting factors indicating what fraction of a transaction amount should be applied to each member account in the vector. Even further, the reconciliation matrix can include rules for resolving placement of fractional amounts of currency (e.g., a fraction of a cent) among accounts.
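The following non-limiting sketch illustrates such a mapping: an attribute-value vector is looked up to obtain account identifiers with weighting factors, and a simple rule assigns any leftover fraction of a cent. The account names, weights, and rounding rule are illustrative assumptions.

```python
# Illustrative sketch only: a reconciliation matrix mapping an attribute-value
# vector to weighted accounts, with a simple rule for fractional cents.

def reconcile(matrix, attribute_vector, amount_cents):
    """Split `amount_cents` across the accounts mapped from the attribute vector."""
    entries = matrix[tuple(attribute_vector)]   # lookup-table style mapping
    splits = {}
    allocated = 0
    for account, weight in entries:
        share = int(amount_cents * weight)      # truncate any fractional cent
        splits[account] = share
        allocated += share
    # Rule for leftover fractions of a cent: give the remainder to the account
    # carrying the largest weight (one possible choice among many).
    largest = max(entries, key=lambda e: e[1])[0]
    splits[largest] += amount_cents - allocated
    return splits

matrix = {
    ("target-store", "reward-card", "visa", "amex"): [
        ("user-visa-acct", 0.5),   # 50% charged to the Visa card
        ("user-amex-acct", 0.5),   # 50% charged to the American Express card
    ],
}

print(reconcile(matrix, ["target-store", "reward-card", "visa", "amex"], 10_001))
```

Other remainder rules (e.g., rounding toward the merchant account or carrying the fraction to a later settlement) could equally be encoded in the matrix.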
Once a signature is mapped to one or more accounts utilizing a reconciliation matrix 4023, the transaction reconciliation engine can reconcile the transaction amongst the accounts in a suitable manner, such as by (1) associating rewards points with the user's reward card, (2) transferring 50% of the item's purchase price from the user's Visa™ card to a Target account, and (3) transferring 50% of the item's purchase price from the user's American Express™ card to a Target account.
Once the digital representation is obtained, the mobile device can transmit the digital representation to an object recognition engine that is configured to recognize the real world object and derive one or more object attributes of the object as shown in step 4120. As previously discussed, it is contemplated that the step of recognizing the real world object and deriving one or more object attributes of the object can leverage one or more techniques disclosed in the following co-owned pending applications: U.S. provisional application No. 60/246,295 filed Nov. 6, 2000; U.S. provisional application No. 60/317,521 filed Sep. 5, 2001; U.S. application Ser. No. 11/510,009 titled “Interactivity Via Mobile Image Recognition” filed on Aug. 25, 2006; U.S. application Ser. No. 12/505,726 titled “Interactivity with a Mixed Reality” filed on Jul. 20, 2009; U.S. application Ser. No. 13/005,716 titled “Data Capture and Identification System and Process” filed on Jan. 13, 2011; U.S. application Ser. No. 13/037,317 titled “Image Capture and Identification System and Process” filed on Feb. 28, 2011; and U.S. application Ser. No. 13/037,330 titled “Object Information Derived from Object Images” filed on Feb. 28, 2011. For example, the inventive subject matter can utilize one or more of the techniques for presenting information such as Internet content to a user described in U.S. application Ser. No. 13/005,726.
The mobile device can then cause a transaction reconciliation engine to perform one or more actions as shown in steps 4130 through 4133. For example, the transaction reconciliation engine can be caused to associate an object attribute with a recognition signature as shown in step 4131, map the signature to one or more accounts via a reconciliation matrix as shown in step 4132, or reconcile a transaction related to an item associated with a real-world object among two or more accounts as shown in step 4133.
The transaction reconciliation engine can be installed on the mobile device itself, or can comprise a separate device communicatively coupled to the mobile device. Actions that can be performed by the transaction reconciliation engine include, for example, associating an object attribute with a recognition signature, mapping a recognition signature to one or more accounts via a reconciliation matrix stored therein, or reconciling a transaction related to an item associated with the real-world object among a plurality of accounts.
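Purely as an illustrative sketch of steps 4131 through 4133, the following Python fragment shows one possible ordering of the engine's actions; the helper names, the best-match heuristic, and the sample data are assumptions rather than a required implementation.

```python
# Hedged sketch of the reconciliation engine actions in steps 4131-4133,
# assuming hypothetical helper names and illustrative data.
def associate_attributes(derived_attributes, known_signatures):
    """Step 4131: pick the signature sharing the most attributes (a best match)."""
    return max(known_signatures, key=lambda sig: len(sig & derived_attributes))

def map_to_accounts(signature, reconciliation_matrix):
    """Step 4132: look the signature up in the reconciliation matrix."""
    return reconciliation_matrix[signature]

def reconcile(amount, accounts):
    """Step 4133: apportion the transaction amount across the mapped accounts."""
    return {acct: round(amount * weight, 2) for acct, weight in accounts.items()}

# Illustrative data only.
signatures = [frozenset({"magazine", "weekly"}), frozenset({"song", "mp3"})]
matrix = {signatures[0]: {"publisher": 0.5, "newsstand": 0.5}}
sig = associate_attributes(frozenset({"magazine", "weekly", "Jan 2013"}), signatures)
print(reconcile(1.00, map_to_accounts(sig, matrix)))  # {'publisher': 0.5, 'newsstand': 0.5}
```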
Upon completion of a reconciled transaction, a notification associated with the reconciled transaction can be received by the mobile device as suggested by step 4140. It is contemplated that this notification can be received by the mobile device in any suitable format and manner, including for example, a push notification or an email.
Example Recognition Signature Use Cases
The following use cases provide various examples of some aspects of some embodiments of the inventive subject matter.
Magazine Subscription Use Case
A user and a magazine publisher engaging in a transaction can utilize a system of the inventive subject matter to purchase/sell a magazine subscription, and divide one or more payments from the user amongst various payees.
The user can capture a digital representation of an HGTV™ magazine via a camera enabled mobile device. The digital representation can be sent to an object recognition engine that is configured to recognize the magazine and derive attributes of the magazine. For example, the object recognition engine can derive the following attributes of the magazine: the publication date, the name of the magazine, the publisher name, data related to the cover photo (e.g., SIFT features, grayscale value, RGB value, identification of a celebrity in the image, brand name of clothing worn by celebrity, etc.), headlines, price, promotions, coupons, publication period (e.g., weekly, biweekly, monthly, etc.), or any other suitable attributes. For purposes of this example, the derived attributes can comprise (1) a Jan. 7, 2013 publication date, (2) Jessica Alba as the cover image, (3) a weekly publication period, (4) a promotional price of $1 per issue for 52 weeks, and (5) Visa™ and MasterCard™ as the accepted payment methods (collectively referred to as “Object Attributes”).
The derived object attributes can then be transmitted to a transaction reconciliation engine communicatively coupled with the object recognition engine. The transaction reconciliation engine can comprise a memory storing one or more reconciliation matrices, which can be utilized to map a recognition signature to one or more accounts.
The transaction reconciliation engine can be configured to associate an attribute or attributes (derived by the object recognition engine) with a recognition signature based on a partial match, best match, or complete match. One possible recognition signature (“Object Signature”) can comprise a price value of $1 per issue, a publication period value of weekly, and an accepted payment method value of American Express™ or debit card. If an association is set to occur based on a partial match or a best match, it is possible that the attributes of the magazine will be associated with this recognition signature. However, if a complete match is required, the attributes of the magazine will not be associated with this recognition signature.
It can generally be preferred that a complete match be required for transactions where only one or a few recognition signature associations are required, but that a partial or best match can be required for transactions where more recognition signature associations are required. As used herein, the term “best match” is used broadly to include, for example, where an attribute or set of attributes derived is compared to a plurality of recognition signatures of a database, and the most commonality (requirements or optional conditions met) is found with one or a few of the recognition signatures.
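The following Python sketch illustrates, under assumed attribute names, one possible way to distinguish complete, partial, and best matches between derived attributes and a recognition signature; it is presented only as an example and not as the sole matching strategy.

```python
# Minimal sketch of partial, complete, and best matching; attribute names
# and signature contents are illustrative only.
def is_complete_match(attributes, signature):
    """Every signature requirement is present in the derived attributes."""
    return all(attributes.get(k) == v for k, v in signature.items())

def is_partial_match(attributes, signature, minimum=1):
    """At least `minimum` signature requirements are met."""
    return sum(attributes.get(k) == v for k, v in signature.items()) >= minimum

def best_match(attributes, signatures):
    """Return the signature with the most requirements met (ties go to the first)."""
    return max(signatures, key=lambda s: sum(attributes.get(k) == v for k, v in s.items()))

magazine = {"price": "$1/issue", "period": "weekly", "payment": "Visa or MasterCard"}
object_signature = {"price": "$1/issue", "period": "weekly", "payment": "Amex or debit"}

print(is_complete_match(magazine, object_signature))  # False: payment method differs
print(is_partial_match(magazine, object_signature))   # True: price and period match
print(best_match(magazine, [object_signature, {"period": "monthly"}]) == object_signature)  # True
```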
If an association is made between the Object Attributes and the Object Signature, the Object Signature, either alone or in combination with other recognition signatures, can be used to query a reconciliation matrix. The process of querying a reconciliation matrix will be discussed in further detail below.
One other possible recognition signature association that can be required is an association between user or user device attributes and a recognition signature. For example, attributes associated with a user or user device can be received by the transaction reconciliation engine via the user device. These attributes can include, among other things, credit card account numbers used to make purchases via the device, names associated with the account numbers, types of items purchased using an account number, information related to a user of the device at or near the time a transaction is initiated (e.g., fingerprint data, voice recognition, username, password, address, gesture, height, weight, etc.), or types of items purchased by the user. The foregoing user or user device attributes can be associated with a recognition signature (“User or User Device Signature”) based on a partial, best or complete match. For purposes of this example, the user attribute can be associated with a recognition signature related to user X (“User or User Device Attributes”).
If an association is made between the User or User Device Attributes and the User or User Device Signature, the User or User Device Signature can be used to query a reconciliation matrix, either alone or in conjunction with other recognition signatures.
Yet another possible recognition signature association that can be required is an association between a payment division requirement and a recognition signature (“Payment Division Signature”). The attributes related to a payment division requirement can be obtained via the real-world object itself, a payee entity associated with an item related to the real-world object, a third party, or any other suitable source. For example, it is contemplated that the attributes can be obtained via Hearst™ Corporation, a payee who would obtain a portion of a payment related to a subscription of HGTV™ magazine. For purposes of this example, the payment division requirements can require that 50% of a subscription fee be paid to Hearst™ Corporation, while 50% of the subscription fee be paid to the newsstand carrying the magazine photographed (“Payment Division Requirement Attributes”). If an association is made between the Payment Division Requirement Attributes and the Payment Division Signature, the Payment Division Signature can be used to query a reconciliation matrix, either alone or in conjunction with other recognition signatures.
Mapping Recognition Signature To Account
Assuming an association is made between the various attributes described above and the Object Signature, User or User Device Signature, and Payment Division Signature, it is contemplated that one, some, or all of the recognition signatures can be used to query a reconciliation matrix.
The reconciliation matrix can be utilized to map one or more of the recognition signatures to one or more accounts. In this example, the combination of Object Signature, User or User Device Signature, and Payment Division Signature can be mapped to (1) User X's Visa™ debit card, (2) a Hearst™ Corporation account, and (3) a newsstand's account.
The transaction reconciliation engine can be configured to reconcile User X's subscription to HGTV™ magazine by transferring $0.50 from User X's Visa™ debit card to each of the Hearst™ Corporation account and the newsstand's account on a weekly basis until the subscription is cancelled.
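Solely to illustrate this magazine example, the sketch below assumes hypothetical account identifiers and a combined-signature key to show how the weekly $1 issue price could be reconciled as two $0.50 transfers; the scheduling of the weekly run and the cancellation logic are outside the sketch.

```python
# Illustrative sketch of the weekly HGTV subscription reconciliation; the
# combined-signature key and account identifiers are assumptions.
from decimal import Decimal

COMBINED_KEY = ("Object Signature", "User or User Device Signature", "Payment Division Signature")

RECONCILIATION_MATRIX = {
    COMBINED_KEY: {
        "payor": "user-x-visa-debit",
        "payees": {"hearst-account": Decimal("0.50"), "newsstand-account": Decimal("0.50")},
    }
}

def weekly_reconciliation(issue_price, mapping):
    """Build one week's transfers: debit the payor, credit each payee its share."""
    transfers = []
    for payee, share in mapping["payees"].items():
        transfers.append((mapping["payor"], payee, (issue_price * share).quantize(Decimal("0.01"))))
    return transfers

mapping = RECONCILIATION_MATRIX[COMBINED_KEY]
for payor, payee, amount in weekly_reconciliation(Decimal("1.00"), mapping):
    print(f"transfer ${amount} from {payor} to {payee}")
# transfer $0.50 from user-x-visa-debit to hearst-account
# transfer $0.50 from user-x-visa-debit to newsstand-account
```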
The transaction reconciliation engine can reconcile some or all of the accounts in real-time or over a period of time, and can be configured to handle certain problems that may arise (e.g., an error in accounting, a lack of funds, etc.). For example, where a payor's account has insufficient funds, a back-up payor account can be utilized in making a payment.
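One possible, non-limiting treatment of the insufficient-funds problem is sketched below; the balances and account names are invented for illustration.

```python
# Sketch of handling an insufficient-funds problem with a back-up payor
# account; balances and account names are purely illustrative.
def charge_with_fallback(amount, payor_accounts, balances):
    """Charge the first payor account that can cover the amount."""
    for account in payor_accounts:
        if balances.get(account, 0) >= amount:
            balances[account] -= amount
            return account
    raise RuntimeError("no payor account has sufficient funds")

balances = {"visa-debit": 0.25, "backup-checking": 20.00}
used = charge_with_fallback(1.00, ["visa-debit", "backup-checking"], balances)
print(used, balances)  # backup-checking {'visa-debit': 0.25, 'backup-checking': 19.0}
```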
It is contemplated that User X can receive a notification of each transfer via her mobile device in any commercially suitable manner.
Additional Use Cases
The below use cases provide various examples of some aspects of some embodiments of the inventive subject matter.
Downloading Digital Files
A user and a digital music provider can utilize a transaction system of the inventive subject matter to (1) download/permit download of a digital file, and (2) reconcile a transaction between a payor account and one or more payee accounts.
For example, Alison, while listening to “Scream and Shout” by Britney Spears and Will.i.am, can cause her cell phone to receive audio data related to the song (e.g., via a mobile application such as Shazam™, etc.). This data is sent to an object recognition engine that derives various attributes of the audio data including, for example, lyrics, tone, pitch, bass, types of instruments, speed, sound patterns, or any other suitable attributes of audio data.
Once the object recognition engine derives one or more object attributes, a transaction reconciliation engine coupled with the object recognition engine can associate one or more of the object attributes with a recognition signature. For example, a lyric and a sound pattern can be associated with a signature having values representative of the lyric and the sound pattern. The transaction reconciliation engine can then map the signature to (1) a payee account for Britney Spears for a royalty payment, (2) a payee account for Will.i.am for a royalty payment, (3) a song writer account for a royalty payment, (4) a record label payee account, and (5) an iTunes™ payee account. Alison can input her credit card payment information, and the transaction reconciliation engine can reconcile the transaction by transferring a percentage of the payment from Alison's account to each of the payee accounts in accordance with a set of rules.
Cable Subscription
As another example, it is contemplated that a user can capture video data via her cellular phone related to a television commercial for the HBO™ show Game of Thrones™. The cellular phone can transmit the video data to an object recognition engine that recognizes the video data as being related to Game of Thrones™ and derives attributes of the video data.
The object recognition engine can be coupled with a transaction reconciliation engine having a memory storing reconciliation matrices. The transaction reconciliation engine can associate the attributes with a recognition signature, map the signature to an HBO account, the user's credit card account, the SAG, DGA or other guild accounts associated with talent in Game of Thrones™, and reconcile a subscription to HBO™ between the relevant accounts.
Payment For Public Transportation
In yet another example, it is contemplated that a user wishing to ride a bus, taxi, plane or other public transportation vehicle can purchase a ticket or entrance by capturing an image of the vehicle or advertisement associated with the vehicle. A system of the inventive subject matter can then be used to distribute payment amongst the transportation company, taxes to the city, tips to drivers and other employees, beverage and food vendors, and any other parties associated with the transaction.
Patient Care
In yet another example, a user can capture an image of a patient and possibly CPT codes or other information, and use the image data to reconcile a payment for a medical procedure amongst, among other things, healthcare provider accounts, a patient account for a co-payment, an insurance company account, and a hospital account.
Pre-Order
In yet another example, a user can pre-order or pre-purchase a video game or other product by capturing digital data related to the game (e.g., an image of a game advertisement, audio data related to a game advertisement, etc.). It is contemplated that a payment can be temporally reconciled wherein a down-payment or a deposit can be reconciled immediately between a video game vendor, a user checking account, and Electronic Arts™ (EA™) (the game developer), and that the final full payment can be further reconciled at a later time (e.g., upon pick-up or delivery, etc.).
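A minimal sketch of such a temporally reconciled pre-order follows; the deposit amount, payee shares, and event labels are assumptions used only for illustration.

```python
# Sketch of a temporally reconciled pre-order: a deposit is reconciled
# immediately and the balance at delivery; parties and amounts are invented.
from decimal import Decimal

def preorder_schedule(total, deposit, payees):
    """Return two reconciliation events: deposit now, remainder on delivery."""
    return [
        {"when": "now", "amount": deposit, "payees": payees},
        {"when": "on_delivery", "amount": total - deposit, "payees": payees},
    ]

payees = {"vendor-account": Decimal("0.30"), "ea-developer-account": Decimal("0.70")}
for event in preorder_schedule(Decimal("60.00"), Decimal("10.00"), payees):
    splits = {p: (event["amount"] * w).quantize(Decimal("0.01")) for p, w in event["payees"].items()}
    print(event["when"], splits)
# now {'vendor-account': Decimal('3.00'), 'ea-developer-account': Decimal('7.00')}
# on_delivery {'vendor-account': Decimal('15.00'), 'ea-developer-account': Decimal('35.00')}
```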
Charitable Donations
It is also contemplated that a system of the inventive subject matter could be used to reconcile a payment amongst a for-profit organization and a non-profit organization. Oftentimes an establishment, such as Starbucks™, will promote that a percentage of a purchase of a pound of coffee or other good will be donated to a charitable organization. Using a system of the inventive subject matter, a user can capture an image of the product, QR code, barcode or other item associated with the pound of coffee, and the purchase can be reconciled amongst, inter alia, Starbucks™ and the American Kidney Fund™.
It is further contemplated that the system can be configured to keep track of payments made from one or more user accounts to a charitable organization for tax purposes. It is also contemplated that the system can generate a partial or complete report, such as a Form 1040, Schedule A.
Rewards Programs
In yet another example, a system of the inventive subject matter can be used to reconcile a transaction with a rewards program account (e.g., a frequent flyer miles program, restaurant rewards program, etc.). For example, it is contemplated that a user's Visa™ card can be associated with an American Airlines™ frequent flyer miles program. Whenever the user utilizes the transaction system to make a purchase or other transaction, it is contemplated that the transaction can be reconciled between the user's Visa™ card (debit), the American Airlines™ frequent flyer miles program (points credit), and the vendor(s)' accounts (monetary credit). It is contemplated that when the system is used to purchase an airline ticket on American Airlines™, additional points can be credited to the American Airlines™ frequent flyer miles program.
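For illustration only, the sketch below shows one way a single purchase could produce a card debit, a vendor monetary credit, and a frequent flyer points credit; the one-point-per-dollar rate and the account names are assumptions.

```python
# Sketch of reconciling one purchase across a debit, a monetary credit, and a
# points credit; the points rate and account names are hypothetical.
def reconcile_with_rewards(amount, vendor_account, card_account, miles_account,
                           points_per_dollar=1):
    """Return the ledger entries for a single rewards-linked purchase."""
    return [
        {"account": card_account, "type": "debit", "value": amount},
        {"account": vendor_account, "type": "monetary credit", "value": amount},
        {"account": miles_account, "type": "points credit", "value": int(amount * points_per_dollar)},
    ]

for entry in reconcile_with_rewards(250.00, "american-airlines-vendor",
                                    "user-visa-card", "aa-frequent-flyer"):
    print(entry)
```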
Gamification
In yet another example, a system of the inventive subject matter can be used for gamification. When a user plays a video game in conjunction with the transaction system, the user can be awarded points, achievement badges, virtual currency, etc., based on derived attributes and a reconciliation matrix. For example, the object recognition engine can recognize that a level was completed on a video game. The object recognition engine can derive object attributes associated with the completed level, and a transaction reconciliation engine can associate the attribute(s) with a recognition signature, map the signature to one or more accounts via a reconciliation matrix, and reconcile a transaction (e.g., award a badge, etc.) amongst the user's gaming account and the game's leader board.
Tolls
Yet another use case can include using a recognition engine to recognize vehicles on a toll road. The recognition engine can use object attributes associated with a vehicle (e.g., make, model, color, location, time, license plate, etc.) for several purposes. First, the object attributes in general can be used to identify or verify that the vehicle is properly authorized to utilize the toll road. For example, the engine, or transaction system, can compare the make and model of the vehicle against DMV records associated with the license plate to ensure the vehicle or plates are not stolen. Second, the object attributes can be used to determine which accounts could be charged for use of the toll road. Consider a scenario where a salesman utilizes his personal vehicle. The salesman could have multiple accounts that should be charged in exchange for using the toll road. During working hours (e.g., based on a GPS location or time of day), a corporate account might be charged. During weekends, a personal account might be charged. Further, possibly based on an employee agreement, both accounts might be charged (e.g., 75% corporate and 25% personal during working hours).
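The account-selection aspect of this toll example can be sketched as follows, assuming an illustrative working-hours rule and the 75%/25% split mentioned above; in practice the rule itself could come from the reconciliation matrix or an employee agreement.

```python
# Sketch of choosing which account(s) to charge for a toll; the working-hours
# rule and the 75%/25% corporate/personal split are illustrative assumptions.
from datetime import datetime

def toll_split(passage_time, on_business_route):
    """Return {account: fraction} based on time of day and derived location."""
    working_hours = passage_time.weekday() < 5 and 9 <= passage_time.hour < 18
    if working_hours and on_business_route:
        return {"corporate-account": 0.75, "personal-account": 0.25}
    return {"personal-account": 1.0}

print(toll_split(datetime(2013, 3, 27, 10, 30), on_business_route=True))
# {'corporate-account': 0.75, 'personal-account': 0.25}
print(toll_split(datetime(2013, 3, 30, 14, 0), on_business_route=False))
# {'personal-account': 1.0}
```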
Rights Management
In view of the fact that the disclosed system is capable of reconciling multiple accounts and can be leveraged with respect to purchasing or otherwise interacting with digital media, the disclosed system can be used to provide payment to rights holders associated with digital media. Consider a scenario where an individual seeks to obtain content associated with a movie poster, or possibly a poster of a sports team (e.g., the L.A. Lakers™). The person can capture an image of the poster and the system can return a selection of multimedia content bound or linked to the poster. Should the person decide to purchase the content, the transaction system can select one or more reconciliation matrices based on derived attributes from the image of the poster, or even based on user selection. The reconciliation matrices can then be used to identify accounts for all rights holders associated with the content: actors, producers, owners, distributors, publishers, artists, or other individuals associated with the content. When the transaction takes place, each account can be credited according to one or more business rules determined from the reconciliation matrix.
Additional Concepts
One should also note that the inventive subject matter is considered to include the additional concepts presented below.
Methods of Reconciling a Payment of a Coupon
Some aspects of the inventive subject matter can include a method of reconciling a payment of a coupon, as shown in Figure. Such methods can comprise recognizing a real world object related to a purchasable item via a recognition engine. In some embodiments a virtual coupon can be created based on attributes derived from a digital representation of the real world object recognized via the recognition engine. The virtual coupon can then be activated via an event that is triggered from a mobile device. Moreover, a transaction amongst multiple electronic accounts associated with the object can be reconciled as a function of the derived attributes and the virtual coupon.
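A minimal sketch of this coupon method follows; the 10% discount, the activation event label, and the payee shares are assumptions introduced only to illustrate the sequence of recognizing, creating, activating, and reconciling.

```python
# Minimal sketch of the coupon reconciliation method: derive attributes,
# build a virtual coupon, activate it on a device event, then reconcile.
# All names and the 10%-off rule are assumptions for illustration.
def create_virtual_coupon(derived_attributes):
    return {"item": derived_attributes["item"], "discount": 0.10, "active": False}

def activate(coupon, event):
    if event == "tap_on_mobile_device":
        coupon["active"] = True
    return coupon

def reconcile_with_coupon(price, coupon, payee_accounts):
    discount = price * coupon["discount"] if coupon["active"] else 0.0
    net = price - discount
    return {acct: round(net * share, 2) for acct, share in payee_accounts.items()}

coupon = activate(create_virtual_coupon({"item": "coffee"}), "tap_on_mobile_device")
print(reconcile_with_coupon(10.00, coupon, {"merchant": 0.95, "charity": 0.05}))
# {'merchant': 8.55, 'charity': 0.45}
```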
Transaction Apparatus
In some embodiments of the inventive subject matter, a transaction apparatus is provided, as shown in
Method of Mitigating Risk of Transaction Fraud
Some aspects of the inventive subject matter can include a method of mitigating risk of transaction fraud, as shown in
It should be apparent to those skilled in the art that many more modifications besides those already described are possible without departing from the inventive concepts herein. The inventive subject matter, therefore, is not to be restricted except in the scope of the appended claims. Moreover, in interpreting both the specification and the claims, all terms should be interpreted in the broadest possible manner consistent with the context. In particular, the terms “comprises” and “comprising” should be interpreted as referring to elements, components, or steps in a non-exclusive manner, indicating that the referenced elements, components, or steps may be present, or utilized, or combined with other elements, components, or steps that are not expressly referenced. Where the specification or claims refer to at least one of something selected from the group consisting of A, B, C . . . and N, the text should be interpreted as requiring only one element from the group, not A plus N, or B plus N, etc.
This application is a continuation of U.S. application Ser. No. 18/126,916, filed on Mar. 27, 2023, which is a continuation of U.S. application Ser. No. 17/982,463, filed on Nov. 7, 2022, which is a continuation of U.S. application Ser. No. 17/214,644, filed Mar. 26, 2021, which is a divisional of U.S. application Ser. No. 16/841,586, filed Apr. 6, 2020, which is a continuation of U.S. application Ser. No. 16/422,901, filed May 24, 2019, which is a divisional of U.S. application Ser. No. 16/173,882, filed Oct. 29, 2018, which is a continuation of U.S. application Ser. No. 15/947,152, filed Apr. 6, 2018, which is a continuation of U.S. application Ser. No. 15/719,422, filed Sep. 28, 2017, which is a continuation of U.S. application Ser. No. 14/359,913, filed May 21, 2014, which is a national phase of International Application No. PCT/US13/34164, filed Mar. 27, 2013, which is a continuation-in-part of International Application No. PCT/US12/66300, filed Nov. 21, 2012, which claims the benefit of priority to U.S. Provisional Application No. 61/562,385, filed on Nov. 21, 2011. This and all other extrinsic materials discussed herein are incorporated by reference in their entirety. Where a definition or use of a term in an incorporated reference is inconsistent or contrary to the definition of that term provided herein, the definition of that term provided herein applies and the definition of that term in the reference does not apply.
Number | Name | Date | Kind |
---|---|---|---|
3050870 | Heilig | Aug 1962 | A |
5255211 | Redmond | Oct 1993 | A |
5446833 | Miller et al. | Aug 1995 | A |
5625765 | Ellenby et al. | Apr 1997 | A |
5682332 | Ellenby et al. | Oct 1997 | A |
5742521 | Ellenby et al. | Apr 1998 | A |
5748194 | Chen | May 1998 | A |
5751576 | Monson | May 1998 | A |
5759044 | Redmond | Jun 1998 | A |
5815411 | Ellenby et al. | Sep 1998 | A |
5848373 | DeLorme et al. | Dec 1998 | A |
5884029 | Brush, II et al. | Mar 1999 | A |
5991827 | Ellenby et al. | Nov 1999 | A |
6031545 | Ellenby et al. | Feb 2000 | A |
6037936 | Ellenby et al. | Mar 2000 | A |
6052125 | Gardiner et al. | Apr 2000 | A |
6064398 | Ellenby et al. | May 2000 | A |
6064749 | Hirota et al. | May 2000 | A |
6081278 | Chen | Jun 2000 | A |
6092107 | Eleftheriadis et al. | Jul 2000 | A |
6097393 | Prouty, IV et al. | Aug 2000 | A |
6098118 | Ellenby et al. | Aug 2000 | A |
6130673 | Pulli et al. | Oct 2000 | A |
6161126 | Wies et al. | Dec 2000 | A |
6169545 | Gallery et al. | Jan 2001 | B1 |
6173239 | Ellenby | Jan 2001 | B1 |
6215498 | Filo et al. | Apr 2001 | B1 |
6226669 | Huang et al. | May 2001 | B1 |
6240360 | Phelan | May 2001 | B1 |
6242944 | Benedetti et al. | Jun 2001 | B1 |
6256043 | Aho et al. | Jul 2001 | B1 |
6278461 | Ellenby et al. | Aug 2001 | B1 |
6307556 | Ellenby et al. | Oct 2001 | B1 |
6308565 | French et al. | Oct 2001 | B1 |
6336098 | Fortenberry et al. | Jan 2002 | B1 |
6339745 | Novik et al. | Jan 2002 | B1 |
6346938 | Chan et al. | Feb 2002 | B1 |
6396475 | Ellenby et al. | May 2002 | B1 |
6414696 | Ellenby et al. | Jul 2002 | B1 |
6512844 | Bouguet et al. | Jan 2003 | B2 |
6522292 | Ellenby et al. | Feb 2003 | B1 |
6529331 | Massof et al. | Mar 2003 | B2 |
6535210 | Ellenby et al. | Mar 2003 | B1 |
6552729 | Bernardo et al. | Apr 2003 | B1 |
6552744 | Chen | Apr 2003 | B2 |
6553310 | Lopke | Apr 2003 | B1 |
6557041 | Mallart | Apr 2003 | B2 |
6559813 | DeLuca et al. | May 2003 | B1 |
6563489 | Latypov et al. | May 2003 | B1 |
6563529 | Jongerius | May 2003 | B1 |
6577714 | Darcie et al. | Jun 2003 | B1 |
6631403 | Deutsch et al. | Oct 2003 | B1 |
6672961 | Uzun | Jan 2004 | B1 |
6690370 | Ellenby et al. | Feb 2004 | B2 |
6691032 | Irish et al. | Feb 2004 | B1 |
6746332 | Ing et al. | Jun 2004 | B1 |
6751655 | Deutsch et al. | Jun 2004 | B1 |
6757068 | Foxlin | Jun 2004 | B2 |
6767287 | Mcquaid et al. | Jul 2004 | B1 |
6768509 | Bradski et al. | Jul 2004 | B1 |
6774869 | Biocca et al. | Aug 2004 | B2 |
6785667 | Orbanes et al. | Aug 2004 | B2 |
6804726 | Ellenby et al. | Oct 2004 | B1 |
6822648 | Furlong et al. | Nov 2004 | B2 |
6853398 | Malzbender et al. | Feb 2005 | B2 |
6854012 | Taylor | Feb 2005 | B1 |
6882933 | Kondou et al. | Apr 2005 | B2 |
6922155 | Evans et al. | Jul 2005 | B1 |
6930715 | Mower | Aug 2005 | B1 |
6965371 | MacLean et al. | Nov 2005 | B1 |
6968973 | Uyttendaele et al. | Nov 2005 | B2 |
7016532 | Boncyk et al. | Mar 2006 | B2 |
7031875 | Ellenby et al. | Apr 2006 | B2 |
7073129 | Robarts et al. | Jul 2006 | B1 |
7076505 | Campbell | Jul 2006 | B2 |
7113618 | Junkins et al. | Sep 2006 | B2 |
7116326 | Soulchin et al. | Oct 2006 | B2 |
7116342 | Dengler et al. | Oct 2006 | B2 |
7142209 | Uyttendaele et al. | Nov 2006 | B2 |
7143258 | Bae | Nov 2006 | B2 |
7168042 | Braun et al. | Jan 2007 | B2 |
7174301 | Florance et al. | Feb 2007 | B2 |
7206000 | Zitnick, III et al. | Apr 2007 | B2 |
7245273 | Eberl et al. | Jul 2007 | B2 |
7269425 | Valkó et al. | Sep 2007 | B2 |
7271795 | Bradski | Sep 2007 | B2 |
7274380 | Navab et al. | Sep 2007 | B2 |
7280697 | Perona et al. | Oct 2007 | B2 |
7301536 | Ellenby et al. | Nov 2007 | B2 |
7353114 | Rohlf et al. | Apr 2008 | B1 |
7369668 | Huopaniemi et al. | May 2008 | B1 |
7395507 | Robarts et al. | Jul 2008 | B2 |
7406421 | Odinak et al. | Jul 2008 | B2 |
7412427 | Zitnick et al. | Aug 2008 | B2 |
7454361 | Halavais et al. | Nov 2008 | B1 |
7477780 | Boncyk et al. | Jan 2009 | B2 |
7511736 | Benton | Mar 2009 | B2 |
7529639 | Räsänen et al. | May 2009 | B2 |
7532224 | Bannai | May 2009 | B2 |
7564469 | Cohen | Jul 2009 | B2 |
7565008 | Boncyk et al. | Jul 2009 | B2 |
7641342 | Eberl et al. | Jan 2010 | B2 |
7650616 | Lee | Jan 2010 | B2 |
7680324 | Boncyk et al. | Mar 2010 | B2 |
7696905 | Ellenby et al. | Apr 2010 | B2 |
7710395 | Rodgers et al. | May 2010 | B2 |
7714895 | Pretlove et al. | May 2010 | B2 |
7729946 | Chu | Jun 2010 | B2 |
7734412 | Shi et al. | Jun 2010 | B2 |
7768534 | Pentenrieder et al. | Aug 2010 | B2 |
7774180 | Joussemet et al. | Aug 2010 | B2 |
7796155 | Neely, III et al. | Sep 2010 | B1 |
7817104 | Ryu et al. | Oct 2010 | B2 |
7822539 | Akiyoshi et al. | Oct 2010 | B2 |
7828655 | Uhlir et al. | Nov 2010 | B2 |
7844229 | Gyorfi et al. | Nov 2010 | B2 |
7847699 | Lee et al. | Dec 2010 | B2 |
7847808 | Cheng et al. | Dec 2010 | B2 |
7887421 | Tabata | Feb 2011 | B2 |
7889193 | Platonov et al. | Feb 2011 | B2 |
7899915 | Reisman | Mar 2011 | B2 |
7904577 | Taylor | Mar 2011 | B2 |
7907128 | Bathiche et al. | Mar 2011 | B2 |
7908462 | Sung | Mar 2011 | B2 |
7916138 | John et al. | Mar 2011 | B2 |
7962281 | Rasmussen et al. | Jun 2011 | B2 |
7978207 | Herf et al. | Jul 2011 | B1 |
8046408 | Torabi | Oct 2011 | B2 |
8118297 | Izumichi | Feb 2012 | B2 |
8130242 | Cohen | Mar 2012 | B2 |
8130260 | Krill et al. | Mar 2012 | B2 |
8160994 | Ong et al. | Apr 2012 | B2 |
8170222 | Dunko | May 2012 | B2 |
8189959 | Szeliski et al. | May 2012 | B2 |
8190749 | Chi et al. | May 2012 | B1 |
8204299 | Arcas et al. | Jun 2012 | B2 |
8218873 | Boncyk et al. | Jul 2012 | B2 |
8223024 | Petrou et al. | Jul 2012 | B1 |
8223088 | Gomez et al. | Jul 2012 | B1 |
8224077 | Boncyk et al. | Jul 2012 | B2 |
8224078 | Boncyk et al. | Jul 2012 | B2 |
8246467 | Huang et al. | Aug 2012 | B2 |
8251819 | Watkins, Jr. et al. | Aug 2012 | B2 |
8291346 | Kerr et al. | Oct 2012 | B2 |
8315432 | Lefevre et al. | Nov 2012 | B2 |
8321527 | Martin et al. | Nov 2012 | B2 |
8374395 | Lefevre et al. | Feb 2013 | B2 |
8417261 | Huston et al. | Apr 2013 | B2 |
8427508 | Mattila et al. | Apr 2013 | B2 |
8438110 | Calman et al. | May 2013 | B2 |
8472972 | Nadler et al. | Jun 2013 | B2 |
8488011 | Blanchflower et al. | Jul 2013 | B2 |
8489993 | Tamura et al. | Jul 2013 | B2 |
8498814 | Irish et al. | Jul 2013 | B2 |
8502835 | Meehan | Aug 2013 | B1 |
8509483 | Inigo | Aug 2013 | B2 |
8519844 | Richey et al. | Aug 2013 | B2 |
8527340 | Fisher et al. | Sep 2013 | B2 |
8531449 | Lynch et al. | Sep 2013 | B2 |
8537113 | Weising et al. | Sep 2013 | B2 |
8558759 | Gomez et al. | Oct 2013 | B1 |
8576276 | Bar-Zeev et al. | Nov 2013 | B2 |
8576756 | Ko et al. | Nov 2013 | B2 |
8585476 | Mullen et al. | Nov 2013 | B2 |
8605141 | Dialameh et al. | Dec 2013 | B2 |
8606657 | Chesnut et al. | Dec 2013 | B2 |
8633946 | Cohen | Jan 2014 | B2 |
8645220 | Harper et al. | Feb 2014 | B2 |
8660369 | Llano et al. | Feb 2014 | B2 |
8660951 | Calman et al. | Feb 2014 | B2 |
8675017 | Rose et al. | Mar 2014 | B2 |
8686924 | Braun et al. | Apr 2014 | B2 |
8700060 | Huang | Apr 2014 | B2 |
8706170 | Jacobsen et al. | Apr 2014 | B2 |
8706399 | Irish et al. | Apr 2014 | B2 |
8711176 | Douris et al. | Apr 2014 | B2 |
8727887 | Mahajan et al. | May 2014 | B2 |
8730156 | Weising et al. | May 2014 | B2 |
8743145 | Price et al. | Jun 2014 | B1 |
8743244 | Vartanian et al. | Jun 2014 | B2 |
8744214 | Snavely et al. | Jun 2014 | B2 |
8745494 | Spivack | Jun 2014 | B2 |
8751159 | Hall | Jun 2014 | B2 |
8754907 | Tseng | Jun 2014 | B2 |
8762047 | Sterkel et al. | Jun 2014 | B2 |
8764563 | Toyoda | Jul 2014 | B2 |
8786675 | Deering et al. | Jul 2014 | B2 |
8803917 | Meehan | Aug 2014 | B2 |
8810598 | Soon-Shiong | Aug 2014 | B2 |
8814691 | Haddick et al. | Aug 2014 | B2 |
8855719 | Jacobsen et al. | Oct 2014 | B2 |
8872851 | Choubassi et al. | Oct 2014 | B2 |
8893164 | Teller | Nov 2014 | B1 |
8913085 | Anderson et al. | Dec 2014 | B2 |
8933841 | Valaee et al. | Jan 2015 | B2 |
8938464 | Bailly et al. | Jan 2015 | B2 |
8958979 | Levine et al. | Feb 2015 | B1 |
8965741 | McCulloch et al. | Feb 2015 | B2 |
8968099 | Hanke et al. | Mar 2015 | B1 |
8994645 | Meehan | Mar 2015 | B1 |
9001252 | Hannaford | Apr 2015 | B2 |
9007364 | Bailey | Apr 2015 | B2 |
9024842 | Gomez et al. | May 2015 | B1 |
9024972 | Bronder et al. | May 2015 | B1 |
9026940 | Jung | May 2015 | B2 |
9037468 | Osman | May 2015 | B2 |
9041739 | Latta et al. | May 2015 | B2 |
9047609 | Ellis et al. | Jun 2015 | B2 |
9071709 | Wither et al. | Jun 2015 | B2 |
9098905 | Rivlin et al. | Aug 2015 | B2 |
9122053 | Geisner et al. | Sep 2015 | B2 |
9122321 | Perez et al. | Sep 2015 | B2 |
9122368 | Szeliski et al. | Sep 2015 | B2 |
9122707 | Wither et al. | Sep 2015 | B2 |
9128520 | Geisner et al. | Sep 2015 | B2 |
9129644 | Gay et al. | Sep 2015 | B2 |
9131208 | Jin | Sep 2015 | B2 |
9143839 | Reisman et al. | Sep 2015 | B2 |
9167386 | Valaee et al. | Oct 2015 | B2 |
9177381 | McKinnon | Nov 2015 | B2 |
9178953 | Theimer et al. | Nov 2015 | B2 |
9182815 | Small et al. | Nov 2015 | B2 |
9183560 | Abelow | Nov 2015 | B2 |
9230367 | Stroila | Jan 2016 | B2 |
9240074 | Berkovich et al. | Jan 2016 | B2 |
9245387 | Poulos et al. | Jan 2016 | B2 |
9262743 | Heins et al. | Feb 2016 | B2 |
9264515 | Ganapathy et al. | Feb 2016 | B2 |
9280258 | Bailly et al. | Mar 2016 | B1 |
9311397 | Meadow et al. | Apr 2016 | B2 |
9317133 | Korah et al. | Apr 2016 | B2 |
9345957 | Geisner et al. | May 2016 | B2 |
9377862 | Parkinson et al. | Jun 2016 | B2 |
9384737 | Lamb et al. | Jul 2016 | B2 |
9389090 | Levine et al. | Jul 2016 | B1 |
9396589 | Soon-Shiong | Jul 2016 | B2 |
9466144 | Sharp et al. | Oct 2016 | B2 |
9480913 | Briggs | Nov 2016 | B2 |
9482528 | Baker et al. | Nov 2016 | B2 |
9495591 | Visser et al. | Nov 2016 | B2 |
9495760 | Swaminathan et al. | Nov 2016 | B2 |
9498720 | Geisner et al. | Nov 2016 | B2 |
9503310 | Hawkes et al. | Nov 2016 | B1 |
9536251 | Huang et al. | Jan 2017 | B2 |
9552673 | Hilliges et al. | Jan 2017 | B2 |
9558557 | Jiang et al. | Jan 2017 | B2 |
9573064 | Kinnebrew et al. | Feb 2017 | B2 |
9582516 | McKinnon et al. | Feb 2017 | B2 |
9602859 | Strong | Mar 2017 | B2 |
9662582 | Mullen | May 2017 | B2 |
9678654 | Wong et al. | Jun 2017 | B2 |
9782668 | Golden et al. | Oct 2017 | B1 |
9805385 | Soon-Shiong | Oct 2017 | B2 |
9817848 | McKinnon et al. | Nov 2017 | B2 |
9824501 | Soon-Shiong | Nov 2017 | B2 |
9891435 | Boger et al. | Feb 2018 | B2 |
9942420 | Rao et al. | Apr 2018 | B2 |
9972208 | Levine et al. | May 2018 | B2 |
10002337 | Siddique et al. | Jun 2018 | B2 |
10007928 | Graham | Jun 2018 | B2 |
10062213 | Mount et al. | Aug 2018 | B2 |
10068381 | Blanchflower et al. | Sep 2018 | B2 |
10115122 | Soon-Shiong | Oct 2018 | B2 |
10127733 | Soon-Shiong | Nov 2018 | B2 |
10133342 | Mittal et al. | Nov 2018 | B2 |
10140317 | McKinnon et al. | Nov 2018 | B2 |
10147113 | Soon-Shiong | Dec 2018 | B2 |
10217284 | Das et al. | Feb 2019 | B2 |
10304073 | Soon-Shiong | May 2019 | B2 |
10339717 | Weisman et al. | Jul 2019 | B2 |
10403051 | Soon-Shiong | Sep 2019 | B2 |
10509461 | Mullen | Dec 2019 | B2 |
10565828 | Amaitis et al. | Feb 2020 | B2 |
10614477 | Soon-Shiong | Apr 2020 | B2 |
10664518 | McKinnon et al. | May 2020 | B2 |
10675543 | Reiche, III | Jun 2020 | B2 |
10828559 | Mullen | Nov 2020 | B2 |
10838485 | Mullen | Nov 2020 | B2 |
11004102 | Soon-Shiong | May 2021 | B2 |
11107289 | Soon-Shiong | Aug 2021 | B2 |
11263822 | Weisman et al. | Mar 2022 | B2 |
11270114 | Park et al. | Mar 2022 | B2 |
11514652 | Soon-Shiong | Nov 2022 | B2 |
11521226 | Soon-Shiong | Dec 2022 | B2 |
11645668 | Soon-Shiong | May 2023 | B2 |
11854036 | Soon-Shiong | Dec 2023 | B2 |
11854153 | Soon-Shiong | Dec 2023 | B2 |
11869160 | Soon-Shiong | Jan 2024 | B2 |
20010045978 | McConnell et al. | Nov 2001 | A1 |
20020044152 | Abbott, III et al. | Apr 2002 | A1 |
20020077905 | Arndt et al. | Jun 2002 | A1 |
20020080167 | Andrews et al. | Jun 2002 | A1 |
20020086669 | Bos et al. | Jul 2002 | A1 |
20020107634 | Luciani | Aug 2002 | A1 |
20020133291 | Hamada et al. | Sep 2002 | A1 |
20020138607 | Rourke et al. | Sep 2002 | A1 |
20020158873 | Williamson | Oct 2002 | A1 |
20020163521 | Ellenby et al. | Nov 2002 | A1 |
20030004802 | Callegari | Jan 2003 | A1 |
20030008619 | Werner | Jan 2003 | A1 |
20030027634 | Matthews, III | Feb 2003 | A1 |
20030060211 | Chern et al. | Mar 2003 | A1 |
20030069693 | Snapp et al. | Apr 2003 | A1 |
20030177187 | Levine et al. | Sep 2003 | A1 |
20030195022 | Lynch et al. | Oct 2003 | A1 |
20030212996 | Wolzien | Nov 2003 | A1 |
20030224855 | Cunningham | Dec 2003 | A1 |
20030234859 | Malzbender et al. | Dec 2003 | A1 |
20040002843 | Robarts et al. | Jan 2004 | A1 |
20040058732 | Piccionelli | Mar 2004 | A1 |
20040104935 | Williamson et al. | Jun 2004 | A1 |
20040110565 | Levesque | Jun 2004 | A1 |
20040164897 | Treadwell et al. | Aug 2004 | A1 |
20040193441 | Altieri | Sep 2004 | A1 |
20040203380 | Hamdi et al. | Oct 2004 | A1 |
20040221053 | Codella et al. | Nov 2004 | A1 |
20040223190 | Oka | Nov 2004 | A1 |
20040246333 | Steuart, III | Dec 2004 | A1 |
20040248653 | Barros et al. | Dec 2004 | A1 |
20050004753 | Weiland et al. | Jan 2005 | A1 |
20050024501 | Ellenby et al. | Feb 2005 | A1 |
20050043097 | March et al. | Feb 2005 | A1 |
20050047647 | Rutishauser et al. | Mar 2005 | A1 |
20050049022 | Mullen | Mar 2005 | A1 |
20050060377 | Lo et al. | Mar 2005 | A1 |
20050143172 | Kurzweil | Jun 2005 | A1 |
20050192025 | Kaplan | Sep 2005 | A1 |
20050197767 | Nortrup | Sep 2005 | A1 |
20050202877 | Uhlir et al. | Sep 2005 | A1 |
20050208457 | Fink et al. | Sep 2005 | A1 |
20050223031 | Zisserman et al. | Oct 2005 | A1 |
20050285878 | Singh et al. | Dec 2005 | A1 |
20050289590 | Cheok et al. | Dec 2005 | A1 |
20060010256 | Heron et al. | Jan 2006 | A1 |
20060025229 | Mahajan et al. | Feb 2006 | A1 |
20060038833 | Mallinson et al. | Feb 2006 | A1 |
20060047704 | Gopalakrishnan | Mar 2006 | A1 |
20060105838 | Mullen | May 2006 | A1 |
20060160619 | Skoglund | Jul 2006 | A1 |
20060161379 | Ellenby et al. | Jul 2006 | A1 |
20060166740 | Sufuentes | Jul 2006 | A1 |
20060190812 | Ellenby et al. | Aug 2006 | A1 |
20060223635 | Rosenberg | Oct 2006 | A1 |
20060223637 | Rosenberg | Oct 2006 | A1 |
20060249572 | Chen et al. | Nov 2006 | A1 |
20060259361 | Barhydt et al. | Nov 2006 | A1 |
20060262140 | Kujawa et al. | Nov 2006 | A1 |
20070035562 | Azuma et al. | Feb 2007 | A1 |
20070038944 | Carignano et al. | Feb 2007 | A1 |
20070060408 | Schultz et al. | Mar 2007 | A1 |
20070066358 | Silverbrook et al. | Mar 2007 | A1 |
20070070069 | Samarasekera et al. | Mar 2007 | A1 |
20070087828 | Robertson et al. | Apr 2007 | A1 |
20070099703 | Terebilo | May 2007 | A1 |
20070109619 | Eberl et al. | May 2007 | A1 |
20070146391 | Pentenrieder et al. | Jun 2007 | A1 |
20070162341 | McConnell | Jul 2007 | A1 |
20070167237 | Wang et al. | Jul 2007 | A1 |
20070173265 | Gum | Jul 2007 | A1 |
20070182739 | Platonov et al. | Aug 2007 | A1 |
20070265089 | Robarts et al. | Nov 2007 | A1 |
20070271301 | Klive | Nov 2007 | A1 |
20070288332 | Naito | Dec 2007 | A1 |
20080024594 | Ritchey | Jan 2008 | A1 |
20080030429 | Hailpern et al. | Feb 2008 | A1 |
20080071559 | Arrasvuori | Mar 2008 | A1 |
20080081638 | Boland et al. | Apr 2008 | A1 |
20080106489 | Brown et al. | May 2008 | A1 |
20080125218 | Collins et al. | May 2008 | A1 |
20080129528 | Guthrie | Jun 2008 | A1 |
20080132251 | Altman et al. | Jun 2008 | A1 |
20080147325 | Maassel et al. | Jun 2008 | A1 |
20080154538 | Stathis | Jun 2008 | A1 |
20080157946 | Eberl et al. | Jul 2008 | A1 |
20080198159 | Liu et al. | Aug 2008 | A1 |
20080198222 | Gowda | Aug 2008 | A1 |
20080211813 | Jamwal et al. | Sep 2008 | A1 |
20080261697 | Chatani et al. | Oct 2008 | A1 |
20080262910 | Altberg et al. | Oct 2008 | A1 |
20080268876 | Gelfand | Oct 2008 | A1 |
20080291205 | Rasmussen et al. | Nov 2008 | A1 |
20080319656 | Irish | Dec 2008 | A1 |
20090003662 | Joseph et al. | Jan 2009 | A1 |
20090013052 | Robarts et al. | Jan 2009 | A1 |
20090037103 | Herbst et al. | Feb 2009 | A1 |
20090061901 | Arrasvuori | Mar 2009 | A1 |
20090081959 | Gyorfi et al. | Mar 2009 | A1 |
20090102859 | Athsani et al. | Apr 2009 | A1 |
20090144148 | Jung | Jun 2009 | A1 |
20090149250 | Middleton | Jun 2009 | A1 |
20090167787 | Bathiche et al. | Jul 2009 | A1 |
20090167919 | Anttila et al. | Jul 2009 | A1 |
20090176509 | Davis et al. | Jul 2009 | A1 |
20090187389 | Dobbins et al. | Jul 2009 | A1 |
20090193055 | Kuberka et al. | Jul 2009 | A1 |
20090195650 | Hanai et al. | Aug 2009 | A1 |
20090209270 | Gutierrez et al. | Aug 2009 | A1 |
20090210486 | Lim | Aug 2009 | A1 |
20090213114 | Dobbins et al. | Aug 2009 | A1 |
20090219224 | Elg et al. | Sep 2009 | A1 |
20090222742 | Pelton et al. | Sep 2009 | A1 |
20090237546 | Bloebaum et al. | Sep 2009 | A1 |
20090248300 | Dunko et al. | Oct 2009 | A1 |
20090271160 | Copenhagen et al. | Oct 2009 | A1 |
20090271715 | Tumuluri | Oct 2009 | A1 |
20090284553 | Seydoux | Nov 2009 | A1 |
20090285483 | Guven | Nov 2009 | A1 |
20090287587 | Bloebaum | Nov 2009 | A1 |
20090289956 | Douris et al. | Nov 2009 | A1 |
20090293012 | Alter et al. | Nov 2009 | A1 |
20090319902 | Kneller et al. | Dec 2009 | A1 |
20090322671 | Scott et al. | Dec 2009 | A1 |
20090325607 | Conway et al. | Dec 2009 | A1 |
20100008255 | Khosravy et al. | Jan 2010 | A1 |
20100023878 | Douris et al. | Jan 2010 | A1 |
20100045933 | Eberl et al. | Feb 2010 | A1 |
20100048242 | Rhoads et al. | Feb 2010 | A1 |
20100087250 | Chiu | Apr 2010 | A1 |
20100113157 | Chin et al. | May 2010 | A1 |
20100138294 | Bussmann et al. | Jun 2010 | A1 |
20100162149 | Sheleheda et al. | Jun 2010 | A1 |
20100185504 | Rajan | Jul 2010 | A1 |
20100188638 | Eberl et al. | Jul 2010 | A1 |
20100189309 | Rouzes et al. | Jul 2010 | A1 |
20100194782 | Gyorfi et al. | Aug 2010 | A1 |
20100208033 | Edge et al. | Aug 2010 | A1 |
20100211506 | Chang et al. | Aug 2010 | A1 |
20100217855 | Przybysz et al. | Aug 2010 | A1 |
20100241628 | Levanon | Sep 2010 | A1 |
20100246969 | Winder et al. | Sep 2010 | A1 |
20100257252 | Dougherty et al. | Oct 2010 | A1 |
20100287485 | Bertolami et al. | Nov 2010 | A1 |
20100302143 | Spivack | Dec 2010 | A1 |
20100306120 | Ciptawilangga | Dec 2010 | A1 |
20100309097 | Raviv et al. | Dec 2010 | A1 |
20100315418 | Woo | Dec 2010 | A1 |
20100321389 | Gay et al. | Dec 2010 | A1 |
20100321540 | Woo et al. | Dec 2010 | A1 |
20100325154 | Schloter et al. | Dec 2010 | A1 |
20110018903 | Lapstun et al. | Jan 2011 | A1 |
20110028220 | Reiche, III | Feb 2011 | A1 |
20110034176 | Lord et al. | Feb 2011 | A1 |
20110038634 | DeCusatis et al. | Feb 2011 | A1 |
20110039622 | Levenson et al. | Feb 2011 | A1 |
20110055049 | Harper et al. | Mar 2011 | A1 |
20110093326 | Bous | Apr 2011 | A1 |
20110134108 | Hertenstein | Jun 2011 | A1 |
20110142016 | Chatterjee | Jun 2011 | A1 |
20110145051 | Paradise et al. | Jun 2011 | A1 |
20110148922 | Son et al. | Jun 2011 | A1 |
20110151955 | Nave | Jun 2011 | A1 |
20110153186 | Jakobson | Jun 2011 | A1 |
20110183754 | Alghamdi | Jul 2011 | A1 |
20110202460 | Buer et al. | Aug 2011 | A1 |
20110205242 | Friesen | Aug 2011 | A1 |
20110212762 | Ocko et al. | Sep 2011 | A1 |
20110216060 | Weising et al. | Sep 2011 | A1 |
20110221771 | Cramer et al. | Sep 2011 | A1 |
20110225069 | Cramer | Sep 2011 | A1 |
20110234631 | Kim et al. | Sep 2011 | A1 |
20110238751 | Belimpasakis et al. | Sep 2011 | A1 |
20110241976 | Boger et al. | Oct 2011 | A1 |
20110246064 | Nicholson | Oct 2011 | A1 |
20110246276 | Peters | Oct 2011 | A1 |
20110249122 | Tricoukes et al. | Oct 2011 | A1 |
20110279445 | Murphy et al. | Nov 2011 | A1 |
20110282747 | Lavrov | Nov 2011 | A1 |
20110316880 | Ojala et al. | Dec 2011 | A1 |
20110319148 | Kinnebrew et al. | Dec 2011 | A1 |
20120019557 | Aronsson et al. | Jan 2012 | A1 |
20120050144 | Morlock | Mar 2012 | A1 |
20120050503 | Kraft | Mar 2012 | A1 |
20120092328 | Flaks et al. | Apr 2012 | A1 |
20120098859 | Lee et al. | Apr 2012 | A1 |
20120105473 | Bar-Zeev et al. | May 2012 | A1 |
20120105474 | Cudalbu et al. | May 2012 | A1 |
20120105475 | Tseng et al. | May 2012 | A1 |
20120109773 | Sipper et al. | May 2012 | A1 |
20120110477 | Gaume | May 2012 | A1 |
20120113141 | Zimmerman et al. | May 2012 | A1 |
20120116920 | Adhikari et al. | May 2012 | A1 |
20120122570 | Baronoff | May 2012 | A1 |
20120127062 | Bar-Zeev et al. | May 2012 | A1 |
20120127201 | Kim et al. | May 2012 | A1 |
20120127284 | Bar-Zeev et al. | May 2012 | A1 |
20120139817 | Freeman | Jun 2012 | A1 |
20120150746 | Graham | Jun 2012 | A1 |
20120157210 | Hall | Jun 2012 | A1 |
20120162255 | Ganapathy et al. | Jun 2012 | A1 |
20120194547 | Johnson et al. | Aug 2012 | A1 |
20120206452 | Geisner et al. | Aug 2012 | A1 |
20120219181 | Tseng et al. | Aug 2012 | A1 |
20120226437 | Li et al. | Sep 2012 | A1 |
20120229625 | Calman et al. | Sep 2012 | A1 |
20120231424 | Calman et al. | Sep 2012 | A1 |
20120231891 | Watkins, Jr. et al. | Sep 2012 | A1 |
20120232968 | Calman | Sep 2012 | A1 |
20120232976 | Calman | Sep 2012 | A1 |
20120233070 | Calman | Sep 2012 | A1 |
20120233072 | Calman | Sep 2012 | A1 |
20120236025 | Jacobsen et al. | Sep 2012 | A1 |
20120244950 | Braun | Sep 2012 | A1 |
20120252359 | Adams et al. | Oct 2012 | A1 |
20120256917 | Lieberman et al. | Oct 2012 | A1 |
20120260538 | Schob et al. | Oct 2012 | A1 |
20120276997 | Chowdhary et al. | Nov 2012 | A1 |
20120287284 | Jacobsen et al. | Nov 2012 | A1 |
20120293506 | Vertucci et al. | Nov 2012 | A1 |
20120302129 | Persaud et al. | Nov 2012 | A1 |
20130021373 | Vaught et al. | Jan 2013 | A1 |
20130044042 | Olsson et al. | Feb 2013 | A1 |
20130044128 | Liu et al. | Feb 2013 | A1 |
20130050258 | Liu et al. | Feb 2013 | A1 |
20130050496 | Jeong | Feb 2013 | A1 |
20130064426 | Watkins, Jr. et al. | Mar 2013 | A1 |
20130073988 | Groten et al. | Mar 2013 | A1 |
20130076788 | Zvi | Mar 2013 | A1 |
20130124563 | CaveLie et al. | May 2013 | A1 |
20130128060 | Rhoads et al. | May 2013 | A1 |
20130141419 | Mount et al. | Jun 2013 | A1 |
20130147836 | Small et al. | Jun 2013 | A1 |
20130159096 | Santhanagopal et al. | Jun 2013 | A1 |
20130176202 | Gervautz | Jul 2013 | A1 |
20130178257 | Langseth | Jul 2013 | A1 |
20130236040 | Crawford et al. | Sep 2013 | A1 |
20130326364 | Latta et al. | Dec 2013 | A1 |
20130335405 | Scavezze et al. | Dec 2013 | A1 |
20130342572 | Poulos et al. | Dec 2013 | A1 |
20140002492 | Lamb et al. | Jan 2014 | A1 |
20140101608 | Ryskamp et al. | Apr 2014 | A1 |
20140161323 | Livyatan et al. | Jun 2014 | A1 |
20140168261 | Margolis et al. | Jun 2014 | A1 |
20140184749 | Hilliges et al. | Jul 2014 | A1 |
20140267234 | Hook et al. | Sep 2014 | A1 |
20140306866 | Miller et al. | Oct 2014 | A1 |
20150091941 | Das et al. | Apr 2015 | A1 |
20150172626 | Martini | Jun 2015 | A1 |
20150206349 | Rosenthal et al. | Jul 2015 | A1 |
20150288944 | Nistico et al. | Oct 2015 | A1 |
20160269712 | Ostrover et al. | Sep 2016 | A1 |
20160292924 | Balachandreswaran et al. | Oct 2016 | A1 |
20170045941 | Tokubo et al. | Feb 2017 | A1 |
20170087465 | Lyons et al. | Mar 2017 | A1 |
20170216099 | Saladino | Aug 2017 | A1 |
20180300822 | Papakipos et al. | Oct 2018 | A1 |
20200005547 | Soon-Shiong | Jan 2020 | A1 |
20200257721 | McKinnon et al. | Aug 2020 | A1 |
20210358223 | Soon-Shiong | Nov 2021 | A1 |
20220156314 | McKinnon et al. | May 2022 | A1 |
20230051746 | Soon-Shiong | Feb 2023 | A1 |
20240037857 | Mckinnon et al. | Feb 2024 | A1 |
20240062486 | Soon-Shiong | Feb 2024 | A1 |
Number | Date | Country |
---|---|---|
2 311 319 | Jun 1999 | CA |
2235030 | Aug 1999 | CA |
2233047 | Sep 2000 | CA |
102436461 | May 2012 | CN |
102484730 | May 2012 | CN |
102509342 | Jun 2012 | CN |
102509348 | Jun 2012 | CN |
1 012 725 | Jun 2000 | EP |
1 246 080 | Oct 2002 | EP |
1 354 260 | Oct 2003 | EP |
1 119 798 | Mar 2005 | EP |
1 965 344 | Sep 2008 | EP |
2 207 113 | Jul 2010 | EP |
1 588 537 | Aug 2010 | EP |
2484384 | Apr 2012 | GB |
2001-286674 | Oct 2001 | JP |
2002-056163 | Feb 2002 | JP |
2002-282553 | Oct 2002 | JP |
2002-346226 | Dec 2002 | JP |
2003-305276 | Oct 2003 | JP |
2003-337903 | Nov 2003 | JP |
2004-64398 | Feb 2004 | JP |
2004-078385 | Mar 2004 | JP |
2005-196494 | Jul 2005 | JP |
2005-215922 | Aug 2005 | JP |
2005-316977 | Nov 2005 | JP |
2006-085518 | Mar 2006 | JP |
2006-190099 | Jul 2006 | JP |
2006-280480 | Oct 2006 | JP |
2007-222640 | Sep 2007 | JP |
2010-102588 | May 2010 | JP |
2010-118019 | May 2010 | JP |
2010-224884 | Oct 2010 | JP |
2011-60254 | Mar 2011 | JP |
2011-153324 | Aug 2011 | JP |
2011-253324 | Dec 2011 | JP |
2012-014220 | Jan 2012 | JP |
2010-0124947 | Nov 2010 | KR |
20120082672 | Jul 2012 | KR |
10-1171264 | Aug 2012 | KR |
9509411 | Apr 1995 | WO |
9744737 | Nov 1997 | WO |
9850884 | Nov 1998 | WO |
9942946 | Aug 1999 | WO |
9942947 | Aug 1999 | WO |
0020929 | Apr 2000 | WO |
0163487 | Aug 2001 | WO |
0171282 | Sep 2001 | WO |
0188679 | Nov 2001 | WO |
0203091 | Jan 2002 | WO |
0242921 | May 2002 | WO |
02059716 | Aug 2002 | WO |
02073818 | Sep 2002 | WO |
2007140155 | Dec 2007 | WO |
2010079876 | Jul 2010 | WO |
2010138344 | Dec 2010 | WO |
2011028720 | Mar 2011 | WO |
WO-2011084720 | Jul 2011 | WO |
2011163063 | Dec 2011 | WO |
2012082807 | Jun 2012 | WO |
2012164155 | Dec 2012 | WO |
2013023705 | Feb 2013 | WO |
WO-2013095383 | Jun 2013 | WO |
2014108799 | Jul 2014 | WO |
Entry |
---|
Varshney, Upkar. “Location Management for Mobile Commerce Applications in Wireless Internet Environment.” ACM transactions on Internet technology 3.3 (2003): 236-255. Web. (Year: 2003). |
Hühn, Arief Ernst et al. “On the Use of Virtual Environments for the Evaluation of Location-Based Applications.” Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2012. 2569-2578. Web. (Year: 2012). |
Jackson, Emily. “Shopper Marketing Techs up; From Geo-Targeting To Augmented Reality Shopping Apps, Brands And Retailers Get Digital.” Strategy (2012): 18-. Print. (Year: 2012). |
Warner, Chris. Augmented Reality Helps Retailers Get Personal: Sensors Are First in Line at the Point of Sale. vol. 56. Advantage Business Media, 2012. Print. (Year: 2012). |
Nelson, “THQ Announces ‘Star Wars: Falcon Gunner’ Augmented Reality Shooter,” https://toucharcade.com/2010/11/04/thq-announces-star-wars-falcon-gunner-augmented-reality-shooter/, 6 pages. |
Rogers, “Review: Star Wars Arcade: Falcon Gunner,” isource.com/2010/12/04/review-star-wars-arcade-falcon-gunner/, downloaded on Feb. 9, 2021, 14 pages. |
Schonfeld, “The First Augmented Reality Star Wars Game, Falcon Gunner, Hits The App Store,” https://techcrunch.com/2010/11/17/star-wars-iphone-falcon-gunner/, 15 pages. |
“How It Works,” https://web.archive.org/web/20130922212452/http://www.strava.com/how-it-works, 4 pages. |
“Tour,” https://web.archive.org/web/20110317045223/http://www.strava.com/tour, 9 pages. |
McCavitt, “Turf Wars iPhone Game Review,” https://web.archive.org/web/20100227030259/http://www.thegamereviews.com:80/article-1627-Turf-Wars-iPhone-Game-Review.html, 2 pages. |
“Turf Wars Captures Apple's iPad,” old.gamegrin.com/game/news/2010/turf-wars-captures-apples-ipad, downloaded on Feb. 5, 2021, 2 pages. |
James, “Turf Wars (iPhone GPS Game) Guide and Walkthrough,” https://web.archive.org/web/20120114125609/http://gameolosophy.com/games/turf-wars-iphone-gps-game-guide-and-walkthrough, 3 pages. |
Rachel et al., “Turf Wars' Nick Baicoianu—Exclusive Interview,” https://web.archive.org/web/20110101031555/http://www.gamingangels.com/2009/12/turf-wars-nick-baicoianu-exclusive-interview/, 7 pages. |
Gharrity, “Turf Wars Q&A,” https://web.archive.org/web/20110822135221/http://blastmagazine.com/the-magazine/gaming/gaming-news/turf-wars-qa/, 11 pages. |
“Introducing Turf Wars, the Free, GPS based Crime Game for Apple iPhone,” https://www.ign.com/articles/2009/12/07/introducing-turf-wars-the-free-gps-based-crime-game-for-apple-iphone, 11 pages. |
“Turf Wars,” https://web.archive.org/web/20100328171725/http://itunes.apple.com:80/app/turf-wars/id332185049? mt=8, 3 pages. |
Zungre, “Turf Wars Uses GPS to Control Real World Territory,” https://web.archive.org/web/20110810235149/http://www.slidetoplay.com/story/turf-wars-uses-gps-to-control-real-world-territory, 1 page. |
“Turf Wars,” https://web.archive.org/web/20101220170329/http://turfwarsapp.com/, 1 page. |
“Turf Wars News,” https://web.archive.org/web/20101204075000/http://turfwarsapp.com/news/, 5 pages. |
“Turf Wars Screenshots,” https://web.archive.org/web/20101204075000/http://turfwarsapp.com/news/, 5 pages. |
Buchanan, “UFO on Tape Review,” https://www.ign.com/articles/2010/09/30/ufo-on-tape-review, 7 pages. |
Barry, “Waze Combines Crowdsourced GPS and Pac-Man,” https://www.wired.com/2010/11/waze-combines-crowdsourced-gps-and-pac-man/, 2 pages. |
Forrest, “Waze: Make Your Own Maps in Realtime,” http://radar.oreilly.com/2009/08/waze-make-your-own-maps-in-rea.html, 4 pages. |
Forrest, “Waze: Using groups and gaming to get geodata,” http://radar.oreilly.com/2010/08/waze-using-groups-and-gaming-t.html, 3 pages. |
Furchgott, “The Blog; App Warns Drivers of the Mayhem Ahead, ” https://archive.nytimes.com/query.nytimes.com/gst/fullpage-9B07EFDC1E3BF930A25751C0A967908B63.html, downloaded on Feb. 17, 2021, 2 pages. |
Rogers, “Review: Waze for the iPhone,” isource.com/2010/08/30/review-waze-for-the-iphone/, downloaded on Feb. 17, 2021, 22 page. |
“Developers—Download Wikitude API,” https://web.archive.org/web/20110702200814/http://www.wikitude.com/en/developers, 8 pages. |
Hauser, “Wikitude World Browser,” https://web.archive.org/web/20110722165744/http:/www.wikitude.com/en/wikitude-world-browser-augmented-reality, 5 pages. |
Madden, “Professional augmented reality browsers for smartphones: programming for junaio, layar and wikitude,” 2011, 345 pages. |
Chen, “Yelp Sneaks Augmented Reality Into iPhone App,” https://www.wired.com/2009/08/yelp-ar/, 2 pages. |
“Easter Egg: Yelp Is the iPhone's First Augmented Reality App,” https://mashable.com/2009/08/27/yelp-augmented-reality/, downloaded Feb. 5, 2021, 10 pages. |
Schramm, “Voices that Matter iPhone: How Ben Newhouse created Yelp Monocle, and the future of AR,” https://www.engadget.com/2010-04-26-voices-that-matter-iphone-how-ben-newhouse-created-yelp-monocle.html, 7 pages. |
Hand, “NYC Nearest Subway AR App for iPhone 3GS,” https://vizworld.com/2009/07/nyc-nearest-subway-ar-app-for-iphone-3gs/, 7 pages. |
Hartsock, “Acrossair: Getting There Is Half the Fun,” https://www.technewsworld.com/story/70502.html, downloaded on Mar. 12, 2021, 5 pages. |
“AugmentedWorks—iPhone Apps Travel Guide with AR: Augmented GeoTravel 3.0.0,” https://web.archive.org/web/20110128180606/http://augmentedworks.com/, 3 pages. |
“Augmented GeoTravel—Features,” https://web.archive.org/web/20100909163937/http://www.augmentedworks.com/en/augmented-geotravel/features, 2 pages. |
Lin, “How is Nike+ Heat Map Calculated?,” howtonike.blogspot.com/2012/06/how-is-nike-heat-map-calculated.html, 4 pages. |
“Map your run with new Nike+ GPS App,” Nike News, Sep. 7, 2010, 3 pages. |
Savov, “App review: Nike+ GPS,” https://www.engadget.com/2010-09-07-app-review-nike-gps.html, 4 pages. |
Lutz, “Nokia reveals new City Lens augmented reality app for Windows Phone 8 lineup,” https://www.engadget.com/2012-09-11-nokia-reveals-new-city-lens-for-windows-phone-8.html, 3 pages. |
Webster, “Nokia's City Lens augmented reality app for Lumia Windows Phones comes out of beta,” https://www.theverge.com/2012/9/2/3287420/nokias-city-lens-ar-app-launch, 2 pages. |
“Nokia Image Space on video,” https://blogs.windows.com/devices/2008/09/24/nokia-image-space-on-video/, 4 pages. |
Montola et al., “Applying Game Achievement Systems to Enhance User Experience in a Photo Sharing Service,” Proceedings of the 13th International MindTrek Conference: Everyday Life in the Ubiquitous Era, 2009, pp. 94-97. |
Uusitalo et al., “A Solution for Navigating User-Generated Content,” 2009 8th IEEE International Symposium on Mixed and Augmented Reality, 2009, pp. 219-220. |
Bonetti, “Here brings sight recognition to Maps,” https://web.archive.org/web/20130608025413/http://conversations.nokia.com/2013/05/21/here-brings-sight-recognition-to-maps/, 5 pages. |
Burns, “Nokia City Lens released from Beta for Lumia devices,” https://www.slashgear.com/nokia-city-lens-released-from-beta-for-lumia-devices-1%20246841/, 9 pages. |
Greene, “Hyperlinking Reality via Phones,” https://www.technologyreview.com/2006/11/20/273250/hyperlinking-reality-via-phones/, 11 pages. |
Maubon, “A little bit of history from 2006: Nokia MARA project,” https://www.augmented-reality.fr/2009/03/un-petit-peu-dhistoire-de-2006-projet-mara-de-nokia/, 7 pages. |
“Nokia's MARA Connects The Physical World Via Mobile,” https://theponderingprimate.blogspot.com/2006/11/nokias-mara-connects-physical-world.html, 14 pages. |
Patro et al., “The anatomy of a large mobile massively multiplayer online game,” Proceedings of the first ACM international workshop on Mobile gaming, 2012, 6 pages. |
Schumann et al., “Mobile Gaming Communities: State of the Art Analysis and Business Implications,” Central European Conference on Information and Intelligent Systems, 2011, 8 pages. |
Organisciak, “Pico Safari: Active Gaming in Integrated Environments,” https://organisciak.wordpress.com/2016/07 /19/pico-safari-active-gaming-in-integrated-environments/, 21 pages. |
“Plundr,” https://web.archive.org/web/20110110032105/areacodeinc.com/projects/plundr/, 3 pages. |
Caoili et al., “Plundr: Dangerous Shores' location-based gaming weighs anchor on the Nintendo DS,” https://www.engadget.com/2007-06-03-plundr-dangerous-shores-location-based-gaming-weighs-anchor-on-the-nintendi-ds.html, 2 pages. |
Miller, “Plundr, first location-based DS game, debuts at Where 2.0,” https://www.engadget.com/2007-06-04-plundr-first-location-based-ds-game-debuts-at-where-2-0.html, 4 pages. |
Blösch et al., “Vision Based MAV Navigation in Unknown and Unstructured Environments,” 2010 IEEE International Conference on Robotics and Automation, 2010, 9 pages. |
Castle et al., “Video-rate Localization in Multiple Maps for Wearable Augmented Reality,” 2008 12th IEEE International Symposium on Wearable Computers, 2012, 8 pages. |
Klein et al. “Parallel Tracking and Mapping for Small AR Workspaces,” 2007 6th IEEE and ACM international symposium on mixed and augmented reality, 2007, 10 pages. |
Klein et al. “Parallel tracking and mapping on a camera phone,” 2009 8th IEEE International Symposium on Mixed and Augmented Reality, 2009, 4 pages. |
Van Den Hengel et al., “In Situ Image-based Modeling,” 2009 8th IEEE International Symposium on Mixed and Augmented Reality, 2009, 4 pages. |
Hughes, “Taking social games to the next level,” https://www.japantimes.co.jp/culture/2010/08/04/general/taking-social-games-to-the-next-level/, 1 page. |
Kincaid, “TC50 Star Tonchidot Releases Its Augmented Reality Sekai Camera Worldwide,” https://techcrunch.com/2009/12/21/sekai-camera/, 9 pages. |
Martin, “Sekai Camera's new reality,” https://www.japantimes.co.jp/life/2009/10/14/digital/sekai-cameras-new-reality/. 3 pages. |
Nakamura et al., “Control of Augmented Reality Information Volume by Glabellar Fader,” Proceedings of the 1st Augmented Human international Conference, 2010, 3 pages. |
“AnimexTSUTAYA×Sekai Camera,” https://japanesevw.blogspot.com/2010/08/animetsutayasekai-camera.html#links, 4 pages. |
“AR-RPG(ARPG) ”Sekai Hero“,” https://japanesevw.blogspot.com/2010/08/ar-rpgarpg-sekai-hero.html#links, 5 pages. |
Toto, “Augmented Reality App Sekai Camera Goes Multi-Platform. Adds API And Social Gaming,” https://techcrunch.com/2010/07/14/augmented-reality-app-sekai-camera-goes-multi-platform-adds-api-and-social-gaming/, 4 pages. |
Hämäläinen, “[Job] Location Based MMORPG server engineers—Grey Area & Shadow Cities,” https://erlang.org/pipermail/erlang-questions/2010-November/054788.html, 2 pages. |
“Shadow Cities,” https://web.archive.org/web/20101114162700/http://www.shadowcities.com/, 7 pages. |
Buchanan, “Star Wars: Falcon Gunner iPhone Review,” https://www.ign.com/articles/2010/11/18/star-wars-falcon-gunner-iphone-review, 13 pages. |
“THQ Wireless Launches Star Wars Arcade: Falcon Gunner,” https://web.archive.org/web/20101129010405/http:/starwars.com/games/videogames/swarcade_falcongunner/index.html, 5 pages. |
“Star Wars Arcade: Falcon Gunner,” https://www.macupdate.com/app/mac/35949/star-wars-arcade-falcon-gunner, downloaded on Feb. 9, 2021, 5 pages. |
Julier et al., “BARS: Battlefield Augmented Reality System,” Advanced Information Technology (Code 5580), Naval Research Laboratory, 2000, 7 pages. |
Baillot et al., “Authoring of Physical Models Using Mobile Computers,” Naval Research Laboratory, 2001, IEEE, 8 Pages. |
Cheok et al., “Human Pacman: a mobile, wide-area entertainment system based on physical, social, and ubiquitous computing,” Springer-Verlag London Limited 2004, 11 pages. |
Boger, “Are Existing Head-Mounted Displays ‘Good Enough’?,” Sensics, Inc., 2007, 11 pages. |
Boger, “The 2008 HMD Survey: Are We There Yet?” Sensics, Inc., 2008, 14 pages. |
Boger, “Cutting the Cord: the 2010 Survey on using Wireless Video with Head-Mounted Displays,” Sensics, Inc., 2008, 10 pages. |
Bateman, “The Essential Guide to 3D in Flash,” 2010, Friends of Ed—an Apress Company, 275 pages. |
Magerkurth, “Proceedings of PerGames—Second International Workshop on Gaming Applications in Pervasive Computing Environments,” www.pergames.de., 2005, 119 pages. |
Avery, “Outdoor Augmented Reality Gaming on Five Dollars a Day,” www.pergames.de., 2005, 10 pages. |
Ivanov, “Away3D 3.6 Cookbook,” 2011, Packt Publishing, 480 pages. |
Azuma, “The Challenge of Making Augmented Reality Work Outdoors,” In Mixed Reality: Merging Real and Virtual Worlds. Yuichi Ohta and Hideyuki Tamura (ed.), Springer-Verlag, 1999. Chp 21 pp. 379-390, 10 pages. |
Bell et al., “Interweaving Mobile Games With Everyday Life,” Proc. ACM CHI, 2006, 10 pages. |
Broll, “Meeting Technology Challenges of Pervasive Augmented Reality Games,” ACM, 2006, 13 pages. |
Brooks, “What's Real About Virtual Reality?,” IEEE, Nov./Dec. 1999, 12 pages. |
Julier et al., “The Need for AI: Intuitive User Interfaces for Mobile Augmented Reality Systems,” 2001, ITT Advanced Engineering Systems, 5 pages. |
Burdea et al., “Virtual Reality Technology: Second Edition,” 2003, John Wiley & Sons, Inc., 134 pages. |
Butterworth et al., “3DM: A Three Dimensional Modeler Using a Head-Mounted Display,” ACM, 1992, 5 pages. |
Lee et al., “CAMAR 2.0: Future Direction of Context-Aware Mobile Augmented Reality,” 2009, IEEE, 5 pages. |
Cheok et al., “Human Pacman: A Mobile Entertainment System with Ubiquitous Computing and Tangible Interaction over a Wide Outdoor Area,” 2003, Springer-Verlag, 16 pages. |
Hezel et al., “Head Mounted Displays For Virtual Reality,” Feb. 1993, MITRE, 5 pages. |
McQuaid, “Everquest Shadows of Luclin Game Manual,” 2001, Sony Computer Entertainment America, Inc., 15 pages. |
“Everquest Trilogy Manual,” 2001, Sony Computer Entertainment America, Inc., 65 pages. |
Kellner et al., “Geometric Calibration of Head-Mounted Displays and its Effects on Distance Estimation,” Apr. 2012, IEEE Transactions on Visualization and Computer Graphics, vol. 18, No. 4, IEEE Computer Society, 8 pages. |
Gutierrez et al., “Far-Play: a framework to develop Augmented/Alternate Reality Games,” Second IEEE Workshop on Pervasive Collaboration and Social Networking, 2011, 6 pages. |
Feiner et al., “A Touring Machine: Prototyping 3D Mobile Augmented Reality Systems for Exploring the Urban Environment,” In Proc. ISWC '97 (Int. Symp. on Wearable Computing), Cambridge, MA, Oct. 13-14, 1997, pp. 74-81, 8 pages. |
Fisher et al., “Virtual Environment Display System,” Oct. 23-24, 1986, ACM, 12 pages. |
Fuchs et al., “Virtual Reality: Concepts and Technologies,” 2011, CRC Press, 132 pages. |
Gabbard et al., “Usability Engineering: Domain Analysis Activities for Augmented Reality Systems,” 2002, The Engineering Reality of Virtual Reality, Proceedings SPIE vol. 4660, Stereoscopic Displays and Virtual Reality Systems IX, 13 pages. |
Gledhill et al., “Panoramic imaging—a review,” 2003, Elsevier Science Ltd., 11 pages. |
Gotow et al., “Addressing Challenges with Augmented Reality Applications on Smartphones,” Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, 2010, 14 pages. |
“GPS accuracy and Layar usability testing,” 2010, mediaLABamsterdam, 7 pages. |
Gradecki, “The Virtual Reality Construction Kit,” 1994, Wiley & Sons Inc, 100 pages. |
Heymann et al., “Representation, Coding and Interactive Rendering of High-Resolution Panoramic Images and Video Using MPEG-4,” 2005, 5 pages. |
Hollands, “The Virtual Reality Homebrewer's Handbook,” 1996, John Wiley & Sons, 213 pages. |
Hollerer et al., “User Interface Management Techniques for Collaborative Mobile Augmented Reality,” Computers and Graphics 25(5), Elsevier Science Ltd, Oct. 2001, pp. 799-810, 9 pages. |
Holloway et al., “Virtual Environments: A Survey of the Technology,” Sep. 1993, 59 pages. |
Strickland, “How Virtual Reality Gear Works,” Jun. 7, 2009, How Stuff Works, Inc., 3 pages. |
Hurst et al., “Mobile 3D Graphics and Virtual Reality Interaction,” 2011, ACM, 8 pages. |
“Human Pacman-Wired NextFest,” 2005, Wired. |
Cheok et al., “Human Pacman: A Sensing-based Mobile Entertainment System with Ubiquitous Computing and Tangible Interaction,” 2000, ACM, 12 pages. |
Basu et al., “Immersive Virtual Reality On-The-Go,” 2013, IEEE Virtual Reality, 2 pages. |
“Inside QuickTime—The QuickTime Technical Reference Library—QuickTime VR,” 2002, Apple Computer Inc., 272 pages. |
Iovine, “Step Into Virtual Reality,” 1995, Windcrest/McGraw-Hill, 106 pages. |
Jacobson et al., “Garage Virtual Reality,” 1994, Sams Publishing, 134 pages. |
International Search Report and Written Opinion issued in International Application No. PCT/US2012/032204 dated Oct. 29, 2012. |
Wauters, “Stanford Graduates Release Pulse, A Must-Have News App For The iPad,” Techcrunch.com, techcrunch.com/2010/05/31/pulse-ipad/. |
Hickins, “A License to Pry,” The Wall Street Journal, http://blogs.wsj.com/digits/2011/03/10/a-license-to-pry/tab/print/. |
Notice of Reasons for Rejection issued in Japanese Patent Application No. 2014-503962 dated Sep. 22, 2014. |
Notice of Reasons for Rejection issued in Japanese Patent Application No. 2014-503962 dated Jun. 30, 2015. |
European Search Report issued in European Patent Application No. 12767566.8 dated Mar. 20, 2015. |
“3D Laser Mapping Launches Mobile Indoor Mapping System,” 3D Laser Mapping, Dec. 3, 2012, 1 page. |
Banwell et al., “Combining Absolute Positioning and Vision for Wide Area Augmented Reality,” Proceedings of the International Conference on Computer Graphics Theory and Applications, 2010, 4 pages. |
Li et al., “3-D Motion Estimation and Online Temporal Calibration for Camera-IMU Systems,” Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2013, 8 pages. |
Li et al., “High-fidelity Sensor Modeling and Self-Calibration in Vision-aided Inertial Navigation,” Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2014, 8 pages. |
Li et al., “Online Temporal Calibration for Camera-IMU Systems: Theory and Algorithms,” International Journal of Robotics Research, vol. 33, Issue 7, 2014, 16 pages. |
Li et al., “Real-time Motion Tracking on a Cellphone using Inertial Sensing and a Rolling-Shutter Camera,” Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2013, 8 pages. |
Mourikis et al., “Methods for Motion Estimation With a Rolling-Shutter Camera,” Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Karlsruhe, Germany May 6-10, 2013, 10 pages. |
Panzarino, “What Exactly WiFiSlam Is, And Why Apple Acquired It,” http://thenextweb.com/apple/2013/03/26/what-exactly-wifislam-is-and-why-apple-acquired-it, Mar. 26, 2013, 10 pages. |
Vondrick et al., “HOGgles: Visualizing Object Detection Features,” IEEE International Conference on Computer Vision (ICCV), 2013, 9 pages. |
Vu et al., “High Accuracy and Visibility-Consistent Dense Multiview Stereo,” IEEE Transactions on Pattern Analysis and Machine Intelligence, 2012, vol. 34, No. 5, 13 pages. |
International Search Report and Written Opinion issued in International Application No. PCT/US2014/061283 dated Aug. 5, 2015, 11 pages. |
Pang et al., “Development of a Process-Based Model for Dynamic Interaction in Spatio-Temporal GIS”, GeoInformatica, 2002, vol. 6, No. 4, pp. 323-344. |
Zhu et al., “The Geometrical Properties of Irregular 2D Voronoi Tessellations,” Philosophical Magazine A, 2001, vol. 81, No. 12, pp. 2765-2783. |
Bimber et al., “A Brief Introduction to Augmented Reality, in Spatial Augmented Reality,” 2005, CRC Press, 23 pages. |
Milgram et al., “A Taxonomy of Mixed Reality Visual Displays,” IEICE Transactions on Information and Systems, 1994, vol. 77, No. 12, pp. 1321-1329. |
Normand et al., “A new typology of augmented reality applications,” Proceedings of the 3rd augmented human international conference, 2012, 9 pages. |
Sutherland, “A head-mounted three dimensional display,” Proceedings of the Dec. 9-11, 1968, Fall Joint Computer Conference, part I, 1968, pp. 757-764. |
Maubon, “A little bit of history from 2006: Nokia's MARA project,” https://www.augmented-reality.fr/2009/03/un-petit-peu-dhistoire-de-2006-projet-mara-de-nokia/, 7 pages. |
Madden, “Professional Augmented Reality Browsers for Smartphones,” 2011, John Wiley & Sons, 44 pages. |
Raper et al., “Applications of location-based services: a selected review,” Journal of Location Based Services, 2007, vol. 1, No. 2, pp. 89-111. |
Savage, “Blazing gyros: The evolution of strapdown inertial navigation technology for aircraft,” Journal of Guidance, Control, and Dynamics, 2013, vol. 36, No. 3, pp. 637-655. |
Kim et al., “A Step, Stride and Heading Determination for the Pedestrian Navigation System,” Journal of Global Positioning Systems, 2004, vol. 3, No. 1-2, pp. 273-279. |
“Apple Reinvents the Phone with iPhone,” Apple, dated Jan. 9, 2007, https://www.apple.com/newsroom/2007/01/09Apple-Reinvents-the-Phone-with-iPhone/, 5 pages. |
Macedonia et al., “Exploiting reality with multicast groups: a network architecture for large-scale virtual environments,” Proceedings Virtual Reality Annual International Symposium'95, 1995, pp. 2-10. |
Magerkurth et al., “Pervasive Games: Bringing Computer Entertainment Back to the Real World,” Computers in Entertainment (CIE), 2005, vol. 3, No. 3, 19 pages. |
Thomas et al., “ARQuake: An Outdoor/Indoor Augmented Reality First Person Application,” Digest of Papers. Fourth International Symposium on Wearable Computers, 2000, pp. 139-146. |
Thomas et al., “First Person Indoor/Outdoor Augmented Reality Application: ARQuake,” Personal and Ubiquitous Computing, 2002, vol. 6, No. 1, pp. 75-86. |
Zyda, “From Visual Simulation to Virtual Reality to Games,” IEEE Computer Society, 2005, vol. 38, No. 9, pp. 25-32. |
Zyda, “Creating a Science of Games,” Communications-ACM, 2007, vol. 50, No. 7, pp. 26-29. |
“Microsoft Computer Dictionary,” Microsoft, 2002, 10 pages. |
“San Francisco street map,” David Rumsey Historical Map Collection, 1953, https://www.davidrumsey.com/luna/servlet/s/or3ezx, 2 pages. |
“Official Transportation Map (2010),” Florida Department of Transportation, https://www.fdot.gov/docs/default-source/geospatial/past_statemap/maps/FLStatemap2010.pdf, 2010, 2 pages. |
Krogh, “GPS,” American Society of Media Photographers, dated Mar. 22, 2010, 10 pages. |
Ta et al., “SURFTrac: Efficient Tracking and Continuous Object Recognition using Local Feature Descriptors,” 2009 IEEE Conference on Computer Vision and Pattern Recognition, 2009, pp. 2937-2944. |
Office Action issued in Chinese Application No. 201710063195.8 dated Mar. 24, 2021, 9 pages. |
U.S. Appl. No. 10/438,172, filed May 13, 2003. |
U.S. Appl. No. 60/496,752, filed Aug. 21, 2003. |
U.S. Appl. No. 60/499,810, filed Sep. 2, 2003. |
U.S. Appl. No. 60/502,939, filed Sep. 16, 2003. |
U.S. Appl. No. 60/628,475, filed Nov. 16, 2004. |
U.S. Appl. No. 61/411,591, filed Nov. 9, 2010. |
Macedonia et al., “A Taxonomy for Networked Virtual Environments,” 1997, IEEE Multimedia, 20 pages. |
Macedonia, “A Network Software Architecture for Large Scale Virtual Environments,” 1995, 31 pages. |
Macedonia et al., “NPSNET: A Network Software Architecture for Large Scale Virtual Environments,” 1994, Proceedings of the 19th Army Science Conference, 24 pages. |
Macedonia et al., “NPSNET: A Multi-Player 3D Virtual Environment Over the Internet,” 1995, ACM, 3 pages. |
Macedonia et al., “Exploiting Reality with Multicast Groups: A Network Architecture for Large-scale Virtual Environments,” Proceedings of the 1995 IEEE Virtual Reality Annual Symposium, 14 pages. |
Paterson et al., “Design, Implementation and Evaluation of Audio for a Location Aware Augmented Reality Game,” 2010, ACM, 9 pages. |
Organisciak et al., “Pico Safari: Active Gaming in Integrated Environments,” Jul. 19, 2016, 22 pages. |
Raskar et al., “Spatially Augmented Reality,” 1998, 8 pages. |
Raskar et al., “The Office of the Future: A Unified Approach to Image-Based Modeling and Spatially Immersive Displays,” 1998, Computer Graphics Proceedings, Annual Conference Series, 10 pages. |
Grasset et al., “MARE: Multiuser Augmented Reality Environment on table setup,” 2002, 2 pages. |
Behringer et al., “A Wearable Augmented Reality Testbed for Navigation and Control, Built Solely with Commercial-Off-The-Shelf (COTS) Hardware,” International Symposium in Augmented Reality (ISAR 2000) in München (Munich), Oct. 5-6, 2000, 9 pages. |
Behringer et al., “Two Wearable Testbeds for Augmented Reality: itWARNS and WIMMIS,” International Symposium on Wearable Computing (ISWC 2000), Atlanta, Oct. 16-17, 2000, 3 pages. |
Hartley et al., “Multiple View Geometry in Computer Vision, Second Edition,” Cambridge University Press, 2004, 673 pages. |
Wetzel et al., “Guidelines for Designing Augmented Reality Games,” 2008, ACM, 9 pages. |
Kasahara et al., “Second Surface: Multi-user Spatial Collaboration System based on Augmented Reality,” 2012, Research Gate, 5 pages. |
Diverdi et al., “Envisor: Online Environment Map Construction for Mixed Reality,” 2008, 8 pages. |
Benford et al., “Understanding and Constructing Shared Spaces with Mixed-Reality Boundaries,” 1998, ACM Transactions on Computer-Human Interaction, vol. 5, No. 3, 40 pages. |
Mann, “Humanistic Computing: “WearComp” as a New Framework and Application for Intelligent Signal Processing,” 1998, IEEE, 29 pages. |
Feiner et al., “Knowledge-Based Augmented Reality,” 1993, Communications of the ACM, 68 pages. |
Bible et al., “Using Spread-Spectrum Ranging Techniques for Position Tracking in a Virtual Environment,” 1995, Proceedings of Network Realities, 16 pages. |
Starner et al., “Mind-Warping: Towards Creating a Compelling Collaborative Augmented Reality Game,” 2000, ACM, 4 pages. |
Höllerer et al., “Chapter Nine—Mobile Augmented Reality,” 2004, Taylor & Francis Books Ltd., 39 pages. |
Langlotz et al., “Online Creation of Panoramic Augmented-Reality Annotations on Mobile Phones,” 2012, IEEE, 9 pages. |
Kuroda et al., “Shared Augmented Reality for Remote Work Support,” 2000, IFAC Manufacturing, 5 pages. |
Ramirez et al., “Chapter 5—Soft Computing Applications in Robotic Vision Systems,” 2007, I-Tech Education and Publishing, 27 pages. |
Lepetit et al., “Handling Occlusion in Augmented Reality Systems: A Semi-Automatic Method,” 2000, IEEE, 11 pages. |
Zhu et al., “Personalized In-store E-Commerce with the PromoPad: an Augmented Reality Shopping Assistant,” 2004, Electronic Journal for E-commerce Tools, 20 pages. |
Broll et al., “Toward Next-Gen Mobile AR Games,” 2008, IEEE, 10 pages. |
Lee et al., “Exploiting Context-awareness in Augmented Reality Applications,” International Symposium on Ubiquitous Virtual Reality, 2008, 4 pages. |
Tian et al., “Real-Time Occlusion Handling in Augmented Reality Based on an Object Tracking Approach,” www.mdpi.com/journal/sensors, 2010, 16 pages. |
Szalavári et al., “Collaborative Gaming in Augmented Reality,” 1998, ACM, 20 pages. |
Sheng et al., “A Spatially Augmented Reality Sketching Interface for Architectural Daylighting Design,” 2011, IEEE, 13 pages. |
Szeliski et al., “Computer Vision: Algorithms and Applications,” 2010, Springer, 874 pages. |
Avery et al., “Improving Spatial Perception for Augmented Reality X-Ray Vision,” 2009, IEEE, 4 pages. |
Brutzman et al., “Internetwork Infrastructure Requirements for Virtual Environments,” 1995, Proceedings of the Virtual Reality Modeling Language (VRML) Symposium, 11 pages. |
Selman, “Java 3D Programming,” 2002, Manning Publications, 352 pages. |
Bradski et al., “Learning OpenCV,” 2008, O'Reilly Media, 572 pages. |
Schmeil et al., “MARA—Mobile Augmented Reality-Based Virtual Assistant,” 2007, IEEE Virtual Reality Conference 2007 , 5 pages. |
Macedonia et al., “NPSNET: A Network Software Architecture for Large Scale Virtual Environments,” 1994, Presence, Massachusetts Institute of Technology, 30 pages. |
Guan et al., “Spherical Image Processing for Immersive Visualisation and View Generation,” 2011, Thesis submitted to the University of Lancashire, 133 pages. |
Sowizral et al., “The Java 3D API Specification—Second Edition,” 2000, Sun Microsystems, Inc., 664 pages. |
Moore, “A Tangible Augmented Reality Interface to Tiled Street Maps and its Usability Testing,” 2006, Springer-Verlag, 18 pages. |
Ismail et al., “Multi-user Interaction in Collaborative Augmented Reality for Urban Simulation,” 2009, IEEE Computer Society, 10 pages. |
Organisciak et al., “Pico Safari—Active Gaming in Integrated Environments,” 2011, SDH-SEMI (available at https://www.slideshare.net/PeterOrganisciak/pico-safari-sdsemi-2011). |
Davison, “Chapter 7 Walking Around the Models,” Pro Java™ 6 3D Game Development Java 3D, 2007, 23 pages. |
“Archive for the ‘Layers’ Category,” May 29, 2019, LAYAR. |
Neider et al., “The Official Guide to Learning OpenGL, Version 1.1,” Addison Wesley Publishing Company, 616 pages. |
Singhal et al., “Networked Virtual Environments—Design and Implementation,” ACM Press, Addison Wesley, 1999, 368 pages. |
Neider et al., “OpenGL programming guide,” 1993, vol. 478, 438 pages. |
International Search Report and Written Opinion issued in International Application No. PCT/US2013/034164 dated Aug. 27, 2013, 11 pages. |
Office Action issued in Japanese Application No. 2014-558993 dated Sep. 24, 2015, 7 pages. |
Office Action issued in Japanese Application No. 2014-542591 dated Feb. 23, 2016, 8 pages. |
Office Action issued in Japanese Application No. 2014-542591 dated Jul. 7, 2015, 6 pages. |
Supplementary European Search Report issued in European Application No. 13854232.9 dated Jul. 24, 2015, 8 pages. |
Supplementary European Search Report issued in European Application No. 12852089.7 dated Mar. 13, 2015, 8 pages. |
Zhu et al., “Design of the Promo Pad: an Automated Augmented Reality Shopping Assistant,” 12th Americas Conference on Information Systems, Aug. 4-6, 2006, 16 pages. |
International Search Report and Written Opinion issued in International Application No. PCT/US2012/066300 dated Feb. 19, 2013, 9 pages. |
International Preliminary Report on Patentability issued in International Application No. PCT/US2012/066300 dated Feb. 19, 2014, 12 pages. |
Hardawar, “Naratte's Zoosh enables NFC with just a speaker and microphone,” Venture Beat News, https://venturebeat.com/2011/06/19/narattes-zoosh-enables-nfc-with-just-a-speaker-and-microphone/, 24 pages. |
Monahan, “Apple iPhone EasyPay Mobile Payment Rollout May Delay NFC,” Javelin Strategy & Research Blog, Nov. 15, 2011, 3 pages. |
“Augmented GeoTravel—Support,” https://web.archive.org/web/20110118072624/http://www.augmentedworks.com/en/augmented-geotravel/augmented-geotravel-support, 2 pages. |
“Augmented GeoTravel,” https://web.archive.org/web/20200924232145/https://en.wikipedia.org/wiki/Augmented_GeoTravel, 2 pages. |
“AugmentedWorks—iPhone Apps Travel Guide with AR: Augmented Geo Travel 3.0.0!,” https://web.archive.org/web/20110128180606/http://www.augmentedworks.com/, 3 pages. |
Honkamaa et al., “A Lightweight Approach for Augmented Reality on Camera Phones using 2D Images to Simulate 3D,” Proceedings of the 6th international conference on Mobile and ubiquitous multimedia, 2007, pp. 155-159. |
Höllerer et al., “Mobile Augmented Reality,” Telegeoinformatics: Location-based computing and services, vol. 21, 2004, 39 pages. |
Raskar et al., “The Office of the Future: A Unified Approach to Image-Based Modeling and Spatially Immersive Displays,” Proceedings of the 25th annual conference on Computer graphics and interactive techniques, 1998, 10 pages. |
Loomis et al., “Personal Guidance System for the Visually Impaired using GPS, GIS, and VR Technologies,” Proceedings of the first annual ACM conference on Assistive technologies, 1994, 5 pages. |
Feiner et al., “A Touring Machine: Prototyping 3D Mobile Augmented Reality Systems for Exploring the Urban Environment,” Personal Technologies, vol. 1, 1997, 8 pages. |
Screen captures from YouTube video clip entitled “LiveSight for Here Maps—Demo on Nokia Lumia 928,” 1 page, uploaded on May 21, 2013 by user “Mark Guim”. Retrieved from Internet: <https://www.youtube.com/watch?v=Wf59vblvGmA>. |
Screen captures from YouTube video clip entitled “Parallel Kingdom Cartography Sneak Peek,” 1 page, uploaded on Oct. 29, 2010 by user “PerBlueInc”. Retrieved from Internet: <https://www.youtube.com/watch?v=L0RdGh4aYis>. |
Screen captures from YouTube video clip entitled “Parallel Kingdom—Video 4—Starting Your Territory.mp4,” 1 page, uploaded on Aug. 24, 2010 by user “PerBlueInc”. Retrieved from Internet: <https://www.youtube.com/watch?app=desktop&v=5zPXKo6yFzM>. |
Screen captures from YouTube video clip entitled “Parallel Kingdom—Video 8—Basics of Trading.mp4,” 1 page, uploaded on Aug. 24, 2010 by user “PerBlueInc”. Retrieved from Internet: <https://www.youtube.com/watch?v=z6YCmMZvHbl>. |
Screen captures from YouTube video clip entitled “Parallel Tracking and Mapping for Small AR Workspaces (PTAM)—extra,” 1 page, uploaded on Nov. 28, 2007 by user “ActiveVision Oxford”. Retrieved from Internet: <https://www.youtube.com/watch?v=Y9HMn6bd-v8>. |
Screen captures from Vimeo video clip entitled “Tabletop Speed Trailer,” 1 page, uploaded on Jun. 5, 2013 by user “Dekko”. Retrieved from Internet: <https://vimeo.com/67737843>. |
Screen captures from YouTube video clip entitled “Delorme PN-40: Viewing maps and Imagery,” 1 page, uploaded on Jan. 21, 2011 by user “Take a Hike GPS”. Retrieved from Internet: <https://www.youtube.com/watch?v=cMoKKfGDw4s>. |
Screen captures from YouTube video clip entitled “Delorme Earthmate PN-40: Creating Waypoints,” 1 page, uploaded on Nov. 22, 2010 by user “Take a Hike GPS”. Retrieved from Internet: <https://www.youtube.com/watch?v=rGz-nFdAO9Y>. |
Screen captures from YouTube video clip entitled “Google Maps Navigation (Beta),” 1 page, uploaded on Oct. 27, 2009 by user “Google”. Retrieved from Internet: <https://www.youtube.com/watch?v=tGXK4jKN_jY>. |
Screen captures from YouTube video clip entitled “Google Maps for mobile Layers,” 1 page, uploaded on Oct. 5, 2009 by user “Google”. Retrieved from Internet: <https://www.youtube.com/watch?v=1W90u0Y1HGI>. |
Screen captures from YouTube video clip entitled “Introduction of Sekai Camera,” 1 page, uploaded on Nov. 7, 2010 by user “tonchidot”. Retrieved from Internet: <https://www.youtube.com/watch?v=oxnKOQkWwF8>. |
Screen captures from YouTube video clip entitled “Sekai Camera for iPad,” 1 page, uploaded on Aug. 17, 2010 by user “tonchidot”. Retrieved from Internet: <https://www.youtube.com/watch?v=YGwyhEK8mV8>. |
Screen captures from YouTube video clip entitled “TechCrunch 50 Presentation ‘SekaiCamera’ by TonchiDot,” 1 page, uploaded on Oct. 18, 2008 by user “tonchidot”. Retrieved from Internet: <https://www.youtube.com/watch?v=FKgJTJojVEw>. |
Screen captures from YouTube video clip entitled “Ville Vesterinen—Shadow Cities,” 1 page, uploaded on Feb. 4, 2011 by user “momoams”. Retrieved from Internet: <https://www.youtube.com/watch?v=QJ1BsgoKYew>. |
Screen captures from YouTube video clip entitled “‘Subway’: Star Wars Arcade: Falcon Gunner Trailer #1,” 1 page, uploaded on Nov. 3, 2010 by user “Im/nl Studios”. Retrieved from Internet: <https://www.youtube.com/watch?v=CFSMXk8Dw10>. |
Screen captures from YouTube video clip entitled “Star Wars Augmented Reality: TIE Fighters Attack NYC!,” 1 page, uploaded on Nov. 3, 2010 by user “Im/nl Studios”. Retrieved from Internet: <https://www.youtube.com/watch?v=LoodrUC05r0>. |
Screen captures from YouTube video clip entitled “Streetmuseum,” 1 page, uploaded on Dec. 1, 2010 by user “Jack Kerruish”. Retrieved from Internet: < https://www.youtube.com/watch?v=qSfATEZiUYo>. |
Screen captures from YouTube video clip entitled “UFO on Tape iPhone Gameplay Review—AppSpy.com,” 1 page, uploaded on Oct. 5, 2010 by user “Pocket Gamer”. Retrieved from Internet: <https://www.youtube.com/watch?v=Zv4J3ucwyJg>. |
U.S. Appl. No. 18/378,977, filed Oct. 11, 2023. |
U.S. Appl. No. 18/385,800, filed Oct. 31, 2023. |
Vince, “Introduction to Virtual Reality,” 2004, Springer-Verlag, 97 pages. |
Fuchs et al., “Virtual Reality: Concepts and Technologies,” 2006, CRC Press, 56 pages. |
Arieda, “A Virtual / Augmented Reality System with Kinaesthetic Feedback—Virtual Environment with Force Feedback System,” 2012, LAP Lambert Academic Publishing, 31 pages. |
Sperber et al., “Web-based mobile Augmented Reality: Developing with Layar (3D),” 2010, 7 pages. |
“WXHMD—A Wireless Head-Mounted Display with embedded Linux,” 2009, Pabr.org, 8 pages. |
“XMP Adding Intelligence to Media—XMP Specification Part 3—Storage in Files,” 2014, Adobe Systems Inc., 78 pages. |
Zhao, “A survey on virtual reality,” 2009, Springer, 54 pages. |
Zipf et al., “Using Focus Maps to Ease Map Reading—Developing Smart Applications for Mobile Devices,” 2002, 3 pages. |
Gammeter et al., “Server-side object recognition and client-side object tracking for mobile augmented reality,” 2010, IEEE, 8 pages. |
Martedi et al., “Foldable Augmented Maps,” 2012, IEEE, 11 pages. |
Martedi et al., “Foldable Augmented Maps,” 2010, IEEE, 8 pages. |
Morrison et al., “Like Bees Around the Hive: A Comparative Study of a Mobile Augmented Reality Map,” 2009, 10 pages. |
Takacs et al., “Outdoors Augmented Reality on Mobile Phone using Loxel-Based Visual Feature Organization,” 2008, ACM, 8 pages. |
Livingston et al., “An Augmented Reality System for Military Operations in Urban Terrain,” 2002, Proceedings of the Interservice/Industry Training, Simulation, & Education Conference , 8 pages. |
Sappa et al., “Chapter 3—Stereo Vision Camera Pose Estimation for On-Board Applications,” 2007, I-Tech Education and Publishing, 12 pages. |
Light et al., “Chutney and Relish: Designing to Augment the Experience of Shopping at a Farmers' Market,” 2010, ACM, 9 pages. |
Bell et al., “View Management for Virtual and Augmented Reality,” 2001, ACM, 11 pages. |
Cyganek et al., “An Introduction to 3D Computer Vision Techniques and Algorithms,” 2009, John Wiley & Sons, Ltd, 502 pages. |
Lu et al., “Foreground and Shadow Occlusion Handling for Outdoor Augmented Reality,” 2010, IEEE, 10 pages. |
Lecocq-Botte et al., “Chapter 25—Image Processing Techniques for Unsupervised Pattern Classification,” 2007, pp. 467-488. |
Lombardo, “Hyper-NPSNET: embedded multimedia in a 3D virtual world,” 1993, 83 pages. |
Doignon, “Chapter 20—An Introduction to Model-Based Pose Estimation and 3-D Tracking Techniques,” 2007, IEEE, I-Tech Education and Publishing, 26 pages. |
Forsyth et al., “Computer Vision A Modern Approach, Second Edition,” 2012, Pearson Education, Inc., Prentice Hall, 793 pages. |
Breen et al., “Interactive Occlusion and Collision of Real and Virtual Objects in Augmented Reality,” 1995, ECRC, 22 pages. |
Breen et al., “Interactive Occlusion and Automatic Object Placement for Augmented Reality,” 1996, Eurographics (vol. 15, No. 3), 12 pages. |
Pratt et al., “Insertion of an Articulated Human into a Networked Virtual Environment,” 1994, Proceedings of the 1994 AI, Simulation and Planning in High Autonomy Systems Conference, 12 pages. |
Schmalstieg et al., “Bridging Multiple User Interface Dimensions with Augmented Reality,” 2000, IEEE, 10 pages. |
Brutzman et al., “Virtual Reality Transfer Protocol (VRTP) Design Rationale,” 1997, Proceedings of the IEEE Sixth International Workshop on Enabling Technologies, 10 pages. |
Brutzman et al., “Internetwork Infrastructure Requirements for Virtual Environments,” 1997, National Academy Press, 12 pages. |
Han, “Chapter 1—Real-Time Object Segmentation of the Disparity Map Using Projection-Based Region Merging,” 2007, I-Tech Education and Publishing, 20 pages. |
George et al., “A Computer-Driven Astronomical Telescope Guidance and Control System with Superimposed Star Field and Celestial Coordinate Graphics Display,” 1989, J. Roy. Astron. Soc. Can., The Royal Astronomical Society of Canada, 10 pages. |
Marder-Eppstein et al., “The Office Marathon: Robust Navigation in an Indoor Office Environment,” 2010, IEEE International Conference on Robotics and Automation, 8 pages. |
Barba et al., “Lessons from a Class on Handheld Augmented Reality Game Design,” 2009, ACM, 9 pages. |
Reitmayr et al., “Collaborative Augmented Reality for Outdoor Navigation and Information Browsing,” 2004, 12 pages. |
Reitmayr et al., “Going out: Robust Model-based Tracking for Outdoor Augmented Reality,” 2006, IEEE, 11 pages. |
Regenbrecht et al., “Interaction in a Collaborative Augmented Reality Environment,” 2002, CHI, 2 pages. |
Herbst et al., “TimeWarp: Interactive Time Travel with a Mobile Mixed Reality Game,” 2008, ACM, 11 pages. |
Loomis et al., “Personal Guidance System for the Visually Impaired using GPS, GIS, and VR Technologies,” 1993, VR Conference Proceedings , 8 pages. |
Rekimoto, “Transvision: A hand-held augmented reality system for collaborative design,” 1996, Research Gate, 7 pages. |
Morse et al., “Multicast Grouping for Data Distribution Management,” 2000, Proceedings of the Computer Simulation Methods and Applications Conference, 7 pages. |
Morse et al., “Online Multicast Grouping for Dynamic Data Distribution Management,” 2000, Proceedings of the Fall 2000 Simulation Interoperability Workshop, 11 pages. |
Cheverst et al., “Developing a Context-aware Electronic Tourist Guide: Some Issues and Experiences,” 2000, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI 2000), 9 pages. |
Squire et al., “Mad City Mystery: Developing Scientific Argumentation Skills with a Place-based Augmented Reality Game on Handheld Computers,” 2007, Springer, 25 pages. |
Romero et al., “Chapter 10—A Tutorial on Parametric Image Registration,” 2007, I-Tech Education and Publishing, 18 pages. |
Rosenberg, “The Use of Virtual Fixtures as Perceptual Overlays to Enhance Operator Performance in Remote Environments,” 1992, Air Force Material Command, 53 pages. |
Livingston et al., “Mobile Augmented Reality: Applications and Human Factors Evaluations,” 2006, Naval Research Laboratory, 32 pages. |
Gabbard et al., “Resolving Multiple Occluded Layers in Augmented Reality,” 2003, IEEE, 11 pages. |
Wloka et al., “Resolving Occlusion in Augmented Reality,” 1995, ACM, 7 pages. |
Zyda, “VRAIS Panel on Networked Virtual Environments,” Proceedings of the 1995 IEEE Virtual Reality Annual Symposium, 2 pages. |
Zyda et al., “NPSNET-HUMAN: Inserting the Human into the Networked Synthetic Environment,” 1995, Proceedings of the 13th DIS Workshop, 5 pages. |
Buchanan, “1,000: Find 'Em All Preview,” https://www.ign.com/articles/2009/10/16/1000-find-em-all-preview, 8 pages. |
Tschida, “You Can Now Find 1000: Find Em All! In The App Store,” https://appadvice.com/appnn/2010/02/you-can-now-find-1000-find-em-all-in-the-app-store, 3 pages. |
Piekarski et al., “ARQuake: the outdoor augmented reality gaming system,” Communications of the ACM, 2002, vol. 45, No. 1, pp. 36-38. |
Piekarski et al., “ARQuake—Modifications and Hardware for Outdoor Augmented Reality Gaming,” Linux Australia, 2003, 9 pages. |
Thomas et al., “ARQuake: An Outdoor/Indoor Augmented Reality First Person Application,” University of South Australia, 2000, 8 pages. |
Thomas et al., “First Person Indoor/Outdoor Augmented Reality Application: ARQuake,” Personal and Ubiquitous Computing, 2002, vol. 6, pp. 75-86. |
Thomas et al., “Usability and Playability Issues for ARQuake,” 2003, 8 pages. |
Livingston et al., “An augmented reality system for military operations in urban terrain,” Interservice/Industry Training, Simulation, and Education Conference, 2002, vol. 89, 9 pages. |
Livingston et al., “Mobile Augmented Reality: Applications and Human Factors Evaluations,” 2006, 16 pages. |
Cutler, “Dekko Debuts An Augmented Reality Racing Game Playable From The iPad,” Techcrunch, https://techcrunch.com/2013/06/09/dekko-2/, 8 pages. |
“Dekko's TableTop Speed AR Proof of Concept,” www.gametrender.net/2013/06/dekkos-tabletop-speed-ar-proof-of.html, 2 pages. |
“Racing AR Together,” https://augmented.org/2013/06/racing-ar-together/, 3 pages. |
“DeLorme PN-40,” www.gpsreview.net/delorme-pn-40/, downloaded on Mar. 8, 2021, 37 pages. |
Owings, “DeLorme Earthmate PN-40 review,” https://gpstracklog.com/2009/02/delorme-earthmate-pn-40-review.html, 17 pages. |
Butler, “How does Google Earth work?,” https://www.nature.com/news/2006/060213/full/060213-7.html, 2 pages. |
Castello, “How's the weather?,” https://maps.googleblog.com/2007/11/hows-weather.html, 3 pages. |
Friedman, “Google Earth for iPhone and iPad,” https://www.macworld.com/article/1137794/googleearth_iphone.html, downloaded on Sep. 7, 2010, 3 pages. |
“Google Earth,” http://web.archive.org/web/20091213164811/http://earth.google.com/, 1 page. |
Mellen, “Google Earth 2.0 for iPhone released,” https://www.gearthblog.com/blog/archives/2009/11/google_earth_20_for_iphone_released.html, downloaded on Mar. 5, 2021, 5 pages. |
“Google Earth iPhone,” http://web.archive.org/web/20091025070614/http://www.google.com/mobile/products/earth.html, 1 page. |
Senoner, “Google Earth and Microsoft Virtual Earth two Geographic Information Systems,” 2007, 44 pages. |
Barth, “Official Google Blog: The bright side of sitting in traffic: Crowdsourcing road congestion data,” https://googleblog.blogspot.com/2009/08/bright-side-of-sitting-in-traffic.html, 4 pages. |
Soni, “Introducing Google Buzz for mobile: See buzz around you and tag posts with your location.,” googlemobile.blogspot.com/2010/02/introducing-google-buzz-for-mobile-see.html, 15 pages. |
Chu, “New magical blue circle on your map,” https://googlemobile.blogspot.com/2007/11/new-magical-blue-circle-on-your-map.html, 20 pages. |
“Google Maps for your phone,” https://web.archive.org/web/20090315195718/http://www.google.com/mobile/default/maps.html, 2 pages. |
“Get Google on your phone,” http://web.archive.org/web/20091109190817/http://google.com/mobile/#p=default, 1 page. |
Gundotra, “To 100 million and beyond with Google Maps for mobile,” https://maps.googleblog.com/2010/08/to-100-million-and-beyond-with-google.html, 6 pages. |
“Introducing Google Buzz for mobile: See buzz around you and tag posts with your location,” https://maps.googleblog.com/2010/02/introducing-google-buzz-for-mobile-see.html, 16 pages. |
“Google Maps Navigation (Beta),” http://web.archive.org/web/20091101030954/http://www.google.com:80/mobile/navigation/index.html#p=default, 3 pages. |
Miller, “Googlepedia: the ultimate Google resource,” 2008, Third Edition, 120 pages. |
“Upgrade your phone with free Google products,” http://web.archive.org/web/20090315205659/http://www.google.com/mobile/, 1 page. |
“Google blogging in 2010,” https://googleblog.blogspot.com/2010/, 50 pages. |
Cheok et al., “Human Pacman: A Mobile Entertainment System with Ubiquitous Computing and Tangible Interaction over a Wide Outdoor Area,” 2003, Human-Computer Interaction with Mobile Devices and Services: 5th International Symposium, Mobile HCI 2003, Udine, Italy, Sep. 2003. Proceedings 5, 17 pages. |
Biggs, “Going The Distance: Nike+ GPS Vs. RunKeeper,” https://techcrunch.com/2010/10/09/going-the-distance-nike-gps-vs-runkeeper/, 4 pages. |
Vallino, “Interactive Augmented Reality,” 1998, University of Rochester, 109 pages. |
Jay et al., “Amplifying Head Movements with Head-Mounted Displays,” 2003, Presence, Massachusetts Institute of Technology, 10 pages. |
Julier et al., “Information Filtering for Mobile Augmented Reality,” 2000, IEEE and ACM International Symposium on Augmented Reality , 10 pages. |
Julier et al., “Information Filtering for Mobile Augmented Reality,” Jul. 2, 2002, IEEE, 6 pages. |
Kalawsky, “The Science of Virtual Reality and Virtual Environments,” 1993, Addison-Wesley, 215 pages. |
Kerr et al., “Wearable Mobile Augmented Reality: Evaluating Outdoor User Experience,” 2011, ACM, 8 pages. |
Kopper et al., “Towards an Understanding of the Effects of Amplified Head Rotations,” 2011, IEEE, 6 pages. |
MacIntyre et al., “Estimating and Adapting to Registration Errors in Augmented Reality Systems,” 2002, Proceedings IEEE Virtual Reality 2002, 9 pages. |
Feißt, “3D Virtual Reality on mobile devices,” 2009, VDM Verlag Dr. Muller Aktiengesellschaft & Co. KG, 53 pages. |
Chua et al., “MasterMotion: Full Body Wireless Virtual Reality for Tai Chi,” Jul. 2002, ACM SIGGRAPH 2002 conference abstracts and applications, 1 page. |
Melzer et al., “Head-Mounted Displays: Designing for the User,” 2011, 85 pages. |
Vorländer, “Auralization—Fundamentals of Acoustics, Modelling, Simulation, Algorithms and Acoustic Virtual Reality,” 2008, Springer, 34 pages. |
Miller et al., “The Virtual Museum: Interactive 3D Navigation of a Multimedia Database,” Jul./Sep. 1992, John Wiley & Sons, Ltd., 19 pages. |
Koenen, “MPEG-4 Multimedia for our time,” 1999, IEEE Spectrum, 8 pages. |
Navab et al., “Laparoscopic Virtual Mirror,” 2007, IEEE Virtual Reality Conference, 8 pages. |
Ochi et al., “HMD Viewing Spherical Video Streaming System,” 2014, ACM, 2 pages. |
Olson et al., “A Design for a Smartphone-Based Head Mounted Display,” 2011, 2 pages. |
Pausch, “Virtual Reality on Five Dollars a Day,” 1991, ACM, 6 pages. |
Peternier et al., “Wearable Mixed Reality System In Less Than 1 Pound,” 2006, The Eurographics Assoc., 10 pages. |
Piekarski et al., “ARQuake—Modifications and Hardware for Outdoor Augmented Reality Gaming,” 9 pages. |
Piekarski, “Interactive 3d modelling in outdoor augmented reality worlds,” 2004, The University of South Australia, 264 pages. |
Piekarski et al., “Tinmith-Metro: New Outdoor Techniques for Creating City Models with an Augmented Reality Wearable Computer,” 2001, IEEE, 8 pages. |
Piekarski et al., “The Tinmith System—Demonstrating New Techniques for Mobile Augmented Reality Modelling,” 2002, 10 pages. |
Piekarski et al., “ARQuake: The Outdoor Augmented Reality Gaming System,” 2002, Communications of the ACM, 3 pages. |
Piekarski et al., “Integrating Virtual and Augmented Realities in an Outdoor Application,” 1999, 10 pages. |
Pimentel et al., “Virtual Reality—Through the new looking glass,” 1993, Windcrest/McGraw-Hill, 45 pages. |
Basu et al., “Poster: Evolution and Usability of Ubiquitous Immersive 3D Interfaces,” 2013, IEEE, 2 pages. |
Pouwelse et al., “A Feasible Low-Power Augmented—Reality Terminal,” 1999, 10 pages. |
Madden, “Professional Augmented Reality Browsers for Smartphones,” 2011, John Wiley & Sons, 345 pages. |
“Protecting Mobile Privacy: Your Smartphones, Tablets, Cell Phones and Your Privacy—Hearing,” May 10, 2011, U.S. Government Printing Office, 508 pages. |
Rashid et al., “Extending Cyberspace: Location Based Games Using Cellular Phones,” 2006, ACM, 18 pages. |
Reid et al., “Design for coincidence: Incorporating real world artifacts in location based games,” 2008, ACM, 8 pages. |
Rockwell et al., “Campus Mysteries: Serious Walking Around,” 2013, Journal of the Canadian Studies Association, vol. 7(12): 1-18, 18 pages. |
Shapiro, “Comparing User Experience in a Panoramic HMD vs. Projection Wall Virtual Reality System,” 2006, Sensics, 12 pages. |
Sestito et al., “Intelligent Filtering for Augmented Reality,” 2000, 8 pages. |
Sherman et al., “Understanding Virtual Reality,” 2003, Elsevier, 89 pages. |
Simcock et al., “Developing a Location Based Tourist Guide Application,” 2003, Australian Computer Society, Inc., 7 pages. |
“Sony—Head Mounted Display Reference Guide,” 2011, Sony Corporation, 32 pages. |
Hollister, “Sony HMZ-T1 Personal 3D Viewer Review,” The Verge, Nov. 10, 2011, 25 pages. |
Gutiérrez et al., “Stepping into Virtual Reality,” 2008, Springer-Verlag, 33 pages. |
Sutherland, “A head-mounted three dimensional display,” 1968, Fall Joint Computer Conference, 8 pages. |
Sutherland, “The Ultimate Display,” 1965, Proceedings of IFIP Congress, 2 pages. |
Thomas et al., “ARQuake: An Outdoor/Indoor Augmented Reality First Person Application,” 2000, IEEE, 8 pages. |
Thomas et al., “First Person Indoor/Outdoor Augmented Reality Application: ARQuake,” 2002, Springer-Verlag, 12 pages. |
Wagner et al., “Towards Massively Multi-user Augmented Reality on Handheld Devices,” May 2005, Lecture Notes in Computer Science, 13 pages. |
Shin et al., “Unified Context-aware Augmented Reality Application Framework for User-Driven Tour Guides,” 2010, IEEE, 5 pages. |
Julier et al., “Chapter 6—Urban Terrain Modeling For Augmented Reality Applications,” 2001, Springer, 20 pages. |
“Adobe Flash Video File Format Specification Version 10.1,” 2010, Adobe Systems Inc., 89 pages. |
Number | Date | Country | |
---|---|---|
20240062236 A1 | Feb 2024 | US |
Number | Date | Country | |
---|---|---|
61562385 | Nov 2011 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 16841586 | Apr 2020 | US |
Child | 17214644 | | US |
Parent | 16173882 | Oct 2018 | US |
Child | 16422901 | | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 18126916 | Mar 2023 | US |
Child | 18385889 | | US |
Parent | 17982463 | Nov 2022 | US |
Child | 18126916 | | US |
Parent | 17214644 | Mar 2021 | US |
Child | 17982463 | | US |
Parent | 16422901 | May 2019 | US |
Child | 16841586 | | US |
Parent | 15947152 | Apr 2018 | US |
Child | 16173882 | | US |
Parent | 15719422 | Sep 2017 | US |
Child | 15947152 | | US |
Parent | 14359913 | | US |
Child | 15719422 | | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | PCT/US2012/066300 | Nov 2012 | WO |
Child | 14359913 | | US |