Augmented reality gifting on a mobile device

Information

  • Patent Grant
  • Patent Number
    12,020,309
  • Date Filed
    Friday, May 17, 2019
  • Date Issued
    Tuesday, June 25, 2024
Abstract
A system for receiving environment information, gift information, and location information and creating an augmented gifting environment is presented herein. An application may receive an image of a gifting environment and a gift to be placed in the gifting environment. A gift recipient may scan the physical environment represented by the gifting environment, and the application may present a virtual gift representing the gift in the scanned environment, producing an augmented reality view of the gift.
Description
BACKGROUND
1. Field

Embodiments of the invention relate to presenting and receiving gifts via a mobile device. More specifically, embodiments of the invention relate to presenting and receiving gifts via a mobile device in an augmented reality environment in which a virtual gift is presented.


2. Related Art

Online gift giving is becoming more and more popular. Typically, gifts are purchased online and either delivered to a recipient via a parcel delivery service or sent as some form of digital message. The recipient may use the gift or, in the event of a stored-value card, redeem the gift at an online retailer's website or at a retail store. The problem with such a conventional online gifting system is the lack of personalization. Further, these methods do not utilize technology to enhance the delivery and receipt of gifts in new and engaging ways.


The Internet is being accessed more frequently by mobile devices. As this trend continues, online gift giving also is becoming a mobile-to-mobile endeavor. What the prior art lacks are new and innovative ways for a gift giver to customize the gift and the delivery of the gift to the recipient utilizing mobile devices and features of mobile devices. Further, what is needed is a method for purchasing and sending a gift to a recipient that allows the gift giver and the recipient to view the gift in a particular environment before purchase. This would allow both the gift giver and the recipient to see how the gift fits with the environment and/or the recipient.


SUMMARY

Embodiments of the invention solve the above-mentioned problems by providing a system and method that supplies a gift giver with an application for finding and purchasing a gift. A virtual gift representing the gift may then be placed in an augmented reality scene, or gifting environment, that may be an image of a location scanned on a mobile device of the user. The gifting environment may present objects suitable for presenting the gift, and the gift giver may select the object or location for placement of the virtual gift. A gift recipient may receive notification via the application that the gift has been purchased or the virtual gift has been placed. The gift recipient may then search for the virtual gift, receiving clues such as, for example, GPS coordinates or proximity notifications. The gift recipient may scan the gifting environment using the camera on a recipient mobile device to discover the virtual gift. The gift recipient may then choose to accept the gift by interacting with the recipient mobile device.


A first embodiment is directed to one or more non-transitory computer-readable media storing computer-executable instructions that, when executed by a processor, perform a method of presenting a virtual gift to a recipient using augmented reality, the method comprising the steps of receiving a selection of a gift from a gift giver, receiving at least one image of a gifting environment via a giver mobile device, receiving a scan of the gifting environment from a recipient mobile device, and presenting the virtual gift in a real-time scan of the gifting environment on the recipient mobile device to create the augmented reality of the gifting environment.


A second embodiment is directed to a method for selecting and presenting a virtual gift to a recipient using augmented reality, the method comprising the steps of receiving a selection of a gift from a gift giver, receiving at least one image of a gifting environment via a giver mobile device, discerning a plurality of objects from the gifting environment, determining at least one object from the plurality of objects for placement of the virtual gift, and presenting the gifting environment and the virtual gift to the gift recipient.


A third embodiment is directed to a method for presenting a gift to a recipient using augmented reality, the method comprising the steps of receiving a notification indicative of a gift for the recipient, receiving an indication of a location of at least one of a gifting environment or a virtual gift, scanning the gifting environment with a camera associated with a recipient mobile device, selecting, using the recipient mobile device, the virtual gift from the gifting environment, and viewing the gift via the recipient mobile device.


This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Other aspects and advantages will be apparent from the following detailed description of the embodiments and the accompanying drawing figures.





BRIEF DESCRIPTION OF THE DRAWING FIGURES

Embodiments of this disclosure are described in detail below with reference to the attached drawing figures, wherein:



FIG. 1 depicts an embodiment of a hardware system for implementing embodiments of the invention;



FIG. 2A depicts an embodiment of an application on a giver mobile device scanning a gifting environment;



FIG. 2B depicts an embodiment of the application on the giver mobile device placing a virtual gift in the gifting environment;



FIG. 3A depicts an embodiment of the application displaying a map;



FIG. 3B depicts an embodiment of the application displaying the gifting environment presenting the virtual gift;



FIG. 3C depicts an embodiment of the application presenting the gift; and



FIG. 4 depicts an exemplary flow diagram of embodiments of the invention for selecting and placing a gift.





The drawing figures do not limit the invention to the specific embodiments disclosed and described herein. The drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the invention.


DETAILED DESCRIPTION

Embodiments of the invention solve the above-described problems and provide a distinct advance in the art by providing a method and system for presenting in augmented reality a virtual gift in a gifting environment on a mobile device. The virtual gift may be presented to a recipient in the gifting environment such as, for example, in an image of the user's living room on a coffee table. The user may perform an action to open the virtual gift revealing the contents therein.


The following description of embodiments of the invention references the accompanying illustrations that illustrate specific embodiments in which the invention can be practiced. The embodiments are intended to describe aspects of the invention in sufficient detail to enable those skilled in the art to practice the invention. Other embodiments can be utilized, and changes can be made, without departing from the scope of the invention. The following detailed description is, therefore, not to be taken in a limiting sense.


In this description, references to “one embodiment”, “an embodiment”, “embodiments”, “various embodiments”, “certain embodiments”, “some embodiments”, or “other embodiments” mean that the feature or features being referred to are included in at least one embodiment of the technology. Separate references to “one embodiment”, “an embodiment”, “embodiments”, “various embodiments”, “certain embodiments”, “some embodiments”, or “other embodiments” in this description do not necessarily refer to the same embodiment and are also not mutually exclusive unless so stated and/or except as will be readily apparent to those skilled in the art from the description. For example, a feature, structure, act, etc. described in one embodiment may also be included in other embodiments but is not necessarily included. Thus, the current technology can include a variety of combinations and/or integrations of the embodiments described herein.


Broadly speaking, embodiments of the invention provide for a recipient of an online gift to experience the fun and excitement of receiving a gift. For example, the recipient may have to hunt for the gift using an augmented reality application provided by their mobile device. A virtual gift representing the gift may be placed in the augmented reality in an imaged gifting environment. Once the user discovers the gifting environment, the user may use a camera of a mobile device to scan the gifting environment revealing the virtual gift on the mobile device display. The user may then interact with the mobile device in some way using one or more input devices or sensors of the mobile device to virtually unwrap the virtual gift revealing the gift.


Turning first to FIG. 1, an exemplary hardware platform that can form one element of certain embodiments of the invention is depicted. Computer 102 can be a desktop computer, a laptop computer, a server computer, a mobile device such as a smartphone or tablet, or any other form factor of general- or special-purpose computing device. Depicted with computer 102 are several components, shown for illustrative purposes. In some embodiments, certain components may be arranged differently or absent. Additional components may also be present. Included in computer 102 is system bus 104, whereby other components of computer 102 can communicate with each other. In certain embodiments, there may be multiple busses or components may communicate with each other directly. Connected to system bus 104 is central processing unit (CPU) 106. Also attached to system bus 104 are one or more random-access memory (RAM) modules 108. Also attached to system bus 104 is graphics card 110. In some embodiments, graphics card 110 may not be a physically separate card, but rather may be integrated into the motherboard or the CPU 106. In some embodiments, graphics card 110 has a separate graphics-processing unit (GPU) 112, which can be used for graphics processing or for general-purpose computing (GPGPU). Also on graphics card 110 is GPU memory 114. Connected (directly or indirectly) to graphics card 110 is display 116 for user interaction. In some embodiments, no display is present, while in others it is integrated into computer 102. Similarly, peripherals such as keyboard 118 and mouse 120 are connected to system bus 104. Like display 116, these peripherals may be integrated into computer 102 or absent. Also connected to system bus 104 is local storage 122, which may be any form of computer-readable media, and may be internally installed in computer 102 or externally and removably attached.


Computer-readable media include both volatile and nonvolatile media, removable and nonremovable media, and contemplate media readable by a database. For example, computer-readable media include (but are not limited to) RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile discs (DVD), holographic media or other optical disc storage, magnetic cassettes, magnetic tape, magnetic disk storage, and other magnetic storage devices. These technologies can store data temporarily or permanently. However, unless explicitly specified otherwise, the term “computer-readable media” should not be construed to include physical, but transitory, forms of signal transmission such as radio broadcasts, electrical signals through a wire, or light pulses through a fiber-optic cable. Examples of stored information include computer-useable instructions, data structures, program modules, and other data representations.


Finally, network interface card (NIC) 124 is also attached to system bus 104 and allows computer 102 to communicate over a network such as network 126. NIC 124 can be any form of network interface known in the art, such as Ethernet, ATM, fiber, Bluetooth, or Wi-Fi (i.e., the IEEE 802.11 family of standards). NIC 124 connects computer 102 to local network 126, which may also include one or more other computers, such as computer 128, and network storage, such as data store 130. Generally, a data store such as data store 130 may be any repository in which information can be stored and from which it can be retrieved as needed. Examples of data stores include relational or object-oriented databases, spreadsheets, file systems, flat files, directory services such as LDAP and Active Directory, or email storage systems. A data store may be accessible via a complex API (such as, for example, Structured Query Language), a simple API providing only read, write, and seek operations, or any level of complexity in between. Some data stores may additionally provide management functions for data sets stored therein, such as backup or versioning. Data stores can be local to a single computer such as computer 128, accessible on a local network such as local network 126, or remotely accessible over Internet 132. Local network 126 is in turn connected to Internet 132, which connects many networks such as local network 126, remote network 134, or directly attached computers such as computer 136. In some embodiments, computer 102 can itself be directly connected to Internet 132.


In some embodiments, an application may run on the computer 102 and the computer 128, which, in some embodiments, may be mobile devices, or the application may be accessed via mobile devices and run in a web-based environment from a web browser of the user 138. The web-based environment may store data so that the mobile device or computer is not required to download and store large amounts of data for the application. The application may access data such as object databases, user profiles, information related to other users, financial information, third-party financial institutions, third-party vendors, social media, or any other online service or website that is available over the Internet.


In some embodiments of the invention, the application may access or store a profile of the user 138. In some embodiments, the user 138 may be a gift giver or a recipient of the gift. The user 138 may be any person or persons that access the application through any application accessible device. The application may be downloaded on the mobile device which, in some embodiments, is computer 102 or accessed via the Internet as in a cloud-based application. The user 138, in the case of a new user, may be prompted to set up a profile for use with the application. The user 138 may input such exemplary items as age, race, nationality, favorite stores, fashion trends, designers, and any information associated with a gift giver or a recipient such that the application may customize offers and provide a unique experience for the user 138. For example, the gift giver may set up an account including the user profile and submit user preferences to the application based on a recipient's likes and dislikes such that the application may provide gift suggestions based on the likes and dislikes of the recipient. Further, the gift giver or the recipient may provide access to social media sites such that the application may access the information on the social media sites of the gift giver and the recipient to determine gift suggestions. In some embodiments, the application may access any of a plurality of peripheral devices such as a camera, microphone, GPS, accelerometer, gyroscope, or any other peripheral device that may be useful in embodiments as described below.
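To make the profile-driven suggestion concrete, the following Python sketch shows one way that likes and dislikes stored in a user profile could rank candidate gifts; the UserProfile class, its fields, and suggest_gifts are illustrative names assumed for this sketch and do not appear in the disclosure.

    from dataclasses import dataclass, field

    @dataclass
    class UserProfile:
        # Illustrative fields mirroring the examples above (favorite stores, likes, dislikes).
        name: str
        favorite_stores: list = field(default_factory=list)
        likes: set = field(default_factory=set)
        dislikes: set = field(default_factory=set)

    def suggest_gifts(recipient, catalog):
        """Rank catalog items by overlap with the recipient's likes; penalize dislikes."""
        def score(item):
            tags = set(item.get("tags", []))
            return len(tags & recipient.likes) - 2 * len(tags & recipient.dislikes)
        return sorted((i for i in catalog if score(i) > 0), key=score, reverse=True)

    # Example usage: only the novel matches the recipient's stated likes.
    recipient = UserProfile(name="Alex", likes={"books", "coffee"}, dislikes={"sports"})
    catalog = [{"name": "Mystery novel", "tags": ["books"]},
               {"name": "Football", "tags": ["sports"]}]
    print(suggest_gifts(recipient, catalog))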


The profile of the user 138 may also store historical information based on the actions or interactions with the application by the user 138. Financial transactions, viewed items, online searches, or any information gathered from any of the online databases mentioned above may be used to assist in creating a unique experience for the user 138. Further, the time spent viewing items may be used to determine a higher level of interest in particular items. Any items may be cross-referenced for similarities to determine other items that may be offered as gifts. The profile of the user 138 may be updated continuously such that the offers are based on the newest information from the user 138 and associated context received from the mobile device.


In some embodiments, the user 138 may be the gift giver and may select a gift from the suggested gifts on the application. The gift giver may select the gift directly through the application in communication with a third-party site or may, upon selection of the gift, be directed to the third-party site via a link. Once the gift is selected, the gift may be purchased and an image of the gift may be stored as a virtual gift for placement in the gifting environment as described in embodiments below.


In some embodiments, the gift may be purchased directly through the application or through a third-party website. The profile of the gift giver may be associated with an account for direct financial transactions, such as a savings, checking, or credit card account, or indirect financial transactions may be conducted through an intermediate third-party financial institution. In some embodiments, the profile of the gift giver may store biometric data for the gift giver. This can be used, for example, to confirm that the gift giver is the one accessing the application. This may be determined with a password, Personal Identification Number (PIN), or any biometric identifier such as, for example, fingerprint recognition, facial recognition, voice recognition, a retinal scan, or any other biometric. This provides a high level of security for the financial transactions conducted by the user 138 through the application.
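As a minimal sketch of gating a purchase behind such an identity check, the Python below hashes an entered PIN and only runs a payment callback when it matches the stored hash; verify_pin and authorize_purchase are hypothetical helpers, and a real implementation would use the platform's biometric and payment APIs instead.

    import hashlib
    import hmac

    def verify_pin(entered_pin, stored_hash, salt):
        """Derive a hash from the entered PIN and compare it to the stored hash in constant time."""
        candidate = hashlib.pbkdf2_hmac("sha256", entered_pin.encode(), salt, 100_000)
        return hmac.compare_digest(candidate, stored_hash)

    def authorize_purchase(entered_pin, stored_hash, salt, charge):
        """Run the payment callback only after the gift giver's identity check passes."""
        if not verify_pin(entered_pin, stored_hash, salt):
            return False
        charge()
        return True

    # Example: enroll a PIN, then authorize a stubbed charge with it.
    salt = b"example-salt"
    stored = hashlib.pbkdf2_hmac("sha256", b"4921", salt, 100_000)
    print(authorize_purchase("4921", stored, salt, lambda: print("charged")))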



FIGS. 2A-2B depict a gifting environment 200, or location, for placing a virtual gift 214 in an augmented reality gifting presentation. In some embodiments, the gifting environment 200 is scanned or photographed by the gift giver via the giver mobile device 202 to determine one or more potential locations for gift placement. The image or video may be stored in a remote data store or on the giver mobile device 202 and be accessible by the application such that the application may combine a virtual image of the gift with the gifting environment 200 for placement.



FIGS. 2A-2B depict the gifting environment 200, which in this case is the exemplary living room 204 including a couch 206, coffee table 208, book 210, lamp 212, and virtual gift 214. Though the living room 204 is described as the gifting environment 200 in embodiments herein, any environment, indoors or outdoors, may be the gifting environment 200, and any objects may be used to emplace the virtual gift 214 as described below.


In some embodiments, the virtual gift 214 is representative of the gift to be given from the gift giver to the recipient. The virtual gift 214 may be created by the application or the gift giver imaging the gift from a third-party website, or by the gift giver taking a photograph of the gift and uploading the resulting image to the application. In some embodiments, multiple images may be taken, creating different views of the gift, such that the application may piece the different views together to create a 3-D image of the gift. The resulting virtual gift 214 may be three-dimensional such that it may be rotated to fit on objects and in locations within the gifting environment. The 3-D image of the virtual gift 214 may also provide the recipient a detailed look at the gift when deciding on any modifications before accepting the gift.


The selection of a gift may be made from a third-party vendor or directly through the application from a list of a plurality of suggested gifts as described in embodiments below. The gift may be any type of physical gift, consumer product, service, voucher, token, stored-value card, or electronically redeemable gift at any retail or online store. In some embodiments, the virtual gift 214 may represent or be associated with the gift selected by the gift giver. For example, the application may use any information associated with the user 138 (the gift giver or gift recipient) and the scan of the environment to suggest gifts. For example, if the gift giver has chosen to give the recipient a gift card, the application may access databases of nearby retailers and suggest gifts that match the colors or objects in the gifting environment 200 and may be purchased using the gift card. Similarly, a gift card or voucher redeemable at the nearby retailer may be suggested as the gift.


In some embodiments, the suggestions may be based on the decor or objects within the gifting environment 200. For example, the application may determine that the gifting environment 200 is the living room 204 and many of the objects in the living room 204 are from a particular retailer. The application accesses the online website of the retailer and suggests gifts from the website that match the color scheme and decor of the living room 204. The application may also access a recipient online profile on social media or the stored profile of the recipient in the application to determine suitable gifts for the recipient. In some embodiments, the gift may be a stored-value card redeemable at a specific retailer that the recipient frequently visits as determined by object recognition of the gifting environment 200 or accessing information related to the recipient from the online databases.


In some embodiments, the gift is a stored-value card, and suggestions for purchase may be provided by the application based on the value of the card. Suggestions for purchase may also be made based on recognized objects in the gifting environment 200. For example, the application may recognize that the book 210 is in the living room 204. A stored-value card may be selected as the gift by the application based on a stored history of the recipient. The stored-value card may have a twenty-dollar value. The application may suggest or select a stored-value card redeemable at a local bookstore that the recipient frequently visits. The application may also suggest books for purchase that cost approximately twenty dollars or any price corresponding to the value of the stored-value card or a credit associated with the retailer. The book suggestions may also be based on the types of books recognized in the scanned environment. Alternatively, the value associated with the stored-value card may be higher, such that a bookstand or other related gift of a corresponding value is suggested.
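A small Python sketch of this value-and-context filter is shown below; it keeps catalog items that relate to objects recognized in the room and whose price falls near the stored-value card's balance. The catalog structure and suggest_by_value are assumptions made for illustration only.

    def suggest_by_value(card_value, recognized_objects, catalog, tolerance=0.25):
        """Keep items related to recognized objects and priced near the card value."""
        matches = []
        for item in catalog:
            related = recognized_objects & set(item.get("related_objects", []))
            close_in_price = abs(item["price"] - card_value) <= tolerance * card_value
            if related and close_in_price:
                matches.append(item)
        # Closest-priced suggestions first.
        return sorted(matches, key=lambda i: abs(i["price"] - card_value))

    # With a twenty-dollar card and a recognized book, a ~$19 novel is suggested,
    # while a $65 bookstand falls outside the price tolerance.
    catalog = [{"name": "Novel", "price": 19.0, "related_objects": ["book"]},
               {"name": "Bookstand", "price": 65.0, "related_objects": ["book"]}]
    print(suggest_by_value(20.0, {"book"}, catalog))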


In some embodiments, a selection of the book 210 for the gift may be made by the gift giver based on suggestions provided by the application. For example, the gift giver may have a stored-value card, voucher, or other stored-value item associated with a vendor or product. The application may suggest to the gift giver that a gift may be provided such as, for example, the book 210, by comparing a stored-value with products associated with the gift giver or the gift recipient through any of the online databases described above.


Further, suggestions may be made based on the scanned environment. For example, if the lamp 212 is scanned by the gift giver using the giver mobile device 202, the application may recognize the lamp 212 as an object and search the online databases to find a retailer for the lamp 212. A plurality of gifts may then be suggested to the gift giver based on the lamp 212 and the retailer associated with the lamp 212. The gift giver may select a suggested gift from the plurality of suggested gifts and add the gift to the application as a purchased gift. The application may then add the virtual gift 214 to the augmented reality gifting environment 200 provided by a scan of the gifting environment 200. The application may store the virtual gift 214 in the database of images and may associate the gift with the gift giver and the recipient to use for targeted marketing and suggestions.


In some embodiments, the application may link to third-party retailers such that the gift giver may find a gift at a website of the third-party retailer and add the gift to the application. The application may recognize the gift and discern a suitable location within the gifting environment 200 for the virtual gift 214, representative of the gift, to be placed. For example, the gift giver may purchase the book 210 from a third-party retailer as suggested by the application based on the available stored-value of the gift giver at the third-party retailer. Upon purchase of the book 210, the application may automatically image the book 210 and suggest placement of the image of the book 210 or place the image of the book 210 as the virtual gift 214 via augmented reality on the coffee table 208 of the scanned living room 204.


Further, the application may market to the gift giver by displaying the gift in the gifting environment 200 of the gift giver prior to purchase. Continuing with the exemplary embodiment described above, the application may suggest the book 210 to the gift giver based on the stored-value associated with the third-party retailer, or a stored-value card, and the price of the book 210. The application may suggest the gift by placing it in a stored gifting environment associated with the gift giver or the recipient. The stored gifting environment may be a previously used gifting environment 200. The gift may be suggested based on a stored-value, a scanned environment, a profile of the gift giver and the recipient, and any online databases and social media accounts as described above.


In some embodiments, the virtual gift 214 may be presented in a box, envelope, or any such template that may be provided by the application and customizable by the user. Images, text, video, personal handwritten and scanned notes, caricatures of the gift giver and the recipient, or any other personalization may be virtually added to the gifting environment 200 or the virtual gift 214.


In some embodiments, the gifting environment 200 may be a residence of the gift giver or the recipient. For example, the embodiment depicted in FIG. 2A presents the living room 204 with the couch 206 and the coffee table 208. In some embodiments, the images of the gifting environment 200 may be uploaded such that the application may combine the uploaded images or video with the virtual gift 214 to present the gift to the recipient. The images and video may also be taken using a camera on the giver mobile device 202 or may be uploaded to the application from a peripheral device such as a camera, a computer, a tablet, or another device capable of taking a photograph or a video. Images and video of the gifting environment 200 may be stored on the giver mobile device 202 and added to the application by drag-and-drop or any other method. In other embodiments, the gifting environment 200 may be scanned in real time by a recipient mobile device of the recipient during the process of receiving the gift or in advance as part of registering with the application.


In some embodiments, the gifting environment 200 may also be a restaurant, bowling alley, sports arena, or any other indoor location. The gifting environment 200 may also be an outdoor location. For example, the gift giver may scan or image a patio, deck, wooded area, beach, or any other outdoor location that the gift giver may use as the gifting environment 200 to present the gift. Further, the location may be outdoors and the objects may include picnic tables, beaches, boats, decks, patios, or any other objects or environments that may be imaged and stored in the application database. The application may match the outdoor area with locations and objects in a database with stored outdoor scenes and find a suitable object and gift to suggest to the gift giver. Once selected, the object may be presented, via the application on the recipient mobile device, to the recipient.


In some embodiments, the application may utilize object recognition to recognize objects within the gifting environment 200 for placement or hiding of the virtual gift 214. The application may compare objects in the images of the living room 204 with images in a database, recognizing shapes, colors, and arrangements of objects within the living room 204 and comparing those images with a database of objects and arrangements. The application may also relate the objects and the arrangements to similar objects and arrangements found in a similar room from the database. The objects may be defined from contrasts between the objects and the background, and the shapes may be matched to a database of shapes to determine the objects. Once the objects are determined, the application may determine whether the objects are suitable for the virtual gift 214 to be emplaced. The type of object and the type of gift may be used to determine the placement of the virtual gift 214 (e.g., flowers on a table or a picture on a wall). For example, as depicted in FIG. 2A, the application may recognize that the couch 206 is in the living room 204 by comparing the shape of the couch 206 to images in the database. The application may recognize that the couch 206 may be an appropriate location to place, for example, a virtual gift such as a pillow or a blanket, but not an appropriate location for a picture or a bowl. The application may also recognize the type of couch 206, the manufacturer of the couch 206, and retailers near the location of the gift giver and the recipient that sell the couch 206. In some embodiments, the application may relate the couch 206 to other objects from the manufacturer or the retailer and suggest gifts that may complement the couch 206 or any other objects in the room. In some embodiments, the application may access social media or the profiles of the gift giver and the recipient to make gift selections as described in more detail below.
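The following toy Python sketch stands in for this recognition-and-suitability step: it matches a detected object's aspect ratio against a small reference table and then checks whether the selected gift type is appropriate for that object. Real object recognition would use a trained vision model; the tables, thresholds, and function names here are illustrative assumptions only.

    # Reference aspect ratios and placement rules are invented for illustration.
    REFERENCE_SHAPES = {"couch": 2.8, "coffee table": 1.6, "lamp": 0.4}
    SUITABLE_PLACEMENTS = {"couch": {"pillow", "blanket"},
                           "coffee table": {"book", "bowl", "flowers"}}

    def classify(width, height, tolerance=0.3):
        """Match a bounding box's width/height ratio to the closest reference shape."""
        ratio = width / height
        best = min(REFERENCE_SHAPES, key=lambda k: abs(REFERENCE_SHAPES[k] - ratio))
        return best if abs(REFERENCE_SHAPES[best] - ratio) <= tolerance else None

    def placement_ok(obj, gift_type):
        """A pillow suits a couch; a bowl does not."""
        return gift_type in SUITABLE_PLACEMENTS.get(obj, set())

    print(classify(2.7, 1.0))               # -> 'couch'
    print(placement_ok("couch", "pillow"))  # -> True
    print(placement_ok("couch", "bowl"))    # -> False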


In some embodiments, the application may detect objects that are unrecognized because they are partially concealed. Continuing with the above-described exemplary embodiment, an object may be partially concealed by the couch 206. The application may scan living room images in a database and determine that many couches in the database images have end tables at the location that the unrecognized object is located. The application may determine with a calculated level of confidence that the unrecognized object is an end table 216 and determine that the end table 216 is a suitable location for the gift. The application may place the virtual gift 214 on the end table 216 in the image such that the virtual gift 214 is revealed when the recipient of the gift scans the living room 204 with the camera of the recipient mobile device running the application.
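One way to express the calculated level of confidence for a partially concealed object is a simple co-occurrence lookup, sketched below in Python; the probabilities and the infer_concealed helper are invented for illustration and are not taken from the disclosure.

    # Hypothetical statistics: how often an unidentified object beside a recognized
    # couch turned out to be each item in previously labeled room images.
    BESIDE_COUCH = {"end table": 0.62, "floor lamp": 0.23, "pet bed": 0.15}

    def infer_concealed(neighbor, threshold=0.5):
        """Return the most likely identity and its confidence, or None if nothing
        clears the confidence threshold."""
        table = {"couch": BESIDE_COUCH}.get(neighbor, {})
        if not table:
            return None
        label, confidence = max(table.items(), key=lambda kv: kv[1])
        return (label, confidence) if confidence >= threshold else None

    print(infer_concealed("couch"))  # -> ('end table', 0.62), a suitable gift location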


In some embodiments, the application may determine an object by recognizing a flat surface or a suitable location for emplacement of the virtual gift 214 based on the gift and the objects in the gifting environment 200. For example, the gift giver may buy a comforter at an online third-party retailer and upload an image of the virtual gift 214 to the application. The gift giver may also enter a number associated with the comforter, or select a retailer in the application and select the comforter from options provided by the application accessing a database. For selection of the gifting environment 200, the gift giver may scan, for example, a bedroom. The application may recognize the bedroom and the objects in the bedroom and, through augmented reality, place the virtual gift 214, for example an image of the comforter, on the bed for the recipient to view when viewing the bedroom with the application running on the recipient mobile device.


In the event that the software cannot identify a suitable gift placement object with a high level of certainty, the application may offer suggestions to the gift giver for placement of the virtual gift 214. For example, the application may suggest that the lighting be changed in the gifting environment 200 or that the gift giver perform the scan of the gifting environment 200 from a different location, such as inside or facing away from a light source.


In some embodiments, the application may receive input from the gift giver, such as the room type or labels defining particular objects in the gifting environment 200. For example, the gift giver scans a room and the application is unable to discern the room type. The application prompts the gift giver to label the room. The user types “bedroom.” The application may then narrow the options to known images of bedrooms to compare objects within the room. Further, the gift giver may select objects and define the objects with text inputs such as “dresser” or “bed”. The application may then deduce that the gifting environment 200 is a bedroom and narrow the database search to objects in a bedroom. From the text inputs and label recognition, the application may select suitable objects for placement of the virtual gift 214.
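A user-supplied label can simply shrink the comparison set before shape matching runs, as in the short Python sketch below; the room-to-object lists and candidate_objects are illustrative assumptions.

    # Hypothetical object lists per room type.
    OBJECTS_BY_ROOM = {"bedroom": {"bed", "dresser", "nightstand"},
                       "living room": {"couch", "coffee table", "lamp"}}

    def candidate_objects(room_label):
        """Return the reduced search space implied by the gift giver's label,
        or every known object when the label is unrecognized."""
        default = set.union(*OBJECTS_BY_ROOM.values())
        return OBJECTS_BY_ROOM.get(room_label.strip().lower(), default)

    print(candidate_objects("Bedroom"))  # -> {'bed', 'dresser', 'nightstand'}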


In some embodiments, the object may be selected by the gift giver. Once the living room 204 has been scanned and the application has identified objects, the gift giver may select an object in the living room 204 for placement of the virtual gift 214. Once the object is selected, the virtual gift 214 may appear on the screen in the gifting environment 200. The virtual gift 214 may be edited such that the gift giver may move the virtual gift 214 in any translational or rotational way or change the size of the virtual gift 214 to place the virtual gift 214 in a realistic manner on the object, for example, the coffee table 208 in the living room 204. In some embodiments, the size, shape, and position of the virtual gift 214 are changed automatically by the application to present a realistic placement on the coffee table 208 within the gifting environment 200.
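The automatic resizing and repositioning can be reduced to an ordinary 2-D similarity transform over the gift's footprint, as sketched below in Python; place_gift and the example coordinates are illustrative only and not part of the disclosure.

    import math

    def place_gift(corners, scale, angle_deg, dx, dy):
        """Scale, rotate about the origin, then translate the virtual gift's 2-D
        footprint so it sits plausibly on the selected surface."""
        theta = math.radians(angle_deg)
        cos_t, sin_t = math.cos(theta), math.sin(theta)
        placed = []
        for x, y in corners:
            x, y = x * scale, y * scale                          # resize
            x, y = x * cos_t - y * sin_t, x * sin_t + y * cos_t  # rotate
            placed.append((round(x + dx, 3), round(y + dy, 3)))  # move onto the table
        return placed

    # A unit-square footprint halved in size, turned 90 degrees, and moved to (3, 2).
    print(place_gift([(0, 0), (1, 0), (1, 1), (0, 1)], 0.5, 90, 3, 2))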


In certain embodiments, the virtual gift 214 may be tied to a particular geographical location. The recipient may be presented with the virtual gift 214 or hints as to what the gift may be, but may not be able to access the gift unless the recipient is in a specified gift location or gifting environment 200. The gift location may be associated with GPS coordinates, and the application may track a recipient location relative to the location of the gift and/or direct the recipient to the location of the gift. In some embodiments, the location of the gift is the gifting environment 200 for presentation of the virtual gift 214.
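A geofence check of this kind can be as simple as a great-circle distance test, sketched below in Python; haversine_m, may_open_gift, the radius, and the coordinates are all illustrative assumptions rather than part of the disclosure.

    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in meters between two GPS coordinates."""
        r = 6371000.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def may_open_gift(recipient, gift_location, radius_m=30.0):
        """Unlock the virtual gift only when the recipient is inside the geofence."""
        return haversine_m(*recipient, *gift_location) <= radius_m

    print(may_open_gift((38.9517, -92.3341), (38.9518, -92.3340)))  # True, roughly 14 m apart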


Turning now to FIG. 3A where, in some embodiments, a map 300 may be displayed on the recipient mobile device 302. The recipient location 304 may be represented by a marker 306. The recipient location 304 may be visible on the map 300 and, in certain embodiments, the application provides hints to the recipient in an exemplary scavenger hunt type scenario. For example, the recipient may receive a notice that the virtual gift 214 is hidden and that the recipient must find it. The notice may be received via notification methods such as email, text, instant message, audio message, video message, or application alerts, or may be received from any social media site associated with the user 138. The recipient may receive more hints based on time spent looking for the gift or based on GPS location through features such as geo-fencing. If the recipient is within a certain range or passes a certain checkpoint, more hints may be provided. Similarly, if a certain time has passed and the recipient is not making sufficient progress, then further hints may be provided. The frequency of hints may be customized by the gift giver and the recipient. Upon arrival at the gift location, the recipient may access the application and use the camera on the recipient mobile device 302 to scan the area. For example, a picnic (the gift) may be revealed as the virtual gift 214 on a blanket (the object) that the gift giver has left for the picnic. In other exemplary embodiments, the picnic may be revealed to the recipient as the virtual gift 214 at any point, and the recipient may need to find the gift location, which, in some embodiments, may be the gifting environment 200, to find the picnic (the gift). In some embodiments, the virtual gift 214 may be an electronic gift such as an e-card, voucher, token, or gift card.


In some embodiments, a proximate gift location 308 may be presented to the recipient. This may narrow the search area for the recipient. The application may provide hints using the proximate gift location 308 in an exemplary game of hot and cold. The location of the gifting environment 200 may be known through GPS or a proximity sensing method. The gift may be in the location and have an accessible GPS device, or the location may be input by the gift giver when selecting and placing the gift as described in embodiments above. The application may provide hints based on the location of the recipient mobile device 302 in relation to the gift or the gifting environment 200. The application may also provide a hot indication when the recipient mobile device 302 is less than a designated distance from the gift and a cold indication when the recipient mobile device 302 is greater than a designated distance from the gift. The indications may be provided by voice, text, or any indication representing hot, cold, or any other representative temperature such as warm, cool, or ice cold. In some embodiments, the notifications are provided on the map 300 as a proximity indicator 310 that may be colored or provide text to indicate the proximity such as, for example, a calculated distance to the virtual gift 214. In some embodiments, the map 300 may be a heat map that displays different levels of hot and cold as the recipient moves closer to the virtual gift 214 or the gifting environment 200.
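The hot/cold indication reduces to banding the recipient's distance from the gift, as in the Python sketch below; the band boundaries and proximity_hint are illustrative choices, not values specified by the disclosure.

    # Illustrative distance bands, in meters.
    HINT_BANDS = [(15, "hot"), (50, "warm"), (150, "cool"), (float("inf"), "ice cold")]

    def proximity_hint(distance_m):
        """Map the distance between the recipient and the virtual gift to a temperature hint."""
        for limit, label in HINT_BANDS:
            if distance_m <= limit:
                return label
        return "ice cold"

    print(proximity_hint(12))   # -> 'hot'
    print(proximity_hint(400))  # -> 'ice cold'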


In some embodiments, the gift may be received by accessing the web-based application such that the recipient does not download the application. Alternatively, the recipient may have the application installed on the recipient mobile device 302 and may access the gift using the installed application. The recipient may get a notification via email, message, social media, or any other messaging system, or may receive an alert via the installed application. Upon receiving the notification, the recipient may access the application and receive hints, instructions, personalized messages, text, video, or pictures uploaded by the gift giver or selected from application provided templates.



FIG. 3B depicts one embodiment of the recipient mobile device 302 presenting the gifting environment 200 with the virtual gift 214 displayed on the coffee table 208. In some embodiments, the recipient may be at any location and view the scanned gifting environment 200 including the virtual gift 214. For example, the recipient may be at work and receive a text message that a gift has been left in the recipient's living room 204 and that the virtual gift 214 may be viewed via the application. Upon accessing the application on the recipient mobile device 302, the recipient views the living room 204 with the virtual gift 214 on the coffee table 208. The recipient then arrives home to receive the gift on the living room 204 coffee table 208 as viewed via the gifting environment 200 on the recipient mobile device 302.


In some embodiments, the virtual gift 214 is presented in the gifting environment 200 to the recipient as depicted in FIG. 3B. In some embodiments, the recipient must be in or otherwise viewing the environment represented by the virtual gifting environment 200 to view the virtual gift 214. The application may recognize the objects as the recipient scans the gifting environment 200 and place the virtual gift 214, thus presenting the virtual gift 214 to the recipient. For example, the recipient may be in the living room 204 and scan the living room 204 with the recipient mobile device 302. The application may recognize that the recipient mobile device 302 is at the correct location by GPS and by recognition of the objects in the gifting environment 200. When the recipient scans the coffee table 208, the application recognizes the coffee table 208 as the object for placement of the virtual gift 214 and presents the virtual gift 214 in augmented reality in real time as the gifting environment 200 is scanned via the recipient mobile device 302. In this case, the view of the living room 204 may be a live view from the camera of the recipient mobile device 302, and the virtual gift 214 is added to the scene through augmented reality.


In some embodiments, the recipient may use the camera of the recipient mobile device 302 running the application downloaded on the recipient mobile device 302 or through an online website. The recipient may view the living room 204 via the camera. When the coffee table 208 comes into view, the virtual gift 214 may be displayed on the coffee table 208 in augmented reality. This provides the recipient with a view of the virtual gift 214 in the living room 204. This is a surprise to the recipient and also provides the recipient and the gift giver a method for viewing the gift in the gifting environment 200 prior to accepting the gift.


In some embodiments, the recipient may modify or select different gifts to be displayed as the virtual gift 214. For example, the gift may be the couch 206. The recipient may not care for the color of the couch 206 and may access the third-party retailer and change the color as desired. This may provide the recipient the option to see the virtual gift 214 in the living room 204 before purchasing or accepting the gift. Further, the gift giver may provide selections of different colors, sizes, or any other options for the gift such that the recipient may scroll through the different options, displayed as the virtual gift 214 in the gifting environment 200, and select the most desired.



FIG. 3C depicts the recipient mobile device 302 displaying the virtual gift 214. When the recipient has located the virtual gift 214, the virtual gift 214 may be selected and the gift 312 accepted. The gift 312 may be received by any method of interaction with the recipient mobile device 302. The recipient may interact with the recipient mobile device 302 by tilting, shaking, touching, swiping, or making a facial expression or movement detected by the application accessing the camera of the recipient mobile device 302. For example, the virtual gift 214 may be wrapped as depicted in FIG. 3B. The recipient may touch, swipe, or otherwise interact with the recipient mobile device 302 in any manner described above, and the virtual gift 214 unwraps to reveal the gift. In some embodiments, the gift 312 may also be revealed or accessed through biometric identification such as facial, retinal, fingerprint, or any other biometric recognition.
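The reveal interaction can be modeled as a small state change triggered by the first recognized gesture, sketched below in Python; the event names and the VirtualGift class are hypothetical, and a real app would receive such events from the platform's touch, motion, and camera APIs.

    # Hypothetical gesture names standing in for touch, accelerometer, and camera events.
    REVEAL_GESTURES = {"tap", "swipe", "shake", "tilt", "smile detected"}

    class VirtualGift:
        def __init__(self, contents):
            self.contents = contents
            self.revealed = False

        def handle_event(self, event):
            """Unwrap on the first recognized interaction; ignore everything else."""
            if not self.revealed and event in REVEAL_GESTURES:
                self.revealed = True
                return self.contents
            return None

    gift = VirtualGift("bookstore gift card")
    print(gift.handle_event("scroll"))  # None, still wrapped
    print(gift.handle_event("shake"))   # 'bookstore gift card'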


Upon reveal of the gift 312, the application may access peripheral devices of the recipient mobile device 302 to provide feedback to the recipient. For example, the application may provide a sound, for example, of the virtual gift 214 unwrapping, a vibration of the recipient mobile device 302, or lights illuminating on the recipient mobile device 302. The camera of the recipient mobile device 302 may automatically activate, taking images or video to capture the recipient at the time of the reveal or when the gifting environment 200 is detected by the application.


In some embodiments, the gift 312 may be a token as depicted in FIG. 3C, or any gift card, e-card, or voucher that may be redeemable online or electronically redeemable. In some embodiments, the gift 312 may be a physical gift that may be sent directly to the recipient upon selection and acceptance. The physical gift may be sent to any user 138 that may be associated with an account or profile accessible by the application.


In some embodiments, the gift 312 may be revealed with a personal message 314. The personal message 314 may be text, audio, video, photographs, or any other method of customization for displaying a personalized item to the recipient. The personal message 314 may also link to a website of the gift giver, the recipient, or a third-party retailer of the gift 312. In some embodiments, the personal message 314 provides the options for modifying the gift 312 as described in embodiments above. After modification of the gift 312 the recipient may return to the gifting environment 200 and view the modified virtual gift 214 in the gifting environment 200 as described in embodiments above.


In some embodiments, the gift 312 is presented with a gift receipt or a gift receipt is sent to the recipient automatically upon accepting the gift 312. The gift receipt may be sent via text message, email, instant message, through a social media account, or any other method that information may be passed to the recipient. In some embodiments, the gift receipt is uploaded to the profile of the recipient such that the recipient may view and print the gift receipt at any time.



FIG. 4 depicts an exemplary flow diagram 400 for embodiments of the invention. At step 402, the profile of the user 138 is created and/or updated. The user 138, in the case of a gift giver, may create a profile associated with the application such that the user 138 may access the application for, in some embodiments, selection and purchase of the gift 312 and placement of the virtual gift 214 in the gifting environment 200 as described in embodiments above. The user 138, in the case of the recipient, may create a profile and utilize the application to receive the gift 312 via the recipient mobile device 302 displaying the virtual gift 214 in the gifting environment 200.


At step 404, the gift giver may receive offers or suggestions for gifts based on the profile of the gift giver or the recipient as in embodiments described above. The application may access any information associated with the gift giver and the recipient provided by the gift giver and the recipient on the profile such as, for example, age, physical features, location, friends, favorite sports, favorite TV shows, favorite music, or any other information that may be used to select the gift 312. The gift giver may select at least one gift 312 from a plurality of suggested gifts or the gift giver may select the gift 312 that the gift giver chooses online or in store.


At step 406, the gift giver may utilize the camera associated with the giver mobile device 202 and take an image or video of the gifting environment 200 as described in embodiments presented above. The image or video may be used to select objects for placement of the virtual gift 214 and for reveal of the gift 312 to the recipient.


At step 408, the gift giver may place the virtual gift 214, or select a location in the gifting environment 200 for the virtual gift 214 to be presented, as described in embodiments presented above. The gift giver may choose from objects in the gifting environment 200 suggested by the application, or the gift giver may select an object or simply place the virtual gift 214 anywhere in the gifting environment 200. The gift giver may also provide a GPS location of the virtual gift 214 and the gifting environment 200 for use with embodiments of the invention. The application may use the location of the virtual gift 214 and the gifting environment 200 to send information to the recipient.


At step 410, a notification may be sent to the recipient providing information related to the gift 312 as described in embodiments presented above. The notification may indicate that the gift 312 is waiting for the recipient and may provide location information or hints as to the location of the gift 312. Personal information or greetings such as Happy Birthday, Congratulations, or any other celebratory greeting or personally customized content from text, images, videos, or the like may be provided with the notification. In some embodiments, the gifting environment 200 and the virtual gift 214 may also be sent or otherwise linked to the notification.


At step 412, the recipient may search for the gifting environment 200 and the virtual gift 214 as described in embodiments above. The recipient may receive notifications and hints directing the recipient to the location of the virtual gift 214. The notifications may be provided at regular intervals or periodically based on time or location of the recipient mobile device 302 relative to the location of the virtual gift 214 and the gifting environment 200.


At step 414, the gifting environment 200 may be displayed via the camera of the recipient mobile device 302. The recipient may utilize the recipient mobile device 302 to scan the gifting environment 200. The application may recognize the objects in the gifting environment 200 and place the virtual gift 214 in the image scanned by the recipient, thus presenting the virtual gift 214 to the recipient in real time as the gifting environment 200 is scanned.


At step 416, the virtual gift 214 is selected. The recipient may select the virtual gift 214 by touching, swiping, speaking, or making a gesture recognized by any sensor of the recipient mobile device 302 that may sense any interaction from the recipient. In some embodiments, a personal message 314 is also presented that may be selectable and provide a link to a website or third-party retailer.


At step 418, the gift 312 is presented to the recipient. Upon selection of the virtual gift 214, the gift 312 may be revealed to the recipient as described in embodiments above. The reveal may be animated in the augmented reality environment, such as with a gift opening, fireworks, animals celebrating, or any other celebratory display or animation that may be selected by the gift giver or the application. The gift 312 reveal may also include the personal message 314, which may be from the gift giver or a plurality of gift givers and may include text, images, videos, audio, or any other form of personal message 314.


At step 420, the recipient may input modifications and accept the gift 312 as described in embodiments above. The user 138 may access a third-party retailer and modify the colors, shapes, designs, sizes, or any other features and view the virtual gift 214 with the modifications in the gifting environment 200 prior to accepting the gift 312.


At step 422, the gift 312 may be automatically sent along with a gift receipt to the recipient upon accepting the gift 312 as described in embodiments above. The address and any other information from the recipient may be obtained from the profile of the recipient or the profile of the gift giver.


At any time, and upon any interaction with the user 138, the application and profiles may update. Any information gained from the interaction with the user 138 may be stored for future use and to modify offers and gift suggestions and to update personal information associated with the user 138.


Any steps provided in embodiments of the invention and described in methods may be omitted, added, and rearranged. Multiple methods for selecting and placing and receiving and accepting gifts may be utilized as described in embodiments provided herein.


Some embodiments of the invention utilize machine learning, neural networks, fuzzy logic, or any other statistical or general mathematical algorithm or artificial intelligence to increase the efficiency of the application and create a more user-friendly experience. The mathematical algorithms may be used for recommending gifts based on online databases such as social media or on location and user demographics. The mathematical algorithms may be used along with user feedback to improve environment recognition and discern objects within the gifting environment 200. For example, in the event that an object is placed in a location that is not suitable for the gift 312, such as, for example, an electronic device placed in a sink, the user 138 may provide negative feedback, and the results of the process may be re-evaluated, stored, and incorporated into the algorithms for future use. Further, positive feedback may be used to strengthen the positive outcomes, thus increasing the likelihood of positive outcomes and decreasing the likelihood of negative outcomes.
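A minimal sketch of folding such feedback back into the placement logic is shown below in Python; the simple exponential update in update_weight is one plausible choice for this purpose and is not prescribed by the disclosure.

    def update_weight(weight, feedback, rate=0.2):
        """Nudge a placement rule's score toward 1 on positive feedback and toward 0
        on negative feedback (e.g. 'electronics in a sink' gets voted down)."""
        target = 1.0 if feedback > 0 else 0.0
        return weight + rate * (target - weight)

    # The 'electronics on a sink' rule loses weight after repeated negative feedback.
    w = 0.5
    for _ in range(3):
        w = update_weight(w, feedback=-1)
    print(round(w, 3))  # -> 0.256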


Although the invention has been described with reference to the embodiments illustrated in the attached drawing figures, it is noted that equivalents may be employed and substitutions made herein without departing from the scope of the invention.

Claims
  • 1. One or more non-transitory computer-readable media storing computer-executable instructions that, when executed by a processor, perform a method of presenting a virtual gift to a recipient using augmented reality, the method comprising: receiving a selection of a gift from a gift giver, the gift selected from a plurality of suggested gifts; obtaining an image of the gift from a third party retailer; receiving at least one image of a gifting environment via a giver mobile device, using the image of the gifting environment to select objects for placement of the gift; receiving a scan of the gifting environment from a recipient mobile device; discerning a plurality of objects from the gifting environment; presenting one or more suggested virtual gift placement locations, and receiving selection of a virtual gift placement location; presenting the virtual gift in a real-time scan of the gifting environment on the recipient mobile device to create an augmented reality of the gifting environment; receiving one or more modifications to the gift from the recipient; and creating a modified virtual gift by obtaining an image of the gift with the one or more modifications from a third party retailer; presenting the modified virtual gift in the gifting environment to the recipient mobile device.
  • 2. The method of claim 1, further comprising: receiving a gift environment location via the giver mobile device.
  • 3. The method of claim 2, wherein the gifting environment location is received as global positioning system (GPS) coordinates.
  • 4. The method of claim 3, further comprising: providing at least one notification to the recipient via the recipient mobile device, wherein the at least one notification is indicative of at least one of the gifting environment location and a virtual gift location.
  • 5. The method of claim 1, further comprising: receiving a virtual gift location within the gifting environment from the giver mobile device.
  • 6. The method of claim 5, wherein the virtual gift location is selected by the gift giver via the giver mobile device from a plurality of virtual gift locations recognized in the gifting environment.
  • 7. The method of claim 6, wherein the virtual gift location is recognized using object recognition and the virtual gift is presented to the recipient via the recipient mobile device at the virtual gift location during the real-time scan.
  • 8. The method of claim 1, wherein the recipient receives at least one notification when the virtual gift is placed in the gifting environment.
  • 9. The method of claim 1, wherein the gift is selected by the gift giver via the gift giver mobile device from a plurality of suggested gifts.
  • 10. The method of claim 1, wherein the gift is revealed to the recipient upon receiving an input via the recipient mobile device.
  • 11. The method of claim 10, wherein the input is at least one of a touch of the recipient mobile device, a movement of the recipient mobile device, and an expression of the recipient recorded via a camera associated with the recipient mobile device, wherein the gift is revealed by at least one of unwrapping the virtual gift, opening the virtual gift, and a celebratory display, and wherein the gift is at least one of a physical gift, a gift card, a token, and an electronically redeemable gift.
  • 12. A method for selecting and presenting a virtual gift to a recipient using augmented reality, the method comprising: receiving a selection of a gift from a gift giver, the gift selected from a plurality of suggested gifts; determining a virtual representation of the gift selected from an online website of a third party retailer, the virtual representation comprising an image; receiving at least one image of a gifting environment via a giver mobile device, using the image of the gifting environment to select objects for placement of the gift; discerning a plurality of objects from the gifting environment; determining at least one object from the plurality of objects for placement of the virtual gift; presenting one or more suggested virtual gift placement locations, and receiving selection of a virtual gift placement location; presenting the gifting environment and the virtual gift to the gift recipient; receiving one or more modifications to the gift from the recipient; and presenting a modified virtual gift in the gifting environment to the recipient mobile device, the modified virtual gift comprising an image of the gift with the one or more modifications retrieved from an online website of the third party retailer.
  • 13. The method of claim 12, wherein the virtual gift is an image taken by a camera associated with the giver mobile device.
  • 14. The method of claim 12, wherein the virtual gift is an image retrieved from a third-party website.
  • 15. The method of claim 12, wherein the at least one object is determined from the plurality of objects by object recognition.
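
Claims 1 and 12 both recite representing the gift with an image obtained from a third party retailer and, after the recipient requests one or more modifications, building a modified virtual gift from a retailer image of the changed item. The following is a minimal illustrative sketch in Python of that data flow; every name in it (VirtualGift, apply_modification, the example SKU and URLs) is a hypothetical stand-in and not the patent's implementation.

    # Illustrative sketch of the virtual-gift record recited in claims 1 and 12:
    # the gift is represented by a retailer-supplied image, and a recipient
    # modification produces a new record backed by a new retailer image.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class VirtualGift:
        sku: str
        image_url: str             # image obtained from the third party retailer
        modifications: tuple = ()  # e.g. ("color: red",)

    def apply_modification(gift: VirtualGift, change: str, new_image_url: str) -> VirtualGift:
        """Return a modified virtual gift backed by the retailer's image of the changed item."""
        return VirtualGift(sku=gift.sku,
                           image_url=new_image_url,
                           modifications=gift.modifications + (change,))

    # Example: the recipient asks for a different color, and the application swaps
    # in the retailer's image for that variant (SKU and URLs are placeholders).
    gift = VirtualGift(sku="LAMP-01", image_url="https://retailer.example/lamp-blue.png")
    modified = apply_modification(gift, "color: red", "https://retailer.example/lamp-red.png")
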
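Claims 2 through 4 add a gifting environment location received as GPS coordinates and notifications to the recipient indicative of that location. One plausible way to drive such a notification, sketched below under assumed names and an assumed 100-meter radius, is to compare the great-circle (haversine) distance between the recipient's current coordinates and the stored environment coordinates against a threshold.

    # Illustrative proximity check: notify the recipient when they come within a
    # chosen radius of the stored gifting environment location (claims 2-4).
    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in meters between two GPS coordinates."""
        r = 6371000.0  # mean Earth radius in meters
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def should_notify(recipient_coords, environment_coords, radius_m=100.0):
        """True when the recipient is close enough to be told the virtual gift is nearby."""
        return haversine_m(*recipient_coords, *environment_coords) <= radius_m
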
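Claims 5 through 7 and 15 describe recognizing objects in the gifting environment, letting the gift giver choose one as the virtual gift location, and presenting the virtual gift at that location when the recipient's real-time scan recognizes it. The sketch below stands in for an object-recognition pipeline with a plain list of labeled detections; the surface labels and the label-matching rule are illustrative assumptions only.

    # Illustrative placement selection: filter recognized objects to plausible gift
    # surfaces, let the giver choose one, and later match the same label in the
    # recipient's real-time scan to anchor the virtual gift (claims 5-7 and 15).
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class DetectedObject:
        label: str        # e.g. "table", as produced by an object recognizer
        position: tuple   # (x, y, z) position within the scanned scene

    SURFACE_LABELS = {"table", "shelf", "counter", "desk", "mantel"}

    def candidate_locations(detections: List[DetectedObject]) -> List[DetectedObject]:
        """Objects in the giver's environment image that could hold the virtual gift."""
        return [d for d in detections if d.label in SURFACE_LABELS]

    def anchor_in_scan(chosen_label: str, scan: List[DetectedObject]) -> Optional[DetectedObject]:
        """Find the chosen placement object again in the recipient's real-time scan."""
        return next((d for d in scan if d.label == chosen_label), None)
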
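Claims 10 and 11 reveal the gift only after an input at the recipient mobile device, such as a touch, a movement, or a camera-recorded expression, and reveal it through an effect such as unwrapping, opening, or a celebratory display. A minimal dispatch over those inputs might look like the sketch below; the specific input and effect names are assumptions, not the specification's vocabulary.

    # Illustrative reveal trigger for claims 10-11: map a recipient input to the
    # effect used to reveal the virtual gift. Input and effect names are assumed.
    REVEAL_EFFECTS = {
        "touch": "unwrap",        # tap on the virtual gift unwraps it
        "shake": "open",          # device movement opens the box
        "smile": "celebration",   # camera-detected expression triggers a celebratory display
    }

    def reveal_effect(detected_input: str) -> str:
        """Choose how to reveal the virtual gift for a given recipient input."""
        return REVEAL_EFFECTS.get(detected_input, "unwrap")
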
RELATED APPLICATIONS

This non-provisional patent application claims priority benefit, with regard to all common subject matter, of earlier-filed U.S. Provisional Patent Application No. 62/673,280, filed May 18, 2018, and entitled “AUGMENTED REALITY GIFTING ON A MOBILE DEVICE.” The identified earlier-filed provisional patent application is hereby incorporated by reference in its entirety into the present application.

Related Publications (1): US 2019/0355050 A1, Nov. 2019.
Provisional Applications (1): U.S. Provisional Application No. 62/673,280, filed May 2018 (US).