Even with advances in technology, budgeting finances to purchase various products is often difficult for a user. Purchasing large products, such as houses or cars, may affect a user's budget for many years. It is imperative that individuals budget correctly prior to a purchase in order to have adequate financial stability both for and after the purchase. However, budgeting can be challenging for several reasons. First, it is difficult to predict some future purchases that may be made. Second, budgeting takes planning, time, and sometimes requires expert advice. Finally, after the budget is complete, it is easy for someone to stray from his/her budget and subsequently not be able to reach purchasing goals as early as desired.
Current handheld mobile devices, such as smart phones or the like, have the functionality to allow their use for a myriad of day-to-day transactions, such as paying for a cup of coffee or providing a boarding pass for a flight. These technological advances combine multiple technologies into a handheld mobile device to provide a user with a large number of capabilities. For example, many smart phones are equipped with significant processing power, sophisticated multi-tasking operating systems, and high-bandwidth Internet connection capabilities. Moreover, such devices often have additional features that are increasingly more common and standardized. Such features include, but are not limited to, location-determining devices, such as Global Positioning System (GPS) devices; sensor devices, such as accelerometers; and high-resolution video cameras.
As the hardware capabilities of such mobile devices have increased, so too have the applications (i.e., software) that rely on the hardware advances. One such example of innovative software is a category known as augmented reality (“AR”), or more generally referred to as mediated reality. The AR technology analyzes location data, compass direction data, and the like in combination with information related to the objects, locations or the like in the video stream to create browse-able “hot-spots” or “tags” that are superimposed on the mobile device display, resulting in an experience described as “reality browsing”.
Therefore, a need exists to provide individuals with access to budgeting capabilities for easy budgeting and updating of a budget to include products the individual may wish to purchase in the future.
The following presents a simplified summary of one or more embodiments in order to provide a basic understanding of such embodiments. This summary is not an extensive overview of all contemplated embodiments, and is intended to neither identify key or critical elements of all embodiments nor delineate the scope of any or all embodiments. Its sole purpose is to present some concepts of one or more embodiments in a simplified form as a prelude to the more detailed description that is presented later.
Embodiments of the present invention address the above needs and/or achieve other advantages by providing apparatuses (e.g., a system, computer program product and/or other devices) and methods for using real-time video analysis, such as augmented reality (“AR”) or the like, to assist the user of mobile devices with populating budgets and/or wish lists, which allows a user to easily add products to budgets and/or wish lists and be able to begin planning to purchase the products in the future.
Using real-time video analysis, such as AR or the like, the system may provide the user of a mobile device with real-time budget and/or wish list updates incorporating product and/or budget data located in a real-time video stream. In some embodiments, the budget may provide the user with a real-time indication as to how purchasing a product may affect the user's budget, a future budget based on the purchase of the product, etc. In some embodiments, the wish list may provide the user with real-time indications as to the user's current wish list, current prices for products on the wish list, etc. Through the use of real-time vision object recognition, objects, logos, artwork, products, locations, and other features that can be recognized in the real-time video stream can be matched to data associated with those objects, logos, artwork, products, locations, or other features to determine products and/or data for directories associated with budgeting and/or wish lists. In some embodiments, the user need not use a real-time video stream, but may instead take a still photo image of the product and/or budget data. In some embodiments, the data that is matched to the images is specific to financial institutions, such as user financial behavior history, user purchasing power, transaction history, and the like. In this regard, many of the embodiments herein disclosed leverage financial institution data, which is uniquely specific to financial institutions, in providing information to mobile device users in connection with real-time video stream analysis.
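The matching step described above reduces, at its simplest, to keying recognized features into a catalog of known products and budget data. A minimal illustrative sketch in Python follows; the catalog contents, feature names, and function names are hypothetical, not part of the disclosed system:

```python
# Hypothetical sketch: match features recognized in a video frame to
# product and/or budget data. All names and values are illustrative.
CATALOG = {
    "acme_tv_logo": {"product": "ACME 50-inch TV", "price": 499.99},
    "electric_bill_layout": {"budget_data": "monthly electric bill"},
}

def match_features(recognized_features):
    """Return catalog entries for every recognized feature found in a frame."""
    return [CATALOG[f] for f in recognized_features if f in CATALOG]

# Features the vision step claims to have recognized in one frame
matches = match_features(["acme_tv_logo", "unknown_blob"])
```

In practice the lookup would be backed by the financial-institution data the paragraph describes, rather than an in-memory dictionary.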
In some embodiments, the product and/or budget data is determined from the images in the real-time video stream. Products may include any good, service, etc. that a user may wish to purchase. For example, a user may provide real-time video images of a television that the user wishes to purchase. Budget data may include any product, document, statement, bill, financial record, etc. that the user may wish to incorporate into his/her budget. In this way, the user may be able to capture an image of a product and/or budget data, and the product and/or budget data in that image may be provided to the user's wish list and/or the user's budget, such that the user has real-time wish list and budget data available. For example, if a user wishes to purchase a vehicle, the user may take an image of the vehicle, and the system may recognize the vehicle and provide data associated with the vehicle to the user's budget.
In this way, the user may be able to select an indicator associated with his/her budget and receive information about how purchasing the vehicle may affect his/her budget in the future, the best payment mechanism to purchase the vehicle, how to receive a loan to purchase the vehicle, and/or whether purchasing the vehicle is recommended based on the user's budgeting strategy. In other embodiments, products and/or budget data that have been previously determined may be recognized when images of the product and/or budget data are located within the real-time video stream. For example, if a user has previously put a specific television on his/her wish list, the user may use his/her real-time video stream to scan a planogram, down a street, a virtual store, etc. The system may then identify the television and its location within the planogram, the merchant location on the street, and/or the location in the virtual store, such that the user may be able to locate the products on his/her wish list using the system.
In some embodiments, the system may populate a budget and/or budget strategies. In some embodiments, the user may input, via an interface or the like, information to set up a budget and/or budget strategy. In other embodiments, the system may automatically set up a user's budget and/or budget strategy. In yet other embodiments, a combination of user input and automatic system input may be utilized to set up a budget and/or budget strategy. A user budget provides a financial plan that forecasts revenues and expenditures, such that a model of financial performance may be obtained if certain strategies, events, and/or plans are carried out. Budgets may include plans for saving, borrowing, and/or spending. The user may input information, such as expenses, retirement, goals, future purchases, income, etc., all of which may be part of the user's budget. In some embodiments, the user may partially set up his/her budget. In this way, the user may select predetermined budgeting. For example, the user may indicate to the system that he/she wants to purchase a house a year from now. The system may determine a budget for the user such that he/she may be able to provide the down payment on a house. In other embodiments, the budget and budget strategies may be set up automatically by the system. The budgets that may be set up via the system include, but are not limited to, multiple budgets, such as, but not limited to, macro, category, commercial, micro, group, sales, production, cash flow, marketing, project expenditure, business, and/or other budgets a user may desire to produce.
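The house down-payment example above amounts to simple arithmetic: spread the outstanding target over the months remaining. A hedged sketch follows, with all function names and dollar figures purely hypothetical:

```python
def monthly_savings_needed(target_amount, months, already_saved=0.0):
    """Amount to set aside each month to reach a savings goal on time.

    Hypothetical helper illustrating the kind of calculation a budget
    strategy might perform; not an actual component of the system.
    """
    remaining = max(target_amount - already_saved, 0.0)
    return remaining / months

# e.g., a $24,000 down payment needed in 12 months, $6,000 already saved
needed = monthly_savings_needed(24_000, 12, already_saved=6_000)
```

A real budget strategy would also weigh income, recurring expenses, and the other factors the system considers; this shows only the goal-to-monthly-amount step.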
The system may aid the user or completely set up the budget for the user utilizing several factors, including but not limited to the user's financial plans, the user's financial history, similarly situated individuals' financial history, crowd sourcing data, financial advisor recommendations, and/or the like. Financial plans of a user may include financial goals, payment strategies, tax strategies, personal planning, loan repayment (e.g., student loan repayment, car loan repayment, etc.), paying off credit card debt (e.g., paying off one credit card with higher interest rates faster than other debt), mortgage repayment, and/or the like. The financial goals of the user may include savings goals, such as saving for a child's college, investments, saving to reach a specific amount, saving to purchase a specific product, retirement savings goals, etc. Personal planning may include vacation planning, job loss planning, emergency planning, and social networking data.
User financial history includes payment strategies the user may have previously implemented, previous earnings, previous purchasing habits, previously used purchase methods, previous bills, etc. that may be used by the system to determine a budget that may be suitable for the user to implement. Similarly situated individuals' financial history includes the system reviewing and determining budgets for individuals that are in a similar financial, locale, age, etc. position as the user. In this way, the user receives aggregate data for other individuals that are similarly situated to provide the user a budget.
In some embodiments, the system may populate a user's wish list. In some embodiments, the wish list may be populated manually by the user, using an interface and/or the like. In some embodiments, the wish list may be set up automatically by the system for the user. In yet other embodiments, the wish list may be set up via a combination of user input and system automatic wish list population. The user's wish list is a list that includes products that the user may wish to purchase or is planning to purchase in the future. These products may then be monitored for price changes, discounts, merchants providing the product, similar products, comments/reviews of the product, promotional offers for the product, accompanying products, and/or the like. The real-time video image of the product may be incorporated into the user's wish list such that upon selection of the wish list, information about the product will be available to the user in real-time. In this way, the system may be able to receive images of the products the user may wish to provide to the user's wish list, identify those products and update the user's wish list with those products.
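The wish-list monitoring described above can be pictured as comparing the price recorded when a product was added against the latest observed price. An illustrative sketch, with all product names and prices hypothetical:

```python
# Hypothetical sketch of wish-list price monitoring: report items whose
# latest observed price has dropped below the price recorded on the list.
def price_alerts(wish_list, latest_prices):
    """Yield (product, listed_price, current_price) for price drops."""
    for product, listed_price in wish_list.items():
        current = latest_prices.get(product, listed_price)
        if current < listed_price:
            yield product, listed_price, current

wish_list = {"ACME TV": 499.99, "laptop": 1200.00}
latest = {"ACME TV": 449.99, "laptop": 1250.00}
alerts = list(price_alerts(wish_list, latest))  # only the TV dropped
```

The same comparison loop could be extended to the other monitored attributes the paragraph lists, such as discounts, promotional offers, or merchant availability.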
In some embodiments, the system provides a directory that comprises both the user budget and the user wish list. In this way, the products and/or budget data that are captured via real-time video, photograph, still image, internet, and/or the like may be provided to either the budget, the wish list, and/or a combination thereof. In this way, information may be exchanged between the budget and/or the wish list. For example, if the user selects a vehicle to be on his/her wish list, then the system may also provide the user with an updated budget that includes the vehicle incorporated into the budget.
In some embodiments, the system may identify specific information about the product and/or budget data, such as the product, price, transaction, entity associated with the product and/or budget data, etc. In this way, the system may be able to identify the product and/or budget data using the image provided to the system from the real-time video stream. Using the specific information identified about the product and/or the budget data, the system may be able to populate the user's directory, including the user's budget and/or the user's wish list, with the specific information. In this way, the system may receive a real-time video image of a product, for example, a television. The system may then identify the specifics about the television, such as, but not limited to the brand, manufacturer, model, merchants selling the television, prices, discounts, comments/reviews, etc. The system may then provide the user directory with this data such that the user's budget and/or wish list may be populated.
In some embodiments, the system may provide selectable indicators associated with the product and/or budget data in the real-time video stream. One or more indicators are presented on the display of the mobile device in conjunction with the real-time video stream. Each of the indicators corresponds with an image determined to be a product and/or budget data. The indicator may take various forms, such as a display of a tag, a highlighted area, a hot-spot, and/or the like. In some embodiments, the indicator is a selectable indicator, such that the user may select (e.g., click-on, hover-over, touch the display, provide a voice command, etc.) the product, budget data, or indicator to provide display of specific information related to the product and/or budget data, including for instance real-time information about the user's budget and/or wish list and the potential impact that the product and/or budget data may have on the user's budget and/or wish list. In other embodiments, the indicator itself may provide the information or a portion of the information. For example, a user may wish to purchase a television; the user may use real-time vision object recognition to recognize the television within an aisle at a retail store. The real-time vision object recognition may consider the location at a specific retail store, the characteristics of the television such as brand, quality, etc., and the price of the television. The user may select the indicator. In some embodiments, the selected indicator may display what the user's budget may look like in the future based on the user's potential purchase of the product. In other embodiments, the selected indicator may display the user's wish list with the product incorporated into the wish list.
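A selectable indicator of the kind described above pairs a recognized product with the budget-impact figure shown on selection. The following sketch is purely illustrative; the class, its fields, and the impact metric (fraction of a year's budget surplus the purchase would consume) are assumptions, not the disclosed implementation:

```python
from dataclasses import dataclass

# Hypothetical sketch: a selectable indicator superimposed over the video
# stream, reporting the impact a purchase would have on a user's budget.
@dataclass
class Indicator:
    label: str
    price: float

    def budget_impact(self, monthly_surplus, months=12):
        """Fraction of the surplus over `months` this purchase consumes."""
        return self.price / (monthly_surplus * months)

tv = Indicator(label="ACME 50-inch TV", price=600.0)
impact = tv.budget_impact(monthly_surplus=500.0)  # 600 / 6000
```

On selection, the display layer would render `label` as the tag or hot-spot text and format `budget_impact` into the "what your budget may look like" view.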
Furthermore, the display of the real-time video stream on a mobile device may also provide the user with a direct video or audio conference with the user's financial advisor. In this way, the user may be provided instant advisement from his/her financial advisor when making major purchases or when facing significant financial decisions.
Finally, the system may provide the user with the ability to purchase a product located within the real-time video stream. In this way, the user may select his/her budget and/or wish list and purchase the product. The product may be purchased on-line, through a merchant, etc. using a mobile device.
Embodiments of the invention relate to systems, methods, and computer program products for populating a user directory, comprising: receiving financial information for a user, wherein the financial information provides an indication of financial history of the user; identifying, by a computer device processor, one or more products proximate in location to a mobile device; determining a location in the user directory to populate with the one or more products, wherein the user directory comprises at least one of a user budget or a user wish list; receiving information about one or more products in the user directory, wherein the information about one or more products comprises a price associated with the one or more products; presenting, via the mobile device, a selectable indicator associated with the product, wherein the indicator provides a real-time impact on the user directory based at least in part on the information about the one or more products identified proximate in location to the mobile device; and providing access to the user directory by selection of the indicator associated with the product, such that the at least one of the user budget or the user wish list is accessible.
In some embodiments, the invention may further comprise providing the user with real-time loan procurement for purchasing the one or more products proximate in location to the mobile device, wherein the real-time loan procurement is based at least in part on the receiving of financial information for a user.
In some embodiments, the invention may further comprise providing a communicable link between the user and a financial advisor, via the mobile device, such that the financial advisor can advise the user based on the real-time impact that purchasing one or more products proximate in location to the mobile device would have on the user budget and the user wish list. In some embodiments, the real-time impact on the user directory further comprises the real-time impact that purchasing the one or more products proximate in location to the mobile device would have on the at least one of the user budget or the user wish list.
In some embodiments, identifying one or more products further comprises capturing a tag located on or proximate to one or more of the products and reading the tag to identify the product. Identifying one or more products may further comprise capturing real-time video of the one or more products. Identifying one or more products may still further comprise capturing, via the mobile device, images of the one or more products. Capturing images further comprises implementing, via a computer device processor, object recognition processing to identify one or more images that correspond to one or more products.
In some embodiments, the user directory further comprises the user budget and the user wish list.
In some embodiments, the invention may further comprise developing a budget strategy, wherein the budget strategy is based at least in part on said receiving financial information for the user, wherein the receiving financial information comprises at least one of user transaction history, user income, user financial plans, or user personal plans.
In some embodiments, presenting an indicator associated with the product further comprises superimposing the indicator over real-time video that is captured by the mobile device, wherein the indicator is selectable by the user. In some embodiments, one or more products further comprise budget data, wherein budget data includes financial documents that influence the user budget or user wish list.
The features, functions, and advantages that have been discussed may be achieved independently in various embodiments of the present invention or may be combined with yet other embodiments, further details of which can be seen with reference to the following description and drawings.
Having thus described embodiments of the invention in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
Embodiments of the present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all, embodiments of the invention are shown. Indeed, the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like numbers refer to like elements throughout. Where possible, any terms expressed in the singular form herein are meant to also include the plural form and vice versa, unless explicitly stated otherwise. Also, as used herein, the term “a” and/or “an” shall mean “one or more,” even though the phrase “one or more” is also used herein. Although some embodiments of the invention herein are generally described as involving a “financial institution,” one of ordinary skill in the art will appreciate that other embodiments of the invention may involve other businesses that take the place of or work in conjunction with the financial institution to perform one or more of the processes or steps described herein as being performed by a financial institution. Still in other embodiments of the invention the financial institution described herein may be replaced with other types of businesses that offer payment account systems to users. Furthermore, as used herein, the term “product” shall mean any good, service, event, etc. that may be provided to populate a user wish list and/or a budget. Finally, the term “budget data” as used herein shall mean any product, document, statement, bill, financial record, etc. that may be used to populate a user budget and/or wish list. In this way, budget data may include any data that the user may wish to include in his/her budget or use to set up his/her budget.
While embodiments discussed herein are generally described with respect to “real-time video streams” or “real-time video” it will be appreciated that the video stream may be captured and stored for later viewing and analysis. Indeed, in some embodiments video is recorded and stored on a mobile device and portions or the entirety of the video may be analyzed at a later time. The later analysis may be conducted on the mobile device or loaded onto a different device for analysis. The portions of the video that may be stored and analyzed may range from a single frame of video (e.g., a screenshot) to the entirety of the video. Additionally, rather than video, the user may opt to take a still picture of the environment to be analyzed immediately or at a later time. Embodiments in which real-time video, recorded video or still pictures are analyzed are contemplated herein.
Furthermore, embodiments of the invention generally describe a “user directory” or “directory.” It will be appreciated by one of ordinary skill in the art that a directory may comprise any data that may be required for a wish list and/or budget. Furthermore, the term directory as described herein may include a budget, a wish list, and/or a combination thereof.
Next, in block 104 a determination is made as to which images from the real-time video stream are associated with products and/or budget data to populate a user directory, the user directory comprising a user wish list and/or a user budget. The determination is made by analyzing the real-time video stream for objects, logos, artwork, and/or other product-indicating or business-indicating features to determine what the products and/or budget data are within the video stream and to then incorporate the product and/or budget data into the user's directory, such that the product and/or budget data may be incorporated into the user's wish list and/or the user's budget. In some embodiments, the product and/or budget data may be provided via real-time video stream, voice communications, text communications, and/or the like. In some embodiments, a wish list is a list that includes products that the user may wish to purchase or is planning to purchase in the future. These products may then be monitored for price changes, discounts, etc., as well as being able to incorporate products on the user's wish list into the user's budget, such that the user may receive an indication as to the impact of purchasing a product on the wish list with respect to his/her future financial state. In some embodiments, a budget provides a financial plan that forecasts revenues and expenditures, such that a model of financial performance may be obtained if certain strategies, events, and/or plans are carried out. Budgets may include plans for saving, borrowing, and/or spending. Budgets may be utilized for macro, category, commercial, micro, group, sales, production, cash flow, marketing, project expenditure, business, and/or any other potential budgeting requirement. The budgets may be used by any user, including but not limited to individuals, families, businesses, and/or other entities.
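The routing performed in block 104 can be sketched as a simple dispatch: each recognized item is classified as a product or as budget data and sent to the corresponding part of the directory. The function and field names below are hypothetical illustrations of that flow:

```python
# Hypothetical sketch of block 104: route recognized items into the user
# directory, which comprises a wish list and a budget. Names illustrative.
def populate_directory(recognized_items):
    directory = {"wish_list": [], "budget": []}
    for item in recognized_items:
        if item.get("kind") == "product":
            directory["wish_list"].append(item["name"])
        elif item.get("kind") == "budget_data":
            directory["budget"].append(item["name"])
    return directory

items = [{"kind": "product", "name": "ACME TV"},
         {"kind": "budget_data", "name": "electric bill"}]
directory = populate_directory(items)
```

As the paragraph notes, in some embodiments a product could be routed to both lists at once (e.g., a wish-list item also incorporated into the budget); the two-way split here shows only the simplest case.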
Thereafter, at block 106 one or more indicators are presented on the display of the mobile device in conjunction with the real-time video stream. The indicator may take various forms, such as display of a tag, a highlighted area, a hot-spot, or the like. In specific embodiments, the indicator is a selectable indicator, such that a user may select (e.g., click-on, hover-over, touch the display, provide a voice command, and/or the like) the product or indicator to provide display of specific information related to the product, including but not limited to budgets with that product incorporated in, budget impact data, financing options for the product, true cost of credit if the product is purchased, user wish list incorporating the product, and/or the like. In some embodiments, the indicator itself may provide the information or a portion of the information to the user.
The network 201 may be a global area network (GAN), such as the Internet, a wide area network (WAN), a local area network (LAN), or any other type of network or combination of networks. The network 201 may provide for wireline, wireless, or a combination wireline and wireless communication between devices on the network.
In some embodiments, the user 202 is an individual. In other embodiments, the user 202 may be associated with a business, company, and/or other entity that may wish to utilize the system for budgeting and/or wish list functionality. The user 202 may be at a retail store, near a service center, at his/her home, and/or within real-time video range of any product and/or budget data the user 202 may wish to provide to the directory of the user 202 to incorporate into a wish list and/or budget.
As illustrated in
The processing device 212 is operatively coupled to the communication device 210 and the memory device 216. The processing device 212 uses the communication device 210 to communicate with the network 201 and other devices on the network 201, such as, but not limited to the mobile device 204. As such, the communication device 210 generally comprises a modem, server, or other device for communicating with other devices on the network 201.
In some embodiments, the processing device 212 may also be capable of operating one or more applications, such as one or more applications functioning as an artificial intelligence (“AI”) engine. The processing device 212 may recognize objects that it has identified in prior uses by way of the AI engine. These objects may include products and/or budged data. In this way, the processing device 212 may recognize specific objects and/or classes of objects, and store information related to the recognized objects in one or more memories and/or databases discussed herein. Once the AI engine has thereby “learned” of an object and/or class of objects, the AI engine may run concurrently with and/or collaborate with other modules or applications described herein to perform the various steps of the methods discussed. For example, in some embodiments, the AI engine recognizes an object that has been recognized before and stored by the AI engine. The AI engine may then communicate to another application or module of the mobile device and/or server, an indication that the object may be the same object previously recognized. In this regard, the AI engine may provide a baseline or starting point from which to determine the nature of the object. In other embodiments, the AI engine's recognition of an object is accepted as the final recognition of the object.
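The AI engine's "learning" described above can be pictured as a recognition cache: once an object has been identified, later sightings start from the stored identification rather than from scratch. The class below is an illustrative assumption, not the disclosed engine; the `classify` callable stands in for whatever full recognition step the system would otherwise run:

```python
# Hypothetical sketch of the AI engine's recognition memory: previously
# identified objects are cached so later sightings reuse the prior result.
class AIEngine:
    def __init__(self):
        self._known = {}  # object fingerprint -> stored identification

    def recognize(self, fingerprint, classify):
        """Return a cached identification, or classify and remember it."""
        if fingerprint not in self._known:
            self._known[fingerprint] = classify(fingerprint)
        return self._known[fingerprint]

engine = AIEngine()
first = engine.recognize("obj-123", classify=lambda f: "television")
# Second sighting: the cache answers; the classifier is not consulted.
second = engine.recognize("obj-123", classify=lambda f: "unused")
```

This mirrors the two modes in the paragraph: the cached result may serve as a baseline for further analysis, or be accepted outright as the final recognition.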
As further illustrated in
In the embodiment illustrated in
In some embodiments, the directory comprises a budget and a wish list for the user 202. A user budget may provide the user 202 with a real-time indication as to how purchasing a product may affect the user's budget, a future budget based on the purchase of the product, etc. In some embodiments, the wish list may provide the user 202 with real-time indications as to the user's current wish list, current prices for products on the wish list, etc.
Through the use of real-time vision object recognition, objects, logos, artwork, products, locations, and other features can be recognized in a real-time video stream, a still image, a photograph, a vehicle recognized image, etc. The objects can be matched to data associated with those objects, logos, artwork, products, locations, or other features to determine products and/or budget data located within those images. In this way, the financial institution application 224 may recognize and identify products and/or budget data located within a real-time video stream or the like.
In some embodiments, as described in further detail below, the financial institution application 224 may recognize a marker 230 and/or objects 220 within an environment 250. The marker 230 may be interpreted with respect to data in the memory device 216 and be recognized as possible products and/or services that may be available to the user 202. In this way, the financial institution server 208 provides marker 230 interpretation and analysis with respect to the data on the financial institution server 208.
The financial institution application 224 may then analyze the objects to determine which products in the real-time video stream are products that the user 202 may wish to add to his/her wish list and/or budget. The product and/or budget data may be determined from the objects in the images provided to the system via real-time video stream, still image, photograph, vehicle recognized image, etc. Products may include any good, service, etc. that a user may wish to purchase. For example, a user may provide real-time video images of a television that the user wishes to purchase. Budget data may include any product, document, statement, bill, financial record, etc. that the user may wish to incorporate into his/her budget. In this way, the user 202 may be able to capture an image of a product and/or budget data, and the product and/or budget data in that image may be provided to the user's wish list and/or the user's budget, such that the user 202 has real-time wish list and budget data available. For example, if a user 202 wishes to purchase a vehicle, the user 202 may take an image of the vehicle, and the financial institution application 224 may recognize the vehicle and provide data associated with the vehicle to the user's budget. In this way, the user 202 may be able to receive information about how purchasing the vehicle may affect his/her budget in the future, the best payment mechanism to purchase the vehicle, how to receive a loan to purchase the vehicle, and/or whether purchasing the vehicle is recommended based on the user's budgeting strategy. In other embodiments, products and/or budget data that have been previously determined may be recognized when images of the product and/or budget data are located within the real-time video stream. For example, if a user 202 has previously put a specific television on his/her wish list, the user 202 may use his/her real-time video stream to scan a planogram, down a street, a virtual store, etc.
The financial institution application 224 may then identify the television and the location within the planogram, merchant location on the street, and/or location in the virtual store, such that the user 202 may be able to locate the products on his/her wish list using the system.
The financial institution application 224 may then add the products and/or budget data associated with the objects in the real-time video stream to the user's directory. In some embodiments, the financial institution application 224 may identify specific information about the product and/or budget data, such as the product, price, transaction, entity associated with the product and/or budget data, etc. Using the specific information identified about the product and/or the budget data, the financial institution application 224 may be able to populate the user's directory, including the user's budget and/or the user's wish list, with the specific information. For example, the financial institution application 224 may receive a real-time video image of a product, for example, a television. The financial institution application 224 may then identify the specifics about the television, such as, but not limited to the brand, manufacturer, model, merchants selling the television, prices, discounts, comments/reviews, etc. The financial institution application 224 may then provide the user directory with this data such that the user's budget and/or wish list may be populated.
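The television example above gathers identified specifics (brand, model, observed prices) into one directory record that both the budget and the wish list can draw on. A brief sketch with hypothetical names and values:

```python
# Hypothetical sketch: build one directory entry from the specifics the
# application identifies about a product. All names/values illustrative.
def build_entry(product_name, specifics):
    entry = {"product": product_name}
    entry.update(specifics)  # brand, model, merchant prices, etc.
    return entry

entry = build_entry("television", {"brand": "ACME", "model": "X-50",
                                   "prices": [499.99, 449.99]})
best_price = min(entry["prices"])  # e.g., shown when populating the budget
```

The wish list might surface the full record (merchants, reviews, discounts), while the budget side might use only the best observed price.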
The directory may be stored in the memory device 216 of the financial institution server 208. The directory comprises both the user budget and the user wish list. In this way, the products and/or budget data that are captured via real-time video, photograph, still image, the Internet, and/or the like may be provided to the budget, the wish list, and/or a combination thereof. In this way, information may be exchanged between the budget and/or the wish list within the directory. For example, if the user 202 selects a vehicle to be on his/her wish list, then the system may also provide the user with an updated budget that includes the vehicle incorporated into the budget.
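The information exchange between the budget and the wish list within the directory can be sketched as follows. This is an illustrative sketch only: the class and field names (and the income figure) are hypothetical assumptions, not taken from the specification.

```python
# Illustrative sketch: adding a wish-list item also updates the budget, so the
# two views of the directory stay consistent. All names here are hypothetical.

class Directory:
    """Holds a user's budget and wish list and keeps them in sync."""

    def __init__(self, monthly_income):
        self.wish_list = []  # products the user wants to purchase
        self.budget = {"income": monthly_income, "planned_purchases": 0.0}

    def add_to_wish_list(self, product, price):
        # The wish-list entry is mirrored into the budget as a planned purchase.
        self.wish_list.append({"product": product, "price": price})
        self.budget["planned_purchases"] += price

    def remaining_funds(self):
        return self.budget["income"] - self.budget["planned_purchases"]

directory = Directory(monthly_income=3000.0)
directory.add_to_wish_list("vehicle down payment", 1200.0)
```

In this sketch, placing the vehicle on the wish list immediately reduces the funds the budget reports as available, mirroring the updated-budget example above.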
The financial institution application 224 may populate the user's budget and/or wish list by directing information to the directory in the memory device 216 of the financial institution server 208. In some embodiments, the financial institution application 224 may populate the budget within the directory with information about products and/or budget data determined from the real-time video stream. In other embodiments, the financial institution application 224 may populate the wish list within the directory with information about products and/or budget data determined from the real-time video stream. In yet other embodiments, the financial institution application 224 may populate both the budget and the wish list within the directory with information about products and/or budget data determined from the real-time video stream.
In some embodiments, the financial institution application 224 may populate the user's budget within the directory. In some embodiments, the information that the financial institution application 224 utilizes to populate the budget may be information input by the user 202. The information may be provided to the financial institution application 224 via an interface or the like in order to populate the user's budget and/or the user's budget strategy. In other embodiments, the financial institution application 224 may automatically set up a user's budget and/or budget strategy. In yet other embodiments, a combination of user 202 inputs and financial institution application 224 automatic set up may be utilized to populate a budget and/or budget strategy.
The financial institution application 224 may aid the user 202 or completely set up and populate the budget for the user 202 utilizing several factors, including, but not limited to, the user's financial plans, the user's financial history, similarly situated individuals' financial history, crowd sourcing data, financial advisor recommendations, and/or the like. Financial plans of a user may include financial goals, payment strategies, tax strategies, personal planning, loan repayment (e.g., student loan repayment, car loan repayment, etc.), paying off credit card debt (e.g., paying off one credit card with higher interest rates faster than other debt), mortgage repayment, and/or the like. The financial goals of the user may include savings goals, such as saving for a child's college, investments, saving to reach a specific amount, saving to purchase a specific product, retirement savings goals, etc. Personal planning may include vacation planning, job loss planning, emergency planning, and social networking data. The financial institution application 224 may use the communication device 210 to communicate with the other systems on the network, such as, but not limited to, mobile devices 204. In this way, the financial institution application 224 may utilize the financial institution server 208 and the other systems on the network 201 to determine the information listed above in order to set up and/or populate the budget.
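One way to combine the factor sources named above into a populated budget can be sketched as a simple precedence merge: peer data is the weakest signal, the user's own history overrides it, and the user's stated plans override both. The function and category names are hypothetical illustrations, not the specification's method.

```python
# Hypothetical sketch: merge the budget factor sources described above, with
# stronger signals overriding weaker ones. All names and values are illustrative.

def suggest_budget(user_plans, user_history, peer_averages):
    """Merge factor sources; user plans override history, which overrides peers."""
    budget = dict(peer_averages)   # weakest: similarly situated individuals
    budget.update(user_history)    # stronger: the user's own financial history
    budget.update(user_plans)      # strongest: the user's stated financial plans
    return budget

suggested = suggest_budget(
    user_plans={"student_loan": 400},
    user_history={"groceries": 350, "student_loan": 300},
    peer_averages={"groceries": 420, "entertainment": 150},
)
```

Here the student-loan category reflects the user's plan (400) rather than the historical figure, while the entertainment category is filled in from peer data because the user supplied nothing for it.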
In some embodiments, the financial institution application 224 may set up and populate a user's wish list. In some embodiments, the financial institution application 224 may allow the user 202 to populate the wish list, using an interface, such as the one illustrated in
In some embodiments, the financial institution application 224 may update the user's directory in real-time to incorporate the products and/or budget data that are located in the real-time video stream. In this way, the financial institution application 224 may incorporate the products and/or budget data into the directory, such that the financial institution application 224 may provide the user 202 with real-time updates to the user's budget and/or user's wish list. For example, if the user 202 is taking a real-time video stream of a vehicle, the vehicle may be recognized by the financial institution application 224 and information about the vehicle may be incorporated into the user's budget and/or wish list.
The financial institution application 224 may also provide the user 202 with a direct video or audio conference with the user's financial advisor. In this way, the user 202 may receive instant advice from his/her financial advisor when making major purchases or when the user 202 is facing significant financial decisions.
The financial institution application 224 may provide the user 202 with the ability to purchase a product located within the real-time video stream. In this way, the user 202 may select a product from his/her budget and/or wish list and purchase the product using his/her mobile device 204.
The financial institution application 224 may also allow the user 202 to set up and edit personal preferences associated with the directory. In this way, the financial institution application 224 allows the user 202 to set up the type of budget, wish list, etc. that may fit his/her needs.
As further illustrated is
The environment 250 contains a number of objects 220. Objects 220 include, but are not limited to, products and/or services for which the user 202 may wish to view a recommended appropriate payment account. For example, an object 220 may be a product, such as a television, vehicle, computer, etc., or an object 220 may be budget data, such as a bill, statement, check, etc. Some of such objects 220 may include a marker 230 identifiable by the mobile device 204 or an application accessible through the mobile device 204. A marker 230 may be any type of marker that is a distinguishing feature that can be interpreted to identify specific objects 220. In some embodiments, the marker 230 may be interpreted by the mobile device 204. In other embodiments, the marker 230 may be interpreted by the financial institution server 208. In yet other embodiments, the marker 230 may be interpreted by both the mobile device 204 and the financial institution server 208. For instance, a marker 230 may be alpha-numeric characters, a symbol, a logo, a shape, a ratio of the size of one feature to another feature, a product-identifying code such as a bar code, electromagnetic radiation such as radio waves (e.g., radio frequency identification (RFID)), architectural features, color, etc. In some embodiments, the marker 230 may be audio, and the mobile device 204 may be capable of utilizing audio recognition to identify words or unique sounds broadcast by the products, service, location, merchant, etc. The marker 230 may be any size, shape, etc. Indeed, in some embodiments, the marker 230 may be very small relative to the object 220, such as the alpha-numeric characters that identify the name or model of an object 220, whereas, in other embodiments, the marker 230 is the entire object 220, such as its unique shape, size, structure, etc.
In some embodiments, the marker 230 is not actually a physical marker located on or being broadcast by the object 220. For instance, the marker 230 may be some type of identifiable feature that is an indication that the object 220 is nearby. In some embodiments, the marker 230 for an object 220 may actually be the marker 230 for a different object 220. For example, the mobile device 204 may recognize a particular building as being “Building A.” Data stored in the data storage 371 may indicate that “Building B” is located directly to the east and next to “Building A.” Thus, markers 230 for an object 220 that are not located on or being broadcast by the object 220 are generally based on fixed facts about the object 220 (e.g., “Building B” is next to “Building A”). However, it is not a requirement that such a marker 230 be such a fixed fact. The marker 230 may be anything that enables the mobile device 204 and/or the financial institution application 224 to interpret to a desired confidence level what the object is. For example, the mobile device 204, object recognition application 325, and/or AR presentation application 321 may be used to identify a particular person as a first character from a popular show, and thereafter utilize the information that the first character is nearby, along with features of other characters, to interpret that a second character, a third character, etc. are nearby, whereas without the identification of the first character, the features of the second and third characters may not have been used to identify the second and third characters.
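The "fixed fact" inference described above can be sketched as a lookup in an adjacency table: recognizing one object directly lets the system infer that its known neighbors are nearby. The table below stands in for the data storage 371 and is a hypothetical illustration.

```python
# Sketch of inferring nearby objects from a directly recognized one, per the
# Building A / Building B example above. The adjacency table is hypothetical.

ADJACENCY = {"Building A": ["Building B"]}  # Building B is next to Building A

def infer_nearby(recognized_object):
    """Given a directly recognized object, return objects inferred to be nearby."""
    return ADJACENCY.get(recognized_object, [])

nearby = infer_nearby("Building A")
```

Recognizing "Building A" thus yields "Building B" as a candidate identification even though no marker for Building B was observed directly.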
The marker 230 may also be, or include, social network data, such as data retrieved or communicated from the Internet, such as tweets, blog posts, social networking site posts, various types of messages, and/or the like. In other embodiments, the marker 230 is provided in addition to social network data as mentioned above. For example, the mobile device 204 may capture a video stream and/or one or more still shots of a large gathering of people. In this example, as above, one or more people dressed as characters in costumes may be present at a specified location. The mobile device 204, object recognition application 325, and/or the AR presentation application 321 may identify several social network indicators, such as posts, blogs, tweets, messages, and/or the like indicating the presence of one or more of the characters at the specified location. In this way, the mobile device 204 and associated applications may communicate information regarding the social media communications to the user and/or use the information regarding the social media communications in conjunction with other methods of object recognition. For example, the mobile device 204, object recognition application 325, and/or the AR presentation application 321 performing recognition of the characters at the specified location may confirm that the characters being identified are in fact the correct characters based on the retrieved social media communications. This example may also be applied to objects other than people.
In some embodiments, the mobile device and/or server accesses one or more other servers, social media networks, applications and/or the like in order to retrieve and/or search for information useful in performing an object recognition. In some embodiments, the mobile device and/or server accesses another application by way of an application programming interface or API. In this regard, the mobile device and/or server may quickly search and/or retrieve information from the other program without requiring additional authentication steps or other gateway steps.
While
In some embodiments, a marker 230 may be the location of the object 220. In such embodiments, the mobile device 204 may utilize GPS software to determine the location of the user 202. As noted above, a location-based marker 230 could be utilized in conjunction with other non-location-based markers 230 identifiable and recognized by the mobile device 204 to identify the object 220. However, in some embodiments, a location-based marker 230 may be the only marker 230. For instance, in such embodiments, the mobile device 204 may utilize GPS software to determine the location of the user 202 and a compass device or software to determine what direction the mobile device 204 is facing in order to identify the object 220. In still further embodiments, the mobile device 204 does not utilize any GPS data in the identification. In such embodiments, markers 230 utilized to identify the object 220 are not location-based.
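The purely location-based identification described above (GPS position plus compass heading, with no visual marker) can be sketched as picking the known object whose bearing from the user best matches the direction the device is facing. The coordinates, object names, and bearing tolerance below are illustrative assumptions.

```python
# Minimal sketch of a location-based marker: given the device's GPS position
# and compass heading, return the known object the device is facing.
import math

KNOWN_OBJECTS = {
    "coffee shop": (40.0005, -75.0000),   # hypothetical (lat, lon) entries
    "bank branch": (40.0000, -74.9995),
}

def bearing_to(user, target):
    """Compass bearing in degrees from user to target (flat-earth approximation)."""
    d_lat = target[0] - user[0]
    d_lon = target[1] - user[1]
    return math.degrees(math.atan2(d_lon, d_lat)) % 360

def identify_by_location(user_pos, device_heading, tolerance=15.0):
    """Return the object within `tolerance` degrees of the device heading, if any."""
    for name, pos in KNOWN_OBJECTS.items():
        diff = abs((bearing_to(user_pos, pos) - device_heading + 180) % 360 - 180)
        if diff <= tolerance:
            return name
    return None

facing = identify_by_location((40.0000, -75.0000), device_heading=0.0)
```

With the user facing north (heading 0°), the object due north is identified; turning to a 90° heading would instead match the object to the east. A production system would of course use proper geodesic math rather than this flat-earth approximation.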
The mobile device 204 may generally include a processing device 310 communicably coupled to such devices as a memory device 320, user output devices 336, user input devices 340, a network interface 360, a power source 315, a clock or other timer 350, a camera 370, a positioning system device 375, one or more chips 380, etc.
In some embodiments, the mobile device 204 and/or the server access one or more databases or datastores (not shown) to search for and/or retrieve information related to the object and/or marker. In some embodiments, the mobile device 204 and/or the server access one or more datastores local to the mobile device 204 and/or server and in other embodiments, the mobile device 204 and/or server access datastores remote to the mobile device 204 and/or server. In some embodiments, the mobile device 204 and/or server access both a memory and/or datastore local to the mobile device 204 and/or server as well as a datastore remote from the mobile device 204 and/or server.
The processing device 310 may include functionality to operate one or more software programs or applications, which may be stored in the memory device 320. For example, the processing device 310 may be capable of operating a connectivity program, such as a web browser application 322. The web browser application 322 may then allow the mobile device 204 to transmit and receive web content, such as, for example, location-based content and/or other web page content, according to a Wireless Application Protocol (WAP), Hypertext Transfer Protocol (HTTP), and/or the like.
The processing device 310 may also be capable of operating applications, such as an object recognition application 325 and/or an AR presentment application 321. The object recognition application 325 and/or AR presentment application 321 may be downloaded from a server and stored in the memory device 320 of the mobile device 204. Alternatively, the object recognition application 325 and/or AR presentment application 321 may be pre-installed and stored in a memory in the chip 380. In such an embodiment, the user 202 may not need to download the object recognition application 325 and/or the AR presentment application 321 from a server. In still other embodiments, the object recognition application 325 and/or the AR presentment application 321 may remain at the server, such as the financial institution server 208, within the financial institution application 224.
The object recognition application 325 provides the mobile device 204 with object recognition capabilities. In this way, objects 220, such as products and/or the like, may be recognized based on the object 220 itself and/or markers 230 associated with the objects 220. This is described in further detail below with respect to
The AR presentment application 321 provides the mobile device 204 with AR capabilities. In this way, the AR presentment application 321 may provide superimposed indicators related to the object 220 in the real-time video stream, such that the user 202 may have access to the targeted offers by selecting an indicator superimposed on the real-time video stream. The AR presentment application 321 may communicate with the other devices on the network 201 to provide the user 202 with indications associated with targeted offers for objects 220 in the real-time video display. The presentation and selection of indicators provided to the user 202 via the AR presentment application 321 is described in further detail below with respect to
In some embodiments, the processor 310 may also be capable of operating one or more applications, such as one or more applications functioning as an artificial intelligence (“AI”) engine. The processor 310 may recognize objects that it has identified in prior uses by way of the AI engine. In this way, the processor 310 may recognize specific objects and/or classes of objects, and store information related to the recognized objects in one or more memories and/or databases discussed herein. Once the AI engine has thereby “learned” of an object and/or class of objects, the AI engine may run concurrently with and/or collaborate with other modules or applications described herein to perform the various steps of the methods discussed. For example, in some embodiments, the AI engine recognizes an object that has been recognized before and stored by the AI engine. The AI engine may then communicate to another application or module of the mobile device 204 and/or server, an indication that the object may be the same object previously recognized. In this regard, the AI engine may provide a baseline or starting point from which to determine the nature of the object. In other embodiments, the AI engine's recognition of an object is accepted as the final recognition of the object.
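The AI engine's behavior above (recognize once, then supply a baseline guess on later sightings) can be sketched as a cache keyed by some fingerprint of the object. The class, the string fingerprint format, and the example identification are all hypothetical illustrations.

```python
# Hedged sketch of the AI engine described above: a cache of previously
# recognized objects that supplies a baseline guess on later sightings.

class RecognitionCache:
    def __init__(self):
        self._seen = {}  # fingerprint -> previously confirmed identification

    def learn(self, fingerprint, identification):
        """Store a confirmed recognition for future use."""
        self._seen[fingerprint] = identification

    def baseline(self, fingerprint):
        """Return a starting-point guess if this object was recognized before."""
        return self._seen.get(fingerprint)

engine = RecognitionCache()
engine.learn("shape:flat-panel;logo:acme", "Acme 55-inch television")
guess = engine.baseline("shape:flat-panel;logo:acme")
```

The baseline result is a starting point for the other recognition modules, matching the text's note that the AI engine's answer may either seed further analysis or, in some embodiments, be accepted as the final recognition.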
The chip 380 may include the necessary circuitry to provide both object recognition and AR functionality to the mobile device 204. Generally, the chip 380 will include data storage 371 which may include data associated with the objects within a real-time video stream that the object recognition application 325 identifies as having a certain marker(s) 230. The chip 380 and/or data storage 371 may be an integrated circuit, a microprocessor, a system-on-a-chip, a microcontroller, or the like. As discussed above, in one embodiment, the chip 380 may provide both object recognition and/or the AR functionality to the mobile device 204.
Of note, while
The processing device 310 may be configured to use the network interface 360 to communicate with one or more other devices on a network 201 such as, but not limited to the financial institution server 208. In this regard, the network interface 360 may include an antenna 376 operatively coupled to a transmitter 374 and a receiver 372 (together a “transceiver”). The processing device 310 may be configured to provide signals to and receive signals from the transmitter 374 and receiver 372, respectively. The signals may include signaling information in accordance with the air interface standard of the applicable cellular system of the wireless telephone network that may be part of the network 201. In this regard, the mobile device 204 may be configured to operate with one or more air interface standards, communication protocols, modulation types, and access types. By way of illustration, the mobile device 204 may be configured to operate in accordance with any of a number of first, second, third, and/or fourth-generation communication protocols and/or the like. For example, the mobile device 204 may be configured to operate in accordance with second-generation (2G) wireless communication protocols IS-136 (time division multiple access (TDMA)), GSM (global system for mobile communication), and/or IS-95 (code division multiple access (CDMA)), or with third-generation (3G) wireless communication protocols, such as Universal Mobile Telecommunications System (UMTS), CDMA2000, wideband CDMA (WCDMA) and/or time division-synchronous CDMA (TD-SCDMA), with fourth-generation (4G) wireless communication protocols, and/or the like. The mobile device 204 may also be configured to operate in accordance with non-cellular communication mechanisms, such as via a wireless local area network (WLAN) or other communication/data networks.
The network interface 360 may also include an application interface 373 in order to allow a user 202 to execute some or all of the above-described processes with respect to the object recognition application 325, the AR presentment application 321 and/or the chip 380. The application interface 373 may have access to the hardware, e.g., the transceiver, and software previously described with respect to the network interface 360. Furthermore, the application interface 373 may have the ability to connect to and communicate with an external data storage on a separate system within the network 201. In some embodiments, the external AR data is stored in the memory device 216 of the financial institution server 208.
As described above, the mobile device 204 may have a user interface that includes user output devices 336 and/or user input devices 340. The user output devices 336 may include a display 330 (e.g., a liquid crystal display (LCD) or the like) and a speaker 332 or other audio device, which are operatively coupled to the processing device 310. The user input devices 340, which may allow the mobile device 204 to receive data from a user 202, may include any of a number of devices allowing the mobile device 204 to receive data from a user 202, such as a keypad, keyboard, touch-screen, touchpad, microphone, mouse, joystick, other pointer device, button, soft key, and/or other input device(s).
The mobile device 204 may further include a power source 315. Generally, the power source 315 is a device that supplies electrical energy to an electrical load. In some embodiments, the power source 315 may convert a form of energy such as solar energy, chemical energy, mechanical energy, etc. to electrical energy. Generally, the power source 315 in a mobile device 204 may be a battery, such as a lithium battery, a nickel-metal hydride battery, or the like, that is used for powering various circuits, e.g., the transceiver circuit, and other devices that are used to operate the mobile device 204. Alternatively, the power source 315 may be a power adapter that can connect a power supply from a power outlet to the mobile device 204. In such embodiments, a power adapter may be classified as a power source “in” the mobile device.
The mobile device 204 may also include a memory device 320 operatively coupled to the processing device 310. As used herein, memory may include any computer readable medium configured to store data, code, or other information. The memory device 320 may include volatile memory, such as volatile Random Access Memory (RAM) including a cache area for the temporary storage of data. The memory device 320 may also include non-volatile memory, which can be embedded and/or may be removable. The non-volatile memory may additionally or alternatively include an electrically erasable programmable read-only memory (EEPROM), flash memory or the like.
The memory device 320 may store any of a number of applications or programs which comprise computer-executable instructions/code executed by the processing device 310 to implement the functions of the mobile device 204 described herein. For example, the memory device 320 may include such applications as an object recognition application 325, an AR presentment application 321, a web browser application 322, an SMS application 323, an email application 324, etc.
The object recognition application 325 of the mobile device 204 may use any type of means in order to identify desired objects 220. For instance, the object recognition application 325 may utilize one or more pattern recognition algorithms to analyze objects in the environment 250 and compare with markers 230 in data storage 371 which may be contained within the mobile device 204 (such as within chip 380) or externally on a separate system accessible via the connected network 201, such as but not limited to the financial institution server 208. For example, the pattern recognition algorithms may include decision trees, logistic regression, Bayes classifiers, support vector machines, kernel estimation, perceptrons, clustering algorithms, regression algorithms, categorical sequence labeling algorithms, real-valued sequence labeling algorithms, parsing algorithms, general algorithms for predicting arbitrarily-structured labels such as Bayesian networks and Markov random fields, ensemble learning algorithms such as bootstrap aggregating, boosting, ensemble averaging, combinations thereof, and the like.
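The paragraph above lists many candidate algorithms; as one hedged illustration, the sketch below scores stored marker sets against features extracted from a video frame and returns the best overlap, a simple nearest-neighbor-style match rather than any specific algorithm from the list. The marker sets and feature strings are hypothetical.

```python
# Illustrative marker-matching sketch: compare frame features against stored
# markers (standing in for data storage 371) and return the best match.

STORED_MARKERS = {
    "television": {"rectangular", "screen", "brand-logo"},
    "vehicle": {"wheels", "windshield", "brand-logo"},
}

def classify(frame_features):
    """Return the stored object whose markers best overlap the frame features."""
    def overlap(item):
        _, markers = item
        return len(markers & frame_features)

    name, markers = max(STORED_MARKERS.items(), key=overlap)
    # No overlap at all means no identification.
    return name if markers & frame_features else None

result = classify({"rectangular", "screen", "stand"})
```

A real deployment would replace the set intersection with one of the trained classifiers named above (e.g., a support vector machine over extracted image features), but the control flow, comparing candidates and returning the strongest, stays the same.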
Upon identifying an object 220 within the real-time video stream via the object recognition application 325, the mobile device 204 is configured to superimpose a virtual image 400 on the mobile device display 330 utilizing the AR presentment application 321. The virtual image 400 is generally a tab or link displayed such that the user 202 may “select” the virtual image 400 and retrieve information related to the identified object. The information may include any desired information associated with the selected object and may range from basic information to greatly detailed information. In some embodiments, the virtual image 400 may provide the user 202 with an internet hyperlink to further information on the object 220. The information may include, for example, all types of media, such as text, images, clipart, video clips, movies, or any other type of information desired. In yet other embodiments, the information related to the identified object may be visualized by the user 202 without “selecting” the virtual image 400.
In embodiments in which the virtual image 400 provides an interactive tab to the user 202 utilizing the AR presentment application 321 of the mobile device 204, the user 202 may select the virtual image 400 by any conventional means for interaction with the mobile device 204. For instance, in some embodiments, the user 202 may utilize an input device 340 such as a keyboard to highlight and select the virtual image 400 in order to retrieve the information. In a particular embodiment, the mobile device display 330 includes a touch screen that the user may employ to select the virtual image 400 utilizing the user's finger, a stylus, or the like.
In some embodiments, the virtual image 400 is not interactive and simply provides information to the user 202 by superimposing the virtual image 400 onto the display 330. For example, in some instances it may be beneficial for the object recognition application 325 and/or the AR presentment application 321 to merely identify an object 220, just identify the object's name/title, give brief information about the object, etc., rather than provide extensive detail that requires interaction with the virtual image 400. The mobile device 204 along with the object recognition application 325 and/or the AR presentment application 321 are capable of being tailored to a user's desired preferences.
Furthermore, the virtual image 400 may be displayed at any size on the mobile device display 330. The virtual image 400 may be small enough that it is positioned on or next to the object 220 being identified such that the object 220 remains discernable behind the virtual image 400. Additionally, the virtual image 400 may be semi-transparent such that the object 220 remains discernable behind the virtual image. In other embodiments, the virtual image 400 may be large enough to completely cover the object 220 portrayed on the display 330. Indeed, in some embodiments, the virtual image 400 may cover a majority or the entirety of the mobile device display 330.
The user 202 may opt to execute the object recognition application 325 and/or the AR presentment application 321 at any desired moment and begin video capture and analysis. However, in some embodiments, the object recognition application 325 and/or the AR presentment application 321 includes an “always on” feature in which the mobile device 204 is continuously capturing video and analyzing the objects 220 within the video stream. In such embodiments, the object recognition application 325 may be configured to alert the user 202 that a particular object 220 has been identified. The user 202 may set any number of user preferences to tailor the AR experience to his/her needs. For instance, the user 202 may opt to only be alerted if a certain particular object 220 is identified. Additionally, it will be appreciated that the “always on” feature in which video is continuously captured may consume the mobile device power source 315 more quickly. Thus, in some embodiments, the “always on” feature may disengage if a determined event occurs, such as a low power source 315, low levels of light for an extended period of time (e.g., such as if the mobile device 204 is in a user's pocket obstructing a clear view of the environment 250 from the mobile device 204), if the mobile device 204 remains stationary (thus receiving the same video stream) for an extended period of time, the user sets a certain time of day to disengage, etc. Conversely, if the “always on” feature is disengaged due to the occurrence of such an event, the user 202 may opt for the “always on” feature to re-engage after the duration of the disengaging event (e.g., power source 315 is re-charged, light levels are increased, etc.).
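The disengage conditions above can be sketched as a simple predicate evaluated periodically while "always on" capture runs. The threshold values (battery percentage, lux level, durations) are illustrative assumptions, not figures from the specification.

```python
# Sketch of the "always on" disengage logic described above.
# All threshold values are hypothetical.

def should_disengage(battery_pct, lux, stationary_minutes, dark_minutes):
    """Decide whether continuous capture should pause."""
    if battery_pct < 15:                  # low power source 315
        return True
    if lux < 5 and dark_minutes > 10:     # extended darkness, e.g., in a pocket
        return True
    if stationary_minutes > 30:           # same video stream for a long time
        return True
    return False

paused = should_disengage(battery_pct=80, lux=2, stationary_minutes=5, dark_minutes=20)
```

The inverse predicate (battery recharged, light restored, movement resumed) would drive the re-engagement behavior the text describes.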
In some embodiments, the user 202 may identify objects 220 that the object recognition application 325 does not identify and add them to the data storage 371 with desired information in order to be identified and/or displayed in the future. For instance, the user 202 may select an unidentified object 220 and enter a name/title and/or any other desired information for the unidentified object 220. In such embodiments, the object recognition application 325 may detect/record certain markers 230 about the object so that the pattern recognition algorithm(s) (or other identification means) may detect the object 220 in the future. Furthermore, in cases where the object information is within the data storage 371, but the object recognition application 325 fails to identify the object 220 (e.g., one or more identifying characteristics or markers 230 of the object has changed since it was added to the data storage 371 or the marker 230 simply was not identified), the user 202 may select the object 220 and associate it with an object 220 already stored in the data storage 371. In such cases, the object recognition application 325 may be capable of updating the markers 230 for the object 220 in order to identify the object in future real-time video streams.
In addition, in some embodiments, the user 202 may opt to edit the information or add to the information provided by the virtual image 400. For instance, the user 202 may opt to include user-specific information about a certain object 220 such that the information may be displayed upon a future identification of the object 220. Conversely, in some embodiments, the user may opt to delete or hide an object 220 from being identified and a virtual image 400 associated therewith being displayed on the mobile device display 330.
Furthermore, in some instances, an object 220 may include one or more markers 230 identified by the object recognition application 325 that lead the object recognition application 325 to associate an object with more than one object in the data storage 371. In such instances, the user 202 may be presented with the multiple candidate identifications and may opt to choose the appropriate identification or input a different identification. The multiple candidates may be presented to the user 202 by any means. For instance, in one embodiment, the candidates are presented to the user 202 as a list wherein the “strongest” candidate is listed first based on reliability of the identification. Upon input by the user 202 identifying the object 220, the object recognition application 325 may “learn” from the input and store additional markers 230 in order to avoid multiple identification candidates for the same object 220 in future identifications.
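The ranked-candidate flow above can be sketched as two steps: order the candidates by a reliability score, then record the user's confirmed choice together with extra distinguishing markers for future disambiguation. The field names, scores, and marker strings are hypothetical.

```python
# Illustrative sketch of ranking candidate identifications and "learning"
# from the user's confirmation, per the description above.

def rank_candidates(candidates):
    """Order candidates with the strongest (most reliable) listed first."""
    return sorted(candidates, key=lambda c: c["reliability"], reverse=True)

def confirm(chosen_name, learned_markers, store):
    """Record the user's pick plus extra markers to disambiguate next time."""
    store[chosen_name] = store.get(chosen_name, set()) | learned_markers
    return chosen_name

store = {}
ranked = rank_candidates([
    {"name": "sedan model X", "reliability": 0.6},
    {"name": "sedan model Y", "reliability": 0.9},
])
picked = confirm(ranked[0]["name"], {"grille-shape"}, store)
```

After confirmation, the stored marker set for the chosen object grows, so a later sighting sharing the new marker can resolve to a single candidate instead of several.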
Additionally, the object recognition application 325 may utilize metrics for identification other than identification algorithms. For instance, the object recognition application 325 may utilize the user's location, time of day, season, weather, speed of location changes (e.g., walking versus traveling), “busyness” (e.g., how many objects are in motion versus stationary in the video stream), as well as any number of other conceivable factors in determining the identification of objects 220. Moreover, the user 202 may input preferences or other metrics that the object recognition application 325 may utilize to narrow results of identified objects 220.
The AR presentment application 321 may then provide virtual objects or indicators associated with the objects in the real-time video stream. The AR presentment application 321, in this way, may provide for superimposing a virtual object and/or indicators associated with objects 220 in the video stream, such that the user 202 may receive more information associated with the object 220 in the real-time video stream.
In some embodiments, the AR presentment application 321 may have the ability to gather and report user interactions with displayed virtual images 400. The data elements gathered and reported may include, but are not limited to, number of offer impressions; time spent “viewing” an offer, product, object or business; number of offers investigated via a selection; number of offers loaded to an electronic wallet and the like. Such user interactions may be reported to any type of entity desired. In one particular embodiment, the user interactions may be reported to a financial institution and the information reported may include user financial behavior, purchase power/transaction history, and the like.
In some embodiments, the information provided by the real-time video stream may be compared to data provided to the system through an application programming interface (API). In this way, the data may be stored by a separate application and retrieved, upon request from the mobile device and/or server, by way of the API.
In various embodiments, information associated with or related to one or more objects that is retrieved for presentation to a user via the mobile device may be permanently or semi-permanently associated with the object. In other words, the object may be "tagged" with the information. In some embodiments, a location pointer is associated with an object after information is retrieved regarding the object. In this regard, subsequent mobile devices capturing the object for recognition may retrieve the associated information, tags and/or pointers in order to more quickly retrieve information regarding the object. In some embodiments, the mobile device provides the user an opportunity to post messages, links to information or the like and associate such postings with the object. Subsequent users may then be presented with such postings when their mobile devices capture and recognize an object. In some embodiments, the information gathered through the recognition and information retrieval process may be posted by the user in association with the object. Such tags and/or postings may be stored in a predetermined memory and/or database for ease of searching and retrieval.
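The tagging-and-retrieval idea above can be sketched as a small store keyed by a recognized object's identity. The identifiers and dictionary layout are hypothetical, not from the specification:

```python
tag_store = {}  # object fingerprint -> accumulated postings and location pointers

def tag_object(object_id, posting, location=None):
    """Associate a user posting (and optional location pointer) with an object."""
    entry = tag_store.setdefault(object_id, {"postings": [], "locations": []})
    entry["postings"].append(posting)
    if location is not None:
        entry["locations"].append(location)

def lookup_tags(object_id):
    """A later capture of the same object retrieves prior postings directly,
    skipping a fresh information-retrieval pass."""
    return tag_store.get(object_id, {"postings": [], "locations": []})

tag_object("fountain-01", "Great lunch spot nearby", (35.22, -80.84))
tag_object("fountain-01", "Farmers market on Saturdays")
```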
Once the information is provided to the directory in block 513, in decision block 514, a determination is made as to whether the mobile device 204 is still capturing a video stream of an object, product, and/or budget data. If no video stream is being captured, then no indicator is presented in block 516 via the AR presentment application 321. If a video stream is still being captured, then in block 518 indicators continue to be presented. The indicators are associated with a product and/or budget data that the user 202 may visualize in an environment 250 via the AR presentment application 321. In some embodiments, the user 202 may be provided with a real-time updated wish list and/or budget prior to selecting an indicator. In some embodiments, as illustrated in block 520, a user 202 may receive directory information such as budget and/or wish list data after the user 202 selects the indicator.
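One pass of decision block 514 can be sketched as follows. The function and field names are illustrative assumptions; the block numbers refer to the flow just described:

```python
def presentation_step(capturing, recognized_items):
    """One pass of decision block 514: while the stream is live (block 518),
    return one indicator per recognized product or budget datum; once capture
    stops (block 516), return no indicators."""
    if not capturing:
        return []
    return [{"item": item, "indicator": True} for item in recognized_items]

live = presentation_step(True, ["bicycle", "grocery budget"])
stopped = presentation_step(False, ["bicycle"])
```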
Next, as illustrated in block 520, the user 202 may select an indicator associated with the product and/or budget data located in the real-time video stream to obtain information about the product and/or budget data. In this way, the information may be provided to the user's budget and wish list in real-time, such that the user 202 may be able to see his/her budget and wish list with the product and/or budget data associated therein.
A user budget provides a financial plan that forecasts revenues and expenditures, such that a model of financial performance may be obtained if certain strategies, events, and/or plans are carried out. Budgets may include plans for saving, borrowing, and/or spending. In some embodiments, the budget may provide the user 202 with a real-time indication as to how purchasing a product may affect the user's budget, a future budget based on the purchasing of the product, changes in the budget, loans for purchasing the product, connection to a financial analyst, etc.
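The real-time indication of how a purchase affects the budget could be computed as a simple projection. This is a minimal sketch under assumed field names (`limit`, `spent`), not the specification's implementation:

```python
def purchase_impact(budget, price):
    """Project, in real time, how buying a product would change the budget."""
    remaining = budget["limit"] - budget["spent"]
    after = remaining - price
    return {"remaining_before": remaining,
            "remaining_after": after,
            "within_budget": after >= 0}

# A $150 purchase against a $500 budget with $320 already spent.
impact = purchase_impact({"limit": 500.0, "spent": 320.0}, 150.0)
```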
A user wish list is a list that includes products that the user may wish to purchase or is planning to purchase in the future. These products may then be monitored for price changes, discounts, merchants providing the product, similar products, comments/reviews of the product, promotional offers for the product, accompanying products, and/or the like.
As illustrated in block 522, the user 202 may be able to, upon selecting the indicator, review the product and/or budget data within the user's budget and wish list. As also illustrated in block 522, the user 202 may be able to communicate with a financial provider, and if the image corresponds to a product, the user 202 may be able to purchase the product.
The user's wish list 606 is a list that includes products that the user 202 may wish to purchase or is planning to purchase in the future. In some embodiments, the wish list may be populated manually by the user, using an interface and/or the like. In some embodiments, the wish list may be set up automatically by the system for the user. In yet other embodiments, the wish list may be set up via a combination of user input and system automatic wish list population. The wish list 606 may be provided to the user 202 through an interface, such as that illustrated in
The user's budget 608 provides a financial plan that forecasts revenues and expenditures, such that a model of financial performance may be obtained if certain strategies, events, and/or plans are carried out. The user's budget 608 may include plans for saving, borrowing, and/or spending. The user's budget 608 may be a single budget or may be multiple budgets. The budgets 608 may include, but are not limited to, macro, category, commercial, micro, groups, sales, production, cash flow, marketing, project expenditure, businesses, and/or any other potential budgeting requirement. The budgets 608 may be used by any user 202, including but not limited to individuals, families, businesses, corporations, merchants, manufacturers, and/or other entities. In some embodiments, the user 202 may input, via an interface, such as that illustrated in
In some embodiments, the user 202 may set up 617 the user's budget 608. The user 202 may input several items into the budget 608, such as, but not limited to, answers to financial questions from the system, financial plans, financial history, bills, statements, expenses, savings, goals, etc. to tailor the user's budget 608 to the user's specific needs. In this way, the user 202 may be able to set up budgets 608 for overall budgeting, micro budgeting, the user's place of business, etc. by inputting preferences and budgeting goals. The system may then automatically run the user's budget based on the user's inputs. Furthermore, the system may provide purchase and other budgeting recommendations based on the user's input.
In some embodiments, the system may aid the user 202 or completely set up 617 the user's budget 608 utilizing several factors, including but not limited to financial plans, the user's financial history, similarly situated individuals' financial history, crowd-sourced data, financial advisor recommendations, and/or the like. Information associated with these factors may be known to the system based on the system's unique position with respect to financial institution data associated with the user 202.
Financial plans of a user 202 may include financial goals, payment strategies, tax strategies, personal planning, loan repayment (e.g., student loan repayment, car loan repayment, etc.), paying off credit card debt (e.g., paying off one credit card with higher interest rates faster than other debt), mortgage repayment and/or the like. The financial goals of the user 202 may include savings goals, such as saving for a child's college, investments, saving to reach a specific amount, saving to purchase a specific product, retirement savings goals, etc. Personal planning may include vacation planning, job loss planning, emergency planning, and social networking data.
User 202 financial history includes payment strategies the user 202 may have previously implemented, previous earnings, previous purchasing habits, previously used purchase methods, previous bills, etc. that may be used by the system to determine a budget that may be suitable for the user to implement. Similarly situated individuals' financial history includes the system reviewing and determining budgets for individuals that are in a similar financial, location, age, etc. position as the user 202. In this way, the user 202 receives aggregate data for other individuals that are similarly situated to inform the user's budget 608.
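One way to aggregate similarly situated individuals' histories into a starting budget is a per-category average. This is a hypothetical sketch; the category names and the choice of a plain mean are assumptions:

```python
def seed_budget_from_peers(peer_budgets):
    """Average per-category spending of similarly situated users into a
    starting budget for the user."""
    totals = {}
    for budget in peer_budgets:
        for category, amount in budget.items():
            totals.setdefault(category, []).append(amount)
    return {category: sum(amounts) / len(amounts)
            for category, amounts in totals.items()}

seed = seed_budget_from_peers([
    {"groceries": 400, "entertainment": 120},
    {"groceries": 500, "entertainment": 80},
])
```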
The user's budget 608 may comprise financing 618 for the user 202, budget adjustment 602, goals 622, strategies 624, and planning 626. These may be provided to the user 202 in real-time, such that the user 202 may realize how making a financial decision may impact financing 618 for the user 202, budget adjustment 602, goals 622, strategies 624, and planning 626 that has previously been set up as the user's budget 608.
Next, as illustrated in block 628, the user 202 upon selecting the indicator associated with the product or budget data in the real-time video stream may be directed to an interface associated with the user's wish list and/or budget. In some embodiments, the user's wish list and/or budget may be displayed as part of the indicator on the real-time video stream. In yet other embodiments, upon selecting the indicator the user 202 may be directed to his/her wish list and/or budget. The wish list 606 may be provided to the user 202 through an interface, such as that illustrated in
The wish list interface 700, in some embodiments requires entering information for security reasons. At this point, the user 202 may enter a user name 706, a password 708, and a reply to a security question 710 in the security section 704 of the wish list interface 700. If the user name 706, password 708, and the reply to a security question 710 are satisfactory, the interface prompts the user 202 to the next step in the process. For example, if the user name 706 is being used by a current user, the new user will be prompted to create a different user name 706. The user 202 may provide products that the user 202 may wish to purchase, will purchase, or is interested in purchasing via the wish list interface 700 in the information about image to add to wish list section 736.
The information about image to add to wish list section 736 of the wish list interface 700 may allow for adding products to the wish list and subsequently viewing products currently on the user's 202 wish list. The user 202 may be provided a visualization of the image in the real-time video stream in section 742. The user 202 may also be provided with information about the product in the image in section 744. This information may include, but is not limited to, the price of the product, the price of the product at different merchants, specifications of the product, promotions, etc. At this point, the user 202 may select the product in the real-time video stream to be incorporated into the user's wish list, as illustrated in section 476. Once the user 202 has selected the product, the user 202 may add the product to his/her current wish list 740, by selecting the add button.
Once the user 202 has selected the add button, the product in the image may appear on the user's current wish list 740. The wish list has a compilation of all the products that have been added to the user's wish list. The products may have been added during a previous session or during the current session. If the user 202 wishes, he/she may remove a product from the current wish list 740 if it is no longer a product the user 202 may wish to purchase. In some embodiments, the system may automatically remove the product if the system has received an indication that the user 202 has purchased the product or a similar product. The user 202 may also select products on his/her current wish list 740 and view images of the products that are on the wish list by selecting the show image of product on wish list section 748.
Along with the current wish list 740 the user 202 may be able to select current information about the products on the wish list, as illustrated in section 712. The information available for the user 202 may include, but is not limited to merchant location 714, price 724, sales 716, promotions 720, alternative products 722, and accompanying products 718. Merchant location 714 provides the user 202 with an indication of all the merchants, online, virtual, physical, and the like, that currently are selling one or more products on the user's wish list. Price 724 includes the price of the products on the user's wish list, including the price for the product at any of the merchants where the product is found, price differences compared to a previous price found at that merchant, and future potential price changes the system may predict may occur. Sales 716 include any merchant sales for the product on the wish list, alternative products, and/or accompanying products. Promotions 720 includes any discounts, offers, coupons, etc. that may be available via the Internet and/or any other merchant. Alternative products 722 include products that are similar to the product on the user's wish list, but instead may be a different model, brand, type, etc. Accompanying products 718 include products that may accompany the product on the wish list. For example, if the user 202 has a bicycle on his/her wish list, then accompanying products may include bike shorts, a helmet, water bottles, a trainer, warranty information, etc. In this way, the user 202 may be able to purchase the product on his/her wish list as well as the accessories that may be needed to go along with the product.
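The monitoring described for section 712 (merchant locations, prices, and price changes) could be sketched as a pass over current merchant offers for each wish-list product. The data shapes and names below are illustrative assumptions:

```python
def wish_list_updates(wish_list, current_offers):
    """For each wish-list product, find the best current merchant offer and
    flag whether it beats the last recorded price."""
    updates = []
    for item in wish_list:
        offers = current_offers.get(item["name"], {})
        if not offers:
            continue
        best_merchant = min(offers, key=offers.get)
        best_price = offers[best_merchant]
        updates.append({
            "product": item["name"],
            "best_merchant": best_merchant,
            "best_price": best_price,
            "price_drop": best_price < item["last_price"],
        })
    return updates

updates = wish_list_updates(
    [{"name": "bicycle", "last_price": 499.0}],
    {"bicycle": {"ShopA": 459.0, "ShopB": 479.0}},
)
```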
Financing for the purchase of the product may also be available to the user 202 through the wish list interface 700 of
As illustrated in
In this way, the financial plans 804 section may include sub-sections such as the financial goals section 806, the payment strategies 807, and personal plans 808. Financial goals 806 may include savings goals, such as saving for a child's college, making an investment, saving to reach a specific amount, or saving for retirement. In some embodiments, a user 202 may have multiple savings goals, of which some may be more important than others to the user 202 at a particular time. Payment strategies 807 may include loan repayment, such as repaying a student loan, a car loan, a personal bank loan, etc. Payment strategies 807 may also include paying off debt, such as mortgage repayment, credit card debt repayment, or personal debt repayment. Personal plans 808 may include vacation planning, job loss planning, emergency planning, social networking data, and/or tax planning. Vacation planning may include a user 202 saving for airfare, lodging, or other travel expenses. Job loss planning allows the user 202 to direct the payment determination account to allocate financial transaction requests to accounts to maximize finances in case of a situation of unemployment. Emergency planning allows the user 202 to direct the system to allocate transaction requests to accounts to maximize available financial resources to use in case of an emergency situation. Tax planning allows the user 202 to direct the payment determination account to allocate transaction requests to accounts to utilize tax planning strategies set by the user 202.
The user 202 may input financial plans 804 via the budget interface 800, as illustrated in
Next, in section 810 of the budget interface 800, the user 202 may rank the budgets' importance 810. In this way, if the user 202 has multiple budgets, for example, an overall budget, an entertainment budget, a food budget, and/or the like, the user 202 may be able to rank the budgets in order of importance. Then, if a user 202 makes a purchase of a product in one budget, the impact of the purchase may be shifted to a less or more important budget in order for the user's budgets to balance out. Furthermore, the user 202 may be able to rank aspects associated with determining the user's budget, such as financial goals, in section 836 and plans, in section 838. In this way, the user's budget may be implemented with the goals or plans that are most important to the user 202, allowing the budget to be adapted to the user's specific needs.
In some embodiments, the ranking of goals and/or plans provides a selection box to add a numerical value to each of the goals and/or plan, or other ranking or rating indicator. For example, the user 202 that had student loans, a car loan, and a mortgage to repay may want to focus his repayments to the loan having the highest interest rates. Therefore, the user 202 may rank the highest interest rate loan as the most important goal in his financial plan.
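Shifting a purchase's impact toward less important budgets could work as follows; the budgets are consumed in reverse rank order (rank 1 = most important). The structure and the greedy allocation are assumptions for illustration:

```python
def absorb_purchase(budgets, amount):
    """Charge a purchase against the least important budgets first so that
    higher-ranked budgets stay balanced (rank 1 = most important)."""
    for budget in sorted(budgets, key=lambda b: b["rank"], reverse=True):
        take = min(amount, budget["remaining"])
        budget["remaining"] -= take
        amount -= take
        if amount <= 0:
            break
    return budgets

budgets = [{"name": "overall", "rank": 1, "remaining": 300.0},
           {"name": "entertainment", "rank": 3, "remaining": 50.0},
           {"name": "food", "rank": 2, "remaining": 100.0}]
# A $120 purchase drains entertainment first, then part of food;
# the most important (overall) budget is untouched.
absorb_purchase(budgets, 120.0)
```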
In some embodiments, the user 202 may be provided with current budget graphs 812. These graphs may include, but are not limited to charts, graphs, tables, etc. that may allow a user 202 to visualize his/her budget quickly and easily. In the embodiment illustrated in
Referring back to
As will be appreciated by one of ordinary skill in the art, the present invention may be embodied as an apparatus (including, for example, a system, a machine, a device, a computer program product, and/or the like), as a method (including, for example, a business process, a computer-implemented process, and/or the like), or as any combination of the foregoing. Accordingly, embodiments of the present invention may take the form of an entirely software embodiment (including firmware, resident software, micro-code, etc.), an entirely hardware embodiment, or an embodiment combining software and hardware aspects that may generally be referred to herein as a “system.” Furthermore, embodiments of the present invention may take the form of a computer program product that includes a computer-readable storage medium having computer-executable program code portions stored therein. As used herein, a processor may be “configured to” perform a certain function in a variety of ways, including, for example, by having one or more general-purpose circuits perform the functions by executing one or more computer-executable program code portions embodied in a computer-readable medium, and/or having one or more application-specific circuits perform the function.
It will be understood that any suitable computer-readable medium may be utilized. The computer-readable medium may include, but is not limited to, a non-transitory computer-readable medium, such as a tangible electronic, magnetic, optical, infrared, electromagnetic, and/or semiconductor system, apparatus, and/or device. For example, in some embodiments, the non-transitory computer-readable medium includes a tangible medium such as a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a compact disc read-only memory (CD-ROM), and/or some other tangible optical and/or magnetic storage device. In other embodiments of the present invention, however, the computer-readable medium may be transitory, such as a propagation signal including computer-executable program code portions embodied therein.
It will also be understood that one or more computer-executable program code portions for carrying out operations of the present invention may include object-oriented, scripted, and/or unscripted programming languages, such as, for example, Java, Perl, Smalltalk, C++, SAS, SQL, Python, Objective C, and/or the like. In some embodiments, the one or more computer-executable program code portions for carrying out operations of embodiments of the present invention are written in conventional procedural programming languages, such as the “C” programming languages and/or similar programming languages. The computer program code may alternatively or additionally be written in one or more multi-paradigm programming languages, such as, for example, F#.
It will further be understood that some embodiments of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of systems, methods, and/or computer program products. It will be understood that each block included in the flowchart illustrations and/or block diagrams, and combinations of blocks included in the flowchart illustrations and/or block diagrams, may be implemented by one or more computer-executable program code portions. These one or more computer-executable program code portions may be provided to a processor of a general purpose computer, special purpose computer, and/or some other programmable data processing apparatus in order to produce a particular machine, such that the one or more computer-executable program code portions, which execute via the processor of the computer and/or other programmable data processing apparatus, create mechanisms for implementing the steps and/or functions represented by the flowchart(s) and/or block diagram block(s).
It will also be understood that the one or more computer-executable program code portions may be stored in a transitory or non-transitory computer-readable medium (e.g., a memory, etc.) that can direct a computer and/or other programmable data processing apparatus to function in a particular manner, such that the computer-executable program code portions stored in the computer-readable medium produce an article of manufacture, including instruction mechanisms which implement the steps and/or functions specified in the flowchart(s) and/or block diagram block(s).
The one or more computer-executable program code portions may also be loaded onto a computer and/or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer and/or other programmable apparatus. In some embodiments, this produces a computer-implemented process such that the one or more computer-executable program code portions which execute on the computer and/or other programmable apparatus provide operational steps to implement the steps specified in the flowchart(s) and/or the functions specified in the block diagram block(s). Alternatively, computer-implemented steps may be combined with operator and/or human-implemented steps in order to carry out an embodiment of the present invention.
Thus, methods, systems, computer programs and the like have been disclosed that provide for using real-time video analysis, such as AR or the like, to assist users of mobile devices with commerce activities. Through the use of real-time vision object recognition, objects, logos, artwork, products, locations and other features that can be recognized in the real-time video stream can be matched to associated data to assist the user with commerce activity. The commerce activity may include, but is not limited to: conducting a transaction, providing information about a product/service, providing rewards-based information, providing user-specific offers, or the like. In specific embodiments, the data matched to the images in the real-time video stream is specific to financial institutions, such as user financial behavior history, user purchase power/transaction history and the like. In this regard, many of the embodiments herein disclosed leverage financial institution data, which is uniquely specific to financial institutions, in providing information to mobile device users in connection with real-time video stream analysis.
While certain exemplary embodiments have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of, and not restrictive on, the broad invention, and that this invention not be limited to the specific constructions and arrangements shown and described, since various other changes, combinations, omissions, modifications and substitutions, in addition to those set forth in the above paragraphs, are possible. Those skilled in the art will appreciate that various adaptations and modifications of the just described embodiments can be configured without departing from the scope and spirit of the invention. Therefore, it is to be understood that, within the scope of the appended claims, the invention may be practiced other than as specifically described herein.
The systems, methods, computer program products, etc. described herein, may be utilized or combined with any other suitable AR-related application. Non-limiting examples of other suitable AR-related applications include those described in the following U.S. Provisional Patent Applications, the entirety of each of which is incorporated herein by reference:
This application is a continuation of U.S. patent application Ser. No. 13/365,996, filed Feb. 3, 2012, which in turn claims priority to U.S. Provisional Patent Application Ser. No. 61/450,213, filed Mar. 8, 2011, entitled “Real-Time Video Image Analysis Applications for Commerce Activity,” assigned to the assignee hereof and hereby expressly incorporated by reference herein.
20090299857 | Brubaker | Dec 2009 | A1 |
20090322671 | Scott et al. | Dec 2009 | A1 |
20100002204 | Jung et al. | Jan 2010 | A1 |
20100034468 | Boncyk et al. | Feb 2010 | A1 |
20100060739 | Salazar | Mar 2010 | A1 |
20100070365 | Siotia et al. | Mar 2010 | A1 |
20100100253 | Fausak et al. | Apr 2010 | A1 |
20100103241 | Linaker | Apr 2010 | A1 |
20100130226 | Arrasvuori et al. | May 2010 | A1 |
20100138037 | Adelberg et al. | Jun 2010 | A1 |
20100169336 | Eckhoff-Hornback et al. | Jul 2010 | A1 |
20100185529 | Chesnut et al. | Jul 2010 | A1 |
20100217651 | Crabtree et al. | Aug 2010 | A1 |
20100223165 | Calman et al. | Sep 2010 | A1 |
20100228558 | Corcoran et al. | Sep 2010 | A1 |
20100228776 | Melkote et al. | Sep 2010 | A1 |
20100250581 | Chau | Sep 2010 | A1 |
20100255795 | Rubinsky et al. | Oct 2010 | A1 |
20100257448 | Squires | Oct 2010 | A1 |
20100260373 | Neven et al. | Oct 2010 | A1 |
20100268629 | Ross et al. | Oct 2010 | A1 |
20100277412 | Pryor | Nov 2010 | A1 |
20100281432 | Geisner et al. | Nov 2010 | A1 |
20100283630 | Alonso | Nov 2010 | A1 |
20100302361 | Yoneyama et al. | Dec 2010 | A1 |
20100306712 | Snook et al. | Dec 2010 | A1 |
20100306715 | Geisner et al. | Dec 2010 | A1 |
20100309225 | Gray et al. | Dec 2010 | A1 |
20110022540 | Stern et al. | Jan 2011 | A1 |
20110034176 | Lord et al. | Feb 2011 | A1 |
20110077046 | Durand et al. | Mar 2011 | A1 |
20110079639 | Khan | Apr 2011 | A1 |
20110081952 | Song et al. | Apr 2011 | A1 |
20110091092 | Nepomniachtchi et al. | Apr 2011 | A1 |
20110098029 | Rhoads et al. | Apr 2011 | A1 |
20110106622 | Kuhlman et al. | May 2011 | A1 |
20110106845 | Lipson et al. | May 2011 | A1 |
20110113343 | Trauth | May 2011 | A1 |
20110119155 | Hammad et al. | May 2011 | A1 |
20110145093 | Paradise et al. | Jun 2011 | A1 |
20110153341 | Diaz-Cortes | Jun 2011 | A1 |
20110153402 | Craig | Jun 2011 | A1 |
20110157357 | Weisensale et al. | Jun 2011 | A1 |
20110164163 | Bilbrey et al. | Jul 2011 | A1 |
20110183732 | Block et al. | Jul 2011 | A1 |
20110191372 | Kaushansky et al. | Aug 2011 | A1 |
20110202460 | Buer et al. | Aug 2011 | A1 |
20110202466 | Carter | Aug 2011 | A1 |
20110252311 | Kay et al. | Oct 2011 | A1 |
20110258121 | Kauniskangas et al. | Oct 2011 | A1 |
20110280450 | Nepomniachtchi et al. | Nov 2011 | A1 |
20110282821 | Levy et al. | Nov 2011 | A1 |
20110317008 | Sam | Dec 2011 | A1 |
20110318717 | Adamowicz | Dec 2011 | A1 |
20120013770 | Stafford et al. | Jan 2012 | A1 |
20120066026 | Dusig et al. | Mar 2012 | A1 |
20120075450 | Ding et al. | Mar 2012 | A1 |
20120095853 | Von Bose et al. | Apr 2012 | A1 |
20120098977 | Striemer et al. | Apr 2012 | A1 |
20120099756 | Sherman et al. | Apr 2012 | A1 |
20120100915 | Margalit et al. | Apr 2012 | A1 |
20120140068 | Monroe et al. | Jun 2012 | A1 |
20120179609 | Agarwal et al. | Jul 2012 | A1 |
20120179665 | Baarman et al. | Jul 2012 | A1 |
20120190455 | Briggs | Jul 2012 | A1 |
20120229624 | Calman et al. | Sep 2012 | A1 |
20120229625 | Calman et al. | Sep 2012 | A1 |
20120229629 | Blumstein-Koren et al. | Sep 2012 | A1 |
20120229647 | Calman et al. | Sep 2012 | A1 |
20120229657 | Calman et al. | Sep 2012 | A1 |
20120230538 | Calman et al. | Sep 2012 | A1 |
20120230539 | Calman et al. | Sep 2012 | A1 |
20120230540 | Calman et al. | Sep 2012 | A1 |
20120230548 | Calman et al. | Sep 2012 | A1 |
20120230557 | Calman et al. | Sep 2012 | A1 |
20120230577 | Calman et al. | Sep 2012 | A1 |
20120231424 | Calman et al. | Sep 2012 | A1 |
20120231425 | Calman et al. | Sep 2012 | A1 |
20120231814 | Calman et al. | Sep 2012 | A1 |
20120231840 | Calman et al. | Sep 2012 | A1 |
20120232937 | Calman et al. | Sep 2012 | A1 |
20120232954 | Calman et al. | Sep 2012 | A1 |
20120232966 | Calman et al. | Sep 2012 | A1 |
20120232968 | Calman et al. | Sep 2012 | A1 |
20120232976 | Calman et al. | Sep 2012 | A1 |
20120232977 | Calman et al. | Sep 2012 | A1 |
20120232993 | Calman et al. | Sep 2012 | A1 |
20120233003 | Calman et al. | Sep 2012 | A1 |
20120233015 | Calman et al. | Sep 2012 | A1 |
20120233025 | Calman et al. | Sep 2012 | A1 |
20120233032 | Calman et al. | Sep 2012 | A1 |
20120233033 | Calman et al. | Sep 2012 | A1 |
20120233070 | Calman et al. | Sep 2012 | A1 |
20120233072 | Calman et al. | Sep 2012 | A1 |
20120233089 | Calman et al. | Sep 2012 | A1 |
20120265679 | Calman et al. | Oct 2012 | A1 |
20120265809 | Hanson et al. | Oct 2012 | A1 |
20120287278 | Danis | Nov 2012 | A1 |
20120299961 | Ramkumar et al. | Nov 2012 | A1 |
20120313781 | Barker et al. | Dec 2012 | A1 |
20120320248 | Igarashi | Dec 2012 | A1 |
20120330753 | Urbanski et al. | Dec 2012 | A1 |
20130011111 | Abraham et al. | Jan 2013 | A1 |
20130031202 | Mick et al. | Jan 2013 | A1 |
20130033522 | Calman et al. | Feb 2013 | A1 |
20130036050 | Giordano et al. | Feb 2013 | A1 |
20130046589 | Grigg et al. | Feb 2013 | A1 |
20130046602 | Grigg et al. | Feb 2013 | A1 |
20130046603 | Grigg et al. | Feb 2013 | A1 |
20130054367 | Grigg et al. | Feb 2013 | A1 |
20130103608 | Scipioni et al. | Apr 2013 | A1 |
20130114877 | Meek et al. | May 2013 | A1 |
20130116855 | Nielsen et al. | May 2013 | A1 |
20130155474 | Roach et al. | Jun 2013 | A1 |
20130156317 | Calman | Jun 2013 | A1 |
20130182010 | Schoeller et al. | Jul 2013 | A2 |
20130259313 | Breed et al. | Oct 2013 | A1 |
20140006259 | Grigg et al. | Jan 2014 | A1 |
20140098993 | Boncyk et al. | Apr 2014 | A1 |
20140219566 | Rodriguez et al. | Aug 2014 | A1 |
Number | Date | Country |
---|---|---|
2007-266143 | Oct 2007 | JP |
1020090047614 | May 2009 | KR |
1020090105734 | Oct 2009 | KR |
Entry |
---|
Zhu, Wei, et al. "Design of the PromoPad: An Automated Augmented-Reality Shopping Assistant." Journal of Organizational and End User Computing, vol. 20, No. 3, 2008, pp. 41-56. (http://search.proquest.com/docview/199899751?accountid=14753). |
Brody, A. B. (1999). Pocket BargainFinder: A Handheld Device for Augmented Commerce. Handheld and Ubiquitous Computing, First International Symposium, HUC'99, Karlsruhe, Germany, Sep. 27-29, 1999, Proceedings, pp. 44-51. Retrieved from https://search.proquest.com/professional/docview/729929360/briefcitation/1510901369b4c70b903/3?accountid=142257. |
International Search Report and Written Opinion for International Publication No. PCT/US12/27890 mailed Feb. 5, 2013. |
PCT International Preliminary Report on Patentability (IPRP) for International Application No. PCT/US2012/048697 dated Feb. 4, 2014. |
PCT International Search Report and Written Opinion for International Application No. PCT/US 12/28008 dated May 22, 2012. |
PCT International Search Report and Written Opinion for International Application No. PCT/US 12/28036 dated May 28, 2012. |
PCT International Search Report and Written Opinion for International Application No. PCT/US2012/027912 dated Jun. 8, 2012. |
PCT International Search Report and Written Opinion for International Application No. PCT/US 12/27892 dated Jun. 14, 2012. |
International Search Report and Written Opinion dated Oct. 12, 2012 for International Application No. PCT/US12/48697. |
M. J. Welch (2010). Addressing the Challenges in Underspecification in Web Search. (Order No. 3446833, University of California, Los Angeles). ProQuest Dissertations and Theses; 137; retrieved from http://search.proquest.com/docview/8581010500?accountid=14753 (858101500). |
K. J. Jeevan & P. Padhi (2006). A Selective Review of Research in Content Personalization. Library Review, 55(9), 556-586. doi:http://dx.doi.org/10.1108/00242503610706761. |
P.A. Lessner (2007). Chi-thinking: Chiasmus and Cognition. (Order No. 3297307, University of Maryland, College Park). ProQuest Dissertations and Theses; 487; retrieved from http://search.proquest.com/docview/304851937?accountid=14753. (304851937). |
International Preliminary Examination Report for International Application No. PCT/US12/27892 dated Sep. 10, 2013; 9 pages. |
International Preliminary Examination Report for International Application No. PCT/US2012/027890 dated Sep. 10, 2013; 6 pages. |
International Preliminary Examination Report for International Application No. PCT/US2012/28036 dated Sep. 10, 2013; 5 pages. |
International Preliminary Examination Report for International Application No. PCT/US12/28008 dated Sep. 10, 2013; 7 pages. |
International Preliminary Examination Report for International Application No. PCT/US12/27912 dated Sep. 10, 2013; 6 pages. |
Hollmer, M. (Mar. 18, 2004). MIT kicks off annual $50K business plan competition. The Boston Business Journal, 24, 24. Retrieved from http://search.proquest.com/docview/216355359?accountid=14753. |
O'Donnell, Jayne, and Christine Dugas. More retailers go for green—the eco kind; Home Depot tags friendly products: [Final Edition]. USA Today [McLean, Va.], Apr. 18, 2007: B.3. |
Number | Date | Country | |
---|---|---|---|
20160162982 A1 | Jun 2016 | US |
Number | Date | Country | |
---|---|---|---|
61450213 | Mar 2011 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13365996 | Feb 2012 | US |
Child | 15045188 | US |