Augmented reality is a technology in which a person's conception of reality can be enhanced, typically through augmented sound, video or graphics displays. The augmentation is typically implemented via various technologies, such as a headset that may be worn by the person. One or more augmented views may be presented to the person through the headset.
The augmented reality headset typically includes a wearable computer and an optical display mounted to the headset. The wearable computer may include a wireless telecommunication capability, permitting a wireless connection from the wearable computer to a server computer. Because of the wireless telecommunication capability, the augmented reality headset may be used to interact with the server computer to accomplish various tasks.
Embodiments of the disclosure are directed to a method implemented on an augmented reality electronic device, the method comprising: viewing at least a portion of a financial document with the augmented reality electronic device; identifying one or more words or phrases in a section of the financial document using the augmented reality electronic device; identifying a customer associated with the financial document; sending an indication of the one or more words or phrases and an identification of the customer to a server computer; receiving customized content from the server computer based on the one or more words or phrases and the identification of the customer; and displaying the customized content on the augmented reality electronic device.
In another aspect, a method implemented on an augmented reality electronic device comprises: viewing at least a portion of a financial document with the augmented reality electronic device; identifying one or more words or phrases in a section of the financial document using the augmented reality electronic device; identifying a customer associated with the financial document; sending the one or more words or phrases and an identification of the customer to a server computer; receiving customized content for the financial document from the server computer, the customized content based on the one or more words or phrases and the identification of the customer; displaying the customized content on the augmented reality electronic device, the customized content including one or more actionable items, the display of the customized content including at least one actionable area on a display area of the augmented reality electronic device; receiving a user input from the actionable area; sending an indication of the user input to the server computer; receiving additional content from the server computer based on the user input; and displaying the additional content on the augmented reality electronic device.
In yet another aspect, an augmented reality electronic computing device comprises: a processing unit; and system memory, the system memory including instructions which, when executed by the processing unit, cause the augmented reality electronic computing device to: view at least a portion of a financial document; identify one or more words or phrases in a section of the financial document; identify a customer associated with the financial document; send the one or more words or phrases and an identification of the customer to a server computer; receive customized content from the server computer based on the one or more words or phrases and the identification of the customer; and display the customized content.
The details of one or more techniques are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of these techniques will be apparent from the description, drawings, and claims.
The present disclosure is directed to systems and methods for using augmented reality (AR) to provide enhanced information when reading a banking statement or similar types of financial documents. As described in this disclosure, by viewing a banking statement with an AR device, additional information relevant to the banking statement is displayed on the AR device. In some implementations, the AR device is an AR headset. In other implementations, the AR device is a smart telephone or laptop computer having an AR software application. As described in this disclosure, the display of enhanced information for a banking statement is an example of virtual banking.
The systems and methods are generally described for an AR headset that may be worn by a user. The AR headset includes a wearable computer, a camera and an optical display. The wearable computer includes a wireless telecommunication capability, permitting a wireless connection between the wearable computer and one or more server computers. The wearable computer also includes voice recognition capability, permitting the user to direct the wearable computer via voice commands. In addition, in some implementations, the wearable computer also includes biometric capability such as facial recognition, retinal scan capability, finger print and voice print capability. The biometric capability permits biometric authentication of the user, as described in more detail later herein.
One type of AR headset described in this disclosure is a smart glass type of headset, similar to eyeglasses, that may be worn by the user. The user may view AR images in the glass portion of the headset. An example of a smart glass headset is Google Glass, from Google Inc. of Mountain View, Calif.
The systems and methods are also described for a smart telephone or laptop computer having an AR software application. The smart telephone and laptop computer each includes a camera that may be used to view a banking statement or other similar type of financial document. The AR software application may display information relevant to the banking statement on a display screen of the smart telephone or laptop computer. For example, the AR software application may display a graph showing a trend of assets and liabilities for a customer associated with the banking statement.
The example server computer 102 is typically a server computer at a bank or other financial institution. A wireless connection may be established between the AR headset 104 and the server computer 102. Information relating to the user's banking statement may be transmitted from server computer 102 to AR headset 104.
The example AR headset 104 includes a headset camera 106, headset electronics 108 and a headset display 110. The example headset camera 106 is a camera that is typically mounted to the headset such that a lens of the camera has the same orientation as the user's line of sight. When the user looks at an object or a scene, the camera is positioned to record or transmit what the user sees. The headset electronics 108 includes an electronic computing device with wireless capability. The wireless capability may include Bluetooth, radio frequency identification (RFID) or similar types of wireless capability. The headset electronics 108 may also include optical character recognition (OCR).
In an example implementation, the user may display the banking statement 112 on the mobile electronic device 114. As discussed in more detail later herein, the banking statement 112 may be displayed via a user interface of a mobile banking software application on the mobile electronic device 114. A user wearing the AR headset 104 may view the banking statement 112 via the headset camera 106. Alternatively, the AR headset 104 may view a hard copy of the banking statement 112.
When the user focuses the AR headset 104 on a specific section of the banking statement 112, the headset camera 106 captures one or more keywords or phrases in the section of the banking statement 112. The keywords are specific words that relate to banking or financial transactions. Some example keywords that may be captured include account, balance, asset, liability, stock, transaction, and statement. More, fewer or different keywords may be captured. The keywords may be captured alone or in combination with other words, for example “mutual fund.”
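The keyword capture described above can be sketched as follows. This is a minimal illustration, assuming the scanned section is already available as OCR text; the keyword list is drawn from the examples in this disclosure, and the function name is an assumption, not part of the disclosed system:

```python
# Illustrative sketch: extract banking keywords and two-word phrases
# from OCR text captured by the headset camera. The keyword list is
# an assumption based on the examples given in this disclosure.
BANKING_KEYWORDS = {
    "account", "balance", "asset", "liability",
    "stock", "transaction", "statement",
}
BANKING_PHRASES = {"mutual fund"}

def capture_keywords(ocr_text):
    """Return banking keywords and phrases found in a scanned section."""
    words = [w.strip(".,;:") for w in ocr_text.lower().split()]
    found = set(words) & BANKING_KEYWORDS
    # Keywords may also be captured in combination with other words,
    # for example the phrase "mutual fund".
    text = " ".join(words)
    found |= {p for p in BANKING_PHRASES if p in text}
    return sorted(found)
```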
In some implementations, the headset camera 106 may capture other identifiers in addition to keywords or phrases. For example, some financial documents may include built-in identifiers that may provide augmented content. One standard identifier that may be included in a financial document is a QR (quick response) code. Other identifiers may be included that may be recognized by an AR device, whereupon AR content may be provided. In this disclosure, example implementations are described for capturing keywords and phrases. Operations implemented using the keywords or phrases may also be applicable to any other identifiers that may be captured.
The one or more keywords, an account identification number and an authentication indication for the user are sent via a wireless connection to server computer 102. The account identification number identifies a customer associated with the banking statement 112. The authentication verifies that the wearer of the AR headset 104 either is the customer associated with the banking statement 112 or is authorized to view the banking statement 112.
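The message sent to server computer 102 can be sketched as below. The field names and the JSON encoding are assumptions for illustration; the disclosure specifies only that the keywords, an account identification number and an authentication indication are sent:

```python
# Illustrative sketch of the request sent over the wireless connection
# to the server computer. Field names and encoding are hypothetical.
import json

def build_statement_request(keywords, account_id, authenticated):
    payload = {
        "keywords": list(keywords),           # captured from the viewed section
        "account_id": account_id,             # identifies the statement's customer
        "auth_verified": bool(authenticated), # wearer authenticated on the device
    }
    return json.dumps(payload)
```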
When the user is authenticated and when server computer 102 processes the keywords, financial information related to the section of the document is displayed on the headset display 110. In some implementations, the financial information is displayed such that when the banking statement 112 is viewed via the AR headset 104, the financial information overlays the banking statement 112 on the headset display 110. In other implementations, the financial information is displayed on the headset display 110 independently of the banking statement 112.
The user may be authenticated in one or more ways. One way of authenticating the user is via biometric authentication, typically by one or more of facial recognition, retinal scan, finger print scan or voice print, as discussed in more detail later herein. Another way of authenticating the user is by having the user utter a personal identification number (PIN) that may be assigned to the user for AR applications. In some implementations, a combination of facial recognition and a PIN may be used.
In some implementations, the account number in combination with an indication of user authentication may be sufficient authorization for the server computer 102 to send the financial information related to the identified section of the document. The account number may be obtained by the AR headset 104 from the banking statement. In some implementations, the PIN is correlated with the user's account number. In these implementations, once the user is authenticated, server computer 102 uses the PIN to access a proper account number for the user.
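The PIN-to-account correlation described above can be sketched as a simple lookup, performed only after the user is authenticated. The mapping table, account identifiers and function name are hypothetical:

```python
# Illustrative sketch: the server correlates an AR PIN with the
# customer's account number once the user is authenticated.
# The mapping data below is hypothetical.
PIN_TO_ACCOUNT = {"4821": "ACCT-001", "7730": "ACCT-002"}

def account_for_pin(pin, authenticated):
    """Return the account number for an authenticated user's PIN, else None."""
    if not authenticated:
        return None
    return PIN_TO_ACCOUNT.get(pin)
```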
Each time the user focuses the AR headset 104 onto a different section of the banking statement, keywords obtained from these sections of the banking statement may be used by server computer 102 to send customized financial information related to these sections to the AR headset 104. Examples of customized financial information that may be displayed include a graph showing a balance trend of assets and liabilities for the user, marketing information such as information about a bank credit card rewards program, a graph showing performance data for a stock or mutual fund, and links to educational videos regarding personal banking, etc. Other types of financial information may be displayed.
In some implementations, the AR headset 104 may process keywords, phrases and other identifiers independently of the server computer 102. In these implementations, instead of sending the keywords and phrases to the server computer 102, only the customer identification and authentication information may be sent to the server computer 102. In some implementations, an indication of the keywords and phrases is sent to the server computer 102 instead of the actual keywords and phrases. In some implementations, the customized financial information may be generated by an AR software application on the AR headset 104 or other AR device.
In an example implementation, the camera 204 on the mobile electronic device 202 focuses on the banking statement 208 and displays the banking statement 208 on the display 206. In the example system 200, the banking statement 208 is a hard copy banking statement. When the camera zooms in on a specific section of the banking statement 208, one or more keywords in the section of the banking statement 208 are scanned into the mobile electronic device 202. In some implementations, the AR software application is activated when the banking statement 208 is scanned. In other implementations, the AR software application is activated when the mobile electronic device 202 is turned on.
The keywords and account information are sent to server computer 102. The account information corresponds to an account associated with the banking statement 208. The account number is known when the user logs on to a mobile banking software application on the mobile electronic device 202 and displays the banking statement 208 on the mobile electronic device 202.
Based on the one or more keywords, corresponding to the section of the banking statement 208 that is scanned, the server computer 102 sends customized financial information to the mobile electronic device 202. The customized financial information may include customized content, a message or a combination of content and a message. The content and message are customized based on an evaluation of the keywords. For example, when the keywords indicate that the section of the banking statement 208 is related to asset allocation, the server computer 102 may send information to the mobile electronic device 202 regarding a current asset allocation for the user. The server computer may also send a message, for example from a virtual banker, as discussed later herein.
In some implementations, the customized financial information is displayed as an overlay of the banking statement 208. The overlay may be displayed in conjunction with the camera 204 view of the banking statement 208 on the mobile electronic device 202. For example, with the overlay, the user may be able to view both the banking statement 208 and the customized financial information. In this way the customized financial information augments data in the banking statement 208. In other implementations, the customized financial information may comprise a new display that replaces the banking statement 208 on the mobile electronic device 202.
When the view accounts 302 button is activated, a banking statement for the user may be displayed on the mobile electronic device 202. When the user views the banking statement with an AR headset, for example with AR headset 104, the user may focus on a specific section of the banking statement with the AR headset. In some implementations, focusing on a specific section of the banking statement with the AR headset automatically causes a camera on the AR headset, for example headset camera 106, to capture one or more keywords from the section of the banking statement focused on by the AR headset. As discussed, the keywords, the user's online banking account number and an indication of authentication for the user are sent to server computer 102. Customized financial information relating to the section of the banking statement is then sent to the AR headset 104.
As discussed, the customized financial information may be displayed on a display device of AR headset 104, for example on headset display 110. In some implementations, the customized financial information is displayed as an overlay of the banking statement. When displayed as an overlay, the customized financial information occupies a similar space as the banking statement, so that when the user views the banking statement with the AR headset 104, the customized financial information is overlaid on the banking statement as viewed with the AR headset 104. In other implementations, for example when the AR device is the mobile electronic device 202, the customized financial information may be displayed on the display 206 as a separate screen, so that the display of the customized financial information blocks out the banking statement when displayed. As discussed earlier herein, the customized financial information may also be displayed in conjunction with the camera 204 as an overlay on the mobile electronic device 202.
The example dialog area 402 may be displayed automatically by the AR software application based on an evaluation of the section of the banking statement at the server computer 102. The evaluation makes use of business rules in an algorithm used by an AR software application on server computer 102. For example, an evaluation of the keywords sent to server computer 102 may indicate that it may be useful to display a balance trend of assets and liabilities for the user. Based on this evaluation using the business rules, the AR software application may display dialog area 402 on the headset display 110.
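The business-rules evaluation described above can be sketched as a rule table mapping captured keywords to a suggested dialog. The rule set and action names are assumptions derived from the examples in this disclosure:

```python
# Illustrative sketch of business rules mapping captured keywords to a
# suggested dialog, e.g. offering a balance-trend graph. The rules and
# action names are hypothetical.
RULES = [
    ({"asset", "liability"}, "offer_balance_trend"),
    ({"stock"}, "offer_stock_performance"),
    ({"transaction"}, "offer_rewards_program"),
]

def evaluate_keywords(keywords):
    """Return the first rule action whose required keywords all appear."""
    kws = set(keywords)
    for required, action in RULES:
        if required <= kws:  # all required keywords were captured
            return action
    return None
```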
In the example dialog area 402, a photo 404 of a virtual banker may be displayed along with a message 406 from the virtual banker. In this example, the message asks the user whether the user would be interested in viewing the balance trend. A yes button 408 and a no button 410 are displayed along with the message 406. When the user selects the yes button 408, the example graphic 412 is displayed. In some implementations, the user may select the yes button 408 by touching the yes button 408 area on the display 400. In other implementations, the user may simply utter the word “yes.”
The graphic 412 includes graphs showing a trend over time for the assets and liabilities of the user.
The call button 414 is an example of how the customized financial information displayed on the AR headset 104 may become actionable. In this case, the action is to call a bank employee, for example a personal banker.
The example dialog area 502 includes a photo 504 of a virtual banker and a message 506 from the virtual banker. The dialog area 502 also includes buttons corresponding to actions the user may take based on the message 506. In this example, the message 506, based on the keywords and other information available at the server computer 102, indicates that the user's credit card has been inactive for several months. The message 506 further includes an inquiry as to whether the user would be interested in viewing details about a bank credit card rewards program.
The dialog area 502 includes three buttons corresponding to actions the user may take. The buttons include a details button 508, an appointment button 510 and a call button 512. When the details button 508 is selected, a hyperlink is displayed for a website that may provide information about the bank credit card rewards program. When the appointment button 510 is selected, a hyperlink is displayed for a website that can be used to make an appointment with a personal banker. When the call button 512 is selected, a dialog box is displayed that permits activation of a telephone call to a bank employee, typically a personal banker.
In some implementations, selecting one of the details button 508, appointment button 510 and call button 512 results in a display of information pertaining to the bank credit card rewards program, the appointment with the personal banker and the telephone call to the bank employee, respectively. In other implementations, selection of any one of the details button 508, appointment button 510 and call button 512 results in a display of information area 514, providing information regarding each of the bank credit card rewards program, the appointment with the personal banker and the telephone call to the bank employee.
The example information area 514 includes a details area 516, an appointments area 518 and a call area 520. The example details area 516 includes a hyperlink for obtaining information for the bank credit card rewards program. The example appointments area 518 includes a hyperlink for the website that can be used to make an appointment with the personal banker. The example call area 520 displays the dialog box that permits activation of the telephone call to the bank employee, typically a personal banker.
The example dialog area 602 includes a photo 604 of the virtual banker and a message 606 from the virtual banker. The dialog area 602 also includes buttons corresponding to actions the user may take based on the message 606. In this example, the message, based on the keywords and other information available at the server computer 102, asks the user whether the user would like to see 90-day pricing for either or both of two stocks. For this example, the section of the banking statement viewed by the AR device may have indicated that the user owns the two stocks and the keywords may have included stock market symbols corresponding to the two stocks.
One button 608 corresponds to example symbol BGF for a first stock and a second button 610 corresponds to example symbol JAMS for a second stock. When either button 608 or button 610 is selected, the information area 612 is displayed. The example information area 612 includes one graph showing 90-day performance of the BGF stock and another graph showing 90-day performance of the JAMS stock.
At operation 702, a financial document is viewed by a user using an AR device. In a typical implementation, the financial document is displayed on a mobile electronic device, for example mobile electronic device 114, using a banking financial application on the mobile electronic device 114. In this implementation, the AR device is an AR headset device, for example AR headset device 104. In another implementation, the financial document may be a hard copy banking statement and the AR device is the mobile electronic device 202. In this implementation, the mobile electronic device 202 includes an AR software application.
In some implementations, mobile electronic device 114 and mobile electronic device 202 are the same physical device. In other implementations they are separate devices. For example, in some implementations mobile electronic device 114 may include a mobile financial banking software application but not an AR software application. In some implementations, mobile electronic device 202 may include both a mobile financial banking software application and an AR software application. Other combinations of software applications are possible.
At operation 704, the user is authenticated. Authentication comprises verifying an identity of the user of the AR device and determining that the user of the AR device is authorized to view the financial document. When the AR device is the AR headset device 104, authentication typically is done by a biometric authentication method, such as facial recognition, retinal scan, finger print scan or voice print. When the AR device is the mobile electronic device 202, authentication is typically done via the use of a password or PIN on the mobile electronic device 202. In some implementations, when the AR device is the mobile electronic device 202, authentication may also be done via biometric authentication, using the AR software application on the mobile electronic device 202.
At operation 706, one or more keywords or phrases are identified in a section of the financial document. The AR device focuses on a specific section of the financial document and the section of the financial document is scanned by the AR device. For example, when the AR device is the AR headset device 104, when the user focuses headset camera 106 on the specific section of the financial document, that section of the financial document is scanned by headset camera 106. The scanned section is analyzed using an AR software application running on the AR headset device 104. The AR software application identifies the one or more keywords or phrases. When the AR device is the mobile electronic device 202, the camera 204 on the mobile electronic device 202 may be used to focus on a section of a hard copy financial document. The section of the hard copy financial document is scanned in using the AR software application running on the mobile electronic device 202. The scanned section is analyzed using the AR software application running on the mobile electronic device 202 and one or more keywords or phrases from the scanned section of the document are identified.
At operation 708, the identified keywords and phrases and an indication of user authentication are sent to server computer 102. Server computer 102 includes an AR software application program that includes rules for analyzing the keywords and phrases. The rules are used to look up customer information related to the keywords and phrases. The customer information may be stored on server computer 102 or a data store or other server computer accessible from server computer 102. The AR software application program on server computer 102 determines customized content that may be displayed to the user to augment information on the financial document. For example, as discussed earlier, the customized information may include a graph showing a balance trend or an asset allocation for the user.
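The server-side lookup at operation 708 can be sketched as below. This is an assumption-laden illustration: the data store, account identifiers and field names are hypothetical, and the disclosure does not specify how customer information is keyed:

```python
# Illustrative sketch: the server looks up customer data relevant to
# the identified keywords and assembles customized content. The data
# store contents and field names are hypothetical.
CUSTOMER_DATA = {
    "ACCT-001": {"asset": 52000, "liability": 18000},
}

def customized_content(account_id, keywords):
    """Assemble content limited to figures matching the captured keywords."""
    data = CUSTOMER_DATA.get(account_id, {})
    relevant = {k: v for k, v in data.items() if k in set(keywords)}
    return {"account_id": account_id, "figures": relevant}
```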
At operation 710, customized information is received at the AR device from the server computer 102 and at operation 712, the customized information is displayed on the AR device. When the AR device is AR headset 104, the customized information is displayed on the headset display 110. In some implementations the customized information is displayed as an overlay of the financial document when the financial document is viewed via AR headset 104. In other implementations, the customized information is displayed on the headset display 110 independently of the financial document.
When the AR device is mobile electronic device 202, the customized financial information is displayed on the mobile electronic device 202. In some implementations, the customized financial information may be displayed as an overlay of the financial document, as viewed by the camera 204 on the mobile electronic device 202. In other implementations, the customized financial information may be displayed within a financial application display screen on the mobile electronic device 202.
At operation 714, the user logs off from the user's financial account. When the AR device is the AR headset 104, the user may log off by simply looking away from the financial document for greater than a predetermined period of time, for example 30 seconds. When the AR device is the mobile electronic device 202, the user may log off by turning off the camera 204.
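The look-away logoff rule at operation 714 can be sketched with a simple timestamp comparison. The 30-second value comes from the example above; the function name and timestamp representation are assumptions:

```python
# Illustrative sketch of the look-away logoff rule: log the user off
# once the financial document has been out of the headset's view for
# longer than a predetermined period (30 seconds in the example above).
LOOK_AWAY_LIMIT = 30.0  # seconds

def should_log_off(last_seen_document, now):
    """True once the user has looked away longer than the limit."""
    return (now - last_seen_document) > LOOK_AWAY_LIMIT
```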
At operation 804, the biometric information for the user obtained at operation 802 is compared with previously obtained biometric information for the user. Typically, when the user is assigned an AR device, a biometric profile is compiled for the user and stored on the AR device. The profile may include one or more of a facial profile, a retinal profile, a voice print and a finger print.
At operation 806, a determination is made as to whether the biometric information obtained at operation 802 matches the previously obtained biometric information for the user. When a determination is made that there is a match, at operation 808, the user is designated as being authenticated.
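The match determination at operation 806 can be sketched as a feature-vector comparison against the stored biometric profile. Real biometric matchers are model-specific; the Euclidean-distance comparison and threshold below are assumptions for illustration only:

```python
# Illustrative sketch of operation 806: compare freshly captured
# biometric features against the previously stored profile. The
# distance metric and threshold are hypothetical.
import math

MATCH_THRESHOLD = 0.5  # hypothetical distance threshold

def biometric_match(captured, stored, threshold=MATCH_THRESHOLD):
    """Return True when the feature vectors are within the threshold."""
    distance = math.sqrt(sum((c - s) ** 2 for c, s in zip(captured, stored)))
    return distance <= threshold
```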
When a determination is made that there is not a match, at operation 810, a message is displayed on the headset display 110 indicating that there is an authentication failure and that the current transaction (for example displaying payment card financial information on the AR device) has been ended.
At operation 812, the current transaction is ended.
The mass storage device 914 is connected to the CPU 902 through a mass storage controller (not shown) connected to the system bus 922. The mass storage device 914 and its associated computer-readable data storage media provide non-volatile, non-transitory storage for the AR headset 104. Although the description of computer-readable data storage media contained herein refers to a mass storage device, such as a hard disk or solid state disk, it should be appreciated by those skilled in the art that computer-readable data storage media can be any available non-transitory, physical device or article of manufacture from which the AR headset 104 can read data and/or instructions.
Computer-readable data storage media include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable software instructions, data structures, program modules or other data. Example types of computer-readable data storage media include, but are not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other solid state memory technology, CD-ROMs, digital versatile discs (“DVDs”), other optical storage media, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the AR headset 104.
According to various embodiments of the invention, the AR headset 104 may operate in a networked environment using logical connections to remote network devices through the network 920, such as a wireless network, the Internet, or another type of network. The AR headset 104 may connect to the network 920 through a network interface unit 904 connected to the system bus 922. It should be appreciated that the network interface unit 904 may also be utilized to connect to other types of networks and remote computing systems. The AR headset 104 also includes an input/output controller 906 for receiving and processing input from a number of other devices, including a touch user interface display screen, or another type of input device. Similarly, the input/output controller 906 may provide output to a touch user interface display screen or other type of output device.
As mentioned briefly above, the mass storage device 914 and the RAM 910 of the AR headset 104 can store software instructions and data. The software instructions include an operating system 918 suitable for controlling the operation of the AR headset 104. The mass storage device 914 and/or the RAM 910 also store software instructions, that when executed by the CPU 902, cause the AR headset 104 to provide the functionality of the AR headset 104 discussed in this document. For example, the mass storage device 914 and/or the RAM 910 can store software instructions that, when executed by the CPU 902, cause the AR headset 104 to display received financial data on the display screen of the AR headset 104.
Although various embodiments are described herein, those of ordinary skill in the art will understand that many modifications may be made thereto within the scope of the present disclosure. Accordingly, it is not intended that the scope of the disclosure in any way be limited by the examples provided.
20130222369 | Huston et al. | Aug 2013 | A1 |
20130228615 | Gates et al. | Sep 2013 | A1 |
20130229261 | Gates et al. | Sep 2013 | A1 |
20130232048 | Corner | Sep 2013 | A1 |
20130282542 | White | Oct 2013 | A1 |
20140012691 | Hanson et al. | Jan 2014 | A1 |
20140075528 | Matsuoka | Mar 2014 | A1 |
20140147004 | Uchida | May 2014 | A1 |
20140181741 | Apacible et al. | Jun 2014 | A1 |
20140236789 | Caldwell | Aug 2014 | A1 |
20140244266 | Brown | Aug 2014 | A1 |
20140298235 | Caldwell | Oct 2014 | A1 |
20140341441 | Slaby | Nov 2014 | A1 |
20140372427 | Lehmann et al. | Dec 2014 | A1 |
20150012426 | Purves et al. | Jan 2015 | A1 |
20150127541 | Just et al. | May 2015 | A1 |
20160070581 | Soon-Shiong | Mar 2016 | A1 |
Foreign Patent Documents

Number | Date | Country |
---|---|---|
2876928 | Jan 2014 | CA |
2006-99445 | Apr 2006 | JP |
10-2004-0026913 | Apr 2004 | KR |
10-2008-0042374 | May 2008 | KR |
10-2011-0078913 | Jul 2011 | KR |
10-2011-0136457 | Dec 2011 | KR |
Other Publications

Entry |
---|
Hollerer, T.H., “User Interfaces for Mobile Augmented Reality Systems,” Ph.D. dissertation, Columbia University (2004). |
Augmented Reality, Tommytoy.typepad.com, https://web.archive.org/web/20130901224748/http://tommytoy.typepad.com/tommy-toy-pbt-consultin/augmented-reality/, 69 pages (Aug. 6, 2013). |
Bass, T., “Dress Code,” http://archive.wired.com/wired/archive/6.04/wearables_pr.html, 10 pages (Copyright 1993-2004). |
Bray, A., “Google Glass will Change Your Branches,” http://www.americanbanker.com/bankthink/google-glass-will-change-your-branches-1057312-1.html, 6 pages (Mar. 7, 2013). |
Elash, A. et al., “Canadian casinos, banks, police use facial-recognition technology,” The Globe and Mail, http://www.theglobeandmail.com/news/national/time-to-lead/canadian-casinos-banks-police-use-facial-recognition-technology/article590998/, 4 pages (Sep. 6, 2012). |
Harper, S. et al., “Sentinel: Universal Access to Ambient Devices,” http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.59.6168&rep=rep1&type=pdf, 5 pages (2003). |
Leiva-Gomez, M., Cameras Scanning Credit Cards? Is That Even Secure?, http://www.mobilecommerceinsider.com/topics/mobilecommerceinsider/articles/344823-cameras-scanning-credit-cards-that-even-secure.htm, 3 pages (Jul. 8, 2013). |
Lotte Credit Card Releases Asia's First Financial Mobile App with Augmented Reality for iPhone and Android Devices, PRWeb, http://www.prweb.com/releases/worklight/mobile-platform/prweb5210334.htm, 3 pages (Mar. 30, 2011). |
Mok, S. et al., “Addressing Biometrics Security and Privacy Related Challenges in China,” Proceedings of the International Conference of the Biometrics Special Interest Group (BIOSIG), 8 pages (Sep. 6-7, 2012). |
Prindle, D., “Best Augmented Reality Apps,” https://web.archive.org/web/20130407024439/http://www.digitaltrends.com/mobile/best-augmented-reality-apps/, 23 pages (Jan. 29, 2013). |
Salo, M. et al., “Consumer value of camera-based mobile interaction with the real world,” Pervasive and Mobile Computing, vol. 9, pp. 258-268 (2013). |
Sterling, G., “Mobile Credit-Card Scanning Should be Ubiquitous,” http://internet2go.net/news/local-search/credit-card-scanning-should-be-ubiquitous-mobile, 2 pages (Jul. 29, 2013). |
Tatton, E., “Google Glass—Draft Document,” https://prezi.com/kfvacljxv7fg/google-glass-draft-document/, 7 pages (May 6, 2013). |
U.S. Appl. No. 14/143,633, filed Dec. 30, 2013 entitled “Augmented Reality Enhancements for Financial Activities”. |
U.S. Appl. No. 14/143,658, filed Dec. 30, 2013 entitled “Augmented Reality Enhancements for Financial Activities”. |
U.S. Appl. No. 14/151,965, filed Jan. 10, 2014 entitled “Augmented Reality for Financial Budgeting and Spending”. |
Yamashita, A. et al., “Assisting system of visually impaired in touch panel operation using stereo camera,” 18th IEEE International Conference on Image Processing (ICIP), http://sensor.eng.shizuoka.ac.jp/˜yamasita/paper/B/B067Final.pdf, 4 pages (Sep. 11-14, 2011). |
Yayla, A. et al., “An Exploration of Using Face Recognition Technologies for National Security,” Turkish Journal of Police Studies, vol. 6, No. 1-2, pp. 141-157 (2004). |