Whenever currency must be exchanged during a transaction between a customer and a provider of goods and/or services, both parties prefer to conclude the transaction as quickly as possible. Inefficiency not only slows each individual transaction but also reduces the number of transactions that can be completed over a given period of time. Such inefficiency increases dramatically when a customer pays with currency whose type and amount the provider must manually verify in order to validate and complete the transaction. Manual verification and validation take considerable time and may lead to further delays if the unsorted items handed over by the customer to satisfy the transaction contain non-currency items that the provider must examine and discard.
Some examples provide a system for currency verification and transaction validation. The system includes a processor and a memory communicatively coupled to the processor. A currency identification component stored at the memory and executed by the processor obtains an image file from an image capture device associated with a currency environment. The currency identification component detects one or more items within the image file; verifies at least one currency item of the detected one or more items; analyzes the at least one verified currency item to identify a currency type and a currency value; and generates a currency verification report based on the verified currency type and the verified currency value. The generated currency verification report is outputted from the currency identification component to a transaction system for a currency calculation. The currency identification component receives the resulting currency calculation from the transaction system and performs a currency validation action based on the received currency calculation.
Other examples provide a computer-implemented method for currency verification and transaction validation. A currency identification component obtains an image file from an image capture device associated with a currency environment. Depending on the contents of the image file, the currency identification component detects one of: no items within the image file; one or more items within the image file, of which no items are currency items; or one or more items within the image file, of which at least one item is a currency item. When one or more items are detected, the currency identification component verifies at least one currency item of the detected one or more items; analyzes the at least one verified currency item to identify a currency type and a currency value; and generates a currency verification report based on the verified currency type and the verified currency value. The currency identification component outputs the generated currency verification report to a transaction system for a currency calculation and receives the currency calculation from the transaction system. Based on the received currency calculation, the currency identification component performs a currency validation action.
Still other examples provide one or more computer storage media, having computer-executable instructions for currency verification and transaction validation that, when executed by a computer, cause the computer to perform operations. These operations comprise obtaining, by a currency identification component implemented on a processor, an image file from an image capture device associated with a currency environment; detecting one or more items within the image file; verifying at least one currency item of the detected one or more items; analyzing the at least one verified currency item to identify a currency type and a currency value; and generating a currency verification report based on the verified currency type and the verified currency value. These operations further comprise outputting the generated currency verification report from the currency identification component to a transaction system for a currency calculation; and the currency identification component receiving the currency calculation from the transaction system. Based on the received currency calculation, the currency identification component performs a currency validation action.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
Corresponding reference characters indicate corresponding parts throughout the drawings.
Referring to the figures, examples of the disclosure provide transaction validation through currency verification with computer vision and machine learning functionality. Currency verification and transaction validation operations may be performed on captured images obtained from image capture device(s) associated with the currency environment. The images may contain mixed collections of non-currency items and currency items, and the generated output of such operations may be delivered via various output devices, including an augmented reality (AR) display, an audio-only output, and/or another user interface.
The elements described herein operate in an unconventional manner to allow for partial, complete, or near-complete automation of currency verification and transaction validation during a transaction between a provider of goods and/or services and a customer. Thus, the disclosure facilitates conclusion of the transaction as efficiently as possible, which in turn increases the number of transactions that may be completed in a given unit of time. Delays that might otherwise arise from the provider having to manually sort and verify the items given over by the customer to satisfy the transaction are eliminated by the disclosure's ability to rapidly and accurately verify the presence and value of currency, identify and require the removal of non-currency items, and prompt the customer and provider to take whatever actions are necessary to deliver sufficient currency to complete the transaction and, if necessary, deliver change to the customer. When an item is not recognized as currency, the customer or provider may override the currency verification result and indicate that the item is to be treated as a currency item with a specified value. Some examples of the disclosure provide video analytics that verify, in real time, that the correct amount and type of currency has been given by the customer and that the correct amount of change has been returned by the provider.
Examples of the disclosure also improve transaction accuracy by quickly identifying foreign currency or counterfeit currency that cannot be used to satisfy the transaction and generating a notification for its removal. Because examples of the disclosure are incorporated into various types of systems with a variety of output features, it is possible to incorporate AR features to increase efficiency for fully sighted providers and/or customers, or to utilize an audio output or braille display to allow visually impaired providers and/or customers to accurately and efficiently complete the transaction. Because examples of the disclosure are configured to instruct the provider on both what types and amounts of currency must be requested from the customer to satisfy the transaction and what types and amounts of currency must be returned to the customer when the customer has overpaid, the disclosure may also function as an efficient teaching tool for trainees learning how to quickly and efficiently make change during a transaction.
Referring again to
The currency environment 180 may also be referred to as a scene, viewing range, or picture. The currency environment 180 refers to the portion of the real world about which the system 100 may receive information in order to perform currency verification and transaction validation. In this example, that information is delivered in the form of an image file from at least one image capture device associated with the currency environment 180 and capable of capturing still or video images of the currency environment 180.
In some examples, the computing device 102 has at least one processor 106 and a memory 108. The computing device 102 may also include a user interface component 110.
The processor 106 includes any quantity of processing units and is programmed to execute the computer-executable instructions 104. The computer-executable instructions 104 may be performed by the processor 106 or by multiple processors within the computing device 102 or performed by a processor external to the computing device 102. In some examples, the processor 106 is programmed to execute instructions such as those illustrated in the figures (e.g.,
The computing device 102 further has one or more computer readable media such as the memory 108. The memory 108 includes any quantity of media associated with or accessible by the computing device 102. The memory 108 may be internal to the computing device 102 (as shown in
The memory 108 stores data, such as one or more applications. The applications, when executed by the processor 106, operate to perform functionality on the computing device 102. The applications may communicate with counterpart applications or services such as web services accessible via a network 112. For example, the applications may represent downloaded client-side applications that correspond to server-side services executing in a cloud.
In other examples, the user interface component 110 includes a graphics card for displaying data to the user and receiving data from the user. The user interface component 110 may also include computer-executable instructions (e.g., a driver) for operating the graphics card. Further, the user interface component 110 may include a display (e.g., a touch screen display or natural user interface) and/or computer-executable instructions (e.g., a driver) for operating the display. The user interface component 110 may also include one or more of the following to provide data to the user or receive data from the user: speakers 170, a sound card, a camera, a microphone, a vibration motor, one or more accelerometers, a BLUETOOTH® brand communication module, global positioning system (GPS) hardware, and a photoreceptive light sensor. For example, the user may input commands or manipulate data by moving the computing device 102 in a particular way.
The network 112 is implemented by one or more physical network components, such as, but without limitation, routers, switches, network interface cards (NICs), and other network devices. The network 112 may be any type of network for enabling communications with remote computing devices, such as, but not limited to, a local area network (LAN), a subnet, a wide area network (WAN), a wireless (Wi-Fi) network, or any other type of network. In this example, the network 112 is a WAN, such as the Internet. However, in other examples, the network 112 is a local or private LAN.
In some examples, the system 100 optionally includes a communications interface component 114. The communications interface component 114 includes a network interface card and/or computer-executable instructions (e.g., a driver) for operating the network interface card. Communication between the computing device 102 and other devices, such as but not limited to a user device 116 and/or one or more image capture device(s) 118, may occur using any protocol or mechanism over any wired or wireless connection. In some examples, the communications interface component 114 is operable with short range communication technologies such as by using near-field communication (NFC) tags.
The user device 116 represents any device executing computer-executable instructions 104 that is associated with a user 128. The user device 116 may be implemented as a mobile computing device, such as, but not limited to, a wearable computing device, a mobile telephone, laptop, tablet, computing pad, netbook, gaming device, and/or any other portable device. The user device 116 includes at least one processor and a memory. The user device 116 may also include a user interface component 144. In this example, the user device 116 may be an AR headset or other mobile computing device capable of generating an AR overlay over an image captured from an image capture device.
The image capture device(s) 118 are one or more devices for generating image file(s) 127 associated with the currency environment 180. The image capture device(s) 118 may be communicatively coupled to the network 112. An image capture device 118 may include a video camera and/or a still image camera, a set of video cameras and/or still image cameras, and/or a depth sensor for generating an image file 127 of at least one item in a plurality of detected items 152. In some examples, a set of image capture device(s) 126, which is functionally interchangeable with the image capture device(s) 118, is communicatively coupled directly to the computing device 102, without the network 112 being interposed between them. Unless otherwise stated, any description in this disclosure of the function of the image capture device(s) 118 applies equally to the set of image capture devices 126, and any description in this disclosure of the function of the set of image capture device(s) 126 applies equally to the image capture device(s) 118.
The system 100 may optionally include a data storage device 150 for storing data, such as, but not limited to, a plurality of image files 151; the plurality of detected items 152; a plurality of verified currency items 153; a plurality of currency calculations 154; a set of machine learning inputs 155; and a plurality of verification files 156. The set of machine learning inputs 155 may include, but is not limited to, training data, user preferences, historical transaction data, and a set of weighted selection criteria. The types and uses of the set of machine learning inputs 155 are discussed in further depth in the discussion of
The data storage device 150 may include one or more different types of data storage devices, such as, for example, one or more rotating disk drives, one or more solid state drives (SSDs), and/or any other type of data storage device. The data storage device 150 in some non-limiting examples includes a redundant array of independent disks (RAID) array. In other examples, the data storage device 150 includes a database.
The data storage device 150 in this example is included within the computing device 102 or associated with the computing device 102. In other examples, the data storage device 150 is a remote data storage accessed by the computing device 102 via the network 112, such as a remote data storage device, a data storage in a remote data center, or a cloud storage.
The data storage device 150 in some non-limiting examples is utilized to aggregate data together for currency verification and transaction validation. The aggregated data may include the plurality of image files 151, the plurality of detected items 152, the plurality of verified currency items 153, the plurality of currency calculations 154, the set of machine learning inputs 155, the plurality of verification files 156, etc. This enables data utilized for currency verification and transaction validation to be aggregated into a single location for quick and efficient access by a currency identification component 122 and/or a verification and validation application 142 on the user device 116. In other examples, the data storage device 150 stores data identifying the various items within the currency environment 180. In yet other examples, such data is aggregated on a cloud storage device rather than a physical data storage associated with the currency environment 180.
The memory 108 in some examples stores one or more computer-executable components. Exemplary components include but are not limited to the currency identification component 122 and a transaction system 160. The transaction system 160 is, for example, any computing device, mobile device, dedicated hardware, or other component capable of functioning as a point of sale (POS) terminal for currency-based transactions.
The currency identification component 122 obtains the image file 127 from the image capture device 118 associated with the currency environment 180. The currency identification component 122 further detects one or more items within the image file 127, verifies at least one currency item of the detected one or more items; analyzes the at least one verified currency item to identify a currency type and a currency value; and generates a currency verification report 132 based on the verified currency type and the verified currency value. In this context, verifying an item means verifying that the currency item is of an appropriate type for the transaction presently being processed by the transaction system 160. This includes but is not limited to verifying that the item is indeed currency (either paper currency or coinage); that the item is the correct type of national currency (e.g., any United States-issued currency, but not Euros or other non-US currency, when the system 100 is configured to operate within the United States); that the item is legitimate (e.g., not counterfeit currency); and that the item is not a non-currency item (e.g., buttons, casino chips, arcade tokens, etc.).
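By way of a non-limiting sketch, and not as a definitive implementation, the detect/verify/analyze/report flow described above may be organized as follows in Python. The helper functions and data types below are hypothetical stand-ins for the computer vision and machine learning models of the disclosure and are not defined elsewhere herein.

```python
# Illustrative sketch only: the detect/verify/analyze/report flow of the
# currency identification component 122. The helper functions are hypothetical
# stand-ins for the computer vision and machine learning models described herein.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class VerifiedCurrencyItem:
    currency_type: str      # e.g., "US paper currency" or "US coin"
    value_cents: int        # value in integer cents to avoid floating-point error

@dataclass
class CurrencyVerificationReport:
    verified_items: List[VerifiedCurrencyItem] = field(default_factory=list)

    @property
    def total_cents(self) -> int:
        return sum(item.value_cents for item in self.verified_items)

def detect_items(image) -> list:
    """Hypothetical item detector; returns a region of interest per detected item."""
    return []

def verify_currency_item(item_region) -> bool:
    """Hypothetical verifier: genuine, correct national currency, not counterfeit."""
    return False

def classify_type_and_value(item_region) -> Tuple[str, int]:
    """Hypothetical classifier identifying currency type and value."""
    return ("US paper currency", 100)

def build_verification_report(image) -> CurrencyVerificationReport:
    """Detect items in the image file, verify currency items, and report type and value."""
    report = CurrencyVerificationReport()
    for region in detect_items(image):
        if verify_currency_item(region):
            report.verified_items.append(VerifiedCurrencyItem(*classify_type_and_value(region)))
    return report
```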
The currency identification component 122 outputs the generated currency verification report 132 to the transaction system 160 for a currency calculation. The currency identification component 122 receives the currency calculation from the transaction system 160 and, based on the received currency calculation, performs a currency validation action from a set of currency validation actions 134. In some examples, the currency validation action is one of generating a notification identifying additional currency items to be added in order to satisfy a current transaction, generating a notification identifying one or more currency items to be removed in order to satisfy the current transaction, generating a notification identifying both the one or more currency items to be removed and the additional currency items to be added in order to satisfy the current transaction, generating a notification validating the current transaction, or generating a notification identifying unverified items to be removed.
In other examples, the currency validation action comprises generating a notification output audibly provided via a speaker device. The speaker device may be the speakers 170 of
In some examples, the received currency calculation identifies a delta of a current transaction total and the at least one verified currency item. That is, in such examples the received currency calculation identifies the amount by which the at least one verified currency item differs from the transaction total, being either less than, greater than, or equal to the transaction total. When the delta is zero, indicating no difference between the transaction total and the at least one verified currency item, the transaction has been validated.
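As a minimal sketch of this delta-based logic, assuming amounts are tracked in integer cents, the currency calculation and the selection from the set of currency validation actions 134 might look like the following; the function names are illustrative only.

```python
# Minimal, hypothetical sketch of the delta-based currency calculation and the
# selection of a currency validation action. Amounts are integer cents.
def currency_delta_cents(transaction_total_cents: int, verified_total_cents: int) -> int:
    """Amount by which the verified currency differs from the transaction total."""
    return verified_total_cents - transaction_total_cents

def select_validation_action(delta_cents: int) -> str:
    """Map the delta onto one of the notification-style validation actions."""
    if delta_cents == 0:
        return "validated: transaction complete"
    if delta_cents < 0:
        return f"add currency: {-delta_cents} cents still due"
    return f"remove or return currency: {delta_cents} cents over the amount due"

# Example from the disclosure: $10.00 due and two verified five-dollar bills.
print(select_validation_action(currency_delta_cents(1000, 1000)))  # validated
```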
In some examples, the computing device 102 may further include or be communicatively coupled with an output component device 124. The output component device 124 may be the user interface component 110 of the computing device 102. The output component device 124 may output a portion of a currency environment 180 within a field of view (FOV) 140 of the user 128, as obtained by the image capture device 118 and including the plurality of detected items 152. This output may be a three-dimensional or two-dimensional image including real-world elements as well as virtual/graphical elements generated by the output component device 124. This output may also include or be determined by a currency validation action from the set of currency validation actions 134.
In other examples, the computing device 102 sends output to the user device 116 via the network 112. The verification and validation application 142 generates the output to be displayed on the user device 116, which may be a three-dimensional or two-dimensional image including real-world elements as well as virtual/graphical elements, based on the data received by the user device 116 from the computing device 102 via the network 112. The user interface component 144 on the user device 116 utilizes the output received from the computing device 102 to generate output displayed to the user 128. In some examples, the user device 116 may download the verification and validation application 142 from a web applications server via the network 112.
The computing device 102 performs currency verification and transaction validation using the image file(s) 127 captured from the image capture device(s) 118 associated with a currency environment 180. The user device 116 in some examples communicates with the computing device 102 or other local server on the Internet via web services application programming interface (API) management.
Verified currency items 206 have been verified by, for example, the currency identification component 122 of the system 100 of
The verification and validation application analyzes the image file(s) generated by the image capture device(s) 306 to verify the items in the plurality of items 308 and validate the associated transaction. In some examples, the verification and validation application is the verification and validation application 142 from
The image capture device(s) 328 on a different user device 304 associated with a different user 332 viewing the same plurality of items 308 generates image file(s) associated with the plurality of items 308. The user device 304 has an output component 334. The function and output of the user device 304 should be identical to those of the user device 302 given identical inputs, thus allowing the user devices 302 and 304 to be used interchangeably to conduct currency verification and transaction validation operations to complete a transaction, or to be used simultaneously to complete multiple transactions (e.g., at two separate POS locations). User preferences and/or transaction history data for the user 326 and/or the user 332 may be used to adjust operation of the user device 302 and/or the user device 304. For example, user preferences and/or transaction history data are used to make adjustments to increase the accuracy of the currency verification and transaction validation process (e.g., to adjust the settings of the image capture device(s) 306 and/or the image capture device(s) 328 to accommodate local lighting conditions for optimum image capture).
In some examples, the currency environment 320 may include a digital output device 335. The digital output device 335 may include, without limitation, a light emitting diode (LED) display, a digital display, or any other type of digital output device. The digital output device 335 outputs default content 336, including item identifiers, item pricing information, item size information, promotional information, as well as any other default content. For example, whenever the digital output device 335 is not being used to verify currency and/or validate a transaction, the default content 336 may include advertisements meant to entice potential customers to purchase additional goods and/or services from the provider.
In other examples, the user device 302 sends customized content 338 to the digital output device 335 for output to the user 326 when the user device 302 detects the digital output device 335 within a predetermined range/distance of the user device 302. In other examples, the digital output device 335 displays customized content 338 received from the user device 302 for so long as the digital output device 335 detects the user device 302 within a predetermined range of the digital output device 335. A geofence area may be utilized to define the predetermined area. When the user device 302 is within the geofence area, the digital output device 335 displays the customized content 338 received from the user device 302. For example, if the user device 302 is within a geofence area associated with the digital output device 335, the digital output device 335 pings the user device 302 to request the customized content 338. In other examples, the user device 302 automatically sends the customized content 338 to the digital output device 335 in response to detecting/entering the geofence area. The digital output device 335 displays the customized content 338 as long as the user device 302 is within the geofence area. When the user device 302 is no longer within the geofence area, the digital output device 335 resumes display of default content 336.
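As an illustrative sketch only, and assuming planar coordinates and a radius supplied by the deployment (a real system might instead rely on beacon signal strength, GPS, or NFC proximity), the geofence-driven switch between the customized content 338 and the default content 336 could be expressed as follows; the function names and values are invented for illustration.

```python
# Hypothetical sketch of the geofence behavior described above: the digital
# output device 335 shows customized content 338 only while the user device is
# inside the predetermined range, and otherwise shows default content 336.
import math

def within_geofence(device_xy, output_xy, radius_m: float) -> bool:
    """Planar distance check; a deployed system might instead use beacon RSSI or GPS."""
    dx = device_xy[0] - output_xy[0]
    dy = device_xy[1] - output_xy[1]
    return math.hypot(dx, dy) <= radius_m

def content_to_display(device_xy, output_xy, radius_m, customized_content, default_content):
    return customized_content if within_geofence(device_xy, output_xy, radius_m) else default_content

# Invented coordinates and radius purely for illustration.
print(content_to_display((1.0, 2.0), (0.0, 0.0), 5.0,
                         "last currency validation result", "default advertisement"))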
The customized content 338 may include, but is not limited to, the result of the last currency validation action performed; content associated with local network status (e.g., network errors currently preventing currency from being verified and/or transactions being validated and completed); or other system status messages. In one example, the customized content 338 may include information associated with any of the items within the currency environment 320, such as the item 310 and/or the item 314.
When the user device 302 is no longer within the predetermined range of the digital output device 335, the digital output device 335 resumes displaying the default content 336. In other examples, when the user device 302 is detected within the predetermined range of the digital output device 335, the user device 302 sends customized content 338 associated with the item 310 and/or the item 316 to the digital output device 335. The digital output device 335 outputs the customized content 338 while the user device 304 is within range of the digital output device 335 for viewing by the user 332.
The customized content 338 may be sent to the digital output device 335 from the user device 302 via the network 340. The network 340 may include a BLUETOOTH® brand network, a beacon transmitter, a LAN, a WAN, or any other type of network, such as, but not limited to, the network 112 in
Some examples of the image capture device 400 may include the set of cameras 404. Use of the set of cameras 404 allows for capturing a given currency environment from offset points of view. For example, this enables capturing images which preserve information on stereoscopic depth within the currency environment. With stereoscopic depth information thus preserved, the disclosure may be configured to use computer vision and machine learning algorithms optimized to take advantage of this information, allowing for faster and more accurate currency verification and transaction validation.
Some other examples of the image capture device 400 may include the depth sensor 406. The depth sensor 406 (which may also be called a depth camera) in some examples is a laser coupled with a traditional two-dimensional camera, such as the camera 402. The depth sensor 406 is of particular importance in modern computer vision systems optimized for classification of three-dimensional items. While computer vision systems have historically performed sufficiently without the depth sensor 406 when the subject matter was essentially two-dimensional (e.g., recognition of handwriting on a flat surface), achieving reliable, accurate classification of three-dimensional items in three-dimensional space requires the ability to sense depth. Thus, in some examples, the depth sensor 406 is coupled with the traditional two-dimensional camera 402 to record and preserve depth information corresponding to an otherwise two-dimensional image. This is a simpler, more cost effective, and easier to implement solution than the stereoscopic set of cameras 404. Many modern mobile devices already include examples of the single-lens camera 402 coupled with the depth sensor 406, thus requiring no additional specialized equipment for accurate computer vision-based three-dimensional item classification.
In the context of currency verification and transaction validation, depth information, however recorded, is of particular importance in properly classifying coin currency. Coins are often distinguishable based on an individual coin's thickness and the topographical features of either side of the coin. In some examples, depth information may also enable the proper classification of a stack of coin currency wherein some of the features of each piece of coin currency may be at least partially obscured.
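As a hedged illustration of how depth-derived measurements could feed coin classification, the sketch below matches a measured diameter and thickness against approximate published United States coin specifications. The hard-coded table and nearest-neighbour rule are illustrative assumptions only, since the disclosure contemplates trained computer vision models rather than fixed lookups.

```python
# Hypothetical sketch: classify a coin from a measured diameter and thickness,
# e.g. as recovered from a depth sensor. Dimensions below are approximate
# published U.S. coin specifications; a deployed classifier would be trained
# rather than hard-coded.
US_COINS = {
    # name: (diameter_mm, thickness_mm, value_cents)
    "dime":    (17.91, 1.35, 10),
    "penny":   (19.05, 1.52, 1),
    "nickel":  (21.21, 1.95, 5),
    "quarter": (24.26, 1.75, 25),
}

def classify_coin(diameter_mm: float, thickness_mm: float):
    """Nearest-neighbour match on (diameter, thickness); returns (name, value_cents)."""
    def distance(spec):
        d, t, _ = spec
        return (d - diameter_mm) ** 2 + (t - thickness_mm) ** 2
    name, (_, _, value) = min(US_COINS.items(), key=lambda kv: distance(kv[1]))
    return name, value

print(classify_coin(24.3, 1.8))  # -> ('quarter', 25)
```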
The plurality of items 508 includes any type of items, such as, but not limited to, the plurality of detected items 152 in
A user 514 associated with the image capture device 504 views the output of the image capture device 504 that may include a real-world image of a portion of the currency environment 500 within the FOV 502 of the user 514 or the FOV 502 of the image capture device 504. This output may also include additional information relating to the present status of the currency verification and transaction validation operations for the present transaction. The image capture device 504 may be a computing device, such as, but not limited to, the computing device 102 in
In some examples, the currency environment 500 includes one or more sensor devices (not shown) for identifying a location of the image capture device 504 within the currency environment 500. For example, the currency environment 500 may include image capture devices, beacon transmitters (not shown), beacon receivers (not shown), infrared (heat) sensors (not shown), proximity sensors (not shown), etc. The system in these examples analyzes the sensor data generated by the sensor device(s) to determine when an identified user is within proximity to a digital output device or other display area for customizing displayed content. For example, infrared (IR) sensor data is utilized for three-dimensional mapping of an area associated with the image capture device 504 to identify a location of the image capture device 504 within the currency environment 500 and/or to identify a plurality of items 508 located within a given range of the image capture device 504.
In some examples, the AR display 600 is an AR headset worn by the user. The AR display 600 displays an image of the field of view of the currency environment corresponding to the field of view of the image capture device currently in use, overlaid with various virtual display elements providing output and user interface functionality to the user. These virtual display elements may include a transaction information display 608, a notification output display 610, an AR display overlay component 612, and control(s) 614.
The transaction information display 608 provides an overlay giving information on the current transaction the user is attempting to complete. This information may include an amount due display 620 and an amount paid display 622. The amount due display 620 displays the total amount of currency necessary to validate and complete the transaction. The amount paid display 622 shows the total amount of currency the system has already recognized and verified as being present. The notification output display 610 provides an overlay displaying the notification generated by the most recently performed currency validation action. This notification gives instructions to the user to continue the currency verification and transaction validation process. Thus, the notification output display 610 may provide an additional currency required notification display 630, a removal of currency required notification display 632, a validated transaction notification display 634, or a removal of unverified items required notification display 636.
The additional currency required notification display 630 may be shown when the user must provide more currency within the currency environment to validate and complete a transaction, in response to the last currency validation action. The additional currency required notification display 630 may include detailed instructions on which currency types must be provided, and in what amount. For example, if $7.63 is required to validate and complete a transaction, the additional currency required notification display 630 may notify the user to provide one five-dollar bill, two one-dollar bills, two quarters, one dime, and three pennies.
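The denomination breakdown in the example above can be reproduced with a simple greedy decomposition over U.S. denominations, sketched below using integer cents. The denomination list and function name are illustrative assumptions, and a deployed system might also account for which denominations are actually available in the till.

```python
# Hypothetical sketch of the denomination breakdown illustrated above: a greedy
# decomposition of an amount due into U.S. bills and coins, using integer cents
# to avoid floating-point error.
US_DENOMINATIONS_CENTS = [
    ("twenty-dollar bill", 2000), ("ten-dollar bill", 1000),
    ("five-dollar bill", 500), ("one-dollar bill", 100),
    ("quarter", 25), ("dime", 10), ("nickel", 5), ("penny", 1),
]

def break_into_denominations(amount_cents: int):
    breakdown = []
    for name, value in US_DENOMINATIONS_CENTS:
        count, amount_cents = divmod(amount_cents, value)
        if count:
            breakdown.append((count, name))
    return breakdown

# $7.63 -> one five, two ones, two quarters, one dime, three pennies.
print(break_into_denominations(763))
```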
The removal of currency required notification display 632 may be shown when the user must remove currency from the currency environment to validate and complete a transaction, based on the last currency validation action. For example, if $15.00 is required to validate and complete a transaction and the user has provided a single twenty-dollar bill within the currency environment, the removal of currency required notification display 632 may notify the user to remove the twenty-dollar bill. In another example, $75.00 is required to validate and complete a transaction. If a user provides a single fifty-dollar bill and three ten-dollar bills, for a total of eighty dollars in verified currency, the removal of currency required notification display 632 may notify the user to remove only a single ten-dollar bill. The additional currency required notification display 630 may then notify the user to add a single five-dollar bill, or alternatively, five one-dollar bills.
The validated transaction notification display 634 may be shown when no further action by the user is required to validate and complete a transaction, based on the last currency validation action. For example, if $10.00 is required to complete the transaction and the user has provided two five-dollar bills within the currency environment, the validated transaction notification display 634 may notify the user that the transaction has been validated. Once this notification is delivered, the transaction is complete.
The removal of unverified items required notification display 636 may be shown when the user must remove unverified items from the currency environment to validate and complete a transaction. Such items may include currency items 604 which cannot be verified. For example, the currency items 604 may not be verified because the currency items 604 are damaged, foreign currency, or counterfeit currency. Unverified items may also include any other type of items 602.
The AR display overlay component 612 provides an overlay over all the real-world items within the field of view of the currency environment and is used to provide instructions to the user in visual form. An unverified item removal indicator 638 may be used to highlight any of the items 602 or unverified currency items 604 which the user must remove in order to continue verifying the currency and to validate and complete the current transaction. The AR display overlay component 612 may be updated in response to the notification output display 610 displaying any one of the additional currency required notification display 630, the removal of currency required notification display 632, the validated transaction notification display 634, or the removal of unverified items required notification display 636.
The control(s) 614 provide an overlay containing various user interface elements necessary to interact with the currency verification and transaction validation system. Which of the control(s) 614 are displayed may depend on the specific configuration of the currency verification and transaction validation system. In some examples, a new transaction control 640 is displayed by the control(s) 614. The new transaction control 640 cancels the current transaction without completing currency verification and transaction validation operations and readies the system to begin a new transaction. In some other examples, a verification override control 650 is displayed by the control(s) 614. The verification override control 650 allows the user to override the system when it produces false negatives, either by failing to recognize certain items 602 as verified currency items 606 or by recognizing items 602 as verified currency items 606 but assigning them the incorrect currency value. After activating the verification override control 650, the user may access certain user interface elements (not shown) enabling the user to manually indicate that the system should recognize certain items 602 as verified currency items 606 and assign the correct currency value to such items.
In some examples, an image of the content of AR display overlay component 612 indicating the state of all items 602 before and after the user completes all verification override operations is stored by the system. This image may be used, for example, for later review of the verification override event to ensure overriding the system was actually necessary and not a result of user mistake or wrongdoing. Such images may be recorded in a collection of verification files on a data storage device, such as the plurality of verification files 156 in the data storage device 150 in
The feedback 704 may include currency verification accuracy feedback, feedback associated with the efficiency of the currency validation action chosen based on the currency calculation, feedback associated with false negative identification of currency items as non-currency items or incorrect valuation of correctly identified currency items, and/or feedback associated with false positive identification of non-currency items as currency items. If the system accurately verifies currency, chooses efficient currency validation actions, and exhibits few or no false positive identifications, the feedback may be good. If the system performs inaccurate currency verifications, delivers a high number of false positives, or chooses inefficient currency validation actions, the feedback may be poor. In some examples incorporating the AR display 600, the feedback associated with false negative identification of currency items as non-currency items or incorrect valuation of correctly identified currency items is indicated by the user activating the verification override control 650 as illustrated in
The user preferences 708 may include user-selected AR display preferences. For example, the user preferences 708 may include user-selected colors for display of virtual elements within the AR display. In another example, the user preferences 708 may specify image capture optimizations to accommodate local lighting conditions in order to create the most accurate image files.
The machine learning component 700 utilizes real-time data, such as the feedback 704, to adjust weights associated with each of the currency verification criteria 702. For example, if the currency verification criteria 702 indicate the user prefers the most accurate currency verification even at the expense of verification speed, the currency verification criteria 702 are weighted to indicate that accuracy should be prioritized over performance. In another example, if the currency verification criteria 702 indicate the user wants the most accuracy in the verification of coin currency even at the expense of accuracy in verification of paper currency, the currency verification criteria 702 are weighted to indicate that accuracy of coin currency recognition should have priority, even at the expense of less accurate recognition of paper currency.
The machine learning component 700 in some examples utilizes the feedback 704 from the user and the training data 706 associated with currency items to be recognized to adjust the currency verification criteria 702 weights. In the example above, if the user frequently interacts with certain types of currency, the machine learning component 700 may generate weighted selection criteria 712 indicating that the greatest preference should be given to attempting to recognize and verify those frequently encountered types of currency.
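As one hypothetical sketch of this weighting behavior, frequently encountered currency types in the historical transaction data 710 could be assigned proportionally larger weights in the weighted selection criteria 712. The smoothing constant and sample history below are invented for illustration only.

```python
# Hypothetical sketch: derive weighted selection criteria from historical
# transaction data so that frequently encountered currency types receive
# proportionally larger weights.
from collections import Counter

def weighted_selection_criteria(historical_currency_types, smoothing: float = 1.0):
    counts = Counter(historical_currency_types)
    total = sum(counts.values()) + smoothing * len(counts)
    return {ctype: (count + smoothing) / total for ctype, count in counts.items()}

history = ["one-dollar bill", "quarter", "one-dollar bill", "five-dollar bill", "one-dollar bill"]
print(weighted_selection_criteria(history))  # the one-dollar bill is weighted highest
```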
The currency verification criteria 702, the feedback 704, the training data 706, the user preferences 708, the historical transaction data 710, and/or the weighted selection criteria 712 may in some examples be stored in the set of machine learning inputs 155 within the data storage device 150 of the computing device 102 in
In some examples, the machine learning component 700 comprises a trained regressor such as a random decision forest, directed acyclic graph, support vector machine, neural network, or other trained regressor. The trained regressor may be trained using the feedback 704 described above. Examples of trained regressors include a convolutional neural network and a random decision forest. It should further be understood that the machine learning component 700, in some examples, may operate according to machine learning principles and/or techniques known in the art without departing from the systems and/or methods described herein.
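By way of a non-authoritative sketch, a random decision forest could be fitted to labelled feature vectors (for example, descriptors derived from image files of currency and non-currency items). scikit-learn and the synthetic data below are assumptions made purely for illustration and are not prescribed by the disclosure.

```python
# Hypothetical sketch: fitting a random decision forest to labelled feature
# vectors standing in for per-item descriptors. The data is synthetic and
# the library choice is illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
features = rng.normal(size=(200, 32))      # stand-in for per-item feature descriptors
labels = rng.integers(0, 2, size=200)      # 0 = non-currency item, 1 = verified currency item

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(features, labels)
print(model.predict(features[:5]))
```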
The process begins by obtaining an image file from an image capture device associated with a currency environment at 802. A currency identification component implemented on a processor obtains the image file. The currency environment may also be referred to as a scene, viewing range, or picture. The currency environment refers to the portion of the real world about which the process may receive information in order to perform currency verification and transaction validation. In this example, that information is delivered in the form of an image file from an image capture device associated with the currency environment and capable of capturing still or video images of the currency environment. In some examples, the currency environment includes a field of view of the image capture device.
Depending on the contents of the image file, the currency identification component detects one of: no items within the image file; one or more items within the image file, of which no items are currency items; or one or more items within the image file, of which at least one item is a currency item. When one or more items are present within the image file, the one or more items are detected within the image file at 804. These items may include both currency items and non-currency items. The process verifies at least one currency item of the detected one or more items at 806 and analyzes the at least one verified currency item to identify a currency type and a currency value at 808. The process generates a currency verification report based on the verified currency type and the verified currency value at 810, and outputs the generated currency verification report to a transaction system for a currency calculation at 812.
In this context, verifying an item means verifying that the currency item is of an appropriate type for the transaction presently being processed by the transaction system. This includes but is not limited to verifying that the item is indeed currency (either paper currency or coin currency); that the item is the correct type of national currency (e.g., any United States-issued currency, but not Euros, when the process is operating within the United States); that the item is legitimate (e.g., not counterfeit currency); and that the item is not a non-currency item (e.g., buttons, casino chips, arcade tokens, etc.). The transaction system is, for example, any computing device, mobile device, dedicated hardware, or other component capable of functioning as a point of sale (POS) terminal for currency-based transactions.
The process receives the currency calculation from the transaction system at 814 and, based on the received currency calculation, performs a currency validation action at 816. In some examples, the received currency calculation identifies a delta of a current transaction total and the at least one verified currency item. That is, in such examples the received currency calculation identifies the amount by which the at least one verified currency item differs from the transaction total, being either less than, greater than, or equal to the transaction total. When the delta is zero, indicating no difference between the transaction total and the at least one verified currency item, the transaction has been validated. The process terminates thereafter.
In the examples where, after step 802, the currency identification component detects no items within the image file, or in the alternative detects no currency items within the image file, the process takes no further action and terminates immediately. No new currency verification report is generated, and no new currency verification report is sent to the transaction system. Neither a new currency calculation nor a new currency validation action is performed before the process terminates. When the process does terminate, the results and output of the last successful currency calculation and the last successful currency validation action are preserved. Thus, the state of the transaction currently being validated, as set by all the previous successful currency calculations and all the previous successful currency validation actions, is unchanged.
Such examples include scenarios where the image file is obtained while the currency environment is completely empty of items. This could indicate that the user accidentally started the process too early, or that the image capture device is improperly configured (e.g., not oriented such that the FOV of the image capture device contains the currency area). In either scenario, the user should be able to quickly determine the reason that the process terminated after step 802 and rectify the issue.
Such examples also include scenarios where the image file is obtained while the currency environment is not completely empty of items, but the currency identification component detects only non-currency items. This result may occur when a user places an assortment of unidentified items, none of which are currency items, in the currency environment; in that case, the currency identification component detects that no currency items are actually present. This result may also occur when foreign, fake, or counterfeit currency is detected within the currency environment. In this case, the system may allow for correction or may prevent the transaction from completing.
While the operations illustrated in
In some examples, performing the currency validation action based on the received currency calculation further comprises, whenever additional currency is required to validate and complete a current transaction, generating a notification identifying the additional currency items to be added in order to validate and complete the current transaction at 918 and outputting the generated notification via augmented reality to visually indicate the additional currency items to be added at 926.
In other examples, performing the currency validation action based on the received currency calculation further comprises, whenever one or more currency items must be removed to validate and complete the current transaction, generating a notification identifying one or more currency items to be removed in order to validate and complete the current transaction at 920 and outputting the generated notification via augmented reality to visually indicate the one or more currency items to be removed at 926.
In still other examples, performing the currency validation action based on the received currency calculation further comprises, whenever both removal and addition of currency are required to validate and complete the current transaction, generating a notification identifying both the one or more currency items to be removed and the additional currency items to be added in order to validate and complete the current transaction at 922, and outputting the generated notification via augmented reality to visually indicate the one or more currency items to be removed and the additional currency items to be added at 926.
In yet other examples, performing the currency validation action based on the received currency calculation further comprises, whenever a transaction has been successfully validated, generating a notification validating the current transaction at 924, and outputting the generated notification via augmented reality to visually indicate the at least one identified currency item validates and completes the current transaction at 926.
The process terminates thereafter. For legibility,
In some examples, performing the currency validation action further comprises generating an audio output at 1016. The audio output includes at least one of a notification identifying additional currency items to be added in order to satisfy a current transaction, a notification identifying one or more currency items to be removed in order to satisfy the current transaction, a notification identifying both the one or more currency items to be removed and the additional currency items to be added in order to satisfy the current transaction, a notification validating the current transaction, or a notification identifying unverified items to be removed. Unverified items may be any non-currency item within the FOV of the currency environment.
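A minimal sketch of this audio output path is shown below, assuming the notification text is composed from the delta of the currency calculation and handed to an off-the-shelf text-to-speech engine; pyttsx3 is used only as one illustrative option, and the disclosure does not name any particular speech library.

```python
# Hypothetical sketch of the audio notification output: compose notification
# text from the currency calculation delta (in integer cents) and speak it.
import pyttsx3

def speak_validation_notification(delta_cents: int) -> None:
    if delta_cents == 0:
        text = "Transaction validated."
    elif delta_cents < 0:
        text = f"Please add {-delta_cents} cents in additional currency."
    else:
        text = f"Please remove currency; {delta_cents} cents over the amount due."
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()

speak_validation_notification(0)
```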
The process terminates thereafter. The audio output used by this process may be provided via the speakers 170 of the computing device 102 in
In some examples, the process further comprises storing the output of the performed currency validation action in a verification file for an associated transaction at 1118. The verification file may be one of the plurality of verification files 156 stored in the data storage device 150 of
The process terminates thereafter. While the operations illustrated in
In some examples, the currency identification component receives an amount of currency due from the transaction system. This amount due is sufficient to satisfy and validate, and thus complete, the current transaction. The machine learning and computer vision elements of the currency identification component take an image file of the currency environment as input and verify the type and amount of currency present in the currency environment. With this information available, the currency identification component determines what follow-up action is required to validate and complete the current transaction. Such follow-up actions are primarily in the form of removing or adding specified types of currency in specified amounts and may also include removing non-currency items. Examples of the disclosure notify the user of which follow-up action to take. Once the user takes such action, the process is repeated in a continuous loop until the correct type and amount of currency has been verified and the transaction has been validated and completed.
One example of the disclosure demonstrates a conventional computing device and image capture device, for example, the computing device 102 and image capture device 118, operating in an unconventional manner. A conventional computing device generally cannot compete with the speed and accuracy of a human being in the realm of image recognition and pattern matching. In contrast, the disclosure uses computer vision and machine learning techniques to enable a computing device, when programmed as described herein, to perform image recognition and pattern matching techniques in such a way as to meet or exceed the performance of a human being. The disclosure thus has a distinct advantage over a human being in the act of verifying the amount and type of currency presented by a customer to a provider of goods and/or services to validate and complete a transaction. When implemented on a computing device, the disclosure thereby improves the functioning of the computing device.
The modular nature of the disclosure, as well as the flexibility and versatility of the machine learning capabilities that provide the core of the currency verification and transaction validation features, allow for various alternative embodiments. For example, specialized machine learning techniques are used to teach examples of the disclosure how to detect and mark counterfeit currency. Because the presence of counterfeit currency may indicate possible criminal activity by the customer, the disclosure's potential to identify counterfeit currency provides a substantial benefit to public welfare and crime reduction efforts.
Another alternative embodiment builds on the disclosure's ability to identify which types of currency items, and in what amount, are required to fully validate and complete a transaction. Examples of the disclosure featuring any type of visual output are configured to display images of the required types and/or amounts of currency items. These images may be photorealistic or stylized so long as sufficient detail is provided. This extension would potentially increase transaction efficiency, as many users are quicker to identify familiar images than to read or listen to instructions necessary to complete a task.
Yet another alternative embodiment includes a suite of disability assistance/user accessibility aids. Examples of the disclosure configured to output audio notifications for the visually impaired have been discussed elsewhere in this disclosure. Alternative examples are configured to deliver notifications during the currency verification and transaction validation process via braille display or another type of tactile output. Such examples grant visually impaired customers and providers more confidentiality and privacy during transactions than a configuration dependent on an audio output.
Still another alternative embodiment provides the ability not only to identify foreign currency, but to simultaneously determine the appropriate conversion rate to convert the foreign currency into local currency. This embodiment is of particular use in environments where the provider of goods and/or services is able to accept both local and foreign currencies to validate and complete a transaction. Such environments may include international travel hubs (e.g., airports, train stations, etc.), travel accommodations which cater to international travelers, and money exchange/money transfer services (e.g., Western Union and other wire service providers).
The disclosure is adaptable to any type of image capture device which, at a minimum, can provide a two-dimensional image file depicting the currency environment. Thus, any type of image capture device meeting this threshold may be used, from the simplest single-lens still cameras and scanners to the most advanced depth sensor-equipped cameras and stereoscopic image capture devices. This flexibility gives the disclosure's image capture device(s) the potential to be customized to achieve optimum speed and/or accuracy for a given environment and/or verification and validation task.
Examples of the disclosure using AR features are configured to use any AR device, such as a headset, as an input/output device. Alternatively, such examples are configured to use an output device such as an LCD monitor communicatively coupled with a computing device and one or more image capture devices (e.g., an overhead camera) associated with the currency environment. When a non-AR input/output device combination (e.g., the LCD monitor and separate image capture device combination) is used, the virtual elements of the AR display may be layered over the image file captured from the image capture device and displayed on the non-AR output. Thus, the AR output may be delivered even when no dedicated AR input/output hardware is in use.
Examples of the disclosure using AR features are configured to provide a visual indication via AR overlay that a particular currency item has been verified and should not be removed. For example, a green checkmark is overlaid over a verified currency item. Verified currency items might also be given a color indicator and/or a highlighting outline. Such a configuration may increase transaction efficiency: verified currency would be less likely to be accidentally removed and could be more easily located and rearranged without being accidentally mixed with non-verified currency and non-currency items which need to be removed. Examples of the disclosure using AR features may also be configured to provide a visual indication via AR overlay that a particular item must be removed. Such indicators may include an "X" overlaid over the item, a color indicator, a highlighting outline, or a strike-through line.
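Combining the non-AR output path described above with these overlay indicators, a hedged sketch of drawing such indicators directly onto a captured frame with OpenCV might look like the following. The bounding boxes are assumed to come from the item detector, and the coordinates and file names are invented for illustration.

```python
# Hypothetical sketch of the non-AR-hardware overlay path: verification
# indicators are drawn onto the captured frame with OpenCV and shown on an
# ordinary monitor. Colors are BGR: green for verified, red for remove.
import cv2
import numpy as np

def draw_indicators(frame, verified_boxes, remove_boxes):
    for (x, y, w, h) in verified_boxes:               # verified currency: green outline
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    for (x, y, w, h) in remove_boxes:                 # items to remove: red "X"
        cv2.line(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)
        cv2.line(frame, (x + w, y), (x, y + h), (0, 0, 255), 2)
    return frame

frame = np.zeros((480, 640, 3), dtype=np.uint8)       # stand-in for a captured image
annotated = draw_indicators(frame,
                            verified_boxes=[(50, 50, 120, 60)],
                            remove_boxes=[(300, 200, 80, 80)])
cv2.imwrite("annotated_frame.png", annotated)
```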
In examples of the disclosure where a customer is provided a receipt by the transaction system upon successful validation of a transaction, the captured image used to successfully verify the currency and validate the transaction may be preserved. This preserved image may then be included with the receipt. Depending on the configuration of the transaction system, the receipt may be delivered on paper (e.g., via printout) or electronically (e.g., via email).
The disclosure enables providers of goods and/or services to quickly and accurately verify that the correct amount of currency is present to complete and validate a transaction, and to confirm that the customer has received the proper amount of change in return for an excess payment.
Examples of the disclosure optimized for handling large amounts of currency at once may have particular benefits in environments, such as certain banks and casinos, where large sums of currency must be exchanged quickly and accurately.
Some examples of the disclosure are configured to verify any currency-like item that has distinct types and values within an environment, and to validate transactions based on those currency-like items. Some potential environments include providers which offer coupons, providers which accept food stamps or similar tokens which substitute for currency, casinos using a set of tokens in place of currency to securely facilitate betting, public transportation hubs which accept pre-purchased tokens as payment, and entertainment facilities whose attractions are activated via pre-purchased tokens. Other examples of the disclosure are used with coupons by visually verifying that the coupon code entered at the point-of-sale (POS) matches the code on the coupon provided by the customer. Still other examples of the disclosure are configured to verify arcade tokens or raffle tickets. Yet other examples of the disclosure are configured to verify reward tokens submitted by a user to validate a transaction for receipt of a prize. One example of a tokens-for-prize transaction includes verifying reward tokens submitted by a user to validate an exchange for a free stuffed animal.
Certain features of the disclosure herein may rely on computer vision techniques and technologies, machine learning techniques and technologies, or a combination thereof to obtain an image file from an image capture device associated with a currency environment; detect one or more items within the image file; verify at least one currency item of the detected one or more items; and analyze the at least one verified currency item to identify a currency type and a currency value. A number of such techniques and technologies may be used successfully to implement the disclosure herein, non-exhaustive examples of which are discussed in the following paragraphs.
Due to the different properties of coin currency and paper currency, different techniques and methodologies may be necessary for a single example of the disclosure to properly deal with both types of currency. The disclosure requires only that sufficiently capable computer vision and machine learning-powered currency recognition features be present and functional. The examples herein are thus provided only as illustrations, and do not represent an exclusive listing of all techniques and methodologies which may be suitable and applicable to the disclosure herein.
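As one further illustration only, the following structural sketch outlines the processing flow described above; the class, method names, and placeholder return values are hypothetical and stand in for whichever recognition techniques an implementation actually uses.

```python
from dataclasses import dataclass
from typing import Any, List

@dataclass
class VerifiedCurrencyItem:
    currency_type: str     # e.g., "USD"
    currency_value: float  # e.g., 20.00

class CurrencyIdentificationComponent:
    """Hypothetical outline of the component's processing flow."""

    def obtain_image_file(self, image_capture_device) -> Any:
        # Obtain an image file depicting the currency environment.
        return image_capture_device.capture()

    def detect_items(self, image_file) -> List[Any]:
        # Detect zero or more items within the image file (placeholder).
        return []

    def verify_and_analyze(self, detected_items) -> List[VerifiedCurrencyItem]:
        # Verify currency items and identify currency type and value (placeholder).
        return []

    def generate_report(self, verified_items: List[VerifiedCurrencyItem]) -> dict:
        # Summarize verified types and values for output to the transaction system.
        totals: dict = {}
        for item in verified_items:
            totals[item.currency_type] = totals.get(item.currency_type, 0.0) + item.currency_value
        return {"items": verified_items, "totals_by_type": totals}
```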
A Scale-Invariant Feature Transform (SIFT) algorithm may be used for recognition of both coinage and paper currency. SIFT produces distinct keypoints and feature descriptors for each item in an image captured by an image capture device and is considered one of the most robust feature extraction algorithms. SIFT is a feature selection technique dependent on the appearance of the items to be recognized at specific interest points, which are invariant to image scale and rotation. SIFT also deals effectively with changes in illumination, image noise, and minor changes in the image viewpoint. These properties make it particularly well suited for implementation on a mobile image capture device such as a smartphone. SIFT operates on grayscale image data but may take advantage of additional data available via color image capture to provide more accurate results. The SIFT algorithm may be implemented for currency recognition on a variety of platforms, including on JAVA® runtimes with the OpenCV computer vision/machine learning software library available.
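A minimal sketch of SIFT-based template matching with OpenCV, assuming an OpenCV build that exposes cv2.SIFT_create and hypothetical image file names, might look like the following; it simply counts ratio-test matches between a known currency template and the captured scene.

```python
import cv2

# Minimal sketch, assuming hypothetical input image file names.
template = cv2.imread("template_usd_20.png", cv2.IMREAD_GRAYSCALE)    # known currency template
scene = cv2.imread("currency_environment.jpg", cv2.IMREAD_GRAYSCALE)  # captured image file
assert template is not None and scene is not None, "hypothetical input images not found"

sift = cv2.SIFT_create()
kp_t, des_t = sift.detectAndCompute(template, None)
kp_s, des_s = sift.detectAndCompute(scene, None)

# Match descriptors and keep only matches that pass Lowe's ratio test.
matcher = cv2.BFMatcher(cv2.NORM_L2)
matches = matcher.knnMatch(des_t, des_s, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]

# A sufficiently large number of good matches suggests the template currency
# item appears in the captured scene.
print(f"{len(good)} good SIFT matches for the template")
```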
An example currency recognition algorithm designed particularly for paper currency recognition uses a radial basis function network to classify the currency being examined. This method requires high-resolution image capture (e.g., from a digital camera), with the resulting image being converted first into a grayscale image and then into a black-and-white image. After this conversion is complete, edges in the image are filtered using the Prewitt method and then detected using the Canny edge detection method. This image pre-processing serves to remove noise and distortions that may cause recognition errors. After the image pre-processing is complete, the image is analyzed for feature extraction. That is, a set of metadata is compiled about the image that can be used for a pattern matching analysis powered by a radial basis function neural network. Gaussian radial basis functions are among the most widely used radial basis functions for constructing such networks. A currency classifier built on top of such a radial basis function network may contain twenty-five neurons in the hidden layer. When an image is input, the recognition system calculates the correlations between the image and known currency data template images, builds and trains the neural network, and finally classifies the type of currency in the inputted image.
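The following sketch illustrates a Gaussian radial basis function network with twenty-five hidden neurons in the general manner described above; the feature vectors and class labels are random placeholders standing in for the output of the edge detection and feature extraction stage.

```python
import numpy as np

# Minimal sketch of a Gaussian radial basis function network classifier with
# 25 hidden neurons. Training data here is random placeholder data.
rng = np.random.default_rng(0)
n_features, n_classes, n_hidden = 64, 4, 25

X_train = rng.normal(size=(200, n_features))      # placeholder feature vectors
y_train = rng.integers(0, n_classes, size=200)    # placeholder currency classes

# Choose 25 training vectors as RBF centers and a shared Gaussian width.
centers = X_train[rng.choice(len(X_train), n_hidden, replace=False)]
sigma = np.mean(np.linalg.norm(X_train[:, None] - centers[None], axis=2))

def rbf_layer(X):
    # Gaussian activation of each input against each center.
    d2 = np.sum((X[:, None] - centers[None]) ** 2, axis=2)
    return np.exp(-d2 / (2 * sigma ** 2))

# Solve the output weights by least squares against one-hot class targets.
H = rbf_layer(X_train)
T = np.eye(n_classes)[y_train]
W, *_ = np.linalg.lstsq(H, T, rcond=None)

def classify(X):
    return np.argmax(rbf_layer(X) @ W, axis=1)

print(classify(X_train[:5]), y_train[:5])
```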
Another example artificial neural network-based system, designed particularly for coin recognition, separates the entire process into seven distinct operations: acquisition of a color image from an image capture device; conversion of the color image to grayscale; removal of shadows from the image; cropping and trimming the image; generation of a pattern averaged image; generation of a feature vector which is passed as input to a trained neural network; and delivery of a recognition result to the user based on the output of the neural network. The shadow of the coin is removed by using the Hough transform algorithm for circle detection combined with the Sobel edge detection algorithm. The image is then cropped so that only the coin appears in the image and trimmed to a size of 100×100 pixels. The trimmed and cropped image forms the basis of the input to the trained neural network; however, to reduce computation and complexity in the neural network, the image is further reduced to a size of 20×20 pixels by dividing it into 5×5 pixel segments and taking the average of the pixel values within each segment, creating a pattern averaged coin image. This pattern averaged coin image is used to generate a feature vector containing all the pixel values from the image. The vector is then passed to the trained neural network. Finally, the neural network processes the input and classifies the coin image to determine whether it depicts a coin the neural network has been trained to recognize.
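A minimal sketch of the coin pre-processing steps described above (circle detection, cropping, trimming to 100×100 pixels, and 5×5 block averaging down to a 20×20 pattern averaged image) is shown below; the file name and Hough parameters are hypothetical, and the trained neural network itself is omitted.

```python
import cv2
import numpy as np

# Minimal sketch of coin pre-processing; the input file name is hypothetical.
gray = cv2.imread("coin.jpg", cv2.IMREAD_GRAYSCALE)
assert gray is not None, "hypothetical input image not found"
gray = cv2.medianBlur(gray, 5)

# Detect the coin as a circle via the Hough transform over the gradient image.
circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1, minDist=100,
                           param1=100, param2=30, minRadius=20, maxRadius=0)
if circles is None:
    raise RuntimeError("no coin-shaped circle detected")
x, y, r = np.uint16(np.around(circles))[0][0]

# Crop so that only the coin appears, then trim to 100x100 pixels.
coin = gray[max(0, y - r):y + r, max(0, x - r):x + r]
coin = cv2.resize(coin, (100, 100))

# Average each 5x5 segment to produce the 20x20 pattern averaged image,
# then flatten it into the feature vector passed to the trained network.
pattern = coin.reshape(20, 5, 20, 5).mean(axis=(1, 3))
feature_vector = pattern.flatten()
print(feature_vector.shape)  # (400,)
```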
Many of the examples in this disclosure use an image capture device to capture a still image file associated with a currency environment for processing. That does not preclude the use of an image capture device associated with a currency environment and configured to capture a live video stream updated in real-time (a “real-time video capture device”). An example configured to use a real-time video capture device is able to obtain image files, and perform currency validation actions, more frequently than a user could manually request in a given unit of time. Such examples may function identically to examples configured to capture and process a still image file, except that a real-time video capture device may effectively capture still images at a rate up to, but not exceeding, its frame rate. For example, a real-time video capture device configured to capture video at thirty frames per second (FPS) may effectively obtain up to thirty still image files per second, although capture errors inherent in image capture technology may cause some frames to be lost, so that the actual number of image files captured per second falls below this maximum. Each of these image files may then be individually processed as described herein to perform an appropriate currency validation action.
In such examples using a sufficiently high-performance processor and other components, images are captured, and currency verification and currency validation actions are performed, at such speed (e.g., in the above example, up to thirty times per second) that, from the user's perspective, each new currency validation action appears to be performed instantaneously in response to the user's actions. In such examples, where currency is verified and transactions are validated in apparent real-time, user efficiency may be maximized and total transaction processing time minimized, allowing more completed transactions to be validated per unit time and providing a more pleasant experience for users. Where desirable (e.g., where only a less powerful processor and/or image capture device is available), an example is configured to capture and process fewer frames per second (and thus fewer image files per second) from a real-time video capture device. Such an example system may be configured to capture and process, for example, only five FPS, and thus only five image files per second. Even a reduced-FPS configuration, however, may still retain the advantage of increased transaction efficiency, if to a lesser degree than a configuration which appears to perform currency verification and transaction validation actions instantaneously.
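The following sketch illustrates one way a reduced-FPS configuration might be realized, reading frames from a real-time video capture device but processing only about five of them per second; the processing function is a hypothetical placeholder for the verification and validation pipeline described earlier.

```python
import time
import cv2

def process_frame(frame):
    # Hypothetical placeholder for item detection, verification, and validation.
    pass

capture = cv2.VideoCapture(0)   # real-time video capture device
target_interval = 1.0 / 5       # process at most ~5 frames per second
last_processed = 0.0

while True:
    ok, frame = capture.read()
    if not ok:
        break                   # capture error: stop rather than fail
    now = time.monotonic()
    if now - last_processed >= target_interval:
        process_frame(frame)
        last_processed = now

capture.release()
```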
Alternatively, or in addition to the other examples described herein, examples include any combination of the following:
At least a portion of the functionality of the various elements in the accompanying figures may be performed by other elements in the figures, or by an entity (e.g., a processor, web service, server, application program, or computing device) not shown in the figures.
In some examples, the operations illustrated in the accompanying figures are implemented as software instructions encoded on a computer readable medium, in hardware programmed or designed to perform the operations, or both.
While the aspects of the disclosure have been described in terms of various examples with their associated operations, a person skilled in the art would appreciate that a combination of operations from any number of different examples is also within scope of the aspects of the disclosure.
The term “Wi-Fi” as used herein refers, in some examples, to a wireless local area network using high frequency radio signals for the transmission of data. The term “BLUETOOTH®” as used herein refers, in some examples, to a wireless technology standard for exchanging data over short distances using short wavelength radio transmission. The term “cellular” as used herein refers, in some examples, to a wireless communication system using short-range radio stations that, when joined together, enable the transmission of data over a wide geographic area. The term “NFC” as used herein refers, in some examples, to a short-range high frequency wireless communication technology for the exchange of data over short distances.
While no personally identifiable information is tracked by aspects of the disclosure, examples have been described with reference to data monitored and/or collected from the users. In some examples, notice may be provided to the users of the collection of the data (e.g., via a dialog box or preference setting) and users are given the opportunity to give or deny consent for the monitoring and/or collection. The consent may take the form of opt-in consent or opt-out consent.
The present disclosure is operable with a computing apparatus according to an embodiment illustrated as a functional block diagram 1200 in the accompanying figures. In such an embodiment, the computing apparatus 1218 comprises one or more processors 1219, which may be microprocessors, controllers, or any other suitable type of processors for processing computer executable instructions to control the operation of the apparatus.
Computer executable instructions may be provided using any computer-readable media that are accessible by the computing apparatus 1218. Computer-readable media may include, for example, computer storage media such as a memory 1222 and communications media. Computer storage media, such as a memory 1222, include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or the like. Computer storage media include, but are not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information for access by a computing apparatus. In contrast, communication media may embody computer readable instructions, data structures, program modules, or the like in a modulated data signal, such as a carrier wave, or other transport mechanism. As defined herein, computer storage media do not include communication media; therefore, a computer storage medium should not be interpreted to be a propagating signal per se, and propagated signals per se are not examples of computer storage media. Although the computer storage medium (the memory 1222) is shown within the computing apparatus 1218, it will be appreciated by a person skilled in the art that the storage may be distributed or located remotely and accessed via a network or other communication link (e.g., using a communication interface 1223).
The computing apparatus 1218 may comprise an input/output controller 1224 configured to output information to one or more output devices 1225, for example a display or a speaker, which may be separate from or integral to the electronic device. The input/output controller 1224 may also be configured to receive and process an input from one or more input devices 1226, for example, a keyboard, a microphone or a touchpad. In one embodiment, the output device 1225 may also act as the input device 1226. An example of such a device may be a touch sensitive display. The input/output controller 1224 may also output data to devices other than the output device, e.g. a locally connected printing device. In some embodiments, a user may provide input to the input device(s) 1226 and/or receive output from the output device(s) 1225.
According to an embodiment, the computing apparatus 1218 is configured by the program code, when executed by the processor 1219, to execute the embodiments of the operations and functionality described herein. Alternatively, or in addition, the functionality described herein can be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), and Graphics Processing Units (GPUs).
Although described in connection with an exemplary computing system environment, examples of the disclosure are capable of implementation with numerous other general purpose or special purpose computing system environments, configurations, or devices.
Examples of well-known computing systems, environments, and/or configurations that may be suitable for use with aspects of the disclosure include, but are not limited to, mobile computing devices, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, gaming consoles, microprocessor-based systems, set top boxes, programmable consumer electronics, mobile telephones, mobile computing and/or communication devices in wearable or accessory form factors (e.g., watches, glasses, headsets, or earphones), network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like. Such systems or devices may accept input from the user in any way, including from input devices such as a keyboard or pointing device, via gesture input, proximity input (such as by hovering), and/or via voice input.
Examples of the disclosure may be described in the general context of computer-executable instructions, such as program modules, executed by one or more computers or other devices in software, firmware, hardware, or a combination thereof. The computer-executable instructions may be organized into one or more computer-executable components or modules. Generally, program modules include, but are not limited to, routines, programs, objects, components, and data structures that perform tasks or implement particular abstract data types. Aspects of the disclosure may be implemented with any number and organization of such components or modules. For example, aspects of the disclosure are not limited to the specific computer-executable instructions or the specific components or modules illustrated in the figures and described herein. Other examples of the disclosure may include different computer-executable instructions or components having more or less functionality than illustrated and described herein.
In examples involving a general-purpose computer, aspects of the disclosure transform the general-purpose computer into a special-purpose computing device when configured to execute the instructions described herein.
The examples illustrated and described herein, as well as examples not specifically described herein but within the scope of aspects of the disclosure, constitute exemplary means for currency verification and transaction validation. For example, the elements illustrated in the accompanying figures, when configured to perform the operations described herein, constitute such exemplary means.
The order of execution or performance of the operations in examples of the disclosure illustrated and described herein is not essential, unless otherwise specified. That is, the operations may be performed in any order, unless otherwise specified, and examples of the disclosure may include additional or fewer operations than those disclosed herein. For example, it is contemplated that executing or performing a particular operation before, contemporaneously with, or after another operation is within the scope of aspects of the disclosure.
When introducing elements of aspects of the disclosure or the examples thereof, the articles “a,” “an,” “the,” and “said” are intended to mean that there are one or more of the elements. The terms “comprising,” “including,” and “having” are intended to be inclusive and mean that there may be additional elements other than the listed elements. The term “exemplary” is intended to mean “an example of.” The phrase “one or more of the following: A, B, and C” means “at least one of A and/or at least one of B and/or at least one of C.”
Having described aspects of the disclosure in detail, it will be apparent that modifications and variations are possible without departing from the scope of aspects of the disclosure as defined in the appended claims. As various changes could be made in the above constructions, products, and methods without departing from the scope of aspects of the disclosure, it is intended that all matter contained in the above description and shown in the accompanying drawings shall be interpreted as illustrative and not in a limiting sense.
Number: 62697725 | Date: Jul 2018 | Country: US