Retailers often provide purchasers with the option to undertake self-checkout as an alternative to assisted checkout (e.g., checkout assisted by an employee of the retailer). Purchasers can use point of sale (POS) systems to scan and tally items, and to pay the resulting bill. Some items, however, do not include a code for automatic scanning (e.g., do not include a universal product code (UPC)). For these items, purchasers typically must use the POS system to identify the item. For example, purchasers can identify the item by reviewing pictures of item options or textual labels for item options, or by entering a product code (e.g., entering an alphanumeric product code). This can be inefficient, inaccurate, and detrimental to the retailer, and can cause frustration and delay for purchasers.
As discussed above, some retail items, for example produce, do not include a UPC or another code for automatic scanning at a POS system. In prior solutions, a customer is typically required to manually identify the item at the POS system, for example by searching for the item using images or textual labels, or by entering an alphanumeric product code. In an embodiment, a POS system can, instead, predict an item that a customer is purchasing, and can prompt the customer to confirm the predicted item.
For example, a POS system can include one or more image capture devices (e.g., cameras) to capture one or more images of the item. The POS system can then use image recognition techniques (e.g., machine learning (ML) techniques) to predict the item depicted in the images. The POS system can then present the predicted item to the purchaser, and allow the purchaser to confirm the prediction or select a different item (e.g., if the prediction is incorrect).
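The predict-then-confirm flow described above can be sketched in simplified form. This is an illustrative stand-in only: `predict_item` here uses a nearest-prototype similarity score in place of the trained image-recognition model, and the function names, feature vectors, and PLU codes are all assumptions, not part of the disclosed embodiments.

```python
def predict_item(features, prototypes):
    # Score each enrolled item by similarity between the captured image's
    # feature vector and the item's stored prototype; return the best match.
    def similarity(a, b):
        return sum(x * y for x, y in zip(a, b))
    return max(prototypes, key=lambda plu: similarity(features, prototypes[plu]))

def checkout_item(features, prototypes, confirm):
    # Present the predicted item to the purchaser; if the purchaser declines,
    # fall back to manual selection (represented here by returning None).
    predicted = predict_item(features, prototypes)
    return predicted if confirm(predicted) else None
```

In this sketch, `confirm` stands in for the POS user interface step in which the purchaser accepts or rejects the prediction.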
In an embodiment, image recognition can be performed using a trained ML model (e.g., a suitable neural network). This ML model can be trained using real-world data reflecting purchaser selection of items, and captured images of these items. These selections can be used as ground truth to train the ML model.
This real-world data, however, may not always be accurate. In some cases, purchasers do not accurately select the item that they are seeking to purchase. For example, purchasers may mistakenly select one item when they are actually purchasing a different item. This could occur when a purchaser mistakenly provides the wrong input to a user interface (e.g., mistakenly touches a picture of the wrong item on a touch sensitive screen) or when the purchaser does not realize what item they are actually purchasing. As another example, purchasers may intentionally select the incorrect item. For example, purchasers may intentionally select a cheaper item compared to the item they are actually purchasing (e.g., purchasers may select a non-organic produce item when they are actually purchasing an organic produce item).
Using this inaccurate real-world purchase data as the ground truth for training an ML model can result in inaccuracies in the model. This can be improved by auditing the real-world purchase data before the data is used to train the ML model. In an embodiment, the real-world purchase data can be provided to an auditing system for verification before the data is used to train the ML model.
For example, the real-world purchase data can be provided to a human auditor, who can review the purchaser selection and either confirm its accuracy or note its inaccuracy. Alternatively, or in addition, the real-world purchase data can be provided to an additional ML model trained to audit the data. The audited data can then be used to train the ML model, increasing the accuracy of inference by the ML model.
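The auditing step above can be sketched as a small data-preparation function. This is a simplified illustration under stated assumptions: the `auditor` callable stands in for either a human reviewer's choice or an auditing ML model's inference, and the tuple layout is hypothetical.

```python
def audited_training_data(purchases, auditor):
    # purchases: (image, purchaser_selection) pairs of real-world data.
    # auditor: callable returning the item identified in the image -- a
    # stand-in for a human auditor or an auditing ML model.
    audited = []
    for image, selection in purchases:
        status = "confirmed" if auditor(image) == selection else "contradicted"
        audited.append((image, selection, status))
    return audited
```

Each record is annotated rather than silently dropped, so a downstream training service can decide how to weight confirmed versus contradicted examples.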
Advantageously, one or more of these techniques can improve prediction of items for purchase at a POS system using image recognition. For example, this can improve the performance of the POS system by enabling it to detect, using an image recognition system, an item being purchased without requiring the purchaser to scan a UPC or manually type in a name of the item. The embodiments herein also advantageously provide improved training data for the ML model. This can improve the performance of the ML model, improving the accuracy of the prediction of items for purchase. These techniques have numerous additional technical advantages. For example, improving the accuracy of prediction reduces the computational burden on the system: accurate item prediction reduces the number of searches a purchaser must initiate, lessening the number of transactions the system must process. As another example, improving the training data for the ML model can allow for a less heavily trained ML model (e.g., one requiring less training data to meet a required accuracy threshold), and can make training less computationally intensive.
One or more purchasers 102 use a checkout area 110 (e.g., to pay for purchases). In an embodiment, the checkout area 110 includes multiple point of sale (POS) systems 120A-N. For example, one of the purchasers 102 can use one of the POS systems 120A-N for self-checkout to purchase items. The checkout area 110 further includes an employee station 126. For example, an employee (e.g., a retail employee) can use the employee station 126 to monitor the purchasers 102 and the POS systems 120A-N. Self-checkout is merely one example, and the POS systems 120A-N can be any suitable systems. For example, the POS system 120A can be an assisted checkout kiosk in which an employee assists a purchaser with checkout.
In an embodiment, each of the POS systems 120A-N includes components used by the purchaser for self-checkout. For example, the POS system 120A includes a scanner 122 and an image capture device 124. In an embodiment, one of the purchasers 102 can use the scanner 122 to identify items for checkout. For example, the purchaser 102 can use the scanner 122 to scan a UPC on an item.
In an embodiment, the scanner 122 is a component of the POS system 120A and identifies an item for purchase based on scanning activity. For example, the POS system 120A can communicate with an administration system 150 using a network 140. The network 140 can be any suitable communication network, including a local area network (LAN), wide area network (WAN), cellular communication network, the Internet, or any other suitable communication network. The POS system 120A can communicate with the network 140 using any suitable network connection, including a wired connection (e.g., an Ethernet connection), a WiFi connection (e.g., an 802.11 connection), or a cellular connection.
In an embodiment, the POS system 120A can communicate with the administration system 150 to identify items scanned by a purchaser 102, and to perform other functions relating to self-checkout. The administration system 150 is illustrated further with regard to
Further, in an embodiment, the image capture device 124 (e.g., a camera) is also a component of the POS system 120A and can be used to identify the item that a purchaser is seeking to purchase. For example, the image capture device 124 can capture one or more images of an item a purchaser 102 is seeking to purchase. The POS system 120A can transmit the images to the administration system 150 to identify the item depicted in the images. The administration system 150 can then use a suitable trained ML model to identify the item depicted in the images, and can reply to the POS system 120A with identification information for the identified item.
For example, the administration system 150 can transmit to the POS system 120A a code identifying the item (e.g., a price look-up (PLU) code). The POS system 120A can use the code to look up the item and present the item to the user (e.g., displaying an image relating to the item and a textual description relating to the item). In an embodiment, information about the item presented to the user (e.g., a stock image and textual description) is maintained at the POS system 120A. Alternatively, this information can be maintained at another suitable location. For example, the POS system 120A can communicate with any suitable storage location (e.g., a local storage location or a cloud storage location) to retrieve the information (e.g., using the identifying code for the item). Alternatively, or in addition, the administration system 150 can provide the information (e.g., the image and textual description) to the user.
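The code-based lookup described above can be sketched as follows. The catalog contents, PLU codes, and file names are illustrative assumptions; the disclosure does not specify a catalog format.

```python
# Hypothetical local catalog maintained at the POS system; entries and
# PLU codes are illustrative only.
ITEM_CATALOG = {
    "4011": {"description": "Bananas", "image": "bananas.png"},
    "94011": {"description": "Organic bananas", "image": "organic_bananas.png"},
}

def present_item(plu_code, catalog=ITEM_CATALOG):
    # Use the code returned by the administration system to retrieve the
    # display image and textual description; fall back to manual selection
    # when the code is not in the local catalog.
    item = catalog.get(plu_code)
    if item is None:
        return {"description": "Unknown item - please select manually"}
    return item
```

The same lookup could instead query a local or cloud storage location, per the alternatives described above.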
In an embodiment, the checkout controller includes an image recognition service 164, which includes an auditing service 166. This is discussed further below with regard to
In an embodiment, the checkout controller 160 provides the images 162, and the results of the image recognition service 164, to an ML controller 170. The ML controller 170 includes an ML training service 172. In an embodiment, the ML training service 172 is computer program code, stored in a memory, and configured to train an ML model when executed by a computer processor. For example, the ML training service 172 can train an ML model 182 using the images 162 and the results of the image recognition service 164. The ML model 182 can be any suitable supervised ML model for image recognition, including a deep learning neural network. For example, a suitable convolutional neural network (CNN) can be used. This is merely one example, and any suitable supervised ML model can be used.
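At a high level, the ML training service 172 fits a model to labeled image data. The sketch below substitutes a per-item centroid (average feature vector) for the CNN named above — a deliberate simplification to show the data flow, not an actual deep-learning training loop; the sample format is an assumption.

```python
def train_prototypes(samples):
    # samples: (feature_vector, plu) pairs drawn from captured images and
    # their audited labels. Averaging the vectors per item yields one
    # prototype per enrolled item.
    sums, counts = {}, {}
    for features, plu in samples:
        if plu not in sums:
            sums[plu], counts[plu] = [0.0] * len(features), 0
        sums[plu] = [s + f for s, f in zip(sums[plu], features)]
        counts[plu] += 1
    return {plu: [s / counts[plu] for s in vec] for plu, vec in sums.items()}
```

In a real deployment this step would be replaced by supervised training of the CNN (or another suitable supervised model), with the audited labels serving as ground truth.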
The ML controller 170 can then provide the ML model 182, and any suitable additional data 184, to a computer vision service 190. In an embodiment, the ML model 182 is a suitable supervised ML model for image recognition (e.g., trained using the ML training service 172). The computer vision service 190 can use the ML model 182 (along with any suitable additional data 184) to recognize items for purchase in images captured during checkout. This is discussed further with regard to
The network components 220 include the components necessary for the checkout controller 160 to interface with a suitable communication network (e.g., the communication network 140 illustrated in
The memory 210 generally includes program code for performing various functions related to use of the checkout controller 160. The program code is generally described as various functional “applications” or “modules” within the memory 210, although alternate implementations may have different functions and/or combinations of functions. Within the memory 210, the image recognition service 164 facilitates item recognition and the auditing service 166 facilitates auditing purchaser data to generate training data for an item recognition ML model. This is discussed further below with regard to
At block 304, a purchaser purchases the new item. In an embodiment, the purchaser can use a POS system (e.g., the POS system 120A illustrated in
At block 306 an image recognition service (e.g., the image recognition service 164 illustrated in
At block 308, the image recognition service determines whether sufficient images have been collected to train the ML model for image recognition. For example, the image recognition service can communicate with an ML training service (e.g., the ML training service 172 illustrated in
If sufficient images have been collected, the flow proceeds to block 310. At block 310 the image recognition service enrolls the new item for image recognition. In an embodiment, the item is enrolled in the ML model and added as a permissible product code (e.g., a PLU code) for the POS system.
At block 312, the image recognition service verifies the enrollment of the new item. For example, the image recognition service can test the accuracy of the ML model for the new item by determining whether the ML model can correctly identify a set number of pre-selected items (e.g., 20 items). This is merely one example, and any suitable verification technique can be used. Assuming the image recognition service verifies the enrollment of the new item, the flow proceeds to block 314.
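The verification at block 312 can be sketched as an accuracy check over a held-out test set. The threshold value and the pair format are illustrative assumptions; the disclosure specifies only that a set number of pre-selected items is tested.

```python
def verify_enrollment(predict, test_items, threshold=0.95):
    # test_items: (features, expected_plu) pairs for the pre-selected items
    # (e.g., the 20 items mentioned above). Enrollment is verified when the
    # model meets an assumed accuracy threshold on this set.
    correct = sum(1 for features, plu in test_items if predict(features) == plu)
    return correct / len(test_items) >= threshold
```

Any suitable verification technique could be substituted here; this check simply gates deployment on measured accuracy for the new item.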
At block 314, the image recognition service deploys the new item at the retailer. For example, the image recognition service can deploy the verified ML model for the POS systems. In an embodiment, the POS systems access the ML model through a suitable network connection, as illustrated in
At block 404, the auditing service receives an auditor selection. This is discussed further with regard to
At block 406, the auditing service determines whether the purchaser selection and the auditor selection match. If so, the flow proceeds to block 408 and the match is confirmed. For example, the auditing service can provide the captured image and the purchaser selection as training data for the image recognition ML model and can note that the match is confirmed (e.g., by including an annotation that the selection is confirmed).
If not, the flow proceeds to block 410 and the match is contradicted. For example, the auditing service can provide the captured image and customer match as training data for the image recognition ML model and can note that the match is contradicted (e.g., by including an annotation that the selection is contradicted). In an embodiment, both confirmed and contradicted data is useful for training the ML model. Alternatively, only confirmed matches can be used to train the ML model.
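Blocks 406-410 can be sketched as an annotation step plus an optional filter. The record layout is a hypothetical choice for illustration; the disclosure requires only that confirmed and contradicted selections be distinguishable.

```python
def annotate(image_id, purchaser_plu, auditor_plu):
    # Blocks 406-410: compare the selections and annotate the record.
    status = "confirmed" if purchaser_plu == auditor_plu else "contradicted"
    return {"image": image_id, "label": purchaser_plu, "annotation": status}

def select_training_records(records, confirmed_only=False):
    # Both confirmed and contradicted records can feed training; optionally
    # keep only confirmed matches, per the alternative described above.
    if confirmed_only:
        return [r for r in records if r["annotation"] == "confirmed"]
    return list(records)
```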
At block 504, the auditing service generates alternative selections (e.g., for a human auditor). In an embodiment, as illustrated in
At block 506, the auditing service presents the selections to the auditor. For example, the auditing service can generate a suitable user interface presenting the selections to the auditor. This is illustrated in
At block 508, the auditing service captures the auditor selection. In an embodiment, the auditing service determines whether the auditor selection confirms, or contradicts, the purchase selection. The auditing service then provides this information to an ML model as part of the ML model's training data.
For example, in
In an embodiment, the user interface 550 further includes an auditor record 552. For example, the auditor record 552 can provide statistics and data relating to the auditor. The auditor record 552 can further allow the auditor to control the user interface.
Further, in an embodiment, auditing using the user interface 550 can be used to assist with loss prevention. For example, as discussed above, a purchaser selection of an item that does not match the prediction could be intentional. Purchasers may intentionally select a cheaper item compared to the item they are actually purchasing (e.g., purchasers may select a non-organic produce item when they are actually purchasing an organic produce item). In an embodiment, an auditor could identify a potentially suspicious transaction using the user interface 550.
For example, the auditor can select the “Report Problem” interface in the auditor record 552 section. This could flag the transaction for further analysis. For example, selecting the “Report Problem” interface could forward a screenshot of the transaction (e.g., the captured image 560 and the purchaser selection) for further analysis. The screenshot could be analyzed, either manually or using a suitable automatic analysis tool (e.g., an ML model), and it could be determined whether the purchaser was likely intentionally selecting the incorrect item. This could be used to modify the POS system (e.g., providing additional loss prevention checks) or to provide additional loss prevention assistance.
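A minimal loss-prevention heuristic consistent with the example above flags mismatches where the purchaser's selection is cheaper than the predicted item. The price table and function name are assumptions for illustration; the disclosure leaves the analysis method open (manual review or a suitable automatic tool).

```python
def flag_suspicious(purchaser_plu, predicted_plu, prices):
    # Flag a transaction for further analysis when the purchaser selected a
    # cheaper item than the one image recognition predicted (e.g., a
    # non-organic code selected for organic produce).
    if purchaser_plu == predicted_plu:
        return False
    return prices[purchaser_plu] < prices[predicted_plu]
```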
At block 604, the auditing service provides the auditor ML model with the captured image and the purchaser selection. In an embodiment, the auditor ML model infers a binary output using the captured image and the purchaser selection, either confirming or contradicting the selection. At block 606, the auditing service receives the inference of confirmation or contradiction.
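The auditor ML model's binary inference can be sketched as follows. Here the model is represented by its per-item confidence scores, and the margin value is an illustrative assumption; the disclosure specifies only a binary confirm-or-contradict output.

```python
def auditor_inference(scores, purchaser_plu, margin=0.2):
    # scores: PLU -> confidence from the auditor ML model for the captured
    # image. Confirm the purchaser selection when it is the top-scoring
    # item, or within an assumed margin of the top score.
    best = max(scores, key=scores.get)
    if purchaser_plu == best:
        return "confirmed"
    if scores[best] - scores.get(purchaser_plu, 0.0) < margin:
        return "confirmed"
    return "contradicted"
```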
As illustrated in
For example, at block 702, an auditing service (e.g., the auditing service 166 illustrated in
At block 704, the auditing service updates the image recognition ML model training data. For example, the auditing service can confirm, or contradict, purchaser selections of items (e.g., as discussed above in relation to
At block 706, an ML training service (e.g., the ML training service 172 illustrated in
At block 804, the POS system captures and transmits an image of the item being purchased. For example, the POS system can include an image capture device (e.g., the image capture device 124 illustrated in
As discussed above in relation to
At block 806, the POS system receives the item prediction. In an embodiment, the POS system receives a code relating to the item (e.g., a PLU code) and uses the code to identify a description and image for the item. Alternatively, or in addition, the POS system receives the description and image for the item (e.g., from an administration system).
At block 808, the POS system presents the predicted item(s) to the purchaser. For example, the POS system can use the item code(s) received at block 806 to identify enrolled items for the retailer. Alternatively, or in addition, the POS system can use the descriptions and images received at block 806. The POS system can present the predicted item to the purchaser for selection.
The descriptions of the various embodiments of the present invention have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
In the preceding, reference is made to embodiments presented in this disclosure. However, the scope of the present disclosure is not limited to specific described embodiments. Instead, any combination of the preceding features and elements, whether related to different embodiments or not, is contemplated to implement and practice contemplated embodiments. Furthermore, although embodiments disclosed herein may achieve advantages over other possible solutions or over the prior art, whether or not a particular advantage is achieved by a given embodiment is not limiting of the scope of the present disclosure. Thus, the preceding aspects, features, embodiments and advantages are merely illustrative and are not considered elements or limitations of the appended claims except where explicitly recited in a claim(s). Likewise, reference to “the invention” shall not be construed as a generalization of any inventive subject matter disclosed herein and shall not be considered to be an element or limitation of the appended claims except where explicitly recited in a claim(s).
Aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.”
The present invention may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention.
The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
Computer readable program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present invention.
Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
Embodiments of the invention may be provided to end users through a cloud computing infrastructure. Cloud computing generally refers to the provision of scalable computing resources as a service over a network. More formally, cloud computing may be defined as a computing capability that provides an abstraction between the computing resource and its underlying technical architecture (e.g., servers, storage, networks), enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction. Thus, cloud computing allows a user to access virtual computing resources (e.g., storage, data, applications, and even complete virtualized computing systems) in “the cloud,” without regard for the underlying physical systems (or locations of those systems) used to provide the computing resources.
Typically, cloud computing resources are provided to a user on a pay-per-use basis, where users are charged only for the computing resources actually used (e.g. an amount of storage space consumed by a user or a number of virtualized systems instantiated by the user). A user can access any of the resources that reside in the cloud at any time, and from anywhere across the Internet. In context of the present invention, a user may access applications (e.g., the administration system 150 illustrated in
While the foregoing is directed to embodiments of the present invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof, and the scope thereof is determined by the claims that follow.