Touchless multi-staged retail process automation systems and methods

Information

  • Patent Grant
  • Patent Number
    12,019,410
  • Date Filed
    Monday, May 24, 2021
  • Date Issued
    Tuesday, June 25, 2024
Abstract
Touchless multi-staged retail process automation systems and methods (retail process automation system) to automate key store functionalities within retail stores are disclosed. The retail process automation system receives a single command (e.g., a voice command) to automatically perform retail store operations such as opening a store, closing a store, automatically opening tills, etc. As a result, the retail process automation system replaces the multiple touches required by conventional systems with a touchless solution, thereby providing increased time and resource efficiency of managing store operations. For example, the retail process automation system intercepts a voice command to identify the associated retail store information and the source of the command. The retail process automation system then generates a set of operations associated with the received voice command to alter a state of the retail store.
Description
BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a system diagram illustrating an example of a computing environment in which the retail process automation system operates according to various implementations.



FIG. 2 is a flow diagram illustrating a process of retail process automation according to various implementations.



FIG. 3 is a flow diagram illustrating a process of retail process automation according to various implementations.



FIG. 4 is a data flow diagram showing a typical process used by the retail process automation system in some embodiments to generate suggested modifications corresponding to an instruction trigger.



FIGS. 5A-5H are examples of user interfaces that can be used according to various implementations.



FIG. 6 is a block diagram that illustrates an example of a computer system in which at least some operations described herein can be implemented.







The technologies described herein will become more apparent to those skilled in the art from studying the Detailed Description in conjunction with the drawings. Embodiments or implementations describing aspects of the invention are illustrated by way of example, and the same references can indicate similar elements. While the drawings depict various implementations for the purpose of illustration, those skilled in the art will recognize that alternative implementations can be employed without departing from the principles of the present technologies. Accordingly, while specific implementations are shown in the drawings, the technology is amenable to various modifications.


DETAILED DESCRIPTION

Retailers typically maintain hundreds, if not thousands, of stores. Each store is unique and offers a set of services and products to customers. Store operations are complex and comprise several steps for both store opening and store closing. For instance, in a retail store environment, a worker (e.g., a cashier) will start his/her shift with a starting till when a store opens. This is an amount of money made up of a certain number of notes of varying denominations and a certain amount of coins. Different store tills can be associated with different starting amounts. The tills can be associated with a centralized system such that a store can be activated/opened after the tills are brought online with that system. In addition, one or more reports (e.g., a tills summary report) can be generated when a store opens. Similarly, when a store closes, the open tills are closed, their ending amounts are captured, and one or more end-of-day reports are run. Typically, a store manager performs a set of detailed steps to open or close a store. Moreover, a store manager has to perform the multi-touch steps in a specific order. However, depending on certain criteria (e.g., weekend, holiday week, light day, busy day, etc.), the store manager may perform different steps and/or steps in a different order to open the same store. Also, different users may open the same store with different profiles. For example, while a store manager may open a store with five tills, a regional manager may be authorized to open the same store with fifteen tills, with varying opening amounts and different reports. Overall, store opening and closing can take anywhere from 20 minutes to an hour—and even then, the possibility of human/manual error is high. Since store opening and closing operations are complex and involve multiple touch-points (e.g., managing multiple tills and associated starting and ending amounts, report generation infrastructure, and so on), these operations are time and resource intensive.


To solve these and other problems with conventional systems, the inventors have conceived and reduced to practice touchless multi-staged retail process automation systems and methods (retail process automation system) to automate key store functionalities within retail stores. The retail process automation system receives a single command (e.g., a voice command) to automatically perform retail store operations such as opening a store, closing a store, automatically opening tills, etc. As a result, the retail process automation system replaces the multiple touches required by conventional systems with a touchless solution, thereby providing increased time and resource efficiency of managing store operations. For example, the retail process automation system intercepts a voice command (e.g., “Open store ABC in regular mode”) to identify the associated retail store information (e.g., store identifier) and the source of the command (e.g., the user who invoked the command). The retail process automation system then generates a set of operations associated with the received voice command. For example, the retail process automation system generates a set of API calls for opening a store when it receives the voice command “Open store ABC in regular mode.” The retail process automation system can then intercept the responses from the APIs and continue down a particular operational flow, all with a single command.


In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of implementations of the present technology. It will be apparent, however, to one skilled in the art, that implementations of the present technology can be practiced without some of these specific details.


The phrases “in some implementations,” “according to some implementations,” “in the implementations shown,” “in other implementations,” and the like generally mean the specific feature, structure, or characteristic following the phrase is included in at least one implementation of the present technology and can be included in more than one implementation. In addition, such phrases do not necessarily refer to the same implementations or different implementations.



FIG. 1 is a system diagram illustrating an example of a computing environment in which the retail process automation system operates according to various implementations. In some implementations, environment 100 includes one or more client computing devices 105A-E, examples of which can include computer system 600. Client computing devices 105 operate in a networked environment using logical connections through network 130 to one or more remote computers, such as a server computing device. Client computing devices 105 can receive commands/instructions from one or more users to perform retail store operations.


In some implementations, server 110 is an edge server that receives client requests. Server 110 can coordinate fulfillment of those requests through other servers, such as server 120. Server 110 can route those requests to computing devices located in one or more identified retail stores, such as a stand-alone store 140, a store in a group location such as a mall 135, and so on. Server computing devices 110 and 120 comprise computing systems, such as computer system 600. Though each server computing device 110 and 120 is displayed logically as a single server, server computing devices can each be a distributed computing environment encompassing multiple computing devices located at the same or at geographically disparate physical locations. In some implementations, each server 120 corresponds to a group of servers.


Client computing devices 105 and server computing devices 110 and 120 can each act as a server or client to other server/client devices. In some implementations, servers (110, 120) connect to a corresponding database (115, 125). As discussed above, each server 120 can correspond to a group of servers, and each of these servers can share a database or can have its own database. Databases 115 and 125 warehouse (e.g., store) information such as store information (e.g., store identifier, location, capacity, number of tills, type of store, etc.), store employee information (e.g., number of employees, manager identifier(s), employee identifier(s), employee name(s), addresses, contact details, etc.), operations information (e.g., operation flows, operation dependencies, operations history, etc.), and so on. Though databases 115 and 125 are displayed logically as single units, databases 115 and 125 can each be a distributed computing environment encompassing multiple computing devices, can be located within their corresponding server, or can be located at the same or at geographically disparate physical locations.


Network 130 can be a local area network (LAN) or a wide area network (WAN), but can also be other wired or wireless networks. In some implementations, network 130 is the Internet or some other public or private network. Client computing devices 105 are connected to network 130 through a network interface, such as by wired or wireless communication. While the connections between server 110 and servers 120, and stores 140, 135 are shown as separate connections, these connections can be any kind of local, wide area, wired, or wireless network, including network 130 or a separate public or private network.


Aspects of the system can be implemented in a special purpose computing device or data processor that is specifically programmed, configured, or constructed to perform one or more of the computer-executable instructions explained in detail herein. Aspects of the system can also be practiced in distributed computing environments where tasks or modules are performed by remote processing devices, which are linked through a communications network, such as a Local Area Network (LAN), Wide Area Network (WAN), or the Internet. In a distributed computing environment, program modules can be located in both local and remote memory storage devices.


Aspects of the system can be stored or distributed on computer-readable media (e.g., physical and/or tangible non-transitory computer-readable storage media), including magnetically or optically readable computer discs, hard-wired or preprogrammed chips (e.g., EEPROM semiconductor chips), nanotechnology memory, or other data storage media. Indeed, computer implemented instructions, data structures, screen displays, and other data under aspects of the system can be distributed over the Internet or over other networks (including wireless networks), on a propagated signal on a propagation medium (e.g., an electromagnetic wave(s), a sound wave, etc.) over a period of time, or they can be provided on any analog or digital network (packet switched, circuit switched, or other scheme). Portions of the system can reside on a server computer, while corresponding portions can reside on a client computer such as a mobile or portable device, and thus, while certain hardware platforms are described herein, aspects of the system are equally applicable to nodes on a network. In alternative implementations, the mobile device or portable device can represent the server portion, while the server can represent the client portion.



FIG. 2 is a flow diagram illustrating a process 200 of retail process automation according to various implementations. Process 200 begins at act 210 where it receives a trigger for instructions related to a retail store. The trigger can be in the form of voice command(s), UI command(s), and so on. For example, the trigger for instructions can be received via a voice bot application, such as Siri®, Alexa®, and so on. The instructions trigger can be associated with an intent, such as opening a store, closing a store, running reports, and so on. For example, as illustrated in FIG. 5A, process 200 can receive instructions (via a voice command and/or a user interface) to “Quick open store ABC with all tills at $150.” As another example, as illustrated in FIG. 5B, process 200 can receive instructions via a UI selection to “Open first three tills of store ABC with $150.” Other examples of instructions include, but are not limited to, searching for customer information (e.g., “Search customer 1234567890”), activating an account (e.g., “Start activation”), modifying a customer account (e.g., “Start adding new line”), managing payments (e.g., “Take payment” within a flow), adding reports, automating the number of tills to be opened based on the number of scheduled workers for a day, and so on. Process 200 enables users to customize or set preference commands, which can be loaded by an authentication module. In some implementations, process 200 receives a set of instructions. For example, process 200 can receive the following set of instructions to open a store: “Open first three tills of store ABC with $150; Run reports 2, 3, and 4; Print reports at printer ABC.” In some implementations, process 200 receives instructions associated with one or more preconfigured profiles, such as normal day, rush day, weekend, holiday week, light day, busy day, etc. For example, process 200 receives the following instructions to open a store: “Open store as a weekend day.”
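As a concrete illustration of this command-to-intent step, the following minimal sketch matches an utterance against hypothetical intent patterns and preconfigured profiles. The names (INTENT_PATTERNS, PROFILES, parse_trigger) and the use of regular expressions are assumptions for illustration only; an actual deployment would rely on the voice bot's own speech/NLU pipeline.

```python
# Minimal sketch: map an utterance to an intent, parameters, and a
# preconfigured profile. All names here are illustrative assumptions.
import re

# Each intent is matched by a simple regular expression; a production
# system would use a full speech/NLU pipeline instead.
INTENT_PATTERNS = {
    "open_store": re.compile(r"open (?:store )?(?P<store>\w+)", re.I),
    "close_store": re.compile(r"close (?:store )?(?P<store>\w+)", re.I),
    "run_reports": re.compile(r"run reports? (?P<reports>[\d, and]+)", re.I),
}

# Hypothetical preconfigured profiles (e.g., a weekend opens more tills).
PROFILES = {
    "weekend": {"tills": 8, "starting_amount": 150},
    "regular": {"tills": 5, "starting_amount": 150},
}

def parse_trigger(utterance: str) -> dict:
    """Return the intent, captured parameters, and any matching profile."""
    for intent, pattern in INTENT_PATTERNS.items():
        match = pattern.search(utterance)
        if match:
            name = next((p for p in PROFILES if p in utterance.lower()), "regular")
            return {"intent": intent, "params": match.groupdict(),
                    "profile": PROFILES[name]}
    return {"intent": "unknown", "params": {}, "profile": None}

print(parse_trigger("Open store ABC as a weekend day"))
```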


Upon receiving the instructions trigger, process 200, at act 215, identifies the store associated with the instructions. For example, upon receiving voice instructions to “Quick open store ABC with all tills at $150,” process 200 parses the instructions (using, for example, natural language processing methods) to identify that the store associated with the instructions trigger is “store ABC.” Process 200 can then access a data structure and/or a database to identify a store identifier associated with “store ABC.” In some implementations, process 200 can access a location of a user's device (via which the instructions trigger is received) to identify (or suggest) a store associated with the instructions. Process 200 can also identify a source of the voice instructions. For example, using a voice signature (and/or other biometrics information), process 200 can identify the source who initiated the instructions trigger. As other examples, process 200 can identify the source based on login information of a user, information of a user associated with the device from which the instructions trigger was received, and so on. In some implementations, process 200 can first identify the source of the instructions and then identify the associated store based on the source's past behavior (e.g., the last store managed by a store manager who triggered the current instructions).
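A minimal sketch of this store/source resolution follows, assuming a hypothetical STORE_DIRECTORY lookup and USER_HISTORY table that stand in for databases 115/125 of FIG. 1; the fallback to the source's last-managed store mirrors the past-behavior heuristic just described.

```python
# Illustrative sketch of act 215: resolve the store identifier, falling
# back to the source's past behavior. The directory and history tables
# are hypothetical stand-ins for the databases of FIG. 1.
STORE_DIRECTORY = {"ABC": "store-0042", "XYZ": "store-0117"}
USER_HISTORY = {"manager-7": ["store-0042", "store-0042", "store-0117"]}

def resolve_store(parsed_store: str | None, source_id: str) -> str | None:
    """Prefer the store named in the command; otherwise use the store
    the source most recently managed."""
    if parsed_store and parsed_store.upper() in STORE_DIRECTORY:
        return STORE_DIRECTORY[parsed_store.upper()]
    history = USER_HISTORY.get(source_id, [])
    return history[-1] if history else None

print(resolve_store("ABC", "manager-7"))  # named store wins
print(resolve_store(None, "manager-7"))   # falls back to last-managed store
```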


In some implementations, when process 200 receives the trigger instructions, it can display the relevant information related to the trigger instructions at a user interface. FIG. 5A is an example user interface 500 displayed when process 200 receives a command to open a store with all tills at a default starting amount. User interface 500 displays store information 502, current store status 504, and till information 506. FIG. 5B is an example user interface 507 displayed when process 200 receives a command to open a store with some tills at a default starting amount. User interface 507 displays open till information 508 and closed till information 510, with an option for the user to open the closed till(s) and specify an opening till amount. FIG. 5C is an example user interface displayed when process 200 receives a command to close a store; the user can be presented with options to cancel the store closing command using control 512 and/or confirm the store closing using control 514. FIG. 5E is an example user interface to help in opening an issue ticket within a store. FIGS. 5F-5G are example user interfaces to help in generating reports, such as daily reports and/or on-demand reports. Users can select a report date using controls 530, 534, and select one or more reports to be generated using controls 532, 536.


At act 220, process 200 generates a set of operations for the received trigger instructions. Process 200 can select and/or compile the set of operations using the received trigger instructions. For example, after receiving a trigger instruction, process 200 can identify one or more instruction identifiers for the received trigger instruction (e.g., each type of trigger instruction can be associated with a unique identifier). As another example, when process 200 receives instructions associated with one or more preconfigured profiles, process 200 identifies the instruction identifiers associated with the selected profile. Then, based on the instruction identifier, process 200 can identify a set of operations associated with that instruction identifier. For example, for a trigger instruction to open a store, process 200 can identify the following set of operations: login, open cash drawers (till number, till starting amount), and activate store (brings tills online within all systems). As another example, for a trigger instruction to close a store, process 200 can identify the following set of operations: login, close cash drawers, and deactivate store (brings store offline within all systems). As another example, for a trigger instruction to generate reports for a store, process 200 can identify the following set of operations: tabulate transactions, generate report(s), and print report(s). As another example, for a trigger instruction to perform an audit of a store (e.g., “audit store ABC”), process 200 can automate the memos that are created for audits by gathering the necessary data to maintain and update audit reports for the store.
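As an illustration of act 220, the sketch below maps instruction identifiers to ordered operation lists; the identifiers, operation names, and parameters are assumptions for illustration, not the patent's actual schema.

```python
# A minimal sketch of act 220, assuming each instruction identifier maps
# to an ordered list of operations. Identifiers, operation names, and
# parameters are illustrative.
OPERATION_SETS = {
    "open_store": [
        ("login", {"user_id": None}),
        ("open_cash_drawers", {"till_number": None, "starting_amount": None}),
        ("activate_store", {}),  # brings tills online within all systems
    ],
    "close_store": [
        ("login", {"user_id": None}),
        ("close_cash_drawers", {}),
        ("deactivate_store", {}),  # brings store offline within all systems
    ],
    "generate_reports": [
        ("tabulate_transactions", {}),
        ("generate_reports", {"format": "pdf"}),
        ("print_reports", {"printer": None}),
    ],
}

def operations_for(instruction_id: str, **params) -> list:
    """Return the operation list with caller-supplied parameters filled in."""
    ops = []
    for name, defaults in OPERATION_SETS[instruction_id]:
        merged = {**defaults, **{k: v for k, v in params.items() if k in defaults}}
        ops.append((name, merged))
    return ops

print(operations_for("open_store", user_id="manager-7",
                     till_number=3, starting_amount=150))
```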


Process 200 can then identify one or more APIs to invoke for the identified set of operations. For example, process 200 identifies the following APIs for opening a store: login API (user identifier), open cash drawer API (starting amount, till number, open command), and activate store API. As another example, process 200 identifies the following APIs for closing a store: login API (user identifier), close cash drawer API (ending amount, till number, close command), and deactivate store API. As another example, process 200 identifies the following APIs for generating reports for a store at periodic intervals (e.g., end of day, audits, and so on): tabulate-all-transactions/balance APIs (starting amount, transactions, end amount), generate report API (turns transaction information into a printable format (e.g., PDF)), and print API (prints report(s) automatically).
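The identified APIs might then be invoked in order, with each response intercepted before the flow continues, as in the following sketch. The base URL, endpoint paths, and the use of the third-party requests package are assumptions; the store-management APIs actually deployed would differ.

```python
# Sketch: call each identified API in order, intercept the response, and
# stop the flow if a step fails. Endpoint and library use are assumed.
import requests

BASE_URL = "https://store-api.example.com"  # hypothetical endpoint

def execute_flow(operations: list[tuple[str, dict]]) -> bool:
    for name, params in operations:
        resp = requests.post(f"{BASE_URL}/{name}", json=params, timeout=10)
        if resp.status_code != 200:
            # A failed step halts the flow so the store is not left
            # half-opened; a real system might retry or roll back.
            print(f"{name} failed with status {resp.status_code}")
            return False
        print(f"{name} succeeded")
    return True
```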


At act 225, process 200 executes the operations in the generated set of operations to alter the current state of the identified store. In some implementations, process 200 can auto-advance a store operations management application through various views that illustrate the execution of the operations without any further user interaction. For example, process 200 can automatically execute the APIs associated with the set of operations to alter the current state of the identified store. FIG. 5D is an example user interface 515 displayed as process 200 executes operations to close a store. User interface 515 displays the operations and their statuses 516-519 as the operations are executed. FIG. 5H is an example user interface to enable a user to print a status report (using control 538) once process 200 completes execution of the operations. In some implementations, process 200 can execute the operations to alter the current state of the identified target retail store at an execution time later than a receipt time of the trigger for instructions. The execution time can be received as part of the trigger for instructions, can be determined based on past behavior of the source of the trigger instructions, or can be determined based on one or more of the following: operating hours of the target retail store, current events information, or the associated intent.
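For the deferred-execution variant, a minimal standard-library sketch is shown below; the helper name schedule_flow and the 8:00 AM example are hypothetical.

```python
# Illustrative sketch of deferring execution: run the generated flow at
# a later execution time (e.g., the store's opening hour).
import datetime as dt
import threading

def schedule_flow(run_flow, execution_time: dt.datetime) -> threading.Timer:
    """Run `run_flow` at `execution_time` (immediately if already past)."""
    delay = max(0.0, (execution_time - dt.datetime.now()).total_seconds())
    timer = threading.Timer(delay, run_flow)
    timer.start()
    return timer

# Example: defer a stubbed open-store flow to 8:00 AM tomorrow.
tomorrow_8am = (dt.datetime.now() + dt.timedelta(days=1)).replace(
    hour=8, minute=0, second=0, microsecond=0)
timer = schedule_flow(lambda: print("opening store"), tomorrow_8am)
timer.cancel()  # cancel here so this example exits promptly
```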



FIG. 3 is a flow diagram illustrating a process 300 of retail process automation according to various implementations. Similar to process 200 of FIG. 2, process 300 begins at act 305 where it receives a trigger for instructions related to a retail store. Upon receiving the instructions trigger, process 300, at act 310, identifies the store associated with the instructions and/or a source of the instructions (similar to act 215 of process 200). At act 315, process 300 processes the trigger instructions to generate one or more suggested modifications using, for example, one or more trained machine learning models.



FIG. 4 is a data flow diagram showing a typical process used by the retail process automation system in some embodiments to generate suggested modifications corresponding to an instruction trigger. A customer information store 405 is shown, from which attributes of a customer are provided to a customer model 410 that produces customer behavior parameter values. The customer information store 405 can comprise information such as call service records and customer behavior information (e.g., time/day when customers access stores, how much time customers spend at a store, services/products utilized by customers, customer demographic information, customer age, customer employment information, customer education information, and so on). Examples of customer behavior parameters include, but are not limited to, popular times when customers visit a store, customer behavior patterns, and so on. A user information store 415 is shown, from which attributes of a user are provided to a user model 420 that produces user behavior parameter values. The user information store 415 can comprise information such as user behavior information (e.g., time/day when users issue instructions, stores visited, types of instructions issued, form of instructions (e.g., verbal, UI, and so on), composition of instructions (e.g., single instructions with details, composite instructions with details, and so on), how much time users spend at a store, services/products utilized by users, user demographic information, user age, user employment information, user education information, and so on). Examples of user behavior parameters include, but are not limited to, typical times when a user visits a store, typical instructions issued by a user, typical instructions issued by users of a certain type (e.g., manager, cashier, service technician, etc.), user behavior patterns, and so on.


A store information store 425 is shown, from which attributes of a store are provided to a store model 430 that produces store parameter values. The store information store 425 can comprise information such as store location, operating hours, type (standalone, mall, etc.), stock information, till information, employee information, services/products offered, occupancy limits, and so on. Examples of store parameters include, but are not limited to, store occupancy behavior patterns, busy times, busy days, most popular services, most popular products, and so on. A social information store 435 is shown, from which attributes of social media presence are provided to a social media model 440 that produces social media behavior parameter values. The social information store 435 can comprise information about one or more social media accounts and/or aggregate information, such as post locations, post platform (e.g., Twitter®, Facebook®, Instagram®, Tinder®, TikTok®, etc.), post details, post date/time, and so on. Examples of social media behavior parameters include, but are not limited to, popular post dates/times, location of posts in reference to store location, and so on. A news information store 445 is shown, from which attributes of news events are provided to a news model 450 that produces news behavior parameter values. The news information store 445 can comprise information such as current events (e.g., launch of a new product/service), disaster information, and so on. Examples of news behavior parameters include, but are not limited to, correlation between news events and customer behavior, correlation between news events and store occupancy/traffic, and so on. The outputs from one or more models are provided to a meta model 455. The meta model 455 applies various techniques, such as input weighting, bias correction, data smoothing, and confidence interval estimation, in producing one or more suggested modifications 460 to the received instructions.
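One simple way to realize the input-weighting aspect of meta model 455 is a weighted combination of per-model scores for each candidate modification, sketched below. The weights, candidate names, and scores are invented for illustration; a deployed meta model could instead use learned weights, bias correction, and confidence intervals as described above.

```python
# Minimal sketch of meta model 455: combine per-model scores for each
# candidate modification with fixed weights. All values are invented.
MODEL_WEIGHTS = {"customer": 0.3, "user": 0.3, "store": 0.2,
                 "social": 0.1, "news": 0.1}

def score_modification(model_scores: dict) -> float:
    """Weighted average over the models that scored this candidate."""
    total = sum(MODEL_WEIGHTS[m] * s for m, s in model_scores.items())
    return total / sum(MODEL_WEIGHTS[m] for m in model_scores)

candidates = {
    "open two extra tills": {"store": 0.9, "news": 0.8, "user": 0.4},
    "use default tills": {"store": 0.2, "user": 0.7},
}
ranked = sorted(candidates, key=lambda c: score_modification(candidates[c]),
                reverse=True)
print(ranked)  # highest-scoring suggested modification first
```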


In some implementations, when the retail process automation system receives a new command (for example, a command for which a corresponding set of operations is not available), the system can parse the new command to identify its constituent parts (e.g., parse a sentence to identify the constituent words). The system can then examine other previously stored and mapped commands to identify matches for the new command (e.g., via k-means clustering and other similar techniques). In some implementations, when a new command is encountered, the system can prompt the user to map the new command to a set of operations. For example, upon receiving a new command, the system can prompt the user to specify whether he/she would like to map the new command to a set of actions. When the user agrees to map the new command, the system can begin recording the actions/operations the user performs when performing the new command until a “done” action is performed. With this breadcrumb approach, the system creates a mapping of the new command and a set of actions/operations. The system can then record/save the mapping in a database.
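The breadcrumb approach might be implemented as in the following sketch, where a hypothetical CommandRecorder captures actions until a “done” action and saves the resulting mapping; an in-memory dict stands in for the database.

```python
# Sketch of the breadcrumb approach: record a user's actions for an
# unmapped command until "done", then save the command-to-actions mapping.
class CommandRecorder:
    def __init__(self):
        self.mappings: dict[str, list[str]] = {}  # stands in for the database
        self._recording: list[str] | None = None
        self._command: str | None = None

    def begin(self, new_command: str) -> None:
        self._command, self._recording = new_command, []

    def record(self, action: str) -> None:
        if self._recording is None:
            return
        if action == "done":  # "done" closes the breadcrumb trail
            self.mappings[self._command] = self._recording
            self._command, self._recording = None, None
        else:
            self._recording.append(action)

recorder = CommandRecorder()
recorder.begin("prepare store for inventory")
for action in ["login", "close_cash_drawers", "run_report_4", "done"]:
    recorder.record(action)
print(recorder.mappings)
```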


The retail process automation system can learn one or more features associated with the received instructions, such as time of day of an action, user information, location information, till numbers used, starting amounts, report types, and so on. For example, the system can learn that a store representative A opens a store with specific till numbers and specific starting amounts. The system records the actions taken by user A, or the preferences that are common for the store, based on the command. There can be phrase commands like “open store busy today,” “open store for event,” or “open store for thanksgiving” that indicate a particular number of tills to open (e.g., greater than four tills, greater than six tills, and so on). Based on this learning and the preferences set by a user, the system can update the set of operations associated with the command. For example, when the system identifies that the received instructions are to open a store from store representative A, it can automatically suggest opening the store with the information it has learned. As another example, based on a pattern and phrase that it learns in the store and the day of the action taken, the system can automatically suggest opening multiple tills or applying a standard preference.
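A minimal sketch of this preference learning, assuming a hypothetical log of past opening configurations, is to tally each user's choices and suggest the most common one:

```python
# Illustrative sketch: learn a user's opening preferences by counting
# past configurations and suggesting the most common one. The log format
# (user, tills, starting_amount) is an invented example.
from collections import Counter

history = [
    ("rep-A", 3, 150), ("rep-A", 3, 150), ("rep-A", 5, 200), ("rep-B", 5, 100),
]

def suggest_configuration(user: str):
    counts = Counter((t, a) for u, t, a in history if u == user)
    if not counts:
        return None  # no history; fall back to the store default
    (tills, amount), _ = counts.most_common(1)[0]
    return {"tills": tills, "starting_amount": amount}

print(suggest_configuration("rep-A"))  # {'tills': 3, 'starting_amount': 150}
```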


A “model,” as used herein, refers to a construct that is trained using training data to make predictions or provide probabilities for new data items, whether or not the new data items were included in the training data. For example, training data for supervised learning can include items with various parameters and an assigned classification. A new data item can have parameters that a model can use to assign a classification to the new data item. As another example, a model can be a probability distribution resulting from the analysis of training data, such as a likelihood of an n-gram occurring in a given language based on an analysis of a large corpus from that language. Examples of models include, without limitation: neural networks, support vector machines, decision trees, decision tree forests, Parzen windows, Bayes classifiers, clustering, reinforcement learning, probability distributions, and others. Models can be configured for various situations, data types, sources, and output formats.


In some implementations, models trained by the system can include a neural network with multiple input nodes that receive training datasets. The input nodes can correspond to functions that receive the input and produce results. These results can be provided to one or more levels of intermediate nodes that each produce further results based on a combination of lower-level node results. A weighting factor can be applied to the output of each node before the result is passed to the next layer node. At a final layer (the “output layer”), one or more nodes can produce a value classifying the input that, once the model is trained, can be used to generate outputs such as suggested modifications to received instructions, and so forth. In some implementations, such neural networks, known as deep neural networks, can have multiple layers of intermediate nodes with different configurations, can be a combination of models that receive different parts of the input and/or input from other parts of the deep neural network, or are convolutions—partially using output from previous iterations of applying the model as further input to produce results for the current input.
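The layered network described above can be sketched with plain numpy as follows; the layer sizes, tanh/sigmoid activations, and random weights are arbitrary illustrations, not the patent's architecture.

```python
# Sketch of the layered network: input features, one weighted
# intermediate layer, and an output layer that classifies the input.
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 8))  # input (4 features) -> 8 intermediate nodes
W2 = rng.normal(size=(8, 1))  # intermediate nodes -> output node

def forward(x: np.ndarray) -> float:
    hidden = np.tanh(x @ W1)                   # weighted intermediate layer
    score = 1 / (1 + np.exp(-(hidden @ W2)))   # sigmoid output layer
    return score.item()                        # value classifying the input

print(forward(np.array([0.2, -1.0, 0.5, 0.0])))
```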


A machine learning model can be trained with supervised learning. Testing data can then be provided to the model to assess its accuracy. Testing data can be, for example, a portion of the training data (e.g., 10%) held back to use for evaluation of the model. Output from the model can be compared to the desired and/or expected output for the training data and, based on the comparison, the model can be modified, such as by changing weights between nodes of the neural network and/or parameters of the functions used at each node in the neural network (e.g., applying a loss function). Based on the results of the model evaluation, and after applying the described modifications, the model can then be retrained to evaluate new data.
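The train/hold-out/evaluate loop might look like the following sketch on synthetic data, holding back 10% for testing and adjusting weights with a squared-error gradient; all specifics (data, learning rate, loss) are assumptions for illustration.

```python
# Sketch of supervised training with a 10% held-back test split and a
# squared-error loss applied to update the weights. Data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(float)   # synthetic labels

split = int(0.9 * len(X))                   # 10% held back for testing
X_train, y_train = X[:split], y[:split]
X_test, y_test = X[split:], y[split:]

w = np.zeros(4)
for _ in range(500):                        # gradient descent on squared error
    pred = 1 / (1 + np.exp(-(X_train @ w)))
    grad = X_train.T @ ((pred - y_train) * pred * (1 - pred)) / len(X_train)
    w -= 1.0 * grad                         # weight update from the loss

test_pred = (1 / (1 + np.exp(-(X_test @ w))) > 0.5).astype(float)
print("test accuracy:", (test_pred == y_test).mean())
```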


Returning to FIG. 3, after generating the suggested modifications, at act 320, process 300 presents one or more of the suggested modifications to a user. In some implementations, process 300 filters the generated suggested modifications to present a selected set. For example, process 300 can filter the generated suggestions based on user preference information, time of day, store information, and so on. For example, a user may not have noticed that a particular day is a holiday. But based on the behavior learnings, the system can recognize that the particular day is expected to be a busy day at the store. In this example, when a user simply issues a command such as “open store,” the system can suggest that, because the particular day is a holiday event, the user may want to open more tills than the usual preference. At act 325, process 300 receives a final set of instructions and generates, at act 330, a set of operations for the final instructions set (similar to act 220 of FIG. 2). At act 335, process 300 executes the operations in the generated set of operations to alter the current state of the identified store (similar to act 225 of FIG. 2).
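Act 320's filtering could be sketched as below, where a hypothetical holiday calendar and per-user muted-suggestion preferences decide which generated suggestions are actually presented.

```python
# Illustrative sketch of filtering suggested modifications by context
# (holiday calendar) and user preferences. All values are invented.
import datetime as dt

HOLIDAYS = {dt.date(2024, 11, 28)}  # hypothetical holiday calendar

def filter_suggestions(suggestions, prefs, today: dt.date):
    kept = []
    for s in suggestions:
        if s.get("requires_holiday") and today not in HOLIDAYS:
            continue  # holiday-only suggestion on a non-holiday: drop it
        if s["kind"] in prefs.get("muted_kinds", ()):
            continue  # user opted out of this kind of suggestion
        kept.append(s)
    return kept

suggestions = [
    {"kind": "extra_tills", "text": "Holiday: open more tills than usual?",
     "requires_holiday": True},
    {"kind": "report", "text": "Also run report 4?"},
]
print(filter_suggestions(suggestions, {"muted_kinds": {"report"}},
                         dt.date(2024, 11, 28)))
```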


Computer System



FIG. 6 is a block diagram that illustrates an example of a computer system 600 in which at least some operations described herein can be implemented. As shown, the computer system 600 can include: one or more processors 602, main memory 606, non-volatile memory 610, a network interface device 612, a video display device 618, an input/output device 620, a control device 622 (e.g., keyboard and pointing device), a drive unit 624 that includes a storage medium 626, and a signal generation device 630 that are communicatively connected to a bus 616. The bus 616 represents one or more physical buses and/or point-to-point connections that are connected by appropriate bridges, adapters, or controllers. Various common components (e.g., cache memory) are omitted from FIG. 6 for brevity. Instead, the computer system 600 is intended to illustrate a hardware device on which components illustrated or described relative to the examples of the figures and any other components described in this specification can be implemented.


The computer system 600 can take any suitable physical form. For example, the computing system 600 can share an architecture similar to that of a server computer, personal computer (PC), tablet computer, mobile telephone, game console, music player, wearable electronic device, network-connected (“smart”) device (e.g., a television or home assistant device), AR/VR systems (e.g., head-mounted display), or any electronic device capable of executing a set of instructions that specify action(s) to be taken by the computing system 600. In some implementations, the computer system 600 can be an embedded computer system, a system-on-chip (SOC), a single-board computer system (SBC), or a distributed system, such as a mesh of computer systems, or can include one or more cloud components in one or more networks. Where appropriate, one or more computer systems 600 can perform operations in real-time, near real-time, or in batch mode.


The network interface device 612 enables the computing system 600 to mediate data in a network 614 with an entity that is external to the computing system 600 through any communication protocol supported by the computing system 600 and the external entity. Examples of the network interface device 612 include a network adaptor card, a wireless network interface card, a router, an access point, a wireless router, a switch, a multilayer switch, a protocol converter, a gateway, a bridge, bridge router, a hub, a digital media receiver, and/or a repeater, as well as all wireless elements noted herein.


The memory (e.g., main memory 606, non-volatile memory 610, machine-readable medium 626) can be local, remote, or distributed. Although shown as a single medium, the machine-readable medium 626 can include multiple media (e.g., a centralized/distributed database and/or associated caches and servers) that store one or more sets of instructions 628. The machine-readable (storage) medium 626 can include any medium that is capable of storing, encoding, or carrying a set of instructions for execution by the computing system 600. The machine-readable medium 626 can be non-transitory or comprise a non-transitory device. In this context, a non-transitory storage medium can include a device that is tangible, meaning that the device has a concrete physical form, although the device can change its physical state. Thus, for example, non-transitory refers to a device remaining tangible despite this change in state.


Although implementations have been described in the context of fully functioning computing devices, the various examples are capable of being distributed as a program product in a variety of forms. Examples of machine-readable storage media, machine-readable media, or computer-readable media include recordable-type media such as volatile and non-volatile memory devices 610, removable flash memory, hard disk drives, optical disks, and transmission-type media such as digital and analog communication links.


In general, the routines executed to implement examples herein can be implemented as part of an operating system or a specific application, component, program, object, module, or sequence of instructions (collectively referred to as “computer programs”). The computer programs typically comprise one or more instructions (e.g., instructions 604, 608, 628) set at various times in various memory and storage devices in computing device(s). When read and executed by the processor 602, the instruction(s) cause the computing system 600 to perform operations to execute elements involving the various aspects of the disclosure.


Remarks


The terms “example”, “embodiment” and “implementation” are used interchangeably. For example, references to “one example” or “an example” in the disclosure can be, but are not necessarily, references to the same implementation; such references mean at least one of the implementations. The appearances of the phrase “in one example” are not necessarily all referring to the same example, nor are separate or alternative examples mutually exclusive of other examples. A feature, structure, or characteristic described in connection with an example can be included in another example of the disclosure. Moreover, various features are described which can be exhibited by some examples and not by others. Similarly, various requirements are described which can be requirements for some examples but not other examples.


The terminology used herein should be interpreted in its broadest reasonable manner, even though it is being used in conjunction with certain specific examples of the invention. The terms used in the disclosure generally have their ordinary meanings in the relevant technical art, within the context of the disclosure, and in the specific context where each term is used. A recital of alternative language or synonyms does not exclude the use of other synonyms. Special significance should not be placed upon whether or not a term is elaborated or discussed herein. The use of highlighting has no influence on the scope and meaning of a term. Further, it will be appreciated that the same thing can be said in more than one way.


Unless the context clearly requires otherwise, throughout the description and the claims, the words “comprise,” “comprising,” and the like are to be construed in an inclusive sense, as opposed to an exclusive or exhaustive sense; that is to say, in the sense of “including, but not limited to.” As used herein, the terms “connected,” “coupled,” or any variant thereof means any connection or coupling, either direct or indirect, between two or more elements; the coupling or connection between the elements can be physical, logical, or a combination thereof. Additionally, the words “herein,” “above,” “below,” and words of similar import can refer to this application as a whole and not to any particular portions of this application. Where context permits, words in the above Detailed Description using the singular or plural number may also include the plural or singular number respectively. The word “or” in reference to a list of two or more items covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list. The term “module” refers broadly to software components, firmware components, and/or hardware components.


While specific examples of technology are described above for illustrative purposes, various equivalent modifications are possible within the scope of the invention, as those skilled in the relevant art will recognize. For example, while processes or acts are presented in a given order, alternative implementations can perform routines having steps, or employ systems having acts, in a different order, and some processes or acts may be deleted, moved, added, subdivided, combined, and/or modified to provide alternative or sub-combinations. Each of these processes or acts can be implemented in a variety of different ways. Also, while processes or acts are at times shown as being performed in series, these processes or acts can instead be performed or implemented in parallel, or can be performed at different times. Further, any specific numbers noted herein are only examples such that alternative implementations can employ differing values or ranges.


Details of the disclosed implementations can vary considerably in specific implementations while still being encompassed by the disclosed teachings. As noted above, particular terminology used when describing features or aspects of the invention should not be taken to imply that the terminology is being redefined herein to be restricted to any specific characteristics, features, or aspects of the invention with which that terminology is associated. In general, the terms used in the following claims should not be construed to limit the invention to the specific examples disclosed herein, unless the above Detailed Description explicitly defines such terms. Accordingly, the actual scope of the invention encompasses not only the disclosed examples, but also all equivalent ways of practicing or implementing the invention under the claims. Some alternative implementations can include additional elements to those implementations described above or include fewer elements.


Any patents and applications and other references noted above, and any that may be listed in accompanying filing papers, are incorporated herein by reference in their entireties, except for any subject matter disclaimers or disavowals, and except to the extent that the incorporated material is inconsistent with the express disclosure herein, in which case the language in this disclosure controls. Aspects of the invention can be modified to employ the systems, functions, and concepts of the various references described above to provide yet further implementations of the invention.


To reduce the number of claims, certain implementations are presented below in certain claim forms, but the applicant contemplates various aspects of an invention in other forms. For example, aspects of a claim can be recited in a means-plus-function form or in other forms, such as being embodied in a computer-readable medium. A claim intended to be interpreted as a means-plus-function claim will use the words “means for.” However, the use of the term “for” in any other context is not intended to invoke a similar interpretation. The applicant reserves the right to pursue such additional claim forms in either this application or in a continuing application.

Claims
  • 1. A computer-implemented method to perform key store functionalities within retail stores, the method comprising: receiving a trigger for instructions, wherein the instructions are associated with an intent to perform retail store operations; processing the instructions to identify at least two target retail stores and a source of the trigger for instructions; applying a trained machine learning model to generate one or more suggested modifications to the received instructions; generating final instructions using the generated one or more suggested modifications to the received instructions; generating application programming interface (API) calls for the retail store operations associated with the generated final instructions, wherein the retail store operations comprise at least one of: opening the target retail stores, closing the target retail stores, opening tills in the target retail stores, closing the tills in the target retail stores, performing an audit of the target retail stores, or generating reports for the target retail stores; and performing, using the API calls, the retail store operations for the target retail stores.
  • 2. The computer-implemented method of claim 1, wherein the instructions comprise a voice command.
  • 3. The computer-implemented method of claim 1, wherein the trigger for instructions are received via a voice bot.
  • 4. The computer-implemented method of claim 1, wherein the trigger for instructions are received via a user interface.
  • 5. The computer-implemented method of claim 1, wherein the retail store operations alter a current state of the identified target retail stores at an execution time later than a receipt time of the trigger for instructions.
  • 6. The computer-implemented method of claim 1, wherein the retail store operations alter a current state of the identified target retail stores at an execution time, and wherein the execution time is received as part of the trigger for instructions.
  • 7. The computer-implemented method of claim 1, wherein the retail store operations alter a current state of the identified target retail stores at an execution time, and wherein the execution time is determined based on past behavior of the source of the trigger instructions.
  • 8. The computer-implemented method of claim 1, wherein the retail store operations alter a current state of the identified target stores at an execution time, and wherein the execution time is determined based on one or more of the following: operating hours of the target retail stores, or current events information, or the associated intent.
  • 9. The computer-implemented method of claim 1, wherein the source of the trigger instructions is affiliated with a telecommunications service provider operating the identified target retail stores.
  • 10. The computer-implemented method of claim 1, wherein the current state of the identified target retail stores is open or closed.
  • 11. The computer-implemented method of claim 1, wherein the machine learning model is trained using one or more of: customer information, user information, store information, social media information, or news information.
  • 12. The computer-implemented method of claim 1, wherein the received instructions comprise information identifying the target retail stores.
  • 13. At least one non-transitory computer-readable medium, excluding transitory signals, and carrying instructions that, when executed by at least one processor, cause the at least one processor to perform operations comprising: receiving a trigger for instructions, wherein the instructions are associated with an intent to perform retail store operations; processing the instructions to identify at least two target retail stores and a source of the trigger for instructions; applying a trained machine learning model to generate one or more suggested modifications to the received instructions; generating final instructions using the generated one or more suggested modifications to the received instructions; generating application programming interface (API) calls for the retail store operations associated with the generated final instructions, wherein the retail store operations comprise at least one of: opening the target retail stores, closing the target retail stores, opening tills in the target retail stores, closing the tills in the target retail stores, performing an audit of the target retail stores, or generating reports for the target retail stores; and performing, using the API calls, the retail store operations for the target retail stores.
  • 14. The computer-readable medium of claim 13, wherein the retail store operations alter a current state of the identified target retail stores at an execution time later than a receipt time of the trigger for instructions.
  • 15. The computer-readable medium of claim 13, wherein the retail store operations alter a current state of the identified target retail stores at an execution time, and wherein the execution time is: received as part of the trigger for instructions, or determined based on past behavior of the source of the trigger instructions, or determined based on one or more of the following: operating hours of the target retail stores, or current events information, or the associated intent.
  • 16. The computer-readable medium of claim 13, wherein the source of the trigger instructions is affiliated with a telecommunications service provider operating the identified target retail stores.
  • 17. The computer-readable medium of claim 13, wherein the current state of the identified target retail stores is open or closed.
  • 18. The computer-readable medium of claim 13, wherein the machine learning model is trained using one or more of: customer information, user information, store information, social media information, or news information.
  • 19. A computing system, comprising: at least one processor; and at least one non-transitory memory, excluding transitory signals, and carrying instructions that, when executed by the at least one processor, cause the computing system to perform operations comprising: receiving a trigger for instructions, wherein the instructions are associated with an intent to perform retail store operations; processing the instructions to identify at least two target retail stores and a source of the trigger for instructions; applying a trained machine learning model to generate one or more suggested modifications to the received instructions; generating final instructions using the generated one or more suggested modifications to the received instructions; generating application programming interface (API) calls for the retail store operations associated with the generated final instructions, wherein the retail store operations comprise at least one of: opening the target retail stores, closing the target retail stores, opening tills in the target retail stores, closing the tills in the target retail stores, performing an audit of the target retail stores, or generating reports for the target retail stores; and performing, using the API calls, the retail store operations for the target retail stores.
  • 20. The computing system of claim 19, wherein the retail store operations alter the current state of the identified target retail stores at an execution time, and wherein the execution time is: received as part of the trigger for instructions, or determined based on past behavior of the source of the trigger instructions, or determined based on one or more of the following: operating hours of the target retail stores, or current events information, or the associated intent.