Embodiments of the disclosure relate generally to task automation, and specifically to a score that reflects how proximal a brand of an enterprise is to a customer base.
Traditional customer surveys, and scores created from those surveys, involve enterprises explicitly asking customers about their experience and/or how they will act in the future, such as whether they would recommend the enterprise's services or goods to their networks and friends. However, traditional methods lack the ability to intelligently and automatically harness information from customers over a period of time, in a non-invasive way, as the workflow progresses towards completion of a task.
The following is a simplified summary of the disclosure in order to provide a basic understanding of some aspects of the disclosure. This summary is not an extensive overview of the disclosure. It is intended neither to identify key or critical elements of the disclosure, nor to delineate any scope of the particular implementations of the disclosure or any scope of the claims. Its sole purpose is to present some concepts of the disclosure in a simplified form as a prelude to the more detailed description that is presented later.
Enterprises leverage service engagement platforms (sometimes simply called a service platform, or user engagement platform) to interact with their customers. A service engagement platform automates the enterprise workflow, interacts with customers to broker information, and builds proximity with customers through conversations. The enterprise workflow is task-oriented. Examples of tasks include booking a ticket, registering an account, resolving a claim, collecting user feedback, etc.
The service engagement platform disclosed here may use various mechanisms, such as chatbots and conversational artificial intelligence (AI), for conversing with customers while improving workflow efficiency and automating at least parts of the task-oriented enterprise workflow.
The service engagement platform supports two-way, text-based interaction centered around business-necessitated engagements between enterprises and their customers as they interact over a period of time. Note that the term ‘enterprise’ broadly encompasses an entity (which can be a business entity or a person) who serves a customer. The customer is sometimes referred to as ‘end-user’ or simply user, though based on the context, the term ‘user’ may also indicate the entity that is referred to as an enterprise elsewhere. Based on customer engagements and behaviors, the service engagement platform described here gradually derives a score that conveys how proximal the brand of the enterprise is to its customer base. This score is expected to become a standard of measurement for enterprises entering into a completely messaging-based interaction with their customers.
Generally speaking, a Brand Proximity Index (BPI) reflects how well an enterprise builds its relationship with its customer base over time. The BPI is computed from individual customer interactions, and each conversation flow has a brand proximity score (BPS).
The brand proximity score computation combines statistical processing and machine learning. Some of the important aspects taken into consideration are overall task completion, the level of the customer's engagement, and the efficiency of the system.
The measure of engagement is data-driven and uses historical multi-turn conversational data to estimate the likelihood that the customer continues to respond at each module. Multi-turn conversational modelling concatenates contextual utterances to ensure conversational consistency. The more effort it takes for a customer to respond, the higher the engagement score: a long text response from a customer will have a higher engagement score than a single click on a multiple-choice tab, regardless of the content of the response (such as negative feedback). The level of engagement also incorporates the response time. Generally, a short response time yields a higher engagement score than a lagged response time when all other conditions are the same. A task efficiency score incorporates the system latency and depends heavily on whether the task is completed and the number of steps it took to complete. For a successfully completed task, the more steps it takes, the slightly lower the score compared with tasks completed in fewer steps. For a task that is not completed, the task efficiency score suffers significantly, although score credits are given for each additional step that has been completed so far.
Specifically, an aspect of the present disclosure describes methods and systems for automatically assessing proximity of an enterprise's brand to a customer. A processing device obtains a first sub-score indicative of a degree of completion of a task that involves the enterprise providing a service to the customer, a second sub-score indicative of a level of user engagement between the enterprise and the customer, and a third sub-score indicative of efficiency of the task that involves the enterprise providing the service to the customer. The processing device then combines the first sub-score, the second sub-score and the third sub-score to determine a composite brand proximity score (BPS) indicative of the proximity of the enterprise's brand to the customer.
Note that the term “service” is broadly interpreted to encompass providing information about, or delivery of, tangible goods as well. Also, the term “user engagement” means engagement with a customer of the enterprise, where an “enterprise” can be an organization, an individual, or a team of individuals that provides the service to the customer. Also, the term score is used generically, though when a score has multiple components, those components can be indicated as sub-scores.
The present disclosure will be understood more fully from the detailed description given below and from the accompanying drawings of various embodiments of the disclosure.
Embodiments of the present disclosure are directed to determining a score indicative of how an enterprise's brand becomes proximal to a customer (or end-user) over time via progressive interactions.
The other engagement records that fall into the same time window (t, t+delta) are computed in the same manner, and the BPS_a, BPS_b, BPS_c, etc. are weighted based on the activeness of the end-users. The enterprise-level BPS at time t is computed along with the BPI from the last time period (i.e., at time (t−1)) as retrieved from BPI scoring database 234. In one embodiment, the moving average of the last K periods can be used as a time series smoothing (236) method, which yields a more robust estimate.
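For illustration, a minimal sketch of this aggregation and smoothing step is shown below; the function names, the use of activeness values as normalized weights, and the simple K-period moving average are assumptions made for the sketch rather than the platform's specific implementation.

```python
import numpy as np

def enterprise_bps(user_scores, activeness):
    """Aggregate per-user BPS values for one time window (t, t+delta),
    weighting each engagement by the activeness of its end-user."""
    scores = np.asarray(user_scores, dtype=float)    # BPS_a, BPS_b, BPS_c, ...
    weights = np.asarray(activeness, dtype=float)
    return float(np.average(scores, weights=weights))

def smoothed_bpi(bps_history, k=3):
    """Moving average of the last K enterprise-level BPS values,
    used here as a simple time-series smoothing step."""
    recent = bps_history[-k:]
    return sum(recent) / len(recent)
```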
The engagement score component 300A is based on two sub-components. The continuation score function indicated within the completion score component 308 gives a higher score to deeper engagement, with a discounting factor 309. The response time reward function within the response time score component 311 assigns a higher score for a short average user response time. Normalization 327 is applied to calculate the engagement score 310.
The efficiency score component 300B evaluates how well the workflow system handles the end-user's responses and whether the primary objective is achieved. The response time for each system interaction is stored in an array at 303. If the session is evaluated as completed at decision block 304, the response time array is padded with 0 (at reward padding block 305); otherwise, the response time array is padded with ‘infinite’ (at penalization padding block 306). The exponential score function takes the conceptually infinite response time and yields a score of 0 for that step. The temporary array, which stores the system response time at each step, is passed to the efficiency scoring component 307, and then to a weighted layer 313 that puts a higher weight on critical interaction steps. The prior probability for an end-user to continue at each step, p_j[u], can be used as weights. The efficiency score 312 is output as a result of these operations.
The brand proximity score (BPS) of one engagement is a linear combination of three sub-scores: the Task Completion (I) sub-score, the User Engagement Score (UES), and the Task Efficiency Score (TES):
BPS_i = I_i + α*UES_i + β*TES_i
The corresponding weights α and β are scaling parameters that reflect the relative importance of each sub-score.
I_i is a binary variable. It measures whether the task is completed or not. If the conversational flow goes through one of the pre-defined success nodes or modules, the score is 1; otherwise the score is 0. For example, reaching a credit-card payment section, or the last question section of a survey, indicates task completion.
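As a minimal sketch of this linear combination (the function name, the default values of α and β, and the boolean completion flag are illustrative assumptions; the disclosure does not prescribe particular weight values):

```python
def brand_proximity_score(completed: bool, ues: float, tes: float,
                          alpha: float = 1.0, beta: float = 1.0) -> float:
    """BPS_i = I_i + alpha * UES_i + beta * TES_i, where I_i is 1 if the
    conversational flow reached a pre-defined success node, else 0."""
    i = 1.0 if completed else 0.0
    return i + alpha * ues + beta * tes
```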
UES_i is designed to evaluate the level of engagement per session. The score takes two aspects into account: the steps the flow has gone through and the user response time at each step.
The system can set a probability function p_j[u] to represent the prior belief of whether the user will continue at step j. The p_j[u] is influenced by a few categorical variables, e.g., T_i > j−1 (the total number of turns being larger than the previous turn), N_i (the number of finite user-defined steps), and C_j (the type of response for turn j). The p_j[u] is the expected value of a Bernoulli distribution conditioned on these variables.
p_j[u] can be estimated by regression over a training dataset. If there is not sufficient data to get a robust estimate, an alternative way to compute p_j[u] uses a prior distribution, e.g., a Poisson distribution (with varying values of lambda λ), which reflects the expert's belief about the distribution of the engaging turns. This is shown by the set of plots 600 in FIG. 6.
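A minimal sketch of this prior-based alternative is shown below, assuming one plausible construction in which the total number of engaging turns is modeled as Poisson(λ) and p_j[u] is taken as the conditional probability of continuing past step j given that step j was reached; the function name and this exact conditioning are assumptions.

```python
from scipy.stats import poisson

def prior_continuation_prob(j: int, lam: float) -> float:
    """P(T > j | T >= j) when the number of engaging turns T ~ Poisson(lam).
    poisson.sf(k, lam) returns P(T > k)."""
    p_reached = poisson.sf(j - 1, lam)   # P(T >= j)
    p_beyond = poisson.sf(j, lam)        # P(T > j)
    return p_beyond / p_reached if p_reached > 0 else 0.0

# Varying lambda expresses different expert beliefs about how many turns
# users typically engage for, e.g., lam = 3 versus lam = 8.
```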
In general, g_0(p) is a decreasing score function that assigns a higher score to a lower p value. When compared with all the other candidate engagements, such an engagement is therefore ranked higher.
The selection of a scoring function is not an exact science. For example, the drop-out rate g_0(p) = 1 − p is a possible candidate that is naturally bounded within (0, 1); a Beta(α=2, β=0.98) density can also be used when a non-linearity assumption is desired.
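The drop-out-rate candidate can be sketched directly from the text; how the Beta(α=2, β=0.98) shape is applied is not specified, so the second function below reflects only one possible reading in which the Beta density is evaluated at the drop-out rate 1 − p so that the score remains decreasing in p.

```python
from scipy.stats import beta

def g0_dropout(p: float) -> float:
    """Drop-out-rate candidate: decreasing in p, naturally bounded within (0, 1)."""
    return 1.0 - p

def g0_beta(p: float, a: float = 2.0, b: float = 0.98) -> float:
    """Non-linear candidate (one possible reading): evaluate a Beta(a, b)
    density at the drop-out rate 1 - p, keeping the score decreasing in p."""
    return float(beta.pdf(1.0 - p, a, b))
```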
The total score for the steps will be a summation of each individual response score,
Σ_{j=1}^{T_i} γ^(n_j) * g_0(p_j[u])
where γ is a discounting factor and n_j is the number of repetitions of the same module due to validation. The discounting factor is introduced for repetitive validation engagement because some modules require a strict input format (e.g., MM/DD/YYYY). Those user responses demonstrate that the user continues to engage with the system, but without discounting such activities would lead to unbounded scoring. The discounting factor ensures that the engagement score has an upper bound.
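A short sketch of this discounted summation, using the drop-out-rate candidate g_0(p) = 1 − p from above (the function name and the default value of γ are illustrative):

```python
def step_engagement_score(p_continue, repeats, gamma=0.9):
    """Sum of per-step scores g_0(p_j[u]) with a gamma**n_j discount, where
    n_j counts repetitions of the same module caused by input validation."""
    total = 0.0
    for p_j, n_j in zip(p_continue, repeats):
        total += (gamma ** n_j) * (1.0 - p_j)   # g_0(p) = 1 - p
    return total
```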
In some embodiments, g_1(t_trim) is an exponential score function used in the score calculation, which takes in a trimmed average of the user response times for all messages in one session.
O is set to 1 and the t_k values are ordered by value. The general concept is that the quicker the average response time, the more engaged the user is.
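One possible form of g_1 is sketched below, treating O as the number of extreme values trimmed from each end of the ordered response times and τ as a free time-scale parameter; both readings, and the parameter defaults, are assumptions.

```python
import math

def g1_response_time_reward(response_times, trim=1, tau=30.0):
    """Exponential reward on the trimmed average user response time (seconds):
    a shorter average response time yields a reward closer to 1."""
    ordered = sorted(response_times)
    trimmed = ordered[trim:-trim] if len(ordered) > 2 * trim else ordered
    t_trim = sum(trimmed) / len(trimmed)
    return math.exp(-t_trim / tau)
```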
Another type of function, g_2(max_k(t_k[u])), is a step function that gives a memory reward when a user returns to engage with the system without a reminder. For example, the user might have been distracted by something else and later remembered to continue engaging with the system and finish the flow.
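A possible shape for this step function is sketched below; the threshold and the size of the memory reward are illustrative assumptions.

```python
def g2_memory_reward(response_times, threshold=600.0, reward=1.0):
    """Step function: if the longest user response time exceeds a threshold
    (e.g., ten minutes) yet the user still returned without a reminder,
    grant a fixed memory reward; otherwise, no bonus."""
    return reward if max(response_times) > threshold else 0.0
```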
To sum up, the user engagement score for one session i:
T_i is the total number of user responses for session i. For the task efficiency score, g_3(t_j[s]) is an exponential function that takes in the system response time at step j and yields a score. The longer the system takes to respond, the lower the score. However, the penalization for a lagged system response differs across stages and modules. The relative importance a module plays in the full workflow can be applied here with a proper parameter setting. Another way is to utilize the prior probability p_j[u] that the user continues at step j. When the conversation starts, the user has a higher chance of continuing to engage because they want to explore, so a lagged system response at an earlier stage raises the probability of discontinuation. The discontinuation at an earlier stage can be penalized with more weight according to the following equation:
K is a fixed parameter set to be larger than the length of all the engagements. If the session is completed, the rest of the array is padded with K−T_i default system responses, each with a response time set to 0. If the task flow is not completed, the padded system response times are set to infinite.
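The sketch below combines the padding rule with the per-step exponential function g_3 and the p_j[u]-based weights described above; the exact exponential form, the normalization by the weight sum, the padding of the weights, and the names are assumptions made for illustration.

```python
import math

def task_efficiency_score(sys_response_times, p_continue, completed,
                          k=50, tau=5.0):
    """Pad the per-step system response times to a fixed length K, then apply
    an exponential score g_3 per step, weighted by the prior continuation
    probability p_j[u] (higher weight on earlier, more critical steps)."""
    pad_value = 0.0 if completed else math.inf     # reward vs. penalization padding
    times = list(sys_response_times) + [pad_value] * (k - len(sys_response_times))
    weights = list(p_continue) + [p_continue[-1]] * (k - len(p_continue))
    total, weight_sum = 0.0, 0.0
    for t_j, w_j in zip(times, weights):
        g3 = math.exp(-t_j / tau)                  # an infinite time yields a 0 score
        total += w_j * g3
        weight_sum += w_j
    return total / weight_sum
```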
The BPS for an enterprise at time window t is an average of the BPS values of the user engagements in that time window.
The BPI (brand proximity index) at time t is a combination of the latest BPS and the historical BPS values:
BPI_t = BPS_t*0.7 + BPS_(t−1)*0.2 + BPS_(t−2)*0.1; t ∈ {1, 2, 3, . . . }
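Instantiating this recurrence directly (the helper name is illustrative; the two earlier BPS values are assumed to be available, e.g., from the BPI scoring database 234):

```python
def brand_proximity_index(bps_t, bps_t1, bps_t2):
    """BPI_t = 0.7*BPS_t + 0.2*BPS_(t-1) + 0.1*BPS_(t-2)."""
    return 0.7 * bps_t + 0.2 * bps_t1 + 0.1 * bps_t2
```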
At operation 910, the enterprise engages with a customer to whom the enterprise provides a service. Note that the term “service” is broadly interpreted to encompass providing information about or delivery of tangible goods too.
At operation 920, a first sub-score is obtained, as described above, the first sub-score being indicative of a degree of completion of a task that involves the enterprise providing the service to the customer.
At operation 930, a second sub-score is obtained, as described above, the second sub-score being indicative of a level of user engagement between the enterprise and the customer.
At operation 940, a third sub-score is obtained, as described above, the third sub-score being indicative of efficiency of the task that involves the enterprise providing the service to the customer.
At operation 950, the processing device combines the first, second and the third sub-scores to determine a composite BPS indicative of the proximity of the enterprise's brand to the customer.
The machine can be a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a server, a network router, a switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
The example computer system 1000 includes a processing device 1002, a main memory 1004 (e.g., read-only memory (ROM), flash memory, dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM) or Rambus DRAM (RDRAM), etc.), a static memory 1006 (e.g., flash memory, static random access memory (SRAM), etc.), and a data storage system 1018, which communicate with each other via a bus 1030.
Processing device 1002 represents one or more general-purpose processing devices such as a microprocessor, a central processing unit, or the like. More particularly, the processing device can be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or a processor implementing other instruction sets, or processors implementing a combination of instruction sets. Processing device 1002 can also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like. The processing device 1002 is configured to execute instructions 1028 for performing the operations and steps discussed herein. The computer system 1000 can further include a network interface device 1008 to communicate over the network 1020.
The data storage system 1018 can include a machine-readable storage medium 1024 (also known as a computer-readable medium) on which is stored one or more sets of instructions 1028 or software embodying any one or more of the methodologies or functions described herein. The instructions 1028 can also reside, completely or at least partially, within the main memory 1004 and/or within the processing device 1002 during execution thereof by the computer system 1000, the main memory 1004 and the processing device 1002 also constituting machine-readable storage media. The machine-readable storage medium 1024, data storage system 1018, and/or main memory 1004 can correspond to a memory sub-system.
In one embodiment, the instructions 1028 include instructions to implement functionality corresponding to the BPS calculation component 1013. While the machine-readable storage medium 1024 is shown in an example embodiment to be a single medium, the term “machine-readable storage medium” should be taken to include a single medium or multiple media that store the one or more sets of instructions. The term “machine-readable storage medium” shall also be taken to include any medium that is capable of storing or encoding a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure. The term “machine-readable storage medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.
Some portions of the preceding detailed descriptions have been presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the ways used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of operations leading to a desired result. The operations are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. The present disclosure can refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage systems.
The present disclosure also relates to an apparatus for performing the operations herein. This apparatus can be specially constructed for the intended purposes, or it can include a general purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program can be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions, each coupled to a computer system bus.
The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general purpose systems can be used with programs in accordance with the teachings herein, or it can prove convenient to construct a more specialized apparatus to perform the method. The structure for a variety of these systems will appear as set forth in the description below. In addition, the present disclosure is not described with reference to any particular programming language. It will be appreciated that a variety of programming languages can be used to implement the teachings of the disclosure as described herein.
The present disclosure can be provided as a computer program product, or software, that can include a machine-readable medium having stored thereon instructions, which can be used to program a computer system (or other electronic devices) to perform a process according to the present disclosure. A machine-readable medium includes any mechanism for storing information in a form readable by a machine (e.g., a computer). In some embodiments, a machine-readable (e.g., computer-readable) medium includes a machine (e.g., a computer) readable storage medium such as a read only memory (“ROM”), random access memory (“RAM”), magnetic disk storage media, optical storage media, flash memory devices, etc.
In the specification, embodiments of the disclosure have been described with reference to specific example embodiments thereof. It will be evident that various modifications can be made thereto without departing from the broader spirit and scope of embodiments of the disclosure as set forth in the following claims. The specification and drawings are, accordingly, to be regarded in an illustrative sense rather than a restrictive sense.
This application claims the benefit of U.S. Provisional Patent Application No. 62/951,707, filed Dec. 20, 2019, entitled, “BRAND PROXIMITY SCORE FOR TASK AUTOMATION PLATFORM,” the entirety of which is incorporated herein by reference.