The present invention relates to the automated servicing of messages from a user, and more particularly relates to a multi-domain chatbot configured to automatically service domain-specific messages from the user.
The use of chatbots is becoming more widespread due to the increasing power of Artificial Intelligence (AI) and the growing sophistication of natural language understanding (NLU) systems. Instead of interacting with a human agent, a user is able to interact with one chatbot to check the balance of his/her bank account, another chatbot to search for available hotels, and yet another chatbot to receive flight status information. Discussed herein are techniques for providing increased convenience and a more streamlined experience for a user who desires to use a plurality of domain-specific chatbots.
In accordance with one embodiment of the invention, a multi-domain chatbot is used to service a message of a user. An automated agent of the multi-domain chatbot may act as an intermediary between the user and a plurality of domain-specific modules of the multi-domain chatbot. The automated agent may receive the message from the user, determine an intent of the message, and based on the intent, determine a group of the domain-specific modules that should be investigated. The automated agent may then investigate the group of domain-specific modules by sending the user message to and receiving responses from the domain-specific modules within the group. Based on the received responses, the automated agent may determine whether to provide one of the domain-specific responses to the user or, in the event that none of the domain-specific responses is aligned with the intent of the message, a null response.
The process to determine whether to provide one of the domain-specific responses or a null response may include determining whether any of the domain-specific responses is aligned with the intent of the message. Such a determination may be based on at least one of prior interactions between the automated agent and the user or interactions between the automated agent and other users. If at least one of the domain-specific responses is aligned with the intent of the message, the domain-specific modules may be ranked based on a criterion of how likely it is that each of the domain-specific modules will be able to satisfy the intent of the user message. The ranking of the domain-specific modules may also be based on at least one of prior interactions between the automated agent and the user or interactions between the automated agent and other users. The response from the most highly ranked domain-specific module may then be transmitted to the user's client device.
If, however, none of the domain-specific responses is aligned with the intent of the message, a null response may be selected from a plurality of null responses based on at least one of the intent of the user message and the responses from the group of domain-specific modules. The selected null response may then be transmitted to the client device. In one embodiment, the selected null response may inform the user of one or more domain-specific modules that have been investigated and ruled out for not being able to address (or sufficiently address) the user message. In another example, the selected null response may request information from the user to clarify the intent of the user message. In additional examples, the selected null response may offer an apology to the user, state that the intent of the user message cannot be understood, transfer the user to a human agent, or direct the user to contact a human agent.
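Purely as an illustration, the following Python sketch shows one possible shape of the orchestration summarized above. The names used here (DomainModule, determine_intent, identify_group, is_aligned, rank_modules, select_null_response) are hypothetical placeholders, not elements of any particular embodiment.

```python
# A minimal sketch, assuming hypothetical helper callables, of an automated agent
# acting as an intermediary between the user and a group of domain-specific modules.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class DomainModule:
    name: str                      # e.g., "IT module", "finance module"
    respond: Callable[[str], str]  # returns this domain's candidate response


def service_message(
    message: str,
    determine_intent: Callable[[str], str],
    identify_group: Callable[[str], List[DomainModule]],
    is_aligned: Callable[[str, str], bool],
    rank_modules: Callable[[str, List[str]], List[str]],
    select_null_response: Callable[[str, List[str]], str],
) -> str:
    """Service a user message and return either a domain-specific or a null response."""
    intent = determine_intent(message)                        # NLU step
    group = identify_group(intent)                            # modules worth investigating
    responses = {m.name: m.respond(message) for m in group}   # fan out the user message

    aligned = [m.name for m in group if is_aligned(responses[m.name], intent)]
    if aligned:
        best = rank_modules(intent, aligned)[0]               # most highly ranked module
        return responses[best]
    # None of the domain-specific responses fits the intent: fall back to a null response.
    return select_null_response(intent, [m.name for m in group])
```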
These and other embodiments of the invention are more fully described in association with the drawings below.
In the following detailed description of the preferred embodiments, reference is made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments in which the invention may be practiced. It is understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the present invention. Descriptions associated with any one of the figures may be applied to different figures containing like or similar components/steps.
Multi-domain chatbot 108 may include automated agent 110 that acts as an interface or intermediary between user 102 and one or more of the domain-specific modules (e.g., 112a, 112b and 112c). As an example, automated agent 110 may receive a message from user 102, and return a response to the message from the most relevant one of the domain-specific modules, or if no response from any of the domain-specific modules is suitable for responding to the message, automated agent 110 may return a null response. Specific details of the operation of multi-domain chatbot 108 will be provided below in the description of
In the embodiment of
The domains serviced by the domain-specific modules may vary depending on the specific context in which multi-domain chatbot 108 is instantiated. In an enterprise context, example domains may include information technology (IT) (e.g., software support, hardware support), finance, human resources (HR), management, etc., and example domain-specific modules may include a module adapted to respond to messages regarding IT issues, a module adapted to respond to messages regarding finance issues, a module adapted to respond to messages regarding HR issues, etc. In a college context, example domains may include enrollment, athletics, fundraising, student organizations, etc. In a travel agency context, example domains may include airline, cruise, hotel, weather, excursions, marketing, tour packages, etc.
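By way of a hypothetical example only, such context-dependent groupings might be expressed as a simple configuration mapping; the context keys and module names below are illustrative assumptions.

```python
# Illustrative configuration: which domain-specific modules are instantiated
# for a given deployment context. Names are examples only.
DOMAIN_MODULES_BY_CONTEXT = {
    "enterprise": ["it_module", "finance_module", "hr_module", "management_module"],
    "college": ["enrollment_module", "athletics_module", "fundraising_module",
                "student_orgs_module"],
    "travel_agency": ["airline_module", "cruise_module", "hotel_module",
                      "weather_module", "excursions_module", "tour_packages_module"],
}


def modules_for_context(context: str) -> list:
    """Return the domain-specific modules configured for a given context."""
    return DOMAIN_MODULES_BY_CONTEXT.get(context, [])
```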
At step 204, a natural language understanding (NLU) system of automated agent 110 may determine an intent of the message from client device 104. Example NLU systems include the Moveworks AI platform from Moveworks, Inc.® of Mountain View, Calif.; DialogFlow from Alphabet Inc.® of Mountain View, Calif.; and Language Understanding (LUIS) from Microsoft Corp.® of Redmond, Wash. An intent may refer to a taxonomy or class into which a message from the user may be classified. For example, all the following messages, “My laptop is not working”, “My laptop has an issue”, “I don't know what is wrong with my laptop” may be classified under the intent of “User has a problem with his/her laptop”.
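To make the notion of an intent concrete, the following deliberately simple, rule-based sketch maps several surface forms of a message to a single intent class. It is not representative of how the commercial NLU systems named above operate, and the keyword lists are assumptions made for illustration.

```python
# A toy, keyword-based illustration of intent determination: many differently
# worded messages map to one intent class. Keyword lists are assumptions.
INTENT_KEYWORDS = {
    "User has a problem with his/her laptop": ("laptop",),
    "User needs help with a reimbursement": ("reimburse", "reimbursement", "expense"),
}


def determine_intent(message: str) -> str:
    text = message.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            return intent
    return "unknown intent"


# Different surface forms classify to the same intent.
assert determine_intent("My laptop is not working") == \
       determine_intent("I don't know what is wrong with my laptop")
```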
At step 206, automated agent 110 may identify a group of domain-specific modules that should be investigated based on the identified intent of the message. In one embodiment, the identified group may include at least two domain-specific modules. Such identification of the group of domain-specific modules may be based on at least one of prior interactions between automated agent 110 and user 102 or interactions between automated agent 110 and other users (also referred to as “historical data”), and the identification in step 206 may be implemented using a table look-up and/or machine learning. In a table look-up approach, a table may be used to map the intent of “User has a problem with his/her laptop” to the domain-specific module of an “IT module”.
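A minimal form of such a table look-up is sketched below; the intents and module names are illustrative assumptions.

```python
# A minimal table look-up from intent to the group of modules to investigate.
INTENT_TO_MODULES = {
    "User has a problem with his/her laptop": ["IT module"],
    "User needs a reimbursement": ["finance module"],
    # An intent may map to more than one module worth investigating.
    "User has a question about relocation": ["HR module", "finance module"],
}


def identify_group(intent: str) -> list:
    """Return the group of domain-specific modules to investigate for an intent."""
    return INTENT_TO_MODULES.get(intent, [])
```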
In a machine learning approach, a model (not depicted) may be used to identify one or more domain-specific modules that are suitable to address a particular intent. More specifically, during a training phase, the model may be provided with known pairings of inputs and outputs (e.g., input of “User has a problem with his/her laptop” paired with the output of “IT module”; and various other pairings) so as to tune parameters of the model. Subsequently, during a model application phase, the model (with the parameters optimized during the training phase) may be used to identify one or more domain-specific modules that are suitable to address a particular intent (e.g., determine “finance module” in response to the intent of “User needs a reimbursement”).
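The following sketch illustrates the training and model application phases using an off-the-shelf text classifier; scikit-learn is used here only as a convenient stand-in, and the pairings of intents and modules are illustrative assumptions.

```python
# A sketch of the machine learning approach: a classifier trained on known
# (intent -> module) pairings, then applied to a new intent.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

known_intents = [
    "User has a problem with his/her laptop",
    "User cannot connect to the VPN",
    "User needs a reimbursement",
    "User has a question about payroll",
]
known_modules = ["IT module", "IT module", "finance module", "finance module"]

# Training phase: tune the model parameters on the known pairings.
module_model = make_pipeline(TfidfVectorizer(), LogisticRegression())
module_model.fit(known_intents, known_modules)

# Model application phase: identify a suitable module for a given intent.
print(module_model.predict(["User needs a reimbursement"])[0])  # expected: "finance module"
```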
At step 208, automated agent 110 may transmit the user message (and in some embodiments, may also transmit the identified intent) to each of the domain-specific modules identified in step 206. In the example of
At step 212, automated agent 110 may, based on responses 210a, 210b and 210c, determine whether to transmit a response from one of the domain-specific modules or a null response to the user. Additional details of step 212 will be provided below in the description of
In a machine learning approach, a model (not depicted) may be used to determine whether a response is aligned with an intent. More specifically, during a training phase, the model may be provided with known pairings of inputs and outputs (e.g., input of [response: “Can you send a picture of the receipt?”, intent: “User needs help with a reimbursement”] paired with the output “aligned”; input of [response: “Can you send a profile picture?”, intent: “User needs help with a reimbursement”] paired with the output “not aligned”; and various other pairings) so as to tune parameters of the model. Subsequently, in a model application phase, the model (with the parameters tuned during the training phase) may be used to, for example, determine whether a domain-specific response is aligned with the intent of a message.
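A corresponding sketch of the alignment model is shown below. The (response, intent) pairings, their labels, and the simple featurization that concatenates each pair into one string are all assumptions made for illustration.

```python
# A sketch of an alignment model: a binary classifier over (response, intent) pairs.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

pairs = [
    ("Can you send a picture of the receipt?", "User needs help with a reimbursement"),
    ("Can you send a profile picture?", "User needs help with a reimbursement"),
    ("Have you tried restarting your laptop?", "User has a problem with his/her laptop"),
    ("Your PTO balance is 12 days.", "User has a problem with his/her laptop"),
]
labels = ["aligned", "not aligned", "aligned", "not aligned"]


def pair_text(response: str, intent: str) -> str:
    # Join the response and the intent so a single text classifier sees both.
    return response + " [SEP] " + intent


# Training phase: tune the model parameters on the known pairings.
alignment_model = make_pipeline(TfidfVectorizer(), LogisticRegression())
alignment_model.fit([pair_text(r, i) for r, i in pairs], labels)


def is_aligned(response: str, intent: str) -> bool:
    """Model application phase: is a domain-specific response aligned with the intent?"""
    return alignment_model.predict([pair_text(response, intent)])[0] == "aligned"
```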
Known pairings of inputs and outputs may be based on past interactions between multi-domain chatbot 108 and user 102. For instance, if, in the past, the user message of “Can you reimburse $22.05 for the lunch I had with the client?” resulted in the identification of the intent of “User needs help with a reimbursement”, which returned the domain-specific response of “Can you send a picture of the receipt?”, and the user subsequently followed through and submitted a picture of the receipt, it may be inferred from the user's follow-through that the response “Can you send a picture of the receipt?” is aligned with the intent “User needs help with a reimbursement”. On the other hand, if, in the past, the user message of “Can you reimburse $22.05 for the lunch I had with the client?” resulted in the identification of the intent of “User needs help with a reimbursement”, which returned the domain-specific response of “Can you send a profile picture?”, and the user ignored this request of multi-domain chatbot 108, it may be inferred from the user's inaction that the response “Can you send a profile picture?” is not aligned with the intent “User needs help with a reimbursement”.
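The inference described above might be captured by converting interaction records into labeled training pairs, as in the following sketch; the log format and field names are assumptions made for illustration.

```python
# Deriving training labels from past interactions: a follow-through by the user
# suggests "aligned", whereas inaction suggests "not aligned". Field names are
# hypothetical.
interaction_log = [
    {"intent": "User needs help with a reimbursement",
     "response": "Can you send a picture of the receipt?",
     "user_followed_through": True},
    {"intent": "User needs help with a reimbursement",
     "response": "Can you send a profile picture?",
     "user_followed_through": False},
]


def labeled_pairs(log):
    """Convert interaction records into ((response, intent), label) training pairs."""
    return [
        ((record["response"], record["intent"]),
         "aligned" if record["user_followed_through"] else "not aligned")
        for record in log
    ]


print(labeled_pairs(interaction_log))
```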
If automated agent 110 determines that at least one of the domain-specific responses is aligned with the intent of the message, the process may proceed to step 308 (take “Yes” branch of step 302), in which automated agent 110 may rank the domain-specific modules within the group in accordance with a criterion. In one embodiment, the ranking of the domain-specific modules may take into account at least one of prior interactions between automated agent 110 and user 102 or interactions between automated agent 110 and other users (also known as “historical data”). Further, the criterion used to rank the domain-specific modules may consider how likely it is that each of the domain-specific modules will be able to satisfy the intent of the user message. At step 310, the response from the most highly ranked domain-specific module may be transmitted to client device 104.
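One hypothetical way to realize such a ranking criterion is to score each module by a historical success rate for the identified intent, as in the sketch below; the rates shown are illustrative assumptions.

```python
# Ranking the investigated modules by an estimated likelihood of satisfying the
# intent, here approximated by a historical success rate per (intent, module) pair.
HISTORICAL_SUCCESS_RATE = {
    ("User needs help with a reimbursement", "finance module"): 0.85,
    ("User needs help with a reimbursement", "HR module"): 0.30,
    ("User has a problem with his/her laptop", "IT module"): 0.90,
}


def rank_modules(intent: str, modules: list) -> list:
    """Return modules ordered from most to least likely to satisfy the intent."""
    return sorted(
        modules,
        key=lambda module: HISTORICAL_SUCCESS_RATE.get((intent, module), 0.0),
        reverse=True,
    )


# The response from the most highly ranked module would then be returned (step 310).
print(rank_modules("User needs help with a reimbursement",
                   ["HR module", "finance module"]))  # finance module ranks first
```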
If, on the other hand, automated agent 110 determines that none of the domain-specific responses is aligned with the intent of the message, the process may proceed to step 304 (take “No” branch of step 302), in which automated agent 110 may select a null response from a group of null responses based on at least one of the intent of the user message and the responses from the group of domain-specific modules. In one example, the selected null response may inform the user of one or more domain-specific modules that have been investigated and ruled out for not being able to address (or sufficiently address) the user message. In another example, the selected null response may request information from the user to clarify the intent of the user message. In additional examples, the selected null response may offer an apology to the user, state that the intent of the user message cannot be understood, transfer the user to a human agent, or direct the user to contact a human agent. At step 306, automated agent 110 may transmit the null response to client device 104.
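A minimal sketch of such a null-response selection follows; the wording of the responses and the selection rules are illustrative assumptions. A helper of this general shape could serve as the select_null_response callable in the earlier orchestration sketch.

```python
# Selecting a null response based on the intent and on which modules were
# investigated and ruled out. Wording and rules are illustrative only.
def select_null_response(intent: str, investigated_modules: list) -> str:
    if intent == "unknown intent":
        # Ask the user to clarify when the intent itself could not be understood.
        return "I'm sorry, I didn't understand that. Could you rephrase your request?"
    if investigated_modules:
        # Tell the user which domains were investigated and ruled out.
        ruled_out = ", ".join(investigated_modules)
        return (f"I checked the following areas but none could address your "
                f"request: {ruled_out}. Would you like me to connect you with "
                f"a human agent?")
    return "I'm sorry, I can't help with that. Please contact a human agent."


# The selected null response would then be transmitted to the client device (step 306).
print(select_null_response("User needs a parking permit", ["IT module", "HR module"]))
```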
One motivation for using automated agent 110 as an intermediary between user 102 and the domain-specific modules is to eliminate and/or reduce the “back-and-forth” communication that might need to take place in order for user 102 to determine the domain-specific module that is the most suitable for responding to his/her message. Instead, this “back-and-forth” communication to explore the domain-specific modules may be off-loaded to automated agent 110, lessening the burden on the user to explore the domain-specific modules. In some instances, however, automated agent 110 may not be able to confidently determine a domain-specific module that is best suited to service the message of user 102, and some “back-and-forth” communication between user 102 and multi-domain chatbot 108 may still be necessary in order to select one (or more) of the domain-specific modules to service the message of user 102.
At step 512, automated agent 110 may determine whether any of the domain-specific responses is aligned with the intent of the message. In the example of
As illustrated in the present example, even though multi-domain chatbot 108 was unable to directly service the message of user 102, the user may still be able to acquire useful information from multi-domain chatbot 108, in the form of which domains (and/or departments) were already searched and ruled out. Such information can then be used by the user to focus his/her attention on other, more promising domains that may be able to address the outstanding problem or issue.
As is apparent from the foregoing discussion, aspects of the present invention involve the use of various computer systems and computer readable storage media having computer-readable instructions stored thereon.
Computer system 600 includes a bus 602 or other communication mechanism for communicating information, and a processor 604 coupled with the bus 602 for processing information. Computer system 600 also includes a main memory 606, such as a random access memory (RAM) or other dynamic storage device, coupled to the bus 602 for storing information and instructions to be executed by processor 604. Main memory 606 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 604. Computer system 600 further includes a read only memory (ROM) 608 or other static storage device coupled to the bus 602 for storing static information and instructions for the processor 604. A storage device 610, for example a hard disk, flash memory-based storage medium, or other storage medium from which processor 604 can read, is provided and coupled to the bus 602 for storing information and instructions (e.g., operating systems, applications programs and the like).
Computer system 600 may be coupled via the bus 602 to a display 612, such as a flat panel display, for displaying information to a computer user. An input device 614, such as a keyboard including alphanumeric and other keys, may be coupled to the bus 602 for communicating information and command selections to the processor 604. Another type of user input device is cursor control device 616, such as a mouse, a trackpad, or similar input device for communicating direction information and command selections to processor 604 and for controlling cursor movement on the display 612. Other user interface devices, such as microphones, speakers, etc. are not shown in detail but may be involved with the receipt of user input and/or presentation of output.
The processes referred to herein may be implemented by processor 604 executing appropriate sequences of computer-readable instructions contained in main memory 606. Such instructions may be read into main memory 606 from another computer-readable medium, such as storage device 610, and execution of the sequences of instructions contained in the main memory 606 causes the processor 604 to perform the associated actions. In alternative embodiments, hard-wired circuitry or firmware-controlled processing units may be used in place of or in combination with processor 604 and its associated computer software instructions to implement the invention. The computer-readable instructions may be rendered in any computer language.
In general, all of the above process descriptions are meant to encompass any series of logical steps performed in a sequence to accomplish a given purpose, which is the hallmark of any computer-executable application. Unless specifically stated otherwise, it should be appreciated that throughout the description of the present invention, use of terms such as “processing”, “computing”, “calculating”, “determining”, “displaying”, “receiving”, “transmitting” or the like, refer to the action and processes of an appropriately programmed computer system, such as computer system 600 or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within its registers and memories into other data similarly represented as physical quantities within its memories or registers or other such information storage, transmission or display devices.
Computer system 600 also includes a communication interface 618 coupled to the bus 602. Communication interface 618 may provide a two-way data communication channel with a computer network, which provides connectivity to and among the various computer systems discussed above. For example, communication interface 618 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN, which itself is communicatively coupled to the Internet through one or more Internet service provider networks. The precise details of such communication paths are not critical to the present invention. What is important is that computer system 600 can send and receive messages and data through the communication interface 618 and in that way communicate with hosts accessible via the Internet. It is noted that the components of system 600 may be located in a single device or located in a plurality of physically and/or geographically distributed devices.
Thus, a multi-domain chatbot has been described. It is to be understood that the above description is intended to be illustrative, and not restrictive. Many other embodiments will be apparent to those of skill in the art upon reviewing the above description. The scope of the invention should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.