1. Field of the Invention
This invention relates generally to telecommunications and, more specifically, to communications between users of communication devices and customer service agents.
2. Description of the Background Art
Most companies need to provide customer care and make customer service agents available to customers. Each time a customer calls a customer service agent and consumes the agent's time, it costs the company money and cuts into profit margins. To reduce these costs, many companies have implemented automated customer care options via Interactive Voice Response (IVR) systems, or self-service customer care options via the web. However, sometimes a user really does need or want to talk to a customer service agent. Therefore, in order to provide users with adequate customer care and reduce customer care costs, there is a need for a system and method that enables customer service agents to serve customers efficiently.
The present invention provides a method, system, and software application that enable customer service agents to more efficiently assist customers. Specifically, the present invention enables a customer service agent to simultaneously engage in communication sessions with multiple users.
In one embodiment of the present invention, a user speaks a request, question, or statement into a communication device. The user's speech input is converted to text and the text is sent to a customer service agent. The customer service agent reads the text and types a response. The customer service agent's text response is converted to speech and played to the user on the communication device. The user may also see the response as text on the display screen of his communication device.
In an alternate embodiment, the user's speech input is provided to the customer service agent in the form of an audio file. The customer service agent then listens to the audio file, and types a text response. The response is then provided to the user, either in text form, speech form (by converting the text to speech), or both.
In a further embodiment, the user's speech input is converted to text and the text is sent to a customer service agent. The customer service agent reads the text and records a speech response, which is stored as an audio file. The audio file is then played back to the user.
Since the customer service agent is not talking live on the communication device with a user, the customer service agent can engage in communication sessions with multiple users simultaneously. While one user is digesting a customer service agent's response, the customer service agent can be responding to another user.
For each eligible user requesting to communicate with a customer service agent, the system opens up a communication session for the user (120). A communication session is a set of related communications between a user and one or more customer service agents. A communication session is associated with a record of the communications between a user and customer service agent(s). When a communication session is open, the record is updated with each communication between the user and the agent.
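The session record described above can be modeled as a simple append-only log of communications keyed by user. A minimal sketch, in which all class and field names are illustrative rather than taken from the specification:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class Message:
    """One communication within a session: who sent it, and the text."""
    sender: str  # "user" or an agent identifier
    text: str
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class CommunicationSession:
    """A set of related communications between a user and one or more agents."""
    user_id: str
    messages: List[Message] = field(default_factory=list)
    open: bool = True

    def update(self, sender: str, text: str) -> None:
        """Append a communication to the record while the session is open."""
        if not self.open:
            raise ValueError("session is closed")
        self.messages.append(Message(sender, text))

session = CommunicationSession(user_id="user-1")
session.update("user", "What is my account balance?")
session.update("agent-7", "Your balance is $42.")
```

Because the record accumulates every exchange, any agent who later receives the session sees the full conversation context.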
During a communication session, the system enables the user to enter speech input for a customer service agent (130). The user enters speech input by speaking into his communication device. The speech input is then converted to text (140). The session record is updated with the text (150), and the system provides the customer service agent with the session record, where the user's speech input is displayed as text on the customer service agent's screen (160).
The customer service agent provides a text response (170) (or enters a speech response, which is converted to text), and the session is updated with the text response (180). The customer service agent's text response is converted to speech and played to the user in the form of speech (190). In one embodiment, the user is provided with the customer service agent's response in both speech and text form (e.g., the user hears the customer service agent's response and sees the text response on the display screen of his mobile phone). Alternatively, the customer service agent's response is provided to the user only in text form.
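The numbered steps above form one round trip. The sketch below stubs out the speech-to-text and text-to-speech engines (a real system would call actual recognition and synthesis engines); function names are illustrative:

```python
def speech_to_text(audio: bytes) -> str:
    """Stub ASR engine (step 140); pretends the audio is already a transcript."""
    return audio.decode("utf-8")

def text_to_speech(text: str) -> bytes:
    """Stub TTS engine (step 190); a real system would synthesize audio."""
    return text.encode("utf-8")

def round_trip(session_record: list, user_audio: bytes, agent_respond) -> bytes:
    """One user-input/agent-response exchange (steps 130-190)."""
    user_text = speech_to_text(user_audio)        # step 140: convert speech to text
    session_record.append(("user", user_text))    # step 150: update session record
    agent_text = agent_respond(session_record)    # steps 160-170: agent reads, responds
    session_record.append(("agent", agent_text))  # step 180: update with response
    return text_to_speech(agent_text)             # step 190: play response as speech

record = []
audio_out = round_trip(record, b"Where is my order?",
                       lambda rec: "It ships tomorrow.")
```

The agent callback receives the whole session record, mirroring step 160, where the agent is shown the conversation so far rather than only the latest input.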
In an alternate embodiment, the user's speech input is provided to the customer service agent in the form of an audio file. The customer service agent then listens to the audio file, and types a text response. The response is then provided to the user, either in text form, speech form (by converting the text to speech), or both.
Since the customer service agent is not talking to the user live, the customer service agent can engage in communication sessions with multiple users simultaneously. While one user is digesting a customer service agent's response, the customer service agent can be responding to another user.
During a communication session, a user may communicate with the same customer service agent, or may communicate with multiple customer service agents. In most cases, it will be most efficient for the same customer service agent to service the user during a communication session. However, it is possible for different customer service agents to service the user during a single communication session. For instance, during the same communication session, one customer service agent may respond to a first question spoken by a user, and another customer service agent may respond to a second question from the user. In this way, the invention can “packetize” interactions between users and customer service agents, where one user input/agent response is like a “packet.” The system can packetize interactions to load balance and/or to ensure that the user inquiry is routed to a customer service agent best suited to respond to the inquiry (e.g., to provide first- and second-level support to the user). The fact that multiple customer service agents are responding to a user during a communication session may not be apparent to the user (i.e., the user experience may be that he is communicating with the same customer service agent).
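The "packetized" routing described above can be sketched as a dispatcher that assigns each user-input packet independently, so successive packets in one session may land on different agents while the user-facing session stays the same. This round-robin version is one possible policy; class and agent names are hypothetical:

```python
from itertools import cycle

class PacketRouter:
    """Routes each user-input "packet" to an agent independently (round robin)."""

    def __init__(self, agents):
        self._agents = cycle(agents)

    def route(self, session_id: str, user_text: str) -> str:
        # Each packet may be handled by a different agent within the same
        # session; the session id is unchanged, so the switch is transparent
        # to the user.
        return next(self._agents)

router = PacketRouter(["agent-A", "agent-B"])
assignments = [router.route("session-1", q)
               for q in ["First question?", "Second question?"]]
# Two packets from the same session, handled by two different agents.
```

A production router could instead inspect `user_text` to pick the agent best suited to the inquiry (e.g., first- versus second-level support), as the passage above suggests.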
The Server Application 330 includes (1) a Session Manager 332 that keeps track of open communication sessions between users and customer service agents; (2) a Load Balancer 334 that allocates an agent to a particular session or communication from a user; and (3) a Server Network Module 336 that interfaces with a network.
The Agent Application 340 on the customer service agent's computer includes (1) an Agent User Interface Module 342 that provides a visual interface to the customer service agent (on the agent's computer screen); and (2) an Agent Network Module 344 that interfaces with a network.
Those skilled in the art will appreciate that the user's communication device, the Server, and the customer service agent's computer will include additional functionality not represented by the above Client Application 320, Server Application 330, and Agent Application 340. However, such functionality is well known in the art and a discussion of such functionality is not relevant for an understanding of the invention described herein. Moreover, those skilled in the art will appreciate that there may be many ways to implement the present invention, and the software implementations described herein with respect to
The Server Network Module 336 receives the text from the Client Network Module 328 (420). The Session Manager 332 on the Server then updates the user's communication session with the text (425). This involves determining if an open communication session exists for the user. If an open communication session exists (i.e., the text from the user is part of an ongoing, existing conversation with a customer service agent), the Session Manager 332 updates the existing communication session. If an open communication session does not exist (i.e., the user is initiating a conversation with a customer service agent), the Session Manager 332 opens a new communication session for the user and updates the new session with the text from the user.
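The Session Manager's update logic in step 425 is a get-or-create pattern: reuse the user's open session if one exists, otherwise open a new one and update it. A minimal sketch, with illustrative names:

```python
class SessionManager:
    """Tracks open communication sessions, keyed by user (cf. step 425)."""

    def __init__(self):
        self._open_sessions = {}  # user_id -> list of (sender, text) tuples

    def update_with_user_text(self, user_id: str, text: str) -> list:
        # Existing open session: the text is part of an ongoing conversation.
        # No open session: the user is initiating one, so open it first.
        record = self._open_sessions.setdefault(user_id, [])
        record.append(("user", text))
        return record

mgr = SessionManager()
mgr.update_with_user_text("u1", "Hello")          # opens a new session
record = mgr.update_with_user_text("u1", "Hi")    # updates the existing one
```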
The Load Balancer 334 on the Server then identifies an appropriate customer service agent to receive the session and transfers the session record to the customer service agent via the Server Network Module 336 (430). If the communication session is a new communication session, the Load Balancer 334 may use conventional load balancing techniques (e.g., round robin, agent load, etc.) to select an agent. If the communication session is an existing communication session, the Load Balancer 334 may either select the agent that previously handled the session, or it may use conventional load balancing techniques to identify an agent with availability. The Load Balancer 334 may also factor in agent expertise in selecting an agent.
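The selection policy above (prefer the agent who previously handled the session, otherwise fall back to a conventional least-load choice, optionally filtered by expertise) can be sketched as one function. Parameter names and the skill filter are assumptions for illustration:

```python
def select_agent(session_agent, agent_load, required_skill=None, skills=None):
    """Pick an agent for a session (cf. step 430).

    session_agent: agent that previously handled the session, or None if new.
    agent_load: dict mapping agent id -> number of sessions currently assigned.
    required_skill / skills: optional expertise filter (hypothetical).
    """
    candidates = list(agent_load)
    if required_skill and skills:
        candidates = [a for a in candidates
                      if required_skill in skills.get(a, ())]
    # Prefer the agent who already handled this session, if still available.
    if session_agent in candidates:
        return session_agent
    # Otherwise use a conventional least-load choice.
    return min(candidates, key=lambda a: agent_load[a])

load = {"agent-A": 3, "agent-B": 1}
existing = select_agent("agent-A", load)  # sticky to the prior agent
fresh = select_agent(None, load)          # least-loaded agent for a new session
```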
The Agent Network Module 344 receives the communication session record from the server (435), and the Agent User Interface Module 342 displays the contents of the record to the customer service agent in the form of text on the customer service agent's display screen (440).
The customer service agent types a response (or enters a speech response which is converted to text) (445), and the Agent Network Module 344 transmits the text to the server (450). The Server Network Module 336 receives the text from the Agent Application 340 (455), and the Session Manager 332 updates the communication session record with the text (460). The Server Network Module 336 then sends the customer service agent's response (in the form of text) to the user's communication device (465).
The Client Network Module 328 receives the text from the Server Application 330 (470), and the Translation Module 324 translates the text to speech with the Text-to-Speech Engine 326 (475). The Client User Interface Module 322 displays the text and plays the speech to the user (480). Steps 405-480 are repeated until the user or the customer service agent terminates the communication session.
In an alternate embodiment of the invention, the Server Application 330 determines whether an automated response can be provided to the user prior to sending a user's session record to a customer service agent.
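This pre-screening step can be sketched as a gate in front of the Load Balancer: if an automated answer matches the user's text, the Server replies directly and the session record never reaches an agent. The keyword table and function names below are purely illustrative:

```python
AUTOMATED_ANSWERS = {  # hypothetical canned responses
    "business hours": "We are open 9am-5pm, Monday through Friday.",
}

def handle_user_text(text: str, send_to_agent):
    """Answer automatically when possible; otherwise route to a live agent."""
    for keyword, answer in AUTOMATED_ANSWERS.items():
        if keyword in text.lower():
            return answer          # automated response; no agent time consumed
    return send_to_agent(text)     # fall through to a customer service agent

reply = handle_user_text("What are your business hours?", lambda t: "AGENT")
other = handle_user_text("My device is broken", lambda t: "AGENT")
```

Each automated answer avoids consuming agent time, which serves the cost-reduction goal stated in the background section.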
In the embodiment described with respect to
In an alternate embodiment, in addition to or instead of receiving text of the user's speech input, a customer service agent can receive an audio file (e.g., a .wav file) of the user's speech input. The audio file enables the customer service agent to listen to the user's speech input if desired by the customer service agent. For example, in the method described with respect to
In a further alternate embodiment, a user's speech input is converted to text and then provided to a customer service agent. The customer service agent reads the text input and then records a speech response, which is saved as an audio file. The audio file is then sent to the user's phone and played back to the user. A text transcript of the agent's speech response may optionally be provided to the user. Also, the agent's speech response may optionally be converted to text for the purpose of having a text transcript of the agent's response in the session record.
As will be understood by those familiar with the art, the invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. Accordingly, the above disclosure of the present invention is intended to be illustrative and not limiting of the invention.
| Publication Number | Date | Country |
|---|---|---|
| 20090164214 A1 | Jun 2009 | US |