Customer Relationship Management (‘CRM’) is an approach to managing a company's interaction with current and potential customers. CRM applies data analysis of customers' history with a company to improve business relationships with customers, specifically focusing on customer retention and sales growth. CRM systems compile data from a range of communication channels, including telephone, email, live chat, text messaging, marketing materials, websites, and social media. Through the CRM approach and the systems used to facilitate it, businesses learn more about their target audiences and how best to address their needs.
Enterprise CRM systems can be huge. Such systems can include data warehouse technology, used to aggregate transaction information, to merge the information with information regarding CRM products and services, and to provide key performance indicators. CRM systems aid in managing volatile growth and demand and implement forecasting models that integrate sales history with sales projections. CRM systems track and measure marketing campaigns over multiple networks, tracking customer analysis by customer clicks and sales. Some CRM software is available through cloud systems as software as a service (SaaS), delivered via network and accessed via a browser instead of installed on a local computer. Businesses using cloud-based CRM SaaS typically subscribe to such CRM systems, paying a recurring subscription fee, rather than purchasing the system outright.
Despite their sheer size, many CRM systems today lack the infrastructure to make full use of the information they can access. Customer contacts alone, for example, can be difficult to track. A tele-agent today does not limit customer contacts merely to phone calls from a desk in a call center. Such contacts are often administered through contact centers that can administer multiple modes of contact (phone, text, email, and so on) among multiple agents across multiple locations. When a particular tele-agent representing a contact center on behalf of a marketing client accepts a phone call from a customer representative, it is entirely possible that there have been multiple intervening contacts with that customer representative, including text messages, emails, or even automated messages from artificial intelligence agents, which may be very difficult for the current tele-agent to know or sort.
Referring to
Administering communications across platforms includes administering communication contacts asynchronously across a same platform or across different platforms, and administering across platforms includes administering across platform types. For example, communications platform types may be selected from a platform type group consisting of a telephone, an email, a text message, and a chatbot, as well as others, either extant or yet to be developed, as may occur to routineers in the art. Thus, a first communications contact in a session can be by telephone with a subsequent contact by the same telephone or a different telephone; and a first contact can be by email with subsequent contacts by telephone or text message. Similarly, administering contacts across platforms includes administering contacts across physical locations of platforms: A first contact of a session can be by landline telephone in a call center with a subsequent contact by cell phone in a restaurant; and a first contact can be by email from a desk in a call center with subsequent contacts by cell phone or text message from a supermarket. In all the above examples, the tele-agent is provided at each platform with a transcript or partial transcript of the thread to aid in interaction with the customer.
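A cross-platform thread of the kind described above can be sketched as follows. This is a minimal illustration only; the `PlatformType` enum, the `ThreadEntry` structure, and the function name are hypothetical, chosen to mirror the examples in this paragraph rather than any element of the embodiments.

```python
from dataclasses import dataclass
from enum import Enum, auto

class PlatformType(Enum):
    """Platform types from the example group; others may be added."""
    TELEPHONE = auto()
    EMAIL = auto()
    TEXT_MESSAGE = auto()
    CHATBOT = auto()

@dataclass
class ThreadEntry:
    platform: PlatformType
    location: str    # physical location of the platform used for the contact
    transcript: str  # transcript or partial transcript of this contact

# One session whose contacts span platforms and physical locations.
thread = [
    ThreadEntry(PlatformType.EMAIL, "call center desk", "Initial inquiry ..."),
    ThreadEntry(PlatformType.TELEPHONE, "restaurant", "Follow-up call ..."),
    ThreadEntry(PlatformType.TEXT_MESSAGE, "supermarket", "Scheduling text ..."),
]

def transcript_at(thread, index):
    """The accumulated transcript provided to the tele-agent at contact `index`."""
    return "\n".join(entry.transcript for entry in thread[: index + 1])
```

The point of the sketch is that each later contact, whatever its platform or location, carries the accumulated transcript of the thread forward.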
A tele-agent 128 is a person, an agent of a contact center 305, responsible for selling or supporting commercial products and services. A customer representative 129 is a person who represents a customer, a company or other enterprise that is a current or prospective purchaser of goods or services of contact center 305. CRM contact center 305 is an organization of personnel and computer resources that provide CRM according to embodiments of the present invention. In the example of
A computer system 99 for CRM according to embodiments of the present invention includes client computers 152 and one or more servers, including a triple server 157 and a voice server 151. Computer system 99 may also include cloud services 159. Client computers 152 are automated computing machinery each configured for CRM with CRM-related I/O through a display, a graphical user interface, or a speech-enabled interface that accepts and recognizes speech from a user and optionally expresses to a user voice prompts and speech responses. Such devices are referred to as client devices because they implement the client side of computer architectures that carry out CRM according to embodiments. Client computers 152 in the example of
Automated computing machinery, as that phrase is used in this specification, means a module, segment, or portion of code or other automated computing logic, hardware, software, firmware, or the like, as well as a combination of any of the aforementioned, local or remote. Automated computing machinery is often implemented as executable instructions, physical units, or other computing logic for implementing specified logical functions.
A speech-enabled device is automated computing machinery configured to accept and recognize speech from a user and optionally to express to a user voice prompts and speech responses. Speech-enabled devices in the example of
Computer system 99 includes memory 169, which can include cache, random access memory (“RAM”), disk storage, and most other kinds of computer memory, either extant or yet to be developed. For simplicity,
Computer memory 169 includes a CRM application 195, which is executed by computer system 99. In one or more embodiments, CRM application 195 may be hosted by one or more client computers 152, which is referred to as a thick-client implementation. In other embodiments, CRM application 195 may be hosted by triple server 157, voice server 151, and/or cloud services 159 in a thin-client implementation. In this case, tele-agent 128 may access CRM application 195 using a web browser via hypertext mark-up language (HTML) and the like. In still other embodiments, CRM application 195 may consist of any number of discrete software modules distributed across and executed by each computing device 107, 110, 126, 151, 157, 159 of computer system 99, as known to routineers in the art.
Computer memory 169 of computer system 99 also includes a parsing engine 380, an inference engine 298, and a natural language processing speech recognition (“NLP-SR”) engine 153.
In the example of
In one or more embodiments, CRM application 195 establishes, as structure of computer memory 169 of computer system 99, session 140 that is configured as an object-oriented module of automated computing machinery. That is, a session 140 in such embodiments is established, initially at least, as an instance of an object-oriented session class. Establishing such a session 140 entails storing in computer memory 169 session member data elements including a subject code, a timestamp, identification of the tele-agent, identification of the customer representative, and optionally other information regarding the session. CRM application 195 thereafter administers a sequence of contacts 142 by establishing each contact 142 as an object-oriented module within computer memory 169 of computer system 99. That is, contacts 142 are established as instances of an object-oriented contact class. Administering a sequence of contacts entails recording in computer memory 169 contact member data elements including a timestamp denoting the beginning of the contact, a session identifier for the contact, platform type, contact status, any communications content of the contact, and optionally other information regarding the contact.
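The session and contact classes described above might be sketched as follows. The Python class and field names are illustrative assumptions, chosen only to mirror the member data elements listed in this paragraph; a given embodiment could represent them differently.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Contact:
    """One communications contact; an instance of the contact class."""
    timestamp: float   # beginning of the contact
    session_id: str    # session identifier linking the contact to its session
    platform_type: str # e.g. "email", "telephone", "text", "chatbot"
    status: str        # contact status, e.g. "completed" or "failed"
    content: str = ""  # any communications content of the contact

@dataclass
class Session:
    """One communications session; administers a sequence of contacts."""
    session_id: str
    subject_code: str
    timestamp: float
    tele_agent_id: str
    customer_rep_id: str
    contacts: List[Contact] = field(default_factory=list)

    def administer(self, contact: Contact) -> None:
        """Record a contact carrying this session's identifier."""
        assert contact.session_id == self.session_id
        self.contacts.append(contact)
```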
The structure and content 509 of a communication session 140, including first and subsequent contacts 142, may include transcripts or partial transcripts of words of text, spoken or typed, as well as images, digital text, and the like. Words can include those typed into a text box, words from email or text messages, or words of digitized speech for recognition 315 from a conversation 313. The speech for recognition can be the entire conversation, where, for example, both persons speaking are in the same room, and the entire conversation is picked up by a microphone on a speech-enabled device, or where a telephone conversation is recorded. The scope of speech for recognition can be reduced by providing to a speech-enabled device only one side of the conversation, such as through a microphone on a headset 105. The scope of speech for recognition can be further reduced by providing for recognition only speech that responds to a prompt from a VoiceXML dialogue executing on a speech-enabled client computer. As the scope of speech for recognition is reduced, data processing burdens are reduced across the system as a whole, although it remains an option, in some embodiments at least, to recognize the entire conversation and stream across a display a flow of all words in the conversation.
Speech from the conversation 313 is recognized into digitized words by operation of NLP-SR engine 153, shown here hosted by voice server 151, but also amenable to installation on speech-enabled client computers 152 or other devices. In addition to being digitized by speech recognition functions of voice server 151, words can, for further example, be digitized by operation of widgets or by typing into a text entry box of a graphical user interface on a client computer 152.
In one or more embodiments, CRM application 195 also stores structure and content 509 of session 140 and associated contacts 142 as semantic triples in an enterprise knowledge graph 154 as follows: Structure and content 509 of communication session 140, including first and subsequent contacts 142, is parsed by parsing engine 380 into parsed triples 752; inference engine 298 analyzes parsed triples 752 according to rules 376 to create inferred triples 754. The parsed and inferred triples 752, 754 are stored in enterprise knowledge graph 154. Parsed triples 752, inferred triples 754, and enterprise knowledge graph 154 are described in greater detail hereinafter.
Client computers 152, voice server 151, triple server 157, CRM application 195, parsing engine 380, inference engine 298, and enterprise knowledge graph 154 are all described in greater detail in co-pending U.S. application Ser. Nos. 16/154,718 and 16/911,717, incorporated herein by reference.
For further explanation,
In the method of
Referring first to
More particularly, CRM application 195 establishes, as structure of computer memory 169 of computer system 99, a communications session 140 that links all communications between one or more customer representatives of a given customer and one or more tele-agents of a contact center having in common a particular subject, i.e., a particular thread of communication. That is, session 140 is an object, an instance of an object-oriented session class. Over the course of time, multiple instances, i.e., communications sessions 140a, 140b, . . . 140n, are stored in memory 169, one for each thread of communication with a customer.
Each session instance 140a . . . 140n has an associated first contact 142a . . . 142n, which are also configured as object-oriented modules of automated computing machinery. Depending on the thread of communications, each session 140a . . . 140n may also have one or more additional subsequent contacts 142a . . . 142n. Contacts 142 are object instances of an object-oriented contact class. Contacts 142 may include both actual communications between a tele-agent and a customer representative and failed attempts at such communication. Sessions 140 and contacts 142 contain member data elements, as follows.
Each session 140 includes a unique session ID 393 data element, which functions as a foreign key linking that session 140 to all contacts 142 associated with that session. Each contact 142 also has a session ID 393 which matches the session ID 393 of one of the sessions 140. The session 140 thus functions as a wrapper or container for all contacts 142 linked by the session ID 393, i.e., for contacts related to a customer and having a particular subject matter, including multiple contacts among multiple tele-agents and multiple customer representatives across multiple platforms.
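The wrapper or container behavior of the session ID 393 foreign key can be illustrated with a small sketch; the dictionary keys here are assumed names standing in for the member data elements:

```python
def contacts_for_session(contacts, session_id):
    """Return all contacts whose session ID foreign key matches, oldest first."""
    matched = [c for c in contacts if c["session_id"] == session_id]
    return sorted(matched, key=lambda c: c["timestamp"])

# Contacts from different sessions, agents, and platforms, mixed together.
contacts = [
    {"session_id": "S1", "timestamp": 30, "platform_type": "telephone"},
    {"session_id": "S2", "timestamp": 10, "platform_type": "email"},
    {"session_id": "S1", "timestamp": 20, "platform_type": "email"},
]
```

Selecting by session ID reassembles the full thread for one subject matter regardless of which tele-agent, customer representative, or platform produced each contact.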
Other session 140 member data elements ideally include a subject code 385 that identifies the subject of the contacts between agents of a contact center and a customer representative that form the session, a timestamp 386 that delineates when that session 140 is first created, and a customer ID code 387 that identifies the customer represented by a customer representative in contacts of the session. The session member data elements may also include a content element 389. A typical use for the content element 389 is to elaborate on the subject matter 385 of the session. Session 140 can also include a time limit 221, represented here as a time-to-live or “TTL,” after which the session is either terminated or a user is notified of a timeout and prompted whether to terminate the session. A session can also include a status code 219 to indicate whether the session is active or terminated. Terminated sessions retained in storage optionally can be configured to be reactivated.
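The time-to-live and status-code behavior just described admits a simple sketch. The field names and the confirmation callback (standing in for a dashboard prompt to the user) are assumptions for illustration:

```python
def session_timed_out(session, now):
    """True when the session's time-to-live has elapsed since creation."""
    return (now - session["timestamp"]) > session["ttl"]

def enforce_ttl(session, now, confirm_terminate):
    """On timeout, notify the user and terminate only on confirmation."""
    if session["status"] == "active" and session_timed_out(session, now):
        if confirm_terminate():  # e.g. a dashboard prompt to the tele-agent
            session["status"] = "terminated"
    return session["status"]
```

A terminated session retained in storage could be reactivated simply by resetting its status code to active.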
Contact 142 member data elements ideally also include a platform type 201 (such as email, telephone, text messaging, and chatbots), a timestamp 203 indicating the beginning time of the contact, a tele-agent ID 205, a customer representative ID 207, a customer ID 387, a subject code 385, a status flag 211, and content 213 of the contact.
As shown in
In one or more embodiments, dashboard 111 includes customer display and control mechanism 112 by which a customer can be identified and selected. Customer display and control mechanism 112 may be used for placing outgoing communications from a tele-agent to a customer representative in conjunction with selection of one of the platform icons 131, 135, 137, 139. Although not illustrated for brevity, customer display and control mechanism 112 may allow drilling down to display and allow selection from various customer representatives associated with the selected customer. Associated contact information such as email address, phone numbers, IP address, etc. may be stored by CRM application 195 to facilitate initiating a contact. In one or more embodiments, communications contacts can be implemented through speech alone without GUI operations, including specifying platform types. A tele-agent can issue oral instructions to CRM application 195: “Computer, email Bob and ask to schedule a call.” “Computer, text Bob and ask him to reschedule our call.” “Computer, ring Bob for me and if he doesn't answer, leave a message asking for a call back.” Each of these examples expressly identifies a platform type 201, respectively, email, text message, and telephone.
Customer display and control mechanism 112 may also be used to identify the customer when an incoming contact is received from a customer representative. However, customer identification may also be facilitated in part or in whole by automated recognition of email address, phone caller ID, voice identification, or the like, as known in the art.
Once the customer is selected, CRM application 195 populates a subject display and control mechanism 113, by which the tele-agent may view and select the session 140 to which the incoming or outgoing contact is associated. CRM application 195 initially populates subject display and control mechanism 113 by searching memory 169 for sessions 140 having the customer ID 387 that matches the customer ID associated with the customer selected at the customer display and control mechanism 112. Of these matching sessions 140, the subject codes 385 are mapped to options within subject display and control mechanism 113 that can be selected by the tele-agent. Optionally, closed or terminated sessions may selectively be shown within subject display and control mechanism 113, thereby allowing the tele-agent to reopen a closed matter. Although not illustrated, an option to add a new subject is also provided. CRM application 195 may optionally propose a suggested option within subject display and control mechanism 113 based on, for example, the context of an incoming email, the identity of the customer representative, or the most recent subject.
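The population of subject display and control mechanism 113 can be sketched as a filter over stored sessions. The dictionary keys and the `"<new subject>"` placeholder are illustrative assumptions:

```python
def subject_options(sessions, customer_id, include_closed=False):
    """Map matching sessions' subject codes to selectable dashboard options."""
    options = []
    for s in sessions:
        if s["customer_id"] != customer_id:
            continue  # only sessions for the selected customer
        if s["status"] == "terminated" and not include_closed:
            continue  # optionally hide closed matters
        options.append({"session_id": s["session_id"],
                        "subject": s["subject_code"]})
    # An option to add a new subject is always provided.
    options.append({"session_id": None, "subject": "<new subject>"})
    return options
```

Setting `include_closed=True` corresponds to showing terminated sessions so that the tele-agent can reopen a closed matter.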
At the commencement of a contact with a customer, at step 377, CRM application 195 establishes a contact preamble 141. The contact preamble is a subset of an instance of a contact 142 and includes the customer ID 387 and subject code 385. In the example of
By querying session data and populating session pane 114 as soon as the session can be identified, steps 377 and 381 allow the tele-agent to efficiently review relevant history of contacts with the customer regarding a particular subject, regardless of the tele-agent, customer representative, or platform associated with the earlier contacts, prior to commencing a new outgoing communication with the customer. This may facilitate drafting a reply email or text message, for example. For incoming communications, rapid display and organization of the previous contacts within session pane 114 allows quicker and more effective handling of the contact.
At the administering step 382, CRM application 195 first generates a new instance of a contact 142 and records therein the session ID 393, the customer ID 387 and the subject code 385. The tele-agent identification code 205 may be determined from the tele-agent credentials used to log in to dashboard 111, and CRM application 195 records the tele-agent ID 205 in the contact 142. The customer representative ID may be determined from the source of an incoming communication, such as by caller ID or email address, or it may be selected by the tele-agent via dashboard 111, such as by drilling down on customer display and control mechanism 112 to select the appropriate customer representative; it is recorded by CRM application 195 in the contact 142.
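The administering step just described, i.e., generating a new contact instance and recording the identifiers from the preamble and the runtime context, might look like the following sketch, with assumed field names:

```python
import time

def new_contact(preamble, tele_agent_id, customer_rep_id, platform_type):
    """Create a contact record from the preamble plus runtime identifiers."""
    return {
        "session_id": preamble["session_id"],
        "customer_id": preamble["customer_id"],
        "subject_code": preamble["subject_code"],
        "tele_agent_id": tele_agent_id,      # from dashboard login credentials
        "customer_rep_id": customer_rep_id,  # from caller ID, email address,
                                             # or tele-agent selection
        "platform_type": platform_type,
        "timestamp": time.time(),
        "status": "open",
        "content": "",
    }
```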
In the method of
Similarly, timestamp 203 is the reception time of incoming communications, and the time of submission (e.g., when an email is sent or a phone call is placed) for outgoing communications. CRM application 195 records platform type 201 and timestamp 203 in the contact 142.
The content 213 of communications in the contacts 142 may include the content of texts, chats, emails, with or without attachments, and voice communications. Contents 213 are captured by CRM application 195 and stored in association with the contacts 142 themselves, rather than in a session 140 object. To the extent that the content 213 of a communications contact is speech, the speech may be recorded raw and/or recognized into a text transcript, the text then being stored as element 213 of contact 142. More particularly, to generate a transcript, CRM application 195, using a speech engine, recognizes words from a conversation between a tele-agent and a customer representative into digitized speech. Speech recognition may be carried out as follows: Words of speech from the conversation travel through a microphone and amplifier of computer 152 and, in a thin-client architecture, through a VOIP connection to voice server 151 where a speech recognition engine recognizes the words into a stream of digitized speech, which is handed off to natural language processing (“NLP”) engine 153 (
Additionally, in one or more embodiments, CRM application 195 may automate speaker recognition by implementing a speech recognition function running in a loop with current voice print extraction and comparison with previously stored voice prints of known speakers. For a new speaker with no voice print on record, a computer according to embodiments can take speaker identification from context, e.g., a tele-agent telephoned a particular customer representative selected from an address list. Alternatively for a new speaker with no voice print on record, CRM application 195 according to embodiments can prompt for identification. Other ways to associate a new speaker with a voice print may occur to routineers, and all such ways are well within the scope of the present invention.
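The speaker recognition loop described above can be sketched as follows. Voice print extraction itself is outside the sketch; the prints are represented here as plain feature vectors compared by cosine similarity, which is one common choice for illustration, not necessarily the comparison used in a given embodiment. An unknown speaker returns `None`, corresponding to the fall-back to context or to prompting for identification:

```python
def identify_speaker(voiceprint, known_prints, threshold=0.9):
    """Compare a current voice print against stored prints of known speakers.

    Returns the best-matching speaker ID, or None for a new speaker with
    no voice print on record.
    """
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = sum(x * x for x in a) ** 0.5
        nb = sum(y * y for y in b) ** 0.5
        return dot / (na * nb) if na and nb else 0.0

    best_id, best_score = None, threshold
    for speaker_id, stored in known_prints.items():
        score = cosine(voiceprint, stored)
        if score >= best_score:
            best_id, best_score = speaker_id, score
    return best_id
```

In the loop described in the text, this comparison would run repeatedly alongside speech recognition, with each utterance's extracted print checked against the store.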
In one or more embodiments, dashboard 111 includes a call notes text entry box or widget 366 into which the tele-agent can type notes before, during, or following a communication contact with a customer representative. The administering step 382 also records such call notes in the content field 213 of the corresponding contact 142.
The status code 211 of contact 142 can be used to indicate success or failure of a contact and is recorded by CRM application 195. The contact structure in this example includes identification codes for the tele-agent who initiated the contact and the customer representative who was sought to be contacted. CRM application 195 with voice enablement can indicate in response to a status query from a tele-agent, “You tried him last Wednesday and left a message, but we have had no further contact.”
In the method of
Although not illustrated, the method of
In many embodiments, the structure and content 509 of the session 140, typically including all session 140 data elements and associated contact 142 data elements, are stored as semantic triples in an enterprise knowledge graph 154. Enterprise knowledge graph 154 is composed of triples of a description logic that includes all CRM-related knowledge that is available to a tele-agent through CRM system 99. For example, enterprise knowledge graph 154 may include type-coded subgraphs that are implemented as logically-connected segments of the overall knowledge graph. All the nodes and edges in enterprise knowledge graph 154 are elements of semantic triples. Enterprise knowledge graph 154 may include subgraphs type-coded as customer information and customer representative information. These subgraphs are merely examples, not limitations of enterprise knowledge graphs. Enterprise knowledge graphs also will typically include financials, vendor information, business entities and structures, project information, corporate guidelines and manuals, employee data, incorporation data, transactions, contracts, sales histories, research details, and so on. In some embodiments, the storing process disposes at least some subgraphs of structure and content of session and communications contacts within the overall enterprise knowledge graph within segments of contiguous computer memory.
Enterprise knowledge graph 154 is a semantic graph database—a database that uses graph structures for semantic queries with nodes, edges and properties to represent and store data. A key concept of this database system is the graph (or edge or relationship), which directly relates data items in a data store. The relationships allow data in the store to be linked together directly, and in many cases retrieved with one operation. Such a graph database contrasts with conventional relational databases, where links among data are mere indirect metadata, and queries search for data within the store using join operations to collect related data. Graph databases, by design, make explicit relations among data and allow simple and fast retrieval of complex hierarchical structures that are difficult to model in relational systems. Enterprise knowledge graph 154 is described in detail in co-pending U.S. application Ser. Nos. 16/154,718 and 16/911,717, which are incorporated herein by reference.
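The contrast with relational joins can be made concrete with a toy triple store. The pattern-matching function and the example triples below are purely illustrative; production semantic graph databases index triples rather than scanning a list:

```python
def match(triples, s=None, p=None, o=None):
    """Query a triple store by pattern; None matches anything.

    Related data links directly through shared terms, so a traversal is a
    chain of pattern matches rather than relational join operations.
    """
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

graph = [
    ("customer:C1", "hasRepresentative", "person:Bob"),
    ("person:Bob", "contactedVia", "platform:email"),
    ("customer:C1", "hasSubject", "subject:renewal"),
]
```

Following an edge is one operation: the object of one match becomes the subject of the next.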
Accordingly, as indicated at step 379, CRM application 195 first extracts structure and content 509 from session 140 and associated contact 142 data elements, typically as the object instances are created, i.e., written to memory 169. The structure and content 509 include words of communications contacts from the CRM application 195. Contacts 142 are said to present in a sequence because they present serially in time, one after another. The structure and content 509 of the communications session 140, including the structure and content 213 of the communications contacts 142, is then parsed by parsing engine 380 of computer system 99 into parsed triples 752, i.e., as elements having a subject, a predicate, and an object, of a description logic. That is, the parsing process can function by forming into semantic triples 752 words designated by natural language processing engine 153 as parts of speech—subject, predicate, object. The parsing process can also function by forming into semantic triples 752 words designated through voice commands in a VoiceXML dialog or words designated through GUI widgets as elements of triples—subject, predicate, object. In at least some embodiments, the description logic is a member of a family of formal knowledge representation languages in which a query of the logic is decidable. The parsing process is described in detail in co-pending U.S. application Ser. Nos. 16/154,718 and 16/911,717, which are incorporated herein by reference.
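The part-of-speech route into triples can be sketched as follows. The tagged-word input format is an assumption standing in for the designations produced by the natural language processing engine, a VoiceXML dialog, or GUI widgets:

```python
def parse_triples(tagged_words):
    """Form semantic triples from words designated as parts of speech.

    `tagged_words` is a list of (word, role) pairs, where role is one of
    "subject", "predicate", "object"; a triple is emitted each time all
    three roles have been collected.
    """
    triples, current = [], {}
    for word, role in tagged_words:
        current[role] = word
        if len(current) == 3:  # subject, predicate, and object all present
            triples.append((current["subject"],
                            current["predicate"],
                            current["object"]))
            current = {}
    return triples
```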
The parsed triples 752 may next be forwarded to inference engine 298, which generates inferred triples 754 from the parsed triples 752 according to inference rules 376. In many embodiments, the parsing process hands off the parsed triples 752 to the inference engine 298 by disposing the parsed triples in segments of contiguous memory and providing to the inference engine 298 the memory addresses for the segments. Parsed and inferred triples 752, 754 are ultimately stored in enterprise knowledge graph 154.
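A minimal forward-chaining sketch of the inference step follows. The rule shown (a "requested" contact implies an interested prospect) is invented for illustration and is not one of the rules 376 of any embodiment:

```python
def infer(parsed, rules):
    """Apply inference rules to parsed triples until no new triples appear."""
    known = set(parsed)
    changed = True
    while changed:
        changed = False
        for s, p, o in list(known):
            for rule in rules:
                inferred = rule(s, p, o)
                if inferred and inferred not in known:
                    known.add(inferred)
                    changed = True
    return known - set(parsed)  # only the newly inferred triples

def interest_rule(s, p, o):
    """Hypothetical rule: a requesting subject is an interested prospect."""
    if p == "requested":
        return (s, "is_a", "interested_prospect")
    return None
```

Both the parsed and the inferred triples would then be stored together in the knowledge graph.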
As noted above, the method of
In the method of
In the method of
In the method of
The Abstract of the disclosure is provided solely as a way by which to determine quickly from a cursory reading the nature and gist of the technical disclosure, and it represents solely one or more embodiments.
It will be understood from the foregoing description that modifications and changes may be made in various embodiments of the present invention without departing from its true spirit. The descriptions in this specification are for purposes of illustration only and are not to be construed in a limiting sense. The scope of the present invention is limited only by the language of the following claims.
The present application is a continuation-in-part of U.S. application Ser. No. 16/154,718, filed on Oct. 9, 2018, the disclosure of which is incorporated herein in its entirety by reference. The present application is also a continuation-in-part of U.S. application Ser. No. 16/911,717, filed on Jun. 25, 2020, which is a continuation of U.S. application Ser. No. 16/183,736, filed on Nov. 8, 2018, the disclosures of which are incorporated herein in their entirety by reference.
Number | Name | Date | Kind |
---|---|---|---|
6785380 | Ribera | Aug 2004 | B2 |
6829603 | Chai et al. | Dec 2004 | B1 |
7275083 | Seibel et al. | Sep 2007 | B1 |
7486785 | Flores | Feb 2009 | B2 |
8108237 | Bourne et al. | Jan 2012 | B2 |
8332279 | Woolston | Dec 2012 | B2 |
3411843 | Cyriac | Apr 2013 | A1 |
9049295 | Cooper | Jun 2015 | B1 |
9165556 | Sugar | Oct 2015 | B1 |
9848082 | Lilland | Dec 2017 | B1 |
9860391 | Wu et al. | Jan 2018 | B1 |
9936066 | Mammen | Apr 2018 | B1 |
9942779 | Proctor | Apr 2018 | B1 |
9948783 | Farrell | Apr 2018 | B1 |
10026092 | Heater et al. | Jul 2018 | B2 |
10057423 | Sheikh | Aug 2018 | B1 |
10101976 | Cavalcante | Oct 2018 | B2 |
10303466 | Karman | May 2019 | B1 |
10482384 | Stoilos | Nov 2019 | B1 |
20030126136 | Omoigui | Jul 2003 | A1 |
20040143473 | Tivey et al. | Jul 2004 | A1 |
20040210881 | Friedman | Oct 2004 | A1 |
20050005266 | Datig | Jan 2005 | A1 |
20050044357 | Fano | Feb 2005 | A1 |
20050105712 | Williams et al. | May 2005 | A1 |
20060095273 | Montvay et al. | May 2006 | A1 |
20060098625 | King | May 2006 | A1 |
20060239439 | Blackwood | Oct 2006 | A1 |
20070019618 | Shaffer | Jan 2007 | A1 |
20070064913 | Shaffer | Mar 2007 | A1 |
20070094183 | Paek et al. | Apr 2007 | A1 |
20070233561 | Golec | Oct 2007 | A1 |
20080162498 | Omoigui | Jul 2008 | A1 |
20080275744 | Macintyre et al. | Nov 2008 | A1 |
20090070322 | Salvetti | Mar 2009 | A1 |
20090132474 | Ma | May 2009 | A1 |
20090245500 | Wampler | Oct 2009 | A1 |
20090271192 | Marquette | Oct 2009 | A1 |
20100010802 | Ruano | Jan 2010 | A1 |
20100036788 | Wu | Feb 2010 | A1 |
20100063799 | Jamieson | Mar 2010 | A1 |
20100114563 | Choi | May 2010 | A1 |
20110077999 | Becker et al. | Mar 2011 | A1 |
20110082829 | Kolovski | Apr 2011 | A1 |
20110113094 | Chunilal | May 2011 | A1 |
20110206198 | Freedman | Aug 2011 | A1 |
20110264451 | Hoepfinger | Oct 2011 | A1 |
20120059776 | Estes | Mar 2012 | A1 |
20120078636 | Ferrucci | Mar 2012 | A1 |
20120233558 | Naim | Sep 2012 | A1 |
20120275642 | Aller | Nov 2012 | A1 |
20120303355 | Liu et al. | Nov 2012 | A1 |
20130091090 | Spivack et al. | Apr 2013 | A1 |
20130006916 | Mcbride | Jun 2013 | A1 |
20130163731 | Yan | Jun 2013 | A1 |
20130204663 | Kahlow | Aug 2013 | A1 |
20140022328 | Gechter et al. | Jan 2014 | A1 |
20140081585 | Cappucino et al. | Mar 2014 | A1 |
20140081934 | Mizell | Mar 2014 | A1 |
20140122535 | Gerard | May 2014 | A1 |
20140164502 | Khodorenko | Jun 2014 | A1 |
20140189680 | Kripalani | Jul 2014 | A1 |
20140201234 | Lee et al. | Jul 2014 | A1 |
20140270108 | Riahi et al. | Sep 2014 | A1 |
20140278343 | Tran | Sep 2014 | A1 |
20140314225 | Riahi | Oct 2014 | A1 |
20140372630 | Bostick | Dec 2014 | A1 |
20140379755 | Kuriakose | Dec 2014 | A1 |
20150012350 | Li et al. | Jan 2015 | A1 |
20150066479 | Pasupalak | Mar 2015 | A1 |
20150189085 | Riahi et al. | Jul 2015 | A1 |
20150201077 | Konig et al. | Jul 2015 | A1 |
20150242410 | Pattabhiraman et al. | Aug 2015 | A1 |
20150254234 | Dixit et al. | Sep 2015 | A1 |
20150261743 | Sengupta | Sep 2015 | A1 |
20150294405 | Hanson | Oct 2015 | A1 |
20150309994 | Liu | Oct 2015 | A1 |
20150348551 | Gruber | Dec 2015 | A1 |
20150379603 | Gupta | Dec 2015 | A1 |
20160019882 | Matula | Jan 2016 | A1 |
20160021181 | Ianakiev et al. | Jan 2016 | A1 |
20160034457 | Bradley | Feb 2016 | A1 |
20160036981 | Hollenberg | Feb 2016 | A1 |
20160036982 | Ristock | Feb 2016 | A1 |
20160036983 | Korolev | Feb 2016 | A1 |
20160117593 | London | Apr 2016 | A1 |
20160162913 | Linden et al. | Jun 2016 | A1 |
20160171099 | Lorge et al. | Jun 2016 | A1 |
20160188686 | Hopkins | Jun 2016 | A1 |
20160189028 | Hu et al. | Jun 2016 | A1 |
20160217479 | Kashyap et al. | Jul 2016 | A1 |
20160239851 | Tanner | Aug 2016 | A1 |
20160162474 | Agarwal | Sep 2016 | A1 |
20160321748 | Mahatm | Nov 2016 | A1 |
20160335544 | Bretschneider et al. | Nov 2016 | A1 |
20170017694 | Roytman et al. | Jan 2017 | A1 |
20170024375 | Hakkani-Tur | Jan 2017 | A1 |
20170091390 | Joul | Mar 2017 | A1 |
20170124193 | Li | May 2017 | A1 |
20170147635 | Mcateer et al. | May 2017 | A1 |
20170154108 | Li et al. | Jun 2017 | A1 |
20170177715 | Chang | Jun 2017 | A1 |
20170200220 | Nicholson | Jul 2017 | A1 |
20170195488 | Pendyaia | Aug 2017 | A1 |
20170262429 | Harper | Sep 2017 | A1 |
20170262530 | Okura | Sep 2017 | A1 |
20170293610 | Tran | Oct 2017 | A1 |
20180082183 | Hertz et al. | Mar 2018 | A1 |
20180115644 | Al-Khaja | Apr 2018 | A1 |
20180144250 | Kwon | May 2018 | A1 |
20180150459 | Farid | May 2018 | A1 |
20180288098 | Wang | Oct 2018 | A1 |
20180300310 | Shinn | Oct 2018 | A1 |
20180315000 | Kulkarni | Nov 2018 | A1 |
20180315001 | Garner | Nov 2018 | A1 |
20180338040 | Carly | Nov 2018 | A1 |
20180365772 | Thompson | Dec 2018 | A1 |
20180376002 | Abraham | Dec 2018 | A1 |
20190042988 | Brown | Feb 2019 | A1 |
20190080370 | Copeland | Mar 2019 | A1 |
20190188617 | Copeland | Jun 2019 | A1 |
20190206400 | Cui | Jul 2019 | A1 |
20190220794 | Kulkarni | Jul 2019 | A1 |
20190340294 | Spangler | Nov 2019 | A1 |
20200042642 | Bakis | Feb 2020 | A1 |
20200097814 | Devesa | Mar 2020 | A1 |
20200110835 | Zhao | Apr 2020 | A1 |
Number | Date | Country |
---|---|---|
1020180058877 | Jul 2018 | KR |
20160139666 | Sep 2016 | WO |
Entry |
---|
Jan. 10, 2020 Office Action for corresponding U.S. Appl. No. 15/700,210. |
Jan. 25, 2020 Office Action for corresponding U.S. Appl. No. 15/844,512. |
Jul. 25, 2019 Office Action for corresponding U.S. Appl. No. 16/198,742. |
Final Office Action dated Jul. 27, 2020 for corresponding U.S. Appl. No. 15/844,512. |
Final Office Action dated Jul. 30, 2020 for corresponding U.S. Appl. No. 16/154,718. |
Final Office Action dated Jul. 7, 2020 for corresponding U.S. Appl. No. 15/700,210. |
Mar. 5, 2020 Office Action for corresponding U.S. Appl. No. 16/183,736. |
May 4, 2020 Office Action for corresponding U.S. Appl. No. 16/154,718. |
Non-Final Office Action dated Sep. 29, 2020 for corresponding U.S. Appl. No. 16/157,075. |
Non-Final Office Action dated Sep. 30, 2020 for corresponding U.S. Appl. No. 16/911,717. |
Oct. 19, 2020 Notice of Allowance for corresponding U.S. Appl. No. 16/157,075. |
Liew. “Strategic integration of knowledge management and customer relationship 1-20 management.” In: Journal of Knowledge Management. Jul. 18, 2008 (Jul. 18, 2008) Retrieved on Dec. 25, 2019 (Dec. 25, 2019) from <http://student.bms.lk/GDM/49/Slides/MarManaSampleAssi/MMAsuportingJouArti/13673270810884309.pdf> entire document. |
Tung. “Google's human-sounding AI to answer calls at contact centers.” In: ZDNet. Jul. 25, 2018 (Jul. 25, 2018) Retrieved on Dec. 25, 2019 (Dec. 25, 2019) from <https://www.zdnet.com/article/googles-human-sounding-ai-to-answer-calls-at-contact-centers/> entire document. |
International Search Report and Written Opinion dated Jan. 9, 2020 for PCT/US2019/055488. |
International Search Report and Written Opinion dated Jan. 14, 2020 for PCT/US2019/060174. |
International Search Report and Written Opinion dated Jan. 17, 2020 for PCT/US2019/058997. |
International Search Report and Written Opinion dated Feb. 22, 2019 for PCT/US2018/065584. |
International Search Report and Written Opinion dated Mar. 9, 2020 for PCT/US2019/059949. |
International Search Report and Written Opinion dated Apr. 1, 2020 for PCT/US2019/055483. |
International Search Report and Written Opinion dated Nov. 28, 2018 for PCT/US2018/049813. |
International Search Report and Written Opinion dated Dec. 30, 2019 for PCT/US2019/062629. |
Number | Date | Country | |
---|---|---|---|
20200382642 A1 | Dec 2020 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16183736 | Nov 2018 | US |
Child | 16911717 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16911717 | Jun 2020 | US |
Child | 16947802 | US |