A customer self-help system is a system that supports or accompanies one or more other data management systems by helping users of the one or more other data management systems find answers to their questions, without involving live customer support personnel. If a customer self-help system adequately helps a user find a satisfactory answer to the user's question, the user is less likely to seek additional support from live customer support (e.g., telephone support, live chat, text message, etc.). A business benefit of a well-functioning customer self-help system is reduced overhead costs for a company because providing live customer support can be expensive (e.g., sometimes costing the company as much as $25 per use of the live customer support). A user benefit of a well-functioning customer self-help system is that users can find answers to their questions more quickly than they could through live customer support, because use of live customer support usually involves waiting in a queue for a turn to communicate with a customer support representative.
The quality of a customer self-help system is determined, at least in part, by how well the customer self-help system assists users in finding the customer support content for which the users are searching. To assist users in finding customer support content, traditional customer self-help systems typically apply a one-size-fits-all approach to the content search user experience that is provided to the users of the customer self-help system.
However, this traditional one-size-fits-all approach is problematic because it does not satisfy the searching needs or capabilities of users who have different levels of ability to formulate search queries. In other words, some users are very comfortable using advanced search features to formulate and submit a search query in a customer self-help system, while other users experience stress or confusion when presented with advanced search features to formulate a search query. If all users, including both advanced users and less-experienced users, are provided with the same content search user experience, neither category nor type of user is likely to be satisfied with their search experience within the customer self-help system. For example, if an advanced user is provided with fewer content search user experience options and only a simplified content search user experience, it may take the advanced user more searches to find the results that the advanced user is searching for. Having to perform multiple searches, and thereby unnecessarily wasting time when a single search might do, can lead to a frustrated advanced user. If a less-experienced (e.g., normal) user is provided with an advanced content search user experience, the less-experienced user might be intimidated by the advanced content search user experience options and default towards seeking live customer support. If the less-experienced user attempts to use the advanced features and repeatedly fails to find the customer support content that the user is searching for, the less-experienced user may become dissatisfied with the search experience and incorrectly determine that the customer self-help system is incapable of satisfying the user's needs. Whenever a user of any type is dissatisfied, determines that the customer self-help system is inadequate, or seeks live customer support, the service provider of the customer self-help system is unlikely to gain or maintain that user's trust, confidence, and future business.
Consequently, a technical problem that exists for customer self-help systems is how to provide a content search user experience that delivers satisfying search experiences for the entire spectrum of users when servicing a customer base with varying levels of information searching skills.
The present disclosure includes embodiments for determining levels of search sophistication for users of a customer self-help system in order to personalize a content search user experience for the users, thereby adapting the user experience to the users and increasing the likelihood of user satisfaction with the search experience, according to one embodiment. The customer self-help system determines levels of search sophistication for users by analyzing search query data representing search queries and by analyzing clickstream data representing users' interactions with the customer self-help system or with a financial management system that is supported by the customer self-help system, according to one embodiment. The customer self-help system analyzes the search query data and the clickstream data by applying the search query data and the clickstream data to one or more analytics models, which include one or more of a predictive model and a probabilistic topic model, according to one embodiment. The customer self-help system uses the results of the analyses to determine levels of search sophistication for the users, and the customer self-help system provides a simplified content search user experience to less-experienced users and an advanced content search user experience to advanced users, according to one embodiment. Thus, the present disclosure resolves the above technical problem with a technical solution that includes determining the level of search sophistication of users and providing personalized content search user experiences to the users, based on the users' determined levels of search sophistication. As a result, implementation of the disclosed embodiments reduces the likelihood of repeated searching, the likelihood of contacting live customer support, and the likelihood of customer dissatisfaction with the content search user experience received from the customer self-help system, according to one embodiment.
A user's search sophistication is an ability of a user to formulate a search query, according to one embodiment. Some users have a relatively high level of ability to formulate a search query by using, for example, multiple search text boxes that are combined to form a complex search query, according to one embodiment. Some users have a relatively low level of ability to formulate a search query and are more likely to complete the formulation and submission of a search query when provided with a simplified search experience (e.g., a single search text box and tips for using the search text box), according to one embodiment. The level of search sophistication for a user is a level of ability of a user to formulate a search query that effectively results in the customer support content sought by the user, according to one embodiment. Examples of levels of search sophistication include, but are not limited to, basic, intermediate, and advanced, according to one embodiment.
Determining levels of search sophistication for users of a customer self-help system to personalize a content search user experience for the users and to increase a likelihood of user satisfaction with the search experience is a technical solution to a long-standing technical problem of dissatisfying and inefficient content search user experiences in customer self-help systems. Therefore, the disclosed embodiments do not represent an abstract idea for at least a few reasons. First, determining levels of search sophistication for users to personalize a content search user experience for users is not an abstract idea because it is not merely an idea itself (e.g., it cannot be performed mentally or using pen and paper). Indeed, some of the disclosed embodiments of determining levels of search sophistication include tracking clickstream data and using analytics models to determine search sophistication score data, which cannot be performed mentally. Second, determining levels of search sophistication for users of a customer self-help system to personalize a content search user experience for the users is not an abstract idea because it is not a fundamental economic practice (e.g., is not merely creating a contractual relationship, hedging, mitigating a settlement risk, etc.). Third, determining levels of search sophistication for users of a customer self-help system to personalize a content search user experience for the users is not an abstract idea because it is not a method of organizing human activity (e.g., managing a game of bingo). Rather, the disclosed embodiments analyze human behavior to determine characteristics of users that can be used to modify computing processes (e.g., the selection of one content search user experience over another). Fourth, although mathematics may be used to generate an analytics model, determining levels of search sophistication for users of a customer self-help system to personalize a content search user experience for the users is not simply a mathematical relationship/formula but is instead a technique for transforming search query data into personalized search experience data that is used to personalize a content search user experience for users and to thereby increase the likelihood that users more quickly and efficiently find answers to their questions, without the use of live customer support, according to one embodiment.
Further, determining levels of search sophistication for users of a customer self-help system to personalize a content search user experience for the users is not an abstract idea because the disclosed techniques allow for significant improvement to the technical fields of user experience, self-help systems, customer service, and financial management systems, according to one embodiment. The present disclosure adds significantly to the field of content searching because the disclosed customer self-help system reduces the likelihood of redundant searches, reduces the likelihood of users seeking live customer support, and increases the likelihood of improving users' search experiences by providing one of a number of content search user experiences that is suited to the searching skills of the users, according to one embodiment.
As a result, embodiments of the present disclosure allow for reduced use of processor cycles, processor power, communications bandwidth, memory, and power consumption, by reducing inefficient searching as measured by the number of search queries submitted by users when searching for customer support content, according to one embodiment. Consequently, computing and communication systems implementing or providing the embodiments of the present disclosure are transformed into more operationally efficient devices and systems.
In addition to improving overall computing performance, personalizing a content search user experience for the users of a customer self-help system significantly improves the field of financial management systems, by increasing the likelihood that users will promptly resolve their own concerns with one search or better search results, so that the users continue use of the financial management system that is supported by the customer self-help system, according to one embodiment. Furthermore, by personalizing a content search user experience for the users, the disclosed embodiments help maintain or build trust and therefore loyalty in the customer self-help system and in the financial management system with which it is associated, which results in repeat customers, efficient delivery of financial services, and reduced abandonment of use of the financial management system, according to one embodiment.
Common reference numerals are used throughout the FIGs. and the detailed description to indicate like elements. One skilled in the art will readily recognize that the above FIGs. are examples and that other architectures, modes of operation, orders of operation, and elements/functions can be provided and implemented without departing from the characteristics and features of the invention, as set forth in the claims.
Embodiments will now be discussed with reference to the accompanying FIGs., which depict one or more exemplary embodiments. Embodiments may be implemented in many different forms and should not be construed as limited to the embodiments set forth herein, shown in the FIGs., or described below. Rather, these exemplary embodiments are provided to allow a complete disclosure that conveys the principles of the invention, as set forth in the claims, to those of skill in the art.
The INTRODUCTORY SYSTEM, USER EXPERIENCE, ARCHITECTURE, and PROCESS sections herein describe systems and processes suitable for determining levels of search sophistication for users of a customer self-help system to personalize a content search user experience for the users, according to various embodiments.
A customer self-help system improves the likelihood that users of the customer self-help system will have a satisfying search experience by providing personalized content search user experiences to the users of the customer self-help system, according to one embodiment. The customer self-help system receives search queries from users and determines the users' likely levels of ability to formulate search queries, according to one embodiment. If the users have basic or less-experienced levels of ability, then the customer self-help system provides a simplified content search user experience, which includes a single search text box with one or more tips or instructions on how to formulate a search query, according to one embodiment. If the users have advanced levels of ability, the customer self-help system provides an advanced content search user experience, which includes one or more search text boxes that provide users with the ability to define particular characteristics of their search queries, according to one embodiment. Consequently, less-experienced users are provided a user-friendly content search user experience, and advanced users are provided an advanced content search user experience, so that both categories of users are more likely to have a satisfying search experience, according to one embodiment. Providing users with satisfying search experiences increases the likelihood that the users will more efficiently find answers to their questions or concerns, according to one embodiment. For users who access the features of the customer self-help system during use of a financial management system that is supported by the customer self-help system, providing users with a satisfying search experience increases the likelihood that the users will continue using the financial management system to complete one or more financial management tasks, according to one embodiment.
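As a minimal illustrative sketch only, and not the disclosed implementation, the dispatch between the two content search user experiences could be expressed in Python as shown below; the function name, the dictionary fields, the example tip text, and the assumption that an intermediate level of ability is served by the simplified experience are all hypothetical.

def select_content_search_user_experience(search_sophistication_level: str) -> dict:
    """Return a description of the content search user experience to provide."""
    if search_sophistication_level in ("basic", "intermediate"):
        # Simplified experience: one search text box plus formulation tips.
        return {
            "type": "simplified",
            "search_text_boxes": 1,
            "search_tips": ["Try a few keywords, for example: refund status"],
        }
    # Advanced experience: multiple boxes for refining the search query.
    return {
        "type": "advanced",
        "search_text_boxes": 3,
        "search_box_descriptors": ["All of these words", "This exact phrase", "None of these words"],
    }

# Example usage: select_content_search_user_experience("advanced")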
Introductory System
The present disclosure includes embodiments for determining levels of search sophistication for users of a customer self-help system to personalize a content search user experience for the users and to increase a likelihood of user satisfaction with the search experience, according to one embodiment. The customer self-help system determines levels of search sophistication for users by analyzing search query data representing search queries and by analyzing clickstream data representing users' interactions with the customer self-help system or with a financial management system that is supported by the customer self-help system, according to one embodiment. The customer self-help system analyzes the search query data and the clickstream data by applying the search query data and the clickstream data to one or more analytics models, which include one or more of a predictive model and a probabilistic topic model, according to one embodiment. The customer self-help system uses the results of the analyses to determine levels of search sophistication for the users, and the customer self-help system provides a simplified content search user experience to less-experienced users and an advanced content search user experience to advanced users, according to one embodiment. Thus, the present disclosure resolves the above technical problem with a technical solution that includes determining the level of search sophistication of users and providing personalized content search user experiences to the users, based on the determined level of user search sophistication. This reduces the likelihood of repeated searching, the likelihood of contacting live customer support, and the likelihood of customer dissatisfaction with the content search user experience received from the customer self-help system, according to one embodiment.
As used herein, the term data management system (e.g., customer self-help system, tax return preparation system, or other software system) includes, but is not limited to, the following: one or more computing system implemented, online, or web-based personal or business tax return preparation systems; one or more computing system implemented, online, or web-based personal or business financial management systems, services, packages, programs, modules, or applications; one or more computing system implemented, online, or web-based personal or business management systems, services, packages, programs, modules, or applications; one or more computing system implemented, online, or web-based personal or business accounting or invoicing systems, services, packages, programs, modules, or applications; and various other personal or business electronic data management systems, services, packages, programs, modules, or applications, whether known at the time of filing or as developed after the time of filing.
As used herein the term “self-help system” is interchangeable with “customer self-help system,” “self-service system,” and “self-support system”. A self-help system (e.g., a customer self-help system) is a system that enables customers and other users to help themselves find answers to questions, find specific content within a financial management system, navigate within the financial management system, or perform one or more actions (e.g., adjust the user tax data within a particular form), according to one embodiment. In contrast, the term “live customer support” denotes an interaction between a user of a financial management system and a customer support representative who uses a telephone call, instant messaging, a video conference, text messaging, or other mode of telecommunications to resolve questions or concerns of a user of the financial management system.
Specific examples of data management systems include financial management systems. Examples of financial management systems include, but are not limited to the following: TurboTax® available from Intuit®, Inc. of Mountain View, Calif.; TurboTax Online™ available from Intuit®, Inc. of Mountain View, Calif.; QuickBooks®, available from Intuit®, Inc. of Mountain View, Calif.; QuickBooks Online™, available from Intuit®, Inc. of Mountain View, Calif.; Mint®, available from Intuit®, Inc. of Mountain View, Calif.; Mint® Online, available from Intuit®, Inc. of Mountain View, Calif.; or various other systems discussed herein, or known to those of skill in the art at the time of filing, or as developed after the time of filing.
A specific illustrative example of a customer self-help system includes, but is not limited to, TurboTax AnswerXchange® available from Intuit®, Inc. of Mountain View, Calif., according to one embodiment. The TurboTax AnswerXchange® available from Intuit®, Inc. of Mountain View, Calif., is one specific example of a customer self-help system that enables users to receive responses to search queries with User Generated Content (“UGC”), service provider content (e.g., prepared by employees of Intuit®), and definitions content (e.g., explanations of tax-specific jargon), according to one embodiment.
As used herein, the terms “computing system,” “computing device,” and “computing entity,” include, but are not limited to, the following: a server computing system; a workstation; a desktop computing system; a mobile computing system, including, but not limited to, one or more of smart phones, portable devices, and devices worn or carried by a user; a database system or storage cluster; a virtual asset; a switching system; a router; any hardware system; any communications system; any form of proxy system; a gateway system; a firewall system; a load balancing system; or any device, subsystem, or mechanism that includes components that can execute all, or part, of any one of the processes or operations as described herein.
In addition, as used herein, the terms “computing system”, “computing entity”, and “computing environment” can denote, but are not limited to the following: systems made up of multiple virtual assets, server computing systems, workstations, desktop computing systems, mobile computing systems, database systems or storage clusters, switching systems, routers, hardware systems, communications systems, proxy systems, gateway systems, firewall systems, load balancing systems, or any devices that can be used to perform the processes or operations as described herein.
Herein, the term “production environment” includes the various components, or assets, used to deploy, implement, access, and use, a given system as that system is intended to be used. In various embodiments, production environments include multiple computing systems or assets that are combined, communicatively coupled, virtually or physically connected, or associated with one another, to provide the production environment implementing the application.
As specific illustrative examples, the assets making up a given production environment can include, but are not limited to, the following: one or more computing environments used to implement at least part of a system in the production environment such as a data center, a cloud computing environment, a dedicated hosting environment, or one or more other computing environments in which one or more assets used by the application in the production environment are implemented; one or more computing systems or computing entities used to implement at least part of a system in the production environment; one or more virtual assets used to implement at least part of a system in the production environment; one or more supervisory or control systems, such as hypervisors, or other monitoring and management systems used to monitor and control assets or components of the production environment; one or more communications channels for sending and receiving data used to implement at least part of a system in the production environment; one or more access control systems for limiting access to various components of the production environment, such as firewalls and gateways; one or more traffic or routing systems used to direct, control, or buffer data traffic to components of the production environment, such as routers and switches; one or more communications endpoint proxy systems used to buffer, process, or direct data traffic, such as load balancers or buffers; one or more secure communication protocols or endpoints used to encrypt/decrypt data, such as Secure Sockets Layer (SSL) protocols, used to implement at least part of a system in the production environment; one or more databases used to store data in the production environment; one or more internal or external services used to implement at least part of a system in the production environment; one or more backend systems, such as backend servers or other hardware used to process data and implement at least part of a system in the production environment; one or more modules/functions used to implement at least part of a system in the production environment; or any other assets/components making up an actual production environment in which at least part of a system is deployed, implemented, accessed, and run, e.g., operated, as discussed herein, or as known in the art at the time of filing, or as developed after the time of filing.
As used herein, the term “computing environment” includes, but is not limited to, a logical or physical grouping of connected or networked computing systems or virtual assets using the same infrastructure and systems such as, but not limited to, hardware systems, software systems, and networking/communications systems. Typically, computing environments are either known, “trusted” environments or unknown, “untrusted” environments. Typically, trusted computing environments are those where the assets, infrastructure, communication and networking systems, and security systems associated with the computing systems or virtual assets making up the trusted computing environment, are either under the control of, or known to, a party.
In various embodiments, each computing environment includes allocated assets and virtual assets associated with, and controlled or used to create, deploy, or operate at least part of the system.
In various embodiments, one or more cloud computing environments are used to create, deploy, or operate at least part of the system that can be any form of cloud computing environment, such as, but not limited to, a public cloud; a private cloud; a virtual private network (VPN); a subnet; a Virtual Private Cloud (VPC); a sub-net or any security/communications grouping; or any other cloud-based infrastructure, sub-structure, or architecture, as discussed herein, as known in the art at the time of filing, or as developed after the time of filing.
In many cases, a given system or service may utilize, and interface with, multiple cloud computing environments, such as multiple VPCs, in the course of being created, deployed, or operated.
As used herein, the term “virtual asset” includes any virtualized entity or resource, or virtualized part of an actual, or “bare metal” entity. In various embodiments, the virtual assets can be, but are not limited to, the following: virtual machines, virtual servers, and instances implemented in a cloud computing environment; databases associated with a cloud computing environment, or implemented in a cloud computing environment; services associated with, or delivered through, a cloud computing environment; communications systems used with, part of, or provided through a cloud computing environment; or any other virtualized assets or sub-systems of “bare metal” physical devices such as mobile devices, remote sensors, laptops, desktops, point-of-sale devices, etc., located within a data center, within a cloud computing environment, or any other physical or logical location, as discussed herein, or as known/available in the art at the time of filing, or as developed/made available after the time of filing.
In various embodiments, any, or all, of the assets making up a given production environment discussed herein, or as known in the art at the time of filing, or as developed after the time of filing can be implemented as one or more virtual assets within one or more cloud or traditional computing environments.
In one embodiment, two or more assets, such as computing systems or virtual assets, or two or more computing environments are connected by one or more communications channels including but not limited to, Secure Sockets Layer (SSL) communications channels and various other secure communications channels, or distributed computing system networks, such as, but not limited to the following: a public cloud; a private cloud; a virtual private network (VPN); a subnet; any general network, communications network, or general network/communications network system; a combination of different network types; a public network; a private network; a satellite network; a cable network; or any other network capable of allowing communication between two or more assets, computing systems, or virtual assets, as discussed herein, or available or known at the time of filing, or as developed after the time of filing.
As used herein, the term “network” includes, but is not limited to, any network or network system such as, but not limited to, the following: a peer-to-peer network; a hybrid peer-to-peer network; a Local Area Network (LAN); a Wide Area Network (WAN); a public network, such as the Internet; a private network; a cellular network; any general network, communications network, or general network/communications network system; a wireless network; a wired network; a wireless and wired combination network; a satellite network; a cable network; any combination of different network types; or any other system capable of allowing communication between two or more assets, virtual assets, or computing systems, whether available or known at the time of filing or as later developed.
As used herein, the term “user experience display” includes not only data entry and question submission user interfaces, but also other user experience features and elements provided or displayed to the user such as, but not limited to, the following: data entry fields, question quality indicators, images, backgrounds, avatars, highlighting mechanisms, icons, buttons, controls, menus and any other features that individually, or in combination, create a user experience, as discussed herein, or as known in the art at the time of filing, or as developed after the time of filing.
As used herein, the terms “user experience page” and “user experience screen” are interchangeable in meaning and represent a changeable rendering or view of content that is provided to a user in the user experience display, according to one embodiment.
As used herein, the term “user experience” includes, but is not limited to, one or more of a search query creation process, an incremental search results receipt process, a user session, interview process, interview process questioning, or interview process questioning sequence, or other user experience features provided or displayed to the user such as, but not limited to, interfaces, images, assistance resources, backgrounds, avatars, highlighting mechanisms, icons, and any other features that individually, or in combination, create a user experience, as discussed herein, or as known in the art at the time of filing, or as developed after the time of filing.
Herein, the terms “party,” “user,” “user consumer,” and “customer” are used interchangeably to denote any party or entity that interfaces with, or to whom information is provided by, the disclosed methods and systems described herein, or a legal guardian of a person or entity that interfaces with, or to whom information is provided by, the disclosed methods and systems described herein, or an authorized agent of any party or person or entity that interfaces with, or to whom information is provided by, the disclosed methods and systems described herein. For instance, in various embodiments, a user can be, but is not limited to, a person, a commercial entity, an application, a service, or a computing system.
As used herein, the term “analytics model” denotes one or more individual or combined algorithms or sets of ordered relationships that describe, determine, or predict characteristics of or the performance of a datum, a data set, multiple data sets, a computing system, or multiple computing systems. Analytics models or analytical models represent collections of measured or calculated behaviors of attributes, elements, or characteristics of data or computing systems. Analytics models include probabilistic topic models and predictive models (e.g., query classifiers), which identify the likelihood of one attribute or characteristic based on one or more other attributes or characteristics.
As used herein, the term “search sophistication” denotes or represents an ability of a user to formulate a search query. Some users have a relatively high level of ability to formulate a search query by using, for example, multiple search text boxes that are combined to form a complex search query. Some users have a relatively low level of ability to formulate a search query and are more likely to complete the formulation and submission of a search query when provided with a simplified search experience (e.g., a single search text box and tips for using the search text box).
As used herein, a “search sophistication score” quantifies or metricizes (i.e., makes measurable) the search sophistication of a user with a numerical score. The search sophistication score is used to determine a level of search sophistication for a user by, for example, comparing the search sophistication score to one or more thresholds, according to one embodiment. The level of search sophistication for a user is a level of ability of a user to formulate a search query that effectively results in the customer support content sought by the user. Examples of levels of search sophistication include, but are not limited to, basic, intermediate, and advanced, according to one embodiment.
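As an illustrative sketch only, a numerical search sophistication score could be mapped to a level of search sophistication by comparing it against thresholds; the Python function below assumes a score range of 0.0 to 1.0 and hypothetical cutoff values of 0.33 and 0.66, which are not taken from the disclosure.

def level_from_search_sophistication_score(score: float,
                                           basic_threshold: float = 0.33,
                                           advanced_threshold: float = 0.66) -> str:
    """Map a search sophistication score (0.0-1.0) to a level by thresholding."""
    if score < basic_threshold:
        return "basic"
    if score < advanced_threshold:
        return "intermediate"
    return "advanced"

# Example usage: level_from_search_sophistication_score(0.8) returns "advanced".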
User Experience
Referring to
Referring to
Referring now to
The first or simplified content search user experience 104 includes one or more of a content search user experience identifier 108, a search text box 110, search tips 112, and a search submission user experience element 114, according to various embodiments. The content search user experience identifier 108 includes “Not what you were searching for?”, according to the specific illustrative example of
The requested customer support content 106 is provided to a user in response to the search query that was used to find the user experience page 100, according to one embodiment. The requested customer support content 106 includes, but is not limited to, one or more of User Generated Content (“UGC”), service provider content (e.g., white papers, tutorials, Frequently Asked Questions (“FAQs”), etc.), and definitions content (e.g., definitions of tax-specific jargon), according to one embodiment. The search query is “TurboTax how do I get my tax refund”, according to the specific illustrative example of
The representation of a search query 116 represents a question that the customer self-help system determines to be intended by the user's search query, according to one embodiment. The representation of a search query 116 represents a question that a third-party search engine determines to be intended by the user's search query, according to one embodiment. The representation of a search query 116 is “When will I get my IRS tax refund?”, according to the specific illustrative example of
The text answer to the representation of a search query 118 and a multimedia answer to the representation of a search query 120 are examples of customer support content that are provided to users by the customer self-help system to answer users' questions and to resolve users' concerns, according to one embodiment. The text answer to the representation of a search query 118 provides a legible response to the representation of a search query 116, according to one embodiment. The multimedia answer to the representation of a search query 120 provides a viewable or audible response to the representation of a search query 116, according to one embodiment.
Referring now to
The second or advanced content search user experience 152 includes one or more of a content search user experience identifier 158, a search text box 160, a search text box 162, and a search text box 164, according to various embodiments. The second or advanced content search user experience 152 also includes one or more of a search box descriptor 166, the search box descriptor 168, a search box descriptor 170, and a search submission user experience element 172, according to various embodiments. The content search user experience identifier 158 includes “Refine search?”, according to the specific illustrative example of
The requested customer support content 154 is provided to a user in response to the search query that was used to reference the user experience page 150, according to one embodiment. The requested customer support content 154 includes, but is not limited to, one or more of User Generated Content (“UGC”), service provider content, and definitions content, according to one embodiment. The search query is “turbotax 2016 refund status” (not shown), according to the specific illustrative example of
The representation of a search query 174 represents a question that the customer self-help system determines to be intended by the user's search query, according to one embodiment. The representation of a search query 174 represents a question that a third-party search engine determines to be intended by the user's search query, according to one embodiment. The representation of a search query 174 is “How do I check my e-file status?”, according to the specific illustrative example of
The text answer to the representation of a search query 176 and a multimedia answer to the representation of a search query 178 are examples of customer support content that are provided to users by the customer self-help system to answer users' questions and to resolve users' concerns, according to one embodiment. The text answer to the representation of a search query 176 provides a legible response to the representation of a search query 174, according to one embodiment. The multimedia answer to the representation of a search query 178 provides a viewable or audible response to the representation of a search query 174, according to one embodiment.
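As a sketch of how the multiple search text boxes of the advanced content search user experience described above might be combined into a single complex search query, the Python function below assumes hypothetical box semantics (required terms, an exact phrase, and excluded terms) and a quote/minus query syntax; the disclosure does not specify this grammar, so it is offered only as an aid to understanding.

def build_advanced_query(all_terms: str = "", exact_phrase: str = "", excluded_terms: str = "") -> str:
    """Combine the contents of several search text boxes into one query string."""
    parts = []
    if all_terms:
        parts.append(all_terms.strip())
    if exact_phrase:
        parts.append(f'"{exact_phrase.strip()}"')  # quoted exact phrase
    for term in excluded_terms.split():
        parts.append(f"-{term}")  # minus-prefixed excluded terms
    return " ".join(parts)

# Example usage: build_advanced_query("turbotax refund", "e-file status", "2015")
# returns: turbotax refund "e-file status" -2015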
Architecture
The user computing systems 210 represent one or more user computing systems that are used by users 212 to access the third-party search engine 220, the financial management system 230, and the customer self-help system 250, according to one embodiment. A user 214 represents one of the users 212, according to one embodiment. The user 214 submits a search query 216 to the third-party search engine 220 or to the customer self-help system 250 to resolve a question or concern, to acquire more information about the financial management system 230, or to acquire information related to the financial management system 230, according to one embodiment.
The third-party search engine 220 is an example of an Internet search engine that provides search results 222 that are responsive to the search query 216, according to one embodiment. The third-party search engine 220 employs one or more content searching algorithms to identify portions of customer support content that match the search query 216 or that match an identified intent of the search query 216, according to one embodiment. Specific illustrative examples of the third-party search engine 220 include, but are not limited to, Google®, Bing®, and Yahoo®, according to various embodiments. The search results 222 include a customer self-help system reference 224, according to one embodiment. The customer self-help system reference 224 is a hyperlink or other Internet-based reference to the customer self-help system 250 or to content within the customer self-help system 250, according to one embodiment. The third-party search engine 220 provides the customer self-help system reference 224 to the user 214, in response to receiving the search query 216, according to one embodiment. The user 214 is directed to the customer self-help system 250 (e.g., through a web browser), in response to selecting the customer self-help system reference 224, according to one embodiment.
The financial management system 230 is configured to provide one or more financial management services, according to one embodiment. The financial management system 230 includes one or more of a tax return preparation system, a business financial management system, and a personal financial management system, according to one embodiment (not shown). As specific illustrative examples, the financial management system 230 includes, but is not limited to, one or more of: TurboTax® available from Intuit®, Inc. of Mountain View, Calif.; TurboTax Online™ available from Intuit®, Inc. of Mountain View, Calif.; QuickBooks®, available from Intuit®, Inc. of Mountain View, Calif.; QuickBooks Online™, available from Intuit®, Inc. of Mountain View, Calif.; Mint®, available from Intuit®, Inc. of Mountain View, Calif.; Mint® Online, available from Intuit®, Inc. of Mountain View, Calif.; and various other systems discussed herein, or known to those of skill in the art at the time of filing, or as developed after the time of filing (not shown), according to various embodiments.
The financial management system 230 assists the users 212 in completing a financial management task 232 by providing the users 212 with financial management system user experience content 234, according to one embodiment. The financial management task 232 includes, but is not limited to, one or more of preparing a tax return, filing a tax return, preparing and filing a tax return, entering financial information into the financial management system 230 to support preparing and filing a tax return, creating an account with the financial management system 230, logging into an account with the financial management system 230, creating a personal budget, setting a monetary value of a personal budget for a number of financial categories, linking one or more financial institution accounts to the financial management system 230, importing financial information into the financial management system 230 from one or more third-party servers, creating an invoice, creating a receipt, transmitting a receipt or an invoice to a supplier or to a customer, setting up a business within the financial management system 230, entering employee information, setting up payroll, paying employees through the financial management system 230, and tracking expenses, according to various embodiments.
The financial management system 230 provides the financial management system user experience content 234 with a plurality of user experience pages (not shown) to assist users in completing the financial management task 232, according to one embodiment. The financial management system user experience content 234 includes, but is not limited to, one or more tax topics, questions, question sequences, web links, content sequences, pages, colors, interface elements, promotions, audio clips, video clips, other multimedia, business questions, business budget questions, personal budget questions, data entry fields, question quality indicators, images, backgrounds, avatars, highlighting mechanisms, icons, buttons, controls, menus and any other features that individually, or in combination, create a user experience in a financial management system, as discussed herein, as known in the art at the time of filing, or as developed after the time of filing, according to various embodiments.
The financial management system user experience content 234 is provided to the users 212 to acquire user financial data (not shown) from the users 212, according to one embodiment. The financial management system 230 uses the user financial data to facilitate completion of the financial management task 232 (e.g., prepare and file a tax return) or to provide other financial management services to the users 212, according to one embodiment. The user financial data includes, but is not limited to, one or more of a user's name, a date of birth, an address, a zip code, a home ownership status, a marital status, an annual income, a job title, an employer's address, spousal information, children's information, asset information, medical history, occupation, information regarding dependents, salary and wages, interest income, dividend income, business income, farm income, capital gain income, pension income, individual retirement account (“IRA”) distributions, unemployment compensation, education expenses, health savings account deductions, moving expenses, IRA deductions, student loan interest deductions, tuition and fees, medical and dental expenses, state and local taxes, real estate taxes, personal property tax, mortgage interest, charitable contributions, casualty and theft losses, unreimbursed employee expenses, alternative minimum tax, foreign tax credit, education tax credits, retirement savings contribution, child tax credits, business income, accounts receivable, accounts payable, invoice information, inventory quantities, inventory costs, operating expenses, business travel records, business travel expenses, customer contact information, credit card balances, quarterly tax estimations, spending category budgets, outstanding loan balances, personal spending trends, categories of business expenses, categories of personal expenses, employee information, employee expenses, insurance costs, residential energy credits, and any other user financial data that is discussed herein, that is known at the time of filing, or that becomes known after the time of filing.
The financial management system 230 defines and stores a user identification (“ID”) 236 for each of the users 212 who use the financial management system 230, according to one embodiment. The user ID 236 is at least partially based on one or more of the user computing systems 210 associated with the users 212, according to one embodiment. The user ID 236 is based on characteristics of one or more of the user computing systems 210 used to access the financial management system 230, according to one embodiment. The characteristics of the user computing systems 210 include, but are not limited to, one or more of an operating system, an Internet browser, a media access control (“MAC”) address or other computer hardware identifier, an Internet Protocol (“IP”) address, or any combination of the characteristics, according to one embodiment. The user ID 236 is at least partially based on one or more user characteristics provided to the financial management system 230 by the users 212, which include, but are not limited to, a name, a username, a password, a code word, an email address, a birthdate, a government identification number, or any combination of the user characteristics, according to one embodiment. The user ID 236 is a combination of characteristics of the user computing systems 210 and of user characteristics (not shown), according to one embodiment. The user ID 236 is used by a service provider to identify a user in the financial management system 230, the customer self-help system 250, and other systems or products offered by the service provider, according to one embodiment. Defining the user ID 236 based on characteristics of one or more of the user computing systems 210 enables the financial management system 230 and the customer self-help system 250 to identify the users 212, without the users 212 logging into an account, according to one embodiment.
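A minimal sketch of deriving such a user ID from a combination of computing system characteristics and user characteristics is shown below; the hash-based approach and the specific field names are illustrative assumptions rather than the disclosed scheme.

import hashlib

def derive_user_id(system_characteristics: dict, user_characteristics: dict) -> str:
    """Derive a stable user ID from device characteristics and user characteristics."""
    # Hypothetical inputs, e.g.:
    #   system_characteristics = {"os": "...", "browser": "...", "ip": "...", "mac": "..."}
    #   user_characteristics   = {"username": "...", "email": "..."}
    fingerprint = "|".join(
        f"{key}={value}"
        for key, value in sorted({**system_characteristics, **user_characteristics}.items())
    )
    return hashlib.sha256(fingerprint.encode("utf-8")).hexdigest()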
The financial management system 230 records or tracks clickstream data 238 for the users 212 as they navigate and use the financial management system 230, according to one embodiment. Clickstream data 238 includes, but is not limited to, one or more selection device (e.g., mouse, stylus, finger) movements, typing speed, time spent on a user experience page, user experience elements selected with a selection device, and user experience elements that are hovered over, according to various embodiments. The clickstream data 238 includes navigation behavior data representing navigation behavior such as navigating back and forth between two or more user experience pages, which may be used to identify potential user confusion, according to one embodiment. The clickstream data 238 includes financial management system user experience history 240 as a record of the financial management system user experience content 234 that is provided to the users 212 and as a record of which user experience pages the users 212 visit within the financial management system 230, according to one embodiment.
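As an illustrative sketch of the kind of record that tracked clickstream data might contain, the Python data structures below use hypothetical field names; they are not the disclosed data format.

from dataclasses import dataclass, field
from typing import List

@dataclass
class ClickstreamEvent:
    """One tracked user interaction within the financial management system."""
    user_id: str
    page_id: str
    event_type: str                 # e.g., "click", "hover", "keypress", "navigate"
    element_id: str = ""            # user experience element interacted with, if any
    timestamp_ms: int = 0           # when the event occurred
    typing_speed_cpm: float = 0.0   # characters per minute, if applicable

@dataclass
class ClickstreamData:
    """Clickstream history for a user, including pages visited."""
    user_id: str
    events: List[ClickstreamEvent] = field(default_factory=list)

    def pages_visited(self) -> List[str]:
        return [e.page_id for e in self.events if e.event_type == "navigate"]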
The financial management system 230 provides the user ID 236 and the clickstream data 238 to the customer self-help system 250, to enable the customer self-help system 250 to create and maintain user profile data 256, according to one embodiment. In one embodiment, the user profile data 256 is stored in the financial management system 230 (not shown). The user profile data 256 is stored in the customer self-help system 250, in the specific illustrated example of
The financial management system 230 is represented or implemented by data that is partially or wholly stored in memory 242 (inclusive of non-volatile memory and volatile memory) and is partially or wholly executed by processors 244, according to one embodiment.
The production environment 200 includes a customer self-help system 250 that is associated with the financial management system 230 and that is configured to determine levels of search sophistication for the users 212 that access the customer self-help system 250 to personalize a content search user experience for the users 212, according to one embodiment. The customer self-help system 250 also provides customer support content 252 to the users 212, to resolve questions or concerns of the users 212 that are relevant to the financial management system 230, according to one embodiment. The customer self-help system 250 includes the customer support content 252, a customer support engine 254, the user profile data 256, and content search user experience options 258 for determining levels of search sophistication for the users 212 and for providing a personalized content search user experience (e.g., the first or simplified content search user experience 104 of
The customer self-help system 250 receives the search query 216 directly or indirectly from the user 214, according to one embodiment. The customer self-help system 250 receives the search query 216 directly from the user 214, if the user enters the search query 216 into one or more user experience pages of the customer self-help system 250, according to one embodiment. The customer self-help system 250 receives the search query 216 indirectly from the user 214, if the user submits the search query 216 to the third-party search engine 220, according to one embodiment. The customer self-help system 250 determines the search query 216 by using web page characteristics that are generated when the user 214 selects the customer self-help system reference 224 to be directed to the customer self-help system 250, according to one embodiment. In one specific illustrative example, if the search query 216 is “What is a like-kind exchange?” and is entered into a third-party search engine 220, then a post-search URL of Google® is “https(colon)//www(dot)google(dot)com(forward slash)webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=What+is+a+like-kind+exchange %3F”, which includes the terms of the search query 216. The customer self-help system 250 can use one or more of a number of techniques to parse the search query 216 from the URL of the user experience page from which the user is navigated to the customer self-help system 250, as known in the art, according to one embodiment.
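A sketch of one such parsing technique, using Python's standard urllib, is shown below; it assumes that the search terms appear in a q parameter, as in the Google® example above, and other search engines may use different parameter names.

from urllib.parse import urlparse, parse_qs

def query_from_referral_url(url: str) -> str:
    """Extract the search terms from a third-party search engine referral URL."""
    parsed = urlparse(url)
    # The search terms may appear in the query string or, as in the example
    # above, in the URL fragment (the portion after the '#').
    for component in (parsed.query, parsed.fragment):
        params = parse_qs(component)
        if "q" in params:
            return params["q"][0]
    return ""

# Example usage:
# query_from_referral_url("https://www.google.com/webhp?sourceid=chrome-instant#q=What+is+a+like-kind+exchange%3F")
# returns: What is a like-kind exchange?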
In response to receiving the search query 216, the customer self-help system 250 provides portions of the customer support content 252 to the user 214, according to one embodiment. The customer self-help system 250 employs one or more of a number of search engines or database search techniques to identify portions of the customer support content 252 that match or that are responsive to the search query 216, according to one embodiment. Examples of the customer support content 252 include one or more of crowd-sourced customer support content, service provider content, and definitions content, according to one embodiment. The crowd-sourced customer support content includes questions and responses that are submitted by a community of question askers and response providers that use the customer self-help system 250 or another question and answer customer support system that is associated with the financial management system 230, according to one embodiment. The crowd-sourced customer support content can also be referred to as User Generated Content (“UGC”) to distinguish the crowd-sourced customer support content from the service provider content, according to one embodiment. The service provider content includes white papers, questions, answers, frequently asked questions, answers to frequently asked questions, tutorials, audio/video content, interactive content, or other content that can be used to assist users in learning about taxes, tax preparation, financial business management, personal financial management, the financial management system 230, or other relevant subject matter, according to one embodiment. The definitions content includes acronym definitions, definitions of tax-specific terms (e.g., tax jargon), definitions of terms that are related to tax law or preparing tax returns, definitions of business-specific terms, definitions of terms that are related to financial business management, and definitions of terms that are related to the financial management system 230, according to one embodiment.
The customer self-help system 250 uses the customer support engine 254 to identify the search query 216 and to provide a personalized content search user experience 260 to the user 214, according to one embodiment. The customer support engine 254 includes a customer support request 262 and a user experience display 264, according to one embodiment. The customer support request 262 represents a request from one of the user computing systems 210 that is associated with user 214, and represents a request by the user 214 to display a portion of the customer support content 252, according to one embodiment.
The customer support engine 254 uses the user experience display 264 to provide the personalized content search user experience 260 and to display requested customer support content 266, according to one embodiment. Specific illustrative examples of the user experience display 264 include the user experience page 100 of
The user profile data 256 includes the user ID 236, the clickstream data 238, search sophistication characteristics 268, and a search sophistication score 270, according to one embodiment. The clickstream data 238 represents user interactions with the financial management system 230, according to one embodiment. The clickstream data 238 is also updated by the customer self-help system 250, based on user interactions with the customer self-help system 250, according to one embodiment. The search sophistication characteristics 268 are determined by applying one or more of the search query 216 and the clickstream data 238 to a first analytics model 272, according to one embodiment. The search sophistication score 270 is determined by applying the search sophistication characteristics 268 to a second analytics model 274, according to one embodiment.
The search sophistication characteristics 268 represent search characteristics of the users 212 that are transformed into a search sophistication score 270 by one or more analytics models, according to one embodiment. The search sophistication characteristics 268 are used to distinguish between less-sophisticated search queries and more-sophisticated search queries in order to identify the type of content search user experience to provide to the users 212, according to one embodiment. As an example, “can you help me find my tax refund amount from last year?” is an example of a search query that is formulated in a manner that is similar to communicating with another person. Because search queries that are formulated in a manner that is similar to communicating with another person include several terms that likely provide little improvement to the search, such a search query is generally considered to be a less sophisticated search query, according to one embodiment. In particular, pronouns (e.g., I, we, us, mine), question words (e.g., why, where, how), and punctuation are generally useless or less-useful terms for a search engine and are therefore used to characterize a user to determine which type of content search user experience would be more effective for the user, according to one embodiment. The search sophistication characteristics 268 are calculated for users who have used the financial management system 230, who have used the customer self-help system 250, or who have used both the financial management system 230 and the customer self-help system 250, according to one embodiment. The search sophistication characteristics 268 are determined by analyzing the search query 216 and the clickstream data 238, according to one embodiment. The search sophistication characteristics 268 are used to determine a level of search sophistication of a user and are used to identify which of the content search user experience options to provide to the user 214, according to one embodiment. The search sophistication characteristics 268 include, but are not limited to, one or more of pronoun use in the search query 216, punctuation in the search query 216, a character length of the search query 216, a number of search query terms in the search query 216, a user's typing speed in the financial management system 230, a user's typing speed in the customer self-help system 250, a user's typing speed while formulating the search query 216, a number of misspelled words in the search query 216, whether misspelled search query terms are adjacent key misspellings, whether misspelled search query terms are wrong letter order misspellings, whether misspelled search query terms are phonetic misspellings, misspellings made from a mobile device auto-correction, other misspelling characteristics, and any other search sophistication characteristics discussed herein, as known in the art at the time of filing, or as developed after the time of filing, according to various embodiments. In one embodiment, the search sophistication characteristics 268 include a mouse click rate as an indication of sophistication, since longer search queries (e.g., conversational queries) typically receive fewer clicks from users.
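A simplified sketch of extracting a few of these characteristics from a search query string is shown below; the particular pronoun and question-word lists are illustrative assumptions, and clickstream-derived characteristics such as typing speed and mouse click rate are omitted for brevity.

import re
import string

PRONOUNS = {"i", "we", "us", "me", "my", "mine", "you", "your"}
QUESTION_WORDS = {"who", "what", "when", "where", "why", "how"}

def search_sophistication_characteristics(search_query: str) -> dict:
    """Extract simple query-level search sophistication characteristics."""
    tokens = re.findall(r"[a-z']+", search_query.lower())
    return {
        "character_length": len(search_query),
        "num_terms": len(tokens),
        "num_pronouns": sum(1 for t in tokens if t in PRONOUNS),
        "num_question_words": sum(1 for t in tokens if t in QUESTION_WORDS),
        "num_punctuation": sum(1 for c in search_query if c in string.punctuation),
    }

# Example usage:
# search_sophistication_characteristics("can you help me find my tax refund amount from last year?")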
The customer self-help system 250 determines the search sophistication characteristics 268 by applying one or more of the search query 216 and the clickstream data 238 to the first analytics model 272, according to one embodiment. The first analytics model 272 analyzes the one or more of the search query 216 and the clickstream data 238 with one or more linguistic analysis algorithms to determine the search sophistication characteristics 268, according to one embodiment. The first analytics model 272 includes a probabilistic topic model to determine at least some of the search sophistication characteristics 268, according to one embodiment. The probabilistic topic model of the first analytics model 272 is implemented using one or more of a Latent Dirichlet Allocation (“LDA”) algorithm, Latent Semantic Indexing (“LSI”), query clustering, query de-duplication, and one or more other techniques currently known or later developed for generating probabilistic topic models, according to various embodiments. The first analytics model 272 incorporates one or more third-party spell-checking engines for determining misspelling characteristics of the search query 216 and the clickstream data 238, according to one embodiment. The first analytics model 272 derives the typing speed from the clickstream data 238 and the search query 216 by comparing timestamps against the information entered for the search query 216 and the clickstream data 238, according to one embodiment. The first analytics model 272 employs one or more techniques to identify pronoun use, interrogatory term use, punctuation, query length characteristics, and other characteristics of the search query 216, according to one embodiment. Specific illustrative examples of techniques that are available for identifying the characteristics of the search query 216 include, but are not limited to, one or more open source language processing techniques (e.g., Python's Natural Language Toolkit (“NLTK”), Stanford CoreNLP Suite, Apache SOLR™ and Lucene™, etc.), commercial language processing techniques, and other language processing techniques whether known at the time of filing or as developed after the time of filing, according to various embodiments.
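Because NLTK is named above as one available open source language processing technique, the sketch below shows one way part-of-speech tags could be used to flag pronoun and interrogatory term use in a search query; it assumes the standard NLTK tokenizer and tagger resources are installed and is only one of many possible implementations.

```python
import nltk

# One-time resource downloads (assumed already present in a deployed system):
# nltk.download("punkt"); nltk.download("averaged_perceptron_tagger")

def linguistic_features(query: str) -> dict:
    """Count pronoun and wh-word part-of-speech tags in a search query using NLTK."""
    tagged = nltk.pos_tag(nltk.word_tokenize(query))
    pronouns = sum(tag in ("PRP", "PRP$") for _, tag in tagged)
    wh_terms = sum(tag in ("WP", "WP$", "WRB", "WDT") for _, tag in tagged)
    return {"pronoun_count": pronouns, "wh_term_count": wh_terms, "token_count": len(tagged)}
```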
The search sophistication score 270 is a score that represents the likelihood that a user has an advanced skill or comfort with formulating search queries, according to one embodiment. The search sophistication score 270 is output from the second analytics model 274, and is a result of an analysis of the search sophistication characteristics 268, according to one embodiment. The search sophistication score 270 is defined within a range of scores that correspond to a level of search sophistication for the users 212, according to one embodiment. The higher the search sophistication score is for a user, the more likely the user is to be satisfied with an advanced content search user experience, according to one embodiment. The lower the search sophistication score is for a user, the more likely the user is to be satisfied with a simplified content search user experience, according to one embodiment. A specific illustrative example of a range of search sophistication scores is 0-1 (inclusive of 0 and 1), although other ranges can also be used, according to one embodiment.
The customer self-help system 250 uses the second analytics model 274 to determine the search sophistication score 270, based on the search sophistication characteristics 268, according to one embodiment. In one embodiment, the second analytics model 274 includes or uses a predictive model to determine the search sophistication score 270. The predictive model receives the search sophistication characteristics 268 for one particular user (e.g., the user 214) and determines a search sophistication score for the particular user that is based on the search sophistication characteristics and actions of other prior users of the customer self-help system 250, according to one embodiment.
The predictive model of the second analytics model 274 is trained using predictive model training operations that include, but are not limited to, one or more of regression, logistic regression, decision trees, artificial neural networks, support vector machines, linear regression, nearest neighbor methods, distance based methods, naive Bayes, linear discriminant analysis, k-nearest neighbor algorithm, another query classifier, and any other presently known or later developed predictive model training operations, according to one embodiment.
The predictive model of the second analytics model 274 is trained using historic data (not shown) from prior users of the customer self-help system 250 as a training data set, according to one embodiment. The historic data from prior users of the customer self-help system 250 includes information that is indicative of users' actions after receiving one of the content search user experience options 258, according to one embodiment. The users' actions are used to determine whether the presentation of a particular content search user experience (e.g., basic, intermediate, advanced, etc.) enabled the user to effectively find portions of the customer support content 252 that were searched for, according to one embodiment. The users' actions include whether or not a user requested live customer support after receiving one of the content search user experience options 258, according to one embodiment. The users' actions include whether or not the user indicated satisfaction with one of the content search user experience options 258, according to one embodiment. The users' actions include whether or not the user returned to the use of the financial management system 230 after submitting a search query with one of the content search user experience options 258, according to one embodiment. The users' actions include the number of additional search queries the user submitted with one of the content search user experience options 258, before finding a satisfactory answer, according to one embodiment. The users' actions include whether a user searched for certain content and did not contact assisted support afterwards, and vice versa, according to one embodiment. Each of these user actions and the submitted search queries are associated with search sophistication characteristics for these prior users, and the users' actions in combination with the search sophistication characteristics for the prior users are used to train the predictive model of the second analytics model 274, according to one embodiment.
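As one hedged illustration of training the predictive model, the sketch below uses logistic regression (one of the training operations listed above) via scikit-learn, which is an assumed library not named in this disclosure; the characteristic values and labels are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training rows: one vector of characteristic values per prior user
# (e.g., pronoun count, punctuation count, term count, typing speed, misspellings),
# labeled 1 for prior users treated as sophisticated searchers (for example, users
# who found content without additional queries or live customer support).
X_train = np.array([
    [3, 2, 11, 1.8, 2],
    [0, 0, 3, 4.5, 0],
    [4, 1, 14, 2.0, 3],
    [1, 0, 4, 3.9, 1],
])
y_train = np.array([0, 1, 0, 1])

model = LogisticRegression()
model.fit(X_train, y_train)

# The predicted probability of the "sophisticated" class can serve as a
# search sophistication score in the 0-1 range discussed above.
new_user = np.array([[0, 0, 2, 4.1, 0]])
score = model.predict_proba(new_user)[0, 1]
```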
In one embodiment, the first analytics model 272 and the second analytics model 274 are the same analytics model. In one embodiment, the first analytics model 272 and the second analytics model 274 are both included in a third analytics model (not shown), which is used by the customer self-help system 250 to determine the search sophistication characteristics 268 and the search sophistication score 270.
The customer self-help system 250 applies the search sophistication score 270 to a threshold 276 to determine a level of search sophistication 278 for the user 214, according to one embodiment. The threshold 276 is a number that is within the range for the search sophistication score 270, according to one embodiment. One specific illustrative example of the threshold 276 is 0.7 if the potential range of the search sophistication score 270 is 0-1. If the search sophistication score 270 is greater than or equal to 0.7, then the level of search sophistication 278 is advanced, according to one embodiment. If the search sophistication score 270 is less than 0.7, then the level of search sophistication 278 is basic or less-experienced, according to one embodiment. The level of search sophistication 278 is a class label for the user that is used for training the second analytics model 274, according to one embodiment. The level of search sophistication 278 is expressed as a binary variable (e.g., less-sophisticated and sophisticated), according to one embodiment. The level of search sophistication 278 is expressed as an integer, float, or categorical variable (e.g., basic, intermediate, advanced), according to one embodiment.
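A minimal sketch of the threshold comparison described above, assuming the illustrative 0.7 threshold on a 0-1 score; the function name and level labels are hypothetical.

```python
def level_of_search_sophistication(score: float, threshold: float = 0.7) -> str:
    """Map a 0-1 search sophistication score to a categorical level using a threshold."""
    return "advanced" if score >= threshold else "basic"
```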
The customer self-help system 250 selects from the content search user experience options 258, based on the level of search sophistication 278, to determine or populate the personalized content search user experience 260, according to one embodiment. The content search user experience options 258 include a first content search user experience 280 and a second content search user experience 282, according to one embodiment. Although two content search user experience options 258 are specifically discussed herein, many more content search user experience options 258 are optionally available, according to one embodiment. The first content search user experience 280 is associated with a first or simplified content search user experience, which may include a single search text box, according to one embodiment. The first content search user experience 280 is associated with a basic or less-experienced level of search sophistication 278, according to one embodiment. Thus, if a user 214 is categorized as having a basic or less-experienced level of search sophistication 278, the customer self-help system 250 assigns the first content search user experience 280 for the personalized content search user experience 260 for delivery to the user 214, according to one embodiment.
The second content search user experience 282 is associated with a second or advanced content search user experience, according to one embodiment. The second content search user experience 282 is associated with an advanced level of search sophistication 278, according to one embodiment. Thus, if the user 214 is categorized as having an advanced level of search sophistication 278, the customer self-help system 250 assigns the second content search user experience 282 for the personalized content search user experience 260 for delivery to the user 214, according to one embodiment.
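Taken together, the two preceding paragraphs amount to a mapping from the level of search sophistication to one of the content search user experience options; the configuration values in the sketch below are assumptions for illustration only.

```python
# Hypothetical configuration: a simplified experience (single search text box) for
# basic users, and a richer experience with additional options for advanced users.
CONTENT_SEARCH_EXPERIENCES = {
    "basic": {"search_box": True, "filters": False, "boolean_operators": False},
    "advanced": {"search_box": True, "filters": True, "boolean_operators": True},
}

def personalized_experience(level: str) -> dict:
    """Select a content search user experience based on the level of search sophistication."""
    return CONTENT_SEARCH_EXPERIENCES.get(level, CONTENT_SEARCH_EXPERIENCES["basic"])
```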
The customer self-help system 250 is represented by or implemented using data that is partially or wholly stored in memory 284 (inclusive of non-volatile memory and volatile memory) and is partially or wholly executed by processors 286, according to one embodiment.
Although the features and functionality of the production environment 200 are illustrated or described in terms of individual or modularized components, engines, modules, models, databases/data stores, and systems, one or more of the functions of one or more of the components, engines, modules, models, databases/data stores, or systems are functionally combinable with one or more other described or illustrated components, engines, modules, models, databases/data stores, and systems, according to one embodiment. Each of the described engines, modules, models, databases/data stores, characteristics, user experiences, content, and systems are data that can be stored in memory and executed by one or more processors, according to various embodiments.
Process
Returning to
At operation 304, the process 300 receives a request to display customer support content in response to a search query submitted by a user, according to one embodiment. The request to display customer support content is a request made by a web browser in response to selection of a hyperlink that references customer support content within a customer self-help system, according to one embodiment. The user submits the search query to a third-party search engine, according to one embodiment. The user submits the search query directly to a customer self-help system, according to one embodiment. Operation 304 proceeds to operation 306, according to one embodiment.
At operation 306, the process 300 determines the search query, according to one embodiment. The search query is determined by parsing search query terms from a URL of the web page from which a user was directed to a customer self-help system, according to one embodiment. Operation 306 proceeds to operation 308, according to one embodiment.
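One way operation 306 could parse search query terms from a referring URL is shown below; the sketch assumes the query terms arrive in a conventional q= parameter, which is an assumption rather than something specified by this disclosure.

```python
from urllib.parse import parse_qs, urlparse

def query_from_referrer(url: str) -> str:
    """Extract the search query from a referring search-results URL (assumes a 'q' parameter)."""
    params = parse_qs(urlparse(url).query)
    return params.get("q", [""])[0]

# query_from_referrer("https://www.example.com/search?q=tax+refund+amount+last+year")
# -> "tax refund amount last year"
```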
At operation 308, the process 300 determines if the request is from a third-party search engine, according to one embodiment. If the request is not from a third-party search engine, the operation 308 proceeds to operation 310, according to one embodiment. If the request is from a third-party search engine, the operation 308 proceeds to operation 312, according to one embodiment.
At operation 310, the process 300 determines search sophistication characteristics data from clickstream data, according to one embodiment. The clickstream data is acquired from use of a customer self-help system or from use of a financial management system that the customer self-help system is associated with, according to one embodiment. Operation 310 proceeds to operation 314, according to one embodiment.
At operation 312, the process 300 determines if the user is a prior user, according to one embodiment. By calculating a user identification (“ID”) from characteristics of a user's computing system, a customer self-help system determines if a user has previously accessed a financial management system supported by the customer self-help system or accessed the self-help system, according to one embodiment. If the user is a prior user, operation 312 proceeds to operation 310, according to one embodiment. If the user is not a prior user, operation 312 proceeds to operation 314, according to one embodiment.
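Operation 312 relies on a user ID calculated from characteristics of the user's computing system; one hedged illustration is to hash a tuple of such characteristics, although this disclosure does not prescribe which characteristics or which hash function.

```python
import hashlib

def fingerprint_user(user_agent: str, screen_resolution: str, timezone: str, language: str) -> str:
    """Derive a stable, anonymized user ID from hypothetical browser/system characteristics."""
    raw = "|".join([user_agent, screen_resolution, timezone, language])
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()
```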
At operation 314, the process 300 applies the search query to a first analytics model to determine search sophistication characteristics data, according to one embodiment. Operation 314 proceeds to operation 316, according to one embodiment.
At operation 316, the process 300 applies the search sophistication characteristics data to a second analytics model to determine a search sophistication score for the user, according to one embodiment. In one embodiment, the first analytics model and the second analytics model are a single analytics model or are included in a third analytics model. Operation 316 proceeds to operation 318, according to one embodiment.
At operation 318, the process 300 compares the search sophistication score for the user to a threshold to determine a level of search sophistication for the user, according to one embodiment. Operation 318 proceeds to operation 320, according to one embodiment.
At operation 320, the process 300 identifies one of a plurality of content search user experiences to provide to the user, at least partially based on the level of search sophistication for the user, according to one embodiment. Operation 320 proceeds to operation 322, according to one embodiment.
At operation 322, the process 300 displays the identified one of the plurality of content search user experiences concurrently with customer support content, according to one embodiment. Operation 322 proceeds to operation 324, according to one embodiment.
At operation 324, the process 300 ends, according to one embodiment.
At operation 402, the process 400 begins, according to one embodiment. Operation 402 proceeds to operation 404, according to one embodiment.
At operation 404, the process 400 provides, with one or more computing systems, a customer self-help system associated with a financial management system, according to one embodiment. Operation 404 proceeds to operation 406, according to one embodiment.
At operation 406, the process 400 stores, in memory dedicated for use by the customer self-help system, customer support content data, the customer support content data representing customer support content that is provided to users of the customer self-help system to enable users to resolve questions or concerns related to the financial management system, according to one embodiment. Operation 406 proceeds to operation 408, according to one embodiment.
At operation 408, the process 400 receives, with the customer self-help system, request data representing a request to display a portion of the customer support content, according to one embodiment. Operation 408 proceeds to operation 410, according to one embodiment.
At operation 410, the process 400 identifies search query data from the request data, the search query data representing a search query from a user of the customer self-help system, according to one embodiment. Operation 410 proceeds to operation 412, according to one embodiment.
At operation 412, the process 400 provides analytics model data representing at least one analytics model, according to one embodiment. Operation 412 proceeds to operation 414, according to one embodiment.
At operation 414, the process 400 applies the search query data to the analytics model data to determine search sophistication characteristics data and search sophistication score data for the user at least partially based on the search query data, the search sophistication characteristics data representing search sophistication characteristics of the user; the search sophistication score data representing a search sophistication score, according to one embodiment. Operation 414 proceeds to operation 416, according to one embodiment.
At operation 416, the process 400 provides requested customer support content data concurrently with personalized content search user experience data, the requested customer support content data representing the portion of the customer support content, the personalized content search user experience data representing a selected one of a plurality of content search user experiences, the selected one of the plurality of content search user experiences being selected at least partially based on the search sophistication score data for the user, according to one embodiment. Operation 416 proceeds to operation 418, according to one embodiment.
At operation 418, the process 400 ends, according to one embodiment.
As noted above, the specific examples discussed above are but illustrative examples of implementations of embodiments of the method or process for determining levels of search sophistication for users of a customer self-help system to personalize a content search user experience for the users and to increase a likelihood of user satisfaction with the search experience. Those of skill in the art will readily recognize that other implementations and embodiments are possible. Therefore, the discussion above should not be construed as a limitation on the claims provided below.
Determining levels of search sophistication for users of a customer self-help system to personalize a content search user experience for the users and to increase a likelihood of user satisfaction with the search experience is a technical solution to a long-standing technical problem of content search user experience dissatisfaction and inefficient searching in customer self-help systems. Therefore, the disclosed embodiments do not represent an abstract idea for at least a few reasons. First, determining levels of search sophistication for users to personalize a content search user experience for the users is not an abstract idea because it is not merely an idea itself (e.g., cannot be performed mentally or using pen and paper). Indeed, some of the disclosed embodiments of determining levels of search sophistication include tracking clickstream data and using analytics models to determine search sophistication score data, which cannot be performed mentally. Second, determining levels of search sophistication for users of a customer self-help system to personalize a content search user experience for the users is not an abstract idea because it is not a fundamental economic practice (e.g., is not merely creating a contractual relationship, hedging, mitigating a settlement risk, etc.). Third, determining levels of search sophistication for users of a customer self-help system to personalize a content search user experience for the users is not an abstract idea because it is not a method of organizing human activity (e.g., managing a game of bingo). Rather, the disclosed embodiments analyze human behavior to determine characteristics of users that can be used to modify computing processes (e.g., the selection of one content search user experience over another). Fourth, although mathematics may be used to generate an analytics model, determining levels of search sophistication for users of a customer self-help system to personalize a content search user experience for the users is not simply a mathematical relationship/formula but is instead a technique for transforming search query data into data that is used to personalize a content search user experience for users, to increase a likelihood of causing users to more quickly or efficiently find answers to questions, without the use of live customer support, according to one embodiment.
Further, determining levels of search sophistication for users of a customer self-help system to personalize a content search user experience for the users is not an abstract idea because the disclosed techniques allow for significant improvement to the technical fields of user experience, self-help systems, customer service, and financial management systems, according to one embodiment. The present disclosure adds significantly to the field of content searching because the disclosed customer self-help system reduces the likelihood of redundant searches, reduces the likelihood of users seeking live customer support, and increases the likelihood of improving users' search experiences by providing one of a number of content search user experiences that is suited to the searching skills of the users, according to one embodiment.
As a result, embodiments of the present disclosure allow for reduced use of processor cycles, processor power, communications bandwidth, memory, and power consumption, by reducing the number of search queries submitted by users when searching for customer support content, according to one embodiment. Consequently, computing and communication systems implementing or providing the embodiments of the present disclosure are transformed into more operationally efficient devices and systems.
In addition to improving overall computing performance, personalizing a content search user experience for the users of a customer self-help system significantly improves the field of financial management systems, by increasing the likelihood that users will promptly resolve their own concerns that arise during the use of the financial management system, so that the users continue use of the financial management system that is supported by the customer self-help system, according to one embodiment. Furthermore, by personalizing a content search user experience for the users, the disclosed embodiments help maintain or build trust and therefore loyalty in the customer self-help system and in the financial management system with which it is associated, which results in repeat customers, efficient delivery of financial services, and reduced abandonment of use of the financial management system, according to one embodiment.
In accordance with an embodiment, a computing system implemented method determines levels of search sophistication for users of a customer self-help system to personalize a content search user experience for the users. The method includes providing, with one or more computing systems, a customer self-help system associated with a financial management system, according to one embodiment. The method includes storing, in memory dedicated for use by the customer self-help system, customer support content data, the customer support content data representing customer support content that is provided to users of the customer self-help system to enable users to resolve questions or concerns related to the financial management system, according to one embodiment. The method includes receiving, with the customer self-help system, request data representing a request to display a portion of the customer support content, according to one embodiment. The method includes identifying search query data from the request data, the search query data representing a search query from a user of the customer self-help system, according to one embodiment. The method includes providing analytics model data representing at least one analytics model, according to one embodiment. The method includes applying the search query data to the analytics model data to determine search sophistication characteristics data and search sophistication score data for the user at least partially based on the search query data, the search sophistication characteristics data representing search sophistication characteristics of the user; the search sophistication score data representing a search sophistication score, according to one embodiment. The method includes providing requested customer support content data concurrently with personalized content search user experience data, the requested customer support content data representing the portion of the customer support content and the personalized content search user experience data representing a selected one of a plurality of content search user experiences, the selected one of the plurality of content search user experiences being selected at least partially based on the search sophistication score data for the user, according to one embodiment.
In accordance with an embodiment, a system determines levels of search sophistication for users of a customer self-help system to personalize a content search user experience for the users. The system includes one or more processors, according to one embodiment. The system includes memory having data representing instructions which, if executed by the one or more processors, cause the one or more processors to perform a process for determining levels of search sophistication for users of a customer self-help system to personalize a content search user experience for the users, according to one embodiment. The process includes providing a customer self-help system associated with a financial management system, according to one embodiment. The process includes storing, in memory dedicated for use by the customer self-help system, customer support content data, the customer support content data representing customer support content that is provided to users of the customer self-help system to enable users to resolve questions or concerns related to the financial management system, according to one embodiment. The process includes receiving, with the customer self-help system, request data representing a request to display a portion of the customer support content, according to one embodiment. The process includes identifying search query data from the request data, the search query data representing a search query from a user of the customer self-help system, according to one embodiment. The process includes providing analytics model data representing at least one analytics model, according to one embodiment. The process includes applying the search query data to the analytics model data to determine search sophistication characteristics data and search sophistication score data for the user at least partially based on the search query data, the search sophistication characteristics data representing search sophistication characteristics of the user; the search sophistication score data representing a search sophistication score, according to one embodiment. The process includes providing requested customer support content data concurrently with personalized content search user experience data, the requested customer support content data representing the portion of the customer support content and the personalized content search user experience data representing a selected one of a plurality of content search user experiences, the selected one of the plurality of content search user experiences being selected at least partially based on the search sophistication score data for the user, according to one embodiment.
In accordance with an embodiment, a system determines levels of search sophistication for users of a customer self-help system to personalize a content search user experience for the users. The system includes a memory that stores customer self-help system data and customer support content data, the customer self-help system data representing a customer self-help system and the customer support content data representing customer support content for the customer self-help system, the customer self-help system being associated with and configured to support a financial management system represented by financial management system data, according to one embodiment. The system includes one or more processors communicatively coupled to the memory to execute the customer self-help system data to operate the customer self-help system, according to one embodiment. The system includes user experience page data representing a user experience page that is provided, by the customer self-help system, to a user of the customer self-help system in response to search query data submitted by the user, the search query data representing a search query, the user experience page data including requested customer support content data representing requested portions of the customer support content, the user experience page data including personalized content search user experience data representing a personalized content search user experience, according to one embodiment. The system includes user profile data for the user, the user profile data including search sophistication characteristics data representing search sophistication characteristics of the user and search sophistication score data, according to one embodiment. The system includes analytics model data representing an analytics model that determines the personalized content search user experience data at least partially based on applying the search query data to the analytics model data, by the customer self-help system, according to one embodiment.
In the discussion above, certain aspects of one embodiment include process steps or operations or instructions described herein for illustrative purposes in a particular order or grouping. However, the particular order or grouping shown and discussed herein are illustrative only and not limiting. Those of skill in the art will recognize that other orders or grouping of the process steps or operations or instructions are possible and, in some embodiments, one or more of the process steps or operations or instructions discussed above can be combined or deleted. In addition, portions of one or more of the process steps or operations or instructions can be re-grouped as portions of one or more other of the process steps or operations or instructions discussed herein. Consequently, the particular order or grouping of the process steps or operations or instructions discussed herein do not limit the scope of the invention as claimed below.
As discussed in more detail above, using the above embodiments, with little or no modification or input, there is considerable flexibility, adaptability, and opportunity for customization to meet the specific needs of various users under numerous circumstances.
The present invention has been described in particular detail with respect to specific possible embodiments. Those of skill in the art will appreciate that the invention may be practiced in other embodiments. For example, the nomenclature used for components, capitalization of component designations and terms, the attributes, data structures, or any other programming or structural aspect is not significant, mandatory, or limiting, and the mechanisms that implement the invention or its features can have various different names, formats, or protocols. Further, the system or functionality of the invention may be implemented via various combinations of software and hardware, as described, or entirely in hardware elements. Also, particular divisions of functionality between the various components described herein are merely exemplary, and not mandatory or significant. Consequently, functions performed by a single component may, in other embodiments, be performed by multiple components, and functions performed by multiple components may, in other embodiments, be performed by a single component.
Some portions of the above description present the features of the present invention in terms of algorithms and symbolic representations of operations, or algorithm-like representations, of operations on information/data. These algorithmic or algorithm-like descriptions and representations are the means used by those of skill in the art to most effectively and efficiently convey the substance of their work to others of skill in the art. These operations, while described functionally or logically, are understood to be implemented by computer programs or computing systems. Furthermore, it has also proven convenient at times to refer to these arrangements of operations as steps or modules or by functional names, without loss of generality.
Unless specifically stated otherwise, as would be apparent from the above discussion, it is appreciated that throughout the above description, discussions utilizing terms such as, but not limited to, “activating,” “accessing,” “adding,” “aggregating,” “alerting,” “applying,” “analyzing,” “associating,” “calculating,” “capturing,” “categorizing,” “classifying,” “comparing,” “creating,” “defining,” “detecting,” “determining,” “distributing,” “eliminating,” “encrypting,” “extracting,” “filtering,” “forwarding,” “generating,” “identifying,” “implementing,” “informing,” “monitoring,” “obtaining,” “posting,” “processing,” “providing,” “receiving,” “requesting,” “saving,” “sending,” “storing,” “substituting,” “transferring,” “transforming,” “transmitting,” “using,” etc., refer to the action and process of a computing system or similar electronic device that manipulates and operates on data represented as physical (electronic) quantities within the computing system memories, registers, caches or other information storage, transmission or display devices.
The present invention also relates to an apparatus or system for performing the operations described herein. This apparatus or system may be specifically constructed for the required purposes, or the apparatus or system can comprise a general purpose system selectively activated or configured/reconfigured by a computer program stored on a computer program product as discussed herein that can be accessed by a computing system or other device.
The present invention is well suited to a wide variety of computer network systems operating over numerous topologies. Within this field, the configuration and management of large networks comprise storage devices and computers that are communicatively coupled to similar or dissimilar computers and storage devices over a private network, a LAN, a WAN, or a public network, such as the Internet.
It should also be noted that the language used in the specification has been principally selected for readability, clarity and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, the disclosure of the present invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the claims below.
In addition, the operations shown in the FIG.s, or as discussed herein, are identified using a particular nomenclature for ease of description and understanding, but other nomenclature is often used in the art to identify equivalent operations.
Therefore, numerous variations, whether explicitly provided for by the specification or implied by the specification or not, may be implemented by one of skill in the art in view of this disclosure.