Dynamic user interface customization

Information

  • Patent Grant
  • Patent Number
    11,868,591
  • Date Filed
    Friday, January 28, 2022
  • Date Issued
    Tuesday, January 9, 2024
Abstract
Described are computer-based methods and apparatuses, including computer program products, for dynamic user interface customization. A set of functions for a user interface is stored, each of which can be added to the user interface. A first set of data is transmitted to a remote device that causes the user interface to be displayed on the remote device with a predetermined set of functions from the set of functions. Interaction data indicative of a user's interactions with the user interface is received. A second set of data is transmitted to the remote device that causes the user interface to dynamically add a new function from the set of functions to the user interface based on the interaction data, wherein the new function is displayed as a selectable item in the user interface that the user can select in order to use the new function.
Description
TECHNICAL FIELD

The technical field relates generally to computer-based methods and apparatuses, including computer program products, for dynamic user interface customization, and to dynamic communication and collaboration between businesses and consumers through a customizable user interface.


BACKGROUND

With the continued growth of internet use by consumers, live online help services have become increasingly important. Website providers can incorporate online chat options into their website page(s) to offer an additional level of customer service to their users (e.g., in addition to the more traditional fillable information request forms, frequently asked questions pages, etc.). For example, many websites include a “click to chat” option, which a website user can use to engage in interactive chat with a live help agent. When the user clicks the “click to chat” button, a chat user interface is presented to the website user, and through the interface, the user is able to exchange chat messages with the help agent. As another example, websites can include embedded code to automatically display a message to the user that includes a “click to chat” button (e.g., after the user has been idle on a page for a predetermined amount of time). There are several additional methods to engage a website user, and facilitate live interaction between the user and an agent. Regardless of the engagement method used, the help agent can interact with the user through the chat to answer the website user's questions, help navigate the user through the website, suggest products, and/or the like.


While online chat has become an increasingly common method used by website owners to serve users, online chat may not scale well to address each individual user's needs. For example, if the live agent cannot successfully help a website user via the chat, the chat user interface may not include sufficient functions and/or features to successfully address the user's problem. Further, website providers currently face limitations imposed by the chat user interface itself. For example, when current chat interfaces are used, it is generally not possible to add any specific functionality to address the individual needs of the website providers or website user.


Additionally, once a user is engaged with a particular communication channel (e.g., text chat, voice, video, etc.), it is usually difficult to switch to a different communication modality without starting a new engagement with the user. For example, once a user is engaged in an online chat session, a new communication channel (e.g., with separate user interfaces, equipment, etc.) is often required to change to a different communication modality (e.g., to set up a video chat instead of a standard text chat).


SUMMARY OF THE INVENTION

This disclosure provides a computer-implemented method, a computer-program product, and a computer-implemented system, each of which may be used to dynamically customize a user interface. The method may comprise the steps of: storing, at a computing device, multiple functions for a user interface, wherein each of the stored functions is configured to operate on a remote user device, in conjunction with the user interface, and wherein the user interface is configured to operate at the remote user device; transmitting a first set of data to the remote user device, wherein the first set of data causes a first one of the stored functions to operate on the remote user device, wherein operating on the remote user device includes operating in conjunction with the user interface; receiving, at the computing device, interaction data associated with user interactions with the user interface, the interactions occurring at the remote user device; identifying a second set of data based on the received interaction data, wherein the second set of data is identified at the computing device; and transmitting the second set of data from the computing device to the remote user device, wherein the second set of data causes a second one of the stored functions to operate on the remote user device, in conjunction with the user interface.


The system may include a processor which is configured to perform operations such as: storing, at a computing device, multiple functions for a user interface, wherein each of the stored functions is configured to operate on a remote user device, in conjunction with the user interface, and wherein the user interface is configured to operate at the remote user device; transmitting a first set of data to the remote user device, wherein the first set of data causes a first one of the stored functions to operate on the remote user device, wherein operating on the remote user device includes operating in conjunction with the user interface; receiving, at the computing device, interaction data associated with user interactions with the user interface, the interactions occurring at the remote user device; identifying a second set of data based on the received interaction data, wherein the second set of data is identified at the computing device; and transmitting the second set of data from the computing device to the remote user device, wherein the second set of data causes a second one of the stored functions to operate on the remote user device, in conjunction with the user interface.


The computer-program product may include instructions for causing a computing device to perform operations including: storing, at the computing device, multiple functions for a user interface, wherein each of the stored functions is configured to operate on a remote user device, in conjunction with the user interface, and wherein the user interface is configured to operate at the remote user device; transmitting a first set of data to the remote user device, wherein the first set of data causes a first one of the stored functions to operate on the remote user device, wherein operating on the remote user device includes operating in conjunction with the user interface; receiving, at the computing device, interaction data associated with user interactions with the user interface, the interactions occurring at the remote user device; identifying a second set of data based on the received interaction data, wherein the second set of data is identified at the computing device; and transmitting the second set of data from the computing device to the remote user device, wherein the second set of data causes a second one of the stored functions to operate on the remote user device, in conjunction with the user interface.


The computerized methods and apparatus disclosed herein allow dynamic customization of a user interface (e.g., a chat user interface) by adding interactive elements or functions (e.g., widgets) that tailor the user interface to a user's unique experience. A unified communication channel allows seamless integration among various communication modalities, such as chat, voice, and video communication channels. A brief summary of various exemplary embodiments is presented. Some simplifications and omissions may be made in the following summary, which is intended to highlight and introduce some aspects of the various exemplary embodiments, but not to limit the scope of the invention. Detailed descriptions of a preferred exemplary embodiment, adequate to allow those of ordinary skill in the art to make and use the inventive concepts, follow in later sections.


In one embodiment, a computerized method is featured. The computerized method is for dynamically customizing a user interface. The method includes storing, by a computing device, a set of functions for a user interface, whereby each function is configured so that it can be added to the user interface. The method includes transmitting, by the computing device, a first set of data to a remote device that causes the user interface to be displayed on the remote device with a predetermined subset of functions from the set of functions, wherein one or more functions from the predetermined subset of functions are displayed as a selectable item at the user interface, such that a user can select, activate or engage the function. The method includes receiving, by the computing device, interaction data indicative of a user's interactions with the user interface. The method includes transmitting, by the computing device, a second set of data to the remote device that causes the user interface to dynamically add a new function from the set of functions to the user interface based on the interaction data, wherein the new function is displayed as a selectable item in the user interface, such that the function may be selected for use by the user.


In another embodiment, a computer program product, tangibly embodied in a non-transitory computer readable medium, is featured. The computer program product includes instructions configured to cause a data processing apparatus to store a set of functions for a user interface, the set of functions including functions which can be added to the user interface. The computer program product includes instructions configured to cause a data processing apparatus to transmit a first set of data to a remote device that causes the user interface to be displayed on the remote device with a predetermined subset of functions from the set of functions, wherein one or more functions from the predetermined subset of functions are displayed as a selectable item in the user interface that a user can select to use the corresponding function. The computer program product includes instructions configured to cause a data processing apparatus to receive interaction data indicative of a user's interactions with the user interface. The computer program product includes instructions configured to cause a data processing apparatus to transmit a second set of data to the remote device that causes the user interface to dynamically add a new function from the set of functions to the user interface based on the interaction data, wherein the new function is displayed as a selectable item in the user interface that the user can select to use the new function.


In another embodiment, an apparatus is featured. The apparatus is for dynamically customizing a user interface and includes a processor and memory. The apparatus is configured to store a set of functions for a user interface, each of which can be added to the user interface. The apparatus is configured to transmit a first set of data to a remote device that causes the user interface to be displayed on the remote device with a predetermined set of functions from the set of functions, wherein one or more functions from the predetermined set of functions are displayed as a selectable item in the user interface that a user can select to use the corresponding function. The apparatus is configured to receive interaction data indicative of a user's interactions with the user interface. The apparatus is configured to transmit a second set of data to the remote device that causes the user interface to dynamically add a new function from the set of functions to the user interface based on the interaction data, wherein the new function is displayed as a selectable item in the user interface that the user can select to use the new function.


In other examples, any of the aspects above can include one or more of the following features. The user interface can be a chat window facilitating chat between the user and a third party. A function from the set of functions can include: an agent function that provides information about an agent that the user is in communication with; a shopping cart function that lists a set of products the user has selected while browsing a website; a data transfer function that downloads data to the user, allows a third party to push data to the user, or both; a video function that allows a user to control playback of video content; an audio function that allows a user to control playback of audio content; or any combination thereof.


In some examples, data indicative of a new function is received, wherein the new function customizes the user interface for a third party, and the new function is added to the set of functions. The interaction data can be transmitted to an agent device. Action data indicative of the user taking an action associated with a function on the user interface can be received, and the action data can be transmitted to the agent device.


In other examples, data selecting a function from the set of functions is received for addition to the user interface, and a third set of data is transmitted to the remote device that causes the user interface to dynamically add the selected function to the user interface, such that a function associated with the selected function is incorporated into the user interface, wherein prior to addition of the selected function to the user interface, the user interface did not include the function associated with the selected function. A third set of data can be transmitted to the remote device that causes the user interface to dynamically add a second new function from the set of functions to the user interface based on the interaction data, wherein the second new function is not displayed as a selectable item in the user interface. The second new function can listen for one or more events from the user interface.


In other examples, transmitting the second set of data to the remote device includes transmitting the second set of data based on data indicative of a change of the user interface, data indicative of user interface behavior, or any combination thereof. An interface can be configured to receive data to design a user interface experience. The user interface experience can include: a function; an interaction model between a first function and a second function; a behavior; a restriction for the user of the user interface, an agent the user is in communication with, or both; an automated interaction model; a set of permissions for the user; or any combination thereof.


The techniques described herein can be embodied in methods or apparatuses, and may provide or enable one or more of the following features. The techniques may allow functions to be dynamically added to, and/or removed from, the user interface, such that the user interface can be altered to suit an individual user and/or to solve problems or address needs associated with the user's interaction with the interface. Further, new functions can be designed for addition to the user interface to suit individual website provider needs. A single engagement with a customer can provide a unified communication channel that can seamlessly use any number of modalities to communicate with the customer. For example, various modalities such as chat communication channels, voice communication channels, video communication channels, and/or other communication channels can be switched among seamlessly during the single engagement with the customer.


Other aspects and advantages of the present invention will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, illustrating the principles of the invention by way of example only.





BRIEF DESCRIPTION OF THE DRAWINGS

The foregoing and other aspects, features, and advantages of the present invention, as well as the invention itself, will be more fully understood from the following description of various embodiments, when read together with the accompanying drawings.



FIG. 1 is an exemplary diagram illustrating a computerized system for dynamic user interface customization;



FIG. 2A illustrates an exemplary diagram of a user interface being dynamically customized;



FIG. 2B illustrates an exemplary diagram of a user interface being dynamically customized;



FIG. 3 illustrates an exemplary computerized method for dynamic user interface customization; and



FIG. 4 illustrates an exemplary computerized method for adding new functions for dynamic user interface customization.





DETAILED DESCRIPTION

In general, computerized systems and methods are provided for dynamically customizing a user interface through adding, removing, configuring or making available functionality, features, capabilities or options. In accordance with this disclosure, the customization of the user interface may be done such that, from the perspective of the interface user, all customized elements are completely integrated with the interface. For example, in the case of a user and an agent in communication over chat, the techniques, methods and systems of this disclosure may enable the agent (and/or the engagement service that established the communication between the user and the agent) to dynamically add and/or remove functions (e.g., either with or without graphical interface component(s)) to/from the user interface to customize the user interface for the particular user, without any action required on the user's part. For example, an agent may be able to add a desktop sharing function, a video function, and/or other functions to the interface, as described herein.



FIG. 1 is a diagram illustrating an example computerized system 100 for facilitating dynamic user interface customization through implementation of any combination of the techniques described herein. The system 100 includes a user device 110, a web server 120, an engagement server 130, and an agent device 140. As depicted, each of these components is connected for communication via network 150. The user device 110 includes a browser 112 and user interface 114. Engagement server 130 includes database 132 and customization component 134. The agent device 140 includes agent user interface 142. The system 100 also includes a third party server 160.


In accordance with this disclosure, a user device such as the device depicted at 110 may be a personal computer (e.g., a PC or a laptop) including a processor and memory. Alternatively, the user device 110 may be a smart phone, a personal digital assistant, a tablet PC, and/or any other computing device capable of displaying browser 112 and/or the user interface 114 to the user associated with the user device 110. The user device 110 may be a computing device capable of displaying web content using a web browser (e.g., the browser 112). The browser 112 may be implemented through software used by the user device 110 to display web content received from the web server 120 (e.g., web pages). For example, the web browser may be Microsoft Internet Explorer.


Although FIG. 1 depicts browser 112, this specific inclusion is for exemplary purposes only. In certain embodiments of this disclosure, the user device 110 may display user interface 114 without the interface being associated with a web browser. Additionally, while FIG. 1 only includes a single user device 110, the system can include multiple user devices. When such an arrangement is used, two or more users may participate in a joint engagement (i.e., more than two parties involved) with agent device 140 (or multiple agent devices, not shown). For example, two users can participate in an engagement that is moderated by an agent.


The user interface 114 may be, for example, a chat window facilitating chat between the user of the user device 110 and a third party (e.g., the agent operating agent device 140). For example, if the user encounters difficulties navigating a web page displayed by the browser 112, the user can initiate a chat help session (e.g., by clicking a “click to chat” button) with an agent who is operating agent device 140. In this way, the agent may be able to help the user properly navigate the web page. The user, via the user interface 114, can chat with the agent to learn how to navigate the web page.


In other example embodiments included within the scope of this disclosure, the user interface 114 may be, or may include, a video chat interface, an online help interface, or any other type of user interface. In certain implementations of the techniques presented herein, the user interface 114 may be configured not to be displayed on the user device 110 until the user of the user device 110 takes an initializing action (or until other predetermined criteria are satisfied). Additionally or alternatively, the user interface 114 may be configured such that the interface 114 enables the user to minimize, maximize, and/or control predetermined aspects of the user interface 114.


In accordance with this disclosure, the web server 120 may be, for example, a single web server with a processor and memory. In other embodiments, the web server 120 may comprise a plurality of web servers configured to provide web services (e.g., to serve web site content). The web server 120 may provide content which may be accessed at the user device 110 through utilization of browser 112.


The engagement server 130 can be, for example, a single server with a processor and memory. In some embodiments, the engagement server 130 may include multiple servers connected directly to each other, or connected through the network 150. The engagement server 130 can be configured to provide any number of technical solutions or capabilities. One such possible capability may be provided to an agent who, through operating agent device 140, provides customer service or assistance to a user operating user device 110. The solution provided by the engagement server 130 to the agent may involve providing the agent with capabilities associated with the agent's operation of agent device 140. With these capabilities, the agent may be able to provide improved or enhanced customer service, help, or recommendations to users (e.g., visitors to a website, such as user devices 110 loading web content provided by web server 120 via browser 112), or improved management of communicative connections with users. For example, the engagement server 130 can establish an online chat help session between the user device 110 and the agent device 140. The engagement server 130 may be configured to provide such a capability in response to a user clicking a “click to chat” button at user device 110, in response to a web page being displayed in the browser 112, or upon satisfaction of some other predetermined criteria established by code associated with browser 112.


Additionally or alternatively, the engagement server 130 may be configured to provide a number of services to remote users (e.g., users interacting or interfacing with a device such as user device 110). The customization component 134 may be configured to dynamically customize the user interface 114. The customization component 134 may be configured to use data from past user or agent engagements (e.g., historical data indicative of functions which were activated, used or displayed, or were otherwise associated with a user interface, and whether the engagement(s) associated with these functions were successful). The customization component 134 may be configured with capabilities for intelligently learning from the data how to optimize a particular user experience, based on previous successful engagements. The customization component 134 can process such data and use the data to customize user interfaces. This processing and customizing may involve using rules (e.g., stored in database 132), predictive analytics, multivariate analysis and testing, and/or other methods or analytics.


For example, in one aspect of the present disclosure, the engagement server 130 may receive data indicative of user actions occurring on the user device 110 (e.g., taken by a user of the user device 110). The customization component 134 may then use the received data to intelligently add and/or remove functions from the user interface 114. As another example, the engagement server 130 can receive data from the agent user interface 142. This data may include instructions to add and/or remove functions from the user interface 114, and may be used by the customization component 134 to execute appropriate action in response to the instructions.


The engagement server 130 can transmit data to the agent device 140 (e.g., to the agent user interface 142) that provides feedback about the user interface 114 and/or about a user's current or past experience with the interface 114. For example, engagement server 130 may provide functions or capabilities to an agent, so that the agent may push new functions and/or content to the user interface 114. The agent may be enabled to push these functions and/or content via agent user interface 142, in conjunction with network 150. However, the agent may need to know a current configuration of the user interface 114 before selecting a new function to be pushed to the user interface 114. In this case, the engagement server 130 may be configured to transmit data which describes a user's experience with user interface 114. The data may be transmitted to the agent device 140, thereby enabling the agent to understand the user's experience on the user interface 114, and/or the current configuration and/or functionality of the user interface 114. For example, the engagement server 130 can transmit a snapshot of the user interface 114 at a particular time. The snapshot may be transmitted to agent device 140, thereby enabling the agent to ascertain the user interface 114 configuration. The agent can use the data to determine what the user is experiencing through the user interface 114, and can use that determination to customize the user interface 114 by adding new functions to it.


In accordance with certain embodiments of this disclosure, the engagement server 130 may be configured to cause an agent user interface 142 to be displayed at agent device 140. This interface on agent device 140 may enable the agent to receive data pertinent to helping a user and/or customizing a user interface experience for the user associated with user interface 114. For example, once the engagement server 130 causes an agent user interface 142 to be displayed on the agent device 140, the engagement server 130 may then receive data associated with the agent's interactions or experience with agent user interface 142. Additionally or alternatively, the engagement server may receive data associated with a user's interactions or experience with user interface 114. The engagement server may be configured to use this data to appropriately customize agent user interface 142 or user interface 114.


As described herein, a user or agent experience with an interface may be understood to include, be characterized by, or be affected by, one or more functions operating in conjunction with the interface, an interaction model between a first function and a second function (e.g., which describes how the two functions interact with one another), user or agent behavior (e.g., a combination of user or agent interaction with one or more components or functions associated with or operating in conjunction with the interface), a restriction affecting a user and/or agent interacting with the interface, an automated interaction model (e.g., which may be executed by the engagement server 130 to determine when to dynamically alter the user interface 114 or agent user interface 142, based on previously collected data), a set of permissions for the user of the user device 110, or any other arrangement(s) recognizable, in light of this disclosure, to an artisan of ordinary skill in any one of the arts to which this disclosure pertains.


This paragraph will discuss one possible example implementation of certain of the techniques and methods disclosed herein. This example implementation is presented only for the purposes of demonstrating one way in which data related to a user experience may be used, in accordance with this disclosure, to customize an interface. In this example implementation, user behavior data may be used by engagement server 130 for purposes of customizing an agent user interface (such as the one shown at 142) or a user interface (such as the one shown at 114). In this case, engagement server 130 may be configured to install certain functions which operate on user device 110 or agent device 140, and in conjunction with user interface 114 or agent user interface 142, as the case may be. These particular functions, when installed or activated, may provide functionality based on, or in response to, user or agent interaction data associated with the user interface 114 or agent user interface 142. This interaction data may be received and processed by engagement server 130, and additionally, in some embodiments, may then be provided to agent device 140 by the engagement server 130.


Implementations consistent with the foregoing description may enable an agent to intelligently affect a user's experience or customize the user interface 114 based on the user's or agent's current behavior. For example, the engagement server 130 may be configured to provide an agent with information relevant to serving the user, whereby the provided information is selected by the engagement server 130 based on detection of a specific term provided by the user or agent in chat or in another type of communication session. A function operating on agent device 140 or user device 110 may be used to detect such a term. The function may inform engagement server 130 that the term has been used. The engagement server 130 may then apply a business rule to analyze the term and determine whether it is associated with a relevant product. Alternatively, the function may apply the rule to analyze the term. In either case, the function or the engagement server 130 could cause an additional function to be incorporated into the user interface 114 or agent user interface 142. The additional function could be configured to provide product information to the user or agent related to any product determined to be relevant, in view of the detected term and the business rule.
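

A minimal sketch of how such a term-detection function and business rule could interact is shown below. The identifiers (ProductRule, EngagementClient, addFunction) and the rule format are illustrative assumptions and not part of the patented system.

```typescript
// Hypothetical sketch: a chat-term detector that asks the engagement server
// to push a product-information widget when a business rule matches.

interface ProductRule {
  pattern: RegExp;      // term to look for in the chat transcript
  productId: string;    // product whose info widget should be offered
}

const rules: ProductRule[] = [
  { pattern: /\brunning shoes?\b/i, productId: "SKU-1234" },
];

interface EngagementClient {
  // Asks the server to add a stored function (widget) to the user interface.
  addFunction(name: string, config: Record<string, unknown>): Promise<void>;
}

async function onChatMessage(text: string, server: EngagementClient): Promise<void> {
  for (const rule of rules) {
    if (rule.pattern.test(text)) {
      // The rule matched: incorporate a product-info function into the UI.
      await server.addFunction("product-info", { productId: rule.productId });
    }
  }
}
```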


The engagement server 130 may be configured to include a database 132. In an embodiment of this disclosure, the engagement server 130 may be configured to use the database 132 to store feedback information indicative of the state of user interface 114. For example, the engagement server 130 can keep a log of all functions added to and/or removed from the user interface 114 (e.g., added automatically via the engagement server 130, and/or added via an agent through the agent user interface 142), and can be configured to use the log to determine a current state of the user interface 114.
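

One way the server might derive the current interface state from such a log is sketched below; the event shape and the replay logic are assumptions used only for illustration.

```typescript
// Hypothetical sketch: replaying an add/remove log to reconstruct the set of
// functions currently present on user interface 114.

type FunctionLogEvent = {
  functionId: string;
  action: "added" | "removed";
  timestamp: number;
};

function currentFunctions(log: FunctionLogEvent[]): Set<string> {
  const active = new Set<string>();
  // Replay events in chronological order; the final set is the current state.
  for (const event of [...log].sort((a, b) => a.timestamp - b.timestamp)) {
    if (event.action === "added") active.add(event.functionId);
    else active.delete(event.functionId);
  }
  return active;
}
```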


In some examples, the user interface 114 can include a function that transmits feedback information to the engagement server 130. For example, a function can be configured to periodically send data to the engagement server 130 indicative of a snapshot of the user interface 114 (e.g., at predetermined time intervals or upon request from the engagement server 130). Based on this snapshot, the agent may be provided with a detailed visual depiction of the visitor's experience, as well as a visual depiction of what the visitor sees at user interface 114. In this way, the system may suggest next steps to the agent based on the current view or experience of the visitor.
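

A browser-side function that periodically reports such a snapshot might look roughly like the following; the endpoint URL, interval, and payload fields are illustrative assumptions.

```typescript
// Hypothetical sketch: a widget that periodically posts a snapshot of the
// user interface state to the engagement server.

interface UiSnapshot {
  activeFunctions: string[];   // ids of widgets currently shown
  chatScrollPosition: number;  // example of a visual-state detail
  capturedAt: string;
}

function startSnapshotReporting(getSnapshot: () => UiSnapshot, intervalMs = 15000): number {
  return window.setInterval(() => {
    // "/engagement/snapshot" is an assumed endpoint on the engagement server.
    void fetch("/engagement/snapshot", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(getSnapshot()),
    });
  }, intervalMs);
}
```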


The database 132 stores, for example, the functions that can be added to and/or removed from the user interface 114. For example, a function can be an agent function that provides information about an agent (e.g., the person using the agent user interface 142 on the agent device 140, which may be in some sort of communication with the user device 110 via online chat, video chat, voice chat, etc.) that the user of the user device 110 is in communication with through the user interface 114. As another example, the function can be a shopping cart function that lists a set of products the user of the user device 110 has selected while browsing a website, using the browser 112, that is provided by the web server 120 (e.g., an online catalog). As another example, the function can include a data transfer function that downloads data to the user (e.g., a coupon), allows a third party to push data to the user (e.g., allows the agent device 140 to upload a file, document, presentation, work file, etc. to the user device 110), and/or the like. As another example, the function can be a video function that allows a user to control playback of video content (e.g., to play, pause, stop a video being streamed to the user). As another example, the function can be an audio function that allows a user to control playback of audio content (e.g., to play, pause, stop audio being streamed to the user). As another example, the function can be a social engagement function (e.g., Twitter, Facebook, etc.) that allows a user to push the engagement between it and the agent device 140 (e.g., an online chat) into the user's social engagement application to continue the experience in the user's social engagement application. As another example, the function can provide a service that affects the user's experience (e.g. a translation service). As another example, the function can be a secure information transfer function (e.g., which is compliant with the PCI Security Council standards for the exchange of credit card numbers) that allows transfer of Personal Identifiable Information (PII) over the communication channel (e.g., over chat).
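

The variety of functions described above suggests a common descriptor that a database such as database 132 could store for each one. The fields below are an illustrative assumption, not the patent's actual schema.

```typescript
// Hypothetical sketch: a descriptor for a storable user-interface function.

type FunctionKind =
  | "agent-info" | "shopping-cart" | "data-transfer"
  | "video" | "audio" | "social" | "translation" | "secure-transfer";

interface StoredFunction {
  id: string;
  kind: FunctionKind;
  displayName: string;
  hasVisualComponent: boolean;      // false for behind-the-scenes listeners
  scriptUrl: string;                // code the remote device loads to run it
  defaultConfig: Record<string, unknown>;
}

// Example entry for a video function (all values are assumed).
const videoFunction: StoredFunction = {
  id: "video-player",
  kind: "video",
  displayName: "Video",
  hasVisualComponent: true,
  scriptUrl: "https://example.invalid/functions/video-player.js",
  defaultConfig: { controls: ["play", "pause", "stop"] },
};
```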


In some examples, a function can be configured to detect events, and to take one or more actions based on the detected events. For example, a detected event may be triggered by a user's actions taken on the user device 110, the browser 112, the user interface 114, etc. Such events can also be events which occur in response to one or more other functions associated with the user interface 114, etc. The functions can be configured to send data to the engagement server 130 (and/or the agent user interface 142). For example, the functions can be configured to transmit data indicative of state changes and user interface interaction. For example, the video function can transmit data to the engagement server 130 indicative of the user's actions taken with respect to the video function (e.g., transmit information indicative of the user pressing play on the user interface 114, information indicative of the user pressing pause on the user interface 114, etc.).


This disclosure shall not be interpreted to be limited in scope by the mention or description of example functions presented herein. Rather, the functions specifically presented and described are included for example purposes only. This disclosure is intended to cover any and all functions which may expand, limit, alter, track, monitor, improve, document or otherwise affect a user experience associated with a user interface such as user interface 114. This disclosure is also intended to cover the many other types of related or applicable functions which would, in view of this disclosure, be readily recognizable to a person skilled in one or more of the arts to which this disclosure applies.


In some examples, the functions are added to the user interface 114 without changing the visual display of the user interface 114. Such functions may be thought of as behind-the-scenes functions with respect to the user interface 114. For example, a function can be added that tracks events and initiates responsive actions based on the detected events. For example, a function can detect a reception of data sent to the user device 110 from the agent device 140 (e.g. messages from the agent operating the agent user interface 142) and initiate actions based on the received data (e.g., transmitting a message in response to the received data). As another example, a function can be added to detect a user's interaction with other functions of the user interface 114. This may enable the detecting function to initiate actions which are determined to be appropriate based on the user's interactions with the user interface 114.
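

A behind-the-scenes function of this kind might be sketched as follows; the event bus, event names, and auto-acknowledgement behavior are assumptions used only to illustrate a non-visual listener.

```typescript
// Hypothetical sketch: a non-visual function that listens for messages pushed
// from the agent device and reacts without changing the visible interface.

interface AgentMessage { type: string; payload: unknown; }

type MessageHandler = (message: AgentMessage) => void;

// Assumed minimal event bus exposed by the user-interface framework.
interface UiEventBus {
  on(event: "agent-message", handler: MessageHandler): void;
  emit(event: "user-ack", data: unknown): void;
}

function installSilentListener(bus: UiEventBus): void {
  bus.on("agent-message", (message) => {
    if (message.type === "coupon-offer") {
      // Acknowledge receipt back to the server/agent; nothing is rendered.
      bus.emit("user-ack", { received: "coupon-offer", at: Date.now() });
    }
  });
}
```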


The agent device 140 can be, for example, a personal computer (e.g., a PC or a laptop) which includes a processor and memory. Alternatively, and in accordance with this disclosure, the agent device 140 may be a smart phone, a personal digital assistant, a tablet PC, and/or any other computing device capable of providing the agent user interface 142 and/or operations or processing associated with it. The agent user interface 142 may be configured so that the agent interfacing with the agent device 140 is able to control, activate, remove, and/or invoke functions provided by the engagement server 130. For example, the agent interface 142 may be configured so that the agent has the option of activating a chat help session involving the user device 110. An example agent console is described in U.S. patent application Ser. No. 13/413,197, which was filed on Mar. 6, 2012, entitled “Occasionally-Connected Computing Interface,” and which is incorporated by reference herein in its entirety. The agent user interface 142 can also display a detailed visual “playback” of historical user experience data (e.g., historical charts and/or graphs of function usage, success rates for functions, etc.). The historical user experience data can include, for example, historical data collected from previously-deployed user interfaces, such as which functions were used for the user interface and whether the engagement was successful (e.g., whether an agent was able to solve the user's problem via the user interface). The historical data playback can be used, for example, for backtracking and analysis capabilities, as well as the use of Natural Language Processing (NLP) (e.g., which can analyze text in a text chat) to identify correlations and insights about functions (or user interface configurations) by examining the functions and/or the engagement experience. While the terms “agent” and “agent device” are used herein, the terms should be interpreted broadly to include any end user, such as a typical agent as well as a user similar to the one using the user device 110. As another example, an agent can be a business ambassador for a company (e.g., a representative or spokesperson for the company).


Network 150 can be, for example, a packet-switching network that can forward packets to other devices based on information included in the packet.


Third party server 160 can provide services for functions added to the user interface 114 (e.g., in addition to those services provided by the engagement server 130). The engagement server 130 can be configured to incorporate technologies from the third party server 160 (and/or other third party servers, not shown), which can add to the robustness of the experience presented to the user through the user interface 114. The engagement server 130 can incorporate disparate technologies and/or applications into the user interface 114 (e.g., much like an operating system).


The system 100 is an example of a computerized system that is configured to perform the methods described herein. However, the system structure and content recited with regards to FIG. 1 is presented for exemplary purposes only and is not intended to limit this disclosure to implementations involving the specific structure shown in FIG. 1. As will be apparent to one of ordinary skill in the art, many recognizable system structures can be used to implement the techniques and methods described herein, without departing from the scope of this disclosure. For example, a web server 120, while included for illustrative purposes, may be omitted without departing from the spirit of the invention. As another example, a plurality of user devices and/or agent devices (not shown) may be used in the system 100.


In addition, information may flow between the elements, components and subsystems described herein using any technique. Such techniques include, for example, passing the information over the network using standard protocols, such as TCP/IP, passing the information between modules in memory and passing the information by writing to a file, database, or some other non-volatile storage device. In addition, pointers or other references to information may be transmitted and received in place of, or in addition to, copies of the information. Conversely, the information may be exchanged in place of, or in addition to, pointers or other references to the information. Other techniques and protocols for communicating information may be used without departing from the scope of the invention.



FIGS. 2A-2B depict an example of user device components involved in dynamic customization of a user interface, in accordance with certain of the methods disclosed herein. FIG. 2A depicts a user device 202, a web browser 204, and a user interface 206. Furthermore, as depicted in FIG. 2A, user interface 206 includes functions 208A, 208B. The user interface 206 also includes a chat console 210.



FIG. 2B depicts each of the aforementioned components shown in FIG. 2A, and also includes function 208C, which will be explained in greater detail in following paragraphs. FIGS. 2A and 2B are used for illustrative purposes only. In accordance with this disclosure, a user interface may include any number of additional and/or alternative functions and components. For example, a user interface such as user interface 206 may include other interactive components or features in addition to, or instead of, chat console 210. Also, a user interface need not include any active functions (e.g., the user interface may include simply a list of functions which are selectable by the user). The user interface 206 may be an interface associated with a computer to computer connection, a video display (e.g., with notification that the video is being watched), and/or any other type(s) of engagement or communication interface (e.g., an interface linking the agent device 140 and the user device 110). For example, certain user interfaces which are within the scope of this disclosure are described in U.S. patent application Ser. No. 13/371,163, entitled “Analytic Driven Engagement,” filed on Feb. 10, 2012, which addresses analytic driven engagement and is incorporated by reference herein in its entirety. In some examples, the initial user interface 206 is not displayed on the web browser 204 until the user device 110 or engagement server 130 determines the user interface 206 should be displayed (e.g., the engagement server 130 may make such a determination based on interaction data, as is described further below).



FIG. 3 illustrates an example computerized method 300 for dynamic user interface customization in accordance with the present disclosure. The discussion of the method depicted in FIG. 3 will refer back to previous FIGS. 1, 2A and 2B, as these previous figures depict components and elements which may be involved in certain of the method steps described in FIG. 3.


As depicted in FIG. 3, at step 302, the engagement server 130 stores a set of functions which may be installed or activated on a user interface (e.g., for user interface 114). In certain embodiments, the user interface may, but need not necessarily, be displayed in a browser such as the one depicted at 112. At step 304, the engagement server 130 transmits a first set of data to user device 110. This transmitted data causes the user interface 114 to be displayed on the user device 110 such that the interface displays a predetermined set of the functions stored at engagement server 130. The data causes these displayed functions (e.g., functions 208A and 208B, as shown in FIG. 2A) to be incorporated into the user device 110. At step 306, the engagement server 130 receives interaction data from the user device 110. As depicted, the interaction data is indicative of a user's interactions with the browser 112 and/or the user interface 114. At step 308, the engagement server 130 transmits a second set of data to the user device 110. The second set of data is selected by the engagement server 130 based on the interaction data, and causes the user interface 114 to dynamically add a new function from the stored set to the user interface 114.
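

A compressed server-side sketch of steps 302 through 308 appears below; the class, handler names, and transport details are assumptions intended only to show the order of operations.

```typescript
// Hypothetical sketch of method 300 on the engagement server:
// store functions (302), send the initial set (304), receive interaction
// data (306), then send a second set that adds a new function (308).

interface RemoteDevice {
  send(data: { functions: string[] }): Promise<void>;
}

class EngagementServerSketch {
  private stored = new Map<string, string>();          // step 302: function id -> script

  storeFunction(id: string, script: string): void {
    this.stored.set(id, script);
  }

  async sendInitialInterface(device: RemoteDevice, initial: string[]): Promise<void> {
    await device.send({ functions: initial });         // step 304
  }

  async onInteractionData(device: RemoteDevice, interaction: { pagesViewed: number }): Promise<void> {
    // Step 306: interaction data arrives; step 308: pick and push a new function.
    if (interaction.pagesViewed > 5 && this.stored.has("video-chat")) {
      await device.send({ functions: ["video-chat"] });
    }
  }
}
```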


Referring to step 302, each function stored at the engagement server 130 may be a function which can be added to the user interface 114. For example, the stored functions may be functions such as the functions described above with respect to FIG. 1. These functions include video playback, audio playback, file transfer, and/or any other function which could be configured to be incorporated into the user interface 114.


Referring to step 304, the first set of data (e.g., data which determines which functions will be initially included in the user interface 206) can be predetermined and stored in a configuration file. The engagement server 130 can store such a configuration file in a database such as database 132, for example. The configuration file can be configured for a particular customer of the engagement server 130. For example, the customer may be a website provider (e.g., web server 120, which provides content that can be loaded by a browser such as the browser depicted at 112). The website provider can configure its website, such that, when the website is requested by the browser 112, a code module is loaded in the browser 112. The code module, when loaded, may then control the time at which the user interface 114 is displayed to the user of the user device 110. For example, a code module of this type may prevent the user interface 114 from being displayed until after the user clicks a “click to chat” button, or until after the user has remained at a particular web page in browser 112 for a predetermined period of time (e.g., ten seconds), and/or the like.
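

The configuration file and the display-timing code module might look roughly as follows; the configuration keys and the ten-second idle trigger are assumptions drawn from the example in the text.

```typescript
// Hypothetical sketch: a per-customer configuration describing the initial
// function set and when the interface should appear.
const customerConfig = {
  customerId: "acme-store",
  initialFunctions: ["chat-console", "agent-info"],
  displayTrigger: { type: "idle", afterMs: 10_000 },   // or a "click-to-chat" trigger
};

// Code module loaded with the web page: shows the interface after the
// configured idle period, mirroring the "ten seconds on the page" example.
function installDisplayTrigger(showInterface: () => void): void {
  if (customerConfig.displayTrigger.type === "idle") {
    window.setTimeout(showInterface, customerConfig.displayTrigger.afterMs);
  }
}
```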


Referring further to step 304, the first set of data need not cause actual display of functions on the user interface 114. For example, the engagement server 130 can transmit data to the user device 110 which causes the user interface 114 to dynamically incorporate or activate a function such that the new function is not displayed in the user interface (e.g., the function adds functionality to the user interface 114 without requiring a displayed component). For example, such a function can include JavaScript code executable by the user's browser 112 for monitoring and storing information related to user interactions with websites loaded using the browser 112. As another example, the function can include JavaScript code (e.g., executed by the browser 112) for detecting one or more events associated with the user interface 114 (e.g., button presses, menu-item selections, checkbox selections, and/or other graphical user interface interactions).
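

Such a non-displayed monitoring function might resemble the sketch below; the buffered-event shape and flush endpoint are illustrative assumptions.

```typescript
// Hypothetical sketch: browser-side monitoring with no visual component.
// It records clicks and selections and periodically flushes them to the server.

interface UiEvent { kind: string; target: string; at: number; }

const buffer: UiEvent[] = [];

function startMonitoring(): void {
  document.addEventListener("click", (e) => {
    const el = e.target as HTMLElement;
    buffer.push({ kind: "click", target: el.id || el.tagName, at: Date.now() });
  });
  document.addEventListener("change", (e) => {
    const el = e.target as HTMLElement;
    buffer.push({ kind: "change", target: el.id || el.tagName, at: Date.now() });
  });
  window.setInterval(() => {
    if (buffer.length === 0) return;
    // "/engagement/interactions" is an assumed engagement-server endpoint.
    void fetch("/engagement/interactions", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(buffer.splice(0)),
    });
  }, 10_000);
}
```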


Referring further to step 304, certain of the function(s) in the set of functions stored at engagement server 130 may be displayed for selection within the user interface 114. In this way, a user operating user device 110 may select the function for use, activation or installation when it is displayed. Displaying a function for selection may include, for example, displaying a clickable icon, menu item, checkbox, and/or any other graphical user interface component which can be used or invoked by a user.


Referring to step 306, the engagement server 130 can receive interaction data indicative of the user's interactions with the browser 112 and/or the user interface 114. For example, the browser 112 can include a code module (not shown) which executes in the browser 112 to track and store the user's navigation or search history associated with the user's use of browser 112. The code module can cause this stored interaction data to be transmitted to the engagement server (e.g., on command, periodically, etc.). As another example, the user interface 114 can include a code module which monitors a user's interactions with the user interface 114 (via the user device 110).


Referring further to step 306, the engagement server 130 can use the interaction history data to determine when to add and/or remove functions from the user interface 114 (while method 300 addresses adding functions, functions can also be removed from the user interface 114). For example, the engagement server 130 can reference a stored set of rules which describe when to add a function to the user interface 114. The engagement server can use the rules by processing the interaction data in light of the rules. In this way, the engagement server 130 can use the rules (and/or other similar forms of artificial intelligence) to determine which functions are added and/or removed from the user interface 114 (to provide the user with the best possible experience). In some examples, the engagement server 130 uses data indicative of a change of the user interface 114 (e.g., data that is transmitted between the user interface 114 and the engagement server 130 that provides constant updates about what is occurring with the user interface 114 on the user device 110), data indicative of user interface behavior (e.g., interaction among functions), or both, to determine when to add and/or remove functions from the user interface 114.
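

One possible shape for such a rule set, and for evaluating it against incoming interaction data, is sketched below; the rule fields and example conditions are assumptions rather than rules actually stored by the system.

```typescript
// Hypothetical sketch: declarative rules the engagement server could evaluate
// against interaction data to decide which functions to add or remove.

interface InteractionData {
  idleSeconds: number;
  chatMessageCount: number;
  functionsInUse: string[];
}

interface CustomizationRule {
  description: string;
  applies(data: InteractionData): boolean;
  action: { add?: string[]; remove?: string[] };
}

const ruleSet: CustomizationRule[] = [
  {
    description: "Long chat with no resolution: offer a video escalation",
    applies: (d) => d.chatMessageCount > 20 && !d.functionsInUse.includes("video-chat"),
    action: { add: ["video-chat"] },
  },
  {
    description: "User idle: remove unused audio controls",
    applies: (d) => d.idleSeconds > 300 && d.functionsInUse.includes("audio-player"),
    action: { remove: ["audio-player"] },
  },
];

// Returns the add/remove actions whose conditions currently hold.
function evaluateRules(data: InteractionData) {
  return ruleSet.filter((r) => r.applies(data)).map((r) => r.action);
}
```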


As another example, the engagement server 130 can transmit the interaction data to the agent device 140. An operator of the agent device 140 can use the agent user interface 142 to view the interaction data to determine when to add/remove functions from the user interface 114 (e.g., if the user clicked on a video link displayed using a video function, if the user started playback of the video using the video function, if the user paused playback of the video using the video function, etc.). The operator can transmit a signal to the engagement server 130 to cause the engagement server 130 to transmit a new function to the user device 110 for incorporation into the user interface 114.


Referring to step 308, the user device 110 dynamically adds the new function to the user interface 114. As described above, the new function may include a visual aspect (e.g., a checkbox, menu item, button, icon, etc.). In some examples, the new function is displayed as a selectable item in the user interface 114 that the user can select to use a function associated with the new function (e.g., new function 208C of FIG. 2B, which was newly added to the interface 206 from FIG. 2A, which only includes functions 208A and 208B). For example, once a function is added to the user interface 114, the user can invoke the functionality of the new function by clicking an icon associated with the function.
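

On the user device, dynamically adding a function as a selectable item could be as simple as the following sketch; the container element id and the on-demand loader behavior are assumptions.

```typescript
// Hypothetical sketch: the user interface receives the second set of data and
// renders the new function as a clickable icon, loading its code on demand.

interface NewFunctionPayload {
  id: string;
  displayName: string;
  iconUrl: string;
  scriptUrl: string;
}

function addSelectableFunction(payload: NewFunctionPayload): void {
  const container = document.getElementById("ui-function-bar"); // assumed element
  if (!container) return;

  const icon = document.createElement("img");
  icon.src = payload.iconUrl;
  icon.alt = payload.displayName;
  icon.title = payload.displayName;
  icon.addEventListener("click", () => {
    // Load and run the function's code only when the user selects it.
    const script = document.createElement("script");
    script.src = payload.scriptUrl;
    document.head.appendChild(script);
  });
  container.appendChild(icon);
}
```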


As an example of method 300, referring to FIGS. 2A and 2B, assume an agent is conducting an online chat with a user of the user device 202 (e.g., a situation in which an online chat involves the agent user interface 142 of agent device 140, and the user interface 206 includes the chat console 210) to help the user navigate a website loaded in the browser 112. In this case, the agent user interface 142 may display the chat information entered by the user via the chat console 210. The agent on the agent device 140 can determine, for example, that it is most beneficial to play a movie for the user. In accordance with the techniques disclosed herein, the agent, by using the agent user interface 142, may be able to send a command to the engagement server 130. The command may cause a new function, such as function 208C, to be loaded in the user interface 206. In this way, the user may be provided with the ability to control playback of the desired video content. For example, function 208C can include playback controls (e.g., pause, fast forward, rewind, etc.) that the user of the user device 110 can use to control playback of the video content. The function 208C can include a listening component (and/or the engagement server 130 can send a second function to the user device 110) that monitors which of the playback controls are used, and transmits information indicative of the same to the agent user interface 142. For example, when the user invokes, using the user interface 114, the play button, the function transmits data to the agent device 140 indicative of the user starting playback of the video content.
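

The listening component described above could report playback actions along these lines; the control names, reporting endpoint, and callback wiring are illustrative assumptions.

```typescript
// Hypothetical sketch: playback controls that notify the agent (via the
// engagement server) whenever the user plays, pauses, or stops the video.

type PlaybackAction = "play" | "pause" | "stop";

function wirePlaybackReporting(
  video: HTMLVideoElement,
  report: (action: PlaybackAction) => void,
): void {
  video.addEventListener("play", () => report("play"));
  video.addEventListener("pause", () => report("pause"));
  video.addEventListener("ended", () => report("stop"));
}

const reportToAgent = (action: PlaybackAction): void => {
  // "/engagement/video-events" is an assumed endpoint that relays the action
  // to agent user interface 142.
  void fetch("/engagement/video-events", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ action, at: Date.now() }),
  });
};

// Usage: wirePlaybackReporting(videoElement, reportToAgent);
```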


As another example, referring to FIGS. 2A-2B, the engagement server 130 establishes a chat communication between the user of the user device 202 (via the chat console 210) and an agent (e.g., via agent user interface 142 of FIG. 1). The user begins asking the agent about the website loaded in the web browser 204 (e.g., the user is having trouble navigating the website). It becomes apparent through the chat conversation that the agent can better assist the user by viewing the same webpage; however, the user interface 206 does not include a desktop sharing function. Therefore, the agent causes the engagement server 130 to add a desktop sharing function to the user interface 206 (e.g., function 208C). The user can invoke the desktop sharing function by selecting the function (or, in some examples, the user need not take any action to invoke the function). The desktop sharing function can share the user's screen with the agent, so that the agent can guide the user through navigating the web page on the user's web browser 204.


The agent can add (and/or remove) any number of functions to (or from) the user interface 206 (e.g., video playback, file transfers, etc.). For example, if the agent is still having trouble helping the user, the agent can add a video function so the conversation can escalate to a video chat. The entire experience resides in the user interface, which can change form to best address the user's problem. For example, while systems often use separate channels for online chat, voice, and video communications, the user interface 206 can provide a unified communication channel that allows the agent to seamlessly switch among different communication modalities with the user. The communication modalities can also include traditional communication channels (e.g., phone communications over the PSTN). For example, the agent can switch from chat to voice communication, and then switch again from voice to video communication, all while using the same user interface 206. The agent can add and/or remove the additional communication modalities by, for example, adding and/or removing functions from the user interface 206. For example, if an agent determines that the engagement needs to move from a chat communication to a voice communication, the agent can add a voice communication function (e.g., via the agent user interface 142) to the user interface 206 such that the user interface 206 can provide both the chat communication and the voice communication to the user of the user device 110 (e.g., the user can click the newly-added voice communication function to engage in voice communication with the agent without opening any additional interfaces or taking any further actions on the user device 110).



FIG. 4 illustrates an exemplary computerized method 400 for adding new functions for dynamic user interface customization. At step 402, the engagement server 130 receives data indicative of a new function. At step 404, the engagement server 130 adds the new function to the set of functions (e.g., adds the function to the database 132 for storage) so that the new function can be incorporated into a user interface. At step 406, the engagement server 130 receives data selecting the new function for addition to a user interface. At step 408, the engagement server 130 transmits data to the remote device (e.g., to the user device 110) that causes the user interface (e.g., user interface 114) to dynamically add the selected function to the user interface, such that a function associated with the selected function is incorporated into the user interface, wherein prior to addition of the selected function to the user interface, the user interface did not include the function associated with the selected function.


Referring to step 402, new functions can be added to the engagement server on behalf of a third party, such that the third party can use those functions to customize the user interface based on its needs. For example, if the third party is a new user of the engagement server 130 dynamic customization services, the third party can create new and/or additional functions designed specifically for its needs. For example, if the third party would like to use a video function but no suitable video function exists (e.g., there are no video functions stored in the database 132, and/or the video function(s) stored in the database 132 do not include the desired functionality), the third party can create a new video function that includes all the functionality desired by the third party (e.g., via the agent user interface 142 of FIG. 1).
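As a purely illustrative example, a third party's custom video function might be described by a definition such as the following TypeScript object; the fields shown are assumptions and do not represent a documented schema of the engagement server.

// Hypothetical definition a third party could submit (e.g., through the
// agent user interface or an equivalent API) so the function becomes
// available for later addition to user interfaces.
const customVideoFunction = {
  functionId: "acme-video-chat",
  label: "Start a video session",
  modality: "video",
  options: {
    maxResolution: "720p",
    recordSession: false,
    brandingLogoUrl: "https://example.com/logo.png",
  },
};

console.log(`registering custom function: ${customVideoFunction.functionId}`);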


The dynamic customization systems and methods described herein provide flexibility for designing functions and customizing user interfaces to dynamically add and/or remove functions (e.g., during use of the user interface, without any action required on the GUI user's part). Any function can be configured to communicate with any other function(s) running in the user interface framework, and therefore the functions can be used to design any kind of desired behavior. The communication and signaling between the functions and/or the environment (e.g., the web browser, user interface, etc.) can be managed in a pluggable way. For example, a predefined language set can be implemented to support the communication and signaling. A user of the service can implement functions by designing custom callback procedures and callback names to signal between, for example, two functions that the user adds to the engagement server 130.
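The following TypeScript sketch illustrates, under stated assumptions, one pluggable way that functions could register and signal named callbacks to one another within the user interface framework; the FunctionBus name and its API are hypothetical.

// Each function registers callback names it is willing to handle, and any
// other function (or the environment) can signal that callback by name.
type Callback = (payload: unknown) => void;

class FunctionBus {
  private callbacks = new Map<string, Callback[]>();

  // Register a callback under a name chosen by the function designer.
  on(callbackName: string, cb: Callback): void {
    const list = this.callbacks.get(callbackName) ?? [];
    list.push(cb);
    this.callbacks.set(callbackName, list);
  }

  // Signal every callback registered under that name.
  signal(callbackName: string, payload: unknown): void {
    for (const cb of this.callbacks.get(callbackName) ?? []) cb(payload);
  }
}

// Usage: a chat function tells a video function to start when escalating.
const bus = new FunctionBus();
bus.on("video.start", (payload) => console.log("video function starting", payload));
bus.signal("video.start", { reason: "escalated from chat" });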


The dynamic customization systems and methods can measure the success of an engagement. For example, it may be desirable to measure how successful certain engagements are with users (e.g., where a successful engagement is measured by whether a user's problem was solved, whether the user was satisfied with the engagement, etc.). As an example, the system may determine that engagements that start with chat communication and then escalate to voice communication are more successful at solving the user's problems than those engagements that only use chat communication. As another example, it may be desirable to measure how often a function is used in a successful user engagement. For example, an agent may be more likely to add a particular function to the user interface 206 if it has a higher rate of success than other functions.
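As an illustrative sketch only, per-function success rates could be computed from engagement records along the following lines; the record fields are assumptions rather than the system's actual measurement schema.

// Count, for each function, how often it appeared in an engagement and how
// often that engagement was marked successful, then derive a success rate.
interface EngagementRecord {
  functionsUsed: string[];   // e.g., ["chat", "voice"]
  successful: boolean;       // e.g., problem solved / user satisfied
}

function successRateByFunction(records: EngagementRecord[]): Map<string, number> {
  const used = new Map<string, number>();
  const succeeded = new Map<string, number>();
  for (const r of records) {
    for (const f of r.functionsUsed) {
      used.set(f, (used.get(f) ?? 0) + 1);
      if (r.successful) succeeded.set(f, (succeeded.get(f) ?? 0) + 1);
    }
  }
  const rates = new Map<string, number>();
  used.forEach((count, f) => rates.set(f, (succeeded.get(f) ?? 0) / count));
  return rates;
}

// Example: a voice-escalated engagement succeeds; a chat-only one does not.
const rates = successRateByFunction([
  { functionsUsed: ["chat"], successful: false },
  { functionsUsed: ["chat", "voice"], successful: true },
]);
console.log(rates);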


The user interface experience can be used on any device. For example, if a user is interacting with the user interface on a home computer, the user can continue the experience on a mobile phone. The agent can send a code to the home computer (e.g., using a code function), and the user can scan the code with a mobile device by taking a picture of it with a code converting application. After scanning the code, the code converting application can re-create the user interface 114 on the user's mobile device (and/or the interaction, such as a chat, the user was engaged in with the agent). The user can then continue the interaction on the mobile phone.
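The following TypeScript sketch illustrates, hypothetically, how session state could be encoded into a payload suitable for embedding in a scannable code and decoded on the mobile device; the SessionState fields and the encoding are assumptions for illustration only.

// On the home computer: serialize the session state so it can be rendered
// as a scannable code. On the mobile device: decode it and ask the
// engagement server to rebuild the same interface.
interface SessionState {
  engagementId: string;
  activeFunctionIds: string[];   // e.g., ["chat", "desktop-sharing"]
  transcriptCursor: number;      // where to resume the chat transcript
}

function encodeSession(state: SessionState): string {
  return encodeURIComponent(JSON.stringify(state));
}

function decodeSession(encoded: string): SessionState {
  return JSON.parse(decodeURIComponent(encoded)) as SessionState;
}

const encoded = encodeSession({
  engagementId: "engagement-123",
  activeFunctionIds: ["chat"],
  transcriptCursor: 42,
});
console.log(decodeSession(encoded));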


The engagement server 130 can provide a modular, personalized service to individual users. For example, the engagement server 130 can associate a user with a brand, and the brand can provide the user with an "assistant" (e.g., a virtual assistant that goes everywhere with the user). If the user asks the virtual assistant a question, the assistant can initiate an online chat window with agents working for the brand (e.g., a window that can expand into a movie, etc., based on the user's interactions with the agents).


The above-described techniques can be implemented in digital and/or analog electronic circuitry, or in computer hardware, firmware, software, or in combinations of them. The implementation can be as a computer program product, i.e., a computer program tangibly embodied in a machine-readable storage device, for execution by, or to control the operation of, a data processing apparatus, e.g., a programmable processor, a computer, and/or multiple computers. A computer program can be written in any form of computer or programming language, including source code, compiled code, interpreted code and/or machine code, and the computer program can be deployed in any form, including as a stand-alone program or as a subroutine, element, or other unit suitable for use in a computing environment. A computer program can be deployed to be executed on one computer or on multiple computers at one or more sites.


Method steps can be performed by one or more processors executing a computer program to perform functions of the invention by operating on input data and/or generating output data. Method steps can also be performed by, and an apparatus can be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array), an FPAA (field-programmable analog array), a CPLD (complex programmable logic device), a PSoC (Programmable System-on-Chip), an ASIP (application-specific instruction-set processor), or an ASIC (application-specific integrated circuit). Subroutines can refer to portions of the computer program and/or the processor/special circuitry that implement one or more functions.


Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital or analog computer. Generally, a processor receives instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for executing instructions and one or more memory devices for storing instructions and/or data. Memory devices, such as a cache, can be used to temporarily store data. Memory devices can also be used for long-term data storage. Generally, a computer also includes, or is operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. A computer can also be operatively coupled to a communications network in order to receive instructions and/or data from the network and/or to transfer instructions and/or data to the network. Computer-readable storage devices suitable for embodying computer program instructions and data include all forms of volatile and non-volatile memory, including by way of example semiconductor memory devices, e.g., DRAM, SRAM, EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and optical disks, e.g., CD, DVD, HD-DVD, and Blu-ray disks. The processor and the memory can be supplemented by and/or incorporated in special purpose logic circuitry.


To provide for interaction with a user, the above described techniques can be implemented on a computer in communication with a display device, e.g., a CRT (cathode ray tube), plasma, or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse, a trackball, a touchpad, or a motion sensor, by which the user can provide input to the computer (e.g., interact with a user interface element). Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, and/or tactile input.


The above described techniques can be implemented in a distributed computing system that includes a back-end component. The back-end component can, for example, be a data server, a middleware component, and/or an application server. The above described techniques can be implemented in a distributed computing system that includes a front-end component. The front-end component can, for example, be a client computer having a graphical user interface, a Web browser through which a user can interact with an example implementation, and/or other graphical user interfaces for a transmitting device. The above described techniques can be implemented in a distributed computing system that includes any combination of such back-end, middleware, or front-end components.


The computing system can include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.


The components of the computing system can be interconnected by any form or medium of digital or analog data communication (e.g., a communication network). Examples of communication networks include circuit-based and packet-based networks. Packet-based networks can include, for example, the Internet, a carrier internet protocol (IP) network (e.g., local area network (LAN), wide area network (WAN), campus area network (CAN), metropolitan area network (MAN), home area network (HAN)), a private IP network, an IP private branch exchange (IPBX), a wireless network (e.g., radio access network (RAN), 802.11 network, 802.16 network, general packet radio service (GPRS) network, HiperLAN), and/or other packet-based networks. Circuit-based networks can include, for example, the public switched telephone network (PSTN), a private branch exchange (PBX), a wireless network (e.g., RAN, Bluetooth, code-division multiple access (CDMA) network, time division multiple access (TDMA) network, global system for mobile communications (GSM) network), and/or other circuit-based networks.


Devices of the computing system and/or computing devices can include, for example, a computer, a computer with a browser device, a telephone, an IP phone, a mobile device (e.g., cellular phone, personal digital assistant (PDA) device, laptop computer, electronic mail device), a server, a rack with one or more processing cards, special purpose circuitry, and/or other communication devices. The browser device includes, for example, a computer (e.g., desktop computer, laptop computer) with a world wide web browser (e.g., Microsoft® Internet Explorer® available from Microsoft Corporation, Mozilla® Firefox available from Mozilla Corporation). A mobile computing device includes, for example, a Blackberry®. IP phones include, for example, a Cisco® Unified IP Phone 7985G available from Cisco Systems, Inc., and/or a Cisco® Unified Wireless Phone 7920 available from Cisco Systems, Inc.


One skilled in the art will realize the invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The foregoing embodiments are therefore to be considered in all respects illustrative rather than limiting of the invention described herein. Scope of the invention is thus indicated by the appended claims, rather than by the foregoing description, and all changes that come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.

Claims
  • 1. A computer-implemented method, comprising:
    storing at a user device, data for multiple functions of a graphical interface on the user device, and wherein the graphical interface facilitates communication between the user device and an agent device;
    receiving at the user device, an initial set of data, wherein when the initial set of data is received from an engagement server, the initial set of data is associated with an initial function of the multiple functions, and wherein the initial set of data causes the user device to display the graphical interface customized with the initial function as part of a communication session between the user device and the agent device;
    transmitting a feedback set of data, wherein the feedback set of data is associated with the initial function, wherein when received at the agent device, the feedback set of data causes the agent device to display a current configuration of the graphical interface for the user device that includes details of a problem experienced by a user of the user device; and
    receiving an additional set of data at the user device, wherein the additional set of data is generated by intelligent learning at a customization component of the engagement server to optimize the graphical interface based on the feedback set of data.
  • 2. The computer-implemented method of claim 1, wherein the feedback set of data facilitates display of a snapshot of the graphical interface at the user device at a particular time.
  • 3. The computer-implemented method of claim 1, wherein the feedback set of data facilitates generation of the graphical interface matching the current configuration for the user device at the agent device.
  • 4. The computer-implemented method of claim 1, wherein the additional set of data causes the additional function to be dynamically added to the graphical interface at the user device as the current configuration of the graphical interface is displayed at the agent device.
  • 5. The computer-implemented method of claim 1, further comprising: transmitting response data describing an updated current configuration of the graphical interface with the additional function at the user device.
  • 6. The computer-implemented method of claim 1, further comprising: transmitting updated problem data from the user device based on the additional set of data; and receiving an updated additional set of data at the user device responsive to the updated problem data analyzed at the agent device.
  • 7. The computer-implemented method of claim 1, wherein the data for multiple functions of the graphical interface is generated by the customization component using the intelligent learning at the customization component based on previous successful communication sessions.
  • 8. The computer-implemented method of claim 1, wherein the intelligent learning is performed using historical data indicative of functions which were activated, used, displayed, or otherwise associated with the user interface to optimize the user interface for a particular user experience.
  • 9. A system, comprising:
    memory; and
    one or more processors coupled to the memory and configured to perform operations including:
    storing at a user device, data for multiple functions of a graphical interface on the user device, and wherein the graphical interface facilitates communication between the user device and an agent device;
    receiving at the user device, an initial set of data, wherein when the initial set of data is received from an engagement server, the initial set of data is associated with an initial function of the multiple functions, and wherein the initial set of data causes the user device to display the graphical interface customized with the initial function as part of a communication session between the user device and the agent device;
    transmitting a feedback set of data, wherein the feedback set of data is associated with the initial function, wherein when received at the agent device, the feedback set of data causes the agent device to display a current configuration of the graphical interface for the user device that includes details of a problem experienced by a user of the user device; and
    receiving an additional set of data at the user device, wherein the additional set of data is generated by intelligent learning at a customization component of the engagement server to optimize the graphical interface based on the feedback set of data.
  • 10. The system of claim 9, wherein the feedback set of data facilitates display of a snapshot of the graphical interface at the user device at a particular time.
  • 11. The system of claim 9, wherein the feedback set of data facilitates generation of the graphical interface matching the current configuration for the user device at the agent device.
  • 12. The system of claim 9, wherein the additional set of data causes the additional function to be dynamically added to the graphical interface at the user device as the current configuration of the graphical interface is displayed at the agent device.
  • 13. The system of claim 9, wherein the one or more processors are configured for operations further comprising: transmitting response data describing an updated current configuration of the graphical interface with the additional function at the user device.
  • 14. The system of claim 9, wherein the one or more processors are configured for operations further comprising: transmitting updated problem data from the user device based on the additional set of data; and receiving an updated additional set of data at the user device responsive to the updated problem data analyzed at the agent device.
  • 15. A non-transitory computer readable storage medium comprising instructions that, when executed by one or more processors of a device, cause the device to perform operations comprising:
    storing at a user device, data for multiple functions of a graphical interface on the user device, and wherein the graphical interface facilitates communication between the user device and an agent device;
    receiving at the user device, an initial set of data, wherein when the initial set of data is received from an engagement server, the initial set of data is associated with an initial function of the multiple functions, and wherein the initial set of data causes the user device to display the graphical interface customized with the initial function as part of a communication session between the user device and the agent device;
    transmitting a feedback set of data, wherein the feedback set of data is associated with the initial function, wherein when received at the agent device, the feedback set of data causes the agent device to display a current configuration of the graphical interface for the user device that includes details of a problem experienced by a user of the user device; and
    receiving an additional set of data at the user device, wherein the additional set of data is generated by intelligent learning at a customization component of the engagement server to optimize the graphical interface based on the feedback set of data.
  • 16. The non-transitory computer readable storage medium of claim 15, wherein the feedback set of data facilitates display of a snapshot of the graphical interface at the user device at a particular time.
  • 17. The non-transitory computer readable storage medium of claim 15, wherein the feedback set of data facilitates generation of the graphical interface matching the current configuration for the user device at the agent device.
  • 18. The non-transitory computer readable storage medium of claim 15, wherein the additional set of data causes the additional function to be dynamically added to the graphical interface at the user device as the current configuration of the graphical interface is displayed at the agent device.
  • 19. The non-transitory computer readable storage medium of claim 15, wherein the instructions further cause the device to perform operations comprising: transmitting response data describing an updated current configuration of the graphical interface with the additional function at the user device.
  • 20. The non-transitory computer readable storage medium of claim 15, wherein the instructions further cause the device to perform operations comprising: transmitting updated problem data from the user device based on the additional set of data; and receiving an updated additional set of data at the user device responsive to the updated problem data analyzed at the agent device.
CROSS REFERENCE TO RELATED APPLICATION

This application is a continuation of U.S. patent application Ser. No. 17/004,362 filed Aug. 27, 2020, which is a continuation of U.S. patent application Ser. No. 15/389,598 filed Dec. 23, 2016, which is a continuation of U.S. application Ser. No. 13/841,434 filed Mar. 15, 2013, which claims priority to U.S. Provisional Application No. 61/638,825, filed Apr. 26, 2012, the disclosure of each of which is incorporated herein by reference in its entirety for all purposes.

Non-Patent Literature Citations (297)
Entry
Chartrand Sabra, “A new system seeks to ease the bottleneck in the customer-service information highway,” The New York Times (Apr. 30, 2001), 2 pages.
Just Answer (2004 Faq) Archive.org cache of www.justanswer.com circa (Dec. 2004), 8 pages.
Pack Thomas, “Human Search Engines the next Killer app,” (Dec. 1, 2000) Econtent DBS vol. 23; Issue 6, 7 pages.
Match.com “Match.com Launches Match.com Advisors,” PR Newswire (Oct. 14, 2003), 2 pages.
Sitel, “Sitel to Provide Live Agent Support Online for Expertcity.com,” PR Newswire (Feb. 28, 2000), 2 pages.
Webmaster World, “Link to my website is in a frame with banner ad at the top,” www.webmasterworld.com (Nov. 11, 2003), 2 pages.
Bry et al., “Realilzing Business Processes with ECA Rules: Benefits, Challenges, Limits,” Principles and Practice of Sematic Web Reasoning Lecture Notes in Computer Science, pp. 48-62, LNCS, Springer, Berlin, DE (Jan. 2006).
Fairisaac, “How SmartForms for Blaze Advisor Works,” www.fairisaac.com 12 pages (Jan. 2005).
Mesbah A et al., “A Component-and Push-Based Architectural Style for Ajax Applications,” The Journal of Systems & Software, 81 (12): pp. 2194-2209, Elsevier North Holland, New York, NY US (Dec. 2008).
Oracle Fusion Middleware Administrator's Guide for Oracle SOA (Oracle Guide) Suite 11g Release 1 (11.1.1) Part No. E10226-02 www.docs.oracle.com (Oct. 2009), 548 pages.
“OAuth core 1.0 Revision A [XP002570263],” OAuth Core Workgroups, pp. 1-27 www.ouath.net/core/1.0a/ (retrieved Jan. 31, 2013), 24 pages.
Anon., “AnswerSoft Announces Concerto; First to Combine Call Center Automation with Power of Web,” Business Wire, (Feb. 3, 1997) 3 pages.
Emigh, J., “AnswerSoft Unveils Concerto for Web-Based Call Centers Feb. 5, 1996,” Newsbytes, (Feb. 5, 1997) 2 pages.
Grigonis, R., “Webphony-It's not Just Callback Buttons Anymore,” Computer Telephony, (Dec. 1997) 4 pages.
Wagner, M., “Caring for Customers,” Internet World, (Sep. 1, 1999) 3 pages.
Sweat, J., “Human Touch—a New Wave of E-Service Offerings Blends the Web, E-Mail, and Voice Bringing People back into the Picture,” Information week, (Oct. 4, 1999) 2 pages.
Kirkpatrick, K., “Electronic Exchange 2000, the, ” Computer Shopper, (Nov. 1999) 5 pages.
Anon., “InstantService.com Teams with Island Data to provide Integrated Solution for Online Customer Response,” Business Wire, (May 22, 2000) 3 pages.
Kersnar, S., “Countrywide Offers Proprietary Technology for Online Wholesale Lending,” National Mortgage News, vol. 24, No. 38, (Jun. 5, 2000) 2 pages.
Douglas Armstrong, Firstar Web site helps add up future, Milwaukee Journal Sentinel, (Mar. 28, 1996) 3 pages.
redhat .com downloaded on Jul. 23, 2006.
apache.org downloaded on Jul. 23, 2006.
mysql.com downloaded on Jul. 23, 2006.
developer.com downloaded on Jul. 23, 2006.
Canter, Ronald S., “Lender Beware-Federal Regulation of Consumer Credit”, Credit World, vol. 81, No. 5, pp. 16-20, (May 1993).
Staff, “On-Line System Approves Loans While Customer Waits,” Communication News, vol. 31, Issue 9, (Sep. 1994) 3 pages.
“Low-Rent Loan Officer in a Kiosk”, Bank Technology News vol. 8 No. 2, p (Feb. 1995) 2 pages.
Duclaux, Denise, “A Check for $5,000 in Ten Minutes”, ABA Banking Journal, vol. 87, No. 8, p. 45, AUQ. (1995) 2 pages.
“World Wide Web Enhances Customer's Choice”, Cards International, No. 143, p. 9, (Nov. 1995) 2 pages.
Wells Fargo Launches First Real-Time, Online Home Equity Credit Decision-Making Service, Business Wire, (Jun. 3, 1998), Dialog_ File 621: New Product Announcement, 3 pages.
Handley, John, “Credit Review Lets the Numbers Do the Talking in Home Mortgage Game”, Chicago Tribune (Jul. 1998) 3 pages.
Sherman, Lee, “Wells Fargo Writes a New Online Script”, Interactive Week, vol. 5, No. 31, p. 29, (Aug. 1998) 2 pages.
Calvey, Mark, “Internet Gives Bankers a Snappy Comeback”, San Francisco Business Times, vol. 13, No. 5, p. 3 (Sep. 1998) 2 pages.
McCormick, Linda, “Users of Credit Scoring Face Tough Rules on Notification”, American Banker, Dialog File 625: American Banker Publications, (Mar. 21, 1982) 2 pages.
What the Credit Bureau is Saying About You: If a Mistake Sneaks Into Your Record, You May Not Know About it Until You Get Turned Down for Credit, Changing Times, vol. 37, p. 56, (Jul. 1983) 2 pages.
McShane. Peter K., “Got Financing?”, Business Journal Serving Southern Tier, CNY, Mohawk Valley, Finger Lakes. North, vol. 11, Issue 19, p. 9, (Sep. 15, 1997) 3 pages.
Borowsky, Mark, “The Neural Net: Predictor of Fraud or Victim of Hype?”, Bank Technology News DialoQ File 16:PROMT, p. 7 (Sep. 1993) 2 pages.
FICO http://houseloans.idis.com/fico (2009) 1 page.
Altavista: search, FICO http://www.altavista.com (2001) 3 pages.
What Do FICO Scores Mean to Me?, http://www.sancap.com. (1999) 3 pages.
What is a FICO Score?, http://www.aspeenloan.com (2009) 1 page.
“Credit”, The New Encyclopedia Britannica vol. 3 p. 722. (1994) 3 pages.
“Creditnet.com—an Online Guide to Credit Cards”, http://www.creditnet/com. (1999) 1 page.
“Phillips 66 Introduces Mastercard with Rebate Feature”, PR Newswire, p914NY067, (Sep. 14, 1995) 1 page.
Anon, “VAR Agreement Expands Credit Bureau Access.”, (CCS America, Magnum Communications Ltd expand CardPac access, Computers in Banking, v6, n10, (1) (Oct. 1989) 2 pages.
Wortmann, Harry S., “Reengineering Update—Outsourcing: an Option Full of Benefits and Responsibilities”, American Banker, (Oct. 24, 1994), p. 7A vol. 159, No. 205 3 pages.
Anon. “To Boost Balances, AT&T Renews No-Fee Universal Credit Card Offer”, Gale Group Newsletter, V 10, N. 13, (Mar. 30, 1992) 2 pages.
Anon. “Citgo Puts a New Spin on the Cobranded Oil Card”, Credit Card News, p. 4, (Nov. 1, 1995) 2 pages.
Anon. “Microsoft Targets More than PIM Market with Outlook 2000,” Computer Reseller News, N. 805 p. 99, (Aug. 31, 1998) 2 pages.
Chesanow, Neil, “Pick the Right Credit Cards-and use them wisely”, Medical Economics, v. 75, n. 16, p. 94, (Aug. 24, 1998) 4 pages.
Friedland, Marc, “Credit Scoring Digs Deeper into Data”, Credit World, v. 84, n. 5 p. 19-23, (May 1996) 5 pages.
Hollander, Geoffrey, “Sibling Tool Personator 3 untangles File Formats”, InfoWorld, v20, n5, p. 102 (Feb. 2, 1998) 2 pages.
Kantrow, Yvette D., “Banks Press Cardholders to Take Cash Advances”, American Banker, v. 157, n. 18 pp. 1-2. (Jan. 28, 1992) 2 pages.
Lotus News Release: “Lotus Delivers Pre-Release of Lotus Notes 4.6 Client Provides Compelling New Integration with Internet Explorer”, (May 20, 1997) 2 pages.
Stetenfeld, Beth, “Credit Scoring: Finding the Right Recipe”, Credit Union Management, v. 17, n 11, pp. 24-26 (Nov. 1994).
Block, Valerie, “Network Assembles Card Issuers at an Internet Site”, Am. Banker, V160, (1998) 1 page.
CreditNet Financial Network http://consumers.creditnet.com (1999) 1 page.
Anon., “Lending Tree: Lending Tree Provides Borrowers Fast and Easy Online Access to Multiple Loan Offers,” Business Wire, Jun. 23, 1998, 2 pages.
Anon, Regulation Z Commentary Amendments, Retail Banking Digest, vol. 15, No. 2, p. 17-18, (Mar.- Apr. 1995).
Anon, San Diego Savings Association Offers Customers No-Fee Visa Product, Card News, (Feb. 29, 1988) 1 page.
Bloom, J.K., “For This New Visa, Only Web Surfers Need Apply,” American Banker, vol. 1163, No. 34 12 (Feb. 20, 1998) 2 pages.
Harney, K.R., “Realty Brokers, Lenders Face Restrictions,” Arizona Republic, Final Chaser edition, Sun Living section, (Feb. 10, 1991) 2 pages.
Higgins, K.T., “Mr. Plastic Joins the Marketing Team,” Credit Card Management, vol. 6, No. 3, pp. 26-30, Jun. 1993.
Microsoft Press Computer Dictionary, Third Edition, Microsoft Press, Redmond, 1997, 4 pages.
Whiteside, D.E., “One Million and Counting,” Collections and Credit Risk, vol. 1, No. 11 (Nov. 1996) 5 pages.
Fickenscher, L., “Providian Undercuts rivals with 7.9% Rate Offer,” American banker, vol. 163, Oct. 8, 1998, 2 pages.
Fargo, J., “The Internet Specialists,” Credit Card Management, vol. 11, No. 10, pp. 38-45, Jan. 1999.
Lemay, T., “Browsing for a Mortgage a Click away,” Financial Post, (Jan. 15, 2000) 1 page.
Wijnen, R., “Banks Fortify Online Services,” Bank Technology News, vol. 13, No. 3, Mar. 2000, 3 pages.
Anon. “IAFC Launches NextCard, the First True Internet VISA,” Business Wire, New York: (Feb. 6, 1998), 3 pages.
Lazarony, Lucy, “Only Online Applicants Need Apply,” Bank Advertising News, North Palm Beach, Mar. 23, 1998, vol. 21, Issue 15, 3 pages.
FIData, Inc., News & Press Releases, “Instant Credit Union Loans via the Internet,” http://web.archive.org/web/19990221115203/www.fidata-inc.com/news-pr01.htm_(1999) 2 pages.
FIData, Inc., Press Releases, “Instant Loan Approvals via the Internet,” http://www.fidata- inc.com/news/pr_040198.htm, (Apr. 1, 1998) 2 pages.
Staff, “On-Line System Approves Loans While Customer Waits” -Abstract, Communication News, vol. 31, Issue 9, (Sep. 1994) 3 pages.
Anon. “Affordable Lending Systems Now Available for Smaller Financial Institutions,” Business Wire, (May 18, 1998), 2 pages.
Nexis—All News Sources—Examiner's NPL Search Results in U.S. Appl. No. 11/932,498, included with Office Action dated Oct. 8, 2008, 14 pages.
“Sample Experian Credit Report” by Consumer Information consumerinfo.com (Jul. 9, 1998) 4 pages.
Plaintiffs Original Complaint, Nextcard, LLC v. Liveperson, Inc.; Civil Action No. 2:08-cv-00184-TJW, In the U.S. District Court for the Eastern District of Texas, Marshall Division, filed Apr. 30, 2008 (7 pages).
Amended Complaint and Jury Demand; Liveperson, Inc. v. Nextcard, LLC, et al.; Civil Action No. 08-062 (GMS), in the U.S. District Court for the District of Delaware, filed Mar. 18, 2008 (5 pages).
Plaintiffs Second Amended Complaint; Nextcard, LLC v. American Express Company, et al.; Civil Action No. 2:07-cv-354 (TJW); In the U.S. District Court for the Eastern District of Texas, Marshall Division, filed Apr. 9, 2008 (12 pages).
Defendants HSBC North America Holdings Inc.'s and HSBC USA Inc's Answer, Affirmative Defenses and Counterclaims to Plaintiffs Second Amended Complaint; Nextcard, LLC v. American Express Company, et al.; Civil Action No. 2:07-cv-354 (TJW); In the U.S. District Court for the Eastern District of Texas, Marshall Division filed (Apr. 28, 2008), 13 pages.
Answer and Counterclaims of Defendant DFS Services LLC; Nextcard, LLC v. American Express Company, et al.; Civil Action No. 2:07-cv-354 (TJW); In the U.S. District Court for the Eastern District of Texas, Marshall Division, filed Apr. 28, 2008 (13 pages).
Defendant The PNC Financial Services Group, Inc.'s Answer and Affirmative Defenses to Second Amended Complaint; Nextcard, LLC v. American Express Company, et al.; Civil Action No. 2:07-cv-354 (TJW); In the U.S. District Court for the Eastern District of Texas, Marshall Division, filed Apr. 28, 2008, 10 pages.
Plaintiffs Second Amended Reply to Counterclaims of Defendants HSBC North America Holdings Inc. and HSBC USA Inc.; Nextcard, LLC v. American Express Company, et al.; Civil Action No. 2:07-cv-354 (TJW); In the U.S. District Court for the Eastern District of Texas, Marshall Division, filed May 14, 2008, 5 pages.
Plaintiffs Second Amended Reply to Counterclaims of Defendant DFS Services LLC; Nextcard, LLC v. American Express Company, et al.; Civil Action No. 2:07-cv-354 (TJW); In the U.S. District Court for the Eastern District of Texas, Marshall Division, filed May 14, 2008 (71 pages).
Plaintiffs Second Amended Reply to Counterclaims of Defendant American Express Company; Nextcard, LLC v. American Express Company, et al.; Civil Action No. 2:07-cv-354 (TJW); In the U.S. District Court for the Eastern District of Texas, Marshall Division, filed (May 8, 2008), 8 pages.
Justin Hibbard, Gregory Dalton, Mary E Thyfault. (Jun. 1998). “Web-based customer care.” Information Week, (684) 18-20, 3 pages.
Kim S. Nash “Call all Customers.” Computerworld, 32 (1), 25-28 (Dec. 1997), 2 pages.
PRN: “First American Financial Acquires Tele-Track Inc., ”PR Newswire, (May 11, 1999), Proquest #41275773, 2 pages.
Young, Deborah, “The Information Store,” (Sep. 15, 2000), Wireless Review, pp. 42, 44, 46, 48, 50.
Whiting et al., “Profitable Customers,” (Mar. 29, 1999), Information Week, Issue 727, pp. 44, 45, 48, 52, 56.
Bayer, Judy, “A Framework for Developing and Using Retail Promotion Response Models,” Cares Integrated Solutions, retrieved from www.ceresion.com (2007) 5 pages.
Bayer, Judy, “Automated Response Modeling System for Targeted Marketing,” (Mar. 1998), Ceres Integrated Solutions, 5 pages.
Sweet et al., “Instant Marketing,” (Aug. 12, 1999), Information Week, pp. 18-20.
SmartKids.com “Chooses Quadstone—the Smartest Customer Data Mining Solution,” (Jul. 31, 2000), Business Wire, 2 pages.
“NCR's Next Generation Software Makes True Customer Relationship Management a Reality,” (Jul. 26, 1999) PR Newswire, 3 pages.
“Quadstone System 3.0 Meets New Market Demand for Fast, Easy-to-Use Predictive Analysis for CRM,” (May 22, 2000) Business Wire, 3 pages.
“Net Perceptions Alters Dynamics of Marketing Industry with Introduction of Net Perceptions for Call Centers,” (Oct. 12, 1998) PR Newswire, 3 pages.
“Ceres Targeted Marketing Application,” Ceres Integrated Solutions: retrieved from www.ceresios.com/Product/index.htm (2007) 3 pages.
Prince, C. J., E:business: a Look at the Future, Chief Executive, vol. 154, (Apr. 2000), pp. 10-11.
Oikarinen et al. “Internet Relay Chat Protocol” RFC-1459, pp. 1-65, (May 1993).
eDiet.com: Personalized Diets, Fitness, and Counseling, (May 3, 1998), pp. 1-15.
Fiszer, Max; “Customizing an inbound call-center with skills-based routing,” Telemarketing & Call Center Solutions, (Jan. 1997), v1517 p. 24; Proquest #11267840, 5 pages.
“ESL Federal Credit Union Inaugurates Internet Target Marketing.” PR Newswire p. 4210 (Oct. 6, 1998), 3 pages.
“Welcome to eStara—the Industry Leader in Click to Call and Call Tracking Solutions,” e-Stara, Inc., retrieved from www.estara.com on Mar. 21, 2013, 1 page.
“Push to Talk Live Now! From your website” iTalkSystem, Inc., retrieved from www.italksystems.com on Mar. 21, 2013, 1 page.
Richardson et al., “Predicting Clicks: Estimating the Click-Through Rate for New Ads,” (May 2007) 9 pages.
“Welcome to Keen” retrieved from www.archive.org/web/20010302014355/http://www.keen.com/ on Jan. 25, 2013, 1 page.
Christophe Destruel, Herve Luga, Yves Duthen, Rene Caubet. “Classifiers based system for interface evolution.” Expersys Conference, 265-270 (1997), 6 pages.
Ulla de Stricker, Annie Joan Olesen. “Is Management Consulting for You?” SEARCHER, 48-53 (Mar. 2005), 6 pages.
Humberto T. Marques Neto, Leonardo C.D. Rocha, Pedro H.C. Guerra, Jussara M. Almeida, Wagner Meira Jr., Virgilio A. F. Almeida. “A Characterization of Broadband User Behavior and Their E-Business Activities.” ACM SIGMETRICS Performance Evaluation Review, 3-13 (2004), 11 pages.
Greg Bowman, Michael M. Danchak, Mary LaCombe, Don Porter. “Implementing the Rensselaer 80/20 Model in Professional Education.” 30th ASEE/IEEE Frontiers in Education Conference, Session T3G (Oct. 18-21, 2000), 1 page.
Elizabeth Sklar Rozier, Richard Alterman. “Participatory Adaptation.” CHI, 97, 261-262 (Mar. 22-27, 1997), 2 pages.
Frank White. “The User Interface of Expert Systems: What Recent Research Tells Us.” Library Software Review, vol. 13, No. 2, p. 91-98 (Summer 1994) 8 pages.
Frederick W. Rook, Michael L. Donnell. “Human Cognition and the Expert System Interface: Mental Models and Inference Explanations.” IEEE Transactions on Systems, Man, and Cybernetics, vol. 23, No. 6, p. 1649-1661 (Nov./Dec. 1993), 13 pages.
Francois Bry et al., "Realizing Business Processes with ECA Rules: Benefits, Challenges, Limits" (2006), Principles and Practice of Semantic Web Reasoning, Lecture Notes in Computer Science (LNCS), Springer, Berlin, DE, pp. 48-62, XP019042871, ISBN: 978-3-540-39586-7.
International Search Report and Written Opinion for PCT Application No. PCT/US2013/041147, dated Jul. 30, 2013, 9 pages.
International Search Report and Written Opinion for PCT Application No. PCT/US2013/037086, dated Jul. 12, 2013, 9 pages.
International Search Report and Written Opinion for PCT Application No. PCT/US2013/29389, dated Jul. 24, 2013, 8 pages.
International Search Report and Written Opinion for PCT Application No. PCT/US2013/038212, dated Jul. 17, 2013, 11 pages.
International Search Report for PCT Application No. PCT/US03/41090, dated Sep. 1, 2004, 3 pages.
International Search Report for PCT Application No. PCT/US05/40012, dated Oct. 5, 2007, 2 pages.
International Preliminary Report on Patentability for PCT Application No. PCT/US2006/039630, dated Apr. 16, 2008, 4 pages.
International Search Report for PCT Application No. PCT/US2011/031239, dated Jul. 7, 2011, 3 pages.
International Search Report for PCT Application No. PCT/US2011/064946, dated Jun. 22, 2012, 3 pages.
International Preliminary Report on Patentability for PCT Application No. PCT/US2011/031239, dated Oct. 9, 2012, 8 pages.
International Search Report and Written Opinion for PCT Application No. PCT/US14/49822, dated Feb. 27, 2015, 11 pages.
Extended European Search Report dated Jul. 7, 2015 for European Patent Application No. 15161694.3; 8 pages.
International Preliminary Report on Patentability for PCT Application No. PCT/US2014/049822, dated Feb. 18, 2016, 7 pages.
International Search Report and Written Opinion for PCT Application No. PCT/US2016/035535, dated Aug. 8, 2016, 11 pages.
International Search Report and Written Opinion dated Nov. 7, 2017 for PCT Application No. PCT/US2017/046550, 16 pages.
Non-Final Office Action dated Dec. 11, 2008 for U.S. Appl. No. 11/394,078, 15 pages.
Final Office Action dated Jul. 9, 2009 for U.S. Appl. No. 11/394,078, 15 pages.
Non-Final Office Action dated Jan. 28, 2010 for U.S. Appl. No. 11/394,078, 14 pages.
Final Office Action dated Jul. 9, 2010 for U.S. Appl. No. 11/394,078, 16 pages.
Non-Final Office Action dated Feb. 1, 2011 for U.S. Appl. No. 11/394,078, 20 pages.
Final Office Action dated Aug. 2, 2011 for U.S. Appl. No. 11/394,078, 23 pages.
Non-Final Office Action dated May 16, 2012 for U.S. Appl. No. 11/394,078, 23 pages.
Final Office Action dated Jan. 25, 2013 for U.S. Appl. No. 11/394,078, 22 pages.
Non-Final Office Action dated Jun. 22, 2012 for U.S. Appl. No. 13/080,324, 9 pages.
Non-Final Office Action dated Aug. 15, 2012 for U.S. Appl. No. 12/967,782, 31 pages.
Non-Final Office Action dated Jul. 29, 2011 for U.S. Appl. No. 12/608,117, 20 pages.
Final Office Action dated Apr. 4, 2012 for U.S. Appl. No. 12/608,117, 25 pages.
Non-Final Office Action dated Apr. 24, 2004 for U.S. Appl. No. 09/922,753, 16 pages.
Final Office Action dated Oct. 14, 2004 for U.S. Appl. No. 09/922,753, 13 pages.
Non-Final Office Action dated May 17, 2005 for U.S. Appl. No. 09/922,753, 13 pages.
Non-Final Office Action dated Mar. 14, 2006 for U.S. Appl. No. 09/922,753, 13 pages.
Final Office Action dated Jul. 26, 2006 for U.S. Appl. No. 09/922,753, 13 pages.
Non-Final Office Action dated Aug. 13, 2008 for U.S. Appl. No. 09/922,753, 10 pages.
Final Office Action dated Apr. 23, 2009 for U.S. Appl. No. 09/922,753, 11 pages.
Non-Final Office Action dated Jul. 21, 2009 for U.S. Appl. No. 09/922,753, 10 pages.
Final Office Action dated Feb. 18, 2010 for U.S. Appl. No. 09/922,753, 9 pages.
Non-Final Office Action dated Apr. 25, 2011 for U.S. Appl. No. 09/922,753, 9 pages.
Final Office Action dated Nov. 25, 2011 for U.S. Appl. No. 09/922,753, 10 pages.
Non-Final Office Action dated Aug. 7, 2007 for U.S. Appl. No. 10/980,613, 16 pages.
Non-Final Office Action dated May 15, 2008 for U.S. Appl. No. 10/980,613, 23 pages.
Non-Final Office Action dated Apr. 30, 2012 for U.S. Appl. No. 12/504,265, 16 pages.
Final Office Action dated Aug. 28, 2012 for U.S. Appl. No. 12/504,265, 28 pages.
Final Office Action dated Feb. 14, 2013 for U.S. Appl. No. 13/080,324, 11 pages.
Non-Final Office Action dated Mar. 30, 2013 for U.S. Appl. No. 11/360,530, 23 pages.
Final Office Action dated Apr. 11, 2013 for U.S. Appl. No. 12/967,782, 18 pages.
Non-Final Office Action dated May 10, 2013 for U.S. Appl. No. 13/563,708, 20 pages.
Non-Final Office Action dated Jun. 12, 2013 for U.S. Appl. No. 12/608,117, 56 pages.
Non-Final Office Action dated Jun. 20, 2013 for U.S. Appl. No. 13/157,936, 19 pages.
Non-Final Office Action dated Jun. 27, 2013 for U.S. Appl. No. 12/504,265, 11 pages.
Non-Final Office Action dated Jul. 8, 2013 for U.S. Appl. No. 13/413,197, 10 pages.
Final Office Action dated Oct. 21, 2013 for U.S. Appl. No. 12/504,265, 14 pages.
Non-Final Office Action dated Oct. 30, 2013 for U.S. Appl. No. 13/961,072, 10 pages.
Non-Final Office Action dated Dec. 5, 2013 for U.S. Appl. No. 12/967,782, 14 pages.
Non-Final Office Action dated Dec. 4, 2014 for U.S. Appl. No. 14/275,698, 6 pages.
Notice of Allowance dated Jan. 3, 2014 for U.S. Appl. No. 11/360,530, 29 pages.
Final Office Action dated Jan. 22, 2014 for U.S. Appl. No. 12/608,117, 45 pages.
Final Office Action dated Jan. 27, 2014 for U.S. Appl. No. 13/563,708, 35 pages.
Non-Final Office Action dated Jan. 30, 2014 for U.S. Appl. No. 13/413,158, 19 pages.
Notice of Allowance dated Feb. 12, 2014 for U.S. Appl. No. 13/157,936, 33 pages.
Final Office Action dated Feb. 19, 2014 for U.S. Appl. No. 13/961,072, 35 pages.
Non-Final Office Action dated Feb. 20, 2014 for U.S. Appl. No. 10/980,613, 43 pages.
Notice of Allowance dated Feb. 28, 2014 for U.S. Appl. No. 09/922,753, 13 pages.
Notice of Allowance dated Mar. 25, 2014 for U.S. Appl. No. 12/504,265, 31 pages.
Notice of Allowance dated Mar. 31, 2014 for U.S. Appl. No. 12/725,999, 41 pages.
Notice of Allowance dated Mar. 30, 2015 for U.S. Appl. No. 14/275,698, 11 pages.
Notice of Allowance dated Apr. 1, 2014 for U.S. Appl. No. 13/413,197, 32 pages.
Non-Final Office Action dated Jul. 17, 2014 for U.S. Appl. No. 11/394,078, 41 pages.
Non-Final Office Action dated Jul. 31, 2014 for U.S. Appl. No. 13/080,324, 38 pages.
Notice of Allowance dated Aug. 18, 2014 for U.S. Appl. No. 12/967,782, 43 pages.
Non-Final Office Action dated Aug. 21, 2014 for U.S. Appl. No. 10/980,613, 43 pages.
Final Office Action dated Mar. 12, 2015 for U.S. Appl. No. 13/080,324, 13 pages.
Non-Final Office Action dated Mar. 13, 2015 for U.S. Appl. No. 13/841,434, 26 pages.
Non-Final Office Action dated Apr. 9, 2015 for U.S. Appl. No. 13/830,719, 24 pages.
Final Office Action dated Apr. 7, 2015 for U.S. Appl. No. 11/394,078, 18 pages.
Non-Final Office Action dated Apr. 6, 2015 for U.S. Appl. No. 14/322,736, 13 pages.
Non-Final Office Action dated May 7, 2015 for U.S. Appl. No. 13/829,708, 16 pages.
Final Office Action dated May 8, 2015 for U.S. Appl. No. 10/980,613, 18 pages.
Non-Final Office Action dated May 13, 2015 for U.S. Appl. No. 14/317,346, 21 pages.
Non-Final Office Action dated Jun. 2, 2015 for U.S. Appl. No. 12/608,117, 26 pages.
First Action Pre-Interview Communication dated Jun. 19, 2015 for U.S. Appl. No. 14/244,830, 7 pages.
Non-Final Office Action dated Jul. 20, 2015 for U.S. Appl. No. 14/711,609; 12 pages.
Non-Final Office Action dated Jul. 20, 2015 for U.S. Appl. No. 14/500,537; 12 pages.
Final Office Action dated Jul. 31, 2015 for U.S. Appl. No. 14/317,346, 13 pages.
Final Office Action dated Aug. 10, 2015 for U.S. Appl. No. 13/961,072, 12 pages.
Non-Final Office Action dated Aug. 14, 2015 for U.S. Appl. No. 14/543,397, 12 pages.
Non-Final Office Action dated Aug. 18, 2015 for U.S. Appl. No. 14/570,963, 23 pages.
Non-Final Office Action dated Aug. 27, 2015 for U.S. Appl. No. 11/394,078, 21 pages.
Non-Final Office Action dated Sep. 11, 2015 for U.S. Appl. No. 14/500,502; 12 pages.
Final Office Action dated Sep. 18, 2015 for U.S. Appl. No. 14/288,258, 17 pages.
Notice of Allowance dated Sep. 18, 2015 for U.S. Appl. No. 14/244,830, 11 pages.
First Action Interview Pilot Program Pre-Interview Communication dated Oct. 21, 2015 for U.S. Appl. No. 14/313,511, 3 pages.
Final Office Action dated Oct. 22, 2015 for U.S. Appl. No. 13/830,719, 29 pages.
Final Office Action dated Nov. 10, 2015 for U.S. Appl. No. 13/841,434; 30 pages.
Final Office Action dated Nov. 17, 2015 for U.S. Appl. No. 12/608,117, 32 pages.
Non-Final Office Action dated Dec. 4, 2015 for U.S. Appl. No. 10/980,613, 21 pages.
Non-Final Office Action dated Dec. 24, 2015 for U.S. Appl. No. 14/317,346, 15 pages.
Notice of Allowance dated Dec. 30, 2015 for U.S. Appl. No. 14/322,736, 9 pages.
Non-Final Office Action dated Jan. 5, 2016 for U.S. Appl. No. 14/245,400, 33 pages.
Notice of Allowance dated Jan. 7, 2016 for U.S. Appl. No. 14/313,511, 5 pages.
First Action Pre-Interview Communication dated Jan. 12, 2016 for U.S. Appl. No. 14/753,496, 3 pages.
Notice of Allowance dated Jan. 20, 2016 for U.S. Appl. No. 13/829,708, 11 pages.
Final Office Action dated Jan. 29, 2016 for U.S. Appl. No. 14/711,609; 15 pages.
Final Office Action dated Jan. 29, 2016 for U.S. Appl. No. 14/500,537; 15 pages.
Non-Final Office Action dated Feb. 12, 2016 for U.S. Appl. No. 13/080,324, 15 pages.
Notice of Allowance dated Mar. 16, 2016 for U.S. Appl. No. 14/582,550; 9 pages.
Notice of Allowance dated Mar. 21, 2016 for U.S. Appl. No. 14/753,496; 5 pages.
Final Office Action dated Apr. 14, 2016 for U.S. Appl. No. 10/980,613, 21 pages.
Final Office Action dated Apr. 21, 2016 for U.S. Appl. No. 14/317,346, 17 pages.
Non-Final Office Action dated Apr. 22, 2016 for U.S. Appl. No. 14/288,258, 11 pages.
Notice of Allowance dated Apr. 22, 2016 for U.S. Appl. No. 11/394,078, 16 pages.
Non-Final Office Action dated May 12, 2016 for U.S. Appl. No. 13/961,072, 12 pages.
Non-Final Office Action dated May 23, 2016 for U.S. Appl. No. 12/608,117, 35 pages.
Final Office Action dated Jun. 9, 2016 for U.S. Appl. No. 14/543,397, 18 pages.
Final Office Action dated Jun. 17, 2016 for U.S. Appl. No. 14/570,963, 18 pages.
Notice of Allowance dated Jun. 23, 2016 for U.S. Appl. No. 13/830,719; 26 pages.
Final Office Action dated Jun. 28, 2016 for U.S. Appl. No. 14/500,502, 10 pages.
Final Office Action dated Jul. 12, 2016 for U.S. Appl. No. 14/245,400, 36 pages.
First Action Pre-Interview Communication dated Jul. 14, 2016 for U.S. Appl. No. 14/970,225.
Final Office Action dated Sep. 8, 2016 for U.S. Appl. No. 13/080,324, 15 pages.
Notice of Allowance dated Sep. 21, 2016 for U.S. Appl. No. 14/711,609, 22 pages.
Notice of Allowance dated Sep. 22, 2016 for U.S. Appl. No. 14/500,537, 19 pages.
Notice of Allowance dated Sep. 23, 2016 for U.S. Appl. No. 13/841,434, 15 pages.
Notice of Allowance dated Sep. 30, 2016 for U.S. Appl. No. 14/317,346, 19 pages.
Notice of Allowance dated Oct. 7, 2016 for U.S. Appl. No. 14/288,258, 10 pages.
Non-Final Office Action dated Jan. 13, 2017 for U.S. Appl. No. 14/543,397, 19 pages.
Non-Final Office Action dated Jan. 9, 2017 for U.S. Appl. No. 14/570,963, 16 pages.
Notice of Allowance dated Jan. 13, 2017 for U.S. Appl. No. 15/294,441, 10 pages.
Pre-Interview First Office Action dated Apr. 3, 2017 for U.S. Appl. No. 15/384,895, 7 pages.
Non-Final Office Action dated Mar. 27, 2017 for U.S. Appl. No. 14/245,400; 43 pages.
Notice of Allowance dated May 22, 2017 for U.S. Appl. No. 13/080,324; 10 pages.
Non-Final Office Action dated Jul. 17, 2017 for U.S. Appl. No. 15/131,777; 11 pages.
Non-Final Office Action dated Sep. 7, 2017 for U.S. Appl. No. 15/273,863, 29 pages.
Pre-Interview First Office Action dated Sep. 11, 2017 for U.S. Appl. No. 15/409,720, 6 pages.
Final Office Action dated Sep. 22, 2017 for U.S. Appl. No. 14/543,397, 18 pages.
Non-Final Office Action dated Sep. 25, 2017 for U.S. Appl. No. 15/632,069, 12 pages.
Final Office Action dated Oct. 6, 2017 for U.S. Appl. No. 14/570,963, 17 pages.
Notice of Allowance dated Oct. 2, 2017 for U.S. Appl. No. 15/595,590, 9 pages.
Notice of Allowance dated Dec. 8, 2017 for U.S. Appl. No. 15/409,720, 9 pages.
Final Office Action dated Jan. 4, 2018 for U.S. Appl. No. 14/245,400; 22 pages.
Final Office Action dated Jan. 9, 2018 for U.S. Appl. No. 15/384,895, 10 pages.
Non-Final Office Action dated Feb. 8, 2018 for U.S. Appl. No. 14/570,963; 25 pages.
Non-Final Office Action dated Mar. 19, 2018 for U.S. Appl. No. 15/084,133; 16 pages.
Non-Final Office Action dated Jun. 4, 2018 for U.S. Appl. No. 15/682,186; 13 pages.
Non-Final Office Action dated Jul. 12, 2018 for U.S. Appl. No. 15/860,378; 7 pages.
Final Office Action dated Jul. 11, 2018 for U.S. Appl. No. 15/273,863; 29 pages.
Notice of Allowance dated Jul. 23, 2018 for U.S. Appl. No. 15/171,525; 14 pages.
Notice of Allowance dated Sep. 12, 2018 for U.S. Appl. No. 15/213,776; 8 pages.
Non-Final Office Action dated Oct. 4, 2018 for U.S. Appl. No. 15/389,598; 21 pages.
Final Office Action dated Dec. 13, 2018 for U.S. Appl. No. 14/570,963; 32 pages.
Non-Final Office Action dated Jan. 24, 2019 for U.S. Appl. No. 15/273,863; 29 pages.
Notice of Allowance dated Feb. 1, 2019 for U.S. Appl. No. 15/084,133; 8 pages.
Notice of Allowance dated Feb. 28, 2019 for U.S. Appl. No. 15/860,378; 7 pages.
Non-Final Office Action dated Mar. 7, 2019 for U.S. Appl. No. 15/682,186; 12 pages.
Final Office Action dated Apr. 25, 2019 for U.S. Appl. No. 14/245,400; 25 pages.
Final Office Action dated May 14, 2019 for U.S. Appl. No. 15/389,598; 19 pages.
Non-Final Office Action dated Jun. 25, 2019 for U.S. Appl. No. 16/218,052; 8 pages.
Non-Final Office Action dated Aug. 7, 2019 for U.S. Appl. No. 16/353,321; 10 pages.
Final Office Action dated Aug. 7, 2019 for U.S. Appl. No. 15/273,863; 33 pages.
Notice of Allowance dated Aug. 14, 2019 for U.S. Appl. No. 15/384,895; 8 pages.
Non-Final Office Action dated Sep. 20, 2019 for U.S. Appl. No. 15/682,186; 13 pages.
Non-Final Office Action dated Dec. 4, 2019 for U.S. Appl. No. 15/182,310; 8 pages.
Non-Final Office Action dated Dec. 31, 2019 for U.S. Appl. No. 16/026,603; 7 pages.
Final Office Action dated Nov. 4, 2019 for U.S. Appl. No. 16/353,321; 14 pages.
Non-Final Office Action dated Mar. 17, 2020 for U.S. Appl. No. 15/273,863; 25 pages.
Final Office Action dated Apr. 9, 2020 for U.S. Appl. No. 16/218,052; 15 pages.
Final Office Action dated Jun. 26, 2020 for U.S. Appl. No. 15/682,186; 15 pages.
Non-Final Office Action dated Jul. 10, 2020 for U.S. Appl. No. 16/420,458; 5 pages.
Final Office Action dated Jul. 20, 2020 for U.S. Appl. No. 14/570,963; 43 pages.
Final Office Action dated Aug. 6, 2021 for U.S. Appl. No. 15/182,310; 9 pages.
Final Office Action dated Aug. 6, 2021 for U.S. Appl. No. 14/245,400; 22 pages.
Non-Final Office Action dated Oct. 30, 2020 for U.S. Appl. No. 14/570,963; 35 pages.
Non-Final Office Action dated Nov. 10, 2020 for U.S. Appl. No. 16/218,052; 16 pages.
Non-Final Office Action dated Dec. 28, 2020 for U.S. Appl. No. 14/570,963; 16 pages.
Non-Final Office Action dated Mar. 30, 2021 for U.S. Appl. No. 15/182,310; 8 pages.
Notice of Allowance dated Jan. 13, 2021 for U.S. Appl. No. 15/273,863; 11 pages.
Notice of Allowance dated Nov. 3, 2021 for U.S. Appl. No. 17/004,362; 11 pages.
Non-Final Office Action dated Feb. 17, 2022 for U.S. Appl. No. 14/570,963; 41 pages.
Notice of Allowance dated Mar. 21, 2022 for U.S. Appl. No. 15/912,761; 15 pages.
Notice of Allowance dated Mar. 8, 2022 for U.S. Appl. No. 17/114,934; 14 pages.
Notice of Allowance dated Mar. 7, 2022 for U.S. Appl. No. 14/245,400; 18 pages.
Final Office Action dated Apr. 19, 2022 for U.S. Appl. No. 15/682,186; 16 pages.
Related Publications (1)
Number            Date        Country
20220229525 A1    Jul. 2022   US

Provisional Applications (1)
Number      Date        Country
61638825    Apr. 2012   US

Continuations (3)
Number             Date        Country
Parent 17004362    Aug. 2020   US
Child 17587033                 US
Parent 15389598    Dec. 2016   US
Child 17004362                 US
Parent 13841434    Mar. 2013   US
Child 15389598                 US