System and method for increasing email productivity

Information

  • Patent Grant
  • Patent Number
    9,699,129
  • Date Filed
    Monday, June 30, 2003
  • Date Issued
    Tuesday, July 4, 2017
Abstract
A system and method for increasing email productivity based on an analysis of the content of received email messages. The system includes a content analysis engine that analyzes the content of a received email message using natural language processing techniques. A prioritization module produces a priority score and a priority level for the message using a prioritization knowledge base. A message sorting module produces a set of suggested folders for the message using a sorting knowledge base. A junkmail module produces a junkmail score for the message using a junkmail knowledge base. The prioritization knowledge base, the sorting knowledge base, and the junkmail knowledge base are updated with feedback from the user for each received email message, which allows the system to learn the user's preferences in real time.
Description
BACKGROUND OF THE INVENTION

1. Field of the Invention


This invention relates generally to electronic mail (email) and relates more particularly to a system and method for increasing email productivity.


2. Description of the Background Art


Electronic mail (email) has become an important tool for both business and personal communication, and its use will likely become even more critical in the future. Many professionals spend a good part of their workday reading and drafting email messages for various purposes. High-volume email users, for example professionals with publicly-available email addresses, may receive hundreds of email messages a day, and can spend several hours each day dealing with them. The sheer volume of received email messages challenges the productivity of any professional.


Email users often receive junkmail, also known as unsolicited commercial email or spam. Most users are not interested in such messages and routinely delete them unread. However, some junkmail messages are not easily identified as such from the sender or the subject line, and a user may waste time opening and reading a message only to discover that it is unwanted junkmail. Email users also often receive personal messages such as jokes, invitations, and notes from friends. Some users may consider these types of messages to be junkmail, while others enjoy sending and receiving such messages.


For most business users, the bulk of received email messages are work-related, some of which have a higher priority than others. For example, an email message from a key customer is likely to be of a higher priority than a message from a co-worker. Also, a message from a user's supervisor is likely to be of a higher priority than a message that is a reminder of a regularly scheduled meeting. In addition, messages regarding certain topics may be regarded as having a higher priority than messages regarding other topics. Each user may have a different determination of which messages have a higher priority, and this determination may change over time.


SUMMARY OF THE INVENTION

An email productivity module interacts with an existing email application to perform a variety of functions, such as prioritizing and sorting email messages and filtering junk email messages. The email productivity module performs these functions and others by first analyzing the content of each message using a content analysis engine that implements natural language text classification techniques. Performance of the email productivity module is maintained and/or improves over time by learning from feedback generated by a user's actions taken in connection with email messages. Feedback may be utilized to adapt one or more knowledge bases, which may be stored locally or at another computing device located on a computer network. A knowledge base is a collection of information used by the system to analyze and classify messages. The knowledge base may include any combination of statistical information about messages, the domain, or language in general, as well as rules, lexicons, thesauri, ontologies, and other natural language processing elements known in the art. The knowledge base may additionally include thresholds, likelihood tables, and other information used to classify a message.
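As an illustration of what such a knowledge base might contain, the following minimal Python sketch models it as a container of per-class statistics, rules, and thresholds. All names (KnowledgeBase, term_counts, and the sample entries) are hypothetical and not taken from the patent.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class KnowledgeBase:
    """Illustrative container for the kinds of information a knowledge base may hold."""
    term_counts: Dict[str, Dict[str, int]] = field(default_factory=dict)  # per-class term statistics
    rules: List[str] = field(default_factory=list)                        # hand-written classification rules
    thresholds: Dict[str, float] = field(default_factory=dict)            # e.g. automove or category thresholds

# Example: a prioritization knowledge base with one statistical model per priority level (hypothetical data).
kb = KnowledgeBase(
    term_counts={"high": {"urgent": 12, "deadline": 7}, "low": {"newsletter": 30}},
    thresholds={"automove": 0.8},
)
print(kb.thresholds["automove"])
```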


The content analysis engine analyzes the content of the message. A prioritization module assigns a priority score and a priority label to the message based on its content. In one embodiment, the priority score is a whole number from 0 to 100, and the priority label is a designation such as high, medium, or low. The user provides feedback to the email productivity module by either accepting the prioritization assigned to the message, changing the priority label (e.g., changing a low priority message to high), or providing feedback by some other means to establish the priority of a selected message. The email productivity module may also infer information about the priority of the message from the actions of the user (implicit feedback).


In connection with the prioritization function, the email productivity module may advantageously provide a mechanism for filtering low-priority (but otherwise legitimate) email messages having content that is relatively unimportant to the recipient and which does not require a response therefrom. These low-priority e-mail messages, which are known colloquially in the art as “occupational spam,” may consist of enterprise-wide announcements, reminders, status reports, and the like, and may constitute a significant portion of the total number of emails received by employees, particularly high-level executives. However, existing junkmail filtering software is not capable of detecting and filtering occupational spam, and so a user may spend a substantial amount of time reading through such messages and taking appropriate action, such as deleting the messages or moving them to a folder for later review. Embodiments of the present invention address this deficiency by generating a priority score for each message, and then executing a predetermined action, such as deleting the message or moving it to a specified folder, if the priority score falls below a predetermined threshold representative of occupational junkmail having little importance and/or urgency to the user.
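A hedged sketch of the threshold behavior described above: if a message's priority score falls below a configurable cutoff, a predetermined action (here, moving it to a review folder) is taken instead of delivering it to the inbox. The dictionary-based message, the folder name, and the threshold value are illustrative assumptions.

```python
def handle_low_priority(message, priority_score, threshold=30, low_priority_folder="Low Priority"):
    """Illustrative sketch: act on 'occupational spam' whose priority score falls below a threshold.

    `message` is assumed to be a dict with a 'folder' key; the threshold value and
    folder name are hypothetical, user-configurable settings.
    """
    if priority_score < threshold:
        # Predetermined action: here, move to a review folder instead of the inbox.
        message["folder"] = low_priority_folder
    return message

msg = {"subject": "Weekly status report", "folder": "Inbox"}
print(handle_low_priority(msg, priority_score=12))  # moved to 'Low Priority'
```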


The sorting function determines a set of suggested folders for storing incoming messages. For each incoming message, the content analysis engine analyzes the content of the message. A message sorting module determines a confidence score for each available folder based on the message content and lists the top-scoring folders in descending order. The user provides feedback to the email productivity module by moving the message into a folder, or by other means of explicit or implicit feedback. In one embodiment, the user may enable the email productivity module to automatically move a message into a folder, for example if the message score for that folder exceeds a threshold.


The junkmail identifying function determines whether a message qualifies as junkmail (also known as unsolicited commercial email or spam). The content analysis engine analyzes the content of each incoming message, and a junkmail module determines a junkmail score based on the content of the message. The user provides feedback by moving the message to a junkmail folder, which provides positive feedback, or moving the message to a folder not associated with junkmail, which provides negative feedback. The user may also provide explicit feedback to the email productivity module by manually indicating that the message is junkmail or non-junkmail, for example by a keystroke or by clicking on a button. In one embodiment, the user may enable the email productivity module to automatically move a message into the junkmail folder or delete the message if that message's junkmail score exceeds a threshold.


The email productivity module may modify the user interface of the email application. The inbox view includes additional fields to show the priority score, priority level, and junkmail score. In one embodiment, the user interface also includes a toolbar with buttons related to the functions of the email productivity tool. For example, the toolbar includes a junkmail button for identifying a message as junkmail, a number of buttons for assigning a priority level to a message, and a drop-down list of suggested folders.


The email productivity module may also provide an intelligent search function that allows a user to select one or more messages and search stored messages in every folder to find messages with similar content (“search by example”).





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of one embodiment of an electronic communications network, according to the invention;



FIG. 2 is a block diagram of one embodiment of the email productivity module of FIG. 1, according to the invention;



FIG. 3 is a block diagram of one embodiment of the email productivity server of FIG. 1, according to the invention;



FIG. 4 is a diagram of one embodiment of an explorer window of the email application of FIG. 1, according to the invention;



FIG. 5 is a diagram of one embodiment of the message window of FIG. 4, according to the invention; and



FIG. 6 is a flowchart of method steps for processing a received email message according to one embodiment of the invention.





DETAILED DESCRIPTION OF THE INVENTION


FIG. 1 is a symbolic diagram of an implementation of one embodiment of the invention in a networked computing environment 100. A computing device 120 communicates over a network 110 with other computing devices, such as client computer 114. Network 110 may be any appropriate type of network or combination of networks, for example a local area network (LAN), a wide area network (WAN), a wireless network, a home network, the public switched telephone network (PSTN), or the public Internet. Computing device 120 may take the form of any suitable device capable of sending and receiving email messages, for example a desktop computer, a laptop computer, a set-top box, a handheld computing device, a handheld messaging device, a mobile telephone, or a dedicated email appliance. In a typical network environment, computing device 120 receives and sends email messages via at least one email server 117, which performs a standard set of store-and-forward, addressing, and routing functions. Computing device 120 may also communicate with an email productivity server 118 to effect various productivity functions, as described in further detail below.


Computing device 120 is provided with storage and a processor for storing and executing an email application 122 having an email productivity add-in 124, and an email productivity module 126. Email application 122 may be, for example, a widely utilized commercial application such as Microsoft Outlook®. Email productivity add-in 124 and email productivity module 126 cooperate to process incoming email messages before the messages are displayed by email application 122. More specifically, email productivity module 126 analyzes the content of incoming email messages and accordingly generates a set of scores and other attributes characterizing the content. Email productivity add-in 124 adds fields containing these scores and attributes to the message so that they may be displayed to the user by email application 122, or so that the presentation of incoming email messages in the user interface may be modified in accordance with the associated scores and attributes. Email productivity add-in 124 may also modify the user interface of email application 122 to allow a user to access functions of email productivity module 126. The contents and functionalities of email productivity module 126 are further discussed in conjunction with FIG. 2.



FIG. 2 is a block diagram of one embodiment of the email productivity module 126 of FIG. 1. Email productivity module 126 includes, but is not limited to, a content analysis engine 212, a prioritization module 214, a message sorting module 216, and a junkmail module 218. Prioritization module 214 has an associated prioritization knowledge base 224, message sorting module 216 has an associated sorting knowledge base 226, and junkmail module 218 has an associated junkmail knowledge base 228. It should be noted that the architecture depicted by FIG. 2, in which the functions of productivity module 126 are distributed among four separate and distinct component structures, is provided by way of example only, and that the productivity module may instead be implemented having a greater or lesser number of component structures.


Content analysis engine 212 analyzes the content of the message using language processing techniques to produce content information representative of the content of the email message. The content information may include, without limitation, scoring vectors or matrices indicative of the presence, absence, or incidence of concepts or terms in the message, or of the source and/or recipients of the email message. As used herein, the “content” of the message is intended to include without limitation all lexical and syntactical information in the body of the message, any attachments, and in all message headers such as sender, recipient, reply-to address, carbon copy (cc), subject, date, and time.
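One plausible (and deliberately simplified) way to produce content information is a term-count vector over the message body and headers, since both are treated as "content" above. The message fields and tokenization shown are assumptions, not the patent's actual representation.

```python
import re
from collections import Counter

def content_information(message: dict) -> Counter:
    """Illustrative sketch: derive simple content information (term counts) from the
    message body and its headers, since both are treated as 'content' of the message."""
    parts = [
        message.get("body", ""),
        message.get("subject", ""),
        message.get("from", ""),
        " ".join(message.get("to", [])),
    ]
    tokens = re.findall(r"[a-z0-9']+", " ".join(parts).lower())
    return Counter(tokens)

msg = {"from": "boss@example.com", "to": ["me@example.com"],
       "subject": "Quarterly deadline", "body": "Please send the report by Friday."}
print(content_information(msg).most_common(3))
```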


Prioritization module 214 is configured to generate priority information for email messages received or stored by computing device 120. More specifically, prioritization module 214 operates in conjunction with content analysis engine 212 to assign a priority score and a priority level to email messages. Content analysis engine 212 and prioritization module 214 apply the content information to prioritization knowledge base 224 to determine a priority score for the message. Prioritization module 214 then assigns a priority level to the email message based on the priority score. In an illustrative implementation, priority scores are whole numbers in a range of 0-100, and available priority levels are low, medium, and high. Scores 0-29 correspond to the low level, scores 30-70 correspond to the medium level, and scores 71-100 correspond to the high level. Other ranges of priority scores and other priority levels are within the scope of the invention. Additionally and/or alternatively, priority scores generated by the prioritization module may take the form of vectors, matrices, sets, or other data structures, wherein the universe of possible priority scores includes at least three different values or states.
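A minimal sketch of the score-to-level mapping described in the illustrative implementation above (0-29 low, 30-70 medium, 71-100 high):

```python
def priority_level(score: int) -> str:
    """Map a whole-number priority score (0-100) to a priority level using the
    ranges of the illustrative implementation described above."""
    if score <= 29:
        return "low"
    if score <= 70:
        return "medium"
    return "high"

assert priority_level(15) == "low"
assert priority_level(55) == "medium"
assert priority_level(88) == "high"
```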


Prioritization knowledge base 224 is a collection of information used by email productivity module 126 to make determinations regarding the priority of email messages. Prioritization knowledge base 224 may include any combination of statistical information about messages, the domain, or language in general, as well as rules, lexicons, thesauri, ontologies, and other natural language processing elements well known in the art. Prioritization knowledge base 224 may also include thresholds, likelihood tables, and any other information used to classify a message. Thresholds may be designated by the user, predetermined by the designers of email productivity module 126, or automatically adjusted by email productivity module 126. In one embodiment, prioritization knowledge base 224 stores prioritization information as models arranged in a branched structure, wherein each branch is a model corresponding to a priority level; e.g., the knowledge base may include a first model for high-priority messages, a second model for medium-priority messages, and a third model for low-priority messages. Prioritization module 214 applies the content information of a received message to each of the models in prioritization knowledge base 224 to determine the priority score, which reflects the relative priority of the message. Each model in prioritization knowledge base 224 may have the same threshold or each model may have a uniquely assigned threshold.
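The branch structure described above can be pictured as one statistical model per priority level, each scored against the message's content information. The sketch below uses hypothetical term-frequency models, a smoothed log-likelihood per model, and a softmax-style blend to produce a 0-100 score; these specific choices are assumptions for illustration, not the patent's method.

```python
import math
from collections import Counter

# Hypothetical prioritization knowledge base: one term-frequency model per priority level.
PRIORITIZATION_KB = {
    "high":   Counter({"urgent": 10, "deadline": 8, "customer": 6}),
    "medium": Counter({"meeting": 9, "update": 5}),
    "low":    Counter({"newsletter": 12, "reminder": 7, "announcement": 6}),
}

def model_score(content: Counter, model: Counter) -> float:
    """Log-likelihood-style score of the content under one model (add-one smoothing; assumption)."""
    total = sum(model.values()) + len(model) + 1
    return sum(n * math.log((model[t] + 1) / total) for t, n in content.items())

def priority_score(content: Counter) -> int:
    """Blend the per-level model scores into a whole number from 0 to 100
    (softmax weighting of the levels; the combination rule is an assumption)."""
    weights = {"low": 0.0, "medium": 50.0, "high": 100.0}
    scores = {lvl: model_score(content, m) for lvl, m in PRIORITIZATION_KB.items()}
    best = max(scores.values())
    expd = {lvl: math.exp(s - best) for lvl, s in scores.items()}
    z = sum(expd.values())
    return round(sum(weights[lvl] * e for lvl, e in expd.items()) / z)

print(priority_score(Counter({"urgent": 2, "deadline": 1})))  # scores near the high end
```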


It should be noted that the inclusion in the present invention of the prioritization module and its attendant functionality advantageously enables users to identify legitimate but low-priority email messages (which would not be filtered or otherwise flagged by conventional spam filtering software), and to take appropriate action with respect to the low-priority messages, such as deleting them or storing them in a folder for subsequent review at a convenient time. These legitimate but low-priority email messages, which have been referred to as “occupational spam”, may take the form of enterprise-wide announcements, reminders, status reports, and the like, and may constitute a significant portion of the total number of emails received by employees, particularly high-level executives.


Message sorting module 216 operates in conjunction with content analysis engine 212 to determine a set of suggested folders for a message. The set of suggested folders represents one or more folders in which are stored other emails having similar content, and in which the user would be most likely to store the message. Content analysis engine 212 applies the content information of the message to sorting knowledge base 226 to determine the set of suggested folders for the message. Sorting knowledge base 226 is a collection of information used by email productivity module 126 to analyze and classify messages. Sorting knowledge base 226 may include any combination of statistical information about messages, the domain, or language in general, as well as rules, lexicons, thesauri, ontologies, and other natural language processing elements well known in the art. Sorting knowledge base 226 may also include thresholds, likelihood tables, and any other information used to classify a message. Thresholds may be designated by the user, predetermined by the designers of email productivity module 126, or automatically adjusted by email productivity module 126. In one embodiment, sorting knowledge base 226 is organized in a branched structure and includes on each branch a model corresponding to a folder in email application 122 or other class characterizing the content of email messages. Message sorting module 216 generates a score in connection with each model in sorting knowledge base 226, the score being representative of the likelihood that the email should be classified in the folder to which the model corresponds. Message sorting module 216 then identifies a set of suggested folders. In one embodiment, the set of suggested folders includes the ten highest scoring folders. Each model in sorting knowledge base 226 may have the same threshold or each model may have a uniquely assigned threshold.
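Similarly, the sorting knowledge base can be pictured as one model per folder, with the message scored against each model and the best matches returned in descending order. The per-folder term-frequency models and the likelihood-style scoring below are illustrative assumptions.

```python
import math
from collections import Counter

# Hypothetical sorting knowledge base: one term-frequency model per folder.
SORTING_KB = {
    "Customers": Counter({"order": 8, "invoice": 5, "customer": 7}),
    "Meetings":  Counter({"meeting": 10, "agenda": 6, "minutes": 4}),
    "Travel":    Counter({"flight": 9, "hotel": 6, "itinerary": 5}),
}

def folder_score(content: Counter, model: Counter) -> float:
    """Likelihood-style score of the content under one folder model (assumption)."""
    total = sum(model.values()) + len(model) + 1
    return sum(n * math.log((model[t] + 1) / total) for t, n in content.items())

def suggested_folders(content: Counter, top_n: int = 10):
    """Return up to the top-N scoring folders in descending order of score."""
    scored = [(folder, folder_score(content, model)) for folder, model in SORTING_KB.items()]
    return sorted(scored, key=lambda fs: fs[1], reverse=True)[:top_n]

print(suggested_folders(Counter({"meeting": 2, "agenda": 1})))
```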


Junkmail module 218 operates in conjunction with content analysis engine 212 to determine a junkmail score for each message. Content analysis engine 212 applies the content information of the message to junkmail knowledge base 228. Junkmail knowledge base 228 is a collection of information used by email productivity module 126 to analyze and classify messages. Junkmail knowledge base 228 may include any combination of statistical information about messages, the domain, or language in general, as well as rules, lexicons, thesauri, ontologies, and other natural language processing elements well known in the art. Junkmail knowledge base 228 may also include thresholds, likelihood tables, and any other information used to classify a message. Thresholds may be designated by the user, predetermined by the designers of email productivity module 126, or automatically adjusted by email productivity module 126. In one embodiment, junkmail knowledge base 228 is organized into a branched structure and includes on one branch a junk model and on the other a not-junk model. The score generated when the content information is applied to the junk model is the junkmail score for the message. In one embodiment, the junkmail score is a whole number in a range of 0-100. The higher the junkmail score, the higher the probability that the message is junkmail according to the criteria in junkmail knowledge base 228. Each model in junkmail knowledge base 228 may have the same threshold or each model may have a uniquely assigned threshold.
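A minimal sketch of the two-branch junkmail knowledge base: the content information is scored against a junk model and a not-junk model, and the contrast between them is mapped to a 0-100 junkmail score. The logistic contrast used here is an assumption; the patent only states that a higher score means a higher probability of junkmail.

```python
import math
from collections import Counter

# Hypothetical junkmail knowledge base: one 'junk' model and one 'not junk' model.
JUNKMAIL_KB = {
    "junk":     Counter({"free": 10, "winner": 8, "offer": 9, "unsubscribe": 6}),
    "not_junk": Counter({"meeting": 9, "project": 7, "report": 6}),
}

def _loglik(content: Counter, model: Counter) -> float:
    total = sum(model.values()) + len(model) + 1
    return sum(n * math.log((model[t] + 1) / total) for t, n in content.items())

def junkmail_score(content: Counter) -> int:
    """Whole-number score 0-100; higher means more likely junk according to the
    junk model (the logistic combination of the two models is an assumption)."""
    junk = _loglik(content, JUNKMAIL_KB["junk"])
    not_junk = _loglik(content, JUNKMAIL_KB["not_junk"])
    p_junk = 1.0 / (1.0 + math.exp(not_junk - junk))
    return round(100 * p_junk)

print(junkmail_score(Counter({"free": 2, "offer": 1})))       # high score
print(junkmail_score(Counter({"project": 1, "report": 2})))   # low score
```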


Prioritization module 214, message sorting module 216, and junkmail module 218 supply the priority score and priority level, the set of suggested folders, and the junkmail score, respectively, to email productivity add-in 124. Email productivity add-in 124 attaches fields for the priority score, priority level, set of suggested folders, and junkmail score to the message for display by email application 122. The user interface of email application 122 modified by email productivity add-in 124 is discussed below in conjunction with FIGS. 4 & 5.


Prioritization knowledge base 224, sorting knowledge base 226, and junkmail knowledge base 228 are updated by feedback from user input to email application 122. Email productivity add-in 124 monitors user inputs to email application 122 and reports these inputs as feedback to email productivity module 126. Prioritization module 214 may update prioritization knowledge base 224 using both explicit and implicit feedback. In other words, the terms “feedback” and “user action” are not limited to affirmative actions taken by the user, but rather extend to actions wherein the user accepts the result produced by the productivity module and/or refrains from modifying the result. The user may provide explicit feedback, for example, by changing a priority level that has been assigned to a message. For example, if a message has been assigned a priority level of low and the user feels that the message has a higher priority, then the user can assign a new priority level to the message, such as high. When the user assigns a new priority level to a message, prioritization module 214 assigns the message a priority score in the range of scores for that priority level. In one embodiment, prioritization module 214 assigns a score in the middle of the range of possible scores for the priority level assigned by the user. If the user does not provide explicit priority feedback for a message and does not delete the message but instead takes another action with respect to the email, implicit feedback may be provided to prioritization knowledge base 224. If the user reads or otherwise deals with messages in order of the relative priority scores, this action provides positive feedback to prioritization knowledge base 224. If the user reads messages in an order other than in descending order of priority score, this user action may provide negative feedback to prioritization knowledge base 224. Thus, prioritization knowledge base 224 is updated with feedback for every message received by email application 122. Updating of the prioritization knowledge base may be effected, for example, by adapting the models corresponding to each priority level in accordance with the feedback generated by user action. The feedback is preferably supplied to the prioritization module immediately after the relevant user action is taken to allow real-time adaptation of the models, thereby improving the accuracy and reliability of the prioritization function.
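Two pieces of the feedback loop above lend themselves to a short sketch: assigning a mid-range score when the user explicitly reassigns a priority level, and folding the message's content back into the corresponding model. The simple count-accumulation update rule is an assumption for illustration.

```python
PRIORITY_RANGES = {"low": (0, 29), "medium": (30, 70), "high": (71, 100)}

def score_from_explicit_feedback(new_level: str) -> int:
    """When the user reassigns a priority level, give the message a score in the
    middle of that level's range, as in the embodiment described above."""
    lo, hi = PRIORITY_RANGES[new_level]
    return (lo + hi) // 2

def update_model(model: dict, content: dict, learning_weight: int = 1) -> dict:
    """Illustrative real-time adaptation step: fold the message's term counts into
    the model for the level the feedback confirmed (the update rule is an assumption)."""
    for term, count in content.items():
        model[term] = model.get(term, 0) + learning_weight * count
    return model

print(score_from_explicit_feedback("high"))  # 85
print(update_model({"urgent": 10}, {"urgent": 2, "deadline": 1}))
```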


Message sorting module 216 updates sorting knowledge base 226 using feedback generated by user action, for example moving the received message to a selected folder for storage. When the user moves a message to a folder, email productivity add-in 124 provides explicit feedback to message sorting module 216. The user can move the message to one of the set of suggested folders or to any other folder, or create a new folder. When the user creates a new folder in email application 122, message sorting module 216 creates a new model in sorting knowledge base 226. Email productivity module 126 preferably learns in real time how the user prefers to store messages by updating sorting knowledge base 226 with feedback for each message moved into a folder.
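A sketch of this sorting-feedback step, assuming the same Counter-based folder models as in the earlier sketches: moving a message into a folder updates that folder's model, and moving it into a brand-new folder creates a new model.

```python
from collections import Counter

def record_move(sorting_kb: dict, folder: str, content: Counter) -> dict:
    """Illustrative sketch: when the user moves a message into a folder, update that
    folder's model; create a new model if the folder was just created."""
    model = sorting_kb.setdefault(folder, Counter())  # new folder -> new model
    model.update(content)                             # real-time adaptation from the move
    return sorting_kb

kb = {}
record_move(kb, "Travel", Counter({"flight": 1, "hotel": 2}))
print(kb["Travel"])
```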


Categories may be used to search for or locate messages, whether or not used in conjunction with a keyword search engine. In one embodiment, message sorting module 216 operates in conjunction with a category function native to email application 122. If the category function is enabled, a category is created for each folder of email application 122 and the user designates a category threshold. If a message receives one or more folder scores that exceed the category threshold, then message sorting module 216 assigns the categories for these folders to the message. Although a message is stored in only one folder, the message may be assigned a plurality of categories. The user is then able to locate messages by category. Other implementations of a category function are within the scope of the invention, for example a category function in which the category names do not correspond to a folder of email application 122.
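The category logic reduces to a comparison of each folder score against the user-designated category threshold; the normalized 0-1 scores below are illustrative.

```python
def assign_categories(folder_scores: dict, category_threshold: float) -> list:
    """Assign the message every category whose folder score exceeds the
    user-designated category threshold (a message may receive several categories
    even though it is stored in only one folder)."""
    return [folder for folder, score in folder_scores.items() if score > category_threshold]

print(assign_categories({"Customers": 0.91, "Meetings": 0.35, "Travel": 0.74}, 0.7))
# ['Customers', 'Travel']
```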


In one embodiment, message sorting module 216 includes an automove function that, when enabled by the user, will automatically move a received message to the highest scoring folder when the score for that folder exceeds a user-designated threshold value. For example, if message sorting module 216 determines a highest scoring folder with a score that exceeds the threshold for a received message, message sorting module 216 will notify email productivity add-in 124 to cause email application 122 to move the received message directly to the highest scoring folder instead of the inbox. When a message has been automoved to a folder, email productivity add-in 124 may provide implicit feedback to message sorting module 216 if the user reads the message and does not move it to another folder. If the user does move the message to a different folder, email productivity add-in 124 provides explicit feedback to message sorting module 216.
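A sketch of the automove decision, again with illustrative normalized folder scores and a hypothetical "Inbox" fallback for the base folder.

```python
def automove(message: dict, folder_scores: dict, threshold: float, enabled: bool = True) -> dict:
    """Illustrative automove: deliver straight to the highest-scoring folder when that
    folder's score exceeds the user-designated threshold; otherwise leave in the inbox."""
    if enabled and folder_scores:
        best_folder, best_score = max(folder_scores.items(), key=lambda fs: fs[1])
        if best_score > threshold:
            message["folder"] = best_folder
            return message
    message["folder"] = "Inbox"
    return message

print(automove({"subject": "Itinerary"}, {"Travel": 0.92, "Meetings": 0.2}, threshold=0.8))
```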


Junkmail module 218 updates junkmail knowledge base 228 using explicit and implicit feedback generated by user actions. The user provides explicit feedback to junkmail module 218, for example, by taking an action indicating that a message is junkmail, such as moving it to a junkmail folder. If the user reads a message and does not delete it or indicate that it is junkmail, junkmail module 218 receives implicit feedback that the message is not junkmail. If the user moves a message out of the junkmail folder, junkmail module 218 receives implicit feedback that the message is not junkmail. If the user deletes a message from the junkmail folder, junkmail module 218 receives implicit feedback that the message is junkmail. In one embodiment, junkmail module 218 has an automove function that, when enabled by the user, will automatically move a received message to a junkmail folder if the junkmail score for that message exceeds a user-designated threshold. Email productivity module 126 learns in real time the user's criteria for junkmail by updating junkmail knowledge base 228 with feedback for each received message.
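The explicit and implicit junkmail feedback rules above can be summarized as a small mapping from user actions to positive, negative, or no feedback. The action names are hypothetical labels for the behaviors described.

```python
def junkmail_feedback(action: str, source_folder: str = "", target_folder: str = ""):
    """Map user actions to junkmail feedback, following the rules described above.
    Returns True for 'is junk', False for 'is not junk', None for no signal.
    (The action names are illustrative.)"""
    if action == "mark_junk" or (action == "move" and target_folder == "Junkmail"):
        return True                                   # explicit: message is junkmail
    if action == "move" and source_folder == "Junkmail":
        return False                                  # moved out of the junkmail folder
    if action == "delete" and source_folder == "Junkmail":
        return True                                   # deleted from the junkmail folder
    if action == "read":
        return False                                  # read and kept: implicit not-junk
    return None

print(junkmail_feedback("move", source_folder="Inbox", target_folder="Junkmail"))  # True
```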


In one embodiment, junkmail knowledge base 228 may be updated using information provided from an external source. For example, an email service provider could implement a service whereby it generates rules and other information for filtering junkmail messages. Junkmail module 218 may, upon receipt of such rules and other information (which may be supplied, for example, through network 110), update junkmail knowledge base 228 in accordance with the received information.


In one embodiment, email productivity module 126 includes an intelligent search function. The user may select a message, and instruct email productivity module 126 to find other similar messages. Content analysis engine 212 analyzes the selected message to identify concepts of the message, and then analyzes stored messages in all folders to identify similar messages.
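A common way to implement "search by example" is to compare term vectors with cosine similarity; the patent does not specify the similarity measure, so the sketch below is an assumption.

```python
import math
import re
from collections import Counter

def vectorize(text: str) -> Counter:
    return Counter(re.findall(r"[a-z0-9']+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def search_by_example(selected_text: str, stored: dict, top_n: int = 5):
    """Rank stored messages from all folders by similarity to the selected message."""
    query = vectorize(selected_text)
    ranked = sorted(stored.items(), key=lambda kv: cosine(query, vectorize(kv[1])), reverse=True)
    return ranked[:top_n]

stored = {"msg1": "quarterly sales report attached", "msg2": "team lunch on friday"}
print(search_by_example("please review the attached sales report", stored))
```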


In one embodiment, email productivity module 126 includes a folder management function. When enabled by the user, email productivity module 126 will analyze the content of previously stored messages in each folder and identify messages that may not belong in a particular folder according to the information in sorting knowledge base 226. The user may then decide to move such an identified message to a folder suggested by email productivity module 126, move the message to some other folder, or not move the message. Each of these actions provides feedback to message sorting module 216 and sorting knowledge base 226. The folder management function may also suggest to the user that certain folders be combined into a single folder, or a single folder be split up into multiple folders.


In one embodiment, email application 122 may forward copies of email messages to a wireless device. Email productivity module 126 can enhance this function by selectively forwarding copies of messages to a wireless device based on analysis of content. For example, the user may designate that messages that have a priority level of high should be forwarded to a wireless device. In another example, the user may designate that only messages that are automoved to a certain folder are to be copied to a wireless device. Email productivity module 126 can also enhance other functions of email application 122, for example a text-to-voice function that only converts messages having a high priority level to voice.


Email productivity module 126 may perform other automatic operations based on results generated by content analysis engine 212 and other components of the email productivity module. Such other automatic operations may include auto-forward to another email address, notification to the user that a message from a particular address has been received, automatically using a web browser to open an embedded hyperlink, or automatically opening an attachment.


It is noted that although the embodiment of the invention described and depicted herein includes prioritization, sorting, and junkmail filtering functions, this embodiment is intended as illustrative rather than limiting, and other embodiments within the scope of the invention may include a subset or superset of these functions.



FIG. 3 is a block diagram of one embodiment of the email productivity server 118 of FIG. 1, according to the invention. Email productivity server 118 includes, but is not limited to, a content analysis engine 312, client knowledge bases 314, and a backup and synchronization module 316. In the FIG. 3 embodiment of email productivity server 118, content analysis engine 312 performs the functions of content analysis engine 212 (FIG. 2) for clients that do not have content analysis engine 212 (i.e., for “thin clients” having limited storage and/or data processing capabilities). Client knowledge bases 314 include a prioritization knowledge base, a sorting knowledge base, and a junkmail knowledge base for each client in network 100 that does not have resident knowledge bases. It will be appreciated that two or more clients may share a common set of knowledge bases, or alternatively the productivity server 118 may store a separate knowledge base for each client. In another embodiment, client knowledge bases 314 include a global junkmail knowledge base that includes information adapted in accordance with the information in the clients' junkmail knowledge bases 228. This architecture allows feedback regarding junkmail to be utilized by each client in network 100, thereby improving filtering accuracy for all clients.


Backup and synchronization module 316 is configured to copy the clients' knowledge bases as a backup in case of failure at one of the clients in network 100. Backup and synchronization module 316 is also configured to synchronize the information in the clients' junkmail knowledge bases 228. Backup and synchronization module 316 may also manage licensing of email productivity module 126 and email productivity add-in 124 on clients 120. Backup and synchronization module 316 may also synchronize the knowledge bases for users that use multiple instances of email application 122.



FIG. 4 is a diagram of one embodiment of an explorer window 400 of the user interface of email application 122 of FIG. 1, according to an embodiment of the invention. Window 400 includes, but is not limited to, a window toolbar 410, an email toolbar 420, a productivity toolbar 430, a folder list 450, and a message window 460. Window toolbar 410 includes a file menu button 412, an edit menu button 414, and a view menu button 416. Window toolbar 410 may include any other appropriate menu buttons. Email toolbar 420 includes a new button 422 configured to open a new blank message, a reply button 424 configured to open a reply message, and a forward button 426 configured to open a forward message window. Email toolbar 420 may include any other appropriate buttons.


Productivity toolbar 430 includes buttons that allow the user to provide input to email productivity module 126. Productivity toolbar 430 includes, but is not limited to, a folder pull-down menu 432, a move button 436, a productivity module (PM) button 438, a junkmail button 440, a high button 442, a medium (med) button 444, a low button 446, and a score button 448. A window of folder pull-down menu 432 displays a highest scoring folder for a selected message in message window 460. The user can actuate an arrow button 434 to view the rest of the set of suggested folders and a “select a folder” choice (to select a folder that doesn't appear in the set of suggested folders). Folder pull-down menu 432 shows the folder name and score for each suggested folder. Actuating move button 436 moves the selected message to the folder currently selected in folder pull-down menu 432.


Actuating junkmail button 440 provides explicit feedback to junkmail module 218 that the selected message in message window 460 is junkmail. In one embodiment, actuating junkmail button 440 also moves the selected message to the junkmail folder or deletes the selected message. Actuating high button 442, medium button 444, or low button 446 provides explicit feedback to prioritization module 214 regarding the priority of the selected message in message window 460. Actuating score button 448 causes email productivity module 126 to score (or re-score) a selected message or messages in message window 460. In response to input from score button 448, the selected message or messages will be scored by prioritization module 214, message sorting module 216, and junkmail module 218.


Actuating productivity module (PM) button 438 opens a productivity module window (not shown) that allows the user to modify certain aspects of email productivity module 126. A message sorting tab allows the user to enable or disable an automove function for message sorting module 216, designate a sorting score threshold, enable or disable categories, designate a category threshold, and select a base folder. A junkmail tab allows the user to enable or disable an automove function for junkmail module 218 and designate a junkmail score threshold. The productivity module window may also include a help button that provides access to help regarding the functions of email productivity module 126.


Folder list 450 includes, but is not limited to, an inbox folder 452, a plurality of folders 454, a junkmail folder 456, and a deleted items folder 458. The user can select any of these folders to view the messages in that folder in message window 460. The user can move a message to a folder in folder list 450 by dragging and dropping the message from message window 460 to the folder in folder list 450. As discussed above, moving a message into a folder in folder list 450 provides feedback to message sorting module 216, and moving a message into junkmail folder 456 provides feedback to junkmail module 218.



FIG. 5 is a diagram of one embodiment of message window 460 of FIG. 4. Message window 460 displays messages for a selected folder. In the FIG. 5 example, message window 460 displays messages in the inbox of email application 122. Message window 460 includes four messages 532, 534, 536, and 538. Message window 460 further includes a junkmail column 512, a priority score column 514, a priority level column 516, a “from” column 518, a subject column 520, and a received column 522. Other embodiments of message window 460 may contain additional columns.


Junkmail column 512 shows a junkmail score for each message as determined by junkmail module 218. Priority score column 514 shows a priority score for each message and priority level column 516 shows a priority level for each message as determined by prioritization module 214. Message 532 is the currently selected message. The set of suggested folders for message 532 determined by message sorting module 216 is shown in folder pull-down menu 432 (FIG. 4).


It is noted that different and/or additional visual indications may be employed to represent the junkmail score, priority score, and priority level of listed email messages. For example, messages having a high priority score and level may be presented in a brightly colored font or another manner that draws the user's attention, whereas messages assigned a low priority score can be presented in a duller font. In addition, the message window may be configured to sort the messages in order of priority score or another user-selected field such that messages having relatively greater importance to the user are grouped together above and apart from messages having relatively lesser importance.



FIG. 6 is a flowchart of method steps for processing a received email message according to one embodiment of the invention. Although the steps of FIG. 6 are discussed in the context of email productivity module 126 and email productivity add-in 124, any other means configured to perform the steps are within the scope of the invention. In step 610, email productivity module 126 receives an email message. In step 612, content analysis engine 212 analyzes the content of the message using language processing techniques. Then, in step 614, prioritization module 214 applies the content information to prioritization knowledge base 224, producing a priority score and a priority level. At about the same time, in step 616, message sorting module 216 applies the content information to sorting knowledge base 226, producing a set of suggested folders. At about the same time, in step 618, junkmail module 218 applies the content information to junkmail knowledge base 228, producing a junkmail score for the message.


Then, in step 620, email productivity module 126 determines whether a folder automove function is currently enabled. If the folder automove function is not enabled, the method continues with step 622. If the folder automove function is enabled, then in step 626 email productivity module 126 determines whether the score of the highest scoring folder of the set of suggested folders exceeds the designated threshold. If the score of the highest scoring folder does not exceed the designated threshold, then the method continues with step 622. If the score of the highest scoring folder exceeds the designated threshold, then in step 632, email productivity add-in 124 adds the priority score, priority level, set of suggested folders, and junkmail score fields to the message. Then, in step 638, email productivity add-in 124 sends the message to the highest scoring folder of email application 122.


In step 622, email productivity module 126 determines whether a junkmail automove function is enabled. If the junkmail automove function is enabled, the method continues with step 624. If the junkmail automove function is not enabled, then in step 628 email productivity add-in 124 adds the priority score, priority level, set of suggested folders, and junkmail score fields to the message. In step 634, email productivity add-in 124 sends the message to the base folder, such as the inbox, of email application 122.


In step 624, email productivity module 126 determines whether the junkmail score for the message exceeds the designated threshold. If the junkmail score does not exceed the designated threshold, then the method continues with step 628, discussed above. If the junkmail score exceeds the designated threshold, then in step 630 email productivity add-in 124 adds the priority score, priority level, set of suggested folders, and junkmail score fields to the message. In step 636, email productivity add-in 124 sends the message to the junkmail folder of email application 122.
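Putting the FIG. 6 branches together, the post-scoring flow (steps 620 through 638) might look like the following sketch. The field names in `scores` and `settings` are assumptions, and `suggested_folders` is assumed to be ordered with the highest-scoring folder first.

```python
def process_message(msg: dict, scores: dict, settings: dict) -> dict:
    """Illustrative end-to-end sketch of the FIG. 6 flow after scoring (steps 620-638).

    `scores` is assumed to hold 'priority_score', 'priority_level',
    'suggested_folders' (folder name -> score, highest first) and 'junkmail_score';
    `settings` holds the automove flags and thresholds.
    """
    # Attach the productivity fields for display by the email application.
    msg.update(scores)

    # Steps 620/626/632/638: folder automove.
    if settings.get("folder_automove") and scores["suggested_folders"]:
        best_folder, best_score = next(iter(scores["suggested_folders"].items()))
        if best_score > settings["folder_threshold"]:
            msg["folder"] = best_folder
            return msg

    # Steps 622/624/630/636: junkmail automove.
    if settings.get("junkmail_automove") and scores["junkmail_score"] > settings["junkmail_threshold"]:
        msg["folder"] = "Junkmail"
        return msg

    msg["folder"] = "Inbox"  # steps 628/634: deliver to the base folder
    return msg

scores = {"priority_score": 85, "priority_level": "high",
          "suggested_folders": {"Customers": 0.9, "Meetings": 0.3}, "junkmail_score": 4}
settings = {"folder_automove": True, "folder_threshold": 0.8,
            "junkmail_automove": True, "junkmail_threshold": 80}
print(process_message({"subject": "Order update"}, scores, settings))
```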


It is noted that the steps set forth above are preferably performed automatically upon receipt of each incoming email. However, in other embodiments and implementations, the foregoing steps or portions thereof may be performed only on selected messages, at specified intervals, or responsively to a user request.


The invention has been described above with reference to specific embodiments. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention as set forth in the appended claims. The foregoing description and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.

Claims
  • 1. An electronic mail apparatus, comprising: a computing device capable of being connected to a network; and an email productivity module, executed by the computing device, and configured to interact with an existing email application executed by the computing device that sends and receives email messages over the network, wherein the email productivity module includes: a content analysis engine, executed by the computing device, configured to analyze a received email message to generate content information representative of a content of the received email message; a prioritization module, executed by the computing device, having at least one prioritization knowledge base implemented on the computing device, the prioritization module being configured to apply the content information to the at least one prioritization knowledge base to determine at least one priority score for the received email message that reflects a relative priority of the received email message as a legitimate email message and to assign at least one priority level to the received email message based on the priority score that reflects a range of priority scores; a message sorting module, executed by the computing device, having at least one sorting knowledge base implemented on the computing device, the message sorting module being configured to apply the content information to the at least one sorting knowledge base to determine a set of suggested folders for the received email message that represent one or more folders in which are stored other emails having similar content and in which a user would be most likely to store the received email message; and a junkmail module, executed by the computing device, having at least one junkmail knowledge base implemented on the computing device, the junkmail module being configured to apply the content information to the at least one junkmail knowledge base to determine a junkmail score for the received email message that represents a probability that the received email message is junkmail, and the junkmail module being configured to cause a user interface of the existing email application to modify a presentation of the received email message in accordance with the junkmail score; the email productivity module being configured to attach fields for the priority score, priority level, set of suggested folders, and junkmail score to the received email message for display by the existing email application; the email productivity module being configured to receive user feedback to the existing email application indicative of a user action taken with respect to the received email message, and to cause the computing device to adapt the at least one prioritization knowledge base, the at least one sorting database, or the at least one junkmail database, in accordance with the user feedback; wherein the at least one prioritization knowledge base is adapted by the computing device in accordance with explicit user feedback in an event that the user modifies the at least one priority level or the at least one priority score produced by the prioritization module and attached to the received email message for display; and wherein the at least one prioritization knowledge base is adapted by the computing device in accordance with implicit user feedback in an event that the user does not modify the at least one priority level or the at least one priority score produced by the prioritization module and attached to the received email message for display.
  • 2. The apparatus of claim 1, wherein the email productivity module responsively adapts the at least one prioritization knowledge base immediately upon receipt of the user feedback.
  • 3. The apparatus of claim 1, wherein the email application is caused to display both the at least one priority level and the at least one priority score in connection with the received email message.
  • 4. The apparatus of claim 1, the message sorting module being configured to cause the user interface of the email application to modify the presentation of the received email message to display the suggested folders.
  • 5. The apparatus of claim 1, wherein the sorting knowledge base is updated by the explicit user feedback if the user moves the received email message to a folder.
  • 6. The apparatus of claim 1, wherein the at least one junkmail knowledge base is updated by the explicit user feedback if the user indicates that the received email message is junkmail.
  • 7. The apparatus of claim 1, wherein the content analysis engine analyzes the content of the received email message by analyzing a text of the received email message using natural language processing techniques.
  • 8. The apparatus of claim 1, wherein the email productivity module includes an intelligent search function that allows a user to select one or more messages and search stored messages in every folder to find messages with similar content.
  • 9. The apparatus of claim 1, wherein the email productivity module includes a folder management function that analyzes the content of previously stored messages in each folder and identifies messages that may not belong in a particular folder according to information in the at least one sorting knowledge base.
  • 10. The apparatus of claim 1, wherein the email productivity module is configured to execute a prespecified operation if the priority score exceeds a predefined threshold.
  • 11. The apparatus of claim 10, wherein the prespecified operation comprises forwarding the received email message to a wireless device.
  • 12. The apparatus of claim 10, wherein the prespecified operation comprises tagging the received email message with a relevant tag.
  • 13. The apparatus of claim 1, wherein the email productivity module is further configured to forward the received email message to a wireless device if a highest scoring folder matches a predetermined folder.
  • 14. The apparatus of claim 1, wherein the message sorting module is further configured to automatically move the received email message to a folder if a score for the folder exceeds a predetermined or dynamic threshold.
  • 15. The apparatus of claim 1, wherein the junkmail module is further configured to automatically move the received email message to a junkmail folder if the junkmail score for the received email message exceeds a predetermined or dynamic threshold.
  • 16. The apparatus of claim 1, wherein the junkmail module is further configured to automatically delete the received email message if the junkmail score for the received email message exceeds a predetermined or dynamic threshold.
  • 17. The apparatus of claim 1, wherein the message sorting module is configured to assign a folder to the received email message if a folder score exceeds a category threshold.
  • 18. The apparatus of claim 17, wherein the email productivity module is further configured to organize or tag received email messages according to folder.
  • 19. The apparatus of claim 1, wherein the at least one prioritization knowledge base is structured as a set of models, each model uniquely corresponding to one of a plurality of priority levels.
  • 20. The apparatus of claim 1, wherein the sorting knowledge base is structured as a set of models, each model uniquely corresponding to one of a set of folders.
  • 21. The apparatus of claim 1, wherein the at least one junkmail knowledge base is structured as a set of two models, consisting of a junk model and a nonjunk model.
  • 22. A method for providing increased email productivity, comprising: receiving an email message at an email productivity module, executed by a computing device capable of being connected to a network, and configured to interact with an existing email application, executed by the computing device, that sends and receives email messages over the network, wherein the email productivity module performs the steps of: analyzing the email message, in a content analysis engine executed by the computing device, to generate content information representative of a content of the email message; applying the content information, in a prioritization module executed by the computing device, to at least one prioritization knowledge base implemented on the computing device to determine at least one priority score for the email message that reflects a relative priority of the email message as a legitimate email message and to assign at least one priority level to the email message based on the priority score that reflects a range of priority scores; applying the content information, in a message sorting module executed by the computing device, to at least one sorting knowledge base implemented on the computing device to determine a set of suggested folders for the email message that represent one or more folders in which are stored other emails having similar content and in which a user would be most likely to store the email message; and applying the content information, in a junkmail module executed by the computing device, to at least one junkmail knowledge base implemented on the computing device to determine a junkmail score for the email message that represents a probability that the email message is junkmail, and to cause a user interface of the existing email application to modify a presentation of the email message in accordance with the junkmail score; the email productivity module being configured to attach fields for the priority score, priority level, set of suggested folders, and junkmail score to the email message for display by the existing email application; the email productivity module being configured to receive user feedback to the existing email application indicative of a user action taken with respect to the email message and to cause the computing device to adapt the at least one prioritization knowledge base, the at least one sorting database, or the at least one junkmail database, in accordance with the user feedback; wherein the at least one prioritization knowledge base is adapted by the computing device in accordance with explicit user feedback in an event that the user modifies the at least one priority level or the at least one priority score produced by the prioritization module and attached to the email message for display; and wherein the at least one prioritization knowledge base is adapted by the computing device in accordance with implicit user feedback in an event that the user does not modify the at least one priority level or the at least one priority score produced by the prioritization module and attached to the email message for display.
  • 23. The method of claim 22, wherein analyzing the email message to determine content includes using natural language processing techniques.
  • 24. The method of claim 22, further comprising updating the sorting knowledge base using the user feedback.
  • 25. The method of claim 22, further comprising automatically moving the email message to a folder if a score for that folder exceeds a predetermined or dynamic threshold.
  • 26. The method of claim 22, further comprising updating the at least one junkmail knowledge base using the user feedback.
  • 27. The method of claim 22, further comprising automatically moving the email message to a junkmail folder if the junkmail score exceeds a predetermined or dynamic threshold.
  • 28. The method of claim 22, further comprising deleting the email message if the junkmail score exceeds a predetermined or dynamic threshold.
  • 29. The method of claim 22, further comprising forwarding the email message to a wireless device if the priority score exceeds a predetermined or dynamic threshold.
  • 30. The method of claim 22, further comprising: selecting at least one stored email message; analyzing the selected at least one stored email message to generate content information; and searching among other stored email messages to identify email messages with content information similar to the content information of the selected stored email message.
  • 31. The method of claim 22, further comprising reorganizing stored email messages based on the content information.
  • 32. The method of claim 22, wherein categories or tags characterizing the content of the email message are displayed to the user interface.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of patent application Ser. No. 09/602,588 filed on Jun. 21, 2000, by the same title and inventor. This application is a continuation-in-part of U.S. patent application Ser. No. 10/112,230, entitled “System and Method for Determining a Set of Attributes Based on Content of Communications,” filed Mar. 27, 2002. This application is also related to U.S. patent application Ser. No. 09/754,179, entitled “System and Method for Electronic Communication Management,” filed Jan. 3, 2001. The subject matter of the related applications is hereby incorporated by reference. The related applications are commonly assigned.

US Referenced Citations (369)
Number Name Date Kind
3648253 Mullery et al. Mar 1972 A
4110823 Cronshaw et al. Aug 1978 A
4286322 Hoffman et al. Aug 1981 A
4586160 Amano et al. Apr 1986 A
4642756 Sherrod Feb 1987 A
4658370 Erman et al. Apr 1987 A
4724523 Kucera Feb 1988 A
4805107 Kieckhafer et al. Feb 1989 A
4814974 Narayanan et al. Mar 1989 A
4908865 Doddington et al. Mar 1990 A
4918735 Morito et al. Apr 1990 A
4942527 Schumacher Jul 1990 A
4984178 Hemphill et al. Jan 1991 A
5018215 Nasr et al. May 1991 A
5023832 Fulcher et al. Jun 1991 A
5040141 Yazima et al. Aug 1991 A
5051924 Bergeron et al. Sep 1991 A
5060155 Van Zuijlen Oct 1991 A
5067099 McCown et al. Nov 1991 A
5068789 van Vliembergen Nov 1991 A
5099425 Kanno et al. Mar 1992 A
5101349 Tokuume et al. Mar 1992 A
5111398 Nunberg et al. May 1992 A
5125024 Gokcen et al. Jun 1992 A
5210872 Ferguson et al. May 1993 A
5228116 Harris et al. Jul 1993 A
5230054 Tamura Jul 1993 A
5247677 Welland et al. Sep 1993 A
5251129 Jacobs Oct 1993 A
5251131 Masand et al. Oct 1993 A
5265033 Vajk et al. Nov 1993 A
5278942 Bahl et al. Jan 1994 A
5287430 Iwamoto Feb 1994 A
5321608 Namba et al. Jun 1994 A
5325298 Gallant Jun 1994 A
5325526 Cameron et al. Jun 1994 A
5345501 Shelton Sep 1994 A
5349526 Potts et al. Sep 1994 A
5365430 Jagadish Nov 1994 A
5369570 Parad Nov 1994 A
5369577 Kadashevich et al. Nov 1994 A
5371807 Register et al. Dec 1994 A
5377354 Scannell et al. Dec 1994 A
5418717 Su et al. May 1995 A
5418948 Turtle May 1995 A
5437032 Wolf et al. Jul 1995 A
5444820 Tzes et al. Aug 1995 A
5475588 Schabes et al. Dec 1995 A
5483466 Kawahara et al. Jan 1996 A
5487100 Kane Jan 1996 A
5493677 Balogh et al. Feb 1996 A
5493692 Theimer et al. Feb 1996 A
5506787 Muhlfeld et al. Apr 1996 A
5526521 Fitch et al. Jun 1996 A
5528701 Aref Jun 1996 A
5542088 Jennings, Jr. et al. Jul 1996 A
5555344 Zunkler Sep 1996 A
5559710 Shahraray et al. Sep 1996 A
5577241 Spencer Nov 1996 A
5590055 Chapman et al. Dec 1996 A
5594641 Kaplan et al. Jan 1997 A
5596502 Koski et al. Jan 1997 A
5610812 Scabes et al. Mar 1997 A
5615360 Bezek et al. Mar 1997 A
5627914 Pagallo May 1997 A
5630128 Farrell et al. May 1997 A
5634053 Noble et al. May 1997 A
5634121 Tracz et al. May 1997 A
5636124 Rischar et al. Jun 1997 A
5649215 Itoh Jul 1997 A
5664061 Andreshak et al. Sep 1997 A
5680628 Carus Oct 1997 A
5687384 Nagase Nov 1997 A
5694616 Johnson et al. Dec 1997 A
5701400 Amado Dec 1997 A
5708829 Kadashevich Jan 1998 A
5715371 Ahamed et al. Feb 1998 A
5721770 Kohler Feb 1998 A
5721897 Rubinstein Feb 1998 A
5724481 Garberg et al. Mar 1998 A
5737621 Kaplan et al. Apr 1998 A
5737734 Schultz Apr 1998 A
5745652 Bigus Apr 1998 A
5745736 Picart Apr 1998 A
5748973 Palmer et al. May 1998 A
5754671 Higgins et al. May 1998 A
5761631 Nasukawa Jun 1998 A
5765033 Miloslavsky Jun 1998 A
5768578 Kirk et al. Jun 1998 A
5794194 Takebayashi et al. Aug 1998 A
5799268 Boguraev Aug 1998 A
5802253 Gross et al. Sep 1998 A
5806040 Vensko Sep 1998 A
5809462 Nussbaum Sep 1998 A
5809464 Kopp et al. Sep 1998 A
5822731 Schultz Oct 1998 A
5822745 Hekmatpour Oct 1998 A
5826076 Bradley et al. Oct 1998 A
5832220 Johnson et al. Nov 1998 A
5832470 Morita et al. Nov 1998 A
5835682 Broomhead et al. Nov 1998 A
5845246 Schalk Dec 1998 A
5850219 Kumomura Dec 1998 A
5860059 Aust et al. Jan 1999 A
5864848 Horvitz et al. Jan 1999 A
5864863 Burrows Jan 1999 A
5867495 Elliott et al. Feb 1999 A
5878385 Bralich et al. Mar 1999 A
5878386 Coughlin Mar 1999 A
5884032 Bateman et al. Mar 1999 A
5884302 Ho Mar 1999 A
5890142 Tanimura et al. Mar 1999 A
5890147 Peltonen et al. Mar 1999 A
5895447 Ittycheriah et al. Apr 1999 A
5899971 De Vos May 1999 A
5913215 Rubinstein et al. Jun 1999 A
5920835 Huzenlaub et al. Jul 1999 A
5933822 Braden-Harder et al. Aug 1999 A
5937400 Au Aug 1999 A
5940612 Brady et al. Aug 1999 A
5940821 Wical Aug 1999 A
5944778 Takeuchi et al. Aug 1999 A
5946388 Walker et al. Aug 1999 A
5948058 Kudoh et al. Sep 1999 A
5950184 Karttunen Sep 1999 A
5950192 Moore et al. Sep 1999 A
5956711 Sullivan et al. Sep 1999 A
5960393 Cohrs et al. Sep 1999 A
5963447 Kohn et al. Oct 1999 A
5963894 Richardson et al. Oct 1999 A
5970449 Alleva et al. Oct 1999 A
5974385 Ponting et al. Oct 1999 A
5974465 Wong Oct 1999 A
5983216 Kirsch Nov 1999 A
5991713 Unger et al. Nov 1999 A
5991751 Rivette et al. Nov 1999 A
5991756 Wu Nov 1999 A
5995513 Harrand et al. Nov 1999 A
5999932 Paul Dec 1999 A
5999990 Sharrit et al. Dec 1999 A
6006221 Liddy et al. Dec 1999 A
6009422 Ciccarelli Dec 1999 A
6012053 Pant et al. Jan 2000 A
6018735 Hunter Jan 2000 A
6021403 Horvitz et al. Feb 2000 A
6025843 Sklar Feb 2000 A
6026388 Liddy et al. Feb 2000 A
6032111 Mohri et al. Feb 2000 A
6035104 Zahariev Mar 2000 A
6038535 Campbell Mar 2000 A
6038560 Wical Mar 2000 A
6055528 Evans Apr 2000 A
6058365 Nagal et al. May 2000 A
6058389 Chandra et al. May 2000 A
6061709 Bronte May 2000 A
6064953 Maxwell, III et al. May 2000 A
6064971 Hartnett May 2000 A
6064977 Haverstock et al. May 2000 A
6067565 Horvitz May 2000 A
6070149 Tavor et al. May 2000 A
6070158 Kirsch et al. May 2000 A
6073098 Buchsbaum et al. Jun 2000 A
6073101 Maes Jun 2000 A
6073142 Geiger et al. Jun 2000 A
6076088 Paik et al. Jun 2000 A
6081774 de Hita et al. Jun 2000 A
6085159 Ortega et al. Jul 2000 A
6092042 Iso Jul 2000 A
6092095 Maytal Jul 2000 A
6092103 Pritsch Jul 2000 A
6094652 Faisal Jul 2000 A
6098047 Oku et al. Aug 2000 A
6101537 Edelstein et al. Aug 2000 A
6112126 Hales et al. Aug 2000 A
6115734 Mansion Sep 2000 A
6138128 Perkowitz et al. Oct 2000 A
6138139 Beck et al. Oct 2000 A
6144940 Nishi et al. Nov 2000 A
6148322 Sand et al. Nov 2000 A
6151538 Bate et al. Nov 2000 A
6154720 Onishi et al. Nov 2000 A
6161094 Adcock et al. Dec 2000 A
6161130 Horvitz et al. Dec 2000 A
6167370 Tsourikov et al. Dec 2000 A
6169986 Bowman et al. Jan 2001 B1
6182029 Friedman Jan 2001 B1
6182036 Poppert Jan 2001 B1
6182059 Angotti et al. Jan 2001 B1
6182063 Woods Jan 2001 B1
6182065 Yeomans Jan 2001 B1
6182120 Beaulieu et al. Jan 2001 B1
6185603 Henderson et al. Feb 2001 B1
6199103 Sakaguchi et al. Mar 2001 B1
6212544 Borkenhagen et al. Apr 2001 B1
6223201 Reznak Apr 2001 B1
6226630 Billmers May 2001 B1
6233575 Agrawal et al. May 2001 B1
6233578 Machihara et al. May 2001 B1
6236987 Horowitz et al. May 2001 B1
6243679 Mohri et al. Jun 2001 B1
6243735 Imanishi et al. Jun 2001 B1
6249606 Kiraly et al. Jun 2001 B1
6256773 Bowman-Amuah Jul 2001 B1
6260058 Hoenninger et al. Jul 2001 B1
6263335 Paik et al. Jul 2001 B1
6269368 Diamond Jul 2001 B1
6271840 Finseth et al. Aug 2001 B1
6275819 Carter Aug 2001 B1
6278973 Chung et al. Aug 2001 B1
6282565 Shaw et al. Aug 2001 B1
6292794 Cecchini et al. Sep 2001 B1
6292938 Sarkar et al. Sep 2001 B1
6298324 Zuberec et al. Oct 2001 B1
6301602 Ueki Oct 2001 B1
6304864 Liddy et al. Oct 2001 B1
6304872 Chao Oct 2001 B1
6308197 Mason et al. Oct 2001 B1
6311194 Sheth et al. Oct 2001 B1
6314439 Bates et al. Nov 2001 B1
6314446 Stiles Nov 2001 B1
6324534 Neal et al. Nov 2001 B1
6327581 Platt Dec 2001 B1
6349295 Tedesco et al. Feb 2002 B1
6353667 Foster et al. Mar 2002 B1
6353827 Davies et al. Mar 2002 B1
6360243 Lindsley et al. Mar 2002 B1
6363373 Steinkraus Mar 2002 B1
6363377 Kravets et al. Mar 2002 B1
6366910 Rajaraman et al. Apr 2002 B1
6370526 Agrawal et al. Apr 2002 B1
6374221 Haimi-Cohen Apr 2002 B1
6377945 Risvik Apr 2002 B1
6377949 Gilmour Apr 2002 B1
6389405 Oatman et al. May 2002 B1
6393415 Getchius et al. May 2002 B1
6393465 Leeds May 2002 B2
6397209 Reed et al. May 2002 B1
6397212 Biffar May 2002 B1
6401084 Ortega et al. Jun 2002 B1
6408277 Nelken Jun 2002 B1
6411947 Rice et al. Jun 2002 B1
6411982 Williams Jun 2002 B2
6415250 van den Akker Jul 2002 B1
6418458 Maresco Jul 2002 B1
6421066 Sivan Jul 2002 B1
6421675 Ryan et al. Jul 2002 B1
6424995 Shuman Jul 2002 B1
6424997 Buskirk et al. Jul 2002 B1
6430615 Hellerstein et al. Aug 2002 B1
6434435 Tubel et al. Aug 2002 B1
6434554 Asami et al. Aug 2002 B1
6434556 Levin et al. Aug 2002 B1
6438540 Nasr et al. Aug 2002 B2
6438575 Khan et al. Aug 2002 B1
6442542 Ramani et al. Aug 2002 B1
6442589 Takahashi et al. Aug 2002 B1
6446061 Doerre et al. Sep 2002 B1
6446081 Preston Sep 2002 B1
6446256 Hyman et al. Sep 2002 B1
6449589 Moore Sep 2002 B1
6449646 Sikora et al. Sep 2002 B1
6460074 Fishkin Oct 2002 B1
6463533 Calamera et al. Oct 2002 B1
6466940 Mills Oct 2002 B1
6477500 Maes Nov 2002 B2
6477580 Bowman-Amuah Nov 2002 B1
6480843 Li Nov 2002 B2
6490572 Akkiraju et al. Dec 2002 B2
6493447 Goss et al. Dec 2002 B1
6493694 Xu et al. Dec 2002 B1
6496836 Ronchi Dec 2002 B1
6496853 Klein Dec 2002 B1
6499021 Abu-Hakima Dec 2002 B1
6505158 Conkie Jan 2003 B1
6507872 Geshwind Jan 2003 B1
6513026 Horvitz et al. Jan 2003 B1
6535795 Zetlmeisl et al. Mar 2003 B1
6542889 Aggarwal et al. Apr 2003 B1
6553358 Horvitz Apr 2003 B1
6560330 Gabriel May 2003 B2
6560590 Shwe et al. May 2003 B1
6571282 Bowman-Amuah May 2003 B1
6574480 Foladare et al. Jun 2003 B1
6574658 Gabber et al. Jun 2003 B1
6578025 Pollack et al. Jun 2003 B1
6584464 Warthen Jun 2003 B1
6594697 Praitis et al. Jul 2003 B1
6601026 Appelt et al. Jul 2003 B2
6607136 Atsmon et al. Aug 2003 B1
6611535 Ljungqvist Aug 2003 B2
6611825 Billheimer et al. Aug 2003 B1
6615172 Bennett et al. Sep 2003 B1
6618727 Wheeler et al. Sep 2003 B1
6628194 Hellebust et al. Sep 2003 B1
6636733 Helferich Oct 2003 B1
6651220 Penteroudakis et al. Nov 2003 B1
6654726 Hanzek Nov 2003 B1
6654815 Goss et al. Nov 2003 B1
6665662 Kirkwood et al. Dec 2003 B1
6675159 Lin et al. Jan 2004 B1
6704728 Chang et al. Mar 2004 B1
6708205 Sheldon et al. Mar 2004 B2
6711561 Chang et al. Mar 2004 B1
6714643 Gargeya et al. Mar 2004 B1
6714905 Chang et al. Mar 2004 B1
6718367 Ayyadurai Apr 2004 B1
6732149 Kephart May 2004 B1
6738759 Wheeler et al. May 2004 B1
6742015 Bowman-Amuah May 2004 B1
6744878 Komissarchik et al. Jun 2004 B1
6745181 Chang et al. Jun 2004 B1
6748387 Garber et al. Jun 2004 B2
6766320 Wang et al. Jul 2004 B1
6785671 Bailey et al. Aug 2004 B1
6832244 Raghunandan Dec 2004 B1
6832245 Isaacs et al. Dec 2004 B1
6850513 Pelissier Feb 2005 B1
6862710 Marchisio Mar 2005 B1
7007067 Azvine et al. Feb 2006 B1
7047242 Ponte May 2006 B1
7051277 Kephart et al. May 2006 B2
7076527 Bellegarda et al. Jul 2006 B2
7131057 Ferrucci et al. Oct 2006 B1
7200606 Elkan Apr 2007 B2
7219054 Begeja et al. May 2007 B1
7272853 Goodman et al. Sep 2007 B2
7363590 Kerr et al. Apr 2008 B2
7366760 Warren et al. Apr 2008 B2
7370020 Azvine et al. May 2008 B1
7376701 Bhargava et al. May 2008 B2
7409336 Pak et al. Aug 2008 B2
7519668 Goodman et al. Apr 2009 B2
7565403 Horvitz et al. Jul 2009 B2
9270625 Alspector Feb 2016 B2
20010027463 Kobayashi Oct 2001 A1
20010042090 Williams Nov 2001 A1
20010047270 Gusick et al. Nov 2001 A1
20010056456 Cota-Robles Dec 2001 A1
20020029825 Kuehmann et al. Mar 2002 A1
20020032715 Utsumi Mar 2002 A1
20020049602 Horvitz Apr 2002 A1
20020052907 Wakai et al. May 2002 A1
20020059161 Li May 2002 A1
20020065953 Alford et al. May 2002 A1
20020073129 Wang et al. Jun 2002 A1
20020078119 Brenner et al. Jun 2002 A1
20020078121 Ballantyne Jun 2002 A1
20020078257 Nishimura Jun 2002 A1
20020083251 Chauvel et al. Jun 2002 A1
20020087618 Bohm et al. Jul 2002 A1
20020087623 Eatough Jul 2002 A1
20020091746 Umberger et al. Jul 2002 A1
20020099714 Murray Jul 2002 A1
20020103871 Pustejovsky Aug 2002 A1
20020107926 Lee Aug 2002 A1
20020116463 Hart Aug 2002 A1
20020150966 Muraca Oct 2002 A1
20020196911 Gao et al. Dec 2002 A1
20030028564 Sanfilippo Feb 2003 A1
20030046297 Mason Mar 2003 A1
20030074397 Morin et al. Apr 2003 A1
20030233419 Beringer Dec 2003 A1
20030236845 Pitsos Dec 2003 A1
20040167889 Chang et al. Aug 2004 A1
20040177120 Kirsch Sep 2004 A1
20040205135 Hallam-Baker Oct 2004 A1
20040225653 Nelken et al. Nov 2004 A1
20040254904 Nelken et al. Dec 2004 A1
20050187913 Nelken et al. Aug 2005 A1
Foreign Referenced Citations (8)
Number Date Country
2180392 Feb 2001 CA
0 597 630 May 1994 EP
0 304 191 Feb 1999 EP
09106296 Apr 1997 JP
0036487 Jun 2000 WO
WO 0036487 Jun 2000 WO
0184373 Aug 2001 WO
0184374 Aug 2001 WO
Non-Patent Literature Citations (70)
Entry
Androutsopoulos, Ion et al. “An Experimental Comparison of Naïve Bayesian and Keyword-Based Anti-Spam Filtering with Personal E-mail Messages.” Proceedings of the 23rd annual international ACM SIGIR conference on Research and development in information retrieval. ACM Press. Jul. 2000. 160-167.
Browning, Brandon. “Getting Rid of Spam.” Linux Journal. Mar. 1998. Specialized Systems Consultants Inc. 4 pages.
Cranor, Lorrie Faith et al. “Spam!”. Communications of the ACM. Aug. 1998. ACM Press. 74-83.
Schneider, Karl-Michael. “A Comparison of Event Models for Naïve Bayes Anti-Spam E-Mail Filtering.” Proceedings of the tenth conference on European chapter of the Association for Computational Linguistics. vol. 1. 307-314. Apr. 2003.
Hall, Robert J. “How to Avoid Unwanted Email”. Communications of the ACM. vol. 41, No. 3. ACM Press. Mar. 1998. 88-95.
Chai, Kian Ming Adam et al. “Bayesian online classifiers for text classification and filtering.” Proceedings of the 25th annual international ACM SIGIR conference on Research and development in information retrieval. ACM Press. Aug. 2002. 97-104.
Breese et al., “Empirical Analysis of Predictive Algorithms for Collaborative Filtering,” Proc. of the 14th Conf. on Uncertainty in Artificial Intelligence, Jul. 1998.
Czerwinski et al., “Visualizing Implicit Queries for Information Management and Retrieval,” Proc. of CHI 1999; ACM SIGCHI Conf. on Human Factors in Computing Systems, 1999.
Dumais et al., “Inductive Learning Algorithms and Representations for Task Categorization,” Proc. of 7th Intl. Conf. on Information & Knowledge Management, 1998.
Horvitz, “Principles of Mixed-Initiative User Interfaces,” Proc. of CHI 1999; ACM SIGCHI Conf. on Human Factors in Computing Systems, 1999.
Horvitz et al., “Display of Information for Time-Critical Decision Making,” Proc. of the 11th Conf. on Uncertainty in Artificial Intelligence, Jul. 1995.
Horvitz et al., “The Lumiere Project: Bayesian User Modeling . . . ,” Proc. of the 14th Conf. on Uncertainty in Artificial Intelligence, Jul. 1998.
Horvitz et al., “Time-Dependent Utility and Action Under Uncertainty,” Proc. of the 7th Conf. on Uncertainty in Artificial Intelligence, Jul. 1991.
Horvitz et al., “Time-Critical Action: Representations and Application,” Proc. of the 13th Conf. on Uncertainty in Artificial Intelligence, Jul. 1997.
Koller et al., “Toward Optimal Feature Selection,” Proc. of 13th Conf. on Machine Learning, 1996.
Lieberman, “Letizia: An Agent That Assists in Web Browsing,” Proc. of International Joint Conference on Artificial Intelligence, 1995.
Platt, “Fast Training of Support Vector Machines Using Sequential Minimal Optimization,” Advances in Kernel Methods: Support Vector Learning, MIT Press, Cambridge, MA, 1999.
Platt, “Probabilistic Outputs for Support Vector Machines & Comparisons to Regularized Likelihood Methods,” Adv. in Large Margin Classifiers, MIT Press, Cambridge, MA, 1999.
Sahami et al., “A Bayesian Approach to Filtering Junk E-Mail,” Amer. Assoc. for Art. Intell. Technical Report WS-98-05, 1998.
Cohen, “Learning Rules that Classify E-Mail,” AT&T Laboratories, 1996.
Lewis, “Evaluating and Optimizing Autonomous Text Classification Systems,” ACM SIGIR, 1995.
Lewis et al., “Training Algorithms for Linear Text Classifiers,” ACM SIGIR, 1996.
Apte et al., “Automated Learning of Decision Rules for Text Categorization,” ACM Transactions on Information Systems, vol. 12, No. 3, 1994.
Losee, Jr., “Minimizing Information Overload: The Ranking of Electronic Messages,” Journal of Information Science 15, 1989.
Joachims, “Text Categorization with Support Vector Machines: Learning with Many Relevant Features,” Universitat Dortmund, Germany, 1998.
Apte, C. et al., “Automated Learning of Decision Rules for Text Categorization”, IBM Research Report RC 18879, to appear in ACM Transactions on Information Systems, vol. 12, No. 3, pp. 233-251, 1994.
Breese, John S et al., “Empirical Analysis of Predictive Algorithms for Collaborative Filtering”, Proceedings of the Fourteenth Conference on Uncertainty in Artificial Intelligence, Madison, WI, Jul. 1998, Morgan Kaufman Publisher.
Cohen, William W., “Learning Rules That Classify E-Mail”, In Proceedings of the 1996 AAAI Spring Symposium on Machine Learning in Information Access.
Czerwinski, Mary et al., “Visualizing Implicit Queries for Information Management and Retrieval”, Proceedings of CHI '99, ACM SIGCHI Conference on Human Factors in Computing Systems, May 15-20, 1999, pp. 560-567.
Dumais, Susan et al., “Inductive Learning Algorithms and Representations for Text Categorization”, Proceedings of Seventh International Conference on Information and Knowledge Management (CIKM98), Bethesda MD, Nov. 1998, ACM Press, pp. 148-155.
Horvitz, Eric et al., “Display of Information for Time-Critical Decision Making”, Proceedings of the Eleventh Conference on Uncertainty in Artificial Intelligence, Montreal, Canada, Aug. 1995, Morgan Kaufmann Publishers (San Francisco, CA), pp. 296-305. http://research.microsoft.com/˜horvitz/vista.htm.
Horvitz, Eric, “Principles of Mixed-Initiative User Interfaces”, Proceedings of CHI 1999; ACM SIGCHI Conference on Human Factors in Computing Systems, 1999.
Horvitz, Eric et al., “The Lumiere Project: Bayesian User Modeling for Inferring the Goals and Needs of Software Users”, Proceedings of the Fourteenth Conference on Uncertainty in Artificial Intelligence, Madison, WI, Jul. 1998, Morgan Kaufmann Publishers, pp. 256-265. http://research.microsoft.com/˜horvitz/lumiere.htm.
Horvitz, Eric et al., “Time-Critical Action: Representations and Application”, Proceedings of the Thirteenth Conference on Uncertainty in Artificial Intelligence, Jul. 1997.
Horvitz, Eric et al., “Time-Dependent Utility and Action Under Uncertainty”, In Proceedings of Seventh Conference on Uncertainty in Artificial Intelligence, Los Angeles, CA, pp. 151-158, Morgan Kaufman (San Mateo, CA), Jul. 1991.
Joachims, Thorsten, “Text Categorization With Support Vector Machines: Learning With Many Relevant Features”, In Proceedings 10th European Conference on Machine Learning (ECML), Springer Verlag, 1998. http://www-ai.cs.uni-dortmund.de/KOKIMENTE/Joachims—97a.ps.gz.
Koller, Daphne et al., “Toward Optimal Feature Selection”, Proceedings of the Thirteenth Conference on Machine Learning, pp. 284-292, 1996.
Lewis, David D., “Evaluating and Optimizing Autonomous Text Classification Systems”, ACM SIGIR '95, Seattle, WA, USA, pp. 246-254, 1995. ACM 0-89791-714-6/95/07.S3.50.
Lewis, David D., et al., “Training Algorithms for Linear Text Classifiers”, ACM SIGIR '96, Zurich, Switzerland, pp. 298-306, 1996. ACM 0-89791-792-8/96/08.S3/50.
Lieberman, Henry, “Letizia: An Agent That Assists in Web Browsing”, International Joint Conference on Artificial Intelligence (IJCAI), Montreal, Canada, Aug. 1995.
Losee, Robert M., “Minimizing Information Overload: the Ranking of Electronic Messages”, Journal of Information Science 15, 1989, pp. 179-189. 0165-5515/89/S3.50, 1989, Elsevier Science Publishers B.V.
Platt, John C., “Fast Training of Support Vector Machines Using Sequential Minimal Optimization”, To appear in: B. Scholkopf, C. Burges, and A. Smola (Eds.) “Advances in Kernel Methods: Support Vector Learning”, MIT Press, Cambridge, MA, London, England, 1999, pp. 185-208. ISBN 0-262-19416-3.
Platt, John C., “Probabilistic Outputs for Support Vector Machines and Comparisons to Regularized Likelihood Methods”, To appear in: Alexander J. Smola et al. (Eds.) “Advances in Large Margin Classifiers”, MIT Press, Cambridge, MA, Mar. 26, 1999, pp. 1-11.
Sahami, Mehran, et al., “A Bayesian Approach to Filtering Junk E-Mail”, American Association for Artificial Intelligence Technical Report WS-98-05, Workshop on Text Categorization, Jul. 1998. http://robotics.stanford.edu/users/sahami/papers-dir/spam.ps.
“Grammar-like Functional Rules for Representing Query Optimization Alternatives,” 1998 ACM, pp. 18-27.
Khan et al., “Personal Adaptive Web Agent: A Tool for Information Filtering,” Canadian Conference on Electrical and Computer Engineering, vol. 1, May 25, 1997, pp. 305-308.
Davies et al., “Knowledge Discovery and Delivery,” British Telecommunications Engineering, London, GB, vol. 17, No. 1, Apr. 1, 1998, pp. 25-35.
Persin, “Document Filtering for Fast Ranking,” Sigir 94. Dublin, Jul. 3-6, 1994, Proceedings of the Annual International ACM-Sigir Conference on Research and Development in Information Retrieval, Berlin, Springer, DE, vol. CONF. 17, Jul. 3, 1994, pp. 339-348.
Han et al., “WebACE: A Web Agent for Document Categorization and Exploration,” Proceedings of the 2nd International Conference on Autonomous Agents Minneapolis/St. Paul, MN, May 9-13, 1998, Proceedings of the International Conference on Autonomous Agents, New York, NY, May 9, 1998, pp. 408-415.
Shimazu et al., “CAPIT: Natural Language Interface Design Tool with Keyword Analyzer and Case-Based Parser,” NEC Research and Development, Nippon Electric Ltd., Tokyo, JP, vol. 33, No. 4, Oct. 1, 1992, pp. 679-688.
Computer Dictionary, Microsoft Press, 1997, Third Edition, p. 192.
Webster's Third New International Dictionary, G. & C. Merriam Company, 1961, pp. 538, 834, 1460.
Moore et al., “Web Page Categorization and Feature Selection Using Association Rule and Principal Component Clustering,” Proceedings of the 7th Workshop on Information Technologies and Systems, Dec. 1997, 10 pages.
Mase, “Experiments on Automatic Web Page Categorization for IR Systems,” Technical Report, Stanford University, 1998, pp. 1-12.
Berners-Lee et al., “The Semantic Web,” Scientific American.com, May 17, 2001, 9 pages.
Brasethvik et al., “A Conceptual Modeling Approach to Semantic Document Retrieval,” Proceedings of the 14th International Conference on Advanced Information Systems Engineering, May 27-31, 2002, pp. 167-182.
Firepond eService Provider, http://www.firepond.com/products/eserviceperformer, 2 pages.
Banter White Paper, “Natural Language Engines for Advanced Customer Interaction,” by Banter Inc., pp. 1-13.
Banter Technology RME, “The Foundation for Quality E-Communications,” Technical White Paper, pp. 1-9.
Webster's Computer Internet Dictionary, 3rd Edition, P.E. Margolis, 1999, 3 pages.
Morelli et al., “Predicting Technical Communication in Product Development Organizations,” IEEE Transactions on Engineering Management, vol. 42, issue 3, Aug. 1995, pp. 1-16.
Parmentier et al., “Logical Structure Recognition of Scientific Bibliographic References,” 4th Int'l. Conf. on Document Analysis and Recognition, vol. 2, Aug. 18-20, 1997, pp. 1072-1076.
Kalogeraki et al., “Using Multiple Feedback Loops for Object Profiling . . . ,” IEEE Int'l Symposium on Object-Oriented Real-Time Distributed Computing, May 2-5, 1999, 10 pages.
Johnson et al., “Adaptive Model-Based Neural Network Control,” IEEE Int'l Conf. on Robotics and Automation, May 13-18, 1990, pp. 1704-1709.
McKinnon et al., “Data Communications and Management of a Distributed Network of Automated Data Acquisition Systems,” 1997 IEEE Nuclear Science Symp., Nov. 1997, pp. 730-733.
searchCRM.com Definitions (contact center), http://www.searchcrm.techtarget.com. 1 page.
“Transforming Your Call Center Into a Contact Center: Where Are You? Trends and Recommendations,” An IDC Executive Brief (#33), Jun. 2001, pp. 1-7.
Hawkins et al., “The Evolution of the Call Center to the ‘Customer Contact Center’”, ITSC White Paper, Feb. 2001, pp. 1-30.
Computer Dictionary, Microsoft Press, 1997, Third Edition, p. 192.
Webster's Third New International Dictionary, G. & C. Merriam Company, 1961, pp. 538, 834, 1460.
Continuations (1)
Number Date Country
Parent 09602588 Jun 2000 US
Child 10112230 US
Continuation in Parts (1)
Number Date Country
Parent 10112230 Mar 2002 US
Child 10610964 US