This application relates to a method and apparatus to process an incoming message. In an example embodiment, the method and apparatus may process an incoming voice message in a telephone communication system.
Voice mail systems are well known in the art. Currently, a sender of a voice mail message may set the priority of the message. Likewise, a sender of an email message may set the priority of the email message (e.g., flag the importance of the email message as “Low,” “Medium,” and “High”). However, a recipient of the message may not consider the message to be of the same importance as the sender.
In order to identify the importance defined by the sender, a mail message may include an indicator in an envelope that is communicated from a sender device to a recipient device. Typically, included in the envelope are various separate fields such as a routing information field, a message importance field, a payload field that includes the payload or actual message content, and the like.
Embodiments of the present invention are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:
In an example embodiment, a method and a system to process an incoming message received via a communication network are described.
In the following detailed description of example embodiments, reference is made to the accompanying drawings which form a part hereof, and in which is shown by way of illustration specific embodiments in which the example method, apparatus and system may be practiced. It is to be understood that other embodiments may be utilized and structural changes may be made without departing from the scope of this description.
The system 10 is shown to include a Voice over Internet Protocol (VoIP) phone 12, a telephone 14 (which may be a cellular telephone, a wired or landline telephone, or the like), a computer 16 (e.g. a personal computer including an email and/or Instant Message client), a pager 18, or any other device 20 capable of communicating messages from a sender to a recipient. The devices 12 to 20 may communicate via one or more networks 22 with a voice mail system or apparatus 24. Unlike prior art systems where the importance of an incoming message is defined by a sender, in the system 10, the voice mail apparatus 24 may define the importance or any action to be performed on an incoming message. The action performed on the incoming message may thus be independent of any priority or importance attached to the incoming message by a sender. Further, unlike prior art systems where any importance attached by the sender to the message is by virtue of a flag that is not derived in any way from the message content, the voice mail system 24 analyzes the message content of an incoming message in an automated fashion, and without human intervention, to identify a keyword (or keywords) included in the message content. Thereafter, as described in more detail below, a predefined action (or actions) associated with the keyword may be automatically identified and the action may be automatically performed or executed when the keyword is identified in the message content.
As a predefined action (e.g. attaching an importance to an incoming message) may be identified from the actual message content when the incoming message is received at the recipient, it may be performed in an automated fashion without further interaction with a user. Thus, the system 10 differs from prior art call center configurations where a user may be required to identify the importance of an incoming message via Interactive Voice Recognition (IVR) technology. It will, however, be appreciated that in this prior art case it is still the sender who defines the priority of the message, and that such a priority may not correspond to the priority attached to the message by the recipient. Further, such priority is attached to the message in general and is not dependent on, or automatically derived without human intervention from, the actual message content of a message which is communicated from a sender to a recipient.
It will be appreciated that the system 10 shown in
The apparatus 30 is shown to include a communication interface 32, an Adaptive Speech Recognition (ASR) module 34, a processing module 36, an optional voice mail search module 38, a keyword database 40, and optional legacy modules of a voice mail system 42. The communication interface 32 may interface the apparatus 30 to any one or more of the devices 12 to 20 via any one or more networks 22 as shown in
Referring to
Thereafter, as shown in block 54, the processing module 36 may automatically analyze the message content to identify if one or more keywords exist in the textual equivalent of the message content. The keywords, or “hot words”, may be provided in the keyword database 40 (e.g. in an XML format). Each keyword may have one or more predefined actions associated therewith. For example, a particular keyword may have an action such as associating a priority with the incoming message, alerting an intended recipient (e.g. the mailbox owner) of the incoming message, routing the incoming message to the intended recipient, paging the intended recipient, emailing the incoming message to the intended recipient, sending an SMS message to the intended recipient, or any other user defined action. Thus, as shown at block 54, the method 50 may investigate or interrogate the keyword database 40 to initially identify which keywords are to be searched for in the incoming message and, if a particular keyword is found, identify the predefined action or actions associated with the keyword. Thereafter, as shown at block 56 the predefined action may be performed or executed. Thus, the apparatus 30 may define a content sensitive voice mail system where a recipient of a voice mail defines an action dependent upon the content of the voice mail. Similarly, natural speech recognition may be used to identify phrases.
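Purely by way of illustration, the following Python sketch shows one way the interrogation at block 54 and the execution at block 56 might be realized: an XML keyword database is parsed into a keyword-to-action table, the textual equivalent of the message content is scanned for those keywords, and the associated predefined actions are collected for execution. The XML schema, element names, and action attributes shown are illustrative assumptions only and are not prescribed by the apparatus 30.

```python
# Illustrative sketch only: the XML schema, element names, and action
# attributes below are assumptions and are not prescribed by the apparatus 30.
import xml.etree.ElementTree as ET

KEYWORD_DB_XML = """
<keywords>
  <keyword value="jones">
    <action type="set_priority" level="high"/>
    <action type="forward" target="mobile"/>
  </keyword>
  <keyword value="heart">
    <action type="set_priority" level="high"/>
  </keyword>
</keywords>
"""

def load_keyword_actions(xml_text):
    """Parse the keyword database into a {keyword: [action attributes]} table."""
    root = ET.fromstring(xml_text)
    return {kw.get("value").lower(): [dict(a.attrib) for a in kw.findall("action")]
            for kw in root.findall("keyword")}

def process_transcript(transcript, keyword_actions):
    """Scan the textual equivalent of the message content for keywords (block 54)
    and collect the predefined actions to perform (block 56)."""
    words = set(transcript.lower().split())
    pending = []
    for keyword, actions in keyword_actions.items():
        if keyword in words:
            pending.extend(actions)
    return pending

if __name__ == "__main__":
    table = load_keyword_actions(KEYWORD_DB_XML)
    transcript = "hello this is mr jones calling about my heart condition"
    for action in process_transcript(transcript, table):
        print("would execute:", action)
```

In a deployment, the collected action descriptors would be handed to the modules that actually set priorities, route calls, or send notifications.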
In an example embodiment, the apparatus 30 may be language sensitive or provide a facility to define more than one language associated with the keywords. For example, the apparatus 30 may identify a source of an incoming message (e.g. a source of an incoming telephone call) and associate a particular language with the source of the incoming message. For example, if it is determined from caller identification information that the incoming message is a voice message from a person in France, then a French language profile, including French keywords, may be retrieved from the database 40. In an example embodiment, a user may be prompted to identify which particular language they are communicating the message in.
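As a further illustration, and assuming only that caller identification information is available in international (E.164) form, a language profile might be selected as in the following sketch; the country-code mapping and profile contents are placeholders.

```python
# Illustrative assumption: caller identification is available in E.164 form and
# the database 40 holds one keyword profile per language.
LANGUAGE_BY_COUNTRY_CODE = {"+33": "fr", "+49": "de", "+1": "en"}

KEYWORD_PROFILES = {
    "en": ["jones", "heart"],
    "fr": ["coeur"],   # French keywords retrieved for callers identified as French
}

def keywords_for_caller(caller_id, default_language="en"):
    """Select the keyword profile whose language matches the caller's country code."""
    for prefix, language in LANGUAGE_BY_COUNTRY_CODE.items():
        if caller_id.startswith(prefix):
            return KEYWORD_PROFILES.get(language, KEYWORD_PROFILES[default_language])
    return KEYWORD_PROFILES[default_language]

print(keywords_for_caller("+33155512345"))  # -> ['coeur']
```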
The apparatus 30 may, for example, form part of a voice mail system of a medical clinic or medical doctor. For example, a doctor may define keywords relating to a particular patient and/or particular medical condition. In addition, the doctor may then define actions to be performed when a voice mail is received including the keywords. For example, the doctor may define a patient's name (e.g. “Jones”) and a health condition (e.g. “heart”) as keywords. Further, an action such as automatically forwarding a call to the doctor's mobile telephone or automatically attaching a high priority to the call may be associated with the keywords. Accordingly, if Mr. Jones were to call and indicate in his voice mail that he was Mr. Jones and that he had a heart condition, the apparatus 30 would automatically identify these words in the voice message, and perform the actions defined by the doctor. Thus, in the present example, the call or message would be either marked as a high priority message or immediately forwarded to the doctor's mobile telephone. It will, however, be appreciated that the keywords and the associated actions may vary from one deployment of the apparatus 30 to another. Further, the apparatus 30 is not restricted to processing only stored messages and may process an incoming message (e.g., an incoming telephone call) on-the-fly or in real time. Thus, in an example embodiment, an incoming telephone call may be processed while the caller is speaking.
In method 60, as shown at block 64, a user may be authenticated prior to allowing the user access to a user specific profile of keywords and associated predefined actions. Once the user has been authenticated, the apparatus 30 may receive new keywords that the user has defined and/or selected (see block 66) and, thereafter, the apparatus 30 may receive new actions that have been defined by the user and which are associated with the new keywords (see block 68). As shown at block 70, the new keywords and new actions may then be stored (e.g. in the database 40). In an embodiment, multiple users may be associated with the apparatus 30 and, accordingly, each user may have a user profile that allows customization of keywords and predefined actions for the particular user. It will, however, be appreciated that certain keywords and actions may be common to all users. As mentioned above, it will be appreciated that the user interface that allows the user to define new keywords and actions may use any communication medium (e.g. an email communication, a voice mail communication, an Instant Message communication, a web interface via a browser, or the like).
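One possible sketch of blocks 64 to 70 is given below, in which a user is authenticated, new keywords and their associated actions are received, and the result is stored in a per-user profile. The PIN check and the in-memory store are merely stand-ins for whatever authentication mechanism and database 40 a given deployment provides.

```python
# Sketch of blocks 64 to 70: authenticate the user, receive new keywords and
# actions, and store them. The PIN table and dict stand in for real components.
USER_PINS = {"mailbox-17": "4321"}   # placeholder credential store
PROFILE_DB = {}                      # stands in for the keyword database 40

def authenticate(mailbox, pin):
    return USER_PINS.get(mailbox) == pin

def update_profile(mailbox, pin, new_keywords, new_actions):
    """Associate user-defined keywords and their predefined actions with the mailbox."""
    if not authenticate(mailbox, pin):            # block 64
        raise PermissionError("authentication failed")
    profile = PROFILE_DB.setdefault(mailbox, {})
    for keyword in new_keywords:                  # blocks 66 and 68
        profile.setdefault(keyword.lower(), []).extend(new_actions)
    return profile                                # block 70: stored per user

update_profile("mailbox-17", "4321",
               ["jones", "heart"],
               [{"type": "set_priority", "level": "high"}])
print(PROFILE_DB["mailbox-17"])
```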
In an example embodiment, a user may dial into the apparatus 30, which may form part of a voice mail system, and, upon authentication, may be presented with a Telephone User Interface (TUI) menu. For example, the telephone user interface menu may prompt the user with the following audio instructions: “to listen to your voice mail messages, please press 1; to set your personal preferences, please press 2, and so on.” A user may then select the personal preferences option and be presented with a relevant menu such as: “to change your outgoing message, please press 1, to set your keywords, please press 2, to set actions associated with keywords, please press 3, and so on”. The user may then choose the keywords or hot words option and record a sequence of keywords which are then stored in a personal profile and associated with the user's mailbox. In an embodiment, the user may utilize natural language capabilities of the apparatus 30 and instruct the apparatus 30 via a spoken sentence. For example, after authentication, the user may hear a short prompt such as: “please specify your command”. In response, the user may say “please configure my three keywords to be ‘word 1’, ‘word 2’, and ‘word 3’.” Following on from the example above, with respect to a medical practice, the user may command the apparatus 30 as follows: “please configure my three keywords to be ‘Jones’, ‘heart’, and ‘statuses’.” The apparatus 30 may then play back to a caller (in the present example a doctor associated with the mailbox) a confirmation message such as: “you have asked to configure three keywords: ‘Jones’, ‘heart’, and ‘statuses’. If this is correct, please acknowledge or say ‘modify’.” Once the user acknowledges that the system or apparatus has properly recorded the three keywords, the keywords may then be stored in the user's personal profile in the database 40 and associated with their mailbox.
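By way of illustration only, the natural-language configuration step might be approximated as in the following sketch, which extracts the quoted keywords from a recognized command of the form “please configure my … keywords to be …” and echoes them back for confirmation. The accepted phrasing and the regular expression are assumptions made for illustration.

```python
# Sketch only: assumes the spoken command has already been transcribed to text
# and that keywords are spoken as a simple quoted, comma-separated list.
import re

def parse_keyword_command(transcript):
    """Extract the keyword list from a 'configure my ... keywords to be ...' command."""
    match = re.search(r"keywords? to be (.+)", transcript, re.IGNORECASE)
    if not match:
        return []
    tail = match.group(1).rstrip(".")
    # Split on commas and the word "and", then strip quotes and whitespace.
    parts = re.split(r",|\band\b", tail)
    return [p.strip(" '\"\u2018\u2019") for p in parts if p.strip(" '\"\u2018\u2019")]

command = "please configure my three keywords to be 'Jones', 'heart', and 'statuses'."
keywords = parse_keyword_command(command)
print("You have asked to configure %d keywords: %s"
      % (len(keywords), ", ".join(keywords)))
```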
In an embodiment, the apparatus 30 allows the user to program actions or rules for the keywords. For example, the user may mark as urgent any call from a particular caller that includes the words “dinner” and “at”. Returning to the medical example, any messages that include the keywords “blood test results” may be marked as urgent. In an example embodiment, caller identification may also be used in conjunction with the keywords to perform a particular predefined action. In one example deployment of the apparatus 30, when an incoming call is redirected to a voice mail system, the adaptive speech recognition module 34 may be conferenced into the voice path of the voice mail system. Further, the apparatus 30 may then dynamically load the corresponding keywords from the associated personal profile into the speech recognition module. Thus, different users of the apparatus 30 may have customized personal profiles each including hot words or keywords that they have defined and that are relevant to them.
In an embodiment, if the message content of an incoming call or message includes any one or more of the keywords, the voice mail apparatus 30 may mark the message as urgent, or as of high priority. In addition or instead, the apparatus 30 may call the recipient user on an alternative (e.g. a home or cellular) telephone number and inform the recipient user of the high priority call or transfer the call to the alternative telephone number. In an embodiment, the apparatus 30 may intermittently or periodically call alternative numbers or alert the user to a high priority message via e-page, SMS, Instant Messaging (IM), or the like. Thus, the apparatus 30 allows a user of a voice mail apparatus to define keywords and, in an embodiment, to associate or compare the keywords against a preprogrammed set of user specific rules or actions. As soon as a rule or action is matched, the apparatus may mark the call as urgent or high priority and act in accordance with the preprogrammed or predefined action or plan.
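An example of such a preprogrammed rule is sketched below, where caller identification is combined with detected keywords and, on a match, the message is marked urgent and notifications are queued. The rule structure and the named actions are illustrative assumptions; the actual escalation (calling an alternative number, sending an SMS or page, and so on) would be carried out by the relevant modules of the apparatus 30.

```python
# Sketch of keyword-plus-caller-identification rules; the named actions are
# placeholders for the escalation steps performed by the apparatus 30.
from dataclasses import dataclass, field

@dataclass
class Rule:
    keywords: set                  # all of these must appear in the transcript
    caller_prefix: str = ""        # optional caller identification constraint
    actions: list = field(default_factory=list)

def matching_actions(transcript, caller_id, rules):
    """Return the actions of every rule matched by the message and its caller."""
    words = set(transcript.lower().split())
    matched = []
    for rule in rules:
        if rule.keywords <= words and caller_id.startswith(rule.caller_prefix):
            matched.extend(rule.actions)
    return matched

rules = [Rule(keywords={"blood", "test", "results"},
              actions=["mark_urgent", "sms_owner", "call_alternative_number"])]

for action in matching_actions("your blood test results are ready",
                               "+14085550100", rules):
    print("would perform:", action)
```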
As shown in
The method 70 is shown at block 72 to receive, from a user/owner of a voice mail box, a search query that includes one or more keywords. Thereafter, as shown at block 74, a database of stored messages may be searched to identify messages including one or more of the search terms or keywords. For example, the database 40 may be searched utilizing the voice mail search module 38. Returning to the medical doctor example, a doctor may call in to the apparatus 30 and, via a search interface, search all his or her voice mail messages for the words “Jones” and “heart”. In this example, the voice mail search module 38 may then search all the voice mail messages associated with the doctor's profile and return or play back those messages that include the search terms or keywords. Thus, the voice mail search module 38 may facilitate identifying voice mail messages of high priority or of particular concern to the user and distinguishing them from a plurality of other voice mail messages which may be less relevant. It will, however, be appreciated by a person of skill in the art that the search functionality may also be used to search email messages, instant messages, or the like.
In an embodiment, the method 70 receives a voice query including a spoken search term and performs speech recognition on the spoken search term to obtain an equivalent textual search term. Thereafter, the database of incoming messages is searched using the textual equivalent search term to identify any incoming messages including the spoken search term. In addition or instead, the method 70 may receive a textual query including a textual search term and search the database of incoming messages with the textual search term to identify any incoming messages including the textual search term. In an embodiment, the methods described herein may identify the keywords (hot words) or phrases without converting them into text, instead performing the analysis in the audio domain.
When one or more keywords have been identified in one or more messages, those messages that have been found are then presented to the user (see block 76). For example, the message(s) including the keyword(s) may be played back to the user in a similar fashion to a conventional voice mail message. In an embodiment, the search is performed in the audio domain.
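Assuming, for illustration, that each stored voice mail message has already been transcribed by the adaptive speech recognition module 34, the search of blocks 72 to 76 might be sketched as follows; a search performed in the audio domain would instead replace the text comparison with an audio keyword-spotting step.

```python
# Sketch of blocks 72 to 76: filter a mailbox of transcribed voice mail messages
# by search terms. The stored records and transcripts are illustrative only.
MAILBOX = [
    {"id": 1, "caller": "+14085550100",
     "transcript": "hi this is mr jones about my heart medication"},
    {"id": 2, "caller": "+14085550188",
     "transcript": "reminder about the staff meeting on friday"},
]

def search_mailbox(mailbox, search_terms):
    """Return the messages whose transcript contains every search term."""
    terms = [t.lower() for t in search_terms]
    return [m for m in mailbox
            if all(t in m["transcript"].lower() for t in terms)]

for message in search_mailbox(MAILBOX, ["Jones", "heart"]):
    print("playback candidate:", message["id"], message["transcript"])
```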
Referring to
As shown at block 82, an instant message may be received, whereafter, as shown at decision block 84, a determination is made as to whether or not the recipient user is available. If the recipient user is available, then the incoming instant message is immediately displayed to the user as shown at block 86. If, however, the recipient user has defined or indicated in the instant messaging client that he or she is not available, then the method 80 may parse or analyze the instant message content to identify whether or not any user defined keywords are present (see block 88). As shown at decision block 90, if a keyword is present in the instant message content, the method 80 may then display the instant message to the user as shown at block 86. If, however, the keyword is not present in the instant message content, the method 80 may send the instant message to an instant message mailbox. Thus, a user may define whether or not an instant message is immediately displayed, dependent upon the content of the instant message.
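A sketch of the decision flow of method 80 is given below: the instant message is displayed immediately if the recipient is available, displayed anyway if it contains a user defined keyword, and otherwise deposited in an instant message mailbox. The function name and availability flag are illustrative only.

```python
# Sketch of method 80: display the instant message if the recipient is available
# or if it contains a user defined keyword; otherwise store it in an IM mailbox.
def route_instant_message(message_text, recipient_available, keywords, im_mailbox):
    """Return 'display' or 'mailbox', storing undisplayed messages for later."""
    if recipient_available:
        return "display"                          # shown immediately
    words = set(message_text.lower().split())
    if any(keyword.lower() in words for keyword in keywords):
        return "display"                          # keyword present: shown despite 'away'
    im_mailbox.append(message_text)               # stored in the instant message mailbox
    return "mailbox"

stored = []
print(route_instant_message("blood test results are in", False,
                            ["blood", "results"], stored))   # -> display
print(route_instant_message("lunch tomorrow?", False,
                            ["blood", "results"], stored))   # -> mailbox
```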
Referring in particular to
The example computer system 200 includes a processor 202 (e.g., a central processing unit (CPU), a graphics processing unit (GPU) or both), a main memory 204 and a static memory 206, which communicate with each other via a bus 208. The computer system 200 may further include a video display unit 210 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). The computer system 200 also includes an alphanumeric input device 212 (e.g., a keyboard), optionally a user interface (UI) navigation device 214 (e.g., a mouse), optionally a disk drive unit 216, a signal generation device 218 (e.g., a speaker) and a network interface device 220.
The disk drive unit 216 includes a machine-readable medium 222 on which is stored one or more sets of instructions and data structures (e.g., software 224) embodying or utilized by any one or more of the methodologies or functions described herein. The software 224 may also reside, completely or at least partially, within the main memory 204 and/or within the processor 202 during execution thereof by the computer system 200, the main memory 204 and the processor 202 also constituting machine-readable media.
The software 224 may further be transmitted or received over a network 226 via the network interface device 220 utilizing any one of a number of well-known transfer protocols (e.g., HTTP).
While the machine-readable medium 222 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “machine-readable medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present invention, or that is capable of storing, encoding or carrying data structures utilized by or associated with such a set of instructions. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic media, and carrier wave signals. Such media may also include, without limitation, hard disks, floppy disks, flash memory cards, digital video disks, random access memory (RAM), read only memory (ROM), and the like.
The embodiments described herein may be implemented in an operating environment comprising software installed on any programmable device, in hardware, or in a combination of software and hardware.
Although embodiments have been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the invention. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
This application is a divisional application of, and claims the benefit of priority to U.S. patent application Ser. No. 11/237,081, filed Sep. 28, 2005, which is incorporated by reference herein in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
5588042 | Comer | Dec 1996 | A |
6035017 | Fenton et al. | Mar 2000 | A |
6226670 | Ueno et al. | May 2001 | B1 |
6313734 | Weiss et al. | Nov 2001 | B1 |
6507865 | Hanson et al. | Jan 2003 | B1 |
6687339 | Martin | Feb 2004 | B2 |
6727930 | Currans et al. | Apr 2004 | B2 |
6769002 | Ayan | Jul 2004 | B2 |
6781962 | Williams et al. | Aug 2004 | B1 |
6941304 | Gainey et al. | Sep 2005 | B2 |
7032030 | Codignotto | Apr 2006 | B1 |
7130885 | Chandra et al. | Oct 2006 | B2 |
7379872 | Cabezas et al. | May 2008 | B2 |
7617042 | Horvitz et al. | Nov 2009 | B2 |
7685102 | Adelman et al. | Mar 2010 | B2 |
7693267 | Howell et al. | Apr 2010 | B2 |
7769001 | Narasimhan et al. | Aug 2010 | B2 |
8064576 | Skakkebaek et al. | Nov 2011 | B2 |
8407786 | Elias et al. | Mar 2013 | B1 |
8503624 | Shaffer et al. | Aug 2013 | B2 |
8510389 | Gurajada et al. | Aug 2013 | B1 |
8645473 | Spitkovsky | Feb 2014 | B1 |
8738611 | Zarmer et al. | May 2014 | B1 |
20020103867 | Schilter | Aug 2002 | A1 |
20020131399 | Philonenko | Sep 2002 | A1 |
20030028380 | Freeland et al. | Feb 2003 | A1 |
20030185383 | Bergsagel | Oct 2003 | A1 |
20030220784 | Fellenstein et al. | Nov 2003 | A1 |
20040252679 | Williams et al. | Dec 2004 | A1 |
20050055213 | Claudatos et al. | Mar 2005 | A1 |
20070081636 | Shaffer et al. | Apr 2007 | A1 |
Number | Date | Country |
---|---|---|
10338237 | Mar 2005 | DE |
1109390 | Jun 2001 | EP |
WO-2007037875 | Apr 2007 | WO |
WO-2007037875 | Apr 2009 | WO |
Entry |
---|
“U.S. Appl. No. 11/237,081, Final Office Action mailed Dec. 29, 2011”, 13 pgs. |
“U.S. Appl. No. 11/237,081, Non-Final Office Action mailed Mar. 11, 2010”, 10 pgs. |
“U.S. Appl. No. 11/237,081, Notice of Allowance mailed Apr. 1, 2013”, 8 pgs. |
“U.S. Appl. No. 11/237,081, Response filed Jul. 12, 2010 to Non Final Office Action mailed Mar. 11, 2010”, 13 pgs. |
“European Application Serial No. 06813809.8, Extended European Search Report mailed Sep. 16, 2010”, 8 Pgs. |
“European Application Serial No. 06813809.8, Response filed Apr. 4, 2011 to the Communication pursuant to Article 70(2) and 70a(2) mailed Sep. 16, 2010”, 19 pgs. |
“International application Serial No. PCT/US06/33387, International Preliminary Report on Patentability mailed Mar. 10, 2009”, 6 pgs. |
“International application Serial No. PCT/US06/33387, International Search Report mailed May 2, 2008”, 5 pgs. |
“International application Serial No. PCT/US06/33387, Written Opinion mailed May 2, 2008”, 5 pgs. |
“Method for Enhanced Messaging Service”, IBM Technical Disclosure Bulletin vol. 36, No. 8, XP000390273 ISSN:0018-8689, (Aug. 1, 1993), 405-407. |
Number | Date | Country | |
---|---|---|---|
20130301813 A1 | Nov 2013 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 11237081 | Sep 2005 | US |
Child | 13942764 | | US |