Various embodiments related to telephone-based or internet-based call transactions are presented.
In telephone-based or internet-based communication, data, voice, or sound (or a combination thereof) is exchanged between parties on a call (typically two parties). Traditionally, businesses have utilized people to participate in telephone-based transactions with their clients. Recently, however, an increasing number of transactions use automated services and do not engage a person until a certain stage of the call. The embodiments presented herein relate to such transactions.
The present embodiments provide, in one aspect, a system for detecting a hold status in a transaction between a waiting party and a queuing party, said system comprising a device adapted to use a preexisting cue profile database containing a cue profile for at least one queuing party.
In another aspect, the present embodiments provide for the use of a preexisting cue profile for detecting a hold status in a call between a waiting party and a queuing party.
In another aspect, the present embodiments provide a method for detecting a hold status in a transaction between a waiting party and a queuing party, said method comprising using a preexisting cue profile database containing a cue profile for at least one queuing party.
For a fuller understanding of the invention, reference is made to the following detailed description, taken in connection with the accompanying drawings illustrating various embodiments of the present invention, in which:
The embodiments and implementations described here are only exemplary. It will be appreciated by those skilled in the art that these embodiments may be practiced without certain specific details. In some instances, however, certain well-known details have been omitted to avoid obscuring the inventive aspects of the embodiments.
Embodiments presented herein relate to telephone-based (land or mobile) and internet-based call transactions. The words “transaction” and “call” are used throughout this application to indicate any type of telephone-based or internet-based communication. It is also envisioned that such transactions could be made with a combination of a telephone and an internet-connected device.
In all such transactions, the client (normally, but not necessarily, the dialing party) is the waiting party or on-hold party who interacts with an automated telephone-based service (normally, but not necessarily, the receiver of the call), which is the queuing party or holding party (different from the on-hold party). The terms “waiting party” and “queuing party” are used throughout this application to indicate these parties; however, it will be appreciated by those skilled in the art that the scope of the embodiments given herein applies to any two parties engaged in such transactions.
During a typical transaction between a waiting party and a queuing party, the waiting party needs to take certain measures like pressing different buttons or saying certain phrases to proceed to different levels of the transaction. In addition, the waiting party may have to wait “on hold” for a duration, before being able to talk to an actual person. Any combination of the two is possible and is addressed in the embodiments given herein.
To understand one example, as shown in
It is desirable for the waiting party to find out when the hold status changes from an on-hold state to a live state by a method other than constantly listening and paying attention. Accordingly, different embodiments presented herein address the issue of “hold status detection”.
In this disclosure, a “cue profile” of a queuing party refers to all the information available about that queuing party's hold status. In some embodiments presented herein, the preexisting cue profiles of different queuing parties are used to determine the hold status.
In some embodiments, the cue profile may contain the hold status “audio cues” which are used to detect the hold status for a particular queuing party. Audio cues are any audible cues that could bear information about the hold status. For instance, music, pre-recorded voice, silence, or any combination thereof could indicate an on-hold state. On the other hand, the voice of an actual person could indicate a live state. The event of transition from an on-hold state to a live state could be very subtle. For instance, the transition from a recorded message to a live agent speaking may not be accompanied by any distinctive audio message such as a standard greeting. Nevertheless, there are audio cues indicating the transition from an on-hold state to a live state. Such audio cues are called “transition audio cues”.
In some embodiments, certain preexisting data about a queuing party is used to determine the hold status. Such preexisting data is referred to as “cue metadata”. For example, the cue metadata may indicate the sensitivity required for each cue in order to dependably identify it in the audio stream while avoiding false positives. In these particular embodiments, the hold status audio cues in combination with the cue metadata are referred to as the cue profile.
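As an illustration only (the structure and all names below are hypothetical, not taken from the disclosure), a cue profile pairing audio cues with per-cue sensitivity metadata might be sketched as:

```python
from dataclasses import dataclass, field

@dataclass
class AudioCue:
    """A short reference clip (e.g. two seconds of hold music or a greeting)."""
    name: str
    samples: bytes          # encoded audio, e.g. MP3 bytes
    indicates: str          # "on_hold", "live", or "transition"

@dataclass
class CueProfile:
    """All hold-status information known for one queuing party."""
    queuing_party: str
    cues: list = field(default_factory=list)         # AudioCue entries
    sensitivity: dict = field(default_factory=dict)  # cue name -> match threshold in [0, 1]

    def threshold_for(self, cue_name: str) -> float:
        # When no metadata exists for a cue, fall back to a conservative
        # default, trading recall for fewer false positives.
        return self.sensitivity.get(cue_name, 0.9)

profile = CueProfile("Example Airline")
profile.cues.append(AudioCue("hold_music_loop", b"...", "on_hold"))
profile.sensitivity["hold_music_loop"] = 0.8
print(profile.threshold_for("hold_music_loop"))  # 0.8
print(profile.threshold_for("unknown_cue"))      # 0.9 (default)
```

The per-cue threshold reflects the sensitivity role the disclosure assigns to cue metadata; the default value is an assumption.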
Some embodiments described herein relate to finding the cue profile of a particular queuing party. In certain embodiments, the queuing party itself is used, at least partially, to provide cue metadata to create a cue profile. However, in other embodiments, the cooperation of the queuing party is not necessary.
In some embodiments, “dial-in profiling” is used to create a cue profile of a queuing party accessible through PSTN. The method used in these embodiments is an ordinary telephone connection as used by a typical waiting party.
Dial-in profiling is an iterative process performed to determine the hold status behavior of a queuing party.
In certain cases, dial-in profiling, as described herein, could be the only means for creating a cue profile of a queuing party. In addition, dial-in profiling, according to some embodiments, could also be used to update, expand, or edit a previously created cue profile.
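The disclosure does not specify the mechanics of the iteration. One hedged sketch, under the assumption that each test call is reduced to a sequence of per-window audio fingerprints: segments that recur across independent calls (menu prompts, hold music) are plausible hold-state cues, while live speech varies from call to call.

```python
def candidate_hold_cues(recordings):
    """Keep only fingerprints heard on every test call to the same
    queuing party; recurring segments are candidate hold cues."""
    common = set(recordings[0])
    for rec in recordings[1:]:
        common &= set(rec)
    return common

# Hypothetical fingerprints from three dial-in calls to one queuing party.
calls = [
    ["menu_v1", "music_a", "music_b", "agent_jane"],
    ["menu_v1", "music_a", "music_b", "agent_raj"],
    ["menu_v1", "music_a", "agent_kim"],
]
print(sorted(candidate_hold_cues(calls)))  # ['menu_v1', 'music_a']
```

Repeating this across additional calls would let the profile be updated or expanded, consistent with the iterative use described above.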
Audio cues could be stored in a standardized format (for example, MP3) and are of fixed time length, for instance two seconds. Another type of cue used in some embodiments is a text cue, which is stored in a standard format (for example ASCII) and is of fixed length (for example two syllables).
In some embodiments these two cues are used to create a confidence score. Shown in
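The disclosure does not fix a particular formula for the confidence score. As a hedged illustration, the two cue types might be combined with a simple weighted average (the weight and function name are assumptions for this sketch):

```python
def hold_confidence(audio_score, text_score, audio_weight=0.6):
    """Combine an audio-cue match score and a text-cue (speech-to-text)
    match score, each in [0, 1], into a single on-hold confidence."""
    if not (0.0 <= audio_score <= 1.0 and 0.0 <= text_score <= 1.0):
        raise ValueError("scores must be in [0, 1]")
    return audio_weight * audio_score + (1.0 - audio_weight) * text_score

# Strong audio-cue match, weaker text-cue match.
score = hold_confidence(0.9, 0.5)
print(round(score, 2))  # 0.74
```

A downstream decision could then compare this score against the per-cue sensitivity threshold stored in the cue metadata.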
In one embodiment, related to the case when the audio cues are not sufficient to detect the hold status, a verbal challenge is issued to the queuing party. A verbal challenge consists of a prerecorded message which is played to the queuing party at specific instances. For example, one verbal challenge may be “is this a live person?” After a verbal challenge has been issued, a speech recognition engine determines whether there is any response from a live person to the verbal challenge. Based on this, a judgment is made as to the hold status.
Verbal challenges can also make use of DTMF tones. For example, the challenge could be “press 1 if you are a real human”. In this case, the audio processing system will be searching for the DTMF tones instead of an audio cue. If the queuing party is in a live state, it may send an unprompted DTMF tone down the line in order to send preemptive notification of the end-of-hold transition. In order to handle this case, the audio system continuously listens for and detects DTMF tones.
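The disclosure does not name a detection method; a common technique for finding DTMF tones in an audio stream is the Goertzel algorithm, which measures the power of one frequency per pass. A minimal sketch (sample rate and block size are assumptions):

```python
import math

def goertzel_power(samples, sample_rate, freq):
    """Goertzel algorithm: power of one target frequency in a sample block."""
    k = round(len(samples) * freq / sample_rate)   # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / len(samples))
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

LOW = [697, 770, 852, 941]          # DTMF row frequencies (Hz)
HIGH = [1209, 1336, 1477, 1633]     # DTMF column frequencies (Hz)
KEYS = [["1", "2", "3", "A"], ["4", "5", "6", "B"],
        ["7", "8", "9", "C"], ["*", "0", "#", "D"]]

def detect_dtmf(samples, sample_rate=8000):
    """Return the key whose row and column tones dominate the block."""
    row = max(range(4), key=lambda i: goertzel_power(samples, sample_rate, LOW[i]))
    col = max(range(4), key=lambda j: goertzel_power(samples, sample_rate, HIGH[j]))
    return KEYS[row][col]

# Synthesize the "1" key: 697 Hz + 1209 Hz for 50 ms at 8 kHz.
rate, n = 8000, 400
tone = [math.sin(2 * math.pi * 697 * t / rate) +
        math.sin(2 * math.pi * 1209 * t / rate) for t in range(n)]
print(detect_dtmf(tone, rate))  # 1
```

Running this detector continuously over short blocks of the call audio matches the always-listening behavior described above.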
A typical apparatus built in accordance with some embodiments presented herein, is referred to as a “hold detection system” and it could comprise, inter alia, some of the following components:
It should be noted that any number of the components mentioned above could be integrated into a single component or device. It should also be noted that any device capable of using a preexisting cue profile database to determine the hold status in a call or transaction falls within the scope of the embodiments presented herein.
The embodiments presented herein address, inter alia, the following difficulties:
It will be obvious to those skilled in the art that one may be able to envision alternative embodiments without departing from the scope and spirit of the embodiments presented herein.
As will be apparent to those skilled in the art, various modifications and adaptations of the structure described above are possible without departing from the present invention, the scope of which is defined in the appended claims.
This application claims priority from U.S. Provisional Patent Application Ser. No. 60/989,908 filed Nov. 23, 2007, the disclosure of which is herein incorporated by reference in its entirety.
Publication: US 20090136014 A1, May 2009, US.