The invention relates to telecommunications and more specifically relates to providing enhanced telephony on a communication device.
Generally, Interactive Voice Response (IVR) systems allow a user to interact with an audio response system. The IVR systems can provide prompts to a user and receive touch tone and/or spoken responses to the prompts from the user. Through such IVR dialogue the system collects sufficient information about the user to direct the call to the most appropriate resource, information processing system, or the like. Various organizations such as banks, insurance companies, and other service providers use IVR systems to manage calls from their customers. Typically, IVR systems are used by organizations that have high call volumes. An objective of implementing IVR systems is to provide users or customers with a quick and satisfactory experience while reducing the cost of providing the services.
Typically, in the case of an audio IVR menu, the user calling the destination may have to listen to and follow the instructions of the menu to get a desired response or a function performed. Therefore, the process can be time consuming. Moreover, in case the user provides an incorrect input, the complete process may have to be repeated. Furthermore, the IVR menu for an organization may be updated or changed regularly. For example, extension numbers inside an organization may be changed and, correspondingly, the extension numbers associated with the IVR menu may be updated. As a result, a frequent user may not be able to reach a desired end by remembering a combination of numbers. Therefore, the user may become frustrated with the IVR systems.
Usually, the IVR menus are the same for all users. Therefore, the customer has to listen to them carefully to select the appropriate option. Some existing techniques try to address this problem by providing a visual form of IVR. U.S. Pat. No. 7,215,743, assigned to International Business Machines Corporation, and a published U.S. patent application with Ser. No. 11/957,605, filed Dec. 17, 2007 and assigned to Motorola Inc., provide the IVR menu of the destination in a visual form to the user. Therefore, the user can select the options from the IVR menu without listening to the complete audio IVR menu.
Various service providers that implement IVR systems may have multiple stores or outlets in and around a particular geographical area. Further, each outlet may have a different phone number but the same IVR menu. Therefore, the user may not be aware of all the phone numbers. Moreover, some outlets may be located relatively farther than other outlets from the geographical location of the caller. Further, some outlets may not provide the services desired by the user. Generally, more than one provider may offer similar products or services. For example, various banks may provide similar banking services, or various pizzerias may provide similar types of pizzas. Therefore, the user may prefer to call or visit a nearby outlet for better service and time management.
In the light of the above discussion, techniques are desired for providing enhanced telephony.
Embodiments of the invention provide an enhanced communication device. The enhanced communication device comprises a processor and a memory coupled to the processor. The memory comprises a database including one or more destination phone numbers and at least one property associated with the destination phone numbers. Further, the memory comprises instructions executable by the processor for identifying a dialed phone number of a destination, determining a location code associated with a current location of the communication device, comparing the dialed phone number to one or more destination phone numbers stored in a database, and displaying at least one property associated with the one or more destination phone numbers based on the comparison.
Embodiments of the invention provide an enhanced communication device. The enhanced communication device comprises a database including one or more destination phone numbers and at least one property associated with the destination phone numbers. Further, the enhanced communication device comprises means for identifying a dialed phone number of a destination, means for determining a location code associated with a current location of the communication device, means for comparing the dialed phone number to one or more destination phone numbers stored in a database, and means for displaying at least one property associated with the one or more destination phone numbers based on the comparison.
Embodiments of the invention provide a method for providing enhanced telephony. The method includes identifying a phone number of a destination dialed from a communication device; determining a location code associated with a current location of the communication device; and comparing the dialed phone number to one or more destination phone numbers stored in a database. The database may include at least one property associated with the destination phone numbers. Further, the method includes displaying, at the communication device, the at least one property associated with the one or more destination phone numbers based on the comparison.
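The identify-compare-display method described above can be sketched in outline. The following Python sketch is illustrative only and is not part of the claimed invention; the record layout, field names, and sample destination data are assumptions made for the example.

```python
from dataclasses import dataclass

# Hypothetical records standing in for the on-device database; the
# field names (phone, location_code, properties) are illustrative.
@dataclass
class Destination:
    phone: str
    location_code: str
    properties: dict

DATABASE = [
    Destination("18005551234", "415", {"name": "Pizzeria A", "rating": 4.5}),
    Destination("18005555678", "415", {"name": "Pizzeria B", "rating": 3.9}),
    Destination("12125550000", "212", {"name": "Pizzeria C", "rating": 4.8}),
]

def lookup(dialed_number: str, device_location_code: str) -> list:
    """Compare the dialed number against the stored destination numbers
    and return the properties of destinations that match either the
    dialed number or the device's current location code."""
    matches = [d for d in DATABASE
               if d.phone == dialed_number
               or d.location_code == device_location_code]
    return [d.properties for d in matches]
```

In this sketch, dialing a number while located in area "415" would surface both nearby destinations, letting the device display their properties before the call connects.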
An aspect of the invention is to provide a visual IVR menu of a destination according to the location of the communication device of a user and/or a location of the dialed destination phone number.
Another aspect of the invention is to provide the position based visual IVR menus in a communication network.
Having thus described the invention in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
Illustrative embodiments of the invention now will be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all embodiments of the invention are shown. Indeed, the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like numbers refer to like elements throughout.
The communication device 102a includes a Visuphone 104 that provides information regarding a phone number dialed from the communication device 102a. For example, the information may include geographical information of the destinations and/or the user. Further, the Visuphone 104 may display a visual IVR menu on the communication device 102a corresponding to the audible IVR menu based on a phone number of the destination to be connected. The Visuphone 104 may be hardware, software, or firmware implemented on the communication device 102a, or a combination thereof. The visual IVR menu may have one or more options. Thereafter, the user 106 can select the options of the audible IVR menu from the visual IVR menu display without having to listen to the audible instructions. An exemplary audible IVR menu at the destination 108a and a corresponding visual IVR menu are explained in detail in
In an embodiment of the invention, the communication device 102a can request updates from a server through a communication network. The server may maintain the updated information of destinations and their associated properties. The communication network can include more than one communication device. Examples of the communication network include, but are not limited to, the Network, PSTN, Local Area Network (LAN), Wide Area Network (WAN), and so forth.
In an embodiment of the invention, the communication device 102 may be an in-car navigation system such as a Global Positioning System (GPS). Therefore, when the user 106 dials a phone number of a destination, the Visuphone 104 may provide a representation of one or more destinations on a map on the communication device 102a screen. Further, the properties, such as location, reviews, and ratings, associated with the destinations may be displayed on the map. The user interacts with the displayed map and can select a destination from the map based on the one or more properties. Further, the user 106 can select a destination by clicking or scrolling on the map. In an embodiment of the invention, a route map from the user 106 to the selected destination may be displayed on the communication device 102a screen.
The communication device 102a includes a display 402 to output graphical information to the user 106. In an embodiment of the invention, the display 402 may include a touch sensitive screen. Therefore, the user 106 can provide inputs to the communication device 102a by touching the display 402 or by pointing and clicking with a mouse. Memory 406 of the communication device 102a stores various programs, data and/or instructions that can be executed by a processor 404. Examples of the memory 406 include, but are not limited to, a Random Access Memory (RAM), a Read Only Memory (ROM), a hard disk, and so forth. A person skilled in the art will appreciate that other types of computer-readable media which can store data that is accessible by a computer, such as magnetic cassettes, flash memory cards, digital video disks, and the like, may also be used by the communication device 102a. The memory 406 may include an Operating System (OS) (not shown) for the communication device 102a to function. Further, the memory 406 may include other applications that enable the user 106 to communicate with the destinations 108a-n. Examples of other applications include, but are not limited to, Skype, Google Talk, Magic Jack, and so forth. Other applications may be stored as software or firmware on the communication device 102a.
Further, the memory 406 includes Visuphone 104 for providing a visual representation of the destinations 108a-n. As discussed with reference to the
Further, the Visuphone 104 may filter the results of the search based on a location code associated with the communication device 102a. The location code may be associated with the current location of the user 106 of the communication device 102. In an embodiment of the invention, the Visuphone 104 may also filter the results based on the location code of the communication device 102a and dialed destination phone number. The Visuphone 104 displays the visual IVR menu of the dialed destination phone number when a matching destination based on the location code is not found. When a matching destination is found, the Visuphone 104 displays a visual representation including one or more destinations with their associated properties on a display 402, as discussed with reference to
The user 106 can select a destination from the representation of one or more destinations on the communication device 102a screen according to his/her preference. The user may prefer to select a destination that is nearer to his/her present location even though it is not the one he/she dialed. The user may also prefer to select a destination that has good reviews even though it is a little farther than the dialed destination. Subsequently, the visual IVR menu of the selected destination is presented on the display 402. The visual IVR menu has one or more options. Thereafter, the user 106 can interact with the visual IVR menu accordingly.
The user 106 may dial a phone number corresponding to a destination using the keyboard 418. The keyboard 418 may be a physical keyboard or a virtual keyboard displayed on a touch screen display 402. In an embodiment, the keyboard 418 is a keypad on the communication device 102a. Subsequently, after some processing by the Visuphone 104, the visual IVR menu 302 corresponding to the dialed destination phone number is searched and displayed on the display 402.
In an exemplary instance, if the user 106 dials a phone number of a destination, then a representation of one or more destinations is displayed on the display 402. Thereafter, on selection by the user 106, a visual IVR menu corresponding to an audible IVR menu of the selected destination is displayed on the display 402. Similarly, if the user 106 receives a call from a destination phone number, then a visual IVR menu corresponding to the audible IVR menu of the destination is displayed on the display 402. Thereafter, the user 106 can interact with the visual IVR menu to select an option from the visual IVR menu. The representation of one or more destinations and the visual IVR menu are displayed before the actual connection of the communication device 102a to the destination. Therefore, the user 106 can select a desired action from the visual IVR menu before connecting to the destination. In an embodiment of the invention, the visual IVR menu may be provided in real time to the user. In an embodiment of the invention, the visual IVR menu is provided by a messaging service such as a Short Messaging Service (SMS). Therefore, destinations may provide a customized visual IVR menu to the user 106. The visual IVR menu may be customized based on the profile of the user. In an embodiment of the invention, the profile may be generated based on the access pattern of the user or the data captured by a hub connected to the communication device 102a.
The user can interact with the visual IVR menu by pressing a desired button on the keyboard 418. For example, the user can press the '3' key on the keyboard 418 to select node 3 in the visual IVR menu 302. Further, in the case of a touch sensitive screen, the user 106 can directly select node 3 of the visual IVR menu 302 from the display 402. Depending on the complexity or size of the destination, the visual IVR menu 302 may have various nodes. Moreover, the display area of the display 402 may be limited or small. As a result, all the nodes of the visual IVR menu 302 may not be displayed together on the display 402. In such a case, the Visuphone 104 is configured to allow the user 106 to navigate by scrolling horizontally and/or vertically to view the nodes of the visual IVR menu 302. Further, the Visuphone 104 may detect the capability of the communication device 102a before displaying the visual IVR menu 302. For example, in case the communication device 102a is a basic mobile phone with a display screen of limited functionality, the Visuphone 104 may display the visual IVR menu in the form of a simple list. Similarly, a list may be displayed in the case of fixed line or wired telephones. Moreover, in case the communication device 102a includes a high capability screen, such as but not limited to an iPhone, the visual IVR menu is displayed in the form of graphics. Subsequently, after the user 106 selects a desired action from the visual IVR menu 302, a connection is established between the communication device 102a and the selected destination. In one embodiment, the Visuphone 104 is configured to detect and present an application or computer program available on the communication device 102a.
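The capability-dependent rendering described above can be sketched as follows. This is a minimal illustration, not the claimed implementation; the node list of (key, label, depth) tuples and the capability labels are assumptions made for the example.

```python
def render_menu(nodes, screen_capability):
    """Render a visual IVR menu according to the device's display
    capability: a plain numbered list for basic phones and fixed-line
    sets, an indented tree (standing in for graphics) for
    high-capability screens."""
    if screen_capability == "basic":
        return "\n".join(f"{key}. {label}" for key, label, _ in nodes)
    # High-capability screen: indent children to suggest the tree layout.
    return "\n".join("  " * depth + f"[{key}] {label}"
                     for key, label, depth in nodes)

# Hypothetical three-node menu: (key, label, depth in the menu tree).
MENU = [("1", "Account balance", 0), ("2", "Transfers", 0), ("3", "Agent", 1)]
```

A basic device would receive the flat numbered list, while a touch-screen device would receive the indented tree with scrollable nodes.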
In an embodiment, a user 106 may dial a phone number from a VOIP application 428 on the communication device 102b, as shown with reference to
In an embodiment, the Visuphone 104 may include a VOIP plug-in that monitors the outgoing calls made from the VOIP application. Therefore, the VOIP plug-in may search for each dialed number in the database 414. In case the dialed number is found in the database 414 and is associated with an audible IVR, the VOIP plug-in may display the visual IVR menu corresponding to the audible IVR menu of the dialed destination phone number.
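A minimal sketch of the plug-in's lookup step, assuming a toy mapping from phone numbers to menus; the numbers and menu entries are invented for illustration and do not reflect any actual database schema.

```python
# Hypothetical database: number -> visual IVR menu, or None when the
# number is known but has no audible IVR associated with it.
IVR_DATABASE = {
    "18005551234": ["1: Sales", "2: Support", "3: Billing"],
    "14155550000": None,
}

def on_outgoing_call(dialed_number):
    """Invoked by the VOIP plug-in for each outgoing call; returns the
    visual IVR menu to display, or None when no menu applies."""
    menu = IVR_DATABASE.get(dialed_number)
    if menu:  # number found in the database and associated with an IVR
        return menu
    return None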
In one embodiment, the Visuphone 104 is configured to detect and present applications suitable to the user 106 for initiating the connection. For example, the Visuphone 104 may detect more than one VOIP application present on the communication device 102b and present them to the user 106 on the display 508. Thereafter, the user 106 can select an application to be used or initiate the connection in a default configuration. The default configuration can be, for example, the VOIP application 428 on which the destination phone number was dialed. In another embodiment, the user 106 may select a phone number displayed in applications such as a browser, messenger, or mail client. Subsequently, the Visuphone 104 detects and presents applications suitable to the user 106 for initiating the connection. Furthermore, the Visuphone 104 is configured to display the visual IVR menu 302 for the phone number selected from the applications.
In an embodiment, the communication device 102b may include a web browser to display web pages from the Network and/or other computer networks. Various websites provide a phone number on their web pages as a click-to-talk button. The clickable button can provide, for example, a contact number of executives of the organization. The clickable button may be programmed to display a phone number of the organization and/or display a form for the user to provide his contact details, so that an executive from the organization can call back the user. The Visuphone 104 is configured to detect a connect button on a webpage. The connect button may be used by the Visuphone 104 to initiate a connection to a destination. The Visuphone 104 detects and launches a VOIP application on the communication device 102b. In an embodiment, in case more than one application is available on the communication device 102, the Visuphone 104 selects a VOIP application preferred by the user 106. Moreover, the Visuphone 104 may be configured to automatically log in to the VOIP application. In an embodiment, the user 106 stores the login details for the VOIP application in the Visuphone 104. Further, the Visuphone 104 displays a visual IVR menu corresponding to the audible IVR menu of the destination connected once the connect button is clicked. Therefore, the user 106 can connect to the destination from the web browser automatically and may not be required to dial the phone number or provide call-back information.
The communication device 102b includes a display interface 502 to connect to a display 508. The display interface 502 can be, for example, a video adapter. The display 508 outputs graphical information to the user 106. In an embodiment of the invention, the display 508 includes a touch sensitive screen. Therefore, the user 106 can provide inputs to the communication device 102b by touching the display 508 or by pointing and clicking with the mouse 514. Memory 506 of the communication device 102b stores various programs, data and/or instructions that can be executed by a processor 504. Examples of the memory 506 include, but are not limited to, a Random Access Memory (RAM), a Read Only Memory (ROM), a hard disk, and so forth. A person skilled in the art will appreciate that other types of computer-readable media which can store data that is accessible by a computer, such as magnetic cassettes, flash memory cards, digital video disks, and the like, may also be used by the communication device 102b. The memory 506 may include an Operating System (OS) (not shown) for the communication device 102b to function. Further, the memory 506 may include other applications that enable the user 106 to communicate with the destinations 108a-n. Examples of other applications include, but are not limited to, Skype, Google Talk, Magic Jack, and so forth. Other applications may be software or firmware stored on the communication device 102b. Further, the memory 506 includes the Visuphone 104 for searching and selecting one or more destinations matching a business category of the dialed destination phone number. Further, the Visuphone 104 is capable of filtering the one or more destinations based on the location code of the communication device 102b. The location code determines the current location of the communication device 102b. In an embodiment, the Visuphone 104 can filter the one or more destinations based on the current location of the dialed destination phone number.
Further, the Visuphone 104 is capable of presenting a visual IVR menu corresponding to the audible IVR menu of a selected destination as discussed with reference to
In an exemplary instance, the user 106 dials a destination phone number and is presented with a representation of the one or more destinations with their associated at least one property. As discussed in
Another embodiment of the invention allows the user to select the visual IVR menu using a car display, such as a GPS display. Hands-free cell phone systems are used in many cars, either as separate devices or as systems integrated into the car. These devices allow the user to talk on the cell phone without holding it in his hands. Some devices use the car speakers for the phone call. In many cases, the hands-free system can use a display screen in the car, such as the GPS screen or another display. Following a voice menu while driving may not be the optimal way to use a hands-free cell phone system; in some cases, selecting an option from a visual IVR menu is preferred. While driving or stopped at a red light, it may be easier to use a larger display, such as the GPS display in the car. The display can present the visual IVR menu, and the user can select an option from the menu. The computing engine to support the visual IVR menu could be embedded in the car GPS system or in another controller that has access to the car display. Once the system recognizes the destination of a call to be an IVR, it will access the database, pull out the representation of one or more destinations, and display it. Accordingly, all the other features of the Visuphone 104 may be incorporated.
At step 602, the Visuphone 104 identifies a phone number of a destination dialed by the user 106 of the communication device 102. In an embodiment of the invention, the number is selected by clicking on the display of the communication device 102. The number is identified by the processor 404. In an embodiment of the invention, the Visuphone 104 displays at least one property associated with one or more destinations 108a-n based on the identified dialed phone number of the destination.
Further, at step 604, a location code associated with the current location of the communication device 102 is determined. The location code determines the present location of the communication device 102. When the user 106 is in his/her home country or state, the location code of the communication device and the location code of the dialed destination phone number are the same. In an embodiment, the location code of the user 106 is different from the location code of the dialed destination phone number. This may happen when the user 106 is travelling and the communication device 102 is roaming. The processor 404 determines the location code of the communication device 102. At step 606, the processor 404 determines a business category associated with the dialed destination phone number. The destinations 108a-n are categorized into various groups based on their associated business category. Various destinations are grouped into a business category based on the services and operations of the destinations.
At step 608, the processor 404 searches the database 414 for phone numbers of the destinations matching the business category of the dialed destination phone number. Further, at step 610, the processor 404 checks whether destination phone numbers matching the business category are available in the database 414. In case the one or more destination phone numbers are available, the process continues to step 612; else the process continues to step 628. At step 612, the processor 404 searches for at least one phone number from the one or more destination phone numbers based on the location code. The location code is associated with the communication device 102 and determines its present location. In an embodiment, the processor 404 searches for at least one phone number from the one or more destination phone numbers based on the location code of the communication device 102b and a location code of the dialed destination phone number. At step 614, the processor 404 checks whether the at least one destination phone number matching the location code of the communication device is available in the database 414. In case the at least one destination phone number, based on the location code, is not available, step 626 is executed. At step 626, a visual IVR menu of the dialed destination phone number is displayed on the display 402, as shown with reference to
At step 614, when the at least one destination phone number matching the location code of the communication device 102 is available in the database 414, the process continues to step 616. At step 616, the processor 404 displays the at least one destination phone number and its associated properties on the display 402 of the communication device 102. Further, at step 618, the user 106 selects a destination phone number from the displayed destinations. At step 620, a visual IVR menu of the selected destination is displayed on the communication device 102 screen. The processor 404 displays the visual IVR menu on the display 402. As discussed with reference to
At step 610, when the one or more destination phone numbers matching the business category of the dialed phone number are not available in the database, the process continues to step 628, where the communication device 102 requests updates from the server. The updates include information on the destinations 108a-n. The destination information includes the destination phone numbers and their associated properties. Further, at step 630, the updates are received from the server by the communication device 102. Then, at step 632, the received updates are stored in the database 414 on the communication device 102. Thereafter, the process continues to step 608. In another embodiment of the invention, an electronic yellow pages directory allows dialing the number directly from the directory and further provides the representation of one or more destinations and the visual IVR menu of the destination. The user can select the exact destination before dialing or follow the visual IVR menu after dialing. For example, an airline company might have various options, menus, and layers in a large organization. Selecting the exact department in the organization before dialing can save the user the time and overhead of listening to the menu and making decisions based on the voice menu. The yellow pages company can have a copy of the visual IVR menu database or can be connected to a visual IVR menu service in order to provide the menu to the user.
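The category search, server-update fallback, and location filtering of steps 608 through 632 can be sketched in outline. This Python sketch is illustrative only; the dictionary-based database and the `fetch_updates` callable (standing in for the server request) are assumptions made for the example.

```python
def find_destinations(dialed, database, device_location, fetch_updates):
    """Illustrative walk through steps 608-632: search the database by
    the dialed number's business category, request server updates when
    no match exists, then filter the matches by location code."""
    def category_matches():
        category = database.get(dialed, {}).get("category")
        return [n for n, rec in database.items()
                if rec.get("category") == category]

    matches = category_matches()
    if not matches:                    # steps 628-632: fetch and store updates
        database.update(fetch_updates())
        matches = category_matches()   # step 608 repeated after the update
    nearby = [n for n in matches
              if database[n].get("location") == device_location]
    if not nearby:                     # step 626: fall back to the dialed number
        return [dialed]
    return nearby                      # step 616: destinations to display
```

When no nearby destination shares the dialed number's business category, the sketch degrades gracefully to showing only the dialed destination, mirroring step 626.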
Alternatively, an enhanced web based yellow page could be provided, wherein the user can first choose the provider he wishes to contact. Thereafter, if that destination provides an IVR, the enhanced yellow page will use the visual IVR menu database to present a visual IVR menu on the web page. Moreover, the user can click to choose the internal destination of that provider, and the enhanced yellow page may accordingly initiate the call. The call could be made using the conventional telephone network or PSTN. In this case, the enhanced yellow page may need the user's telephone number to perform the connection. Alternatively, the enhanced yellow page could use VOIP to connect the user over the web to the IVR of the destination.
In some IVR systems, the user may have to wait or hold in a queue of previous callers until the specific department or agent is available. In another embodiment of the invention, the enhanced yellow page system connects the user only after the specific agent is available, without waiting in a long queue. The system can recognize the waiting queue message of the specific department and connect the user only after the agent answers. Therefore, the waiting time of the user in the phone queue, which sometimes may be very long, may be reduced. The system can park on the line for the specific entry in the menu; as soon as the agent is available, the user gets a signal to start the conversation with the agent.
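The queue-parking behavior described above can be sketched as a simple polling loop. This is a sketch under stated assumptions, not the claimed implementation: `poll_agent_status` and `notify_user` are hypothetical callables standing in for the system's detection that the hold message has ended and its signal to the user.

```python
import time

def park_on_queue(poll_agent_status, notify_user,
                  poll_interval=1.0, timeout=60.0):
    """Hold the line in place of the user: periodically poll a
    hypothetical agent-availability predicate and signal the user
    only when an agent actually answers."""
    waited = 0.0
    while waited < timeout:
        if poll_agent_status():   # agent picked up (hold message ended)
            notify_user()         # signal the user to start the conversation
            return True
        time.sleep(poll_interval)
        waited += poll_interval
    return False                  # gave up before an agent answered
```

In practice the predicate might be driven by audio analysis of the hold message, but that detection logic is outside the scope of this sketch.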
An additional advantage of the invention relates to users who are more proficient in a foreign language. The application may provide the visual IVR menu in multiple languages. A user can then choose a language of his choice and download the menu in that language to his device database.
Another advantage of the invention is that it provides the user with information about all the matching destinations available in his/her vicinity. Thus the user has more destinations to choose from based on their one or more properties.
Yet another advantage of the invention relates to IVR systems that ask for voice commands. This IVR interface is harder for some users to use due to accents or other problems. The database could be provided with the option, as described before, for the system to output a voice command according to the user's selection of the menu options.
Embodiments of the invention are described above with reference to block diagrams and schematic illustrations of methods and systems according to embodiments of the invention. It will be understood that each block of the diagrams and combinations of blocks in the diagrams can be implemented by computer program instructions. These computer program instructions may be loaded onto one or more general purpose computers, special purpose computers, or other programmable data processing apparatus to produce machines, such that the instructions which execute on the computers or other programmable data processing apparatus create means for implementing the functions specified in the block or blocks. Such computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means that implement the function specified in the block or blocks.
While the invention has been described in connection with what is presently considered to be the most practical and various embodiments, it is to be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims. The invention has been described in the general context of computing devices, phones, and computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, characters, components, data structures, etc., that perform particular tasks or implement particular abstract data types. A person skilled in the art will appreciate that the invention may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and the like. Further, the invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined in the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.
This application is a Continuation-In-Part (CIP) of U.S. Non-Provisional application Ser. No. 12/699,618 entitled ‘Systems and methods for visual presentation and selection of IVR menu’ and filed on Feb. 3, 2010.
Number | Name | Date | Kind |
---|---|---|---|
4653045 | Stanley et al. | Mar 1987 | A |
4736405 | Akiyama | Apr 1988 | A |
4897866 | Majmudar et al. | Jan 1990 | A |
5006987 | Harles | Apr 1991 | A |
5007429 | Treatch et al. | Apr 1991 | A |
5027400 | Baji et al. | Jun 1991 | A |
5086385 | Launey et al. | Feb 1992 | A |
5144548 | Salandro | Sep 1992 | A |
5265014 | Haddock et al. | Nov 1993 | A |
5294229 | Hartzell et al. | Mar 1994 | A |
5335276 | Thompson et al. | Aug 1994 | A |
5416831 | Chewning, III et al. | May 1995 | A |
5417575 | McTaggart | May 1995 | A |
5422809 | Griffin et al. | Jun 1995 | A |
5465213 | Ross | Nov 1995 | A |
5465401 | Thompson | Nov 1995 | A |
5475399 | Borsuk | Dec 1995 | A |
5499330 | Lucas et al. | Mar 1996 | A |
5519809 | Husseiny et al. | May 1996 | A |
5533102 | Robinson et al. | Jul 1996 | A |
5550746 | Jacobs | Aug 1996 | A |
5572581 | Sattar et al. | Nov 1996 | A |
5585858 | Harper et al. | Dec 1996 | A |
5586235 | Kauffman | Dec 1996 | A |
5588044 | Lofgren et al. | Dec 1996 | A |
5592538 | Kosowsky et al. | Jan 1997 | A |
5606361 | Davidsohn et al. | Feb 1997 | A |
5633909 | Fitch | May 1997 | A |
5633916 | Goldhagen et al. | May 1997 | A |
5657221 | Warman et al. | Aug 1997 | A |
5689648 | Diaz et al. | Nov 1997 | A |
5724412 | Srinivasan | Mar 1998 | A |
5739814 | Ohara et al. | Apr 1998 | A |
5740549 | Reilly et al. | Apr 1998 | A |
5768142 | Jacobs | Jun 1998 | A |
5790652 | Gulley et al. | Aug 1998 | A |
5794205 | Walters et al. | Aug 1998 | A |
5796806 | Birckbichler | Aug 1998 | A |
5802283 | Grady et al. | Sep 1998 | A |
5802526 | Fawcett et al. | Sep 1998 | A |
5807336 | Russo et al. | Sep 1998 | A |
5819225 | Eastwood et al. | Oct 1998 | A |
5822404 | Cave | Oct 1998 | A |
5822405 | Astarabadi | Oct 1998 | A |
5838682 | Dekelbaum et al. | Nov 1998 | A |
5838775 | Montalbano | Nov 1998 | A |
5867816 | Nussbaum | Feb 1999 | A |
5873068 | Beaumont et al. | Feb 1999 | A |
5885083 | Ferrell | Mar 1999 | A |
5885245 | Lynch et al. | Mar 1999 | A |
5890123 | Brown et al. | Mar 1999 | A |
5892813 | Morin et al. | Apr 1999 | A |
5907793 | Reams | May 1999 | A |
5912952 | Brendzel | Jun 1999 | A |
5913195 | Weeren et al. | Jun 1999 | A |
5920477 | Hoffberg et al. | Jul 1999 | A |
5937040 | Wrede et al. | Aug 1999 | A |
5940488 | DeGrazia et al. | Aug 1999 | A |
5948040 | DeLorme et al. | Sep 1999 | A |
5956034 | Sachs et al. | Sep 1999 | A |
5982875 | Lieben et al. | Nov 1999 | A |
5987103 | Martino | Nov 1999 | A |
6009398 | Mueller et al. | Dec 1999 | A |
6014428 | Wolf | Jan 2000 | A |
6020915 | Bruno et al. | Feb 2000 | A |
6049779 | Berkson | Apr 2000 | A |
6055513 | Katz et al. | Apr 2000 | A |
6062863 | Kirksey et al. | May 2000 | A |
6088429 | Garcia | Jul 2000 | A |
6088712 | Huang et al. | Jul 2000 | A |
6091805 | Watson | Jul 2000 | A |
6091956 | Hollenberg | Jul 2000 | A |
6104790 | Narayanaswami | Aug 2000 | A |
6144848 | Walsh et al. | Nov 2000 | A |
6148065 | Katz | Nov 2000 | A |
6169734 | Wilson | Jan 2001 | B1 |
6212547 | Ludwig et al. | Apr 2001 | B1 |
6228921 | Kasemann et al. | May 2001 | B1 |
6229694 | Kono | May 2001 | B1 |
6230197 | Beck et al. | May 2001 | B1 |
6259444 | Palmer et al. | Jul 2001 | B1 |
6263064 | O'Neal et al. | Jul 2001 | B1 |
6273726 | Kirksey et al. | Aug 2001 | B1 |
6321198 | Hank et al. | Nov 2001 | B1 |
6335678 | Heutschi | Jan 2002 | B1 |
6366650 | Rhie et al. | Apr 2002 | B1 |
6373817 | Kung et al. | Apr 2002 | B1 |
6400807 | Hewitt et al. | Jun 2002 | B1 |
6405033 | Kennedy, III et al. | Jun 2002 | B1 |
6408301 | Patton et al. | Jun 2002 | B1 |
6427063 | Cook et al. | Jul 2002 | B1 |
6445694 | Swartz | Sep 2002 | B1 |
6449595 | Arslan et al. | Sep 2002 | B1 |
6456706 | Blood et al. | Sep 2002 | B1 |
6460057 | Butler et al. | Oct 2002 | B1 |
6463145 | O'Neal et al. | Oct 2002 | B1 |
6482156 | Iliff | Nov 2002 | B2 |
6505146 | Blackmer | Jan 2003 | B1 |
6510411 | Norton et al. | Jan 2003 | B1 |
6529500 | Pandharipande | Mar 2003 | B1 |
6560320 | Paleiov et al. | May 2003 | B1 |
6603840 | Fellingham et al. | Aug 2003 | B2 |
6606611 | Khan | Aug 2003 | B1 |
6606741 | Kojima et al. | Aug 2003 | B2 |
6636835 | Ragsdale et al. | Oct 2003 | B2 |
6653930 | Bonomo et al. | Nov 2003 | B1 |
6658389 | Alpdemir | Dec 2003 | B1 |
6705869 | Schwartz | Mar 2004 | B2 |
6714519 | Luzzatti et al. | Mar 2004 | B2 |
6731625 | Eastep et al. | May 2004 | B1 |
6754181 | Elliott et al. | Jun 2004 | B1 |
6788770 | Cook et al. | Sep 2004 | B1 |
6791583 | Tang et al. | Sep 2004 | B2 |
6816580 | Timmins | Nov 2004 | B2 |
6820037 | Simon | Nov 2004 | B2 |
6820062 | Gupta et al. | Nov 2004 | B1 |
6826194 | Vered et al. | Nov 2004 | B1 |
6829368 | Meyer et al. | Dec 2004 | B2 |
6856673 | Banks et al. | Feb 2005 | B1 |
6862713 | Kraft et al. | Mar 2005 | B1 |
6865268 | Matthews et al. | Mar 2005 | B1 |
6885737 | Gao et al. | Apr 2005 | B1 |
6889195 | Strandberg | May 2005 | B2 |
6920205 | Hahn et al. | Jul 2005 | B2 |
6920425 | Will et al. | Jul 2005 | B1 |
6920431 | Showghi et al. | Jul 2005 | B2 |
6937705 | Godfrey et al. | Aug 2005 | B1 |
6968506 | Yacovone et al. | Nov 2005 | B2 |
6990455 | Vozick | Jan 2006 | B2 |
7020609 | Thrift et al. | Mar 2006 | B2 |
7027990 | Sussman | Apr 2006 | B2 |
7036128 | Julia et al. | Apr 2006 | B1 |
7039589 | Whitham | May 2006 | B2 |
7047196 | Calderone et al. | May 2006 | B2 |
7065188 | Mei et al. | Jun 2006 | B1 |
7068643 | Hammond | Jun 2006 | B1 |
7092738 | Creamer et al. | Aug 2006 | B2 |
7100118 | Klask | Aug 2006 | B1 |
7130391 | Janakiraman et al. | Oct 2006 | B2 |
7136480 | Mason | Nov 2006 | B2 |
7139591 | Callaghan et al. | Nov 2006 | B2 |
7145902 | Schindler et al. | Dec 2006 | B2 |
7146321 | Cyr et al. | Dec 2006 | B2 |
7149549 | Ortiz et al. | Dec 2006 | B1 |
7159008 | Wies et al. | Jan 2007 | B1 |
7177814 | Gong et al. | Feb 2007 | B2 |
7180889 | Kung et al. | Feb 2007 | B1 |
7180985 | Colson et al. | Feb 2007 | B2 |
7181401 | Johnson et al. | Feb 2007 | B2 |
7181502 | Incertis | Feb 2007 | B2 |
7188352 | Nathan et al. | Mar 2007 | B2 |
7203517 | Shimoda et al. | Apr 2007 | B2 |
7206745 | Surukkai et al. | Apr 2007 | B2 |
7206809 | Ludwig et al. | Apr 2007 | B2 |
7209124 | Hunt et al. | Apr 2007 | B2 |
7213061 | Hite et al | May 2007 | B1 |
7215743 | Creamer et al. | May 2007 | B2 |
7216348 | deCarmo | May 2007 | B1 |
7225409 | Schnarel et al. | May 2007 | B1 |
7225455 | Bennington et al. | May 2007 | B2 |
7228492 | Graham | Jun 2007 | B1 |
7231636 | Evans | Jun 2007 | B1 |
7231656 | Nathan | Jun 2007 | B1 |
7240006 | Brocious et al. | Jul 2007 | B1 |
7240289 | Naughton et al. | Jul 2007 | B2 |
7246063 | James et al. | Jul 2007 | B2 |
7248885 | Benco et al. | Jul 2007 | B2 |
7250939 | Lira | Jul 2007 | B2 |
7254227 | Mumick et al. | Aug 2007 | B2 |
7265861 | Ranalli et al. | Sep 2007 | B1 |
7266185 | Trandel et al. | Sep 2007 | B2 |
7266186 | Henderson | Sep 2007 | B1 |
7266499 | Surace et al. | Sep 2007 | B2 |
7272222 | Joseph et al. | Sep 2007 | B2 |
7272497 | Koshiji et al. | Sep 2007 | B2 |
7277854 | Bennett et al. | Oct 2007 | B2 |
7280097 | Chen et al. | Oct 2007 | B2 |
7280646 | Urban et al. | Oct 2007 | B2 |
7280651 | Anderson | Oct 2007 | B2 |
7286990 | Edmonds et al. | Oct 2007 | B1 |
7289608 | Kumhyr | Oct 2007 | B2 |
7289904 | Uyeki | Oct 2007 | B2 |
7299405 | Lee et al. | Nov 2007 | B1 |
7303121 | Martinez | Dec 2007 | B2 |
7319477 | Katz | Jan 2008 | B2 |
7324947 | Jordan et al. | Jan 2008 | B2 |
7328239 | Berberian et al. | Feb 2008 | B1 |
7330890 | Partovi et al. | Feb 2008 | B1 |
7353016 | Roundtree et al. | Apr 2008 | B2 |
7392193 | Mault | Jun 2008 | B2 |
7398215 | Mesbah et al. | Jul 2008 | B2 |
7406413 | Geppert et al. | Jul 2008 | B2 |
7412533 | Johnson et al. | Aug 2008 | B1 |
7433452 | Taylor et al. | Oct 2008 | B2 |
7440898 | Eberle et al. | Oct 2008 | B1 |
7450112 | Shneidman | Nov 2008 | B2 |
7466803 | Burg et al. | Dec 2008 | B2 |
7492883 | Kumhyr | Feb 2009 | B2 |
7539484 | Roundtree | May 2009 | B2 |
7546143 | Nelson et al. | Jun 2009 | B2 |
7584249 | Mummick et al. | Sep 2009 | B2 |
7606741 | King et al. | Oct 2009 | B2 |
7646858 | Salafia et al. | Jan 2010 | B2 |
7693720 | Kennewick et al. | Apr 2010 | B2 |
7720091 | Faber et al. | May 2010 | B2 |
7729490 | Hemm et al. | Jun 2010 | B2 |
7757173 | Beaman | Jul 2010 | B2 |
7809376 | Letourneau et al. | Oct 2010 | B2 |
7813485 | Yin et al. | Oct 2010 | B2 |
7843899 | Burritt | Nov 2010 | B2 |
7864944 | Khouri et al. | Jan 2011 | B2 |
7908381 | Koch et al. | Mar 2011 | B2 |
7966188 | Ativanichayaphong et al. | Jun 2011 | B2 |
8000454 | Or-Bach et al. | Aug 2011 | B1 |
8023624 | Kargman et al. | Sep 2011 | B2 |
8054952 | Or-Bach et al. | Nov 2011 | B1 |
8155280 | Or-Bach et al. | Apr 2012 | B1 |
8160215 | Or-Bach et al. | Apr 2012 | B2 |
8223931 | Lavian et al. | Jul 2012 | B1 |
8345835 | Or-Bach et al. | Jan 2013 | B1 |
8406388 | Or-Bach et al. | Mar 2013 | B2 |
20020055844 | L'Esperance et al. | May 2002 | A1 |
20020147986 | Michael et al. | Oct 2002 | A1 |
20030005126 | Schwartz et al. | Jan 2003 | A1 |
20030007625 | Pines et al. | Jan 2003 | A1 |
20030033382 | Bogolea et al. | Feb 2003 | A1 |
20030074198 | Sussman | Apr 2003 | A1 |
20030112931 | Brown et al. | Jun 2003 | A1 |
20040034561 | Smith | Feb 2004 | A1 |
20040122941 | Creamer et al. | Jun 2004 | A1 |
20040198316 | Johnson | Oct 2004 | A1 |
20040204116 | Ben Efraim et al. | Oct 2004 | A1 |
20050004977 | Roberts et al. | Jan 2005 | A1 |
20050055310 | Drewett et al. | Mar 2005 | A1 |
20060203977 | Erhart et al. | Sep 2006 | A1 |
20060239422 | Rinaldo et al. | Oct 2006 | A1 |
20060259424 | Turcotte et al. | Nov 2006 | A1 |
20060262921 | Eppel et al. | Nov 2006 | A1 |
20060285662 | Yin et al. | Dec 2006 | A1 |
20070026852 | Logan et al. | Feb 2007 | A1 |
20070032247 | Shaffer et al. | Feb 2007 | A1 |
20070038513 | Flax et al. | Feb 2007 | A1 |
20070094109 | Perry | Apr 2007 | A1 |
20070123223 | Letourneau et al. | May 2007 | A1 |
20070239537 | Protheroe et al. | Oct 2007 | A1 |
20070243887 | Bandhole et al. | Oct 2007 | A1 |
20070298776 | Arlene | Dec 2007 | A1 |
20080039056 | Mathews et al. | Feb 2008 | A1 |
20080066015 | Blankenhorn | Mar 2008 | A1 |
20080095330 | Jin et al. | Apr 2008 | A1 |
20080226042 | Singh | Sep 2008 | A1 |
20080250334 | Price | Oct 2008 | A1 |
20080268823 | Shalev et al. | Oct 2008 | A1 |
20090041215 | Schmitt et al. | Feb 2009 | A1 |
20090116414 | Or et al. | May 2009 | A1 |
20090136014 | Bigue et al. | May 2009 | A1 |
20090154666 | Rios et al. | Jun 2009 | A1 |
20090202050 | Berger et al. | Aug 2009 | A1 |
20090207980 | Berger et al. | Aug 2009 | A1 |
20090207996 | Berger et al. | Aug 2009 | A1 |
20090225788 | Kephart et al. | Sep 2009 | A1 |
20090228908 | Margis et al. | Sep 2009 | A1 |
20090276441 | Malik | Nov 2009 | A1 |
20090276708 | Smith et al. | Nov 2009 | A1 |
20090280863 | Shin et al. | Nov 2009 | A1 |
20090285380 | Chen et al. | Nov 2009 | A1 |
20100007028 | Fachmann et al. | Jan 2010 | A1 |
20100021030 | Collins et al. | Jan 2010 | A1 |
20100049654 | Pilo et al. | Feb 2010 | A1 |
20100087175 | Roundtree | Apr 2010 | A1 |
20100100377 | Madhavapeddi et al. | Apr 2010 | A1 |
20100166158 | Costello et al. | Jul 2010 | A1 |
20100172481 | Canu et al. | Jul 2010 | A1 |
20100189250 | Williams et al. | Jul 2010 | A1 |
20110009096 | Rotsztein et al. | Jan 2011 | A1 |
20110014952 | Minton | Jan 2011 | A1 |
20110060683 | Salmon Rock et al. | Mar 2011 | A1 |
20110091021 | Adkar et al. | Apr 2011 | A1 |
20110099116 | Gabel | Apr 2011 | A1 |
20110123004 | Chang et al. | May 2011 | A1 |
20110276408 | Toole | Nov 2011 | A1 |
20120063574 | Or-Bach et al. | Mar 2012 | A1 |
20120257002 | Stocker | Oct 2012 | A1 |
20130022191 | Or-Bach et al. | Jan 2013 | A1 |
20130078970 | Rotsztein et al. | Mar 2013 | A1 |
20130108030 | Snir et al. | May 2013 | A1 |
20130138443 | Kim et al. | May 2013 | A1 |
20130142320 | Williams et al. | Jun 2013 | A1 |
20130343534 | Nguyen et al. | Dec 2013 | A1 |
20140003594 | Chatterjee | Jan 2014 | A1 |
Number | Date | Country |
---|---|---|
1225754 | Jul 2003 | EP |
1001597 | Sep 2003 | EP |
1351477 | Oct 2003 | EP |
1120954 | Jun 2005 | EP |
1545101 | Dec 2005 | EP |
774853 | May 2006 | EP |
1874018 | Jan 2008 | EP |
2004274425 | Sep 2004 | JP |
9819259 | May 1998 | WO |
9840826 | Dec 1998 | WO |
9856158 | Mar 1999 | WO |
9848551 | Apr 1999 | WO |
0131497 | May 2001 | WO |
0157851 | Aug 2001 | WO |
0165871 | Sep 2001 | WO |
9820409 | Nov 2001 | WO |
0217604 | Feb 2002 | WO |
2004049306 | Jun 2004 | WO |
2004064299 | Jul 2005 | WO |
2007012831 | Feb 2007 | WO |
2007081929 | Jan 2008 | WO |
2008086320 | Jul 2008 | WO |
2009006173 | Mar 2009 | WO |
2009100477 | Aug 2009 | WO |
Entry |
---|
Yin, M. and Zhai, S., “The Benefits of Augmenting Telephone Voice Menu Navigation with Visual Browsing and Search,” CHI'06 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 319-328, ACM, Montreal, Canada (Apr. 2006). |
Damhuis, M., et al., “A Multimodal Consumer Information Server with IVR Menu,” 2nd IEEE Workshop on Interactive Voice Technology for Telecommunications Applications (IVTTA94): pp. 73-76, Kyoto, Japan (Sep. 1994). |
Shah, S.A.A., et al., “Interactive Voice Response with Pattern Recognition Based on Artificial Neural Network Approach,” International Conference on Emerging Technologies, pp. 249-252, (Nov. 2007). |
Trihandoyo, A., et al., “A real-time speech recognition architecture for a multi-channel interactive voice response system,” International Conference on Acoustics, Speech, and Signal Processing, vol. 4, pp. 2687-2690, (1995). |
Hattori, S., et al., “A multimedia intelligent message communication system for distributed coordination environments,” Electronics & Communications in Japan, Part I—Communications, vol. 76, No. 1, pp. 11-23 (1993). |
Patent abstracts of Japan, vol. 097, No. 001, Jan. 31, 1997 & JP 08 242307 A (Canon Inc), Sep. 17, 1996. |
Kalva, H., et al., “Delivering Object-Based Audio-Visual Services,” IEEE Transactions on Consumer Electronics, vol. 45, No. 4, pp. 1108-1111, (1999). |
Schmandt, C., “Phoneshell: the telephone as computer terminal,” Proceedings of First ACM International Conference on Multimedia, Anaheim, CA, US, pp. 373-381, (1993). |
Himberg, J., et al., “Time Series Segmentation for Context Recognition in Mobile Devices”, IEEE, 203-210, (2001). |
Schmandt, C. and Casner, S., “Phonetool: Integrating Telephones and Workstations,” IEEE Communication Society, Nov. 27-30, pp. 0970-0974, (1989). |
Basinger, R. G., et al., “Calling Card Service—Overall Description and Operational Characteristics”, The Bell System Technical Journal, (1982). |
Cadiz et al., “Designing and Deploying an Information Awareness Interface,” CSCW'02, Nov. 2002, ACM, pp. 314-323. |
Corcoran et al., “User interface technologies for home appliances and networks,” IEEE Trans. Consumer Elect., pp. 679-685, (1998). |
Friedrich, N., “Graphical-User-Interface Module Eases Integration,” Wireless Systems Design, Oct. 2004, 1 page. |
Balachandran, R., et al., “Dialog System for Mixed Initiative One-Turn Address Entry and Error Recovery,” Proceedings of SIGDIAL 2009, the 10th Annual Meeting of the Special Interest Group on Discourse and Dialogue, pp. 152-155, Queen Mary University of London, Association for Computational Linguistics, (2009). |
Number | Date | Country | |
---|---|---|---|
Parent | 12699618 | Feb 2010 | US |
Child | 13411652 | US |