System and method of disambiguating natural language processing requests

Information

  • Patent Grant
  • Patent Number
    10,331,784
  • Date Filed
    Monday, July 31, 2017
  • Date Issued
    Tuesday, June 25, 2019
Abstract
A system and method is provided of disambiguating natural language processing requests based on smart matching, request confirmations that are used until ambiguities are resolved, and machine learning. Smart matching may match entities (e.g., contact names, place names, etc.) based on user information such as call logs, user preferences, etc. If multiple matches are found and disambiguation has not yet been learned by the system, the system may request that the user identify the intended entity. On the other hand, if disambiguation has been learned by the system, the system may execute the request without confirmations. The system may use a record of confirmations and/or other information to continuously learn a user's inputs in order to reduce ambiguities and no longer prompt for confirmations.
Description
FIELD OF THE INVENTION

The invention relates to a system and method of disambiguating natural language processing requests based on smart matching, request confirmations that are used until ambiguities are resolved, and machine learning.


BACKGROUND OF THE INVENTION

Facilitating human-to-machine interactions in a natural manner is a difficult problem. Both non-voice and certain voice interfaces tend to be overly structured and require some level of familiarity with the interfaces. The ability for a human to interact with machines using natural language utterances in various contexts remains a desirable goal, but such interaction continues to be a difficult problem.


One area of difficulty lies in the disambiguation of natural language requests. The objects of natural language requests and commands, that is, entity names, may sometimes be ambiguous. For example, a user may have multiple contacts of the same name in their directory, so a phone dialing request made by first name may require disambiguation. A natural language system that constantly asks the user to use specific and unambiguous language, or to clarify requests made in ambiguous language, may quickly begin to feel burdensome to the user.


These and other drawbacks exist with conventional natural language processing disambiguation methods.


SUMMARY OF THE INVENTION

The invention addressing these and other drawbacks relates to a system and method of machine learned disambiguation of natural language processing requests. An intent of a natural language input (e.g., a natural language utterance of a user) may be ambiguous. For example, a user may provide the input “call John” when there are several “Johns” in the user's contact list. Other types of requests may likewise be ambiguous when interpreted through natural language processing.


The system may disambiguate natural language processing requests based on smart matching, request confirmations that are used until ambiguities are resolved, and machine learning. Smart matching may match entities (e.g., contact names, place names, etc.) based on user information such as call logs, user preferences, etc. If multiple matches are found and disambiguation has not yet been learned by the system for these matches, the system may request that the user identify the intended entity. On the other hand, if disambiguation has been learned by the system, the system may execute the request without confirmations. The system may use a record of confirmations and/or other information to continuously learn a user's inputs in order to reduce ambiguities and no longer prompt for confirmations.


To this end, the system may include a smart matching engine that identifies and matches entities, a confirmation engine that generates request confirmations (e.g., “do you want to call John Doe or John Smith?”) that are used until a sufficient level of confidence is reached that an ambiguity relating to a given natural language input has been resolved (e.g., until the system is confident that the user intends to call “John Doe” and not “John Smith”), a learning engine that uses machine learning based on user information (e.g., a history of request confirmations from a user, where each request confirmation sought to resolve a prior ambiguous result of natural language processing of a prior natural language input), and/or other components to disambiguate a natural language input.


The smart matching engine may identify an input entity name. The smart matching engine may then match the input entity name to an entity name that can be acted upon. For example, if the user requests directions to Main St., then the smart matching engine may operate to suggest a match between the user uttered input entity name “Main St.” and an identified entity name—i.e., an actual, mappable, Main St. If the user asks to “call James,” then the smart matching engine may operate to suggest a match between an identified entity name, i.e., an actual contact in the user's directory, and the input entity name of “James.”


The smart matching engine may suggest a match between an input entity name and an identified entity name based at least partially on an entity matching model. The entity matching model may comprise information compiled through machine learning techniques, and may include at least context information and user history information, as well as other information suitable for machine learning. A suggested match may include the input entity name and an identified entity name. A suggested match may further include additional entity match information.


Additional entity match information may include context information related to the natural language request, the input entity name, and the identified entity name. For example, for a user that lives in New York, an entity matching model may associate the entity name “taxi” in the request “call a taxi” with a particular taxi company in New York. When creating a suggested match for the input entity name “taxi,” for that user, the smart matching engine may include the additional entity match information of the user's location as part of the suggested match. This may permit the system to disambiguate the term “taxi” differently based on the user's location.
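
The following Python sketch illustrates one way such a suggested match and its additional entity match information might be represented; the class and field names (SuggestedMatch, input_entity_name, context, and so on) are illustrative assumptions rather than elements of the described system.

```python
# Hypothetical sketch of a suggested-match record carrying additional entity
# match information; field names are illustrative, not taken from the patent.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class SuggestedMatch:
    input_entity_name: str                 # e.g., "taxi", as uttered by the user
    identified_entity_name: str            # e.g., "Acme Taxi of New York"
    # Additional entity match information: request context captured with the match.
    context: dict = field(default_factory=dict)
    confidence: Optional[float] = None     # optional score from the entity matching model


match = SuggestedMatch(
    input_entity_name="taxi",
    identified_entity_name="Acme Taxi of New York",
    context={"user_location": "New York, NY", "request": "call a taxi"},
)
```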


Context information usable by the smart matching engine may include a request context. For example, if a user makes a request to “call James” during the business day, the smart matching engine may suggest a match between the input entity name James and a business contact named James. If the request is made after hours, the smart matching engine may suggest a match with a contact from a friends and family contact list. Similarly, when attempting to disambiguate requests for directions, the smart matching engine may favor matches with locations that are closer to the user's present location.


User history information usable by the smart matching engine may include information about previous user requests and activity. For example, when requesting to make a call, the smart matching engine may select a name from a user's contact list that the user calls very frequently. When seeking directions, the smart matching engine may suggest a destination location that matches a location the user has travelled to frequently.


The smart matching engine may additionally employ machine learned matching techniques. For example, a user may frequently refer to a contact name by a nickname, making a request to “call Jim,” where the name “James” is stored in the user's contact list. Such a connection may become a part of an entity matching model based on machine learning techniques, as described in more detail below.


In some implementations, the smart matching engine may suggest a match between an input entity name and an identified entity name based on an exact match. For example, where a user requests to “call John,” the system may suggest only matches that match “John” exactly. In some implementations, the smart matching engine may suggest a match between an input entity name and an identified entity name based on a phonetic match. For example, where a user requests to “call John,” the system may suggest all matches that sound like “John,” including, for example, “Jon.” In some implementations, the smart matching engine may suggest a match between an input entity name and an identified entity name based on similar sounds. Continuing to use the above example, the smart matching engine may suggest a match between “John” and “Juwan.” In some implementations the smart matching engine may suggest a match between an input entity name and an identified entity name based on an interpretation of a user's pronunciation style.
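
A minimal sketch of how phonetic and similar-sound candidates might be gathered is shown below, assuming a simplified Soundex-style key; the patent does not name a particular phonetic algorithm, so the key function is purely illustrative.

```python
# Simplified Soundex-style phonetic matching (an assumed technique, for illustration).
_CODES = {
    **dict.fromkeys("bfpv", "1"), **dict.fromkeys("cgjkqsxz", "2"),
    **dict.fromkeys("dt", "3"), "l": "4", **dict.fromkeys("mn", "5"), "r": "6",
}


def phonetic_key(name: str) -> str:
    """Build a 4-character phonetic key: first letter plus consonant codes."""
    name = name.lower()
    key = name[0].upper()
    prev = _CODES.get(name[0], "")
    for ch in name[1:]:
        code = _CODES.get(ch, "")
        if code and code != prev:
            key += code
        prev = code
    return (key + "000")[:4]


def phonetic_matches(input_name: str, contacts: list[str]) -> list[str]:
    """Return contacts whose first name shares a phonetic key with the input entity name."""
    target = phonetic_key(input_name)
    return [c for c in contacts if phonetic_key(c.split()[0]) == target]


print(phonetic_matches("John", ["Jon Doe", "John Smith", "Juwan Lee", "James Kim"]))
# Under this simplified key, "Jon", "John", and "Juwan" all map to J500,
# while "James" (J520) does not.
```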


In some implementations, the smart matching engine may suggest a match between an input entity name and an identified entity name based on learned associations. For example, a user may request to “call my brother.” The smart matching engine may suggest a match between the input entity name “my brother,” and all user contacts having a same last name as the user. The smart matching engine may also request clarification from the user about the entity name “my brother.” Once the system has successfully confirmed a match between a user contact and the input entity name “my brother,” the entity matching model may be updated with the information in order to correctly link the input entity name “my brother” to the appropriate contact. In another example, a user may request directions to “the pizza place.” Smart matching engine may propose a suggested match between the input entity name “the pizza place,” and the closest pizza restaurant. Once the system has successfully confirmed a match between “the pizza place,” and a particular pizza restaurant, the entity matching model may be updated to associate “the pizza place,” with the specific pizza restaurant.


In some implementations, the smart matching engine may pass a suggested match between an input entity name and an identified entity name to the confirmation engine for command confirmation, as described below.


In some implementations, the smart matching engine may determine an alternate identified entity name if the confirmation engine is unable to confirm the suggested match between the input named entity and the identified entity name. In some implementations, the smart matching engine may determine an alternate identified entity name using similar methods to those described above for selecting the first identified entity name. In determining an alternate identified entity name, the smart matching engine may select a next best match for an input entity name. In some implementations, the smart matching engine may develop an n-best list of suggested matches for an input entity name including n number of potential matches ranked in order of best fit. The smart matching engine may provide the 2nd, 3rd, 4th, etc., matches from an n-best list, in order, to the confirmation engine as alternate identified names if the first suggestion is cancelled or met with non-confirmation. In some implementations, the smart matching engine may provide an entire n-best list to the confirmation engine.
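
The n-best behavior described above might look roughly like the following sketch, where score_fn stands in for whatever fit score the entity matching model produces (an assumption, since the scoring itself is not specified here).

```python
def n_best_matches(input_name, candidates, score_fn, n=4):
    """Rank candidate entity names by descending fit score and keep the top n."""
    ranked = sorted(candidates, key=lambda c: score_fn(input_name, c), reverse=True)
    return ranked[:n]


def alternates(n_best):
    """Yield the 2nd, 3rd, 4th, ... entries to offer after a cancellation or non-confirmation."""
    yield from n_best[1:]


# Toy usage with a made-up scorer that counts shared leading characters.
score = lambda q, c: sum(a == b for a, b in zip(q.lower(), c.lower()))
print(n_best_matches("James", ["James Smith", "James Lewis", "Jane Doe"], score))
```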


The confirmation engine may perform command confirmation based on a user's confirmation history of a proposed entity match. Command confirmation may be performed actively and/or passively. For example, active command confirmation may include querying the user to confirm a suggested match—e.g., where a user has asked to “call James,” a command confirmation may query the user “did you mean James Smith?” In some implementations, passive command confirmation may include permitting a user a predefined amount of time to cancel an action. For example, where a user has asked to “call James,” a command confirmation may alert the user “calling James Smith,” and wait for a predetermined amount of time to give the user a chance to cancel the action before executing the action. If the user does not cancel the action, then the suggested entity match may be confirmed. A predetermined amount of time may be relatively short, e.g., 1 to 5 seconds, or relatively long, e.g., 5-15 seconds. Selecting which command confirmation method to use may be based on a user's confirmation history.


For example, the confirmation engine may use a history of a user's confirmation between a particular input entity name and a particular identified entity name. For example, a user may ask to “call James,” and the smart matching engine may determine a suggested match between the input entity name “James” and an identified entity name “James Smith,” corresponding to a user contact. If the smart matching engine has previously suggested the same match, and the user has previously provided confirmations, the confirmation engine may alert the user of the impending action. If the user has not previously provided enough confirmations of the suggested match, the confirmation engine may request active confirmation from the user. The confirmation engine may switch between these two methods when a number of previous confirmations of the suggested match exceeds a predetermined threshold number of times. Such a predetermined threshold number may be one or more times. A predetermined threshold number may vary depending on the type of entity name that is being matched (e.g., whether it is a contact name or a location address.)
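
A sketch of the threshold-based selection between active and passive confirmation follows; the per-entity-type thresholds are assumed values for illustration only.

```python
# Assumed thresholds per entity type; the patent only states that the threshold
# may vary by entity name type (e.g., contact versus location).
CONFIRMATION_THRESHOLDS = {"contact": 3, "location": 1}


def choose_confirmation_method(previous_confirmations: int, entity_type: str) -> str:
    threshold = CONFIRMATION_THRESHOLDS.get(entity_type, 2)
    if previous_confirmations > threshold:
        # Enough history: announce the action and allow a short cancel window.
        return "passive"
    # Not enough history: explicitly ask "did you mean ...?"
    return "active"


print(choose_confirmation_method(5, "contact"))   # passive
print(choose_confirmation_method(0, "location"))  # active
```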


In some implementations, the confirmation engine may update a number of previous confirmations of a suggested match in a user's confirmation history after confirmation or non-confirmation is received from the user. If confirmation is received from the user for a particular suggested match, the confirmation engine may increment the number of previous confirmations. If a user cancels or fails to confirm a suggested match, the confirmation engine may instead decrement the number of previous confirmations. In some implementations, the confirmation engine may reset a number of previous confirmations to zero after a non-confirmation is received. In some implementations, the confirmation engine may separately record a number of non-confirmations and decrement or reset the previous confirmation score after a threshold number of non-confirmations is reached. In some implementations, the confirmation engine may separately record a number of consecutive non-confirmations and decrement or reset the previous confirmation score after a threshold number of consecutive non-confirmations is reached. In some implementations, the confirmation engine may do nothing to a previous confirmation score when a non-confirmation is received.


In some implementations, the confirmation engine may request an alternate identified match from the smart matching engine if a non-confirmation or cancellation is received. The smart matching engine may provide a next best alternate identified match to the confirmation engine. The confirmation engine may treat the alternate identified match similarly to the initial identified match, and determine an active or passive confirmation method based on a number of previous confirmations.


In some implementations, the confirmation engine may receive an n-best list from the smart matching engine. The confirmation engine may then try to confirm with the user each entry in the n-best list as a selected match. In some implementations, the confirmation engine may try each member of the n-best list in turn, using the above-discussed confirmation methods based on confirmation history. In some implementations, the confirmation engine may try to confirm the first entry of an n-best list, and, if confirmation fails, query the user to select from the remaining entries in the n-best list.


The learning engine may provide machine learning updates to a smart matching model used by the smart matching engine and the confirmation engine in suggesting and confirming matches between natural language inputs and named entities, and may perform other operations, each of which is described in greater detail herein.


The learning engine may include instructions for updating an entity matching model based on results provided by the smart matching engine and the confirmation engine. As described above, the smart matching engine may suggest a match between an input entity name and an identified entity name, while the confirmation engine may operate to confirm the identified entity name and determine it as a selected entity name. When an entity name is confirmed as a selected entity name, the learning engine may update the entity matching model with information about the confirmed match. The learning engine may update context information of the entity matching model with information about the context (e.g., time, location, whether the user is traveling, etc.) under which the suggested match was confirmed. The learning engine may update user history information of the entity matching model, for example, to keep track of a frequency of user requests involving the selected entity name.


For example, where a user has requested to make a phone call or send a text message to a contact and the confirmation engine confirms the suggested match provided by the smart matching engine, the learning engine may update the entity matching model with information about the match between the input entity name and the identified entity name, including, for example, time of day, location of the user, category of the entity name (e.g., business or personal contact), and any other suitable information. In some implementations, when the confirmation engine obtains a non-confirmation result of a suggested match, the learning engine may update the entity matching model with information that the match was not confirmed, as well as other context information associated with the failed match.


For example, a user may frequently call his brother “James Lewis” in the evening but call his business associate “James Smith” during the day. Due to the frequency of calls, the smart matching engine may initially suggest a match between the request to “call James” and “James Smith,” even when the call is taking place after 9 pm. After one or more non-confirmations of a match between a command to “call James” at 9 pm and “James Smith,” the learning engine may sufficiently update the entity matching model to make an association between “James” and the user's brother “James Lewis” in the evening, and between “James” and the user's business associate “James Smith” during the day.


These and other objects, features, and characteristics of the system and/or method disclosed herein, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the invention. As used in the specification and in the claims, the singular form of “a”, “an”, and “the” include plural referents unless the context clearly dictates otherwise.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates a system for natural language processing, according to an implementation of the invention.



FIG. 2 depicts an NLP system configured to perform natural language processing on an utterance, according to an implementation of the invention.



FIG. 3 depicts a process of natural language processing disambiguation, according to an implementation of the invention.



FIG. 4 depicts a process of disambiguation machine learning, according to an implementation of the invention.





DETAILED DESCRIPTION OF THE INVENTION

The invention described herein relates to a system and method of natural language processing disambiguation. In particular, the systems and methods described are related to machine learning of disambiguation models. The system may be configured to receive and process natural language inputs, and then use machine learning techniques to optimally provide disambiguation between potentially ambiguous inputs.


For example, a user may have multiple contacts having the same first or last name in a directory. If a user makes a natural language request to “Call John” when there are three Johns stored in the user's contact directory, the system may employ disambiguation models to determine which John the user intended. The system may review context data, including call frequency, call timing, user location, and other information, to determine a suggestion to match the user's request. The user may then be prompted to confirm the suggestion, and/or, in some implementations, be provided with a time in which to cancel the suggestion, before the request is carried out using the suggestion. The system may incorporate the user response in disambiguation models to optimize future suggestions and actions. Further examples of user requests requiring disambiguation may include requests for directions (e.g., where multiple destinations may have similar names), requests for weather reports (e.g., where the user fails to specify the location of the request), requests to play music (e.g., where the user selects an album but not an artist), and others. Disambiguation may be required between identical user requests, between phonetically similar requests, between homophonic requests, and/or between requests that may sound similar based on user pronunciation. The system may further be configured for disambiguation of generic requests to identify a user's specific intent. For example, a request to “call the pizza place” may be disambiguated to a phone dialing request to contact a specific pizza restaurant that a user frequently calls.
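
For instance, ranking the three Johns from such context data might be computed along the lines of the following sketch; the features and weights are illustrative assumptions, not values prescribed by the system.

```python
# Illustrative context-based scoring of candidate contacts; weights are assumed.
def score_candidate(candidate, context):
    score = 0.0
    score += 2.0 * candidate.get("call_frequency", 0)          # calls per week
    if candidate.get("last_called_hour") == context.get("hour"):
        score += 1.0                                            # similar time of day
    if candidate.get("city") == context.get("user_city"):
        score += 0.5                                            # nearby contact
    return score


candidates = [
    {"name": "John Doe", "call_frequency": 4, "city": "Seattle"},
    {"name": "John Smith", "call_frequency": 1, "city": "Boston"},
    {"name": "John Lee", "call_frequency": 0, "city": "Seattle"},
]
context = {"hour": 20, "user_city": "Seattle"}
best = max(candidates, key=lambda c: score_candidate(c, context))
print(best["name"])   # John Doe
```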



FIG. 1 illustrates a system 100 for natural language processing. In one implementation, system 100 may include a computer system 110, a user device 140, a database 104, and/or other components. Components of system 100 may communicate with one another via network 102. Network 102 may be any type of network, such as a WAN, LAN, Bluetooth network, the Internet, etc.


Computer system 110 may be configured as a server (e.g., having one or more server blades, processors, etc.), a gaming console, a handheld gaming device, a personal computer (e.g., a desktop computer, a laptop computer, etc.), a smartphone, a tablet computing device, and/or other device that can be programmed for natural language processing, and, in particular the disambiguation of natural language processing requests.


Computer system 110 may include one or more processors 112 (also interchangeably referred to herein as processors 112, processor(s) 112, or processor 112 for convenience), one or more storage devices 114, and/or other components. Processors 112 may be programmed by one or more computer program instructions. For example, processors 112 may be programmed by NLP subsystem 106A, and/or other instructions that program computer system 110 to perform various operations, each of which are described in greater detail herein. As used herein, for convenience, the various instructions and engines will be described as performing an operation, when, in fact, the various instructions program the processors 112 (and therefore computer system 110) to perform the operation. Storage device 114 may further include a user history database 180.


NLP subsystem 106A may perform at least some aspects of natural language processing on an utterance using all or a portion of the components of NLP subsystem 106 illustrated in FIG. 2. In other words, computer system 110 may be programmed with some or all of the functions of NLP system 106 described with respect to FIG. 2.


User device 140 may be configured as a server device, a gaming console, a handheld gaming device, a personal computer (e.g., a desktop computer, a laptop computer, etc.), a smartphone, a tablet computing device, and/or other device that can be programmed to receive and interpret natural language processing requests. In particular, user device 140 may be programmed to perform disambiguation on a received natural language input.


User device 140 may include one or more processors 142 (also interchangeably referred to herein as processors 142, processor(s) 142, or processor 142 for convenience), one or more storage devices 144, and/or other components. Processors 142 may be programmed by one or more computer program instructions. For example, processors 142 may be programmed by NLP subsystem 106B, and/or other instructions that program user device 140 to perform various operations, each of which are described in greater detail herein. As used herein, for convenience, the various instructions will be described as performing an operation, when, in fact, the various instructions program the processors 142 (and therefore user device 140) to perform the operation. Additionally, storage device 144 may include one or more databases, including a user history database 190.


Any or all aspects of natural language processing may be carried out by programming instructions instantiated on either NLP subsystem 106B of user device 140 or NLP subsystem 106A of computer system 110, or both. Each of user device 140 and computer system 110 may perform all or a portion of a natural language processing task. User device 140 and computer system 110 may operate to perform natural language processing tasks sequentially, communicating results via network 102. In such implementations, one system may pass results of a completed task to the other system for further processing. User device 140 and computer system 110 may operate to redundantly perform natural language processing tasks. In such implementations, depending upon the results of a natural language processing task carried out by one system, the other system may repeat the same task. User device 140 and computer system 110 may operate in parallel to perform natural language processing tasks. In such implementations, user device 140 and computer system 110 may each perform one or more tasks independently and, after task completion, communicate with each other to select between and/or combine the processing results.


NLP subsystem 106B may perform at least some aspects of natural language processing on an utterance using all or a portion of the components of NLP subsystem 106 illustrated in FIG. 2. In other words, user device 140 may be programmed with some or all of the functions of NLP system 106 described with respect to FIG. 2.


Although illustrated in FIG. 1 as a single component, computer system 110 and user device 140 may each include a plurality of individual components (e.g., computer devices) each programmed with at least some of the functions described herein. In this manner, some components of computer system 110 and/or user device 140 may perform some functions while other components may perform other functions, as would be appreciated. The one or more processors 112 may each include one or more physical processors that are programmed by computer program instructions. The various instructions described herein are exemplary only. Other configurations and numbers of instructions may be used, so long as the processor(s) 112 are programmed to perform the functions described herein.


Furthermore, it should be appreciated that although the various instructions are illustrated in FIG. 1 as being co-located within a single processing unit, in implementations in which processor(s) 112 includes multiple processing units, one or more instructions may be executed remotely from the other instructions.


The description of the functionality provided by the different instructions described herein is for illustrative purposes, and is not intended to be limiting, as any of instructions may provide more or less functionality than is described. For example, one or more of the instructions may be eliminated, and some or all of its functionality may be provided by other ones of the instructions. As another example, processor(s) 112 may be programmed by one or more additional instructions that may perform some or all of the functionality attributed herein to one of the instructions.


The various instructions described herein may be stored in a storage device 114, which may comprise random access memory (RAM), read only memory (ROM), and/or other memory. The storage device may store the computer program instructions (e.g., the aforementioned instructions) to be executed by processor 112 as well as data that may be manipulated by processor 112. The storage device may comprise floppy disks, hard disks, optical disks, tapes, or other storage media for storing computer-executable instructions and/or data.


The various components illustrated in FIG. 1 may be coupled to at least one other component via a network 102, which may include any one or more of, for instance, the Internet, an intranet, a PAN (Personal Area Network), a LAN (Local Area Network), a WAN (Wide Area Network), a SAN (Storage Area Network), a MAN (Metropolitan Area Network), a wireless network, a cellular communications network, a Public Switched Telephone Network, and/or other network. In FIG. 1, as well as in other drawing Figures, different numbers of entities than those depicted may be used. Furthermore, according to various implementations, the components described herein may be implemented in hardware and/or software that configure hardware.


The various databases 104 described herein may be, include, or interface to, for example, an Oracle™ relational database sold commercially by Oracle Corporation. Other databases, such as Informix™, DB2 (Database 2) or other data storage, including file-based, or query formats, platforms, or resources such as OLAP (On Line Analytical Processing), SQL (Structured Query Language), a SAN (storage area network), Microsoft Access™ or others may also be used, incorporated, or accessed. The database may comprise one or more such databases that reside in one or more physical devices and in one or more physical locations. The database may store a plurality of types of data and/or files and associated data or file descriptions, administrative information, or any other data.


Natural Language Processing



FIG. 2 depicts an NLP system 106 configured to perform natural language processing on an utterance, according to an implementation of the invention. NLP system 106 may include an Automated Speech Recognition (“ASR”) engine 210, an Intent Recognition Engine (“IRE”) 220, a text-to-speech engine 230, a dictionary 240, a user profile 250, an agent system 260, smart matching engine 270, confirmation engine 271, learning engine 272, execution engine 273, and/or other components. The components of NLP system 106 may communicate with one another via communication manager 202, which may mediate interactions between components of NLP system 106.


NLP system 106 may process an utterance encoded into audio to determine words or phrases from the utterance. Alternatively, NLP system 106 may process text that was already detected from the utterance. In whichever manner the utterance is received (e.g., whether as encoded audio or text), NLP system 106 may determine an intent of the utterance using the words or phrases, formulate a request based on that intent, and facilitate generating a response to or executing the request.


For example, in implementations in which an utterance is received as audio, ASR engine 210 may use words and phrases from dictionary 240 and information received from agent system 260 (discussed below) to recognize words from the utterance (i.e., the audio). The recognized words and phrases are processed by IRE 220, which determines the intent of the utterance and generates a request that corresponds to the utterance based on context and information from agent system 260. The request may include a query (e.g., “what is the weather like in Seattle?”), a command (e.g., “turn up the volume”), and/or other information.


To process the intent of the utterance, IRE 220 may use keywords within the utterance itself to identify context. For example, for the utterance “what is the weather like in Seattle,” IRE 220 may determine that the context is weather and a subject is “Seattle.” In some instances, the contextual information may be based on immediately prior utterances. In these instances, IRE 220 may generate a stack that stores (e.g., in Random Access Memory) information relating to one or more previous utterances in a given session (which may be defined by utterances received within a threshold time of one another). For example, a first utterance “what is the weather like in Seattle” may be followed up with a second utterance “book me a flight there.” By itself, the second utterance “book me a flight there” may be non-specific because “there” is undefined in the second utterance. However, when processed as part of a stack that includes information relating to the first utterance “what is the weather like in Seattle,” IRE 220 determines that “there” refers to “Seattle.”
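
A minimal sketch of such a session stack is shown below, assuming a simple rule that resolves “there” to the most recently mentioned location; real intent recognition would be far richer, and the class below is hypothetical.

```python
# Hypothetical session stack resolving an anaphoric "there" to a prior utterance.
class SessionStack:
    def __init__(self):
        self.frames = []            # most recent utterance information last

    def push(self, utterance_info: dict) -> None:
        self.frames.append(utterance_info)

    def resolve_reference(self, word: str):
        """Resolve "there" to the most recent location mentioned in the session."""
        if word == "there":
            for frame in reversed(self.frames):
                if "location" in frame:
                    return frame["location"]
        return None


stack = SessionStack()
stack.push({"utterance": "what is the weather like in Seattle", "location": "Seattle"})
print(stack.resolve_reference("there"))   # Seattle
```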


A natural language input may comprise a request and an entity name. A request may take several forms, including a query, an executable command, and/or any other actionable statement. For example, a request may include a command to place a phone call or send a text message, may include a query for directions or a location, may include a question about the weather, and/or may include a question for a search engine. Numerous additional examples of requests may exist and may be processed from a received natural language input. Entity names may include objects of the processed request. For example, input entity names may include a contact to be called or texted, an address or establishment for which directions are sought, a location or time for which the weather is requested, and/or a target of a search engine query. Numerous other examples exist. Thus, a request and an input entity name together may form a complete natural language input request, including both an action to be taken and an object subject to that action. IRE 220 may process a received utterance to determine at least a type of request and an entity name.
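
As a toy illustration of splitting an utterance into a request type and an input entity name, consider the rule-based sketch below; the patterns are assumptions made for illustration and stand in for IRE 220's actual intent recognition.

```python
# Toy rule-based parse of (request type, input entity name); patterns are assumed.
import re

PATTERNS = [
    (r"^call (.+)$", "phone_call"),
    (r"^text (.+)$", "text_message"),
    (r"^directions to (.+)$", "navigation"),
    (r"^what is the weather (?:like )?in (.+)$", "weather_query"),
]


def parse_request(utterance: str):
    utterance = utterance.strip().lower()
    for pattern, request_type in PATTERNS:
        m = re.match(pattern, utterance)
        if m:
            return request_type, m.group(1)     # (request, input entity name)
    return "unknown", None


print(parse_request("Call James"))              # ('phone_call', 'james')
print(parse_request("directions to Main St."))  # ('navigation', 'main st.')
```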


Alternatively or additionally, NLP system 106 may use agent system 260, which includes system and domain specific agents, that assists in ASR, intent recognition, request generation, and/or request response. Agent system 260 may use nonvolatile storage (e.g., storage device 114 or storage device 144) to store data, parameters, history information, and locally stored content provided in the system databases 104 or other data sources. User specific data, parameters, and session and history information that may determine the behavior of the agents are stored in one or more user profiles 250. For example, user specific speech patterns may assist in ASR, while user preferences and history may assist in contextual determinations and, therefore, intent recognition.


The system agent provides default functionality and basic services. A domain specific agent relates to a corresponding domain of information (e.g., a business listing domain may relate to business listings, a contacts domain may relate to user contacts, a shopping domain may relate to commerce, etc.).


Agent system 260 may create queries to local databases or to data sources on the Internet or other networks. Commands typically result in actions taken by the device on which NLP system 106 operates, or by a remote device or data source.


In implementations in which a response to a request is desired, agent system 260 may create a response string for presentation to the user. The response string may be presented as text and/or as speech, in which case the string is sent to the text to speech engine 230 to be output by a speaker (whether an on-board or off-board speaker).


Text-to-speech engine 230 may convert text to speech, such as when information from NLP system 106 (e.g., responses to spoken queries) is to be presented to a user in speech format. Conventional text-to-speech processors may be used for such text-to-speech generation.


A more detailed description of NLP processing is described in U.S. patent application Ser. No. 10/452,147, entitled “Systems and Methods for Responding to Natural Language Speech Utterance,” filed on Jun. 3, 2003, the disclosure of which is incorporated in its entirety herein. NLP system 106 may use the foregoing and/or other types of NLP processing techniques and systems.


Smart matching engine 270 may include additional instructions that program computer system 110. The instructions of smart matching engine 270 may include, without limitation, instructions for suggesting named entities to appropriate portions of a received natural language input, and/or other instructions that program computer system 110 to perform various operations, each of which are described in greater detail herein.


Smart matching engine 270 may operate to identify an input entity name. ASR engine 210 and IRE 220 may process a natural language input to determine a type of request and an input entity name. Smart matching engine 270 may then match the input entity name to an entity name that can be acted upon. For example, if the user requests directions to Main St., then smart matching engine 270 may operate to suggest a match between the user uttered input entity name “Main St.” and an identified entity name—i.e., an actual, mappable, Main St. If the user asks to “call James,” then smart matching engine 270 may operate to suggest a match between an identified entity name, i.e., an actual contact in the user's directory, and the input entity name of “James.”


Smart matching engine 270 may suggest a match between an input entity name and at least one identified entity name based at least partially on an entity matching model. The entity matching model may comprise information compiled through machine learning techniques, and may include at least context information and user history information, as well as other information suitable for machine learning. A suggested match may include the input entity name and at least one identified entity name. In some embodiments, the at least one identified entity name may include a list of identified entity names. Thus, smart matching engine 270 may identify multiple suggested matches for an input entity name. A suggested match may further include additional entity match information.


Additional entity match information may include context information related to the natural language request, the input entity name, and the identified entity name. For example, for a user that lives in New York, an entity matching model may associate the entity name “taxi” in the request “call a taxi” with a particular taxi company in New York. When creating a suggested match for the input entity name “taxi,” for that user, smart matching engine 270 may include the additional entity match information of the user's location as part of the suggested match. This may permit the system to disambiguate the term “taxi” differently based on the user's location.


Context information usable by smart matching engine 270 may include a request context. For example, if a user makes a request to “call James” during the business day, smart matching engine 270 may suggest a match between the input entity name James and a business contact named James. If the request is made after hours, smart matching engine 270 may suggest a match with a contact from a friends and family contact list. Similarly, when attempting to disambiguate requests for directions, smart matching engine 270 may favor matches with locations that are closer to the user's present location.


User history information usable by smart matching engine 270 may include information about previous user requests and activity. For example, when requesting to make a call, smart matching engine 270 may select a name from a user's contact list that the user calls very frequently. When seeking directions, smart matching engine 270 may suggest a destination location that matches a location the user has travelled to frequently.


Smart matching engine 270 may additionally employ machine learned matching techniques. For example, a user may frequently refer to a contact name by a nickname, making a request to “call Jim,” where the name “James” is stored in the user's contact list. Such a connection may become a part of an entity matching model based on machine learning techniques, as described in more detail below.


In some implementations smart matching engine 270 may suggest a match between an input entity name and an identified entity name based on an exact match. For example, where a user requests to “call John,” the system may suggest only matches that match “John” exactly. In some implementations smart matching engine 270 may suggest a match between an input entity name and an identified entity name based on a phonetic match. For example, where a user requests to “call John,” the system may suggest all matches that sound like “John,” including, for example, “Jon.” In some implementations smart matching engine 270 may suggest a match between an input entity name and an identified entity name based on similar sounds. Continuing to use the above example, smart matching engine 270 may suggest a match between “John” and “Juwan.” In some implementations smart matching engine 270 may suggest a match between an input entity name and an identified entity name based on an interpretation of a user's pronunciation style.


In some implementations smart matching engine 270 may suggest a match between an input entity name and an identified entity name based on learned associations. For example, a user may request to “call my brother.” Smart matching engine 270 may suggest a match between the input entity name “my brother,” and all user contacts having a same last name as the user. Smart matching engine 270 may also request clarification from the user about the entity name “my brother.” Once the system has successfully confirmed a match between a user contact and the input entity name “my brother,” the entity matching model may be updated with the information in order to correctly link the input entity name “my brother” to the appropriate contact. In another example, a user may request directions to “the pizza place.” Smart matching engine may propose a suggested match between the input entity name “the pizza place,” and the closest pizza restaurant. Once the system has successfully confirmed a match between “the pizza place,” and a particular pizza restaurant, the entity matching model may be updated to associate “the pizza place,” with the specific pizza restaurant.


In some implementations, smart matching engine 270 may pass a suggested match between an input named entity and an identified named entity to confirmation engine 271 for command confirmation, as described below.


In some implementations, smart matching engine 270 may determine an alternate identified entity name if confirmation engine 271 is unable to confirm the suggested match between the input named entity and the identified entity name. In some implementations, smart matching engine 270 may determine an alternate identified entity name using similar methods to those described above for selecting the first identified entity name. In determining an alternate identified entity name, smart matching engine 270 may select a next best match for an input entity name. In some implementations, smart matching engine 270 may develop an n-best list of suggested matches for an input entity name including n number of potential matches ranked in order of best fit. Smart matching engine 270 may provide the 2nd, 3rd, 4th etc., matches from an n-best list, in order, to confirmation engine 271 as alternate identified names if the first suggestion is cancelled or met with non-confirmation. In some implementations, smart matching engine 270 may provide an entire n-best list to confirmation engine 271.


Confirmation engine 271 may include additional instructions that program computer system 110. The instructions of confirmation engine 271 may include, without limitation, instructions for confirming a suggested match between a received natural language input and a named entity, and/or other instructions that program computer system 110 to perform various operations, each of which are described in greater detail herein.


Confirmation engine 271 may perform command confirmation based on a user's confirmation history of a proposed entity match. Command confirmation may be performed actively and/or passively. For example, active command confirmation may include querying the user to confirm a suggested match—e.g., where a user has asked to “call James,” a command confirmation may query the user “did you mean James Smith?” In some implementations, passive command confirmation may include permitting a user a predefined amount of time to cancel an action. For example, where a user has asked to “call James,” a command confirmation may alert the user “calling James Smith,” and wait for a predetermined amount of time to give the user a chance to cancel the action before executing the action. If the user does not cancel the action, then the suggested entity match may be confirmed. A predetermined amount of time may be relatively short, e.g., 1 to 5 seconds, or relatively long, e.g., 5-15 seconds. Selecting which command confirmation method to use may be based on a user's confirmation history.


For example, confirmation engine 271 may use a history of a user's confirmation between a particular input entity name and a particular identified entity name. For example, a user may ask to “call James,” and smart matching engine 270 may determine a suggested match between the input entity name “James” and an identified entity name “James Smith,” corresponding to a user contact. If smart matching engine 270 has previously suggested the same match, and the user has previously provided confirmations, confirmation engine 271 may alert the user of the impending action. If the user has not previously provided enough confirmations of the suggested match, confirmation engine 271 may request active confirmation from the user. Confirmation engine 271 may switch between these two methods when a number of previous confirmations of the suggested match exceeds a predetermined threshold number of times. Such a predetermined threshold number may be one or more times. A predetermined threshold number may vary depending on the type of entity name that is being matched (e.g., whether it is a contact name or a location address).


In some implementations, confirmation engine 271 may update a number of previous confirmations of a suggested match in a user's confirmation history after confirmation or non-confirmation is received from the user. If confirmation is received from the user for a particular suggested match, confirmation engine 271 may increment the number of previous confirmations. If a user cancels or fails to confirm a suggested match, confirmation engine 271 may instead decrement the number of previous confirmations. In some implementations, confirmation engine 271 may reset a number of previous confirmations to zero after a non-confirmation is received. In some implementations, confirmation engine 271 may separately record a number of non-confirmations and decrement or reset the previous confirmation score after a threshold number of non-confirmations is reached. In some implementations, confirmation engine 271 may separately record a number of consecutive non-confirmations and decrement or reset the previous confirmation score after a threshold number of consecutive non-confirmations is reached. In some implementations, confirmation engine 271 may do nothing to a previous confirmation score when a non-confirmation is received.
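
One possible combination of the update rules just described is sketched below: decrement the count on an isolated non-confirmation, and reset it after a run of consecutive non-confirmations. The specific constants and class layout are assumptions made for illustration.

```python
# Sketch of one assumed confirmation-history update policy; the description above
# lists several alternatives (decrement, reset, or ignore non-confirmations).
class ConfirmationHistory:
    def __init__(self, reset_after=2):
        self.confirmations = 0
        self.consecutive_non_confirmations = 0
        self.reset_after = reset_after

    def record(self, confirmed: bool) -> None:
        if confirmed:
            self.confirmations += 1
            self.consecutive_non_confirmations = 0
        else:
            self.consecutive_non_confirmations += 1
            if self.consecutive_non_confirmations >= self.reset_after:
                self.confirmations = 0                          # reset the score
            else:
                self.confirmations = max(0, self.confirmations - 1)


h = ConfirmationHistory()
for outcome in (True, True, True, False):
    h.record(outcome)
print(h.confirmations)   # 2: three confirmations, then one decrement
```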


In some implementations, confirmation engine 271 may request an alternate identified match from smart matching engine 270 if a non-confirmation or cancellation is received. Smart matching engine 270 may provide a next best alternate identified match to confirmation engine 271. Confirmation engine 271 may treat the alternate identified match similarly to the initial identified match, and determine an active or passive confirmation method based on a number of previous confirmations.


In some implementations, confirmation engine 271 may receive an n-best list from smart matching engine 270. Confirmation engine 271 may then try to confirm with the user each entry in the n-best list as a selected match. In some implementations, confirmation engine 271 may try each member of the n-best list in turn, using the above-discussed confirmation methods based on confirmation history. In some implementations, confirmation engine 271 may try to confirm the first entry of an n-best list, and, if confirmation fails, query the user to select from the remaining entries in the n-best list.
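
The last of those variants might be sketched as follows, where ask_yes_no and ask_choice are hypothetical stand-ins for the voice prompts that confirmation engine 271 would issue.

```python
def confirm_from_n_best(n_best, ask_yes_no, ask_choice):
    """Try to confirm the top suggestion; on failure, let the user pick from the rest."""
    if ask_yes_no(f"Did you mean {n_best[0]}?"):
        return n_best[0]
    return ask_choice("Which one did you mean?", n_best[1:])


# Toy usage with canned answers in place of real voice prompts.
selected = confirm_from_n_best(
    ["James Smith", "James Lewis", "Jamie Brown"],
    ask_yes_no=lambda prompt: False,
    ask_choice=lambda prompt, options: options[0],
)
print(selected)   # James Lewis
```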


Learning engine 272 may include additional instructions that program computer system 110. The instructions of learning engine 272 may include, without limitation, instructions for providing machine learning updates to a smart matching model used by smart matching engine 270 and confirmation engine 271 in suggesting and confirming matches between natural language inputs and named entities, and/or other instructions that program computer system 110 to perform various operations, each of which are described in greater detail herein.


Learning engine 272 may include instructions for updating an entity matching model based on results provided by smart matching engine 270 and confirmation engine 271. As described above, smart matching engine 270 may suggest a match between an input entity name and an identified entity name, while confirmation engine 271 may operate to confirm the identified entity name and determine it as a selected entity name. When an entity name is confirmed as a selected entity name, learning engine 272 may update the entity matching model with information about the confirmed match. Learning engine 272 may update context information of the entity matching model with information about the context (e.g., time, location, whether the user is traveling, etc.) under which the suggested match was confirmed. Learning engine 272 may update user history information of the entity matching model, for example, to keep track of a frequency of user requests involving the selected entity name.


For example, where a user has requested to make a phone call or send a text message to a contact and confirmation engine 271 confirms the suggested match provided by smart matching engine 270, learning engine 272 may update the entity matching model with information about the match between the input entity name and the identified entity name, including, for example, time of day, location of the user, category of the entity name (e.g., business or personal contact), and any other suitable information. In some implementations, when confirmation engine 271 obtains a non-confirmation result of a suggested match, learning engine 272 may update the entity matching model with information that the match was not confirmed, as well as other context information associated with the failed match.


For example, a user may frequently call his brother “James Lewis” in the evening but call his business associate “James Smith” during the day. Due to the frequency of calls, smart matching engine 270 may initially suggest a match between the request to “call James” and “James Smith,” even when the call is taking place after 9 pm. After one or more non-confirmations of a match between a command to “call James” at 9 pm and “James Smith,” learning engine 272 may sufficiently update the entity matching model to make an association between “James” and the user's brother “James Lewis” in the evening, and between “James” and the user's business associate “James Smith” during the day.
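
A hedged sketch of how learning engine 272 might accumulate such time-of-day associations is shown below, assuming simple confirmation counts per time bucket; the bucket boundaries and counting scheme are illustrative assumptions, not elements of the described model.

```python
# Assumed time-bucketed confirmation counts standing in for the entity matching model.
from collections import defaultdict


class EntityMatchingModel:
    def __init__(self):
        # (input_name, time_bucket) -> {identified_name: confirmation count}
        self.counts = defaultdict(lambda: defaultdict(int))

    @staticmethod
    def bucket(hour: int) -> str:
        return "business_hours" if 9 <= hour < 18 else "evening"

    def update(self, input_name, identified_name, hour, confirmed):
        delta = 1 if confirmed else -1
        self.counts[(input_name, self.bucket(hour))][identified_name] += delta

    def best_match(self, input_name, hour):
        scores = self.counts[(input_name, self.bucket(hour))]
        return max(scores, key=scores.get) if scores else None


model = EntityMatchingModel()
model.update("James", "James Smith", hour=14, confirmed=True)
model.update("James", "James Smith", hour=21, confirmed=False)
model.update("James", "James Lewis", hour=21, confirmed=True)
print(model.best_match("James", hour=21))   # James Lewis
print(model.best_match("James", hour=10))   # James Smith
```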


Execution engine 273 may include additional instructions that program computer system 110. The instructions of execution engine 273 may include, without limitation, instructions for executing a natural language request, and/or other instructions that program computer system 110 to perform various operations, each of which are described in greater detail herein.


Execution engine 273 may be configured to cause a computer system on which it is instantiated to carry out the action indicated by the request and the selected entity name determined from the processed natural language input. Execution engine 273 may be configured to engage any systems of the device or computer system on which it is instantiated to carry out the natural language request.


Furthermore, as previously noted, some or all aspects of the foregoing operations may be performed by a given component illustrated in FIG. 1. In particular, some or all aspects of the foregoing operations may be performed by computer system 110 and/or user device 140. In some instances, some operations relating to natural language processing may be performed by user device 140, while other operations are performed at computer system 110.


In some instances, the same operation relating to natural language processing may be performed by both user device 140 and computer system 110 (i.e., both user device 140 and computer system 110 may perform one or more of the operations relating to natural language processing). In implementations where NLP system 106 operates in a hybrid manner (e.g., one or more components of NLP system 106 are included in and operate at different remote devices such as computer system 110 and user device 140), communication manager 202 may mediate communication through a network, such as network 102.


For example, a version of ASR engine 210 operating and executing on user device 140 may convert spoken utterances into text, which is then provided to a version of IRE 220 operating and executing on computer system 110. Alternatively or additionally, a version of ASR engine 210 operating and executing on user device 140 may convert spoken utterances into text and a version of ASR engine 210 operating and executing on computer system 110 may also convert the spoken utterances into text, in which case one result may be chosen over the other.


A more detailed description of such hybrid natural language processing is disclosed in U.S. patent application Ser. No. 12/703,032, entitled “System and Method for Hybrid Processing in a Natural Language Voice Services Environment,” filed on Feb. 9, 2010, the disclosure of which is incorporated in its entirety herein. The foregoing or other hybrid systems may be used for NLP system 106.



FIG. 3 depicts a process 300 of natural language processing disambiguation, according to an implementation of the invention. The various processing operations and/or data flows depicted in FIG. 3 (and in the other drawing figures) are described in greater detail herein. The described operations may be accomplished using some or all of the system components described in detail above and, in some implementations, various operations may be performed in different sequences and various operations may be omitted. Additional operations may be performed along with some or all of the operations shown in the depicted flow diagrams. One or more operations may be performed simultaneously. Accordingly, the operations as illustrated (and described in greater detail below) are exemplary by nature and, as such, should not be viewed as limiting.


In an operation 302, process 300 may include a natural language input operation. In natural language input operation 302, NLP system 106 receives a natural language input, such as a verbal utterance, from the user. The utterance may be received, for example, via a microphone incorporated into user device 140. ASR engine 210 of NLP subsystem 106 may receive the utterance.


In an operation 304, process 300 may include a natural language processing operation. Operation 304 may perform natural language processing on the received natural language input. Components of NLP subsystem 106 may cooperate to process the received natural language input to determine at least a request and an entity name. For example, as described above with respect to FIG. 2, ASR engine 210 may perform automatic speech recognition on the natural language input and IRE 220 may perform intent recognition on the natural language input. NLP subsystem 106 may thus determine a request and an input entity name from the user's natural language input.


In an operation 306, process 300 may include a smart matching operation. During smart matching operation 306, smart matching engine 270 may determine a suggested match between the input entity name and an identified entity name. Smart matching engine 270 may access an entity matching model, including context information, user history, and machine learned user tendencies, to determine the suggested match.
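One possible, simplified reading of such an entity matching model is sketched below: candidates are scored from a base similarity plus boosts derived from user history and preferences. The EntityMatchingModel and Candidate names, the weights, and the specific signals are assumptions for illustration only.

```python
from dataclasses import dataclass, field

@dataclass
class Candidate:
    name: str           # an identified entity name, e.g., a contact in the address book
    base_score: float   # lexical/phonetic similarity to the input entity name

@dataclass
class EntityMatchingModel:
    call_counts: dict = field(default_factory=dict)  # user history: how often each entity was selected
    preferred: set = field(default_factory=set)      # user preferences: entities to boost

    def score(self, candidate: Candidate) -> float:
        score = candidate.base_score
        score += 0.1 * self.call_counts.get(candidate.name, 0)
        if candidate.name in self.preferred:
            score += 0.5
        return score

def suggest_matches(candidates, model):
    """Return candidates ordered best-first (an n-best list); the first entry is the suggested match."""
    return sorted(candidates, key=model.score, reverse=True)

model = EntityMatchingModel(call_counts={"James Smith": 7}, preferred={"James Jones"})
n_best = suggest_matches([Candidate("James Smith", 0.9), Candidate("James Jones", 0.9)], model)
print([c.name for c in n_best])  # -> ['James Smith', 'James Jones']
```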


In an operation 308, process 300 may include an entity match confirmation step. An entity match confirmation operation may be performed by confirmation engine 271. Confirmation engine 271 may determine whether to designate the identified entity name as a selected entity name, based on a user's confirmation history of suggested matches between the identified entity name and the input entity name. The user's confirmation history may be used to determine a method, e.g., active or passive, by which confirmation engine 271 confirms the entity name. Entity match confirmation operation 308 may be understood in more detail with reference to FIG. 4.



FIG. 4 depicts a process 400 of natural language match confirmation, according to an implementation of the invention. The various processing operations and/or data flows depicted in FIG. 4 (and in the other drawing figures) are described in greater detail herein. The described operations may be accomplished using some or all of the system components described in detail above and, in some implementations, various operations may be performed in different sequences and various operations may be omitted. Additional operations may be performed along with some or all of the operations shown in the depicted flow diagrams. One or more operations may be performed simultaneously. Accordingly, the operations as illustrated (and described in greater detail below) are exemplary by nature and, as such, should not be viewed as limiting.


In an operation 401, process 400 may include a suggested match reception step. During suggested match reception operation 401, confirmation engine 271 may receive information about a suggested match from smart matching engine 270. Confirmation engine 271 may receive the input entity name and the suggested match for it, i.e., the identified entity name.


In an operation 402, process 400 may include a match confirmation history determination step. Match confirmation history determination operation 402 may access a user history database 180/190 to determine a user command confirmation history associated with the suggested match. The user command confirmation history may include a previous confirmations counter comprising information about a user's previous confirmations of the suggested match between the input entity name and identified entity name. For example, a user's confirmation history may include the information that a user has confirmed a particular match 9 times, cancelled that match 2 times, and confirmed the previous 6 suggestions of the match.


In some implementations, the user command confirmation history may further include additional entity match information, which may include the context of a user's confirmation history with respect to a suggested match. For example, additional entity match information may include information about circumstances under which a user has confirmed a suggested match. For instance, where a user has made the request to "call James," and the suggested match is with the identified entity "James Smith" the business associate, the additional entity match information may include the information that the user has confirmed this suggested match 13 times during business hours, but cancelled the match 7 times during evening hours. During match confirmation history determination operation 402, confirmation engine 271 may determine the number of previous confirmations by a user of the suggested match between the input entity name and the identified entity name, and may further use additional entity match information to make the determination.
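The sketch below illustrates one way such a confirmation history record, including context-specific counts, might be represented. The ConfirmationHistory structure and its fields are hypothetical stand-ins for whatever representation user history database 180/190 actually uses.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional, Tuple

@dataclass
class ConfirmationHistory:
    """Record of a user's past responses to one suggested match between an input entity name and an identified entity name."""
    confirmations: int = 0               # total times the user confirmed this suggested match
    cancellations: int = 0               # total times the user cancelled it
    consecutive_confirmations: int = 0   # current streak of confirmations
    # Additional entity match information: counts keyed by (context, whether the user confirmed)
    context_counts: Dict[Tuple[str, bool], int] = field(default_factory=dict)

    def record(self, confirmed: bool, context: Optional[str] = None) -> None:
        if confirmed:
            self.confirmations += 1
            self.consecutive_confirmations += 1
        else:
            self.cancellations += 1
            self.consecutive_confirmations = 0  # a cancellation breaks the streak
        if context is not None:
            key = (context, confirmed)
            self.context_counts[key] = self.context_counts.get(key, 0) + 1

history = ConfirmationHistory()
history.record(True, context="business hours")
history.record(False, context="evening")
print(history.confirmations, history.cancellations, history.context_counts)
```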


In an operation 404, process 400 may include a confirmation history thresholding step. Confirmation history thresholding operation 404 may include comparing the number of previous confirmations by the user to a predetermined threshold number of confirmations. Confirmation history thresholding operation 404 may include determining whether the previous number of confirmations exceeds or fails to exceed the predetermined threshold number of confirmations.
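A minimal sketch of this comparison, assuming an arbitrary threshold of three previous confirmations (the disclosure does not fix a particular value), might look as follows; confirmation_mode is a hypothetical helper name.

```python
CONFIRMATION_THRESHOLD = 3  # hypothetical value chosen only for this example

def confirmation_mode(previous_confirmations: int, threshold: int = CONFIRMATION_THRESHOLD) -> str:
    """Choose how to confirm the suggested match based on the user's confirmation history."""
    # At or below the threshold the system asks the user explicitly (active confirmation);
    # above it the system only announces the action and waits briefly for a cancellation (passive).
    return "passive" if previous_confirmations > threshold else "active"

print(confirmation_mode(1))  # -> 'active'
print(confirmation_mode(9))  # -> 'passive'
```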


In an operation 406, process 400 may include an active match confirmation step. If confirmation engine 271 determines that a number of previous confirmations does not exceed a predetermined confirmation threshold number, process 400 may proceed to an active confirmation request operation 406. During active confirmation request operation 406, confirmation engine 271 may cause the host system to query the user for active confirmation of the suggested match provided by smart matching engine 270. For example, where a user has requested directions to the input entity name "233 Main St.," the system may actively request that the user confirm the identified entity name by querying the user "Did you mean 233 Main St, Cambridge, Mass.?" When a match has been confirmed by the user, confirmation engine 271 may designate the confirmed entity name as a selected entity name for subsequent execution. Confirmation engine 271 may further update the previous confirmation history with the information that the suggested match has been confirmed. Such updating may include incrementing a number of previous confirmations, incrementing a number of previous consecutive confirmations, and/or providing any additional entity match information about the context in which the suggested match was confirmed.


When a match has been cancelled or declined by the user, confirmation engine 271 may designate the identified entity name as a non-match, and proceed to an alternative match suggestion operation 410. Confirmation engine 271 may further update the previous confirmation history with the information that the suggested match was not confirmed. Such updating may include decrementing a number of previous confirmations, resetting a number of previous consecutive confirmations, resetting a total number of previous confirmations, and/or providing any additional entity match information about the context in which the suggested match was cancelled.
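For illustration, the sketch below combines the active confirmation prompt with the counter updates described in the two preceding paragraphs. The prompt_user callable and the plain dictionary of counters are assumptions for the example, not the disclosed implementation of confirmation engine 271.

```python
def active_confirmation(prompt_user, suggested_name: str, history: dict) -> bool:
    """Ask the user to confirm the suggested match and update simple confirmation counters.

    prompt_user is any callable that poses a yes/no question and returns True or False;
    history holds counters such as {"confirmations": 9, "cancellations": 2, "consecutive": 6}.
    """
    confirmed = prompt_user(f"Did you mean {suggested_name}?")
    if confirmed:
        history["confirmations"] = history.get("confirmations", 0) + 1
        history["consecutive"] = history.get("consecutive", 0) + 1
    else:
        history["cancellations"] = history.get("cancellations", 0) + 1
        history["consecutive"] = 0  # a cancelled suggestion resets the streak
    return confirmed

history = {"confirmations": 2, "cancellations": 0, "consecutive": 2}
print(active_confirmation(lambda question: True, "233 Main St, Cambridge, Mass.", history), history)
```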


In an operation 410, process 400 may include an alternative match suggestion step. Confirmation engine 271 may perform an alternative match suggestion operation after a user has cancelled or failed to confirm an initially suggested match provided by smart matching engine 270 (either in operation 406 or an operation 412). In some implementations, confirmation engine 271 may request that smart matching engine 270 provide a next best match for the input entity name. Confirmation engine 271 may then suggest this next best match as an alternate identified entity name to the user for confirmation. In some implementations, confirmation engine 271 may select a next best match from an n-best list that has already been provided by smart matching engine 270. Confirmation engine 271 may then suggest this next best match as an alternate identified entity name to the user for confirmation. In some implementations, confirmation engine 271 may provide, to the user, multiple potential matches, e.g., from an n-best list provided by smart matching engine 270, from which the user may select an alternate identified entity name for confirmation. After alternative match suggestion operation 410, process flow may return to match confirmation history determination operation 402 and repeat operations 402, 404, 406, and/or 412 using the alternate identified entity name.
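A minimal sketch of selecting a next best match from such an n-best list, skipping entries the user has already rejected, is shown below; next_best_match is a hypothetical helper name.

```python
from typing import List, Optional, Set

def next_best_match(n_best: List[str], rejected: Set[str]) -> Optional[str]:
    """Return the highest-ranked identified entity name the user has not yet rejected."""
    for candidate in n_best:
        if candidate not in rejected:
            return candidate
    return None  # no alternatives remain; the system may ask the user to restate the request

print(next_best_match(["James Smith", "James Jones", "James Brown"], {"James Smith"}))  # -> James Jones
```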


In an operation 412, process 400 may include a passive confirmation operation. If confirmation engine 271, at thresholding operation 404, determines that a number of previous confirmations exceeds a predetermined confirmation threshold number, process 400 may proceed to a passive confirmation request operation 412. A passive confirmation operation 412 may be performed by confirmation engine 271 when the user has previously confirmed the suggested match more than a certain number of times. After multiple past confirmations, the system may determine a higher level of certainty in the suggested match, and not require that the user actively confirm the match. A passive confirmation operation may proceed by alerting the user (e.g., by an audio prompt or a display prompt) that an action based on the request and the identified entity name will be carried out. A predetermined pause period may then be introduced to give the user an opportunity to cancel the action. If the user does not cancel the action, then confirmation engine 271 may confirm the suggested match. If the user does issue a cancellation command, then confirmation engine 271 may designate the suggested match as non-confirmed.
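The passive confirmation flow might be sketched as follows, assuming the host system signals a user cancellation through an event object; the pause length, the announce callable, and the passive_confirmation name are placeholders for this example only.

```python
import threading

def passive_confirmation(announce, cancel_event: threading.Event, pause_seconds: float = 3.0) -> bool:
    """Announce the pending action, then wait briefly; the match is confirmed unless the user cancels.

    announce is any callable that plays or displays the alert; cancel_event is set by the host
    system if the user issues a cancellation command during the pause period.
    """
    announce("Calling James Smith ...")
    cancelled = cancel_event.wait(timeout=pause_seconds)
    return not cancelled  # True means the suggested match is treated as passively confirmed

cancel = threading.Event()
print(passive_confirmation(print, cancel, pause_seconds=0.1))  # -> True (no cancellation arrived)
```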


When a match has been confirmed by the user, confirmation engine 271 may designate the confirmed entity name as a selected entity name for subsequent execution at operation 414. Confirmation engine 271 may further update the previous confirmation history with the information that the suggested match has been confirmed. Such updating may include incrementing a number of previous confirmations, incrementing a number of previous consecutive confirmations, and/or providing any additional entity match information about the context in which the suggested match was confirmed.


When a match has been cancelled or non-confirmed by the user, confirmation engine 271 may designate the identified entity name as a non-match, and proceed to an alternative match suggestion operation 410. Confirmation engine 271 may further update the previous confirmation history with the information that the suggested match was not confirmed. Such updating may include decrementing a number of previous confirmations, resetting a number of previous consecutive confirmations, resetting a total number of previous confirmations, and/or providing any additional entity match information about the context in which the suggested match was cancelled.


In an operation 414, process 400 may include a selected match finalization operation. Selected match finalization operation 414 may be performed by confirmation engine 271 to pass the confirmed selected entity name to execution engine 273. During operation 414, information about the confirmed match, e.g., the selected entity name and any additional entity match information, may be passed to learning engine 272 to update the entity matching model.
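As an illustration of folding a confirmed match back into the entity matching model, the sketch below keeps nothing more than nested counters; learning engine 272 is not limited to, or described by, this representation, and update_entity_matching_model is a hypothetical name.

```python
def update_entity_matching_model(model: dict, input_name: str, selected_name: str) -> None:
    """Record that input_name was resolved to selected_name, so future matches can be ranked higher."""
    matches = model.setdefault(input_name.lower(), {})
    matches[selected_name] = matches.get(selected_name, 0) + 1

model = {}
update_entity_matching_model(model, "James", "James Smith")
update_entity_matching_model(model, "James", "James Smith")
print(model)  # -> {'james': {'James Smith': 2}}
```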


Returning now to FIG. 3, in an operation 310, process 300 may include a natural language execution step. After confirmation of the identified entity name as the selected entity name during entity match confirmation operation 308, natural language execution operation 310 may be performed by execution engine 273 to cause the system to carry out the request on the selected entity name. For example, the request may include a phone dialing request and the selected entity name may be a user contact. Execution of this natural language result would cause a device of the user to initiate a phone call. The request may include a search request, and the selected entity name may include a search object. Execution of this natural language result would cause a device of the user to carry out a search for information about the search object. The request may include a navigation request, and the selected entity name may include a street address or other physical location. Execution of this natural language result would cause the device of the user to provide the user with directions to the street address or physical location.
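A minimal dispatch over the request types described above might look like the following sketch; the returned strings merely stand in for the device actions, and execute_request is a hypothetical name rather than the interface of execution engine 273.

```python
def execute_request(request_type: str, selected_entity_name: str) -> str:
    """Carry out the confirmed request; each branch stands in for a device action described above."""
    if request_type == "phone_dial":
        return f"dialing {selected_entity_name}"
    if request_type == "search":
        return f"searching for information about {selected_entity_name}"
    if request_type == "navigation":
        return f"starting turn-by-turn directions to {selected_entity_name}"
    raise ValueError(f"unsupported request type: {request_type}")

print(execute_request("navigation", "233 Main St, Cambridge, Mass."))
```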


Other implementations, uses and advantages of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. The specification should be considered exemplary only, and the scope of the invention is accordingly intended to be limited only by the following claims.

Claims
  • 1. A computer implemented method of natural language disambiguation, the method being implemented in a computer system having one or more physical processors programmed with computer program instructions that, when executed by the one or more physical processors, cause the computer system to perform the method, the method comprising:
    processing, by the computer system, using an intent recognition engine, a natural language input;
    determining, by the computer system, based on the processing, at least a type of request and an input entity name;
    identifying, by the computer system, at least a first match between the input entity name and a first identified entity name and at least a second match between the input entity name and a second identified entity name;
    selectively performing, by the computer system, one of a passive confirmation or an active confirmation of whether the first identified entity name or the second identified entity name was intended by the natural language input, depending on a threshold comparison applied to previous confirmation history of a number of previous confirmations of whether the first identified entity name or the second entity name was intended by the natural language input,
    wherein the active confirmation comprises:
      requesting, by the computer system, a confirmation of whether the first identified entity name or the second identified entity name was intended by the natural language input; and
      receiving, by the computer system, a response to the requested confirmation;
    wherein the passive confirmation comprises:
      selecting, as a response, the first identified entity name or the second identified entity name as the intended entity name;
    generating, by the computer system, a request based on the type of request and the response; and
    updating the number of previous responses regarding whether the first identified entity name or the second identified entity name was intended by the natural language input.
  • 2. The computer implemented method of claim 1, wherein the input entity name includes at least one of a contact name, an address, or a business name.
  • 3. The computer implemented method of claim 1, wherein the type of request includes a phone dialing request and the first identified entity name or the second identified entity name includes a user contact.
  • 4. The computer implemented method of claim 1, wherein the type of request includes a search query and the first identified entity name or the second identified entity name includes a search object.
  • 5. The computer implemented method of claim 1, wherein the type of request includes a navigation query and the first identified entity name or the second identified entity name includes a street address.
  • 6. The computer implemented method of claim 1, wherein identifying at least a first match between the input entity name and the first identified entity name and at least a second match between the input entity name and the second identified entity name is at least partially based on at least one of a user location, a time of day, a user request history, a user request frequency, and a user preference.
  • 7. The computer implemented method of claim 1, wherein identifying at least a first match between the input entity name and the first identified entity name and at least a second match between the input entity name and the second identified entity name is at least partially based on a user preference for contacting at least one of family, friends, and business associates.
  • 8. The computer implemented method of claim 1, wherein: the active confirmation is performed if a number of previous confirmations of whether the first identified entity name or the second identified entity name was intended by the natural language input is below a predetermined threshold; andwherein the passive confirmation is performed if a number of previous confirmations of whether the first identified entity name or the second identified entity name was intended by the natural language input is above a predetermined threshold.
  • 9. The computer implemented method of claim 1, wherein the method further comprises:alerting the user that execution of the request will be performed;waiting, by the computer system, a predetermined amount of time for user cancellation of the announced execution prior to executing the request; andupdating the number of previous responses regarding whether the first identified entity name or the second identified entity name was intended by the natural language input.
  • 10. The computer implemented method of claim 9, further comprising: receiving, by the computer system, a cancellation command from the user;requesting that the user select an alternate request different from the generated request; andstoring the alternate request selected by the user for use in future natural language input processing.
  • 11. The computer implemented method of claim 1, wherein the method further comprises:processing, by the computer system, using the intent recognition engine, a second natural language input;determining, by the computer system, a second input entity name based on the processing;identifying, by the computer system, at least a third match between the second input entity name and the first identified entity name and at least a fourth match between the second input entity name and the second identified entity name;determining, by the computer system, that the first number of times that the first identified entity name was intended exceeds a predetermined threshold; anddetermining, by the computer system, that the first identified entity name is intended by the second natural language input based on the determination that the first number of times that the first identified entity name was intended exceeds the predetermined threshold.
  • 12. The computer implemented method of claim 1, wherein requesting a confirmation comprises: requesting that the user provide an alternate entity name, different from the first identified entity name and the second identified entity name;providing information about the alternate entity name as the response to the requested confirmation,and wherein the method further comprises:storing, by the computer system, information indicating the response for use in future natural language input processing.
  • 13. The computer implemented method of claim 1, further comprising: updating the intent recognition engine with information indicating a match between the input entity name and the first identified entity name or second identified entity name based on one of the passive confirmation or the active confirmation,wherein the computer system recognizes the first identified entity name and the second identified entity name as being an exact match.
  • 14. The computer implemented method of claim 1, further comprising: updating the intent recognition engine with information indicating a match between the input entity name and the first identified entity name or second identified entity name based on one of the passive confirmation or the active confirmation,wherein the computer system recognizes the first identified entity name and the second identified entity name as being a phonetic match.
  • 15. The computer implemented method of claim 1, further comprising: updating the intent recognition engine with information indicating a match between the input entity name and the first identified entity name or second identified entity name based on one of the passive confirmation or the active confirmation,wherein the computer system recognizes the first identified entity name and the second identified entity name as being a partial match.
  • 16. A system of natural language disambiguation, the system comprising: a computer system comprising one or more physical processors programmed by computer program instructions that, when executed, cause the computer system to:
    process, using an intent recognition engine, a natural language input;
    determine, based on the processed natural language input, at least a type of request and an input entity name;
    identify at least a first match between the input entity name and a first identified entity name and at least a second match between the input entity name and a second identified entity name;
    selectively perform one of a passive confirmation or an active confirmation of whether the first identified entity name or the second identified entity name was intended by the natural language input, depending on a threshold comparison applied to previous confirmation history of a number of previous confirmations of whether the first identified entity name or the second entity name was intended by the natural language input,
    wherein the active confirmation comprises:
      requesting a confirmation of whether the first identified entity name or the second identified entity name was intended by the natural language input; and
      receiving a response to the requested confirmation;
    wherein the passive confirmation comprises:
      selecting, as a response, the first identified entity name or the second identified entity name as the intended entity name;
    generate a request based on the type of request and the response; and
    update the number of previous responses regarding whether the first identified entity name or the second identified entity name was intended by the natural language input.
  • 17. The system of claim 16, wherein the input entity name includes at least one of a contact name, an address, and a business name.
  • 18. The system of claim 16, wherein the request includes a phone dialing request and the first identified entity name or the second identified entity name includes a user contact.
  • 19. The system of claim 16, wherein the type of request includes a search query and the first identified entity name or the second identified entity name includes a search object.
  • 20. The system of claim 16, wherein the type of request includes a navigation query and the first identified entity name or the second identified entity name includes a street address.
  • 21. The system of claim 16, wherein the computer program instructions further cause the computer system to: identify at least the first match between the input entity name and the first identified entity name and at least the second match between the input entity name and the second identified entity name at least partially based on at least one of a user location, a time of day, a user request history, a user request frequency, and a user preference.
  • 22. The system of claim 16, wherein the computer program instructions further cause the computer system to: identify at least the first match between the input entity name and the first identified entity name and at least the second match between the input entity name and the second identified entity name at least partially based on user preferences including a preference for contacting at least one of family, friends, and business associates.
  • 23. The system of claim 16, wherein the computer program instructions further cause the computer system to: perform the active confirmation if a number of previous confirmations of whether the first identified entity name or the second identified entity name was intended by the natural language input is below a predetermined threshold; andperform the passive confirmation if a number of previous confirmations of whether the first identified entity name or the second identified entity name was intended by the natural language input is above a predetermined threshold.
  • 24. The system of claim 16, wherein the computer program instructions further cause the computer system to:alert the user that execution of the intended request will be performed;wait a predetermined amount of time for user cancellation of the announced execution prior to executing the intended request; andupdate the number of previous responses regarding whether the first identified entity name or the second identified entity name was intended by the natural language input.
  • 25. The system of claim 24, further comprising instructions to cause the computer system to: receive a cancellation command from the user;request that the user select an alternate request different from the generated request; andstore the alternate request selected by the user for use in future natural language input processing.
  • 26. The system of claim 16, wherein the computer program instructions further cause the computer system to:process, using the intent recognition engine, a second natural language input;identify a second input entity name based on the processing;identify at least a third match between the second input entity name and the first identified entity name and at least a fourth match between the second input entity name and the second identified entity name;determine that the first number of times that the first identified entity name was intended exceeds a predetermined threshold; anddetermine that the first identified entity name is intended by the second natural language input based on the determination that the first number of times that the first identified entity name was intended exceeds the predetermined threshold.
  • 27. The system of claim 16, wherein the computer program instructions to request a confirmation further cause the computer system to: request that the user provide an alternate entity name, different from the first identified entity name and the second identified entity name;provide information about the alternate entity name as the response to the requested confirmation, andwherein the computer system is further caused to store information indicating the response for use in future natural language input processing.
  • 28. The system of claim 16, wherein the computer program instructions further cause the computer system to: update the intent recognition engine with information indicating a match between the input entity name and the first identified entity name or second identified entity name based on one of the passive confirmation or the active confirmation,wherein the computer system recognizes the first identified entity name and the second identified entity name as being an exact match.
  • 29. The system of claim 16, wherein the computer program instructions further cause the computer system to: update the intent recognition engine with information indicating a match between the input entity name and the first identified entity name or second identified entity name based on one of the passive confirmation or the active confirmation,wherein the computer system recognizes the first identified entity name and the second identified entity name as being a phonetic match.
  • 30. The system of claim 16, wherein the computer program instructions further cause the computer system to: update the intent recognition engine with information indicating a match between the input entity name and the first identified entity name or second identified entity name based on one of the passive confirmation or the active confirmation,wherein the computer system recognizes the first identified entity name and the second identified entity name as being a partial match.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of U.S. Provisional Patent Application Ser. No. 62/368,975, entitled “SYSTEM AND METHOD OF DISAMBIGUATING NATURAL LANGUAGE PROCESSING REQUESTS”, filed Jul. 29, 2016, which is hereby incorporated herein by reference in its entirety.

Related Publications (1)
Number Date Country
20180032503 A1 Feb 2018 US
Provisional Applications (1)
Number Date Country
62368975 Jul 2016 US