This invention relates to a virtual reality avatar.
Servers located around the Internet serve up content to users on demand. A user interacting through a search engine enters a text query for information and the search results are displayed to the user as text, graphics, audio and/or video, using a graphical user interface (GUI) such as an Internet browser.
According to an aspect of the present invention, a method of conducting commerce includes receiving a transaction request from a user as text input and using natural language processing to analyze the text input to build conversations with the user based on the transaction request. The method also includes conducting the transaction with the user based on the text input, generating a voice-synthesized response in accordance with the transaction through an avatar, and tracking the transaction by storing the transaction in a database.
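The transaction flow just described can be sketched as a small program. This is a minimal, hypothetical illustration; the class and method names (`TransactionHandler`, `analyze`, `conduct`) and the keyword-matching stand-in for natural language processing are assumptions, not part of the disclosed system.

```python
class TransactionHandler:
    def __init__(self):
        # Stands in for the transaction-tracking database.
        self.database = []

    def analyze(self, text_input):
        # Trivial stand-in for natural language processing: extract a
        # transaction keyword from the user's text input.
        for keyword in ("order", "balance", "help"):
            if keyword in text_input.lower():
                return keyword
        return "unknown"

    def conduct(self, user, text_input):
        intent = self.analyze(text_input)
        response = f"Avatar (voice-synthesized): handling your {intent} request."
        # Track the transaction by storing it in the database.
        self.database.append({"user": user, "input": text_input, "intent": intent})
        return response

handler = TransactionHandler()
print(handler.conduct("alice", "What is my order status?"))
```

In a deployed system the `analyze` step would be replaced by a full conversational engine and the list by persistent storage; the sketch only shows how analysis, response generation, and tracking chain together.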
Tracking searches a database to find related information associated with conducting the transaction. The method can generate follow-up messages to send to the user based on added information stored in the database. The method can statistically analyze responses to the follow-up messages to generate marketing-related information.
The transaction can be a user request as to order status for an order being tracked in the database, an inquiry as to financial information related to the user, support for a sales transaction, a report, or a help desk inquiry that involves customer support for a product or service. The transaction can support a report for customer support to report a malfunctioning product, system, or service. The method can generate responses by searching a conversational engine to produce a match and animate the avatar with a voice and facial movements corresponding to text produced from the match.
According to an aspect of the present invention, a computer program product for conducting commerce includes instructions to cause a computer to receive a transaction request from a user as text input and use natural language processing to analyze the text input to build conversations with the user based on the transaction request. The computer program product further includes instructions to generate a voice-synthesized response in accordance with the transaction through an avatar and track the transaction by storing the transaction in a database.
In some aspects, the computer program product further includes instructions to search a database for related content that can further assist in conducting the transaction.
One or more of the following may also be included. The text input may include a user query. The database may include content residing on a server system. The server system may be linked to a globally connected group of computer systems.
Instructions to generate the response may include instructions to search a database in conjunction with the transaction, and animate the avatar with a voice and facial movements representing text associated with the transaction. Animating may be based on a history of a user interaction with the database. Animating may include generating helpful verbal suggestions for completing the transaction based on further queries to the database. Animating may include natural language processing (NLP) techniques to develop and build conversations between the user and the avatar. Thus, completing the transaction may be in response to receiving the text input and producing a suggestion generated by the avatar.
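Animating based on a history of user interaction can be illustrated with a short sketch. All names here (`suggest_next_step`, `animate_avatar`, the history strings, and the expression cue) are hypothetical examples, not the disclosed implementation.

```python
def suggest_next_step(history):
    """Choose a helpful verbal suggestion based on prior user interactions."""
    if not history:
        return "Welcome! You can ask about orders, accounts, or support."
    last = history[-1]
    if last == "order placed":
        return "Would you like a shipping estimate for your order?"
    return "Is there anything else I can help you with?"

def animate_avatar(text):
    # Stand-in for facial modeling: pair the spoken text with a
    # facial-movement cue for the avatar.
    return {"speech": text, "expression": "smile"}

frame = animate_avatar(suggest_next_step(["order placed"]))
print(frame["speech"])
```

The point of the sketch is the data flow: the interaction history drives the suggestion, and the suggestion is then bound to voice and facial movement before presentation.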
Other features, objects, and advantages of the invention will be apparent from the description and drawings, and from the claims.
Referring to
The network 10 includes a web content server 22 linked to the Internet 20. The user, through the web browser software, accesses content residing in databases in the web content server 22. The user 18 connects to the web content server 22 by entering its Universal Resource Locator (URL) address. The web content server 22 URL may also be obtained from executing a search engine through the browser software on the user system 12.
Referring to
Referring to
Details of the input stage 102 are discussed in
The processing stage 104 is a user interaction process that uses a conversational engine in conjunction with natural language processing (NLP) techniques to develop and build conversations between the user and the avatar. In a preferred example, the conversational engine is the eGain Assistant from eGain Communications.
The processing stage 104 searches the database 46 for appropriate content to present to the user 18 in response to the user query represented by the text. The response may also include one or more words that represent a “key concept” or concepts associated with the response. The key concept triggers a facility to present information on or about the key concept(s). The response and conversation attributes are passed to the facial modeling stage 106.
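The content search and key-concept triggering can be sketched as follows. The content table, concept list, and function name are hypothetical placeholders for the database 46 and its lookup facility.

```python
# Hypothetical content store standing in for the database 46.
CONTENT = {
    "shipping": "Orders ship within two business days.",
    "returns": "Returns are accepted within 30 days.",
}
KEY_CONCEPTS = set(CONTENT)

def respond(query):
    """Search the content table and report any key concepts that matched."""
    words = [w.strip("?.,") for w in query.lower().split()]
    concepts = [w for w in words if w in KEY_CONCEPTS]
    response = " ".join(CONTENT[c] for c in concepts) or "Could you rephrase that?"
    return response, concepts

text, concepts = respond("When does shipping happen?")
print(text, concepts)
```

The matched concepts returned alongside the response correspond to the “key concept” words that trigger the information-presentation facility.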
The facial modeling stage 106 combines the response and the conversational attributes to a face and voice of the avatar. The avatar is a photo-realistic, life-like virtual person. The facial modeling stage 106 uses a facial modeling system, such as LifeFX from LifeFX Stand-In Technology, to bring the avatar to life. The avatar is generated from simple two-dimensional photographs and animated in the facial modeling stage 106.
LifeFX's presentation of humanlike communication, facial expressions, and emotions draws on advanced research for sophisticated medical applications at the University of Auckland. LifeFX is a biologically based facial modeling system. The LifeFX Stand-In Virtual Person is realistic and interactive. Rather than requiring developers to spend days generating realistic images, Stand-Ins are “ready to use,” human-like faces that serve as flexible containers for audio content, facial expressions, and interactivity.
The animation generated in the facial modeling stage 106 is passed to the presentation stage 108. The presentation stage 108 delivers an appropriate answer or a warning message to the user 18 through the avatar displayed on the GUI 16.
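The four stages described above (input 102, processing 104, facial modeling 106, presentation 108) can be chained as simple functions. This is an illustrative sketch; every name and return value here is an assumption made for the example.

```python
def input_stage(raw):
    # Stage 102: normalize the user's raw text input.
    return raw.strip()

def processing_stage(text):
    # Stage 104: produce a response and conversation attributes.
    return {"response": f"You asked: {text}", "attributes": {"tone": "helpful"}}

def facial_modeling_stage(result):
    # Stage 106: combine the response and attributes into an animation.
    return {"animation": "talking-head", **result}

def presentation_stage(animation):
    # Stage 108: deliver the answer through the avatar on the GUI.
    return f"[avatar {animation['animation']}] {animation['response']}"

def pipeline(raw):
    return presentation_stage(
        facial_modeling_stage(processing_stage(input_stage(raw))))

print(pipeline("  order status  "))
```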
Referring to
Referring to
In a preferred embodiment, the related information is displayed on the GUI 16 on a matrix representation using “TheBrain” from TheBrain Technologies Corporation of Santa Monica, Calif.
In
Referring to
Subsequently, there are a number of actions that the program can take based on the added information. For example, the actions can be marketing related. The program can follow up with messages to the user or with other marketing material, and can statistically analyze information stored in the database to derive useful market data. Thus, the program retrieves information from the database and sends it to the user, but in addition the program tracks 138 the interaction with the user and stores that interaction in the same or a different database. The tracked transaction can subsequently be used either for specific marketing to that person or for statistical analysis to produce information used in market research.
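The tracking-and-analysis step can be sketched with frequency counts standing in for statistical analysis. The record structure and function names are assumptions for this example only.

```python
from collections import Counter

# Stands in for the tracked-transaction database.
interactions = []

def track(user, action):
    # Store the user interaction for later marketing analysis.
    interactions.append({"user": user, "action": action})

def market_summary():
    """Statistically summarize tracked actions (here: simple frequency counts)."""
    return Counter(rec["action"] for rec in interactions)

track("alice", "order")
track("bob", "order")
track("alice", "support")
print(market_summary().most_common(1))
```

A production system would apply richer statistics, but the shape is the same: every interaction is stored, then aggregated into market-research data.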
Alternatively or in addition, the program performs 138 an action such as an inquiry as to financial information that relates to the user. The program can also perform a sales transaction. The program can perform a help desk inquiry that involves customer support for a product or service. The program can also file a report for customer support to report a malfunctioning product, system, or service, e.g., when initiating a transaction by receiving an order to buy a product. In addition, the program can call another program to process an inquiry. For example, the other program can be involved in trading a stock, moving money from one account to another and other forms of transactions.
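Calling another program to process each kind of inquiry can be modeled as a dispatch table. The handler names, keys, and responses below are illustrative assumptions, not part of the disclosed system.

```python
def order_status(request):
    return "Your order is in transit."

def account_inquiry(request):
    return "Your balance is available on request."

def help_desk(request):
    return "A support ticket has been opened."

# Maps a transaction kind to the program that processes it.
HANDLERS = {
    "order": order_status,
    "finance": account_inquiry,
    "support": help_desk,
}

def dispatch(kind, request):
    # Call another program (here, a function) to process the inquiry.
    handler = HANDLERS.get(kind)
    return handler(request) if handler else "Unrecognized transaction."

print(dispatch("support", {"product": "widget"}))
```

New transaction types, such as trading a stock or moving money between accounts, would register additional handlers in the same table.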
Referring to
Other embodiments are within the scope of the following claims.
Number | Date | Country | |
---|---|---|---|
20050125229 A1 | Jun 2005 | US |