Interactive avatar in messaging environment

Information

  • Patent Grant
  • Patent Number
    11,425,068
  • Date Filed
    Friday, November 16, 2018
  • Date Issued
    Tuesday, August 23, 2022
Abstract
Among other things, embodiments of the present disclosure relate to communicating via an avatar embedded in an email message. In some embodiments, the avatar can provide information associated with the email message.
Description
FIELD OF THE INVENTION

The present invention generally relates to electronic messaging systems and methods. More particularly, the present invention is related to communicating via an avatar embedded in an email message.


DESCRIPTION OF THE PRIOR ART

Internet traffic becomes more and more important every day, and electronic mail (email) provides a quick and convenient way for Internet users around the world to communicate with one another. An email communication is initiated by a message sender composing an email message, which optionally includes attached files, and then sending the message to one or more recipients via a network (e.g., the Internet).



FIG. 1 illustrates a traditional electronic messaging system. Once an email message is composed by a composer 10 and the composer pushes the "send" key/button, the message is delivered to each of the recipients (e.g., a receiver 25) automatically, provided that valid email destination addresses have been specified in the message. An email message may pass through a number of separate server devices, generally SMTP (Simple Mail Transfer Protocol) servers (e.g., SMTP servers 15 and 20, Message Transfer Agents (MTAs) 30 and 35), before reaching its final destination (e.g., the receiver 25). Each MTA operates on a "store and forward" mechanism, which means that each message is temporarily stored and then forwarded when the MTA has an appropriate communication channel available.


A mail client (e.g., Outlook Express® from Microsoft®, GroupWise® from Novell®, Lotus® Notes from IBM®) has a Graphical User Interface (GUI) for using a messaging service (e.g., email or instant messaging). This GUI is interfaced with two main software stacks (a software stack refers to a set of programs that cooperatively interact to produce a result):

    • A software stack that enables users to submit an email message via Simple Mail Transfer Protocol (SMTP) to an MTA (e.g., SMTP servers 15 and 20, MTAs 30 and 35) using an Internet Message Format (e.g., RFC 2821), and
    • Another software stack that enables users to access the messages stored in a user's mailbox.


      This GUI may further include a directory interface, i.e., an interface enabling access to an enterprise directory or to an address book.


The SMTP protocol establishes a connection between MTAs (e.g., SMTP servers 15 and 20, MTAs 30 and 35) to submit a message for recipients. The information exchanged during this connection is called envelope information and may be stored in a log file or during the store and forward mechanism. In a connection under the SMTP protocol, one message can be submitted for several recipients. A connection under the SMTP protocol is established with an MTA (e.g., an SMTP server 15 or 20, MTA 30 or 35) for a specific Domain Name. The SMTP protocol further describes a queuing method for handling congestion.


An MTA (e.g., an SMTP server 15 or 20, MTA 30 or 35) does not modify a message. However, the MTA puts information in the message header to trace the path of the message. When an MTA (e.g., an SMTP server 20 or MTA 30) accepts a message either for relaying or for final delivery (i.e., delivery to a recipient), the MTA inserts a time stamp line at the beginning of the message header. The time stamp line indicates the identity of the computing device that sent the message, the identity of the computing device that received the message (and is inserting this time stamp), and the date and time the message was received. Relayed messages will have multiple time stamp lines, like the following: Return-Path:


<@GHI.ARPA,@DEF.ARPA,@ABC.ARPA:JOE@ABC.ARPA>


Received: from GHI.ARPA by JKL.ARPA; 27 Oct 81 15:27:39 PST


Received: from DEF.ARPA by GHI.ARPA; 27 Oct 81 15:15:13 PST


Received: from ABC.ARPA by DEF.ARPA; 27 Oct 81 15:01:59 PST


Date: 27 Oct 81 15:01:01 PST
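The trace headers above can be read programmatically. A minimal sketch using Python's standard `email` package, with the message text reproduced from the example and a hypothetical one-line body:

```python
from email.parser import Parser

# Raw message text with the trace ("Received:") lines that each MTA
# prepended, newest hop first, as in the example above.
raw = """\
Return-Path: <@GHI.ARPA,@DEF.ARPA,@ABC.ARPA:JOE@ABC.ARPA>
Received: from GHI.ARPA by JKL.ARPA; 27 Oct 81 15:27:39 PST
Received: from DEF.ARPA by GHI.ARPA; 27 Oct 81 15:15:13 PST
Received: from ABC.ARPA by DEF.ARPA; 27 Oct 81 15:01:59 PST
Date: 27 Oct 81 15:01:01 PST

Hello.
"""

msg = Parser().parsestr(raw)
# get_all returns the time stamp lines in header order: last hop first.
hops = msg.get_all("Received")
for hop in hops:
    route, _, received_at = hop.partition(";")
    print(route.strip(), "at", received_at.strip())
```

Reading the hops in reverse order reconstructs the path the message took from the originating MTA to the final one.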


The SMTP protocol manages the envelope information and the message header of the message. Under the SMTP protocol, the message format is included in the envelope information or the message header. An MTA operating under the SMTP protocol is not aware of the envelope information and/or the message header inside the message.


When a message is received and stored in a recipient's mailbox by the MTA, there are several possibilities for users to access the mail:

    • A User Agent (UA) (e.g., Outlook Express from Microsoft®, i.e., a mail client) can read the message directly in the mailbox.
    • The UA can access the mailbox via a server under POP3 or a like protocol, which provides message retrieval commands on a mail server. The mail server is called a Message Store (MS) (not shown) and must have access to the mailbox. All email messages are retrieved onto the UA.
    • The UA can access the mailbox under IMAP4 (Internet Message Access Protocol 4) via the Message Store. IMAP4 is more sophisticated than the POP3 protocol. For example, under IMAP4, email messages can be kept on the Message Store. A user can create folders on the Message Store to classify the email messages. IMAP4 provides remote management of the mailbox.
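The two server-mediated access paths above can be sketched with Python's standard `poplib` and `imaplib` modules. The host names and credentials are placeholders, and no connection is made until the functions are called:

```python
import poplib
import imaplib


def fetch_all_pop3(host, user, password):
    """POP3 path: every message is downloaded onto the UA."""
    box = poplib.POP3_SSL(host)
    box.user(user)
    box.pass_(password)
    count, _ = box.stat()
    # retr returns (response, lines, octets); join the lines of each message.
    messages = [b"\n".join(box.retr(i + 1)[1]) for i in range(count)]
    box.quit()
    return messages


def list_folders_imap4(host, user, password):
    """IMAP4 path: messages stay on the Message Store; the UA can
    create and browse folders remotely."""
    box = imaplib.IMAP4_SSL(host)
    box.login(user, password)
    _, folders = box.list()  # remote folders created to classify mail
    box.logout()
    return folders
```

The contrast matches the text: the POP3 helper pulls everything down, while the IMAP4 helper only inspects state kept on the Message Store.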


However, in the traditional messaging system (e.g., FIG. 1), it is difficult for a user to identify the purpose of an email message or to have all background information about it. Furthermore, in the traditional messaging system, it is difficult to instantly obtain more details or information about a received email message.


Therefore, it would be desirable to provide a method or system for communicating via an avatar embedded in an email message to immediately obtain more details or information associated with the received email message.


SUMMARY OF THE INVENTION

The present invention describes a system and method for communicating via an avatar embedded in an email message to obtain more details or information associated with the email message.


In one embodiment, there is provided an email messaging system for communicating via an avatar, the system comprising:


a first client device for sending an email having an embedded avatar representing a sender;


a second client device for receiving the email from the first client device and enabling a receiver to submit a query via the avatar in the received email; and


a remote server device for communicating with the second client device via a communication protocol, receiving the query from the second client device, parsing and analyzing the query, retrieving a corresponding answer from the sender's mailbox including emails associated with the parsed and analyzed query, and providing the corresponding answer to the second client device via the avatar.


In one embodiment, there is provided a method for communicating via an avatar, the method comprising:


sending an email having an embedded avatar representing a sender;


receiving the email and enabling a receiver to submit a query via the avatar in the received email; and


receiving the query via a communication protocol, parsing and analyzing the query, retrieving a corresponding answer from the sender's mailbox including emails associated with the parsed and analyzed query, and providing the corresponding answer via the avatar.


In a further embodiment, the corresponding answer is retrieved from one or more of: an agenda, a folder, a previous answer, a calendar, a resume, and an expert system with an artificial intelligence.


In a further embodiment, the communication protocol is one or more of: SMTP (Simple Mail Transfer Protocol), SIP (Session Initiation Protocol), SIMPLE (SIP for Instant Messaging and Presence Leveraging Extensions), APEX (Application Exchange), PRIM (Presence and Instant Messaging Protocol), XMPP (Extensible Messaging and Presence Protocol), IMPS (Instant Messaging and Presence Service), RTMP (Real Time Messaging Protocol), STM (Simple TCP/IP Messaging) protocol, Azureus Extended Messaging Protocol, and like messaging protocols.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings are included to provide a further understanding of the present invention, and are incorporated in and constitute a part of this specification. The drawings illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention. In the drawings,



FIG. 1 illustrates a traditional messaging system.



FIG. 2 illustrates a system diagram according to one embodiment of the present invention.



FIG. 3 illustrates a system diagram according to a further embodiment of the present invention.



FIG. 4 illustrates a flow chart including method steps employed according to one embodiment of the present invention.





DETAILED DESCRIPTION

One embodiment of the present invention implements an avatar remote server device (e.g., an avatar remote server device 100 in FIG. 2) to enable a user to submit a question directly and instantly to a sender of an email message. If the sender of the email message is not online, the avatar remote server device retrieves a corresponding answer from a mailbox of the sender. The question may be provided to the sender or the avatar remote server device in a text, audio or speech format.



FIG. 4 illustrates a flow chart describing method steps executed according to one embodiment of the present invention. At step 1, a sender using a first client device 200, which includes a UA, sends an email 215 including an avatar 240 to a receiver. The avatar 240 is a graphical application or plug-in representing the sender. In one embodiment, the UA embeds the avatar 240 in the email 215 by implementing a plug-in or API (Application Programming Interface) in the email 215. In another embodiment, the UA attaches the avatar 240 to the email 215 as a graphic file, which can invoke a pop-up window 245 upon clicking. The avatar 240 may be an animated moving image or a static image, through which communication between a receiver of the email 215 and the avatar remote server device 205 can be established under a communication protocol, which is described later. The first client device 200 including the UA may be a personal computer, desktop computer, laptop computer, or workstation. After the email 215 is sent from the first client device 200, an MTA sender 210 receives it. The MTA sender 210 is a messaging server forwarding the email 215 from the first client device 200 to an MTA receiver 225. A messaging server is a middleware application that handles messages sent by other applications using an application programming interface (API). The messaging server usually queues and prioritizes messages as needed and has applications to perform the queuing and prioritizing. The messaging server usually operates according to point-to-point messaging or publish/subscribe messaging. Point-to-point messaging is a queue-based mechanism: messages are addressed to a specific queue, senders send messages to the queue, and receivers obtain messages from the queue. Publish/subscribe messaging is an asynchronous messaging paradigm in which senders are not programmed to send their messages to specific receivers. Rather, sent messages are characterized by topics, without knowledge of the receivers. Receivers express an interest in one or more topics and receive messages of interest without knowledge of the senders.
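The two messaging paradigms can be sketched in a few lines of Python; the queue and topic names below are illustrative only:

```python
from collections import defaultdict
from queue import Queue

# Point-to-point: messages are addressed to a specific queue; each
# message is taken off the queue by exactly one receiver.
outbound = Queue()
outbound.put("email 215")
delivered = outbound.get()

# Publish/subscribe: senders publish to a topic without knowing the
# receivers; every subscriber to the topic gets a copy.
subscribers = defaultdict(list)

def subscribe(topic, inbox):
    subscribers[topic].append(inbox)

def publish(topic, message):
    for inbox in subscribers[topic]:
        inbox.append(message)

alice, bob = [], []
subscribe("meetings", alice)
subscribe("meetings", bob)
publish("meetings", "room change")
```

Note the asymmetry: the queue hands "email 215" to a single consumer, while the publish call fans "room change" out to both subscribers.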


In one embodiment, when the sender creates the email 215 using the UA in the first client device, the UA also creates a message identification (MSG-ID) to uniquely identify the email 215 and the receiver (the email's final destination). The UA may create the MSG-ID of the email 215 based on the date and time at which the email 215 is created. For example, if the email 215 is created at 10:20:30 AM on Jul. 10, 2008, its MSG-ID may be XXXX10203007102008, where XXXX represents a random number that can be generated by a random number generator (e.g., ComScire® PCQNG True Random Number Generator (TRNG)). In a further embodiment, after being sent from the first client device 200, the email 215 passes through the avatar remote server device 205, where the MSG-ID is stored in a profile database (e.g., a profile database 155 in FIG. 2).
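A minimal sketch of the MSG-ID scheme described above, assuming the stated layout (a random prefix, then the HHMMSS time, then the MMDDYYYY date); the standard-library `secrets` generator stands in here for the hardware TRNG named in the text:

```python
import secrets
from datetime import datetime


def make_msg_id(created: datetime) -> str:
    # 4-digit random prefix (the "XXXX" of the example), then the
    # creation time and date: HHMMSS + MMDDYYYY.
    prefix = f"{secrets.randbelow(10000):04d}"
    return prefix + created.strftime("%I%M%S%m%d%Y")


# The example from the text: 10:20:30 AM on Jul. 10, 2008.
msg_id = make_msg_id(datetime(2008, 7, 10, 10, 20, 30))
```

For the example date this yields an 18-character identifier ending in 10203007102008, matching the pattern given above.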


At step 2, the MTA sender 210 transfers the email 215 to an MTA receiver 225. The MTA receiver 225 is another messaging server, which delivers the email 215 to a mailbox of the receiver, e.g., via a message store (not shown) that stores the email 215 permanently or temporarily. At step 3, the MTA receiver 225 delivers the email 215 to the receiver. In one embodiment, the receiver operates a second client device 235 including a UA to access the receiver's mailbox. The second client device 235 may be a personal computer, desktop computer, laptop computer, or workstation. After receiving the email 215, if the receiver wants to ask a query related to the email 215, the receiver may send the query to the avatar remote server device 205 under an ARS (Avatar Remote Server) communication protocol 245. The ARS communication protocol 245 includes, but is not limited to, SMTP (Simple Mail Transfer Protocol), SIP (Session Initiation Protocol), SIMPLE (SIP for Instant Messaging and Presence Leveraging Extensions), APEX (Application Exchange), PRIM (Presence and Instant Messaging Protocol), XMPP (Extensible Messaging and Presence Protocol), IMPS (Instant Messaging and Presence Service), RTMP (Real Time Messaging Protocol), STM (Simple TCP/IP Messaging) protocol, Azureus Extended Messaging Protocol, and like messaging protocols. In one embodiment, to send a query to the avatar remote server device 205, the receiver clicks the avatar 240 in the email 215. A pop-up window 245 then appears on the second client device 235. The receiver can type the query in the pop-up window and submit it to the avatar remote server device by pushing a "send" button (not shown). In one embodiment, if the second client device supports a microphone, the receiver can ask the query in audio format. For example, after clicking the avatar, a pop-up window 245 may appear on the second client device 235 prompting the receiver to record the query in audio format. After recording the query, the receiver may push a "complete and send" button (not shown) to stop the recording and send the recorded query.


The avatar remote server device 205, communicating with the second client device 235 via the communication protocol 245, parses and analyzes the query after receiving it from the second client device 235 via the avatar 240. Hafiz et al., "Modular and Efficient Top-Down Parsing for Ambiguous Left-Recursive Grammars", Proceedings of the 10th Conference on Parsing Technologies, June 2007, hereinafter "Hafiz", wholly incorporated by reference as set forth herein, describes building a parser. To analyze the query, MySQL Query Analyzer and Java Query Analyzer may be utilized. MySQL Query Analyzer and Java Query Analyzer are freeware and can be obtained on the Internet. For example, http://www.download3000.com/download_19469.html provides a download link for MySQL Query Analyzer.


After parsing and analyzing the query, the avatar remote server device 205 accesses a mailbox of the sender. The mailbox of the sender may include a plurality of emails. Among the plurality of emails, the avatar remote server device retrieves an answer (i.e., emails, documents and/or information answering the query) corresponding to the query. To retrieve the answer, the avatar remote server device 205 may employ a well-known search algorithm such as depth-first search (DFS) or breadth-first search (BFS). For example, if the query is "provide emails between the sender and Mr. Lee between 10/10/2007 and 11/10/2007", the avatar remote server device 205 accesses one or more mailboxes of the sender and then retrieves emails from "Mr. Lee" or emails to "Mr. Lee" between 10/10/2007 and 11/10/2007 by utilizing a well-known search algorithm. In one embodiment, the avatar remote server device 205 may provide the parsed and analyzed query to the UA in the first client device 200. The UA (e.g., Microsoft® Outlook®, Novell® GroupWise®, IBM® Lotus® Notes) may then retrieve a corresponding email, document or information from the mailbox(es) of the sender using its built-in search feature (e.g., Ctrl-F). The UA then provides the corresponding email, document or information to the avatar remote server device 205. After obtaining the answer, the avatar remote server device 205 provides the answer to the second client device 235 via the avatar. For example, the avatar 240 may show the answer on a display of the second client device 235. Thus, the receiver does not need to wait an hour or a day to obtain the answer, e.g., by sending another email to the sender. The avatar remote server device 205 provides the answer to the receiver within a couple of minutes of receiving the query.
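The retrieval step for the example query can be sketched as a filter over an in-memory mailbox; the addresses and messages below are hypothetical, and a real server would walk the sender's message store instead:

```python
from datetime import date

# Hypothetical mailbox: "provide emails between the sender and Mr. Lee
# between 10/10/2007 and 11/10/2007" should match the first two.
mailbox = [
    {"from": "sender@example.com", "to": "lee@example.com",
     "date": date(2007, 10, 15), "subject": "Draft"},
    {"from": "lee@example.com", "to": "sender@example.com",
     "date": date(2007, 11, 5), "subject": "Re: Draft"},
    {"from": "kim@example.com", "to": "sender@example.com",
     "date": date(2007, 10, 20), "subject": "Lunch"},
]


def search(box, correspondent, start, end):
    # Keep mail sent to or received from the correspondent in the window.
    return [m for m in box
            if correspondent in (m["from"], m["to"])
            and start <= m["date"] <= end]


hits = search(mailbox, "lee@example.com",
              date(2007, 10, 10), date(2007, 11, 10))
```

A linear scan suffices for one mailbox; the DFS/BFS mentioned in the text would apply when the search must traverse a tree of folders and nested mailboxes.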


In a further embodiment, the avatar remote server device may retrieve the answer (e.g., emails, document, or information answering the query) from the mailbox(s) of the sender based on a relevance to the parsed and analyzed query.


In a further embodiment, the avatar remote server device 205 provides the answer by an email to the receiver. At the same time, the avatar remote server device 205 may send a notice to the receiver via the avatar 240. For example, a pop-up window 245 invoked from the avatar 240 may show that the avatar remote server device 205 just sent an email including the answer to the receiver.


In a further embodiment, to retrieve the answer (e.g., emails, documents, or information answering the parsed and analyzed query), the avatar remote server device 205 accesses one or more of the following, each associated with the parsed and analyzed query: an agenda, a folder, a previous answer, a calendar, a resume, an address book, and an expert system with artificial intelligence.


In a further embodiment, the MSG-ID stored in the profile database (e.g., a profile database 155 in FIG. 2) correlates the email 215 and the query provided by the receiver, when the avatar remote server device receives the query. For example, when receiving the query from the receiver, the avatar remote server device 205 checks whether the query passed through the correct avatar corresponding to the MSG-ID. If the query passed through the correct avatar, the avatar remote server device 205 processes the query by parsing and analyzing it. Otherwise, the avatar remote server device 205 ignores the query. Alternatively, the avatar remote server device 205 sends a notice to the receiver indicating that the query is invalid because it came from an avatar that does not correspond to the MSG-ID of the email 215. For security purposes, the profile database may store the MSG-ID encrypted with an encryption/decryption algorithm (e.g., DES (Data Encryption Standard)). When the receiver sends the query, the query may also include an encrypted email address of the receiver. By sharing a key and using the same encryption/decryption algorithm between the receiver and the avatar remote server device 205, the encrypted email address of the receiver can be decrypted in the avatar remote server device 205.


In a further embodiment, after receiving the query from the receiver via an avatar 240 in the email 215, the avatar remote server device 205 sends a notification indicating the query to the sender based on a rule. The rule may be based on a decision table like the following:









TABLE 1

Decision Table

                              Rule 1    Rule 2
Conditions
    Access to Mailbox           Y         Y
    Access to Agenda            Y         Y
    Access to Resume            N         N
    Access to User Profile      Y         N
    Voice Question              Y         Y
    Internet Email              Y         N
    Direct Contact              Y         N
Actions
    Search Text                 Y         Y
    Speech to text              Y         N
The decision table describes each rule. For example, Rule 1 allows the avatar remote server device 205 to access a mailbox, agenda, user profile (e.g., a record including a date of birth) and email to obtain the answer corresponding to the query. However, Rule 1 does not allow the avatar remote server device 205 to access a resume to obtain the answer. Rule 1 allows the avatar remote server device 205 to accept the query in audio format (i.e., "Voice Question" is Y). Rule 1 also allows the avatar remote server device 205 to contact the sender with the query from the receiver. For example, if the avatar remote server device 205 sends the notification to the sender and the sender has online connectivity (i.e., an Internet connection), the sender may be able to provide the answer, e.g., by typing on the first client device 200. The typed answer may then appear on a display of the second client device 235 via the avatar 240. However, the receiver does not know that the answer comes from the sender, because the answer appears the same via the avatar 240 regardless of where it is provided from (e.g., from the avatar remote server device 205 or from the sender). Though Table 1 includes two rules, the decision table can describe a plurality of rules. The decision table may further describe services or actions of the avatar remote server device 205. For example, Rule 1 describes that the avatar remote server device 205 performs a text search in the mailbox, agenda, resume, user profile and email to retrieve or obtain the answer. Rule 1 further describes whether the avatar remote server device 205 supports a speech-to-text transformation. Thus, the avatar remote server device 205 may include a speech-to-text transformation capability. Englund, "Speech recognition in the JAS 39 Gripen aircraft-adaptation to speech at different G-loads", Mar. 11, 2004, wholly incorporated by reference as set forth herein, hereinafter "Englund", describes a speech recognition technology. One of ordinary skill in the art is able to implement the speech-to-text transformation capability in the avatar remote server device 205 based on the contents of Englund. In a further embodiment, the decision table may be stored in a database (e.g., expert system 135 in FIG. 2).
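A decision table of this kind lends itself to a plain lookup structure. A sketch, where the `allowed` helper and the dictionary layout are hypothetical:

```python
# Table 1 expressed as a mapping: each rule lists its permitted
# conditions and supported actions.
DECISION_TABLE = {
    "Rule 1": {
        "conditions": {"Access to Mailbox": True, "Access to Agenda": True,
                       "Access to Resume": False, "Access to User Profile": True,
                       "Voice Question": True, "Internet Email": True,
                       "Direct Contact": True},
        "actions": {"Search Text": True, "Speech to text": True},
    },
    "Rule 2": {
        "conditions": {"Access to Mailbox": True, "Access to Agenda": True,
                       "Access to Resume": False, "Access to User Profile": False,
                       "Voice Question": True, "Internet Email": False,
                       "Direct Contact": False},
        "actions": {"Search Text": True, "Speech to text": False},
    },
}


def allowed(rule: str, item: str) -> bool:
    # Check conditions first, then actions; unknown items default to False.
    entry = DECISION_TABLE[rule]
    return entry["conditions"].get(item, entry["actions"].get(item, False))
```

Before touching a resource or performing a service, the server would consult `allowed(active_rule, item)`, mirroring the checks the text walks through for Rule 1.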


In one embodiment, the avatar remote server device 205 receives the query in an audio format from the receiver, performs a speech-to-text transformation on the audio query, parses and analyzes the transformed query, retrieves a corresponding answer (e.g., emails or documents) from a database associated with the parsed and analyzed query, and provides the corresponding answer in a text or audio format to the receiver via the avatar 240. The database associated with the parsed and analyzed query includes, but is not limited to, a mailbox, a folder, an address book, a previous answer, a toDoList, an expert system, and a resume.


In an exemplary use scenario, a first user receives an email message from a second user's computing device, which is also sent to a third user. The first user would then like to have a previous message exchanged between the second user and the third user. So, the first user clicks an avatar embedded in the message to request the previous message. This request is sent to mailboxes of the third user. Previous emails between the second user and the third user are provided from the mailboxes of the third user to the first user via email(s). In a security aspect, the third user may allow only a specific mailbox to be shared. Other mailboxes of the third user may not be allowed to be searched to find a previous message between the second user and the third user. In a further embodiment, the avatar remote server device may implement a security mechanism such as authentication (e.g., validating whether the first user can read an email message of the second user based on a preference setting of the second user's mailbox) and non-repudiation (e.g., the first user should have a key to open or read the previous message). In a further embodiment, the avatar remote server device accesses a personal information management system, an agenda, a to-do list (i.e., a list including items that have to be done), or an address book of the third user to retrieve more information about the previous message(s). Personal information management refers to both the practice and the study of the activities people perform to acquire, organize, maintain, retrieve and use documents, web pages and email messages for everyday use, to complete tasks and to fulfill a person's various roles (as parent, employee, friend, member of a community, etc.).


In one embodiment, the avatar remote server device (e.g., the avatar remote server device 205 in FIG. 4 or the avatar remote server device 100 in FIGS. 2-3) accesses a mailbox of a user using a cryptographic protocol, e.g., Transport Layer Security (TLS). TLS is a protocol providing security and data integrity for communications over TCP/IP networks such as the Internet. The mailbox of the user may be maintained on the user's computing device and/or a mail server (e.g., SMTP server 20) under one of the communication protocols described above. The avatar remote server device receives a request from a user (e.g., a request to provide the previous email between the second user and the third user), maintains a security aspect (e.g., the avatar remote server device does not access a mailbox that its owner does not want to share) and provides an answer to the request (e.g., by retrieving the previous email from a mailbox of the third user). In one embodiment, the avatar remote server device informs the owner of a mailbox before accessing the mailbox. Then, if the owner of the mailbox is online and can provide the answer to the request immediately, the owner does so. However, the requester who sent the request to the avatar remote server device may not know whether the answer is provided by the avatar remote server device or by the owner, because the answer is provided via the avatar representing the owner in either case.


When the owner of a mailbox provides the answer directly, the avatar remote server device can reuse and save the answer for future use, because many people may ask the same question. For example, assume that a user sent an email to everybody in the user's organization asking them to attend a meeting without specifying the meeting's location and time. Then, everybody may ask where the meeting is and what time it starts. The avatar remote server device can provide an answer automatically if the owner of the mailbox already provided the answer or if the avatar remote server device accessed the user's agenda. When a user installs the avatar remote server device, the user (e.g., an owner of a mailbox) can specify who can access the user's mailboxes, which mailboxes can be accessed by other users, and whether agendas, resumes, toDoLists, folders, previous answers, and expert systems with artificial intelligence associated with the user can be accessed. An expert system is software that attempts to reproduce the performance of one or more human experts in a specific problem domain. In one embodiment, a user agent (UA), e.g., Outlook® from Microsoft® or GroupWise® from Novell®, maintains an agenda, toDoList, address book, calendar, and mail folder. Upon receiving a question, the avatar remote server device may access the UA to retrieve an answer to the question from a mailbox, mail folder, agenda, address book, calendar, or toDoList. The user of the avatar remote server device can also specify which services the avatar remote server device can provide. For example, as described above, the avatar remote server device can provide an answer to a question automatically to users, without contacting the user, by retrieving the answer from a mailbox, mail folder, agenda, address book, calendar, and/or toDoList of the user.
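Reusing a directly provided answer can be sketched as a small cache keyed by the normalized question; the normalization rule and function names here are assumptions:

```python
previous_answers: dict[str, str] = {}


def normalize(question: str) -> str:
    # Fold case, collapse whitespace, drop a trailing question mark so
    # rephrasings of the same question share one cache entry.
    return " ".join(question.lower().split()).rstrip("?")


def answer(question: str, ask_owner):
    key = normalize(question)
    if key in previous_answers:
        return previous_answers[key]          # served from saved answers
    reply = ask_owner(question)               # owner is contacted only once
    previous_answers[key] = reply
    return reply


calls = []

def owner(q):
    # Stand-in for the mailbox owner answering directly while online.
    calls.append(q)
    return "Room 4B, 10 AM"


first = answer("Where is the meeting?", owner)
second = answer("where is   the meeting", owner)
```

The second, differently phrased question is served from the cache, so the owner is asked only once even though everybody in the organization asks.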


In one embodiment, a sender sends an email including an avatar. A receiver of the email can immediately ask a query related to the email via the avatar, e.g., by clicking the avatar to invoke a pop-up window, typing a query in the pop-up window and submitting it to the avatar remote server device by pushing a "send" button in the pop-up window. Thus, the avatar allows the receiver to interact with the avatar remote server device via a communication protocol, e.g., SMTP (Simple Mail Transfer Protocol). The avatar remote server device may be installed on a computing device of the sender, which includes a user agent (UA), e.g., Outlook® from Microsoft®, and may communicate with the UA to provide answers or services to the receiver who asked the query via the avatar. In another embodiment, the avatar remote server device is installed on an MTA (Message Transfer Agent). Upon receiving a query from a user, the avatar remote server device installed on the MTA parses and analyzes the query and accesses a mailbox, agenda, calendar, or folder associated with the parsed and analyzed query. The mailbox, agenda, calendar, or folder may belong to someone other than the user who asked the query, or alternatively to that user. The avatar remote server device then retrieves a mail or document associated with the parsed and analyzed query from the mailbox, agenda, calendar, or folder of the user or of another user, and sends the retrieved mail or document to the receiver. The retrieved mail or document may appear via the avatar on a screen of the client device that the receiver uses.



FIG. 2 illustrates a system diagram according to one embodiment of the present invention. The avatar remote server device 100 in FIG. 2 corresponding to the avatar remote server device 205 in FIG. 4 includes, but is not limited to, a selection module 140, a parser and analyzer 145, a ARS (Avatar Remote Server) protocol stack 150 and a profile database 155. A client device 175 includes, but is not limited to, an ARS protocol stack 165 and an interface 170. The avatar remote server device 100 and the client device 175 communicate each other via an ARS communication protocol 160, which is described above. The ARS protocol stack 150 in the avatar remote server device 205 and the ARS protocol stack 165 in the client device 175 are a set of programs enabling a communication between the avatar remote server device 205 and the client device 175 under the ARS communication protocol 160. The parser and analyzer 145 may include a parser implemented based on Hafiz and a query analyzer such as MySQL query analyzer or Java query analyzer. Upon receiving a query from the client device 175, the ARS protocol stack 150 forwards to query to the parser and analyzer 145. Then, the parser and analyzer 145 perform parsing and analyzing the query and then provide the parsed and analyzed query to a selection module 140. The selection module accesses a database 190 that includes, but is not limited to, resumes 105, emails 110, agendas 115, folders 120, previous answers 125, toDoLists 130, and expert systems. Then, the selection module 140 provides the parsed and analyzed query to the database and retrieves an answer (e.g., a table, document, email, etc., answering the query) from the database. Subsequently, the selection module 140 provides the answer to the client device 175 via the ARS protocol stack 150. Upon receiving the answer, the client device 175 displays the answer via an avatar 180, which corresponds to the avatar 240 in FIG. 4. 
The interface 170 in the client device 175 may be an Application Programming Interface (API) enabling the avatar and/or a Graphical User Interface (GUI) that assists a user in creating the query, such as via a pop-up menu (not shown). The profile database 155 in the avatar remote server device 205 stores the MSG-ID(s) and a list of receivers and senders corresponding to each MSG-ID. The profile database 155 may also store an access right of each user. The access right of a user may specify whether the user can access all the data in the databases of all other users, whether the user can access only specific data (e.g., only mailboxes and address books) of a specific user, when the user can access the database, etc.
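The server-side path just described (ARS protocol stack → parser and analyzer 145 → selection module 140 → database 190) can be sketched as below. This is a rough illustrative stand-in: the actual parser and analyzer may be Hafiz-based or use a MySQL or Java query analyzer, and the simple keyword extraction and subject scan here are assumptions made purely for the sketch.

```python
# Illustrative sketch of the query-handling pipeline of FIG. 2.
import re

def parse_and_analyze(query: str) -> dict:
    """Rough stand-in for the parser and analyzer 145: pull a quoted
    search term out of a natural-language question, or fall back to
    the whole question with trailing punctuation stripped."""
    m = re.search(r'"([^"]+)"', query)
    return {"term": m.group(1) if m else query.strip().rstrip("?")}

def select_answer(parsed: dict, database: dict) -> list:
    """Stand-in for the selection module 140: scan the emails table
    for subjects containing the extracted term."""
    term = parsed["term"].lower()
    return [mail for mail in database.get("emails", [])
            if term in mail["subject"].lower()]

# Toy database 190 (emails 110 only, for brevity).
database = {"emails": [
    {"subject": "Patent US201 draft", "body": "..."},
    {"subject": "Lunch on Friday", "body": "..."},
]}
answer = select_answer(parse_and_analyze('Other messages with "Patent US201"?'),
                       database)
```

In the full system, the answer list would travel back through the ARS protocol stack 150 and be displayed via the avatar 180.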


In a further embodiment, the database 190 may be configured to allow limited access to specific data. For example, the avatar remote server device 205 may allow a group of users to access a folder 120 but may not allow them to access a mailbox 110 including emails. The avatar remote server device 205 may allow a group of users access only to a folder named "Group". The avatar remote server device 205 may allow a specific user access only to a folder named "Bid" and to emails from a specific domain (e.g., @ibm.com). Thus, the avatar remote server device 205 can retrieve the answer by accessing only allowed or authorized data. Alternatively, the avatar remote server device 205 can retrieve the answer by accessing all data (e.g., resumes 105-expert systems 135) in the database 190, if an owner of the database 190 allows all the data to be shared with other users.
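The access-right checks above might look like the following sketch. The rule format (named folders, allowed sender domains) mirrors the examples in the text, but the `allowed` function and its dictionary shapes are assumptions; the patent leaves the representation of access rights open.

```python
# Illustrative sketch of per-user access-right checks on database 190.

def allowed(user_rights: dict, resource: dict) -> bool:
    """Return True if the resource may be read under the user's rights."""
    if resource["kind"] == "folder":
        # e.g., access only to a folder named "Bid"
        return resource["name"] in user_rights.get("folders", [])
    if resource["kind"] == "email":
        # e.g., emails only from a specific domain such as ibm.com
        sender_domain = resource["from"].split("@")[-1]
        return sender_domain in user_rights.get("email_domains", [])
    return False  # deny anything the rights do not cover

rights = {"folders": ["Bid"], "email_domains": ["ibm.com"]}
folder_ok = allowed(rights, {"kind": "folder", "name": "Bid"})
folder_denied = allowed(rights, {"kind": "folder", "name": "Private"})
mail_ok = allowed(rights, {"kind": "email", "from": "a@ibm.com"})
```

With such a filter in front of the selection module, the server retrieves answers only from allowed or authorized data, as described above.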



FIG. 3 illustrates another system diagram according to one embodiment of the present invention. All components operate the same as described above with respect to FIG. 2 except that, upon receiving a query from the client device 175, the selection module 140 sends a notification 185 to an owner of the database 190, who may be the sender of the email 215 in FIG. 4. The owner of the database 190 then provides the answer directly, if the owner is connected to the Internet. In a further embodiment, the answer is recorded in the database 190 (e.g., in the previous answers 125) and can be used for other users who ask a similar or identical query. If the owner does not provide an answer within a certain time after the notification 185 is sent, e.g., 10 minutes, the selection module 140 accesses the database 190 to retrieve the answer. The selection module 140 then provides the answer to the client device 175.
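The notify-then-fall-back behavior of FIG. 3 can be sketched with a bounded wait. This is an illustrative concurrency sketch only: the function names are invented, a real system would wait on the order of minutes rather than the fraction of a second used here, and the owner's reply would arrive over the network rather than through an in-process queue.

```python
# Illustrative sketch: prefer a live answer from the database owner,
# fall back to automated retrieval after a timeout (e.g., 10 minutes
# in the text; 0.1 s here so the sketch runs quickly).
import queue
import threading
import time

def answer_query(notify_owner, automated_lookup, timeout: float):
    """Send notification 185 to the owner; if no reply arrives within
    `timeout`, fall back to the selection module's own retrieval."""
    replies: queue.Queue = queue.Queue()
    threading.Thread(target=notify_owner, args=(replies,), daemon=True).start()
    try:
        return replies.get(timeout=timeout)   # owner replied in time
    except queue.Empty:
        return automated_lookup()             # timed out: search database 190

def silent_owner(replies):
    time.sleep(10)  # owner never responds within the timeout

result = answer_query(silent_owner,
                      lambda: "answer from previous answers 125",
                      timeout=0.1)
```

The fallback result would then be recorded in the previous answers 125 for reuse on similar queries.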


In a further embodiment, the avatar 180 comprises an interface 170 including a text pop-up window to provide a text question. The avatar 180 may be a 2D (two-dimensional) or 3D (three-dimensional) image representing the owner of the database 190. According to the capabilities of the client device 175 and its plug-ins, the interface 170 collects the query, which is then translated into the proper grammar and sent via the ARS protocol stack 165 in another message. The query may be an audio file recorded by the interface 170. If the query is an audio file, a speech-to-text transformation is used at the ARS protocol stack 165 to convert the query into text. The selection module 140 may access an expert system 135, which is composed of rules (e.g., table 1) to generate actions and access all other databases such as resumes 105-ToDoList 130.
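The text-or-audio query handling at the ARS protocol stack might be sketched as below. The `normalize_query` function and the `transcribe` hook are assumptions made for illustration: a real system would plug an actual speech recognizer into the hook, while here a stub transcriber stands in for it.

```python
# Illustrative sketch: a text query passes through unchanged; an audio
# query is first run through a pluggable speech-to-text transformation,
# as described for the ARS protocol stack 165.

def normalize_query(payload, kind: str, transcribe=None) -> str:
    """Return the query as text, converting audio when needed."""
    if kind == "text":
        return payload.strip()
    if kind == "audio":
        if transcribe is None:
            raise ValueError("audio queries need a speech-to-text function")
        return transcribe(payload).strip()
    raise ValueError(f"unknown query kind: {kind}")

# Stub transcriber standing in for a real speech recognizer.
fake_transcriber = lambda audio_bytes: "who is third_user_copy"

as_text = normalize_query(" Who is third_user_copy? ", "text")
from_audio = normalize_query(b"\x00\x01", "audio", fake_transcriber)
```

Either way, the downstream parser and analyzer receives plain text.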


In one embodiment, an owner of the database 190 sends a message including the avatar 180 to a receiver. The ARS protocol stack 165 in the client device 175 and the ARS protocol stack 150 in the avatar remote server device 100 maintain compatibility and consistency between the avatar remote server device 100 and the avatar 180 by utilizing the same communication protocol. The implementation of the avatar remote server device 100 and the avatar 180 does not require any modification of a UA (e.g., IBM® Lotus® Notes) to support the functions of the avatar remote server device 100 and the avatar 180.


In a further embodiment, the avatar remote server device 100 is implemented as hardware on a reconfigurable device, e.g., an FPGA (Field Programmable Gate Array) or a CPLD (Complex Programmable Logic Device), using a hardware description language (Verilog, VHDL, Handel-C, or SystemC). In another embodiment, the avatar remote server device 100 is implemented on a semiconductor chip, e.g., an ASIC (Application-Specific Integrated Circuit), using a semi-custom design methodology, i.e., designing a chip using standard cells and a hardware description language.


In a further embodiment, the avatar remote server device 100 is implemented as software using one or more programming languages, e.g., C, C++, Java, .NET, Perl, Python, etc. In one embodiment, the avatar remote server device 100 is recorded on a computer readable medium, e.g., a CD (Compact Disc), DVD (Digital Versatile Disc), or HDD (Hard Disk Drive), as instructions, e.g., in a machine language or assembly language, that are executed by a processor, e.g., an Intel® Core®, IBM® PowerPC®, or AMD® Opteron®.


In a further embodiment, the avatar remote server device 100 is implemented through a computing device, e.g., a desktop, a laptop, a mainframe, a workstation, etc., by being executed in the computing device. The computing device comprises, but is not limited to, processor(s), memory(s), display device(s), input/output device(s) and network interface(s).


The avatar 180 may be implemented by a plug-in or an API. A standard anti-spamming system, e.g., Symantec® Norton™ AntiSpam™, can be added to the client device 175 or the avatar remote server device 100 to prevent spam mails.


The following is another exemplary usage scenario of the present invention.


(1) A sender (first_user_sender@ibm.com) of an email wants to send his avatar in an email, so at the end of the email the sender embeds or attaches his/her avatar. An MSG-ID associated with the email is stored in a database along with the email and the receiver's information. The protocol used between the receiver and the avatar remote server device is SIP (Session Initiation Protocol). The email sent by the sender may be like the following:


Subject: Patent US201


TO: second_user_receiver@ibm.com


From: first_user_sender@ibm.com


Copy: third_user_copy@atty.com; fourth_user_copy@atty.com


Text:

    • Patent related to IBM® WebSphere® Application Server


(2) The sender embeds or attaches the avatar in the email and sends the email to the receiver.


(3) An MTA in charge delivers the email to a mailbox of the receiver.


(4) When the receiver opens the email, the receiver clicks the avatar included in the email.


(5) The avatar contacts an ARS protocol stack of a client device to establish a connection with the avatar remote server device of the sender.


(6) The avatar remote server device receives a connection request from the avatar. An ARS protocol stack of the avatar remote server device parses the connection request and retrieves an MSG-ID associated with the email from which the connection request was made.


(7) A session established by the connection is secured using the MSG-ID created by a UA of the sender. (The validity of the session is verified by matching the MSG-ID and the avatar which invokes the session.)


(8) Based on the retrieved MSG-ID, the avatar remote server device identifies the email and the receiver of the email.


(9) The avatar remote server device obtains an access right of the receiver from the profile database, i.e., whether the receiver (second_user_receiver@ibm.com) can access all data in a database associated with the sender.


(10) The receiver has a question about the email that the receiver received from the sender. Thus, the receiver sends the question to the avatar remote server device via a pop-up window that appears upon clicking the avatar. The question is "Do you have other messages including "patent US201" in the subject?".


(11) The avatar remote server device receives the question through the SIP protocol.


(12) The ARS protocol stack of the avatar remote server device opens the question and sends the question to a parser and analyzer in the avatar remote server device.


(13) The parser and analyzer in the avatar remote server device identifies the question and issues a search request for emails whose subject includes "patent US201".


(14) The selection module in the avatar remote server device performs the search request in the sender's mailbox(es).


(15) The selection module finds 3 emails including "patent US201" in the subject.


(16) The avatar remote server device sends the 3 emails to the receiver under the SIP protocol. Then, the 3 emails appear, via the avatar, on the screen of the client device that the receiver is operating.


(17) The receiver has a question related to one of the 3 emails. The receiver sends the question via the avatar. The question is "Who is "third_user_copy"?".


(18) The avatar remote server device receives the question.


(19) The parser and analyzer in the avatar remote server device issues a search request to look for the third_user_copy.


(20) The selection module searches for the third_user_copy in the mailbox of the sender and in the address book of the sender.


(21) The selection module finds an entry for the third_user_copy in the address book with all information describing the third_user_copy.


(22) The avatar remote server device sends all the information about the third_user_copy to the receiver.


(23) The receiver reads all the information about the third_user_copy, which appears, via the avatar, on the screen of the client device that the receiver is operating.
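The session setup in steps (6) through (9) above can be sketched as follows. The dictionary-based profile database and the `open_session` function are illustrative assumptions; the patent does not prescribe how the MSG-ID lookup or the access-right retrieval is implemented.

```python
# Illustrative sketch of session validation: resolve the MSG-ID carried
# in the connection request, verify the requester is a receiver of that
# message, then fetch the requester's access rights from the profile
# database (as in steps (6)-(9) of the scenario).

profile_db = {
    "MSG-001": {
        "sender": "first_user_sender@ibm.com",
        "receivers": ["second_user_receiver@ibm.com",
                      "third_user_copy@atty.com",
                      "fourth_user_copy@atty.com"],
    },
}
access_rights = {"second_user_receiver@ibm.com": {"mailbox": True}}

def open_session(msg_id: str, requester: str):
    """Validate the session (step 7) and return the requester's access
    rights (step 9), or None if the request is not legitimate."""
    entry = profile_db.get(msg_id)
    if entry is None or requester not in entry["receivers"]:
        return None  # unknown MSG-ID, or requester is not a receiver
    return access_rights.get(requester, {})

ok = open_session("MSG-001", "second_user_receiver@ibm.com")
rejected = open_session("MSG-001", "stranger@example.com")
```

Only once a session is validated this way would the server accept queries such as the subject search in steps (10) through (16).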


Although the embodiments of the present invention have been described in detail, it should be understood that various changes and substitutions can be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Variations described for the present invention can be realized in any combination desirable for each particular application. Thus, particular limitations and/or embodiment enhancements described herein, which may have particular advantages for a particular application, need not be used for all applications. Also, not all limitations need be implemented in methods, systems and/or apparatus including one or more concepts of the present invention.


The present invention can be realized in hardware, software, or a combination of hardware and software. A typical combination of hardware and software could be a general purpose computer system with a computer program that, when loaded and executed, controls the computer system such that it carries out the methods described herein. The present invention can also be embedded in a computer program product, which comprises all the features enabling the implementation of the methods described herein, and which, when loaded in a computer system, is able to carry out these methods.


Computer program means or computer program in the present context include any expression, in any language, code or notation, of a set of instructions intended to cause a system having an information processing capability to perform a particular function either directly or after conversion to another language, code or notation, and/or reproduction in a different material form.


Thus the invention includes an article of manufacture which comprises a computer usable medium having computer readable program code means embodied therein for causing a function described above. The computer readable program code means in the article of manufacture comprises computer readable program code means for causing a computer to effect the steps of a method of this invention. Similarly, the present invention may be implemented as a computer program product comprising a computer usable medium having computer readable program code means embodied therein for causing a function described above. The computer readable program code means in the computer program product comprises computer readable program code means for causing a computer to effect one or more functions of this invention. Furthermore, the present invention may be implemented as a program storage device readable by machine, tangibly embodying a program of instructions executable by the machine to perform method steps for causing one or more functions of this invention.


The present invention may include a method of deploying a computer program product including a program of instructions in a computer readable medium (e.g., a compact disc (CD), a digital versatile disc (DVD), a hard disk drive, a solid state drive, etc.) for one or more functions of this invention, wherein, when the program of instructions is executed by a processor, the computer program product performs one or more of the functions of this invention.


It is noted that the foregoing has outlined some of the more pertinent objects and embodiments of the present invention. This invention may be used for many applications. Thus, although the description is made for particular arrangements and methods, the intent and concept of the invention is suitable and applicable to other arrangements and applications. It will be clear to those skilled in the art that modifications to the disclosed embodiments can be effected without departing from the spirit and scope of the invention. The described embodiments ought to be construed to be merely illustrative of some of the more prominent features and applications of the invention. Other beneficial results can be realized by applying the disclosed invention in a different manner or modifying the invention in ways known to those familiar with the art.

Claims
  • 1. A system comprising: a processor; and memory coupled to the processor and storing instructions that, when executed by the processor, cause the system to perform operations comprising: responsive to a recipient at a computing device selecting an embedded avatar representing a sender that is embedded in a first electronic communication sent by the sender, providing an interface through which the recipient interacts to submit a query to the sender via the embedded avatar, wherein the first electronic communication comprises an email or an instant message; receiving the query via a communication protocol; parsing and analyzing the received query; retrieving an answer corresponding to the parsed and analyzed query from electronic communications sent or received by the sender; sending the answer to the computing device; and initiating display of the answer via the embedded avatar on the computing device.
  • 2. The system of claim 1, wherein the electronic communications sent or received by the sender includes mails associated with the parsed and analyzed query.
  • 3. The system of claim 1, wherein the answer is retrieved from an agenda, a folder, a previous answer, a calendar, a resume, or an expert system with artificial intelligence.
  • 4. The system of claim 1, wherein the memory further stores instructions causing the system to perform operations comprising: sending a notification indicating the query to the sender based on a rule.
  • 5. The system of claim 1, wherein sending the answer to the computing device further comprises: identifying that the sender has an online connection; and relaying the corresponding answer to the computing device, wherein the answer appears on a display of the computing device.
  • 6. The system of claim 1, wherein the memory further stores instructions for causing the system to perform operations comprising: receiving the query in an audio format; performing a speech-to-text transformation on the audio query; and providing the answer in an audio format via the embedded avatar.
  • 7. The system of claim 1, wherein the communication protocol is SMTP (Simple Mail Transfer Protocol), SIP (Session Initiation Protocol), SIMPLE (SIP for Instant Messaging and Presence Leveraging Extensions), APEX (Application Exchange), Prim (Presence and Instance Messaging Protocol), XMPP (Extensible Messaging and Presence Protocol), IMPS (Instant Messaging and Presence Service), RTMP (Real Time Messaging Protocol), STM (Simple TCP/IP Messaging) protocol, or Azureus Extended Messaging Protocol.
  • 8. The system of claim 1, wherein the memory further stores instructions for causing the system to perform operations comprising: storing one or more message-identifications (MSG-IDs), the one or more MSG-IDs uniquely identifying each message created by the sender.
  • 9. A computer-implemented method comprising: responsive to a recipient at a computing device selecting an embedded avatar representing a sender that is embedded in a first electronic communication sent by the sender, providing, by a processor, an interface through which the recipient interacts to submit a query to the sender via the embedded avatar, wherein the first electronic communication comprises an email or an instant message; receiving, by the processor, the query via a communication protocol; parsing and analyzing the received query by the processor; retrieving, by the processor, an answer corresponding to the parsed and analyzed query from electronic communications sent or received by the sender; sending, by the processor, the answer to the computing device; and initiating, by the processor, display of the answer via the embedded avatar on the computing device.
  • 10. The method of claim 9, wherein electronic communications sent or received by the sender includes emails associated with the parsed and analyzed query.
  • 11. The method of claim 9, wherein the corresponding answer is retrieved from an agenda, a folder, a previous answer, a calendar, a resume, or an expert system with artificial intelligence.
  • 12. The method of claim 9, further comprising: sending, by the processor, a notification indicating the query to the sender based on a rule.
  • 13. The method of claim 9, wherein sending the answer to the computing device further comprises: identifying that the sender has an online connection; and relaying the answer to the computing device, wherein the answer appears on a display of the computing device.
  • 14. The method of claim 9, further comprising: receiving, by the processor, the query in an audio format; performing, by the processor, a speech-to-text transformation on the audio query; and providing, by the processor, the answer in an audio format via the embedded avatar.
  • 15. The method of claim 9, wherein the communication protocol is SMTP (Simple Mail Transfer Protocol), SIP (Session Initiation Protocol), SIMPLE (SIP for Instant Messaging and Presence Leveraging Extensions), APEX (Application Exchange), Prim (Presence and Instance Messaging Protocol), XMPP (Extensible Messaging and Presence Protocol), IMPS (Instant Messaging and Presence Service), RTMP (Real Time Messaging Protocol), STM (Simple TCP/IP Messaging) protocol, or Azureus Extended Messaging Protocol.
  • 16. The method of claim 9, wherein memory of the computing device further stores instructions causing the computing device to perform operations comprising: storing one or more message-identifications (MSG-IDs), the one or more MSG-IDs uniquely identifying each message created by the sender.
  • 17. A non-transitory computer-readable medium storing instructions that, when executed by a processor, cause the processor to perform operations comprising: responsive to a recipient at a computing device selecting an embedded avatar representing a sender that is embedded in a first electronic communication sent by the sender, providing an interface through which the recipient interacts to submit a query to the sender via the embedded avatar, wherein the first electronic communication comprises an email or an instant message; receiving the query via a communication protocol; parsing and analyzing the received query; retrieving an answer corresponding to the parsed and analyzed query from electronic communications sent or received by the sender, wherein the electronic communications sent or received by the sender include the first electronic communication; sending the answer to the computing device; and initiating display of the answer via the embedded avatar on the computing device.
  • 18. The non-transitory computer-readable medium of claim 17, wherein the electronic communications sent or received by the sender include emails associated with the parsed and analyzed query.
  • 19. The non-transitory computer-readable medium of claim 17, wherein the answer is retrieved from an agenda, a folder, a previous answer, a calendar, a resume, or an expert system with artificial intelligence.
  • 20. The non-transitory computer-readable medium of claim 17, wherein the instructions cause the processor to perform operations comprising: sending a notification indicating the query to the sender based on a rule.
Priority Claims (1)
Number Date Country Kind
09305104 Feb 2009 EP regional
PRIORITY

This application is a continuation of and claims the benefit of priority of U.S. patent application Ser. No. 15/661,953, filed Jul. 27, 2017 which is a continuation of and claims the benefit of priority of U.S. patent application Ser. No. 14/753,200, filed on Jun. 29, 2015, which is a continuation of and claims the benefit of priority of U.S. patent application Ser. No. 12/471,811, filed on May 26, 2009, which claims priority to European Patent Application No. 09305104.3, filed Feb. 3, 2009, which are hereby incorporated by reference herein in their entirety.

US Referenced Citations (306)
Number Name Date Kind
5826269 Hussey Oct 1998 A
5880731 Liles et al. Mar 1999 A
6023270 Brush, II et al. Feb 2000 A
6038295 Mattes Mar 2000 A
6223165 Lauffer Apr 2001 B1
6233318 Picard et al. May 2001 B1
6283858 Hayes, Jr. et al. Sep 2001 B1
6374292 Srivastava et al. Apr 2002 B1
6473794 Guheen et al. Oct 2002 B1
6772195 Hatlelid et al. Aug 2004 B1
6839411 Saltanov et al. Jan 2005 B1
6842779 Nishizawa Jan 2005 B1
6980909 Root et al. Dec 2005 B2
7079158 Lambertsen Jul 2006 B2
7173651 Knowles Feb 2007 B1
7342587 Danzig et al. Mar 2008 B2
7411493 Smith Aug 2008 B2
7468729 Levinson Dec 2008 B1
7535469 Kim et al. May 2009 B2
7535890 Rojas May 2009 B2
7627828 Collison et al. Dec 2009 B1
7636755 Blattner et al. Dec 2009 B2
7639251 Gu et al. Dec 2009 B2
7689649 Heikes et al. Mar 2010 B2
7775885 Van Luchene et al. Aug 2010 B2
7792789 Prahlad et al. Sep 2010 B2
7859551 Bulman et al. Dec 2010 B2
7885931 Seo et al. Feb 2011 B2
7925703 Dinan et al. Apr 2011 B2
8077931 Chatman et al. Dec 2011 B1
8088044 Tchao et al. Jan 2012 B2
8095878 Bates et al. Jan 2012 B2
8108774 Finn et al. Jan 2012 B2
8117281 Robinson et al. Feb 2012 B2
8130219 Fleury et al. Mar 2012 B2
8131597 Hudetz Mar 2012 B2
8146005 Jones et al. Mar 2012 B2
8151191 Nicol Apr 2012 B2
8170957 Richard May 2012 B2
8199747 Rojas et al. Jun 2012 B2
8332475 Rosen et al. Dec 2012 B2
8384719 Reville et al. Feb 2013 B2
RE44054 Kim Mar 2013 E
8396708 Park et al. Mar 2013 B2
8413059 Lee et al. Apr 2013 B2
8425322 Gillo et al. Apr 2013 B2
8457367 Sipe et al. Jun 2013 B1
8458601 Castelli et al. Jun 2013 B2
8462198 Lin et al. Jun 2013 B2
8484158 Deluca et al. Jul 2013 B2
8495503 Brown et al. Jul 2013 B2
8495505 Smith et al. Jul 2013 B2
8504926 Wolf Aug 2013 B2
8559980 Pujol Oct 2013 B2
8564621 Branson et al. Oct 2013 B2
8564710 Nonaka et al. Oct 2013 B2
8581911 Becker et al. Nov 2013 B2
8597121 Andres Del Valle Dec 2013 B2
8601051 Wang Dec 2013 B2
8601379 Marks et al. Dec 2013 B2
8632408 Gillo et al. Jan 2014 B2
8648865 Dawson et al. Feb 2014 B2
8659548 Hildreth Feb 2014 B2
8683354 Khandelwal et al. Mar 2014 B2
8692830 Nelson et al. Apr 2014 B2
8718333 Wolf et al. May 2014 B2
8724622 Rojas May 2014 B2
8730231 Snoddy et al. May 2014 B2
8738719 Lee et al. May 2014 B2
8810513 Ptucha et al. Aug 2014 B2
8812171 Filev et al. Aug 2014 B2
8832201 Wall Sep 2014 B2
8832552 Arrasvuori et al. Sep 2014 B2
8839327 Amento et al. Sep 2014 B2
8874677 Rosen et al. Oct 2014 B2
8890926 Tandon et al. Nov 2014 B2
8892999 Nims et al. Nov 2014 B2
8909679 Root et al. Dec 2014 B2
8924250 Bates et al. Dec 2014 B2
8935656 Dandia et al. Jan 2015 B2
8963926 Brown et al. Feb 2015 B2
8989786 Feghali Mar 2015 B2
8995433 Rojas Mar 2015 B2
9040574 Wang et al. May 2015 B2
9055416 Rosen et al. Jun 2015 B2
9086776 Ye et al. Jul 2015 B2
9100806 Rosen et al. Aug 2015 B2
9100807 Rosen et al. Aug 2015 B2
9105014 Collet et al. Aug 2015 B2
9191776 Root et al. Nov 2015 B2
9204252 Root Dec 2015 B2
9224220 Toyoda et al. Dec 2015 B2
9241184 Weerasinghe Jan 2016 B2
9256860 Herger et al. Feb 2016 B2
9298257 Hwang et al. Mar 2016 B2
9314692 Konoplev et al. Apr 2016 B2
9330483 Du et al. May 2016 B2
9357174 Li et al. May 2016 B2
9361510 Yao et al. Jun 2016 B2
9378576 Bouaziz et al. Jun 2016 B2
9392308 Ahmed et al. Jul 2016 B2
9402057 Kaytaz et al. Jul 2016 B2
9412192 Mandel et al. Aug 2016 B2
9443227 Evans et al. Sep 2016 B2
9460541 Li et al. Oct 2016 B2
9485747 Rodoper et al. Nov 2016 B1
9489661 Evans et al. Nov 2016 B2
9489760 Li et al. Nov 2016 B2
9491134 Rosen et al. Nov 2016 B2
9503845 Vincent Nov 2016 B2
9508197 Quinn et al. Nov 2016 B2
9544257 Ogundokun et al. Jan 2017 B2
9576400 Van Os et al. Feb 2017 B2
9589357 Li et al. Mar 2017 B2
9592449 Barbalet et al. Mar 2017 B2
9635195 Green et al. Apr 2017 B1
9641870 Cormie et al. May 2017 B1
9648376 Chang et al. May 2017 B2
9697635 Quinn et al. Jul 2017 B2
9706040 Kadirvel et al. Jul 2017 B2
9744466 Fujioka Aug 2017 B2
9746990 Anderson et al. Aug 2017 B2
9749270 Collet et al. Aug 2017 B2
9792714 Li et al. Oct 2017 B2
9839844 Dunstan et al. Dec 2017 B2
9883838 Kaleal, III et al. Feb 2018 B2
9898849 Du et al. Feb 2018 B2
9911073 Spiegel et al. Mar 2018 B1
9936165 Li et al. Apr 2018 B2
9959037 Chaudhri et al. May 2018 B2
9980100 Charlton et al. May 2018 B1
9990373 Fortkort Jun 2018 B2
10039988 Lobb et al. Aug 2018 B2
10097492 Tsuda et al. Oct 2018 B2
10116598 Tucker et al. Oct 2018 B2
10155168 Blackstock et al. Dec 2018 B2
10158589 Collet et al. Dec 2018 B2
10242477 Charlton et al. Mar 2019 B1
10242503 McPhee et al. Mar 2019 B2
10262250 Spiegel et al. Apr 2019 B1
10362219 Wilson et al. Jul 2019 B2
10475225 Park et al. Nov 2019 B2
10504266 Blattner et al. Dec 2019 B2
10573048 Ni et al. Feb 2020 B2
10657701 Osman et al. May 2020 B2
10938758 Allen et al. Mar 2021 B2
20020035607 Checkoway et al. Mar 2002 A1
20020059193 Decime et al. May 2002 A1
20020067362 Agostino Nocera et al. Jun 2002 A1
20020169644 Greene Nov 2002 A1
20030206171 Kim et al. Nov 2003 A1
20050144241 Stata et al. Jun 2005 A1
20050162419 Kim et al. Jul 2005 A1
20050206610 Cordelli Sep 2005 A1
20050280660 Seo et al. Dec 2005 A1
20060031412 Adams et al. Feb 2006 A1
20060145944 Tarlton et al. Jul 2006 A1
20060294465 Ronen et al. Dec 2006 A1
20070011270 Klein et al. Jan 2007 A1
20070113181 Blattner et al. May 2007 A1
20070168863 Blattner et al. Jul 2007 A1
20070174273 Jones Jul 2007 A1
20070176921 Iwasaki et al. Aug 2007 A1
20070218987 Luchene et al. Sep 2007 A1
20070258656 Aarabi et al. Nov 2007 A1
20080097979 Heidloff et al. Apr 2008 A1
20080158222 Li et al. Jul 2008 A1
20080201638 Nair Aug 2008 A1
20080209329 Defranco et al. Aug 2008 A1
20080216092 Serlet Sep 2008 A1
20080222108 Prahlad et al. Sep 2008 A1
20080309617 Kong et al. Dec 2008 A1
20090013268 Amit Jan 2009 A1
20090016617 Bregman-amitai et al. Jan 2009 A1
20090030884 Pulfer et al. Jan 2009 A1
20090044113 Jones et al. Feb 2009 A1
20090055484 Vuong Feb 2009 A1
20090070688 Gyorfi et al. Mar 2009 A1
20090099925 Mehta et al. Apr 2009 A1
20090100367 Dargahi et al. Apr 2009 A1
20090106672 Burstrom Apr 2009 A1
20090144639 Nims et al. Jun 2009 A1
20090150778 Nicol Jun 2009 A1
20090153552 Fidaleo et al. Jun 2009 A1
20090158170 Narayanan et al. Jun 2009 A1
20090177976 Bokor et al. Jul 2009 A1
20090202114 Morin et al. Aug 2009 A1
20090228811 Adams et al. Sep 2009 A1
20090265604 Howard et al. Oct 2009 A1
20090300525 Jolliff et al. Dec 2009 A1
20090303984 Clark et al. Dec 2009 A1
20090319178 Khosravy et al. Dec 2009 A1
20090328122 Amento et al. Dec 2009 A1
20100011422 Mason et al. Jan 2010 A1
20100023885 Reville et al. Jan 2010 A1
20100083138 Dawson Apr 2010 A1
20100083140 Dawson et al. Apr 2010 A1
20100083148 Finn et al. Apr 2010 A1
20100100828 Khandelwal et al. Apr 2010 A1
20100115426 Liu et al. May 2010 A1
20100121915 Wang May 2010 A1
20100162149 Sheleheda et al. Jun 2010 A1
20100179991 Lorch et al. Jul 2010 A1
20100203968 Gill et al. Aug 2010 A1
20100227682 Reville et al. Sep 2010 A1
20100274724 Bible, Jr. et al. Oct 2010 A1
20100290756 Karaoguz et al. Nov 2010 A1
20100332980 Sun et al. Dec 2010 A1
20110047404 Metzler et al. Feb 2011 A1
20110066664 Goldman et al. Mar 2011 A1
20110093780 Dunn Apr 2011 A1
20110115798 Nayar et al. May 2011 A1
20110148864 Lee et al. Jun 2011 A1
20110153759 Rathod Jun 2011 A1
20110161076 Davis et al. Jun 2011 A1
20110202598 Evans et al. Aug 2011 A1
20110211764 Krupka et al. Sep 2011 A1
20110239136 Goldman et al. Sep 2011 A1
20110239143 Ye et al. Sep 2011 A1
20110246330 Tikku et al. Oct 2011 A1
20110249891 Li Oct 2011 A1
20110292051 Nelson et al. Dec 2011 A1
20120013770 Stafford et al. Jan 2012 A1
20120069028 Bouguerra Mar 2012 A1
20120113106 Choi et al. May 2012 A1
20120124458 Cruzada May 2012 A1
20120130717 Xu et al. May 2012 A1
20120139830 Hwang et al. Jun 2012 A1
20120209924 Evans et al. Aug 2012 A1
20120229506 Nishikawa Sep 2012 A1
20120271883 Montoya et al. Oct 2012 A1
20130031180 Abendroth et al. Jan 2013 A1
20130036165 Tseng et al. Feb 2013 A1
20130103760 Golding et al. Apr 2013 A1
20130103766 Gupta Apr 2013 A1
20130111354 Marra et al. May 2013 A1
20130155169 Hoover et al. Jun 2013 A1
20130179520 Lee et al. Jul 2013 A1
20130201187 Tong et al. Aug 2013 A1
20130249948 Reitan Sep 2013 A1
20130257877 Davis Oct 2013 A1
20130258040 Kaytaz et al. Oct 2013 A1
20130332068 Kesar et al. Dec 2013 A1
20140043329 Wang et al. Feb 2014 A1
20140055554 Du et al. Feb 2014 A1
20140085293 Konoplev et al. Mar 2014 A1
20140125678 Wang et al. May 2014 A1
20140129343 Finster et al. May 2014 A1
20140160149 Blackstock et al. Jun 2014 A1
20140362091 Bouaziz et al. Dec 2014 A1
20150086087 Ricanek, Jr. et al. Mar 2015 A1
20150123967 Quinn et al. May 2015 A1
20150169142 Longo et al. Jun 2015 A1
20150206349 Rosenthal et al. Jul 2015 A1
20150213604 Li et al. Jul 2015 A1
20150220774 Ebersman et al. Aug 2015 A1
20150234942 Harmon Aug 2015 A1
20150245168 Martin Aug 2015 A1
20150264432 Cheng Sep 2015 A1
20150295866 Collet et al. Oct 2015 A1
20150304806 Vincent Oct 2015 A1
20150347519 Hornkvist et al. Dec 2015 A1
20160045834 Burns Feb 2016 A1
20160078095 Man et al. Mar 2016 A1
20160093078 Davis et al. Mar 2016 A1
20160134840 Mcculloch May 2016 A1
20160158600 Rolley Jun 2016 A1
20160163084 Corazza et al. Jun 2016 A1
20160164823 Nordstrom et al. Jun 2016 A1
20160188997 Desnoyer et al. Jun 2016 A1
20160189310 O'kane Jun 2016 A1
20160210500 Feng et al. Jul 2016 A1
20160217292 Faaborg et al. Jul 2016 A1
20160234149 Tsuda et al. Aug 2016 A1
20160241504 Raji et al. Aug 2016 A1
20160275721 Park et al. Sep 2016 A1
20160343160 Blattner et al. Nov 2016 A1
20160350297 Riza Dec 2016 A1
20170006322 Dury et al. Jan 2017 A1
20170027528 Kaleal, III et al. Feb 2017 A1
20170039752 Quinn et al. Feb 2017 A1
20170064240 Mangat et al. Mar 2017 A1
20170080346 Abbas Mar 2017 A1
20170087473 Siegel et al. Mar 2017 A1
20170113140 Blackstock et al. Apr 2017 A1
20170118145 Aittoniemi et al. Apr 2017 A1
20170199855 Fishbeck Jul 2017 A1
20170235848 Van Deusen et al. Aug 2017 A1
20170286752 Gusarov et al. Oct 2017 A1
20170310934 Du et al. Oct 2017 A1
20170312634 Ledoux et al. Nov 2017 A1
20170324688 Collet et al. Nov 2017 A1
20170336960 Chaudhri et al. Nov 2017 A1
20180005420 Bondich et al. Jan 2018 A1
20180024726 Hviding Jan 2018 A1
20180047200 O'hara et al. Feb 2018 A1
20180091732 Wilson et al. Mar 2018 A1
20180113587 Allen et al. Apr 2018 A1
20180115503 Baldwin et al. Apr 2018 A1
20180315076 Andreou Nov 2018 A1
20180315133 Brody et al. Nov 2018 A1
20180315134 Amitay et al. Nov 2018 A1
20190001223 Blackstock et al. Jan 2019 A1
20190057616 Cohen et al. Feb 2019 A1
20190188920 Mcphee et al. Jun 2019 A1
20210266277 Allen et al. Aug 2021 A1
Foreign Referenced Citations (30)
Number Date Country
2887596 Jul 2015 CA
109863532 Jun 2019 CN
110168478 Aug 2019 CN
2184092 May 2010 EP
2001230801 Aug 2001 JP
2014006881 Jan 2014 JP
5497931 Mar 2014 JP
20040063436 Jul 2004 KR
1020050036963 Apr 2005 KR
1020120070898 Jul 2012 KR
101445263 Sep 2014 KR
WO-03094072 Nov 2003 WO
WO-2004095308 Nov 2004 WO
WO-2006107182 Oct 2006 WO
WO-2007134402 Nov 2007 WO
WO-2012139276 Oct 2012 WO
WO-2013027893 Feb 2013 WO
WO-2013152454 Oct 2013 WO
WO-2013166588 Nov 2013 WO
WO-2014031899 Feb 2014 WO
WO-2014194439 Dec 2014 WO
WO-2016054562 Apr 2016 WO
WO-2016090605 Jun 2016 WO
WO-2017173319 Oct 2017 WO
WO-2018005644 Jan 2018 WO
WO-2018006053 Jan 2018 WO
WO-2018081013 May 2018 WO
WO-2018102562 Jun 2018 WO
WO-2018129531 Jul 2018 WO
WO-2019089613 May 2019 WO
Non-Patent Literature Citations (117)
“U.S. Appl. No. 12/471,811, Advisory Action dated Mar. 28, 2012”, 6 pgs.
“U.S. Appl. No. 12/471,811, Examiner Interview Summary dated Feb. 2, 2012”, 3 pgs.
“U.S. Appl. No. 12/471,811, Examiner Interview Summary dated Apr. 18, 2011”, 3 pgs.
“U.S. Appl. No. 12/471,811, Examiner Interview Summary dated May 27, 2014”, 2 pgs.
“U.S. Appl. No. 12/471,811, Final Office Action dated Dec. 23, 2011”, 20 pgs.
“U.S. Appl. No. 12/471,811, Non Final Office Action dated Jan. 13, 2011”, 15 pgs.
“U.S. Appl. No. 12/471,811, Non Final Office Action dated Jun. 28, 2011”, 26 pgs.
“U.S. Appl. No. 12/471,811, Non Final Office Action dated Oct. 24, 2014”, 21 pgs.
“U.S. Appl. No. 12/471,811, Notice of Allowance dated Apr. 1, 2015”, 6 pgs.
“U.S. Appl. No. 12/471,811, Response filed Jan. 26, 2015 to Non Final Office Action dated Oct. 24, 2014”, 18 pgs.
“U.S. Appl. No. 12/471,811, Response filed Feb. 23, 2012 to Final Office Action dated Dec. 23, 2011”, 12 pgs.
“U.S. Appl. No. 12/471,811, Response filed Mar. 28, 2012 to Advisory Action dated Mar. 28, 2012”, 14 pgs.
“U.S. Appl. No. 12/471,811, Response filed Apr. 13, 2011 to Non Final Office Action dated Jan. 13, 2011”, 5 pgs.
“U.S. Appl. No. 12/471,811, Response filed Sep. 28, 2011 to Non Final Office Action dated Jun. 28, 2011”, 19 pgs.
“U.S. Appl. No. 13/979,974, Examiner Interview Summary dated Jun. 29, 2017”, 3 pgs.
“U.S. Appl. No. 13/979,974, Examiner Interview Summary dated Sep. 15, 2017”, 3 pgs.
“U.S. Appl. No. 13/979,974, Final Office Action dated Apr. 25, 2018”, 18 pgs.
“U.S. Appl. No. 13/979,974, Final Office Action dated Jun. 9, 2017”, 20 pgs.
“U.S. Appl. No. 13/979,974, Final Office Action dated Oct. 12, 2016”, 13 pgs.
“U.S. Appl. No. 13/979,974, Non Final Office Action dated Feb. 22, 2017”, 17 pgs.
“U.S. Appl. No. 13/979,974, Non Final Office Action dated Apr. 27, 2016”, 16 pgs.
“U.S. Appl. No. 13/979,974, Non Final Office Action dated Oct. 3, 2017”, 17 pgs.
“U.S. Appl. No. 13/979,974, Notice of Allowance dated Aug. 10, 2018”, 9 pgs.
“U.S. Appl. No. 13/979,974, Response filed Jan. 3, 2018 to Non Final Office Action dated Oct. 3, 2017”, 8 pgs.
“U.S. Appl. No. 13/979,974, Response filed May 22, 2017 to Non Final Office Action dated Feb. 22, 2017”, 10 pgs.
“U.S. Appl. No. 13/979,974, Response filed Jul. 25, 2018 to Final Office Action dated Apr. 25, 2018”, 10 pgs.
“U.S. Appl. No. 13/979,974, Response filed Jul. 26, 2016 to Non Final Office Action dated Apr. 27, 2016”, 8 pgs.
“U.S. Appl. No. 13/979,974, Response filed Sep. 11, 2017 to Final Office Action dated Jun. 9, 2017”, 8 pgs.
“U.S. Appl. No. 13/979,974, Response filed Jan. 12, 2017 to Non Final Office Action dated Apr. 27, 2016”, 8 pgs.
“U.S. Appl. No. 14/753,200, Non Final Office Action dated Oct. 11, 2016”, 6 pgs.
“U.S. Appl. No. 14/753,200, Notice of Allowance dated Apr. 27, 2017”, 7 pgs.
“U.S. Appl. No. 14/753,200, Response filed Feb. 13, 2017 to Non Final Office Action dated Oct. 11, 2016”, 9 pgs.
“U.S. Appl. No. 15/086,749, Final Office Action dated Oct. 31, 2017”, 15 pgs.
“U.S. Appl. No. 15/086,749, Non Final Office Action dated Mar. 13, 2017”, 12 pgs.
“U.S. Appl. No. 15/086,749, Non Final Office Action dated Apr. 30, 2018”, 14 pgs.
“U.S. Appl. No. 15/086,749, Response filed Apr. 2, 2018 to Final Office Action dated Oct. 31, 2017”, 14 pgs.
“U.S. Appl. No. 15/086,749, Response filed Aug. 29, 2018 to Non Final Office Action dated Apr. 30, 2018”, 12 pgs.
“U.S. Appl. No. 15/199,472, Final Office Action dated Mar. 1, 2018”, 31 pgs.
“U.S. Appl. No. 15/199,472, Final Office Action dated Nov. 15, 2018”, 37 pgs.
“U.S. Appl. No. 15/199,472, Non Final Office Action dated Jul. 25, 2017”, 30 pgs.
“U.S. Appl. No. 15/199,472, Non Final Office Action dated Sep. 21, 2018”, 33 pgs.
“U.S. Appl. No. 15/199,472, Response filed Jan. 25, 2018 to Non Final Office Action dated Jul. 25, 2017”, 13 pgs.
“U.S. Appl. No. 15/199,472, Response filed Aug. 31, 2018 to Final Office Action dated Mar. 1, 2018”, 14 pgs.
“U.S. Appl. No. 15/199,472, Response filed Oct. 17, 2018 to Non Final Office Action dated Sep. 21, 2018”, 11 pgs.
“U.S. Appl. No. 15/369,499, Final Office Action dated Jan. 31, 2019”, 22 pgs.
“U.S. Appl. No. 15/369,499, Non Final Office Action dated Aug. 15, 2018”, 22 pgs.
“U.S. Appl. No. 15/369,499, Response filed Nov. 15, 2018 to Non Final Office Action dated Aug. 15, 2018”, 10 pgs.
“U.S. Appl. No. 15/583,142, Non Final Office Action dated Oct. 25, 2018”, 14 pgs.
“U.S. Appl. No. 15/661,953, Examiner Interview Summary dated Nov. 13, 2018”, 3 pgs.
“U.S. Appl. No. 15/661,953, Non Final Office Action dated Mar. 26, 2018”, 6 pgs.
“U.S. Appl. No. 15/661,953, Notice of Allowance dated Aug. 10, 2018”, 7 pgs.
“U.S. Appl. No. 15/661,953, PTO Response to Rule 312 Communication dated Oct. 30, 2018”, 2 pgs.
“U.S. Appl. No. 15/661,953, PTO Response to Rule 312 Communication dated Nov. 7, 2018”, 2 pgs.
“U.S. Appl. No. 15/661,953, Response Filed Jun. 26, 2018 to Non Final Office Action dated Mar. 26, 2018”, 13 pgs.
“U.S. Appl. No. 16/115,259, Preliminary Amendment filed Oct. 18, 2018”, 6 pgs.
“International Application Serial No. PCT/CA2013/000454, International Preliminary Report on Patentability dated Nov. 20, 2014”, 9 pgs.
“International Application Serial No. PCT/CA2013/000454, International Search Report dated Aug. 20, 2013”, 3 pgs.
“International Application Serial No. PCT/CA2013/000454, Written Opinion dated Aug. 20, 2013”, 7 pgs.
“International Application Serial No. PCT/US2017/025460, International Preliminary Report on Patentability dated Oct. 11, 2018”, 9 pgs.
“International Application Serial No. PCT/US2017/025460, International Search Report dated Jun. 20, 2017”, 2 pgs.
“International Application Serial No. PCT/US2017/025460, Written Opinion dated Jun. 20, 2017”, 7 pgs.
“International Application Serial No. PCT/US2017/040447, International Search Report dated Oct. 2, 2017”, 4 pgs.
“International Application Serial No. PCT/US2017/040447, Written Opinion dated Oct. 2, 2017”, 6 pgs.
“International Application Serial No. PCT/US2017/057918, International Search Report dated Jan. 19, 2018”, 3 pgs.
“International Application Serial No. PCT/US2017/057918, Written Opinion dated Jan. 19, 2018”, 7 pgs.
“International Application Serial No. PCT/US2017/063981, International Search Report dated Mar. 22, 2018”, 3 pgs.
“International Application Serial No. PCT/US2017/063981, Written Opinion dated Mar. 22, 2018”, 8 pgs.
“International Application Serial No. PCT/US2018/000112, International Search Report dated Jul. 20, 2018”, 2 pgs.
“International Application Serial No. PCT/US2018/000112, Written Opinion dated Jul. 20, 2018”, 4 pgs.
“International Application Serial No. PCT/US2018/000113, International Search Report dated Jul. 13, 2018”, 2 pgs.
“International Application Serial No. PCT/US2018/000113, Written Opinion dated Jul. 13, 2018”, 4 pgs.
“International Application Serial No. PCT/US2018/030039, International Search Report dated Jul. 11, 2018”, 2 pgs.
“International Application Serial No. PCT/US2018/030039, Written Opinion dated Jul. 11, 2018”, 4 pgs.
“International Application Serial No. PCT/US2018/030043, International Search Report dated Jul. 23, 2018”, 2 pgs.
“International Application Serial No. PCT/US2018/030043, Written Opinion dated Jul. 23, 2018”, 5 pgs.
“International Application Serial No. PCT/US2018/030044, International Search Report dated Jun. 26, 2018”, 2 pgs.
“International Application Serial No. PCT/US2018/030044, Written Opinion dated Jun. 26, 2018”, 6 pgs.
“International Application Serial No. PCT/US2018/030045, International Search Report dated Jul. 3, 2018”, 2 pgs.
“International Application Serial No. PCT/US2018/030045, Written Opinion dated Jul. 3, 2018”, 6 pgs.
“International Application Serial No. PCT/US2018/030046, International Search Report dated Jul. 6, 2018”, 2 pgs.
“International Application Serial No. PCT/US2018/030046, Written Opinion dated Jul. 6, 2018”, 6 pgs.
“List of IBM Patents or Patent Applications Treated as Related, Filed Herewith.”, 2 pgs.
Broderick, Ryan, “Everything You Need to Know About Japan's Amazing Photo Booths”, [Online] Retrieved from the internet: <https://www.buzzfeed.com/ryanhatesthis/look-how-kawaii-i-am?utm_term=.kra5QwGNZ#.muYoVB7qJ>, (Jan. 22, 2016), 30 pgs.
Chan, Connie, “The Elements of Stickers”, [Online] Retrieved from the internet: <https://a16z.com/2016/06/17/stickers/>, (Jun. 20, 2016), 15 pgs.
Collet, Jean Luc, et al., “Interactive avatar in messaging environment”, U.S. Appl. No. 12/471,811, filed May 26, 2009, (May 26, 2009), 31 pgs.
Dillet, Romain, “Zenly proves that location sharing isn't dead”, URL: https://techcrunch.com/2016/05/19/zenly-solomoyolo/, (accessed Jun. 27, 2018), 6 pgs.
Leyden, John, “This SMS will self-destruct in 40 seconds”, URL: http://www.theregister.co.uk/2005/12/12/stealthtext/, (Dec. 12, 2005), 1 pg.
Rhee, Chi-Hyoung, et al., “Cartoon-like Avatar Generation Using Facial Component Matching”, International Journal of Multimedia and Ubiquitous Engineering, (Jul. 30, 2013), 69-78.
“U.S. Appl. No. 15/369,499, Response filed Mar. 14, 2019 to Final Office Action dated Jan. 31, 2019”, 12 pgs.
“U.S. Appl. No. 15/369,499, Non Final Office Action dated Jun. 17, 2019”, 17 pgs.
“U.S. Appl. No. 16/115,259, Non Final Office Action dated Jul. 30, 2019”, 21 pgs.
“U.S. Appl. No. 15/369,499, Response filed Sep. 10, 2019 to Non-Final Office Action dated Jun. 17, 2019”, 9 pgs.
“U.S. Appl. No. 15/369,499, Final Office Action dated Oct. 1, 2019”, 17 pgs.
“U.S. Appl. No. 15/369,499, Non Final Office Action dated Mar. 2, 2020”, 17 pgs.
“U.S. Appl. No. 15/369,499, Response filed Feb. 3, 2020 to Final Office Action dated Oct. 1, 2019”, 10 pgs.
“U.S. Appl. No. 16/115,259, Final Office Action dated Dec. 16, 2019”, 23 pgs.
“U.S. Appl. No. 16/115,259, Response filed Mar. 13, 2020 to Final Office Action dated Dec. 16, 2019”, 9 pgs.
“U.S. Appl. No. 16/115,259, Response filed Oct. 30, 2019 to Non Final Office Action dated Jul. 30, 2019”, 9 pgs.
“U.S. Appl. No. 15/369,499, Examiner Interview Summary dated Sep. 21, 2020”, 2 pgs.
“U.S. Appl. No. 15/369,499, Examiner Interview Summary dated Oct. 9, 2020”, 2 pgs.
“U.S. Appl. No. 15/369,499, Final Office Action dated Jun. 15, 2020”, 17 pgs.
“U.S. Appl. No. 15/369,499, Notice of Allowance dated Oct. 26, 2020”, 17 pgs.
“U.S. Appl. No. 15/369,499, Response filed Jun. 2, 2020 to Non Final Office Action dated Mar. 2, 2020”, 9 pgs.
“U.S. Appl. No. 15/369,499, Response filed Sep. 15, 2020 to Final Office Action dated Jun. 15, 2020”, 10 pgs.
“U.S. Appl. No. 16/115,259, Final Office Action dated Jul. 22, 2020”, 20 pgs.
“U.S. Appl. No. 16/115,259, Non Final Office Action dated Apr. 9, 2020”, 18 pgs.
“U.S. Appl. No. 16/115,259, Response filed Jul. 9, 2020 to Non Final Office Action dated Apr. 9, 2020”, 8 pgs.
“U.S. Appl. No. 16/115,259, Response filed Oct. 22, 2020 to Final Office Action dated Jul. 22, 2020”, 10 pgs.
“U.S. Appl. No. 15/369,499, Corrected Notice of Allowability dated Jan. 28, 2021”, 3 pgs.
“U.S. Appl. No. 16/115,259, Non Final Office Action dated Jan. 11, 2021”, 17 pgs.
“U.S. Appl. No. 16/115,259, Response filed May 11, 2021 to Non Final Office Action dated Jan. 11, 2021”, 14 pgs.
U.S. Appl. No. 17/314,963, filed May 7, 2021, Generating and Displaying Customized Avatars in Media Overlays.
“U.S. Appl. No. 16/115,259, Final Office Action dated Jul. 13, 2021”, 18 pgs.
“U.S. Appl. No. 16/115,259, Non Final Office Action dated Nov. 8, 2021”, 17 pgs.
“U.S. Appl. No. 16/115,259, Response filed Feb. 8, 2022 to Non Final Office Action dated Nov. 8, 2021”, 9 pgs.
“U.S. Appl. No. 16/115,259, Response filed Oct. 13, 2021 to Final Office Action dated Jul. 13, 2021”, 10 pgs.
“U.S. Appl. No. 17/314,963, Non Final Office Action dated Feb. 2, 2022”, 24 pgs.
Related Publications (1)
Number Date Country
20190097958 A1 Mar 2019 US
Continuations (3)
Number Date Country
Parent 15661953 Jul 2017 US
Child 16193938 US
Parent 14753200 Jun 2015 US
Child 15661953 US
Parent 12471811 May 2009 US
Child 14753200 US