The present invention relates generally to systems and methods for unified communication systems, and, in particular embodiments, to a system and method for automatic context detection, sharing, and storage in real-time communication systems.
Generally, unified communications (UC) refers to the combination of real-time communication systems (e.g., instant messaging, telephony, video conferencing, and the like) and non-real-time communication systems (e.g., voicemail, e-mail, short message services, and the like). UC systems allow a user to consolidate communications from various sources, improving organizational efficiency and productivity. However, while existing systems may allow a user to monitor and categorize the subject matter of past communications, the ability to automatically detect the subject matter of a communication in real time and share the relevant subject matter is not currently available.
These and other problems are generally solved or circumvented, and technical advantages are generally achieved, by preferred embodiments of the present invention which provide a system and method for automatic context detection, sharing, and storage in real-time communication systems.
In accordance with an embodiment, a method for real-time communications includes creating, by a processor, a plurality of contexts for a first user, creating a list of potential contexts in accordance with the plurality of contexts when the first user and a second user engage in a real-time communication, and displaying the list of potential contexts to the first user.
In accordance with another embodiment, a method for real-time communications includes creating, by a processor, a datastore of contexts for a first user, wherein each context in the datastore includes information on communications, activities, or a combination thereof related to the context, information on one or more objects related to the context, and information on other users related to the context, generating a list of possible contexts when the first user and a second user engage in a real-time communication based on the datastore of contexts, displaying the list of possible contexts to the first user, determining an applicable context for the real-time communication from the list of possible contexts, and sharing the applicable context with the first user and the second user.
In accordance with yet another embodiment, a unified communications (UC) device includes a processor, and a computer readable storage medium storing programming for execution by the processor, the programming including instructions to create a datastore of contexts for a first user, generate a list of possible contexts when the first user and a second user engage in a real-time communication based on the datastore of contexts, display the list of possible contexts to the first user, determine an applicable context for the real-time communication from the list of possible contexts, and share the applicable context with the first user and the second user.
For a more complete understanding of the present invention, and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
The making and using of embodiments are discussed in detail below. It should be appreciated, however, that the present invention provides many applicable inventive concepts that can be embodied in a wide variety of specific contexts. The specific embodiments discussed are merely illustrative of specific ways to make and use the invention, and do not limit the scope of the invention.
Various embodiments will be described with respect to a specific context, namely a unified communications system approach to real-time communications systems. Various embodiments may also be applied, however, to other forms of real-time communications such as social media (e.g., Facebook, Twitter, etc.), interactive television, social gaming, contact center and customer relationship management (CRM) systems, and the like.
Background engine 102 generates contexts for a user's communications and activities based on a monitored set of circumstances and/or facts regarding the communications and activities (e.g., the content, time, location, situation, and the like). Each context includes information regarding communications or activities related to a particular subject matter (e.g., a theme or topic). Communications/activities with the same context are grouped together based on the monitored set of circumstances/facts of the communication/activity itself. For example, background engine 102 monitors an email and determines it relates to context A based on the email's subject line. Background engine 102 also monitors a document and determines it also relates to context A based on the document's contents. Background engine 102 creates information in context A indicating that the document and the email are related to context A. Context A, including this relationship information, is stored in datastore 104. Each context contains relationship information linking past, present, and future communications/activities together. Contexts may also be distinguished by time period even when they pertain to similar subject matter.
Each context also includes information on objects (e.g., an email, a voicemail, a document, and the like) related to the context. This information on related objects is also derived from the monitored communications/activities. A context may include copies of related objects in datastore 104, or the context may contain reference links to the original objects stored elsewhere.
Furthermore, each context includes related user information derived from the monitored communications/activities. For example, background engine 102, working on behalf of user 1, determines that an email between user 1 and user 2 relates to context A. Background engine 102 then stores information for context A indicating that user 2 is related to context A. Each context therefore relates to a particular topic and includes information on past, present, and future communications, activities, objects, and other users related to the topic. A communication/activity may be related to multiple contexts. For example, a call may cover several different topics. Each context relevant to a particular communication/activity includes information about that communication/activity.
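For illustration purposes only, a context record of the kind described above might be sketched as follows. The names Context, BackgroundEngine, and record are hypothetical and do not describe an actual implementation of background engine 102 or datastore 104.

```python
# Minimal sketch of a context record; all names are hypothetical illustrations.
from dataclasses import dataclass, field

@dataclass
class Context:
    topic: str                                            # the subject matter, e.g. "Project A"
    communications: list = field(default_factory=list)    # emails, calls, IMs related to the topic
    objects: list = field(default_factory=list)           # documents, voicemails, etc. (copies or reference links)
    related_users: set = field(default_factory=set)       # other users linked to the topic

class BackgroundEngine:
    """Groups monitored communications/activities into contexts by topic."""
    def __init__(self):
        self.datastore = {}   # stands in for datastore 104: topic -> Context

    def record(self, topic, item, kind, participants=()):
        ctx = self.datastore.setdefault(topic, Context(topic))
        if kind == "object":
            ctx.objects.append(item)          # e.g. a document, or a reference link to it
        else:
            ctx.communications.append(item)   # e.g. an email or a call record
        ctx.related_users.update(participants)

# Example: an email and a document are both linked to context "A".
engine = BackgroundEngine()
engine.record("A", "email: 'Re: context A budget'", "communication", participants={"user2"})
engine.record("A", "document: context_A_plan.docx", "object")
print(engine.datastore["A"])
```

In this sketch, storing copies versus reference links in the objects list corresponds to the two storage options noted above.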
A user may select which forms of communications/activities background engine 102 does or does not monitor.
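For illustration, such a preference could amount to a simple filter on monitored channels; the channel names and the should_monitor helper below are assumptions, not features of the embodiments.

```python
# Hypothetical monitoring preferences; only user-selected channels are inspected.
MONITORED_CHANNELS = {"email", "instant_message", "calendar"}

def should_monitor(channel):
    return channel in MONITORED_CHANNELS

print(should_monitor("email"))      # True
print(should_monitor("voicemail"))  # False: the user opted this channel out
```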
When a user receives a real-time communication (e.g., a phone call or an instant message) from another user, UC 100 displays a list of possible contexts the communication may be related to. These possible contexts are derived from the information in the contexts stored in datastore 104. The list may be weighted so that the context with the highest probability of being relevant is displayed first, the context with the next highest probability is displayed second, and so on. The weighting of the list may be based on the frequency of a potential context (i.e., how often communications between the relevant users occur), the freshness of a potential context (e.g., the last time activity between the relevant users occurred or the next time activity between the users is scheduled), the number of times the context has been labeled as important (e.g., flagged as important in an email, voicemail, etc.), the number of times other relevant users (e.g., users on the same team) have had communications about the context or a similar context, and the like. This weighted list may also be updated while the real-time communication takes place. For example, background engine 102 may determine that a potential context has a higher probability of being relevant based on the monitored content of the real-time communication. Background engine 102 would then update the weighted list to reflect this determination. The user may choose which potential context in the list applies to the real-time communication. Alternatively, as the real-time communication takes place, background engine 102 monitors the communication and determines the appropriate context.
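As a minimal sketch of how such a weighting could be computed (the linear scoring formula and the weight values are assumptions made purely for illustration), each candidate context could receive a score combining frequency, freshness, importance flags, and team activity, and the list could be re-ranked as new content is monitored during the communication.

```python
# Hypothetical scoring of candidate contexts for an incoming real-time communication.
# The weights and the linear formula are illustrative assumptions.
import time

def score_context(ctx_stats, now=None):
    now = now or time.time()
    days_since_activity = (now - ctx_stats["last_activity_ts"]) / 86400.0
    freshness = 1.0 / (1.0 + days_since_activity)          # more recent -> higher
    return (2.0 * ctx_stats["communication_count"]         # frequency of contact
            + 5.0 * freshness                              # freshness of the context
            + 3.0 * ctx_stats["importance_flags"]          # times flagged as important
            + 1.0 * ctx_stats["team_mentions"])            # related users' communications

def rank_candidates(candidates):
    """Return candidate context names ordered by descending score."""
    return sorted(candidates, key=lambda name: score_context(candidates[name]), reverse=True)

candidates = {
    "Context A": {"communication_count": 12, "last_activity_ts": time.time() - 2 * 86400,
                  "importance_flags": 3, "team_mentions": 4},
    "Context B": {"communication_count": 2, "last_activity_ts": time.time() - 30 * 86400,
                  "importance_flags": 0, "team_mentions": 1},
}
print(rank_candidates(candidates))   # ['Context A', 'Context B']
```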
Once the appropriate context of a real-time communication is determined, UC 100 may display other users that are related to the context. If the other users are brought into the real-time communication, the appropriate context is shared with them as well.
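For illustration (the helpers suggest_related_users and bring_into_call are hypothetical), the related users of a determined context could be proposed and, when brought into the communication, added to the set that receives the shared context.

```python
# Hypothetical handling of related users once a context has been determined.
def suggest_related_users(context, current_participants):
    """Users linked to the context who are not yet on the real-time communication."""
    return set(context["related_users"]) - set(current_participants)

def bring_into_call(context, participants, new_user):
    participants.add(new_user)                                # the user joins the call
    context.setdefault("shared_with", set()).add(new_user)    # and receives the context

ctx = {"related_users": {"user2", "user3"}}
on_call = {"user1", "user2"}
print(suggest_related_users(ctx, on_call))   # {'user3'}
bring_into_call(ctx, on_call, "user3")
print(ctx["shared_with"])                    # {'user3'}
```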
Once an appropriate context is determined, UC 100 shares objects relevant to the context with all the users engaged in the real-time communication. Again, UC 100 knows which objects are relevant to a particular context based on the information stored in datastore 104.
Following the real-time communication, the appropriate context is updated in datastore 104 to include the information from the communication.
In step 504, an appropriate context for the real-time communication is determined. The appropriate context may be selected manually by the user from the list of potential contexts. Alternatively, the appropriate context may be determined by UC 100 automatically based on the monitored content of the real-time communication.
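A simple way to picture the automatic determination in step 504 (purely illustrative; the keyword-overlap heuristic below is an assumption, not the method of the embodiments) is to compare the monitored content against terms already associated with each candidate context.

```python
# Hypothetical automatic selection of a context from monitored call content.
def pick_context(transcript, candidate_keywords):
    """Return the candidate whose keywords overlap the transcript the most."""
    words = set(transcript.lower().split())
    best, best_overlap = None, 0
    for name, keywords in candidate_keywords.items():
        overlap = len(words & {k.lower() for k in keywords})
        if overlap > best_overlap:
            best, best_overlap = name, overlap
    return best   # None if nothing matched; the user could then choose manually

candidates = {"Context A": {"budget", "plan", "launch"},
              "Context B": {"hiring", "interview"}}
print(pick_context("let's review the launch plan and budget", candidates))  # Context A
```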
In step 506, the UC shares the context, including relevant objects, with all the users on the communication. The UC determines which objects are relevant based on the context's stored information. The UC may share these objects by physically transferring the objects to each user. Alternatively, the UC may simply share referential links to the objects stored in a commonly accessible space (e.g., a shared drive or a cloud).
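The two sharing strategies in step 506 could, for illustration, be sketched as follows; the share_objects helper and the shared URL are hypothetical.

```python
# Hypothetical object sharing for step 506: physical copy vs. reference link.
def share_objects(context_objects, users, by_reference=True,
                  shared_base="https://shared.example/ctx"):
    """Return, per user, either reference links or copies of the objects."""
    shared = {}
    for user in users:
        if by_reference:
            # Hand out links to objects kept in a commonly accessible space.
            shared[user] = [f"{shared_base}/{obj}" for obj in context_objects]
        else:
            # "Physically" transfer: give each user their own copy of the object list.
            shared[user] = list(context_objects)
    return shared

print(share_objects(["context_A_plan.docx", "budget.xlsx"], ["user1", "user2"]))
```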
In step 508, the UC updates the appropriate context with information about the communication. For example, if new users (i.e., users that were previously uninvolved with the context) were on the call, the UC would update the context to include information on the new users.
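Step 508 could, for illustration, amount to merging the communication record and its participants into the stored context; the update_context helper below is an assumption.

```python
# Hypothetical update of a context after the communication ends (step 508).
def update_context(context, call_record, participants):
    """Add the call and any previously uninvolved users to the stored context."""
    context.setdefault("communications", []).append(call_record)
    known = context.setdefault("related_users", set())
    new_users = set(participants) - known
    known |= new_users
    return new_users   # the users step 508 newly associates with the context

ctx = {"topic": "Context A", "related_users": {"user1", "user2"}}
print(update_context(ctx, "conference call record", {"user1", "user2", "user3"}))  # {'user3'}
```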
In an alternative embodiment, UC 100 is implemented on a processing system. The processing system may comprise a processing unit equipped with one or more input/output devices, such as a display, keyboard, printer, mouse, and the like. The processing unit may include a central processing unit (CPU), memory, a mass storage device, a video adapter, and an I/O interface connected to a bus.
The bus may be one or more of any type of several bus architectures, including a memory bus or memory controller, a peripheral bus, a video bus, or the like. The CPU may comprise any type of electronic data processor. The memory may comprise any type of system memory such as static random access memory (SRAM), dynamic random access memory (DRAM), synchronous DRAM (SDRAM), read-only memory (ROM), a combination thereof, or the like. In an embodiment, the memory may include ROM for use at boot-up and DRAM for program and data storage for use while executing programs.
The mass storage device may comprise any type of storage device configured to store data, programs, and other information and to make the data, programs, and other information accessible via the bus. The mass storage device may comprise, for example, one or more of a solid state drive, a hard disk drive, a magnetic disk drive, an optical disk drive, or the like.
The video adapter and the I/O interface provide interfaces to couple external input and output devices to the processing unit. As illustrated, examples of input and output devices include the display coupled to the video adapter and the mouse/keyboard/printer coupled to the I/O interface. Other devices may be coupled to the processing unit, and additional or fewer interface cards may be utilized. For example, a serial interface card (not shown) may be used to provide a serial interface for a printer.
The processing unit also includes one or more network interfaces, which may comprise wired links, such as an Ethernet cable or the like, and/or wireless links to access nodes or different networks. The network interface allows the processing unit to communicate with remote units via the networks. For example, the network interface may provide wireless communication via one or more transmitters/transmit antennas and one or more receivers/receive antennas. In an embodiment, the processing unit is coupled to a local-area network or a wide-area network for data processing and communications with remote devices, such as other processing units, the Internet, remote storage facilities, or the like.
While this invention has been described with reference to illustrative embodiments, this description is not intended to be construed in a limiting sense. Various modifications and combinations of the illustrative embodiments, as well as other embodiments of the invention, will be apparent to persons skilled in the art upon reference to the description. It is therefore intended that the appended claims encompass any such modifications or embodiments.