A. Field of the Invention
The present invention relates generally to the field of telecommunications, and more particularly to casual collaborative conferencing.
B. Description of the Related Art
The World Wide Web (WWW), one type of service provided through the Internet, allows a user to access a universe of information which combines text, audio, graphics and animation within a hypermedia document. Links contained within a WWW document allow simple and rapid access to related documents. The WWW was developed to provide researchers with a system that would enable them to quickly access all types of information with a common interface, removing the need to execute numerous steps to access the information. During 1991, the WWW was released for general usage with access to hypertext and UseNet news articles. Interfaces to WAIS, anonymous FTP, Telnet and Gopher were added. By the end of 1993, WWW browsers with easy-to-use interfaces had been developed for many different computer systems.
UseNet is a network of news groups on thousands of different topics which allow on-line discussion through the posting of individual messages (articles) that can be read by participants. An article is similar to an e-mail message, having a header, message body and signature.
Internet Relay Chat (IRC) is an example of a program that facilitates Web chat. “Chatting” is the term used for the network equivalent of the old telephone party line. IRC is accessed through an Internet connection. This technology permits the user to chat with users from all over the world about hundreds of different subjects at any time. In a way, it is as if the UseNet newsgroups were a live discussion group rather than postings.
The word “chat” may be somewhat misleading, because persons participating in a chat session are not necessarily speaking; rather, they are typing and reading text messages that other chat participants write. Moreover, if the information communicated is not only in text form, but is real-time audio and video, chat rooms are better described by the term virtual space rooms. Once a person enters a chat room, which is really just a web page, that person can choose to only read the exchanges, known as lurking, or the person can join in and post messages.
Many chat rooms focus the conversation on specific topics, such as health, politics, and football. In that way, people with similar interests can find one another.
The first step for a person interested in joining a chat session is to locate a chat room that interests the person. Once the person is on the web site (leading to the chat room), the interested person will usually be asked to register. For privacy purposes, people do not register using their real name, but instead make up a name.
Once the person is equipped with a registration name, the person clicks a button and follows the instructions on the web site to choose a chat room, depending on the interests of the person. Joining a chat room is like walking into a room full of people talking to each other, sometimes with several conversations going on at once. Once inside the chat room, the person will probably find himself or herself in the middle of a conversation. There is no need to jump into the conversation; it is not uncommon for chat rooms to have many more lurkers than participants. As the interaction continues, new postings appear on the computer screen. When the person decides to join the conversation, all it takes is to type a message in a blank box on the screen and click a Talk button (or hit the Enter or Return key on the keyboard). Soon the message will be posted in the chat room and people may respond. In addition to chatting in a chat room where the text is broadcast to everyone in that chat room, there are ways to enter into a private chat.
A number of Internet phone software products offer voice capabilities in real time over the Internet. Internet phoneware vendors typically provide their own directory servers, organized by topic as well as by name. Voice quality varies from moment to moment. Such variations are due to the processing delay that results from encoding and decoding the conversation as well as the inherent delay of the Internet, which varies according to the amount of traffic at any given time and the route through which the signal must travel.
Web chat is, however, only one level of an area of technology known as collaborative conferencing. Collaborative conferencing is the ability of two or more individuals to work together in real time, in a coordinated manner over time and space, by using computers. Collaborative conferencing is not limited to a live text exchange, but includes data conferencing/shared whiteboard applications, group interactive document editing, and audio and video multi-point conferencing, among others.
The technique of Internet chat has the disadvantage that it is limited in the choices that individuals can make regarding the persons with whom they want to establish communication. Namely, they have to join a chat room that has a specific discussion topic, and can only pick people in that chat room with whom to engage in a private chat. To address this problem, a solution has been proposed and implemented in which matches between different individuals connected to the WWW are created. This approach, however, requires the inconvenient step of requesting information from the user in order to create a user profile, with matches then performed based on those profiles.
Therefore, there is a need in the art for a system that offers more flexibility to individuals to choose other individuals with whom they want to engage in a conversation, the conversation not being limited to a conventional Internet chat (text).
Accordingly, it is an object of the present invention to meet the foregoing needs by providing systems and methods that efficiently enable real-time communication among two or more individuals separated in space.
Specifically, a method for meeting the foregoing needs is disclosed. The method includes the steps of determining that a first individual is likely to be interested in communicating with a second individual via a first communications link; retrieving information via the first communications link about one or more additional individuals from electronic memory means associated with the second individual; and establishing communication with at least one of the additional individuals based on the retrieved information.
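To make the three recited steps concrete, the following Python sketch is offered as an illustration only; the class and function names (Individual, DirectoryRecord, is_likely_interested, and so on) are assumptions introduced for this example and are not part of the disclosure.

```python
# Hypothetical sketch of the disclosed three-step method. Data structures and
# names are illustrative assumptions, not the specification's implementation.

from dataclasses import dataclass, field


@dataclass
class DirectoryRecord:
    """One record in an individual's personal directory (cf. records 300-304)."""
    name: str
    contact: str
    public: bool = True  # simplified stand-in for per-record permissions


@dataclass
class Individual:
    name: str
    directory: list = field(default_factory=list)  # personal directory (cf. directory 20)


def is_likely_interested(first: Individual, second: Individual) -> bool:
    """Step 1: determine, e.g. after prior communication over a first link,
    that the first individual is likely interested in the second (stubbed)."""
    return True


def retrieve_additional_individuals(second: Individual) -> list:
    """Step 2: retrieve information about additional individuals from the
    electronic memory associated with the second individual, limited to
    records that the requesting user is permitted to see."""
    return [rec for rec in second.directory if rec.public]


def establish_communication(first: Individual, records: list) -> list:
    """Step 3: establish communication with at least one additional individual
    based on the retrieved information (placeholder: return their contacts)."""
    return [rec.contact for rec in records]
```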
Both the foregoing general description and the following detailed description provide examples and explanations only. They do not restrict the claimed invention.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and, together with the description, explain the advantages and principles of the invention.
Reference will now be made to preferred embodiments of this invention, examples of which are shown in the accompanying drawings and will be obvious from the description of the invention. In the drawings, the same reference numbers represent the same or similar elements in the different drawings whenever possible.
Systems and methods consistent with the present invention perform collaborative conferencing by using recursive identification of individuals. For purposes of the following description, the systems and methods consistent with the present invention are mainly described with respect to Internet chat. The description should be understood to apply to other levels or modes of operation in a collaborative conferencing system, such as a casual collaborative conversation with persons in a virtual space room.
In the system 100, user A determines that user B is a person likely to be interesting enough to engage with in a casual collaborative conversation. That is, if user A believes that he or she shares common interests with user B, user A will engage in collaborative conferencing with user B. This determination is made after obtaining information about user B; the information is obtained by communicating with user B. The manner in which user A communicates with user B in order to determine whether he or she is likely to be interested in communicating with user B (possibly via some other communication means or links) includes, but is not limited to, telephonic conversations, e-mail, voice mail, real-time video, and real-time text.
Once user A determines that he or she is likely to be interested in communicating with user B, user A targets or spots user B when user B enters a chat room or virtual space room. User A will then see user B appear on his or her computer screen (208).
Unlike conventional methods of matchmaking in a chat room context, user A does not rely on a computer program to pick interesting persons for him or her. Instead, user A relies on user B's personal directory 20 as a starting point to find more interesting persons. User A accesses some of the information contained in directory 20 about other users with collaborative conferencing capability, with whom user B communicates. This technique is called recursive identification of individuals. The information that user A can access is limited according to permissions assigned to each record in the directory by user B.
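The recursive aspect can be illustrated with a short, hedged sketch: starting from user B's directory, user A may in turn inspect the permitted records of the users found there, and so on. The helper names and the record format below are assumptions made for this example.

```python
# Illustrative sketch only: depth-limited walk over personal directories,
# starting from one interesting user, to find further candidates. Real
# permission checks would be enforced by the collaborative conferencing system.

def recursive_identification(start_user, get_permitted_records, max_depth=2):
    """Breadth-first, depth-limited traversal of personal directories.

    start_user            -- the user whose directory is inspected first (e.g., user B)
    get_permitted_records -- callable returning only the records the requesting
                             user is allowed to see for a given user
    max_depth             -- how many directory "hops" to follow
    """
    found = []
    seen = {start_user}
    frontier = [start_user]
    for _ in range(max_depth):
        next_frontier = []
        for user in frontier:
            for record in get_permitted_records(user):
                candidate = record["user"]      # assumed record field
                if candidate not in seen:
                    seen.add(candidate)
                    found.append(record)
                    next_frontier.append(candidate)
        frontier = next_frontier
    return found
```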
There are different levels of permissions that user B can assign to the user records (300-304) in the directory 20. Because any other user of the system of the present invention can get access to some of this information, user B (12) assigns access permissions to the records 300-304. These permissions define how much information can be accessed by the other users via their respective communications means (10 and 16-19).
One level of access corresponds to the type of service that is used within the system.
Other levels of permissions include, but are not limited to, giving the public access to the entire directory 20, giving specific persons access to the entire directory 20, giving the public access to information contained in some of the records 300-304, and giving specific persons access to information contained in some of the records 300-304.
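As a simplified, hypothetical model of these permission levels (the specification names the kinds of permissions, not their encoding), a per-record check might look like the following; the enumeration values and field names are assumptions.

```python
# Hypothetical per-record access-permission model for records such as 300-304.

from enum import Enum, auto


class Access(Enum):
    PUBLIC = auto()            # any user of the system may read the record
    SPECIFIC_PERSONS = auto()  # only listed persons may read the record
    SERVICE_LIMITED = auto()   # visible only through a given type of service


def can_read(record, requesting_user, service):
    """Return True if requesting_user, connecting via the given service,
    may read the record (record is assumed to be a dict)."""
    level = record["access"]
    if level is Access.PUBLIC:
        return True
    if level is Access.SPECIFIC_PERSONS:
        return requesting_user in record.get("allowed_users", set())
    if level is Access.SERVICE_LIMITED:
        return service in record.get("allowed_services", set())
    return False
```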
The directory 20 can be created by user B manually. That is, user B can gather a list of names of individuals that he or she communicates with and enter that list into the directory 20. In the present invention, an alternative to manually creating the directory is to have the software that enables collaborative conferencing create the directory 20 for the user. The software has a routine that monitors the communication between user B and other users (e.g., C-F) and adds to the directory 20 information about the users that communicate with user B. As an option, the software can sort the information in the directory 20 according to the frequency of the communications between user B and the individuals named in the directory 20. Moreover, another option is to automatically delete information from the directory 20 when the software determines that a person who does not communicate frequently with user B has not actually communicated with user B for a specified period of time. For example, the software could look at the sorted directory 20 and determine whether the individual whose information is at the bottom of the directory (lowest frequency) has communicated with user B in the past two months. If the person at the bottom has not done so, that person's information is deleted from the directory 20. The period of two months is only an example of a parameter that can be adjusted according to the directory owner's preferences.
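The optional sorting and pruning behavior can be sketched as follows; this is a minimal illustration in which the field names and the sixty-day stand-in for the two-month example are assumptions.

```python
# Sketch of the optional automatic directory maintenance: sort entries by
# communication frequency and drop the bottom entry if its owner has not
# communicated within a configurable period (two months is only the example
# given in the description; 60 days is used as a stand-in here).

from datetime import datetime, timedelta


def update_directory(directory, now=None, stale_after=timedelta(days=60)):
    """directory: list of dicts with 'frequency' (int) and 'last_contact' (datetime)."""
    now = now or datetime.now()
    # Sort so the most frequent contacts appear first.
    directory.sort(key=lambda rec: rec["frequency"], reverse=True)
    # Examine the least frequent entry (bottom of the sorted directory) and
    # delete it if there has been no communication within the stale period.
    if directory and now - directory[-1]["last_contact"] > stale_after:
        directory.pop()
    return directory
```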
The computer 202 displays an image of only those users that have been determined to be of interest to user A (10).
The speaker 204 is used for listening to voice messages sent by the users in the virtual space room. On the other hand, the microphone 210 is used to send voice messages to users in the virtual space room. These voice messages are either voice mail messages, stored either locally in the computer 202 or in some other recording means, or real-time voice messages (i.e., real-time telephony).
The camera 212 is used to capture an image of user A, which is presumably displayed on the computer screens associated with other users participating in the virtual space room. The camera 212 is turned off when user A does not desire to transmit an image of himself or herself. It is possible to have a participant in the virtual space room who does not want his or her image displayed. For example, a chat window 224 displays interactive text communications between user B and user A. As seen from the display, an image of user B is not shown on the screen 208. The chat window 224 can be used by any of the users in the virtual space room, and its use is limited to displaying text messages from all of the parties, as in a conventional chat room.
When user A decides to communicate via interactive text, he or she types the message on the keyboard 206. The user can edit the entered text, which is displayed in the window 228. After the changes have been entered, the text is displayed in the chat window 224 when user A clicks the button 226 displayed on the screen 208.
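A minimal sketch of this text-entry flow, assuming a hypothetical client class, is shown below: typed text is staged in an edit buffer (corresponding to the window 228) and only posted to the shared chat window (corresponding to the window 224) when the button is pressed.

```python
# Hypothetical client-side sketch of the edit-then-post flow. Class and
# method names are assumptions introduced for this illustration.

class ChatClient:
    def __init__(self):
        self.edit_buffer = ""   # text being edited (cf. window 228)
        self.chat_window = []   # lines shown in the shared chat window (cf. window 224)

    def type_text(self, text):
        """Append keystrokes to the editable buffer."""
        self.edit_buffer += text

    def press_talk(self, author):
        """Post the edited text to the chat window and clear the buffer."""
        if self.edit_buffer:
            self.chat_window.append(f"{author}: {self.edit_buffer}")
            self.edit_buffer = ""
```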
After the first user has determined likely interesting persons and has accessed the directory of a first likely interesting person, the first user establishes communication with the persons who are determined to be likely interesting. This communication takes place in a virtual space room context.
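Tying the flow together, the following hedged sketch shows one way the first user might bring the identified persons into a shared virtual space room; the function name and the invite callable are assumptions, not the disclosed implementation.

```python
# Illustrative sketch: after identifying likely interesting persons from a
# directory, invite them into a shared virtual space room session.

def start_virtual_space_room(initiator, interesting_persons, invite):
    """initiator: the first user; interesting_persons: contacts retrieved from
    the directory; invite: callable that actually establishes each link
    (e.g., real-time text, audio, or video)."""
    room = {"host": initiator, "participants": [initiator]}
    for person in interesting_persons:
        if invite(person, room):
            room["participants"].append(person)
    return room
```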
The foregoing description of preferred embodiments of the present invention provides illustration and description, but is not intended to be exhaustive or to limit the invention to the precise form disclosed. Modifications and variations are possible in light of the above teachings or may be acquired from practice of the invention. The scope of the invention is defined by the claims and their equivalents.
This application is a continuation of prior U.S. application Ser. No. 13/429,128 filed Mar. 23, 2012, which is a continuation of U.S. application Ser. No. 12/950,749 filed Nov. 19, 2010, now U.S. Pat. No. 8,442,199, which is a continuation of U.S. application Ser. No. 12/605,168, filed Oct. 23, 2009, now U.S. Pat. No. 7,860,229, which is a continuation of prior U.S. application Ser. No. 10/625,493, filed Jul. 23, 2003, now U.S. Pat. No. 7,627,102, which is a continuation of prior U.S. application Ser. No. 09/371,781 filed on Aug. 10, 1999, now U.S. Pat. No. 6,721,410, which are hereby incorporated by reference.
Number | Name | Date | Kind |
---|---|---|---|
5598351 | Chater et al. | Jan 1997 | A |
5781614 | Brunson | Jul 1998 | A |
5832229 | Tomoda et al. | Nov 1998 | A |
5841966 | Irribarren | Nov 1998 | A |
5872923 | Schwartz et al. | Feb 1999 | A |
5894556 | Grimm et al. | Apr 1999 | A |
5907677 | Glenn et al. | May 1999 | A |
5920692 | Nguyen et al. | Jul 1999 | A |
6029195 | Herz | Feb 2000 | A |
6119178 | Martin et al. | Sep 2000 | A |
6148067 | Leipow | Nov 2000 | A |
6175831 | Weinreich et al. | Jan 2001 | B1 |
6269369 | Robertson | Jul 2001 | B1 |
6317781 | De Boor et al. | Nov 2001 | B1 |
6393423 | Goedken | May 2002 | B1 |
6721410 | Will | Apr 2004 | B1 |
7349907 | Celik | Mar 2008 | B2 |
7627102 | Will | Dec 2009 | B2 |
7739139 | Robertson et al. | Jun 2010 | B2 |
7860229 | Will | Dec 2010 | B2 |
8442199 | Will | May 2013 | B2 |
8542811 | Will | Sep 2013 | B2 |
8625768 | Will | Jan 2014 | B2 |
20050149487 | Celik | Jul 2005 | A1 |
20070083594 | Ludwig et al. | Apr 2007 | A1 |
20090282121 | Robertson et al. | Nov 2009 | A1 |
20100153504 | Will | Jun 2010 | A1 |
20110153747 | Will | Jun 2011 | A1 |
20120185535 | Will | Jul 2012 | A1 |
20120185548 | Will | Jul 2012 | A1 |
Entry |
---|
Non-Final Office Action for U.S. Appl. No. 09/371,781, mailed Sep. 6, 2002, 8 pages. |
Final Office Action for U.S. Appl. No. 09/371,781, mailed Apr. 23, 2003, 9 pages. |
Notice of Allowance for U.S. Appl. No. 09/371,781, mailed Nov. 20, 2003, 8 pages. |
Non-Final Office Action for U.S. Appl. No. 10/625,493, mailed Dec. 14, 2006, 8 pages. |
Final Office Action for U.S. Appl. No. 10/625,493, mailed Aug. 9, 2007, 11 pages. |
Non-Final Office Action for U.S. Appl. No. 10/625,493, mailed Jan. 28, 2008, 9 pages. |
Non-Final Office Action for U.S. Appl. No. 10/625,493, mailed Aug. 21, 2008, 5 pages. |
Final Office Action for U.S. Appl. No. 10/625,493, mailed Jan. 7, 2009, 7 pages. |
Notice of Allowance for U.S. Appl. No. 10/625,493, mailed Jul. 23, 2009, 4 pages. |
Notice of Allowance for U.S. Appl. No. 12/605,168, mailed Oct. 19, 2010, 7 pages. |
Non-Final Office Action for U.S. Appl. No. 12/950,749, mailed Oct. 1, 2012, 9 pages. |
Notice of Allowance for U.S. Appl. No. 12/950,749, mailed Jan. 18, 2013, 5 pages. |
Non-Final Office Action for U.S. Appl. No. 13/429,128, mailed May 9, 2013, 7 pages. |
Notice of Allowance for U.S. Appl. No. 13/429,128, mailed Sep. 4, 2013, 6 pages. |
Non-Final Office Action for U.S. Appl. No. 13/429,142, mailed Jan. 16, 2013, 6 pages. |
Notice of Allowance for U.S. Appl. No. 13/429,142, mailed May 24, 2013, 6 pages. |
Number | Date | Country | |
---|---|---|---|
20140108554 A1 | Apr 2014 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13429128 | Mar 2012 | US |
Child | 14137420 | US | |
Parent | 12950749 | Nov 2010 | US |
Child | 13429128 | US | |
Parent | 12605168 | Oct 2009 | US |
Child | 12950749 | US | |
Parent | 10625493 | Jul 2003 | US |
Child | 12605168 | US | |
Parent | 09371781 | Aug 1999 | US |
Child | 10625493 | US |