This application is related to application Ser. Nos. 12/749,028, 12/749,058, 12/749,094, 12/749,123, 12/749,150, 12/749,178, and 12/749,103, filed on Mar. 29, 2010, each of which is herein incorporated by reference.
1. Technical Field
The present disclosure relates to telecommunications and more specifically to displaying and managing communication sessions via a graphical user interface (GUI). Communication sessions can exist in a variety of modes, such as telephone calls, instant messaging sessions, email sessions, video conference sessions, multi-media sessions, and the like.
2. Introduction
Touchtone telephones have been supplemented over the years by the addition of feature buttons and menus. Interfaces for these features have evolved from simple buttons to hierarchical menus actuated by trackballs, quadrant-style pointers, and the like. As the number of features increases, the interfaces add more buttons, sequences, and/or combinations of button presses. This proliferation of features has led to a multitude of different interfaces with varying levels of complexity. Often users resort to rote memorization of key features, but that is not always practical or desirable. Recently, smartphones with touch-sensitive displays have begun to provide similar functionality. However, such devices typically reproduce the same feature buttons and menus, albeit on a touch-sensitive display.
Further, users are migrating to other communication forms, such as text messaging, instant messaging, email, chat sessions, video conferencing, and so forth. Incorporating the ability to handle these modes of communication into a traditional telephone increases the complexity and difficulty manyfold. What is needed in the art is a more intuitive communication management interface.
In order to describe the manner in which the above-recited and other advantages and features of the disclosure can be obtained, a more particular description of the principles briefly described above will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings. Understanding that these drawings depict only exemplary embodiments of the disclosure and are not therefore to be considered to be limiting of its scope, the principles herein are described and explained with additional specificity and detail through the use of the accompanying drawings in which:
Various embodiments of the disclosure are discussed in detail below. While specific implementations are discussed, it should be understood that this is done for illustration purposes only. A person skilled in the relevant art will recognize that other components and configurations may be used without departing from the spirit and scope of the disclosure.
The present disclosure addresses the need in the art for improved communication session management. A companion case, U.S. patent application Ser. No. 12/749,028, filed on Mar. 29, 2010, discloses a graphical interface which enables a user to set up a communication session with various users and tear down or remove users from a communication session. A system, method, and non-transitory computer-readable media are disclosed which in each respective embodiment relate to graphical user interfaces for managing various types of communication sessions quickly and efficiently based on a graphical user interface having communication-related widgets. In the system embodiment, the system displays to the user on a graphical user interface a set of connected graphical elements representing a structure of a particular communication session or group of communication sessions. This disclosure focuses on a mode-neutral communications graphical interface in which icons or images of participants in a communication session can be connected by a user graphically adding a communication widget to the respective icons. The communication widgets can relate to various modes of communication, such as telephone, conference call, video conference, web conference, IM session, email, and so forth. The communication mode can also be deleted, changed, or otherwise modified by managing the use of the widgets in the interface. A brief introductory description of a basic general-purpose system, with reference to FIG. 1, will be provided, followed by a more detailed description of the graphical interface embodiments.
The graphical interface 200 of FIG. 2 illustrates one such communication session.
The communication session is also agnostic with respect to the mode of communication. The same metaphor of a connected user in a communication session being displayed on the graphical interface can represent a called/calling user, an instant messaging (IM) user, an email user, a user connecting via video conferencing, and so forth. The presentation of the graphical elements, how they are connected, and how the user interacts with the elements all vary depending on the needs and current active context of the communication session. For example, the elements can include text, titles, positions, data about each user, etc., and the connection metaphor between users can also represent information such as the type of connection (phone, video, web conference, etc.), the quality of the connection (low-band, high-band, etc.), a hierarchy of how participants are related to the primary user (friend, associate, acquaintance, un-trusted user, etc.), a status of the connection (active, inactive, on-hold, etc.), and so forth. For example, a user can select a contact and then use the same type of user input (drag and drop, flicking, gestures, etc.) to initiate any of the communication modes with that contact. The user does not have to know or learn different input mechanisms for different communication modes. These variations shall be discussed herein as the various embodiments are set forth. The disclosure now turns to FIG. 1.
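The mode-agnostic session described above can be pictured as a small data structure: participants joined to a session through connections that carry modality and status metadata, with the same join/leave operations regardless of mode. The following Python sketch is purely illustrative; all class and field names are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class Connection:
    modality: str           # e.g. "phone", "video", "im", "email"
    status: str = "active"  # e.g. "active", "inactive", "on-hold"
    quality: str = "high-band"

@dataclass
class Participant:
    name: str
    connections: list = field(default_factory=list)

class CommunicationSession:
    """Mode-neutral: the same operations apply to every modality."""

    def __init__(self):
        self.participants = {}

    def join(self, name, modality):
        # Joining by phone, IM, or video is the same operation;
        # only the modality tag on the connection differs.
        p = self.participants.setdefault(name, Participant(name))
        p.connections.append(Connection(modality))
        return p

    def leave(self, name):
        # Leaving is likewise identical for every modality.
        self.participants.pop(name, None)

session = CommunicationSession()
session.join("Frank Grimes", "phone")
session.join("Max Power", "im")
session.join("Karl", "video")
session.leave("Max Power")
```

The connection metadata (status, quality) mirrors the information the interface conveys through line style and icon decoration.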
With reference to FIG. 1, an exemplary system 100 includes a general-purpose computing device 100, including a processing unit (CPU or processor) 120 and a system bus 110 that couples various system components, including a system memory such as read only memory (ROM) 140 and random access memory (RAM) 150, to the processor 120.
The system bus 110 may be any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. A basic input/output system (BIOS) stored in ROM 140 or the like may provide the basic routine that helps to transfer information between elements within the computing device 100, such as during start-up. The computing device 100 further includes storage devices 160, such as a hard disk drive, a magnetic disk drive, an optical disk drive, a tape drive, or the like. The storage device 160 can include software modules 162, 164, 166 for controlling the processor 120. Other hardware or software modules are contemplated. The storage device 160 is connected to the system bus 110 by a drive interface. The drives and the associated computer-readable storage media provide nonvolatile storage of computer-readable instructions, data structures, program modules, and other data for the computing device 100. In one aspect, a hardware module that performs a particular function includes the software component stored in a non-transitory computer-readable medium in connection with the necessary hardware components, such as the processor 120, bus 110, display 170, and so forth, to carry out the function. The basic components are known to those of skill in the art, and appropriate variations are contemplated depending on the type of device, such as whether the device 100 is a small, handheld computing device, a desktop computer, or a computer server.
Although the exemplary embodiment described herein employs the hard disk 160, it should be appreciated by those skilled in the art that other types of computer readable media which can store data that are accessible by a computer, such as magnetic cassettes, flash memory cards, digital versatile disks, cartridges, random access memories (RAMs) 150, read only memory (ROM) 140, a cable or wireless signal containing a bit stream and the like, may also be used in the exemplary operating environment. Non-transitory computer-readable storage media expressly exclude media such as energy, carrier signals, electromagnetic waves, and signals per se.
To enable user interaction with the computing device 100, an input device 190 represents any number of input mechanisms, such as a microphone for speech, a touch-sensitive screen for gesture or graphical input, a keyboard, a mouse, motion input, and so forth. An output device 170 can also be one or more of a number of output mechanisms known to those of skill in the art. If the device includes a graphical display which also receives touch-sensitive input, the input device 190 and the output device 170 can be essentially the same element or display. In some instances, multimodal systems enable a user to provide multiple types of input to communicate with the computing device 100. The communications interface 180 generally governs and manages the user input and system output. There is no restriction on operating on any particular hardware arrangement, and therefore the basic features here may easily be substituted for improved hardware or firmware arrangements as they are developed.
For clarity of explanation, the illustrative system embodiment is presented as including individual functional blocks, including functional blocks labeled as a “processor” or processor 120. The functions these blocks represent may be provided through the use of either shared or dedicated hardware, including, but not limited to, hardware capable of executing software, and hardware, such as a processor 120, that is purpose-built to operate as an equivalent to software executing on a general-purpose processor. For example, the functions of one or more processors presented in FIG. 1 may be provided by a single shared processor or multiple processors.
The logical operations of the various embodiments are implemented as: (1) a sequence of computer-implemented steps, operations, or procedures running on a programmable circuit within a general-use computer; (2) a sequence of computer-implemented steps, operations, or procedures running on a specific-use programmable circuit; and/or (3) interconnected machine modules or program engines within the programmable circuits. The system 100 shown in FIG. 1 can practice all or part of the methods disclosed herein, can be a part of the disclosed systems, and/or can operate according to instructions in the disclosed non-transitory computer-readable storage media.
Having briefly discussed the exemplary system embodiment, the disclosure now turns to FIG. 2.
A benefit of this approach is that the system can add additional communication modes to the modes already existing, as represented by the utility icons in FIG. 2.
The display 200 shows a communication session of three connected graphical elements 202, 204, 206. The displayed communication session 201 represents a real-time communication. In this case, the real-time communication is a three-way conference call between Frank Grimes 202, Max Power 204, and Karl 206, shown by connecting lines between their respective icons 202, 204, 206.
However, the visualization of communication sessions and user controls is neutral with respect to the various communication modalities and treats each the same, even as users seek to join a call or other communication session. The only difference is the indicator used for the specific modality, such as the communication modality icons 208, 210, 212, 214, 216. For instance, in FIG. 2, the call icon 208, video conference icon 210, IM icon 212, email icon 214, and social media icon 216 indicate the modalities available to the user.
The user can then add additional parties to the communication session in a similar manner. The user can remove participants from a communication session by dragging them to a trash can icon 220, providing a flicking motion, clicking an X associated with that participant, highlighting a participant and shaking the device (if it is a mobile device with accelerometer capability), or clicking a physical or graphical disconnect button. In one aspect where the communication session is via telephone, the system 100 removes participants from the communication session when the user hangs up the telephone receiver. As participants leave the communication session, the system 100 removes their icons from the graphical representation of the communication session.
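The several removal gestures above all converge on one action. A minimal sketch of that dispatch, with hypothetical gesture names, might look like this:

```python
# Illustrative only: several distinct user gestures all dispatch to the
# same "remove participant" action, consistent with the mode-neutral UI.
REMOVE_GESTURES = {"drag_to_trash", "flick", "click_x",
                   "shake_highlighted", "disconnect_button"}

def handle_gesture(roster, gesture, participant):
    # Every recognized removal gesture has the identical effect.
    if gesture in REMOVE_GESTURES:
        roster.discard(participant)
    return roster

roster = {"Frank Grimes", "Max Power", "Karl"}
handle_gesture(roster, "flick", "Karl")
handle_gesture(roster, "click_x", "Max Power")
```

Because every gesture maps to the same handler, adding a new input mechanism (for example, a spoken command) only extends the gesture set, not the removal logic.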
The graphical elements shown are icons, but can also include images, text, video, animations, sound, caricatures, and/or avatars. Users can personalize their own graphical elements or feed a live stream of images from a camera or video camera, for example. In addition, the graphical elements can have an associated string of text 202a, 204a, 206a. The string of text can include a name, a title, a position, a telephone number, email address, a current status, presence information, location, and/or any other available information. The string of text can be separate from but associated with the graphical element, as shown in
The system 100 can include for each icon a graphical sub-element 202b, 204b, 206b that indicates the communication mode for each participant. For example, Max Power 204 is participating via an instant messaging (IM) client 204b; Frank Grimes 202 is participating via telephone 202b; Karl is participating via a video conference client 206b. The system 100 is mode-neutral, meaning that the system 100 treats each mode of communication the same, such as telephone, cellular phone, voice over IP (VoIP), instant messaging, e-mail, text messaging, screen sharing, file sharing, application sharing, and video conferencing. As a user changes from one mode to another, the sub-elements can change accordingly. For example, if Frank Grimes 202 changes from a landline to a cellular phone mid-conference, the telephone icon 202b can change to a mobile phone icon.
The graphical elements can also convey information about the conference call by changing type, size, color, border, brightness, position, and so forth. The lines, for example, can convey relationships between participants. A user can manually trigger the changes for their own icon or others' icons, or the system 100 can detect change events and change the graphical elements accordingly. Change events can be based on a contacted party, context, persona, connectivity status, and/or presence. For example, as one person is talking or typing a text message, the system 100 can enlarge their icon. As another example, the system 100 can track how much each person in the conference call is talking and move graphical elements up and down based on a total talk time in the conference call.
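The talk-time behavior just described can be sketched as a simple layout rule: order icons by cumulative talk time and enlarge the current speaker. The function and sizes below are hypothetical, not taken from the disclosure.

```python
# Illustrative sketch of one change event: icons are ordered by total
# talk time and the currently speaking participant's icon is enlarged.
talk_time = {"Frank Grimes": 95.0, "Max Power": 40.0, "Karl": 12.5}

def icon_layout(talk_time, speaking=None, base=48):
    # Sort participants from most to least total talk time, then double
    # the icon size of whoever is currently speaking.
    order = sorted(talk_time, key=talk_time.get, reverse=True)
    return [(name, base * 2 if name == speaking else base)
            for name in order]

layout = icon_layout(talk_time, speaking="Karl")
```

The same hook could drive any of the other listed changes (color, border, brightness) from the same change events.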
In another variation, the system 100 modifies the links connecting the graphical elements 202, 204, 206 by changing their thickness, length, color, or style, and/or by animating the links. These modifications can represent a currently active party, shared resources, an active communication session, a held communication session, a muted communication session, a pending communication session, a connecting communication session, a multi-party line, a sidebar conversation, a monitored transfer, an unmonitored transfer, selective forwarding, a selective breakup of the communication session into multiple communication sessions, and so forth.
In one aspect, a user provides input, such as a gesture (a drag and drop, a tap, or a drag on a touch screen, for example) or any other instructive user input, to manipulate and manage the conference call. For example, the user can click a call icon 208, a video conference icon 210, an IM icon 212, an email icon 214, or a social media icon 216 to invite another user to join the communication session. A user can drag these icons and drop them on a contact or on a participant in a current communication session. For example, if an incoming communication session is in one modality (IM, for example), the user can drag the call icon onto the incoming communication session to accept the incoming communication session but renegotiate it from IM to a call. A user can also initiate a communication session by dragging and dropping an appropriate icon onto a contact. Social media include web sites such as Facebook, Twitter, LinkedIn, MySpace, and so forth. Alternatively, the user can browse through a list of contacts 218, then drag and drop a desired contact to add the desired contact to the conference call. The system 100 then automatically contacts that person in their desired mode, a sender-preferred mode, a currently available mode based on presence information, or a common available mode between the participants, and joins that person to the conference call. The system 100 can display other information as well, such as a calendar, notes, memos, personal presence information, and time. The system 100 display can be user-configurable. Each participant in the communication session 201, or contact in a list of contacts, can have multiple associated addresses, phone numbers, or points of contact, such as a work phone, home phone, mobile phone, work email, home email, AIM address, Facebook chat address, and the like. Each point of contact may have an icon or a qualifier, such as a symbol, that indicates not only the party but also the contact mode.
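The mode-selection order above (contact's desired mode, then the sender's preferred mode, then any commonly available mode) can be captured in a few lines. The function below is an assumed sketch of that precedence, not the disclosed implementation:

```python
# Illustrative precedence: desired mode first, then sender-preferred,
# then the first mode both parties currently support (per presence).
def pick_mode(contact_desired, sender_preferred,
              contact_available, sender_available):
    common = [m for m in contact_available if m in sender_available]
    if contact_desired in common:
        return contact_desired
    if sender_preferred in common:
        return sender_preferred
    return common[0] if common else None

# Contact wants video but cannot reach the sender that way; both
# support telephone, which is also the sender's preferred mode.
mode = pick_mode("video", "phone",
                 contact_available=["im", "phone"],
                 sender_available=["phone", "email"])
```

Returning `None` when no mode is shared leaves room for the interface to prompt the user instead of connecting silently.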
An incoming communication session icon can blink, bounce, pulse, grow, shrink, vibrate, change color, send an audible alert (such as a ringtone), and/or provide some other notification to the user of the incoming session. The user can interact with and manipulate this incoming request in the same manner as the other current communication sessions. The system 100 does not differentiate between an active communication session and a communication session representing an incoming request. For example, the user can drag and drop an incoming call on top of a communication session to add the incoming call directly to the communication session. As another example, the user can drag and drop an incoming session to a trash can icon to ignore, double click on the incoming session to send the incoming caller (if it is a call) to voicemail, or tap and hold to place the caller on hold.
In one aspect, user preferences guide the amount and type of information conveyed by the graphical elements and the associated text. User preferences can be drawn from a viewer's preferences and/or a source person's preferences. For example, a viewer may set preferences to show others' email addresses when available, while a source person may set preferences to never share his or her email address. The source person's preferences (or the preferences of the “owner” of the information) can override a third party's preferences.
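The override rule amounts to a logical AND: a field is shown only if the viewer wants it and the owner permits it. A minimal sketch, with assumed field names:

```python
# Illustrative: the information owner's privacy preference overrides
# the viewer's display preference for each field.
def visible_fields(owner_prefs, viewer_prefs):
    # A field is displayed only when the viewer has opted to see it
    # AND the owner has opted to share it.
    return {f for f, wanted in viewer_prefs.items()
            if wanted and owner_prefs.get(f, False)}

viewer = {"email": True, "phone": True}
owner = {"email": False, "phone": True}  # owner never shares email
shown = visible_fields(owner, viewer)
```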
One possible user input is to divide the communication session shown in FIG. 2 into two or more separate communication sessions.
Then the system presents a utility icon (not shown in FIG. 2) with which the user can manage the resulting separate communication sessions.
Having discussed several variations of FIG. 2, the disclosure now turns to FIG. 3.
In one aspect, a centralized entity controls the communication session. The centralized entity can reside in the network or communicate via the network. The centralized entity can operate as a centralized enterprise intelligence server. In another aspect, the communication session control and functionality is distributed among multiple server resources 314, 316, 318, 320 in the network or cloud. In addition to a centralized intelligence and distributed intelligence in the cloud, the network 302 can provide this functionality using a peer-to-peer approach with intelligence on the endpoints. Some variations include providing standardized functionality on a standards-compliant server and non-standardized functionality distributed across the endpoints.
The display of each communications device shows a different aspect or view of the same communication session. For example, the display of device 304 shows the same participants 202, 204, 206 as shown in FIG. 2.
For example, Max Power 204 is currently communicating via instant messaging. Frank Grimes 202 is currently communicating via telephone. Karl 206 is currently communicating via video conferencing.
In addition to the current communication modality icons, each participant's icon in the communication session can have associated sub-icons or sub-elements indicating available and/or preferred communication modalities. For example, Max Power 204 includes sub-icons for telephone 408, IM 410, and email 412. Frank Grimes 202 includes sub-icons for telephone 414, video conferencing 416, and social media 418. Karl 206 includes sub-icons for video conferencing 420, IM 422, and email 424. A user can interact with other communication session participants via these sub-icons or sub-elements. For example, in order to set up a sidebar communication session with Karl 206, Max Power 204 can click on the IM sub-icon 422 associated with Karl 206. A user can modify his or her own set of sub-icons in the communication session. For example, Frank Grimes 202 can drag the social media sub-icon 418 out to remove it from the group of sub-icons. Alternatively, Frank Grimes 202 can click on the video conferencing sub-icon 416 to change from a telephone connection to a video conferencing connection while remaining connected to the communication session.
In some cases, a sub-icon can represent multiple related facets of functionality.
The display 500 can include additional sub-icons not related to any participant for all available modalities regardless of each participant's preferences. A user can drag a sub-icon from the additional sub-icons to a user to request interaction via a communication modality not currently preferred or available. For example, Max Power's 204 icon does not have an associated sub-icon for video conferencing. Frank Grimes can drag a video conferencing sub-icon from the additional sub-icons onto Max Power 204 to request a video conference.
The communication session view 602 depicts a communication session 604 with three participants, John 606, Moe 608, and Carly 610. In this example, the communication session view 602 includes a central hub or session manager 612 that links the participants. Each participant's icon can have a set of associated icons representing available or currently used communication modalities. For example, John 606 has a cellular phone icon 606a and a webcam icon 606b. Moe 608 has a webcam icon 608a, a telephone icon 608b, and a computer icon 608c. Carly 610 has a telephone icon 610a, a computer icon 610b, and a webcam icon 610c. A user can drag and drop graphical elements from the various portions of the user interface 600 to perform actions such as adding participants to the communication session 604, creating a new communication session, terminating a communication session, dividing a communication session, sharing information, and so forth.
A user can manipulate communication modalities of himself as well as others via the communication modality buttons 714. For example, the user can click and drag or otherwise move any one of the communication modality buttons 714 onto a participant icon, such as John 706. That action can trigger the system 100 to shift to the indicated communication modality or inquire of John 706 if he is willing to change to the indicated communication modality.
The basic input/output buttons 716 provide a user with basic functionality to manipulate the communication session, create a new communication session, or simply to answer system queries. For example, if the user wants to add a new participant to the communication session, the user can click the “new” button. The system 100 presents a dialog to the user to determine which contact to add to the communication session, and presents a confirmation dialog such as “Are you sure?” The user can then tap on the OK button to confirm, after which the system 100 adds the selected contact as a new participant.
A user can click and drag individual lines onto other contacts to establish additional communication sessions. For example, if participants 802, 804 are part of a larger communication session having more participants, participant 802 can establish a sidebar with participant 804 by dragging the IM line 812 and dropping it directly on participant 804. Then to terminate the sidebar, participant 804 can drag the IM line 812 back to the communication session hub 806.
In another aspect, the communication session includes multiple modalities for each participant. For example, the communication session can be a video conference where all participants have a video stream (if a video camera is available), one or more participants have an audio stream (if a microphone is available), and the session includes a text-based chat under the video stream. In this example, each participant in the graphical representation connects to the communication session via one, two, or three graphical links representing actual communication modalities. Users can individually control (e.g., terminate, add, mute, pause, and so forth) each modality's separate link to the communication session. For example, participant 802 can pause the video conference link 810 feeding a video stream to other participants 804 via the hub 806 but still maintain the other two modalities, telephone 808 and IM 812. Participant 802 can later resume the video conference link 810.
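The per-modality links described above can be sketched as independent objects, each with its own state, so pausing one link leaves the others untouched. Class and method names below are illustrative assumptions:

```python
# Illustrative: each participant holds independent per-modality links
# to the session hub, each controllable on its own.
class ModalityLink:
    def __init__(self, modality):
        self.modality = modality
        self.state = "active"

    def pause(self):
        self.state = "paused"

    def resume(self):
        self.state = "active"

# A participant connected via three modalities simultaneously.
links = {m: ModalityLink(m) for m in ("phone", "video", "im")}
links["video"].pause()   # stop the video stream to other participants
active = sorted(m for m, link in links.items() if link.state == "active")
links["video"].resume()  # later resume the video conference link
```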
The disclosure now turns to the exemplary method embodiment shown in FIG. 9. For the sake of clarity, the method is discussed in terms of an exemplary system 100, as shown in FIG. 1, configured to practice the method. The system 100 displays on a graphical user interface a set of connected graphical elements representing a structure of a particular communication session (902).
The system 100 receives user input associated with the set of connected graphical elements, the user input having an action associated with the communication session (904). The user input can be a click of a mouse, a tap of a finger on a touch screen, or any other suitable input. The user can click, drag, drop, and otherwise move and locate icons as user input.
An example of applying user controls in a mode-neutral way is provided below with reference to the display shown in FIG. 2.
The system 100 performs the action based on the received user input (906). The system 100 can also receive a first user input indicating a specific graphical sub-element, display a menu of options based on the specific graphical sub-element and its respective associated user, receive a second user input selecting an option in the menu of options, and manipulate the communication session based on the second user input. In one instance, the system 100 manipulates the communication session by creating a separate communication session, but other actions consistent with the disclosure are possible.
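The three method steps (display 902, receive input 904, perform action 906) reduce to a simple loop body. The sketch below is a hypothetical rendering of that flow; the handler names are assumptions:

```python
# Minimal sketch of the three method steps: display connected elements
# (902), receive user input carrying an action (904), perform it (906).
def run_step(display_fn, read_input_fn, actions):
    display_fn()                             # (902) render the session
    event = read_input_fn()                  # (904) receive user input
    return actions[event["action"]](event)   # (906) perform the action

log = []
result = run_step(
    display_fn=lambda: log.append("rendered"),
    read_input_fn=lambda: {"action": "add", "who": "Karl"},
    actions={"add": lambda e: f"added {e['who']}"},
)
```

Dispatching through an action table keeps the loop itself mode-neutral: new actions (split, hold, renegotiate) are new table entries, not new control flow.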
As can be appreciated based on this disclosure, the interface treats all communication modalities exactly the same with respect to session control. For example, as discussed herein, communication sessions of any mode or type can be controlled and managed using the same user input modalities. Thus, if communication sessions are started, ended, or split, or if participants are added or removed, the same user input modalities (drag and drop, speech, gesture input, tapping, etc.) perform the same functions across the different communication modes. An IM chat session with four participants can be split into two sessions of two IM chat participants using the same input as splitting a telephone conference with four participants into two separate conferences of two people each. The communication session may equally be a video conference or a screen sharing session over the web. The user operations with respect to session control are identical.
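The four-party split described above is the same operation regardless of mode; only the data it operates on differs. A hypothetical sketch:

```python
# Illustrative: splitting any four-party session into two two-party
# sessions is one mode-neutral operation, whether the session is an IM
# chat, a telephone conference, or a video conference.
def split_session(participants, group_a):
    a = [p for p in participants if p in group_a]
    b = [p for p in participants if p not in group_a]
    return a, b

im_a, im_b = split_session(["Ann", "Bob", "Cid", "Dee"], {"Ann", "Bob"})
call_a, call_b = split_session(["W", "X", "Y", "Z"], {"W", "Y"})  # same op
```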
Thus, the interfaces shown in FIGS. 2 and 4-8 can easily support a new mode of communication while enabling users to manage the new mode with the same user input modalities as the other modes. For example, if the system is to integrate a communication mode such as Google Wave in addition to the call, video, IM, email, and social media modes shown in FIG. 2, the new mode can be represented by an additional utility icon and managed via the same user inputs as the existing modes.
Embodiments within the scope of the present disclosure may also include tangible and/or non-transitory computer-readable storage media for carrying or having computer-executable instructions or data structures stored thereon. Such non-transitory computer-readable storage media can be any available media that can be accessed by a general-purpose or special-purpose computer, including the functional design of any special-purpose processor as discussed above. By way of example, and not limitation, such non-transitory computer-readable media can include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code means in the form of computer-executable instructions, data structures, or processor chip design. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination thereof) to a computer, the computer properly views the connection as a computer-readable medium. Thus, any such connection is properly termed a computer-readable medium. Combinations of the above should also be included within the scope of the computer-readable media.
Computer-executable instructions include, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. Computer-executable instructions also include program modules that are executed by computers in stand-alone or network environments. Generally, program modules include routines, programs, components, data structures, objects, and the functions inherent in the design of special-purpose processors, etc. that perform particular tasks or implement particular abstract data types. Computer-executable instructions, associated data structures, and program modules represent examples of the program code means for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps.
Those of skill in the art will appreciate that other embodiments of the disclosure may be practiced in network computing environments with many types of computer system configurations, including personal computers, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and the like. Embodiments may also be practiced in distributed computing environments where tasks are performed by local and remote processing devices that are linked (either by hardwired links, wireless links, or by a combination thereof) through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
The various embodiments described above are provided by way of illustration only and should not be construed to limit the scope of the disclosure. Those skilled in the art will readily recognize various modifications and changes that may be made to the principles described herein without following the example embodiments and applications illustrated and described herein, and without departing from the spirit and scope of the disclosure.
20090249226 | Manolescu et al. | Oct 2009 | A1 |
20090319623 | Srinivasan et al. | Dec 2009 | A1 |
20100011304 | van Os | Jan 2010 | A1 |
20100023585 | Nersu et al. | Jan 2010 | A1 |
20100076807 | Bells et al. | Mar 2010 | A1 |
20100083137 | Shin et al. | Apr 2010 | A1 |
20100085417 | Satyanarayanan et al. | Apr 2010 | A1 |
20100162153 | Lau | Jun 2010 | A1 |
20100167710 | Alhainen | Jul 2010 | A1 |
20100223089 | Godfrey et al. | Sep 2010 | A1 |
20100234052 | Lapstun et al. | Sep 2010 | A1 |
20100312836 | Serr et al. | Dec 2010 | A1 |
20110022968 | Conner et al. | Jan 2011 | A1 |
20110109940 | Silverbrook et al. | May 2011 | A1 |
20110151905 | Lapstun et al. | Jun 2011 | A1 |
20110191136 | Bourne et al. | Aug 2011 | A1 |
20110296312 | Boyer et al. | Dec 2011 | A1 |
20120019610 | Hornyak et al. | Jan 2012 | A1 |
20120083252 | Lapstun et al. | Apr 2012 | A1 |
20120084672 | Vonog et al. | Apr 2012 | A1 |
20120110473 | Tseng | May 2012 | A1 |
20120216129 | Ng et al. | Aug 2012 | A1 |
20120259633 | Aihara et al. | Oct 2012 | A1 |
20130080954 | Carlhian et al. | Mar 2013 | A1 |
20130108035 | Lyman | May 2013 | A1 |
20130250038 | Satyanarayanan et al. | Sep 2013 | A1 |
20130268866 | Lyman | Oct 2013 | A1 |
Foreign Patent Documents

Number | Date | Country |
---|---|---|
1292127 | Apr 2001 | CN |
19543870 | May 1996 | DE |
19716316 | Oct 1998 | DE |
0453128 | Oct 1991 | EP |
0717544 | Jun 1996 | EP |
1480422 | Nov 2004 | EP |
1983729 | Oct 2008 | EP |
2338146 | Dec 1999 | GB |
H08-251261 | Sep 1996 | JP |
2004-199644 | Feb 1997 | JP |
2002-297873 | Oct 2002 | JP |
2003-296556 | Oct 2003 | JP |
2004-102389 | Apr 2004 | JP |
H09-055983 | Jul 2004 | JP |
2004-320235 | Nov 2004 | JP |
2005-318055 | Nov 2005 | JP |
2006-050370 | Feb 2006 | JP |
2006-060340 | Mar 2006 | JP |
2006-092367 | Apr 2006 | JP |
2007-004000 | Jan 2007 | JP |
2008-171068 | Jul 2008 | JP |
2009-502048 | Jan 2009 | JP |
2009-044679 | Feb 2009 | JP |
2006-1158872 | Jun 2006 | KR |
2006-0132484 | Dec 2006 | KR |
2009-0001500 | Jan 2009 | KR |
WO 9821871 | May 1998 | WO |
WO 9945716 | Sep 1999 | WO |
WO 0018082 | Mar 2000 | WO |
WO 2006054153 | May 2006 | WO |
WO 2006060340 | Jun 2006 | WO |
WO 2007008321 | Jan 2007 | WO |
Other Publications

Entry |
---|
Honda et al., "e-MulCS: Multi-Party Conference System with Virtual Space and the Intuitive Input Interface," Proceedings of the 2004 International Symposium on Applications and the Internet, 2004, pp. 56-63. |
Byrne et al., "Developing multiparty conferencing services for the NGN: towards a service creation framework," ISICT '04: Proceedings of the 2004 International Symposium on Information and Communication, Jun. 2004. |
WebEx, WebEx Meeting Center User's Guide, Version 8, WebEx Communications, Inc., 2007, pp. 1-332. |
Related Publications

Number | Date | Country | |
---|---|---|---|
20100251124 A1 | Sep 2010 | US |
Provisional Applications

Number | Date | Country | |
---|---|---|---|
61164753 | Mar 2009 | US |