The present application is related to three other patent applications: “Document Viewing Mechanism for Document Sharing Environment”, filed Apr. 15, 2002, Ser. No. 10/127,951; “Application Sharing Single Document Sharing”, filed Apr. 3, 2002, Ser. No. 11/344,361; and “Application Sharing User Interface Improvements”, filed Apr. 5, 2002, Ser. No. 11/401,519.
The present invention is related generally to collaborative computing applications, and, more particularly, to annotating a shared display of a collaborative computing application.
The growth of computer networks, both within a single location and among remotely located sites, has spawned increased interest in collaborative computing applications. These applications allow users at multiple computing devices to work together, running the same application and viewing the same data. For example, consider a word-processing collaborative application. Through network connections, a set of users can see and collaboratively edit a document. In another example, a frustrated user shares an application's error logging screen with a remote technical support specialist. The user runs the application while the specialist views the logging screen to discover what causes the application to fail.
In many implementations, the collaborative application actually runs on only one device, called the “application sharer” or the “local device,” and that device shares the collaborative application display with all other devices, called “application viewers” or “remote devices.” The sharer and all of the viewers run a collaborative computing utility program that allows remote users to see the same display created by the collaborative application for the sharer device. An advantage of these implementations is that the application need not be modified to support collaboration. The application may run for the sharer's user and be unaware that its display is shared with remote users.
In many scenarios, it is not enough for all users to be able to see the display produced by the collaborative application for the application sharer's device. Users at the remote devices would like to provide input to the display. Some collaboration systems allow the users to pass control among themselves so that each in turn can run the collaborative application as if it were running on his own device. In another scenario of remote user interaction, the collaborative application presents its display to all connected users. The remote users may not send input to the collaborative application, but they may wish to annotate the collaborative application display in order to call the attention of the other users to a particular point. Rather than relying on the traditional but confusing voice cues “Look, up near the top,” “No, not that line, the one below it,” “It's the, let's see, one, two, three, fourth column from the right,” “I meant the ‘thee,’ not the ‘the’,” some collaborative computing utilities support visual annotation in which users “draw” on the collaborative application display. The users' annotations are not sent as input to the collaborative application, but they are displayed to all users.
In previous annotation systems, each user creates annotation information (which may include a pointer, a highlighted area, etc.) and sends this annotation information to all other users. Each device, including the annotator's, combines the annotation information with the display received from the collaborative application. This method has many shortcomings. First, there is no central control to coordinate the annotations from multiple users or to turn off annotations for a while if that is desired: every user can annotate at any time. The lack of central control also means that there is no guarantee that all users are seeing the same annotations, which can increase confusion in the collaborative effort beyond what it would have been without visual annotation. As the collaborative application is actually running only on the sharer, a remote user's annotations are not tied directly to the current application display, that is, the application may change the display after the remote user annotates it. Finally, the process of receiving annotations from all of the other users and combining that information with the shared display received from the collaborative application may tax the resources of a remote device, especially a lower-power portable or handheld device.
What is needed is an annotation system for collaborative applications that better coordinates the annotation input from all the users while presenting less of a burden to remote devices.
In view of the foregoing, the present invention provides a method for remote users of a collaborative application to generate annotation information, send that annotation information to an application sharer device, and receive back a display combining the output of the collaborative application with the annotation information. Each viewer device communicates only with the sharer, not with the other viewers, and the viewers need not be able to graphically blend the application output with annotation information from their own users and, potentially, from all other users.
A collaborative application display is visible on an application viewer's screen. To make an annotation, the user of the application viewer draws over the collaborative display. The drawing may be performed by moving a mouse, by invoking a paint program, or the like. In any case, the output of the user's annotation efforts is intercepted and is not displayed on the user's screen. Instead, the intercepted annotation information is sent to a central device, presumably the application sharer running the collaborative application.
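By way of illustration only, the following Python sketch shows one possible shape of this viewer-side behavior: an intercepted drawing gesture is packaged and forwarded to the sharer instead of being rendered locally. All names (the `AnnotationForwarder` class, the message fields, the transport callable) are assumptions made for the example, not elements described by the invention.

```python
import json

class AnnotationForwarder:
    """Viewer-side sketch: annotation gestures are intercepted and forwarded,
    never drawn on the local screen. The transport is a plain callable here so
    the example stays self-contained; in practice it would be a network send."""

    def __init__(self, transport):
        self.transport = transport

    def on_draw(self, x, y, color="red"):
        # Intercepted gesture: packaged and sent to the sharer, not rendered locally.
        message = {"type": "annotation", "x": x, "y": y, "color": color}
        self.transport(json.dumps(message).encode("utf-8"))

if __name__ == "__main__":
    outbox = []                                   # stand-in for the connection to the sharer
    forwarder = AnnotationForwarder(outbox.append)
    forwarder.on_draw(120, 45, color="blue")
    print(outbox)                                 # one packaged annotation message
```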
On the application sharer, the annotation information is received, potentially from many remote users at the same time. The sharer has the opportunity to coordinate the annotation input, possibly by giving one set of remote users permission to annotate and ignoring annotation input from all other users. In some embodiments, permission to annotate may be passed from one user to another. The user of the sharer may also generate annotation input. In any case, the annotation input that the sharer device has decided to display is drawn into an annotation display on the sharer. That annotation display is graphically blended with the display produced by the collaborative application itself. The combination is then sent to the remote viewers.
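A minimal sketch of the sharer-side flow, assuming a toy pixel-grid display and an in-memory "send" so that it runs standalone: accepted annotation input is drawn into an annotation layer, that layer is overlaid on a copy of the application frame, and only the combined image is sent to the viewers. The data structures and function names are illustrative assumptions.

```python
def accept_annotation(annotation_layer, message):
    """Sharer side: annotation input the sharer decides to display is drawn into
    its annotation layer rather than being passed to the collaborative application."""
    annotation_layer.append(message)

def blend_and_send(app_frame, annotation_layer, send_to_viewers):
    """Overlay the annotation layer on a copy of the application frame, then send."""
    combined = [row[:] for row in app_frame]
    for a in annotation_layer:
        combined[a["y"]][a["x"]] = a["color"]     # annotation pixel covers the app pixel
    send_to_viewers(combined)                     # viewers simply display what they receive

if __name__ == "__main__":
    frame = [["." for _ in range(10)] for _ in range(3)]   # stand-in application display
    layer, wire = [], []
    accept_annotation(layer, {"x": 4, "y": 1, "color": "*"})
    blend_and_send(frame, layer, wire.append)
    print("\n".join("".join(row) for row in wire[0]))
```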
On each remote application viewer, the received combination is simply displayed. The display includes whatever annotation information the sharer decided to accept. The remote viewers need not perform a graphics blend because they receive only one input stream, the combined display produced by the sharer.
In other aspects of the present invention, the application sharer uses its centralized position to further coordinate which annotations are displayed and how they are displayed. The sharer may visually indicate, via color or a text flag, for example, the source of each annotation. The sharer may time out an annotation, or may delete the annotation if the collaborative application's display has scrolled underneath the annotation, causing the annotation to “lose its place” in the collaborative display and become meaningless.
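One way to picture the bookkeeping this centralized coordination implies is a small per-annotation record along the following hypothetical lines: a source tag supports per-user coloring or labeling, an anchor position supports scroll invalidation, and a creation time supports timeouts. The field names and the Python representation are assumptions for illustration only.

```python
import time
from dataclasses import dataclass, field

@dataclass
class Annotation:
    """Illustrative per-annotation record a sharer might keep (names assumed)."""
    source: str              # which user drew it; shown via color or a text flag
    x: int                   # anchor position within the shared display
    y: int
    color: str = "red"
    created_at: float = field(default_factory=time.time)

    def expired(self, lifetime_seconds, now=None):
        """True once the annotation has been displayed longer than its lifetime."""
        now = time.time() if now is None else now
        return now - self.created_at > lifetime_seconds
```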
While the appended claims set forth the features of the present invention with particularity, the invention, together with its objects and advantages, may be best understood from the following detailed description taken in conjunction with the accompanying drawings of which:
FIGS. 3a through 3c together form a dataflow diagram generally showing the information passed and the operations performed when a user at an application viewer annotates the collaborative application display received from the application sharer;
FIG. 4a is an exemplary screen display showing annotations from two application viewers overlaid onto the collaborative application display;
FIG. 4b shows what would happen to the display of FIG. 4a if the collaborative application scrolled its display beneath the annotations;
FIG. 5a is a schematic diagram of an exemplary system on an application viewer that supports annotation; and
FIG. 5b is a schematic diagram of an exemplary system on an application sharer that supports annotation.
Turning to the drawings, wherein like reference numerals refer to like elements, the present invention is illustrated as being implemented in a suitable computing environment. The following description is based on embodiments of the invention and should not be taken as limiting the invention with regard to alternative embodiments that are not explicitly described herein.
In the description that follows, the present invention is described with reference to acts and symbolic representations of operations that are performed by one or more computing devices, unless indicated otherwise. As such, it will be understood that such acts and operations, which are at times referred to as being computer-executed, include the manipulation by the processing unit of the computing device of electrical signals representing data in a structured form. This manipulation transforms the data or maintains them at locations in the memory system of the computing device, which reconfigures or otherwise alters the operation of the device in a manner well understood by those skilled in the art. The data structures where data are maintained are physical locations of the memory that have particular properties defined by the format of the data. However, while the invention is being described in the foregoing context, it is not meant to be limiting as those of skill in the art will appreciate that various of the acts and operations described hereinafter may also be implemented in hardware.
The present invention presents methods for visually annotating the collaborative application display. Users on the application sharer 102 and on the application viewers 104 may create visual annotations. In the case of the application viewers 104, these annotations travel to the application sharer 102 via the data flows 108 and 110. Upon reception, the application sharer 102 does not send this annotation input to the collaborative application, but merges it graphically with the collaborative application display. The merged image is then sent, via data flows 106, to the application viewers 104. Providing for annotation in this manner gives the application sharer 102 centralized control over all annotations and eases the burden on the application viewers 104 of merging annotations with the collaborative application display. This latter point is especially important when an application viewer 104 does not have the computing resources of a typical desktop computer but is a lower-power device, such as a handheld computer or an enhanced cellular telephone.
The application sharer 102 and the application viewers 104 of FIG. 1 communicate with one another over a network 100 and may range from full desktop computers to lower-power portable and handheld devices.
To illustrate one way of implementing the methods of the present invention, FIGS. 3a through 3c present a dataflow diagram of an exemplary annotation exchange between the application sharer 102 and one of the application viewers 104.
In step 300, the application sharer 102 runs the collaborative application, and in step 302, the application sharer 102 and the application viewer 104 initiate their collaborative computing utility programs. As the collaborative application runs, it produces display information. The kind of information displayed depends upon the collaborative application itself and may include, for example, static text, a warning message, toolbars, a picture, full motion video, or a combination of these and other elements. The application sharer 102 chooses to share some or all of this collaborative display information with the application viewers 104. For example, if the application viewers 104 are prevented from sending control input to the collaborative application, then the application sharer 102 may share a video display but not the toolbars. Whatever display information the application sharer 102 chooses to share is captured in step 304 and sent to the application viewers 104. Numerous formats for encoding the collaborative display information are known and include drawing commands, bit maps, and still and live video formats. The application viewer 104's collaborative computing utility program receives the display information and presents it to a user of the application viewer 104 in step 306.
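A toy version of this selective sharing, with assumed element kinds and an in-memory delivery path: the sharer captures only the kinds of display information it has chosen to expose (here, video but not toolbars) and pushes the result to every viewer, whose utility program simply presents whatever arrives. None of the names below come from the described embodiment.

```python
def capture_shared_display(display_elements, shared_kinds):
    """Step 304 sketch: keep only the element kinds the sharer chose to share."""
    return [e for e in display_elements if e["kind"] in shared_kinds]

def send_display(shared_elements, viewer_inboxes):
    """Deliver one display update to every viewer (step 306: they just show it)."""
    for inbox in viewer_inboxes:
        inbox.append({"display": shared_elements})

if __name__ == "__main__":
    elements = [{"kind": "video", "data": "frame-0"},
                {"kind": "toolbar", "data": "edit-menu"}]
    viewer_a, viewer_b = [], []
    send_display(capture_shared_display(elements, {"video"}), [viewer_a, viewer_b])
    print(viewer_a)      # contains the video element only; the toolbar stays private
```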
Upon viewing the collaborative application display, the user of the application viewer 104 decides to annotate it. Annotations are designed to call the attention of users at other devices to specific portions of the display. They are not intended to be control inputs sent to the collaborative application itself. In the embodiment of steps 308 and 310, the user draws the annotation over the collaborative application display; the drawing is intercepted before it reaches the local screen and is sent as annotation input to the application sharer 102.
In step 312, the application sharer 102 receives the annotation input from the application viewer 104 and associates a timer with the annotation input. The timer forms one part of the application sharer 102's mechanism for centralized coordination of annotation input, as described below.
The application sharer 102 may be receiving annotation input from several application viewers 104 and from a user of the application sharer 102 itself. In steps 314 through 318, the application sharer 102 coordinates this annotation input before combining it with the collaborative application display.
As one aspect of centralized coordination, the application sharer 102 may choose not to display any or all of the received annotations. The user of the collaborative application may choose to turn off annotation for a while and then later open up the display for annotation. One set of users of the application viewers 104 may be permitted to annotate the display while other users cannot. For example, particularly obnoxious users, or their annotations, may be screened. In some embodiments, permission to annotate may be passed from one user to another user just as permission to send input to the collaborative application is often passed around.
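As a sketch of one way such screening and permission passing might be tracked (the class, its methods, and the user identifiers are all assumptions), annotations from users outside the permitted set are silently dropped, and permission can be revoked from one user and granted to another:

```python
class AnnotationPermissions:
    """Sharer-side sketch of who may annotate right now (illustrative only)."""

    def __init__(self, initially_allowed=()):
        self.allowed = set(initially_allowed)

    def grant(self, user):
        self.allowed.add(user)

    def revoke(self, user):
        self.allowed.discard(user)

    def pass_permission(self, from_user, to_user):
        """Hand annotation rights from one user to another, as control is often passed."""
        self.revoke(from_user)
        self.grant(to_user)

    def screen(self, annotations):
        """Drop annotation input from users who are not currently permitted."""
        return [a for a in annotations if a["source"] in self.allowed]

if __name__ == "__main__":
    perms = AnnotationPermissions({"viewer-1"})
    perms.pass_permission("viewer-1", "viewer-2")
    incoming = [{"source": "viewer-1", "x": 1, "y": 1},
                {"source": "viewer-2", "x": 2, "y": 2}]
    print(perms.screen(incoming))    # only viewer-2's annotation survives
```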
In step 316, the application sharer 102 optionally tags each annotation with an indication of its source. These tags can help to reduce confusion when several users are annotating the same display. Color and text are both useful in tagging annotations.
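The tagging itself can be as simple as assigning each annotating user a stable color and text label before the annotation is drawn into the shared display. The palette, the label format, and the assignment-by-arrival scheme below are illustrative assumptions, not the claimed mechanism.

```python
from itertools import cycle

def make_source_tagger(palette=("red", "blue", "green", "orange")):
    """Return a function that stamps each annotation with a per-source color
    and a text flag, assigning colors in order of first appearance."""
    colors = cycle(palette)
    assigned = {}

    def tag(annotation):
        source = annotation["source"]
        if source not in assigned:
            assigned[source] = next(colors)
        annotation["color"] = assigned[source]
        annotation["label"] = source          # text flag naming the annotator
        return annotation

    return tag

if __name__ == "__main__":
    tag = make_source_tagger()
    print(tag({"source": "viewer-2", "x": 3, "y": 7}))
    print(tag({"source": "viewer-5", "x": 9, "y": 1}))   # gets the next color
```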
The annotation inputs that pass the screening of the application sharer 102 are combined with the collaborative application display in step 318, and the result is sent to the application viewers 104. When the combined information is received by the application viewer 104 in step 320, it is displayed in the same manner as the unannotated information is displayed in step 306. In fact, in some embodiments, the application viewer 104 cannot distinguish annotated from unannotated display information. Both are received simply as visual information and are displayed to the user of the application viewer 104. This is why there was no need in step 310 to display the annotation input created in that step on the application viewer 104.
The present invention's centralized coordination of annotation presents the same annotated display to the users of all of the collaborating devices. Because the annotations are graphically merged with the collaborative application display by the application sharer 102, the application viewers 104 do not need to perform this merging.
In the methods as described so far, an annotation, once made, remains forever on the collaborative application display. To prevent an ever-increasing accumulation of outdated annotations, the application sharer 102 in step 322 discards each annotation after displaying it for a set period of time. In another embodiment, the source of the annotation could send a special message to the application sharer 102 saying that the annotation should be removed.
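A compact sketch of both removal mechanisms mentioned here, under assumed names and data shapes: each annotation carries its creation time, a periodic sweep drops those older than a chosen display lifetime, and an explicit removal message from the annotation's own source deletes it immediately.

```python
import time

def sweep_expired(annotations, lifetime_seconds, now=None):
    """Keep only annotations younger than the chosen display lifetime (step 322)."""
    now = time.time() if now is None else now
    return [a for a in annotations if now - a["created_at"] <= lifetime_seconds]

def handle_remove_request(annotations, source, annotation_id):
    """Honor a removal message, but only from the user who made the annotation."""
    return [a for a in annotations
            if not (a["id"] == annotation_id and a["source"] == source)]

if __name__ == "__main__":
    t0 = 1000.0
    notes = [{"id": 1, "source": "viewer-1", "created_at": t0},
             {"id": 2, "source": "viewer-2", "created_at": t0 + 50}]
    notes = sweep_expired(notes, lifetime_seconds=60, now=t0 + 70)   # drops id 1
    notes = handle_remove_request(notes, "viewer-2", 2)              # drops id 2
    print(notes)   # []
```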
A third method for removing annotations, which may be combined with the first two methods, is presented in step 324. Because annotations are associated with a position on the collaborative application display, they may “lose their places” if the display scrolls underneath them. As an example, consider the exemplary collaborative application display 400 of FIG. 4a: if the collaborative application scrolls the display underneath the annotations, as in FIG. 4b, those annotations no longer point at the content their authors meant to mark, and in step 324 the application sharer 102 removes them.
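One plausible way to apply this rule (the region representation, coordinate scheme, and the choice to delete rather than re-anchor are all assumptions for illustration): when the application reports that a region of the display has scrolled, the sharer discards any annotation anchored inside that region.

```python
def invalidate_scrolled(annotations, scrolled_region):
    """Delete annotations whose anchors lie in a region the application scrolled.

    scrolled_region is (left, top, right, bottom) in display coordinates; an
    annotation anchored there has 'lost its place' and becomes meaningless.
    """
    left, top, right, bottom = scrolled_region
    return [a for a in annotations
            if not (left <= a["x"] < right and top <= a["y"] < bottom)]

if __name__ == "__main__":
    notes = [{"id": 1, "x": 5, "y": 2}, {"id": 2, "x": 40, "y": 30}]
    # The application scrolled the text pane occupying the top-left quadrant.
    print(invalidate_scrolled(notes, scrolled_region=(0, 0, 20, 10)))  # id 1 is removed
```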
In steps 326 and 328, the application sharer 102 and the application viewer 104 repeat the annotating and displaying as long as the collaborative session endures.
If strictly applied, the methods of
FIG. 5a illustrates the basic components of one embodiment of an application viewer 104. The application viewer 104 receives the collaborative display information from the network 100 over a network communications channel 210. The information is passed up a standard network communications protocol stack 500 which handles transmission issues such as session establishment, addressing, error recovery, and the like. The information is next passed to the collaborative computing utility program 502. In some embodiments, this utility 502 need know nothing whatsoever about the collaborative application running on the application sharer 102. It does, however, know how to present, via data flow 504, the collaborative application display 400 to a user of the application viewer 104.
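Because the viewer's utility program only has to present what arrives, its receive path can be as small as the following sketch. The length-prefixed framing, the stream object, and the `present` callback (standing in for data flow 504) are assumptions made so the example is self-contained and runnable.

```python
import io
import struct

def read_display_updates(stream, present):
    """Read length-prefixed display updates and hand each one to the renderer.

    No blending happens here: the viewer simply displays each received image.
    """
    while True:
        header = stream.read(4)
        if len(header) < 4:
            break                                   # connection closed
        (length,) = struct.unpack("!I", header)
        present(stream.read(length))

if __name__ == "__main__":
    payload = b"combined-frame-bytes"
    wire = io.BytesIO(struct.pack("!I", len(payload)) + payload)
    read_display_updates(wire, print)
```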
When a user of the application viewer 104 decides to annotate the collaborative application display 400, he uses some input component 212 to “draw” over the display 400. The commands to draw the annotation are intercepted by module 506 and are prevented from having any direct effect on the collaborative application display 400. Instead, the annotation commands 508 are sent to the collaborative computing utility program 502 which packages them and sends them through the network 100 to the application sharer 102.
The exemplary application sharer 102 illustrated in FIG. 5b includes the same network communications protocol stack 500 and collaborative computing utility program 502 as the application viewer 104, together with the collaborative application 510 itself. Annotation inputs 516 arriving from the application viewers 104 are merged with the display produced by the collaborative application 510, and the combined display is sent back over the network 100 to the application viewers 104.
The user of the application sharer 102 may use that device's input components 212 to send control information, via data flow 514, to the collaborative application 510. That user may also choose to annotate the collaborative application display. The annotation input is handled the same way as on the application viewers 104, being intercepted by module 506 and then sent to join the set of annotation inputs 516.
In view of the many possible embodiments to which the principles of the present invention may be applied, it should be recognized that the embodiments described herein with respect to the drawing figures are meant to be illustrative only and should not be taken as limiting the scope of the invention. Those of skill in the art will recognize that some implementation details, such as display and annotation information message formats, are determined by the protocols chosen for specific situations and can be found in published standards. Although the invention is described in terms of software modules or components, some processes may be equivalently performed by hardware components. Therefore, the invention as described herein contemplates all such embodiments as may come within the scope of the following claims and equivalents thereof.