Optimized joint document review

Information

  • Patent Grant
  • Patent Number
    9,383,888
  • Date Filed
    Wednesday, December 15, 2010
  • Date Issued
    Tuesday, July 5, 2016
Abstract
A UI for presenting and reviewing a document is optimized based upon the type of computing device being utilized to present the document. One such UI includes a first pane showing a view of the document under review that is sized and formatted for display on a large-format display device. The first pane can also be utilized to emphasize a portion of the document. The UI also includes a second pane that includes indicators for each of the reviewers of the document. The selection of an indicator will cause a portion of the document being reviewed by the corresponding reviewer to be displayed in the first pane. The UI also includes a third pane that includes a scaled image of the document shown in the first pane. Selection of a portion of the scaled image causes the selected portion of the document to be displayed in the first pane.
Description
BACKGROUND

Joint review of an electronic document often involves a single computer connected to a large-format display device, such as a projector. The program utilized to create the document, such as a word processing application program, is typically utilized to present and navigate the document under review. Document reviewers typically follow along with the display of the document on the large-format display device. In practice, this leads to all of the document reviewers viewing exactly the same content. Moreover, because the user interface provided by the document creation program is not optimized for a large-format display device, the document as presented may be difficult to read.


Some document reviewers might also opt to open a copy of the document under review on their own computing device, such as a laptop computer. The review of the document by these reviewers, however, occurs in a manner that is disconnected from the review of the document by the other reviewers. Consequently, the common joint review practices described above can lead to considerable manual navigation within the document under review, especially for long documents. Moreover, the user interface provided by the document creation program may not be configured to take advantage of enhanced capabilities provided by each reviewer's computing device.


It is with respect to these and other considerations that the disclosure made herein is presented.


SUMMARY

Technologies are described herein for enabling the joint review of documents. Through an implementation of the concepts and technologies presented herein, documents can be reviewed jointly in an efficient and effective manner. In particular, a user interface (“UI”) for presenting a document can be optimized based upon the type of computing device being utilized to present the document. For example, a UI that provides functionality for document review may be customized for use on large-format display devices and other device types. Additionally, document navigation mechanisms may be customized with support for touch-enabled display devices and gesture input, UIs for emphasizing portions of a document may be provided, and UIs may be provided for identifying the portions of the documents under review by each reviewer.


According to one aspect presented herein, a group of people, who may be referred to herein individually as “reviewers” or “document reviewers,” can jointly review an electronic document. A UI for reviewing the document can be provided that is customized based upon the type of computing device that each reviewer is utilizing. For instance, unique UIs can be provided for desktop or laptop computers, tablet computing devices, wireless mobile telephones, and large-format displays. Each UI can be customized to take advantage of the available user input mechanisms. For instance, the UIs may be optimized for touch input, gestures, and other types of input. Moreover, the location of each reviewer within a document under review may be shared between the devices, communicated to the reviewers, and utilized to assist in document navigation.


According to another aspect, a UI is provided for joint review of a document. In one implementation, the UI is configured for use on a large-format display device, such as a large touch-enabled video display. The UI includes a first UI element, such as a UI pane, showing a view of the document under review. When a large-format display device is utilized, the view may be sized and formatted for display on a large-format display device. For instance, a font style and size may be selected that are appropriate for viewing the document from a distance. Additionally, pagination and other elements traditionally associated with a paper document might not be displayed in the view of the document shown in the first UI pane. Conventional UI tools for editing a document might also be eliminated from the UI. The view of the document shown in the first UI pane may be navigated utilizing touch input, gestures, or other supported UI mechanisms.


The UI might also include a second UI element, such as a second UI pane. The second UI pane might include indicators for each of the reviewers of the document. Each indicator might include data identifying the reviewer, an image of the reviewer, data indicating the type of review being performed (e.g. reading or editing), and an indication of the type of device being utilized (e.g. a tablet or a phone). In one implementation, the selection of an indicator will cause a portion of the document being reviewed by the corresponding reviewer to be displayed in the first UI pane. In this manner, the indicators can be utilized to quickly navigate to the portion of the document under review by each of the reviewers. In alternate embodiments, one reviewer can “push” their location in a document to other reviewers, thereby synchronizing the other reviewers' locations in the document with their own. Additionally, the view shown to each reviewer might automatically follow along with the location of another reviewer.


According to another aspect, the UI also includes a third UI element, such as a third UI pane, for enabling navigation of the document shown in the first UI pane in a manner optimized for a particular display size and input type. For instance, in one embodiment the third UI pane includes a scaled image, such as a “thumbnail” image, of the document shown in the first UI pane. Selection of a portion of the scaled image causes the selected portion of the document to be displayed in the first UI pane. This navigation mechanism might vary with the display size and input type. For instance, on a large display, a visual thumbnail might be provided in the manner described above that can be utilized to quickly navigate the document with simple gestures. On a smaller display, such as a mobile telephone, the third UI pane might include a mechanism for navigating between headings in the document rather than a thumbnail. Other variations might also be utilized.


In other embodiments, the third UI pane includes a view of sections of the document. Selection of a section of the document in the third UI pane causes the selected section of the document to be displayed in the first UI pane. The third UI pane might also provide an indication of recent viewing or editing activity within the document by the reviewers. The third UI pane might also provide functionality for browsing by headings, figures, or other document elements. The view shown in the third pane might also be utilized to navigate the document independently of the view shown in the first UI pane.


According to another aspect, the first UI pane can be utilized to emphasize a portion of the document. For instance, in one embodiment a portion of the document, such as a paragraph, can be selected in the first UI pane. In response to the selection of a portion of the document, the other portions of the document shown in the first UI pane may be obfuscated. For instance, the other portions of the document may be “blurred” or “fogged” over, thereby rendering the other portions unreadable. In this manner, the attention of the reviewers may be focused on a selected portion of the document.


The UI elements disclosed herein, and the user experience provided for interacting with those panes, might also be customized based upon the type of device utilized by each reviewer. For example, when a sufficiently large display is available, each pane may be displayed separately or inline with the document being reviewed. On a mobile telephone, the panes may be displayed individually, and UI elements may be provided for allowing a user to navigate between the panes. It should be appreciated that although UI panes have been utilized herein to describe various features, other types of UI elements, such as “fly out” menus, overlays, and other UI elements and UI controls, might also be utilized.


This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended that this Summary be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a software and network architecture diagram showing one illustrative operating environment for the embodiments disclosed herein;



FIG. 2 is a user interface diagram showing one illustrative user interface disclosed herein for optimized document presentation and navigation on a large-format display device, according to one embodiment disclosed herein;



FIG. 3 is a user interface diagram showing one illustrative user interface disclosed herein for optimized document presentation and navigation on a desktop or laptop computer, according to one embodiment disclosed herein;



FIG. 4 is a user interface diagram showing one illustrative user interface disclosed herein for emphasizing a portion of a document, according to one embodiment disclosed herein;



FIG. 5 is a flow diagram showing aspects of one illustrative process disclosed herein for optimized document presentation and navigation for presenting a document in a meeting, according to one embodiment presented herein; and



FIG. 6 is a computer architecture diagram showing an illustrative computer hardware and software architecture for a computing system capable of implementing the various embodiments presented herein.





DETAILED DESCRIPTION

The following detailed description is directed to technologies for optimized joint document review. As discussed briefly above, a UI for presenting and reviewing a document is disclosed herein that is optimized based upon the type of computing device being utilized to present the document. One such UI includes a first pane showing a view of the document under review that is sized and formatted for display on a large-format display device. The first pane can also be utilized to emphasize a portion of the document. The UI also includes a second pane that includes indicators for each of the reviewers of the document. The selection of an indicator will cause a portion of the document being reviewed by the corresponding reviewer to be displayed in the first pane. The UI also includes a third pane that includes a scaled image of the document shown in the first pane. Selection of a portion of the scaled image causes the selected portion of the document to be displayed in the first pane. Additional details regarding this UI and others will be provided below.


While the subject matter described herein is presented in the general context of program modules that execute in conjunction with the execution of an operating system and application programs on a computer system, those skilled in the art will recognize that other implementations may be performed in combination with other types of program modules. Generally, program modules include routines, programs, components, data structures, and other types of structures that perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that the subject matter described herein may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like.


In the following detailed description, references are made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments or examples. Referring now to the drawings, in which like numerals represent like elements throughout the several figures, aspects of a computing system and methodology for optimized joint document review will be described.



FIG. 1 is a software and network architecture diagram showing one illustrative operating environment 100 for the embodiments disclosed herein. The illustrative operating environment 100 shown in FIG. 1 includes a number of computing devices that can be utilized by a group of people, who may be referred to herein as “reviewers” or “document reviewers”, to jointly review an electronic document 120. For instance, in the example shown in FIG. 1, a smartphone 110, tablet computer 112, and a desktop or laptop computer 114 are being utilized to jointly review the document 120. Additionally, in the example shown in FIG. 1, a reviewer 104 is interacting with a large-format display device 102 in order to review the document. Other reviewers might also be present at the location of the large-format display device 102. The large-format display device 102 is driven by a large-format display controller 106. The large-format display controller 106 may be a computer system configured appropriately for driving the large-format display device 102.


As shown in FIG. 1, the large-format display controller 106, smartphone 110, tablet computer 112, and desktop/laptop computer 114 are each configured to execute an editing program 116. The editing program 116 is a program configured for creating and editing the document 120. As will be described in greater detail below, the editing program 116 is also configured in embodiments herein for providing functionality for allowing the document 120 to be jointly reviewed by users of the large-format display device 102, smartphone 110, tablet computer 112, and desktop/laptop computer 114. According to implementations, the editing program 116 may be a program for creating and editing word processing documents, presentation documents, spreadsheet documents, or other types of electronic documents. It should be appreciated that different versions of the editing program 116 can be executed on each of the devices shown in FIG. 1. The different versions may be configured to interoperate in the manner disclosed herein.


In one implementation, a meeting server 108 is utilized to coordinate the review of the document 120. In this regard, the meeting server 108 may execute a meeting service 118. Through the use of the meeting service 118, a meeting may be scheduled for reviewing the document 120. Appropriate invitations may be sent to the individuals that will be reviewing the document 120. The meeting server 108 might also provide other functionality for facilitating, scheduling, and managing a document review meeting. It should be appreciated that while the document 120 is illustrated in FIG. 1 as being stored at the meeting server 108, the document 120 may be stored in any location accessible to the editing program 116. It should also be appreciated that the embodiments disclosed herein might be implemented without the use of the meeting server 108.


It should be appreciated that each of the computing devices illustrated in FIG. 1 may provide various types of user input mechanisms and various output capabilities. For instance, the large-format display device 102 may include touch input capabilities allowing the reviewer 104 to provide input to the editing program 116 executing on the large-format display controller 106 by touching the large-format display device 102. In this regard, the editing program 116 might support the use of touch gestures to indicate various types of input.


The smartphone 110 might also support touch input, but typically includes a small-scale display screen. In other embodiments, the smartphone 110 might not provide touch input. The tablet computer 112 might provide touch input on a medium-sized display screen, and the desktop/laptop computer 114 may or may not be equipped with touch input and may provide a medium-sized display.


As will be described in greater detail below, the editing program 116 provides a UI for presenting and collaboratively reviewing the document 120 that is optimized based upon the type of computing device utilized to present the document. For instance, in one implementation, a UI is provided for document review that is customized for use on the large-format display device 102. In alternative embodiments, the UI might be customized for use on the smartphone 110, the tablet computer 112, and the desktop/laptop computer 114.


As will also be described in greater detail below, the editing program 116 executing on each of the devices in FIG. 1 may be configured to provide an indication of the location within the document 120 currently under review by a user of the respective device. UIs may be provided herein that identify the portions of the document 120 under review by each reviewer and that permit easy navigation to the portions of the document currently under review. Moreover, UIs are provided that permit one reviewer to “push” their location in a document to other reviewers, thereby synchronizing the other reviewers' locations in the document with their own. Additionally, the view shown to each reviewer might automatically follow along with the location of another reviewer. Additionally, UIs will be described herein for emphasizing portions of the document 120 when used in a group review session setting, such as through the use of the large-format display device 102. Additional details regarding these UIs will be provided below with reference to FIGS. 2-5.
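By way of illustration only, the location-sharing and “push” behaviors described above might be represented with message shapes along the following lines. This is a minimal sketch assuming a hypothetical JSON-over-WebSocket protocol; the type names, fields, and endpoint are illustrative and are not specified by this disclosure.

```typescript
// Hypothetical message shapes for sharing review locations between devices.
// The patent does not specify a wire format; these names are illustrative.
type ReviewActivity = "reading" | "editing";
type DeviceType = "phone" | "tablet" | "desktop" | "large-format-display";

interface LocationUpdate {
  kind: "location-update";  // periodic report of a reviewer's current position
  reviewerId: string;
  activity: ReviewActivity;
  device: DeviceType;
  documentOffset: number;   // e.g. a paragraph index or character offset
}

interface LocationPush {
  kind: "location-push";    // one reviewer "pushes" their location to the others
  reviewerId: string;
  documentOffset: number;
}

type ReviewMessage = LocationUpdate | LocationPush;

// Push this reviewer's current location so other views can synchronize to it.
function pushLocation(socket: WebSocket, reviewerId: string, documentOffset: number): void {
  const msg: LocationPush = { kind: "location-push", reviewerId, documentOffset };
  socket.send(JSON.stringify(msg));
}
```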


It should be appreciated that although FIG. 1 illustrates four computing devices executing the editing program 116, many other such devices might be utilized with the concepts and technologies presented herein. Additionally, it should be appreciated that while a single network is illustrated in FIG. 1 connecting the computing devices, many other such networks might be utilized. Moreover, although a meeting server 108 and meeting service 118 have been illustrated in FIG. 1, it should be appreciated that the embodiments described herein are not dependent upon the use of a meeting server 108 or a meeting service 118 to organize and facilitate a review meeting.



FIG. 2 is a UI diagram showing one illustrative UI disclosed herein for optimized document presentation and navigation on a large-format display device 102, according to one embodiment disclosed herein. In particular, FIG. 2 shows a UI 200 suitable for joint review of a document on the large-format display device 102. In the particular embodiment shown in FIG. 2, the UI 200 includes three UI panes 202A-202C. The UI pane 202A includes a view of the document 120 that is sized and formatted for display on the large-format display device 102. For instance, a font style and size may be selected that are appropriate for viewing the document 120 from a distance. Additionally, the view of the document 120 shown in the pane 202A is simplified to remove pagination and other elements traditionally associated with a paper document. In this regard, the UI 200 also does not include conventional UI tools for editing a document. In this manner, the UI 200 is focused on review of the document 120 rather than editing of the document 120.


The pane 202A may be utilized to navigate the contents of the document 120. For instance, a touch enabled large-format display device 102 may be utilized to make input into the pane 202A for scrolling the visible portion of the document 120 up or down. As an example, a “swipe” touch gesture might be utilized to scroll the view of the document 120 shown in the pane 202A. Other types of gestures might also be made into the pane 202A to scroll the view of the document 120 visible within the pane 202A.
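As a rough sketch of the touch navigation described above, the following fragment scrolls a document pane in response to a vertical drag using standard pointer events; the element id "document-pane" is a hypothetical stand-in for the pane 202A.

```typescript
// Scroll a document pane with a vertical drag ("swipe") using pointer events.
// "document-pane" is a hypothetical element id standing in for the pane 202A.
const pane = document.getElementById("document-pane") as HTMLElement;

let lastY: number | null = null;

pane.addEventListener("pointerdown", (e: PointerEvent) => {
  lastY = e.clientY;
  pane.setPointerCapture(e.pointerId);
});

pane.addEventListener("pointermove", (e: PointerEvent) => {
  if (lastY === null) return;
  pane.scrollTop += lastY - e.clientY; // dragging upward advances the view
  lastY = e.clientY;
});

pane.addEventListener("pointerup", () => {
  lastY = null;
});
```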


According to one implementation, no editing of the document 120 is permitted within the pane 202A. According to another embodiment, simple editing functionality may be provided within the pane 202A, such as functionality for generating an ink drawing, highlighting a portion of the document 120, or inserting comments into the document 120. In general, it should be appreciated, however, that the pane 202A is configured in a manner so as to avoid inadvertent modification of the document 120.


As shown in FIG. 2 and described briefly above, the UI 200 also includes a UI pane 202B. The UI pane 202B includes indicators 204A-204C for each of the reviewers of the document 120. Each of the indicators 204A-204C includes an image 206A-206C of the corresponding reviewer, text indicating the type of review being performed (e.g. reading or editing), and an indication of the type of device being utilized (e.g. a tablet or a phone). For instance, the indicator 204A indicates that a reviewer named “Joe” is reading the document 120 from his smartphone 110. The indicator 204B indicates that a reviewer named “Jeff” is editing the section of the document 120 shown in the pane 202A from his computer 114. The indicator 204C indicates that a reviewer named “Nathan” is reading the document 120 on a tablet computer 112. It should be appreciated that while three indicators 204A-204C are illustrated in FIG. 2, more or fewer indicators may be shown in the pane 202B depending on the number of individuals currently reviewing the document 120. Additionally, it should be appreciated that an appropriate user input mechanism might be utilized to scroll the contents of the pane 202B when the number of indicators 204A-204C exceeds the available space within the pane 202B.


According to one implementation disclosed herein, the indicators 204A-204C may be selected in order to navigate the view of the document 120 shown in the pane 202A to the portion of the document currently being reviewed by the corresponding reviewer. For instance, the indicator 204A may be selected. In response thereto, the portion of the document 120 shown in the pane 202A may be modified to show the portion of the document 120 currently being read by the user named “Joe”. Likewise, the indicator 204C might be selected in order to cause the view of the document 120 shown in the pane 202A to reflect the portion of the document 120 being read by the user named “Nathan” on his tablet 112. In this manner, the indicators 204A-204C can be utilized to quickly navigate to the portion of the document 120 under review by each of the reviewers.
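The indicator data and the navigate-on-selection behavior might be modeled as follows. This is a sketch under assumed names; the mapping from a reviewer's reported offset to a scroll position is simplified for illustration.

```typescript
// Illustrative model of a reviewer indicator (cf. indicators 204A-204C) and
// the navigate-on-selection behavior. All names here are assumptions.
interface ReviewerIndicator {
  reviewerId: string;
  name: string;                    // e.g. "Joe"
  imageUrl: string;                // picture shown in the indicator
  activity: "reading" | "editing"; // type of review being performed
  device: "phone" | "tablet" | "desktop" | "large-format-display";
  documentOffsetPx: number;        // last reported position, in rendered pixels
}

// Scroll the first pane so the selected reviewer's position is visible.
function onIndicatorSelected(pane: HTMLElement, indicator: ReviewerIndicator): void {
  // Center the reviewer's reported position within the visible region.
  pane.scrollTop = indicator.documentOffsetPx - pane.clientHeight / 2;
}
```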


It should be appreciated that, in alternate embodiments, one reviewer can “push” their location in a document to other reviewers, thereby synchronizing the other reviewers' locations in the document with their own. For instance, a reviewer utilizing a mobile telephone to review a document could “push” their location in the document to other reviewers using desktop computers or other types of devices. Additionally, the view shown to each reviewer might automatically follow along with the location of another reviewer.


As also shown in FIG. 2 and described briefly above, the user interface 200 also includes a UI pane 202C. The UI pane 202C includes a scaled image 208, such as a “thumbnail” image, of the document 120 currently under review. The UI pane 202C might also include a bounding box 210 that indicates the portion of the document 120 currently being shown in the pane 202A. The user may select a portion of the scaled image 208 shown in the pane 202C. In response thereto, the portion of the document 120 corresponding to the selection made in the pane 202C will be displayed in the pane 202A. In this manner, virtually any portion of the document 120 can be quickly navigated to.
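Assuming the scaled image 208 is a uniformly scaled rendering of the document, the mapping between a tap on the thumbnail and a position in the pane 202A could be sketched as follows; the proportional mapping and function names are illustrative assumptions, not the disclosed method.

```typescript
// Map a tap on the scaled image 208 to a scroll position in the pane 202A,
// assuming the thumbnail is a uniformly scaled rendering of the document.
function navigateFromThumbnail(
  tapY: number,            // y coordinate of the tap, relative to the thumbnail
  thumbnailHeight: number, // rendered height of the scaled image, in pixels
  documentHeight: number,  // full rendered height of the document, in pixels
  pane: HTMLElement        // the pane showing the document
): void {
  const fraction = tapY / thumbnailHeight;
  // Center the selected portion in the visible region of the pane.
  pane.scrollTop = fraction * documentHeight - pane.clientHeight / 2;
}

// The bounding box 210 is the inverse mapping: project the pane's visible
// region back onto the thumbnail.
function boundingBoxTop(pane: HTMLElement, documentHeight: number, thumbnailHeight: number): number {
  return (pane.scrollTop / documentHeight) * thumbnailHeight;
}
```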


In one implementation, the pane 202C also includes a traditional scroll bar 212 in the event that the scaled image 208 exceeds the available space in the pane 202C. The scroll bar 212 may be utilized to scroll the view of the scaled image 208. The pane 202C might also include a UI control 214 for modifying the font size of the view of the document 120 shown in the pane 202A, a UI control 216 for hiding the pane 202B, a UI control for editing the document 120 in the editing program 116, and a UI control 220 for setting other options regarding the operation of the UI 200.


According to other embodiments, the pane 202C might include a visualization of sections or elements within the document 120. For instance, the pane 202C might include a visualization of the pages, headers, embedded content, comments, sections, or other elements within the document 120. By selecting any of the visualizations within the pane 202C, a user may quickly modify the view of the document 120 shown in the pane 202A to reflect the selected portion of the document. In this manner, the pane 202C may be utilized to quickly navigate to virtually any portion of the document 120. The pane 202C might also provide an indication of the portion of the document 120 currently being reviewed by each reviewer.


It should be appreciated that the UI elements disclosed herein, and the user experience provided for interacting with those panes, might also be customized based upon the type of device utilized by each reviewer. For example, when a sufficiently large display is available, each pane may be displayed separately or inline with the document being reviewed. On a mobile telephone, the panes may be displayed individually, and UI elements may be provided for allowing a user to navigate between the panes.



FIG. 3 is a UI diagram showing one illustrative UI disclosed herein for optimized document presentation and navigation on a desktop or laptop computer, according to one embodiment disclosed herein. In particular, FIG. 3 shows a UI 300 that may be displayed by a desktop/laptop computer 114 or another computing device having an average-sized display screen. In the UI shown in FIG. 3, a UI pane 302A includes indicators 304A-304C identifying the reviewers currently reviewing the document 120.


The indicators 304A-304C may include images or text identifying the user, the type of review in progress, the type of device upon which the review is being performed, and other information. The pane 302A might also include UI controls for performing other review and related tasks, such as a UI control 306 for screen sharing, a UI control 308 for requesting that other users open the document 120, a UI control 310 for opening a meeting, and a UI control 312 for opening other meeting files.


The UI 300 shown in FIG. 3 also includes a UI pane 302B. The UI pane 302B provides a view of the document 120. The view of the document 120 shown in the pane 302B may be navigated in a traditional fashion utilizing the scroll bar 316 or other user input mechanisms.


In the example shown in FIG. 3, the view of the document 120 shown in the pane 302B also includes the indicators 314A-314C. The indicators 314A-314C identify the reviewers currently reviewing the document shown in the pane 302B. Additionally, the indicators 314A-314C provide an indication as to the portions of the document 120 currently under review. For instance, in the example shown in FIG. 3, the indicator 314B includes an arrow pointing at the overview section of the document 120. This indicates that the user named “Jeff” is currently reviewing this portion of the document 120. The indicators 314A and 314C include arrows pointing off screen. These indicators show that the users named “Joe” and “Nathan” are currently reviewing portions of the document 120 that are below the portion shown in pane 302B. The indicators 314A-314C may be selected in order to navigate the view of the document 120 shown in the pane 302B to the portion of the document 120 currently under review by the respective reviewer.
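The off-screen arrows on the indicators 314A-314C suggest a simple rule: compare each reviewer's reported position with the visible region of the pane. A hedged sketch, with hypothetical names:

```typescript
// Decide which way an indicator's arrow should point, relative to the
// visible region of the document pane (a sketch; names are hypothetical).
type ArrowDirection = "at" | "up" | "down";

function arrowFor(pane: HTMLElement, reviewerOffsetPx: number): ArrowDirection {
  if (reviewerOffsetPx < pane.scrollTop) return "up";                       // above the view
  if (reviewerOffsetPx > pane.scrollTop + pane.clientHeight) return "down"; // below the view
  return "at"; // within view: point directly at the reviewer's location
}
```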



FIG. 4 is a UI diagram showing one illustrative UI disclosed herein for emphasizing a portion of a document, according to one embodiment disclosed herein. In particular, FIG. 4 shows a UI 400 that may be utilized within the UI 200 or the UI 300 to emphasize a portion of the document 120. In the UI 400, a portion 402A of the document 120 may be selected. For instance, selection may be made utilizing an appropriate touch-enabled input device. Once the portion 402A of the document 120 has been selected, a request may be received in order to emphasize the selected portion 402A. An appropriate UI gesture or other type of user input may be received in order to make such a request.


In response to receiving a request to emphasize the selected portion 402A, the editing program 116 is configured to obfuscate the other portions of the document 120 displayed in the UI 400. For instance, the portion 402B of the document 120 may be “blurred” or “fogged” over, thereby rendering the portion 402B of the document 120 unreadable. Additionally, the text size or other visual attributes of the portion 402A may be emphasized. For instance, the font size of text within the portion 402A may be increased. Additionally, separators 404A-404B may be displayed to further distinguish the portion 402A of the document 120 from the remaining portion 402B of the document 120. It should be appreciated that other types of visual effects might also be applied to the non-selected portion 402B of the document 120. A UI control 406 might also be displayed which, when selected, will cause the emphasis applied to the portion 402A to be removed from the UI 400. In this manner, the attention of the reviewers may be focused on the selected portion 402A of the document 120.
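In a web-based implementation, the “blur”/“fog” obfuscation described above could be approximated with a CSS filter, as in the following sketch; the class name and function names are hypothetical, and this disclosure does not prescribe this particular mechanism.

```typescript
// Emphasize one portion of the document by obfuscating its siblings with a
// CSS blur. ".doc-portion" is a hypothetical class for each selectable portion.
function emphasizePortion(selected: HTMLElement): void {
  document.querySelectorAll<HTMLElement>(".doc-portion").forEach((p) => {
    // Blur every non-selected portion, rendering it unreadable.
    p.style.filter = p === selected ? "none" : "blur(4px)";
  });
  selected.style.fontSize = "larger"; // optionally enlarge the emphasized text
}

// Counterpart of the UI control 406: remove the emphasis again.
function removeEmphasis(): void {
  document.querySelectorAll<HTMLElement>(".doc-portion").forEach((p) => {
    p.style.filter = "none";
    p.style.fontSize = "";
  });
}
```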


It should be appreciated that the user interfaces as illustrated in FIGS. 2-4 and described above are merely illustrative and that other implementations may be utilized. It should be further appreciated that the various UI components illustrated in FIGS. 2-4 and described above may be used in combination. For instance, the UI 400 may be utilized in the UI pane 202A shown in FIG. 2 or the UI pane 302B shown in FIG. 3. Similarly, the indicators 314A-314C might be utilized in the UI pane 202A or the UI pane 202C.



FIG. 5 is a flow diagram showing a routine 500 that illustrates aspects of one process disclosed herein for optimized document presentation and navigation for presenting a document in a meeting. It should be appreciated that the logical operations described herein with respect to FIG. 5 and the other FIGURES are implemented (1) as a sequence of computer implemented acts or program modules running on a computing system and/or (2) as interconnected machine logic circuits or circuit modules within the computing system. The implementation is a matter of choice dependent on the performance and other requirements of the computing system. Accordingly, the logical operations described herein are referred to variously as operations, structural devices, acts, or modules. These operations, structural devices, acts and modules may be implemented in software, in firmware, in special purpose digital logic, and any combination thereof. It should also be appreciated that more or fewer operations may be performed than shown in the figures and described herein. These operations may also be performed in a different order than those described herein.


The routine 500 begins at operation 502, where the editing program 116 is launched on the various devices utilized to review the document 120. For instance, as described above, the editing program 116 may be launched on the large-format display controller 106, the smartphone 110, the tablet computer 112, and the desktop/laptop computer 114. Once the editing program 116 has been launched, the routine 500 proceeds to operation 504.


At operation 504, the editing program 116 retrieves the document 120 to be jointly reviewed. The routine 500 then proceeds to operation 506 where the editing program 116 identifies the device type, display form factor, and input devices of the device upon which it is executing. Once these items have been identified, the routine 500 then proceeds to operation 508 where the editing program 116 provides a user interface customized for reviewing the document 120 that is suitable for the display and user input mechanisms provided by the identified device. For instance, as described above, the UI 200 shown in FIG. 2 may be provided by the editing program 116 executing on the large-format display controller 106. Similarly, the UI shown in FIG. 3 and described above may be provided by the editing program 116 executing on the desktop/laptop computer 114. Additionally, various aspects of the UIs described herein might be provided by a suitable UI on the smartphone 110 and the tablet computer 112.
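Operations 506 and 508 might be realized with coarse device heuristics such as the following; the width thresholds and layout names are assumptions for illustration, not the disclosed method.

```typescript
// Choose a review UI layout from coarse device characteristics (a sketch;
// the thresholds and layout names are illustrative assumptions).
type Layout = "large-format" | "desktop" | "tablet" | "phone";

function chooseLayout(screenWidthPx: number, hasTouch: boolean): Layout {
  if (screenWidthPx >= 3000) return "large-format";         // wall display or projector
  if (screenWidthPx >= 1200 && !hasTouch) return "desktop";
  if (screenWidthPx >= 700) return "tablet";
  return "phone";
}

// In a browser, the inputs could come from standard APIs:
const layout = chooseLayout(window.screen.width, navigator.maxTouchPoints > 0);
console.log(`Using the ${layout} review UI`);
```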


From operation 508, the routine 500 proceeds to operation 510. At operation 510, the editing program 116 executing on each device provides an indication to the other devices as to the location in the document currently being reviewed. Each of the editing programs 116 receives this information and provides an indication as to the location in the document of the other reviewers. For instance, the contents of the pane 202B described above with reference to FIG. 2 may be provided indicating the location of reviewers within the document 120. Similarly, the indicators 314A-314C may be displayed showing the location within the document 120 currently being reviewed by the other reviewers.
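Operation 510's exchange of reviewer locations could be sketched as a periodic broadcast paired with a handler for incoming updates, echoing the message shapes sketched earlier; the endpoint, interval, and identifiers are hypothetical.

```typescript
// Periodically broadcast this reviewer's location and react to the locations
// reported by others. The endpoint, interval, and identifiers are hypothetical.
interface LocationUpdate {
  kind: "location-update";
  reviewerId: string;
  documentOffset: number;
}

const socket = new WebSocket("wss://example.invalid/review-session");
const me = "reviewer-123"; // placeholder identifier for the local reviewer

function currentOffset(): number {
  // Placeholder: in practice, derive this from the pane's scroll position.
  return 0;
}

setInterval(() => {
  if (socket.readyState !== WebSocket.OPEN) return;
  const msg: LocationUpdate = {
    kind: "location-update",
    reviewerId: me,
    documentOffset: currentOffset(),
  };
  socket.send(JSON.stringify(msg));
}, 2000); // report every two seconds

socket.onmessage = (event: MessageEvent) => {
  const update = JSON.parse(event.data) as LocationUpdate;
  if (update.reviewerId !== me) {
    // Here the corresponding indicator (e.g. in pane 202B) would be updated.
    console.log(`${update.reviewerId} is at offset ${update.documentOffset}`);
  }
};
```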


From operation 510, the routine 500 proceeds to operation 512 where the editing program 116 also provides other UI features for navigating the document 120 and emphasizing portions of the document 120 for review. These features may be based upon the particular device type upon which each instance of the editing program 116 is executing. These features were described above with reference to FIGS. 2-4. From operation 512, the routine 500 proceeds to operation 514, where it ends.



FIG. 6 is a computer architecture diagram showing an illustrative computer hardware and software architecture for a computing system capable of implementing the various embodiments presented herein. The computer architecture shown in FIG. 6 illustrates a conventional desktop computer, laptop computer, or server computer and may be utilized to execute the various software components described herein.


The computer architecture shown in FIG. 6 includes a central processing unit 602 (“CPU”), a system memory 608, including a random access memory 614 (“RAM”) and a read-only memory (“ROM”) 616, and a system bus 604 that couples the memory to the CPU 602. A basic input/output system (“BIOS”) containing the basic routines that help to transfer information between elements within the computer 600, such as during startup, is stored in the ROM 616. The computer 600 further includes a mass storage device 610 for storing an operating system 618, application programs, and other program modules, which will be described in greater detail below.


The mass storage device 610 is connected to the CPU 602 through a mass storage controller (not shown) connected to the bus 604. The mass storage device 610 and its associated computer-readable storage media provide non-volatile storage for the computer 600. Although the description of computer-readable media contained herein refers to a mass storage device, such as a hard disk or CD-ROM drive, it should be appreciated by those skilled in the art that computer-readable storage media can be any available computer storage media that can be accessed by the computer 600.


By way of example, and not limitation, computer-readable storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. For example, computer-readable storage media includes, but is not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other solid state memory technology, CD-ROM, digital versatile disks (“DVD”), HD-DVD, BLU-RAY, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transitory medium which can be used to store the desired information and which can be accessed by the computer 600.


It should be appreciated that the computer-readable media disclosed herein also encompasses communication media. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer-readable media. Computer-readable storage media does not encompass communication media.


According to various embodiments, the computer 600 may operate in a networked environment using logical connections to remote computers through a network such as the network 620. The computer 600 may connect to the network 620 through a network interface unit 606 connected to the bus 604. It should be appreciated that the network interface unit 606 may also be utilized to connect to other types of networks and remote computer systems. The computer 600 may also include an input/output controller 612 for receiving and processing input from a number of other devices, including a keyboard, mouse, or electronic stylus (not shown in FIG. 6). Similarly, an input/output controller may provide output to a display screen, a printer, or other type of output device (also not shown in FIG. 6).


As mentioned briefly above, a number of program modules and data files may be stored in the mass storage device 610 and RAM 614 of the computer 600, including an operating system 618 suitable for controlling the operation of a networked desktop, laptop, or server computer. The mass storage device 610 and RAM 614 may also store one or more program modules. In particular, the mass storage device 610 and the RAM 614 may store the document editing program 116, the meeting service 118, and/or the other software components described above. The mass storage device 610 and RAM 614 may also store other program modules and data, such as the document 120.


In general, software applications or modules may, when loaded into the CPU 602 and executed, transform the CPU 602 and the overall computer 600 from a general-purpose computing system into a special-purpose computing system customized to perform the functionality presented herein. The CPU 602 may be constructed from any number of transistors or other discrete circuit elements, which may individually or collectively assume any number of states. More specifically, the CPU 602 may operate as one or more finite-state machines, in response to executable instructions contained within the software or modules. These computer-executable instructions may transform the CPU 602 by specifying how the CPU 602 transitions between states, thereby physically transforming the transistors or other discrete hardware elements constituting the CPU 602.


Encoding the software or modules onto a mass storage device may also transform the physical structure of the mass storage device or associated computer readable storage media. The specific transformation of physical structure may depend on various factors, in different implementations of this description. Examples of such factors may include, but are not limited to: the technology used to implement the computer readable storage media, whether the computer readable storage media are characterized as primary or secondary storage, and the like. For example, if the computer readable storage media is implemented as semiconductor-based memory, the software or modules may transform the physical state of the semiconductor memory, when the software is encoded therein. For example, the software may transform the states of transistors, capacitors, or other discrete circuit elements constituting the semiconductor memory.


As another example, the computer readable storage media may be implemented using magnetic or optical technology. In such implementations, the software or modules may transform the physical state of magnetic or optical media, when the software is encoded therein. These transformations may include altering the magnetic characteristics of particular locations within given magnetic media. These transformations may also include altering the physical features or characteristics of particular locations within given optical media, to change the optical characteristics of those locations. Other transformations of physical media are possible without departing from the scope and spirit of the present description, with the foregoing examples provided only to facilitate this discussion.


Based on the foregoing, it should be appreciated that technologies for optimized joint document review have been presented herein. Although the subject matter presented herein has been described in language specific to computer structural features, methodological acts, and computer readable media, it is to be understood that the invention defined in the appended claims is not necessarily limited to the specific features, acts, or media described herein. Rather, the specific features, acts and mediums are disclosed as example forms of implementing the claims.


The subject matter described above is provided by way of illustration only and should not be construed as limiting. Various modifications and changes may be made to the subject matter described herein without following the example embodiments and applications illustrated and described, and without departing from the true spirit and scope of the present invention, which is set forth in the following claims.

Claims
  • 1. A computer-implemented method comprising: displaying a first user interface element comprising a view of a document, wherein a first selection of a portion of the document in the first user interface element causes the display of portions of the document other than the selected portion to be obfuscated in the first user interface element; displaying a second user interface element comprising an indicator for each of two or more reviewers of the document that are currently reviewing the document, each indicator comprising an image that identifies each respective reviewer and text indicating both a type of review being performed within the document by each respective reviewer and a device type being utilized by each respective reviewer, the type of review comprising editing the document or reading the document, wherein a second selection of an indicator of a respective reviewer causes the view of the document being displayed in the first user interface element to be modified to display another view of the document comprising a portion of the document currently being reviewed by the respective reviewer; and displaying a third user interface element comprising a scaled image of the document, the third user interface element including a bounding box that indicates the portion of the document currently being displayed in the first user interface element, the first user interface element, the second user interface element, and the third user interface element concurrently displayed in a single user interface.
  • 2. The computer-implemented method of claim 1, wherein the selection of a portion of the scaled image causes a portion of the document corresponding to the selected portion of the scaled image to be displayed in the first user interface element.
  • 3. The computer-implemented method of claim 1, wherein the third user interface element comprises a view of one or more sections of the document, and wherein a third selection of a section of the document in the third user interface element causes a portion of the document corresponding to the selected section to be displayed in the first user interface element.
  • 4. The computer-implemented method of claim 1, wherein the third user interface element provides an indication of recent viewing or editing activity within the document by the two or more reviewers.
  • 5. The computer-implemented method of claim 1, wherein the reviewers of the document are reviewing the document on a plurality of different device types.
  • 6. The computer-implemented method of claim 5, wherein the first user interface element, the second user interface element, and the third user interface element can be navigated using touch input.
  • 7. A system comprising: at least a memory and a processor configured to perform operations comprising: displaying a first user interface pane comprising a view of a document, wherein a first selection of a portion of the document in the first user interface pane causes the display of portions of the document other than the selected portion to be obfuscated in the first user interface pane; displaying a second user interface pane adjacent to the first user interface pane comprising an indicator for each of one or more reviewers of the document that are concurrently reviewing the document, each indicator providing an indication of a current viewing or editing activity within the document by each respective reviewer and a device type being utilized by each respective reviewer; and providing a third user interface pane adjacent to the second user interface pane comprising a scaled image of the document, wherein a second selection of an indicator of a respective reviewer in the second user interface pane causes the view of the document being displayed in the first user interface pane to be modified to display another view of the document comprising a portion of the document currently being reviewed by the respective reviewer, and wherein the third user interface pane further includes a bounding box that indicates the portion of the document currently being displayed in the first user interface pane.
  • 8. The system of claim 7, wherein the selection of a portion of the scaled image causes a portion of the document corresponding to the selected portion of the scaled image to be displayed in the first user interface pane.
  • 9. The system of claim 7, wherein the third user interface pane provides an indication of recent viewing or editing activity within the document by the one or more reviewers.
  • 10. The system of claim 9, wherein the reviewers of the document are reviewing the document on a plurality of different device types.
  • 11. The system of claim 7, wherein each indicator further includes an image that identifies each respective reviewer.
  • 12. A system comprising: one or more processors; and a memory comprising instructions stored thereon that, responsive to execution by the one or more processors, implements a large format display controller, the large format display controller configured to: display a first user interface pane comprising a view of a document sized and formatted for display on a large-format display device, wherein a selection of a portion of the document in the first user interface pane causes the display of portions of the document other than the selected portion to be obfuscated in the first user interface pane; display a second user interface pane adjacent to the first user interface pane comprising an indicator for each of two or more reviewers currently reviewing the document which, when selected, will cause the view of the document being displayed in the first user interface pane to be modified to display another view of the document comprising a portion of the document being reviewed by a corresponding reviewer, each indicator including an identifier of the reviewer, data identifying a type of review currently being performed by the reviewer, the type of review comprising editing the document or reading the document, and a device type being utilized by each corresponding reviewer; and provide a third user interface pane adjacent to the second user interface pane comprising a scaled image of the document which, when selected, will cause a portion of the document corresponding to the selected portion of the scaled image to be displayed in the first user interface pane, the third user interface pane including a bounding box that indicates the portion of the document currently being displayed in the first user interface pane.
  • 13. The system of claim 12, wherein the third user interface pane comprises a view of one or more sections of the document, and wherein additional selection of a section of the document in the third user interface pane causes a portion of the document corresponding to the selected section to be displayed in the first user interface pane.
  • 14. The system of claim 12, wherein the identifier of the reviewer comprises an image that identifies the reviewer.
  • 15. The system of claim 12, wherein the third user interface pane provides an indication of recent viewing or editing activity within the document by the two or more reviewers.
  • 16. The system of claim 12, wherein the two or more reviewers of the document are reviewing the document on at least two different device types.
  • 17. The system of claim 12, wherein the first user interface pane, the second user interface pane, and the third user interface pane can be navigated using touch input.
20130035853 Stout et al. Feb 2013 A1
20130091205 Kotler et al. Apr 2013 A1
20130091440 Kotler et al. Apr 2013 A1
20130091465 Kikin-Gil et al. Apr 2013 A1
20130097544 Parker et al. Apr 2013 A1
20130101978 Ahl et al. Apr 2013 A1
20130124978 Horns et al. May 2013 A1
20130125051 Kelley et al. May 2013 A1
20130132886 Mangini et al. May 2013 A1
20130246903 Mukai Sep 2013 A1
20140032481 Lang Jan 2014 A1
20140033088 Shaver Jan 2014 A1
20140207867 Kotler et al. Jul 2014 A1
Foreign Referenced Citations (12)
Number Date Country
1886977 Dec 2006 CN
101198976 Jun 2008 CN
101363739 Feb 2009 CN
101364886 Feb 2009 CN
101515226 Aug 2009 CN
101789871 Jul 2010 CN
1517260 Mar 2005 EP
04257046 Sep 1992 JP
2010176320 Aug 2010 JP
2005139793 Jun 2007 RU
WO-02061682 Aug 2002 WO
WO-2007092470 Aug 2007 WO
Non-Patent Literature Citations (131)
Pash, Adam, “Google Docs Updates With a Drawing Editor, Real-Time Collaboration, Speed”, Apr. 13, 2010, pp. 1-5.
Final Office Action, U.S. Appl. No. 12/473,206, (Dec. 7, 2011), 36 pages.
Non Final Office Action, U.S. Appl. No. 12/486,762, (Oct. 14, 2011), 24 pages.
“Online Calendar & Group Scheduling”, MOSAIC Technologies, retrieved from <http://www.webexone.com/Brandded/ID.asp?brandid=2348&pg=%20AppCalendar> on Apr. 24, 2009, 4 pages.
Ju, Wendy et al., “Where the Wild Things Work: Capturing Shared Physical Design Workspaces”, Stanford University, CSCW '04, (Nov. 6-10, 2004), pp. 533-541.
“CSS Max-width Property”, Retrieved From: http://web.archive.org/web/20070608101036/http://www.w3schools.com/, 2007, 1 page.
“Create Treemaps Using Easy Drag-and-drop Interactions”, Retrieved From: http://www.magnaview.nl/treemap/, 2010, 1 page.
“GeoTime”, Retrieved at: https://web.archive.org/web/20101219085705/http://www.geotime.com/Product/GeoTime-%281%29/Features---Benefits.aspx, 2009, 10 pages.
“The Beginner's Guide to Data Visualization”, Retrieved From: http://www.tableausoftware.com/beginners-data-visualization, 2010, 10 Pages.
“Foreign Office Action”, CN Application No. 201110436593.2, Jan. 6, 2014, 11 Pages.
“Collaboration within the Telepresence Experience”, Retrieved From: http://www.wrplatinum.com/downloads/11056.aspx, Jan. 2010, 11 Pages.
“Foreign Office Action”, CN Application No. 200980131157.5, Nov. 21, 2013, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/965,965, Jun. 4, 2012, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/253,886, Apr. 11, 2013, 13 pages.
“Final Office Action”, U.S. Appl. No. 11/260,515, Feb. 24, 2011, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/272,832, Aug. 12, 2013, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/472,101, Oct. 5, 2011, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/184,174, Feb. 4, 2011, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 11/260,515, Mar. 3, 2009, 16 pages.
“Final Office Action”, U.S. Appl. No. 12/472,101, Mar. 28, 2012, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/184,174, Sep. 25, 2013, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/965,965, Dec. 20, 2013, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 11/260,515, Sep. 30, 2010, 17 pages.
“Final Office Action”, U.S. Appl. No. 13/272,832, Dec. 30, 2013, 18 Pages.
“Non-Final Office Action”, U.S. Appl. No. 12/184,174, Mar. 13, 2012, 19 pages.
“Final Office Action”, U.S. Appl. No. 11/260,515, Dec. 11, 2009, 19 pages.
“Meeting Center Using Video in Your Meetings”, Retrieved From: http://www.oucs.ox.ac.uk/webex/Windows/Video.pdf, May 13, 2009, 2 Pages.
“Mindshift Innovation”, Retrieved From: http://mindshiftinnovation.blogspot.com/2007/09/seadragon.html, Oct. 4, 2007, 2 Pages.
“Datapoint version 1.1”, Retrieved From: http://www.filedudes.com/DataPoint-download-20853.html, 1997-2007, 2 Pages.
“Free PhotoMesa 3.1.2 (Windows)”, Retrieved From: https://web.archive.org/web/20071209231951/http://www.windsorinterfaces.com/photomesa.shtml, 2007, 2 Pages.
“ZuiPrezi Nonlinear Presentation Editor”, ZuiPrezi Ltd., http://zuiprezi.kibu.hu/, 2007, 2 pages.
“ProShow Producer Feature Overview”, Photodex Corporation: http://www.photodex.com/products/producer/features.html, 2008, 2 pages.
“Final Office Action”, U.S. Appl. No. 12/184,174, Sep. 6, 2011, 20 pages.
“Final Office Action”, U.S. Appl. No. 12/184,174, Nov. 20, 2012, 20 pages.
“Final Office Action”, U.S. Appl. No. 12/967,497, Dec. 3, 2013, 20 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/968,332, Dec. 5, 2013, 20 pages.
“An Overview of Aabel 3 Features”, Retrieved From: http://www.gigawiz.com/aabel3.html, Aug. 9, 2011, 21 pages.
“CounterPoint User Manual”, Retrieved From: http://www.cs.umd.edu/hcil/counterpoint/, 2005, 21 pages.
“Human and Technical Factors of Distributed Group Drawing Tools”, Retrieved From: http://grouplab.cpsc.ucalgary.ca/grouplab/uploads/Publications/Publications/1992-HumanTech.IWC.pdf, 1992, 29 Pages.
“CounterPoint: A Zooming Presentation Tool”, Retrieved From: http://web.archive.org/web/20050205082738/www.cs.umd.edu/hcil/counterpoint/, Feb. 5, 2005, 3 Pages.
“Freepath-Edu Nonlinear Presentation Software”, Grass Roots Software, 2008, 3 pages.
“Aquatic Sugar: The Children's Interface, Translated for Adults”, Retrieved From: http://www.olpcnews.com/software/operating_system/aquatic_sugar_childrens_interface.html, Nov. 7, 2007, 5 Pages.
“Extended European Search Report”, EP Application No. 09803312.9, Jul. 7, 2011, 6 pages.
“Foreign Office Action”, CN Application No. 200980131157.5, Jan. 30, 2013, 7 pages.
“Foreign Office Action”, CN Application No. 200980131157.5, Aug. 31, 2012, 7 pages.
“Visualize and Map SalesForce Leads with SpatialKey”, Retrieved From: http://web.archive.org/web/20101120170237/http://www.spatialkey.com/support/tutorials/visualize-and-map-salesforce-leads-with-spatialkey-part-ii, 2010, 7 Pages.
“Foreign Office Action”, CN Application No. 200980131157.5, Jul. 23, 2013, 8 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2009/046529, Nov. 30, 2009, 11 Pages.
Derthick, et al., “An Interactive Visualization Environment for Data Exploration”, Retrieved From: http://www.cs.cmu.edu/~sage/KDD97.html, Aug. 1997, 10 Pages.
Fernando, et al., “Narrowcasting Attributes for Presence Awareness in Collaborative Virtual Environments”, http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=4019930, 2006, 6 pages.
Geyer, et al., “Activity Explorer: Activity-centric Collaboration from Research to Product”, IBM Systems Journal, Retrieved From: http://www.research.ibm.com/journal/sj/454/geyer.html, 2006, 26 Pages.
Good, et al., “CounterPoint: Creating Jazzy Interactive Presentations”, Retrieved From: http://drum.lib.umd.edu/bitstream/1903/1121/2/CS-TR-4225.pdf, 2001-2003, 9 Pages.
Hewagamage, “Interactive Visualization of Spatiotemporal Patterns Using Spirals on a Geographical Map”, Proc. IEEE Symp. Visual Languages, 1999, 8 pages.
Hupfer, “Introducing Collaboration into an Application Development Environment”, Retrieved From: http://pnexpert.com/files/IBM_Contextual_Collaboration.pdf, Nov. 6-10, 2004, 4 Pages.
Izadi, et al., “Dynamo: A public interactive surface supporting the cooperative sharing and exchange of media”, Retrieved From: http://research.microsoft.com/pubs/132613/p159-izadi.pdf, 2003, 10 Pages.
Little, “High-End Business Intelligence with Data Visualization for WPF 4”, Retrieved From: http://www.codeproject.com/Articles/90591/High-End-Business-Intelligence-with-Data-Visualization, Jun. 29, 2010, 7 Pages.
Moran, et al., “Tailorable Domain Objects as Meeting Tools for an Electronic Whiteboard”, Retrieved From: http://pdf.aminer.org/000/121/871/tailorable_domain_objects_as_meeting_tools_for_an_electronic_whiteboard.pdf, 1998, 10 Pages.
Nelson, “Just Around the Corner: Visual Fusion 4.5”, Retrieved From: http://www.idvsolutions.com/Company/Newsletters/2009/Q3/Vfx45Silverlight.aspx, Sep. 30, 2009, 6 Pages.
Shaw, “Create Pan and Zoom Effects in PowerPoint”, Retrieved From: http://office.microsoft.com/en-us/powerpoint-help/create-pan-and-zoom-effects-in-powerpoint-HA010232631.aspx, 2007, 13 Pages.
Thomas, et al., “Through-Walls Collaboration”, Retrieved From: http://www.tinmith.net/papers/piekarski-pervasive-2009.pdf, 2009, 8 Pages.
Wempen, “PowerPoint 2007 Bible”, John Wiley & Sons, Feb. 27, 2007, 27 pages.
Weverka, “PowerPoint 2007 All-in-One Desk Reference for Dummies”, Published by Wiley Publishing, Jan. 2007, 8 pages.
Final Office Action, U.S. Appl. No. 12/486,762, (Feb. 8, 2012), 28 pages.
Final Office Action, U.S. Appl. No. 12/978,308, (Apr. 9, 2013), 21 pages.
Non-Final Office Action, U.S. Appl. No. 12/486,762, (Feb. 14, 2013), 29 pages.
“Adobe Connect”, Retrieved from: <http://www.adobe.com/acom/connectnow/> on Oct. 11, 2010, (Sep. 16, 2010), 3 pages.
“Adobe ConnectNow”, Retrieved from: <http://www.adobe.com/acom/connectnow/> on Oct. 13, 2010, (2010), 6 pages.
“Cisco Context-Aware Mobility Solution: Presence Applications”, retrieved from https://www.cisco.com/en/US/solutions/collateral/ns340/ns394/ns348/ns788/brochure_c22-497557.html on Sep. 7, 2010, 5 pages.
“Description for SharePoint Meeting Manager”, Retrieved from: <http://www.softpicks.net/software/Business/Project-Management/SharePoint-Meeting-Manager-47146.htm> on Oct. 11, 2010, (Jul. 27, 2009), 2 pages.
“GoToMeeting”, Retrieved from: <http://www.gotomeeting.com/fec/online_meeting> on Oct. 11, 2010, 1 page.
“Meet mimio—The Digital Meeting Assistant”, Mayflower Business Systems Limited; http://www.kda.co.uk/mimio1/whitepaper.html, (May 1999), 10 pages.
“Meeting Management Software”, Retrieved from: <http://workingsmarter.typepad.com/my_weblog/2004/12/meeting_managem.html> on Oct. 11, 2010 (Dec. 10, 2004), 2 pages.
“Microsoft Office Communicator 2007 Getting Started Guide”, retrieved from http://www.ittdublin.ie/media/Media,22233,en.pdf, (Jul. 2007), 77 pages.
“Microsoft® Office Live Meeting Feature Guide”, Microsoft Corporation, Available at <http://download.microsoft.com/download/8/0/3/803f9ba6-5e12-4b40-84d9-d8a91073e3dc/LiveMeeting.doc>, (Jan. 2005), pp. 1-17.
Non-Final Office Action, U.S. Appl. No. 12/473,206, (May 19, 2011), 28 pages.
Adams, Lia et al., “Distributed Research Teams: Meeting Asynchronously in Virtual Space”, Institute of Electrical and Electronics Engineers, (1999), 17 pages.
Bell, David et al., “Sensory Semantic User Interfaces (SenSUI) (position paper)”, Fluidity Research Group; Brunel University, (Oct. 20, 2009), 14 pages.
Bunzel, Tom “Using Quindi Meeting Capture”, retrieved from http://www.informit.com/guides/content.aspx?g=msoffice&seqNum=220, (Sep. 1, 2006), 3 pages.
Fruchter, Renate “Brick & Bits & Interaction (BBI)”, http://www.ii.ist.i.kyoto-u.ac.jp/sid/sid2001/papers/positions/brickbitsinteraction.pdf, (2001), 4 pages.
Ionescu, Arna et al., “Workspace Navigator: Tools for Capture, Recall and Reuse using Spatial Cues in an Interactive Workspace”, Stanford Technical Report TR2002-04, http://bcj.stanford.edu/research/wkspcNavTR.pdf, (2002), 16 pages.
Kim, Hyun H., et al., “SmartMeeting: CMPT 481/811 Automatic Meeting Recording System”, http://www.cs.usask.ca/grads/hyk564/homePage/811/CMPT%20811%20final.doc, (2004), 7 pages.
Mitrovic, Nikola et al., “Adaptive User Interface for Mobile Devices”, retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.140.4996&rep=rep1&type=pdf, (2002), 15 pages.
Peddemors, A.J.H. et al., “Presence, Location and Instant Messaging in a Context-Aware Application Framework”, retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.98.3321&rep=rep1&type=pdf; 4th International Conference on Mobile Data Management, MDM, (2003), 6 pages.
Rudnicky, Alexander I., et al., “Intelligently Integrating Information from Speech and Vision to Perform Light-weight Meeting Understanding”, retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.126.1733&rep=rep1&type=pdf, (Oct. 2005), 6 pages.
Watson, Richard “What is Mobile Presence?”, Retrieved from http://reseller.tmcnet.com/topics/unified-communications/articles/54033-what-mobile-presence.htm, (Apr. 10, 2009), 4 pages.
Yu, Shoou-Jong et al., “Who Said What When? Capturing Important Moments of a Meeting”, retrieved from http://repository.cmu.edu/cgi/viewcontent.cgi?article=1003&context=silicon_valley; Technical Report, (Apr. 10-15, 2010), 7 pages.
Zenghong, Wu et al., “Context Awareness and Modeling in Self-Adaptive Geo-Information Visualization”, retrieved from http://icaci.org/documents/ICC_proceedings/ICC2009/html/refer/17_1.pdf on Aug. 30, 2010, 13 pages.
Non-Final Office Action, U.S. Appl. No. 12/968,332, (Jul. 23, 2012), 19 pages.
Final Office Action, U.S. Appl. No. 12/486,762, (Jun. 20, 2013), 42 pages.
Final Office Action, U.S. Appl. No. 12/968,332, (Aug. 1, 2013), 19 pages.
Non-Final Office Action, U.S. Appl. No. 12/967,497, (Jun. 20, 2013), 19 pages.
Bergmann, et al., “Automated Assistance for the Telemeeting Lifecycle”, Proceedings of the ACM conference on Computer supported cooperative work, (Oct. 1994), pp. 373-384.
Final Office Action, U.S. Appl. No. 12/965,965, (Nov. 8, 2012), 12 pages.
Non-Final Office Action, U.S. Appl. No. 12/978,308, (Aug. 31, 2012), 17 pages.
Karlson, et al., “Courier: A Collaborative Phone-Based File Exchange System”, Retrieved at <<http://docs.google.com/viewer?a=v&q=cache:Mb2OKecuT1kJ:citeseerx.ist.psu.edu/viewdoc/download%3Fdoi%3D10.1.1.146.360%26rep%3Drep1%26type%3Dpdf+collaborative+document+navigation+visual+display+participant+device&hl=en&pid=bl&srcid=ADGEESgArWqUU1B—J2heHCEm78A3YhBLNjwOzrUuQeMSHPm8FebYGzDX9mSFKGG6RLq1l3MU25cyntlHk5zlolmCFFyGe8wyfYgwMNhwzx8McZbUIL0Og1zr7WR7MwmX5lgeiRZXKTqj&sig=AHIEtbQ5pCA4H1qUtjsbbjNbvylgMMaXOg>>, Technical Report, MSR-TR-2008-05, Jan. 2008, 17 pages.
Werle, et al., “Active Documents Supporting Teamwork in a Ubiquitous Computing Environment”, Retrieved at <<http://docs.google.com/viewer?a=v&q=cache:iyt-5ZWZURYJ:citeseerx.ist.psu.edu/viewdoc/download%3Fdoi%3D10.1.1.157.4661%26rep%3Drep1%26type%3Dpdf+smart+layout+document+conference+meeting+where+participant+is+within+the+document&hl=en&pid=bl&srcid=ADGEEShctdCPK5oM1kGncxGqgHps9wl1DPOjAHtQXOxazPZIShLb—4JN551ty2XiA7lnx9CbbH6yaRfXouOdD0mDIRrXEHFs—r20A20tYaiZMCmPpOnB9pLAWciSDqjoADbz3LD2-saD&sig=AHIEtbRnWcfCqVctAPxz3qFSB2bmF9pxfg>>, In Proceedings of the PCC Workshop, Apr. 3-5, 2001, 4 pages.
“Foreign Office Action”, CN Application No. 201110436306.8, Feb. 8, 2014, 13 Pages.
Foreign Office Action, CN Application No. 201110436306.8, Nov. 15, 2014, 6 pages.
Foreign Office Action, CN Application No. 201110436635.2, Nov. 27, 2014, 11 pages.
Foreign Office Action, CN Application No. 201110443291.8, Nov. 21, 2014, 8 Pages.
Non-Final Office Action, U.S. Appl. No. 12/968,332, Oct. 9, 2014, 23 pages.
Final Office Action, U.S. Appl. No. 12/965,965, Jun. 5, 2014, 13 pages.
Foreign Notice of Allowance, RU Application No. 2011103151, Sep. 4, 2013, 18 pages.
Final Office Action, U.S. Appl. No. 12/968,332, Jul. 17, 2014, 23 pages.
Final Office Action, U.S. Appl. No. 13/253,886, Feb. 14, 2014, 26 Pages.
Final Office Action, U.S. Appl. No. 12/184,174, Aug. 11, 2014, 18 pages.
Foreign Office Action, CN Application No. 201110436306.8, Sep. 17, 2014, 7 Pages.
Foreign Office Action, CN Application No. 201110436593.2, Sep. 12, 2014, 12 Pages.
Foreign Office Action, CN Application No. 201110436635.2, May 27, 2014, 14 pages.
Foreign Office Action, CN Application No. 201110443291.8, Jan. 24, 2014, 12 Pages.
Foreign Office Action, CN Application No. 201110443291.8, Jul. 24, 2014, 10 Pages.
Non-Final Office Action, U.S. Appl. No. 12/472,101, Sep. 16, 2014, 10 pages.
Non-Final Office Action, U.S. Appl. No. 12/473,206, Jul. 31, 2014, 41 pages.
Non-Final Office Action, U.S. Appl. No. 12/965,965, Oct. 2, 2014, 14 pages.
Non-Final Office Action, U.S. Appl. No. 13/253,886, Aug. 14, 2014, 15 pages.
Non-Final Office Action, U.S. Appl. No. 14/225,234, Jul. 18, 2014, 5 pages.
Final Office Action, U.S. Appl. No. 12/967,497, Jul. 2, 2015, 24 pages.
Foreign Notice of Allowance, CN Application No. 201110436593.2, Jun. 4, 2015, 6 Pages.
Final Office Action, U.S. Appl. No. 12/965,965, Mar. 11, 2015, 17 pages.
Foreign Notice of Allowance, CN Application No. 201110436306.8, Apr. 1, 2015, 4 Pages.
Foreign Office Action, CN Application No. 201110436593.2, Mar. 16, 2015, 7 Pages.
Foreign Office Action, CN Application No. 201110436635.2, May 18, 2015, 14 Pages.
Non-Final Office Action, U.S. Appl. No. 12/473,206, Apr. 9, 2015, 55 pages.
Non-Final Office Action, U.S. Appl. No. 12/967,497, Mar. 13, 2015, 21 pages.
Notice of Allowance, U.S. Appl. No. 12/968,332, Apr. 10, 2015, 15 pages.
Foreign Office Action, CN Application No. 201110436635.2, Oct. 20, 2015, 12 Pages.
Final Office Action, U.S. Appl. No. 12/473,206, Oct. 8, 2015, 39 pages.
Notice on Reexamination, CN Application No. 201110443291.8, Aug. 24, 2015, 9 pages.
Notice on Reexamination, CN Application No. 201110443291.8, Jan. 4, 2016, 10 Pages.
Related Publications (1)
Number Date Country
20120159355 A1 Jun 2012 US