System and method of annotating presentation timeline with questions, comments and notes using simple user inputs in mobile devices

Information

  • Patent Grant
  • Patent Number
    11,429,781
  • Date Filed
    Tuesday, October 22, 2013
  • Date Issued
    Tuesday, August 30, 2022
  • CPC
    • G06F40/169
  • Field of Search
    • CPC
    • G06F17/241
    • G06F17/30044
    • G06F17/30817
    • G06F17/30056
    • G06F17/3089
    • G06F3/048
    • G06F40/169
    • G06F16/78
    • G06F16/7867
    • G06F16/40
    • G06F16/4393
    • G06F16/489
  • International Classifications
    • G06F40/169
Abstract
A system and method for presentation timeline annotation are provided. The system permits an audience member to interact with the presentation using simple inputs, such as key presses on a desktop computer or gestures on a mobile device, to mark a segment of the presentation timeline with various interactive annotation capabilities including, for example, a question, a comment, a rating or any of a large number of other interactive annotations.
Description
FIELD

The disclosure relates to web-based content presentations, where presentations can be broadly divided into three categories: live or recorded interactive webcasting, podcasting (video or audio), and slideshows. The disclosure more specifically relates to a system and method for allowing an audience member to provide interactive annotations to a presentation timeline using simple inputs.


BACKGROUND

Due to the lack of currently available non-interruptive question and answer (Q&A) capabilities (i.e., the presenter speaking at a large event or meeting cannot stop to view or answer every question while presenting), asking questions, commenting and taking notes during presentations to large audiences require audience members to regularly break out of the presentation to perform these trivial tasks. Very often these tasks are performed through widgets and forms, which in turn prevent a uniform experience for the audience.


From the point of view of archiving audience interactions and feedback, there exists no convenient way for an audience member to quickly go back to the segment of a presentation where a question was asked or a comment was written. Furthermore, for the presenter of a presentation to an audience, there is no convenient way to identify and address the segments of the presentation that raised the most questions, concerns or feedback from the audience.


In current state of the art presentation systems, audience interactions such as question and answering, commenting, reviews and ratings are enabled through third-party plugins or custom widgets that are embedded in the web page containing the presentation video, audio and slides. For example, there exist popular social commenting plugins from Facebook and Disqus. These currently available solutions are mostly general purpose solutions that have no special concern for the presentation experience or flow. For example, Facebook's commenting plugin can be embedded into any web page, usually at the bottom (or side) of the main content. New comments and questions get added to the bottom of the main thread or somewhere in the middle of sub-threads. In the process, the synchronization between the presentation timeline and the audience interactions is lost.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a diagram illustrating a method for asset acquisition for an online presentation event system;



FIG. 2 is a diagram illustrating an example of an online presentation system that may use the presentation timeline annotation system;



FIG. 3 illustrates a system architecture of the online presentation system shown in FIG. 2;



FIG. 4 is a functional diagram of the interacting components of the online presentation system in FIG. 3;



FIG. 5 is a diagram illustrating a presentation workflow;



FIG. 6 is a diagram illustrating an example of an online presentation client that may incorporate the presentation timeline annotation system;



FIG. 7 illustrates a set of media unit components and client components that implement a method for presentation timeline annotation;



FIG. 8 illustrates an example of an implementation of an event console manager/client that is part of the event presentation system;



FIG. 9 illustrates more details of the presentation timeline annotation system;



FIG. 10 illustrates a method for presentation timeline annotation; and



FIGS. 11-14 illustrate examples of the presentation timeline annotation user interface.





DETAILED DESCRIPTION OF ONE OR MORE EMBODIMENTS

The disclosure is particularly applicable to a system and method for annotating presentation timelines in a web casting system and it is in this context that the disclosure will be described. It will be appreciated, however, that the system and method have greater utility because they may be implemented in other manners that are within the scope of the disclosure and may be used for other presentation systems in which the annotation of presentation timelines is desirable. Furthermore, the system and method for annotating presentation timelines also may be used with audience participation widgets, in which case the presence, invocation, reuse and overall design and experience of using these widgets alongside a presentation may be changed drastically. Now, a disclosure of an on-line web-based presentation system is provided, wherein the on-line web-based presentation system may include the system and method for presentation timeline annotation.



FIG. 1 is a diagram illustrating a method 20 for asset acquisition for an online presentation event system. As shown, an audio/video or audio data source 22 is edited in step 24, if necessary, or is automatically captured. In step 26, the data source 22 is encoded. Alternatively, an automated phone-based recording source 28 is encoded in step 30. The encoded data may then be stored in a media database 32, such as in a RealMedia format 32a and/or a Windows Media format 32b. In this manner, a data source/piece of media is prepared for distribution using an event system, an example of which is shown in FIG. 2.



FIG. 2 is a diagram illustrating an event system 40 into which the presentation timeline annotation apparatus may be incorporated. The event system 40 may comprise an asset acquisition and event management portion 42, a database portion 44 and a distribution portion 46, wherein a piece of media/content 48 is input into the event system 40 in order to distribute that content/piece of media during the event. Generally, each element of the event system being described is implemented in software, wherein each portion may be one or more software modules and each software module may be a plurality of computer instructions being executed to perform a particular function/operation of the system. Each element of the system may thus be implemented as one or more computer resources, such as typical personal computers, servers or workstations, that have one or more processors, persistent storage devices and memory with sufficient computing power in order to store and execute the software modules that form the event system in accordance with the invention. The event system may generate an event that is provided to one or more event clients 52, wherein each client is a computing resource, such as a personal computer, workstation, cellular phone, personal digital assistant, wireless email device, telephone, etc., with sufficient computing power to execute the event client located on the client, wherein the client communicates with the event system over a wired or wireless connection.


In more detail, the asset acquisition and event management portion 42 may further comprise an asset acquisition portion 42a and an event management portion 42b, wherein the asset acquisition portion performs one or more of the following functions: recording of the piece of media/content, editing of the piece of media/content, encoding of the piece of media/content and asset tagging. The event management portion 42b further comprises an asset manager module 50a, an event manager module 50b, a presentation manager module 50c and an encoder controller 50d. The asset manager module 50a, prior to an event, imports/exports content/pieces of media into/from a library of media as needed and manages the assets for each event presentation. The event manager module 50b may perform actions/functions prior to and after an event. Prior to a particular event, the event manager module may reserve the event in the system (both resources and access points), set up an event console with which a user interacts to manage the event and then send messages to each recipient of the upcoming event with the details of how to access/operate the event. After a particular event, the event manager module 50b may permit a user to import an old event presentation into the system in order to re-use one or more pieces of the old event presentation. The presentation manager module 50c, during a particular event presentation, generates an event file with the slides, URLs and polls of the event presentation and provides it to the encoder controller to distribute the particular event presentation to the users. The encoder controller 50d encodes the event presentation stream to one or more distribution servers 54 that distribute the event presentation to the users.


As shown in FIG. 2, the database 44 may include data about each event, including the clients to which the event is being provided and the media associated with the event, one or more event users, the display of the particular event, the assets associated with the event, the metrics for the event and other event data. In combination with this data in the database for a particular event, operations and commands from the event manager module 42b are downloaded to the distribution servers 54 that distribute each event to each client 52 for the particular event over a distribution network 56. As shown, the event/presentation may be distributed to one or more different clients 52 that use one or more different methods to access the event. The clients 52 may include a client that downloads the presentation and then views the presentation offline.



FIG. 3 illustrates more details of the event system shown in FIG. 2. The event system may include a web server portion 60, an application server portion 62 and the database portion 40 (with the database 44) shown in FIG. 2. Each of these portions may be implemented as one or more computer resources with sufficient computing resources to implement the functions described below. In a preferred embodiment, each portion may be implemented as one or more well-known server computers. The web server portion 60 may further comprise one or more servlets 64 and a web container portion 66, which are both behind a typical firewall 68. In a preferred embodiment of the invention, the servlets reside on a BEA Weblogic system, which is commercially available, and may include an event registration servlet, an event manager module servlet, a presentation manager module servlet and an encoder controller servlet that correspond to the event manager module 50b, presentation manager module 50c and encoder controller 50d shown in FIG. 2. Each of these servlets implements the functions and operations described above for the respective portions of the system, wherein each servlet is a plurality of lines of computer code executed on a computing resource with sufficient computing power and memory to execute the operations. The servlets may communicate with the application server portion 62 using well-known protocols such as, in a preferred embodiment, the well-known remote method invocation (RMI) protocol. The servlets may also communicate with the web container portion 66, which is preferably implemented using a well-known Apache/Weblogic system. The web container portion 66 generates a user interface, preferably using Perl, Active Server Pages (ASP), HTML, XML/XSL, Java Applets, Javascript and Java Server Pages (JSPs). The web container portion 66 may thus generate a user interface for each client and the presentation manager module user interface. The user interface generated by the web container portion 66 may be output to the clients of the system through the firewall as well as to an application demo server 68 that permits a demo of any presentation to be provided.


The application server portion 62 may preferably be implemented using an Enterprise JavaBeans (EJB) container implemented using a BEA Weblogic product that is commercially sold. The application server portion 62 may be known as middleware and may include a media metric manager 70a, a chat manager 70b, a media URL manager 70c, an event manager 70d, a presentation manager 70e and an event administration manager 70f, which may each be a software application that performs the specified management operations. The application server portion 62 communicates with the database 44 using a protocol, such as the well-known Java Database Connectivity (JDBC) protocol in a preferred embodiment of the invention. The database 44 may preferably be implemented using an Oracle 8/9 database product that is commercially available. As shown, the database 44 may include media data including URL data, slide data, poll data and document data. The database 44 may further include metric data, event data and chat data, wherein the event data may further preferably include administration data, configuration data and profile data.



FIG. 4 is a diagram illustrating more details of the event database 44 in FIG. 3. As shown in FIG. 4, the database may generate data that is used to implement functions to reserve an event, configure an event, present an event, register users, provide the lobby for the event console, generate reports and archive an event. The database may include asset data 44a that may be provided to the asset manager module 50a, metrics data 44b that is provided to a metric module 72, event data 44c that is provided to the event manager module 50b, presentation data 44d that is provided to the presentation manager module 50c, event user data 44e that is provided to an event registration module 80, display element data 44f that is provided to an event consoles module 76 and email notification data 44g that is provided to an email alerts module 74. The database may also store data that is used by a reporting module 78 to generate reports about the events and presentations provided by the system. The database may also store data that is used by a syndication module 82 to syndicate and replicate existing presentations.



FIG. 5 is a diagram illustrating an event center 90 that may be utilized by one or more users 92 that are presented with a presentation by the system and one or more presenters 94 who utilize the system to present presentations to the users 92. The users 92 may interact with registration and lobby modules 80 that permit the users to register with the system and schedule a presentation to view. In response to a successful registration, the user may be presented with a player page 96, such as a web page provided to a client computer of the user, that provides the audio and visual data for the presentation, the slides, polls and URLs for the presentation, chat sessions and questions and answers for a particular presentation. The data in the player page 96 is provided by the web server 60, the media server 54 and a chat server 98 that provides the chat functionality for a presentation. The presentation data for a live event presentation is provided to the servers 54, 60 and 98 by the presentation manager module 50c. The presenters 94 may utilize the event manager module 50b to reserve an event and/or configure an event. Once the event is reserved and configured, the presentation data is forwarded to the presentation manager module 50c.



FIG. 6 is a diagram illustrating an example of an online presentation client 100 that may incorporate the presentation timeline annotation apparatus. The event client 100 may be implemented as a personal computer, workstation, PDA, cellular phone and the like with sufficient computing power to implement the functions of the client as described below. In the example shown in FIG. 6, the event client may be a typical personal computer that may further comprise a display unit 102, such as a CRT or liquid crystal display or the like, a chassis 104 and one or more input/output devices 106 that permit a user to interact with the client 100, such as, for example, a keyboard 106a and a mouse 106b. The chassis 104 may further include one or more processors 108, a persistent storage device 110, such as a hard disk drive, optical disk drive, tape drive, etc., and a memory 112, such as SRAM, DRAM or flash memory. In a preferred embodiment, the client is implemented as one or more pieces of software stored in the persistent storage device 110 and then loaded into the memory 112 to be executed by the processor(s) 108. The memory may further include an operating system 114, such as Windows, a typical browser application 116, such as Microsoft Internet Explorer, Mozilla Firefox or Netscape Navigator, and an event console module 118 (including slides, polls, surveys, URLs and Q&A) that operates within the browser application. The client side of the system/apparatus is implemented as HTML and Javascript code that is downloaded/streamed to the client 100 during/prior to each presentation so that the synchronization of the assets does not require separate client software to be downloaded to the client.



FIG. 7 illustrates a set of media unit components and client components that implement a method for reconstructing the timeline of a presentation. To implement the method, a set of media unit components (that may be implemented on one or more server computers) and a set of client 100 components interact with each other. In the method, the audio/video stream must anchor the timeline from the viewer's point of view. If media playback is delayed on the client side due to network latency, if the viewer pauses the player for some duration and then resumes playback, or if playback is temporarily paused by the player to allow buffering, then all other presentational events in the webcast must also be offset in time by the same amount, which is the goal of the timeline reconstruction method.


The presentation manager 50c, the web server/application server/database 98 and the streaming media servers 54 described above are part of the process. In addition, a synchronization unit 122, which may be implemented as one or more server computers, participates in the process. In particular, as the presenter's audio and video data is encoded (by an encoder that is part of the media unit) and broadcast (by the media server 54), the synchronization unit 122, acting as if it is client software that receives the media stream, receives the encoded media stream and demultiplexes it using a demultiplexer 124 that is part of the synchronization unit 122. As the raw media stream data is demuxed, timestamps are extracted by a timestamp extractor unit 126 of the synchronization unit 122. The timestamps may be a start time and/or end time for a particular portion of the stream and the like. The timestamps are captured together with informational markers in the stream data, enough information that the timestamp and its association with a particular piece of media data can be reconstructed and reassociated independently later on the client.


In the system, anything that is part of a presentation and is synchronized with the audio and video may have timestamps. Examples of timecoded serial data may include, but are not limited to: slide/PowerPoint flips; captions; animations such as a moving virtual pointer; any background/auxiliary information that should be displayed at specific times during the presentation; links to other resources that should be displayed at specific times, for example a link to a participation certificate after a viewing time has been reached; and polls or surveys that should be displayed at specific times.


After the timestamps are extracted, the stream is remuxed using a multiplexer 128 in the synchronization unit 122 and transmitted to each client who is viewing the particular presentation. Optionally at this step, the media data can additionally be transcoded using a transcoder 127 in the synchronization unit 122 into another format that is supported on the client device. While the re-encoded media stream is being sent to the client, the timeline synchronization information, media data association, and local server time are published (using a timeline publishing unit 129 associated with the synchronization unit 122) in a place and format suitable for retrieval separately by client software.
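
By way of a non-limiting illustration, the record published by the timeline publishing unit 129 might resemble the following JavaScript sketch; the field names and retrieval location are assumptions, not part of the disclosure:

    // Hypothetical shape of the published timeline synchronization record;
    // all field names are illustrative.
    var syncRecord = {
      streamId: 'webcast-101',        // identifies the media stream
      mediaTimestamp: 1381179374250,  // timestamp extracted from the demuxed stream (ms)
      serverLocalTime: 1381179374312, // local server time when the timestamp was captured (ms)
      marker: 'segment-0042'          // informational marker tying the timestamp to the media data
    };
    // Published at a known location, in a format (e.g. JSON) suitable for
    // separate retrieval by the client software.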


At a suitable time on the client, when the client requests the stream data, it also separately obtains the corresponding timestamp information that was previously extracted from the media stream and that corresponds to the current audio/video data, by retrieving it from the location where it was published. Then, at an appropriate time on the client side, when the media data has finished buffering and begins playback, the client captures, and saves for later use, the current local device time.


The local device time captured at that moment will be different from the timestamp captured on the server in the previous steps, but the two different times correspond to the same point on the presentation timeline. Those two timestamps and the difference between them, together with the time offset delay calculated from the transmission latency between client and server, can be used to determine the correct time delay to use when rendering all other presentational events of the webcast, such as slide transitions, so that they are rendered on the client in the same time positions as the speaker intended when the presentation was produced.
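
As a minimal sketch of this calculation (all variable names are assumptions), the client might map a presentation-timeline time to a local render time as follows:

    // serverTimestamp: media timestamp captured on the server (ms)
    // clientStartTime: local device time captured when playback begins (ms)
    // transmissionLatency: time offset delay from measured client/server latency (ms)
    function localRenderTime(t) {
      // map a presentation-timeline time t (ms) to a local device time
      return clientStartTime + (t - serverTimestamp) + transmissionLatency;
    }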


On each client that is participating in the presentation, the client reconstructs the event timelines by: 1) retrieving the presentation events from the web server/application server/database 98 of the media unit; 2) retrieving the media time data generated by the timeline publishing unit 129; 3) retrieving the re-encoded media stream; 4) generating the reconstructed presentation timeline as described above; and 5) synchronizing with the media player events in the browser 118 (such as the media player component, the question and answer/chat channel, any slides and any other presentation items). The event console manager on each audience member device may be implemented as a plurality of lines of computer code executed by the computer that perform the steps above.
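
A non-limiting sketch of steps 4) and 5) follows; scheduleEvents and renderEvent are hypothetical names, and localRenderTime is the mapping sketched above:

    // Schedule the non-media presentation events (slide flips, captions,
    // polls) on the reconstructed timeline; renderEvent stands in for the
    // browser components (slides, chat, polls) that consume each event.
    function scheduleEvents(events, renderEvent) {
      var now = Date.now();
      events.forEach(function (ev) {
        var delay = Math.max(0, localRenderTime(ev.presentationTime) - now);
        setTimeout(function () { renderEvent(ev); }, delay);
      });
    }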



FIG. 8 illustrates an example of an implementation of an event console manager/client 100 that is part of the event presentation system. The client performs a presentation preparation process 140 in which the client downloads the media data, retrieves the non-media presentation assets and retrieves the sync data. During a reconstruct timeline process 142, the client reconstructs the timeline of the presentation based on the media data, the non-media presentation assets and the sync data. Thus, for example, the timing of the slides, captions and polls of the presentation is determined, as shown pictorially in FIG. 8. Once the timeline is reconstructed, the client reconstructs the presentation (144) for one or more browsers and the presentation playback begins (146).



FIG. 9 illustrates more details of the presentation timeline annotation system 900. The presentation timeline annotation system 900 may further comprise one or more components on the backend side 98, 122 and one or more components on the client side 100. The components on the backend side and the client side may interact with each other over a link 901 to implement the presentation timeline annotation system. The link 901 may be a wired or wireless link, such as the Internet, a computer network, a wireless computer network, a cellular data network and the like.


The backend components may include a backend annotation manager 902 and the presentation database 44 described above and the client side components may include an annotation component 920 and a presentation component 910. Each of the components shown in FIG. 9 may be implemented in hardware or software or a combination of hardware and software. In one embodiment, each of the components may be a plurality of lines of computer code that may be stored in a memory of the backend or client device and then executed by a processor of the backend or client device. The annotation component 920 of the client side may perform the annotation operations and functions described below in more detail while the presentation component 910 may generate the user interface of the presentation and display the presentation to each audience member such as is shown, for example, in FIGS. 7 and 11.


The backend annotation manager 902 and a client annotation manager 924 (that is part of the annotation component 920 on the client) each may control the annotations made to the presentation. For example, either of the annotation managers 902, 924 (or both of them in some embodiments) may allow a user (the presenter or producer of the presentation or the audience member that receives the presentation) to set one or more settings of the presentation timeline annotation system. For example, each or both of the annotation managers 902, 924 may allow a user to configure one or more annotation selectors that trigger (for an audience member) the marking of a segment of the presentation with an interactive annotation, wherein the interactive annotation may include, for example, a question, an answer, a comment, a note, a review or a rating of the presentation. The one or more annotation selectors may be key presses, such as for a personal computer, desktop computer, tablet computer, etc., or a gesture, such as for smartphone devices, tablet computers and the like. For example, the gesture may be a hand movement on a Samsung Galaxy smartphone device or a movement of the device that indicates user intent, such as a shake motion. In some embodiments, the one or more annotation selectors may be preconfigured using the annotation manager 902 by the content presenter or producer.


Non-limiting examples of different key presses that may be used as the one or more annotation selectors are shown in Table 1 below.


TABLE 1

Sample Key/Key Combination    Action

Q                             Mark current segment with a question
Shift + Q                     Mark current segment with a question and add a question
N                             Mark current segment with a note
Shift + N                     Mark current segment with a note and add a text note
R                             Mark current segment with a review
Shift + R                     Mark current segment with a review and add a review
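
A minimal client-side sketch of the key presses of Table 1 follows; markCurrentSegment and openAnnotationForm are hypothetical helpers, not part of the disclosure:

    // Key-press annotation selectors per Table 1; the plain key marks the
    // current segment and the Shift variant also opens a form to add text.
    var ANNOTATION_KEYS = { Q: 'question', N: 'note', R: 'review' };
    document.addEventListener('keydown', function (e) {
      var type = ANNOTATION_KEYS[e.key.toUpperCase()];
      if (!type) { return; }
      markCurrentSegment(type);   // e.g. Q, N or R alone
      if (e.shiftKey) {           // e.g. Shift + Q, Shift + N, Shift + R
        openAnnotationForm(type); // also add a question, text note or review
      }
    });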









As shown in FIG. 9, the annotation component 920 also may include an annotation event handler 928 and an annotation user interface (UI) generator 930. The annotation event handler 928 may perform one or more functions to implement the presentation timeline annotation method as shown in FIG. 10 while the annotation user interface generator 930 may generate the annotation user interface elements displayed to the user (examples of which are shown in FIGS. 11-14) or as shown in FIG. 7 when the annotations are used in conjunction with a user interface widget. For example, FIG. 14 illustrates an example of the presentation display and the annotation display that may be displayed to the user.



FIG. 10 illustrates a method for presentation timeline annotation 1000 that may be implemented, for example, using the backend and client side components shown in FIG. 9. In the method, a client side component, such as the annotation event handler component of the annotation component, may listen for/detect the one or more annotation selector events (1002) and delegate the relevant annotation selector events to the appropriate annotation event handler. If no annotation selector event occurs, then the method loops back to the start. If an annotation selector event occurs, the annotation event handler may determine a time reference, such as the current timestamp or slide number for video/audio and slideshow content, respectively, at which the annotation selector event occurred (1004). The time reference is an important point of reference, which is maintained throughout the lifetime of the content. The time reference identifies the point in the presentation at which an audience member wants to make an annotation to the presentation timeline.
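
For illustration only (player and slideshow are assumed client objects), the time reference of step 1004 might be determined as:

    // Determine the time reference for the annotation (step 1004):
    // a current timestamp for video/audio, a slide number for slideshows.
    function getTimeReference(contentType) {
      if (contentType === 'video' || contentType === 'audio') {
        return { kind: 'timestamp', value: player.currentTime };     // seconds into playback
      }
      return { kind: 'slideNumber', value: slideshow.currentSlide }; // current slide number
    }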


The annotation event handler also may, based on the time reference of the annotation event, create a custom annotation object, such as by using JavaScript for example, with the annotation type and time reference as its two primary properties. Nearly everything in the JavaScript world is an object; in its simplest form, a pair of curly braces represents a JavaScript object, e.g. { }. There are several ways to create, manipulate and maintain such objects, and various types of properties and methods can be added to them, e.g. var annotationObject = {'annotationName': 'Webcast 101', 'annotationTimestamp': 1381179374, 'annotationCreator': 'John Appleseed', 'annotationType': 'Q&A'}; Here, annotationObject is created upon user interaction and it can be transferred across servers, stored in databases and dispatched to other clients.
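
Restating the in-text example as a runnable sketch (the serialization step is an illustrative assumption about how the object might be prepared for transfer):

    // The annotation object of the example above, created upon user interaction.
    var annotationObject = {
      annotationName: 'Webcast 101',
      annotationTimestamp: 1381179374, // time reference (Unix timestamp)
      annotationCreator: 'John Appleseed',
      annotationType: 'Q&A'
    };
    var payload = JSON.stringify(annotationObject); // ready to transfer, store or dispatch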


The annotation event handler also may communicate the annotation object to the presentation database 44 (1006) using a protocol, such as a simple client-server communication protocol. The communication of the annotation object for each annotation event of each audience member allows the annotations to be delivered to every audience member of the presentation. For example, in one implementation, the created annotation object may be delegated to a next client-side module (for a particular audience member) where a view of the annotation may be generated (1008), styling added and the object itself added to the annotation timeline at the given time reference point (an example of which is shown in FIGS. 12-13). Thus, each audience member of the presentation can see the annotations of the other audience members.
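
A non-limiting sketch of steps 1006 and 1008 follows; the endpoint, socket channel and timeline helper are assumptions, not part of the disclosure:

    // Step 1006: send the annotation object to the presentation database over
    // a simple client-server protocol.
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/presentations/webcast-101/annotations');
    xhr.setRequestHeader('Content-Type', 'application/json');
    xhr.send(JSON.stringify(annotationObject));

    // Step 1008, on every other client viewing the presentation: generate a
    // view of the annotation and add it to the annotation timeline at its
    // time reference point.
    socket.on('annotation', function (obj) {
      timeline.addMarker(obj.annotationTimestamp, obj.annotationType);
    });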


Two examples of an annotation user interface element 1200 are shown. FIG. 12 shows an example of a presentation timeline annotation user interface element for a video presentation element while FIG. 13 shows an example of a presentation timeline annotation user interface element for a slideshow presentation element. As shown in both Figures, the presentation timeline annotation user interface element may include an annotation indicator element 1202 that shows a location in the presentation at which one or more annotations occurred as well as the different types of annotations at each point in the presentation. For example, as shown in FIG. 12, there are two questions (indicated by Q(2)) and five comments (indicated by C(5)) ten minutes into the exemplary 30 minute video presentation. In FIG. 13, there are fifteen questions (indicated by Q(15)) and no comments (indicated by C(0)) at slide 10 of 30 slides of the exemplary slideshow presentation. As shown in these figures, the presentation timeline annotations are shown at the same time or in-line with the presentation so that the audience member is able to interact with the presentation while continuing to be engaged by the presentation.


In some embodiments, the dimensions of the presentation annotation timeline may be proportional to the duration or length of the presentation. For example, for video/audio content, the system may use a one-to-one mapping between the content duration and the annotation timeline dimensions. For slideshows, for example, the system may have a control whose behavior and presentation are similar to an HTML5 video player control.
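
As a sketch of the one-to-one mapping for video/audio content (all names are illustrative):

    // Map a content timestamp to a horizontal position on the annotation
    // timeline, proportional to the content duration.
    function markerPositionPx(timestampSec, durationSec, timelineWidthPx) {
      return (timestampSec / durationSec) * timelineWidthPx;
    }
    // Example: an annotation 10 minutes into a 30 minute video on a 600 px
    // wide timeline is drawn at (600 / 1800) * 600 = 200 px from the left edge.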


In each user interface displayed to each audience member (such as those shown in FIGS. 12-14), the audience member can select an annotation in the timeline, such as by hovering over or clicking on the annotation, and an element, such as a text bubble, may be displayed, as shown in FIGS. 12-13. The text bubble may show the type and count of the audience member's interactions, possibly also showing interactions from other audience members and the presenter. At any given point on the timeline, there may be zero or more annotations, where each annotation can be created by a different user. Also, each annotation belongs to one of the supported annotation types, such as Question (Q), Answer (A), Rating (R) or Comment (C). The text bubble is used to show the type and count of the annotations. For example, a timestamp containing 4 questions, 6 answers, 8 ratings and 0 comments can be displayed as Q(4) A(6) R(8) C(0). If an audience member selects the element, such as by clicking the text bubble or double clicking the annotation mark, the user interface may present the audience member with options to further interact with the annotation or open a specific widget from the dock, etc. Continuing the example of Q(4) A(6) R(8) C(0) above, a user could click on the 'Q' to open a widget containing the 4 questions and a form to add a new question.
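
A minimal sketch of building the text-bubble label follows; the single-letter type codes are taken from the description above, everything else is illustrative:

    // Count the annotations of each supported type at one timeline point and
    // format the text-bubble label, e.g. "Q(4) A(6) R(8) C(0)".
    function bubbleLabel(annotations) {
      var counts = { Q: 0, A: 0, R: 0, C: 0 };
      annotations.forEach(function (a) {
        if (counts.hasOwnProperty(a.type)) { counts[a.type] += 1; }
      });
      return ['Q', 'A', 'R', 'C'].map(function (t) {
        return t + '(' + counts[t] + ')';
      }).join(' ');
    }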


The system of presentation timeline annotation also may be configured to support only a unary task, such as note taking. Thus, an audience member is able to mark a segment with notes and later have those notes emailed as a PDF or added to the video as closed captions.


While the foregoing has been with reference to a particular embodiment of the invention, it will be appreciated by those skilled in the art that changes in this embodiment may be made without departing from the principles and spirit of the disclosure, the scope of which is defined by the appended claims.

Claims
  • 1. An annotation system for a presentation, comprising:
    a computing device having a processor that executes a plurality of instructions;
    a presentation system, connected to the computing device over a network, having a store that stores one or more presentations that may be provided to the computing device;
    the computing device being configured to:
    display a presentation;
    mark, for a segment among one or more specifically marked segments while the presentation is being displayed, a segment of the presentation with a plurality of different types of interactive electronic annotations using a plurality of user configurable annotation selectors, wherein each annotation selector is configured by the user to perform a user selected annotation, the plurality of different types of interactive electronic annotations including a comment annotation that marks the segment of the presentation with a first question, a question annotation that one of marks the segment of the presentation with a first question and marks the segment of the presentation with the first question and adds a second question, a note annotation that one of marks the segment of the presentation with a note and marks the segment of the presentation with the note and adds a text note, and a review annotation that one of marks the segment of the presentation with a first review and marks the segment of the presentation with the first review and adds a second review; and
    the presentation system being configured to:
    receive the user selected annotation for the presentation;
    generate an annotation timeline for the presentation, the annotation timeline showing an identifier which identifies types of each of the plurality of user selected annotations and a number of each of the user selected annotations for the identified types made at a commonly marked segment among the one or more marked segments during the presentation that includes the different types of at least two of the question annotation, the review annotation, the note annotation and a comment annotation; and
    communicate the annotation timeline to each computing device that displays the presentation so that the annotation timeline is displayed on each computing device.
  • 2. The system of claim 1, wherein the computing device is further configured to determine a time reference of the presentation at a time of the marking of the segment.
  • 3. The system of claim 2, wherein the time reference is one of a timestamp of the presentation and a slide number of the presentation.
  • 4. The system of claim 1, wherein the computing device is further configured to generate an annotation object.
  • 5. The system of claim 4, wherein the annotation object is a Javascript annotation object.
  • 6. The system of claim 4, wherein the computing device is further configured to communicate the annotation object to the store of the presentation system.
  • 7. The system of claim 1, wherein the computing device is further configured to display the annotations adjacent to the presentation.
  • 8. The system of claim 1, wherein the computing device is further configured to have a settings menu so that one or more annotation selectors are configured, wherein the one or more annotation selectors indicate a marking of a segment of the presentation.
  • 9. The system of claim 8, wherein each annotation selector is a key press.
  • 10. The system of claim 8, wherein each annotation selector is a gesture.
  • 11. The system of claim 8, wherein the computing device has an annotation manager that configures the one or more annotation selectors.
  • 12. The system of claim 8, wherein the presentation system has an annotation manager that configures the one or more annotation selectors.
  • 13. The system of claim 1, wherein the annotation is one of an answer and a rating.
  • 14. The system of claim 1, wherein the computing device is further configured to detect an annotation selector event that indicates that the user is marking a segment of the presentation.
  • 15. The system of claim 1, wherein the computing device further comprises an annotation user interface generator that generates a presentation timeline annotation element having an annotation indicator.
  • 16. The system of claim 15, wherein the annotation user interface generator generates a user interface element that displays all of the annotations for the presentation.
  • 17. A method for annotating a presentation being delivered to a computing device by a presentation system, connected to the computing device over a network, having a store that stores one or more presentations that may be provided to the computing device of a user, the method comprising:
    displaying, by a computing device, a presentation;
    marking, using the computing device, a segment among one or more specifically marked segments of the presentation with a plurality of different types of interactive electronic annotations using a plurality of user configurable annotation selectors while the presentation is being displayed, wherein each annotation selector is configured by the user to perform a user selected annotation, the plurality of different types of interactive electronic annotations including a comment annotation that marks the segment of the presentation with a first question, a question annotation that one of marks the segment of the presentation with a first question and marks the segment of the presentation with the first question and adds a second question, a note annotation that one of marks the segment of the presentation with a note and marks the segment of the presentation with the note and adds a text note, and a review annotation that one of marks the segment of the presentation with a first review and marks the segment of the presentation with the first review and adds a second review;
    receiving, by the presentation system, the user selected annotation for the presentation;
    generating, by the presentation system, an annotation timeline for the presentation, the annotation timeline showing an identifier which identifies types of each of the plurality of user selected annotations and a number of each of the user selected annotations for the identified types made at a commonly marked segment among the one or more marked segments during the presentation that includes the different types of at least two of the question annotation, the review annotation, the note annotation and a comment annotation; and
    communicating, by the presentation system, the annotation timeline to each computing device that displays the presentation so that the annotation timeline is displayed on each computing device.
  • 18. The method of claim 17 further comprising determining a time reference of the presentation at a time of the marking of the segment.
  • 19. The method of claim 18, wherein the time reference is one of a timestamp of the presentation and a slide number of the presentation.
  • 20. The method of claim 17 further comprising generating an annotation object.
  • 21. The method of claim 20, wherein the annotation object is a Javascript annotation object.
  • 22. The method of claim 20 further comprising communicating the annotation object to the store of the presentation system.
  • 23. The method of claim 17 further comprising displaying the annotations adjacent to the presentation.
  • 24. The method of claim 17 further comprising configuring the one or more annotation selectors.
  • 25. The method of claim 24, wherein configuring the one or more key presses further comprises configuring the one or more key presses by one of a user of the computing device and an author of the presentation.
  • 26. The method of claim 17, wherein each annotation selector is a key press.
  • 27. The method of claim 17, wherein each annotation selector is a gesture.
  • 28. The method of claim 17, wherein the annotation is one of an answer and a rating.
  • 29. The method of claim 17, wherein marking the segment of the presentation further comprises detecting an annotation selector event that indicates that the user is marking a segment of the presentation.
  • 30. The method of claim 17 further comprising generating a presentation timeline annotation element having an annotation indicator.
  • 31. The method of claim 30 further comprising generating a user interface element that displays all of the annotations for the presentation.
US Referenced Citations (305)
Number Name Date Kind
5220665 Coyle, Jr. et al. Jun 1993 A
5388197 Rayner Feb 1995 A
5420801 Dockter et al. May 1995 A
5557796 Fehskens et al. Sep 1996 A
5642171 Baumgartner et al. Jun 1997 A
5680619 Gudmundson et al. Oct 1997 A
5732216 Logan et al. Mar 1998 A
5748185 Stephan et al. May 1998 A
5752244 Rose et al. May 1998 A
5801685 Miller et al. Sep 1998 A
5815154 Hirschtick et al. Sep 1998 A
5838973 Carpenter-Smith et al. Nov 1998 A
5861906 Dunn et al. Jan 1999 A
5892915 Duso et al. Apr 1999 A
5929850 Broadwin et al. Jul 1999 A
5996015 Day et al. Nov 1999 A
6006332 Rabne et al. Dec 1999 A
6008807 Bretschneider et al. Dec 1999 A
6009458 Hawkins et al. Dec 1999 A
6014706 Cannon et al. Jan 2000 A
6058424 Dixon et al. May 2000 A
6097441 Allport Aug 2000 A
6108645 Eichstaedt et al. Aug 2000 A
6141595 Gloudeman et al. Oct 2000 A
6155840 Sallette Dec 2000 A
6157809 Kaqmbayashi Dec 2000 A
6223292 Dean et al. Apr 2001 B1
6253368 Nelin et al. Jun 2001 B1
6324683 Fuh et al. Nov 2001 B1
6366916 Baer et al. Apr 2002 B1
6396500 Qureshi et al. May 2002 B1
6404978 Abe Jun 2002 B1
6445834 Rising, III et al. Sep 2002 B1
6452609 Katinsky et al. Sep 2002 B1
6473749 Smith et al. Oct 2002 B1
6523022 Hobbs Feb 2003 B1
6535909 Rust Mar 2003 B1
6538665 Crow et al. Mar 2003 B2
6546405 Gupta et al. Apr 2003 B2
6601026 Appelt et al. Jul 2003 B2
6628279 Schell et al. Sep 2003 B1
6629065 Gadh et al. Sep 2003 B1
6636237 Murray et al. Oct 2003 B1
6636888 Bookspan et al. Oct 2003 B1
6657543 Chung Dec 2003 B1
6697805 Choquier et al. Feb 2004 B1
6714909 Gibbon et al. Mar 2004 B1
6715126 Chang et al. Mar 2004 B1
6728753 Parasnis et al. Apr 2004 B1
6745344 Gibbon et al. Jun 2004 B1
6748382 Mohan et al. Jun 2004 B1
6795972 Rovira Sep 2004 B2
6801224 Chang et al. Oct 2004 B1
6834308 Ikezoye et al. Dec 2004 B1
6842175 Schmalstieg et al. Jan 2005 B1
6850944 MacCall et al. Feb 2005 B1
6859838 Puranik et al. Feb 2005 B1
6877023 Maffeis et al. Apr 2005 B1
6920181 Porter Jul 2005 B1
7062722 Carlin et al. Jun 2006 B1
7079990 Haller et al. Jul 2006 B2
7096416 Smith et al. Aug 2006 B1
7103770 Conrath Sep 2006 B2
7146329 Conkwright et al. Dec 2006 B2
7168035 Bell et al. Jan 2007 B1
7188186 Meyer et al. Mar 2007 B1
7281034 Eyal Oct 2007 B1
7281060 Hofmann et al. Oct 2007 B2
7290057 Suanders et al. Oct 2007 B2
7296137 Moyer Nov 2007 B2
7313595 Rust Dec 2007 B2
7330875 Parasnis et al. Feb 2008 B1
7349944 Vernon Mar 2008 B2
7350231 Madison et al. Mar 2008 B2
7363372 Potenzone et al. Apr 2008 B2
7370269 Prabhu et al. May 2008 B1
7415529 Saunders et al. Aug 2008 B2
7418431 Nies et al. Aug 2008 B1
7441201 Printezis Oct 2008 B1
7454708 O'Neal et al. Nov 2008 B2
7559055 Yang et al. Jul 2009 B2
7561178 Baartman et al. Jul 2009 B2
7590945 Sims et al. Sep 2009 B2
7711722 Sahasi et al. May 2010 B1
7712052 Szeliski et al. May 2010 B2
7873638 Young et al. Jan 2011 B2
8234336 Slater et al. Jul 2012 B2
8392821 DeMarco et al. Mar 2013 B2
8443041 Krantz et al. May 2013 B1
8447664 Pape et al. May 2013 B1
8682672 Ha et al. Mar 2014 B1
8682969 Sahasi et al. Mar 2014 B1
8706812 Sahasi et al. Apr 2014 B2
8798252 Krantz et al. Aug 2014 B2
9046995 Garland Jun 2015 B2
9148480 Sahasi et al. Sep 2015 B2
9224173 Arora et al. Dec 2015 B2
9553922 Guarraci et al. Jan 2017 B1
9720577 Sahasi Aug 2017 B1
9892028 Garland Feb 2018 B1
9973576 Sahasi et al. May 2018 B2
10430491 Joshi et al. Oct 2019 B1
10749948 Sahasi et al. Aug 2020 B2
10785325 Baishya et al. Sep 2020 B1
20010027420 Boublik et al. Oct 2001 A1
20010032242 Terahama et al. Oct 2001 A1
20010032305 Barry Oct 2001 A1
20020016788 Burridge Feb 2002 A1
20020026323 Sakaguchi et al. Feb 2002 A1
20020059342 Gupta et al. May 2002 A1
20020065635 Lei et al. May 2002 A1
20020078150 Thompson et al. Jun 2002 A1
20020085029 Ghani Jul 2002 A1
20020087496 Stirpe et al. Jul 2002 A1
20020107673 Haller et al. Aug 2002 A1
20020112031 Franklin et al. Aug 2002 A1
20020112155 Martherus et al. Aug 2002 A1
20020112247 Horner et al. Aug 2002 A1
20020122050 Sandberg Sep 2002 A1
20020133719 Westerdal Sep 2002 A1
20020143901 Lupo et al. Oct 2002 A1
20020152278 Pontenzone et al. Oct 2002 A1
20020193895 Qian et al. Dec 2002 A1
20030004791 Kojima Jan 2003 A1
20030005019 Pabla et al. Jan 2003 A1
20030005465 Connely Jan 2003 A1
20030014521 Elson et al. Jan 2003 A1
20030025650 Uesaki et al. Feb 2003 A1
20030037131 Verma Feb 2003 A1
20030061280 Bulson et al. Mar 2003 A1
20030061330 Frisco et al. Mar 2003 A1
20030071810 Shoov et al. Apr 2003 A1
20030086682 Schofield et al. May 2003 A1
20030101091 Levin et al. May 2003 A1
20030115267 Hinton et al. Jun 2003 A1
20030154277 Haddad et al. Aug 2003 A1
20030156135 Lucarelli Aug 2003 A1
20030167315 Chowdhry et al. Sep 2003 A1
20030204566 Dhupelia et al. Oct 2003 A1
20040024898 Wan Feb 2004 A1
20040030787 Jandel et al. Feb 2004 A1
20040032424 Florschuetz Feb 2004 A1
20040039834 Saunders et al. Feb 2004 A1
20040049539 Reynolds et al. Mar 2004 A1
20040054542 Foote et al. Mar 2004 A1
20040059941 Hardman et al. Mar 2004 A1
20040073629 Bazot et al. Apr 2004 A1
20040098754 Vella et al. May 2004 A1
20040103150 Ogdon et al. May 2004 A1
20040125877 Chang et al. Jul 2004 A1
20040143603 Kaufmann et al. Jul 2004 A1
20040148375 Levett et al. Jul 2004 A1
20040153504 Hutchinson et al. Aug 2004 A1
20040162787 Madison et al. Aug 2004 A1
20040167896 Eakin Aug 2004 A1
20040187140 Aigner et al. Sep 2004 A1
20040237120 Lewin et al. Nov 2004 A1
20040243928 Hesmer et al. Dec 2004 A1
20040268224 Balkus et al. Dec 2004 A1
20050039131 Paul Feb 2005 A1
20050093860 Yanagisawa et al. May 2005 A1
20050138560 Lee et al. Jun 2005 A1
20050144258 Burckart et al. Jun 2005 A1
20050183143 Anderholm et al. Aug 2005 A1
20050204148 Mayo et al. Sep 2005 A1
20050212797 Lee et al. Sep 2005 A1
20050223340 Repka Oct 2005 A1
20050223341 Repka Oct 2005 A1
20050223342 Repka et al. Oct 2005 A1
20050278650 Sims et al. Dec 2005 A1
20050288001 Foster et al. Dec 2005 A1
20060005114 Williamson et al. Jan 2006 A1
20060031914 Dakss et al. Feb 2006 A1
20060048058 O'Neal et al. Mar 2006 A1
20060106780 Degan May 2006 A1
20060129933 Land et al. Jun 2006 A1
20060150149 Chandhoke et al. Jul 2006 A1
20060167896 Kapur et al. Jul 2006 A1
20060235973 McBride et al. Oct 2006 A1
20060265495 Butler et al. Nov 2006 A1
20060277553 Henning et al. Dec 2006 A1
20070038931 Allaire et al. Feb 2007 A1
20070055401 Van Bael et al. Mar 2007 A1
20070121850 Klos et al. May 2007 A1
20070174905 Martherus et al. Jul 2007 A1
20070192613 Amoroso et al. Aug 2007 A1
20070192727 Finley et al. Aug 2007 A1
20070211065 Feth et al. Sep 2007 A1
20070245243 Lanza et al. Oct 2007 A1
20070271367 Yardeni et al. Nov 2007 A1
20070282858 Arner et al. Dec 2007 A1
20080005240 Knighton et al. Jan 2008 A1
20080005247 Khoo Jan 2008 A9
20080028341 Szeliski et al. Jan 2008 A1
20080062969 Picard et al. Mar 2008 A1
20080062970 Picard et al. Mar 2008 A1
20080086456 Rasanen et al. Apr 2008 A1
20080109396 Kacin May 2008 A1
20080120336 Bergman et al. May 2008 A1
20080162206 Mak et al. Jul 2008 A1
20080189162 Ganong et al. Aug 2008 A1
20080201736 Gordon et al. Aug 2008 A1
20080235189 Rayman et al. Sep 2008 A1
20080270151 Mahoney et al. Oct 2008 A1
20080276271 Anderson et al. Nov 2008 A1
20090013244 Cudich et al. Jan 2009 A1
20090019367 Cavagnari et al. Jan 2009 A1
20090044138 Rudolph et al. Feb 2009 A1
20090049385 Blinnikka et al. Feb 2009 A1
20090066366 Solomon Mar 2009 A1
20090083641 Christy Mar 2009 A1
20090094520 Kulas Apr 2009 A1
20090094544 Savage Apr 2009 A1
20090100372 Lauridsen et al. Apr 2009 A1
20090133048 Gibbs et al. May 2009 A1
20090138508 Tolle et al. May 2009 A1
20090171968 Kane et al. Jul 2009 A1
20090172021 Kane et al. Jul 2009 A1
20090172597 Mercer Jul 2009 A1
20090187825 Sandquist et al. Jul 2009 A1
20090217187 Kendall et al. Aug 2009 A1
20090222842 Narayanan et al. Sep 2009 A1
20090259937 Rohall et al. Oct 2009 A1
20090287790 Upton et al. Nov 2009 A1
20090292584 Dalal et al. Nov 2009 A1
20090292768 Franke Nov 2009 A1
20100023849 Hakim et al. Jan 2010 A1
20100037205 Maillot et al. Feb 2010 A1
20100057415 Chu et al. Mar 2010 A1
20100189131 Branam et al. Jul 2010 A1
20100192132 Yuan et al. Jul 2010 A1
20100216443 Jacobstein et al. Aug 2010 A1
20100251174 Belandrino et al. Sep 2010 A1
20100277696 Huebner Nov 2010 A1
20100325674 Liu Dec 2010 A1
20110004914 Ennis, Jr. Jan 2011 A1
20110010307 Bates et al. Jan 2011 A1
20110026898 Lussier et al. Feb 2011 A1
20110035431 Geary et al. Feb 2011 A1
20110055176 Choi et al. Mar 2011 A1
20110082719 Dutta Apr 2011 A1
20110191316 Lai et al. Aug 2011 A1
20110225015 Spivack et al. Sep 2011 A1
20110252094 Sahasi et al. Oct 2011 A1
20110276372 Spivack et al. Nov 2011 A1
20110289422 Spivack et al. Nov 2011 A1
20120048298 Humphrey et al. Mar 2012 A1
20120084292 Liang et al. Apr 2012 A1
20120109966 Liang et al. May 2012 A1
20120130771 Kannan et al. May 2012 A1
20120158902 Udtke et al. Jun 2012 A1
20120191716 Omoigui Jul 2012 A1
20120210247 Khouri et al. Aug 2012 A1
20120226984 Bastide et al. Sep 2012 A1
20120246137 Sallakonda et al. Sep 2012 A1
20120254454 Margush et al. Oct 2012 A1
20120290399 England et al. Nov 2012 A1
20120290950 Rapaport et al. Nov 2012 A1
20120310750 Schutzbank et al. Dec 2012 A1
20130036191 Fink et al. Feb 2013 A1
20130132374 Olstad et al. May 2013 A1
20130138585 Forte May 2013 A1
20130215116 Siddique et al. Aug 2013 A1
20130268872 Yin et al. Oct 2013 A1
20130282611 Avedissian Oct 2013 A1
20140006975 Oldham Jan 2014 A1
20140068779 Tan et al. Mar 2014 A1
20140115466 Barak et al. Apr 2014 A1
20140123014 Keen May 2014 A1
20140126714 Sayko May 2014 A1
20140126715 Lum et al. May 2014 A1
20140136528 Anima et al. May 2014 A1
20140214691 Morris, III Jul 2014 A1
20140229839 Lynch et al. Aug 2014 A1
20140237381 Socolof Aug 2014 A1
20140279049 Wiseman et al. Sep 2014 A1
20140289326 McCormack et al. Sep 2014 A1
20140366098 Savage et al. Dec 2014 A1
20140372468 Collins et al. Dec 2014 A1
20150002619 Johnston et al. Jan 2015 A1
20150006610 Johnston et al. Jan 2015 A1
20150082021 Mandyann et al. Mar 2015 A1
20150199411 Greenspan Jul 2015 A1
20150213145 Baldwin et al. Jul 2015 A1
20150213361 Gamon et al. Jul 2015 A1
20150278363 Briere et al. Oct 2015 A1
20150304367 Chan et al. Oct 2015 A1
20150365244 Schmiltz et al. Dec 2015 A1
20160011729 Flores et al. Jan 2016 A1
20160028790 Eriksson et al. Jan 2016 A1
20160180248 Regan Jun 2016 A1
20170046374 Fletcher et al. Feb 2017 A1
20170064358 Sullivan et al. Mar 2017 A1
20170097743 Hameed et al. Apr 2017 A1
20170140398 Fleischman et al. May 2017 A1
20170243255 Sahasi et al. Aug 2017 A1
20170255696 Pulitzer Sep 2017 A1
20180033051 Maynard et al. Feb 2018 A1
20180211285 Todasco et al. Jul 2018 A1
20180293610 Maynard Oct 2018 A1
20180315103 Lakshminarayan et al. Nov 2018 A1
20190108234 Torres et al. Apr 2019 A1
20190108438 Torres et al. Apr 2019 A1
20200267110 Nolan et al. Aug 2020 A1
20200382583 Sahasi et al. Dec 2020 A1
Foreign Referenced Citations (8)
Number Date Country
1500353 May 2004 CN
103535026 Jan 2014 CN
2261898 Dec 2010 EP
20100003117 Jan 2010 KR
WO02082815 Oct 2002 WO
WO02093352 Nov 2002 WO
WO02097616 Dec 2002 WO
WO2009020770 Feb 2009 WO
Non-Patent Literature Citations (22)
Entry
Abla et al., Advanced Tools for enhancing control room collaborations, Fusion Engineering and Design, vol. 81, Issues 15-17, 5th IAEA TM on Control, Data Acquisition, and Remote Participation for Fusion Research—5th IAEA TM, Jul. 2006, pp. 2039-2044 (15 pages), ISSN 0920-3796, DOI: 10.1016/j.jusengdes.200.
Holmberg et al., “Web Real-Time Communication Use Cases and Requirements”; Internet Engineering Task Force (IETF), dated Mar. 2015 (29 pages).
Draft—C., Holmberg et al., “Web Real-Time Communication Use Cases and Requirements”, RTCWEV Working Group, dated Oct. 14, 2013 (25 pages).
Saint-Andre, P. 2005. Streaming XML with Jabber/XMPP. IEEE Internet Computing 9, Apr. 27, 2005, 6 pages. (Sep. 2005).
Sen, Sandip, “An Automated Distributed Meeting Scheduler, PSU,” Apr. 2007, 13 pages. http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.56.6862.
Sinha et al., “Video Conferencing System,” Columbia University, 11 pages. http://www.cs.columbia.edu/sedwards/classes/2009/4840/reports/RVD-presentation.pdf.
On24, “Best practices in Webcasting for Publishing” dated 2006, 16 pages.
Weiser, "Microsoft PowerPoint 2003—Table of Contents, ITM Online Help Collection, UWEC," University of Wisconsin, Eau Claire, archived dated Sep. 19, 2004 (2 pages), located @ http://web.archive.org/web/20040919191008/http://www.uwec.edu/help/ppoint03.htm.
Weiser, Microsoft PowerPoint 2003 Viewing Online Presentations: The Environment, UWEC—University of Wisconsin, Eau Claire archived dated Dec. 21, 2004, (2 pages). located @ http://web.archive.org/web/20041221201404/www.uwec.edu/help/PPoint03/webenvir.htm.
Microsoft Corporation—“COM: Component Object Model Technologies”—archived dated Oct. 23, 2004, 2 pages. located @ http://web.archive.org/web/20041023025124/http://www.microsoft.com/com/default.mspx.
Rothganger et al., “3D Object Modeling and Recognition Using Local Affine-Invariant Image Descriptors and Multi-View Spatial Constraints,” Department of Computer Science and Beckman Institute, University of Illinois—Cordelia Schmid Inria, France—International Journal of Computer Vision 66(3), 231-259, 2006 (29 pages).
Papadakis et al., “Efficient 3D shape matching and retrieval using a concrete radialized spherical projection representation”—The Journal of the Pattern Recognition Society 40 dated 2007 p. 2437-2452 (16 pages).
“Breeze Manager User Guide,” Copyright © 2005 Macromedia, Inc., Second Edition: Jun. 2005, 306 pages.
“Breeze Meeting User Guide for Meeting Hosts and Presenters”, Copyright © 2005 Macromedia, Inc., Third Edition: Sep. 2005, 130 pages.
Freeman et al., "Creative Collaboration between Audiences and Musicians in Flock," Georgia Tech Center for Music Technology, Feb. 2010, 17 pages.
Suduc et al., “Exploring Multimedia Web Conferencing,” Valahia University of Targoviste, Exploring Multimedia Web Conferencing (Year: 2009), Mar. 2009, 14 pages. https://www.researchgate.net/profile/Suduc_Ana-Maria/publication/26849386.
Marni Gunther, “Webcasting 101: Online Broadcasting in the Meetings and Events Industry”, Netbriefings, Inc., Jul. 2008, 2 pages. http://www.netbriefings.com/pdf/0807-MtgsMNHospitality.pdf.
Aguiar, Everaldo, et al. “Engagement vs performance: using electronic portfolios to predict first semester engineering student retention.” Proceedings of the Fourth International Conference on Learning Analytics and Knowledge. 2014: 103-112 (Year: 2014), 10 pages.
Navarathna, Rajitha, et al. “Estimating audience engagement to predict movie ratings.” IEEE Transactions on Affective Computing 10.1 (Jul. 3, 2017): 48-59. (Year: 2017), 12 pages.
Sam Dutton, “Get Started with WebRTC”, published Jul. 23, 2012 (updated: Nov. 24, 2020), 24 pages. https://www.html5rocks.com/en/tutorials/webrtc/basics/.
Ebner, et al.; “First steps towards an integration of a Personal Learning Environment at university level”, Jan. 1, 2011, 15 pages.
Berthold et al.; “Psycho-pedagogical Mash-up Design for Personalizing the Learning Environment”, Knowledge Management Institute, Graz University of Technology, Austria, Jul. 11, 2011, 15 pages.