While presentation slides may provide a convenient and reusable authoring environment for presentation visuals, such slides may be used in a manner that is more appropriate for a document (e.g., large amounts of text, dense diagrams, and raw tabular data) than for speaker support. Presenting such material may be problematic for both the presenter and the audience. For example, excessive slide text may encourage the audience to read the slide rather than listen to the presenter. Further, abrupt transitions from one slide to another may result in a loss of visual context and may make the relationship between slide contents implicit. The presenter may prepare for a transition and make an appropriate verbal linkage. However, in the event that the presenter does not make such an explicit verbal connection, the audience may not be able to determine whether a logical connection exists between slides.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Some implementations provide techniques and arrangements for displaying a user interface that allows a user to create and manage shapes and shape transitions across multiple slides. In some examples, the user may select a slide via the user interface and one or more shape threads associated with the selected slide may be presented (e.g., via an effects pane). Each shape displayed on the selected slide may be associated with a shape thread, and the attributes of each shape may be independently definable via the associated shape threads.
In some implementations, different shapes transition at different rates in response to navigation between slides. In some cases, the user may independently vary a delay before each shape begins transitioning. To illustrate, an amount of time for one shape to transition between slides may be different from an amount of time for another shape to transition between slides. Further, in some implementations, semantic interpolation allows for transitions of non-continuous attributes (e.g., a year, a date, etc.) between slides.
The Detailed Description is set forth with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different figures indicates similar or identical items or features.
Overview
As discussed above, presentation slides may often be used in a manner that is more appropriate for a document than for supporting presenters. For example, excessive slide text and abrupt transitions between slides may be problematic for both the presenter and the audience. Presenters may attempt to enhance a presentation by animating transitions between slides or animating shapes on a particular slide. However, such incidental animation may not support the act of communication and may instead prove counterproductive by distracting or confusing the audience. While linear interpolation of object attributes between slides (e.g., “tweening” or “in-betweening”) may be possible, existing solutions do not allow for independent management of the attributes of each object individually. Rather, when transitioning between slides, existing solutions use linear interpolation over the same period of time for each object. Transitioning each object over the same period of time may prevent one object from transitioning at one speed and finishing the transition earlier, while another object transitions at another speed and finishes the transition later. Further, existing solutions do not provide for interpolation of non-continuous attributes (e.g., semantic attributes) between slides.
The present disclosure describes techniques and arrangements for creating and managing media effects to provide a more cinematic experience that may improve audience engagement by avoiding distracting incidental animation and allowing the presenter to control intra-slide and inter-slide animations and transitions. Some implementations include a user interface that may allow a user to independently control attributes of each individual shape within a slide as well as attributes of each individual shape between slides. Thus, the user interface may allow the user to manage individual shapes such that the shapes may evolve in a meaningful way during the presentation.
In some implementations, semantic attributes of a shape may be interpolated. As used herein, the term “semantic attribute” refers to a non-continuous attribute (e.g., alphabetical text or numerical values on a slide) in which at least an attribute context (e.g., whether the attribute represents a date, a time, a year, etc.) is used to determine how to properly interpolate attribute values between slides. By contrast, for a continuous attribute (e.g., position values, size values, or color values of an image), interpolation of attribute values between slides may be achieved based on an initial attribute value and a target attribute value without determining the context of the attribute. Semantic attributes may be interpolated between slides by, for example, presenting sequential values (e.g., 1, 2, 3, . . . N; A, B, C, . . . Z; etc.) of the semantic attributes over the interpolation period (i.e., between slides).
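The sequential-value approach described above can be sketched as follows. This is a minimal illustrative sketch, not the disclosed implementation; the function and variable names are assumptions, and the example assumes the attribute context has already been identified as a month.

```python
# Hypothetical sketch of semantic interpolation: rather than blending text
# glyphs, step through the intermediate sequential values of the attribute
# (here, months) over the course of the slide transition.

MONTHS = ["January", "February", "March", "April", "May", "June",
          "July", "August", "September", "October", "November", "December"]

def interpolate_semantic_month(start: str, target: str, progress: float) -> str:
    """Return the month displayed at `progress` (0.0 to 1.0) of the transition."""
    i0, i1 = MONTHS.index(start), MONTHS.index(target)
    # Present sequential values (January, February, ...) over the
    # interpolation period until reaching the target month.
    step = round(i0 + (i1 - i0) * progress)
    return MONTHS[step]
```

A year attribute (e.g., 1937 to 1955) could be handled the same way, with the integer sequence of years standing in for the month list.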
The preview pane 102 may allow a user to select a particular slide to be presented in the detail view pane 104.
The effects pane 106 may display “shape threads” corresponding to the particular slide that is currently displayed in the detail view pane 104. Each shape on each slide may be associated with a shape thread. As used herein, the term “shape thread” refers to the connection or path of shapes across slides. By connecting shapes across slides, a shape thread may allow the user to independently define an on-screen transition behavior of each shape individually. That is, the effects pane 106 allows a user to independently manage the behavior of each of the shapes within a particular slide (“intra-slide”) and between slides (“inter-slide”) via corresponding shape threads.
By creating a new slide for each step of an animated progression, the user may visualize the temporal changes in the user interface 100. This may create a high degree of inter-slide continuity, as shape animations or other transitions are not limited to a single slide. Further, the ability to individually manage the behavior of each shape within a particular slide may allow for a high degree of intra-slide continuity, as it may be desirable for each shape to behave differently within a particular slide.
In the example of
In the example of
In the example of
The shape threads 116, 120, 124 displayed in the effects pane 106 may allow the user to customize a presentation such that shapes may evolve (i.e., move and/or change) in a meaningful way over time. To illustrate, a user may duplicate or copy and paste an existing slide to create a new slide, make changes to the new slide, and identify individual time periods over which various changes are to unfold when transitioning between slides. That is, for each slide in a presentation, the effects pane 106 may allow the user to identify independent time periods for each shape to reach a “target” or end state. To illustrate, the effects pane 106 identifies an endpoint of each shape and a relationship to a previous slide, while allowing shape transitions on different time scales.
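The shape-thread bookkeeping described above can be sketched as a simple data structure connecting a shape's per-slide states, each with its own independently definable transition time. All class and field names here are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class ShapeState:
    """A shape's appearance on one slide, plus how it transitions into it."""
    slide_index: int
    attributes: dict                  # e.g., {"x": 10, "y": 20, "fill": "#ff0000"}
    effect_type: str = "Target"       # e.g., "Appear", "Target", "Reappear"
    transition_seconds: float = 1.0   # independent time period for this shape

@dataclass
class ShapeThread:
    """The connection or path of one shape across slides."""
    shape_id: str
    states: list = field(default_factory=list)

    def state_for_slide(self, slide_index: int):
        """Return the shape's state on a given slide, if it appears there."""
        for s in self.states:
            if s.slide_index == slide_index:
                return s
        return None
```

Because each thread carries its own `transition_seconds` per state, two shapes on the same slide pair can transition on different time scales.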
A newly created shape may appear in the effects pane 106 with the “Appear” effect type (see e.g.,
In the example of
In some implementations, linear interpolation is used for transitioning continuous attributes, such as spatial attributes, color-based attributes, font attributes, format attributes, or image attributes. For a continuous attribute, interpolation of attribute values between slides may be achieved based on an initial attribute value and a target attribute value without determining the context of the attribute. By contrast, for a non-continuous attribute (e.g., a semantic attribute), an attribute context (e.g., whether the attribute represents a date, a time, a year, etc.) may be used to determine how to properly interpolate the non-continuous attribute values between slides. As an illustrative example of a continuous attribute, if an image appears on one slide at the top left (e.g., at position x1, y1) and appears on a subsequent slide at the bottom right (e.g., at position x2, y2), the spatial attributes (e.g., the relative positions) may be used to linearly transition the image between slides. That is, during a slide transition, the position of the image would linearly transition along an x-axis based on the difference between the x coordinates (e.g., x2 minus x1) and would linearly transition along a y-axis based on the difference between the y coordinates (e.g., y2 minus y1). By contrast, as an illustrative example of a non-continuous attribute, if first text (e.g., “January”) appears on one slide and second text (e.g., “December”) appears on another slide, the context of the text being a month may allow for the text to be properly interpolated during a slide transition. In this example, by identifying the text as a month, the text may transition from January to February, from February to March, etc. until reaching the target month of December.
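The continuous-attribute case above reduces to standard linear interpolation, which can be sketched in a few lines; the coordinate values are illustrative only.

```python
def lerp(a: float, b: float, t: float) -> float:
    """Linearly interpolate from a to b at transition progress t in [0, 1]."""
    return a + (b - a) * t

# An image at the top left (x1, y1) on one slide and the bottom right
# (x2, y2) on the next moves along each axis by the coordinate difference
# (x2 - x1, y2 - y1), scaled by the elapsed transition progress.
x1, y1 = 0.0, 0.0
x2, y2 = 800.0, 600.0

halfway = (lerp(x1, x2, 0.5), lerp(y1, y2, 0.5))  # midpoint of the path
```

The same `lerp` applies unchanged to any continuous attribute (size, color channels, transparency), since no attribute context is needed.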
In some implementations, continuous attribute interpolation is non-linear. As an illustrative example, an attribute value change may be slow initially, then speed up. Alternatively, the attribute value change may be fast initially, then slow down. As another illustrative example, the change may “ease” in multiple directions through an appropriately defined mathematical easing function. Such an easing function may be selected from a list of preset functions or may be constructed by a user (e.g., with a vertical axis representing a rate of change and a horizontal axis representing time). Further, negative rates of attribute value change may indicate “tweening” away from a target value in a way that could be used to create an overshooting effect before bouncing back. In some cases, an option (not shown in
With respect to continuous attributes, examples of spatial attributes may include width, height, x position, y position, left edge x position, right edge x position, top edge y position, bottom edge y position, or location and scale values from some other (e.g., polar) coordinate system. Examples of color-based attributes may include transparency, hue, saturation, brightness, alpha, red, green, blue, or some other color system. Examples of font attributes may include size, face, color, kerning, line spacing, or text effect (e.g., shadow angle, glow radius). Examples of format attributes may include fill color, line color, line weight, or shape effect (e.g., reflection depth, reflection separation). Examples of image attributes may include brightness, contrast, crop (each edge), among other alternatives.
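The non-linear easing described above can be sketched as simple functions mapping raw transition progress to eased progress. These are generic easing curves of the kind described, not the disclosed functions; the overshoot constant in `back_out` is a conventional illustrative value.

```python
def ease_in_quad(t: float) -> float:
    """Attribute value change is slow initially, then speeds up."""
    return t * t

def ease_out_quad(t: float) -> float:
    """Attribute value change is fast initially, then slows down."""
    return 1.0 - (1.0 - t) ** 2

def back_out(t: float, overshoot: float = 1.70158) -> float:
    """Briefly exceeds 1.0 near the end: a negative rate of change that
    "tweens" past the target value before bouncing back to it."""
    t -= 1.0
    return t * t * ((overshoot + 1.0) * t + overshoot) + 1.0
```

A user-constructed easing function, as described above, would simply be another mapping from time (horizontal axis) to rate of change (vertical axis) with the same `[0, 1] -> progress` contract.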
In the example illustrated in
As an illustrative example, selection of the semantic transition icon 126 may allow for a linear or non-linear (e.g., exponential) interpolation from the year 1937 in the first slide 108 to the year 1955 in the second slide 110. While
In some implementations, slides are set to advance to a non-consecutive slide, or to advance automatically after a specified time period to a specified slide. In some examples, a default slide transition option may be for the user to advance to the next slide by clicking on the current slide. However, as illustrated in
The user interface 100 may also include a selectable Preview icon 130 to preview the presentation for the slide currently displayed in the detail view pane 104 (e.g., the second slide 110). Further, the user interface 100 may include a selectable Run icon 132 to begin the presentation, starting with the first slide 108.
Referring to
In some cases, as illustrated in
Referring to
Referring to
Referring to
Referring to
Referring to
In some implementations, the second shape 118 and the third shape 122 disappear at the same time when advancing to the third slide 112. However,
Referring to
At block 3004, the process flow 3000 optionally includes setting slide actions. For example, referring to
At block 3006, the process flow 3000 includes adding a shape to the slide. For example, referring to
At block 3008, the process flow 3000 includes duplicating the slide. For example, referring to
At block 3010, the process flow 3000 includes creating a shape thread with the “Appear” effect type. For example, referring to
At block 3012, for the duplicate slide created at block 3008, the process flow 3000 includes extending a shape thread with the “Target” effect type. For example, referring to
At block 3014, the process flow 3000 optionally includes connecting the shape thread with the “Reappear” effect type. While not illustrated in
At block 3016, the process flow 3000 includes editing one or more shapes (e.g., shapes on either the slide created at block 3002 or the duplicate slide created at block 3008). To illustrate, referring to
At block 3018, the process flow 3000 optionally includes setting the effect timing (e.g., from a default timing value). For example, referring to
As another example, the user may change the effect timing for the duplicate slide created at block 3008 (e.g., the second slide 110). For example, referring to
At block 3020, the process flow 3000 includes unlinking shapes with the effect type “Appear” or one or more attributes with the effect type “Continuous.” For example, referring to
Referring to
At block 3104, the process flow 3100 includes starting a presentation timer. At block 3106, the process flow 3100 optionally includes setting a current state to that of the slide preceding the selected slide. At block 3108, the process flow 3100 includes saving the current state. At block 3110, the process flow 3100 includes setting the target slide to the selected slide. At block 3112, the process flow 3100 includes interpolating shape attributes over the specified time towards those of their next target slide.
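The interpolation step of blocks 3104 through 3112 can be sketched as follows: each shape's attributes move from the saved current state toward the target slide's values, with each shape progressing according to its own specified transition time. The function and dictionary shapes here are illustrative assumptions, not the disclosed implementation.

```python
def interpolate_states(current, target, elapsed, durations):
    """Compute one frame of the transition.

    current, target: {shape_id: {attribute: value}} for the saved state
    and the target slide, respectively.
    elapsed: seconds since the presentation timer started the transition.
    durations: {shape_id: seconds} -- each shape's own transition time.
    """
    frame = {}
    for shape, attrs in current.items():
        # Per-shape progress: a shape with a shorter duration finishes
        # its transition earlier than one with a longer duration.
        t = min(elapsed / durations.get(shape, 1.0), 1.0)
        frame[shape] = {
            name: value + (target[shape][name] - value) * t
            for name, value in attrs.items()
        }
    return frame
```

Calling this once per rendered frame with the running timer value yields the independent per-shape transition speeds described above.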
To illustrate, referring to
At block 3116, the process flow 3100 includes advancing to the next or another slide. For example, referring to
The process flows 3000 and 3100 illustrated in
The computing device 3200 may include at least one processor 3202, a memory 3204, communication interfaces 3206, a display device 3208 (e.g., a touchscreen display), other input/output (I/O) devices 3210 (e.g., a touchscreen display or a mouse and keyboard), and one or more mass storage devices 3212, able to communicate with each other, such as via a system bus 3214 or other suitable connection.
The processor 3202 may be a single processing unit or a number of processing units, all of which may include single or multiple computing units or multiple cores. The processor 3202 can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor 3202 can be configured to fetch and execute computer-readable instructions stored in the memory 3204, mass storage devices 3212, or other computer-readable media.
Memory 3204 and mass storage devices 3212 are examples of computer storage media for storing instructions which are executed by the processor 3202 to perform the various functions described above. For example, memory 3204 may generally include both volatile memory and non-volatile memory (e.g., RAM, ROM, or the like). Further, mass storage devices 3212 may generally include hard disk drives, solid-state drives, removable media, including external and removable drives, memory cards, flash memory, floppy disks, optical disks (e.g., CD, DVD), a storage array, a network attached storage, a storage area network, or the like. Both memory 3204 and mass storage devices 3212 may be collectively referred to as memory or computer storage media herein, and may be computer-readable media capable of storing computer-readable, processor-executable program instructions as computer program code that can be executed by the processor 3202 as a particular machine configured for carrying out the operations and functions described in the implementations herein.
The computing device 3200 may also include one or more communication interfaces 3206 for exchanging data with other devices, such as via a network, direct connection, or the like, as discussed above. The communication interfaces 3206 can facilitate communications within a wide variety of networks and protocol types, including wired networks (e.g., LAN, cable, etc.) and wireless networks (e.g., WLAN, cellular, satellite, etc.), the Internet and the like. Communication interfaces 3206 can also provide communication with external storage (not shown), such as in a storage array, network attached storage, storage area network, or the like.
The discussion herein refers to data being sent and received by particular components or modules. This should not be taken as a limitation, as such communication need not be direct and the particular components or modules need not necessarily be a single functional unit; the data could instead be relayed by a separate component upon receipt. Further, the components may be combined, or the functionality may be separated amongst components, in various manners not limited to those discussed above. Other variations in the logical and practical structure and framework of various implementations would be apparent to one of ordinary skill in the art in view of the disclosure provided herein.
A display device 3208, such as touchscreen display or other display device, may be included in some implementations. The display device 3208 may be configured to display the user interface 100 as described above. Other I/O devices 3210 may be devices that receive various inputs from a user and provide various outputs to the user, and may include a touchscreen, such as a touchscreen display, a keyboard, a remote controller, a mouse, a printer, audio input/output devices, and so forth.
Memory 3204 may include modules and components for execution by the computing device 3200 according to the implementations discussed herein. Memory 3204 may further include one or more other modules 3216, such as an operating system, drivers, application software, communication software, or the like. Memory 3204 may also include other data 3218, such as data stored while performing the functions described above and data used by the other modules 3216. Memory 3204 may also include other data and data structures described or alluded to herein.
The example systems and computing devices described herein are merely examples suitable for some implementations and are not intended to suggest any limitation as to the scope of use or functionality of the environments, architectures and frameworks that can implement the processes, components and features described herein. Thus, implementations herein are operational with numerous environments or architectures, and may be implemented in general purpose and special-purpose computing systems, or other devices having processing capability. Generally, any of the functions described with reference to the figures can be implemented using software, hardware (e.g., fixed logic circuitry) or a combination of these implementations. The term “module,” “mechanism” or “component” as used herein generally represents software, hardware, or a combination of software and hardware that can be configured to implement prescribed functions. For instance, in the case of a software implementation, the term “module,” “mechanism” or “component” can represent program code (and/or declarative-type instructions) that performs specified tasks or operations when executed on a processing device or devices (e.g., CPUs or processors). The program code can be stored in one or more computer-readable memory devices or other computer storage devices. Thus, the processes, components and modules described herein may be implemented by a computer program product.
As used herein, “computer-readable media” includes computer storage media and communication media. Computer storage media includes volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. Computer storage media includes, but is not limited to, random access memory (RAM), read only memory (ROM), electrically erasable programmable ROM (EEPROM), flash memory or other memory technology, compact disk ROM (CD-ROM), digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store information for access by a computing device.
In contrast, communication media may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave. As defined herein, computer storage media does not include communication media.
Furthermore, this disclosure provides various example implementations, as described and as illustrated in the drawings. However, this disclosure is not limited to the implementations described and illustrated herein, but can extend to other implementations, as would be known or as would become known to those skilled in the art. Reference in the specification to “one implementation,” “this implementation,” “these implementations” or “some implementations” means that a particular feature, structure, or characteristic described is included in at least one implementation, and the appearances of these phrases in various places in the specification are not necessarily all referring to the same implementation.
Although the subject matter has been described in language specific to structural features and/or methodological acts, the subject matter defined in the appended claims is not limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims. This disclosure is intended to cover any and all adaptations or variations of the disclosed implementations, and the following claims should not be construed to be limited to the specific implementations disclosed in the specification. Instead, the scope of this document is to be determined entirely by the following claims, along with the full range of equivalents to which such claims are entitled.
This application is a continuation of PCT International Application No. PCT/CN2013/084565, filed Sep. 29, 2013, the entire contents of which are incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
5574798 | Greer et al. | Nov 1996 | A |
6084582 | Qureshi et al. | Jul 2000 | A |
6396500 | Qureshi | May 2002 | B1 |
6453302 | Johnson et al. | Sep 2002 | B1 |
6580438 | Ichimura et al. | Jun 2003 | B1 |
6774920 | Cragun | Aug 2004 | B1 |
6826729 | Giesen | Nov 2004 | B1 |
7299418 | Dieberger | Nov 2007 | B2 |
7342586 | Jaeger | Mar 2008 | B2 |
7383509 | Foote et al. | Jun 2008 | B2 |
7428704 | Baker et al. | Sep 2008 | B2 |
7549120 | Griffith et al. | Jun 2009 | B1 |
7714802 | Hurley et al. | May 2010 | B2 |
7870503 | Levy | Jan 2011 | B1 |
7996436 | Schneider et al. | Aug 2011 | B2 |
8166402 | Collins et al. | Apr 2012 | B2 |
8269790 | Wong et al. | Sep 2012 | B2 |
9043722 | Holt | May 2015 | B1 |
9093007 | Berglund | Jul 2015 | B2 |
20010020953 | Moriwake | Sep 2001 | A1 |
20010021938 | Fein | Sep 2001 | A1 |
20020069218 | Sull et al. | Jun 2002 | A1 |
20020147740 | Faraday | Oct 2002 | A1 |
20020194230 | Polanyi | Dec 2002 | A1 |
20030090506 | Moore | May 2003 | A1 |
20030122863 | Dieberger et al. | Jul 2003 | A1 |
20030202007 | Silverstein et al. | Oct 2003 | A1 |
20040239679 | Ito | Dec 2004 | A1 |
20040267387 | Samadani | Dec 2004 | A1 |
20050066059 | Zybura et al. | Mar 2005 | A1 |
20050081154 | Vogel | Apr 2005 | A1 |
20050108619 | Theall et al. | May 2005 | A1 |
20050193323 | Coulomb et al. | Sep 2005 | A1 |
20050246313 | Turski et al. | Nov 2005 | A1 |
20060036568 | Moore | Feb 2006 | A1 |
20060200759 | Agrawala et al. | Sep 2006 | A1 |
20070055926 | Christiansen et al. | Mar 2007 | A1 |
20070055939 | Furlong et al. | Mar 2007 | A1 |
20070058207 | Asai | Mar 2007 | A1 |
20070101299 | Shaw | May 2007 | A1 |
20070118506 | Kao et al. | May 2007 | A1 |
20070133034 | Jindal et al. | Jun 2007 | A1 |
20070171201 | Pi | Jul 2007 | A1 |
20070186167 | Anderson | Aug 2007 | A1 |
20070186168 | Waldman | Aug 2007 | A1 |
20070226625 | Cardone | Sep 2007 | A1 |
20070266325 | Helm | Nov 2007 | A1 |
20080028314 | Bono | Jan 2008 | A1 |
20080034345 | Curtis et al. | Feb 2008 | A1 |
20080040340 | Varadarajan et al. | Feb 2008 | A1 |
20080046803 | Beauchamp | Feb 2008 | A1 |
20080178089 | Baker et al. | Jul 2008 | A1 |
20090044117 | Vaughan | Feb 2009 | A1 |
20090100369 | Mindrum | Apr 2009 | A1 |
20090119597 | Vaughan | May 2009 | A1 |
20090172548 | Screen | Jul 2009 | A1 |
20090172549 | Davidson | Jul 2009 | A1 |
20090172559 | Waldman | Jul 2009 | A1 |
20090216794 | Saptharishi | Aug 2009 | A1 |
20090262116 | Zhao | Oct 2009 | A1 |
20090292986 | Anderson | Nov 2009 | A1 |
20090309881 | Zhao | Dec 2009 | A1 |
20100031152 | Villaron | Feb 2010 | A1 |
20100064223 | Tilton | Mar 2010 | A1 |
20100088605 | Livshin et al. | Apr 2010 | A1 |
20100118037 | Sheikh | May 2010 | A1 |
20100146393 | Land et al. | Jun 2010 | A1 |
20100169784 | Weber | Jul 2010 | A1 |
20100199180 | Brichter | Aug 2010 | A1 |
20100207950 | Zhao | Aug 2010 | A1 |
20100218100 | Simon | Aug 2010 | A1 |
20100238176 | Guo | Sep 2010 | A1 |
20100281367 | Langmacher | Nov 2010 | A1 |
20100293470 | Zhao | Nov 2010 | A1 |
20100309436 | Allen, Jr. et al. | Dec 2010 | A1 |
20110004563 | Rauber | Jan 2011 | A1 |
20110040804 | Peirce et al. | Feb 2011 | A1 |
20110181521 | Reid | Jul 2011 | A1 |
20110181602 | Boda | Jul 2011 | A1 |
20110191672 | Schodl et al. | Aug 2011 | A1 |
20110271179 | Jasko et al. | Nov 2011 | A1 |
20110295392 | Cunnington | Dec 2011 | A1 |
20110302494 | Callery | Dec 2011 | A1 |
20120221975 | Juristovski et al. | Aug 2012 | A1 |
20130024772 | Delia et al. | Jan 2013 | A1 |
20130050255 | Sprang et al. | Feb 2013 | A1 |
20130097552 | Villaron | Apr 2013 | A1 |
20130111373 | Kawanishi et al. | May 2013 | A1 |
20130120400 | Maloney | May 2013 | A1 |
20130120403 | Maloney et al. | May 2013 | A1 |
20130288722 | Ramanujam et al. | Oct 2013 | A1 |
20140096006 | Berglund | Apr 2014 | A1 |
20140165087 | Smith et al. | Jun 2014 | A1 |
20140317488 | Lutz | Oct 2014 | A1 |
20140344702 | Edge et al. | Nov 2014 | A1 |
20150007005 | Edge et al. | Jan 2015 | A1 |
20150033116 | McKinney et al. | Jan 2015 | A1 |
20150113372 | Flider | Apr 2015 | A1 |
20150132735 | Edge et al. | May 2015 | A1 |
20150178287 | Kim | Jun 2015 | A1 |
20160196681 | Tilton | Jul 2016 | A1 |
Number | Date | Country |
---|---|---|
102081946 | Jun 2011 | CN |
102169483 | Aug 2011 | CN |
102903128 | Jan 2013 | CN |
103279259 | Sep 2013 | CN |
1696337 | Aug 2006 | EP |
2004184576 | Jul 2004 | JP |
WO2000026828 | May 2000 | WO |
WO0145018 | Jun 2001 | WO |
WO2007069557 | Jun 2007 | WO |
WO2010151257 | Dec 2010 | WO |
Entry |
---|
Batts, “A Beamer Tutorial in Beamer”, Department of Computer Science, The University of North Carolina at Greensboro, Apr. 2007, 55 pages. |
The European Office Action dated Apr. 4, 2016 for European patent application No. 14730712.8, a counterpart foreign application of U.S. Appl. No. 13/898,338, 5 pages. |
The Supplementary European Search Report dated Mar. 15, 2016 for European patent application No. 14730712.8, 4 pages. |
Indezine.com, retrieved on Feb. 5, 2012 at <<http://www.indezine.com/products/powerpoint/learn/picturesandvisuals/apply-theme-to-photo-album-ppt2010.html>> pp. 1-3. |
PCT International Preliminary Report on Patentability for Application No. PCT/US2014/064714, dated Feb. 15, 2016 (8 pages). |
Office action for U.S. Appl. No. 13/898,338, dated Apr. 15, 2016, Edge et al., “Adaptive Timing Support for Presentations”, 17 pages. |
Slidy—a web based alternative to Microsoft PowerPoint, May 14, 2006, <<http://www.w3.org/2006/05/Slidy-XTech/slidy-xtech06-dsr.pdf>> pp. 1-14. |
Tex Sound, embedding sound files into beamer presentation with media9, retrieved on Apr. 12, 2012 at <<http://tex.stackexchange.com/questions/51632/embedding-soundfiles-into-beamer-presentation-with-media9>> pp. 1-3. |
Bajaj, “Apply Theme to Photo Album Presentations with PowerPoint 2010”, Retrieved from <<http://www.indezine.com/products/powerpoint/learn/picturesandvisuals/apply-theme-to-photo-album-ppt2010.html>>, Feb. 2012, 3 pages. |
Batts, “A Beamer Tutorial in Beamer”, Department of Computer Science, The University of North Carolina at Greensboro, Apr. 2007, 110 pages. |
Career Track, “QuickClicks Reference Guide to Microsoft PowerPoint 2010”, Published by Career Track, a Division of PARK University Enterprises, Inc., 2011, 4 pages. |
LaTeX, “Beamer linking within document”, LaTeX Community Forum, Retrieved from <<http://www.latex-community.org/forum/viewtopic.php?f=4&t=4594>>, Apr. 2009, 5 pages. |
Office action for U.S. Appl. No. 13/933,030, dated Jan. 4, 2016, Edge et al., “Dynamic Presentation Prototyping and Generation”, 18 pages. |
Office action for U.S. Appl. No. 14/077,674, dated Feb. 1, 2016, Edge et al., “Presentation Rehearsal”, 33 pages. |
Raggett, “Slidy—A Web based alternative to Microsoft PowerPoint”, XTech 2006, May 2006, 13 pages. |
Tex, “Background image in beamer slides”, Retrieved from <<http://tex.stackexchange.com/questions/78464/background-image-in-beamer-slides>>, Oct. 2012, 6 pages. |
TeX, “Embedding sound files into beamer presentation with media9”, Retrieved from <<http://tex.stackexchange.com/questions/51632/embedding-sound-files-into-beamer-presnetation-with-media9>>, Apr. 2012, 3 pages. |
The European Office Action dated Dec. 18, 2015 for European patent application No. 13876698.5, a counterpart foreign application of U.S. Appl. No. 13/933,030, 6 pages. |
The Supplementary European Search Report dated Oct. 28, 2015 for European patent application No. 13876698.5, 2 pages. |
Office action for U.S. Appl. No. 13/898,338, dated Dec. 15, 2015, Edge et al., “Adaptive Timing Support for Presentations”, 16 pages. |
“PowerPoint2010: Applying Transitions”, Retrieved from <<http://www.gcflearnfree.org/powerpoing2010/6.4>>, Available as early as Jan. 2011, 1 page. |
Raggett, et al., “HTML 4.01 Specification”, Internet Citation, Dec. 24, 1999, retrieved from the internet on May 3, 2011 at URL:http://www.w3.org/TR/html401. |
Wikipedia Digital Dictation, retrieved on Jul. 29, 2015 at <<https://en.wikipedia.org/w/index.php?title=Digital_dictation&oldid=Presentation program&oldid=546255522>>, Wikipedia, 3 pgs. |
Office action for U.S. Appl. No. 13/898,338 dated Aug. 19, 2015, Edge et al., “Adaptive Timing Support for Presentations”, 17 pages. |
The PCT Written Opinion of the IPEA dated Aug. 4, 2015 for PCT application No. PCT/US2014/064714, 9 pages. |
Chinese Office Action dated Mar. 14, 2017 for Chinese patent application No. 201380074201.X, a counterpart foreign application of U.S. Appl. No. 13/933,030.
The Extended European Search Report dated Mar. 30, 2017 for European patent application No. 13888284.0, 8 pages.
U.S. Appl. No. 13/898,338, Darren Edge et al., “Adaptive Timing Support for Presentations,” filed May 20, 2013, 55 pages.
U.S. Appl. No. 13/933,030, Darren Edge et al., “Dynamic Presentation Prototyping and Generation,” filed Jul. 1, 2013, 45 pages.
U.S. Appl. No. 14/077,674, Darren Edge et al., “Presentation Rehearsal,” filed Nov. 12, 2013, 52 pages.
“Apple Keynote,” retrieved on Feb. 2, 2013 at <<http://www.apple.com/iwork/keynote/>>, Apple Inc., 2013, 4 pages.
Beck et al., “Principles behind the Agile Manifesto,” retrieved on Feb. 2, 2013 at <<http://agilemanifesto.org/principles.html>>, 2001, 2 pages.
Bederson et al., “Pad++: Advances in Multiscale Interfaces,” retrieved on Feb. 9, 2013 at <<http://wiki.lri.fr/fondihm/_files/pad-chi94-bederson.pdf>>, ACM, Proceedings of Conference Companion on Human Factors in Computing Systems, Apr. 24, 1994, pp. 315-316.
Bohon, “How to Create a Presentation with Markdown,” retrieved on Mar. 5, 2013 at <<http://www.maclife.com/article/howtos/how_create_presentation_markdown>>, Mac/Life, May 2, 2012, 5 pages.
Carnegie, “The Quick and Easy Way to Effective Speaking: Modern Techniques for Dynamic Communication,” Pocket Books, copyright 1962, 112 pages.
Carpenter et al., “What types of learning are enhanced by a cued recall test?” retrieved on Feb. 9, 2013 at <<http://www.edvul.com/pdf/CarpenterPashlerVul-PBR-2006.pdf>>, Psychonomic Bulletin and Review, vol. 13, No. 5, 2006, pp. 826-830.
Charmaz, “Constructing Grounded Theory: A Practical Guide through Qualitative Analysis,” Sage Publications, 2006, 219 pages.
“Create a PowerPoint presentation from a plain text file,” retrieved on Mar. 5, 2013 at <<http://www.pptfaq.com/FAQ00246_Create_a_PowerPoint_presentation_from_a_plain_text_file.htm>>, PPTools, 2 pages.
Duarte, “Resonate: Present Visual Stories that Transform Audiences,” Wiley, 2010, 251 pages.
Duarte, “Slide:ology: The Art and Science of Creating Great Presentations,” O'Reilly Media, 2010, 240 pages.
Edge et al., “MicroMandarin: Mobile Language Learning in Context,” retrieved on Feb. 9, 2013 at <<http://voxy.com/blog/wp-content/uploads/2011/03/micromandarin.pdf>>, Proceedings of Conference on Human Factors in Computing Systems (CHI), May 7, 2011, pp. 3169-3178.
Fourney et al., “Gesturing in the Wild: Understanding the Effects and Implications of Gesture-Based Interaction for Dynamic Presentations,” retrieved on Feb. 9, 2013 at <<http://www.adamfourney.com/papers/fourney_terry_mann_bhci2010.pdf>>, Proceedings of British Computer Society (BCS) Interaction Specialist Group Conference, Sep. 6, 2010, pp. 230-240.
Gallo, “The Presentation Secrets of Steve Jobs: How to be Insanely Great in Front of Any Audience,” McGraw Hill, 2010, 128 pages.
Good et al., “Zoomable user interfaces as a medium for slide show presentations,” retrieved on Feb. 9, 2013 at <<http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.101.2931&rep=rep1&type=pdf>>, Journal of Information Visualization, vol. 1, No. 1, Mar. 2002, pp. 35-49.
“Google Docs Presentations,” retrieved on Feb. 13, 2013 at <<http://www.google.com/drive/start/apps.html#product=slides>>, Google Drive, 3 pages.
Gouli et al., “An Adaptive Feedback Framework to Support Reflection, Guiding and Tutoring,” Advances in Web-Based Education: Personalized Learning Environments, Oct. 2005, 19 pages.
“Haiku Deck,” retrieved on Feb. 13, 2013 at <<http://www.haikudeck.com/>>, Giant Thinkwell Inc., 2012, 2 pages.
“Impress.js,” retrieved on Feb. 13, 2013 at <<http://bartaz.github.com/impress.js/#/title>>, 4 pages.
Iqbal et al., “Peripheral Computing During Presentations: Perspectives on Costs and Preferences,” retrieved on Feb. 9, 2013 at <<http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.188.4197&rep=rep1&type=pdf>>, Proceedings of Conference on Human Factors in Computing Systems (CHI), May 7, 2011, pp. 891-894.
Kurihara et al., “Presentation Sensei: A Presentation Training System using Speech and Image Processing,” retrieved on Feb. 9, 2013 at <<http://www.unryu.org/home/papers/icmi130-kurihara.pdf>>, Proceedings of Intl. Conference on Multimodal Interfaces (ICMI), Nov. 12, 2007, pp. 358-365.
Lane et al., “Cafe-style PowerPoint: Navigation's Conversational Touch,” retrieved on Mar. 5, 2013 at <<http://office.microsoft.com/en-in/powerpoint-help/cafe-style-powerpoint-navigation-s-conversational-touch-HA010274710.aspx>>, Microsoft Corporation, 2013, 5 pages.
Lanir et al., “Observing Presenters' Use of Visual Aids to Inform the Design of Classroom Presentation Software,” retrieved on Feb. 9, 2013 at <<http://nguyendangbinh.org/Proceedings/CHI/2008/docs/p695.pdf>>, Proceedings of Conference on Human Factors in Computing Systems (CHI), Apr. 5, 2008, pp. 695-704.
Levasseur et al., “Pedagogy Meets PowerPoint: A Research Review of the Effects of Computer-Generated Slides in the Classroom,” retrieved on Feb. 9, 2013 at <<http://flc-ppt-plus.wikispaces.com/file/view/LevasseurandSawyer.pdf>>, National Communication Association, The Review of Communication, vol. 6, No. 1-2, Jan.-Apr. 2006, pp. 101-123.
Lichtschlag et al., “Canvas Presentations in the Wild,” CHI 12 EA, 2012, pp. 537-540.
Lichtschlag, “Fly an Organic Authoring Tool for Presentations,” Diploma Thesis, Media Computing Group, published Nov. 10, 2008, retrieved at <<http://hci.rwth-aachen.de/materials/publications/lichtschlag2008.pdf>>, 114 pages (in 2 parts).
Lichtschlag et al., “Fly: Studying Recall, Macrostructure Understanding, and User Experience of Canvas Presentations,” Session: Space: The Interaction Frontier, CHI 2012, May 5-10, 2012, Austin, Texas, USA, 4 pages.
Lichtschlag et al., “Fly: A Tool to Author Planar Presentations,” retrieved on Feb. 9, 2013 at <<https://hci.rwth-aachen.de/materials/publications/lichtschlag2009.pdf>>, Proceedings of Conference on Human Factors in Computing Systems (CHI), Apr. 4, 2009, pp. 547-556.
Mayer, “Multi-Media Learning,” New York: Cambridge University Press, 2009, 2nd edition, 162 pages.
Mayer, “Multimedia Learning: Are We Asking the Right Questions?” retrieved on Feb. 9, 2013 at <<http://www.uky.edu/gmswan3/544/mayer_1997.pdf>>, Journal of Educational Psychologist, vol. 32, No. 1, 1997, pp. 1-19.
Mayer et al., “Nine Ways to Reduce Cognitive Load in Multimedia Learning,” retrieved on Feb. 9, 2013 at <<http://www.uky.edu/˜mswan3/544/9_ways_to_reduce_CL.pdf>>, Journal of Educational Psychologist, vol. 38, No. 1, 2003, pp. 43-52.
“Microsoft PowerPoint,” retrieved on Feb. 13, 2013 at <<http://office.microsoft.com/en-us/powerpoint>>, Microsoft Corporation, 2013, 2 pages.
Moscovich et al., “Customizable Presentations,” retrieved on Feb. 9, 2013 at <<http://www.moscovich.net/tomer/papers/cpresentations.pdf>>, Proceedings of Intl. Conference on Multimodal Interfaces (ICMI), Nov. 5, 2003, 5 pages.
Nelson et al., “Palette: A Paper Interface for Giving Presentations,” retrieved on Feb. 9, 2013 at <<http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.39.5686&rep=rep1&type=pdf>>, Proceedings of Conference on Human Factors in Computing Systems (CHI), May 1999, pp. 354-361.
Nelson et al., “Pictorial Superiority Effect,” Journal of Experimental Psychology: Human Learning & Memory, Sep. 1976, vol. 2, pp. 523-528.
Norman, “In Defense of PowerPoint,” retrieved on Feb. 13, 2013 at <<http://www.jnd.org/dn.mss/in_defense_of_p.html>>, 2004, 5 pages.
Paivio, “Mental Representations: A Dual Coding Approach,” Oxford University Press, 1990, 170 pages (2 parts).
Panjwani et al., “Collage: A Presentation Tool for School Teachers,” retrieved on Feb. 9, 2013 at <<http://www.gg.rhul.ac.uk/ict4d/ictd2010/papers/ICTD2010%20Panjwani%20et%20al.pdf>>, Proceedings of ACM/IEEE Intl. Conference on Information and Communication Technologies and Development (ICTD), Article No. 30, Dec. 13, 2010, 10 pages.
Parker, “Absolute PowerPoint: Can a Software Package Edit our Thoughts?” retrieved on Feb. 9, 2013 at <<http://www.utdallas.edu/˜dxt023000/courses/6331/readings/Anti-PowerPoint.pdf>>, The New Yorker, May 28, 2001, 15 pages.
PCT Patent Application No. PCT/CN2013/072061, filed on Mar. 1, 2013, Koji Yatani et al., “Dynamic Presentation Prototyping,” 10 pages.
PCT Patent Application PCT/CN2013/078288 filed on Jun. 28, 2013, Koji Yatani et al., “Selecting and Editing Visual Elements with Attribute Groups,” 44 pages.
“Pecha Kucha 20x20,” retrieved Feb. 13, 2013 at <<http://www.pechakucha.org/>>, 2013, 5 pages.
“PPTPlex PowerPoint Add-In,” retrieved on Feb. 13, 2013 at <<http://www.microsoft.com/en-us/download/details.aspx?id=28558>>, Microsoft Corporation, Dec. 15, 2011, 2 pages.
“Prezi—Ideas matter,” retrieved on Feb. 13, 2013 at <<http://prezi.com/>>, Prezi Inc., 2013, 3 pages.
Raggett, “HTML Slidy: Slide Shows in HTML and XHTML,” retrieved on Feb. 13, 2013 at <<http://www.w3.org/Talks/Tools/Slidy2/#(1)>>, W3C, 2013, 23 pages.
Reynolds, “Presentation Zen: Simple Ideas on Presentation Design and Delivery,” New Riders, 2008, 234 pages.
Roberts, “Aristotle's Rhetoric,” retrieved on Feb. 12, 2013 at <<http://rhetoric.eserver.org/aristotle/>>, Alpine Lakes Design, A hypertextual resource compiled by Lee Honeycutt, 2011, 1 page.
Signer et al., “PaperPoint: A Paper-Based Presentation and Interactive Paper Prototyping Tool,” retrieved on Feb. 9, 2013 at <<http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.71.6985&rep=rep1&type=pdf>>, Proceedings of Intl. Conference on Tangible and Embedded Interaction (TEI), Feb. 15, 2007, pp. 57-64.
“SlideShare,” retrieved on Feb. 13, 2013 at <<http://www.slideshare.net/>>, SlideShare Inc., 2013, 9 pages.
Spicer et al., “NextSlidePlease: Authoring and Delivering Agile Multimedia Presentations,” retrieved on Feb. 9, 2013 at <<http://ame2.asu.edu/faculty/hs/pubs/2012/2012_nsp.pdf>>, Journal of ACM Transactions on Multimedia Computing, Communications, and Applications (TOMCCAP), vol. 2, No. 3, May 2010, 25 pages.
Stiller et al., “Presentation time concerning system-paced multimedia,” Australian Journal of Educational Technology, vol. 27, Issue 4, Aug. 2011, pp. 693-708.
Sweller, “Cognitive Load During Problem Solving: Effects on Learning,” retrieved on Feb. 9, 2013 at <<http://csjarchive.cogsci.rpi.edu/1988v12/i02/p0257p0285/main.pdf>>, Journal of Cognitive Science, vol. 12, No. 2, Apr. 1988, pp. 257-285.
Tam, “The Design and Field Observation of a Haptic Notification System,” Thesis Submitted in Partial Fulfillment of the Requirements for the Degree of Master of Science, The University of British Columbia, Vancouver, BC, Canada, Oct. 2012, 120 pages.
Teevan et al., “Displaying Mobile Feedback during a Presentation,” retrieved on Feb. 9, 2013 at <<http://research.microsoft.com/en-us/um/people/teevan/publications/papers/mobilehci12.pdf>>, Proceedings of Intl. Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI), Sep. 21, 2012, 4 pages.
The Beamer Package, retrieved on Feb. 12, 2013 at <<http://en.wikibooks.org/wiki/LaTeX/Presentations>>, Wikimedia, 2011, 12 pages.
Tufte, “The Cognitive Style of PowerPoint: Pitching Out Corrupts Within,” 2003, Graphics Press, 28 pages.
“Tutorial: an Introduction to the Magic Move Transition in Keynote,” retrieved on Aug. 30, 2013 at <<http://www.keynoteclassroom.com/index_files/Tutorial-Magic-Move.html>>, 2 pages.
Tweening in PowerPoint, published on Feb. 2, 2010, retrieved at <<http://www.pptalchemy.co.uk/Tweeny.html>>, 2 pages.
Weissman, “Presenting to Win: The Art of Telling Your Story,” FT Press, 2009, 268 pages.
Weissman, “The Power Presenter: Technique, Style, and Strategy from America's Top Speaking Coach,” Wiley, 2009, 135 pages.
“Welcome to the Open XML SDK 2.5 CTP for Office,” retrieved on Feb. 9, 2013 at <<http://msdn.microsoft.com/en-us/library/office/bb448854.aspx>>, Microsoft Corporation, Aug. 22, 2012, 2 pages.
Wozniak et al., “Optimization of Repetition Spacing in the Practice of Learning,” retrieved on Feb. 9, 2013 at <<http://www.ane.pl/pdf/5409.pdf>>, Journal of Acta Neurobiologiae Experimentalis, vol. 54, No. 1, 1994, pp. 59-62.
Yonge, translation of Cicero's De Inventione, “Treatise on Rhetorical Invention,” retrieved on Feb. 12, 2013 at <<http://www.classicpersuasion.org/pw/cicero/dnvindex.htm>>, The Orations of Marcus Tullius Cicero: Cicero's De Inventione translated by C. D. Yonge, George Bell and Sons, London, vol. 4, 1888, pp. 241-380.
Zongker, “Creating Animation for Presentations,” retrieved on Feb. 9, 2013 at <<http://grail.cs.washington.edu/theses/ZongkerPhd.pdf>>, PhD Dissertation, University of Washington, 2003, 228 pages.
“International Search Report & Written Opinion for PCT Patent Application No. PCT/US2014/064714”, dated Mar. 24, 2015, 10 Pages.
Wikipedia: “Laptop”, retrieved on Mar. 16, 2015 at <<http://en.wikipedia.org/w/index.php?title=Laptop&oldid=580975732>>, Wikipedia, 23 pgs.
Wikipedia: “Microsoft PowerPoint”, retrieved on Mar. 13, 2015 at <<http://en.wikipedia.org/w/index.php?title=Microsoft_PowerPoint&oldid=580301821>>, Wikipedia, 10 pgs.
Office action for U.S. Appl. No. 13/898,338, dated Apr. 8, 2015, Edge et al., “Adaptive Timing Support for Presentations”, 15 pages.
Wikipedia: “Presentation program”, retrieved on Mar. 13, 2015 at <<http://en.wikipedia.org/w/index.php?title=Presentation_program&oldid=580933046>>, Wikipedia, 3 pgs.
The PCT Search Report and Written Opinion dated Jul. 1, 2014 for PCT application No. PCT/CN2013/084565, 11 pages.
Office action for U.S. Appl. No. 13/898,338, dated Apr. 15, 2016, Edge et al., “Adaptive Timing Support for Presentations”, 17 pages.
Wikipedia, “Digital Dictation”, Internet Article, Mar. 22, 2013, retrieved on Jul. 29, 2015 at: https://en.wikipedia.org/w/index.php?title=Digital_dictation&oldid=546255522, Wikipedia, 3 pages.
Edge et al., “HyperSlides: Dynamic Presentation Prototyping”, In ACM SIGCHI Conference on Human Factors in Computing Systems, Apr. 29, 2013, 10 pages.
Mamykina et al., “Time Aura Interfaces for Pacing”, CHI 2001 Conference on Human Factors in Computing Systems, Mar.-Apr. 2001, vol. 3, Issue 1, pp. 144-151.
U.S. Appl. No. 13/898,338, Edge et al., “Adaptive Timing Support for Presentations”, filed May 20, 2013, 55 pages.
PCT International Search Report and Written Opinion in International Application PCT/US2014/064714, dated Mar. 24, 2015, 10 pages.
Wikipedia, “Laptop”, retrieved Mar. 16, 2015 at: http://en.wikipedia.org/w/index.php?title=Laptop&oldid=580975732, Wikipedia, 19 pages.
European Office Action in EP Application 13876698.5, dated Dec. 18, 2015, a foreign counterpart application to U.S. Appl. No. 13/933,030, 6 pages.
U.S. Appl. No. 13/898,338, Office Action dated Dec. 15, 2015, Edge et al., “Adaptive Timing Support for Presentations”, 16 pages.
Career Track, “QuickClicks Reference Guide to Microsoft PowerPoint 2010”, published by Career Track, a division of Park Univ. Enterprises, Inc., 2011, 4 pages.
U.S. Appl. No. 14/077,674, Office Action dated Feb. 1, 2016, 33 pages.
European Office Action dated Apr. 4, 2016 in EP Application 14730712.8, a counterpart application for U.S. Appl. No. 13/898,338, 5 pages.
U.S. Appl. No. 13/898,338, Office Action dated Apr. 15, 2016, 17 pages.
European Search Report dated Mar. 15, 2016 in EP Patent Appl. 14730712.8, 4 pages.
PCT International Preliminary Report on Patentability in Application PCT/US2014/064714, dated Feb. 15, 2016, 8 pages.
Number | Date | Country
---|---|---
20150095785 A1 | Apr 2015 | US
 | Number | Date | Country
---|---|---|---
Parent | PCT/CN2013/084565 | Sep 2013 | US
Child | 14464607 | | US