Electronic presentations are used in a variety of contexts for conveying information. For example, a businessperson may use an electronic slide presentation to convey information about business performance. In another example, a teacher may use an electronic slide presentation to teach a lesson.
Presentation applications executing on personal computers are used to author and present electronic presentations. A typical presentation application presents an authoring interface that enables a user to edit slides in a presentation. The authoring interface may include a primary pane and a navigation pane. The primary pane contains an editable slide in the presentation. The navigation pane may include a series of thumbnail images of each slide in the presentation. A thumbnail image of a slide is a smaller version of the slide. A user of the presentation application can click on a thumbnail image of a slide to cause the primary pane of the authoring interface to display the slide for editing.
Electronic slide presentations may include a large number of slides and may contain information about several topics. For example, a physics teacher may use an electronic slide presentation to teach a lesson that includes slides about resistance, slides about capacitance, and slides about an upcoming exam.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
In general, this disclosure describes techniques that employ user-defined values of properties of sections of an electronic presentation. As described herein, a user may configure a presentation to include a plurality of sections. Each of the sections includes zero or more slides of the electronic slide presentation. In addition to the slides associated with each section, each of the sections is associated with one or more properties having values that can be defined by an author of the presentation. Because the values of the properties of the sections are user-definable, these properties may, in some implementations of these techniques, enable the author of the presentation to use the sections in ways not possible in presentation applications that merely use sections as a means of grouping thumbnail images of slides for easy navigation among slides in an authoring interface.
As described below, the properties of a section may include, for example, a title of the section, a name of an author of the section, a set of access control data that specifies rights of users to perform actions with respect to the section, and other properties. The properties of sections may be used during authoring of the presentation or during presentation of the presentation. The uses of such properties during authoring of a presentation may include, for example, the ability to conceal or reveal thumbnail images of slides in a section by clicking on a title of the section, the ability to reorder sections using titles of the sections, the ability to use the name of a section to print the slides in the section, the ability to create a hyperlink to a section of a presentation, the ability to associate searchable keywords with sections, and other uses. The uses of such properties during presentation of the presentation may include, for example, the ability to view the names of sections of the presentation during the presentation and the ability to navigate to the first slide in a section.
In general, this disclosure describes techniques that employ user-defined values of properties of sections of an electronic presentation. In the following description, various examples are described. It should be appreciated that these examples are provided for purposes of explanation and not as express or implied limitations on the scope of the claims.
As illustrated in the example of
The example of
Storage medium 8 is capable of storing instructions that are readable and executable by processing unit 4. Storage medium 8 may be a wide variety of different types of computer-readable storage media. For example, storage medium 8 may be implemented as one or more random access memory units, one or more read-only memory units, magnetic disks, optical disks, magnetic tapes, flash memory units, or other types of storage media. It should be appreciated that the term “storage medium” refers to a collection of one or more storage media units or one or more types of storage media. For instance, some data in storage medium 8 may be physically stored on a magnetic tape and some data in storage medium 8 may be physically stored on a magnetic disk.
In the example of
Input device 16 may be a wide variety of different types of devices. For example, input device 16 may be a mouse, a trackball, a touch-sensitive screen, a keyboard, a keypad, or another type of input device.
Output device 18 may also be a wide variety of different types of devices. For example, output device 18 may be a visual display unit such as a cathode ray display screen, a liquid crystal display (LCD) screen, a light-emitting diode (LED) array, a plasma screen, or another type of device that is capable of outputting information to the real world. Processing unit 4 may present information on output device 18 in a variety of ways. For example, processing unit 4 and output device interface 14 may be connected to a motherboard of computing device 2. In this example, a digital visual interface cable, or another type of physical video connector cable, may connect the output device interface 14 and output device 18. In this example, processing unit 4 may send instructions regarding an image to output device interface 14 and output device interface 14 may send signals to output device 18 to display the image. In another example, processing unit 4 may present information on output device 18 by transmitting information over a network to a computing device that causes output device 18 to display an image based on the transmitted information.
Storage medium 8 stores a presentation file 20 that represents an electronic presentation. Presentation file 20 contains at least one set of section data. Each set of section data in presentation file 20 defines a section. A “section” is a logical unit of an electronic presentation that has at least one property having a user-definable value and that is associated with zero or more sequential or non-sequential slides of the presentation. Each set of section data identifies a set of slides associated with a section and a user-defined value of a property of the section other than the set of slides included in the section. As discussed below, the property of the section may be any of a variety of different properties, and a set of section data may include several user-defined values of properties of a section. For example, a first set of section data may contain data that directly represents six slides included in a first section and may contain data that indicates that the name of the first section is “Section 1.” Further, in this example, a second set of section data in presentation file 20 may contain data that directly represents four slides and may contain data that indicates that the name of the second section is “Section 2.” In this example, the first section may include slides 1, 3, 5, 6, 7, and 9 and the second section may include slides 2, 4, 8, and 10.
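As a concrete illustration of the example above, the following sketch models section data in Python; the Section class, its field names, and the two example sections are assumptions made for illustration and do not reflect any particular presentation file format.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Section:
    """One logical section of a presentation (hypothetical model)."""
    name: str                                                  # user-defined title, e.g. "Section 1"
    slide_ids: List[int] = field(default_factory=list)         # slides need not be sequential
    properties: Dict[str, str] = field(default_factory=dict)   # other user-defined property values


# The example from the text: two sections over ten slides,
# each associated with non-sequential slide numbers.
section_1 = Section("Section 1", [1, 3, 5, 6, 7, 9])
section_2 = Section("Section 2", [2, 4, 8, 10])
presentation_sections = [section_1, section_2]
```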
Presentation file 20 may include a wide variety of different types of data structures that embody the sets of section data. For example, presentation file 20 may include an extensible markup language (XML) data structure for each set of section data. In another example, presentation file 20 may include binary data structures that represent each set of section data. In another example, presentation file 20 may include a set of section data that identifies the first set of slides and identifies the data that specifies the user-defined value of the property of the first section by specifying a link to a third set of section data contained in a second presentation file. In this example, the third set of section data identifies the first set of slides and identifies data that specifies the user-defined value of the property of the first section by containing data that directly represents the first set of slides and by containing data that directly represents the user-defined value of the property of the first section.
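The following sketch shows one way the XML-based approach could look; the element and attribute names in the fragment are hypothetical, and the snippet simply parses the fragment with Python's standard xml.etree.ElementTree module to recover each section's name and slides.

```python
import xml.etree.ElementTree as ET

# Hypothetical XML fragment for the two sections described above.
SECTION_XML = """
<sections>
  <section name="Section 1">
    <slide id="1"/><slide id="3"/><slide id="5"/>
    <slide id="6"/><slide id="7"/><slide id="9"/>
  </section>
  <section name="Section 2">
    <slide id="2"/><slide id="4"/><slide id="8"/><slide id="10"/>
  </section>
</sections>
"""

root = ET.fromstring(SECTION_XML)
for section in root.findall("section"):
    slide_ids = [int(s.get("id")) for s in section.findall("slide")]
    print(section.get("name"), slide_ids)
```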
In addition to presentation file 20, storage medium 8 stores a presentation application 22. Presentation application 22 may be similar in some respects to presentation applications such as the Microsoft POWERPOINT® presentation graphics program sold by Microsoft Corporation of Redmond, Wash., the KEYNOTE® slide presentation software sold by Apple Corporation of Cupertino, Calif., the OpenOffice Impress slide presentation software provided by OpenOffice.org, and the GOOGLE APPS® slide presentation application provided by Google, Inc. of Mountain View, Calif.
In one example implementation, presentation application 22 comprises a set of instructions that are executable by processing unit 4. When a user 24 wants to interact with the electronic presentation represented by presentation file 20, user 24 may use input device 16 to instruct computing device 2 to begin executing the instructions of presentation application 22. For example, user 24 may instruct computing device 2 to begin executing instructions of presentation application 22 by using a mouse to select an icon displayed on output device 18 that represents presentation application 22. In another example, user 24 may instruct computing device 2 to begin executing instructions of presentation application 22 by using a keyboard to select an icon representing presentation file 20.
When processing unit 4 begins executing the instructions of presentation application 22, the instructions cause processing unit 4 to access presentation file 20. Upon accessing presentation file 20, the instructions of presentation application 22 cause processing unit 4 to generate a graphical interface 26 in storage medium 8. When processing unit 4 generates graphical interface 26, processing unit 4 uses the values of the properties of the sections of the presentation. Graphical interface 26, when displayed on output device 18, enables user 24 to interact with an electronic presentation that includes the slides in each of the sections defined by the sets of section data included in presentation file 20. After causing processing unit 4 to generate graphical interface 26, the instructions of presentation application 22 cause processing unit 4 to display graphical interface 26 on output device 18.
After the instructions of presentation application 22 cause processing unit 4 to access presentation file 20, the instructions of presentation application 22 cause processing unit 4 to generate graphical interface 26 using at least one user-defined value of a property of one of the sections (42). Continuing the example cited in the previous paragraph, the instructions of presentation application 22 may cause processing unit 4 to generate graphical interface 26 using the value of the property of the first section and the value of the property of the second section. Once processing unit 4 generates graphical interface 26, the instructions of presentation application 22 cause processing unit 4 to display graphical interface 26 on output device 18 (44).
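A minimal sketch of how graphical interface 26 might group thumbnail placeholders under user-defined section titles is shown below; the dictionary layout and the section titles (borrowed from the physics-lesson example earlier) are illustrative assumptions, not part of presentation application 22.

```python
# Minimal stand-in data: each section is a dict with a title and slide numbers.
sections = [
    {"title": "Resistance",  "slides": [1, 2, 3]},
    {"title": "Capacitance", "slides": [4, 5, 6]},
    {"title": "Exam Review", "slides": [7, 8]},
]


def build_navigation_pane(sections):
    """Render a text mock-up of a navigation pane: thumbnail placeholders
    grouped under the user-defined title of each section."""
    lines = []
    for section in sections:
        lines.append(f"[-] {section['title']}")
        lines.extend(f"    [thumbnail of slide {n}]" for n in section["slides"])
    return "\n".join(lines)


print(build_navigation_pane(sections))
```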
As mentioned above, graphical interface 26 is designed to enable user 24 to interact with a presentation that includes slides in the sections contained in presentation file 20. Accordingly, when output device 18 displays graphical interface 26, the instructions of presentation application 22 enable processing unit 4 to receive input related to a property of a section in the presentation (46). For example, the instructions of presentation application 22 may enable processing unit 4 to receive mouse movement and mouse click input. In response to the input, the instructions of presentation application 22 cause processing unit 4 to perform an action using the property of the section (48).
The instructions of presentation application 22 may cause processing unit 4 to generate graphical interface 26 in a wide variety of ways, thereby enabling a wide variety of possible ways that user 24 can interact with the presentation. Furthermore, because the instructions of presentation application 22 may cause processing unit 4 to generate graphical interface 26 in a wide variety of ways, processing unit 4 may receive a wide variety of inputs in step 46 and may perform a wide variety of actions in response to these inputs in step 48. Some of the potential ways of generating graphical interface 26 to enable specific types of interaction are summarized with reference to
In a first example, presentation file 20 includes sets of section data that contain user-defined values of title properties of the sections of the presentation. Referring to
The example interface in
The example interface in
The example interface in
The example interface of
The example interface of
User 24 can use the example interface of
When user 24 has selected a section, processing unit 4 may receive copy command input from user 24 and may subsequently receive paste command input from user 24. In response to the paste command input, the instructions of presentation application 22 cause processing unit 4 to copy the presentation data that defines the selected section to a location indicated by the paste command input. For instance, when the paste command input indicates a location in a second presentation, the instructions of presentation application 22 may cause processing unit 4 to copy the set of presentation data that defines the selected section to a location in the second presentation. As a result, the second presentation includes the selected section, including the data identifying the slides in the selected section and values of properties of the selected section. The copy command input may take the form of user 24 clicking on the title of a section of the presentation, and the paste command input may take the form of user 24 dragging the title of the section and “dropping” it at the location where the section is to be added.
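A sketch of the copy-and-paste behavior described above, assuming sections are held as dictionaries in an in-memory list per presentation; the function name and data layout are hypothetical.

```python
import copy


def paste_section(section, target_presentation, position):
    """Insert a deep copy of a section (its slide references and its
    user-defined property values) into another presentation's section list."""
    target_presentation["sections"].insert(position, copy.deepcopy(section))


source = {"sections": [{"title": "Capacitance", "slides": [4, 5, 6],
                        "author": "A. Teacher"}]}
target = {"sections": [{"title": "Introduction", "slides": [1, 2]}]}

# Copy the "Capacitance" section and paste it after the target's first section.
paste_section(source["sections"][0], target, position=1)
print([s["title"] for s in target["sections"]])   # ['Introduction', 'Capacitance']
```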
User 24 may use input device 16 to add or remove checkmarks from checkboxes 112. When user 24 adds a checkmark to one of checkboxes 112, processing unit 4 receives section selection input that indicates that user 24 wants slides in the section associated with the one of checkboxes 112 to be included in the presentation of the presentation. In response to the section selection input, processing unit 4 modifies the value of the property of the selected section to indicate that the slides of the section are to be included in presentations of the presentation.
Later, processing unit 4 may receive input that indicates that user 24 wants to present the presentation. In response to this input, the instructions of presentation application 22 may cause processing unit 4 to use the value of the property of the selected section to determine whether to display the slides of the selected section. Subsequently, the instructions of presentation application 22 may cause processing unit 4 to generate a presentation graphical interface that includes a slide of the selected section when it is determined that the value of the property of the selected section indicates that the slides of the selected section are to be displayed in the presentation of the presentation. The instructions of presentation application 22 may then cause processing unit 4 to display the presentation graphical interface on output device 18. In this way, sections can be skipped seamlessly during presentation of the presentation.
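A sketch of how the selection property might be used to skip sections during presentation of the presentation; the "include_in_show" key is a hypothetical stand-in for the property set via checkboxes 112.

```python
sections = [
    {"title": "Resistance",  "slides": [1, 2, 3], "include_in_show": True},
    {"title": "Capacitance", "slides": [4, 5, 6], "include_in_show": False},
    {"title": "Exam Review", "slides": [7, 8],    "include_in_show": True},
]


def slides_to_present(sections):
    """Return the slides of every section whose user-defined selection
    property indicates inclusion, skipping the other sections seamlessly."""
    ordered = []
    for section in sections:
        if section["include_in_show"]:
            ordered.extend(section["slides"])
    return ordered


print(slides_to_present(sections))   # [1, 2, 3, 7, 8]
```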
It should be appreciated that values of these properties for a single user may differ among sections of the presentation. For instance, the value of the property of a first section may represent a first set of access control data that specifies that a user has a right to perform an action with respect to the first section, while the value of the property of a second section may represent a second set of access control data that specifies that the user does not have the right to perform the action with respect to the second section.
Subsequently, processing unit 4 may receive a request from a user to perform an action (e.g., view or edit a slide) with respect to a section in the presentation. In response to receiving the request from the user to perform the action with respect to the section of the presentation, the instructions of presentation application 22 cause processing unit 4 to use these properties of the section to determine whether the user has a right to perform the action with respect to the section. If the user has the right to perform the action with respect to the section, the instructions of presentation application 22 cause processing unit 4 to perform the action. If the user does not have the right to perform the action with respect to the section, the instructions of presentation application 22 cause processing unit 4 to deny the request to perform the action.
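The per-section access check might be sketched as follows; the access_control mapping, the user name, and the action names are illustrative assumptions rather than an actual access control format.

```python
def request_action(user, action, section):
    """Grant or deny an action by consulting the section's access control
    data (a hypothetical mapping of user -> set of allowed actions)."""
    allowed = section.get("access_control", {}).get(user, set())
    if action in allowed:
        return f"{action} performed on '{section['title']}'"
    return f"request to {action} '{section['title']}' denied"


section_a = {"title": "Resistance",  "access_control": {"alice": {"view", "edit"}}}
section_b = {"title": "Exam Review", "access_control": {"alice": {"view"}}}

print(request_action("alice", "edit", section_a))   # edit performed on 'Resistance'
print(request_action("alice", "edit", section_b))   # request to edit 'Exam Review' denied
```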
Title bars 170 include icons 172A, 172B, 172C, and 172D (collectively, “icons 172”) that enable user 24 to conceal or reveal thumbnail images of slides. In the example of
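A sketch of the conceal/reveal behavior that icons 172 provide; the "collapsed" flag and the text rendering of the navigation pane are illustrative assumptions.

```python
def toggle_section(section):
    """Flip a section between concealed and revealed; the navigation pane
    hides the section's thumbnail images while it is concealed."""
    section["collapsed"] = not section.get("collapsed", False)


def render_section(section):
    """Return a text mock-up of one section in the navigation pane."""
    header = ("[+]" if section.get("collapsed") else "[-]") + " " + section["title"]
    if section.get("collapsed"):
        return header                     # thumbnails concealed
    thumbs = [f"    [thumbnail of slide {n}]" for n in section["slides"]]
    return "\n".join([header] + thumbs)   # thumbnails revealed


section = {"title": "Capacitance", "slides": [4, 5, 6]}
print(render_section(section))
toggle_section(section)
print(render_section(section))
```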
User 24 may use this version of graphical interface 26 to organize slides of the presentation into sections. For example, user 24 may use input device 16 to select one of thumbnail images 174A. User 24 may then drag the selected thumbnail image of the slide to an area of graphical interface 26 beneath one of title bars 170. When user 24 has dragged the thumbnail image of the slide to the area of graphical interface 26 beneath one of title bars 170, the slide is removed from the first section and added to the section of the presentation associated with the title bar. For instance, if user 24 drags the selected thumbnail image of the slide to an area of graphical interface 26 beneath title bar 170D, the slide is removed from the first section and added to the fourth section.
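A sketch of the drag-and-drop reorganization described above; moving a slide between sections amounts to removing its identifier from one section's slide list and appending it to another's (the function and dictionary names are hypothetical).

```python
def move_slide(slide_id, from_section, to_section):
    """Remove a slide from one section and add it to another, mirroring a
    drag of the slide's thumbnail beneath a different section's title bar."""
    from_section["slides"].remove(slide_id)
    to_section["slides"].append(slide_id)


first  = {"title": "Resistance",  "slides": [1, 2, 3]}
fourth = {"title": "Exam Review", "slides": [7, 8]}

move_slide(2, first, fourth)
print(first["slides"], fourth["slides"])   # [1, 3] [7, 8, 2]
```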
User 24 may interact with the presentation in a variety of ways by selecting title bars 170. For instance, by selecting title bars 170, user 24 may change the values of the title properties of the sections of the presentation. In another instance, by selecting title bars 170, user 24 may add a set of keywords that enable a search engine to identify a section within the presentation. In each of these instances, when user 24 selects one of title bars 170, processing unit 4 receives input and the instructions of presentation application 22 cause processing unit 4 to perform an action in response.
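A sketch of two of the actions mentioned above, retitling a section and attaching searchable keywords to it; the function names and the keyword set are illustrative assumptions.

```python
def rename_section(section, new_title):
    """Change the user-defined title property of a section."""
    section["title"] = new_title


def add_keywords(section, *keywords):
    """Attach searchable keywords to a section so a search engine
    can identify the section within the presentation."""
    section.setdefault("keywords", set()).update(keywords)


section = {"title": "Section 2", "slides": [4, 5, 6]}
rename_section(section, "Capacitance")
add_keywords(section, "physics", "capacitor", "farad")
print(section["title"], sorted(section["keywords"]))
```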
It is to be understood that the implementations described herein may be implemented by hardware, software, firmware, middleware, microcode, or any combination thereof. When the systems and/or methods are implemented in software, firmware, middleware, or microcode, the program code or code segments may be stored in a computer-readable storage medium, such as a storage component. A code segment may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a class, or any combination of instructions, data structures, or program statements. A code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, or memory contents. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted using any suitable means including memory sharing, message passing, token passing, network transmission, etc.
Furthermore, it is to be understood that computing device 2 may have additional features or functionality. For example, computing device 2 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Computer-readable storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data.
For a software implementation, the techniques described herein may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. The software codes and instructions may be stored in computer-readable storage media and executed by processors. The memory unit may be implemented within the processor or external to the processor, in which case it can be communicatively coupled to the processor via various means as is known in the art.
These techniques may be realized in several ways. For example, these techniques may be conceptualized as a method for organizing slides of an electronic slide presentation. The method comprises accessing, with a computing device, a presentation file stored on a computer-readable storage medium, the presentation file containing: (i) a first set of section data that defines a first section, the first set of section data identifying a first set of slides and identifying data that specifies a user-defined value of a property of the first section other than the first set of slides, and (ii) a second set of section data that defines a second section, the second set of section data identifying a second set of slides and identifying data that specifies a user-defined value of a property of the second section other than the second set of slides. The method also comprises generating, at the computing device, a graphical interface using the value of the property of the first section and the value of the property of the second section, the graphical interface enabling a user to interact with an electronic presentation that includes the slides in the first set of slides and the slides in the second set of slides. In addition, the method comprises displaying the graphical interface on an output device.
In another example, the techniques of this disclosure may be realized as a computing device comprising a processing unit that is capable of executing instructions, an output device, and a storage medium. The storage medium comprises a presentation file stored on a computer-readable storage medium, the presentation file containing: (i) a first set of section data that defines a first section, the first set of section data identifying a first set of slides and identifying data that specifies a user-defined value of a property of the first section other than the first set of slides, and (ii) a second set of section data that defines a second section, the second set of section data identifying a second set of slides and identifying data that specifies a user-defined value of a property of the second section other than the second set of slides. The computer-readable storage medium also comprises instructions that, when executed by the processing unit, cause the processing unit to: access the presentation file; generate a graphical interface using the value of the property of the first section and the value of the property of the second section, the graphical interface enabling a user to interact with an electronic presentation that includes the slides in the first set of slides and the slides in the second set of slides; and display the graphical interface on the output device.
In another example, the techniques of this disclosure may be realized as a computer-readable storage medium comprising a presentation file stored on a computer-readable storage medium, the presentation file containing: (i) a first set of section data that defines a first section, the first set of section data identifying a first set of slides, identifying data that specifies a title of the first section, and identifying data that specifies a user-defined value of a property of the first section that represents a first set of access control data that specifies that the user has a right to perform an action with respect to the first section, and (ii) a second set of section data that defines a second section, the second set of section data identifying a second set of slides, identifying data that specifies a title of the second section, and identifying data that specifies a user-defined value of a property of the second section that represents a second set of access control data that specifies that the user does not have the right to perform the action with respect to the second section. The computer-readable storage medium also comprises instructions that, when executed by a processing unit of a computing device, cause the processing unit to access the presentation file. The instructions also cause the processing unit to generate a graphical interface that displays the title of the first section and the title of the second section. Furthermore, the instructions cause the processing unit to receive a request from the user to perform the action with respect to the first section. The instructions also cause the processing unit to, in response to receiving the request from the user to perform the action with respect to the first section, determine that the first set of access control data specifies that the user has the right to perform the action with respect to the first section and perform the action with respect to the first section. In addition, the instructions cause the processing unit to receive a request from the user to perform the action with respect to the second section. Furthermore, the instructions cause the processing unit to, in response to receiving the request from the user to perform the action with respect to the second section, determine that the second set of access control data specifies that the user does not have the right to perform the action with respect to the second section and deny the request to perform the action with respect to the second section.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.