Systems and methods for publishing and/or sharing media presentations over a network

Information

  • Patent Grant
  • Patent Number
    11,682,150
  • Date Filed
    Monday, May 24, 2021
  • Date Issued
    Tuesday, June 20, 2023
Abstract
In accordance with one or more embodiments of the present disclosure, systems and methods for publishing and/or sharing media presentations over a network comprise communicating with a user and one or more distribution channels via the network, gathering media resources based on user input, creating a media presentation with the media resources based on user input, and publishing the media presentation by distributing the media presentation to the one or more distribution channels via the network. In one aspect, publishing comprises directly emailing the media presentation to one or more other users via the network. In another aspect, publishing comprises providing a link to one or more other users via the network for direct access to the media presentation. In still another aspect, publishing comprises obtaining and embedding source code for the media presentation in a web page associated with one or more of the distribution channels via the network.
Description
BACKGROUND
Technical Field

The present invention generally relates to network-based multi-media presentations and more particularly to publishing and/or sharing media presentations over a network.


Related Art

Presently, to create and post an online media presentation, a user must typically purchase conventional media presentation software, which may be expensive; create the media presentation with that software, which is often time-consuming; upload the file for the media presentation; and then post a link to the file for online access, which may require security features. To view the media presentation, another user must access the site storing the file, pass some form of access security, download the file over the communication network, and have the same software that created the file in order to view it. This sequence of creating and downloading media presentations is often expensive, time-consuming, and inconvenient for each user involved in the process. As such, there exists a need to simplify the process of creating and viewing online media presentations over a communication network.


SUMMARY

In accordance with one or more embodiments of the present disclosure, a system for publishing and/or sharing media presentations over a network comprises a service component adapted to interface with a user over the network and one or more distribution channels over the network, a collect module adapted to gather media resources based on user input, a create module adapted to create a media presentation with the media resources based on user input, and a publish module adapted to distribute the media presentation to the one or more distribution channels via the network.


In one implementation, the publish module is adapted to directly email the media presentation to one or more other users via the network. In another implementation, the publish module is adapted to provide a link to one or more other users via the network for direct access to the media presentation. In still another implementation, the publish module is adapted to obtain and embed source code for the media presentation in a web page associated with one or more of the distribution channels via the network.


In accordance with another embodiment of the present disclosure, a method for publishing and/or sharing media presentations over a network comprises communicating with a user and one or more distribution channels via the network, gathering media resources based on user input, creating a media presentation with the media resources based on user input, and publishing the media presentation by distributing the media presentation to the one or more distribution channels via the network. In various implementations, publishing may comprise directly emailing the media presentation to one or more other users via the network, providing a link to one or more other users via the network for direct access to the media presentation, and/or embedding source code for the media presentation in a web page associated with one or more of the distribution channels via the network.


These and other features and advantages of the present disclosure will be more readily apparent from the detailed description of the embodiments set forth below taken in conjunction with the accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 shows a block diagram of a system configured to facilitate publishing and/or sharing media presentations over a network, in accordance with an embodiment of the present disclosure.



FIGS. 2A-2B show a block diagram of a method adapted to facilitate publishing and/or sharing multi-media presentations over a network, in accordance with an embodiment of the present disclosure.



FIG. 3 is a block diagram of a computer system suitable for implementing one or more embodiments of the present disclosure.





Embodiments of the present disclosure and their advantages are best understood by referring to the detailed description that follows. It should be appreciated that like reference numerals are used to identify like elements illustrated in one or more of the figures, wherein showings therein are for purposes of illustrating embodiments of the present disclosure and not for purposes of limiting the same.


DETAILED DESCRIPTION

Systems and methods disclosed herein, in accordance with one or more embodiments, facilitate publishing, sharing and/or broadcasting multi-media presentations over a network for viewing by other users in communication with the network. In one embodiment, the multi-media presentation may be published or distributed to a site accessible via the network for viewing by one or more other network users in communication with the network. In another embodiment, the multi-media presentation may be directly emailed to one or more recipients (i.e., other network users). In still another embodiment, an associated URL link for the multi-media presentation may be given (e.g., via email or some type of text message) to one or more recipients (i.e., other network users) for direct access to the multi-media presentation. In yet another embodiment, source code for the multi-media presentation may be embedded in a web page via the network.



FIG. 1 shows one embodiment of a block diagram of a system 100 adapted to facilitate publishing, sharing and/or broadcasting multi-media presentations over a network 160. As shown in FIG. 1, the system 100 includes at least one client device 120 (e.g., network computing device), one or more multi-media distribution channels 140 (e.g., network server devices), and at least one service provider device 180 (e.g., network server device) in communication over the network 160.


The network 160, in one embodiment, may be implemented as a single network or a combination of multiple networks. For example, in various embodiments, the network 160 may include the Internet and/or one or more intranets, landline networks, wireless networks, and/or other appropriate types of communication networks. In another example, the network 160 may comprise a wireless telecommunications network (e.g., cellular phone network) adapted to communicate with other communication networks, such as the Internet. As such, in various embodiments, the at least one client device 120, the multi-media distribution channels 140, and the at least one service provider device 180 may each be associated with a particular link (e.g., a URL (Uniform Resource Locator) mapped to an IP (Internet Protocol) address).


The at least one client device 120, in various embodiments, may be implemented using any appropriate combination of hardware and/or software configured for wired and/or wireless communication over the network 160. In various implementations, the client device 120 may be implemented as a personal computing device (e.g., a personal computer (PC)) in communication with the network 160, such as the Internet. In various other implementations, the client device 120 may be implemented as one or more wireless telephones (e.g., cell phones), personal digital assistants (PDAs), notebook computers, and/or various other generally known types of wired and/or wireless computing devices. It should be appreciated that the client device 120 may be referred to as a user device or customer device without departing from the scope of the present disclosure.


The client device 120, in one embodiment, includes a user interface application 122, which may be utilized by a user 102 to conduct information transactions with the distribution channels 140 and the service provider server 180 over the network 160. For example, the user interface application 122 may be implemented as a multi-media presentation application to collect, create and publish information via the network 160. In various implementations, multi-media presentations may be published to and/or shared with one or more of the multi-media channels 140 via the user interface application 122 over the network 160.


In one implementation, the user interface application 122 comprises a software program, such as a graphical user interface (GUI), executable by a processor that is configured to interface and communicate with the multi-media channels 140 and the service provider server 180 via the network 160. In another implementation, the user interface application 122 comprises a browser module that provides a network interface to browse information available over the network 160. For example, the user interface application 122 may be implemented, in part, as a web browser to view information available over the network 160. In another example, the user 102 is able to access multi-media websites via the one or more multi-media channels 140 to view, collect, and publish multi-media presentations over the network 160.


The client device 120, in various embodiments, may include other applications as may be desired in one or more implementations to provide additional features available to the user 102. In one example, such other applications may include security applications for implementing client-side security features, programmatic client applications for interfacing with appropriate application programming interfaces (APIs) over the network 160, or various other types of generally known programs and/or applications. In other examples, these other applications may interface with the user interface application 122 for improved efficiency and convenience. For example, files, data, and/or various types of information may be imported from multi-media software directly into the user interface application 122 for ease of access to multi-media files (e.g., audio, video, pictures, clip-art, etc.).


The client device 120, in various embodiments, may include a user identifier, which may be implemented, for example, as operating system registry entries, cookies associated with the user interface application 122, identifiers associated with hardware of the client device 120, or various other appropriate identifiers. The user identifier may include attributes related to the user 102, such as personal information (e.g., a user name, password, etc.). In one implementation, the user identifier may be passed to the service provider server 180 during publishing and/or sharing of a multi-media presentation.


The multi-media distribution channels 140, in one embodiment, may be maintained by one or more resource providers and/or entities (e.g., social networking sites, resource information sites, management sites, merchant sites, etc.) in communication with the network 160. As such, the multi-media distribution channels 140 may be implemented using any appropriate combination of hardware and/or software configured for wired and/or wireless communication over the network 160. In one implementation, the multi-media distribution channels 140 may be implemented as a network computing device (e.g., a network server) in wired and/or wireless communication with the network 160.


The service provider server 180, in one embodiment, may be maintained by an online transaction processing provider and/or entity in communication with the network 160. As such, the service provider server 180 may be implemented using any appropriate combination of hardware and/or software configured for wired and/or wireless communication over the network 160. In one implementation, the service provider server 180 may be implemented as a network computing device (e.g., a network server) in wired and/or wireless communication with the network 160. As shown in FIG. 1, the service provider server 180 includes a service interface application 182, which may be adapted to interact with the client device 120 to facilitate publishing and/or sharing multi-media presentations over a network. In one example, the service provider server 180 may be provided and implemented by PayPal, Inc. of San Jose, Calif., USA.


The service interface application 182, in one embodiment, utilizes a collect module 184, a create module 186, and a publish module 188 to collect information, create presentations, and publish presentations, respectively. As described in greater detail herein, the modules 184, 186, 188 enable users, such as the user 102, to collect diverse types of audio and visual media, create rich multi-media presentations with real-time editing and authoring using media software, such as Flash, and then share and/or publish the rich multi-media presentations with other users via the network 160. In one example, the collect, create, and publish modules 184, 186, 188 may be implemented within a standard web browser for interfacing with the user 102.


In one implementation, the user 102 is able to share multi-media presentations with other users via the media channels 140 and/or embed multi-media presentations directly in the webpages of other users. For example, the user 102 may provide a unique URL link for the multi-media presentation to other users. In another example, the user 102 may directly email multi-media presentations to multiple recipients and include a message with the email. In still another example, the user 102 may provide the source HTML (i.e., HyperText Markup Language) code to other users and/or embed the source HTML code directly into other users' webpages. Still other examples include the ability to publish multi-media presentations on a website to sell a particular item or service. For items and/or services, a media rich presentation helps users market and sell items and/or services, which may be valuable for high-end or high-priced items and/or services. Social and/or dating sites may utilize these multi-media presentations to provide online users with a way to better present themselves to other online users. In various implementations, some types of webpages may be presented in a more dynamic manner by utilizing Rich Site Summary (RSS) feeds, since, for example, a particular user's presentation may be continually changing with new media.
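For illustration, the embed option described above might produce a snippet along the lines generated by the following TypeScript sketch. The function name, URL pattern, and iframe markup are assumptions for this example only; the disclosure does not specify the actual embed format.

```typescript
// Hypothetical generator for an embeddable presentation snippet; the URL
// pattern and markup shape are illustrative assumptions rather than the
// actual format used by the service described here.
function buildEmbedCode(
  presentationId: string,
  width = 640,
  height = 480
): string {
  const src = `https://presentations.example/embed/${encodeURIComponent(presentationId)}`;
  return (
    `<iframe src="${src}" width="${width}" height="${height}" ` +
    `frameborder="0" allowfullscreen></iframe>`
  );
}

// A user could copy this string and paste it directly into a blog,
// profile page, or item listing.
console.log(buildEmbedCode("collage-1234"));
```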


The service provider server 180, in various embodiments, may be configured to maintain, store and/or archive multi-media presentations in a database 190, each of which may include information related to one or more users, such as the user 102, and one or more multi-media channels, such as the multi-media distribution channels 140. In various examples, the multi-media presentations may include attributes stored as part thereof, and the attributes may be passed to the service provider server 180 as part of creating, publishing and/or sharing the multi-media presentations.


Referring to FIG. 1, the collect module 184, in one embodiment, enables the user 102 to collect audio, photographic images, video, and music media from various sources, such as a PC, RSS feeds, websites, and any other online source, via a user interface, such as the user interface application 122. In various implementations, the user interface application 122 comprises multiple tabs and/or links for the various sources. Once collected, the media may be saved and categorized in the database 190 and edited on the system site via the service provider server 180. Editing may include one or more of sizing, rotating, overlaying, and moving or stacking various media backward and forward within an overlay or stack. Video may be broken up automatically by the service provider server 180 into smaller segments. Selected video segments may be combined and/or used as desired. Selected media may be placed on a virtual storyboard, such as a clipboard, on the same screen as the collection of media. Media may be edited either in the collection or in the storyboard. Placing desired media on the storyboard may be accomplished by dragging and dropping. In one example, the collect module 184 provides selected media on a storyboard. In another example, the collect module 184 provides media on a user's media page (i.e., not placed on the storyboard). In still another example, uploading media may be delayed until editing is completed.
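As a rough sketch of the collect-and-storyboard flow just described, the following TypeScript models a media item and drag-and-drop placement onto a storyboard; all names and fields are hypothetical, since the disclosure does not specify an implementation.

```typescript
// Illustrative data model for the collect module (names are hypothetical).
type MediaKind = "audio" | "image" | "video" | "music";

interface MediaItem {
  id: string;
  kind: MediaKind;
  source: string;    // e.g., "pc-upload", an RSS feed, or a website URL
  category?: string; // category assigned when saved to the database
}

class Storyboard {
  private items: MediaItem[] = [];

  // Drag-and-drop placement: insert the item at the drop position.
  place(item: MediaItem, index: number): void {
    this.items.splice(index, 0, item);
  }

  // Dragging an item back off the storyboard removes it.
  remove(id: string): void {
    this.items = this.items.filter((m) => m.id !== id);
  }

  list(): readonly MediaItem[] {
    return this.items;
  }
}

const board = new Storyboard();
board.place({ id: "img-1", kind: "image", source: "pc-upload" }, 0);
board.place({ id: "vid-1", kind: "video", source: "rss" }, 1);
console.log(board.list().map((m) => m.id)); // ["img-1", "vid-1"]
```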


Referring to FIG. 1, the create module 186, in one embodiment, enables the user 102 to place selected media onto a presentation style, board or collage. The service provider server 180 may automatically suggest a story idea to launch the creative process, or the user 102 may select a specific style or presentation tool. In one implementation, media from the storyboard may be dragged and dropped onto the presentation. Within the presentation, there may be multiple styles, such as a picture frame, a television, a billboard, etc. Media may be placed within the viewing window of each type of style. Once in the presentation, the media may be edited. For example, the media may be rotated, sized, cut out (e.g., by selecting the boundaries of an image, such as with clicks to designate points along the boundary, enabling as coarse or fine a resolution as desired), moved forward or backward in relation to adjacent media, or slid with other images to add or remove spaces within the presentation, and a hotspot may be added (e.g., by selecting an area of the image for additional information, such as a link, video, text, etc.). Other editing features may include adding audio to the background, adding text, and/or distorting images. In one aspect, the editing may be achieved in real-time so that the user 102 may quickly and easily see the results and change them as needed.


Referring to FIG. 1, the publish module 188, in one embodiment, enables the user 102 to share, publish and/or distribute the presentation when, for example, the presentation is completed. In one implementation, as described herein, the presentation may be saved in the database 190 of the service provider server 180. Once saved, the user 102 may share, publish and/or distribute presentations to any selected channel, such as one or more of the multi-media channels 140. Any users on the network 160 having access to the channels 140 or a website related to the channels 140 may refresh the view, which may automatically load the presentation into that channel and/or website for viewing the content of the presentation. As such, the presentations may be distributed to various online websites, blogs, mobile video players, and IP TV networks, and/or on the system site.


These modules 184, 186, 188 may be combined, used, and/or modified to provide the user 102 with different initial choices regarding the type of presentation and features desired for creating the presentation. The choices may include a simple, easy-to-use tool for quickly building presentations with dynamic content from RSS feeds and online albums. Accordingly, the user 102 may select a presentation style and then link it to the user's media libraries through RSS feeds that maintain an “always on” permalink to the content source.



FIGS. 2A-2B show one embodiment of a block diagram of a method 200 adapted to facilitate publishing and/or sharing multi-media presentations over the network 160, in reference to FIG. 1. It should be appreciated that the order of the following process flow may be rearranged without departing from the scope of the present disclosure.


Referring to FIG. 2A, the method 200 comprises collecting media for a multi-media presentation (block 210). In various implementations, media such as photographic images, audio, video, music, etc. may be collected from a variety of sources including local sources, such as a personal computer (PC), and online sources, such as the Internet, for use in the media presentation or media piece. For example, an online search engine may be accessed and one or more keyword searches may be utilized to search the Internet for various types of media content. In another example, additional media may be collected from other sources, such as media from a PC, which may be selected, uploaded and viewed. As such, media from different sources may be viewed by selecting corresponding media source tabs from the user interface application 122. Media may be viewed as individual media items or clustered such that each of the individual media items within the cluster may be viewed. In one aspect, the user interface application 122 interfaces with the service interface application 182 via the network 160 to utilize the collect module 184 for collecting media. In another aspect, any media collected may be displayed on the user device 120 via the network 160 for viewing by the user 102 in a media collection area of the user interface application 122 (block 214).


Next, the method 200 comprises populating a storyboard (block 218). In one implementation, the user interface application 122 includes a graphical storyboard, which may be populated with collected media from one or more of the media items or a cluster of media items collected from various media sources. For example, the graphical storyboard may include one or more images of items and/or clustered items collected from the Internet and items uploaded from a PC.


Next, optionally, one or more of the media items may be edited (block 222). In various implementations, media positioned on the graphical storyboard may be edited prior to generating the media presentation or media piece, and/or individual media items in a media cluster may be reordered.


Next, the media presentation or media piece may be created (block 226). In one implementation, once media items have been collected and optionally edited, the media presentation or media piece may be created by selecting a creation operation of the user interface application 122. In one aspect, the user interface application 122 interfaces with the service interface application 182 via the network 160 to utilize the create module 186 for creating the media presentation or media piece.


In various implementations, creating the media presentation and/or media piece may include selecting and/or customizing its style (block 230) and viewing the selected and/or customized style (block 234). For example, presentation styles may include, but are not limited to, a picture frame, billboard, kaleidoscope, street scene, landscape, etc. Once a style is selected, media items from the storyboard may automatically flow into or populate the selected style. Some styles, such as the kaleidoscope style, support movement of media as part of the presentation style and provide various customizable characteristics of movement, such as the speed of media rotation, within the style. In one aspect, the customized style may be displayed on the user device 120 via the network 160 for viewing by the user 102.
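A style with customizable movement, such as the kaleidoscope style noted above, could be modeled roughly as follows; the type and field names are assumptions for illustration.

```typescript
// Illustrative presentation-style model (all names are hypothetical).
interface PresentationStyle {
  name: "pictureFrame" | "billboard" | "kaleidoscope" | "streetScene" | "landscape";
  supportsMovement: boolean;
  rotationSpeedDegPerSec?: number; // customizable speed of media rotation
}

// Media items from the storyboard automatically flow into the selected
// style; here the "layout" is just the ordered population of the style.
function populateStyle(style: PresentationStyle, mediaIds: string[]): string[] {
  return [...mediaIds];
}

const kaleidoscope: PresentationStyle = {
  name: "kaleidoscope",
  supportsMovement: true,
  rotationSpeedDegPerSec: 15, // a user-customized movement characteristic
};
console.log(populateStyle(kaleidoscope, ["img-1", "vid-1"]));
```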


Next, referring to FIG. 2B, the selected and/or customized style of the media presentation or media piece may be optionally edited (block 238). In various implementations, if further editing of the media presentation style is desired, style editing tools may be provided to the user interface application 122 via the create module 186. These editing tools may include the ability to preview the media presentation or media piece; edit the media presentation or media piece (e.g., scale, rotate and/or move frames directly on the media presentation or media piece); add text and/or captions; add audio, sound, music and/or voice-over; place a frame around and/or mask the media presentation or piece; and add background images and/or color. For example, a sound feature may provide options, such as adding music to particular media or background, having the music vary in loudness depending on time and/or spatial indicators, and having a repeating playlist that repeats particular music and/or plays different segments in a random manner.


Next, the method 200 comprises providing identifying information (block 242) about the media presentation or media piece (e.g., a title, storage location and/or various types of descriptive information) and deciding where to store the media presentation or media piece (block 246). In one implementation, the user 102 may identify and/or select one or more distribution channels located on an associated server for storage of the media presentation or media piece or choose an alternative storage location in the network 160.


Next, the method 200 comprises publishing, sharing and/or broadcasting the media presentation or media piece via the network (block 250). In various implementations, the media presentation or media piece may be published or distributed to a site accessible via the network 160 for viewing by one or more other network users in communication with the network 160. For example, the media presentation or media piece may be directly emailed (block 254a) to one or more recipients (i.e., other network users), along with a message. In another example, an associated URL link (block 254b) for the media presentation or media piece may be given (e.g., via email or some type of text message) to one or more recipients (i.e., other network users) for direct access to the media presentation or media piece via the network 160. In still another example, source code (block 254c) for the media presentation or media piece may be obtained by the user 102 and embedded into a web page managed by the user 102 via the network 160.


Accordingly, in various implementations, the user 102 may email the media presentation or media piece to other network users via the network 160, embed the media presentation or media piece in a web page that is accessible via the network 160, and/or create a URL permalink of the media presentation or media piece to one or more of the multi-media distribution channels 140 in the network 160.


In one implementation, the service interface application 182 allows users to compose multi-media stories, presentations and pieces by laying out photographic images, video, text and audio on a stage, storyboard or collage. In one aspect, a multi-media story may begin from an unpopulated storyboard. The user 102 may select to view and work with a sample story until the user 102 is ready to begin the media story. Multi-media items from various sources may be viewed by selecting source tabs of the user interface application 122.


Once the user 102 is ready to create a media story, the user 102 accesses a collage storyboard. In one aspect, the storyboard or stage progressively moves in a first direction, such as a left or right direction, and when viewed, may appear as a movie adapted to present a linear narrative in time and/or space. In another aspect, the stage may progressively move in a second direction, such as upward or downward. As such, this may also be presented as a movie adapted to present a linear narrative in time and/or space. The vertical orientation may represent physical structures that have height and/or depth, such as buildings, structures, monuments and/or geological strata. In still another aspect, the stage may be adapted to progressively move in a plurality of directions and may be presented as real or virtual spaces in two or three dimensions.


In various implementations, multi-media items from various sources may be uploaded from the network 160, or previously collected media items may be used to populate the storyboard by selecting desired multi-media items. A multi-media item may be repositioned within the storyboard by dragging and dropping the multi-media item to another location in the storyboard. Similarly, a multi-media item may be removed from the storyboard by dragging and dropping the multi-media item from the storyboard. In one aspect, once multi-media items are positioned within the storyboard, the media story may be edited. In this regard, a rich interface may be presented to users within a predefined screen area.


In various embodiments, the service interface application 182 utilizes a number of innovative techniques. For example, a first technique utilizes one or more direct-attached media-specific tools. When selecting any media object on the stage, a highlight rectangle appears around the object. Attached to the rectangle around the media object are a number of tools, some generic for all media types, others specific to the media type. By showing the tools in close proximity to the media object, it is easier for users to understand the relationship of the tool to the media object. For each tool, an edge may be specified (e.g., expressed as a number from 0-3, 0=top, 1=right, 2=bottom and 3=left), a position along the edge (e.g., expressed as a ratio of edge length) and an offset from the edge (e.g., expressed as a factor of the tool's size). Whenever updating a position of a tool, an absolute position may be calculated or recalculated based on various parameters.
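One consistent reading of that edge/position/offset parameterization is sketched below in TypeScript; the arithmetic is illustrative, not the patented implementation.

```typescript
// Compute an absolute tool position from the parameters described above:
// an edge (0=top, 1=right, 2=bottom, 3=left), a position along the edge
// expressed as a ratio of edge length, and an offset from the edge
// expressed as a factor of the tool's size.
interface Rect { x: number; y: number; width: number; height: number; }

function toolPosition(
  object: Rect,         // highlight rectangle around the media object
  edge: 0 | 1 | 2 | 3,
  alongRatio: number,   // 0..1 along the edge
  offsetFactor: number, // offset expressed as a factor of tool size
  toolSize: number
): { x: number; y: number } {
  const off = offsetFactor * toolSize;
  switch (edge) {
    case 0: // top: move along the width, push outward (up) by the offset
      return { x: object.x + alongRatio * object.width, y: object.y - off };
    case 1: // right
      return { x: object.x + object.width + off, y: object.y + alongRatio * object.height };
    case 2: // bottom
      return { x: object.x + alongRatio * object.width, y: object.y + object.height + off };
    case 3: // left
      return { x: object.x - off, y: object.y + alongRatio * object.height };
  }
}

// A tool centered on the right edge, offset by one tool-width:
console.log(toolPosition({ x: 100, y: 100, width: 200, height: 80 }, 1, 0.5, 1, 16));
// -> { x: 316, y: 140 }
```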


In another example, another technique utilizes a rotation-invariant display of tools. If the user 102 rotates a media object using a rotate tool, the associated tools remain fixed to the original object constraint point (i.e., they do not rotate, thereby making it easier for the user to read the icons). Once a rotation is complete, the attached edge is calculated or recalculated based on the perceived edge, and, if necessary, the tool fades out from its previous location and in at its new location. As such, the user 102 may be accustomed to finding certain tools in certain positions relative to a media object, irrespective of the media object's rotation.
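Recomputing the attached edge after a rotation might look like the following minimal sketch, under the assumption that each quarter turn of the object shifts every logical edge by one position; this is one plausible reading of the description.

```typescript
// Map a tool's logical edge (0=top, 1=right, 2=bottom, 3=left) to the edge
// the user perceives after the media object has been rotated, so tools stay
// in familiar screen positions. Illustrative assumption, not the patented code.
function perceivedEdge(logicalEdge: 0 | 1 | 2 | 3, rotationDeg: number): number {
  // Snap the rotation to the nearest quarter turn, normalized to 0..3.
  const quarterTurns =
    Math.round((((rotationDeg % 360) + 360) % 360) / 90) % 4;
  return (logicalEdge + quarterTurns) % 4;
}

// A tool attached to the top of an object rotated 90 degrees clockwise is
// now perceived on the right edge; the UI would fade it to that location.
console.log(perceivedEdge(0, 90)); // 1 (right)
console.log(perceivedEdge(0, 85)); // 1 (snapped to the nearest quarter turn)
```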


In another example, another technique utilizes a slide tool. When working with media that may progressively move in a particular direction, user selection and manipulation of individual media objects may become a challenge. Traditional techniques of multiple selection and direct manipulation break down when dealing with large documents. The slide tool allows the user 102 to perform an action that may otherwise be difficult, such as inserting or removing horizontal space within a collage or story. The user 102 selects a media object to reveal the media-specific tools. The slide tool is affixed to an edge of the media object and, when grabbed, selects one or more media objects at or beyond the opposite edge of the selected media object, allowing side-to-side sliding during a user action. In one aspect, this technique may be utilized for a vertically oriented collage or story, which would present the slide tool on an upper edge of the selected media object and allow dragging of layers at or below the selected object.
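A minimal sketch of the horizontal slide behavior, assuming a simple layer model with x positions (the names and layout model are illustrative):

```typescript
// Slide tool sketch: dragging the tool on the selected object moves that
// object and every object whose left edge is at or beyond it, inserting or
// removing horizontal space within the collage.
interface Layer { id: string; x: number; }

function slide(layers: Layer[], selectedId: string, dx: number): Layer[] {
  const selected = layers.find((l) => l.id === selectedId);
  if (!selected) return layers;
  return layers.map((l) => (l.x >= selected.x ? { ...l, x: l.x + dx } : l));
}

const collage = [{ id: "a", x: 0 }, { id: "b", x: 300 }, { id: "c", x: 600 }];
// Dragging b's slide tool 120px to the right opens a gap between a and b.
console.log(slide(collage, "b", 120)); // b -> 420, c -> 720, a unchanged
```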


In another example, another technique utilizes drag-and-drop insertion. The user 102 may drag thumbnails representing media objects from a media tab and drop them onto the stage or storyboard. As the user 102 drags the thumbnail over the stage, an insertion point indicator is presented to the user 102 showing where an edge of the dropped item will be located if the user action completes, such as when the user's mouse button is released. If the user's mouse hovers over the left edge of any existing media object on the stage, the indicator state switches to an insert mode. If the user releases the mouse while the indicator is in this mode, the underlying media object and all media objects to the right of the left edge of that media object will be offset to the right to allow the dropped media object to fit on the stage without being obscured.
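The insert-mode drop can be sketched the same way: offset the underlying object and everything to the right of its left edge by the dropped item's width, then place the dropped item at that edge. Again, the data model is an assumption for illustration.

```typescript
// Drag-and-drop insertion sketch (hypothetical stage model).
interface Item { id: string; x: number; width: number; }

function dropInsert(stage: Item[], dropped: Item, targetLeftEdge: number): Item[] {
  // Offset the underlying object and all objects to the right of the edge.
  const shifted = stage.map((l) =>
    l.x >= targetLeftEdge ? { ...l, x: l.x + dropped.width } : l
  );
  // Place the dropped item at the insertion point without obscuring others.
  return [...shifted, { ...dropped, x: targetLeftEdge }];
}

const stage = [{ id: "a", x: 0, width: 200 }, { id: "b", x: 250, width: 200 }];
console.log(dropInsert(stage, { id: "new", x: 0, width: 150 }, 250));
// "b" shifts to x=400; "new" lands at x=250 with room to fit.
```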


In another example, another technique utilizes one or more unconstrained animated button icons. Some applications use iconic images to represent actions that the user 102 may perform. As icons become smaller, they are difficult to discern and understand. The service interface application 182 uses animation and unbounded icons to convey visual information. In one aspect, a text tool icon in an idle state is an “A” (other applications often use a “T” as the text tool). On rollover, the “A” is revealed to be a window onto a cropped, sliding “Add Text” message, which animates to an edge. In another aspect, a soundtrack tool may be represented as a musical note, but on rollover, the notes play and rise outside of a representative button. To create these buttons, a designer may designate a static rectangle, which represents the traditional bounds of the graphic. These bounds may be used by the button to determine the button bounds and scale. The innovation is in not forcing all graphic icon content to lie within these bounds and in setting the button class not to clip its content. By triggering the animation on rollover, the design may exceed its bounds while still preserving visual coherence in the idle state.


In another example, another technique utilizes one or more collapsible tabbed containers. When editing a collage or story, a button bar may represent one or more additional editing options. The collapsed bar may comprise a row of icons, each presented as a button. When the user 102 selects at least one of the buttons, the selected button transforms into a tab, and the bar expands horizontally to include the tab content. The user 102 may collapse the bar by selecting a close button in the bar or by selecting the currently open tab.
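The expand/collapse behavior reduces to a small piece of state, sketched below; the state shape is a hypothetical reading of the description.

```typescript
// Collapsible tabbed bar sketch: selecting a button expands the bar and makes
// that button the open tab; selecting the open tab (or a close button)
// collapses the bar again.
type BarState = { expanded: boolean; openTab: string | null };

function selectButton(state: BarState, button: string): BarState {
  if (state.expanded && state.openTab === button) {
    return { expanded: false, openTab: null }; // re-selecting the open tab collapses
  }
  return { expanded: true, openTab: button };  // button transforms into a tab
}

let bar: BarState = { expanded: false, openTab: null };
bar = selectButton(bar, "audio"); // expands with the audio tab open
bar = selectButton(bar, "audio"); // collapses again
console.log(bar); // { expanded: false, openTab: null }
```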


In another example, another technique utilizes on-demand loading of media. A presentation may be represented as a document with a plurality of layers. Each layer may include various attributes, including position, scale, visual bounds, associated annotations (i.e., hotspots) and a target media asset, which may have layer-specific properties, such as playback behaviors. When a player loads a collage or story, the player requests the first page of the collage document and specifies a number of layers per page. The server returns up to a page worth of layers, sorted by x position in the collage. The player may download one or more pages of layers and create a local indexed list of layers. Then, based on the visual window, the player may filter the layers down to those currently visible. For each layer, if the layer has not already been loaded or queued for loading, the layer is queued for loading.
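The paging-and-filtering behavior described here could be sketched as follows; the class, field names, and page size are assumptions, with fetchPage standing in for the server request.

```typescript
// On-demand layer loading sketch: page in layers sorted by x position,
// index them locally, filter to the visible window, and queue each visible
// layer for loading exactly once.
interface PresentationLayer { id: string; x: number; visualWidth: number; }

class LayerLoader {
  private index: PresentationLayer[] = [];
  private queued = new Set<string>();

  constructor(
    private fetchPage: (page: number, perPage: number) => PresentationLayer[],
    private perPage = 20 // layers per page, specified by the player
  ) {}

  loadPages(count: number): void {
    for (let page = 0; page < count; page++) {
      this.index.push(...this.fetchPage(page, this.perPage));
    }
    this.index.sort((a, b) => a.x - b.x); // sorted by x position in the collage
  }

  // Called whenever the visual window changes; returns newly queued layers.
  onWindowChanged(viewLeft: number, viewRight: number): string[] {
    const visible = this.index.filter(
      (l) => l.x < viewRight && l.x + l.visualWidth > viewLeft
    );
    const newlyQueued: string[] = [];
    for (const layer of visible) {
      if (!this.queued.has(layer.id)) {
        this.queued.add(layer.id); // would trigger an asynchronous media fetch
        newlyQueued.push(layer.id);
      }
    }
    return newlyQueued;
  }
}

const loader = new LayerLoader(() => [
  { id: "l1", x: 0, visualWidth: 300 },
  { id: "l2", x: 500, visualWidth: 300 },
]);
loader.loadPages(1);
console.log(loader.onWindowChanged(0, 400)); // ["l1"]
```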


In one implementation, the user 102 may add one or more hotspots to the media presentation or piece. A hotspot may be identified by a title as a cursor is moved over a designated hotspot area. Activating the hotspot by selecting the hotspot area may link the user 102 to additional information relating to the hotspot item. For example, if an automobile is used as a media item within the collage storyboard, a hotspot may link the user 102 to additional information relating to the automobile, such as price, condition, and terms of sale. Hotspots may be specific to one part or element of the media item. In this example, the user 102 may create a hotspot on the tire, which, when opened, may give the viewer access and/or information on the tire, where to buy the tire, etc. Other hotspots may link the user to music or other audio clips, and other media may be uploaded into the collage storyboard from a clipboard or linked to sale items posted on the Internet. Once the media piece has been created, it may be viewed, saved, previewed and/or published. Once published, the user 102 may have options as to the playback. For example, the media presentation or piece may play continuously until stopped, play only for a specified number of times, play only once, etc.
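A hotspot reduces to a titled region with a link target, roughly as sketched below; the field names are hypothetical.

```typescript
// Illustrative hotspot model: a titled area of a media item that links to
// additional information (price, terms of sale, an audio clip, etc.).
interface Hotspot {
  title: string; // shown as the cursor moves over the designated area
  area: { x: number; y: number; width: number; height: number };
  link: string;  // target with the additional information
}

// Hit test used when the cursor moves over, or the user selects, a region.
function hotspotAt(hotspots: Hotspot[], px: number, py: number): Hotspot | undefined {
  return hotspots.find(
    (h) =>
      px >= h.area.x && px <= h.area.x + h.area.width &&
      py >= h.area.y && py <= h.area.y + h.area.height
  );
}

const tireSpot: Hotspot = {
  title: "Tire details",
  area: { x: 40, y: 120, width: 60, height: 60 },
  link: "https://example.test/tire-info", // hypothetical target
};
console.log(hotspotAt([tireSpot], 50, 150)?.title); // "Tire details"
```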


In various implementations, using the various media creation and publishing tools described herein, the user 102 may quickly and easily create media rich presentations and use those presentations in numerous ways. Some examples include the ability to publish a presentation on a site to sell a particular item or service. For items or services, a media rich presentation may assist the seller with marketing and selling the item or service, which may be valuable for high-end or high-priced items or services. Sellers may be able to cross-sell or promote items or services or direct consumers to partner commercial sites using the hotspot feature. Social or dating sites may use these presentations to give their users a meaningful way to present themselves to others. Blogs and personal pages may be made more dynamic using RSS feeds, since, for example, a particular user's presentation may be continually changing with new media.



FIG. 3 is a block diagram of a computer system 300 suitable for implementing one or more embodiments of the present disclosure, including the user device 120, the one or more distribution channels 140, and the service provider device 180. In various implementations, the client device 120 may comprise a personal computing device capable of communicating with the network 160, such as a personal computer, laptop, cell phone, PDA, etc., the one or more distribution channels 140 may comprise network computing devices, such as network servers, and the service provider device 180 may comprise a network computing device, such as a network server. Hence, it should be appreciated that each of the devices 120, 140, 180 may be implemented as computer system 300 in a manner as follows.


In accordance with various embodiments of the present disclosure, computer system 300, such as a personal computer and/or a network server, includes a bus 302 or other communication mechanism for communicating information, which interconnects subsystems and components, such as processing component 304 (e.g., processor, micro-controller, digital signal processor (DSP), etc.), system memory component 306 (e.g., RAM), static storage component 308 (e.g., ROM), disk drive component 310 (e.g., magnetic or optical), network interface component 312 (e.g., modem or Ethernet card), display component 314 (e.g., CRT or LCD), input component 316 (e.g., keyboard), and cursor control component 318 (e.g., mouse or trackball). In one implementation, disk drive component 310 may comprise a database having one or more disk drive components.


In accordance with embodiments of the present disclosure, computer system 300 performs specific operations by processor 304 executing one or more sequences of one or more instructions contained in system memory component 306. Such instructions may be read into system memory component 306 from another computer readable medium, such as static storage component 308 or disk drive component 310. In other embodiments, hard-wired circuitry may be used in place of or in combination with software instructions to implement the present disclosure.


Logic may be encoded in a computer readable medium, which may refer to any medium that participates in providing instructions to processor 304 for execution. Such a medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media. In various implementations, non-volatile media includes optical or magnetic disks, such as disk drive component 310, volatile media includes dynamic memory, such as system memory component 306, and transmission media includes coaxial cables, copper wire, and fiber optics, including wires that comprise bus 302. In one example, transmission media may take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications.


Some common forms of computer readable media include, for example, floppy disk, flexible disk, hard disk, magnetic tape, any other magnetic medium, CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, RAM, PROM, EPROM, FLASH-EPROM, any other memory chip or cartridge, carrier wave, or any other medium from which a computer is adapted to read.


In various embodiments of the present disclosure, execution of instruction sequences to practice the present disclosure may be performed by computer system 300. In various other embodiments of the present disclosure, a plurality of computer systems 300 coupled by communication link 320 (e.g., network 160 of FIG. 1, such as a LAN, WLAN, PSTN, and/or various other wired or wireless networks, including telecommunications, mobile, and cellular phone networks) may perform instruction sequences to practice the present disclosure in coordination with one another.


Computer system 300 may transmit and receive messages, data, information and instructions, including one or more programs (i.e., application code) through communication link 320 and communication interface 312. Received program code may be executed by processor 304 as received and/or stored in disk drive component 310 or some other non-volatile storage component for execution.


Where applicable, various embodiments provided by the present disclosure may be implemented using hardware, software, or combinations of hardware and software. Also, where applicable, the various hardware components and/or software components set forth herein may be combined into composite components comprising software, hardware, and/or both without departing from the spirit of the present disclosure. Where applicable, the various hardware components and/or software components set forth herein may be separated into sub-components comprising software, hardware, or both without departing from the scope of the present disclosure. In addition, where applicable, it is contemplated that software components may be implemented as hardware components and vice-versa.


Software, in accordance with the present disclosure, such as program code and/or data, may be stored on one or more computer readable mediums. It is also contemplated that software identified herein may be implemented using one or more general purpose or specific purpose computers and/or computer systems, networked and/or otherwise. Where applicable, the ordering of various steps described herein may be changed, combined into composite steps, and/or separated into sub-steps to provide features described herein.


The foregoing disclosure is not intended to limit the present disclosure to the precise forms or particular fields of use disclosed. As such, it is contemplated that various alternate embodiments and/or modifications to the present disclosure, whether explicitly described or implied herein, are possible in light of the disclosure. Having thus described embodiments of the present disclosure, persons of ordinary skill in the art will recognize that changes may be made in form and detail without departing from the scope of the present disclosure. Thus, the present disclosure is limited only by the claims.

Claims
  • 1. A method comprising: determining, based on received input, a style to apply to a video, the video being associated with a user of a social network, wherein the style comprises a kaleidoscope style that supports movement of the video; generating the video with the kaleidoscope style; displaying a user interface that includes a preview of the video with the kaleidoscope style and a button bar comprising a row of buttons, wherein at least one button in the row of buttons is associated with one or more editing tools, wherein selection of the at least one button causes content associated with the respective editing tool to be displayed in the user interface; receiving a request, from the user of the social network, to publish the video with the kaleidoscope style to a channel of the social network associated with the user; and in response to receiving the request to publish, sharing the video with the kaleidoscope style through the channel via the social network such that the video can be displayed by a plurality of computing devices, wherein the plurality of computing devices are associated with a plurality of users having access to the channel such that content is automatically displayed on the plurality of computing devices.
  • 2. The method of claim 1, wherein the one or more editing tools includes a sound tool for adding audio, sound, music, or a voice-over to the video.
  • 3. The method of claim 1, wherein the one or more editing tools includes a text tool for adding text to the video.
  • 4. The method of claim 1, further comprising: receiving user input via the user interface to add text to the video; and adding text to the video prior to sharing the video.
  • 5. The method of claim 1, further comprising: receiving user input via the user interface to add music to the video; and adding music to the video prior to sharing the video.
  • 6. A system, comprising: at least one processor; and a memory coupled to the at least one processor and storing instructions that, when executed by the at least one processor, perform operations comprising: determining, based on received input, a style to apply to a video, the video being associated with a user of a social network, wherein the style comprises a kaleidoscope style that supports movement of the video; generating the video with the kaleidoscope style; receiving a request, from the user of the social network, to publish the video with the kaleidoscope style to a channel of the social network associated with the user; and in response to receiving the request to publish, sharing the video with the kaleidoscope style through the channel via the social network such that the video can be displayed by a plurality of computing devices, wherein the plurality of computing devices are associated with a plurality of users having access to the channel such that content is automatically displayed on the plurality of computing devices.
  • 7. The system of claim 6, wherein the operations further comprise displaying a user interface that includes one or more editing tools for editing the video.
  • 8. The system of claim 7, wherein the one or more editing tools includes a preview tool for previewing the video prior to sharing the video through the channel via the social network.
  • 9. The system of claim 7, wherein the one or more editing tools includes a sound tool for adding audio, sound, music, or a voice-over to the video.
  • 10. The system of claim 7, wherein the one or more editing tools includes a text tool for adding text to the video.
  • 11. The system of claim 7, wherein the operations further comprise: receiving user input via the user interface to add text to the video; and adding text to the video prior to sharing the video.
  • 12. The system of claim 7, wherein the operations further comprise: receiving user input via the user interface to add music to the video; and adding music to the video prior to sharing the video.
  • 13. The system of claim 7, wherein the operations further comprise: receiving user input via the user interface to add music and text to the video; and adding music and text to the video prior to sharing the video.
  • 14. The system of claim 6, wherein the operations further comprise displaying a user interface that includes a preview of the video with the determined style and a button bar comprising a row of buttons, wherein at least one button in the row of buttons is associated with one or more editing tools, wherein selection of the at least one button causes content associated with the respective editing tool to be displayed in the user interface.
  • 15. One or more computer-readable storage devices having instructions stored thereon that, responsive to execution by one or more processors, perform operations comprising: determining, based on received input, a style to apply to a video, the video being associated with a user of a social network, wherein the style comprises a kaleidoscope style that supports movement of the video; generating the video with the kaleidoscope style; receiving a request, from the user of the social network, to publish the video with the kaleidoscope style to a channel of the social network associated with the user; and in response to receiving the request to publish, sharing the video with the kaleidoscope style through the channel via the social network such that the video can be displayed by a plurality of computing devices, wherein the plurality of computing devices are associated with a plurality of users having access to the channel such that content is automatically displayed on the plurality of computing devices.
  • 16. The one or more computer-readable storage devices of claim 15, wherein the operations further comprise displaying a user interface that includes one or more editing tools for editing the video.
  • 17. The one or more computer-readable storage devices of claim 16, wherein the one or more editing tools includes a preview tool for previewing the video prior to sharing the video through the channel via the social network.
  • 18. The one or more computer-readable storage devices of claim 16, wherein the one or more editing tools includes a sound tool for adding audio, sound, music, or a voice-over to the video.
  • 19. The one or more computer-readable storage devices of claim 16, wherein the one or more editing tools includes a text tool for adding text to the video.
  • 20. The one or more computer-readable storage devices of claim 15, wherein the operations further comprise displaying a user interface that includes a preview of the video with the determined style and a button bar comprising a row of buttons, wherein at least one button in the row of buttons is associated with one or more editing tools, wherein selection of the at least one button causes content associated with the respective editing tool to be displayed in the user interface.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 12/347,747, filed Dec. 31, 2008, which application claims priority to and the benefit of U.S. Provisional Patent Application No. 61/078,288, filed Jul. 3, 2008, entitled, “Multi-Media Online Presentation System and Method,” each of which is incorporated herein by reference in its entirety.

US Referenced Citations (277)
Number Name Date Kind
5539871 Gibson Jul 1996 A
5541662 Adams et al. Jul 1996 A
5559942 Gough et al. Sep 1996 A
5578808 Taylor Nov 1996 A
5666554 Tanaka Sep 1997 A
5708845 Wistendahl et al. Jan 1998 A
5799292 Hekmatpour et al. Aug 1998 A
5844557 Shively, II Dec 1998 A
5845299 Arora et al. Dec 1998 A
5860073 Ferrel et al. Jan 1999 A
5918012 Astiz et al. Jun 1999 A
5933817 Hucal Aug 1999 A
6008807 Bretschneider et al. Dec 1999 A
6038552 Fleischl et al. Mar 2000 A
6085249 Wang et al. Jul 2000 A
6097389 Morris et al. Aug 2000 A
6154771 Rangan et al. Nov 2000 A
6259457 Davies et al. Jul 2001 B1
6397196 Kravetz et al. May 2002 B1
6470100 Horiuchi Oct 2002 B2
6515656 Wittenburg et al. Feb 2003 B1
6647383 August et al. Nov 2003 B1
6693649 Lipscomb et al. Feb 2004 B1
6751776 Gong Jun 2004 B1
6769095 Brassard et al. Jul 2004 B1
6789060 Wolfe et al. Sep 2004 B1
6834282 Bonneau et al. Dec 2004 B1
6928610 Brintzenhofe et al. Aug 2005 B2
7023452 Oshiyama et al. Apr 2006 B2
7139970 Michaud et al. Nov 2006 B2
7181468 Spring Feb 2007 B2
7276290 Sanders Feb 2007 B2
7237185 Sequeira Jun 2007 B1
7296242 Agata et al. Nov 2007 B2
7376290 Anderson et al. May 2008 B2
7409543 Bjorn Aug 2008 B1
7469380 Wessling et al. Dec 2008 B2
7502795 Svendsen et al. Mar 2009 B1
7573486 Mondry et al. Apr 2009 B2
7546554 Chiu et al. Jun 2009 B2
7576555 Hashimoto Aug 2009 B2
7576755 Sun et al. Aug 2009 B2
RE41210 Wang et al. Apr 2010 E
7725494 Rogers et al. May 2010 B2
7752548 Mercer Jul 2010 B2
7768535 Reid et al. Aug 2010 B2
7805382 Rosen et al. Sep 2010 B2
7814560 Bellagamba et al. Oct 2010 B2
7836110 Schoenbach et al. Nov 2010 B1
7885951 Rothschild Feb 2011 B1
7885955 Hull et al. Feb 2011 B2
7982909 Beato et al. Jul 2011 B2
8006192 Reid et al. Aug 2011 B1
8010629 Lanahan et al. Aug 2011 B2
8019579 Wey et al. Sep 2011 B1
8024658 Fagans et al. Sep 2011 B1
8082328 Hull et al. Dec 2011 B2
8103546 Des Jardins et al. Jan 2012 B1
8121902 Desjardins et al. Feb 2012 B1
8131114 Wang et al. Mar 2012 B2
8180178 Cheatle May 2012 B2
8316084 Lanahan et al. Nov 2012 B2
8365092 Lanahan et al. Jan 2013 B2
8560565 Howard et al. Oct 2013 B2
8591332 Bright Nov 2013 B1
8620893 Howard et al. Dec 2013 B2
8627192 Lanahan et al. Jan 2014 B2
8667160 Haot et al. Mar 2014 B1
8789094 Singh et al. Jul 2014 B1
8799952 Gossweiler, III et al. Aug 2014 B2
8812945 Sidon et al. Aug 2014 B2
8893015 Lanahan et al. Nov 2014 B2
9043726 Lanahan et al. May 2015 B2
9058765 Mallick et al. Jun 2015 B1
9165388 Delia et al. Oct 2015 B2
9430448 Howard et al. Aug 2016 B2
9432361 Mahaffey et al. Aug 2016 B2
9613006 Lanahan et al. Apr 2017 B2
9639505 Lanahan et al. May 2017 B2
9658754 Lanahan et al. May 2017 B2
10157170 Howard et al. Dec 2018 B2
10282391 Lanahan et al. May 2019 B2
10706222 Lanahan et al. Jul 2020 B2
10853555 Lanahan et al. Dec 2020 B2
11017160 Lanahan May 2021 B2
11100690 Lanahan et al. Aug 2021 B2
20010034740 Kerne Oct 2001 A1
20010044825 Barritz Nov 2001 A1
20010044835 Schober et al. Nov 2001 A1
20010050681 Keys et al. Dec 2001 A1
20020023111 Arora et al. Feb 2002 A1
20020080165 Wakefield Jun 2002 A1
20020083178 Brothers Jun 2002 A1
20020091600 Kravetz et al. Jul 2002 A1
20020108122 Alao et al. Aug 2002 A1
20020112093 Slotznick Aug 2002 A1
20020112096 Kaminsky et al. Aug 2002 A1
20020122067 Geigel et al. Sep 2002 A1
20020135621 Angiulo et al. Sep 2002 A1
20020138428 Spear Sep 2002 A1
20020152233 Cheong et al. Oct 2002 A1
20020164151 Jasinschi et al. Nov 2002 A1
20020180803 Kaplan et al. Dec 2002 A1
20030014510 Avvari et al. Jan 2003 A1
20030046222 Bard et al. Mar 2003 A1
20030098877 Boegelund May 2003 A1
20030149983 Markel Aug 2003 A1
20040001106 Deutscher et al. Jan 2004 A1
20040008226 Manolis et al. Jan 2004 A1
20040021684 B. Millner Feb 2004 A1
20040054579 Lamb et al. Mar 2004 A1
20040083080 Reghetti et al. Apr 2004 A1
20040091232 Appling, III May 2004 A1
20040097232 Haverinen May 2004 A1
20040143796 Lerner et al. Jul 2004 A1
20040184778 Jung et al. Sep 2004 A1
20040199574 Franco et al. Oct 2004 A1
20040268224 Balkus et al. Dec 2004 A1
20050007382 Schowtka Jan 2005 A1
20050069225 Schneider et al. Mar 2005 A1
20050094014 Haas et al. May 2005 A1
20050114356 Bhatti May 2005 A1
20050114754 Miller et al. May 2005 A1
20050114784 Spring et al. May 2005 A1
20050138193 Encarnacion et al. Jun 2005 A1
20050149970 Fairhurst et al. Jul 2005 A1
20050228749 Lozano Oct 2005 A1
20050234981 Manousos et al. Oct 2005 A1
20050235201 Brown et al. Oct 2005 A1
20050237952 Punj et al. Oct 2005 A1
20050268227 Carlson et al. Dec 2005 A1
20050268279 Paulsen et al. Dec 2005 A1
20050273693 Peterson Dec 2005 A1
20060010162 Stevens et al. Jan 2006 A1
20060036949 Moore et al. Feb 2006 A1
20060041632 Shah Feb 2006 A1
20060061595 Goede et al. Mar 2006 A1
20060064642 Iyer Mar 2006 A1
20060069989 Jones et al. Mar 2006 A1
20060070005 Gilbert et al. Mar 2006 A1
20060071947 Ubillos et al. Apr 2006 A1
20060086843 Lin et al. Apr 2006 A1
20060089843 Flather Apr 2006 A1
20060106693 Carlson et al. May 2006 A1
20060112081 Qureshi May 2006 A1
20060114510 Maeng Jun 2006 A1
20060123455 Pai et al. Jun 2006 A1
20060129917 Volk et al. Jun 2006 A1
20060181736 Quek et al. Aug 2006 A1
20060184574 Wu et al. Aug 2006 A1
20060193008 Osaka et al. Aug 2006 A1
20060195789 Rogers et al. Aug 2006 A1
20060203294 Makino Sep 2006 A1
20060206811 Dowdy Sep 2006 A1
20060209214 Fader et al. Sep 2006 A1
20060230332 Lin Oct 2006 A1
20060256739 Seier et al. Nov 2006 A1
20060271691 Jacobs et al. Nov 2006 A1
20060277482 Hoffman et al. Dec 2006 A1
20060287989 Glance Dec 2006 A1
20070016930 Wesemann et al. Jan 2007 A1
20070033059 Adkins Feb 2007 A1
20070038931 Allaire et al. Feb 2007 A1
20070050718 Moore et al. Mar 2007 A1
20070061266 Moore et al. Mar 2007 A1
20070061715 Chartier et al. Mar 2007 A1
20070070066 Bakhash Mar 2007 A1
20070074110 Miksovsky et al. Mar 2007 A1
20070078989 Van Datta et al. Apr 2007 A1
20070089057 Kindig Apr 2007 A1
20070113250 Logan et al. May 2007 A1
20070118801 Harshbarger et al. May 2007 A1
20070130177 Schneider et al. Jun 2007 A1
20070136194 Sloan Jun 2007 A1
20070136244 Maclaurin et al. Jun 2007 A1
20070156382 Graham et al. Jul 2007 A1
20070156434 Martin et al. Jul 2007 A1
20070162853 Weber et al. Jul 2007 A1
20070162856 Schlossberg et al. Jul 2007 A1
20070186182 Schiller Aug 2007 A1
20070204208 Cheng et al. Aug 2007 A1
20070204209 Truelove et al. Aug 2007 A1
20070239770 Enock et al. Oct 2007 A1
20070245243 Lanza et al. Oct 2007 A1
20070253028 Widdowson Nov 2007 A1
20070262995 Tran Nov 2007 A1
20080005125 Gaedeke Jan 2008 A1
20080005282 Gaedcke Jan 2008 A1
20080005669 Eilertsen et al. Jan 2008 A1
20080021829 Kranzley Jan 2008 A1
20080027798 Ramamurthi et al. Jan 2008 A1
20080034295 Kulas Feb 2008 A1
20080040683 Walsh Feb 2008 A1
20080046406 Seide et al. Feb 2008 A1
20080077530 Banas et al. Mar 2008 A1
20080081662 Strandell et al. Apr 2008 A1
20080086688 Chandratillake et al. Apr 2008 A1
20080086689 Berkley et al. Apr 2008 A1
20080092054 Bhumkar et al. Apr 2008 A1
20080120278 Roe et al. May 2008 A1
20080120550 Oakley May 2008 A1
20080126191 Schiavi May 2008 A1
20080134018 Kembel et al. Jun 2008 A1
20080165960 Woo Jul 2008 A1
20080195477 Kennedy et al. Aug 2008 A1
20080195962 Lin et al. Aug 2008 A1
20080205694 Sagoo et al. Aug 2008 A1
20080215680 Salesky et al. Sep 2008 A1
20080215985 Batchelder et al. Sep 2008 A1
20080222538 Cardu Sep 2008 A1
20080222560 Harrison Sep 2008 A1
20080244038 Martinez Oct 2008 A1
20080244740 Hicks et al. Oct 2008 A1
20080270905 Goldman Oct 2008 A1
20080276279 Gossweiler et al. Nov 2008 A1
20080288460 Poniatowski et al. Nov 2008 A1
20080301546 Moore et al. Dec 2008 A1
20080306995 Newell et al. Dec 2008 A1
20090007023 Sundstrom Jan 2009 A1
20090037449 Fagans et al. Feb 2009 A1
20090083161 Mital Mar 2009 A1
20090087161 Roberts et al. Apr 2009 A1
20090119256 Waters et al. May 2009 A1
20090132415 Davis et al. May 2009 A1
20090138320 Schmidt et al. May 2009 A1
20090150797 Burkholder Jun 2009 A1
20090177546 Dijk et al. Jul 2009 A1
20090182810 Higgins et al. Jul 2009 A1
20090210391 Hall et al. Aug 2009 A1
20090254515 Terheggen et al. Oct 2009 A1
20090271283 Fosnacht et al. Oct 2009 A1
20090276425 Phillips et al. Nov 2009 A1
20090292681 Wood et al. Nov 2009 A1
20090319530 Hoertnagl et al. Dec 2009 A1
20100004508 Naito et al. Jan 2010 A1
20100005066 Howard et al. Jan 2010 A1
20100005067 Howard et al. Jan 2010 A1
20100005068 Howard et al. Jan 2010 A1
20100005119 Howard et al. Jan 2010 A1
20100005139 Lanahan et al. Jan 2010 A1
20100005168 Williams et al. Jan 2010 A1
20100005379 Lanahan et al. Jan 2010 A1
20100005380 Lanahan et al. Jan 2010 A1
20100005397 Lanahan et al. Jan 2010 A1
20100005408 Lanahan et al. Jan 2010 A1
20100005417 Lanahan et al. Jan 2010 A1
20100005498 Lanahan et al. Jan 2010 A1
20100023849 Hakim et al. Jan 2010 A1
20100036812 Choi et al. Feb 2010 A1
20100042628 Crowley et al. Feb 2010 A1
20100083077 Paulsen et al. Apr 2010 A1
20100083303 Redei et al. Apr 2010 A1
20100115410 Fu et al. May 2010 A1
20100162375 Tiu, Jr. et al. Jun 2010 A1
20100281386 Lyons et al. Nov 2010 A1
20100325019 Avery Dec 2010 A1
20100332565 Al-Shaykh et al. Dec 2010 A1
20110022966 Rose et al. Jan 2011 A1
20110060979 O'Brien-Strain Mar 2011 A1
20110285748 Slatter et al. Nov 2011 A1
20120323743 Chang et al. Dec 2012 A1
20130124996 Margulis May 2013 A1
20140108510 Schwesig et al. Apr 2014 A1
20140108931 Howard et al. Apr 2014 A1
20140122985 Lanahan et al. May 2014 A1
20140282877 Mahaffey et al. Sep 2014 A1
20150074502 Lanahan et al. Mar 2015 A1
20150254212 Lanahan et al. Sep 2015 A1
20160170568 Kontkanen et al. Jun 2016 A1
20160371266 Howard et al. Dec 2016 A1
20170199847 Lanahan et al. Jul 2017 A1
20170235450 Lanahan et al. Aug 2017 A1
20170235712 Lanahan et al. Aug 2017 A1
20180329870 Lanahan et al. Nov 2018 A1
20190339830 Lanahan et al. Nov 2019 A1
20200272787 Lanahan et al. Aug 2020 A1
20210081595 Lanahan et al. Mar 2021 A1
Foreign Referenced Citations (6)
Number Date Country
2008-183330 Aug 2008 JP
00007110 Feb 2000 WO
00056055 Sep 2000 WO
02059799 Aug 2002 WO
2010003111 Jan 2010 WO
2010003121 Jan 2010 WO
Non-Patent Literature Citations (176)
Entry
Final Office Action received for U.S. Appl. No. 12/495,520, dated Oct. 18, 2011, 29 Pages.
Non-Final Office Action received for U.S. Appl. No. 12/495,520, dated Apr. 22, 2014, 35 Pages.
Non-Final Office Action received for U.S. Appl. No. 12/495,520, dated Jul. 22, 2013, 34 Pages.
Non-Final Office Action received for U.S. Appl. No. 12/495,520, dated Mar. 3, 2011, 22 Pages.
Non-Final Office Action received for U.S. Appl. No. 12/495,520, dated Sep. 10, 2015, 39 Pages.
Non-Final Office Action received for U.S. Appl. No. 12/495,520, dated Oct. 24, 2012, 31 Pages.
Notice of Allowance received for U.S. Appl. No. 12/495,520, dated Apr. 25, 2016, 8 Pages.
Final Office Action received for U.S. Appl. No. 12/495,684, dated Apr. 10, 2012, 16 Pages.
Non-Final Office Action received for U.S. Appl. No. 12/495,684, dated Dec. 2, 2013, 14 Pages.
Non-Final Office Action received for U.S. Appl. No. 12/495,684, dated Nov. 15, 2011, 12 Pages.
Notice of Allowance received for U.S. Appl. No. 12/495,684, dated Jul. 7, 2014, 16 pages.
Final Office Action received for U.S. Appl. No. 12/495,718, dated Dec. 30, 2013, 11 Pages.
Final Office Action received for U.S. Appl. No. 12/495,718, dated Feb. 27, 2012, 11 Pages.
Non-Final Office Action received for U.S. Appl. No. 12/495,718, dated Jun. 28, 2013, 10 Pages.
Non-Final Office Action received for U.S. Appl. No. 12/495,718, dated Nov. 15, 2011, 10 Pages.
Non-Final Office Action received for U.S. Appl. No. 12/495,718, dated Sep. 30, 2014, 10 Pages.
Notice of Allowance received for U.S. Appl. No. 12/495,718, dated Jan. 26, 2015, 13 pages.
Final Office Action received for U.S. Appl. No. 12/495,748, dated Apr. 17, 2012, 5 Pages.
Non-Final Office Action received for U.S. Appl. No. 12/495,748, dated Nov. 8, 2011, 8 Pages.
Notice of Allowance received for U.S. Appl. No. 12/495,748, dated Jul. 18, 2012, 6 Pages.
Non-Final Office Action received for U.S. Appl. No. 12/495,756, dated Feb. 2, 2011, 15 Pages.
Notice of Allowance received for U.S. Appl. No. 12/495,756, dated Jun. 15, 2011, 9 Pages.
Supplemental Notice of Allowance received for U.S. Appl. No. 12/495,756, dated Jul. 19, 2011, 3 pages.
Advisory Action received for U.S. Appl. No. 14/144,199, dated Dec. 30, 2016, 5 Pages.
Corrected Notice of Allowability received for U.S. Appl. No. 14/144,199, dated Nov. 2, 2018, 3 pages.
Corrected Notice of Allowability received for U.S. Appl. No. 14/144,199, dated Nov. 19, 2018, 3 pages.
Corrected Notice of Allowability received for U.S. Appl. No. 14/144,199, dated Sep. 14, 2018, 3 pages.
Final Office Action received for U.S. Appl. No. 14/144,199, dated Sep. 22, 2016, 14 Pages.
Non-Final Office Action received for U.S. Appl. No. 14/144,199, dated Apr. 4, 2017, 14 Pages.
Non-Final Office Action received for U.S. Appl. No. 14/144,199, dated Mar. 10, 2016, 44 Pages.
Non-Final Office Action received for U.S. Appl. No. 14/144,199, dated Mar. 14, 2018, 8 pages.
Notice of Allowance received for U.S. Appl. No. 14/144,199, dated Aug. 7, 2018, 9 pages.
Notice of Allowance received for U.S. Appl. No. 14/144,199, dated Nov. 17, 2017, 8 pages.
Final Office Action received for U.S. Appl. No. 14/149,140, dated Oct. 20, 2016, 13 pages.
Non-Final Office Action received for U.S. Appl. No. 14/149,140, dated Mar. 24, 2016, 18 Pages.
Notice of Allowance received for U.S. Appl. No. 14/149,140, dated Nov. 18, 2016, 8 Pages.
Final Office Action received for U.S. Appl. No. 14/547,083, dated Nov. 3, 2016, 13 Pages.
Non-Final Office Action received for U.S. Appl. No. 14/547,083, dated Apr. 20, 2016, 11 Pages.
Notice of Allowance received for U.S. Appl. No. 14/547,083, dated Jan. 13, 2017, 9 Pages.
Non-Final Office Action received for U.S. Appl. No. 14/722,030, dated Jun. 19, 2017, 11 pages.
Notice of Allowance received for U.S. Appl. No. 14/722,030, dated Feb. 27, 2018, 7 pages.
Notice of Allowance received for U.S. Appl. No. 14/722,030, dated Jan. 23, 2019, 7 pages.
Notice of Allowance received for U.S. Appl. No. 14/722,030, dated Oct. 19, 2017, 9 pages.
Notice of Allowance received for U.S. Appl. No. 14/722,030, dated Jun. 11, 2018, 7 pages.
Advisory Action received for U.S. Appl. No. 15/250,763, dated Jul. 5, 2018, 3 pages.
Final Office Action received for U.S. Appl. No. 15/250,763, dated Apr. 12, 2019, 13 pages.
Final Office Action received for U.S. Appl. No. 15/250,763, dated Mar. 27, 2018, 12 Pages.
Final Office Action received for U.S. Appl. No. 15/250,763, dated May 5, 2017, 13 Pages.
Non-Final Office Action received for U.S. Appl. No. 15/250,763, dated Aug. 31, 2017, 14 pages.
Non-Final Office Action received for U.S. Appl. No. 15/250,763, dated Jan. 13, 2017, 16 Pages.
Non-Final Office Action received for U.S. Appl. No. 15/250,763, dated Sep. 20, 2018, 12 pages.
Final Office Action received for U.S. Appl. No. 15/452,474, dated Jul. 16, 2018, 16 pages.
Final Office Action Received for U.S. Appl. No. 15/452,474, dated Jun. 23, 2020, 20 pages.
Final Office Action received for U.S. Appl. No. 15/452,474, dated Mar. 22, 2019, 14 pages.
Final Office Action Received for U.S. Appl. No. 15/452,474, dated Oct. 3, 2019, 15 pages.
Non-Final Office Action received for U.S. Appl. No. 15/452,474, dated Jan. 30, 2020, 16 pages.
Non-Final Office Action received for U.S. Appl. No. 15/452,474, dated Dec. 28, 2020, 14 pages.
Non-Final Office Action received for U.S. Appl. No. 15/452,474, dated Feb. 6, 2018, 13 pages.
Non-Final Office Action received for U.S. Appl. No. 15/452,474, dated Nov. 6, 2018, 11 pages.
Non-Final Office Action received for U.S. Appl. No. 15/452,474, dated Jul. 18, 2019, 14 pages.
Notice of Allowance Received for U.S. Appl. No. 15/452,474, dated Apr. 2, 2021, 9 Pages.
Corrected Notice of Allowability Received for U.S. Appl. No. 15/583,704, dated May 29, 2020, 2 pages.
Non-Final Office Action received for U.S. Appl. No. 15/583,704, dated Apr. 1, 2019, 15 pages.
Notice of Allowability Received for U.S. Appl. No. 15/583,704, dated Mar. 31, 2020, 2 pages.
Notice of Allowance received for U.S. Appl. No. 15/583,704, dated Feb. 20, 2020, 5 pages.
Notice of Non-Compliant Amendment Received for U.S. Appl. No. 15/583,704, dated Oct. 21, 2019, 3 pages.
Non-Final Office Action received for U.S. Appl. No. 15/584,993, dated Apr. 19, 2019, 12 pages.
Final Office Action received for U.S. Appl. No. 15/930,146, dated Mar. 15, 2021, 15 pages.
Non-Final Office Action received for U.S. Appl. No. 15/930,146, dated Dec. 22, 2020, 12 pages.
Non-Final Office Action received for U.S. Appl. No. 16/046,547, dated Mar. 20, 2020, 9 pages.
Notice of Allowance received for U.S. Appl. No. 16/046,547, dated Jul. 16, 2020, 7 pages.
Final Office Action received for U.S. Appl. No. 16/511,499, dated Dec. 4, 2020, 16 pages.
Non-Final Office Action received for U.S. Appl. No. 16/511,499, dated Feb. 18, 2021, 14 pages.
Non-Final Office Action received for U.S. Appl. No. 16/511,499, dated Jun. 11, 2020, 14 pages.
Arrington, "Ebay Launches 'Togo' Widgets for Any Listing", Retrieved from the Internet URL: <https://techcrunch.com/2007/04/30/ebay-launches-togo-widgets-for-any-listing/>, Apr. 30, 2007, 2 pages.
blog.justswell.org, Retrieved from the Internet URL: <https://web.archive.org/web/20090731120449/http://blog.justswell.org/drag-and-drop-files-from-your-desktop-to-your-browser-using-javascript/>, Jul. 28, 2009, 6 pages.
Brisbin, "Clickable Image Maps in Adobe GoLive", Retrieved from the Internet URL: <https://www.peachpit.com/articles/article.aspx?p=20995>, Mar. 30, 2001, 3 pages.
Burke, "How to Use Lotus Notes 6", Mar. 4, 2003, 4 pages.
Catone, “Create Photo Books with Panraven”, readwrite.com, retrieved from https://readwrite.com/2007/07/30/create_photobooks_with_panraven/. (Year: 2007), Jul. 30, 2007, 3 Pages.
Hansen, "Gus Hansen's Exclusive Poker Tips Video #1", Retrieved from the Internet URL: <www.dailymotion.com/video/x3op2y_gus-hansensexclusive-poker-tips-vi_videogames>, Category: Gaming, Dec. 6, 2007, 2 pages.
IGN, "World Poker Tour Deals Twelve Million Hands of WPT Texas Hold 'Em and Receives Industry Accolades", Retrieved from the Internet URL: <http://www.ign.com/articles/2008/02/26/world-poker-tour-deals-twelve-million-hands-of-wpt-texas-hold-em-2-and-receives-industry-accolades>, Feb. 26, 2008, 9 pages.
Warner, Janine C., "Dreamweaver CS3 for Dummies", May 7, 2007, pp. 2-3, 80-83, 100-105, and 394-395.
Lowensohn, "Ebay Does MySpace-Compatible Widgets", CNET, Retrieved from the Internet URL: <https://www.cnet.com/au/news/ebay-does-myspace-compatible-widgets/>, May 2, 2007, 3 pages.
International Preliminary Report on Patentability received for PCT Application No. PCT/US2009/049606, dated Jan. 13, 2011, 5 pages.
International Search Report received for PCT Patent Application No. PCT/US2009/049606, dated Aug. 14, 2009, 2 pages.
International Written Opinion received for PCT Application No. PCT/US2009/049606, dated Aug. 14, 2009, 4 pages.
International Preliminary Report on Patentability received for PCT Application No. PCT/US2009/049622, dated Jan. 13, 2011, 5 pages.
International Search Report received for PCT Application No. PCT/US2009/049622, dated Aug. 14, 2009, 2 pages.
International Written Opinion received for PCT Application No. PCT/US2009/049622, dated Aug. 14, 2009, 4 pages.
Scott, "Looks Good Works Well: Mind Hacking Visual Transitions", Retrieved from the Internet URL: <http://looksgoodworkswell.blogspot.in/2006/>, Mar. 2006, 3 pages.
Taylor, "Crystal Reports 10: Adding a Hyperlink to a Report", in Crystal Reports 10 for Dummies, Jun. 1, 2004, 6 pages.
Wikipedia, "File Manager", Retrieved from the Internet URL: <https://en.wikipedia.org/wiki/File_manager>, provided to establish well-known aspects in the art as it relates to file managers (e.g., what they are), 9 pages.
Yahoo!, "Groups - Groups Messages Help", collection of help pages for Yahoo! Groups as captured by the Internet Archive Wayback Machine in Feb. 2006, originally available at http://help.yahoo.com/help/us/groups/index.html. (Year: 2006), 9 pages.
Asterpix - SearchLight: Content Discovery Made Easy, Retrieved from the Internet URL: <http://www.asterpix.com/searchlight/>, Feb. 14, 2011, 1 page.
Ebay, "The Chatter: Check out the Ebay to Go Widget", May 17, 2007, 3 pages.
Formatting Shapes and Objects, Retrieved online from URL: <http://www.functionx.com/powerpoint/Lesson11.htm>, Copyright © 2004-2007 FunctionX, Inc., Dec. 14, 2007, 7 pages.
Free Word 2003 Tutorial at GCFLearnFree, Retrieved online from the Internet URL: <http://www.gcflearnfree.org/word2003/insertinghyperlinks/1>, Jan. 1, 2003, 2 pages.
Golden Nugget, Retrieved from the Internet URL: <www.absolute-playstation.com/api_review/rgnugg.htm>, Apr. 4, 2011, 5 pages.
Golden Nugget Screenshots, media.psx.ign.com/medial000/0002951imgs_1.html, Apr. 4, 2011, 2 pages.
How to Create a Clickable Image Map with Dreamweaver, Ehow, Feb. 16, 2008, 2 pages.
Microsoft FrontPage 2003 Image Maps: Creating Hotspots, Retrieved from the Internet URL: <http://www.uwec.edu/help/fpage03/imp-hotspot.htm>, University of Wisconsin-Eau Claire, Sep. 29, 2004, 5 pages.
Naj, "My Depictions: Ebay to Go - New Widget to Display Listings", May 22, 2007, 3 pages.
Photoshow, Retrieved from the Internet URL: <http://www.photoshow.com/home/start>, Accessed on May 21, 2019, 1 page.
Upload Files in a Browser Using Drag and Drop, Google Operating System, Retrieved from the Internet URL: <http://googlesystem.blogspot.com/2007/02/upload-files-in-browser-using-drag-and.html>, Feb. 23, 2007, 1 page.
Using Adobe Acrobat, Apr. 9, 2004, 17 pages.
WPT Mobile; World Poker Tour, Retrieved online from the Internet URL: <wptmobile.handson.com/wpt_texas_hold_em_2.php?performcheck=2>, 2008, 1 page.
Communication Pursuant to Article 94(3) EPC received for European Patent Application No. 09774560.8, mailed on Nov. 10, 2014, 7 pages.
Extended European Search report received for European Patent Application No. 09774560.8, dated Jun. 26, 2013, 6 pages.
Summons to attend oral Proceedings received for EP Patent Application No. 09774560.8, mailed on May 3, 2017, 16 pages.
Communication Pursuant to Article 94(3) EPC received for European Patent Application No. 09774570.7, dated Sep. 16, 2013, 5 Pages.
Supplementary European Search report received for European Patent Application No. 09774570.7, dated Nov. 22, 2011, 8 pages.
Advisory Action received for U.S. Appl. No. 12/347,638, dated Dec. 7, 2016, 3 Pages.
Final Office Action received for U.S. Appl. No. 12/347,638, dated Apr. 19, 2012, 11 Pages.
Final Office Action received for U.S. Appl. No. 12/347,638, dated Apr. 25, 2014, 12 Pages.
Final Office Action received for U.S. Appl. No. 12/347,638, dated Aug. 17, 2015, 15 Pages.
Final Office Action received for U.S. Appl. No. 12/347,638, dated May 21, 2013, 11 Pages.
Final Office Action received for U.S. Appl. No. 12/347,638, dated Sep. 26, 2016, 15 Pages.
Non-Final Office Action received for U.S. Appl. No. 12/347,638, dated Feb. 10, 2015, 15 pages.
Non-Final Office Action received for U.S. Appl. No. 12/347,638, dated Jan. 15, 2016, 16 Pages.
Non-Final Office Action received for U.S. Appl. No. 12/347,638, dated Oct. 4, 2013, 12 Pages.
Non-Final Office Action received for U.S. Appl. No. 12/347,638, dated Oct. 26, 2012, 12 Pages.
Non-Final Office Action received for U.S. Appl. No. 12/347,638, dated Sep. 8, 2011, 10 Pages.
Notice of Allowance received for U.S. Appl. No. 12/347,638, dated Dec. 30, 2016, 5 Pages.
Final Office Action received for U.S. Appl. No. 12/347,749, dated Jul. 17, 2012, 10 Pages.
Final Office Action received for U.S. Appl. No. 12/347,749, dated Sep. 2, 2011, 7 Pages.
Non-Final Office Action received for U.S. Appl. No. 12/347,749, dated Dec. 23, 2011, 9 Pages.
Non-Final Office Action received for U.S. Appl. No. 12/347,749, dated Feb. 13, 2013, 13 pages.
Non-Final Office Action received for U.S. Appl. No. 12/347,749, dated Mar. 24, 2011, 8 Pages.
Notice of Allowance received for U.S. Appl. No. 12/347,749, dated Aug. 28, 2013, 11 Pages.
Final Office Action received for U.S. Appl. No. 12/347,829, dated Jun. 14, 2012, 14 pages.
Non-Final Office Action received for U.S. Appl. No. 12/347,829, dated Oct. 5, 2011, 12 pages.
Notice of Allowance received for U.S. Appl. No. 12/347,829, dated Sep. 27, 2012, 8 Pages.
Final Office Action received for U.S. Appl. No. 12/495,438, dated Jan. 3, 2013, 15 Pages.
Final Office Action received for U.S. Appl. No. 12/495,438, dated Nov. 21, 2011, 15 Pages.
Non-Final Office Action received for U.S. Appl. No. 12/495,438, dated Jun. 20, 2011, 15 Pages.
Non-Final Office Action received for U.S. Appl. No. 12/495,438, dated Jun. 21, 2012, 13 Pages.
Notice of Allowance received for U.S. Appl. No. 12/495,438, dated Jun. 11, 2013, 17 pages.
Final Office Action received for U.S. Appl. No. 12/495,493, dated Dec. 28, 2011, 17 pages.
Non-Final Office Action received for U.S. Appl. No. 12/495,493, dated Aug. 2, 2011, 18 Pages.
Notice of Allowance received for U.S. Appl. No. 12/495,493, dated Aug. 26, 2013, 13 pages.
Final Office Action received for U.S. Appl. No. 12/495,520, dated Apr. 2, 2013, 33 Pages.
Final Office Action received for U.S. Appl. No. 12/495,520, dated Jan. 16, 2014, 34 Pages.
Final Office Action received for U.S. Appl. No. 12/495,520, dated Sep. 16, 2014, 34 Pages.
Non-Final Office Action received for U.S. Appl. No. 16/511,499, dated Sep. 2, 2021, 16 pages.
Non-Final Office Action received for U.S. Appl. No. 17/107,080, dated Nov. 3, 2021, 17 pages.
Final Office Action received for U.S. Appl. No. 15/930,146, dated Nov. 23, 2021, 16 pages.
U.S. Appl. No. 12/347,747, filed Dec. 31, 2008, Issued.
Summons to Attend Oral Proceedings received for European Patent Application No. 09774570.7, mailed on Oct. 14, 2015, 7 Pages.
Corrected Notice of Allowability Received for U.S. Appl. No. 15/452,474, dated May 13, 2021, 2 pages.
Non-Final Office Action received for U.S. Appl. No. 15/930,146, dated Jun. 10, 2021, 15 Pages.
Final Office Action Received for U.S. Appl. No. 16/511,499, dated Jun. 14, 2021, 14 Pages.
Final Office Action received for U.S. Appl. No. 12/347,747, dated Feb. 2, 2015, 10 pages.
Final Office Action received for U.S. Appl. No. 12/347,747, dated Jan. 25, 2012, 14 pages.
Non-Final Office Action received for U.S. Appl. No. 12/347,747, dated Jul. 24, 2020, 15 pages.
Non-Final Office Action received for U.S. Appl. No. 12/347,747, dated Dec. 29, 2017, 17 Pages.
Non-Final Office Action received for U.S. Appl. No. 12/347,747, dated Jan. 24, 2011, 9 pages.
Non-Final Office Action received for U.S. Appl. No. 12/347,747, dated Jul. 7, 2011, 8 pages.
Non-Final Office Action received for U.S. Appl. No. 12/347,747, dated Nov. 18, 2015, 11 pages.
Non-Final Office Action received for U.S. Appl. No. 12/347,747, dated Sep. 2, 2016, 15 pages.
Non-Final Office Action received for U.S. Appl. No. 12/347,747, dated Jul. 8, 2014, 9 pages.
Non-Final Office Action Received for U.S. Appl. No. 12/347,747, dated Oct. 4, 2019, 14 pages.
Advisory Action received for U.S. Appl. No. 12/347,747, dated May 24, 2017, 3 pages.
Decision on Pre Appeal Brief received for U.S. Appl. No. 12/347,747, dated Nov. 27, 2018, 2 pages.
Final Office Action received for U.S. Appl. No. 12/347,747, dated Jul. 12, 2018, 11 pages.
Final Office Action received for U.S. Appl. No. 12/347,747, dated Mar. 10, 2017, 14 pages.
Final Office Action received for U.S. Appl. No. 12/347,747, dated Mar. 31, 2016, 13 pages.
Final Office Action Received for U.S. Appl. No. 12/347,747, dated Mar. 31, 2020, 12 pages.
Notice of Allowance Received for U.S. Appl. No. 12/347,747, dated Jan. 26, 2021, 9 Pages.
Supplemental Notice of Allowability Received for U.S. Appl. No. 12/347,747, dated Apr. 27, 2021, 3 pages.
Supplemental Notice of Allowability Received for U.S. Appl. No. 12/347,747, dated Feb. 18, 2021, 3 Pages.
Corrected Notice of Allowability received for U.S. Appl. No. 15/452,474, dated Jul. 23, 2021, 2 Pages.
Notice of Allowance received for U.S. Appl. No. 16/511,499, dated Feb. 2, 2022, 10 pages.
Notice of Allowance received for U.S. Appl. No. 17/107,080, dated Mar. 2, 2022, 7 pages.
Non-Final Office Action received for U.S. Appl. No. 15/930,146, dated Aug. 4, 2022, 17 pages.
Corrected Notice of Allowability received for U.S. Appl. No. 16/511,499, dated May 11, 2022, 5 pages.
Final Office Action received for U.S. Appl. No. 15/930,146, dated Jan. 18, 2023, 15 pages.
Related Publications (1)
Number Date Country
20210279931 A1 Sep 2021 US
Provisional Applications (1)
Number Date Country
61078288 Jul 2008 US
Continuations (1)
Number Date Country
Parent 12347747 Dec 2008 US
Child 17328778 US