Systems and methods for linking media content

Information

  • Patent Grant
  • Patent Number
    8,438,602
  • Date Filed
    Monday, January 26, 2009
  • Date Issued
    Tuesday, May 7, 2013
Abstract
Systems and methods are described for presenting a plurality of media clips to a viewer using a media player device. A first media clip is displayed on the media player, and an indicator corresponding to a second media clip is provided during playback of the first media clip. In response to the viewer selecting the indicator, playback of the first media clip is suspended, and information is stored about the first media clip on the media player device. The second media clip can then be subsequently displayed on the media player device. After displaying the second media clip, the stored information can be retrieved, and playback of the first media clip can resume from the point at which playback was previously interrupted.
Description
TECHNICAL FIELD

The following discussion generally relates to streaming media, and in particular relates to systems, devices and techniques for processing linked media streams.


BACKGROUND

Consumers are continually demanding increased flexibility in viewing streaming and other forms of media. Whereas television viewing traditionally involved watching imagery received on a broadcast signal on a conventional television set, modern media experiences allow media content to be provided via broadcast, cable, satellite, portable media (e.g., DVD) and other sources. Further, the Internet and other relatively high-bandwidth networks now allow media content to be streamed or otherwise delivered to any number of devices (e.g., wireless phones, computers and the like) that previously were not typically used for viewing media content. Consumers are therefore able to view media content on a wide variety of devices and in a wide variety of locations.


In addition to the increased availability and flexibility in viewing media content, consumers have recently expressed significant interest in creating “clips” of media content that can be shared with others. Such clips may include relatively short excerpts of viewed media content in a digital or other format that may be distributed via the Internet or another channel; a number of Internet services for uploading and sharing media clips have become very popular in recent years.


As media streaming, clipping, placeshifting and other forms of media viewing continue to evolve, a need has emerged for an interface that allows consumers to view multiple media files in a convenient and intuitive manner. These and other desirable features and characteristics will become apparent from the subsequent detailed description and the appended claims, taken in conjunction with the accompanying drawings and this background section.


BRIEF DESCRIPTION

According to various exemplary embodiments, systems and methods are described for presenting a plurality of media clips to a viewer using a media player device. In a first embodiment, a first media clip is displayed on the media player, and an indicator corresponding to a second media clip is provided during playback of the first media clip. In response to the viewer selecting the indicator, playback of the first media clip is suspended, and information is stored about the first media clip on the media player device. The second media clip can then be subsequently displayed on the media player device. After displaying the second media clip, the stored information can be retrieved, and playback of the first media clip can resume from the point at which playback was previously interrupted.


Other embodiments provide a system for presenting a plurality of media clips to a viewer. The system comprises means for displaying a first one of the plurality of media clips and for providing an indicator corresponding to a second one of the plurality of media clips on the media player device during playback of the first one of the plurality of media clips. A means for receiving a viewer input corresponding to the indicator from the viewer is also provided. The system additionally comprises a means for processing playback of the plurality of media clips, the processing means comprising means for suspending playback of the first one of the plurality of media clips and for subsequently directing the display of the second one of the plurality of media clips on the media player device in response to the viewer input corresponding to the indicator. Various embodiments also comprise a means for storing information about the first one of the plurality of media clips on the media player device while the second one of the plurality of media clips is being displayed.


In still other embodiments, a device for presenting a plurality of media clips to a viewer is provided. The device suitably comprises a display, a digital storage medium, a user interface configured to accept inputs from the viewer, and a processor. The processor can be configured to present a first one of the plurality of media clips to the viewer on the display and to provide an indicator corresponding to a second one of the plurality of media clips on the media player device during playback of the first one of the plurality of media clips, and, in response to the inputs from the viewer indicating a selection of the indicator, to suspend playback of the first one of the plurality of media clips, to store information about the first one of the plurality of media clips on the digital storage medium and to subsequently present the second one of the plurality of media clips on the display.


Various other embodiments, aspects and other features are described in more detail below.





BRIEF DESCRIPTION OF THE DRAWING FIGURES

Exemplary embodiments will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements, and



FIG. 1 is a block diagram of an exemplary media player;



FIG. 2 is a block diagram of an exemplary metadata structure; and



FIG. 3 is a flowchart of an exemplary method for presenting multiple interlinked media clips.





DETAILED DESCRIPTION

The following detailed description is merely exemplary in nature and is not intended to limit the invention or the application and uses of the invention. Furthermore, there is no intention to be bound by any theory presented in the preceding background or the following detailed description.


According to various embodiments, the viewer experience is improved by providing media clips that are interconnected by a hyperlink or the like so that the viewer is able to readily navigate between the various clips. In some embodiments, an indicator associated with a second clip is presented while the viewer is watching a first clip. When the viewer clicks or otherwise activates the indicator, the viewing context switches from the first clip to the second clip. That is, viewing of the first clip can be temporarily suspended while the second clip is viewed. Information about the first clip is stored while the second clip is played, therefore allowing convenient restoration of the first clip after the second clip is finished, or after the viewer navigates back to the first clip from the second. This allows the viewer to “browse” through video clips in a manner similar to browsing of web pages or the like, thereby providing a convenient and intuitive interface to the viewer.


With reference now to FIG. 1, media player 100 is any device, system or module capable of playing media clips received from any source. Clips may be received from any source, such as a local storage medium 127 and/or a server 108 located across a digital network 110. Clips are rendered on a display 132 using various hardware and software features as appropriate. As noted above, multiple clips may be inter-linked in any manner to allow browsing between clips through navigation of links associated with each clip.


In various embodiments, media player 100 is implemented with any sort of conventional computer system or similar workstation, such as any sort of desktop, laptop or other personal computer or general-purpose computing system. In other embodiments, media player 100 is a set-top box (STB) or other receiver device capable of receiving television or other media signals via any sort of broadcast, cable, satellite or other medium. In still other embodiments, media player 100 may be a portable self-contained computing device such as any sort of wireless phone, personal digital assistant (PDA), network client and/or the like. Alternatively, media player 100 is a logical application or other module implemented in software or firmware that can be executed on any sort of processing hardware, including any sort of web or other network client. Media player 100 as described herein therefore represents any device, logic or other system capable of receiving media clips from any local or remote source and of playing media clips for one or more viewers.


“Clips”, as used herein, refers to any audio, video, audio/video or other media content received from any source, and in any format. In various embodiments, “clips” are simply streaming or file-based digital media content in any format that can be retrieved from server 108 or another source via a digital communications network 110 or the like. In embodiments wherein media player 100 is a television receiver such as a STB, for example, media clips may be received directly from a satellite, cable, broadcast or other source. In such embodiments, media clips may be a received television signal or the like that is received and processed locally on media player 100. Media clips may also represent a stream obtained from a DVD or other portable medium, and/or a media file stored at media player 100 in any format.


In some embodiments, media player 100 is able to create clips from any source. Sources of clipped content could include any digital or analog programming stream, including any stream received from a broadcast source (e.g., a satellite, terrestrial broadcast and/or cable source), digital video recorder, placeshifting device, DVD or other portable media, network source, and/or the like. Clip creation may be provided by media player application 130 or by a separate editing application. In such embodiments, received media streams can be converted into shorter digital clips that can be shared with other viewers, e.g., using network 110. Clip creation features need not be provided in all embodiments, however.


In the exemplary embodiment shown in FIG. 1, media player 100 includes a processor 122, memory 124 and input/output features 126 commonly associated with any conventional computing platform. Processor 122, for example, may be any sort of microprocessor, microcontroller, digital signal processor, programmable array or other logic capable of executing instructions and processing data to implement the various features of the media player device. Memory 124 includes any sort of RAM, ROM, flash and/or other memory capable of storing instructions and data that can be processed by processor 122 or other processing logic as appropriate. Input/output 126 typically includes any conventional interfaces to input devices (e.g., keyboard, mouse, trackball, joystick, directional pad, touchpad/touchscreen, wireless or other remote control, and/or other input devices as appropriate), as well as any conventional interfaces to output devices such as a display 132 or the like. Input/output 126 typically also includes interfaces to any sort of mass storage 127 (e.g., a magnetic or optical disk) and/or to a communications network 110. Network interfaces used in various embodiments might include any sort of wired (e.g., IEEE 802.3 “ETHERNET”) or wireless (e.g., IEEE 802.11 “Wi-Fi”) interfaces, including any sort of interfaces to telephone networks.


Any of the various features of media player 100 may be implemented with any sort of general or special purpose hardware, software and/or firmware, as appropriate. In some embodiments (e.g., embodiments wherein media player 100 is implemented as a STB or other media receiver), processor 122, memory 124 and/or input/output 126 may be implemented as a “system on a chip” (SoC) using any suitable processing circuitry under control of any appropriate control logic. In various embodiments, such control logic may execute within an integrated SoC or other processor to implement a media receiver, decoder, display processor and/or any other features as appropriate. The Broadcom Corporation of Irvine, Calif., for example, produces several models of processors (e.g., the model BCM 7400 family of processors) that are capable of supporting SoC implementations of satellite and/or cable receiver systems, although products from any number of other suppliers could be equivalently used. In still other embodiments, various distinct chips, circuits or components may be inter-connected and may inter-relate with each other to implement the various functions and features described herein. In still other embodiments, processor 122 and/or any other features may be implemented with an application specific integrated circuit (ASIC) or the like.


To that end, operations of media player 100 may be controlled by any sort of general purpose or other operating system 128. Operating system 128 typically implements user interface features and also allows programs (e.g., media player application 130) to use the various hardware and other resources of media player 100. Examples of operating systems that could be used in various embodiments include any of the well-known operating systems conventionally used in personal computing (e.g., any version of WINDOWS, MacOS/OSX, LINUX OS, etc.) or mobile computing (e.g., any version of BLACKBERRY, ANDROID, WINDOWS MOBILE, SYMBIAN, iPHONE and/or any other operating system). The particular examples of operating systems are not intended to be limiting; indeed, other embodiments could be based upon other operating systems, including any sort of proprietary operating system, and equivalent embodiments could be based upon any sort of programming or other abstraction environment (e.g., JAVA, .NET, and/or the like) in place of or in addition to a conventional operating system 128.


Media player application 130 is any program, application, applet, browser plugin, software module and/or other logic capable of decoding one or more clips for playback on display 132 or the like. Media player application 130 may be implemented in any programming language, and may be stored in source or object code format in any storage medium, including memory 124 and/or any sort of disk or other mass storage available to media player 100. In an exemplary implementation, media player application 130 is a software program that is stored in object code form on a disk or similar medium until being activated by the user. The program 130 is then partially or wholly duplicated into memory 124 to facilitate execution of the object code instructions by processor 122.


Display 132 is any sort of television, monitor or other display capable of presenting imagery to the viewer. In various embodiments, display 132 is a conventional television or computer monitor associated with media player 100, including any sort of integrated or external display 132 that might be associated with a computer, wireless phone, PDA or the like. In other embodiments, display 132 is a conventional analog or digital television that may be coupled to a STB or other receiver, as described above. Display 132 need not be physically present at the same location as media player 100 in all embodiments; to the contrary, content may be provided from media player 100 to display 132 via any sort of cabling, network (e.g., network 110) or the like.


Network 110 is any digital or other communications network capable of transmitting messages between senders and receivers. In various embodiments, network 110 may encompass one or more wide area networks, local area networks, and/or any combination of wide and local area networks, including any sort of wireless or other telephone networks. Network 110 can include any number of public or private data connections, links or networks supporting any number of communications protocols. Network 110 may include the Internet, for example, or any other network based upon TCP/IP or other conventional protocols. In many embodiments, network 110 may also include one or more conventional local area networks, such as one or more IEEE 802.3 and/or IEEE 802.11 networks. Network 110 may also incorporate any sort of wireless telephone network, such as any sort of GSM/EDGE or CDMA/EVDO connection, any sort of 3G or subsequent telephone link, and/or the like. Network 110 as shown in FIG. 1, then, is intended to broadly encompass any digital communications network(s), systems or architectures for transmitting data between the various components of the system.


Server 108 is any conventional network server that is able to receive and/or provide one or more media clips on network 110. In various embodiments, server 108 is a media server that distributes media clips to various media players 100 over network 110. In various further embodiments, server 108 also receives media clips that are created by viewers using media players 100 or the like. In such embodiments, viewers are able to create custom-made clips from broadcast or other content and to share their clips with friends or others. Clips may be provided from server 108 in any sort of file-based and/or streaming format, such as any sort of formatting based upon WINDOWS MEDIA, QUICKTIME, REAL MEDIA, FLASH, SHOCKWAVE and/or other products.


The viewer is able to interact with media player application 130 using the interface features of display 132 in any manner. In various embodiments, the viewer is able to select the buttons, icons, sliders and/or other interface features presented on display 132 using conventional keyboard, mouse, and/or other interface hardware that may be available. Other embodiments may provide a wireless remote control or the like, or any other interface feature supported by media player 100. In an exemplary embodiment, the viewer is able to select and manipulate icons, sliders, buttons and/or other interface elements on display 132 through movement and selection of a cursor or other conventional interface feature using a mouse or other multi-dimensional input device. Such features are commonly supported by various operating systems 128, hardware drivers (e.g., input/output features 126) and/or the like.


In the exemplary embodiment shown in FIG. 1, media player 100 is able to present a clip 106 on display 132 for the viewer. The clip 106 is presented in conjunction with an appropriate user interface that incorporates buttons, icons and/or other features that allow for navigation and control of the viewer experience. For example, the exemplary interface shown in FIG. 1 includes a Play button 116 that allows the viewer to play or pause the presentation of the media clip 106, as well as a time shift buffer indicator 113 that displays the current playback position within the clip. A slider 114 is also provided in this example that allows the viewer to move within the time shift buffer to change the playback position of the clip. That is, the slider 114 moves with respect to the time shift buffer indicator 113 to “fast forward” or “rewind” the playback of the media clip 106. Other embodiments may equivalently provide buttons, icons or other features in place of slider 114 or buffer indicator 113, or any of the other particular imagery and interface elements shown in FIG. 1. Further, various embodiments may provide additional features not shown in FIG. 1. Some embodiments may allow viewers to record “mark points” or the like, for example, on or near the time shift buffer indicator 113 to provide a convenient reference for later re-viewing of one or more scenes. Still other embodiments may allow editing of the clip, or “sub-clipping” (e.g., creation of new clips from the existing clip). For example, start and end points of a sub-clip may be defined with respect to the time shift buffer or the like.


While a first clip 106 is playing, indicators 102 for other clips may be presented in any manner. In various embodiments, indicators 102 are simply icons, buttons or other interface features that can be clicked or otherwise actuated to present a different clip to the viewer. This second clip may be interlinked to the first clip in any manner, as described more fully below. When the viewer selects indicator 102, the viewing experience changes to that of the second clip. Information about the first clip can be stored (e.g., in memory 124 and/or storage 127) for subsequent retrieval, and the second clip is played for the viewer. After viewing of the second clip is complete (e.g., the clip is finished, or the viewer indicates a desire to stop viewing the clip), viewing of the first clip can be restored, as described more fully below.


To that end, contextual information about the first clip can be stored before (or while) the subsequent clip is presented, thereby allowing the viewer to restore the original context at a later time. Contextual information may include some or all of the contents of the time shift buffer (including any mark points, sub-clip start/end points, editing information and/or the like). This information may be stored in memory 124, storage 127 or another suitable location so that the viewer is able to return to the same viewing context that was displayed before navigating to the subsequent clip.


By storing context information about clips as subsequent clips are selected, navigation or “browsing” between clips can be facilitated. To that end, display 132 as shown in FIG. 1 includes two browsing indicators 105, 107 that allow reverse or forward navigation, respectively, between clips. By selecting indicator 105, for example, the viewer is able to revert to the previously-viewed clip. After reverting to a previously-viewed clip, indicator 107 can be selected to move “forward” in the linked chain of video clips, as appropriate. Any number of clip contexts may be stored in any sort of linear or other chain to allow backward and/or forward browsing through the various clip contexts, as desired.
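The back/forward browsing behavior described above parallels a web browser's session history. The following Python sketch illustrates one way such a chain of clip contexts might be managed; the class, its methods and field names are hypothetical illustrations, not part of the patent disclosure:

```python
class ClipHistory:
    """Hypothetical back/forward chain of clip viewing contexts."""

    def __init__(self):
        self._back = []      # contexts behind the current clip
        self._forward = []   # contexts ahead, populated by "back" steps
        self.current = None  # context of the clip now playing

    def navigate_to(self, context):
        """Viewer selects an indicator 102: store the current context
        and switch to the linked clip."""
        if self.current is not None:
            self._back.append(self.current)
        self._forward.clear()  # following a new link invalidates the forward chain
        self.current = context

    def back(self):
        """Indicator 105: revert to the previously-viewed clip."""
        if self._back:
            self._forward.append(self.current)
            self.current = self._back.pop()
        return self.current

    def forward(self):
        """Indicator 107: move forward again in the linked chain."""
        if self._forward:
            self._back.append(self.current)
            self.current = self._forward.pop()
        return self.current
```

As with a web browser, following a new link clears the forward chain; a bounded container could be substituted for the plain lists to cap the number of stored contexts.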


Indicators 102 associated with other clips may be presented in any manner. In various embodiments, indicators 102 are simply presented within the imagery shown on display 132 to allow for selection of a subsequent clip. Indicator 102B, for example, shows an icon or other indicator that can be selected to bring up a second clip at any time during viewing of the present clip. Clips may be selected from indicators 102C that overlie or are otherwise located within the displayed imagery of the current clip 106 in some embodiments. To that end, indicators 102A-C may be presented in any location and at any time during viewing of clip 106, as appropriate.


In the example of FIG. 1, indicator 102A is shown proximate to the time shift buffer indicator 113 to indicate a clip that may correspond to the particular content of the presently-viewed clip at the time indicated in the time shift buffer. If a particular product is being shown or discussed during a portion of the present clip, for example, indicator 102A may provide a link to a second clip that provides more information about that product for that portion of the clip. Indicator 102A may therefore be presented when the playback of the first clip (as indicated by the position of slider 114) is within the portion of the time shift buffer 113 that corresponds to the particular time that the content of the second clip is most relevant. Indicators corresponding to other clips may be presented at other times, and/or no indicators to other clips may be provided at certain times during playback of the first clip, as desired.


Indicators 102A-C that allow the viewer to quickly access other media clips may therefore be provided in any manner. As a first clip is presented to the viewer, the indicators 102A-C may be presented at times and locations on display 132 that are convenient and logical for providing access to additional clips. Upon selection of one or more indicators 102A-C, the subsequent clip is presented, and information about the first clip (e.g., the contents of the time shift buffer, an indication of the point in the clip when viewing was suspended, and/or other information) is stored for subsequent viewing of the first clip.


Clips may be inter-linked in any manner. In some embodiments, linking information about subsequent clips is provided in metadata associated with the first clip to facilitate a clip browsing experience that is both familiar and convenient to the viewer. Metadata associated with the first clip can contain information 200 that identifies subsequent clips, and provides sufficient information to media player 100 to allow displaying and selection of the clips in a manner intended by the clip creator.



FIG. 2 shows an exemplary structure for linking clips to each other. In various embodiments, information 200 is stored in metadata of the first clip that allows for linking to the second clip. Such information 200 may include such features as a time 202 that the indicator 102 is presented, an identification 204 of the indicator 102, an identification 206 of the second clip itself, and/or any access restrictions 208 placed on the subsequent clip. By providing this information with the first clip, a second clip can be conveniently linked to the first clip to allow for sequential or other viewing of the related clips.
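One possible in-memory representation of linking information 200 is sketched below. The field names and types are illustrative assumptions; the patent does not prescribe any particular encoding:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ClipLink:
    """Hypothetical representation of linking information 200."""
    active_times: list   # time 202: (start, end) ranges, in seconds, when indicator 102 is active
    indicator: str       # identification 204: e.g., an image file or URL for indicator 102
    clip_address: str    # identification 206: URL or other address of the second clip
    restrictions: Optional[dict] = None  # access restrictions 208, if any

# Example: link a product-information clip to seconds 30-45 of the first clip.
link = ClipLink(
    active_times=[(30.0, 45.0)],
    indicator="product_icon.png",
    clip_address="http://example.com/clips/product-info",
    restrictions={"groups": ["friends"]},
)
```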


Time information 202 is any indication of a time in the first clip that the indicator for the second clip may be active. Information 202 may include a start time, an end time, or both starting and ending times to define those times during playback that the indicator 102 for the clip is active. Multiple times or time ranges may be defined in some embodiments, thereby allowing the indicator 102 to be displayed at multiple points during playback of the first clip. Timing information may not be present in all embodiments; in such cases, indicator 102 may be presented throughout the duration of the first clip, or other default-type actions may be taken.
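A media player might evaluate time information 202 as follows. This is a minimal sketch assuming time ranges expressed in seconds of playback; the function itself is an illustration, not part of the patent:

```python
def indicator_active(active_times, position):
    """Return True when indicator 102 should be shown at the given playback
    position (in seconds). Per the default behavior described above, an
    empty list of time ranges means the indicator is shown throughout
    the clip's duration."""
    if not active_times:
        return True
    return any(start <= position <= end for start, end in active_times)
```

For a link defined over seconds 30-45, for example, the indicator would be active at position 40.0 but not at position 50.0.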


Identification 204 of the indicator 102 may be provided in any manner. In various embodiments, identification 204 includes data that describes the icon or other graphical imagery for indicator 102. This information may be defined in any manner. In various embodiments, indicator 102 is simply provided as a JPEG, TIFF, PNG or other image. Other embodiments may provide a uniform resource locator (URL) or other address/link to additional information. In still other embodiments, a reference to a video frame or other source of imagery for indicator 102 may be provided. Alternatively, textual or numerical information may be provided that allows the media player 100 to generate a suitable indicator 102.


Clip identification 206 is any information capable of identifying the second clip. Such information may include a URL or other address of the clip (e.g., a location on network 110 where the clip may be obtained), or may include the clip itself in some embodiments.


Some clips may be associated with access restrictions 208 and/or other limits upon viewing. Clips may be restricted based upon time of day, dates of availability, and/or other factors as appropriate. In some embodiments, certain clips can be access restricted to allow for distribution only to limited groups (e.g., members of a “friends” group or to viewers who have entered appropriate userid/password pairs or other credentials). Any information 208 relating to viewing restrictions, access controls or the like can be incorporated into metadata 200 for processing by media player 100 as appropriate.
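An access check over restrictions 208 might look like the following sketch. The restriction keys (an availability window and a list of allowed groups) are assumptions chosen to mirror the examples above, not a format defined by the patent:

```python
from datetime import datetime

def clip_viewable(restrictions, viewer_groups, now=None):
    """Hypothetical evaluation of access restrictions 208 before a
    linked clip is presented. An absent or empty restriction set
    means the clip is unrestricted."""
    if not restrictions:
        return True
    now = now or datetime.now()
    start = restrictions.get("available_from")
    end = restrictions.get("available_until")
    if start and now < start:
        return False          # clip not yet available
    if end and now > end:
        return False          # clip no longer available
    allowed = restrictions.get("groups")
    if allowed and not (set(allowed) & set(viewer_groups)):
        return False          # viewer is not in any permitted group
    return True
```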


The information provided in metadata 200 may be created and formatted in any manner. In various embodiments, a clip author is able to provide information within structure 200 using an authoring tool, or any sort of manual entry, or any combination of the two. In various embodiments, media player application 130 includes the ability to create media clips from received media streams, as described above; this functionality can be enhanced to allow the inter-linking of additional clips as described herein. A clip creation tool, for example, may include options to add references to additional clips that can be linked from the created clip. Such tools may simply request a URL or other address of the linked clip, as appropriate, or may provide any sort of “drag and drop” or other convenient graphical mechanism for inter-linking multiple clips. Start and end times for presenting the indicator 102 (as well as any other parameters) may be defined manually, graphically or otherwise. Start and end times could be selected from points on the time shift buffer indicator 113 associated with the primary clip, for example. Any additional information (e.g., relating to a display location for the indicator 102) could also be provided as desired.


Turning now to FIG. 3, an exemplary process 300 that may be executed at media player 100 suitably involves the broad steps of displaying a first media clip on the media player device (function 302), providing an indicator 102 corresponding to a second media clip during playback of the first media clip (function 306), and in response to the viewer selecting the indicator (function 308), suspending playback of the first media clip and storing information about the first media clip on the media player device (function 310). The second media clip can be subsequently displayed on the media player device (function 312) as appropriate.
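Functions 308 through 312, together with the later restoration of function 318, can be sketched as simple state transitions. The state layout and function names below are hypothetical; they merely illustrate the suspend-store-resume behavior described in the text:

```python
def select_indicator(state, linked_clip):
    """Functions 308-310: suspend the current clip and store its context,
    including the point at which playback was interrupted, then make the
    linked clip current (function 312)."""
    state["stored"].append({
        "clip": state["current"],
        "position": state["position"],  # where playback left off
    })
    state["current"] = linked_clip
    state["position"] = 0.0             # the second clip starts from its beginning

def restore_previous(state):
    """Function 318: retrieve the stored context and resume the earlier
    clip from the same point that playback was previously suspended."""
    ctx = state["stored"].pop()
    state["current"] = ctx["clip"]
    state["position"] = ctx["position"]
```

A fuller implementation would also store time shift buffer contents, mark points and viewer settings in the stored context, as discussed above.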


Generally speaking, each of the method steps shown in FIG. 3 may be implemented in software or firmware that may be stored in memory, mass storage or any other storage medium available to the executing device, and that may be executed on any processor or control circuitry associated with the executing device. For example, the various functions shown in FIG. 3 may be implemented in software or firmware that can be stored in memory 124 and/or storage medium 127 and executed by processor 122. The particular logic shown in FIG. 3 may be modified or enhanced in any manner, and any other components, systems, logic or devices may be involved in various other embodiments as appropriate.


Media clips may be displayed in any manner (function 302). As noted above, clips may be rendered by media player application 130 and presented on display 132 associated with media player 100 in any manner. Displaying may be initiated in response to inputs (e.g., from “play” button 116) provided by the viewer using a graphical or other user interface. The media clip is then rendered (e.g., by processor 122 operating in conjunction with decoder software in media player application 130 or the like) and displayed as appropriate.


The displayed media clip may have any number of associated clips that are interlinked through metadata or the like. If links are present (function 304), indicators 102 for the linked clips can be presented (function 306) on display 132 while the clip is being displayed. As noted above, metadata 200 associated with the displayed clip can describe appropriate times during playback that the indicator 102 is active, and/or any other parameters as desired. That is, indicator 102 may be active during particular times during playback or the like, and may also be activated/deactivated in response to access controls, temporal constraints, available hardware or software, or other parameters as desired. These parameters may be defined within metadata 200 that accompanies the primary clip.


When the viewer selects an indicator 102 (thereby indicating a desire to view a secondary media clip that is associated with the indicator 102), the linked clip is processed as appropriate (function 310). In various embodiments, playback of the primary media clip is paused or otherwise suspended, and some or all of the contents of the time shift buffer associated with the primary clip are stored (e.g., in memory 124 and/or storage 127) for later retrieval. In other embodiments, a time marker that indicates a time that the playback was interrupted may be stored to facilitate later playback of the first clip from the same point that playback was previously left off. Other embodiments may additionally store metadata, viewer settings and/or other contextual information associated with the viewing of the first media clip.


Information for multiple clips can be stored for later retrieval (e.g., using the back button 105 and/or forward button 107 described above). Indeed, any number of additional clip contexts can be stored, so long as the amount of available memory or other storage space is not exceeded. To that end, some embodiments may impose a limit on the number of stored contexts to prevent excessive memory or other storage consumption.
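A bounded stack of contexts, as described above, could be sketched as follows. The limit of eight entries is an arbitrary assumption for illustration.

```python
MAX_CONTEXTS = 8  # assumed cap on stored contexts; any limit could be used

class ContextStack:
    """Bounded last-in, first-out store of suspended viewing contexts."""

    def __init__(self, limit=MAX_CONTEXTS):
        self._stack = []
        self._limit = limit

    def push(self, ctx):
        """Store a context, evicting the oldest if the limit is reached."""
        if len(self._stack) >= self._limit:
            self._stack.pop(0)   # drop the oldest context to cap storage use
        self._stack.append(ctx)

    def pop(self):
        """Retrieve the most recently stored context (the 'back' action)."""
        return self._stack.pop() if self._stack else None
```

Pressing back button 105 would pop the most recent context; a symmetrical forward stack could support forward button 107.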


After storing information associated with the primary media clip, the secondary clip associated with the selected indicator 102 can be presented on display 132 (function 312). At this point, the second clip can effectively function as the primary clip described above, with additional links/indicators 102 processed as appropriate to facilitate a multi-link chain of video clips. That is, the viewer may be presented with additional indicators 102 for additional clips that have not yet been presented based upon metadata 200 associated with the second clip.


Further, the viewer may be able to navigate back to the original clip by selecting an appropriate icon, button or other indicator 102 on display 132 (function 314). The indicator may simply be the “back” button 105 described above, or may be another indicator 102 presented elsewhere on display 132 as desired.


If the second clip terminates without the viewer selecting a link to a third clip, then any appropriate default action may be taken (function 316). In an exemplary embodiment, the media player application 130 simply waits for additional instruction before proceeding. In other embodiments, the viewing context for the previous clip can be restored, or any other action can be taken as desired.


Viewing of the original clip can be restored in any manner (function 318). In various embodiments, the time shift buffer or other information previously stored (e.g., in function 310) can be retrieved from memory 124, storage 127 or any other location, and the viewing context for the original clip can continue. If the point at which viewing was previously suspended was recorded, playback of the original clip can continue at that same point, as desired. Other information from the previous viewing context (mark points, viewer settings, preferences, etc.) may also be retrieved to preserve the original viewing context as accurately as possible.
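Restoration (function 318) is the inverse of the suspend step. A minimal sketch, assuming the context is stored as a simple mapping and the player exposes illustrative `position`, `time_shift_buffer`, and `settings` attributes:

```python
def restore_playback(player, ctx):
    """Reapply a stored viewing context so playback resumes where it stopped."""
    player.clip_id = ctx["clip_id"]
    player.position = ctx["resume_time"]            # resume at the suspension point
    if ctx.get("buffer") is not None:
        player.time_shift_buffer = ctx["buffer"]    # restores trick-play (FF/REW) support
    player.settings.update(ctx.get("settings", {})) # reapply viewer preferences
    player.paused = False                           # continue playback
```

Restoring the buffer contents, rather than only the time marker, preserves fast-forward and rewind capability over previously buffered material.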


The basic functionality described in FIG. 3 can be supplemented in any number of ways. The particular clips that are linked to each clip, for example, may be manually or automatically selected in any manner. Automatic selection may take place, for instance, in response to keywords entered by any number of authors, editors and/or viewers, thereby allowing clips with common keywords or other descriptors to be linked together. Clips can be associated based upon commercial relationships or other factors (e.g., an advertisement clip can be linked to a primary clip to allow for selective viewing of the advertisement). Still further, clips can be associated with each other based upon the contents of the clips, as noted above. If an actor in a primary clip is visiting a museum or other attraction, for example, a secondary clip providing more information about the attraction can be linked to the primary clip, with the indicator 102 associated with the secondary clip being visible during the portions of the primary clip where the attraction is discussed or described. Similar features could be used for product placement or advertisements, or simply to associate the content of the current clip with linked clips having common content.
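Keyword-based automatic linking, as described above, could be sketched along these lines. The flat keyword-dictionary data model is an assumption for illustration; any descriptor scheme could be substituted.

```python
def link_by_keywords(clips):
    """Given clip_id -> keyword list, return clip_id -> list of clip_ids
    that share at least one keyword (candidate links)."""
    links = {clip_id: [] for clip_id in clips}
    for clip_id, keywords in clips.items():
        for other_id, other_keywords in clips.items():
            if other_id != clip_id and set(keywords) & set(other_keywords):
                links[clip_id].append(other_id)
    return links
```

An authoring tool could run such a pass over a clip library and emit the resulting associations into each clip's metadata 200.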


Again, the concepts described herein could be readily and equivalently applied to any type of video, audio or other media files, including files provided in any sort of streaming or file based format, as well as terrestrial or satellite broadcasts, cable transmissions, signals obtained from DVD or other recorded media, and/or the like.


As used herein, the word “exemplary” means “serving as an example, instance, or illustration”. “Exemplary” embodiments are not intended as models to be literally duplicated, but rather as examples that provide instances of embodiments that may be modified or altered in any way to create other embodiments. Any implementation described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other implementations.


While the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing various embodiments of the invention, it should be appreciated that the particular embodiments described above are only examples, and are not intended to limit the scope, applicability, or configuration of the invention in any way. Various changes may be made in the function and arrangement of elements described without departing from the scope of the invention and its legal equivalents.

Claims
  • 1. A method for browsing between a plurality of media clips each comprising motion video presented to a viewer using a media player device, wherein the plurality of media clips comprises a first media clip and a second media clip, the method comprising: displaying the first media clip in a first viewing context on the media player device, wherein the first media clip is stored in a first time shift buffer; during playback of the first one of the plurality of media clips from the first time shift buffer, providing an indicator corresponding to a second one of the plurality of media clips on the media player device; in response to the viewer selecting the indicator, suspending the playback of the first one of the plurality of media clips from the first time shift buffer, storing context information describing the first viewing context on the media player device, and subsequently replacing the first viewing context with a second viewing context that plays back the second media clip on the media player device from a second time shift buffer that is separate from the first time shift buffer; and in response to a subsequent input from the viewer during the playback of the second media clip in the second viewing context, retrieving the stored context information describing the first viewing context, restoring the first time shift buffer containing the first media clip on the media player device from the stored context information, and continuing playback of the first one of the plurality of media clips in the first viewing context on the media player device from the first time shift buffer to thereby allow the viewer to sequentially browse between playback of the first media clip from the first time shift buffer and playback of the second media clip from the second time shift buffer on the media player device.
  • 2. The method of claim 1 wherein the context information about the playback state of the first media clip comprises the contents of the first time shift buffer and wherein the first time shift buffer is used to support fast forward and rewind operations during playback of the first media clip, and wherein the restoring comprises restoring the contents of the first time shift buffer after playback of the second media clip to thereby restore the fast forward and rewind operations during the continuing playback of the first media clip.
  • 3. The method of claim 1 wherein the stored context information about the first one of the plurality of media clips comprises a time indicator corresponding to the point in the first one of the plurality of media clips when playback from the first time shift buffer was suspended, and wherein the continuing playback of the first media clip is restored from the point in the first time shift buffer when playback was previously suspended.
  • 4. The method of claim 1 wherein the restoring comprises restoring the contents of the first time shift buffer after playback of the second media clip to thereby restore the fast forward and rewind operations during the continuing playback of the first media clip.
  • 5. The method of claim 1 wherein the indicator is provided only during a portion of the playback of the first one of the plurality of media clips.
  • 6. The method of claim 5 wherein the second one of the plurality of media clips is selected based upon the content of the first one of the plurality of media clips during the portion of the playback that the indicator is provided.
  • 7. The method of claim 1 further comprising displaying time shift imagery on the media player device that corresponds to the first time shift buffer that is navigable by the viewer to play a portion of the first one of the plurality of media clips.
  • 8. The method of claim 7 wherein the indicator is provided proximate the time shift imagery when content of the first one of the plurality of media clips corresponds to the content of the second one of the plurality of media clips.
  • 9. The method of claim 7 wherein the information stored about the first one of the plurality of media clips comprises the contents of the first time shift buffer.
  • 10. The method of claim 9 wherein the restoring comprises restoring the contents of the first time shift buffer to thereby resume playing of the first one of the plurality of media clips.
  • 11. The method of claim 10 wherein the subsequent input from the viewer comprises a viewer selection of a second indicator displayed with the second one of the plurality of media clips.
  • 12. The method of claim 1 further comprising processing metadata associated with the first one of the plurality of media clips, wherein the metadata comprises identification information about the second one of the plurality of media clips.
  • 13. The method of claim 12 wherein the identification information comprises the indicator.
  • 14. The method of claim 12 wherein the identification information comprises timing information that identifies a portion of the first one of the plurality of media clips when the indicator is visible to the viewer.
  • 15. The method of claim 12 wherein the identification information comprises the indicator, timing information that identifies a portion of the first one of the plurality of media clips when the indicator is visible to the viewer, and linking information identifying a location of the second one of the plurality of media clips.
  • 16. A system for presenting a plurality of media clips comprising moving video to a viewer, the system comprising: means for displaying a first one of the plurality of media clips and for providing an indicator corresponding to a second one of the plurality of media clips on the media player device during playback of the first one of the plurality of media clips from a first time shift buffer; means for receiving a viewer input corresponding to the indicator from the viewer; means for processing playback of the plurality of media clips, wherein the processing means is configured to suspend playback of the first one of the plurality of media clips in response to the viewer input corresponding to the indicator, to direct the storage of context information about a first viewing context of the first one of the plurality of media clips comprising the first time shift buffer, to subsequently direct the displaying means to replace the first viewing context of the first one of the plurality of media clips with a second viewing context displaying the second one of the plurality of media clips from a second time shift buffer on the media player device in response to the viewer input corresponding to the indicator, and, in response to subsequent input from the viewer, to retrieve the stored context information about the first one of the plurality of media clips and to restore the first viewing context of the first one of the plurality of media clips from the stored context information to thereby allow for continued playback of the first one of the plurality of media clips and to thereby allow the viewer to sequentially browse between playback of the first media clip from the first time shift buffer and playback of the second media clip from the second time shift buffer on the media player device; and means for storing the context information about the first one of the plurality of media clips on the media player device while the second one of the plurality of media clips is being displayed.
  • 17. The system of claim 16 wherein the context information about the first viewing context of the first media clip comprises a time indicator corresponding to a point in the first one of the plurality of media clips when playback is suspended, and wherein the processing means is further configured to process the time indicator from the restored context information to thereby restore playback of the first media clip from the point when playback was previously suspended.
  • 18. The system of claim 16 wherein the context information about the first viewing context of the first media clip comprises the contents of the first time shift buffer used to support fast forward and rewind operations during playback of the first media clip, and wherein the processing means is further configured to restore the contents of the first time shift buffer after playback of the second media clip to thereby restore the fast forward and rewind operations during the continuing playback of the first media clip.
  • 19. The system of claim 18 wherein the processing means is further configured to provide the indicator proximate the time shift imagery when content of the first one of the plurality of media clips corresponds to content of the second one of the plurality of media clips.
  • 20. The system of claim 18 wherein the context information stored about the first one of the plurality of media clips comprises the contents of the first time shift buffer.
  • 21. A device for presenting a plurality of media clips each comprising moving video to a viewer, the device comprising: a display; a digital storage medium; a user interface configured to accept inputs from the viewer; and a processor configured to present a first one of the plurality of media clips to the viewer on the display in a first viewing context comprising a first time shift buffer and to provide an indicator corresponding to a second one of the plurality of media clips on the media player device during playback of the first one of the plurality of media clips, and, in response to the inputs from the viewer indicating a selection of the indicator, to suspend playback of the first one of the plurality of media clips from the first time shift buffer in the first viewing context, to store information about the first viewing context on the digital storage medium and to subsequently present the second one of the plurality of media clips on the display from a second time shift buffer in a second viewing context that replaces the first viewing context, and, in response to subsequent input from the viewer, to retrieve the stored context information about the first viewing context and to restore the first viewing context from the stored context information to thereby allow for continued playback of the first one of the plurality of media clips from the first time shift buffer and to thereby allow the viewer to sequentially browse between playback of the first media clip from the first time shift buffer and playback of the second media clip from the second time shift buffer on the media player device.
  • 22. The device of claim 21 wherein the first viewing context comprises a time indicator corresponding to a point in the first one of the plurality of media clips when playback is suspended, and wherein the processor is further configured to process the time indicator from the restored context information to thereby restore playback of the first media clip from the point when playback was previously suspended.
  • 23. The device of claim 21 wherein the processor is further configured to direct the display of time shift imagery on the display, wherein the time shift imagery corresponds to the first time shift buffer in the digital storage medium that is navigable by the viewer to play a portion of the first one of the plurality of media clips.
  • 24. The device of claim 23 wherein the processor is further configured to provide the indicator proximate the time shift imagery when content of the first one of the plurality of media clips corresponds to content of the second one of the plurality of media clips.
  • 25. The device of claim 23 wherein the information stored about the first viewing context comprises the contents of the first time shift buffer.
  • 26. The device of claim 25 wherein the processor is further configured to restore the contents of the first time shift buffer to thereby resume playing of the first one of the plurality of media clips.
  • 27. The device of claim 21 wherein the processor is further configured to process metadata associated with the first one of the plurality of media clips, wherein the metadata comprises identification information about the second one of the plurality of media clips.
  • 28. The device of claim 27 wherein the identification information comprises the indicator, timing information that identifies a portion of the first one of the plurality of media clips when the indicator is visible to the viewer, and linking information identifying a location of the second one of the plurality of media clips.
US Referenced Citations (268)
Number Name Date Kind
3416043 Jorgensen Dec 1968 A
4254303 Takizawa Mar 1981 A
5161021 Tsai Nov 1992 A
5237648 Mills et al. Aug 1993 A
5386493 Degen et al. Jan 1995 A
5434590 Dinwiddie, Jr. et al. Jul 1995 A
5493638 Hooper et al. Feb 1996 A
5602589 Vishwanath et al. Feb 1997 A
5661516 Carles Aug 1997 A
5666426 Helms Sep 1997 A
5682195 Hendricks et al. Oct 1997 A
5706290 Shaw et al. Jan 1998 A
5708961 Hylton et al. Jan 1998 A
5710605 Nelson Jan 1998 A
5722041 Freadman Feb 1998 A
5757416 Birch et al. May 1998 A
5774170 Hite et al. Jun 1998 A
5778077 Davidson Jul 1998 A
5794116 Matsuda et al. Aug 1998 A
5822537 Katseff et al. Oct 1998 A
5831664 Wharton et al. Nov 1998 A
5850482 Meany et al. Dec 1998 A
5852437 Wugofski et al. Dec 1998 A
5880721 Yen Mar 1999 A
5898679 Brederveld et al. Apr 1999 A
5909518 Chui Jun 1999 A
5911582 Redford et al. Jun 1999 A
5922072 Hutchinson et al. Jul 1999 A
5929849 Kikinis Jul 1999 A
5936968 Lyons Aug 1999 A
5968132 Tokunaga Oct 1999 A
5987501 Hamilton et al. Nov 1999 A
6002450 Darbee et al. Dec 1999 A
6008777 Yiu Dec 1999 A
6014694 Aharoni et al. Jan 2000 A
6020880 Naimpally Feb 2000 A
6029045 Picco et al. Feb 2000 A
6031940 Chui et al. Feb 2000 A
6036601 Heckel Mar 2000 A
6040829 Croy et al. Mar 2000 A
6043837 Driscoll, Jr. et al. Mar 2000 A
6049671 Slivka et al. Apr 2000 A
6075906 Fenwick et al. Jun 2000 A
6088777 Sorber Jul 2000 A
6097441 Allport Aug 2000 A
6104334 Allport Aug 2000 A
6108041 Faroudja et al. Aug 2000 A
6115420 Wang Sep 2000 A
6117126 Appelbaum et al. Sep 2000 A
6141059 Boyce et al. Oct 2000 A
6141447 Linzer et al. Oct 2000 A
6144375 Jain et al. Nov 2000 A
6160544 Hayashi et al. Dec 2000 A
6201536 Hendricks et al. Mar 2001 B1
6212282 Mershon Apr 2001 B1
6222885 Chaddha et al. Apr 2001 B1
6223211 Hamilton et al. Apr 2001 B1
6240459 Roberts et al. May 2001 B1
6240531 Spilo et al. May 2001 B1
6240555 Shoff et al. May 2001 B1
6243596 Kikinis Jun 2001 B1
6256019 Allport Jul 2001 B1
6263503 Margulis Jul 2001 B1
6279029 Sampat et al. Aug 2001 B1
6282714 Ghori et al. Aug 2001 B1
6286142 Ehreth Sep 2001 B1
6298482 Seidman et al. Oct 2001 B1
6310886 Barton Oct 2001 B1
6340994 Margulis et al. Jan 2002 B1
6353885 Herzi et al. Mar 2002 B1
6356945 Shaw et al. Mar 2002 B1
6357021 Kitagawa et al. Mar 2002 B1
6370688 Hejna, Jr. Apr 2002 B1
6389467 Eyal May 2002 B1
6434113 Gubbi Aug 2002 B1
6442067 Chawla et al. Aug 2002 B1
6456340 Margulis Sep 2002 B1
6466623 Youn et al. Oct 2002 B1
6470378 Tracton et al. Oct 2002 B1
6476826 Plotkin et al. Nov 2002 B1
6487319 Chai Nov 2002 B1
6493874 Humpleman Dec 2002 B2
6496122 Sampsell Dec 2002 B2
6505169 Bhagavath et al. Jan 2003 B1
6510177 De Bonet et al. Jan 2003 B1
6529506 Yamamoto et al. Mar 2003 B1
6553147 Chai et al. Apr 2003 B2
6557031 Mimura et al. Apr 2003 B1
6564004 Kadono May 2003 B1
6567984 Allport May 2003 B1
6584201 Konstantinou et al. Jun 2003 B1
6584559 Huh et al. Jun 2003 B1
6597375 Yawitz Jul 2003 B1
6598159 McAlister et al. Jul 2003 B1
6600838 Chui Jul 2003 B2
6609253 Swix et al. Aug 2003 B1
6611530 Apostolopoulos Aug 2003 B1
6628716 Tan et al. Sep 2003 B1
6642939 Vallone et al. Nov 2003 B1
6647015 Malkemes et al. Nov 2003 B2
6658019 Chen et al. Dec 2003 B1
6665751 Chen et al. Dec 2003 B1
6665813 Forsman et al. Dec 2003 B1
6697356 Kretschmer et al. Feb 2004 B1
6698020 Zigmond et al. Feb 2004 B1
6701380 Schneider et al. Mar 2004 B2
6704678 Minke et al. Mar 2004 B2
6704847 Six et al. Mar 2004 B1
6708231 Kitagawa Mar 2004 B1
6718551 Swix et al. Apr 2004 B1
6754266 Bahl et al. Jun 2004 B2
6754439 Hensley et al. Jun 2004 B1
6757851 Park et al. Jun 2004 B1
6757906 Look et al. Jun 2004 B1
6766376 Price Jul 2004 B2
6768775 Wen et al. Jul 2004 B1
6771828 Malvar Aug 2004 B1
6774912 Ahmed et al. Aug 2004 B1
6781601 Cheung Aug 2004 B2
6785700 Masud et al. Aug 2004 B2
6795638 Skelley, Jr. Sep 2004 B1
6798838 Ngo Sep 2004 B1
6806909 Radha et al. Oct 2004 B1
6807308 Chui et al. Oct 2004 B2
6816194 Zhang et al. Nov 2004 B2
6816858 Coden et al. Nov 2004 B1
6826242 Ojard et al. Nov 2004 B2
6834123 Acharya et al. Dec 2004 B2
6839079 Barlow et al. Jan 2005 B2
6847468 Ferriere Jan 2005 B2
6850571 Tardif Feb 2005 B2
6850649 Malvar Feb 2005 B1
6868083 Apostolopoulos et al. Mar 2005 B2
6889385 Rakib et al. May 2005 B1
6892359 Nason et al. May 2005 B1
6898583 Rising, III May 2005 B1
6907602 Tsai et al. Jun 2005 B2
6927685 Wathen Aug 2005 B2
6930661 Uchida et al. Aug 2005 B2
6941575 Allen Sep 2005 B2
6944880 Allen Sep 2005 B1
6952595 Ikedo et al. Oct 2005 B2
6981050 Tobias et al. Dec 2005 B1
7016337 Wu et al. Mar 2006 B1
7020892 Levesque et al. Mar 2006 B2
7032000 Tripp Apr 2006 B2
7047305 Brooks et al. May 2006 B1
7110558 Elliott Sep 2006 B1
7124366 Foreman et al. Oct 2006 B2
7127735 Lee et al. Oct 2006 B1
7151575 Landry et al. Dec 2006 B1
7155734 Shimomura et al. Dec 2006 B1
7155735 Ngo et al. Dec 2006 B1
7184433 Oz Feb 2007 B1
7224323 Uchida et al. May 2007 B2
7239800 Bilbrey Jul 2007 B2
7344084 DaCosta Mar 2008 B2
7430686 Wang et al. Sep 2008 B1
7464396 Hejna, Jr. Dec 2008 B2
7502733 Andrsen et al. Mar 2009 B2
7505480 Zhang et al. Mar 2009 B1
7565681 Ngo et al. Jul 2009 B2
7584491 Bruckner et al. Sep 2009 B2
7640566 Taylor et al. Dec 2009 B1
7836149 Kikinis Nov 2010 B2
20010021998 Margulis Sep 2001 A1
20020004839 Wine et al. Jan 2002 A1
20020010925 Kikinis Jan 2002 A1
20020012530 Bruls Jan 2002 A1
20020031333 Mano et al. Mar 2002 A1
20020046404 Mizutani Apr 2002 A1
20020053053 Nagai et al. May 2002 A1
20020080753 Lee Jun 2002 A1
20020090029 Kim Jul 2002 A1
20020105529 Bowser et al. Aug 2002 A1
20020112247 Horner et al. Aug 2002 A1
20020122137 Chen et al. Sep 2002 A1
20020131497 Jang Sep 2002 A1
20020138843 Samaan et al. Sep 2002 A1
20020143973 Price Oct 2002 A1
20020147634 Jacoby et al. Oct 2002 A1
20020147687 Breiter et al. Oct 2002 A1
20020167458 Baudisch et al. Nov 2002 A1
20020188818 Nimura et al. Dec 2002 A1
20020191575 Kalavade et al. Dec 2002 A1
20030001880 Holtz et al. Jan 2003 A1
20030028873 Lemmons Feb 2003 A1
20030065915 Yu et al. Apr 2003 A1
20030093260 Dagtas et al. May 2003 A1
20030095791 Barton et al. May 2003 A1
20030115167 Sharif et al. Jun 2003 A1
20030159143 Chan Aug 2003 A1
20030187657 Erhart et al. Oct 2003 A1
20030192054 Birks et al. Oct 2003 A1
20030208612 Harris et al. Nov 2003 A1
20030231621 Gubbi et al. Dec 2003 A1
20040003406 Billmaier Jan 2004 A1
20040052216 Roh Mar 2004 A1
20040068334 Tsai et al. Apr 2004 A1
20040083301 Murase et al. Apr 2004 A1
20040100486 Flamini et al. May 2004 A1
20040103340 Sundareson et al. May 2004 A1
20040139047 Rechsteiner et al. Jul 2004 A1
20040162845 Kim et al. Aug 2004 A1
20040162903 Oh Aug 2004 A1
20040172410 Shimojima et al. Sep 2004 A1
20040205830 Kaneko Oct 2004 A1
20040212640 Mann et al. Oct 2004 A1
20040216173 Horoszowski et al. Oct 2004 A1
20040236844 Kocherlakota Nov 2004 A1
20040255249 Chang et al. Dec 2004 A1
20050021398 McCleskey et al. Jan 2005 A1
20050027821 Alexander et al. Feb 2005 A1
20050038981 Connor et al. Feb 2005 A1
20050044058 Matthews et al. Feb 2005 A1
20050050462 Whittle et al. Mar 2005 A1
20050053356 Mate et al. Mar 2005 A1
20050055595 Frazer et al. Mar 2005 A1
20050060759 Rowe et al. Mar 2005 A1
20050097542 Lee May 2005 A1
20050114852 Chen et al. May 2005 A1
20050132351 Randall et al. Jun 2005 A1
20050138560 Lee et al. Jun 2005 A1
20050198584 Matthews et al. Sep 2005 A1
20050204046 Watanabe Sep 2005 A1
20050216851 Hull et al. Sep 2005 A1
20050227621 Katoh Oct 2005 A1
20050229118 Chiu et al. Oct 2005 A1
20050246369 Oreizy et al. Nov 2005 A1
20050251833 Schedivy Nov 2005 A1
20050283791 McCarthy et al. Dec 2005 A1
20050288999 Lerner et al. Dec 2005 A1
20060011371 Fahey Jan 2006 A1
20060031381 Van Luijt et al. Feb 2006 A1
20060050970 Gunatilake Mar 2006 A1
20060051055 Ohkawa Mar 2006 A1
20060095401 Krikorian et al. May 2006 A1
20060095471 Krikorian et al. May 2006 A1
20060095472 Krikorian et al. May 2006 A1
20060095942 Van Beek May 2006 A1
20060095943 Demircin et al. May 2006 A1
20060107226 Matthews et al. May 2006 A1
20060117371 Margulis Jun 2006 A1
20060146174 Hagino Jul 2006 A1
20060280157 Karaoguz et al. Dec 2006 A1
20070003224 Krikorian et al. Jan 2007 A1
20070005783 Saint-Hillaire et al. Jan 2007 A1
20070022328 Tarra et al. Jan 2007 A1
20070074115 Patten et al. Mar 2007 A1
20070076604 Litwack Apr 2007 A1
20070168543 Krikorian et al. Jul 2007 A1
20070180485 Dua Aug 2007 A1
20070198532 Krikorian et al. Aug 2007 A1
20070234213 Krikorian et al. Oct 2007 A1
20070286596 Lonn Dec 2007 A1
20080019276 Takatsuji et al. Jan 2008 A1
20080037573 Cohen Feb 2008 A1
20080059533 Krikorian Mar 2008 A1
20080134267 Moghe et al. Jun 2008 A1
20080195744 Bowra et al. Aug 2008 A1
20080199150 Candelore Aug 2008 A1
20080294759 Biswas et al. Nov 2008 A1
20080307456 Beetcher et al. Dec 2008 A1
20080307462 Beetcher et al. Dec 2008 A1
20080307463 Beetcher et al. Dec 2008 A1
20090074380 Boston et al. Mar 2009 A1
20090199248 Ngo et al. Aug 2009 A1
20100100915 Krikorian et al. Apr 2010 A1
Foreign Referenced Citations (25)
Number Date Country
1464685 Dec 2003 CN
4407319 Sep 1994 DE
0838945 Apr 1998 EP
1077407 Feb 2001 EP
1443766 Aug 2004 EP
1691550 Aug 2006 EP
1830558 Sep 2007 EP
2307151 May 1997 GB
19990082855 Nov 1999 KR
20010211410 Aug 2001 KR
0133839 May 2001 WO
0147248 Jun 2001 WO
0193161 Dec 2001 WO
2003026232 Mar 2003 WO
03052552 Jun 2003 WO
03098897 Nov 2003 WO
2004032511 Apr 2004 WO
2005050898 Jun 2005 WO
2006064454 Jun 2006 WO
2006074110 Jul 2006 WO
2007027891 Mar 2007 WO
2007051156 May 2007 WO
2007141555 Dec 2007 WO
2007149466 Dec 2007 WO
2008024723 Feb 2008 WO
Non-Patent Literature Citations (140)
Entry
USPTO, Final Office Action, mailed Nov. 6, 2009; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
USPTO, Final Office Action mailed Nov. 12, 2009; U.S. Appl. No. 11/620,707, filed Jan. 7, 2007.
USPTO, Non-Final Office Action mailed Nov. 23, 2009; U.S. Appl. No. 11/683,862, filed Mar. 8, 2007.
USPTO, Non-Final Office Action mailed Oct. 1, 2009; U.S. Appl. No. 11/778,287, filed Jul. 16, 2007.
USPTO Final Office Action mailed Dec. 30, 2009; U.S. Appl. No. 11/147,664, filed Jun. 7, 2005.
European Patent Office, European Search Report, mailed Sep. 28, 2009 for European Application No. EP 06 78 6175.
International Search Report for PCT/US2008/069914 mailed Dec. 19, 2008.
PCT Partial International Search, PCT/US20091054893, mailed Dec. 23, 2009.
Newton's Telecom Dictionary, 21st ed., Mar. 2005.
Ditze M. et all “Resource Adaptation for Audio-Visual Devices in the UPnP QoS Architecture,” Advanced Networking and Applications, 2006; AINA, 2006; 20% H International conference on Vienna, Austria Apr. 18-20, 2006.
Joonbok, Lee et al. “Compressed High Definition Television (HDTV) Over IPv6,” Applications and the Internet Workshops, 2006; Saint Workshops, 2006; International Symposium, Phoenix, AZ, USA, Jan. 23-27, 2006.
Lowekamp, B. et al. “A Hierarchy of Network Performance Characteristics for Grid Applications and Services,” GGF Network Measurements Working Group, pp. 1-29, May 24, 2004.
Meyer, Derrick “MyReplayTV™ Creates First-Ever Online Portal to Personal TI! Service; Gives Viewers Whole New Way to Interact With Programming,” http://web.archive.org/web/20000815052751/http://www.myreplaytv.com/, Aug. 15, 2000.
Sling Media “Sling Media Unveils Top-of-Line Slingbox PRO-HD” [online], Jan. 4, 2008, XP002560049; retrieved from the Internet: URL:www.slingmedia.com/get/pr-slingbox-pro-hd.html; retrieved on Oct. 12, 2009.
Srisuresh, P. et al. “Traditional IP Network Address Translator (Traditional NAT),” Network Working Group, The Internet Society, Jan. 2001.
Conway, Frank et al. “Systems and Methods for Creating Variable Length Clips from a Media Stream,” U.S. Appl. No. 12/347,465, filed Dec. 31, 2008.
Thiyagarajan, Venkatesan et al. “Always-On-Top Media Player Launched From a Web Browser,” U.S. Appl. No. 12/617,271, filed Nov. 12, 2009.
Paul, John Michael et al. “Systems and Methods for Delivering Messages Over a Network,” U.S. Appl. No. 12/619,192, filed Nov. 16, 2009.
Rao, Padmanabha R. et al. “Methods and Apparatus for Establishing Network Connections Using an Inter-Mediating Device,” U.S. Appl. No. 12/642,368, filed Dec. 18, 2009.
Dham, Vikram et al. “Systems and Methods for Establishing Network Connections Using Local Mediation Services,” U.S. Appl. No. 12/644,918, filed Dec. 22, 2009.
Paul, John et al. “Systems and Methods for Remotely Controlling Media Server Via a Network,” U.S. Appl. No. 12/645,870, filed Dec. 23, 2009.
Bajpal, Parimal et al. “Method and Node for Transmitting Data Over a Communication Network using Negative Ackhowledgement,” U.S. Appl. No. 12/404,920, filed Mar. 16, 2009.
Bajpal, Parimal et al. “Method and Note for Employing Network connections Over a Connectinoless Transport Layer Protocol,” U.S. Appl. No. 12/405,062, filed Mar. 16, 2009.
Asnis, Ilya et al. “Mediated Network address Translation Traversal” U.S. Appl. No. 12/405,039, filed Mar. 16, 2009.
China State Intellectual Property Office “First Office Action,” issued Jul. 31, 2009, for Application No. 200580026825.
USPTO, Non-Final Office Action, mailed Aug. 4, 2009; U.S. Appl. No. 11/734,277, filed Apr. 12, 2007.
USPTO, Final Office Action, mailed Jul. 31, 2009; U.S. Appl. No. 11/683,862, filed Mar. 8, 2007.
USPTO, Non-Final Office Action, mailed Aug. 5, 2009; U.S. Appl. No. 11/147,663, filed Jun. 7, 2005.
UsSPTO, Non-Final Office Action, mailed Sep. 3, 2009; U.S. Appl. No. 11/620,711, filed Jan. 7, 2007.
Einaudi, Andrew E. et al. “Systems and Methods for Selecting Media Content Obtained from Multiple Sources,” U.S. Appl. No. 12/543,278, filed Aug. 18, 2009.
Malode, Deepak Ravi “Remote Control and Method for Automatically Adjusting the Volume Output of an Audio Device,” U.S. Appl. No. 12/1550,145, filed Aug. 28, 2009.
Akella, Aparna Sarma “Systems and Methods for Event Programming Via a Remote Media Player,” U.S. Appl. No. 12/537,057, filed Aug. 6, 2009.
Shah, Bhupendra Natwerlan et al. “Systems and Methods for Transcoding and Place Shifting Media Content,” U.S. Appl. No. 12/548,130, filed Aug. 26, 2009.
Banger, Shashidhar et al. “Systems and Methods for Automatically Controlling the Resolution of Streaming Video Content,” U.S. Appl. No. 12/537,785, filed Aug. 7, 2009.
Panigrahi, Biswaranjan “Home Media Aggregator System and Method,” U.S. Appl. No. 12/538,681, filed Aug. 10, 2009.
Nandury, Venkata Kishore “Adaptive Gain Control for Digital Audio Samples in a Media Stream,” U.S. Appl. No. 12/507,971, filed Jul. 23, 2009.
Shirali, Amey “Systems and Methods for Providing Programming Content,” U.S. Appl. No. 12/538,676, filed Aug. 10, 2009.
Thiyagarajan, Venkatesan “Systems and Methods for Virtual Remote Control of Streamed Media,” U.S. Appl. No. 12/538,664, filed Aug. 10, 2009.
Thiyagarajan, Venkatesan et al. “Localization Systems and Methods,” U.S. Appl. No. 12/538,783, filed Aug. 10, 2009.
Shirali, Amey et al. “Methods and Apparatus for Seeking Within a Media Stream Using Scene Detection,” U.S. Appl. No. 12/538,784, filed Aug. 10, 2009.
Thiyagarajan, Venkatesan “Systems and Methods for Updating Firmware Over a Network,” U.S. Appl. No. 12/538,661, filed Aug. 10, 2009.
Iyer, Satish “Methods and Apparatus for Fast Seeking Within a Media Stream Buffer,” U.S. Appl. No. 12/538,659, filed Aug. 10, 2009.
European Patent Office, International Searching Authority, “International Search Report,” for International Application No. PCT/US2009/049006, mailed Sep. 11, 2009.
Krikorian, Jason, U.S. Appl. No. 11/734,277, filed Apr. 12, 2007.
Tarra, Raghuveer et al., U.S. Appl. No. 60/975,239, filed Sep. 26, 2007.
Williams, George Edward, U.S. Appl. No. 12/167,041, filed Jul. 2, 2008.
Rao, Padmanabha R., U.S. Appl. No. 12/166,039, filed Jul. 1, 2008.
International Search Report and Written Opinion, PCT/US2005/020105, Feb. 15, 2007, 6 pages.
International Search Report and Written Opinion for PCT/US2006/04382, mailed Apr. 27, 2007.
Archive of “TV Brick Home Server,” www.tvbrick.com, [online] [Archived by http://archive.org on Jun. 3, 2004; Retrieved on Apr. 12, 2006] retrieved from the Internet <URL:http://web.archive.org/web/20041107111024/www.tvbrick.com/en/affiliate/tvbs/tvbrick/document18/print>.
Faucon, B. “TV ‘Brick’ Opens up Copyright Can of Worms,” Financial Review, Jul. 1, 2003, [online] [Retrieved on Apr. 12, 2006] Retrieved from the Internet <URL:http://afr.com/cgi-bin/newtextversions.pl?storyid=1056825330084&date=2003/07/01&pagetype=printer&section=1053801318705&path=articles/2003/06/30/1056825330084.html>.
Balster, Eric J., “Video Compression and Rate Control Methods Based on the Wavelet Transform,” The Ohio State University 2004, pp. 1-24.
Kulapala et al., “Comparison of Traffic and Quality Characteristics of Rate-Controlled Wavelet and DCT Video,” Arizona State University, Oct. 11, 2004.
Skodras et al., “JPEG2000: The Upcoming Still Image Compression Standard,” May 11, 2000, 14 pages.
Taubman et al., “Embedded Block Coding in JPEG2000,” Feb. 23, 2001, pp. 1-8 of 36.
Kessler, Gary C., An Overview of TCP/IP Protocols and the Internet; Jan. 16, 2007, retrieved from the Internet on Jun. 12, 2008 at http://www.garykessler.net/library/tcpip.html; originally submitted to the InterNIC and posted on their Gopher site on Aug. 5, 1994.
Roe, Kevin, “Third-Party Observation Under EPC Article 115 on the Patentability of an Invention,” Dec. 21, 2007.
Roe, Kevin, Third-Party Submission for Published Application Under CFR §1.99, Mar. 26, 2008.
China State Intellectual Property Office “First Office Action,” issued Jan. 8, 2010, for Application No. 200810126554.0.
USPTO Final Office action mailed Jan. 25, 2010; U.S. Appl. No. 11/734,277, filed Apr. 12, 2007.
Australian Government “Office Action,” Australian Patent Application No. 2006240518, mailed Nov. 12, 2009.
Jain, Vikal Kumar “Systems and Methods for Coordinating Data Communication Between Two Devices,” U.S. Appl. No. 12/699,280, filed Feb. 3, 2010.
Gangotri, Arun L. et al. “Systems and Methods and Program Applications for Selectively Restricting the Placeshifting of Copy Protected Digital Media Content,” U.S. Appl. No. 12/623,955, filed Nov. 23, 2009.
Paul, John et al. “Systems and Methods for Searching Media Content,” U.S. Appl. No. 12/648,024, filed Dec. 28, 2009.
Newton's Telcom Dictionary, 20th ed., Mar. 2004.
“The Authoritative Dictionary of IEEE Standard Terms,” 7th ed. 2000.
Gurzhi, Alexander et al. “Systems and Methods for Emulating Network-Enabled Media Components,” U.S. Appl. No. 12/711,830, filed Feb. 24, 2010.
Bajpai, Parimal et al. “Systems and Methods of Controlling the Encoding of a Media Stream,” U.S. Appl. No. 12/339,878, filed Dec. 19, 2008.
Malone, Edward D. et al. “Systems and Methods for Controlling Media Devices,” U.S. Appl. No. 12/256,344, filed Oct. 22, 2008.
Banger, Shashidhar et al. “Systems and Methods for Determining Attributes of Media Items Accessed Via a Personal Media Broadcaster,” U.S. Appl. No. 12/334,959, filed Dec. 15, 2008.
Kulkarni, Anant Madhava “Systems and Methods for Creating Logical Media Streams for Media Storage and Playback,” U.S. Appl. No. 12/323,907, filed Nov. 26, 2008.
Krikorian, Blake Gary et al. “Systems and Methods for Projecting Images From a Computer System,” U.S. Appl. No. 12/408,460, filed Mar. 20, 2009.
Krikorian, Blake Gary et al. “Systems and Methods for Presenting Media Content Obtained From Multiple Sources,” U.S. Appl. No. 12/408,456, filed Mar. 20, 2009.
International Search Report and Written Opinion for International Application No. PCT/US2008/080910, mailed Feb. 16, 2009.
International Search Report and Written Opinion for International Application No. PCT/US2006/025911, mailed Jan. 3, 2007.
International Search Report for International Application No. PCT/US2007/063599, mailed Dec. 12, 2007.
International Search Report for International Application No. PCT/US2007/076337, mailed Oct. 20, 2008.
International Search Report and Written Opinion for International Application No. PCT/US2006/025912, mailed Jul. 17, 2008.
International Search Report for International Application No. PCT/US2008/059613, mailed Jul. 21, 2008.
Sony Corporation “LocationFree Player Pak—LocationFree Base Station—LocationFree Player” [Online] 2005, XP002512401; retrieved from the Internet: <URL:http://www.docs.sony.com/release/LFPK1.pdf>; retrieved on Jan. 28, 2009.
Wikipedia “Slingbox” [Online], Oct. 21, 2007, XP002512399; retrieved from the Internet: <URL:http://en.wikipedia.org/w/index.php?title=Slingbox&oldid=166080570>; retrieved on Jan. 28, 2009.
Capable Networks LLC “Keyspan Remote Control—Controlling Your Computer With a Remote” [Online], Feb. 21, 2006, XP002512495; retrieved from the Internet: <URL:http://www.slingcommunity.com/article/11791/Keyspan-Remote-Control—Controlling-Your-Computer-With-a-Remote/?highlight=remote+control>; retrieved on Jan. 28, 2009.
Wikipedia “LocationFree Player” [Online], Sep. 22, 2007, XP002512400; retrieved from the Internet: <URL:http://en.wikipedia.org/w/index.php?title=LocationFree—Player&oldid=159683564>; retrieved on Jan. 28, 2009.
Sling Media Inc. “Slingbox User Guide” [Online] 2006, XP002512553; retrieved from the Internet: <URL:http://www.slingmedia.hk/attach/en-US—Slingbox—User—Guide—v12.pdf>; retrieved on Jan. 29, 2009.
Sony Corporation “LocationFree TV” [Online], 2004, XP002512410; retrieved from the Internet: <URL:http://www.docs.sony.com/release/LFX1—X5revision.pdf>; retrieved on Jan. 28, 2009.
European Patent Office, International Searching Authority, “International Search Report,” mailed Mar. 30, 2010; International Application PCT/US2009/068468 filed Dec. 27, 2009.
USPTO Final Office Action mailed Mar. 3, 2010; U.S. Appl. No. 11/111,265, filed Apr. 21, 2005.
USPTO Final Office Action mailed Mar. 12, 2010; U.S. Appl. No. 11/620,711, filed Jan. 7, 2007.
USPTO Non-Final Office Action mailed Mar. 19, 2010; U.S. Appl. No. 11/147,664, filed Jun. 7, 2005.
USPTO Non-Final Office Action mailed Mar. 31, 2010; U.S. Appl. No. 11/620,707, filed Jan. 7, 2007.
USPTO Non-Final Office Action mailed Apr. 1, 2010; U.S. Appl. No. 12/237,103, filed Sep. 24, 2008.
Liu, Qiong et al. “Digital Rights Management for Content Distribution,” Proceedings of the Australasian Information Security Workshop Conference on ACSW Frontiers 2003, vol. 21, 2003, XP002571073, Adelaide, Australia, ISSN: 1445-1336, ISBN: 1-920682-00-7, sections 2 and 2.1.1.
China State Intellectual Property Office “Office Action” issued Mar. 18, 2010 for Application No. 200680022520.6.
China State Intellectual Property Office “Office Action” issued Apr. 13, 2010 for Application No. 200580026825.X.
Canadian Intellectual Property Office “Office Action” mailed Feb. 18, 2010 for Application No. 2569610.
European Patent Office “European Search Report,” mailed May 7, 2010 for Application No. 06786174.0.
Margulis, Neal “Apparatus and Method for Effectively Implementing a Wireless Television System,” U.S. Appl. No. 12/758,193, filed Apr. 12, 2010.
Margulis, Neal “Apparatus and Method for Effectively Implementing a Wireless Television System,” U.S. Appl. No. 12/758,194, filed Apr. 12, 2010.
Margulis, Neal “Apparatus and Method for Effectively Implementing a Wireless Television System,” U.S. Appl. No. 12/758,196, filed Apr. 12, 2010.
Krikorian, Jason Gary et al. “Personal Media Broadcasting System with Output Buffer,” U.S. Appl. No. 12/757,697, filed Apr. 9, 2010.
Tarra, Raghuveer et al. “Firmware Update for Consumer Electronic Device,” U.S. Appl. No. 12/757,714, filed Apr. 9, 2010.
European Patent Office, European Search Report for European Application No. EP 08 16 7880, mailed Mar. 4, 2009.
MythTV Wiki, “MythTV User Manual” [Online], Aug. 27, 2007, XP002515046; retrieved from the Internet: <URL: http://www.mythtv.org/wiki?title=User—Manual:Introduction&oldid=25549>.
International Searching Authority, Written Opinion and International Search Report for International Application No. PCT/US2008/077733, mailed Mar. 18, 2009.
International Searching Authority, Written Opinion and International Search Report for International Application No. PCT/US2008/087005, mailed Mar. 20, 2009.
Watanabe Y. et al., “Multimedia Database System for TV Newscasts and Newspapers”; Lecture Notes in Computer Science, Springer Verlag, Berlin, Germany; vol. 1554, Nov. 1, 1998, pp. 208-220, XP002402824, ISSN: 0302-9743.
Watanabe, Yasuhiko et al., “Aligning Articles in TV Newscasts and Newspapers”; Proceedings of the International Conference on Computational Linguistics, Jan. 1, 1998, pp. 1381-1387, XP002402825.
Sodergard C. et al., “Integrated Multimedia Publishing: Combining TV and Newspaper Content on Personal Channels”; Computer Networks, Elsevier Science Publishers B.V., Amsterdam, Netherlands; vol. 31, No. 11-16, May 17, 1999, pp. 1111-1128, XP004304543, ISSN: 1389-1286.
Ariki Y. et al., “Automatic Classification of TV News Articles Based on Telop Character Recognition”; Multimedia Computing and Systems, 1999; IEEE International Conference on Florence, Italy, Jun. 7-11, 1999, Los Alamitos, California, USA, IEEE Comput. Soc. US; vol. 2, Jun. 7, 1999, pp. 148-152, XP010519373, ISBN: 978-0-7695-0253-3; abstract, paragraph [03.1], paragraph [052], figures 1,2.
USPTO, Non-Final Office Action mailed Dec. 17, 2004; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
USPTO, Final Office Action mailed Jul. 28, 2005; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
USPTO, Non-Final Office Action mailed Jan. 30, 2006; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
USPTO, Final Office Action mailed Aug. 10, 2006; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
Lee, Joonbok et al. “Compressed High Definition Television (HDTV) Over IPv6,” Applications and the Internet Workshops; Saint Workshops; International Symposium, Phoenix, AZ, USA, Jan. 23-27, 2006.
USPTO, Non-Final Office Action mailed Apr. 16, 2008; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
USPTO, Final Office Action mailed Sep. 18, 2008; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
USPTO, Non-Final Office Action mailed Mar. 31, 2009; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
USPTO, Non-Final Office Action mailed May 1, 2008; U.S. Appl. No. 11/111,265, filed Apr. 21, 2005.
USPTO, Final Office Action mailed Dec. 29, 2008; U.S. Appl. No. 11/111,265, filed Apr. 21, 2005.
USPTO, Non-Final Office Action mailed Jun. 8, 2009; U.S. Appl. No. 11/111,265, filed Apr. 21, 2005.
USPTO, Non-Final Office Action mailed Jun. 26, 2008; U.S. Appl. No. 11/620,707, filed Jan. 7, 2007.
USPTO, Final Office Action mailed Oct. 21, 2008; U.S. Appl. No. 11/620,707, filed Jan. 7, 2007.
USPTO, Non-Final Office Action mailed Mar. 25, 2009; U.S. Appl. No. 11/620,707, filed Jan. 7, 2007.
USPTO, Non-Final Office Action mailed Aug. 7, 2008; U.S. Appl. No. 11/620,711, filed Jan. 7, 2007.
USPTO, Final Office Action mailed Feb. 9, 2009; U.S. Appl. No. 11/620,711, filed Jan. 7, 2007.
USPTO, Non-Final Office Action mailed Feb. 25, 2009; U.S. Appl. No. 11/683,862, filed Mar. 8, 2007.
USPTO, Non-Final Office Action mailed Dec. 24, 2008; U.S. Appl. No. 11/147,985, filed Jun. 7, 2005.
USPTO, Non-Final Office Action mailed Jun. 25, 2008; U.S. Appl. No. 11/428,254, filed Jun. 30, 2006.
USPTO, Final Office Action mailed Feb. 6, 2009; U.S. Appl. No. 11/428,254, filed Jun. 30, 2006.
USPTO, Non-Final Office Action mailed May 15, 2009; U.S. Appl. No. 11/147,664, filed Jun. 7, 2005.
Sonic Blue “ReplayTV 5000 User's Guide,” 2002, entire document.
Bluetooth-News; Main Future User Models Document Verification & Qualification: Bluetooth Technical Background, Apr. 21, 1999; pp. 1 of 7 and 2 of 7; http://www.bluetooth.com/v2/news/show.asp.
Microsoft Corporation; Harman/Kardon “Master Your Universe” 1999.
Matsushita Electric Corporation of America MicroCast : Wireless PC Multimedia Transceiver System, Nov. 1998.
“Wireless Local Area Networks: Issues in Technology and Standards” Jan. 6, 1999.
USPTO, Final Office Action mailed Jun. 25, 2009; U.S. Appl. No. 11/147,985, filed Jun. 7, 2005.
Lee, M. et al. “Video Frame Rate Control for Non-Guaranteed Network Services with Explicit Rate Feedback,” Globecom'00, 2000 IEEE Global Telecommunications conference, San Francisco, CA, Nov. 27-Dec. 1, 2000; [IEEE Global Telecommunications Conference], New York, NY; IEEE, US, vol. 1, pp. 293-297, XP001195580; ISBN: 978-0-7803-6452-3, lines 15-20 of sec. II on p. 293, fig. 1.
European Patent Office, International Searching Authority, “International Search Report and Written Opinion,” mailed Jun. 4, 2010 for International Application No. PCT/IN2009/000728, filed Dec. 18, 2009.
USPTO Non-Final Office Action mailed Jun. 23, 2010; U.S. Appl. No. 11/933,969, filed Nov. 1, 2007.
Korean Intellectual Property Office “Official Notice of Preliminary Rejection,” issued Jun. 18, 2010; Korean Patent Application No. 10-2008-7021254.
Related Publications (1): US 20100192188 A1, published Jul. 2010 (US).