Systems and methods for creating variable length clips from a media stream

Abstract
Systems and methods are described for providing variable-length media clips based upon a received media stream. An exemplary system for providing media clips to users includes a database comprising a list of rules associated with a plurality of programs, as well as a first server and a second server. The first server receives information about a particular program being viewed by a media player and obtains rules from the database for the identified program that are transmitted to the media player. The second server receives a media clip of the particular program being viewed from the media player, wherein the media clip is created at the media player in accordance with the rules corresponding to the program, and subsequently makes the media clip available to the users.
Description
TECHNICAL FIELD

The following discussion generally relates to creating media clips from a received media stream.


BACKGROUND

Consumers are continually demanding increased flexibility in viewing television and other forms of media. Whereas television viewing traditionally involved watching imagery received on a broadcast signal on a conventional television set, modern media experiences allow media content to be provided via broadcast, cable, satellite, portable media (e.g., DVD) and other sources. Further, the Internet and other relatively high-bandwidth networks now allow media content to be delivered to any number of devices (e.g., wireless phones, computers and the like) that previously were not typically used for viewing media content. Such networks have also allowed viewers to “place shift” their media experience from their home television or the like to a remotely-located television, computer, wireless telephone or other device. Consumers are therefore able to view television and other media content on a wide variety of devices and in a wide variety of locations.


In addition to the increased availability and flexibility in viewing media content, consumers have recently expressed significant interest in creating “clips” of media content that can be shared with others. Such clips may include relatively short excerpts of viewed media content in a digital or other format that may be distributed via the Internet or another channel; a number of Internet services for uploading and sharing media clips have become very popular in recent years. A challenge continues, however, in allowing users to create clips of popular media content while protecting the content owner's rights in the media.


As a result, it is desirable to create systems, methods and/or devices that are able to allow consumers to create clips of received media content and to share these clips in a manner that respects the rights of the content owner. These and other desirable features and characteristics will become apparent from the subsequent detailed description and the appended claims, taken in conjunction with the accompanying drawings and this background section.


BRIEF DESCRIPTION

According to various exemplary embodiments, systems and methods are described for providing variable-length media clips based upon a received media stream. An exemplary system for providing media clips to users includes a database comprising a list of rules associated with a plurality of programs, as well as a first server and a second server. The first server receives information about a particular program being viewed by a media player and obtains rules from the database for the identified program that are transmitted to the media player. The second server receives a media clip of the particular program being viewed from the media player, wherein the media clip is created at the media player in accordance with the rules corresponding to the particular program, and subsequently makes the media clip available to the users.


In other embodiments, a method for processing a media clip suitably comprises receiving information about a particular program being viewed by a remotely-located media player, obtaining at least one rule corresponding to the particular program, wherein the at least one rule regulates clipping of the particular program, and providing the at least one rule corresponding to the identified particular program to the remotely-located media player to thereby allow the remotely-located media player to create a clip of the particular program in accordance with the at least one clipping rule.


In still other embodiments, a method of creating a variable length media clip from a media stream suitably comprises receiving a media stream at a media player device, obtaining a maximum clip length associated with a program contained in the media stream, receiving a start point and an end point of the variable length media clip while maintaining the length of the variable length media clip to be less than the maximum clip length, and providing the created clip to a remote server.


Various other embodiments, aspects and other features are described in more detail below.





BRIEF DESCRIPTION OF THE DRAWING FIGURES

Exemplary embodiments will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements, and



FIG. 1 is a diagram of an exemplary media clipping system;



FIG. 2 is a data flow diagram of an exemplary media clipping process and system;



FIG. 3 is a flowchart of an exemplary technique for identifying rules associated with a particular program or network; and



FIGS. 4-6 are exemplary interfaces for an exemplary media player application that is capable of creating variable-length media clips.





DETAILED DESCRIPTION

The following detailed description is merely exemplary in nature and is not intended to limit the invention or the application and uses of the invention. Furthermore, there is no intention to be bound by any theory presented in the preceding background or the following detailed description.


According to various exemplary embodiments, systems and techniques are provided for creating variable-length media clips from a media stream that is received at a media player. The player is able to provide information about the received media stream or a program that is currently being viewed to a remotely-located server to obtain one or more rules relating to the creation of media clips. The rules may include, for example, a maximum allowable clip length, as well as any limitations on displaying or distributing the clip, and may be determined based upon the program name, network or other information that identifies the source of the clip. Other rules could restrict clip quality or other parameters, and/or may restrict other actions such as whether the clip may be recorded, or the like. Any number of different rules could be formulated and applied as desired. The media player creates the clip in accordance with the received rules, and provides the clip to a distribution server for distribution or sharing with other users. In various embodiments, any limitations on display or distribution of the clip may be contained within metadata associated with the clip itself to thereby allow the distribution server to implement the rules for the particular clip.


With initial reference to FIG. 1, an exemplary system 100 for creating and distributing a media clip 107A-C suitably includes a media player device 102 that receives a media stream 125 from any source and creates clips 107A-C in accordance with rules stored in a database 105. Media player 102 suitably obtains the rules associated with the particular program or network being “clipped” from a clipping rights server (CRS) 104 that is remotely-located across a network 110. Clips 107A-C created in accordance with the received rules can be distributed (also in accordance with the rules associated with the particular program) via media distribution server 106. By identifying the particular program or network from the source media stream 125 prior to clip creation and then applying clip creation and distribution rules that are unique to the program and/or network, the rights of the media content owner can be preserved while still providing great flexibility to viewers in creating clips of viewed content.


Media player 102 is any device, system or module capable of creating a media clip 107A-C from a received media stream 125. In various embodiments, media player 102 is implemented with any sort of conventional computer system or similar workstation, such as any sort of desktop, laptop or other personal computer or general-purpose computing system. In other embodiments, media player 102 is a set-top box (STB) or other receiver device capable of receiving television or other media signals via any sort of broadcast, cable, satellite or other medium. In still other embodiments, media player 102 may be a portable self-contained computing device such as any sort of wireless phone, personal digital assistant (PDA), network client and/or the like. Alternatively, media player 102 is a logical application or other module implemented in software or firmware that can be executed on any sort of processing hardware, including any sort of web or other network server. Media player 102 therefore includes any sort of hardware, software, firmware and/or other resources that allow for creation of clips 107A-C from a received media stream 125.


Media stream 125 is received from any source, and in any format. In embodiments wherein media player 102 is a television receiver such as a STB, for example, media stream 125 may be received directly from a satellite, cable, broadcast or other source. In such embodiments, media stream 125 may be a received television signal or the like that is received and processed locally on media player device 102. Media stream 125 may also represent a stream obtained from a DVD or other portable medium, and/or a media file stored at media player 102 in any format.


Other embodiments of media player 102 may also support streaming media placeshifting over a digital network 110. In various embodiments, consumers may wish to placeshift content within a home, office or other structure, for example, such as from a placeshifting device 112 to a media player 102 located in another room, or to a remotely located media player 102 located in a second home, office, hotel or other remote location. Although placeshifting may traverse any local, wide area, telephone and/or other networks 110 to provide media stream 125 from any placeshifting source 112 to any destination media player 102, as described more fully below, media stream 125 may be obtained from any local or remote source available to media player 102. Similarly, media player 102 may itself provide placeshifting functionality; that is, some embodiments of media player 102 may additionally incorporate the features of placeshifting device 112 into a common system. A set-top box or other television receiver that provides placeshifting features, for example, could receive media stream 125 from any local or remote source. This media stream 125 may be clipped or otherwise processed locally at the receiver, and/or may be placeshifted for viewing at any remote location. Again, placeshifting is not required in all embodiments.


Media player 102 as described herein therefore equivalently represents any device, logic or other system capable of receiving a media stream 125 from any local or remote source and of creating a media clip from the received content. In embodiments wherein media player 102 is a conventional STB or similar receiver, media stream 125 may be a satellite, cable or broadcast stream that is presented on a television or other local display 132, and that is clipped or otherwise processed in response to inputs provided by a user via a remote control or other input device as appropriate. While other embodiments may provide clipping in the context of a media stream that is placeshifted across network 110, placeshifting is not required in all embodiments. Media player 102, then, may equivalently represent a conventional STB or other television receiver, as well as any sort of media player application or device capable of receiving placeshifted media streams from any source.


In the exemplary embodiment shown in FIG. 1, media player 102 includes a processor 122, memory 124 and input/output features 126 commonly associated with any conventional computing platform. Processor 122, for example, may be any sort of microprocessor, microcontroller, digital signal processor, programmable array or other logic capable of executing instructions and processing data to implement the various features of the media player device. Memory 124 includes any sort of RAM, ROM, flash and/or other memory capable of storing instructions and data that can be processed by processor 122 or other processing logic as appropriate. Input/output 126 typically includes any conventional interfaces to input devices (e.g., keyboard, mouse, touchpad, wireless or other remote control, or other input devices as appropriate), as well as any conventional interfaces to output devices such as a display 132 or the like. Input/output 126 typically also includes interfaces to any sort of mass storage (e.g., a magnetic or optical disk) and/or to a communications network 110. Network interfaces used in various embodiments might include any sort of wired (e.g., ETHERNET) or wireless (e.g., IEEE 802.11 “Wi-fi”) interfaces, including any sort of interfaces to telephone networks.


Any of the various features of media player 102 may be implemented with any sort of general or special purpose hardware, software and/or firmware, as appropriate. In some embodiments (e.g., embodiments wherein media player 102 is implemented as a STB or other media receiver), processor 122, memory 124 and/or input/output 126 may be implemented as a “system on a chip” (SoC) using any suitable processing circuitry under control of any appropriate control logic. In various embodiments, such control logic may execute within an integrated SoC or other processor to implement a media receiver, decoder, display processor and/or any other features as appropriate. The Broadcom Corporation of Irvine, Calif., for example, produces several models of processors (e.g., the model BCM 7400 family of processors) that are capable of supporting SoC implementations of satellite and/or cable receiver systems, although products from any number of other suppliers could be equivalently used. In still other embodiments, various distinct chips, circuits or components may be interconnected and may interact with one another to implement the various functions and features described herein.


To that end, operations of device 102 may be controlled by any sort of general purpose or other operating system 128. Operating system 128 typically implements user interface features and also allows programs (e.g., media player application 130) to use the various hardware and other resources of device 102. Examples of operating systems that could be used in various embodiments include any of the well-known operating systems conventionally used in personal computing (e.g., any version of WINDOWS, MacOS/OSX, LINUX OS, etc.) or mobile computing (e.g., Blackberry OS, ANDROID OS, WINDOWS MOBILE, SYMBIAN OS, iPHONE OS, etc.). The particular examples of operating systems are not intended to be limiting; indeed, other embodiments could be based upon other operating systems, including any sort of proprietary operating system, and equivalent embodiments could be based upon any sort of programming or other abstraction environment (e.g., JAVA, .NET, and/or the like) in place of or in addition to a conventional operating system 128.


Media player application 130 is any program, application, applet, browser plugin, software module and/or other logic capable of processing the received media stream 125 to create one or more media clips 107A-C. In various embodiments, media player application 130 also contains decoder functionality for decoding the received stream 125 for playback on display 132 or the like. Media player application 130 may be implemented in any programming language, and may be stored in source or object code format in any storage medium, including memory 124 and/or any sort of disk or other mass storage available to media player 102. In an exemplary implementation, media player application 130 is a software program that is stored in object code form on a disk or similar medium until being activated by the user. The program 130 is then partially or wholly duplicated into memory 124 to facilitate execution of the object code instructions by processor 122. Examples of various interfaces that might be provided by one implementation of media player application 130 are described below with reference to FIGS. 4-6, although any other interfaces could be used in alternate embodiments.


Display 132 is any sort of television, monitor or other display capable of presenting imagery to the viewer. In various embodiments, display 132 is a conventional television or computer monitor associated with media player 102, including any sort of integrated or external display 132 that might be associated with a computer, wireless phone, PDA or the like. In other embodiments, display 132 is a conventional analog or digital television that may be coupled to a STB or other receiver 102, as described above. Display 132 need not be physically present at the same location as media player 102 in all embodiments; to the contrary, content may be provided from media player 102 to display 132 via any sort of cabling, network (e.g., network 110) or the like.


Network 110 is any digital or other communications network capable of transmitting messages between senders and receivers. In various embodiments, network 110 may encompass one or more wide area networks, local area networks, and/or any combination of wide and local area networks, including any sort of wireless or other telephone networks. Network 110 can include any number of public or private data connections, links or networks supporting any number of communications protocols. Network 110 may include the Internet, for example, or any other network based upon TCP/IP or other conventional protocols. In many embodiments, network 110 may also include one or more conventional local area networks, such as one or more IEEE 802.3 and/or IEEE 802.11 networks. Network 110 as shown in FIG. 1, then, is intended to broadly encompass any digital communications network(s), systems or architectures for transmitting data between the various components of system 100.


As noted above, in some embodiments media player 102 receives media stream 125 over network 110 from one or more placeshifting devices 112. In such embodiments, placeshifting device 112 packetizes media content received from a media source 115 for transmission over communications network 110. The media source 115 may include any source of media content, including any sort of digital or analog recording device (e.g., a digital video recorder); any broadcast, cable or satellite television programming source; any “video-on-demand” or similar source; a player for any sort of digital video disk (DVD) or other removable media; a security or other video camera; and/or the like. When present, then, placeshifting device 112 is any component, hardware, software logic and/or the like capable of transmitting a packetized stream of media content over network 110. Although FIG. 1 shows only a single placeshifting device 112, in practice system 100 may include any number of placeshifting devices 112 and/or media sources 115, each of which may be able to stream media content to media player 102. Again, placeshifting is not necessary in all embodiments.


In embodiments that do provide placeshifting, media data may be received from the controlled source 115 in any format. Each placeshifting device 112 therefore incorporates suitable transcoder logic to convert audio/video or other media data into a packetized format (e.g., MPEG, QuickTime, Windows Media and/or the like) that can be transmitted over network 110 as stream 125. Placeshifting device 112 may also provide control instructions to one or more media sources 115 using any sort of infrared, radio frequency, or other signals. Such signals may be provided, for example, from an “IR Blaster” or similar feature that emulates infrared or other RF instructions provided from a remote control associated with the media source 115. Control signals and response could be equivalently transferred over a home network or other data connection, or alternatively via USB and/or any other sort of hardwired cable connection. U.S. Patent Publication No. 2006/0095471 describes one example of a placeshifting encoder, although the concepts described herein could be used in conjunction with products and services available from any source, including those available from Sling Media of Foster City, Calif. and others.


As noted above, it is generally desirable to allow viewers to create clips from media stream 125 that can be shared with other viewers, provided that the rights of the content owner can be protected. In various embodiments, media player 102 communicates with a data center system 103 that provides clipping rules associated with the particular program or network contained within media stream 125 to allow such clipping to occur. Data center system 103 as shown in FIG. 1 includes a database 105, a clipping rights server (“CRS”) 104, and a media distribution server 106 as appropriate. Servers 104 and 106 may be implemented using any sort of conventional network server hardware. Various embodiments of system 103 may also include redundant servers for CRS server 104 and/or media distribution server 106, as well as appropriate backup, load balancing, security and other features conventionally found in a typical web service system.


Database 105 includes any sort of data collection or repository that associates clipping rules with particular programs, channels and/or networks, as well as any sort of database engine or the like that allows access to the data. In various embodiments, database 105 is implemented with any sort of conventional relational, hierarchical, object-oriented or other database structure that may be organized in any manner. Database 105 typically includes any number of entries corresponding to particular programs and/or networks as appropriate. Rules associated with any particular program or network may include rules related to clip sharing, clip distribution, clip display, and/or the like. Clip sharing rules may include, for example, rules setting forth whether clipping is allowed or not allowed, a maximum clip length, a portion of a program wherein clipping is blocked, and/or other rules as desired. Clip distribution rules may include limits on whether the clip may be shared or not, as well as any limits on the parties that may receive the clip (e.g., everyone, registered members, friends of the clip maker, etc.). Clip display rules may include temporal or other restrictions on the availability of the clip (e.g., blocking clips for a particular program until a certain time). Such features may be beneficial, for example, in preventing clips of a program initially broadcast in a more easterly time zone (e.g., EST) from being viewed until the same program has been broadcast in a more westerly time zone (e.g., PST or Hawaii-Aleutian Time). Time limits could also be imposed to prevent clipping from an original broadcast, but to allow clipping from DVR recordings or the like after an appropriate period of time has elapsed after the original broadcast. Other rules may restrict the number of users who have access to a particular clip, the quality of clips that can be created (e.g., any limits on resolution, bit or frame rate, or any other parameter), limits on further copying, displaying, editing or the like, and/or any other limitations as desired. Rules may be formulated with regard to any aspect of clip creation, display, processing, distribution and/or the like.
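
By way of illustration only, one entry of the sort described above could be represented as a simple record along the following lines. This is a minimal Python sketch; the field names, default values and keying scheme are assumptions made for the example and are not part of the described embodiments.

from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class ClipRules:
    """Hypothetical rule record for one program, channel or network entry in database 105."""
    clipping_allowed: bool = False              # clip sharing rule: is clipping permitted at all?
    max_clip_seconds: int = 0                   # maximum clip length; zero could denote "no clipping"
    blocked_portions: list = field(default_factory=list)   # (start, end) offsets where clipping is blocked
    share_audience: str = "nobody"              # distribution rule: "everyone", "members", "friends", "nobody"
    available_after: Optional[datetime] = None  # display rule: embargo until, e.g., the westernmost broadcast
    max_resolution: Optional[str] = None        # quality rule, e.g. "480p"
    allow_download: bool = False                # whether file-based copies may be saved

# One possible layout: entries keyed by (program title, network), both purely illustrative values.
RULES_DB = {
    ("Example Program", "Example Network"): ClipRules(
        clipping_allowed=True,
        max_clip_seconds=120,
        share_audience="members",
        available_after=datetime(2024, 1, 2, 2, 0),
    ),
}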


As noted above, rules may be associated with particular programs, networks, channels and/or other features as desired. In various embodiments, rules can be associated with particular programs to allow clipping according to certain constraints for the particular program. Clipping may be blocked entirely for some programs, for example, while clipping may be allowed, or allowed with constraints, for others. Clipping rules may also be described for entire channels or networks, if desired. In such embodiments, a set of clipping constraints can be defined for all of the programs appearing on that channel or network. Further, rules may be defined that incorporate both program name and a channel or network, if desired. Such rules may block clipping of new episodes of a program, for example, while allowing clipping of older episodes that may be running in syndication on another channel or network. Rules may be created to facilitate any sort of licensing or other business arrangement with the content owner in any convenient manner.


CRS server 104 is any system, component or logic capable of receiving queries from one or more media players 102 and of returning one or more rules from database 105 in response. In various embodiments, CRS server 104 receives information about the media stream 125 currently being received by a particular media player 102 and obtains the particular rules from database 105 that are associated with the particular program or network. These rules can then be provided back to media player 102 to allow or disallow clipping according to the defined rules for that particular program, network and/or channel.


Media distribution server 106 is any system, component or logic capable of receiving media clips 107A-C from one or more media players 102 and of providing access to the various clips for other users of system 100. In various embodiments, media distribution server 106 is implemented using conventional network server hardware (including any desired redundancy, load balancing, backup and the like) for providing access to files via network 110. Media distribution server 106 is further configured, however, to implement various rules regarding the distribution and/or display of particular clips. These rules may be received directly from database 105 in some embodiments. In other embodiments, the particular rules implemented by the media distribution server 106 are received as metadata contained within the clips 107A-C themselves.
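
A minimal sketch of how a distribution server might honor rules carried in clip metadata follows. The metadata keys and audience categories are illustrative assumptions, not an actual interface of server 106.

from datetime import datetime, timezone

def may_serve_clip(clip_metadata: dict, requesting_user, friends: set) -> bool:
    """Hypothetical check a distribution server could apply before serving a clip.

    The keys 'available_after' (a timezone-aware datetime) and 'share_audience'
    are stand-ins for the display and distribution rules described above.
    """
    # Display rule: the clip may be embargoed until a certain time, for example
    # until the program has aired in the westernmost time zone.
    embargo = clip_metadata.get("available_after")
    if embargo and datetime.now(timezone.utc) < embargo:
        return False

    # Distribution rule: restrict the audience that may receive the clip.
    audience = clip_metadata.get("share_audience", "nobody")
    if audience == "everyone":
        return True
    if audience == "members":
        return requesting_user is not None
    if audience == "friends":
        return requesting_user in friends
    return False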


Distribution and display of media clips 107A-C may take place in any manner. Clips 107A-C may be provided in a streaming format, for example, or may be provided in a file-based format that would allow users to download and save local copies of the clip 107A-C. In various embodiments, the type of downloading or presentation may be restricted or otherwise determined using rules obtained from database 105.


Clips 107A-C uploaded to server 106 may be accessed in any manner. In various embodiments, one or more users (e.g., a user of computer system 108) can access one or more clips 107A-C via network 110 using a conventional browser program or the like. Server 106 may implement any sort of account management scheme (e.g., based upon userid/password combinations or other credentials), or public access to some or all clips 107A-C may be allowed. Users may be directed toward particular clips in any manner. A search engine that allows keyword searching or searching based upon program name, network, channel, actor/actress name or the like may be provided. In various embodiments, a user creating a clip 107A-C may be able to forward an email or other message to one or more other users that includes a link (e.g., a uniform resource locator (URL)) or other identifier for the particular clip 107A-C.


To create a clip 107A-C from a received media stream 125, then, media player 102 suitably obtains one or more clip rules associated with the particular program or network being viewed from CRS server 104. The rule may include, for example, a maximum clip length allowed for the particular program or network. The media player 102 then allows the viewer to create a clip that is within the constraints of the received rules for the particular program, and to upload the clip 107A-C to server 106 for distribution to others within the further constraints of the rules established by the content owner. By providing rules associated with the particular program or network that is the source material for the clip 107, the rights of the content owner can be protected while still allowing significant flexibility in creating and distributing media clips 107A-C. Additional detail about an exemplary process for creating a clip 107A-C is described with reference to FIG. 2, and other embodiments may be modified and/or enhanced in any number of ways.


With reference now to FIG. 2, an exemplary process 200 for creating a video clip 107 suitably involves gathering information about the media stream (step 210), obtaining rules associated with the program or network from CRS server 104 based upon the gathered information, and creating and distributing the clip 107 within the constraints of the received rules.


Content 125 may be received at media player 102 in any manner. In the exemplary embodiment shown in FIG. 2, content is received at media player 102 from a remotely-located placeshifting device 112, as described above. In such embodiments, content is initially received in response to a request 202 that is placed from the player 102 to placeshifting device 112 (e.g., over network 110). Placeshifting device 112 provides infrared or other instructions 204 to a controlled device 115 (e.g., a DVR or receiver) to obtain the requested content. This content is provided as an audio/visual signal 206 to placeshifting device 112, which in turn transcodes or otherwise processes the received data to create a media stream 125 that can be provided 208 to media player 102 as desired. Other embodiments, however, may obtain media stream 125 from any local or remote source other than placeshifting device 112, as described more fully above. That is, many equivalent embodiments could simply receive media stream 125 from a local receiver (e.g., a satellite, cable or broadcast receiver) as part of a STB or the like, without the need to obtain media stream 125 from a remote source such as devices 112 or 115. In such embodiments, functions 202-208 may not be needed or present, as content 125 is simply received from a broadcast or other locally-received source.


After being received from any local or remote source, the contents of media stream 125 may be temporarily stored in any manner. In various embodiments, media player application 130 (FIG. 1) provides a buffer that allows for temporary storage of media stream 125 to facilitate pausing, rewinding, fast forwarding (within the limits of available content) and other features as appropriate. Generally speaking, the contents of the buffer are used to create any media clips 107A-C associated with the received stream 125. This buffer is generally cleared 212 upon initialization of media player application 130 and/or upon a change 210 in media stream 125. The change may be, for example, switching to a different media source, changing to a different channel or network, selecting a different program from a DVR or other recording device, or the like.
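
The buffering behavior just described might be organized roughly as in the sketch below. The class, its methods and its capacity handling are hypothetical and greatly simplified; they are not taken from the described embodiments.

class TimeShiftBuffer:
    """Hypothetical time-shift buffer holding decoded content for the current stream."""

    def __init__(self, capacity_seconds: int = 3600):
        self.capacity_seconds = capacity_seconds
        self.frames = []            # (timestamp_seconds, frame_data) in arrival order
        self.stream_id = None       # identifies the source/channel/program currently buffered

    def on_stream_change(self, new_stream_id):
        """Clear the buffer (212) when the source, channel or selected recording changes (210)."""
        if new_stream_id != self.stream_id:
            self.frames.clear()
            self.stream_id = new_stream_id

    def append(self, timestamp_seconds, frame_data):
        self.frames.append((timestamp_seconds, frame_data))
        # Drop the oldest content once the buffer exceeds its capacity.
        while self.frames and timestamp_seconds - self.frames[0][0] > self.capacity_seconds:
            self.frames.pop(0)

    def extract(self, start_seconds, end_seconds):
        """Return the buffered content between a start point and an end point."""
        return [f for t, f in self.frames if start_seconds <= t <= end_seconds]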


After a startup or change in media stream 125, media player 102 identifies the content of the media stream 125 in any manner (step 221 in FIG. 2). In some embodiments, identifying the content may be as simple as retrieving program, channel and/or network identification information from an electronic program guide (EPG) associated with media player 102, placeshifting device 112 and/or controlled device 115. Various EPG implementations are populated with data from any source (e.g., Tribune Media Services (TMS) of Chicago, Ill.) that may include identification information (e.g., program title, network name, channel name or number, identification codes or the like) in a textual or other convenient format that can be used to identify the particular program, channel or network. If EPG data is available to media player 102, then, information about the media stream 125 may be readily ascertained from such data.


Embodiments and situations may arise, however, in which media player 102 does not have current EPG information or is otherwise not able to directly obtain information about the content of media stream 125. In such embodiments, it may be desirable to generate an image within stream 125 itself that includes identifying information. This may be accomplished, for example, by generating a program guide window within the received stream 125. In the exemplary placeshifting embodiment shown in FIG. 2, media player 102 transmits an EPG request message 214 to placeshifting device 112, which in turn instructs the controlled device 115 to generate the EPG image. Placeshifting device 112 may transmit such an instruction 216 via an infrared blaster, for example, or any other link used to control device 115. In such embodiments, device 115 appropriately displays a program guide window 218 that can be transcoded or otherwise provided 220 to player 102 as part of media stream 125.


Media player 102 may then process the received EPG imagery in any manner to extract textual or other information about the media stream 125. In various embodiments, media player 102 implements an optical character recognition (OCR) feature that recognizes alphanumeric or other characters contained within the EPG imagery. This OCR feature may be implemented locally at media player 102 in some embodiments, or may be performed by a remotely-located server, such as any OCR server associated with data center system 103. In various embodiments, imagery from media stream 125 may be transmitted to CRS server 104 (or another “what is on” (WIO) server, if available) for identification of the program currently being viewed, as described more fully below.
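
One possible organization of this identification step is sketched below; the three helper callables (epg_lookup, request_banner_image, ocr_service) are hypothetical stand-ins for the features described above, and no particular EPG source or OCR library is implied.

def identify_program(epg_lookup, request_banner_image, ocr_service):
    """Sketch of the identification step: prefer EPG data, fall back to OCR of a
    program-guide image generated within the stream."""
    info = epg_lookup()                      # e.g., {"title": ..., "network": ..., "channel": ...}
    if info:
        return info

    # No usable EPG data: ask that a guide banner be drawn into the stream,
    # capture the resulting image, and extract the text from it.
    banner_image = request_banner_image()    # image of the EPG window carried in stream 125
    text = ocr_service(banner_image)         # OCR performed locally or at a remote server
    # Very loose parse; a real parser would depend on the banner layout.
    return {"raw_text": text}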


Media player 102 suitably transmits 222 any available identifying information about media stream 125 to CRS server 104 to obtain a response 226 that includes any clipping rules that may be available for the particular program, channel or network being viewed. Information provided in request 222 may include EPG information, if available, and/or EPG imagery as appropriate. Other embodiments may provide any other information (e.g., channel number) that can be used to determine the currently-viewed program from other information that may be known to CRS server 104 (e.g., physical location of media player 102, date and time, etc.).


CRS server 104 processes query 222 in any manner (step 224). If the query does not already contain identifying information about the program, CRS server 104 may appropriately identify the program, channel and/or network of the currently-received media stream 125 in any manner from any sort of available information. Any rules associated with the identified program can then be retrieved from database 105, as described above, and returned to media player 102 as response 226. In various embodiments, response 226 includes at least an indication as to whether clipping is allowed for the identified program, and may also include the maximum length for any allowed clips. In some embodiments, response 226 may indicate that clipping is not available by simply indicating a maximum clip length of zero (or another pre-determined value). Response 226 may also include any additional rules relating to distribution or display of clips 107A-C made from the identified program, as desired. In various embodiments, response 226 may also include rules associated with programs that may be reasonably expected to follow the current program in the media stream (e.g., rules for the next program being broadcast on the same channel or network as the current program), as desired.
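
To make the exchange concrete, request 222 and response 226 might carry content along the following lines. The field names, structure and values are purely illustrative assumptions rather than a defined message format.

import json

# Hypothetical contents of request 222 sent from the media player.
request_222 = {
    "program_title": "Example Program",    # from EPG data, if available
    "network": "Example Network",
    "channel": 7,
    "epg_image": None,                     # or captured banner imagery when EPG data is missing
    "player_location": "US/Pacific",       # other context the server might use to resolve the program
    "local_time": "2024-01-01T20:15:00",
}

# Hypothetical contents of response 226 returned by the CRS server.
response_226 = {
    "clipping_allowed": True,
    "max_clip_seconds": 120,               # zero (or another pre-determined value) could mean clipping is disallowed
    "distribution": {"share_audience": "members", "allow_download": False},
    "display": {"available_after": "2024-01-02T02:00:00Z"},
    "next_program": {                      # optional rules for the program expected to follow
        "program_title": "Following Program",
        "clipping_allowed": False,
    },
}

print(json.dumps(response_226, indent=2))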


Upon receipt of response 226, media player 102 may enable clipping 228 as appropriate. Various techniques for processing a video clip, including determining a suitable beginning and end (step 230) for the clip, are described in conjunction with FIGS. 4-6 below. Generally speaking, the beginning and end of the clip are maintained within the confines of a current program in most embodiments (e.g., clips may not typically contain content from more than one program), and the beginning and end are constrained to remain within the allotted maximum clip length for the generated clip 107A-C.
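
The constraint on the clip's beginning and end could be enforced roughly as in the following sketch. The function and its argument names are assumptions made for illustration, not the embodiment's actual algorithm.

def constrain_clip(start, end, program_start, program_end, max_clip_seconds):
    """Keep a requested clip inside the current program and within the allowed length.

    All arguments are offsets in seconds within the time-shift buffer.
    """
    # Keep the clip within the confines of the current program.
    start = max(start, program_start)
    end = min(end, program_end)
    if end <= start:
        raise ValueError("empty clip after constraining to the program boundaries")

    # Keep the clip within the maximum clip length by pulling the end point back.
    if end - start > max_clip_seconds:
        end = start + max_clip_seconds
    return start, end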


Clip 107 may be created in any manner (step 231). In various embodiments, decoded content from media stream 125 that is present within the buffer is simply encapsulated within appropriate framing to allow subsequent processing and/or transport across network 110. In other embodiments, the decoded content may be further compressed, transcoded or otherwise processed to place the clip 107 into a format that is more suitable for transport or viewing over network 110. Further, in various embodiments, at least some of the rules received in response 226 may be included in metadata or other formatting associated with clip 107. Such metadata may also include any identifying information about the clip, including any EPG information, start/end times, identifying information received in response 226, and/or any other information as appropriate.
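
Packaging the clip together with its metadata might look roughly like the sketch below; the container layout, key names and the assumption that the buffer exposes an extract() call are all illustrative, and 'rules' stands for whatever subset of response 226 should travel with the clip for later enforcement.

def build_clip(buffer, start, end, program_info, rules):
    """Hypothetical packaging of a clip 107: buffered media plus identifying metadata."""
    media_segment = buffer.extract(start, end)
    metadata = {
        "program": program_info,            # e.g., title, network, channel from the EPG
        "start_offset": start,
        "end_offset": end,
        "rules": rules,                     # distribution/display rules carried with the clip
    }
    return {"metadata": metadata, "media": media_segment}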


The created clip 107 is then transported 232 to distribution server 106 in any appropriate manner. Clip 107 may be transported using a conventional file transfer protocol, for example, or any other protocols as desired. The clip is received and processed at distribution server 106 to perform any additional transcoding, compression or other modifications that may be desired, and to ensure compliance with the remaining rules associated with the source material for clip 107. As noted above, the rules for distribution, display and/or the like may be obtained from metadata associated with the clip itself, or may be retrieved from database 105 using any available identifying information about media clip 107. The received, formatted and processed clip 107 may then be provided to other users (e.g., users on network 110) within the constraints of any rules associated with the clip or its content source.


Turning now to FIG. 3, an exemplary process 300 that may be executed at CRS server 104 and/or database 105 suitably involves the broad steps of receiving a request 222 containing information about a particular program being viewed by a remotely-located media player 102 (step 302), identifying the program being viewed by the remotely-located media player from the information (step 304), obtaining one or more rules corresponding to the identified particular program from the database 105 (steps 306-314), and providing a response 226 to the remotely-located media player that includes the rules corresponding to the identified program (step 316). Generally speaking, each of the method steps shown in FIG. 3 may be implemented in software or firmware that may be stored in memory, mass storage or any other storage medium available to the executing device, and that may be executed on any processor or control circuitry associated with the executing device. The particular logic shown in FIG. 3 may be modified or enhanced in any manner, and any other components, systems, logic or devices may be involved in various other embodiments as appropriate.


As noted above, process 300 begins with CRS server 104 and/or database 105 receiving request 222 from a remotely-located media player 102 (step 302). The request may be received in any format (e.g., a TCP or UDP format) over network 110, or any other communications link. Request 222 will typically contain some sort of identifying information about the media stream 125, such as information obtained from an EPG, EPG image or other source as described more fully above.


The program currently processed by media player 102 may be identified in any manner (step 304). Generally speaking, step 304 involves identifying a show name and network in any manner. Such information may be obtained, for example, from media player 102 itself, from the media source, from a separate WIO Server, and/or any other information source as desired. In various embodiments, the information received in request 222 may be sufficient to identify the program itself with little or no additional processing. Such embodiments may provide, for example, TMS program and/or station identification codes that can be readily correlated to a show name and/or network name, respectively. In other embodiments, OCR processing may be obtained (or performed directly) to extract textual information that can be used to identify the program, as described more fully above. Various other embodiments may identify the program and/or network being viewed in any manner.


The obtained “show name” and “network” (or channel) information may then be used to obtain desired clipping rules in any manner. In many embodiments, CRS server 104 submits a query to database 105 to determine whether the show name and/or network correspond to any valid entries that include clipping rules (steps 306, 308, 310). In general, if the show name and/or network can be recognized in database 105, then clipping rules allowing (step 312) or denying clipping (step 314) can be obtained. In many embodiments, clipping will not be allowed if the program or network cannot be identified, in order to conservatively protect the rights of content owners. Other embodiments could be readily configured to allow clipping by default, however, or to allow clips of a restricted length, quality or the like. As noted above, clipping rules may be created for particular programs (step 306), networks (308), channels (not shown) and/or combinations of program name/channel/network (step 310) to allow for flexibility in applying different clipping rules in different programming situations.
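
The lookup just described might resolve rules along the following lines, with a conservative default-deny result when nothing matches. The data layout and the particular fallback ordering shown are assumptions made for illustration only.

def lookup_rules(rules_db: dict, show_name: str, network: str):
    """Return clipping rules for the identified program, trying the most specific
    key first and denying clipping when no entry is recognized."""
    for key in (
        (show_name, network),   # combined program name / network entry
        (show_name, None),      # program-only entry
        (None, network),        # network- or channel-wide entry
    ):
        if key in rules_db:
            return rules_db[key]
    # No recognizable entry: disallow clipping to conservatively protect the content owner.
    return {"clipping_allowed": False, "max_clip_seconds": 0}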


After any clipping rules are obtained, a response 226 is provided to media player 102 as appropriate (step 316). As noted above, response 226 may include any number of rules, including any rules relating to clip distribution and display. Clipping rules may be changed at any time by updating the CRS server 104 and/or the database 105, as desired.



FIGS. 4-6 are views of an exemplary interface 400 for a media player application 130. Referring now to FIGS. 4-6, an exemplary media player interface 400 suitably includes a window 402 for playing the received media stream 125, and a time shift buffer indicator 413 that indicates the relative position of the current imagery being displayed in window 402 within the buffer. In various embodiments, buffer indicator 413 provides a line 412 that extends across a portion of indicator 413 to show the relative fullness of the buffer. Interface 400 shown in FIGS. 4-6 also includes a play button 416 to play or pause playback of video stream 125 in window 402 as appropriate. Additional interface features such as a slider control 414 may be provided to allow forward/reverse movement in time within video stream 125 along the buffer indicator 413; equivalent embodiments may provide “forward” and/or “reverse” buttons in place of slider 414 as desired.


As the program is received via video stream 125, line 412 extends to fill buffer indicator 413 as appropriate. In the embodiment shown in FIGS. 4-6, line 412 progressively fills from left to right, although other embodiments may be spatially arranged in any other manner. In some embodiments, imagery may be presented “live” as it is received in window 402 until directed otherwise by the viewer. The viewer may pause viewing (by depressing button 416, for example) or rewind viewing by moving slider 414 to the left along line 412 as shown in FIGS. 4-6. The viewer may subsequently move forward to the “live” stream as it is received by moving slider 414 to the right (in this example), or may take any other actions as desired.


The viewer may change the content presented in window 402 in any manner. In various embodiments, content is changed by changing the received media stream 125. This may be accomplished, for example, by changing the channel on a tuner (e.g., a tuner located internal or external to media player 102), by directing a remotely-located placeshifting device 112 (FIG. 1) to change the stream 125, or in any other manner. In various embodiments, an EPG feature 406 is provided as part of media player application 130. One example of an EPG feature 406 suitable for use with either a local tuner or a remote placeshifting device 112 is shown in FIGS. 4-5. As shown in the figure, EPG 406 provides information about various programs, channels and/or networks that may be obtained from any source (e.g., an external provider), and that may be stored in any location (e.g., locally within player device 102, or retrieved from a source on network 110 or elsewhere). EPG feature 406 may also provide information 404 (e.g., title, channel and/or other information) about the program currently being viewed, as desired. EPG 406 in FIGS. 4-5 allows users to graphically interact with the guide (e.g., using a cursor control device, remote control or the like) to select programs for viewing.


In the exemplary embodiment shown in FIGS. 4-5, certain programs in EPG 406 are identified as being eligible for clipping by an icon 408. Clip eligibility 408 (and any rules associated with clip eligibility) may be determined by information provided with the EPG data, and/or may be obtained by queries submitted to the CRS server (e.g., queries 222 to server 104) at a later time. Other embodiments may indicate clip eligibility using any other icon or feature; clip-eligible programs, channels or networks may be differently highlighted, shaded or colored within EPG 406, for example, or clip-eligibility may not be indicated in program guide 406 at all in some embodiments.


For programs that are clip-eligible, an icon, button or other feature 410 may be provided in interface 400 that allows the viewer to create a clip of the viewed program. When the viewer activates feature 410, interface features for identifying the beginning and end of the clip may be presented. Sliders 502 and 504 in FIG. 5, for example, show one technique for establishing the beginning and end of the clip 107. In various embodiments, player application 130 locks the sliders 502 and 504 (or other indicators) together so that they cannot be extended beyond the maximum allowable clip length, as determined in rules received for the particular program. Further, the current length of the clip and/or the maximum clip length may be displayed, e.g., as feature 506, to further assist the user in complying with the clipping rules. If the user attempts to extend the clip beyond its limits, then, media player application 130 will not allow the sliders 502, 504 (or other features) to extend beyond the limit or to otherwise allow clipping that violates one or more rules associated with the program, channel or network.


Interface 400 may reflect further limits upon clip creation, as desired. Referring now to FIG. 6, if multiple programs are stored within the time shift buffer, buffer indicator 413 may provide a divider 608 or other indication of the breaks between programs 602, 604, 606. Because each program may have its own set of clipping rules that may (or may not) be compatible with the other programs within the buffer, media player application 130 stores the clipping rules for each particular program, and enforces the rules for each program separately. To that end, clips that span dividers 608 may be disallowed, and each program 602, 604, 606 may have its own maximum allowed clip length or other parameters as appropriate.
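
Enforcing per-program limits within a multi-program buffer could be sketched as follows. The segment bookkeeping and the tuple layout are illustrative assumptions rather than the described implementation.

def clip_allowed_within_segment(segments, start, end):
    """Check that a requested clip stays within a single program segment and respects
    that segment's own limit. 'segments' is an illustrative list of
    (segment_start, segment_end, max_clip_seconds) tuples for programs in the buffer."""
    for seg_start, seg_end, max_clip_seconds in segments:
        if seg_start <= start and end <= seg_end:
            # Clip lies entirely within one program: apply that program's own limit.
            return (end - start) <= max_clip_seconds
    # The requested range spans a divider between programs (or falls outside the
    # buffer), so the clip is disallowed.
    return False

# Example: three buffered programs 602, 604, 606 with different limits (in seconds).
segments = [(0, 1800, 120), (1800, 3600, 0), (3600, 5400, 300)]
print(clip_allowed_within_segment(segments, 100, 200))    # True: within the first program, under 120 s
print(clip_allowed_within_segment(segments, 1700, 1900))  # False: spans the divider at 1800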


Various techniques, systems and devices have therefore been described for creating variable length clips from a received media stream. As noted at the outset, rules can be defined for particular programs, channels and/or networks, and these rules can be provided to a media player so that clipping rules are enforced when the clip is created. Rules may also apply to clip distribution and/or display; such rules may be provided to the distribution server with the clip itself in various embodiments. By enforcing a maximum clip length that is specific to the program, channel or network, the rights of the content owner can be protected while still allowing flexibility in creating clips.


As used herein, the word “exemplary” means “serving as an example, instance, or illustration”. “Exemplary” embodiments are not intended as models to be literally duplicated, but rather as examples that provide instances of embodiments that may be modified or altered in any way to create other embodiments. Any implementation described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other implementations.


While the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing various embodiments of the invention, it should be appreciated that the particular embodiments described above are only examples, and are not intended to limit the scope, applicability, or configuration of the invention in any way. Various changes may be made in the function and arrangement of elements described without departing from the scope of the invention and its legal equivalents.

Claims
  • 1. A method executable by a media player device to create a variable length media clip from a media stream received via a network, the method comprising: receiving the media stream via the network by the media player device, wherein the media stream encodes a particular program for presentation to a viewer; decoding the media stream by the media player device; storing the decoded media stream in a buffer of the media player device; automatically identifying, by the media player device, a title of the particular program encoded by the media stream; transmitting identifying information about the particular program comprising the title of the particular program from the media player device via the network to a remotely-located first computer server, the remotely-located first computer server having access to a database of maximum clip lengths for each of a plurality of different programs; receiving, by the media player device, a maximum clip length for the particular program that was automatically identified by the media player device from the remotely-located first computer server having access to the database of maximum clip lengths; after receiving the maximum clip length for the particular program from the remotely-located first computer server, the media player device extracting a portion of the decoded media stream from the buffer to thereby create the variable length clip of the particular program, wherein the extracted portion of the media stream has a start point and an end point that are based upon inputs received by the media player device from the viewer; enforcing the length of the variable length media clip by the media player device to be less than the maximum clip length for the particular program that is received from the remotely-located first computer server; and transmitting the extracted clip from the media player device via the network for redistribution of the clip on the network by a remotely-located second computer server.
  • 2. The method of claim 1 wherein the automatic identifying comprises the media player device identifying the particular program encoded by the media stream without input from the viewer.
  • 3. The method of claim 2 wherein the extracting is performed only after the maximum clip length is obtained from the remotely-located first computer server.
  • 4. The method of claim 2 wherein the information about the particular program comprises a station identification and a program identification.
  • 5. The method of claim 4 further comprising the media player device determining the station identification and the program identification from the media stream itself.
  • 6. The method of claim 2 wherein the identifying comprises generating an informational banner in the media stream, capturing an image of the informational banner, and performing an optical character recognition of the image to thereby extract information about the media stream.
  • 7. The method of claim 6 wherein the generating comprises directing a remotely-located placeshifting device that provides the media stream to generate the banner from a controlled device so that the banner is visible within the media stream.
  • 8. The method of claim 6 wherein the optical character recognition is performed by the remotely-located first computer server.
  • 9. The method of claim 6 wherein the optical character recognition is performed by the media player device.
  • 10. The method of claim 2 wherein the program is identified from information contained in an electronic program guide associated with the media player device.
  • 11. The method of claim 1 wherein the obtaining comprises receiving a plurality of rules associated with the identified program from the remotely-located first computer server.
  • 12. The method of claim 11 wherein the media clip provided to the remote server comprises metadata describing at least some of the plurality of rules associated with the identified program.
  • 13. The method of claim 1 wherein the receiving of the media stream comprises receiving a place-shifted media stream over a network from a placeshifting device.
  • 14. The method of claim 1 wherein the receiving of the media stream comprises receiving the media stream from a television source.
  • 15. The method of claim 14 wherein the television source is at least one of a satellite television source, a cable television source, and a broadcast television source.
  • 16. A media player device comprising a processor and a memory, wherein the processor is configured to execute instructions stored in the memory to perform a method that creates a variable length media clip from a media stream received via a network, the method comprising: receiving the media stream via the network by the media player device, wherein the media stream encodes a particular program for presentation to a viewer; decoding the media stream by the media player device; storing the decoded media stream in a buffer of the media player device; automatically identifying, by the media player device, a title of the particular program encoded by the media stream; transmitting identifying information about the particular program comprising the title of the particular program from the media player device via the network to a remotely-located first computer server, the remotely-located first computer server having access to a database of maximum clip lengths for each of a plurality of different programs; receiving, by the media player device, a maximum clip length for the particular program that was automatically identified by the media player device from the remotely-located first computer server having access to the database of maximum clip lengths; after receiving the maximum clip length for the particular program from the remotely-located first computer server, the media player device extracting a portion of the decoded media stream from the buffer to thereby create the variable length clip of the particular program, wherein the extracted portion of the media stream has a start point and an end point that are based upon inputs received by the media player device from the viewer; enforcing the length of the variable length media clip by the media player device to be less than the maximum clip length for the particular program that is received from the remotely-located first computer server; and transmitting the extracted clip from the media player device via the network for redistribution of the clip on the network by a remotely-located second computer server.
  • 17. The media player device of claim 16 wherein the automatic identifying comprises the media player device identifying the particular program encoded by the media stream without input from the viewer.
  • 18. The media player device of claim 17 wherein the extracting is performed only after the maximum clip length is obtained from the remotely-located first computer server.
  • 19. The media player device of claim 16 wherein the program is identified from information contained in an electronic program guide associated with the media player device.
  • 20. The media player device of claim 16 wherein the obtaining comprises receiving a plurality of rules associated with the identified program from the remotely-located first computer server.
  • 21. The media player device of claim 20 wherein the media clip provided to the second computer system comprises metadata describing at least some of the plurality of rules associated with the identified program.
  • 22. The media player device of claim 21 wherein the remotely-located second computer system enforces the at least some of the plurality of rules associated with the identified program in redistribution of the clip.
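For illustration only, and not as part of the claims or the disclosed implementation, the clip-creation flow recited in claim 16 above can be sketched in a few lines of Python: buffer a decoded stream, identify the program, obtain a maximum clip length from a rules server, and enforce that maximum when extracting the clip. All class names, values, and the in-memory rules "database" below are hypothetical assumptions; a real media player would query the first server over the network and upload the resulting clip to the second server.

```python
# Minimal, hypothetical sketch of the flow recited in claim 16.
# Names, values, and the in-memory "database" are illustrative assumptions only.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class ClipRules:
    """Rules associated with one program (here only a maximum clip length)."""
    max_clip_seconds: int


class RulesServerStub:
    """Stand-in for the remotely-located first computer server; a real media
    player would issue a network request keyed on the identified program title."""

    _DATABASE = {"Example Program": ClipRules(max_clip_seconds=60)}

    def lookup(self, title: str) -> ClipRules:
        # Unknown programs fall back to a conservative default in this sketch.
        return self._DATABASE.get(title, ClipRules(max_clip_seconds=30))


class MediaPlayer:
    """Toy media player that buffers a decoded stream one segment per second."""

    def __init__(self, rules_server: RulesServerStub) -> None:
        self.rules_server = rules_server
        self.buffer: List[bytes] = []
        self.title: Optional[str] = None

    def receive_and_decode(self, stream_seconds: int, title: str) -> None:
        """Receive, decode, and buffer the stream, and identify the program title
        (e.g., from an electronic program guide)."""
        self.buffer = [b"segment" for _ in range(stream_seconds)]
        self.title = title

    def create_clip(self, start: int, end: int) -> List[bytes]:
        """Create a variable-length clip bounded by the rules for this program."""
        rules = self.rules_server.lookup(self.title or "")   # query the first server
        end = min(end, start + rules.max_clip_seconds)        # enforce maximum length
        return self.buffer[start:end]                         # extract from the buffer


if __name__ == "__main__":
    player = MediaPlayer(RulesServerStub())
    player.receive_and_decode(stream_seconds=600, title="Example Program")
    clip = player.create_clip(start=100, end=300)   # viewer requested 200 seconds
    print(len(clip))                                # prints 60: clamped to the rule
```

Running the sketch prints 60: the viewer requested a 200-second clip, but the player clamps it to the 60-second maximum obtained for the identified program.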
PRIORITY CLAIM

This application is a continuation-in-part of U.S. patent application Ser. No. 11/620,711, which was filed on Jan. 7, 2007. That application claims the benefit of U.S. Provisional Application No. 60/823,066, filed on Aug. 21, 2006 and is also a continuation-in-part of U.S. application Ser. No. 11/147,664, filed on Jun. 7, 2005, which claims the benefit of U.S. Provisional Application No. 60/577,833, filed Jun. 7, 2004. All of these applications are incorporated herein by reference in their entirety. This Application also claims the benefit of U.S. Provisional Application Ser. No. 61/019,232, filed on Jan. 4, 2008, which is incorporated herein by reference in its entirety.

US Referenced Citations (455)
Number Name Date Kind
3416043 Jorgensen Dec 1968 A
4254303 Takizawa Mar 1981 A
5132992 Yurt et al. Jul 1992 A
5161021 Tsai Nov 1992 A
5237420 Hayashi Aug 1993 A
5237648 Mills et al. Aug 1993 A
5293357 Hallenbeck Mar 1994 A
5321846 Yokota et al. Jun 1994 A
5377332 Entwistle et al. Dec 1994 A
5386493 Degen et al. Jan 1995 A
5434590 Dinwiddie, Jr. et al. Jul 1995 A
5434678 Abecassis Jul 1995 A
5438423 Lynch et al. Aug 1995 A
5493638 Hooper et al. Feb 1996 A
5537530 Edgar et al. Jul 1996 A
5541852 Eyuboglu et al. Jul 1996 A
5602589 Vishwanath et al. Feb 1997 A
5610653 Abecassis Mar 1997 A
5623312 Yan et al. Apr 1997 A
5661516 Carles Aug 1997 A
5666426 Helms Sep 1997 A
5682195 Hendricks et al. Oct 1997 A
5684918 Abecassis Nov 1997 A
5696869 Abecassis Dec 1997 A
5706290 Shaw et al. Jan 1998 A
5708961 Hylton et al. Jan 1998 A
5710605 Nelson Jan 1998 A
5722041 Freadman Feb 1998 A
5757416 Birch et al. May 1998 A
5774170 Hite et al. Jun 1998 A
5778077 Davidson Jul 1998 A
5794116 Matsuda et al. Aug 1998 A
5822537 Katseff et al. Oct 1998 A
5831664 Wharton et al. Nov 1998 A
5850482 Meany et al. Dec 1998 A
5852437 Wugofski et al. Dec 1998 A
5870144 Guerrera Feb 1999 A
5870146 Zhu Feb 1999 A
5880721 Yen Mar 1999 A
5892536 Logan et al. Apr 1999 A
5898679 Brederveld et al. Apr 1999 A
5909518 Chui Jun 1999 A
5911582 Redford et al. Jun 1999 A
5922072 Hutchinson et al. Jul 1999 A
5936968 Lyons Aug 1999 A
5953485 Abecassis Sep 1999 A
5968132 Tokunaga Oct 1999 A
5969764 Sun et al. Oct 1999 A
5987501 Hamilton et al. Nov 1999 A
6002450 Darbee et al. Dec 1999 A
6008777 Yiu Dec 1999 A
6014694 Aharoni et al. Jan 2000 A
6020880 Naimpally Feb 2000 A
6023233 Craven et al. Feb 2000 A
6031940 Chui et al. Feb 2000 A
6036601 Heckel Mar 2000 A
6040829 Croy et al. Mar 2000 A
6043837 Driscoll, Jr. et al. Mar 2000 A
6044396 Adams Mar 2000 A
6049671 Slivka et al. Apr 2000 A
6067118 Chen et al. May 2000 A
6075906 Fenwick et al. Jun 2000 A
6088455 Logan et al. Jul 2000 A
6088777 Sorber Jul 2000 A
6091886 Abecassis Jul 2000 A
6097441 Allport Aug 2000 A
6104334 Allport Aug 2000 A
6108041 Faroudja et al. Aug 2000 A
6115420 Wang Sep 2000 A
6117126 Appelbaum et al. Sep 2000 A
6141059 Boyce et al. Oct 2000 A
6141447 Linzer et al. Oct 2000 A
6151444 Abecassis Nov 2000 A
6157674 Oda et al. Dec 2000 A
6160544 Hayashi et al. Dec 2000 A
6201536 Hendricks et al. Mar 2001 B1
6208805 Abecassis Mar 2001 B1
6212282 Mershon Apr 2001 B1
6222885 Chaddha et al. Apr 2001 B1
6223211 Hamilton et al. Apr 2001 B1
6240459 Roberts et al. May 2001 B1
6240531 Spilo et al. May 2001 B1
6243596 Kikinis Jun 2001 B1
6256019 Allport Jul 2001 B1
6263503 Margulis Jul 2001 B1
6279029 Sampat et al. Aug 2001 B1
6282548 Bumer et al. Aug 2001 B1
6282714 Ghori et al. Aug 2001 B1
6286142 Ehreth Sep 2001 B1
6289485 Shiomoto Sep 2001 B1
6310886 Barton Oct 2001 B1
6340994 Margulis et al. Jan 2002 B1
6353885 Herzi et al. Mar 2002 B1
6356945 Shaw et al. Mar 2002 B1
6357021 Kitigawa et al. Mar 2002 B1
6359902 Putzolu Mar 2002 B1
6370688 Hejna, Jr. Apr 2002 B1
6389467 Eyal May 2002 B1
6408128 Abecassis Jun 2002 B1
6421429 Merritt et al. Jul 2002 B1
6434113 Gubbi Aug 2002 B1
6442067 Chawla et al. Aug 2002 B1
6456340 Margulis Sep 2002 B1
6456781 Rijckaert Sep 2002 B1
6466623 Youn et al. Oct 2002 B1
6470378 Tracton et al. Oct 2002 B1
6476826 Plotkin et al. Nov 2002 B1
6486892 Stern Nov 2002 B1
6487319 Chai Nov 2002 B1
6493874 Humpleman Dec 2002 B2
6496122 Sampsell Dec 2002 B2
6505160 Levy et al. Jan 2003 B1
6505169 Bhagavath et al. Jan 2003 B1
6510177 De Bonet et al. Jan 2003 B1
6529506 Yamamoto et al. Mar 2003 B1
6553147 Chai et al. Apr 2003 B2
6553178 Abecassis Apr 2003 B2
6557031 Mimura et al. Apr 2003 B1
6564004 Kadono May 2003 B1
6567984 Allport May 2003 B1
6584201 Konstantinou et al. Jun 2003 B1
6584559 Huh et al. Jun 2003 B1
6597375 Yawitz Jul 2003 B1
6598159 McAlister et al. Jul 2003 B1
6600838 Chui Jul 2003 B2
6609253 Swix et al. Aug 2003 B1
6611530 Apostolopoulos Aug 2003 B1
6628716 Tan et al. Sep 2003 B1
6642939 Vallone et al. Nov 2003 B1
6647015 Malkemes et al. Nov 2003 B2
6658019 Chen et al. Dec 2003 B1
6665751 Chen et al. Dec 2003 B1
6665813 Forsman et al. Dec 2003 B1
6697356 Kretschmer et al. Feb 2004 B1
6701380 Schneider et al. Mar 2004 B2
6704678 Minke et al. Mar 2004 B2
6704847 Six et al. Mar 2004 B1
6708231 Kitagawa Mar 2004 B1
6718551 Swix et al. Apr 2004 B1
6754266 Bahl et al. Jun 2004 B2
6754439 Hensley et al. Jun 2004 B1
6757277 Shaffer et al. Jun 2004 B1
6757851 Park et al. Jun 2004 B1
6757906 Look et al. Jun 2004 B1
6766376 Price Jul 2004 B2
6768775 Wen et al. Jul 2004 B1
6771828 Malvar Aug 2004 B1
6774912 Ahmed et al. Aug 2004 B1
6781601 Cheung Aug 2004 B2
6785700 Masud et al. Aug 2004 B2
6788882 Geer et al. Sep 2004 B1
6795638 Skelley, Jr. Sep 2004 B1
6798838 Ngo Sep 2004 B1
6806909 Radha et al. Oct 2004 B1
6807308 Chui et al. Oct 2004 B2
6816194 Zhang et al. Nov 2004 B2
6816858 Coden et al. Nov 2004 B1
6826242 Ojard et al. Nov 2004 B2
6834123 Acharya et al. Dec 2004 B2
6839079 Barlow et al. Jan 2005 B2
6847468 Ferriere Jan 2005 B2
6850571 Tardif Feb 2005 B2
6850649 Malvar Feb 2005 B1
6868083 Apostolopoulos et al. Mar 2005 B2
6889385 Rakib et al. May 2005 B1
6892359 Nason et al. May 2005 B1
6898583 Rising, III May 2005 B1
6907602 Tsai et al. Jun 2005 B2
6910175 Krishnamachari Jun 2005 B2
6927685 Wathen Aug 2005 B2
6930661 Uchida et al. Aug 2005 B2
6931451 Logan et al. Aug 2005 B1
6941575 Allen Sep 2005 B2
6944880 Allen Sep 2005 B1
6952595 Ikedo et al. Oct 2005 B2
6970602 Smith et al. Nov 2005 B1
6981050 Tobias et al. Dec 2005 B1
6985932 Glaser et al. Jan 2006 B1
7016337 Wu et al. Mar 2006 B1
7020892 Levesque et al. Mar 2006 B2
7032000 Tripp Apr 2006 B2
7047305 Brooks et al. May 2006 B1
7055166 Logan et al. May 2006 B1
7058376 Logan et al. Jun 2006 B2
7099951 Laksono Aug 2006 B2
7103905 Novak Sep 2006 B2
7110558 Elliott Sep 2006 B1
7124366 Foreman et al. Oct 2006 B2
7127507 Clark et al. Oct 2006 B1
7151575 Landry et al. Dec 2006 B1
7155734 Shimomura et al. Dec 2006 B1
7155735 Ngo et al. Dec 2006 B1
7181756 Zigmond et al. Feb 2007 B1
7184433 Oz Feb 2007 B1
7224323 Uchida et al. May 2007 B2
7239800 Bilbrey Jul 2007 B2
7317759 Turaga et al. Jan 2008 B1
7344084 Dacosta Mar 2008 B2
7382729 Honda et al. Jun 2008 B2
7430360 Abecassis Sep 2008 B2
7430686 Wang et al. Sep 2008 B1
7464396 Hejna, Jr. Dec 2008 B2
7478164 Lango et al. Jan 2009 B1
7478166 Agnoli et al. Jan 2009 B2
7502733 Andrsen et al. Mar 2009 B2
7505480 Zhang et al. Mar 2009 B1
7516136 Lee et al. Apr 2009 B2
7565681 Ngo et al. Jul 2009 B2
7577978 Wistendahl et al. Aug 2009 B1
7583676 Shobatake Sep 2009 B2
7594218 Lozben Sep 2009 B1
7616955 Kim Nov 2009 B2
7661121 Smith et al. Feb 2010 B2
7676590 Silverman et al. Mar 2010 B2
7720432 Colby et al. May 2010 B1
7721300 Tipton et al. May 2010 B2
7725912 Margulis May 2010 B2
7778372 Takashima Aug 2010 B2
7788696 Burges et al. Aug 2010 B2
7895275 Evans et al. Feb 2011 B1
7913278 Ellis et al. Mar 2011 B2
7917932 Krikorian Mar 2011 B2
7921150 Schwartz Apr 2011 B1
7921446 Krikorian et al. Apr 2011 B2
7945688 Lango et al. May 2011 B1
7975062 Krikorian et al. Jul 2011 B2
7992176 Margulis Aug 2011 B2
8060906 Begeja et al. Nov 2011 B2
8250605 Opaluch Aug 2012 B2
8266657 Margulis Sep 2012 B2
8352991 Cahnbley et al. Jan 2013 B2
8365236 Krikorian et al. Jan 2013 B2
8578408 Tom et al. Nov 2013 B2
8621533 Krikorian et al. Dec 2013 B2
8799969 Krikorian et al. Aug 2014 B2
8819750 Krikorian et al. Aug 2014 B2
8892681 Basso et al. Nov 2014 B2
8904455 Krikorian Dec 2014 B2
9106723 Krikorian et al. Aug 2015 B2
9106964 Zhao Aug 2015 B2
9143825 Chittella Sep 2015 B2
9237300 Krikorian Jan 2016 B2
9270515 Postelnicu et al. Feb 2016 B1
9491523 Margulis Nov 2016 B2
9584757 Margulis Feb 2017 B2
9628849 Asnis et al. Apr 2017 B2
9654532 Strigeus et al. May 2017 B2
9674257 Eyler et al. Jun 2017 B2
9739634 Trivedi Aug 2017 B2
20010004733 Eldering Jun 2001 A1
20010007097 Kim Jul 2001 A1
20010021998 Margulis Sep 2001 A1
20020004839 Wine et al. Jan 2002 A1
20020010925 Kikinis Jan 2002 A1
20020012530 Bruls Jan 2002 A1
20020019984 Rakib Feb 2002 A1
20020031333 Mano et al. Mar 2002 A1
20020032907 Daniels Mar 2002 A1
20020046404 Mizutani Apr 2002 A1
20020053053 Nagai et al. May 2002 A1
20020059591 Nakagawa May 2002 A1
20020080753 Lee Jun 2002 A1
20020083009 Lansing et al. Jun 2002 A1
20020090029 Kim Jul 2002 A1
20020105529 Bowser et al. Aug 2002 A1
20020112247 Horner et al. Aug 2002 A1
20020120925 Logan Aug 2002 A1
20020122137 Chen et al. Sep 2002 A1
20020131497 Jang Sep 2002 A1
20020138843 Samaan et al. Sep 2002 A1
20020143972 Christopoulos et al. Oct 2002 A1
20020143973 Price Oct 2002 A1
20020147634 Jacoby et al. Oct 2002 A1
20020147687 Breiter et al. Oct 2002 A1
20020151992 Hoffberg et al. Oct 2002 A1
20020154892 Hoshen et al. Oct 2002 A1
20020162118 Levy et al. Oct 2002 A1
20020167458 Baudisch et al. Nov 2002 A1
20020188818 Nimura et al. Dec 2002 A1
20020191575 Kalavade et al. Dec 2002 A1
20020196853 Liang et al. Dec 2002 A1
20030001016 Fraier et al. Jan 2003 A1
20030001846 Davis et al. Jan 2003 A1
20030001880 Holtz et al. Jan 2003 A1
20030027556 Kobayashi Feb 2003 A1
20030028643 Jabri Feb 2003 A1
20030028873 Lemmons Feb 2003 A1
20030030752 Begeja et al. Feb 2003 A1
20030065915 Yu et al. Apr 2003 A1
20030070167 Holtz et al. Apr 2003 A1
20030078973 Przekop et al. Apr 2003 A1
20030088686 Jennings May 2003 A1
20030088687 Begeja et al. May 2003 A1
20030091189 Rhoads May 2003 A1
20030093260 Dagtas et al. May 2003 A1
20030093790 Logan et al. May 2003 A1
20030095791 Barton et al. May 2003 A1
20030105831 O'Kane Jun 2003 A1
20030115167 Sharif et al. Jun 2003 A1
20030159143 Chan Aug 2003 A1
20030187657 Erhart et al. Oct 2003 A1
20030192054 Birks et al. Oct 2003 A1
20030208612 Harris et al. Nov 2003 A1
20030220905 Amado et al. Nov 2003 A1
20030229900 Reisman Dec 2003 A1
20030231621 Gubbi et al. Dec 2003 A1
20030231655 Kelton et al. Dec 2003 A1
20030231868 Herley Dec 2003 A1
20030234803 Toyama et al. Dec 2003 A1
20040003406 Billmaier Jan 2004 A1
20040049791 Shah et al. Mar 2004 A1
20040052216 Roh Mar 2004 A1
20040068334 Tsai et al. Apr 2004 A1
20040068536 Demers et al. Apr 2004 A1
20040083301 Murase et al. Apr 2004 A1
20040098748 Bo et al. May 2004 A1
20040100486 Flamini et al. May 2004 A1
20040101271 Boston et al. May 2004 A1
20040103340 Sundareson et al. May 2004 A1
20040139047 Rechsteiner et al. Jul 2004 A1
20040153951 Walker et al. Aug 2004 A1
20040162845 Kim et al. Aug 2004 A1
20040162903 Oh Aug 2004 A1
20040172410 Shimojima et al. Sep 2004 A1
20040177063 Weber et al. Sep 2004 A1
20040205830 Kaneko Oct 2004 A1
20040212640 Mann et al. Oct 2004 A1
20040215458 Kobayashi et al. Oct 2004 A1
20040216173 Horoszowski et al. Oct 2004 A1
20040220791 Lamkin et al. Nov 2004 A1
20040220926 Lamkin et al. Nov 2004 A1
20040236844 Kocherlakota Nov 2004 A1
20040246936 Perlman Dec 2004 A1
20040254887 Jacoby Dec 2004 A1
20040255249 Chang et al. Dec 2004 A1
20040255330 Logan Dec 2004 A1
20040255334 Logan Dec 2004 A1
20040255336 Logan et al. Dec 2004 A1
20050005308 Logan et al. Jan 2005 A1
20050021398 McCleskey et al. Jan 2005 A1
20050021830 Urzaiz et al. Jan 2005 A1
20050027821 Alexander et al. Feb 2005 A1
20050038981 Connor et al. Feb 2005 A1
20050044058 Matthews et al. Feb 2005 A1
20050050462 Whittle et al. Mar 2005 A1
20050053356 Mate et al. Mar 2005 A1
20050055595 Frazer et al. Mar 2005 A1
20050060759 Rowe et al. Mar 2005 A1
20050091683 Sheynman et al. Apr 2005 A1
20050097542 Lee May 2005 A1
20050114852 Chen et al. May 2005 A1
20050132351 Randall et al. Jun 2005 A1
20050138560 Lee et al. Jun 2005 A1
20050165795 Myka et al. Jul 2005 A1
20050177846 Maruyama et al. Aug 2005 A1
20050198584 Matthews et al. Sep 2005 A1
20050204046 Watanabe Sep 2005 A1
20050216851 Hull et al. Sep 2005 A1
20050227621 Katoh Oct 2005 A1
20050229118 Chiu et al. Oct 2005 A1
20050246369 Oreizy et al. Nov 2005 A1
20050246451 Silverman et al. Nov 2005 A1
20050246738 Lockett et al. Nov 2005 A1
20050251833 Schedivy Nov 2005 A1
20050262534 Bontempi et al. Nov 2005 A1
20050262539 Barton et al. Nov 2005 A1
20050283791 McCarthy et al. Dec 2005 A1
20050288999 Lerner et al. Dec 2005 A1
20060011371 Fahey Jan 2006 A1
20060015925 Logan Jan 2006 A1
20060031381 Van Luijt et al. Feb 2006 A1
20060050970 Gunatilake Mar 2006 A1
20060051055 Ohkawa Mar 2006 A1
20060095401 Krikorian et al. May 2006 A1
20060095471 Krikorian et al. May 2006 A1
20060095472 Krikorian et al. May 2006 A1
20060095942 Van Beek May 2006 A1
20060095943 Demircin et al. May 2006 A1
20060101499 Aravamudan et al. May 2006 A1
20060107226 Matthews et al. May 2006 A1
20060117371 Margulis Jun 2006 A1
20060146174 Hagino Jul 2006 A1
20060156219 Haot et al. Jul 2006 A1
20060156374 Hu et al. Jul 2006 A1
20060171395 Deshpande Aug 2006 A1
20060184980 Cole Aug 2006 A1
20060190616 Mayerhofer et al. Aug 2006 A1
20060206526 Sitomer Sep 2006 A1
20060230345 Weng et al. Oct 2006 A1
20060265384 Lee et al. Nov 2006 A1
20060280157 Karaoguz et al. Dec 2006 A1
20060280177 Gupta et al. Dec 2006 A1
20060280437 Logan et al. Dec 2006 A1
20060287916 Starr et al. Dec 2006 A1
20060294183 Agnoli et al. Dec 2006 A1
20070003224 Krikorian et al. Jan 2007 A1
20070005783 Saint-Hillaire et al. Jan 2007 A1
20070022328 Tarra et al. Jan 2007 A1
20070043792 O'Brien Feb 2007 A1
20070061783 Prakash Mar 2007 A1
20070067390 Agnoli et al. Mar 2007 A1
20070073767 Springer, Jr. et al. Mar 2007 A1
20070074115 Patten et al. Mar 2007 A1
20070076604 Litwack Apr 2007 A1
20070113250 Logan et al. May 2007 A1
20070136778 Birger et al. Jun 2007 A1
20070168543 Krikorian et al. Jul 2007 A1
20070180485 Dua Aug 2007 A1
20070198532 Krikorian et al. Aug 2007 A1
20070234213 Krikorian et al. Oct 2007 A1
20070260462 Andersen et al. Nov 2007 A1
20070274327 Kaarela et al. Nov 2007 A1
20070286596 Lonn Dec 2007 A1
20070300258 O'Connor et al. Dec 2007 A1
20080007651 Bennett Jan 2008 A1
20080019276 Takatsuji et al. Jan 2008 A1
20080037573 Cohen Feb 2008 A1
20080059533 Krikorian Mar 2008 A1
20080134267 Moghe et al. Jun 2008 A1
20080195698 Stefanovic et al. Aug 2008 A1
20080195744 Bowra et al. Aug 2008 A1
20080199150 Candelore Aug 2008 A1
20080215392 Rajan Sep 2008 A1
20080294759 Biswas et al. Nov 2008 A1
20080036917 Pascarella et al. Dec 2008 A1
20080307456 Beetcher et al. Dec 2008 A1
20080307462 Beetcher et al. Dec 2008 A1
20080307463 Beetcher et al. Dec 2008 A1
20090045938 Levien et al. Feb 2009 A1
20090049502 Levien et al. Feb 2009 A1
20090074380 Boston et al. Mar 2009 A1
20090103607 Bajpai et al. Apr 2009 A1
20090133079 Li et al. May 2009 A1
20090157697 Conway et al. Jun 2009 A1
20090157777 Golwalkar et al. Jun 2009 A1
20090199248 Ngo et al. Aug 2009 A1
20090254672 Zhang Oct 2009 A1
20090268740 Sindhu et al. Oct 2009 A1
20090300205 Jabri Dec 2009 A1
20100023642 Ladd et al. Jan 2010 A1
20100100915 Krikorian et al. Apr 2010 A1
20100169477 Stienhans et al. Jul 2010 A1
20100226444 Thevathasan et al. Sep 2010 A1
20100269144 Forsman et al. Oct 2010 A1
20100281042 Windes et al. Nov 2010 A1
20100309916 Oskouy et al. Dec 2010 A1
20110002381 Yang et al. Jan 2011 A1
20110047079 Du et al. Feb 2011 A1
20110050908 Nam Mar 2011 A1
20110125861 Evans et al. May 2011 A1
20110307608 Chang et al. Dec 2011 A1
20120166669 Price Jun 2012 A1
20120219001 Sindhu et al. Aug 2012 A1
20130185163 Krikorian et al. Jul 2013 A1
20150163206 McCarthy et al. Jun 2015 A1
Foreign Referenced Citations (50)
Number Date Country
2286641 Oct 1998 CA
2324856 May 2001 CA
1464685 Dec 2003 CN
1558668 Dec 2004 CN
4407319 Sep 1994 DE
0515101 Nov 1992 EP
0690626 Jan 1996 EP
0838945 Apr 1998 EP
1077407 Feb 2001 EP
1443766 Aug 2004 EP
1691550 Aug 2006 EP
1830558 Sep 2007 EP
2307151 May 1997 GB
H05268296 Oct 1993 JP
11289350 Oct 1999 JP
2001054066 Feb 2001 JP
2003046582 Feb 2003 JP
2003101547 Apr 2003 JP
2003114845 Apr 2003 JP
2003179580 Jun 2003 JP
2004007127 Jan 2004 JP
2004015111 Jan 2004 JP
2005032120 Feb 2005 JP
2005039781 Feb 2005 JP
19990082855 Nov 1999 KR
20010211410 Aug 2001 KR
200703018 Jan 2007 TW
1998037701 Aug 1998 WO
0072596 Nov 2000 WO
2001033839 May 2001 WO
2001047248 Jun 2001 WO
2001093161 Dec 2001 WO
02073443 Sep 2002 WO
2003026232 Mar 2003 WO
2003052552 Jun 2003 WO
2003084172 Oct 2003 WO
2003098897 Nov 2003 WO
2004032511 Apr 2004 WO
2004045218 May 2004 WO
2005039180 Apr 2005 WO
2005050898 Jun 2005 WO
2005122025 Dec 2005 WO
2006064454 Jun 2006 WO
20060074110 Jul 2006 WO
2007027891 Mar 2007 WO
2007051156 May 2007 WO
2007141555 Dec 2007 WO
2007149466 Dec 2007 WO
2008024723 Feb 2008 WO
Non-Patent Literature Citations (279)
Entry
Getting Rights to Movie Clips Still Complex; Greig, Jane S., Austin American-Statesman [Austin, Tex.], Jan. 17, 1997: F.1.
USPTO, Final Office Action, dated Nov. 6, 2009; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
USPTO, Final Office Action dated Nov. 12, 2009; U.S. Appl. No. 11/620,707, filed Jan. 7, 2007.
USPTO, Non-Final Office Action dated Nov. 23, 2009; U.S. Appl. No. 11/683,862, filed Mar. 8, 2007.
USPTO, Non-Final Office Action dated Oct. 1, 2009; U.S. Appl. No. 11/778,287, filed Jul. 16, 2007.
USPTO Final Office Action dated Dec. 30, 2009; U.S. Appl. No. 11/147,664, filed Jun. 7, 2005.
European Patent Office, European Search Report, dated Sep. 28, 2009 for European Application No. EP 06 78 6175.
International Search Report for PCT/US2008/069914 dated Dec. 19, 2008.
PCT Partial International Search, PCT/US2009/054893, dated Dec. 23, 2009.
Newton's Telecom Dictionary, 21st ed., Mar. 2005.
Ditze, M. et al. "Resource Adaptation for Audio-Visual Devices in the UPnP QoS Architecture," Advanced Networking and Applications, 2006; AINA 2006; 20th International Conference, Vienna, Austria, Apr. 18-20, 2006.
Joonbok, Lee et al. “Compressed High Definition Television (HDTV) Over IPv6,” Applications and the Internet Workshops, 2006; Saint Workshops, 2006; International Symposium, Phoenix, AZ, USA, Jan. 23-27, 2006.
Lowekamp, B. et al. “A Hierarchy of Network Performance Characteristics for Grid Applications and Services,” GGF Network Measurements Working Group, pp. 1-29, May 24, 2004.
Meyer, Derrick "MyReplayTV™ Creates First-Ever Online Portal to Personal TV Service; Gives Viewers Whole New Way to Interact With Programming," http://web.archive.org/web/20000815052751/http://www.myreplaytv.com/, Aug. 15, 2000.
Sling Media “Sling Media Unveils Top-of-Line Slingbox PRO-HD” [online], Jan. 4, 2008, XP002560049; retrieved from the Internet: URL: www.slingmedia.com/get/pr-slingbox-pro-hd.html; retrieved on Oct. 12, 2009.
Srisuresh, P. et al. “Traditional IP Network Address Translator (Traditional NAT),” Network Working Group, The Internet Society, Jan. 2001.
Lucas, Brian et al. “Systems and Methods for Establishing Connections Between Devices Communicating Over a Network,” U.S. Appl. No. 12/426,103, filed Apr. 17, 2009.
Thiyagarajan, Venkatesan et al. “Always-On-Top Media Player Launched From a Web Browser,” U.S. Appl. No. 12/617,271, filed Nov. 12, 2009.
Paul, John Michael et al. “Systems and Methods for Delivering Messages Over a Network,” U.S. Appl. No. 12/619,192, filed Nov. 16, 2009.
Rao, Padmanabha R. et al. “Methods and Apparatus for Establishing Network Connections Using an Inter-Mediating Device,” U.S. Appl. No. 12/642,368, filed Dec. 18, 2009.
Dham, Vikram et al. “Systems and Methods for Establishing Network Connections Using Local Mediation Services,” U.S. Appl. No. 12/644,918, filed Dec. 22, 2009.
Paul, John et al. “Systems and Methods for Remotely Controlling Media Server Via a Network,” U.S. Appl. No. 12/645,870, filed Dec. 23, 2009.
Bajpai, Parimal et al. "Method and Node for Transmitting Data Over a Communication Network using Negative Acknowledgement," U.S. Appl. No. 12/404,920, filed Mar. 16, 2009.
Bajpai, Parimal et al. "Method and Node for Employing Network Connections Over a Connectionless Transport Layer Protocol," U.S. Appl. No. 12/405,062, filed Mar. 16, 2009.
Asnis, Ilya et al. “Mediated Network address Translation Traversal” U.S. Appl. No. 12/405,039, filed Mar. 16, 2009.
International Search Report for International Application No. PCT/US2007/076337 dated Oct. 20, 2008.
China State Intellectual Property Office “First Office Action of China State Intellectual Property Office,” dated Oct. 9, 2010; Application No. 200780030811.4.
European Patent Office “EPO Communication” dated Nov. 29, 2010; Application No. 08 167 880.7-2202.
Korean Intellectual Property Office "Official Notice of Preliminary Rejection," dated Dec. 29, 2010; Patent Application No. 10-2009-7003607.
China State Intellectual Property Office "Second Office Action of China State Intellectual Property Office," dated Aug. 26, 2010; Application No. 200810126554.0.
China State Intellectual Property Office “First Office Action,” dated Jan. 8, 2010, for Application No. 200810126554.0.
USPTO Final Office action dated Jan. 25, 2010; U.S. Appl. No. 11/734,277, filed Apr. 12, 2007.
Australian Government “Office Action,” Australian Patent Application No. 2006240518, dated Nov. 12, 2009.
Jain, Vikal Kumar "Systems and Methods for Coordinating Data Communication Between Two Devices," U.S. Appl. No. 12/699,280, filed Feb. 3, 2010.
Gangotri, Arun L. et al. "Systems and Methods and Program Applications for Selectively Restructuring the Placeshifting of Copy Protected Digital Media Content," U.S. Appl. No. 12/623,955, filed Nov. 23, 2009.
Paul, John et al. “Systems and Methods for Searching Media Content,” U.S. Appl. No. 12/648,024, filed Dec. 28, 2009.
Newton's Telecom Dictionary, 20th ed., Mar. 2004.
“The Authoritative Dictionary of IEEE Standard Terms,” 7th ed. 2000.
Gurzhi, Alexander et al. "Systems and Methods for Emulating Network-Enabled Media Components," U.S. Appl. No. 12/711,830, filed Feb. 24, 2010.
Curtis, Sean "Systems and Methods for Formatting Media Content for Distribution," U.S. Appl. No. 12/635,138, filed Dec. 10, 2009.
Carlson, T. “Mule 2.x Getting Started Guide,” Apr. 15, 2008, 134 pages.
European Patent Office, International Searching Authority, “International Search Report,” dated Mar. 30, 2010; International Application PCT/US2009/068468 filed Dec. 27, 2009.
USPTO Final Office Action dated Mar. 3, 2010; U.S. Appl. No. 11/111,265, filed Apr. 21, 2005.
USPTO Final Office Action dated Mar. 12, 2010; U.S. Appl. No. 11/620,711, filed Jan. 7, 2007.
USPTO Non-Final Office Action dated Mar. 19, 2010; U.S. Appl. No. 11/147,664, filed Jun. 7, 2005.
USPTO Non-Final Office Action dated Mar. 31, 2010; U.S. Appl. No. 11/620,707, filed Jan. 7, 2007.
USPTO Non-Final Office Action dated Apr. 1, 2010; U.S. Appl. No. 12/237,103, filed Sep. 24, 2008.
Qiong, Liu et al. “Digital Rights Management for Content Distribution,” Proceedings of the Australasian Information Security Workshop Conference on ACSW Frontiers 2003, vol. 21, 2003, XP002571073, Adelaide, Australia, ISSN: 1445-1336, ISBN: 1-920682-00-7, sections 2 and 2.1.1.
China Patent Office “Office Action” dated Feb. 23, 2011; Chinese Patent Appln. No. 200680022520.6.
China Patent Office "Office Action" dated Jan. 6, 2011; Chinese Patent Appln. No. 200810126554.0.
USPTO “Non-Final Office Action” dated Mar. 4, 2011; U.S. Appl. No. 12/757,697, filed Apr. 9, 2010.
USPTO “Final Office Action” dated Mar. 10, 2011; U.S. Appl. No. 11/111,265, filed Apr. 21, 2005.
Canadian Patent Office “Office Action” dated Mar. 17, 2011; Canadian Patent Appln. No. 2,606,235.
Sikora, Thomas “Trends and Perspectives in Image and Video Coding,” Proceedings of the IEEE, vol. 93 No. 1, pp. 6-17, Jan. 1, 2005.
Vetro, Anthony et al. “Video Transcoding Architectures and Techniques: An Overview,” IEEE Signal Processing Magazine, vol. 20, No. 2, pp. 18-29, Mar. 1, 2003.
De Lattre, Alexis et al. “VideoLAN Streaming Howto,” Internet Citation, Jan. 22, 2005; http://www.videolan.org/doc/streaming-howto/en/.
USPTO “Non-Final Office Action” dated Mar. 29, 2011; U.S. Appl. No. 12/979,145, filed Dec. 27, 2010.
China State Intellectual Property Office “Office Action” dated Mar. 18, 2010 for Application No. 200680022520.6.
China State Intellectual Property Office “Office Action” dated Apr. 13, 2010 for Application No. 200580026825.X.
Canadian Intellectual Property Office “Office Action” dated Feb. 18, 2010 for Application No. 2569610.
European Patent Office “European Search Report,” dated May 7, 2010 for Application No. 06786174.0.
Margulis, Neal “Apparatus and Method for Effectively Implementing a Wireless Television System,” U.S. Appl. No. 12/758,193, filed Apr. 12, 2010.
Margulis, Neal “Apparatus and Method for Effectively Implementing a Wireless Television System,” U.S. Appl. No. 12/758,194, filed Apr. 12, 2010.
Margulis, Neal “Apparatus and Method for Effectively Implementing a Wireless Television System,” U.S. Appl. No. 12/758,196, filed Apr. 12, 2010.
Kirkorian, Jason Gary et al. “Personal Media Broadcasting System with Output Buffer,” U.S. Appl. No. 12/757,697, filed Apr. 9, 2010.
Tarra, Raghuveer et al. “Firmware Update for Consumer Electronic Device,” U.S. Appl. No. 12/757,714, filed Apr. 9, 2010.
Lee, M. et al. "Video Frame Rate Control for Non-Guaranteed Network Services with Explicit Rate Feedback," Globecom'00, 2000 IEEE Global Telecommunications Conference, San Francisco, CA, Nov. 27-Dec. 1, 2000; [IEEE Telecommunications Conference], New York, NY; IEEE, US, vol. 1, Nov. 27, 2000, pp. 293-297, XP001195580; ISBN: 978-0-7803-6452-3, lines 15-20 of sec. II on p. 293, fig. 1.
European Patent Office, International Searching Authority, “International Search Report and Written Opinion,” dated Jun. 4, 2010 for International Application No. PCT/IN2009/000728, filed Dec. 18, 2009.
USPTO Non-Final Office Action dated Jun. 23, 2010; U.S. Appl. No. 11/933,969, filed Nov. 1, 2007.
Korean Intellectual Property Office “Official Notice of Preliminary Rejection,” dated Jun. 18, 2010; Korean Patent Application No. 10-2008-7021254.
European Patent Office, European Search Report for European Application No. EP 08 16 7880, dated Mar. 4, 2009.
MythTV Wiki, “MythTV User Manual” [Online], Aug. 27, 2007, XP002515046; retrieved from the Internet: <URL: http://www.mythtv.org/wiki?title=User_Manual:Introduction&oldid=25549>.
International Searching Authority, Written Opinion and International Search Report for International Application No. PCT/US2008/077733, dated Mar. 18, 2009.
International Searching Authority, Written Opinion and International Search Report for International Application No. PCT/US2008/087005, dated Mar. 20, 2009.
Watanabe Y. et al., “Multimedia Database System for TV Newscasts and Newspapers”; Lecture Notes in Computer Science, Springer Verlag, Berlin, Germany; vol. 1554, Nov. 1, 1998, pp. 208-220, XP002402824, ISSN: 0302-9743.
Yasuhiko Watanabe et al., "Aligning Articles in TV Newscasts and Newspapers"; Proceedings of the International Conference on Computational Linguistics, XX, XX, Jan. 1, 1998, pp. 1381-1387, XP002402825.
Sodergard C. et al., “Integrated Multimedia Publishing: Combining TV and Newspaper Content on Personal Channels”; Computer Networks, Elsevier Science Publishers B.V., Amsterdam, Netherlands; vol. 31, No. 11-16, May 17, 1999, pp. 1111-1128, XP004304543, ISSN: 1389-1286.
Ariki Y. et al., “Automatic Classification of TV News Articles Based on Telop Character Recognition”; Multimedia Computing and Systems, 1999; IEEE International Conference on Florence, Italy, Jun. 7-11, 1999, Los Alamitos, California, USA, IEEE Comput. Soc. US; vol. 2, Jun. 7, 1999, pp. 148-152, XP010519373, ISBN: 978-0/7695-0253-3; abstract, paragraph [03.1], paragraph [05.2], figures 1,2.
USPTO, Non-Final Office Action dated Dec. 17, 2004; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
USPTO, Final Office Action dated Jul. 28, 2005; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
USPTO, Non-Final Office Action dated Jan. 30, 2006; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
USPTO, Final Office Action dated Aug. 10, 2006; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
USPTO, Non-Final Office Action dated Jun. 19, 2007; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
USPTO, Non-Final Office Action dated Apr. 16, 2008; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
USPTO, Final Office Action dated Sep. 18, 2008; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
USPTO, Non-Final Office Action dated Mar. 31, 2009; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
USPTO, Non-Final Office Action dated May 1, 2008; U.S. Appl. No. 11/111,265, filed Apr. 21, 2005.
USPTO, Final Office Action dated Dec. 29, 2008; U.S. Appl. No. 11/111,265, filed Apr. 21, 2005.
USPTO, Non-Final Office Action dated Jun. 8, 2009; U.S. Appl. No. 11/111,265, filed Apr. 21, 2005.
USPTO, Non-Final Office Action dated Jun. 26, 2008; U.S. Appl. No. 11/620,707, filed Jan. 7, 2007.
USPTO, Final Office Action dated Oct. 21, 2008; U.S. Appl. No. 11/620,707, filed Jan. 7, 2007.
USPTO, Non-Final Office Action dated Mar. 25, 2009; U.S. Appl. No. 11/620,707, filed Jan. 7, 2007.
USPTO, Non-Final Office Action dated Aug. 7, 2008; U.S. Appl. No. 11/620,711, filed Jan. 7, 2007.
USPTO, Final Office Action dated Feb. 9, 2009; U.S. Appl. No. 11/620,711, filed Jan. 7, 2007.
USPTO, Non-Final Office Action dated Feb. 25, 2009; U.S. Appl. No. 11/683,862, filed Mar. 8, 2007.
USPTO, Non-Final Office Action dated Dec. 24, 2008; U.S. Appl. No. 11/147,985, filed Jun. 7, 2005.
USPTO, Non-Final Office Action dated Jun. 25, 2008; U.S. Appl. No. 11/428,254, filed Jun. 30, 2006.
USPTO, Final Office Action dated Feb. 6, 2009; U.S. Appl. No. 11/428,254, filed Jun. 30, 2006.
USPTO, Non-Final Office Action dated May 15, 2009; U.S. Appl. No. 11/147,664, filed Jun. 7, 2005.
Sonic Blue “ReplayTV 5000 User's Guide,” 2002, entire document.
Bluetooth-News; Main Future User Models Document Verification & Qualification: Bluetooth Technical Background, Apr. 21, 1999; pp. 1 of 7 and 2 of 7; http://www.bluetooth.com/v2/news/show.asp 1-2.
Microsoft Corporation; Harman/Kardon “Master Your Universe” 1999.
Matsushita Electric Corporation of America, MicroCast: Wireless PC Multimedia Transceiver System, Nov. 1998.
“Wireless Local Area Networks: Issues in Technology and Standards” Jan. 6, 1999.
USPTO, Final Office Action dated Jun. 25, 2009; U.S. Appl. No. 11/147,985, filed Jun. 7, 2005.
Krikorian, Jason, U.S. Appl. No. 11/734,277, filed Apr. 12, 2007.
Tarra, Raghuveer et al., U.S. Appl. No. 60/975,239, filed Sep. 26, 2007.
Williams, George Edward, U.S. Appl. No. 12/167,041, filed Jul. 2, 2008.
Rao, Padmanabha R., U.S. Appl. No. 12/166,039, filed Jul. 1, 2008.
International Search Report and Written Opinion, PCT/US2005/020105, dated Feb. 15, 2007, 6 pages.
International Search Report and Written Opinion for PCT/US2006/04382, dated Apr. 27, 2007.
Archive of “TV Brick Home Server,” www.tvbrick.com, [online] [Archived by http://archive.org on Jun. 3, 2004; Retrieved on Apr. 12, 2006] retrieved from the Internet <URL:http://web.archive.org/web/20041107111024/www.tvbrick.com/en/affiliate/tvbs/tvbrick/document18/print>.
Faucon, B. "TV 'Brick' Opens Up Copyright Can of Worms," Financial Review, Jul. 1, 2003, [online] [Retrieved on Apr. 12, 2006] Retrieved from the Internet, URL:http://afr.com/cgi-bin/newtextversions.pl?storyid+1056825330084&3ate+2003/07/01&pagetype+printer&section+1053801318705&path+articles/2003/06/30/0156825330084.html.
Balster, Eric J., “Video Compression and Rate Control Methods Based on the Wavelet Transform,” The Ohio State University 2004, pp. 1-24.
Kulapala et al., “Comparison of Traffic and Quality Characteristics of Rate-Controlled Wavelet and DCT Video,” Arizona State University, Oct. 11, 2004.
Skodras et al., “JPEG2000: The Upcoming Still Image Compression Standard,” May 11, 2000, 14 pages.
Taubman et al., “Embedded Block Coding in JPEG2000,” Feb. 23, 2001, pp. 1-8 of 36.
Kessler, Gary C., An Overview of TCP/IP Protocols and the Internet; Jan. 16, 2007, retrieved from the Internet on Jun. 12, 2008 at http://www.garykessler.net/libraly/tcpip.html; originally submitted to the InterNIC and posted on their Gopher site on Aug. 5, 1994.
Roe, Kevin, “Third-Party Observation Under EPC Article 115 On The Patentability Of An Invention,” Dec. 21, 2007.
Roe, Kevin, Third-Party Submission for Published Application Under CFR §1.99, Mar. 26, 2008.
Bajpai, Parimal et al. “Systems and Methods of Controlling the Encoding of a Media Stream,” U.S. Appl. No. 12/339,878, filed Dec. 19, 2008.
Malone, Edward D. et al. “Systems and Methods for Controlling Media Devices,” U.S. Appl. No. 12/256,344, filed Oct. 22, 2008.
Banger, Shashidhar et al. “Systems and Methods for Determining Attributes of Media Items Accessed Via a Personal Media Broadcaster,” U.S. Appl. No. 12/334,959, filed Dec. 15, 2008.
Kulkarni, Anant Madhava “Systems and Methods for Creating Logical Media Streams for Media Storage and Playback,” U.S. Appl. No. 12/323,907, filed Nov. 26, 2008.
Rao, Padmanabha R. “Systems and Methods for Linking Media Content,” U.S. Appl. No. 12/359,784, filed Jan. 26, 2009.
Krikorian, Blake Gary et al. “Systems and Methods for Presenting Media Content Obtained From Multiple Sources,” U.S. Appl. No. 12/408,456, filed Mar. 20, 2009.
Krikorian, Blake Gary et al. “Systems and Methods for Projecting Images From a Computer System,” U.S. Appl. No. 12/408,460, filed Mar. 20, 2009.
International Search Report and Written Opinion for International Application No. PCT/US2006/025911, dated Jan. 3, 2007.
International Search Report for International Application No. PCT/US2007/063599, dated Dec. 12, 2007.
International Search Report for International Application No. PCT/US2007/076337, dated Oct. 20, 2008.
International Search Report and Written Opinion for International Application No. PCT/US2006/025912, dated Jul. 17, 2008.
International Search Report for International Application No. PCT/US2008/059613, dated Jul. 21, 2008.
International Search Report and Written Opinion for International Application No. PCT/US2008/080910, dated Feb. 16, 2009.
Wikipedia “Slingbox” [Online], Oct. 21, 2007, XP002512399; retrieved from the Internet: <URL:http://en.wikipedia.org/w/index.php?title=Slingbox&oldid=166080570>; retrieved on Jan. 28, 2009.
Wikipedia “LocationFree Player” [Online], Sep. 22, 2007, XP002512400; retrieved from the Internet: <URL: http://en.wikipedia.org/w/index.php?title=LocationFree_Player&oldid=159683564>; retrieved on Jan. 28, 2009.
Capable Networks LLC “Keyspan Remote Control—Controlling Your Computer With a Remote” [Online], Feb. 21, 2006, XP002512495; retrieved from the Internet: <URL:http://www.slingcommunity.com/article/11791/Keyspan-Remote-Control—Controlling-Your-Computer-With-a-Remote/?highlight=remote+control>; retrieved on Jan. 28, 2009.
Sling Media Inc. "Slingbox User Guide" [Online] 2006, XP002512553; retrieved from the Internet: <URL:http://www.slingmedia.hk/attach/en-US_Slingbox_User_Guide_v1.2.pdf>; retrieved on Jan. 29, 2009.
Sony Corporation "LocationFree TV" [Online], 2004, XP002512410; retrieved from the Internet: <URL:http://www.docs.sony.com/release/LFX1_X5revision.pdf>; retrieved on Jan. 28, 2009 [note: document uploaded in two parts as file exceeds the 25MB size limit].
Sony Corporation “LocationFree Player Pak—LocationFree Base Station—LocationFree Player” [Online] 2005, XP002512401; retrieved from the Internet: <URL:http://www.docs.sony.com/release.LFPK1.pdf>; retrieved on Jan. 28, 2009.
China State Intellectual Property Office “First Office Action,” dated Jul. 31, 2009, for Application No. 200580026825.X.
USPTO, Non-Final Office Action, dated Aug. 4, 2009; U.S. Appl. No. 11/734,277, filed Apr. 12, 2007.
USPTO, Final Office Action, dated Jul. 31, 2009; U.S. Appl. No. 11/683,862, filed Mar. 8, 2007.
USPTO, Non-Final Office Action, dated Aug. 5, 2009; U.S. Appl. No. 11/147,663, filed Jun. 7, 2005.
USPTO, Non-Final Office Action, dated Sep. 3, 2009; U.S. Appl. No. 11/620,711, filed Jan. 7, 2007.
Einaudi, Andrew E. et al. “Systems and Methods for Selecting Media Content Obtained from Multiple Sources,” U.S. Appl. No. 12/543,278, filed Aug. 18, 2009.
Malode, Deepak Ravi “Remote Control and Method for Automatically Adjusting the Volume Output of an Audio Device,” U.S. Appl. No. 12/550,145, filed Aug. 28, 2009.
Akella, Aparna Sarma “Systems and Methods for Event Programming Via a Remote Media Player,” U.S. Appl. No. 12/537,057, filed Aug. 6, 2009.
Shah, Bhupendra Natwerlan et al. “Systems and Methods for Transcoding and Place Shifting Media Content,” U.S. Appl. No. 12/548,130, filed Aug. 26, 2009.
Banger, Shashidhar et al. “Systems and Methods for Automatically Controlling the Resolution of Streaming Video Content,” U.S. Appl. No. 12/537,785, filed Aug. 7, 2009.
Panigrahi, Biswaranjan “Home Media Aggregator System and Method,” U.S. Appl. No. 12/538,681, filed Aug. 10, 2009.
Nandury, Venkata Kishore “Adaptive Gain Control for Digital Audio Samples in a Media Stream,” U.S. Appl. No. 12/507,971, filed Jul. 23, 2009.
Shirali, Amey “Systems and Methods for Providing Programming Content,” U.S. Appl. No. 12/538,676, filed Aug. 10, 2009.
Thiyagarajan, Venkatesan “Systems and Methods for Virtual Remote Control of Streamed Media,” U.S. Appl. No. 12/538,664, filed Aug. 10, 2009.
Thiyagarajan, Venkatesan et al. “Localization Systems and Method,” U.S. Appl. No. 12/538,783, filed Aug. 10, 2009.
Shirali, Amey et al. “Methods and Apparatus for Seeking Within a Media Stream Using Scene Detection,” U.S. Appl. No. 12/538,784, filed Aug. 10, 2009.
Thiyagarajan, Venkatesan “Systems and Methods for Updating Firmware Over a Network,” U.S. Appl. No. 12/538,661, filed Aug. 10, 2009.
Iyer, Satish “Methods and Apparatus for Fast Seeking Within a Media Stream Buffer,” U.S. Appl. No. 12/538,659, filed Aug. 10, 2009.
European Patent Office, International Searching Authority, "International Search Report," for International Application No. PCT/US2009/049006, dated Sep. 11, 2009.
USPTO Non-Final Office Action dated Aug. 11, 2010; U.S. Appl. No. 11/111,265, filed Apr. 21, 2005.
USPTO Final Office Action dated Sep. 24, 2010; U.S. Appl. No. 11/620,707, filed Jan. 7, 2007.
USPTO Non-Final Office Action dated Sep. 30, 2010; U.S. Appl. No. 12/758,193, filed Apr. 12, 2010.
USPTO Non-Final Office Action dated Sep. 29, 2010; U.S. Appl. No. 12/758,194, filed Apr. 12, 2010.
USPTO Non-Final Office Action dated Sep. 30, 2010; U.S. Appl. No. 12/758,196, filed Apr. 12, 2010.
USPTO Non-Final Office Action dated Oct. 4, 2010; U.S. Appl. No. 12/643,812, filed Dec. 21, 2009.
Japan Patent Office “Notice of Grounds for Rejection (Office Action),” dated May 25, 2010; Patent Application No. 2007-0268269.
Japan Patent Office “Notice of Grounds for Rejection (Office Action),” dated May 25, 2010; Patent Application No. 2007-527683.
USPTO “Non-Final Office Action” dated Jun. 28, 2011; U.S. Appl. No. 12/339,878, filed Dec. 19, 2008.
Australian Government “Notice of Acceptance” dated Aug. 10, 2011; Patent Appln. No. 2006240518.
Japan Patent Office “Notice of Grounds for Rejection (Office Action)” dated Jun. 28, 2011; Japanese Patent Appln. No. P2008-507651.
European Patent Office “Extended European Search Report” dated Feb. 28, 2011; Appln. No. 06734554.
Japan Patent Office “Decision of Rejection (Final Rejection)” dated Mar. 15, 2011; Patent Appln. No. 2007-527683.
China State Intellectual Property Office “Second Office Action” dated May 18, 2011; Appln. No. 200780030811.4.
USPTO “Non-Final Office Action” dated Sep. 16, 2011; U.S. Appl. No. 11/111,265, filed Apr. 21, 2005.
USPTO “Notice of Allowance” dated Sep. 22, 2011; U.S. Appl. No. 12/979,145, filed Dec. 27, 2010.
Canadian Intellectual Property Office, “Office Action” dated Nov. 28, 2011; Canadian Appln. No. 2,606,235.
China State Intellectual Property Office, “Reexamination Office Action” dated Mar. 5, 2012; Chinese Appln. No. 200580026825.X.
USPTO “Non-Final Office Action” dated Feb. 24, 2012; U.S. Appl. No. 13/240,932, filed Sep. 22, 2011.
China Intellectual Property Office, “Second Office Action” dated Feb. 15, 2012; Appln. No. CN200680022520.6.
USPTO "Non-Final Office Action" dated Jan. 10, 2012; U.S. Appl. No. 12/827,964, filed Jun. 30, 2010.
Chinese Office Action, dated Dec. 31, 2011, for Chinese Patent Application No. 200810161874.X.
European Office Action, dated Nov. 30, 2011, for European Patent Application No. 06 734 554.6-2223.
USPTO “Notice of Allowance”, dated Oct. 18, 2011; U.S. Appl. No. 12/339,878, filed Dec. 19, 2008.
China State Intellectual Property Office, "Decision of Rejection," dated Sep. 15, 2011; Chinese Patent Appln. No. 200780030811.4.
China State Intellectual Property Office “Third Office Action” dated Dec. 5, 2012 for Chinese Patent Appln. No. 200680022520.6.
European Patent Office "Extended Search Report" dated Nov. 27, 2011 for European Patent Appln. No. 12187858.1.
USPTO “Notice of Allowance” dated Sep. 25, 2012 for U.S. Appl. No. 13/240,932, filed Sep. 22, 2011.
USPTO, Non-Final Office Action, dated Jan. 6, 2014 for U.S. Appl. No. 13/615,306.
USPTO, Non-Final Office Action, dated Jan. 6, 2014 for U.S. Appl. No. 12/827,964.
European Patent Office, Communication under Rule 71(3) EPC for European Patent Application No. 06 734 554.6 dated Feb. 13, 2015.
European Patent Office, Communication pursuant to Article 94(3) EPC for European Patent Application No. 05 758 509.3 dated Mar. 3, 2015.
USPTO, Office Action for U.S. Appl. No. 14/449,896 dated Feb. 18, 2015.
USPTO, Final Office Action for U.S. Appl. No. 13/194,834 dated Mar. 3, 2015.
USPTO, Utility U.S. Appl. No. 14/449,896, filed Aug. 1, 2014.
Office of the Controller-General of Patents, Designs and Trademarks, First Examination Report for Indian Patent Application No. 4190/KOLNP/2007 dated Feb. 27, 2015.
China State Intellectual Property Office “Fourth Office Action” dated Mar. 5, 2013 for Chinese Patent Appln. No. 200810161874.X.
Japanese Patent Office "Decision of Rejection (Final Rejection)" dated Jan. 29, 2013 for Japanese Patent Appln. No. 2008-507651.
USPTO “Non-Final Office Action” dated Mar. 11, 2013 for U.S. Appl. No. 13/079,644.
USPTO “Non-Final Office Action” dated Mar. 5, 2013 for U.S. Appl. No. 13/107,341.
European Patent Office Supplementary Search Report dated Mar. 6, 2013 for European Patent Appln. No. 05758509.3.
Canadian Intellectual Property Office, Office Action dated Sep. 30, 2013 for Canadian Patent Application No. 2,747,539.
USPTO, Final Office Action dated Dec. 5, 2013 for U.S. Appl. No. 13/194,834.
Li, Zhuo et al.: "Adaptive forward error correction for streaming stored MPEG-4 FGS video over wireless channel", IEEE 5th Workshop on Signal Processing Advances in Wireless Communications, 2004, Jan. 1, 2004, pp. 26-30, XP055091312, DOI: 10.1109/SPAWC.2004.1439196, ISBN: 978-0-7803-8337-1.
European Patent Office, Summons to attend oral proceedings pursuant to Rule 115(1) EPC, dated Jan. 15, 2014 for European Patent Application No. 06734554.6.
Japan Patent Office, Notice of Ground(s) for Rejection, dated Feb. 18, 2014 for Japanese Patent Application No. P2008-507651.
Office of the Controller-General of Patents, Designs and Trademarks, First Examination Report, dated Mar. 21, 2014 for India Patent Application No. 3740/KOLNP/2006.
USPTO, Non-Final Office Action, dated Mar. 27, 2014 for U.S. Appl. No. 13/073,469.
USPTO, U.S. Appl. No. 14/468,059, filed Aug. 25, 2014.
State Intellectual Property Office of the People's Republic of China, Office Action for Chinese Patent Application No. 201210572890.4 dated Feb. 9, 2015.
USPTO, Office Action for U.S. Appl. No. 14/468,059 dated Apr. 7, 2015.
USPTO, Notice of Allowance and Fee(s) Due for U.S. Appl. No. 14/144,225 dated Apr. 8, 2015.
USPTO, Non-Final Office Action, dated May 15, 2014 for U.S. Appl. No. 13/194,783.
USPTO, Notice of Allowance, dated May 28, 2014 for U.S. Appl. No. 13/615,306.
European Patent Office, Extended Search Report dated May 24, 2013 for European Patent Application No. 07814265.0.
USPTO, Final Office Action, dated Jun. 18, 2014 for U.S. Appl. No. 13/609,033.
USPTO, Non-Final Office Action, dated Jul. 16, 2014 for U.S. Appl. No. 13/194,834.
Chinese Intellectual Property Office “Office Action” dated Apr. 10, 2012 for Chinese Application 200810126554.0.
Japan Patent Office “Office Action” dated Feb. 28, 2012 for Japanese Application P2008-507651.
Canadian Intellectual Property Office “Office Action” dated Feb. 16, 2012 for Canadian Application 2,660,350.
Taiwan Intellectual Property Office “Office Action” dated Feb. 23, 2012 for Taiwan Application 097137393.
USPTO "Final Office Action" dated Jun. 6, 2012 for U.S. Appl. No. 12/827,964, filed Jun. 30, 2010.
USPTO “Notice of Allowance” dated Jul. 6, 2012 for U.S. Appl. No. 11/111,265, filed Apr. 21, 2005.
USPTO “Notice of Allowance” dated Jun. 12, 2012 for U.S. Appl. No. 13/240,932, filed Sep. 22, 2011.
Japan Patent Office “Hearing” dated Apr. 24, 2012; Japanese Patent Appln. No. P2007-527683.
Japan Patent Office “Hearing” dated Apr. 24, 2012; Japanese Patent Appln. No. P2007-268269.
USPTO, Final Office Action dated May 8, 2013 for U.S. Appl. No. 13/194,783.
USPTO, Office Action dated May 8, 2013 for U.S. Appl. No. 13/194,834.
USPTO, Office Action dated Jun. 19, 2013 for U.S. Appl. No. 12/827,964.
USPTO, Office Action dated Jun. 20, 2013 for U.S. Appl. No. 13/615,306.
USPTO, Notice of Allowance dated Jun. 24, 2013 for U.S. Appl. No. 12/635,138.
USPTO, Final Office Action dated Jul. 17, 2013 for U.S. Appl. No. 13/079,644.
China State Intellectual Property Office, Office Action, dated Jul. 1, 2013 for China Patent Appln. No. 200680022520.6.
USPTO, Final Office Action, dated Sep. 4, 2014 for U.S. Appl. No. 12/827,964.
USPTO, Non-Final Office Action, dated Sep. 5, 2014 for U.S. Appl. No. 14/144,225.
European Patent Office, Examination Report, dated Sep. 6, 2013 for European Patent Application No. 12 187 858.1.
Japan Patent Office, Hearing, dated Aug. 13, 2013 for Japanese Patent Application No. P2008-507651.
European Patent Office, Examination Report, dated Jul. 25, 2013 for European Patent Application No. 05 758 509.3.
China State Intellectual Property Office, First Office Action, dated Aug. 2, 2013 for Chinese Patent Application No. 200980151308.3.
USPTO, Non-Final Office Action, dated Aug. 7, 2013 for U.S. Appl. No. 13/609,033.
USPTO, Final Office Action, dated Sep. 19, 2013 for U.S. Appl. No. 13/107,341.
Canadian Intellectual Property Office, Office Action, dated Sep. 3, 2013 for Canadian Patent Application No. 2,660,350.
USPTO, Notice of Allowance, dated Sep. 20, 2013 for U.S. Appl. No. 13/079,644.
European Patent Office “Office Action” dated Aug. 7, 2012 for European Patent Appln. No. 06 734 554.6.
China Patent Office “Office Action” dated Aug. 3, 2012 for Chinese Patent Appln. No. 200810161874.X.
USPTO “Notice of Allowance” dated Aug. 31, 2012 for U.S. Appl. No. 11/620,711, filed Jan. 7, 2007.
M2 Presswire, Aug. 16, 2006, OTCPicks.com; Daily Market Movers Digest Midday Stock Alerts for Wednesday, Aug. 16, AMRU, IPTM, BCLC, IHDR, EGIL © 1994-2006 M2 Communications Ltd.
USPTO “Non-Final Office Action” dated Oct. 23, 2012 for U.S. Appl. No. 13/194,783, filed Jul. 29, 2011.
USPTO “Non-Final Office Action” dated Sep. 14, 2012 for U.S. Appl. No. 12/635,138, filed Dec. 12, 2009.
China State Intellectual Property Office, Office Action, dated Sep. 29, 2013 for Chinese Patent Application No. 200810161874.X.
Intellectual Property Office “Office Action” dated Feb. 25, 2013 for Taiwan Patent Appln. No. 098146025.
State Intellectual Property Office of the People's Republic of China, Notification of Reexamination for Chinese Patent Application No. 200810161874.X dated Apr. 7, 2015.
USPTO, Notice of Allowance and Fee(s) Due for U.S. Appl. No. 12/827,964 dated Apr. 29, 2015.
USPTO, Office Action for U.S. Appl. No. 13/609,033 dated May 7, 2015.
Japan Patent Office, Notice of Grounds for Rejection for Japanese Divisional Patent Application No. 2014-166162 dated Apr. 28, 2015.
USPTO, Office Action in U.S. Appl. No. 14/798,227 dated Sep. 21, 2015.
Utility U.S. Appl. No. 14/809,012, filed Jul. 24, 2015.
U.S. Appl. No. 60/577,833, filed Jun. 7, 2004.
U.S. Appl. No. 60/823,066, filed Aug. 21, 2006.
USPTO, Final Office Action in U.S. Appl. No. 13/730,425 dated Oct. 18, 2016.
USPTO, Notice of Allowance and Fee(s) Due in U.S. Appl. No. 13/194,834 dated Oct. 21, 2016.
USPTO, Office Action in U.S. Appl. No. 15/251,483 dated Nov. 4, 2016.
USPTO, Office Action in U.S. Appl. No. 14/979,271 dated Sep. 2, 2016.
USPTO, Office Action in U.S. Appl. No. 13/194,783 dated Sep. 20, 2016.
USPTO, Notice of Allowance and Fee(s) Due in U.S. Appl. No. 14/809,012 dated Dec. 6, 2017.
USPTO, Notice of Allowance and Fee(s) Due for U.S. Appl. No. 14/979,271 dated Mar. 27, 2017.
USPTO, Final Office Action for U.S. Appl. No. 13/194,783 dated Apr. 12, 2017.
USPTO, Notice of Allowance and Fee(s) Due in U.S. Appl. No. 14/558,490 dated Sep. 9, 2015.
USPTO, Office Action in U.S. Appl. No. 13/194,834 dated Sep. 18, 2015.
USPTO, Office Action in U.S. Appl. No. 13/730,425 dated Sep. 21, 2015.
USPTO, Office Action in U.S. Appl. No. 14/809,012 dated Sep. 21, 2015.
USPTO, Notice of Allowance and Fee(s) Due in U.S. Appl. No. 14/468,059 dated Sep. 29, 2015.
U.S. Appl. No. 14/809,012, filed Jul. 24, 2015.
State Intellectual Property Office of the People's Republic of China, Office Action in Chinese Patent Application No. 200780030811.4 dated Oct. 27, 2015.
USPTO, Final Office Action in U.S. Appl. No. 13/609,033 dated Nov. 18, 2015.
State Intellectual Property Office of the People's Republic of China, Reexamination Decision in Chinese Patent Application No. 200810161874.X dated Nov. 27, 2015.
USPTO Final Office Action, U.S. Appl. No. 13/194,783, dated Mar. 10, 2016.
USPTO Final Office Action, U.S. Appl. No. 13/194,834, dated Apr. 12, 2016.
USPTO, Notice of Allowance and Fee(s) Due in U.S. Appl. No. 15/221,882 dated Sep. 20, 2017.
USPTO, Office Action in U.S. Appl. No. 15/429,588 dated Oct. 3, 2017.
China Patent and Trademark Office, Office Action for Chinese Patent Application 201610143560.1, dated Apr. 4, 2018.
Related Publications (1)
Number Date Country
20090157697 A1 Jun 2009 US
Provisional Applications (3)
Number Date Country
61019232 Jan 2008 US
60823066 Aug 2006 US
60577833 Jun 2004 US
Continuation in Parts (2)
Number Date Country
Parent 11620711 Jan 2007 US
Child 12347465 US
Parent 11147664 Jun 2005 US
Child 11620711 US