Content recordation techniques

Information

  • Patent Grant
  • 9021529
  • Patent Number
    9,021,529
  • Date Filed
    Thursday, July 15, 2004
  • Date Issued
    Tuesday, April 28, 2015
Abstract
Content recordation techniques are described. In an implementation, a method includes querying electronic program guide (EPG) data to determine if a content item described in a recording document is available for recording. If the content item is available, a reference is added to a recording list for causing recordation of the content item.
Description
TECHNICAL FIELD

The present invention generally relates to the field of content and in particular to content recordation techniques.


BACKGROUND

Users are continually exposed to an ever-increasing variety of clients that provide network access, such as set-top boxes, wireless phones, computers, and so on. A user of a set-top box, for instance, may view traditional television programming obtained from a broadcast network for display on a television, as well as order pay-per-view movies, receive video-on-demand (VOD), play “live” video games, and so on. Likewise, a user of a wireless phone may place and receive traditional telephone calls, as well as read email, download digital music, and so on.


Another such example is a digital video recorder (DVR). A DVR typically includes non-volatile storage (e.g., a hard disk) that enables the user to record desired content. DVRs also offer control functionality, such as the ability to pause content that is currently being broadcast, allowing viewers to watch the content, while still in progress, from the point at which it was paused. The DVR plays back the content from storage, starting at the pause event, while continuing to record the currently-broadcast content. Additionally, the DVR may support other control functions, such as rewinding, fast forwarding a stored program, slow-motion playback, and the like.


To record content using a DVR, a user was typically required to directly interact with the DVR itself. In some instances, the user could configure the DVR to record related content by specifying parameters to be matched with those of available content to locate potentially desirable content. For example, the user could specify the title of a television program so that the DVR would record each television program having that title. However, the user was not assured that the DVR would record a particular content item of interest. In other words, the user could not be certain that the potentially desirable content recorded by the DVR corresponded with the actual content the user wished to record. For example, although the DVR may be configured to record a particular television program, the DVR might fail to record a special regarding the actors in that particular television program. Therefore, when the user was located “away” from the DVR, the user could not cause the DVR to record the particular content item, even if the user had access to one or more of the clients that provide network access as previously described.


Accordingly, there is a continuing need for improved content recordation techniques.


SUMMARY

Content recordation techniques are described. The content recordation techniques may be utilized when the user is local to and remote from the client. For example, a user, when remotely located from a client configured as a digital video recorder (DVR), interacts with a remote device that is configured as a wireless phone. The user utilizes the wireless phone to access a review of a television program via the Internet. Based on the review, the user invokes a recording document that is embedded in the review to be communicated to the remote client. The recording document describes the television program, such as by describing a title, actors, broadcast time, service (e.g., channel) that broadcasts the television program, and so on.


Upon receipt of the recording document, the remote client executes a parser module to examine the recording document to determine if the television program described in the recording document is available for being recorded by the remote client. For instance, the recording document may be compared with electronic program guide (EPG) data that is received from a head end, EPG data service, and so on. The EPG data may be utilized to determine if the television program is available. The EPG data may also be utilized to determine how the television program is to be recorded, such as by supplying a channel and broadcast start time. If the television program is available, a reference to the television program is added to a recording list based on the EPG data. For example, the broadcast channel and the broadcast start time may be added to the recording list. The recording list is then utilized by the remote client to cause the client to record the content. In another instance, the recording document may cause the head end to cause the client to record the content, such as by examining EPG data stored at the head end to determine if the content is available for recording. If so, the head end causes the client to record the content. In a further instance, the recording document may cause the head end itself to record the content, such as in a network digital video recorder (NDVR) scenario.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is an illustration of an environment in an exemplary implementation that includes a content provider that is communicatively coupled to a client over a network.



FIG. 2A is an illustration of an exemplary implementation showing a distribution server, the client, a head end, and a remote device of FIG. 1 in greater detail such that the head end is configured to record content.



FIG. 2B is an illustration of another exemplary implementation showing a distribution server, the client, a head end, and a remote device of FIG. 1 in greater detail such that the client is configured to record content.



FIG. 3 is an illustration of a system showing a variety of content recordation techniques as implemented by the distribution server and the client of FIG. 2A.



FIG. 4 is a flow diagram depicting a procedure in an exemplary implementation in which a recording document is utilized to record a particular content item.



FIG. 5 is an illustration of a system in an exemplary implementation in which a graphical user interface (GUI) is provided by a recording module to dynamically generate a recording document based on user input.



FIG. 6 is an illustration of a system in an exemplary implementation in which the recording module is executed to examine a textual description of content to dynamically generate a recording document.



FIG. 7 is a flow diagram depicting a procedure in an exemplary implementation in which a client dynamically generates a recording document that is utilized to determine availability of a particular content item for recording by the client.





The same reference numbers are utilized throughout the discussion to reference like structures and components.


DETAILED DESCRIPTION

Overview


Content recordation techniques are described. In an implementation, a content recordation technique is described in which a client, such as a digital video recorder (DVR), is configured to record content streamed from a head end through use of a recording document that describes the content. The recording document may be provided in a variety of ways, such as embedded in a web site, shared via email or text messaging, submitted via an application programming interface (API), manually written by a user, and so on. The recording document is processed via a parser module to locate the content item that is described by the recording document. In one scenario, the parser module compares the recording document with an electronic program guide (EPG) that is stored on the client to find a matching content item that is described in the EPG data. If a sufficient match is found, a reference to the matching content item is added to a recording list based on the EPG data, such as a broadcast channel and time to record the matching content item. In another scenario, the head end processes the recording document provided by a remote device to determine whether the described content item is available. If so, the head end then causes the client to record the particular content item, such as through communication of a recording list to the client. Thus, a user may cause a particular content item to be recorded without direct interaction with the client.


In a further implementation, the head end stores client state data to process content recordation requests. For example, the head end may include client state data, such as ratings limits, favorite channels, levels of service, and so on, that is accessible locally by the head end. The head end may utilize this client state data to determine if the client is permitted to access the content item described by the recording document. If so, the head end may then cause the client to record the content. By performing the determination utilizing client state data at the head end, the head end provides an authoritative source for processing requests to record content by the client. This may result in a variety of increased functionality that is available to the user, such as an ability to change from an old client to a new client without manually updating client state data from the old client to the new client, remote initiation of content recordation without obtaining a connection with the client itself, and so on.


Exemplary Environment



FIG. 1 is an illustration of an environment 100 in an exemplary implementation showing a content provider 102 that is communicatively coupled to a client 104 over a network 106. The network 106 in the following implementations is an example of a wide area network (WAN), such as the Internet, and may also include a variety of other networks, such as a broadcast network, an intranet, a wired or wireless network, and so forth.


The client 104 is configured to receive content communicated from the content provider 102 over the network 106. The content provider 102 includes content 108(k), where “k” can be any integer from 1 to “K”, that is locally stored on the content provider 102. The content 108(k) may include a variety of data, such as television programming, video-on-demand, one or more results of remote application processing, and so on. The content provider 102 communicates the content 108(k) over a network 110 to a head end 112. The network 110 may be the same as or different from network 106. For example, the network 110 may represent a dedicated network connection between the content provider 102 and the head end 112 while network 106 is implemented by the Internet, both networks 106, 110 may be the Internet, and so on.


The content 108(k) may then be stored in a database 114 as content 116(n), where “n” can be any integer from 1 to “N”, on the head end 112 for communication over the network 106 to the client 104. In other words, the content 116(n) stored in the database 114 may be copies of the content 108(k) received from the content provider 102 over the network 110.


The head end 112, as illustrated, includes a distribution server 118 to format and distribute the content 116(n) over the network 106. Distribution from the head end 112 to the client 104 may be accomplished in a number of ways, including cable, RF, microwave, and satellite. Although the head end 112 is illustrated as separate from the content provider 102, the content provider 102 may also include the head end 112.


The head end 112 may also include a database 120 having a plurality of EPG data 122(m), where “m” can be any integer from one to “M”. The EPG data 122(m) is used to construct an EPG 124 for display by the client 104 to a user. The EPG 124, for instance, may enable the user to observe a listing of television programs that are currently being broadcast from the head end 112, as well as a listing of television programs that will be broadcast in the future. Additionally, the EPG 124 may allow the viewer to navigate to a television program (e.g., content 116(n)) from the EPG 124 itself. To provide additional information to the user, the EPG 124 may include one or more content characteristics that describe content represented in the EPG 124. The content characteristics may include title, broadcast time, broadcast channel, output duration of the content, plot description, a rating (e.g., G, PG, PG-13, R, etc.), a principal actor's name, and so on. The EPG data may be communicated to the client 104 in a variety of ways. In one instance, the EPG data 122(m) is broadcast to the client 104 utilizing a carousel file system. The carousel file system repeatedly broadcasts the EPG data over an out-of-band (OOB) channel to the client 104 over the network 106. Although the head end 112 is illustrated as including the EPG data 122(m), in another instance the EPG data 122(m) is provided over the network 106 utilizing a separate EPG data service.
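
By way of illustration only, the EPG data 122(m) described above may be thought of as a collection of records carrying these content characteristics. The following Python sketch shows one hypothetical in-memory representation; the field names are assumptions chosen for this discussion and are not part of the described system.

# Illustrative sketch only: a hypothetical in-memory representation of EPG data
# 122(m), using the content characteristics listed above. Key names are
# assumptions chosen for illustration.
from datetime import datetime, timezone

SAMPLE_EPG = [
    {
        "title": "Friends",
        "channel": "NBC",                                       # broadcast service
        "start_time": datetime(2003, 10, 15, 8, 0, tzinfo=timezone.utc),
        "duration_minutes": 30,
        "rating": "PG",
        "actors": ["Jennifer Aniston", "Matthew Perry"],
        "plot": "A very special episode.",
    },
]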


The client 104 may be configured as a computer that is capable of communicating over the network 106, such as a desktop computer, a mobile station, an entertainment appliance, a set-top box 126 that is communicatively coupled to a display device 128 as illustrated, and so forth. Although the set-top box 126 is shown separately from the display device 128, the set-top box 126 may be built into the display device 128 to form an integral unit. The client 104 may also relate to a person and/or entity that operates the client 104. In other words, client 104 may describe a logical client that includes a user and/or a machine. Although one client 104 is illustrated, a plurality of clients may be communicatively coupled to the network 106.


The client 104 may also include a database 130 having locally stored content 132(j), where “j” can be any integer from 1 to “J”. For example, the client 104 may be configured as a DVR that stores the database 130 in hard disk memory. Due to the size of the memory, users are able to record content, such as content 116(n) streamed from the head end 112. As previously described, the DVR also offers control functions, such as the ability to pause content that is currently being broadcast, allowing viewers to watch the content, while still in progress, from the point at which it was paused. The DVR plays back the content from disk memory, starting at the pause event, while continuing to record the currently-broadcast content in the disk memory. Additionally, the DVR may support other control functions, such as rewinding, fast forwarding a stored program, slow-motion playback, and the like.


The client 104 is equipped with sufficient processing and storage capabilities to store and run a navigation module 134. The navigation module 134, when executed on the client 104, provides control functions for interacting with content. For example, the control functions may include the DVR control functions as previously discussed, as well as channel selection, electronic program guide (EPG) navigation, and so on. In another implementation, the navigation module 134 provides media player functionality, such as to play media having audio and/or visual data, such as MP3 data.


In a further implementation, the client 104 may execute the navigation module 134 to cause recordation of the content 116(n) at the distribution server 118. For example, the navigation module 134 may form a request that is communicated to the distribution server 118 over the network 106 to record content 108(k) communicated to the distribution server 118 from the content provider 102. The distribution server 118, in response to the request, records the requested content such that the navigation module 134 operates as a network digital video recorder (NDVR). Thus, through execution of the navigation module 134, the client 104 may play back locally-stored content 132(j), content 116(n) that is stored remotely over the network 106, and may even control the recordation and playback of the remotely stored content 116(n) to the client 104.


Generally, any of the functions described herein can be implemented using software, firmware (e.g., fixed logic circuitry), manual processing, or a combination of these implementations. The terms “module,” “functionality,” and “logic” as used herein generally represent software, firmware, or a combination of software and firmware. In the case of a software implementation, the module, functionality, or logic represents program code that performs specified tasks when executed on a processor (e.g., CPU or CPUs). The program code can be stored in one or more computer readable memory devices. The content recordation techniques described below are platform-independent, meaning that the content recordation techniques may be implemented on a variety of commercial computing platforms having a variety of processors.


The environment 100 supports a variety of techniques for recordation of the content 132(j), 116(n) through use of a recording document 136. The recording document 136, for instance, describes content to be recorded and may conform to an Extensible Markup Language (XML) schema that is parsable by a parser module 138 to locate a particular content item. For example, the recording document 136 may describe a title and a start time for a desired content item. The parser module 138 is executed on the distribution server 118 to compare the title and the start time described in the recording document with the EPG data 122(m) stored in the database 120 to determine if and when the particular content item is available. If the particular content item is available, a reference to the particular content item is added to a recording list 140 to cause the particular content item to be recorded. For example, the recording list 140 may be utilized by the distribution server 118 to record content 116(n) at the head end 112 in an NDVR scenario. In another example, the recording list 140 is communicated to the client 104 to cause the navigation module 134 to record content 132(j) locally in a DVR scenario. In a further example, the recording document 136 is communicated from the remote device 142 to the client 104 for parsing by the client 104, an example of which is shown in relation to FIG. 2B.
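
As a minimal sketch of the parse-and-compare flow just described, the following Python example parses a simplified recording document, compares its title and start time against EPG entries represented as in the earlier sketch, and appends a reference to a recording list. The element names and matching rule are assumptions for illustration and do not reflect the complete recording schema discussed later.

# Illustrative sketch: parse a simplified recording document, compare its title
# and start time against EPG entries (dictionaries keyed as in the earlier
# sample), and add a reference to a recording list if a match is found.
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

def parse_recording_document(xml_text):
    """Return the (title, start_time) described by a simplified recording document."""
    root = ET.fromstring(xml_text)
    title = root.findtext("title")
    start = root.findtext("starttime")
    return title, (datetime.fromisoformat(start) if start else None)

def add_to_recording_list(epg_entries, title, start_time, recording_list):
    """Append a (channel, start time) reference if a matching EPG entry exists."""
    for entry in epg_entries:
        if entry["title"] == title and (start_time is None or entry["start_time"] == start_time):
            recording_list.append({"title": entry["title"],
                                   "channel": entry["channel"],
                                   "start_time": entry["start_time"]})
            return True
    return False  # content item is not available for recording

if __name__ == "__main__":
    epg = [{"title": "Friends", "channel": "NBC",
            "start_time": datetime(2003, 10, 15, 8, 0, tzinfo=timezone.utc)}]
    recording_list = []
    title, start = parse_recording_document(
        "<record><title>Friends</title>"
        "<starttime>2003-10-15T08:00:00+00:00</starttime></record>")
    print(add_to_recording_list(epg, title, start, recording_list), recording_list)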


The recording document 136 may be provided in a variety of ways. As illustrated in FIG. 1, the recording document 136 is stored on a remote device 142 that is communicatively coupled to the network 106. Therefore, a user of the remote device 142 may provide the recording document 136 to the head end 112 to cause recording of content described by the recording document 136. Thus, the recording document 136 may be provided remotely by the remote device 142 such that the user does not need to interact with the client 104 locally to cause recordation of desired content. A variety of other ways of providing the recording document 136 are described in relation to FIGS. 2A, 2B, and 3.



FIG. 2A is an illustration of an exemplary system 200 showing the distribution server 118, the client 104, and the remote device 142 of FIG. 1 in greater detail. The client 104 includes a processor 202 and memory 204. Processors are not limited by the materials from which they are formed or the processing mechanisms employed therein. For example, processors may be comprised of semiconductor(s) and/or transistors (e.g., electronic integrated circuits (ICs)). In such a context, processor-executable instructions may be electronically-executable instructions. Alternatively, the mechanisms of or for processors, and thus of or for a computing device, may include, but are not limited to, quantum computing, optical computing, mechanical computing (e.g., using nanotechnology), and so forth. Memory 204 may include one or more memory devices, such as read only memory (ROM), random access memory (RAM), hard disk memory, removable media memory devices, and so on.


The navigation module 134 and the recording list 140 are illustrated as being executed on the processor 202 and are storable in memory 204. The EPG 124 is illustrated as being stored in the memory 204 and is executable on the processor 202. It should be noted that in the illustrated system 200 of FIG. 2A, the recording list 140 is depicted within the client 104 to show that the recording list 140 may be configured for use by the navigation module 134 to cause recordation of content, further discussion of which may be found starting in relation to FIG. 4.


The client 104 may also include a network interface 206 for receiving the content 116(n) of FIG. 1 that is communicated (e.g., streamed) over the network 106. For example, the network interface 206 may be configured as a tuner that receives broadcast content from over the network 106, may be configured as a transmitter/receiver (transceiver) that is suitable for two-way communication over the network 106, and so on. Thus, the network interface 206 may be configured to transmit and receive messages over the network to and from the head end 112 and/or the remote device 142.


Content 116(n) received from the network 106 via the network interface 206 may be stored in the database 130 for later output by the client 104 and/or provided for immediate output of the content 116(n). The database 130 is illustrated as being separate from memory 204, but may also be included in memory 204. For example, the storage device for the database 130 may be configured as a hard disk drive and the memory 204 may be configured as RAM, both the memory 204 and the database 130 may be implemented as RAM, both the memory 204 and the database 130 may be implemented as removable memory, and so forth. The client 104 executes the navigation module 134 to retrieve the content 132(j) from the database 130 and output the content 132(j) through an output interface 208 for rendering on the display device 128. Thus, in this implementation, the client 104 is capable of operating as a DVR that stores and plays back the content 132(j).


The client 104 may be locally controlled by a user via inputs provided by an input device 210. The inputs are received by the client 104 from an input interface 212 over a local connection 214. The input interface 212, local connection 214 and input device 210 may be configured in a variety of ways. For example, the input interface 212 may be configured as a wireless port, such as an infrared (IR) or Bluetooth wireless port, for receiving wireless communications from input device 210, such as a remote control device, a handheld input device, or any other wireless device, such as a wireless keyboard. In alternate embodiments, the input interface 212 may use an RF communication link or other mode of transmission to communicate with client 104, such as a wired connection which may include a universal serial bus (USB) connection, and so on.


When output of content is requested, the navigation module 134 is executed on the processor 202 to obtain content, such as from content 116(n) of FIG. 1 that is streamed from the distribution server 118 over the network 106, content 132(j) that is stored locally on the database 130, and so on. The navigation module 134 may also restore the content to the original encoded format as provided by the content provider 102 of FIG. 1. For example, content 116(n) of FIG. 1 may be compressed and then streamed from the distribution server 118 to the client 104. Therefore, when the navigation module 134 receives the content, the content may be decompressed for rendering by the display device 128.


The distribution server 118 also includes a processor 216 and memory 218. The parser module 138 is illustrated as being executed on the processor 216 and is storable in memory 218. The memory 218 of the distribution server 118 is also illustrated as including a plurality of client state data 220(l), where “l” can be any integer from 1 to “L”. The client state data 220(l) is utilized to process requests to record content, such as at the head end 112 of FIG. 1 as content 116(n) and/or at the client in database 130 as content 132(j). For example, the distribution server 118 may include a content manager module 222 (hereinafter, “manager module”) that is executable on the processor 216 to manage client 104 content access. The client state data 220(l), for instance, may specify parental blocks to prevent viewing of content items, may specify conditional access rights (e.g., digital access rights) of the client for particular items of content, rating limits, favorite channels, level of service provisioned, and so on. The manager module 222, when executed, may determine if the client 104 is permitted (i.e., authorized) to record the particular content item, and if so, the distribution server 118 causes the client 104, and specifically the navigation module 134, to record the particular content item. In this way, the head end 112 provides an authoritative source for client state data 220(l) in the environment 100 and the system 200 as shown, respectively, in FIGS. 1 and 2A. In an implementation, the distribution server 118 may be considered the primary source for the client state data 220(l) for a particular client, even over the client 104 itself. For example, by storing the client state data 220(l) on the distribution server 118, a user may switch set-top boxes without transferring client state data between the set-top boxes. Further discussion of use of the client state data 220(l) may be found in relation to FIG. 7.
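
A hypothetical permission check of the sort performed by the manager module 222 might be sketched as follows. The particular fields consulted (parental blocks, subscribed channels, a rating limit) are assumptions standing in for the client state data 220(l); the described system is not limited to this logic.

# Illustrative sketch of a head-end permission check against client state data
# 220(l). The fields used here (rating limit, subscribed channels, parental
# blocks) are assumptions standing in for the client state described above.
RATING_ORDER = ["G", "PG", "PG-13", "R"]

def is_recording_permitted(client_state, epg_entry):
    """Return True if the client may record the content item described by epg_entry."""
    if epg_entry["channel"] in client_state.get("parental_blocks", []):
        return False
    if epg_entry["channel"] not in client_state.get("subscribed_channels", []):
        return False
    limit = client_state.get("rating_limit", "R")
    if epg_entry.get("rating") not in RATING_ORDER:
        return False  # unknown rating: deny conservatively
    return RATING_ORDER.index(epg_entry["rating"]) <= RATING_ORDER.index(limit)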


The remote device 142 is also illustrated as including a processor 224 and memory 226. The recording document 136 is illustrated as being stored in memory 226 and is executable on the processor 224. The remote device 142 also includes a recording module 228 which is illustrated as being executed on the processor 224 and is storable in memory 226. The recording module 228, when executed, generates the recording document 136 for a particular content item. For example, the recording module 228 may provide a user interface for accepting user inputs that describe a particular content item. The user inputs are processed by the recording module 228 to generate a recording document 136 that follows a schema that is understood by the parser module 138. The parser module 138, when executed, parses the recording document 136 to locate and compare descriptive data in the recording document 136 with EPG data 122(m) in the database 120. EPG data 122(m) that matches the descriptive data is then utilized to determine how to record a particular content item described in the recording document 136. A reference to the particular content item is then added to the recording list 140 to cause the navigation module 134 to be executed on the client 104 to record the particular content item. A variety of other scenarios are also contemplated such that the user may cause recordation of content at the client 104 and/or the head end 112 of FIG. 1, further examples of which may be found in relation to the following figure.



FIG. 2B is an illustration of an exemplary system 250 showing the distribution server 118, the client 104, and the remote device 142 of FIG. 1 in greater detail such that the client includes the functionality to parse a recording document. In the system 200 described in relation to FIG. 2A, the distribution server acted as a central repository for client state data 220(l) and executed the parser module 138 to parse the recording document 136.


In the exemplary system 250 illustrated in FIG. 2B, however, the client 104 executes the parser module 138 to parse the recording document 136 that is communicated from the remote device 142 over the network 106. The parser module 138 may then be utilized to populate the recording list 140 as previously described by comparing the recording document 136 with EPG 124 that is stored in the memory 204. Thus, in this instance, the head end 112 acts to broadcast content 116(n) over a broadcast network 252 and does not actively participate in the recordation of the content 132(j) on the client 104, further discussion of which may be found in relation to FIG. 6.



FIG. 3 is an illustration of a system 300 showing a variety of content recordation techniques as implemented by the distribution server 118 and the client 104 of FIG. 2A. One such content recordation technique is the inclusion of the recording document 302 within content 108(k). For example, the client 104 may receive content 108(k) from the content provider 102 of FIG. 1. The content 108(k) in this example is a television program which includes credits which describe the actors, producers, and so on. The credits may also include a preview for next week's episode of the television program. The preview has an embedded recording document 302 which causes an interactive icon to appear that, when selected, allows the user to automatically schedule a recording for the next episode of the television program.


The recording document 302 is then communicated to the distribution server 118 over the network 110 for parsing by the parser module 138. The parser module 138, when executed, locates the particular content item (e.g., the next episode of the television program) based on the recording document 302 and the EPG data 122(m) and adds a reference to the particular content item to a recording list 140. The recording list 140 causes the navigation module 134 to record the content locally in the database 130 as content 132(j). Thus, the recording document 302 embedded in the content 108(k) provides for automatic recording of the next episode of the television program with minimal user intervention.


In another such technique, a remote content recordation technique is provided by using a remote record service 304. For example, the remote record service 304 may provide a web site which enables a user to select content for recording. The remote record service 304 may then communicate a recording document 306 that describes the content selected by the user to cause the client 104 to automatically record the selected content as previously described.


In a further such technique, the user interacts with the remote device 142 to remotely record content using the client 104. For example, the remote device 142 may execute an email module 308 that causes an email that contains a recording document 310 to be communicated to the distribution server 118. The distribution server 118 may then execute the parser module 138 to compare the descriptive data in the recording document 310 with the EPG data 122(m) to determine if the particular content item referenced in the email is available. If so, the parser module 138 may then be executed to determine if access to the particular content item is permitted by the client 104 based on the client state data 220(l). For example, the client state data 220(l) may indicate whether the user subscribes to a content service package that includes the particular content item. If the user does have conditional access rights, the particular content item is added to the recording list 140 for causing the navigation module 134 to record the particular content item as content 132(j) in the database 130.


In another example, the remote device 142 may include a text messaging module 312 to receive a text message from another remote device. The text message may describe a particular content item, such as by providing the title, names of actors, genre, and so on. The text message may be examined to dynamically generate the recording document 310 that contains the content descriptions from the text message. The recording document 310 may then be communicated to the distribution server 118 for processing as previously described. In this example, the recording document 310 is dynamically generated, further discussion of which may be found in relation to FIGS. 5-7. Although the system 300 of FIG. 3 described the execution of the parser module 138 on the distribution server 118, the parser module 138 may also be executed on the client 104. For example, the parser module 138, when executed on the client 104, may compare the descriptive data in the recording document 306 with the EPG 124 (e.g., the EPG data utilized to form the EPG stored on the client 104) to determine availability of the particular content item. Further discussion of execution of the parser module 138 by the client 104 may also be found in relation to FIGS. 5-7. Although a variety of exemplary content recordation techniques have been described, a variety of other content recordation techniques may also be provided that utilize a recording document for comparison with EPG data.


Exemplary Procedures


The following discussion describes content recordation techniques that may be implemented utilizing the previously described systems and devices. Aspects of each of the procedures may be implemented in hardware, firmware, or software, or a combination thereof. The procedures are shown as a set of blocks that specify operations performed by one or more devices and are not necessarily limited to the orders shown for performing the operations by the respective blocks.



FIG. 4 is a flow diagram depicting a procedure 400 in an exemplary implementation in which a recording document is utilized to record a particular content item. At block 402, a recording document 404 is invoked by a remote device 142 for a particular content item. For example, the remote device 142 may access a web site 406 that provides an output of a web page 408 for viewing at the remote device 142. The web page 408, when provided to the remote device 142, may also include the recording document 404 that describes a particular content item. For instance, the web page 408 may include a review of a television show that is available from a broadcast from a head end 112. If the user wishes to record the television show, the user selects a link in the web page 408, which causes the recording document to be invoked.


The recording document 404 in this example follows an XML recording schema. The XML recording schema is an abstract representation depicting the interrelationship between attributes and elements of an XML object, which in this instance is the recording document 404 or a portion of the recording document 404. An example of the recording document 404 which complies with an exemplary XML recording schema is shown as follows:














<?xml version="1.0" encoding="utf-8" ?>
<!-- Sample Click-to-Record (NBC Single-Episode Scenario) -->
<clickToRecord xmlns="urn:schemas-microsoft-com:ehome:clicktorecord">
  <ds:Signature xmlns:ds="http://www.w3.org/2000/09/xmldsig#">
    <!-- XML Signature goes here -->
  </ds:Signature>
  <body>
    <metadata>
      <!-- The following information should be considered insecure unless signed. -->
      <description>A very special episode of Friends.</description>
      <moreInfoUrl>http://www.nbc.com/friends/</moreInfoUrl>
    </metadata>
    <!-- hard prepad and postpad 5 minutes - if airing specified isn't found, suggest another one -->
    <programRecord prepadding="5" postpadding="5" allowAlternateAirings="true" allowAlternateServices="false">
      <program>
        <key field="urn:schemas-microsoft-com:ehome:epg:program#title" match="exact">Friends</key>
        <key field="urn:schemas-microsoft-com:ehome:epg:program#episodetitle" match="exact">The One Where Chandler Marries Monica</key>
      </program>
      <service>
        <key field="urn:schemas-microsoft-com:ehome:epg:service#affiliate" match="startswith">NBC</key>
      </service>
      <!-- Folks in PST have a time (in UTC) specified -->
      <airing timeZone="EST">
        <key field="urn:schemas-microsoft-com:ehome:epg:airing#starttime">2003-10-15T08:00:00Z</key>
      </airing>
      <!-- Other folks record if the show is found within 3 hours of the UTC time specified -->
      <airing searchSpan="180">
        <key field="urn:schemas-microsoft-com:ehome:epg:airing#starttime">2003-10-15T08:00:00Z</key>
      </airing>
    </programRecord>
  </body>
</clickToRecord>










The outermost element <clickToRecord> is the root element of the recording document 404, which is defined by the namespace “urn:schemas-microsoft-com:ehome:clicktorecord”. The <clickToRecord> element contains a <body> element plus an optional digital signature conforming to the XML Signature specification.
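
For illustration, the sample document above can be walked with a namespace-aware XML parser. The following Python sketch uses only standard-library calls and the namespaces shown in the sample; it is not a production implementation of the parser module 138.

# Illustrative sketch: walking the sample <clickToRecord> document above with a
# namespace-aware XML parser. Variable names are assumptions for illustration.
import xml.etree.ElementTree as ET

CTR_NS = "urn:schemas-microsoft-com:ehome:clicktorecord"
DSIG_NS = "http://www.w3.org/2000/09/xmldsig#"
NS = {"ctr": CTR_NS, "ds": DSIG_NS}

def read_click_to_record(xml_text):
    root = ET.fromstring(xml_text)
    assert root.tag == f"{{{CTR_NS}}}clickToRecord"
    signature = root.find("ds:Signature", NS)          # optional XML Signature
    body = root.find("ctr:body", NS)
    metadata = body.find("ctr:metadata", NS)
    description = metadata.findtext("ctr:description", default="", namespaces=NS)
    program_records = body.findall("ctr:programRecord", NS)
    return {"signed": signature is not None,
            "description": description,
            "programRecord_count": len(program_records)}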


The <body> element contains a single <metadata> element followed by one or more <programRecord> elements. The <metadata> element may encapsulate several sub-elements to provide additional data that describes the requested content item. The following is a list of exemplary <metadata> sub-elements:













Element          Usage
<description>    Description of the package (e.g., A Very Special Episode of Friends)
<expires>        Date/Time after which the recording document expires.
<moreInfoUrl>    Hyperlink to the source's website (e.g., http://entertainment.msn.com/tv)
<updateUrl>      Pointer to a URL that may contain updated versions of the recording document to account for schedule changes.










One or more record definition elements may be included after the <metadata> element as shown in the above exemplary recording document 404. A <programRecord> record definition element is included which covers both one-time and series recording scenarios.


As shown in the sample document above, the <programRecord> element may include several optional attributes, examples of which are described as follows:















Attribute                Default  Usage
prepadding/postpadding   0/0      Specifies pre-padding and/or post-padding of the recording in minutes. For example, padding may be utilized to account for a lack of clock synchronization between the head end 112 of FIG. 1 and the client 104.
allowAlternateAirings    true     Specifies how to handle instances in which a specified broadcast of a content item cannot be found. If this attribute is “true”, a search for the same content item is performed for different scheduled times and the user is informed of the change, if any. If this attribute is “false”, and the show is not found within the specified time window, then the request will fail and the user will be informed.
allowAlternateServices   false    Similar to the “allowAlternateAirings”, however, by specifying “true” different content providers (e.g., broadcasters) may be specified.
programDuration          0        Specifies an output duration of the content item. This may be used if the content item is not found in the current EPG and a temporary recording event must be created. If 0, the content item must appear in the guide or the query fails.
firstRunOnly             false    Don't record reruns.
daysOfWeek               0x7F     Indicates which days of the week a content item may be recorded for manual and generic “keyword” requests.
isRecurring              false    This element differentiates between one-time and series request behavior for the content item.









The <programRecord> element may include a variety of element types as children that further describe the particular content item, such as <program> (e.g., a title of a television program), <service> (e.g., a broadcast channel that provides the television program), and <airing> (e.g., a time when the television program is to be broadcast). Each of these elements may occur more than once in the recording document 404.


At block 410, the remote device communicates the recording document 404 to the head end 112. For example, the recording document 404 may be transmitted (i.e., pushed) over the network 106, implemented using the Internet, for receipt by the head end 112. In another example, the recording document is “pulled” from the remote device 142 by the head end 112. For instance, the head end 112 may be configured to periodically monitor the remote device 142 for presence of the recording document 404.


At block 412, the head end 112 queries the database 120 of EPG data 122(m) to determine if the particular content item described by the recording document 404 is available. The head end 112, for instance, may execute the parser module 138 to locate data in the recording document 404 that describes the particular content item, which is illustrated at block 412 of FIG. 4 as “content description 414”. The content description 414 (i.e., the descriptive data) is then compared with the EPG data 122(m) to find a match. For example, the <program>, <service>, and <airing> elements may be compared with the EPG data 122(m) to find a particular content item which most satisfies those elements. Thus, the parser module 138, when executed, may determine how to record the particular content item that is described by the recording document by cross-referencing the content description 414 with the EPG data 122(m).
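
The cross-referencing performed at block 412 might be sketched as follows. The key identifiers and match modes mirror the sample recording document shown earlier, while the EPG record fields are assumptions carried over from the earlier sketches; the actual query performed by the head end 112 may differ.

# Illustrative sketch of cross-referencing a content description 414 with EPG
# data. The key URNs and match modes mirror the sample recording document shown
# earlier; the EPG dictionary keys are assumptions for illustration.
TITLE_KEY = "urn:schemas-microsoft-com:ehome:epg:program#title"
AFFILIATE_KEY = "urn:schemas-microsoft-com:ehome:epg:service#affiliate"

def key_matches(mode, wanted, actual):
    if mode == "exact":
        return actual == wanted
    if mode == "startswith":
        return actual.startswith(wanted)
    return False

def find_matching_entry(epg_entries, keys):
    """keys: list of (field URN, match mode, value) tuples from <program>/<service>."""
    for entry in epg_entries:
        fields = {TITLE_KEY: entry["title"], AFFILIATE_KEY: entry["channel"]}
        if all(key_matches(mode, value, fields.get(field, ""))
               for field, mode, value in keys):
            return entry  # the matching EPG entry supplies the channel and start time
    return None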


In an implementation, the head end 112 utilizes minimum search field requirements before querying the database that contains EPG data 122(m) (block 412). For example, specific combinations of search criteria, when included in a recording document, may result in a failure of the query (block 412) due to an insufficient amount of information (e.g., elements) to locate the particular content item. For instance, a recording document that only specifies a <service> (e.g., a broadcast channel) may be considered as invalid unless a corresponding <program> and/or <airing> is provided. The following is a listing of exemplary legal combinations of the three elements previously described:


<program> (e.g., record this program anytime it is streamed, on any service);


<program>, <service> (record this program anytime it is streamed from a specified service);


<program>, <airing> (record this program at this time from any service);


<program>, <service>, <airing> (record this program at this time from this service); and


<service>, <airing> (record the named service at the given time).


Although three elements are described, a variety of other elements and combinations thereof may also be included in the recording document 404 to locate a particular content item.
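
One hypothetical way to enforce such minimum search field requirements before issuing the query is sketched below; the rule set simply encodes the exemplary combinations listed above and is not a complete validator.

# Illustrative sketch: enforcing the exemplary minimum search field requirements
# listed above before querying EPG data. The rule set is an assumption that
# simply encodes those combinations of <program>, <service>, and <airing>.
LEGAL_COMBINATIONS = {
    frozenset({"program"}),
    frozenset({"program", "service"}),
    frozenset({"program", "airing"}),
    frozenset({"program", "service", "airing"}),
    frozenset({"service", "airing"}),
}

def has_sufficient_criteria(program_record_elements):
    """program_record_elements: set of element names present, e.g. {"service"}."""
    return frozenset(program_record_elements) in LEGAL_COMBINATIONS

# Example: a document specifying only <service> would be rejected.
assert not has_sufficient_criteria({"service"})
assert has_sufficient_criteria({"program", "airing"})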


Through use of elements and different combinations of the elements, for example, search criteria may be broadly or narrowly specified depending upon the desired implementation. For instance, a fan website may post a recording document for recording any episode of a particular television program no matter what time it is broadcast and no matter which channel broadcasts the television program. Such a recording document may specify the title of the television program (e.g., <program>) without supplying any other additional elements. In another instance, a website provided by a particular content broadcaster may supply a recording document that specifies episodes that are broadcast by that particular content broadcaster, excluding episodes that are broadcast by rival broadcasters. In this instance, the recording document specifies the title of the television program (e.g., <program>) and the broadcaster (e.g., <service>).


Although some examples of search criteria have been described, a variety of other search criteria may also be specified in a recording document. For example, the recording document may specify alternative matching attributes, such as “(<program> and <service>) or (<service> and <airing>)”. Additionally, each of the elements may be specified in a variety of ways. For instance, a target service may be specified by call sign, name, affiliate, and so on. Thus, the recording document may flexibly describe search criteria as contemplated by a creator of the recording document.


The search criteria (i.e., elements) may also be processed so that the recording document is transportable between users having different respective content providers. For example, users may receive content through different channel lineups, the users may be located in different time zones, and so on. Time-based search criteria, for instance, may be specified using any time-zone and then normalized to a local time-zone when parsed. In another instance, the search criteria may be restricted to within a particular offset from coordinated universal time (UTC). Multiple criteria may be specified in this way so one time can be specified that applies only to Pacific Standard Time (PST) and Eastern Standard Time (EST), for example, and another time set for Mountain Standard Time (MST) and Central Standard Time (CST). In a further instance, a search “window” can be specified to allow the episode to be matched within a specific range of time around the time specified.
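
As an illustration of this time handling, the following sketch normalizes a UTC start time to a local time zone and accepts candidate airings that fall within a search window (180 minutes in the sample document above). The fixed-offset conversion is a simplification for illustration.

# Illustrative sketch of time handling: normalize a UTC start time from the
# recording document to a local time zone and accept airings within a
# searchSpan window (180 minutes in the sample document above).
from datetime import datetime, timedelta, timezone

def within_search_window(requested_utc, candidate_utc, search_span_minutes):
    """True if a candidate airing falls within the window around the requested time."""
    return abs(candidate_utc - requested_utc) <= timedelta(minutes=search_span_minutes)

def normalize_to_local(utc_time, utc_offset_hours):
    """Convert a UTC broadcast time to a local time zone given a fixed offset."""
    return utc_time.astimezone(timezone(timedelta(hours=utc_offset_hours)))

requested = datetime(2003, 10, 15, 8, 0, tzinfo=timezone.utc)
candidate = datetime(2003, 10, 15, 10, 30, tzinfo=timezone.utc)
print(within_search_window(requested, candidate, 180))   # True: within 3 hours
print(normalize_to_local(requested, -8).isoformat())     # offset used here for PST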


If the particular content item is available, then the head end adds a content reference 418 to the particular content item in the recording list 140 (block 416). The content reference 418, for instance, may specify a broadcast channel and time to record the referenced content item, map to a memory location of the particular content item in a database 114 of FIG. 1 at the head end 112, and so on.


At block 420, the head end 112 causes the client 104 to record the particular content item referenced in the recording list 140. For example, the head end 112 may execute the parser module 138 to communicate the content reference 418 over the network 106 to the client 104. The navigation module 134, upon receipt of the content reference 418, records the particular content item to the database 130 as specified by the content reference 418. Thus, in this example, the remote device 142 is able to cause the client 104 to record a particular content item without direct interaction with the client 104.



FIG. 5 is an illustration of a system 500 in an exemplary implementation in which a graphical user interface (GUI) 502 is provided by the recording module 228 to dynamically generate the recording document 136 based on user input. In the previously described procedure 400 of FIG. 4, the recording document 404 was preconfigured and obtained to record a particular content item. The recording document may also be dynamically generated based on user input to describe a particular content item for recording.


The remote device 142, for example, includes the recording module 228. The recording module 228, when executed on the remote device 142, provides an output for display on a display device 504 of the GUI 502. The GUI 502 in this example provides an interface for entering keyword search elements which may be utilized to locate a particular content item. For instance, the user may utilize an input device to enter a portion of a title and actors in the particular content item, such as “Godfather” 506, “Pacino” 508, and “DeNiro” 510. The recording module 228 utilizes these elements to form the recording document 136. The recording document 136 is then communicated over the network 106 to the distribution server 118 and parsed by the parser module 138 as previously described to determine if the described particular content item of the recording document 136 is available.


In this example, the elements “Godfather” 506, “Pacino” 508, and “DeNiro” 510 are utilized to determine if the particular content item “Godfather II” is available by finding a content item described in the EPG data 122(m) of FIG. 1 that satisfies each of these elements. In another implementation, a “best match” may be performed so that the content item described in the EPG data 122(m) which satisfies the most elements in the recording document 136 is reported to the user via the GUI 502. For example, Godfather II may not be available based on a query of the EPG data 122(m). However, the movie Godfather I, which satisfies the elements “Godfather” 506 and “Pacino” 508, may be available. Therefore, a result of the query, which indicates the availability of Godfather I, is output via the GUI 502 so that the user can decide whether to record that particular content item.
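
The “best match” behavior described above might be sketched as a simple count of satisfied keyword elements, as shown below; the scoring rule and the dictionary keys are assumptions for illustration.

# Illustrative sketch of the "best match" behavior described above: score each
# EPG entry (dictionaries keyed as in the earlier sketches) by how many keyword
# elements it satisfies and report the highest-scoring one.
def keyword_score(entry, keywords):
    """Count how many keywords appear in the entry's title or actor list."""
    haystack = " ".join([entry["title"]] + entry.get("actors", [])).lower()
    return sum(1 for word in keywords if word.lower() in haystack)

def best_match(epg_entries, keywords):
    scored = [(keyword_score(entry, keywords), entry) for entry in epg_entries]
    scored = [item for item in scored if item[0] > 0]
    return max(scored, key=lambda item: item[0])[1] if scored else None

# For example, if "Godfather II" is not listed but "Godfather I" with Pacino is,
# the latter satisfies two of the three elements and is reported to the user.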



FIG. 6 is an illustration of a system 600 in an exemplary implementation in which the recording module is executed to examine a textual description of content to dynamically generate a recording document. In the system 500 of FIG. 5, the recording document 136 was dynamically generated based on user input. The recording document 136 may also be dynamically generated without user input.


The client 104, for instance, may be configured as a set-top box 126 that is communicatively coupled to the display device 128. The client 104 executes the navigation module 134 to access a textual description of a particular content item over the network 106, which in FIG. 6 is illustrated as a content review 602 that is available from a web site. The user, upon reading the content review 602, may desire to record the particular content item described in the review. In this instance, however, the content review 602 does not include a preconfigured recording document as previously described in relation to FIG. 3. Therefore, the client 104 executes a recording module 604 to dynamically generate the recording document 136 based on the content review 602.


The recording module 604, for example, may be executed to examine the content review 602 to find one or more words which describe the particular content item. In an implementation, the recording module 604 compares words in the content review 602 with a database 606 of descriptive words which may be utilized to describe content, such as names of broadcast channels, titles, actors, and so on. For instance, the recording module 604, when executed, locates the words “hardball” 608, “Chris Matthews” 610, and “MSNBC” 612 (MSNBC is a trademark of MSNBC Cable L.L.C. of New York, N.Y.). The recording module 604 then generates a recording document 614 and communicates it over the network 106 to a parser module 616 that is executable on another client 618.
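
A hypothetical version of this keyword examination is sketched below. The small set of descriptive words stands in for the database 606, and the review text is invented for the example.

# Illustrative sketch of examining a textual description for descriptive words.
# The "database" of descriptive words is a hypothetical stand-in for database
# 606; the review text and matching rule are for illustration only.
DESCRIPTIVE_WORDS = {
    "hardball": ("program", "Hardball"),
    "chris matthews": ("actor", "Chris Matthews"),
    "msnbc": ("service", "MSNBC"),
}

def extract_descriptions(review_text):
    """Return (category, value) pairs found in the review text."""
    lowered = review_text.lower()
    return [entry for phrase, entry in DESCRIPTIVE_WORDS.items() if phrase in lowered]

review = "Tonight on MSNBC, Chris Matthews hosts another lively edition of Hardball."
print(extract_descriptions(review))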


The parser module 616, when executed on the other client 618, compares the recording document 614 with an EPG 620 to determine if the particular content item described in the recording document 614 is available, and if so, causes a navigation module 622 to record the particular content item as content 624(p), where “p” can be any integer from one to “P”, in the database 626.


In another implementation, the recording module 604 and the parser module 616 are executed to directly compare words 608-612 in the content review 602 with the EPG 124 that is stored on the client 104. In other words, the EPG 124 (and more particularly the EPG data that is utilized to configure the EPG 124) provides the database 606. In this implementation, the recording module 604 does not wait until after the recording document 614 is completely generated to perform the comparison, but rather compares words 608-612 with the EPG 124.


It should be noted that in the system 600 of FIG. 6, the client 104 executes the recording module 604 to generate the recording document 136. Another client 618 executes the parser module 616 to determine if the particular content item is available based on another EPG 620 that is stored locally on the other client 618. Thus, the content recordation techniques may also be utilized for interaction between clients 104, 618 without directly involving the head end 112 of FIG. 1. Further discussion of client execution of the parser module may be found in relation to the following figure.



FIG. 7 is a flow diagram depicting a procedure 700 in an exemplary implementation in which a client dynamically generates a recording document that is utilized to determine availability of a particular content item for recording by the client. At block 702, the client displays a textual description of a particular content item. A variety of textual descriptions may be displayed, such as the content review 602 of FIG. 6, an email, a text message communicated from another client, and so on.


At block 704, the client receives an input to activate a recording module. For example, the client may provide an icon for selection by the user, a drop-down menu for activation of the recording module, and so on. At block 706, the recording module, when executed, examines the text to locate descriptions of the particular content item. For instance, the recording module may first examine the text to locate words which are typically used to describe the <program>, <service>, and <airing> elements that were previously described. The recording module may also locate other words which describe the particular content item, such as actor, output duration of the content, genre, start time, stop time, plot, and so on.


At block 708, the recording module generates a recording document that includes the located descriptions of the particular content item. The recording document, for instance, may be configured according to an XML recording schema that is understood by the parser module. The recording module then passes the recording document to the parser module (block 710).
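
One illustrative way to generate such a recording document from the located descriptions is sketched below. The element names follow the simplified structure used in the earlier sketches rather than the full XML recording schema.

# Illustrative sketch of dynamically generating a recording document from the
# located descriptions. The element names are simplified assumptions and do not
# reproduce the complete XML recording schema discussed earlier.
import xml.etree.ElementTree as ET

def build_recording_document(descriptions):
    """descriptions: (category, value) pairs such as ("program", "Hardball")."""
    root = ET.Element("clickToRecord")
    body = ET.SubElement(root, "body")
    record = ET.SubElement(body, "programRecord")
    for category, value in descriptions:
        element = ET.SubElement(record, category)   # e.g. <program>, <service>
        key = ET.SubElement(element, "key")
        key.text = value
    return ET.tostring(root, encoding="unicode")

print(build_recording_document([("program", "Hardball"), ("service", "MSNBC")]))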


At block 712, the parser module queries EPG data to locate the particular content item. In a first scenario, the parser module is executed on the client to query an EPG that is stored locally on the client. In a second scenario, the parser module is executed on the client to query EPG data that is stored at the head end, such as the EPG data 122(m) stored in the database 120 at the head end 112 of FIG. 1.


At decision block 714, a determination is made as to whether the located content item conflicts with another content item in the recording list. For example, the recording list may be configured for implementation by a DVR that is capable of recording a single content item at any one point in time. Therefore, the parser module may be executed to flag conflicts in the recording list so that the referenced content items are recorded as desired. If there is a conflict (block 714), a message is sent to the user (block 716) so that the user may decide which of the conflicting content items is to be recorded, if any.
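
For a single-tuner DVR, conflict detection reduces to checking whether two broadcast intervals overlap; the following sketch illustrates that check under the assumption that recording-list entries carry a start time and duration.

# Illustrative sketch of conflict detection for a single-tuner DVR: two entries
# in the recording list conflict if their broadcast intervals overlap. The
# dictionary keys are assumptions matching the earlier sketches.
from datetime import timedelta

def conflicts_with_list(candidate, recording_list):
    """Return the first existing entry whose time interval overlaps the candidate."""
    c_start = candidate["start_time"]
    c_end = c_start + timedelta(minutes=candidate["duration_minutes"])
    for entry in recording_list:
        e_start = entry["start_time"]
        e_end = e_start + timedelta(minutes=entry["duration_minutes"])
        if c_start < e_end and e_start < c_end:
            return entry      # conflict: prompt the user to choose (block 716)
    return None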


If the located content item does not conflict with another content item in the recording list (block 714), a determination is made as to whether the client is authorized to record the content (block 718). For example, the parser module may be executed to determine from the client state data 220(l) stored at the distribution server 118 of FIG. 2A whether the client is permitted to access the referenced content item. The client state data 220(l), for instance, may be utilized to indicate a variety of conditional access rights, such as parental blocks, digital rights management (DRM), content subscriptions, and so on.


If the client is authorized to record the content (block 718), then the located content item is added to a recording list (block 720). For instance, a reference to the located content item may be added which describes how to record the located content item, such as date, time, and channel of a broadcast of a television program, a memory location, and so on. The recording list may then be utilized to cause the navigation module to record the added content item (block 722).


CONCLUSION

Although the invention has been described in language specific to structural features and/or methodological acts, it is to be understood that the invention defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as exemplary forms of implementing the claimed invention.

Claims
  • 1. A method comprising: receiving a recording document via a network that describes a content item but does not describe whether the content item is available for recording or how the content item is to be recorded, wherein the recording document was generated by a recording module configured to examine a textual description of the content item to locate one or more words that describe the content item and dynamically generate the recording document, wherein the recording document includes the located one or more words from the textual description;querying electronic program guide (EPG) data to determine if the content item described in the recording document is available for recording without user intervention, and if so, how the content item is to be recorded;examining client state data that describes conditional access rights of a client to determine if recordation of the content item for the client is permitted using digital rights management; andif the content item is available and recordation is permitted, adding a reference in a recording list for causing recordation of the content item without user intervention.
  • 2. A method as described in claim 1, wherein the querying is performed by execution of a parser module on a head end.
  • 3. A method as described in claim 1, wherein: the querying is performed by execution of a parser module on a head end; and the reference in the recording list is for causing recordation of the content item by a client.
  • 4. A method as described in claim 1, wherein the querying is performed by execution of a parser module on a client.
  • 5. A method as described in claim 1, wherein: the query is performed at a client; and the reference in the recording list is for causing recordation of the content item at a head end.
  • 6. A method as described in claim 1, further comprising determining if the available content item conflicts with another content item in the recording list.
  • 7. A method as described in claim 1, wherein the recording document describes the content item through at least one of a restrictive criterion, a service, a time, or a genre.
  • 8. A method as described in claim 1, wherein the textual description is selected from the group consisting of: one or more words entered by a user via a user interface; an article; a text message; an email; and another content item.
  • 9. A computer-implemented method comprising: examining, by a recording module, a textual description of a content item to locate one or more words that describe the content item; dynamically generating, by the recording module, a recording document that includes the located one or more words; comparing the one or more words in the generated recording document with electronic program guide (EPG) data to determine if the content item is available for recording; and if the content item is available, adding a reference in a recording list to cause recordation of the content item.
  • 10. A computer-implemented method as described in claim 9, wherein the content item is selected from the group consisting of: a television program; a movie; a picture; a music file; and media.
  • 11. A computer-implemented method as described in claim 9, wherein: the examining is performed by executing a recording module on a client; and the comparing and the adding are performed by executing a parser module on a head end.
  • 12. A computer-implemented method as described in claim 9, wherein the examining, the comparing, and the adding are performed by executing a recording module and a parser module on a client.
  • 13. A computer-implemented method as described in claim 9, further comprising: communicating the recording document to a head end which executes a parser module to perform the comparing and the adding; and communicating the recording list to a client to cause recordation of the content item described in the EPG data through execution of a navigation module on the client.
  • 14. A computer-implemented method as described in claim 9, further comprising determining if the client is authorized to record the content item.
  • 15. A computer-implemented method as described in claim 14, wherein the determining is performed at a head end using client state data that is stored on the head end.
  • 16. One or more computer readable memory devices comprising computer executable instructions that, when executed by a computer, direct the computer to: examine, by a recording module, a textual description of a content item to locate one or more words that describe the content item; dynamically generate, by the recording module, a recording document having the located one or more words; and form a communication for communicating the recording document for comparison with electronic program guide (EPG) data to determine if the content item described by the recording document is available for recording, and if so, how the content item is to be recorded.
  • 17. One or more computer readable memory devices as described in claim 16, wherein the computer executable instructions further direct the computer to: determine if the content item described by the recording document is available for recording; and if the content item is available, add a reference to the available content item to a recording list.
  • 18. One or more computer readable memory devices as described in claim 16, wherein the computer executable instructions further direct the computer to examine client state data that describes conditional access rights of a plurality of clients to determine if recordation of the content item for a particular said client is permitted.
  • 19. One or more computer readable memory devices as described in claim 16, wherein the computer executable instructions further direct the computer to determine if the available content item conflicts with another content item in a recording list.
  • 20. One or more computer readable memory devices as described in claim 16, wherein the textual description is selected from the group consisting of: one or more words entered by a user via a user interface; an article; a text message; an email; and another content item.
  • 21. A head end comprising: a processor; and memory configured to maintain: electronic program guide (EPG) data; a recording document having a plurality of elements which describe a content item, wherein the recording document was generated by a recording module configured to examine a textual description of the content item to locate one or more words that describe the content item and dynamically generate the recording document, wherein the recording document includes the located one or more words; and a parser module that is executable on the processor without user intervention to: determine if the content item described in the recording document is available for recording by comparing at least one said element with the electronic program guide (EPG) data; examine client state data that describes conditional access rights of a plurality of clients to determine if recordation of the content item for a particular said client is permitted based on one or more parental blocks; and if the content item is available and recordation of the content item for the particular said client is permitted, add a reference to a recording list describing how the content item is to be recorded for causing recordation of the content item in the memory for access by the particular said client.
  • 22. A head end as described in claim 21, wherein the textual description is selected from the group consisting of: one or more words entered by a user via a user interface; an article; a text message; an email; and another content item.
  • 23. A head end as described in claim 21, wherein the parser module is further executable on the processor to determine if the available content item conflicts with another content item referenced in the recording list.
  • 24. A client comprising: a processor; and memory configured to maintain: an electronic program guide (EPG) formed from a plurality of EPG data; a recording document having a plurality of elements which describe a content item, wherein the recording document was generated by a recording module configured to examine a textual description of the content item to locate one or more words that describe the content item and dynamically generate the recording document, wherein the recording document includes the located one or more words from the textual description; and a parser module that is executable on the processor to: determine, without user intervention, if the content item described in the recording document is available for recording by comparing at least one said element with the electronic program guide (EPG) data; examine client state data, which describes conditional access rights, to determine if recordation of the content item is permitted based on one or more content subscriptions; and if the content item is available and recordation is permitted, add a reference in a recording list, based on the query and without user intervention, for causing recordation of the content item.
  • 25. A client as described in claim 24, wherein the reference in the recording list is for causing recordation of the content item at a head end.
  • 26. A client as described in claim 24, wherein the reference in the recording list is for causing recordation of the content item through execution of a navigation module on the client.
  • 27. A client as described in claim 24, wherein the textual description is selected from the group consisting of: one or more words entered by a user via a user interface; an article; a text message; an email; and another content item.
  • 28. A client as described in claim 24, wherein the parser module is further executable on the processor to determine if the available content item conflicts with another content item in the recording list.
  • 29. A system comprising: a network; a head end that is communicatively coupled to the network, includes a database having electronic program guide (EPG) data, and has a parser module that is executable thereon without user intervention to: query the EPG data to determine if a content item described in a recording document that is received from over the network is available for recording, wherein the recording document was generated by a recording module configured to examine a textual description of the content item to locate one or more words that describe the content item and dynamically generate the recording document, wherein the recording document includes the located one or more words from the textual description; if the content item is available, examine client state data that describes conditional access rights of a plurality of clients to determine if recordation of the content item for a particular said client is permitted; if recordation of the content item for the particular said client is permitted, add a reference in a recording list for causing recordation of the content item; and form a communication for communicating the recording list via the network; and the particular said client that is communicatively coupled to the network and includes a navigation module that is executable thereon to: receive the communication having the recording list; and record the referenced content item.
Related Publications (1)
Number Date Country
20060117351 A1 Jun 2006 US