Collecting data from different sources

Information

  • Patent Grant
  • 11593444
  • Patent Number
    11,593,444
  • Date Filed
    Thursday, June 24, 2021
  • Date Issued
    Tuesday, February 28, 2023
Abstract
A system for collecting data from different sources is described. In one example embodiment, the system obtains content-related data from a plurality of source computer systems, automatically identifies, based on the content-related data, content items having respective popularity values greater than a predetermined threshold value as popular content items, and automatically generates a list of popular content items based on the popular content items.
Description
TECHNICAL FIELD

This application relates to the fields of media and entertainment and, specifically, to a method and system for aggregating data collected from different sources.


BACKGROUND

The approaches described in this section could be pursued, but are not necessarily approaches that have been previously conceived or pursued. Therefore, unless otherwise indicated herein, the approaches described in this section are not prior art to the claims in this application and are not admitted to be prior art by inclusion in this section.


In the field of media and entertainment, there is a new generation of viewers that has a high expectation of the level of entertainment to be enjoyed from various sources of content, such as, e.g., television programming, the Internet, and locally stored content. These viewers may expect more choice, more flexibility, as well as the ability to interact and participate more with the viewable content.


On the other hand, the sheer volume of content that is available for viewing is exploding dramatically. Just the number of television channels that are now available is almost unmanageable. The amount of content that is available via free video or video on demand service is also increasing. It is now possible to view content over a wider span of time by employing time shifting technologies, such as Personal Video Recording (PVR) (sometimes referred to as DVR or Digital Video Recording). This explosion of content may be described as a paradox of choice, where the excess of choices causes a viewer's inability to choose.





BRIEF DESCRIPTION OF DRAWINGS

Embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:



FIG. 1 illustrates an environment within which an example smart playlist may be implemented, in accordance with an example embodiment;



FIG. 2 is a network diagram illustrating architecture within which a smart playlist may be utilized, in accordance with an example embodiment;



FIG. 3 is an example architecture within which data collected from different sources may be processed utilizing a recommendation engine, in accordance with an example embodiment;



FIG. 4 is a block diagram illustrating a smart playlist system, in accordance with an example embodiment;



FIG. 5 is a flow chart illustrating a method for providing a smart playlist to a viewer's client device, in accordance with an example embodiment; and



FIG. 6 illustrates a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed.





DETAILED DESCRIPTION

The description that follows includes systems, methods, techniques, instruction sequences, and computing machine program products that embody illustrative embodiments of the present invention. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide an understanding of various embodiments of the inventive subject matter. It will be evident, however, to those skilled in the art that embodiments of the inventive subject matter may be practiced without these specific details. In general, well-known instruction instances, protocols, structures, and techniques have not been shown in detail.


A system is described to collect information from a great number of viewers' client devices, determine a list of popular content items based on the collected information, customize the list for a particular viewer, and send that list to the viewer's device. This approach to aiding a viewer in making choices in the universe of viewable content may be termed a smart playlist system. Example embodiments described herein provide systems and methods to generate a smart playlist. For the purposes of this description, the term viewer will be understood to include actual viewers, as well as potential viewers, e.g., persons that may at some point in time view a video program.


In one embodiment, a smart playlist system obtains from viewers' client devices content-related information such as, e.g., which programs are being currently viewed, which programs are being recorded and scheduled to be recorded, which content has been rated and the associated ratings, as well as recommendations pertaining to programs, purchases of various programs, etc. For the purposes of this description the terms content, content item, show, and program will be understood to denote viewable content. Data collected indiscriminately from the entire accessible community of viewers may be accumulated in a repository termed a global bucket. Data from the global bucket may be analyzed to determine programs that appear to be most popular at the time of the analyzing, i.e., appear to be of heightened interest to viewers. A certain number of programs that have been determined as most popular are compiled into a so-called hot list. The hot list may be made available to viewers, e.g., by communicating the list to the viewers' client devices or providing an access link that can be invoked from the viewers' devices.
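The global-bucket aggregation and hot-list compilation described above can be sketched as follows. This is a minimal illustrative sketch: the (program_id, event_type) event format, the additive weighting, and the top-N cutoff are assumptions for demonstration, not specifics from the patent.

```python
from collections import Counter

def build_hot_list(global_bucket, top_n=10):
    """Aggregate viewing events collected indiscriminately from all client
    devices (the 'global bucket') and return the identifiers of the top-N
    most popular programs (the 'hot list')."""
    popularity = Counter()
    for program_id, event_type in global_bucket:
        # Recording is treated as a stronger interest signal than watching
        # (an illustrative weighting choice, not prescribed by the text).
        popularity[program_id] += 2 if event_type == "record" else 1
    return [program_id for program_id, _ in popularity.most_common(top_n)]

events = [("news", "watch"), ("game", "record"), ("game", "watch"),
          ("movie", "watch"), ("game", "record")]
hot_list = build_hot_list(events, top_n=2)  # "game" scores 5, others 1
```

The weighting here stands in for whatever popularity measure the analytics actually apply; the point is that the hot list is a ranked cut of globally aggregated per-program counts.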


Before a hot list is provided to a viewer, it may be personalized for the viewer by determining how relevant the items in the hot list are to that particular viewer and presenting to the viewer only those programs that have been determined to be of high relevance to the viewer. The relevancy of a particular program to a particular viewer may be determined by associating each item in the hot list with a score based on the viewer's profile, on the viewer's content viewing history and patterns, as well as based on information collected from the client devices of a subset of viewers who are members of the particular viewer's social network.


In one example embodiment, in addition to determining a personalized hot list of content items, a smart playlist system may trigger recording of a certain program as soon as the program has been identified as a live program of high relevance to the viewer. For example, a viewer may not be tuned into a channel broadcasting a particular live sports event. If the smart playlist system determines that the live sports event is of high relevance to the viewer, the smart playlist system may trigger the recording of the live broadcast of the sports event on the viewer's client device (e.g., a set top box, a desktop computer, etc.) and also alert the viewer to the fact that she may be interested in the event being currently broadcast on a certain channel. The viewer may then ignore the alert. If the viewer, instead, tunes to the suggested channel, the viewer will not have missed the beginning of the broadcast because the recording of the program has been automatically triggered by an instruction provided to the viewer's client device from the smart playlist system. In one example, the high relevancy of the live broadcast may have been determined based on the fact that all of the viewer's social network contacts have either tuned into the associated channel or have scheduled the recording of the broadcast. In another example, the high relevancy of the live broadcast may have been determined based on the viewer's profile or on the viewer's viewing history. An example smart playlist system may be implemented within the architecture illustrated in FIG. 1.
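The auto-record behavior described above might be sketched as below. The instruction dictionaries, field names, and the 0.8 relevance threshold are illustrative assumptions; the patent does not prescribe a wire format for client-device instructions.

```python
def maybe_trigger_recording(program, relevance_score, threshold=0.8):
    """Return the instructions the smart playlist system would send to a
    viewer's client device for a live, highly relevant program: start
    recording first, then show an alert the viewer is free to ignore."""
    if not program.get("is_live") or relevance_score < threshold:
        return []  # not live, or not relevant enough: do nothing
    return [
        {"action": "start_recording", "channel": program["channel"]},
        {"action": "show_alert",
         "message": "You may be interested in %s on channel %s"
                    % (program["title"], program["channel"])},
    ]

live_game = {"title": "Cup Final", "channel": 7, "is_live": True}
instructions = maybe_trigger_recording(live_game, relevance_score=0.93)
```

Ordering recording before the alert mirrors the scenario in the text: the recording is already running when the viewer sees the alert, so tuning in late loses nothing.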



FIG. 1 illustrates network architecture of an example interactive media environment 100 wherein some embodiments of the present invention may be deployed. The interactive media environment 100 includes a source system 102 that communicates data (e.g., media content data and interactive application data) via a distribution network or system 104 (e.g., the Internet, a mobile communication network, or any other network capable of communicating digital data) and a modulator box 106 to a receiver system 108. In one example embodiment, the interactive media environment 100 optionally includes a storage unit 110 (e.g., a personal computer) that communicates stored data via a network 112 to the modulator box 106 which, in turn, communicates the stored data, media content data, and interactive application data to the receiver system 108. The modulator box 106, storage unit 110, and the receiver system 108 may be co-located in a user's home. Thus, in one embodiment, the modulator box 106 may combine media content data and interactive application data received from the remote source system 102 with locally stored data provided by the storage unit 110 at the user's home.


Turning first to the source system 102, an example headend system 114 operates to communicate the data as a broadcast transmission. To this end, the headend system 114 is shown to include one or more broadcast servers 116 and, optionally, one or more application servers 118. Each of the broadcast servers 116 may operate to receive, encode, packetize, multiplex, modulate, and broadcast data from various sources and of various types. While the example embodiment is described herein as transmitting data from the headend system 114 as a broadcast, it will be appreciated that the relevant data could also be unicast or multicast from the source system 102 via the distribution system 104 and modulator box 106 to the receiver system 108. In various embodiments, data could also be transmitted from the source system 102 via a network connection to the receiver system 108. Further, in other example embodiments the source system 102 may be modified to facilitate communications via the Internet, a mobile phone network, or any other network capable of communicating digital data.


Each application server 118, in one example embodiment, compiles and provides interactive data modules to the broadcast server 116. The interactive data modules may also include data that is utilized by an interactive television application. The application server 118 may also include multiplexing functionality to enable multiplexing of, for example, interactive television applications and associated data with audio and video signals received from various sources. The application server 118 may also have the capability to feed (e.g., stream) multiple interactive television applications to one or more broadcast servers 116 for distribution to the receiver system 108. To this end, each application server 118 may implement a so-called “carousel,” whereby code and data modules are provided to a broadcast server 116 in a cyclic, repetitive manner for inclusion within a transmission from the headend system 114. In other embodiments, code may reside permanently in a set-top box (STB) 120 (e.g., the code may be stored in non-volatile memory of the STB 120), may be pushed or downloaded to the STB 120, or be provided to the STB 120 in any other manner. In one embodiment, the application server 118 provides a smart playlist mechanism to collect information from viewers, determine a list of popular content items, customize the list for a particular user, and send that list to the user's device. The smart playlist mechanism will be discussed by way of example in more detail in connection with FIGS. 2-4.


The headend system 114 is also shown, by way of example, to include one or more backend servers 122, which are coupled to the application servers 118 and to an input/output device 124 (e.g., a modem pool). Specifically, the I/O device 124 is coupled to receive data from the receiver system 108 via a network 126 (e.g., the Internet) and to provide this data to backend servers 122. The backend servers 122 may then provide the data, received from the receiver system 108, to the application servers 118 and the broadcast servers 116. Alternatively, data received from the receiver system 108 may be directly provided to the application servers 118.


Accordingly, the network 126 and the I/O device 124 may operate as a return channel whereby the receiver system 108 is provided with interactivity with the source system 102. Data provided to the headend system 114 via the return channel may include, merely for example, user input to an interactive media application executed at the receiver system 108 or data that is generated by the receiver system 108 and communicated to the source system 102. The return channel may also provide a channel whereby programs, targeted advertisements/commercials, and applications from the source system 102 are provided to the receiver system 108.


Within the source system 102, the headend system 114 is also shown optionally to receive data (e.g., content, code, and application data) from external sources. For example, the headend system 114 may be coupled to one or more content sources 128 and one or more application sources 130 via a network 132 (e.g., the Internet). For example, a content source 128 may be a provider of entertainment content (e.g., movies), a provider of real-time dynamic data (e.g., weather information), and the like. The application source 130 may be a provider of any interactive media application. For example, one or more application sources 130 may provide a TV media player application, electronic program guide and navigation applications, messaging and communication applications, information applications, and so forth. The application sources 130 may be configured to execute on different client devices (e.g., mobile phones, personal computers, STBs, or the like).


Turning now to the example distribution system 104, the distribution system 104 may, in one embodiment, support the broadcast distribution of data from the source system 102 to the receiver system 108. As shown, the distribution network or system 104 may comprise a satellite, cable, terrestrial or Digital Subscribers Line (DSL) network, or any other data communication network or combination of such networks.


The receiver system 108 is shown, in one example embodiment, to include the set-top box (STB) 120 that receives data (e.g., primary and secondary content streams) via the distribution system 104 and modulator box 106 and an input/output device 132 (e.g., a modem) for return channel communications with the headend system 114. The receiver system 108 is also shown to include other optional external systems such as a user input device 134 (e.g., a keyboard, remote control, mouse, etc.) and a display device 136, coupled to the set-top box 120, for the display of content received at the set-top box 120. In one example embodiment, the display device 136 may be a television set.


The modulator box 106, in one example embodiment, receives stored data from the storage unit 110 and a broadcast transmission from the source system 102. The modulator box 106 multiplexes the stored data into the broadcast transmission, thereby generating a second transmission that is communicated to the receiver system 108. It will, however, be appreciated that storage unit functionality is optional. The storage unit 110 may store data and, upon request, communicate the stored data to the modulator box 106 over the network 112 (e.g., Ethernet). The storage unit 110 may communicate the stored data in response to commands that are entered by a user from the set-top box 120 and communicated to the storage unit 110 over a link 138.


It will be appreciated by one skilled in the art that one or more of the modules, applications, or the like of the modulator box 106, the set-top box 120, and the storage unit 110 may be combined or integrated. In general, components, protocols, structures, and techniques not directly related to functions of example embodiments have not been shown or discussed in detail. The description given herein simply provides a variety of example embodiments to aid the reader in an understanding of the systems and methods used herein. While the interactive media environment 100 is illustrated having a receiver system 108 including a set-top box 120, it is noted that the receiver system 108 may comprise a mobile device or a personal computer coupled to a network for receiving media.


A smart playlist may be utilized beneficially in the context of a network environment. FIG. 1 illustrates an environment 100 within which an example smart playlist may be implemented. The environment 100 includes a set top box 110 in communication with an entertainment display device 120 and a control device 130. The set-top box (STB) 110 may be a device that connects to a television and an external source of signal, turning the signal into content which can then be displayed on the television screen. In one example embodiment, the entertainment display device 120 is a television set, and the control device 130 is a remote control device that may be used for switching between television channels, for example. The set-top box 110 may be configured to include a system 112 to provide a smart playlist that may include the features outlined above. The set-top box 110 may be configured to receive content from sources such as, e.g., an Ethernet cable, a satellite dish, a coaxial cable, a telephone line (including digital subscriber line (DSL) connections), Broadband over Power Line, as well as a very high frequency (VHF) or ultra high frequency (UHF) antenna. Content, in this context, could mean any or all of video, audio, Internet web pages, interactive games, or other possibilities. As shown in FIG. 1, the set-top box 110 has access to signal sources 140, including broadcast programming 142 and video on demand programs 144, as well as local content 146 and Internet content 148.



FIG. 2 is a network diagram illustrating architecture 200 within which a smart playlist may be utilized, in accordance with an example embodiment. The architecture 200 includes a client device 210 and a client device 220, each configured to receive content from content sources 250 and to be in communication with a server system 240 via a communications network 230. The client devices 210 and 220 may be set top boxes, desktop computers, mobile devices, etc. The communications network 230 may be a public network (e.g., the Internet, a wireless network, etc.) or a private network (e.g., a local area network (LAN), a wide area network (WAN), an Intranet, etc.). The server system 240 may include a smart playlist system 242 configured to collect information related to utilization of viewable content from viewers' client devices, to aggregate and customize the collected information, and to provide the resulting hot list, personalized for each particular viewer, to the viewers, as was described above.


The client device 210 may be configured to include a smart playlist agent 212 that may be configured to cooperate with the smart playlist system 242 with respect to collecting information regarding viewable content accessed or referenced on the client device 210. In some embodiments, the smart playlist system 242 may be configured to obtain information regarding viewable content accessed or referenced on a client device without the use of a smart playlist agent. As shown in FIG. 2, the client devices 210 and 220 have access to signal sources 250. The signal sources 250 include broadcast programming 252 and video on demand programs 254, as well as local content 256 and Internet content 258.


In one embodiment, the smart playlist system 242 may be configured to collect content-related data from different sources in addition to client devices. For example, the smart playlist system 242 may be configured to collect content-related data from systems providing on-line social communities, systems providing search engines, systems of providers of video-on-demand, systems of providers of content for purchase or rent, etc. The collected data may be weighted according to its source (e.g., in analyzing the collected data to generate a smart playlist, data collected from a certain on-line blog server may be weighted lower than data collected from a video-on-demand provider system). An example architecture, within which data collected from different sources may be processed utilizing a smart playlist system (also referred to as a recommendation engine), may be described with reference to FIG. 3.
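Source-dependent weighting of collected data can be sketched as below. The specific weight values and source labels are illustrative assumptions; the text states only that, for example, blog data may be weighted lower than video-on-demand provider data.

```python
# Illustrative per-source trust weights (assumed values, not from the patent).
SOURCE_WEIGHTS = {
    "vod_provider": 1.0,
    "online_community": 0.75,
    "search_engine": 0.5,
    "blog": 0.25,
}

def weighted_mentions(observations):
    """Combine per-source mention counts into a single weighted score per
    content item. `observations` is assumed to be a list of
    (program_id, source, count) tuples gathered by the collector."""
    scores = {}
    for program_id, source, count in observations:
        # Unknown sources fall back to the lowest trust weight.
        weight = SOURCE_WEIGHTS.get(source, 0.25)
        scores[program_id] = scores.get(program_id, 0.0) + weight * count
    return scores

obs = [("show_a", "blog", 100), ("show_a", "vod_provider", 10),
       ("show_b", "vod_provider", 25)]
scores = weighted_mentions(obs)  # show_a: 0.25*100 + 1.0*10 = 35.0
```

Note how 100 blog mentions of show_a contribute less to its score than 25 video-on-demand observations contribute to show_b, reflecting the source-trust ordering the text describes.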



FIG. 3 illustrates architecture 300 comprising a recommendation engine 310 that, in one embodiment, may reside on the application server 240 of FIG. 2 and may correspond to the smart playlist system 242, or it may be hosted on some other computer system or be distributed across multiple computers or computer systems. The recommendation engine 310 may be configured to use a collector module 312 to obtain content-related information from different sources such as, e.g., content providers' systems 332, on-line community providers' systems 334, search engine providers' systems 336 providing web searching services, video-on-demand providers' systems 338, a system providing an electronic programming guide, etc. The term “system” will be understood to include one or more computer modules provided on a computer or distributed across multiple computers or computer systems. The content providers' systems 332 may include one or more server computers hosting a video-sharing website. The on-line community providers' systems 334 may include one or more server computers hosting a social networking website or a microblogging service. The video-on-demand providers' systems 338 may include one or more server computers hosting rental-by-mail and video streaming services. The collector module 312 provided with the recommendation engine 310 may also be configured to obtain content-related information from client devices 320, such as a set top box 322, a desktop computer 324, a mobile device 326, etc.


Content-related data, which may include viewership information, changes in viewership (e.g., a sudden spike in the number of users trending about a video program or a dramatic increase in the number of viewers watching or recording a video program), ratings of content, references to content items in on-line publications, rental and purchasing information, etc., may be processed by the analytics module 314 to identify those content items that appear to be of heightened interest to viewers. An indication of the heightened interest (also referred to as popularity) may be expressed in terms of a popularity value, which may be calculated for a content item (e.g., a video program) based on, cumulatively, the total number of viewers currently watching or recording the video program being above a predetermined threshold value, the total number of viewers currently watching or recording the video program having increased by a certain percent as compared to an earlier measurement, the number of times the video program has been referenced in microblogs or on-line social network news feeds, etc. The recommendation engine 310 may be configured to generate a list of popular content items, where a popular item is associated with a popularity value above a certain threshold value, customize the lists respectively for viewers associated with viewer devices 340 and 350, and provide the customized lists to the viewer devices 340 and 350. The customization process is described in further detail with reference to FIG. 4 below.
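The cumulative popularity value described above can be sketched from its three named signals. The component weights, the 10,000-viewer threshold, the 50% spike criterion, and the mention cap are all illustrative assumptions; the text specifies the signals but not their combination.

```python
def popularity_value(current_viewers, previous_viewers, mentions,
                     viewer_threshold=10000, spike_percent=50):
    """Compute a cumulative popularity value from the three signals the
    text enumerates: absolute audience size, growth in audience, and
    mentions in microblogs or social-network news feeds."""
    score = 0.0
    if current_viewers > viewer_threshold:
        score += 1.0  # component 1: audience above a predetermined threshold
    if previous_viewers > 0:
        growth = 100.0 * (current_viewers - previous_viewers) / previous_viewers
        if growth >= spike_percent:
            score += 1.0  # component 2: sudden spike vs. earlier measurement
    score += min(mentions / 1000.0, 1.0)  # component 3: capped mention count
    return score

# 12,000 current viewers (above threshold), doubled from 6,000, 500 mentions
value = popularity_value(12000, 6000, 500)  # 1.0 + 1.0 + 0.5 = 2.5
```

Items whose value exceeds the engine's popularity threshold would then qualify for the list of popular content items.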


Prior to customizing the list of popular content items for a specific viewer, the recommendation engine 310 may apply to the list a preliminary filter configured to filter the list based on characteristics of various groups of users. Such preliminary filtering may be based on the geographic location or demographics of a group of users. In one embodiment, the viewer's device 350 may host an analytics module 354 that may be configured to receive content-related data from the recommendation engine 310 and use the received data to generate recommendations, e.g., using profile data stored in a profile repository 352. An example system to generate a smart playlist (e.g., a customized list of popular items) may be described with reference to FIG. 4.
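The group-level preliminary filter might look as follows. The item fields (`regions`, `min_age`) and the group keys are hypothetical representations of the geographic and demographic characteristics the text mentions.

```python
def preliminary_filter(hot_list, group):
    """Filter the global hot list down to items suitable for a group of
    viewers, here by geographic region and a minimum-age demographic
    (illustrative group characteristics)."""
    return [item for item in hot_list
            if group["region"] in item.get("regions", [])
            and group.get("median_age", 0) >= item.get("min_age", 0)]

hot_list = [
    {"title": "Local News", "regions": ["us-west"], "min_age": 0},
    {"title": "Late Show", "regions": ["us-west", "us-east"], "min_age": 18},
]
group = {"region": "us-west", "median_age": 14}
filtered = preliminary_filter(hot_list, group)  # drops the age-18 item
```

Running this coarse filter before per-viewer customization shrinks the candidate list, so the more expensive per-viewer scoring touches fewer items.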



FIG. 4 illustrates an example system 400 to generate a smart playlist based on content utilization information collected from client devices of the entire community of users that can be accessed by the smart playlist system. The system 400, which may correspond to the recommendation engine 310 of FIG. 3 or the smart playlist system 242 of FIG. 2, includes a collector module 410, a hot list generator 420, a customization module 440, and a communications module 450. The collector module 410 may be configured to obtain content utilization data from a plurality of client devices (e.g., the client devices 210 and 220 of FIG. 2). The content utilization data for a viewer from the plurality of viewers may be indicative of the viewer's interest in respective content items. In one embodiment, the collector module 410 obtains content utilization data from a real time listener provided at a client device, e.g., the smart playlist agent 212 of FIG. 2. The hot list generator 420 may be configured to generate a list of popular content items based on the obtained content utilization data.


The collector module 410 obtains content utilization information from all client devices accessible to the smart playlist system 400. This information, collected from the entire universe of viewers that have diverse tastes, viewing habits, and content source preferences and that reflects content utilization of the entire viewing community, is stored, by a storing module 440, in a repository termed a global bucket. The data from the global bucket is analyzed by the hot list generator 420 to determine those content items that are of most interest to the global community of viewers and assemble those content items into a list of popular items, a so-called hot list. In one embodiment, the hot list generator 420 may generate a hot list based on how many viewers are watching or recording a show, the duration of the watching, ratings and recommendations associated with the program, and so on. Because the collector module 410 continuously obtains content utilization data from client devices, the hot list generator 420 may be configured to update the hot list periodically, e.g., once a day or based on any other predetermined time period.


The customization module 440 may be configured to customize the hot list, which is generated based on the information from the global bucket and reflects the preferences of the entire community of viewers, to target more closely the actual and projected preferences of a particular viewer (a target viewer) and generate a so-called customized playlist. The customizing may be based on the viewer's profile that may be stored at the application server 240 of FIG. 2, as well as on the viewing history of the viewer and the viewing history of members of the viewer's social network. In one embodiment, the storing module 440 stores content utilization data for individual viewers in respective repositories termed personal buckets. A viewer's profile stored at the application server 240 may indicate that one or more other viewers are associated with the viewer as “friends” in terms of social networking. The storing module 440 stores content utilization data collected from client devices of the viewer's “friends” or social connections in a repository termed a social bucket. The customization module 440 may utilize data from the viewer's personal bucket and the viewer's social bucket to generate the customized playlist. The customization module 440 may be configured to periodically update the customized playlist, e.g., based on changes in the hot list, changes in the data stored in the personal bucket and the social bucket, as well as changes in the viewer's profile.


In one embodiment, a customized playlist is generated by generating a score for each item from the list of popular content items and including items in the customized playlist based on their respective scores. The scoring may be based on the viewer's preferences identified in the viewer's profile, as well as on data from the viewer's personal bucket and the viewer's social bucket. A content item from a category that is not indicated in the viewer's profile as being of interest to the viewer, and that is not considered as being of interest to the viewer based on the viewer's viewing history, may still be assigned a high score by the customization module 440 based on the information from the viewer's social bucket. For example, the customization module 440 may be configured to weigh heavily an indication that a certain content item is of high interest to a great number of the viewer's social contacts.
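A per-item scoring function combining the three inputs named above might be sketched as follows. The weights (1.0 / 1.0 / 2.0) and the dictionary shapes of the buckets are illustrative assumptions; only the inputs themselves, and the heavy social weighting, come from the text.

```python
def score_item(item_id, item_category, profile_categories,
               personal_bucket, social_bucket):
    """Score one hot-list item for a target viewer using profile
    preferences, the viewer's own history (personal bucket), and the
    activity of the viewer's social contacts (social bucket). The social
    component is deliberately weighted heavily."""
    score = 0.0
    if item_category in profile_categories:
        score += 1.0                                  # declared interest
    score += 1.0 * personal_bucket.get(item_id, 0)    # own viewing history
    score += 2.0 * social_bucket.get(item_id, 0)      # friends' interest
    return score

# A sports event outside the viewer's declared interests still scores
# highly because five social contacts are watching or recording it.
score = score_item("cup_final", "sports", profile_categories={"drama"},
                   personal_bucket={}, social_bucket={"cup_final": 5})
```

Here the item contributes nothing through the profile or personal bucket, yet reaches a score of 10.0 purely through the social component, matching the example in the text.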


The communications module 450 may be configured to communicate the customized playlist to a client device of the target viewer. The communications module 450 may be configured to communicate to the client device an instruction to start recording a live program identified in the customized playlist. The communications module 450 may also be configured to communicate to the client device an instruction to display an alert message regarding a live program identified in the customized playlist. As mentioned above, a client device may be a set top box, a desktop computer, or a mobile device. Content items referenced in the hot list or in the customized playlist may be associated with a variety of content sources, such as, e.g., the Internet, video on demand, and live broadcast. Example operations performed by the system 400 may be described with reference to FIG. 5.



FIG. 5 illustrates an example method 500 of providing a smart playlist. The method 500 may be performed in the context of media and entertainment, e.g., in the context of television entertainment. The method 500 may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as that run on a general purpose computer system or a dedicated machine), or a combination of both. It will be noted that, in an example embodiment, the processing logic may reside in any of the modules shown in FIG. 3 or FIG. 4.


As shown in FIG. 5, the method 500 commences with operation 510, where the collector module 410 of FIG. 4 obtains content utilization data from a plurality of client devices associated with a respective plurality of viewers. At operation 520, the hot list generator 420 of FIG. 4 generates a list of popular content items based on the obtained content utilization data. At operation 530, the customization module 440 of FIG. 4 generates a customized playlist for a target viewer from the plurality of viewers, based on the list of popular content items and a profile of the target viewer. At operation 540, the communications module 450 of FIG. 4 communicates the customized playlist to a client device of the target viewer.
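The four operations of method 500 can be strung together in one compact sketch. The data shapes (a dict of per-viewer watch lists, a "blocked" set in the profile) are illustrative assumptions about how the data might be represented, and the customization policy shown is just one possibility.

```python
def provide_smart_playlist(client_events, target_viewer, profiles, top_n=3):
    """End-to-end sketch of method 500: obtain utilization data (operation
    510), generate a list of popular items (520), customize it for a target
    viewer (530), and return the playlist that would be communicated to the
    viewer's device (540)."""
    # 510: content utilization data from a plurality of client devices
    counts = {}
    for programs in client_events.values():
        for program_id in programs:
            counts[program_id] = counts.get(program_id, 0) + 1

    # 520: list of popular content items, most-watched first
    popular = sorted(counts, key=counts.get, reverse=True)

    # 530: customize -- drop items the profile blocks and items the target
    # viewer has already watched (one possible customization policy)
    blocked = profiles.get(target_viewer, {}).get("blocked", set())
    seen = set(client_events.get(target_viewer, []))
    playlist = [p for p in popular if p not in blocked and p not in seen]

    # 540: communicate the customized playlist (here, simply return it)
    return playlist[:top_n]

events = {"alice": ["game", "news"], "bob": ["movie"],
          "carol": ["game", "news"]}
playlist = provide_smart_playlist(events, "bob", {"bob": {"blocked": {"news"}}})
```

For the viewer "bob", the globally popular "game" survives customization while "news" (blocked by profile) and "movie" (already watched) are filtered out.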



FIG. 6 shows a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed. In alternative embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.


The example computer system 600 includes a processor 602 (e.g., a central processing unit (CPU), a graphics processing unit (GPU) or both), a main memory 604 and a static memory 606, which communicate with each other via a bus 608. The computer system 600 may further include a video display unit 610 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). The computer system 600 also includes an alphanumeric input device 612 (e.g., a real or virtual keyboard), a user interface (UI) navigation device 614 (e.g., a remote control or a mouse), a disk drive unit 616, a signal generation device 618 (e.g., a speaker) and a network interface device 620.


The disk drive unit 616 includes a machine-readable medium 622 on which is stored one or more sets of instructions and data structures (e.g., software 624) embodying or utilized by any one or more of the methodologies or functions described herein. The software 624 may also reside, completely or at least partially, within the main memory 604 and/or within the processor 602 during execution thereof by the computer system 600, with the main memory 604 and the processor 602 also constituting machine-readable media. The main memory 604 comprises storage locations that are addressable by the processor 602 for storing software program code. The memory may comprise a form of random access memory (RAM). Those skilled in the art will appreciate that other memory means, such as FLASH memory media, may also be used for storing the program instructions and data structures shown in the main memory 604.


The software 624 may further be transmitted or received over a network 626 via the network interface device 620 utilizing any one of a number of well-known transfer protocols (e.g., HTTP).


While the machine-readable medium 622 is shown in an example embodiment to be a single medium, the term “machine-readable medium” may be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “machine-readable medium” shall also be taken to include any medium (e.g., FLASH memory media) that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present invention, or that is capable of storing, encoding or carrying data structures utilized by or associated with such a set of instructions. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic media.


The embodiments described herein may be implemented in an operating environment comprising software installed on a computer, in hardware, or in a combination of software and hardware.


Thus, a method and system to collect content-related data from multiple source computer systems have been described. In the description above, for purposes of explanation, numerous specific details have been set forth in order to provide a thorough understanding of one example embodiment. It will be evident, however, to one skilled in the art that the present invention may be practiced without these specific details. It is to be noted that the delivery mechanism for the content for viewing may be via a satellite, cable, terrestrial broadcast, Internet, local storage, a local network, mobile telephony, or any other content distribution network. Accordingly, the viewing device need not be a television set but may be any display unit of any device (including portable devices). It will be noted that any references to television content will be understood to include any content available for viewing on an entertainment display device, such as a television screen. Such content may include television programming, as well as locally stored content, such as stored video files or digital images, as well as content accessible via the Internet. It will be noted that the term viewer may be understood broadly as any user that may potentially view viewable content at any point in time.

Claims
  • 1. A computer-implemented method comprising: receiving, at a server computer system, first content-related data indicative of a first number of viewers currently watching or recording a content item; receiving, at the server computer system, second content-related data indicative of a second number of viewers currently watching or recording the content item; calculating, by the server computer system, a value for the content item, the calculating of the value for the content item comprising: accessing a change in viewership of the content item based on the first number of viewers, a first weight value applied to the first number of viewers, the second number of viewers, a second weight value applied to the second number of viewers, and a reference viewership information of the content item; and comparing the change in viewership of the content item to a first predetermined threshold; and automatically generating, by the server computer system, based on the calculated value for the content item satisfying a second predetermined threshold, third content-related data comprising information identifying the content item.
  • 2. The method of claim 1, wherein the first content-related data comprises information related to a specific viewer.
  • 3. The method of claim 1, wherein the second content-related data comprises information related to a plurality of viewers.
  • 4. The method of claim 2, wherein the information related to the specific viewer is chosen from the group consisting of: device type, network type, user demographics, user history, user profile, or user preference.
  • 5. The method of claim 3, wherein the information related to the plurality of viewers is chosen from the group consisting of: device type, network type, users demographics, users history, users profile, or users preferences.
  • 6. The method of claim 1, further comprising: generating, by the server computer system, playlist data for a viewer, the playlist data identifying the content item based on a content preference identified in a profile of the viewer; and based on generating the playlist data, automatically sending, by the server computer system, to a device of the viewer, an instruction that causes the device to perform an operation for the content item identified in the playlist data.
  • 7. The method of claim 6, wherein the content preference comprises a preference of the viewer for a particular content category.
  • 8. The method of claim 6, wherein the instruction that causes the device to perform the operation for the content item causes the device to generate an alert for the content item.
  • 9. The method of claim 1, wherein the content item is a broadcast program.
  • 10. The method of claim 1, wherein the content item is a video on demand program.
  • 11. The method of claim 1, wherein the content item comprises Internet content.
  • 12. The method of claim 1, wherein the receiving of the first content-related data comprises receiving the first content-related data from a set top box.
  • 13. The method of claim 1, wherein the receiving of the first content-related data comprises receiving the first content-related data from a desktop computer.
  • 14. The method of claim 1, wherein the receiving of the first content-related data comprises receiving the first content-related data from a mobile device.
  • 15. A system comprising: a memory that stores instructions; and one or more processors configured by the instructions to perform operations comprising: receiving first content-related data indicative of a first number of viewers currently watching or recording a content item; receiving second content-related data indicative of a second number of viewers currently watching or recording the content item; calculating a value for the content item, the calculating of the value for the content item comprising: accessing a change in viewership of the content item based on the first number of viewers, a first weight value applied to the first number of viewers, the second number of viewers, a second weight value applied to the second number of viewers, and a reference viewership information of the content item; and comparing the change in viewership of the content item to a first predetermined threshold; and automatically generating, based on the calculated value for the content item satisfying a second predetermined threshold, third content-related data comprising information identifying the content item.
  • 16. The system of claim 15, wherein the first content-related data comprises data describing a number of references to the content item in online media sources.
  • 17. The system of claim 15, wherein the first content-related data comprises information related to a specific viewer.
  • 18. A non-transitory computer-readable storage medium storing instructions that, when executed by one or more processors, cause the one or more processors to perform operations comprising: receiving first content-related data indicative of a first number of viewers currently watching or recording a content item; receiving second content-related data indicative of a second number of viewers currently watching or recording the content item; calculating a value for the content item, the calculating of the value for the content item comprising: accessing a change in viewership of the content item based on the first number of viewers, a first weight value applied to the first number of viewers, the second number of viewers, a second weight value applied to the second number of viewers, and a reference viewership information of the content item; and comparing the change in viewership of the content item to a first predetermined threshold; and automatically generating, based on the calculated value for the content item satisfying a second predetermined threshold, third content-related data comprising information identifying the content item.
  • 19. The computer-readable storage medium of claim 18, wherein the first content-related data comprises data describing ratings of the content item.
  • 20. The computer-readable storage medium of claim 18, wherein the operations further comprise: generating playlist data for a viewer, the playlist data identifying the content item based on a content preference identified in a profile of the viewer; and based on generating the playlist data, automatically sending, to a device of the viewer, an instruction that causes the device to perform an operation for the content item identified in the playlist data.
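The value calculation recited in claim 1 combines weighted viewer counts from two data sources with a reference viewership figure and applies two thresholds. A minimal sketch follows; the specific weights, the reference figure, and the way the "value" is derived from the change are illustrative assumptions, since the claim does not fix them.

```python
def viewership_change(n1, w1, n2, w2, reference):
    """Weighted current viewership from two sources, relative to a reference figure."""
    current = w1 * n1 + w2 * n2
    return current - reference

def is_popular(n1, w1, n2, w2, reference, change_threshold, value_threshold):
    """Apply the two thresholds of claim 1 (the value derivation here is an assumption)."""
    change = viewership_change(n1, w1, n2, w2, reference)
    # First predetermined threshold: compare the change in viewership.
    value = change if change > change_threshold else 0.0
    # Second predetermined threshold: generate the third content-related data
    # (here, simply report popularity) only if the value satisfies it.
    return value > value_threshold

# Illustrative numbers: 1200 set-top-box viewers (weight 1.0) and
# 300 mobile viewers (weight 0.5), against a reference viewership of 1000.
popular = is_popular(1200, 1.0, 300, 0.5, 1000,
                     change_threshold=100, value_threshold=200)
```

With these numbers the weighted current viewership is 1350, the change is 350, and both thresholds are satisfied.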
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a Continuation of U.S. patent application Ser. No. 16/237,022, filed on Dec. 31, 2018, which is a Continuation of U.S. patent application Ser. No. 12/878,001, filed on Sep. 8, 2010, which is a Continuation-in-Part of U.S. patent application Ser. No. 12/877,034, filed on Sep. 7, 2010, which applications are incorporated herein by reference in their entireties.

“U.S. Appl. No. 14/242,459, Examiners Answer to Appeal Brief mailed Jul. 12, 2016”, 18 pgs.
“U.S. Appl. No. 14/242,459, Final Office Action dated Jun. 19, 2015”, 21 pgs.
“U.S. Appl. No. 14/242,459, Non Final Office Action dated Jan. 5, 2015”, 11 pgs.
“U.S. Appl. No. 14/242,459, Pre-Brief Conference request filed Sep. 15, 2015”, 5 pgs.
“U.S. Appl. No. 14/242,459, Response filed Feb. 19, 2015 to Non Final Office Action dated Jan. 5, 2015”, 9 pgs.
“U.S. Appl. No. 14/242,459, Response filed Jul. 21, 2015 to Final Office Action dated Jun. 19, 2015”, 11 pgs.
“U.S. Appl. No. 14/260,677, Advisory Action dated Dec. 9, 2016”, 3 pgs.
“U.S. Appl. No. 14/260,677, Advisory Action dated Dec. 28, 2016”, 5 pgs.
“U.S. Appl. No. 14/260,677, Corrected Notice of Allowance dated Nov. 3, 2017”, 2 pgs.
“U.S. Appl. No. 14/260,677, Examiner Interview Summary dated Aug. 28, 2017”, 3 pgs.
“U.S. Appl. No. 14/260,677, Final Office Action dated Sep. 23, 2016”, 20 pgs.
“U.S. Appl. No. 14/260,677, Non Final Office Action dated Jun. 6, 2017”, 18 pgs.
“U.S. Appl. No. 14/260,677, Non Final Office Action dated Jun. 7, 2016”, 15 pgs.
“U.S. Appl. No. 14/260,677, Notice of Allowability dated Sep. 25, 2017”, 2 pgs.
“U.S. Appl. No. 14/260,677, Notice of Allowance dated Sep. 12, 2017”, 7 pgs.
“U.S. Appl. No. 14/260,677, Response filed Aug. 29, 2017 to Non Final Office Action dated Jun. 6, 2017”, 11 pgs.
“U.S. Appl. No. 14/260,677, Response filed Sep. 6, 2016 to Non Final Office Action dated Jun. 7, 2016”, 9 pgs.
“U.S. Appl. No. 14/260,677, Response filed Dec. 1, 2016 to Final Office Action dated Sep. 23, 2016”, 11 pgs.
“U.S. Appl. No. 14/336,758, Advisory Action dated Mar. 9, 2016”, 3 pgs.
“U.S. Appl. No. 14/336,758, Appeal Brief filed May 20, 2016”, 18 pgs.
“U.S. Appl. No. 14/336,758, Appeal Decision mailed May 17, 2017”, 6 pgs.
“U.S. Appl. No. 14/336,758, Examiner Interview Summary dated Dec. 20, 2017”, 3 pgs.
“U.S. Appl. No. 14/336,758, Examiners Answer to Appeal Brief mailed Jul. 12, 2016”, 13 pgs.
“U.S. Appl. No. 14/336,758, Final Office Action dated Nov. 25, 2015”, 11 pgs.
“U.S. Appl. No. 14/336,758, Non Final Office Action dated Jan. 29, 2015”, 10 pgs.
“U.S. Appl. No. 14/336,758, Non Final Office Action dated Jul. 23, 2015”, 10 pgs.
“U.S. Appl. No. 14/336,758, Notice of Allowance dated May 4, 2018”, 8 pgs.
“U.S. Appl. No. 14/336,758, Notice of Allowance dated Aug. 1, 2017”, 8 pgs.
“U.S. Appl. No. 14/336,758, Notice of Allowance dated Sep. 14, 2017”, 8 pgs.
“U.S. Appl. No. 14/336,758, PTO Response to Rule 312 Communication dated Dec. 21, 2017”, 2 pgs.
“U.S. Appl. No. 14/336,758, Reply Brief filed Aug. 31, 2016”, 4 pgs.
“U.S. Appl. No. 14/336,758, Response filed Feb. 25, 2016 to Final Office Action dated Nov. 25, 2015”, 5 pgs.
“U.S. Appl. No. 14/336,758, Response filed Apr. 28, 2015 to Non Final Office Action dated Jan. 29, 2015”, 10 pgs.
“U.S. Appl. No. 14/336,758, Response filed Sep. 22, 2015 to Non Final Office Action dated Jul. 23, 2015”, 14 pgs.
“U.S. Appl. No. 14/588,871, Final Office Action dated Mar. 7, 2016”, 12 pgs.
“U.S. Appl. No. 14/588,871, Final Office Action dated Mar. 31, 2017”, 17 pgs.
“U.S. Appl. No. 14/588,871, Non Final Office Action dated Jun. 29, 2015”, 13 pgs.
“U.S. Appl. No. 14/588,871, Non Final Office Action dated Sep. 9, 2016”, 12 pgs.
“U.S. Appl. No. 14/588,871, Non Final Office Action dated Sep. 15, 2016”, 16 pgs.
“U.S. Appl. No. 14/588,871, Notice of Allowance dated Jun. 26, 2017”, 5 pgs.
“U.S. Appl. No. 14/588,871, Notice of Allowance dated Sep. 12, 2017”, 2 pgs.
“U.S. Appl. No. 14/588,871, Preliminary Amendment filed Jan. 27, 2015”, 8 pgs.
“U.S. Appl. No. 14/588,871, Response filed Jan. 17, 2017 to Non Final Office Action dated Sep. 15, 2016”, 20 pgs.
“U.S. Appl. No. 14/588,871, Response filed May 31, 2017 to Final Office Action dated Mar. 31, 2017”, 19 pgs.
“U.S. Appl. No. 14/588,871, Response filed Jul. 7, 2016 to Final Office Action dated Mar. 7, 2016”, 12 pgs.
“U.S. Appl. No. 14/588,871, Response filed Oct. 29, 2015 to Non Final Office Action dated Jun. 29, 2015”, 11 pgs.
“U.S. Appl. No. 14/588,871, Supplemental Notice of Allowability dated Jul. 17, 2017”, 2 pgs.
“U.S. Appl. No. 15/637,561, Advisory Action dated Mar. 6, 2019”, 3 pgs.
“U.S. Appl. No. 15/637,561, Final Office Action dated Nov. 26, 2018”, 20 pgs.
“U.S. Appl. No. 15/637,561, Non Final Office Action dated Apr. 23, 2018”, 22 pgs.
“U.S. Appl. No. 15/637,561, Notice of Allowance dated Apr. 18, 2019”, 12 pgs.
“U.S. Appl. No. 15/637,561, Preliminary Amendment filed Oct. 5, 2017”, 7 pgs.
“U.S. Appl. No. 15/637,561, Response filed Jan. 22, 2019 to Final Office Action dated Nov. 26, 2018”, 11 pgs.
“U.S. Appl. No. 15/637,561, Response filed Jul. 23, 2018 to Non Final Office Action dated Apr. 23, 2018”, 11 pgs.
“U.S. Appl. No. 15/637,561, Response filed Mar. 26, 2019 to Final Office Action dated Nov. 26, 2018”, 8 pgs.
“U.S. Appl. No. 15/726,102, Non Final Office Action dated Apr. 18, 2018”, 32 pgs.
“U.S. Appl. No. 15/726,102, Preliminary Amendment filed Oct. 6, 2017”, 7 pgs.
“U.S. Appl. No. 15/841,904, Notice of Allowance dated Jul. 2, 2018”, 8 pgs.
“U.S. Appl. No. 15/841,904, Preliminary Amendment filed May 8, 2018”, 8 pgs.
“U.S. Appl. No. 15/882,472, Preliminary Amendment filed Apr. 23, 2018”, 7 pgs.
“U.S. Appl. No. 16/237,022, Corrected Notice of Allowability dated Jan. 25, 2021”, 7 pgs.
“U.S. Appl. No. 16/237,022, Corrected Notice of Allowability dated Feb. 18, 2021”, 7 pgs.
“U.S. Appl. No. 16/237,022, Corrected Notice of Allowability dated Apr. 14, 2021”, 2 pgs.
“U.S. Appl. No. 16/237,022, Examiner Interview Summary dated Apr. 17, 2020”, 3 pgs.
“U.S. Appl. No. 16/237,022, Final Office Action dated Sep. 3, 2020”, 27 pgs.
“U.S. Appl. No. 16/237,022, Non Final Office Action dated Jan. 21, 2020”, 34 pgs.
“U.S. Appl. No. 16/237,022, Notice of Allowance dated Apr. 5, 2021”, 9 pgs.
“U.S. Appl. No. 16/237,022, Notice of Allowance dated Dec. 24, 2020”, 10 pgs.
“U.S. Appl. No. 16/237,022, Preliminary Amendment filed May 14, 2019”, 8 pgs.
“U.S. Appl. No. 16/237,022, Response filed Apr. 21, 2020 to Non Final Office Action dated Jan. 21, 2020”, 12 pgs.
“U.S. Appl. No. 16/237,022, Response filed Dec. 3, 2020 to Final Office Action dated Sep. 3, 2020”, 10 pgs.
“U.S. Appl. No. 16/511,648, Final Office Action dated Feb. 25, 2021”, 24 pgs.
“U.S. Appl. No. 16/511,648, Non Final Office Action dated Sep. 29, 2020”, 17 pgs.
“U.S. Appl. No. 16/511,648, Preliminary Amendment filed Oct. 29, 2019”, 6 pgs.
“U.S. Appl. No. 16/511,648, Response filed Dec. 22, 2020 to Non Final Office Action dated Sep. 29, 2020”, 10 pgs.
“Australian Application Serial No. 2011101152, Examination Report No. 1 dated May 6, 2013”, 4 pgs.
“Australian Application Serial No. 2011101152, Response filed Sep. 17, 2013 to Examination Report No. 1 dated May 6, 2013”, 13 pgs.
“Australian Application Serial No. 2011299221, Response filed Jan. 15, 2015”, 19 pgs.
“Australian Application Serial No. 2011299234, Amendment filed Apr. 4, 2013”, 11 pgs.
“Australian Application Serial No. 2011299234, Amendment filed Aug. 25, 2015”, 26 pgs.
“Australian Application Serial No. 2011299234, First Examiner Report dated Aug. 25, 2014”, 3 pgs.
“Australian Application Serial No. 2011299234, Response filed Oct. 26, 2015 to Subsequent Examiners Report dated Sep. 4, 2015”, 3 pgs.
“Australian Application Serial No. 2011299234, Subsequent Examiners Report dated Sep. 4, 2015”, 4 pgs.
“Australian Application Serial No. 2016201377, First Examiner Report dated Feb. 1, 2017”, 3 pgs.
“Australian Application Serial No. 2016201377, Response filed May 25, 2017 to First Examiner Report dated Feb. 1, 2017”, 55 pgs.
“Australian Application Serial No. 2016201377, Response filed Aug. 9, 2017 to Subsequent Examiners Report dated Jun. 6, 2017”, 2 pgs.
“Australian Application Serial No. 2016201377, Subsequent Examiners Report dated Jun. 6, 2017”, 3 pgs.
“Australian Application Serial No. 2016201377, Subsequent Examiners Report dated Aug. 23, 2017”, 3 pgs.
“Australian Application Serial No. 2011299221, First Examiner Report dated May 2, 2014”, 3 pgs.
“Brazil Application Serial No. BR1120130055251, Final Office Action dated Feb. 19, 2021”, with English translation, 7 pages.
“Brazil Application Serial No. BR1120130055251, Office Action dated Sep. 24, 2019”, w/English translation, 7 pgs.
“Brazil Application Serial No. BR1120130055251, Office Action dated Nov. 13, 2020”, with English translation, 7 pages.
“Brazil Application Serial No. BR1120130055251, Response filed Feb. 11, 2021 to Office Action dated Nov. 13, 2020”, with English claims, 40 pages.
“Brazil Application Serial No. BR1120130055251, Response filed Dec. 23, 2019 to Office Action dated Sep. 24, 2019”, w/English Claims, 32 pgs.
“Brazil Application Serial No. BR1120130056967, Office Action dated Jul. 13, 2020”, with English translation, 8 pages.
“Brazil Application Serial No. BR1120130056967, Office Action dated Nov. 5, 2019”, W/English Translation, 7 pgs.
“Brazil Application Serial No. BR1120130056967, Office Action dated Nov. 26, 2020”, with English translation, 7 pages.
“Brazil Application Serial No. BR1120130056967, Response filed Jan. 28, 2021 to Office Action dated Nov. 26, 2020”, with English claims, 18 pages.
“Brazil Application Serial No. BR1120130056967, Response filed Feb. 10, 2020 to Office Action dated Nov. 5, 2019”, with English claims, 45 pages.
“Brazil Application Serial No. BR1120130055251, Voluntary Amendment filed Sep. 8, 2014”, with English claims, 9 pgs.
“Canadian Application Serial No. 2,810,511, Examiner's Rule 30(2) Requisition dated Sep. 30, 2019”, 5 pgs.
“Canadian Application Serial No. 2,810,511, Office Action dated Jun. 12, 2018”, 4 pgs.
“Canadian Application Serial No. 2,810,511, Office Action dated Jun. 21, 2017”, 4 pgs.
“Canadian Application Serial No. 2,810,511, Office Action dated Dec. 10, 2018”, 5 pgs.
“Canadian Application Serial No. 2,810,511, Response filed Apr. 4, 2019 to Office Action dated Dec. 10, 2018”, 36 pgs.
“Canadian Application Serial No. 2,810,511, Response filed Aug. 24, 2018 to Office Action dated Jun. 12, 2018”, 26 pgs.
“Canadian Application Serial No. 2,810,511, Response filed Dec. 15, 2017 to Office Action dated Jun. 21, 2017”, 37 pgs.
“Canadian Application Serial No. 2,810,521, Examiner's Rule 30(2) Requisition dated Jan. 4, 2019”, 4 pgs.
“Canadian Application Serial No. 2,810,521, Office Action dated Mar. 1, 2018”, 5 pgs.
“Canadian Application Serial No. 2,810,521, Office Action dated Jun. 8, 2017”, 3 pgs.
“Canadian Application Serial No. 2,810,521, Response filed Apr. 4, 2019 to Examiner's Rule 30(2) Requisition dated Jan. 4, 2019”, 9 pgs.
“Canadian Application Serial No. 2,810,521, Response filed Jul. 30, 2018 to Office Action dated Mar. 1, 2018”, 17 pgs.
“Canadian Application Serial No. 2,810,521, Response filed Sep. 7, 2017 to Office Action dated Jun. 8, 2017”, 15 pgs.
“European Application Serial No. 01968190.7, European Amendment filed Aug. 18, 2011”, 1 pg.
“European Application Serial No. 01968190.7, European Amendment filed Sep. 20, 2011”, 3 pgs.
“European Application Serial No. 01968190.7, Office Action dated May 17, 2010”, 9 pgs.
“European Application Serial No. 01968190.7, Office Action dated Nov. 6, 2006”, 4 pgs.
“European Application Serial No. 01968190.7, Response filed May 16, 2007 to Office Action dated Nov. 6, 2006”, 26 pgs.
“European Application Serial No. 01968190.7, Response filed Sep. 24, 2010 to Office Action dated May 17, 2010”, 5 pgs.
“European Application Serial No. 11824078.7, Communication Pursuant to Article 94(3) EPC dated May 7, 2018”, 3 pgs.
“European Application Serial No. 11824078.7, Communication Pursuant to Article 94(3) EPC dated Aug. 16, 2018”, 7 pgs.
“European Application Serial No. 11824078.7, Extended European Search Report dated Aug. 19, 2016”, 10 pgs.
“European Application Serial No. 11824078.7, Response filed Feb. 26, 2019 to Communication Pursuant to Article 94(3) EPC dated Aug. 16, 2018”, 13 pgs.
“European Application Serial No. 11824078.7, Response filed Mar. 3, 2017 to Extended European Search Report dated Aug. 19, 2016”, 4 pgs.
“European Application Serial No. 11824078.7, Response filed May 29, 2018 to Communication Pursuant to Article 94(3) EPC dated May 7, 2018”, 1 pg.
“European Application Serial No. 11824078.7, Response filed Nov. 7, 2019 to Summons to Attend Oral Proceedings mailed Jul. 23, 2019”, 29 pgs.
“European Application Serial No. 11824078.7, Summons to Attend Oral Proceedings mailed Jul. 23, 2019”, 9 pgs.
“European Application Serial No. 11824132.2, Communication pursuant to Article 94(3) EPC dated Mar. 1, 2017”, 7 pgs.
“European Application Serial No. 11824132.2, Extended European Search Report dated Feb. 25, 2014”, 6 pgs.
“European Application Serial No. 11824132.2, Response filed Mar. 5, 2019 to Summons to Attend Oral Proceedings mailed Nov. 5, 2018”, 29 pgs.
“European Application Serial No. 11824132.2, Response filed Jun. 27, 2017 to Communication pursuant to Article 94(3) EPC dated Mar. 1, 2017”, 4 pgs.
“European Application Serial No. 11824132.2, Response filed Aug. 29, 2014”, 12 pgs.
“European Application Serial No. 11824132.2, Summons to Attend Oral Proceedings mailed Nov. 5, 2018”, 5 pgs.
“European Application Serial No. 14843569.6, Extended European Search Report dated Mar. 6, 2017”, 10 pgs.
“European Application Serial No. 14843569.6, Response filed Sep. 20, 2017 to Extended European Search Report dated Mar. 6, 2017”, 12 pgs.
“European Application Serial No. 14843569.6, Response filed Oct. 26, 2016 to Communication pursuant to Rules 161(2) and 162 EPC dated Apr. 22, 2016”, 9 pgs.
“European Application Serial No. 14844441.7, Extended European Search Report dated Mar. 2, 2017”, 10 pgs.
“European Application Serial No. 14844441.7, Response filed Sep. 20, 2017 to Extended European Search Report dated Mar. 2, 2017”, 45 pgs.
“European Application Serial No. 14844441.7, Response filed Oct. 26, 2016 to Communication pursuant to Rules 161(2) and 162 EPC dated Apr. 19, 2016”, 7 pgs.
“HTML 4.0 Specification”, W3C Recommendation, XP002191626, (Apr. 24, 1998), 12 pgs.
“HTML Support—Multimedia and Images”, [Online] Retrieved from the Internet: <URL: http://www.citycat.ru/doc/HTML/IExplorer.30/mmedia.htm#Marquee>, (1996), 4 pgs.
“International Application Serial No. PCT/US01/26801, International Preliminary Examination Report dated Nov. 25, 2003”, 12 pgs.
“International Application Serial No. PCT/US01/26801, International Search Report dated Mar. 14, 2002”, 3 pgs.
“International Application Serial No. PCT/US2011/50712, International Preliminary Report on Patentability dated Mar. 21, 2013”, 8 pgs.
“International Application Serial No. PCT/US2011/50712, International Search Report dated Jan. 5, 2012”, 2 pgs.
“International Application Serial No. PCT/US2011/50712, Written Opinion dated Jan. 5, 2012”, 6 pgs.
“International Application Serial No. PCT/US2011/50839, International Preliminary Report on Patentability dated Mar. 21, 2013”, 6 pgs.
“International Application Serial No. PCT/US2011/50839, International Search Report dated Dec. 30, 2011”, 2 pgs.
“International Application Serial No. PCT/US2011/50839, Written Opinion dated Dec. 30, 2011”, 4 pgs.
“International Application Serial No. PCT/US2014/054701, International Preliminary Report on Patentability dated Mar. 24, 2016”, 8 pgs.
“International Application Serial No. PCT/US2014/054701, International Search Report dated Jan. 12, 2015”, 2 pgs.
“International Application Serial No. PCT/US2014/054701, Written Opinion dated Jan. 12, 2015”, 6 pgs.
“International Application Serial No. PCT/US2014/054702, International Preliminary Report on Patentability dated Mar. 24, 2016”, 6 pgs.
“International Application Serial No. PCT/US2014/054702, International Search Report dated Nov. 19, 2014”, 2 pgs.
“International Application Serial No. PCT/US2014/054702, Written Opinion dated Nov. 19, 2014”, 4 pgs.
“Mexican Application Serial No. MX/a/2016/003114, Office Action dated Nov. 16, 2017”, with English translation, 4 pages.
“Mexican Application Serial No. MX/a/2016/003114, Response filed Feb. 6, 2018 to Office Action dated Nov. 16, 2017”, with English translation, 18 pages.
“Mexican Application Serial No. MX/a/2016/003115, Office Action dated Nov. 7, 2017”, with English translation, 6 pages.
“Mexican Application Serial No. MX/a/2016/003115, Response filed Feb. 19, 2018 to Office Action dated Nov. 7, 2017”, with English translation, 14 pages.
“MPEG-4 Authoring Tools Let Pros, Consumers Create Multimedia for Web Pages, TV, HDTV”, Sarnoff Document, XP002155140, (Dec. 10, 1998), 2 pgs.
Alvaer, Jose, “Realnetworks' Realaudio and Realvideo”, Webdeveloper.com, guide to streaming multimedia, XP002150113, ISBN:0-471-24822-3, (1998), 20 pgs.
Chai, Crx K., “U.S. Appl. No. 12/877,875 / Smart Media Selection Based on Viewer User Preference”, 11 pgs.
Chambers, C. S., “Designing a set-top box operating system”, International Conference on Consumer Electronics, IEEE US vol. Conf. 14, XP000547858, ISBN 0-7803-2141-3, (Jun. 7, 1995), 368-369.
Clearplay, “Being a Very Cool Responsible Parent Just Got a Whole Lot Easier”, [Online]. Retrieved from the Internet: <URL: http://www.clearplay.com/>, (Accessed Jan. 13, 2003), 2 pages.
Clearplay, “Enjoy the Show!”, Press Release, Dec. 10, 2001, “ClearPlay Launches Groundbreaking Movie Filtering”, [Online]. Retrieved from the Internet: <URL: http://www.clearplay.com/10Dec2001.asp>, (Dec. 10, 2001), 2 pages.
Cobb, Jerry, “Taking Violence out of DVD Movies—System from ClearPlay Removes ‘R’ Content from DVDs”, CNBC, [Online]. Retrieved from the Internet: <URL: http://www.msnbc.com/news/857154.asp?cpl=1,>, (Jan. 9, 2003), 3 pgs.
EBU Project Group B/CA, “Functional model of a conditional access system”, EBU Technical Review, 266, Grand-Saconnex, CH, (Winter 1995), 64-77.
Fernandez, Panadero MC, et al., “Mass-customizing electronic journals”, Online!, XP002177409, (May 10, 1999), 11 pgs.
Giles, Aaron, “Transparency—A Quick and Dirty Utility for Creating Transparent GIF Images”, [Online]. Retrieved from the Internet: <URL: http://www.mit.edu:8001/people/nocturne/etc/Transparency_notes.html>, (Aug. 19, 1994), 2 pgs.
Levin, “Software Design of a Personal Television Service”, ICCE 2000, (2000), pp. 26-27.
Shim, S. Y., et al., “Template Based Synchronized Multimedia Integration Language Authoring Tool”, Proceedings of the SPIE, SPIE, Bellingham, VA, vol. 3964, (Jan. 2000), 134-142.
Vuorimaa, Petri, et al., “XML Based Text TV”, IEEE—WISE '00 Proceedings of the First International Conference on Web, (2000), 109-113.
Watson, Christopher, “Scripting the Web (times 2)”, [Online]. Retrieved from the Internet: <URL: http://groups.google.com/groups?q=javascript+hypermedia&hl=en&selm=cwatson-3008961022470001%40204.212.150.108&rnum=7>, (Aug. 30, 1996), 2 pages.
“U.S. Appl. No. 16/511,648, Final Office Action dated Nov. 8, 2021”, 23 pages.
“U.S. Appl. No. 15/882,472, Non Final Office Action dated Jul. 12, 2019”, 16 pgs.
“U.S. Appl. No. 16/148,843, Notice of Allowance dated Jul. 18, 2019”, 8 pgs.
“U.S. Appl. No. 16/148,843, Supplemental Amendment filed Oct. 9, 2019”, 8 pgs.
“U.S. Appl. No. 16/148,843, Notice of Allowance dated Oct. 18, 2019”, 8 pgs.
“U.S. Appl. No. 15/882,472, Examiner Interview Summary dated Nov. 5, 2019”, 3 pgs.
“U.S. Appl. No. 15/882,472, Response filed Jan. 6, 2019 to Non Final Office Action dated Jul. 12, 2019”, 10 pgs.
“U.S. Appl. No. 15/882,472, Final Office Action dated Nov. 27, 2019”, 16 pgs.
“U.S. Appl. No. 16/511,648, Response filed May 25, 2021 to Final Office Action dated Feb. 25, 2021”, 11 pgs.
“U.S. Appl. No. 16/511,648, Non Final Office Action dated Jun. 4, 2021”, 26 pgs.
“Brazil Application Serial No. BR1120130055251, Response filed Apr. 27, 2021 to Final Office Action dated Feb. 19, 2021”, with English claims, 19 pages.
“U.S. Appl. No. 16/511,648, Response filed Sep. 7, 2021 to Non Final Office Action dated Jun. 4, 2021”, 12 pgs.
“U.S. Appl. No. 12/877,993, Examiner's Answer dated Jun. 3, 2016 to Appeal Brief filed Feb. 24, 2016”, 10 pgs.
“U.S. Appl. No. 12/877,993, Reply Brief filed Aug. 3, 2016 to Examiner's Answer dated Jun. 3, 2016”, 5 pgs.
“U.S. Appl. No. 16/511,648, Appeal Brief filed Apr. 8, 2022”, 20 pgs.
“U.S. Appl. No. 16/511,648, Examiner's Answer to Appeal Brief dated Jun. 1, 2022”, 23 pgs.
Related Publications (1)
Number: 20210382955 A1, Date: Dec. 2021, Country: US
Continuations (2)
Parent: 16237022, Dec. 2018, US; Child: 17304692, US
Parent: 12878001, Sep. 2010, US; Child: 16237022, US
Continuation in Parts (1)
Parent: 12877034, Sep. 2010, US; Child: 12878001, US