Methods and apparatus to validate a tag for media

Abstract
Methods and apparatus to validate a tag for media are disclosed. An example method includes obtaining first identification information extracted from a tag distributed with media presented at a presentation location, obtaining second identification information determined from at least one of a) inherent information of at least one of audio or video of the media presented at the presentation location or b) a watermark embedded in at least one of the audio or the video of the media presented at the presentation location, comparing the first identification information with the second identification information, and, when the first identification information does not substantially match the second identification information, preventing the tag from being used to report exposure of the media.
Description
FIELD OF THE DISCLOSURE

The present disclosure relates generally to identifying media content and, more particularly, to methods and apparatus to generate and validate a tag for media content.


BACKGROUND

Media content is distributed over many different distribution channels. For example, the same media content may be distributed over a broadcast system (e.g., cable, satellite, terrestrial, etc.) and may be distributed over the internet. Media content distributed over broadcast systems is often transmitted with identifying information embedded in or otherwise associated with the media content (e.g., watermarks) so that monitoring of the exposure to the media content at presentation locations (e.g., households) can be performed. Additionally or alternatively, identifying information comprising one or more characteristics of the media content (e.g., signatures) can be collected, labeled with known identifying information, and stored prior to distribution of the media content so that the media content can be later identified by extracting the signatures at a reception site and accessing the identifying information by matching the signatures extracted at the reception site to the stored signatures.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of an example system to generate a tag for media content and to retrieve the tag at a presentation location for analysis at a central facility.



FIG. 2 is a flowchart representative of example machine readable instructions that may be executed to generate a tag for media content.



FIG. 3 is a flowchart representative of example machine readable instructions that may be executed to generate a tag for watermarked content.



FIG. 4 is a flowchart representative of example machine readable instructions that may be executed to generate a watermark and a tag for media content.



FIG. 5 is a flowchart representative of example machine readable instructions that may be executed to generate a tag using reference identifying information.



FIG. 6 is a flowchart representative of example machine readable instructions that may be executed to validate a tag that has been generated by the media identifier tool of FIG. 1.



FIG. 7 is an example processor system that can be used to execute the example instructions of FIGS. 2-5 and/or 6 to implement the example apparatus of FIG. 1.





DETAILED DESCRIPTION

Advances in technology result in changes to the specifications defining the use of identifying information (e.g., watermarks, signatures, codes, etc.) for media content. In other words, the particular identifying information embedded in, associated with, or extracted from media content changes over time. Example methods, apparatus, and articles of manufacture disclosed herein generate one or more identifying tags for media content (e.g., programs or advertising).


As used herein, “identifying information” includes information that is inserted in media content for the purpose of identifying the media content (e.g., watermarks, codes, etc.) and/or includes information inherent to one or more aspects of the media content (e.g., the audio, the video, etc.) or to one or more aspects of the signal representing the media content, which inherent information identifies the media content (e.g., signatures, fingerprints, etc.). Such inherent information is not inserted for the purpose of identifying the media content.


As used herein “attribute data” is information that identifies media content such as, for example, a source identifier (SID), a media asset identifier (MAID), a signature value for media content, a title of the media content, an identification of a creator of the media content, a timestamp indicating when the media content was broadcast, etc. Identifying information may include attribute data. For example, a SID (attribute data) may be included in a watermark (identifying information) inserted in media content. In another example, a signature value (identifying information) generated for media content may be associated with a MAID (attribute data) in a reference database.


As used herein a “tag” is a string of letters and/or numbers that are associated with media content so that the media content can be identified. In some examples, the tag includes attribute data and/or identifying information that has been extracted from the media content. By associating a tag with media content, the processing needed to determine identifying information (e.g., extracting watermarks, generating signatures, etc.) can be performed once (e.g., prior to distribution) and the media content can be identified (e.g., after distribution) by reading the tag for the distributed media content without, for example, re-extracting the watermark, regenerating the signature, etc. Furthermore, when the tag includes identifying information or attribute data included in the identifying information, records of presentation of the media content collected using the tag can be combined with records from distribution systems that include identifying information in the media content (but do not necessarily use the tags to identify the media content).


In some examples, identifying information (e.g., watermarks embedded in the media content, signatures collected and labeled for the media content, etc.) for the media content has been previously associated with the media content. In some such examples, the identifying information can be obtained and used to generate the tag(s). The example tag(s) can be associated with media content prior to distribution (e.g., before internet media content is streamed to presentation locations (e.g., households)). For example, the tag(s) may be associated with the media content in a webpage distributing the media content, inserted in metadata of the media content (e.g., in a file containing the media content or a file associated with the file containing the media content), inserted in metadata of a stream, etc. The example tag(s) can later be extracted at presentation location(s) and analyzed to identify the media content and increment records for exposure to the media content. Where the tag(s) are generated based on identifying information used to identify the media content in other monitoring systems (e.g., watermarks or signatures used to identify the media content in a television broadcast system), media content presentation records collected using the tags may be combined with or compared to the records from the other distribution systems. In some examples, the identifying information and the tag(s) may be collected at a presentation location. The tag(s) may be compared with the identifying information to validate that the tag(s) are correct (e.g., to confirm that the tag(s) correctly identify the media content). If the format of the previously associated identifying information changes (e.g., technology developments may change the details of the watermarks, signatures, etc.), new tag(s) can be generated using the adaptable tag structure described herein.



FIG. 1 is a block diagram of an example system 100 to generate a tag for media content and to retrieve such tag at a presentation location 114 for analysis at a central facility 120.


In the example system 100, media content is added to a content data store 104. In the illustrated example, some media content is identified in a reference library of the central facility 120 by identifying information embedded in or otherwise broadcast with the media content (e.g., watermarks), some media content is identified in a reference library stored by the central facility 120 based on identifying information inherent in the program (e.g., using media content signatures), some media content is associated with identifying information to enable embedding of the identifying information (e.g., adding watermarks), and some media content is unknown. In other examples, any combination of methods for identifying media content may be used. In the illustrated example, identifying information and/or other attribute data is stored with the media content in the content data store 104. Alternatively, this metadata (e.g., the identifying information and/or attribute data) may be stored in another device linked to the content data store 104 and/or linked to devices that require access to the metadata.


The content data store 104 may be any combination of one or more data storage devices. For example, the content data store 104 may be one or more databases, one or more network storage locations, one or more hard drives, one or more files, one or more extensible markup language (XML) files, any combination of the foregoing, etc. In the illustrated example, the content data store 104 provides media content to the media identifier tool 105 for tagging the media content. The example content data store 104 receives media content that has been associated with identifying information (e.g., watermarks, signatures, etc.) from the media identifier tool 105 and/or receives tags that identify the media content from the media identifier tool 105. The content data store 104 provides media content to the content server 112 for distribution to presentation locations (e.g., the presentation location 114).


The example media identifier tool 105 of FIG. 1 receives media content from the content data store 104 and analyzes the media content to generate a tag for the media content and, in some instances, to embed or otherwise associate identifying information with the media content. The media identifier tool 105 of the illustrated example retrieves identifying information from the central facility 120. In the illustrated example, the media identifier tool 105 includes an identity analyzer 106, a tag generator 108, and a central facility transceiver 110.


The example identity analyzer 106 of FIG. 1 determines that media content is available for analysis at the content data store 104. For example, the media content may be transmitted from the content data store 104 to the identity analyzer 106 when new content is received, at periodic or aperiodic intervals, etc. Alternatively, the identity analyzer 106 may periodically or aperiodically query the content data store 104 to determine if there is content to be analyzed (e.g., content that has not yet been tagged, content that has been tagged in error, etc.). For example, the identity analyzer 106 may query the content data store 104 for content that is not yet associated with a tag, for content identified on a list of invalid tags, etc.


The example identity analyzer 106 of FIG. 1 attempts to identify the content to be analyzed. The example identity analyzer 106 of FIG. 1 uses one or more of the following techniques to identify the media content: extracting identifying information embedded in the media content (e.g., watermarks, codes, etc.) and querying a reference database with the embedded information, determining identifying information which is inherent in the media content or portion(s) of the signal representing the media content (e.g., signatures) and querying a reference database with the characteristic data, and/or reading identifying information and/or attribute information input to the identity analyzer 106 or otherwise provided to the identity analyzer 106 (e.g., an XML file storing identifying information input by a user).


As used herein, the term “embedded” includes modifying any portion of the media content to store identifying information, for example, by modifying an audio portion of the media content (e.g., by amplifying, attenuating, or shifting portions of the audio portion) and/or by modifying a video and/or an image portion of the media content (e.g., by amplifying or attenuating portions of the video portion, changing colors of portions of the video portion, etc.).


In the illustrated example, the identity analyzer 106 accesses the reference database at the central facility 120 via the central facility transceiver 110 to obtain identifying information based on characteristic data representative of the media content (or part thereof) and/or a signal (or portion thereof) representing the media content. After the identity analyzer 106 obtains inherent identifying information (e.g., a signature) from the media content, the identity analyzer 106 queries reference data using the inherent identifying information to obtain attribute data associated with (e.g., mapped to) the inherent identifying information. For example, the reference database can be generated by obtaining inherent identifying information for known media content and storing the inherent identifying information in the reference database in association with known attribute data. Alternatively, a reference database may be stored at the identity analyzer 106 or any other location accessible by the identity analyzer. In some examples, the identity analyzer 106 will not need to access the reference database when the media content has been watermarked.
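The reference-database lookup described above can be sketched as follows. This is an illustrative sketch only; the dictionary contents, the signature value, and the function name are hypothetical and are not part of the disclosure.

```python
# Hypothetical sketch of a reference-database lookup: inherent identifying
# information (a signature) extracted from media content is used as a key
# to retrieve previously stored attribute data. All values are illustrative.

# Reference database built ahead of time from known media content:
# signature value -> attribute data (here, a MAID and a title).
REFERENCE_DB = {
    "sig_a1b2c3": {"maid": "105600", "title": "Example Program"},
}

def lookup_attribute_data(signature):
    """Return attribute data associated with a signature, or None if unknown."""
    return REFERENCE_DB.get(signature)

print(lookup_attribute_data("sig_a1b2c3"))
print(lookup_attribute_data("sig_unknown"))
```

In practice the reference database may reside at the central facility 120, at the identity analyzer 106, or at any other accessible location, as the text notes.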


In some examples, the identity analyzer 106 may embed (or otherwise associate) identifying information that has been input to the identity analyzer 106 in (or with) the media content. For example, the identity analyzer 106 may embed a watermark in the media content using information in a received XML file. When a watermark is embedded or associated with the media content, the media content with the watermark is transmitted to the content data store 104 for distribution. In addition, information about the watermark is communicated to the central facility 120 via the central facility transceiver 110. If the media content cannot be identified by any available techniques, the example identity analyzer 106 provides an error message to an operator of the media identifier tool 105 to enable or prompt manual identification.


The identity analyzer 106 of the illustrated example transmits the information identifying the media content to the tag generator 108 (e.g., information obtained from the media content, information retrieved from the central facility 120, information received from an operator, etc.). The information may be any combination of identifying information and/or attribute data associated with the identifying information.


In the illustrated example, attribute data extracted from identifying information includes a 4-digit source identifier (SID), a time and date stamp, and an identifier for the type of code (e.g., an indication of whether the media content is from a network television broadcast or a cable television broadcast). Alternatively, any other identifying information and/or attribute data may be transmitted to the tag generator 108 from the identity analyzer 106. Attribute data may include one or more of a station identifier, a channel identifier, a title, a producer, a broadcast time, a serial number, a code identifier, a signature value, a website identifier, etc.


The example tag generator 108 of FIG. 1 receives identifying information and/or attribute data for media content and generates a tag for the media content. In the illustrated example, the generated tag is a numeric string in which each attribute data element is prefixed by an attribute type identifier. The example tag may include an attribute type identifier and attribute data element for each of a tag version, a cyclic redundancy check (CRC) key, a SID, a media asset ID (MAID) identifying the media content in a reference database, a code type, a time and date stamp, and a tick (duration from the start of the media content). According to the illustrated example, the attribute type identifier is a two digit number indicating what type of data follows the attribute type identifier (e.g., attribute type identifier 99 precedes 6 digits indicating the tag version).


An example generated tag and data used for generating the tag are shown in Table 1. Table 2 illustrates example attribute data from a watermark that may be extracted from the first 9 seconds of a media content file. The example tag in Table 1 is generated from the last row of the watermark data in Table 2 (i.e., the row with timestamp 788939798, which is 9 seconds into the media content file).









TABLE 1

Example tag generation

                    Example Value   Attribute Type   Attribute Data
Tag Version         0.0.1           99               000001
CRC key             123456789       98               09123456789
SID                 9004            02               049004
CODE type           FD              06               01
Time/Date Stamp     788939798       03               788939798
Tick (Duration)     9               04               019

Tag = “9900000102049004060103788939798040199809123456789”
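The concatenation illustrated in Table 1 can be sketched as follows. The field order and values are taken from the table; the helper function name is illustrative, not the patent's.

```python
# Sketch of the tag concatenation illustrated in Table 1: each field is a
# two-digit attribute type identifier followed by its attribute data.

FIELDS = [
    ("99", "000001"),        # tag version 0.0.1
    ("02", "049004"),        # SID 9004, length-prefixed (04 + 9004)
    ("06", "01"),            # code type FD, encoded as 01
    ("03", "788939798"),     # time/date stamp
    ("04", "019"),           # tick: 9 seconds, length-prefixed (01 + 9)
    ("98", "09123456789"),   # CRC key, length-prefixed (09 + 123456789)
]

def build_tag(fields):
    """Concatenate (attribute type, attribute data) pairs into a tag string."""
    return "".join(attr_type + attr_data for attr_type, attr_data in fields)

tag = build_tag(FIELDS)
print(tag)  # "9900000102049004060103788939798040199809123456789"
```

The result matches the example tag shown at the bottom of Table 1.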













TABLE 2

Example attribute data from a watermark

TS (UTC)    ENCODE TIME   SID    LEVEL   TYPE   FILE BLOCK   STRENGTH   CHANNEL
788939791   1             9004   FD      U      177          16         1
788939792   3             9004   FD      U      364          11         1
788939794   5             9004   FD      U      551          17         1
788939796   7             9004   FD      U      738          20         1
788939798   9             9004   FD      U      925          21         1

Table 3 is a description of the attribute types and structure of the example tag generation. Any other attribute types, data formats, tag structures, etc. may be used.









TABLE 3

Example attribute descriptions and definitions

Attribute Type: 99
  Name: Tag Structure Version
  Description: If there is a change in the structure definition, it can be
    reflected by this attribute so that the correct tag is produced and
    consumed.
  Example Attribute Data: 011053
  Structure: six digit number of the form VVMMNN, where VV = Major Version
    Number, MM = Minor Version Number, and NN = Incremental Number. Each
    field is padded with a leading 0 in the case of a single digit number.
    Example: version 1.10.53 = Attribute Data 011053.

Attribute Type: 98
  Name: Tag Integrity
  Description: This attribute holds the string that is generated by a CRC
    type of algorithm.
  Example Attribute Data: 09123456789
  Structure: LLXXXXX . . . X, where LL = two digit number indicating the
    length of the CRC String and XXXXX . . . X = the CRC String.
    Example: CRC String = 123456789, Length = 9, Attribute Data = 09123456789.

Attribute Type: 01
  Name: Media Asset ID (MAID)
  Example Attribute Data: 06105600
  Structure: LLXXXX . . . X, where LL = two digit number indicating the
    length of the MAID and XXXX . . . X = the MAID.
    Example: MAID = 105600, Length = 6, Attribute Data = 06105600.

Attribute Type: 02
  Name: SID
  Example Attribute Data: 041004
  Structure: LLXXXX . . . X, where LL = two digit number indicating the
    length of the SID and XXXX . . . X = the SID.
    Example: SID = 1004, Length = 4, Attribute Data = 041004.
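A tag can also be parsed back into its attributes. The following sketch assumes, based on the examples in Tables 1 and 3 rather than on any normative statement, that the version, code type, and time/date stamp attributes have fixed widths of 6, 2, and 9 digits, while the remaining attributes carry a two-digit length prefix.

```python
# Hedged sketch of parsing a tag into its attributes. The width rules
# below are inferred from the examples in Tables 1 and 3 and are
# assumptions, not specified behavior.

FIXED_WIDTH = {"99": 6, "06": 2, "03": 9}   # attribute type -> data width
LENGTH_PREFIXED = {"98", "01", "02", "04"}  # LLXXX . . . X structure

def parse_tag(tag):
    """Split a tag string into {attribute type: attribute data}."""
    attributes, pos = {}, 0
    while pos < len(tag):
        attr_type = tag[pos:pos + 2]
        pos += 2
        if attr_type in FIXED_WIDTH:
            width = FIXED_WIDTH[attr_type]
        elif attr_type in LENGTH_PREFIXED:
            width = int(tag[pos:pos + 2])  # two-digit LL length prefix
            pos += 2
        else:
            raise ValueError("unknown attribute type: " + attr_type)
        attributes[attr_type] = tag[pos:pos + width]
        pos += width
    return attributes

print(parse_tag("9900000102049004060103788939798040199809123456789"))
```

Applied to the example tag of Table 1, this recovers the SID 9004, the timestamp 788939798, the tick of 9 seconds, and the CRC string 123456789.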









In some examples, the tag generator 108 generates the tag by making a call to a tag generation application programming interface (API) and passing the identification information. The API may be made available at the media identifier tool (e.g., as part of the tag generator 108) or may be provided by the central facility 120 via the central facility transceiver 110.


The tag generator 108 of the illustrated example transmits the generated tag to the content data store 104 for storage of the tag in association with the media content. In some examples, the example tag generator 108 also transmits the tag to the central facility 120 via the central facility transceiver 110 for later comparison with tags detected at the presentation location 114. In some examples, the tag generator 108 does not transmit the tag to the central facility 120. For example, the central facility may not need to receive the tag because the central facility 120 may obtain and/or decode the identifying information from the tag itself.


The central facility transceiver 110 of the illustrated example communicatively couples the media identifier tool 105 with the central facility 120. The example central facility transceiver 110 of FIG. 1 is a network communication device that is linked to the central facility by one or more networks (e.g., the internet, a local area network, a wide area network, etc.). Alternatively, the central facility transceiver 110 may be any other device to communicatively couple the media identifier tool 105 with the central facility. An example network linking the media identifier tool 105 with the central facility 120 may also link the media identifier tool 105 with the content data store 104.


The content server 112 of the illustrated example is communicatively coupled with the content data store 104 and the presentation location 114 to provide media content from the content data store 104 to the presentation location 114. For example, the content server 112 may be a web server that provides media content to the presentation location 114 in response to a request for the media content from the presentation location 114. Alternatively, the content server 112 may be any device for media content distribution.


For media content that has previously been associated with a tag in the content data store 104, the example content server 112 distributes the tag with the media content. In the illustrated example, the tag is inserted in a metadata field of an Adobe® Flash® video file so that the tag is sent to the presentation location 114 when the Adobe® Flash® video file is sent to the presentation location. When the presentation location 114 requests media content from the content server 112, the content server 112 transmits an Adobe® Flash® video player to the presentation location 114. The Adobe® Flash® video player executes at the presentation location 114 and requests the particular Adobe® Flash® video file corresponding to the request from the presentation location 114. The content server 112 transmits the Adobe® Flash® video file with the tag in the metadata to the video player. Alternatively, any other arrangement may be used. For example, the tag may be associated with the media content in a markup language file (e.g., a hypertext markup language (HTML) file).


The presentation location 114 of the illustrated example requests, receives, and presents media content. For example, the presentation location may be a household, a business, a public location, etc. Typically, the presentation location 114 requests media content that has been requested by a user residing at the presentation location.


The example presentation location 114 of FIG. 1 includes a media presentation device 116. The media presentation device 116 of the illustrated example is the device that requests media content from the content server 112 and receives and presents that media content. In the illustrated example, the media presentation device 116 is a personal computer executing a web browser that can make requests for media content and execute an Adobe® Flash® video player provided by the content server 112. Alternatively, the media presentation device 116 may be any type of device such as, for example, a desktop computer, a laptop computer, a mobile device, a cellular device, a wireless device, a television, a billboard, a projector, etc. While a single media presentation device 116 is shown in the illustrated example, any number or type of media presentation device(s) may be located at the presentation location 114.


The example presentation location 114 includes monitoring instructions 115 and a meter 118 to extract identifying information from presented media content and to transmit the identifying information to the central facility 120 for analysis.


In the illustrated example, the monitoring instructions 115 are computer instructions (e.g., JavaScript, JAVA, etc.) that are transmitted from the content server 112 to the presentation location 114 along with the Adobe® Flash® video player and/or in association with the video content. The computer instructions 115 extract tags from media content presented at the presentation location 114 and transmit the tags to the central facility 120. In addition, the computer instructions 115 transmit information identifying the presentation location 114 to the central facility 120. For example, the computer instructions 115 may transmit an internet protocol (IP) address, a serial number, an identification stored in a cookie, or any other identifier.


In some examples, the monitoring instructions 115 may be transmitted to the presentation location 114 at a time other than when the media content and/or video player is transmitted from the content server 112 to the presentation location 114. The monitoring instructions may be implemented in any manner such as, for example, computer instructions in any instruction language, a browser toolbar or plugin, any other type of plugin for a computer, a device installed at the presentation location, etc. The monitoring instructions may transmit tags to the central facility 120 for any media content available at the presentation location 114 such as, for example, media content that is received at the presentation location 114, media content that is presented at the presentation location 114, media content that is presented at the presentation location 114 but is not viewed or heard, media content that is stored at the presentation location 114, media content that is transmitted from the presentation location 114 to another presentation location, etc. While the example monitoring instructions 115 of the illustrated example transmit the tag and information identifying the presentation location 114 to the central facility 120, the monitoring instructions 115 may transmit any additional or alternative information such as, for example, information about trick play of the media content, information about user input, etc.


The meter 118 of the illustrated example analyzes media content presented at the media presentation device 116 to obtain identifying information and transmits the identifying information to the central facility 120. The example meter 118 obtains watermarks embedded in or otherwise associated with the media content. At least when analyzed media content does not include a watermark, the example meter 118 extracts signature information from the media content. The example meter 118 also transmits tags associated with the media content to the central facility 120. Because the identifying information (e.g., signatures, watermarks, etc.) is transmitted to the central facility 120 with the tags, the central facility 120 can compare the information to confirm that the tags have been accurately associated with the media content (e.g., to confirm that the wrong tag has not been associated with the media content).


In the illustrated example, the meter 118 is implemented in software that is installed on the media presentation device 116 when a user of the media presentation device 116 joins a panel. Alternatively, the meter 118 may be a device that is communicatively coupled to the media presentation device 116, may be software that is distributed to the general public, may be software that is automatically installed on the media presentation device 116 without user interaction, may be software that is installed by a user of the media presentation device 116, etc.


While, for simplicity of illustration, a single media presentation location 114 is illustrated in FIG. 1, multiple media presentation locations will typically be present. Furthermore, while the presentation location 114 includes both the monitoring instructions 115 and the meter 118, other presentation locations may include either the monitoring instructions 115 or the meter 118. For example, a first set of media presentation locations 114 may include only the monitoring instructions 115 and a second set of media presentation locations 114 different from the first set may include the meter 118. In other examples, media presentation locations 114 of a panel may include the meter 118 and all media presentation locations 114 (including the media presentation locations 114 in the panel) may include the monitoring instructions 115.


The example central facility 120 of FIG. 1 communicates with the media identifier tool 105 to receive tags and identifying information for media content and to provide identifying information for media content to the media identifier tool 105. In addition the example central facility 120 receives identifying information for presented media content from the presentation location 114. The central facility 120 may store the identifying records in any type of data storage (e.g., a database, a log file, etc.). The example central facility 120 of FIG. 1 includes a reference database 122 that stores identifying information for known media content (e.g., media associated with a code or signature that has been previously associated with identifying information), a tag access database 124 that stores records of tags received from the presentation location 114, and an identifying information database 126 that stores records of identifying information (e.g., signatures, watermarks, etc.) received from the presentation location 114.


In the illustrated example, the central facility 120 credits presentation records based on the tags received from the media presentation location 114. For example, if the central facility 120 receives a tag identifying a particular media content by a combination of SID, timestamp, and code type, the central facility 120 will credit the identified media content as having been presented. The central facility 120 may also validate the tags when identifying information and/or attribute data for media content is received with the tags. In other words, the central facility 120 compares the tags (i.e., the information represented by the tags) to other identifying information (e.g., watermarks, signatures) and/or attribute data to identify invalid tags. For example, the central facility 120 may compare an SID retrieved from a tag to an SID extracted from a watermark or code embedded in or otherwise associated with the media content. In another example, the central facility 120 may compare a MAID retrieved from a tag to a MAID determined by querying the reference database 122 with a signature extracted from the media content. The example central facility stores a listing of invalid tags (e.g., in the tag access database 124) to prevent those tags from being used to credit presentation records. The central facility 120 may also notify the content data store 104 that an invalid tag is being used. Such notification may be electronically transmitted or may be manually performed by an operator of the central facility.
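The SID comparison described above can be sketched as follows. The function and variable names are hypothetical; a real implementation may compare whatever identifying information and/or attribute data is available (e.g., MAIDs, timestamps, code types).

```python
# Hedged sketch of tag validation at the central facility: the SID
# recovered from a tag is compared against the SID extracted from a
# watermark collected at the presentation location. Mismatched tags are
# recorded as invalid so they are not used to credit presentation records.

INVALID_TAGS = set()  # illustrative stand-in for the tag access database

def validate_tag(tag_id, tag_sid, watermark_sid):
    """Return True if the tag's SID matches the watermark's SID."""
    if tag_sid != watermark_sid:
        INVALID_TAGS.add(tag_id)  # prevent future crediting with this tag
        return False
    return True

print(validate_tag("tag-1", "9004", "9004"))  # matching SIDs: tag is valid
print(validate_tag("tag-2", "9004", "1004"))  # mismatch: tag-2 marked invalid
```

A presentation record would then be credited only when the tag is absent from the invalid-tag listing.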


While an example manner of implementing the system 100 has been illustrated in FIG. 1, one or more of the elements, processes and/or devices illustrated in FIG. 1 may be combined, divided, re-arranged, omitted, eliminated and/or implemented in any other way. Further, the content data store 104, the media identifier tool 105, the identity analyzer 106, the tag generator 108, the central facility transceiver 110, the content server 112, the presentation location 114, the monitoring instructions 115, the media presentation device 116, the meter 118, and the central facility 120 may be implemented by hardware, software, firmware and/or any combination of hardware, software and/or firmware. Thus, for example, any of the content data store 104, the media identifier tool 105, the identity analyzer 106, the tag generator 108, the central facility transceiver 110, the content server 112, the presentation location 114, the monitoring instructions 115, the media presentation device 116, the meter 118, and the central facility 120 could be implemented by one or more circuit(s), programmable processor(s), application specific integrated circuit(s) (ASIC(s)), programmable logic device(s) (PLD(s)) and/or field programmable logic device(s) (FPLD(s)), etc. When any of the appended apparatus claims are read to cover a purely software and/or firmware implementation, at least one of the content data store 104, the media identifier tool 105, the identity analyzer 106, the tag generator 108, the central facility transceiver 110, the content server 112, the presentation location 114, the monitoring instructions 115, the media presentation device 116, the meter 118, and the central facility 120 are hereby expressly defined to include a computer readable medium such as a memory, DVD, CD, etc. storing the software and/or firmware. Further still, the example system 100 of FIG. 1 may include one or more elements, processes and/or devices in addition to, or instead of, those illustrated in FIG. 1, and/or may include more than one of any or all of the illustrated elements, processes and devices.



FIGS. 2-6 are flow diagrams representative of example machine readable instructions that may be executed to generate and process tags to implement the example system 100 of FIG. 1. The example processes of FIGS. 2-6 may be implemented using machine readable instructions that, when executed, cause a device (e.g., a programmable controller, processor, or other programmable machine or integrated circuit) to perform the operations shown in FIGS. 2-6. For instance, the example processes of FIGS. 2-6 may be performed using a processor, a controller, and/or any other suitable processing device. For example, the example processes of FIGS. 2-6 may be implemented using coded instructions stored on a tangible machine readable medium such as a flash memory, a read-only memory (ROM), and/or a random-access memory (RAM).


As used herein, the term tangible computer readable medium is expressly defined to include any type of computer readable storage and to exclude propagating signals. Additionally or alternatively, the example processes of FIGS. 2-6 may be implemented using coded instructions (e.g., computer readable instructions) stored on a non-transitory computer readable medium such as a flash memory, a read-only memory (ROM), a random-access memory (RAM), a cache, or any other storage media in which information is stored for any duration (e.g., for extended time periods, permanently, brief instances, for temporarily buffering, and/or for caching of the information). As used herein, the term non-transitory computer readable medium is expressly defined to include any type of computer readable medium and to exclude propagating signals.


Alternatively, the example processes of FIGS. 2-6 may be implemented using any combination(s) of application specific integrated circuit(s) (ASIC(s)), programmable logic device(s) (PLD(s)), field programmable logic device(s) (FPLD(s)), discrete logic, hardware, firmware, etc. Also, the example processes of FIGS. 2-6 may be implemented as any combination(s) of any of the foregoing techniques, for example, any combination of firmware, software, discrete logic and/or hardware.


Although the example processes of FIGS. 2-6 are described with reference to the flow diagrams of FIGS. 2-6, other methods of implementing the processes of FIGS. 2-6 may be employed. For example, the order of execution of the blocks may be changed, and/or some of the blocks described may be changed, eliminated, sub-divided, or combined. Additionally, one or more of the example processes of FIGS. 2-6 may be performed sequentially and/or in parallel by, for example, separate processing threads, processors, devices, discrete logic, circuits, etc.


Turning in detail to FIG. 2, initially, the identity analyzer 106 detects media content for tagging (block 202). For example, the identity analyzer 106 may receive a notification or request from the content data store 104 or may query the content data store 104 for available media content. The identity analyzer 106 then determines if a watermark or other identifying information or attribute data is included in the media content (block 204). If the identity analyzer 106 determines that a watermark or other identifying information or attribute data is included in the media content, control proceeds to block 206 and a tag is generated based on the existing watermark and/or other identifying information and/or attribute data, which is described below in conjunction with FIG. 3.


If the identity analyzer 106 determines that a watermark and/or other identifying information and/or attribute data is not included in the media content (block 204), the identity analyzer 106 determines if identifying information has been provided for watermarking the media content (block 208). For example, an XML file including an identification of the media content (e.g., by file name) and identifying information (e.g., watermark information) may be input to the identity analyzer 106 or similar data may be input by an operator of the identity analyzer 106. If the identity analyzer 106 determines that identifying information has been provided, control proceeds to block 210 to generate and associate a watermark and/or other identifying information and/or attribute data with the media content and to generate a tag for the watermark and/or other identifying information and/or attribute data, which is described in conjunction with FIG. 4.


If the identity analyzer 106 determines that identifying information has not been provided (block 208), the identity analyzer 106 determines if the media content is found in a reference library (block 212). The identity analyzer 106 of the illustrated example generates a signature for the media content and determines if a matching signature can be found in the reference database 122 at the central facility 120 (block 211). If the identity analyzer 106 determines that the media content is found in the reference library (block 212), control proceeds to block 214 to generate a tag using the identifying information from the reference library, which is described in conjunction with FIG. 5.


If the media content is not found in a reference library (block 212), the example identity analyzer 106 provides an error indicating that a tag could not be generated (block 216). Alternatively, the identity analyzer 106 could prompt an operator of the identity analyzer 106 to input identifying information and/or attribute data so that a tag could be generated in a similar manner to the process described in FIG. 3 in which identifying information is retrieved based on a watermark associated with the media content. After providing the error (block 216), the process of FIG. 2 terminates until additional media content for tagging is detected.


Turning to block 218, after block 206, block 210, or block 214 completes, the tag generator 108 transmits the generated tag to the content data store 104 (block 218). Metadata associated with inherent information (e.g., a signature) generated in block 214 may also be transmitted to the content data store 104. Then, the process of FIG. 2 terminates until additional media content for tagging is detected.



FIG. 3 is a flowchart of example machine readable instructions that may be executed to implement block 206 of FIG. 2. The identity analyzer 106 extracts embedded identifying information (e.g., a watermark or an identification code) from the media content (block 302). Example watermark information extracted from example media content is shown in Table 2. The tag generator 108 generates the tag from the extracted identifying information (block 304). The example tag generator 108 generates the tag by concatenating several sets of attribute type values and data values as shown in Table 1. Control then returns to block 218 of FIG. 2.
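The concatenation of attribute type values and data values might look like the following sketch. Table 1 is not reproduced here, so the `type=value` layout and the `&` delimiter are assumptions chosen only for illustration:

```python
def generate_tag(attributes):
    """Concatenate (attribute type, data value) pairs into a tag string.

    `attributes` is an ordered list of (type, value) pairs, e.g.
    [("sid", "1234"), ("ts", "0500")]. The '&'-separated
    'type=value' layout is illustrative, not the format of Table 1.
    """
    return "&".join(f"{atype}={value}" for atype, value in attributes)


# Example: generate_tag([("sid", "1234"), ("ts", "0500")])
# produces "sid=1234&ts=0500"
```

Because the pairs are self-describing, a receiver can recover each attribute independently of the order in which the generator emitted them.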



FIG. 4 is a flowchart of example machine readable instructions that may be executed to implement block 210 of FIG. 2. The identity analyzer 106 retrieves watermark and/or other identifying information, metadata, and/or attribute data from the content data store 104 for watermarking the media content (block 402). For example, the identity analyzer 106 may receive an XML file with attribute data for the media content. The identity analyzer 106 then associates the watermark with the media content (block 404). Any desired watermarking technique may be employed. The identity analyzer 106 then transmits the watermarked media content to the content data store 104 for storage (block 406). The identity analyzer 106 then generates a watermark data output of information about the media content and the watermarks that have been inserted therein (block 408). The example identity analyzer 106 generates an XML watermark data output that includes the identifying information and/or attribute data for the media content and information about the watermark(s) that were embedded in the media content. The example identity analyzer 106 transmits the watermark data output to the content data store 104 (block 410). Metadata associated with the watermarking may also be transmitted to the content data store 104. Then, the tag generator 108 generates a tag using the identification information from the watermark (block 412). The example tag generator 108 generates the tag by concatenating several sets of attribute type values and data values as shown in Table 1. Control advances to block 218 of FIG. 2.



FIG. 5 is a flowchart of example machine readable instructions that may be executed to implement block 214 of FIG. 2. The identity analyzer 106 extracts inherent information (e.g., characteristic information) from the media content (e.g., a signature) (block 502). The example identity analyzer 106 retrieves attribute data and/or identifying information for the media content from the content data store 104 based on the inherent information (block 504). In the illustrated example, the identity analyzer 106 queries the content data store 104 to match the extracted signature information to reference signature information in the content data store 104 to retrieve the attribute data and/or identifying information. Then, the tag generator 108 generates a tag using the attribute data and/or identifying information associated with the media content by the signature (block 506). The example tag generator 108 generates the tag by concatenating several sets of attribute type values and data values as shown in Table 1. Control then advances to block 218 of FIG. 2.
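Retrieving attribute data by signature might be sketched as below. The exact-match dictionary lookup is a deliberate simplification (signature matching in practice is typically approximate), and the database contents and field names are hypothetical:

```python
# Hypothetical stand-in for the reference signature database:
# extracted signature -> attribute data / identifying information.
reference_db = {
    "a1b2c3": {"maid": "M-001", "title": "Example Program"},
}


def lookup_by_signature(signature):
    """Return attribute data for a signature, or None when the media
    content is not found in the reference library (the error path
    described above for block 216)."""
    return reference_db.get(signature)
```

A `None` result corresponds to the case in which no matching reference signature exists and a tag cannot be generated from the library.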



FIG. 6 is a flowchart of example machine readable instructions that may be executed to validate tags that have been generated by the media identifier tool 105. The central facility 120 receives tag response data from the presentation location 114 (block 602). For example, the central facility 120 may receive tag response data from the meter 118 for media content that has been presented on the media presentation device 116. The central facility 120 also receives identifying information and/or attribute data extracted from the media content corresponding to the received tag (block 604). For example, the example meter 118 transmits extracted watermark or signature data for the media content that has been presented on the media presentation device 116. The central facility 120 then compares the tag data and the identification information to determine if they both identify the same media content (block 606). In the illustrated example, the identifying information is a stream of watermarks and/or signatures. The central facility 120 determines whether any of the watermarks and/or signatures in the stream substantially match the watermark and/or signature information in the tag. For example, the central facility 120 may check whether the SID and/or MAID in the tag matches the SID and/or MAID from watermark or signature information embedded in or associated with the media content. A tag may substantially match identifying information from a signature or watermark when the two include the same values, when a majority of the values match, when the tag more closely matches the identifying information than any other identifying information, when one or more values from the tag match attribute data associated with the identifying information, or when the identifying information more closely matches the tag than any other tag.
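One of the "substantially match" criteria enumerated above (a majority of the values matching) could be approximated as in the following sketch. Treating the tag and the extracted identifying information as dictionaries of field values, and requiring a strict majority, are assumptions made only for illustration:

```python
def substantially_matches(tag, identifying_info):
    """Return True when the tag and the extracted identifying
    information agree on a strict majority of their shared fields.

    This implements only the majority-of-values criterion; the
    exact-equality and closest-match criteria described above could
    be layered on in the same way.
    """
    shared = set(tag) & set(identifying_info)
    if not shared:
        return False  # nothing to compare: treat as a non-match
    agreeing = sum(1 for key in shared if tag[key] == identifying_info[key])
    return agreeing * 2 > len(shared)  # strict majority of shared fields
```

Under this sketch, a tag whose SID and MAID both agree with the watermark data but whose timestamp differs would still substantially match.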


When the tag substantially matches the watermark or signature information (block 606), the tag may be marked as valid and/or reporting of exposure data using that tag will continue (block 610). For example, presentation of the media content may be credited based on the received tag data.


When the information in the tag does not match the watermark or signature information (block 606), the tag is added to an invalid tag list (e.g., in the tag access database 124) to remove the tag from exposure log data (block 612). Control then returns to block 602 to await reception of the next tag.


In some examples, the meter 118 is included at a subset of all presentation locations 114 (e.g., a set of presentation locations 114 selected for a panel) and other presentation locations 114 do not include a meter (i.e., are not panelists) but nonetheless receive the monitoring instructions 115. Accordingly, the presentation locations 114 that include the meter 118 will send both identifying information (e.g., watermark, signature, etc.) and tag information to the central facility 120. Upon receiving the first combination of tag information and identifying information from the meter 118, the central facility 120 performs the validation described in FIG. 6. If a tag is identified as valid, the example central facility 120 records that the tag has been validated and, in the illustrated example, the validation is not performed again for that tag. If a tag is identified as invalid, the example central facility 120 records that the tag is invalid and, in the illustrated example, the validation is not performed again for that tag. Alternatively, validation may be performed every time a tag/identifying information combination is received or may be performed at some interval (e.g., a random interval, a periodic interval, etc.). Accordingly, because the central facility 120 has recorded whether a tag is valid or invalid, when a tag is later received (e.g., a tag is received without identifying information), the central facility can determine whether the tag data should be trusted for use in reporting or should be ignored. In other words, when a tag has been marked as invalid, the media content identified (incorrectly) by the tag will not be credited as an exposure.
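The validate-once behavior described above can be sketched with a small cache keyed by tag; whether to re-validate on every receipt or at some interval is the policy knob mentioned in the text. The names below, and the use of plain string equality as a stand-in for the real tag/identifying-information comparison, are assumptions for illustration:

```python
validation_cache = {}  # tag -> True (valid) or False (invalid)


def should_credit(tag, identifying_info=None):
    """Decide whether a received tag should be credited.

    The first time a tag arrives together with identifying information
    (i.e., from a metered panelist home), it is validated and the
    verdict is recorded; later receipts of the same tag, even without
    identifying information, reuse the cached verdict.
    """
    if tag in validation_cache:
        return validation_cache[tag]
    if identifying_info is None:
        return True  # never validated, nothing to check against: trust it
    verdict = (tag == identifying_info)  # stand-in for the real comparison
    validation_cache[tag] = verdict
    return verdict
```

This mirrors the arrangement in which tags from non-panelist presentation locations arrive without identifying information but can still be accepted or ignored based on an earlier panelist-side validation.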



FIG. 7 is a block diagram of an example processor system 710 that may be used to execute the example instructions of FIGS. 2-6 to implement the example apparatus, methods, and/or systems described herein. As shown in FIG. 7, the processor system 710 includes a processor 712 that is coupled to an interconnection bus 714. The processor 712 may be any suitable processor, processing unit, or microprocessor. Although not shown in FIG. 7, the system 710 may be a multi-processor system and, thus, may include one or more additional processors that are identical or similar to the processor 712 and that are communicatively coupled to the interconnection bus 714.


The processor 712 of FIG. 7 is coupled to a chipset 718, which includes a memory controller 720 and an input/output (I/O) controller 722. A chipset provides I/O and memory management functions as well as a plurality of general purpose and/or special purpose registers, timers, etc. that are accessible or used by one or more processors coupled to the chipset 718. The memory controller 720 performs functions that enable the processor 712 (or processors if there are multiple processors) to access a system memory 724, a mass storage memory 725, and/or a digital versatile disk (DVD) 740.


In general, the system memory 724 may include any desired type of volatile and/or non-volatile memory such as, for example, static random access memory (SRAM), dynamic random access memory (DRAM), flash memory, read-only memory (ROM), etc. The mass storage memory 725 may include any desired type of mass storage device including hard disk drives, optical drives, tape storage devices, etc. The computer-readable instructions represented by the flow charts described above may be stored in the system memory 724, the mass storage memory 725, and/or the DVD 740.


The I/O controller 722 performs functions that enable the processor 712 to communicate with peripheral input/output (I/O) devices 726 and 728 and a network interface 730 via an I/O bus 732. The I/O devices 726 and 728 may be any desired type of I/O device such as, for example, a keyboard, a video display or monitor, a mouse, etc. The network interface 730 may be, for example, an Ethernet device, an asynchronous transfer mode (ATM) device, an 802.11 device, a digital subscriber line (DSL) modem, a cable modem, a cellular modem, etc. that enables the processor system 710 to communicate with another processor system.


While the memory controller 720 and the I/O controller 722 are depicted in FIG. 7 as separate functional blocks within the chipset 718, the functions performed by these blocks may be integrated within a single semiconductor circuit or may be implemented using two or more separate integrated circuits.


Although certain example methods, apparatus and articles of manufacture have been described herein, the scope of coverage of this patent is not limited thereto. On the contrary, this patent covers all methods, apparatus and articles of manufacture fairly falling within the scope of the claims of this patent.

Claims
  • 1. A method to validate a tag, the method comprising: obtaining, by executing an instruction with a processor, first identification information extracted from a first tag distributed with first media presented at a first presentation location; obtaining, by executing an instruction with the processor, second identification information determined from at least one of a) inherent information of at least one of audio or video of the first media presented at the first presentation location or b) a watermark embedded in at least one of the audio or the video of the first media presented at the first presentation location; comparing, by executing an instruction with the processor, the first identification information with the second identification information; when the first identification information does not substantially match the second identification information, preventing, by executing an instruction with the processor, the first tag from being used to report exposure of the first media; after preventing the first tag from being used, obtaining, by executing an instruction with the processor, third identification information extracted from a second tag distributed with second media presented at a second presentation location; and in response to determining that the second tag substantially matches the first tag, preventing, by executing an instruction with the processor, the second tag from being used to report exposure to the second media, wherein preventing the second tag from being used includes preventing the crediting of the second media as having been presented.
  • 2. The method as defined in claim 1, further including, when the first identification information substantially matches the second identification information, crediting the first media as having been presented.
  • 3. The method as defined in claim 2, further including generating a report of usage based on the crediting.
  • 4. The method as defined in claim 1, wherein preventing the first tag from being used includes marking the first tag, and obtaining the third identification information is performed after the marking.
  • 5. The method as defined in claim 1, wherein the second media is the first media.
  • 6. The method as defined in claim 4, further including: after marking the first tag, obtaining fourth identification information extracted from a third tag distributed with third media presented at a third presentation location; and in response to determining that the third tag does not substantially match the first tag, crediting the third media as having been presented.
  • 7. The method as defined in claim 1, wherein the second presentation location is the same as the first presentation location.
  • 8. A tangible computer readable storage medium comprising instructions that, when executed, cause a machine to at least: obtain first identification information extracted from a first tag distributed with first media presented at a first presentation location; obtain second identification information determined from at least one of a) inherent information of at least one of audio or video of the first media presented at the first presentation location or b) a watermark embedded in at least one of the audio or the video of the first media presented at the first presentation location; compare the first identification information with the second identification information; when the first identification information does not substantially match the second identification information, prevent the first tag from being used to report exposure of the first media; after preventing the first tag from being used, obtain third identification information extracted from a second tag distributed with second media presented at a second presentation location; and in response to determining that the second tag substantially matches the first tag, prevent the second tag from being used to report exposure to the second media, wherein the instructions, when executed, cause the machine to prevent the second tag from being used by preventing the crediting of the second media as having been presented.
  • 9. The tangible computer readable storage medium as defined in claim 8, wherein the instructions, when executed, cause the machine to, when the first identification information substantially matches the second identification information, credit the first media as having been presented.
  • 10. The tangible computer readable storage medium as defined in claim 9, wherein the instructions, when executed, cause the machine to generate a report of usage based on the crediting.
  • 11. The tangible computer readable storage medium as defined in claim 8, wherein the instructions, when executed, cause the machine to prevent the first tag from being used by marking the first tag, and the obtaining of the third identification information is performed after the marking.
  • 12. The tangible computer readable storage medium as defined in claim 8, wherein the second media is the first media.
  • 13. The tangible computer readable storage medium as defined in claim 11, wherein the instructions, when executed, cause the machine to: after marking the first tag, obtain fourth identification information extracted from a third tag distributed with third media presented at a third presentation location; and in response to determining that the third tag does not substantially match the first tag, credit the third media as having been presented.
  • 14. The tangible computer readable storage medium as defined in claim 13, wherein the second presentation location is the same as the first presentation location.
  • 15. A system comprising: a central facility to obtain first identification information extracted from a first tag distributed with first media presented at a first presentation location; a meter to: obtain second identification information determined from at least one of a) inherent information of at least one of audio or video of the first media presented at the first presentation location or b) a watermark embedded in at least one of the audio or the video of the first media presented at the first presentation location, and transmit the second identification information to the central facility; wherein the central facility is to: compare the first identification information with the second identification information and, when the first identification information does not substantially match the second identification information, prevent the first tag from being used to report exposure of the first media; after preventing the first tag from being used, obtain third identification information extracted from a second tag distributed with second media presented at a second presentation location; and in response to determining that the second tag substantially matches the first tag, prevent the second tag from being used to report exposure to the second media, wherein the central facility is to prevent the second tag from being used by preventing the crediting of the second media as having been presented.
  • 16. The system as defined in claim 15, wherein the central facility is further to, when the first identification information substantially matches the second identification information, credit the first media as having been presented.
  • 17. The system as defined in claim 16, wherein the central facility is further to generate a report of usage based on the crediting.
  • 18. The system as defined in claim 15, wherein the central facility is to prevent the first tag from being used by marking the tag, and the obtaining of the third identification information is performed after the marking.
  • 19. The system as defined in claim 15, wherein the second media is the first media.
  • 20. The system as defined in claim 18, wherein the central facility is to: after marking the first tag, obtain fourth identification information extracted from a third tag distributed with third media presented at a third presentation location; and in response to determining that the third tag does not substantially match the first tag, credit the third media as having been presented.
  • 21. The system as defined in claim 15, wherein the second presentation location is the same as the first presentation location.
RELATED APPLICATIONS

This patent is a continuation of U.S. patent application Ser. No. 13/181,147, filed on Jul. 12, 2011, entitled “METHODS AND APPARATUS TO GENERATE A TAG FOR MEDIA CONTENT,” which claims the benefit of U.S. Provisional Patent Application Ser. No. 61/474,728, filed Apr. 12, 2011, entitled “METHODS AND APPARATUS TO GENERATE A TAG FOR MEDIA CONTENT.” U.S. patent application Ser. No. 13/181,147 and U.S. Provisional Patent Application Ser. No. 61/474,728 are hereby incorporated by reference in their entireties.

20060059277 Zito et al. Mar 2006 A1
20060062426 Levy et al. Mar 2006 A1
20060095401 Krikorian et al. May 2006 A1
20060107195 Ramaswamy et al. May 2006 A1
20060107302 Zdepski May 2006 A1
20060136564 Ambrose Jun 2006 A1
20060161635 Lamkin et al. Jul 2006 A1
20060167747 Goodman et al. Jul 2006 A1
20060168613 Wood et al. Jul 2006 A1
20060195614 Sena et al. Aug 2006 A1
20060195886 Ashley Aug 2006 A1
20060212705 Thommana et al. Sep 2006 A1
20060221173 Duncan Oct 2006 A1
20060224798 Klein et al. Oct 2006 A1
20060242325 Ramaswamy et al. Oct 2006 A1
20070006250 Croy et al. Jan 2007 A1
20070016918 Alcorn et al. Jan 2007 A1
20070030966 Sra et al. Feb 2007 A1
20070050832 Wright et al. Mar 2007 A1
20070055987 Lu et al. Mar 2007 A1
20070074020 Nishimura Mar 2007 A1
20070083611 Farago et al. Apr 2007 A1
20070110089 Essafi et al. May 2007 A1
20070112837 Houh et al. May 2007 A1
20070118375 Kenyon et al. May 2007 A1
20070118873 Houh et al. May 2007 A1
20070124771 Shvadron May 2007 A1
20070127717 Herre et al. Jun 2007 A1
20070129952 Kenyon et al. Jun 2007 A1
20070133223 Fredley et al. Jun 2007 A1
20070136753 Bovenschulte et al. Jun 2007 A1
20070136777 Hasek et al. Jun 2007 A1
20070149114 Danilenko Jun 2007 A1
20070157262 Ramaswamy et al. Jul 2007 A1
20070162927 Ramaswamy et al. Jul 2007 A1
20070186228 Ramaswamy Aug 2007 A1
20070198738 Angiolillo et al. Aug 2007 A1
20070201835 Rhoads Aug 2007 A1
20070226760 Neuhauser et al. Sep 2007 A1
20070250901 McIntire et al. Oct 2007 A1
20070266395 Lee et al. Nov 2007 A1
20070274523 Rhoads Nov 2007 A1
20070276925 La Joie et al. Nov 2007 A1
20070276926 La Joie et al. Nov 2007 A1
20070288476 Flanagan, III et al. Dec 2007 A1
20070294057 Crystal et al. Dec 2007 A1
20070294132 Zhang et al. Dec 2007 A1
20070294705 Gopalakrishnan et al. Dec 2007 A1
20070294706 Neuhauser et al. Dec 2007 A1
20080027734 Zhao et al. Jan 2008 A1
20080028223 Rhoads Jan 2008 A1
20080040354 Ray et al. Feb 2008 A1
20080046499 Cabrera et al. Feb 2008 A1
20080059160 Saunders et al. Mar 2008 A1
20080065507 Morrison et al. Mar 2008 A1
20080077956 Morrison et al. Mar 2008 A1
20080082510 Wang et al. Apr 2008 A1
20080082922 Biniak et al. Apr 2008 A1
20080083003 Biniak et al. Apr 2008 A1
20080104624 Narasimhan et al. May 2008 A1
20080120661 Ludvig et al. May 2008 A1
20080126420 Wright May 2008 A1
20080133223 Son et al. Jun 2008 A1
20080139182 Levy et al. Jun 2008 A1
20080140573 Levy et al. Jun 2008 A1
20080168503 Sparrell Jul 2008 A1
20080184132 Zato Jul 2008 A1
20080200999 Hakansson Aug 2008 A1
20080209491 Hasek Aug 2008 A1
20080219496 Tewfik et al. Sep 2008 A1
20080219637 Sandrew Sep 2008 A1
20080235077 Harkness et al. Sep 2008 A1
20080240490 Finkelstein et al. Oct 2008 A1
20080249961 Harkness et al. Oct 2008 A1
20080263579 Mears et al. Oct 2008 A1
20080294487 Nasser Nov 2008 A1
20080310629 Van Der Veen et al. Dec 2008 A1
20090007169 Headley et al. Jan 2009 A1
20090015599 Bennett et al. Jan 2009 A1
20090070408 White Mar 2009 A1
20090074240 Srinivasan Mar 2009 A1
20090083417 Hughes et al. Mar 2009 A1
20090086812 Ducharme Apr 2009 A1
20090103887 Choi et al. Apr 2009 A1
20090106297 Wright et al. Apr 2009 A1
20090119723 Tinsman May 2009 A1
20090129588 Takakusu et al. May 2009 A1
20090133093 Hodge May 2009 A1
20090150553 Collart et al. Jun 2009 A1
20090157731 Zigler et al. Jun 2009 A1
20090158318 Levy Jun 2009 A1
20090164378 West et al. Jun 2009 A1
20090164564 Willis Jun 2009 A1
20090210892 Ramaswamy Aug 2009 A1
20090228492 Valdez et al. Sep 2009 A1
20090248886 Tan et al. Oct 2009 A1
20090259325 Topchy et al. Oct 2009 A1
20090259745 Lee Oct 2009 A1
20090265214 Jobs et al. Oct 2009 A1
20090276313 Wilhelm Nov 2009 A1
20090305680 Swift Dec 2009 A1
20090307061 Monighetti et al. Dec 2009 A1
20090307084 Monighetti et al. Dec 2009 A1
20100008586 Meyer et al. Jan 2010 A1
20100009722 Levy et al. Jan 2010 A1
20100023405 Liu Jan 2010 A1
20100083299 Nelson Apr 2010 A1
20100088583 Schachter Apr 2010 A1
20100094897 Sumrall et al. Apr 2010 A1
20100121936 Liu et al. May 2010 A1
20100135638 Mio Jun 2010 A1
20100174774 Kern et al. Jul 2010 A1
20100226526 Modro et al. Sep 2010 A1
20100241963 Kulis et al. Sep 2010 A1
20100246955 Wright Sep 2010 A1
20100262711 Bouazizi Oct 2010 A1
20100280641 Harkness et al. Nov 2010 A1
20100306257 Levy Dec 2010 A1
20100318600 Furbeck Dec 2010 A1
20110016231 Ramaswamy et al. Jan 2011 A1
20110030031 Lussier et al. Feb 2011 A1
20110055314 Rosenstein et al. Mar 2011 A1
20110066437 Luff Mar 2011 A1
20110078721 Wang et al. Mar 2011 A1
20110088053 Lee Apr 2011 A1
20110123062 Hilu May 2011 A1
20110145246 Prager et al. Jun 2011 A1
20110145581 Malhotra et al. Jun 2011 A1
20110154185 Kern et al. Jun 2011 A1
20110157475 Wright et al. Jun 2011 A1
20110173200 Yang et al. Jul 2011 A1
20110196921 Sylthe Aug 2011 A1
20110252118 Pantos et al. Oct 2011 A1
20110320287 Holt et al. Dec 2011 A1
20110321003 Doig et al. Dec 2011 A1
20120023516 Wolinsky et al. Jan 2012 A1
20120036350 Kuno et al. Feb 2012 A1
20120045054 Main et al. Feb 2012 A1
20120096546 Dilley et al. Apr 2012 A1
20120124605 Praden May 2012 A1
20120137015 Sun May 2012 A1
20120209949 Deliyannis et al. Aug 2012 A1
20120239809 Mazumdar et al. Sep 2012 A1
20120284804 Lindquist et al. Nov 2012 A1
20120308071 Ramsdell et al. Dec 2012 A1
20120311126 Jadallah et al. Dec 2012 A1
20130007298 Ramaswamy et al. Jan 2013 A1
20130007794 Besehanic et al. Jan 2013 A1
20130054972 Thorwirth Feb 2013 A1
20130061275 Seo et al. Mar 2013 A1
20130097285 van Zwol et al. Apr 2013 A1
20130124747 Harrang et al. May 2013 A1
20130166868 Jarnikov et al. Jun 2013 A1
20130202150 Sinha et al. Aug 2013 A1
20130231931 Kulis et al. Sep 2013 A1
20130247078 Nikankin et al. Sep 2013 A1
20130268623 Besehanic et al. Oct 2013 A1
20130268630 Besehanic et al. Oct 2013 A1
20130290508 Besehanic et al. Oct 2013 A1
20130291001 Besehanic et al. Oct 2013 A1
20130297410 Oh et al. Nov 2013 A1
20130297737 Wajs et al. Nov 2013 A1
20130311776 Besehanic Nov 2013 A1
20130311780 Besehanic Nov 2013 A1
20140082220 Ramaswamy et al. Mar 2014 A1
20140105392 Robert et al. Apr 2014 A1
20140229629 Besehanic Aug 2014 A1
20140229970 Besehanic Aug 2014 A1
20140244828 Besehanic Aug 2014 A1
20140298365 Matsubara et al. Oct 2014 A1
20140301386 Harrenstien et al. Oct 2014 A1
20160043916 Ramaswamy et al. Feb 2016 A1
20160127466 Albrecht et al. May 2016 A1
Foreign Referenced Citations (63)
Number Date Country
8976601 Feb 2002 AU
9298201 Apr 2002 AU
2003230993 Nov 2003 AU
0112901 Jun 2003 BR
0309598 Feb 2005 BR
2483104 Nov 2003 CA
1457600 Nov 2003 CN
1592906 Mar 2005 CN
1647160 Jul 2005 CN
0769749 Apr 1997 EP
1176826 Jul 2003 EP
1349370 Oct 2003 EP
1406403 Apr 2004 EP
1504445 Aug 2008 EP
2002247610 Aug 2002 JP
2003524199 Aug 2003 JP
2004320752 Nov 2004 JP
9527349 Oct 1995 WO
9702672 Jan 1997 WO
0004662 Jan 2000 WO
0019699 Apr 2000 WO
0119088 Mar 2001 WO
0124027 Apr 2001 WO
0131497 May 2001 WO
0140963 Jun 2001 WO
0146782 Jun 2001 WO
0153922 Jul 2001 WO
0175743 Oct 2001 WO
0191109 Nov 2001 WO
0205517 Jan 2002 WO
0211123 Feb 2002 WO
0215081 Feb 2002 WO
0217591 Feb 2002 WO
0219625 Mar 2002 WO
0227600 Apr 2002 WO
0237381 May 2002 WO
0245034 Jun 2002 WO
02061652 Aug 2002 WO
02065305 Aug 2002 WO
02065318 Aug 2002 WO
02069121 Sep 2002 WO
03009277 Jan 2003 WO
03091990 Nov 2003 WO
03094499 Nov 2003 WO
03096337 Nov 2003 WO
2004010352 Jan 2004 WO
2004040416 May 2004 WO
2004040475 May 2004 WO
2004061699 Jul 2004 WO
2005025217 Mar 2005 WO
2005064885 Jul 2005 WO
2008044664 Apr 2008 WO
2008045950 Apr 2008 WO
2008110002 Sep 2008 WO
2008110790 Sep 2008 WO
2009011206 Jan 2009 WO
2009061651 May 2009 WO
2009064561 May 2009 WO
2010095320 Aug 2010 WO
2010127268 Nov 2010 WO
2012117872 Dec 2012 WO
2012177870 Dec 2012 WO
2012177874 Dec 2012 WO
Non-Patent Literature Citations (173)
Mexican Patent Office, “Office Action”, issued in connection with Mexican Patent Application No. MX/a/2014/000280, dated Jan. 21, 2015 (5 pages, English translation included).
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 13/443,596, dated Feb. 26, 2015.
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 13/793,991, dated Feb. 27, 2015.
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 13/778,108, dated Feb. 27, 2015.
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 13/341,646, dated Mar. 3, 2015.
Japanese Patent Office, “Notice of Reasons for Rejection”, issued in connection with Japanese Patent Application No. P2014-517158, dated Mar. 3, 2015 (7 pages).
Mexican Patent Office, “Notice of Allowance”, issued in connection with Mexican Patent Application No. MX/a/2014/000281, dated Feb. 25, 2015 (1 page).
United States Patent and Trademark Office, “Notice of Allowance”, issued in connection with U.S. Appl. No. 13/793,983, dated Mar. 16, 2015.
United States Patent and Trademark Office, “Notice of Allowance”, issued in connection with U.S. Appl. No. 13/472,170, dated Mar. 26, 2015.
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 13/341,661, dated Mar. 26, 2015.
European Patent Office, “European Search Report”, issued in connection with European Patent Application No. 12803215.8, dated Apr. 20, 2015 (9 pages).
Canadian Patent Office, “Office Action”, issued in connection with Canadian Patent Application No. 2,840,092, dated Apr. 20, 2015 (4 pages).
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 13/455,961, dated May 20, 2015.
State Intellectual Property Office of China, “Notice of Allowance”, issued in connection with Chinese Patent Application No. 201210105474.3, dated May 25, 2015.
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 13/793,974, dated May 28, 2015.
Canadian Intellectual Property Office, “Office Action”, issued in connection with Canadian Patent Application No. 2,840,094, dated May 19, 2015 (4 pages).
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 14/089,279, dated Mar. 28, 2014.
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 14/089,279, dated Nov. 21, 2014.
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 14/089,279, dated Apr. 23, 2015.
European Patent Office, “European Search Report”, issued in connection with European Patent Application No. 12802805.7, dated May 27, 2015 (8 pages).
European Patent Office, “European Search Report”, issued in connection with European Patent Application No. 12802746.3, dated May 27, 2015 (9 pages).
United States Patent and Trademark Office, “Notice of Allowance”, issued in connection with U.S. Appl. No. 13/341,646, dated Jun. 19, 2015.
Mexican Patent Office, “Notice of Allowance”, issued in connection with Mexican Patent Application No. MX/a/2014/000280, dated Jun. 12, 2015, 1 page.
United States Patent and Trademark Office, “Notice of Allowance”, issued in connection with U.S. Appl. No. 13/472,170, dated Jul. 7, 2015.
United States Patent and Trademark Office, “Notice of Allowance”, issued in connection with U.S. Appl. No. 13/793,983, dated Jul. 7, 2015.
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 13/341,661, mailed Mar. 19, 2013.
U.S. Appl. No. 13/793,974, filed Mar. 11, 2013 (58 pages).
U.S. Appl. No. 13/793,991, filed Mar. 11, 2013 (47 pages).
U.S. Appl. No. 13/778,108, filed Feb. 26, 2013, (66 pages).
Canadian Intellectual Property Office, “Office Action,” issued in connection with application No. CA 2,773,567, on Mar. 6, 2014 (2 pages).
State Intellectual Property Office of China, “First Office Action,” issued in connection with application No. CN 201210105474.3, on Feb. 8, 2014 (15 pages).
International Bureau, “International Preliminary Report on Patentability” issued in connection with International Application No. PCT/US2012/043544, dated Jan. 9, 2014 (9 pages).
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 13/443,596, dated Apr. 9, 2014.
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 13/793,991, dated Apr. 11, 2014.
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 13/341,646, dated Jun. 5, 2014.
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 13/793,983, dated Jun. 6, 2014.
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 13/472,170, dated Jun. 18, 2014.
IP Australia, “Examination Report”, issued in connection with Australian Patent Application No. 2012272868, dated Jun. 27, 2014 (3 pages).
IP Australia, “Examination Report”, issued in connection with Australian Patent Application No. 2012272874, dated Jun. 27, 2014 (3 pages).
IP Australia, “Examination Report”, issued in connection with Australian Patent Application No. 2012272872, dated Jun. 24, 2014 (4 pages).
United States Patent and Trademark Office, “Notice of Allowance”, issued in connection with U.S. Appl. No. 13/341,661 dated Jul. 8, 2014.
IP Australia, “Examination Report”, issued in connection with Australian Patent Application No. 2013204488, dated Aug. 12, 2014 (5 pages).
IP Australia, “Examination Report”, issued in connection with Australian Patent Application No. 2013203778, dated Aug. 21, 2014 (5 pages).
IP Australia, “Examination Report”, issued in connection with Australian Patent Application No. 2012272876, dated Sep. 18, 2014 (4 pages).
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 13/443,596, dated Sep. 25, 2014.
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 13/455,961, dated Dec. 5, 2014.
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 13/767,548, dated Feb. 3, 2015.
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 13/793,959 dated Jan. 30, 2015.
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 13/793,991, dated Nov. 10, 2014.
United States Patent and Trademark Office, “Notice of Allowance”, issued in connection with U.S. Appl. No. 13/341,646, dated Nov. 3, 2014.
United States Patent and Trademark Office, “Notice of Allowance”, issued in connection with U.S. Appl. No. 13/472,170, dated Dec. 5, 2014.
United States Patent and Trademark Office, “Notice of Allowance”, issued in connection with U.S. Appl. No. 13/793,983, dated Jan. 9, 2015.
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 13/793,974, dated Feb. 18, 2015.
United States Patent and Trademark Office, “Corrected Notice of Allowance”, issued in connection with U.S. Appl. No. 13/472,170, dated Feb. 12, 2015.
United States Patent and Trademark Office, “Notice of Allowance,” issued in connection with U.S. Appl. No. 12/890,216 on Aug. 6, 2013.
United States Patent and Trademark Office, “Office Action,” issued in connection with U.S. Appl. No. 13/472,170 on Nov. 8, 2013.
United States Patent and Trademark Office, “Office Action,” issued in connection with U.S. Appl. No. 13/793,983 on Nov. 8, 2013.
United States Patent and Trademark Office, “Office Action,” issued in connection with U.S. Appl. No. 13/443,596 on Nov. 21, 2013.
United States Patent and Trademark Office, “Office Action,” issued in connection with U.S. Appl. No. 13/793,991 on Dec. 6, 2013.
International Searching Authority, “International Search Report and Written Opinion of the International Searching Authority,” issued in connection with International Application No. PCT/US2012/043535, mailed on Feb. 21, 2013, 15 pages.
United States Patent and Trademark Office, “Notice of Allowance,” issued in connection with U.S. Appl. No. 13/341,661 on Sep. 23, 2013.
R. Pantos, Ed., & W. May, Apple Inc. “HTTP Live Streaming: draft-pantos-httplive-streaming-07”, Sep. 2011 (33 pages).
Apple Inc. “Timed Metadata for HTTP Live Streaming”, Apr. 28, 2011 (12 pages).
Apple Inc. “HTTP Live Streaming Overview”, Apr. 1, 2011 (36 pages).
Eric Winkelman, “Timed Text Tracks and TV Services”, Aug. 15, 2011 (5 pages).
U.S. Appl. No. 13/443,596, filed Apr. 10, 2012 (41 pages).
“What is Jacked?,” http://www.jacked.com/, retrieved on Dec. 3, 2009 (1 page).
Anderson, “Google to compete with Nielsen for TV-ratings info?,” Ars Technica, Jun. 19, 2006 (3 pages).
Boehret, “Yahoo Widgets Lend Brains to Boob Tube,” The Wall Street Journal, Mar. 25, 2009 (4 pages).
Claburn, “Google Researchers Propose TV Monitoring,” Information Week, Jun. 7, 2006 (4 pages).
Evain, “TV-Anytime Metadata—A Preliminary Specification on Schedule!,” EBU Technical Review, Sep. 2000 (15 pages).
Heuer et al., “Adaptive Multimedia Messaging based on MPEG-7—The M3-Box,”, Proc. Second Int'l Symposium on Mobile Multimedia Systems Applications, Nov. 9-10, 2000 (8 pages).
Heussner, “Sound-Triggered Smart Phone Ads Seek You Out,” Adweek.com, http://www.adweek.com/news/advertising-branding/sound-triggered-smartphone-ads-seek-you-out-136901, Dec. 7, 2011 (3 pages).
Hopper, “EBU Project Group P/META Metadata Exchange Standards,” EBU Technical Review, Sep. 2000 [http://www.ebu.ch/en/technical/trev/trev_284-contents.html, retrieved on Jul. 20, 2006] (25 pages).
Kane, “Entrepreneur Plans On-Demand Videogame Service,” The Wall Street Journal, Mar. 24, 2009 (2 pages).
Laven, “EBU Technical Review (Editorial),” No. 284, Sep. 2000 [http://www.ebu.ch/en/technical/trev/trev_284-contents.html, retrieved on Jul. 20, 2006] (3 pages).
Mulder, “The Integration of Metadata From Production to Consumer,” EBU Technical Review, Sep. 2000 [http://www.ebu.ch/en/technical/trev/trev_284-contents.html, retrieved on Jul. 20, 2006] (6 pages).
U.S. Appl. No. 13/455,961, filed Apr. 25, 2012 (61 pages).
Canadian Intellectual Property Office, “Office Action,” issued in connection with Canadian Application Serial No. 2,574,998, mailed Aug. 26, 2008 (4 pages).
Canadian Intellectual Property Office, “Office Action,” issued in connection with Canadian Application Serial No. 2,574,998, mailed Mar. 23, 2009 (5 pages).
Canadian Intellectual Property Office, “Office Action,” issued in connection with Canadian Application Serial No. 2,574,998, mailed Nov. 13, 2009 (10 pages).
Patent Cooperation Treaty, “International Search Report,” issued in connection with International Application Serial No. PCT/US2003/14970, mailed Feb. 10, 2004 (1 page).
Patent Cooperation Treaty, “International Preliminary Examination Report,” issued in connection with International Application Serial No. PCT/US2003/014970, completed Aug. 21, 2004 (6 pages).
Patent Cooperation Treaty, “International Search Report,” issued in connection with International Application Serial No. PCT/US2003/031180, mailed Jun. 8, 2004 (5 pages).
Patent Cooperation Treaty, “International Preliminary Examination Report,” issued in connection with International Application Serial No. PCT/US2003/031180, mailed Aug. 17, 2004 (4 pages).
Patent Cooperation Treaty, “International Search Report and Written Opinion,” issued in connection with International Application Serial No. PCT/US2005/026426, mailed Aug. 18, 2006 (10 pages).
Patent Cooperation Treaty, “International Preliminary Report on Patentability,” issued in connection with International Application Serial No. PCT/US2005/026426, mailed Feb. 1, 2007 (9 pages).
Patent Cooperation Treaty, “International Search Report and Written Opinion,” issued in connection with International Application Serial No. PCT/US2009/061827, mailed Mar. 15, 2010 (12 pages).
Patent Cooperation Treaty, “International Search Report and Written Opinion,” issued in connection with International Application Serial No. PCT/US2009/061750, mailed Mar. 3, 2010 (10 pages).
Patent Cooperation Treaty, “International Search Report and Written Opinion,” issued in connection with International Application Serial No. PCT/US2010/033201, mailed Oct. 1, 2010 (16 pages).
Patent Cooperation Treaty, “International Search Report and Written Opinion,” issued in connection with International Application Serial No. PCT/US2009/061479, mailed May 26, 2010 (15 pages).
Patent Cooperation Treaty, “International Search Report and Written Opinion”, issued in connection with International application No. PCT/US2012/043546, mailed Dec. 10, 2012, (6 pages).
Patent Cooperation Treaty, “International Search Report and Written Opinion”, issued in connection with International application No. PCT/US2012/043539, mailed Jan. 17, 2013, (9 pages).
Patent Cooperation Treaty, “International Search Report and Written Opinion”, issued in connection with International application No. PCT/US2012/043544, mailed Jan. 17, 2013, (15 pages).
Shazam, “Shazam and VidZone Digital Media announce UK's first fixed price mobile download service for music videos,” http://www.shazam.com/music/web/newsdetail.html?nid=NEWS136, Feb. 11, 2008 (1 page).
Shazam, “Shazam launches new music application for Facebook fans,” http://www.shazam.com/music/web/newsdetail.html?nid=NEWS135, Feb. 18, 2008 (1 page).
Shazam, “Shazam turns up the volume on mobile music,” http://www.shazam.com/music/web/newsdetail.html?nid=NEWS137, Nov. 28, 2007 (1 page).
Stross, “Apple Wouldn't Risk Its Cool Over a Gimmick, Would It?,” The New York Times, Nov. 14, 2009 (3 pages).
Stultz, “Handheld Captioning at Disney World Theme Parks,” article retrieved on Mar. 8, 2013, http://goflorida.about.com/od/disneyworld/a/wdw_captioning.htm, (1 page).
Sullivan, “Google Cozies Up to SMBs for Digital Content,” MediaPost News, Mar. 19, 2009 (3 pages).
U.S. Appl. No. 61/499,520, filed Jun. 21, 2011 (51 pages).
U.S. Appl. No. 61/568,631, filed Dec. 8, 2011 (80 pages).
United States Patent and Trademark Office, “Advisory Action,” issued in connection with U.S. Appl. No. 10/540,611, mailed Jan. 22, 2010.
United States Patent and Trademark Office, “Final Office Action,” issued in connection with U.S. Appl. No. 10/540,611, mailed Sep. 29, 2009.
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 10/540,611, mailed Mar. 4, 2009.
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 10/540,611, mailed Sep. 15, 2008.
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 12/890,216, mailed Apr. 2, 2012.
United States Patent and Trademark Office, “Notice of Allowance,” issued in connection with U.S. Appl. No. 10/540,611, mailed Jun. 22, 2010.
United States Patent and Trademark Office, “Advisory Action,” issued in connection with U.S. Appl. No. 11/618,245, mailed Sep. 30, 2009.
United States Patent and Trademark Office, “Final Office Action,” issued in connection with U.S. Appl. No. 10/530,233, mailed Mar. 18, 2010.
United States Patent and Trademark Office, “Final Office Action,” issued in connection with U.S. Appl. No. 11/618,245, mailed Jul. 21, 2009.
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 10/530,233, mailed Sep. 16, 2009.
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 11/618,245, mailed Feb. 5, 2009.
Van Beek et al., “Metadata-Driven Multimedia Access,” IEEE Signal Processing Magazine, vol. 20, No. 2, Institute of Electric and Electronic Engineers, Inc., New York, New York, USA, Mar. 2003 (13 pages).
Vetro et al., “Video Transcoding Architectures and Techniques: An Overview,” IEEE Signal Processing Magazine, vol. 20, No. 2, Institute of Electric and Electronic Engineers, Inc., New York, New York, USA, Mar. 2003 (12 pages).
Wactlar et al., “Digital Video Archives: Managing Through Metadata” Building a National Strategy for Digital Preservation: Issues in Digital Media Archiving, http://www.informedia.cs.cmu.edu/documents/Wactlar-CLIR-final.pdf, Apr. 2002 (14 pages).
Wang, “An Industrial-Strength Audio Search Algorithm,” Shazam Entertainment, Ltd., in Proceedings of the Fourth International Conference on Music Information Retrieval, Baltimore, Oct. 26-30, 2003 (7 pages).
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 12/890,216, mailed Nov. 29, 2012 (22 pages).
United States Patent and Trademark Office, “Requirement for Restriction,” issued in connection with U.S. Appl. No. 10/530,233, mailed Jun. 10, 2009 (20 pages).
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 11/618,245, mailed Apr. 28, 2011 (48 pages).
United States Patent and Trademark Office, “Final Office Action,” issued in connection with U.S. Appl. No. 11/618,245, mailed Oct. 26, 2011 (38 pages).
IP Australia, “Notice of Acceptance”, issued in connection with Australian Patent Application No. 2012272874, dated Sep. 11, 2015 (2 pages).
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 13/455,961, dated Sep. 24, 2015.
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 13/793,974, dated Sep. 24, 2015.
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 13/443,596, dated Oct. 20, 2015.
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 13/793,991, dated Oct. 22, 2015.
State Intellectual Property Office of China, “Office Action”, issued in connection with Chinese Patent Application No. 201280032737.0, dated Nov. 10, 2015 (5 pages).
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 14/089,279, dated Nov. 17, 2015.
European Patent Office, “European Search Report” issued in connection with European Patent Application No. 12802202.7 dated May 28, 2015 (7 pages).
IP Australia, “Notice of Acceptance”, issued in connection with Australian Patent Application No. 2012272868, dated Jul. 22, 2015 (2 pages).
Canadian Intellectual Property Office, “Office Action,” issued in connection with Canadian Application Serial No. 2,773,567, on Mar. 27, 2015 (6 pages).
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 13/767,548 dated Aug. 11, 2015.
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 13/778,108 dated Aug. 13, 2015.
IP Australia, “Notice of Acceptance”, issued in connection with Australian Patent Application No. 2012272872 dated Aug. 6, 2015 (2 pages).
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 13/793,959, dated Sep. 11, 2015.
European Patent Office, “Examination Report,” issued in connection with application No. 12002599.4-1905 on Mar. 4, 2016 (4 pages).
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 14/922,918, dated Feb. 23, 2016.
Canadian Intellectual Property Office, “Office Action,” issued in connection with Canadian Patent Application No. 2,773,567, dated Mar. 9, 2016 (4 pages).
United States Patent and Trademark Office, “Final Office Action,” issued in connection with U.S. Appl. No. 13/455,961, dated Mar. 23, 2016.
United States Patent and Trademark Office, “Final Office Action,” issued in connection with U.S. Appl. No. 13/443,596, dated Apr. 6, 2016.
European Patent Office, “Communication Pursuant to Article 94(3) EPC,” issued in connection with application No. 12002599.4 on Mar. 4, 2016 (4 pages).
United States Patent and Trademark Office, “Final Office Action,” issued in connection with U.S. Appl. No. 13/793,991, dated Apr. 8, 2016.
IP Australia, “Notice of Acceptance,” issued in connection with Australian Patent Application No. 2013204488, dated Apr. 26, 2016 (3 pages).
IP Australia, “Examination Report,” issued in connection with Australian Patent Application No. 2012272876, dated Apr. 26, 2016 (3 pages).
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 14/089,279, dated May 5, 2016.
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 13/181,147, dated Nov. 21, 2012.
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 13/181,147, dated Aug. 15, 2013.
United States Patent and Trademark Office, “Final Office Action,” issued in connection with U.S. Appl. No. 13/181,147, dated Mar. 10, 2014.
United States Patent and Trademark Office, “Final Office Action,” issued in connection with U.S. Appl. No. 13/181,147, dated Dec. 3, 2015.
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 13/181,147, dated May 19, 2015.
United States Patent and Trademark Office, “Notice of Allowance,” issued in connection with U.S. Appl. No. 13/181,147, dated Feb. 18, 2016.
United States Patent and Trademark Office, “Advisory Action,” issued in connection with U.S. Appl. No. 13/181,147, mailed Oct. 15, 2014 (4 pages).
United States Patent and Trademark Office, “Notice of Allowance” issued in connection with U.S. Appl. No. 13/181,147, dated Feb. 18, 2016, 8 pages.
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 14/922,918, dated Feb. 23, 2016 (17 pages).
United States Patent and Trademark Office, “Final Office Action,” issued in connection with U.S. Appl. No. 13/443,596, dated Apr. 6, 2016 (25 pages).
United States Patent and Trademark Office, “Final Office Action,” issued in connection with U.S. Appl. No. 13/778,108, dated May 23, 2016 (13 pages).
The State Intellectual Property Office of China, “Office Action,” issued in connection with Chinese Patent Application No. 201280032740.2, dated May 31, 2016 (22 pages).
IP Australia, “Notice of Acceptance,” issued in connection with Australian Patent Application No. 2012272876, dated Jun. 6, 2016 (2 pages).
United States Patent and Trademark Office, “Notice of Allowance,” issued in connection with U.S. Appl. No. 13/341,661, dated Aug. 19, 2016 (9 pages).
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 14/922,918, dated Sep. 9, 2016 (16 pages).
IP Australia, “Examination Report,” issued in connection with Australian Patent Application No. 2015252031, dated Oct. 21, 2016 (3 pages).
United States Patent and Trademark Office, “Notice of Allowance,” issued in connection with U.S. Appl. No. 14/089,279, dated Nov. 14, 2016 (13 pages).
The State Intellectual Property Office of China, “First Office Action,” issued in connection with Chinese Patent Application No. 201280032738.5, dated Dec. 16, 2016 (13 pages).
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 14/725,877, dated Jan. 26, 2017 (14 pages).
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 15/331,568, dated Feb. 24, 2017 (9 pages).
European Patent Office, “Examination Report,” issued in connection with European Patent Application No. 12802746.3, dated Feb. 23, 2017 (6 pages).
Canadian Intellectual Property Office, “Notice of Allowance,” issued in connection with application No. 2,773,567, on Mar. 30, 2017 (1 page).
United States Patent and Trademark Office, “Final Office Action,” issued in connection with U.S. Appl. No. 14/922,918, dated Apr. 6, 2017 (18 pages).
Related Publications (1)
Number Date Country
20160301988 A1 Oct 2016 US
Provisional Applications (1)
Number Date Country
61474728 Apr 2011 US
Continuations (1)
Number Date Country
Parent 13181147 Jul 2011 US
Child 15181399 US