Systems and methods for gathering audience measurement data

Information

  • Patent Grant
  • Patent Number
    9,100,132
  • Date Filed
    Tuesday, November 3, 2009
  • Date Issued
    Tuesday, August 4, 2015
Abstract
Systems and methods are provided for gathering audience measurement data relating to exposure of an audience member to audio data. Audio data is received in a user system and is then encoded with audience measurement data. The encoded audio data is reproduced by the user system, picked up by a monitor and decoded to recover the audience measurement data.
Description
TECHNICAL FIELD

The present invention relates to techniques for gathering audience measurement data by detecting such data encoded in audio data.


BACKGROUND INFORMATION

There is considerable interest in measuring the usage of media accessed by an audience to provide market information to advertisers, media distributors and the like.


In the past there were relatively few alternatives for distributing media, such as analog radio and television, analog recordings, newspapers and magazines, and relatively few media producers and distributors. Moreover, the marketplace for media distributed via one technology was distinct from the marketplace for media distributed in a different manner. The radio and television industries, for example, had their distinctly different media content and delivery methodologies. Recorded media was distributed and reproduced in distinctly different ways, although the content was often adapted for radio or television distribution.


Audience measurement has evolved in a similar manner, tracking the market segmentation of the media distribution industry. Generally, audience measurement data has been gathered, processed and reported separately for each media distribution market segment.


The development of techniques to efficiently process, store and communicate digital data has enabled numerous producers and distributors of media to enter the marketplace. Users of media now have a great many choices which did not exist only a few years ago. Established producers and distributors have responded with their own efforts to provide media in digital form to users. This trend is enhanced with each improvement in digital processing, storage and communications.


A result of these developments is a convergence of media distribution within the digital realm, especially through distribution via the Internet. Media is thus available to users not only through traditional distribution channels, but also via alternative digital communication pathways. For example, many radio stations now provide their programming via the Internet as well as over the air.


The emergence of multiple, overlapping media distribution pathways, as well as the wide variety of available user systems (e.g., PCs, PDAs, portable CD players, Internet appliances, TV, radio, etc.) for accessing media, has greatly complicated the task of measuring media audiences. The development of commercially viable techniques for encoding audio data with audience measurement data provides a crucial tool for measuring media usage across multiple media distribution pathways and user systems. Most notable among these techniques is the CBET methodology developed by Arbitron Inc., which is already providing useful audience estimates to numerous media distributors and advertisers.


However, the bandwidth available for data encoded in audio is limited by the need to maintain inaudibility of the codes while ensuring that they remain reliably detectable. Nevertheless, more data is required for audience measurement today than ever before. Not only is it necessary to detect the source of the data, but also how it was distributed (e.g., over-the-air vs. Internet) and how it was reproduced (e.g., by a conventional radio, a PC, etc., as well as the player software employed).


Accordingly, it is desired to provide data gathering techniques for audience measurement data capable of measuring media usage across multiple distribution paths and user systems.


It is also desired to provide such data gathering techniques which are likely to be adaptable to future media distribution paths and user systems which are presently unknown.


SUMMARY OF THE INVENTION

For this application, the following terms and definitions shall apply, both for the singular and plural forms of nouns and for all verb tenses:


The term “data” as used herein means any indicia, signals, marks, domains, symbols, symbol sets, representations, and any other physical form or forms representing information, whether permanent or temporary, whether visible, audible, acoustic, electric, magnetic, electromagnetic, or otherwise manifested. The term “data” as used to represent predetermined information in one physical form shall be deemed to encompass any and all representations of the same predetermined information in a different physical form or forms.


The term “audio data” as used herein means any data representing acoustic energy, including, but not limited to, audible sounds, regardless of the presence of any other data, or lack thereof, which accompanies, is appended to, is superimposed on, or is otherwise transmitted or able to be transmitted with the audio data.


The term “user system” as used herein means any software, devices, or combinations thereof which are useful for reproducing audio data as sound for an audience member, including, but not limited to, computers, televisions, radios, personal digital assistants, and internet appliances.


The term “network” as used herein means networks of all kinds, including both intra-networks and inter-networks, including, but not limited to, the Internet, and is not limited to any particular such network.


The term “source identification data” as used herein means any data that is indicative of a source of audio data, including, but not limited to, (a) persons or entities that create, produce, distribute, reproduce, communicate, have a possessory interest in, or are otherwise associated with the audio data, or (b) locations, whether physical or virtual, from which data is communicated, either originally or as an intermediary, and whether the audio data is created therein or prior thereto.


The terms “audience” and “audience member” as used herein mean a person or persons, as the case may be, who access audio data in any manner, whether alone or in one or more groups, whether in the same or various places, and whether at the same time or at various different times.


The term “audience measurement data” as used herein means data wheresoever originating which comprises source identification data or which otherwise characterizes or provides information about audio data, or else concerns (a) a user system that requests, communicates, receives, or presents audio data, (b) a network that requests, receives, or presents audio data for a user, user system, or another network, or (c) an audience or audience member, including, but not limited to, user demographic data.


The term “processor” as used herein means data processing devices, apparatus, programs, circuits, systems, and subsystems, whether implemented in hardware, software, or both.


The terms “communicate” and “communicating” as used herein include both conveying data from a source to a destination, as well as delivering data to a communications medium, system or link to be conveyed to a destination. The term “communication” as used herein means the act of communicating or the data communicated, as appropriate.


The terms “coupled”, “coupled to”, and “coupled with” shall each mean a relationship between or among two or more devices, apparatus, files, programs, media, components, networks, systems, subsystems, and/or means, constituting any one or more of (a) a connection, whether direct or through one or more other devices, apparatus, files, programs, media, components, networks, systems, subsystems, or means, (b) a communications relationship, whether direct or through one or more other devices, apparatus, files, programs, media, components, networks, systems, subsystems, or means, or (c) a functional relationship in which the operation of any one or more of the relevant devices, apparatus, files, programs, media, components, networks, systems, subsystems, or means depends, in whole or in part, on the operation of any one or more others thereof.


In accordance with an aspect of the present invention, a method is provided for gathering audience measurement data relating to the exposure of an audience member to audio data. The method comprises receiving the audio data in a user system adapted to reproduce the audio data as sound; encoding the audio data in the user system with audience measurement data to produce encoded audio data; reproducing the encoded audio data as encoded sound by means of the user system; receiving the encoded sound in a monitor device to produce received audio data; and decoding the audience measurement data from the received audio data.


In accordance with another aspect of the present invention, a system is provided for gathering audience measurement data relating to exposure of an audience member to audio data reproduced by a user system. The system comprises an encoder coupled with the user system to encode audio data which has been received in the user system with audience measurement data to produce encoded audio data; and a decoder device having an input to receive the encoded audio data for decoding the audience measurement data encoded therein.


In accordance with a further aspect of the present invention, a method is provided for gathering data relating to exposure of an audience member to streaming media reproduced by a user system. The method comprises receiving streaming media including audio data in a user system; encoding the audio data received in the user system with audience measurement data; reproducing the encoded audio data as encoded acoustic energy; receiving the encoded acoustic energy in a portable monitor carried on the person of an audience member; and decoding the audience measurement data in the encoded acoustic energy received in the portable monitor.


In accordance with still another aspect of the present invention, a system is provided for gathering audience measurement data relating to exposure of an audience member to streaming media in the form of audio data reproduced by a user system. The system comprises an encoder coupled with the user system to encode audio data which has been received in the user system as streaming media with audience measurement data and to supply the encoded audio data to be reproduced by the user system; a portable monitor adapted to be carried on the person of an audience member to transduce the encoded audio data reproduced by the user system; and a decoder coupled with the portable monitor to receive the transduced encoded audio data and to decode the audience measurement data in the transduced encoded audio data.


In accordance with yet another aspect of the present invention, a method is provided for gathering data relating to exposure of an audience member to streaming media. The method comprises receiving streaming media in a user system, the streaming media including audio data and source identification data for the audio data and separate therefrom; encoding the audio data in the user system with the source identification data to form encoded audio data; reproducing the encoded audio data as encoded acoustic energy; receiving the encoded acoustic energy in a portable monitor carried on the person of an audience member; and decoding the source identification data encoded in the encoded acoustic energy received by the portable monitor.


In accordance with still another aspect of the present invention, a method is provided for gathering audience measurement data. The method comprises encoding audio data in a user system with first audience measurement data, the user system being arranged to reproduce the audio data as sound; and decoding the first audience measurement data in the encoded audio data.


The invention and its particular features and advantages will become more apparent from the following detailed description considered with reference to the accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a functional block diagram for use in illustrating various embodiments of systems and methods for gathering audience measurement data relating to exposure of an audience member to audio data.



FIG. 2 is a functional block diagram for use in illustrating various additional embodiments of systems and methods for gathering audience measurement data relating to exposure of an audience member to audio data.





DETAILED DESCRIPTION OF CERTAIN ADVANTAGEOUS EMBODIMENTS


FIG. 1 illustrates an embodiment of a system 10 for encoding and reproducing audio data by means of a user system 20, an encoder 25, and an acoustic reproducing device 30. The source of the audio data may be a satellite receiver 40, an antenna 50 and/or a network 60 such as a cable television system or the Internet. The source of the audio data may also be any one or more of a web site, a broadcast channel, a content channel, an online channel, a radio station, a television station, a media organization, and/or a storage medium. The user system 20 is coupled with the audio data source in any available manner including but not limited to over-the-air (wireless), cable, satellite, telephone, DSL (Digital Subscriber Line), LAN (Local Area Network), WAN (Wide Area Network), Intranet, and/or the Internet. The invention is particularly useful for monitoring exposure to streaming media delivered via the Internet.


The user system 20 includes one or more coupled devices that serve, among other things, to supply the audio data to the acoustic reproducing device 30 for reproduction as acoustic energy 80. In certain embodiments, the user system 20 is a computer, a radio, a television, a cable converter, a satellite television system, a game playing system, a VCR, a DVD player, a portable audio player, an internet appliance, a PDA (personal digital assistant), a cell phone, a home theater system, a component stereo system, and/or an electronic book. In one embodiment, the acoustic reproducing device 30 is a speaker. In another embodiment, the acoustic reproducing device 30 is a speaker system. In other embodiments, the acoustic reproducing device 30 is any device capable of producing acoustic energy 80.


In certain embodiments, the encoder 25 present in the user system 20 embeds audience measurement data in the audio data. In certain embodiments, the encoder comprises software running on the user system 20, including embodiments in which the encoding software is integrated or coupled with a player running on the user system 20. In other embodiments, the encoder 25 comprises a device coupled with the user system 20 such as a peripheral device, or a board, such as a soundboard. In certain embodiments, the board is plugged into an expansion slot of the user system. In certain embodiments, the encoder 25 is programmable such that it is provided with encoding software prior to coupling with the user system or after coupling with the user system. In these embodiments, the encoding software is loaded from a storage device or from the audio source or another source, or via another communication system or medium.


In certain embodiments, the encoder 25 encodes the audience measurement data as a further encoded layer in already-encoded audio data, so that two or more layers of embedded data are simultaneously present in the audio data. The layers are arranged with sufficiently diverse frequency characteristics so that they may be separately detected. In certain of these embodiments the code is superimposed on the audio data asynchronously. In other embodiments, the code is added synchronously with the preexisting audio data. In certain ones of such synchronous encoding embodiments, data is encoded in portions of the audio data which have not previously been encoded. At times the user system receives both audio data (such as streaming media) and audience measurement data (such as source identification data) which, as received, is not encoded in the audio data but is separate therefrom. In certain embodiments, the user system 20 supplies such audience measurement data to the encoder 25, which serves to encode the audio data therewith.
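The layered encoding described above can be illustrated with a toy frequency-domain model in Python. The frame size, bin indices, boost level and noise stand-in are all illustrative assumptions, not parameters taken from this disclosure; two layers embedded in disjoint bins remain separately detectable:

```python
import numpy as np

N = 4096                      # samples per analysis frame (illustrative)
LAYER1_BINS = (400, 420)      # bins reserved for the pre-existing layer
LAYER2_BINS = (600, 620)      # disjoint bins for the layer added in the user system

def embed_bit(signal, bins, bit, boost=0.05):
    """Boost one of two reserved frequency bins according to the bit value."""
    spec = np.fft.rfft(signal)
    spec[bins[bit]] += boost * N
    return np.fft.irfft(spec, n=N)

def read_bit(signal, bins):
    spec = np.abs(np.fft.rfft(signal))
    return int(spec[bins[1]] > spec[bins[0]])

rng = np.random.default_rng(0)
audio = rng.normal(0, 0.01, N)            # stand-in for programme audio
audio = embed_bit(audio, LAYER1_BINS, 1)  # layer encoded before receipt
audio = embed_bit(audio, LAYER2_BINS, 0)  # layer added by the user system

# Both layers remain separately detectable because their bins do not overlap.
assert read_bit(audio, LAYER1_BINS) == 1
assert read_bit(audio, LAYER2_BINS) == 0
```

Because the layers occupy frequency bins that do not overlap, neither embedding disturbs the other, which is the sense in which the layers have "sufficiently diverse frequency characteristics".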


In certain embodiments, the audience measurement data is source identification data, a content identification code, data that provides information about the received audio data, demographic data regarding the user, and/or data describing the user system or some aspect thereof, such as the user agent (e.g., player or browser type), operating system, sound card, etc. In one embodiment, the audience measurement data is an identification code. In certain embodiments for measuring exposure of an audience member to audio data obtained from the Internet, such as streaming media, the audience measurement data comprises data indicating that the audio data was obtained from the Internet, the type of player, and/or source identification data.
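A minimal sketch of how such audience measurement fields might be packed into a single payload before embedding; the field names and bit widths are hypothetical, chosen only to show the idea of combining a source identifier, a distribution-path flag and a player type in one code:

```python
def pack_payload(source_id: int, via_internet: bool, player_type: int) -> int:
    """Pack fields as [source_id:16][via_internet:1][player_type:4] (illustrative widths)."""
    assert 0 <= source_id < 1 << 16 and 0 <= player_type < 1 << 4
    return (source_id << 5) | (int(via_internet) << 4) | player_type

def unpack_payload(bits: int):
    return bits >> 5, bool((bits >> 4) & 1), bits & 0xF

# Round trip: a station identifier, an Internet-delivery flag and a player code.
payload = pack_payload(source_id=1234, via_internet=True, player_type=7)
assert unpack_payload(payload) == (1234, True, 7)
```

Fixed-width packing keeps the payload small, which matters given the limited encoding bandwidth noted in the background section.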


Several advantageous and suitable techniques for encoding audience measurement data in audio data are disclosed in U.S. Pat. No. 5,764,763 to James M. Jensen, et al., which is assigned to the assignee of the present application, and which is incorporated by reference herein. Other appropriate encoding techniques are disclosed in U.S. Pat. No. 5,579,124 to Aijala, et al., U.S. Pat. Nos. 5,574,962, 5,581,800 and 5,787,334 to Fardeau, et al., U.S. Pat. No. 5,450,490 to Jensen, et al., and U.S. patent application Ser. No. 09/318,045, in the names of Neuhauser, et al., each of which is assigned to the assignee of the present application and all of which are incorporated herein by reference.


Still other suitable encoding techniques are the subject of PCT Publication WO 00/04662 to Srinivasan, U.S. Pat. No. 5,319,735 to Preuss, et al., U.S. Pat. No. 6,175,627 to Petrovich, et al., U.S. Pat. No. 5,828,325 to Wolosewicz, et al., U.S. Pat. No. 6,154,484 to Lee, et al., U.S. Pat. No. 5,945,932 to Smith, et al., PCT Publication WO 99/59275 to Lu, et al., PCT Publication WO 98/26529 to Lu, et al., and PCT Publication WO 96/27264 to Lu, et al., all of which are incorporated herein by reference.


In certain embodiments, the encoder 25 forms a data set of frequency-domain data from the audio data and the encoder processes the frequency-domain data in the data set to embed the encoded data therein. Where the codes have been formed as in the Jensen, et al. U.S. Pat. No. 5,764,763 or U.S. Pat. No. 5,450,490, the frequency-domain data is processed by the encoder 25 to embed the encoded data in the form of frequency components with predetermined frequencies. Where the codes have been formed as in the Srinivasan PCT Publication WO 00/04662, in certain embodiments the encoder processes the frequency-domain data to embed code components distributed according to a frequency-hopping pattern. In certain embodiments, the code components comprise pairs of frequency components modified in amplitude to encode information. In certain other embodiments, the code components comprise pairs of frequency components modified in phase to encode information. Where the codes have been formed as spread spectrum codes, as in the Aijala, et al. U.S. Pat. No. 5,579,124 or the Preuss, et al. U.S. Pat. No. 5,319,735, the encoder comprises an appropriate spread spectrum encoder.
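The frequency-hopping variant mentioned above can be sketched as follows. The hop pattern, frame size and boost level are illustrative assumptions; the point is that the carrier bin changes from frame to frame according to a sequence shared by encoder and decoder:

```python
import numpy as np

N = 4096
HOP_PATTERN = [310, 450, 275, 390]   # shared pseudo-random hop sequence (illustrative)

def encode_hopped(frames, bits, boost=0.05):
    """Embed one bit per frame; the carrier bin hops per HOP_PATTERN.
    Bit 1 boosts the hop bin itself, bit 0 the bin just above it."""
    out = []
    for frame, bit, hop in zip(frames, bits, HOP_PATTERN):
        spec = np.fft.rfft(frame)
        spec[hop if bit else hop + 1] += boost * N
        out.append(np.fft.irfft(spec, n=N))
    return out

def decode_hopped(frames):
    bits = []
    for frame, hop in zip(frames, HOP_PATTERN):
        spec = np.abs(np.fft.rfft(frame))
        bits.append(int(spec[hop] > spec[hop + 1]))
    return bits

rng = np.random.default_rng(1)
frames = [rng.normal(0, 0.01, N) for _ in range(4)]
assert decode_hopped(encode_hopped(frames, [1, 0, 1, 1])) == [1, 0, 1, 1]
```

The amplitude-pair and phase-pair embodiments differ only in what property of the paired components is modified; this sketch uses the amplitude form because it is the simplest to demonstrate.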


The acoustic energy 80 produced by the acoustic reproducing device 30 is detected by a transducer 90 coupled to a portable monitor 100. The transducer 90 translates the acoustic energy 80 into detected audio data. In certain embodiments, the portable monitor 100 has an internal decoder 110 which serves to decode the encoded audience measurement data present in the detected audio data. The decoded audience measurement data is either stored in an internal storage device 120 to be communicated at a later time or else communicated from the monitor 100 once decoded. In other embodiments, the portable monitor 100 provides the detected audio data or a compressed version thereof to a storage device 120 for decoding elsewhere. The storage device 120 may be internal to the portable monitor 100 as depicted in FIG. 1, or the storage device may be external to the portable monitor 100 and coupled therewith to receive the data to be recorded. In still further embodiments, the portable monitor 100 receives and communicates audio data or a compressed version thereof to another device for subsequent decoding. In certain embodiments, the audio data is compressed by forming signal-to-noise ratios representing possible code components, such as in U.S. Pat. No. 5,450,490 or U.S. Pat. No. 5,764,763 both of which are assigned to the assignee of the present invention and are incorporated herein by reference.
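The signal-to-noise-ratio compression mentioned above can be sketched like this: rather than storing the audio itself, the monitor keeps one SNR per candidate code bin. The candidate bins, frame size and tone level are illustrative assumptions:

```python
import numpy as np

N = 4096
CANDIDATE_BINS = [400, 420, 600, 620]   # bins where code components may appear

def compress_frame(frame):
    """Reduce a frame to one SNR per candidate bin: bin power divided by the
    mean power of nearby bins. A few floats replace N audio samples."""
    p = np.abs(np.fft.rfft(frame)) ** 2
    return [float(p[b] / np.r_[p[b - 6:b - 1], p[b + 2:b + 7]].mean())
            for b in CANDIDATE_BINS]

rng = np.random.default_rng(3)
t = np.arange(N)
# A frame carrying a code component at bin 420 only.
frame = rng.normal(0, 0.01, N) + 0.05 * np.cos(2 * np.pi * 420 * t / N)
snrs = compress_frame(frame)
assert snrs[1] == max(snrs) and snrs[1] > 100
```

The ratios preserve exactly the information a later decoding stage needs (which candidate components stand out above the local noise floor) while discarding the bulk of the audio, which is why they serve as a compressed representation.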


The audience measurement data to be decoded in certain embodiments includes data already encoded in the audio data when received by the user system, data encoded in the audio data by the user system, or both.


There are several possible embodiments of decoding techniques that can be implemented for use in the present invention. Several advantageous techniques for detecting encoded audience measurement data are disclosed in U.S. Pat. No. 5,764,763 to James M. Jensen, et al., which is assigned to the assignee of the present application, and which is incorporated by reference herein. Other appropriate decoding techniques are disclosed in U.S. Pat. No. 5,579,124 to Aijala, et al., U.S. Pat. Nos. 5,574,962, 5,581,800 and 5,787,334 to Fardeau, et al., U.S. Pat. No. 5,450,490 to Jensen, et al., and U.S. patent application Ser. No. 09/318,045, in the names of Neuhauser, et al., each of which is assigned to the assignee of the present application and all of which are incorporated herein by reference.


Still other suitable decoding techniques are the subject of PCT Publication WO 00/04662 to Srinivasan, U.S. Pat. No. 5,319,735 to Preuss, et al., U.S. Pat. No. 6,175,627 to Petrovich, et al., U.S. Pat. No. 5,828,325 to Wolosewicz, et al., U.S. Pat. No. 6,154,484 to Lee, et al., U.S. Pat. No. 5,945,932 to Smith, et al., PCT Publication WO 99/59275 to Lu, et al., PCT Publication WO 98/26529 to Lu, et al., and PCT Publication WO 96/27264 to Lu, et al., all of which are incorporated herein by reference.


In certain embodiments, decoding is carried out by forming a data set from the audio data collected by the portable monitor 100 and processing the data set to extract the audience measurement data encoded therein. Where the encoded data has been formed as in U.S. Pat. No. 5,764,763 or U.S. Pat. No. 5,450,490, the data set is processed to transform the audio data to the frequency domain. The frequency domain data is processed to extract code components with predetermined frequencies. Where the encoded data has been formed as in the Srinivasan PCT Publication WO 00/04662, in certain embodiments the remote processor 160 processes the frequency domain data to detect code components distributed according to a frequency-hopping pattern. In certain embodiments, the code components comprise pairs of frequency components modified in amplitude to encode information which are processed to detect such amplitude modifications. In certain other embodiments, the code components comprise pairs of frequency components modified in phase to encode information and are processed to detect such phase modifications. Where the codes have been formed as spread spectrum codes, as in the Aijala, et al. U.S. Pat. No. 5,579,124 or the Preuss, et al. U.S. Pat. No. 5,319,735, an appropriate spread spectrum decoder is employed to decode the audience measurement data.
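Decoding by forming a data set from the collected audio and extracting components at predetermined frequencies can be sketched as follows. Averaging magnitude spectra over many frames lets a persistent low-level code component rise above uncorrelated programme audio; all numeric values here are illustrative assumptions:

```python
import numpy as np

N = 4096
CODE_BIN = 512   # predetermined code-component frequency (illustrative)

def detect(frames, threshold=2.0):
    """Average magnitude spectra over many frames so a persistent low-level
    code component rises above the uncorrelated background."""
    avg = np.mean([np.abs(np.fft.rfft(f)) for f in frames], axis=0)
    return bool(avg[CODE_BIN] / np.median(avg) > threshold)

rng = np.random.default_rng(2)
t = np.arange(N)
tone = 0.001 * np.cos(2 * np.pi * CODE_BIN * t / N)   # weak, persistent code tone
with_code = [rng.normal(0, 0.01, N) + tone for _ in range(64)]
without = [rng.normal(0, 0.01, N) for _ in range(64)]

assert detect(with_code) and not detect(without)
```

The background spectrum averages toward a flat noise floor across frames while the code component adds coherently at its bin, so the decision statistic improves as more frames are collected by the monitor.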


In the embodiment illustrated in FIG. 1, the portable monitor 100 is coupled with a base station 150 from time to time to download the detected audio data or decoded audience measurement data from the portable monitor 100. The base station 150 communicates this data to a remote processor 160 or a remote storage system 170 for producing audience measurement reports. The detected audio data or decoded audience measurement data is downloaded to the base station in either compressed or uncompressed form, depending on the embodiment. In one embodiment, the data is communicated from the base station 150 via the PSTN (public switched telephone network), accessed through a phone jack or via a cellular telephone. In another embodiment, the data is communicated via another network, such as the Internet. In yet another embodiment, the data is communicated via a satellite system or other wireless communications link.


In certain embodiments, the data is communicated from the base station 150 to a hub (not shown for purposes of simplicity and clarity) that collects such data from multiple base stations within a household, or directly from one or more portable monitors or both from one or more base stations and one or more portable monitors. The hub then communicates the collected data to the remote processor 160 or the remote storage system 170.


In certain embodiments, the base station 150 can also recharge an internal battery 115 on the portable monitor 100. In certain embodiments, the portable monitor 100 and base station 150 are implemented as in U.S. Pat. No. 5,483,276 assigned to the assignee of the present invention and incorporated herein by reference.


In an alternative embodiment, a stationary monitor receives the acoustic energy from the acoustic reproducing device 30 and provides the functionality provided by the portable monitor in other embodiments described herein above. In certain ones of such embodiments, the stationary monitor is integrated with the base station in order to communicate the data in accordance with the embodiments disclosed above. In another embodiment, the stationary monitor receives the acoustic energy from the acoustic reproducing device and provides the functionality provided by both the portable monitor and the base station in other embodiments described herein; thus, here there is no separate base station as all functions of the base station are performed by the stationary monitor.


In certain embodiments, encoded audio from the user system is output as an electrical signal through a device, such as an output jack, for reproduction by headphones or by a system such as a stereo, surround sound, or home theater system. In some such embodiments, the encoded audio is supplied in electrical form for monitoring and to gather audience measurement data by means of a portable monitor, and in others by means of a stationary monitor.



FIG. 2 illustrates various embodiments of a system 180 for encoding and reproducing audio data including a user system 220, an encoder 200 and an acoustic reproducing device 235. The user system 220 receives audio data, with or without associated data in other forms (such as video data, graphical data and/or textual data) as indicated at 222. The data may be supplied from any source, such as one or more of the audio data sources identified above in connection with FIG. 1. Moreover, as indicated at 224, the audio data at times will be encoded with audience measurement data, while at other times it may not be so encoded. As in the case of the embodiments described in connection with FIG. 1, encoder 200 is coupled with user system 220 to encode audience measurement data in the audio data 224 received in user system 220, and may be implemented by software running on user system 220 or as a device coupled with the user system 220 such as a peripheral device, or a board, such as a soundboard.


In certain embodiments, this audience measurement data is demographic data about the user. In other embodiments, this data is information about the user system or some portion thereof. In still other embodiments, this data is information about the audio data, such as its content or source. In still other embodiments, the data is qualitative data about the audience member or members. Further embodiments encode all or some of the above mentioned types of data in the audio data.


In one embodiment, the user system 220 includes a player 230 and a browser 240 running on the user system 220. In certain embodiments, the player is capable of processing audio and/or video data for presentation. In other embodiments, the browser is capable of processing various types of received data for presentation, sending and receiving data, encrypting and decrypting data, linking to other information sources, transmitting audio data, launching player applications and file viewers, and navigating a file system.


In certain embodiments, the user system 220 gathers demographic data about a user or a set of users and encoder 200 encodes this data into the audio data. The demographic data may include data on some or all of the user's age, sex, race, interests, occupation, profession, income, etc. In certain embodiments, the demographic data gathered from a particular user is associated with a user ID that is also encoded into the audio data. The demographic data may be gathered from direct user input, user agents, software tracking history and user system usage, an examination of files on the user system or user profile data on the user system or elsewhere. In some embodiments, the user agent automates an action, such as demographic data gathering. In other embodiments, the user inputs demographic data via a keyboard 280, a pointing device 285, and/or other kinds of user input devices (e.g. touch screens, microphones, key pads, voice recognition software, etc.).


In certain embodiments, the encoder 200 encodes system data about the content being presented from the player or the browser, information about the player type, information about the browser type, information about the operating system type, information about the user, and/or information about a URL, a channel, or a source associated with the source of the audio data. The system data may be gathered from operating system messages, metalevel program interactions, network level messages, direct user input, user agents, software tracking history and user system usage, and examination of files on the user system or user profile data on the user system or elsewhere. In some embodiments, the user agent automates an action, such as system data gathering. In other embodiments, the user inputs system data via keyboard 280, pointing device 285, and/or other kinds of user input devices (e.g., touch screens, microphones, key pads, voice recognition software, etc.). In still further embodiments, software embedded in the encoder gathers system data.



FIG. 2 further illustrates a portable monitor 250 to be carried on the person of an audience member and including an acoustic transducer 260. Portable monitor 250 is coupled with a docking station 270 to download data as well as to recharge batteries within monitor 250. Docking station 270 communicates with a remote processor or storage system 290 to provide data thereto for producing audience measurement reports. The monitor 250, transducer 260, docking station 270 and remote processor 290 may take any of the forms described above for comparable devices and substitutes in connection with FIG. 1.


Although the invention has been described with reference to particular arrangements and embodiments of services, systems, processors, devices, features and the like, these are not intended to exhaust all possible arrangements or embodiments, and indeed many other modifications and variations will be ascertainable to those of skill in the art.

Claims
  • 1. A method for adding information to audio, the method comprising: accessing an audio signal segment in a processing device capable of reproducing the audio as sound, the audio signal segment including first audience measurement data embedded into the audio signal segment; embedding second audience measurement data in the audio signal segment in the processing device, the second audience measurement data being different from the first audience measurement data, and the first audience measurement data and the second audience measurement data existing in the audio signal segment simultaneously; and transmitting the audio signal segment for reproduction.
  • 2. The method according to claim 1, wherein the audio signal segment includes streaming media.
  • 3. The method of claim 1, wherein the first audience measurement data and the second audience measurement data include portions having frequency characteristics enabling separate detection of the first audience measurement data and the second audience measurement data.
  • 4. The method of claim 1, wherein the second audience measurement data is embedded according to one of an asynchronous or synchronous positioning relative to the first audience measurement data.
  • 5. The method of claim 1, wherein the embedding of the second audience measurement data includes forming a data set of frequency-domain data.
  • 6. The method of claim 5, wherein the embedding of the second audience measurement data includes producing frequency-domain data based on the audience measurement data.
  • 7. The method of claim 6, wherein the frequency domain data is processed to embed the audience measurement data as frequency components having predetermined frequencies.
  • 8. The method of claim 6, wherein the frequency domain data is processed to embed code components of the audience measurement data according to a frequency-hopping pattern.
  • 9. The method of claim 1, further including decoding the audio signal segment by forming and processing a data set therefrom to extract at least one of the first audience measurement data or the second audience measurement data.
  • 10. A consumer electronics device, comprising: an input to receive an audio signal segment in the device, the audio signal segment including first audience measurement data embedded in the audio signal segment; a processor to embed second audience measurement data in the audio signal segment, the second audience measurement data being different from the first audience measurement data, the embedded first audience measurement data and the embedded second audience measurement data to exist in the audio signal segment simultaneously; and a speaker to reproduce the audio signal segment as sound.
  • 11. The device according to claim 10, wherein the audio signal segment includes streaming media.
  • 12. The device of claim 10, wherein the first audience measurement data and second audience measurement data include portions having frequency characteristics enabling separate detection of the first audience measurement data and the second audience measurement data.
  • 13. The device of claim 10, wherein the processor is to embed the second audience measurement data according to one of an asynchronous or synchronous positioning relative to the first audience measurement data.
  • 14. The device of claim 10, wherein the first audience measurement data and the second audience measurement data includes data sets of frequency domain data.
  • 15. The device of claim 14, wherein the first audience measurement data and the second audience measurement data include frequency domain data.
  • 16. The device of claim 15, wherein the processor is to process the frequency domain data to embed the second audience measurement data as frequency components having predetermined frequencies.
  • 17. The device of claim 15, wherein the processor is to process the frequency domain data to embed code components of the audience measurement data according to a frequency-hopping pattern.
  • 18. The device of claim 10, further including a decoder to decode the audio signal segment by forming and processing a data set therefrom to extract at least one of the first audience measurement data or the second audience measurement data.
  • 19. A method comprising: accessing an audio signal segment in a consumer electronics device capable of producing sound from the audio signal segment, the audio signal segment including first measurement data acoustically embedded into the audio signal segment; acoustically embedding, with the consumer electronics device, second measurement data in the audio signal segment, the second measurement data being different from the first measurement data, and at least portions of the first measurement data and second measurement data existing in the audio signal segment simultaneously; and producing sound, via the consumer electronics device, based on the audio signal segment.
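Claims 7, 8, 16 and 17 recite embedding code components as frequency components having predetermined frequencies. As a hedged illustration only (not the patented encoder; the sample rate, carrier frequencies, amplitudes and payloads are all assumed for the sketch), two independent payloads can coexist in one audio segment when their carriers occupy separable frequencies:

```python
# Illustrative sketch: embed two independent data payloads as low-amplitude
# sine components at predetermined, non-overlapping frequencies, so both
# exist in the audio segment simultaneously yet remain separately detectable.
import math

RATE = 8000                  # samples per second (assumed)
FIRST_FREQS = [1000, 1200]   # carriers for first payload bits (assumed)
SECOND_FREQS = [2000, 2200]  # carriers for second payload bits (assumed)

def embed(samples, bits, freqs, amplitude=0.01):
    """Add a sine component for every 1-bit at its predetermined frequency."""
    out = list(samples)
    for bit, f in zip(bits, freqs):
        if bit:
            for n in range(len(out)):
                out[n] += amplitude * math.sin(2 * math.pi * f * n / RATE)
    return out

def detect(samples, f):
    """Correlate against frequency f and return the energy found there."""
    re = sum(s * math.cos(2 * math.pi * f * n / RATE) for n, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * f * n / RATE) for n, s in enumerate(samples))
    return (re * re + im * im) / len(samples)

segment = [0.0] * RATE                          # one second of silence
segment = embed(segment, [1, 0], FIRST_FREQS)   # first audience measurement data
segment = embed(segment, [0, 1], SECOND_FREQS)  # second data, added later, coexists
assert detect(segment, 1000) > 10 * detect(segment, 1200)
assert detect(segment, 2200) > 10 * detect(segment, 2000)
```

Because each carrier completes an integer number of cycles over the one-second segment, the two payloads are mutually orthogonal and can be detected separately, as the frequency-characteristic language of claims 3 and 12 contemplates.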
CROSS-REFERENCE TO RELATED APPLICATION

This application is a continuation of prior U.S. non-provisional patent application Ser. No. 11/767,254, filed Jun. 22, 2007, now U.S. Pat. No. 7,640,141, which is a continuation of prior U.S. non-provisional patent application Ser. No. 10/205,808, filed Jul. 26, 2002, now U.S. Pat. No. 7,239,981, assigned to the assignee of the present invention and hereby incorporated herein by reference in its entirety.

US Referenced Citations (478)
Number Name Date Kind
2833859 Rahmel et al. May 1958 A
3484787 Vallese Dec 1969 A
3540003 Murphy Nov 1970 A
3818458 Deese Jun 1974 A
3906450 Prado, Jr. Sep 1975 A
3906454 Martin Sep 1975 A
3919479 Moon et al. Nov 1975 A
3973206 Haselwood et al. Aug 1976 A
T955010 Ragonese et al. Feb 1977 I4
4048562 Haselwood et al. Sep 1977 A
4168396 Best Sep 1979 A
4230990 Lert, Jr. et al. Oct 1980 A
4232193 Gerard Nov 1980 A
4306289 Lumley Dec 1981 A
4319079 Best Mar 1982 A
4361832 Cole Nov 1982 A
4367488 Leventer et al. Jan 1983 A
4367525 Brown et al. Jan 1983 A
4425578 Haselwood et al. Jan 1984 A
4547804 Greenberg Oct 1985 A
4558413 Schmidt et al. Dec 1985 A
4588991 Atalla May 1986 A
4590550 Eilert et al. May 1986 A
4595950 Lofberg Jun 1986 A
4621325 Naftzger et al. Nov 1986 A
4630196 Bednar, Jr. et al. Dec 1986 A
4639779 Greenberg Jan 1987 A
4647974 Butler et al. Mar 1987 A
4658093 Hellman Apr 1987 A
4672572 Alsberg Jun 1987 A
4677466 Lert, Jr. et al. Jun 1987 A
4685056 Barnsdale, Jr. et al. Aug 1987 A
4694490 Harvey et al. Sep 1987 A
4696034 Wiedemer Sep 1987 A
4697209 Kiewit et al. Sep 1987 A
4703324 White Oct 1987 A
4712097 Hashimoto Dec 1987 A
4718005 Feigenbaum et al. Jan 1988 A
4720782 Kovalcin Jan 1988 A
4723302 Fulmer et al. Feb 1988 A
4734865 Scullion et al. Mar 1988 A
4739398 Thomas et al. Apr 1988 A
4740890 William Apr 1988 A
4745468 Von Kohorn May 1988 A
4747139 Taaffe May 1988 A
4754262 Hackett et al. Jun 1988 A
4757533 Allen et al. Jul 1988 A
4764808 Solar Aug 1988 A
4769697 Gilley et al. Sep 1988 A
4791565 Dunham et al. Dec 1988 A
4805020 Greenberg Feb 1989 A
4821178 Levin et al. Apr 1989 A
4825354 Agrawal et al. Apr 1989 A
4827508 Shear May 1989 A
4866769 Karp Sep 1989 A
4876592 Von Kohorn Oct 1989 A
4876736 Kiewit Oct 1989 A
4907079 Turner et al. Mar 1990 A
4914689 Quade et al. Apr 1990 A
4926162 Pickell May 1990 A
4926255 Von Kohorn May 1990 A
4930011 Kiewit May 1990 A
4931871 Kramer Jun 1990 A
4940976 Gastouniotis et al. Jul 1990 A
4943963 Waechter et al. Jul 1990 A
4945412 Kramer Jul 1990 A
4956769 Smith Sep 1990 A
4967273 Greenberg Oct 1990 A
4970644 Berneking et al. Nov 1990 A
4972503 Zurlinden Nov 1990 A
4973952 Malec et al. Nov 1990 A
4977594 Shear Dec 1990 A
4994916 Pshitssky et al. Feb 1991 A
5019899 Boles et al. May 1991 A
5023907 Johnson et al. Jun 1991 A
5023929 Call Jun 1991 A
5032979 Hecht et al. Jul 1991 A
5034807 Von Kohorn Jul 1991 A
5057915 Von Kohorn Oct 1991 A
5081680 Bennett Jan 1992 A
5086386 Islam Feb 1992 A
5103498 Lanier et al. Apr 1992 A
5113518 Durst, Jr. et al. May 1992 A
5128752 Von Kohorn Jul 1992 A
5157489 Lowe Oct 1992 A
5182770 Medveczky et al. Jan 1993 A
5200822 Bronfin et al. Apr 1993 A
5204897 Wyman Apr 1993 A
5214780 Ingoglia et al. May 1993 A
5222874 Unnewehr et al. Jun 1993 A
5227874 Von Kohorn Jul 1993 A
5233642 Renton Aug 1993 A
5249044 Von Kohorn Sep 1993 A
5283734 Von Kohorn Feb 1994 A
5287408 Samson Feb 1994 A
5317635 Stirling et al. May 1994 A
5319735 Preuss et al. Jun 1994 A
5331544 Lu et al. Jul 1994 A
5343239 Lappington et al. Aug 1994 A
5355484 Record et al. Oct 1994 A
5374951 Welsh Dec 1994 A
5377269 Heptig et al. Dec 1994 A
5388211 Hornbuckle Feb 1995 A
5401946 Weinblatt Mar 1995 A
5406269 Baran Apr 1995 A
5410598 Shear Apr 1995 A
5425100 Thomas et al. Jun 1995 A
5440738 Bowman et al. Aug 1995 A
5444642 Montgomery et al. Aug 1995 A
5450134 Legate Sep 1995 A
5450490 Jensen et al. Sep 1995 A
5457807 Weinblatt Oct 1995 A
5463616 Kruse et al. Oct 1995 A
5481294 Thomas et al. Jan 1996 A
5483276 Brooks et al. Jan 1996 A
5483658 Grube et al. Jan 1996 A
5488648 Womble Jan 1996 A
5490060 Malec et al. Feb 1996 A
5497479 Hornbuckle Mar 1996 A
5499340 Barritz Mar 1996 A
5508731 Von Kohorn Apr 1996 A
5512933 Wheatley et al. Apr 1996 A
5519433 Lappington et al. May 1996 A
5524195 Clanton, III et al. Jun 1996 A
5526035 Lappington et al. Jun 1996 A
5533021 Branstad et al. Jul 1996 A
5543856 Rosser et al. Aug 1996 A
5557334 Legate Sep 1996 A
5559808 Kostreski et al. Sep 1996 A
5561010 Hanyu et al. Oct 1996 A
5574962 Fardeau et al. Nov 1996 A
5579124 Aijala et al. Nov 1996 A
5581800 Fardeau et al. Dec 1996 A
5584025 Keithley et al. Dec 1996 A
5584050 Lyons Dec 1996 A
5594934 Lu et al. Jan 1997 A
5606604 Rosenblatt et al. Feb 1997 A
5610916 Kostreski et al. Mar 1997 A
5621395 Kiyaji et al. Apr 1997 A
5629739 Dougherty May 1997 A
5638113 Lappington et al. Jun 1997 A
5640192 Garfinkle Jun 1997 A
5646675 Copriviza et al. Jul 1997 A
5646942 Oliver et al. Jul 1997 A
5654748 Matthews, III Aug 1997 A
5659366 Kerman Aug 1997 A
5666293 Metz et al. Sep 1997 A
5666365 Kostreski Sep 1997 A
5675510 Coffey et al. Oct 1997 A
5682196 Freeman Oct 1997 A
5697844 Von Kohorn Dec 1997 A
5701582 DeBey Dec 1997 A
5713795 Von Kohorn Feb 1998 A
5719634 Keery et al. Feb 1998 A
5724103 Batchelor Mar 1998 A
5724521 Dedrick Mar 1998 A
5727129 Barett et al. Mar 1998 A
5729472 Seiffert et al. Mar 1998 A
5729549 Kostreski et al. Mar 1998 A
5732218 Bland et al. Mar 1998 A
5734413 Lappington et al. Mar 1998 A
5734720 Salganicoff Mar 1998 A
5740035 Cohen et al. Apr 1998 A
5740549 Reilly et al. Apr 1998 A
5745760 Kawamura et al. Apr 1998 A
5751707 Voit et al. May 1998 A
5754938 Herz et al. May 1998 A
5754939 Herz et al. May 1998 A
5758257 Herz et al. May 1998 A
5759101 Von Kohorn Jun 1998 A
5761606 Wolzien Jun 1998 A
5764275 Lappington et al. Jun 1998 A
5764763 Jensen et al. Jun 1998 A
5768382 Schneier et al. Jun 1998 A
5768680 Thomas Jun 1998 A
5771354 Crawford Jun 1998 A
5774664 Hidary et al. Jun 1998 A
5787253 McCreery et al. Jul 1998 A
5787334 Fardeau et al. Jul 1998 A
5793410 Rao Aug 1998 A
5796633 Burgess et al. Aug 1998 A
5796952 Davis et al. Aug 1998 A
5802304 Stone Sep 1998 A
5812928 Watson, Jr. et al. Sep 1998 A
5815671 Morrison Sep 1998 A
5819156 Belmont Oct 1998 A
5828325 Wolosewicz et al. Oct 1998 A
5833468 Guy et al. Nov 1998 A
5841978 Rhoads Nov 1998 A
5848155 Cox Dec 1998 A
5848396 Gerace Dec 1998 A
5850249 Massetti et al. Dec 1998 A
5857190 Brown Jan 1999 A
5872588 Aras et al. Feb 1999 A
5878384 Johnson et al. Mar 1999 A
5880789 Inaba Mar 1999 A
5887140 Itsumi et al. Mar 1999 A
5892917 Myerson Apr 1999 A
5893067 Bender et al. Apr 1999 A
5905713 Anderson et al. May 1999 A
5918223 Blum et al. Jun 1999 A
5930369 Cox et al. Jul 1999 A
5933789 Byun et al. Aug 1999 A
5937392 Alberts Aug 1999 A
5944780 Chase et al. Aug 1999 A
5945932 Smith et al. Aug 1999 A
5945988 Williams et al. Aug 1999 A
5951642 Onoe et al. Sep 1999 A
5956716 Kenner et al. Sep 1999 A
5966120 Arazi et al. Oct 1999 A
5974396 Anderson et al. Oct 1999 A
5978842 Noble et al. Nov 1999 A
5987611 Freund Nov 1999 A
5987855 Dey et al. Nov 1999 A
5991807 Schmidt et al. Nov 1999 A
5999912 Wodarz et al. Dec 1999 A
6006217 Lumsden Dec 1999 A
6006332 Rabne et al. Dec 1999 A
6018619 Allard et al. Jan 2000 A
6034722 Viney et al. Mar 2000 A
6035177 Moses et al. Mar 2000 A
6049830 Saib Apr 2000 A
6055573 Gardenswartz et al. Apr 2000 A
6061082 Park May 2000 A
6061719 Bendinelli et al. May 2000 A
6108637 Blumenau Aug 2000 A
6115680 Coffee et al. Sep 2000 A
6138155 Davis et al. Oct 2000 A
6154209 Naughton et al. Nov 2000 A
6154484 Lee et al. Nov 2000 A
6175627 Petrovic et al. Jan 2001 B1
6199206 Nichioka et al. Mar 2001 B1
6202210 Ludtke Mar 2001 B1
6208735 Cox et al. Mar 2001 B1
6216129 Eldering Apr 2001 B1
6272176 Srinivasan Aug 2001 B1
6286036 Rhoads Sep 2001 B1
6286140 Ivanyi Sep 2001 B1
6298348 Eldering Oct 2001 B1
6308327 Liu et al. Oct 2001 B1
6327619 Blumenau Dec 2001 B1
6331876 Koster et al. Dec 2001 B1
6335736 Wagner et al. Jan 2002 B1
6363159 Rhoads Mar 2002 B1
6381632 Lowell Apr 2002 B1
6389055 August et al. May 2002 B1
6400827 Rhoads Jun 2002 B1
6411725 Rhoads Jun 2002 B1
6421445 Jensen et al. Jul 2002 B1
6477707 King et al. Nov 2002 B1
6487564 Asai et al. Nov 2002 B1
6505160 Levy et al. Jan 2003 B1
6510462 Blumenau Jan 2003 B2
6512836 Xie et al. Jan 2003 B1
6513014 Walker et al. Jan 2003 B1
6522771 Rhoads Feb 2003 B2
6539095 Rhoads Mar 2003 B1
6546556 Kataoka et al. Apr 2003 B1
6553178 Abecassis Apr 2003 B2
6574594 Pitman et al. Jun 2003 B2
6642966 Limaye Nov 2003 B1
6647269 Hendrey et al. Nov 2003 B2
6651253 Dudkiewicz et al. Nov 2003 B2
6654480 Rhoads Nov 2003 B2
6665873 Van Gestel et al. Dec 2003 B1
6675383 Wheeler et al. Jan 2004 B1
6683966 Tian et al. Jan 2004 B1
6710815 Billmaier et al. Mar 2004 B1
6714683 Tian et al. Mar 2004 B1
6741684 Kaars May 2004 B2
6748362 Meyer et al. Jun 2004 B1
6750985 Rhoads Jun 2004 B2
6766523 Herley Jul 2004 B2
6795972 Rovira Sep 2004 B2
6804379 Rhoads Oct 2004 B2
6829368 Meyer et al. Dec 2004 B2
6853634 Davies et al. Feb 2005 B1
6871180 Neuhauser et al. Mar 2005 B1
6871323 Wagner et al. Mar 2005 B2
6873688 Aarnio Mar 2005 B1
6941275 Swierczek Sep 2005 B1
6956575 Nakazawa et al. Oct 2005 B2
6965601 Nakano et al. Nov 2005 B1
6968564 Srinivasan Nov 2005 B1
6970886 Conwell et al. Nov 2005 B1
6996213 De Jong Feb 2006 B1
7003731 Rhoads et al. Feb 2006 B1
7050603 Rhoads et al. May 2006 B2
7051086 Rhoads et al. May 2006 B2
7058697 Rhoads Jun 2006 B2
7082434 Gosselin Jul 2006 B2
7095871 Jones et al. Aug 2006 B2
7143949 Hannigan Dec 2006 B1
7158943 Van der Riet Jan 2007 B2
7171018 Rhoads et al. Jan 2007 B2
7174293 Kenyon et al. Feb 2007 B2
7185201 Rhoads et al. Feb 2007 B2
7194752 Kenyon et al. Mar 2007 B1
7197156 Levy Mar 2007 B1
7206494 Engle et al. Apr 2007 B2
7215280 Percy et al. May 2007 B1
7221405 Basson et al. May 2007 B2
7227972 Brundage et al. Jun 2007 B2
7239981 Kolessar et al. Jul 2007 B2
7254249 Rhoads et al. Aug 2007 B2
7273978 Uhle Sep 2007 B2
7317716 Boni et al. Jan 2008 B1
7328153 Wells et al. Feb 2008 B2
7346512 Li-Chun Wang et al. Mar 2008 B2
7356700 Noridomi et al. Apr 2008 B2
7363278 Schmelzer et al. Apr 2008 B2
7369678 Rhoads May 2008 B2
7421723 Harkness et al. Sep 2008 B2
7440674 Plotnick et al. Oct 2008 B2
7443292 Jensen et al. Oct 2008 B2
7463143 Forr et al. Dec 2008 B2
7519658 Anglin et al. Apr 2009 B1
7592908 Zhang et al. Sep 2009 B2
7607147 Lu et al. Oct 2009 B1
7623823 Zito et al. Nov 2009 B2
7640141 Kolessar et al. Dec 2009 B2
7644422 Lu et al. Jan 2010 B2
RE42627 Neuhauser et al. Aug 2011 E
8369972 Topchy et al. Feb 2013 B2
20010028662 Hunt et al. Oct 2001 A1
20010044751 Pugliese et al. Nov 2001 A1
20010044899 Levy Nov 2001 A1
20010047517 Christopoulos et al. Nov 2001 A1
20010056405 Muyres et al. Dec 2001 A1
20010056573 Kovac et al. Dec 2001 A1
20020002488 Muyres et al. Jan 2002 A1
20020032734 Rhoads Mar 2002 A1
20020032904 Lerner Mar 2002 A1
20020033842 Zetts Mar 2002 A1
20020053078 Holtz et al. May 2002 A1
20020056086 Yuen May 2002 A1
20020056089 Houston May 2002 A1
20020056094 Dureau May 2002 A1
20020059218 August et al. May 2002 A1
20020062382 Rhoads et al. May 2002 A1
20020065826 Bell et al. May 2002 A1
20020087967 Conkwright et al. Jul 2002 A1
20020087969 Brunheroto et al. Jul 2002 A1
20020091991 Castro Jul 2002 A1
20020101083 Toledano et al. Aug 2002 A1
20020102993 Hendrey et al. Aug 2002 A1
20020108125 Joao Aug 2002 A1
20020111934 Narayan Aug 2002 A1
20020112002 Abato Aug 2002 A1
20020124077 Hill et al. Sep 2002 A1
20020124246 Kaminsky et al. Sep 2002 A1
20020133412 Oliver et al. Sep 2002 A1
20020138851 Lord et al. Sep 2002 A1
20020144262 Plotnick et al. Oct 2002 A1
20020144273 Reto Oct 2002 A1
20020162118 Levy et al. Oct 2002 A1
20020171567 Altare et al. Nov 2002 A1
20020174425 Markel et al. Nov 2002 A1
20020188746 Drosset et al. Dec 2002 A1
20020194592 Tsuchida et al. Dec 2002 A1
20030021441 Levy et al. Jan 2003 A1
20030039465 Bjorgan et al. Feb 2003 A1
20030041141 Abdelaziz et al. Feb 2003 A1
20030056208 Kamada et al. Mar 2003 A1
20030070167 Holtz et al. Apr 2003 A1
20030088674 Ullman et al. May 2003 A1
20030105870 Baum Jun 2003 A1
20030108200 Sako Jun 2003 A1
20030115586 Lejouan et al. Jun 2003 A1
20030115598 Pantoja Jun 2003 A1
20030170001 Breen Sep 2003 A1
20030171833 Crystal et al. Sep 2003 A1
20030177488 Smith et al. Sep 2003 A1
20030185232 Moore et al. Oct 2003 A1
20030229900 Reisman Dec 2003 A1
20040004630 Kalva et al. Jan 2004 A1
20040006696 Shin et al. Jan 2004 A1
20040024588 Watson et al. Feb 2004 A1
20040031058 Reisman Feb 2004 A1
20040037271 Liscano et al. Feb 2004 A1
20040038692 Muzaffar Feb 2004 A1
20040059918 Xu Mar 2004 A1
20040059933 Levy Mar 2004 A1
20040064319 Neuhauser et al. Apr 2004 A1
20040073916 Petrovic et al. Apr 2004 A1
20040073951 Bae et al. Apr 2004 A1
20040120417 Lynch et al. Jun 2004 A1
20040125125 Levy Jul 2004 A1
20040128514 Rhoads Jul 2004 A1
20040137929 Jones et al. Jul 2004 A1
20040143844 Brant et al. Jul 2004 A1
20040146161 De Jong Jul 2004 A1
20040184369 Herre et al. Sep 2004 A1
20040199387 Wang et al. Oct 2004 A1
20040267533 Hannigan et al. Dec 2004 A1
20050028189 Heine et al. Feb 2005 A1
20050033758 Baxter Feb 2005 A1
20050036653 Brundage et al. Feb 2005 A1
20050058319 Rhoads et al. Mar 2005 A1
20050086682 Burges et al. Apr 2005 A1
20050144004 Bennett et al. Jun 2005 A1
20050192933 Rhoads et al. Sep 2005 A1
20050216346 Kusumoto et al. Sep 2005 A1
20050232411 Srinivasan et al. Oct 2005 A1
20050234728 Tachibana et al. Oct 2005 A1
20050234774 Dupree Oct 2005 A1
20050262351 Levy Nov 2005 A1
20050271246 Sharma et al. Dec 2005 A1
20060059277 Zito et al. Mar 2006 A1
20060083403 Zhang et al. Apr 2006 A1
20060095401 Krikorian et al. May 2006 A1
20060107195 Ramaswamy et al. May 2006 A1
20060107302 Zdepski May 2006 A1
20060136564 Ambrose Jun 2006 A1
20060167747 Goodman et al. Jul 2006 A1
20060168613 Wood et al. Jul 2006 A1
20060212710 Baum et al. Sep 2006 A1
20060221173 Duncan Oct 2006 A1
20060224798 Kelin et al. Oct 2006 A1
20070006250 Croy et al. Jan 2007 A1
20070016918 Alcorn et al. Jan 2007 A1
20070055987 Lu et al. Mar 2007 A1
20070110089 Essafi et al. May 2007 A1
20070118375 Kenyon et al. May 2007 A1
20070124771 Shvadron May 2007 A1
20070127717 Herre et al. Jun 2007 A1
20070129952 Kenyon et al. Jun 2007 A1
20070143778 Covell et al. Jun 2007 A1
20070149114 Danilenko Jun 2007 A1
20070162927 Ramaswamy et al. Jul 2007 A1
20070198738 Angiolillo et al. Aug 2007 A1
20070201835 Rhoads Aug 2007 A1
20070226760 Neuhauser et al. Sep 2007 A1
20070274523 Rhoads Nov 2007 A1
20070276925 La Joie et al. Nov 2007 A1
20070276926 La Joie et al. Nov 2007 A1
20070288476 Flanagan, III et al. Dec 2007 A1
20070294057 Crystal et al. Dec 2007 A1
20070294132 Zhang et al. Dec 2007 A1
20070294705 Gopalakrishnan et al. Dec 2007 A1
20070294706 Neuhauser et al. Dec 2007 A1
20080019560 Rhoads Jan 2008 A1
20080022114 Moskowitz Jan 2008 A1
20080027734 Zhao et al. Jan 2008 A1
20080028223 Rhoads Jan 2008 A1
20080028474 Horne et al. Jan 2008 A1
20080040354 Ray et al. Feb 2008 A1
20080059160 Saunders et al. Mar 2008 A1
20080065507 Morrison et al. Mar 2008 A1
20080077956 Morrison et al. Mar 2008 A1
20080082510 Wang et al. Apr 2008 A1
20080082922 Biniak et al. Apr 2008 A1
20080083003 Biniak et al. Apr 2008 A1
20080133223 Son et al. Jun 2008 A1
20080137749 Tian et al. Jun 2008 A1
20080139182 Levy et al. Jun 2008 A1
20080140573 Levy et al. Jun 2008 A1
20080168503 Sparrell Jul 2008 A1
20080209491 Hasek Aug 2008 A1
20080215333 Tewfik et al. Sep 2008 A1
20080219496 Tewfik et al. Sep 2008 A1
20080235077 Harkness et al. Sep 2008 A1
20090031037 Mendell et al. Jan 2009 A1
20090031134 Levy Jan 2009 A1
20090070408 White Mar 2009 A1
20090070587 Srinivasan Mar 2009 A1
20090119723 Tinsman May 2009 A1
20090125310 Lee et al. May 2009 A1
20090150553 Collart et al. Jun 2009 A1
20090259325 Topchy et al. Oct 2009 A1
20090265214 Jobs et al. Oct 2009 A1
20090307061 Monighetti et al. Dec 2009 A1
20090307084 Monighetti et al. Dec 2009 A1
20100049474 Kolessar et al. Feb 2010 A1
20100135638 Mio Jun 2010 A1
20100138770 Lu et al. Jun 2010 A1
20100166120 Baum et al. Jul 2010 A1
20130096706 Srinivasan et al. Apr 2013 A1
Foreign Referenced Citations (107)
Number Date Country
8976601 Feb 2002 AU
9298201 Apr 2002 AU
2003230993 Nov 2003 AU
2006203639 Sep 2006 AU
0112901 Jun 2003 BR
0309598 Feb 2005 BR
1318967 Jun 1993 CA
2353303 Jan 2003 CA
2483104 Nov 2003 CA
1149366 May 1997 CN
1253692 May 2000 CN
1372682 Oct 2002 CN
1592906 Mar 2005 CN
1647160 Jul 2005 CN
101243688 Aug 2008 CN
101262292 Sep 2008 CN
0309269 Mar 1989 EP
0325219 Jul 1989 EP
0703683 Mar 1996 EP
0744695 Nov 1996 EP
0769749 Apr 1997 EP
0944991 Sep 1999 EP
1043853 Oct 2000 EP
1089201 Apr 2001 EP
1089564 Apr 2001 EP
1267572 Dec 2002 EP
1349370 Oct 2003 EP
1406403 Apr 2004 EP
1307833 Jun 2006 EP
1745464 Oct 2007 EP
1704695 Feb 2008 EP
1504445 Aug 2008 EP
2176639 Dec 1986 GB
5324352 Dec 1993 JP
5347648 Dec 1993 JP
6085966 Mar 1994 JP
7123392 May 1995 JP
2001040322 Aug 2002 JP
2002247610 Aug 2002 JP
2003208187 Jul 2003 JP
2003536113 Dec 2003 JP
2006154851 Jun 2006 JP
2007318745 Dec 2007 JP
4408453 Nov 2009 JP
8907868 Aug 1989 WO
9512278 May 1995 WO
9526106 Sep 1995 WO
9600950 Jan 1996 WO
9617467 Jun 1996 WO
9627264 Sep 1996 WO
9628904 Sep 1996 WO
9632815 Oct 1996 WO
9637983 Nov 1996 WO
9641495 Dec 1996 WO
9702672 Jan 1997 WO
9715007 Apr 1997 WO
9826529 Jun 1998 WO
9826571 Jun 1998 WO
9831155 Jul 1998 WO
9527349 Oct 1998 WO
9959275 Nov 1999 WO
0004662 Jan 2000 WO
0019699 Apr 2000 WO
0072309 Nov 2000 WO
0119088 Mar 2001 WO
0124027 Apr 2001 WO
0131497 May 2001 WO
0140963 Jun 2001 WO
0153922 Jul 2001 WO
0175743 Oct 2001 WO
0191109 Nov 2001 WO
0205517 Jan 2002 WO
0211123 Feb 2002 WO
0215081 Feb 2002 WO
0217591 Feb 2002 WO
0219625 Mar 2002 WO
0227600 Apr 2002 WO
0237381 May 2002 WO
0245034 Jun 2002 WO
02061652 Aug 2002 WO
02065305 Aug 2002 WO
02065318 Aug 2002 WO
02069121 Sep 2002 WO
02098029 Dec 2002 WO
03009277 Jan 2003 WO
03091990 Nov 2003 WO
03094499 Nov 2003 WO
03096337 Nov 2003 WO
2004010352 Jan 2004 WO
2004040416 May 2004 WO
2004040475 May 2004 WO
2005025217 Mar 2005 WO
2005064885 Jul 2005 WO
2005101243 Oct 2005 WO
2005111998 Nov 2005 WO
2006012241 Feb 2006 WO
2006025797 Mar 2006 WO
2007056531 May 2007 WO
2007056532 May 2007 WO
2008042953 Apr 2008 WO
2008044664 Apr 2008 WO
2008045950 Apr 2008 WO
2008110002 Sep 2008 WO
2008110790 Sep 2008 WO
2009011206 Jan 2009 WO
2009061651 May 2009 WO
2009064561 May 2009 WO
Non-Patent Literature Citations (63)
Entry
Heuer, et al., “Adaptive Multimedia Messaging based on MPEG-7—The M3-Box,” Nov. 9-10, 2000, Proc. Second Int'l Symposium on Mobile Multimedia Systems Applications, pp. 6-13 (8 pages).
Wactlar et al., “Digital Video Archives: Managing Through Metadata,” Building a National Strategy for Digital Preservation: Issues in Digital Media Archiving, Apr. 2002, pp. 84-88. [http://www.informedia.cs.cmu.edu/documents/Wactlar-CUR-final.pdf, retrieved on Jul. 20, 2006] (14 pages).
Mulder, “The Integration of Metadata From Production to Consumer,” EBU Technical Review, Sep. 2000, pp. 1-5. [http://www.ebu.ch/en/technical/trev/trev_284-contents.html, retrieved on Jul. 20, 2006] (5 pages).
Hopper, “EBU Project Group P/META Metadata Exchange Standards,” EBU Technical Review, Sep. 2000, pp. 1-24. [http://www.ebu.ch/en/technical/trev/trev_284-contents.html, retrieved on Jul. 20, 2006] (24 pages).
Evain, “TV-Anytime Metadata—A Preliminary Specification on Schedule!,” EBU Technical Review, Sep. 2000, pp. 1-14. [http://www.ebu.ch/en/technical/trev/trev_284-contents.html, retrieved on Jul. 20, 2006] (14 pages).
“EBU Technical Review (Editorial),” No. 284, Sep. 2000, pp. 1-3. [http://www.ebu.ch/en/technical/trev/trev_284-contents.html, retrieved on Jul. 20, 2006] (3 pages).
Apr. 22, 2009 Complaint in Arbitron Inc. v. John Barrett Kiefl in United States District Court for the Southern District of New York, Case 1:09-cv-04013-PAC.
Apr. 8, 2009 Letter from John S. Macera (representing Kiefl) to Michael Skarzynski (of Arbitron) re: alleged patent infringement. (Exhibit I of the Apr. 22, 2009 Complaint in Arbitron Inc. v. John Barrett Kiefl in United States District Court for the Southern District of New York, Case 1:09-cv-04013-PAC.)
Apr. 24, 2009 Letter from Michael Skarzynski (of Arbitron) to John S. Macera (representing Kiefl) re: alleged patent infringement.
United States Patent and Trademark Office, “Notice of Allowance,” issued in connection with U.S. Appl. No. 10/205,808, on Feb. 26, 2007 (7 pages).
United States Patent and Trademark Office, “Notice of Allowance,” issued in connection with U.S. Appl. No. 10/205,808, on Sep. 26, 2006 (11 pages).
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 10/205,808, on Mar. 28, 2005 (19 pages).
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 10/205,808, on Dec. 10, 2004 (17 pages).
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 10/205,808, on Dec. 20, 2005 (20 pages).
United States Patent and Trademark Office, “Notice of Allowance,” issued in connection with U.S. Appl. No. 11/767,254, on Mar. 12, 2009 (8 pages).
United States Patent and Trademark Office, “Notice of Allowance,” issued in connection with U.S. Appl. No. 11/767,254, on Jul. 30, 2009 (8 pages).
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 11/767,254, on Oct. 1, 2008 (16 pages).
Patent Cooperation Treaty, “International Preliminary Examination Report,” issued in connection with PCT Patent Application Serial No. PCT/US03/22370, on May 6, 2005 (3 pages).
Patent Cooperation Treaty, “International Search Report,” issued in connection with PCT Patent Application Serial No. PCT/US03/22370, on Mar. 11, 2004 (1 page).
Tally Systems Corp, “Tally Systems Patents Software Inventorying Technology,” retrieved from www.tallysys.com, on Jul. 1, 1996 (5 pages).
“Lan Times 1995 Index: Application Administration & Management,” LAN Times (1995) (5 pages).
R. Lisle, “The Management Features in Software-Metering Tools Can Save You a Bundle,” LAN Times, Jul. 3, 1995 (3 pages).
T. Johnson, “Research in the Future: The Role and Measurement of the Internet,” ARF 60th Anniversary Annual Conference and Research Expo, Mar. 11-13, 1996 (4 pages).
“The Top Five Advertising Agencies Now Subscribe to PC-Meter Web Measurement Services,” at http://www.npd.com:80/pcmprl0.htm on Jul. 1, 1996 (2 pages).
“Demographics,” at http://www.w3.org/pub/WWW/Demographics on Oct. 4, 1996 (3 pages).
D. Hoffman et al., “How Big is the Internet,” Aug. 18, 1994 (2 pages).
M. Brownstein, “Streamlined and Ready for Action,” NETGUIDE (1996), pp. 81, 83-86, 88, 90, 95-96.
B. Harvey, “Interactive Standards,” The Marketing Pulse, vol. XN, Issue 12, Aug. 31, 1994, pp. 1-6.
Chiat/Day, “The New Video Highway: What Will We Need to Know? How Will We Measure It?” Advertising Research Foundation, Jun. 29, 1994, pp. 1-12.
M. Green et al., “The Evolution of Research Problems on the Information Superhighway,” JMCT Media Research, Jun. 1994 (7 pages).
“Preliminary Summary Overview of Studies of Interactivity for 4AS Casie Research Sub-Committee,” Next Century Media, Inc., pp. 1-11, Oct. 4, 1996.
Release Notes for NeTraMet as found on The Worldwide Web on Jul. 1, 1996 (2 pages).
Infoseek Internet Search Results When Searching for “NPD” on Jul. 1, 1996 (2 pages).
Print of page from The Worldwide Web, http://www.npd.com/pcmdef.htm on Jul. 1, 1996 (1 page).
Print of page from The Worldwide Web, http://www.npd.com:80/pcmeter.htm on Jul. 1, 1996 (1 page).
Print of page from The Worldwide Web, http://www.npd.com:80/pcmpr.htm on Jul. 1, 1996 (1 page).
E. English, “The Meter's Running,” LAN Times, Mar. 27, 1995 (2 pages).
Marketing News, Jun. 3, 1996, Section: 1996 Business Report on the Marketing Research Industry (36 pages).
“Latest NPD Survey Finds World Wide Web Access from Homes Grew Fourfold in Second Half of 1995,” from http://www.npd.com:80/meterpr4.htm on Jul. 1, 1996 (1 page).
“First Demographic Data on Home World Wide Web Use Now Available from the NPD Group,” from http://www.npd.com:80/meterpr6.htm on Jul. 1, 1996 (2 pages).
“America Online is Leading Destination of Web Surfers in First-ever PC-Meter Sweeps Citing Top 25 Web Sites,” from http://www.npd.com:80/meterpr5.htm on Jul. 1, 1996 (3 pages).
“NPD's PC-Meter Service to Provide More Accurate Measure of World Wide Web Traffic,” from http://www.npd.com:80/meterpr.htm on Jul. 1, 1996 (1 page).
“PC-Meter Now in 1,000 Households Nationwide,” from http://www.npd.com:80/meterpr2.htm on Jul. 1, 1996 (1 page).
“PC-Meter Predicts Happy Holidays for Computer Manufacturers and Retailers,” http://www.npd.com:80/meterpr3.htm on Jul. 1, 1996 (1 page).
Electronic News, vol. 42, No. 2110, Monday, Apr. 1, 1996 (4 pages).
Interactive Marketing News, Friday, Jul. 5, 1996 (1 page).
Advertising Age, Special Report, Monday, May 30, 1996 (1 page).
Charlottesville Business Journal, vol. 7, No. 2, Thursday, Feb. 1, 1996 (6 pages).
P. Helsinki, “Automating Web-Site Maintenance Part 2: Perl-Based Tools to Manage Your Website,” Web Techniques, vol. 1, No. 9, XP-002048313, Dec. 1996, pp. 75-78.
M. Lafferty, “Taking the PC Out of the Data Comm Loop: New Techniques Bring Mass Market and Net Together on TV,” CED: Communications Engineering and Design, vol. 22, No. 9, XP-002079179, Aug. 1996, pp. 34-38.
Fink et al., “Social- and Interactive- Television Applications Based on Real-Time Ambient-Audio Identification,” EuroiTV, 2006 (10 pages).
Claburn, “Google Researchers Propose TV Monitoring,” Information Week, Jun. 7, 2006 (3 pages).
Anderson, “Google to compete with Nielsen for TV-ratings info?,” Ars Technica, Jun. 19, 2006 (2 pages).
“What is Jacked?,” http://www.jacked.com, retrieved on Dec. 3, 2009 (1 page).
Sullivan, “Google Cozies Up to SMBs for Digital Content,” MediaPost News, Mar. 18, 2009, (2 pages).
Wang, “An Industrial-Strength Audio Search Algorithm,” Shazam Entertainment, Ltd., in Proceedings of the Fourth International Conference on Music Information Retrieval, Baltimore, Oct. 26-30, 2003 (7 pages).
Boehret, “Yahoo Widgets Lend Brains to Boob Tube,” The Wall Street Journal, Mar. 25, 2009 (3 pages).
Stross, “Apple Wouldn't Risk Its Cool Over a Gimmick, Would It?,” The New York Times, Nov. 14, 2009 (3 pages).
Stultz, “Handheld Captioning at Disney World Theme Parks,” article retrieved on Mar. 19, 2009, http://goflorida.about.com/od/disneyworld/a/wdw_captioning.htm (2 pages).
Kane, “Entrepreneur Plans On-Demand Videogame Service,” The Wall Street Journal, Mar. 24, 2009 (2 pages).
Shazam, “Shazam turns up the volume on mobile music,” http://www.shazam.com/music/web/newsdetail.html?nid=NEWS137, Nov. 28, 2007 (1 page).
Shazam, “Shazam and VidZone Digital Media announce UK's first fixed price mobile download service for music videos,” http://www.shazam.com/music/web/newsdetail.html?nid=NEWS136, Feb. 11, 2008 (1 page).
Shazam, “Shazam launches new music application for Facebook fans,” http://www.shazam.com/music/web/newsdetail.html?nid=NEWS135, Feb. 18, 2008 (1 page).
Related Publications (1)
Number Date Country
20100049474 A1 Feb 2010 US
Continuations (2)
Number Date Country
Parent 11767254 Jun 2007 US
Child 12611220 US
Parent 10205808 Jul 2002 US
Child 11767254 US