This disclosure relates generally to audience measurement and, more particularly, to display device ON/OFF detection methods and apparatus.
Media ratings and metering information is typically generated by collecting viewing records and/or other media consumption information from a group of statistically selected households. Each of the statistically selected households typically has a data logging and processing unit commonly referred to as a “home unit.” In households having multiple viewing sites (e.g., multiple television systems or, more generally, multiple presentation devices), the data logging and processing functionality may be distributed among a single home unit and multiple “site units,” one site unit for each viewing site. The home unit (or the combination of the home unit and the site unit) is often in communication with a variety of attachments that provide inputs to the home unit or receive outputs from the home unit. For example, a frequency detector attachment coupled with the home unit may be in communication with a television to sense a local oscillator frequency of the television tuner. In this manner, the frequency detector attachment may be used by the home unit to determine the channel to which the television is currently tuned based on a detected frequency. As another example, a people meter may be located in the viewing space of the television and in communication with the home unit, thereby enabling the home unit to detect the identities and/or number of the persons currently viewing programs displayed on the television. Additional devices may be provided, for example, to determine if the television is operating (i.e., is turned ON) and/or the channel to which the television is tuned.
In addition, building security and building monitoring systems are becoming more and more prevalent in today's society. Such systems enable the building owner to determine the status of various electronic appliances disposed in the building even when the building owner is located remotely from the building premises. In many instances, the building owner may desire to know the operating status, e.g., ON/OFF, of a particular appliance, such as a television, or other media delivery/presentation device.
In another setting, parents often have an interest in monitoring their children's television viewing habits, electronic gaming habits and computer usage habits. A component of monitoring such habits involves determining the operating status of the appliance, electronic device, etc. of interest.
Media monitoring systems, building monitoring systems and parenting tools such as those described above are only three of many applications in which an ON/OFF detection apparatus/device has use.
A block diagram of an example local metering system 100 capable of providing viewing and metering information for program content presented via an example home entertainment system 102 is illustrated in
The broadcast source 104 may be any broadcast media source, such as a cable television service provider, a satellite television service provider, a radio frequency (RF) television service provider, an internet streaming video/audio provider, etc. The broadcast source 104 may provide analog and/or digital television signals to the home entertainment system 102, for example, over a coaxial cable or via a wireless connection.
The STB 108 may be any set-top box, such as a cable television converter, a direct broadcast satellite (DBS) decoder, a video cassette recorder (VCR), etc. The set-top box 108 receives a plurality of broadcast channels from the broadcast source 104. Typically, the STB 108 selects one of the plurality of broadcast channels based on a user input, and outputs one or more signals received via the selected broadcast channel. In the case of an analog signal, the STB 108 tunes to a particular channel to obtain programming delivered on that channel. For a digital signal, the STB 108 may tune to a channel and decode certain packets of data to obtain programming delivered on a selected channel. For example, the STB 108 may tune to a major channel and then extract a program carried on a minor channel within the major channel via the decoding process mentioned above. For some home entertainment systems 102, for example, those in which the broadcast source 104 is a standard RF analog television service provider or a basic analog cable television service provider, the STB 108 may not be present as its function is performed by a tuner in the display device 120.
In the illustrated example, an output from the STB 108 is fed to a signal splitter 116, such as a single analog y-splitter (in the case of an RF coaxial connection between the STB 108 and the display device 120) or an audio/video splitter (in the case of a direct audio/video connection between the STB 108 and the display device 120). (For configurations in which the STB 108 is not present, the broadcast source 104 may be coupled directly to the signal splitter 116). In the example home entertainment system 102, the signal splitter produces two signals indicative of the output from the STB 108. Of course, a person of ordinary skill in the art will readily appreciate that any number of signals may be produced by the signal splitter 116.
In the illustrated example, one of the two signals from the signal splitter 116 is fed to the display device 120 and the other signal is delivered to the metering unit 124. The display device 120 may be any type of video display device, such as a television. For example, the display device 120 may be a television and/or other display device (e.g., a computer monitor, a CRT, an LCD, etc.) that supports the National Television Standards Committee (NTSC) standard, the Phase Alternating Line (PAL) standard, the Système Électronique pour Couleur avec Mémoire (SECAM) standard, a standard developed by the Advanced Television Systems Committee (ATSC), such as high definition television (HDTV), a standard developed by the Digital Video Broadcasting (DVB) Project, or may be a multimedia computer system, etc.
In the example of
The metering unit 124 may be configured to determine identifying information based on the signal corresponding to the program content being output by the STB 108. For example, the metering unit 124 may be configured to decode an embedded code in the signal received via connection 136 that corresponds to the channel or program currently being delivered by the STB 108 for display on the display device 120. The code may be embedded for purposes such as, for example, audience measurement, program delivery (e.g., PIDS in a digital television presentation, electronic program guide information, etc.) or delivery of other services (e.g., embedded hyperlinks to related programming, closed caption information, etc.). Alternatively or additionally, the metering unit 124 may be configured to generate a program signature (e.g., a proxy signal which is uniquely representative of the program signal) based on the signal received via connection 136 that corresponds to the program currently being delivered by the STB 108 for display on the display device 120. The metering unit 124 may then add this program identifying information (e.g., the code(s) and/or signature(s)) to the viewing records corresponding to the currently displayed program.
In the example local metering system 100, the display device ON/OFF detector 128 is coupled to the metering unit 124. The display device ON/OFF detector 128 is configured to determine whether the display device 120 or other monitored information presenting device (e.g., a computer monitor, etc.) is operating in an ON (active) state or an OFF (inactive) state. Such ON/OFF detection information concerning the operating state of the information presenting device 120 may be used to more accurately process the viewing information and viewing records determined by the metering unit 124. For example, in the home entertainment system 102, it is possible that even though the display device 120 is turned OFF, the STB 108 may be inadvertently or intentionally left in an ON (active) state such that the STB 108 continues to receive and output program content provided by the broadcast source 104. Without the ON/OFF detection information provided by the display device ON/OFF detector 128, the metering unit 124 (or subsequent processing at, for example, a central facility) might credit the program content provided by the STB 108 as being consumed even though the display device 120 is turned OFF. Thus, the display device ON/OFF detector 128 may be used to augment the viewing information and/or viewing records determined by the metering unit 124 to more accurately determine whether program content output by the STB 108 is actually presented by the display device 120.
To facilitate the determination of program identifying information and the generation of viewing records for the program content received and output by the STB 108, as well as the determination of the operating state of the display device 120 or corresponding information presenting device, the metering unit 124 and the display device ON/OFF detector 128 may be provided with one or more sensors 144. For example, a sensor 144 may be implemented by a microphone placed in the proximity of the display device 120 to receive audio signals corresponding to the program being displayed. The metering unit 124 and/or display device ON/OFF detector 128 may then process the audio signals received from the microphone 144 to decode any embedded ancillary code(s) and/or generate one or more audio signatures corresponding to a program being displayed. The display device ON/OFF detector 128 may also process the audio signal to determine whether the display device 120 is turned ON and emitting audio signals consistent with operation in an active state.
Additionally or alternatively, a sensor 144 may be implemented by an on-screen display detector for capturing images displayed on the display device 120 and processing regions of interest in the displayed image. The regions of interest may correspond, for example, to a broadcast channel associated with the currently displayed program, a broadcast time associated with the currently displayed program, a viewing time associated with the currently displayed program, etc. Example on-screen display detectors are disclosed by Nelson, et al. in U.S. Provisional Patent Application Ser. No. 60/523,444 filed on Nov. 19, 2003, and Patent Cooperation Treaty Application Serial No. PCT/US04/12272 filed on Apr. 19, 2004, both of which are hereby incorporated by reference.
Additionally or alternatively, a sensor 144 could be implemented by a frequency detector to determine, for example, the channel to which the display device 120 is tuned. Additionally or alternatively, a sensor 144 could be implemented by an electromagnetic (EM) field pickup, a current sensor and/or a temperature sensor configured to detect emissions from the display device 120 indicative of the display device 120 being turned ON. Persons having ordinary skill in the art will recognize that there are a variety of sensors 144 that may be coupled with the metering unit 124 and/or the display device ON/OFF detector 128 to facilitate generation of viewing records and display device operating state data containing sufficient information to determine a set of desired ratings and/or metering results. Persons of ordinary skill in the art will also appreciate that any or all of the sensors 144 may be located separate from and/or disposed in the metering unit 124, the display device ON/OFF detector 128 and/or any combination thereof. Additionally or alternatively, any or all of the sensors 144 may be duplicated in the metering unit 124 and the display device ON/OFF detector 128 to, for example, facilitate flexible placement of the various components of the local metering system 100 to permit metering of a wide range of home entertainment systems 102.
The example home entertainment system 102 of
The example local metering system 100 of
Persons of ordinary skill in the art will appreciate that the metering unit 124 and the display device ON/OFF detector 128 may be implemented as separate devices or integrated into a single unit. Additionally or alternatively, any or all or the metering unit 124, the display device ON/OFF detector 128, or portions thereof may be integrated into the STB 108 and/or the display device 120. For example, the display device ON/OFF detector 128 could be integrated into the STB 108 such that STB 108 is able to determine whether program content being received and output is also being presented by the monitored display device 120 or corresponding information presenting device. Such display device operating state information, coupled with operating state information concerning the STB 108 itself, could be transmitted back to the broadcast provider responsible for the broadcast source 104 via a back-channel connection 168 to allow the broadcast provider to, for example, monitor consumption of program content output by the STB 108 and presented by the display device 120 in the absence of the metering unit 124.
A block diagram of an example display device ON/OFF detector 200 that may be used to implement the display device ON/OFF detector 128 of
The display device ON/OFF detector 200 includes one or more audio processors 228 to process the audio signal 230 output by the audio sensor 204. The audio processors 228 are configured to determine characteristics of the input audio signal 230 and/or information included in the input audio signal 230 that may be used to ascertain whether the monitored information presenting device is turned ON and operating in an active state. Examples of audio processors 228 are discussed in greater detail below in connection with
The example display device ON/OFF detector 200 also includes one or more video processors 232 to process the video signal 234 output by the video sensor 208. Similar to the audio processors 228, the video processors 232 are configured to determine characteristics of the input video signal 234 and/or information included in the input video signal 234 that may be used to ascertain whether the information presenting device monitored by the display device ON/OFF detector 200 (e.g., the display device 120) is turned ON and operating in an active state. Examples of video processors 232 are discussed in greater detail below in connection with
The example display device ON/OFF detector 200 also includes one or more emission processors 236 to process the emission signals 238 output by the emission sensor 212. Similar to the audio processors 228 and the video processors 232, the emission processors 236 are configured to determine characteristics of the input emission signals 238 and/or information included in the input emission signals 238 that may be used to ascertain whether the information presenting device monitored by the display device ON/OFF detector 200 (e.g., the display device 120) is turned ON and operating in an active state. Examples of emission processors 236 are discussed in greater detail below in connection with
The example display device ON/OFF detector 200 of
An example set of audio processors 228 is shown in
The example set of audio engines 228 of
The example audio signature processor 316 of
The example audio gain level processor 320 of
The example horizontal sync audio processor 324 of
The example quiet time detector 328 of
The example fan noise detector 332 of
The example audio source detector 336 of
As shown in the example of
An example set of video processors 232 is shown in
The example set of video engines 232 of
The example display activity detector 416 of
As shown in the example of
An example set of emissions processors 236 is shown in
The example set of emissions processors 236 of
The example current detector 516 of
The example temperature detector 520 of
The example remote control activity detector 524 of
The example people meter activity detector 528 of
As shown in the example of
A first example audio processor system 600 that may be used to implement any or all of the audio code detector 312, the audio signature processor 316, the audio gain level processor 320, the horizontal sync audio processor 324, the quiet time detector 328 and/or the fan noise processor 332 of
The processor 612 may be configured to control the gain/attenuation provided by the VGA 616 based on any known automatic gain control (AGC) algorithm. For example, an AGC algorithm implemented by the processor 612 may control the VGA 616 to yield an output of the A/D converter 604 having an amplitude, variance, standard deviation, energy, etc. within a predetermined range. The predetermined range is typically derived from the characteristics of the particular A/D converter 604 to result in a gain/attenuation of the VGA 616 that appropriately fills the dynamic range of the A/D converter 604.
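For purposes of illustration only, the gain-control behavior described above may be sketched as follows. The target range, step size, and gain limits below are assumptions chosen for the sketch, not values taken from this disclosure.

```python
# Illustrative AGC sketch: adjust a gain so that the measured level of the
# digitized audio falls within a target range. All constants here are
# assumptions for illustration.

def agc_step(gain, level, target_low=0.25, target_high=0.5,
             step=1.1, min_gain=1.0, max_gain=1000.0):
    """Return an updated gain given the level measured at the current gain."""
    if level < target_low:
        return min(gain * step, max_gain)   # too quiet: amplify more
    if level > target_high:
        return max(gain / step, min_gain)   # too loud: attenuate
    return gain                             # in range: hold steady

def converge(gain, raw_level, max_iters=200):
    """Iterate the AGC until the scaled level sits inside the target range
    (or the gain rails at a limit). raw_level stands in for the amplitude
    of the sensed audio before amplification."""
    for _ in range(max_iters):
        new_gain = agc_step(gain, raw_level * gain)
        if new_gain == gain:
            break
        gain = new_gain
    return gain
```

Note that a silent input (a raw level of zero) drives the gain to its maximum; this is the property exploited by the audio gain level processor 320 when inferring the operating state of the monitored device.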
In addition to implementing the AGC algorithm, the processor 612 may also be configured to execute machine readable instructions to implement one or more of the audio code detector 312, the audio signature processor 316, the audio gain level processor 320, the horizontal sync audio processor 324, the quiet time detector 328 and/or the fan noise processor 332. Such machine readable instructions are discussed in greater detail below in connection with
A second example audio processor system 700 that may be used to implement any or all of the audio code detector 312, the audio signature processor 316, the audio gain level processor 320, the horizontal sync audio processor 324, the quiet time detector 328, the fan noise processor 332 and/or the audio source detector 336 of
The audio processor system 700 also includes a second A/D converter 704B to sample the audio signal 230B output by the audio sensor 204B and convert the audio signal 230B to a digital format for processing by the processor 712. Additionally, the audio processor system 700 includes a second VGA 716B which may amplify or attenuate, as needed, the audio signal 230B so that the audio signal 230B appropriately fills the dynamic range of the A/D converter 704B to yield a desired bit resolution at the output of the A/D converter 704B.
The processor 712 may be configured to control the gain/attenuation provided by the VGAs 716A-B based on any known AGC algorithm as discussed above in connection with
An example video processor system 800 that may be used to implement any or all of the visible light rhythm processor 412 and/or the display activity detector 416 of
The processor 812 may be configured to execute machine readable instructions to implement one or more of the visible light rhythm processor 412 and/or the display activity detector 416. Such machine readable instructions are discussed in greater detail below in connection with
An example EM field processor system 900 that may be used to implement the EM field detector 512 of
The EM field processor system 900 includes an A/D converter 904 to sample the EM field signal 532 output by the emission sensor 212 and convert the EM field signal 532 to a digital format for processing by the processor 912. The processor 912 may be configured to execute machine readable instructions to implement the EM field detector 512. Such machine readable instructions are discussed in greater detail below in connection with
An example current measurement processor system 1000 that may be used to implement the current detector 516 of
The current measurement processor system 1000 includes an A/D converter 1004 to sample the current measurement signal 536 output by the emission sensor 212 and convert the current measurement signal 536 to a digital format for processing by the processor 1012. The processor 1012 may be configured to execute machine readable instructions to implement the current detector 516. Such machine readable instructions are discussed in greater detail below in connection with
An example temperature processor system 1100 that may be used to implement the temperature detector 520 of
The temperature processor system 1100 may also include a second emission sensor 212B implemented, for example, as a temperature sensor 212B. The second emission sensor 212B may be positioned to measure, for example, the ambient temperature of the room in which the monitored display device 120 is located. In the example of
The temperature processor system 1100 includes a first A/D converter 1104A to sample the temperature signal 540A output by the emission sensor 212A and convert the temperature signal 540A to a digital format for processing by the processor 1112. The temperature processor system 1100 also includes a second A/D converter 1104B to sample the temperature signal 540B output by the emission sensor 212B and convert the temperature signal 540B to a digital format for processing by the processor 1112. The processor 1112 may be configured to execute machine readable instructions to implement the temperature detector 520. Such machine readable instructions are discussed in greater detail below in connection with
Three example remote device activity processor systems 1200, 1250 and 1280, any or all of which may be used to implement the remote control activity detector 524 and/or the people meter activity detector 528 of
The first example remote device activity processor system 1200 includes an IR receiver 1204 to receive IR signals detected by the IR detector 212. The IR receiver 1204 generates corresponding received control signals from the IR signals and outputs the received control signals for processing by the processor 1212. The second example remote device activity processor system 1250 includes a wireless receiver 1254 to receive RF signals detected by the antenna 212. The wireless receiver 1254 generates corresponding received control signals from the RF signals and outputs the received control signals for processing by the processor 1262. The third example remote device activity processor system 1280 includes an ultrasonic receiver 1284 to receive ultrasonic signals detected by the ultrasonic transducer 212. The ultrasonic receiver 1284 generates corresponding received control signals from the ultrasonic signals and outputs the received control signals for processing by the processor 1292. The processors 1212, 1262 and 1292 may be configured to execute machine readable instructions to implement the remote control activity detector 524 and/or the people meter activity detector 528. Such machine readable instructions are discussed in greater detail below in connection with
Flowcharts representative of example machine readable instructions that may be executed to implement the audio processors 228 of
Example machine readable instructions 1300 that may be executed to implement the audio code detector 312 of
After convergence of the AGC algorithm at block 1304, control proceeds to block 1308 at which the audio code detector 312 checks for audio codes present in the received audio signal. Any appropriate technique for decoding audio codes embedded in a content presentation may be used, such as one or more of those discussed above in connection with the description of
First example machine readable instructions 1400 that may be executed to implement the audio signature processor 316 of
After convergence of the AGC algorithm at block 1404, control proceeds to block 1408 at which the audio signature processor 316 generates an audio signature from the received audio signal. Any appropriate technique for generating audio signatures based on an audio signal corresponding to a content presentation may be used, such as one or more of those discussed above in connection with the description of
Second example machine readable instructions 1500 that may be executed to implement the audio signature processor 316 of
After convergence of the AGC algorithm at block 1504, control proceeds to block 1508 at which the audio signature processor 316 generates an audio signature from the received audio signal. Any appropriate technique for generating audio signatures based on an audio signal corresponding to a content presentation may be used, such as one or more of those discussed above in connection with the description of
Control then proceeds to block 1512 at which the audio signature processor 316 determines whether the audio signature generated at block 1508 may be characterized as “hissy.” Typically, an audio signal corresponding to audible program content exhibits significant peak energy fluctuations caused by the varying pressure wave associated with the audio emissions. Conversely, an audio signal corresponding to background noise or silence exhibits relatively small peak energy fluctuations about an average energy value resulting in sound typically characterized as “hissy.” Thus, the audio signature processor 316 may evaluate whether the audio signature generated at block 1508 is hissy to determine whether a monitored information presenting device is emitting an audio signal corresponding to audible program content. In an example hissy audio signature detection algorithm, the audio signature processor 316 may compute a running average of peak energy values of the audio signal. Then, if a particular peak energy value is within some region about this running average, the audio signature processor 316 may determine that a possible hissy state has been entered. If such a possible hissy state exists for a period of time (e.g., three seconds), the audio signature processor 316 may decide that a definite hissy state has been entered and declare the generated audio signature to be hissy. Persons of ordinary skill in the art will appreciate that many techniques may be used to determine whether an audio signature is hissy or, in other words, corresponds to silence or background noise. For example, the average time between audio energy peaks or the variability of the standard deviation of the audio energy peaks may be used to determine whether the audio signal energy fluctuates sufficiently to indicate the presence of an audio content presentation or is relatively static and, therefore, indicative of silence or background noise.
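The hissy-signature evaluation described above may be sketched, for purposes of illustration, as follows. The band width, smoothing factor, and peak rate are assumptions; the three-second dwell follows the example in the text.

```python
# Illustrative "hissy" test: maintain a running average of peak energy
# values; if successive peaks stay inside a narrow band around that
# average long enough, declare the signature hissy (likely silence or
# background noise). All constants are assumptions for illustration.

def is_hissy(peak_energies, peaks_per_second=10, band=0.1,
             alpha=0.05, dwell_seconds=3.0):
    """True if the peak-energy sequence looks like static background noise."""
    if not peak_energies:
        return False
    avg = peak_energies[0]
    run = 0                                    # consecutive in-band peaks
    needed = int(dwell_seconds * peaks_per_second)
    for p in peak_energies:
        if abs(p - avg) <= band * avg:
            run += 1                           # possible hissy state
            if run >= needed:
                return True                    # definite hissy state
        else:
            run = 0                            # large fluctuation: program-like
        avg = (1 - alpha) * avg + alpha * p    # update running average
    return False
```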
Returning to
Example machine readable instructions 1600 that may be executed to implement the audio gain level processor 320 of
After convergence of the AGC algorithm at block 1604, control proceeds to block 1608 at which the audio gain level processor 320 examines the steady-state audio gain level to which the AGC algorithm converged at block 1604. In particular, the audio gain level processor 320 determines whether the steady-state audio gain level exceeds a predetermined threshold indicative of the AGC algorithm converging to a large, possibly maximum, gain. Such a large/maximum convergence would occur if the input audio signal corresponded to silence or background noise. If at block 1612 the audio gain level processor 320 determines that the steady-state audio gain level achieved at block 1604 does not exceed the predetermined threshold, control proceeds to block 1616 at which the audio gain level processor 320 determines that the monitored information presenting device is probably ON. The audio gain level processor 320 makes such a determination because the steady-state gain level indicates that an audio signal emitted from the monitored information presenting device was probably detected and provided as input to the audio gain level processor 320. If, however, at block 1612 the steady-state audio gain level exceeds the threshold, control proceeds to block 1620 at which the audio gain level processor 320 determines that the monitored information presenting device is probably OFF. Here, the audio gain level processor 320 uses the large/maximum steady-state audio gain to decide that the monitored information presenting device is probably not emitting an audio signal corresponding to presented program content and, therefore, is probably turned OFF. In any case, after the audio gain level processor 320 makes a determination at block 1616 or block 1620, execution of the machine readable instructions 1600 ends.
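The gain-level test described above may be sketched as follows. An ideal AGC converges to a gain inversely proportional to the input level, railing at a maximum for silence; comparing the converged gain against a threshold then yields the probable operating state. The target level, maximum gain, and threshold are assumptions for illustration.

```python
# Illustrative gain-level test: model the converged AGC gain and compare
# it to a threshold. All constants are assumptions for illustration.

def steady_state_gain(rms_level, target=0.3, max_gain=1000.0):
    """Ideal converged AGC gain for an input with the given RMS level."""
    if rms_level <= 0.0:
        return max_gain              # silence: the AGC rails at maximum gain
    return min(target / rms_level, max_gain)

def probably_on(rms_level, gain_threshold=900.0):
    """True if the converged gain suggests audible program content."""
    return steady_state_gain(rms_level) <= gain_threshold
```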
Example machine readable instructions 1700 that may be executed to implement the horizontal sync audio processor 324 of
After convergence of the AGC algorithm at block 1704, control proceeds to block 1708 at which the horizontal sync audio processor 324 examines the frequency spectrum of the input audio signal for characteristics corresponding to audio emitted by a fly-back transformer. For example, and as discussed above in connection with
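Although the specific spectral test is not reproduced above, one illustrative sketch looks for tonal energy near the NTSC horizontal scan rate of approximately 15.734 kHz, a tone commonly emitted by a CRT fly-back transformer. The use of the Goertzel algorithm, the sample rate, and the energy fraction below are assumptions for illustration.

```python
# Illustrative fly-back tone test: measure the fraction of total signal
# energy concentrated at the horizontal scan frequency.
import math

def goertzel_power(samples, fs, f0):
    """Power of the input at frequency f0 via the Goertzel algorithm."""
    k = 2.0 * math.cos(2.0 * math.pi * f0 / fs)
    s1 = s2 = 0.0
    for x in samples:
        s0 = x + k * s1 - s2
        s2, s1 = s1, s0
    return s1 * s1 + s2 * s2 - k * s1 * s2

def flyback_present(samples, fs=48000.0, f0=15734.0, min_fraction=0.3):
    """True if the scan-rate tone carries a large fraction of signal energy."""
    total = sum(x * x for x in samples)
    if total == 0.0:
        return False
    tone = goertzel_power(samples, fs, f0)
    # For a pure tone at f0, tone ~= (n/2)**2 and total ~= n/2, so the
    # normalized fraction below approaches one.
    n = len(samples)
    return tone / (0.5 * n * total) > min_fraction
```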
Example machine readable instructions 1800 that may be executed to implement the quiet time detector 328 of
After convergence of the AGC algorithm at block 1804, control proceeds to block 1808 at which the quiet time detector 328 performs a quiet time detector algorithm to determine whether the audio signal includes any periods of silence indicative of, for example, a channel change operation, a transition between broadcast program content and a commercial, etc. Any appropriate technique for detecting intervals of quiet time based on an audio signal corresponding to a content presentation may be used, such as the technique discussed above in connection with the description of
If, however, at block 1812 a quiet time interval is not detected, control proceeds to block 1820 at which the quiet time detector 328 determines whether a quiet time interval was detected within a predetermined preceding interval of time. If at block 1820 the quiet time detector 328 determines that a quiet time interval was detected within the preceding interval of time, control proceeds to block 1816 at which the quiet time detector 328 determines that the monitored information presenting device is probably ON. The quiet time detector 328 makes such a determination because the audio signal emitted from the monitored information presenting device recently included quiet time intervals probably indicative of short interruptions of program content presented by an actively-operating information presenting device. If, however, at block 1820 the quiet time detector 328 determines that a quiet time interval was also not detected within the predetermined preceding interval of time, control proceeds to block 1828 at which the quiet time detector 328 determines that the monitored information presenting device is probably OFF. Here, the quiet time detector 328 uses the lack of a quiet time interval within the predetermined period of time to decide that the monitored information presenting device is probably not emitting an audio signal corresponding to presented program content and, therefore, is probably turned OFF. In any case, after the quiet time detector 328 makes a determination at block 1816 or block 1828, execution of the machine readable instructions 1800 ends.
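The quiet-time logic described above may be sketched as follows. Per-frame energies are scanned for short runs of near-silence (as at a channel change or a cut to a commercial), and the device is reported probably ON if such a gap occurred within the look-back window. The frame rate, silence threshold, gap bounds, and window length are assumptions for illustration.

```python
# Illustrative quiet-time test over a buffer of per-frame audio energies.
# All constants are assumptions for illustration.

def probably_on_from_quiet_time(frame_energies, frames_per_second=10,
                                silence_level=0.01, min_gap_s=0.2,
                                max_gap_s=2.0, lookback_s=300.0):
    """True if a brief silence gap appears within the look-back window."""
    min_gap = int(min_gap_s * frames_per_second)
    max_gap = int(max_gap_s * frames_per_second)
    lookback = int(lookback_s * frames_per_second)
    last_gap_end = None
    run = 0                                   # length of current silent run
    for i, e in enumerate(frame_energies):
        if e < silence_level:
            run += 1
        else:
            if min_gap <= run <= max_gap:
                last_gap_end = i              # a plausible quiet interval ended
            run = 0
    if min_gap <= run <= max_gap:             # quiet interval open at buffer end
        last_gap_end = len(frame_energies)
    if last_gap_end is None:
        return False                          # no quiet time: probably OFF
    return len(frame_energies) - last_gap_end <= lookback
```

Runs of silence longer than the maximum gap are deliberately ignored, since sustained silence is better characterized by the gain-level test.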
Example machine readable instructions 1900 that may be executed to implement the fan noise detector 332 of
After convergence of the AGC algorithm at block 1904, control proceeds to block 1908 at which the fan noise detector 332 checks for the presence of fan noise in the received audio signal. Fan noise from an operating information presenting device typically exhibits tonal energy in the frequency range between 300 Hz and 5 kHz. As such, any known technique for detecting tonal audio signals in this (or any other appropriate) frequency range may be used at block 1908. If at block 1912 the fan noise detector 332 detects the presence of fan noise, control proceeds to block 1916 at which the fan noise detector 332 determines the monitored information presenting device is probably ON. The fan noise detector 332 makes such a determination because the presence of fan noise indicates that the monitored information presenting device is probably operating and presenting program content. If, however, at block 1912 the fan noise detector 332 does not detect the presence of fan noise, control proceeds to block 1920 at which the fan noise detector 332 determines the monitored information presenting device is probably OFF. Here, the fan noise detector 332 uses the lack of fan noise to decide that the monitored information presenting device is probably not operating and, therefore, is probably turned OFF. In any case, after the fan noise detector 332 makes a determination at block 1916 or block 1920, execution of the machine readable instructions 1900 ends.
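One illustrative way to test for tonal energy in the 300 Hz to 5 kHz range is to scan candidate frequencies with the Goertzel algorithm and report fan noise if one frequency strongly dominates the band average. The scan step and dominance ratio are assumptions for illustration.

```python
# Illustrative fan-noise test: Goertzel scan over the 300 Hz - 5 kHz band.
import math

def goertzel_power(samples, fs, f0):
    """Power of the input at frequency f0 via the Goertzel algorithm."""
    k = 2.0 * math.cos(2.0 * math.pi * f0 / fs)
    s1 = s2 = 0.0
    for x in samples:
        s0 = x + k * s1 - s2
        s2, s1 = s1, s0
    return s1 * s1 + s2 * s2 - k * s1 * s2

def has_fan_tone(samples, fs=48000.0, step_hz=50.0, ratio=20.0):
    """True if one scanned frequency in 300 Hz-5 kHz dominates the band."""
    count = int((5000.0 - 300.0) / step_hz) + 1
    powers = [goertzel_power(samples, fs, 300.0 + i * step_hz)
              for i in range(count)]
    mean_p = sum(powers) / len(powers)
    return max(powers) > ratio * (mean_p + 1e-9)   # guard for silent input
```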
Example machine readable instructions 2000 that may be executed to implement the audio source detector 336 of
After convergence of the AGC algorithms at block 2004, control proceeds to block 2008 at which the audio source detector 336 performs a source detection algorithm to determine the source of the input audio signals. Any appropriate technique for audio source detection may be used, such as one or more of those discussed above in connection with the description of
Example machine readable instructions 2100 that may be executed to implement the visible light rhythm processor 412 of
The machine readable instructions 2100 begin execution at block 2104 at which the visible light rhythm processor 412 determines the intensity of the light detected by the video sensor by sampling the signal provided by the video sensor. Next, control proceeds to block 2108 at which the visible light rhythm processor 412 examines the light intensities, for example, over a predetermined interval of time. If at block 2112 the visible light rhythm processor 412 determines that the light intensities indicate that the monitored display is active, control proceeds to block 2116 at which the visible light rhythm processor 412 determines that the monitored information presenting device is probably ON. The visible light rhythm processor 412 makes such a determination, for example, by comparing the light intensities to a predetermined threshold corresponding to a light intensity visible to the human eye and, therefore, probably indicative of the information presenting device displaying active program content. If, however, at block 2112 the visible light rhythm processor 412 determines that the light intensities do not indicate that the monitored display is active, control proceeds to block 2120 at which the visible light rhythm processor 412 determines that the monitored information presenting device is probably OFF. Here, the lack of detected light intensities which would be visible to the human eye probably indicates that the monitored information presenting device is inactive and, therefore, probably turned OFF. In any case, after the visible light rhythm processor 412 makes a determination at block 2116 or block 2120, execution of the machine readable instructions 2100 ends.
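The comparison at blocks 2108 through 2120 can be sketched as a threshold test over the light intensity samples collected during the examination interval. The function name and the use of the maximum sample are illustrative assumptions; the visibility threshold is the predetermined calibration value described in the text.

```python
def display_active(intensity_samples, visible_threshold):
    """Illustrative sketch of the visible light decision (blocks 2108-2120).

    intensity_samples: light intensities sampled from the video sensor over
    the predetermined examination interval.
    visible_threshold: predetermined intensity corresponding to light
    visible to the human eye.
    """
    peak = max(intensity_samples)
    return "probably ON" if peak >= visible_threshold else "probably OFF"
```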
Example machine readable instructions 2200 that may be executed to implement the display activity detector 416 of
The machine readable instructions 2200 begin execution at block 2204 at which the display activity detector 416 captures video frames based on the video signal (e.g., the video signal 234 of
Returning to
The display activity detector 416 may be configured to increase the confidence of the OFF decision by examining, for example, the color of the extracted region. If the color of the extracted region is a uniform dark color (e.g., black), the display activity detector 416 may determine that the monitored display is more likely turned OFF than, for example, displaying a paused video image. In any case, after the display activity detector 416 makes a determination at block 2224 or block 2228, execution of the machine readable instructions 2200 ends.
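The confidence-increasing color check described above can be sketched as a uniformity-and-darkness test over the pixel intensities of the extracted region. The function name and the two tolerance values are hypothetical; the text specifies only that a uniform dark (e.g., black) region favors an OFF decision over, say, a paused image.

```python
def region_uniformly_dark(pixel_intensities, dark_level=16, uniformity=8):
    """Illustrative check supporting the OFF decision: the extracted region
    is a uniform dark color when every pixel is dim and the spread between
    the brightest and darkest pixel is small (tolerances are assumed)."""
    brightest = max(pixel_intensities)
    darkest = min(pixel_intensities)
    return brightest <= dark_level and (brightest - darkest) <= uniformity
```

A paused video frame would typically fail this test (bright or varied pixels), so the OFF decision is reinforced only when the screen appears blank.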
Example machine readable instructions 2300 that may be executed to implement the electromagnetic (EM) field detector 512 of
If at block 2308 the EM field detector 512 determines that the sampled EM field signal exceeds the threshold, control proceeds to block 2312 at which the EM field detector 512 determines the monitored information presenting device is ON. The EM field detector 512 makes such a determination because the presence of an EM field exceeding the predetermined threshold indicates that the monitored information presenting device is turned ON and operating in an active mode. If, however, at block 2308 the EM field detector 512 determines that the EM field signal does not exceed the threshold, control proceeds to block 2316 at which the EM field detector 512 determines the monitored information presenting device is OFF. Here, the EM field detector 512 uses the lack of a significant EM field to decide that the monitored information presenting device is not operating in an active mode and, therefore, is turned OFF. In any case, after the EM field detector 512 makes a determination at block 2312 or block 2316, execution of the machine readable instructions 2300 ends.
Example machine readable instructions 2400 that may be executed to implement the current detector 516 of
If at block 2408 the current detector 516 determines that the sampled current signal exceeds the threshold, control proceeds to block 2412 at which the current detector 516 determines the monitored information presenting device is ON. The current detector 516 makes such a determination because a current signal exceeding the predetermined threshold indicates that the monitored information presenting device is turned ON and drawing current from the associated power source. If, however, at block 2408 the current detector 516 determines that the current signal does not exceed the threshold, control proceeds to block 2416 at which the current detector 516 determines the monitored information presenting device is OFF. Here, the current detector 516 uses the lack of a significant current signal to decide that the monitored information presenting device is not operating in an active mode and, therefore, is turned OFF. In any case, after the current detector 516 makes a determination at block 2412 or block 2416, execution of the machine readable instructions 2400 ends.
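The EM field decision (blocks 2308 through 2316) and the current decision (blocks 2408 through 2416) share the same form: compare a sampled signal against a predetermined threshold. A minimal common sketch, with an illustrative function name:

```python
def threshold_detector(sample, threshold):
    """Illustrative sketch shared by the EM field detector (blocks
    2308-2316) and the current detector (blocks 2408-2416): a sampled
    signal exceeding the predetermined threshold indicates the device is
    ON; otherwise it is OFF."""
    return "ON" if sample > threshold else "OFF"
```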
Example machine readable instructions 2500 that may be executed to implement the temperature detector 520 of
After sampling of the respective temperature signals, control then proceeds to block 2512 at which the temperature detector 520 compares the monitored information presenting device's temperature to the ambient air temperature, possibly offset by a threshold to improve ON/OFF detection reliability. If at block 2512 the temperature detector 520 determines that the monitored information presenting device's temperature sufficiently exceeds the ambient air temperature (based on the additional threshold amount), control proceeds to block 2516 at which the temperature detector 520 determines that the monitored information presenting device is ON. The temperature detector 520 makes such a determination because the monitored information presenting device's temperature indicates that heat is being emitted and, thus, that the device is turned ON. If, however, at block 2512 the monitored information presenting device's temperature does not sufficiently exceed the ambient air temperature, control proceeds to block 2520 at which the temperature detector 520 determines the monitored information presenting device is OFF. Here, the temperature detector 520 uses the lack of a significant heat emission to decide that the monitored information presenting device is not operating in an active mode and, therefore, is turned OFF. In any case, after the temperature detector 520 makes a determination at block 2516 or block 2520, execution of the machine readable instructions 2500 ends. Persons of ordinary skill in the art will appreciate that the processing at block 2508 may be eliminated to reduce the number of required emission sensors if, for example, the threshold at block 2512 is modified to incorporate an expected/average ambient air temperature.
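The comparison at block 2512, and the single-sensor variant that folds an expected ambient temperature into the threshold, can be sketched as follows. Function names and the margin parameter are illustrative assumptions.

```python
def temperature_decision(device_temp, ambient_temp, margin):
    """Illustrative sketch of block 2512: the device is ON when it runs
    sufficiently hotter than ambient; margin is the reliability offset."""
    return "ON" if device_temp > ambient_temp + margin else "OFF"

def temperature_decision_no_ambient(device_temp, expected_ambient, margin):
    """Variant noted in the text: drop the ambient sensor and incorporate
    an expected/average ambient temperature into the threshold."""
    return "ON" if device_temp > expected_ambient + margin else "OFF"
```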
Example machine readable instructions 2600 that may be executed to implement the remote control activity detector 524 and/or the people meter activity detector 528 of
The machine readable instructions 2600 begin execution at block 2604 at which the remote input device activity detector 524/528 initializes/configures a receiver (e.g., the IR receiver 1204 of
If, however, at block 2612 control signals corresponding to known and/or unknown remote input device activity are not detected, control proceeds to block 2620 at which the remote input device activity detector 524/528 determines the monitored information presenting device is probably OFF. Here, the remote input device activity detector 524/528 uses the lack of received control signals corresponding to known and/or unknown remote input device activity to decide that the monitored information presenting device is not being controlled and/or responded to by a user and, therefore, is probably turned OFF. In any case, after the remote input device activity detector 524/528 makes a determination at block 2616 or block 2620, execution of the machine readable instructions 2600 ends.
Example machine readable instructions 2700 that may be executed to implement the decision processor 244 of
The machine readable instructions 2700 begin execution at block 2704 at which the decision processor 244 samples the audio decision outputs 246 (also called audio decision metrics 246) generated by the audio processors 228. Next, control proceeds to block 2708 at which the decision processor 244 samples the video decision outputs 248 (also called video decision metrics 248) generated by the video processors 232. Control then proceeds to block 2712 at which the decision processor 244 samples the emission decision outputs 250 (also called emission decision metrics 250) generated by the emission processors 236. Then, after all the decision metrics have been sampled, control proceeds to block 2716 at which the decision processor 244 weights the decision metrics by, for example, scaling or assigning a value to each decision metric corresponding to the confidence associated with the decision metric. For example, and referring to the examples of
Next, control proceeds to block 2720 at which the decision processor 244 combines all of the individual decision metrics (e.g., via addition) to determine a weighted majority vote of the individual decisions made by the audio processors 228, the video processors 232 and the emission processors 236. Then, if at block 2724 the majority vote favors a decision that the monitored information presenting device is ON (e.g., if the weighted majority vote results in a positive value), control proceeds to block 2728 at which the decision processor 244 declares the monitored information presenting device to be ON. However, if at block 2724 the majority vote favors a decision that the monitored information presenting device is OFF (e.g., if the majority vote results in a negative value), control proceeds to block 2732 at which the decision processor 244 declares the monitored information presenting device to be OFF. In the case of a tie, the decision processor 244 may be configured, for example, to favor either a decision of ON or OFF depending on the particular monitored information presenting device, to produce an output indicating that the state of the monitored information presenting device is indeterminate, etc. In any case, after the decision processor 244 makes a determination at block 2728 or block 2732, execution of the machine readable instructions 2700 ends.
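The weighted majority vote of blocks 2716 through 2732 can be sketched by representing each decision metric as a signed vote scaled by its confidence weight, summing, and checking the sign. The (vote, weight) pair representation and the function name are illustrative conventions; the positive-means-ON, negative-means-OFF, configurable-tie behavior follows the text.

```python
def weighted_majority_vote(decisions, tie_break="indeterminate"):
    """Illustrative sketch of blocks 2716-2732.

    decisions: iterable of (vote, weight) pairs, with vote +1 for an ON
    decision and -1 for an OFF decision, and weight the confidence
    assigned to that audio/video/emission decision metric.
    tie_break: configurable result when the weighted votes cancel.
    """
    total = sum(vote * weight for vote, weight in decisions)
    if total > 0:
        return "ON"      # weighted majority favors ON
    if total < 0:
        return "OFF"     # weighted majority favors OFF
    return tie_break
```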
Turning to the example of
Next, after processing at blocks 2806, 2810 and 2812 completes, control proceeds to block 2818 at which the decision processor 244 may further weight the individual audio, video and emission metric weighted majority votes based on, for example, the confidence and/or importance associated with the particular type of metric. Control then proceeds to block 2822 at which the decision processor 244 combines the resulting individual audio, video and emission metric weighted majority votes to determine an overall majority vote. Then, control proceeds to block 2724 and blocks subsequent thereto as discussed above in connection with
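The two-stage combination described above (per-type weighted majority votes at blocks 2806 through 2812, type-level weighting at block 2818, and an overall vote at block 2822) can be sketched as follows. All names, the (vote, weight) pair convention and the dictionary of type weights are illustrative assumptions.

```python
def hierarchical_vote(audio, video, emission, type_weights):
    """Illustrative two-stage vote (blocks 2806-2822). Each input is a
    list of (vote, weight) pairs with vote +1 for ON and -1 for OFF;
    type_weights maps "audio"/"video"/"emission" to the confidence or
    importance assigned to that metric type."""
    def subtotal(metrics):
        # Per-type weighted majority vote.
        return sum(vote * weight for vote, weight in metrics)

    overall = (type_weights["audio"] * subtotal(audio)
               + type_weights["video"] * subtotal(video)
               + type_weights["emission"] * subtotal(emission))
    if overall > 0:
        return "ON"
    if overall < 0:
        return "OFF"
    return "indeterminate"
```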
Persons of ordinary skill in the art will appreciate that the examples of
The system 2900 of the instant example includes a processor 2912 such as a general purpose programmable processor. The processor 2912 includes a local memory 2914, and executes coded instructions 2916 present in the local memory 2914 and/or in another memory device. The processor 2912 may execute, among other things, the machine readable instructions represented in
The processor 2912 is in communication with a main memory including a volatile memory 2918 and a non-volatile memory 2920 via a bus 2922. The volatile memory 2918 may be implemented by Static Random Access Memory (SRAM), Synchronous Dynamic Random Access Memory (SDRAM), Dynamic Random Access Memory (DRAM), RAMBUS Dynamic Random Access Memory (RDRAM) and/or any other type of random access memory device. The non-volatile memory 2920 may be implemented by flash memory and/or any other desired type of memory device. Access to the main memory 2918, 2920 is typically controlled by a memory controller (not shown) in a conventional manner.
The computer 2900 also includes a conventional interface circuit 2924. The interface circuit 2924 may be implemented by any type of well known interface standard, such as an Ethernet interface, a universal serial bus (USB), and/or a third generation input/output (3GIO) interface.
One or more input devices 2926 are connected to the interface circuit 2924. The input device(s) 2926 permit a user to enter data and commands into the processor 2912. The input device(s) can be implemented by, for example, a keyboard, a mouse, a touchscreen, a track-pad, a trackball, an isopoint and/or a voice recognition system.
One or more output devices 2928 are also connected to the interface circuit 2924. The output devices 2928 can be implemented, for example, by display devices (e.g., a liquid crystal display, a cathode ray tube display (CRT)), by a printer and/or by speakers. The interface circuit 2924, thus, typically includes a graphics driver card.
The interface circuit 2924 also includes a communication device such as a modem or network interface card to facilitate exchange of data with external computers via a network (e.g., an Ethernet connection, a digital subscriber line (DSL), a telephone line, coaxial cable, a cellular telephone system, etc.).
The computer 2900 also includes one or more mass storage devices 2930 for storing software and data. Examples of such mass storage devices 2930 include floppy disk drives, hard disk drives, compact disk drives and digital versatile disk (DVD) drives. The mass storage device 2930 may be used, for example, to store any or all of the machine readable instructions 1300, 1400, 1500, 1600, 1700, 1800, 1900, 2000, 2100, 2200, 2300, 2400, 2500, 2600, 2700 and/or 2800. Additionally, the volatile memory 2918 may be used, for example, to store any or all of the audio decision metrics 246, the video decision metrics 248 and/or the emission decision metrics 250.
At least some of the above described example methods and/or apparatus are implemented by one or more software and/or firmware programs running on a computer processor. However, dedicated hardware implementations including, but not limited to, application specific integrated circuits (ASICs), programmable logic arrays (PLAs) and other hardware devices can likewise be constructed to implement some or all of the example methods and/or apparatus described herein, either in whole or in part. Furthermore, alternative software implementations including, but not limited to, distributed processing or component/object distributed processing, parallel processing, or virtual machine processing can also be constructed to implement the example methods and/or apparatus described herein.
It should also be noted that the example software and/or firmware implementations described herein are optionally stored on a tangible storage medium, such as: a magnetic medium (e.g., a magnetic disk or tape); a magneto-optical or optical medium such as an optical disk; or a solid state medium such as a memory card or other package that houses one or more read-only (non-volatile) memories, random access memories, or other re-writable (volatile) memories; or a signal containing computer instructions. A digital file attached to e-mail or other information archive or set of archives is considered a distribution medium equivalent to a tangible storage medium. Accordingly, the example software and/or firmware described herein can be stored on a tangible storage medium or distribution medium such as those described above or successor storage media.
Additionally, although this patent discloses example systems including software or firmware executed on hardware, it should be noted that such systems are merely illustrative and should not be considered as limiting. For example, it is contemplated that any or all of these hardware and software components could be embodied exclusively in hardware, exclusively in software, exclusively in firmware or in some combination of hardware, firmware and/or software. Accordingly, while the above specification described example systems, methods and articles of manufacture, persons of ordinary skill in the art will readily appreciate that the examples are not the only way to implement such systems, methods and articles of manufacture. Therefore, although certain example methods, apparatus and articles of manufacture have been described herein, the scope of coverage of this patent is not limited thereto. On the contrary, this patent covers all methods, apparatus and articles of manufacture fairly falling within the scope of the appended claims either literally or under the doctrine of equivalents.
This patent arises from a continuation of U.S. patent application Ser. No. 17/164,483 (now U.S. Pat. No. 11,546,579), titled “Display Device ON/OFF Detection Methods and Apparatus,” filed on Feb. 1, 2021, which is a continuation of U.S. patent application Ser. No. 16/706,280 (now U.S. Pat. No. 10,911,749), titled “Display Device ON/OFF Detection Methods and Apparatus,” filed on Dec. 6, 2019, which is a continuation of U.S. patent application Ser. No. 16/417,128 (now U.S. Pat. No. 10,506,226), titled “Display Device ON/OFF Detection Methods and Apparatus,” filed on May 20, 2019, which is a continuation of U.S. patent application Ser. No. 16/166,871 (now U.S. Pat. No. 10,306,221), titled “Display Device ON/OFF Detection Methods and Apparatus,” filed on Oct. 22, 2018, which is a continuation of U.S. patent application Ser. No. 15/958,814 (now U.S. Pat. No. 10,110,889), titled “Display Device ON/OFF Detection Methods and Apparatus,” filed on Apr. 20, 2018, which is a continuation of U.S. patent application Ser. No. 15/207,019 (now U.S. Pat. No. 9,961,342), titled “Display Device ON/OFF Detection Methods and Apparatus,” filed on Jul. 11, 2016, which is a continuation of U.S. patent application Ser. No. 14/015,664 (now U.S. Pat. No. 9,420,334), titled “Display Device ON/OFF Detection Methods and Apparatus,” filed on Aug. 30, 2013, which is a continuation of U.S. patent application Ser. No. 12/831,870 (now U.S. Pat. No. 8,526,626), titled “Display Device ON/OFF Detection Methods and Apparatus,” filed on Jul. 7, 2010, which is a continuation of U.S. patent application Ser. No. 11/576,328 (now U.S. Pat. No. 7,882,514), titled “Display Device ON/OFF Detection Methods and Apparatus,” filed on Mar. 29, 2007, which is a U.S. national stage of International Patent Application No. PCT/US2006/031960, titled “Display Device ON/OFF Detection Methods and Apparatus,” filed on Aug. 16, 2006, which claims the benefit of U.S. Provisional Application No. 60/708,557, titled “Display Device ON/OFF Detection Methods and Apparatus” and filed on Aug. 16, 2005, and U.S. Provisional Application No. 60/761,678, titled “Display Device ON/OFF Detection Methods and Apparatus” and filed on Jan. 24, 2006. Priority to U.S. Provisional Application No. 60/708,557, U.S. Provisional Application No. 60/761,678, International Application No. PCT/US2006/031960, U.S. patent application Ser. No. 11/576,328, U.S. patent application Ser. No. 12/831,870, U.S. patent application Ser. No. 14/015,664, U.S. patent application Ser. No. 15/207,019, U.S. patent application Ser. No. 15/958,814, U.S. patent application Ser. No. 16/166,871, U.S. patent application Ser. No. 16/417,128, U.S. patent application Ser. No. 16/706,280 and U.S. patent application Ser. No. 17/164,483 is hereby claimed. U.S. Provisional Application No. 60/708,557, U.S. Provisional Application No. 60/761,678, International Application No. PCT/US2006/031960, U.S. patent application Ser. No. 11/576,328, U.S. patent application Ser. No. 12/831,870, U.S. patent application Ser. No. 14/015,664, U.S. patent application Ser. No. 15/207,019, U.S. patent application Ser. No. 15/958,814, U.S. patent application Ser. No. 16/166,871, U.S. patent application Ser. No. 16/417,128, U.S. patent application Ser. No. 16/706,280 and U.S. patent application Ser. No. 17/164,483 are hereby incorporated by reference in their respective entireties.
Number | Name | Date | Kind |
---|---|---|---|
3281695 | Bass | Oct 1966 | A |
3315160 | Goodman | Apr 1967 | A |
3483327 | Schwartz | Dec 1969 | A |
3651471 | Haselwood et al. | Mar 1972 | A |
3733430 | Thompson et al. | May 1973 | A |
3803349 | Watanabe | Apr 1974 | A |
3906454 | Martin | Sep 1975 | A |
3947624 | Miyake | Mar 1976 | A |
4027332 | Wu et al. | May 1977 | A |
4044376 | Porter | Aug 1977 | A |
4058829 | Thompson | Nov 1977 | A |
4245245 | Matsumoto et al. | Jan 1981 | A |
4388644 | Ishman et al. | Jun 1983 | A |
4546382 | McKenna et al. | Oct 1985 | A |
4566030 | Nickerson et al. | Jan 1986 | A |
4574304 | Watanabe et al. | Mar 1986 | A |
4613904 | Lurie | Sep 1986 | A |
4622583 | Watanabe et al. | Nov 1986 | A |
4642685 | Roberts et al. | Feb 1987 | A |
4644393 | Smith et al. | Feb 1987 | A |
4647964 | Weinblatt | Mar 1987 | A |
4697209 | Kiewit et al. | Sep 1987 | A |
4723302 | Fulmer et al. | Feb 1988 | A |
4764808 | Solar | Aug 1988 | A |
4769697 | Gilley et al. | Sep 1988 | A |
4779198 | Lurie | Oct 1988 | A |
4800437 | Hosoya | Jan 1989 | A |
4807031 | Broughton et al. | Feb 1989 | A |
4876736 | Kiewit | Oct 1989 | A |
4885632 | Mabey et al. | Dec 1989 | A |
4907079 | Turner et al. | Mar 1990 | A |
4912552 | Allison, III et al. | Mar 1990 | A |
4931865 | Scarampi | Jun 1990 | A |
4943963 | Waechter et al. | Jul 1990 | A |
4965825 | Harvey et al. | Oct 1990 | A |
4972503 | Zurlinden | Nov 1990 | A |
5097328 | Boyette | Mar 1992 | A |
5136644 | Audebert et al. | Aug 1992 | A |
5165069 | Vitt et al. | Nov 1992 | A |
5226177 | Nickerson | Jul 1993 | A |
5235414 | Cohen | Aug 1993 | A |
5251324 | McMullan, Jr. | Oct 1993 | A |
5310222 | Chatwin et al. | May 1994 | A |
5319453 | Copriviza et al. | Jun 1994 | A |
5335277 | Harvey et al. | Aug 1994 | A |
5355161 | Bird et al. | Oct 1994 | A |
5398055 | Nonomura et al. | Mar 1995 | A |
5404161 | Douglass et al. | Apr 1995 | A |
5404172 | Berman et al. | Apr 1995 | A |
5408258 | Kolessar | Apr 1995 | A |
5425100 | Thomas | Jun 1995 | A |
5481294 | Thomas et al. | Jan 1996 | A |
5483276 | Brooks et al. | Jan 1996 | A |
5488408 | Maduzia et al. | Jan 1996 | A |
5505901 | Harvey et al. | Apr 1996 | A |
5512933 | Wheatley et al. | Apr 1996 | A |
5550928 | Lu et al. | Aug 1996 | A |
5659367 | Yuen | Aug 1997 | A |
5760760 | Helms | Jun 1998 | A |
5767922 | Zabih et al. | Jun 1998 | A |
5771307 | Lu et al. | Jun 1998 | A |
5801747 | Bedard | Sep 1998 | A |
5874724 | Cato | Feb 1999 | A |
5896554 | Itoh et al. | Apr 1999 | A |
5889548 | Chan | May 1999 | A |
5963844 | Dail | Oct 1999 | A |
6035177 | Moses et al. | Mar 2000 | A |
6049286 | Forr | Apr 2000 | A |
6124877 | Schmidt | Sep 2000 | A |
6137539 | Lownes et al. | Oct 2000 | A |
6177931 | Alexander et al. | Jan 2001 | B1 |
6184918 | Goldschmidt Iki et al. | Feb 2001 | B1 |
6272176 | Srinivasan | Aug 2001 | B1 |
6286140 | Ivanyi | Sep 2001 | B1 |
6297859 | George | Oct 2001 | B1 |
6311214 | Rhoads | Oct 2001 | B1 |
6388662 | Narui et al. | May 2002 | B2 |
6400996 | Hoffberg et al. | Jun 2002 | B1 |
6421445 | Jensen et al. | Jul 2002 | B1 |
6457010 | Eldering et al. | Sep 2002 | B1 |
6463413 | Applebaum et al. | Oct 2002 | B1 |
6467089 | Aust et al. | Oct 2002 | B1 |
6477508 | Lazar et al. | Nov 2002 | B1 |
6487719 | Itoh et al. | Nov 2002 | B1 |
6519769 | Hopple et al. | Feb 2003 | B1 |
6523175 | Chan | Feb 2003 | B1 |
6529212 | Miller et al. | Mar 2003 | B2 |
6542878 | Heckerman et al. | Apr 2003 | B1 |
6567978 | Jarrell | May 2003 | B1 |
6570559 | Oshima | May 2003 | B1 |
6647212 | Toriumi et al. | Nov 2003 | B1 |
6647548 | Lu et al. | Nov 2003 | B1 |
6675383 | Wheeler et al. | Jan 2004 | B1 |
6681396 | Bates et al. | Jan 2004 | B1 |
6791472 | Hoffberg | Sep 2004 | B1 |
6934508 | Ceresoli et al. | Aug 2005 | B2 |
6946803 | Moore | Sep 2005 | B2 |
7051352 | Schaffer | May 2006 | B1 |
7100181 | Srinivasan et al. | Aug 2006 | B2 |
7150030 | Eldering et al. | Dec 2006 | B1 |
7587732 | Wright et al. | Sep 2009 | B2 |
7647604 | Ramaswamy | Jan 2010 | B2 |
7786987 | Nielsen | Aug 2010 | B2 |
7882514 | Nielsen et al. | Feb 2011 | B2 |
7958526 | Wheeler et al. | Jun 2011 | B2 |
8060372 | Topchy et al. | Nov 2011 | B2 |
8180712 | Nelson et al. | May 2012 | B2 |
8249992 | Harkness et al. | Aug 2012 | B2 |
8526626 | Nielsen et al. | Sep 2013 | B2 |
8863166 | Harsh et al. | Oct 2014 | B2 |
9027043 | Johnson | May 2015 | B2 |
9294813 | Lee | Mar 2016 | B2 |
9301007 | Ramaswamy | Mar 2016 | B2 |
9312973 | Nelson et al. | Apr 2016 | B2 |
9332305 | Lee | May 2016 | B1 |
9420334 | Nielsen et al. | Aug 2016 | B2 |
9519909 | Nielsen et al. | Dec 2016 | B2 |
9680584 | Lee | Jun 2017 | B2 |
9961342 | Nielsen et al. | May 2018 | B2 |
10110889 | Nielsen et al. | Oct 2018 | B2 |
10306221 | Nielsen et al. | May 2019 | B2 |
10506226 | Nielsen et al. | Dec 2019 | B2 |
10528881 | Nelson et al. | Jan 2020 | B2 |
10911749 | Nielsen et al. | Feb 2021 | B2 |
11546579 | Nielsen et al. | Jan 2023 | B2 |
20020012353 | Gerszberg et al. | Jan 2002 | A1 |
20020015112 | Nagakubo et al. | Feb 2002 | A1 |
20020026635 | Wheeler et al. | Feb 2002 | A1 |
20020056087 | Berezowski et al. | May 2002 | A1 |
20020057893 | Wood et al. | May 2002 | A1 |
20020059577 | Lu et al. | May 2002 | A1 |
20020072952 | Hamzy et al. | Jun 2002 | A1 |
20020077880 | Gordon et al. | Jun 2002 | A1 |
20020080286 | Daglas et al. | Jun 2002 | A1 |
20020083435 | Blasko et al. | Jun 2002 | A1 |
20020141730 | Haken | Oct 2002 | A1 |
20020174425 | Markel et al. | Nov 2002 | A1 |
20020198762 | Donato | Dec 2002 | A1 |
20030046685 | Srinivasan et al. | Mar 2003 | A1 |
20030054757 | Kolessar et al. | Mar 2003 | A1 |
20030056215 | Kanungo | Mar 2003 | A1 |
20030067459 | Lim | Apr 2003 | A1 |
20030093790 | Logan et al. | May 2003 | A1 |
20030101449 | Bentollia et al. | May 2003 | A1 |
20030110485 | Lu et al. | Jun 2003 | A1 |
20030115591 | Weissmueller et al. | Jun 2003 | A1 |
20030131350 | Peiffer et al. | Jul 2003 | A1 |
20030216120 | Ceresoli et al. | Nov 2003 | A1 |
20040003394 | Ramaswamy | Jan 2004 | A1 |
20040055020 | Delpuch | Mar 2004 | A1 |
20040058675 | Lu et al. | Mar 2004 | A1 |
20040073918 | Ferman et al. | Apr 2004 | A1 |
20040088212 | Hill | May 2004 | A1 |
20040088721 | Wheeler et al. | May 2004 | A1 |
20040100437 | Hunter et al. | May 2004 | A1 |
20040181799 | Lu et al. | Sep 2004 | A1 |
20040210922 | Peiffer et al. | Oct 2004 | A1 |
20040233126 | Moore | Nov 2004 | A1 |
20050054285 | Mears et al. | Mar 2005 | A1 |
20050057550 | George | Mar 2005 | A1 |
20050125820 | Nelson et al. | Jun 2005 | A1 |
20050221774 | Ceresoli et al. | Oct 2005 | A1 |
20050286860 | Conklin | Dec 2005 | A1 |
20060075421 | Roberts et al. | Apr 2006 | A1 |
20060093998 | Vertegaal | May 2006 | A1 |
20060195857 | Wheeler et al. | Aug 2006 | A1 |
20060212895 | Johnson | Sep 2006 | A1 |
20060232575 | Nielsen | Oct 2006 | A1 |
20070011040 | Wright et al. | Jan 2007 | A1 |
20070050832 | Wright et al. | Mar 2007 | A1 |
20070063850 | Devaul et al. | Mar 2007 | A1 |
20070186228 | Ramaswamy et al. | Aug 2007 | A1 |
20070192782 | Ramaswamy | Aug 2007 | A1 |
20080028427 | Nesvadba et al. | Jan 2008 | A1 |
20080148307 | Nielsen et al. | Jun 2008 | A1 |
20080148309 | Wilcox et al. | Jun 2008 | A1 |
20080276265 | Topchy et al. | Nov 2008 | A1 |
20090192805 | Topchy et al. | Jul 2009 | A1 |
20090225994 | Topchy et al. | Sep 2009 | A1 |
20090259325 | Topchy et al. | Oct 2009 | A1 |
20100083299 | Nelson et al. | Apr 2010 | A1 |
20100162285 | Cohen et al. | Jun 2010 | A1 |
20100211967 | Ramaswamy et al. | Aug 2010 | A1 |
20110016231 | Ramaswamy et al. | Jan 2011 | A1 |
20120102515 | Ramaswamy | Apr 2012 | A1 |
20130084056 | Harsh et al. | Apr 2013 | A1 |
20130160042 | Stokes et al. | Jun 2013 | A1 |
20140059579 | Vinson et al. | Feb 2014 | A1 |
20160173921 | Srinivasan et al. | Jun 2016 | A1 |
20160210557 | Nelson et al. | Jul 2016 | A1 |
20180241989 | Nielsen et al. | Aug 2018 | A1 |
20200195917 | Nielsen et al. | Jun 2020 | A1 |
Number | Date | Country |
---|---|---|
2015255256 | Dec 2015 | AU |
1244982 | Feb 2000 | CN |
3401762 | Aug 1985 | DE |
0593202 | Apr 1994 | EP |
0946012 | Sep 1999 | EP |
1318679 | Jun 2003 | EP |
1574964 | Sep 1980 | GB |
8331482 | Dec 1996 | JP |
2000307520 | Nov 2000 | JP |
9115062 | Oct 1991 | WO |
9512278 | May 1995 | WO |
9526106 | Sep 1995 | WO |
9810539 | Mar 1998 | WO |
9832251 | Jul 1998 | WO |
9933206 | Jul 1999 | WO |
9959275 | Nov 1999 | WO |
0038360 | Jun 2000 | WO |
0072484 | Nov 2000 | WO |
0111506 | Feb 2001 | WO |
0145103 | Jun 2001 | WO |
0161892 | Aug 2001 | WO |
0219581 | Mar 2002 | WO |
02052759 | Jul 2002 | WO |
03049339 | Jun 2003 | WO |
03052552 | Jun 2003 | WO |
0306030 | Jul 2003 | WO |
2005032145 | Apr 2005 | WO |
2005038625 | Apr 2005 | WO |
2005041166 | May 2005 | WO |
2005055601 | Jun 2005 | WO |
2005065159 | Jul 2005 | WO |
2005079457 | Sep 2005 | WO |
2006012629 | Feb 2006 | WO |
2007022250 | Feb 2007 | WO |
2007120518 | Oct 2007 | WO |
2011115945 | Sep 2011 | WO |
Entry |
---|
Patent Cooperation Treaty, “Written Opinion of the International Searching Authority,” mailed by the International Searching Authority in connection with PCT application No. PCT/US2003/030355, dated Mar. 21, 2008 (5 pages). |
Patent Cooperation Treaty, “International Search Report,” mailed by the International Searching Authority in connection with PCT application No. PCT/US2003/030355, dated May 5, 2004 (6 pages). |
Patent Cooperation Treaty, “International Preliminary Examination Report,” mailed by the International Preliminary Examining Authority in connection with PCT application No. PCT/US2003/030370, dated Mar. 7, 2005 (4 pages). |
Patent Cooperation Treaty, “International Search Report,” mailed by the International Searching Authority in connection with PCT application No. PCT/US2003/030370, dated Mar. 11, 2004 (7 pages). |
Patent Cooperation Treaty, “Written Opinion of the International Searching Authority,” mailed by the International Searching Authority in connection with PCT application No. PCT/US2003/030370, dated Nov. 15, 2004 (5 pages). |
European Patent Office, “Extended European Search Report,” mailed in connection with European Patent Application No. EP05798239.9, dated Sep. 9, 2008 (4 pages). |
Patent Cooperation Treaty, “International Preliminary Report on Patentability,” mailed by the International Bureau in connection with PCT application No. PCT/US2005/028106, dated Mar. 12, 2007 (4 pages). |
Patent Cooperation Treaty, “International Search Report,” mailed by the International Searching Authority in connection with PCT application No. PCT/US2005/028106, dated Mar. 12, 2007 (2 pages). |
Patent Cooperation Treaty, “Written Opinion of the International Searching Authority,” mailed by the International Searching Authority in connection with PCT application No. PCT/US2005/028106, dated Mar. 12, 2007 (4 pages). |
United States Patent and Trademark Office, “Non-Final Office Action,” mailed in connection with U.S. Appl. No. 11/388,262, dated Mar. 5, 2009 (22 pages).
United States Patent and Trademark Office, “Non-Final Office Action,” mailed in connection with U.S. Appl. No. 11/388,555, dated Dec. 27, 2007 (12 pages).
United States Patent and Trademark Office, “Final Office Action,” mailed in connection with U.S. Appl. No. 11/388,555, dated Oct. 6, 2008 (18 pages).
United States Patent and Trademark Office, “Advisory Action,” mailed in connection with U.S. Appl. No. 11/388,555, dated Jan. 13, 2009 (4 pages).
Patent Cooperation Treaty, “Written Opinion of the International Searching Authority,” mailed in connection with International Patent Application No. PCT/US2006/031960, dated Feb. 21, 2007 (3 pages).
Vincent et al., “A Tentative Typology of Audio Source Separation Tasks,” 4th International Symposium on Independent Component Analysis and Blind Signal Separation (ICA2003), Nara, Japan, Apr. 2003, pp. 715-720.
Smith, Leslie S., “Using IIDs to Estimate Sound Source Direction,” University of Stirling, United Kingdom (2 pages).
United States Patent and Trademark Office, “Notice of Allowance,” mailed in connection with U.S. Appl. No. 11/576,328, dated Apr. 7, 2010 (8 pages).
The State Intellectual Property Office of China (SIPO), “Notice of Decision of Granting Patent Right for Invention,” mailed in connection with Chinese Patent Application Serial No. 200680036510.8, dated Aug. 9, 2010 (5 pages).
The State Intellectual Property Office of China (SIPO), “Second Office Action,” mailed in connection with Chinese Patent Application Serial No. 200680036510.8, dated Mar. 24, 2010 (9 pages).
The State Intellectual Property Office of China (SIPO), “First Office Action,” mailed in connection with Chinese Patent Application Serial No. 200680036510.8, dated Jul. 10, 2009 (10 pages).
United States Patent and Trademark Office, “Final Office Action,” dated Dec. 8, 2009, in connection with U.S. Appl. No. 11/388,555 (17 pages).
United States Patent and Trademark Office, “Final Office Action,” dated Oct. 12, 2010, in connection with U.S. Appl. No. 11/388,262 (23 pages).
United States Patent and Trademark Office, “Non-Final Office Action,” dated Apr. 28, 2010, in connection with U.S. Appl. No. 11/388,262 (14 pages).
United States Patent and Trademark Office, “Notice of Allowance,” dated Dec. 31, 2009, in connection with U.S. Appl. No. 11/672,706 (17 pages).
Lu et al., “Content Analysis for Audio Classification and Segmentation,” IEEE Transactions on Speech and Audio Processing, vol. 10, No. 7, Oct. 2002 (14 pages).
IP Australia, “Examination Report,” mailed in connection with Australian Patent Application No. 2010201753, dated Mar. 23, 2011 (2 pages).
IP Australia, “Notice of Allowance,” mailed in connection with Australian Patent Application No. 2010201753, dated Apr. 17, 2012 (3 pages).
IP Australia, “Examination Report,” mailed in connection with Australian Patent Application No. 2010219320, dated Jun. 20, 2012 (4 pages).
CIPO, “Office Action,” mailed in connection with Canadian Patent Application No. 2,576,865, dated Jun. 17, 2011 (3 pages).
CIPO, “Notice of Allowance,” mailed in connection with Canadian Patent Application No. 2,576,865, dated Oct. 2, 2012 (1 page).
EPO, “Summons to Attend Oral Proceedings Pursuant to Rule 115(1) EPC,” mailed in connection with European Patent Application No. 05798239.9, dated Dec. 22, 2011 (3 pages).
EPO, “Communication Pursuant to Rule 62a(1) EPC and Rule 63(1) EPC,” mailed in connection with European Patent Application No. 11009958.7, dated Mar. 27, 2012 (3 pages).
EPO, “Extended European Search Report,” mailed in connection with European Patent Application No. 06801611.2, dated Mar. 2, 2012 (5 pages).
KIPO, “Notice of Preliminary Rejection,” mailed in connection with Korean Patent Application No. 10-2007-7005373, dated Oct. 31, 2011 (5 pages).
KIPO, “Notice of Preliminary Rejection,” mailed in connection with Korean Patent Application No. 10-2007-7005373, dated May 30, 2012 (5 pages).
PCT, “International Preliminary Report on Patentability,” mailed in connection with PCT Application No. PCT/US2003/030355, dated Mar. 21, 2011 (5 pages).
SIPO, “Rejection Decision,” mailed in connection with Chinese Patent Application No. 200580030202.X, dated Mar. 24, 2011 (9 pages).
EPO, “Examination Report,” mailed in connection with European Patent Application No. 06801611.2, dated Jun. 25, 2013 (4 pages).
United States Patent and Trademark Office, “Non-Final Office Action,” mailed in connection with U.S. Appl. No. 12/831,870, dated Nov. 29, 2012 (9 pages).
United States Patent and Trademark Office, “Notice of Allowability,” mailed in connection with U.S. Appl. No. 12/831,870, dated Apr. 23, 2013 (9 pages).
United States Patent and Trademark Office, “Notice of Allowance,” mailed in connection with U.S. Appl. No. 12/831,870, dated Aug. 1, 2013 (6 pages).
IP Australia, “Examination Report,” mailed in connection with Australian Patent Application No. 2013203468, dated Aug. 26, 2014 (3 pages).
IP Australia, “Notice of Acceptance,” mailed in connection with Australian Patent Application No. 2013203468, dated Oct. 1, 2015 (2 pages).
Canadian Intellectual Property Office, “Office Action,” mailed in connection with Canadian Patent Application No. 2,619,781, dated Sep. 12, 2013 (3 pages).
Canadian Intellectual Property Office, “Office Action,” mailed in connection with Canadian Patent Application No. 2,619,781, dated Jan. 26, 2015 (4 pages).
United States Patent and Trademark Office, “Non-Final Office Action,” mailed in connection with U.S. Appl. No. 11/388,555, dated Mar. 31, 2009 (10 pages).
United States Patent and Trademark Office, “Final Office Action,” mailed in connection with U.S. Appl. No. 11/388,262, dated Sep. 2, 2009 (13 pages).
United States Patent and Trademark Office, “Non-Final Office Action,” mailed in connection with U.S. Appl. No. 11/672,706, dated Jul. 23, 2009 (8 pages).
Thomas, William L., “Television Audience Research Technology, Today's Systems and Tomorrow's Challenges,” Nielsen Media Research, Jun. 5, 1992 (4 pages).
Dai et al., “Transferring Naive Bayes Classifiers for Text Classification,” Proceedings of the Twenty-Second AAAI Conference on Artificial Intelligence, held in Vancouver, British Columbia on Jul. 22-26, 2007 (6 pages).
Elkan, Charles, “Naive Bayesian Learning,” Adapted from Technical Report No. CS97-557, Department of Computer Science and Engineering, University of California, San Diego, U.S.A., Sep. 1997 (4 pages).
Zhang, Harry, “The Optimality of Naive Bayes,” Proceedings of the Seventeenth International FLAIRS Conference, 2004 (6 pages).
Domingos et al., “On the Optimality of the Simple Bayesian Classifier under Zero-One Loss,” Machine Learning, vol. 29, No. 2, pp. 103-130, Nov. 1, 1997 (28 pages).
Patron-Perez et al., “A Probabilistic Framework for Recognizing Similar Actions using Spatio-Temporal Features,” BMVC07, 2007 [Retrieved from the Internet on Feb. 29, 2008] (10 pages).
Mitchell, Tom M., “Chapter 1: Generative and Discriminative Classifiers: Naive Bayes and Logistic Regression,” Machine Learning, Sep. 21, 2006 (17 pages).
Lang, Marcus, “Implementation of Naive Bayesian Classifiers in Java,” http://www.iit.edu/~ipro356f03/ipro/documents/naive-bayes.edu [Retrieved from the Internet on Feb. 29, 2008] (4 pages).
Liang et al., “Learning Naive Bayes Tree for Conditional Probability Estimation,” Proceedings of the Canadian AI-2006 Conference, held in Quebec, Canada, pp. 456-466, on Jun. 7-9, 2006 (13 pages).
Mozina et al., “Nomograms for Visualization of Naive Bayesian Classifier,” Proceedings of the Eighth European Conference on Principles and Practice of Knowledge Discovery in Databases, held in Pisa, Italy, pp. 337-348, 2004 [Retrieved from the Internet on Feb. 29, 2008] (12 pages).
“Lecture 3: Naive Bayes Classification,” http://www.cs.utoronto.ca/~strider/CSCD11_f08/NaiveBayes_Zemel.pdf [Retrieved from the Internet on Feb. 29, 2008] (9 pages).
Klein, Dan, PowerPoint Presentation of “Lecture 23: Naive Bayes,” CS 188: Artificial Intelligence held on Nov. 15, 2007 (6 pages).
“Learning Bayesian Networks: Naive and non-Naive Bayes,” Oregon State University, Oregon [Retrieved from the Internet on Feb. 29, 2008]. Retrieved from the Internet: http://web.engr.oregonstate.edu/~tgd/classess/534/slides/part6.pdf (18 pages).
“The Naive Bayes Classifier,” CS534-Machine Learning, Oregon State University, Oregon [Retrieved from the Internet on Feb. 29, 2008]. Retrieved from the Internet: http://web.engr.oregonstate.edu/~afern/classes/cs534/notes1Naivebayes-10.pdf (19 pages).
“Bayesian Networks,” Machine Learning A, 708.064 07 1st KU Oregon State University, Oregon [Retrieved from the Internet on Feb. 29, 2008]. Retrieved from the Internet: http://www.igi.tugraz.at/lehre/MLA/WS07/slides3.pdf (21 pages).
“The Peltarion Blog,” Jul. 10, 2006 [Retrieved from the Internet on Mar. 11, 2009]. Retrieved from the Internet: http://blog.peltarion.com/2006/07/10/classifier-showdown (14 pages).
“Logical Connective: Philosophy 103: Introduction to Logic Conjunction, Negation, and Disjunction,” [Retrieved from the Internet on Mar. 11, 2009]. Retrieved from the Internet: http://philosophy.lander.edu/logic/conjunct.html (5 pages).
“Naive Bayes Classifier,” Wikipedia entry as of Mar. 11, 2009 [Retrieved from the Internet on Mar. 11, 2009] (7 pages).
“Naive Bayes Classifier,” Wikipedia entry as of Jan. 11, 2008 [Retrieved from the Internet from Wikipedia history pages on Mar. 11, 2009] (7 pages).
Zimmerman, H., “Fuzzy set applications in pattern recognition and data-analysis,” 11th IAPR International Conference on Pattern Recognition, Aug. 29, 1992 (81 pages).
Canadian Intellectual Property Office, “Office Action,” mailed in connection with Canadian Patent Application No. 2,619,781, dated Apr. 1, 2016 (8 pages).
United States Patent and Trademark Office, “Non-Final Office Action,” mailed in connection with U.S. Appl. No. 14/015,664, dated Jan. 4, 2016 (6 pages).
United States Patent and Trademark Office, “Notice of Allowance,” mailed in connection with U.S. Appl. No. 14/015,664, dated Apr. 20, 2016 (10 pages).
United States Patent and Trademark Office, “Notice of Allowability,” mailed in connection with U.S. Appl. No. 14/015,664, dated Jul. 20, 2016 (2 pages).
Canadian Intellectual Property Office, “Notice of Allowance,” mailed in connection with Canadian Patent Application No. 2,619,781, dated Apr. 10, 2017 (1 page).
Doe, “Bringing Set Top Box Data to Life,” ARF Audience Measurement Symposium 2.0, NYC, Jun. 26, 2007 (9 pages).
United States Patent and Trademark Office, “Non-Final Office Action,” mailed in connection with U.S. Appl. No. 15/207,019, dated Apr. 19, 2017 (6 pages).
United States Patent and Trademark Office, “Notice of Allowance,” mailed in connection with U.S. Appl. No. 15/207,019, dated Dec. 26, 2017 (13 pages).
United States Patent and Trademark Office, “Final Office Action,” mailed in connection with U.S. Appl. No. 15/207,019, dated Sep. 6, 2017 (9 pages).
United States Patent and Trademark Office, “Notice of Allowance,” mailed in connection with U.S. Appl. No. 15/958,814, dated Jun. 15, 2018 (9 pages).
United States Patent and Trademark Office, “Notice of Allowance,” mailed in connection with U.S. Appl. No. 16/166,871, dated Jan. 8, 2019 (13 pages).
United States Patent and Trademark Office, “Corrected Notice of Allowability,” mailed in connection with U.S. Appl. No. 16/417,128, dated Oct. 8, 2019 (2 pages).
United States Patent and Trademark Office, “Notice of Allowance,” mailed in connection with U.S. Appl. No. 16/417,128, dated Jul. 29, 2019 (13 pages).
United States Patent and Trademark Office, “Notice of Allowance,” mailed in connection with U.S. Appl. No. 16/706,280, dated Sep. 23, 2020 (9 pages).
United States Patent and Trademark Office, “Non-Final Office Action,” mailed in connection with U.S. Appl. No. 16/706,280, dated Mar. 26, 2020 (4 pages).
United States Patent and Trademark Office, “Advisory Action,” mailed in connection with U.S. Appl. No. 11/388,555, dated Mar. 22, 2010 (3 pages).
United States Patent and Trademark Office, “Notice of Allowance,” mailed in connection with U.S. Appl. No. 11/388,555, dated May 20, 2010 (4 pages).
United States Patent and Trademark Office, “Notice of Allowance,” mailed in connection with U.S. Appl. No. 11/388,262, dated Jan. 5, 2015 (8 pages).
United States Patent and Trademark Office, “Advisory Action,” mailed in connection with U.S. Appl. No. 11/388,262, dated Jan. 7, 2010 (3 pages).
United States Patent and Trademark Office, “Non-Final Office Action,” mailed in connection with U.S. Appl. No. 14/686,470, dated Jun. 19, 2015 (6 pages).
United States Patent and Trademark Office, “Notice of Allowance,” mailed in connection with U.S. Appl. No. 14/686,470, dated Oct. 15, 2015 (5 pages).
United States Patent and Trademark Office, “Notice of Allowance,” mailed in connection with U.S. Appl. No. 13/444,571, dated Dec. 2, 2015 (8 pages).
United States Patent and Trademark Office, “Non-Final Office Action,” mailed in connection with U.S. Appl. No. 13/444,571, dated Jul. 30, 2015 (6 pages).
United States Patent and Trademark Office, “Notice of Allowance,” mailed in connection with U.S. Appl. No. 15/082,380, dated Aug. 20, 2019 (9 pages).
United States Patent and Trademark Office, “Notice of Allowance,” mailed in connection with U.S. Appl. No. 14/307,152, dated Nov. 12, 2015 (7 pages).
United States Patent and Trademark Office, “Non-Final Office Action,” mailed in connection with U.S. Appl. No. 11/576,328, dated Aug. 7, 2009 (10 pages).
United States Patent and Trademark Office, “Non-Final Office Action,” mailed in connection with U.S. Appl. No. 11/576,328, dated Feb. 5, 2009 (13 pages).
United States Patent and Trademark Office, “Notice of Allowance and Fee(s) Due,” mailed in connection with U.S. Appl. No. 17/164,483, dated Aug. 24, 2022 (7 pages).
United States Patent and Trademark Office, “Final Office Action,” mailed in connection with U.S. Appl. No. 17/164,483, dated Jun. 14, 2022 (7 pages).
United States Patent and Trademark Office, “Non-Final Office Action,” mailed in connection with U.S. Appl. No. 17/164,483, dated Mar. 2, 2022 (4 pages).
Number | Date | Country
---|---|---
20230141256 A1 | May 2023 | US
Number | Date | Country
---|---|---
60761678 | Jan 2006 | US
60708557 | Aug 2005 | US
 | Number | Date | Country
---|---|---|---
Parent | 17164483 | Feb 2021 | US
Child | 18092030 | | US
Parent | 16706280 | Dec 2019 | US
Child | 17164483 | | US
Parent | 16417128 | May 2019 | US
Child | 16706280 | | US
Parent | 16166871 | Oct 2018 | US
Child | 16417128 | | US
Parent | 15958814 | Apr 2018 | US
Child | 16166871 | | US
Parent | 15207019 | Jul 2016 | US
Child | 15958814 | | US
Parent | 14015664 | Aug 2013 | US
Child | 15207019 | | US
Parent | 12831870 | Jul 2010 | US
Child | 14015664 | | US
Parent | 11576328 | | US
Child | 12831870 | | US