For a number of reasons, it would be useful if a home entertainment device or system could determine whether people are present in the room. If viewers leave the room to go to the kitchen, for example, the system could enter a low power consumption state, perhaps by dimming or powering down the display, or by shutting down completely. In this way, power could be conserved. If recorded media were being viewed, playback could be automatically paused when a viewer leaves the room.
In addition, the next generation of smart televisions may be service platforms offering viewers several services, such as banking and on-line shopping. Human presence detection would also be useful for such TV-based services. For example, if a viewer were accessing a bank/brokerage account using the TV but left the room without closing the service, a human presence detection capability could be used to automatically log off or shut down the service after a predetermined time. In another case, if another person enters the room while the on-line banking service is running, human presence detection could be used to automatically turn off the banking service for security or privacy reasons.
Detecting human presence would also be useful to advertisers and content providers. Content providers could determine actual viewership, i.e., the number of people viewing a program. Advertisers could use this information to determine the number of people exposed to a given advertisement. Moreover, an advertiser could determine how many people viewed a particular airing of an advertisement, i.e., how many people saw an ad at a particular time and channel, and in the context of a particular program. This in turn could allow the advertiser to perform cost-benefit analysis: the exposure of an advertisement could be compared to the cost of producing the advertisement, to determine whether the advertisement, as aired at a particular time and channel, is a worthwhile expense.
In the drawings, the leftmost digit(s) of a reference number identify the drawing in which the reference number first appears.
An embodiment is now described with reference to the figures, where like reference numbers may indicate identical or functionally related elements. While specific configurations and arrangements are discussed, it should be understood that this is done for illustrative purposes only. A person skilled in the relevant art will recognize that other configurations and arrangements can be used without departing from the spirit and scope of the description. It will also be apparent to a person skilled in the relevant art that the teachings herein can be employed in a variety of systems and applications other than those described.
Disclosed herein are methods, systems, and computer program products that may allow for the determination of human presence in a room where content is being presented. The audio associated with the content may be captured, along with the audio generated in the room by whatever sources are present. Features may be extracted from both the content audio and the room audio. These features may then be compared, and the differences quantified. If the differences are significant, human presence may be inferred; insignificant differences may be used to infer the absence of people.
The overall context of the system is illustrated in FIG. 1.
Content 110 may be presented to a user through one or more output devices, such as television (TV) 150. The presentation of content 110 may be controlled through the use of a remote control 160, which may transmit control signals to a set-top box (STB) 120. The control signals may be received by a radio frequency (RF) interface 130 at STB 120.
Room audio 170 may also be present, including all sound generated in the room. Sources for the room audio 170 may include ambient noise and sounds made by any users, including but not limited to speech. Room audio 170 may also include sound generated by the consumer electronics in the room, such as the content audio 115 produced by TV 150. The room audio may be captured by a microphone 140. In the illustrated embodiment, microphone 140 may be incorporated in STB 120. In alternative embodiments, the microphone 140 may be incorporated in TV 150 or elsewhere.
The processing of the system described herein is shown generally at 200 in FIG. 2.
Process 200 is illustrated in greater detail in FIG. 3.
Room audio may be processed in an analogous manner. At 315, room audio may be received. As noted above, room audio may be captured using a microphone incorporated into an STB or other consumer electronics component in the room, and may then be recorded for processing purposes. At 325, the room audio may be sampled. In an embodiment, the room audio may be sampled at 8 kHz or any other frequency. At 335, the sampled room audio may be divided into intervals for subsequent processing. In an embodiment, the intervals may be 0.5 seconds long. The intervals of sampled room audio may correspond, with respect to time, to respective intervals of sampled content audio. At 345, features may be extracted from each interval of sampled room audio. As in the case of content audio, a coefficient of variation or other statistical measure may be calculated for each interval and used as the feature for subsequent processing.
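As a rough sketch of the interval processing just described, the following assumes the 8 kHz rate, 0.5 second intervals, and coefficient-of-variation feature from the text; the function names, the use of NumPy, and the choice to compute the statistic on sample magnitudes (keeping the mean positive, a detail the text does not specify) are illustrative assumptions.

```python
import numpy as np

SAMPLE_RATE_HZ = 8000   # sampling rate suggested in the text
INTERVAL_SEC = 0.5      # interval length suggested in the text

def split_into_intervals(samples, rate=SAMPLE_RATE_HZ, length=INTERVAL_SEC):
    """Divide a 1-D array of audio samples into fixed-length intervals."""
    n = int(rate * length)
    usable = len(samples) - (len(samples) % n)
    return samples[:usable].reshape(-1, n)

def coefficient_of_variation(interval):
    """Standard deviation divided by mean: the per-interval feature.

    Computed on sample magnitudes so the mean is positive; the text
    does not specify this detail.
    """
    mag = np.abs(interval)
    mean = mag.mean()
    return mag.std() / mean if mean > 0 else 0.0

def extract_features(samples):
    """One coefficient of variation per interval of audio."""
    return np.array([coefficient_of_variation(iv)
                     for iv in split_into_intervals(samples)])
```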
At 350, the extracted features may be compared. In an embodiment, this may include comparing the coefficients of variation, as a common statistical measure, for temporally corresponding intervals of sampled room audio and sampled content audio; the comparison may comprise calculating the difference between the coefficients of variation of the room audio and the content audio for corresponding intervals. The comparison process is described in greater detail below. At 360, a normalization or smoothing process may take place. This may comprise calculating a function of the differences between the coefficients of variation of the room audio and the content audio over a sequence of successive intervals. At 370, an inference may be reached regarding the presence of people in the room, where the inference may be based on the statistic(s) resulting from the normalization performed at 360. In an embodiment, if the coefficients of variation are sufficiently different between temporally corresponding intervals of room and content audio, the presence of one or more people may be inferred.
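Continuing that sketch, the comparison and inference at 350-370 might look like the following; the difference is expressed here as a percentage, anticipating the threshold discussion below, and the 25% threshold is a placeholder rather than a value from the text. The `smooth` argument stands in for whichever normalization function is chosen (see the sketches following the normalization discussion below).

```python
import numpy as np

def percent_difference(cov_room, cov_content, eps=1e-9):
    """Difference between temporally corresponding coefficients of
    variation, expressed as a percentage of the content-audio value."""
    return abs(cov_room - cov_content) / max(abs(cov_content), eps) * 100.0

def infer_presence(room_feats, content_feats, smooth, threshold=25.0):
    """One boolean inference per interval: True suggests people present.

    `smooth` is a normalization function over the difference sequence;
    the 25% threshold is an illustrative placeholder, not a value
    given in the text.
    """
    diffs = np.array([percent_difference(r, c)
                      for r, c in zip(room_feats, content_feats)])
    return smooth(diffs) > threshold
```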
In an alternative embodiment, additional processing may be performed in conjunction with feature extraction.
The comparison of coefficients of variation may proceed as follows.
Note that the magnitude of the percentage difference may allow greater or lesser confidence in the human presence inference. If the percentage difference is less than the threshold, then human presence may be unlikely, as discussed above. If the percentage difference is significantly less than the threshold, e.g., close to zero, this may suggest that the room audio and the content audio are extremely similar, so that a higher degree of confidence may be placed in the inference that human presence is unlikely. Conversely, if the percentage difference exceeds the threshold, then human presence may be likely. If the percentage difference exceeds the threshold by a significant amount, this may suggest that the room audio and the content audio are very different, and a higher degree of confidence may be placed in the inference that human presence is likely.
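One way to fold this margin-based confidence into code is sketched below; the linear mapping from distance-to-threshold to a confidence score is an arbitrary illustrative choice, not something the text prescribes.

```python
def presence_with_confidence(smoothed_diff, threshold=25.0):
    """Map a smoothed percentage difference to (inference, confidence).

    Confidence grows with distance from the threshold; the linear
    scaling here is an arbitrary illustrative choice.
    """
    present = smoothed_diff > threshold
    margin = abs(smoothed_diff - threshold)
    confidence = min(1.0, margin / threshold)  # 0 near threshold, 1 far away
    return present, confidence
```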
In an embodiment, the data related to a given interval may be normalized by considering that interval together with a sequence of immediately preceding intervals. In this way, the significance of outliers may be diminished, while the implicit confidence level of an interval may influence the inferences derived in succeeding intervals. Numerically, the normalization process may use any of several functions: a moving average of data from past intervals, or a linear or exponential decay function of this data.
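The candidate normalization functions named above might be sketched as follows; the window size and decay constant are assumed placeholders. Either function could serve as the `smooth` argument in the earlier inference sketch.

```python
import numpy as np

def moving_average(diffs, window=8):
    """Average each difference with its immediate predecessors."""
    out = np.empty(len(diffs))
    for i in range(len(diffs)):
        out[i] = diffs[max(0, i - window + 1):i + 1].mean()
    return out

def exponential_decay(diffs, alpha=0.3):
    """Exponentially weighted average: recent intervals dominate and
    older intervals decay geometrically. alpha is illustrative."""
    out = np.empty(len(diffs))
    acc = 0.0
    for i, d in enumerate(diffs):
        acc = d if i == 0 else alpha * d + (1 - alpha) * acc
        out[i] = acc
    return out
```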
The processes of
This is shown in FIG. 10.
At 1050, the extracted features of a content audio interval and a room audio interval may be compared. This comparison may be performed in the same manner as described above.
As noted above, the systems, methods and computer program products described herein may be implemented in the context of a home entertainment system that may include an STB and/or a smart television, or may be implemented in a personal computer. Moreover, the systems, methods and computer program products described herein may also be implemented in the context of a laptop computer, ultra-laptop or netbook computer, tablet, touch pad, portable computer, handheld computer, palmtop computer, personal digital assistant (PDA), cellular telephone, combination cellular telephone/PDA, smart device (e.g., smart phone, smart tablet or smart television), mobile internet device (MID), messaging device, data communication device, and so forth.
One or more features disclosed herein may be implemented in hardware, software, firmware, and combinations thereof, including discrete and integrated circuit logic, application specific integrated circuit (ASIC) logic, and microcontrollers, and may be implemented as part of a domain-specific integrated circuit package, or a combination of integrated circuit packages. The term software, as used herein, refers to a computer program product including a computer readable medium having computer program logic stored therein to cause a computer system to perform one or more features and/or combinations of features disclosed herein. The computer readable medium may be transitory or non-transitory. An example of a transitory computer readable medium may be a digital signal transmitted over a radio frequency or over an electrical conductor, through a local or wide area network, or through a network such as the Internet. An example of a non-transitory computer readable medium may be a compact disk, a flash memory, random access memory (RAM), read-only memory (ROM), or other data storage device.
An embodiment of a system that may perform the processing described herein is shown in FIG. 11.
A microphone 1105 may capture room audio 1107. Content audio 1117 may be received and routed to PIC 1110. The sampling of the room and content audio and the decomposition of these signals into intervals may be performed in PIC 1110 or elsewhere. After sampling and decomposition into intervals, the content and room audio may be processed by the feature extraction firmware 1115 in PIC 1110. As discussed above, the feature extraction process may produce coefficients of variation for each interval, for both sampled room audio and sampled content audio. In the illustrated embodiment, feature extraction may take place in the PIC 1110 through the execution of feature extraction firmware 1115. Alternatively, the feature extraction functionality may be implemented in an execution engine of system on a chip (SOC) 1120.
If feature extraction is performed at PIC 1110, the coefficients of variation may be sent to SOC 1120, and then made accessible to operating system (OS) 1130. Comparison of coefficients from corresponding room audio and content audio intervals may be performed by logic 1160 in presence middleware 1140. Normalization may be performed by normalization logic 1150, which may also be part of presence middleware 1140. An inference regarding human presence may then be made available to a presence-enabled application 1170. Such an application may, for example, put system 1100 into a low power state if it is inferred that no one is present. Another example of a presence-enabled application 1170 may be a program that collects presence inferences from system 1100 and others like it in other households, to determine viewership of a television program or advertisement.
As noted above with respect to
Items 1105, 1110, 1120, and 1130 may all be located in one or more components of a user's home entertainment system or computer system, in an embodiment. They may be located in an STB, digital video recorder, or television, for example. Presence middleware 1140 and presence-enabled application 1170 may also be located in one or more components of the user's home entertainment system or computer system. In alternative embodiments, one or both of presence middleware 1140 and presence-enabled application 1170 may be located elsewhere, such as the facility of a content provider.
Note that in some embodiments, the audio captured by the microphone 1105 may be muted. A user may choose to do this via a button on remote control 1180 or on the home entertainment system. Such a mute function does not interfere with the mute button on a remote control, which silences the audio coming out of the TV. A "mute" command for the microphone would instead be sent to audio selection logic in PIC 1110. As a result of such a command, audio from microphone 1105 would not be received by OS 1130. Nonetheless, room audio 1107 may still be received at PIC 1110, where feature extraction may be performed. Such a capability may be enabled by the presence of the feature extraction firmware 1115 in the PIC 1110. The statistical data, i.e., the coefficients of variation, may then be made available to the OS 1130, even though the room audio itself has been muted. The nature of the coefficients of variation may be such that they are not usable for purposes of recreating room audio 1107.
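The mute behavior described here amounts to a gating rule: raw microphone audio stops at the PIC, while the derived statistics continue upward to the OS. A schematic sketch, with all class and method names invented for illustration:

```python
import numpy as np

class PresenceAudioGate:
    """Schematic model of the mute behavior described above; all names
    here are invented for illustration. When muted, raw room audio is
    withheld from the host OS, but per-interval statistics are still
    computed and exposed; a coefficient of variation alone is not
    enough to reconstruct the underlying audio."""

    def __init__(self):
        self.mic_muted = False

    def set_mute(self, muted):
        """Models the 'mute' command sent to the audio selection logic."""
        self.mic_muted = muted

    def process_interval(self, samples):
        """Return (audio_for_os, feature): audio is gated, the feature is not."""
        mag = np.abs(samples)
        mean = mag.mean()
        cov = mag.std() / mean if mean > 0 else 0.0
        audio_for_os = None if self.mic_muted else samples
        return audio_for_os, cov
```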
Computer program logic 1240 may include feature extraction code 1250. This code may be responsible for determining the standard deviation and mean for intervals of sampled room audio and content audio, as discussed above. Feature extraction code 1250 may also be responsible for implementing the Fourier transformation and bandpass filtering described above.
A software embodiment of the comparison and normalization functionality is illustrated in FIG. 13.
Computer program logic 1340 may include comparison code 1350. This module may be responsible for comparing coefficients of variation of corresponding intervals of room audio and content audio, and for generating a quantitative indication of the difference, e.g., a percentage difference, as discussed above. Computer program logic 1340 may include normalization code 1360. This module may be responsible for performing normalization of the data generated by comparison code 1350, using a moving average or other process, as noted above. Computer program logic 1340 may include inference code 1370. This module may be responsible for generating an inference regarding the presence or absence of people, given the results of normalization code 1360.
The systems, methods, and computer program products described above may have a number of applications. If a viewer leaves a room, for example, the absence of people could be detected as described above, and the entertainment or computer system could go into a low power consumption state, perhaps by dimming or powering down the display, or by shutting down completely. In this way, power could be conserved. If recorded media were being viewed, the playback could be automatically paused when a viewer leaves the room.
In addition, service platforms may offer viewers services such as banking and on-line shopping. Human presence detection as described above would be useful for such TV-based services. For example, if a viewer were accessing a bank/brokerage account using the TV but left the room without closing the service, a human presence detection capability could be used to automatically log off or shut down the service after a predetermined time. In another case, if another person enters the room while the on-line banking service is running, human presence detection could be used to automatically turn off the banking service for security or privacy reasons.
Detecting human presence would also be useful to advertisers and content providers. Content providers could determine actual viewership, i.e., the number of people viewing a program. Advertisers could use this information to determine the number of people exposed to a given advertisement. Moreover, an advertiser could determine how many people viewed a particular airing of an advertisement, i.e., how many people saw an ad at a particular time and channel, and in the context of a particular program. This in turn could allow the advertiser to perform cost-benefit analysis: the exposure of an advertisement could be compared to the cost of producing the advertisement, to determine whether the advertisement, as aired at a particular time and channel, is a worthwhile expense.
Methods and systems are disclosed herein with the aid of functional building blocks illustrating the functions, features, and relationships thereof. At least some of the boundaries of these functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternate boundaries may be defined so long as the specified functions and relationships thereof are appropriately performed.
While various embodiments are disclosed herein, it should be understood that they have been presented by way of example only, and not limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail may be made therein without departing from the spirit and scope of the methods and systems disclosed herein. Thus, the breadth and scope of the claims should not be limited by any of the exemplary embodiments disclosed herein.