This disclosure relates generally to audience measurement and, more particularly, to methods and apparatus to capture images.
Audience measurement of media content (e.g., broadcast television and/or radio, stored audio and/or video content played back from a memory such as a digital video recorder or a digital video disc, audio and/or video content played via the Internet, video games, etc.) often involves collection of content identifying data (e.g., signature(s), fingerprint(s), embedded code(s), channel information, time of consumption information, etc.) and people data (e.g., identifiers, demographic data associated with audience members, etc.). The content identifying data and the people data can be combined to generate, for example, media exposure data indicative of amount(s) and/or type(s) of people that were exposed to specific piece(s) of media content.
In some audience measurement systems, the collected people data includes an amount of people being exposed to media content. To calculate the amount of people being exposed to the media content, some measurement systems capture a series of images of a media exposure environment (e.g., a television room, a family room, a living room, a bar, a restaurant, etc.) and analyze the images to determine how many people appear in the images at a particular date and time. The calculated amount of people in the media exposure environment can be correlated with media content being presented at the particular date and time to provide exposure data (e.g., ratings data) for that media content. Additionally, some audience measurement systems identify people in the images via one or more identification techniques such as, for example, facial recognition.
To count people in a media exposure environment, such as a room of a house in which a television is located, some audience measurement systems attempt to recognize objects as humans in a series of captured images of the room. A tally is maintained for each frame of image data to reflect an amount of people in the room at a time corresponding to a respective frame. That is, each recognition of an object as a human in a frame increases the tally associated with that frame. The audience measurement system counting the people in the room may also collect content identifying information to identify media content being presented (e.g., aurally and/or visually) in the room. With the identification of the media content and the amount of people in the room at a given date and time, the audience measurement system is aware of how many people were exposed to the specific media content.
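For illustration only, the per-frame tally logic described above might be sketched as follows (the language, data layout, and detection labels are illustrative assumptions, not taken from this disclosure):

```python
from collections import Counter

def tally_people(detections_by_frame: dict[int, list[str]]) -> Counter:
    """Per-frame people tally: each object recognized as a human in a
    frame increments the tally associated with that frame."""
    tally = Counter()
    for frame_id, labels in detections_by_frame.items():
        tally[frame_id] = sum(1 for label in labels if label == "human")
    return tally

# Example: frame 0 shows two recognized humans, frame 1 shows one.
print(tally_people({0: ["human", "human", "pet"], 1: ["human"]}))
# Counter({0: 2, 1: 1})
```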
Additionally, some systems recognize an identity of one or more of the detected humans by, for example, performing a facial recognition process on image data of one or more of the frames, receiving identifier(s) from the detected humans, detecting identifying signal(s) generated by devices carried by the humans, etc. Personal identification information can be used in conjunction with the content identifying information and/or the tally information to generate exposure information related to the content. When an audience measurement system uses a facial recognition process to identify people, an accuracy of the identification increases with an increase in resolution of the image data on which the facial recognition process is performed. In other words, the higher the resolution of a frame of image data, the more likely identification made via facial recognition will be accurate.
To provide high-resolution image data, audience measurement systems that include facial recognition capabilities typically employ high-resolution image sensors equipped with an illumination source, such as an infrared (IR) light emitting diode (LED). In previous systems, each time the high-resolution image sensor captures a frame of image data, the illumination source illuminates a surrounding area. The resulting illumination provides lighting conditions favorable to capturing high-resolution image data. When the illumination source is an IR LED, the illumination source emits IR light to enable the image sensor to capture illuminated objects. In addition to IR light, IR emitters also emit light in the visible spectrum, which appears as a red glow at the emitter.
Frequent activation of the illumination source when capturing image data represents a significant power drain for the audience measurement system. Moreover, frequent activation at short intervals (e.g., every two seconds) shortens the lifetime of the illumination source. Additionally, due to the significant amounts of heat generated by the illumination source, heat sinking devices and techniques are typically needed in systems that activate the illumination source frequently. Further, the light emitted by the illumination source has the potential to annoy people such as, for example, members of a panel that are exposed to the illumination source while in the presence of the audience measurement system. In an audience measurement system utilizing an IR LED, the red glow emitted from the illumination source appears as a blinking red light that faces the panel members while the panel members are, for example, watching a television. Because some audience measurement systems rely on the willing participation of panel members, such annoyance is undesirable and may prove detrimental to the ability of the audience measurement system to retain panel members, to maintain panelist compliance, and to collect as much data as possible.
Example methods, apparatus, and articles of manufacture disclosed herein reserve use of an illumination source of a high-resolution image sensor for frames of image data designated for processing that requires high-resolution image data. Example frames designated for such processing include frames on which a facial recognition process is to be executed. For frames not designated for processing that requires high-resolution image data, example methods, apparatus, and articles of manufacture disclosed herein capture image data without the use of the illumination source. Example frames not designated for processing that requires high-resolution image data include frames on which a person count (e.g., a body count) is to be executed without recognizing an identity of the detected persons. Additionally, when image data is captured without activation of the illumination source (e.g., when the corresponding frame will not be subjected to facial recognition), example methods, apparatus, and articles of manufacture disclosed herein enhance resulting images to compensate for low light levels and loss of contrast due to lack of illumination. In particular, example methods, apparatus, and articles of manufacture disclosed herein employ a pixel binning procedure on the image data captured without use of the illumination source. Binning is the process of summing pixel values in a neighborhood of pixels (e.g., a 2×2, 3×3, 4×4, etc. area), thereby capturing more light per pixel at the cost of lower resolution. However, for example methods, apparatus, and articles of manufacture disclosed herein, the lower resolution is acceptable for the frames captured without the use of the illumination source because, as described above, those frames will not be subjected to processing that requires high-resolution data.
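As an illustrative sketch of such a binning procedure (assuming, for illustration, a NumPy grayscale frame and a 2×2 neighborhood; these specifics are not prescribed by this disclosure):

```python
import numpy as np

def bin_pixels(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Sum pixel values over non-overlapping factor x factor neighborhoods.

    Each output pixel aggregates factor**2 input pixels, capturing more
    light per pixel at the cost of reduced spatial resolution.
    """
    h, w = frame.shape
    h -= h % factor  # crop so dimensions divide evenly by the factor
    w -= w % factor
    blocks = frame[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.sum(axis=(1, 3))

# A dark 480x640 frame becomes a brighter (higher-valued) 240x320 frame.
dark = np.random.poisson(lam=4, size=(480, 640)).astype(np.uint32)
print(bin_pixels(dark).shape)  # (240, 320)
```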
Accordingly, example methods, apparatus, and articles of manufacture disclosed herein selectively activate an illumination source associated with an image sensor for certain frames. In contrast, previous systems activate the illumination source for each frame captured by a corresponding image sensor. In many instances, frames not designated for high-resolution processing (e.g., frames used solely to count people) greatly outnumber the frames designated for high-resolution processing (e.g., frames used for facial recognition). Accordingly, as described below, a mode of an image sensor during which image data is captured without the use of the illumination source (e.g., because high-resolution image data is not necessary for the corresponding frame(s)) is referred to herein as a ‘majority capture’ mode. Conversely, because the number of frames requiring high-resolution image data is far less than the number of frames for which lower resolution image data is acceptable, a mode of the image sensor during which image data is captured with the use of the illumination source is referred to herein as a ‘minority capture’ mode.
Because the illumination source is a significant power consumer, the selective activation of the illumination source provided by example methods, apparatus, and articles of manufacture disclosed herein greatly reduces power consumption levels of the image sensors of audience measurement systems. Moreover, the selective activation provided by example methods, apparatus, and articles of manufacture disclosed herein extends the operational lifetime of the image sensors of audience measurement systems by less frequently operating the corresponding illumination sources. Further, the selective activation provided by example methods, apparatus, and articles of manufacture disclosed herein reduces or even eliminates the need for heat sinking devices and/or techniques otherwise required to dissipate heat generated by the illumination sources. In addition to the reduced resource consumption provided by example methods, apparatus, and articles of manufacture disclosed herein, audience measurement methods, systems, and articles of manufacture employing the selective activation disclosed herein reduce the likelihood that panel members become irritated by light and/or glow emitted from illumination sources. As described above, illumination sources of image sensors typically face panel members in a media exposure environment (e.g., a television room) and, thus, the panel members are subjected to a blinking and/or glowing light whenever the illumination source is activated. In previous systems utilizing a high-resolution image sensor, each frame is captured with the use of the illumination source, resulting in a blinking light (e.g., a red light in the case of an IR LED flash unit being used as the illumination source) or a light that is seemingly persistently on throughout operation of the system. Selectively activating the illumination source for frames requiring high-resolution image data and not activating the illumination source for other frames, as disclosed herein, considerably reduces the instances of illumination of the media exposure environment and, thus, the potential for irritating the panel members.
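A rough sketch tying the two capture modes together is shown below (the Sensor and Illuminator interfaces are hypothetical stand-ins, and bin_pixels refers to the binning sketch above; this disclosure does not prescribe this structure):

```python
import numpy as np

class Illuminator:
    """Hypothetical stand-in for the IR LED driver."""
    def on(self) -> None: print("IR LED on")
    def off(self) -> None: print("IR LED off")

class Sensor:
    """Hypothetical stand-in for the high-resolution image sensor."""
    def grab(self) -> np.ndarray:
        return np.zeros((480, 640), dtype=np.uint32)  # placeholder frame

def capture(sensor: Sensor, illuminator: Illuminator,
            needs_facial_recognition: bool) -> np.ndarray:
    """Activate the illumination source only for frames designated for
    high-resolution processing (minority mode); otherwise capture without
    illumination and compensate via pixel binning (majority mode)."""
    if needs_facial_recognition:
        illuminator.on()        # minority capture mode: flash active
        frame = sensor.grab()   # well-lit, full-resolution frame
        illuminator.off()
        return frame            # suitable for facial recognition
    frame = sensor.grab()       # majority capture mode: no flash
    return bin_pixels(frame)    # brighter but lower-resolution frame
```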
The example audience measurement device 104 of
In the illustrated example, the frames obtained by the image sensor 204 of
In the illustrated example, some frames obtained by the image sensor 204 of
The example people counter 206 of
The example content identifier 202 of
In the illustrated example of
While an example manner of implementing the audience measurement device 104 of
The resolution determiner 300 of the illustrated example determines in which mode the image sensor 204 is to operate for a given time period, an image frame and/or set of image frames. In the illustrated example of
When the example image sensor 204 of
When the example image sensor 204 of
The example image capturer 308 of
Further, the example image capturer 308 of
For frames designated by the resolution determiner 300 as frames captured in the majority mode, the illuminator 306 was inactive and, thus, the image data is of high resolution but likely to be dark and of low contrast. For many of these frames, the contrast level is likely to be too low for proper analysis (e.g., people counting) to be performed accurately. Accordingly, the image data captured in the majority mode may not be in condition (e.g., may not have high enough contrast levels) for an accurate execution of the people counting process of the people counter 206. Therefore, in response to the instruction from the resolution determiner 300 that the received frame(s) were captured in the majority mode, the example frame router 310 of
The example pixel binner 312 of
While an example manner of implementing the image sensor 204 of
As mentioned above, the example processes of
In the example of
The example pixel binner 312 of
Referring back to block 402, when the resolution determiner 300 determines that the next frame(s) to be captured are going to be subjected to the facial recognition process, the example resolution determiner 300 provides an instruction to the illumination controller 304 to enable the illuminator 306 during capture of the next frame(s) (block 414). In such instances, the illumination controller 304 cooperates with the image capturer 308 to capture the next frame(s) with the illuminator 306 activated (e.g., emitting light, such as IR light into the exposure environment 100 of
While the example image sensor of
The example processor system 510 of
The processor 512 of
In general, the system memory 524 may include any desired type of volatile and/or non-volatile memory such as, for example, static random access memory (SRAM), dynamic random access memory (DRAM), flash memory, read-only memory (ROM), double data rate memory (DDR), etc. The mass storage memory 525 may include any desired type of mass storage device including hard disk drives, optical drives, tape storage devices, etc. The machine readable instructions of
The I/O controller 522 performs functions that enable the processor 512 to communicate with peripheral input/output (I/O) devices 526 and 528 and a network interface 530 via an I/O bus 532. The I/O devices 526 and 528 may be any desired type of I/O device such as, for example, a keyboard, a video display or monitor, a mouse, etc. The network interface 530 may be, for example, an Ethernet device, an asynchronous transfer mode (ATM) device, an 802.11 device, a digital subscriber line (DSL) modem, a cable modem, a cellular modem, etc. that enables the processor system 510 to communicate with another processor system. The example network interface 530 of
While the memory controller 520 and the I/O controller 522 are depicted in
Although certain example apparatus, methods, and articles of manufacture have been disclosed herein, the scope of coverage of this patent is not limited thereto. On the contrary, this patent covers all apparatus, methods, and articles of manufacture fairly falling within the scope of the claims of this patent.
This patent arises from a continuation of U.S. patent application Ser. No. 16/878,935, filed May 20, 2020, now U.S. Pat. No. 11,245,839, which is a continuation of U.S. patent application Ser. No. 16/196,810, filed Nov. 20, 2018, which is a continuation of U.S. patent application Ser. No. 15/793,108, filed Oct. 25, 2017, now U.S. Pat. No. 10,165,177, which is a continuation of U.S. patent application Ser. No. 15/419,120, filed Jan. 30, 2017, now U.S. Pat. No. 9,843,717, which is a continuation of U.S. patent application Ser. No. 14/732,175, filed Jun. 5, 2015, now U.S. Pat. No. 9,560,267, which is a continuation of U.S. patent application Ser. No. 13/327,227, filed Dec. 15, 2011, now U.S. Pat. No. 9,082,004. U.S. patent application Ser. No. 16/878,935, U.S. patent application Ser. No. 16/196,810, U.S. patent application Ser. No. 15/793,108, U.S. patent application Ser. No. 15/419,120, U.S. patent application Ser. No. 14/732,175, and U.S. patent application Ser. No. 13/327,227 are hereby incorporated herein by reference in their entirety. Priority to U.S. patent application Ser. No. 16/878,935, U.S. patent application Ser. No. 16/196,810, U.S. patent application Ser. No. 15/793,108, U.S. patent application Ser. No. 15/419,120, U.S. patent application Ser. No. 14/732,175 and U.S. patent application Ser. No. 13/327,227 is claimed.
Number | Name | Date | Kind |
---|---|---|---|
4677466 | Lert, Jr. et al. | Jun 1987 | A |
4695879 | Weinblatt | Sep 1987 | A |
4858000 | Lu | Aug 1989 | A |
4859050 | Borah et al. | Aug 1989 | A |
4870579 | Hey | Sep 1989 | A |
5031228 | Lu | Jul 1991 | A |
5305464 | Frett | Apr 1994 | A |
5331544 | Lu et al. | Jul 1994 | A |
5550928 | Lu et al. | Aug 1996 | A |
5676138 | Zawilinski | Oct 1997 | A |
5842199 | Miller et al. | Nov 1998 | A |
5983129 | Cowan et al. | Nov 1999 | A |
5987154 | Gibbon et al. | Nov 1999 | A |
6014461 | Hennessey et al. | Jan 2000 | A |
6190314 | Ark et al. | Feb 2001 | B1 |
6292688 | Patton | Sep 2001 | B1 |
6297859 | George | Oct 2001 | B1 |
6422999 | Hill | Jul 2002 | B1 |
6453241 | Bassett, Jr. et al. | Sep 2002 | B1 |
6652283 | Van Schaack et al. | Nov 2003 | B1 |
7024024 | Aiazian | Apr 2006 | B1 |
7043056 | Edwards et al. | May 2006 | B2 |
7440593 | Steinberg et al. | Oct 2008 | B1 |
7587728 | Wheeler et al. | Sep 2009 | B2 |
7602524 | Eichhorn et al. | Oct 2009 | B2 |
7631324 | Buonasera et al. | Dec 2009 | B2 |
7636456 | Collins et al. | Dec 2009 | B2 |
7636748 | Duarte et al. | Dec 2009 | B2 |
7676065 | Wiedemann et al. | Mar 2010 | B2 |
7697735 | Adam et al. | Apr 2010 | B2 |
7782297 | Zalewski et al. | Aug 2010 | B2 |
7787697 | Ritzau et al. | Aug 2010 | B2 |
7796154 | Senior et al. | Sep 2010 | B2 |
7882514 | Nielsen et al. | Feb 2011 | B2 |
7899209 | Greiffenhagen et al. | Mar 2011 | B2 |
7930199 | Hill | Apr 2011 | B1 |
8150109 | Sung et al. | Apr 2012 | B2 |
8230457 | Lee et al. | Jul 2012 | B2 |
8296172 | Marci et al. | Oct 2012 | B2 |
8620088 | Lee | Dec 2013 | B2 |
8684742 | Siefert | Apr 2014 | B2 |
8764652 | Lee et al. | Jul 2014 | B2 |
8769557 | Terrazas | Jul 2014 | B1 |
8782681 | Lee et al. | Jul 2014 | B2 |
8793715 | Weitzenfeld et al. | Jul 2014 | B1 |
8973022 | Lee et al. | Mar 2015 | B2 |
9082004 | Nielsen | Jul 2015 | B2 |
9407958 | Terrazas | Aug 2016 | B2 |
9454646 | Siefert | Sep 2016 | B2 |
9514436 | Marci et al. | Dec 2016 | B2 |
9514439 | Marci et al. | Dec 2016 | B2 |
9560267 | Nielsen | Jan 2017 | B2 |
9843717 | Nielsen | Dec 2017 | B2 |
10102486 | Kaiser et al. | Oct 2018 | B1 |
10165177 | Nielsen | Dec 2018 | B2 |
10171869 | Terrazas | Jan 2019 | B2 |
10198713 | Marci et al. | Feb 2019 | B2 |
10248195 | Siefert | Apr 2019 | B2 |
10839350 | Marci et al. | Nov 2020 | B2 |
10992985 | Terrazas | Apr 2021 | B2 |
11032610 | Terrazas | Jun 2021 | B2 |
11200964 | Siefert | Dec 2021 | B2 |
11245839 | Nielsen | Feb 2022 | B2 |
11470243 | Nielsen | Oct 2022 | B2 |
20010013009 | Greening et al. | Aug 2001 | A1 |
20020002525 | Arai et al. | Jan 2002 | A1 |
20020010919 | Lu et al. | Jan 2002 | A1 |
20020013717 | Ando et al. | Jan 2002 | A1 |
20020059577 | Lu et al. | May 2002 | A1 |
20020073417 | Kondo et al. | Jun 2002 | A1 |
20030046683 | Jutzi | Mar 2003 | A1 |
20030081834 | Philomin et al. | May 2003 | A1 |
20030093784 | Dimitrova et al. | May 2003 | A1 |
20030093792 | Labeeb et al. | May 2003 | A1 |
20040088289 | Xu et al. | May 2004 | A1 |
20040122679 | Neuhauser et al. | Jun 2004 | A1 |
20040183749 | Vertegaal | Sep 2004 | A1 |
20040219184 | Brown et al. | Nov 2004 | A1 |
20050117783 | Sung et al. | Jun 2005 | A1 |
20050172311 | Hjelt et al. | Aug 2005 | A1 |
20060026642 | Schaffer et al. | Feb 2006 | A1 |
20060041548 | Parsons et al. | Feb 2006 | A1 |
20060062429 | Ramaswamy et al. | Mar 2006 | A1 |
20060129458 | Maggio | Jun 2006 | A1 |
20060133699 | Widrow et al. | Jun 2006 | A1 |
20060251292 | Gokturk et al. | Nov 2006 | A1 |
20070066916 | Lemos | Mar 2007 | A1 |
20070150916 | Begole et al. | Jun 2007 | A1 |
20070154063 | Breed | Jul 2007 | A1 |
20070162927 | Ramaswamy et al. | Jul 2007 | A1 |
20070244739 | Soito et al. | Oct 2007 | A1 |
20070250901 | McIntire et al. | Oct 2007 | A1 |
20070263934 | Ojima et al. | Nov 2007 | A1 |
20070294126 | Maggio | Dec 2007 | A1 |
20080091512 | Marci et al. | Apr 2008 | A1 |
20080097854 | Young | Apr 2008 | A1 |
20080147742 | Allen | Jun 2008 | A1 |
20080221400 | Lee et al. | Sep 2008 | A1 |
20080222670 | Lee et al. | Sep 2008 | A1 |
20080232650 | Suzuki et al. | Sep 2008 | A1 |
20080243590 | Rich | Oct 2008 | A1 |
20080255904 | Park et al. | Oct 2008 | A1 |
20080271065 | Buonasera et al. | Oct 2008 | A1 |
20080295126 | Lee et al. | Nov 2008 | A1 |
20090070798 | Lee et al. | Mar 2009 | A1 |
20090088610 | Lee et al. | Apr 2009 | A1 |
20090091650 | Kodama | Apr 2009 | A1 |
20090131764 | Lee et al. | May 2009 | A1 |
20090177528 | Wu et al. | Jul 2009 | A1 |
20090265215 | Lindstrom | Oct 2009 | A1 |
20090307084 | Monighetti et al. | Dec 2009 | A1 |
20090310829 | Baba et al. | Dec 2009 | A1 |
20090328089 | Pradeep et al. | Dec 2009 | A1 |
20100004977 | Marci et al. | Jan 2010 | A1 |
20100046797 | Strat et al. | Feb 2010 | A1 |
20100162285 | Cohen et al. | Jun 2010 | A1 |
20100211439 | Marci et al. | Aug 2010 | A1 |
20100245567 | Krahnstoever et al. | Sep 2010 | A1 |
20100274372 | Nielsen et al. | Oct 2010 | A1 |
20100290538 | Xu et al. | Nov 2010 | A1 |
20110019924 | Elgersma et al. | Jan 2011 | A1 |
20110122255 | Haritaoglu | May 2011 | A1 |
20110137721 | Bansal | Jun 2011 | A1 |
20110164188 | Karaoguz et al. | Jul 2011 | A1 |
20110169953 | Sandler et al. | Jul 2011 | A1 |
20110243459 | Deng | Oct 2011 | A1 |
20110265110 | Weinblatt | Oct 2011 | A1 |
20110285845 | Bedros et al. | Nov 2011 | A1 |
20120027299 | Ran | Feb 2012 | A1 |
20120081392 | Arthur | Apr 2012 | A1 |
20120120296 | Roberts et al. | May 2012 | A1 |
20120124604 | Small et al. | May 2012 | A1 |
20120151079 | Besehanic et al. | Jun 2012 | A1 |
20120253921 | Pradeep et al. | Oct 2012 | A1 |
20120314914 | Karakotsios et al. | Dec 2012 | A1 |
20120324493 | Holmdahl et al. | Dec 2012 | A1 |
20130013396 | Vinson et al. | Jan 2013 | A1 |
20130051677 | Lee | Feb 2013 | A1 |
20130055300 | Hanina | Feb 2013 | A1 |
20130103624 | Thieberger | Apr 2013 | A1 |
20130119649 | Sato et al. | May 2013 | A1 |
20130129159 | Huijgens et al. | May 2013 | A1 |
20130152113 | Conrad et al. | Jun 2013 | A1 |
20130156273 | Nielsen | Jun 2013 | A1 |
20140189720 | Terrazas | Jul 2014 | A1 |
20140259034 | Terrazas | Sep 2014 | A1 |
20150086070 | Deng et al. | Mar 2015 | A1 |
20150271390 | Nielsen | Sep 2015 | A1 |
20160007083 | Gurha | Jan 2016 | A1 |
20160255384 | Marci et al. | Sep 2016 | A1 |
20160323640 | Terrazas | Nov 2016 | A1 |
20170142330 | Nielsen | May 2017 | A1 |
20180048807 | Nielsen | Feb 2018 | A1 |
20190089894 | Nielsen | Mar 2019 | A1 |
20190164130 | Marci et al. | May 2019 | A1 |
20190182544 | Terrazas | Jun 2019 | A1 |
20190222894 | Terrazas | Jul 2019 | A1 |
20200351436 | Nielsen | Nov 2020 | A1 |
20210065117 | Marci et al. | Mar 2021 | A1 |
20210297738 | Terrazas | Sep 2021 | A1 |
20220279245 | Terrazas | Sep 2022 | A1 |
20220279246 | Terrazas | Sep 2022 | A1 |
20220279247 | Terrazas | Sep 2022 | A1 |
20220279248 | Terrazas | Sep 2022 | A1 |
20220286601 | Nielsen | Sep 2022 | A1 |
Number | Date | Country |
---|---|---|
9927668 | Jun 1999 | WO |
2004030350 | Apr 2004 | WO |
2005041166 | May 2005 | WO |
2005115011 | Dec 2005 | WO |
2006099612 | Sep 2006 | WO |
Entry |
---|
United States Patent and Trademark Office, “Notice of Allowance and Fee(s) Due,” issued in connection with U.S. Appl. No. 17/751,283, dated Jul. 20, 2022, 8 pages. |
Wikipedia. (Dec. 2010) “Kinect.” http://en.wikipedia.org/wiki/Kinect, 15 pages. |
Fellers et al. (Aug. 2004) “Nikon MicroscopyU: Interactive Java Tutorials - CCD Signal-to-Noise Ratio.” http://www.microscopyu.com/tutorials/java/digitalimaging/signaltonoise/index.html as archived by The Internet Archive, www.archive.org, 4 pages. |
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 13/327,227, dated May 20, 2014, 15 pages. |
United States Patent and Trademark Office, “Final Office Action,” issued in connection with U.S. Appl. No. 13/327,227, dated Sep. 18, 2014, 20 pages. |
United States Patent and Trademark Office, “Notice of Allowance and Fee(s) Due,” issued in connection with U.S. Appl. No. 13/327,227, dated Mar. 6, 2015, 9 pages. |
United States Patent and Trademark Office, “Requirement for Restriction/Election,” issued in connection with U.S. Appl. No. 14/732,175, dated Jul. 8, 2016, 6 pages. |
United States Patent and Trademark Office, “Notice of Allowance and Fee(s) Due,” issued in connection with U.S. Appl. No. 14/732,175, dated Sep. 23, 2016, 10 pages. |
United States Patent and Trademark Office, “Notice of Allowance and Fee(s) Due,” issued in connection with U.S. Appl. No. 15/419,120, dated Aug. 2, 2017, 10 pages. |
United States Patent and Trademark Office, “Notice of Allowance and Fee(s) Due,” issued in connection with U.S. Appl. No. 15/793,108, dated Aug. 23, 2018, 10 pages. |
United States Patent and Trademark Office, “Non-Final Office action,” issued in connection with U.S. Appl. No. 15/793,108, dated Mar. 2, 2018, 11 pages. |
United States Patent and Trademark Office, “Advisory action,” issued in connection with U.S. Appl. No. 16/196,810, dated May 13, 2020, 2 pages. |
United States Patent and Trademark Office, “Final Office action,” issued in connection with U.S. Appl. No. 16/196,810, dated Feb. 26, 2020, 23 pages. |
United States Patent and Trademark Office, “Non-Final Office action,” issued in connection with U.S. Appl. No. 16/196,810, dated Nov. 7, 2019, 15 pages. |
United States Patent and Trademark Office, “Advisory action,” issued in connection with U.S. Appl. No. 16/196,810, dated Aug. 22, 2019, 2 pages. |
United States Patent and Trademark Office, “Final Office action,” issued in connection with U.S. Appl. No. 16/196,810, dated Jun. 13, 2019, 16 pages. |
United States Patent and Trademark Office, “Non-Final Office action,” issued in connection with U.S. Appl. No. 16/196,810, dated Mar. 1, 2019, 14 pages. |
United States Patent and Trademark Office, “Non-Final Office action,” issued in connection with U.S. Appl. No. 16/878,935, dated Jun. 10, 2021, 9 pages. |
United States Patent and Trademark Office, “Notice of Allowance and Fee(s) Due,” issued in connection with U.S. Appl. No. 16/878,935, dated Sep. 27, 2021, 7 pages. |
United States Patent and Trademark Office, “Notice of Allowance,” mailed in connection with U.S. Appl. No. 13/728,515, dated Feb. 19, 2014, 13 pages. |
United States Patent and Trademark Office, “Non-final Office Action,” issued in connection with U.S. Appl. No. 14/281,139, dated Nov. 3, 2015, 11 pages. |
United States Patent and Trademark Office, “Notice of Allowance,” mailed in connection with U.S. Appl. No. 14/281,139, dated Mar. 16, 2016, 7 pages. |
United States Patent and Trademark Office, “Notice of Allowance,” mailed in connection with U.S. Appl. No. 15/206,932, dated Aug. 8, 2018, 8 pages. |
United States Patent and Trademark Office, “Final Office Action,” mailed in connection with U.S. Appl. No. 15/206,932, dated Apr. 24, 2018, 14 pages. |
United States Patent and Trademark Office, “Non-final Office Action,” mailed in connection with U.S. Appl. No. 15/206,932, dated Dec. 14, 2017, 15 pages. |
United States Patent and Trademark Office, “Notice of Allowance and Fee(s) Due,” issued in connection with U.S. Appl. No. 16/209,635, dated Dec. 23, 2020, 8 pages. |
United States Patent and Trademark Office, “Non-final Office Action,” issued in connection with U.S. Appl. No. 16/209,635, dated Aug. 7, 2020, 14 pages. |
United States Patent and Trademark Office, “Non-final Office Action,” mailed in connection with U.S. Appl. No. 16/360,976, dated Oct. 1, 2020, 15 pages. |
United States Patent and Trademark Office, “Notice of Allowance and Fee(s) Due,” issued in connection with U.S. Appl. No. 16/360,976, dated Feb. 3, 2021, 8 pages. |
United States Patent and Trademark Office, “Non-final Office Action,” issued in connection with U.S. Appl. No. 17/341,104, dated Jul. 13, 2022, 16 pages. |
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 17/750,147, dated Aug. 16, 2022, 19 pages. |
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 17/750,159, dated Aug. 15, 2022, 11 pages. |
Horovitz, Bruce, “Watching Ads Is Real Science: Research Companies Monitor Physiological Reactions To Commercials To Determine Their Effectiveness,” [3 Star Edition], Los Angeles Times, Orlando Sentinel, Sep. 1, 1991, D2, accessed on Mar. 25, 2013, 2 pages. |
Hazlett et al., “Emotional Response to Television Commercials: Facial EMG vs. Self-Report,” Journal of Advertising Research, (Apr. 1999), 17 pages. |
Neurofocus - Neuroscientific Analysis for Audience Engagement, accessed on Jan. 8, 2010 at http://web.archive.org/web/20080621114525/www.neurofocus.com/BrandImage.htm, (2008), 2 pages. |
Krugman, “Brain Wave Measures of Media Involvement,” Journal of Advertising Research, vol. 11, No. 1, pp. 3-9 (Feb. 1971), 7 pages. |
Micu, A.C. et al., “Measurable Emotions: How Television Ads Really Work: How the Patterns of Reactions to Commercials can Demonstrate Advertising Effectiveness”, Management Slant, 50(2), Jun. 2010; pp. 1-17, 18 pages. |
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 17/750,157, dated Sep. 19, 2022, 22 pages. |
United States Patent and Trademark Office, “Non-Final Office Action,” issued in connection with U.S. Appl. No. 17/750,141, dated Sep. 13, 2022, 24 pages. |
Jia et al., “Extending the Feature Set for Facial Recognition,” Apr. 7-9, 1992, 6 pages. |
Turk et al., “Eigenfaces for Recognition,” Journal of Cognitive Neuroscience, vol. 3, No. 1, 1991, 17 pages. |
United States Patent and Trademark Office, “Notice of Allowance and Fee(s) Due”, issued in connection with U.S. Appl. No. 17/341,104 dated Jan. 9, 2023, 8 pages. |
Teixeira et al., “A Survey of Human-Sensing: Methods for Detecting Presence, Count, Location, Track, and Identity,” ENALAB Technical Report, vol. 1, No. 1, Sep. 2010, 41 pages. |
U.S. Appl. No. 09/427,970, titled “Audio Signature Extraction and Correlation,” filed with the United States Patent and Trademark Office dated Oct. 27, 1999, 51 pages. |
Erik Larson, “Watching Americans Watch TV,” The Atlantic Monthly, Mar. 1992, 12 pages. |
The Nielsen Company (US), LLC v. TVision Insights, Inc., “Amended Answer to Complaint for Patent Infringement,” filed with the United States District Court for the District of Delaware by TVision Insights, Inc., in connection with Case 1:22-cv-01345-CJB, Feb. 8, 2023, 34 pages. |
United States Patent and Trademark Office, “Notice of Allowability,” issued in connection with U.S. Appl. No. 17/750,159, dated Feb. 13, 2023, 3 pages. |
United States Patent and Trademark Office, “Notice of Allowability,” issued in connection with U.S. Appl. No. 17/750,147, dated Feb. 13, 2023, 4 pages. |
The Nielsen Company (US), LLC v. TVision Insights, Inc., “Answer to Complaint for Patent Infringement,” filed with the United States District Court for the District of Delaware by TVision Insights, Inc., in connection with Case 1:22-cv-01345-CJB, Dec. 19, 2022, 16 pages. |
United States Patent and Trademark Office, “Final Office Action,” issued in connection with U.S. Appl. No. 17/750,157, dated Dec. 30, 2022, 22 pages. |
United States Patent and Trademark Office, “Final Office Action,” issued in connection with U.S. Appl. No. 17/750,141, dated Dec. 30, 2022, 26 pages. |
United States Patent and Trademark Office, “Notice of Allowance and Fee(s) Due,” issued in connection with U.S. Appl. No. 17/750,159, dated Dec. 19, 2022, 9 pages. |
United States Patent and Trademark Office, “Notice of Allowance and Fee(s) Due,” issued in connection with U.S. Appl. No. 17/750,147, dated Dec. 14, 2022, 8 pages. |
The Nielsen Company (US), LLC v. TVision Insights, Inc., “Exhibit A-Exhibit T,” filed with the United States District Court for the District of Delaware by The Nielsen Company (US), LLC in connection with Case 1:22-cv-01345-CJB, Oct. 12, 2022, 240 pages. |
The Nielsen Company (US), LLC v. TVision Insights, Inc., “Complaint for Patent Infringement,” filed with the United States District Court for the District of Delaware by The Nielsen Company (US), LLC in connection with Case 1:22-cv-01345-CJB, Oct. 12, 2022, 21 pages. |
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 17/750,147 dated Mar. 15, 2023, 17 pages. |
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 17/750,159 dated Mar. 15, 2023, 16 pages. |
Number | Date | Country | |
---|---|---|
20220256075 A1 | Aug 2022 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 16878935 | May 2020 | US |
Child | 17666322 | US | |
Parent | 16196810 | Nov 2018 | US |
Child | 16878935 | US | |
Parent | 15793108 | Oct 2017 | US |
Child | 16196810 | US | |
Parent | 15419120 | Jan 2017 | US |
Child | 15793108 | US | |
Parent | 14732175 | Jun 2015 | US |
Child | 15419120 | US | |
Parent | 13327227 | Dec 2011 | US |
Child | 14732175 | US |