This disclosure relates generally to audience measurement and, more particularly, to methods and apparatus to detect user attentiveness to handheld computing devices.
Audience measurement of media (e.g., content or advertisements) delivered in any format (e.g., via terrestrial, cable, or satellite television and/or radio, stored audio and/or video played back from a memory such as a digital video recorder or an optical disc, a webpage, audio and/or video presented (e.g., streamed) via the Internet, video games, etc.) often involves collection of media identifying data (e.g., signature(s), fingerprint(s), code(s), tuned channel identification information, time of exposure information, etc.) and people data (e.g., user identifiers, demographic data associated with audience members, etc.). The media identifying data and the people data can be combined to generate, for example, media exposure data indicative of amount(s) and/or type(s) of people that were exposed to specific piece(s) of media.
In some audience measurement systems, exposure data is collected in connection with usage of one or more computing devices. For example, audience measurement systems often employ one or more techniques to determine user exposure to media via browsing the Internet via computing devices. The exposure data can be correlated with the identities and/or demographics of users to, for example, generate statistics for the detected media. For example, an audience measurement entity (e.g., Nielsen®) can calculate ratings and/or other statistics (e.g., online exposure statistics, such as a number of impressions for a web address that hosts an advertisement) for a piece of media (e.g., an advertisement, a website, a movie, a song, an album, a news segment, personal video (e.g., a YouTube® video), a highlight reel, a television program, a radio program, etc.) accessed via a computing device by crediting the piece of media as being presented on the computing device at a first time and identifying the audience member(s) using the computing device at the first time. Some known systems credit exposure to the media and generate statistics based on such crediting irrespective of the fact that the user(s) may be paying little or no attention to the presentation of the media.
Examples disclosed herein recognize that although media may be presented on a computing device, a current user may or may not be paying attention to (e.g., be engaged with) the presentation of the media. For example, when viewing online media (e.g., via a service such as Hulu®) on a handheld computing device (e.g., an iPad® or iPhone®), users are often presented with advertisements at one or more points or segments in the presented programming. The user is typically unable to fast-forward or skip the advertisement. However, the user can easily disengage from (e.g., stop paying attention to) the handheld computing device during presentation of the advertisement by, for example, putting the handheld computing device down or turning the handheld computing device away from view. In such instances, a known prior monitoring service measuring exposure to the advertisement would still credit the advertisement as having been watched by the user, even though the user paid no attention to the presentation.
Example methods, apparatus, and articles of manufacture disclosed herein measure attentiveness of users of handheld computing devices with respect to one or more pieces of media presented on the handheld computing devices. A first example measure of attentiveness for a user provided by examples disclosed herein is referred to herein as engagement likelihood. As used herein, an engagement likelihood associated with a presented piece of media refers to a value representative of a confidence that the user is paying or has begun paying attention to a presentation on a handheld computing device. A second example measure of attentiveness for a user provided by examples disclosed herein is referred to herein as disengagement likelihood. As used herein, a disengagement likelihood associated with a presented piece of media refers to a value representative of a confidence that the user is not paying (or has ceased paying) attention to a presentation on the handheld computing device.
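For illustration only, the two disclosed measures can be thought of as confidence values carried alongside a media presentation. The following minimal Python sketch is not the patented implementation; the class and field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class AttentivenessEstimate:
    """Hypothetical container for the two disclosed measures of attentiveness."""
    engagement_likelihood: float = 0.0     # confidence the user is (or is beginning to be) engaged
    disengagement_likelihood: float = 0.0  # confidence the user is not (or has ceased to be) engaged

# Example: a 75% confidence that the user is paying attention to the presentation.
estimate = AttentivenessEstimate(engagement_likelihood=0.75)
print(estimate)
```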
As used herein, the term “handheld computing device” refers to any form of processor-based device that can be, and is intended to be, simultaneously held in the air and operated by one or more hands of a user. In other words, as used herein, a handheld computing device is readily moved and held by the hand(s) of a user and is designed to receive input from the user while being held in the air by the hand(s) of the user. While a handheld computing device can remain stationary during user operation, a handheld computing device is not one designed or mainly meant to remain stationary during interaction with a user, as a desktop computer is. For example, a handheld computing device such as a tablet or smart phone can be placed on a table and operated by a user while resting on the table. However, unlike non-handheld computing devices such as desktop computers, the tablet can also be picked up and operated by the user with one or both hands.
To determine a likelihood that a user is paying attention to (e.g., engaged with) or not paying attention to (e.g., disengaged from) a handheld computing device that is presenting media, examples disclosed herein utilize sensors of the handheld computing device (e.g., gravitational sensors (e.g., accelerometers, gyroscopes, tilt sensors), microphones, magnetometers, global positioning sensors, etc.) to detect one or more spatial (e.g., position, movement, and/or orientation) conditions related to the handheld computing device while, for example, the media is being presented. Example spatial conditions detected by the sensor(s) of the handheld computing device include an angular orientation or tilt relative to one or more reference lines (e.g., a horizontal reference line, a vertical reference line, etc.), a distance from a nearest object (e.g., a user), a proximity to a person, etc. Examples disclosed herein also detect changes to current spatial conditions, such as a change from a first orientation to a second orientation and/or a change from a first position relative to a user to a second position relative to the user. Examples disclosed herein compare detected change(s) to an index of likelihoods, each likelihood corresponding to a respective one of a plurality of possible changes (e.g., a first position to a second position). In other words, the likelihoods of the index provided by examples disclosed herein are indicative of how likely it is that a user is engaged with or disengaged from a presentation on the handheld computing device when the user changes the handheld computing device from a first spatial condition to a second spatial condition. For example, a first example engagement likelihood of the example index disclosed herein indicates that the user is likely (e.g., according to a corresponding percentage) to be paying attention to a screen of the handheld computing device and/or likely beginning to pay attention to the screen of the handheld computing device when the user changes the orientation of the handheld computing device from parallel to the ground (e.g., resting on a table) to a forty-five degree angle, facing downward, relative to a horizontal reference that is parallel to the ground (e.g., being held above the user while the user is lying down). Conversely, a first example disengagement likelihood of the example index disclosed herein indicates that the user is unlikely to be paying attention to the screen of the handheld computing device and/or likely to begin disengaging from the screen of the handheld computing device when the user changes the orientation of the handheld computing device from a forty-five degree angle relative to the horizontal reference that is parallel to the ground to a position that is parallel to the ground.
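The two concrete orientation-change examples above can be pictured as entries in an index keyed by a (starting condition, ending condition) pair. The sketch below is a hypothetical illustration only; the spatial-condition labels and percentage values are invented and are not specified by this disclosure.

```python
# Hypothetical sketch of an engagement/disengagement likelihood index keyed by an
# orientation change. Labels and numeric likelihoods are illustrative assumptions.
LIKELIHOOD_INDEX = {
    # (starting spatial condition, ending spatial condition): (likelihood type, value)
    ("flat_on_table", "tilted_45_deg_overhead"): ("engagement", 0.80),
    ("tilted_45_deg_overhead", "flat_on_table"): ("disengagement", 0.85),
}

def lookup(change):
    """Return the (type, likelihood) entry matching a detected spatial-condition change."""
    return LIKELIHOOD_INDEX.get(change)

print(lookup(("flat_on_table", "tilted_45_deg_overhead")))  # ('engagement', 0.8)
```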
A second example disengagement likelihood of the example index disclosed herein indicates that the user is unlikely to be paying attention to the screen of the handheld computing device and/or likely to begin disengaging from the screen of the handheld computing device when the user changes a position of the handheld computing device relative to the user from a first position proximate the user to a second position in which the user is undetected (e.g., the device is too far away from the user for the sensors of the handheld computing device to determine a distance between the handheld computing device and the user). Conversely, a second example engagement likelihood of the example index disclosed herein indicates that the user is likely to be paying attention to the screen of the handheld computing device and/or beginning to pay attention to the screen of the handheld computing device when the user changes a position of the handheld computing device relative to the user from the second position (e.g., an undetectable distance from the user) to the first position (e.g., proximate the user).
Other example engagement and disengagement likelihoods of the example index disclosed herein correspond to changes in orientation combined with changes in relative position. In other words, some example engagement likelihoods of the example index disclosed herein indicate how likely it is that the user is paying attention to or is beginning to pay attention to the presentation on the handheld computing device when a certain change in orientation coincides with a certain change in relative position (e.g., a change in distance between the device and the user). Additionally or alternatively, some example disengagement likelihoods of the example index disclosed herein indicate how likely it is that the user is not paying attention to or is beginning to disengage from the presentation on the handheld computing device when a certain change in orientation coincides with a certain change in relative position (e.g., a change in distance between the device and the user).
Using the example index disclosed herein, user attentiveness to handheld computing devices can be passively collected. As a user interacts with a handheld computing device, examples disclosed herein detect change(s) in orientation and/or relative position (e.g., of the device with respect to the user) and compare the detected change(s) to the engagement/disengagement likelihood index. If the detected change(s) correspond to (e.g., within a threshold) one or more of the changes of the engagement/disengagement likelihood index, examples disclosed herein determine that the corresponding likelihood represents how likely it is that the current user is paying attention to the handheld computing device, beginning to pay attention to the handheld computing device, not paying attention to the handheld computing device, and/or beginning to disengage from the handheld computing device. The attentiveness measurements provided by examples disclosed herein can be used to, for example, increase granularity and accuracy of exposure measurement data generated in connection with the media being presented on the handheld computing device.
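A detected change rarely matches an indexed change exactly, so the comparison described above is made "within a threshold." The following sketch illustrates one way such tolerance-based matching could work; the angle representation, entries, and tolerance are assumptions made for illustration, not values taken from the disclosure.

```python
# Hypothetical sketch of matching a detected orientation change against indexed changes
# within a tolerance. Angles are in degrees from horizontal.
INDEXED_CHANGES = [
    {"start": 0.0, "end": 45.0, "type": "engagement", "likelihood": 0.80},
    {"start": 45.0, "end": 0.0, "type": "disengagement", "likelihood": 0.85},
]

def match_change(start_angle, end_angle, tolerance_deg=10.0):
    """Return indexed entries whose start/end orientations fall within the tolerance."""
    return [
        entry for entry in INDEXED_CHANGES
        if abs(entry["start"] - start_angle) <= tolerance_deg
        and abs(entry["end"] - end_angle) <= tolerance_deg
    ]

# A change from roughly flat (3 degrees) to roughly 40 degrees matches the engagement entry.
print(match_change(3.0, 40.0))
```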
The example handheld device 108 of
To detect attentiveness of a current user of the handheld computing device 108 to a presentation of media on the handheld computing device 108, the example exposure measurement application 112 includes an attentiveness detector 204. The example attentiveness detector 204 of
The example attentiveness detector 204 of
The example exposure measurement application 112 of
In some examples, the media detector 210 sends a signal to the attentiveness detector 204 in response to determining that the handheld computing device 108 is presenting media, thereby triggering the attentiveness detector 204 to collect user engagement/disengagement information. In such instances, the attentiveness detector 204 collects and interprets data from the sensors 200a-e while the handheld computing device 108 presents media such that the example attentiveness detector 204 determines whether, for example, a user is paying attention or beginning to pay attention to the handheld computing device 108 when media is being presented on the handheld computing device 108. In other words, the example attentiveness detector 204 of
In the illustrated example of
While an example manner of implementing the exposure measurement application 112 of
Additionally or alternatively, the orientation detector 300 uses data from the sensors 200a-e to determine an angle at which the handheld computing device 108 is tilted or oriented relative to a second reference line, such as a vertical reference line, of which the sensors 200a-e are aware. Such an angle is referred to herein as a vertical orientation. The example orientation detector 300 analyzes the vertical orientation of the handheld computing device 108 by determining whether one side or edge of the handheld computing device 108 is higher than an opposing side or edge with reference to the vertical reference line. Thus, when the handheld computing device 108 is resting against a wall with one edge on a flat surface, the example orientation detector 300 determines that the handheld computing device 108 is at a zero tilt that corresponds to the vertical reference line. In contrast, when the handheld computing device 108 is obtusely or acutely angled away/toward the vertical reference line by an angular amount while, for example, being held by a user, the example orientation detector 300 determines that the handheld computing device 108 is being held at the detected tilt (e.g., thirty degrees).
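One common way to derive such tilt angles is from a gravity (accelerometer) reading. The sketch below is a minimal illustration under assumed axis conventions; the disclosure does not prescribe this particular computation.

```python
import math

def tilt_from_gravity(ax, ay, az):
    """Estimate tilt angles (degrees) of a device from a 3-axis accelerometer reading.

    Assumes z is normal to the screen and x/y lie in the screen plane; this is an
    illustrative convention, not one specified by the disclosure.
    """
    # Tilt of the screen plane relative to a horizontal reference:
    # 0 degrees when the device lies flat, 90 degrees when held upright.
    horizontal_tilt = math.degrees(math.atan2(math.hypot(ax, ay), abs(az)))
    # Side-to-side tilt relative to a vertical reference (one edge higher than the other).
    vertical_tilt = math.degrees(math.atan2(ax, ay))
    return horizontal_tilt, vertical_tilt

# Device lying flat on a table: gravity is entirely along the z axis.
print(tilt_from_gravity(0.0, 0.0, 9.81))   # approximately (0.0, 0.0)
# Device propped at roughly forty-five degrees.
print(tilt_from_gravity(0.0, 6.94, 6.94))  # approximately (45.0, 0.0)
```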
To determine a position relative to a user and/or other objects, the example attentiveness detector 204 includes a position detector 302. In the illustrated example, the position detector 302 utilizes data received from the sensor interface 202 of
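As a rough illustration of the position detector's role, a distance or proximity reading can be classified into a position of the device relative to the user. The cutoff value and the notion of an "undetected" user below are assumptions made for this sketch only.

```python
# Hypothetical sketch of classifying the device's position relative to a user from a
# proximity/distance reading. The 0.75 m cutoff is an illustrative assumption.
def relative_position(distance_m):
    """Classify a distance reading (meters); None means no user was detected."""
    if distance_m is None:
        return "user_undetected"
    return "proximate_user" if distance_m <= 0.75 else "user_detected_far"

print(relative_position(0.4))   # proximate_user
print(relative_position(None))  # user_undetected
```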
In the illustrated example, the orientation detector 300 and the position detector 302 are triggered to collect and analyze data from the sensor interface 202 by, for example, the media detector 210 when the media detector 210 determines that the handheld computing device 108 is outputting media. Thus, in the illustrated example, the orientation detector 300 detects orientation(s) of the handheld computing device 108 when the handheld computing device 108 is presenting media and the position detector 302 detects a position of the handheld computing device 108 relative to a user when the handheld computing device 108 is presenting media to the user. In some examples, the example orientation detector 300 and/or the position detector 302 records a type of media being presented (e.g., as provided by the media detector 210 of
The example attentiveness detector 204 of
When the example change detector 304 detects a change in orientation and/or position of the handheld computing device 108, the example change detector 304 records a first set of spatial conditions (e.g., a first orientation(s) and/or a first position relative to the user) associated with the handheld computing device 108 immediately prior to the detected change, as well as a second set of spatial conditions (e.g., second orientation(s) and/or a second relative position) associated with the handheld computing device 108 immediately after the detected change. Accordingly, with respect to a detected change in spatial condition(s) of the handheld computing device 108, the example change detector 304 of
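A minimal sketch of this before/after recording behavior, assuming orientation is sampled as an angle and relative position as a label, follows; the class, threshold, and field names are hypothetical and only loosely mirror the description.

```python
# Hypothetical sketch of a change detector that snapshots the spatial conditions
# immediately before and after a detected change.
class ChangeDetector:
    def __init__(self, threshold=0.01):
        self.threshold = threshold   # minimum fractional change considered significant
        self.previous = None         # last sampled (orientation, relative_position)

    def update(self, orientation_deg, relative_position):
        """Return (start, end) spatial conditions when a significant change occurs, else None."""
        current = (orientation_deg, relative_position)
        detected = None
        if self.previous is not None:
            prev_angle, prev_pos = self.previous
            angle_changed = abs(orientation_deg - prev_angle) > self.threshold * 360.0
            if angle_changed or relative_position != prev_pos:
                detected = (self.previous, current)  # starting and ending conditions
        self.previous = current
        return detected

detector = ChangeDetector()
detector.update(0.0, "proximate_user")
print(detector.update(45.0, "proximate_user")) # ((0.0, 'proximate_user'), (45.0, 'proximate_user'))
```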
The example attentiveness detector 204 of
Some of the predefined spatial condition changes of the example index 306 are associated with an engagement likelihood, which reflects how likely it is that the respective change corresponds to a user being engaged with or beginning to engage with the handheld computing device 108. Additionally or alternatively, some of the predefined spatial condition changes of the example index 306 are associated with a disengagement likelihood, which reflects how likely it is that the respective change corresponds to the user being disengaged from or beginning to disengage from the handheld computing device 108.
As described above, the example exposure measurement application 112 of
As described above, for each detected change in a spatial condition (e.g., an orientation or a relative position) above a threshold, the example change detector 304 records a starting spatial condition and an ending spatial condition. The example attentiveness detector 204 of
Because the detected spatial condition change may include more than one aspect, the example comparator 308 may find more than one match in the likelihood index 306. For example, suppose the change detector 304 detects a change involving a first spatial condition change from a first horizontal orientation to a second horizontal orientation, a second spatial condition change from a first vertical orientation to a second vertical orientation, and a third spatial condition change from a first relative position to a second relative position. In such an instance, the example comparator 308 may find matches in the index 306 for the first and second spatial condition changes. Additionally or alternatively, the example comparator 308 may find a match in the index 306 for a combination or concurrence of the first and second spatial condition changes or a match for a combination or concurrence of the second and third spatial condition changes. As a result, more than one likelihood from the index 306 may apply to the detected change. For such instances, the example attentiveness detector 204 includes an aggregator 310 to aggregate the plurality of likelihoods when a detected change involves more than one matching spatial condition change from the index 306. In the illustrated example of
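The multi-match case can be sketched as finding all index entries keyed by any of the detected condition changes (including combined changes) and aggregating their likelihoods; averaging is used below because it is one of the aggregation options mentioned later in the description. The keys and values are illustrative assumptions.

```python
# Hypothetical sketch of the comparator/aggregator behavior when several indexed
# spatial-condition changes match a single detected change.
def find_matches(detected_changes, index):
    """Return index entries keyed by any of the detected spatial-condition changes."""
    return [index[change] for change in detected_changes if change in index]

def aggregate(likelihoods):
    """Average a list of matching likelihood values into a single measure."""
    return sum(likelihoods) / len(likelihoods)

index = {
    ("flat", "tilted_45"): 0.80,                     # horizontal-orientation change alone
    (("flat", "tilted_45"), ("far", "near")): 0.90,  # same change combined with a position change
}
matches = find_matches([("flat", "tilted_45"), (("flat", "tilted_45"), ("far", "near"))], index)
print(aggregate(matches))  # approximately 0.85
```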
In some examples, the plurality of likelihoods are output individually as separate measurements of user attentiveness (e.g., without being aggregated). For example, when a first one of the likelihoods corresponds to an engagement likelihood and a second one of the likelihoods corresponds to a disengagement likelihood, the comparator 308 of such examples outputs the two likelihoods individually without aggregating the first and second likelihoods.
In the illustrated example, when a single match is found in the likelihood index 306 for a detected change, the example comparator 308 outputs the corresponding likelihood as representative of likely engagement (or disengagement) of the current user with a presentation on the handheld computing device 108. Otherwise, in the illustrated example, when more than one match is found in the likelihood index 306 for a detected change, the example aggregator 310 outputs the aggregated likelihood as representative of likely engagement (or disengagement) of the current user with a presentation on the handheld computing device 108.
The example attentiveness detector 204 of
While an example manner of implementing the attentiveness detector 204 of
A flowchart representative of example machine readable instructions for implementing the example exposure measurement application 112 of
As mentioned above, the example processes of
After installation, the exposure measurement application 112 runs in the background (e.g., does not require manual instantiation) and the example sensor interface 202 of
The example change detector 304 determines whether the handheld computing device 108 has experienced one or more spatial condition changes from the orientation(s) and/or relative position determined at blocks 404 and 406, respectively (block 408). When the example change detector 304 detects a sufficient change (e.g., a change greater than a threshold such as one percent), the change detector 304 instructs the orientation detector 300 and the position detector 302 to determine the new spatial conditions (e.g., orientation(s) and/or relative position) of the handheld computing device 108. In response, the orientation detector 300 uses data from the sensor interface 202 to determine the orientation(s) of the handheld computing device 108 (block 410). Further, the position detector 302 uses data from the sensor interface 202 to determine the relative position of the handheld computing device 108 (block 412). The example change detector 304 records the orientation(s) and the relative position determined at blocks 404 and 406, respectively, as the starting spatial conditions for the detected change (block 414). Further, the example change detector 304 records the orientation(s) and the relative position determined at blocks 410 and 412, respectively, as the ending spatial conditions for the detected change (block 414).
The example comparator 308 uses the starting and ending spatial conditions associated with the detected change to query the example engagement/disengagement index 306 to determine whether the starting and ending spatial conditions match any of the spatial condition changes stored in the index 306 (block 416). If a single match is found in the index (block 418), the comparator 308 outputs the likelihood of the index 306 corresponding to the match as a measure of attentiveness (e.g., engagement or disengagement) of a user of the handheld computing device 108 (block 420). Control then returns to block 402. Otherwise, if more than one match is found in the index 306 (block 422), the aggregator 310 aggregates (e.g., averages) the likelihoods of the index 306 corresponding to the matches and outputs the aggregated likelihood as a measure of attentiveness (e.g., engagement or disengagement) of a user of the handheld computing device 108 (block 424). Control returns to block 402.
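Putting the blocks described above together, the overall loop can be sketched as follows. This is an illustrative Python rendering only; the helper names, the angle-based representation, and the one-percent-of-full-rotation threshold are assumptions, not the patented instructions.

```python
# Hypothetical end-to-end sketch of blocks 402-424: sample spatial conditions while media
# is presented, detect a sufficient change, query the likelihood index for the recorded
# starting/ending conditions, and output a single or aggregated (averaged) likelihood.
def measure_attentiveness(samples, index, threshold_deg=3.6):
    previous = None
    outputs = []
    for orientation, position in samples:                       # blocks 404/406: current conditions
        if previous is not None:
            prev_orientation, prev_position = previous
            changed = (abs(orientation - prev_orientation) > threshold_deg
                       or position != prev_position)             # block 408: sufficient change?
            if changed:
                start, end = previous, (orientation, position)   # block 414: start/end conditions
                matches = [v for k, v in index.items() if k == (start, end)]  # block 416: query index
                if len(matches) == 1:                            # blocks 418/420: single match
                    outputs.append(matches[0])
                elif matches:                                    # blocks 422/424: aggregate matches
                    outputs.append(sum(matches) / len(matches))
        previous = (orientation, position)
    return outputs

index = {((0.0, "near"), (45.0, "near")): 0.8}
print(measure_attentiveness([(0.0, "near"), (45.0, "near")], index))  # [0.8]
```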
The processor platform 500 of the instant example includes a processor 512. For example, the processor 512 can be implemented by one or more microprocessors or controllers from any desired family or manufacturer.
The processor 512 is in communication with a main memory including a volatile memory 514 and a non-volatile memory 516 via a bus 518. The volatile memory 514 may be implemented by Synchronous Dynamic Random Access Memory (SDRAM), Dynamic Random Access Memory (DRAM), RAMBUS Dynamic Random Access Memory (RDRAM) and/or any other type of random access memory device. The non-volatile memory 516 may be implemented by flash memory and/or any other desired type of memory device. Access to the main memory 514, 516 is controlled by a memory controller.
The processor platform 500 also includes an interface circuit 520. The interface circuit 520 may be implemented by any type of interface standard, such as an Ethernet interface, a universal serial bus (USB), and/or a PCI express interface.
One or more input devices 522 can be connected to the interface circuit 520. The input device(s) 522 permit a user to enter data and commands into the processor 512. The input device(s) can be implemented by, for example, a keyboard, a mouse, a touchscreen, a track-pad, a trackball, isopoint and/or a voice recognition system.
One or more output devices 524 can be connected to the interface circuit 520. The output devices 524 can be implemented, for example, by display devices (e.g., a liquid crystal display, a touchscreen, and/or speakers). The interface circuit 520, thus, typically includes a graphics driver card.
The interface circuit 520 also includes a communication device such as an antenna, a modem or network interface card to facilitate exchange of data with external computers via a network 526 (e.g., a WiFi network, an Ethernet connection, a digital subscriber line (DSL), a telephone line, coaxial cable, a cellular system, etc.).
The processor platform 500 also includes one or more mass storage devices 528, such as a hard drive for storing software and data. The mass storage device 528 may implement the memory 208 of
The coded instructions 532 of
Although certain example apparatus, methods, and articles of manufacture have been disclosed herein, the scope of coverage of this patent is not limited thereto. On the contrary, this patent covers all apparatus, methods, and articles of manufacture fairly falling within the scope of the claims of this patent.
This patent arises from a continuation of U.S. patent application Ser. No. 16/741,399, filed Jan. 13, 2020, now U.S. Pat. No. 10,986,405, which is a continuation of U.S. patent application Ser. No. 16/119,509, filed Aug. 31, 2018, now U.S. Pat. No. 10,536,747, which is a continuation of U.S. patent application Ser. No. 15/265,352, filed Sep. 14, 2016, now U.S. Pat. No. 10,080,053, which is a continuation of U.S. patent application Ser. No. 14/495,323, filed Sep. 24, 2014, now U.S. Pat. No. 9,485,534, which is a continuation of U.S. patent application Ser. No. 13/893,027, filed May 13, 2013, now U.S. Pat. No. 8,869,183, which is a continuation of U.S. patent application Ser. No. 13/447,862, filed Apr. 16, 2012, now U.S. Pat. No. 8,473,975. Priority to U.S. patent application Ser. No. 16/741,399, U.S. patent application Ser. No. 16/119,509, U.S. patent application Ser. No. 15/265,352, U.S. patent application Ser. No. 14/495,323, U.S. patent application Ser. No. 13/893,027, and U.S. patent application Ser. No. 13/447,862 is claimed. U.S. patent application Ser. No. 16/741,399, U.S. patent application Ser. No. 16/119,509, U.S. patent application Ser. No. 15/265,352, U.S. patent application Ser. No. 14/495,323, U.S. patent application Ser. No. 13/893,027, and U.S. patent application Ser. No. 13/447,862 are hereby incorporated herein by reference in their entireties.