This disclosure relates generally to metering advertisements, and, more particularly, to metering advertisements streamed in a media player.
Media players on electronic devices (e.g., smartphones, tablet computers, computers, etc.) enable access to a wide range of media. The media can be streamed from the Internet via a browser or an application dedicated to streaming or playing media.
Many media streaming websites or applications stream advertisements along with content selected for presentation by a viewer or machine (e.g., a web crawler). For example, if a viewer chooses to view a video on YouTube™, an advertisement may be streamed in a media player application of YouTube™ before the chosen video is presented.
Example methods and apparatus disclosed herein determine a type of media presented in a media player (or media player application 154) based on an ability to manipulate (e.g., select, activate, deactivate, adjust, etc.) a control of the media player. An example method includes determining if a media application control is selectable when media is presented by the media player application 154; and determining whether the media presented includes a specified type of media in response to determining if the media application control is selectable.
An example apparatus includes a control state analyzer to determine if a control of a media player is selectable when the media player is presenting media, and a media type analyzer to determine that the media is a specified type of media in response to determining that the control is selectable or not selectable.
In some examples, an image analyzer is used to process an image of a media player application 154 to identify if a control is selectable based on the appearance of a corresponding control indicator. In some examples, a control data monitor is used to monitor a stream of data to the media player that includes control data indicating whether a control is enabled or disabled.
Example methods and apparatus disclosed herein may identify whether an advertisement or media including an advertisement is presented by a media player or media player application 154. Considering the ever-increasing amount of media that is accessible to potential audience members via the Internet, "on-demand" applications, or other similar technologies, there is a great opportunity for advertising. Accordingly, much, but not all, of the media that can be downloaded, streamed, or viewed includes an advertisement. Determination of the presence of an advertisement in downloaded or streamed media may be beneficial to entities, such as audience measurement entities (e.g., The Nielsen Company). Knowing which pieces of media include an advertisement and which pieces of media do not may enable such entities to process (e.g., determine an advertisement source or creator for) fewer videos by only processing the videos determined to include an advertisement.
The example media device 102 includes an example network interface 104, an example data storage device 106, an example device controller 108, an example media type identifier 110, an example data input/output 112, an example audio/video output 114 (e.g., a speaker, a display (e.g., a liquid crystal display (LCD), a light-emitting diode (LED) display, etc.), etc.), an example user interface 116, and an example media presenter 150. An example communication bus 160 facilitates communication between the network interface 104, the data storage device 106, the device controller 108, the data input/output 112, the audio/video output 114, the user interface 116, and/or the media presenter 150. The example media presenter 150 includes an example media controller 152 and an example media player application 154.
The example media type identifier 110 identifies a type of media presented by the media presenter 150.
For example, the media type may be an advertisement or content. As described herein, content is one or more of programming (e.g., a television program, radio program, web video, etc.), media for streaming (e.g., a YouTube™ video, etc.), etc. that a user (or a robot) expects to be presented by the media presenter 150 following a selection by the user (e.g., via a user interface) or a selection by a robot (e.g., via a web crawler). The media presenter 150 receives media data from the communication bus 160, and the media controller 152 instructs the media player application 154 to present the media according to the media data and control signals received from the media player application 154. In some examples, the media data corresponds to a media presentation that includes multiple types of media. For example, the media data received from a streamed video over the Internet may include both the content and an advertisement.
The example media player data may include, but is not limited to, media presentation data, media player control data (e.g., data indicating whether controls of the media player application 154 are to be selectable, enabled, or disabled while presenting the media, such that a control cannot be activated (e.g., turned on), cannot be deactivated (e.g., turned off), and/or cannot be adjusted), a uniform resource locator (URL) associated with the media, media source data (e.g., a source of origination such as YouTube®, Hulu®, a media application, etc.), etc. The example media type identifier 110 receives the media player data and image data generated by the media player application 154 to be displayed on the audio/video output 114. The media type identifier 110 determines the type of media (e.g., whether it is an advertisement or content) based on the media player data and the image data.
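The media player data described above can be sketched as a simple container. The field names below are assumptions for illustration only, not drawn from any actual player protocol:

```python
from dataclasses import dataclass, field

@dataclass
class MediaPlayerData:
    """Illustrative bundle of the media player data described above."""
    url: str                       # URL associated with the media
    source: str                    # source of origination, e.g. "YouTube"
    controls: dict = field(default_factory=dict)  # control name -> enabled?

def extract_control_data(player_data):
    """Return a copy of the control states carried in the media player data."""
    return dict(player_data.controls)
```

A media type identifier could then inspect the extracted control states without touching the rest of the player data.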
As described herein, the media type identifier 110 determines a state of one or more controls (e.g., whether the control(s) is/are selectable or not, able to be activated/deactivated, manipulated, etc.) of the media player application 154. Based on the state of the controls, the media type identifier 110 determines the type of media presented by the media player application 154. In some examples, the media type identifier 110 extracts control data from the media player data to identify whether a control is enabled or disabled.
In this example, a countdown timer 214 is also included, which displays the remaining play time of the media 202 when played at a predetermined speed. However, a count-up timer may also be used independently or in conjunction with a play time indicator (e.g., the countdown timer 214) to display the play time of the media 202 relative to the beginning and/or the end of the media 202. In some examples, the progress bar 210, the progress bar indicator 212, and/or the countdown timer 214 identify timing of content of the media 202. In the illustrated example, the media player application 154 display 200 includes a volume control 216 to control the output level of any audio content that may be part of the media 202. Furthermore, the example media player application 154 display 200 includes a closed captioning control 218 to control activation or deactivation of closed captioning in any content that may be part of the media 202.
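The timer arithmetic described above can be sketched in a few lines (the function names are assumptions for illustration):

```python
def remaining_time(duration_s, elapsed_s, speed=1.0):
    """Remaining wall-clock play time for a countdown display such as the
    countdown timer 214, when the media plays at a predetermined speed."""
    return max(0.0, (duration_s - elapsed_s) / speed)

def elapsed_display(elapsed_s):
    """Count-up display of play time relative to the beginning of the media."""
    minutes, seconds = divmod(int(elapsed_s), 60)
    return f"{minutes}:{seconds:02d}"
```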
In some examples disclosed herein, one or more of the media player controls 206-218 may be enabled or disabled based on a type of the media 202. When the controls are enabled, a user can control the corresponding control 206-218 (e.g., control the volume output, activate/deactivate closed captioning, etc.). When the controls 206-218 are disabled, the user cannot control the corresponding control 206-218 (e.g., cannot adjust volume, cannot activate fast forward, cannot activate closed captioning, etc.). In some examples, when the media 202 is an advertisement, one or more of the controls 206-218 may be disabled. For example, a user may not be able to activate or deactivate closed captioning using the closed captioning control 218 when the media 202 is an advertisement. In some examples, when one or more of the controls 206-218 are disabled, they appear “grayed out” (not shown) on the media player application 154 display 200.
In some examples, where the device 102 is implemented by a web crawler, the web crawler accesses web pages including one or more media player application 154 display(s) 200. The example web crawler may then process one or more image(s) of the web pages to identify the media player application 154 display(s) 200. The example web crawler may then process images of the identified media player application 154 display(s) 200 to control and/or manipulate the media player to begin playback. In some examples, the web crawler may attempt to select or manipulate (e.g., activate/deactivate) the controls 206-218 by "clicking" (or performing an operation to imitate "clicking") the corresponding buttons of the controls 206-218. A media type identifier 110 included in the example web crawler identifies the type of media (e.g., content, an advertisement, etc.) based on whether controls are enabled or disabled. The media type identifier 110 may make such a determination based on at least one of an appearance of the controls 206-218 (e.g., whether "grayed out" or not) and/or control data in the media player data.
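The crawler's probing of the controls might be sketched as follows. The control names and the click(name) interface are hypothetical stand-ins for illustration, not a real crawler API:

```python
# Hypothetical control names; a real player may expose different controls.
CONTROL_NAMES = ["play", "fast_forward", "volume", "closed_captioning"]

def probe_controls(click):
    """Attempt to manipulate each control via the supplied click callable.

    click(name) is assumed to return True when the control responds
    (enabled) and False when the "click" has no effect (disabled).
    """
    return {name: click(name) for name in CONTROL_NAMES}
```

The resulting state map can then be handed to a media type identifier to decide whether the presented media is an advertisement.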
The example media data monitor 302 monitors the media player data transmitted between the media controller 152 and the media player application 154. In some examples, the media data monitor 302 extracts control data from the media player data indicating a state (e.g., enabled or disabled, selectable, etc.) of one or more media player controls (e.g., fast forward, closed captioning, etc.) of the media player application 154. In some examples, the media data monitor 302 sends query messages (e.g., by attempting to select, activate, or enable a control of the media player application 154) to determine and/or retrieve control data from the media controller 152 and/or the media player application 154 (e.g., if control data is not sent to the media player after attempting to enable the control, it can be inferred that the control is disabled).
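The inference described parenthetically above (a query to enable a control that draws no control data in response suggests a disabled control) can be sketched as follows; the message tuples are assumed shapes for illustration:

```python
def infer_state_from_query(query_log, response_log, control):
    """Infer a control's state from monitored query/response traffic.

    If a query to enable `control` appears in the outgoing log but no
    matching control data appears in the incoming log, infer 'disabled'.
    """
    queried = any(msg == ("enable", control) for msg in query_log)
    answered = any(msg == ("enabled", control) for msg in response_log)
    if answered:
        return "enabled"
    if queried:
        return "disabled"
    return "unknown"
```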
The example media presentation monitor 304 of
The example control state analyzer 310 uses the received information to determine the state (e.g., whether enabled or disabled, whether selectable, etc.) of a control of the media player application 154. The example analyzer controller 312 receives image data from the media presentation monitor 304 and forwards it to the image analyzer 316. Additionally or alternatively, the example analyzer controller 312 extracts control data (e.g., data indicating a status of a control of the media player application 154) from the media player data received from the media data monitor 302 and forwards it to the control data analyzer 318. In some examples, the analyzer controller 312 only provides the monitored data when the buffer analyzer 314 detects that the media player application 154 is not buffering. For example, the buffer analyzer 314 may perform an image analysis of the media player application 154 to determine whether the media player application 154 is buffering. The example buffer analyzer 314 may prevent the control state analyzer 310 from falsely identifying a state of a control due to buffering media. For example, some controls of the media player application 154 may be disabled (i.e., may not be selectable, activated, deactivated, etc.) while the media player application 154 is buffering.
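The buffering gate performed by the buffer analyzer 314 can be sketched as a simple guard (the callable names are assumptions for illustration):

```python
def analyze_if_not_buffering(is_buffering, classify, control_state):
    """Suppress control-state classification while the player buffers.

    Controls may be disabled merely because the player is buffering, so
    classifying in that window would risk a false media type; return
    None to defer judgment until playback resumes.
    """
    if is_buffering:
        return None
    return classify(control_state)
```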
The example image analyzer 316 uses image processing techniques to analyze images of the media player application 154 (e.g., images of the media player application 154 display 200) to identify the states of the control indicators of the media player application 154.
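One plausible image-processing heuristic for detecting a "grayed out" control indicator is low color saturation. The following is a minimal sketch, assuming pixels are sampled from the indicator's bounding box as (r, g, b) tuples; the threshold values are assumptions, not from the disclosure:

```python
def is_grayed_out(pixels, saturation_threshold=10):
    """Heuristic: a 'grayed out' control renders with pixels whose R, G,
    and B channels are nearly equal (low saturation). Returns True when
    most sampled pixels are near-gray."""
    gray = sum(1 for (r, g, b) in pixels
               if max(r, g, b) - min(r, g, b) <= saturation_threshold)
    return gray / len(pixels) > 0.9
```

A real implementation would also have to tolerate icon outlines and anti-aliasing, which this sketch ignores.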
The example control data analyzer 318 analyzes control data embedded within the media player data streamed between the media controller 152 and the media player application 154. For example, the control data may be a message indicating whether a control (e.g., fast forward or closed captioning) is selectable or whether the control can be activated, deactivated, adjusted, etc. As another example, the control data may be a bit indicator in the media player data designated to indicate whether a control is selectable. In some examples, the control data analyzer 318 analyzes control messages transmitted to/from the media player application 154 and forwards the communication to the control state identifier 320. For example, the control data analyzer 318 may determine that a user attempted to enable an ability to activate/deactivate closed captioning by identifying a closed captioning request message being sent to the media controller 152 to enable closed captioning on the media player application 154, but not identifying control data in the media player data that enables the closed captioning. In this example, the control state identifier 320 may infer that closed captioning is disabled.
The example control state identifier 320 identifies the state (e.g., enabled or disabled) of a control based on the data provided by the image analyzer 316 and/or the control data analyzer 318. For example, if data from the image analyzer 316 indicates that a control button corresponding to a control is "grayed out" (i.e., the button appears gray to a user, indicating that it cannot be selected), the control state identifier 320 determines that the control is disabled. In some examples, the control data analyzer 318 provides the value of a bit indicator corresponding to a control and/or control data corresponding to a control, and the control state identifier 320 determines the control state based on the control data value. For example, a value of 0 for a control bit indicator indicates that the control is disabled, and a value of 1 indicates that the control is enabled. In some examples, the control state identifier 320 may receive control message data. For example, the control state identifier 320 may receive control messages or requests transmitted between the media controller 152 and the media player application 154 indicating whether the control is to be enabled or disabled for corresponding media (e.g., disabled for an advertisement and enabled for content).
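The bit-indicator convention described above (0 = disabled, 1 = enabled) can be sketched as follows; the bit positions assigned to each control are assumptions for illustration, not from any real media player data format:

```python
# Assumed layout: one bit per control within a control byte of the
# media player data; positions are illustrative only.
CONTROL_BITS = {"play_pause": 0, "fast_forward": 1,
                "volume": 2, "closed_captioning": 3}

def control_state(control_byte, control):
    """Return 'enabled' if the control's bit is 1, 'disabled' if it is 0."""
    bit = (control_byte >> CONTROL_BITS[control]) & 1
    return "enabled" if bit else "disabled"
```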
The example control state analyzer 310 provides data indicating the control state to the media type analyzer 330. For example, the control state identifier 320 provides data indicating whether a corresponding control is selectable. Based on the received state of the control, the example media type analyzer 330 determines the type of media being presented by the media player application 154 and/or a type of media included in the media being presented by the media player application 154. For example, if the control state identifier 320 indicates that the control is disabled, the media type analyzer 330 may determine that the media is an advertisement. Alternatively, in the above example, if the control state identifier 320 indicates that the control is enabled, the media type analyzer 330 may determine that the media is the content. In some examples, the media type analyzer 330 forwards data corresponding to the identified media type to the data input/output 112.
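The mapping from control state to media type might be sketched as below, assuming the predefined correspondence described above (a disabled control indicates an advertisement, all controls enabled indicates content):

```python
def identify_media_type(control_states):
    """Classify media from a map of control name -> 'enabled'/'disabled'.

    Any disabled control suggests an advertisement; otherwise the media
    is treated as content. The correspondence is assumed predefined, as
    described above.
    """
    if any(state == "disabled" for state in control_states.values()):
        return "advertisement"
    return "content"
```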
While an example manner of implementing the media type identifier 110 is illustrated, one or more of the elements, processes, and/or devices may be combined, divided, re-arranged, omitted, and/or implemented in any other way.
Flowcharts representative of example machine readable instructions for implementing the media type identifier 110 are described below.
As mentioned above, the example processes may be implemented using coded instructions (e.g., computer readable instructions) stored on a tangible computer readable storage medium.
The program 400 begins at block 402, at which the media type identifier 110 monitors the media presented by the media player application 154.
At block 404, the control state analyzer 310 determines the state of a control of the media player application 154. In some examples, the control state analyzer 310 determines whether the control is enabled or disabled. For example, the control state analyzer 310 may determine that a closed captioning control of the media player has been disabled. In some examples, the control state analyzer 310 determines the state of a control of the media player using the image analyzer 316. In some such examples, the image analyzer 316 uses image processing techniques to identify control indicators (e.g., the indicators identifying the media player controls 206-218) and to determine whether the indicators show the corresponding controls as selectable (e.g., whether the indicators appear "grayed out").
In some examples, at block 404, the control state analyzer 310 determines the state of a control of the media player using the control data analyzer 318 in addition to or as an alternative to the image analyzer 316. In some such examples, the control data analyzer 318 identifies control data in a stream of data transmitted between the media controller 152 and the media player application 154. For example, the control data analyzer 318 may identify messages and/or requests to enable or disable a control (e.g., one of the controls 206-218). Based on the identified control data, such as a value of a bit indicator for the corresponding control or a value of a control message, the control state identifier 320 determines the state of the control, such as whether the control is enabled or disabled.
In some examples, the state of the control may be determined as described in more detail below.
After identifying the state of the control (block 404), at block 406, the media type analyzer 330 determines a type of media presented by the media player based on the state of the control. In some examples, a state of the control corresponds to the type of media being displayed. The correspondence between the state of the control and the media type may be predefined and/or programmed into the media type analyzer 330. The type of media may be determined as described in more detail below.
Following identification of the type of media (block 406), at block 408, the analyzer controller 312 determines whether to continue monitoring the media player application 154. If the analyzer controller 312 is no longer to monitor the media player to identify a type of media presented (e.g., because of a system failure, system shutdown, etc.), then the program 400 ends. If the analyzer controller 312 is to continue monitoring the media player, control returns to block 402.
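The flow of blocks 402-408 can be sketched as a loop. The callables are assumed stand-ins for the components described above, and block 402 is assumed to gather the monitored data:

```python
def run_program_400(monitor, determine_state, determine_type, keep_going):
    """Sketch of the monitoring loop: block 402 (assumed) gathers the
    monitored data, block 404 determines the control state, block 406
    determines the media type, and block 408 decides whether to continue."""
    results = []
    while True:
        data = monitor()                        # block 402 (assumed)
        state = determine_state(data)           # block 404
        results.append(determine_type(state))   # block 406
        if not keep_going():                    # block 408
            return results
```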
Following identification of a control indicator (block 506), at block 508 the control data analyzer 318 monitors the media player data for control data corresponding to the identified control. For example, if the image analyzer 316 identified the closed captioning control button, the control data analyzer 318 may monitor the media player data for control data indicating whether closed captioning is enabled or disabled.
At block 602, the media type analyzer 330 determines whether the control is enabled or disabled. If the control is disabled, at block 604 the media type analyzer 330 determines that the media is an advertisement. If the control is not disabled (i.e., it is enabled), the media type analyzer 330 determines that the media is not an advertisement. In some such examples, the media is the content.
The processor platform 700 of the illustrated example includes a processor 712. The processor 712 of the illustrated example is hardware. For example, the processor 712 can be implemented by one or more integrated circuits, logic circuits, microprocessors or controllers from any desired family or manufacturer.
The processor 712 of the illustrated example includes a local memory 713 (e.g., a cache). The processor 712 of the illustrated example is in communication with a main memory including a volatile memory 714 and a non-volatile memory 716 via a bus 718. The volatile memory 714 may be implemented by Synchronous Dynamic Random Access Memory (SDRAM), Dynamic Random Access Memory (DRAM), RAMBUS Dynamic Random Access Memory (RDRAM) and/or any other type of random access memory device. The non-volatile memory 716 may be implemented by flash memory and/or any other desired type of memory device. Access to the main memory 714, 716 is controlled by a memory controller.
The processor platform 700 of the illustrated example also includes an interface circuit 720. The interface circuit 720 may be implemented by any type of interface standard, such as an Ethernet interface, a universal serial bus (USB), and/or a PCI express interface.
In the illustrated example, one or more input devices 722 are connected to the interface circuit 720. The input device(s) 722 permit(s) a user to enter data and commands into the processor 712. The input device(s) can be implemented by, for example, an audio sensor, a microphone, a camera (still or video), a keyboard, a button, a mouse, a touchscreen, a track-pad, a trackball, isopoint, and/or a voice recognition system. The input device(s) 722 may be used to implement the user interface 116.
One or more output devices 724 are also connected to the interface circuit 720 of the illustrated example. The output devices 724 can be implemented, for example, by display devices (e.g., a light emitting diode (LED), an organic light emitting diode (OLED), a liquid crystal display, a cathode ray tube (CRT) display, a touchscreen, etc.), a tactile output device, and/or speakers. The output devices 724 may be used to implement the example audio/video output 114.
The interface circuit 720 of the illustrated example also includes a communication device such as a transmitter, a receiver, a transceiver, a modem and/or network interface card to facilitate exchange of data with external machines (e.g., computing devices of any kind) via a network 726 (e.g., an Ethernet connection, a digital subscriber line (DSL), a telephone line, coaxial cable, a cellular telephone system, etc.).
The processor platform 700 of the illustrated example also includes one or more mass storage devices 728 for storing software and/or data. Examples of such mass storage devices 728 include floppy disk drives, hard drive disks, compact disk drives, Blu-ray disk drives, RAID systems, and digital versatile disk (DVD) drives.
The coded instructions 732 may be stored in the mass storage device 728, in the volatile memory 714, in the non-volatile memory 716, and/or on a removable tangible computer readable storage medium such as a CD or DVD.
From the foregoing, it will be appreciated that the above disclosed methods, apparatus, and articles of manufacture facilitate identification of a type of media (e.g., an advertisement) presented by a media player based on a state of a control of the media player application.
Although certain example methods, apparatus and articles of manufacture have been disclosed herein, the scope of coverage of this patent is not limited thereto. On the contrary, this patent covers all methods, apparatus and articles of manufacture fairly falling within the scope of the claims of this patent.
This patent arises from a continuation of U.S. patent application Ser. No. 16/175,476, entitled “METHODS AND APPARATUS TO IDENTIFY A TYPE OF MEDIA PRESENTED BY A MEDIA PLAYER,” filed on Oct. 30, 2018, which is a continuation of U.S. patent application Ser. No. 13/840,807, entitled “METHODS AND APPARATUS TO IDENTIFY A TYPE OF MEDIA PRESENTED BY A MEDIA PLAYER,” filed on Mar. 15, 2013. Priority to U.S. patent application Ser. No. 16/175,476 and U.S. patent application Ser. No. 13/840,807 is claimed. U.S. patent application Ser. No. 16/175,476 and U.S. patent application Ser. No. 13/840,807 are herein incorporated by reference in their respective entireties.
Number | Name | Date | Kind |
---|---|---|---|
5151788 | Blum | Sep 1992 | A |
5754255 | Takamori | May 1998 | A |
5987171 | Wang | Nov 1999 | A |
5999688 | Iggulden et al. | Dec 1999 | A |
5999689 | Iggulden | Dec 1999 | A |
6014458 | Wang | Jan 2000 | A |
6046740 | LaRoche et al. | Apr 2000 | A |
6311194 | Sheth et al. | Oct 2001 | B1 |
6353929 | Houston | Mar 2002 | B1 |
6362894 | Shima | Mar 2002 | B1 |
6430583 | Taguchi | Aug 2002 | B1 |
6460023 | Bean et al. | Oct 2002 | B1 |
6519648 | Eyal | Feb 2003 | B1 |
6535880 | Musgrove et al. | Mar 2003 | B1 |
6643641 | Snyder | Nov 2003 | B1 |
6714933 | Musgrove et al. | Mar 2004 | B2 |
6721741 | Eyal et al. | Apr 2004 | B1 |
6725222 | Musgrove et al. | Apr 2004 | B1 |
6725275 | Eyal | Apr 2004 | B2 |
6970602 | Smith et al. | Nov 2005 | B1 |
7013310 | Messing et al. | Mar 2006 | B2 |
7082426 | Musgrove et al. | Jul 2006 | B2 |
7149982 | Duperrouzel et al. | Dec 2006 | B1 |
7162696 | Wakefield | Jan 2007 | B2 |
7200801 | Agassi et al. | Apr 2007 | B2 |
7231381 | Li et al. | Jun 2007 | B2 |
7251790 | Drucker et al. | Jul 2007 | B1 |
7269330 | Iggulden | Sep 2007 | B1 |
7272785 | Fukuda et al. | Sep 2007 | B2 |
7281034 | Eyal | Oct 2007 | B1 |
7451391 | Coleman et al. | Nov 2008 | B1 |
7584194 | Tuttle et al. | Sep 2009 | B2 |
7685273 | Anastas et al. | Mar 2010 | B1 |
7809154 | Lienhart et al. | Oct 2010 | B2 |
7954120 | Roberts et al. | May 2011 | B2 |
8019162 | Zhang et al. | Sep 2011 | B2 |
8196164 | Oztaskent et al. | Jun 2012 | B1 |
8290351 | Plotnick et al. | Oct 2012 | B2 |
8572505 | Lee et al. | Oct 2013 | B2 |
8650587 | Bhatia et al. | Feb 2014 | B2 |
9639531 | Deliyannis | May 2017 | B2 |
20020023271 | Augenbraun et al. | Feb 2002 | A1 |
20020056089 | Houston | May 2002 | A1 |
20020063727 | Markel | May 2002 | A1 |
20020080165 | Wakefield | Jun 2002 | A1 |
20020091764 | Yale | Jul 2002 | A1 |
20020114002 | Mitsubori et al. | Aug 2002 | A1 |
20030004272 | Power | Jan 2003 | A1 |
20030066070 | Houston | Apr 2003 | A1 |
20030237027 | Cook | Dec 2003 | A1 |
20040003102 | DuVall et al. | Jan 2004 | A1 |
20040021686 | Barberis | Feb 2004 | A1 |
20040145778 | Aoki et al. | Jul 2004 | A1 |
20040177096 | Eyal et al. | Sep 2004 | A1 |
20040189720 | Wilson et al. | Sep 2004 | A1 |
20040221311 | Dow et al. | Nov 2004 | A1 |
20040254956 | Volk | Dec 2004 | A1 |
20040254958 | Volk | Dec 2004 | A1 |
20040267812 | Harris et al. | Dec 2004 | A1 |
20050025348 | Tecu | Feb 2005 | A1 |
20050041858 | Celi, Jr. et al. | Feb 2005 | A1 |
20050231648 | Kitamura et al. | Oct 2005 | A1 |
20050262438 | Armstrong et al. | Nov 2005 | A1 |
20060015571 | Fukuda et al. | Jan 2006 | A1 |
20060026162 | Salmonsen et al. | Feb 2006 | A1 |
20060041589 | Helfman et al. | Feb 2006 | A1 |
20060120590 | Han et al. | Jun 2006 | A1 |
20060120692 | Fukuta | Jun 2006 | A1 |
20060230011 | Tuttle et al. | Oct 2006 | A1 |
20060242192 | Musgrove et al. | Oct 2006 | A1 |
20060259938 | Kinoshita et al. | Nov 2006 | A1 |
20060271977 | Lerman et al. | Nov 2006 | A1 |
20060282494 | Sima et al. | Dec 2006 | A1 |
20070047766 | Rhoads | Mar 2007 | A1 |
20070073758 | Perry et al. | Mar 2007 | A1 |
20070124110 | Tung | May 2007 | A1 |
20070130525 | Murphy et al. | Jun 2007 | A1 |
20070150612 | Chaney et al. | Jun 2007 | A1 |
20070168543 | Krikorian et al. | Jul 2007 | A1 |
20070172155 | Guckenberger | Jul 2007 | A1 |
20070237426 | Xie et al. | Oct 2007 | A1 |
20070239839 | Buday et al. | Oct 2007 | A1 |
20070271300 | Ramaswamy | Nov 2007 | A1 |
20070277088 | Bodin et al. | Nov 2007 | A1 |
20070294252 | Fetterly et al. | Dec 2007 | A1 |
20080034306 | Ording | Feb 2008 | A1 |
20080046562 | Butler | Feb 2008 | A1 |
20080046738 | Galloway et al. | Feb 2008 | A1 |
20080082426 | Gokturk et al. | Apr 2008 | A1 |
20080089666 | Aman | Apr 2008 | A1 |
20080109724 | Gallmeier et al. | May 2008 | A1 |
20080120420 | Sima et al. | May 2008 | A1 |
20080140712 | Weber et al. | Jun 2008 | A1 |
20080141162 | Bockus | Jun 2008 | A1 |
20080158361 | Itoh et al. | Jul 2008 | A1 |
20080222273 | Lakshmanan et al. | Sep 2008 | A1 |
20080229240 | Garbow et al. | Sep 2008 | A1 |
20080229427 | Ramirez | Sep 2008 | A1 |
20080294981 | Balzano et al. | Nov 2008 | A1 |
20080313177 | Li et al. | Dec 2008 | A1 |
20080319844 | Hua et al. | Dec 2008 | A1 |
20090047000 | Walikis et al. | Feb 2009 | A1 |
20090109337 | Imai et al. | Apr 2009 | A1 |
20090172723 | Shkedi et al. | Jul 2009 | A1 |
20090222754 | Phillips et al. | Sep 2009 | A1 |
20090248672 | Mcintire et al. | Oct 2009 | A1 |
20090254553 | Weiskopf et al. | Oct 2009 | A1 |
20090259926 | Deliyannis | Oct 2009 | A1 |
20090268261 | Banton et al. | Oct 2009 | A1 |
20090291665 | Gaskarth et al. | Nov 2009 | A1 |
20100023660 | Liu | Jan 2010 | A1 |
20100080411 | Deliyannis | Apr 2010 | A1 |
20100162301 | Minnick | Jun 2010 | A1 |
20100174983 | Levy et al. | Jul 2010 | A1 |
20110122939 | Ganesan et al. | May 2011 | A1 |
20110283311 | Luong | Nov 2011 | A1 |
20120047010 | Dowling et al. | Feb 2012 | A1 |
20120047234 | Terayoko | Feb 2012 | A1 |
20120109743 | Balakrishnan et al. | May 2012 | A1 |
20120304223 | Sargent et al. | Nov 2012 | A1 |
20130090097 | Klassen et al. | Apr 2013 | A1 |
20130097702 | Alhamed et al. | Apr 2013 | A1 |
20130173402 | Young et al. | Jul 2013 | A1 |
20140155022 | Kandregula | Jun 2014 | A1 |
20140281980 | Hage | Sep 2014 | A1 |
20150039637 | Neuhauser et al. | Feb 2015 | A1 |
20150156332 | Kandregula | Jun 2015 | A1 |
20190066148 | Hage | Feb 2019 | A1 |
Number | Date | Country |
---|---|---|
1120732 | Aug 2001 | EP |
2004157907 | Jun 2004 | JP |
2008171039 | Jul 2008 | JP |
9827497 | Jun 1998 | WO |
2002047467 | Jun 2002 | WO |
2005043288 | May 2005 | WO |
2005086081 | Sep 2005 | WO |
2006058075 | Jun 2006 | WO |
2007018102 | Feb 2007 | WO |
2007041647 | Apr 2007 | WO |
2008021459 | Feb 2008 | WO |
Entry |
---|
Buscher et al., “Generating and Using Gaze-Based Document Annotations,” CHI 2008 Proceedings—Works in Progress, Apr. 5-10, 2008, Florence, Italy, pp. 3045-3050, 6 pages. |
Fu et al., “Detecting Phishing Web Pages with Visual Similarity Assessment Based on Earth Mover's Distance (EMD),” IEEE Transactions on Dependable and Secure Computing, vol. 3, No. 4, Oct.-Dec. 2006, 11 pages. |
Australian Government, “Examiner's First Report”, issued in connection with Patent Application No. 2009222570 dated May 3, 2010, 1 page. |
Australian Government, “Examiner's Report”, issued in connection with Patent Application No. 2009222570 dated Jun. 6, 2011, 1 page. |
Australian Government “Examiner's First Report”, issued in connection with Patent Application No. 2008354332 dated Jul. 26, 2011, 2 pages. |
IP Australia, “Patent Examination Report No. 1”, issued in connection with Patent Application No. 2011239269 dated Feb. 6, 2013, 3 pages. |
Australian Government, “Notice of Acceptance”, issued in connection with Patent Application No. 2008354332, dated Apr. 5, 2013, 2 pages. |
Canadian Intellectual Property Office, “Requisition by the Examiner”, issued in connection with Patent Application No. 2,680,955 dated Oct. 5, 2011, 3 pages. |
Canadian Intellectual Property Office, “Office Action”, issued in connection with Patent Application No. 2,680,955 dated Sep. 27, 2012, 4 pages. |
Chinese State Intellectual Property Office, “First Office Action”, issued in connection with Patent Application No. 2009102214165 dated Nov. 30, 2011, 5 pages. |
Chinese State Intellectual Property Office, “Second Office Action”, issued in connection with Patent Application No. 2009102214165 dated Aug. 23, 2012, 17 pages. |
Chinese State Intellectual Property Office, “Third Office Action”, issued in connection with Patent Application No. 2009102214165 dated May 17, 2013, 9 pages. |
European Patent Office, Extended European Search Report issued in connection with European Patent Application No. 09012337.3 dated Jan. 7, 2010, 8 pages. |
International Searching Authority, “International Search Report”, issued in connection with corresponding to International Patent Application No. PCT/US2008/059783, dated Aug. 13, 2008, 6 pages. |
International Searching Authority, “Written Opinion of the International Searching Authority”, issued in connection with International Patent Application No. PCT/US2008/059783 dated Aug. 13, 2008, 6 pages. |
Patent Cooperation Treaty, “International Preliminary Report on Patentability”, issued in connection with international application No. PCT/US2008/059783 dated Oct. 12, 2010, 6 pages. |
Japanese Patent Office, “Office Action with redacted summary in English”, issued in connection with Application No. 2011-503955 dated May 14, 2012, 5 pages. |
Japanese Patent Office, “Second Office Action with redacted summary in English”, issued in connection with Application No. 2011-503955 dated Mar. 19, 2013, 4 pages. |
United States Patent and Trademark Office. “Final Office Action”, issued in connection with U.S. Appl. No. 12/240,756 dated Jun. 24, 2011, 19 pages. |
United States Patent and Trademark Office, “Advisory Action”, issued in connection with U.S. Appl. No. 12/240,756 dated Nov. 1, 2011, 3 pages. |
Canadian Intellectual Property Office, “Examiner's Report”, issued in connection with Canadian Patent Application No. 2,680,955, dated Nov. 13, 2013, 3 pages.
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 12/100,264, dated Feb. 7, 2011, 10 pages.
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 12/100,264, dated Aug. 4, 2011, 18 pages.
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 12/100,264, dated Feb. 14, 2014, 19 pages.
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 12/240,756, dated Feb. 22, 2011, 9 pages.
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 12/240,756, dated Apr. 1, 2014, 20 pages.
Japanese Patent Office, “Decision of Rejection”, issued in connection with Japanese Patent Application No. 2011-503955, dated Aug. 13, 2013, 2 pages.
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 12/100,264, dated Oct. 9, 2014, 20 pages.
IP Australia, “Notice of Acceptance”, issued in connection with Australian Patent Application No. 2011239269, dated Oct. 15, 2014, 2 pages.
IP Australia, “Patent Examination Report No. 1”, issued in connection with Australian Patent Application No. 2013203736, dated Oct. 20, 2014, 2 pages.
Canadian Intellectual Property Office, “Examiner's Report”, issued in connection with Canadian Patent Application No. 2,680,955, dated Dec. 15, 2014, 6 pages.
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 12/240,756, dated Dec. 17, 2014, 22 pages.
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 13/706,244, dated Mar. 25, 2014, 14 pages.
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 13/706,244, dated Sep. 12, 2014, 23 pages.
United States Patent and Trademark Office, “Advisory Action”, issued in connection with U.S. Appl. No. 13/706,244, dated Feb. 17, 2015, 3 pages.
IP Australia, “Notice of Grant”, issued in connection with Australian Patent Application No. 2011239269, dated Feb. 12, 2015, 2 pages.
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 14/621,010, dated Mar. 17, 2015, 17 pages.
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 13/955,163, dated Aug. 13, 2015, 12 pages.
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 12/100,264, dated Dec. 17, 2015, 13 pages.
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 13/955,163, dated Mar. 1, 2016, 11 pages.
IP Australia, “Notice of Acceptance”, issued in connection with Australian Patent Application No. 2013203736, dated Nov. 13, 2015, 2 pages.
IP Australia, “Notice of Grant”, issued in connection with Australian Patent Application No. 2008354332, dated Aug. 1, 2013, 2 pages.
IP Australia, “Notice of Grant”, issued in connection with Australian Patent Application No. 2013203736, dated Mar. 23, 2016, 1 page.
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 13/955,163, dated Jun. 27, 2016, 14 pages.
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 12/100,264, dated Jul. 28, 2016, 19 pages.
United States Patent and Trademark Office, “Advisory Action”, issued in connection with U.S. Appl. No. 12/100,264, dated Oct. 13, 2016, 3 pages.
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 13/955,163, dated Dec. 23, 2016, 17 pages.
United States Patent and Trademark Office, “Notice of Allowance”, issued in connection with U.S. Appl. No. 12/100,264, dated Dec. 22, 2016, 12 pages.
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 13/840,807, dated Feb. 26, 2015, 12 pages.
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 13/840,807, dated Jul. 8, 2015, 16 pages.
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 13/840,807, dated Mar. 24, 2016, 13 pages.
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 13/840,807, dated Aug. 5, 2016, 13 pages.
United States Patent and Trademark Office, “Notice of Panel Decision from Pre-Appeal Brief Review”, issued in connection with U.S. Appl. No. 13/840,807, dated Jan. 19, 2017, 2 pages.
United States Patent and Trademark Office, “Examiner's Answer to Appeal Brief”, issued in connection with U.S. Appl. No. 13/840,807, dated May 2, 2017, 13 pages.
United States Patent and Trademark Office, “Examiner's Second Answer to Appeal Brief”, issued in connection with U.S. Appl. No. 13/840,807, dated Oct. 12, 2017, 13 pages.
United States Patent and Trademark Office, “Decision on Appeal”, issued in connection with U.S. Appl. No. 13/840,807, dated Aug. 30, 2018, 6 pages.
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 15/482,317, dated Jul. 27, 2018, 16 pages.
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 15/482,317, dated Feb. 25, 2019, 15 pages.
United States Patent and Trademark Office, “Office Action”, issued in connection with U.S. Appl. No. 15/482,317, dated Sep. 19, 2019, 14 pages.
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 15/482,317, dated Apr. 6, 2020, 14 pages.
United States Patent and Trademark Office, “Non-Final Office Action”, issued in connection with U.S. Appl. No. 16/175,476, dated Dec. 12, 2019, 15 pages.
United States Patent and Trademark Office, “Final Office Action”, issued in connection with U.S. Appl. No. 16/175,476, dated Jun. 5, 2020, 18 pages.
United States Patent and Trademark Office, “Notice of Allowance and Fee(s) Due”, issued in connection with U.S. Appl. No. 16/175,476, dated Oct. 28, 2020, 6 pages.
Prior Publication Data

Number | Date | Country
---|---|---
20210192561 A1 | Jun. 2021 | US
Related U.S. Application Data (continuations)

Relation | Number | Date | Country
---|---|---|---
Parent | 16175476 | Oct. 2018 | US
Child | 17195421 | | US
Parent | 13840807 | Mar. 2013 | US
Child | 16175476 | | US