This disclosure relates generally to media, and, more particularly, to methods and apparatus to identify media.
Media producers, media providers, advertisers, product manufacturers represented in advertisements, and many other entities utilize information about the presentation of media. Such information is often collected through the use of panels composed of persons (e.g., panelists) who have agreed to have their exposure to media monitored. For example, audio of media may be transmitted with identifying information (e.g., embedded watermarks or codes) that identifies the media. Panelists may be supplied with meters (e.g., portable meters carried and/or worn by the panelists) that collect the audio and extract the identifying information. The information may be transmitted to a collection facility where the results from multiple panelists are combined to generate reports comprising information about media presentation.
Embedding identification information in media (e.g., content, advertisements, audio, video, movies, commercials, television programs, radio programs, video games, etc.) is often performed at or on behalf of a media provider. Such embedding relies on cooperation with media providers. Obtaining such cooperation may be complicated due to the large number of media providers across different media platforms (e.g., terrestrial television, cable television, satellite television, Internet television, Sony PlayStation® video games, Nintendo® video games, Microsoft® video games, movie producers, video on demand, CDs, DVDs, etc.). Furthermore, media providers may be unwilling or unable to cooperate.
Methods and apparatus described herein embed identification information in media utilizing a media presentation device that renders the media for presentation (e.g., a media presentation device at a panelist home). In some examples, such methods and apparatus rely on little or no cooperation from media providers. The use of computational resources of the media providers is, thus, reduced. Furthermore, by embedding identification at the media presentation device, further information about the audience, the device, and/or applications executing on the device can be included in the embedded identification information.
In some disclosed examples, a media presentation device determines identification information for an application executing on the media presentation device. For example, the application may be an application controlling media presentation (e.g., an operating system component presenting a video game, a cloud video distribution application (e.g., Netflix®, Hulu®, MLBtv®, etc.), a digital versatile disk (DVD) decoding application, a Flash program, etc.). The media presentation device determines an identifier for the application by consulting a lookup table. The media presentation device inserts the identifier as a watermark in the media so that the media is presented with the embedded watermark identifier. In some examples, the media presentation device additionally or alternatively determines identification information for media presented by the application (e.g., a manufacturer identification number embedded in a video game DVD, an identifier of a video distributed by a cloud video distribution application, metadata associated with media, an originating internet protocol address of streaming media, etc.). The media presentation device determines an identifier for the media by consulting a lookup table (e.g., a second lookup table) or algorithm to produce an appropriate watermark identifier. The media presentation device inserts the identifier for the media as a watermark in the media. In some examples, a first level watermark (e.g., corresponding to the application identifier) and a second level watermark (e.g., corresponding to the media identifier) are inserted in the media in an interleaved manner. Where two levels of watermarks are used, the first level can identify the media presentation device and/or the application, and the second level can identify the media content.
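By way of illustration, the two lookups described above can be sketched as follows. The table contents, application names, and identifier values below are hypothetical and serve only to show the mapping from identifying information to watermark identifiers:

```python
# Hypothetical first-level (application) and second-level (media) lookup tables.
APP_TABLE = {"CloudVideoApp": 0x0101, "DiscPlayerApp": 0x0102}
MEDIA_TABLE = {"movie-xyz": 0x2001, "game-abc": 0x2002}

def resolve_watermark_ids(app_name, media_name):
    """Map application and media identifying information to watermark
    identifiers via the lookup tables; returns None for unknown keys."""
    app_id = APP_TABLE.get(app_name)
    media_id = MEDIA_TABLE.get(media_name)
    return app_id, media_id
```

In this sketch, an unknown application or media name yields no identifier, which a real implementation might handle by falling back to a generic identifier or omitting that watermark level.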
In some examples, after the watermarked media is output by the media presentation device, the watermarks are detected by a meter and are sent to a data collection facility for analysis and/or reporting.
In examples disclosed herein, any type of watermarking (e.g., video watermarking, audio watermarking, etc.) or any other technique for embedding identifying information in media may be utilized. For example, watermarks may be embedded as Nielsen Watermarks codes (a.k.a. Nielsen codes) of The Nielsen Company (US), LLC, as Arbitron audio encoding watermarks, etc. Example methods, systems, and apparatus to encode and/or decode audio watermarks are disclosed in U.S. patent application Ser. No. 12/551,220, entitled “Methods and Apparatus to Perform Audio Watermarking and Watermark Detection and Extraction,” filed Aug. 31, 2009, U.S. patent application Ser. No. 12/464,811, entitled “Methods and Apparatus to Perform Audio Watermarking and Watermark Detection and Extraction,” filed May 12, 2009, U.S. patent application Ser. No. 12/249,619, entitled “Methods and Apparatus to Perform Audio Watermarking Detection and Extraction,” filed Oct. 10, 2008, all of which are hereby incorporated by reference in their entireties.
The media presentation device 102 of the illustrated example receives and/or outputs media for presentation. For example, the media presentation device 102 may be a set-top box, an integrated receiver decoder, a video game console, a disk media player (e.g., a DVD player, Blu-ray player, a compact disk (CD) player), a television, a computing device, etc. The media presentation device 102 may present the media via an integrated output device (e.g., where the media presentation device 102 is a television) or via a second device (e.g., via external speakers and/or video display). Thus, the media presentation device 102 may output the media via its own functionality and/or to another device that presents the media (e.g., a television, speakers, etc.). The example media presentation device 102 of
The media application 120 interfaces with and causes presentation of media by the media rendering engine 124. The media application may be integrated with an operating system of the media presentation device 102 (e.g., an application to present video games on gaming disks inserted in a Sony PlayStation) or may be an application apart from the operating system (e.g., an application associated with an entity different from the manufacturer of the media presentation device 102 such as a Netflix application executing on a Sony PlayStation). The media may be stored on a removable tangible storage medium 122 (e.g., a DVD, a CD, a Blu-ray, a flash memory), retrieved from the media provider 106 via the network 104, obtained from a broadcast or unicast system, stored on a local tangible storage medium of the media presentation device 102 (e.g., a hard disk drive), or stored on any other tangible storage medium.
The media application 120 directs the media to the media rendering engine 124 for rendering and presentation. The media application 120 of the illustrated example also includes an interface 121 (e.g., an application programming interface (API)) that enables the identification generator 126 to obtain information about media handled by the media application 120. The identification generator 126 and access to the identification information are described in further detail below. In some examples, while the operating system of the media presentation device 102 is aware of the identity of the media application 120, the media presentation device 102 may not be apprised of the identity of media presented by the media application 120. In such examples, the media application 120 may allow the identification generator 126 to access identifying information for the media (e.g., using the API 121 or any other interface to the media application 120).
The media rendering engine 124 of the illustrated example receives media via the media application 120 and renders the media for presentation. For example, the media rendering engine 124 may include an audio subsystem to render audio for presentation, a video subsystem to render video for presentation, etc. For example, if the example media presentation device 102 is implemented as a video gaming system, the media rendering engine 124 may render the media as it is dynamically generated by the media application 120 in order to display the gaming environment. As another example, if the example media presentation device 102 is presenting a movie on a DVD, the media rendering engine 124 decodes and renders the movie for presentation on a display.
The media rendering engine 124 of the illustrated example also includes an interface (e.g., an API) that allows the identification generator 126 to access the media to be rendered. For example, the identification generator 126 of the illustrated example cooperates with the watermark generator 127 by sending the media to the watermark generator 127 for watermarking prior to output of the media by the media presentation device 102. For example, the watermark generator 127 may access the media and embed watermarks in real-time as the media is rendered. Such real-time encoding may be advantageous in, for example, video gaming media presentation devices 102 where the media (e.g., the audio of the media) is dynamic and based on the play of the game.
The example identification generator 126 of the illustrated example determines application identification information for the media application 120 via an operating system of the media presentation device 102. Alternatively, the identification generator 126 may determine the application identification information for the media application 120 from any other source (e.g., by querying the API 121 of the media application 120). The application identification information for the media application 120 may be a name of the media application 120, an identification number of the media application 120, a globally unique identifier (GUID) of the media application 120, a manufacturer identifier, an identifier embedded in the application, or any other unique or semi-unique identifier for the media application 120.
The example identification generator 126 also determines media identification information for media to be presented by and/or currently presented by the media application 120 by querying the API 121 of the media application 120. The media identification information may be any information useful for identifying the media such as a name of the media, an identification number for the media (e.g., an identification number embedded in the media, an identification number embedded on a storage disk on which the media is stored, etc.), an identifier embedded in the media, a manufacturer identifier, or any other unique or semi-unique identifier for the media.
The identification generator 126 of the illustrated example accesses the lookup table 128 using the application identification information for the media application 120 and/or the media identification information for the media to determine application and/or media identification information to be embedded in the media. In particular, the example identification generator 126 determines application identification information to be embedded in a first watermark identifying the media application 120 and media identification information to be embedded in a second watermark identifying the media. The identification generator 126 passes application and/or media identifying information to the watermark generator 127. The watermark generator 127 generates the watermark(s) and inserts the watermark in media received from the media rendering engine 124. For example, the watermark generator 127 may generate watermark(s) as audio having tones at emphasized frequencies that are integrated with the audio of the media from the media rendering engine 124. The audio watermarking tones may be generated using psychoacoustic masking to reduce the perceptibility of the watermark(s) by humans exposed to the audio.
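The tone-based audio watermarking mentioned above can be illustrated with a minimal sketch that mixes a low-amplitude tone, whose frequency encodes a symbol, into audio samples. The symbol-to-frequency mapping and the fixed amplitude are assumptions for illustration; a real encoder would apply psychoacoustic masking to vary the watermark level with the content rather than using a constant amplitude:

```python
import math

def embed_tone(audio, symbol, sample_rate=48000, amplitude=0.02):
    """Mix a low-amplitude sinusoid into the audio samples.

    The tone frequency encodes `symbol` via a hypothetical linear
    frequency plan; the result is the original audio plus the tone.
    """
    freq = 1000 + 100 * symbol  # hypothetical symbol-to-frequency map (Hz)
    return [s + amplitude * math.sin(2 * math.pi * freq * n / sample_rate)
            for n, s in enumerate(audio)]
```

A decoder would recover the symbol by measuring energy at the candidate frequencies (e.g., with a Goertzel filter or FFT) over a block of samples.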
The watermark generator 127 of the illustrated example embeds identifying information in media received from the media rendering engine 124. The watermark generator 127 may generate the watermarks using any technique for embedding identifying information in media (e.g., generating watermarks to be mixed into audio of the media, generating image watermarks to be overlaid on the video of the media, etc.). The watermark generator 127 receives the media from the media rendering engine 124 and inserts the watermark in the media. For example, the watermark generator 127 may generate the watermark information in real-time or almost real-time to insert the watermarks into the media without causing human-noticeable delay in the media rendering.
In the illustrated example, the watermark generator 127 interleaves a first watermark identifying the media application 120 with a second watermark identifying the media. For example,
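The interleaving of the first (application) and second (media) watermarks can be sketched as an alternating insertion schedule. This is a simplification for illustration; an actual implementation may vary the timing, ordering, and encoding of the two watermark levels:

```python
def interleave_watermarks(app_marks, media_marks):
    """Alternate first-level (application) and second-level (media)
    watermarks into a single, ordered insertion schedule."""
    schedule = []
    for app_mark, media_mark in zip(app_marks, media_marks):
        schedule.extend([app_mark, media_mark])
    return schedule
```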
Example implementations of the identification generator 126 and the watermark generator 127 are described in further detail in conjunction with
The lookup table 128 of the illustrated example associates application and/or media identification information with example watermark(s). The lookup table 128 of the illustrated example includes two tables as illustrated in
Returning to the system 100 of
The example media provider 106 of
The meter 108 of the illustrated example detects the presentation of media by the media presentation device 102 (e.g., media presented by components of the media presentation device 102 and/or media presented via another media presentation device such as a television) and extracts the watermark(s) embedded in the media. For example, the meter 108 may be a portable metering device carried by a member of a panel or a stationary meter located to meter a specific presentation device or set of presentation devices. The meter 108 may include a microphone that captures sound near the portable metering device. When such a meter 108 is near the media output by the media presentation device 102, the microphone captures the audio output by the media presentation device 102. The meter 108 of the illustrated example decodes the watermark(s) embedded in the audio. In alternative examples, the audio is sent to the monitoring facility 110 for watermark extraction. The example meter 108 of
The monitoring facility 110 of the illustrated example receives information collected by the meter 108, stores the information, analyzes the information in conjunction with other data collected by other meters similar to the meter 108 of
According to the illustrated example, the identification generator 126, the watermark generator 127, and the lookup table 128 are provided by an operator of the monitoring facility 110 to a manufacturer of the media presentation device 102 for inclusion in the media presentation device 102 at the time of manufacture. For example, the identification generator 126 may be provided as a software development kit (SDK) for integration in the media presentation device 102. In such examples, the identification generator 126, the watermark generator 127, and/or the lookup table 128 may remain dormant in the media presentation device unless and until a purchaser of the product agrees to become a panelist. A purchaser may agree to become a panelist in any way (e.g., by calling the monitoring company, entering data including the internet protocol address of the purchased device, accessing a menu in the purchased device, etc.). In such examples, the purchaser is provided an opportunity to become a panelist. For instance, a consumer electronic device such as a television or electronic gaming system may be sold as “ratings ready” (e.g., as illustrated in
The lookup table updater 402 receives updated identifying information for the lookup table 128 of
The lookup table interface 404 is an interface to the lookup table 128 for the lookup table updater 402 and the data compiler 410. For example, the lookup table interface 404 may be a database engine that facilitates queries and/or other access to the lookup table 128. Alternatively, the lookup table interface 404 may be any other type of interface and/or may be integrated in one or both of the lookup table updater 402 and/or the data compiler 410.
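For illustration, a lookup table interface backed by a lightweight database engine might be sketched as follows; the schema, keys, and watermark identifier values in this sketch are hypothetical and use an in-memory SQLite table:

```python
import sqlite3

def make_lookup_table():
    """Create an in-memory table mapping identifying information
    (application or media keys) to watermark identifiers."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE lookup (key TEXT PRIMARY KEY, watermark_id INTEGER)")
    db.executemany("INSERT INTO lookup VALUES (?, ?)",
                   [("CloudVideoApp", 257), ("movie-xyz", 8193)])
    db.commit()
    return db

def query_watermark_id(db, key):
    """Return the watermark identifier for `key`, or None if unknown."""
    row = db.execute("SELECT watermark_id FROM lookup WHERE key = ?",
                     (key,)).fetchone()
    return row[0] if row else None
```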
The device interface 406 of the illustrated example interfaces with an operating system of the media presentation device 102 to obtain information about applications executing on (or about to execute on) the media presentation device 102. For example, the device interface 406 may obtain application identifying information for the media application 120 of
The application interface 408 of the illustrated example interfaces to an application executing on the media presentation device 102 (e.g., the media application 120) to obtain information about media presented by (or about to be presented by) the application. For example, the application interface 408 may obtain identifying information for the media by querying the API 121 of the media application 120. In some examples, while the operating system of the media presentation device 102 is aware of the application executing on the media presentation device 102 (e.g., a Netflix application), the operating system may not be aware of the media presented by the application (e.g., a movie selected by an operator of the media presentation device 102). Accordingly, the application interface 408 can determine identifying information for the media even when the identifying information is not known to the operating system of the media presentation device 102. The application interface 408 provides the information about the media to the data compiler 410.
The data compiler 410 receives application identifying information from the device interface 406 and receives media identifying information from the application interface 408. The data compiler 410 queries the lookup table 128 via the lookup table interface 404 to obtain watermark information for the identifying information, and passes the watermark information to the watermark generator 127 of
While an example manner of implementing the example system 100 has been illustrated in
Flowcharts representative of example machine readable instructions for implementing the identification generator 126 and/or the watermark generator 127 are shown in
As mentioned above, the example processes of
The instructions of
The watermark generator 127 acquires the audio of the media from the media rendering engine 124, processes the audio to inject the watermark into the audio, and returns the audio to the media rendering engine 124 (block 610). Alternatively, the watermark generator 127 may obtain and inject the watermark into video, may obtain and inject the watermark into audio and video of the media, or use any other process for watermarking media. The watermark generator 127 may be configured to always receive the audio and/or video from the media rendering engine 124. Alternatively, the watermark generator 127 may periodically receive the audio and/or video, may aperiodically receive the audio and/or video, may receive the audio and/or video upon a request from the watermark generator 127 to the media rendering engine 124, etc. The watermark generator 127 may generate a watermark including identifying information for the application in the fields of the watermark typically dedicated to the identification of media. For example, a Source Identifier field of a watermark is typically filled with information identifying a media provider, channel, broadcaster, etc. that provided media and a Time In Content field is typically filled with information indicating a position (e.g., time) from the beginning of the media. The example watermark generator 127 generates the watermark for the application as a Final Distributor watermark.
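The packing of identifying information into the Source Identifier and Time In Content fields described above can be sketched as a simple bit-field layout. The field widths used here are assumptions for illustration only and do not reflect the actual watermark format:

```python
def pack_payload(source_id, time_in_content, src_bits=16, tic_bits=16):
    """Pack a Source Identifier and Time In Content value into one
    integer payload (hypothetical 16-bit fields)."""
    assert source_id < (1 << src_bits) and time_in_content < (1 << tic_bits)
    return (source_id << tic_bits) | time_in_content

def unpack_payload(payload, src_bits=16, tic_bits=16):
    """Recover the (source_id, time_in_content) pair from a payload."""
    return payload >> tic_bits, payload & ((1 << tic_bits) - 1)
```

Packing and unpacking are exact inverses, so a decoder using the same assumed layout recovers both fields from the payload.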
Next, the application interface 408 retrieves identifying information for the media from the media application 120 (block 612). The data compiler 410 next determines watermark identifier(s) for the media by querying the lookup table 128 via the lookup table interface 404 and using the detected identifying information for the media (block 614). The watermark generator 127 then generates a watermark based on the determined identifier(s) (block 616). The watermark generator 127 acquires the audio of the media from the media rendering engine 124 (e.g., in real-time), processes the audio to inject the watermark into the audio, and returns the audio to the media rendering engine 124 (block 618). Alternatively, the watermark generator 127 may obtain and inject the watermark into video, may obtain and inject the watermark into audio and video of the media, or may use any other process for watermarking media. For example, the watermark generator 127 may generate a watermark including identifying information for the media in the fields of the watermark typically dedicated to the identification of media (e.g., the Source Identifier field and the Time In Content field). The example watermark generator 127 generates the watermark for the media as a Program Content watermark.
The example instructions of
While the foregoing describes generating and inserting an application watermark and then generating and inserting a media watermark, other implementations are possible. For example, an application watermark may be utilized without a media watermark or a media watermark may be utilized without an application watermark. Alternatively, separate processes may be utilized for application watermarking and media watermarking. In such instances, the separate processes may run in parallel, in series, and/or each process may execute without regard to the other process. In some implementations, the watermark generator 127 may check for previously inserted watermarks before inserting a watermark in media. For example, the watermark generator 127 may determine that a media watermark has already been inserted when inserting an application watermark. When a previously inserted watermark has been detected, the watermark generator 127 may utilize any known technique for inserting an additional watermark (e.g., inserting the watermark at a different time, inserting the watermark using a different encoding technique, inserting the watermark at a different frequency, setting a flag indicating the presence of multiple watermarks, etc.).
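One simple collision-handling strategy noted above (inserting the watermark at a different time when a slot is already occupied) can be sketched over a time-slotted insertion schedule; the slot model is an assumption for illustration:

```python
def insert_with_collision_check(schedule, new_mark, slot):
    """Place `new_mark` at `slot` in the schedule, shifting it to the
    next free slot (or appending) if the slot is already occupied.
    Returns the slot actually used."""
    while slot < len(schedule) and schedule[slot] is not None:
        slot += 1  # slot occupied by a previously inserted watermark
    if slot == len(schedule):
        schedule.append(new_mark)
    else:
        schedule[slot] = new_mark
    return slot
```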
While the foregoing describes determining watermarking information using a lookup table that associates information associated with media and applications (e.g., manufacturer identifiers) with the watermark information, other approaches for determining watermark information to be inserted into media may be utilized. For example, the media identifying information and/or the application identifying information may be inserted into the watermark (e.g., in a Time in Content field and/or in a Source Identifier field). In some instances, the size(s) of the media identifying information and/or the application identifying information may exceed the capacity of the watermark payload (or may be undesirably large such that the watermarks will exceed a desired length). In such instances, the identifying information may be compacted to a size that is less than the capacity of the watermark payload and/or a size that is less than a desired threshold. For example, if a DVD identifier on a disk (e.g., ‘xyz123abc456’) includes a portion that uniquely identifies the media (e.g., ‘abc456’) and a portion that identifies information not unique to the media (e.g., a genre, a production studio, a rating, etc.), the identification generator 126 may compact the identifier to the unique portion (e.g., ‘abc456’) by removing the non-unique portion. Thus, the shorter, unique portion of the identifier can be inserted in a watermark payload.
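The identifier compaction described above can be sketched as follows; the assumption that the unique portion is a fixed-length trailing substring is for illustration only, as a real identifier format would dictate where the split occurs:

```python
def compact_identifier(full_id, unique_len=6):
    """Keep only the trailing, media-unique portion of an identifier,
    discarding the non-unique prefix (e.g., genre/studio codes)."""
    return full_id[-unique_len:]
```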
The media presentation device 102 displays an agreement that explains the monitoring process, requests consent for monitoring usage of the media presentation device 102, and provides options for agreeing (e.g., an ‘I Agree’ button) or disagreeing (e.g., an ‘I Disagree’ button) (block 802). The media presentation device 102 then waits for a user to indicate a selection (block 804). When the user indicates that they disagree (e.g., do not want to enable monitoring), the instructions of
The processor platform 900 of the instant example includes a processor 912. For example, the processor 912 can be implemented by one or more microprocessors or controllers from any desired family or manufacturer.
The processor 912 includes a local memory 913 (e.g., a cache) and is in communication with a main memory including a volatile memory 916 and a non-volatile memory 914 via a bus 918. The volatile memory 916 may be implemented by Synchronous Dynamic Random Access Memory (SDRAM), Dynamic Random Access Memory (DRAM), RAMBUS Dynamic Random Access Memory (RDRAM) and/or any other type of random access memory device. The non-volatile memory 914 may be implemented by flash memory and/or any other desired type of memory device. Access to the main memory 914, 916 is controlled by a memory controller.
The processor platform 900 also includes an interface circuit 920. The interface circuit 920 may be implemented by any type of interface standard, such as an Ethernet interface, a universal serial bus (USB), and/or a PCI express interface.
One or more input devices 922 are connected to the interface circuit 920. The input device(s) 922 permit a user to enter data and commands into the processor 912. The input device(s) can be implemented by, for example, a keyboard, a mouse, a touchscreen, a track-pad, a trackball, isopoint and/or a voice recognition system.
One or more output devices 924 are also connected to the interface circuit 920. The output devices 924 can be implemented, for example, by display devices (e.g., a liquid crystal display, a cathode ray tube (CRT) display), a printer, and/or speakers. The interface circuit 920, thus, typically includes a graphics driver card.
The interface circuit 920 also includes a communication device such as a modem or network interface card to facilitate exchange of data with external computers via a network 926 (e.g., an Ethernet connection, a digital subscriber line (DSL), a telephone line, coaxial cable, a cellular telephone system, etc.).
The processor platform 900 also includes one or more mass storage devices 928 for storing software and data. Examples of such mass storage devices 928 include floppy disk drives, hard drive disks, compact disk drives and digital versatile disk (DVD) drives. The mass storage device 928 may implement the example lookup table 128 of
The coded instructions 932 of
Although certain example methods, apparatus and articles of manufacture have been described herein, the scope of coverage of this patent is not limited thereto. On the contrary, this patent covers all methods, apparatus and articles of manufacture fairly falling within the scope of the claims of this patent.
This patent is a continuation of U.S. patent application Ser. No. 14/556,197, filed Nov. 30, 2014, entitled “METHOD AND APPARATUS TO IDENTIFY MEDIA,” which is a continuation of International Application No. PCT/US13/68635, filed Nov. 6, 2013, entitled “METHOD AND APPARATUS TO IDENTIFY MEDIA,” which is a continuation of U.S. patent application Ser. No. 13/671,341, filed Nov. 7, 2012, entitled “METHOD AND APPARATUS TO IDENTIFY MEDIA.” Priority to U.S. patent application Ser. No. 14/556,197, International Application No. PCT/US13/68635 and U.S. patent application Ser. No. 13/671,341 is claimed. U.S. patent application Ser. No. 14/556,197, International Application No. PCT/US13/68635 and U.S. patent application Ser. No. 13/671,341 are hereby incorporated by reference herein in their respective entireties.
Number | Name | Date | Kind |
---|---|---|---|
5887136 | Yasuda et al. | Mar 1999 | A |
5915027 | Cox et al. | Jun 1999 | A |
5987997 | Roskam | Nov 1999 | A |
6208745 | Florencio et al. | Mar 2001 | B1 |
6285775 | Wu | Sep 2001 | B1 |
6748320 | Jones | Jun 2004 | B2 |
6804779 | Carroni et al. | Oct 2004 | B1 |
6819774 | Nguyen | Nov 2004 | B2 |
7224799 | Mase | May 2007 | B2 |
7577769 | Cobb | Aug 2009 | B2 |
7668334 | Reed et al. | Feb 2010 | B2 |
7827312 | Ramaswamy | Nov 2010 | B2 |
8085975 | Srinivasan | Dec 2011 | B2 |
8090822 | Lee | Jan 2012 | B2 |
8103049 | Petrovic et al. | Jan 2012 | B2 |
8225342 | Mears et al. | Jul 2012 | B2 |
8488837 | Bae et al. | Jul 2013 | B2 |
8639629 | Hoffman | Jan 2014 | B1 |
8768838 | Hoffman | Jul 2014 | B1 |
8874924 | McMillan | Oct 2014 | B2 |
20020032906 | Grossman | Mar 2002 | A1 |
20020033844 | Levy | Mar 2002 | A1 |
20020072975 | Steele et al. | Jun 2002 | A1 |
20020112048 | Gruyer et al. | Aug 2002 | A1 |
20020126869 | Wang | Sep 2002 | A1 |
20020138730 | Kim | Sep 2002 | A1 |
20030018587 | Althoff | Jan 2003 | A1 |
20030061607 | Hunter et al. | Mar 2003 | A1 |
20030151762 | Cherry | Aug 2003 | A1 |
20030174861 | Levy et al. | Sep 2003 | A1 |
20030188166 | Pelly | Oct 2003 | A1 |
20040044576 | Kurihara | Mar 2004 | A1 |
20040052400 | Inomata et al. | Mar 2004 | A1 |
20050015802 | Masson | Jan 2005 | A1 |
20050097331 | Majidimehr et al. | May 2005 | A1 |
20050144632 | Mears | Jun 2005 | A1 |
20050205482 | Gladney | Sep 2005 | A1 |
20060026431 | Campello De Souza | Feb 2006 | A1 |
20060083403 | Zhang | Apr 2006 | A1 |
20060085257 | Johnson | Apr 2006 | A1 |
20060107334 | Leone | May 2006 | A1 |
20070038728 | Jacobs et al. | Feb 2007 | A1 |
20070040934 | Ramaswamy | Feb 2007 | A1 |
20070130015 | Starr et al. | Jun 2007 | A1 |
20070150961 | Ikeda | Jun 2007 | A1 |
20070157249 | Cordray | Jul 2007 | A1 |
20070160082 | Un | Jul 2007 | A1 |
20070174623 | Watson | Jul 2007 | A1 |
20070198842 | Kawamae et al. | Aug 2007 | A1 |
20070227264 | Pors | Oct 2007 | A1 |
20070266252 | Davis et al. | Nov 2007 | A1 |
20070277039 | Zhao | Nov 2007 | A1 |
20070288952 | Weinblatt | Dec 2007 | A1 |
20080103978 | Houston | May 2008 | A1 |
20080142599 | Benillouche et al. | Jun 2008 | A1 |
20080154718 | Flake et al. | Jun 2008 | A1 |
20080172747 | Hurtado et al. | Jul 2008 | A1 |
20080215436 | Roberts | Sep 2008 | A1 |
20080247543 | Mick | Oct 2008 | A1 |
20080263579 | Mears | Oct 2008 | A1 |
20080295128 | Aaltonen | Nov 2008 | A1 |
20090049466 | Schoettle | Feb 2009 | A1 |
20090055854 | Wright et al. | Feb 2009 | A1 |
20090070587 | Srinivasan et al. | Mar 2009 | A1 |
20090113466 | Amitay | Apr 2009 | A1 |
20090136083 | Picard et al. | May 2009 | A1 |
20090220070 | Picard et al. | Sep 2009 | A1 |
20090305680 | Swift et al. | Dec 2009 | A1 |
20100063978 | Lee et al. | Mar 2010 | A1 |
20100064331 | Cooper | Mar 2010 | A1 |
20100205628 | Davis et al. | Aug 2010 | A1 |
20100228594 | Chweh et al. | Sep 2010 | A1 |
20100291907 | MacNaughtan et al. | Nov 2010 | A1 |
20100325025 | Etchegoyen | Dec 2010 | A1 |
20110055385 | Tung | Mar 2011 | A1 |
20110239244 | Karaoguz | Sep 2011 | A1 |
20110264657 | Hoffman et al. | Oct 2011 | A1 |
20120011531 | Levy et al. | Jan 2012 | A1 |
20120072940 | Fuhrer | Mar 2012 | A1 |
20120134548 | Rhoads | May 2012 | A1 |
20120150958 | Besehanic et al. | Jun 2012 | A1 |
20120154633 | Rodriguez | Jun 2012 | A1 |
20120197633 | Aoyagi | Aug 2012 | A1 |
20120208592 | Davis | Aug 2012 | A1 |
20120209949 | Deliyannis et al. | Aug 2012 | A1 |
20120240143 | Mathews | Sep 2012 | A1 |
20120254910 | Donoghue | Oct 2012 | A1 |
20120275642 | Aller | Nov 2012 | A1 |
20120284012 | Rodriguez | Nov 2012 | A1 |
20120290265 | Crystal et al. | Nov 2012 | A1 |
20120290950 | Rapaport et al. | Nov 2012 | A1 |
20120302222 | Williamson et al. | Nov 2012 | A1 |
20130185162 | Mo | Jul 2013 | A1 |
20130205311 | Ramaswamy et al. | Aug 2013 | A1 |
20130205326 | Sinha et al. | Aug 2013 | A1 |
20130297422 | Hunter et al. | Nov 2013 | A1 |
20130311478 | Frett | Nov 2013 | A1 |
20130311780 | Besehanic | Nov 2013 | A1 |
20130347016 | Rowe | Dec 2013 | A1 |
20140110468 | Kandregula | Apr 2014 | A1 |
20140129841 | McMillan | May 2014 | A1 |
20140344844 | Wright et al. | Nov 2014 | A1 |
20150089235 | McMillan | Mar 2015 | A1 |
20150089523 | Volovich et al. | Mar 2015 | A1 |
20150121073 | Wajs | Apr 2015 | A1 |
20150350245 | Twitchell, Jr. | Dec 2015 | A1 |
Number | Date | Country |
---|---|---
1220151 | Jul 2002 | EP |
3880162 | Feb 2007 | JP |
2008134725 | Jun 2008 | JP |
20090064142 | Jun 2009 | KR |
101088080 | Nov 2011 | KR |
20120034525 | Apr 2012 | KR |
20120042245 | May 2012 | KR |
2014074543 | May 2014 | WO |
2014137414 | Sep 2014 | WO |
Entry |
---|
International Bureau, “International Preliminary Report on Patentability,” mailed in connection with International Patent Application No. PCT/US2013/068635, dated May 21, 2015, 6 pages. |
United States Patent and Trademark Office, “Non-Final Office Action,” mailed in connection with U.S. Appl. No. 14/073,656, dated Jan. 20, 2015, 19 pages. |
International Searching Authority, “International Search Report and Written Opinion,” mailed in connection with International Patent Application No. PCT/US2013/068929, dated Feb. 27, 2014, 11 pages. |
IP Australia, “Examination Report,” mailed in connection with Australian Patent Application No. 2013203768, dated Jul. 11, 2014, 4 pages. |
European Patent Office, “Communication Pursuant to Rules 161(2) and 162 EPC,” mailed in connection with European Application No. 13853140.5, dated Jul. 2, 2015, 2 pages. |
United States Patent and Trademark Office, “Non-Final Office Action,” mailed in connection with U.S. Appl. No. 14/073,656, dated Nov. 13, 2015, 21 pages. |
United States Patent and Trademark Office, “Final Office Action,” mailed in connection with U.S. Appl. No. 14/073,656, dated Jul. 24, 2015, 28 pages. |
European Patent Office, “Examination Report,” mailed in connection with European Patent Application No. 13853140.5 dated Jan. 13, 2017, 6 pages. |
IP Australia, “Certificate of Grant,” mailed in connection with Australian Patent Application No. 2014250673, dated Sep. 1, 2016, 1 page. |
European Patent Office, “Communication Pursuant to Article 94(3) EPC,” mailed in connection with European Patent Application No. 13853140.5, dated Feb. 21, 2018, 8 pages. |
IP Australia, “Examination Report,” mailed in connection with Australian Patent Application No. 2016216648, dated Mar. 31, 2018, 3 pages. |
Canadian Intellectual Property Office, “Office Action,” mailed in connection with Canadian Patent Application No. 2890486, dated May 5, 2017, 3 pages. |
European Patent Office, “2nd Examination Report,” mailed in connection with European Patent Application No. 13853140.5, dated Aug. 10, 2017, 8 pages. |
IP Australia, “Examination Report,” mailed in connection with Australian Patent Application No. 2016216648, dated Sep. 16, 2017, 4 pages. |
European Patent Office, “Extended Search Report,” mailed in connection with European Patent Application No. 13853140.5, dated Apr. 21, 2016, 7 pages. |
IP Australia, “Notice of Acceptance,” mailed in connection with Australian Patent Application No. 2014250673, dated May 10, 2016, 2 pages. |
Canadian Intellectual Property Office, “Office Action,” mailed in connection with Canadian Patent Application No. 2,890,486, dated Jun. 10, 2016, 3 pages. |
United States Patent and Trademark Office, “Notice of Allowance,” mailed in connection with U.S. Appl. No. 13/671,341, dated Jun. 24, 2014, 8 pages. |
United States Patent and Trademark Office, “Supplemental Notice of Allowability,” mailed in connection with U.S. Appl. No. 13/671,341, dated Aug. 13, 2014, 2 pages. |
IP Australia, “Patent Examination Report,” mailed in connection with Australian Patent Application No. 2013203683, dated May 30, 2014, 2 pages. |
IP Australia, “Notice of Acceptance,” mailed in connection with Australian Patent Application No. 2013203683, dated Jul. 3, 2014, 2 pages. |
International Searching Authority, “International Search Report,” mailed in connection with International Patent Application No. PCT/US2013/068635, dated Feb. 18, 2014, 4 pages. |
International Searching Authority, “Written Opinion of the International Search Authority,” mailed in connection with International Patent Application No. PCT/US2013/068635, dated Feb. 18, 2014, 4 pages. |
United States Patent and Trademark Office, “Final Office Action,” mailed in connection with U.S. Appl. No. 14/556,197, dated Sep. 14, 2018, 14 pages. |
United States Patent and Trademark Office, “Non-final Office Action,” mailed in connection with U.S. Appl. No. 14/556,197, dated Apr. 12, 2018, 12 pages. |
United States Patent and Trademark Office, “Final Office Action,” mailed in connection with U.S. Appl. No. 14/556,197, dated Sep. 25, 2017, 12 pages. |
United States Patent and Trademark Office, “Non-final Office Action,” mailed in connection with U.S. Appl. No. 14/556,197, dated Apr. 20, 2017, 12 pages. |
United States Patent and Trademark Office, “Final Office Action,” mailed in connection with U.S. Appl. No. 14/556,197, dated Jul. 19, 2016, 12 pages. |
United States Patent and Trademark Office, “Non-final Office Action,” mailed in connection with U.S. Appl. No. 14/556,197, dated Jan. 22, 2016, 10 pages. |
United States Patent and Trademark Office, “Advisory Action,” mailed in connection with U.S. Appl. No. 14/556,197, dated Jul. 15, 2019, 2 pages. |
United States Patent and Trademark Office, “Examiner's Answer to Appeal Brief,” mailed in connection with U.S. Appl. No. 14/556,197, dated Oct. 10, 2019, 19 pages. |
European Patent Office, “Summons to Oral Proceedings,” mailed Apr. 30, 2019 in connection with European Patent Application No. 13853140.5, 8 pages. |
IP Australia, “Examination Report No. 1,” dated Aug. 31, 2019 in connection with Australian Patent Application No. 2018282417, 4 pages. |
Canadian Intellectual Property Office, “Office Action,” dated Jul. 26, 2019 in connection with Canadian Patent Application No. 3,021,656, 4 pages. |
IP Australia, “Notice of Grant for Patent,” dated Jan. 10, 2019 in connection with Australian Patent Application No. 2016216648, 1 page. |
Canadian Intellectual Property Office, “Notice of Allowance,” dated Apr. 20, 2018 in connection with Canadian Patent Application No. 2,890,486, 1 page. |
Canadian Intellectual Property Office, “Notice of Allowance,” dated Aug. 18, 2020 in connection with Canadian Patent Application No. 3,021,656, 1 page. |
IP Australia, “Examination Report No. 3,” dated Jul. 4, 2020 in connection with Australian Patent Application No. 2018282417, 3 pages. |
IP Australia, “Examination Report No. 2,” dated Mar. 25, 2020 in connection with Australian Patent Application No. 2018282417, 3 pages. |
European Patent Office, “Preliminary Opinion of the Examining Division,” dated Feb. 26, 2020 in connection with European Patent Application No. 13853140.5, 10 pages. |
European Patent Office, “Summons to Attend Oral Proceedings Pursuant to Rule 115(1) EPC,” mailed Apr. 9, 2020 in connection with European Patent Application No. 13853140.5, 13 pages. |
European Patent Office, “Decision to Refuse a European Patent Application,” dated Jan. 29, 2021 in connection with European Patent Application No. 13853140.5, 19 pages. |
IP Australia, “Exam report No. 1 for standard patent application,” issued in connection with Australian Patent Application No. 2020226993, dated Apr. 29, 2021, 4 pages. |
Number | Date | Country
---|---|---
20190215169 A1 | Jul 2019 | US |
Relation | Number | Date | Country
---|---|---|---
Parent | 14556197 | Nov 2014 | US
Child | 16355262 | | US
Parent | PCT/US2013/068635 | Nov 2013 | US
Child | 14556197 | | US
Parent | 13671341 | Nov 2012 | US
Child | PCT/US2013/068635 | | US