The present invention generally relates to watermarking digital content and more particularly to using watermarks to track content timeline in the presence of playback rate changes.
This section is intended to provide a background or context to the disclosed embodiments that are recited in the claims. The description herein may include concepts that could be pursued but are not necessarily ones that have been previously conceived or pursued. Therefore, unless otherwise indicated herein, what is described in this section is not prior art to the description and claims in this application and is not admitted to be prior art by inclusion in this section.
A video watermarking system which embeds ancillary information into a video signal is found in the ATSC standard A/335. In such systems it is sometimes necessary to play back auxiliary content that is synchronized to a watermark timeline recovered from the received content in cases where the recovered timeline has a non-linear mapping to real time.
This section is intended to provide a summary of certain exemplary embodiments and is not intended to limit the scope of the embodiments that are disclosed in this application.
Disclosed embodiments relate to a method for synchronizing auxiliary content to a watermark timeline recovered from received content when the recovered timeline has a non-linear mapping to real time. The method includes receiving video content having a video watermark embedded therein and decoding video frames from the received video content. A Detector Engine is used to receive the decoded video frames and extract a time_offset field, a VP1 payload, and a Cyclic Redundancy Check (CRC) field from each video frame. A Content Timeline Tracker is used to monitor and analyze the output of the Detector Engine and to produce a piecewise linear approximation of the content timeline, wherein playback rate changes made by a user on an upstream device can be tracked, thereby enabling the playback of auxiliary content which is synchronized to the watermark timeline recovered from the received content when the recovered timeline has a non-linear mapping to real time.
These and other advantages and features of disclosed embodiments, together with the organization and manner of operation thereof, will become apparent from the following detailed description when taken in conjunction with the accompanying drawings.
In the following description, for purposes of explanation and not limitation, details and descriptions are set forth in order to provide a thorough understanding of the disclosed embodiments. However, it will be apparent to those skilled in the art that the present invention may be practiced in other embodiments that depart from these details and descriptions.
Additionally, in the subject description, the word “exemplary” is used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word exemplary is intended to present concepts in a concrete manner.
Introduction
This disclosure describes logic that uses the video watermarks specified in the ATSC 3.0 Standards, Video Watermark Emission (A/335), Doc. A/335:2016, 20 Sep. 2016, which is incorporated by reference, and Content Recovery in Redistribution Scenarios (A/336), Doc. A/336:2019, 3 Oct. 2019, which is incorporated by reference, in order to detect and measure trick-play actions, such as pause, speed-up, slow-down and skip, performed on upstream devices such as a Set Top Box (STB). In particular, it is based on detecting VP1 messages specified in the A/336 standard, which comprise an 8-bit time_offset field, a 50-bit VP1 payload and a 32-bit Cyclic Redundancy Check (CRC) field in each video frame.
The time_offset field is incremented by one every 1/30 s within a message group that lasts 1.5 s, i.e., it can have the values 0, 1, 2, …, 44 within each message group. The VP1 payload (P) is divided into four fields: Domain Type (DT), Server Code (SC), Interval Code (IC), and Query Flag (QF). DT is a one-bit field (0=“small domain”, 1=“large domain”). For “small domain”, the SC field consists of 31 bits and the IC field consists of 17 bits. For “large domain”, the SC field consists of 23 bits and the IC field consists of 25 bits. The QF field is always one bit, and its toggling signals a dynamic event that requires new signaling recovery. The IC field is incremented by one for each subsequent message group.
The CRC field is used to confirm correctness of the extracted data, as is well known to those skilled in the art. It is assumed that there is a detector engine that will receive decoded video frames and extract the 8-bit time_offset field, the 50-bit VP1 payload and the 32-bit CRC field from each video frame based on A/335 and A/336. The details of the detector engine design are not part of this disclosure.
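For illustration only, the following sketch shows one way a 50-bit VP1 payload could be unpacked into its fields according to the bit widths described above; the function name, the variable names, and the bit ordering within the payload (DT, SC, IC, QF from most-significant to least-significant bit) are assumptions made for this example and are not mandated by A/335 or A/336.

    # Illustrative sketch: unpack a 50-bit VP1 payload into its fields.
    # Field widths follow the description above; the bit ordering is an assumption.
    def parse_vp1_payload(payload):
        domain_type = (payload >> 49) & 0x1                    # 1-bit DT
        if domain_type == 0:                                   # "small domain"
            server_code = (payload >> 18) & ((1 << 31) - 1)    # 31-bit SC
            interval_code = (payload >> 1) & ((1 << 17) - 1)   # 17-bit IC
        else:                                                  # "large domain"
            server_code = (payload >> 26) & ((1 << 23) - 1)    # 23-bit SC
            interval_code = (payload >> 1) & ((1 << 25) - 1)   # 25-bit IC
        query_flag = payload & 0x1                             # 1-bit QF
        return domain_type, server_code, interval_code, query_flag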
CRC Matching
The CRC matching logic compares the CRC field extracted from the current frame with the CRC field extracted from the previous frame and sets the CRC repetition flag to TRUE if they match, and otherwise sets it to FALSE. This process is done regardless of whether the extracted CRC field matches the CRC calculated from the extracted data. Even if the extracted CRC field has bit errors and the actual data cannot be retrieved, it is still useful to know whether consecutive CRC fields are repeated. This information can later be used to discriminate between actual payload repetition, such as time_offset repetition in high frame-rate video or fragment repetition, and frame repetition in pause-and-seek playback rate changes, skips and pauses, as described below.
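A minimal sketch of this comparison, assuming a simple stateful helper, is shown below; the class and variable names are illustrative and not part of the disclosed design.

    # Illustrative sketch of the CRC matching logic described above.
    class CrcMatcher:
        def __init__(self):
            self.previous_crc = None

        def update(self, extracted_crc):
            # The repetition flag is set from the raw extracted CRC fields,
            # regardless of whether the CRC check on the extracted data passes.
            crf = (extracted_crc is not None and extracted_crc == self.previous_crc)
            self.previous_crc = extracted_crc
            return crf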
Content Timeline Tracker
The Content Timeline Tracker (“Tracker”) monitors the output of the detector engine and analyzes the frame_counter, interval_code, time_offset, and CRC repetition flag values to produce estSpeed, a piecewise linear approximation of the content timeline which can track playback rate changes initiated by a user on an upstream device (e.g., an STB).
Overview
Some applications require playback of auxiliary content which is synchronized to the watermark timeline recovered from the main content. For normal viewing the recovered timeline is real-time, meaning that an elapsed interval of media time occurs in an equal duration interval of real time. Other times, such as when the user is controlling playback of main content using ‘trick play’, the recovered timeline has a non-linear mapping to real time.
To play content, Media Player APIs typically expose a command to start (or continue) playing from a specific frame at a specific speed. A sufficiently fast player could track, frame by frame, the recovered timeline in all modes of play, but most current players cannot respond quickly enough to precisely seek to and render a frame within one frame's duration.
A goal of the Tracker is to quickly recognize when playback rate changes are initiated by the user, and to provide a piecewise-linear estimate of the playback speed which can then be used to control a replacement media player, minimizing the number of seek commands required to track the main content.
Control Segments
A Control Segment represents a period of time between two upstream user transport control commands which modify playback speed. The media timeline detected with the watermark might be a smooth rendition of the user's command (e.g., 2× resulting in regular frame decimation), or it might be a pause-seek stepwise approximation to the user's command (e.g., 32×).
The Control Segment is initialized with the currentMediaTime and currentClockTime.
An initial speed estimate uses the most recent deltaMediaTime.
Occasionally the speedEstimate is updated in the middle of a Control Segment, as the slope of the expanding control segment line becomes a better estimator of media speed. getCurrentCSSpeed() calculates the current slope and clips the value to speedLimit, as illustrated in the sketch below.
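The following sketch illustrates one possible realization of a Control Segment with this behavior; the class structure, attribute names, and the symmetric clipping to speedLimit are assumptions for illustration only.

    # Illustrative sketch of a Control Segment as described above.
    class ControlSegment:
        def __init__(self, current_media_time, current_clock_time, initial_speed):
            # Initialized with the currentMediaTime and currentClockTime;
            # the initial speed estimate is seeded from the most recent deltaMediaTime.
            self.start_media_time = current_media_time
            self.start_clock_time = current_clock_time
            self.speed_estimate = initial_speed

        def get_current_cs_speed(self, current_media_time, current_clock_time, speed_limit):
            # Slope of the expanding control-segment line, clipped to +/- speedLimit.
            elapsed_clock = current_clock_time - self.start_clock_time
            if elapsed_clock <= 0.0:
                return self.speed_estimate
            slope = (current_media_time - self.start_media_time) / elapsed_clock
            self.speed_estimate = max(-speed_limit, min(speed_limit, slope))
            return self.speed_estimate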
Tracker States
The Tracker implements a state machine to help recognize patterns in the recovered timeline and estimate the control segment boundaries. The states are shown in the Tracker States Table below.
Tracker Events/Tracker Main
track() is called once for every detected frame, with parameters frame_counter, interval_counter, time_offset and the CRC repetition flag. It generates the events (Pause Detected, play1× Detected, and Discontinuity Detected, described below) which drive the Tracker state machine.
Two successive calls to track() with the same IC and time_offset might mean that the content is paused, but this can also happen for frame rates higher than 30 fps because time_offset is quantized to 1/30 sec (T0Quantization). These duplicate frames caused by T0 quantization should be discarded by the Tracker, and this is done by looking at the deltaClockTime to determine whether two successive calls are spaced less than 1/30 sec apart. Note that deltaMediaTime might not be zero even if two successive calls are spaced closer than 1/30 sec because of upstream trick play, and these samples should not be discarded.
Two successive calls to track() might be spaced further than 1/fps seconds apart if intervening frames did not have time_offset available. The number of skipped frames is calculated in skippedFrames and used to test for 1× play speed.
The CRC repetition flag crf is used to indicate a paused state when the time_offset is not available; in this case the previous value of the time_offset is used.
When the fps is different from 1/T0Quantization, there will be an error in the calculation of delta media time. This kind of jitter is tolerated using a threshold in the calculation:
frameJitterThresholdSec = 0.99/fps
Pseudo-code for the track() function of the Tracker:
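A simplified sketch of such a track() function is given below, reflecting the T0 quantization, skipped-frame, and jitter handling described above; the class structure, the media-time reconstruction from interval_counter and time_offset, the thresholds, and the handler names are illustrative assumptions rather than the exact pseudo-code of the disclosed implementation.

    # Simplified, illustrative sketch of the track() behavior described above.
    T0_QUANTIZATION = 1.0 / 30.0   # time_offset tick, in seconds
    MESSAGE_GROUP_SEC = 1.5        # duration of one message group (one IC increment)

    class Tracker:
        def __init__(self):
            self.prev_media_time = None
            self.prev_clock_time = None
            self.prev_time_offset = 0
            self.current_segment = None   # ControlSegment, as sketched above
            self.est_speed = 1.0
            self.speed_limit = 16.0       # assumed clip value for speed estimates

        def track(self, frame_counter, interval_counter, time_offset, crf, fps, clock_time):
            frame_jitter_threshold_sec = 0.99 / fps

            if time_offset is None and crf:
                # CRC repetition with no time_offset available: reuse the previous value.
                time_offset = self.prev_time_offset

            media_time = interval_counter * MESSAGE_GROUP_SEC + time_offset * T0_QUANTIZATION

            if self.prev_media_time is None:
                self.prev_media_time, self.prev_clock_time = media_time, clock_time
                self.prev_time_offset = time_offset
                return

            delta_media_time = media_time - self.prev_media_time
            delta_clock_time = clock_time - self.prev_clock_time

            # Duplicate samples caused by T0 quantization at frame rates above 30 fps:
            # discard only when media time did not advance within one 1/30 s tick.
            if delta_media_time == 0.0 and delta_clock_time < T0_QUANTIZATION:
                return

            # Frames with no extracted time_offset between two calls to track().
            skipped_frames = max(0, round(delta_clock_time * fps) - 1)

            if delta_media_time == 0.0:
                self.on_pause_detected(media_time, clock_time)
            elif abs(delta_media_time - delta_clock_time) < frame_jitter_threshold_sec:
                self.on_play1x_detected(media_time, clock_time, skipped_frames)
            else:
                self.on_discontinuity_detected(media_time, clock_time, delta_media_time)

            self.prev_media_time, self.prev_clock_time = media_time, clock_time
            self.prev_time_offset = time_offset

The three event handlers invoked here correspond to the Pause Detected, play1× Detected, and Discontinuity Detected handlers sketched in the sections that follow.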
Pause Detected Event Handler
This event is triggered when successive frames show no advance in media time. This could be because the content is paused, or it might be part of content playback at a speed not equal to 1×, such as part of a ‘Pause-Seek’ operation for speeds >2.0, or part of frame interpolation for speeds <1.0.
A goal is to recognize as quickly as possible that pause is occurring to ensure that a tracking media player is responsive to user commands.
The main decision to be made in the event handlers is whether to start a new control segment or to update the current one. For example, new control segments should not be started in the middle of a sequence of pause-seeks, but the existing speed estimate should be updated.
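Continuing the illustrative Tracker sketch above, one possible form of this handler is shown below; the threshold on the current speed estimate and the decision structure are assumptions.

    # Illustrative sketch of a Pause Detected handler (method of the Tracker sketch above).
    def on_pause_detected(self, media_time, clock_time):
        if self.current_segment is not None and self.current_segment.speed_estimate > 2.0:
            # Likely the 'Pause' phase of an ongoing Pause-Seek sequence:
            # keep the current control segment and refine its speed estimate.
            self.est_speed = self.current_segment.get_current_cs_speed(
                media_time, clock_time, self.speed_limit)
        else:
            # Treat as a user-initiated pause: start a new control segment at speed 0.
            self.current_segment = ControlSegment(media_time, clock_time, 0.0)
            self.est_speed = 0.0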
play1× Detected Event Handler
play1× Detected might be part of normal 1× play, or it might be part of a sequence of frames where playback speed is <2×. A goal is to recognize as quickly as possible that normal 1× play is occurring to ensure that a tracking media player is responsive to user commands.
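Again continuing the Tracker sketch, a possible play1× Detected handler is shown below; the decision thresholds and names are assumptions.

    # Illustrative sketch of a play1x Detected handler (method of the Tracker sketch above).
    def on_play1x_detected(self, media_time, clock_time, skipped_frames):
        seg = self.current_segment
        if seg is not None and 1.0 < seg.speed_estimate < 2.0:
            # 1x frames interspersed within a <2x trick-play pattern: refine the estimate.
            self.est_speed = seg.get_current_cs_speed(media_time, clock_time, self.speed_limit)
        else:
            # Treat as normal 1x play: start a new control segment at speed 1.
            self.current_segment = ControlSegment(media_time, clock_time, 1.0)
            self.est_speed = 1.0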
Discontinuity Detected Event Handler
A discontinuity is any jump in the recovered timeline that is neither a pause nor frames spaced 1/fps apart. These might be part of a pause-seek (a ‘big’ jump), or result from playback speeds 1.0 < estSpeed < 2.0.
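A possible Discontinuity Detected handler, continuing the same sketch, is shown below; the threshold separating ‘big’ pause-seek jumps from the small jumps of <2× playback is an assumed value.

    # Illustrative sketch of a Discontinuity Detected handler (method of the Tracker sketch above).
    def on_discontinuity_detected(self, media_time, clock_time, delta_media_time):
        BIG_JUMP_SEC = 0.5  # assumed boundary between pause-seek jumps and <2x jitter
        if self.current_segment is None or abs(delta_media_time) > BIG_JUMP_SEC:
            # A 'big' jump: the seek phase of a Pause-Seek sequence or a user skip.
            self.current_segment = ControlSegment(media_time, clock_time, self.est_speed)
        else:
            # Small jumps occur for 1.0 < estSpeed < 2.0: refine the current estimate.
            self.est_speed = self.current_segment.get_current_cs_speed(
                media_time, clock_time, self.speed_limit)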
Tracking Timeline
estSpeed represents the slope of an idealized control segment. In reality, it is a noisy signal that is influenced by the imperfect nature of trick play media transports. A trackingTimeline is created with logic to try to remove this noise and produce sparsely spaced fSpeedUpdated events that delineate constant slope (constant speed) control segments.
The timeline is parameterized by tt.speed and tt.mediaTime, and can be quantized in time to correspond to the underlying video frame rate. For each processed video frame, trackingTimelineTimetick() is called to update the timeline by extrapolating the mediaTime using tt.speed. The timeline can also be resynchronized to the video watermark timeline in trackingTimelineUpdate(), which is also called for every processed video frame. trackingTimelineUpdate() selectively calls trackingTimelineSetTimeAndSpeed(time, speed), which updates the tracking timeline and sets the fSpeedUpdated Boolean.
trackingTimelineUpdate() does not always update tt.speed and tt.mediaTime; it uses thresholding logic and other heuristics to avoid too-frequent updates to fSpeedUpdated. This can be important if, for example, fSpeedUpdated is used to trigger the seeking of a media player that is playing alternate content synchronized to the incoming watermarked content.
trackingTimelineUpdate() analyzes the differences between tt.speed and the estSpeed which is estimated from the recovered watermarks. If there is any transition between pause and play (i.e., if (estSpeed == 0.0 || estSpeed == 1.0 || tt.speed == 0.0 || tt.speed == 1.0) && (tt.speed != estSpeed)), the tracking timeline is immediately updated.
If tt.speed and estSpeed have opposite signs, the tracking timeline is also immediately updated so that overshoot is reduced in tracking devices. If the signs are the same then the tracking timeline is only updated if the ratio of tt.speed and estSpeed is outside of a thresholded window. This avoids constant fSpeedUpdated triggers that might be due to small estimation errors in estSpeed and other system noise.
If none of the speed analysis conditions are true, trackingTimelineUpdate() analyzes the difference between tt.mediaTime and the currentMediaTime. If this difference is above a threshold, then the tracking timeline is updated. The threshold is adjusted based on estSpeed, so that there is a greater tolerance to time errors when operating at fast trick play speeds. In most cases the tracking timeline is updated using the currentMediaTime and estSpeed; however, if such an update would reverse the sign of the speed when the time difference is relatively small and diverging, this is recognized as normal tracking of a pause-seek trick play source, and the tracking timeline is instead updated to pause at currentMediaTime to wait for the next seek in the pause-seek sequence.
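The following simplified sketch illustrates the update logic described in this section; the ratio window, the base time-difference threshold, and its scaling with estSpeed are assumed values, and the pause-seek sign-reversal refinement described above is omitted for brevity.

    # Illustrative sketch of the tracking timeline and its update logic.
    from dataclasses import dataclass

    @dataclass
    class TrackingTimeline:
        media_time: float = 0.0
        speed: float = 0.0

    def tracking_timeline_timetick(tt, frame_duration_sec):
        # Extrapolate the media time once per processed video frame.
        tt.media_time += tt.speed * frame_duration_sec

    def tracking_timeline_set_time_and_speed(tt, time, speed):
        tt.media_time, tt.speed = time, speed
        return True  # fSpeedUpdated

    def tracking_timeline_update(tt, est_speed, current_media_time,
                                 ratio_window=(0.8, 1.25), base_time_threshold_sec=0.5):
        # Any transition between pause and play is applied immediately.
        if (est_speed in (0.0, 1.0) or tt.speed in (0.0, 1.0)) and tt.speed != est_speed:
            return tracking_timeline_set_time_and_speed(tt, current_media_time, est_speed)

        # Opposite signs are also applied immediately so that overshoot is reduced.
        if tt.speed * est_speed < 0.0:
            return tracking_timeline_set_time_and_speed(tt, current_media_time, est_speed)

        # Otherwise update only if the speed ratio leaves a thresholded window.
        if est_speed != 0.0:
            ratio = tt.speed / est_speed
            if not (ratio_window[0] <= ratio <= ratio_window[1]):
                return tracking_timeline_set_time_and_speed(tt, current_media_time, est_speed)

        # Time-difference check, with greater tolerance at fast trick-play speeds.
        time_threshold = base_time_threshold_sec * max(1.0, abs(est_speed))
        if abs(tt.media_time - current_media_time) > time_threshold:
            return tracking_timeline_set_time_and_speed(tt, current_media_time, est_speed)

        return False  # fSpeedUpdated not set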
Trickplay Timeline Examples
Examples of non-linear timelines resulting from the user's operation of STB remote control trickplay functions are shown below. These are selected from a set of test vectors that can be used to validate implementations of this algorithm.
In these examples, the user input is a sparse sequence of button pushes to change playback speed or skip through content. The STB's main media player responds by seeking in the content and using custom frame decimation and interpolation to play the content at the commanded speed. A typical algorithm is ‘Pause-Seek’, where a frame is repeated (‘Pause’) while the player seeks to an appropriate frame to play next.
1×→2×→8× Playback
A closer look at the 2× playback section shows regular frame decimation.
Similarly, playback rates between 1.0 and 2.0 consist of periods of 1× playback interspersed with jumps of 2 frames. Playback rates <1.0 consist of repeated frames interspersed with 1× frame increments.
ChannelMaster 32× Playback
ChannelMaster Skip Ahead/Skip Back
It is understood that the various embodiments of the present invention may be implemented individually, or collectively, in devices comprised of various hardware and/or software modules and components. These devices, for example, may comprise a processor, a memory unit, and an interface that are communicatively connected to each other, and may range from desktop and/or laptop computers to consumer electronic devices such as media players, mobile devices, and the like.
Various embodiments described herein are described in the general context of methods or processes, which may be implemented in one embodiment by a computer program product, embodied in a computer-readable medium, including computer-executable instructions, such as program code, executed by computers in networked environments. A computer-readable medium may include removable and non-removable storage devices including, but not limited to, Read Only Memory (ROM), Random Access Memory (RAM), compact discs (CDs), digital versatile discs (DVD), etc. Therefore, the computer-readable media described in the present application comprise non-transitory storage media. Generally, program modules may include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Computer-executable instructions, associated data structures, and program modules represent examples of program code for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps or processes.
The foregoing description of embodiments has been presented for purposes of illustration and description. The foregoing description is not intended to be exhaustive or to limit embodiments of the present invention to the precise form disclosed, and modifications and variations are possible in light of the above teachings or may be acquired from practice of various embodiments. The embodiments discussed herein were chosen and described in order to explain the principles and the nature of various embodiments and their practical application, to enable one skilled in the art to utilize the present invention in various embodiments and with various modifications as are suited to the particular use contemplated. The features of the embodiments described herein may be combined in all possible combinations of methods, apparatus, modules, systems, and computer program products.
This application claims priority to U.S. Provisional Patent Application No. 63/147,122, filed Feb. 8, 2021, and U.S. Provisional Patent Application No. 63/225,381, filed Jul. 23, 2021, the entireties of which are incorporated by reference.