
US20240007712A1 - System and method for tracking content timeline in the presence of playback rate changes - Google Patents


Info

Publication number
US20240007712A1
Authority
US
United States
Prior art keywords
content
timeline
playback
tracker
repetition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US18/344,792
Inventor
Patrick George Downes
Rade Petrovic
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Verance Corp
Original Assignee
Verance Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Verance Corp filed Critical Verance Corp
Priority to US18/344,792 priority Critical patent/US20240007712A1/en
Publication of US20240007712A1 publication Critical patent/US20240007712A1/en
Abandoned legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47217End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/07Responding to the occurrence of a fault, e.g. fault tolerance
    • G06F11/08Error detection or correction by redundancy in data representation, e.g. by using checking codes
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/07Responding to the occurrence of a fault, e.g. fault tolerance
    • G06F11/08Error detection or correction by redundancy in data representation, e.g. by using checking codes
    • G06F11/10Adding special bits or symbols to the coded information, e.g. parity check, casting out 9's or 11's
    • G06F11/1004Adding special bits or symbols to the coded information, e.g. parity check, casting out 9's or 11's to protect a block of data words, e.g. CRC or checksum
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/414Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/4147PVR [Personal Video Recorder]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/43074Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of additional data with content streams on the same device, e.g. of EPG data or interactive icon with a TV program
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44008Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/835Generation of protective data, e.g. certificates
    • H04N21/8358Generation of protective data, e.g. certificates involving watermark

Definitions

  • the present invention generally relates to watermarking digital content and more particularly to using watermarks to track content timeline in the presence of playback rate changes.
  • a video watermarking system which embeds ancillary information into a video signal is found in the ATSC standard A/335. In such systems it is sometimes necessary to playback auxiliary content which is synchronized to a watermark timeline recovered from the received content in cases where the recovered timeline has a non-linear mapping to real time.
  • FIG. 1 illustrates exemplary non-linear timelines resulting from the user's operation of STB remote control trickplay functions in response to a sequence of user commands in accordance with an embodiment of the disclosure.
  • FIG. 2 illustrates exemplary non-linear timelines resulting from the user's operation of STB remote control trickplay functions, showing that occasionally two input frames are skipped, resulting in an overall rate of ~2.08×, in accordance with an embodiment of the disclosure.
  • FIG. 3 illustrates exemplary non-linear timelines resulting from the user's operation of STB remote control trickplay functions in response to starting 1× playback, then hitting the ‘>>’ button three times in succession, resulting in ‘1×’, ‘2×’, ‘8×’, ‘32×’ playback in accordance with an embodiment of the disclosure.
  • FIG. 4 illustrates exemplary non-linear timelines resulting from the user's operation of STB remote control trickplay functions in response to a series of ‘skip-forward’ and ‘skip-back’ commands, resulting in short pauses prior to the skip, then an immediate return to 1× playback.
  • FIG. 5 illustrates a block diagram of a device that can be used for implementing various disclosed embodiments.
  • Disclosed embodiments relate to a method for synchronizing auxiliary content to a watermark timeline recovered from received content when the recovered timeline has a non-linear mapping to real time.
  • the method includes receiving video content having a video watermark embedded therein and decoding video frames from the received video content.
  • a Detector Engine is used to receive the decoded video frames and extract a time-offset field, a VP1 payload, and a Cyclic Redundancy Check (CRC) field from each video frame.
  • a Content Timeline Tracker is used to monitor and analyze the output of the Detector Engine to produce a piecewise linear approximation of the content timeline, wherein playback rate changes by a user in an upstream device can be tracked, thereby enabling the playback of auxiliary content which is synchronized to a watermark timeline recovered from the received content when the recovered timeline has a non-linear mapping to real time.
  • exemplary is used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word exemplary is intended to present concepts in a concrete manner.
  • This disclosure describes the logic that uses video watermarks specified in the ATSC 3.0 Standards, Video Watermark Emission (A/335), Doc. A/335:2016, 20 Sep. 2016, which is incorporated by reference, and Content Recovery in Redistribution Scenarios (A/336), Doc. A/336:2019, 3 Oct. 2019, which is incorporated by reference, in order to detect and measure trick-play actions, such as pause, speed-up, slow-down and skip, on upstream devices such as a Set Top Box (STB).
  • eVP1 messages specified in the A/336 standard comprise an 8-bit time_offset field, a 50-bit VP1 payload and a 32-bit Cyclic Redundancy Check (CRC) field in each video frame.
  • the time_offset field is incremented by one every 1/30 s within a message group that lasts 1.5 s, i.e., it can have values 0, 1, 2, . . . 44 within each message group.
  • the VP1 payload (P) is divided into four fields: Domain Type (DT), Server Code (SC), Interval Code (IC), and Query Flag (QF).
  • the SC field consists of 31 bits and the IC field consists of 17 bits.
  • the SC field consists of 23 bits and the IC field consists of 25 bits.
  • the QF field is always one bit, and its toggling signals a dynamic event that requires new signaling recovery.
  • the IC field is incremented by one for each subsequent message group.
  • the CRC field is used to confirm correctness of the extracted data, as is well known to those skilled in the art. It is assumed that there is a detector engine that will receive decoded video frames and extract 8-bit time_offset field, 50-bit VP1 payload and 32-bit CRC field in each video frame based on A/335 and A/336. The details of detector engine design are not part of this disclosure.
  • the CRC matching logic compares the CRC field extracted from the current frame with the CRC field extracted from the previous frame and sets the CRC repetition flag to TRUE if they match and otherwise sets it to FALSE. This process is done regardless of whether the extracted CRC field matches the calculated CRC field based on the extracted data. Even if the extracted CRC field has bit errors and the actual data cannot be retrieved, we still want to know whether consecutive CRC fields are repeated. This information can later be used to discriminate between actual payload repetition (such as time_offset repetition in high frame-rate video or fragment repetition) and frame repetition in pause-and-seek playback rate changes, skips and pauses, as described below.
  • the Content Timeline Tracker (“Tracker”) monitors the output of the detector engine and analyzes frame_counter, interval_code, time_offset, and CRC repetition flag values to produce estSpeed, a piecewise linear approximation of the content timeline which can track playback rate changes initiated by a user on an upstream device (e.g., STB).
  • auxiliary content which is synchronized to the watermark timeline recovered from the main content.
  • the recovered timeline is real-time, meaning that an elapsed interval of media time occurs in an equal duration interval of real time.
  • the recovered timeline has a non-linear mapping to real time.
  • Media Player APIs typically expose a command to start (or continue) to play from a specific frame at a specific speed.
  • a sufficiently fast player could track, frame-by-frame, the recovered timeline in all modes of play, but most current players cannot respond quickly enough to precisely seek to and render a frame within one frame's duration.
  • a goal of the Tracker is to quickly recognize where playback rate changes are initiated by the user, and provide a piecewise-linear estimate of the playback speed which can then be used in controlling a replacement media player, minimizing the number of seek commands required to track the main content.
  • Tracker variables (reconstructed from the flattened table):
    • controlSegmentStartMediaTime (Float): Media time of the current Control Segment start. Init value 0.0.
    • controlSegmentStartClockTime (Float): Clock time of the current Control Segment start.
    • currentMediaTime (Float): Media time as calculated using ic and to (the time_offset).
    • pauseCountThreshold (Int): 11; make this larger than the largest number of frames encountered during pause-seek trick play.
    • stableStateCount (Int): 5; stability threshold for counting pause or 1× events.
  • a Control Segment represents a period of time between two upstream user transport control commands which modify playback speed.
  • the media timeline detected with the watermark might be a smooth rendition of the user's command (e.g., 2× resulting in regular frame decimation), or it might be a pause-seek stepwise approximation to the user's command (e.g., 32× in FIG. 1).
  • the Control Segment is initialized with the currentMediaTime and currentClockTime.
  • estSpeed = deltaMediaTime / frameDurationSec; return min(max(estSpeed, -c.speedLimit), c.speedLimit)
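A runnable restatement of this clamped slope computation follows; the Config type stands in for the c object, and the 64.0 bound is an assumption (the disclosure names only c.speedLimit, not its value):

```python
from dataclasses import dataclass

@dataclass
class Config:
    speedLimit: float = 64.0  # assumed bound; the disclosure names only c.speedLimit

def estimate_speed(deltaMediaTime: float, frameDurationSec: float, c: Config) -> float:
    # Slope of the recovered media timeline over one frame interval,
    # clamped to +/- c.speedLimit as in the pseudocode above.
    estSpeed = deltaMediaTime / frameDurationSec
    return min(max(estSpeed, -c.speedLimit), c.speedLimit)
```

The clamp keeps a single corrupted watermark reading from producing an absurd speed estimate downstream.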
  • the Tracker implements a state machine to help recognize patterns in the recovered timeline and estimate the control segment boundaries.
  • the states are shown in the tracker States Table below.
  • track( ) is called with parameters frame_counter, interval_counter, time_offset and the CRC repetition flag. It generates the events which drive the Tracker state machine.
  • track( ) is called once for every detected frame.
  • Two successive calls to Track( ) might be spaced further than 1/fps seconds apart if intervening frames did not have time_offset available.
  • the number of skipped frames is calculated in skippedFrames and used to test for 1× play speed.
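One plausible form of this 1× test, sketched under the assumption that the Tracker keeps the frame counter and recovered media frame from the previous call (the parameter names are illustrative, not from the disclosure):

```python
def is_1x_step(prev_frame_counter: int, frame_counter: int,
               prev_media_frame: int, media_frame: int) -> bool:
    # Frames elapsed since the last frame that carried a usable time_offset.
    skippedFrames = frame_counter - prev_frame_counter
    # At 1x speed the recovered media timeline advances by exactly the
    # number of elapsed input frames.
    return (media_frame - prev_media_frame) == skippedFrames
```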
  • the CRC repetition flag crf is used to indicate a paused state when the time_offset is not available; in this case the previous value of the time_offset is used.
  • This event is triggered when successive frames show no advance in media time. This could be because the content is paused, or it might be part of content playback at a speed not equal to 1×, such as part of a ‘Pause-Seek’ operation for speed >2.0, or part of frame interpolation for speed <1.0.
  • a goal is to recognize as quickly as possible that pause is occurring to ensure that a tracking media player is responsive to user commands.
  • the main decision to be made in the event handlers is whether to start a new control segment or update the current one. For example, new control segments should not be started in the middle of a sequence of pause-seeks; instead, the existing speed estimate should be updated.
  • play1xDetected might be part of normal 1× play, or it might be part of a sequence of frames where playback speed is <2×.
  • a goal is to recognize as quickly as possible that normal 1× play is occurring to ensure that a tracking media player is responsive to user commands.
  • a discontinuity is any jump in the recovered timeline which is not a pause or frames spaced 1/fps apart. These might be part of a pause-seek (a ‘big’ jump below), or result from playback speeds estSpeed < 2.0 && estSpeed > 1.0.
  • estSpeed represents the slope of an idealized control segment. In reality, it is a noisy signal that is influenced by the imperfect nature of trick play media transports.
  • a trackingTimeline is created with logic to try to remove this noise and produce sparsely spaced fSpeedUpdated events that delineate constant slope (constant speed) control segments.
  • the timeline is parametrized by a tt.speed and tt.mediaTime, and can be quantized in time to correspond to the underlying video frame rate. For each processed video frame, trackingTimelineTimetick( ) is called to update the timeline by extrapolating the mediaTime using tt.speed.
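A minimal sketch of the per-frame extrapolation, assuming a simple TrackingTimeline holder for tt.speed and tt.mediaTime (the type itself is an assumption; the disclosure names only the fields):

```python
from dataclasses import dataclass

@dataclass
class TrackingTimeline:
    speed: float = 1.0      # tt.speed
    mediaTime: float = 0.0  # tt.mediaTime, in seconds

def tracking_timeline_timetick(tt: TrackingTimeline, frameDurationSec: float) -> None:
    # Extrapolate media time by one frame duration at the current
    # estimated speed; called once per processed video frame.
    tt.mediaTime += tt.speed * frameDurationSec
```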
  • the timeline can also be resynchronized to the video watermark timeline in trackingTimelineUpdate( ) which is also called every processed video frame.
  • trackingTimelineUpdate( ) selectively calls trackingTimelineSetTimeAndSpeed (time, speed) which updates the tracking timeline and sets the fSpeedUpdated Boolean.
  • trackingTimelineUpdate( ) does not always update tt.speed and tt.mediaTime and uses thresholding logic and other heuristics to avoid too-frequent updates to fSpeedUpdated. This can be important if, for example, fSpeedUpdated is used to trigger the seeking of a media player which is playing alternate content synchronized to the incoming watermarked content.
  • if the signs of tt.speed and estSpeed differ, the tracking timeline is also immediately updated so that overshoot is reduced in tracking devices. If the signs are the same, then the tracking timeline is only updated if the ratio of tt.speed to estSpeed is outside of a thresholded window. This avoids constant fSpeedUpdated triggers that might be due to small estimation errors in estSpeed and other system noise.
  • trackingTimelineUpdate( ) analyzes the differences between tt.mediaTime and the currentMediaTime. If this difference is above a threshold, then the tracking timeline is updated. The threshold is adjusted based on the estSpeed, so that there is a greater tolerance to time errors when operating at fast trick play speeds. In most cases the tracking timeline is updated using the currentMediaTime and estSpeed; however, if such an update would reverse the sign of the speed when the time difference is relatively small and the difference is diverging, this is recognized as normal tracking of a pause-seek trick play source, so the tracking timeline is updated to pause at currentMediaTime to wait for the next seek in the pause seek sequence.
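The update heuristics above can be sketched as follows. The base_threshold value, the 2× divergence window, and the TrackingTimeline type are illustrative assumptions, not values from the disclosure:

```python
from dataclasses import dataclass

@dataclass
class TrackingTimeline:
    speed: float = 1.0
    mediaTime: float = 0.0

def tracking_timeline_update(tt: TrackingTimeline, currentMediaTime: float,
                             estSpeed: float, base_threshold: float = 0.25) -> bool:
    """Returns fSpeedUpdated: True iff the tracking timeline was resynced."""
    diff = currentMediaTime - tt.mediaTime
    # Greater tolerance to time errors when operating at fast trick-play speeds.
    threshold = base_threshold * max(1.0, abs(estSpeed))
    if abs(diff) <= threshold:
        return False
    # A would-be sign reversal while the time error is still relatively small
    # is treated as normal tracking of a pause-seek source: pause at
    # currentMediaTime and wait for the next seek in the sequence.
    if estSpeed * tt.speed < 0 and abs(diff) < 2 * threshold:
        tt.speed = 0.0
        tt.mediaTime = currentMediaTime
        return True
    # Normal resync: adopt the watermark-derived time and speed estimate.
    tt.speed = estSpeed
    tt.mediaTime = currentMediaTime
    return True
```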
  • non-linear timelines resulting from the user's operation of STB remote control trickplay functions are shown below. These are selected from a set of test vectors that can be used to validate implementations of this algorithm.
  • the user input is a sparse sequence of button pushes to change playback speed or skip through content.
  • the STB's main media player responds by seeking in the content and using custom frame decimation and interpolation to play the content at the commanded speed.
  • a typical algorithm is ‘Pause-Seek’, where a frame is repeated (‘Pause’) while the player seeks to an appropriate frame to play next.
  • FIG. 1 shows the results of a sequence of user commands to a ChannelMaster DVR: starting 1× playback, then hitting the ‘>>’ button at frame 40 results in 2× playback until frame 90, when the second ‘>>’ command results in a brief pause, a slight regression in time, then a succession of pause-seek intervals.
  • the pause-seek interval timing is regularly spaced, with slight variations (e.g., pause for 6 frames, then a jump of 36 frames). Even though the display overlay says ‘8×’, the actual average is approx. 5.4×.
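The ~5.4× figure can be sanity-checked: each pause-seek cycle advances the media timeline by the jump size while real time advances by the cycle length in frames, so the average rate is their ratio (the cycle lengths below are illustrative, not measurements from the disclosure):

```python
def average_rate(jump_frames: int, cycle_frames: float) -> float:
    # Media frames advanced per pause-seek cycle, divided by the
    # real-time length of the cycle in frames.
    return jump_frames / cycle_frames
```

With a 36-frame jump roughly every 6-7 frames, the average lands near the observed ~5.4× rather than the displayed 8×; reaching a true 8× with 36-frame jumps would require a cycle of only 4.5 frames.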
  • playback rates between 1.0 and 2.0 consist of periods of 1× playback interspersed with jumps of 2 frames.
  • playback rates <1.0 consist of repeated frames interspersed with 1× frame increments.
  • FIG. 3 shows the results of starting 1× playback, then hitting the ‘>>’ button three times in succession, resulting in ‘1×’, ‘2×’, ‘8×’, ‘32×’ playback. Notice that at 32× the pause-seek steps are no longer uniform.
  • FIG. 4 shows the result of a series of ‘skip-forward’ and ‘skip-back’ commands, resulting in short pauses prior to the skip, then an immediate return to 1× playback.
  • FIG. 5 illustrates a block diagram of a device 1000 within which the various disclosed embodiments may be implemented.
  • the device 1000 comprises at least one processor 1002 and/or controller, at least one memory 1004 unit that is in communication with the processor 1002 , and at least one communication unit 1006 that enables the exchange of data and information, directly or indirectly, through the communication link 1008 with other entities, devices and networks.
  • the communication unit 1006 may provide wired and/or wireless communication capabilities in accordance with one or more communication protocols, and therefore it may comprise the proper transmitter/receiver antennas, circuitry and ports, as well as the encoding/decoding capabilities that may be necessary for proper transmission and/or reception of data and other information.
  • the device 1000 and the like may be implemented in software, hardware, firmware, or combinations thereof.
  • the various components or sub-components within each module may be implemented in software, hardware, or firmware.
  • the connectivity between the modules and/or components within the modules may be provided using any one of the connectivity methods and media that is known in the art, including, but not limited to, communications over the Internet, wired, or wireless networks using the appropriate protocols.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Television Signal Processing For Recording (AREA)

Abstract

A system and method for controlling a media player for replacement content, such as dynamic ad insertion. The system tracks video watermarks from a content stream, where the input content timeline is being modified by a user exercising the transport controls of a digital video recorder (DVR). A Detector Engine receives decoded video frames and extracts a time-offset field, a VP1 payload, and a Cyclic Redundancy Check (CRC) field from each video frame. A Content Timeline Tracker monitors and analyzes the output of the Detector Engine to produce a piecewise linear approximation of the content timeline, wherein playback rate changes by a user in an upstream device can be tracked. This enables the playback of auxiliary content which is synchronized to a watermark timeline recovered from the received content in cases where the recovered timeline has a non-linear mapping to real time. When the estimated speed is changing due to user-controlled trick play of recorded content, the estimated speed deviates from the user-intended speed profile because of imperfect playback of the media player. The system includes additional filtering of the estimated speed to produce a Boolean updated-speed signal which is asserted sparsely at estimated control segment endpoints in an attempt to delineate constant slope (constant speed) control segments.

Description

    REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. Nonprovisional patent application Ser. No. 17/667,464, filed Feb. 8, 2022, which claims priority to U.S. Provisional Patent Application No. 63/147,122, filed Feb. 8, 2021, and U.S. Provisional Patent Application No. 63/225,381, filed Jul. 23, 2021, the entireties of which are hereby incorporated by reference.
  • FIELD OF INVENTION
  • The present invention generally relates to watermarking digital content and more particularly to using watermarks to track content timeline in the presence of playback rate changes.
  • BACKGROUND
  • This section is intended to provide a background or context to the disclosed embodiments that are recited in the claims. The description herein may include concepts that could be pursued but are not necessarily ones that have been previously conceived or pursued. Therefore, unless otherwise indicated herein, what is described in this section is not prior art to the description and claims in this application and is not admitted to be prior art by inclusion in this section.
  • A video watermarking system which embeds ancillary information into a video signal is found in the ATSC standard A/335. In such systems it is sometimes necessary to playback auxiliary content which is synchronized to a watermark timeline recovered from the received content in cases where the recovered timeline has a non-linear mapping to real time.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates exemplary non-linear timelines resulting from the user's operation of STB remote control trickplay functions in response to a sequence of user commands in accordance with an embodiment of the disclosure.
  • FIG. 2 illustrates exemplary non-linear timelines resulting from the user's operation of STB remote control trickplay functions, showing that occasionally two input frames are skipped, resulting in an overall rate of ~2.08×, in accordance with an embodiment of the disclosure.
  • FIG. 3 illustrates exemplary non-linear timelines resulting from the user's operation of STB remote control trickplay functions in response to starting 1× playback, then hitting the ‘>>’ button three times in succession, resulting in ‘1×’, ‘2×’, ‘8×’, ‘32×’ playback in accordance with an embodiment of the disclosure.
  • FIG. 4 illustrates exemplary non-linear timelines resulting from the user's operation of STB remote control trickplay functions in response to a series of ‘skip-forward’ and ‘skip-back’ commands, resulting in short pauses prior to the skip, then an immediate return to 1× playback.
  • FIG. 5 illustrates a block diagram of a device that can be used for implementing various disclosed embodiments.
  • SUMMARY OF THE INVENTION
  • This section is intended to provide a summary of certain exemplary embodiments and is not intended to limit the scope of the embodiments that are disclosed in this application.
  • Disclosed embodiments relate to a method for synchronizing auxiliary content to a watermark timeline recovered from received content when the recovered timeline has a non-linear mapping to real time. The method includes receiving video content having a video watermark embedded therein and decoding video frames from the received video content. A Detector Engine is used to receive the decoded video frames and extract a time-offset field, a VP1 payload, and a Cyclic Redundancy Check (CRC) field from each video frame. A Content Timeline Tracker is used to monitor and analyze the output of the Detector Engine to produce a piecewise linear approximation of the content timeline, wherein playback rate changes by a user in an upstream device can be tracked, thereby enabling the playback of auxiliary content which is synchronized to a watermark timeline recovered from the received content when the recovered timeline has a non-linear mapping to real time.
  • These and other advantages and features of disclosed embodiments, together with the organization and manner of operation thereof, will become apparent from the following detailed description when taken in conjunction with the accompanying drawings.
  • DETAILED DESCRIPTION OF CERTAIN EMBODIMENTS
  • In the following description, for purposes of explanation and not limitation, details and descriptions are set forth in order to provide a thorough understanding of the disclosed embodiments. However, it will be apparent to those skilled in the art that the present invention may be practiced in other embodiments that depart from these details and descriptions.
  • Additionally, in the subject description, the word “exemplary” is used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word exemplary is intended to present concepts in a concrete manner.
  • Introduction
  • This disclosure describes the logic that uses video watermarks specified in the ATSC 3.0 Standards, Video Watermark Emission (A/335), Doc. A/335:2016, 20 Sep. 2016, which is incorporated by reference, and Content Recovery in Redistribution Scenarios (A/336), Doc. A/336:2019, 3 Oct. 2019, which is incorporated by reference, in order to detect and measure trick-play actions, such as pause, speed-up, slow-down and skip, on upstream devices such as a Set Top Box (STB). In particular, it is based on detecting eVP1 messages specified in the A/336 standard, which comprise an 8-bit time_offset field, a 50-bit VP1 payload and a 32-bit Cyclic Redundancy Check (CRC) field in each video frame.
  • The time_offset field is incremented by one every 1/30 s within a message group that lasts 1.5 s, i.e., it can have values 0, 1, 2, . . . 44 within each message group. The VP1 payload (P) is divided into four fields: Domain Type (DT), Server Code (SC), Interval Code (IC), and Query Flag (QF). DT is a one-bit field (0=“small domain”, 1=“large domain”). For “small domain”, the SC field consists of 31 bits and the IC field consists of 17 bits. For “large domain”, the SC field consists of 23 bits and the IC field consists of 25 bits. The QF field is always one bit, and its toggling signals a dynamic event that requires new signaling recovery. The IC field is incremented by one for each subsequent message group.
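A sketch of how these fields might be unpacked and mapped to media time. The MSB-first bit ordering and the helper names are assumptions made for illustration; A/336 is normative for the actual layout:

```python
def parse_vp1_payload(p: int) -> dict:
    """p is the 50-bit VP1 payload as an integer, assumed MSB-first."""
    dt = (p >> 49) & 0x1  # Domain Type: 0 = "small domain", 1 = "large domain"
    qf = p & 0x1          # Query Flag: always one bit
    if dt == 0:
        # Small domain: 31-bit Server Code, 17-bit Interval Code.
        sc = (p >> 18) & ((1 << 31) - 1)
        ic = (p >> 1) & ((1 << 17) - 1)
    else:
        # Large domain: 23-bit Server Code, 25-bit Interval Code.
        sc = (p >> 26) & ((1 << 23) - 1)
        ic = (p >> 1) & ((1 << 25) - 1)
    return {"dt": dt, "sc": sc, "ic": ic, "qf": qf}

def media_time_seconds(interval_code: int, time_offset: int) -> float:
    # Each message group spans 1.5 s and the IC increments once per group;
    # time_offset ticks every 1/30 s within the group (values 0..44).
    return interval_code * 1.5 + time_offset / 30.0
```

This (ic, time_offset) mapping is what the Tracker's currentMediaTime computation relies on.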
  • The CRC field is used to confirm correctness of the extracted data, as is well known to those skilled in the art. It is assumed that there is a detector engine that will receive decoded video frames and extract 8-bit time_offset field, 50-bit VP1 payload and 32-bit CRC field in each video frame based on A/335 and A/336. The details of detector engine design are not part of this disclosure.
  • CRC Matching
  • The CRC matching logic compares the CRC field extracted from the current frame with the CRC field extracted from the previous frame, and sets the CRC repetition flag to TRUE if they match and otherwise sets it to FALSE. This process is done regardless of whether the extracted CRC field matches the CRC calculated from the extracted data. Even if the extracted CRC field has bit errors and the actual data cannot be retrieved, we still want to know whether consecutive CRC fields are repeated. This information can later be used to discriminate between actual payload repetition, such as time_offset repetition in high frame-rate video or fragment repetition, and frame repetition in pause-and-seek playback rate change, skip, and pause, as described below.
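  • The CRC matching logic above can be sketched as follows; crc_repetition_flags is an illustrative name, and the sketch assumes the extracted CRC fields arrive as a per-frame sequence:

```python
# Sketch of the CRC-repetition check described above: the flag depends
# only on whether consecutive extracted CRC fields are identical, not on
# whether the CRC itself validates the payload.

def crc_repetition_flags(crc_fields):
    """Return True for each frame whose extracted CRC equals the previous one."""
    prev = None
    flags = []
    for crc in crc_fields:
        flags.append(prev is not None and crc == prev)
        prev = crc
    return flags
```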
  • Content Timeline Tracker
  • The Content Timeline Tracker (“Tracker”) monitors the output of the detector engine and analyzes frame_counter, interval_code, time_offset, and CRC repetition flag values to produce estSpeed, a piecewise linear approximation of the content timeline which can track playback rate changes initiated by a user on an upstream device (e.g., STB).
  • Overview
  • Some applications require playback of auxiliary content which is synchronized to the watermark timeline recovered from the main content. For normal viewing the recovered timeline is real-time, meaning that an elapsed interval of media time occurs in an equal duration interval of real time. Other times, such as when the user is controlling playback of main content using ‘trick play’, the recovered timeline has a non-linear mapping to real time.
  • To play content, Media Player APIs typically expose a command to start (or continue) playing from a specific frame at a specific speed. A sufficiently fast player could track, frame-by-frame, the recovered timeline in all modes of play, but most current players cannot respond quickly enough to precisely seek to and render a frame within one frame's duration.
  • A goal of the Tracker is to quickly recognize where playback rate changes are initiated by the user, and provide a piecewise-linear estimate of the playback speed which can then be used in controlling a replacement media player, minimizing the number of seek commands required to track the main content.
  • Tracker inputs and outputs are summarized in the Tracker Details Table below.
  • TABLE
    Tracker Details
    Parameter Name             Input/Output
    Frame Counter (Fc)         Input
    Interval Code (IC)         Input
    time_offset (To)           Input
    CRC repetition flag (crf)  Input
    Frame rate (fps)           Input
    estSpeed                   Output
    fSpeedUpdated              Output
  • TABLE
    Tracker Variables
    Parameter Name                Type     Description
    controlSegmentStartMediaTime  Float    Media time of current Control Segment start. Init value = 0.0
    controlSegmentStartClockTime  Float    Clock time of current Control Segment start
    currentMediaTime              Float    Media time as calculated using IC and To. Unit: seconds
    prevMediaTime                 Float    Value of currentMediaTime the last time track( ) was called. Unit: seconds
    deltaMediaTime                Float    Media time elapsed since the last call to track( ). Unit: seconds
    currentClockTime              Float    Local receiver clock derived from counting samples. Unit: seconds
    prevClockTime                 Float    Value of currentClockTime the last time track( ) was called. Unit: seconds
    prevOffset                    Int      Value of time_offset from the previous call to track( )
    pauseCounter                  Int      Counts successive frames with the same media time
    estSpeed                      Float    Estimated playback speed. Init value = 0.0
    fSpeedUpdated                 Boolean  True if a new control segment was detected during the last call to track( ). Init value = false
  • TABLE
    Tracker Configuration Constants
    Parameter Name                Type   Description
    pauseCountThreshold           Int    = 11; make this larger than the largest number of frames encountered during pause-seek trick play
    stableStateCount              Int    = 5; stability threshold for counting pause or 1x events
    speedLimit                    Float  = 32; clip speed estimates to +/− speedLimit
    speedRatioReportingThreshold  Float  = 1.005; estSpeed ratio change reporting threshold
    ToQuantization                Float  = 1/30 sec; quantization of To during embedding
  • Control Segments
  • A Control Segment represents a period of time between two upstream user transport control commands which modify playback speed. The media timeline detected with the watermark might be a smooth rendition of the user's command (e.g., 2× resulting in regular frame decimation), or it might be a pause-seek stepwise approximation to the user's command (e.g., 32× in FIG. 1 ).
  • The Control Segment is initialized with the currentMediaTime and currentClockTime.
  • func controlSegmentInit( ) {
     csStartMediaTime = currentMediaTime
     csStartClockTime = currentClockTime
    }

    An initial speed estimate uses the most recent deltaMediaTime:
  • func getInitJumpingSpeed( ) -> Float {
     let estSpeed = deltaMediaTime / frameDurationSec
     return min(max(estSpeed, -speedLimit), speedLimit)
    }
  • Occasionally estSpeed is updated in the middle of a Control Segment as the slope of the expanding control segment line becomes a better estimator of media speed. getCurrentCSSpeed( ) calculates the current slope and clips the value to speedLimit.
  • func getCurrentCSSpeed( ) -> Float {
     let speed = (currentMediaTime - csStartMediaTime) /
    (currentClockTime - csStartClockTime)
     let clippedSpeed = min(max(speed, -speedLimit), speedLimit)
     return clippedSpeed
    }
  • Tracker States
  • The Tracker implements a state machine to help recognize patterns in the recovered timeline and estimate the control segment boundaries. The states are shown in the tracker States Table below.
  • State Name  Description
    Init        Initial tracker state
    Paused      Paused
    PauseSeek   Pause-Seek
    OneXOnset   First 1x frame spacing detected. This state provides a one-frame delay before deciding on the next state
    OneXPlay    Playback at speed less than or equal to 2.0, which might include some individual repeated or skipped frames
    JumpOnset   First jumping spacing detected (not paused and not 1x). This state provides a one-frame delay before deciding on the next state
    Jumping     In the middle of a sequence of non-pause, non-1x-play spaced frames
  • Tracker Events/Tracker Main
  • track( ) is called with parameters frame_counter, interval_code (ic), time_offset, and the CRC repetition flag. It generates events which drive the Tracker state machine. The events are:
  • Event Name Description
    pauseDetected ( )
    play1xDetected ( )
    discontinuityDetected ( )
  • track( ) is called once for every detected frame.
  • Two successive calls to track( ) with the same IC and time_offset might mean that content is paused, but this can also happen for frame rates higher than 30 fps because time_offset is quantized to 1/30 sec (ToQuantization). These duplicate frames caused by To quantization should be discarded by the tracker, and this is done by looking at deltaClockTime to determine whether two successive calls are spaced less than 1/30 sec apart. Note that deltaMediaTime might be non-zero even if two successive calls are spaced closer than 1/30 sec because of upstream trick play, and these samples should not be discarded.
  • Two successive calls to track( ) might be spaced further than 1/fps seconds apart if intervening frames did not have time_offset available. The number of skipped frames is calculated in skippedFrames and used to test for 1× play speed.
  • The CRC repetition flag crf is used to indicate a paused state when the time_offset is not available; in this case the previous value of the time_offset is used.
  • When the fps is different from 1/ToQuantization, there will be an error in the calculation of delta media time. This kind of jitter is tolerated using a threshold in the calculation:
      • frameJitterThresholdSec=0.99/fps
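  • A condensed sketch of this pause/1×/discontinuity classification follows; the helper name classify and its argument conventions are illustrative, while the constants follow the values described above:

```python
# Sketch of the per-sample event classification: a sample with zero
# media advance is a pause; a media advance matching the number of
# elapsed clock frames (within a jitter threshold) is 1x play;
# anything else is a discontinuity.

def classify(delta_media: float, delta_clock: float, fps: float) -> str:
    """Return 'pause', '1x', or 'discontinuity' for one watermark sample."""
    frame_duration = 1.0 / fps
    jitter = 0.99 / fps                  # frameJitterThresholdSec
    if delta_media == 0:
        return "pause"
    skipped = round(delta_clock * fps)   # frames since last usable sample
    expected = skipped * frame_duration  # media advance expected at 1x
    if abs(delta_media - expected) < jitter:
        return "1x"
    return "discontinuity"
```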
  • Pseudo-code for the track( ) function of the Tracker:
  • func track(frame_counter: Int, ic: Int, time_offset: Int,
    crf: Boolean, fps: Float) {
     trackingTimelineTimetick(frame_counter)
     if ((time_offset != -1) && (ic != -1)) || crf {
      if crf {
       currentOffset = prevOffset
       if (ic != -1) {
        prevIC = ic
       } else {
        ic = prevIC
       }
      } else {   // (time_offset != -1) && (ic != -1)
       currentOffset = time_offset
       prevOffset = time_offset
       prevIC = ic
      }
      let clockTime = frame_counter / fps
      let mediaFrameOffsetTime = (ic * 1.5) +
    (currentOffset * ToQuantization)
      deltaClockTime = clockTime - prevClockTime
      deltaMediaTime = mediaFrameOffsetTime - prevMediaTime
      if !((deltaClockTime < ToQuantization) &&
    (deltaMediaTime == 0)) {
       skippedFrames = Int((deltaClockTime * fps).rounded( ))
       currentClockTime = clockTime
       prevClockTime = currentClockTime
       currentMediaTime = mediaFrameOffsetTime
       prevMediaTime = currentMediaTime
       if deltaMediaTime == 0 {
        pauseDetected( )
       } else if deltaMediaTime < skippedFrames *
    frameDurationSec + frameJitterThresholdSec &&
        deltaMediaTime > skippedFrames *
    frameDurationSec - frameJitterThresholdSec {
        play1xDetected( )
       } else {
        discontinuityDetected( )
       }
       trackingTimelineUpdate( )
      }
     }
    }
  • Pause Detected Event Handler
  • This event is triggered when successive frames show no advance in media time. This could be because the content is paused, or it might be part of content playback at a speed not equal to 1×, such as part of a ‘Pause-Seek’ operation for speeds >2.0, or part of frame interpolation for speeds <1.0.
  • A goal is to recognize as quickly as possible that pause is occurring to ensure that a tracking media player is responsive to user commands.
  • The main decision to be made in the event handlers is whether to start a new control segment or update the current one. For example, new control segments should not be started in the middle of a sequence of pause-seeks, but the existing speed estimate should be updated.
  • func pauseDetected( ) {
     if state == .JumpOnset {
      if prevState == .PauseSeek { // in trick play; don't reanchor
       // control segment, just update speed
       setEstSpeed(getCurrentCSSpeed( ))
      } else { // This is a new jump, so reanchor control segment
       setEstSpeed(getInitJumpingSpeed( ))
       controlSegmentInit( )
      }
      pauseCounter = 0
      state = .PauseSeek
     } else if (state == .PauseSeek) || (state == .Jumping) {
      // This might be a real pause, or it might be part of a
      // pause-seek: only update if real pause
      if pauseCounter > pauseCountThreshold {
       setEstSpeed(0)
       controlSegmentInit( )
       state = .Paused
       pauseCounter = 0
      }
     } else if state == .OneXPlay || state == .OneXOnset {
      // Start a new control segment only if coming from continuous
      // (non-jumping) 1x playback. Ignore transient duplicated
      // frames during jumping.
      if pauseCounter > stableStateCount || onexCounter >
    stableStateCount {
       setEstSpeed(1) // Do not zero speed entering pause-seek.
       // This is not a real pause.
       controlSegmentInit( )
       state = .PauseSeek
       pauseCounter = 0
      } else { // This could be oscillation between pause and 1x
       // seen at speeds < 1
       setEstSpeed(getCurrentCSSpeed( ))
      }
     } else { // state == .Paused or state == .Init
      state = .Paused
      if estSpeed != 0 {
       setEstSpeed(0.0)
      }
     }
     discCounter = 0
     pauseCounter = pauseCounter + 1
    }
  • play1xDetected Event Handler
  • play1xDetected might be part of normal 1× play, or it might be part of a sequence of frames where playback speed is <2×. A goal is to recognize as quickly as possible that normal 1× play is occurring to ensure that a tracking media player is responsive to user commands.
  • func play1xDetected( ) {
     if state == .JumpOnset {
      state = .OneXOnset
     } else if state == .Jumping {
      controlSegmentInit( )
      state = .OneXOnset
     } else if state == .OneXOnset {
      setEstSpeed(1.0)
      controlSegmentInit( )
      state = .OneXPlay
     } else if state == .Paused || state == .Init {
      setEstSpeed(1.0)
      controlSegmentInit( )
      state = .OneXPlay
     } else if state == .PauseSeek { // might be <2x, so don't
      // reanchor segment
      state = .OneXOnset
     } else if state == .OneXPlay {
      if onexCounter > stableStateCount { // Fail safe
       // (in presence of CRC errors)
       setEstSpeed(1)
       controlSegmentInit( ) // Establish 1x control segment
      }
     }
     discCounter = 0
     pauseCounter = 0
     onexCounter = onexCounter + 1
    }
  • Discontinuity Detected Event Handler
  • A discontinuity is any jump in the recovered timeline that is not a pause and not frames spaced 1/fps apart. These might be part of a pause-seek (a ‘big’ jump below), or result from playback speeds between 1.0 and 2.0 (1.0 < estSpeed < 2.0).
  • func discontinuityDetected( ) {
     if state == .Paused || state == .PauseSeek || state == .Init {
      // first jump after a pause; wait for next frame to establish
      // slope for estSpeed (esp. useful for 1x play after skip)
      state = .JumpOnset
     } else if state == .JumpOnset {
      state = .Jumping
      setEstSpeed(getInitJumpingSpeed( ))
      controlSegmentInit( )
     } else if state == .OneXPlay || state == .OneXOnset {
      if abs(deltaMediaTime) > 2.2 * Float(skippedFrames) *
    frameDurationSec || discCounter > 2 { // if this is a big skip
       controlSegmentInit( )
       state = .JumpOnset // wait a frame before updating to get a
       // better speed estimate
      } else { // a small jump could be part of speed < 2,
       // so stay in .OneXPlay
       setEstSpeed(getCurrentCSSpeed( ))
      }
     } else if state == .Jumping {
      if abs(deltaMediaTime) > 2.2 * Float(skippedFrames) *
    frameDurationSec { // if this is a big skip
       state = .JumpOnset
      } // else, don't controlSegmentInit via .JumpOnset
      setEstSpeed(getCurrentCSSpeed( ))
     }
     discCounter = discCounter + 1
     pauseCounter = 0
    }
  • Tracking Timeline
  • estSpeed represents the slope of an idealized control segment. In reality, it is a noisy signal that is influenced by the imperfect nature of trick play media transports. A trackingTimeline is created with logic to try to remove this noise and produce sparsely spaced fSpeedUpdated events that delineate constant slope (constant speed) control segments.
  • The timeline is parametrized by a tt.speed and tt.mediaTime, and can be quantized in time to correspond to the underlying video frame rate. For each processed video frame, trackingTimelineTimetick( ) is called to update the timeline by extrapolating the mediaTime using tt.speed. The timeline can also be resynchronized to the video watermark timeline in trackingTimelineUpdate( ) which is also called every processed video frame. trackingTimelineUpdate( ) selectively calls trackingTimelineSetTimeAndSpeed (time, speed) which updates the tracking timeline and sets the fSpeedUpdated Boolean.
  • trackingTimelineUpdate( ) does not always update tt.speed and tt.mediaTime; it uses thresholding logic and other heuristics to avoid overly frequent updates to fSpeedUpdated. This can be important if, for example, fSpeedUpdated is used to trigger seeking in a media player that is playing alternate content synchronized to the incoming watermarked content.
  • trackingTimelineUpdate( ) analyzes the differences between tt.speed and the estSpeed which is estimated from the recovered watermarks. If there is any transition between pause and play (i.e., if (estSpeed == 0.0 || estSpeed == 1.0 || tt.speed == 0.0 || tt.speed == 1.0) && (tt.speed != estSpeed)), the tracking timeline is immediately updated.
  • If tt.speed and estSpeed have opposite signs, the tracking timeline is also immediately updated so that overshoot is reduced in tracking devices. If the signs are the same then the tracking timeline is only updated if the ratio of tt.speed and estSpeed is outside of a thresholded window. This avoids constant fSpeedUpdated triggers that might be due to small estimation errors in estSpeed and other system noise.
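  • The same-sign case can be sketched as a ratio test; should_update is an illustrative name, and the 0.5 window is an assumed illustration value:

```python
# Sketch: suppress a tracking-timeline update when tt.speed and
# estSpeed share a sign and differ only slightly; update only when the
# smaller-to-larger magnitude ratio falls outside the window.

def should_update(tt_speed: float, est_speed: float,
                  ratio_window: float = 0.5) -> bool:
    """Decide whether a same-sign speed change warrants a timeline update."""
    if tt_speed == est_speed:
        return False
    lo, hi = sorted([abs(tt_speed), abs(est_speed)])
    if hi == 0:
        return False
    return (lo / hi) < ratio_window  # update only on a large relative change
```

This keeps small estimation jitter (e.g., 2.0× vs. 2.1×) from producing a stream of fSpeedUpdated events, while a genuine change (e.g., 1× to 8×) still triggers one.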
  • If none of the speed analysis conditions are true, trackingTimelineUpdate( ) analyzes the differences between tt.mediaTime and the currentMediaTime. If this difference is above a threshold, then the tracking timeline is updated. The threshold is adjusted based on the estSpeed, so that there is a greater tolerance to time errors when operating at fast trick play speeds. In most cases the tracking timeline is updated using the currentMediaTime and estSpeed; however, if such an update would reverse the sign of the speed when the time difference is relatively small and the difference is diverging, this is recognized as normal tracking of a pause-seek trick play source, so the tracking timeline is updated to pause at currentMediaTime to wait for the next seek in the pause seek sequence.
  • TABLE
    Tracking Timeline Variables
    Parameter Name            Type   Description
    tt.mediaTime              Float  Current time along the tracking timeline
    tt.speed                  Float  Current speed along the tracking timeline
    tt.currentMediaTimeError  Float  Difference between tt.mediaTime and currentMediaTime
    tt.prevFrameNumber        Int    currentFrameNumber the last time trackingTimelineTimetick( ) was called
    tt.prevMediaTimeError     Float  Difference between tt.mediaTime and currentMediaTime as measured in the previous frame
  • TABLE
    TrackingTimeline Constants
    Parameter Name         Type   Description
    tt.timeErrorThreshold  Float  = 0.5 seconds; mediaTime error before issuing a correction in jump mode. A smaller number results in less error but more speed updates
  • Tracking Timeline Functions
  •  func trackingTimelineInit( ) {
      tt.speed = 0.0
      tt.mediaTime = 0.0
      tt.currentMediaTimeError = 0.0
      tt.prevMediaTimeError = 0.0
      tt.prevFrameNumber = 0
     }
     func trackingTimelineTimetick(currentFrameNumber) {
      let skippedFrames = currentFrameNumber - tt.prevFrameNumber
      tt.prevFrameNumber = currentFrameNumber
      tt.mediaTime = tt.mediaTime + tt.speed * skippedFrames *
    frameDurationSec
      tt.prevMediaTimeError = tt.currentMediaTimeError
      tt.currentMediaTimeError = tt.mediaTime - currentMediaTime
      fSpeedUpdated = 0
     }
     func trackingTimelineSetTimeAndSpeed(time, speed) {
      tt.speed = speed
      tt.mediaTime = time
      fSpeedUpdated = 1
     }
     func trackingTimelineUpdate( ) {
      fCorrected = false
      // First, check speed difference
      if (estSpeed == 0.0 || estSpeed == 1.0 || tt.speed == 0.0 ||
    tt.speed == 1.0) && (tt.speed != estSpeed) {
       trackingTimelineSetTimeAndSpeed(currentMediaTime, estSpeed)
       fCorrected = true
      } else if sign(tt.speed) == sign(estSpeed) {
       if tt.speed != estSpeed {
        if abs(estSpeed) > abs(tt.speed) {
         ratio = tt.speed / estSpeed
        } else {
         ratio = estSpeed / tt.speed
        }
        if ratio < 0.5 {
         trackingTimelineSetTimeAndSpeed(currentMediaTime, estSpeed)
         fCorrected = true
        }
       }
      } else if tt.speed * estSpeed != 0 {
       // update if speeds have opposite signs and are non-zero
       trackingTimelineSetTimeAndSpeed(currentMediaTime, estSpeed)
       fCorrected = true
      }
      // Second, check time difference
      if fCorrected == false {
       thresh = tt.timeErrorThreshold
       if abs(estSpeed) > 2 {
        thresh = tt.timeErrorThreshold * abs(estSpeed)
       }
       if abs(tt.currentMediaTimeError) >= abs(deltaMediaTime) &&
    abs(tt.currentMediaTimeError) >= thresh {
        // do not make small direction reversals when correcting
        if sign(tt.currentMediaTimeError) != sign(estSpeed) ||
    sign(tt.speed) != sign(estSpeed) {
         // in this case a correction will not reverse direction
         trackingTimelineSetTimeAndSpeed(currentMediaTime, estSpeed)
        } else if abs(tt.currentMediaTimeError) >
    abs(tt.prevMediaTimeError) &&
    (abs(tt.currentMediaTimeError) / abs(estSpeed)) < 5 {
         // in this case a correction would reverse direction, so if
         // we are diverging, pause the media player at its current
         // position to wait for the next seek
         trackingTimelineSetTimeAndSpeed(currentMediaTime, 0)
        } else {
         trackingTimelineSetTimeAndSpeed(currentMediaTime, estSpeed)
        }
       }
      }
     }
  • Trickplay Timeline Examples
  • Examples of non-linear timelines resulting from the user's operation of STB remote control trickplay functions are shown below. These are selected from a set of test vectors that can be used to validate implementations of this algorithm.
  • In these examples, the user input is a sparse sequence of button pushes to change playback speed or skip through content. The STB's main media player responds by seeking in the content and using custom frame decimation and interpolation to play the content at the commanded speed. A typical algorithm is ‘Pause-Seek’, where a frame is repeated (‘Pause’) while the player seeks to an appropriate frame to play next.
  • 1×->2×->8× Playback
  • FIG. 1 shows the results of a sequence of user commands to a ChannelMaster DVR: starting 1× playback, then hitting the ‘>>’ button at frame 40 results in 2× playback until frame 90, when the second ‘>>’ command results in a brief pause, a slight regression in time, then a succession of pause-seek intervals. The pause-seek interval timing is regularly spaced, with slight variations (e.g., pause for 6 frames, then a jump of 36 frames). Even though the display overlay says ‘8×’, the actual average is approx. 5.4×.
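  • The average speed of a regular pause-seek pattern is simply the media time advanced per cycle divided by the real time spent; a sketch (function name and frame counts are illustrative, and real devices add per-cycle overhead, so observed averages such as the ~5.4× above can fall below the nominal ratio):

```python
# Sketch: average playback speed of an idealized repeating
# pause-then-seek cycle, as seen on the recovered watermark timeline.

def pause_seek_avg_speed(pause_frames: int, jump_frames: int,
                         fps: float = 30.0) -> float:
    """Average speed of a cycle that holds one frame for `pause_frames`
    frame periods, then jumps ahead by `jump_frames` frames of media time."""
    clock_sec = pause_frames / fps   # real time spent per cycle
    media_sec = jump_frames / fps    # media time advanced per cycle
    return media_sec / clock_sec
```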
  • A closer look at the 2× playback section in FIG. 1 shows that it is not simply discarding every other frame. Occasionally two input frames are skipped resulting in an overall rate of ˜2.08× as shown in FIG. 2 .
  • Similarly, playback rates between 1.0 and 2.0 consist of periods of 1× playback interspersed with jumps of 2 frames. Playback rates <1.0 consist of repeated frames interspersed with 1× frame increments.
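  • Fractional speeds can thus be modeled as a mix of per-frame media deltas whose mean equals the playback speed; a minimal sketch (names illustrative):

```python
# Sketch: speeds between 1.0 and 2.0 appear on the watermark timeline
# as a mix of 1-frame and 2-frame media advances; speeds below 1.0 as a
# mix of repeats (delta 0) and 1-frame advances.  The mean delta per
# rendered frame equals the effective playback speed.

def avg_speed(frame_deltas):
    """Mean media advance per rendered frame, in frames."""
    return sum(frame_deltas) / len(frame_deltas)
```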
  • ChannelMaster 32× Playback
  • FIG. 3 shows the results of starting 1× playback, then hitting the ‘>>’ button three times in succession, resulting in ‘1×’, ‘2×’, ‘8×’, ‘32×’ playback. Notice that at 32× the pause-seek steps are no longer uniform.
  • ChannelMaster Skip Ahead/Skip Back
  • FIG. 4 shows the result of a series of ‘skip-forward’ and ‘skip-back’ commands, resulting in short pauses prior to the skip, then an immediate return to 1× playback.
  • It is understood that the various embodiments of the present invention may be implemented individually, or collectively, in devices comprised of various hardware and/or software modules and components. These devices, for example, may comprise a processor, a memory unit, and an interface that are communicatively connected to each other, and may range from desktop and/or laptop computers, to consumer electronic devices such as media players, mobile devices, and the like. For example, FIG. 4 illustrates a block diagram of a device 1000 within which the various disclosed embodiments may be implemented. The device 1000 comprises at least one processor 1002 and/or controller, at least one memory 1004 unit that is in communication with the processor 1002, and at least one communication unit 1006 that enables the exchange of data and information, directly or indirectly, through the communication link 1008 with other entities, devices and networks. The communication unit 1006 may provide wired and/or wireless communication capabilities in accordance with one or more communication protocols, and therefore it may comprise the proper transmitter/receiver antennas, circuitry and ports, as well as the encoding/decoding capabilities that may be necessary for proper transmission and/or reception of data and other information.
  • Referring back to FIG. 4, the device 1000 and the like may be implemented in software, hardware, firmware, or combinations thereof. Similarly, the various components or sub-components within each module may be implemented in software, hardware, or firmware. The connectivity between the modules and/or components within the modules may be provided using any one of the connectivity methods and media that are known in the art, including, but not limited to, communications over the Internet, wired, or wireless networks using the appropriate protocols.
  • Various embodiments described herein are described in the general context of methods or processes, which may be implemented in one embodiment by a computer program product, embodied in a computer-readable medium, including computer-executable instructions, such as program code, executed by computers in networked environments. A computer-readable medium may include removable and non-removable storage devices including, but not limited to, Read Only Memory (ROM), Random Access Memory (RAM), compact discs (CDs), digital versatile discs (DVD), etc. Therefore, the computer-readable media that is described in the present application comprises non-transitory storage media. Generally, program modules may include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Computer-executable instructions, associated data structures, and program modules represent examples of program code for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps or processes.
  • The foregoing description of embodiments has been presented for purposes of illustration and description. The foregoing description is not intended to be exhaustive or to limit embodiments of the present invention to the precise form disclosed, and modifications and variations are possible in light of the above teachings or may be acquired from practice of various embodiments. The embodiments discussed herein were chosen and described in order to explain the principles and the nature of various embodiments and its practical application to enable one skilled in the art to utilize the present invention in various embodiments and with various modifications as are suited to the particular use contemplated. The features of the embodiments described herein may be combined in all possible combinations of methods, apparatus, modules, systems, and computer program products.

Claims (20)

1. A method comprising:
receiving video content having a video watermark embedded therein;
decoding video frames from the received video content;
using a detector engine to receive the decoded video frames and extract at least a time-offset, a payload, and an error detection field in a plurality of the decoded video frames, wherein the detector engine generates an output for the plurality of decoded video frames that includes a time-offset and a payload derived from the video watermark; and
using a content timeline tracker to monitor and analyze the output of the detector engine, to produce a piecewise linear approximation of the content timeline, wherein the playback rate changes by a user in an upstream device can be tracked, thereby enabling the playback of auxiliary content which is synchronized to a watermark timeline recovered from the received content when the recovered timeline has a non-linear mapping to real time.
2. The method of claim 1 further comprising using matching logic to compare the watermark data extracted from the current frame with at least some watermark data extracted from the previous frame and to set a data repetition flag to TRUE if they match and otherwise set it to FALSE.
3. The method of claim 2 wherein the matching logic determines if consecutive watermark data are repeated, wherein this information can be later used to discriminate between actual payload repetition, such as time offset repetition in high frame-rate video or fragment repetition, or frame repetition in pause-and-seek playback rate change, skip and pause.
4. The method of claim 1 wherein the content timeline tracker analyzes at least the time_offset and data repetition flag values to produce estSpeed, a piecewise linear approximation of the content timeline which can track playback rate changes initiated by a user on an upstream device.
5. The method of claim 1 wherein the Content Timeline Tracker implements a state machine to recognize patterns in the recovered timeline and estimate the control segment boundaries.
6. The method of claim 1 wherein the piecewise linear approximation of the content timeline includes an estimate of the playback speed and further comprising using the estimate of the playback speed in controlling a replacement media player, whereby the number of seek commands required to track main content is minimized.
7. The method of claim 1 wherein the content timeline tracker recognizes one or more patterns selected from the group consisting of: initial tracker state; pause; pause seek; oneXOnset; oneXplay; JumpOnset; and Jumping.
8. A device comprising:
a detector engine receiving decoded video having a video watermark embedded therein, the detector engine extracting at least a time-offset, a payload, and an error detection field in a plurality of the decoded video frames, wherein the detector engine generates an output for the plurality of decoded video frames that includes a time-offset and a payload derived from the video watermark; and
a content timeline tracker that monitors and analyzes the output of the detector engine and produces a piecewise linear approximation of the content timeline, wherein the playback rate changes by a user in an upstream device can be tracked, thereby enabling the playback of auxiliary content which is synchronized to a watermark timeline recovered from the received content when the recovered timeline has a non-linear mapping to real time.
9. The device of claim 8 wherein the detector engine uses matching logic to compare the watermark data extracted from the current frame with at least some watermark data extracted from the previous frame and to set a data repetition flag to TRUE if they match and otherwise set it to FALSE.
10. The device of claim 9 wherein the matching logic determines if consecutive watermark data are repeated, wherein this information can be later used to discriminate between actual payload repetition, such as time_offset repetition in high frame-rate video or fragment repetition, or frame repetition in pause-and-seek playback rate change, skip and pause.
11. The device of claim 8 wherein the content timeline tracker analyzes at least the time_offset and data repetition flag values to produce estSpeed, a piecewise linear approximation of the content timeline which can track playback rate changes initiated by a user on an upstream device.
12. The device of claim 8 wherein the content timeline tracker implements a state machine to recognize patterns in the recovered timeline and estimate the control segment boundaries.
13. The device of claim 8 wherein the piecewise linear approximation of the content timeline includes an estimate of the playback speed and wherein the content timeline tracker uses the estimate of the playback speed to control a replacement media player, whereby the number of seek commands required to track main content is minimized.
14. The device of claim 8 wherein the content timeline tracker recognizes one or more patterns selected from the group consisting of: initial tracker state; pause; pause seek; oneXOnset; oneXplay; JumpOnset; and Jumping.
15. A computer program product embodied on one or more non-transitory computer readable media, comprising:
program code for receiving video content having a video watermark embedded therein;
program code for decoding video frames from the received video content;
program code for using a Detector Engine to receive the decoded video frames and extract at least a time-offset, a payload, and an error detection field in a plurality of the decoded video frames, wherein the Detector Engine generates an output for the plurality of decoded video frames that includes a time-offset and a payload derived from the video watermark; and
program code for using a Content Timeline Tracker to monitor and analyze the output of the Detector Engine, to produce a piecewise linear approximation of the content timeline, wherein playback rate changes made by a user on an upstream device can be tracked, thereby enabling the playback of auxiliary content which is synchronized to a watermark timeline recovered from the received content when the recovered timeline has a non-linear mapping to real time.
16. The computer program product of claim 15 further comprising using matching logic to compare the watermark data extracted from the current frame with at least some watermark data extracted from the previous frame and to set a data repetition flag to TRUE if they match and otherwise set it to FALSE.
17. The computer program product of claim 16 wherein the matching logic determines whether consecutive watermark data are repeated, wherein this information can later be used to discriminate between actual payload repetition (such as time_offset repetition in high frame-rate video, or fragment repetition) and frame repetition arising from playback rate changes such as pause-and-seek, skip, and pause.
18. The computer program product of claim 15 wherein the Content Timeline Tracker analyzes at least the time_offset and data repetition flag values to produce estSpeed, a piecewise linear approximation of the content timeline which can track playback rate changes initiated by a user on an upstream device.
19. The computer program product of claim 15 wherein the Content Timeline Tracker implements a state machine to recognize patterns in the recovered timeline and estimate the control segment boundaries.
20. The computer program product of claim 15 wherein the piecewise linear approximation of the content timeline includes an estimate of the playback speed and further comprising using the estimate of the playback speed in controlling a replacement media player, whereby the number of seek commands required to track main content is minimized.
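The claims above describe a Content Timeline Tracker that turns per-frame watermark detections (a time_offset plus a data repetition flag) into a piecewise linear estimate of content time versus real time. As a rough illustration only, and not the patented implementation, the core bookkeeping might look like the sketch below; all names (`Detection`, `TimelineTracker`, the pause and speed thresholds) are invented for this example.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Detection:
    real_time: float              # wall-clock time the frame was decoded (s)
    time_offset: Optional[float]  # watermark time_offset, None if undetected

class TimelineTracker:
    """Piecewise linear approximation of content time vs. real time."""

    def __init__(self, pause_threshold: int = 3, speed_tolerance: float = 0.05):
        self.prev: Optional[Detection] = None
        self.anchor_real = 0.0     # real time at start of current linear segment
        self.anchor_content = 0.0  # content time at start of current segment
        self.est_speed = 1.0       # slope of current segment (playback rate)
        self.rep_count = 0
        self.pause_threshold = pause_threshold
        self.speed_tolerance = speed_tolerance

    def update(self, det: Detection) -> Tuple[float, bool]:
        """Consume one detection; return (estimated content time, repetition flag)."""
        if self.prev is None and det.time_offset is not None:
            # First detection: anchor the timeline here.
            self.anchor_real = det.real_time
            self.anchor_content = det.time_offset
        # Matching logic: repetition flag is TRUE when the current frame's
        # watermark data matches the previous frame's.
        repetition = (
            self.prev is not None
            and det.time_offset is not None
            and det.time_offset == self.prev.time_offset
        )
        if repetition:
            self.rep_count += 1
            if self.rep_count >= self.pause_threshold and self.est_speed != 0.0:
                # Repeated frames long enough to call it a pause: speed-0 segment.
                self.anchor_real = det.real_time
                self.anchor_content = det.time_offset
                self.est_speed = 0.0
        elif (det.time_offset is not None and self.prev is not None
              and self.prev.time_offset is not None):
            self.rep_count = 0
            dt_real = det.real_time - self.prev.real_time
            if dt_real > 0:
                new_speed = (det.time_offset - self.prev.time_offset) / dt_real
                if abs(new_speed - self.est_speed) > self.speed_tolerance:
                    # Rate change (e.g. user switched to 2x): start a new segment.
                    self.anchor_real = det.real_time
                    self.anchor_content = det.time_offset
                    self.est_speed = new_speed
        if det.time_offset is not None:
            self.prev = det
        est = self.anchor_content + self.est_speed * (det.real_time - self.anchor_real)
        return est, repetition
```

A full tracker would layer the claimed state machine (initial tracker state, pause, pause seek, oneXOnset, oneXplay, JumpOnset, Jumping) on top of these signals to estimate control segment boundaries; in this simplified sketch a run of repeated time_offsets merely collapses into a speed-0 segment, and a slope change starts a new linear segment.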
US18/344,792 2021-02-08 2023-06-29 System and method for tracking content timeline in the presence of playback rate changes Abandoned US20240007712A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/344,792 US20240007712A1 (en) 2021-02-08 2023-06-29 System and method for tracking content timeline in the presence of playback rate changes

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202163147122P 2021-02-08 2021-02-08
US202163225381P 2021-07-23 2021-07-23
US17/667,464 US11722741B2 (en) 2021-02-08 2022-02-08 System and method for tracking content timeline in the presence of playback rate changes
US18/344,792 US20240007712A1 (en) 2021-02-08 2023-06-29 System and method for tracking content timeline in the presence of playback rate changes

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US17/667,464 Continuation US11722741B2 (en) 2021-02-08 2022-02-08 System and method for tracking content timeline in the presence of playback rate changes

Publications (1)

Publication Number Publication Date
US20240007712A1 true US20240007712A1 (en) 2024-01-04

Family

ID=83365225

Family Applications (2)

Application Number Title Priority Date Filing Date
US17/667,464 Active 2042-02-08 US11722741B2 (en) 2021-02-08 2022-02-08 System and method for tracking content timeline in the presence of playback rate changes
US18/344,792 Abandoned US20240007712A1 (en) 2021-02-08 2023-06-29 System and method for tracking content timeline in the presence of playback rate changes

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US17/667,464 Active 2042-02-08 US11722741B2 (en) 2021-02-08 2022-02-08 System and method for tracking content timeline in the presence of playback rate changes

Country Status (1)

Country Link
US (2) US11722741B2 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024226863A1 (en) * 2023-04-26 2024-10-31 Verance Corporation Video watermark message scheduler

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120315011A1 (en) * 2010-02-22 2012-12-13 Dolby Laboratories Licensing Corporation Video Delivery and Control by Overwriting Video Data
US20160165297A1 (en) * 2013-07-17 2016-06-09 Telefonaktiebolaget L M Ericsson (Publ) Seamless playback of media content using digital watermarking
US10236031B1 (en) * 2016-04-05 2019-03-19 Digimarc Corporation Timeline reconstruction using dynamic path estimation from detections in audio-video signals


Also Published As

Publication number Publication date
US20220312081A1 (en) 2022-09-29
US11722741B2 (en) 2023-08-08

Similar Documents

Publication Publication Date Title
US11423942B2 (en) Reference and non-reference video quality evaluation
US9392344B2 (en) Audio/video identification watermarking
CN109168083B (en) Streaming media real-time playing method and device
RU2481649C2 (en) Method and device for detecting and using the sampling frequency for decoding watermark information embedded in a received signal that was sampled at an original sampling frequency on the encoder side
US20150130952A1 (en) System and method for estimating quality of video with frame freezing artifacts
US20080084506A1 (en) Real time scene change detection in video sequences
JP2007511948A (en) Trick play signal playback
US20240007712A1 (en) System and method for tracking content timeline in the presence of playback rate changes
US8780209B2 (en) Systems and methods for comparing media signals
EP3073754A1 (en) Systems and methods for video play control
US11483535B2 (en) Synchronizing secondary audiovisual content based on frame transitions in streaming content
US20080240576A1 (en) Method of and apparatus for detecting error in image data stream
US8422859B2 (en) Audio-based chapter detection in multimedia stream
CN111818338B (en) Abnormal display detection method, device, equipment and medium
CN101449587A (en) Scene cut detection for video
US8754947B2 (en) Systems and methods for comparing media signals
US20090002567A1 (en) Image analysis apparatus and image analysis method
US20140064505A1 (en) Echo modulation methods and system
CN115550695B (en) Multimedia playing method, device and computer equipment
JP2003508941A (en) Method and apparatus for encoding a sequence of frames containing video-type or film-type images
CN116095362B (en) Video encoding, decoding, transmission method, electronic device, and storage medium
JP3060742B2 (en) Encoded signal decoding device
CN109510978B (en) Data processing performance detection method and device
CA2692872C (en) Systems and methods for comparing media signals
Terry et al. Detection and correction of lip-sync errors using audio and video fingerprints

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION