The present invention relates to a method and associated system for measuring and analyzing a video stream broadcasted for a user.
Monitoring data and associated functions is typically an inaccurate process with little flexibility. Systems performing data monitoring processes typically transmit the data without enabling any feedback associated with the data. The lack of feedback limits any flexibility to account for changes within the systems.
The present invention provides a method comprising:
The present invention advantageously provides a simple method and associated system capable of monitoring data and associated functions.
System 5 of
Computing system 10 comprises a memory system 14. Memory system 14 may comprise a single memory system. Alternatively, memory system 14 may comprise a plurality of memory systems. Memory system 14 comprises a software application 18 and a database 12. Database 12 comprises all analysis data associated with user interactions related to a video stream. Software application 18 enables a process for measuring and analyzing a video stream broadcasted for a user as follows.
Computing system 10 (i.e., via software application 18) inserts timestamp meta-data into a live encoded on-demand video stream. The video stream is transmitted to a client (e.g., any of devices 8a . . . 8n). When the client receives each timestamp in the meta-data, the client invokes a measurement reporting process. A measurement interval associated with insertion points (i.e., in the video stream) for the meta-data timestamps will determine how fine-grained a final analysis may be. Software application 18 may report partial video segments if a user interrupts the video between measurement intervals. A measurement reporting method is invoked with additional parameters to differentiate among live, on-demand, and reviewed video. User interaction information such as, inter alia, window focus, mouse movement, mouse clicks within the player, etc. may also be recorded via system 5 for analytic purposes. The user interaction information may be used to validate or invalidate video stream measurement statistics. The user interaction information may be used to determine if a user is actively watching a video stream or if the user has the video stream playing on their system without paying attention to it. Examples of user interaction information retrieved by software application 18 are described as follows:
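The timestamp-insertion and client-side reporting flow described above may be sketched as follows. All function and field names are illustrative assumptions, not elements of system 5; the ten-second interval matches the implementation example described infra.

```python
# Hypothetical sketch of embedding timestamp meta-data at a fixed
# measurement interval and invoking a reporting callback on each one.

MEASUREMENT_INTERVAL = 10  # seconds between embedded timestamps (assumed)

def embed_timestamps(duration, interval=MEASUREMENT_INTERVAL):
    """Return the meta-data timestamps to insert into a video stream."""
    return list(range(interval, duration + 1, interval))

def client_playback(timestamps, report):
    """Invoke the measurement-reporting callback at each embedded timestamp."""
    for ts in timestamps:
        report({"video_time": ts, "stream_type": "on-demand"})

reports = []
client_playback(embed_timestamps(35), reports.append)
# A 35-second stream with a 10-second interval yields three full segments;
# a partial final segment could additionally be reported on interruption.
```

A finer interval increases measurement granularity at the cost of more reporting calls, which is the trade-off the measurement interval controls.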
Window Focus Information
Software application 18 may detect if a video player (e.g., a software video media player on one of devices 8a . . . 8n) is in focus through method calls to an operating system. When a video window (i.e., of a software video media player) focus changes, the change is recorded by software application 18. Window focus may indicate if a user is actively viewing streaming video media.
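A minimal sketch of recording window-focus changes follows. Actual focus detection requires operating-system method calls (e.g., via a GUI toolkit), which are simulated here as plain boolean polls; the class and its fields are illustrative assumptions.

```python
# Record a focus event only when the window-focus state actually changes,
# so the log reflects transitions rather than every poll.
import time

class FocusRecorder:
    def __init__(self):
        self.events = []       # (timestamp, in_focus) pairs
        self._in_focus = None

    def poll(self, in_focus, now=None):
        """Record an event only on a focus-state transition."""
        if in_focus != self._in_focus:
            self._in_focus = in_focus
            self.events.append((now if now is not None else time.time(), in_focus))

rec = FocusRecorder()
for t, focused in [(0, True), (5, True), (12, False), (30, True)]:
    rec.poll(focused, now=t)
# Only the three state changes are recorded: focus at 0, blur at 12, focus at 30.
```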
Mouse Movement Information
Software application 18 may detect mouse movement within a video window (i.e., of a software video media player). Additionally, software application 18 may detect mouse movement external to the video window using method calls to an operating system. Mouse movement indicates that a user is actively interacting with the computer and therefore likely watching the streaming video. Retrieving mouse movement information in combination with the window focus information allows software application 18 to generate an accurate depiction of a user's attentiveness to a video.
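Mouse-movement tracking reduces to recording the time of the last movement and deriving an idle duration. The sketch below is an illustrative assumption; real movement events would arrive from operating-system callbacks rather than explicit timestamps.

```python
# Track seconds elapsed since the last mouse movement; movement events are
# simulated here as plain timestamps passed by the caller.

class MouseActivityTracker:
    def __init__(self):
        self.last_move = None

    def on_mouse_move(self, now):
        self.last_move = now

    def idle_seconds(self, now):
        """Seconds since the last recorded movement (None if never moved)."""
        return None if self.last_move is None else now - self.last_move

tracker = MouseActivityTracker()
tracker.on_mouse_move(now=100)
print(tracker.idle_seconds(now=160))  # 60 seconds idle
```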
Volume Information
Software application 18 may detect a current volume of a software video media player and/or system. During a video stream measurement transmission, a state of the volume may be transmitted. Volume or lack of volume indicates if a user has heard any audio associated with the video stream during a specified time period. This measurement may be helpful to potential advertisers.
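The three interaction signals above (window focus, mouse movement, volume) can be combined into the actively-watching determination described earlier. The rule and thresholds below are illustrative assumptions, not the claimed method.

```python
# Hypothetical heuristic for validating a video stream measurement with
# user interaction information: a viewer counts as "active" if the player
# is focused or the mouse moved recently, and the audio is not muted.

def is_actively_watching(window_in_focus, seconds_since_mouse_move, volume_level):
    recently_active = window_in_focus or seconds_since_mouse_move < 60
    return recently_active and volume_level > 0

print(is_actively_watching(True, 300, 0.5))   # focused, audible -> True
print(is_actively_watching(False, 300, 0.5))  # idle and unfocused -> False
print(is_actively_watching(True, 10, 0.0))    # muted -> False
```

A measurement reported while this rule returns False could be flagged as invalid rather than discarded, preserving the raw data for later analysis.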
Software application 18 enables a process for analyzing an embedded time code compared to a relative reported system time in order to determine an increased buffer time or lag time for each end user. If software application 18 discerns that a quality of the video stream is below a threshold, software application 18 may alter the video stream or direct the user to a new video stream that consumes less bandwidth to enhance the video quality.
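The lag calculation described above can be sketched as the difference between the elapsed wall-clock time since playback started and the embedded time code. The 5-second threshold and stream names are illustrative assumptions.

```python
# Compare an embedded time code against relative reported system time to
# estimate buffer/lag time, and switch streams past an assumed threshold.

LAG_THRESHOLD = 5.0  # seconds of lag before switching streams (assumed)

def compute_lag(embedded_time, playback_start, report_time):
    """Positive lag means the client has fallen behind the stream."""
    expected = report_time - playback_start
    return expected - embedded_time

def choose_stream(lag, current="high-bitrate", fallback="low-bitrate"):
    """Direct the user to a stream consuming less bandwidth when lagging."""
    return fallback if lag > LAG_THRESHOLD else current

lag = compute_lag(embedded_time=20.0, playback_start=100.0, report_time=128.0)
print(lag)                 # 8.0 seconds behind
print(choose_stream(lag))  # switches to the lower-bandwidth stream
```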
Devices 8a . . . 8n may report measurements using any method including, inter alia, HTTP over TCP/IP, etc. Devices 8a . . . 8n may transmit an HTTP request to a measurement server (e.g., software application 18), and an associated measurement is recorded by the server and used for analytics. Any subset of the measurements or additional information may be transmitted with the measurement requests. Software application 18 enables generation of measurement reports comprising how many people viewed a live video stream at any given time, metrics associated with portions of an on-demand video section, and a determination of which advertisements were viewed (if advertisements were embedded in a primary video stream). If advertisements were displayed as separate sections, they could be likewise measured as separate sections. Software application 18 analyzes measurements to determine:
The following implementation example illustrates a process for measuring and analyzing a video stream broadcasted for a user. Initially, a video stream is embedded with timestamps every ten seconds. Note that the ten second interval may be modified up or down to increase or decrease a granularity of measurement calls. The video stream embedded with timestamps is transmitted from streaming server computing system (e.g., computing system 10) to a client application (i.e., a web video console of devices 8a . . . 8n). The client application displays the video stream for a user. As the client application displays the video stream it will also receive meta-data which has been inserted into the video stream (i.e., comprising embedded time stamps). When the client application receives a timestamp, it transmits a measurement call. This process begins by collecting a current time. The client application additionally collects data which may enhance measurement of the video stream. In this example the following data is collected:
Additionally, a measurement service will generate a unique ID number for each client session.
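A per-session unique ID and the text block a client might transmit can be sketched as follows. The payload field names and JSON formatting are illustrative assumptions; the specification only requires a formatted text block.

```python
# Generate a unique ID per client session and format measurement data
# into a text block suitable for an HTTP call to a measurement server.
import json
import uuid

def new_session_id():
    """Unique ID for each client session (UUID4 chosen as an assumption)."""
    return str(uuid.uuid4())

def build_measurement(session_id, video_time, interactions):
    """Format measurement data as a text block for transmission."""
    return json.dumps({
        "session_id": session_id,
        "video_time": video_time,
        "interactions": interactions,
    })

sid = new_session_id()
payload = build_measurement(sid, 20, {"focus": True, "volume": 0.7})
print(len(sid) == 36)  # a UUID4 string is 36 characters
```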
The aforementioned retrieved measurement data is formatted into a text block and the text block is transmitted to the measurement server through an HTTP call from the client application. The measurement server accesses the measurement data and generates real-time reports. If a measurement value exceeds a specified threshold, a manual or automatic process may add instructions to the video stream as additional meta-data. In this example the measurement process reviews a variety of data points from the measurement calls. User activity, video performance, DVR activity, segments viewed, concurrent streams, max clients, and max streams are all reviewed. User activity is used to determine if a user is actually watching a video stream. Since each video stream served comprises an incremental cost, the video stream producers want to limit unwatched streams. During processing, the measurement server determines that a client with an id of 1234 has not had any activity in the past 30 minutes. A message is transmitted to an encoding process which embeds the id of 1234 and a message code of 1 into the video stream as meta-data. All client applications receive the message, but only client 1234 processes this message since it has a matching client id. The message code of 1 instructs the client video player to display a message to the user asking them to click a button if they are still watching. If the button is not clicked, the video client automatically disables the video stream after a specified time period. Additionally, the aforementioned method may be used for a live video performance. Based on a video lag (i.e., calculated as the difference between a video timestamp and a real time timestamp) and buffering events, a code may be transmitted to the client application. The code instructs the client application to switch to an alternative video stream.
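The per-client message-code dispatch in the example above can be sketched as follows: every client receives the embedded meta-data, but only the client with the matching id acts on it. Message code 1 (the "still watching?" prompt) comes from the example; code 2 for a stream switch and the handler signatures are assumptions.

```python
# Dispatch embedded meta-data messages by client id and message code.

def handle_metadata(client_id, metadata, on_prompt, on_switch_stream):
    if metadata.get("target_id") != client_id:
        return "ignored"           # message addressed to a different client
    code = metadata["code"]
    if code == 1:
        on_prompt()                # ask the user to confirm they are watching
        return "prompted"
    if code == 2:                  # assumed code for a stream-switch instruction
        on_switch_stream(metadata["stream_url"])
        return "switched"
    return "unknown"

events = []
msg = {"target_id": 1234, "code": 1}
print(handle_metadata(1234, msg, lambda: events.append("prompt"), events.append))
print(handle_metadata(5678, msg, lambda: events.append("prompt"), events.append))
# client 1234 is prompted; client 5678 ignores the same message
```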
The alternative video stream may comprise a lower bit rate stream or may comprise a video stream from a different server which may be geographically closer (i.e., for improved network performance). The additional data, segments viewed, concurrent streams, max clients, and max streams are all reviewed by video producers to make business decisions about the video streams. Concurrent streams and max values are calculated using unique client ids and a series of measurement calls from each client application. Video segments viewed are calculated by looking at reported video timestamps. Each timestamp represents a segment of video 10 seconds long which ends at a timestamp time. As the timestamps are reported in measurement calls, a profile is generated illustrating which parts of the video are being displayed at any given time. In this example, the client application and video server comprise digital video recorder (DVR) capabilities, which allow users to rewind or pause a video and therefore do not guarantee that all users are watching a live version of the video stream. The details of these viewership patterns are available because of the timestamp based measurement technique described, supra.
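Two of the analyses above can be sketched directly: the per-segment viewing profile built from reported timestamps (each ending a 10-second segment), and max concurrent streams computed from unique client ids per reporting interval. All data values are illustrative.

```python
# Build a segment-view profile from reported timestamps, where DVR rewinds
# simply report the same segment again, and compute max concurrent streams
# from unique client ids per interval.
from collections import Counter

SEGMENT_LENGTH = 10  # seconds; each reported timestamp ends one segment

def segment_profile(reported_timestamps):
    """Map each (start, end) segment to its total view count."""
    counts = Counter(reported_timestamps)
    return {(ts - SEGMENT_LENGTH, ts): n for ts, n in sorted(counts.items())}

def max_concurrent(measurements):
    """Max number of distinct clients reporting in any single interval."""
    per_interval = {}
    for m in measurements:
        per_interval.setdefault(m["interval"], set()).add(m["client_id"])
    return max(len(clients) for clients in per_interval.values())

# One client watched 0-30, then rewound and rewatched the 10-20 segment.
print(segment_profile([10, 20, 30, 20]))  # {(0, 10): 1, (10, 20): 2, (20, 30): 1}
print(max_concurrent([
    {"interval": 1, "client_id": "a"}, {"interval": 1, "client_id": "b"},
    {"interval": 2, "client_id": "a"},
]))  # 2 distinct clients at interval 1
```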
The measurements may be compiled into usage statistics and viewing patterns associated with the video stream. The measurements occur at different time periods with reference to the time stamps. In step 218, the different time periods (i.e., associated with the time stamps of step 208 or reference time periods) are optionally compared to each other to determine differences. For example, different time periods associated with different time stamps (i.e., associated with the measurements) may be compared to each other to determine differences. Alternatively, different time periods associated with different time stamps (i.e., associated with the measurements) may be compared to predetermined reference time periods to determine differences. Additionally, different time periods associated with different time stamps (i.e., associated with the measurements) may be compared to each other and predetermined reference time periods to determine differences. In step 220, the computing system generates a report comprising descriptions associated with the first user interaction functions, the time periods of step 208, and the differences of step 218. In step 224, the computing system transmits the report to an analysis computing system (e.g., computing system 10 of
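The comparison of step 218 can be sketched as computing the interval between consecutive report times and its difference from a predetermined reference period; the reference value and input times are illustrative assumptions.

```python
# Compare measured time periods between reports against a reference period;
# a large positive difference suggests buffering or lag.

REFERENCE_PERIOD = 10  # seconds expected between measurement reports (assumed)

def period_differences(report_times, reference=REFERENCE_PERIOD):
    """Return (interval, interval - reference) for each consecutive pair."""
    diffs = []
    for earlier, later in zip(report_times, report_times[1:]):
        interval = later - earlier
        diffs.append((interval, interval - reference))
    return diffs

print(period_differences([0, 10, 22, 32]))
# [(10, 0), (12, 2), (10, 0)] -- the second interval ran 2 seconds long
```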
Still yet, any of the components of the present invention could be created, integrated, hosted, maintained, deployed, managed, serviced, etc. by a service provider who offers to implement a method for measuring and analyzing a video stream broadcasted for a user. Thus the present invention discloses a process for deploying, creating, integrating, hosting, and/or maintaining computing infrastructure, comprising integrating computer-readable code into the computer system 90, wherein the code in combination with the computer system 90 is capable of performing a method for measuring and analyzing a video stream broadcasted for a user. In another embodiment, the invention provides a method that performs the process steps of the invention on a subscription, advertising, and/or fee basis. That is, a service provider, such as a Solution Integrator, could offer to implement a method for measuring and analyzing a video stream broadcasted for a user. In this case, the service provider can create, maintain, support, etc. a computer infrastructure that performs the process steps of the invention for one or more customers. In return, the service provider can receive payment from the customer(s) under a subscription and/or fee agreement and/or the service provider can receive payment from the sale of advertising content to one or more third parties.
While
While embodiments of the present invention have been described herein for purposes of illustration, many modifications and changes will become apparent to those skilled in the art. Accordingly, the appended claims are intended to encompass all such modifications and changes as fall within the true spirit and scope of this invention.
This application is a continuation application claiming priority to Ser. No. 12/628,288 filed Dec. 1, 2009, now U.S. Pat. No. 8,566,856, issued Oct. 22, 2013.
Number | Name | Date | Kind |
---|---|---|---|
6771881 | Ketcham | Aug 2004 | B1 |
7260823 | Schlack et al. | Aug 2007 | B2 |
7363643 | Drake et al. | Apr 2008 | B2 |
7908616 | Jeong | Mar 2011 | B2 |
8374590 | Mikan et al. | Feb 2013 | B1 |
8566856 | Amsterdam et al. | Oct 2013 | B2 |
20020056086 | Yuen et al. | May 2002 | A1 |
20030172131 | Ao et al. | Sep 2003 | A1 |
20040045020 | Witt et al. | Mar 2004 | A1 |
20040046790 | Agarwal et al. | Mar 2004 | A1 |
20040261102 | Itoh | Dec 2004 | A1 |
20070021065 | Sengupta et al. | Jan 2007 | A1 |
20070050832 | Wright et al. | Mar 2007 | A1 |
20070157247 | Cordray et al. | Jul 2007 | A1 |
20070294096 | Randall et al. | Dec 2007 | A1 |
20070294126 | Maggio et al. | Dec 2007 | A1 |
20080092168 | Logan et al. | Apr 2008 | A1 |
20080165760 | Challapali et al. | Jul 2008 | A1 |
20080263579 | Mears et al. | Oct 2008 | A1 |
20090089445 | Deshpande | Apr 2009 | A1 |
20090172736 | Tsui et al. | Jul 2009 | A1 |
20090228910 | Christinat et al. | Sep 2009 | A1 |
20090254932 | Wang et al. | Oct 2009 | A1 |
20100153831 | Beaton | Jun 2010 | A1 |
20110088052 | Ramaswamy et al. | Apr 2011 | A1 |
20110131596 | Amsterdam et al. | Jun 2011 | A1 |
20120260279 | Matz et al. | Oct 2012 | A1 |
Number | Date | Country |
---|---|---|
101282248 | Oct 2008 | CN |
202505365 | Aug 2009 | CN |
2002056280 | Feb 2002 | JP |
2009110521 | May 2009 | JP |
2008103829 | Aug 2008 | WO |
Entry |
---|
Siemens AG, Juergen Carstens et al.; Counting of Viewing Rates in Mobile TV and IP TV; Jul. 10, 2008; Nokia Siemens Networks 2008; 2 pages. |
P. Mariadoss; Dossier CHA9-2008-0008; Performing Real-Time Analytics Using a Network Processing Solution Able to Directly Ingest IP Camera Video Streams; pp. 1-3. |
Office Action (Mail Date Sep. 20, 2012) for U.S. Appl. No. 12/628,288, filed Dec. 1, 2009. |
Amendment filed Dec. 13, 2012 filed in response to Office Action (Mail Date Sep. 20, 2012) for U.S. Appl. No. 12/628,288, filed Dec. 1, 2009. |
Final Office Action (Mail Date Jan. 7, 2013) for U.S. Appl. No. 12/628,288, filed Dec. 1, 2009. |
Amendment After Final with Request for Continued Examination filed Mar. 7, 2013 in response to Final Office Action (Mail Date Jan. 7, 2013) for U.S. Appl. No. 12/628,288, filed Dec. 1, 2009. |
Notice of Allowance (Mail Date Jun. 14, 2013) for U.S. Appl. No. 12/628,288, filed Dec. 1, 2009. |
Number | Date | Country | |
---|---|---|---|
20140013346 A1 | Jan 2014 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 12628288 | Dec 2009 | US |
Child | 14024160 | US |