Adaptive streaming systems and methods for performing trick play

Information

  • Patent Grant
  • Patent Number
    10,368,096
  • Date Filed
    Tuesday, October 9, 2018
  • Date Issued
    Tuesday, July 30, 2019
Abstract
Systems and methods for performing trick play functionality using trick play streams during adaptive bitrate streaming in accordance with embodiments of the invention are disclosed. One embodiment includes requesting a video container index from a video container file containing a video stream from a plurality of alternative streams of video; requesting at least one portion of the video stream using at least one entry from the video container index; decoding the at least one portion of the video stream; receiving at least one user instruction to perform a visual search of the media; requesting a trick play container index from a trick play container file containing a trick play stream; requesting at least one frame of video from the at least one trick play stream; and decoding and displaying the at least one frame of video from the trick play stream.
Description
FIELD OF THE INVENTION

The present invention generally relates to adaptive streaming and more specifically to adaptive bitrate streaming of encoded media contained within Matroska container files using Hypertext Transfer Protocol.


BACKGROUND

The term streaming media describes the playback of media on a playback device, where the media is stored on a server and continuously sent to the playback device over a network during playback. Typically, the playback device stores a sufficient quantity of media in a buffer at any given time during playback to prevent disruption of playback due to the playback device completing playback of all the buffered media prior to receipt of the next portion of media. Adaptive bit rate streaming or adaptive streaming involves detecting the present streaming conditions (e.g. the user's network bandwidth and CPU capacity) in real time and adjusting the quality of the streamed media accordingly. Typically, the source media is encoded at multiple bit rates and the playback device or client switches between streaming the different encodings depending on available resources.


Adaptive streaming solutions typically utilize either Hypertext Transfer Protocol (HTTP), published by the Internet Engineering Task Force and the World Wide Web Consortium as RFC 2616, or Real Time Streaming Protocol (RTSP), published by the Internet Engineering Task Force as RFC 2326, to stream media between a server and a playback device. HTTP is a stateless protocol that enables a playback device to request a byte range within a file. HTTP is described as stateless, because the server is not required to record information concerning the state of the playback device requesting information or the byte ranges requested by the playback device in order to respond to requests received from the playback device. RTSP is a network control protocol used to control streaming media servers. Playback devices issue control commands, such as “play” and “pause”, to the server streaming the media to control the playback of media files. When RTSP is utilized, the media server records the state of each client device and determines the media to stream based upon the instructions received from the client devices and the client's state.
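
For illustration, a minimal sketch of the kind of HTTP byte range request referred to above is provided below. The sketch uses Python's standard urllib module; the URL and byte offsets are illustrative placeholders rather than part of any specification.

# Sketch: retrieving a byte range of a file over HTTP.
# The URL and byte offsets are illustrative placeholders.
import urllib.request

def fetch_byte_range(url, first_byte, last_byte):
    # HTTP range requests use an inclusive "Range: bytes=first-last" header.
    request = urllib.request.Request(
        url, headers={"Range": f"bytes={first_byte}-{last_byte}"})
    with urllib.request.urlopen(request) as response:
        # A server honoring the range replies with status 206 (Partial Content).
        return response.read()

# Example: fetch the first 1026 bytes (e.g. a container header) of a stream.
# header = fetch_byte_range("http://example.com/video_620kbps.mkv", 0, 1025)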


In adaptive streaming systems, the source media is typically stored on a media server as a top level index file pointing to a number of alternate streams that contain the actual video and audio data. Each stream is typically stored in one or more container files. Different adaptive streaming solutions typically utilize different index and media containers. The Synchronized Multimedia Integration Language (SMIL) developed by the World Wide Web Consortium is utilized to create indexes in several adaptive streaming solutions including IIS Smooth Streaming developed by Microsoft Corporation of Redmond, Wash., and Flash Dynamic Streaming developed by Adobe Systems Incorporated of San Jose, Calif. HTTP Adaptive Bitrate Streaming developed by Apple Computer Incorporated of Cupertino, Calif. implements index files using an extended M3U playlist file (.M3U8), which is a text file containing a list of URIs that typically identify a media container file. The most commonly used media container formats are the MP4 container format specified in MPEG-4 Part 14 (i.e. ISO/IEC 14496-14) and the MPEG transport stream (TS) container specified in MPEG-2 Part 1 (i.e. ISO/IEC Standard 13818-1). The MP4 container format is utilized in IIS Smooth Streaming and Flash Dynamic Streaming. The TS container is used in HTTP Adaptive Bitrate Streaming.


The Matroska container is a media container developed as an open standard project by the Matroska non-profit organization of Aussonne, France. The Matroska container is based upon Extensible Binary Meta Language (EBML), which is a binary derivative of the Extensible Markup Language (XML). Decoding of the Matroska container is supported by many consumer electronics (CE) devices. The DivX Plus file format developed by DivX, LLC of San Diego, Calif. utilizes an extension of the Matroska container format (i.e. is based upon the Matroska container format, but includes elements that are not specified within the Matroska format).


SUMMARY OF THE INVENTION

Systems and methods for adaptive bitrate streaming of media stored in Matroska container files utilizing Hypertext Transfer Protocol (HTTP) in accordance with embodiments of the invention are disclosed. One embodiment of the invention includes a processor configured via a source encoding application to ingest at least one multimedia file containing a source video. In addition, the source encoding application further configures the processor to select a portion of the source video, transcode the selected portion of the source video into a plurality of alternative portions of encoded video, where each alternative portion is encoded using a different set of encoding parameters and commences with an intra frame starting a closed Group of Pictures (GOP), write each of the alternative portions of encoded video to an element of a different EBML container file, where each element is located within an EBML container file that also includes another element that indicates the encoding parameters used to encode the alternative portion of encoded video, and add an entry to at least one index that identifies the location of the element containing one of the alternative portions of encoded video within each of the EBML container files.


In a further embodiment, transcoding a selected portion of the source video further comprises transcoding the selected portion into at least one closed group of pictures.


In another embodiment, the portion of source video is selected based upon the duration of the selected portion of source video.


In a still further embodiment, the source encoding application configures the processor to select a portion of the source video having a duration of two seconds.


In still another embodiment, each of the alternative portions of encoded video is encoded with a different maximum bitrate.


In a yet further embodiment, at least two of the alternative portions of encoded video are encoded with different resolutions.


In yet another embodiment, at least two of the alternative portions of encoded video are encoded with different frame rates.


In a further embodiment again, the element of the EBML container file to which each alternative portion of encoded video is written is a Cluster element containing a time code and the portion of encoded video is contained within BlockGroup elements within the Cluster element.


In another embodiment again, each encoded frame of the alternative portion of encoded video contained within the Cluster element is contained within a separate BlockGroup element.


In a further additional embodiment, the first BlockGroup element in the Cluster element contains the IDR frame.


In another additional embodiment, the first BlockGroup element contains a Block element, which specifies the time code attribute of the IDR frame relative to the time code of the Cluster element.


In a still yet further embodiment, each element to which each of the alternative portions of encoded video is written is assigned the same time code.


In still yet another embodiment, the source encoding application further configures the processor to create an index for each of the EBML container files.


In a still further embodiment again, the source encoding application further configures the processor to add the location of the element containing one of the alternative portions of encoded video within each of the EBML container files to the index for the EBML container file.


In still another embodiment again, the source encoding application further configures the processor to pack the index for each EBML container file into the EBML container file.


In a still further additional embodiment, each index comprises a Cues element.


In still another additional embodiment, each Cues element includes a CuePoint element that points to the location of the element containing one of the alternative portions of encoded video within the EBML file.


In a yet further embodiment again, the source encoding application further configures the processor to create a top level index file that identifies each of the EBML container files.


In yet another embodiment again, the ingested multimedia file also includes source audio.


In a yet further additional embodiment, the source encoding application configures the processor to multiplex the audio into each of the EBML container files.


In yet another additional embodiment, the source encoding application configures the processor to write the audio to a separate EBML container file.


In a further additional embodiment again, the source encoding application further configures the processor to transcode at least one of the at least one audio tracks.


In another additional embodiment again, the ingested multimedia file further comprises subtitles.


In a still yet further embodiment again, the source encoding application configures the processor to multiplex the subtitles into each of the EBML container files.


In still yet another embodiment again, the source encoding application configures the processor to write the subtitles to a separate EBML container file.


In a still yet further additional embodiment, the source encoding application further configures the processor to transcode the source video to create a lower frame rate trick play track and to write the trick play track to a separate EBML container file.


In still yet another additional embodiment, the trick play track is also lower resolution than the source video.


In a yet further additional embodiment again, the source encoding application further configures the processor to write the element containing a set of encoding parameters in each of the EBML container files.


In yet another additional embodiment again, the set of encoding parameters includes at least one parameter selected from the group consisting of frame rate, frame height, frame width, sample aspect ratio, maximum bitrate, and minimum buffer size.


Another further embodiment includes repeatedly selecting a portion of the source video using the source encoder, transcoding the selected portion of the source video into a plurality of alternative portions of encoded video using the source encoder, where each alternative portion is encoded using a different set of encoding parameters and commences with an intra frame starting a closed Group of Pictures (GOP), writing each of the alternative portions of encoded video to an element of a different EBML container file using the source encoder, where each element is located within an EBML container file that also includes another element containing a set of encoding parameters corresponding to the encoding parameters used to encode the portion of video, and adding an entry to at least one index that identifies the location of the element containing one of the alternative portions of encoded video within each of the EBML container files.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a network diagram of an adaptive bitrate streaming system in accordance with an embodiment of the invention.



FIG. 2 conceptually illustrates a top level index file and Matroska container files generated by the encoding of source media in accordance with embodiments of the invention.



FIG. 3 conceptually illustrates a specialized Matroska container file incorporating a modified Cues element in accordance with an embodiment of the invention.



FIGS. 4a-4c conceptually illustrate the insertion of different types of media into the Clusters element of a Matroska container file subject to various constraints that facilitate adaptive bitrate streaming in accordance with embodiments of the invention.



FIG. 4d conceptually illustrates the multiplexing of different types of media into the Clusters element of a Matroska container file subject to various constraints that facilitate adaptive bitrate streaming in accordance with an embodiment of the invention.



FIG. 4e conceptually illustrates the inclusion of a trick play track into the Clusters element of a Matroska container file subject to various constraints that facilitate adaptive bitrate streaming in accordance with an embodiment of the invention.



FIG. 5 conceptually illustrates a modified Cues element of a specialized Matroska container file, where the Cues element includes information enabling the retrieval of Cluster elements using HTTP byte range requests in accordance with an embodiment of the invention.



FIG. 5a conceptually illustrates a modified Cues element of a specialized Matroska container file in accordance with an embodiment of the invention, where the Cues element is similar to the Cues element shown in FIG. 5 with the exception that attributes that are not utilized during adaptive bitrate streaming are removed.



FIG. 6 conceptually illustrates the indexing of Cluster elements within a specialized Matroska container file utilizing modified CuePoint elements within the container file in accordance with embodiments of the invention.



FIG. 7 is a flow chart illustrating a process for encoding source media for adaptive bitrate streaming in accordance with an embodiment of the invention.



FIG. 8 conceptually illustrates communication between a playback device and an HTTP server associated with the commencement of streaming of encoded media contained within Matroska container files indexed by a top level index file in accordance with an embodiment of the invention.



FIGS. 9a and 9b conceptually illustrate communication between a playback device and an HTTP server associated with switching between streams in response to the streaming conditions experienced by the playback device and depending upon the index information available to the playback device prior to the decision to switch streams in accordance with embodiments of the invention.





DETAILED DISCLOSURE OF THE INVENTION

Turning now to the drawings, systems and methods for encoding source media in Matroska container files for adaptive bitrate streaming utilizing Hypertext Transfer Protocol (HTTP) in accordance with embodiments of the invention are illustrated. In a number of embodiments, source media is encoded as a number of alternative streams. Each stream is stored in a Matroska (MKV) container file. In many embodiments, the Matroska container file is a specialized Matroska container file in that the manner in which the media in each stream is encoded and stored within the container is constrained to improve streaming performance. In several embodiments, the Matroska container file is further specialized in that additional index elements (i.e. elements that are not specified as part of the Matroska container format) can be included within the file to facilitate the retrieval of desired media during adaptive bitrate streaming. In several embodiments, each stream (i.e. audio, video, or subtitle) is stored within a separate Matroska container file. In other embodiments, an encoded video stream is multiplexed with one or more encoded audio, and/or subtitle streams in each Matroska container file. A top level index file containing an index to the streams contained within each of the container files is also generated to enable adaptive bitrate streaming of the encoded media. In many embodiments, the top level index file is a Synchronized Multimedia Integration Language (SMIL) file containing URIs for each of the Matroska container files. In other embodiments, any of a variety of file formats can be utilized in the generation of the top level index file.


The performance of an adaptive bitrate streaming system in accordance with embodiments of the invention can be significantly enhanced by encoding each portion of the source video at each bit rate in such a way that the portion of video is encoded in each stream as a single (or at least one) closed group of pictures (GOP) starting with an Instantaneous Decoder Refresh (IDR) frame. The GOP for each stream can then be stored as a Cluster element within the Matroska container file for the stream. In this way, the playback device can switch between streams at the completion of the playback of a Cluster and, irrespective of the stream from which a Cluster is obtained, the first frame in the Cluster will be an IDR frame and can be decoded without reference to any encoded media other than the encoded media contained within the Cluster element. In many embodiments, the sections of the source video that are encoded as GOPs are all the same duration. In a number of embodiments, each two-second sequence of the source video is encoded as a GOP.
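
As a simple illustration of this constraint, when the cluster duration is fixed the number of frames in each closed GOP is simply the cluster duration multiplied by the frame rate of the stream, so every alternative stream exposes a switching point at the same time codes. The following sketch, with illustrative values, makes the arithmetic explicit:

# Sketch: with fixed-duration Cluster elements, the closed GOP in each Cluster
# spans (cluster duration x frame rate) frames, so all alternative streams
# expose switching points at the same time codes. Values are illustrative.
def frames_per_cluster(cluster_duration_s, frame_rate_fps):
    return int(cluster_duration_s * frame_rate_fps)

assert frames_per_cluster(2, 30) == 60   # 30 fps stream: 60-frame closed GOP
assert frames_per_cluster(2, 24) == 48   # 24 fps stream: 48-frame closed GOP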


Retrieval of media using HTTP during adaptive streaming can be improved by adding additional index information to the Matroska container files used to contain each of the encoded streams. In a number of embodiments, the index is a reduced index in that the index only points to the IDRs at the start of each cluster. In many embodiments, the index of the Matroska container file includes additional non-standard attributes (i.e. attributes that do not form part of the Matroska container file format specification) that specify the size of each of the clusters so that a playback device can retrieve a Cluster element from the Matroska container file via HTTP using a byte range request.
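
For illustration, the following sketch shows how a playback device might turn such index information into an HTTP byte range. The (timecode, offset, size) tuples are an illustrative simplification of the index, not the literal EBML structure:

# Sketch: turning reduced-index entries (one per Cluster element) into an HTTP
# byte range. The (timecode, offset, size) layout is an illustrative assumption.
index = [
    (0,    4096,    512000),   # timecode (ms), byte offset of Cluster, size in bytes
    (2000, 516096,  498176),
    (4000, 1014272, 530432),
]

def range_for_timecode(index, timecode_ms):
    for timecode, offset, size in index:
        if timecode == timecode_ms:
            return f"bytes={offset}-{offset + size - 1}"
    raise KeyError(timecode_ms)

print(range_for_timecode(index, 2000))   # -> "bytes=516096-1014271"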


Adaptive streaming of source media encoded in the manner outlined above can be coordinated by a playback device in accordance with embodiments of the invention. The playback device obtains information concerning each of the available streams from the top level index file and selects one or more streams to utilize in the playback of the media. The playback device can then obtain header information from the Matroska container files containing the one or more bitstreams or streams, and the headers provide information concerning the decoding of the streams. The playback device can also request index information that indexes the encoded media stored within the relevant Matroska container files. The index information can be stored within the Matroska container files or separately from the Matroska container files in the top level index or in separate index files. The index information enables the playback device to request byte ranges corresponding to Cluster elements within the Matroska container file containing specific portions of encoded media via HTTP from the server. As the playback device receives the Cluster elements from the HTTP server, the playback device can evaluate current streaming conditions to determine whether to increase or decrease the bitrate of the streamed media. In the event that the playback device determines that a change in bitrate is necessary, the playback device can obtain header information and index information for the container file(s) containing the desired stream(s) (assuming the playback device has not already obtained this information). The index information can then be used to identify the byte range of the Cluster element containing the next portion of the source media encoded at the desired bit rate and the identified Cluster element can be retrieved from the server via HTTP. The next portion of the source media that is requested is typically identified based upon the Cluster elements already requested by the playback device and the Cluster elements buffered by the playback device. The next portion of source media requested from the alternative stream is requested to minimize the likelihood that the buffer of the playback device will underflow (i.e. run out of media to play back) prior to receipt of the Cluster element containing the next portion of source media by the playback device. In this way, the playback device can achieve adaptive bitrate streaming by retrieving sequential Cluster elements from the various streams as appropriate to the streaming conditions using the top level index and index information describing the Cluster elements within each of the Matroska container files.
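
The playback logic described above can be summarized by the following sketch. The stream descriptions, the fetch_cluster function, and the decode function are hypothetical placeholders; the sketch only illustrates the control flow of measuring throughput and selecting the source of each Cluster element on a per-Cluster basis:

# Sketch of the playback loop: fetch the next Cluster element from the stream
# whose bitrate best fits measured throughput. Streams, fetch_cluster and decode
# are hypothetical placeholders supplied by the playback device.
import time

def choose_stream(streams, measured_bps, safety_factor=0.8):
    # Pick the highest-bitrate stream that fits within a fraction of throughput.
    usable = [s for s in streams if s["bitrate_bps"] <= measured_bps * safety_factor]
    if not usable:
        return min(streams, key=lambda s: s["bitrate_bps"])
    return max(usable, key=lambda s: s["bitrate_bps"])

def playback_loop(streams, fetch_cluster, decode, cluster_duration_s=2.0):
    timecode_s = 0.0
    measured_bps = min(s["bitrate_bps"] for s in streams)   # start conservatively
    while True:
        stream = choose_stream(streams, measured_bps)
        start = time.monotonic()
        data = fetch_cluster(stream, timecode_s)             # HTTP byte range request
        if data is None:                                     # end of presentation
            break
        elapsed = max(time.monotonic() - start, 1e-6)
        measured_bps = len(data) * 8 / elapsed               # update throughput estimate
        decode(data)
        timecode_s += cluster_duration_s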


In a number of embodiments, variation in the bitrate between different streams can be achieved by modifying the encoding parameters for each stream including but not limited to the bitrate, frame rate, and resolution. When different streams include different resolutions, the display aspect ratio of each stream is the same and the sample aspect ratios are modified to ensure smooth transitions from one resolution to another. The encoding of source video for use in adaptive bitrate streaming and the playback of the encoded source video using HTTP requests to achieve adaptive bitrate streaming in accordance with embodiments of the invention is discussed further below.
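
For example, a constant 16:9 display aspect ratio (DAR) can be maintained across streams with different storage resolutions by setting the sample aspect ratio (SAR) so that DAR = (width / height) × SAR. The following sketch, with illustrative resolutions, shows the computation:

# Sketch: keep the display aspect ratio (DAR) constant across resolutions by
# adjusting the sample aspect ratio (SAR), since DAR = (width / height) x SAR.
from fractions import Fraction

def sample_aspect_ratio(width, height, dar=Fraction(16, 9)):
    return dar * height / width

print(sample_aspect_ratio(640, 480))    # 4:3 storage -> SAR 4/3 on a 16:9 display
print(sample_aspect_ratio(1280, 720))   # 16:9 storage -> square pixels (SAR 1)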


Adaptive Streaming System Architecture


An adaptive streaming system in accordance with an embodiment of the invention is illustrated in FIG. 1. The adaptive streaming system 10 includes a source encoder 12 configured to encode source media as a number of alternative streams. In the illustrated embodiment, the source encoder is a server. In other embodiments, the source encoder can be any processing device including a processor and sufficient resources to perform the transcoding of source media (including but not limited to video, audio, and/or subtitles). As is discussed further below, the source encoding server 12 generates a top level index to a plurality of container files containing the streams, at least a plurality of which are alternative streams. Alternative streams are streams that encode the same media content in different ways. In many instances, alternative streams encode media content (such as but not limited to video) at different bitrates. In a number of embodiments, the alternative streams are encoded with different resolutions and/or at different frame rates. The top level index file and the container files are uploaded to an HTTP server 14. A variety of playback devices can then use HTTP or another appropriate stateless protocol to request portions of the top level index file and the container files via a network 16 such as the Internet.


In many embodiments, the top level index file is a SMIL file and the media is stored in Matroska container files. As is discussed further below, the media can be stored within the Matroska container file in a way that facilitates the adaptive bitrate streaming of the media. In many embodiments, the Matroska container files are specialized Matroska container files that include enhancements (i.e. elements that do not form part of the Matroska file format specification) that facilitate the retrieval of specific portions of media via HTTP during the adaptive bitrate streaming of the media.


In the illustrated embodiment, playback devices include personal computers 18 and mobile phones 20. In other embodiments, playback devices can include consumer electronics devices such as DVD players, Blu-ray players, televisions, set top boxes, video game consoles, tablets, and other devices that are capable of connecting to a server via HTTP and playing back encoded media. Although a specific architecture is shown in FIG. 1, any of a variety of architectures can be utilized that enable playback devices to request portions of the top level index file and the container files in accordance with embodiments of the invention.


File Structure


Files generated by a source encoder and/or stored on an HTTP server for streaming to playback devices in accordance with embodiments of the invention are illustrated in FIG. 2. The files utilized in the adaptive bitrate streaming of the source media include a top level index 30 and a plurality of container files 32 that each contain at least one stream. The top level index file describes the content of each of the container files. As is discussed further below, the top level index file can take a variety of forms including an SMIL file and the container files can take a variety of forms including a specialized Matroska container file.


In many embodiments, each Matroska container file contains a single stream. For example, the stream could be one of a number of alternate video streams, an audio stream, one of a number of alternate audio streams, a subtitle stream, one of a number of alternate subtitle streams, a trick play stream, or one of a number of alternate trick play streams. In several embodiments, the Matroska container file includes multiple multiplexed streams. For example, the Matroska container could include a video stream, and one or more audio streams, one or more subtitle streams, and/or one or more trick play streams. As is discussed further below, in many embodiments the Matroska container files are specialized files. The encoding of the media and the manner in which the media is stored within Cluster elements within the Matroska container file can be subject to constraints designed to enhance the performance of an adaptive bitrate streaming system. In addition, the Matroska container file can include index elements that facilitate the location and downloading of Cluster elements from the various Matroska container files during the adaptive streaming of the media. Top level index files and Matroska container files that can be used in adaptive bitrate streaming systems in accordance with embodiments of the invention are discussed below.


Top Level Index Files


Playback devices in accordance with many embodiments of the invention utilize a top level index file to identify the container files that contain the streams available to the playback device for use in adaptive bitrate streaming. In many embodiments, the top level index files can include references to container files that each include an alternative stream of encoded media. The playback device can utilize the information in the top level index file to retrieve encoded media from each of the container files according to the streaming conditions experienced by the playback device.


In several embodiments, the top level index file provides information enabling the playback device to retrieve information concerning the encoding of the media in each of the container files and an index to encoded media within each of the container files. In a number of embodiments, each container file includes information concerning the encoded media contained within the container file and an index to the encoded media within the container file and the top level index file indicates the portions of each container file containing this information. Therefore, a playback device can retrieve the top level index file and use the top level index file to request the portions of one or more of the container files that include information concerning the encoded media contained within the container file and an index to the encoded media within the container file. A variety of top level index files that can be utilized in adaptive bitrate streaming systems in accordance with embodiments of the invention are discussed further below.


Top Level Index SMIL Files


In a number of embodiments, the top level index file utilized in the adaptive bitrate streaming of media is a SMIL file, which is an XML file that includes a list of URIs describing each of the streams and the container files that contain the streams. The URI can include information such as the “system-bitrate” of the stream contained within the container file and information concerning the location of specific pieces of data within the container file.


The basic structure of a SMIL file involves providing an XML declaration and a SMIL element. The SMIL element defines the streams available for use in adaptive bitrate streaming and includes a HEAD element, which is typically left empty, and a BODY element that typically only contains a PAR (parallel) element. The PAR element describes streams that can be played simultaneously (i.e. include media that can be presented at the same time).


The SMIL specification defines a number of child elements to the PAR element that can be utilized to specify the streams available for use in adaptive bitrate streaming. The VIDEO, AUDIO and TEXTSTREAM elements can be utilized to define a specific video, audio or subtitle stream. The VIDEO, AUDIO and TEXTSTREAM elements can collectively be referred to as media objects. The basic attributes of a media object are the SRC attribute, which specifies the full path or a URI to a container file containing the relevant stream, and the XML:LANG attribute, which includes a 3 letter language code. Additional information concerning a media object can be specified using the PARAM element. The PARAM element is a standard way within the SMIL format for providing a general name value pair. In a number of embodiments of the invention, specific PARAM elements are defined that are utilized during adaptive bitrate streaming.


In many embodiments, a “header-request” PARAM element is defined that specifies the size of the header section of the container file containing the stream. The value of the “header-request” PARAM element typically specifies the number of bytes between the start of the file and the start of the encoded media within the file. In many embodiments, the header contains information concerning the manner in which the media is encoded, and a playback device retrieves the header prior to playback of the encoded media in order to be able to configure the decoder for playback of the encoded media. An example of a “header-request” PARAM element is as follows:

<param
  name="header-request"
  value="1026"
  valuetype="data" />

In a number of embodiments, a “mime” PARAM element is defined that specifies the MIME type of the stream. A “mime” PARAM element that identifies the stream as being an H.264 stream (i.e. a stream encoded in accordance with the MPEG-4 Advanced Video Codec standard) is as follows:

<param
  name="mime"
  value="V_MPEG4/ISO/AVC"
  valuetype="data" />

The MIME type of the stream can be specified using a “mime” PARAM element as appropriate to the encoding of a specific stream (e.g. AAC audio or UTF-8 text stream).


When the media object is a VIDEO element, additional attributes are defined within the SMIL file format specification including the systemBitrate attribute, which specifies the bitrate of the stream in the container file identified by the VIDEO element, and width and height attributes, which specify the dimensions of the encoded video in pixels. Additional attributes can also be defined using the PARAM element. In several embodiments, a “vbv” PARAM element is defined that specifies the VBV buffer size of the video stream in bytes. The video buffering verifier (VBV) is a theoretical MPEG video buffer model used to ensure that an encoded video stream can be correctly buffered and played back at the decoder device. An example of a “vbv” PARAM element that specifies a VBV size of 1000 bytes is as follows:

<param
  name="vbv"
  value="1000"
  valuetype="data" />

An example of a VIDEO element including the attributes discussed above is as follows:

<video
  src="http://cnd.com/video1_620kbps.mkv"
  systemBitrate="620"
  width="480"
  height="270" >
  <param
    name="vbv"
    value="1000"
    valuetype="data" />
</video>

Adaptive bitrate streaming systems in accordance with embodiments of the invention can support trick play streams, which can be used to provide smooth visual search through source content encoded for adaptive bitrate streaming. A trick play stream can be encoded so that it appears, when played back, to be an accelerated visual search through the source media, when in reality the trick play stream is simply a separate track encoding the source media at a lower frame rate. In many embodiments of the system, a VIDEO element that references a trick play track is indicated by the systemProfile attribute of the VIDEO element. In other embodiments, any of a variety of techniques can be utilized to signify within the top level index file that a specific stream is a trick play stream. An example of a trick play stream VIDEO element in accordance with an embodiment of the invention is as follows:

<video
  src="http://cnd.com/video_test2_600kbps.mkv"
  systemProfile="DivXPlusTrickTrack"
  width="480"
  height="240">
  <param name="vbv" value="1000" valuetype="data" />
  <param name="header-request" value="1000" valuetype="data" />
</video>

In a number of embodiments of the invention, a “reservedBandwidth” PARAM element can be defined for an AUDIO element. The “reservedBandwidth” PARAM element specifies the bitrate of the audio stream in Kbps. An example of an AUDIO element specified in accordance with an embodiment of the invention is as follows:

<audio
  src="http://cnd.com/audio_test1_277kbps.mkv"
  xml:lang="gem">
  <param
    name="reservedBandwidth"
    value="128"
    valuetype="data" />
</audio>

In several embodiments, the “reservedBandwidth” PARAM element is also defined for a TEXTSTREAM element. An example of a TEXTSTREAM element including a “reservedBandwidth” PARAM element in accordance with an embodiment of the invention is as follows:

<textstream
  src="http://cnd.com/text_stream_ger.mkv"
  xml:lang="gem">
  <param
    name="reservedBandwidth"
    value="32"
    valuetype="data" />
</textstream>

In other embodiments, any of a variety of mechanisms can be utilized to specify information concerning VIDEO, AUDIO, and SUBTITLE elements as appropriate to specific applications.


A SWITCH element is a mechanism defined within the SMIL file format specification that can be utilized to define adaptive or alternative streams. An example of the manner in which a SWITCH element can be utilized to specify alternative video streams at different bitrates is as follows:

<switch>
  <video src="http://cnd.com/video_test1_300kbps.mkv"/>
  <video src="http://cnd.com/video_test2_900kbps.mkv"/>
  <video src="http://cnd.com/video_test3_1200kbps.mkv"/>
</switch>

The SWITCH element specifies the URLs of three alternative video streams. The file names indicate the different bitrates of each of the streams. As is discussed further below, the SMIL file format specification provides mechanisms that can be utilized in accordance with embodiments of the invention to specify within the top level index SMIL file additional information concerning a stream and the container file in which it is contained.


In many embodiments of the invention, the EXCL (exclusive) element is used to define alternative tracks that do not adapt during playback with streaming conditions. For example, the EXCL element can be used to define alternative audio tracks or alternative subtitle tracks. An example of the manner in which an EXCL element can be utilized to specify alternative English and French audio streams is as follows:

<excl>
  <audio
    src="http://cnd.com/english-audio.mkv"
    xml:lang="eng"/>
  <audio
    src="http://cnd.com/french-audio.mkv"
    xml:lang="fre"/>
</excl>

An example of a top level index SMIL file that defines the attributes and parameters of two alternative video levels, an audio stream and a subtitle stream in accordance with an embodiment of the invention is as follows:

<?xml version="1.0" encoding="utf-8"?>
<smil xmlns="http://www.w3.org/ns/SMIL" version="3.0"
 baseProfile="Language">
 <head>
 </head>
 <body>
  <par>
   <switch>
    <video
     src="http://cnd.com/video_test1_300kbps.mkv"
     systemBitrate="300"
     vbv="600"
     width="320"
     height="240" >
     <param
      name="vbv"
      value="600"
      valuetype="data" />
     <param
      name="header-request"
      value="1000"
      valuetype="data" />
    </video>
    <video
     src="http://cnd.com/video_test2_600kbps.mkv"
     systemBitrate="600"
     vbv="900"
     width="640"
     height="480">
     <param
      name="vbv"
      value="1000"
      valuetype="data" />
     <param
      name="header-request"
      value="1000"
      valuetype="data" />
    </video>
   </switch>
   <audio
    src="http://cnd.com/audio.mkv"
    xml:lang="eng">
    <param
     name="header-request"
     value="1000"
     valuetype="data" />
    <param name="reservedBandwidth" value="128"
     valuetype="data" />
   </audio>
   <textstream
    src="http://cnd.com/subtitles.mkv"
    xml:lang="eng">
    <param
     name="header-request"
     value="1000"
     valuetype="data" />
    <param name="reservedBandwidth" value="32"
     valuetype="data" />
   </textstream>
  </par>
 </body>
</smil>

The top level index SMIL file can be generated when the source media is encoded for playback via adaptive bitrate streaming. Alternatively, the top level index SMIL file can be generated when a playback device requests the commencement of playback of the encoded media. When the playback device receives the top level index SMIL file, the playback device can parse the SMIL file to identify the available streams. The playback device can then select the streams to utilize to playback the content and can use the SMIL file to identify the portions of the container file to download to obtain information concerning the encoding of a specific stream and/or to obtain an index to the encoded media within the container file.
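
For illustration, the following sketch shows one way a playback device might parse a top level index SMIL file such as the example above to enumerate the available video streams and their parameters. It uses Python's standard ElementTree parser and matches elements by local name to sidestep namespace handling; the approach is an implementation choice, not part of the SMIL specification:

# Sketch: listing the video streams described in a top level index SMIL file.
import xml.etree.ElementTree as ET

def local_name(tag):
    return tag.rsplit("}", 1)[-1]            # strip any "{namespace}" prefix

def list_video_streams(smil_text):
    streams = []
    for element in ET.fromstring(smil_text).iter():
        if local_name(element.tag) != "video":
            continue
        params = {p.get("name"): p.get("value")
                  for p in element if local_name(p.tag) == "param"}
        streams.append({
            "src": element.get("src"),
            "bitrate": element.get("systemBitrate"),
            "width": element.get("width"),
            "height": element.get("height"),
            "header_bytes": params.get("header-request"),   # bytes to fetch for the header
            "vbv": params.get("vbv"),
        })
    return streams

# streams = list_video_streams(open("index.smil", encoding="utf-8").read())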


Although top level index SMIL files are described above, any of a variety of top level index file formats can be utilized to create top level index files as appropriate to a specific application in accordance with an embodiment of the invention. The use of top level index files to enable playback of encoded media using adaptive bitrate streaming in accordance with embodiments of the invention is discussed further below.


Storing Media in Matroska Files for Adaptive Bitrate Streaming


A Matroska container file used to store encoded video in accordance with an embodiment of the invention is illustrated in FIG. 3. The container file 32 is an Extensible Binary Meta Language (EBML) file that is an extension of the Matroska container file format. The specialized Matroska container file 32 includes a standard EBML element 34, and a standard Segment element 36 that includes a standard Seek Head element 40, a standard Segment Information element 42, and a standard Tracks element 44. These standard elements describe the media contained within the Matroska container file. The Segment element 36 also includes a standard Clusters element 46. As is described below, the manner in which encoded media is inserted within individual Cluster elements 48 within the Clusters element 46 is constrained to improve the playback of the media in an adaptive streaming system. In many embodiments, the constraints imposed upon the encoded video are consistent with the specification of the Matroska container file format and involve encoding the video so that each cluster includes at least one closed GOP commencing with an IDR frame. In addition to the above standard elements, the Segment element 36 also includes a modified version of the standard Cues element 52. As is discussed further below, the Cues element includes specialized CuePoint elements (i.e. non-standard CuePoint elements) that facilitate the retrieval of the media contained within specific Cluster elements via HTTP.


The constraints imposed upon the encoding of media and the formatting of the encoded media within the Clusters element of a Matroska container file for adaptive bitrate streaming and the additional index information inserted within the container file in accordance with embodiments of the invention is discussed further below.


Encoding Media for Insertion in Cluster Elements


An adaptive bitrate streaming system provides a playback device with the option of selecting between different streams of encoded media during playback according to the streaming conditions experienced by the playback device. In many embodiments, switching between streams is facilitated by separately pre-encoding discrete portions of the source media in accordance with the encoding parameters of each stream and then including each separately encoded portion in its own Cluster element within the stream's container file. Furthermore, the media contained within each cluster is encoded so that the media is capable of playback without reference to media contained in any other cluster within the stream. In this way, each stream includes a Cluster element corresponding to the same discrete portion of the source media and, at any time, the playback device can select the Cluster element from the stream that is most appropriate to the streaming conditions experienced by the playback device and can commence playback of the media contained within the Cluster element. Accordingly, the playback device can select clusters from different streams as the streaming conditions experienced by the playback device change over time. In several embodiments, the Cluster elements are further constrained so that each Cluster element contains a portion of encoded media from the source media having the same duration. In a number of embodiments, each Cluster element includes two seconds of encoded media. The specific constraints applied to the media encoded within each Cluster element depending upon the type of media (i.e. video, audio, or subtitles) are discussed below.


A Clusters element of a Matroska container file containing a video stream in accordance with an embodiment of the invention is illustrated in FIG. 4a. The Clusters element 46 includes a plurality of Cluster elements 48 that each contains a discrete portion of encoded video. In the illustrated embodiment, each Cluster element 48 includes two seconds of encoded video. In other embodiments, the Cluster elements include encoded video having a greater or lesser duration than two seconds. The smaller the Cluster elements (i.e. the smaller the duration of the encoded media within each Cluster element), the higher the overhead associated with requesting each Cluster element. Therefore, a tradeoff exists between the responsiveness of the playback device to changes in streaming conditions and the effective data rate of the adaptive streaming system for a given set of streaming conditions (i.e. the portion of the available bandwidth actually utilized to transmit encoded media). In several embodiments, the encoded video sequences in the Cluster elements have different durations. Each Cluster element 48 includes a Timecode element 60 indicating the start time of the encoded video within the Cluster element and a plurality of BlockGroup elements. As noted above, the encoded video stored within the Cluster is constrained so that the encoded video can be played back without reference to the encoded video contained within any of the other Cluster elements in the container file. In many embodiments, encoding the video contained within the Cluster element as a GOP in which the first frame is an IDR frame enforces the constraint. In the illustrated embodiment, the first BlockGroup element 62 contains an IDR frame. Therefore, the first BlockGroup element 62 does not include a ReferenceBlock element. The first BlockGroup element 62 includes a Block element 64, which specifies the Timecode attribute of the frame encoded within the Block element 64 relative to the Timecode of the Cluster element 48. Subsequent BlockGroup elements 66 are not restricted in the types of frames that they can contain (other than that they cannot reference frames that are not contained within the Cluster element). Therefore, subsequent BlockGroup elements 66 can include ReferenceBlock elements 68 referencing other BlockGroup element(s) utilized in the decoding of the frame contained within the BlockGroup or can contain IDR frames and are similar to the first BlockGroup element 62. As noted above, the manner in which encoded video is inserted within the Cluster elements of the Matroska file conforms with the specification of the Matroska file format.
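
The constraints described above can be summarized by the following simplified, hypothetical in-memory model (the classes are illustrative, not the literal EBML element definitions): the first BlockGroup must hold an IDR frame with no ReferenceBlock, and later BlockGroups may only reference frames within the same Cluster.

# Sketch: a simplified model of the Cluster constraints described above.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class BlockGroup:
    timecode_offset: int                      # Block timecode relative to the Cluster
    is_idr: bool
    reference_offset: Optional[int] = None    # ReferenceBlock target, if any

@dataclass
class Cluster:
    timecode: int                             # Cluster start time
    block_groups: List[BlockGroup] = field(default_factory=list)

def cluster_is_independently_decodable(cluster: Cluster) -> bool:
    if not cluster.block_groups:
        return False
    first = cluster.block_groups[0]
    # The first BlockGroup must contain an IDR frame and no ReferenceBlock.
    if not first.is_idr or first.reference_offset is not None:
        return False
    # Later BlockGroups may only reference frames within this Cluster.
    offsets = {bg.timecode_offset for bg in cluster.block_groups}
    return all(bg.reference_offset is None or bg.reference_offset in offsets
               for bg in cluster.block_groups)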


The insertion of encoded audio and subtitle information within a Clusters element 46 of a Matroska container file in accordance with embodiments of the invention is illustrated in FIGS. 4b and 4c. In the illustrated embodiments, the encoded media is inserted within the Cluster elements 48 subject to the same constraints applied to the encoded video discussed above with respect to FIG. 4a. In addition, the duration of the encoded audio and subtitle information within each Cluster element corresponds to the duration of the encoded video in the corresponding Cluster element of the Matroska container file containing the encoded video. In other embodiments, the Cluster elements within the container files containing the audio and/or subtitle streams need not correspond with the start time and duration of the Cluster elements in the container files containing the alternative video streams.


Multiplexing Streams in a Single MKV Container File


The Clusters elements shown in FIGS. 4a-4c assume that a single stream is contained within each Matroska container file. In several embodiments, media from multiple streams is multiplexed within a single Matroska container file. In this way, a single container file can contain a video stream multiplexed with one or more corresponding audio streams, and/or one or more corresponding subtitle streams. Storing the streams in this way can result in duplication of the audio and subtitle streams across multiple alternative video streams. However, the seek time to retrieve encoded media from a video stream and an associated audio, and/or subtitle stream can be reduced due to the adjacent storage of the data on the server. The Clusters element 46 of a Matroska container file containing multiplexed video, audio and subtitle data in accordance with an embodiment of the invention is illustrated in FIG. 4d. In the illustrated embodiment, each Cluster element 48 includes additional BlockGroup elements for each of the multiplexed streams. The first Cluster element includes a first BlockGroup element 62v for encoded video that includes a Block element 64v containing an encoded video frame and indicating the Timecode attribute of the frame relative to the start time of the Cluster element (i.e. the Timecode attribute 60). A second BlockGroup element 62a includes a Block element 64a including an encoded audio sequence and indicating the timecode of the encoded audio relative to the start time of the Cluster element, and a third BlockGroup element 62s including a Block element 64s containing an encoded subtitle and indicating the timecode of the encoded subtitle relative to the start time of the Cluster element. Although not shown in the illustrated embodiment, each Cluster element 48 likely would include additional BlockGroup elements containing additional encoded video, audio or subtitles. Despite the multiplexing of the encoded video, audio, and/or subtitle streams, the same constraints concerning the encoded media apply.


Incorporating Trick Play Tracks in MKV Container Files for Use in Adaptive Bitrate Streaming Systems


The incorporation of trick play tracks within Matroska container files is proposed by DivX, LLC in U.S. patent application Ser. No. 12/260,404 entitled “Application Enhancement Tracks”, filed Oct. 29, 2008, the disclosure of which is hereby incorporated by reference in its entirety. Trick play tracks similar to the trick play tracks described in U.S. patent application Ser. No. 12/260,404 can be used to provide a trick play stream in an adaptive bitrate streaming system in accordance with an embodiment of the invention to provide smooth visual search through source content encoded for adaptive bitrate streaming. A separate trick play track can be encoded that appears to be an accelerated visual search through the source media when played back, when in reality the trick play track is simply a separate track encoding the source media at a lower frame rate. In several embodiments, the trick play stream is created by generating a trick play track in the manner outlined in U.S. patent application Ser. No. 12/260,404 and inserting the trick play track into a Matroska container file subject to the constraints mentioned above with respect to insertion of a video stream into a Matroska container file. In many embodiments, the trick play track is also subject to the further constraint that every frame in the GOP of each Cluster element in the trick play track is encoded as an IDR frame. As with the other video streams, each Cluster element contains a GOP corresponding to the same two seconds of source media as the corresponding Cluster elements in the other streams. There are simply fewer frames in the GOPs of the trick play track and each frame has a longer duration. In this way, transitions to and from a trick play stream can be treated in the same way as transitions between any of the other encoded streams are treated within an adaptive bitrate streaming system in accordance with embodiments of the invention. Playback of the frames contained within the trick play track to achieve accelerated visual search typically involves the playback device manipulating the timecodes assigned to the frames of encoded video prior to providing the frames to the playback device's decoder to achieve a desired increase in rate of accelerated search (e.g. ×2, ×4, ×6, etc.).
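
The timecode manipulation mentioned above can be sketched as follows; the frame tuples and the speed factor are illustrative, and a real decoder pipeline would operate on decoded frames rather than tuples:

# Sketch: retiming trick play frames before they are handed to the decoder so a
# low-frame-rate track plays back as an accelerated (x2, x4, ...) visual search.
def retime_trick_play(frames, speed_factor, start_timecode_ms=0):
    # frames: list of (original_timecode_ms, frame_data) from the trick play track
    first_timecode = frames[0][0]
    retimed = []
    for original_timecode, frame_data in frames:
        presentation_time = start_timecode_ms + (original_timecode - first_timecode) / speed_factor
        retimed.append((presentation_time, frame_data))
    return retimed

# Example: frames spanning 8 seconds are presented over 2 seconds (x4 search).
# retime_trick_play([(0, b""), (2000, b""), (4000, b""), (6000, b"")], speed_factor=4)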


A Clusters element containing encoded media from a trick play track is shown in FIG. 4e. In the illustrated embodiment, the encoded trick play track is inserted within the Cluster elements 48 subject to the same constraints applied to the encoded video discussed above with respect to FIG. 4a. However, each Block element contains an IDR. In other embodiments, the Cluster elements within the container files containing the trick play tracks need not correspond with the start time and duration of the Cluster elements in the container files containing the alternative video streams.


In many embodiments, source content can be encoded to provide a single trick play track or multiple trick play tracks for use by the adaptive bit rate streaming system. When a single trick play track is provided, the trick play track is typically encoded at a low bitrate. When multiple alternative trick play tracks are provided, adaptive rate streaming can also be performed with respect to the trick play tracks. In several embodiments, multiple trick play tracks are provided to support different rates of accelerated visual search through the encoded media.


Incorporating Indexing Information within MKV Container Files


The specification for the Matroska container file format provides for an optional Cues element that is used to index Block elements within the container file. A modified Cues element 52 that can be incorporated into a Matroska container file in accordance with an embodiment of the invention to facilitate the requesting of clusters by a playback device using HTTP is illustrated in FIG. 5. The modified Cues element 52 includes a plurality of CuePoint elements 70 that each include a CueTime attribute 72. Each CuePoint element includes a CueTrackPositions element 74 containing the CueTrack 76 and CueClusterPosition 78 attributes. In many embodiments, the CuePoint element is mainly configured to identify a specific Cluster element as opposed to a specific Block element within a Cluster element. However, in several applications the ability to seek to specific BlockGroup elements within a Cluster element is required, and additional index information is included in the Cues element.


The use of a modified Cues element to index encoded media within a Clusters element of a Matroska file in accordance with an embodiment of the invention is illustrated in FIG. 6. A CuePoint element is generated to correspond to each Cluster element within the Matroska container file. The CueTime attribute 72 of the CuePoint element 70 corresponds to the Timecode attribute 60 of the corresponding Cluster element 48. In addition, the CuePoint element contains a CueTrackPositions element 74 having a CueClusterPosition attribute 78 that points to the start of the corresponding Cluster element 48. The CueTrackPositions element 74 can also include a CueBlockNumber attribute, which is typically used to indicate the Block element containing the first IDR frame within the Cluster element 48.


As can readily be appreciated, the modified Cues element 52 forms an index to each of the Cluster elements 48 within the Matroska container file. Furthermore, the CueTrackPosition elements provide information that can be used by a playback device to request the byte range of a specific Cluster element 48 via HTTP or another suitable protocol from a remote server. The Cues element of a conventional Matroska file does not directly provide a playback device with information concerning the number of bytes to request from the start of the Cluster element in order to obtain all of the encoded video contained within the Cluster element. The size of a Cluster element can be inferred in a modified Cues element by using the CueClusterPosition attribute of the CueTrackPositions element that indexes the first byte of the next Cluster element. Alternatively, additional CueTrackPosition elements could be added to modified Cues elements in accordance with embodiments of the invention that index the last byte of the Cluster element (in addition to the CueTrackPositions elements that index the first byte of the Cluster element), and/or a non-standard CueClusterSize attribute that specifies the size of the Cluster element pointed to by the CueClusterPosition attribute is included in each CueTrackPosition element to assist with the retrieval of specific Cluster elements within a Matroska container file via HTTP byte range requests or a similar protocol.
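
Both approaches can be sketched as follows; the cue_points structure is an illustrative simplification of the CuePoint data rather than the literal EBML elements:

# Sketch: deriving the byte range of Cluster i from CuePoint data, either from a
# non-standard CueClusterSize attribute or by inferring the size from the next
# CueClusterPosition (or from the file size for the final Cluster).
def cluster_byte_range(cue_points, i, file_size=None):
    # cue_points: list of dicts with "position" and optionally "size", in file order
    start = cue_points[i]["position"]
    if "size" in cue_points[i]:                  # non-standard CueClusterSize
        end = start + cue_points[i]["size"] - 1
    elif i + 1 < len(cue_points):                # infer from the next CuePoint
        end = cue_points[i + 1]["position"] - 1
    elif file_size is not None:                  # last Cluster in the file
        end = file_size - 1
    else:
        raise ValueError("size of the final Cluster cannot be inferred")
    return f"bytes={start}-{end}"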


The modification of the Cues element in the manner outlined above significantly simplifies the retrieval of Cluster elements from a Matroska container file via HTTP or a similar protocol during adaptive bitrate streaming. In addition, by only indexing the first frame in each Cluster the size of the index is significantly reduced. Given that the index is typically downloaded prior to playback, the reduced size of the Cues element (i.e. index) means that playback can commence more rapidly. Using the CueClusterPosition elements, a playback device can request a specific Cluster element from the stream most suited to the streaming conditions experienced by the playback device by simply referencing the index of the relevant Matroska container file using the Timecode attribute for the desired Cluster element.


In some embodiments, a number of the attributes within the Cues element are not utilized during adaptive bitrate streaming. Therefore, the Cues element can be further modified by removing the unutilized attributes to reduce the overall size of the index for each Matroska container file. A modified Cues element that can be utilized in a Matroska container file that includes a single encoded stream in accordance with an embodiment of the invention is illustrated in FIG. 5a. The Cues element 52′ shown in FIG. 5a is similar to the Cues element 52 shown in FIG. 5 with the exception that the CuePoint elements 70′ do not include a CueTime attribute (see 72 in FIG. 5) and/or the CueTrackPositions elements 74′ do not include a CueTrack attribute (76 in FIG. 5). When the portions of encoded media in each Cluster element in the Matroska container file have the same duration, the CueTime attribute is not necessary. When the Matroska container file includes a single encoded stream, the CueTrack attribute is not necessary. In other embodiments, the Cues element and/or other elements of the Matroska container file can be modified to remove elements and/or attributes that are not necessary for the adaptive bitrate streaming of the encoded stream contained within the Matroska container file, given the manner in which the stream is encoded and inserted in the Matroska container file.


Although various modifications to the Cues element to include information concerning the size of each of the Cluster elements within a Matroska container file and to eliminate unnecessary attributes are described above, many embodiments of the invention utilize a conventional Matroska container. In several embodiments, the playback device simply determines the size of Cluster elements on the fly using information obtained from a conventional Cues element, and/or relies upon a separate index file containing information concerning the size and/or location of the Cluster elements within the MKV container file. In several embodiments, the additional index information is stored in the top level index file. In a number of embodiments, the additional index information is stored in separate files that are identified in the top level index file. When index information utilized to retrieve Cluster elements from a Matroska container file is stored separately from the container file, the Matroska container file is still typically constrained to encode media for inclusion in the Cluster elements in the manner outlined above. In addition, wherever the index information is located, the index information will typically index each Cluster element and include (but not be limited to) information concerning at least the starting location and, in many instances, the size of each Cluster element.


Encoding Source Media for Adaptive Bitrate Streaming


A process for encoding source media as a top level index file and a plurality of Matroska container files for use in an adaptive bitrate streaming system in accordance with an embodiment of the invention is illustrated in FIG. 7. The encoding process 100 commences by selecting (102) a first portion of the source media and encoding (104) the source media using the encoding parameters for each stream. When the portion of media is video, the portion of source video is encoded as a single GOP commencing with an IDR frame. In many embodiments, the encoding parameters used to create the alternative GOPs vary with respect to bitrate, frame rate, and resolution. In this way, the portion of media is encoded as a set of interchangeable alternatives, and a playback device can select the alternative most appropriate to the streaming conditions experienced by the playback device. When different resolutions are supported, the encoding of the streams is constrained so that each stream has the same display aspect ratio. A constant display aspect ratio can be achieved across different resolution streams by varying the sample aspect ratio with the resolution of the stream. In many instances, reducing resolution can result in higher quality video compared with higher resolution video encoded at the same bit rate. In many embodiments, the source media is itself encoded and the encoding process (104) involves transcoding or transrating of the encoded source media according to the encoding parameters of each of the alternative streams supported by the adaptive bitrate streaming system.
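

The relationship between display aspect ratio, resolution, and sample aspect ratio can be made concrete with a small worked example; the sketch below is illustrative only and assumes a 16:9 presentation.

# Illustrative sketch: choosing the sample (pixel) aspect ratio so that
# streams encoded at different resolutions present the same display aspect
# ratio. DAR = SAR * (width / height), so SAR = DAR * height / width.
from fractions import Fraction

def sample_aspect_ratio(display_ar, width, height):
    return display_ar * Fraction(height, width)

# For a 16:9 presentation, 1280x720 uses square samples (SAR 1:1) while
# 640x480 uses a SAR of 4:3 to avoid distorting the picture.
assert sample_aspect_ratio(Fraction(16, 9), 1280, 720) == Fraction(1, 1)
assert sample_aspect_ratio(Fraction(16, 9), 640, 480) == Fraction(4, 3)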


Once the source media has been encoded as a set of alternative portions of encoded media, each of the alternative portions of encoded media is inserted (106) into a Cluster element within the Matroska container file corresponding to the stream to which the portion of encoded media belongs. In many embodiments, the encoding process also constructs indexes for each Matroska container file as media is inserted into Cluster elements within the container. Therefore, the process 100 can also include creating a CuePoint element that points to the Cluster element inserted within the Matroska container file. The CuePoint element can be held in a buffer until the source media is completely encoded. Although the above process describes encoding each of the alternative portions of encoded media sequentially in a single pass through the source media, many embodiments of the invention involve performing a separate pass through the source media to encode each of the alternative streams.
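

A minimal sketch of this index construction, assuming simple in-memory structures rather than the Matroska binary format, is set out below; the write_cluster callback and the dictionary fields are illustrative assumptions.

# Illustrative sketch only: buffering CuePoint-like entries as each encoded
# portion is written out as a Cluster, so the index can be emitted once the
# source media is completely encoded.
def build_cluster_index(portions, write_cluster):
    """portions yields (timecode, encoded_bytes); write_cluster appends a
    Cluster to the container file and returns its starting byte offset."""
    cue_points = []                      # buffered until encoding completes
    for timecode, data in portions:
        offset = write_cluster(timecode, data)
        cue_points.append({"timecode": timecode,
                           "cluster_position": offset,
                           "cluster_size": len(data)})
    return cue_points                    # later written out as the Cues element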


Referring back to FIG. 7, the process continues to select (102) and encode (104) portions of the source media and then insert (106) the encoded portions of media into the Matroska container file corresponding to the appropriate stream until the entire source media is encoded for adaptive bitrate streaming (108). At that point, the process can insert an index (110) into the Matroska container file for each stream and create (112) a top level index file that indexes each of the encoded streams contained within the Matroska container files. As noted above, the indexes can be created as the encoded media is inserted into the Matroska container files so that a CuePoint element indexes each Cluster element within the Matroska container file. Upon completion of the encoding, each of the CuePoint elements can be included in a Cues element and the Cues element can be inserted into the Matroska container file following the Clusters element.


Following the encoding of the source media to create Matroska container files containing each of the streams generated during the encoding process, which can include the generation of trick play streams, and a top level index file that indexes each of the streams within the Matroska container files, the top level index file and the Matroska container files can be uploaded to an HTTP server for adaptive bitrate streaming to playback devices. The adaptive bitrate streaming of media encoded in accordance with embodiments of the invention using HTTP requests is discussed further below.


Adaptive Bitrate Streaming from MKV Container Files Using HTTP


When source media is encoded so that there are alternative streams contained in separate Matroska container files for at least one of video, audio, and subtitle content, adaptive streaming of the media contained within the Matroska container files can be achieved using HTTP requests or a similar stateless data transfer protocol. In many embodiments, a playback device requests the top level index file resident on the server and uses the index information to identify the streams that are available to the playback device. The playback device can then retrieve the indexes for one or more of the Matroska files and can use the indexes to request media from one or more of the streams contained within the Matroska container files using HTTP requests or using a similar stateless protocol. As noted above, many embodiments of the invention implement the indexes for each of the Matroska container files using a modified Cues element. In a number of embodiments, however, the encoded media for each stream is contained within a standard Matroska container file and separate index file(s) can also be provided for each of the container files. Based upon the streaming conditions experienced by the playback device, the playback device can select media from alternative streams encoded at different bitrates. When the media from each of the streams is inserted into the Matroska container file in the manner outlined above, transitions between streams can occur upon the completion of playback of media within a Cluster element. Therefore, the size of the Cluster elements (i.e. the duration of the encoded media within the Cluster elements) is typically chosen so that the playback device is able to respond quickly enough to changing streaming conditions and to instructions from the user that involve utilization of a trick play track. The smaller the Cluster elements (i.e. the smaller the duration of the encoded media within each Cluster element), the higher the overhead associated with requesting each Cluster element. Therefore, a tradeoff exists between the responsiveness of the playback device to changes in streaming conditions and the effective data rate of the adaptive streaming system for a given set of streaming conditions (i.e. the portion of the available bandwidth actually utilized to transmit encoded media). In many embodiments, the size of the Cluster elements is chosen so that each Cluster element contains two seconds of encoded media. In other embodiments, the duration of the encoded media can be greater or less than two seconds and/or the duration of the encoded media can vary from Cluster element to Cluster element.
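

The tradeoff between Cluster duration and request overhead can be illustrated with a back-of-envelope calculation; the per-request overhead figure below is an assumed, illustrative value rather than a measured one.

# Illustrative calculation only: shorter Clusters require more HTTP requests
# per second of media, so a larger fraction of the transferred bytes is
# request/response overhead rather than encoded media.
def effective_media_fraction(cluster_seconds, media_bitrate_bps,
                             per_request_overhead_bytes=800):
    media_bytes = media_bitrate_bps / 8 * cluster_seconds
    return media_bytes / (media_bytes + per_request_overhead_bytes)

# At 2 Mbps, two-second Clusters keep roughly 99.8% of the transferred bytes
# as media, while quarter-second Clusters drop that to roughly 98.7%.
print(effective_media_fraction(2.0, 2_000_000))
print(effective_media_fraction(0.25, 2_000_000))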


Communication between a playback device or client and an HTTP server during the playback of media encoded in separate streams contained within Matroska container files indexed by a top level index file in accordance with an embodiment of the invention is illustrated in FIG. 8. In the illustrated embodiment, the playback device 200 commences playback by requesting the top level index file from the server 202 using an HTTP request or a similar protocol for retrieving data. The server 202 provides the bytes corresponding to the request. The playback device 200 then parses the top level index file to identify the URIs of each of the Matroska container files containing the streams of encoded media derived from a specific piece of source media. The playback device can then request the byte ranges corresponding to headers of one or more of the Matroska container files via HTTP or a similar protocol, where the byte ranges are determined using the information contained in the URI for the relevant Matroska container files (see discussion above). The server returns the following information in response to a request for the byte range containing the headers of a Matroska container file:


ELEM(“EBML”)
 ELEM(“SEEKHEAD”)
 ELEM(“SEGMENTINFO”)
 ELEM(“TRACKS”)


The EBML element is typically processed by the playback device to ensure that the file version is supported. The SeekHead element is parsed to find the location of the Matroska index elements, and the SegmentInfo element contains two key elements utilized in playback: TimecodeScale and Duration. The TimecodeScale specifies the timecode scale for all timecodes within the Segment of the Matroska container file and the Duration specifies the duration of the Segment based upon the TimecodeScale. The Tracks element contains the information used by the playback device to decode the encoded media contained within the Clusters element of the Matroska file. As noted above, adaptive bitrate streaming systems in accordance with embodiments of the invention can support different streams encoded using different encoding parameters including (but not limited to) frame rate and resolution. Therefore, the playback device can use the information contained within the Matroska container file's headers to configure the decoder every time a transition is made between encoded streams.
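

As a hedged illustration of how the SegmentInfo values are applied, Matroska timecodes are unitless integers scaled by the TimecodeScale (expressed in nanoseconds per tick, commonly 1,000,000 so that timecodes are in milliseconds); the small sketch below converts parsed values into seconds.

# Illustrative sketch: interpreting TimecodeScale and Duration parsed from
# the SegmentInfo element. TimecodeScale is nanoseconds per timecode tick.
def timecode_to_seconds(timecode, timecode_scale_ns=1_000_000):
    return timecode * timecode_scale_ns / 1e9

def segment_duration_seconds(duration_ticks, timecode_scale_ns=1_000_000):
    # Duration is expressed in TimecodeScale units within SegmentInfo.
    return duration_ticks * timecode_scale_ns / 1e9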


In many embodiments, the playback device does not retrieve the headers for all of the Matroska container files indexed in the top level index file. Instead, the playback device determines the stream(s) that will be utilized to initially commence playback and requests the headers from the corresponding Matroska container files. Depending upon the structure of the URIs contained within the top level index file, the playback device can either use information from the URIs or information from the headers of the Matroska container files to request byte ranges from the server that contain at least a portion of the index from the relevant Matroska container files. The byte ranges can correspond to the entire index. The server provides the relevant byte ranges containing the index information to the playback device, and the playback device can use the index information to request the byte ranges of Cluster elements containing encoded media. When the Cluster elements are received, the playback device can extract encoded media from the Block elements within the Cluster element, and can decode and play back the media within the Block elements in accordance with their associated Timecode attributes.


In the illustrated embodiment, the playback device 200 requests sufficient index information from the HTTP server prior to the commencement of playback to enable the playback device to stream the entirety of each of the selected streams using the index information. In other embodiments, the playback device continuously retrieves index information as media is played back. In several embodiments, all of the index information for the lowest bitrate stream is requested prior to playback so that the index information for the lowest bitrate stream is available to the playback device in the event that streaming conditions deteriorate rapidly during playback.


Switching Between Streams


The communications illustrated in FIG. 8 assume that the playback device continues to request media from the same streams (i.e. Matroska container files) throughout playback of the media. In reality, the streaming conditions experienced by the playback device are likely to change during the playback of the streaming media and the playback device can request media from alternative streams (i.e. different Matroska container files) to provide the best picture quality for the streaming conditions experienced by the playback device. In addition, the playback device may switch streams in order to perform a trick play function that utilizes a trick play track stream.


Communication between a playback device and a server when a playback device switches to a new stream in accordance with embodiments of the invention is illustrated in FIG. 9a. The communications illustrated in FIG. 9a assume that the index information for the new stream has not been previously requested by the playback device and that downloading of Cluster elements from the old stream proceeds while information is obtained concerning the Matroska container file containing the new stream. When the playback device 200 detects a change in streaming conditions, determines that a higher bitrate stream can be utilized at the present streaming conditions, or receives a trick play instruction from a user, the playback device can use the top level index file to identify the URI of a more appropriate alternative to at least one of the video, audio, or subtitle streams from which the playback device is currently requesting encoded media. The playback device can save the information concerning the current stream(s) and can request the byte ranges of the headers for the Matroska container file(s) containing the new stream(s) using the parameters of the corresponding URIs. Caching the information in this way can be beneficial when the playback device attempts to adapt the bitrate of the stream downward. When the playback device experiences a reduction in available bandwidth, the playback device ideally will quickly switch to a lower bitrate stream. Due to the reduced bandwidth experienced by the playback device, the playback device is unlikely to have additional bandwidth available to request header and index information. Ideally, the playback device utilizes all available bandwidth to download already requested higher rate Cluster elements and uses locally cached index information to start requesting Cluster elements from the Matroska container file(s) containing the lower bitrate stream(s).
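

One hedged way a playback device might choose among the alternative streams identified in the top level index file is sketched below; the stream records, the assumed bitrate field, and the safety margin are illustrative assumptions rather than a prescribed selection algorithm.

# Illustrative sketch only: select the highest-bitrate alternative stream
# whose declared bitrate fits within a safety margin of the measured
# throughput, falling back to the lowest bitrate stream otherwise.
def select_stream(alternative_streams, measured_bps, safety_margin=0.8):
    """alternative_streams is a list of records taken from the top level
    index file, each with an assumed 'bitrate' key in bits per second."""
    candidates = sorted(alternative_streams, key=lambda s: s["bitrate"])
    chosen = candidates[0]              # always keep a lowest-bitrate fallback
    for stream in candidates:
        if stream["bitrate"] <= measured_bps * safety_margin:
            chosen = stream
    return chosen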


Byte ranges for index information for the Matroska container file(s) containing the new stream(s) can be requested from the HTTP server 202 in a manner similar to that outlined above with respect to FIG. 8. At that point, the playback device can stop downloading Cluster elements from the previous stream(s) and can commence requesting the byte ranges of the appropriate Cluster elements from the Matroska container file(s) containing the new stream(s) from the HTTP server, using the index information from the Matroska container file(s) to identify the Cluster element(s) containing the encoded media following the encoded media in the last Cluster element retrieved by the playback device. As noted above, a smooth transition from one stream to another is facilitated by encoding each of the alternative streams so that corresponding Cluster elements start with the same Timecode element and an IDR frame.


When the playback device caches the header and the entire index for each stream that has been utilized in the playback of the media, the process of switching back to a previously used stream can be simplified. The playback device already has the header and index information for the Matroska container file containing the previously utilized stream, and the playback device can simply use this information to start requesting Cluster elements from the Matroska container file of the previously utilized stream via HTTP. Communication between a playback device and an HTTP server when switching back to a stream(s) for which the playback device has cached header and index information in accordance with an embodiment of the invention is illustrated in FIG. 9b. The process illustrated in FIG. 9b is ideally performed when adapting bitrate downwards, because a reduction in available resources can be exacerbated by a need to download index information in addition to media. The likelihood of interruption to playback is reduced by increasing the speed with which the playback device can switch between streams and reducing the amount of overhead data downloaded to achieve the switch.
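

A minimal sketch of such caching, assuming the header and index are parsed into in-memory records keyed by the container file's URI, is shown below; the fetch callback is an illustrative assumption.

# Illustrative sketch only: retain parsed header and index information per
# container file so that switching back to a previously used stream requires
# no additional header or index downloads.
stream_cache = {}   # keyed by the container file's URI

def get_stream_info(uri, fetch_header_and_index):
    """fetch_header_and_index performs the HTTP byte range requests and
    returns the parsed (header, cues); it is only invoked on a cache miss."""
    if uri not in stream_cache:
        stream_cache[uri] = fetch_header_and_index(uri)
    return stream_cache[uri]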


Although the present invention has been described in certain specific aspects, many additional modifications and variations would be apparent to those skilled in the art. It is therefore to be understood that the present invention may be practiced otherwise than specifically described, including various changes in the implementation such as utilizing encoders and decoders that support features beyond those specified within a particular standard with which they comply, without departing from the scope and spirit of the present invention. Thus, embodiments of the present invention should be considered in all respects as illustrative and not restrictive.

Claims
  • 1. A playback device, comprising: a set of one or more processors; and a non-volatile storage containing an application for causing the set of one or more processors to perform the steps of: obtaining top level index information identifying a plurality of alternative streams of video, an audio stream, and at least one trick play stream that are each stored in a separate container file, where: each video container file containing a given stream from the plurality of alternative streams of video comprises: portions of the given video stream within the video container file, where the portions of the given video stream comprise an encoded group of pictures that commences with a picture encoded without reference to another picture in the given video stream; and a video container index, where entries in the video container index indicate sizes of portions of the given video stream within the video container file; each trick play container file containing a given trick play stream from the at least one trick play stream comprises: frames of the given trick play stream, where each frame of the given trick play stream is a picture encoded without reference to another picture in the trick play stream; and a trick play container index, where entries in the trick play container index comprise a timecode and a location of a frame in the given trick play stream; requesting a video container index from a video container file containing a video stream from the plurality of alternative streams of video; requesting at least one portion of the video stream from the plurality of alternative streams of video using at least one entry from the video container index; decoding the at least one portion of the video stream from the plurality of alternative streams of video; receiving at least one user instruction to perform a visual search of the media; requesting a trick play container index from a trick play container file containing a trick play stream from the at least one trick play stream; requesting at least one frame of video from the at least one trick play stream using at least one entry from the trick play container index; and decoding and displaying the at least one frame of video from the at least one trick play stream.
  • 2. The playback device of claim 1, wherein the at least one trick play stream comprises a plurality of trick play streams.
  • 3. The playback device of claim 1, wherein a trick play stream from the at least one trick play stream is encoded at a lower bit rate than the plurality of alternative streams of video.
  • 4. The playback device of claim 3, wherein: the top level index information includes information indicating the portion of each video container file containing the video container index for the video container file; and the application further is for causing the set of one or more processors to perform the step of requesting a video container index from a video container file containing a video stream from the plurality of alternative streams of video by requesting the portion of the video container file indicated as containing the video container index for the video container file by the top level index information.
  • 5. The playback device of claim 1, wherein the application is further for causing the set of processors to perform the step of performing adaptive streaming with respect to the plurality of trick play streams.
  • 6. The playback device of claim 1, wherein the application is further for causing the set of processors to perform the steps of: manipulating at least one timecode assigned to the at least one frame of video from the at least one trick play stream prior to providing the at least one frame to a decoder on the playback device, where the at least one timecode is manipulated in order to achieve a rate of playback based on a desired rate for the visual search; and decoding and presenting the at least one frame of video from the at least one trick play stream in accordance with the manipulated timecodes.
  • 7. The playback device of claim 1, wherein the application is further for causing the set of processors to perform the steps of: detecting streaming conditions for the playback device; selecting a stream from the plurality of alternative streams of video based on the detected streaming conditions; and requesting portions of video from the selected stream.
  • 8. The playback device of claim 7, wherein: the application is further for causing the set of processors to perform the step of converting at least one entry in a video container index for a video container file containing the selected stream into a byte range within the video container file containing the selected stream corresponding to at least one portion of the selected stream; and the step of requesting portions of the selected stream further comprises the step of generating at least one byte range request for the at least one portion of the selected stream.
  • 9. The playback device of claim 1, wherein the encoded group of pictures is a single closed group of pictures that commences with a picture encoded without reference to another picture in the given video stream.
  • 10. The playback device of claim 1, wherein: the at least one user instruction to perform a visual search of the media indicates a time code; and the application further is for causing the set of one or more processors to perform the step of requesting at least one frame of video from the at least one trick play stream using at least one entry from the trick play container index based upon the time code.
  • 11. The playback device of claim 10, wherein the application further is for causing the set of one or more processors to perform the steps of: receiving at least one user instruction to resume playback of the media and determining an associated timecode; requesting at least one portion of the video stream from the plurality of alternative streams of video using at least one entry from a retrieved video container index based upon the associated timecode; and decoding the at least one portion of the video stream from the plurality of alternative streams of video.
  • 12. A method of playing back content on a playback device, comprising: obtaining top level index information identifying a plurality of alternative streams of video, an audio stream, and at least one trick play stream that are each stored in a separate container file, where: each video container file containing a given stream from the plurality of alternative streams of video comprises: portions of the given video stream within the video container file, where the portions of the given video stream comprise an encoded group of pictures that commences with a picture encoded without reference to another picture in the given video stream; and a video container index, where entries in the video container index indicate sizes of portions of the given video stream within the video container file; each trick play container file containing a given trick play stream from the at least one trick play stream comprises: frames of the given trick play stream, where each frame of the given trick play stream is a picture encoded without reference to another picture in the trick play stream; and a trick play container index, where entries in the trick play container index comprise a timecode and a location of a frame in the given trick play stream; requesting a video container index from a video container file containing a video stream from the plurality of alternative streams of video; requesting at least one portion of the video stream from the plurality of alternative streams of video using at least one entry from the video container index; decoding the at least one portion of the video stream from the plurality of alternative streams of video; receiving at least one user instruction to perform a visual search of the media; requesting a trick play container index from a trick play container file containing a trick play stream from the at least one trick play stream; requesting at least one frame of video from the at least one trick play stream using at least one entry from the trick play container index; and decoding and displaying the at least one frame of video from the at least one trick play stream.
  • 13. The method of claim 12, wherein the at least one trick play stream comprises a plurality of trick play streams.
  • 14. The method of claim 12, wherein a trick play stream from the at least one trick play stream is encoded at a lower bit rate than the plurality of alternative streams of video.
  • 15. The method of claim 12, further comprising performing adaptive streaming with respect to the plurality of trick play streams.
  • 16. The method of claim 12, further comprising: manipulating at least one timecode assigned to the at least one frame of video from the at least one trick play stream prior to providing the at least one frame to a decoder on the playback device, where the at least one timecode is manipulated in order to achieve a rate of playback based on a desired rate for the visual search; and decoding and presenting the at least one frame of video from the at least one trick play stream in accordance with the manipulated timecodes.
  • 17. The method of claim 12, further comprising: detecting streaming conditions for the playback device; selecting a stream from the plurality of alternative streams of video based on the detected streaming conditions; and requesting portions of video from the selected stream.
  • 18. The method of claim 17, further comprising: converting at least one entry in a video container index for a video container file containing the selected stream into a byte range within the video container file containing the selected stream corresponding to at least one portion of the selected stream; and generating at least one byte range request for the at least one portion of the selected stream.
  • 19. The method of claim 12, wherein the encoded group of pictures is a single closed group of pictures that commences with a picture encoded without reference to another picture in the given video stream.
  • 20. The method of claim 12, wherein: the top level index information includes information indicating the portion of each video container file containing the video container index for the video container file; and requesting a video container index from a video container file containing a video stream from the plurality of alternative streams of video comprises requesting the portion of the video container file indicated as containing the video container index for the video container file by the top level index information.
  • 21. The method of claim 12, wherein the at least one user instruction to perform a visual search of the media indicates a time code and the method further comprises requesting at least one frame of video from the at least one trick play stream using at least one entry from the trick play container index based upon the time code.
  • 22. The method of claim 21, further comprising: receiving at least one user instruction to resume playback of the media and determining an associated timecode; requesting at least one portion of the video stream from the plurality of alternative streams of video using at least one entry from a retrieved video container index based upon the associated timecode; and decoding the at least one portion of the video stream from the plurality of alternative streams of video.
  • 23. A playback device, comprising: a set of one or more processors; and a non-volatile storage containing an application for causing the set of one or more processors to perform the steps of: obtaining top level index information identifying a plurality of alternative streams of video, an audio stream, and at least one trick play stream that are each stored in a separate container file, where: each video container file containing a given stream from the plurality of alternative streams of video comprises: portions of the given video stream within the video container file, where the portions of the given video stream comprise an encoded group of pictures that commences with a picture encoded without reference to another picture in the given video stream; and a video container index, where entries in the video container index indicate sizes of portions of the given video stream within the video container file; each trick play container file containing a given trick play stream from the at least one trick play stream comprises: frames of the given trick play stream, where each frame of the given trick play stream is a picture encoded without reference to another picture in the trick play stream; and a trick play container index, where entries in the trick play container index comprise a timecode and a location of a frame in the given trick play stream; requesting a video container index from a video container file containing a video stream from the plurality of alternative streams of video; requesting at least one portion of the video stream from the plurality of alternative streams of video using at least one entry from the video container index; decoding the at least one portion of the video stream from the plurality of alternative streams of video; receiving at least one user instruction to perform a visual search of the media indicating a time code; requesting a trick play container index from a trick play container file containing a trick play stream from the at least one trick play stream; requesting at least one frame of video from the at least one trick play stream using at least one entry from the trick play container index based upon the indicated time code; manipulating at least one timecode assigned to the at least one frame of video from the at least one trick play stream, where the at least one timecode is manipulated in order to achieve a rate of playback based on a desired rate for the visual search; and decoding and displaying the at least one frame of video from the at least one trick play stream in accordance with the at least one manipulated timecode.
CROSS REFERENCE TO RELATED APPLICATIONS

The current application is a continuation of U.S. patent application Ser. No. 15/881,351 entitled “Systems and Methods for Encoding Source Media in Matroska Container Files for Adaptive Bitrate Streaming Using Hypertext Transfer Protocol” to Braness et al., filed Jan. 26, 2018, which is a continuation of U.S. patent application Ser. No. 15/005,990 entitled “Systems and Methods for Encoding Source Media in Matroska Container Files for Adaptive Bitrate Streaming Using Hypertext Transfer Protocol” to Braness et al., filed Jan. 25, 2016, which is a continuation of U.S. patent application Ser. No. 13/221,794 entitled “Systems and Methods for Encoding Source Media in Matroska Container Files for Adaptive Bitrate Streaming Using Hypertext Transfer Protocol” to Braness et al., filed Aug. 30, 2011, which application claims priority to U.S. Provisional Application Ser. No. 61/430,110, entitled “Systems and Methods For Adaptive Bitrate Streaming of Media Stored in Matroska Files Using Hypertext Transfer Protocol”, filed Jan. 5, 2011. The disclosures of application Ser. Nos. 15/881,351, 15/005,990, 13/221,794 and 61/430,110 are hereby incorporated by reference in their entirety.

US Referenced Citations (736)
Number Name Date Kind
4009331 Goldmark et al. Feb 1977 A
4694357 Rahman et al. Sep 1987 A
4802170 Trottier Jan 1989 A
4964069 Ely Oct 1990 A
5119474 Beitel et al. Jun 1992 A
5274758 Beitel et al. Dec 1993 A
5361332 Yoshida et al. Nov 1994 A
5396497 Veltman Mar 1995 A
5404436 Hamilton Apr 1995 A
5420801 Dockter et al. May 1995 A
5420974 Morris et al. May 1995 A
5471576 Yee Nov 1995 A
5479303 Suzuki et al. Dec 1995 A
5487167 Dinallo et al. Jan 1996 A
5502766 Boebert et al. Mar 1996 A
5509070 Schull Apr 1996 A
5533021 Branstad et al. Jul 1996 A
5537408 Branstad et al. Jul 1996 A
5539908 Chen et al. Jul 1996 A
5541662 Adams et al. Jul 1996 A
5583652 Ware Dec 1996 A
5589993 Naimpally et al. Dec 1996 A
5621794 Matsuda et al. Apr 1997 A
5627936 Prasad May 1997 A
5633472 DeWitt et al. May 1997 A
5642171 Baumgartner et al. Jun 1997 A
5655117 Goldberg et al. Aug 1997 A
5664044 Ware Sep 1997 A
5675382 Bauchspies Oct 1997 A
5675511 Prasad et al. Oct 1997 A
5684542 Tsukagoshi Nov 1997 A
5715403 Stefik Feb 1998 A
5717816 Boyce et al. Feb 1998 A
5719786 Nelson et al. Feb 1998 A
5745643 Mishina Apr 1998 A
5751280 Abbott May 1998 A
5754648 Ryan et al. May 1998 A
5763800 Rossum et al. Jun 1998 A
5765164 Prasad et al. Jun 1998 A
5794018 Vrvilo et al. Aug 1998 A
5805700 Nardone et al. Sep 1998 A
5822524 Chen et al. Oct 1998 A
5828370 Moeller et al. Oct 1998 A
5841432 Carmel et al. Nov 1998 A
5844575 Reid Dec 1998 A
5848217 Tsukagoshi et al. Dec 1998 A
5867625 McLaren Feb 1999 A
5887110 Sakamoto et al. Mar 1999 A
5892900 Ginter et al. Apr 1999 A
5903261 Walsh et al. May 1999 A
5907597 Mark May 1999 A
5946446 Yanagihara Aug 1999 A
5956729 Goetz et al. Sep 1999 A
5959690 Toebes, VIII et al. Sep 1999 A
5970147 Davis Oct 1999 A
5999812 Himsworth Dec 1999 A
6031622 Ristow et al. Feb 2000 A
6038257 Brusewitz et al. Mar 2000 A
6044469 Horstmann Mar 2000 A
6046778 Nonomura et al. Apr 2000 A
6047100 McLaren Apr 2000 A
6058240 McLaren May 2000 A
6064794 McLaren et al. May 2000 A
6065050 DeMoney May 2000 A
6018611 Nogami et al. Jun 2000 A
6079566 Eleftheriadis et al. Jun 2000 A
6097877 Katayama et al. Aug 2000 A
6141754 Choy Oct 2000 A
6155840 Sallette Dec 2000 A
6169242 Fay et al. Jan 2001 B1
6175921 Rosen Jan 2001 B1
6195388 Choi et al. Feb 2001 B1
6204883 Tsukagoshi Mar 2001 B1
6222981 Rijckaert Apr 2001 B1
6282653 Berstis et al. Aug 2001 B1
6289450 Pensak et al. Sep 2001 B1
6292621 Tanaka et al. Sep 2001 B1
6308005 Ando et al. Oct 2001 B1
6330286 Lyons et al. Dec 2001 B1
6374144 Viviani et al. Apr 2002 B1
6389218 Gordon et al. May 2002 B2
6389473 Carmel et al. May 2002 B1
6395969 Fuhrer May 2002 B1
6397230 Carmel et al. May 2002 B1
6418270 Steenhof et al. Jul 2002 B1
6441754 Wang et al. Aug 2002 B1
6449719 Baker Sep 2002 B1
6466671 Maillard et al. Oct 2002 B1
6466733 Kim Oct 2002 B1
6510513 Danieli Jan 2003 B1
6510554 Gordon et al. Jan 2003 B1
6532262 Fukuda et al. Mar 2003 B1
6621979 Eerenberg et al. Sep 2003 B1
6625320 Nilsson et al. Sep 2003 B1
6658056 Duruöz et al. Dec 2003 B1
6665835 Gutfreund et al. Dec 2003 B1
6671408 Kaku Dec 2003 B1
6697568 Kaku Feb 2004 B1
6725281 Zintel et al. Apr 2004 B1
6771703 Oguz et al. Aug 2004 B1
6807306 Girgensohn et al. Oct 2004 B1
6810031 Hegde et al. Oct 2004 B1
6810389 Meyer Oct 2004 B1
6819394 Nomura et al. Nov 2004 B1
6850252 Hoffberg Feb 2005 B1
6856997 Lee et al. Feb 2005 B2
6859496 Boroczky et al. Feb 2005 B1
6917652 Lyu Jul 2005 B2
6931531 Takahashi Aug 2005 B1
6944621 Collart Sep 2005 B1
6944629 Shioi et al. Sep 2005 B1
6956901 Boroczky et al. Oct 2005 B2
6965724 Boccon-Gibod et al. Nov 2005 B1
6965993 Baker Nov 2005 B2
6985588 Glick et al. Jan 2006 B1
6988144 Luken et al. Jan 2006 B1
7007170 Morten Feb 2006 B2
7023924 Keller et al. Apr 2006 B1
7043473 Rassool et al. May 2006 B1
7103906 Katz et al. Sep 2006 B1
7127155 Ando et al. Oct 2006 B2
7150045 Koelle et al. Dec 2006 B2
7151832 Fetkovich et al. Dec 2006 B1
7151833 Candelore et al. Dec 2006 B2
7165175 Kollmyer et al. Jan 2007 B1
7185363 Narin et al. Feb 2007 B1
7197234 Chatterton Mar 2007 B1
7206940 Evans et al. Apr 2007 B2
7209892 Galuten et al. Apr 2007 B1
7231132 Davenport Jun 2007 B1
7237061 Boic Jun 2007 B1
7242772 Tehranchi Jul 2007 B1
7243346 Seth et al. Jul 2007 B1
7295673 Grab et al. Nov 2007 B2
7328345 Morten et al. Feb 2008 B2
7330875 Parasnis et al. Feb 2008 B1
7340528 Noblecourt et al. Mar 2008 B2
7349886 Morten et al. Mar 2008 B2
7356143 Morten Apr 2008 B2
7356245 Belknap et al. Apr 2008 B2
7366788 Jones et al. Apr 2008 B2
7376831 Kollmyer et al. May 2008 B2
7406174 Palmer Jul 2008 B2
7421411 Kontio et al. Sep 2008 B2
7454780 Katsube et al. Nov 2008 B2
7457359 Mabey et al. Nov 2008 B2
7472280 Giobbi Dec 2008 B2
7478325 Foehr Jan 2009 B2
7484103 Woo et al. Jan 2009 B2
7493018 Kim Feb 2009 B2
7499938 Collart Mar 2009 B2
7515710 Russell et al. Apr 2009 B2
7526450 Hughes et al. Apr 2009 B2
7594271 Zhuk et al. Sep 2009 B2
7610365 Kraft et al. Oct 2009 B1
7640435 Morten Dec 2009 B2
7689510 Lamkin et al. Mar 2010 B2
7720352 Belknap et al. May 2010 B2
7747853 Candelore et al. Jun 2010 B2
7761892 Ellis et al. Jul 2010 B2
7779097 Lamkin et al. Aug 2010 B2
7788271 Soma et al. Aug 2010 B2
7817608 Rassool et al. Oct 2010 B2
7869691 Kelly et al. Jan 2011 B2
7882034 Hug et al. Feb 2011 B2
7949703 Matsuzaki et al. May 2011 B2
7962942 Craner Jun 2011 B1
7974714 Hoffberg Jul 2011 B2
7991156 Miller Aug 2011 B1
8023562 Zheludkov et al. Sep 2011 B2
8046453 Olaiya Oct 2011 B2
8054880 Yu et al. Nov 2011 B2
8065708 Smyth et al. Nov 2011 B1
8069260 Speicher et al. Nov 2011 B2
8082442 Keljo et al. Dec 2011 B2
8195714 Mckibben et al. Jun 2012 B2
8201264 Grab et al. Jun 2012 B2
8225061 Greenebaum Jul 2012 B2
8233768 Soroushian et al. Jul 2012 B2
8245124 Gupta Aug 2012 B1
8249168 Graves Aug 2012 B2
8261356 Choi et al. Sep 2012 B2
8265168 Masterson et al. Sep 2012 B1
8270473 Chen et al. Sep 2012 B2
8270819 Vannier Sep 2012 B2
8275871 Ram et al. Sep 2012 B2
8289338 Priyadarshi et al. Oct 2012 B2
8291460 Peacock Oct 2012 B1
8296434 Miller et al. Oct 2012 B1
8311111 Xu et al. Nov 2012 B2
8311115 Gu et al. Nov 2012 B2
8321556 Chatterjee et al. Nov 2012 B1
8325800 Holcomb et al. Dec 2012 B2
8341715 Sherkin et al. Dec 2012 B2
8386621 Park Feb 2013 B2
8396114 Gu et al. Mar 2013 B2
8401900 Cansler et al. Mar 2013 B2
8407753 Kuo Mar 2013 B2
8412841 Swaminathan et al. Apr 2013 B1
8423889 Zagorie et al. Apr 2013 B1
8452110 Shoham et al. May 2013 B2
8456380 Pagan Jun 2013 B2
8472792 Butt et al. Jun 2013 B2
8473630 Galligan et al. Jun 2013 B1
8510303 Soroushian et al. Aug 2013 B2
8510404 Carmel et al. Aug 2013 B2
8515265 Kwon et al. Aug 2013 B2
8516529 Lajoie et al. Aug 2013 B2
8527645 Proffit et al. Sep 2013 B1
8595378 Cohn Nov 2013 B1
8606069 Okubo et al. Dec 2013 B2
8640166 Craner et al. Jan 2014 B1
8649669 Braness et al. Feb 2014 B2
8656183 Russell et al. Feb 2014 B2
8677428 Lewis et al. Mar 2014 B2
8681866 Jia Mar 2014 B1
8689267 Hunt Apr 2014 B2
8726264 Allen et al. May 2014 B1
RE45052 Li Jul 2014 E
8767825 Wang et al. Jul 2014 B1
8774609 Drake et al. Jul 2014 B2
8781122 Chan et al. Jul 2014 B2
8805109 Shoham et al. Aug 2014 B2
8806188 Braness et al. Aug 2014 B2
8832434 Apostolopoulos et al. Sep 2014 B2
8843586 Pantos et al. Sep 2014 B2
8856218 Inskip Oct 2014 B1
8908984 Shoham et al. Dec 2014 B2
8909922 Kiefer et al. Dec 2014 B2
8914534 Braness et al. Dec 2014 B2
8914836 Shivadas et al. Dec 2014 B2
8918535 Ma et al. Dec 2014 B2
8918636 Kiefer Dec 2014 B2
8918908 Ziskind et al. Dec 2014 B2
8948249 Sun et al. Feb 2015 B2
8997161 Priyadarshi et al. Mar 2015 B2
8997254 Amidei et al. Mar 2015 B2
9014471 Shoham et al. Apr 2015 B2
9025659 Soroushian et al. May 2015 B2
9042670 Carmel et al. May 2015 B2
9060207 Scherkus et al. Jun 2015 B2
9094737 Shivadas et al. Jul 2015 B2
9098335 Muthiah et al. Aug 2015 B2
9124773 Chan et al. Sep 2015 B2
9125073 Oyman et al. Sep 2015 B2
9184920 Grab et al. Nov 2015 B2
9191457 Van Der Schaar Nov 2015 B2
9197685 Soroushian Nov 2015 B2
9203816 Brueck et al. Dec 2015 B2
9210481 Braness et al. Dec 2015 B2
9215466 Zhai et al. Dec 2015 B2
9247311 Kiefer Jan 2016 B2
9247312 Braness et al. Jan 2016 B2
9247317 Shivadas et al. Jan 2016 B2
9253178 Blom et al. Feb 2016 B2
9264475 Shivadas et al. Feb 2016 B2
9294531 Zhang et al. Mar 2016 B2
9313510 Shivadas et al. Apr 2016 B2
9343112 Amidei et al. May 2016 B2
9344517 Shivadas et al. May 2016 B2
9344721 Dikvall May 2016 B2
9479805 Rothschild et al. Oct 2016 B2
9485546 Chen et al. Nov 2016 B2
9571827 Su et al. Feb 2017 B2
9584557 Panje et al. Feb 2017 B2
9584847 Ma et al. Feb 2017 B2
9621522 Kiefer et al. Apr 2017 B2
9706259 Chan et al. Jul 2017 B2
9712890 Shivadas et al. Jul 2017 B2
9798863 Grab et al. Oct 2017 B2
9813740 Panje et al. Nov 2017 B2
9866878 Van Der Schaar et al. Jan 2018 B2
9883204 Braness et al. Jan 2018 B2
9906785 Naletov et al. Feb 2018 B2
9967189 Patel et al. May 2018 B2
9967305 Braness May 2018 B2
10225299 van der Schaar et al. Mar 2019 B2
20010030710 Werner Oct 2001 A1
20010036355 Kelly et al. Nov 2001 A1
20010046299 Wasilewski et al. Nov 2001 A1
20010053222 Wakao et al. Dec 2001 A1
20020026560 Jordan et al. Feb 2002 A1
20020034252 Owen et al. Mar 2002 A1
20020051494 Yamaguchi et al. May 2002 A1
20020057898 Normile May 2002 A1
20020062313 Lee et al. May 2002 A1
20020076112 Devara Jun 2002 A1
20020087569 Fischer et al. Jul 2002 A1
20020091665 Beek et al. Jul 2002 A1
20020093571 Hyodo Jul 2002 A1
20020110193 Yoo et al. Aug 2002 A1
20020116481 Lee Aug 2002 A1
20020118953 Kim Aug 2002 A1
20020120934 Abrahams et al. Aug 2002 A1
20020136298 Anantharamu et al. Sep 2002 A1
20020143413 Fay et al. Oct 2002 A1
20020143547 Fay et al. Oct 2002 A1
20020147980 Satoda Oct 2002 A1
20020159528 Graziani et al. Oct 2002 A1
20020159598 Rubinstein et al. Oct 2002 A1
20020161462 Fay Oct 2002 A1
20020180929 Tseng et al. Dec 2002 A1
20020184159 Tadayon et al. Dec 2002 A1
20020184515 Oho et al. Dec 2002 A1
20020191112 Akiyoshi et al. Dec 2002 A1
20020191959 Lin et al. Dec 2002 A1
20020191960 Fujinami et al. Dec 2002 A1
20030001964 Masukura et al. Jan 2003 A1
20030002578 Tsukagoshi et al. Jan 2003 A1
20030005442 Brodersen et al. Jan 2003 A1
20030021296 Wee et al. Jan 2003 A1
20030031178 Haeri Feb 2003 A1
20030035488 Barrau Feb 2003 A1
20030035545 Jiang Feb 2003 A1
20030035546 Jiang et al. Feb 2003 A1
20030041257 Wee et al. Feb 2003 A1
20030061305 Copley et al. Mar 2003 A1
20030061369 Aksu et al. Mar 2003 A1
20030065777 Mattila et al. Apr 2003 A1
20030078930 Surcouf et al. Apr 2003 A1
20030093799 Kauffman et al. May 2003 A1
20030123855 Okada et al. Jul 2003 A1
20030128296 Lee Jul 2003 A1
20030133506 Haneda Jul 2003 A1
20030152370 Otomo et al. Aug 2003 A1
20030163824 Gordon et al. Aug 2003 A1
20030165328 Grecia Sep 2003 A1
20030174844 Candelore Sep 2003 A1
20030185302 Abrams Oct 2003 A1
20030185542 McVeigh et al. Oct 2003 A1
20030206558 Parkkinen et al. Nov 2003 A1
20030210821 Yogeshwar et al. Nov 2003 A1
20030216922 Gonzales et al. Nov 2003 A1
20030229900 Reisman Dec 2003 A1
20030231863 Eerenberg et al. Dec 2003 A1
20030231867 Gates et al. Dec 2003 A1
20030233464 Walpole et al. Dec 2003 A1
20030236836 Borthwick Dec 2003 A1
20030236907 Stewart et al. Dec 2003 A1
20040006701 Kresina Jan 2004 A1
20040021684 Millner Feb 2004 A1
20040024688 Bi et al. Feb 2004 A1
20040025180 Begeja et al. Feb 2004 A1
20040031058 Reisman Feb 2004 A1
20040039916 Aldis et al. Feb 2004 A1
20040047614 Green Mar 2004 A1
20040052501 Tam Mar 2004 A1
20040071453 Valderas Apr 2004 A1
20040081333 Grab et al. Apr 2004 A1
20040081434 Jung et al. Apr 2004 A1
20040093618 Baldwin et al. May 2004 A1
20040105549 Suzuki et al. Jun 2004 A1
20040114687 Ferris et al. Jun 2004 A1
20040117347 Seo et al. Jun 2004 A1
20040136698 Mock Jul 2004 A1
20040139335 Diamand et al. Jul 2004 A1
20040143760 Alkove et al. Jul 2004 A1
20040146276 Ogawa Jul 2004 A1
20040158878 Ratnakar et al. Aug 2004 A1
20040184534 Wang Sep 2004 A1
20040184616 Morten et al. Sep 2004 A1
20040202320 Amini et al. Oct 2004 A1
20040217971 Kim Nov 2004 A1
20040255115 DeMello et al. Dec 2004 A1
20040255236 Collart Dec 2004 A1
20050004875 Kontio et al. Jan 2005 A1
20050005025 Harville et al. Jan 2005 A1
20050015797 Noblecourt et al. Jan 2005 A1
20050038826 Bae et al. Feb 2005 A1
20050055399 Savchuk Mar 2005 A1
20050071280 Irwin et al. Mar 2005 A1
20050108320 Lord et al. May 2005 A1
20050114534 Lee May 2005 A1
20050114896 Hug May 2005 A1
20050149450 Stefik et al. Jul 2005 A1
20050180641 Clark Aug 2005 A1
20050183120 Jain et al. Aug 2005 A1
20050193070 Brown et al. Sep 2005 A1
20050193322 Lamkin et al. Sep 2005 A1
20050196147 Seo et al. Sep 2005 A1
20050204289 Mohammed et al. Sep 2005 A1
20050207442 Van Zoest et al. Sep 2005 A1
20050207578 Matsuyama et al. Sep 2005 A1
20050254508 Aksu et al. Nov 2005 A1
20050273695 Schnurr Dec 2005 A1
20050275656 Corbin et al. Dec 2005 A1
20060015580 Gabriel et al. Jan 2006 A1
20060026294 Virdi et al. Feb 2006 A1
20060026302 Bennett et al. Feb 2006 A1
20060036549 Wu Feb 2006 A1
20060037057 Xu Feb 2006 A1
20060052095 Vazvan Mar 2006 A1
20060053080 Edmonson et al. Mar 2006 A1
20060064605 Giobbi Mar 2006 A1
20060078301 Ikeda et al. Apr 2006 A1
20060093320 Hallberg et al. May 2006 A1
20060120378 Usuki et al. Jun 2006 A1
20060129909 Butt et al. Jun 2006 A1
20060168639 Gan et al. Jul 2006 A1
20060173887 Breitfeld et al. Aug 2006 A1
20060179239 Fluhr Aug 2006 A1
20060181965 Collart Aug 2006 A1
20060235880 Qian Oct 2006 A1
20060245727 Nakano et al. Nov 2006 A1
20060259588 Lerman et al. Nov 2006 A1
20060263056 Lin et al. Nov 2006 A1
20060267986 Bae Nov 2006 A1
20060274835 Hamilton et al. Dec 2006 A1
20060294164 Armangau et al. Dec 2006 A1
20070005333 Setiohardjo et al. Jan 2007 A1
20070031110 Rijckaert Feb 2007 A1
20070033419 Kocher et al. Feb 2007 A1
20070044010 Sull et al. Feb 2007 A1
20070047901 Ando et al. Mar 2007 A1
20070053513 Hoffberg Mar 2007 A1
20070058928 Naito et al. Mar 2007 A1
20070083617 Chakrabarti et al. Apr 2007 A1
20070086528 Mauchly et al. Apr 2007 A1
20070100757 Rhoads May 2007 A1
20070133603 Weaver Jun 2007 A1
20070136817 Nguyen Jun 2007 A1
20070140647 Kusunoki et al. Jun 2007 A1
20070154165 Hemmeryckz-Deleersnijder et al. Jul 2007 A1
20070168541 Gupta et al. Jul 2007 A1
20070168542 Gupta et al. Jul 2007 A1
20070178933 Nelson Aug 2007 A1
20070180125 Knowles et al. Aug 2007 A1
20070185982 Nakanowatari et al. Aug 2007 A1
20070192810 Pritchett et al. Aug 2007 A1
20070217339 Zhao Sep 2007 A1
20070217759 Dodd Sep 2007 A1
20070234391 Hunter et al. Oct 2007 A1
20070239839 Buday et al. Oct 2007 A1
20070255940 Ueno Nov 2007 A1
20070271317 Carmel et al. Nov 2007 A1
20070271385 Davis et al. Nov 2007 A1
20070274679 Yahata et al. Nov 2007 A1
20070277219 Toebes et al. Nov 2007 A1
20070277234 Bessonov et al. Nov 2007 A1
20070280298 Hearn et al. Dec 2007 A1
20070288745 Kwan Dec 2007 A1
20070292107 Yahata et al. Dec 2007 A1
20070297422 Matsuo et al. Dec 2007 A1
20080005175 Bourke et al. Jan 2008 A1
20080008455 De Lange et al. Jan 2008 A1
20080043832 Barkley et al. Feb 2008 A1
20080046718 Grab Feb 2008 A1
20080066099 Brodersen et al. Mar 2008 A1
20080066181 Haveson et al. Mar 2008 A1
20080077592 Brodie et al. Mar 2008 A1
20080086456 Rasanen et al. Apr 2008 A1
20080086747 Rasanen et al. Apr 2008 A1
20080101466 Swenson et al. May 2008 A1
20080104633 Noblecourt et al. May 2008 A1
20080120330 Reed et al. May 2008 A1
20080120342 Reed et al. May 2008 A1
20080120389 Bassali et al. May 2008 A1
20080126248 Lee et al. May 2008 A1
20080131078 Jeong Jun 2008 A1
20080137541 Agarwal et al. Jun 2008 A1
20080137736 Richardson et al. Jun 2008 A1
20080151817 Fitchett Jun 2008 A1
20080172441 Speicher et al. Jul 2008 A1
20080187283 Takahashi Aug 2008 A1
20080192818 DiPietro et al. Aug 2008 A1
20080195664 Maharajh et al. Aug 2008 A1
20080195744 Bowra et al. Aug 2008 A1
20080201705 Wookey Aug 2008 A1
20080205860 Holtman Aug 2008 A1
20080209534 Keronen et al. Aug 2008 A1
20080240144 Kruse et al. Oct 2008 A1
20080256105 Nogawa et al. Oct 2008 A1
20080263354 Beuque et al. Oct 2008 A1
20080271102 Kienzle et al. Oct 2008 A1
20080279535 Hague et al. Nov 2008 A1
20080294453 Baird-Smith et al. Nov 2008 A1
20080298358 John et al. Dec 2008 A1
20080310454 Bellwood et al. Dec 2008 A1
20080310496 Fang Dec 2008 A1
20090031220 Tranchant et al. Jan 2009 A1
20090037959 Suh et al. Feb 2009 A1
20090048852 Burns et al. Feb 2009 A1
20090055546 Jung et al. Feb 2009 A1
20090060452 Chaudhri Mar 2009 A1
20090064341 Hartung et al. Mar 2009 A1
20090066839 Jung et al. Mar 2009 A1
20090097644 Haruki Apr 2009 A1
20090132599 Soroushian et al. May 2009 A1
20090132721 Soroushian et al. May 2009 A1
20090132824 Terada et al. May 2009 A1
20090136216 Soroushian et al. May 2009 A1
20090150557 Wormley et al. Jun 2009 A1
20090165148 Frey et al. Jun 2009 A1
20090168795 Segel et al. Jul 2009 A1
20090169181 Priyadarshi et al. Jul 2009 A1
20090172201 Carmel et al. Jul 2009 A1
20090178090 Oztaskent Jul 2009 A1
20090196139 Bates et al. Aug 2009 A1
20090201988 Gazier et al. Aug 2009 A1
20090217317 White et al. Aug 2009 A1
20090226148 Nesvadba et al. Sep 2009 A1
20090228395 Wegner et al. Sep 2009 A1
20090265737 Issa et al. Oct 2009 A1
20090276636 Grab et al. Nov 2009 A1
20090290706 Amini et al. Nov 2009 A1
20090290708 Schneider et al. Nov 2009 A1
20090293116 DeMello Nov 2009 A1
20090303241 Priyadarshi et al. Dec 2009 A1
20090307258 Priyadarshi et al. Dec 2009 A1
20090307267 Chen et al. Dec 2009 A1
20090310933 Lee Dec 2009 A1
20090313544 Wood et al. Dec 2009 A1
20090313564 Rottler et al. Dec 2009 A1
20090316783 Au et al. Dec 2009 A1
20090328124 Khouzam et al. Dec 2009 A1
20090328228 Schnell Dec 2009 A1
20100040351 Toma et al. Feb 2010 A1
20100057928 Kapoor et al. Mar 2010 A1
20100058405 Ramakrishnan et al. Mar 2010 A1
20100074324 Qian et al. Mar 2010 A1
20100074333 Au et al. Mar 2010 A1
20100083322 Rouse Apr 2010 A1
20100094969 Zuckerman et al. Apr 2010 A1
20100095121 Shetty et al. Apr 2010 A1
20100106968 Mori et al. Apr 2010 A1
20100107260 Orrell et al. Apr 2010 A1
20100111192 Graves May 2010 A1
20100138903 Medvinsky Jun 2010 A1
20100142917 Isaji Jun 2010 A1
20100158109 Dahlby et al. Jun 2010 A1
20100161825 Ronca et al. Jun 2010 A1
20100166060 Ezure et al. Jul 2010 A1
20100186092 Takechi et al. Jul 2010 A1
20100189183 Gu et al. Jul 2010 A1
20100218208 Holden Aug 2010 A1
20100228795 Hahn Sep 2010 A1
20100235472 Sood et al. Sep 2010 A1
20100250532 Soroushian et al. Sep 2010 A1
20100290761 Drake et al. Nov 2010 A1
20100299522 Khambete et al. Nov 2010 A1
20100306249 Hill et al. Dec 2010 A1
20100313225 Cholas et al. Dec 2010 A1
20100313226 Cholas et al. Dec 2010 A1
20100319014 Lockett et al. Dec 2010 A1
20100319017 Cook Dec 2010 A1
20100332595 Fullagar et al. Dec 2010 A1
20110002381 Yang et al. Jan 2011 A1
20110016225 Park et al. Jan 2011 A1
20110047209 Lindholm et al. Feb 2011 A1
20110055585 Lee Mar 2011 A1
20110060808 Martin et al. Mar 2011 A1
20110066673 Outlaw Mar 2011 A1
20110067057 Karaoguz et al. Mar 2011 A1
20110078440 Feng et al. Mar 2011 A1
20110080940 Bocharov Apr 2011 A1
20110082924 Gopalakrishnan Apr 2011 A1
20110096828 Chen et al. Apr 2011 A1
20110107379 Lajoie et al. May 2011 A1
20110116772 Kwon et al. May 2011 A1
20110126191 Hughes et al. May 2011 A1
20110129011 Cilli et al. Jun 2011 A1
20110135090 Chan et al. Jun 2011 A1
20110138018 Raveendran et al. Jun 2011 A1
20110142415 Rhyu Jun 2011 A1
20110145726 Wei et al. Jun 2011 A1
20110149753 Bapst et al. Jun 2011 A1
20110150100 Abadir Jun 2011 A1
20110153785 Minborg et al. Jun 2011 A1
20110153835 Rimac et al. Jun 2011 A1
20110184738 Kalisky et al. Jul 2011 A1
20110191439 Dazzi et al. Aug 2011 A1
20110191803 Baldwin et al. Aug 2011 A1
20110197237 Turner Aug 2011 A1
20110197267 Gravel et al. Aug 2011 A1
20110213827 Kaspar et al. Sep 2011 A1
20110222786 Carmel et al. Sep 2011 A1
20110225302 Park et al. Sep 2011 A1
20110225315 Wexler et al. Sep 2011 A1
20110225417 Maharajh et al. Sep 2011 A1
20110239078 Luby et al. Sep 2011 A1
20110246657 Glow Oct 2011 A1
20110246659 Bouazizi Oct 2011 A1
20110252118 Pantos et al. Oct 2011 A1
20110264530 Santangelo et al. Oct 2011 A1
20110268178 Park et al. Nov 2011 A1
20110276695 Maldaner et al. Nov 2011 A1
20110283012 Melnyk Nov 2011 A1
20110291723 Hashimoto Dec 2011 A1
20110302319 Ha et al. Dec 2011 A1
20110305273 He et al. Dec 2011 A1
20110314176 Frojdh et al. Dec 2011 A1
20110314500 Gordon Dec 2011 A1
20120005368 Knittle et al. Jan 2012 A1
20120017282 Kang et al. Jan 2012 A1
20120023251 Pyle et al. Jan 2012 A1
20120036365 Kyslov et al. Feb 2012 A1
20120036544 Chen et al. Feb 2012 A1
20120066360 Ghosh Mar 2012 A1
20120093214 Urbach Apr 2012 A1
20120114302 Randall May 2012 A1
20120124191 Lyon May 2012 A1
20120137336 Applegate et al. May 2012 A1
20120144117 Weare et al. Jun 2012 A1
20120144445 Bonta et al. Jun 2012 A1
20120166633 Baumback et al. Jun 2012 A1
20120170642 Braness et al. Jul 2012 A1
20120170643 Soroushian et al. Jul 2012 A1
20120170906 Soroushian et al. Jul 2012 A1
20120170915 Braness et al. Jul 2012 A1
20120173751 Braness et al. Jul 2012 A1
20120177101 Van Der Schaar Jul 2012 A1
20120179834 Van Der Schaar et al. Jul 2012 A1
20120201475 Carmel et al. Aug 2012 A1
20120201476 Carmel et al. Aug 2012 A1
20120233345 Hannuksela Sep 2012 A1
20120240176 Ma et al. Sep 2012 A1
20120254455 Adimatyam et al. Oct 2012 A1
20120257678 Zhou et al. Oct 2012 A1
20120260277 Kosciewicz Oct 2012 A1
20120263434 Wainner et al. Oct 2012 A1
20120265562 Daouk et al. Oct 2012 A1
20120278496 Hsu Nov 2012 A1
20120289147 Raleigh et al. Nov 2012 A1
20120294355 Holcomb et al. Nov 2012 A1
20120297039 Acuna et al. Nov 2012 A1
20120307883 Graves Dec 2012 A1
20120311094 Biderman et al. Dec 2012 A1
20120314778 Salustri et al. Dec 2012 A1
20120317235 Nguyen et al. Dec 2012 A1
20130007223 Luby et al. Jan 2013 A1
20130013730 Li et al. Jan 2013 A1
20130019107 Grab et al. Jan 2013 A1
20130019273 Ma et al. Jan 2013 A1
20130041808 Pham et al. Feb 2013 A1
20130044821 Braness et al. Feb 2013 A1
20130046849 Wolf Feb 2013 A1
20130046902 Villegas Nuñez et al. Feb 2013 A1
20130051554 Braness et al. Feb 2013 A1
20130054958 Braness et al. Feb 2013 A1
20130055084 Soroushian et al. Feb 2013 A1
20130058480 Ziskind et al. Mar 2013 A1
20130061040 Kiefer et al. Mar 2013 A1
20130061045 Kiefer et al. Mar 2013 A1
20130064466 Carmel et al. Mar 2013 A1
20130066838 Singla et al. Mar 2013 A1
20130094565 Yang et al. Apr 2013 A1
20130097309 Ma et al. Apr 2013 A1
20130114944 Soroushian et al. May 2013 A1
20130124859 Pestoni et al. May 2013 A1
20130128962 Rajagopalan et al. May 2013 A1
20130152767 Katz et al. Jun 2013 A1
20130166580 Maharajh Jun 2013 A1
20130166765 Kaufman Jun 2013 A1
20130166906 Swaminathan et al. Jun 2013 A1
20130170561 Hannuksela Jul 2013 A1
20130170764 Carmel et al. Jul 2013 A1
20130173513 Chu et al. Jul 2013 A1
20130179199 Ziskind et al. Jul 2013 A1
20130179589 McCarthy et al. Jul 2013 A1
20130179992 Ziskind et al. Jul 2013 A1
20130182952 Carmel et al. Jul 2013 A1
20130196292 Brennen et al. Aug 2013 A1
20130212228 Butler Aug 2013 A1
20130223812 Rossi Aug 2013 A1
20130226578 Bolton et al. Aug 2013 A1
20130226635 Fisher Aug 2013 A1
20130227081 Luby et al. Aug 2013 A1
20130227122 Gao Aug 2013 A1
20130301424 Kotecha et al. Nov 2013 A1
20130311670 Tarbox et al. Nov 2013 A1
20130329781 Su et al. Dec 2013 A1
20140003516 Soroushian Jan 2014 A1
20140019593 Reznik et al. Jan 2014 A1
20140037620 Ferree et al. Feb 2014 A1
20140052823 Gavade et al. Feb 2014 A1
20140059156 Freeman, II et al. Feb 2014 A1
20140096171 Shivadas et al. Apr 2014 A1
20140096269 Amidei et al. Apr 2014 A1
20140101722 Moore Apr 2014 A1
20140115650 Zhang et al. Apr 2014 A1
20140119432 Wang et al. May 2014 A1
20140140396 Wang et al. May 2014 A1
20140140417 Shaffer et al. May 2014 A1
20140143301 Watson et al. May 2014 A1
20140143431 Watson et al. May 2014 A1
20140143440 Ramamurthy et al. May 2014 A1
20140177734 Carmel et al. Jun 2014 A1
20140189065 Van Der Schaar et al. Jul 2014 A1
20140201382 Shivadas et al. Jul 2014 A1
20140211840 Butt et al. Jul 2014 A1
20140211859 Carmel et al. Jul 2014 A1
20140241420 Orton-jay et al. Aug 2014 A1
20140241421 Orton-jay et al. Aug 2014 A1
20140247869 Su et al. Sep 2014 A1
20140250473 Braness et al. Sep 2014 A1
20140258714 Grab Sep 2014 A1
20140269927 Naletov et al. Sep 2014 A1
20140269936 Shivadas et al. Sep 2014 A1
20140280763 Grab et al. Sep 2014 A1
20140297804 Shivadas et al. Oct 2014 A1
20140297881 Shivadas et al. Oct 2014 A1
20140355668 Shoham et al. Dec 2014 A1
20140359678 Shivadas et al. Dec 2014 A1
20140359679 Shivadas et al. Dec 2014 A1
20140359680 Shivadas et al. Dec 2014 A1
20140376720 Chan et al. Dec 2014 A1
20150006662 Braness Jan 2015 A1
20150026677 Stevens et al. Jan 2015 A1
20150049957 Shoham et al. Feb 2015 A1
20150063693 Carmel et al. Mar 2015 A1
20150067715 Koat et al. Mar 2015 A1
20150104153 Braness et al. Apr 2015 A1
20150117836 Amidei et al. Apr 2015 A1
20150117837 Amidei et al. Apr 2015 A1
20150139419 Kiefer et al. May 2015 A1
20150188758 Amidei et al. Jul 2015 A1
20150188842 Amidei et al. Jul 2015 A1
20150188921 Amidei et al. Jul 2015 A1
20150189017 Amidei et al. Jul 2015 A1
20150189373 Amidei et al. Jul 2015 A1
20150288996 Van Der Schaar et al. Oct 2015 A1
20150334435 Shivadas et al. Nov 2015 A1
20150373421 Chan et al. Dec 2015 A1
20160070890 Grab et al. Mar 2016 A1
20160112382 Kiefer et al. Apr 2016 A1
20160149981 Shivadas et al. May 2016 A1
20160219303 Braness et al. Jul 2016 A1
20170214947 Kiefer et al. Jul 2017 A1
20170280203 Chan et al. Sep 2017 A1
20180007451 Shivadas et al. Jan 2018 A1
20180060543 Grab et al. Mar 2018 A1
20180131980 Van Der Schaar et al. May 2018 A1
20180220153 Braness et al. Aug 2018 A1
20180262757 Naletov et al. Sep 2018 A1
20180332094 Braness Nov 2018 A1
20190045219 Braness et al. Feb 2019 A1
Foreign Referenced Citations (150)
Number Date Country
2010203605 May 2015 AU
2823829 Jan 2019 CA
1169229 Dec 1997 CN
1221284 Jun 1999 CN
1723696 Jan 2006 CN
757484 Feb 1997 EP
813167 Dec 1997 EP
0936812 Aug 1999 EP
0818111 Jan 2000 EP
1056273 Nov 2000 EP
1187483 Mar 2002 EP
1420580 May 2004 EP
1553779 Jul 2005 EP
1657835 May 2006 EP
1718074 Nov 2006 EP
2486517 Aug 2012 EP
2486727 Aug 2012 EP
2507995 Oct 2012 EP
2564354 Mar 2013 EP
2616991 Jul 2013 EP
2617192 Jul 2013 EP
2661696 Nov 2013 EP
2661875 Nov 2013 EP
2661895 Nov 2013 EP
2486727 Mar 2014 EP
2564354 Mar 2014 EP
2616991 Mar 2014 EP
2617192 Mar 2014 EP
2716048 Apr 2014 EP
2721826 Apr 2014 EP
2486517 Jun 2014 EP
2751990 Jul 2014 EP
2807821 Dec 2014 EP
2751990 Apr 2015 EP
1125765 Aug 2009 HK
08046902 Feb 1996 JP
08111842 Apr 1996 JP
08163488 Jun 1996 JP
08287613 Nov 1996 JP
09037225 Feb 1997 JP
11164307 Jun 1999 JP
11275576 Oct 1999 JP
11328929 Nov 1999 JP
2000201343 Jul 2000 JP
02001043668 Feb 2001 JP
2001209726 Aug 2001 JP
2001346165 Dec 2001 JP
2002164880 Jun 2002 JP
2002170363 Jun 2002 JP
2002518898 Jun 2002 JP
2002218384 Aug 2002 JP
2003179597 Jun 2003 JP
2003250113 Sep 2003 JP
2004013823 Jan 2004 JP
2004515941 May 2004 JP
2004172830 Jun 2004 JP
2004187161 Jul 2004 JP
2004234128 Aug 2004 JP
2004304767 Oct 2004 JP
2005027153 Jan 2005 JP
2005080204 Mar 2005 JP
2006155500 Jun 2006 JP
2006524007 Oct 2006 JP
2007036666 Feb 2007 JP
2007174375 Jul 2007 JP
2007235690 Sep 2007 JP
2007535881 Dec 2007 JP
2008235999 Oct 2008 JP
2009530917 Aug 2009 JP
5200204 Jun 2013 JP
2014506430 Mar 2014 JP
5723888 Apr 2015 JP
6038805 Dec 2016 JP
6078574 Jan 2017 JP
2017063453 Mar 2017 JP
2018160923 Oct 2018 JP
6453291 Jan 2019 JP
100221423 Sep 1999 KR
2002013664 Feb 2002 KR
1020020064888 Aug 2002 KR
100669616 Jan 2007 KR
20100106418 Oct 2010 KR
1020130133830 Dec 2013 KR
101874907 Jul 2018 KR
101917763 Nov 2018 KR
146026 Dec 2010 SG
1995015660 Jun 1995 WO
1996013121 May 1996 WO
1997031445 Apr 1998 WO
1999010836 Mar 1999 WO
1999065239 Dec 1999 WO
0104892 Jan 2001 WO
2001031497 May 2001 WO
2001050732 Jul 2001 WO
2001065762 Sep 2001 WO
2002001880 Jan 2002 WO
2002008948 Jan 2002 WO
2002035832 May 2002 WO
2002037210 May 2002 WO
2002054196 Jul 2002 WO
2003030000 Apr 2003 WO
2003096136 Nov 2003 WO
2004054247 Jun 2004 WO
2004097811 Nov 2004 WO
2004102571 Nov 2004 WO
2006018843 Feb 2006 WO
2006018843 Dec 2006 WO
2007044590 Apr 2007 WO
2007113836 Oct 2007 WO
2008010275 Jan 2008 WO
2008042242 Apr 2008 WO
2008086313 Jul 2008 WO
2007113836 Nov 2008 WO
2008135932 Nov 2008 WO
2007113836 Dec 2008 WO
2009065137 May 2009 WO
2009070770 Jun 2009 WO
2010060106 May 2010 WO
2010080911 Jul 2010 WO
2010089962 Aug 2010 WO
2010108053 Sep 2010 WO
2010111261 Sep 2010 WO
2010122447 Oct 2010 WO
2010147878 Dec 2010 WO
2011042898 Apr 2011 WO
2011042900 Apr 2011 WO
2011068668 Jun 2011 WO
2011103364 Aug 2011 WO
2011132184 Oct 2011 WO
2011135558 Nov 2011 WO
2012035533 Mar 2012 WO
2012035534 Mar 2012 WO
2012035534 Jul 2012 WO
2012094171 Jul 2012 WO
20120094181 Jul 2012 WO
20120094189 Jul 2012 WO
2012035533 Aug 2012 WO
2012162806 Dec 2012 WO
2012171113 Dec 2012 WO
2013030833 Mar 2013 WO
2013032518 Mar 2013 WO
2013103986 Jul 2013 WO
2013111126 Aug 2013 WO
2013032518 Sep 2013 WO
2013144942 Oct 2013 WO
2014145901 Sep 2014 WO
2014193996 Dec 2014 WO
2014193996 Feb 2015 WO
2015031982 Mar 2015 WO
Non-Patent Literature Citations (194)
Entry
RGB Networks, “Comparing Adaptive HTTP Streaming Technologies”, Nov. 2011, Retrieved from: http://btreport.net/wp-content/uploads/2012/02/RGB-Adaptive-HTTP-Streaming-Comparison-1211-01.pdf, 20 pgs.
Rosenberg et al., “Indicating User Agent Capabilities in the Session Initiation Protocol (SIP)”, Network Working Group, RFC 3840, Aug. 2004, 36 pgs.
Schulzrinne, H., “Real Time Streaming Protocol 2.0 (RTSP): draft-ietf-mmusic-rfc2326bis-27”, MMUSIC Working Group of the Internet Engineering Task Force (IETF), Mar. 9, 2011, 296 pgs. (presented in two parts).
Siglin, “HTTP Streaming: What You Need to Know”, streamingmedia.com, 2010, 15 pages.
Siglin, “Unifying Global Video Strategies, MP4 File Fragmentation for Broadcast, Mobile and Web Delivery”, Nov. 16, 2011, 16 pgs.
Silvio, “Adaptive HTTP streaming for open codecs”, Oct. 9, 2010, [retrieved on Mar. 2, 2018].
Tan, Yap-Peng et al., “Video transcoding for fast forward/reverse video playback”, IEEE ICIP, 2002, pp. 1-713 to 1-716.
Taxan, “AVel LinkPlayer2 for Consumer”, I-O Data USA—Products—Home Entertainment, printed May 4, 2007 from http://www.iodata.com/usa/products/products.php?cat=HNP&sc=AVEL&pld=AVLP2/DVDLA&ts=2&tsc, 1 pg.
Tripathi et al., “Improving Multimedia Streaming with Content-Aware Video Scaling”, Retrieved from: http://digitalcommons.wpi.edu/computerscience-pubs/96, 2001, 17 pgs.
Unknown, “AVI RIFF File Reference (Direct X 8.1 C++ Archive)”, printed from http://msdn.microsoft.com/archive/en-us/dx81_c/directx_cpp/htm/avirifffilereference.asp?fr . . . on Mar. 6, 2006, 7 pgs.
Unknown, “Entropy and Source Coding (Compression)”, TCOM 570, Sep. 1999, pp. 1-22.
Venkatramani et al., “Securing Media for Adaptive Streaming”, Multimedia 2003 Proceedings of the Eleventh ACM International Conference on Multimedia, Nov. 2-8, 2003, Berkeley, California, 4 pgs.
Wang et al., “Image Quality Assessment: From Error Visibility to Structural Similarity”, IEEE Transactions on Image Processing, Apr. 2004, vol. 13, No. 4, pp. 600-612.
Wu, Feng et al., “Next Generation Mobile Multimedia Communications: Media Codec and Media Transport Perspectives”, In China Communications, Oct. 2006, pp. 30-44.
Zambelli, “IIS Smooth Streaming Technical Overview”, Microsoft Corporation, Mar. 2009.
International Preliminary Report on Patentability for International Application No. PCT/US2007/063950, Report Completed Dec. 18, 2009, 3 pgs.
International Preliminary Report on Patentability for International Application No. PCT/US2008/083816, dated May 18, 2010, 6 pgs.
International Preliminary Report on Patentability for International Application No. PCT/US2010/56733, dated Jun. 5, 2012, 5 pgs.
International Preliminary Report on Patentability for International Application No. PCT/US2011/068276, dated Mar. 4, 2014, 23 pgs.
International Preliminary Report on Patentability for International Application PCT/US2011/067167, dated Feb. 25, 2014, 8 pgs.
International Preliminary Report on Patentability for International Application PCT/US2013/043181, dated Dec. 31, 2014, dated Jan. 8, 2015, 11 Pgs.
International Preliminary Report on Patentability for International Application PCT/US2014/039852, dated Dec. 1, 2015, dated Dec. 5, 2015, 8 Pgs.
International Search Report and Written Opinion for International Application No. PCT/US07/63950, completed Feb. 19, 2008; dated Mar. 19, 2008, 9 pgs.
International Search Report and Written Opinion for International Application No. PCT/US08/87999, completed Feb. 7, 2009, dated Mar. 19, 2009, 6 pgs.
International Search Report and Written Opinion for International Application No. PCT/US09/46588, completed Jul. 13, 2009, dated Jul. 23, 2009, 7 pgs.
International Search Report and Written Opinion for International Application No. PCT/US2004/041667, completed May 24, 2007, dated Jun. 20, 2007, 6 pgs.
International Search Report and Written Opinion for International Application No. PCT/US2005/025845, completed Feb. 5, 2007 and dated May 10, 2007, 8 pgs.
International Search Report and Written Opinion for International Application No. PCT/US2008/083816, completed Jan. 10, 2009, dated Jan. 22, 2009, 7 pgs.
International Search Report and Written Opinion for International Application No. PCT/US2010/020372, Completed Feb. 10, 2009, dated Mar. 1, 2010, 8 pgs.
International Search Report and Written Opinion for International Application No. PCT/US2010/56733, Completed Jan. 3, 2011, dated Jan. 14, 2011, 9 pgs.
International Search Report and Written Opinion for International Application No. PCT/US2011/067243, International Filing Date Dec. 23, 2011, Search Completed Apr. 24, 2012, dated May 8, 2012, 8 pgs.
International Search Report and Written Opinion for International Application No. PCT/US2013/043181, completed Nov. 27, 2013, dated Dec. 6, 2013, 12 pgs.
International Search Report and Written Opinion for International Application PCT/US14/30747, completed Jul. 30, 2014, dated Aug. 22, 2014, 7 Pgs.
International Search Report and Written Opinion for International Application PCT/US14/39852, completed Oct. 21, 2014, dated Dec. 5, 2014, 11 pgs.
International Search Report and Written Opinion for International Application PCT/US2011/066927, completed Apr. 3, 2012, dated Apr. 20, 2012, 14 pgs.
International Search Report and Written Opinion for International Application PCT/US2011/067167, completed Jun. 19, 2012, dated Jul. 2, 2012, 11 pgs.
International Search Report and Written Opinion for International Application PCT/US2011/068276, completed Jun. 19, 2013, dated Jul. 8, 2013, 24 pgs.
International Search Report and Written Opinion for PCT/US2013/020572, International Filing Date Jan. 7, 2013, Search Completed Mar. 19, 2013, dated Apr. 29, 2013, 10 pgs.
International Telecommunication Union, Telecommunication Standardization Sector of ITU, H.233, Line Transmission of Non-Telephone Signals, Confidentiality System for Audiovisual Services, ITU-T Recommendation H.233, Mar. 1993, 18 pgs.
ITS International, “Fleet System Opts for Mobile Server”, Aug. 26, 1999, Printed on Oct. 21, 2011 from http://www.itsinternational.com/News/article.cfm?recordID=547, 2 pgs.
Lifehacker—Boxqueue Bookmarklet Saves Videos for Later Boxee Watching, printed Jun. 16, 2009 from http://feeds.gawker.com/~r/lifehacker/full/~3/OHvDmrlgZZc/boxqueue-bookmarklet-saves-videos-for-late-boxee-watching, 2 pgs.
Linksys Wireless-B Media Adapter Reviews, printed May 4, 2007 from http://reviews.cnet.com/Linksys_Wireless_B_Media_Adapter/4505-6739_7-30421900.html?tag=box, 5 pgs.
Linksys, KISS DP-500, printed May 4, 2007 from http://www.kiss-technology.com/?p=dp500, 2 pgs.
Linksys®: “Enjoy your digital music and pictures on your home entertainment center, without stringing wires!”, Model No. WMA 11B, printed May 9, 2007 from http://www.linksys.com/servlet/Satellite?c=L_Product_C2&childpagename=US/Layout&cid=1115416830950&p.
Microsoft Corporation, “Chapter 8, Multimedia File Formats” 1991, Microsoft Windows Multimedia Programmer's Reference, 3 cover pgs, pp. 8-1 to 8-20.
Microsoft Media Platform: Player Framework, “Microsoft Media Platform: Player Framework v2.5 (formerly Silverlight Media Framework)”, May 3, 2011, 2 pages.
Microsoft Media Platform: Player Framework, “Silverlight Media Framework v1.1”, Jan. 2010, 2 pages.
Microsoft Windows® XP Media Center Edition 2005, Frequently asked Questions, printed May 4, 2007 from http://www.microsoft.com/windowsxp/mediacenter/evaluation/faq.mspx.
Microsoft Windows® XP Media Center Edition 2005: Features, printed May 9, 2007, from http://www.microsoft.com/windowsxp/mediacenter/evaluation/features.mspx, 4 pgs.
Office Action for Chinese Patent Application No. CN200880127596.4, dated May 6, 2014, 8 pgs.
Office Action for U.S. Appl. No. 13/223,210, dated Apr. 30, 2015, 14 pgs.
Office Action for U.S. Appl. No. 14/564,003, dated Apr. 17, 2015, 28 pgs.
Open DML AVI-M-JPEG File Format Subcommittee, “Open DML AVI File Format Extensions”, Version 1.02, Feb. 28, 1996, 29 pgs.
pcworld.com, Future Gear: PC on the HiFi, and the TV, from http://www.pcworld.com/article/id,108818-page,1/article.html, printed May 4, 2007, from IDG Networks, 2 pgs.
Qtv—About BroadQ, printed May 11, 2009 from http://www.broadq.com/en/about.php, 1 pg.
Supplementary European Search Report for Application No. EP 04813918, Search Completed Dec. 19, 2012, 3 pgs.
Supplementary European Search Report for Application No. EP 10729513, completed Dec. 9, 2013, 4 pgs.
Supplementary European Search Report for EP Application 11774529, completed Jan. 31, 2014, 2 pgs.
Supplementary European Search Report for International Application No. PCT/US2007063950, Report Completed Jan. 25, 2013, 8 pgs.
Windows Media Center Extender for Xbox, printed May 9, 2007 from http://www.xbox.com/en-US/support/systemuse/xbox/console/mediacenterextender.htm, 2 pgs.
Windows® XP Media Center Edition 2005, “Experience more entertainment”, retrieved from http://download.microsoft.com/download/c/9/a/c9a7000a-66b3-455b-860b-1c16f2eecfec/MCE.pdf on May 9, 2007, 2 pgs.
Invitation to Pay Add'l Fees Rcvd for International Application PCT/US14/39852, dated Sep. 25, 2014, 2 pgs.
3GPP TS 26.247, V1.3.0, 3rd Generation Partnership Project; Technical Specification Group Services and System Aspects; Transparent end-to-end Packet-switched Streaming Services (PSS); Progressive Download and Dynamic Adaptive Streaming over HTTP (3GP-DASH) (Release 10), Mar. 2011, 72 pgs.
“Adaptive Streaming Comparison”, Jan. 28, 2010, 5 pgs.
“Best Practices for Multi-Device Transcoding”, Kaltura Open Source Video, Printed on Nov. 27, 2013 from knowledge.kaltura.com/best-practices-multi-device-transcoding, 13 pgs.
“Container format (digital)”, printed Aug. 22, 2009 from http://en.wikipedia.org/wiki/Container_format_(digital), 4 pgs.
“Diagram | Matroska”, Dec. 17, 2010, Retrieved from http://web.archive.org/web/20101217114656/http://matroska.org/technical/diagram/index.html on Jan. 29, 2016, 5 pgs.
“Draft CR: Trick Mode for HTTP Streaming”, 3GPP TSG-SA4 Meeting #58, Apr. 26-30, 2010, Vancouver, Canada, S4-100237, 3 pgs.
“DVD-MPeg differences”, printed Jul. 2, 2009 from http://dvd.sourceforge.net/dvdinfo/dvdmpeg.html, 1 pg.
“IBM Closes Cryptolopes Unit,” Dec. 17, 1997, CNET News, Printed on Apr. 25, 2014 from http://news.cnet.com/IBM-closes-Cryptolopes-unit/2100-1001_3206465.html, 3 pages.
“Information Technology—Coding of Audio Visual Objects—Part 2: Visual” International Standard, ISO/IEC 14496-2, Third Edition, Jun. 1, 2004, pp. 1-724. (presented in three parts).
U.S. Appl. No. 13/905,804, “Notice of Allowance,” dated Aug. 12, 2015, 8 pgs.
Broadq—The Ultimate Home Entertainment Software, printed May 11, 2009 from http://web.archive.org/web/20030401122010/www.broadq.com/qcasttuner/, 1 pg.
Cloakware Corporation, “Protecting Digital Content Using Cloakware Code Transformation Technology”, Version 1.2, May 2002, pp. 1-10.
European Search Report Application No. EP 08870152, Search Completed May 19, 2011, dated May 26, 2011, 9 pgs.
European Search Report for Application 11855103.5, search completed Jun. 26, 2014, 9 pgs.
European Search Report for Application 11855237.1, search completed Jun. 12, 2014, 9 pgs.
European Supplementary Search Report for Application EP09759600, completed Jan. 25, 2011, 11 pgs.
Extended European Search Report for European Application EP10821672, completed Jan. 30, 2014, 3 pgs.
Extended European Search Report for European Application EP11824682, completed Feb. 6, 2014, 4 pgs.
Extended European Search Report for European Application No. 14763140.2, Search completed Sep. 26, 2016, dated Oct. 5, 2016, 9 Pgs.
Federal Computer Week, “Tool Speeds Info to Vehicles”, Jul. 25, 1999, 5 pages.
Final draft ETSI ES 202 109, V1.1.1, ETSI Standard, Terrestrial Trunked Radio (TETRA); Security; Synchronization mechanism for end-to-end encryption, Oct. 2002, 17 pgs.
HTTP Live Streaming Overview, Networking & Internet, Apple, Inc., Apr. 1, 2011, 38 pages.
IBM Corporation and Microsoft Corporation, “Multimedia Programming Interface and Data Specifications 1.0”, Aug. 1991, printed from http://www.kk.iij4u.or.jp/~kondo/wave/mpidata.txt on Mar. 6, 2006, 100 pgs.
InformationWeek, “Internet on Wheels”, InformationWeek: Front End: Daily Dose, Jul. 20, 1999, Printed on Mar. 26, 2014, 3 pgs.
International Preliminary Report for Application No. PCT/US2011/066927, Filed Dec. 22, 2011, Report dated Jul. 10, 2013, 13 pgs.
International Preliminary Report for International Application No. PCT/US2011/067243, International Filing Date Dec. 23, 2011, dated Jul. 10, 2013, 7 pgs.
International Preliminary Report on Patentability for International Application PCT/US14/30747, Report dated Sep. 15, 2015, dated Sep. 24, 2015, 6 pgs.
“DVD subtitles”, sam.zoy.org/writings/dvd/subtitles, dated Jan. 9, 2001, printed Jul. 2, 2009, 4 pgs.
“Final Committee Draft of MPEG-4 streaming text format”, International Organisation for Standardisation, Feb. 2004, 22 pgs.
“IBM Spearheading Intellectual Property Protection Technology for Information on the Internet; Cryptolope Containers Have Arrived”, May 1, 1996, Business Wire, Printed on Aug. 1, 2014 from http://www.thefreelibrary.com/IBM+Spearheading+Intellectual+Property+Protection+Technology+for...-a018239381, 6 pg.
“Information Technology—Coding of audio-visual objects—Part 14: MP4 file format”, International Standard, ISO/IEC 14496-14, First Edition, Nov. 15, 2003, 18 pages.
“Information Technology—Coding of audio-visual objects—Part 17: Streaming text”, International Organisation for Standardisation, Feb. 2004, 22 pgs.
“Information technology—Coding of audio-visual objects—Part 18: Font compression and streaming”, ISO/IEC 14496-18, First edition Jul. 1, 2004, 26 pgs.
“Information technology—Generic coding of moving pictures and associated audio information: Systems”, International Standard ISO/IEC 13818-1, Second Edition, Dec. 1, 2000, 174 pages (presented in two parts).
“Information-Technology—Generic coding of moving pictures and associated audio: Systems, Recommendation H.222.0”, International Standard, ISO/IEC 13818-1, Draft 1209, Apr. 25, 1995, 151 pages.
“Information-Technology—Generic coding of moving pictures and associated audio: Systems, Recommendation H.222.0”, International Standard, ISO/IEC 13818-1, Draft 1540, Nov. 13, 1994, 161 pages.
“Innovation of technology arrived”, I-O Data, Nov. 2004, Retrieved from http://www.iodata.com/catalogs/AVLP2DVDLA_Flyer200505.pdf on May 30, 2013, 2 pgs.
KISS Players, “KISS DP-500”, retrieved from http://www.kiss-technology.com/?p=dp500 on May 4, 2007, 1 pg.
“Matroska”, Wikipedia, Jul. 10, 2017, retrieved from https://en.wikipedia.org/wiki/Matroska on Jul. 20, 2017, 3 pages.
“Matroska Streaming | Matroska”, Retrieved from the Internet: URL:http://web.archive.org/web/20101217114310/http://matroska.org/technical/streaming/index.html [retrieved on Jan. 29, 2016], Dec. 17, 2010.
“MovieLabs Specification for Next Generation Video—Version 1.0”, Motion Picture Laboratories, Inc., 2013, Retrieved from: http://movielabs.com/ngvideo/MovieLabs%20Specification%20for%20Next%20Generation%20Video%20v1.0.pdf, 5 pgs.
“MPEG-2”, Wikipedia, Jun. 13, 2017, retrieved from https://en.wikipedia.org/wiki/MPEG-2 on Jul. 20, 2017, 13 pages.
“MPEG-4 File Format, Version 2”, Sustainability of Digital Formats: Planning for Library of Congress Collections, Retrieved from: https://www.loc.gov/preservation/digital/formats/fdd/fdd000155.shtml, Last updated Feb. 21, 2017, 8 pgs.
“MPEG-4 Part 14”, Wikipedia, Jul. 10, 2017, retrieved from https://en.wikipedia.org/wiki/MPEG-4_Part_14 on Jul. 20, 2017, 5 pages.
“Netflix turns on subtitles for PC, Mac streaming”, Yahoo! News, Apr. 21, 2010, Printed on Mar. 26, 2014, 3 pgs.
“OpenDML AVI File Format Extensions”, OpenDML AVI M-JPEG File Format Subcommittee, retrieved from www.the-labs.com/Video/odmlff2-avidef.pdf, Sep. 1997, 42 pgs.
“OpenDML AVI File Format Extensions Version 1.02”, OpenDML AVI M-JPEG File Format Subcommittee, Last revision: Feb. 28, 1996, Reformatting: Sep. 1997.
“QCast Tuner for PS2”, printed May 11, 2009 from http://web.archive.org/web/20030210120605/www.divx.com/software/detail.php?ie=39, 2 pgs.
“Series H: Audiovisual and Multimedia Systems Infrastructure of audiovisual services—Coding of moving video; High efficiency video coding”, International Telecommunication Union, ITU-T H.265, Apr. 2015, 634 pages (presented in six parts).
“Smooth Streaming Client”, The Official Microsoft IIS Site, Sep. 24, 2010, 4 pages.
“Specifications | Matroska”, Retrieved from the Internet: URL:http://web.archive.org/web/20100706041303/http://www.matroska.org/technical/specs/index.html [retrieved on Jan. 29, 2016], Jul. 6, 2010.
“Specifications | Matroska”, Dec. 17, 2010, [retrieved on Mar. 2, 2018], 12 pages.
Supplementary European Search Report for Application No. EP 10834935, International Filing Date Nov. 15, 2010, Search Completed May 27, 2014, 9 pgs.
“Supported Media Formats”, Supported Media Formats, Android Developers, Printed on Nov. 27, 2013 from developerandroid.com/guide/appendix/media-formats.html, 3 pgs.
“Text of ISO/IEC 14496-18/COR1, Font compression and streaming”, ITU Study Group 16—Video Coding Experts Group—ISO/IEC MPEG & ITU-T VCEG(ISO/IEC JTC1/SC29/WG11 and ITU-T SG16 06), No. N8664, Oct. 27, 2006, 8 pgs.
“Text of ISO/IEC 14496-18/FDIS, Coding of Moving Pictures and Audio”, ITU Study Group 16—Videocoding Experts Group—ISO/IEC MPEG & ITU-T VCEG(ISO/IEC JTC1/SC29/WG11 and ITU-T SG16 06), No. N6215, Dec. 2003, 26 pgs.
“Thread: SSME (Smooth Streaming Medial Element) config.xml review (Smooth Streaming Client configuration file)”, Printed on Mar. 26, 2014, 3 pgs.
“Transcoding Best Practices”, From movideo, Printed on Nov. 27, 2013 from code.movideo.com/Transcoding_Best_Practices, 5 pgs.
“Using HTTP Live Streaming”, iOS Developer Library, http://developer.apple.com/library/ios/#documentation/networkinginternet/conceptual/streamingmediaguide/UsingHTTPLiveStreaming/UsingHTTPLiveStreaming.html#//apple_ref/doc/uid/TP40008332-CH102-SW1, Feb. 11, 2014, 10 pgs.
“Video Manager and Video Title Set IFO file headers”, printed Aug. 22, 2009 from http://dvd.sourceforge.net/dvdinfo/ifo.htm, 6 pgs.
“What is a DVD?”, printed Aug. 22, 2009 from http://www.videohelp.com/dvd, 8 pgs.
“What is a VOB file”, http://www.mpucoder.com/DVD/vobov.html, printed on Jul. 2, 2009, 2 pgs.
“What's on a DVD?”, printed Aug. 22, 2009 from http://www.doom9.org/dvd-structure.htm, 5 pgs.
U.S. Appl. No. 13/224,298, “Final Office Action Received”, May 19, 2014, 26 pgs.
U.S. Appl. No. 13/905,804, “Non-Final Office Action Received”, Jul. 25, 2014, 15 pgs.
Adams et al., “Will http adaptive streaming become the dominant mode of video delivery in cable networks?”, https://www.nctatechnicalpapers.com/Paper/2011/2011-will-http-adaptive-streaming-become-the-dominant-mode-of-video-delivery-in-cable-networks-, 10 pgs.
Adhikari et al., “Unreeling Netflix: Understanding and Improving Multi-CDN Movie Delivery”, 2012 Proceedings IEEE InfoCom, Mar. 25-30, 2012, Orlando, Florida, 9 pgs.
Adzic et al., “Optimized Adaptive HTTP Streaming for Mobile Devices”, International Society for Optics and Photonics, Applications of Digital Image Processing XXXIV, vol. 8135, Sep. 2011, p. 81350T.
Akhshabi et al., “An Experimental Evaluation of Rate-Adaptation Algorithms in Adaptive Streaming over HTTP”, MMSys'11, Feb. 23-25, 2011, 12 pgs.
Anonymous, “Method for the encoding of a compressed video sequence derived from the same video sequence compressed at a different bit rate without loss of data”, ip.com, ip.com No. IPCOM000008165D, May 22, 2002, pp. 1-9.
Author Unknown, “Blu-ray Disc—Blu-ray Disc—Wikipedia, the free encyclopedia”, printed Oct. 30, 2008 from http://en.wikipedia.org/wiki/Blu-ray_Disc, 11 pgs.
Author Unknown, “Blu-ray Movie Bitrates Here—Blu-ray Forum”, printed Oct. 30, 2008 from http://forum.blu-ray.com/showthread.php?t=3338, 6 pgs.
Author Unknown, “MPEG-4 Video Encoder: Based on International Standard ISO/IEC 14496-2”, Patni Computer Systems, Ltd., printed Jan. 24, 2007, USA, pp. 1-15.
Author Unknown, “O'Reilly—802.11 Wireless Networks: The Definitive Guide, Second Edition”, printed Oct. 30, 2008 from http://oreilly.com/catalog/9780596100520, 2 pgs.
Author Unknown, “Tunneling QuickTime RTSP and RTP over HTTP”, Published by Apple Computer, Inc.: 1999 (month unknown), 6 pages.
Author Unknown, “Turbo-Charge Your Internet and PC Performance”, printed Oct. 30, 2008 from Speedtest.net—The Global Broadband Speed Test, 1 pg.
Author Unknown, “White paper, The New Mainstream Wireless LAN Standard”, Broadcom Corporation, Jul. 2003, 12 pgs.
Blasiak, “Video Transrating and Transcoding: Overview of Video Transrating and Transcoding Technologies”, Ingenient Technologies, TI Developer Conference, Aug. 6-8, 2002, 22 pgs.
Bloom et al., “Copy Protection for DVD Video”, Proceedings of the IEEE, vol. 87, No. 7, Jul. 1999, pp. 1267-1276.
Bross et al., “High Efficiency Video Coding (HEVC) text specification draft 10 (for FDIS & Last Call)”, Joint Collaborative Team on Video Coding (JCT-VC) of ITU-T SG 16 WP 3 and ISO/IEC JTC 1/SC 29/WG 11, Document: JCTVC-L1003_v34, 12th Meeting: Geneva, CH, Jan. 14-23, 2013 (presented in three parts).
Casares et al., “Simplifying Video Editing Using Metadata”, DIS2002, 2002, pp. 157-166.
Concolato et al., “Live HTTP Streaming of Video and Subtitles within a Browser”, MMSys 2013, Feb. 26-Mar. 1, 2013, Oslo, Norway, 5 pgs.
Deutscher, “IIS Transform Manager Beta—Using the MP4 to Smooth Task”, Retrieved from: https://web.archive.org/web/20130328111303/http://blog.johndeutscher.com/category/smooth-streaming, Blog post of Apr. 29, 2011, 14 pgs.
Eskicioglu et al., “An Integrated Approach to Encrypting Scalable Video”, Proceedings IEEE International Conference on Multimedia and Expo, Aug. 26-29, 2002, Lausanne, Switzerland, 4 pgs.
Fecheyr-Lippens, “A Review of HTTP Live Streaming”, Internet Citation, Jan. 25, 2010, pp. 1-37.
Gannes, “The Lowdown on Apple's HTTP Adaptive Bitrate Streaming”, GigaOM, Jun. 10, 2009, 12 pgs.
Garg et al., “An Experimental Study of Throughput for UDP and VoIP Traffic in IEEE 802.11b Networks”, Wireless Communications and Networking, Mar. 2003, pp. 1748-1753.
Gast, “When is 54 Not Equal to 54? A Look at 802.11a, b and g Throughput”, Aug. 8, 2003, printed Oct. 30, 2008 from www.oreillynet.com/pub/a/wireless/2003/08/08/wireless_throughput.html, 4 pgs.
Ghosh, “Enhancing Silverlight Video Experiences with Contextual Data”, Retrieved from: http://msdn.microsoft.com/en-us/magazine/ee336025.aspx, 2010, 15 pgs.
Griffith, Eric, “The Wireless Digital Picture Frame Arrives”, Wi-Fi Planet, printed May 4, 2007 from http://www.wi-fiplanet.com/news/article.php/3093141, Oct. 16, 2003, 3 pgs.
Hartung et al., “DRM Protected Dynamic Adaptive HTTP Streaming”, MMSys 2011 Proceedings of the Second Annual ACM Conference on Multimedia Systems, San Jose, California, Feb. 23-25, 2011, pp. 277-282.
Hurtado Guzman, Juan Esteban, “Development and Implementation of an Adaptive HTTP Streaming Framework for H264/MVC Coded Media”, Politecnico di Torino, Nov. 2010, 108 pgs.
Hwang et al., “Efficient and User Friendly Inter-domain Device Authentication/Access control for Home Networks”, Proceedings of the 2006 International Conference on Embedded and Ubiquitous Computing, Seoul, Korea, Aug. 1-4, 2006, pp. 131-140.
Inlet Technologies, “Adaptive Delivery to iDevices”, 2010, 2 pages.
Inlet Technologies, “Adaptive delivery to iPhone 3.0”, 2009, 2 pgs.
Inlet Technologies, “HTTP versus RTMP”, 2009, 3 pages.
Inlet Technologies, “The World's First Live Smooth Streaming Event: The French Open”, 2009, 2 pages.
Kaspar et al., “Using HTTP Pipelining to Improve Progressive Download over Multiple Heterogeneous Interfaces”, IEEE ICC proceedings, 2010, 5 pgs.
Kim, Kyuheon, “MPEG-2 ES/PES/TS/PSI”, Kyung-Hee University, Oct. 4, 2010, 66 pages.
Kozintsev et al., “Improving last-hop multicast streaming video over 802.11”, Workshop on Broadband Wireless Multimedia, Oct. 2004, pp. 1-10.
Kurzke et al., “Get Your Content Onto Google TV”, Google, Retrieved from: http://commondatastorage.googleapis.com/io2012/presentations/live%20to%20website/1300.pdf, 2012, 58 pgs.
Lang, “Expression Encoder, Best Practices for live smooth streaming broadcasting”, Microsoft Corporation, 2010, retrieved from http://www.streamingmedia.com/conferences/west2010/presentations/SMWest-12010-Expression-Encoder.pdf, 20 pgs.
Levkov, “Mobile Encoding Guidelines for Android Powered Devices”, Adobe Systems Inc., Addendum B, Dec. 22, 2010, 42 pgs.
Li et al., “Content-Aware Playout and Packet Scheduling for Video Streaming Over Wireless Links”, IEEE Transactions on Multimedia, vol. 10, No. 5, Aug. 2008, pp. 885-895.
Lian et al., “Efficient video encryption scheme based on advanced video coding”, Multimed. Tools Appl. vol. 38, 2008, pp. 75-89.
Liu et al., “A Formal Framework for Component Deployment”, OOPSLA 2006, Proceedings of the 21st Annual ACM SIGPLAN Conference on Object-Oriented Programming Systems, Language, and Applications, Portland, Oregon, Oct. 22-26, 2006, pp. 325-344.
Long et al., “Silver: Simplifying Video Editing with Metadata”, CHI 2003: New Horizons, Apr. 5-10, 2003, pp. 628-629.
Matroska, “Diagram”, Matroska, Technical/Info, Diagram, 2016, retrieved from https://www.matroska.org/technical/diagram/index.html on Jul. 20, 2017, 9 pages.
Matroska, “Specifications”, Matroska, Technical/Info, Specifications, Jun. 25, 2017, retrieved from https://www.matroska.org/technical/specs/index.html on Jul. 20, 2017, 20 pages.
Miras, “On Quality Aware Adaptation of Internet Video”, University of London, PhD dissertation, 2004, 181 pgs.
Morrison, “EA IFF 85 Standard for Interchange Format Files”, Jan. 14, 1985, printed from http://www.dcs.ed.ac.uk/home/mxr/gfx/2d/Iff.txt on Mar. 6, 2006, 24 pgs.
Moscoso, Pedro Gomes, “Interactive Internet TV Architecture Based on Scalable Video Coding”, Instituto Superior Técnico, Universidade Técnica de Lisboa, May 2011, 103 pgs.
MSDN, “Adaptive streaming, Expression Studio 2.0”, Apr. 23, 2009, 2 pgs.
Nelson, “Arithmetic Coding + Statistical Modeling = Data Compression: Part 1—Arithmetic Coding”, Doctor Dobb's Journal, Feb. 1991, USA, pp. 1-12.
Nelson, “Smooth Streaming Deployment Guide”, Microsoft Expression Encoder, Aug. 2010, 66 pgs.
Nelson, Michael, “IBM's Cryptolopes”, Complex Objects in Digital Libraries Course, Spring 2001, Retrieved from http://www.cs.odu.edu/~mln/teaching/unc/inls210/?method=display&pkg_name=cryptolopes.pkg&element_name=cryptolopes.ppt, 12 pages.
Noboru, “Play Fast and Fine Video on Web! codec”, Co.9 No. 12, Dec. 1, 2003, pp. 178-179.
Noe, A., “Matroska File Format (under construction!)”, Retrieved from the Internet: URL:http://web.archive.org/web/20070821155146/www.matroska.org/technical/specs/matroska.pdf [retrieved on Jan. 19, 2011], Jun. 24, 2007, pp. 1-51.
Noe, Alexander, “AVI File Format”, http://www.alexander-noe.com/video/documentation/avi.pdf, Dec. 14, 2006, pp. 1-26.
Noe, Alexander, “Definitions”, Apr. 11, 2006, retrieved from http://www.alexander-noe.com/video/amg/definitions.html on Oct. 16, 2013, 2 pages.
Ooyala, “Widevine Content Protection”, Ooyala Support Center for Developers. Ooyala, Inc., 2013. Jun. 3, 2013. http://support.ooyala.com/developers/documentation/concepts/player_v3_widevine_integration.html.
Oyman et al., “Quality of Experience for HTTP Adaptive Streaming Services”, IEEE Communications Magazine, Apr. 2012, vol. 50, No. 4, pp. 20-27, DOI: 10.1109/MCOM.2012.6178830.
Ozer, “The 2012 Encoding and Transcoding Buyers' Guide”, Streamingmedia.com, Retrieved from: http://www.streamingmedia.com/Articles/Editorial/Featured-Articles/The-2012-Encoding-and-Transcoding-Buyers-Guide-84210.aspx, 2012, 8 pgs.
Ozer, Jan, “Adaptive Streaming in the Field”, Streaming Media, Dec. 2010-Jan. 2011, pp. 36-47.
Padiadpu, Rashmi, “Towards Mobile Learning: A SCORM Player for the Google Android Platform”, Master Thesis, Hamburg University of Applied Sciences, 2008, 66 pgs.
Pantos, “HTTP Live Streaming, draft-pantos-http-live-streaming-10”, IETF Tools, Oct. 15, 2012, Retrieved from: http://tools.ietf.org/html/draft-pantos-http-live-streaming-10, 37 pgs.
Pantos, R, “HTTP Live Streaming: draft-pantos-http-live-streaming-06”, Published by the Internet Engineering Task Force (IETF), Mar. 31, 2011, 24 pages.
Papagiannaki et al., “Experimental Characterization of Home Wireless Networks and Design Implications”, INFOCOM 2006, 25th IEEE International Conference of Computer Communications, Proceedings, Apr. 2006, 13 pgs.
Peek, David, “Consumer Distributed File Systems”, Dissertation, Doctor of Philosophy, Computer Science and Engineering, The University of Michigan, 2009, 118 pgs.
Phamdo, “Theory of Data Compression”, printed on Oct. 10, 2003, 12 pgs.
De Cock et al., “Complexity-Based Consistent-Quality Encoding in the Cloud”, IEEE International Conference on Image Processing (ICIP), Date of Conference Sep. 25-28, 2016, Phoenix, AZ, pp. 1484-1488.
Lin et al., “Multipass Encoding for Reducing Pulsing Artifacts in Cloud Based Video Transcoding”, IEEE International Conference on Image Processing (ICIP), Date of Conference Sep. 27-30, 2015, Quebec City, QC, Canada, 5 pgs.
Related Publications (1)
Number Date Country
20190045220 A1 Feb 2019 US
Provisional Applications (1)
Number Date Country
61430110 Jan 2011 US
Continuations (3)
Number Date Country
Parent 15881351 Jan 2018 US
Child 16155840 US
Parent 15005990 Jan 2016 US
Child 15881351 US
Parent 13221794 Aug 2011 US
Child 15005990 US