Systems and methods for encoding video content

Information

  • Patent Grant
  • Patent Number
    11,483,609
  • Date Filed
    Wednesday, June 9, 2021
  • Date Issued
    Tuesday, October 25, 2022
Abstract
Systems and methods for encoding a plurality of alternative streams of video content using multiple encoders in accordance with embodiments of the invention are disclosed. An encoding system includes multiple encoders. Each of the encoders receives a source stream of video content that is divided into portions. Each of the encoders generates portions of the plurality of alternative streams from the portions of the source stream. The portions of the alternative streams generated by a particular encoder are stored in a container for the particular encoder. Each encoder also generates index information for the portion of the alternative stream generated by the encoder that is stored in a manifest for the encoder.
Description
FIELD OF THE INVENTION

The present invention generally relates to adaptive streaming and more specifically to systems that encode video data into streams having different maximum bitrates and playback devices that use the streams to obtain encoded video content from the encoded streams.


BACKGROUND

The term streaming media describes the playback of media on a playback device, where the media is stored on a server and continuously sent to the playback device over a network during playback. Typically, the playback device stores a sufficient quantity of media in a buffer at any given time during playback to prevent disruption of playback due to the playback device completing playback of all the buffered media prior to receipt of the next portion of media. Adaptive bit rate streaming or adaptive streaming involves detecting the present streaming conditions (e.g. the user's network bandwidth and CPU capacity) in real time and adjusting the quality of the streamed media accordingly. Typically, the source media is encoded at multiple bit rates and the playback device or client switches between streaming the different encodings depending on available resources.


Adaptive streaming solutions typically utilize either Hypertext Transfer Protocol (HTTP), published by the Internet Engineering Task Force and the World Wide Web Consortium as RFC 2616, or Real Time Streaming Protocol (RTSP), published by the Internet Engineering Task Force as RFC 2326, to stream media between a server and a playback device. HTTP is a stateless protocol that enables a playback device to request a byte range within a file. HTTP is described as stateless, because the server is not required to record information concerning the state of the playback device requesting information or the byte ranges requested by the playback device in order to respond to requests received from the playback device. RTSP is a network control protocol used to control streaming media servers. Playback devices issue control commands, such as “play” and “pause”, to the server streaming the media to control the playback of media files. When RTSP is utilized, the media server records the state of each client device and determines the media to stream based upon the instructions received from the client devices and the client's state.


In adaptive streaming systems, the source media is typically stored on a media server as a top level index file or manifest pointing to a number of alternate streams that contain the actual video and audio data. Each stream is typically stored in one or more container files. Different adaptive streaming solutions typically utilize different index and media containers. The Synchronized Multimedia Integration Language (SMIL) developed by the World Wide Web Consortium is utilized to create indexes in several adaptive streaming solutions including IIS Smooth Streaming developed by Microsoft Corporation of Redmond, Wash., and Flash Dynamic Streaming developed by Adobe Systems Incorporated of San Jose, Calif. HTTP Adaptive Bitrate Streaming developed by Apple Computer Incorporated of Cupertino, Calif. implements index files using an extended M3U playlist file (.M3U8), which is a text file containing a list of URIs that typically identify a media container file. The most commonly used media container formats are the MP4 container format specified in MPEG-4 Part 14 (i.e. ISO/IEC 14496-14) and the MPEG transport stream (TS) container specified in MPEG-2 Part 1 (i.e. ISO/IEC Standard 13818-1). The MP4 container format is utilized in IIS Smooth Streaming and Flash Dynamic Streaming. The TS container is used in HTTP Adaptive Bitrate Streaming.


The Matroska container is a media container developed as an open standard project by the Matroska non-profit organization of Aussonne, France. The Matroska container is based upon Extensible Binary Meta Language (EBML), which is a binary derivative of the Extensible Markup Language (XML). Decoding of the Matroska container is supported by many consumer electronics (CE) devices. The DivX Plus file format developed by DivX, LLC of San Diego, Calif. utilizes an extension of the Matroska container format (i.e. is based upon the Matroska container format, but includes elements that are not specified within the Matroska format).


To provide a consistent means for the delivery of media content over the Internet, the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC) have put forth the Dynamic Adaptive Streaming over HTTP (DASH) standard. The DASH standard specifies formats for the media content and the description of the content for delivery of MPEG content using HTTP. In accordance with DASH, each component of media content for a presentation is stored in one or more streams. Each of the streams is divided into segments. A Media Presentation Description (MPD) is a data structure that includes information about the segments in each of the streams and other information needed to present the media content during playback. A playback device uses the MPD to obtain the components of the media content using adaptive bit rate streaming for playback.


As the speed at which streaming content can be delivered has improved, streaming of live events, such as sporting events and concerts, has become popular. However, encoding the video content from a live event into streams for adaptive bitrate streaming in real time remains a problem. To do so, encoder server systems typically use hardware encoders that are specifically designed to encode the video content into the various streams. These specialized encoders are expensive to obtain. Thus, those skilled in the art are constantly striving to find lower cost alternatives to the specialized encoders.


SUMMARY OF THE INVENTION

Systems and methods for encoding video content into multiple streams having different maximum bitrates and for obtaining the video content using playback devices in accordance with some embodiments of the invention are disclosed. The process in accordance with many embodiments is performed in the following manner. Each server in an encoding system receives portions of a source stream of video content from a content provider system. Each of the encoders encodes a portion of the alternative streams using the portions of the source stream received by that encoder. The portions of the alternative streams encoded by each particular one of the encoders are stored in a container for that particular encoder. Each of the encoders then generates index information for the portions of the alternative streams generated by that encoder and stores the index information in a manifest for the portion of the alternative streams generated by the particular encoder.


In accordance with some embodiments of the invention, the portion of the alternative streams encoded by each of the encoders is one of the alternative streams, and a stream is generated by each encoder in the following manner. The encoder receives each portion of the source stream and encodes a segment of an alternative stream from each portion of the source stream to generate the segments of the alternative stream. In accordance with a number of these embodiments, each particular encoder has a particular set of parameters for generating a stream. Each encoder also generates index information for each of the segments generated by the particular encoder and stores the index information in a manifest for the particular encoder. In accordance with several of these embodiments, the alternative stream generated by a particular encoder has a particular maximum bitrate as a parameter. In accordance with still further of some of these embodiments, at least two alternative streams of the alternative streams generated by different ones of the encoders have a same maximum bitrate and at least one other parameter that is different. In accordance with many of these embodiments, the at least one other parameter is selected from a group of parameters consisting of aspect ratio, frame rate, and resolution.


In accordance with some embodiments of the invention, the system includes N encoders, where N is an integer, and each of the N encoders encodes 1/N of the portions of the source stream into segments of each of the alternative streams. In accordance with some embodiments, the encoding of 1/N of the portions of the source stream into segments in each of the alternative streams is performed in the following manner. Each encoder is assigned an Mth encoding order where M is an integer from 1 to N. Each encoder determines the Mth portion of the source stream received and every Nth portion received thereafter from the source stream as the set of portions of the source stream for the Mth encoder to encode. The encoding of a portion includes encoding the portion into a segment in each one of the alternative streams. The generating of the index information includes generating index information for each of the segments generated from each of the portions in each of the alternative streams and storing the index information for each of the segments generated from each portion in the set of portions in a manifest for the Mth encoder. In accordance with some of these embodiments, each Mth encoder discards each of the portions that is not in the set of portions for the Mth encoder.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates a network diagram of an adaptive bitrate streaming system in accordance with an embodiment of the invention.



FIG. 2 illustrates a block diagram of components of an encoding server system in accordance with an embodiment of the invention.



FIG. 3 illustrates a block diagram of components of a processing system in an encoder server system that encodes the video content into streams having different maximum bitrates in accordance with an embodiment of the invention.



FIG. 4 illustrates a block diagram of components of a processing system in a playback device that uses the encoded streams having different maximum bitrates to obtain the video content via adaptive bitrate streaming in accordance with an embodiment of the invention.



FIG. 5 illustrates a flow diagram for a process performed by each encoder in an encoder server system to encode video content into one of the streams used in an adaptive bitrate streaming system in accordance with an embodiment of the invention.



FIG. 6 illustrates a flow diagram for a process performed by each of N encoders in an encoder server system to encode each Nth segment of the video content in accordance with an embodiment of the invention.



FIG. 7 illustrates a flow diagram of a process performed by a playback device to obtain the manifest information for the streams and use the streams to obtain the video content using an adaptive bitrate system in accordance with an embodiment of the invention.





DETAILED DISCLOSURE OF THE INVENTION

Turning now to the drawings, systems and methods for encoding video content into streams for adaptive bitrate streaming and obtaining the streams using a playback device in accordance with some embodiments of the invention are illustrated. In accordance with some embodiments of this invention, an encoding system includes more than one encoder. In accordance with some of these embodiments, the encoders may be provided by software executed by a processing system in the encoding system. In accordance with many embodiments, the encoders may be provided by firmware in the encoding system. In accordance with a number of embodiments, the encoders are provided by hardware in the server system.


The encoding system receives, from a source, a source stream of video content that includes an embedded timestamp. In accordance with some embodiments, the video content is a live feed being recorded in real time. In accordance with some of these embodiments, the source stream of video content includes a timestamp in accordance with universal time.


In accordance with some embodiments, each encoder is used to generate a single stream of a set of streams to be used for adaptive bitrate streaming of content. In accordance with some of these embodiments, all of the encoders begin receiving portions of the source stream of video content and are synchronized using the embedded timestamp within the received portions. As each encoder receives portions of the source stream of video content from the source system, the encoders encode the received portions of the source stream of video content into segments of a stream having predefined parameters particular to each encoder. In accordance with some embodiments, the stream produced by each encoder has a different maximum bitrate (or different target average bitrate) than the streams being generated by the other encoders. In accordance with some other embodiments, other parameters including, but not limited to, aspect ratio, resolution, and frame rate may be varied in the streams being generated by the various encoders.
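
As a concrete illustration (not taken from the patent), the per-encoder parameters described above can be captured in a small profile table, with one profile per encoder. The values below are hypothetical, and the sketch assumes a Python-based encoding system.

    # Hypothetical encoding profiles, one per encoder in the encoding system.
    # Any combination of maximum bitrate, resolution, frame rate, and aspect
    # ratio could be used; these values are illustrative only.
    ENCODER_PROFILES = [
        {"name": "360p",    "max_bitrate": 800_000,   "resolution": (640, 360),   "frame_rate": 30},
        {"name": "720p",    "max_bitrate": 3_000_000, "resolution": (1280, 720),  "frame_rate": 30},
        {"name": "1080p",   "max_bitrate": 6_000_000, "resolution": (1920, 1080), "frame_rate": 30},
        # Same maximum bitrate as "1080p" but a different frame rate, matching the
        # case above in which only another parameter varies between two streams.
        {"name": "1080p60", "max_bitrate": 6_000_000, "resolution": (1920, 1080), "frame_rate": 60},
    ]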


Each encoder stores the generated portions in one or more container files for the generated stream in accordance with some embodiments of the invention. The encoder also generates index or manifest information for each of the generated portions of the streams and adds the generated index or manifest information to an index file or manifest in accordance with many embodiments of the invention. The process is repeated until the end of the source stream is received.


In accordance with some other embodiments, the encoding system includes a number of encoders (N) and each encoder encodes a portion (e.g. 1/N) of the source stream multiple times using different sets of encoding parameters to create segments for each of the streams in an adaptation set of streams. In accordance with some of these embodiments, each encoder is assigned a position in the processing order. Each encoder then begins to receive the source stream of the video content. As portions of the source stream are received by each of the encoders, the encoder determines whether a portion is one of the every Nth portions of the source stream assigned to the encoder. If a portion is not one of the assigned portions, the encoder discards the portion. If a portion is one of the assigned portions, the encoder encodes the received portion into segments in accordance with each of the profiles for the various streams in the set of streams and stores the segments in the container files for the appropriate streams. The encoder then generates index or manifest information for each of the generated segments and adds the information to an appropriate index file or manifest. In accordance with many of these embodiments, the index or manifest information is added to a manifest for segments produced by the encoder. In accordance with some other embodiments, the index or manifest information for each segment produced for the various streams is added to the index file or manifest maintained for the specific stream and/or stored in a database in memory for future use in generating a manifest file. The process is repeated by each of the encoders until each of the encoders receives the end of the source stream.


In accordance with some embodiments, the media content is stored in streams in accordance with the DASH standards. However, one skilled in the art will recognize that other formats, such as, but not limited to, the Matroska (MKV) container file format, may be used to store streams of the media content without departing from this invention.


The performance of an adaptive bitrate streaming system in accordance with some embodiments of the invention can be significantly enhanced by encoding each portion of the source video in each of the alternative streams in such a way that the segment of video is encoded in each stream as a single (or at least one) closed group of pictures (GOP) starting with an Instantaneous Decoder Refresh (IDR) frame, which is an intra frame. The playback device can switch between the alternative streams used during playback at the completion of the playback of a video segment and, irrespective of the stream from which a video segment is obtained, the first frame in the video segment will be an IDR frame that can be decoded without reference to any encoded media other than the encoded media contained within the video segment.


In a number of embodiments, the playback device obtains information concerning each of the available streams from the MPD and selects one or more streams to utilize in the playback of the media. The playback device can also request index information that indexes segments of the encoded video content stored within the relevant container files. The index information can be stored within the container files or separately from the container files in the MPD or in separate index files. The index information enables the playback device to request byte ranges corresponding to segments of the encoded video content within the container file containing specific portions of encoded video content via HTTP (or another appropriate stateful or stateless protocol) from the server. The playback device uses the index information to request segments of the video content from the alternative streams in accordance with some embodiments. Playback is continued with the playback device requesting segments of the encoded video content from a stream having video content that is encoded at a maximum bitrate that can be supported by the network conditions.
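
As an example of the byte-range mechanism described above, a playback device that has resolved a segment to a byte range within a container file can retrieve just that range with a standard HTTP range request. The following sketch uses Python's standard library; the URL and byte offsets are placeholders rather than values defined by the patent.

    import urllib.request

    def fetch_segment(container_url: str, first_byte: int, last_byte: int) -> bytes:
        """Request the byte range holding one encoded video segment over HTTP."""
        request = urllib.request.Request(
            container_url,
            headers={"Range": f"bytes={first_byte}-{last_byte}"},
        )
        with urllib.request.urlopen(request) as response:
            # A server honouring the range request answers 206 Partial Content.
            return response.read()

    # Hypothetical offsets taken from index information for a single segment.
    segment = fetch_segment("http://example.com/video_3mbps.mp4", 1_048_576, 2_097_151)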


In accordance with some embodiments of the invention, the playback device operates in the following manner to use the streams generated by the multiple encoders in the encoding system. The playback device requests the media content that includes the video content. In response to the request, the playback device receives the MPD or index file maintained and/or generated by each encoder. The playback device then uses embedded timestamps to join the MPDs or index files from the various encoders into a combined adaptation set of index information. The playback device then uses the index information from the combined adaptation set to perform adaptive bitrate streaming to obtain the video content. In accordance with some other embodiments, the server generates an MPD from the MPDs or index files generated by each encoder using the embedded timestamps and provides the MPD to the playback devices. The playback devices then use the MPD to perform adaptive bitrate streaming to obtain the video content.


The encoding of video content into multiple streams for use in adaptive bitrate streaming using multiple encoders and the process for obtaining the video content from the generated streams by a playback device using adaptive bitrate streaming in accordance with some embodiments of the invention is discussed further below.


Adaptive Streaming System Architecture


Turning now to FIG. 1, an adaptive streaming system that includes an encoding system that generates streams using multiple encoders in accordance with an embodiment of the invention is illustrated. The adaptive streaming system 10 includes a source encoding system 12 configured to encode source media content including video content as a number of alternative streams. In the illustrated embodiment, the source encoder is a single server. In other embodiments, the source encoder can be any processing device or group of processing devices including a processor and sufficient resources to perform the transcoding of source media (including but not limited to video, audio, and/or subtitles) into the alternative streams. Typically, the source encoding server 12 generates an MPD that includes an index indicating container files containing the streams and/or metadata information, at least a plurality of which are alternative streams. Alternative streams are streams that encode the same media content in different ways. In many instances, alternative streams encode media content (such as, but not limited to, video content and/or audio content) at different maximum bitrates. In a number of embodiments, the alternative streams of video content are encoded with different resolutions and/or at different frame rates. However, the source encoder system 12 uses multiple encoders to generate the alternative streams and each particular encoder generates an MPD for the segments of the stream or streams generated by the particular encoder. The MPDs generated by the various encoders and the container files are uploaded to an HTTP server 14. A variety of playback devices can then use HTTP or another appropriate stateless protocol to request portions of the MPDs, index files, and the container files via a network 16 such as the Internet.


In the illustrated embodiment, playback devices that can perform adaptive bitrate streaming using the MPDs from the various encoders include personal computers 18, CE players, and mobile phones 20. In accordance with some other embodiments, playback devices can include consumer electronics devices such as DVD players, Blu-ray players, televisions, set top boxes, video game consoles, tablets, virtual reality headsets, augmented reality headsets and other devices that are capable of connecting to a server via a communication protocol including (but not limited to) HTTP and playing back encoded media. Although a specific architecture is shown in FIG. 1, any of a variety of architectures including systems that perform conventional streaming and not adaptive bitrate streaming can be utilized that enable playback devices to request portions of the MPDs and the container files in accordance with various embodiments of the invention.


Encoder System


An encoder system that uses multiple encoders to encode video content into alternative streams for use in adaptive bitrate streaming in accordance with an embodiment of the invention is shown in FIG. 2. Encoding system 200 includes a router 205 and an encoding server 202 communicatively connected to router 205. One skilled in the art will recognize that any number of servers or processors may be connected to router 205 without departing from this invention and that only one server is shown for clarity and brevity. The encoding server includes multiple encoders 215-218. In accordance with some embodiments, each of the encoders 215-218 is an instantiation of software that is being executed by the processor from instructions stored in a memory to perform the decoding and/or encoding of the source content. In accordance with some other embodiments, one or more of encoders 215-218 is a particular hardware component in the server that encodes received content. In still other embodiments, one or more of the encoders may be a firmware component in which hardware and software are used to provide the encoder. The router provides an incoming source stream of video content to each of the encoders 215-218 of the server 210. In accordance with some embodiments, the router transmits a copy of the stream to each of the encoders. In accordance with some other embodiments, the server 210 receives the source stream and provides a copy of the incoming source stream to each of the encoders 215-218 as the source stream is received. The source stream includes embedded timing information.


Although a specific architecture of a server system is shown in FIG. 2, any of a variety of architectures including systems that encode video content from a received stream can be utilized in accordance with various embodiments of the invention.


Playback Device


Processes that provide the methods and systems for using the alternative streams generated by multiple encoders in accordance with some embodiments of this invention are executed by a playback device. The relevant components in a playback device that can perform the processes in accordance with an embodiment of the invention are shown in FIG. 3. One skilled in the art will recognize that playback devices may include other components that are omitted for brevity without departing from the described embodiments of this invention. The playback device 300 includes a processor 305, a non-volatile memory 310, and a volatile memory 315. The processor 305 is a processor, microprocessor, controller, or a combination of processors, microprocessors, and/or controllers that performs instructions stored in the volatile memory 315 or non-volatile memory 310 to manipulate data stored in the memory. The non-volatile memory 310 can store the processor instructions utilized to configure the playback device 300 to perform processes including processes for using alternative streams encoded by multiple encoders to obtain video content using adaptive bit rate streaming in accordance with some embodiments of the invention. In accordance with various other embodiments, the playback device may have hardware and/or firmware that can include the instructions and/or perform these processes. In accordance with still other embodiments, the instructions for the processes can be stored in any of a variety of non-transitory computer readable media appropriate to a specific application.


Servers


Processes in a method and system of encoding video content into streams for adaptive bitrate streaming using multiple encoders in accordance with an embodiment of this invention are performed by an encoder such as an encoding server. The relevant components in an encoding server that perform these processes in accordance with an embodiment of the invention are shown in FIG. 4. One skilled in the art will recognize that a server may include other components that are omitted for brevity without departing from the described embodiments of this invention. The server 400 includes a processor 405, a non-volatile memory 410, and a volatile memory 415. The processor 405 is a processor, microprocessor, controller, or a combination of processors, microprocessors, and/or controllers that performs instructions stored in the volatile memory 415 or non-volatile memory 410 to manipulate data stored in the memory. The non-volatile memory 410 can store the processor instructions utilized to configure the server 400 to perform processes including processes for encoding media content and/or generating marker information in accordance with some embodiments of the invention and/or data for the processes being utilized. In accordance with various embodiments, these instructions may be included in server software and/or firmware and can be stored in any of a variety of non-transitory computer readable media appropriate to a specific application. Although a specific server is illustrated in FIG. 4, any of a variety of servers configured to perform any number of processes can be utilized in accordance with various embodiments of the invention.


Encoding of Video Content into Alternative Streams for Adaptive Bitrate Streaming Using Multiple Encoders in an Encoding System


In accordance with some embodiments, an encoding system encodes video content into alternative streams for adaptive bitrate streaming using multiple encoders. In accordance with some embodiments of the invention, the encoders are software encoders that are instantiations of software instructions read from a memory that can be performed or executed by a processor. Software encoders may be used when it is desirable to reduce the cost of the encoders and/or to improve the scalability of the system as only processing and memory resources are needed to add additional encoders to the system. In accordance with many embodiments, one or more of the multiple encoders are hardware encoders. Hardware encoders are circuitry that is configured to perform the processes for encoding the received content into one or more streams. In accordance with a number of embodiments, one or more of the encoders may be firmware encoders. A firmware encoder combines some hardware components and some software processes to provide an encoder.


The video content may be received as a source stream from a content provider. In accordance with some embodiments, the video content is a live broadcast meaning the video content is being captured and streamed in real time. The video content may include time information. The time information may include, but is not limited to, a broadcast time, a presentation time and/or a time recorded. Each of the encoders receives the source stream of video content and generates portions of the alternative streams. In accordance with some embodiments, each of the multiple encoders produces a single stream having encoder specific parameters from the source stream. In accordance with some other embodiments, the encoding system includes a number of encoders (N) and each encoder encodes a portion (e.g. 1/N) of the source stream multiple times using different sets of encoding parameters to create segments for each of the streams in an adaptation set of streams. Processes for encoding alternative streams of video content from a source stream of video content using multiple encoders in accordance with some different embodiments of the invention are shown in FIGS. 5 and 6.


A flow chart of a process performed by at least one of a set of multiple encoders to generate a single stream of the alternative streams from the source stream of video content in accordance with an embodiment of the invention is shown in FIG. 5. In process 500, the encoder receives a portion of a source stream of video content that includes timing information (505). In accordance with some embodiments, the encoders may use time information received with the portion to determine at what point in the stream the encoder is to start encoding the stream. As the encoders are using the same timing information, the encoding performed by the encoders is synchronized such that the segments produced by each encoder include the same duration of video content to present in terms of presentation time and the segments are aligned. The encoder uses the portion of the source stream of video content to encode a segment of a stream of video content that has specified parameters particular to the encoder (510). In accordance with some embodiments, the specified parameters of the stream generated by each encoder include a different maximum bitrate.


In accordance with some other embodiments, the streams from two or more encoders have the same maximum bitrate and different aspect ratios, resolutions, and/or frame rates. The encoder also generates index or manifest information for the generated segment (515). The generated segment is stored in the container of the stream being generated by the encoder (520) and the index or manifest information is added to the manifest or index file for the stream stored in memory and/or delivered to client playback devices as an update (525). Process 500 repeats until the encoder receives the end of the stream and/or reception of the stream is halted in some other manner (530).
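
A minimal sketch of the per-encoder loop of process 500 follows, assuming a Python implementation. The codec itself is replaced by a placeholder, and the Stream container/manifest structure is hypothetical; the comments map each step back to the reference numerals in FIG. 5.

    from dataclasses import dataclass, field

    @dataclass
    class Stream:
        """Stand-in for one alternative stream: its container plus its manifest."""
        segments: list = field(default_factory=list)   # contents of the container file
        manifest: list = field(default_factory=list)   # index/manifest entries

    def encode_segment(portion, profile):
        """Placeholder for a real codec: tags the portion with the profile used."""
        return {"data": portion, "profile": profile["name"]}

    def run_single_stream_encoder(source, profile):
        """Per-encoder loop of FIG. 5: every received portion of the source stream
        becomes one segment of the single stream this encoder is responsible for."""
        stream = Stream()
        for timestamp, portion in source:               # (505) receive portion with timing info
            segment = encode_segment(portion, profile)  # (510) encode one aligned segment
            stream.segments.append(segment)             # (520) store the segment in the container
            stream.manifest.append(                     # (515)/(525) index information -> manifest
                {"timestamp": timestamp, "segment_index": len(stream.segments) - 1})
        return stream                                   # loop ends with the source stream (530)

    # Example: four portions of a source stream encoded against one hypothetical profile.
    feed = [(t, f"portion-{t}") for t in (0, 2, 4, 6)]
    low = run_single_stream_encoder(feed, {"name": "360p", "max_bitrate": 800_000})
    assert len(low.segments) == len(low.manifest) == 4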


Although various examples of processes performed by an encoder for encoding one of a set of alternative streams of video content are described above, one skilled in the art will recognize that other processes for encoding the streams may be performed in accordance with some embodiments of the invention.


In accordance with some other embodiments of the invention, the encoding system includes a number of encoders (N) and each encoder encodes a portion (e.g. 1/N) of the source stream multiple times using different sets of encoding parameters to create segments for each of the streams in an adaptation set of streams. Each encoder is assigned an encoder order position M where M is a number between 1 and N. The first encoder encodes the first portion of the source stream and every Nth portion received thereafter into segments of each of the alternative streams. The second encoder encodes the second received portion of the source stream and every Nth portion received thereafter into segments of each of the alternative streams. Likewise, the remaining encoders in the encoding order 1 through N encode the Mth received portion and every Nth portion received thereafter into segments of the various alternative streams, where M is the encoding order position. Thus, each encoder only encodes 1/N of the total number of segments of the source stream into the alternative streams. This type of encoding causes the availability of segments to lag by N*|segment duration| rather than being real time. Thus, the availability time of the segments may need to be added to information in the manifest to allow clients to know when the segments will be available. A flow diagram of a process performed by each of the N encoders to generate every Nth segment of the video content from the source stream in accordance with an embodiment of the invention is shown in FIG. 6.
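
The N*|segment duration| delay mentioned above can be made explicit when writing the manifest. The helper below is a hypothetical illustration of how an availability time could be computed for each segment; the patent states the delay itself, not this particular calculation.

    def segment_availability_time(portion_start: float, segment_duration: float,
                                  num_encoders: int) -> float:
        """Earliest time a segment can be expected in the manifest, reflecting the
        N * |segment duration| lag described above for N round-robin encoders."""
        return portion_start + num_encoders * segment_duration

    # Example: with four encoders and 2-second segments, a portion starting at
    # t = 100 s yields segments that become available around t = 108 s.
    assert segment_availability_time(100.0, 2.0, 4) == 108.0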


In process 600, the encoder receives a portion of a source stream of video content that includes timing information (605). In accordance with some embodiments, the encoders may use time information received with the portion to determine at what point in the stream the encoder is to start encoding the stream. As the encoders are using the same timing information from the source stream, the encoding performed by the encoders is synchronized such that the segments produced by each encoder include the same amount of video content to present in terms of presentation time and the segments produced by the different encoders are aligned with one another.


The encoder then determines whether the received portion is one of the Nth portions of the source stream to handle (610). In accordance with some embodiments, the determination may be performed by using a counter to count the received portions and comparing the current count to M to determine whether the count is equal to M or differs from M by a multiple of N, where M is the encoder's position in the encoding order. In accordance with some other embodiments, metadata for the received portions of the source stream is used to make the determination.
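
A sketch of the counter-based determination in (610), assuming portions are counted starting from 1 and encoder positions run from M = 1 to N; this is a hypothetical Python rendering rather than the patent's own implementation.

    def portion_belongs_to_encoder(portion_count: int, encoder_position: int,
                                   num_encoders: int) -> bool:
        """True if the encoder at position M of N should encode this portion.

        Encoder M handles the Mth received portion and every Nth portion after it,
        i.e. portions M, M + N, M + 2N, and so on.
        """
        return portion_count % num_encoders == encoder_position % num_encoders

    # With four encoders, encoder 2 keeps portions 2, 6, 10, ... and discards the rest.
    assert portion_belongs_to_encoder(6, 2, 4)
    assert not portion_belongs_to_encoder(7, 2, 4)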


If the received portion is not one of the portions the encoder is to handle, the encoder discards the received portion of the stream (615). If the received portion is determined to be one of the portions of the incoming stream that the encoder is to encode, the encoder encodes the portion into segments for each of the alternative streams based upon the specific parameters for each of the alternative streams (620). The parameters of the streams include, but are not limited to, maximum bitrate, resolution, aspect ratio, and frame rate.


The encoder also generates index or manifest information for each of the generated segments (625). This includes generating manifest information for each of the alternative streams. Each of the generated segments is stored in the container(s) of the appropriate alternative stream (630) and the index or manifest information is added to an appropriate manifest or index file(s) (635). In accordance with some embodiments, manifest or index information is added to the MPD for the alternative streams stored in memory. In accordance with some other embodiments, the manifest or index information is added to an MPD for the segments encoded by the encoder. In still other embodiments, the manifest or index information is delivered to client playback devices as an update. Process 600 repeats until the encoder receives the end of the stream and/or reception of the stream is halted in some other manner (640).


Although various examples of processes performed by an encoder for encoding every Nth segment for each of the alternative streams of video content are described above, one skilled in the art will recognize that other processes for encoding the portions for the streams may be performed in accordance with some embodiments of the invention.


Process Performed by a Playback Device to Obtain Video Content Using Alternative Streams Generated by Multiple Encoders


In accordance with some embodiments of the invention, a playback device uses the streams generated by the multiple encoders to obtain the video content for playback. In accordance with some embodiments of the invention, the playback device uses adaptive bit rate streaming to obtain the media content from the alternative streams generated using multiple encoders. To do so, the playback device must receive the MPD generated by each of the encoders to generate a combined adaptation set for use in obtaining the segments using adaptive bit rate streaming. In accordance with some embodiments, the combined adaptation set is generated based upon timestamps embedded in the MPD generated by each of the encoders. A process performed by a playback device to perform adaptive bitrate streaming in accordance with an embodiment of the invention is shown in FIG. 7.


In process 700, the playback device requests the index or manifest information for the video content (705). The playback device receives the MPD or index files generated by each encoder as the encoders generate the segments of the alternative streams (710). The playback device generates a combined adaptation set from the index or manifest information in the MPDs from the encoders using the embedded timestamps in each of the MPDs (715). In accordance with some embodiments, the combined adaptation set generated has the same format as an MPD and is generated by populating the combined adaptation set with the index or manifest information from the received MPDs. The combined adaptation set is used by the playback device to perform adaptive bit rate streaming to obtain the video content for playback (720). In accordance with some embodiments, the playback device uses the combined adaptation set to request portions of the video content. In accordance with some embodiments of the invention, the playback device monitors the network bandwidth for communications over the network between the playback device and the content provider system and selects streams of the audio and/or video content that are encoded at the highest maximum bitrates that can be handled in accordance with the measured bandwidth. Systems and methods for selecting a stream and commencing playback include those disclosed in U.S. Patent Application Publication 2013/0007200 entitled “Systems and Methods for Determining Available Bandwidth and Performing Initial Stream Selection When Commencing Streaming Using Hypertext Transfer Protocol” and U.S. Pat. No. 8,832,297 entitled “Systems and Methods for Performing Multiphase Adaptive Bitrate Streaming,” the disclosures of which are hereby incorporated by reference in their entirety. More particularly, the processes performed by a playback device to obtain the video content using adaptive bit rate streaming described in these references are incorporated herein by reference.
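
A minimal sketch of the combining and selection steps of process 700, under the assumption that each per-encoder manifest is available as a list of segment entries carrying the embedded timestamp, a maximum bitrate, and a URL; the data shapes and values are hypothetical.

    def combine_adaptation_set(per_encoder_manifests):
        """Merge the index information from every encoder's MPD/index file into a
        single adaptation set ordered by the embedded timestamps (715)."""
        combined = [entry for manifest in per_encoder_manifests for entry in manifest]
        combined.sort(key=lambda entry: entry["timestamp"])
        return combined

    def select_segment(combined, timestamp, measured_bandwidth):
        """For one presentation time, pick the segment whose maximum bitrate is the
        highest that the measured network bandwidth can sustain (720)."""
        candidates = [entry for entry in combined
                      if entry["timestamp"] == timestamp
                      and entry["max_bitrate"] <= measured_bandwidth]
        return max(candidates, key=lambda entry: entry["max_bitrate"]) if candidates else None

    # Hypothetical manifests from two encoders producing 1.5 Mbps and 3 Mbps streams.
    manifests = [
        [{"timestamp": 0, "max_bitrate": 1_500_000, "url": "low/seg0.mp4"},
         {"timestamp": 2, "max_bitrate": 1_500_000, "url": "low/seg1.mp4"}],
        [{"timestamp": 0, "max_bitrate": 3_000_000, "url": "high/seg0.mp4"},
         {"timestamp": 2, "max_bitrate": 3_000_000, "url": "high/seg1.mp4"}],
    ]
    adaptation_set = combine_adaptation_set(manifests)
    choice = select_segment(adaptation_set, timestamp=2, measured_bandwidth=2_000_000)
    assert choice["url"] == "low/seg1.mp4"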


Although a process performed by a playback device to obtain video content performing adaptive bit rate streaming using the alternative streams generated by multiple encoders in accordance with an embodiment of the invention is disclosed in FIG. 7, other processes may be performed by a playback device to obtain video content using alternative streams generated by multiple encoders in accordance with embodiments of the invention.


Although the present invention has been described in certain specific aspects, many additional modifications and variations would be apparent to those skilled in the art. Specifically, this invention may be used in association with trick play tracks where only certain frames of the trick-play track are shown in accordance with some embodiments of the invention. It is therefore to be understood that the present invention may be practiced otherwise than specifically described, including various changes in the implementation such as utilizing encoders and decoders that support features beyond those specified within a particular standard with which they comply, without departing from the scope and spirit of the present invention. Thus, embodiments of the present invention should be considered in all respects as illustrative and not restrictive.

Claims
  • 1. A method of encoding a plurality of alternative streams of video content from a source stream of video content using a plurality of encoders in an encoding system, the method comprising: receiving a plurality of portions of the source stream of video content; instantiating the plurality of encoders; and encoding the plurality of portions of the source stream of video content such that each portion of the source stream is encoded by one of the plurality of encoders, wherein different portions of the source stream are encoded by different encoders of the plurality of encoders; storing the different portions of the source stream in a container for the encoded stream; generating index information for the encoded stream; and storing the index information for the encoded stream in the container for the encoded stream.
  • 2. The method of claim 1, wherein the encoded stream comprises a particular set of parameters.
  • 3. The method of claim 2, wherein the particular set of parameters comprises a particular maximum bitrate.
  • 4. The method of claim 2, wherein the particular set of parameters comprises at least one selected from the group consisting of aspect ratio, frame rate, and resolution.
  • 5. The method of claim 1, further comprising causing the encoder encoding each of the portions to encode multiple times to create a plurality of alternative encoded streams each having a different set of parameters.
  • 6. The method of claim 5, wherein at least two alternative encoded streams have a same maximum bitrate and at least one other parameter that is different.
  • 7. The method of claim 5, further comprising generating index information for each of the plurality of alternative encoded streams.
  • 8. The method of claim 1, wherein the index information is stored in a manifest.
  • 9. The method of claim 8, wherein the manifest further includes information indicating when at least one of the encoded portions is available for streaming.
  • 10. An encoding system for encoding a source stream of video content comprising: at least one processor; memory accessible by the at least one processor; and instructions stored in the memory that when read by the at least one processor direct the processor to: receive a plurality of portions of the source stream of video content; instantiate the plurality of encoders; and encode the plurality of portions of the source stream of video content such that each portion of the source stream is encoded by one of the plurality of encoders, wherein different portions of the source stream are encoded by different encoders of the plurality of encoders; store the different portions of the source stream in a container for the encoded stream; generate index information for the encoded stream; and store the index information for the encoded stream in the container for the encoded stream.
  • 11. The system of claim 10, wherein the encoded stream comprises a particular set of parameters.
  • 12. The system of claim 11, wherein the particular set of parameters comprises a particular maximum bitrate.
  • 13. The system of claim 11, wherein the particular set of parameters comprises at least one selected from the group consisting of aspect ratio, frame rate, and resolution.
  • 14. The system of claim 10, wherein the instructions further direct the processor to cause the encoder encoding each of the portions to encode multiple times to create a plurality of alternative encoded streams each having a different set of parameters.
  • 15. The system of claim 14, wherein at least two alternative encoded streams have a same maximum bitrate and at least one other parameter that is different.
  • 16. The system of claim 14, wherein the instructions further direct the processor to generate index information for each of the plurality of alternative encoded streams.
  • 17. The system of claim 10, wherein the index information is stored in a manifest.
  • 18. The system of claim 17, wherein the manifest further includes information indicating when at least one of the encoded portions is available for streaming.
CROSS REFERENCE TO RELATED APPLICATIONS

The current application is a continuation of U.S. patent application Ser. No. 16/819,865, filed Mar. 16, 2020, entitled “Systems and Methods for Encoding Video Content” to Amidei et al., which is a continuation of U.S. patent application Ser. No. 16/208,210, filed Dec. 3, 2018, entitled “Systems and Methods for Encoding Video Content” to Amidei et al., which is a continuation of U.S. patent application Ser. No. 15/183,562, filed Jun. 15, 2016, entitled “Systems and Methods for Encoding Video Content” to Amidei et al., the disclosures of which are expressly incorporated by reference herein in their entirety.

US Referenced Citations (551)
Number Name Date Kind
3919474 Benson Nov 1975 A
4009331 Goldmark et al. Feb 1977 A
4694357 Rahman et al. Sep 1987 A
4802170 Trottier Jan 1989 A
4964069 Ely Oct 1990 A
4974260 Rudak Nov 1990 A
5119474 Beitel et al. Jun 1992 A
5274758 Beitel et al. Dec 1993 A
5361332 Yoshida et al. Nov 1994 A
5396497 Veltman Mar 1995 A
5400401 Wasilewski et al. Mar 1995 A
5404436 Hamilton Apr 1995 A
5420801 Dockter et al. May 1995 A
5420974 Morris et al. May 1995 A
5471576 Yee Nov 1995 A
5477272 Zhang et al. Dec 1995 A
5487167 Dinallo et al. Jan 1996 A
5533021 Branstad et al. Jul 1996 A
5537408 Branstad et al. Jul 1996 A
5539908 Chen et al. Jul 1996 A
5541662 Adams et al. Jul 1996 A
5574785 Ueno et al. Nov 1996 A
5583652 Ware Dec 1996 A
5600721 Kitazato Feb 1997 A
5621794 Matsuda et al. Apr 1997 A
5627936 Prasad May 1997 A
5633472 DeWitt et al. May 1997 A
5642171 Baumgartner et al. Jun 1997 A
5642338 Fukushima et al. Jun 1997 A
5655117 Goldberg et al. Aug 1997 A
5664044 Ware Sep 1997 A
5675382 Bauchspies Oct 1997 A
5675511 Prasad et al. Oct 1997 A
5684542 Tsukagoshi Nov 1997 A
5717394 Schwartz et al. Feb 1998 A
5719786 Nelson et al. Feb 1998 A
5745643 Mishina Apr 1998 A
5751280 Abbott May 1998 A
5763800 Rossum et al. Jun 1998 A
5765164 Prasad et al. Jun 1998 A
5794018 Vrvilo et al. Aug 1998 A
5805228 Proctor et al. Sep 1998 A
5805700 Nardone et al. Sep 1998 A
5813010 Kurano et al. Sep 1998 A
5822524 Chen et al. Oct 1998 A
5828370 Moeller et al. Oct 1998 A
5844575 Reid Dec 1998 A
5848217 Tsukagoshi et al. Dec 1998 A
5854873 Mori et al. Dec 1998 A
5903261 Walsh et al. May 1999 A
5907597 Mark May 1999 A
5907658 Murase et al. May 1999 A
5912710 Fujimoto Jun 1999 A
5923869 Kashiwagi et al. Jul 1999 A
5956729 Goetz et al. Sep 1999 A
5959690 Toebes, VIII et al. Sep 1999 A
5990955 Koz Nov 1999 A
6002834 Hirabayashi et al. Dec 1999 A
6005621 Linzer et al. Dec 1999 A
6009237 Hirabayashi et al. Dec 1999 A
6016381 Taira et al. Jan 2000 A
6031622 Ristow et al. Feb 2000 A
6046778 Nonomura et al. Apr 2000 A
6057832 Lev et al. May 2000 A
6065050 DeMoney May 2000 A
6079566 Eleftheriadis et al. Jun 2000 A
6157410 Izumi et al. Dec 2000 A
6169242 Fay et al. Jan 2001 B1
6192154 Rajagopalan et al. Feb 2001 B1
6195388 Choi et al. Feb 2001 B1
6204883 Tsukagoshi Mar 2001 B1
6246803 Gauch et al. Jun 2001 B1
6266483 Okada et al. Jul 2001 B1
6282320 Hasegawa et al. Aug 2001 B1
6308005 Ando et al. Oct 2001 B1
6320905 Konstantinides Nov 2001 B1
6330286 Lyons et al. Dec 2001 B1
6351538 Uz Feb 2002 B1
6373803 Ando et al. Apr 2002 B2
6374144 Viviani et al. Apr 2002 B1
6395969 Fuhrer May 2002 B1
6415031 Colligan et al. Jul 2002 B1
6430354 Watanabe Aug 2002 B1
6445877 Okada et al. Sep 2002 B1
6453115 Boyle Sep 2002 B1
6453116 Ando et al. Sep 2002 B1
6481012 Gordon et al. Nov 2002 B1
6504873 Vehvilaeinen Jan 2003 B1
6512883 Shim et al. Jan 2003 B2
6594699 Sahai et al. Jul 2003 B1
6614843 Gordon et al. Sep 2003 B1
6654933 Abbott et al. Nov 2003 B1
6658056 Duruöz et al. Dec 2003 B1
6665835 Gutfreund et al. Dec 2003 B1
6671408 Kaku Dec 2003 B1
6690838 Zhou Feb 2004 B2
6697568 Kaku Feb 2004 B1
6724944 Kalevo et al. Apr 2004 B1
6725281 Zintel et al. Apr 2004 B1
6751623 Basso et al. Jun 2004 B1
6807306 Girgensohn et al. Oct 2004 B1
6813437 Ando et al. Nov 2004 B2
6819394 Nomura et al. Nov 2004 B1
6856997 Lee et al. Feb 2005 B2
6859496 Boroczky et al. Feb 2005 B1
6871006 Oguz et al. Mar 2005 B1
6912513 Candelore Jun 2005 B1
6917652 Lyu Jul 2005 B2
6920179 Anand et al. Jul 2005 B1
6931531 Takahashi Aug 2005 B1
6944621 Collart Sep 2005 B1
6944629 Shioi et al. Sep 2005 B1
6956901 Boroczky et al. Oct 2005 B2
6957350 Demos Oct 2005 B1
6970564 Kubota et al. Nov 2005 B1
6983079 Kim Jan 2006 B2
6985588 Glick et al. Jan 2006 B1
6988144 Luken et al. Jan 2006 B1
7006757 Ando et al. Feb 2006 B2
7007170 Morten Feb 2006 B2
7020287 Unger Mar 2006 B2
7127155 Ando et al. Oct 2006 B2
7151832 Fetkovich et al. Dec 2006 B1
7188183 Paul et al. Mar 2007 B1
7209892 Galuten et al. Apr 2007 B1
7212726 Zetts May 2007 B2
7237061 Boic Jun 2007 B1
7242772 Tehranchi Jul 2007 B1
7274861 Yahata et al. Sep 2007 B2
7295673 Grab et al. Nov 2007 B2
7330875 Parasnis et al. Feb 2008 B1
7340528 Noblecourt et al. Mar 2008 B2
7349886 Morten et al. Mar 2008 B2
7352956 Winter et al. Apr 2008 B1
7356245 Belknap et al. Apr 2008 B2
7366788 Jones et al. Apr 2008 B2
7382879 Miller Jun 2008 B1
7397853 Kwon et al. Jul 2008 B2
7400679 Kwon et al. Jul 2008 B2
7418132 Hoshuyama Aug 2008 B2
7457359 Mabey et al. Nov 2008 B2
7457415 Reitmeier et al. Nov 2008 B2
7478325 Foehr Jan 2009 B2
7493018 Kim Feb 2009 B2
7499930 Naka et al. Mar 2009 B2
7499938 Collart Mar 2009 B2
7546641 Robert et al. Jun 2009 B2
7639921 Seo et al. Dec 2009 B2
7640435 Morten Dec 2009 B2
7711052 Hannuksela et al. May 2010 B2
7728878 Yea et al. Jun 2010 B2
7853980 Pedlow, Jr. et al. Dec 2010 B2
7864186 Robotham et al. Jan 2011 B2
7869691 Kelly et al. Jan 2011 B2
7945143 Yahata et al. May 2011 B2
8023562 Zheludkov et al. Sep 2011 B2
8046453 Olaiya Oct 2011 B2
8054880 Yu et al. Nov 2011 B2
8131875 Chen Mar 2012 B1
8169916 Pai et al. May 2012 B1
8225061 Greenebaum Jul 2012 B2
8233768 Soroushian et al. Jul 2012 B2
8243924 Chen et al. Aug 2012 B2
8249168 Graves Aug 2012 B2
8270473 Chen et al. Sep 2012 B2
8270819 Vannier Sep 2012 B2
8286213 Seo Oct 2012 B2
8289338 Priyadarshi et al. Oct 2012 B2
8311115 Gu et al. Nov 2012 B2
8312079 Newsome et al. Nov 2012 B2
8321556 Chatterjee et al. Nov 2012 B1
8369421 Kadono et al. Feb 2013 B2
8386621 Park Feb 2013 B2
8456380 Pagan Jun 2013 B2
8472792 Butt et al. Jun 2013 B2
8649669 Braness et al. Feb 2014 B2
8683066 Hurst et al. Mar 2014 B2
8689267 Hunt Apr 2014 B2
8751679 McHugh Jun 2014 B2
RE45052 Li Jul 2014 E
8768984 Priddle et al. Jul 2014 B2
8782268 Pyle et al. Jul 2014 B2
8818171 Soroushian et al. Aug 2014 B2
8819116 Tomay et al. Aug 2014 B1
8832297 Soroushian et al. Sep 2014 B2
8849950 Stockhammer et al. Sep 2014 B2
8914534 Braness et al. Dec 2014 B2
8924580 Begen Dec 2014 B2
8948249 Sun Feb 2015 B2
9021119 Van Der Schaar et al. Apr 2015 B2
9025659 Soroushian et al. May 2015 B2
9038116 Knox et al. May 2015 B1
9197944 Reisner Nov 2015 B2
9246971 Giladi Jan 2016 B2
9350990 Orton-Jay et al. May 2016 B2
9357210 Orton-Jay et al. May 2016 B2
9374604 Nemiroff Jun 2016 B2
9467708 Soroushian et al. Oct 2016 B2
9485526 Hybertson Nov 2016 B2
9510031 Soroushian et al. Nov 2016 B2
9571827 Su et al. Feb 2017 B2
9661049 Gordon May 2017 B2
9668007 Van Veldhuisen May 2017 B2
9712890 Shivadas et al. Jul 2017 B2
9955195 Soroushian Apr 2018 B2
10148989 Amidei et al. Dec 2018 B2
10452715 Soroushian et al. Oct 2019 B2
10595070 Amidei et al. Mar 2020 B2
10708587 Soroushian et al. Jul 2020 B2
11064235 Amidei et al. Jul 2021 B2
20010021276 Zhou Sep 2001 A1
20010030710 Werner Oct 2001 A1
20010052077 Fung et al. Dec 2001 A1
20010052127 Seo et al. Dec 2001 A1
20020034252 Owen et al. Mar 2002 A1
20020048450 Zetts Apr 2002 A1
20020051494 Yamaguchi et al. May 2002 A1
20020062313 Lee et al. May 2002 A1
20020067432 Kondo et al. Jun 2002 A1
20020076112 Devara Jun 2002 A1
20020085638 Morad et al. Jul 2002 A1
20020087569 Fischer et al. Jul 2002 A1
20020091665 Beek et al. Jul 2002 A1
20020093571 Hyodo Jul 2002 A1
20020094031 Ngai et al. Jul 2002 A1
20020110193 Yoo et al. Aug 2002 A1
20020118953 Kim Aug 2002 A1
20020120498 Gordon et al. Aug 2002 A1
20020135607 Kato et al. Sep 2002 A1
20020136298 Anantharamu Sep 2002 A1
20020141503 Kobayashi et al. Oct 2002 A1
20020143413 Fay et al. Oct 2002 A1
20020143547 Fay et al. Oct 2002 A1
20020147980 Satoda Oct 2002 A1
20020151992 Hoffberg et al. Oct 2002 A1
20020154125 Coleman Oct 2002 A1
20020154779 Asano et al. Oct 2002 A1
20020161462 Fay Oct 2002 A1
20020164024 Arakawa et al. Nov 2002 A1
20020169971 Asano et al. Nov 2002 A1
20020180929 Tseng et al. Dec 2002 A1
20020184159 Tadayon et al. Dec 2002 A1
20020191112 Akiyoshi et al. Dec 2002 A1
20020191960 Fujinami et al. Dec 2002 A1
20030001964 Masukura et al. Jan 2003 A1
20030002577 Pinder Jan 2003 A1
20030002578 Tsukagoshi et al. Jan 2003 A1
20030005442 Brodersen et al. Jan 2003 A1
20030012275 Boice et al. Jan 2003 A1
20030035488 Barrau Feb 2003 A1
20030044080 Frishman et al. Mar 2003 A1
20030053541 Sun et al. Mar 2003 A1
20030063675 Kang et al. Apr 2003 A1
20030077071 Lin et al. Apr 2003 A1
20030078930 Surcouf et al. Apr 2003 A1
20030093799 Kauffman et al. May 2003 A1
20030123855 Okada et al. Jul 2003 A1
20030128296 Lee Jul 2003 A1
20030133506 Haneda Jul 2003 A1
20030135742 Evans Jul 2003 A1
20030142594 Tsumagari et al. Jul 2003 A1
20030142872 Koyanagi Jul 2003 A1
20030152370 Otomo et al. Aug 2003 A1
20030165328 Grecia Sep 2003 A1
20030185302 Abrams Oct 2003 A1
20030185542 McVeigh et al. Oct 2003 A1
20030206558 Parkkinen et al. Nov 2003 A1
20030206717 Yogeshwar et al. Nov 2003 A1
20030216922 Gonzales et al. Nov 2003 A1
20030231863 Eerenberg et al. Dec 2003 A1
20030231867 Gates et al. Dec 2003 A1
20030236836 Borthwick Dec 2003 A1
20040001594 Krishnaswamy et al. Jan 2004 A1
20040006701 Kresina Jan 2004 A1
20040021684 Millner Feb 2004 A1
20040022391 Obrien Feb 2004 A1
20040025180 Begeja et al. Feb 2004 A1
20040028227 Yu Feb 2004 A1
20040037421 Truman Feb 2004 A1
20040047592 Seo et al. Mar 2004 A1
20040047607 Seo et al. Mar 2004 A1
20040047614 Green Mar 2004 A1
20040052501 Tam Mar 2004 A1
20040062306 Takahashi Apr 2004 A1
20040071453 Valderas Apr 2004 A1
20040076237 Kadono et al. Apr 2004 A1
20040081333 Grab et al. Apr 2004 A1
20040081434 Jung et al. Apr 2004 A1
20040093494 Nishimoto et al. May 2004 A1
20040101059 Joch et al. May 2004 A1
20040107356 Shamoon et al. Jun 2004 A1
20040114687 Ferris et al. Jun 2004 A1
20040117347 Seo et al. Jun 2004 A1
20040136698 Mock Jul 2004 A1
20040143760 Alkove et al. Jul 2004 A1
20040146276 Ogawa Jul 2004 A1
20040150747 Sita Aug 2004 A1
20040208245 Macinnis et al. Oct 2004 A1
20040217971 Kim Nov 2004 A1
20040255236 Collart Dec 2004 A1
20050013494 Srinivasan et al. Jan 2005 A1
20050015797 Noblecourt et al. Jan 2005 A1
20050038826 Bae et al. Feb 2005 A1
20050055399 Savchuk Mar 2005 A1
20050063541 Candelore Mar 2005 A1
20050076232 Kawaguchi Apr 2005 A1
20050089091 Kim et al. Apr 2005 A1
20050144468 Northcutt Jun 2005 A1
20050157948 Lee Jul 2005 A1
20050177741 Chen et al. Aug 2005 A1
20050180641 Clark Aug 2005 A1
20050193070 Brown et al. Sep 2005 A1
20050193322 Lamkin et al. Sep 2005 A1
20050196147 Seo et al. Sep 2005 A1
20050207442 Zoest et al. Sep 2005 A1
20050207578 Matsuyama et al. Sep 2005 A1
20050210145 Kim et al. Sep 2005 A1
20050243912 Kwon et al. Nov 2005 A1
20050243922 Magee Nov 2005 A1
20050265555 Pippuri Dec 2005 A1
20050273695 Schnurr Dec 2005 A1
20050275656 Corbin et al. Dec 2005 A1
20060013568 Rodriguez Jan 2006 A1
20060015813 Chung et al. Jan 2006 A1
20060039481 Shen et al. Feb 2006 A1
20060072672 Holcomb et al. Apr 2006 A1
20060078301 Ikeda et al. Apr 2006 A1
20060093320 Hallberg et al. May 2006 A1
20060126717 Boyce et al. Jun 2006 A1
20060129909 Butt et al. Jun 2006 A1
20060165163 Burazerovic et al. Jul 2006 A1
20060168639 Gan et al. Jul 2006 A1
20060173887 Breitfeld et al. Aug 2006 A1
20060181965 Collart Aug 2006 A1
20060182139 Bugajski et al. Aug 2006 A1
20060235880 Qian Oct 2006 A1
20060245727 Nakano et al. Nov 2006 A1
20060259588 Lerman et al. Nov 2006 A1
20060263056 Lin et al. Nov 2006 A1
20060267986 Bae Nov 2006 A1
20060274835 Hamilton et al. Dec 2006 A1
20070005333 Setiohardjo et al. Jan 2007 A1
20070024706 Brannon, Jr. et al. Feb 2007 A1
20070031110 Rijckaert Feb 2007 A1
20070047645 Takashima Mar 2007 A1
20070047901 Ando et al. Mar 2007 A1
20070053293 Mcdonald et al. Mar 2007 A1
20070053444 Shibata et al. Mar 2007 A1
20070067472 Maertens et al. Mar 2007 A1
20070083467 Lindahl et al. Apr 2007 A1
20070083617 Chakrabarti et al. Apr 2007 A1
20070086528 Mauchly et al. Apr 2007 A1
20070140647 Kusunoki et al. Jun 2007 A1
20070154165 Hemmeryckz-Deleersnijder et al. Jul 2007 A1
20070168541 Gupta et al. Jul 2007 A1
20070168542 Gupta et al. Jul 2007 A1
20070177812 Yang et al. Aug 2007 A1
20070180051 Kelly et al. Aug 2007 A1
20070180125 Knowles et al. Aug 2007 A1
20070239839 Buday et al. Oct 2007 A1
20070292107 Yahata et al. Dec 2007 A1
20080025413 Apostolopoulos Jan 2008 A1
20080030614 Schwab Feb 2008 A1
20080052306 Wang et al. Feb 2008 A1
20080063051 Kwon et al. Mar 2008 A1
20080086570 Dey et al. Apr 2008 A1
20080101466 Swenson et al. May 2008 A1
20080101718 Yang et al. May 2008 A1
20080104633 Noblecourt et al. May 2008 A1
20080120330 Reed et al. May 2008 A1
20080120342 Reed et al. May 2008 A1
20080126248 Lee et al. May 2008 A1
20080137736 Richardson et al. Jun 2008 A1
20080137847 Candelore et al. Jun 2008 A1
20080137848 Kocher et al. Jun 2008 A1
20080192818 DiPietro et al. Aug 2008 A1
20080196076 Shatz et al. Aug 2008 A1
20080232456 Terashima et al. Sep 2008 A1
20080253454 Imamura et al. Oct 2008 A1
20080256105 Nogawa et al. Oct 2008 A1
20080263354 Beuque et al. Oct 2008 A1
20080266522 Weisgerber Oct 2008 A1
20080279535 Haque et al. Nov 2008 A1
20080310496 Fang Dec 2008 A1
20090010622 Yahata et al. Jan 2009 A1
20090013195 Ochi et al. Jan 2009 A1
20090031220 Tranchant et al. Jan 2009 A1
20090037959 Suh et al. Feb 2009 A1
20090060452 Chaudhri Mar 2009 A1
20090066839 Jung et al. Mar 2009 A1
20090077143 Macy, Jr. Mar 2009 A1
20090106082 Senti et al. Apr 2009 A1
20090116821 Shibamiya et al. May 2009 A1
20090132599 Soroushian et al. May 2009 A1
20090132721 Soroushian et al. May 2009 A1
20090150557 Wormley et al. Jun 2009 A1
20090169181 Priyadarshi et al. Jul 2009 A1
20090178090 Oztaskent Jul 2009 A1
20090201988 Gazier et al. Aug 2009 A1
20090226148 Nesvadba et al. Sep 2009 A1
20090249081 Zayas Oct 2009 A1
20090282162 Mehrotra et al. Nov 2009 A1
20090293116 DeMello Nov 2009 A1
20090300204 Zhang et al. Dec 2009 A1
20090303241 Priyadarshi et al. Dec 2009 A1
20090307258 Priyadarshi et al. Dec 2009 A1
20090307267 Chen et al. Dec 2009 A1
20090310819 Hatano Dec 2009 A1
20090313544 Wood et al. Dec 2009 A1
20090313564 Rottler et al. Dec 2009 A1
20090328124 Khouzam et al. Dec 2009 A1
20100002069 Eleftheriadis et al. Jan 2010 A1
20100040351 Toma Feb 2010 A1
20100094969 Zuckerman et al. Apr 2010 A1
20100095121 Shetty et al. Apr 2010 A1
20100111192 Graves May 2010 A1
20100142915 Mcdermott et al. Jun 2010 A1
20100146055 Hannuksela et al. Jun 2010 A1
20100158109 Dahlby et al. Jun 2010 A1
20100189183 Gu et al. Jul 2010 A1
20100195713 Coulombe et al. Aug 2010 A1
20100226582 Luo et al. Sep 2010 A1
20100228795 Hahn Sep 2010 A1
20100259690 Wang et al. Oct 2010 A1
20100262712 Kim et al. Oct 2010 A1
20100278271 MacInnis Nov 2010 A1
20100284473 Suh et al. Nov 2010 A1
20110010466 Fan et al. Jan 2011 A1
20110022432 Ishida et al. Jan 2011 A1
20110055585 Lee Mar 2011 A1
20110058675 Brueck et al. Mar 2011 A1
20110080940 Bocharov Apr 2011 A1
20110082924 Gopalakrishnan Apr 2011 A1
20110096828 Chen et al. Apr 2011 A1
20110099594 Chen et al. Apr 2011 A1
20110103374 Lajoie et al. May 2011 A1
20110126104 Woods et al. May 2011 A1
20110126191 Hughes et al. May 2011 A1
20110129011 Cilli et al. Jun 2011 A1
20110135090 Chan et al. Jun 2011 A1
20110142415 Rhyu Jun 2011 A1
20110145858 Philpott et al. Jun 2011 A1
20110150100 Abadir Jun 2011 A1
20110153785 Minborg et al. Jun 2011 A1
20110164679 Satou et al. Jul 2011 A1
20110170408 Furbeck et al. Jul 2011 A1
20110173345 Knox et al. Jul 2011 A1
20110179185 Wang et al. Jul 2011 A1
20110197261 Dong et al. Aug 2011 A1
20110239078 Luby et al. Sep 2011 A1
20110246659 Bouazizi Oct 2011 A1
20110246661 Manzari et al. Oct 2011 A1
20110268178 Park et al. Nov 2011 A1
20110280307 MacInnis et al. Nov 2011 A1
20110296048 Knox et al. Dec 2011 A1
20110302319 Ha et al. Dec 2011 A1
20110305273 He et al. Dec 2011 A1
20110310982 Yang et al. Dec 2011 A1
20110314130 Strasman Dec 2011 A1
20110314176 Frojdh et al. Dec 2011 A1
20110314500 Gordon Dec 2011 A1
20120005312 Mcgowan et al. Jan 2012 A1
20120023251 Pyle et al. Jan 2012 A1
20120042090 Chen et al. Feb 2012 A1
20120047542 Lewis et al. Feb 2012 A1
20120072493 Muriello et al. Mar 2012 A1
20120093214 Urbach Apr 2012 A1
20120105279 Brown et al. May 2012 A1
20120110120 Willig et al. May 2012 A1
20120167132 Mathews et al. Jun 2012 A1
20120170642 Braness Jul 2012 A1
20120170643 Soroushian et al. Jul 2012 A1
20120170906 Soroushian et al. Jul 2012 A1
20120170915 Braness et al. Jul 2012 A1
20120173751 Braness et al. Jul 2012 A1
20120177101 van der Schaar Jul 2012 A1
20120179834 van der Schaar et al. Jul 2012 A1
20120203766 Hörnkvist et al. Aug 2012 A1
20120230390 Akkor Sep 2012 A1
20120269275 Hannuksela Oct 2012 A1
20120278496 Hsu Nov 2012 A1
20120281767 Duenas et al. Nov 2012 A1
20120307883 Graves Dec 2012 A1
20120311174 Bichot et al. Dec 2012 A1
20120316941 Moshfeghi Dec 2012 A1
20120331167 Hunt Dec 2012 A1
20130003868 Sjoberg et al. Jan 2013 A1
20130007200 van der Schaar et al. Jan 2013 A1
20130013803 Bichot et al. Jan 2013 A1
20130019257 Tschernutter et al. Jan 2013 A1
20130044821 Braness et al. Feb 2013 A1
20130046902 Villegas Nuñez et al. Feb 2013 A1
20130051767 Soroushian et al. Feb 2013 A1
20130051768 Soroushian et al. Feb 2013 A1
20130054958 Braness et al. Feb 2013 A1
20130055084 Soroushian et al. Feb 2013 A1
20130058393 Soroushian Mar 2013 A1
20130061045 Kiefer et al. Mar 2013 A1
20130080267 McGowan Mar 2013 A1
20130091249 McHugh et al. Apr 2013 A1
20130095855 Bort Apr 2013 A1
20130097172 McIntosh Apr 2013 A1
20130128970 Yu et al. May 2013 A1
20130169863 Smith Jul 2013 A1
20130188928 Ogawa Jul 2013 A1
20130191754 Rose Jul 2013 A1
20130196292 Brennen et al. Aug 2013 A1
20130279810 Li et al. Oct 2013 A1
20140003501 Soroushian et al. Jan 2014 A1
20140003523 Soroushian et al. Jan 2014 A1
20140059243 Reisner Feb 2014 A1
20140140253 Lohmar et al. May 2014 A1
20140149557 Lohmar et al. May 2014 A1
20140189765 Green Jul 2014 A1
20140211840 Butt et al. Jul 2014 A1
20140241421 Orton-jay et al. Aug 2014 A1
20140250473 Braness et al. Sep 2014 A1
20140355958 Soroushian et al. Dec 2014 A1
20140359680 Shivadas et al. Dec 2014 A1
20150036758 Sato Feb 2015 A1
20150104153 Braness et al. Apr 2015 A1
20150229695 Kim et al. Aug 2015 A1
20150281746 Lam Oct 2015 A1
20150288530 Oyman Oct 2015 A1
20160044078 Hosur Feb 2016 A1
20160073176 Phillips et al. Mar 2016 A1
20160119657 Sun Apr 2016 A1
20160127440 Gordon May 2016 A1
20160134881 Wang May 2016 A1
20160255348 Panchagnula Sep 2016 A1
20170026445 Soroushian et al. Jan 2017 A1
20170041372 Hosur Feb 2017 A1
20170041604 Soroushian et al. Feb 2017 A1
20170055007 Phillips Feb 2017 A1
20170123713 Fisher May 2017 A1
20170347135 Frantz Nov 2017 A1
20170366833 Amidei et al. Dec 2017 A1
20180027244 Chen et al. Jan 2018 A1
20180137208 Ricker et al. May 2018 A1
20180242015 Katsavounidis Aug 2018 A1
20180278975 Soroushian Sep 2018 A1
20190082182 Naser et al. Mar 2019 A1
20190146951 Velmurugan et al. May 2019 A1
20190166410 Kirk et al. May 2019 A1
20190182524 Amidei et al. Jun 2019 A1
20200204834 Loheide et al. Jun 2020 A1
20200221152 Amidei et al. Jul 2020 A1
20200236376 Li et al. Jul 2020 A1
20200252689 Crowe Aug 2020 A1
20200322691 Hui Oct 2020 A1
20200396451 Soroushian et al. Dec 2020 A1
Foreign Referenced Citations (102)
Number Date Country
2010202963 Feb 2012 AU
2237293 Jul 1997 CA
1221284 Jun 1999 CN
1662952 Aug 2005 CN
1684518 Oct 2005 CN
1723696 Jan 2006 CN
102138327 Jul 2011 CN
103858419 Jun 2014 CN
103875248 Jun 2014 CN
105103565 Nov 2015 CN
105359511 Feb 2016 CN
103875248 Sep 2018 CN
108989847 Dec 2018 CN
109314784 Feb 2019 CN
757484 Feb 1997 EP
813167 Dec 1997 EP
1335603 Aug 2003 EP
1420580 May 2004 EP
1453319 Sep 2004 EP
1283640 Oct 2006 EP
1718074 Nov 2006 EP
2180664 Apr 2010 EP
2360923 Aug 2011 EP
2661895 Nov 2013 EP
2962461 Jan 2016 EP
3005689 Apr 2016 EP
3473005 Apr 2019 EP
1195183 Feb 2018 HK
1260329 Dec 2019 HK
1263223 Jan 2020 HK
08163488 Jun 1996 JP
08287613 Nov 1996 JP
11328929 Nov 1999 JP
2001043668 Feb 2001 JP
2002170363 Jun 2002 JP
2002218384 Aug 2002 JP
2003250113 Sep 2003 JP
2004320707 Nov 2004 JP
2005027153 Jan 2005 JP
2009508452 Feb 2009 JP
2010262255 Nov 2010 JP
2011029962 Feb 2011 JP
2013026724 Feb 2013 JP
2014506430 Mar 2014 JP
20165043 Jan 2016 JP
2016526336 Sep 2016 JP
2019-526188 Sep 2019 JP
6892877 Jun 2021 JP
100221423 Sep 1999 KR
2002013664 Feb 2002 KR
1020020064888 Aug 2002 KR
20040039852 May 2004 KR
20060106250 Oct 2006 KR
20110051104 May 2011 KR
20140056317 May 2014 KR
20160021141 Feb 2016 KR
101823321 Jan 2018 KR
101928910 Dec 2018 KR
10-1936142 Jan 2019 KR
10-1981923 May 2019 KR
10-2020764 Sep 2019 KR
10-2074148 Jan 2020 KR
10-2086995 Mar 2020 KR
10-2190364 Dec 2020 KR
2328040 Jun 2008 RU
1995015660 Jun 1995 WO
2000049762 Aug 2000 WO
2000049763 Aug 2000 WO
2001031497 May 2001 WO
2001050732 Jul 2001 WO
2002001880 Jan 2002 WO
2003047262 Jun 2003 WO
2004012378 Feb 2004 WO
2004054247 Jun 2004 WO
2004097811 Nov 2004 WO
2004100158 Nov 2004 WO
2004102571 Nov 2004 WO
2005008385 Jan 2005 WO
2005015935 Feb 2005 WO
2009006302 Jan 2009 WO
2009065137 May 2009 WO
2009109976 Sep 2009 WO
2010060106 May 2010 WO
2010122447 Oct 2010 WO
2010150470 Dec 2010 WO
2011053658 May 2011 WO
2011059291 May 2011 WO
2011087449 Jul 2011 WO
2011093835 Aug 2011 WO
2011101371 Aug 2011 WO
2011102791 Aug 2011 WO
2011103364 Aug 2011 WO
2012094171 Jul 2012 WO
20120094181 Jul 2012 WO
20120094189 Jul 2012 WO
2013033334 Mar 2013 WO
2013033335 Mar 2013 WO
2013033458 Mar 2013 WO
2013033458 May 2013 WO
2014121857 Aug 2014 WO
2014190308 Nov 2014 WO
2017218095 Dec 2017 WO
Non-Patent Literature Citations (213)
Broadq—The Ultimate Home Entertainment Software, printed May 11, 2009 from http://web.archive.org/web/20030401122010/www.broadq.com/qcasttuner/, 1 pg.
European Extended Search Report for EP Application 17813738.6, Search completed Oct. 18, 2019, dated Oct. 24, 2019, 9 pages.
European Search Report for Application 11855103.5, search completed Jun. 26, 2014, 9 pgs.
European Search Report for Application 11855237.1, search completed Jun. 12, 2014, 9 pgs.
European Supplementary Search Report for Application EP09759600, completed Jan. 25, 2011, 11 pgs.
Extended European Search Report for European Application EP12828956.8, Report Completed Feb. 18, 2015, dated Mar. 2, 2015, 13 Pages.
Extended European Search Report for European Application No. 14800901.2, Search completed Dec. 2, 2016, dated Dec. 13, 2016, 13 Pgs.
Federal Computer Week, “Tool Speeds Info to Vehicles”, Jul. 25, 1999, 5 pages.
HTTP Live Streaming Overview, Networking & Internet, Apple, Inc., Apr. 1, 2011, 38 pages.
IBM Corporation and Microsoft Corporation, “Multimedia Programming Interface and Data Specifications 1.0”, Aug. 1991, printed from http://www.kk.iij4u.or.jp/˜kondo/wave/mpidata.txt on Mar. 6, 2006, 100 pgs.
Information Technology—MPEG Systems Technologies—Part 7: Common Encryption in ISO Base Media File Format Files (ISO/IEC 23001-7), Apr. 2015, 24 pgs.
InformationWeek, “Internet on Wheels”, InformationWeek: Front End: Daily Dose, Jul. 20, 1999, Printed on Mar. 26, 2014, 3 pgs.
International Preliminary Report on Patentability for International Application No. PCT/US2008/083816, dated May 18, 2010, 6 pgs.
International Preliminary Report on Patentability for International Application No. PCT/US2012/053052, Completed Mar. 4, 2014, 8 pgs.
International Preliminary Report on Patentability for International Application No. PCT/US2012/053223, dated Mar. 4, 2014, 7 pgs.
International Preliminary Report on Patentability for International Application PCT/US2011/067167, dated Feb. 25, 2014, 8 pgs.
International Preliminary Report on Patentability for International Application PCT/US2014/039425, Report Issued Nov. 24, 2015, dated Dec. 3, 2015, 6 Pgs.
International Preliminary Report on Patentability for International Application PCT/US2017/031114, Report issued Dec. 18, 2018, dated Dec. 27, 2018, 7 Pgs.
International Search Report and Written Opinion for International Application No. PCT/US09/46588, completed Jul. 13, 2009, dated Jul. 23, 2009, 7 pgs.
International Search Report and Written Opinion for International Application No. PCT/US2004/041667, completed May 24, 2007, dated Jun. 20, 2007, 6 pgs.
International Search Report and Written Opinion for International Application No. PCT/US2008/083816, completed Jan. 10, 2009, dated Jan. 22, 2009, 7 pgs.
International Search Report and Written Opinion for International Application No. PCT/US2012/053052, International Filing Date Aug. 30, 2012, Report Completed Oct. 25, 2012, dated Nov. 16, 2012, 9 pgs.
International Search Report and Written Opinion for International Application No. PCT/US2012/053223, International Filing Date Aug. 30, 2012, Report Completed Dec. 7, 2012, dated Mar. 7, 2013, 10 pgs.
International Search Report and Written Opinion for International Application No. PCT/US2017/031114, Search completed Jun. 29, 2017, dated Jul. 19, 2017, 11 Pgs.
International Search Report and Written Opinion for International Application PCT/US2011/066927, completed Apr. 3, 2012, dated Apr. 20, 2012, 14 pgs.
International Search Report and Written Opinion for International Application PCT/US2011/067167, completed Jun. 19, 2012, dated Jul. 2, 2012, 11 pgs.
International Search Report and Written Opinion for International Application PCT/US2012/053053, search completed Oct. 23, 2012, dated Nov. 13, 2012, 11 pgs.
International Search Report for International Application No. PCT/SE2011/050166, Search completed Mar. 30, 2011, dated Mar. 30, 2011, 5 Pgs.
ISO/IEC 14496-12 Information technology—Coding of audio-visual objects—Part 12: ISO base media file format, Feb. 2004 (“MPEG-4 Part 12 Standard”), 62 pgs.
ISO/IEC 14496-12:2008(E) Informational Technology—Coding of Audio-Visual Objects Part 12: ISO Base Media File Format, Oct. 2008, 120 pgs.
ISO/IEC FCD 23001-6 MPEG systems technologies Part 6: Dynamic adaptive streaming over HTTP (DASH), Jan. 28, 2011, 86 pgs.
ITS International, “Fleet System Opts for Mobile Server”, Aug. 26, 1999, Printed on Oct. 21, 2011 from http://www.itsinternational.com/News/article.cfm?recordID=547, 2 pgs.
Linksys Wireless-B Media Adapter Reviews, printed May 4, 2007 from http://reviews.cnet.com/Linksys_Wireless_B_Media_Adapter/4505-6739_7-30421900.html?tag=box, 5 pgs.
Linksys, KISS DP-500, printed May 4, 2007 from http://www.kiss-technology.com/?p=dp500, 2 pgs.
LINKSYS® : “Enjoy your digital music and pictures on your home entertainment center, without stringing wires!”, Model No. WMA 11B, printed May 9, 2007 from http://www.linksys.com/servlet/Satellite?c=L_Product_C2&childpagename=US/Layout&cid=1115416830950&p.
Microsoft Corporation, "Chapter 8, Multimedia File Formats", 1991, Microsoft Windows Multimedia Programmer's Reference, 3 cover pgs, pp. 8-1 to 8-20.
Microsoft Corporation, Advanced Systems Format (ASF) Specification, Revision 01.20.03, Dec. 2004, 121 pgs.
Microsoft Media Platform: Player Framework, “Microsoft Media Platform Player Framework v2.5 (formerly Silverlight Media Framework)”, May 3, 2011, 2 pages.
Microsoft Media Platform: Player Framework, “Silverlight Media Framework v1.1”, Jan. 2010, 2 pages.
Microsoft Windows® XP Media Center Edition 2005, Frequently asked Questions, printed May 4, 2007 from http://www.microsoft.com/windowsxp/mediacenter/evaluation/faq.mspx.
Microsoft Windows® XP Media Center Edition 2005: Features, printed May 9, 2007, from http://www.microsoft.com/windowsxp/mediacenter/evaluation/features.mspx, 4 pgs.
MPEG-DASH presentation at Streaming Media West 2011, Nov. 2011, 14 pgs.
Multiview Video Coding (MVC), ISO/IEC 14496-10, 2008 Amendment.
Open DML AVI-M-JPEG File Format Subcommittee, “Open DML AVI File Format Extensions”, Version 1.02, Feb. 28, 1996, 29 pgs.
PC World.com, Future Gear: PC on the HiFi, and the TV, from http://www.pcworld.com/article/id, 108818-page,1/article.html, printed May 4, 2007, from IDG Networks, 2 pgs.
Pomelo, LLC Tech Memo, Analysis of Netflix's Security Framework for ‘Watch Instantly’ Service, Mar.-Apr. 2009, 18 pgs.
Qtv—About BroadQ, printed May 11, 2009 from http://www.broadq.com/en/about.php, 1 pg.
Server-Side Stream Repackaging (Streaming Video Technologies Panorama, Part 2), Jul. 2011, 15 pgs.
Supplementary European Search Report for Application No. EP 04813918, Search Completed Dec. 19, 2012, 3 pgs.
Text of ISO/IEC 23001-6: Dynamic adaptive streaming over HTTP (DASH), Oct. 2010, 71 pgs.
Universal Mobile Telecommunications System (UMTS), ETSI TS 126 233 V9.1.0 (Jun. 2011) 3GPP TS 26.233 version 9.1.0 Release 9, 18 pgs.
Universal Mobile Telecommunications Systems (UMTS); ETSI TS 126 244 V9.4.0 (May 2011) 3GPP TS 26.244 version 9.4.0 Release 9, 58 pgs.
Windows Media Center Extender for Xbox, printed May 9, 2007 from http://www.xbox.com/en-US/support/systemuse/xbox/console/mediacenterextender.htm, 2 pgs.
Windows® XP Media Center Edition 2005, “Experience more entertainment”, retrieved from http://download.microsoft.com/download/c/9/a/c9a7000a-66b3-455b-860b-1c16f2eecfec/MCE.pdf on May 9, 2007, 2 pgs.
International Search Report and Written Opinion for International Application PCT/US14/39425, International Filing Date Mar. 23, 2014, Report Completed Sep. 15, 2014, dated Oct. 17, 2014, 9 Pgs.
Written Opinion for International Application No. PCT/US2008/083816, Opinion completed Jan. 10, 2009, dated Jan. 22, 2009, 5 pgs.
Written Opinion for International Application No. PCT/US2009/046588, completed Jul. 14, 2009, dated Jul. 23, 2009, 5 pgs.
Written Opinion for International Application No. PCT/US2004/041667, Filing Date Dec. 8, 2004, Report Completed May 24, 2007, dated Jun. 20, 2007, 4 pgs.
“Innovation of technology arrived”, I-O Data, Nov. 2004, Retrieved from http://www.iodata.com/catalogs/AVLP2DVDLA_Flyer200505.pdf on May 30, 2013, 2 pgs.
“Adaptive HTTP Streaming in PSS—Client Behaviour”, S4-AHI129, 3GPP TSG-SA4 Ad-Hoc Meeting, Dec. 14-16, 2009, Paris, France; section 12.6.1.
“Adaptive HTTP Streaming in PSS—Data Formats for HTTP-Streaming excluding MPD”, S4-AHI128, 3GPP TSGSA4 Ad-Hoc Meeting, Dec. 14-16, 2009, Paris,France; sections 12.2.1 and 12.2.4.2.1.
“Adaptive HTTP Streaming in PSS—Discussion on Options”, S4-AHI130, 3GPP TSG-SA4 Ad-Hoc Meeting, Dec. 14-16, 2009, Paris, France; sections 1, 2.7-2.8, and 2.16-2.19.
“Adaptive Streaming Comparison”, Jan. 28, 2010, 5 pgs.
“Apple HTTP Live Streaming specification”, Aug. 2017, 60 pgs.
“Best Practices for Multi-Device Transcoding”, Kaltura Open Source Video, Printed on Nov. 27, 2013 from knowledge.kaltura.com/best-practices-multi-device-transcoding, 13 pgs.
“Container format (digital)”, printed Aug. 22, 2009 from http://en.wikipedia.org/wiki/Container_format_(digital), 4 pgs.
“Data Encryption Decryption using AES Algorithm, Key and Salt with Java Cryptography Extension”, Available at https://www.digizol.com/2009/10/java-encrypt-decrypt-jce-salt.html, Oct. 2009, 6 pgs.
“Delivering Live and On-Demand Smooth Streaming”, Microsoft Silverlight, 2009, 28 pgs.
“DVD—MPeg differences”, printed Jul. 2, 2009 from http://dvd.sourceforge.net/dvdinfo/dvdmpeg.html, 1 pg.
“DVD subtitles”, sam.zoy.org/writings/dvd/subtitles, dated Jan. 9, 2001, printed Jul. 2, 2009, 4 pgs.
“Final Committee Draft of MPEG-4 streaming text format”, International Organisation for Standardisation, Feb. 2004, 22 pgs.
“Fragmented Time Indexing of Representations”, S4-AHI126, 3GPP TSG-SA4 Ad-Hoc Meeting, Dec. 14-16, 2009, Paris, France.
“HTTP Based Adaptive Streaming over HSPA”, Apr. 2011, 73 pgs.
“HTTP Live Streaming”, Mar. 2011, 24 pgs.
“HTTP Live Streaming”, Sep. 2011, 33 pgs.
“Information Technology—Coding of audio-visual objects—Part 17: Streaming text”, International Organisation for Standardisation, Feb. 2004, 22 pgs.
“Information technology—Coding of audio-visual objects—Part 18: Font compression and streaming”, ISO/IEC 14496-18, First edition Jul. 1, 2004, 26 pgs.
“Information Technology—Coding of Audio Visual Objects—Part 2: Visual”, International Standard, ISO/IEC 14496-2, Third Edition, Jun. 1, 2004, pp. 1-724. (presented in three parts).
“Java Cryptography Architecture API Specification & Reference”, Available at https://docs.oracle.com/javase/1.5.0/docs/guide/security/CryptoSpec.html, Jul. 25, 2004, 68 pgs.
“Java Cryptography Extension, javax.crypto.Cipher class”, Available at https://docs.oracle.com/javase/1.5.0/docs/api/javax/crypto/Cipher.html, 2004, 24 pgs.
“JCE Encryption—Data Encryption Standard (DES) Tutorial”, Available at https://mkyong.com/java/jce-encryption-data-encryption-standard-des-tutorial/, Feb. 25, 2009, 2 pgs.
“KISS Players, KISS DP-500”, retrieved from http://www.kiss-technology.com/?p=dp500 on May 4, 2007, 1 pg.
“Live and On-Demand Video with Silverlight and IIS Smooth Streaming”, Microsoft Silverlight, Windows Server Internet Information Services 7.0, Feb. 2010, 15 pgs.
“Microsoft Smooth Streaming specification”, Jul. 22, 2013, 56 pgs.
“MPEG ISO/IEC 13818-1”, Information Technology—Generic Coding Of Moving Pictures And Associated Audio: Systems, Apr. 25, 1995, 151 pages.
“MPEG-4, Part 14, ISO/IEC 14496-14”, Information technology—Coding of audio-visual objects, 18 pgs., Nov. 15, 2003.
“Netflix turns on subtitles for PC, Mac streaming”, Yahoo! News, Apr. 21, 2010, Printed on Mar. 26, 2014, 3 pgs.
“OpenDML AVI File Format Extensions”, OpenDML AVI M-JPEG File Format Subcommittee, retrieved from www.the-labs.com/Video/odmlff2-avidef.pdf, Sep. 1997, 42 pgs.
“OpenDML AVI File Format Extensions Version 1.02”, OpenDMLAVI MJPEG File Format Subcommittee. Last revision: Feb. 28, 1996. Reformatting: Sep. 1997, 42 pgs.
“QCast Tuner for PS2”, printed May 11, 2009 from http://web.archive.org/web/20030210120605/www.divx.com/software/detail.php?ie=39, 2 pgs.
“Single-Encode Streaming for Multiple Screen Delivery”, Telestream Wowza Media Systems, 2009, 6 pgs.
“Smooth Streaming Client”, The Official Microsoft IIS Site, Sep. 24, 2010, 4 pages.
“Supported Media Formats”, Supported Media Formats, Android Developers, Printed on Nov. 27, 2013 from developer.android.com/guide/appendix/media-formats.html, 3 pgs.
“Text of ISO/IEC 14496-18/COR1, Font compression and streaming”, ITU Study Group 16—Video Coding Experts Group—ISO/IEC MPEG & ITU-T VCEG(ISO/IEC JTC1/SC29/WG11 and ITU-T SG16 06), No. N8664, Oct. 27, 2006, 8 pgs.
“Text of ISO/IEC 14496-18/FDIS, Coding of Moving Pictures and Audio”, ITU Study Group 16—Videocoding Experts Group—ISO/IEC MPEG & ITU-T VCEG(ISO/IEC JTC1/SC29/WG11 and ITU-T SG16 06), No. N6215, Dec. 2003, 26 pgs.
“The MPEG-DASH Standard for Multimedia Streaming Over the Internet”, IEEE MultiMedia, vol. 18, No. 4, 2011, 7 pgs.
“Thread: SSME (Smooth Streaming Medial Element) config.xml review (Smooth Streaming Client configuration file)”, Printed on Mar. 26, 2014, 3 pgs.
“Transcoding Best Practices”, From movideo, Printed on Nov. 27, 2013 from code.movideo.com/Transcoding_Best_Practices, 5 pgs.
“Transparent end-to-end packet switched streaming service (PSS); 3GPP file format (3GP) (Release 9)”, 3GPP TS 26.244 V9.0.0 (Dec. 2009),sections 7.1-7.4.
“Using HTTP Live Streaming”, iOS Developer Library, http://developer.apple.com/library/ios/#documentation/networkinginternet/conceptual/streamingmediaguide/UsingHTTPLiveStreaming/UsingHTTPLiveStreaming.html#//apple_ref/doc/uid/TP40008332-CH102-SW1, Feb. 11, 2014, 10 pgs.
“Video Manager and Video Title Set IFO file headers”, printed Aug. 22, 2009 from http://dvd.sourceforge.net/dvdinfo/ifo.htm, 6 pgs.
“What is a DVD?”, printed Aug. 22, 2009 from http://www.videohelp.com/dvd, 8 pgs.
“What is a VOB file”, http://www.mpucoder.com/DVD/vobov.html, printed on Jul. 2, 2009, 2 pgs.
“What's on a DVD?”, printed Aug. 22, 2009 from http://www.doom9.org/dvd-structure.htm, 5 pgs.
“Windows Media Player 9”, Microsoft, Mar. 23, 2017, 3 pgs.
“DVD-Mpeg differences”, printed on Jul. 2, 2009, http://dvd.sourceforge.net/dvdinfo/dvdmpeg.html, 1 pg.
U.S. Appl. No. 13/224,298, “Final Office Action Received”, dated May 19, 2014, 26 pgs.
Abomhara et al., “Enhancing Selective Encryption for H.264/AVC Using Advanced Encryption Standard”, International Journal of computer Theory and Engineering, Apr. 2010, vol. 2, No. 2, pp. 223-229.
Akhshabi et al., “An Experimental Evaluation of Rate-Adaptation Algorithms in Adaptive Streaming over HTTP”, MMSys'11, Feb. 23-25, 2011, 12 pgs.
Alattar et al., A.M., “Improved selective encryption techniques for secure transmission of MPEG video bit-streams”, In Proceedings 1999 International Conference on Image Processing (Cat. 99CH36348), vol. 4, IEEE, 1999, pp. 256-260.
Anonymous, “Method for the encoding of a compressed video sequence derived from the same video sequence compressed at a different bit rate without loss of data”, ip.com, ip.com No. IPCOM000008165D, May 22, 2002, pp. 1-9.
Antoniou et al., “Adaptive Methods for the Transmission of Video Streams in Wireless Networks”, 2015, 50 pgs.
Apostolopoulos et al., “Secure Media Streaming and Secure Transcoding”, Multimedia Security Technologies for Digital Rights Management, 2006, 33 pgs.
Asai et al., “Essential Factors for Full-Interactive VOD Server: Video File System, Disk Scheduling, Network”, Proceedings of Globecom '95, Nov. 14-16, 1995, 6 pgs.
Beker et al., “Cipher Systems, The Protection of Communications”, 1982, 40 pgs.
Blasiak, “Video Transrating and Transcoding: Overview of Video Transrating and Transcoding Technologies”, Ingenient Technologies, TI Developer Conference, Aug. 6-8, 2002, 22 pgs.
Bocharov et al, “Portable Encoding of Audio-Video Objects, The Protected Interoperable File Format (PIFF)”, Microsoft Corporation, First Edition Sep. 8, 2009, 30 pgs.
Bulterman et al., “Synchronized Multimedia Integration Language (SMIL 3.0)”, W3C Recommendation, Dec. 1, 2008, https://www.w3.org/TR/2008/REC-SMIL3-20081201/, 321 pgs. (presented in five parts).
Cahill et al., “Locally Adaptive Deblocking Filter for Low Bit Rate Video”, Proceedings 2000 International Conference on Image Processing, Sep. 10-13, 2000, Vancouver, BC, Canada, 4 pgs.
Candelore, File Wrapper, U.S. Appl. No. 60/372,901, filed Apr. 17, 2002, 5 pgs.
Casares et al., “Simplifying Video Editing Using Metadata”, DIS2002, 2002, pp. 157-166.
Chaddha et al., “A Frame-work for Live Multicast of Video Streams over the Internet”, Proceedings of 3rd IEEE International Conference on Image Processing, Sep. 19, 1996, Lausanne, Switzerland, 4 pgs.
Cheng, “Partial Encryption for Image and Video Communication”, Thesis, Fall 1998, 95 pgs.
Cheng et al., “Partial encryption of compressed images and videos”, IEEE Transactions on Signal Processing, vol. 48, No. 8, Aug. 2000, 33 pgs.
Cheung et al., “On the Use of Destination Set Grouping to Improve Fairness in Multicast Video Distribution”, Proceedings of IEEE INFOCOM'96, Conference on Computer Communications, vol. 2, IEEE, 1996, 23 pgs.
Collet, “Delivering Protected Content, An Approach for Next Generation Mobile Technologies”, Thesis, 2010, 84 pgs.
Deutscher, “IIS Transform Manager Beta—Using the MP4 to Smooth Task”, Retrieved from: https://web.archive.org/web/20130328111303/http://blog.johndeutscher.com/category/smooth-streaming, Blog post of Apr. 29, 2011, 14 pgs.
Diamantis et al., “Real Time Video Distribution using Publication through a Database”, Proceedings SIBGRAPI'98. International Symposium on Computer Graphics, Image Processing, and Vision (Cat. No. 98EX237), Oct. 1998, 8 pgs.
Ding, Li-Fu et al., “Content-Aware Prediction Algorithm With Inter-View Mode Decision for Multiview Video Coding”, IEEE Transactions on Multimedia, vol. 10, No. 8, Dec. 2008, 12 Pages.
Dworkin, “Recommendation for Block Cipher Modes of Operation: Methods and Techniques”, NIST Special Publication 800-38A, 2001, 66 pgs.
Fang et al., “Real-time deblocking filter for MPEG-4 systems”, Asia-Pacific Conference on Circuits and Systems, Oct. 28-31, 2002, Bali, Indonesia, pp. 541-544.
Fecheyr-Lippens, “A Review of HTTP Live Streaming”, Jan. 2010, 38 pgs.
Fielding et al., “Hypertext Transfer Protocol—HTTP1.1”, Network Working Group, RFC 2616, Jun. 1999, 114 pgs.
Fukuda et al., “Reduction of Blocking Artifacts by Adaptive DCT Coefficient Estimation in Block-Based Video Coding”, Proceedings 2000 International Conference on Image Processing, Sep. 10-13, 2000, Vancouver, BC, Canada, pp. 969-972.
Gannes, “The Lowdown on Apple's HTTP Adaptive Bitrate Streaming”, GigaOM, Jun. 10, 2009, 12 pgs.
Ghosh, “Enhancing Silverlight Video Experiences with Contextual Data”, Retrieved from: http://msdn.microsoft.com/en-us/magazine/ee336025.aspx, 2010, 15 pgs.
Griffith, Eric, “The Wireless Digital Picture Frame Arrives”, Wi-Fi Planet, printed May 4, 2007 from http://www.wi-fiplanet.com/news/article.php/3093141, Oct. 16, 2003, 3 pgs.
Huang, U.S. Pat. No. 7,729,426, U.S. Appl. No. 11/230,794, filed Sep. 20, 2005, 143 pgs.
Huang et al., “Adaptive MLP post-processing for block-based coded images”, IEEE Proceedings—Vision, Image and Signal Processing, vol. 147, No. 5, Oct. 2000, pp. 463-473.
Huang et al., Architecture Design for Deblocking Filter in H.264/JVT/AVC, 2003 International Conference on Multimedia and Expo., Jul. 6-9, 2003, Baltimore, MD, 4 pgs.
Inlet Technologies, “Adaptive Delivery to iDevices”, 2010, 2 pages.
Inlet Technologies, “Adaptive delivery to iPhone 3.0”, 2009, 2 pgs.
Inlet Technologies, “HTTP versus RTMP”, 2009, 3 pages.
Inlet Technologies, “The World's First Live Smooth Streaming Event: The French Open”, 2009, 2 pages.
I-O Data, “Innovation of technology arrived”, Nov. 2004, Retrieved from http://www.iodata.com/catalogs/AVLP2DVDLA_Flyer200505.pdf, 2 pgs.
Jain et al., U.S. Appl. No. 61/522,623, filed Aug. 11, 2011, 44 pgs.
Jeannin, Sylvie et al., “Video Motion Representation for Improved Content Access”, IEEE Transactions on Consumer Electronics, vol. 46, No. 3, Aug. 2004, 11 Pages.
Jung et al., “Design and Implementation of an Enhanced Personal Video Recorder for DTV”, IEEE Transactions on Consumer Electronics, vol. 47, No. 4, Nov. 2001, 6 pgs.
Kalva, Hari, “Delivering MPEG-4 Based Audio-Visual Services”, 2001, 113 pgs.
Kang et al., “Access Emulation and Buffering Techniques for Streaming of Non-Stream Format Video Files”, IEEE Transactions on Consumer Electronics, vol. 43, No. 3, Aug. 2001, 7 pgs.
Karouia et al., “Video Similarity Measurement Based on Attributed Relational Graph Matching”, N.T. Nguyen, R. Katarzyniak (Eds.): New Challenges in Applied Intelligence Technologies, SCI 134, pp. 173-182, 2008, 10 Pages.
Kim, Seon H. et al., “Design and implementation of geo-tagged video search framework”, Journal of Visual Communication and Image Representation, 2010, vol. 21 (2010), pp. 773-786.
Kim et al, “A Deblocking Filter with Two Separate Modes in Block-Based Video Coding”, IEEE transactions on circuits and systems for video technology, vol. 9, No. 1, 1999, pp. 156-160.
Kim et al., “Tree-Based Group Key Agreement”, Feb. 2004, 37 pgs.
Kurzke et al., “Get Your Content Onto Google TV”, Google, Retrieved from http://commondatastorage.googleapis.com/io2012/presentations/live%20to%20website/1300.pdf, 2012, 58 pgs.
Lang, “Expression Encoder, Best Practices for live smooth streaming broadcasting”, Microsoft Corporation, 2010, retrieved from http://www.streamingmedia.com/conferences/west2010/presentations/SMWest-12010-Expression-Encoder.pdf, 20 pgs.
Laukens, “Adaptive Streaming—A Brief Tutorial”, EBU Technical Review, 2011, 6 pgs.
Legault et al., “Professional Video Under 32-bit Windows Operating Systems”, SMPTE Journal, vol. 105, No. 12, Dec. 1996, 10 pgs.
Levkov, “Mobile Encoding Guidelines for Android Powered Devices”, Adobe Systems Inc., Addendum B, Dec. 22, 2010, 42 pgs.
Lewis, “H.264/MPEG-4 AVC CABAC overview”, Oct. 25, 2012, printed Jun. 24, 2013 from http://www.web.archive.org/web/20121025003926/www.theonlineoasis.co.uk/notes.html, 3 pgs.
Li et al., “Layered Video Multicast with Retransmission (LVMR): Evaluation of Hierarchical Rate Control”, Proceedings of IEEE INFOCOM'98, the Conference on Computer Communications. Seventeenth Annual Joint Conference of the IEEE Computer and Communications Societies. Gateway to the 21st Century, Cat. No. 98, vol. 3, 1998, 26 pgs.
List et al., “Adaptive deblocking filter”, IEEE transactions on circuits and systems for video technology, vol. 13, No. 7, Jul. 2003, pp. 614-619.
Long et al., “Silver: Simplifying Video Editing with Metadata”, Demonstrations, CHI 2003: New Horizons, Apr. 5-10, 2003, pp. 628-629.
Massoudi et al., “Overview on Selective Encryption of Image and Video: Challenges and Perspectives”, EURASIP Journal on Information Security, Nov. 2008, 18 pgs.
McCanne et al., “Receiver-driven Layered Multicast”, Conference proceedings on Applications, technologies, architectures, and protocols for computer communications, Aug. 1996, 14 pgs.
Meier, “Reduction of Blocking Artifacts in Image and Video Coding”, IEEE Transactions on Circuits and Systems for Video Technology, vol. 9, No. 3, Apr. 1999, pp. 490-500.
Morrison, “EA IFF 85 Standard for Interchange Format Files”, Jan. 14, 1985, printed from http://www.dcs.ed.ac.uk/home/mxr/gfx/2d/IFF.txt on Mar. 6, 2006, 24 pgs.
MSDN, “Adaptive streaming, Expression Studio 2.0”, Apr. 23, 2009, 2 pgs.
Nelson, “Arithmetic Coding + Statistical Modeling = Data Compression: Part 1—Arithmetic Coding”, Doctor Dobb's Journal, Feb. 1991, USA, pp. 1-12.
Nelson, “Smooth Streaming Deployment Guide”, Microsoft Expression Encoder, Aug. 2010, 66 pgs.
Newton et al., “Preserving Privacy by De-identifying Facial Images”, Carnegie Mellon University School of Computer Science, Technical Report, CMU-CS-03-119, Mar. 2003, 26 pgs.
Noboru, “Play Fast and Fine Video on Web! codec”, col. 9 No. 12, Dec. 1, 2003, pp. 178-179.
Noe, A., “Matroska File Format (under construction!)”, Retrieved from the Internet: URL:http://web.archive.org/web/20070821155146/www.matroska.org/technical/specs/matroska.pdf [retrieved on Jan. 19, 2011], Jun. 24, 2007, pp. 1-51.
Noe, Alexander, “AVI File Format”, http://www.alexander-noe.com/video/documentation/avi.pdf, Dec. 14, 2006, pp. 1-26.
Noe, Alexander, “Definitions”, Apr. 11, 2006, retrieved from http://www.alexander-noe.com/video/amg/definitions.html on Oct. 16, 2013, 2 pages.
O'Brien, U.S. Appl. No. 60/399,846, filed Jul. 30, 2002, 27 pgs.
O'Rourke, “Improved Image Decompression for Reduced Transform Coding Artifacts”, IEEE Transactions on Circuits and Systems for Video Technology, vol. 5, No. 6, Dec. 1995, pp. 490-499.
Ozer, “The 2012 Encoding and Transcoding Buyers' Guide”, Streamingmedia.com, Retrieved from: http://www.streamingmedia.com/Articles/Editorial/Featured-Articles/The-2012-Encoding-and-Transcoding-Buyers-Guide-84210.aspx, 2012, 8 pgs.
Pantos, “HTTP Live Streaming, draft-pantos-http-live-streaming-10”, IETF Tools, Oct. 15, 2012, Retrieved from: http://tools.ietf.org/html/draft-pantos-http-live-streaming-10, 37 pgs.
Park et al., “A postprocessing method for reducing quantization effects in low bit-rate moving picture coding”, IEEE Transactions on Circuits and Systems for Video Technology, vol. 9, No. 1, Feb. 1999, pp. 161-171.
Phamdo, “Theory of Data Compression”, printed on Oct. 10, 2003, 12 pgs.
RGB Networks, “Comparing Adaptive HTTP Streaming Technologies”, Nov. 2011, Retrieved from: http://btreport.net/wp-content/uploads/2012/02/RGB-Adaptive-HTTP-Streaming-Comparison-1211-01.pdf, 20 pgs.
Richardson, “H.264 and MPEG-4 Video Compression”, Wiley, 2003, 306 pgs. (presented in 2 parts).
Schulzrinne et al., “Real Time Streaming Protocol (RTSP)”, Internet Engineering Task Force, RFC 2326, Apr. 1998, 80 pgs.
Siglin, “HTTP Streaming: What You Need to Know”, streamingmedia.com, 2010, 15 pages.
Siglin, “Unifying Global Video Strategies, MP4 File Fragmentation for Broadcast, Mobile and Web Delivery”, Nov. 16, 2011, 16 pgs.
Sima et al., “An Efficient Architecture for Adaptive Deblocking Filter of H.264 AVC Video Coding”, IEEE Transactions on Consumer Electronics, vol. 50, No. 1, Feb. 2004, pp. 292-296.
Spanos et al., “Performance Study of a Selective Encryption Scheme for the Security of Networked, Real-Time Video”, Proceedings of the Fourth International Conference on Computer Communications and Networks, IC3N'95, Sep. 20-23, 1995, Las Vegas, NV, pp. 2-10.
Srinivasan et al., “Windows Media Video 9: overview and applications”, Signal Processing: Image Communication, 2004, 25 pgs.
Stockhammer, “Dynamic Adaptive Streaming over HTTP—Standards and Design Principles”, Proceedings of the second annual ACM conference on Multimedia, Feb. 2011, pp. 133-145.
Taxan, “AVel LinkPlayer2 for Consumer”, I-O Data USA—Products—Home Entertainment, printed May 4, 2007 from http://www.iodata.com/usa/products/products.php?cat=HNP&sc=AVEL&pld=AVLP2/DVDLA&ts=2&tsc, 1 pg.
Timmerer et al., “HTTP Streaming of MPEG Media”, Proceedings of Streaming Day, 2010, 4 pgs.
Tiphaigne et al., “A Video Package for Torch”, Jun. 2004, 46 pgs.
Trappe et al., “Key Management and Distribution for Secure Multimedia Multicast”, IEEE Transaction on Multimedia, vol. 5, No. 4, Dec. 2003, pp. 544-557.
Unknown, “AVI RIFF File Reference (Direct X 8.1 C++ Archive)”, printed from http://msdn.microsoft.com/archive/en-us/dx81_c/directx_cpp/htm/avirifffilereference.asp?fr . . . on Mar. 6, 2006, 7 pgs.
Unknown, “Entropy and Source Coding (Compression)”, TCOM 570, Sep. 1999, pp. 1-22.
Unknown, “MPEG-4 Video Encoder: Based on International Standard ISO/IEC 14496-2”, Patni Computer Systems, Ltd., publication date unknown, 15 pgs.
Van Deursen et al., “On Media Delivery Protocols in the Web”, 2010 IEEE International Conference on Multimedia and Expo, Jul. 19-23, 2010, 6 pgs.
Ventura, Guillermo Albaida, “Streaming of Multimedia Learning Objects”, AG Integrated Communication System, Mar. 2003, 101 pgs.
Waggoner, “Compression for Great Digital Video”, 2002, 184 pgs.
Watanabe et al., “MPEG-2 decoder enables DTV trick plays”, Researcher, System LSI Development Lab, Fujitsu Laboratories Ltd., Kawasaki, Japan, Jun. 2001, 2 pgs.
Watson, Mark, “Input for DASH EE#1 (CMP): Pixel Aspect Ratio”, 94. MPEG Meeting; Oct. 11, 2010-Oct. 15, 2010; Guangzhou; (Motion Picture Expert Group or ISO/IEC JTC1/SC29/WG11), No. M18498, Oct. 28, 2010, XP030047088, 4 Pages.
Westerink et al., “Two-pass MPEG-2 variable-bit-rate encoding”, IBM Journal of Research and Development, International Business Machines Corporation, New York, NY, US, XP002395114, ISSN: 0018-8646, vol. 43, No. 4, Jul. 4, 1999, pp. 471-488.
Wiegand, “Joint Video Team (JVT) of ISO/IEC MPEG and ITU-T VCEG”, Jan. 2002, 70 pgs.
Willig et al., U.S. Appl. No. 61/409,285, filed Nov. 2, 2010, 43 pgs.
Wu et al., “A Hierarchical Reliability-Driven Scheduling for Cloud Video Transcoding”, International Conference on Machine Learning and Cybernetics, Jul. 12, 2015, pp. 456-457.
Yang et al., “Projection-Based Spatially Adaptive Reconstruction of Block-Transform Compressed Images”, IEEE Transactions on Image Processing, vol. 4, No. 7, Jul. 1995, pp. 896-908.
Yang et al., “Regularized Reconstruction to Reduce Blocking Artifacts of Block Discrete Cosine Transform Compressed Images”, IEEE Transactions on Circuits and Systems for Video Technology, vol. 3, No. 6, Dec. 1993, pp. 421-432.
Yu et al., “Video deblocking with fine-grained scalable complexity for embedded mobile computing”, Proceedings 7th International Conference on Signal Processing, Aug. 31-Sep. 4, 2004, pp. 1173-1178.
Zakhor, “Iterative Procedures for Reduction of Blocking Effects in Transform Image Coding”, IEEE Transactions on Circuits and Systems for Video Technology, vol. 2, No. 1, Mar. 1992, pp. 91-95.
Zambelli, “IIS Smooth Streaming Technical Overview”, Microsoft Corporation, Mar. 2009.
Azwar et al., “H.265 video delivery using dynamic adaptive streaming over HTTP (DASH) on LAN network”, 2014 8th International Conference on Telecommunication Systems Services and Applications (TSSA), Mar. 26, 2015.
Zhu et al., “Comparison and Analysis of MPEG-DASH and HLS Adaptive Streaming Delivery Technology”, Telecommunications Science, Issue 04, Apr. 20, 2015.
Related Publications (1)
Number Date Country
20220021919 A1 Jan 2022 US
Continuations (3)
Number Date Country
Parent 16819865 Mar 2020 US
Child 17343453 US
Parent 16208210 Dec 2018 US
Child 16819865 US
Parent 15183562 Jun 2016 US
Child 16208210 US