Systems, methods, and media for delivery of content

Information

  • Patent Grant
  • Patent Number
    10,397,292
  • Date Filed
    Friday, March 15, 2013
  • Date Issued
    Tuesday, August 27, 2019
Abstract
Systems, methods, and computer readable media for delivery of content are provided. In some embodiments, systems for controlling delivery of content are provided, the systems comprising processing circuitry configured to: receive a request to stream the content, the request being received from a user equipment device; determine a first location of the user equipment device; determine a count of user equipment devices that are located at the first location and are currently streaming the content; determine whether the count meets a threshold; and responsive to determining that the count meets the threshold, add a first content delivery network to a pool of one or more content delivery networks that are used to stream the content.
Description
BACKGROUND OF THE INVENTION

Consumers increasingly have the option to stream live media content over the Internet. When content is streamed live, fragments of the content are provided to user equipment devices as the content is being created. These fragments are rendered by the user equipment devices as they arrive, permitting consumers to observe events, such as sports games, as the events develop. Live Internet streaming may be advantageous because it may give consumers access to kinds of programming that were until recently strictly in the domain of traditional television and radio broadcasting.


Streaming of live media content may be more technically challenging than non-live content streaming. Because live content is rendered at approximately the same time as it is captured, live content cannot be buffered for prolonged periods of time. The lack of extensive buffering may cause live content streaming to require greater network bandwidth and/or lower network latency than non-live content streaming.


Accordingly, the need exists for new methods, systems, and media for delivery of content that are capable of satisfying the bandwidth and latency requirements of live content streaming while still being suitable for streaming non-live content.


SUMMARY OF THE INVENTION

Systems, methods, and media for delivery of content are provided. In some embodiments, systems for controlling delivery of content are provided, the systems comprising processing circuitry configured to: receive a request to stream the content, the request being received from a user equipment device; determine a first location of the user equipment device; determine a count of user equipment devices that are located at the first location and are currently streaming the content; determine whether the count meets a threshold; and responsive to determining that the count meets the threshold, add a first content delivery network to a pool of one or more content delivery networks that are used to stream the content.


In some embodiments, methods for delivery of content are provided, the methods comprising: receiving a request to stream the content, the request being received from a user equipment device; determining a first location of the user equipment device; determining a count of user equipment devices that are located at the first location and are currently streaming the content; determining whether the count meets a threshold; and responsive to determining that the count meets the threshold, adding, by processing circuitry, a first content delivery network to a pool of one or more content delivery networks that are used to stream the content.


In some embodiments, non-transitory computer-readable media that contain computer-executable instructions which, when executed by a processor, cause the processor to perform a method for delivery of content are provided, the method comprising: receiving a request to stream the content, the request being received from a user equipment device; determining a first location of the user equipment device; determining a count of user equipment devices that are located at the first location and are currently streaming the content; determining whether the count meets a threshold; and responsive to determining that the count meets the threshold, adding a first content delivery network to a pool of one or more content delivery networks that are used to stream the content.





BRIEF DESCRIPTION OF THE DRAWINGS

The above and other objects and advantages of the invention will be apparent upon consideration of the following detailed description, taken in conjunction with the accompanying drawings, in which like reference characters refer to like parts throughout, and in which:



FIG. 1 shows an example of an interactive media guidance application display that can be used with a process for selecting media content for presentation in accordance with some embodiments of the invention;



FIG. 2 shows an example of a block diagram of hardware that can be used in accordance with some embodiments of the invention;



FIG. 3 shows an example of a block diagram of user equipment device hardware that can be used in accordance with some embodiments of the invention;



FIG. 4 shows an example of a block diagram of server hardware that can be used in accordance with some embodiments of the invention; and



FIGS. 5A and 5B show an example of a flow diagram of a process for delivery of content, in accordance with some embodiments of the invention.





DETAILED DESCRIPTION OF EMBODIMENTS

This invention generally relates to systems, methods, and media for controlling delivery of content. In some embodiments, mechanisms (which can be systems, methods, media, etc.) are provided for controlling the distribution of media content that is delivered to user equipment devices by a pool of one or more content distribution networks (CDNs). In some embodiments, a count of user equipment devices that are streaming content from a particular location can be monitored and, when the count exceeds a predetermined threshold, a new content distribution network can be added to the pool.
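
By way of illustration only, the following minimal Python sketch shows one way the threshold logic described above could operate; the names (cdn_pool, location_counts, COUNT_THRESHOLD), the hostnames, and the threshold value are assumptions made for this example and are not identifiers from the embodiments themselves.

    COUNT_THRESHOLD = 10000                         # hypothetical per-location limit
    cdn_pool = ["cdn-252.example.net"]              # CDNs currently streaming the content
    available_cdns = ["cdn-254.example.net", "cdn-256.example.net"]
    location_counts = {}                            # location identifier -> active stream count

    def handle_stream_request(location_id):
        """Register a new stream and grow the CDN pool once the location is saturated."""
        count = location_counts.get(location_id, 0)
        if count >= COUNT_THRESHOLD and available_cdns:
            cdn_pool.append(available_cdns.pop(0))  # add a new CDN to the pool
        location_counts[location_id] = count + 1
        return cdn_pool                             # pool reported back to the requesting device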


As referred to herein, the term “media content” or “content” should be understood to mean one or more electronically consumable media assets, such as television programs, pay-per-view programs, on-demand programs (e.g., as provided in video-on-demand (VOD) systems), Internet content (e.g., streaming content, downloadable content, Webcasts, etc.), movies, films, video clips, audio, audio books, and/or any other media or multimedia and/or combination of the same. As referred to herein, the term “multimedia” should be understood to mean media content that utilizes at least two different content forms described above, for example, text, audio, images, video, or interactivity content forms. Media content may be recorded, played, displayed or accessed by user equipment devices, but can also be part of a live performance. In some embodiments, media content can include over-the-top (OTT) content. Examples of OTT content providers include YOUTUBE, NETFLIX, and HULU, which provide audio and video via IP packets. Youtube is a trademark owned by Google Inc., Netflix is a trademark owned by Netflix Inc., and Hulu is a trademark owned by Hulu, LLC.


Media content can be provided from any suitable source in some embodiments. In some embodiments, media content can be electronically delivered to a user's location from a remote location. For example, media content, such as a Video-On-Demand movie, can be delivered to a user's home from a cable system server. As another example, media content, such as a television program, can be delivered to a user's home from a streaming media provider over the Internet.



FIG. 1 shows an example 100 of a guidance display that can be provided as part of an interactive media guidance application in accordance with some embodiments. As illustrated, a user may be presented with display 100 in response to the user selecting a selectable option provided in a displayed menu (e.g., an “Internet Videos” option, a “DivXTV” option, a “Program Listings” option, etc.), pressing a dedicated button (e.g., a GUIDE button) on a user input interface or device, and/or taking any other suitable action.


As illustrated in FIG. 1, guidance display 100 may include lists of media identifiers, such as a first list of media identifiers 102 that lists categories of media content, and a second list of media identifiers 104 that lists particular pieces of media content within a selected category that are available for presentation.


Additional media guidance data, such as additional media identifiers, may be presented in response to a user selecting a navigational icon 108.


Display 100 may also include a media queue region 110 that lists one or more pieces of media content selected and queued for playback, and a video region 112 in which pieces of media content can be presented.


In some embodiments, information relating to a piece of media content can also be presented to a user. For example, information 118 can include a name of a piece of media content, a time at which the media content is available (if applicable), a source (e.g., channel, Web address, etc.) from which the media content can be obtained, a parental rating for the piece of media content, a duration of the piece of media content, a description of the piece of media content, a review or a quality rating of the piece of media content, and/or any other suitable information.


In some embodiments, pieces of media content can be played in a full sized display screen in response to a user selecting “full screen” button 120.


In some embodiments, a user may be able to set settings related to the interactive media guidance application by pressing a settings button, such as settings button 122 of FIG. 1. The settings that can be set can include any suitable settings such as channel and program favorites, programming preferences that the guidance application can utilize to make programming recommendations, display preferences, language preferences, and/or any other suitable settings.


Turning to FIG. 2, an example 200 of architecture of hardware that can be used in accordance with some embodiments is shown. As illustrated, architecture 200 can include a user television equipment device 202, a user computer equipment device 204, a wireless user communication device 206, a communications network 214, a media content source 216, a media guidance data source 218, a media encoder 230, content distribution networks (CDNs) 252, 254, and 256, and communication paths 208, 210, 212, 220, 222, 232, 242, 244, and 246, in some embodiments.


In some embodiments, user television equipment device 202, user computer equipment device 204, and wireless user communication device 206, which can each be referred to herein as a “user equipment device,” can be any suitable devices for presenting media content, presenting an interactive media guidance application for selecting content, and/or performing any other suitable functions as described herein.


User television equipment device 202 can be any suitable user television equipment device or devices in some embodiments. For example, in some embodiments, user television equipment device 202 can include any suitable television, smart TV, set-top box, integrated receiver decoder (IRD) for handling satellite television, digital storage device, digital media receiver (DMR), digital media adapter (DMA), streaming media device, DVD player, DVD recorder, connected DVD, local media server, BLU-RAY player, BLU-RAY recorder, any other suitable user television equipment, and/or any other suitable combination of the same.


User computer equipment 204 can be any suitable user computer equipment in some embodiments. For example, in some embodiments, user computer equipment 204 can include any suitable personal computer (PC), laptop computer, tablet computer, WebTV box, personal computer television (PC/TV), PC media server, PC media center, hand-held computer, stationary telephone, non-portable gaming machine, any other suitable user computer equipment, and/or any other suitable combination of the same.


Wireless user communication device 206 can be any suitable wireless user communication device or devices in some embodiments. For example, in some embodiments, wireless user communication device 206 can include any suitable personal digital assistant (PDA), mobile telephone, portable video player, portable music player, portable gaming machine, smart phone, any other suitable wireless device, and/or any suitable combination of the same.


In some embodiments, user equipment devices may be connectable to a communications network. For example, in some embodiments, user equipment devices may be Internet-enabled allowing them to access Internet media content.


In some embodiments, communications network 214 may be any one or more networks including the Internet, a mobile phone network, a mobile voice network, a mobile data network (e.g., a 3G, 4G, or LTE network), a cable network, a satellite network, a public switched telephone network, a local area network, a wide area network, a wireless network (e.g., WiFi, WiMax, etc.), any other suitable type of communications network, and/or any suitable combination of communications networks.


Media content source 216 may include one or more types of content distribution equipment for distributing any suitable media content, including television distribution facility equipment, cable system head-end equipment, satellite distribution facility equipment, programming source equipment (e.g., equipment of television broadcasters, such as NBC, ABC, HBO, etc.), intermediate distribution facility equipment, Internet provider equipment, on-demand media server equipment, live media distribution equipment, cameras, and/or any other suitable media content provider equipment, in some embodiments. NBC is a trademark owned by the National Broadcasting Company, Inc., ABC is a trademark owned by the American Broadcasting Companies, Inc., and HBO is a trademark owned by the Home Box Office, Inc.


Media content source 216 may be operated by the originator of content (e.g., a television broadcaster, a Webcast provider, etc.) or may be operated by a party other than the originator of content (e.g., an on-demand content provider, an Internet provider of content of broadcast programs for downloading, etc.), in some embodiments.


Media content source 216 may be operated by cable providers, satellite providers, on-demand providers, Internet providers, providers of over-the-top content, subscription providers, rental providers, and/or any other suitable provider(s) of content, in some embodiments.


Media content source 216 may include a remote media server used to store different types of content (including video content selected by a user), in a location remote from any of the user equipment devices, in some embodiments. Systems and methods for remote storage of content, and providing remotely stored content to user equipment are discussed in greater detail in connection with Ellis et al., U.S. Pat. No. 7,761,892, issued Jul. 20, 2010, which is hereby incorporated by reference herein in its entirety.


Media guidance data source 218 may provide any suitable media guidance data, such as names of pieces of media content, times at which the media content is available (if applicable), sources (e.g., channels, Web addresses, etc.) from which the media content can be obtained, parental ratings for the pieces of media content, durations of the pieces of media content, descriptions of the pieces of media content, reviews or quality ratings of the pieces of media content, and/or any other suitable information, in some embodiments.


Media guidance data may be provided by media guidance data source 218 to the user equipment devices using any suitable approach, in some embodiments. In some embodiments, for example, an interactive media guidance application may be a stand-alone interactive television program guide that receives this media guidance data from media guidance data source 218 via a data feed (e.g., a continuous feed or trickle feed). In some embodiments, this media guidance data may be provided to the user equipment on a television channel sideband, using an in-band digital signal, using an out-of-band digital signal, or by any other suitable data transmission technique from media guidance data source 218. In some embodiments, this media guidance data may be provided to user equipment on multiple analog or digital television channels from media guidance data source 218. In some embodiments, media guidance data from media guidance data source 218 may be provided to users' equipment using a client-server approach, wherein media guidance data source 218 acts as a server.


In some embodiments, media guidance data source 218 may manage a pool of one or more content delivery networks (CDNs) that are used to deliver content to a plurality of user equipment devices. More particularly, in some embodiments, media guidance data source 218 may maintain a list of the CDNs from the pool and make changes to the list as CDNs are added to or removed from the pool. When a CDN is added to the pool by media guidance data source 218, or periodically, media guidance data source 218 may provide the list to media encoder 230.


In some embodiments, media guidance data source 218 may maintain records relating to the geographic distribution of user equipment devices that are currently streaming the content. For example, media guidance data source 218 may maintain a record that indicates a plurality of locations along with a count of user equipment devices located at each of the locations that are currently streaming the media content. In some embodiments, any one of the locations in the record may be indicated by: an identifier of a geographic location; an identifier of a network; an identifier of a network domain; an item of information that is found in a Domain Name Service (DNS) record; and/or any other suitable identifier.


Media encoder 230 may receive live content from media content source 216 and encode fragments of the content using a media encoding algorithm. Each fragment may be of any suitable duration, such as 2-10 seconds. In addition, each fragment may be encoded into one or more media files. In some embodiments, each fragment may be encoded into multiple media files that have different bit encoding rates.
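
A minimal sketch, assuming the encoder shells out to the ffmpeg tool, of encoding one fragment into several media files at different bit rates; the bit-rate ladder and file-naming scheme are illustrative assumptions rather than part of any embodiment.

    import subprocess

    BITRATES_KBPS = [800, 1800, 3500]               # hypothetical encoding ladder

    def encode_fragment(fragment_path, fragment_index):
        """Encode a single 2-10 second fragment into one media file per bit rate."""
        outputs = []
        for kbps in BITRATES_KBPS:
            out = f"fragment{fragment_index}_{kbps}k.mp4"
            subprocess.run(
                ["ffmpeg", "-y", "-i", fragment_path,
                 "-c:v", "libx264", "-b:v", f"{kbps}k",
                 "-c:a", "aac", "-b:a", "128k", out],
                check=True)
            outputs.append(out)
        return outputs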


Media encoder 230 may provide the media files corresponding to each fragment of the media content to each one of the pool of CDNs that are used to deliver the media content to user equipment devices. In some embodiments, media encoder 230 may identify the CDNs by obtaining the list maintained by media guidance data source 218. In some embodiments, the media files may be uploaded to the CDNs in the pool over a File Transfer Protocol (FTP) connection and/or any other suitable mechanism. Additionally or alternatively, in some embodiments, media encoder 230 may obtain a current copy of the list of CDNs before uploading media files that correspond to a fragment of the content. Doing so may cause any changes made to the pool of CDNs by media guidance data source 218 to take place immediately.
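
The upload step might look like the following Python sketch, which re-reads the CDN list before each batch so that pool changes take effect immediately; the FTP credentials and the get_current_cdn_list callable are hypothetical placeholders.

    import ftplib
    import os

    def upload_fragment_files(media_files, get_current_cdn_list):
        """Refresh the CDN list, then upload every media file to every CDN in the pool."""
        for host in get_current_cdn_list():         # re-reading here applies pool changes at once
            with ftplib.FTP(host) as ftp:
                ftp.login("encoder", "secret")      # hypothetical credentials
                for path in media_files:
                    with open(path, "rb") as f:
                        ftp.storbinary(f"STOR {os.path.basename(path)}", f)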


After the upload of a set of media files that correspond to a fragment of the media content is completed, media encoder 230 may provide a set of addresses to media guidance data source 218. Each address in the set may be the address of one of the media files at one of the CDNs in the pool. Each address in the set may be usable to retrieve a media file from the address' respective CDN. Addresses from the set may be later communicated by media guidance data source 218 to user equipment devices that seek to stream the media content. The user equipment devices may use these addresses to obtain the media files from the CDNs in the pool.


Content delivery network (CDN) 252 may distribute content to user equipment devices 202, 204, and/or 206. CDN 252 may include: load balancing servers; request servers; cache servers; storage servers; communications switches; gateways; and/or any other suitable equipment. In some embodiments, CDN 252 may include a cloud-based storage that includes virtualized pools of storage hosted in an Internet data center, such as the Amazon S3 storage provided by Amazon Web Services of Herndon, Va., USA. In some embodiments, the cloud-based storage may be used to “locally” cache media content for presentation on user equipment devices 202, 204, and/or 206. Any suitable type and/or number of equipment may be used to implement CDN 252, in some embodiments.


CDN 252 may use an adaptive bit rate (ABR) technique in which content is encoded into fragments (e.g., 2-10 seconds in length) that have different bit encoding rates. Having fragments of different bit encoding rates permits CDN 252 to dynamically select the proper bit-rate for user equipment devices depending on the networking resources available to any of the devices. In live content streaming, the fragments may be provided to client devices in real-time or near-real time as they are generated. As noted above, the fragments may be obtained from media encoder 230.
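
As a rough illustration of the client-facing half of such an ABR scheme, a rendition might be picked as in the sketch below; the 0.8 safety margin and the bit-rate ladder are assumptions made for this example.

    def pick_rendition(measured_kbps, available_kbps=(800, 1800, 3500)):
        """Pick the highest encoded bit rate that fits within the measured throughput."""
        fitting = [r for r in sorted(available_kbps) if r <= measured_kbps * 0.8]
        return fitting[-1] if fitting else min(available_kbps)

For instance, with a measured throughput of 2,500 kbps, the sketch above returns the 1,800 kbps rendition.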


CDN 254 and CDN 256 may have similar structures to CDN 252. Each CDN may include a different plurality of computing devices (e.g., load balancers, cache servers, or storage servers). Additionally or alternatively, each CDN may be implemented by using a different data center. In some embodiments, CDNs 252, 254, and 256 may have different geographic locations from one another.


In some embodiments, each of CDN 252, 254 and 256 may be operated by third-party operators that deliver content on behalf of the operators of media content source 216 and/or media guidance data source 218. Each CDN may be associated with a different price that the operator of the CDN charges for the delivery of the media content. The price may be structured in terms of: dollars per amount of data served; dollars per amount of bandwidth that is made available to serve the content; dollars per amount of bandwidth that is consumed; and/or in accordance with any other suitable pricing scheme.


Although only one each of user equipment devices 202, 204, and/or 206, sources 216 and 218, media encoder 230, and CDNs 252, 254, and 256 are illustrated in FIG. 2 in order to avoid overcomplicating the drawing, any suitable number of each of these components can be provided in some embodiments.


Each user may utilize more than one type of user equipment device in some embodiments. In some embodiments, any of user equipment devices 202, 204, and 206 can be combined, and any of media content source 216, media encoder 230, and media guidance data source 218 may be combined.


Paths 208, 210, 212, 220, 222, 232, 242, 244, and 246 may separately or together include one or more communications paths, such as a satellite path, a fiber-optic path, a cable path, a path that supports Internet communications (e.g., IPTV), free-space connections (e.g., for broadcast or other wireless signals), or any other suitable wired or wireless communications path or combination of such paths, in some embodiments. Path 212 is drawn with dotted lines to indicate that, in the example embodiment shown in FIG. 2, it can be a wireless path (although this path may be a wired path, if desired), and paths 208, 210, 220, 222, and 232 are drawn as solid lines to indicate they can be wired paths (although these paths may be wireless paths, if desired). In some embodiments, communication to/from user equipment devices 202, 204, and 206, sources 216 and 218, media encoder 230, and CDNs 252, 254, and 256 may be provided by one or more of communications paths 208, 210, 212, 220, 222, 232, 242, 244, and 246, respectively, but each is shown as a single path in FIG. 2 to avoid overcomplicating the drawing.


Although direct communications paths are not drawn between user equipment devices 202, 204, and 206, and between sources 216 and 218, media encoder 230, and CDNs 252, 254, and 256, these components may communicate directly with each other via communication paths, such as those described above, as well as via point-to-point communication paths, such as USB cables, IEEE 1394 cables, wireless paths (e.g., Bluetooth, infrared, IEEE 802.11x, etc.), or other communication via wired or wireless paths, in some embodiments. BLUETOOTH is a certification mark owned by Bluetooth SIG, INC. The user equipment devices 202, 204, and 206, sources 216 and 218, media encoder 230, and CDNs 252, 254, and 256 may also communicate with each other through an indirect path via communications network 214, in some embodiments.


In some embodiments, sources 216 and 218 and media encoder 230 can be implemented in any suitable hardware. For example, sources 216 and 218 and media encoder 230 can be implemented in any of a general purpose device such as a computer or a special purpose device such as a client, a server, a mobile terminal (e.g., a mobile phone), etc. Any of these general or special purpose devices can include any suitable components such as a hardware processor (which can be a microprocessor, a digital signal processor, a controller, etc.). Furthermore, as noted above, any of media encoder 230 and sources 218 and 216 may be integrated as a single device (e.g., a single computer) and/or a single distributed system.



FIG. 3 shows an example of hardware that can be provided in an illustrative user equipment device 300, such as user television equipment device 202, user computer equipment device 204, and/or wireless user communication device 206 of FIG. 2, in accordance with some embodiments. As illustrated, device 300 can include control circuitry 304 (which can include processing circuitry 306 and storage 308), a user input interface 310, a display 312, speakers 314, and an input/output (hereinafter “I/O”) interface 316, in some embodiments.


Control circuitry 304 may include any suitable processing circuitry such as processing circuitry 306. As referred to herein, processing circuitry 306 can be circuitry that includes one or more microprocessors, microcontrollers, digital signal processors, programmable logic devices, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), hardware processors, etc., and may include a multi-core processor (e.g., dual-core, quad-core, hexa-core, or any suitable number of cores) or a supercomputer, in some embodiments. In some embodiments, processing circuitry may be distributed across multiple separate processors or processing units, such as, for example, multiple of the same type of processing units (e.g., two Intel Core i7 processors) or multiple different processors (e.g., an Intel Core i5 processor and an Intel Core i7 processor).


Storage 308 can be any suitable digital storage mechanism in some embodiments. For example, storage 308 can include any device for storing electronic data, program instructions, computer software, firmware, register values, etc., such as random-access memory, read-only memory, hard drives, optical drives, digital video disc (DVD) recorders, compact disc (CD) recorders, BLU-RAY disc (BD) recorders, BLU-RAY 3D disc recorders, digital video recorders (DVR, sometimes called a personal video recorder, or PVR), solid state devices, quantum storage devices, gaming consoles, gaming media, or any other suitable fixed or removable storage devices, and/or any combination of the same. Storage 308 may be used to store media content, media guidance data, executable instructions (e.g., programs, software, scripts, etc.) for providing an interactive media guidance application, and for any other suitable functions, and/or any other suitable data or program code, in accordance with some embodiments. Nonvolatile memory may also be used (e.g., to launch a boot-up routine and other instructions), in some embodiments. Cloud-based storage may be used to supplement storage 308 or instead of storage 308 in some embodiments.


Control circuitry 304 may include video generating circuitry and tuning circuitry, such as one or more analog tuners, one or more MPEG-2 decoders or other digital decoding circuitry, high-definition tuners, or any other suitable tuning or video circuits or combinations of such circuits, in some embodiments. Encoding circuitry (e.g., for converting over-the-air, analog, or digital signals to MPEG signals for storage) may also be provided, in some embodiments. Control circuitry 304 may also include scaler circuitry for upconverting and downconverting content into the preferred output format of the user equipment 300, in some embodiments. Circuitry 304 may also include digital-to-analog converter circuitry and analog-to-digital converter circuitry for converting between digital and analog signals. The video generating circuitry may be used for presenting media content, in some embodiments. The tuning and encoding circuitry may be used by the user equipment device to receive and to display, to play, or to record content, in some embodiments. The tuning and encoding circuitry may also be used to receive guidance data, in some embodiments. The circuitry described herein, including for example, the tuning, video generating, encoding, decoding, encrypting, decrypting, scaler, and analog/digital circuitry, may be implemented using software running on one or more general purpose or special purpose hardware processors, in some embodiments. Multiple tuners may be provided to handle simultaneous tuning functions (e.g., watch and record functions, picture-in-picture (PIP) functions, multiple-tuner recording, etc.), in some embodiments. If storage 308 is provided as a separate device from user equipment 300, the tuning and encoding circuitry (including multiple tuners) may be associated with storage 308, in some embodiments.


A user may send instructions to control circuitry 304 using user input interface 310, in some embodiments. User input interface 310 may be any suitable user interface, such as a remote control, mouse, trackball, keypad, keyboard, touch screen, touchpad, stylus input, joystick, voice recognition interface, or other user input interfaces, in some embodiments.


Display 312 may be provided as a stand-alone device or integrated with other elements of user equipment device 300, in some embodiments. Display 312 may be one or more of a monitor, a television, a liquid crystal display (LCD) for a mobile device, or any other suitable equipment for displaying visual images, in some embodiments. In some embodiments, display 312 may be HDTV-capable. In some embodiments, display 312 may be a 3D display.


A video card or graphics card may generate the output to display 312, in some embodiments. The video card may offer various functions such as accelerated rendering of 3D scenes and 2D graphics, MPEG-2/MPEG-4 decoding, TV output, or the ability to connect multiple monitors, in some embodiments. The video card may be any processing circuitry described above in relation to control circuitry 304, in some embodiments. The video card may be integrated with the control circuitry 304 or may be integrated with display 312, in some embodiments.


Speakers 314 may be provided as integrated with other elements of user equipment device 300 or may be stand-alone units, in some embodiments. The audio component of media content displayed on display 312 may be played through speakers 314, in some embodiments. In some embodiments, the audio may be distributed to a receiver (not shown), which processes and outputs the audio via speakers 314.


I/O interface 316 can be any suitable I/O interface 316 in some embodiments. For example, in some embodiments, I/O interface 316 can be any suitable interface for coupling control circuitry 304 (and specifically processing circuitry 306) to one or more communications paths (e.g., paths 208, 210, and 212 described in FIG. 2). More particularly, for example, I/O interface 316 can include a cable modem, an integrated services digital network (ISDN) modem, a digital subscriber line (DSL) modem, a telephone modem, an Ethernet card, a fiber-optic modem, a wireless modem, and/or any other suitable communications circuitry. In some embodiments, the I/O interface can be used to provide content and data from an external location to device 300. For example, in some embodiments, I/O interface 316 can be used to provide media content (e.g., broadcast programming, on-demand programming, Internet content, content available over a local area network (LAN) or wide area network (WAN), and/or any other suitable content), media guidance data, subtitles, time codes, and/or any other suitable information or data to control circuitry 304 of device 300. In some embodiments, I/O interface 316 can also be used to send and receive commands, requests, and other suitable data from and to, respectively, control circuitry 304. Any suitable number of I/O interfaces 316 can be provided, even though only one is shown in FIG. 3 to avoid overcomplicating the drawing.


The processes for playing back media content, the interactive media guidance application and/or any other suitable functions as described herein may be implemented as stand-alone applications on user equipment devices in some embodiments. For example, the processes for playing back media content and/or the interactive media guidance application may be implemented as software or a set of executable instructions which may be stored in storage 308, and executed by control circuitry 304 of a user equipment device 300.


In some embodiments, the processes for playing back media content, the interactive media guidance application, and/or any other suitable functions as described herein may be implemented as client-server applications. In such client-server applications, a client application may reside on a user equipment device, and a server application may reside on a remote server, such as source 216 or one of CDNs 252, 254, and 256. For example, the processes for playing back media content may be implemented partially as a client application on control circuitry 304 of user equipment device 300 and partially as a server application on media content source 216 or one of CDNs 252, 254, and 256. As another example, an interactive media guidance application may be implemented partially as a client application on control circuitry 304 of user equipment device 300 and partially on a remote server (e.g., media guidance data source 218 or one of CDNs 252, 254, and 256) as a server application running on control circuitry of the remote server.



FIG. 4 shows an example of hardware that can be provided in an illustrative server 400. Server 400 may be part of a media guidance data source, such as media guidance data source 218, and it may implement a media content delivery process, such as at least portions of content delivery process 500, which is shown in FIGS. 5A and 5B. As illustrated, server 400 can include control circuitry 402 (which can include processing circuitry 404 and storage 406) and a network interface 408.


Control circuitry 402 may include any suitable processing circuitry such as processing circuitry 404. As referred to herein, processing circuitry 404 can be circuitry that includes one or more microprocessors, microcontrollers, digital signal processors, programmable logic devices, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), hardware processors, etc., and may include a multi-core processor (e.g., dual-core, quad-core, hexa-core, or any suitable number of cores) or a supercomputer, in some embodiments. In some embodiments, processing circuitry may be distributed across multiple separate processors or processing units, such as, for example, multiple of the same type of processing units (e.g., two Intel Core i7 processors) or multiple different processors (e.g., an Intel Core i5 processor and an Intel Core i7 processor).


Storage 406 can be any suitable digital storage mechanism in some embodiments. For example, storage 406 can include any device for storing electronic data, program instructions, computer software, firmware, register values, etc., such as random-access memory, read-only memory, hard drives, optical drives, digital video disc (DVD) recorders, compact disc (CD) recorders, BLU-RAY disc (BD) recorders, BLU-RAY 3D disc recorders, digital video recorders (DVR, sometimes called a personal video recorder, or PVR), solid state devices, quantum storage devices, gaming consoles, gaming media, or any other suitable fixed or removable storage devices, and/or any combination of the same. Storage 406 may be used to store media content, media guidance data, executable instructions (e.g., programs, software, scripts, etc.) for providing an interactive media guidance application, and for any other suitable functions, and/or any other suitable data or program code, in accordance with some embodiments. Nonvolatile memory may also be used (e.g., to launch a boot-up routine and other instructions). Cloud-based storage may be used to supplement storage 406 or instead of storage 406 in some embodiments.


Control circuitry 402 may include encoding circuitry for encoding media content (e.g., video or audio). Control circuitry 402 may also include adaptive bit streaming circuitry for encoding the media content into multiple bit rates and performing switches between the streams during normal playback based upon the streaming conditions. Control circuitry 402 may also include streaming circuitry for transmitting the different bit streams via network interface 408.


For example, in some embodiments, interface 408 can be any suitable interface for coupling control circuitry 402 (and specifically processing circuitry 404) to one or more communications networks. More particularly, for example, interface 408 can include a cable modem, an integrated services digital network (ISDN) modem, a digital subscriber line (DSL) modem, a telephone modem, an Ethernet card, a fiber-optic modem, a wireless modem, and/or any other suitable communications circuitry. In some embodiments, interface 408 can be used by server 400 to stream content to a client device, such as device 300. More particularly, in some embodiments, interface 408 can be used to provide media content (e.g., broadcast programming, on-demand programming, Internet content, content available over a local area network (LAN) or wide area network (WAN), and/or any other suitable content). In some embodiments, interface 408 can also be used to receive commands, requests, and other suitable data from a client device. Such requests may be for blocks (e.g., chunks) of media content that is being streamed.



FIGS. 5A and 5B depict a flowchart of an example of a process 500 for distributing media content in accordance with some embodiments of the disclosed subject matter.


At 502, one or more first records may be stored in memory or any other suitable location. The one or more first records may include identifiers for a pool of one or more content delivery networks (CDNs) that can be used to stream media content to a plurality of user equipment devices. Any suitable type of identifiers may be used in the first record(s) in some embodiments. For example, in some embodiments, an identifier of a CDN in the pool may include an identifier for a component of the CDN, such as a gateway or a load balancer. Step 502 may be performed by: a media guidance data source, such as media guidance data source 218; processing circuitry of the media guidance data source; and/or any other suitable device or processing circuitry thereof.


At 504, one or more second records may be stored in memory or any other suitable location. The second record(s) may include information relating to the geographic distribution of user equipment devices that are currently streaming media content from CDNs in the pool. In some embodiments, the record(s) may identify a set of one or more locations. In some embodiments, for each location, the record may identify a count of user equipment devices that are located at that location and are currently streaming the media content. The count may be a string, a number, or an alphanumerical string that is based on (or indicative of) a number of user equipment devices that are streaming the media content.


In some embodiments, two or more user equipment devices may be considered to be at the same location when the user equipment devices are located in the same region as each other (e.g., in the same district, in the same service area, in the same telephone service area, in the same city, or in the same state, etc.). Any suitable definition of location may be used, in some embodiments.


Additionally or alternatively, in some embodiments, two or more user equipment devices may be considered to be at the same location when the user equipment devices are part of the same network domain (or the same network; or the same portion of a network) as each other. Additionally or alternatively, in some embodiments, two user equipment devices may be considered to be at the same location when one of the user equipment devices is part of a network domain (or a network; or a portion of a network) that is associated with a network domain (or a network; or a portion of a network) the other user equipment device is part of. Thus, in some embodiments, whether two user equipment devices are located at the same location may, at least in part, depend on the topology of network(s) the two user equipment devices are part of.


Additionally or alternatively, in some embodiments, two user equipment devices may be considered to be at the same location when a first record corresponding to one of the user equipment devices contains an item of information (e.g., a number, a word, or an alphanumerical string) that satisfies a similarity criterion with respect to another item of information that is part of a second record that corresponds to the other user equipment device. Any suitable type of records may be used, in some embodiments. For example, the first record and the second record may be Domain Name Service (DNS) records, records maintained by Internet service providers, records maintained by network administrators, records maintained by content distributors, and/or any other suitable records.


To determine whether two or more user equipment devices are at the same location, any suitable similarity criterion may be used in some embodiments. For example, in some embodiments, the similarity criterion may be one that is satisfied when the two items of information are identical. As another example, in some embodiments, the similarity criterion may be one that is satisfied when the first item of information and the second item of information are within a predetermined distance from one another in the space of items of information of their type.
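
One possible similarity criterion is sketched below, under the assumption that the compared items are dot-separated, DNS-style strings; the notion of distance used here (the number of differing labels) is only an illustrative choice.

    def same_location(item_a, item_b, max_differing_labels=0):
        """Identical items always match; otherwise count how many dot-separated labels differ."""
        if item_a == item_b:
            return True
        labels_a, labels_b = item_a.split("."), item_b.split(".")
        if len(labels_a) != len(labels_b):
            return False
        return sum(a != b for a, b in zip(labels_a, labels_b)) <= max_differing_labels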


Step 504 may be performed by: a media guidance data source, such as media guidance data source 218; processing circuitry of the media guidance data source; and/or any other suitable device or processing circuitry thereof.


At 506, a fragment of the media content may be received. The fragment may be received in any suitable manner, in some embodiments. The fragment may include any suitable media content, such as media content data that is sufficient to render (e.g., in sound and/or in image) at least a portion of the media content (e.g., 2-10 seconds of the media content).


At 508, the fragment of the media content may be encoded to generate one or more media files that encode the fragment. The fragment may be encoded in any suitable manner, in some embodiments. For example, in some embodiments, each of the files may be encoded at a different bit rate.


At 510, one or more CDNs in the pool may be identified. The one or more CDNs may be identified in any suitable manner, in some embodiments. For example, in some embodiments, a first record may be obtained and used to identify the CDNs.


At 512, at least some of the generated media files may be provided to the identified CDNs. The media files may be provided in any suitable manner, in some embodiments. For example, in some embodiments, at least some of the generated media files may be uploaded via a File Transfer Protocol (FTP) connection to each of the CDNs. Upon uploading any one of the media files to a given CDN, the address (e.g., Uniform Resource Locator (URL)) of that file at the CDN may be recorded. In some embodiments, the addresses (of some or all) of the media files at different CDNs may be recorded. These addresses, as is further discussed below, may be usable by user equipment devices to obtain the media files when the content is being streamed.


At 514, the addresses of the media files may be provided to a media guidance data source, such as media guidance data source 218. The addresses may be provided in any suitable manner, in some embodiments.


Each of steps 506-514 may be performed by: an encoder, such as media encoder 230; and/or any other suitable device.


At 516, a request may be received from a user equipment device to initiate a streaming of the content to the user equipment device. The request may be received in any suitable manner in some embodiments. For example, in some embodiments, the request may be transmitted over a communications network, such as network 214. In some embodiments, the request may be received at a media guidance data source, such as media guidance data source 218.


At 518, the location of the user equipment device may be determined. The location may be determined in any suitable manner, in some embodiments. For example, in some embodiments, determining the location may include obtaining an identifier that is indicative of the user equipment device's physical location, such as: an identifier of a network domain (or portion thereof) the device is part of; an identifier of a network (or portion thereof) the device is part of; coordinates of the device (e.g., by using a Global Positioning System (GPS) capability that is built into the device); and/or any other suitable identifier. Additionally or alternatively, in some embodiments, the location of the user equipment device may be determined by using an instance of the nslookup utility. Additionally or alternatively, in some embodiments, determining the location of the user equipment device may include retrieving a record (e.g., a DNS record, a record maintained by an Internet Service provider, and/or any other suitable type of record) that corresponds to the user equipment device and obtaining an item of information that is part of the retrieved record and is indicative of the location of the user equipment device. Any suitable item of information may be used in some embodiments. For example, in some embodiments, the item of information may include: an item of information that is indicative of a network domain (or portion thereof); an item of information that is indicative of a network (or portion thereof); an item of information that is indicative of a city; an item of information that is indicative of a physical address; an item of information that is indicative of a network address; and/or any other suitable item.
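
As one hedged example of such a lookup, the reverse-DNS name of a device's IP address could be used as a coarse location identifier, as in the sketch below; the choice of keeping the last three hostname labels is an assumption made for illustration.

    import socket

    def location_from_ip(ip_address):
        """Approximate a location identifier from the reverse-DNS name of the device's IP address."""
        try:
            hostname, _, _ = socket.gethostbyaddr(ip_address)
        except (socket.herror, socket.gaierror):
            return None
        return ".".join(hostname.split(".")[-3:])   # keep the provider/region suffix as the identifier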


At 520, a count of user equipment devices that are located at the same location as the user equipment device and are streaming the media content may be determined. This count may be determined in any suitable manner, in some embodiments. For example, in some embodiments, one of the second records may be obtained and used to identify the count. Additionally or alternatively, in some embodiments, the count may be determined based on responses to queries transmitted over a communications network to one or more devices/systems that are responsible for routing user equipment devices to CDNs (e.g., in the same manner as the device/system executing steps 516-530), with the responses providing respective counts of user equipment devices at the location that are currently streaming the media content and have been assisted in streaming the media content by each of the CDNs.


At 522, a determination may be made whether a predetermined condition is satisfied. Any suitable predetermined condition may be used, in some embodiments. For example, in some embodiments, the predetermined condition may be based on the count. For example, the predetermined condition may be one that is satisfied when the count meets (e.g., exceeds, equals, or falls below) a predetermined threshold.


Additionally or alternatively, in some embodiments, the predetermined condition may be based on a value of a quality of service metric of a communications path connecting a CDN that is used to provide the media content with a user equipment device that is streaming the content. Any suitable quality of service metric may be used. For example, the quality of service metric may be latency, bandwidth, jitter, and/or any other suitable quality of service metric. Thus, in some embodiments, the predetermined condition may be one that is satisfied when the quality of service metric meets a predetermined threshold.


Additionally or alternatively, in some embodiments, the predetermined condition may be based on a plurality of quality of service metric values, wherein each quality of service metric value is for a different one of a plurality of communications paths. Each communications path may be one that connects a CDN that is used to provide the media content with user equipment. Each communications path may lead to a different one of a plurality of user equipment devices that are streaming the content. In some embodiments, the predetermined condition may be based on an average, median, and/or any other suitable statistical characteristic of the plurality of quality of service metric values.
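
A minimal sketch of one such combined condition is given below, assuming a count threshold and a median-latency threshold; both numbers are arbitrary placeholders rather than values from any embodiment.

    from statistics import median

    def condition_satisfied(count, path_latencies_ms,
                            count_threshold=10000, latency_threshold_ms=250):
        """Satisfied when the per-location count meets its threshold, or when the median
        latency across the monitored CDN-to-device paths meets its threshold."""
        if count >= count_threshold:
            return True
        return bool(path_latencies_ms) and median(path_latencies_ms) >= latency_threshold_ms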


When the predetermined condition is determined to be satisfied, the execution of process 500 proceeds to step 524. Otherwise, when the predetermined condition is found to not be satisfied, the execution of process 500 proceeds to step 528.


At 524, an additional CDN that is available to stream the media content may be identified. This identification may be performed in any suitable manner, and any suitable CDN may be identified, in some embodiments. The additional CDN, in some embodiments, may include a plurality of computing devices (e.g., servers, load balancers, cache servers, and/or any suitable type of computing device). In some embodiments, the additional CDN may be implemented using a data center that is different from the data centers used to implement the CDNs in the pool. For example, the data center of the additional CDN may be at a different geographic location than the data centers of the other CDNs in the pool.


In some embodiments, the additional CDN may be selected from a plurality of available CDNs. Additionally or alternatively, the CDN may be selected based on a predetermined criterion. Any suitable criterion may be used, in some embodiments. For example, in some embodiments, the CDN may be selected based on a geographic location corresponding to the CDN (e.g., a geographic location where a data center that is used to implement the CDN is located).


As another example, in some embodiments, the CDN may be selected from the plurality of available CDNs based on a QoS metric associated with the CDN, such as latency or bandwidth. For example, in some embodiments, the QoS metric associated with the CDN may indicate one of the latency, bandwidth, or throughput for one or more communications paths between a component of the CDN (e.g., a load balancer) and a device located at a predetermined location, such as the location determined at step 518 and/or any other suitable location.


As yet another example, in some embodiments, the CDN may be selected from the plurality of available CDNs based on a price associated with the CDN (e.g., a price for a unit of bandwidth that is made available for serving the media content or price for a unit of bandwidth that is consumed by streaming the media content, a price for a unit of data served, etc.). For example, the CDN may be selected based on having the lowest associated price.


As yet another example, in some embodiments, the CDN may be selected from the plurality of available CDNs based on distance from CDNs in the pool. For example, the CDN may be selected based on being situated the furthest, of all CDNs in the plurality, from a given CDN in the pool. The distance between different CDNs may be based on the physical distance, the logical distance, and/or the network distance between the locations of data centers that are used to implement the CDNs.
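
The selection criteria above could be combined into a single score, as in the sketch below; the weights, field names, and candidate values are all illustrative assumptions and not part of any embodiment.

    def select_additional_cdn(candidates):
        """Pick the candidate scoring best on latency, price, and distance from the pool."""
        def score(cdn):
            return (-cdn["latency_ms"]              # lower latency is better
                    - 100.0 * cdn["price_per_gb"]   # lower price is better
                    + 0.01 * cdn["km_from_pool"])   # farther from the existing pool is better
        return max(candidates, key=score)

    candidates = [
        {"name": "cdn-254", "latency_ms": 40, "price_per_gb": 0.08, "km_from_pool": 1200},
        {"name": "cdn-256", "latency_ms": 25, "price_per_gb": 0.12, "km_from_pool": 300},
    ]
    print(select_additional_cdn(candidates)["name"])   # prints "cdn-256" with these weights and values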


In some embodiments, the additional CDN may be implemented using a different data center than data centers used to implement CDNs in the pool. In some embodiments, the additional CDN may include a plurality of computing devices (e.g., load balancers, cache servers, storage servers, etc.).


At 526, the CDN identified at step 524 may be added to the pool of CDNs that are responsible for streaming the media content. The CDN may be added in any suitable manner, in some embodiments. For example, in some embodiments, adding the CDN to the pool may include adding an identifier for the CDN to one or more of the first records. Additionally or alternatively, in some embodiments, adding the identified CDN to the pool may include configuring an encoder, such as media encoder 230, to start uploading media files corresponding to fragments of the media content to the CDN. Additionally or alternatively, configuring the encoder may include providing (e.g., transmitting over a communications network) an identifier for the CDN to the encoder.
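
Under the assumption that the first record is a simple list of CDN identifiers and that the encoder exposes a notification hook, adding a CDN to the pool might reduce to the following sketch; both names are hypothetical.

    def add_cdn_to_pool(cdn_id, first_record, notify_encoder):
        """Append the CDN identifier to the first record and push the updated list to the encoder."""
        if cdn_id not in first_record:
            first_record.append(cdn_id)
        notify_encoder(list(first_record))          # encoder starts uploading fragments to the new CDN too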


At 528, the user equipment device may be provided with information that may be usable by the user equipment device to begin streaming the media content. Any suitable information may be provided, and this information may be provided in any suitable manner, in some embodiments. For example, the information may include an address (e.g., a URL) that is usable to retrieve a media file from a CDN in the pool. The media file may be one that corresponds to a fragment of a media stream and is uploaded to the CDN by an encoder, such as media encoder 230, in some embodiments. In some embodiments, in instances where the count is determined to meet the threshold at 522, the address may point to an instance of the media file that is stored at the CDN identified at step 524. The user equipment device may then use the streaming information to stream the media content from the CDNs in the pool.
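
As a small illustration, assuming the media guidance data source keeps a mapping from each CDN identifier to the address of the newest fragment on that CDN, the streaming information returned to the device might be assembled as follows; the mapping name is hypothetical.

    def streaming_info(newest_fragment_urls, cdn_pool):
        """Return one fragment address per CDN in the pool for the requesting device."""
        return [newest_fragment_urls[cdn] for cdn in cdn_pool if cdn in newest_fragment_urls]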


At 530, the record indicating the geographic distribution of the user equipment devices may be updated. This record may be updated in any suitable manner, in some embodiments. For example, in some embodiments, the count determined at step 520 may be incremented by one (or otherwise changed) in order to reflect that the user equipment device has begun streaming the media content based on the streaming information provided at step 528.


Any of steps 516-530 may be performed by: a media guidance data source, such as media guidance data source 218; processing circuitry of the media guidance data source; and/or any other suitable device or processing circuitry thereof.


In some embodiments, a functional separation may be maintained between the encoding of the media content and the management of the pool of CDNs. As noted above, steps 506-514 may be performed by one or more media encoders, whereas steps 516-530 may be performed by a media guidance data source. Thus, in some embodiments, the mechanism for adding CDNs to the pool (e.g., the determination of whether the predetermined condition is satisfied) may be implemented at a device that is separate from any media encoders.


Although in the above example a CDN is added to a pool of CDNs that are responsible for streaming the media content, in other examples, when the count meets the threshold, a server may be added to a pool of servers that are used to stream the content, in some embodiments. For example, once added, that server may begin to receive streaming resources associated with the pool, as discussed with respect to step 528. The address of that server may be provided to user equipment devices that seek to begin streaming the media content, and the server may begin providing the media content to any devices that establish a connection with the server, in some embodiments.


It should be noted, however, that in some embodiments, adding a server to a pool of servers may be different from adding a CDN to a pool. In some embodiments, adding a new CDN to a pool of CDNs may involve utilizing another data center to distribute the media content along with underlying data center infrastructure, such as load balancers and caching servers. The data center may be at a different location than other data centers in the pool and, thus, adding the CDN to the pool may provide user equipment devices located at the first location with additional network paths to stream the media content over. This in turn may prevent congestion of network paths spanning between the first location and other CDNs in the pool.


The above steps of the flow diagrams of FIGS. 5A-B may be executed or performed in any order or sequence not limited to the order and sequence shown and described in the figures. Some of the above steps of the flow diagrams of FIGS. 5A-B may be executed or performed substantially simultaneously where appropriate or in parallel to reduce latency and processing times. Some of the above steps of the flow diagrams of FIGS. 5A-B may be omitted. Although the above embodiments of the invention are described in reference to live content streaming, the techniques disclosed herein may be used in any type of data downloading, including non-live streaming of media content.


In some embodiments, any suitable computer readable media can be used for storing instructions for performing the mechanisms and/or processes described herein. For example, in some embodiments, computer readable media can be transitory or non-transitory. For example, non-transitory computer readable media can include media such as magnetic media (such as hard disks, floppy disks, etc.), optical media (such as compact discs, digital video discs, Blu-ray discs, etc.), semiconductor media (such as flash memory, electrically programmable read only memory (EPROM), electrically erasable programmable read only memory (EEPROM), etc.), any suitable media that is not fleeting or devoid of any semblance of permanence during transmission, and/or any suitable tangible media. As another example, transitory computer readable media can include signals on networks, in wires, conductors, optical fibers, circuits, any suitable media that is fleeting and devoid of any semblance of permanence during transmission, and/or any suitable intangible media.


The above described embodiments of the present disclosure are presented for purposes of illustration and not of limitation, and the present disclosure is limited only by the claims which follow.

Claims
  • 1. A system for live streaming of content, the system comprising: an architecture comprising a hardware media encoder and a hardware management server connected via a communications network; wherein: the media encoder is configured for real-time encoding and uploading a live content stream of a live event to a pool of content delivery networks for use in distributing the live content stream to a plurality of user equipment devices, wherein the pool of content delivery networks utilizes communication paths to stream the live content stream to the user equipment devices; the management server is configured to: maintain the pool of content delivery networks for use in distributing the live content stream, wherein each content delivery network of the pool of content delivery networks comprises load balancing servers, gateways, and storage servers connected via the communications network; receive a request to stream the live content stream from a particular user equipment device, wherein the particular user equipment device has a first geographic location that is shared with a plurality of user equipment devices; provide a listing of content delivery networks of the pool of content delivery networks to the media encoder, wherein the media encoder encodes and uploads a first fragment of the live content stream to the pool of content delivery networks; provide, to the particular user equipment device, a manifest that identifies the pool of content delivery networks, wherein the particular user equipment device requests at least a portion of the live content stream from a first content delivery network from the pool of content delivery networks over a first distribution path; determine whether a predetermined condition is satisfied, wherein the predetermined condition comprises whether a count of user equipment devices that are located at the first geographic location and are currently streaming the content from the pool of content delivery networks exceeds a threshold quantity of streaming user equipment devices; when the predetermined condition is satisfied: select an additional content delivery network from a set of available content delivery networks to add to the pool of content delivery networks based on a plurality of factors comprising at least: distance between a second geographic location corresponding to the additional content delivery network and geographic locations corresponding to content delivery networks within the pool of content delivery networks, latency along at least one different distribution path between the additional content delivery network and the particular user equipment device, and throughput for at least one different distribution path between the additional content delivery network and the particular user equipment device; update the pool of content delivery networks to include the additional content delivery network, wherein the additional content delivery network comprises a load balancing server, a gateway, and a storage server connected via a communications network; provide an updated listing of the updated pool of content delivery networks, including the additional content delivery network, to the media encoder, wherein the media encoder encodes and uploads a second fragment of the live content stream to the updated pool of content delivery networks, including the additional content delivery network, to stream the live content stream using a distribution path including the additional content delivery network from the updated pool of content delivery networks; and provide, to the particular user equipment device, an updated manifest that identifies the additional content delivery network, wherein the particular user equipment requests at least a portion of the live content stream from the additional content delivery network.
  • 2. The system of claim 1, wherein the predetermined condition further comprises whether a plurality of quality of service metric values satisfy certain thresholds.
  • 3. The system of claim 1, wherein throughput comprises network bandwidth.
  • 4. The system of claim 1, wherein providing the updated listing of the updated pool of content delivery networks comprises providing the media encoder with an identifier of the selected additional content delivery network, wherein the identifier identifies at least one of a gateway and a load balancer for the additional content delivery network.
  • 5. The system of claim 1, wherein the management server is further configured to maintain records relating to the geographic distribution of user equipment devices that are currently streaming content.
  • 6. The system of claim 1, wherein the plurality of factors further includes a price per unit of bandwidth associated with the additional content delivery network.
  • 7. A method for live streaming of content, comprising: encoding and uploading a first fragment of a live content stream of a live event to a pool of content delivery networks for use in distributing the live content stream in real-time to a plurality of user equipment devices, wherein the pool of content delivery networks utilizes communication paths to stream the live content stream to user equipment devices, wherein each content delivery network of the pool of content delivery networks comprises load balancing servers, gateways, and storage servers connected via a network; receiving a request to stream the live content stream from a particular user equipment device, wherein the particular user equipment device has a first geographic location that is shared with a plurality of user equipment devices; providing, to the particular user equipment device, a manifest that identifies the pool of content delivery networks that contains at least a portion of the live content stream; streaming content to the particular user equipment device using a first content delivery network from the pool of content delivery networks over a first distribution path; determining whether a predetermined condition is satisfied, wherein the predetermined condition comprises whether a count of user equipment devices that are located at the first geographic location and are currently streaming the content from the pool of content delivery networks exceeds a threshold quantity of streaming user equipment devices; when the predetermined condition is satisfied: selecting an additional content delivery network from a set of available content delivery networks to add to the pool of content delivery networks based on a plurality of factors comprising at least: distance between a second geographic location corresponding to the additional content delivery network and geographic locations corresponding to content delivery networks within the pool of content delivery networks, latency along at least one different distribution path between the additional content delivery network and the particular user equipment device, and throughput for at least one different distribution path between the additional content delivery network and the particular user equipment device; updating the pool of content delivery networks to include the additional content delivery network, wherein the additional content delivery network comprises a load balancing server, a gateway, and a storage server connected via a communications network; encoding and uploading a second fragment of the live content stream for the updated pool of content delivery networks, including the additional content delivery network, to stream the live content stream using a distribution path including the additional content delivery network from the updated pool of content delivery networks; and providing, to the particular user equipment device, an updated manifest that identifies the additional content delivery network, wherein the particular user equipment requests at least a portion of the live content stream from the additional content delivery network.
  • 8. The method of claim 7, wherein the predetermined condition further comprises whether a plurality of quality of service metric values satisfy certain thresholds.
  • 9. The method of claim 7, wherein throughput comprises network bandwidth.
  • 10. The method of claim 7, wherein encoding and uploading the live content stream for the updated pool of content delivery networks comprises providing an encoder with an identifier of the selected additional content delivery network, wherein the identifier identifies at least one of a gateway and a load balancer for the additional content delivery network.
  • 11. The method of claim 7, wherein the method further comprises maintaining records relating to the geographic distribution of user equipment devices that are currently streaming content.
  • 12. The method of claim 7, wherein the plurality of factors further includes a price per unit of bandwidth associated with the additional content delivery network.
Related Publications (1)
Number Date Country
20140280763 A1 Sep 2014 US