U.S. patent application No. 12/008,722, entitled “MPEG Objects and Systems and Methods for Using MPEG Objects,” assigned to the same assignee and filed contemporaneously herewith on Jan. 11, 2008, is related generally to the subject matter of the present application and is incorporated herein by reference in its entirety.
The present application claims priority from U.S. provisional application Ser. No. 60/884,773, filed Jan. 12, 2007, Ser. No. 60/884,744, filed Jan. 12, 2007, and Ser. No. 60/884,772, filed Jan. 12, 2007, the full disclosures of which are hereby incorporated herein by reference.
The present invention relates to systems and methods for providing interactive content to a remote device and more specifically to systems and methods wherein an object model is associated with pre-encoded video content.
In cable television systems, the cable headend transmits content to one or more subscribers, wherein the content is transmitted in an encoded form. Typically, the content is encoded as digital MPEG video, and each subscriber has a set-top box or cable card that is capable of decoding the MPEG video stream. Beyond providing linear content, cable providers can now provide interactive content, such as web pages or walled-garden content. As the Internet has become more dynamic, with web pages including video content and requiring applications or scripts for decoding that video content, cable providers have adapted to allow subscribers to view these dynamic web pages. In order to composite a dynamic web page for transmission to a requesting subscriber in encoded form, the cable headend retrieves the requested web page and renders the web page. Thus, the cable headend must first decode any encoded content that appears within the dynamic web page. For example, if a video is to be played on the web page, the headend must retrieve the encoded video and decode each frame of the video. The cable headend then renders each frame to form a sequence of bitmap images of the Internet web page. Thus, the web page can only be composited together if all of the content that forms the web page is first decoded. Once the composite frames are complete, the composited video is sent to an encoder, such as an MPEG encoder, to be re-encoded. The compressed MPEG video frames are then sent in an MPEG video stream to the user's set-top box.
Creating such composite encoded video frames in a cable television network requires intensive CPU and memory processing, since all encoded content must first be decoded, then composited, rendered, and re-encoded. In particular, the cable headend must decode and re-encode all of the content in real time. Thus, allowing users to operate in an interactive environment with dynamic web pages is quite costly to cable operators because of the required processing. Such systems have the additional drawback that image quality is degraded by the re-encoding of the encoded video.
Embodiments of the invention disclose a system for encoding at least one composite encoded video frame for display on a display device. The system includes a markup language-based graphical layout, the graphical layout including frame locations within the composite frame for at least a first encoded source and a second encoded source. Additionally, the system has a stitcher module for stitching together the first encoded source and the second encoded source according to the frame locations of the graphical layout. The stitcher forms an encoded frame without having to decode the block-based transform encoded data for at least the first source. The encoded video may be encoded using one of the MPEG standards, AVS, VC-1, or another block-based encoding protocol.
In certain embodiments of the invention, the system allows a user to interact with graphical elements on a display device. The processor maintains state information about one or more graphical elements identified in the graphical layout. The graphical elements in the graphical layout are associated with one of the encoded sources. A user transmits a request to change state of one of the graphical elements through a client device in communication with the system. The request for the change in state causes the processor to register the change in state and to obtain a new encoded source. The processor causes the stitcher to stitch the new encoded source in place of the encoded source representing the graphic element. The processor may also execute or interpret computer code associated with the graphic element.
For example, the graphic element may be a button object that has a plurality of states, associated encoded content for each state, and methods associated with each of the states. The system may also include a transmitter for transmitting to the client device the composited video content. The client device can then decode the composited video content and cause the composited video content to be displayed on a display device. In certain embodiments each graphical element within the graphical layout is associated with one or more encoded MPEG video frames or portions of a video frame, such as one or more macroblocks or slices. The compositor may use a single graphical element repeatedly within the MPEG video stream. For example, the button may be only a single video frame in one state and a single video frame in another state, and the button may be composited together with MPEG encoded video content wherein the encoded macroblocks representing the button are stitched into the MPEG encoded video content in each frame.
Other embodiments of the invention disclose a system for creating one or more composite MPEG video frames forming an MPEG video stream. The MPEG video stream is provided to a client device that includes an MPEG decoder. The client device decodes the MPEG video stream and outputs the video to a display device. The composite MPEG video frames are created by obtaining a graphical layout for a video frame. The graphical layout includes frame locations within the composite MPEG video frame for at least a first MPEG source and a second MPEG source. Based upon the graphical layout the first and second MPEG sources are obtained. The first and second MPEG sources are provided to a stitcher module. The stitcher module stitches together the first MPEG source and the second MPEG source according to the frame locations of the graphical layout to form an MPEG frame without having to decode the macroblock data of the MPEG sources. In certain embodiments, the MPEG sources are only decoded to the slice layer and a processor maintains the positions of the slices within the frame for the first and second MPEG sources. This process is repeated for each frame of MPEG data in order to form an MPEG video stream.
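The stitching operation described above can be sketched in simplified form. The following is an illustrative sketch only, not the claimed implementation: each source is modeled as rows of already-encoded slices that are treated as opaque bytes (no macroblock data is decoded), and the stitcher places each source's slices at the frame locations given by the graphical layout. All names and data shapes are hypothetical.

```python
# Illustrative sketch: composing one frame from pre-encoded sources by
# slice placement alone. Slice payloads are opaque -- nothing is decoded.
from dataclasses import dataclass

@dataclass
class EncodedSource:
    name: str
    origin: tuple   # (row, col) of the upper-left corner, in macroblock units
    slices: list    # one opaque encoded slice per macroblock row of the source

def stitch_frame(layout):
    """Assemble a composite frame as an ordered list of (row, col, slice)."""
    composite = []
    for src in layout:
        row0, col0 = src.origin
        for i, sl in enumerate(src.slices):
            composite.append((row0 + i, col0, sl))
    # MPEG requires slices to appear in raster order within the frame.
    composite.sort(key=lambda entry: (entry[0], entry[1]))
    return composite

# A background with a cut-out would supply empty slices where the video goes;
# here the two sources simply occupy non-overlapping slice positions.
background = EncodedSource("background", (0, 0), [b"bg0", b"bg1", b"bg2"])
video = EncodedSource("video", (1, 10), [b"v0", b"v1"])
frame = stitch_frame([background, video])
```

In a real stitcher the per-row slice boundaries of the background must already be groomed to match the inserted element, as described in the grooming discussion below; this sketch only shows the ordering bookkeeping.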
In certain embodiments, the system includes a groomer. The groomer grooms the MPEG sources so that each MPEG element of the MPEG source is converted to an MPEG P-frame format. The groomer module may also identify any macroblocks in the second MPEG source that include motion vectors that reference other macroblocks in a section of the first MPEG source and re-encodes those macroblocks as intracoded macroblocks.
The system may include an association between an MPEG source and a method for the MPEG source forming an MPEG object. In such a system, a processor would receive a request from a client device and in response to the request, a method of the MPEG object would be used. The method may change the state of the MPEG object and cause the selection of a different MPEG source. Thus, the stitcher may replace a first MPEG source with a third MPEG source and stitch together the third and second MPEG sources to form a video frame. The video frame would be streamed to the client device and the client device could decode the updated MPEG video frame and display the updated material on the client's display. For example, an MPEG button object may have an “on” state and an “off” state and the MPEG button object may also include two MPEG graphics composed of a plurality of macroblocks forming slices. In response to a client requesting to change the state of the button from off to on, a method would update the state and cause the MPEG encoded graphic representing an “on” button to be passed to the stitcher.
In certain embodiments, the video frame may be constructed from an unencoded graphic or a graphic that is not MPEG encoded and a groomed MPEG video source. The unencoded graphic may first be rendered. For example, a background may be rendered as a bit map. The background may then be encoded as a series of MPEG macroblocks divided up into slices. The stitcher can then stitch together the background and the groomed MPEG video content to form an MPEG video stream. The background may then be saved for later reuse. In such a configuration, the background would have cut-out regions wherein the slices in those regions would have no associated data, thus video content slices could be inserted into the cut-out. In other embodiments, real-time broadcasts may be received and groomed for creating MPEG video streams.
The foregoing features of the invention will be more readily understood by reference to the following detailed description, taken with reference to the accompanying drawings, in which:
As used in the following detailed description and in the appended claims the term “region” shall mean a logical grouping of MPEG (Moving Picture Experts Group) slices that are either contiguous or non-contiguous. When the term MPEG is used it shall refer to all variants of the MPEG standard including MPEG-2 and MPEG-4. The present invention as described in the embodiments below provides an environment for interactive MPEG content and communications between a processing office and a client device having an associated display, such as a television. Although the present invention specifically references the MPEG specification and encoding, principles of the invention may be employed with other encoding techniques that are based upon block-based transforms. As used in the following specification and appended claims, the terms encode, encoded, and encoding shall refer to the process of compressing a digital data signal and formatting the compressed digital data signal to a protocol or standard. Encoded video data can be in any state other than a spatial representation. For example, encoded video data may be transform coded, quantized, and entropy encoded or any combination thereof. Therefore, data that has been transform coded will be considered to be encoded.
Although the present application refers to the display device as a television, the display device may be a cell phone, a Personal Digital Assistant (PDA) or other device that includes a display. A client device including a decoding device, such as a set-top box that can decode MPEG content, is associated with the display device of the user. In certain embodiments, the decoder may be part of the display device. The interactive MPEG content is created in an authoring environment allowing an application designer to design the interactive MPEG content, creating an application having one or more scenes from various elements including video content from content providers and linear broadcasters. An application file is formed in an Active Video Markup Language (AVML). The AVML file produced by the authoring environment is an XML-based file defining the video graphical elements (i.e. MPEG slices) within a single frame/page, the sizes of the video graphical elements, the layout of the video graphical elements within the page/frame for each scene, links to the video graphical elements, and any scripts for the scene. In certain embodiments, an AVML file may be authored directly in a text editor as opposed to being generated by an authoring environment. The video graphical elements may be static graphics, dynamic graphics, or video content. It should be recognized that each element within a scene is really a sequence of images and a static graphic is an image that is repeatedly displayed and does not change over time. Each of the elements may be an MPEG object that can include both MPEG data for graphics and operations associated with the graphics. The interactive MPEG content can include multiple interactive MPEG objects within a scene with which a user can interact. For example, the scene may include a button MPEG object that provides encoded MPEG data forming the video graphic for the object and also includes a procedure for keeping track of the button state.
The MPEG objects may work in coordination with the scripts. For example, an MPEG button object may keep track of its state (on/off), but a script within the scene will determine what occurs when that button is pressed. The script may associate the button state with a video program so that the button will indicate whether the video content is playing or stopped. MPEG objects always have an associated action as part of the object. In certain embodiments, the MPEG objects, such as a button MPEG object, may perform actions beyond keeping track of the status of the button. In such embodiments, the MPEG object may also include a call to an external program, wherein the MPEG object will access the program when the button graphic is engaged. Thus, for a play/pause MPEG object button, the MPEG object may include code that keeps track of the state of the button, provides a graphical overlay based upon a state change, and/or causes a video player object to play or pause the video content depending on the state of the button.
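A two-state button MPEG object of the kind described above can be sketched as follows. This is a hedged illustration, not the patented object model: the class names, the representation of the encoded graphics as byte strings, and the `on_toggle` callback standing in for the external program call are all assumptions made for the example.

```python
# Hypothetical sketch of a button MPEG object: it tracks its own on/off
# state, selects the encoded graphic for that state, and may invoke an
# external handler (e.g. a video player's play/pause) when toggled.
class ButtonMPEGObject:
    def __init__(self, on_slices, off_slices, on_toggle=None):
        self.state = "off"
        self._graphics = {"on": on_slices, "off": off_slices}
        self._on_toggle = on_toggle   # optional call to an external program

    def toggle(self):
        """Flip state, notify any external program, return the new graphic."""
        self.state = "on" if self.state == "off" else "off"
        if self._on_toggle:
            self._on_toggle(self.state)
        return self.current_graphic()

    def current_graphic(self):
        """Encoded slices handed to the stitcher for the current state."""
        return self._graphics[self.state]

events = []
btn = ButtonMPEGObject(on_slices=[b"btn-on"], off_slices=[b"btn-off"],
                       on_toggle=events.append)
graphic = btn.toggle()   # user presses the button: off -> on
```

The returned `graphic` is what a stitcher would splice into the next composite frame in place of the button's previous slices.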
Once an application is created within the authoring environment, and an interactive session is requested by a requesting client device, the processing office assigns a processor for the interactive session.
The assigned processor operational at the processing office runs a virtual machine and accesses and runs the requested application. The processor prepares the graphical part of the scene for transmission in the MPEG format. Upon receipt of the MPEG transmission by the client device and display on the user's display, a user can interact with the displayed content by using an input device in communication with the client device. The client device sends input requests from the user through a communication network to the application running on the assigned processor at the processing office or other remote location. In response, the assigned processor updates the graphical layout based upon the request and the state of the MPEG objects hereinafter referred to in total as the application state. New elements may be added to the scene or replaced within the scene or a completely new scene may be created. The assigned processor collects the elements and the objects for the scene, and either the assigned processor or another processor processes the data and operations according to the object(s) and produces the revised graphical representation in an MPEG format that is transmitted to the transceiver for display on the user's television. Although the above passage indicates that the assigned processor is located at the processing office, the assigned processor may be located at a remote location and need only be in communication with the processing office through a network connection. Similarly, although the assigned processor is described as handling all transactions with the client device, other processors may also be involved with requests and assembly of the content (MPEG objects) of the graphical layout for the application.
The content provider 160 may encode the video content as MPEG video/audio or the content may be in another graphical format (e.g. JPEG, BITMAP, H263, H264, VC-1 etc.). The content may be subsequently groomed and/or scaled in a Groomer/Scaler 190 to place the content into a preferable encoded MPEG format that will allow for stitching. If the content is not placed into the preferable MPEG format, the processing office will groom the content when an application that requires the content is requested by a client device. Linear broadcast content 170 from broadcast media services, like content from the content providers, will be groomed. The linear broadcast content is preferably groomed and/or scaled in Groomer/Scaler 180 that encodes the content in the preferable MPEG format for stitching prior to passing the content to the processing office.
The video content from the content producers 160 along with the applications created by application programmers are distributed through a video content distribution network 150 and are stored at distribution points 140. These distribution points are represented as the proxy/cache within
An end user of the system can request an interactive session by sending a command through the client device 110, such as a set-top box, to a processing office 105. In
The virtual machine 106 communicates its address to the client device 110 and an interactive session is established. The user can then request presentation of an interactive application (AVML) through the client device 110. The request is received by the virtual machine 106 and in response, the virtual machine 106 causes the AVML file to be retrieved from the proxy/cache 140 and installed into a memory cache 107 that is accessible by the virtual machine 106. It should be recognized that the virtual machine 106 may be in simultaneous communication with a plurality of client devices 110 and the client devices may be different device types. For example, a first device may be a cellular telephone, a second device may be a set-top box, and a third device may be a personal digital assistant, wherein each device accesses the same or a different application.
In response to a request for an application, the virtual machine 106 processes the application and requests elements and MPEG objects that are part of the scene to be moved from the proxy/cache into memory 107 associated with the virtual machine 106. An MPEG object includes both a visual component and an actionable component. The visual component may be encoded as one or more MPEG slices or provided in another graphical format. The actionable component may store the state of the object and may include performing computations, accessing an associated program, or displaying overlay graphics to identify the graphical component as active. An overlay graphic may be produced by a signal being transmitted to a client device wherein the client device creates a graphic in the overlay plane on the display device. It should be recognized that a scene is not a static graphic, but rather includes a plurality of video frames wherein the content of the frames can change over time.
The virtual machine 106 determines, based upon the scene information, including the application state, the size and location of the various elements and objects for a scene. Each graphical element may be formed from contiguous or non-contiguous MPEG slices. The virtual machine keeps track of the location of all of the slices for each graphical element. All of the slices that define a graphical element form a region. The virtual machine 106 keeps track of each region. Based on the display position information within the AVML file, the slice positions for the elements and background within a video frame are set. If a graphical element is not already in a groomed format, the virtual machine passes that element to an element renderer. The renderer renders the graphical element as a bitmap and passes the bitmap to an MPEG element encoder 109. The MPEG element encoder encodes the bitmap as an MPEG video sequence. The MPEG encoder processes the bitmap so that it outputs a series of P-frames. An example of content that is not already pre-encoded and pre-groomed is personalized content. For example, if a user has stored music files at the processing office and the graphic element to be presented is a listing of the user's music files, this graphic would be created in real-time as a bitmap by the virtual machine. The virtual machine would pass the bitmap to the element renderer 108, which would render the bitmap and pass it to the MPEG element encoder 109 for grooming.
After the graphical elements are groomed by the MPEG element encoder, the MPEG element encoder 109 passes the graphical elements to memory 107 for later retrieval by the virtual machine 106 for other interactive sessions by other users. The MPEG encoder 109 also passes the MPEG encoded graphical elements to the stitcher 115. The rendering of an element and MPEG encoding of an element may be accomplished in the same or a separate processor from the virtual machine 106. The virtual machine 106 also determines if there are any scripts within the application that need to be interpreted. If there are scripts, the scripts are interpreted by the virtual machine 106.
Each scene in an application can include a plurality of elements including static graphics, object graphics that change based upon user interaction, and video content. For example, a scene may include a background (static graphic), along with a media player for playback of audio video and multimedia content (object graphic) having a plurality of buttons, and a video content window (video content) for displaying the streaming video content. Each button of the media player may itself be a separate object graphic that includes its own associated methods.
The virtual machine 106 acquires each of the graphical elements (background, media player graphic, and video frame) for a frame and determines the location of each element. Once all of the objects and elements (background, video content) are acquired, the elements and graphical objects are passed to the stitcher/compositor 115 along with positioning information for the elements and MPEG objects. The stitcher 115 stitches together each of the elements (video content, buttons, graphics, background) according to the mapping provided by the virtual machine 106. Each of the elements is placed on a macroblock boundary, and when stitched together the elements form an MPEG video frame. On a periodic basis all of the elements of a scene frame are encoded to form a reference P-frame in order to refresh the sequence and avoid dropped macroblocks. The MPEG video stream is then transmitted to the address of the client device through the downstream network. The process continues for each of the video frames. Although the specification refers to MPEG as the encoding process, other encoding processes may also be used with this system.
The virtual machine 106 or other processor or process at the processing office 105 maintains information about each of the elements and the location of the elements on the screen. The virtual machine 106 also has access to the methods for the objects associated with each of the elements. For example, a media player may have a media player object that includes a plurality of routines. The routines can include, play, stop, fast forward, rewind, and pause. Each of the routines includes code and upon a user sending a request to the processing office 105 for activation of one of the routines, the object is accessed and the routine is run. The routine may be a JAVA-based applet, a script to be interpreted, or a separate computer program capable of being run within the operating system associated with the virtual machine.
The processing office 105 may also create a linked data structure for determining the routine to execute or interpret based upon a signal received by the processor from the client device associated with the television. The linked data structure may be formed by an included mapping module. The data structure associates each resource and associated object relative to every other resource and object. For example, if a user has already engaged the play control, a media player object is activated and the video content is displayed. As the video content is playing in a media player window, the user can depress a directional key on the user's remote control. In this example, the depression of the directional key is indicative of pressing a stop button. The transceiver produces a directional signal and the assigned processor receives the directional signal. The virtual machine 106 or other processor at the processing office 105 accesses the linked data structure and locates the element in the direction of the directional key press. The data structure indicates that the element is a stop button that is part of a media player object and the processor implements the routine for stopping the video content. The routine will cause the requested content to stop. The last video content frame will be frozen and a depressed stop button graphic will be interwoven by the stitcher module into the frame. The routine may also include a focus graphic to provide focus around the stop button. For example, the virtual machine can cause the stitcher to enclose the graphic having focus with a border that is 1 macroblock wide. Thus, when the video frame is decoded and displayed, the user will be able to identify the graphic/object that the user can interact with. The frame will then be passed to a multiplexor and sent through the downstream network to the client device.
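The directional-navigation lookup described above can be sketched as follows. This is an illustrative assumption about the linked data structure, not its claimed form: each element records which element lies in each direction, and a directional key press resolves to that neighbor and runs the neighbor's object routine. All names and routine bodies are hypothetical.

```python
# Sketch of a linked data structure mapping directional key presses to
# element routines. The stitcher-facing actions are stand-in strings.
class MediaPlayerObject:
    def __init__(self):
        self.playing = True   # assume the play control was already engaged
    def play(self):
        self.playing = True
        return "stitch depressed-play graphic into frame"
    def stop(self):
        self.playing = False
        return "freeze last frame; stitch depressed-stop graphic into frame"

elements = {
    "play_button": {"routine": "play", "neighbors": {"right": "stop_button"}},
    "stop_button": {"routine": "stop", "neighbors": {"left": "play_button"}},
}

def on_direction_key(focused, direction, player):
    """Move focus in the pressed direction and run the target's routine."""
    target = elements[focused]["neighbors"].get(direction)
    if target is None:
        return focused, None                 # nothing lies in that direction
    action = getattr(player, elements[target]["routine"])()
    return target, action

player = MediaPlayerObject()
focus, action = on_direction_key("play_button", "right", player)
```

After the routine runs, the virtual machine would additionally direct the stitcher to draw the one-macroblock-wide focus border around the newly focused element.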
The MPEG encoded video frame is decoded by the client device and displayed on either the client device (cell phone, PDA) or on a separate display device (monitor, television). This process occurs with a minimal delay. Thus, each scene from an application results in a plurality of video frames, each representing a snapshot of the media player application state.
The virtual machine 106 will repeatedly receive commands from the client device and in response to the commands will either directly or indirectly access the objects and execute or interpret the routines of the objects in accordance with the user interaction and the application interaction model. In such a system, the video content material displayed on the television of the user is merely decoded MPEG content and all of the processing for the interactivity occurs at the processing office and is orchestrated by the assigned virtual machine. Thus, the client device only needs a decoder and need not cache or process any of the content.
It should be recognized that through user requests from a client device, the processing office could replace a video element with another video element. For example, a user may select from a list of movies to display, and therefore a first video content element would be replaced by a second video content element if the user selects to switch between two movies. The virtual machine, which maintains a listing of the location of each element and each region forming an element, can easily replace elements within a scene, creating a new MPEG video frame wherein the frame is stitched together including the new element in the stitcher 115.
Authoring Environment
The authoring environment includes a graphical editor as shown in
As shown in
When a user selects an application through a client device, the processing office will stitch together the elements in accordance with the layout from the graphical editor of the authoring environment. The output of the authoring environment includes an Active Video Markup Language (AVML) file. The AVML file provides state information about multi-state elements such as a button, the address of the associated graphic, and the size of the graphic. The AVML file indicates the locations within the MPEG frame for each element, indicates the objects that are associated with each element, and includes the scripts that define changes to the MPEG frame based upon the user's actions. For example, a user may send an instruction signal to the processing office and the processing office will use the AVML file to construct a set of new MPEG frames based upon the received instruction signal. A user may want to switch between various video elements and may send an instruction signal to the processing office. The processing office will remove a video element within the layout for a frame and will select the second video element, causing the second video element to be stitched into the MPEG frame at the location of the first video element. This process is described below.
AVML File
The application programming environment outputs an AVML file. The AVML file has an XML-based syntax. The AVML file syntax includes a root object <AVML>. Other top level tags include <initialscene> that specifies the first scene to be loaded when an application starts. The <script> tag identifies a script and a <scene> tag identifies a scene. There may also be lower level tags to each of the top level tags, so that there is a hierarchy for applying the data within the tag. For example, a top level stream tag may include <aspect ratio> for the video stream, <video format>, <bit rate>, <audio format> and <audio bit rate>. Similarly, a scene tag may include each of the elements within the scene. For example, <background> for the background, <button> for a button object, and <static image> for a still graphic. Other tags include <size> and <pos> for the size and position of an element and may be lower level tags for each element within a scene. An example of an AVML file is provided in
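A minimal AVML file built from the tags named above might look as follows. This fragment is illustrative only: the actual example appears in the referenced figure, and the exact tag spellings (e.g. whether <aspect ratio> is written without the space, as XML element names require) and attribute conventions shown here are assumptions.

```xml
<AVML>
  <initialscene>menu</initialscene>
  <stream>
    <aspectratio>4:3</aspectratio>
    <videoformat>MPEG-2</videoformat>
    <bitrate>3750000</bitrate>
  </stream>
  <scene name="menu">
    <background src="menu_background.mpg">
      <size width="720" height="480"/>
      <pos x="0" y="0"/>
    </background>
    <button name="play" object="MediaPlayerButton">
      <size width="64" height="32"/>
      <pos x="96" y="400"/>
    </button>
    <script src="menu_script"/>
  </scene>
</AVML>
```

The hierarchy follows the description above: <AVML> is the root, top-level tags select the initial scene and declare scripts and scenes, and each element within a <scene> carries its own lower-level <size> and <pos> tags.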
Groomer
The process of stitching is described below and can be performed in a much more efficient manner if the elements have been groomed first.
Grooming removes some of the interdependencies present in compressed video. The groomer will convert I and B frames to P frames and will fix any stray motion vectors that reference a section of another frame of video that has been cropped or removed. Thus, a groomed video stream can be used in combination with other groomed video streams and encoded still images to form a composite MPEG video stream. Each groomed video stream includes a plurality of frames, and the frames can be easily inserted into another groomed frame wherein the composite frames are grouped together to form an MPEG video stream. It should be noted that the groomed frames may be formed from one or more MPEG slices and may be smaller in size than an MPEG video frame in the MPEG video stream.
As shown, video element 420 is inserted within the background video frame 410 (also for example only; this element could also consist of multiple slices per row). If a macroblock within the original video frame 410 references another macroblock in determining its value, and the referenced macroblock is removed from the frame because the video image 420 is inserted in its place, the macroblock's value needs to be recalculated. Similarly, if a macroblock references another macroblock in a subsequent frame and that macroblock is removed and other source material is inserted in its place, the macroblock's value needs to be recalculated. This is addressed by grooming the video 430. The video frame is processed so that the rows contain multiple slices, some of which are specifically sized and located to match the substitute video content. After this process is complete, it is a simple task to replace some of the current slices with the overlay video, resulting in a groomed video with overlay 440. The groomed video stream has been specifically defined to address that particular overlay; a different overlay would dictate different grooming parameters. Thus, this type of grooming addresses the process of segmenting a video frame into slices in preparation for stitching. It should be noted that there is never a need to add slices to the overlay element. Slices are only added to the receiving element, that is, the element into which the overlay will be placed. The groomed video stream can contain information about the stream's groomed characteristics. The characteristics that can be provided include either (1) the locations of the upper left and lower right corners of the groomed window, or (2) the location of the upper left corner only together with the size of the window. In either case, the size of the slice is accurate to the pixel level.
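The row-segmentation step of grooming described above can be sketched as follows. This is a simplified illustration under stated assumptions, not the claimed groomer: all units are macroblocks, the overlay is a single rectangle, and each receiving row that intersects the overlay is split into up to three slices so that the middle slice can later be swapped for an overlay slice.

```python
# Sketch: split each macroblock row of the receiving element into slices
# whose boundaries line up with the overlay's left and right edges.
def segment_rows(frame_w, frame_h, ov_x, ov_y, ov_w, ov_h):
    """Return, per row, a list of (start_col, width) slice spans."""
    rows = []
    for row in range(frame_h):
        if ov_y <= row < ov_y + ov_h:
            segs = []
            if ov_x > 0:
                segs.append((0, ov_x))                        # left of overlay
            segs.append((ov_x, ov_w))                         # replaceable slice
            if ov_x + ov_w < frame_w:
                segs.append((ov_x + ov_w, frame_w - ov_x - ov_w))  # right side
            rows.append(segs)
        else:
            rows.append([(0, frame_w)])                       # untouched row
    return rows

# e.g. a 45x30-macroblock background receiving a 20x15 overlay at (10, 5)
layout = segment_rows(frame_w=45, frame_h=30, ov_x=10, ov_y=5, ov_w=20, ov_h=15)
```

At stitch time, the middle (ov_x, ov_w) span of each intersected row is simply replaced by the corresponding overlay slice; no slices are ever added to the overlay element itself, consistent with the note above.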
There are also two ways to provide the characteristic information in the video stream. The first is to provide that information in the slice header. The second is to provide the information in the extended data slice structure. Either of these options can be used to successfully pass the necessary information to future processing stages, such as the virtual machine and stitcher.
Next, the slice overhead information 740 must be modified. The parameters to modify are given in the table below.
Next, the macroblock overhead 750 information may require modification. The values to be modified are given in the table below.
Finally, the block information 760 may require modification. The items to modify are given in the table below.
Once the block changes are complete, the process can start over with the next frame of video. If the frame type is a B-frame 705, the same steps required for an I-frame are also required for the B-frame. However, in addition, the motion vectors 770 need to be modified. There are two scenarios: B-frame immediately following an I-frame or P-frame, or a B-frame following another B-frame. Should the B-frame follow either an I or P frame, the motion vector, using the I or P frame as a reference, can remain the same and only the residual would need to change. This may be as simple as converting the forward looking motion vector to be the residual.
For the B-frames that follow another B-frame, the motion vector and its residual will both need to be modified. The second B-frame must now reference the newly converted B to P frame immediately preceding it. First, the B-frame and its reference are decoded and the motion vector and the residual are recalculated. It must be noted that while the frame is decoded to update the motion vectors, there is no need to re-encode the DCT coefficients. These remain the same. Only the motion vector and residual are calculated and modified.
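The two B-frame scenarios can be sketched as a reference-retargeting pass. This is a toy model under stated assumptions: frames are (type, reference-index) pairs, and only the reference bookkeeping is shown; residual recomputation and DCT handling are omitted, as the source notes the DCT coefficients are not re-encoded.

```python
# Toy sketch of B-frame handling during grooming.
# A B frame following an I or P frame keeps its reference (residual changes
# only, not modeled); a B frame following another B frame is re-pointed at
# the immediately preceding frame, which grooming has converted to a P frame.

def retarget_b_frames(frames):
    out = []
    for i, (ftype, ref) in enumerate(frames):
        if ftype == "B" and i > 0 and frames[i - 1][0] == "B":
            out.append(("P", i - 1))   # reference the newly converted P frame
        elif ftype == "B":
            out.append(("P", ref))     # reference unchanged
        else:
            out.append((ftype, ref))
    return out

frames = [("I", None), ("B", 0), ("B", 0), ("P", 0)]
retargeted = retarget_b_frames(frames)
```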
The last frame type is the P-frame. This frame type also follows the same path as an I-frame.
In addition to updating motion vectors and changing frame types, the groomer may also convert field based encoded macroblocks to frame based encoded macroblocks.
Stitcher
This particular type of encoding is called “slice based encoding”. A slice based encoder/virtual machine is one that is aware of the desired slice structure of the output frame and performs its encoding appropriately. That is, the encoder knows the size of the slices and where they belong. It knows where to leave holes if that is required. By being aware of the desired output slice configuration, the virtual machine provides an output that is easily stitched.
It is also possible for there to be an overlap in the composited video frame. Referring back to
The possibility of different slice sizes requires the compositing function to perform a check of the incoming background and video elements to confirm they are proper. That is, make sure each one is complete (e.g., a full frame), there are no sizing conflicts, etc.
The performance of the stitcher can be improved (build frames faster with less processor power) by providing the stitcher advance information on the frame format. For example, the virtual machine may provide the stitcher with the start location and size of the areas in the frame to be inserted. Alternatively, the information could be the start location for each slice and the stitcher could then figure out the size (the difference between the two start locations). This information could be provided externally by the virtual machine or the virtual machine could incorporate the information into each element. For instance, part of the slice header could be used to carry this information. The stitcher can use this foreknowledge of the frame structure to begin compositing the elements together well before they are required.
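The alternative mentioned above, where the stitcher derives each slice size as the difference between consecutive start locations, is simple arithmetic; a sketch (function name and units are illustrative assumptions):

```python
# Derive slice widths from start locations alone, as the virtual machine may
# supply only the start column of each slice in a row (here in macroblocks).

def slice_sizes(start_locations, row_width):
    """Each size is the difference between consecutive starts; the last
    slice runs to the end of the row."""
    bounds = list(start_locations) + [row_width]
    return [bounds[i + 1] - bounds[i] for i in range(len(start_locations))]

sizes = slice_sizes([0, 10, 25], 45)   # three slices in a 45-macroblock row
```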
The source for the element slices can be any one of a number of options. It can come from a real-time encoded source. It can be a complex slice that is built from separate slices, one having a background and the other having text. It can be a pre-encoded element that is fetched from a cache. These examples are for illustrative purposes only and are not intended to limit the options for element sources.
When a client device sends a request for a mosaic application, the processing office associated with the client device assigns a processor/virtual machine for the client device for the requested mosaic application. The assigned virtual machine constructs the personalized mosaic by compositing the groomed content from the desired channels using a stitcher. The virtual machine sends the client device an MPEG stream that has a mosaic of the channels that the client has requested. Thus, by grooming the content first so that the content can be stitched together, the virtual machines that create the mosaics do not need to first decode the desired channels, render the channels within the background as a bitmap and then encode the bitmap.
An application, such as a mosaic, can be requested either directly through a client device or indirectly through another device, such as a PC, for display of the application on a display associated with the client device. The user could log into a website associated with the processing office by providing information about the user's account. The server associated with the processing office would provide the user with a selection screen for selecting an application. If the user selected a mosaic application, the server would allow the user to select the content that the user wishes to view within the mosaic. In response to the selected content for the mosaic and using the user's account information, the processing office server would direct the request to a session processor and establish an interactive session with the client device of the user. The session processor would then be informed by the processing office server of the desired application. The session processor would retrieve the desired application, the mosaic application in this example, and would obtain the required MPEG objects. The processing office server would then inform the session processor of the requested video content, and the session processor would operate in conjunction with the stitcher to construct the mosaic and provide the mosaic as an MPEG video stream to the client device. Thus, the processing office server may include scripts or applications for performing the functions of the client device in setting up the interactive session, requesting the application, and selecting content for display. While the mosaic elements may be predetermined by the application, they may also be user configurable, resulting in a personalized mosaic.
These additional resources add cost to the system. As a result, the desire is to minimize the number of additional resources that are required to deliver a level of performance to the user that mimics a non-blocking system such as an IP network. Since there is not a one-to-one correspondence between the cable network resources and the users on the network, the resources must be shared. Shared resources must be managed so they can be assigned when a user requires a resource and then freed when the user is finished utilizing that resource. Proper management of these resources is critical to the operator because without it, the resources could be unavailable when needed most. Should this occur, the user either receives a “please wait” message or, in the worst case, a “service unavailable” message.
(1) The Set Top 2609 requests content 2610 from the Controller 2607
(2) The Controller 2607 requests QAM bandwidth 2620 from the SRM 2603
(3) The SRM 2603 checks QAM availability 2625
(4) The SRM 2603 allocates the QAM modulator 2630
(5) The QAM modulator returns confirmation 2635
(6) The SRM 2603 confirms QAM allocation success 2640 to the Controller
(7) The Controller 2607 allocates the Session processor 2650
(8) The Session processor confirms allocation success 2653
(9) The Controller 2607 allocates the content 2655
(10) The Controller 2607 configures 2660 the Set Top 2609. This includes:
(11) The Set Top 2609 tunes to the channel 2663
(12) The Set Top 2609 confirms success 2665 to the Controller 2607

The Controller 2607 allocates the resources based on a request for service from a set top box 2609. It frees these resources when the set top or server sends an "end of session". While the controller 2607 can react quickly with minimal delay, the SRM 2603 can only allocate a set number of QAM sessions per second, e.g., 200. Demand that exceeds this rate results in unacceptable delays for the user. For example, if 500 requests come in at the same time, the last user would have to wait 2.5 seconds before their request was granted. It is also possible that rather than the request being granted, an error message could be displayed, such as "service unavailable".
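The worst-case wait for the last request in a simultaneous burst follows directly from the SRM's allocation rate; a one-line sketch using the example figures above:

```python
# Worst-case wait when the SRM can only allocate a fixed number of
# QAM sessions per second and a burst of requests arrives at once.

def last_user_wait(requests, sessions_per_second):
    """Seconds the final request in a simultaneous burst must wait."""
    return requests / sessions_per_second

wait = last_user_wait(500, 200)   # 500 simultaneous requests at 200/sec
```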
While the example above describes the request and response sequence for an AVDN session over a cable TV network, the example below describes a similar sequence over an IPTV network. Note that the sequence in itself is not a claim, but rather illustrates how AVDN would work over an IPTV network.
A first issue is the assignment of QAMs 2770 and QAM channels 2775 by the SRM 2720. In particular, the resources must be managed to prevent SRM overload, that is, eliminating the delay the user would see when requests to the SRM 2720 exceed its sessions per second rate.
To prevent SRM “overload”, “time based modeling” may be used. For time based modeling, the Controller 2700 monitors the history of past transactions, in particular, high load periods. By using this previous history, the Controller 2700 can predict when a high load period may occur, for example, at the top of an hour. The Controller 2700 uses this knowledge to pre-allocate resources before the period comes. That is, it uses predictive algorithms to determine future resource requirements. As an example, if the Controller 2700 thinks 475 users are going to join at a particular time, it can start allocating those resources 5 seconds early so that when the load hits, the resources have already been allocated and no user sees a delay.
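A minimal sketch of such time based modeling follows. The simple per-minute averaging predictor and the class structure are assumptions for illustration; the source only says the Controller uses predictive algorithms over its transaction history.

```python
# Sketch of predictive pre-allocation from past load history.
from collections import defaultdict

class LoadPredictor:
    def __init__(self):
        self.history = defaultdict(list)   # minute-of-hour -> observed joins

    def record(self, minute_of_hour, joins):
        self.history[minute_of_hour].append(joins)

    def predict(self, minute_of_hour):
        """Predict joins for this minute as the average of past observations,
        so resources can be allocated a few seconds before the load hits."""
        past = self.history[minute_of_hour]
        return round(sum(past) / len(past)) if past else 0

predictor = LoadPredictor()
for joins in (450, 470, 505):    # past top-of-hour surges
    predictor.record(0, joins)
expected = predictor.predict(0)  # sessions to pre-allocate before the hour
```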
Secondly, the resources could be pre-allocated based on input from an operator. Should the operator know a major event is coming, e.g., a pay per view sporting event, he may want to pre-allocate resources in anticipation. In both cases, the SRM 2720 releases unused QAM 2770 resources when not in use and after the event.
Thirdly, QAMs 2770 can be allocated based on a “rate of change” which is independent of previous history. For example, if the controller 2700 recognizes a sudden spike in traffic, it can then request more QAM bandwidth than needed in order to avoid the QAM allocation step when adding additional sessions. An example of a sudden, unexpected spike might be a button as part of the program that indicates a prize could be won if the user selects this button.
Currently, there is one request to the SRM 2720 for each session to be added. Instead the controller 2700 could request the whole QAM 2770 or a large part of a single QAM's bandwidth and allow this invention to handle the data within that QAM channel 2775. Since one aspect of this system is the ability to create a channel that is only 1, 2, or 3 Mb/sec, this could reduce the number of requests to the SRM 2720 by replacing up to 27 requests with a single request.
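The "up to 27 requests" figure follows from how many small interactive streams fit in one QAM channel; a sketch of the arithmetic, assuming a 38.8 Mb/sec channel and streams of roughly 1.4 Mb/sec each (the per-stream rate is an illustrative assumption consistent with the 1 to 3 Mb/sec range above):

```python
# Streams that fit in one QAM channel; requesting the whole channel once
# replaces one SRM request per stream.

def streams_per_qam(qam_mbps, stream_mbps):
    return int(qam_mbps / stream_mbps)

n = streams_per_qam(38.8, 1.4)
```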
The user will also experience a delay when they request different content even if they are already in an active session. Currently, if a set top 2790 is in an active session and requests a new set of content 2730, the Controller 2700 has to tell the SRM 2720 to de-allocate the QAM 2770; then the Controller 2700 must de-allocate the session processor 2750 and the content 2730, request another QAM 2770 from the SRM 2720, and then allocate a different session processor 2750 and content 2730. Instead, the controller 2700 can change the video stream 2755 feeding the QAM modulator 2770, thereby leaving the previously established path intact. There are a couple of ways to accomplish the change. First, since the QAM modulators 2770 are on a network, the controller 2700 can merely change the session processor 2750 driving the QAM 2770. Second, the controller 2700 can leave the session processor 2750 to set top 2790 connection intact but change the content 2730 feeding the session processor 2750, e.g., from "CNN Headline News" to "CNN World Now". Both of these methods eliminate the QAM initialization and Set Top tuning delays.
Thus, resources are intelligently managed to minimize the amount of equipment required to provide these interactive services. In particular, the Controller can manipulate the video streams 2755 feeding the QAM 2770. By profiling these streams 2755, the Controller 2700 can maximize the channel usage within a QAM 2770. That is, it can maximize the number of programs in each QAM channel 2775 reducing wasted bandwidth and the required number of QAMs 2770. There are three primary means to profile streams: formulaic, pre-profiling, and live feedback.
The first profiling method, formulaic, consists of adding up the bit rates of the various video streams used to fill a QAM channel 2775. In particular, there may be many video elements that are used to create a single video stream 2755. The maximum bit rate of each element can be added together to obtain an aggregate bit rate for the video stream 2755. By monitoring the bit rates of all video streams 2755, the Controller 2700 can create a combination of video streams 2755 that most efficiently uses a QAM channel 2775. For example, if there were four video streams 2755, two at 16 Mb/sec and two at 20 Mb/sec, then the controller could best fill a 38.8 Mb/sec QAM channel 2775 by allocating one of each bit rate per channel. This would then require two QAM channels 2775 to deliver the video. However, without formulaic profiling, the result could end up as three QAM channels 2775 if, for instance, the two 16 Mb/sec video streams 2755 were combined into a single 38.8 Mb/sec QAM channel 2775 and each 20 Mb/sec video stream 2755 then required its own 38.8 Mb/sec QAM channel 2775.
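Formulaic allocation is essentially a bin-packing problem over stream bit rates. The sketch below uses first-fit decreasing as one concrete packing strategy; the algorithm choice is an assumption, as the source does not specify how the Controller combines streams.

```python
# First-fit-decreasing packing of streams (by aggregate bit rate, Mb/sec)
# into 38.8 Mb/sec QAM channels.

def pack_streams(rates_mbps, channel_mbps=38.8):
    channels = []                      # each channel is a list of stream rates
    for rate in sorted(rates_mbps, reverse=True):
        for ch in channels:            # place in the first channel with room
            if sum(ch) + rate <= channel_mbps:
                ch.append(rate)
                break
        else:                          # no channel fits: open a new one
            channels.append([rate])
    return channels

# The example above: two 16 Mb/sec and two 20 Mb/sec streams, two channels.
allocation = pack_streams([16, 16, 20, 20])
```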
A second method is pre-profiling. In this method, a profile for the content 2730 is either received or generated internally. The profile information can be provided in metadata with the stream or in a separate file. The profiling information can be generated from the entire video or from a representative sample. The controller 2700 is then aware of the bit rate at various times in the stream and can use this information to effectively combine video streams 2755 together. For example, if two video streams 2755 both had a peak rate of 20 Mb/sec, they would need to be allocated to different 38.8 Mb/sec QAM channels 2775 if they were allocated bandwidth based on their peaks. However, if the controller knew that the nominal bit rate was 14 Mb/sec and knew their respective profiles so there were no simultaneous peaks, the controller 2700 could then combine the streams 2755 into a single 38.8 Mb/sec QAM channel 2775. The particular QAM bit rate is used for the above examples only and should not be construed as a limitation.
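The pre-profiling idea reduces to a pointwise check: two streams can share a channel if their summed bit-rate profile never exceeds the channel capacity, even when both individual peaks would not fit together. A sketch, assuming profiles are sampled on common intervals (the sampling model is an illustrative assumption):

```python
# Sketch of pre-profiling: combine streams whose peaks never coincide.

def can_share_channel(profile_a, profile_b, channel_mbps=38.8):
    """Profiles are bit rates (Mb/sec) sampled on the same intervals."""
    return all(a + b <= channel_mbps for a, b in zip(profile_a, profile_b))

# Both streams peak at 20 Mb/sec around a 14 Mb/sec nominal rate,
# but never at the same time, so one channel suffices.
stream1 = [14, 20, 14, 14]
stream2 = [14, 14, 20, 14]
shareable = can_share_channel(stream1, stream2)
```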
A third method for profiling is via feedback provided by the system. The system can inform the controller 2700 of the current bit rate for all video elements used to build streams and the aggregate bit rate of the stream after it has been built. Furthermore, it can inform the controller 2700 of bit rates of stored elements prior to their use. Using this information, the controller 2700 can combine video streams 2755 in the most efficient manner to fill a QAM channel 2775.
It should be noted that it is also acceptable to use any or all of the three profiling methods in combination. That is, there is no restriction that they must be used independently.
The system can also address the usage of the resources themselves. For example, if a session processor 2750 can support 100 users and currently there are 350 users that are active, it requires four session processors. However, when the demand goes down to say 80 users, it would make sense to reallocate those resources to a single session processor 2750, thereby conserving the remaining resources of three session processors. This is also useful in failure situations. Should a resource fail, the invention can reassign sessions to other resources that are available. In this way, disruption to the user is minimized.
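The consolidation rule above is a ceiling division over per-processor capacity; a sketch using the example figures (the 100-user capacity is taken from the text):

```python
# Session processors needed for a given load, rounding up.
import math

def processors_needed(active_users, users_per_processor=100):
    return math.ceil(active_users / users_per_processor)

peak = processors_needed(350)   # four processors at peak demand
low = processors_needed(80)     # one processor once demand falls, freeing three
```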
The system can also repurpose functions depending on the expected usage. The session processors 2750 can implement a number of different functions, for example, process video, process audio, etc. Since the controller 2700 has a history of usage, it can adjust the functions on the session processors 2750 to meet expected demand. For example, if in the early afternoons there is typically a high demand for music, the controller 2700 can reassign additional session processors 2750 to process music in anticipation of the demand. Correspondingly, if in the early evening there is a high demand for news, the controller 2700 anticipates the demand and reassigns the session processors 2750 accordingly. The flexibility and anticipation of the system allows it to provide the optimum user experience with the minimum amount of equipment. That is, no equipment is idle because it only has a single purpose and that purpose is not required.
The present invention may be embodied in many different forms, including, but in no way limited to, computer program logic for use with a processor (e.g., a microprocessor, microcontroller, digital signal processor, or general purpose computer), programmable logic for use with a programmable logic device (e.g., a Field Programmable Gate Array (FPGA) or other PLD), discrete components, integrated circuitry (e.g., an Application Specific Integrated Circuit (ASIC)), or any other means including any combination thereof. In an embodiment of the present invention, predominantly all of the reordering logic may be implemented as a set of computer program instructions that is converted into a computer executable form, stored as such in a computer readable medium, and executed by a microprocessor within the array under the control of an operating system.
Computer program logic implementing all or part of the functionality previously described herein may be embodied in various forms, including, but in no way limited to, a source code form, a computer executable form, and various intermediate forms (e.g., forms generated by an assembler, compiler, linker, or locator.) Source code may include a series of computer program instructions implemented in any of various programming languages (e.g., an object code, an assembly language, or a high-level language such as Fortran, C, C++, JAVA, or HTML) for use with various operating systems or operating environments. The source code may define and use various data structures and communication messages. The source code may be in a computer executable form (e.g., via an interpreter), or the source code may be converted (e.g., via a translator, assembler, or compiler) into a computer executable form.
The computer program may be fixed in any form (e.g., source code form, computer executable form, or an intermediate form) either permanently or transitorily in a tangible storage medium, such as a semiconductor memory device (e.g., a RAM, ROM, PROM, EEPROM, or Flash-Programmable RAM), a magnetic memory device (e.g., a diskette or fixed disk), an optical memory device (e.g., a CD-ROM), a PC card (e.g., PCMCIA card), or other memory device. The computer program may be fixed in any form in a signal that is transmittable to a computer using any of various communication technologies, including, but in no way limited to, analog technologies, digital technologies, optical technologies, wireless technologies, networking technologies, and internetworking technologies. The computer program may be distributed in any form as a removable storage medium with accompanying printed or electronic documentation (e.g., shrink wrapped software or a magnetic tape), preloaded with a computer system (e.g., on system ROM or fixed disk), or distributed from a server or electronic bulletin board over the communication system (e.g., the Internet or World Wide Web.)
Hardware logic (including programmable logic for use with a programmable logic device) implementing all or part of the functionality previously described herein may be designed using traditional manual methods, or may be designed, captured, simulated, or documented electronically using various tools, such as Computer Aided Design (CAD), a hardware description language (e.g., VHDL or AHDL), or a PLD programming language (e.g., PALASM, ABEL, or CUPL.)
While the invention has been particularly shown and described with reference to specific embodiments, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the appended clauses.
Embodiments of the present invention may be described, without limitation, by the following clauses. While these embodiments have been described in the clauses by process steps, an apparatus comprising a computer with associated display capable of executing the process steps in the clauses below is also included in the present invention. Likewise, a computer program product including computer executable instructions for executing the process steps in the clauses below and stored on a computer readable medium is included within the present invention.
| Number | Name | Date | Kind |
|---|---|---|---|
| 3889050 | Thompson | Jun 1975 | A |
| 3934079 | Barnhart | Jan 1976 | A |
| 3997718 | Ricketts et al. | Dec 1976 | A |
| 4002843 | Rackman | Jan 1977 | A |
| 4032972 | Saylor | Jun 1977 | A |
| 4077006 | Nicholson | Feb 1978 | A |
| 4081831 | Tang et al. | Mar 1978 | A |
| 4107734 | Percy et al. | Aug 1978 | A |
| 4107735 | Frohbach | Aug 1978 | A |
| 4145720 | Weintraub et al. | Mar 1979 | A |
| 4168400 | de Couasnon et al. | Sep 1979 | A |
| 4186438 | Benson et al. | Jan 1980 | A |
| 4222068 | Thompson | Sep 1980 | A |
| 4245245 | Matsumoto et al. | Jan 1981 | A |
| 4247106 | Jeffers et al. | Jan 1981 | A |
| 4253114 | Tang et al. | Feb 1981 | A |
| 4264924 | Freeman | Apr 1981 | A |
| 4264925 | Freeman et al. | Apr 1981 | A |
| 4290142 | Schnee et al. | Sep 1981 | A |
| 4302771 | Gargini | Nov 1981 | A |
| 4308554 | Percy et al. | Dec 1981 | A |
| 4350980 | Ward | Sep 1982 | A |
| 4367557 | Stern et al. | Jan 1983 | A |
| 4395780 | Gohm et al. | Jul 1983 | A |
| 4408225 | Ensinger et al. | Oct 1983 | A |
| 4450477 | Lovett | May 1984 | A |
| 4454538 | Toriumi | Jun 1984 | A |
| 4466017 | Banker | Aug 1984 | A |
| 4471380 | Mobley | Sep 1984 | A |
| 4475123 | Dumbauld et al. | Oct 1984 | A |
| 4484217 | Block et al. | Nov 1984 | A |
| 4491983 | Pinnow et al. | Jan 1985 | A |
| 4506387 | Walter | Mar 1985 | A |
| 4507680 | Freeman | Mar 1985 | A |
| 4509073 | Baran et al. | Apr 1985 | A |
| 4523228 | Banker | Jun 1985 | A |
| 4533948 | McNamara et al. | Aug 1985 | A |
| 4536791 | Campbell et al. | Aug 1985 | A |
| 4538174 | Gargini et al. | Aug 1985 | A |
| 4538176 | Nakajima et al. | Aug 1985 | A |
| 4553161 | Citta | Nov 1985 | A |
| 4554581 | Tentler et al. | Nov 1985 | A |
| 4555561 | Sugimori et al. | Nov 1985 | A |
| 4562465 | Glaab | Dec 1985 | A |
| 4567517 | Mobley | Jan 1986 | A |
| 4573072 | Freeman | Feb 1986 | A |
| 4591906 | Morales-Garza et al. | May 1986 | A |
| 4602279 | Freeman | Jul 1986 | A |
| 4614970 | Clupper et al. | Sep 1986 | A |
| 4616263 | Eichelberger | Oct 1986 | A |
| 4625235 | Watson | Nov 1986 | A |
| 4627105 | Ohashi et al. | Dec 1986 | A |
| 4633462 | Stifle et al. | Dec 1986 | A |
| 4670904 | Rumreich | Jun 1987 | A |
| 4682360 | Frederiksen | Jul 1987 | A |
| 4695880 | Johnson et al. | Sep 1987 | A |
| 4706121 | Young | Nov 1987 | A |
| 4706285 | Rumreich | Nov 1987 | A |
| 4709418 | Fox et al. | Nov 1987 | A |
| 4710971 | Nozaki et al. | Dec 1987 | A |
| 4718086 | Rumreich et al. | Jan 1988 | A |
| 4732764 | Hemingway et al. | Mar 1988 | A |
| 4734764 | Pocock et al. | Mar 1988 | A |
| 4748689 | Mohr | May 1988 | A |
| 4749992 | Fitzemeyer et al. | Jun 1988 | A |
| 4750036 | Martinez | Jun 1988 | A |
| 4754426 | Rast et al. | Jun 1988 | A |
| 4760442 | O'Connell et al. | Jul 1988 | A |
| 4763317 | Lehman et al. | Aug 1988 | A |
| 4769833 | Farleigh et al. | Sep 1988 | A |
| 4769838 | Hasegawa | Sep 1988 | A |
| 4789863 | Bush | Dec 1988 | A |
| 4792849 | McCalley et al. | Dec 1988 | A |
| 4801190 | Imoto | Jan 1989 | A |
| 4805134 | Calo et al. | Feb 1989 | A |
| 4807031 | Broughton et al. | Feb 1989 | A |
| 4816905 | Tweety et al. | Mar 1989 | A |
| 4821102 | Ichikawa et al. | Apr 1989 | A |
| 4823386 | Dumbauld et al. | Apr 1989 | A |
| 4827253 | Maltz | May 1989 | A |
| 4827511 | Masuko | May 1989 | A |
| 4829372 | McCalley et al. | May 1989 | A |
| 4829558 | Welsh | May 1989 | A |
| 4847698 | Freeman | Jul 1989 | A |
| 4847699 | Freeman | Jul 1989 | A |
| 4847700 | Freeman | Jul 1989 | A |
| 4848698 | Newell et al. | Jul 1989 | A |
| 4860379 | Schoeneberger et al. | Aug 1989 | A |
| 4864613 | Van Cleave | Sep 1989 | A |
| 4876592 | Von Kohorn | Oct 1989 | A |
| 4889369 | Albrecht | Dec 1989 | A |
| 4890320 | Monslow et al. | Dec 1989 | A |
| 4891694 | Way | Jan 1990 | A |
| 4901367 | Nicholson | Feb 1990 | A |
| 4903126 | Kassatly | Feb 1990 | A |
| 4905094 | Pocock et al. | Feb 1990 | A |
| 4912760 | West, Jr. et al. | Mar 1990 | A |
| 4918516 | Freeman | Apr 1990 | A |
| 4920566 | Robbins et al. | Apr 1990 | A |
| 4922532 | Farmer et al. | May 1990 | A |
| 4924303 | Brandon et al. | May 1990 | A |
| 4924498 | Farmer et al. | May 1990 | A |
| 4937821 | Boulton | Jun 1990 | A |
| 4941040 | Pocock et al. | Jul 1990 | A |
| 4947244 | Fenwick et al. | Aug 1990 | A |
| 4961211 | Tsugane et al. | Oct 1990 | A |
| 4963995 | Lang | Oct 1990 | A |
| 4975771 | Kassatly | Dec 1990 | A |
| 4989245 | Bennett | Jan 1991 | A |
| 4994909 | Graves et al. | Feb 1991 | A |
| 4995078 | Monslow et al. | Feb 1991 | A |
| 5003384 | Durden et al. | Mar 1991 | A |
| 5008934 | Endoh | Apr 1991 | A |
| 5014125 | Pocock et al. | May 1991 | A |
| 5027400 | Baji et al. | Jun 1991 | A |
| 5051720 | Kittirutsunetorn | Sep 1991 | A |
| 5051822 | Rhoades | Sep 1991 | A |
| 5057917 | Shalkauser et al. | Oct 1991 | A |
| 5058160 | Banker et al. | Oct 1991 | A |
| 5060262 | Bevins, Jr. et al. | Oct 1991 | A |
| 5077607 | Johnson et al. | Dec 1991 | A |
| 5083800 | Lockton | Jan 1992 | A |
| 5088111 | McNamara et al. | Feb 1992 | A |
| 5093718 | Hoarty et al. | Mar 1992 | A |
| 5109414 | Harvey et al. | Apr 1992 | A |
| 5113496 | McCalley et al. | May 1992 | A |
| 5119188 | McCalley et al. | Jun 1992 | A |
| 5130792 | Tindell et al. | Jul 1992 | A |
| 5132992 | Yurt et al. | Jul 1992 | A |
| 5133009 | Rumreich | Jul 1992 | A |
| 5133079 | Ballantyne et al. | Jul 1992 | A |
| 5136411 | Paik et al. | Aug 1992 | A |
| 5142575 | Farmer et al. | Aug 1992 | A |
| 5144448 | Hornbaker, III et al. | Sep 1992 | A |
| 5155591 | Wachob | Oct 1992 | A |
| 5172413 | Bradley et al. | Dec 1992 | A |
| 5191410 | McCalley et al. | Mar 1993 | A |
| 5195092 | Wilson et al. | Mar 1993 | A |
| 5208665 | McCalley et al. | May 1993 | A |
| 5220420 | Hoarty et al. | Jun 1993 | A |
| 5230019 | Yanagimichi et al. | Jul 1993 | A |
| 5231494 | Wachob | Jul 1993 | A |
| 5236199 | Thompson, Jr. | Aug 1993 | A |
| 5247347 | Litteral et al. | Sep 1993 | A |
| 5253341 | Rozmanith et al. | Oct 1993 | A |
| 5262854 | Ng | Nov 1993 | A |
| 5262860 | Fitzpatrick et al. | Nov 1993 | A |
| 5303388 | Kreitman et al. | Apr 1994 | A |
| 5319455 | Hoarty et al. | Jun 1994 | A |
| 5319707 | Wasilewski et al. | Jun 1994 | A |
| 5321440 | Yanagihara et al. | Jun 1994 | A |
| 5321514 | Martinez | Jun 1994 | A |
| 5351129 | Lai | Sep 1994 | A |
| 5355162 | Yazolino et al. | Oct 1994 | A |
| 5359601 | Wasilewski et al. | Oct 1994 | A |
| 5361091 | Hoarty et al. | Nov 1994 | A |
| 5371532 | Gelman et al. | Dec 1994 | A |
| 5404393 | Remillard | Apr 1995 | A |
| 5408274 | Chang et al. | Apr 1995 | A |
| 5410343 | Coddington et al. | Apr 1995 | A |
| 5410344 | Graves et al. | Apr 1995 | A |
| 5412415 | Cook et al. | May 1995 | A |
| 5412720 | Hoarty | May 1995 | A |
| 5418559 | Blahut | May 1995 | A |
| 5422674 | Hooper et al. | Jun 1995 | A |
| 5422887 | Diepstraten et al. | Jun 1995 | A |
| 5442389 | Blahut et al. | Aug 1995 | A |
| 5442390 | Hooper et al. | Aug 1995 | A |
| 5442700 | Snell et al. | Aug 1995 | A |
| 5446490 | Blahut et al. | Aug 1995 | A |
| 5469283 | Vinel et al. | Nov 1995 | A |
| 5469431 | Wendorf et al. | Nov 1995 | A |
| 5471263 | Odaka | Nov 1995 | A |
| 5481542 | Logston et al. | Jan 1996 | A |
| 5485197 | Hoarty | Jan 1996 | A |
| 5487066 | McNamara et al. | Jan 1996 | A |
| 5493638 | Hooper et al. | Feb 1996 | A |
| 5495283 | Cowe | Feb 1996 | A |
| 5495295 | Long | Feb 1996 | A |
| 5497187 | Banker et al. | Mar 1996 | A |
| 5517250 | Hoogenboom et al. | May 1996 | A |
| 5526034 | Hoarty et al. | Jun 1996 | A |
| 5528281 | Grady et al. | Jun 1996 | A |
| 5537397 | Abramson | Jul 1996 | A |
| 5537404 | Bentley et al. | Jul 1996 | A |
| 5539449 | Blahut et al. | Jul 1996 | A |
| RE35314 | Logg | Aug 1996 | E |
| 5548340 | Bertram | Aug 1996 | A |
| 5550578 | Hoarty et al. | Aug 1996 | A |
| 5557316 | Hoarty et al. | Sep 1996 | A |
| 5559549 | Hendricks et al. | Sep 1996 | A |
| 5561708 | Remillard | Oct 1996 | A |
| 5570126 | Blahut et al. | Oct 1996 | A |
| 5570363 | Holm | Oct 1996 | A |
| 5579143 | Huber | Nov 1996 | A |
| 5581653 | Todd | Dec 1996 | A |
| 5583927 | Ely et al. | Dec 1996 | A |
| 5587734 | Lauder et al. | Dec 1996 | A |
| 5589885 | Ooi | Dec 1996 | A |
| 5592470 | Rudrapatna et al. | Jan 1997 | A |
| 5594507 | Hoarty | Jan 1997 | A |
| 5594723 | Tibi | Jan 1997 | A |
| 5594938 | Engel | Jan 1997 | A |
| 5596693 | Needle et al. | Jan 1997 | A |
| 5600364 | Hendricks et al. | Feb 1997 | A |
| 5600573 | Hendricks et al. | Feb 1997 | A |
| 5608446 | Carr et al. | Mar 1997 | A |
| 5617145 | Huang et al. | Apr 1997 | A |
| 5621464 | Teo et al. | Apr 1997 | A |
| 5625404 | Grady et al. | Apr 1997 | A |
| 5630757 | Gagin et al. | May 1997 | A |
| 5631693 | Wunderlich et al. | May 1997 | A |
| 5631846 | Szurkowski | May 1997 | A |
| 5632003 | Davidson et al. | May 1997 | A |
| 5649283 | Galler et al. | Jul 1997 | A |
| 5668592 | Spaulding, II | Sep 1997 | A |
| 5668599 | Cheney et al. | Sep 1997 | A |
| 5708767 | Yeo et al. | Jan 1998 | A |
| 5710815 | Ming et al. | Jan 1998 | A |
| 5712906 | Grady et al. | Jan 1998 | A |
| 5740307 | Lane | Apr 1998 | A |
| 5742289 | Naylor et al. | Apr 1998 | A |
| 5748234 | Lippincott | May 1998 | A |
| 5754941 | Sharpe et al. | May 1998 | A |
| 5786527 | Tarte | Jul 1998 | A |
| 5790174 | Richard, III et al. | Aug 1998 | A |
| 5802283 | Grady et al. | Sep 1998 | A |
| 5812665 | Hoarty et al. | Sep 1998 | A |
| 5812786 | Seazholtz et al. | Sep 1998 | A |
| 5815604 | Simons et al. | Sep 1998 | A |
| 5818438 | Howe et al. | Oct 1998 | A |
| 5821945 | Yeo et al. | Oct 1998 | A |
| 5822537 | Katseff et al. | Oct 1998 | A |
| 5828371 | Cline et al. | Oct 1998 | A |
| 5844594 | Ferguson | Dec 1998 | A |
| 5845083 | Hamadani et al. | Dec 1998 | A |
| 5862325 | Reed et al. | Jan 1999 | A |
| 5864820 | Case | Jan 1999 | A |
| 5867208 | McLaren | Feb 1999 | A |
| 5883661 | Hoarty | Mar 1999 | A |
| 5903727 | Nielsen | May 1999 | A |
| 5903816 | Broadwin et al. | May 1999 | A |
| 5905522 | Lawler | May 1999 | A |
| 5907681 | Bates et al. | May 1999 | A |
| 5917822 | Lyles et al. | Jun 1999 | A |
| 5946352 | Rowlands et al. | Aug 1999 | A |
| 5952943 | Walsh et al. | Sep 1999 | A |
| 5959690 | Toebes et al. | Sep 1999 | A |
| 5961603 | Kunkel et al. | Oct 1999 | A |
| 5963203 | Goldberg et al. | Oct 1999 | A |
| 5966163 | Lin et al. | Oct 1999 | A |
| 5978756 | Walker et al. | Nov 1999 | A |
| 5982445 | Eyer et al. | Nov 1999 | A |
| 5990862 | Lewis | Nov 1999 | A |
| 5995146 | Rasmussen | Nov 1999 | A |
| 5995488 | Kalkunte et al. | Nov 1999 | A |
| 5999970 | Krisbergh et al. | Dec 1999 | A |
| 6014416 | Shin et al. | Jan 2000 | A |
| 6021386 | Davis et al. | Feb 2000 | A |
| 6031989 | Cordell | Feb 2000 | A |
| 6034678 | Hoarty et al. | Mar 2000 | A |
| 6049539 | Lee et al. | Apr 2000 | A |
| 6049831 | Gardell et al. | Apr 2000 | A |
| 6052555 | Ferguson | Apr 2000 | A |
| 6055314 | Spies et al. | Apr 2000 | A |
| 6055315 | Doyle et al. | Apr 2000 | A |
| 6064377 | Hoarty et al. | May 2000 | A |
| 6078328 | Schumann et al. | Jun 2000 | A |
| 6084908 | Chiang et al. | Jul 2000 | A |
| 6100883 | Hoarty | Aug 2000 | A |
| 6108625 | Kim | Aug 2000 | A |
| 6131182 | Beakes et al. | Oct 2000 | A |
| 6141645 | Chi-Min et al. | Oct 2000 | A |
| 6141693 | Perlman et al. | Oct 2000 | A |
| 6144698 | Poon et al. | Nov 2000 | A |
| 6167084 | Wang et al. | Dec 2000 | A |
| 6169573 | Sampath-Kumar et al. | Jan 2001 | B1 |
| 6177931 | Alexander et al. | Jan 2001 | B1 |
| 6182072 | Leak et al. | Jan 2001 | B1 |
| 6184878 | Alonso et al. | Feb 2001 | B1 |
| 6192081 | Chiang et al. | Feb 2001 | B1 |
| 6198822 | Doyle et al. | Mar 2001 | B1 |
| 6205582 | Hoarty | Mar 2001 | B1 |
| 6226041 | Florencio et al. | May 2001 | B1 |
| 6236730 | Cowieson et al. | May 2001 | B1 |
| 6243418 | Kim | Jun 2001 | B1 |
| 6253238 | Lauder et al. | Jun 2001 | B1 |
| 6253375 | Gordon et al. | Jun 2001 | B1 |
| 6256047 | Isobe et al. | Jul 2001 | B1 |
| 6259826 | Pollard et al. | Jul 2001 | B1 |
| 6266369 | Wang et al. | Jul 2001 | B1 |
| 6266684 | Kraus et al. | Jul 2001 | B1 |
| 6275496 | Burns et al. | Aug 2001 | B1 |
| 6292194 | Powell, III | Sep 2001 | B1 |
| 6305020 | Hoarty et al. | Oct 2001 | B1 |
| 6310915 | Wells et al. | Oct 2001 | B1 |
| 6317151 | Ohsuga et al. | Nov 2001 | B1 |
| 6317885 | Fries | Nov 2001 | B1 |
| 6324217 | Gordon | Nov 2001 | B1 |
| 6349284 | Park et al. | Feb 2002 | B1 |
| 6385771 | Gordon | May 2002 | B1 |
| 6386980 | Nishino et al. | May 2002 | B1 |
| 6389075 | Wang et al. | May 2002 | B2 |
| 6389218 | Gordon et al. | May 2002 | B2 |
| 6415031 | Colligan et al. | Jul 2002 | B1 |
| 6415437 | Ludvig et al. | Jul 2002 | B1 |
| 6438140 | Jungers et al. | Aug 2002 | B1 |
| 6446037 | Fielder et al. | Sep 2002 | B1 |
| 6459427 | Mao et al. | Oct 2002 | B1 |
| 6477182 | Calderone | Nov 2002 | B2 |
| 6481012 | Gordon et al. | Nov 2002 | B1 |
| 6512793 | Maeda | Jan 2003 | B1 |
| 6525746 | Lau et al. | Feb 2003 | B1 |
| 6536043 | Guedalia | Mar 2003 | B1 |
| 6557041 | Mallart | Apr 2003 | B2 |
| 6560496 | Michener | May 2003 | B1 |
| 6564378 | Satterfield et al. | May 2003 | B1 |
| 6579184 | Tanskanen | Jun 2003 | B1 |
| 6584153 | Comito et al. | Jun 2003 | B1 |
| 6588017 | Calderone | Jul 2003 | B1 |
| 6598229 | Smyth et al. | Jul 2003 | B2 |
| 6604224 | Armstrong et al. | Aug 2003 | B1 |
| 6606746 | Zdepski et al. | Aug 2003 | B1 |
| 6614442 | Ouyang et al. | Sep 2003 | B1 |
| 6614843 | Gordon et al. | Sep 2003 | B1 |
| 6621870 | Gordon et al. | Sep 2003 | B1 |
| 6625574 | Taniguchi et al. | Sep 2003 | B1 |
| 6639896 | Goode et al. | Oct 2003 | B1 |
| 6645076 | Sugai | Nov 2003 | B1 |
| 6651252 | Gordon et al. | Nov 2003 | B1 |
| 6657647 | Bright | Dec 2003 | B1 |
| 6675385 | Wang | Jan 2004 | B1 |
| 6675387 | Boucher | Jan 2004 | B1 |
| 6681326 | Son et al. | Jan 2004 | B2 |
| 6681397 | Tsai et al. | Jan 2004 | B1 |
| 6684400 | Goode et al. | Jan 2004 | B1 |
| 6687663 | McGrath et al. | Feb 2004 | B1 |
| 6691208 | Dandrea et al. | Feb 2004 | B2 |
| 6697376 | Son et al. | Feb 2004 | B1 |
| 6704359 | Bayrakeri et al. | Mar 2004 | B1 |
| 6717600 | Dutta et al. | Apr 2004 | B2 |
| 6718552 | Goode | Apr 2004 | B1 |
| 6721794 | Taylor et al. | Apr 2004 | B2 |
| 6721956 | Wasilewski | Apr 2004 | B2 |
| 6727929 | Bates et al. | Apr 2004 | B1 |
| 6732370 | Gordon et al. | May 2004 | B1 |
| 6747991 | Hemy et al. | Jun 2004 | B1 |
| 6754271 | Gordon et al. | Jun 2004 | B1 |
| 6754905 | Gordon et al. | Jun 2004 | B2 |
| 6758540 | Adolph et al. | Jul 2004 | B1 |
| 6766407 | Lisitsa et al. | Jul 2004 | B1 |
| 6771704 | Hannah | Aug 2004 | B1 |
| 6785902 | Zigmond et al. | Aug 2004 | B1 |
| 6807528 | Truman et al. | Oct 2004 | B1 |
| 6810528 | Chatani | Oct 2004 | B1 |
| 6817947 | Tanskanen | Nov 2004 | B2 |
| 6886178 | Mao et al. | Apr 2005 | B1 |
| 6907574 | Xu et al. | Jun 2005 | B2 |
| 6931291 | Alvarez-Tinoco et al. | Aug 2005 | B1 |
| 6934965 | Gordon et al. | Aug 2005 | B2 |
| 6941019 | Mitchell et al. | Sep 2005 | B1 |
| 6941574 | Broadwin et al. | Sep 2005 | B1 |
| 6947509 | Wong | Sep 2005 | B1 |
| 6952221 | Holtz et al. | Oct 2005 | B1 |
| 6956899 | Hall et al. | Oct 2005 | B2 |
| 7030890 | Jouet et al. | Apr 2006 | B1 |
| 7050113 | Campisano et al. | May 2006 | B2 |
| 7089577 | Rakib et al. | Aug 2006 | B1 |
| 7095402 | Kunil et al. | Aug 2006 | B2 |
| 7114167 | Slemmer et al. | Sep 2006 | B2 |
| 7124424 | Gordon et al. | Oct 2006 | B2 |
| 7146615 | Hervet et al. | Dec 2006 | B1 |
| 7146628 | Gordon et al. | Dec 2006 | B1 |
| 7158676 | Rainsford | Jan 2007 | B1 |
| 7200836 | Brodersen et al. | Apr 2007 | B2 |
| 7212573 | Winger | May 2007 | B2 |
| 7224731 | Mehrotra | May 2007 | B2 |
| 7272556 | Aguilar et al. | Sep 2007 | B1 |
| 7310619 | Baar et al. | Dec 2007 | B2 |
| 7325043 | Rosenberg et al. | Jan 2008 | B1 |
| 7346111 | Winger et al. | Mar 2008 | B2 |
| 7360230 | Paz et al. | Apr 2008 | B1 |
| 7412423 | Asano | Aug 2008 | B1 |
| 7412505 | Slemmer et al. | Aug 2008 | B2 |
| 7421082 | Kamiya et al. | Sep 2008 | B2 |
| 7444306 | Varble | Oct 2008 | B2 |
| 7444418 | Chou et al. | Oct 2008 | B2 |
| 7500235 | Maynard et al. | Mar 2009 | B2 |
| 7508941 | O'Toole, Jr. et al. | Mar 2009 | B1 |
| 7512577 | Slemmer et al. | Mar 2009 | B2 |
| 7543073 | Chou et al. | Jun 2009 | B2 |
| 7596764 | Vienneau et al. | Sep 2009 | B2 |
| 7623575 | Winger | Nov 2009 | B2 |
| 7669220 | Goode | Feb 2010 | B2 |
| 7742609 | Yeakel et al. | Jun 2010 | B2 |
| 7743400 | Kurauchi | Jun 2010 | B2 |
| 7751572 | Villemoes et al. | Jul 2010 | B2 |
| 7757157 | Fukuda | Jul 2010 | B1 |
| 7830388 | Lu | Nov 2010 | B1 |
| 7840905 | Weber et al. | Nov 2010 | B1 |
| 7936819 | Craig et al. | May 2011 | B2 |
| 7970263 | Asch | Jun 2011 | B1 |
| 7987489 | Krzyzanowski et al. | Jul 2011 | B2 |
| 8027353 | Damola et al. | Sep 2011 | B2 |
| 8036271 | Winger et al. | Oct 2011 | B2 |
| 8046798 | Schlack et al. | Oct 2011 | B1 |
| 8074248 | Sigmon et al. | Dec 2011 | B2 |
| 8118676 | Craig et al. | Feb 2012 | B2 |
| 8136033 | Bhargava et al. | Mar 2012 | B1 |
| 8149917 | Zhang et al. | Apr 2012 | B2 |
| 8155194 | Winger et al. | Apr 2012 | B2 |
| 8155202 | Landau | Apr 2012 | B2 |
| 8170107 | Winger | May 2012 | B2 |
| 8194862 | Herr et al. | Jun 2012 | B2 |
| 8243630 | Luo et al. | Aug 2012 | B2 |
| 8270439 | Herr et al. | Sep 2012 | B2 |
| 8284842 | Craig et al. | Oct 2012 | B2 |
| 8296424 | Malloy et al. | Oct 2012 | B2 |
| 8370869 | Paek et al. | Feb 2013 | B2 |
| 8411754 | Zhang et al. | Apr 2013 | B2 |
| 8442110 | Pavlovskaia et al. | May 2013 | B2 |
| 8473996 | Gordon et al. | Jun 2013 | B2 |
| 8619867 | Craig et al. | Dec 2013 | B2 |
| 8621500 | Weaver et al. | Dec 2013 | B2 |
| 20010008845 | Kusuda et al. | Jul 2001 | A1 |
| 20010049301 | Masuda et al. | Dec 2001 | A1 |
| 20020007491 | Schiller et al. | Jan 2002 | A1 |
| 20020013812 | Krueger et al. | Jan 2002 | A1 |
| 20020016161 | Dellien et al. | Feb 2002 | A1 |
| 20020021353 | DeNies | Feb 2002 | A1 |
| 20020026642 | Augenbraun et al. | Feb 2002 | A1 |
| 20020027567 | Niamir | Mar 2002 | A1 |
| 20020032697 | French et al. | Mar 2002 | A1 |
| 20020040482 | Sextro et al. | Apr 2002 | A1 |
| 20020047899 | Son et al. | Apr 2002 | A1 |
| 20020049975 | Thomas et al. | Apr 2002 | A1 |
| 20020056083 | Istvan | May 2002 | A1 |
| 20020056107 | Schlack | May 2002 | A1 |
| 20020056136 | Wistendahl et al. | May 2002 | A1 |
| 20020059644 | Andrade et al. | May 2002 | A1 |
| 20020062484 | De Lange et al. | May 2002 | A1 |
| 20020066101 | Gordon et al. | May 2002 | A1 |
| 20020067766 | Sakamoto et al. | Jun 2002 | A1 |
| 20020069267 | Thiele | Jun 2002 | A1 |
| 20020072408 | Kumagai | Jun 2002 | A1 |
| 20020078171 | Schneider | Jun 2002 | A1 |
| 20020078456 | Hudson et al. | Jun 2002 | A1 |
| 20020083464 | Tomsen et al. | Jun 2002 | A1 |
| 20020095689 | Novak | Jul 2002 | A1 |
| 20020105531 | Niemi | Aug 2002 | A1 |
| 20020108121 | Alao et al. | Aug 2002 | A1 |
| 20020131511 | Zenoni | Sep 2002 | A1 |
| 20020136298 | Anantharamu et al. | Sep 2002 | A1 |
| 20020152318 | Menon et al. | Oct 2002 | A1 |
| 20020171765 | Waki et al. | Nov 2002 | A1 |
| 20020175931 | Holtz et al. | Nov 2002 | A1 |
| 20020178447 | Plotnick et al. | Nov 2002 | A1 |
| 20020188628 | Cooper et al. | Dec 2002 | A1 |
| 20020191851 | Keinan | Dec 2002 | A1 |
| 20020194592 | Tsuchida et al. | Dec 2002 | A1 |
| 20020196746 | Allen | Dec 2002 | A1 |
| 20030018796 | Chou et al. | Jan 2003 | A1 |
| 20030027517 | Callway et al. | Feb 2003 | A1 |
| 20030035486 | Kato et al. | Feb 2003 | A1 |
| 20030038893 | Rajamaki et al. | Feb 2003 | A1 |
| 20030039398 | McIntyre | Feb 2003 | A1 |
| 20030046690 | Miller | Mar 2003 | A1 |
| 20030051253 | Barone, Jr. | Mar 2003 | A1 |
| 20030058941 | Chen et al. | Mar 2003 | A1 |
| 20030061451 | Beyda | Mar 2003 | A1 |
| 20030065739 | Shnier | Apr 2003 | A1 |
| 20030071792 | Safadi | Apr 2003 | A1 |
| 20030072372 | Shen et al. | Apr 2003 | A1 |
| 20030076546 | Johnson et al. | Apr 2003 | A1 |
| 20030088328 | Nishio et al. | May 2003 | A1 |
| 20030088400 | Nishio et al. | May 2003 | A1 |
| 20030095790 | Joshi | May 2003 | A1 |
| 20030107443 | Clancy | Jun 2003 | A1 |
| 20030122836 | Doyle et al. | Jul 2003 | A1 |
| 20030123664 | Pedlow, Jr. et al. | Jul 2003 | A1 |
| 20030126608 | Safadi | Jul 2003 | A1 |
| 20030126611 | Kuczynski-Brown | Jul 2003 | A1 |
| 20030131349 | Kuczynski-Brown | Jul 2003 | A1 |
| 20030135860 | Dureau | Jul 2003 | A1 |
| 20030169373 | Peters et al. | Sep 2003 | A1 |
| 20030177199 | Zenoni | Sep 2003 | A1 |
| 20030188309 | Yuen | Oct 2003 | A1 |
| 20030189980 | Dvir et al. | Oct 2003 | A1 |
| 20030196174 | Pierre Cote et al. | Oct 2003 | A1 |
| 20030208768 | Urdang et al. | Nov 2003 | A1 |
| 20030229719 | Iwata et al. | Dec 2003 | A1 |
| 20030229900 | Reisman | Dec 2003 | A1 |
| 20030231218 | Amadio | Dec 2003 | A1 |
| 20040016000 | Zhang et al. | Jan 2004 | A1 |
| 20040034873 | Zenoni | Feb 2004 | A1 |
| 20040040035 | Carlucci et al. | Feb 2004 | A1 |
| 20040078822 | Breen et al. | Apr 2004 | A1 |
| 20040088375 | Sethi et al. | May 2004 | A1 |
| 20040091171 | Bone | May 2004 | A1 |
| 20040111526 | Baldwin et al. | Jun 2004 | A1 |
| 20040117827 | Karaoguz et al. | Jun 2004 | A1 |
| 20040128686 | Boyer et al. | Jul 2004 | A1 |
| 20040133704 | Krzyzanowski et al. | Jul 2004 | A1 |
| 20040136698 | Mock | Jul 2004 | A1 |
| 20040139158 | Datta | Jul 2004 | A1 |
| 20040157662 | Tsuchiya | Aug 2004 | A1 |
| 20040163101 | Swix et al. | Aug 2004 | A1 |
| 20040184542 | Fujimoto | Sep 2004 | A1 |
| 20040193648 | Lai et al. | Sep 2004 | A1 |
| 20040210824 | Shoff et al. | Oct 2004 | A1 |
| 20040261106 | Hoffman | Dec 2004 | A1 |
| 20040261114 | Addington et al. | Dec 2004 | A1 |
| 20050015259 | Thumpudi et al. | Jan 2005 | A1 |
| 20050015816 | Christofalo et al. | Jan 2005 | A1 |
| 20050021830 | Urzaiz et al. | Jan 2005 | A1 |
| 20050034155 | Gordon et al. | Feb 2005 | A1 |
| 20050034162 | White et al. | Feb 2005 | A1 |
| 20050044575 | Der Kuyl | Feb 2005 | A1 |
| 20050055685 | Maynard et al. | Mar 2005 | A1 |
| 20050055721 | Zigmond et al. | Mar 2005 | A1 |
| 20050071876 | van Beek | Mar 2005 | A1 |
| 20050076134 | Bialik et al. | Apr 2005 | A1 |
| 20050089091 | Kim et al. | Apr 2005 | A1 |
| 20050091690 | Delpuch et al. | Apr 2005 | A1 |
| 20050091695 | Paz et al. | Apr 2005 | A1 |
| 20050105608 | Coleman et al. | May 2005 | A1 |
| 20050114906 | Hoarty et al. | May 2005 | A1 |
| 20050132305 | Guichard et al. | Jun 2005 | A1 |
| 20050135385 | Jenkins et al. | Jun 2005 | A1 |
| 20050141613 | Kelly et al. | Jun 2005 | A1 |
| 20050149988 | Grannan | Jul 2005 | A1 |
| 20050160088 | Scallan et al. | Jul 2005 | A1 |
| 20050166257 | Feinleib et al. | Jul 2005 | A1 |
| 20050180502 | Puri | Aug 2005 | A1 |
| 20050198682 | Wright | Sep 2005 | A1 |
| 20050213586 | Cyganski et al. | Sep 2005 | A1 |
| 20050216933 | Black | Sep 2005 | A1 |
| 20050216940 | Black | Sep 2005 | A1 |
| 20050226426 | Oomen et al. | Oct 2005 | A1 |
| 20050273832 | Zigmond et al. | Dec 2005 | A1 |
| 20050283741 | Balabanovic et al. | Dec 2005 | A1 |
| 20060001737 | Dawson et al. | Jan 2006 | A1 |
| 20060020960 | Relan et al. | Jan 2006 | A1 |
| 20060020994 | Crane et al. | Jan 2006 | A1 |
| 20060031906 | Kaneda | Feb 2006 | A1 |
| 20060039481 | Shen et al. | Feb 2006 | A1 |
| 20060041910 | Hatanaka et al. | Feb 2006 | A1 |
| 20060088105 | Shen et al. | Apr 2006 | A1 |
| 20060095944 | Demircin et al. | May 2006 | A1 |
| 20060112338 | Joung et al. | May 2006 | A1 |
| 20060117340 | Pavlovskaia et al. | Jun 2006 | A1 |
| 20060143678 | Chou et al. | Jun 2006 | A1 |
| 20060161538 | Kiilerich | Jul 2006 | A1 |
| 20060173985 | Moore | Aug 2006 | A1 |
| 20060174026 | Robinson et al. | Aug 2006 | A1 |
| 20060174289 | Theberge | Aug 2006 | A1 |
| 20060195884 | van Zoest et al. | Aug 2006 | A1 |
| 20060212203 | Furuno | Sep 2006 | A1 |
| 20060218601 | Michel | Sep 2006 | A1 |
| 20060230428 | Craig et al. | Oct 2006 | A1 |
| 20060239563 | Chebil et al. | Oct 2006 | A1 |
| 20060242570 | Croft et al. | Oct 2006 | A1 |
| 20060256865 | Westerman | Nov 2006 | A1 |
| 20060269086 | Page et al. | Nov 2006 | A1 |
| 20060271985 | Hoffman et al. | Nov 2006 | A1 |
| 20060285586 | Westerman | Dec 2006 | A1 |
| 20060285819 | Kelly et al. | Dec 2006 | A1 |
| 20070009035 | Craig et al. | Jan 2007 | A1 |
| 20070009036 | Craig et al. | Jan 2007 | A1 |
| 20070009042 | Craig | Jan 2007 | A1 |
| 20070025639 | Zhou et al. | Feb 2007 | A1 |
| 20070033528 | Merrit et al. | Feb 2007 | A1 |
| 20070033631 | Gordon et al. | Feb 2007 | A1 |
| 20070074251 | Oguz et al. | Mar 2007 | A1 |
| 20070079325 | de Heer | Apr 2007 | A1 |
| 20070115941 | Patel et al. | May 2007 | A1 |
| 20070124282 | Wittkotter | May 2007 | A1 |
| 20070124795 | McKissick et al. | May 2007 | A1 |
| 20070130446 | Minakami | Jun 2007 | A1 |
| 20070130592 | Haeusel | Jun 2007 | A1 |
| 20070147804 | Zhang et al. | Jun 2007 | A1 |
| 20070152984 | Ording et al. | Jul 2007 | A1 |
| 20070172061 | Pinder | Jul 2007 | A1 |
| 20070174790 | Jing et al. | Jul 2007 | A1 |
| 20070178243 | Dong et al. | Aug 2007 | A1 |
| 20070237232 | Chang et al. | Oct 2007 | A1 |
| 20070300280 | Turner et al. | Dec 2007 | A1 |
| 20080052742 | Kopf et al. | Feb 2008 | A1 |
| 20080066135 | Brodersen et al. | Mar 2008 | A1 |
| 20080084503 | Kondo | Apr 2008 | A1 |
| 20080094368 | Ording et al. | Apr 2008 | A1 |
| 20080098450 | Wu et al. | Apr 2008 | A1 |
| 20080104520 | Swenson et al. | May 2008 | A1 |
| 20080127255 | Ress et al. | May 2008 | A1 |
| 20080154583 | Goto et al. | Jun 2008 | A1 |
| 20080163059 | Craner | Jul 2008 | A1 |
| 20080163286 | Rudolph et al. | Jul 2008 | A1 |
| 20080170619 | Landau | Jul 2008 | A1 |
| 20080170622 | Gordon et al. | Jul 2008 | A1 |
| 20080178243 | Dong et al. | Jul 2008 | A1 |
| 20080178249 | Gordon et al. | Jul 2008 | A1 |
| 20080187042 | Jasinschi | Aug 2008 | A1 |
| 20080189740 | Carpenter et al. | Aug 2008 | A1 |
| 20080195573 | Onoda et al. | Aug 2008 | A1 |
| 20080201736 | Gordon et al. | Aug 2008 | A1 |
| 20080212942 | Gordon et al. | Sep 2008 | A1 |
| 20080232452 | Sullivan et al. | Sep 2008 | A1 |
| 20080243918 | Holtman | Oct 2008 | A1 |
| 20080243998 | Oh et al. | Oct 2008 | A1 |
| 20080246759 | Summers | Oct 2008 | A1 |
| 20080253440 | Srinivasan et al. | Oct 2008 | A1 |
| 20080271080 | Gossweiler et al. | Oct 2008 | A1 |
| 20090003446 | Wu et al. | Jan 2009 | A1 |
| 20090003705 | Zou et al. | Jan 2009 | A1 |
| 20090007199 | La Joie | Jan 2009 | A1 |
| 20090025027 | Craner | Jan 2009 | A1 |
| 20090031341 | Schlack et al. | Jan 2009 | A1 |
| 20090041118 | Pavlovskaia et al. | Feb 2009 | A1 |
| 20090083781 | Yang et al. | Mar 2009 | A1 |
| 20090083813 | Dolce et al. | Mar 2009 | A1 |
| 20090083824 | McCarthy et al. | Mar 2009 | A1 |
| 20090089188 | Ku et al. | Apr 2009 | A1 |
| 20090094113 | Berry et al. | Apr 2009 | A1 |
| 20090094646 | Walter et al. | Apr 2009 | A1 |
| 20090100465 | Kulakowski | Apr 2009 | A1 |
| 20090100489 | Strothmann | Apr 2009 | A1 |
| 20090106269 | Zuckerman et al. | Apr 2009 | A1 |
| 20090106386 | Zuckerman et al. | Apr 2009 | A1 |
| 20090106392 | Zuckerman et al. | Apr 2009 | A1 |
| 20090106425 | Zuckerman et al. | Apr 2009 | A1 |
| 20090106441 | Zuckerman et al. | Apr 2009 | A1 |
| 20090106451 | Zuckerman et al. | Apr 2009 | A1 |
| 20090106511 | Zuckerman et al. | Apr 2009 | A1 |
| 20090113009 | Slemmer et al. | Apr 2009 | A1 |
| 20090138966 | Krause et al. | May 2009 | A1 |
| 20090144781 | Glaser et al. | Jun 2009 | A1 |
| 20090146779 | Kumar et al. | Jun 2009 | A1 |
| 20090157868 | Chaudhry | Jun 2009 | A1 |
| 20090158369 | Van Vleck et al. | Jun 2009 | A1 |
| 20090160694 | Di Flora | Jun 2009 | A1 |
| 20090172757 | Aldrey et al. | Jul 2009 | A1 |
| 20090178098 | Westbrook et al. | Jul 2009 | A1 |
| 20090183219 | Maynard et al. | Jul 2009 | A1 |
| 20090189890 | Corbett et al. | Jul 2009 | A1 |
| 20090193452 | Russ et al. | Jul 2009 | A1 |
| 20090196346 | Zhang et al. | Aug 2009 | A1 |
| 20090204920 | Beverley et al. | Aug 2009 | A1 |
| 20090210899 | Lawrence-Apfelbaum et al. | Aug 2009 | A1 |
| 20090225790 | Shay et al. | Sep 2009 | A1 |
| 20090228620 | Thomas et al. | Sep 2009 | A1 |
| 20090228922 | Haj-Khalil et al. | Sep 2009 | A1 |
| 20090233593 | Ergen et al. | Sep 2009 | A1 |
| 20090251478 | Maillot et al. | Oct 2009 | A1 |
| 20090254960 | Yarom et al. | Oct 2009 | A1 |
| 20090265617 | Randall et al. | Oct 2009 | A1 |
| 20090271512 | Jorgensen | Oct 2009 | A1 |
| 20090271818 | Schlack | Oct 2009 | A1 |
| 20090298535 | Klein et al. | Dec 2009 | A1 |
| 20090313674 | Ludvig et al. | Dec 2009 | A1 |
| 20090328109 | Pavlovskaia et al. | Dec 2009 | A1 |
| 20100033638 | O'Donnell et al. | Feb 2010 | A1 |
| 20100058404 | Rouse | Mar 2010 | A1 |
| 20100067571 | White et al. | Mar 2010 | A1 |
| 20100077441 | Thomas et al. | Mar 2010 | A1 |
| 20100104021 | Schmit | Apr 2010 | A1 |
| 20100115573 | Srinivasan et al. | May 2010 | A1 |
| 20100118972 | Zhang et al. | May 2010 | A1 |
| 20100131996 | Gauld | May 2010 | A1 |
| 20100146139 | Brockmann | Jun 2010 | A1 |
| 20100158109 | Dahlby et al. | Jun 2010 | A1 |
| 20100166071 | Wu et al. | Jul 2010 | A1 |
| 20100174776 | Westberg et al. | Jul 2010 | A1 |
| 20100175080 | Yuen et al. | Jul 2010 | A1 |
| 20100180307 | Hayes et al. | Jul 2010 | A1 |
| 20100211983 | Chou | Aug 2010 | A1 |
| 20100226428 | Thevathasan et al. | Sep 2010 | A1 |
| 20100235861 | Schein et al. | Sep 2010 | A1 |
| 20100242073 | Gordon et al. | Sep 2010 | A1 |
| 20100251167 | Deluca et al. | Sep 2010 | A1 |
| 20100254370 | Jana et al. | Oct 2010 | A1 |
| 20100325655 | Perez | Dec 2010 | A1 |
| 20110002376 | Ahmed et al. | Jan 2011 | A1 |
| 20110002470 | Purnhagen et al. | Jan 2011 | A1 |
| 20110023069 | Dowens | Jan 2011 | A1 |
| 20110035227 | Lee et al. | Feb 2011 | A1 |
| 20110067061 | Karaoguz et al. | Mar 2011 | A1 |
| 20110096828 | Chen et al. | Apr 2011 | A1 |
| 20110107375 | Stahl et al. | May 2011 | A1 |
| 20110110642 | Salomons et al. | May 2011 | A1 |
| 20110150421 | Sasaki et al. | Jun 2011 | A1 |
| 20110153776 | Opala et al. | Jun 2011 | A1 |
| 20110167468 | Lee et al. | Jul 2011 | A1 |
| 20110243024 | Osterling et al. | Oct 2011 | A1 |
| 20110258584 | Williams et al. | Oct 2011 | A1 |
| 20110289536 | Poder et al. | Nov 2011 | A1 |
| 20110317982 | Xu et al. | Dec 2011 | A1 |
| 20120023126 | Jin et al. | Jan 2012 | A1 |
| 20120030212 | Koopmans et al. | Feb 2012 | A1 |
| 20120137337 | Sigmon et al. | May 2012 | A1 |
| 20120204217 | Regis et al. | Aug 2012 | A1 |
| 20120209815 | Carson et al. | Aug 2012 | A1 |
| 20120224641 | Haberman et al. | Sep 2012 | A1 |
| 20120257671 | Brockmann et al. | Oct 2012 | A1 |
| 20130003826 | Craig et al. | Jan 2013 | A1 |
| 20130086610 | Brockmann | Apr 2013 | A1 |
| 20130179787 | Brockmann et al. | Jul 2013 | A1 |
| 20130198776 | Brockmann | Aug 2013 | A1 |
| 20130272394 | Brockmann et al. | Oct 2013 | A1 |
| 20140033036 | Gaur et al. | Jan 2014 | A1 |
| Number | Date | Country |
|---|---|---|
| 191599 | Apr 2000 | AT |
| 198969 | Feb 2001 | AT |
| 250313 | Oct 2003 | AT |
| 472152 | Jul 2010 | AT |
| 475266 | Aug 2010 | AT |
| 550086 | Feb 1986 | AU |
| 199060189 | Nov 1990 | AU |
| 620735 | Feb 1992 | AU |
| 199184838 | Apr 1992 | AU |
| 643828 | Nov 1993 | AU |
| 2004253127 | Jan 2005 | AU |
| 2005278122 | Mar 2006 | AU |
| 2010339376 | Aug 2012 | AU |
| 2011249132 | Nov 2012 | AU |
| 2011258972 | Nov 2012 | AU |
| 2011315950 | May 2013 | AU |
| 682776 | Mar 1964 | CA |
| 2052477 | Mar 1992 | CA |
| 1302554 | Jun 1992 | CA |
| 2163500 | May 1996 | CA |
| 2231391 | May 1997 | CA |
| 2273365 | Jun 1998 | CA |
| 2313133 | Jun 1999 | CA |
| 2313161 | Jun 1999 | CA |
| 2528499 | Jan 2005 | CA |
| 2569407 | Mar 2006 | CA |
| 2728797 | Apr 2010 | CA |
| 2787913 | Jul 2011 | CA |
| 2798541 | Dec 2011 | CA |
| 2814070 | Apr 2012 | CA |
| 1507751 | Jun 2004 | CN |
| 1969555 | May 2007 | CN |
| 101180109 | May 2008 | CN |
| 101627424 | Jan 2010 | CN |
| 101637023 | Jan 2010 | CN |
| 102007773 | Apr 2011 | CN |
| 4408355 | Oct 1994 | DE |
| 69516139 D1 | Dec 2000 | DE |
| 69132518 D1 | Sep 2001 | DE |
| 69333207 D1 | Jul 2004 | DE |
| 98961961 | Aug 2007 | DE |
| 602008001596 D1 | Aug 2010 | DE |
| 602006015650 D1 | Sep 2010 | DE |
| 0093549 | Nov 1983 | EP |
| 0128771 | Dec 1984 | EP |
| 0419137 | Mar 1991 | EP |
| 0449633 | Oct 1991 | EP |
| 0477786 | Apr 1992 | EP |
| 0523618 | Jan 1993 | EP |
| 0534139 | Mar 1993 | EP |
| 0568453 | Nov 1993 | EP |
| 0588653 | Mar 1994 | EP |
| 0594350 | Apr 1994 | EP |
| 0612916 | Aug 1994 | EP |
| 0624039 | Nov 1994 | EP |
| 0638219 | Feb 1995 | EP |
| 0643523 | Mar 1995 | EP |
| 0661888 | Jul 1995 | EP |
| 0714684 | Jun 1996 | EP |
| 0746158 | Dec 1996 | EP |
| 0761066 | Mar 1997 | EP |
| 0789972 | Aug 1997 | EP |
| 0830786 | Mar 1998 | EP |
| 0861560 | Sep 1998 | EP |
| 0933966 | Aug 1999 | EP |
| 1026872 | Aug 2000 | EP |
| 1038397 | Sep 2000 | EP |
| 1038399 | Sep 2000 | EP |
| 1038400 | Sep 2000 | EP |
| 1038401 | Sep 2000 | EP |
| 1051039 | Nov 2000 | EP |
| 1055331 | Nov 2000 | EP |
| 1120968 | Aug 2001 | EP |
| 1345446 | Sep 2003 | EP |
| 1422929 | May 2004 | EP |
| 1428562 | Jun 2004 | EP |
| 1521476 | Apr 2005 | EP |
| 1645115 | Apr 2006 | EP |
| 1725044 | Nov 2006 | EP |
| 1767708 | Mar 2007 | EP |
| 1771003 | Apr 2007 | EP |
| 1772014 | Apr 2007 | EP |
| 1877150 | Jan 2008 | EP |
| 1887148 | Feb 2008 | EP |
| 1900200 | Mar 2008 | EP |
| 1902583 | Mar 2008 | EP |
| 1908293 | Apr 2008 | EP |
| 1911288 | Apr 2008 | EP |
| 1918802 | May 2008 | EP |
| 2100296 | Sep 2009 | EP |
| 2105019 | Sep 2009 | EP |
| 2106665 | Oct 2009 | EP |
| 2116051 | Nov 2009 | EP |
| 2124440 | Nov 2009 | EP |
| 2248341 | Nov 2010 | EP |
| 2269377 | Jan 2011 | EP |
| 2271098 | Jan 2011 | EP |
| 2304953 | Apr 2011 | EP |
| 2364019 | Sep 2011 | EP |
| 2384001 | Nov 2011 | EP |
| 2409493 | Jan 2012 | EP |
| 2477414 | Jul 2012 | EP |
| 2487919 | Aug 2012 | EP |
| 2520090 | Nov 2012 | EP |
| 2567545 | Mar 2013 | EP |
| 2577437 | Apr 2013 | EP |
| 2628306 | Aug 2013 | EP |
| 2632164 | Aug 2013 | EP |
| 2632165 | Aug 2013 | EP |
| 2695388 | Feb 2014 | EP |
| 2207635 | Jun 2004 | ES |
| 8211463 | Jun 1982 | FR |
| 2529739 | Jan 1984 | FR |
| 2891098 | Mar 2007 | FR |
| 2207838 | Feb 1989 | GB |
| 2248955 | Apr 1992 | GB |
| 2290204 | Dec 1995 | GB |
| 2365649 | Feb 2002 | GB |
| 2378345 | Feb 2003 | GB |
| 1134855 | Oct 2010 | HK |
| 1116323 | Dec 2010 | HK |
| 19913397 | Apr 1992 | IE |
| 99586 | Feb 1998 | IL |
| 215133D0 | Dec 2011 | IL |
| 222829D0 | Dec 2012 | IL |
| 222830D0 | Dec 2012 | IL |
| 225525D0 | Jun 2013 | IL |
| 180215 | Jan 1998 | IN |
| 200701744 | Nov 2007 | IN |
| 200900856 | May 2009 | IN |
| 200800214 | Jun 2009 | IN |
| 3759 | Mar 1992 | IS |
| 60-054324 | Mar 1985 | JP |
| 63-033988 | Feb 1988 | JP |
| 63-263985 | Oct 1988 | JP |
| 2001-241993 | Sep 1989 | JP |
| 04-373286 | Dec 1992 | JP |
| 06-054324 | Feb 1994 | JP |
| 7015720 | Jan 1995 | JP |
| 7-160292 | Jun 1995 | JP |
| 8-265704 | Oct 1996 | JP |
| 10-228437 | Aug 1998 | JP |
| 10-510131 | Sep 1998 | JP |
| 11-134273 | May 1999 | JP |
| H11-261966 | Sep 1999 | JP |
| 2000-152234 | May 2000 | JP |
| 2001-203995 | Jul 2001 | JP |
| 2001-245271 | Sep 2001 | JP |
| 2001-514471 | Sep 2001 | JP |
| 2002-016920 | Jan 2002 | JP |
| 2002-057952 | Feb 2002 | JP |
| 2002-112220 | Apr 2002 | JP |
| 2002-141810 | May 2002 | JP |
| 2002-208027 | Jul 2002 | JP |
| 2002-319991 | Oct 2002 | JP |
| 2003-506763 | Feb 2003 | JP |
| 2003-087785 | Mar 2003 | JP |
| 2003-529234 | Sep 2003 | JP |
| 2004-501445 | Jan 2004 | JP |
| 2004-056777 | Feb 2004 | JP |
| 2004-110850 | Apr 2004 | JP |
| 2004-112441 | Apr 2004 | JP |
| 2004-135932 | May 2004 | JP |
| 2004-264812 | Sep 2004 | JP |
| 2004-533736 | Nov 2004 | JP |
| 2004-536381 | Dec 2004 | JP |
| 2004-536681 | Dec 2004 | JP |
| 2005-033741 | Feb 2005 | JP |
| 2005-084987 | Mar 2005 | JP |
| 2005-095599 | Mar 2005 | JP |
| 8-095599 | Apr 2005 | JP |
| 2005-156996 | Jun 2005 | JP |
| 2005-519382 | Jun 2005 | JP |
| 2005-523479 | Aug 2005 | JP |
| 2005-309752 | Nov 2005 | JP |
| 2006-067280 | Mar 2006 | JP |
| 2006-512838 | Apr 2006 | JP |
| 11-88419 | Sep 2007 | JP |
| 2008-523880 | Jul 2008 | JP |
| 2008-535622 | Sep 2008 | JP |
| 04252727 | Apr 2009 | JP |
| 2009-543386 | Dec 2009 | JP |
| 2011-108155 | Jun 2011 | JP |
| 2012-080593 | Apr 2012 | JP |
| 04996603 | Aug 2012 | JP |
| 05121711 | Jan 2013 | JP |
| 53-004612 | Oct 2013 | JP |
| 05331008 | Oct 2013 | JP |
| 05405819 | Feb 2014 | JP |
| 2006067924 | Jun 2006 | KR |
| 2007038111 | Apr 2007 | KR |
| 20080001298 | Jan 2008 | KR |
| 2008024189 | Mar 2008 | KR |
| 2010111739 | Oct 2010 | KR |
| 2010120187 | Nov 2010 | KR |
| 2010127240 | Dec 2010 | KR |
| 2011030640 | Mar 2011 | KR |
| 2011129477 | Dec 2011 | KR |
| 20120112683 | Oct 2012 | KR |
| 2013061149 | Jun 2013 | KR |
| 2013113925 | Oct 2013 | KR |
| 1333200 | Nov 2013 | KR |
| 2008045154 | Nov 2013 | KR |
| 2013138263 | Dec 2013 | KR |
| 1032594 | Apr 2008 | NL |
| 1033929 | Apr 2008 | NL |
| 2004670 | Nov 2011 | NL |
| 2004780 | Jan 2012 | NL |
| 239969 | Dec 1994 | NZ |
| 99110 | Dec 1993 | PT |
| WO 8202303 | Jul 1982 | WO |
| WO 8908967 | Sep 1989 | WO |
| WO 9013972 | Nov 1990 | WO |
| WO 9322877 | Nov 1993 | WO |
| WO 9416534 | Jul 1994 | WO |
| WO 9419910 | Sep 1994 | WO |
| WO 9421079 | Sep 1994 | WO |
| WO 9515658 | Jun 1995 | WO |
| WO 9532587 | Nov 1995 | WO |
| WO 9533342 | Dec 1995 | WO |
| WO 9614712 | May 1996 | WO |
| WO 9627843 | Sep 1996 | WO |
| WO 9631826 | Oct 1996 | WO |
| WO 9637074 | Nov 1996 | WO |
| WO 9642168 | Dec 1996 | WO |
| WO 9716925 | May 1997 | WO |
| WO 9733434 | Sep 1997 | WO |
| WO 9739583 | Oct 1997 | WO |
| WO 9826595 | Jun 1998 | WO |
| WO 9900735 | Jan 1999 | WO |
| WO 9904568 | Jan 1999 | WO |
| WO 9930496 | Jun 1999 | WO |
| WO 9930497 | Jun 1999 | WO |
| WO 9930500 | Jun 1999 | WO |
| WO 9930501 | Jun 1999 | WO |
| WO 9935840 | Jul 1999 | WO |
| WO 9941911 | Aug 1999 | WO |
| WO 9956468 | Nov 1999 | WO |
| WO 9965232 | Dec 1999 | WO |
| WO 9965243 | Dec 1999 | WO |
| WO 9966732 | Dec 1999 | WO |
| WO 0002303 | Jan 2000 | WO |
| WO 0007372 | Feb 2000 | WO |
| WO 0008967 | Feb 2000 | WO |
| WO 0019910 | Apr 2000 | WO |
| WO 0038430 | Jun 2000 | WO |
| WO 0041397 | Jul 2000 | WO |
| WO 0139494 | May 2001 | WO |
| WO 0141447 | Jun 2001 | WO |
| WO 0182614 | Nov 2001 | WO |
| WO 0192973 | Dec 2001 | WO |
| WO 02089487 | Jul 2002 | WO |
| WO 02076097 | Sep 2002 | WO |
| WO 02076099 | Sep 2002 | WO |
| WO 03026232 | Mar 2003 | WO |
| WO 2003026275 | Mar 2003 | WO |
| WO 03047710 | Jun 2003 | WO |
| WO 03065683 | Aug 2003 | WO |
| WO 03071727 | Aug 2003 | WO |
| WO 03091832 | Nov 2003 | WO |
| WO 2004012437 | Feb 2004 | WO |
| WO 2004018060 | Mar 2004 | WO |
| WO 2004073310 | Aug 2004 | WO |
| WO 2005002215 | Jan 2005 | WO |
| WO 2005041122 | May 2005 | WO |
| WO 2005053301 | Jun 2005 | WO |
| WO 2005120067 | Dec 2005 | WO |
| WO 2006014362 | Feb 2006 | WO |
| WO 2006022881 | Mar 2006 | WO |
| WO 2006053305 | May 2006 | WO |
| WO 2006067697 | Jun 2006 | WO |
| WO 2006081634 | Aug 2006 | WO |
| WO 2006105480 | Oct 2006 | WO |
| WO 2006110268 | Oct 2006 | WO |
| WO 2007001797 | Jan 2007 | WO |
| WO 2007008319 | Jan 2007 | WO |
| WO 2007008355 | Jan 2007 | WO |
| WO 2007008356 | Jan 2007 | WO |
| WO 2007008357 | Jan 2007 | WO |
| WO 2007008358 | Jan 2007 | WO |
| WO 2007018722 | Feb 2007 | WO |
| WO 2007018726 | Feb 2007 | WO |
| WO 2008044916 | Apr 2008 | WO |
| WO 2008086170 | Jul 2008 | WO |
| WO 2008088741 | Jul 2008 | WO |
| WO 2008088752 | Jul 2008 | WO |
| WO 2008088772 | Jul 2008 | WO |
| WO 2008100205 | Aug 2008 | WO |
| WO 2009038596 | Mar 2009 | WO |
| WO 2009099893 | Aug 2009 | WO |
| WO 2009099895 | Aug 2009 | WO |
| WO 2009105465 | Aug 2009 | WO |
| WO 2009110897 | Sep 2009 | WO |
| WO 2009114247 | Sep 2009 | WO |
| WO 2009155214 | Dec 2009 | WO |
| WO 2010044926 | Apr 2010 | WO |
| WO 2010054136 | May 2010 | WO |
| WO 2010107954 | Sep 2010 | WO |
| WO 2011014336 | Sep 2010 | WO |
| WO 2011082364 | Jul 2011 | WO |
| WO 2011139155 | Nov 2011 | WO |
| WO 2011149357 | Dec 2011 | WO |
| WO 2012051528 | Apr 2012 | WO |
| WO 2012138660 | Oct 2012 | WO |
| WO 2013106390 | Jul 2013 | WO |
| WO 2013155310 | Jul 2013 | WO |
| Entry |
|---|
| Authorized Officer Jürgen Güttlich, International Search Report and Written Opinion, dated Jan. 12, 2007, PCT/US2008/000400. |
| Authorized Officer Jürgen Güttlich, International Search Report and Written Opinion, dated Jan. 12, 2007, PCT/US2008/000450. |
| Hoarty, W. L., “The Smart Headend—A Novel Approach to Interactive Television”, Montreux Int'l TV Symposium, Jun. 9, 1995. |
| Robert Koenen, “MPEG-4 Overview—Overview of the MPEG-4 Standard” Internet Citation, Mar. 2001. |
| Avaro, O., et al., “MPEG-4 Systems: Overview” Signal Processing, Image Communication, Elsevier Science Publishers, vol. 15, Jan. 1, 2000, pp. 281-298. |
| Stoll, G. et al., “GMF4iTV: Neue Wege zur Interaktivität mit bewegten Objekten beim digitalen Fernsehen” [GMF4iTV: New Approaches to Interactivity with Moving Objects in Digital Television], FKT Fernseh und Kinotechnik, Fachverlag Schiele & Schön GmbH, vol. 60, No. 4, Jan. 1, 2006, pp. 171-178. |
| AC-3 digital audio compression standard, Extract, Dec. 20, 1995, 11 pgs. |
| ActiveVideo Networks Bv, International Search Report and Written Opinion, PCT/NL2011/050308, Sep. 6, 2011, 8 pgs. |
| ActiveVideo Networks Inc., International Preliminary Report on Patentability, PCT/US2011/056355, Apr. 16, 2013, 4 pgs. |
| ActiveVideo Networks Inc., International Preliminary Report on Patentability, PCT/US2012/032010, Oct. 17, 2013, 4 pgs. |
| ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2011/056355, Apr. 13, 2012, 6 pgs. |
| ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2012/032010, Oct. 10, 2012, 6 pgs. |
| ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2013/020769, May 9, 2013, 9 pgs. |
| ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2013/036182, Jul. 29, 2013, 12 pgs. |
| ActiveVideo Networks, Inc., International Search Report and Written Opinion, PCT/US2009/032457, Jul. 22, 2009, 7 pgs. |
| ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 09820936-4, 11 pgs. |
| ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 10754084-1, 11 pgs. |
| ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 10841764.3, 16 pgs. |
| ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 11833486.1, 6 pgs. |
| Annex C—Video buffering verifier, information technology—generic coding of moving pictures and associated audio information: video, Feb. 2000, 6 pgs. |
| Antonoff, Michael, “Interactive Television,” Popular Science, Nov. 1992, 12 pages. |
| Askenas, M., U.S. Appl. No. 10/253,109 (unpublished), filed Sep. 24, 2002. Not Found. |
| Avinity Systems B.V., Extended European Search Report, Application No. 12163713.6, 10 pgs. |
| Benjelloun, A summation algorithm for MPEG-1 coded audio signals: a first step towards audio processed domain, 2000, 9 pgs. |
| Broadhead, Direct manipulation of MPEG compressed digital audio, Nov. 5-9, 1995, 41 pgs. |
| Cable Television Laboratories, Inc., “CableLabs Asset Distribution Interface Specification, Version 1.1”, May 5, 2006, 33 pgs. |
| CD 11172-3, Coding of moving pictures and associated audio for digital storage media at up to about 1.5 Mbit/s, Jan. 1, 1992, 39 pgs. |
| Craig, Notice of Allowance, U.S. Appl. No. 11/178,176, filed Dec. 23, 2010, 8 pgs. |
| Craig, Notice of Allowance, U.S. Appl. No. 11/178,183, filed Jan. 12, 2012, 7 pgs. |
| Craig, Notice of Allowance, U.S. Appl. No. 11/178,183, filed Jul. 19, 2012, 8 pgs. |
| Craig, Notice of Allowance, U.S. Appl. No. 11/178,189, filed Oct. 12, 2011, 7 pgs. |
| Craig, Notice of Allowance, U.S. Appl. No. 11/178,176, filed Mar. 23, 2011, 8 pgs. |
| Craig, Notice of Allowance, U.S. Appl. No. 13/609,183, filed Aug. 26, 2013, 8 pgs. |
| Craig, Final Office Action, U.S. Appl. No. 11/103,838, filed Feb. 5, 2009, 30 pgs. |
| Craig, Final Office Action, U.S. Appl. No. 11/103,838, filed Jul. 6, 2010, 35 pgs. |
| Craig, Final Office Action, U.S. Appl. No. 11/178,176, filed Oct. 1, 2010, 8 pgs. |
| Craig, Final Office Action, U.S. Appl. No. 11/178,183, filed Apr. 13, 2011, 16 pgs. |
| Craig, Final Office Action, U.S. Appl. No. 11/178,177, filed Oct. 26, 2010, 12 pgs. |
| Craig, Office Action, U.S. Appl. No. 11/103,838, filed May 12, 2009, 32 pgs. |
| Craig, Office Action, U.S. Appl. No. 11/103,838, filed Aug. 19, 2008, 17 pgs. |
| Craig, Office Action, U.S. Appl. No. 11/103,838, filed Nov. 19, 2009, 34 pgs. |
| Craig, Office Action, U.S. Appl. No. 11/178,176, filed May 6, 2010, 7 pgs. |
| Craig, Office Action, U.S. Appl. No. 11/178,177, filed Mar. 29, 2011, 15 pgs. |
| Craig, Office Action, U.S. Appl. No. 11/178,177, filed Aug. 3, 2011, 26 pgs. |
| Craig, Office Action, U.S. Appl. No. 11/178,177, filed Mar. 29, 2010, 11 pgs. |
| Craig, Office Action, U.S. Appl. No. 11/178,181, filed Feb. 11, 2011, 19 pgs. |
| Craig, Office Action, U.S. Appl. No. 11/178,181, filed Jun. 20, 2011, 21 pgs. |
| Craig, Office Action, U.S. Appl. No. 11/178,181, filed Aug. 25, 2010, 17 pgs. |
| Craig, Office Action, U.S. Appl. No. 11/178,181, filed Mar. 29, 2010, 10 pgs. |
| Craig, Office Action, U.S. Appl. No. 11/178,182, filed Feb. 23, 2010, 15 pgs. |
| Craig, Office Action, U.S. Appl. No. 11/178,183, filed Dec. 6, 2010, 12 pgs. |
| Craig, Office Action, U.S. Appl. No. 11/178,183, filed Sep. 15, 2011, 12 pgs. |
| Craig, Office Action, U.S. Appl. No. 11/178,183, filed Feb. 19, 2010, 17 pgs. |
| Craig, Office Action, U.S. Appl. No. 11/178,183, filed Jul. 20, 2010, 13 pgs. |
| Craig, Office Action, U.S. Appl. No. 11/178,189, filed Nov. 9, 2010, 13 pgs. |
| Craig, Office Action, U.S. Appl. No. 11/178,189, filed Mar. 15, 2010, 11 pgs. |
| Craig, Office Action, U.S. Appl. No. 11/178,189, filed Jul. 23, 2009, 10 pgs. |
| Craig, Office Action, U.S. Appl. No. 11/178,189, filed May 26, 2011, 14 pgs. |
| Craig, Office Action, U.S. Appl. No. 13/609,183, filed May 9, 2013, 7 pgs. |
| Pavlovskaia, Office Action, JP 2011-516499, Feb. 14, 2014, 19 pgs. |
| Digital Audio Compression Standard(AC-3, E-AC-3), Advanced Television Systems Committee, Jun. 14, 2005, 236 pgs. |
| European Patent Office, Extended European Search Report for International Application No. PCT/US2010/027724, dated Jul. 24, 2012, 11 pages. |
| FFMPEG, http://www.ffmpeg.org, downloaded Apr. 8, 2010, 8 pgs. |
| FFMPEG-0.4.9 Audio Layer 2 Tables Including Fixed Psycho Acoustic Model, 2001, 2 pgs. |
| Herr, Notice of Allowance, U.S. Appl. No. 11/620,593, filed May 23, 2012, 5 pgs. |
| Herr, Notice of Allowance, U.S. Appl. No. 12/534,016, filed Feb. 7, 2012, 5 pgs. |
| Herr, Notice of Allowance, U.S. Appl. No. 12/534,016, filed Sep. 28, 2011, 15 pgs. |
| Herr, Final Office Action, U.S. Appl. No. 11/620,593, filed Sep. 15, 2011, 104 pgs. |
| Herr, Office Action, U.S. Appl. No. 11/620,593, filed Mar. 19, 2010, 58 pgs. |
| Herr, Office Action, U.S. Appl. No. 11/620,593, filed Apr. 21, 2009, 27 pgs. |
| Herr, Office Action, U.S. Appl. No. 11/620,593, filed Dec. 23, 2009, 58 pgs. |
| Herr, Office Action, U.S. Appl. No. 11/620,593, filed Jan. 24, 2011, 96 pgs. |
| Herr, Office Action, U.S. Appl. No. 11/620,593, filed Aug. 27, 2010, 41 pgs. |
| Herre, Thoughts on an SAOC Architecture, Oct. 2006, 9 pgs. |
| Hoarty, The Smart Headend—A Novel Approach to Interactive Television, Montreux Int'l TV Symposium, Jun. 9, 1995, 21 pgs. |
| ICTV, Inc., International Preliminary Report on Patentability, PCT/US2006/022585, Jan. 29, 2008, 9 pgs. |
| ICTV, Inc., International Search Report / Written Opinion, PCT/US2006/022585, Oct. 12, 2007, 15 pgs. |
| ICTV, Inc., International Search Report / Written Opinion, PCT/US2008/000419, May 15, 2009, 20 pgs. |
| ICTV, Inc., International Search Report / Written Opinion, PCT/US2006/022533, Nov. 20, 2006, 8 pgs. |
| Isovic, Timing constraints of MPEG-2 decoding for high quality video: misconceptions and realistic assumptions, Jul. 2-4, 2003, 10 pgs. |
| Korean Intellectual Property Office, International Search Report; PCT/US2009/032457, Jul. 22, 2009, 7 pgs. |
| MPEG-2 Video elementary stream supplemental information, Dec. 1999, 12 pgs. |
| Ozer, Video Compositing 101, available from http://www.emedialive.com, Jun. 2, 2004, 5 pgs. |
| Porter, Compositing Digital Images, Computer Graphics, vol. 18, No. 3, Jul. 1984, pp. 253-259. |
| RSS Advisory Board, “RSS 2.0 Specification”, published Oct. 15, 2007. Not Found. |
| SAOC use cases, draft requirements and architecture, Oct. 2006, 16 pgs. |
| Sigmon, Final Office Action, U.S. Appl. No. 11/258,602, Feb. 23, 2009, 15 pgs. |
| Sigmon, Office Action, U.S. Appl. No. 11/258,602, Sep. 2, 2008, 12 pgs. |
| TAG Networks, Inc., Communication pursuant to Article 94(3) EPC, European Patent Application, 06773714.8, May 6, 2009, 3 pgs. |
| TAG Networks Inc., Decision to Grant a Patent, JP 2009-544985, Jun. 28, 2013, 1 pg. |
| TAG Networks Inc., IPRP, PCT/US2006/010080, Oct. 16, 2007, 6 pgs. |
| TAG Networks Inc., IPRP, PCT/US2006/024194, Jan. 10, 2008, 7 pgs. |
| TAG Networks Inc., IPRP, PCT/US2006/024195, Apr. 1, 2009, 11 pgs. |
| TAG Networks Inc., IPRP, PCT/US2006/024196, Jan. 10, 2008, 6 pgs. |
| TAG Networks Inc., International Search Report, PCT/US2008/050221, Jun. 12, 2008, 9 pgs. |
| TAG Networks Inc., Office Action, CN 200680017662.3, Apr. 26, 2010, 4 pgs. |
| TAG Networks Inc., Office Action, EP 06739032.8, Aug. 14, 2009, 4 pgs. |
| TAG Networks Inc., Office Action, EP 06773714.8, May 6, 2009, 3 pgs. |
| TAG Networks Inc., Office Action, EP 06773714.8, Jan. 12, 2010, 4 pgs. |
| TAG Networks Inc., Office Action, JP 2008-506474, Oct. 1, 2012, 5 pgs. |
| TAG Networks Inc., Office Action, JP 2008-506474, Aug. 8, 2011, 5 pgs. |
| TAG Networks Inc., Office Action, JP 2008-520254, Oct. 20, 2011, 2 pgs. |
| TAG Networks, IPRP, PCT/US2008/050221, Jul. 7, 2009, 6 pgs. |
| TAG Networks, International Search Report, PCT/US2010/041133, Oct. 19, 2010, 13 pgs. |
| TAG Networks, Office Action, CN 200880001325.4, Jun. 22, 2011, 4 pgs. |
| TAG Networks, Office Action, JP 2009-544985, Feb. 25, 2013, 3 pgs. |
| Talley, A general framework for continuous media transmission control, Oct. 13-16, 1997, 10 pgs. |
| The Toolame Project, psycho_n1.c, 1999, 1 pg. |
| Todd, AC-3: flexible perceptual coding for audio transmission and storage, Feb. 26-Mar. 1, 1994, 16 pgs. |
| Tudor, MPEG-2 Video Compression, Dec. 1995, 15 pgs. |
| TVHEAD, Inc., First Examination Report, in 1744/MUMNP/2007, Dec. 30, 2013, 6 pgs. |
| TVHEAD, Inc., International Search Report, PCT/US2006/010080, Jun. 20, 2006, 3 pgs. |
| TVHEAD, Inc., International Search Report, PCT/US2006/024194, Dec. 15, 2006, 4 pgs. |
| TVHEAD, Inc., International Search Report, PCT/US2006/024195, Nov. 29, 2006, 9 pgs. |
| TVHEAD, Inc., International Search Report, PCT/US2006/024196, Dec. 11, 2006, 4 pgs. |
| TVHEAD, Inc., International Search Report, PCT/US2006/024197, Nov. 28, 2006, 9 pgs. |
| Vernon, Dolby digital: audio coding for digital television and storage applications, Aug. 1999, 18 pgs. |
| Wang, A beat-pattern based error concealment scheme for music delivery with burst packet loss, Aug. 22-25, 2001, 4 pgs. |
| Wang, A compressed domain beat detector using MP3 audio bitstream, Sep. 30-Oct. 5, 2001, 9 pgs. |
| Wang, A multichannel audio coding algorithm for inter-channel redundancy removal, May 12-15, 2001, 6 pgs. |
| Wang, An excitation level based psychoacoustic model for audio compression, Oct. 30-Nov. 4, 1999, 4 pgs. |
| Wang, Energy compaction property of the MDCT in comparison with other transforms, Sep. 22-25, 2000, 23 pgs. |
| Wang, Exploiting excess masking for audio compression, Sep. 2-5, 1999, 4 pgs. |
| Wang, Schemes for re-compressing MP3 audio bitstreams, Nov. 30-Dec. 3, 2001, 5 pgs. |
| Wang, Selected advances in audio compression and compressed domain processing, Aug. 2001, 68 pgs. |
| Wang, The impact of the relationship between MDCT and DFT on audio compression, Dec. 13-15, 2000, 9 pgs. |
| ActiveVideo Networks Inc., Communication Pursuant to Rules 70(2) and 70a(2), EP11833486.1, Apr. 24, 2014, 1 pg. |
| ActiveVideo Networks, Inc., International Search Report and Written Opinion, PCT/US2014/041430, Oct. 9, 2014, 9 pgs. |
| ActiveVideo Networks Inc., Examination Report No. 1, AU2011258972, Jul. 21, 2014, 3 pgs. |
| ActiveVideo Networks Inc., Notice of Reasons for Rejection, JP2012-547318, Sep. 26, 2014, 7 pgs. |
| Avinity Systems B.V., Final Office Action, JP-2009-530298, Oct. 7, 2014, 8 pgs. |
| Brockmann, Final Office Action, U.S. Appl. No. 13/686,548, filed Sep. 24, 2014, 13 pgs. |
| Brockmann, Final Office Action, U.S. Appl. No. 13/438,617, filed Oct. 3, 2014, 19 pgs. |
| Brockmann, Office Action, U.S. Appl. No. 12/443,571, filed Nov. 5, 2014, 26 pgs. |
| ActiveVideo, http://www.activevideo.com/, as printed out in year 2012, 1 pg. |
| ActiveVideo Networks Inc., International Preliminary Report on Patentability, PCT/US2013/020769, Jul. 24, 2014, 6 pgs. |
| ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2014/030773, Jul. 25, 2014, 8 pgs. |
| ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2014/041416, Aug. 27, 2014, 8 pgs. |
| ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 13168509.1, 10 pgs. |
| ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 13168376-5, 8 pgs. |
| ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 12767642-7, 12 pgs. |
| ActiveVideo Networks Inc., Communication Pursuant to Rules 70(2) and 70a(2), EP10841764.3, Jun. 22, 2011, 1 pg. |
| ActiveVideo Networks Inc., Communication Pursuant to Article 94(3) EPC, EP08713106.6, Jun. 26, 2014, 5 pgs. |
| ActiveVideo Networks Inc., Communication Pursuant to Article 94(3) EPC, EP09713486.0, Apr. 14, 2014, 6 pgs. |
| ActiveVideo Networks Inc., Examination Report No. 1, AU2011258972, Apr. 4, 2013, 5 pgs. |
| ActiveVideo Networks Inc., Examination Report No. 1, AU2010339376, Apr. 30, 2014, 4 pgs. |
| ActiveVideo Networks Inc., Examination Report, App. No. EP11749946.7, Oct. 8, 2013, 6 pgs. |
| ActiveVideo Networks Inc., Summons to attend oral proceedings, Application No. EP09820936-4, Aug. 19, 2014, 4 pgs. |
| ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2010/027724, Oct. 28, 2010, 7 pgs. |
| Adams, Jerry, “Glasfasernetz für Breitbanddienste in London” (“Fiber-Optic Network for Broadband Services in London”), NTZ Nachrichtentechnische Zeitschrift, vol. 40, No. 7, Jul. 1987, Berlin, DE, pp. 534-536, 5 pgs. No English translation found. |
| Avinity Systems B.V., Communication pursuant to Article 94(3) EPC, EP 07834561.8, Jan. 31, 2014, 10 pgs. |
| Avinity Systems B.V., Extended European Search Report, Application No. 12163712-8, 10 pgs. |
| Avinity Systems B.V., Communication pursuant to Article 94(3) EPC, EP 07834561.8, Apr. 8, 2010, 5 pgs. |
| Avinity Systems B.V., International Preliminary Report on Patentability, PCT/NL2007/000245, Feb. 19, 2009, 7 pgs. |
| Avinity Systems B.V., International Search Report and Written Opinion, PCT/NL2007/000245, Feb. 19, 2009, 18 pgs. |
| Avinity Systems B.V., Notice of Grounds of Rejection for Patent, JP 2009-530298, Sep. 3, 2013, 4 pgs. |
| Avinity Systems B.V., Notice of Grounds of Rejection for Patent, JP 2009-530298, Sep. 25, 2012, 6 pgs. |
| Bird et al., “Customer Access to Broadband Services,” ISSLS 86: The International Symposium on Subscriber Loops and Services, Sep. 29, 1986, Tokyo, JP, 6 pgs. |
| Brockmann, Final Office Action, U.S. Appl. No. 13/668,004, filed Jul. 16, 2014, 20 pgs. |
| Brockmann, Office Action, U.S. Appl. No. 13/686,548, filed Mar. 10, 2014, 11 pgs. |
| Brockmann, Office Action, U.S. Appl. No. 13/668,004, filed Dec. 23, 2013, 9 pgs. |
| Brockmann, Office Action, U.S. Appl. No. 13/438,617, filed May 12, 2014, 17 pgs. |
| Brockmann, Final Office Action, U.S. Appl. No. 12/443,571, filed Mar. 7, 2014, 21 pgs. |
| Brockmann, Office Action, U.S. Appl. No. 12/443,571, filed Jun. 5, 2013, 18 pgs. |
| Chang, Shih-Fu, et al., “Manipulation and Compositing of MC-DCT Compressed Video,” IEEE Journal on Selected Areas in Communications, Jan. 1995, vol. 13, No. 1, 11 pgs. Best Copy Available. |
| Dahlby, Office Action, U.S. Appl. No. 12/651,203, filed Jun. 5, 2014, 18 pgs. |
| Dahlby, Final Office Action, U.S. Appl. No. 12/651,203, filed Feb. 4, 2013, 18 pgs. |
| Dahlby, Office Action, U.S. Appl. No. 12/651,203, filed Aug. 16, 2012, 18 pgs. |
| Dukes, Stephen D., “Photonics for cable television system design, Migrating to regional hubs and passive networks,” Communications Engineering and Design, May 1992, 4 pgs. |
| Ellis, et al., “INDAX: An Operational Interactive Cabletext System”, IEEE Journal on Selected Areas in Communications, vol. sac-1, No. 2, Feb. 1983, pp. 285-294. |
| European Patent Office, Supplementary European Search Report, Application No. EP 09 70 8211, dated Jan. 5, 2011, 6 pgs. |
| Frezza, W., “The Broadband Solution—Metropolitan CATV Networks, ” Proceedings of Videotex '84, Apr. 1984, 15 pgs. |
| Gecsei, J., “Topology of Videotex Networks,” The Architecture of Videotex Systems, Chapter 6, 1983 by Prentice-Hall, Inc. |
| Gobi, et al., “ARIDEM—a multi-service broadband access demonstrator,” Ericsson Review No. 3, 1996, 7 pgs. |
| Gordon, Notice of Allowance, U.S. Appl. No. 12/008,697, filed Mar. 20, 2014, 10 pgs. |
| Gordon, Final Office Action, U.S. Appl. No. 12/008,722, filed Mar. 30, 2012, 16 pgs. |
| Gordon, Final Office Action, U.S. Appl. No. 12/035,236, filed Jun. 11, 2014, 14 pgs. |
| Gordon, Final Office Action, U.S. Appl. No. 12/035,236, filed Jul. 22, 2013, 7 pgs. |
| Gordon, Final Office Action, U.S. Appl. No. 12/035,236, filed Sep. 20, 2011, 8 pgs. |
| Gordon, Final Office Action, U.S. Appl. No. 12/035,236, filed Sep. 21, 2012, 9 pgs. |
| Gordon, Final Office Action, U.S. Appl. No. 12/008,697, filed Mar. 6, 2012, 48 pgs. |
| Gordon, Office Action, U.S. Appl. No. 12/035,236, filed Mar. 13, 2013, 9 pgs. |
| Gordon, Office Action, U.S. Appl. No. 12/035,236, filed Mar. 22, 2011, 8 pgs. |
| Gordon, Office Action, U.S. Appl. No. 12/035,236, filed Mar. 28, 2012, 8 pgs. |
| Gordon, Office Action, U.S. Appl. No. 12/035,236, filed Dec. 16, 2013, 11 pgs. |
| Gordon, Office Action, U.S. Appl. No. 12/008,697, filed Aug. 1, 2013, 43 pgs. |
| Gordon, Office Action, U.S. Appl. No. 12/008,697, filed Aug. 4, 2011, 39 pgs. |
| Gordon, Office Action, U.S. Appl. No. 12/008,722, filed Oct. 11, 2011, 16 pgs. |
| Handley et al, “TCP Congestion Window Validation,” RFC 2861, Jun. 2000, Network Working Group, 22 pgs. |
| Henry et al. “Multidimensional Icons” ACM Transactions on Graphics, vol. 9, No. 1 Jan. 1990, 5 pgs. |
| Insight advertisement, “In two years this is going to be the most watched program on TV” On touch VCR programming, published not later than 2000, 10 pgs. |
| Isensee et al., “Focus Highlight for World Wide Web Frames,” Nov. 1, 1997, IBM Technical Disclosure Bulletin, vol. 40, No. 11, pp. 89-90. |
| ICTV, Inc., International Search Report/Written Opinion, PCT/US2008/000400, Jul. 14, 2009, 10 pgs. |
| Kato, Y., et al., “A Coding Control algorithm for Motion Picture Coding Accomplishing Optimal Assignment of Coding Distortion to Time and Space Domains,” Electronics and Communications in Japan, Part 1, vol. 72, No. 9, 1989, 11 pgs. |
| Koenen, Rob, “MPEG-4 Overview—Overview of the MPEG-4 Standard,” Internet Citation, Mar. 2001, http://mpeg.telecomitalialab.com/standards/mpeg-4/mpeg-4.htm, retrieved May 9, 2002, 74 pgs. |
| Konaka, M. et al., “Development of Sleeper Cabin Cold Storage Type Cooling System,” SAE International, The Engineering Society for Advancing Mobility Land Sea Air and Space, SAE 2000 World Congress, Detroit, Michigan, Mar. 6-9, 2000, 7 pgs. |
| Le Gall, Didier, “MPEG: A Video Compression Standard for Multimedia Applications”, Communication of the ACM, vol. 34, No. 4, Apr. 1991, New York, NY, 13 pgs. |
| Langenberg, Earl (TeleWest International) and Callahan, Ed (ANTEC), “Integrating Entertainment and Voice on the Cable Network.” |
| Large, D., “Tapped Fiber vs. Fiber-Reinforced Coaxial CATV Systems”, IEEE LCS Magazine, Feb. 1990, 7 pgs. Best Copy Available. |
| Mesiya, M.F, “A Passive Optical/Coax Hybrid Network Architecture for Delivery of CATV, Telephony and Data Services,” 1993 NCTA Technical Papers, 7 pgs. |
| “MSDL Specification Version 1.1,” International Organisation for Standardisation / Organisation Internationale de Normalisation, ISO/IEC JTC1/SC29/WG11 Coding of Moving Pictures and Audio, N1246, MPEG96, Mar. 1996, 101 pgs. |
| Noguchi, Yoshihiro, et al., “MPEG Video Compositing in the Compressed Domain,” IEEE International Symposium on Circuits and Systems, vol. 2, May 1, 1996, 4 pgs. |
| Regis, Notice of Allowance U.S. Appl. No. 13/273,803, filed Sep. 2, 2014, 8 pgs. |
| Regis, Notice of Allowance U.S. Appl. No. 13/273,803, filed May 14, 2014, 8 pgs. |
| Regis, Final Office Action U.S. Appl. No. 13/273,803, filed Oct. 11, 2013, 23 pgs. |
| Regis, Office Action U.S. Appl. No. 13/273,803, filed Mar. 27, 2013, 32 pgs. |
| Richardson, Ian E.G., “H.264 and MPEG-4 Video Compression: Video Coding for Next-Generation Multimedia,” John Wiley & Sons, US, 2003, ISBN: 0-470-84837-5, pp. 103-105, 149-152, and 164. |
| Rose, K., “Design of a Switched Broad-Band Communications Network for Interactive Services,” IEEE Transactions on Communications, vol. com-23, No. 1, Jan. 1975, 7 pgs. |
| Saadawi, Tarek N., “Distributed Switching for Data Transmission over Two-Way CATV”, IEEE Journal on Selected Areas in Communications, vol. Sac-3, No. 2, Mar. 1985, 7 pgs. |
| Schrock, “Proposal for a Hub Controlled Cable Television System Using Optical Fiber,” IEEE Transactions on Cable Television, vol. CATV-4, No. 2, Apr. 1979, 8 pgs. |
| Sigmon, Notice of Allowance, U.S. Appl. No. 13/311,203, filed Sep. 22, 2014, 5 pgs. |
| Sigmon, Notice of Allowance, U.S. Appl. No. 13/311,203, filed Feb. 27, 2014, 14 pgs. |
| Sigmon, Final Office Action, U.S. Appl. No. 13/311,203, filed Sep. 13, 2013, 20 pgs. |
| Sigmon, Office Action, U.S. Appl. No. 13/311,203, filed May 10, 2013, 21 pgs. |
| Smith, Brian C., et al., “Algorithms for Manipulating Compressed Images,” IEEE Computer Graphics and Applications, vol. 13, No. 5, Sep. 1, 1993, 9 pgs. |
| Smith, J. et al., “Transcoding Internet Content for Heterogeneous Client Devices” Circuits and Systems, 1998. ISCAS '98. Proceedings of the 1998 IEEE International Symposium on Monterey, CA, USA May 31-Jun. 3, 1998, New York, NY, USA,IEEE, US, May 31, 1998, 4 pgs. |
| Stoll, G. et al., “GMF4iTV: Neue Wege zur Interaktivität mit bewegten Objekten beim digitalen Fernsehen,” FKT Fernseh- und Kinotechnik, Fachverlag Schiele & Schon GmbH, Berlin, DE, vol. 60, No. 4, Jan. 1, 2006, ISSN: 1430-9947, 9 pgs. No English translation found. |
| Tamitani et al., “An Encoder/Decoder Chip Set for the MPEG Video Standard,” 1992 IEEE International Conference on Acoustics, vol. 5, Mar. 1992, San Francisco, CA, 4 pgs. |
| Terry, Jack, “Alternative Technologies and Delivery Systems for Broadband ISDN Access”, IEEE Communications Magazine, Aug. 1992, 7 pgs. |
| Thompson, Jack, “DTMF-TV, The Most Economical Approach to Interactive TV,” GNOSTECH Incorporated, NCF'95 Session T-38-C, 8 pgs. |
| Thompson, John W. Jr., “The Awakening 3.0: PCs, TSBs, or DTMF-TV—Which Telecomputer Architecture is Right for the Next Generation's Public Network?,” GNOSTECH Incorporated, 1995 The National Academy of Sciences, downloaded from the Unpredictable Certainty: White Papers, http://www.nap.edu/catalog/6062.html, pp. 546-552. |
| Tobagi, Fouad A., “Multiaccess Protocols in Packet Communication Systems,” IEEE Transactions on Communications, Vol. Com-28, No. 4, Apr. 1980, 21 pgs. |
| Toms, N., “An Integrated Network Using Fiber Optics (Info) for the Distribution of Video, Data, and Telephone in Rural Areas,” IEEE Transactions on Communication, vol. Com-26, No. 7, Jul. 1978, 9 pgs. |
| Trott, A., et al.“An Enhanced Cost Effective Line Shuffle Scrambling System with Secure Conditional Access Authorization,” 1993 NCTA Technical Papers, 11 pgs. |
| Jurgen, Two-way applications for cable television systems in the '70s, IEEE Spectrum, Nov. 1971, 16 pgs. |
| van Beek, P., “Delay-Constrained Rate Adaptation for Robust Video Transmission over Home Networks,” Image Processing, 2005, ICIP 2005, IEEE International Conference, Sep. 2005, vol. 2, No. 11, 4 pgs. |
| Van der Star, Jack A. M., “Video on Demand Without Compression: A Review of the Business Model, Regulations and Future Implication,” Proceedings of PTC'93, 15th Annual Conference, 12 pgs. |
| Welzenbach et al., “The Application of Optical Systems for Cable TV,” AEG-Telefunken, Backnang, Federal Republic of Germany, ISSLS Sep. 15-19, 1980, Proceedings IEEE Cat. No. 80 CH1565-1, 7 pgs. |
| Yum, T.S.P., “Hierarchical Distribution of Video with Dynamic Port Allocation,” IEEE Transactions on Communications, vol. 39, No. 8, Aug. 1, 1991, XP000264287, 7 pgs. |
| ActiveVideo Networks, Inc., International Preliminary Report on Patentability, PCT/US2013/036182, Oct. 14, 2014, 9 pgs. |
| ActiveVideo Networks Inc., Communication Pursuant to Rule 94(3), EP08713106-6, Jun. 25, 2014, 5 pgs. |
| ActiveVideo Networks Inc., Communication Pursuant to Rule 94(3), EP09713486.0, Apr. 14, 2014, 6 pgs. |
| ActiveVideo Networks Inc., Communication Pursuant to Rules 161(2) & 162 EPC, EP13775121.0, Jan. 20, 2015, 3 pgs. |
| ActiveVideo Networks Inc., Certificate of Patent JP5675765, Jan. 9, 2015, 3 pgs. |
| Brockmann, Notice of Allowance, U.S. Appl. No. 13/445,104, filed Dec. 24, 2014, 14 pgs. |
| Brockmann, Office Action, U.S. Appl. No. 13/668,004, filed Feb. 26, 2015, 17 pgs. |
| Brockmann, Office Action, U.S. Appl. No. 13/686,548, filed Jan. 5, 2015, 12 pgs. |
| Brockmann, Office Action, U.S. Appl. No. 13/911,948, filed Dec. 26, 2014, 12 pgs. |
| Brockmann, Office Action, U.S. Appl. No. 13/911,948, filed Jan. 29, 2015, 11 pgs. |
| Dahlby, Office Action, U.S. Appl. No. 12/651,203, filed Dec. 3, 2014, 19 pgs. |
| Gordon, Office Action, U.S. Appl. No. 12/008,722, filed Nov. 28, 2014, 18 pgs. |
| Regis, Notice of Allowance, U.S. Appl. No. 13/273,803, filed Nov. 18, 2014, 9 pgs. |
| Regis, Notice of Allowance, U.S. Appl. No. 13/273,803, filed Mar. 2, 2015, 8 pgs. |
| Sigmon, Notice of Allowance, U.S. Appl. No. 13/311,203, filed Dec. 19, 2014, 5 pgs. |
| TAG Networks Inc., Decision to Grant a Patent, JP 2008-506474, Oct. 4, 2013, 5 pgs. |
| ActiveVideo Networks Inc., Decision to refuse a European patent application (Art. 97(2) EPC), EP09820936.4, Feb. 20, 2015, 4 pgs. |
| ActiveVideo Networks Inc., Communication Pursuant to Article 94(3) EPC, 10754084.1, Feb. 10, 2015, 12 pgs. |
| ActiveVideo Networks Inc., Communication under Rule 71(3) EPC, Intention to Grant, EP08713106.6, Feb. 19, 2015, 12 pgs. |
| ActiveVideo Networks Inc., Notice of Reasons for Rejection, JP2014-100460, Jan. 15, 2015, 6 pgs. |
| ActiveVideo Networks Inc., Notice of Reasons for Rejection, JP2013-509016, Dec. 24, 2014 (Received Jan. 14, 2015), 11 pgs. |
| Brockmann, Office Action, U.S. Appl. No. 13/737,097, filed Mar. 16, 2015, 18 pgs. |
| Brockmann, Notice of Allowance, U.S. Appl. No. 14/298,796, filed Mar. 18, 2015, 11 pgs. |
| Craig, Decision on Appeal (Reversed), U.S. Appl. No. 11/178,177, filed Feb. 25, 2015, 7 pgs. |
| Craig, Notice of Allowance, U.S. Appl. No. 11/178,177, filed Mar. 5, 2015, 7 pgs. |
| Craig, Notice of Allowance, U.S. Appl. No. 11/178,181, filed Feb. 13, 2015, 8 pgs. |
| Number | Date | Country | |
|---|---|---|---|
| 20080170622 A1 | Jul 2008 | US |
| Number | Date | Country | |
|---|---|---|---|
| 60884773 | Jan 2007 | US | |
| 60884744 | Jan 2007 | US | |
| 60884772 | Jan 2007 | US |