Overlay rendering of user interface onto source video

Information

  • Patent Number
    9,326,047
  • Date Filed
    Friday, June 6, 2014
  • Date Issued
    Tuesday, April 26, 2016
Abstract
A method of combining an interactive user interface with one or more supplemental images to generate a blended output that includes the interactive user interface and the one or more supplemental images. At a client device remote from a server, a video stream that contains an interactive user interface is received from the server using a first data communications channel configured to communicate video content, and a command that relates to the interactive user interface is transmitted to the server. In response to the transmitting, an updated user interface is received from the server using the first data communications channel, and one or more supplemental images for supplementing the interactive user interface are received using a second data communications channel different from the first data communications channel. The updated user interface and the one or more supplemental images are blended to generate a blended output, which is transmitted toward a display device for display thereon.
Description
TECHNICAL FIELD

The present invention relates to interactive video distribution systems, and more particularly to blending a source video with an interactive user interface to generate a single image, where the source video and interactive user interface are separately provided.


BACKGROUND ART

It is known in the prior art to provide interactive user interfaces for television programs. Such interactive user interfaces include, for example, electronic program guides (EPGs) that may be manipulated to search for broadcast programs or schedule recordings. Interactive user interfaces also include simple video games, menuing systems to access video on demand, and other similar such mechanisms.


Interactive user interfaces may be combined with source video, such as video on a broadcast or cable channel. There are two broad ways to combine such interfaces with source video: scale down the source video and fill the rest of the screen with the interactive user interface, or keep the source video full-screen but overlay the user interface onto the screen. As an example of the first combination, modern EPGs often show dynamically-generated channel information with a small preview window that shows video for a current channel. As an example of the second combination, television sets often provide volume controls as elements that overlay an area of the screen, typically near the bottom or along one side, while continuing to display the underlying source video content full-screen.


The latter method to combine user interfaces with source video can itself be broken into two different categories: opaque user interfaces and translucent, or partially transparent, user interfaces. Different techniques can be used for these different categories. For example, if it is known in advance that a user interface will be opaque, then the pixels of the underlying source video content may be discarded at the beginning of the overlay process. This ability to discard pixels simplifies processing of the overlays and permits compositing of the user interface directly into the source image. For certain block-based encoding schemes, compositing can be accomplished at a block level. However, for partially transparent user interfaces, the underlying pixels must be retained and blended with the user interface.


It is also known in the art to overlay images using blending. For purposes of the present disclosure, “blending” refers to a process of alpha compositing; that is, the process of combining two colors using a transparency coefficient, α. Using this technique, each pixel of each image may be viewed as being associated with four values: three color values and one alpha value, each between 0.0 and 1.0, either by storing these values per pixel or in a lookup table such as, for example, a palette. If the color values are red-green-blue, for example, then these four values are denoted RGBA. Alpha blending takes as input the RGBA values of a foreground pixel and a background pixel, and produces as output a pixel having RGBA values color(output)=α(f)*color(f)+(1−α(f))*color(b) and α(output)=α(f)+α(b)*(1−α(f)), where α(f) and α(b) are the transparency coefficients of the foreground and background pixels, respectively. In other words, the colors and transparency coefficient of the output are a weighted average of the foreground and background pixels, using α as the weight. Thus, if α=0.0 in the foreground pixel, then the colors in the output pixel are the same as that of the background (that is, the foreground pixel is not visible). If α is increased from 0.0 toward 1.0, more of the foreground pixel becomes visible, until when α=1.0 the color of the output pixel is the same as that of the foreground pixel (that is, the background pixel is completely overlaid by the foreground pixel).
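
To make the arithmetic concrete, the following is a minimal Python sketch (not part of the patent) that implements the per-pixel formula exactly as stated above, with all components as floats in [0.0, 1.0]:

```python
def alpha_blend(fg, bg):
    """Blend a foreground RGBA pixel over a background RGBA pixel.

    Implements color(output) = a_f*color(f) + (1 - a_f)*color(b) and
    a(output) = a_f + a_b*(1 - a_f), the formulation given above.
    """
    fr, fgc, fb, fa = fg
    br, bgc, bb, ba = bg
    out_a = fa + ba * (1.0 - fa)
    out_rgb = tuple(fa * f + (1.0 - fa) * b
                    for f, b in zip((fr, fgc, fb), (br, bgc, bb)))
    return (*out_rgb, out_a)

# A fully transparent foreground (alpha = 0.0) leaves the background visible:
assert alpha_blend((1.0, 0.0, 0.0, 0.0), (0.0, 0.0, 1.0, 1.0))[:3] == (0.0, 0.0, 1.0)
```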


However, it is generally disadvantageous to blend user interfaces at the server (e.g., at a cable headend), for a number of reasons. First, a typical television provider will have hundreds of thousands or millions of subscribers, a significant portion of whom will, at any given time, require interactive user interfaces. Each subscriber may be watching a different source video, and blending all of these source videos with any number of user interfaces is a problem that does not scale well. Second, blending a user interface with a source video requires access to the pixels of the source video, but the source video that is broadcast is typically ingested from a content provider already encoded according to a transmission encoding; decoding it to pixels, blending, and re-encoding it for every subscriber exceeds available computational power. Third, a significant latency may be caused by the blending process, creating an unacceptable ‘sluggishness’ in the response of the user interface.


SUMMARY OF THE EMBODIMENTS

Various embodiments of the invention overcome the disadvantages of blending, at the server, interactive user interfaces with underlying source video in two distinct ways. First, many client devices, such as set top boxes or smart televisions, have the ability to perform alpha blending. Thus, it is possible to transmit the user interface from a remote server, such as one found at a cable headend, to the client device, on demand and out-of-band using a separate protocol, such as a modified RFB or XRT protocol. Second, even if a client device does not have the ability to perform alpha blending locally, such blending can be accelerated at the remote server through a combination of image caching and reconstruction of the client device decoder state to the point where blending becomes a scalable operation.


Some implementations include a method of providing, at a client device, an interactive user interface for generating an output, for a display, that includes a source video and the interactive user interface. The method includes receiving, at a client device remote from a server, the source video from the server using a first data communications channel configured to communicate video content, wherein the first data communications channel comprises a quadrature amplitude modulation (QAM) protocol. Furthermore, the method includes transmitting to the server a command related to the interactive user interface, and receiving, in response to the transmitting, one or more images of the interactive user interface using a second data communications channel different from the first data communications channel, wherein the second data communications channel comprises transmission control protocol over internet protocol (TCP/IP). The source video is blended with the received one or more images to generate an output, and the output is transmitted toward a display device for display thereon.


In some embodiments, the interactive user interface comprises a menu.


In some embodiments, the received video content is encoded using an MPEG specification, an AVS specification, or a VC-1 specification. Furthermore, in some embodiments, the one or more images of the interactive user interface are encoded using a bitmap (BMP) file format, a portable network graphics (PNG) file format, a joint photographic experts group (JPEG) file format, or a graphics interchange format (GIF) file format.


In some embodiments, each image of the one or more images is associated with a corresponding transparency coefficient, and blending the source video with the received one or more images comprises blending according to the transparency coefficient.


In some embodiments, the blending comprises blending in a spatial domain.


In another aspect, a method includes providing, at a server, an interactive user interface for generating an output, for a display, that includes the interactive user interface and a source video. The method includes transmitting frames of a source video toward a client device, remote from the server, using a data communications channel configured to communicate video content, while simultaneously buffering in a memory of the server a plurality of encoded frames from the source video for subsequent transmission to the client device. The buffered frames include a first frame that is intra-encoded and one or more additional frames that are inter-encoded based on the first frame. Responsive to receiving from the client device a command that relates to the interactive user interface, the method includes determining a buffered frame in the plurality of buffered frames that corresponds to a time associated with the command, and blending the determined frame with one or more images of the interactive user interface to generate an output. Using the data communications channel, the output is transmitted toward the client device for display on the display device.


In some embodiments, transmitting the frames of the source video and transmitting the output frame each comprise transmitting according to a screen resolution or a screen dimension of the display device.


In some embodiments, the interactive user interface comprises a menu.


In some embodiments, the encoding specification is an MPEG specification, an AVS specification, or a VC-1 specification. Furthermore, in some embodiments, the one or more images of the interactive user interface are encoded using a bitmap (BMP) file format, a portable network graphics (PNG) file format, a joint photographic experts group (JPEG) file format, or a graphics interchange format (GIF) file format.


In some embodiments, the data communications channel comprises at least one of: quadrature amplitude modulation (QAM) using a cable network infrastructure, user datagram protocol over internet protocol (UDP/IP) using an internet protocol television (IPTV) infrastructure, or hypertext transfer protocol (HTTP) using a public or private internet infrastructure.


In some embodiments, each image of the one or more images is associated with a corresponding transparency coefficient, and blending the determined frame with the one or more images comprises blending according to the transparency coefficient.


In some embodiments, blending the determined frame with one or more images includes (i) decoding the determined frame according to the encoding specification to generate a decoded frame; (ii) blending the decoded frame with the one or more images in a spatial domain to generate a blended frame; and (iii) encoding the blended frame according to the encoding specification to generate the output frame. Furthermore, in some implementations, encoding the blended frame comprises searching for motion vectors.


In some embodiments, the output frame is encoded according to the encoding specification.


In yet another aspect, a method includes combining, at a client device, an interactive user interface with one or more supplemental images to generate a blended output, for a display, that includes the interactive user interface and the one or more supplemental images. The method includes receiving, at a client device remote from a server, an interactive user interface from the server using a first data communications channel configured to communicate video content. Furthermore, the method includes transmitting to the server a command that relates to the interactive user interface, and receiving, in response to the transmitting, an updated user interface from the server using the first data communications channel, and the one or more supplemental images for supplementing the interactive user interface using a second data communications channel different from the first data communications channel. The updated user interface and the one or more supplemental images are blended to generate a blended output, and the blended output is transmitted toward a display device for display thereon.


In some embodiments, the interactive user interface comprises a source video stitched with user interface content.


In some embodiments, the encoding specification is an MPEG specification, an AVS specification, or a VC-1 specification.


In some embodiments, the first data communications channel comprises at least one of: quadrature amplitude modulation (QAM) using a cable network infrastructure, user datagram protocol over internet protocol (UDP/IP) using an internet protocol television (IPTV) infrastructure, or hypertext transfer protocol (HTTP) using a public or private internet infrastructure.


In some embodiments, the one or more supplemental images are encoded using a bitmap (BMP) file format, a portable network graphics (PNG) file format, a joint photographic experts group (JPEG) file format, or a graphics interchange format (GIF) file format.


In some embodiments, the second data communications channel comprises at least one of transmission control protocol over internet protocol (TCP/IP), remote frame buffer (RFB) protocol, and extended remoting technology (XRT) protocol.


In some embodiments, each supplemental image of the one or more supplemental images is associated with a corresponding transparency coefficient, and blending the updated user interface with the one or more supplemental images comprises blending according to the transparency coefficient.


In some embodiments, blending comprises blending in a spatial domain.


In some embodiments, the command is a request for secure content, the one or more supplemental images are received from a third party server, and the second data communications channel uses a secure transport protocol.


In yet another aspect, the method includes providing, at a server, an interactive user interface for generating a blended output, for a display, that includes the interactive user interface and one or more supplemental images. The method includes transmitting, from the server to a client device remote from the server, the interactive user interface using a first data communications channel configured to communicate video content, and receiving a command that relates to the interactive user interface. Furthermore, the method includes generating an updated interactive user interface, blending the updated user interface and the one or more supplemental images to generate a blended output frame, and transmitting the blended output frame toward the client device for display on a display device.


In some embodiments, the method further includes transmitting the updated interactive user interface toward the client device for display on the display device, and switching between transmitting the blended output frame and transmitting the updated interactive user interface.


In some embodiments, the encoding specification is an MPEG specification, an AVS specification, or a VC-1 specification.


In some embodiments, the first data communications channel comprises at least one of: quadrature amplitude modulation (QAM) using a cable network infrastructure, user datagram protocol over internet protocol (UDP/IP) using an internet protocol television (IPTV) infrastructure, or hypertext transfer protocol (HTTP) using a public or private internet infrastructure.


In some embodiments, the image format of the one or more supplemental images is a bitmap (BMP) file format, a portable network graphics (PNG) file format, a joint photographic experts group (JPEG) file format, or a graphics interchange format (GIF) file format.


In some embodiments, the method includes first determining that the client device is not capable of overlaying.





BRIEF DESCRIPTION OF THE DRAWINGS

The foregoing features of embodiments will be more readily understood by reference to the following detailed description, taken with reference to the accompanying drawings, in which:



FIG. 1 schematically shows a system in accordance with one embodiment of the invention;



FIG. 2 is a flowchart showing operation of a client device in the system of FIG. 1;



FIG. 3 schematically shows a sequence of frames of video in relation to an interactivity command in accordance with a second embodiment of the invention;



FIG. 4 schematically shows a system in accordance with a second embodiment of the invention;



FIG. 5 is a flowchart showing operation of a server in the system of FIG. 4;



FIG. 6A schematically shows a system in accordance with a third embodiment of the invention;



FIG. 6B is a flowchart showing operation of a client device in the system of FIG. 6A;



FIG. 7A schematically shows a system in accordance with a fourth embodiment of the invention;



FIG. 7B is a flowchart showing operation of a server in the system of FIG. 7A;



FIG. 8A schematically shows a system in accordance with a fifth embodiment of the invention; and



FIG. 8B is a flowchart showing operation of a client device in the system of FIG. 8A.





DETAILED DESCRIPTION OF SPECIFIC EMBODIMENTS
Definitions

As used in this description and the accompanying claims, the following terms shall have the meanings indicated, unless the context otherwise requires:


“Video” means both silent moving images and moving images accompanied by sound, except where otherwise indicated.


An “encoding specification” is a specification according to which video data are encoded by a transmitting electronic device and decoded by a receiving electronic device. Examples of encoding specifications are MPEG-2, MPEG-4, AVS, and VC-1.


A “client device” is an electronic device capable of receiving and decoding data according to an encoding specification for display on a display device. Examples of client devices include cable and satellite set top boxes, some video game consoles, and some televisions.



FIG. 1 schematically shows a system in accordance with one embodiment of the invention. This embodiment includes a client device 10 that provides an output display signal to a display device 11. The client device 10 generally receives signals, such as linear broadcast television signals, from one or more servers 12, by way of a first data communications network 131. The client device 10 also receives images that form an interactive user interface, such as electronic program guide signals, by way of a second data communications network 132. The client device 10 then combines these signals to generate the output display signal. The aforementioned elements are now described in more detail.


The client device 10 may be implemented as a set top box, a video game console, a television, or other electronic device known in the art. The client device 10 includes an overlay module 101 that is capable of overlaying an image on an input video signal to generate an output video signal as a sequence of composite images. The operation of the overlay module 101 is described in more detail in connection with FIG. 2. The client device 10 also includes a video decoder 102, which is capable of decoding audiovisual data that was encoded according to an encoding specification. Such video decoders are well known in the art, and may be implemented as an integrated circuit. Audiovisual data typically are encoded to reduce size for transmission through the data communications network 131.


The client device 10 also has several input/output (I/O) ports 103. One I/O port 103 is used to receive audiovisual data from the data communications network 131. In some embodiments, another I/O port is used to receive, from the data communications network 132, the images that comprise an interactive user interface; in other embodiments, the same I/O port is used to receive both the audiovisual data and the interactive user interface images. Another I/O port is used to accept user input in the form of commands. Some commands may instruct the client device 10 to tune to a different channel (i.e., to receive different audiovisual data from the data communications network 131 or from another data network such as the Internet). Other commands may instruct the client device to record audiovisual data, either as it arrives at the client device 10 or at a future time and on a specified channel. Some commands will cause the display of an interactive user interface, while other commands will not. Various embodiments of the present invention are directed toward processing of commands that cause the display of such a user interface. The I/O ports 103 may be implemented using hardware known in the art, such as an IR receiver to interface with a remote control, a coaxial jack to interface with a cable television distribution network, a wired or wireless Ethernet port to interface with an Ethernet network, a video jack to provide the output display signal to the display device 11, and so on. The display device 11 itself may be implemented as a standard CRT, LCD, LED, or plasma monitor as is known in the art, or other similar device.


The one or more servers 12 may be implemented using computer equipment known in the art; however, their functions are novel when operated in accordance with various embodiments of the present invention. In accordance with some embodiments of the invention, a large number of servers 12 may be present and cooperate to provide the functions described below. However, for convenience and clarity, the remainder of the detailed description will assume that only one server 12 is present.


The server 12 includes a number of audio, video, and/or audiovisual data sources 121, an application execution environment 122, and an encoder 123. Note that other components may be used in an implementation of the server 12, although these have been omitted for clarity. These components are now described in more detail.


The audio/video data sources 121 may be, for example, non-linear multimedia data stored on a non-volatile storage device in the form of a movie, television program, television commercial, game graphics and sounds, user interface sounds, or other such form. The data sources 121 also may include linear multimedia data sources, such as a television broadcast stream received live by antenna or private network.


An application execution environment 122 executes an interactive application on behalf of a user. The application may be, for example, a menuing system, a video game system, or other interactive application. The environment 122 responds to input interactive commands by providing images to the client device 10 using data communications network 132. The environment 122 includes at least application logic 1221, a source of images 1222, and an image cache 1223. Application logic 1221 may be implemented as an executable file or a script that provides a state machine for operating an interactive user interface. Any format of application file may be used as application logic 1221; for example, a hypertext markup language (HTML) file that includes JavaScript may be used, or a compiled binary file may be used.


The application logic 1221 may dynamically generate one or more images 1222 that comprise the interactive user interface. The images 1222 often persist in a volatile memory of the server 12 for speed of access, for example in an image cache 1223. The images 1222 may be generated by the application execution environment logic 1221 according to a screen resolution or a screen dimension of the display device 11, which may be statically configured or may be determined dynamically when the client device 10 first establishes a communications session with the server 12. Typically, for efficiency purposes, the application logic 1221 will transmit images from the image cache 1223 if possible, and dynamically create images 1222 for transmission only if they are not already in the image cache 1223. The use of a cache 1223 advantageously permits interactive user interface images to be reused by the server 12 (or by other servers) between different requests for the user interface, even if those requests come from different end users or at different times. Images in the image cache 1223 typically are indexed using a hashing function defined by the environment 122. The use of the hashing function permits many images to be quickly retrieved from the image cache 1223, advantageously providing increased scalability. Additionally and/or alternatively, in some embodiments, server(s) 12 will transmit references to the images (such as Uniform Resource Locators or URLs), as opposed to the images themselves, so that the client can retrieve them on demand (e.g., by means of HTTP). Such embodiments would be advantageous, as an intermediate network cache (not shown), accessible through second data communications channel 132, may be used to store reusable images closer to the client device.
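
As an illustrative sketch of such a hash-indexed cache, the following Python fragment derives a key from the parameters that determine an image's pixels and renders only on a miss. The key scheme and the render callback are assumptions for illustration, not the patent's specified design:

```python
import hashlib

class ImageCache:
    """Hypothetical sketch of the hash-indexed image cache 1223."""

    def __init__(self):
        self._store = {}

    @staticmethod
    def key(ui_state: str, width: int, height: int) -> str:
        # Hash the parameters that determine the image's pixels, so the same
        # user interface image is found and reused across users and sessions.
        return hashlib.sha256(f"{ui_state}:{width}x{height}".encode()).hexdigest()

    def get_or_render(self, ui_state, width, height, render):
        k = self.key(ui_state, width, height)
        if k not in self._store:        # dynamically create only on a cache miss
            self._store[k] = render(ui_state, width, height)
        return self._store[k]
```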


The encoder 123 encodes the source audiovisual data according to an encoding specification, such as MPEG, AVS, or VC-1. The encoder 123 and the decoder 102 use the same encoding specification, so that the encoded audiovisual data may be decoded once it passes through the data communications network 131. In the case that the source audiovisual data are already encoded, the encoder acts as a simple pass-through. However, in the case that the source audiovisual data are not in a format decodable by the decoder 102, the encoder 123 transcodes the data into a decodable format.


As can be seen from FIG. 1, the encoded audiovisual data (from the encoder 123) and the user interface images (either from images 1222 or the cache 1223) travel to the client device along two different data channels. The first data channel through the first data communications network 131 is designed specifically to communicate video content. Thus, for example, the network 131 may include a cable network infrastructure that deploys quadrature amplitude modulation (QAM), as is known in the art. Alternately, the network 131 may have an internet protocol television (IPTV) infrastructure that uses user datagram protocol over internet protocol (UDP/IP) to communicate encoded video. In yet another implementation, the network 131 may be part of a public or private internet infrastructure, and use hypertext transfer protocol (HTTP) tunneling to communicate the encoded video.


By contrast, the second data communications network 132 may be designed to communicate images, rather than video. In particular, this means that the second network 132 may operate at a much lower bandwidth or with a higher reliability than the first network 131. Thus, for example, the second network 132 may support data channels using the transmission control protocol over internet protocol (TCP/IP), the remote frame buffer (RFB) protocol, or the extended remoting technology (XRT) protocol. Images that are transmitted on the second network 132 may be encoded, for example, using a bitmap (BMP) file format, a portable network graphics (PNG) file format, a joint photographic experts group (JPEG) file format, or a graphics interchange format (GIF) file format. The use of PNG is particularly advantageous, as each pixel is stored with a corresponding transparency coefficient (α value).
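
As a brief illustration of the last point, a PNG overlay carries its transparency coefficient in the image itself; the snippet below (using the Pillow library, with a hypothetical file name) reads the per-pixel alpha directly:

```python
from PIL import Image  # Pillow

overlay = Image.open("menu_overlay.png").convert("RGBA")  # hypothetical asset
r, g, b, a = overlay.getpixel((0, 0))
# 'a' is the per-pixel transparency coefficient (0 = invisible, 255 = opaque),
# so no side channel is needed to communicate transparency to the blender.
print(f"top-left pixel alpha: {a / 255.0:.2f}")
```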



FIG. 2 is a flowchart showing operation of a client device 10 in the system of FIG. 1 in accordance with an embodiment of the invention. In particular, FIG. 2 illustrates a method of providing, in the client device 10, an interactive user interface for simultaneous display with a source video on a display device 11. The method begins with a process 21 in which the client device 10 receives source video using a first data communications channel 131. In a typical embodiment, the client device 10 will display this source video as it arrives on the display device 11, as is known in the art. Next, in process 22 the client device 10 transmits a command related to an interactive user interface to a server 12. This command may be transmitted, for example, in response to the client device 10 receiving on an I/O port 103 a signal that a button or buttons on a remote control has been pressed. The button or buttons may be provided on the remote control to call up an interactive program guide, a video game, or other interactive application.


In process 23, the client device 10 responsively receives one or more images of the interactive user interface, using a second data communications channel 132. For example, the images might include a number of buttons, switches, or dials for collective simultaneous display as a user interface. Alternately, the images might be designed to be displayed sequentially, as in the case of a “trick play” interface that includes a video timeline and a mark indicating a current time stamp. Seeking through the video may be performed by pressing a fast-forward or rewind button on the remote control, and movement of the timing mark along the timeline typically may be sped up by repeated button presses. The images may come from the images 1222 or the image cache 1223 of the application execution environment 122.


Next, in a process 24, the client device 10, and in particular the overlay module 101, alpha blends the source video with the received images to generate an output frame of pixels. In accordance with various embodiments of the invention, the received interactive user interface images are considered to be partially transparent foreground images (0.0<α<1.0), and frames of the source video are considered to be opaque background images (α=1.0). The choice of α for the user interface images advantageously may be made to be approximately 0.5, so that the interactive user interface appears evenly blended with the background source video. Or, the value of α may be varied on a per-pixel basis (i.e., per-pixel alpha blending) within each image; for example, providing a downward α-gradient at the edges of a user interface image will produce an effect of the image ‘fading into the background’ at its edges. Global alpha blending and per-pixel alpha blending may be combined by multiplying each per-pixel alpha value with the global alpha value before the blending process is applied. The blending process 24 is performed using an appropriate received user interface image or images with respect to each frame of the source video for as long as the interactive user interface should be displayed on the screen, thereby providing a continuously-displayed interactive user interface.
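
The combination of global and per-pixel alpha described above can be sketched in a few lines of Python with NumPy; this illustrates the blending rule only, not the overlay module's actual implementation:

```python
import numpy as np

def blend_frame(ui_rgba, video_rgb, global_alpha=0.5):
    """Blend a UI image (H x W x 4, floats in [0, 1]) over an opaque video
    frame (H x W x 3). Per-pixel alpha is multiplied by the global alpha
    before the weighted average is taken, as described above."""
    a = ui_rgba[..., 3:4] * global_alpha          # effective per-pixel weight
    return a * ui_rgba[..., :3] + (1.0 - a) * video_rgb
```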


Finally, in process 25, the client device 10 transmits each output frame toward the display device 11 for display. An I/O port 103 may be used in processes 21, 22, 23, 25 to receive or transmit data. A computing processor may be used in process 24 to perform the required blending.


The above embodiments are preferred because the image cache 1223 may be used to increase scalability of the content delivery platform provided by the server 12 (or a server cloud). This is true because it is feasible to cache individual user interface images separately from their underlying source videos, while it is generally infeasible to cache a vast number of pre-blended images due to limited storage space. The separate caching of user interface images, in turn, is a result of the ability of the client device 10 to receive these images using an I/O port 103 and perform blending in the overlay module 101.


In some situations, it may be impossible to use these embodiments, because a client device 10 may not have the necessary I/O ports 103 or an overlay module 101. In these situations, it is instead necessary to perform blending at the server 12, rather than the client 10, and such blending has its own challenges.


One such challenge is that the user interface images must be blended by the server 12, but can be sent to the client device 10 only as encoded audiovisual data. Therefore, it is necessary to decode the source video into a spatial domain (i.e., as a frame of pixels), blend the user interface images with the source video in the spatial domain, then re-encode the blended image according to the encoding specification. These processes require server computational capacity, and do not scale well.
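
This round trip can be summarized in a short sketch; the combined encoder/decoder object and all names here are illustrative assumptions, not any real codec API:

```python
def server_side_blend(encoded_frame, ui_images, codec, blend):
    """Decode-blend-re-encode round trip described above (illustrative)."""
    pixels = codec.decode(encoded_frame)     # into the spatial domain
    for image in ui_images:
        pixels = blend(image, pixels)        # alpha blend in pixel space
    return codec.encode(pixels)              # back to the encoding specification
```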


Another challenge is that there is noticeable latency between the time at which the interface command occurs and when the user interface can be displayed. This challenge is illustrated by consideration of FIG. 3, which schematically shows a time sequence 31 of frames of video in relation to an interactivity command 32. In this figure, a sequence 31 of frames includes a number of individual video frames 311-317. The frames are labeled by a frame type, which may be either intra-encoded or inter-encoded. An intra-encoded frame encodes video data according to data found only in the frame, while an inter-encoded frame encodes video data according to data found in the given frame and in surrounding frames. For purposes of clarity, MPEG frame types are used in the figures and detailed description to provide an example implementation, but any encoding specification may be used in accordance with an embodiment of the invention.


The sequence 31 of frames includes two types of frames: I-frames that are intra-encoded and P-frames that are inter-encoded. I-frames are encoded using image information found only in themselves. Thus, I-frames encode a full-screen image, which is useful to indicate a ‘scene change’ or to eliminate display artifacts. Two frames 311, 317 are I-frames. P-frames are encoded using information found in the previous image by estimating movement of pixels using two-dimensional “motion vectors”. Thus, P-frames are useful for fixed or slow-moving ‘camera pan’ images where most of the image content of the previous frame is present in the next frame. This relationship between P-frames and their predecessor frames is indicated by the backwards-facing arrows in FIG. 3. Frames 312-316 are P-frames. MPEG also defines a B-frame, not shown in FIG. 3, which interpolates both forward and backward between other frames.


Suppose an interface command 32 arrives at the server 12 when a P-frame 316 is being displayed on the display device 11. Because it is inter-encoded, the information in this P-frame is insufficient by itself to reconstruct the complete image being displayed (i.e., to reconstruct the decoder state). In fact, the information necessary is found in a combination of the frames 311-316. One could introduce a latency 33 between the time of the command 32 and the next I-frame 317, at which time the overlay image is blended 34. However, if the group of pictures 31 contains two seconds worth of source video, the average wait time from the command 32 to the next I-frame 317 (and the appearance of the user interface) is one second, which is unacceptably unresponsive. Therefore, in various embodiments of the invention, all of the data in each group of pictures 31 (that is, from one intra-encoded frame until the next one) are buffered in the server 12 before being transmitted to permit blending of the interactive user interface images with the currently-displayed image from the source video.
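
A minimal sketch of that buffering policy follows; 'is_intra' is an assumed attribute standing in for whatever frame-type flag the encoding specification provides:

```python
def transmit_and_buffer(frames, buffer):
    """Keep every frame from one intra-encoded frame up to the next, so the
    server can later rebuild the client decoder's state at any point in the
    current group of pictures (illustrative sketch)."""
    for frame in frames:
        if frame.is_intra:      # a new group of pictures begins here
            buffer.clear()
        buffer.append(frame)
        yield frame             # transmission toward the client continues
```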


The server 12 uses buffered frames to simulate, for blending, the state of the decoder 102 in the client device 10. This process is illustrated by the sequence 35, in which an encoder in the server 12 constructs the state of the decoder 102. The server 12 retrieves the first frame 311 of the buffered frames, and uses it as an initial simulated state 351. The server 12 then retrieves the second frame 312 of the buffered frames, and applies its data to the initial simulated state 351 to obtain a second simulated state 352. The server 12 retrieves the third frame 313 of the buffered frames, and applies its data to the second simulated state 352 to obtain a third simulated state 353. This process continues until the simulation reaches a state 356 that corresponds to a frame 316 corresponding to a time associated with the command 32. Once the server 12 has recovered the state of the decoder 102, it may perform blending as described above in connection with element 24 of FIG. 2.
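
That replay might look like the following sketch, where 'decode_into' and the frame attributes are assumed names rather than a real codec API:

```python
def simulate_decoder_state(buffered_frames, command_time):
    """Replay buffered frames through a simulated decoder until the frame
    corresponding to the command, mirroring sequence 35 of FIG. 3."""
    state = None
    for frame in buffered_frames:
        state = frame.decode_into(state)   # an intra frame ignores prior state
        if frame.timestamp >= command_time:
            break
    return state                           # pixels ready for blending
```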



FIG. 4 schematically shows a system in accordance with an embodiment of the invention in which the server 12 performs blending. The disclosure of FIG. 4 overlaps to a large extent with that of FIG. 1, so only the changes will be remarked upon here. As noted above, in the scenario under consideration, the client device 10 in FIG. 4 lacks the overlay module 101 found in FIG. 1. Therefore, the server 12 includes, in addition to the encoder 123 of FIG. 1, a decoder/blender 124 for decoding and blending source video with an interactive user interface. Note that while the functions of decoding and blending are combined in decoder/blender 124 for purposes of this disclosure, these functions may be implemented in separate hardware or software. Also as described above, the server 12 further includes a buffer memory 125 for buffering frames of source video data. During ordinary operation of the system of FIG. 4, most frames of source video data buffered in the buffer memory 125 are discarded without being blended, and the decoder/blender 124 acts as a simple pass-through. However, when a user provides an interactive command to the application execution environment 122, the environment 122 provides images to the blender 124 (preferably statically from its cache 1223, or else dynamically from the image generator 1222) for blending with the buffered video. The decoder/blender 124 decodes the source video data and simulates the state of the decoder 102 as described with respect to element 35 of FIG. 3. The decoder/blender 124 then blends the interactive user interface images into the source video, one frame at a time. The decoder/blender 124 provides an output to the encoder 123, which encodes the data according to the appropriate encoding specification for transmission to the client device 10.



FIG. 5 is a flowchart showing operation of a server in the system of FIG. 4. In particular, FIG. 5 shows a method of providing, in a server 12, an interactive user interface for simultaneous display with a source video on a display device 11. In a first process 51, the server 12 transmits frames of the source video toward the client device 10 for display. Simultaneously, in a second process 52, the server 12 buffers frames from the source video for subsequent transmission. In process 53, the server 12 receives from the client device 10 a command 32 that relates to the interactive user interface. In process 54, the decoder/blender 124 determines a buffered frame 316 in the buffer memory 125 that corresponds to a time associated with the command 32. In process 55, the decoder/blender 124 blends the determined frame with one or more images of the interactive user interface received from the application execution environment 122 to generate an output frame that is subsequently encoded by the encoder 123. Then, in process 56, the server 12 transmits the output frame toward the client device 10 for display on the display device 11.


Note that the encoder 123 may be required to do a motion vector search after blending. There are several optimizations that can be performed to speed up this process. In a first optimization, the encoder 123 could make use of motion information found in the original video frame when it was decoded by the decoder/blender 124. However, the encoder 123 must verify whether the same motion is still present in the blended image due to the presence of the interactive user interface. In a second optimization, the source video images could be divided into rectangular areas, and motion vectors for each area are encoded separately. In this case, motion vectors for rectangles that do not intersect the user interface are unaffected by the blending, and no additional motion vector search is required for these rectangles.
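
The second optimization reduces to a rectangle-intersection test per encoded area; a sketch, with rectangles given as hypothetical (x, y, w, h) tuples:

```python
def needs_motion_search(block_rect, ui_rects):
    """A rectangular area keeps its original motion vectors unless it
    intersects the overlaid user interface (illustrative sketch)."""
    bx, by, bw, bh = block_rect
    return any(bx < x + w and x < bx + bw and by < y + h and y < by + bh
               for (x, y, w, h) in ui_rects)
```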



FIG. 6A schematically shows a system in accordance with an embodiment of the invention in which overlay images are used to supplement a streamed interactive user interface. In U.S. application Ser. No. 12/443,571 (“Method for Streaming Parallel User Sessions, System and Computer Software”), the contents of which are hereby incorporated by reference in their entirety, a system is disclosed where an interactive user interface is streamed to a client device over a first data communications channel. The streamed interactive user interface is realized by stitching a plurality of fragments and streams into a single compliant audiovisual stream. It has been identified that for a number of reasons it is beneficial to overlay images over an encoded stream instead of encoding them in the stream, which also holds for cases in which the audiovisual stream is an interactive user interface. For example, it is beneficial to overlay images in cases involving a sprite-like user interface element (e.g., a cursor). Such a user interface element is generally arbitrarily placed on the screen, and it may be more efficient to decouple the element from the interactive user interface by overlaying images. In particular, if the user interface element were instead encoded (e.g., by the fragment encoder) and subsequently stored in cache, the cache would quickly reach capacity because a sprite-like user interface element, unlike some other user interface elements (e.g., a menu), does not have a predefined position. Another example may be that the interactive user interface has a partial-screen video element over which another user interface element is supposed to be rendered. In this case it is more efficient from a scalability point of view to render only the new interface element as overlay image(s).


The system disclosed in FIG. 6A is fundamentally the same as that shown in FIG. 1. Here, the client device 60 receives an interactive user interface via a first data communications channel 63 from a server 62. In some embodiments, server 62 runs an application in the application execution engine 621 that generates fragments by means of a fragment encoder 630; caches these fragments in a cache 632; and combines these (cached) fragments by means of a stitcher 622 (otherwise known as an assembler) to generate, and subsequently stream, an interactive user interface via the first data communications channel 63 to the client device 60 (as described in U.S. application Ser. No. 12/443,571 (“Method for Streaming Parallel User Sessions, System and Computer Software”)). Optionally, in some embodiments, the interactive user interface is directly encoded by an encoder of server 62 (not shown in FIG. 6A) from pixel data. The interactive user interface may be supplemented by the generation of images 634 that are to be overlain by the client device 60. These images may also be stored in the cache 632 for reuse across sessions in the same way as fragments are reused across sessions. For example, in some implementations, the interactive user interface includes a source video with images from cache 632 overlaid. The images may be sent via a second data communications channel 64 to the I/O ports 601 of client device 60. Additionally and/or alternatively, in some embodiments, server(s) 62 will transmit references to the images (such as Uniform Resource Locators or URLs), as opposed to the images themselves, so that the client can retrieve them on demand (e.g., by means of HTTP). Such embodiments are advantageous, as an intermediate network cache 641, accessible through second data communications channel 64, can be used to store reusable images closer to the client device. The stream received from server 62 is decoded in the decoder 602 and combined with the images received or retrieved from server 62 in the overlay module 603 for display on display device 61, as described in relation to FIG. 1. In some implementations, client device 60 switches between 1) receiving the interactive user interface from the stitcher, and 2) blending the interactive user interface from the stitcher with overlay images.
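
Where the server sends URL references instead of image bytes, on-demand retrieval at the client can be as simple as the following sketch (standard-library HTTP only; the in-process dictionary is a deliberate simplification of the intermediate network cache 641):

```python
import urllib.request

_local_cache = {}

def fetch_overlay(url):
    """Retrieve an overlay image by reference, caching repeat requests
    (illustrative; an intermediate network cache can serve the same role)."""
    if url not in _local_cache:
        with urllib.request.urlopen(url) as resp:
            _local_cache[url] = resp.read()
    return _local_cache[url]
```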



FIG. 6B is a flowchart showing operations of a client device in the system of FIG. 6A. The flow chart is very similar to that of FIG. 2. However, instead of receiving a source video using the first data communications channel, the client device receives (6000) the interactive user interface via the first data communications channel. In some embodiments, the interactive user interface is a video stream, such as an MPEG video stream. Next, a command related to that interactive user interface is transmitted (6010) to the server. The client may subsequently receive (6020) updates to the interactive user interface via the first data communications channel and/or supplemental images from the same server to supplement the interactive user interface. The remaining processes 24 and 25 are the same as those described with respect to FIG. 2.


In some embodiments, the first data communications channel and the second data communications channel are completely independent channels, yet the graphical information transmitted over them is likely to be related. Therefore, special care must be taken when the images are combined with the video stream representing the interactive user interface. A loosely coupled synchronization mechanism, such as, for example, a presentation timestamp and timeout for each image, may be used to synchronize the display of images with the streamed interactive user interface.
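
One possible realization of that loose coupling is sketched below: each supplemental image carries a presentation timestamp and a timeout, and is shown only while the video's timestamp falls inside that window. The field names and timestamp units (e.g., 90 kHz ticks) are assumptions:

```python
def overlay_visible(image_pts, timeout, frame_pts):
    """Show a supplemental image only while the streamed interface's
    presentation timestamp lies inside [image_pts, image_pts + timeout)."""
    return image_pts <= frame_pts < image_pts + timeout
```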



FIG. 7A schematically shows an alternative embodiment of the system described in FIG. 6A. The system disclosed in FIG. 7A is similar to the systems depicted by FIGS. 4 and 6A. Here, the server (specifically, overlay module 724 of server 72), and not the client device, overlays images over the encoded stream. In other words, the blending occurs at the server, as described in relation to FIG. 4.


As illustrated, in some implementations, client device 70 does not include an overlay module. Moreover, as shown, the system of FIG. 7A does not utilize a second data communications channel.


As in the system of FIG. 6A, stitcher 722 generates an interactive user interface by combining fragments, generated by fragment encoder 730, and stored in cache 732. Overlay module 724 overlays images 734 over the resulting interactive user interface received from stitcher 722. As illustrated, client device 70 then receives the encoded stream, which includes interactive user interface and overlay images 734, via a first data communications channel 73 from server 72. In optional implementations, server 72 (or, alternatively, overlay module 724) is configured to switch between transmitting (i) the encoded stream including the interactive user interface and overlay images 734, and (ii) only the interactive user interface. Alternatively, in some implementations, overlay module 724 and stitcher 722 exist and operate as a single component of server 72.



FIG. 7B is a flowchart showing operations of a server in the system of FIG. 7A. The flow chart is very similar to the flow chart in FIG. 6B, but written with respect to a server (e.g., server 72) that is configured (e.g., with overlay module 724) to overlay images. In process 7000, the server transmits the interactive user interface via a first data communications channel. Next, in process 7010, the server receives a command related to the interactive user interface. In process 7020, the server generates an updated interactive user interface. Further, in process 7030, the server blends the updated interactive user interface with supplemental images to generate a blended output frame which, in process 7040, is transmitted towards the client device. As described above, in optional implementations, the server switches between transmitting (i) the blended output frame including the interactive user interface and overlay images, and (ii) the interactive user interface.



FIG. 8A schematically shows an alternative embodiment, similar to the system described in FIG. 6A, in which the supplemental overlay images are sourced from a third party server. The embodiment provides a strict separation between an interactive user interface and information from a third party, by conveying the interactive user interface and information from a third party over separate data communications channels. An example of a system requiring such a separation is a banking application where the interactive user interface is the same for every user, except for account related information that is sent directly to the end user as supplemental images (e.g., supplemental images sent by third party server 85) over a secure data communications channel (e.g., second data communications channel 84).


The system disclosed in FIG. 8A is very similar to the system depicted by FIG. 6A. The main difference is that one or more images originate from a third party server 85, and are sent as supplemental images to client device 80 over second data communications channel 84. In some embodiments, second data communications channel 84 is a secure channel (e.g., a secure transport protocol is used for the images, such as HTTPS). The application may use application logic 834 to liaise with application logic 840 of an application 844 on the third party server 85 via a communication channel 87 to generate one or more images 842 that supplement the interactive user interface with third party information.



FIG. 8B is a flowchart showing operations of a client device in the system of FIG. 8A. The flow chart is similar to the flow chart in FIG. 6B. Here, the device transmits (8020) a request for secure content, and supplemental images are received (8030) from a third party server over a second data communications channel, where, in some embodiments, the second data communications channel uses a secure transport protocol.


The embodiments of the invention described above are intended to be merely exemplary; numerous variations and modifications will be apparent to those skilled in the art. All such variations and modifications are intended to be within the scope of the present invention as defined in any appended claims. For those skilled in the art it will also be evident that it may be beneficial for systems to switch between the embodiments of the invention on demand.


The present invention may be embodied in many different forms, including, but in no way limited to, computer program logic for use with a processor (e.g., a microprocessor, microcontroller, digital signal processor, or general purpose computer), programmable logic for use with a programmable logic device (e.g., a Field Programmable Gate Array (FPGA) or other PLD), discrete components, integrated circuitry (e.g., an Application Specific Integrated Circuit (ASIC)), or any other means including any combination thereof.


Computer program logic implementing all or part of the functionality previously described herein may be embodied in various forms, including, but in no way limited to, a source code form, a computer executable form, and various intermediate forms (e.g., forms generated by an assembler, compiler, linker, or locator). Source code may include a series of computer program instructions implemented in any of various programming languages (e.g., an object code, an assembly language, or a high-level language such as Fortran, C, C++, JAVA, or HTML) for use with various operating systems or operating environments. The source code may define and use various data structures and communication messages. The source code may be in a computer executable form (e.g., via an interpreter), or the source code may be converted (e.g., via a translator, assembler, or compiler) into a computer executable form.


The computer program may be fixed in any form (e.g., source code form, computer executable form, or an intermediate form) either permanently or transitorily in a tangible storage medium, such as a semiconductor memory device (e.g., a RAM, ROM, PROM, EEPROM, or Flash-Programmable RAM), a magnetic memory device (e.g., a diskette or fixed disk), an optical memory device (e.g., a CD-ROM), a PC card (e.g., PCMCIA card), or other memory device. The computer program may be fixed in any form in a signal that is transmittable to a computer using any of various communication technologies, including, but in no way limited to, analog technologies, digital technologies, optical technologies, wireless technologies (e.g., Bluetooth), networking technologies, and internetworking technologies. The computer program may be distributed in any form as a removable storage medium with accompanying printed or electronic documentation (e.g., shrink wrapped software), preloaded with a computer system (e.g., on system ROM or fixed disk), or distributed from a server or electronic bulletin board over the communication system (e.g., the Internet or World Wide Web).


Hardware logic (including programmable logic for use with a programmable logic device) implementing all or part of the functionality previously described herein may be designed using traditional manual methods, or may be designed, captured, simulated, or documented electronically using various tools, such as Computer Aided Design (CAD), a hardware description language (e.g., VHDL or AHDL), or a PLD programming language (e.g., PALASM, ABEL, or CUPL).


Programmable logic may be fixed either permanently or transitorily in a tangible storage medium, such as a semiconductor memory device (e.g., a RAM, ROM, PROM, EEPROM, or Flash-Programmable RAM), a magnetic memory device (e.g., a diskette or fixed disk), an optical memory device (e.g., a CD-ROM), or other memory device. The programmable logic may be fixed in a signal that is transmittable to a computer using any of various communication technologies, including, but in no way limited to, analog technologies, digital technologies, optical technologies, wireless technologies (e.g., Bluetooth), networking technologies, and internetworking technologies. The programmable logic may be distributed as a removable storage medium with accompanying printed or electronic documentation (e.g., shrink wrapped software), preloaded with a computer system (e.g., on system ROM or fixed disk), or distributed from a server or electronic bulletin board over the communication system (e.g., the Internet or World Wide Web).

Claims
  • 1. A method of combining an interactive user interface for generating a blended output that includes the interactive user interface and one or more supplemental images, the method comprising: at a client device remote from a server: receiving a video stream that contains an interactive user interface from the server using a first data communications channel configured to communicate video content; transmitting to the server a command that relates to a user input received through the interactive user interface, wherein the command is a request for secure content; receiving, in response to the transmitting, an updated user interface from the server using the first data communications channel; receiving from a third-party server, in response to the transmitting, one or more supplemental images for supplementing the interactive user interface using a second data communications channel that is different from the first data communications channel and uses a secure transport protocol, wherein each supplemental image of the one or more supplemental images is associated with a corresponding transparency coefficient; blending the updated user interface and the one or more supplemental images according to the transparency coefficient for each supplemental image of the one or more supplemental images to generate a blended output; and transmitting the blended output toward the display device for display thereon.
  • 2. The method according to claim 1, wherein the interactive user interface comprises a source video stitched with user interface content.
  • 3. The method according to claim 1, wherein an encoding specification for the video stream is an MPEG specification, an AVS specification, or a VC-1 specification.
  • 4. The method according to claim 1, wherein the first data communications channel comprises at least one of: quadrature amplitude modulation (QAM) using a cable network infrastructure, user datagram protocol over internet protocol (UDP/IP) using an internet protocol television (IPTV) infrastructure, or hypertext transfer protocol (HTTP) using a public or private internet infrastructure, and the second data communications channel comprises at least one of transmission control protocol over internet protocol (TCP/IP), remote frame buffer (RFB) protocol, and extended remoting technology (XRT) protocol.
  • 5. The method according to claim 1, wherein the blending comprises blending in a spatial domain.
  • 6. A method of providing an interactive user interface for generating a blended output that includes the interactive user interface and one or more supplemental images, the method comprising: at a server remote from a client device: transmitting a video stream that includes an interactive user interface towards the client device using a first data communications channel configured to communicate video content; receiving a command that relates to a user input received through the interactive user interface; generating an updated interactive user interface; blending the updated interactive user interface and one or more supplemental images to generate a blended output frame, wherein each supplemental image of the one or more supplemental images is associated with a corresponding transparency coefficient and the blending is performed according to the transparency coefficient for each supplemental image of the one or more supplemental images; transmitting the blended output frame toward the client device for display on a display device; switching between transmitting the blended output frame and transmitting the updated interactive user interface; and transmitting the updated interactive user interface toward the client device for display on the display device.
  • 7. The method according to claim 6, wherein an encoding specification for the video stream is an MPEG specification, an AVS specification, or a VC-1 specification.
  • 8. The method according to claim 6, wherein the first data communications channel comprises at least one of: quadrature amplitude modulation (QAM) using a cable network infrastructure, user datagram protocol over internet protocol (UDP/IP) using an internet protocol television (IPTV) infrastructure, or hypertext transfer protocol (HTTP) using a public or private internet infrastructure, and wherein an image format of the one or more supplemental images is a bitmap (BMP) file format, a portable network graphics (PNG) file format, a joint photographic experts group (JPEG) file format, or a graphics interchange format (GIF) file format.
  • 9. The method according to claim 6, further comprising first determining that the client device is not capable of overlaying.
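Claims 1, 5, and 6 recite blending an interactive user interface with one or more supplemental images according to a per-image transparency coefficient, with claim 5 placing the blend in the spatial domain. The following C sketch shows one way such a blend could be carried out; it is a minimal illustration under stated assumptions (packed 8-bit RGB buffers and one scalar coefficient per supplemental image), and the function and parameter names are hypothetical rather than taken from the claims.

#include <stdint.h>

/* Minimal spatial-domain "over" blend of one supplemental image onto a
 * user-interface frame. Both buffers are packed 8-bit RGB; alpha is the
 * supplemental image's transparency coefficient in [0.0, 1.0]. The buffer
 * layout and names are illustrative assumptions, not patent details. */
static void blend_supplemental(uint8_t *frame, int frame_w,
                               const uint8_t *img, int img_w, int img_h,
                               int dst_x, int dst_y, float alpha)
{
    for (int y = 0; y < img_h; y++) {
        uint8_t *dst = frame + ((dst_y + y) * frame_w + dst_x) * 3;
        const uint8_t *src = img + y * img_w * 3;
        for (int x = 0; x < img_w * 3; x++) {
            /* out = alpha * supplemental + (1 - alpha) * interface */
            dst[x] = (uint8_t)(alpha * src[x] + (1.0f - alpha) * dst[x] + 0.5f);
        }
    }
}

A routine of this kind could run on either end of the network: at the client after the supplemental images arrive over the second data communications channel (claim 1), or at the server before the blended frame is encoded and transmitted, for example when the server has determined that the client device is not capable of overlaying (claims 6 and 9).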
RELATED APPLICATIONS

This application claims priority to U.S. Provisional Patent Application No. 61/832,069, entitled “Overlay Rendering of User Interface Onto Source Video,” filed Jun. 6, 2013, which is incorporated by reference herein in its entirety.

US Referenced Citations (738)
Number Name Date Kind
3889050 Thompson Jun 1975 A
3934079 Barnhart Jan 1976 A
3997718 Ricketts et al. Dec 1976 A
4002843 Rackman Jan 1977 A
4032972 Saylor Jun 1977 A
4077006 Nicholson Feb 1978 A
4081831 Tang et al. Mar 1978 A
4107734 Percy et al. Aug 1978 A
4107735 Frohbach Aug 1978 A
4145720 Weintraub et al. Mar 1979 A
4168400 de Couasnon et al. Sep 1979 A
4186438 Benson et al. Jan 1980 A
4222068 Thompson Sep 1980 A
4245245 Matsumoto et al. Jan 1981 A
4247106 Jeffers et al. Jan 1981 A
4253114 Tang et al. Feb 1981 A
4264924 Freeman Apr 1981 A
4264925 Freeman et al. Apr 1981 A
4290142 Schnee et al. Sep 1981 A
4302771 Gargini Nov 1981 A
4308554 Percy et al. Dec 1981 A
4350980 Ward Sep 1982 A
4367557 Stern et al. Jan 1983 A
4395780 Gohm et al. Jul 1983 A
4408225 Ensinger et al. Oct 1983 A
4450477 Lovett May 1984 A
4454538 Toriumi Jun 1984 A
4466017 Banker Aug 1984 A
4471380 Mobley Sep 1984 A
4475123 Dumbauld et al. Oct 1984 A
4484217 Block et al. Nov 1984 A
4491983 Pinnow et al. Jan 1985 A
4506387 Walter Mar 1985 A
4507680 Freeman Mar 1985 A
4509073 Baran et al. Apr 1985 A
4523228 Banker Jun 1985 A
4533948 McNamara et al. Aug 1985 A
4536791 Campbell et al. Aug 1985 A
4538174 Gargini et al. Aug 1985 A
4538176 Nakajima et al. Aug 1985 A
4553161 Citta Nov 1985 A
4554581 Tentler et al. Nov 1985 A
4555561 Sugimori et al. Nov 1985 A
4562465 Glaab Dec 1985 A
4567517 Mobley Jan 1986 A
4573072 Freeman Feb 1986 A
4591906 Morales-Garza et al. May 1986 A
4602279 Freeman Jul 1986 A
4614970 Clupper et al. Sep 1986 A
4616263 Eichelberger Oct 1986 A
4625235 Watson Nov 1986 A
4627105 Ohashi et al. Dec 1986 A
4633462 Stifle et al. Dec 1986 A
4670904 Rumreich Jun 1987 A
4682360 Frederiksen Jul 1987 A
4695880 Johnson et al. Sep 1987 A
4706121 Young Nov 1987 A
4706285 Rumreich Nov 1987 A
4709418 Fox et al. Nov 1987 A
4710971 Nozaki et al. Dec 1987 A
4718086 Rumreich et al. Jan 1988 A
4732764 Hemingway et al. Mar 1988 A
4734764 Pocock et al. Mar 1988 A
4748689 Mohr May 1988 A
4749992 Fitzemeyer et al. Jun 1988 A
4750036 Martinez Jun 1988 A
4754426 Rast et al. Jun 1988 A
4760442 O'Connell et al. Jul 1988 A
4763317 Lehman et al. Aug 1988 A
4769833 Farleigh et al. Sep 1988 A
4769838 Hasegawa Sep 1988 A
4789863 Bush Dec 1988 A
4792849 McCalley et al. Dec 1988 A
4801190 Imoto Jan 1989 A
4805134 Calo et al. Feb 1989 A
4807031 Broughton et al. Feb 1989 A
4816905 Tweedy et al. Mar 1989 A
4821102 Ichikawa et al. Apr 1989 A
4823386 Dumbauld et al. Apr 1989 A
4827253 Maltz May 1989 A
4827511 Masuko May 1989 A
4829372 McCalley et al. May 1989 A
4829558 Welsh May 1989 A
4847698 Freeman Jul 1989 A
4847699 Freeman Jul 1989 A
4847700 Freeman Jul 1989 A
4848698 Newell et al. Jul 1989 A
4860379 Schoeneberger et al. Aug 1989 A
4864613 Van Cleave Sep 1989 A
4876592 Von Kohorn Oct 1989 A
4889369 Albrecht Dec 1989 A
4890320 Monslow et al. Dec 1989 A
4891694 Way Jan 1990 A
4901367 Nicholson Feb 1990 A
4903126 Kassatly Feb 1990 A
4905094 Pocock et al. Feb 1990 A
4912760 West, Jr. et al. Mar 1990 A
4918516 Freeman Apr 1990 A
4920566 Robbins et al. Apr 1990 A
4922532 Farmer et al. May 1990 A
4924303 Brandon et al. May 1990 A
4924498 Farmer et al. May 1990 A
4937821 Boulton Jun 1990 A
4941040 Pocock et al. Jul 1990 A
4947244 Fenwick et al. Aug 1990 A
4961211 Tsugane et al. Oct 1990 A
4963995 Lang Oct 1990 A
4975771 Kassatly Dec 1990 A
4989245 Bennett Jan 1991 A
4994909 Graves et al. Feb 1991 A
4995078 Monslow et al. Feb 1991 A
5003384 Durden et al. Mar 1991 A
5008934 Endoh Apr 1991 A
5014125 Pocock et al. May 1991 A
5027400 Baji et al. Jun 1991 A
5051720 Kittirutsunetorn Sep 1991 A
5051822 Rhoades Sep 1991 A
5057917 Shalkauser et al. Oct 1991 A
5058160 Banker et al. Oct 1991 A
5060262 Bevins, Jr. et al. Oct 1991 A
5077607 Johnson et al. Dec 1991 A
5083800 Lockton Jan 1992 A
5088111 McNamara et al. Feb 1992 A
5093718 Hoarty et al. Mar 1992 A
5109414 Harvey et al. Apr 1992 A
5113496 McCalley et al. May 1992 A
5119188 McCalley et al. Jun 1992 A
5130792 Tindell et al. Jul 1992 A
5132992 Yurt et al. Jul 1992 A
5133009 Rumreich Jul 1992 A
5133079 Ballantyne et al. Jul 1992 A
5136411 Paik et al. Aug 1992 A
5142575 Farmer et al. Aug 1992 A
5144448 Hombaker, III et al. Sep 1992 A
5155591 Wachob Oct 1992 A
5172413 Bradley et al. Dec 1992 A
5191410 McCalley et al. Mar 1993 A
5195092 Wilson et al. Mar 1993 A
5208665 McCalley et al. May 1993 A
5220420 Hoarty et al. Jun 1993 A
5230019 Yanagimichi et al. Jul 1993 A
5231494 Wachob Jul 1993 A
5236199 Thompson, Jr. Aug 1993 A
5247347 Litteral et al. Sep 1993 A
5253341 Rozmanith et al. Oct 1993 A
5262854 Ng Nov 1993 A
5262860 Fitzpatrick et al. Nov 1993 A
5303388 Kreitman et al. Apr 1994 A
5319455 Hoarty et al. Jun 1994 A
5319707 Wasilewski et al. Jun 1994 A
5321440 Yanagihara et al. Jun 1994 A
5321514 Martinez Jun 1994 A
5351129 Lai Sep 1994 A
5355162 Yazolino et al. Oct 1994 A
5359601 Wasilewski et al. Oct 1994 A
5361091 Hoarty et al. Nov 1994 A
5371532 Gelman et al. Dec 1994 A
5404393 Remillard Apr 1995 A
5408274 Chang et al. Apr 1995 A
5410343 Coddington et al. Apr 1995 A
5410344 Graves et al. Apr 1995 A
5412415 Cook et al. May 1995 A
5412720 Hoarty May 1995 A
5418559 Blahut May 1995 A
5422674 Hooper et al. Jun 1995 A
5422887 Diepstraten et al. Jun 1995 A
5442389 Blahut et al. Aug 1995 A
5442390 Hooper et al. Aug 1995 A
5442700 Snell et al. Aug 1995 A
5446490 Blahut et al. Aug 1995 A
5469283 Vinel et al. Nov 1995 A
5469431 Wendorf et al. Nov 1995 A
5471263 Odaka Nov 1995 A
5481542 Logston et al. Jan 1996 A
5485197 Hoarty Jan 1996 A
5487066 McNamara et al. Jan 1996 A
5493638 Hooper et al. Feb 1996 A
5495283 Cowe Feb 1996 A
5495295 Long Feb 1996 A
5497187 Banker et al. Mar 1996 A
5517250 Hoogenboom et al. May 1996 A
5526034 Hoarty et al. Jun 1996 A
5528281 Grady et al. Jun 1996 A
5537397 Abramson Jul 1996 A
5537404 Bentley et al. Jul 1996 A
5539449 Blahut et al. Jul 1996 A
RE35314 Logg Aug 1996 E
5548340 Bertram Aug 1996 A
5550578 Hoarty et al. Aug 1996 A
5557316 Hoarty et al. Sep 1996 A
5559549 Hendricks et al. Sep 1996 A
5561708 Remillard Oct 1996 A
5570126 Blahut et al. Oct 1996 A
5570363 Holm Oct 1996 A
5579143 Huber Nov 1996 A
5581653 Todd Dec 1996 A
5583927 Ely et al. Dec 1996 A
5587734 Lauder et al. Dec 1996 A
5589885 Ooi Dec 1996 A
5592470 Rudrapatna et al. Jan 1997 A
5594507 Hoarty Jan 1997 A
5594723 Tibi Jan 1997 A
5594938 Engel Jan 1997 A
5596693 Needle et al. Jan 1997 A
5600364 Hendricks et al. Feb 1997 A
5600573 Hendricks et al. Feb 1997 A
5608446 Carr et al. Mar 1997 A
5617145 Huang et al. Apr 1997 A
5621464 Teo et al. Apr 1997 A
5625404 Grady et al. Apr 1997 A
5630757 Gagin et al. May 1997 A
5631693 Wunderlich et al. May 1997 A
5631846 Szurkowski May 1997 A
5632003 Davidson et al. May 1997 A
5649283 Galler et al. Jul 1997 A
5668592 Spaulding, II Sep 1997 A
5668599 Cheney et al. Sep 1997 A
5708767 Yeo et al. Jan 1998 A
5710815 Ming et al. Jan 1998 A
5712906 Grady et al. Jan 1998 A
5740307 Lane Apr 1998 A
5742289 Naylor et al. Apr 1998 A
5748234 Lippincott May 1998 A
5754941 Sharpe et al. May 1998 A
5786527 Tarte Jul 1998 A
5790174 Richard, III et al. Aug 1998 A
5802283 Grady et al. Sep 1998 A
5812665 Hoarty et al. Sep 1998 A
5812786 Seazholtz et al. Sep 1998 A
5815604 Simons et al. Sep 1998 A
5818438 Howe et al. Oct 1998 A
5821945 Yeo et al. Oct 1998 A
5822537 Katseff et al. Oct 1998 A
5828371 Cline et al. Oct 1998 A
5844594 Ferguson Dec 1998 A
5845083 Hamadani et al. Dec 1998 A
5862325 Reed et al. Jan 1999 A
5864820 Case Jan 1999 A
5867208 McLaren Feb 1999 A
5883661 Hoarty Mar 1999 A
5903727 Nielsen May 1999 A
5903816 Broadwin et al. May 1999 A
5905522 Lawler May 1999 A
5907681 Bates et al. May 1999 A
5917822 Lyles et al. Jun 1999 A
5946352 Rowlands et al. Aug 1999 A
5952943 Walsh et al. Sep 1999 A
5959690 Toebes et al. Sep 1999 A
5961603 Kunkel et al. Oct 1999 A
5963203 Goldberg et al. Oct 1999 A
5966163 Lin et al. Oct 1999 A
5978756 Walker et al. Nov 1999 A
5982445 Eyer et al. Nov 1999 A
5990862 Lewis Nov 1999 A
5995146 Rasmussen Nov 1999 A
5995488 Kalkunte et al. Nov 1999 A
5999970 Krisbergh et al. Dec 1999 A
6014416 Shin et al. Jan 2000 A
6021386 Davis et al. Feb 2000 A
6031989 Cordell Feb 2000 A
6034678 Hoarty et al. Mar 2000 A
6049539 Lee et al. Apr 2000 A
6049831 Gardell et al. Apr 2000 A
6052555 Ferguson Apr 2000 A
6055314 Spies et al. Apr 2000 A
6055315 Doyle et al. Apr 2000 A
6064377 Hoarty et al. May 2000 A
6078328 Schumann et al. Jun 2000 A
6084908 Chiang et al. Jul 2000 A
6100883 Hoarty Aug 2000 A
6108625 Kim Aug 2000 A
6131182 Beakes et al. Oct 2000 A
6141645 Chi-Min et al. Oct 2000 A
6141693 Perlman et al. Oct 2000 A
6144698 Poon et al. Nov 2000 A
6167084 Wang et al. Dec 2000 A
6169573 Sampath-Kumar et al. Jan 2001 B1
6177931 Alexander et al. Jan 2001 B1
6182072 Leak et al. Jan 2001 B1
6184878 Alonso et al. Feb 2001 B1
6192081 Chiang et al. Feb 2001 B1
6198822 Doyle et al. Mar 2001 B1
6205582 Hoarty Mar 2001 B1
6226041 Florencio et al. May 2001 B1
6236730 Cowieson et al. May 2001 B1
6243418 Kim Jun 2001 B1
6253238 Lauder et al. Jun 2001 B1
6256047 Isobe et al. Jul 2001 B1
6259826 Pollard et al. Jul 2001 B1
6266369 Wang et al. Jul 2001 B1
6266684 Kraus et al. Jul 2001 B1
6275496 Burns et al. Aug 2001 B1
6292194 Powell, III Sep 2001 B1
6305020 Hoarty et al. Oct 2001 B1
6317151 Ohsuga et al. Nov 2001 B1
6317885 Fries Nov 2001 B1
6349284 Park et al. Feb 2002 B1
6385771 Gordon May 2002 B1
6386980 Nishino et al. May 2002 B1
6389075 Wang et al. May 2002 B2
6389218 Gordon et al. May 2002 B2
6415031 Colligan et al. Jul 2002 B1
6415437 Ludvig et al. Jul 2002 B1
6438140 Jungers et al. Aug 2002 B1
6446037 Fielder et al. Sep 2002 B1
6459427 Mao et al. Oct 2002 B1
6477182 Calderone Nov 2002 B2
6480210 Martino et al. Nov 2002 B1
6481012 Gordon et al. Nov 2002 B1
6512793 Maeda Jan 2003 B1
6525746 Lau et al. Feb 2003 B1
6536043 Guedalia Mar 2003 B1
6557041 Mallart Apr 2003 B2
6560496 Michener May 2003 B1
6564378 Satterfield et al. May 2003 B1
6578201 LaRocca et al. Jun 2003 B1
6579184 Tanskanen Jun 2003 B1
6584153 Gordon et al. Jun 2003 B1
6588017 Calderone Jul 2003 B1
6598229 Smyth et al. Jul 2003 B2
6604224 Armstrong et al. Aug 2003 B1
6614442 Ouyang et al. Sep 2003 B1
6621870 Gordon et al. Sep 2003 B1
6625574 Taniguchi et al. Sep 2003 B1
6639896 Goode et al. Oct 2003 B1
6645076 Sugai Nov 2003 B1
6651252 Gordon et al. Nov 2003 B1
6657647 Bright Dec 2003 B1
6675385 Wang Jan 2004 B1
6675387 Boucher Jan 2004 B1
6681326 Son et al. Jan 2004 B2
6681397 Tsai et al. Jan 2004 B1
6684400 Goode et al. Jan 2004 B1
6687663 McGrath et al. Feb 2004 B1
6691208 Dandrea et al. Feb 2004 B2
6697376 Son et al. Feb 2004 B1
6704359 Bayrakeri et al. Mar 2004 B1
6717600 Dutta et al. Apr 2004 B2
6718552 Goode Apr 2004 B1
6721794 Taylor et al. Apr 2004 B2
6721956 Wasilewski Apr 2004 B2
6727929 Bates et al. Apr 2004 B1
6731605 Deshpande May 2004 B1
6732370 Gordon et al. May 2004 B1
6747991 Hemy et al. Jun 2004 B1
6754271 Gordon et al. Jun 2004 B1
6754905 Gordon et al. Jun 2004 B2
6758540 Adolph et al. Jul 2004 B1
6766407 Lisitsa et al. Jul 2004 B1
6771704 Hannah Aug 2004 B1
6785902 Zigmond et al. Aug 2004 B1
6807528 Truman et al. Oct 2004 B1
6810528 Chatani Oct 2004 B1
6813690 Lango et al. Nov 2004 B1
6817947 Tanskanen Nov 2004 B2
6886178 Mao et al. Apr 2005 B1
6907574 Xu et al. Jun 2005 B2
6931291 Alvarez-Tinoco et al. Aug 2005 B1
6941019 Mitchell et al. Sep 2005 B1
6941574 Broadwin et al. Sep 2005 B1
6947509 Wong Sep 2005 B1
6952221 Holtz et al. Oct 2005 B1
6956899 Hall et al. Oct 2005 B2
7016540 Gong et al. Mar 2006 B1
7030890 Jouet et al. Apr 2006 B1
7031385 Inoue et al. Apr 2006 B1
7050113 Campisano et al. May 2006 B2
7089577 Rakib et al. Aug 2006 B1
7093028 Shao et al. Aug 2006 B1
7095402 Kunil et al. Aug 2006 B2
7114167 Slemmer et al. Sep 2006 B2
7146615 Hervet et al. Dec 2006 B1
7151782 Oz et al. Dec 2006 B1
7158676 Rainsford Jan 2007 B1
7200836 Brodersen et al. Apr 2007 B2
7212573 Winger May 2007 B2
7224731 Mehrotra May 2007 B2
7272556 Aguilar et al. Sep 2007 B1
7310619 Baar et al. Dec 2007 B2
7325043 Rosenberg et al. Jan 2008 B1
7346111 Winger et al. Mar 2008 B2
7360230 Paz et al. Apr 2008 B1
7412423 Asano Aug 2008 B1
7412505 Slemmer et al. Aug 2008 B2
7421082 Kamiya et al. Sep 2008 B2
7444306 Varble Oct 2008 B2
7444418 Chou et al. Oct 2008 B2
7500235 Maynard et al. Mar 2009 B2
7508941 O'Toole, Jr. et al. Mar 2009 B1
7512577 Slemmer et al. Mar 2009 B2
7543073 Chou et al. Jun 2009 B2
7596764 Vienneau et al. Sep 2009 B2
7623575 Winger Nov 2009 B2
7669220 Goode Feb 2010 B2
7742609 Yeakel et al. Jun 2010 B2
7743400 Kurauchi Jun 2010 B2
7751572 Villemoes et al. Jul 2010 B2
7757157 Fukuda Jul 2010 B1
7830388 Lu Nov 2010 B1
7840905 Weber et al. Nov 2010 B1
7936819 Craig et al. May 2011 B2
7941645 Riach et al. May 2011 B1
7970263 Asch Jun 2011 B1
7987489 Krzyzanowski et al. Jul 2011 B2
8027353 Damola et al. Sep 2011 B2
8036271 Winger et al. Oct 2011 B2
8046798 Schlack et al. Oct 2011 B1
8074248 Sigmon et al. Dec 2011 B2
8118676 Craig et al. Feb 2012 B2
8136033 Bhargava et al. Mar 2012 B1
8149917 Zhang et al. Apr 2012 B2
8155194 Winger et al. Apr 2012 B2
8155202 Landau Apr 2012 B2
8170107 Winger May 2012 B2
8194862 Herr et al. Jun 2012 B2
8243630 Luo et al. Aug 2012 B2
8270439 Herr et al. Sep 2012 B2
8284842 Craig et al. Oct 2012 B2
8296424 Malloy et al. Oct 2012 B2
8370869 Paek et al. Feb 2013 B2
8411754 Zhang et al. Apr 2013 B2
8442110 Pavlovskaia et al. May 2013 B2
8473996 Gordon et al. Jun 2013 B2
8619867 Craig et al. Dec 2013 B2
8621500 Weaver et al. Dec 2013 B2
8656430 Doyle Feb 2014 B2
20010008845 Kusuda et al. Jul 2001 A1
20010049301 Masuda et al. Dec 2001 A1
20020007491 Schiller et al. Jan 2002 A1
20020013812 Krueger et al. Jan 2002 A1
20020016161 Dellien et al. Feb 2002 A1
20020021353 DeNies Feb 2002 A1
20020026642 Augenbraun et al. Feb 2002 A1
20020027567 Niamir Mar 2002 A1
20020032697 French et al. Mar 2002 A1
20020040482 Sextro et al. Apr 2002 A1
20020047899 Son et al. Apr 2002 A1
20020049975 Thomas et al. Apr 2002 A1
20020054578 Zhang et al. May 2002 A1
20020056083 Istvan May 2002 A1
20020056107 Schlack May 2002 A1
20020056136 Wistendahl et al. May 2002 A1
20020059644 Andrade et al. May 2002 A1
20020062484 De Lange et al. May 2002 A1
20020067766 Sakamoto et al. Jun 2002 A1
20020069267 Thiele Jun 2002 A1
20020072408 Kumagai Jun 2002 A1
20020078171 Schneider Jun 2002 A1
20020078456 Hudson et al. Jun 2002 A1
20020083464 Tomsen et al. Jun 2002 A1
20020095689 Novak Jul 2002 A1
20020105531 Niemi Aug 2002 A1
20020108121 Alao et al. Aug 2002 A1
20020131511 Zenoni Sep 2002 A1
20020136298 Anantharamu et al. Sep 2002 A1
20020152318 Menon et al. Oct 2002 A1
20020171765 Waki et al. Nov 2002 A1
20020175931 Holtz et al. Nov 2002 A1
20020178447 Plotnick et al. Nov 2002 A1
20020188628 Cooper et al. Dec 2002 A1
20020191851 Keinan Dec 2002 A1
20020194592 Tsuchida et al. Dec 2002 A1
20020196746 Allen Dec 2002 A1
20030018796 Chou et al. Jan 2003 A1
20030020671 Santoro et al. Jan 2003 A1
20030027517 Callway et al. Feb 2003 A1
20030035486 Kato et al. Feb 2003 A1
20030038893 Rajamaki et al. Feb 2003 A1
20030039398 McIntyre Feb 2003 A1
20030046690 Miller Mar 2003 A1
20030051253 Barone, Jr. Mar 2003 A1
20030058941 Chen et al. Mar 2003 A1
20030061451 Beyda Mar 2003 A1
20030065739 Shnier Apr 2003 A1
20030071792 Safadi Apr 2003 A1
20030072372 Shen et al. Apr 2003 A1
20030076546 Johnson et al. Apr 2003 A1
20030088328 Nishio et al. May 2003 A1
20030088400 Nishio et al. May 2003 A1
20030095790 Joshi May 2003 A1
20030107443 Yamamoto Jun 2003 A1
20030122836 Doyle et al. Jul 2003 A1
20030123664 Pedlow, Jr. et al. Jul 2003 A1
20030126608 Safadi et al. Jul 2003 A1
20030126611 Chernock et al. Jul 2003 A1
20030131349 Kuczynski-Brown Jul 2003 A1
20030135860 Dureau Jul 2003 A1
20030169373 Peters et al. Sep 2003 A1
20030177199 Zenoni Sep 2003 A1
20030188309 Yuen Oct 2003 A1
20030189980 Dvir et al. Oct 2003 A1
20030196174 Pierre Cote et al. Oct 2003 A1
20030208768 Urdang et al. Nov 2003 A1
20030229719 Iwata et al. Dec 2003 A1
20030229900 Reisman Dec 2003 A1
20030231218 Amadio Dec 2003 A1
20040016000 Zhang et al. Jan 2004 A1
20040034873 Zenoni Feb 2004 A1
20040040035 Carlucci et al. Feb 2004 A1
20040055007 Allport Mar 2004 A1
20040078822 Breen et al. Apr 2004 A1
20040088375 Sethi et al. May 2004 A1
20040091171 Bone May 2004 A1
20040111526 Baldwin et al. Jun 2004 A1
20040117827 Karaoguz et al. Jun 2004 A1
20040128686 Boyer et al. Jul 2004 A1
20040133704 Krzyzanowski et al. Jul 2004 A1
20040136698 Mock Jul 2004 A1
20040139158 Datta Jul 2004 A1
20040157662 Tsuchiya Aug 2004 A1
20040163101 Swix et al. Aug 2004 A1
20040184542 Fujimoto Sep 2004 A1
20040193648 Lai et al. Sep 2004 A1
20040210824 Shoff et al. Oct 2004 A1
20040261106 Hoffman Dec 2004 A1
20040261114 Addington et al. Dec 2004 A1
20040268419 Danker et al. Dec 2004 A1
20050015259 Thumpudi et al. Jan 2005 A1
20050015816 Christofalo et al. Jan 2005 A1
20050021830 Urzaiz et al. Jan 2005 A1
20050034155 Gordon et al. Feb 2005 A1
20050034162 White et al. Feb 2005 A1
20050044575 Der Kuyl Feb 2005 A1
20050055685 Maynard et al. Mar 2005 A1
20050055721 Zigmond et al. Mar 2005 A1
20050071876 van Beek Mar 2005 A1
20050076134 Bialik et al. Apr 2005 A1
20050089091 Kim et al. Apr 2005 A1
20050091690 Delpuch et al. Apr 2005 A1
20050091695 Paz et al. Apr 2005 A1
20050105608 Coleman et al. May 2005 A1
20050114906 Hoarty et al. May 2005 A1
20050132305 Guichard et al. Jun 2005 A1
20050135385 Jenkins et al. Jun 2005 A1
20050141613 Kelly et al. Jun 2005 A1
20050149988 Grannan Jul 2005 A1
20050155063 Bayrakeri et al. Jul 2005 A1
20050160088 Scallan et al. Jul 2005 A1
20050166257 Feinleib et al. Jul 2005 A1
20050180502 Puri Aug 2005 A1
20050198682 Wright Sep 2005 A1
20050213586 Cyganski et al. Sep 2005 A1
20050216933 Black Sep 2005 A1
20050216940 Black Sep 2005 A1
20050226426 Oomen et al. Oct 2005 A1
20050273832 Zigmond et al. Dec 2005 A1
20050283741 Balabanovic et al. Dec 2005 A1
20060001737 Dawson et al. Jan 2006 A1
20060020960 Relan et al. Jan 2006 A1
20060020994 Crane et al. Jan 2006 A1
20060031906 Kaneda Feb 2006 A1
20060039481 Shen et al. Feb 2006 A1
20060041910 Hatanaka et al. Feb 2006 A1
20060088105 Shen et al. Apr 2006 A1
20060095944 Demircin et al. May 2006 A1
20060112338 Joung et al. May 2006 A1
20060117340 Pavlovskaia et al. Jun 2006 A1
20060143678 Cho et al. Jun 2006 A1
20060161538 Kiilerich Jul 2006 A1
20060173985 Moore Aug 2006 A1
20060174026 Robinson et al. Aug 2006 A1
20060174289 Theberge Aug 2006 A1
20060195884 van Zoest et al. Aug 2006 A1
20060203913 Kim et al. Sep 2006 A1
20060212203 Furuno Sep 2006 A1
20060218601 Michel Sep 2006 A1
20060230428 Craig et al. Oct 2006 A1
20060242570 Croft et al. Oct 2006 A1
20060256865 Westerman Nov 2006 A1
20060269086 Page et al. Nov 2006 A1
20060271985 Hoffman et al. Nov 2006 A1
20060285586 Westerman Dec 2006 A1
20060285819 Kelly et al. Dec 2006 A1
20070009035 Craig et al. Jan 2007 A1
20070009036 Craig et al. Jan 2007 A1
20070009042 Craig Jan 2007 A1
20070011702 Vaysman Jan 2007 A1
20070025639 Zhou et al. Feb 2007 A1
20070033528 Merrit et al. Feb 2007 A1
20070033631 Gordon et al. Feb 2007 A1
20070074251 Oguz et al. Mar 2007 A1
20070079325 de Heer Apr 2007 A1
20070115941 Patel et al. May 2007 A1
20070124282 Wittkotter May 2007 A1
20070124795 McKissick et al. May 2007 A1
20070130446 Minakami Jun 2007 A1
20070130592 Haeusel Jun 2007 A1
20070152984 Ording et al. Jul 2007 A1
20070162953 Bollinger et al. Jul 2007 A1
20070172061 Pinder Jul 2007 A1
20070174790 Jing et al. Jul 2007 A1
20070178243 Dong et al. Aug 2007 A1
20070234220 Khan et al. Oct 2007 A1
20070237232 Chang et al. Oct 2007 A1
20070300280 Turner et al. Dec 2007 A1
20080046928 Poling et al. Feb 2008 A1
20080052742 Kopf et al. Feb 2008 A1
20080066135 Brodersen et al. Mar 2008 A1
20080084503 Kondo Apr 2008 A1
20080086688 Chandratillake et al. Apr 2008 A1
20080086747 Rasanen et al. Apr 2008 A1
20080094368 Ording et al. Apr 2008 A1
20080097953 Levy et al. Apr 2008 A1
20080098450 Wu et al. Apr 2008 A1
20080104520 Swenson et al. May 2008 A1
20080127255 Ress et al. May 2008 A1
20080154583 Goto et al. Jun 2008 A1
20080163059 Craner Jul 2008 A1
20080163286 Rudolph et al. Jul 2008 A1
20080170619 Landau Jul 2008 A1
20080170622 Gordon et al. Jul 2008 A1
20080178125 Elsbree et al. Jul 2008 A1
20080178243 Dong et al. Jul 2008 A1
20080178249 Gordon et al. Jul 2008 A1
20080181221 Kampmann et al. Jul 2008 A1
20080184120 O-Brien-Strain et al. Jul 2008 A1
20080189740 Carpenter et al. Aug 2008 A1
20080195573 Onoda et al. Aug 2008 A1
20080201736 Gordon et al. Aug 2008 A1
20080212942 Gordon et al. Sep 2008 A1
20080222199 Tiu et al. Sep 2008 A1
20080232452 Sullivan et al. Sep 2008 A1
20080243918 Holtman Oct 2008 A1
20080243998 Oh et al. Oct 2008 A1
20080246759 Summers Oct 2008 A1
20080253440 Srinivasan et al. Oct 2008 A1
20080271080 Gossweiler et al. Oct 2008 A1
20090003446 Wu et al. Jan 2009 A1
20090003705 Zou et al. Jan 2009 A1
20090007199 La Joie Jan 2009 A1
20090025027 Craner Jan 2009 A1
20090031341 Schlack et al. Jan 2009 A1
20090041118 Pavlovskaia et al. Feb 2009 A1
20090083781 Yang et al. Mar 2009 A1
20090083813 Dolce et al. Mar 2009 A1
20090083824 McCarthy et al. Mar 2009 A1
20090089188 Ku et al. Apr 2009 A1
20090094113 Berry et al. Apr 2009 A1
20090094646 Walter et al. Apr 2009 A1
20090100465 Kulakowski Apr 2009 A1
20090100489 Strothmann Apr 2009 A1
20090106269 Zuckerman et al. Apr 2009 A1
20090106386 Zuckerman et al. Apr 2009 A1
20090106392 Zuckerman et al. Apr 2009 A1
20090106425 Zuckerman et al. Apr 2009 A1
20090106441 Zuckerman et al. Apr 2009 A1
20090106451 Zuckerman et al. Apr 2009 A1
20090106511 Zuckerman et al. Apr 2009 A1
20090113009 Slemmer et al. Apr 2009 A1
20090132942 Santoro et al. May 2009 A1
20090138966 Krause et al. May 2009 A1
20090144781 Glaser et al. Jun 2009 A1
20090146779 Kumar et al. Jun 2009 A1
20090157868 Chaudhry Jun 2009 A1
20090158369 Van Vleck et al. Jun 2009 A1
20090160694 Di Flora Jun 2009 A1
20090172757 Aldrey et al. Jul 2009 A1
20090178098 Westbrook et al. Jul 2009 A1
20090183219 Maynard et al. Jul 2009 A1
20090189890 Corbett et al. Jul 2009 A1
20090193452 Russ et al. Jul 2009 A1
20090196346 Zhang et al. Aug 2009 A1
20090204920 Beverley et al. Aug 2009 A1
20090210899 Lawrence-Apfelbaum et al. Aug 2009 A1
20090225790 Shay et al. Sep 2009 A1
20090228620 Thomas et al. Sep 2009 A1
20090228922 Haj-Khalil et al. Sep 2009 A1
20090233593 Ergen et al. Sep 2009 A1
20090251478 Maillot et al. Oct 2009 A1
20090254960 Yarom et al. Oct 2009 A1
20090265617 Randall et al. Oct 2009 A1
20090271512 Jorgensen Oct 2009 A1
20090271818 Schlack Oct 2009 A1
20090298535 Klein et al. Dec 2009 A1
20090313674 Ludvig et al. Dec 2009 A1
20090328109 Pavlovskaia et al. Dec 2009 A1
20100033638 O'Donnell et al. Feb 2010 A1
20100035682 Gentile et al. Feb 2010 A1
20100058404 Rouse Mar 2010 A1
20100067571 White et al. Mar 2010 A1
20100077441 Thomas et al. Mar 2010 A1
20100104021 Schmit Apr 2010 A1
20100115573 Srinivasan et al. May 2010 A1
20100118972 Zhang et al. May 2010 A1
20100131996 Gauld May 2010 A1
20100146139 Brockmann Jun 2010 A1
20100158109 Dahlby et al. Jun 2010 A1
20100161825 Ronca et al. Jun 2010 A1
20100166071 Wu et al. Jul 2010 A1
20100174776 Westberg et al. Jul 2010 A1
20100175080 Yuen et al. Jul 2010 A1
20100180307 Hayes et al. Jul 2010 A1
20100211983 Chou Aug 2010 A1
20100226428 Thevathasan et al. Sep 2010 A1
20100235861 Schein et al. Sep 2010 A1
20100242073 Gordon et al. Sep 2010 A1
20100251167 DeLuca et al. Sep 2010 A1
20100254370 Jana et al. Oct 2010 A1
20100265344 Velarde et al. Oct 2010 A1
20100325655 Perez Dec 2010 A1
20100325668 Young et al. Dec 2010 A1
20110002376 Ahmed et al. Jan 2011 A1
20110002470 Purnhagen et al. Jan 2011 A1
20110023069 Dowens Jan 2011 A1
20110035227 Lee et al. Feb 2011 A1
20110067061 Karaoguz et al. Mar 2011 A1
20110096828 Chen et al. Apr 2011 A1
20110107375 Stahl et al. May 2011 A1
20110110642 Salomons et al. May 2011 A1
20110150421 Sasaki et al. Jun 2011 A1
20110153776 Opala et al. Jun 2011 A1
20110167468 Lee et al. Jul 2011 A1
20110191684 Greenberg Aug 2011 A1
20110231878 Hunter et al. Sep 2011 A1
20110243024 Osterling et al. Oct 2011 A1
20110258584 Williams et al. Oct 2011 A1
20110283304 Roberts et al. Nov 2011 A1
20110289536 Poder et al. Nov 2011 A1
20110296312 Boyer et al. Dec 2011 A1
20110317982 Xu et al. Dec 2011 A1
20120023126 Jin et al. Jan 2012 A1
20120030212 Koopmans et al. Feb 2012 A1
20120137337 Sigmon et al. May 2012 A1
20120204217 Regis et al. Aug 2012 A1
20120209815 Carson et al. Aug 2012 A1
20120224641 Haberman et al. Sep 2012 A1
20120257671 Brockmann et al. Oct 2012 A1
20130003826 Craig et al. Jan 2013 A1
20130071095 Chauvier et al. Mar 2013 A1
20130086610 Brockmann Apr 2013 A1
20130179787 Brockmann et al. Jul 2013 A1
20130198776 Brockmann Aug 2013 A1
20130254308 Rose et al. Sep 2013 A1
20130272394 Brockmann et al. Oct 2013 A1
20130304818 Brumleve et al. Nov 2013 A1
20140033036 Gaur et al. Jan 2014 A1
20140081954 Elizarov Mar 2014 A1
20140267074 Balci et al. Sep 2014 A1
Foreign Referenced Citations (319)
Number Date Country
191599 Apr 2000 AT
198969 Feb 2001 AT
250313 Oct 2003 AT
472152 Jul 2010 AT
475266 Aug 2010 AT
199060189 Nov 1990 AU
620735 Feb 1992 AU
199184838 Apr 1992 AU
643828 Nov 1993 AU
2004253127 Jan 2005 AU
2005278122 Mar 2006 AU
2010339376 Aug 2012 AU
2011249132 Nov 2012 AU
2011258972 Nov 2012 AU
2011315950 May 2013 AU
682776 Mar 1964 CA
2052477 Mar 1992 CA
1302554 Jun 1992 CA
2163500 May 1996 CA
2231391 May 1997 CA
2273365 Jun 1998 CA
2313133 Jun 1999 CA
2313161 Jun 1999 CA
2528499 Jan 2005 CA
2569407 Mar 2006 CA
2728797 Apr 2010 CA
2787913 Jul 2011 CA
2798541 Dec 2011 CA
2814070 Apr 2012 CA
1507751 Jun 2004 CN
1969555 May 2007 CN
101180109 May 2008 CN
101627424 Jan 2010 CN
101637023 Jan 2010 CN
102007773 Apr 2011 CN
103647980 Mar 2014 CN
4408355 Oct 1994 DE
69516139 D1 Dec 2000 DE
69132518 D1 Sep 2001 DE
69333207 D1 Jul 2004 DE
98961961 Aug 2007 DE
602008001596 Aug 2010 DE
602006015650 Sep 2010 DE
0128771 Dec 1984 EP
0419137 Mar 1991 EP
0449633 Oct 1991 EP
0477786 Apr 1992 EP
0523618 Jan 1993 EP
0534139 Mar 1993 EP
0568453 Nov 1993 EP
0588653 Mar 1994 EP
0594350 Apr 1994 EP
0612916 Aug 1994 EP
0624039 Nov 1994 EP
0638219 Feb 1995 EP
0643523 Mar 1995 EP
0661888 Jul 1995 EP
0714684 Jun 1996 EP
0746158 Dec 1996 EP
0761066 Mar 1997 EP
0789972 Aug 1997 EP
0830786 Mar 1998 EP
0861560 Sep 1998 EP
0 881 808 Dec 1998 EP
0933966 Aug 1999 EP
0933966 Aug 1999 EP
1026872 Aug 2000 EP
1038397 Sep 2000 EP
1038399 Sep 2000 EP
1038400 Sep 2000 EP
1038401 Sep 2000 EP
1051039 Nov 2000 EP
1055331 Nov 2000 EP
1120968 Aug 2001 EP
1345446 Sep 2003 EP
1422929 May 2004 EP
1428562 Jun 2004 EP
1521476 Apr 2005 EP
1645115 Apr 2006 EP
1725044 Nov 2006 EP
1767708 Mar 2007 EP
1771003 Apr 2007 EP
1772014 Apr 2007 EP
1877150 Jan 2008 EP
1887148 Feb 2008 EP
1900200 Mar 2008 EP
1902583 Mar 2008 EP
1908293 Apr 2008 EP
1911288 Apr 2008 EP
1918802 May 2008 EP
2100296 Sep 2009 EP
2105019 Sep 2009 EP
2106665 Oct 2009 EP
2116051 Nov 2009 EP
2124440 Nov 2009 EP
2248341 Nov 2010 EP
2269377 Jan 2011 EP
2271098 Jan 2011 EP
2304953 Apr 2011 EP
2364019 Sep 2011 EP
2384001 Nov 2011 EP
2409493 Jan 2012 EP
2477414 Jul 2012 EP
2487919 Aug 2012 EP
2520090 Nov 2012 EP
2567545 Mar 2013 EP
2577437 Apr 2013 EP
2628306 Aug 2013 EP
2632164 Aug 2013 EP
2632165 Aug 2013 EP
2695388 Feb 2014 EP
2207635 Jun 2004 ES
8211463 Jun 1982 FR
2529739 Jan 1984 FR
2891098 Mar 2007 FR
2207838 Feb 1989 GB
2248955 Apr 1992 GB
2290204 Dec 1995 GB
2365649 Feb 2002 GB
2378345 Feb 2003 GB
1134855 Oct 2010 HK
1116323 Dec 2010 HK
19913397 Apr 1992 IE
99586 Feb 1998 IL
215133 Dec 2011 IL
222829 Dec 2012 IL
222830 Dec 2012 IL
225525 Jun 2013 IL
180215 Jan 1998 IN
200701744 Nov 2007 IN
200900856 May 2009 IN
200800214 Jun 2009 IN
3759 Mar 1992 IS
60-054324 Mar 1985 JP
63-033988 Feb 1988 JP
63-263985 Oct 1988 JP
2001-241993 Sep 1989 JP
04-373286 Dec 1992 JP
06-054324 Feb 1994 JP
7015720 Jan 1995 JP
7-160292 Jun 1995 JP
7160292 Jun 1995 JP
8095599 Apr 1996 JP
8-265704 Oct 1996 JP
8265704 Oct 1996 JP
10-228437 Aug 1998 JP
11-134273 May 1999 JP
H11-261966 Sep 1999 JP
2000-152234 May 2000 JP
2001-203995 Jul 2001 JP
2001-245271 Sep 2001 JP
2001-245291 Sep 2001 JP
2001-514471 Sep 2001 JP
2002-016920 Jan 2002 JP
2002-057952 Feb 2002 JP
2002-112220 Apr 2002 JP
2002-141810 May 2002 JP
2002-208027 Jul 2002 JP
2002-319991 Oct 2002 JP
2003-506763 Feb 2003 JP
2003-087785 Mar 2003 JP
2004-501445 Jan 2004 JP
2004-056777 Feb 2004 JP
2004-110850 Apr 2004 JP
2004-112441 Apr 2004 JP
2004-135932 May 2004 JP
2004-264812 Sep 2004 JP
2004-312283 Nov 2004 JP
2004-533736 Nov 2004 JP
2004-536381 Dec 2004 JP
2004-536681 Dec 2004 JP
2005-033741 Feb 2005 JP
2005-084987 Mar 2005 JP
2005-095599 Mar 2005 JP
8-095599 Apr 2005 JP
2005-156996 Jun 2005 JP
2005-519382 Jun 2005 JP
2005-523479 Aug 2005 JP
2005-309752 Nov 2005 JP
2006-067280 Mar 2006 JP
2006-512838 Apr 2006 JP
2007-129296 May 2007 JP
2007-522727 Aug 2007 JP
11-88419 Sep 2007 JP
2008-535622 Sep 2008 JP
04252727 Apr 2009 JP
2009-543386 Dec 2009 JP
2012-080593 Apr 2012 JP
04996603 Aug 2012 JP
05121711 Jan 2013 JP
53-004612 Oct 2013 JP
05331008 Oct 2013 JP
05405819 Feb 2014 JP
10-2005-0001362 Jan 2005 KR
10-2005-0085827 Aug 2005 KR
2006067924 Jun 2006 KR
10-2006-0095821 Sep 2006 KR
2007038111 Apr 2007 KR
20080001298 Jan 2008 KR
2008024189 Mar 2008 KR
2010111739 Oct 2010 KR
2010120187 Nov 2010 KR
2010127240 Dec 2010 KR
2011030640 Mar 2011 KR
2011129477 Dec 2011 KR
20120112683 Oct 2012 KR
2013061149 Jun 2013 KR
2013113925 Oct 2013 KR
1333200 Nov 2013 KR
2008045154 Nov 2013 KR
2013138263 Dec 2013 KR
1032594 Apr 2008 NL
1033929 Apr 2008 NL
2004670 Nov 2011 NL
2004780 Jan 2012 NL
239969 Dec 1994 NZ
99110 Dec 1993 PT
WO 8202303 Jul 1982 WO
WO 8908967 Sep 1989 WO
WO 9013972 Nov 1990 WO
WO 9322877 Nov 1993 WO
WO 9416534 Jul 1994 WO
WO 9419910 Sep 1994 WO
WO 9421079 Sep 1994 WO
WO 9515658 Jun 1995 WO
WO 9532587 Nov 1995 WO
WO 9533342 Dec 1995 WO
WO 9614712 May 1996 WO
WO 9627843 Sep 1996 WO
WO 9631826 Oct 1996 WO
WO 9637074 Nov 1996 WO
WO 9642168 Dec 1996 WO
WO 9716925 May 1997 WO
WO 9733434 Sep 1997 WO
WO 9739583 Oct 1997 WO
WO 9826595 Jun 1998 WO
WO 9900735 Jan 1999 WO
WO 9904568 Jan 1999 WO
WO 9900735 Jan 1999 WO
WO 9930496 Jun 1999 WO
WO 9930497 Jun 1999 WO
WO 9930500 Jun 1999 WO
WO 9930501 Jun 1999 WO
WO 9935840 Jul 1999 WO
WO 9941911 Aug 1999 WO
WO 9956468 Nov 1999 WO
WO 9965232 Dec 1999 WO
WO 9965243 Dec 1999 WO
WO 9966732 Dec 1999 WO
WO 0002303 Jan 2000 WO
WO 0007372 Feb 2000 WO
WO 0008967 Feb 2000 WO
WO 0019910 Apr 2000 WO
WO 0038430 Jun 2000 WO
WO 0041397 Jul 2000 WO
WO 0139494 May 2001 WO
WO 0141447 Jun 2001 WO
WO 0182614 Nov 2001 WO
WO 0192973 Dec 2001 WO
WO 02089487 Jul 2002 WO
WO 02076097 Sep 2002 WO
WO 02076099 Sep 2002 WO
WO 03026232 Mar 2003 WO
WO 03026275 Mar 2003 WO
WO 03047710 Jun 2003 WO
WO 03065683 Aug 2003 WO
WO 03071727 Aug 2003 WO
WO 03091832 Nov 2003 WO
WO 2004012437 Feb 2004 WO
WO 2004018060 Mar 2004 WO
WO2004057609 Jul 2004 WO
WO 2004073310 Aug 2004 WO
WO 2005002215 Jan 2005 WO
WO 2005041122 May 2005 WO
WO 2005053301 Jun 2005 WO
WO2005076575 Aug 2005 WO
WO 2005120067 Dec 2005 WO
WO 2006014362 Feb 2006 WO
WO 2006022881 Mar 2006 WO
WO 2006053305 May 2006 WO
WO 2006067697 Jun 2006 WO
WO 2006081634 Aug 2006 WO
WO 2006105480 Oct 2006 WO
WO 2006110268 Oct 2006 WO
WO 2007001797 Jan 2007 WO
WO 2007008319 Jan 2007 WO
WO 2007008355 Jan 2007 WO
WO 2007008356 Jan 2007 WO
WO 2007008357 Jan 2007 WO
WO 2007008358 Jan 2007 WO
WO 2007018722 Feb 2007 WO
WO 2007018726 Feb 2007 WO
WO2008044916 Apr 2008 WO
WO 2008044916 Apr 2008 WO
WO 2008086170 Jul 2008 WO
WO 2008088741 Jul 2008 WO
WO 2008088752 Jul 2008 WO
WO 2008088772 Jul 2008 WO
WO 2008100205 Aug 2008 WO
WO2009038596 Mar 2009 WO
WO 2009038596 Mar 2009 WO
WO 2009099893 Aug 2009 WO
WO 2009099895 Aug 2009 WO
WO 2009105465 Aug 2009 WO
WO 2009110897 Sep 2009 WO
WO 2009114247 Sep 2009 WO
WO 2009155214 Dec 2009 WO
WO 2010044926 Apr 2010 WO
WO 2010054136 May 2010 WO
WO 2010107954 Sep 2010 WO
WO 2011014336 Sep 2010 WO
WO 2011082364 Jul 2011 WO
WO 2011139155 Nov 2011 WO
WO 2011149357 Dec 2011 WO
WO 2012051528 Apr 2012 WO
WO 2012138660 Oct 2012 WO
WO 2013106390 Jul 2013 WO
WO 2013155310 Jul 2013 WO
WO2013184604 Dec 2013 WO
Non-Patent Literature Citations (307)
Entry
AC-3 digital audio compression standard, Extract, Dec. 20, 1995, 11 pgs.
ActiveVideo Networks BV, International Preliminary Report on Patentability, PCT/NL2011/050308, Sep. 6, 2011, 8 pgs.
ActiveVideo Networks BV, International Search Report and Written Opinion, PCT/NL2011/050308, Sep. 6, 2011, 8 pgs.
Activevideo Networks Inc., International Preliminary Report on Patentability, PCT/US2011/056355, Apr. 16, 2013, 4 pgs.
ActiveVideo Networks Inc., International Preliminary Report on Patentability, PCT/US2012/032010, Oct. 8, 2013, 4 pgs.
ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2011/056355, Apr. 13, 2012, 6 pgs.
ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2012/032010, Oct. 10, 2012, 6 pgs.
ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2013/020769, May 9, 2013, 9 pgs.
ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2013/036182, Jul. 29, 2013, 12 pgs.
ActiveVideo Networks, Inc., International Search Report and Written Opinion, PCT/US2009/032457, Jul. 22, 2009, 7 pgs.
ActiveVideo Networks Inc., Extended EP Search RPT, Application No. 09820936-4, 11 pgs.
ActiveVideo Networks Inc., Extended EP Search RPT, Application No. 10754084-1, 11 pgs.
ActiveVideo Networks Inc., Extended EP Search RPT, Application No. 10841764.3, 16 pgs.
ActiveVideo Networks Inc., Extended EP Search RPT, Application No. 11833486.1, 6 pgs.
AcitveVideo Networks Inc., Korean Intellectual Property Office, International Search Report; PCT/US2009/032457, Jul. 22, 2009, 7 pgs.
Annex C—Video buffering verifier, information technology—generic coding of moving pictures and associated audio information: video, Feb. 2000, 6 pgs.
Antonoff, Michael, “Interactive Television,” Popular Science, Nov. 1992, 12 pages.
Avinity Systems B.V., Extended European Search Report, Application No. 12163713.6, 10 pgs.
Avinity Systems B.V., Extended European Search Report, Application No. 12163712-8, 10 pgs.
Benjelloun, A summation algorithm for MPEG-1 coded audio signals: a first step towards audio processed domain, 2000, 9 pgs.
Broadhead, Direct manipulation of MPEG compressed digital audio, Nov. 5-9, 1995, 41 pgs.
Cable Television Laboratories, Inc., “CableLabs Asset Distribution Interface Specification, Version 1.1”, May 5, 2006, 33 pgs.
CD 11172-3, Coding of moving pictures and associated audio for digital storage media at up to about 1.5 MBIT, Jan. 1, 1992, 39 pgs.
Craig, Notice of Allowance, U.S. Appl. No. 11/178,176, Dec. 23, 2010, 8 pgs.
Craig, Notice of Allowance, U.S. Appl. No. 11/178,183, Jan. 12, 2012, 7 pgs.
Craig, Notice of Allowance, U.S. Appl. No. 11/178,183, Jul. 19, 2012, 8 pgs.
Craig, Notice of Allowance, U.S. Appl. No. 11/178,189, Oct. 12, 2011, 7 pgs.
Craig, Notice of Allowance, U.S. Appl. No. 11/178,176, Mar. 23, 2011, 8 pgs.
Craig, Notice of Allowance, U.S. Appl. No. 13/609,183, Aug. 26, 2013, 8 pgs.
Craig, Final Office Action, U.S. Appl. No. 11/103,838, Feb. 5, 2009, 30 pgs.
Craig, Final Office Action, U.S. Appl. No. 11/178,181, Aug. 25, 2010, 17 pgs.
Craig, Final Office Action, U.S. Appl. No. 11/103,838, Jul. 6, 2010, 35 pgs.
Craig, Final Office Action, U.S. Appl. No. 11/178,176, Oct. 1, 2010, 8 pgs.
Craig, Final Office Action, U.S. Appl. No. 11/178,183, Apr. 13, 2011, 16 pgs.
Craig, Final Office Action, U.S. Appl. No. 11/178,177, Oct. 26, 2010, 12 pgs.
Craig, Final Office Action, U.S. Appl. No. 11/178,181, Jun. 20, 2011, 21 pgs.
Craig, Office Action, U.S. Appl. No. 11/103,838, May 12, 2009, 32 pgs.
Craig, Office Action, U.S. Appl. No. 11/103,838, Aug. 19, 2008, 17 pgs.
Craig, Office Action, U.S. Appl. No. 11/103,838, Nov. 19, 2009, 34 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,176, May 6, 2010, 7 pgs.
Craig, Office-Action U.S. Appl. No. 11/178,177, Mar. 29, 2011, 15 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,177, Aug. 3, 2011, 26 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,177, Mar. 29, 2010, 11 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,181, Feb. 11, 2011, 19 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,181, Mar. 29, 2010, 10 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,182, Feb. 23, 2010, 15 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,183, Dec. 6, 2010, 12 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,183, Sep. 15, 2011, 12 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,183, Feb. 19, 2010, 17 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,183, Jul. 20, 2010, 13 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,189, Nov. 9, 2010, 13 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,189, Mar. 15, 2010, 11 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,189, Jul. 23, 2009, 10 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,189, May 26, 2011, 14 pgs.
Craig, Office Action, U.S. Appl. No. 13/609,183, May 9, 2013, 7 pgs.
Pavlovskaia, Office Action, JP 2011-516499, Feb. 14, 2014, 19 pgs.
Digital Audio Compression Standard(AC-3, E-AC-3), Advanced Television Systems Committee, Jun. 14, 2005, 236 pgs.
European Patent Office, Extended European Search Report for International Application No. PCT/US2010/027724, dated Jul. 24, 2012, 11 pages.
FFMPEG, http://www.ffmpeg.org, downloaded Apr. 8, 2010, 8 pgs.
FFMEG-0.4.9 Audio Layer 2 Tables Including Fixed Psycho Acoustic Model, 2001, 2 pgs.
Herr, Notice of Allowance, U.S. Appl. No. 11/620,593, May 23, 2012, 5 pgs.
Herr, Notice of Allowance, U.S. Appl. No. 12/534,016, Feb. 7, 2012, 5 pgs.
Herr, Notice of Allowance, U.S. Appl. No. 12/534,016, Sep. 28, 2011, 15 pgs.
Herr, Final Office Action, U.S. Appl. No. 11/620,593, Sep. 15, 2011, 104 pgs.
Herr, Office Action, U.S. Appl. No. 11/620,593, Mar. 19, 2010, 58 pgs.
Herr, Office Action, U.S. Appl. No. 11/620,593, Apr. 21, 2009 27 pgs.
Herr, Office Action, U.S. Appl. No. 11/620,593, Dec. 23, 2009, 58 pgs.
Herr, Office Action, U.S. Appl. No. 11/620,593, Jan. 24, 2011, 96 pgs.
Herr, Office Action, U.S. Appl. No. 11/620,593, Aug. 27, 2010, 41 pgs.
Herre, Thoughts on an SAOC Architecture, Oct. 2006, 9 pgs.
Hoarty, The Smart Headend—A Novel Approach to Interactive Television, Montreux Int'l TV Symposium, Jun. 9, 1995, 21 pgs.
ICTV, Inc., International Preliminary Report on Patentability, PCT/US2006/022585, Jan. 29, 2008, 9 pgs.
ICTV, Inc., International Search Report / Written Opinion, PCT/US2006/022585, Oct. 12, 2007, 15 pgs.
ICTV, Inc., International Search Report / Written Opinion, PCT/US2008/000419, May 15, 2009, 20 pgs.
ICTV, Inc., International Search Report / Written Opinion; PCT/US2006/022533, Nov. 20, 2006; 8 pgs.
Isovic, Timing constraints of MPEG-2 decoding for high quality video: misconceptions and realistic assumptions, Jul. 2-4, 2003, 10 pgs.
MPEG-2 Video elementary stream supplemental information, Dec. 1999, 12 pgs.
Ozer, Video Compositing 101. available from http://www.emedialive.com, Jun. 2, 2004, 5pgs.
Porter, Compositing Digital Images, 18 Computer Graphics (No. 3), Jul. 1984, pp. 253-259.
RSS Advisory Board, “RSS 2.0 Specification”, published Oct. 15, 2007. Not Found.
SAOC use cases, draft requirements and architecture, Oct. 2006, 16 pgs.
Sigmon, Final Office Action, U.S. Appl. No. 11/258,602, Feb. 23, 2009, 15 pgs.
Sigmon, Office Action, U.S. Appl. No. 11/258,602, Sep. 2, 2008, 12 pgs.
TAG Networks, Inc., Communication pursuant to Article 94(3) EPC, European Patent Application, 06773714.8, May 6, 2009, 3 pgs.
TAG Networks Inc, Decision to Grant a Patent, JP 209-544985, Jun. 28, 2013, 1 pg.
TAG Networks Inc., IPRP, PCT/US2006/010080, Oct. 16, 2007, 6 pgs.
TAG Networks Inc., IPRP, PCT/US2006/024194, Jan. 10, 2008, 7 pgs.
TAG Networks Inc., IPRP, PCT/US2006/024195, Apr. 1, 2009, 11 pgs.
TAG Networks Inc., IPRP, PCT/US2006/024196, Jan. 10, 2008, 6 pgs.
TAG Networks Inc., International Search Report, PCT/US2008/050221, Jun. 12, 2008, 9 pgs.
TAG Networks Inc., Office Action, CN 200680017662.3, Apr. 26, 2010, 4 pgs.
TAG Networks Inc., Office Action, EP 06739032.8, Aug. 14, 2009, 4 pgs.
TAG Networks Inc., Office Action, EP 06773714.8, May 6, 2009, 3 pgs.
TAG Networks Inc., Office Action, EP 06773714.8, Jan. 12, 2010, 4 pgs.
TAG Networks Inc., Office Action, JP 2008-506474, Oct. 10, 2012, 5 pgs.
TAG Networks Inc., Office Action, JP 2008-506474, Aug. 8, 2011, 5 pgs.
TAG Networks Inc., Office Action, JP 2008-520254, Oct. 20, 2011, 2 pgs.
TAG Networks, IPRP, PCT/US2008/050221, Jul. 7, 2009, 6 pgs.
TAG Networks, International Search Report, PCT/US2010/041133, Oct. 19, 2010, 13 pgs.
TAG Networks, Office Action, CN 200880001325.4, Jun. 22, 2011, 4 pgs.
TAG Networks, Office Action, JP 2009-544985, Feb. 25, 2013, 3 pgs.
Talley, A general framework for continuous media transmission control, Oct. 13-16, 1997, 10 pgs.
The Toolame Project, Psych—nl.c, 1999, 1 pg.
Todd, AC-3: flexible perceptual coding for audio transmission and storage, Feb. 26-Mar. 1, 1994, 16 pgs.
Tudor, MPEG-2 Video Compression, Dec. 1995, 15 pgs.
TVHead, Inc., First Examination Report, IN 1744/MUMNP/2007, Dec. 30, 2013, 6 pgs.
TVHead, Inc., International Search Report, PCT/US2006/010080, Jun. 20, 2006, 3 pgs.
TVHead, Inc., International Search Report, PCT/US2006/024194, Dec. 15, 2006, 4 pgs.
TVHead, Inc., International Search Report, PCT/US2006/024195, Nov. 29, 2006, 9 pgs.
TVHead, Inc., International Search Report, PCT/US2006/024196, Dec. 11, 2006, 4 pgs.
TVHead, Inc., International Search Report, PCT/US2006/024197, Nov. 28, 2006, 9 pgs.
Vernon, Dolby digital: audio coding for digital television and storage applications, Aug. 1999, 18 pgs.
Wang, A beat-pattern based error concealment scheme for music delivery with burst packet loss, Aug. 22-25, 2001, 4 pgs.
Wang, A compressed domain beat detector using MP3 audio bitstream, Sep. 30, Oct. 5, 2001, 9 pgs.
Wang, A multichannel audio coding algorithm for inter-channel redundancy removal, May 12-15, 2001, 6 pgs.
Wang, An excitation level based psychoacoustic model for audio compression, Oct. 30-Nov. 4, 1999, 4 pgs.
Wang, Energy compaction property of the MDCT in comparison with other transforms, Sep. 22-25, 2000, 23 pgs.
Wang, Exploiting excess masking for audio compression, Sep. 2-5, 1999, 4 pgs.
Wang, schemes for re-compressing mp3 audio bitstreams, Nov. 30-Dec. 3, 2001, 5 pgs.
Wang, Selected advances in audio compression and compressed domain processing, Aug. 2001, 68 pgs.
Wang, The impact of the relationship between MDCT and DFT on audio compression, Dec. 13-15, 2000, 9 pgs.
ActiveVideo, http://www.activevideo.com/, as printed out in year 2012, 1 pg.
ActiveVideo Networks Inc., International Preliminary Report on Patentability, PCT/US2013/020769, Jul. 24, 2014, 6 pgs.
ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2014/030773, Jul. 25, 2014, 8 pgs.
ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2014/041416, Aug. 27, 2014, 8 pgs.
ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 13168509.1, 10 pgs.
ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 13168376-5, 8 pgs.
ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 12767642-7, 12 pgs.
ActiveVideo Networks Inc., Communication Pursuant to Rules 70(2) and 70a(2), EP10841764.3, Jun. 6, 2014, 1 pg.
ActiveVideo Networks Inc., Communication Pursuant to Rules 70(2) and 70a(2), EP11833486.1, Apr. 24, 2014, 1 pg.
ActiveVideo Networks Inc., Communication Pursuant to Article 94(3) EPC, EP08713106.6-1908, Jun. 26, 2014, 5 pgs.
ActiveVideo Networks Inc., Communication Pursuant to Article 94(3) EPC, EP08713106.6-2223, May 10, 2011, 7 pgs.
ActiveVideo Networks Inc., Communication Pursuant to Article 94(3) EPC, EP09713486.0, Apr. 14, 2014, 6 pgS.
ActiveVideo Networks Inc., Examination Report No. 1, AU2011258972, Apr. 4, 2013, 5 pgs.
ActiveVideo Networks Inc., Examination Report No. 1, AU2010339376, Apr. 30, 2014, 4 pgs.
ActiveVideo Networks Inc., Summons to attend oral-proceeding, Application No. EP09820936-4, Aug. 19, 2014, 4 pgs.
ActiveVideo Networks Inc., International Searching Authority, International Search Report-International application No. PCT/US2010/027724, dated Oct. 28, 2010, together with the Written Opinion of the International Searching Authority, 7 pages.
ActiveVideo Networks, Inc., International Search Report and Written Opinion, PCT/US2014/041430, Oct. 9, 2014, 9 pgs.
Active Video Networks, Notice of Reasons for Rejection, JP2012-547318, Sep. 26, 2014, 7 pgs.
Adams, Jerry, NTZ Nachrichtechnische Zeitschrift. vol. 40, No. 7, Jul. 1987, Berlin DE pp. 534-536; Jerry Adams: 'Glasfasernetz für Breitbanddienste in London', 5 pgs.
Avinity Systems B.V., Communication pursuant to Article 94(3) EPC, EP 07834561.8, Jan. 31, 2014, 10 pgs.
Avinity Systems B.V., Communication pursuant to Article 94(3) EPC, EP 07834561.8, Apr. 8, 2010, 5 pgs.
Avinity Systems B.V., International Preliminary Report on Patentability, PCT/NL2007/000245, Mar. 31, 2009, 12 pgs.
Avinity Systems B.V., International Search Report and Written Opinion, PCT/NL2007/000245, Feb. 19, 2009, 18 pgs.
Avinity Systems B.V., Notice of Grounds of Rejection for Patent, JP 2009-530298, Sep. 3, 2013, 4 pgs.
Avinity Systems B.V., Notice of Grounds of Rejection for Patent, JP 2009-530298, Sep. 25, 2012, 6 pgs.
Avinity Systems B. V., Final Office Action, JP-2009-530298, Oct. 7, 2014, 8 pgs.
Bird et al., “Customer Access to Broadband Services,” ISSLS 86—The International Symposium on Subrscriber Loops and Services Sep. 29, 1986, Tokyo,JP 6 pgs.
Brockmann, Final Office Action, U.S. Appl. No. 12/443,571, Mar. 7, 2014, 21 pgs.
Brockmann, Final Office Action, U.S. Appl. No. 13/668,004, Jul. 16, 2014, 20 pgs.
Brockmann, Final Office Action, U.S. Appl. No. 13/686,548, Sep. 24, 2014, 13 pgs.
Brockmann, Final Office Action, U.S. Appl. No. 13/438,617, Oct. 3, 2014, 19 pgs.
Brockmann, Office Action, U.S. Appl. No. 13/686,548, Mar. 10, 2014, 11 pgs.
Brockmann, Office Action, U.S. Appl. No. 13/668,004, Dec. 23, 2013, 9 pgs.
Brockmann, Office Action, U.S. Appl. No. 13/438,617, May 12, 2014, 17 pgs.
Brockmann, Office Action, U.S. Appl. No. 12/443,571, Jun. 5, 2013, 18 pgs.
Brockmann, Office Action, U.S. Appl. No. 12/443,571, Nov. 5, 2014, 26 pgs.
Chang, Shih-Fu, et al., “Manipulation and Compositing of MC-DOT Compressed Video, ” IEEE Journal on Selected Areas of Communications, Jan. 1995, vol. 13, No. 1, 11 pgs.
Dahlby, Office Action, U.S. Appl. No. 12/651,203, Jun. 5, 2014, 18 pgs.
Dahlby, Final Office Action, U.S. Appl. No. 12/651,203, Feb. 4, 2013, 18 pgs.
Dahlby, Office Action, U.S. Appl. No. 12/651,203, Aug. 16, 2012, 18 pgs.
Dukes, Stephen D., “Photonics for cable television system design, Migrating to regional hubs and passive networks,” Communications Engineering and Design, May 1992, 4 pgs.
Ellis, et al., “INDAX: An Operation Interactive Cabletext System”, IEEE Journal on Selected Areas in Communications, vol. sac-1, No. 2, Feb. 1983, pp. 285-294.
European Patent Office, Supplementary European Search Report, Application No. EP 09 70 8211, dated Jan. 5, 2011, 6 pgs.
Frezza, W., “The Broadband Solution-Metropolitan CATV Networks,” Proceedings of Videotex '84, Apr. 1984, 15 pgs.
Gecsei, J., “Topology of Videotex Networks,” The Architecture of Videotex Systems, Chapter 6, 1983 by Prentice-Hall, Inc.
Gobl, et al., “ARIDEM—a multi-service broadband access demonstrator,” Ericsson Review No. 3, 1996, 7 pgs.
Gordon, Notice of Allowance, U.S. Appl. No. 12/008,697, Mar. 20, 2014, 10 pgs.
Gordon, Final Office Action, U.S. Appl. No. 12/008,722, Mar. 30, 2012, 16 pgs.
Gordon, Final Office Action, U.S. Appl. No. 12/035,236, Jun. 11, 2014, 14 pgs.
Gordon, Final Office Action, U.S. Appl. No. 12/035,236, Jul. 22, 2013, 7 pgs.
Gordon, Final Office Action, U.S. Appl. No. 12/035,236, Sep. 20, 2011, 8 pgs.
Gordon, Final Office Action, U.S. Appl. No. 12/035,236, Sep. 21, 2012, 9 pgs.
Gordon, Final Office Action, U.S. Appl. No. 12/008,697, Mar. 6, 2012, 48 pgs.
Gordon, Office Action, U.S. Appl. No. 12/035,236, Mar. 13, 2013, 9 pgs.
Gordon, Office Action, U.S. Appl. No. 12/035,236, Mar. 22, 2011, 8 pgs.
Gordon, Office Action, U.S. Appl. No. 12/035,236, Mar. 28, 2012, 8 pgs.
Gordon, Office Action, U.S. Appl. No. 12/035,236, Dec. 16, 2013, 11 pgs.
Gordon, Office Action, U.S. Appl. No. 12/008,697, Aug. 1, 2013, 43 pgs.
Gordon, Office Action, U.S. Appl. No. 12/008,697, Aug. 4, 2011, 39 pgs.
Gordon, Office Action, U.S. Appl. No. 12/008,722, Oct. 11, 2011, 16 pgs.
Handley et al, “TCP Congestion Window Validation,” RFC 2861, Jun. 2000, Network Working Group, 22 pgs.
Henry et al. “Multidimensional Icons” ACM Transactions on Graphics, vol. 9, No. 1 Jan. 1990, 5 pgs.
Insight advertisement, “In two years this is going to be the most watched program on TV” on touch VCR programming, published not later than 2000, 10 pgs.
Isensee et al., “Focus Highlight for World Wide Web Frames,” Nov. 1, 1997, IBM Technical Disclosure Bulletin, vol. 40, No. 11, pp. 89-90.
ICTV, Inc., International Search Report / Written Opinion, PCT/US2008/000400, Jul. 14, 2009, 10 pgs.
ICTV, Inc., International Search Report / Written Opinion, PCT/US2008/000450, Jan. 26, 2009, 9 pgs.
Kato, Y., et al., “A Coding Control algorithm for Motion Picture Coding Accomplishing Optimal Assignment of Coding Distortion to Time and Space Domains,” Electronics and Communications in Japan, Part 1, vol. 72, No. 9, 1989, 11 pgs.
Koenen, Rob,“MPEG-4 Overview—Overview of the MPEG-4 Standard” Internet Citation, Mar. 2001, http://mpeg.telecomitalialab.com/standards/mpeg-4/mpeg-4.htm, May 9, 2002, 74 pgs.
Konaka, M. et al., “Development of Sleeper Cabin Cold Storage Type Cooling System,” SAE International, The Engineering Society for Advancing Mobility Land Sea Air and Space, SAE 2000 World Congress, Detroit, Michigan, Mar. 6-9, 2000, 7 pgs.
Le Gall, Didier, “MPEG: A Video Compression Standard for Multimedia Applications”, Communication of the ACM, vol. 34, No. 4, Apr. 1991, New York, NY, 13 pgs.
Langenberg, E, et al., “Integrating Entertainment and Voice on the Cable Network,” SCTE , Conference on Emerging Technologies, Jan. 6-7, 1993, New Orleans, Louisiana, 9 pgs.
Large, D., “Tapped Fiber vs. Fiber-Reinforced Coaxial CATV Systems”, IEEE LCS Magazine, Feb. 1990, 7 pgs.
Mesiya, M.F, “A Passive Optical/Coax Hybrid Network Architecture for Delivery of CATV, Telephony and Data Services,” 1993 NCTA Technical Papers, 7 pgs.
“MSDL Specification Version 1.1” International Organisation for Standardisation Organisation Internationale EE Normalisation, ISO/IEC JTC1/SC29/VVG11 Coding of Moving Pictures and Autdio, N1246, MPEG96/Mar. 1996, 101 pgs.
Noguchi, Yoshihiro, et al., “MPEG Video Compositing in the Compressed Domain,” IEEE International Symposium on Circuits and Systems, vol. 2, May 1, 1996, 4 pgs.
Regis, Notice of Allowance U.S. Appl. No. 13/273,803, Sep. 2, 2014, 8 pgs.
Regis, Notice of Allowance U.S. Appl. No. 13/273,803, May 14, 2014, 8 pgs.
Regis, Final Office Action U.S. Appl. No. 13/273,803, Oct. 11, 2013, 23 pgs.
Regis, Office Action U.S. Appl. No. 13/273,803, Mar. 27, 2013, 32 pgs.
Richardson, Ian E.G., “H.264 and MPEG-4 Video Compression, Video Coding for Next-Genertion Multimedia,” Johm Wiley & Sons, US, 2003, ISBN: 0-470-84837-5, pp. 103-105, 149-152, and 164.
Rose, K., “Design of a Switched Broad-Band Communications Network for Interactive Services,” IEEE Transactions on Communications, vol. com-23, No. 1, Jan. 1975, 7 pgs.
Saadawi, Tarek N., “Distributed Switching for Data Transmission over Two-Way CATV”, IEEE Journal on Selected Areas in Communications, vol. Sac-3, No. 2, Mar. 1985, 7 pgs.
Schrock, “Proposal for a Hub Controlled Cable Television System Using Optical Fiber,” IEEE Transactions on Cable Television, vol. CATV-4, No. 2, Apr. 1979, 8 pgs.
Sigmon, Notice of Allowance, U.S. Appl. No. 13/311,203, Sep. 22, 2014, 5 pgs.
Sigmon, Notice of Allowance, U.S. Appl. No. 13/311,203, Feb. 27, 2014, 14 pgs.
Sigmon, Final Office Action, U.S. Appl. No. 13/311,203, Sep. 13, 2013, 20 pgs.
Sigmon, Office Action, U.S. Appl. No. 13/311,203, May 10, 2013, 21 pgs.
Smith, Brian C., et al., “Algorithms for Manipulating Compressed Images,” IEEE Computer Graphics and Applications, vol. 13, No. 5, Sep. 1, 1993, 9 pgs.
Smith, J. et al., “Transcoding Internet Content for Heterogeneous Client Devices” Circuits and Systems, 1998. ISCAS '98. Proceedings of the 1998 IEEE International Symposium on Monterey, CA, USA May 31-Jun. 3, 1998, New York, NY, USA,IEEE, US, May 31, 1998, 4 pgs.
Stoll, G. et al., “GMF4iTV: Neue Wege zur-Interaktivitaet Mit Bewegten Objekten Beim Digitalen Fernsehen,” Fkt Fernseh Und Kinotechnik, Fachverlag Schiele & Schon GmbH, Berlin, DE, vol. 60, No. 4, Jan. 1, 2006, ISSN: 1430-9947, 9 pgs.
Tamitani et al., “An Encoder/Decoder Chip Set for the MPEG Video Standard,” 1992 IEEE International Conference on Acoustics, vol. 5, Mar. 1992, San Francisco, CA, 4 pgs.
Terry, Jack, “Alternative Technologies and Delivery Systems for Broadband ISDN Access”, IEEE Communications Magazine, Aug. 1992, 7 pgs.
Thompson, Jack, “DTMF-TV, The Most Economical Approach to Interactive TV,” Gnostech Incorporated, NCF'95 Session T-38-C, 8 pgs.
Thompson, John W. Jr., "The Awakening 3.0: PCs, TSBs, or DTMF-TV—Which Telecomputer Architecture is Right for the Next Generation's Public Network?," GNOSTECH Incorporated, 1995, The National Academy of Sciences, downloaded from The Unpredictable Certainty: White Papers, http://www.nap.edu/catalog/6062.html, pp. 546-552.
Tobagi, Fouad A., "Multiaccess Protocols in Packet Communication Systems," IEEE Transactions on Communications, vol. COM-28, No. 4, Apr. 1980, 21 pgs.
Toms, N., "An Integrated Network Using Fiber Optics (INFO) for the Distribution of Video, Data, and Telephone in Rural Areas," IEEE Transactions on Communications, vol. COM-26, No. 7, Jul. 1978, 9 pgs.
Trott, A., et al., "An Enhanced Cost Effective Line Shuffle Scrambling System with Secure Conditional Access Authorization," 1993 NCTA Technical Papers, 11 pgs.
Jurgen, "Two-way applications for cable television systems in the '70s," IEEE Spectrum, Nov. 1971, 16 pgs.
van Beek, P., "Delay-Constrained Rate Adaptation for Robust Video Transmission over Home Networks," Image Processing, 2005, ICIP 2005, IEEE International Conference, Sep. 2005, vol. 2, No. 11, 4 pgs.
Van der Star, Jack A. M., “Video on Demand Without Compression: A Review of the Business Model, Regulations and Future Implication,” Proceedings of PTC'93, 15th Annual Conference, 12 pgs.
Welzenbach et al., “The Application of Optical Systems for Cable TV,” AEG-Telefunken, Backnang, Federal Republic of Germany, ISSLS Sep. 15-19, 1980, Proceedings IEEE Cat. No. 80 CH1565-1, 7 pgs.
Yum, T. S. P., "Hierarchical Distribution of Video with Dynamic Port Allocation," IEEE Transactions on Communications, vol. 39, No. 8, Aug. 1, 1991, XP000264287, 7 pgs.
ActiveVideo Networks, Inc., International Preliminary Report on Patentability, PCT/US2013/036182, Oct. 14, 2014, 9 pgs.
ActiveVideo Networks Inc., Communication Pursuant to Article 94(3) EPC, EP08713106.6, Jun. 25, 2014, 5 pgs.
ActiveVideo Networks Inc., Communication Pursuant to Rules 161(2) & 162 EPC, EP13775121.0, Jan. 20, 2015, 3 pgs.
ActiveVideo Networks Inc., Examination Report No. 1, AU2011258972, Jul. 21, 2014, 3 pgs.
ActiveVideo Networks Inc., Certificate of Patent, JP5675765, Jan. 9, 2015, 3 pgs.
Brockmann, Notice of Allowance, U.S. Appl. No. 13/445,104, Dec. 24, 2014, 14 pgs.
Brockmann, Office Action, U.S. Appl. No. 13/668,004, Feb. 26, 2015, 17 pgs.
Brockmann, Office Action, U.S. Appl. No. 13/686,548, Jan. 5, 2015, 12 pgs.
Brockmann, Office Action, U.S. Appl. No. 13/911,948, Dec. 26, 2014, 12 pgs.
Brockmann, Office Action, U.S. Appl. No. 13/911,948, Jan. 29, 2015, 11 pgs.
Dahlby, Office Action, U.S. Appl. No. 12/651,203, Dec. 3, 2014, 19 pgs.
Gordon, Notice of Allowance, U.S. Appl. No. 12/008,697, Dec. 8, 2014, 10 pgs.
Gordon, Office Action, U.S. Appl. No. 12/008,722, Nov. 28, 2014, 18 pgs.
Regis, Notice of Allowance, U.S. Appl. No. 13/273,803, Nov. 18, 2014, 9 pgs.
Regis, Notice of Allowance, U.S. Appl. No. 13/273,803, Mar. 2, 2015, 8 pgs.
Sigmon, Notice of Allowance, U.S. Appl. No. 13/311,203, Dec. 19, 2014, 5 pgs.
TAG Networks Inc., Decision to Grant a Patent, JP 2008-506474, Oct. 4, 2013, 5 pgs.
ActiveVideo Networks Inc., Decision to Refuse a European Patent Application (Art. 97(2) EPC), EP09820936.4, Feb. 20, 2015, 4 pgs.
ActiveVideo Networks Inc., Communication Pursuant to Article 94(3) EPC, EP10754084.1, Feb. 10, 2015, 12 pgs.
ActiveVideo Networks Inc., Communication under Rule 71(3) EPC, Intention to Grant, EP08713106.6, Feb. 19, 2015, 12 pgs.
ActiveVideo Networks Inc., Notice of Reasons for Rejection, JP2014-100460, Jan. 15, 2015, 6 pgs.
ActiveVideo Networks Inc., Notice of Reasons for Rejection, JP2013-509016, Dec. 24, 2014 (Received Jan. 14, 2015), 11 pgs.
Brockmann, Office Action, U.S. Appl. No. 13/737,097, Mar. 16, 2015, 18 pgs.
Craig, Decision on Appeal (Reversed), U.S. Appl. No. 11/178,177, Feb. 25, 2015, 7 pgs.
Craig, Notice of Allowance, U.S. Appl. No. 11/178,177, Mar. 5, 2015, 7 pgs.
Craig, Notice of Allowance, U.S. Appl. No. 11/178,181, Feb. 13, 2015, 8 pgs.
ActiveVideo Networks, Inc., Certificate of Grant, EP08713106.6-1908, Aug. 5, 2015, 2 pgs.
ActiveVideo Networks, Inc., Certificate of Grant, AU2011258972, Nov. 19, 2015, 2 pgs.
ActiveVideo Networks, Inc., Certificate of Grant, AU2011315950, Dec. 17, 2015, 2 pgs.
ActiveVideo Networks, Inc., Certificate of Grant, AU2011249132, Jan. 7, 2016, 2 pgs.
ActiveVideo Networks, Inc., Certificate of Grant, EP13168509.1-1908, Sep. 30, 2015, 2 pgs.
ActiveVideo Networks, Inc., Certificate of Patent, JP2013534034, Jan. 8, 2016, 4 pgs.
ActiveVideo Networks, Inc., Communication Pursuant to Rules 161(1) and 162 EPC, EP14722897.7, Oct. 28, 2015, 2 pgs.
ActiveVideo Networks, Inc., Communication Pursuant to Rules 161(1) and 162 EPC, EP14740004.8, Jan. 26, 2016, 2 pgs.
ActiveVideo Networks, Inc., Communication Pursuant to Rules 161(1) and 162 EPC, EP14736535.7, Jan. 26, 2016, 2 pgs.
ActiveVideo, Communication Pursuant to Article 94(3) EPC, EP12767642.7, Sep. 4, 2015, 4 pgs.
ActiveVideo, Communication Pursuant to Article 94(3) EPC, EP10841764.3, Dec. 18, 2015, 6 pgs.
ActiveVideo Networks, Inc., Communication Pursuant to Rules 70(2) and 70a(2) EPC, EP13735906.3, Nov. 27, 2015, 1 pg.
ActiveVideo Networks, Inc., Decision to Grant, EP08713106.6-1908, Jul. 9, 2015, 2 pgs.
ActiveVideo Networks, Inc., Decision to Grant, EP13168509.1-1908, Sep. 3, 2015, 2 pgs.
ActiveVideo Networks, Inc., Decision to Grant, JP2014-100460, Jul. 24, 2015, 5 pgs.
ActiveVideo Networks, Inc., Decision to Refuse a European Patent Application, EP08705578.6, Nov. 26, 2015, 10 pgs.
ActiveVideo Networks Inc., Examination Report No. 2, AU2011249132, May 29, 2015, 4 pgs.
ActiveVideo Networks Inc., Examination Report No. 2, AU2011315950, Jun. 25, 2015, 3 pgs.
ActiveVideo Networks Inc., Extended European Search Report, EP13735906.3, Nov. 11, 2015, 10 pgs.
ActiveVideo, International Search Report and Written Opinion, PCT/US2015/027803, Jun. 24, 2015, 18 pgs.
ActiveVideo, International Search Report and Written Opinion, PCT/US2015/027804, Jun. 25, 2015, 10 pgs.
ActiveVideo Networks, Inc., International Search Report and Written Opinion, PCT/US2015/028072, Aug. 7, 2015, 9 pgs.
ActiveVideo Networks, Inc., International Preliminary Report on Patentability, PCT/US2014/030773, Sep. 15, 2015, 6 pgs.
ActiveVideo Networks, Inc., International Preliminary Report on Patentability, PCT/US2014/041430, Dec. 8, 2015, 6 pgs.
ActiveVideo Networks, Inc., International Preliminary Report on Patentability, PCT/US2014/041416, Dec. 8, 2015, 6 pgs.
ActiveVideo Networks, Inc., KIPO's Notice of Preliminary Rejection, KR10-2010-7019512, Jul. 15, 2015, 15 pgs.
ActiveVideo Networks, Inc., KIPO's Notice of Preliminary Rejection, KR10-2010-7021116, Jul. 13, 2015, 19 pgs.
ActiveVideo, Notice of Reasons for Rejection, JP2013-509016, Dec. 3, 2015, 7 pgs.
ActiveVideo, Notice of German Patent, EP602008040474-9, Jan. 6, 2016, 4 pgs.
ActiveVideo Networks B.V., Office Action, IL222830, Jun. 28, 2015, 7 pgs.
ActiveVideo Networks, Inc., Office Action, JP2013534034, Jun. 16, 2015, 6 pgs.
Avinity Systems B.V., Pre-Trial Reexamination Report, JP2009-530298, Apr. 24, 2015, 6 pgs.
Brockmann, Notice of Allowance, U.S. Appl. No. 13/911,948, Jul. 10, 2015, 5 pgs.
Brockmann, Notice of Allowance, U.S. Appl. No. 13/438,617, May 22, 2015, 18 pgs.
Brockmann, Notice of Allowance, U.S. Appl. No. 13/445,104, Apr. 23, 2015, 8 pgs.
Brockmann, Notice of Allowance, U.S. Appl. No. 14/262,674, Sep. 30, 2015, 7 pgs.
Brockmann, Notice of Allowance, U.S. Appl. No. 13/911,948, Aug. 21, 2015, 6 pgs.
Brockmann, Notice of Allowance, U.S. Appl. No. 13/911,948, Aug. 5, 2015, 5 pgs.
Brockmann, Final Office Action, U.S. Appl. No. 12/443,571, Jul. 9, 2015, 28 pgs.
Brockmann, Final Office Action, U.S. Appl. No. 13/668,004, Aug. 3, 2015, 18 pgs.
Brockmann, Final Office Action, U.S. Appl. No. 13/686,548, Aug. 12, 2015, 13 pgs.
Brockmann, Final Office Action, U.S. Appl. No. 13/737,097, Aug. 14, 2015, 17 pgs.
Brockmann, Office Action, U.S. Appl. No. 14/262,674, May 21, 2015, 7 pgs.
Brockmann, Office Action, U.S. Appl. No. 12/443,571, Dec. 4, 2015, 30 pgs.
Dahlby, Office Action, U.S. Appl. No. 12/651,203, Jul. 2, 2015, 25 pgs.
Dahlby, Final Office Action, U.S. Appl. No. 12/651,203, Dec. 11, 2015, 25 pgs.
Gecsei, J., “Adaptation in Distributed Multimedia Systems,” IEEE Multimedia, IEEE Service Center, New York, NY, vol. 4, No. 2, Apr. 1, 1997, 10 pgs.
Gordon, Notice of Allowance, U.S. Appl. No. 12/008,697, Apr. 1, 2015, 10 pgs.
Gordon, Final Office Action, U.S. Appl. No. 12/008,722, Jul. 2, 2015, 20 pgs.
Jacob, Bruce, “Memory Systems: Cache, DRAM, Disk,” The Cache Layer, Chapter 22, p. 739.
Ohta, K., et al., "Selective Multimedia Access Protocol for Wireless Multimedia Communication," Communications, Computers and Signal Processing, 1997, IEEE Pacific Rim Conference, Victoria, BC, Canada, Aug. 1997, vol. 1, 4 pgs.
Sigmon, Notice of Allowance, U.S. Appl. No. 13/311,203, Apr. 14, 2015, 5 pgs.
Wei, S., "QoS Tradeoffs Using an Application-Oriented Transport Protocol (AOTP) for Multimedia Applications Over IP," Sep. 23-26, 1999, Proceedings of the Third International Conference on Computational Intelligence and Multimedia Applications, New Delhi, India, 5 pgs.
ActiveVideo Networks, Inc., KIPO's Notice of Preliminary Rejection, KR10-2011-7024417, Feb. 18, 2016, 15 pgs.
ActiveVideo Networks, Inc., KIPO's Second Notice of Preliminary Rejection, KR10-2010-7019512, Feb. 12, 2016, 5 pgs.
Brockmann, Office Action, U.S. Appl. No. 13/686,548, Feb. 8, 2016, 13 pgs.
Gordon, Notice of Allowance, U.S. Appl. No. 12/008,722, Feb. 17, 2016, 10 pgs.
McElhatten, Office Action, U.S. Appl. No. 14/698,633, Feb. 22, 2016, 14 pgs.
Related Publications (1)
Number: 20140366057 A1; Date: Dec. 2014; Country: US

Provisional Applications (1)
Number: 61/832,069; Date: Jun. 2013; Country: US