System and method for providing video content associated with a source image to a television in a communication network

Information

  • Patent Grant
  • Patent Number
    9,077,860
  • Date Filed
    Monday, December 5, 2011
  • Date Issued
    Tuesday, July 7, 2015
Abstract
A system and method are provided for processing video content, associated with a source image, for display on a television. The source image, such as a web page, and its associated (e.g., linked) video content are retrieved and separately encoded. The encoded source image and the encoded video content are composited together to form a sequence of encoded video frames, where a frame type of the video content is used to determine a type of a composited frame. For example, if all displayed frames of the video content are MPEG I-frames, then the composited frame also may be an I-frame. However, if any displayed frame of video content is an MPEG P-frame or B-frame, then the composited frame may be a P-frame.
Description
TECHNICAL FIELD AND BACKGROUND ART

The present invention relates to displaying video content, such as, for example, Internet video content, on a television in a communication network.


It is known in the prior art to display video content on a computer that is attached to the Internet, as shown in FIG. 1. A user of a client computer 100 having an associated web browser 110 can request the display of a web page 120 from a server computer 130 by providing the URL (uniform resource locator) for the web page 120. When the client computer 100 receives the web page 120, the web page 120 is rendered in the web browser 110. The displayed web page 120 is a document that contains content in a format, such as HTML, along with links to other content, such as video content 150. The user of the client computer can request the display of the video content 150 by selecting the link. This selection requests the transmission of the video content from the server computer 130 through the Internet 140. The video content may be in any one of a number of different formats. For example, the content may be in Apple® Computer's QuickTime format, MPEG-2 format, or Microsoft's® Windows Media format. After the user has requested the video content 150, the video content 150 is transmitted to the client computer 100 from the address associated with the link. Given that the video is in a particular format and is generally compressed, the video content 150 must be decoded by the client computer 100. The video content 150 is decoded by a program separate from the web browser, which may be a plug-in 160 to the web browser 110. The plug-in 160 runs, decodes the video content 150, and displays the video content 150 within the client's web browser. In other systems, the web browser accesses a separate program that displays the content.


In communication networks wherein the requesting device does not have the capability to separately decode video content from the web page content, the previously described client plug-in architecture will not work. An example of such a system is an interactive cable television system 200 that supports web browsing on televisions 210. The web content 230 is rendered at a remote location, such as a cable head end 240, as shown in FIG. 2. Such a cable television system 200 allows a subscriber to make a request for content using a communication device, such as a set top box 250. The request is sent to the cable head end 240 from a subscriber's set top box 250, and the head end 240 accesses the web page 230 from a server 260, renders the web page 270, encodes the web page 270 in a format that the set top box 250 can decode, and then transmits the web page 230 to the set top box. If the web page 230 contains a link to video content 220 and the subscriber requests the display of the video content 220, the video content must be encoded in a format that the set top box can decode, such as MPEG-2. As such, the head end retrieves the video content associated with the requested link. The head end decodes the video content using an applicable program 280 and then re-encodes the video content 270 along with the web page in the format that the set top box can decode. Thus, each frame of video, along with the substantially static web page background, is encoded. Such a process is time consuming and resource intensive, particularly for streaming video. Additionally, because the video content needs to be decoded and re-encoded, information is lost, and therefore the displayed video has lower resolution than the originally encoded video content.


SUMMARY OF THE INVENTION

In a first embodiment of the invention there is provided a system and method for providing video content, associated with a web page or other source image, for display on a television in a communication network. A request for display of the source image is received at a content distribution platform in the communication network from a communication device associated with a television. In certain embodiments, the communication network is a cable television network. In other embodiments, the communication network may be a television-over-IP network. The content distribution platform retrieves the requested source image and displays the source image on a user's television. The user can then request video content by selecting a link on the source image. The request for video content associated with the link is then received by the content distribution platform. The content distribution platform retrieves the video content that is associated with the link if it is not already available to the platform in a pre-encoded file. The video content is pre-encoded and may be, for example, an MPEG data file. The content distribution platform then composites the video content and at least a portion of the source image together to form a video stream that can be decoded by the communication device and displayed on the television. The composited video stream is then sent through the communication network to the communication device, where it is decoded and displayed on the requesting user's television. In one embodiment, at least a portion of the source image is encoded prior to compositing the web page and the video content together. The source image can be encoded in real-time using an MPEG encoder. In certain embodiments, a portion of data from the source image that is overlaid by the video content is discarded before the macro blocks of the web page are encoded.


In one embodiment, the communication device associated with the television includes a decoder capable of decoding an MPEG video stream. The communication device may be, for example, a set-top box or a cable card. In other embodiments, the communication device and the decoder are separate entities. The decoder can be a self-contained device or part of the television.


In another embodiment of the invention, prior to a request for playback of video content, the content distribution platform locates links associated with video content on a source image, such as a web page. The video content may or may not be in a format that is compatible with the decoder. The content distribution platform then downloads the video content and, if the video content is not in a compatible format, decodes and re-encodes the video content so that the video content is decodable by the decoder. The video content is therefore either being pre-encoded or already pre-encoded before a user makes a request for the video content, thus allowing the video content to be sent more quickly than if the content distribution platform waited for a request to be made for the video content. The video content can also be shared amongst other users that share the same content distribution platform.


The system for processing video content associated with a link includes a plurality of modules, including a receiver for receiving a request for transmission of video content associated with a link and providing the request to a retriever. The retriever retrieves the video content associated with the link. The system includes a compositor that includes an encoder that encodes at least a portion of the source image/web page into a format that the communication device can decode. The compositor then creates a composite stream, based upon the encoded web page/source image and the video content, that can be decoded by the communication device. A transmitter within the system transmits the composite stream via a communication network for display on a television associated with the request. In other embodiments, the receiver and transmitter are combined together as a transceiver. In still other embodiments, multiple modules may be combined together and may comprise hardware, software, or both hardware and software.


In yet another embodiment of the system, a request for display of a web page is received by a receiver. The receiver provides the request to a retriever, wherein the receiver subsequently receives a request for display of the video content associated with a link on the web page. The retriever retrieves the web page and retrieves the video content associated with the link. In such an embodiment, the compositor creates a composite data stream based on information in the retrieved webpage and the pre-encoded video content. The transmitter then transmits the composite stream for display on a television associated with the request. The system may include a decoder associated with the television for decoding the received video content.


As already stated, in certain embodiments the communication device may include a decoder capable of decoding an MPEG stream, and the web page and the encoded/pre-encoded video content are composited together as an MPEG video stream.





BRIEF DESCRIPTION OF THE DRAWINGS

The foregoing features of the invention will be more readily understood by reference to the following detailed description, taken with reference to the accompanying drawings, in which:



FIG. 1 is a first prior art environment for displaying video content from the Internet;



FIG. 2 is a second prior art environment for displaying video content from the Internet;



FIG. 3 is an environment showing a first embodiment of the invention;



FIG. 4 is a flow chart of one embodiment of the invention for accelerating delivery of video content to a television in a communication network;



FIG. 5 is a flow chart that further explains the methodology of compositing video source material and background material; and



FIG. 6 is an image that shows a frame of video content in which there is a background image and two video sources X and Y.





DETAILED DESCRIPTION OF SPECIFIC EMBODIMENTS

Definitions: In this specification, the following terms shall have the meanings given below unless the context indicates otherwise. The term “frame” shall refer to both fields and frames, as is understood by those in the compression arts. The term “video content” may include audio. The term “source image” shall refer to static graphical content capable of being displayed on a television, as well as dynamic graphical content. The term source image includes, but is not limited to, web pages.



FIG. 3 shows an environment for implementing an embodiment of the invention. The environment is designed to allow a television 320, associated with a decoder and coupled to a communication network, to display video content 360 in combination with a web page or other source image. For example, a source image may be a cable television system's proprietary graphical image. The video content and source image/web page can reside in a limited-content network, wherein the video content is pre-encoded in a format decodable by a decoder 393 associated with the television 320, or the video content and web page/source image may reside in an open access network, wherein the video content may or may not be pre-encoded in a format that is decodable by the decoder associated with the television.


In such an environment, the television 320 is associated with a communication device 310. The communication device 310 performs limited processing tasks, such as receiving input instructions and content and formatting output instructions. The communication device, in this embodiment, includes a decoder 393 for decoding video content in known formats. For example, the communication device 310 may be a set top box which is capable of receiving a digital data signal and decoding MPEG video. Examples of such set-top boxes include Motorola's DCT 200 and Amino Communications, Ltd.'s AmiNet 110. The communication device 310 does not perform any rendering of content. All general purpose processing is performed at a content distribution platform 330, which may be at a central location, such as a head end in a cable television network. Examples of other locations for the content distribution platform include a central switching office for a telephone system and intermediate processing facilities, such as an ISP (Internet Service Provider). Additionally, the content distribution platform may reside at a location separate from the network's central location. Further, the modules within the content distribution platform can be distributed, as the modules operate as a logical network. The content distribution platform 330 includes a plurality of processors. Each processor may be associated with one or more interactive television processes. For example, the interactive processes may be the display of a movie on demand or the ability to access the Internet. Thus, a user may request an interactive session by sending a predefined request signal to the content distribution platform using a subscriber input device. U.S. Pat. No. 6,100,883 (which is incorporated herein by reference in its entirety) shows such an environment wherein a content distribution platform has the foregoing features. In order to simplify explanation, embodiments of the invention will refer to web pages; however, this should not be interpreted as limiting the scope of the invention to web pages, and other source images may also be used.


In the embodiment that is shown in FIG. 3, the communication device 310, such as a set-top box, receives a request for an interactive session for accessing a web page from a user input device 390, such as a remote control or a wired or wireless keyboard. The request is forwarded to the content distribution platform 330 through a communication network 300. The content distribution platform 330 receives the request at a transceiver input and assigns a processor for the interactive session. The transceiver is made up of a transmitter 334 and a receiver 332. The request includes at least indicia of the web page 380 and of the communication device. For example, the indicia may be the address of the web page/communication device or a code that can be located in a look-up table that is associated with the address. The web page address may be a Uniform Resource Locator (URL).


The content distribution platform 330 contains a transceiver (332,334), a pre-encoder 335, storage (memory) 333, a stream compositor 392 and a retrieving module 331. All of the functions performed by these modules may be performed by a single processor or each module may be a separate processor. Further, the storage/memory 333 may be part of the processor or separate from the processor.


It should be understood that FIG. 3 shows the individual modules that are used for a single interactive session requesting video content in conjunction with a web page. As stated above, the content distribution platform can contain a plurality of processors and each processor can control multiple simultaneous interactive sessions. Therefore, the content distribution platform may contain multiple copies of the shown modules on separate processors. It should be noted that some of the modules may be shared by multiple sessions and therefore, not all modules need be part of the same processor. For example, a content distribution platform may have only a single transceiver that is shared by many processors each maintaining at least one interactive session.


As previously mentioned, the present invention as embodied may be used with open access networks, such as the internet, or with closed access networks. In closed access networks where the video content is already in a format decodable by the decoder associated with the television, the content distribution platform need not decode and re-encode the video content using a pre-encoder module. In such a closed access network, the pre-encoder module need not be part of the content distribution platform.


In an open access network, the content distribution platform parses and reviews the links on a requested web-page. If a link indicates that the video content is not in a format decodable by the decoder, the content distribution platform can pre-retrieve the video content for re-encoding. The content distribution platform can perform this check by scanning the web page code (e.g. HTML) for known video content extensions. If the link indicates video content is in the proper format, the content distribution platform can wait until receiving a request for that link before retrieving the video content.
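
By way of illustration only (and not as part of the claimed subject matter), the following minimal Python sketch shows one way such a link scan could be performed; the extension lists, class name, and sample markup are assumptions rather than details taken from this specification.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

# Illustrative extension lists; an actual platform would use its own tables.
VIDEO_EXTENSIONS = {".mpg", ".mpeg", ".avi", ".qt", ".mov", ".wmv"}
DECODER_READY = {".mpg", ".mpeg"}  # assumed already decodable by the set-top decoder

class VideoLinkScanner(HTMLParser):
    """Collects href/src values whose file extension suggests video content."""
    def __init__(self):
        super().__init__()
        self.needs_preencoding = []  # pre-retrieve these for decoding and re-encoding
        self.decoder_ready = []      # wait for an explicit user request before fetching

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name not in ("href", "src") or not value:
                continue
            path = urlparse(value).path.lower()
            if "." not in path:
                continue
            ext = "." + path.rsplit(".", 1)[-1]
            if ext in DECODER_READY:
                self.decoder_ready.append(value)
            elif ext in VIDEO_EXTENSIONS:
                self.needs_preencoding.append(value)

scanner = VideoLinkScanner()
scanner.feed('<a href="clip.mov">Play</a> <a href="news.mpg">News</a>')
print(scanner.needs_preencoding)  # ['clip.mov'] -> candidate for pre-encoding
print(scanner.decoder_ready)      # ['news.mpg'] -> fetched only when requested
```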


In one example of how the content distribution platform operates, the content distribution platform 330 receives a request for a web page 380. The retriever 331 forwards that request, along with the return address for the content distribution platform, using the transceiver (332, 334) through a network 340, such as the Internet, a LAN (local-area network) or a WAN (wide-area network), to a server 350 associated with the address provided by the requesting communication device 310. The server 350 receives the request and responds to the request by sending the requested web page 380 to the transceiver (332, 334) of the content distribution platform. The transceiver of the content distribution platform sends the requested web page to a renderer 336. The renderer 336 produces a rendered version of the web page, placing the rendered version into a memory 333 that is accessed by an encoder that is part of the stream compositor 392. The web page may be rendered by accessing a web browser program and producing a video data output. The encoder of the stream compositor 392 encodes the renderer's output and stores the resulting web page data in memory 333. The web page is encoded as an MPEG (e.g., MPEG-2 or MPEG-4) video frame and is also provided to the communication device 310 as an MPEG video stream. For example, the MPEG video frame may be repeatedly transmitted until the web page is updated by the server 350. For the remainder of this specification, it will be presumed that the communication device 310 includes a decoder 393 that can decode MPEG encoded video and that the content distribution platform encodes the content into an MPEG format. This is done for simplification of explanation and in no way should be seen as limiting the invention to MPEG encoding schemes. Further, having the decoder within the communication device also should not be interpreted as limiting.


The retriever module 331 searches the web page for any links or other associated video content. If a link is found on the web page that is associated with video content that is not in a format decodable by the decoder associated with the television, the retriever 331 will make a request to the server 350 for the video content. Video content can be readily identified by the file name and associated extension (e.g., mpg, avi, qt, mov, etc.). When the video content 360 is received by the retriever 331, the retriever will forward the video content to the renderer 336, which provides the content to the pre-encoder 335. The pre-encoder 335 will decode the video content and re-encode the video content into a valid format for the communication device. The content is then stored to memory 333 and will only be retrieved if a user makes a request for such video content. By pre-encoding the video content prior to receiving a request for the video content, the video content will be either encoded or already in the process of being encoded when requested, allowing the video content to be transmitted more rapidly than if the content were retrieved only when a request is received. Further, once the video content is pre-encoded, the video content can be stored in memory and saved for later retrieval by another user of the system or for another session by the same user. The pre-encoder may also serve to perform off-line pre-encoding of known content. For example, if a user selects a website, the pre-encoder may access and begin pre-encoding all content from web pages associated with the website that is not in a format decodable by decoders within the network. Thus, in a cable television network in which a plurality of subscribers share the same content distribution platform, the video content is accessible and pre-encoded for all subscribers. The pre-encoded content can therefore reduce the time between a request being made and the display of video content on the television of the requesting subscriber.


If the content distribution platform is configured to allow sharing of pre-encoded content among multiple users of the network, the pre-encoded content can be stored in a repository. The repository may be located either locally or remotely from the content distribution platform. In such an embodiment, the content distribution platform includes a management module. The management module maintains the repository and contains a database of information regarding the pre-encoded content. The management module maintains a data structure that indicates the file name and the storage location within memory of the repository. For each pre-encoded content file, the database may include parameters indicating: whether the content is time sensitive; the time that the content was retrieved; the location from where the content was retrieved; the recording format of the content; a user identifier regarding the last person to request the content; and a counter identifying the number of times the content has been accessed. Additionally, the database can include other parameters.
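
Purely as an illustrative sketch (the field names and types below are assumptions, not details from this specification), such a database record might be modeled as follows:

```python
from dataclasses import dataclass
import time

@dataclass
class PreEncodedRecord:
    """One repository entry tracked by the management module (illustrative fields)."""
    file_name: str          # name of the pre-encoded content file
    storage_location: str   # where the file is stored within the repository
    time_sensitive: bool    # whether a freshness check is needed before reuse
    retrieved_at: float     # when the content was retrieved (epoch seconds)
    source_url: str         # location from which the content was retrieved
    recording_format: str   # e.g., "MPEG-2"
    last_requested_by: str  # identifier of the last user to request the content
    access_count: int = 0   # number of times the content has been accessed

# The management module's database, keyed here by source URL for request-time lookups.
repository_db = {}
repository_db["http://example.com/news.mov"] = PreEncodedRecord(
    file_name="news.mpg",
    storage_location="/repository/news.mpg",
    time_sensitive=True,
    retrieved_at=time.time(),
    source_url="http://example.com/news.mov",
    recording_format="MPEG-2",
    last_requested_by="subscriber-42",
)
```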


Each time a user requests content, the management module searches the repository to determine if the repository contains the requested content. If the content is stored in the repository, the management module determines if the content is time sensitive by accessing the database. If a parameter in the database indicates that the content is time sensitive, the management module requests information from the server providing the content to determine if the repository contains the most recent version of the content. For example, the management module may obtain a version number for the content or a timestamp of when the content was created/posted. The management module compares this information to the data in the database. If the repository contains the most recent version of the content, the management module directs the pre-encoded version of the content to the television of the requesting user. If the repository does not contain the most recent version of the content, the management module requests the content from the server. The management module causes the content distribution platform to transcode the requested content into a format that the decoder associated with the requesting television can decode. The content distribution platform then distributes the encoded content to the device associated with the requesting television.
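
Continuing the illustrative sketch above, the request-time check might look like the following; `fetch_remote_timestamp` and `transcode_and_store` are hypothetical callbacks standing in for the server query and the transcoding pipeline described in the text.

```python
def serve_request(url, repository_db, fetch_remote_timestamp, transcode_and_store):
    """Reuse a fresh pre-encoded copy when possible; otherwise re-fetch and transcode."""
    record = repository_db.get(url)
    if record is None:
        return transcode_and_store(url)      # nothing cached: fetch, transcode, store
    if not record.time_sensitive:
        return record.storage_location       # cached copy can be reused as-is
    remote_ts = fetch_remote_timestamp(url)  # e.g., version number or posting timestamp
    if remote_ts <= record.retrieved_at:
        return record.storage_location       # repository holds the most recent version
    return transcode_and_store(url)          # stale: request, transcode, redistribute
```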


In certain embodiments, the management module includes an algorithm for determining how long to maintain a pre-encoded file. The management module may have a fixed period for maintaining the content, for example 24 hours. Any pre-encoded content file that includes a timestamp that falls outside of the previous 24-hour period is purged from the repository. In other embodiments, the management module maintains content based upon popularity (i.e., the number of times a file is accessed within a given time period). For example, the algorithm may maintain the top 1000 content files, wherein a database keeps a counter for each file that a user accesses. The management module may maintain content using a combination of time and popularity, where the management module uses a weighting factor based upon popularity. For example, each file may be maintained for a set time period of 6 hours, but if the file is within the top 100 accessed files, the file will be maintained for an additional 6 hours. By regularly purging the repository, the repository memory can be used efficiently.
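
A minimal sketch of such a combined time-and-popularity purge, reusing the 6-hour and top-100 figures from the example above (all other details are assumptions), is shown below.

```python
import time

def purge(repository_db, base_ttl_hours=6, bonus_ttl_hours=6, top_n=100, now=None):
    """Drop entries older than their allowed retention period; the most frequently
    accessed entries earn additional retention time."""
    now = time.time() if now is None else now
    # Rank entries by access count; the top_n most popular earn extra retention.
    ranked = sorted(repository_db.values(), key=lambda r: r.access_count, reverse=True)
    popular = {r.source_url for r in ranked[:top_n]}
    for url, record in list(repository_db.items()):
        ttl_hours = base_ttl_hours + (bonus_ttl_hours if url in popular else 0)
        if now - record.retrieved_at > ttl_hours * 3600:
            del repository_db[url]
```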


In certain embodiments, the pre-encoded content can be maintained locally to a group of end users or to a single user. For example, the system maintains pre-encoded content for users within a 10 block radius. Thus, the management module is also situated locally with the pre-encoded content. Therefore, different localities may have different pre-encoded content. This would be preferable for city-wide or national systems, where local content (news, sports, weather) would be more likely to be pre-encoded and stored for the local users of the system.


If the network is a closed network, the retriever does not need to parse through the links nor does the video content need to be pre-encoded, since all of the video content is already in a format that is decodable by the decoder at the requesting television.


A subscriber then makes a request for video content 360 associated with a link on the requested web page 380 by using the user input device 390 to select the link. The requested web page 380 and the requested video content 360, although associated, may reside on different servers 350, 351. The link information is passed through the communication network 300 to the content distribution platform 330, and the content distribution platform 330 requests the video content or retrieves the video content from memory, depending on whether the video content needed to be pre-encoded.


An example of such a communication network for selecting a link of a web page that is displayed on a television is taught in U.S. patent application Ser. No. 10/895,776 entitled “Television Signal Transmission of Interlinked Data and Navigation Information for use By a Chaser Program” that is assigned to the same assignee and is incorporated herein by reference in its entirety. Reference to this application should not be viewed as limiting the invention to this particular communication network.


The compositor 392 retrieves the stored web page data and the video content, which is encoded as MPEG data. The web page and the video content are then composited together. The web page is saved as a series of macro blocks, each of which is a subset of pixels (e.g., 16×16) and which together comprise an entire frame of video. Each macro block of the web page is separately processed. The display position (macro block position) of the video content may be predetermined or determined during compositing by the compositor 392. Macro blocks within the web page that are to be overlaid by the video content are not processed. The macro blocks of the video content are then inserted in place of the macro blocks of the web page that are not processed. In order to preserve macro block alignment, the video content may need to be padded with pixels if the video content is not defined in whole macro block increments. In addition to the top left corner of the video content window being aligned to a macro block boundary, the right and bottom edges must also be aligned (the height and width must be divisible by 16). For example, if the video content is 100×100 pixels in size and each macro block is 16 pixels by 16 pixels, it would take 7×7 macro blocks (112 pixels by 112 pixels) to present the video content, and therefore 12 pixels of border padding would be added in each dimension around the video content. The content distribution platform would insert this border, and the border could be made any desired color. For example, the content distribution platform may make the border pixels black. This process is performed for all video content to be displayed.
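
The alignment arithmetic in the 100×100-pixel example can be expressed as a short sketch (the helper below is illustrative only):

```python
import math

MACRO_BLOCK = 16  # pixels per macro block side

def padded_dimensions(width, height):
    """Round a video window up to whole macro blocks and report the padding required."""
    mb_cols = math.ceil(width / MACRO_BLOCK)
    mb_rows = math.ceil(height / MACRO_BLOCK)
    padded_w, padded_h = mb_cols * MACRO_BLOCK, mb_rows * MACRO_BLOCK
    return (mb_cols, mb_rows), (padded_w, padded_h), (padded_w - width, padded_h - height)

# 100x100 video: 7x7 macro blocks (112x112), 12 pixels of border padding per dimension,
# which the content distribution platform fills with border pixels (e.g., black).
print(padded_dimensions(100, 100))  # ((7, 7), (112, 112), (12, 12))
```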


Each composited frame is then transmitted by the transceiver (332, 334) through the communication network 300 to the communication device. The communication device 310 can then use the decoder 393 to decode the MPEG video stream and provide the output to the television set. The video content 360 will then be displayed on the television 320. Thus, it appears to a viewer of the television that the web page is rendered locally with the video content, even though the rendering occurs on the content distribution platform. It should be understood that the communication device may include a digital-to-analog converter for converting the decoded MPEG video to an analog signal for display on an analog television or for providing that signal to a component, composite, or other analog input on a digital television.



FIG. 4 is a flow chart showing the methodology employed at the content distribution platform when a user selects a link on a web page for display of video content. The content distribution platform receives a request for display of video content associated with a link on the displayed web page (410). The request is sent from the communication device and includes a representation of the web address for the video content. For example, the set-top box/communication device may transmit a signal that represents a direction/relative location on the display. In one embodiment, the platform contains a predetermined map for associating a user's input with a link. In another embodiment, the position of each link on the web page is mapped upon receiving the web page, and the content distribution platform creates a look-up table that associates the directional command with a link. Based upon this received directional signal, the content distribution platform can relate the direction/relative location signal to the desired link from the web page. Further explanation regarding embodiments of mapping links with directional commands is disclosed in U.S. patent application Ser. No. 09/997,608 entitled “System and Method for Hyperlink Navigation Between Frames,” which is assigned to the same assignee and is incorporated herein by reference in its entirety.
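
For illustration, one possible form of such a look-up table is sketched below; the link identifiers, URLs, and table contents are hypothetical and not taken from the referenced application.

```python
# Hypothetical navigation table built when the web page is received: for the link
# currently highlighted, each directional command from the remote maps to another link.
navigation_table = {
    ("link_news_video", "RIGHT"): "link_sports_video",
    ("link_news_video", "DOWN"):  "link_weather_story",
    ("link_sports_video", "LEFT"): "link_news_video",
}

# Address associated with each link identifier on the rendered page (illustrative values).
link_targets = {
    "link_news_video":    "http://example.com/news.mpg",
    "link_sports_video":  "http://example.com/sports.mpg",
    "link_weather_story": "http://example.com/weather.html",
}

def resolve_command(focused_link, command):
    """Translate a set-top box directional command into the link the user selected."""
    new_focus = navigation_table.get((focused_link, command), focused_link)
    return new_focus, link_targets[new_focus]

print(resolve_command("link_news_video", "RIGHT"))
# ('link_sports_video', 'http://example.com/sports.mpg')
```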


The content distribution platform then retrieves the video content (420). If the video content is already in a format that is decodable by the decoder associated with the requesting television, the content distribution platform directs the request, with the address of the link, to the server through the Internet to retrieve the video content. The server receives the request for the video content and forwards the video content to the content distribution platform. The content distribution platform, which has maintained an active interactive session with the communication device requesting the video content, receives the video content and associates the video content with the interactive session. The video content is preferably an MPEG stream. Additionally, the content distribution platform may receive periodic updates of the web page data (RGB data received into a video buffer, which is converted into YUV image data). If the video content was not in a format that is decodable by the decoder and was previously retrieved and pre-encoded, the content distribution platform retrieves the pre-encoded video content from memory.


The content distribution platform then composites the pre-encoded video content and the web page together (430). The compositor creates an MPEG video stream from the web page data and the MPEG video content. For each frame of the MPEG video stream transmitted to the decoder, the compositor encodes each macro block of the web page data in real-time and inserts the pre-encoded video content into the encoded web page macro block data.


The compositor divides the data (YUV data) of the web page into macro blocks and determines the position for display of the video content within the web page. The position relates to the macro block locations for the video content when displayed on a display device. For each frame of the MPEG stream, the compositor parses the video content into frames and determines the frame-type of the video content frame. After the frame-type is determined, the macro blocks of the web page are encoded in real-time based upon the type of frame. The macro blocks of the web page data that overlap with the video content are not encoded and are discarded. The compositor splices the encoded video content macro blocks in with the encoded web page macro blocks at the pre-determined position. This compositing step continues for each frame of video content.
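
A simplified sketch of this per-frame splice is shown below; the macro block contents and the encoder callback are placeholders standing in for the real-time encoder, not an actual MPEG implementation.

```python
def composite_frame(page_blocks, video_blocks, frame_type, encode_page_block):
    """Composite one frame as a grid of encoded macro blocks.

    page_blocks:  2-D list of raw (YUV) web page macro blocks.
    video_blocks: dict mapping (row, col) -> pre-encoded video macro block; these
                  positions define the video window.
    frame_type:   'I' or 'P', taken from the current frame of the video content.
    encode_page_block: stand-in for real-time encoding of a single web page block.
    """
    frame = []
    for r, row in enumerate(page_blocks):
        out_row = []
        for c, raw_block in enumerate(row):
            if (r, c) in video_blocks:
                # Web page data under the video window is discarded; the
                # pre-encoded video macro block is spliced in unchanged.
                out_row.append(video_blocks[(r, c)])
            else:
                out_row.append(encode_page_block(raw_block, frame_type))
        frame.append(out_row)
    return frame

# Toy usage: a 3x4 page with a 1x2 video window at row 1, columns 1-2.
page = [[f"yuv{r}{c}" for c in range(4)] for r in range(3)]
video = {(1, 1): "VID_MB_A", (1, 2): "VID_MB_B"}
composited = composite_frame(page, video, "P", lambda blk, ft: f"{ft}-enc({blk})")
print(composited[1])  # ['P-enc(yuv10)', 'VID_MB_A', 'VID_MB_B', 'P-enc(yuv13)']
```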


The web page data is repeatedly used in the compositing process; however, all of the information need not be transmitted, since much of each web page is temporally static. The same encoded web page data can be reused until the web page changes. As explained below, the web page macro block data is encoded in real-time, and the manner in which it is encoded (as an intercoded or intracoded macro block, etc.) is determined based upon the type of frame of the video content that is being composited with the encoded web page.


Since the content distribution platform maintains an internet session with the server from which the web page was received, the content distribution platform may receive updated content for a web page. When such an update is received, the content distribution platform replaces the old web page with the content of the new web page. The compositor encodes the new web page content, discards macro blocks that overlap with the video content, and splices the video content with the new encoded web page content as explained above with respect to 430 and below in FIG. 5.


As each frame is composited, the frame is transmitted into the communication network to the address associated with the communication device (440). The communication device then decodes the composited MPEG video stream and displays the stream on the television set. To the subscriber, the image on the television set appears as if the television is rendering the web page and video content in a web browser, when in actuality, the images on the television are merely decoded MPEG video frames.



FIG. 5 is a flow chart that elaborates on the compositing process performed by the compositor. First, the sources to be displayed are determined and obtained (500). A request for content is issued by a user. For example, a user requests the display of a pre-encoded video source by selecting a link on a web page. Thus, there are at least two sources of information: a background image (the web page) and a foreground image (the video content). It should be recognized that the background (the web page) is not encoded when received by the compositor, whereas the foreground (the video content) is received as pre-encoded data. Further, there may be more than one video source. The additional video sources may be part of the web page. An example of a web page that contains two video sources is presented in FIG. 6, wherein the video sources are labeled X and Y.


The location and size of the video content with respect to the background are next determined by the content distribution platform (510). The location may be predefined with respect to the background; for example, the video content associated with the link may be positioned at the center of the background. Similarly, the size of the video content may be preset. The content distribution platform may allow the video content to be shown at its native resolution. In other embodiments, the size of the video content may be limited to a number of macro blocks (e.g., 10×10 or 15×15). In such embodiments, the compositor scales the video content as is known to those in the art. Once the location and size are fixed, the compositor determines whether any border region is necessary, so that the video source material lies on a macro block boundary of the background (520).


Next, the visible macro blocks are determined (530). A visible macro block is a macro block that is not obscured by another macro block that overlays it. The selected pre-encoded video content overlays a portion of the web page and therefore obscures a section of the web page. As shown in FIG. 6, frame F is broken up into a plurality of macro blocks. Macro blocks from the two video content sources X and Y overlap with each other and also overlap with the frame background F. Video content source X lies on top of some of the macro blocks of video source Y, and both X and Y lie on top of the frame background. As a result, not all of video source Y is displayed. Similarly, not all of the macro blocks of the frame background are displayed. The content distribution platform determines the topmost macro block for each macro block position in the frame.
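
The layering in FIG. 6 can be illustrated with a small sketch that records, for each macro block position, which source is visible on top; the regions below are arbitrary examples, not the actual figure geometry.

```python
def topmost_layers(frame_positions, layers):
    """Record which source is visible at each macro block position.

    layers is ordered bottom to top, e.g. [("F", background), ("Y", y), ("X", x)],
    so later (higher) layers overwrite earlier ones.
    """
    visible = {pos: None for pos in frame_positions}
    for name, region in layers:
        for pos in region:
            if pos in visible:
                visible[pos] = name
    return visible

# Toy 4x6 frame: background F everywhere, Y covering one region, X overlapping part of Y.
background = {(r, c) for r in range(4) for c in range(6)}
y_region = {(r, c) for r in range(1, 4) for c in range(0, 4)}
x_region = {(r, c) for r in range(0, 3) for c in range(2, 6)}
visible = topmost_layers(background, [("F", background), ("Y", y_region), ("X", x_region)])
print(visible[(2, 3)])  # 'X'  (X lies on top of Y at this position)
print(visible[(3, 1)])  # 'Y'  (only Y covers this background block)
print(visible[(3, 5)])  # 'F'  (neither video source covers this block)
```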


The compositor then begins to encode each frame of the MPEG stream. First, the overall frame type is determined. The compositor inquires whether the frame should be an I or a P MPEG frame (540). The frame type is selected based upon the frame type of the video content that is being composited with the background. If the frame type of the current frame of any of the video content sources is a P-type frame, then the overall frame type will be a P frame. If the frame type of every video content source is an I frame, then the overall frame type will be an I frame. Referencing FIG. 6, if the current video frame for video source X is an I frame, and the current video frame for video source Y is a P frame, then the overall frame type will be a P frame. If the frame type of all of the video content that is to be composited is I type (e.g., X and Y are both I frames), then the overall frame type is an I frame. Once the overall frame type is determined, the MPEG frame headers are written.
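
This selection rule reduces to a one-line check, sketched here for I and P frames only:

```python
def overall_frame_type(source_frame_types):
    """Composited frame is an I frame only when every current source frame is an I frame;
    a single P frame among the sources forces a P composited frame."""
    return "I" if all(t == "I" for t in source_frame_types) else "P"

print(overall_frame_type(["I", "P"]))  # 'P'  (X is an I frame, Y is a P frame)
print(overall_frame_type(["I", "I"]))  # 'I'
```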


Next, the macro blocks are each systematically and individually processed. The compositor inquires if the current macro block being processed is already pre-encoded and, therefore, part of the video content (550). If the answer is no, the macro block contains data from the web page. If the compositor has determined that the overall frame type is a P-type frame, the encoder decides whether to encode the web page macro block as an intercoded macro block or as an intracoded macro block (570). The encoder will generally encode the macro block as an intercoded macro block (575), but if there are changes above a threshold in the data content of the macro block as compared to the macro block at the same location from the previously encoded frame, the encoder will encode the macro block as an intracoded macro block (572). If the overall frame type is an I-type frame, the web page macro block is intracoded (560). Thus, only the background/non-video content material is encoded in real-time. If the macro block does contain pre-encoded data (video content), the video content macro block is spliced into the macro block sequence regardless of the overall frame type (577). The encoding methodology is repeated for each macro block until the frame is complete (578). Once a frame is completely encoded, the content distribution platform inquires whether each of the frames of video content within the video sequence has been encoded (580). If all of the video content has been encoded or the communication device sends a stop command to the content distribution platform, the sequence ends and compositing stops. If all of the frames have not been processed, then the process returns to block 540.
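
A sketch of this per-macro-block decision path follows; the encoder callbacks, the difference measure, and the threshold are placeholders, and only the branching mirrors the flow chart.

```python
def process_macroblock(position, raw_block, prev_block, video_blocks, frame_type,
                       intra_encode, inter_encode, change_threshold=0.1,
                       difference=lambda a, b: 1.0 if a != b else 0.0):
    """One macro block of the composited frame, following the FIG. 5 decision path."""
    if position in video_blocks:            # 550: macro block is pre-encoded video content
        return video_blocks[position]       # 577: splice in regardless of frame type
    if frame_type == "I":                   # I frame: web page block is intracoded (560)
        return intra_encode(raw_block)
    # P frame (570): intercode unless the block changed too much since the previous frame.
    if difference(raw_block, prev_block) > change_threshold:
        return intra_encode(raw_block)      # 572
    return inter_encode(raw_block, prev_block)  # 575

out = process_macroblock((0, 0), "yuv_new", "yuv_old", {}, "P",
                         intra_encode=lambda b: f"intra({b})",
                         inter_encode=lambda b, p: f"inter({b})")
print(out)  # intra(yuv_new): the block changed, so it is intracoded even in a P frame
```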


As the background and the video content are composited together frame by frame and constructed into an MPEG data stream, the encoded MPEG stream is sent to the communication device through the communication network and is decoded by the decoder and displayed on the subscriber's television.


In the previous example, it was assumed that the background was a web page from the internet. The background need not be a web page and may come from other sources. For example, the background may be a cable operator's background image and not a web page, wherein video content is composited with the cable operator's background.


Although various exemplary embodiments of the invention have been disclosed, it should be apparent to those skilled in the art that various changes and modifications can be made that will achieve some of the advantages of the invention without departing from the true scope of the invention. These and other obvious modifications are intended to be covered by the appended claims.

Claims
  • 1. A method of processing encoded video content that includes a plurality of encoded video frames, the encoded video content being addressed by a link on a static source image, for display of the encoded video content and the static source image on a television, the television coupled to a communication network through a communication device, the method comprising: retrieving the encoded video content addressed by the link; and in response to receiving a request for display of the encoded video content on the television, at a server, forming a sequence of composite video frames in a format decodable by the communication device, each composite video frame being formed by compositing an encoded frame of the encoded video content with at least one portion of the static source image, wherein the compositing includes selecting a frame type, from a plurality of frame types, of the encoded video content based on the frame type of the encoded video frame and encoding the at least one portion of the static source image according to the selected frame type.
  • 2. The method according to claim 1, wherein a portion of the static source image is removed, and compositing includes, when forming each composite video frame, inserting the encoded video content at a location of the static source image where the removed portion was located.
  • 3. The method according to claim 1, further comprising: transmitting the sequence of composite video frames through the communication network to the communication device that transmitted the request.
  • 4. The method according to claim 1, wherein at least a portion of the static source image is encoded as an MPEG frame prior to compositing and after receiving the request for display of the encoded video content.
  • 5. The method according to claim 1, wherein retrieving the encoded video content comprises: searching a repository for a stored version of the encoded video content; if the stored version of the encoded video content is in the repository, determining whether the encoded video content is time-sensitive; and if the encoded video content is time-sensitive, determining whether the stored version of the encoded video content is the most recent version of the encoded video content.
  • 6. The method according to claim 5, wherein determining whether the stored version of the encoded video content is the most recent version of the encoded video content includes comparing a timestamp associated with the stored version of the encoded video content to a timestamp of the encoded video content associated with the link.
  • 7. The method according to claim 5, wherein retrieving the encoded video content includes, if the stored version of the encoded video content is the most recent version of the encoded video content, retrieving the stored version of the encoded video content from the repository.
  • 8. The method according to claim 5, wherein retrieving the encoded video content includes, if the stored version of the encoded video content is not the most recent version of the encoded video content, requesting the encoded video content from a source associated with the link.
  • 9. The method according to claim 5, wherein the repository is located remotely from the content distribution platform.
  • 10. The method according to claim 1, further comprising: determining a group of end users associated with the encoded video content, each user in the group of end users having a communication device associated with a television.
  • 11. The method according to claim 10, wherein the group of end users consists of a single user.
  • 12. The method according to claim 10, wherein the group of end users is defined using a national boundary, a city boundary, or a circle having a given center and radius.
  • 13. The method according to claim 1, wherein the selected frame type of each encoded video frame is an I-frame or a P-frame.
  • 14. The method according to claim 13, further comprising: retrieving additional encoded video content, the additional encoded video content including a plurality of additional encoded video frames, wherein the additional encoded video content is to be displayed with the encoded video content and the static source image on the television; and selecting the frame type of the composite video frame to be an I-frame if the frame type of the encoded frame and a frame type of a respective frame of the additional encoded video frames are both I-frames.
  • 15. The method according to claim 13, further comprising: retrieving additional encoded video content, the additional encoded video content including a plurality of additional encoded video frames, wherein the additional encoded video content is to be displayed with the encoded video content and the static source image on the television; and selecting the frame type of the composite video frame to be a P-frame if the frame type of either of the encoded video frame and a respective frame of the additional encoded video frames is a P-frame.
  • 16. The method according to claim 1, wherein selecting the frame type includes selecting based on the encoding type of one or more macroblocks of the encoded video frame.
  • 17. A system for providing, to a decoder associated with a television, encoded video content that includes a plurality of video frames, the encoded video content being addressed by a link on a static source image, the system comprising: a receiver for receiving, from the decoder, a request for display of at least the encoded video content on the television; a retriever for retrieving the encoded video content addressed by the link on the static source image; and a server, comprising: a compositor for creating a sequence of composite video frames in response to the receiver receiving the request for display of the encoded video content, wherein for each composite video frame, the compositor selects a frame type for the composite video frame, from a plurality of frame types, of the encoded video content based on the frame type of an encoded frame of the encoded video content, and combines at least a portion of the static source image with the encoded frame of encoded video content, the static source image being encoded according to the selected frame type.
  • 18. The system according to claim 17, further comprising an encoder enabled to encode the static source image with a portion of the static source image removed, wherein the compositor creates a series of video frames that includes the encoded video content inserted into the removed portion of the encoded static source image.
  • 19. The system according to claim 18, wherein the encoder is enabled to decode previously encoded video content into decoded video content.
  • 20. The system according to claim 19, wherein the encoder is enabled to encode the decoded video content into the encoded video content.
  • 21. The system according to claim 20, wherein the encoder is enabled to encode the decoded video content as a series of MPEG frames.
  • 22. The system according to claim 17, wherein the selected frame type of each encoded video frame is an I-frame or a P-frame.
  • 23. The system according to claim 22, wherein the retriever is further enabled to retrieve additional encoded video content, the additional encoded video content including a plurality of additional encoded video frames, and wherein the compositor is further enabled to select the frame type of the composite video frame to be an I-frame if the frame type of the encoded frame and a frame type of a respective frame of the additional encoded video frames are both I-frames.
  • 24. The system according to claim 22, wherein the retriever is further enabled to retrieve additional encoded video content, the additional encoded video content including a plurality of additional encoded video frames, and wherein the compositor is further enabled to select the frame type of the composite video frame to be a P-frame if the frame type of either the encoded video frame or a respective frame of the additional encoded video frames is a P-frame.
  • 25. The system according to claim 17, wherein selecting the frame type includes selecting based on the encoding type of one or more macroblocks of the encoded video frame.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. application Ser. No. 11/258,601, filed Oct. 25, 2005, which claims the benefit of U.S. Provisional Application No. 60/702,507, filed Jul. 26, 2005. The contents of these prior applications are incorporated herein by reference in their entireties.

US Referenced Citations (699)
Number Name Date Kind
3889050 Thompson Jun 1975 A
3934079 Barnhart Jan 1976 A
3997718 Ricketts et al. Dec 1976 A
4002843 Rackman Jan 1977 A
4032972 Saylor Jun 1977 A
4077006 Nicholson Feb 1978 A
4081831 Tang et al. Mar 1978 A
4107734 Percy et al. Aug 1978 A
4107735 Frohbach Aug 1978 A
4145720 Weintraub et al. Mar 1979 A
4168400 de Couasnon et al. Sep 1979 A
4186438 Benson et al. Jan 1980 A
4222068 Thompson Sep 1980 A
4245245 Matsumoto et al. Jan 1981 A
4247106 Jeffers et al. Jan 1981 A
4253114 Tang et al. Feb 1981 A
4264924 Freeman Apr 1981 A
4264925 Freeman et al. Apr 1981 A
4290142 Schnee et al. Sep 1981 A
4302771 Gargini Nov 1981 A
4308554 Percy et al. Dec 1981 A
4350980 Ward Sep 1982 A
4367557 Stern et al. Jan 1983 A
4395780 Gohm et al. Jul 1983 A
4408225 Ensinger et al. Oct 1983 A
4450477 Lovett May 1984 A
4454538 Toriumi Jun 1984 A
4466017 Banker Aug 1984 A
4471380 Mobley Sep 1984 A
4475123 Dumbauld et al. Oct 1984 A
4484217 Block et al. Nov 1984 A
4491983 Pinnow et al. Jan 1985 A
4506387 Walter Mar 1985 A
4507680 Freeman Mar 1985 A
4509073 Baran et al. Apr 1985 A
4523228 Banker Jun 1985 A
4533948 McNamara et al. Aug 1985 A
4536791 Campbell et al. Aug 1985 A
4538174 Gargini et al. Aug 1985 A
4538176 Nakajima et al. Aug 1985 A
4553161 Citta Nov 1985 A
4554581 Tentler et al. Nov 1985 A
4555561 Sugimori et al. Nov 1985 A
4562465 Glaab Dec 1985 A
4567517 Mobley Jan 1986 A
4573072 Freeman Feb 1986 A
4591906 Morales-Garza et al. May 1986 A
4602279 Freeman Jul 1986 A
4614970 Clupper et al. Sep 1986 A
4616263 Eichelberger Oct 1986 A
4625235 Watson Nov 1986 A
4627105 Ohashi et al. Dec 1986 A
4633462 Stifle et al. Dec 1986 A
4670904 Rumreich Jun 1987 A
4682360 Frederiksen Jul 1987 A
4695880 Johnson et al. Sep 1987 A
4706121 Young Nov 1987 A
4706285 Rumreich Nov 1987 A
4709418 Fox et al. Nov 1987 A
4710971 Nozaki et al. Dec 1987 A
4718086 Rumreich et al. Jan 1988 A
4732764 Hemingway et al. Mar 1988 A
4734764 Pocock et al. Mar 1988 A
4748689 Mohr May 1988 A
4749992 Fitzemeyer et al. Jun 1988 A
4750036 Martinez Jun 1988 A
4754426 Rast et al. Jun 1988 A
4760442 O'Connell et al. Jul 1988 A
4763317 Lehman et al. Aug 1988 A
4769833 Farleigh et al. Sep 1988 A
4769838 Hasegawa Sep 1988 A
4789863 Bush Dec 1988 A
4792849 McCalley et al. Dec 1988 A
4801190 Imoto Jan 1989 A
4805134 Calo et al. Feb 1989 A
4807031 Broughton et al. Feb 1989 A
4816905 Tweety et al. Mar 1989 A
4821102 Ichikawa et al. Apr 1989 A
4823386 Dumbauld et al. Apr 1989 A
4827253 Maltz May 1989 A
4827511 Masuko May 1989 A
4829372 McCalley et al. May 1989 A
4829558 Welsh May 1989 A
4847698 Freeman Jul 1989 A
4847699 Freeman Jul 1989 A
4847700 Freeman Jul 1989 A
4848698 Newell et al. Jul 1989 A
4860379 Schoeneberger et al. Aug 1989 A
4864613 Van Cleave Sep 1989 A
4876592 Von Kohorn Oct 1989 A
4889369 Albrecht Dec 1989 A
4890320 Monslow et al. Dec 1989 A
4891694 Way Jan 1990 A
4901367 Nicholson Feb 1990 A
4903126 Kassatly Feb 1990 A
4905094 Pocock et al. Feb 1990 A
4912760 West, Jr. et al. Mar 1990 A
4918516 Freeman Apr 1990 A
4920566 Robbins et al. Apr 1990 A
4922532 Farmer et al. May 1990 A
4924303 Brandon et al. May 1990 A
4924498 Farmer et al. May 1990 A
4937821 Boulton Jun 1990 A
4941040 Pocock et al. Jul 1990 A
4947244 Fenwick et al. Aug 1990 A
4961211 Tsugane et al. Oct 1990 A
4963995 Lang Oct 1990 A
4975771 Kassatly Dec 1990 A
4989245 Bennett Jan 1991 A
4994909 Graves et al. Feb 1991 A
4995078 Monslow et al. Feb 1991 A
5003384 Durden et al. Mar 1991 A
5008934 Endoh Apr 1991 A
5014125 Pocock et al. May 1991 A
5027400 Baji et al. Jun 1991 A
5051720 Kittirutsunetorn Sep 1991 A
5051822 Rhoades Sep 1991 A
5057917 Shalkauser et al. Oct 1991 A
5058160 Banker et al. Oct 1991 A
5060262 Bevins, Jr et al. Oct 1991 A
5077607 Johnson et al. Dec 1991 A
5083800 Lockton Jan 1992 A
5088111 McNamara et al. Feb 1992 A
5093718 Hoarty et al. Mar 1992 A
5109414 Harvey et al. Apr 1992 A
5113496 McCalley et al. May 1992 A
5119188 McCalley et al. Jun 1992 A
5130792 Tindell et al. Jul 1992 A
5132992 Yurt et al. Jul 1992 A
5133009 Rumreich Jul 1992 A
5133079 Ballantyne et al. Jul 1992 A
5136411 Paik et al. Aug 1992 A
5142575 Farmer et al. Aug 1992 A
5144448 Hombaker, III et al. Sep 1992 A
5155591 Wachob Oct 1992 A
5172413 Bradley et al. Dec 1992 A
5191410 McCalley et al. Mar 1993 A
5195092 Wilson et al. Mar 1993 A
5208665 McCalley et al. May 1993 A
5220420 Hoarty et al. Jun 1993 A
5230019 Yanagimichi et al. Jul 1993 A
5231494 Wachob Jul 1993 A
5236199 Thompson, Jr. Aug 1993 A
5247347 Letteral et al. Sep 1993 A
5253341 Rozmanith et al. Oct 1993 A
5262854 Ng Nov 1993 A
5262860 Fitzpatrick et al. Nov 1993 A
5303388 Kreitman et al. Apr 1994 A
5319455 Hoarty et al. Jun 1994 A
5319707 Wasilewski et al. Jun 1994 A
5321440 Yanagihara et al. Jun 1994 A
5321514 Martinez Jun 1994 A
5351129 Lai Sep 1994 A
5355162 Yazolino et al. Oct 1994 A
5359601 Wasilewski et al. Oct 1994 A
5361091 Hoarty et al. Nov 1994 A
5371532 Gelman et al. Dec 1994 A
5404393 Remillard Apr 1995 A
5408274 Chang et al. Apr 1995 A
5410343 Coddington et al. Apr 1995 A
5410344 Graves et al. Apr 1995 A
5412415 Cook et al. May 1995 A
5412720 Hoarty May 1995 A
5418559 Blahut May 1995 A
5422674 Hooper et al. Jun 1995 A
5422887 Diepstraten et al. Jun 1995 A
5442389 Blahut et al. Aug 1995 A
5442390 Hooper et al. Aug 1995 A
5442700 Snell et al. Aug 1995 A
5446490 Blahut et al. Aug 1995 A
5469283 Vinel et al. Nov 1995 A
5469431 Wendorf et al. Nov 1995 A
5471263 Odaka Nov 1995 A
5481542 Logston et al. Jan 1996 A
5485197 Hoarty Jan 1996 A
5487066 McNamara et al. Jan 1996 A
5493638 Hooper et al. Feb 1996 A
5495283 Cowe Feb 1996 A
5495295 Long Feb 1996 A
5497187 Banker et al. Mar 1996 A
5517250 Hoogenboom et al. May 1996 A
5526034 Hoarty et al. Jun 1996 A
5528281 Grady et al. Jun 1996 A
5537397 Abramson Jul 1996 A
5537404 Bentley et al. Jul 1996 A
5539449 Blahut et al. Jul 1996 A
RE35314 Logg Aug 1996 E
5548340 Bertram Aug 1996 A
5550578 Hoarty et al. Aug 1996 A
5557316 Hoarty et al. Sep 1996 A
5559549 Hendricks et al. Sep 1996 A
5561708 Remillard Oct 1996 A
5570126 Blahut et al. Oct 1996 A
5570363 Holm Oct 1996 A
5579143 Huber Nov 1996 A
5581653 Todd Dec 1996 A
5583927 Ely et al. Dec 1996 A
5587734 Lauder et al. Dec 1996 A
5589885 Ooi Dec 1996 A
5592470 Rudrapatna et al. Jan 1997 A
5594507 Hoarty Jan 1997 A
5594723 Tibi Jan 1997 A
5594938 Engel Jan 1997 A
5596693 Needle et al. Jan 1997 A
5600364 Hendricks et al. Feb 1997 A
5600573 Hendricks et al. Feb 1997 A
5608446 Carr et al. Mar 1997 A
5617145 Huang et al. Apr 1997 A
5621464 Teo et al. Apr 1997 A
5625404 Grady et al. Apr 1997 A
5630757 Gagin et al. May 1997 A
5631693 Wunderlich et al. May 1997 A
5631846 Szurkowski May 1997 A
5632003 Davidson et al. May 1997 A
5649283 Galler et al. Jul 1997 A
5668592 Spaulding, II Sep 1997 A
5668599 Cheney et al. Sep 1997 A
5708767 Yeo et al. Jan 1998 A
5710815 Ming et al. Jan 1998 A
5712906 Grady et al. Jan 1998 A
5740307 Lane Apr 1998 A
5742289 Naylor et al. Apr 1998 A
5748234 Lippincott May 1998 A
5754941 Sharpe et al. May 1998 A
5786527 Tarte Jul 1998 A
5790174 Richard, III et al. Aug 1998 A
5802283 Grady et al. Sep 1998 A
5812665 Hoarty et al. Sep 1998 A
5812786 Seazholtz et al. Sep 1998 A
5815604 Simons et al. Sep 1998 A
5818438 Howe et al. Oct 1998 A
5821945 Yeo et al. Oct 1998 A
5822537 Katseff et al. Oct 1998 A
5828371 Cline et al. Oct 1998 A
5844594 Ferguson Dec 1998 A
5845083 Hamadani et al. Dec 1998 A
5862325 Reed et al. Jan 1999 A
5864820 Case Jan 1999 A
5867208 McLaren Feb 1999 A
5883661 Hoarty Mar 1999 A
5903727 Nielsen May 1999 A
5903816 Broadwin et al. May 1999 A
5905522 Lawler May 1999 A
5907681 Bates et al. May 1999 A
5917822 Lyles et al. Jun 1999 A
5946352 Rowlands et al. Aug 1999 A
5952943 Walsh et al. Sep 1999 A
5959690 Toebes et al. Sep 1999 A
5961603 Kunkel et al. Oct 1999 A
5963203 Goldberg et al. Oct 1999 A
5966163 Lin et al. Oct 1999 A
5978756 Walker et al. Nov 1999 A
5982445 Eyer et al. Nov 1999 A
5990862 Lewis Nov 1999 A
5995146 Rasmussen Nov 1999 A
5995488 Kalkunte et al. Nov 1999 A
5999970 Krisbergh et al. Dec 1999 A
6014416 Shin et al. Jan 2000 A
6021386 Davis et al. Feb 2000 A
6031989 Cordell Feb 2000 A
6034678 Hoarty et al. Mar 2000 A
6049539 Lee et al. Apr 2000 A
6049831 Gardell et al. Apr 2000 A
6052555 Ferguson Apr 2000 A
6055314 Spies et al. Apr 2000 A
6055315 Doyle et al. Apr 2000 A
6064377 Hoarty et al. May 2000 A
6078328 Schumann et al. Jun 2000 A
6084908 Chiang et al. Jul 2000 A
6100883 Hoarty Aug 2000 A
6108625 Kim Aug 2000 A
6131182 Beakes et al. Oct 2000 A
6141645 Chi-Min et al. Oct 2000 A
6141693 Perlman et al. Oct 2000 A
6144698 Poon et al. Nov 2000 A
6167084 Wang et al. Dec 2000 A
6169573 Sampath-Kumar et al. Jan 2001 B1
6177931 Alexander et al. Jan 2001 B1
6182072 Leak et al. Jan 2001 B1
6184878 Alonso et al. Feb 2001 B1
6192081 Chiang et al. Feb 2001 B1
6198822 Doyle et al. Mar 2001 B1
6205582 Hoarty Mar 2001 B1
6226041 Florencio et al. May 2001 B1
6236730 Cowieson et al. May 2001 B1
6243418 Kim Jun 2001 B1
6253238 Lauder et al. Jun 2001 B1
6256047 Isobe et al. Jul 2001 B1
6259826 Pollard et al. Jul 2001 B1
6266369 Wang et al. Jul 2001 B1
6266684 Kraus et al. Jul 2001 B1
6275496 Burns et al. Aug 2001 B1
6292194 Powell, III Sep 2001 B1
6305020 Hoarty et al. Oct 2001 B1
6317151 Ohsuga et al. Nov 2001 B1
6317885 Fries Nov 2001 B1
6349284 Park et al. Feb 2002 B1
6386980 Nishino et al. May 2002 B1
6389075 Wang et al. May 2002 B2
6446037 Fielder et al. Sep 2002 B1
6459427 Mao et al. Oct 2002 B1
6481012 Gordon et al. Nov 2002 B1
6512793 Maeda Jan 2003 B1
6525746 Lau et al. Feb 2003 B1
6536043 Guedalia Mar 2003 B1
6557041 Mallart Apr 2003 B2
6560496 Michener May 2003 B1
6564378 Satterfield et al. May 2003 B1
6578201 LaRocca et al. Jun 2003 B1
6579184 Tanskanen Jun 2003 B1
6584153 Gordon et al. Jun 2003 B1
6588017 Calderone Jul 2003 B1
6598229 Smyth et al. Jul 2003 B2
6604224 Armstrong et al. Aug 2003 B1
6614442 Ouyang et al. Sep 2003 B1
6621870 Gordon et al. Sep 2003 B1
6625574 Taniguchi et al. Sep 2003 B1
6639896 Goode et al. Oct 2003 B1
6645076 Sugai Nov 2003 B1
6651252 Gordon et al. Nov 2003 B1
6657647 Bright Dec 2003 B1
6675385 Wang Jan 2004 B1
6675387 Boucher Jan 2004 B1
6681326 Son et al. Jan 2004 B2
6681397 Tsai et al. Jan 2004 B1
6684400 Goode et al. Jan 2004 B1
6687663 McGrath et al. Feb 2004 B1
6691208 Dandrea et al. Feb 2004 B2
6697376 Son et al. Feb 2004 B1
6704359 Bayrakeri et al. Mar 2004 B1
6717600 Dutta et al. Apr 2004 B2
6718552 Goode Apr 2004 B1
6721794 Taylor et al. Apr 2004 B2
6721956 Wasilewski Apr 2004 B2
6727929 Bates et al. Apr 2004 B1
6732370 Gordon et al. May 2004 B1
6747991 Hemy et al. Jun 2004 B1
6754271 Gordon et al. Jun 2004 B1
6754905 Gordon et al. Jun 2004 B2
6758540 Adolph et al. Jul 2004 B1
6766407 Lisitsa et al. Jul 2004 B1
6771704 Hannah Aug 2004 B1
6785902 Zigmond et al. Aug 2004 B1
6807528 Truman et al. Oct 2004 B1
6810528 Chatani Oct 2004 B1
6817947 Tanskanen Nov 2004 B2
6886178 Mao et al. Apr 2005 B1
6907574 Xu et al. Jun 2005 B2
6931291 Alvarez-Tinoco et al. Aug 2005 B1
6941019 Mitchell et al. Sep 2005 B1
6941574 Broadwin et al. Sep 2005 B1
6947509 Wong Sep 2005 B1
6952221 Holtz et al. Oct 2005 B1
6956899 Hall et al. Oct 2005 B2
7030890 Jouet et al. Apr 2006 B1
7050113 Campisano et al. May 2006 B2
7089577 Rakib et al. Aug 2006 B1
7095402 Kunii et al. Aug 2006 B2
7114167 Slemmer et al. Sep 2006 B2
7146615 Hervet et al. Dec 2006 B1
7158676 Rainsford Jan 2007 B1
7200836 Brodersen et al. Apr 2007 B2
7212573 Winger May 2007 B2
7224731 Mehrotra May 2007 B2
7272556 Aguilar et al. Sep 2007 B1
7310619 Baar et al. Dec 2007 B2
7325043 Rosenberg et al. Jan 2008 B1
7346111 Winger et al. Mar 2008 B2
7360230 Paz et al. Apr 2008 B1
7412423 Asano Aug 2008 B1
7412505 Slemmer et al. Aug 2008 B2
7421082 Kamiya et al. Sep 2008 B2
7444306 Varble Oct 2008 B2
7444418 Chou et al. Oct 2008 B2
7500235 Maynard et al. Mar 2009 B2
7508941 O'Toole, Jr. et al. Mar 2009 B1
7512577 Slemmer et al. Mar 2009 B2
7543073 Chou et al. Jun 2009 B2
7596764 Vienneau et al. Sep 2009 B2
7623575 Winger Nov 2009 B2
7669220 Goode Feb 2010 B2
7742609 Yeakel et al. Jun 2010 B2
7743400 Kurauchi Jun 2010 B2
7751572 Villemoes et al. Jul 2010 B2
7757157 Fukuda Jul 2010 B1
7830388 Lu Nov 2010 B1
7840905 Weber et al. Nov 2010 B1
7936819 Craig et al. May 2011 B2
7970263 Asch Jun 2011 B1
7987489 Krzyzanowski et al. Jul 2011 B2
8027353 Damola et al. Sep 2011 B2
8036271 Winger et al. Oct 2011 B2
8046798 Schlack et al. Oct 2011 B1
8074248 Sigmon, Jr. et al. Dec 2011 B2
8118676 Craig et al. Feb 2012 B2
8136033 Bhargava et al. Mar 2012 B1
8149917 Zhang et al. Apr 2012 B2
8155194 Winger et al. Apr 2012 B2
8155202 Landau Apr 2012 B2
8170107 Winger May 2012 B2
8194862 Herr et al. Jun 2012 B2
8243630 Luo et al. Aug 2012 B2
8270439 Herr et al. Sep 2012 B2
8284842 Craig et al. Oct 2012 B2
8296424 Malloy et al. Oct 2012 B2
8370869 Paek et al. Feb 2013 B2
8411754 Zhang et al. Apr 2013 B2
8442110 Pavlovskaia et al. May 2013 B2
8473996 Gordon et al. Jun 2013 B2
8619867 Craig et al. Dec 2013 B2
8621500 Weaver et al. Dec 2013 B2
20010008845 Kusuda et al. Jul 2001 A1
20010049301 Masuda et al. Dec 2001 A1
20020007491 Schiller et al. Jan 2002 A1
20020013812 Krueger et al. Jan 2002 A1
20020016161 Dellien et al. Feb 2002 A1
20020021353 DeNies Feb 2002 A1
20020026642 Augenbraun et al. Feb 2002 A1
20020027567 Niamir Mar 2002 A1
20020032697 French et al. Mar 2002 A1
20020040482 Sextro et al. Apr 2002 A1
20020047899 Son et al. Apr 2002 A1
20020049975 Thomas et al. Apr 2002 A1
20020056083 Istvan May 2002 A1
20020056107 Schlack May 2002 A1
20020056136 Wistendahl et al. May 2002 A1
20020059644 Andrade et al. May 2002 A1
20020062484 De Lange et al. May 2002 A1
20020067766 Sakamoto et al. Jun 2002 A1
20020069267 Thiele Jun 2002 A1
20020072408 Kumagai Jun 2002 A1
20020078171 Schneider Jun 2002 A1
20020078456 Hudson et al. Jun 2002 A1
20020083464 Tomsen et al. Jun 2002 A1
20020095689 Novak Jul 2002 A1
20020105531 Niemi Aug 2002 A1
20020108121 Alao et al. Aug 2002 A1
20020131511 Zenoni Sep 2002 A1
20020136298 Anantharamu et al. Sep 2002 A1
20020152318 Menon et al. Oct 2002 A1
20020171765 Waki et al. Nov 2002 A1
20020175931 Holtz et al. Nov 2002 A1
20020178447 Plotnick et al. Nov 2002 A1
20020188628 Cooper et al. Dec 2002 A1
20020191851 Keinan Dec 2002 A1
20020194592 Tsuchida et al. Dec 2002 A1
20020196746 Allen Dec 2002 A1
20030020671 Santoro et al. Jan 2003 A1
20030027517 Callway et al. Feb 2003 A1
20030035486 Kato et al. Feb 2003 A1
20030038893 Rajamaki et al. Feb 2003 A1
20030046690 Miller Mar 2003 A1
20030051253 Barone, Jr. Mar 2003 A1
20030058941 Chen et al. Mar 2003 A1
20030061451 Beyda Mar 2003 A1
20030065739 Shnier Apr 2003 A1
20030071792 Safadi Apr 2003 A1
20030072372 Shen et al. Apr 2003 A1
20030076546 Johnson et al. Apr 2003 A1
20030088328 Nishio et al. May 2003 A1
20030088400 Nishio et al. May 2003 A1
20030095790 Joshi May 2003 A1
20030107443 Yamamoto Jun 2003 A1
20030122836 Doyle et al. Jul 2003 A1
20030123664 Pedlow, Jr. et al. Jul 2003 A1
20030126608 Safadi et al. Jul 2003 A1
20030126611 Chernock et al. Jul 2003 A1
20030131349 Kuczynski-Brown Jul 2003 A1
20030135860 Dureau Jul 2003 A1
20030169373 Peters et al. Sep 2003 A1
20030177199 Zenoni Sep 2003 A1
20030188309 Yuen Oct 2003 A1
20030189980 Dvir et al. Oct 2003 A1
20030196174 Pierre Cote et al. Oct 2003 A1
20030208768 Urdang et al. Nov 2003 A1
20030229719 Iwata et al. Dec 2003 A1
20030229900 Reisman Dec 2003 A1
20030231218 Amadio Dec 2003 A1
20040016000 Zhang et al. Jan 2004 A1
20040034873 Zenoni Feb 2004 A1
20040040035 Carlucci et al. Feb 2004 A1
20040078822 Breen et al. Apr 2004 A1
20040088375 Sethi et al. May 2004 A1
20040091171 Bone May 2004 A1
20040111526 Baldwin et al. Jun 2004 A1
20040117827 Karaoguz et al. Jun 2004 A1
20040128686 Boyer et al. Jul 2004 A1
20040133704 Krzyzanowski et al. Jul 2004 A1
20040136698 Mock Jul 2004 A1
20040139158 Datta Jul 2004 A1
20040157662 Tsuchiya Aug 2004 A1
20040163101 Swix et al. Aug 2004 A1
20040184542 Fujimoto Sep 2004 A1
20040193648 Lai et al. Sep 2004 A1
20040210824 Shoff et al. Oct 2004 A1
20040261106 Hoffman Dec 2004 A1
20040261114 Addington et al. Dec 2004 A1
20050015259 Thumpudi et al. Jan 2005 A1
20050015816 Christofalo et al. Jan 2005 A1
20050021830 Urzaiz et al. Jan 2005 A1
20050034155 Gordon et al. Feb 2005 A1
20050034162 White et al. Feb 2005 A1
20050044575 Der Kuyl Feb 2005 A1
20050055685 Maynard et al. Mar 2005 A1
20050055721 Zigmond et al. Mar 2005 A1
20050071876 van Beek Mar 2005 A1
20050076134 Bialik et al. Apr 2005 A1
20050089091 Kim et al. Apr 2005 A1
20050091690 Delpuch et al. Apr 2005 A1
20050091695 Paz et al. Apr 2005 A1
20050105608 Coleman et al. May 2005 A1
20050114906 Hoarty et al. May 2005 A1
20050132305 Guichard et al. Jun 2005 A1
20050135385 Jenkins et al. Jun 2005 A1
20050141613 Kelly et al. Jun 2005 A1
20050149988 Grannan Jul 2005 A1
20050160088 Scallan et al. Jul 2005 A1
20050166257 Feinleib et al. Jul 2005 A1
20050180502 Puri Aug 2005 A1
20050198682 Wright Sep 2005 A1
20050213586 Cyganski et al. Sep 2005 A1
20050216933 Black Sep 2005 A1
20050216940 Black Sep 2005 A1
20050226426 Oomen et al. Oct 2005 A1
20050273832 Zigmond et al. Dec 2005 A1
20050283741 Balabanovic et al. Dec 2005 A1
20060001737 Dawson et al. Jan 2006 A1
20060020960 Relan et al. Jan 2006 A1
20060020994 Crane et al. Jan 2006 A1
20060031906 Kaneda Feb 2006 A1
20060039481 Shen et al. Feb 2006 A1
20060041910 Hatanaka et al. Feb 2006 A1
20060088105 Shen et al. Apr 2006 A1
20060095944 Demircin et al. May 2006 A1
20060112338 Joung et al. May 2006 A1
20060117340 Pavlovskaia et al. Jun 2006 A1
20060143678 Cho et al. Jun 2006 A1
20060161538 Kiilerich Jul 2006 A1
20060173985 Moore Aug 2006 A1
20060174026 Robinson et al. Aug 2006 A1
20060174289 Theberge Aug 2006 A1
20060195884 van Zoest et al. Aug 2006 A1
20060212203 Furuno Sep 2006 A1
20060218601 Michel Sep 2006 A1
20060230428 Craig et al. Oct 2006 A1
20060242570 Croft et al. Oct 2006 A1
20060256865 Westerman Nov 2006 A1
20060269086 Page et al. Nov 2006 A1
20060271985 Hoffman et al. Nov 2006 A1
20060285586 Westerman Dec 2006 A1
20060285819 Kelly et al. Dec 2006 A1
20070009035 Craig et al. Jan 2007 A1
20070009036 Craig et al. Jan 2007 A1
20070009042 Craig et al. Jan 2007 A1
20070025639 Zhou et al. Feb 2007 A1
20070033528 Merrit et al. Feb 2007 A1
20070033631 Gordon et al. Feb 2007 A1
20070074251 Oguz et al. Mar 2007 A1
20070079325 de Heer Apr 2007 A1
20070115941 Patel et al. May 2007 A1
20070124282 Wittkotter May 2007 A1
20070124795 McKissick et al. May 2007 A1
20070130446 Minakami Jun 2007 A1
20070130592 Haeusel Jun 2007 A1
20070152984 Ording et al. Jul 2007 A1
20070162953 Bolliger et al. Jul 2007 A1
20070172061 Pinder Jul 2007 A1
20070174790 Jing et al. Jul 2007 A1
20070237232 Chang et al. Oct 2007 A1
20070300280 Turner et al. Dec 2007 A1
20080046928 Poling et al. Feb 2008 A1
20080052742 Kopf et al. Feb 2008 A1
20080066135 Brodersen et al. Mar 2008 A1
20080084503 Kondo Apr 2008 A1
20080086688 Chandratillake et al. Apr 2008 A1
20080094368 Ording et al. Apr 2008 A1
20080098450 Wu et al. Apr 2008 A1
20080104520 Swenson et al. May 2008 A1
20080127255 Ress et al. May 2008 A1
20080154583 Goto et al. Jun 2008 A1
20080163059 Craner Jul 2008 A1
20080163286 Rudolph et al. Jul 2008 A1
20080170619 Landau Jul 2008 A1
20080170622 Gordon et al. Jul 2008 A1
20080178125 Elsbree et al. Jul 2008 A1
20080178243 Dong et al. Jul 2008 A1
20080178249 Gordon et al. Jul 2008 A1
20080189740 Carpenter et al. Aug 2008 A1
20080195573 Onoda et al. Aug 2008 A1
20080201736 Gordon et al. Aug 2008 A1
20080212942 Gordon et al. Sep 2008 A1
20080232452 Sullivan et al. Sep 2008 A1
20080243918 Holtman Oct 2008 A1
20080243998 Oh et al. Oct 2008 A1
20080246759 Summers Oct 2008 A1
20080253440 Srinivasan et al. Oct 2008 A1
20080271080 Gossweiler et al. Oct 2008 A1
20090003446 Wu et al. Jan 2009 A1
20090003705 Zou et al. Jan 2009 A1
20090007199 La Joie Jan 2009 A1
20090025027 Craner Jan 2009 A1
20090031341 Schlack et al. Jan 2009 A1
20090041118 Pavlovskaia et al. Feb 2009 A1
20090083781 Yang et al. Mar 2009 A1
20090083813 Dolce et al. Mar 2009 A1
20090083824 McCarthy et al. Mar 2009 A1
20090089188 Ku et al. Apr 2009 A1
20090094113 Berry et al. Apr 2009 A1
20090094646 Walter et al. Apr 2009 A1
20090100465 Kulakowski Apr 2009 A1
20090100489 Strothmann Apr 2009 A1
20090106269 Zuckerman et al. Apr 2009 A1
20090106386 Zuckerman et al. Apr 2009 A1
20090106392 Zuckerman et al. Apr 2009 A1
20090106425 Zuckerman et al. Apr 2009 A1
20090106441 Zuckerman et al. Apr 2009 A1
20090106451 Zuckerman et al. Apr 2009 A1
20090106511 Zuckerman et al. Apr 2009 A1
20090113009 Slemmer et al. Apr 2009 A1
20090132942 Santoro et al. May 2009 A1
20090138966 Krause et al. May 2009 A1
20090144781 Glaser et al. Jun 2009 A1
20090146779 Kumar et al. Jun 2009 A1
20090157868 Chaudhry Jun 2009 A1
20090158369 Van Vleck et al. Jun 2009 A1
20090160694 Di Flora Jun 2009 A1
20090172757 Aldrey et al. Jul 2009 A1
20090178098 Westbrook et al. Jul 2009 A1
20090183219 Maynard et al. Jul 2009 A1
20090189890 Corbett et al. Jul 2009 A1
20090193452 Russ et al. Jul 2009 A1
20090196346 Zhang et al. Aug 2009 A1
20090204920 Beverley et al. Aug 2009 A1
20090210899 Lawrence-Apfelbaum et al. Aug 2009 A1
20090225790 Shay et al. Sep 2009 A1
20090228620 Thomas et al. Sep 2009 A1
20090228922 Haj-Khalil et al. Sep 2009 A1
20090233593 Ergen et al. Sep 2009 A1
20090251478 Maillot et al. Oct 2009 A1
20090254960 Yarom et al. Oct 2009 A1
20090265617 Randall et al. Oct 2009 A1
20090271512 Jorgensen Oct 2009 A1
20090271818 Schlack Oct 2009 A1
20090298535 Klein et al. Dec 2009 A1
20090313674 Ludvig et al. Dec 2009 A1
20090328109 Pavlovskaia et al. Dec 2009 A1
20100033638 O'Donnell et al. Feb 2010 A1
20100035682 Gentile et al. Feb 2010 A1
20100058404 Rouse Mar 2010 A1
20100067571 White et al. Mar 2010 A1
20100077441 Thomas et al. Mar 2010 A1
20100104021 Schmit Apr 2010 A1
20100115573 Srinivasan et al. May 2010 A1
20100118972 Zhang et al. May 2010 A1
20100131996 Gauld May 2010 A1
20100146139 Brockmann Jun 2010 A1
20100158109 Dahlby et al. Jun 2010 A1
20100166071 Wu et al. Jul 2010 A1
20100174776 Westberg et al. Jul 2010 A1
20100175080 Yuen et al. Jul 2010 A1
20100180307 Hayes et al. Jul 2010 A1
20100211983 Chou Aug 2010 A1
20100226428 Thevathasan et al. Sep 2010 A1
20100235861 Schein et al. Sep 2010 A1
20100242073 Gordon et al. Sep 2010 A1
20100251167 DeLuca et al. Sep 2010 A1
20100254370 Jana et al. Oct 2010 A1
20100325655 Perez Dec 2010 A1
20110002376 Ahmed et al. Jan 2011 A1
20110002470 Purnhagen et al. Jan 2011 A1
20110023069 Dowens Jan 2011 A1
20110035227 Lee et al. Feb 2011 A1
20110067061 Karaoguz et al. Mar 2011 A1
20110096828 Chen et al. Apr 2011 A1
20110107375 Stahl et al. May 2011 A1
20110110642 Salomons et al. May 2011 A1
20110150421 Sasaki et al. Jun 2011 A1
20110153776 Opala et al. Jun 2011 A1
20110167468 Lee et al. Jul 2011 A1
20110191684 Greenberg Aug 2011 A1
20110243024 Osterling et al. Oct 2011 A1
20110258584 Williams et al. Oct 2011 A1
20110289536 Poder et al. Nov 2011 A1
20110317982 Xu et al. Dec 2011 A1
20120023126 Jin et al. Jan 2012 A1
20120030212 Koopmans et al. Feb 2012 A1
20120137337 Sigmon et al. May 2012 A1
20120204217 Regis et al. Aug 2012 A1
20120209815 Carson et al. Aug 2012 A1
20120224641 Haberman et al. Sep 2012 A1
20120257671 Brockmann et al. Oct 2012 A1
20130003826 Craig et al. Jan 2013 A1
20130071095 Chauvier et al. Mar 2013 A1
20130086610 Brockmann Apr 2013 A1
20130179787 Brockmann et al. Jul 2013 A1
20130198776 Brockmann Aug 2013 A1
20130254308 Rose et al. Sep 2013 A1
20130272394 Brockmann et al. Oct 2013 A1
20140033036 Gaur et al. Jan 2014 A1
Foreign Referenced Citations (331)
Number Date Country
191599 Apr 2000 AT
198969 Feb 2001 AT
250313 Oct 2003 AT
472152 Jul 2010 AT
475266 Aug 2010 AT
550086 Feb 1986 AU
199060189 Nov 1990 AU
620735 Feb 1992 AU
199184838 Apr 1992 AU
643828 Nov 1993 AU
2004253127 Jan 2005 AU
2005278122 Mar 2006 AU
2010339376 Aug 2012 AU
2011249132 Nov 2012 AU
2011258972 Nov 2012 AU
2011315950 May 2013 AU
682776 Mar 1964 CA
2052477 Mar 1992 CA
1302554 Jun 1992 CA
2163500 May 1996 CA
2231391 May 1997 CA
2273365 Jun 1998 CA
2313133 Jun 1999 CA
2313161 Jun 1999 CA
2528499 Jan 2005 CA
2569407 Mar 2006 CA
2728797 Apr 2010 CA
2787913 Jul 2011 CA
2798541 Dec 2011 CA
2814070 Apr 2012 CA
1507751 Jun 2004 CN
1969555 May 2007 CN
101180109 May 2008 CN
101627424 Jan 2010 CN
101637023 Jan 2010 CN
102007773 Apr 2011 CN
4408355 Oct 1994 DE
69516139 Dec 2000 DE
69132518 Sep 2001 DE
69333207 Jul 2004 DE
98961961 Aug 2007 DE
602008001596 Aug 2010 DE
602006015650 Sep 2010 DE
0093549 Nov 1983 EP
0128771 Dec 1984 EP
0419137 Mar 1991 EP
0449633 Oct 1991 EP
0 477 786 Apr 1992 EP
0523618 Jan 1993 EP
0534139 Mar 1993 EP
0568453 Nov 1993 EP
0588653 Mar 1994 EP
0594350 Apr 1994 EP
0612916 Aug 1994 EP
0624039 Nov 1994 EP
0638219 Feb 1995 EP
0643523 Mar 1995 EP
0661888 Jul 1995 EP
0714684 Jun 1996 EP
0746158 Dec 1996 EP
0761066 Mar 1997 EP
0789972 Aug 1997 EP
0830786 Mar 1998 EP
0861560 Sep 1998 EP
0933966 Aug 1999 EP
0933966 Aug 1999 EP
1026872 Aug 2000 EP
1038397 Sep 2000 EP
1038399 Sep 2000 EP
1038400 Sep 2000 EP
1038401 Sep 2000 EP
1 051 039 Nov 2000 EP
1055331 Nov 2000 EP
1120968 Aug 2001 EP
1345446 Sep 2003 EP
1422929 May 2004 EP
1428562 Jun 2004 EP
1521476 Apr 2005 EP
1645115 Apr 2006 EP
1725044 Nov 2006 EP
1767708 Mar 2007 EP
1771003 Apr 2007 EP
1772014 Apr 2007 EP
1887148 Feb 2008 EP
1900200 Mar 2008 EP
1902583 Mar 2008 EP
1908293 Apr 2008 EP
1911288 Apr 2008 EP
1918802 May 2008 EP
2100296 Sep 2009 EP
2105019 Sep 2009 EP
2106665 Oct 2009 EP
2116051 Nov 2009 EP
2124440 Nov 2009 EP
2248341 Nov 2010 EP
2269377 Jan 2011 EP
2271098 Jan 2011 EP
2304953 Apr 2011 EP
2364019 Sep 2011 EP
2384001 Nov 2011 EP
2409493 Jan 2012 EP
2477414 Jul 2012 EP
2487919 Aug 2012 EP
2520090 Nov 2012 EP
2567545 Mar 2013 EP
2577437 Apr 2013 EP
2628306 Aug 2013 EP
2632164 Aug 2013 EP
2632165 Aug 2013 EP
2695388 Feb 2014 EP
2207635 Jun 2004 ES
8211463 Jun 1982 FR
2 529 739 Jan 1984 FR
2891098 Mar 2007 FR
2207838 Feb 1989 GB
2248955 Apr 1992 GB
2290204 Dec 1995 GB
2365649 Feb 2002 GB
2378345 Feb 2003 GB
1134855 Oct 2010 HK
1116323 Dec 2010 HK
19913397 Apr 1992 IE
99586 Feb 1998 IL
215133 Dec 2011 IL
222829 Dec 2012 IL
222830 Dec 2012 IL
225525 Jun 2013 IL
180215 Jan 1998 IN
200701744 Nov 2007 IN
200900856 May 2009 IN
200800214 Jun 2009 IN
3759 Mar 1992 IS
63 33988 Feb 1988 JP
63-263985 Oct 1988 JP
2001-241993 Sep 1989 JP
04-373286 Dec 1992 JP
06-054324 Feb 1994 JP
7015720 Jan 1995 JP
7160292 Jun 1995 JP
8095599 Apr 1996 JP
8265704 Oct 1996 JP
10228437 Aug 1998 JP
10-510131 Sep 1998 JP
11-134273 May 1999 JP
H11-261966 Sep 1999 JP
2000-152234 May 2000 JP
2001-203995 Jul 2001 JP
2001-245271 Sep 2001 JP
2001-514471 Sep 2001 JP
2002-016920 Jan 2002 JP
2002-057952 Feb 2002 JP
2002-112220 Apr 2002 JP
2002-141810 May 2002 JP
2002-208027 Jul 2002 JP
2002-319991 Oct 2002 JP
2003-506763 Feb 2003 JP
2003-087785 Mar 2003 JP
2003-529234 Sep 2003 JP
2004-056777 Feb 2004 JP
2004-110850 Apr 2004 JP
2004-112441 Apr 2004 JP
2004-135932 May 2004 JP
2004-264812 Sep 2004 JP
2004-533736 Nov 2004 JP
2004-536381 Dec 2004 JP
2004-536681 Dec 2004 JP
2005-033741 Feb 2005 JP
2005-084987 Mar 2005 JP
2005-095599 Mar 2005 JP
2005-156996 Jun 2005 JP
2005-519382 Jun 2005 JP
2005-523479 Aug 2005 JP
2005-309752 Nov 2005 JP
2006-067280 Mar 2006 JP
2006-512838 Apr 2006 JP
11-88419 Sep 2007 JP
2008-523880 Jul 2008 JP
2008-535622 Sep 2008 JP
04252727 Apr 2009 JP
2009-543386 Dec 2009 JP
2011-108155 Jun 2011 JP
2012-080593 Apr 2012 JP
04996603 Aug 2012 JP
05121711 Jan 2013 JP
53-004612 Oct 2013 JP
05331008 Oct 2013 JP
05405819 Feb 2014 JP
2006067924 Jun 2006 KR
2007038111 Apr 2007 KR
20080001298 Jan 2008 KR
2008024189 Mar 2008 KR
2010111739 Oct 2010 KR
2010120187 Nov 2010 KR
2010127240 Dec 2010 KR
2011030640 Mar 2011 KR
2011129477 Dec 2011 KR
20120112683 Oct 2012 KR
2013061149 Jun 2013 KR
2013113925 Oct 2013 KR
1333200 Nov 2013 KR
2008045154 Nov 2013 KR
2013138263 Dec 2013 KR
1032594 Apr 2008 NL
1033929 Apr 2008 NL
2004670 Nov 2011 NL
2004780 Jan 2012 NL
239969 Dec 1994 NZ
99110 Dec 1993 PT
WO 8202303 Jul 1982 WO
WO8202303 Jul 1982 WO
WO8908967 Sep 1989 WO
WO 8908967 Sep 1989 WO
WO 9013972 Nov 1990 WO
WO 9322877 Nov 1993 WO
WO 9416534 Jul 1994 WO
WO9416534 Jul 1994 WO
WO 9419910 Sep 1994 WO
WO9419910 Sep 1994 WO
WO9421079 Sep 1994 WO
WO 9421079 Sep 1994 WO
WO 9515658 Jun 1995 WO
WO9532587 Nov 1995 WO
WO 9532587 Nov 1995 WO
WO9533342 Dec 1995 WO
WO 9533342 Dec 1995 WO
WO 9614712 May 1996 WO
WO9614712 May 1996 WO
WO9627843 Sep 1996 WO
WO 9627843 Sep 1996 WO
WO 9631826 Oct 1996 WO
WO9631826 Oct 1996 WO
WO 9637074 Nov 1996 WO
WO9637074 Nov 1996 WO
WO9642168 Dec 1996 WO
WO 9642168 Dec 1996 WO
WO 9716925 May 1997 WO
WO9716925 May 1997 WO
WO 9733434 Sep 1997 WO
WO9733434 Sep 1997 WO
WO9739583 Oct 1997 WO
WO 9739583 Oct 1997 WO
WO 9826595 Jun 1998 WO
WO9826595 Jun 1998 WO
WO 9900735 Jan 1999 WO
WO 9904568 Jan 1999 WO
WO 9900735 Jan 1999 WO
WO9900735 Jan 1999 WO
WO 9930496 Jun 1999 WO
WO9930496 Jun 1999 WO
WO 9930497 Jun 1999 WO
WO9930497 Jun 1999 WO
WO 9930500 Jun 1999 WO
WO9930500 Jun 1999 WO
WO9930501 Jun 1999 WO
WO 9930501 Jun 1999 WO
WO9935840 Jul 1999 WO
WO 9935840 Jul 1999 WO
WO9941911 Aug 1999 WO
WO 9941911 Aug 1999 WO
WO9956468 Nov 1999 WO
WO 9956468 Nov 1999 WO
WO 9965243 Dec 1999 WO
WO 9965323 Dec 1999 WO
WO 9966732 Dec 1999 WO
WO9966732 Dec 1999 WO
WO0002303 Jan 2000 WO
WO 0002303 Jan 2000 WO
WO 0007372 Feb 2000 WO
WO 0008967 Feb 2000 WO
WO 0019910 Apr 2000 WO
WO 0038430 Jun 2000 WO
WO 0041397 Jul 2000 WO
WO 0139494 May 2001 WO
WO 0141447 Jun 2001 WO
WO 0182614 Nov 2001 WO
WO 0192973 Dec 2001 WO
WO 02089487 Jul 2002 WO
WO 02076097 Sep 2002 WO
WO 02076099 Sep 2002 WO
WO 03026232 Mar 2003 WO
WO 03026275 Mar 2003 WO
WO 03047710 Jun 2003 WO
WO 03065683 Aug 2003 WO
WO 03071727 Aug 2003 WO
WO 03091832 Nov 2003 WO
WO 2004012437 Feb 2004 WO
WO 2004018060 Mar 2004 WO
WO 2004073310 Aug 2004 WO
WO 2005002215 Jan 2005 WO
WO 2005041122 May 2005 WO
WO 2005053301 Jun 2005 WO
WO 2005120067 Dec 2005 WO
WO 2006014362 Feb 2006 WO
WO 2006022881 Mar 2006 WO
WO 2006053305 May 2006 WO
WO 2006067697 Jun 2006 WO
WO 2006081634 Aug 2006 WO
WO 2006105480 Oct 2006 WO
WO 2006110268 Oct 2006 WO
WO 2007001797 Jan 2007 WO
WO 2007008319 Jan 2007 WO
WO 2007008355 Jan 2007 WO
WO 2007008356 Jan 2007 WO
WO 2007008357 Jan 2007 WO
WO 2007008358 Jan 2007 WO
WO 2007018722 Feb 2007 WO
WO 2007018726 Feb 2007 WO
WO 2008044916 Apr 2008 WO
WO 2008086170 Jul 2008 WO
WO 2008088741 Jul 2008 WO
WO 2008088752 Jul 2008 WO
WO 2008088772 Jul 2008 WO
WO 2008100205 Aug 2008 WO
WO 2009038596 Mar 2009 WO
WO 2009099893 Aug 2009 WO
WO 2009099895 Aug 2009 WO
WO 2009105465 Aug 2009 WO
WO 2009110897 Sep 2009 WO
WO 2009114247 Sep 2009 WO
WO 2009155214 Dec 2009 WO
WO 2010044926 Apr 2010 WO
WO 2010054136 May 2010 WO
WO 2010107954 Sep 2010 WO
WO 2011014336 Sep 2010 WO
WO 2011082364 Jul 2011 WO
WO 2011139155 Nov 2011 WO
WO 2011149357 Dec 2011 WO
WO 2012051528 Apr 2012 WO
WO 2012138660 Oct 2012 WO
WO 2013106390 Jul 2013 WO
WO 2013155310 Jul 2013 WO
Non-Patent Literature Citations (252)
Entry
Star, “Video on Demand Without Compression: a Review of the Business Model, Regulation and Future Implication”.
Porter et al., Compositing Digital Images, Computer Graphics, vol. 18, No. 3, pp. 253-259, Jul. 1984.
Hoarty, “The Smart Headend—A Novel Approach to Interactive Television”, Montreux Int'l TV Symposium, Jun. 9, 1995.
Ozer, Video Compositing 101, available from http://www.emedialive.com, Jun. 2, 2004.
International Searching Authority, International Search Report—International Application No. PCT/US/2006/022585, dated Oct. 12, 2007, together with the Written Opinion of the International Searching Authority, 13 pages.
USPTO, Office Action dated Sep. 2, 2008 pertaining to U.S. Appl. No. 11/258,602, 13 pages.
USPTO, Office Action dated Feb. 23, 2009 pertaining to U.S. Appl. No. 11/258,602, 17 pages.
ICTV, Inc., International Preliminary Report on Patentability, PCT/US2006/022585, Jan. 29, 2008, 9 pgs.
Annex C—Video buffering verifier, information technology—generic coding of moving pictures and associated audio information: video, Feb. 2000, 6 pgs.
Antonoff, Michael, “Interactive Television,” Popular Science, Nov. 1992, 12 pages.
Craig, Notice of Allowance, U.S. Appl. No. 11/178,176, filed Dec. 23, 2010, 8 pgs.
Craig, Office Action, U.S. Appl. No. 11/103,838, filed Feb. 5, 2005, 30 pgs.
Craig, Final Office Action, U.S. Appl. No. 11/103,838, filed Jul. 6, 2010, 35 pgs.
Craig, Office Action, U.S. Appl. No. 11/103,838, filed May 12, 2009, 32 pgs.
Craig, Office Action, U.S. Appl. No. 11/103,838, filed Aug. 19, 2008, 17 pgs.
Craig, Office Action, U.S. Appl. No. 11/103,838, filed Nov. 19, 2009, 34 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,176, filed Oct. 10, 2010, 8 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,176, filed May 6, 2010, 7 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,181, filed Feb. 11, 2011, 19 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,181, filed Aug. 25, 2010, 17 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,182, filed Feb. 23, 2010, 15 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,183, filed Dec. 6, 2010, 12 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,183, filed Feb. 19, 2010, 17 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,183, filed Jul. 20, 2010, 13 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,189, filed Nov. 9, 2010, 13 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,189, filed Mar. 15, 2010, 11 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,189, filed Jul. 23, 2009, 10 pgs.
Isovic, Timing constraints of MPEG-2 decoding for high quality video: misconceptions and realistic assumptions, Jul. 2-4, 2003, 10 pgs.
MPEG-2 Video elementary stream supplemental information, Dec. 1999, 12 pgs.
TAG Networks Inc., Office Action, CN 200680017662.3, Apr. 26, 2010, 4 pgs.
TAG Networks Inc., Office Action, EP 06739032.8, Aug. 14, 2009, 4 pgs.
TAG Networks Inc., Office Action, EP 06773714.8, May 8, 2009, 3 pgs.
TAG Networks Inc., Office Action, EP 06773714.8, Jan. 12, 2010, 4 pgs.
Talley, A general framework for continuous media transmission control, Oct. 13-16, 1997, pp. 374-383.
Tudor, MPEG-2 Video Compression, Dec. 1995, 15 pgs.
Tvhead, Inc., International Search Report, PCT/US2006/024195, Nov. 29, 2006, 9 pgs.
ActiveVideo Networks Inc., Communication Pursuant to Rules 70(2) and 70a(2), EP11833486.1, Apr. 24, 2014, 1 pg.
ActiveVideo Networks, Inc., International Search Report and Written Opinion, PCT/US2014/041430, Oct. 9, 2014, 9 pgs.
ActiveVideo Networks Inc., Examination Report No. 1, AU2011258972, Jul. 21, 2014, 3 pgs.
Active Video Networks, Notice of Reasons for Rejection, JP2012-547318, Sep. 26, 2014, 7 pgs.
Avinity Systems B.V., Final Office Action, JP-2009-530298, 0700T2014, 8 pgs.
Brockmann, Final Office Action, U.S. Appl. No. 13/686,548, filed Sep. 24, 2014, 13 pgs.
Brockmann, Final Office Action, U.S. Appl. No. 13/438,617, filed Oct. 30, 2014, 19 pgs.
Brockmann, Office Action, U.S. Appl. No. 12/443,571, filed Nov. 5, 2014, 26 pgs.
AC-3 digital audio compression standard, Extract, Dec. 20, 1995, 11 pgs.
ActiveVideo, http://www.activevideo.com/, as printed out in year 2012, 1 pg.
ActiveVideo Networks BV, International Preliminary Report on Patentability, PCT/NL2011/050308, Sep. 6, 2011, 8 pgs.
ActiveVideo Networks BV, International Search Report and Written Opinion, PCT/NL2011/050308, Sep. 6, 2011, 8 pgs.
Activevideo Networks Inc., International Preliminary Report on Patentability, PCT/US2011/056355, Apr. 16, 2013, 4 pgs.
ActiveVideo Networks Inc., International Preliminary Report on Patentability, PCT/US2012/032010, Oct. 8, 2013, 4 pgs.
ActiveVideo Networks Inc., International Preliminary Report on Patentability, PCT/US2013/020769, Jul. 24, 2011, 6 pgs.
ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2014/030773, Jul. 25, 2014, 8 pgs.
ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2014/041416, Aug. 27, 2014, 8 pgs.
ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2011/056355, Apr. 13, 2012, 6 pgs.
ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2012/032010, Oct. 10, 2012, 6 pgs.
ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2013/020769, May 9, 2013, 9 pgs.
ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2013/036182, Jul. 29, 2013, 12 pgs.
ActiveVideo Networks, Inc., International Search Report and Written Opinion, PCT/US2009/032457, Jul. 22, 2009, 7 pgs.
ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 09820930-4, 11 pgs.
ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 10754084-1, 11 pgs.
ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 10841764.3, 16 pgs.
ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 11833486.1, 6 pgs.
Active Video Networks Inc., Korean Intellectual Property Office, International Search Report; PCT/US2009/032457, Jul. 22, 2009, 7 pgs.
ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 13168509.1, 10 pgs.
ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 13168376-5, 8 pgs.
ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 12767642-7, 12 pgs.
ActiveVideo Networks Inc., Communication Pursuant to Rules 70(2) and 70a(2), EP10841764.3, Jun. 6, 2014, 1 pg.
ActiveVideo Networks Inc., Communication Pursuant to Article 94(3) EPC, EP08713106.6-1908, Jun. 26, 2014, 5 pgs.
ActiveVideo Networks Inc., Communication Pursuant to Article 94(3) EPC, EP08713106.6-2223, May 10, 2011, 7 pgs.
ActiveVideo Networks Inc., Communication Pursuant to Article 94(3) EPC, EP09713486.0, Apr. 14, 2014, 6 pgs.
ActiveVideo Networks Inc., Examination Report No. 1, AU2011258972, Apr. 4, 2013, 5 pgs.
ActiveVideo Networks Inc., Examination Report No. 1, AU2010339376, Apr. 30, 2014, 4 pgs.
ActiveVideo Networks Inc., Examination Report, App. No. EP11749946.7, Oct. 8, 2013, 6 pgs.
ActiveVideo Networks Inc., Summons to attend oral proceedings, Application No. EP09820936-4, Aug. 19, 2014, 4 pgs.
ActiveVideo Networks Inc., International Searching Authority, International Search Report—International application No. PCT/US2010/027724, dated Oct. 28, 2010, together with the Written Opinion of the International Searching Authority, 7 pages.
Adams, Jerry, "Glasfasernetz fur Breitbanddienste in London," NTZ Nachrichtentechnische Zeitschrift, vol. 40, No. 7, Jul. 1987, Berlin, DE, pp. 534-536, 5 pgs. No English Translation Found.
Avinity Systems B.V., Communication pursuant to Article 94(3) EPC, EP 07834561.8, Jan. 31, 2014, 10 pgs.
Avinity Systems B.V., Extended European Search Report, Application No. 12163713.6, 10 pgs.
Avinity Systems B.V., Extended European Search Report, Application No. 12183712-8, 10 pgs.
Avinity Systems B.V., Communication pursuant to Article 94(3) EPC, EP 07834561.8, Apr. 8, 2010, 5 pgs.
Avinity Systems B.V., International Preliminary Report on Patentability, PCT/NL2007/000245, Mar. 31, 2009, 12 pgs.
Avinity Systems B.V., International Search Report and Written Opinion, PCT/NL2007/000245, Feb. 19, 2009, 18 pgs.
Avinity Systems B.V., Notice of Grounds of Rejection for Patent, JP 2009-530298, Sep. 3, 2013, 4 pgs.
Avinity Systems B.V., Notice of Grounds of Rejection for Patent, JP 2009-530298, Sep. 25, 2012, 6 pgs.
Benjelloun, A summation algorithm for MPEG-1 coded audio signals: a first step towards audio processed domain, 2000, 9 pgs.
Bird et al., "Customer Access to Broadband Services," ISSLS 86—The International Symposium on Subscriber Loops and Services, Sep. 29, 1986, Tokyo, JP, 6 pgs.
Broadhead, Direct manipulation of MPEG compressed digital audio, Nov. 5-9, 1995, 41 pgs.
Brockmann, Final Office Action, U.S. Appl. No. 13/668,004, filed Jul. 16, 2014, 20 pgs.
Brockmann, Office Action, U.S. Appl. No. 13/686,548, filed Mar. 10, 2014, 11 pgs.
Brockmann, Office Action, U.S. Appl. No. 13/668,004, filed Dec. 23, 2013, 9 pgs.
Brockmann, Office Action, U.S. Appl. No. 13/438,617, filed May 21, 2014, 17 pgs.
Brockmann, Final Office Action, U.S. Appl. No. 12/443,571, filed Mar. 7, 2014, 21 pgs.
Brockmann, Office Action, U.S. Appl. No. 12/443,571, filed Jun. 5, 2013, 18 pgs.
Cable Television Laboratories, Inc., “CableLabs Asset Distribution Interface Specification, Version 1.1”, May 5, 2006, 33 pgs.
CD 11172-3, Coding of moving pictures and associated audio for digital storage media at up to about 1.5 MBIT, Jan. 1, 1992, 39 pgs.
Chang, Shih-Fu, et al., "Manipulation and Compositing of MC-DCT Compressed Video," IEEE Journal on Selected Areas in Communications, Jan. 1995, vol. 13, No. 1, 11 pgs. Best Copy Available.
Craig, Notice of Allowance, U.S. Appl. No. 11/178,183, filed Jan. 12, 2012, 7 pgs.
Craig, Notice of Allowance, U.S. Appl. No. 11/178,183, filed Jul. 19, 2012, 8 pgs.
Craig, Notice of Allowance, U.S. Appl. No. 11/178,189, filed Oct. 12, 2011, 7 pgs.
Craig, Notice of Allowance, U.S. Appl. No. 11/178,176, filed Mar. 3, 2011, 8 pgs.
Craig, Notice of Allowance, U.S. Appl. No. 13/609,183, filed Aug. 13, 2013, 8 pgs.
Craig, Final Office Action, U.S. Appl. No. 11/178,181, filed Jun. 20, 2011, 21 pgs.
Craig, Final Office Action, U.S. Appl. No. 11/178,183, filed Apr. 13, 2011, 16 pgs.
Craig, Final Office Action, U.S. Appl. No. 11/178,177, filed Oct. 26, 2010, 12 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,177, filed Mar. 29, 2011, 15 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,177, filed Aug. 3, 2011, 26 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,181, filed Mar. 29, 2010, 10 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,183, filed Sep. 15, 2011, 12 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,189, filed May 26, 2011, 14 pgs.
Craig, Office Action, U.S. Appl. No. 13/609,183, filed May 9, 2013, 7 pgs.
Pavlovskaia, Office Action, JP 2011-516499, Feb. 14, 2014, 19 pgs.
Dahlby, Office Action, U.S. Appl. No. 12/651,203, filed Jun. 5, 2014, 18 pgs.
Dahlby, Final Office Action, U.S. Appl. No. 12/651,203, filed Feb. 4, 2013, 18 pgs.
Dahlby, Office Action, U.S. Appl. No. 12/651,203, filed Aug. 16, 2012, 18 pgs.
Digital Audio Compression Standard(AC-3, E-AC-3), Advanced Television Systems Committee, Jun. 14, 2005, 236 pgs.
Dukes, Stephen D., “Photonics for cable television system design. Migrating to regional hubs and passive networks,” Communications Engineering and Design, May 1992, 4 pgs.
Ellis, et al., “INDAX: An Operation Interactive Cabletext System”, IEEE Journal on Selected Areas in Communications, vol. sac-1, No. 2, Feb. 1983, pp. 285-294.
European Patent Office, Supplementary European Search Report, Application No. EP 09 70 8211, dated Jan. 5, 2011, 6 pgs.
European Patent Office, Extended European Search Report for International Application No. PCT/US2010/027724, dated Jul. 24, 2012, 11 pages.
FFMPEG, http://www.ffmpeg.org, downloaded Apr. 8, 2010, 8 pgs.
FFMPEG-0.4.9 Audio Layer 2 Tables Including Fixed Psycho Acoustic Model, 2001, 2 pgs.
Frezza, W., “The Broadband Solution—Metropolitan CATV Networks, ” Proceedings of Videotex '84, Apr. 1984, 15 pgs.
Gecsei, J., “Topology of Videotex Networks,” The Architecture of Videotex Systems, Chapter 6, 1983 by Prentice-Hall, Inc.
Gobl, et al., “ARIDEM—a multi-service broadband access demonstrator,” Ericsson Review No. 3, 1996, 7 pgs.
Gordon, Notice of Allowance, U.S. Appl. No. 12/008,697, filed Mar. 20, 2014, 10 pgs.
Gordon, Final Office Action, U.S. Appl. No. 12/008,722, filed Mar. 30, 202, 16 pgs.
Gordon, Final Office Action, U.S. Appl. No. 12/035,236, filed Jun. 11, 2014, 14 pgs.
Gordon, Final Office Action, U.S. Appl. No. 12/035,236, filed Jul. 22, 2013, 7 pgs.
Gordon, Final Office Action, U.S. Appl. No. 12/035,236, filed Sep. 20, 2011, 8 pgs.
Gordon, Final Office Action, U.S. Appl. No. 12/035,236, filed Sep. 21, 2012, 9 pgs.
Gordon, Final Office Action, U.S. Appl. No. 12/008,697, filed Mar. 6, 2012, 48 pgs.
Gordon, Office Action, U.S. Appl. No. 12/035,236, filed Mar. 13, 2013, 9 pgs.
Gordon, Office Action, U.S. Appl. No. 12/035,236, filed Mar. 11, 2011, 8 pgs.
Gordon, Office Action, U.S. Appl. No. 12/035,236, filed Mar. 28, 2012, 8 pgs.
Gordon, Office Action, U.S. Appl. No. 12/035,236, filed Dec. 16, 2013, 11 pgs.
Gordon, Office Action, U.S. Appl. No. 12/008,697, filed Aug. 1, 2013, 43 pgs.
Gordon, Office Action, U.S. Appl. No. 12/008,697, filed Aug. 4, 2011, 39 pgs.
Gordon, Office Action, U.S. Appl. No. 12/008,722, filed Oct. 11, 2011, 16 pgs.
Henry et al., "Multidimensional Icons," ACM Transactions on Graphics, vol. 9, No. 1, Jan. 1990, 5 pgs.
Herr, Notice of Allowance, U.S. Appl. No. 11/620,593, filed May 23, 2012, 5 pgs.
Herr, Notice of Allowance, U.S. Appl. No. 12/534,016, filed Feb. 7, 2012, 5 pgs.
Herr, Notice of Allowance, U.S. Appl. No. 12/534,016, filed Sep. 28, 2011, 15 pgs.
Herr, Final Office Action, U.S. Appl. No. 11/620,593, filed Sep. 15, 2011, 104 pgs.
Herr, Office Action, U.S. Appl. No. 11/620,593, filed Apr. 19, 2010, 58 pgs.
Herr, Office Action, U.S. Appl. No. 11/620,593, filed Apr. 21, 2009 27 pgs.
Herr, Office Action, U.S. Appl. No. 11/620,593, filed Dec. 23, 2009, 58 pgs.
Herr, Office Action, U.S. Appl. No. 11/620,593, filed Jan. 24, 2011, 96 pgs.
Herr, Office Action, U.S. Appl. No. 11/620,593, filed Aug. 27, 2010, 41 pgs.
Herre, Thoughts on an SAOC Architecture, Oct. 2006, 9 pgs.
Insight advertisement, “In two years this is going to be the most watched program on TV” On touch VCR programming, published not later than 2000, 10 pgs.
Isensee et al., "Focus Highlight for World Wide Web Frames," Nov. 1, 1997, IBM Technical Disclosure Bulletin, vol. 40, No. 11, pp. 89-90.
ICTV, Inc., International Search Report/Written Opinion, PCT/US2006/022585, Oct. 12, 2007, 15 pgs.
ICTV, Inc., International Search Report/Written Opinion, PCT/US2008/000400, Jul. 14, 2009, 10 pgs.
ICTV, Inc., International Search Report/Written Opinion, PCT/US2008/000419, May 15, 2009, 20 pgs.
ICTV, Inc., International Search Report/Written Opinion; PCT/US2006/022533, Nov. 20, 2006; 8 pgs.
Kato, Y., et al., "A Coding Control Algorithm for Motion Picture Coding Accomplishing Optimal Assignment of Coding Distortion to Time and Space Domains," Electronics and Communications in Japan, Part 1, vol. 72, No. 9, 1989, 11 pgs.
Koenen, Rob, "MPEG-4 Overview—Overview of the MPEG-4 Standard," Internet Citation, Mar. 2001, http://mpeg.telecomitalialab.com/standards/mpeg-4/mpeg-4.htm, May 9, 2002, 74 pgs.
Konaka, M. et al., “Development of Sleeper Cabin Cold Storage Type Cooling System,” SAE International, The Engineering Society for Advancing Mobility Land Sea Air and Space, SAE 2000 World Congress, Detroit, Michigan, Mar. 6-9, 2000, 7 pgs.
ActiveVideo Networks Inc., Korean Intellectual Property Office, International Search Report; PCT/US2009/032457, Jul. 22, 2009, 7 pgs.
Le Gall, Didier, “MPEG: A Video Compression Standard for Multimedia Applications”, Communication of the ACM, vol. 34, No. 4, Apr. 1991, New York, NY, 13 pgs.
Langenberg, E., "Integrating Entertainment and Voice on the Cable Network," by Earl Langenberg, TeleWest International, and Ed Callahan, ANTEC.
Large, D., “Tapped Fiber vs. Fiber-Reinforced Coaxial CATV Systems”, IEEE LCS Magazine, Feb. 1990, 7 pgs. Best Copy Available.
Mesiya, M.F, “A Passive Optical/Coax Hybrid Network Architecture for Delivery of CATV, Telephony and Data Services,” 1993 NCTA Technical Papers, 7 pgs.
"MSDL Specification Version 1.1," International Organisation for Standardisation, Organisation Internationale de Normalisation, ISO/IEC JTC1/SC29/WG11 Coding of Moving Pictures and Audio, N1246, MPEG96/Mar. 1996, 101 pgs.
Noguchi, Yoshihiro, et al., “MPEG Video Compositing in the Compressed Domain,” IEEE International Symposium on Circuits and Systems, vol. 2, May 1, 1996, 4 pgs.
Regis, Notice of Allowance, U.S. Appl. No. 13/273,803, filed May 14, 2014, 8 pgs.
Regis, Final Office Action U.S. Appl. No. 13/273,803, filed Oct. 11, 2013, 23 pgs.
Regis, Office Action U.S. Appl. No. 13/273,803, filed Mar. 27, 2013, 32 pgs.
Richardson, Ian E.G., "H.264 and MPEG-4 Video Compression, Video Coding for Next-Generation Multimedia," John Wiley & Sons, US, 2003, ISBN: 0-470-84837-5, pp. 103-105, 149-152, and 164.
Rose, K., "Design of a Switched Broad-Band Communications Network for Interactive Services," IEEE Transactions on Communications, vol. com-23, No. 1, Jan. 1975, 7 pgs.
RSS Advisory Board, “RSS 2.0 Specification”, published Oct. 15, 2007.
Saadawi, Tarek N., “Distributed Switching for Data Transmission over Two-Way CATV”, IEEE Journal on Selected Areas in Communications, vol. Sac-3, No. 2, Mar. 1985, 7 pgs.
SAOC use cases, draft requirements and architecture, Oct. 2006, 16 pgs.
Schrock, “Proposal for a Hub Controlled Cable Television System Using Optical Fiber,” IEEE Transactions on Cable Television, vol. CATV-4, No. 2, Apr. 1979, 8 pgs.
Sigmon, Notice of Allowance, U.S. Appl. No. 13/311,203, filed Feb. 27, 2014, 14 pgs.
Sigmon, Final Office Action, U.S. Appl. No. 13/311,203, filed Sep. 13, 2013, 20 pgs.
Sigmon, Office Action, U.S. Appl. No. 13/311,203, filed May 10, 2013, 21 pgs.
Sigmon, Final Office Action, U.S. Appl. No. 11/258,602, filed Feb. 23, 2009, 15 pgs.
Sigmon, Office Action, U.S. Appl. No. 11/258,602, filed Sep. 2, 2008, 12 pgs.
Smith, Brian C., et al., “Algorithms for Manipulating Compressed Images,” IEEE Computer Graphics and Applications, vol. 13, No. 5, Sep. 1, 1993, 9 pgs.
Smith, J. et al., "Transcoding Internet Content for Heterogeneous Client Devices," Circuits and Systems, 1998, ISCAS '98, Proceedings of the 1998 IEEE International Symposium on Monterey, CA, USA, May 31-Jun. 3, 1998, New York, NY, USA, IEEE, US, May 31, 1998, 4 pgs.
Stoll, G. et al., "GMF4iTV: Neue Wege zur Interaktivitaet mit Bewegten Objekten beim Digitalen Fernsehen," Fkt Fernseh und Kinotechnik, Fachverlag Schiele & Schon GmbH, Berlin, DE, vol. 60, No. 4, Jan. 1, 2006, ISSN: 1430-9947, 9 pgs. No English Translation Found.
TAG Networks, Inc., Communication pursuant to Article 94(3) EPC, European Patent Application 06773714.8, May 6, 2009, 3 pgs.
TAG Networks Inc, Decision to Grant a Patent, JP 2009-544985, Jun. 28, 2013, 1 pg.
TAG Networks Inc. IPRP, PCT/US2006/010080, Oct. 16, 2007, 6 pgs.
TAG Networks Inc., IPRP, PCT/US2006/024194, Jan. 10, 2008, 7 pgs.
TAG Networks Inc. IPRP, PCT/US2006/024195, Apr. 1, 2009, 11 pgs.
TAG Networks Inc. IPRP, PCT/US2006/024196, Jan. 10, 2008, 6 pgs.
TAG Networks Inc. International Search Report, PCT/US2008/050221, Jun. 12, 2008, 9 pgs.
TAG Networks Inc., Office Action, JP 2008-506474, Oct. 1, 2012, 5 pgs.
TAG Networks Inc. Office Action, JP 2008-506474, Aug. 8, 2011, 5 pgs.
TAG Networks Inc. Office Action, JP 2008-520254, Oct. 20, 2011, 2 pgs.
TAG Networks, IPRP, PCT/US2008/050221, Jul. 7, 2009, 6 pgs.
TAG Networks, International Search Report, PCT/US2010/041133, Oct. 19, 2010, 13 pgs.
TAG Networks, Office Action, CN 200880001325.4, Jun. 22, 2011, 4 pgs.
TAG Networks, Office Action, JP 2009-544985, Feb. 25, 2013, 3 pgs.
Tamitani et al., “An Encoder/Decoder Chip Set for the MPEG Video Standard,” 1992 IEEE International Conference on Acoustics, vol. 5, Mar. 1992, San Francisco, CA, 4 pgs.
The Toolame Project, Psych—nl.c, 1999, 1 pg.
Terry, Jack, “Alternative Technologies and Delivery Systems for Broadband ISDN Access”, IEEE Communications Magazine, Aug. 1992, 7 pgs.
Thompson, Jack, “DTMF-TV, The Most Economical Approach to Interactive TV,” GNOSTECH Incorporated, NCF'95 Session T-38-C, 8 pgs.
Thompson, John W. Jr., "The Awakening 3.0: PCs, TSBs, or DTMF-TV—Which Telecomputer Architecture is Right for the Next Generation's Public Network?," GNOSTECH Incorporated, 1995 The National Academy of Sciences, downloaded from the Unpredictable Certainty: White Papers, http://www.nap.edu/catalog/6062.html, pp. 546-552.
Tobagi, Fouad A., “Multiaccess Protocols in Packet Communication Systems,” IEEE Transactions on Communications, vol. Com-28, No. 4, Apr. 1980, 21 pgs.
Todd, AC-3: flexible perceptual coding for audio transmission and storage, Feb. 26-Mar. 1, 1994, 16 pgs.
Toms “An Integrated Network Using Fiber Optics (Info) for the Distribution of Video, Data, and Telephone in Rural Areas,” IEEE Transactions on Communication, vol. Com-26, No. 7, Jul. 1978, 9 pgs.
Trott, A., et al.“An Enhanced Cost Effective Line Shuffle Scrambling System with Secure Conditional Access Authorization,” 1993 NCTA Technical Papers, 11 pgs.
Tvhead, Inc., First Examination Report, IN 1744/MUMNP/2007, Dec. 30, 2013, 6 pgs.
Tvhead, Inc., International Search Report, PCT/US2006/010080, Jun. 20, 2006, 3 pgs.
Tvhead, Inc., International Search Report, PCT/US2006/024194, Dec. 15, 2006, 4 pgs.
Tvhead, Inc., International Search Report, PCT/US2006/024196, Dec. 11, 2006, 4 pgs.
Tvhead, Inc., International Search Report, PCT/US2006/024197, Nov. 28, 2006, 9 pgs.
Jurgen—Two-way applications for cable television systems in the '70s, IEEE Spectrum, Nov. 1971, 16 pgs.
van Beek, P., "Delay-Constrained Rate Adaptation for Robust Video Transmission over Home Networks," Image Processing, 2005, ICIP 2005, IEEE International Conference, Sep. 2005, vol. 2, No. 11, 4 pgs.
Van der Star, Jack A. M., “Video on Demand Without Compression: A Review of the Business Model, Regulations and Future Implication,” Proceedings of PTC'93, 15th Annual Conference, 12 pgs.
Vernon, Dolby digital: audio coding for digital television and storage applications, Aug. 1999, 18 pgs.
Wang, A beat-pattern based error concealment scheme for music delivery with burst packet loss, Aug. 22-25, 2001, 4 pgs.
Wang, A compressed domain beat detector using MP3 audio bitstream, Sep. 30-Oct. 5, 2001, 9 pgs.
Wang, A multichannel audio coding algorithm for inter-channel redundancy removal, May 12-15, 2001, 6 pgs.
Wang, An excitation level based psychoacoustic model for audio compression, Oct. 30-Nov. 4, 1999, 4 pgs.
Wang, Energy compaction property of the MDCT in comparison with other transforms, Sep. 22-25, 2000, 23 pgs.
Wang, Exploiting excess masking for audio compression, Sep. 2-5, 1999, 4 pgs.
Wang, schemes for re-compressing mp3 audio bitstreams, Nov. 30-Dec. 3, 2001, 5 pgs.
Wang, Selected advances in audio compression and compressed domain processing, Aug. 2001, 68 pgs.
Wang, The impact of the relationship between MDCT and DFT on audio compression, Dec. 13-15, 2000, 92 pgs.
Welzenbach et al., "The Application of Optical Systems for Cable TV," AEG-Telefunken, Backnang, Federal Republic of Germany, ISSLS Sep. 15-19, 1980, Proceedings IEEE Cat. No. 80 CH1565-1, 7 pgs.
Yum, T.S.P., "Hierarchical Distribution of Video with Dynamic Port Allocation," IEEE Transactions on Communications, vol. 39, No. 8, Aug. 1, 1991, XP000264287, 7 pgs.
ActiveVideo Networks, Inc., International Preliminary Report on Patentability, PCT/US2013/036182, Oct. 14, 2014, 9 pgs.
ActiveVideo Networks Inc., Decision to refuse a European patent application (Art. 97(2) EPC, EP09820936.4, Feb. 20, 2015, 4 pgs.
ActiveVideo Networks Inc., Communication Pursuant to Article 94(3) EPC, 10754084.1, Feb. 10, 2015, 12 pgs.
ActiveVideo Networks Inc., Communication under Rule 71(3) EPC, Intention to Grant, EP08713106.6, Feb. 19, 2015, 12 pgs.
ActiveVideo Networks Inc., Communication Pursuant to Rule 94(3), EP08713106-6, Jun. 25, 2014, 5 pgs.
ActiveVideo Networks Inc., Communication Pursuant to Rules 161(2) & 162 EPC, EP13775121.0, Jan. 20, 2015, 3 pgs.
ActiveVideo Networks Inc., Certificate of Patent JP5675765, Jan. 9, 2015, 3 pgs.
ActiveVideo Networks Inc., Notice of Reasons for Rejection, JP2014-100460, Jan. 15, 2015, 6 pgs.
ActiveVideo Networks Inc., Notice of Reasons for Rejection, JP2013-509016, Dec. 24, 2014 (Received Jan. 14, 2015), 11 pgs.
Brockmann, Notice of Allowance, U.S. Appl. No. 13/445,104, filed Dec. 24, 2014, 14 pgs.
Brockmann, Notice of Allowance, U.S. Appl. No. 14/298,796, filed Mar. 18, 2015, 11 pgs.
Brockmann, Office Action, U.S. Appl. No. 13/737,097, filed Mar. 16, 2015, 18 pgs.
Brockmann, Office Action, U.S. Appl. No. 13/668,004, filed Feb. 26, 2015, 17 pgs.
Brockmann, Office Action, U.S. Appl. No. 13/686,548, filed Jan. 5, 2015, 12 pgs.
Brockmann, Office Action, U.S. Appl. No. 13/911,948, filed Dec. 26, 2014, 12 pgs.
Brockmann, Office Action, U.S. Appl. No. 13/911,948, filed Jan. 29, 2015, 11 pgs.
Craig, Decision on Appeal (Reversed), U.S. Appl. No. 11/178,177, filed Feb. 24, 2015, 7 pgs.
Craig, Notice of Allowance, U.S. Appl. No. 11/178,177, filed Mar. 5, 2015, 7 pgs.
Craig, Notice of Allowance, U.S. Appl. No. 11/178,181, filed Feb. 13, 2015, 8 pgs.
Dahlby, Office Action, U.S. Appl. No. 12/651,203, filed Dec. 3, 2014, 19 pgs.
Gordon, Notice of Allowance, U.S. Appl. No. 12/008,697, filed Dec. 8, 2014, 10 pgs.
Gordon, Office Action, U.S. Appl. No. 12/008,722, filed Nov. 28, 2014, 18 pgs.
Regis, Notice of Allowance, U.S. Appl. No. 13/273,803, filed Nov. 18, 2014, 9 pgs.
Regis, Notice of Allowance, U.S. Appl. No. 13/273,803, filed Mar. 2, 2015, 8 pgs.
Tag Networks Inc, Decision to Grant a Patent, JP 2008-506474, Oct. 4, 2013, 5 pgs.
Gordon, Notice of Allowance, U.S. Appl. No. 12/008,697, filed Apr. 1, 2015, 10 pgs.
Related Publications (1)
Number Date Country
20120137337 A1 May 2012 US
Provisional Applications (1)
Number Date Country
60702507 Jul 2005 US
Continuations (1)
Number Date Country
Parent 11258601 Oct 2005 US
Child 13311203 US