MPEG objects and systems and methods for using MPEG objects

Information

  • Patent Grant
  • Patent Number
    9,355,681
  • Date Filed
    Friday, January 11, 2008
  • Date Issued
    Tuesday, May 31, 2016
Abstract
An MPEG object is an object-oriented data structure that may be used in the creation of an interactive MPEG video stream. The MPEG object data structure includes an MPEG object interface defining data received by the MPEG object and data output by the MPEG object. The MPEG object data structure further includes either one or more MPEG video elements or an association to one or more MPEG video elements. The MPEG video elements are preferably defined as MPEG slices that include a plurality of encoded macroblocks. Additionally, the data structure may provide a method for receiving input through the MPEG object interface and, in response to input from the client device, outputting an MPEG video element. In certain configurations, the MPEG object contains a method that maintains state data for the MPEG object. The state data may be used to select between a plurality of MPEG elements to output. In other configurations, the MPEG object includes a method that provides for the control of streaming MPEG content from a source external to the MPEG object.
Description
TECHNICAL FIELD AND BACKGROUND ART

The present invention relates to interactive encoded video and more specifically to interactive MPEG video that can be used with a client device having a decoder and limited caching capabilities.


Set-top boxes of cable television systems have historically been simple devices. The boxes generally include a QAM decoder, an MPEG decoder, and a transceiver for receiving signals from a remote control and transferring the signals to the cable headend. In order to keep costs down, set-top boxes have not included sophisticated processors, such as those found in personal computers, or extensive memory for caching content or programs. As a result, developers attempting to provide subscribers with interactive content that includes encoded video elements, such as those found in dynamic web pages, have been forced to find solutions that are compatible with the set-top boxes. These solutions require having the processing functionality reside at the cable headend and further require that the content be delivered in MPEG format. In order to provide dynamic web page content, the content forming the web page first must be decoded and then rendered within the webpage frame as a bitmap. The rendered frames are then re-encoded into an MPEG stream that the set-top box of a requesting user can decode. This decoding and re-encoding scheme is processor intensive.


SUMMARY OF THE INVENTION

In a first embodiment, a system for providing interactive MPEG content for display on a display device associated with a client device having an MPEG decoder is disclosed. The system operates in a client/server environment wherein the server includes a plurality of session processors that can be assigned to an interactive session requested by a client device. The session processor runs a virtual machine, such as a JAVA virtual machine. The virtual machine includes code that, in response to a request for an application, accesses the requested application. In addition, the virtual machine is capable of parsing the application and interpreting scripts. The application contains a layout for an MPEG frame composed of a plurality of MPEG elements. The application also includes a script that refers to one or more MPEG objects that provide the interactive functionality and the MPEG elements (MPEG encoded audio/video), or a methodology for accessing the encoded MPEG audio/video content if the content is stored external to the MPEG object.


The MPEG object includes an object interface that defines data received by the MPEG object and data output by the MPEG object. Additionally, the MPEG object includes one or more MPEG video or audio elements. The MPEG elements are preferably groomed so that the elements can be stitched together to form an MPEG video frame. In some embodiments, the MPEG elements are located external to the MPEG object and the MPEG object includes a method for accessing the MPEG element(s). In certain embodiments, the MPEG object includes a plurality of MPEG video elements wherein each element represents a different state for the MPEG object. For example, a button may have an “on” state and an “off” state and an MPEG button object would include an MPEG element composed of a plurality of macroblocks/slices for each state. The MPEG object also includes methods for receiving input from the client device through the object interface and for outputting data from the MPEG object through the object interface.
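The state-driven selection described above can be sketched as a minimal class: one pre-encoded MPEG element per visual state, with input through the object interface switching which element is handed to the stitcher. All class and method names here (MpegButton, MpegElement, onInput) are illustrative assumptions, not names from the patent.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of an atomic MPEG object holding one pre-encoded
// element (a group of slices) per visual state.
class MpegButton {
    // Stand-in for a pre-encoded group of MPEG slices.
    static class MpegElement {
        final String sliceData;
        MpegElement(String sliceData) { this.sliceData = sliceData; }
    }

    private final Map<String, MpegElement> elementsByState = new HashMap<>();
    private String state;

    MpegButton(String initialState) { this.state = initialState; }

    // Associate a pre-encoded MPEG element with a named state (e.g. "on", "off").
    void addState(String name, MpegElement element) {
        elementsByState.put(name, element);
    }

    // Input received through the object interface updates the stored state.
    void onInput(String newState) {
        if (elementsByState.containsKey(newState)) {
            state = newState;
        }
    }

    // The element for the current state is what gets handed to the stitcher.
    MpegElement currentElement() { return elementsByState.get(state); }

    String state() { return state; }
}
```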


After the program running on the virtual machine has obtained all of the MPEG objects indicated in the application, the program on the virtual machine provides the MPEG elements and the layout to a stitcher. In certain embodiments, the virtual machine and program for retrieving and parsing the application and interpreting the scripts may be subsumed in the stitcher. The stitcher then stitches together each of the MPEG elements in its position within the MPEG frame. The stitched MPEG video frame is passed to a multiplexor that multiplexes in any MPEG audio content and additional data streams, and the MPEG video frame is placed into an MPEG transport stream that is directed to the client device. In certain embodiments, the multiplexor may be internal to the stitcher. The client device receives the MPEG frame and can then decode and display the video frame on an associated display device. This process repeats for each video frame that is sent to the client device. As the client interacts and makes requests, for example changing the state of a button object, the virtual machine in conjunction with the MPEG object updates the MPEG element provided to the stitcher, and the stitcher replaces the MPEG element within the MPEG video frame based upon the request of the client device. In certain other embodiments, each MPEG element representative of a different state of the MPEG object is provided to the stitcher. The virtual machine forwards the client's request to the stitcher, and the stitcher selects from a buffer the appropriate MPEG element, based upon the MPEG object's state, to stitch into the MPEG video frame.
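The stitch-and-replace cycle above can be sketched as follows: each MPEG element occupies a slice position given by the layout, a state change swaps only that element, and the next composed frame emits the cached slices in scan order. The Stitcher class and its methods are illustrative assumptions, not an implementation from the patent.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.TreeMap;

// Sketch of a stitcher: elements are keyed by their slice row from the
// layout; composing a frame walks the rows in scan order.
class Stitcher {
    private final TreeMap<Integer, String> elementByRow = new TreeMap<>();

    // Place (or replace, on a state change) an element's encoded slices
    // at the slice row given by the application layout.
    void placeElement(int sliceRow, String encodedSlices) {
        elementByRow.put(sliceRow, encodedSlices);
    }

    // Compose a frame by emitting the cached slices in scan order.
    List<String> composeFrame() {
        return new ArrayList<>(elementByRow.values());
    }
}
```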


An interactive MPEG application may be constructed in an authoring environment. The authoring environment includes an editor with one or more scene windows that allow a user to create a scene based upon placement of MPEG objects within a scene window. An object tool bar is included within the authoring environment that allows the MPEG objects to be added. The authoring environment also includes a processor that produces an application file that contains at least a reference to the MPEG objects and the display position for each of the MPEG objects within the scene. Preferably, when the MPEG object is placed within a scene window, the MPEG video element for the MPEG object is automatically snapped to a macroblock boundary. For each MPEG object that is added to the scene, the properties for the object can be modified. The authoring environment also allows a programmer to create scripts for using the MPEG objects. For example, a script within the application may relate a button state to an execution of a program. The authoring environment also provides for the creation of new MPEG objects. A designer may create an MPEG object by providing graphical content such as a video file or still image. The authoring environment will encode the graphical content so that the content includes MPEG elements/slices or a sequence of MPEG elements/slices. In addition to defining the MPEG video resource, the authoring environment allows the designer to add methods, properties, object data and scripts to the MPEG object.
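Since MPEG macroblocks are 16x16 pixels, snapping a placed object to a macroblock boundary amounts to rounding each pixel coordinate to a multiple of 16. A minimal sketch follows; the patent does not specify the rounding rule, so nearest-multiple rounding is an assumption.

```java
// Snap pixel coordinates to the 16-pixel MPEG macroblock grid.
class MacroblockGrid {
    static final int MACROBLOCK = 16;

    // Round a pixel coordinate to the nearest macroblock boundary.
    static int snap(int pixel) {
        return Math.round(pixel / (float) MACROBLOCK) * MACROBLOCK;
    }
}
```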





BRIEF DESCRIPTION OF THE DRAWINGS

The foregoing features of the invention will be more readily understood by reference to the following detailed description, taken with reference to the accompanying drawings, in which:



FIG. 1 graphically shows an example of an atomic MPEG object as used in a client/server environment;



FIG. 1A is a flow chart showing process flow between a stitcher and events from a client device;



FIG. 2 graphically shows an example of a streaming MPEG object as used in a client/server environment;



FIG. 2A graphically shows an embodiment employing several session processors;



FIG. 3 provides an exemplary data structure and pseudo code for an atomic MPEG button object;



FIG. 4 provides an exemplary data structure and pseudo code for a progress bar MPEG object;



FIG. 5 shows an exemplary screen shot of an authoring environment for creating applications that use MPEG objects;



FIG. 6A shows an exemplary screen shot of a properties tab for an MPEG object;



FIG. 6B shows an exemplary screen shot of an event tab for an MPEG object;



FIG. 6C shows an exemplary screen shot of a script editor that can be used to create a script for an application that uses MPEG objects; and



FIG. 7 shows a system for using MPEG objects for interactive content.





DETAILED DESCRIPTION OF SPECIFIC EMBODIMENTS

Embodiments of the present invention disclose MPEG objects and systems and methods of using MPEG objects in a client/server environment for providing interactive encoded video content to a client device that includes an MPEG decoder and an upstream data connection to the server in an interactive communications network. As used in the detailed description and the claims, the terms MPEG element and MPEG video element shall refer to graphical information that has been formatted according to an MPEG standard (Moving Picture Experts Group). The graphical information may only be partially encoded. For example, graphical information that has been transform coded using the discrete cosine transform will be considered to be an MPEG element without requiring quantization, entropy encoding and additional MPEG formatting. MPEG elements may include MPEG header information at the macroblock and slice levels. An MPEG element may include data for a full MPEG video frame, a portion of an MPEG video frame (contiguous or non-contiguous macroblocks or slices), or data representative of a temporal sequence (frames, macroblocks or slices).


Interactive content formed from MPEG objects is preferably used in a client/server environment 100 as shown in FIG. 1 wherein the client device 101 does not need memory for caching data and includes a standard MPEG video decoder. An example of such a client device is a set-top box or other terminal that includes an MPEG decoder. Client devices may include a full processor and memory for caching; however, these elements are not necessary for operation of this system. The server device in the client/server environment contains at least a session processor 102 formed from at least one processor that includes associated memory.


The client 101 and server establish an interactive session wherein the client device 101 transmits a request for an interactive session through an interactive communication network. The server assigns a session processor 102 and the request is sent to an input receiver 103 of the assigned session processor 102. The session processor 102 runs a virtual machine 104 that can interpret scripts. The virtual machine 104 may be any one of a number of virtual machines, such as a JAVA virtual machine. In response to the interactive request from the client, addressing information for the session processor is passed to the client 101. The client 101 then selects an interactive application, as defined in an AVML (Active Video Mark-up Language) file, to view and interact with. Interactive applications may include references to video content along with selection controls, such as buttons, lists, and menus. Further explanation of such applications is provided in the U.S. patent application filed concurrently herewith, entitled "Interactive Encoded Content System including Object Models for Viewing on a Remote Device," assigned to the same assignee, which is incorporated by reference in its entirety. The request for the selected application is directed to the virtual machine 104. The virtual machine 104 accesses the AVML file defining the application that indicates the MPEG objects, along with any other graphical content that is necessary for composing a video frame within a video sequence for display on a display device. The AVML file also includes the location within the frame for positioning each of the MPEG objects. In addition, the AVML file may include one or more scripts. One use for a script is to maintain the state of an MPEG object. These MPEG objects can reside and be accessed at different locations and may be distributed.
The graphical elements of the MPEG objects are stitched together by a stitcher 105 based upon the location information within the application file (AVML file) to form complete MPEG video frames. The video frames along with MPEG audio frames are multiplexed together in a multiplexor 106 within the stitcher to form an MPEG stream that is sent to the requesting client device. The MPEG stream may then be decoded and displayed on the client's device. The input receiver, virtual machine, and stitcher may be embodied as computer code that can be executed/interpreted on the session processor, in hardware, or in a combination of hardware and software. In some embodiments, any of the software (i.e. input receiver, virtual machine, or stitcher) may be constructed in hardware that is separate from the session processor. Additionally, the stitcher, which may be a computer program application, may incorporate the functionality of the input receiver and the virtual machine and may process and parse the application file (AVML).


In certain embodiments, the stitcher may stitch the graphical elements together based upon the type of device that has requested the application. Devices have different capabilities. For example, MPEG decoders on certain devices may not be as robust and capable of implementing all aspects of the chosen MPEG standard. Additionally, the bandwidth of the transmission path between the multiplexor and the client device may vary. For example, in general, wireless devices may have less bandwidth than wireline devices. Thus, the stitcher may insert parameters into the MPEG header that specify a load delay or no delay, allow or disallow skips, force all frames to be encoded as I-frames, or use a repeated uniform quantization to reduce the number of bits required to represent the values.
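The device-dependent choices above can be sketched as a small profile selector: decoder robustness and available bandwidth drive the header/encoding parameters. The class, field names, and the 2000 kbps threshold are illustrative assumptions, not values from the patent.

```java
// Sketch: pick stitcher encoding parameters from a client profile.
class EncodingProfile {
    final boolean allowSkips;
    final boolean forceIFrames;
    final boolean uniformQuantization;

    private EncodingProfile(boolean allowSkips, boolean forceIFrames,
                            boolean uniformQuantization) {
        this.allowSkips = allowSkips;
        this.forceIFrames = forceIFrames;
        this.uniformQuantization = uniformQuantization;
    }

    // Weak decoders get all I-frames and no skips; low-bandwidth
    // (e.g. wireless) clients get bit-saving uniform quantization.
    static EncodingProfile forClient(boolean robustDecoder, int bandwidthKbps) {
        boolean lowBandwidth = bandwidthKbps < 2000; // illustrative threshold
        return new EncodingProfile(robustDecoder, !robustDecoder, lowBandwidth);
    }
}
```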


An MPEG object is part of a programming paradigm that allows individual MPEG video elements to be stitched together to form a frame of a video stream that incorporates active elements wherein a client can interact with the active elements and more specifically change the video stream. The MPEG video elements associated with an MPEG object may be a plurality of encoded macroblocks or slices that form a graphical element. A client can use a client device to select a graphical element on a display screen and interact with that graphical element. An MPEG object 110 includes an association with MPEG video and/or audio data along with methods and properties for the object. The MPEG video or audio may reside internal to the MPEG object or may be externally accessed through remote function calls. The methods within an MPEG object are code that may receive data from outside of the object, process the received data and/or the MPEG video 115 and audio data 120 and output data from the object according to video and audio directives. Object data 160 may indicate the state of the object or other internal variables for the object. For example, parameters such as display priority may be used to determine the priority of stacked media. In addition, parental control parameters, such as a content rating, may be associated with the audio or video data or an audio or video source or address. A parental control may be a method internal to an MPEG object that provides for control over access to the content.


As shown in FIG. 1, a virtual machine is made active on a session processor 102 in response to a request for an interactive application (AVML file having a script) and accesses a first MPEG object 110 which is an atomic object. An atomic object is self-contained in that the object contains all of the encoded data and methods necessary to construct all of the visual states for the object. Once the object is retrieved by the virtual machine, the object requires no additional communications with another source. An example of an atomic object is a button that is displayed within a frame. The button object would have an MPEG video file for all states of the button and would include methods for storing the state based upon a client's interaction. The atomic object includes both pre-encoded MPEG data (video and audio data) 115, 120 along with methods 130. In certain embodiments, the audio or video data may not initially be MPEG elements, but rather graphical or audio data in another format that is converted either by the virtual machine or the stitcher into MPEG elements. In addition to the pre-encoded MPEG data 115, 120, the atomic object can include object data 160, such as state information. The object interacts with external sources through an interface definition 170 along with a script 180 for directing data to and from the object. The interface 170 may be for interacting with C++ code, Java Script or binary machine code. For example, the interface may be embodied in a class definition.


An event may be received from a client device into the input receiver 103 that passes the event to an event dispatcher 111. The event dispatcher 111 identifies an MPEG object within the AVML file that is capable of processing the event. The event dispatcher then communicates the event to that object.


In response, the MPEG object through the interface definition 170 accesses the MPEG video 115 and/or audio data 120. The MPEG object may implement a method 130 for handling the event. In other embodiments, the interface definitions may directly access the data (object data, audio data and video data). Each MPEG object may include multiple MPEG video files that relate to different states of the object wherein the state is stored as object data 160. For example, the method may include a pointer that points the stitcher to the current frame and that is updated each time the stitcher is provided with a video frame. Similarly, the MPEG audio data 120 may have associated methods within the MPEG object. For example, the audio methods 130 may synchronize the MPEG audio data 120 with the MPEG video data 115. In other embodiments, state information is contained within the AVML file 11.
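The event path described above can be sketched as a dispatcher that walks the objects named in the application and hands the event to the first one able to process it. The interface and method names (canHandle, handle, dispatch) are hypothetical; the patent only describes the dispatcher identifying a capable object and communicating the event to it.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of an event dispatcher for MPEG objects within an AVML file.
class EventDispatcher {
    // Minimal view of an MPEG object's event-handling interface.
    interface MpegObject {
        boolean canHandle(String event);
        void handle(String event);
    }

    private final List<MpegObject> objects = new ArrayList<>();

    void register(MpegObject obj) { objects.add(obj); }

    // Hand the event to the first object capable of processing it;
    // returns true if some object consumed the event.
    boolean dispatch(String event) {
        for (MpegObject obj : objects) {
            if (obj.canHandle(event)) {
                obj.handle(event);
                return true;
            }
        }
        return false;
    }
}
```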


The process flow for the MPEG object and system for implementing the MPEG object is shown in the flow chart of FIG. 1A. In FIG. 1A, all code for accessing and parsing of an application is contained within the stitcher. The stitcher may be a software module that operates within the virtual machine on the session processor.


After receiving the request for the application and retrieving the application, the stitcher first loads any script that exists within the application (100A). The stitcher accesses the layout for the video frame and loads this information into memory (110A). The layout will include the background, the overall size of the video frame, the aspect ratio, and the position of any objects within the application. The stitcher then instantiates any MPEG objects that are present within the application (120A). Based upon a script within the application that keeps track of the state of an object, the graphical element associated with the state for each object is retrieved from a memory location. The graphical element may be in a format other than MPEG and may not initially be an MPEG element. The stitcher will determine the format of the graphical element. If the graphical element is in a non-MPEG element format, such as a TIFF, GIF or RGB format, for example, the stitcher will render the graphical element into a spatial representation (130A). The stitcher will then encode the spatial representation of the graphical element so that it becomes an MPEG element (135A). Thus, the MPEG element will have macroblock data formed into slices. If the graphical element associated with the MPEG object is already in an MPEG element format, then neither rendering nor encoding is necessary. The MPEG elements may include one or more macroblocks that have associated position information. The stitcher then converts the relative macroblock/slice information into global MPEG video frame locations based upon the position information from the layout and encodes each of the slices. The slices are then stored to memory so that they are cached for quick retrieval (140A). An MPEG video frame is then created. The MPEG elements for each object, based upon the layout, are placed into scan order by slice for an MPEG frame. The stitcher sequences the slices into the appropriate order to form an MPEG frame (145A). The MPEG video frame is sent to the stitcher's multiplexor, and the multiplexor multiplexes the video frame with any audio content. The MPEG video stream that includes the MPEG video frame and any audio content is directed through the interactive communication network to the client device of the user for display on a display device (190A).


Changes to the MPEG frames are event driven. A user, through an input device, sends a signal through a client device to the session processor that is provided to the stitcher (160A). The stitcher checks to see if the input that is received is input that is handled by the script of the application using the event dispatcher (165A). If it is handled by the script, the script directives are executed/interpreted (170A). The stitcher determines if the object state has changed (175A). The stitcher will retrieve the graphical element associated with the state of that object from a memory location (180A). The stitcher may retrieve the graphical element from a memory location associated with the MPEG object after the event has been processed, or the MPEG object may place the graphical element in a memory location associated with the stitcher during event processing. The stitcher will again determine the format of the graphical element. If the graphical element is in a non-MPEG element format and therefore is not structured according to macroblocks and slices, the stitcher will render and encode the element as an MPEG element and will cache the element into a buffer (130A, 135A, 140A). This new MPEG element representative of the change in state will be stitched into the MPEG frame at the same location as defined by the layout for the MPEG frame from the application (145A). The stitcher will gather all of the MPEG elements, place the slices into scan order, and format the frame according to the appropriate MPEG standard. The MPEG frame will then be sent to the client device for display (190A). The system will continue to output MPEG frames into an MPEG stream until the next event causes a change in state and, therefore, a change to one or more MPEG elements within the frame layout.
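The caching behavior in the flow above (render and encode a non-MPEG graphic once, then serve the cached slices for every subsequent frame) can be sketched as a cache keyed by object and state. The ElementCache class and the String stand-ins for slice data are illustrative assumptions.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Sketch: encoded MPEG elements cached by "object:state" key, so the
// render-and-encode step (130A/135A) runs only on a cache miss (140A).
class ElementCache {
    private final Map<String, String> cache = new HashMap<>();
    private int encodeCalls = 0;

    // Return cached slices for this object/state, rendering and encoding
    // only when the element has not been produced before.
    String getSlices(String objectState, Function<String, String> renderAndEncode) {
        return cache.computeIfAbsent(objectState, key -> {
            encodeCalls++; // render + encode happens only on a miss
            return renderAndEncode.apply(key);
        });
    }

    int encodeCalls() { return encodeCalls; }
}
```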


A second MPEG object is a streaming MPEG object. The streaming MPEG object operates within the same environment as the atomic object, but the object is not self-contained and accesses an outside source for source data. For example, the object may be a media player that allows for selection between various sources of audio and video. Thus, the MPEG object is not self-contained for each of the audio and video sources, but the MPEG object accesses the sources based upon requests from the client device. As shown in FIG. 2, the MPEG object 200 and methods implemented according to interface definitions (input, output) 211 link the MPEG object 200 to the virtual machine 230 and the stitcher 250, as well as to an RPC (remote procedure call) receiver 212 at a stream source 220. Thus, the streaming MPEG object is in communication with the virtual machine/client 230, 240, a stitcher 250, and a source entity, the stream source 220, as well as other sources. The interface definitions may also directly access the data (object, audio and video). In response to an event, an event dispatcher accesses the MPEG object capable of handling the event using the interface. The event dispatcher causes the MPEG object to access or request the video and audio content requested by the client. This request may be achieved directly by a method within the MPEG object that accesses the data source. In other embodiments, a script within the AVML file calls an RPC receiver 212 that accesses a server script 213. The server script 213 retrieves the requested content (event source 214, data source 215, video source 216, or audio source 217) or accesses an address for the content and either provides this information or content to the MPEG object or to the stitcher 250.


The server script 213 may render the requested content and encode the content as one or more MPEG slices. MPEG video content can be passed through the MPEG object to the stitcher 250 that stitches together the MPEG video content into an MPEG video frame. The MPEG object may also request or retrieve audio MPEG content that can be passed to the stitcher. Thus, audio MPEG content may be processed in a similar fashion to MPEG video content. The MPEG video data may be processed by a method within the MPEG object. For example, a method may synchronize all of the MPEG content prior to providing the MPEG content to the stitcher, or the method may confirm that all of the MPEG content has been received and is temporally aligned, so that the stitcher can stitch together a complete MPEG video frame from a plurality of MPEG object video and audio data for presentation to the client in a compliant MPEG stream. The script of the AVML file or the MPEG object may request updated content from the stream source through the server script 213 or directly from an addressable location. An event requesting updated content may originate from communication with the client. The content may originate from a data, audio, video, or event source 214-217.
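The confirmation step above (checking that all MPEG content has arrived and is temporally aligned before the stitcher composes the frame) can be sketched as a simple predicate over per-source presentation times. The SyncCheck class and the timestamp representation are illustrative assumptions; the patent does not specify how alignment is tested.

```java
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Sketch: a frame may be stitched only when every required source has
// delivered content for the same presentation time.
class SyncCheck {
    static boolean aligned(Map<String, Long> presentationTimeBySource,
                           Set<String> requiredSources) {
        if (!presentationTimeBySource.keySet().containsAll(requiredSources)) {
            return false; // some source has not arrived yet
        }
        // All delivered timestamps must agree.
        return new HashSet<>(presentationTimeBySource.values()).size() == 1;
    }
}
```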


Event data 214 includes but is not limited to trigger data. Triggers include data that can be inserted into the MPEG transport stream. In addition, triggers may be internal to an MPEG video or audio source. For example, triggers may be located in header information or within the data content itself. These triggers, when triggered, can cause different events, such as an overlay to be presented on the screen of the client or a pop-up advertisement. The data source 215 may include data that is not traditionally audio or video data. For example, data from the data source may include an alert notification for the client script, data to be embedded within the MPEG video stream, or stock data that is to be merged with a separate graphical element.


Each of the various sources that have been requested is provided to the stitcher directly or may pass through the MPEG object. The MPEG object, using a method, may combine the data sources into a single stream for transport to the session processor, where the single stream is received. Like the atomic object, the streaming object may include audio and video methods 281, 282 that synchronize the audio and video data. The video method 282 provides the video content to the stitcher so that the stitcher can stitch each of the MPEG video elements together to form a series of MPEG frames. The audio method 281 provides the audio data to the multiplexor within the stitcher so that the audio data is multiplexed together with the video data into an MPEG transport stream. The MPEG object also includes methods 283, 284 for the event data and for the other data.


Streaming MPEG objects may be produced by stitching multiple streaming MPEG objects 201A, 202A . . . 203A together in a session processor 200A. Construction of a scene may occur by linking multiple session processors 210A . . . 220A wherein each session processor feeds the next session processor with the MPEG elements of an MPEG object as shown in FIG. 2A.


The MPEG object, either an atomic object or a streaming object, may itself be an application with a hierarchy of internal objects. For example, there may be an application object that defines the type of application at the top level. Below the application object there may be a scene object that defines a user interface including the locations of MPEG elements that are to be stitched together along with reference to other MPEG objects that are necessary for the application. Below the scene object, the individual MPEG objects would be located. Thus, an MPEG object may be a self-contained application. In such an embodiment, in response to a request for an application, the client script would call the MPEG object that contains the application and the application would be instantiated.


An example of an atomic MPEG object's data structure 300 along with pseudo code 310 for the MPEG object is shown in FIG. 3. Each MPEG object includes an interface segment 315 that may provide such information as class definitions and/or the location of the object and related class definitions in a distributed system. MPEG objects also include either a resource segment 316 or a method for at least receiving one or more resources.


The data structure 300 of FIG. 3 shows the object container/package 320 that includes an interface segment 315 that provides the location of the button MPEG object. The object also includes an object data segment 317. As shown, there may be multiple object data segments (i.e. Interface Data, Visible Data, Audible Data, Button Data etc.) The object data is data that is used to define parameters of the object. For example, the visible data 330 for the object defines the height and the width of the button. The button data 340 provides a name for the button along with the states of the button and an audio file that is played when the button is selected (ClickAudio:=ClickSound.ac3). The resource segment 316 of the MPEG button object includes one or more video and/or audio files. In the example that is shown, the various state data for the button are provided 350, 351, wherein the video content would be a collection of macroblocks that represent one or more frames of MPEG video data. Thus, for each state of the button there would be at least one group of MPEG video elements composed of a plurality of macroblocks. The MPEG video elements would be the size of the height and width of the button and may be smaller than a frame to be displayed on a client's display device.
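The container layout of FIG. 3 can be sketched as a package with the three segments named above: an interface segment, object data segments, and a resource segment mapping states to encoded slices. The class name, the String stand-ins for class definitions and slice data, and the example values are illustrative assumptions.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of an MPEG object container/package: interface segment,
// object data segments, and a resource segment of per-state elements.
class MpegObjectPackage {
    String interfaceSegment; // location of the object and class definitions
    final Map<String, String> objectData = new LinkedHashMap<>();      // e.g. ButtonData
    final Map<String, String> resourceSegment = new LinkedHashMap<>(); // state -> slices
}
```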



FIG. 4 shows another example of a possible MPEG object including the data structure 400 and pseudo code 410. This example is of a progress bar object. Like the MPEG object of FIG. 3, the progress bar MPEG object includes an interface segment 415 that identifies the location of the object's classes. Sample class definitions are provided in both XML and JAVA 422, 423. In the class definition, the class includes methods for clearing the variable percentage and for setting the MPEG graphic initially to 0 percent.slc, wherein slc represents an MPEG slice. In addition, the progress bar includes an Object Data Segment 417 that provides interface data (name of the progress bar), visible data (the size of the progress bar MPEG slices) and progress data (an internal variable that is updated as progress of the event being measured increases) 418. The progress bar MPEG object includes resource data 316 that includes MPEG slices that represent the various graphical states representing percentages of completion of the event being monitored. Thus, there may be ten different progress bar graphics each composed of MPEG slices 419. These MPEG slices can be combined with other MPEG slices to form a complete MPEG frame.
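The ten-state progress bar above maps its internal percentage variable to one of ten slice graphics (0 percent.slc through 90 percent.slc). A minimal sketch follows; the bucketing rule (round down, with 100% clamped to the 90-percent graphic) is an assumption, since the patent does not state how the percentage selects a graphic.

```java
// Sketch of the progress bar MPEG object's state-to-resource mapping.
class ProgressBar {
    private int percentage = 0; // internal variable from the Object Data Segment

    void setPercentage(int pct) {
        percentage = Math.max(0, Math.min(100, pct)); // clamp to 0..100
    }

    // Pick one of the ten slice resources for the current state.
    String currentSlice() {
        int bucket = Math.min(percentage / 10 * 10, 90);
        return bucket + " percent.slc";
    }
}
```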


An authoring environment provides for the creation and manipulation of MPEG objects and allows for the creation of scenes for an interactive application. The authoring environment is preferably a graphical user interface authoring tool for creating MPEG objects and interactive applications by graphical selection of MPEG objects. The authoring environment includes two interfaces. The first interface is the authoring tool for creating MPEG objects and defining application scenes. The second interface is a script editor that allows a designer to add events and methods to an MPEG object or to a scene. The output of the authoring environment may be self-contained binary code for an MPEG object or a structured data file representing an application. The structured data file for an application includes information regarding the MPEG objects within a scene, the location of the MPEG graphical element of the MPEG object within a frame, properties for the MPEG object, the address/memory location of the MPEG object, and scripts for the application that access and use the MPEG objects. The self-contained binary code for an MPEG object may be used by an application. The application may access an MPEG object by referencing the memory location wherein the self-contained binary code is located.



FIG. 5 graphically shows the authoring environment 600. The graphical environment allows an application designer to add MPEG objects into a scene layout 610 through graphical selection of a representative icon 620 that is linked to the underlying object code. In addition, the authoring environment allows a user to create new MPEG objects.


A top-level scene will be the first scene that is provided to a user's device when the application is loaded. The application designer can select and drag and drop an object from the object toolbar 620. For example, the designer can insert user interface objects such as a media player object, a ticker object, a button object, a static image, a list box object, or text. The authoring environment includes other objects such as container objects, session objects and timer objects that are not graphical in nature, but are part of the MPEG object model.


The authoring environment includes an application tree 630 that indicates the level of the application. For example, an application may include a plurality of video scenes wherein a single scene is equivalent to a portion of a webpage. The video scene may allow a user of the interactive video to drill down to a second scene by selecting a link within the video scene. The second scene would be at a level that is lower than the first scene. The application tree 630 provides both a listing of the scene hierarchy as well as a listing of the objects within the scene in a hierarchical order.


Rather than the creation of an application, the designer may create an object or a hierarchical object that contains a plurality of objects. Thus, the output of the authoring environment may also be that of an MPEG object. The designer would provide graphical content, for example in the form of a JPEG image, and the authoring environment would render the JPEG image and encode the JPEG image as a sequence of slices. The authoring environment would also allow the designer to define scripts, methods and properties for the object.


For example, a designer may wish to create a new media player MPEG object to display viewable media streams. The designer may import a graphic that provides a skin for the media player that surrounds the media stream. The graphic would be rendered by the authoring environment and encoded as a plurality of MPEG slices. The designer could then add properties for the media player object such as the name and location of the media stream, whether a chaser (highlighting of the media stream within the video frame) is present, or the type of highlighting (i.e. a yellow ring around the object that has focus). In addition, the designer may include properties that indicate the objects that are located in each direction in case a user decides to move focus from the media player object to another object. For example, there may be chaser up, down, left, and right properties and associated methods that indicate the object that will receive focus if the current media player object has focus and the user presses one of the direction keys on a remote control coupled to the user's device (i.e. set-top box). The MPEG object designer may provide the media player object with events such as onLoad, which is triggered every time a user views the scene that has the media player object. Other events may include onFocus, which indicates that the object has received focus, and onBlur, which indicates the object has lost focus. An onKeyPress event may be included, indicating that this event will occur if the object is in focus and a key is pressed. The events and properties for the media player object are provided for exemplary purposes to show the nature and scope of events and properties that can be associated with an MPEG object. Other MPEG objects can be created having similar events and properties as well as distinct events and properties as required by the application designer.
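The media player properties and events described above might be sketched as follows; this is a hypothetical Java rendering, not the patent's implementation, and every name (chaserUp, onKeyPress, etc.) is an assumed stand-in for what a designer would define:

```java
// Hypothetical sketch of a media player object's properties and events.
class MpegMediaPlayer {
    String streamUrl;              // name and location of the media stream
    boolean chaser = true;         // highlight the stream when in focus
    // Navigation properties: object that receives focus per direction key.
    String chaserUp, chaserDown, chaserLeft, chaserRight;

    private boolean focused = false;

    void onLoad()  { /* triggered each time a scene with this object is viewed */ }
    void onFocus() { focused = true; }
    void onBlur()  { focused = false; }

    // onKeyPress fires only while this object has focus; it returns the
    // name of the object that should receive focus next, or null.
    String onKeyPress(String key) {
        if (!focused) return null;
        switch (key) {
            case "UP":    return chaserUp;
            case "DOWN":  return chaserDown;
            case "LEFT":  return chaserLeft;
            case "RIGHT": return chaserRight;
            default:      return null;
        }
    }
}
```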


The authoring environment includes a properties tab 640 and an events tab 650 for defining the properties of a predefined or new object. An example of the properties pane 660 is shown in FIG. 6A. The properties for a predefined ticker object (a banner that appears to scroll across the video frame) include the background color, the text color, the text font and the transparency of the ticker 665. It should be recognized that each object type will have different properties. The events tab allows the application designer to make associations between events (received signals from the user) and the object. For example, a button object may include a plurality of states (on and off). Associated with each state may be a separate MPEG video sequence. Thus, there is a video graphic for the “on” state that indicates the button has been activated and a video graphic for the “off” state that indicates the button is inactive. The events tab allows the application designer to make the association between the signal received from the user, the state change of the object and the change in the video content that is part of the scene. FIG. 6B shows an example of the events tab when selected for a predefined media player object. The events include onLoad, onFocus, onBlur, onKeyPress, and onClick events 670 for the media player. The authoring environment allows the designer to tab between scenes 680 and between the scene layout and the scripting page 690. As shown, the authoring environment includes a template tab 695 that allows for selection of previously saved scenes, so that a designer can use design information from previous scenes for the creation of new scenes. In addition, the designer may be provided with blank event panes and properties panes so that the designer can create a new MPEG object by defining properties and events for the new object.


Scripts can be added to an application or to a newly created object by selecting the scripting tab. FIG. 6C shows the script editor 691. For example, the script may determine the function that is provided if a client attempts to select a button graphic 692. In this example, the script would be part of the application file. Similarly, the designer may designate that the script is to be used for creating a script internal to the MPEG object such as the client script within the MPEG streaming object shown in FIG. 2 or the script shown in the atomic object of FIG. 1.


MPEG objects may also be generated in real-time. In this paradigm, a request for an MPEG object is made to the session processor wherein the MPEG object has undefined video and/or audio content. A script at the session processor will cause a separate processor/server to obtain and render the video content for the object, encode the content as an MPEG element and return a complete MPEG object in real-time to the session processor. The server may construct either an atomic or streaming MPEG object. The server may also employ caching techniques to store the newly defined MPEG objects for subsequent MPEG object requests. This methodology is useful for distributed rendering of user specific or real-time generated content. For example, the server may act as a proxy that transcodes a client's photo album, where the photos originate in a JPEG format and the server stores the photos as MPEG elements within an MPEG photo album object. The server may then pass the MPEG photo album object to the session processor for use with the requested application. Additionally, the MPEG photo album object would be saved for later retrieval when the client again requests the photo album.
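The caching behavior described above can be sketched as a minimal proxy, under the assumption that the expensive step (rendering and encoding, e.g. JPEG-to-MPEG transcoding) is hidden behind a supplied encoder function; the class and parameter names are hypothetical and the actual transcoding is not shown:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Hypothetical sketch of a caching proxy for real-time generated MPEG
// objects: encode a resource once, serve the cached object thereafter.
class MpegObjectProxy {
    private final Map<String, byte[]> cache = new HashMap<>();
    private final Function<String, byte[]> encoder;  // stand-in for transcoding

    MpegObjectProxy(Function<String, byte[]> encoder) {
        this.encoder = encoder;
    }

    byte[] getObject(String resourceId) {
        // Encode on the first request only; later requests hit the cache.
        return cache.computeIfAbsent(resourceId, encoder);
    }
}
```

This matches the photo-album scenario: the first request triggers rendering and encoding, while a repeat request for the same album is served from the stored MPEG object.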


Once the designer has completed the design of the application or the MPEG object, the system takes the received information and converts it into either binary code, if a new MPEG object is created, or an AVML (active video mark-up language) file, if the designer has created a new application. The AVML file is XML-based in syntax, but contains specific structures relevant to the formation of an interactive video. For example, the AVML file can contain scripts that interact with MPEG objects. An explanation of the AVML language can be found in Appendix A attached to the U.S. patent application entitled “Interactive Encoded Content System including Object Models for Viewing on a Remote Device,” filed concurrently herewith on Jan. 11, 2008 and assigned to the same assignee, which is incorporated by reference in its entirety. All objects within an application scene have a hierarchy in a logical stack. The hierarchy is assigned based on the sequence in which objects are added to the scene. The object first added to the scene is at the bottom of the stack. Objects may be moved up or down within the hierarchy prior to completion of the design and conversion of the graphical scene into the AVML file format. New MPEG objects that are in binary code may be incorporated into applications by referencing the storage location of the binary code.


The AVML file output from the authoring environment allows a stitcher module to be aware of the desired output slice configuration from the plurality of MPEG elements associated with the MPEG objects referenced within the AVML file. The AVML file indicates the size of the slices and the location of the slices within an MPEG frame. In addition, the AVML file describes the encapsulated self-describing object presentations or states of the MPEG objects. For example, if a button object is graphically placed into the authoring environment by a user, the authoring environment will determine the position of the button within an MPEG video frame based upon this dynamic placement. This position information will be translated into a frame location and will be associated with the MPEG button object. State information will also be placed within the AVML file. Thus, the AVML file will list the states for the MPEG button object (on and off) and will have a reference to the location of each MPEG graphical file (MPEG elements) for those two states.


After an application is defined by an application designer, a client can request the application by using the client's device 700 as shown in FIG. 7. The client's device 700 will request an interactive session and a session processor 701 will be assigned. The session processor 701 will retrieve the AVML file 702 from a memory location 703 for the requested application and will run a virtual machine 705. The virtual machine 705 will parse the AVML file and identify the MPEG objects that the session processor 701 needs to access for the application. The virtual machine 705 will determine the position of each graphical element 710 from the accessed MPEG objects 720 within a video frame based upon the position information from the AVML file 730 and the sizing information as defined within the MPEG objects 720. As shown, only one MPEG object is present in the figure, although many MPEG objects may be used in conjunction with the AVML file. Additionally, the MPEG object that is shown stored in memory has two representative components, the MPEG element 710 and the MPEG method 775. As expressed above, the MPEG element may be internal to the MPEG object or may be external. The MPEG elements 710a,b, which are preferably MPEG slices from one or more MPEG objects, are then passed to the stitcher 740 by the virtual machine 705, and the stitcher sequences the slices so that they form an MPEG video frame 750 according to the position information parsed by the virtual machine. The stitcher is presented with the MPEG elements associated with the objects for each state. For example, if an MPEG button object has MPEG elements of 64×64 pixels and has two states (on and off), the stitcher will buffer the pre-encoded 64×64 pixel MPEG elements for each state.
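The stitcher's sequencing step might be sketched as follows. This is a simplified assumption, not the patent's implementation: slice positions are reduced to row/column indices (as would be parsed from the AVML file), slice payloads are opaque byte arrays, and MPEG frame headers are omitted.

```java
import java.io.ByteArrayOutputStream;
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Hypothetical sketch of the stitcher: order pre-encoded slices by their
// frame position and concatenate them into one frame's slice data.
class SliceStitcher {
    static class PlacedSlice {
        final int row, col;   // macroblock-row and column within the frame
        final byte[] data;    // pre-encoded slice bytes
        PlacedSlice(int row, int col, byte[] data) {
            this.row = row; this.col = col; this.data = data;
        }
    }

    // Sequence slices top-to-bottom, left-to-right, then concatenate.
    byte[] stitch(List<PlacedSlice> slices) {
        List<PlacedSlice> ordered = new ArrayList<>(slices);
        ordered.sort(Comparator.<PlacedSlice>comparingInt(s -> s.row)
                               .thenComparingInt(s -> s.col));
        ByteArrayOutputStream frame = new ByteArrayOutputStream();
        for (PlacedSlice s : ordered)
            frame.write(s.data, 0, s.data.length);
        return frame.toByteArray();
    }
}
```

Because each element is already encoded, assembling a frame is pure byte sequencing; no decode/re-encode cycle is needed at the session processor.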


The MPEG video frame 750 is encapsulated so that it forms a part of an MPEG video stream 760 that is then provided to the client device 700. The client device 700 can then decode the MPEG video stream. The client may then interact with MPEG objects by using an input device 770. The session processor 701 receives the signal from the input device 770, and based on the signal and the object selected, methods 775 of the MPEG object 720 will be executed or interpreted by the virtual machine 705, an MPEG video element 710a will be updated, and the updated video element content 710c will be passed to the stitcher 740. Additionally, state information maintained by the session processor for the MPEG object that has been selected will be updated within the application (AVML file). The MPEG video element 710c may already be stored in a buffer within the stitcher. For example, the MPEG element 710c may be representative of a state. A request for a change in state of a button may be received by the session processor, and the stitcher can access the buffer that contains the MPEG slices of the MPEG element for the ‘off-state,’ assuming the button was previously in the ‘on-state.’ The stitcher 740 can then replace the MPEG element slice 710a within the MPEG frame 750 and the updated MPEG frame 750a will be sent to the client device 700. Thus, the client interacts with the MPEG content even though the client device may only have an MPEG decoder and an upstream connection for sending signals/instructions to the assigned session processor 701.
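The interaction path described above can be condensed into a small sketch: an upstream client signal is dispatched to the selected object's method, and the element the method returns is queued for the stitcher. All names here are hypothetical, and the MPEG object is reduced to a single-method interface for illustration.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the session processor's dispatch loop.
class SessionProcessor {
    // Minimal stand-in for an MPEG object's method table.
    interface MpegObject {
        byte[] handleInput(String signal);  // returns the updated MPEG element
    }

    private final Map<String, MpegObject> scene = new HashMap<>();
    private final Map<String, byte[]> updatedElements = new HashMap<>();

    void addObject(String name, MpegObject obj) { scene.put(name, obj); }

    // Upstream signal from the client's input device: execute the selected
    // object's method and queue its updated element for the stitcher.
    void onClientSignal(String objectName, String signal) {
        byte[] element = scene.get(objectName).handleInput(signal);
        updatedElements.put(objectName, element);
    }

    byte[] elementFor(String objectName) { return updatedElements.get(objectName); }
}
```

The client device itself only decodes the resulting stream and sends signals upstream; all object logic runs at the session processor.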


Although the present invention has been described in terms of MPEG encoding, the invention may be employed with other block-based encoding techniques for creating objects that are specific to those block-based encoding techniques. The present invention may be embodied in many different forms, including, but in no way limited to, computer program logic for use with a processor (e.g., a microprocessor, microcontroller, digital signal processor, or general purpose computer), programmable logic for use with a programmable logic device (e.g., a Field Programmable Gate Array (FPGA) or other PLD), discrete components, integrated circuitry (e.g., an Application Specific Integrated Circuit (ASIC)), or any other means including any combination thereof. In an embodiment of the present invention, predominantly all of the described logic may be implemented as a set of computer program instructions that is converted into a computer executable form, stored as such in a computer readable medium, and executed by a microprocessor under the control of an operating system.


Computer program logic implementing all or part of the functionality previously described herein may be embodied in various forms, including, but in no way limited to, a source code form, a computer executable form, and various intermediate forms (e.g., forms generated by an assembler, compiler, linker, or locator). Source code may include a series of computer program instructions implemented in any of various programming languages (e.g., object code, an assembly language, or a high-level language such as FORTRAN, C, C++, JAVA, or HTML) for use with various operating systems or operating environments. The source code may define and use various data structures and communication messages. The source code may be in a computer executable form (e.g., via an interpreter), or the source code may be converted (e.g., via a translator, assembler, or compiler) into a computer executable form.


The computer program may be fixed in any form (e.g., source code form, computer executable form, or an intermediate form) either permanently or transitorily in a tangible storage medium, such as a semiconductor memory device (e.g., a RAM, ROM, PROM, EEPROM, or Flash-Programmable RAM), a magnetic memory device (e.g., a diskette or fixed disk), an optical memory device (e.g., a CD-ROM), a PC card (e.g., PCMCIA card), or other memory device. The computer program may be fixed in any form in a signal that is transmittable to a computer using any of various communication technologies, including, but in no way limited to, analog technologies, digital technologies, optical technologies, wireless technologies, networking technologies, and internetworking technologies. The computer program may be distributed in any form as a removable storage medium with accompanying printed or electronic documentation (e.g., shrink wrapped software or a magnetic tape), preloaded with a computer system (e.g., on system ROM or fixed disk), or distributed from a server or electronic bulletin board over the communication system (e.g., the Internet or World Wide Web.)


Hardware logic (including programmable logic for use with a programmable logic device) implementing all or part of the functionality previously described herein may be designed using traditional manual methods, or may be designed, captured, simulated, or documented electronically using various tools, such as Computer Aided Design (CAD), a hardware description language (e.g., VHDL or AHDL), or a PLD programming language (e.g., PALASM, ABEL, or CUPL.)


While the invention has been particularly shown and described with reference to specific embodiments, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the appended clauses.


Embodiments of the present invention may be described, without limitation, by the following clauses. While these embodiments have been described in the clauses by process steps, an apparatus comprising a computer with associated display capable of executing the process steps in the clauses below is also included in the present invention. Likewise, a computer program product including computer executable instructions for executing the process steps in the clauses below and stored on a computer readable medium is included within the present invention.

Claims
  • 1. A tool for creating interactive MPEG content, the tool comprising: an editor having a scene window allowing a user to create a scene based upon placement of atomic MPEG objects within the scene window;an object tool bar allowing a user to add the atomic MPEG objects to the scene, each of the atomic MPEG objects having a respective plurality of visual states and comprising a respective plurality of graphical elements, each graphical element of the respective plurality of graphical elements corresponding to a respective visual state of the respective plurality of visual states, each of the atomic MPEG objects further comprising code to construct each visual state of the respective plurality of visual states, wherein: when a respective atomic MPEG object of the atomic MPEG objects is placed within the scene window, user-modifiable properties associated with the respective atomic MPEG object are displayed,the user can create a user-definable script for the respective atomic MPEG object, anda first atomic MPEG object of the atomic MPEG objects represents interactive graphical content; anda format processor comprising hardware for processing the scene into a file format that includes display position information of respective pluralities of graphical elements of the atomic MPEG objects that were added to the scene, so as to allow a stitcher to form a complete MPEG video frame from the respective pluralities of graphical elements,wherein different MPEG frames can be assigned to a respective visual state of the pluralities of visual states.
  • 2. The tool according to claim 1, wherein when a respective atomic MPEG object of the atomic MPEG objects is placed within the scene window, the editor snaps a first graphical element of the plurality of graphical elements of the respective atomic MPEG object to an MPEG macroblock boundary.
  • 3. The tool according to claim 1, wherein the respective pluralities of graphical elements of the atomic MPEG objects comprise respective pluralities of MPEG video elements corresponding to different visual states of the respective pluralities of visual states.
  • 4. The tool according to claim 1, wherein the first atomic MPEG object represents a button.
  • 5. The tool according to claim 1, wherein after a scene is defined, the editor creates an XML-based output file.
CROSS-REFERENCE TO RELATED APPLICATIONS

U.S. Patent Application entitled “Interactive Encoded Content System including Object Models for Viewing on a Remote Device” and assigned to the same assignee filed contemporaneously herewith on Jan. 11, 2008 is related generally to the subject matter of the present application and is incorporated herein by reference in its entirety. The present application claims priority from U.S. provisional application Ser. No. 60/884,744, filed Jan. 12, 2007, Ser. No. 60/884,773, filed Jan. 12, 2007, and Ser. No. 60/884,772, filed Jan. 12, 2007, the full disclosures of which are hereby incorporated herein by reference.

7089577 Rakib et al. Aug 2006 B1
7093028 Shao et al. Aug 2006 B1
7095402 Kunii et al. Aug 2006 B2
7114167 Slemmer et al. Sep 2006 B2
7124424 Gordon et al. Oct 2006 B2
7146615 Hervet et al. Dec 2006 B1
7146628 Gordon et al. Dec 2006 B1
7151782 Oz et al. Dec 2006 B1
7158676 Rainsford Jan 2007 B1
7200836 Brodersen et al. Apr 2007 B2
7212573 Winger May 2007 B2
7224731 Mehrotra May 2007 B2
7272556 Aguilar et al. Sep 2007 B1
7310619 Baar et al. Dec 2007 B2
7325043 Rosenberg et al. Jan 2008 B1
7346111 Winger et al. Mar 2008 B2
7360230 Paz et al. Apr 2008 B1
7412423 Asano Aug 2008 B1
7412505 Slemmer et al. Aug 2008 B2
7421082 Kamiya et al. Sep 2008 B2
7444306 Varble Oct 2008 B2
7444418 Chou et al. Oct 2008 B2
7500235 Maynard et al. Mar 2009 B2
7508941 O'Toole, Jr. et al. Mar 2009 B1
7512577 Slemmer et al. Mar 2009 B2
7543073 Chou et al. Jun 2009 B2
7596764 Vienneau et al. Sep 2009 B2
7623575 Winger Nov 2009 B2
7669220 Goode Feb 2010 B2
7742609 Yeakel et al. Jun 2010 B2
7743400 Kurauchi Jun 2010 B2
7751572 Villemoes et al. Jul 2010 B2
7757157 Fukuda Jul 2010 B1
7830388 Lu Nov 2010 B1
7840905 Weber et al. Nov 2010 B1
7936819 Craig et al. May 2011 B2
7941645 Riach et al. May 2011 B1
7970263 Asch Jun 2011 B1
7987489 Krzyzanowski et al. Jul 2011 B2
8027353 Damola et al. Sep 2011 B2
8036271 Winger et al. Oct 2011 B2
8046798 Schlack et al. Oct 2011 B1
8074248 Sigmon et al. Dec 2011 B2
8118676 Craig et al. Feb 2012 B2
8136033 Bhargava et al. Mar 2012 B1
8149917 Zhang et al. Apr 2012 B2
8155194 Winger et al. Apr 2012 B2
8155202 Landau Apr 2012 B2
8170107 Winger May 2012 B2
8194862 Herr et al. Jun 2012 B2
8243630 Luo et al. Aug 2012 B2
8270439 Herr et al. Sep 2012 B2
8284842 Craig et al. Oct 2012 B2
8296424 Malloy et al. Oct 2012 B2
8370869 Paek et al. Feb 2013 B2
8411754 Zhang et al. Apr 2013 B2
8442110 Pavlovskaia et al. May 2013 B2
8473996 Gordon et al. Jun 2013 B2
8619867 Craig et al. Dec 2013 B2
8621500 Weaver et al. Dec 2013 B2
8656430 Doyle Feb 2014 B2
20010008845 Kusuda et al. Jul 2001 A1
20010049301 Masuda et al. Dec 2001 A1
20020007491 Schiller et al. Jan 2002 A1
20020013812 Krueger et al. Jan 2002 A1
20020016161 Dellien et al. Feb 2002 A1
20020021353 DeNies Feb 2002 A1
20020026642 Augenbraun et al. Feb 2002 A1
20020027567 Niamir Mar 2002 A1
20020032697 French et al. Mar 2002 A1
20020040482 Sextro et al. Apr 2002 A1
20020047899 Son et al. Apr 2002 A1
20020049975 Thomas et al. Apr 2002 A1
20020054578 Zhang et al. May 2002 A1
20020056083 Istvan May 2002 A1
20020056107 Schlack May 2002 A1
20020056136 Wistendahl et al. May 2002 A1
20020059644 Andrade et al. May 2002 A1
20020062484 De Lange et al. May 2002 A1
20020066101 Gordon et al. May 2002 A1
20020067766 Sakamoto et al. Jun 2002 A1
20020069267 Thiele Jun 2002 A1
20020072408 Kumagai Jun 2002 A1
20020078171 Schneider Jun 2002 A1
20020078456 Hudson et al. Jun 2002 A1
20020083464 Tomsen et al. Jun 2002 A1
20020095689 Novak Jul 2002 A1
20020105531 Niemi Aug 2002 A1
20020108121 Alao et al. Aug 2002 A1
20020131511 Zenoni Sep 2002 A1
20020136298 Anantharamu et al. Sep 2002 A1
20020152318 Menon et al. Oct 2002 A1
20020171765 Waki et al. Nov 2002 A1
20020175931 Holtz et al. Nov 2002 A1
20020178447 Plotnick et al. Nov 2002 A1
20020188628 Cooper et al. Dec 2002 A1
20020191851 Keinan Dec 2002 A1
20020194592 Tsuchida et al. Dec 2002 A1
20020196746 Allen Dec 2002 A1
20030018796 Chou et al. Jan 2003 A1
20030020671 Santoro et al. Jan 2003 A1
20030027517 Callway et al. Feb 2003 A1
20030035486 Kato et al. Feb 2003 A1
20030038893 Rajamaki et al. Feb 2003 A1
20030039398 McIntyre Feb 2003 A1
20030046690 Miller Mar 2003 A1
20030051253 Barone, Jr. Mar 2003 A1
20030058941 Chen et al. Mar 2003 A1
20030061451 Beyda Mar 2003 A1
20030065739 Shnier Apr 2003 A1
20030071792 Safadi Apr 2003 A1
20030072372 Shen et al. Apr 2003 A1
20030076546 Johnson et al. Apr 2003 A1
20030088328 Nishio et al. May 2003 A1
20030088400 Nishio et al. May 2003 A1
20030095790 Joshi May 2003 A1
20030107443 Yamamoto Jun 2003 A1
20030122836 Doyle et al. Jul 2003 A1
20030123664 Pedlow, Jr. et al. Jul 2003 A1
20030126608 Safadi Jul 2003 A1
20030126611 Chernock et al. Jul 2003 A1
20030131349 Kuczynski-Brown Jul 2003 A1
20030135860 Dureau Jul 2003 A1
20030169373 Peters et al. Sep 2003 A1
20030177199 Zenoni Sep 2003 A1
20030188309 Yuen Oct 2003 A1
20030189980 Dvir et al. Oct 2003 A1
20030196174 Pierre Cote et al. Oct 2003 A1
20030208768 Urdang et al. Nov 2003 A1
20030229719 Iwata et al. Dec 2003 A1
20030229900 Reisman Dec 2003 A1
20030231218 Amadio Dec 2003 A1
20040016000 Zhang et al. Jan 2004 A1
20040034873 Zenoni Feb 2004 A1
20040040035 Carlucci et al. Feb 2004 A1
20040055007 Allport Mar 2004 A1
20040078822 Breen et al. Apr 2004 A1
20040088375 Sethi et al. May 2004 A1
20040091171 Bone May 2004 A1
20040111526 Baldwin et al. Jun 2004 A1
20040117827 Karaoguz et al. Jun 2004 A1
20040128686 Boyer et al. Jul 2004 A1
20040133704 Krzyzanowski et al. Jul 2004 A1
20040136698 Mock Jul 2004 A1
20040139158 Datta Jul 2004 A1
20040157662 Tsuchiya Aug 2004 A1
20040163101 Swix et al. Aug 2004 A1
20040184542 Fujimoto Sep 2004 A1
20040193648 Lai et al. Sep 2004 A1
20040210824 Shoff et al. Oct 2004 A1
20040261106 Hoffman Dec 2004 A1
20040261114 Addington et al. Dec 2004 A1
20040268419 Danker et al. Dec 2004 A1
20050015259 Thumpudi et al. Jan 2005 A1
20050015816 Christofalo et al. Jan 2005 A1
20050021830 Urzaiz et al. Jan 2005 A1
20050034155 Gordon et al. Feb 2005 A1
20050034162 White et al. Feb 2005 A1
20050044575 Der Kuyl Feb 2005 A1
20050055685 Maynard et al. Mar 2005 A1
20050055721 Zigmond et al. Mar 2005 A1
20050071876 van Beek Mar 2005 A1
20050076134 Bialik et al. Apr 2005 A1
20050089091 Kim et al. Apr 2005 A1
20050091690 Delpuch et al. Apr 2005 A1
20050091695 Paz et al. Apr 2005 A1
20050114906 Hoarty et al. May 2005 A1
20050132305 Guichard et al. Jun 2005 A1
20050135385 Jenkins et al. Jun 2005 A1
20050141613 Kelly et al. Jun 2005 A1
20050149988 Grannan Jul 2005 A1
20050155063 Bayrakeri Jul 2005 A1
20050160088 Scallan et al. Jul 2005 A1
20050166257 Feinleib et al. Jul 2005 A1
20050180502 Puri Aug 2005 A1
20050198682 Wright Sep 2005 A1
20050213586 Cyganski et al. Sep 2005 A1
20050216933 Black Sep 2005 A1
20050216940 Black Sep 2005 A1
20050226426 Oomen et al. Oct 2005 A1
20050273832 Zigmond et al. Dec 2005 A1
20060001737 Dawson et al. Jan 2006 A1
20060020960 Relan et al. Jan 2006 A1
20060020994 Crane et al. Jan 2006 A1
20060031906 Kaneda Feb 2006 A1
20060039481 Shen et al. Feb 2006 A1
20060041910 Hatanaka et al. Feb 2006 A1
20060088105 Shen et al. Apr 2006 A1
20060095944 Demircin et al. May 2006 A1
20060112338 Joung et al. May 2006 A1
20060117340 Pavlovskaia et al. Jun 2006 A1
20060143678 Chou et al. Jun 2006 A1
20060161538 Kiilerich Jul 2006 A1
20060173985 Moore Aug 2006 A1
20060174026 Robinson et al. Aug 2006 A1
20060174289 Theberge Aug 2006 A1
20060195884 van Zoest et al. Aug 2006 A1
20060203913 Kim et al. Sep 2006 A1
20060212203 Furuno Sep 2006 A1
20060218601 Michel Sep 2006 A1
20060230428 Craig et al. Oct 2006 A1
20060242570 Croft et al. Oct 2006 A1
20060256865 Westerman Nov 2006 A1
20060269086 Page et al. Nov 2006 A1
20060271985 Hoffman et al. Nov 2006 A1
20060285586 Westerman Dec 2006 A1
20060285819 Kelly et al. Dec 2006 A1
20070009035 Craig et al. Jan 2007 A1
20070009036 Craig et al. Jan 2007 A1
20070009042 Craig et al. Jan 2007 A1
20070025639 Zhou et al. Feb 2007 A1
20070033528 Merril et al. Feb 2007 A1
20070033631 Gordon et al. Feb 2007 A1
20070074251 Oguz et al. Mar 2007 A1
20070079325 de Heer Apr 2007 A1
20070115941 Patel et al. May 2007 A1
20070124795 McKissick et al. May 2007 A1
20070130446 Minakami Jun 2007 A1
20070130592 Haeusel Jun 2007 A1
20070152984 Ording et al. Jul 2007 A1
20070162953 Bolliger et al. Jul 2007 A1
20070172061 Pinder Jul 2007 A1
20070178243 Dong et al. Aug 2007 A1
20070234220 Khan et al. Oct 2007 A1
20070237232 Chang et al. Oct 2007 A1
20070300280 Turner et al. Dec 2007 A1
20080046928 Poling et al. Feb 2008 A1
20080052742 Kopf et al. Feb 2008 A1
20080066135 Brodersen et al. Mar 2008 A1
20080084503 Kondo Apr 2008 A1
20080086688 Chandratillake et al. Apr 2008 A1
20080094368 Ording et al. Apr 2008 A1
20080097953 Levy et al. Apr 2008 A1
20080098450 Wu et al. Apr 2008 A1
20080104520 Swenson et al. May 2008 A1
20080127255 Ress et al. May 2008 A1
20080154583 Goto et al. Jun 2008 A1
20080163059 Craner Jul 2008 A1
20080163286 Rudolph et al. Jul 2008 A1
20080170619 Landau Jul 2008 A1
20080170622 Gordon et al. Jul 2008 A1
20080178125 Elsbree et al. Jul 2008 A1
20080178243 Dong et al. Jul 2008 A1
20080178249 Gordon et al. Jul 2008 A1
20080181221 Kampmann et al. Jul 2008 A1
20080184120 O'Brien-Strain et al. Jul 2008 A1
20080189740 Carpenter et al. Aug 2008 A1
20080195573 Onoda et al. Aug 2008 A1
20080201736 Gordon et al. Aug 2008 A1
20080212942 Gordon et al. Sep 2008 A1
20080222199 Tiu et al. Sep 2008 A1
20080232452 Sullivan et al. Sep 2008 A1
20080243918 Holtman Oct 2008 A1
20080243998 Oh et al. Oct 2008 A1
20080246759 Summers Oct 2008 A1
20080253440 Srinivasan et al. Oct 2008 A1
20080271080 Gossweiler et al. Oct 2008 A1
20090003446 Wu et al. Jan 2009 A1
20090003705 Zou et al. Jan 2009 A1
20090007199 La Joie Jan 2009 A1
20090025027 Craner Jan 2009 A1
20090031341 Schlack et al. Jan 2009 A1
20090041118 Pavlovskaia et al. Feb 2009 A1
20090083781 Yang et al. Mar 2009 A1
20090083813 Dolce et al. Mar 2009 A1
20090083824 McCarthy et al. Mar 2009 A1
20090089188 Ku et al. Apr 2009 A1
20090094113 Berry et al. Apr 2009 A1
20090094646 Walter et al. Apr 2009 A1
20090100465 Kulakowski Apr 2009 A1
20090100489 Strothmann Apr 2009 A1
20090106269 Zuckerman et al. Apr 2009 A1
20090106386 Zuckerman et al. Apr 2009 A1
20090106392 Zuckerman et al. Apr 2009 A1
20090106425 Zuckerman et al. Apr 2009 A1
20090106441 Zuckerman et al. Apr 2009 A1
20090106451 Zuckerman et al. Apr 2009 A1
20090106511 Zuckerman et al. Apr 2009 A1
20090113009 Slemmer et al. Apr 2009 A1
20090132942 Santoro et al. May 2009 A1
20090138966 Krause et al. May 2009 A1
20090144781 Glaser et al. Jun 2009 A1
20090146779 Kumar et al. Jun 2009 A1
20090157868 Chaudhry Jun 2009 A1
20090158369 Van Vleck et al. Jun 2009 A1
20090160694 Di Flora Jun 2009 A1
20090172757 Aldrey et al. Jul 2009 A1
20090178098 Westbrook et al. Jul 2009 A1
20090183219 Maynard et al. Jul 2009 A1
20090189890 Corbett et al. Jul 2009 A1
20090193452 Russ et al. Jul 2009 A1
20090196346 Zhang et al. Aug 2009 A1
20090210899 Lawrence-Apfelbaum et al. Aug 2009 A1
20090225790 Shay et al. Sep 2009 A1
20090228620 Thomas et al. Sep 2009 A1
20090228922 Haj-Khalil et al. Sep 2009 A1
20090233593 Ergen et al. Sep 2009 A1
20090251478 Maillot et al. Oct 2009 A1
20090254960 Yarom et al. Oct 2009 A1
20090265617 Randall et al. Oct 2009 A1
20090271512 Jorgensen Oct 2009 A1
20090271818 Schlack Oct 2009 A1
20090298535 Klein et al. Dec 2009 A1
20090313674 Ludvig et al. Dec 2009 A1
20090328109 Pavlovskaia et al. Dec 2009 A1
20100033638 O'Donnell et al. Feb 2010 A1
20100035682 Gentile et al. Feb 2010 A1
20100058404 Rouse Mar 2010 A1
20100067571 White et al. Mar 2010 A1
20100077441 Thomas et al. Mar 2010 A1
20100104021 Schmit Apr 2010 A1
20100115573 Srinivasan et al. May 2010 A1
20100118972 Zhang et al. May 2010 A1
20100131996 Gauld May 2010 A1
20100146139 Brockmann Jun 2010 A1
20100158109 Dahlby et al. Jun 2010 A1
20100161825 Ronca et al. Jun 2010 A1
20100166071 Wu et al. Jul 2010 A1
20100174776 Westberg et al. Jul 2010 A1
20100175080 Yuen et al. Jul 2010 A1
20100180307 Hayes et al. Jul 2010 A1
20100226428 Thevathasan et al. Sep 2010 A1
20100235861 Schein et al. Sep 2010 A1
20100242073 Gordon et al. Sep 2010 A1
20100254370 Jana et al. Oct 2010 A1
20100265344 Velarde et al. Oct 2010 A1
20100325655 Perez Dec 2010 A1
20100325668 Young et al. Dec 2010 A1
20110002376 Ahmed et al. Jan 2011 A1
20110002470 Purnhagen et al. Jan 2011 A1
20110023069 Dowens Jan 2011 A1
20110035227 Lee et al. Feb 2011 A1
20110067061 Karaoguz et al. Mar 2011 A1
20110096828 Chen et al. Apr 2011 A1
20110107375 Stahl et al. May 2011 A1
20110110642 Salomons et al. May 2011 A1
20110153776 Opala et al. Jun 2011 A1
20110167468 Lee et al. Jul 2011 A1
20110191684 Greenberg Aug 2011 A1
20110231878 Hunter et al. Sep 2011 A1
20110243024 Sterling et al. Oct 2011 A1
20110258584 Williams et al. Oct 2011 A1
20110289536 Poder et al. Nov 2011 A1
20110296312 Boyer et al. Dec 2011 A1
20110317982 Xu et al. Dec 2011 A1
20120137337 Sigmon et al. May 2012 A1
20120204217 Regis et al. Aug 2012 A1
20120224641 Haberman et al. Sep 2012 A1
20120257671 Brockmann et al. Oct 2012 A1
20130003826 Craig et al. Jan 2013 A1
20130071095 Chauvier et al. Mar 2013 A1
20130086610 Brockmann Apr 2013 A1
20130179787 Brockmann et al. Jul 2013 A1
20130198776 Brockmann Aug 2013 A1
20130254308 Rose et al. Sep 2013 A1
20130272394 Brockmann et al. Oct 2013 A1
20130304818 Brumleve et al. Nov 2013 A1
20140081954 Elizarov Mar 2014 A1
20140267074 Balci Sep 2014 A1
Foreign Referenced Citations (322)
Number Date Country
191599 Apr 2000 AT
198969 Feb 2001 AT
250313 Oct 2003 AT
472152 Jul 2010 AT
475266 Aug 2010 AT
550086 Feb 1986 AU
199060189 Nov 1990 AU
620735 Feb 1992 AU
199184838 Apr 1992 AU
643828 Nov 1993 AU
2004253127 Jan 2005 AU
2005278122 Mar 2006 AU
2010339376 Aug 2012 AU
2011249132 Nov 2012 AU
2011258972 Nov 2012 AU
2011315950 May 2013 AU
682776 Mar 1964 CA
2052477 Mar 1992 CA
1302554 Jun 1992 CA
2163500 May 1996 CA
2231391 May 1997 CA
2273365 Jun 1998 CA
2313133 Jun 1999 CA
2313161 Jun 1999 CA
2528499 Jan 2005 CA
2569407 Mar 2006 CA
2728797 Apr 2010 CA
2787913 Jul 2011 CA
2798541 Dec 2011 CA
2814070 Apr 2012 CA
1507751 Jun 2004 CN
1969555 May 2007 CN
101180109 May 2008 CN
101627424 Jan 2010 CN
101637023 Jan 2010 CN
102007773 Apr 2011 CN
103647980 Mar 2014 CN
4408355 Oct 1994 DE
69516139 D1 Dec 2000 DE
69132518 D1 Sep 2001 DE
69333207 D1 Jul 2004 DE
98961961 Aug 2007 DE
602008001596 D1 Aug 2010 DE
602006015650 D1 Sep 2010 DE
0093549 Nov 1983 EP
0128771 Dec 1984 EP
0419137 Mar 1991 EP
0449633 Oct 1991 EP
0477786 Apr 1992 EP
0523618 Jan 1993 EP
0534139 Mar 1993 EP
0568453 Nov 1993 EP
0588653 Mar 1994 EP
0594350 Apr 1994 EP
0612916 Aug 1994 EP
0624039 Nov 1994 EP
0638219 Feb 1995 EP
0643523 Mar 1995 EP
0661888 Jul 1995 EP
0714684 Jun 1996 EP
0746158 Dec 1996 EP
0761066 Mar 1997 EP
0789972 Aug 1997 EP
0830786 Mar 1998 EP
0861560 Sep 1998 EP
0 881 808 Dec 1998 EP
0933966 Aug 1999 EP
1026872 Aug 2000 EP
1038397 Sep 2000 EP
1038399 Sep 2000 EP
1038400 Sep 2000 EP
1038401 Sep 2000 EP
1051039 Nov 2000 EP
1055331 Nov 2000 EP
1120968 Aug 2001 EP
1345446 Sep 2003 EP
1422929 May 2004 EP
1428562 Jun 2004 EP
1521476 Apr 2005 EP
1645115 Apr 2006 EP
1 725 044 Nov 2006 EP
1767708 Mar 2007 EP
1771003 Apr 2007 EP
1772014 Apr 2007 EP
1877150 Jan 2008 EP
1887148 Feb 2008 EP
1900200 Mar 2008 EP
1902583 Mar 2008 EP
1908293 Apr 2008 EP
1911288 Apr 2008 EP
1918802 May 2008 EP
2100296 Sep 2009 EP
2105019 Sep 2009 EP
2106665 Oct 2009 EP
2116051 Nov 2009 EP
2124440 Nov 2009 EP
2248341 Nov 2010 EP
2269377 Jan 2011 EP
2271098 Jan 2011 EP
2304953 Apr 2011 EP
2364019 Sep 2011 EP
2384001 Nov 2011 EP
2409493 Jan 2012 EP
2477414 Jul 2012 EP
2487919 Aug 2012 EP
2520090 Nov 2012 EP
2567545 Mar 2013 EP
2577437 Apr 2013 EP
2628306 Aug 2013 EP
2632164 Aug 2013 EP
2632165 Aug 2013 EP
2695388 Feb 2014 EP
2207635 Jun 2004 ES
8211463 Jun 1982 FR
2529739 Jan 1984 FR
2891098 Mar 2007 FR
2207838 Feb 1989 GB
2248955 Apr 1992 GB
2290204 Dec 1995 GB
2365649 Feb 2002 GB
2378345 Feb 2003 GB
1134855 Oct 2010 HK
1116323 Dec 2010 HK
19913397 Apr 1992 IE
99586 Feb 1998 IL
215133 Dec 2011 IL
222829 Dec 2012 IL
222830 Dec 2012 IL
225525 Jun 2013 IL
180215 Jan 1998 IN
200701744 Nov 2007 IN
200900856 May 2009 IN
200800214 Jun 2009 IN
3759 Mar 1992 IS
60-054324 Mar 1985 JP
63-033988 Feb 1988 JP
63-263985 Oct 1988 JP
2001-241993 Sep 1989 JP
04-373286 Dec 1992 JP
06-054324 Feb 1994 JP
7015720 Jan 1995 JP
7-160292 Jun 1995 JP
8-265704 Oct 1996 JP
10-228437 Aug 1998 JP
10-510131 Sep 1998 JP
11-134273 May 1999 JP
H11-261966 Sep 1999 JP
2000-152234 May 2000 JP
2001-203995 Jul 2001 JP
2001-245271 Sep 2001 JP
2001-245291 Sep 2001 JP
2001-514471 Sep 2001 JP
2002-016920 Jan 2002 JP
2002-057952 Feb 2002 JP
2002-112220 Apr 2002 JP
2002-141810 May 2002 JP
2002-208027 Jul 2002 JP
2002-319991 Oct 2002 JP
2003-506763 Feb 2003 JP
2003-087785 Mar 2003 JP
2003-529234 Sep 2003 JP
2004-501445 Jan 2004 JP
2004-056777 Feb 2004 JP
2004-110850 Apr 2004 JP
2004-112441 Apr 2004 JP
2004-135932 May 2004 JP
2004-264812 Sep 2004 JP
2004-312283 Nov 2004 JP
2004-533736 Nov 2004 JP
2004-536381 Dec 2004 JP
2004-536681 Dec 2004 JP
2005-033741 Feb 2005 JP
2005-084987 Mar 2005 JP
2005-095599 Mar 2005 JP
8-095599 Apr 2005 JP
2005-156996 Jun 2005 JP
2005-519382 Jun 2005 JP
2005-523479 Aug 2005 JP
2005-309752 Nov 2005 JP
2006-067280 Mar 2006 JP
2006-512838 Apr 2006 JP
2007-129296 May 2007 JP
2007-522727 Aug 2007 JP
11-88419 Sep 2007 JP
2008-523880 Jul 2008 JP
2008-535622 Sep 2008 JP
04252727 Apr 2009 JP
2009-543386 Dec 2009 JP
2011-108155 Jun 2011 JP
2012-080593 Apr 2012 JP
04996603 Aug 2012 JP
05121711 Jan 2013 JP
53-004612 Oct 2013 JP
05331008 Oct 2013 JP
05405819 Feb 2014 JP
10-2005-0001362 Jan 2005 KR
10-2005-0085827 Aug 2005 KR
2006067924 Jun 2006 KR
10-2006-0095821 Sep 2006 KR
2007038111 Apr 2007 KR
20080001298 Jan 2008 KR
2008024189 Mar 2008 KR
2010111739 Oct 2010 KR
2010120187 Nov 2010 KR
2010127240 Dec 2010 KR
2011030640 Mar 2011 KR
2011129477 Dec 2011 KR
20120112683 Oct 2012 KR
2013061149 Jun 2013 KR
2013113925 Oct 2013 KR
1333200 Nov 2013 KR
2008045154 Nov 2013 KR
2013138263 Dec 2013 KR
1032594 Apr 2008 NL
1033929 Apr 2008 NL
2004670 Nov 2011 NL
2004780 Jan 2012 NL
239969 Dec 1994 NZ
99110 Dec 1993 PT
WO 8202303 Jul 1982 WO
WO 8908967 Sep 1989 WO
WO 9013972 Nov 1990 WO
WO 9322877 Nov 1993 WO
WO 9416534 Jul 1994 WO
WO 9419910 Sep 1994 WO
WO 9421079 Sep 1994 WO
WO 9515658 Jun 1995 WO
WO 9532587 Nov 1995 WO
WO 9533342 Dec 1995 WO
WO 9614712 May 1996 WO
WO 9627843 Sep 1996 WO
WO 9631826 Oct 1996 WO
WO 9637074 Nov 1996 WO
WO 9642168 Dec 1996 WO
WO 9716925 May 1997 WO
WO 9733434 Sep 1997 WO
WO 9739583 Oct 1997 WO
WO 9826595 Jun 1998 WO
WO 9900735 Jan 1999 WO
WO 9904568 Jan 1999 WO
WO 9930496 Jun 1999 WO
WO 9930497 Jun 1999 WO
WO 9930500 Jun 1999 WO
WO 9930501 Jun 1999 WO
WO 9935840 Jul 1999 WO
WO 9941911 Aug 1999 WO
WO 9956468 Nov 1999 WO
WO 9965232 Dec 1999 WO
WO 9965243 Dec 1999 WO
WO 9966732 Dec 1999 WO
WO 0002303 Jan 2000 WO
WO 0007372 Feb 2000 WO
WO 0008967 Feb 2000 WO
WO 0019910 Apr 2000 WO
WO 0038430 Jun 2000 WO
WO 0041397 Jul 2000 WO
WO 0139494 May 2001 WO
WO 0141447 Jun 2001 WO
WO 0182614 Nov 2001 WO
WO 0192973 Dec 2001 WO
WO 02089487 Jul 2002 WO
WO 02076097 Sep 2002 WO
WO 02076099 Sep 2002 WO
WO 03026232 Mar 2003 WO
WO 03026275 Mar 2003 WO
WO 03047710 Jun 2003 WO
WO 03065683 Aug 2003 WO
WO 03071727 Aug 2003 WO
WO 03091832 Nov 2003 WO
WO 2004012437 Feb 2004 WO
WO 2004018060 Mar 2004 WO
WO 2004057609 Jul 2004 WO
WO 2004073310 Aug 2004 WO
WO 2005002215 Jan 2005 WO
WO 2005041122 May 2005 WO
WO 2005053301 Jun 2005 WO
WO 2005076575 Aug 2005 WO
WO 2005120067 Dec 2005 WO
WO 2006014362 Feb 2006 WO
WO 2006022881 Mar 2006 WO
WO 2006053305 May 2006 WO
WO 2006067697 Jun 2006 WO
WO 2006081634 Aug 2006 WO
WO 2006105480 Oct 2006 WO
WO 2006110268 Oct 2006 WO
WO 2007001797 Jan 2007 WO
WO 2007008319 Jan 2007 WO
WO 2007008355 Jan 2007 WO
WO 2007008356 Jan 2007 WO
WO 2007008357 Jan 2007 WO
WO 2007008358 Jan 2007 WO
WO 2007018722 Feb 2007 WO
WO 2007018726 Feb 2007 WO
WO 2008044916 Apr 2008 WO
WO 2008086170 Jul 2008 WO
WO 2008088741 Jul 2008 WO
WO 2008088752 Jul 2008 WO
WO 2008088772 Jul 2008 WO
WO 2008100205 Aug 2008 WO
WO 2009038596 Mar 2009 WO
WO 2009099893 Aug 2009 WO
WO 2009099895 Aug 2009 WO
WO 2009105465 Aug 2009 WO
WO 2009110897 Sep 2009 WO
WO 2009114247 Sep 2009 WO
WO 2009155214 Dec 2009 WO
WO 2010044926 Apr 2010 WO
WO 2010054136 May 2010 WO
WO 2010107954 Sep 2010 WO
WO 2011014336 Sep 2010 WO
WO 2011082364 Jul 2011 WO
WO 2011139155 Nov 2011 WO
WO 2011149357 Dec 2011 WO
WO 2012051528 Apr 2012 WO
WO 2012138660 Oct 2012 WO
WO 2013106390 Jul 2013 WO
WO 2013155310 Jul 2013 WO
WO 2013184604 Dec 2013 WO
Non-Patent Literature Citations (309)
Authorized Officer Jürgen Güttlich, International Search Report and Written Opinion, dated Jan. 12, 2007, PCT/US2008/000400.
Authorized Officer Jürgen Güttlich, International Search Report and Written Opinion, dated Jan. 12, 2007, PCT/US2008/000450.
Hoarty, W. L., “The Smart Headend—A Novel Approach to Interactive Television”, Montreux Int'l TV Symposium, Jun. 9, 1995.
Rob Koenen, “MPEG-4 Overview—Overview of the MPEG-4 Standard” Internet Citation, Mar. 2001.
MSDL, “MSDL Specification Version 1.1” Joint Video Team of ISO/IEC MPEG & ITU-T VCEG (ISO/IEC JTC1/SC29/WG11 and ITU-T SG16 Q6). No. N1246, Mar. 1996, pp. 1-99.
Stoll, G., et al., “GMF4ITV: Neue Wege Zur Interaktivität Mit Bewegten Objekten Beim Digitalen Fernsehen,” FKT Fernseh Und Kinotechnik, Fachverlag Schiele & Schön GmbH, vol. 60, No. 4, Jan. 1, 2006, pp. 172-178.
Avaro, O., et al., “MPEG-4 Systems: Overview,” Signal Processing, vol. 15, Jan. 1, 2000, pp. 281-298.
Karin Exner, International Search Report, PCT/US2008/000450, Jan. 26, 2009, 9 pages.
AC-3 digital audio compression standard, Extract, Dec. 20, 1995, 11 pgs.
ActiveVideo Networks BV, International Preliminary Report on Patentability, PCT/NL2011/050308, Sep. 6, 2011, 8 pgs.
ActiveVideo Networks BV, International Search Report and Written Opinion, PCT/NL2011/050308, Sep. 6, 2011, 8 pgs.
Activevideo Networks Inc., International Preliminary Report on Patentability, PCT/US2011/056355, Apr. 16, 2013, 4 pgs.
ActiveVideo Networks Inc., International Preliminary Report on Patentability, PCT/US2012/032010, Oct. 8, 2013, 4 pgs.
ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2011/056355, Apr. 13, 2012, 6 pgs.
ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2012/032010, Oct. 10, 2012, 6 pgs.
ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2013/020769, May 9, 2013, 9 pgs.
ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2013/036182, Jul. 29, 2013, 12 pgs.
ActiveVideo Networks, Inc., International Search Report and Written Opinion, PCT/US2009/032457, Jul. 22, 2009, 7 pgs.
ActiveVideo Networks Inc., Korean Intellectual Property Office, International Search Report; PCT/US2009/032457, Jul. 22, 2009, 7 pgs.
Annex C—Video buffering verifier, information technology—generic coding of moving pictures and associated audio information: video, Feb. 2000, 6 pgs.
Antonoff, Michael, “Interactive Television,” Popular Science, Nov. 1992, 12 pages.
Avinity Systems B.V., Extended European Search Report, Application No. 12163713.6, 10 pgs.
Avinity Systems B.V., Extended European Search Report, Application No. 12163712.8, 10 pgs.
Benjelloun, A summation algorithm for MPEG-1 coded audio signals: a first step towards audio processed domain, 2000, 9 pgs.
Broadhead, Direct manipulation of MPEG compressed digital audio, Nov. 5-9, 1995, 41 pgs.
Cable Television Laboratories, Inc., “CableLabs Asset Distribution Interface Specification, Version 1.1”, May 5, 2006, 33 pgs.
CD 11172-3, Coding of moving pictures and associated audio for digital storage media at up to about 1.5 Mbit/s, Jan. 1, 1992, 39 pgs.
Craig, Notice of Allowance, U.S. Appl. No. 11/178,176, Dec. 23, 2010, 8 pgs.
Craig, Notice of Allowance, U.S. Appl. No. 11/178,183, Jan. 12, 2012, 7 pgs.
Craig, Notice of Allowance, U.S. Appl. No. 11/178,183, Jul. 19, 2012, 8 pgs.
Craig, Notice of Allowance, U.S. Appl. No. 11/178,189, Oct. 12, 2011, 7 pgs.
Craig, Notice of Allowance, U.S. Appl. No. 11/178,176, Mar. 23, 2011, 8 pgs.
Craig, Notice of Allowance, U.S. Appl. No. 13/609,183, Aug. 26, 2013, 8 pgs.
Craig, Final Office Action, U.S. Appl. No. 11/103,838, Feb. 5, 2009, 30 pgs.
Craig, Final Office Action, U.S. Appl. No. 11/178,181, Aug. 25, 2010, 17 pgs.
Craig, Final Office Action, U.S. Appl. No. 11/103,838, Jul. 6, 2010, 35 pgs.
Craig, Final Office Action, U.S. Appl. No. 11/178,176, Oct. 10, 2010, 8 pgs.
Craig, Final Office Action, U.S. Appl. No. 11/178,183, Apr. 13, 2011, 16 pgs.
Craig, Final Office Action, U.S. Appl. No. 11/178,177, Oct. 26, 2010, 12 pgs.
Craig, Final Office Action, U.S. Appl. No. 11/178,181, Jun. 20, 2011, 21 pgs.
Craig, Office Action, U.S. Appl. No. 11/103,838, May 12, 2009, 32 pgs.
Craig, Office Action, U.S. Appl. No. 11/103,838, Aug. 19, 2008, 17 pgs.
Craig, Office Action, U.S. Appl. No. 11/103,838, Nov. 19, 2009, 34 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,176, May 6, 2010, 7 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,177, Mar. 29, 2011, 15 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,177, Aug. 3, 2011, 26 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,177, Mar. 29, 2010, 11 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,181, Feb. 11, 2011, 19 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,181, Mar. 29, 2010, 10 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,182, Feb. 23, 2010, 15 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,183, Dec. 6, 2010, 12 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,183, Sep. 15, 2011, 12 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,183, Feb. 19, 2010, 17 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,183, Jul. 20, 2010, 13 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,189, Nov. 9, 2010, 13 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,189, Mar. 15, 2010, 11 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,189, Jul. 23, 2009, 10 pgs.
Craig, Office Action, U.S. Appl. No. 11/178,189, May 26, 2011, 14 pgs.
Craig, Office Action, U.S. Appl. No. 13/609,183, May 9, 2013, 7 pgs.
Pavlovskaia, Office Action, JP 2011-516499, Feb. 14, 2014, 19 pgs.
Digital Audio Compression Standard(AC-3, E-AC-3), Advanced Television Systems Committee, Jun. 14, 2005, 236 pgs.
European Patent Office, Extended European Search Report for International Application No. PCT/US2010/027724, dated Jul. 24, 2012, 11 pages.
FFMPEG, http://www.ffmpeg.org, downloaded Apr. 8, 2010, 8 pgs.
FFMPEG-0.4.9 Audio Layer 2 Tables Including Fixed Psycho Acoustic Model, 2001, 2 pgs.
Herr, Notice of Allowance, U.S. Appl. No. 11/620,593, May 23, 2012, 5 pgs.
Herr, Notice of Allowance, U.S. Appl. No. 12/534,016, Feb. 7, 2012, 5 pgs.
Herr, Notice of Allowance, U.S. Appl. No. 12/534,016, Sep. 28, 2011, 15 pgs.
Herr, Final Office Action, U.S. Appl. No. 11/620,593, Sep. 15, 2011, 104 pgs.
Herr, Office Action, U.S. Appl. No. 11/620,593, Mar. 19, 2010, 58 pgs.
Herr, Office Action, U.S. Appl. No. 11/620,593, Apr. 21, 2009, 27 pgs.
Herr, Office Action, U.S. Appl. No. 11/620,593, Dec. 23, 2009, 58 pgs.
Herr, Office Action, U.S. Appl. No. 11/620,593, Jan. 24, 2011, 96 pgs.
Herr, Office Action, U.S. Appl. No. 11/620,593, Aug. 27, 2010, 41 pgs.
Herre, Thoughts on an SAOC Architecture, Oct. 2006, 9 pgs.
ICTV, Inc., International Preliminary Report on Patentability, PCT/US2006/022585, Jan. 29, 2008, 9 pgs.
ICTV, Inc., International Search Report / Written Opinion, PCT/US2006/022585, Oct. 12, 2007, 15 pgs.
ICTV, Inc., International Search Report / Written Opinion, PCT/US2008/000419, May 15, 2009, 20 pgs.
ICTV, Inc., International Search Report / Written Opinion, PCT/US2006/022533, Nov. 20, 2006, 8 pgs.
Isovic, Timing constraints of MPEG-2 decoding for high quality video: misconceptions and realistic assumptions, Jul. 2-4, 2003, 10 pgs.
MPEG-2 Video elementary stream supplemental information, Dec. 1999, 12 pgs.
Ozer, Video Compositing 101, available from http://www.emedialive.com, Jun. 2, 2004, 5 pgs.
Porter, Compositing Digital Images, 18 Computer Graphics (No. 3), Jul. 1984, pp. 253-259.
RSS Advisory Board, “RSS 2.0 Specification”, published Oct. 15, 2007.
SAOC use cases, draft requirements and architecture, Oct. 2006, 16 pgs.
Sigmon, Final Office Action, U.S. Appl. No. 11/258,602, Feb. 23, 2009, 15 pgs.
Sigmon, Office Action, U.S. Appl. No. 11/258,602, Sep. 2, 2008, 12 pgs.
TAG Networks, Inc., Communication pursuant to Article 94(3) EPC, European Patent Application, 06773714.8, May 6, 2009, 3 pgs.
TAG Networks Inc., Decision to Grant a Patent, JP 2009-544985, Jun. 28, 2013, 1 pg.
TAG Networks Inc., IPRP, PCT/US2006/010080, Oct. 16, 2007, 6 pgs.
TAG Networks Inc., IPRP, PCT/US2006/024194, Jan. 10, 2008, 7 pgs.
TAG Networks Inc., IPRP, PCT/US2006/024195, Apr. 1, 2009, 11 pgs.
TAG Networks Inc., IPRP, PCT/US2006/024196, Jan. 10, 2008, 6 pgs.
TAG Networks Inc., International Search Report, PCT/US2008/050221, Jun. 12, 2008, 9 pgs.
TAG Networks Inc., Office Action, CN 200680017662.3, Apr. 26, 2010, 4 pgs.
TAG Networks Inc., Office Action, EP 06739032.8, Aug. 14, 2009, 4 pgs.
TAG Networks Inc., Office Action, EP 06773714.8, May 6, 2009, 3 pgs.
TAG Networks Inc., Office Action, EP 06773714.8, Jan. 12, 2010, 4 pgs.
TAG Networks Inc., Office Action, JP 2008-506474, Oct. 10, 2012, 5 pgs.
TAG Networks Inc., Office Action, JP 2008-506474, Aug. 8, 2011, 5 pgs.
TAG Networks Inc., Office Action, JP 2008-520254, Oct. 20, 2011, 2 pgs.
TAG Networks, IPRP, PCT/US2008/050221, Jul. 7, 2009, 6 pgs.
TAG Networks, International Search Report, PCT/US2010/041133, Oct. 19, 2010, 13 pgs.
TAG Networks, Office Action, CN 200880001325.4, Jun. 22, 2011, 4 pgs.
TAG Networks, Office Action, JP 2009-544985, Feb. 25, 2013, 3 pgs.
Talley, A general framework for continuous media transmission control, Oct. 13-16, 1997, 10 pgs.
The Toolame Project, Psych-nl.c, 1999, 1 pg.
Todd, AC-3: flexible perceptual coding for audio transmission and storage, Feb. 26-Mar. 1, 1994, 16 pgs.
Tudor, MPEG-2 Video Compression, Dec. 1995, 15 pgs.
TVHEAD, Inc., First Examination Report, IN 1744/MUMNP/2007, Dec. 30, 2013, 6 pgs.
TVHEAD, Inc., International Search Report, PCT/US2006/010080, Jun. 20, 2006, 3 pgs.
TVHEAD, Inc., International Search Report, PCT/US2006/024194, Dec. 15, 2006, 4 pgs.
TVHEAD, Inc., International Search Report, PCT/US2006/024195, Nov. 29, 2006, 9 pgs.
TVHEAD, Inc., International Search Report, PCT/US2006/024196, Dec. 11, 2006, 4 pgs.
TVHEAD, Inc., International Search Report, PCT/US2006/024197, Nov. 28, 2006, 9 pgs.
Vernon, Dolby digital: audio coding for digital television and storage applications, Aug. 1999, 18 pgs.
Wang, A beat-pattern based error concealment scheme for music delivery with burst packet loss, Aug. 22-25, 2001, 4 pgs.
Wang, A compressed domain beat detector using MP3 audio bitstream, Sep. 30-Oct. 5, 2001, 9 pgs.
Wang, A multichannel audio coding algorithm for inter-channel redundancy removal, May 12-15, 2001, 6 pgs.
Wang, An excitation level based psychoacoustic model for audio compression, Oct. 30-Nov. 4, 1999, 4 pgs.
Wang, Energy compaction property of the MDCT in comparison with other transforms, Sep. 22-25, 2000, 23 pgs.
Wang, Exploiting excess masking for audio compression, Sep. 2-5, 1999, 4 pgs.
Wang, schemes for re-compressing mp3 audio bitstreams, Nov. 30-Dec. 3, 2001, 5 pgs.
Wang, Selected advances in audio compression and compressed domain processing, Aug. 2001, 68 pgs.
Wang, The impact of the relationship between MDCT and DFT on audio compression, Dec. 13-15, 2000, 9 pgs.
ActiveVideo, http://www.activevideo.com/, as printed out in year 2012, 1 pg.
ActiveVideo Networks Inc., International Preliminary Report on Patentability, PCT/US2013/020769, Jul. 24, 2014, 6 pgs.
ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2014/030773, Jul. 25, 2014, 8 pgs.
ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2014/041416, Aug. 27, 2014, 8 pgs.
ActiveVideo Networks Inc., Communication Pursuant to Rules 70(2) and 70a(2), EP10841764.3, Jun. 22, 2011, 1 pg.
ActiveVideo Networks Inc., Communication Pursuant to Article 94(3) EPC, EP08713106.6, Jun. 26, 2014, 5 pgs.
ActiveVideo Networks Inc., Communication Pursuant to Article 94(3) EPC, EP09713486.0, Apr. 14, 2014, 6 pgs.
ActiveVideo Networks Inc., Examination Report No. 1, AU2011258972, Apr. 4, 2013, 5 pgs.
ActiveVideo Networks Inc., Examination Report No. 1, AU2010339376, Apr. 30, 2014, 4 pgs.
ActiveVideo Networks Inc., Examination Report, App. No. EP11749946.7, Oct. 8, 2013, 6 pgs.
ActiveVideo Networks Inc., Summons to Attend Oral Proceedings, EP09820936.4, Aug. 19, 2014, 4 pgs.
ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2010/027724, Oct. 28, 2010, 7 pgs.
Adams, Jerry, “Glasfasernetz für Breitbanddienste in London” [Fiber-Optic Network for Broadband Services in London], NTZ Nachrichtentechnische Zeitschrift, vol. 40, No. 7, Jul. 1987, Berlin, DE, pp. 534-536, 5 pgs. No English translation found.
Avinity Systems B.V., Communication pursuant to Article 94(3) EPC, EP 07834561.8, Jan. 31, 2014, 10 pgs.
Avinity Systems B.V., Communication pursuant to Article 94(3) EPC, EP 07834561.8, Apr. 8, 2010, 5 pgs.
Avinity Systems B.V., International Preliminary Report on Patentability, PCT/NL2007/000245, Feb. 19, 2009, 7 pgs.
Avinity Systems B.V., International Search Report and Written Opinion, PCT/NL2007/000245, Feb. 19, 2009, 18 pgs.
Avinity Systems B.V., Notice of Grounds of Rejection for Patent, JP 2009-530298, Sep. 3, 2013, 4 pgs.
Avinity Systems B.V., Notice of Grounds of Rejection for Patent, JP 2009-530298, Sep. 25, 2012, 6 pgs.
Bird et al., “Customer Access to Broadband Services,” ISSLS 86—The International Symposium on Subscriber Loops and Services, Sep. 29, 1986, Tokyo, JP, 6 pgs.
Brockmann, Final Office Action, U.S. Appl. No. 13/668,004, Jul. 16, 2014, 20 pgs.
Brockmann, Office Action, U.S. Appl. No. 13/686,548, Mar. 10, 2014, 11 pgs.
Brockmann, Office Action, U.S. Appl. No. 13/668,004, Dec. 23, 2013, 9 pgs.
Brockmann, Office Action, U.S. Appl. No. 13/438,617, May 12, 2014, 17 pgs.
Brockmann, Final Office Action, U.S. Appl. No. 12/443,571, Mar. 7, 2014, 21 pgs.
Brockmann, Office Action, U.S. Appl. No. 12/443,571, Jun. 5, 2013, 18 pgs.
Chang, Shih-Fu, et al., “Manipulation and Compositing of MC-DCT Compressed Video,” IEEE Journal on Selected Areas in Communications, Jan. 1995, vol. 13, No. 1, 11 pgs.
Dahlby, Office Action, U.S. Appl. No. 12/651,203, Jun. 5, 2014, 18 pgs.
Dahlby, Final Office Action, U.S. Appl. No. 12/651,203, Feb. 4, 2013, 18 pgs.
Dahlby, Office Action, U.S. Appl. No. 12/651,203, Aug. 16, 2012, 18 pgs.
Dukes, Stephen D., “Photonics for cable television system design, Migrating to regional hubs and passive networks,” Communications Engineering and Design, May 1992, 4 pgs.
Ellis, et al., “INDAX: An Operational Interactive Cabletext System,” IEEE Journal on Selected Areas in Communications, vol. SAC-1, No. 2, Feb. 1983, pp. 285-294.
European Patent Office, Supplementary European Search Report, Application No. EP 09 70 8211, dated Jan. 5, 2011, 6 pgs.
Frezza, W., “The Broadband Solution-Metropolitan CATV Networks,” Proceedings of Videotex '84, Apr. 1984, 15 pgs.
Gecsei, J., “Topology of Videotex Networks,” The Architecture of Videotex Systems, Chapter 6, Prentice-Hall, Inc., 1983.
Gobl, et al., “ARIDEM—a multi-service broadband access demonstrator,” Ericsson Review No. 3, 1996, 7 pgs.
Gordon, Notice of Allowance, U.S. Appl. No. 12/008,697, Mar. 20, 2014, 10 pgs.
Gordon, Final Office Action, U.S. Appl. No. 12/008,722, Mar. 30, 2012, 16 pgs.
Gordon, Final Office Action, U.S. Appl. No. 12/035,236, Jun. 11, 2014, 14 pgs.
Gordon, Final Office Action, U.S. Appl. No. 12/035,236, Jul. 22, 2013, 7 pgs.
Gordon, Final Office Action, U.S. Appl. No. 12/035,236, Sep. 20, 2011, 8 pgs.
Gordon, Final Office Action, U.S. Appl. No. 12/035,236, Sep. 21, 2012, 9 pgs.
Gordon, Final Office Action, U.S. Appl. No. 12/008,697, Mar. 6, 2012, 48 pgs.
Gordon, Office Action, U.S. Appl. No. 12/035,236, Mar. 13, 2013, 9 pgs.
Gordon, Office Action, U.S. Appl. No. 12/035,236, Mar. 22, 2011, 8 pgs.
Gordon, Office Action, U.S. Appl. No. 12/035,236, Mar. 28, 2012, 8 pgs.
Gordon, Office Action, U.S. Appl. No. 12/035,236, Dec. 16, 2013, 11 pgs.
Gordon, Office Action, U.S. Appl. No. 12/008,697, Aug. 1, 2013, 43 pgs.
Gordon, Office Action, U.S. Appl. No. 12/008,697, Aug. 4, 2011, 39 pgs.
Gordon, Office Action, U.S. Appl. No. 12/008,722, Oct. 11, 2011, 16 pgs.
Handley et al, “TCP Congestion Window Validation,” RFC 2861, Jun. 2000, Network Working Group, 22 pgs.
Henry et al. “Multidimensional Icons” ACM Transactions on Graphics, vol. 9, No. 1 Jan. 1990, 5 pgs.
Insight advertisement, “In two years this is going to be the most watched program on TV” On touch VCR programming, published not later than 2000, 10 pgs.
Isensee et al., “Focus Highlight for World Wide Web Frames,” Nov. 1, 1997, IBM Technical Disclosure Bulletin, vol. 40, No. 11, pp. 89-90.
Kato, Y., et al., “A Coding Control algorithm for Motion Picture Coding Accomplishing Optimal Assignment of Coding Distortion to Time and Space Domains,” Electronics and Communications in Japan, Part 1, vol. 72, No. 9, 1989, 11 pgs.
Koenen, Rob, “MPEG-4 Overview—Overview of the MPEG-4 Standard,” Internet Citation, Mar. 2001, http://mpeq.telecomitalialab.com/standards/mpeg-4/mpeg-4.htm, May 9, 2002, 74 pgs.
Konaka, M. et al., “Development of Sleeper Cabin Cold Storage Type Cooling System,” SAE International, The Engineering Society for Advancing Mobility Land Sea Air and Space, SAE 2000 World Congress, Detroit, Michigan, Mar. 6-9, 2000, 7 pgs.
Le Gall, Didier, “MPEG: A Video Compression Standard for Multimedia Applications,” Communications of the ACM, vol. 34, No. 4, Apr. 1991, New York, NY, 13 pgs.
Langenberg, E., et al., “Integrating Entertainment and Voice on the Cable Network,” SCTE, Conference on Emerging Technologies, Jan. 6-7, 1993, New Orleans, Louisiana, 9 pgs.
Large, D., “Tapped Fiber vs. Fiber-Reinforced Coaxial CATV Systems,” IEEE LCS Magazine, Feb. 1990, 7 pgs.
Mesiya, M.F, “A Passive Optical/Coax Hybrid Network Architecture for Delivery of CATV, Telephony and Data Services,” 1993 NCTA Technical Papers, 7 pgs.
“MSDL Specification Version 1.1,” International Organisation for Standardisation / Organisation Internationale de Normalisation, ISO/IEC JTC1/SC29/WG11 Coding of Moving Pictures and Audio, N1246, MPEG96/Mar. 1996, 101 pgs.
Noguchi, Yoshihiro, et al., “MPEG Video Compositing in the Compressed Domain,” IEEE International Symposium on Circuits and Systems, vol. 2, May 1, 1996, 4 pgs.
Regis, Notice of Allowance U.S. Appl. No. 13/273,803, Sep. 2, 2014, 8 pgs.
Regis, Notice of Allowance U.S. Appl. No. 13/273,803, May 14, 2014, 8 pgs.
Regis, Final Office Action U.S. Appl. No. 13/273,803, Oct. 11, 2013, 23 pgs.
Regis, Office Action U.S. Appl. No. 13/273,803, Mar. 27, 2013, 32 pgs.
Richardson, Ian E.G., “H.264 and MPEG-4 Video Compression: Video Coding for Next-Generation Multimedia,” John Wiley & Sons, US, 2003, ISBN: 0-470-84837-5, pp. 103-105, 149-152, and 164.
Rose, K., “Design of a Switched Broad-Band Communications Network for Interactive Services,” IEEE Transactions on Communications, vol. com-23, No. 1, Jan. 1975, 7 pgs.
Saadawi, Tarek N., “Distributed Switching for Data Transmission over Two-Way CATV”, IEEE Journal on Selected Areas in Communications, vol. SAC-3, No. 2, Mar. 1985, 7 pgs.
Schrock, “Proposal for a Hub Controlled Cable Television System Using Optical Fiber,” IEEE Transactions on Cable Television, vol. CATV-4, No. 2, Apr. 1979, 8 pgs.
Sigmon, Notice of Allowance, U.S. Appl. No. 13/311,203, Sep. 22, 2014, 5 pgs.
Sigmon, Notice of Allowance, U.S. Appl. No. 13/311,203, Feb. 27, 2014, 14 pgs.
Sigmon, Final Office Action, U.S. Appl. No. 13/311,203, Sep. 13, 2013, 20 pgs.
Sigmon, Office Action, U.S. Appl. No. 13/311,203, May 10, 2013, 21 pgs.
Smith, Brian C., et al., “Algorithms for Manipulating Compressed Images,” IEEE Computer Graphics and Applications, vol. 13, No. 5, Sep. 1, 1993, 9 pgs.
Smith, J. et al., “Transcoding Internet Content for Heterogeneous Client Devices,” Circuits and Systems, 1998, ISCAS '98, Proceedings of the 1998 IEEE International Symposium on Monterey, CA, USA, May 31-Jun. 3, 1998, New York, NY, USA, IEEE, US, May 31, 1998, 4 pgs.
Stoll, G. et al., “GMF4iTV: Neue Wege zur Interaktivität mit bewegten Objekten beim digitalen Fernsehen” [New Approaches to Interactivity with Moving Objects in Digital Television], FKT Fernseh- und Kinotechnik, Fachverlag Schiele & Schön GmbH, Berlin, DE, vol. 60, No. 4, Jan. 1, 2006, ISSN: 1430-9947, 9 pgs. No English translation found.
Tamitani et al., “An Encoder/Decoder Chip Set for the MPEG Video Standard,” 1992 IEEE International Conference on Acoustics, vol. 5, Mar. 1992, San Francisco, CA, 4 pgs.
Terry, Jack, “Alternative Technologies and Delivery Systems for Broadband ISDN Access”, IEEE Communications Magazine, Aug. 1992, 7 pgs.
Thompson, Jack, “DTMF-TV, The Most Economical Approach to Interactive TV,” GNOSTECH Incorporated, NCF'95 Session T-38-C, 8 pgs.
Thompson, John W. Jr., “The Awakening 3.0: PCs, TSBs, or DTMF-TV—Which Telecomputer Architecture is Right for the Next Generation's Public Network?,” Gnostech Incorporated, 1995, The National Academy of Sciences, downloaded from The Unpredictable Certainty: White Papers, http://www.nap.edu/catalog/6062.html, pp. 546-552.
Tobagi, Fouad A., “Multiaccess Protocols in Packet Communication Systems,” IEEE Transactions on Communications, vol. Com-28, No. 4, Apr. 1980, 21 pgs.
Toms, N., “An Integrated Network Using Fiber Optics (Info) for the Distribution of Video, Data, and Telephone in Rural Areas,” IEEE Transactions on Communication, vol. Com-26, No. 7, Jul. 1978, 9 pgs.
Trott, A., et al. “An Enhanced Cost Effective Line Shuffle Scrambling System with Secure Conditional Access Authorization,” 1993 NCTA Technical Papers, 11 pgs.
Jurgen, “Two-way applications for cable television systems in the '70s,” IEEE Spectrum, Nov. 1971, 16 pgs.
van Beek, P., “Delay-Constrained Rate Adaptation for Robust Video Transmission over Home Networks,” Image Processing, 2005, ICIP 2005, IEEE International Conference, Sep. 2005, vol. 2, No. 11, 4 pgs.
Van der Star, Jack A. M., “Video on Demand Without Compression: A Review of the Business Model, Regulations and Future Implication,” Proceedings of PTC'93, 15th Annual Conference, 12 pgs.
Welzenbach et al., “The Application of Optical Systems for Cable TV,” AEG-Telefunken, Backnang, Federal Republic of Germany, ISSLS Sep. 15-19, 1980, Proceedings IEEE Cat. No. 80 CH1565-1, 7 pgs.
Yum, T.S.P., “Hierarchical Distribution of Video with Dynamic Port Allocation,” IEEE Transactions on Communications, vol. 39, No. 8, Aug. 1, 1991, XP000264287, 7 pgs.
ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 09820936.4, Oct. 26, 2012, 11 pgs.
ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 10754084.1, Jul. 24, 2012, 11 pgs.
ActiveVideo Networks Inc. Extended EP Search Rpt, Application No. 10841764.3, May 20, 2014, 16 pgs.
ActiveVideo Networks Inc. Extended EP Search Rpt, Application No. 11833486.1, Apr. 3, 2014, 6 pgs.
ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 13168509.1, Apr. 24, 2014, 10 pgs.
ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 13168376.5, Jan. 23, 2014, 8 pgs.
ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 12767642.7, Aug. 20, 2014, 12 pgs.
Avinity Systems B.V., Extended European Search Report, Application No. 12163713.6, Feb. 7, 2014, 10 pgs.
Avinity Systems B.V., Extended European Search Report, Application No. 12163712.8, Feb. 3, 2014, 10 pgs.
ActiveVideo Networks, Inc., International Preliminary Report on Patentability, PCT/US2013/036182, Oct. 14, 2014, 9 pgs.
ActiveVideo Networks Inc., Communication Pursuant to Article 94(3) EPC, EP08713106.6, Jun. 25, 2014, 5 pgs.
ActiveVideo Networks Inc., Communication Pursuant to Rules 161(2) & 162 EPC, EP13775121.0, Jan. 20, 2015, 3 pgs.
ActiveVideo Networks Inc., Certificate of Patent, JP5675765, Jan. 9, 2015, 3 pgs.
ActiveVideo Networks Inc., Decision to Refuse Application, EP09820936.4, Feb. 20, 2015, 4 pgs.
ActiveVideo Networks Inc., Communication Pursuant to Article 94(3) EPC, 10754084.1, Feb. 10, 2015, 12 pgs.
ActiveVideo Networks Inc., Intention to Grant, Communication under Rule 71(3) EPC, EP08713106.6, Feb. 19, 2015, 12 pgs.
ActiveVideo Networks Inc., Notice of Reasons for Rejection, JP2014-100460, Jan. 15, 2015, 6 pgs.
ActiveVideo Networks Inc., Notice of Reasons for Rejection, JP2013-509016, Dec. 24, 2014 (Received Jan. 14, 2015), 11 pgs.
Brockmann, Notice of Allowance, U.S. Appl. No. 13/445,104, Dec. 24, 2014, 14 pgs.
Brockmann, Office Action, U.S. Appl. No. 13/668,004, Feb. 26, 2015, 17 pgs.
Brockmann, Office Action, U.S. Appl. No. 13/686,548, Jan. 5, 2015, 12 pgs.
Brockmann, Office Action, U.S. Appl. No. 13/911,948, Dec. 26, 2014, 12 pgs.
Brockmann, Office Action, U.S. Appl. No. 13/911,948, Jan. 29, 2015, 11 pgs.
Dahlby, Office Action, U.S. Appl. No. 12/651,203, Dec. 3, 2014, 19 pgs.
Craig, Decision on Appeal—Reversed—, U.S. Appl. No. 11/178,177, Feb. 25, 2015, 7 pgs.
Craig, Notice of Allowance, U.S. Appl. No. 11/178,177, Mar. 5, 2015, 7 pgs.
Craig, Notice of Allowance, U.S. Appl. No. 11/178,181, Feb. 13, 2015, 8 pgs.
Gordon, Notice of Allowance, U.S. Appl. No. 12/008,697, Dec. 8, 2014, 10 pgs.
Regis, Notice of Allowance, U.S. Appl. No. 13/273,803, Nov. 18, 2014, 9 pgs.
Regis, Notice of Allowance, U.S. Appl. No. 13/273,803, Mar. 2, 2015, 8 pgs.
Sigmon, Notice of Allowance, U.S. Appl. No. 13/311,203, Dec. 19, 2014, 5 pgs.
TAG Networks Inc, Decision to Grant a Patent, JP 2008-506474, Oct. 4, 2013, 5 pgs.
Brockmann, Notice of Allowance, U.S. Appl. No. 13/438,617, May 22, 2015, 18 pgs.
Brockmann, Notice of Allowance, U.S. Appl. No. 13/445,104, Apr. 23, 2015, 8 pgs.
Brockmann, Office Action, U.S. Appl. No. 14/262,674, May 21, 2015, 7 pgs.
Gordon, Notice of Allowance, U.S. Appl. No. 12/008,697, Apr. 1, 2015, 10 pgs.
Sigmon, Notice of Allowance, U.S. Appl. No. 13/311,203, Apr. 14, 2015, 5 pgs.
Avinity Systems B.V., Pre-Trial Reexamination Report, JP2009-530298, Apr. 24, 2015, 6 pgs.
ActiveVideo Networks Inc., Communication Pursuant to Rules 70(2) and 70a(2), EP11833486.1, Apr. 24, 2014, 1 pg.
ActiveVideo Networks, Inc., International Search Report and Written Opinion, PCT/US2014/041430, Oct. 9, 2014, 9 pgs.
ActiveVideo Networks Inc., Examination Report No. 1, AU2011258972, Jul. 21, 2014, 3 pgs.
ActiveVideo Networks, Notice of Reasons for Rejection, JP2012-547318, Sep. 26, 2014, 7 pgs.
Avinity Systems B. V., Final Office Action, JP-2009-530298, Oct. 7, 2014, 8 pgs.
Brockmann, Final Office Action, U.S. Appl. No. 13/686,548, Sep. 24, 2014, 13 pgs.
Brockmann, Final Office Action, U.S. Appl. No. 13/438,617, Oct. 3, 2014, 19 pgs.
Brockmann, Office Action, U.S. Appl. No. 12/443,571, Nov. 5, 2014, 26 pgs.
ActiveVideo Networks, Inc., Certificate of Grant, AU2011258972, Nov. 19, 2015, 2 pgs.
ActiveVideo Networks, Inc., Certificate of Grant, AU2011315950, Dec. 17, 2015, 2 pgs.
ActiveVideo Networks, Inc., Certificate of Grant, EP13168509.1-1908, Sep. 30, 2015, 2 pgs.
ActiveVideo, Certificate of Grant, AU2011249132, Jan. 7, 2016, 2 pgs.
ActiveVideo, Notice of German Patent, EP602008040474-9, Jan. 6, 2016, 4 pgs.
ActiveVideo Networks, Inc., Communication Pursuant to Rules 161(1) and 162 EPC, EP14722897.7, Oct. 28, 2015, 2 pgs.
ActiveVideo Networks, Inc., Decision to Grant, EP13168509.1-1908, Sep. 3, 2015, 2 pgs.
ActiveVideo Networks, Inc., Decision to Refuse a European Patent Application, EP08705578.6, Nov. 26, 2015, 10 pgs.
ActiveVideo Networks, Inc., Extended European Search Report, EP13735906.3, Nov. 11, 2015, 10 pgs.
ActiveVideo Networks, Inc., International Preliminary Report on Patentability, PCT/US2014/030773, Sep. 15, 2015, 6 pgs.
ActiveVideo Networks, Inc., International Preliminary Report on Patentability, PCT/US2014/041430, Dec. 8, 2015, 6 pgs.
ActiveVideo Networks, Inc., International Preliminary Report on Patentability, PCT/US2014/041416, Dec. 8, 2015, 6 pgs.
ActiveVideo, Communication Pursuant to Article 94(3) EPC, EP10841764.3, Dec. 18, 2015, 6 pgs.
ActiveVideo Networks, Inc., Communication Pursuant to Rules 70(2) and 70a(2) EPC, EP13735906.3, Nov. 27, 2015, 1 pg.
ActiveVideo, Notice of Reasons for Rejection, JP2013-509016, Dec. 3, 2015, 7 pgs.
Brockmann, Notice of Allowance, U.S. Appl. No. 14/262,674, Sep. 30, 2015, 7 pgs.
Brockmann, Office Action, U.S. Appl. No. 12/443,571, Dec. 4, 2015, 30 pgs.
Dahlby, Final Office Action, U.S. Appl. No. 12/651,203, Dec. 11, 2015, 25 pgs.
Jacob, Bruce, “Memory Systems: Cache, DRAM, Disk,” Oct. 19, 2007, The Cache Layer, Chapter 22, p. 739.
ActiveVideo Networks, Inc., Certificate of Grant, EP08713106.6-1908, Aug. 5, 2015, 2 pgs.
ActiveVideo Networks, Inc., Decision to Grant, EP08713106.6-1908, Jul. 9, 2015, 2 pgs.
ActiveVideo Networks, Inc., Decision to Grant, JP2014-100460, Jul. 24, 2015, 5 pgs.
ActiveVideo Networks Inc., Examination Report No. 2, AU2011249132, May 29, 2015, 4 pgs.
Activevideo Networks Inc., Examination Report No. 2, AU2011315950, Jun. 25, 2015, 3 pgs.
ActiveVideo, International Search Report and Written Opinion, PCT/US2015/027803, Jun. 24, 2015, 18 pgs.
ActiveVideo, International Search Report and Written Opinion, PCT/US2015/027804, Jun. 25, 2015, 10 pgs.
ActiveVideo Networks, Inc., International Search Report and Written Opinion, PCT/US2015/028072, Aug. 7, 2015, 9 pgs.
ActiveVideo Networks B.V., Office Action, IL222830, Jun. 28, 2015, 7 pgs.
ActiveVideo Networks, Inc., Office Action, JP2013534034, Jun. 16, 2015, 6 pgs.
ActiveVideo Networks, Inc., KIPO's Notice of Preliminary Rejection, KR10-2010-7019512, Jul. 15, 2015, 15 pgs.
ActiveVideo Networks, Inc., KIPO's Notice of Preliminary Rejection, KR10-2010-7021116, Jul. 13, 2015, 19 pgs.
ActiveVideo, Communication Pursuant to Article 94(3) EPC, EP12767642.7, Sep. 4, 2015, 4 pgs.
Brockmann, Notice of Allowance, U.S. Appl. No. 13/911,948, Jul. 10, 2015, 5 pgs.
Brockmann, Final Office Action, U.S. Appl. No. 12/443,571, Jul. 9, 2015, 28 pgs.
Brockmann, Notice of Allowance, U.S. Appl. No. 13/911,948, Aug. 21, 2015, 6 pgs.
Brockmann, Notice of Allowance, U.S. Appl. No. 13/911,948, Aug. 5, 2015, 5 pgs.
Brockmann, Final Office Action, U.S. Appl. No. 13/668,004, Aug. 3, 2015, 18 pgs.
Brockmann, Final Office Action, U.S. Appl. No. 13/686,548, Aug. 12, 2015, 13 pgs.
Brockmann, Final Office Action, U.S. Appl. No. 13/737,097, Aug. 14, 2015, 17 pgs.
Brockmann, Office Action, U.S. Appl. No. 14/298,796, Sep. 11, 2015, 11 pgs.
Dahlby, Office Action U.S. Appl. No. 12/651,203, Jul. 2, 2015, 25 pgs.
Gecsei, J., “Adaptation in Distributed Multimedia Systems,” IEEE Multimedia, IEEE Service Center, New York, NY, vol. 4, No. 2, Apr. 1, 1997, 10 pgs.
Ohta, K., et al., “Selective Multimedia Access Protocol for Wireless Multimedia Communication,” Communications, Computers and Signal Processing, 1997 IEEE Pacific Rim Conference, Victoria, BC, Canada, Aug. 1997, vol. 1, 4 pgs.
Wei, S., “QoS Tradeoffs Using an Application-Oriented Transport Protocol (AOTP) for Multimedia Applications Over IP,” Sep. 23, 1999, Proceedings of the Third International Conference on Computational Intelligence and Multimedia Applications, New Delhi, India, 5 pgs.
ActiveVideo Networks, Inc., Certificate of Patent, JP2013534034, Jan. 8, 2016, 4 pgs.
ActiveVideo Networks, Inc., Communication Pursuant to Rules 161(1) and 162 EPC, EP14740004.8, Jan. 26, 2016, 2 pgs.
ActiveVideo Networks, Inc., Communication Pursuant to Rules 161(1) and 162 EPC, EP14736535.7, Jan. 26, 2016, 2 pgs.
Brockmann, Office Action, U.S. Appl. No. 13/686,548, Feb. 8, 2016, 13 pgs.
Related Publications (1)
Number Date Country
20080178249 A1 Jul 2008 US
Provisional Applications (3)
Number Date Country
60884773 Jan 2007 US
60884744 Jan 2007 US
60884772 Jan 2007 US