The present invention relates to the creation of an encoded video sequence, and more particularly to using scene graph information for encoding the video sequence.
It is known in the prior art to encode and transmit multimedia content for distribution within a network. For example, video content may be encoded as MPEG video wherein pixel domain data is converted into a frequency domain representation, quantized, entropy encoded, and placed into an MPEG stream format. The MPEG stream can then be transmitted to a client device, decoded, and returned to the spatial/pixel domain for display on a display device.
The encoding of the video may be spatial, temporal or a combination of both. Spatial encoding generally refers to intra-frame encoding, wherein spatial redundancy (information) is exploited to reduce the number of bits that represent a spatial location. Spatial data is converted into a frequency domain representation over a small region. In general, the data within a small region is not expected to change drastically, so much of the information will be concentrated in the DC and low-frequency components, with the higher-frequency components being at or near zero. This lack of high-frequency information over a small area is used to reduce the size of the representative data. Data may also be compressed by exploiting temporal redundancy. One method for exploiting temporal redundancy is the calculation of motion vectors. Motion vectors establish how objects or pixels move between frames of video. For example, a ball may move between a first frame and a second frame by a number of pixels in a given direction. Once a motion vector is calculated, the information about the spatial relocation of the ball from the first frame to the second frame can be used to reduce the amount of information needed to represent the motion in an encoded video sequence. Note that in practical applications the motion vector is rarely a perfect match, and an additional residual is sometimes used to compensate for the imperfect temporal reference.
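By way of illustration, the following minimal sketch shows how a motion vector plus a residual can stand in for raw pixel data; it assumes grayscale frames stored as 2D Python lists, and the function names are hypothetical rather than part of any codec.

```python
# Minimal sketch of motion-compensated prediction for a single block.
# Assumes frames are 2D lists of pixel values; names are illustrative only.

def predict_block(reference, x, y, mv_x, mv_y, size):
    """Fetch the block that the motion vector points to in the reference frame."""
    return [[reference[y + mv_y + r][x + mv_x + c] for c in range(size)]
            for r in range(size)]

def residual_block(current, reference, x, y, mv_x, mv_y, size):
    """Difference between the actual block and its temporal prediction;
    only the motion vector and this (often near-zero) residual are encoded."""
    pred = predict_block(reference, x, y, mv_x, mv_y, size)
    return [[current[y + r][x + c] - pred[r][c] for c in range(size)]
            for r in range(size)]
```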
Motion vector calculation is a time-consuming and processor-intensive step in compressing video content. Typically, a motion search algorithm is employed to match elements within the video frames and to define motion vectors that point to the new locations of objects or portions of objects. The motion search algorithm compares macroblocks (i.e., it tries to find for each macroblock the optimal representation of that macroblock in past and future reference frames according to a certain criterion) and determines the vector that represents that temporal relation. The motion vector is subsequently used in the compression process (i.e., to minimize the residual that needs to be compressed). It would be beneficial if a mechanism existed that assists in the determination of these motion vectors.
As appreciated by those skilled in the art, another expensive component of the encoding process for more advanced codecs is finding the optimal macroblock type, the partitioning of the macroblock and the weighting properties of the slice. H.264, for example, has 4 16×16, 9 8×8 and 9 4×4 luma intra prediction modes and 4 8×8 chroma intra prediction modes, and inter macroblocks can be partitioned from as coarse as 16×16 to as fine-grained as 4×4. In addition, it is possible to assign a weight and offset to the temporal references. A mechanism that defines or assists in finding these parameters directly improves scalability.
In a first embodiment of the invention there is provided a method for creating a composited video frame sequence for an application, wherein the video frame sequence is encoded according to a predetermined specification, such as MPEG-2, H.264 or another block-based encoding protocol or variant thereof. A current scene state for the application is compared to a previous scene state, wherein each scene state includes a plurality of objects. A video construction module determines whether properties of one or more objects have changed (such as, but not limited to, an object's position, transformation matrix, texture or translucency) based upon a comparison of the scene states. If properties of one or more objects have changed, the delta between the object's states is determined, and this delta is used by a fragment encoding module in case the fragment is not already available in a fragment caching module. This information is used to define, for example, the motion vectors used by the fragment encoding module in constructing the fragments from which the stitching module builds the composited video frame sequence.
In certain embodiments of the invention, the information about the changes in the scene's state can also be used to decide whether a macroblock is to be encoded spatially (using an intra encoded macroblock) or temporally (using an inter encoded macroblock) and, given a certain encoding, what the optimal partitioning of the macroblock is. In certain embodiments, the information about the changes in the scene's state may also assist in finding the optimal weight and offset of the temporal reference in order to minimize the residual. The benefits of using scene state information in the encoding process are a gain in efficiency with respect to the resources required to encode the fragments, improvements in the visual quality of the encoded fragments, and a reduction in the size of the encoded fragments, because spatial relations in the current scene state or temporal relations between the previous scene state and the current scene state can be more accurately determined.
Some embodiments of the invention may maintain objects in a two-dimensional coordinate system, two-dimensional (flat) objects in a three-dimensional coordinate system, or a full three-dimensional object model in a three-dimensional coordinate system. The objects may be kept in a hierarchical structure, such as a scene graph. Embodiments may use additional three-dimensional object or scene properties known in the art, such as, but not limited to, perspective, lighting effects, reflection, refraction and fog.
In other embodiments, in order to determine the motion information, the current scene graph state and the previous scene graph state may be converted from a three-dimensional representation into a two-dimensional representation. The three-dimensional representation may be a world view of the objects to be rendered and displayed on a display device. The two-dimensional representation may be a screen view for displaying the objects on a display device. In addition to the motion information, there will in general be residual graphical information, because the edges of moving objects generally do not map exactly onto macroblock boundaries, or objects are partially translucent, overlap, or exhibit quantization effects.
Embodiments of the invention may construct an MPEG encoded video sequence using the motion information, including the corresponding motion vectors, and the residual graphical information that can be encoded. The scene states (previous and current) may be produced as the output of an application execution engine. The application execution engine may be a web browser, a script interpreter, an operating system or another computer-based environment that is accessed during operation of the application. The application execution engine may interface with the described system using a standardized API (application programming interface), such as, for example, OpenGL. The system may translate the scene representation as expressed through the used API into a convenient internal representation, or directly derive state changes from the API's primitives.
The current scene graph state includes a plurality of objects having associated parameters. Some examples of parameters are the locations of objects to be rendered, lighting effects, textures, and other graphical characteristics that may be used in rendering the object(s). A hash may be created for objects within a scene. The hash may be compared to a table of hashes that represent objects from previous scenes. If the current hash matches a hash within the table of hashes, MPEG encoded elements for the identified object are retrieved. The MPEG encoded elements can then be sent to a stitcher that stitches together the MPEG encoded elements to form one or more MPEG encoded video frames in a series of MPEG encoded video frames.
In order to create the hash for the objects, the scene graph state is converted to a 2D or display representation. It is then determined which non-overlapping rectangles of the display represent state changes of the scene graph state. A hash is created for each rectangle, i.e. object; the previous and current states of the objects within these rectangles are hashed. These hashes are compared to the hashes available in the table of hashes.
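For illustration, a minimal sketch of this hash-and-lookup step, assuming objects are represented as property dicts and SHA-1 is used as the hash; the data layout and function names are assumptions, not a prescribed format.

```python
import hashlib

def rectangle_hash(rect, objects_prev, objects_curr):
    """Hash a changed rectangle together with the previous and current
    states of the objects it contains."""
    h = hashlib.sha1()
    h.update(repr(rect).encode())                      # rectangle coordinates
    for state in (objects_prev, objects_curr):
        for obj in state:                              # obj: property dict
            h.update(repr(sorted(obj.items())).encode())
    return h.hexdigest()

hash_table = {}  # hash value -> previously encoded MPEG elements

def lookup_or_encode(rect, objects_prev, objects_curr, encode_fn):
    key = rectangle_hash(rect, objects_prev, objects_curr)
    if key not in hash_table:                          # cache miss: encode
        hash_table[key] = encode_fn(rect, objects_curr)
    return hash_table[key]                             # reuse on cache hit
```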
If the current hash does not match a hash in the table and no motion information can be determined by the scene graph state comparison for an object, the spatial data from the hashing process, in which the object is converted from a three-dimensional representation to a two-dimensional screen representation, is provided to an encoder, and the encoder compresses the data using at least spatial techniques to produce one or more encoded elements. The encoder may encode according to a predetermined protocol such as MPEG, H.264 or another block-based encoding protocol. The encoded elements are passed to a stitching module. The stitching module forms an encoded MPEG frame from the encoded elements, where the encoded MPEG frame is part of an MPEG video sequence.
The methodology may be embodied as a computer program product where the computer program product includes a non-transitory computer readable medium having computer code thereon for creating an encoded video sequence. The above-described method may be embodied as a system that includes one or more processors that perform specified functions in the creation of the encoded video sequence.
The foregoing features of embodiments will be more readily understood by reference to the following detailed description, taken with reference to the accompanying drawings, in which:
Definitions. As used in this description and the accompanying claims, the following terms shall have the meanings indicated, unless the context otherwise requires:
The term “application” refers to an executable program, or a listing of instructions for execution, that defines a graphical user interface (“GUI”) for display on a display device. An application may be written in a declarative language such as HTML or CSS, a procedural language such as C, JavaScript, or Perl, any other computer programming language, or a combination of languages.
“Application execution environment” is an environment that receives an application, including all of its components, manages the components and their execution to define a graphical layout, and manages the interactions with that graphical layout. For example, Trident, WebKit, and Gecko are software layout engines that convert web pages into a collection of graphical objects (text strings, images, and so on) arranged, according to various instructions, within a page display area of a web browser. The instructions may be static, as in the case of parts of HTML, or dynamic, as in the case of JavaScript or other scripting languages, and the instructions may change as a function of user input. Trident is developed by Microsoft Corporation and used by the Internet Explorer web browser; WebKit is developed by a consortium including Apple, Nokia, Google and others, and is used by the Google Chrome and Apple Safari web browsers; Gecko is developed by the Mozilla Foundation, and is used by the Firefox web browser. Operating systems such as Google's Android and Apple's iOS may be considered application execution environments because these operating systems can execute applications. The output of an application execution environment is a screen state (either absolute or relative to a previous screen state). The screen state may be presented as a scene graph state.
“Rendering Engine” transforms a model of an image to actual data that can generate the image on a display device. The model of the image may contain two-dimensional or three-dimensional data as would be represented in a world space and the rendering engine takes the data and transforms the data into a screen-space representation wherein the data may be represented as pixels.
“Video Construction Module” compares scene states and derives which areas on the display device need to be changed. The video construction module determines how to map the changed areas to encoded fragments that can be stitched together, and maintains a cache of already encoded fragments. If fragments are not available in encoded form, the video construction module interacts with a fragment encoding module to encode the fragment.
“Fragment Caching Module” stores fragments in volatile memory (such as, for example, the system's RAM) or persistent memory (such as, for example, a disk-based file system).
“Encoding Engine/Fragment Encoding Module” transforms graphical data and associated information about spatial and/or temporal relations into one or more encoded fragments.
“Stitching Engine/Module” receives as input one or more fragments (e.g., MPEG encoded elements) along with layout information and then constructs complete video frames for a video sequence (e.g. MPEG video frames for an MPEG elementary stream).
“Scene” is a model of an image generated by an application execution engine, consisting of objects and their properties.
“Scene state” is the combined state of all objects and their properties at a particular moment in time.
“Scene Graph” is a specialized scene where objects have a hierarchical relation.
“Scene Graph State” is the combined state of all objects and their properties of a scene graph at a particular moment in time.
“API” (application programming interface) is an interaction point for software modules, providing functions, data structures and object classes with the purpose of using provided services in software modules.
“DOM” (document object model) is a convention for representing and interacting with objects in markup languages such as HTML and XML documents.
“DOM tree” is a representation of a DOM (document object model) for a document (e.g. an HTML file) having nodes wherein the topmost node is the document object.
“CSS” (cascading style sheets) provide the graphical layout information for a document (e.g. an HTML document) and how each object or class of objects should be represented graphically. The combination of a DOM object and the corresponding CSS files (i.e. layout) is referred to as a rendering object.
“Render layer” is a graphical representation of one or more objects of a scene graph state. For example, a group of objects that have a geometric relationship, such as an absolute or relative position to each other, may form a layer. An object may be considered to be a separate render layer if, for example, the object is transparent, has an alpha mask or has a reflection. A render layer may be defined by a screen area, such as a screen area that can be scrolled. A render layer may be designated for an area that may have an overlay (e.g. a pop up). A render layer may be defined for a portion of a screen area if that area is to have an applied graphical filter such as a blur, color manipulation or shadowing. A layer may be defined by a screen area that has associated video content. Thus, a render layer may be a layer within a scene graph state or a modification of a scene graph state layer in which objects are grouped according to a common characteristic.
“Fragment” is one or more MPEG-encoded macroblocks, as disclosed in U.S. patent application Ser. No. 12/443,571, filed Oct. 1, 2007, the contents of which are incorporated by reference in their entirety. A fragment may be intra-encoded (spatially-encoded), inter-encoded (temporally-encoded) or a combination thereof.
Embodiments of the present invention provide for the extraction of spatial information as well as other graphical information from an application execution environment by using software integration points that are (for example) intended for communication between the application execution environment and Graphical Processing Unit (GPU) driver software. This spatial information can then be used for the creation of motion vectors for encoding of graphical content in a frequency-based encoding format, such as MPEG, AVS, VC-1, H.264 and other block-based encoding formats and variants that employ motion vectors.
Embodiments of the invention use the motion information exposed by an application execution environment's GPU interface (or another suitable interface that allows access to the scene graph state) to obtain spatial and temporal information about the screen objects to be rendered, and use that information to more efficiently encode the screen objects into a stream of MPEG frames.
In order to determine the motion information, the application execution environment may access Z-ordering information from a scene graph for the rendering of objects. For example, the application execution environment can separate a background layer from a foreground image layer, and the scene graph state may specify objects that are partially translucent. This information can be used to determine what information will be rendered from a three-dimensional world view into a two-dimensional screen view. Once the visible elements are determined, motion information can be determined and converted into motion vectors. Multiple motion vectors may be present for a particular screen area. For example, if two different layers (on different Z indices) are moving in different directions, the area would have different associated motion vectors. The encoder will determine a dominant vector given its knowledge of what is being rendered, including translucency, the surface area of the moving object, texture properties (i.e., whether it is a solid color or a pattern), etc.
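One way to picture this dominant-vector selection is the following hedged sketch, where each candidate layer contributes its motion vector weighted by opacity and visible area; the weighting rule is an illustrative assumption suggested by the factors named above, not a fixed rule.

```python
# Sketch: choose a dominant motion vector for a screen area when several
# layers on different Z indices move differently. Weighting is illustrative.

def dominant_vector(candidates):
    """candidates: list of (motion_vector, opacity, visible_area) tuples."""
    best_mv, best_weight = None, -1.0
    for mv, opacity, area in candidates:
        weight = opacity * area        # more opaque, larger layers dominate
        if weight > best_weight:
            best_mv, best_weight = mv, weight
    return best_mv
```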
As shown, the Application Execution Engine 110 may produce an output for graphical processing. Examples of application execution environments include computer software, hardware, and combinations thereof for executing the application. Applications can be written for certain application execution environments, including WebKit, JAVA compilers, script interpreters (e.g., Perl) and various operating systems, including iOS and Android OS, for example.
The video construction engine 170 takes advantage of the data that it receives from the application execution environment in order to exploit redundancies in requests for the presentation of information within and between user sessions, as well as to determine motion changes of objects from a previous video frame or scene graph state to a current frame or scene graph state. The present system may be used in a networked environment wherein multiple user sessions are operational simultaneously and requested applications may be used by multiple users simultaneously.
The video construction engine 170 may receive OpenGL data and can construct a scene graph from the OpenGL data. The video construction engine 170 can then compare the current scene graph state to one or more previous scene graph states to determine if motion occurs between objects within the scene. If motion occurs between the objects, this motion can be translated into a motion vector, and this motion vector information can be passed to an encoding module 150. Thus, the encoding module 150 need not perform a motion vector search and can add the motion vectors into the video frame format (e.g. MPEG video frame format). MPEG elements can be constructed that are encoded MPEG macroblocks that are inter-frame encoded. These macroblocks are passed to the stitching module 160, which receives stitching information about the video frame layout and stitches together encoded MPEG elements to form complete MPEG encoded video frames in accordance with the scene graph. Either simultaneously or in sequence, the video construction engine 170 may hash the parameters for objects within the scene graph according to a known algorithm. The construction engine 170 will compare the hash value to hash values of objects from previous scene graphs, and if there is a match within the table of hashes, the construction engine 170 will locate MPEG encoded macroblocks (MPEG elements) that are stored in memory and are related to the hash. These MPEG elements can be passed directly to the stitching engine 160, wherein the MPEG elements are stitched together to form complete MPEG encoded video frames. Thus, the output of the stitching module 160 is a sequence of encoded video frames that contain both intra-frame encoded macroblocks and inter-frame encoded macroblocks. Additionally, the video construction engine 170 outputs pixel-based information to the encoder. This pixel-based information may be encoded using spatially based encoding algorithms, including the standard MPEG DCT processes. This pixel-based information occurs as a result of changes in the scene (visual display) in which objects represented by rectangles are altered. The encoded macroblocks can then be passed to the stitcher. The processes of the video construction engine 170 will be explained in further detail with respect to the remaining figures.
The application execution engine may be proximate to the client device, operational on the client device, or remote from the client device, such as in a networked client/server environment. The control signal for the dirty rectangle causes the application execution engine to generate a scene graph having a scene graph state that reflects the changes to the screen (e.g. dirty rectangles of the screen display). For example, the application execution environment may be a web browser operating within an operating system. The web browser represents a page of content in a structured hierarchical format, such as a DOM and corresponding DOM tree. Associated with the DOM tree is a CSS that specifies where and how each object is to be graphically rendered on a display device. The web browser creates an output that can be used by a graphics engine. The output that is produced is the scene graph state, which may have one or more nodes and objects associated with the nodes forming a layer (i.e. a render layer) 200. As requests for updates occur from a client device, or as updates are automatically generated as in a script, a new or current scene graph state is generated. Thus, the current scene graph state represents a change in the anticipated output video that will be rendered on a display device. An exemplary scene graph state is shown in the accompanying drawings.
Once the current scene graph state is received by the video construction engine 200, the scene graph state can be compared with a previous scene graph state 210. The comparison of scene graph states can be performed hierarchically, by layer and by object. For each object associated with a node, differences in the positions of objects between the scene graph states can be identified, as well as differences in characteristics such as translucence and lighting.
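For illustration, a minimal sketch of such a hierarchical comparison, assuming nodes are dicts with “properties” and “children” entries and that both graphs share the same topology; added or removed nodes would be handled separately, and the property names are hypothetical.

```python
# Sketch of a hierarchical scene-graph comparison; structures are illustrative.

def diff_scene_graphs(prev, curr, changed=None):
    changed = [] if changed is None else changed
    for key in ("position", "transform", "texture", "opacity"):
        if prev["properties"].get(key) != curr["properties"].get(key):
            changed.append((key,
                            prev["properties"].get(key),
                            curr["properties"].get(key)))
    # recurse layer by layer, object by object
    for p_child, c_child in zip(prev["children"], curr["children"]):
        diff_scene_graphs(p_child, c_child, changed)
    return changed
```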
For example, in a simple embodiment, a circle may be translated by a definable distance between the current scene graph state and a previous scene graph state. The system queries whether one or more objects within the scene graph state have moved. If one or more objects have been identified as moving between scene graph states, information about the motion translation is determined 220. This may require the transformation of position data from a three-dimensional world coordinate view to a two-dimensional screen view so that pixel-level motion (two-dimensional motion vectors) can be determined. This motion information can then be passed on to an encoder in the form of a motion vector 230. Thus, the motion vector information can be used by the encoder to create inter-frame encoded video frames. For example, the video frames may be P or B frame MPEG encoded frames.
In addition to objects moving, scene elements may also change. Thus, a two-dimensional representation of information to be displayed on a screen can be ascertained from the three-dimensional scene graph state data. Rectangles can be defined as dirty rectangles, which identify data on the screen that has changed 240. These rectangles can be hashed according to a known formula that takes into account properties of the rectangles 250. The hash value can then be compared to a listing of hash values associated with rectangles that were updated from previous scene graph states 260. The list of hash values may be for the current user session or for other user sessions. Thus, if a request for a change in the content being displayed in an application is received from multiple parties, the redundancy in the information being requested can be exploited and processing resources conserved. More specifically, if the hash matches a hash within the searchable memory, encoded graphical data (e.g. either a portion of an entire video frame of encoded data or an entire frame of encoded data) that is linked to the hash value in the searchable memory is retrieved, and the data can be combined with other encoded video frames 270.
Additionally, if a rectangle is identified as being dirty and a matching hash is not identified, the spatial information for that rectangle can be passed to the encoder, and the MPEG encoder will spatially encode the data for the rectangle. As used herein, the term “content” may refer to a dirty rectangle or an object from a scene graph state.
The application execution environment 300 creates a current scene graph 320. The current scene graph may be translated using a library of functions, such as the OpenGL library 330. The resulting OpenGL scene graph state 340 is passed to the video construction engine 310. The OpenGL scene graph state 340 for the current scene graph is compared to a previous scene graph state 350 in a comparison module 360. This may require the calculation and analysis of two-dimensional projections of the three-dimensional information present within the scene graph state. Such transformations are known by one of ordinary skill in the art. It should be recognized that OpenGL is used herein for convenience and that only the creation of a scene graph state is essential for the present invention. Thus, the scene graph state need not be converted into OpenGL before a scene graph state comparison is performed.
Differences between the scene graphs are noted and dirty rectangles can be identified 370. A dirty rectangle 370 represents a change to an identifiable portion of the display (e.g. a button changing from an on-state to an off-state). There may be more than one dirty rectangle that is identified in the comparison of the scene graph states. Thus, multiple objects within a scene may change simultaneously causing the identification of more than one dirty rectangle.
From the list of dirty rectangles 370, a list of MPEG fragment rectangles (i.e. spatially defined fragments, such as a plurality of macroblocks on macroblock boundaries) can be determined for the dirty rectangles. It should be recognized that the term MPEG fragment rectangle as used in the present context refers to spatial data and not frequency-transformed data; it is referred to as an MPEG fragment rectangle because MPEG requires a block-based formatting schema, i.e. macroblocks that are generally 16×16 pixels in shape. Defining dirty rectangles as MPEG fragment rectangles can be achieved by defining, for each dirty rectangle, an MPEG fragment rectangle in which the dirty rectangle is fully encompassed within a selection of macroblocks. Thus, the dirty rectangle fits within a rectangle composed of spatially defined macroblocks. Preferably the dirty rectangles are combined or split to limit the number of MPEG fragment rectangles that are present or to avoid small changes in large rectangles.
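A minimal sketch of this macroblock snapping, assuming pixel coordinates with an inclusive top-left corner (x0, y0) and an exclusive bottom-right corner (x1, y1); the function name is illustrative.

```python
MB = 16  # macroblock size in pixels

def snap_to_macroblocks(x0, y0, x1, y1):
    """Expand a dirty rectangle so it is fully covered by whole macroblocks."""
    return (x0 // MB * MB,                 # round left edge down
            y0 // MB * MB,                 # round top edge down
            (x1 + MB - 1) // MB * MB,      # round right edge up
            (y1 + MB - 1) // MB * MB)      # round bottom edge up
```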
For each MPEG fragment rectangle, a listing of nodes according to z-order (depth) in the scene graph that contributed to the rectangle contents is determined. This can be achieved by omitting nodes that are invisible, have a low opacity, or have a transparent texture.
For each MPEG fragment rectangle, a hash value 382 is created based upon relevant properties of all nodes that have contributed to the rectangle contents (for example absolute position, width, height, transformation matrix, hash of the texture bitmap, and opacity). If the cache contains an encoded MPEG fragment associated with that hash value, then the encoded MPEG fragment is retrieved from the cache. In the present context, the term encoded MPEG fragment refers to a portion of a full frame of video that has been encoded according to the MPEG standard. The encoding may simply be DCT encoding for blocks of data or may also include MPEG-specific header information for the encoded material. If the calculated hash value does not match an MPEG fragment in the cache, then the dirty rectangle contents (using the scene graph state) are rendered from a three-dimensional world view to a two-dimensional screen view, and the rendered pixel data (i.e. spatial data) are encoded in an encoder, such as an MPEG encoder 385. The encoded MPEG data (e.g. encoded MPEG fragment(s)) for the scene is stored in the cache.
As part of the encoding process, the fragment is analyzed to determine whether the encoding can best be performed as ‘inter’ encoding (an encoding relative to the previous screen state) or as ‘intra’ encoding (an independent encoding). Inter-encoding is generally preferred because it requires less bandwidth and may result in higher quality streams. All changes in nodes between scene graphs are determined, including movement, changes of opacity, and changes in texture, for example. The system then evaluates whether these changes contribute to a fragment, and whether it is possible to express these changes efficiently in the video codec's primitives. If the evaluation indicates that the changes to dominant nodes can be expressed well in the video codec's primitives, then the fragment is inter-encoded. These steps are repeated for every screen update. Since the ‘new scene graph’ will become the ‘previous scene graph’ in the next screen update, intermediate results can be reused from previous frames.
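One plausible reading of this decision, expressed as a hedged sketch; the area weighting and the threshold are illustrative assumptions rather than anything prescribed above.

```python
# Heuristic sketch of the inter/intra decision for one fragment.

def choose_encoding(changes, threshold=0.8):
    """changes: list of (area, expressible) pairs, where 'expressible' marks
    changes that map onto the codec's primitives (e.g. a pure translation
    maps onto a motion vector)."""
    total = sum(area for area, _ in changes)
    expressible = sum(area for area, ok in changes if ok)
    if total and expressible / total >= threshold:
        return "inter"   # dominant changes fit the codec's primitives
    return "intra"       # fall back to an independent spatial encoding
```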
The tree-like structure provides a hierarchical representation wherein attributes of parent objects can be attributed to the child objects. The root object represents the entire scene 610, while child nodes of a certain node may contain a decomposition of the parent node into smaller objects. The nodes may contain a texture (bitmap object), a 3D transformation matrix that specifies how the texture is positioned in 3D space, and/or other graphical attributes such as visibility and transparency. A child node inherits all attributes, transformations and filters from the parent node.
For example, movement between scene graphs of an object such as the “cover list” 620 would indicate that each of the child objects (cover1, cover2, cover3, and cover4) 621, 622, 623, 624 would also move by an equal amount, as shown in the corresponding screen shot in the accompanying drawings.
The scene graph comparison between the previous scene graph and the current scene graph may be performed in the following manner, wherein the scene graph is transformed from a 3D to a 2D space. A node in a scene graph consists of an object having a texture (2D bitmap) and a transformation that specifies how that object is positioned in space. It also contains the z-order (the absolute order in which things are rendered). In OpenGL the transformation consists of a matrix:
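(The matrix itself does not survive in this text; the following reconstruction is based on the element indices used in the next paragraph, with OpenGL's 16-element array written out row by row, so that the bottom row m[12], m[13], m[14] carries the translation.)

$$
m = \begin{pmatrix}
m[0] & m[1] & m[2] & m[3] \\
m[4] & m[5] & m[6] & m[7] \\
m[8] & m[9] & m[10] & m[11] \\
m[12] & m[13] & m[14] & m[15]
\end{pmatrix}
$$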
This transformation is applied to an element ‘a’ in 3D space by matrix multiplication. The element ‘a’ is identified by four points: the origin and the three top positions of the object in the x, y and z directions. The bottom row (i.e. elements m[12], m[13] and m[14]) specifies translation in 3D space. Elements m[0], m[4], m[8], m[1], m[5], m[9], m[2], m[6] and m[10] specify where the three top positions of the object (i.e. the furthest points out in the x, y and z directions) will end up under matrix multiplication. This allows for rotation, slanting, shearing, shrinking, zooming, translation and so on, and for repositioning of the object in world space at any time.
When two transformations have been applied to an object according to matrix ‘m’ (from the previous scene graph) and ‘n’ (from the current scene graph), the “difference” between the two is m − n: matrix subtraction. The result of the matrix subtraction gives the amount of rotation, slanting, shearing, shrinking, zooming, translation, etc. that has been applied to the object between the previous frame and the current frame.
Projecting a 3D image onto a 2D surface is well known in the art. In one embodiment, the system first calculates projections of the 3D scene graphs onto a 2D plane, whereby the transformation matrices also become 2D. The motion vector (obtained by subtracting the transformation matrices) is then 2D and can be directly applied by the MPEG encoder. If motion was detected, one motion vector per (destination) macroblock is passed. The motion vector has a defined (x, y) direction and a length that together indicate the direction and distance covered between the current frame and the previous frame. The encoder then assumes that the reference information for a macroblock is located in the reverse direction of the motion vector. If no motion was detected, then either the macroblock did not change, or it changed entirely, in which case it is intra-encoded.
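As a sketch of this step, assuming 3×3 homogeneous 2D matrices stored as row-major nested lists whose bottom row carries the translation (mirroring the 3D layout above); the names and layout are illustrative.

```python
# Derive a 2D motion vector from the difference of two projected transforms.

def matrix_sub(m, n):
    return [[m[r][c] - n[r][c] for c in range(3)] for r in range(3)]

def motion_vector(prev_2d, curr_2d):
    d = matrix_sub(prev_2d, curr_2d)   # m - n: previous minus current state
    dx, dy = d[2][0], d[2][1]          # translation components of the delta
    # the encoder assumes the reference data lies in the reverse direction
    return (dx, dy)
```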
Hashing and caching of dirty rectangles on individual layers of a scene graph state is more efficient than hashing and caching of the 2D projection of these layers, because the layers represent independent changes.
It should be noted that some application execution environments might use one ‘background’ layer on which they render objects for which they choose not to create a separate render layer. This could be a wall clock, for example. Changes to this layer are analyzed, resulting in one or more dirty rectangles. In principle, all rectangles depend on the background (if the background changes, parts of the background are likely visible in the rectangle due to the macroblock snapping). To avoid the background being part of every rectangle's hash function, and thus to avoid a re-rendering and re-encoding of all rectangles when the background changes (e.g. when the seconds hand moves in the wall clock object), the background is excluded from the scene graph and is not available as an MPEG fragment.
Since embodiments of the invention may maintain objects in a two-dimensional coordinate system, two-dimensional (flat) objects in a three-dimensional coordinate system, or a full three-dimensional object model in a three-dimensional coordinate system, a mapping has to be made for each object from the scene's coordinate system to the current and previous fields of view. The field of view is the extent of the observable scene at a particular moment. For each object on the list of changed, added or removed objects, it is determined in step 1202 whether the object's change, addition or removal was visible in the field of view of the scene's current state or the field of view of the previous state, and what bounding rectangle represented that change in said states.
Bounding rectangles pertaining to the objects' previous and current states may overlap in various constellations. Fragments, however, cannot overlap, and before any fragments can be identified, overlapping conditions have to be resolved. This is done in step 1203 by applying a tessellation or tiling process as depicted in the accompanying drawings.
Suppose that overlapping rectangles 1301 for object A and 1302 for object B, as depicted in the accompanying drawings, are input to the tessellation process; the process resolves the overlap into a set of non-overlapping rectangles that cover the same area.
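By way of illustration, a minimal grid-based tessellation sketch; representing rectangles as (x0, y0, x1, y1) tuples is an assumption, and the actual process may additionally merge or split the resulting tiles.

```python
# Split possibly overlapping axis-aligned rectangles into non-overlapping
# tiles that cover exactly the same area.

def tessellate(rects):
    xs = sorted({x for r in rects for x in (r[0], r[2])})
    ys = sorted({y for r in rects for y in (r[1], r[3])})
    tiles = []
    for x0, x1 in zip(xs, xs[1:]):
        for y0, y1 in zip(ys, ys[1:]):
            # keep a grid cell only if some input rectangle covers it
            if any(r[0] <= x0 and x1 <= r[2] and r[1] <= y0 and y1 <= r[3]
                   for r in rects):
                tiles.append((x0, y0, x1, y1))
    return tiles
```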
Returning to step 1203, the tessellation process is first applied to the rectangles pertaining to the objects' previous states. When an object changes position or its transformation matrix changes, graphical data may be revealed that was obscured in the previous state. The object's new bounding rectangle usually only partially overlaps with the object's previous bounding rectangle. A fragment has to be made that encodes this exposure. Therefore, step 1203 first applies the tessellation process to all bounding rectangles of the objects' previous states. Subsequently, the bounding rectangles of the objects' current states are added to the tessellation process. The resulting rectangles represent the fragments that constitute the update from the previous scene state to the current scene state. Steps 1204 to 1208 are performed for each fragment.
Step 1204 determines the fragment's properties, such as whether the fragment is related to the current state or the previous state, which objects contribute to the fragment's pixel representation, and which contributing object is the dominant object. If an object dominates the fragment's pixel representation, the object's rectangle pertaining to the previous state is used as a reference window for temporal reference, and the fragment may be inter encoded. If multiple objects dominate the fragment's representation, a union of multiple previous-state rectangles may be used as a reference window. Alternatively, the fragment's current bounding rectangle may be used as a reference window.
The fragment's properties as determined by step 1204 are used in step 1205 to form a hash value that uniquely describes the fragment. A hash value typically includes the coordinates of the fragment's rectangle, the properties of contributing objects, and encoding attributes that may be used to distinguish encoder-specific variants of otherwise equivalent fragments, such as profile, level or other codec-specific settings, differences in quantization, use of the loop filter, etc. If the fragment has a reference window, the hash is extended with the coordinates of the reference window in pixel units, the properties of the objects contributing to the reference window, and the transformation matrix of the dominant object. All in all, the hash uniquely describes the fragment that encodes the scene's current state for the fragment's rectangle and, if a temporal relation could be established, the transition from the scene's previous state to the current state.
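A hedged sketch of composing such a hash value from the properties listed above, assuming SHA-1 and a repr-based serialization; the exact fields and their encoding are assumptions.

```python
import hashlib

def fragment_hash(rect, objects, enc_attrs,
                  reference_window=None, dominant_transform=None):
    h = hashlib.sha1()
    h.update(repr(rect).encode())            # fragment rectangle coordinates
    for obj in objects:
        h.update(repr(obj).encode())         # contributing object properties
    h.update(repr(enc_attrs).encode())       # profile, quantization, loop filter...
    if reference_window is not None:         # only for temporally related fragments
        h.update(repr(reference_window).encode())
        h.update(repr(dominant_transform).encode())
    return h.hexdigest()
```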
In step 1206 the hash uniquely identifying the fragment is checked against a hash table. If the hash cannot be found in the hash table, the fragment description is forwarded to the fragment encoding module and step 1207 is applied. If the hash is found in the hash table, the associated encoded fragment is retrieved from the fragment caching module and step 1208 is applied.
In step 1207 fragments are encoded into a stitchable fragment from pixel data pertaining to the current scene state and, if available, pixel data pertaining to the previous scene state and metadata obtained from the scene's state change (such as, for example, the type of fragment, the transformation matrices of the objects contributing to the fragment, and changes in the translucency of the objects). It is in this step that many efficiency and quality improvements are achieved. Many steps in the encoding process, such as the intra/inter decision, the selection of partitions, motion estimation and weighted prediction parameters, benefit from the metadata because it allows for derivation of the spatial or temporal relations relevant to the encoding process. Examples of such benefits are provided in the remainder of this document. Once a fragment has been encoded, the fragment is stored in the fragment caching module and step 1208 is applied.
Step 1208 forwards stitchable fragments to the stitching module.
It should be noted that objects are generally handled as atomic entities, except for the background object. The background object is a fixed object at infinite distance that spans the entire field of view. A consequence of treating the background as an atomic entity is that small changes to the background would potentially permeate the hash values of all fragments in which the background is visible. Therefore, the background texture is treated in the same way as disclosed in U.S. application Ser. No. 13/445,104 (Graphical Application Integration with MPEG Objects), the contents of which are hereby incorporated by reference, and changes to the background only have consequences for the fragments overlapping the dirty rectangles of the background.
The following examples relate to a DOM-based application embodiment equivalent to the embodiments described above.
The present invention may be embodied in many different forms, including, but in no way limited to, computer program logic for use with a processor (e.g., a microprocessor, microcontroller, digital signal processor, or general purpose computer), programmable logic for use with a programmable logic device (e.g., a Field Programmable Gate Array (FPGA) or other PLD), discrete components, integrated circuitry (e.g., an Application Specific Integrated Circuit (ASIC)), or any other means including any combination thereof. In an embodiment of the present invention, predominantly all of the reordering logic may be implemented as a set of computer program instructions that is converted into a computer executable form, stored as such in a computer readable medium, and executed by a microprocessor within the array under the control of an operating system.
Computer program logic implementing all or part of the functionality previously described herein may be embodied in various forms, including, but in no way limited to, a source code form, a computer executable form, and various intermediate forms (e.g., forms generated by an assembler, compiler, linker, or locator.) Source code may include a series of computer program instructions implemented in any of various programming languages (e.g., an object code, an assembly language, or a high-level language such as Fortran, C, C++, JAVA, or HTML) for use with various operating systems or operating environments. The source code may define and use various data structures and communication messages. The source code may be in a computer executable form (e.g., via an interpreter), or the source code may be converted (e.g., via a translator, assembler, or compiler) into a computer executable form.
The computer program may be fixed in any form (e.g., source code form, computer executable form, or an intermediate form) either permanently or transitorily in a tangible storage medium, such as a semiconductor memory device (e.g., a RAM, ROM, PROM, EEPROM, or Flash-Programmable RAM), a magnetic memory device (e.g., a diskette or fixed disk), an optical memory device (e.g., a CD-ROM), a PC card (e.g., PCMCIA card), or other memory device. The computer program may be fixed in any form in a signal that is transmittable to a computer using any of various communication technologies, including, but in no way limited to, analog technologies, digital technologies, optical technologies, wireless technologies, networking technologies, and inter-networking technologies. The computer program may be distributed in any form as a removable storage medium with accompanying printed or electronic documentation (e.g., shrink wrapped software or a magnetic tape), preloaded with a computer system (e.g., on system ROM or fixed disk), or distributed from a server or electronic bulletin board over the communication system (e.g., the Internet or World Wide Web.)
Hardware logic (including programmable logic for use with a programmable logic device) implementing all or part of the functionality previously described herein may be designed using traditional manual methods, or may be designed, captured, simulated, or documented electronically using various tools, such as Computer Aided Design (CAD), a hardware description language (e.g., VHDL or AHDL), or a PLD programming language (e.g., PALASM, ABEL, or CUPL.)
While the invention has been particularly shown and described with reference to specific embodiments, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the appended clauses.
Embodiments of the present invention may be described, without limitation, by the following clauses. While these embodiments have been described in the clauses by process steps, an apparatus comprising a computer with associated display capable of executing the process steps in the clauses below is also included in the present invention. Likewise, a computer program product including computer executable instructions for executing the process steps in the clauses below and stored on a computer readable medium is included within the present invention.
This application is a continuation-in-part of U.S. patent application Ser. No. 13/911,948, filed Jun. 6, 2013. This prior application is incorporated by reference herein in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
3889050 | Thompson | Jun 1975 | A |
3934079 | Barnhart | Jan 1976 | A |
3997718 | Ricketts et al. | Dec 1976 | A |
4002843 | Rackman | Jan 1977 | A |
4032972 | Saylor | Jun 1977 | A |
4077006 | Nicholson | Feb 1978 | A |
4081831 | Tang et al. | Mar 1978 | A |
4107734 | Percy et al. | Aug 1978 | A |
4107735 | Frohbach | Aug 1978 | A |
4145720 | Weintraub et al. | Mar 1979 | A |
4168400 | de Couasnon et al. | Sep 1979 | A |
4186438 | Benson et al. | Jan 1980 | A |
4222068 | Thompson | Sep 1980 | A |
4245245 | Matsumoto et al. | Jan 1981 | A |
4247106 | Jeffers et al. | Jan 1981 | A |
4253114 | Tang et al. | Feb 1981 | A |
4264924 | Freeman | Apr 1981 | A |
4264925 | Freeman et al. | Apr 1981 | A |
4290142 | Schnee et al. | Sep 1981 | A |
4302771 | Gargini | Nov 1981 | A |
4308554 | Percy et al. | Dec 1981 | A |
4350980 | Ward | Sep 1982 | A |
4367557 | Stern et al. | Jan 1983 | A |
4395780 | Gohm et al. | Jul 1983 | A |
4408225 | Ensinger et al. | Oct 1983 | A |
4450477 | Lovett | May 1984 | A |
4454538 | Toriumi | Jun 1984 | A |
4466017 | Banker | Aug 1984 | A |
4471380 | Mobley | Sep 1984 | A |
4475123 | Dumbauld et al. | Oct 1984 | A |
4484217 | Block et al. | Nov 1984 | A |
4491983 | Pinnow et al. | Jan 1985 | A |
4506387 | Walter | Mar 1985 | A |
4507680 | Freeman | Mar 1985 | A |
4509073 | Baran et al. | Apr 1985 | A |
4523228 | Banker | Jun 1985 | A |
4533948 | McNamara et al. | Aug 1985 | A |
4536791 | Campbell et al. | Aug 1985 | A |
4538174 | Gargini et al. | Aug 1985 | A |
4538176 | Nakajima et al. | Aug 1985 | A |
4553161 | Citta | Nov 1985 | A |
4554581 | Tentler et al. | Nov 1985 | A |
4555561 | Sugimori et al. | Nov 1985 | A |
4562465 | Glaab | Dec 1985 | A |
4567517 | Mobley | Jan 1986 | A |
4573072 | Freeman | Feb 1986 | A |
4591906 | Morales-Garza et al. | May 1986 | A |
4602279 | Freeman | Jul 1986 | A |
4614970 | Clupper et al. | Sep 1986 | A |
4616263 | Eichelberger | Oct 1986 | A |
4625235 | Watson | Nov 1986 | A |
4627105 | Ohashi et al. | Dec 1986 | A |
4633462 | Stifle et al. | Dec 1986 | A |
4670904 | Rumreich | Jun 1987 | A |
4682360 | Frederiksen | Jul 1987 | A |
4695880 | Johnson et al. | Sep 1987 | A |
4706121 | Young | Nov 1987 | A |
4706285 | Rumreich | Nov 1987 | A |
4709418 | Fox et al. | Nov 1987 | A |
4710971 | Nozaki et al. | Dec 1987 | A |
4718086 | Rumreich et al. | Jan 1988 | A |
4732764 | Hemingway et al. | Mar 1988 | A |
4734764 | Pocock et al. | Mar 1988 | A |
4748689 | Mohr | May 1988 | A |
4749992 | Fitzemeyer et al. | Jun 1988 | A |
4750036 | Martinez | Jun 1988 | A |
4754426 | Rast et al. | Jun 1988 | A |
4760442 | O'Connell et al. | Jul 1988 | A |
4763317 | Lehman et al. | Aug 1988 | A |
4769833 | Farleigh et al. | Sep 1988 | A |
4769838 | Hasegawa | Sep 1988 | A |
4789863 | Bush | Dec 1988 | A |
4792849 | McCalley et al. | Dec 1988 | A |
4801190 | Imoto | Jan 1989 | A |
4805134 | Calo et al. | Feb 1989 | A |
4807031 | Broughton et al. | Feb 1989 | A |
4816905 | Tweety et al. | Mar 1989 | A |
4821102 | Ichikawa et al. | Apr 1989 | A |
4823386 | Dumbauld et al. | Apr 1989 | A |
4827253 | Maltz | May 1989 | A |
4827511 | Masuko | May 1989 | A |
4829372 | McCalley et al. | May 1989 | A |
4829558 | Welsh | May 1989 | A |
4847698 | Freeman | Jul 1989 | A |
4847699 | Freeman | Jul 1989 | A |
4847700 | Freeman | Jul 1989 | A |
4848698 | Newell et al. | Jul 1989 | A |
4860379 | Schoeneberger et al. | Aug 1989 | A |
4864613 | Van Cleave | Sep 1989 | A |
4876592 | Von Kohorn | Oct 1989 | A |
4889369 | Albrecht | Dec 1989 | A |
4890320 | Monslow et al. | Dec 1989 | A |
4891694 | Way | Jan 1990 | A |
4901367 | Nicholson | Feb 1990 | A |
4903126 | Kassatly | Feb 1990 | A |
4905094 | Pocock et al. | Feb 1990 | A |
4912760 | West, Jr. et al. | Mar 1990 | A |
4918516 | Freeman | Apr 1990 | A |
4920566 | Robbins et al. | Apr 1990 | A |
4922532 | Farmer et al. | May 1990 | A |
4924303 | Brandon et al. | May 1990 | A |
4924498 | Farmer et al. | May 1990 | A |
4937821 | Boulton | Jun 1990 | A |
4941040 | Pocock et al. | Jul 1990 | A |
4947244 | Fenwick et al. | Aug 1990 | A |
4961211 | Tsugane et al. | Oct 1990 | A |
4963995 | Lang | Oct 1990 | A |
4975771 | Kassatly | Dec 1990 | A |
4989245 | Bennett | Jan 1991 | A |
4994909 | Graves et al. | Feb 1991 | A |
4995078 | Monslow et al. | Feb 1991 | A |
5003384 | Durden et al. | Mar 1991 | A |
5008934 | Endoh | Apr 1991 | A |
5014125 | Pocock et al. | May 1991 | A |
5027400 | Baji et al. | Jun 1991 | A |
5051720 | Kittirutsunetorn | Sep 1991 | A |
5051822 | Rhoades | Sep 1991 | A |
5057917 | Shalkauser et al. | Oct 1991 | A |
5058160 | Banker et al. | Oct 1991 | A |
5060262 | Bevins, Jr. et al. | Oct 1991 | A |
5077607 | Johnson et al. | Dec 1991 | A |
5083800 | Lockton | Jan 1992 | A |
5088111 | McNamara et al. | Feb 1992 | A |
5093718 | Hoarty et al. | Mar 1992 | A |
5109414 | Harvey et al. | Apr 1992 | A |
5113496 | McCalley et al. | May 1992 | A |
5119188 | McCalley et al. | Jun 1992 | A |
5130792 | Tindell et al. | Jul 1992 | A |
5132992 | Yurt et al. | Jul 1992 | A |
5133009 | Rumreich | Jul 1992 | A |
5133079 | Ballantyne et al. | Jul 1992 | A |
5136411 | Paik et al. | Aug 1992 | A |
5142575 | Farmer et al. | Aug 1992 | A |
5144448 | Hombaker, III et al. | Sep 1992 | A |
5155591 | Wachob | Oct 1992 | A |
5172413 | Bradley et al. | Dec 1992 | A |
5191410 | McCalley et al. | Mar 1993 | A |
5195092 | Wilson et al. | Mar 1993 | A |
5208665 | McCalley et al. | May 1993 | A |
5220420 | Hoarty et al. | Jun 1993 | A |
5230019 | Yanagimichi et al. | Jul 1993 | A |
5231494 | Wachob | Jul 1993 | A |
5236199 | Thompson, Jr. | Aug 1993 | A |
5247347 | Letteral et al. | Sep 1993 | A |
5253341 | Rozmanith et al. | Oct 1993 | A |
5262854 | Ng | Nov 1993 | A |
5262860 | Fitzpatrick et al. | Nov 1993 | A |
5303388 | Kreitman et al. | Apr 1994 | A |
5319455 | Hoarty et al. | Jun 1994 | A |
5319707 | Wasilewski et al. | Jun 1994 | A |
5321440 | Yanagihara et al. | Jun 1994 | A |
5321514 | Martinez | Jun 1994 | A |
5351129 | Lai | Sep 1994 | A |
5355162 | Yazolino et al. | Oct 1994 | A |
5359601 | Wasilewski et al. | Oct 1994 | A |
5361091 | Hoarty et al. | Nov 1994 | A |
5371532 | Gelman et al. | Dec 1994 | A |
5404393 | Remillard | Apr 1995 | A |
5408274 | Chang et al. | Apr 1995 | A |
5410343 | Coddington et al. | Apr 1995 | A |
5410344 | Graves et al. | Apr 1995 | A |
5412415 | Cook et al. | May 1995 | A |
5412720 | Hoarty | May 1995 | A |
5418559 | Blahut | May 1995 | A |
5422674 | Hooper et al. | Jun 1995 | A |
5422887 | Diepstraten et al. | Jun 1995 | A |
5442389 | Blahut et al. | Aug 1995 | A |
5442390 | Hooper et al. | Aug 1995 | A |
5442700 | Snell et al. | Aug 1995 | A |
5446490 | Blahut et al. | Aug 1995 | A |
5469283 | Vinel et al. | Nov 1995 | A |
5469431 | Wendorf et al. | Nov 1995 | A |
5471263 | Odaka | Nov 1995 | A |
5481542 | Logston et al. | Jan 1996 | A |
5485197 | Hoarty | Jan 1996 | A |
5487066 | McNamara et al. | Jan 1996 | A |
5493638 | Hooper et al. | Feb 1996 | A |
5495283 | Cowe | Feb 1996 | A |
5495295 | Long | Feb 1996 | A |
5497187 | Banker et al. | Mar 1996 | A |
5517250 | Hoogenboom et al. | May 1996 | A |
5526034 | Hoarty et al. | Jun 1996 | A |
5528281 | Grady et al. | Jun 1996 | A |
5537397 | Abramson | Jul 1996 | A |
5537404 | Bentley et al. | Jul 1996 | A |
5539449 | Blahut et al. | Jul 1996 | A |
RE35314 | Logg | Aug 1996 | E |
5548340 | Bertram | Aug 1996 | A |
5550578 | Hoarty et al. | Aug 1996 | A |
5557316 | Hoarty et al. | Sep 1996 | A |
5559549 | Hendricks et al. | Sep 1996 | A |
5561708 | Remillard | Oct 1996 | A |
5570126 | Blahut et al. | Oct 1996 | A |
5570363 | Holm | Oct 1996 | A |
5579143 | Huber | Nov 1996 | A |
5581653 | Todd | Dec 1996 | A |
5583927 | Ely et al. | Dec 1996 | A |
5587734 | Lauder et al. | Dec 1996 | A |
5589885 | Ooi | Dec 1996 | A |
5592470 | Rudrapatna et al. | Jan 1997 | A |
5594507 | Hoarty | Jan 1997 | A |
5594723 | Tibi | Jan 1997 | A |
5594938 | Engel | Jan 1997 | A |
5596693 | Needle et al. | Jan 1997 | A |
5600364 | Hendricks et al. | Feb 1997 | A |
5600573 | Hendricks et al. | Feb 1997 | A |
5608446 | Carr et al. | Mar 1997 | A |
5617145 | Huang et al. | Apr 1997 | A |
5621464 | Teo et al. | Apr 1997 | A |
5625404 | Grady et al. | Apr 1997 | A |
5630757 | Gagin et al. | May 1997 | A |
5631693 | Wunderlich et al. | May 1997 | A |
5631846 | Szurkowski | May 1997 | A |
5632003 | Davidson et al. | May 1997 | A |
5649283 | Galler et al. | Jul 1997 | A |
5668592 | Spaulding, II | Sep 1997 | A |
5668599 | Cheney et al. | Sep 1997 | A |
5708767 | Yeo et al. | Jan 1998 | A |
5710815 | Ming et al. | Jan 1998 | A |
5712906 | Grady et al. | Jan 1998 | A |
5740307 | Lane | Apr 1998 | A |
5742289 | Naylor et al. | Apr 1998 | A |
5748234 | Lippincott | May 1998 | A |
5754941 | Sharpe et al. | May 1998 | A |
5786527 | Tarte | Jul 1998 | A |
5790174 | Richard, III et al. | Aug 1998 | A |
5802283 | Grady et al. | Sep 1998 | A |
5812665 | Hoarty et al. | Sep 1998 | A |
5812786 | Seazholtz et al. | Sep 1998 | A |
5815604 | Simons et al. | Sep 1998 | A |
5818438 | Howe et al. | Oct 1998 | A |
5821945 | Yeo et al. | Oct 1998 | A |
5822537 | Katseff et al. | Oct 1998 | A |
5828371 | Cline et al. | Oct 1998 | A |
5844594 | Ferguson | Dec 1998 | A |
5845083 | Hamadani et al. | Dec 1998 | A |
5862325 | Reed et al. | Jan 1999 | A |
5864820 | Case | Jan 1999 | A |
5867208 | McLaren | Feb 1999 | A |
5883661 | Hoarty | Mar 1999 | A |
5903727 | Nielsen | May 1999 | A |
5903816 | Broadwin et al. | May 1999 | A |
5905522 | Lawler | May 1999 | A |
5907681 | Bates et al. | May 1999 | A |
5917822 | Lyles et al. | Jun 1999 | A |
5946352 | Rowlands et al. | Aug 1999 | A |
5952943 | Walsh et al. | Sep 1999 | A |
5959690 | Toebes et al. | Sep 1999 | A |
5961603 | Kunkel et al. | Oct 1999 | A |
5963203 | Goldberg et al. | Oct 1999 | A |
5966163 | Lin et al. | Oct 1999 | A |
5978756 | Walker et al. | Nov 1999 | A |
5982445 | Eyer et al. | Nov 1999 | A |
5990862 | Lewis | Nov 1999 | A |
5995146 | Rasmusse | Nov 1999 | A |
5995488 | Kalhunte et al. | Nov 1999 | A |
5999970 | Krisbergh et al. | Dec 1999 | A |
6014416 | Shin et al. | Jan 2000 | A |
6021386 | Davis et al. | Feb 2000 | A |
6031989 | Cordell | Feb 2000 | A |
6034678 | Hoarty et al. | Mar 2000 | A |
6049539 | Lee et al. | Apr 2000 | A |
6049831 | Gardell et al. | Apr 2000 | A |
6052555 | Ferguson | Apr 2000 | A |
6055314 | Spies et al. | Apr 2000 | A |
6055315 | Doyle et al. | Apr 2000 | A |
6064377 | Hoarty et al. | May 2000 | A |
6078328 | Schumann et al. | Jun 2000 | A |
6084908 | Chiang et al. | Jul 2000 | A |
6100883 | Hoarty | Aug 2000 | A |
6108625 | Kim | Aug 2000 | A |
6131182 | Beakes et al. | Oct 2000 | A |
6141645 | Chi-Min et al. | Oct 2000 | A |
6141693 | Perlman et al. | Oct 2000 | A |
6144698 | Poon et al. | Nov 2000 | A |
6167084 | Wang et al. | Dec 2000 | A |
6169573 | Sampath-Kumar et al. | Jan 2001 | B1 |
6177931 | Alexander et al. | Jan 2001 | B1 |
6182072 | Leak et al. | Jan 2001 | B1 |
6184878 | Alonso et al. | Feb 2001 | B1 |
6192081 | Chiang et al. | Feb 2001 | B1 |
6198822 | Doyle et al. | Mar 2001 | B1 |
6205582 | Hoarty | Mar 2001 | B1 |
6226041 | Florencio et al. | May 2001 | B1 |
6236730 | Cowieson et al. | May 2001 | B1 |
6243418 | Kim | Jun 2001 | B1 |
6253238 | Lauder et al. | Jun 2001 | B1 |
6256047 | Isobe et al. | Jul 2001 | B1 |
6259826 | Pollard et al. | Jul 2001 | B1 |
6266369 | Wang et al. | Jul 2001 | B1 |
6266684 | Kraus et al. | Jul 2001 | B1 |
6275496 | Burns et al. | Aug 2001 | B1 |
6292194 | Powell, III | Sep 2001 | B1 |
6305020 | Hoarty et al. | Oct 2001 | B1 |
6317151 | Ohsuga et al. | Nov 2001 | B1 |
6317885 | Fries | Nov 2001 | B1 |
6349284 | Park et al. | Feb 2002 | B1 |
6385771 | Gordon | May 2002 | B1 |
6386980 | Nishino et al. | May 2002 | B1 |
6389075 | Wang et al. | May 2002 | B2 |
6389218 | Gordon et al. | May 2002 | B2 |
6415031 | Colligan et al. | Jul 2002 | B1 |
6415437 | Ludvig et al. | Jul 2002 | B1 |
6438140 | Jungers et al. | Aug 2002 | B1 |
6446037 | Fielder et al. | Sep 2002 | B1 |
6459427 | Mao et al. | Oct 2002 | B1 |
6477182 | Calderone | Nov 2002 | B2 |
6480210 | Martino et al. | Nov 2002 | B1 |
6481012 | Gordon et al. | Nov 2002 | B1 |
6512793 | Maeda | Jan 2003 | B1 |
6525746 | Lau et al. | Feb 2003 | B1 |
6536043 | Guedalia | Mar 2003 | B1 |
6557041 | Mallart | Apr 2003 | B2 |
6560496 | Michener | May 2003 | B1 |
6564378 | Satterfield et al. | May 2003 | B1 |
6578201 | LaRocca et al. | Jun 2003 | B1 |
6579184 | Tanskanen | Jun 2003 | B1 |
6584153 | Gordon et al. | Jun 2003 | B1 |
6588017 | Calderone | Jul 2003 | B1 |
6598229 | Smyth et al. | Jul 2003 | B2 |
6604224 | Armstrong et al. | Aug 2003 | B1 |
6614442 | Ouyang et al. | Sep 2003 | B1 |
6621870 | Gordon et al. | Sep 2003 | B1 |
6625574 | Taniguchi et al. | Sep 2003 | B1 |
6639896 | Goode et al. | Oct 2003 | B1 |
6645076 | Sugai | Nov 2003 | B1 |
6651252 | Gordon et al. | Nov 2003 | B1 |
6657647 | Bright | Dec 2003 | B1 |
6675385 | Wang | Jan 2004 | B1 |
6675387 | Boucher | Jan 2004 | B1 |
6681326 | Son et al. | Jan 2004 | B2 |
6681397 | Tsai et al. | Jan 2004 | B1 |
6684400 | Goode et al. | Jan 2004 | B1 |
6687663 | McGrath et al. | Feb 2004 | B1 |
6691208 | Dandrea et al. | Feb 2004 | B2 |
6697376 | Son et al. | Feb 2004 | B1 |
6704359 | Bayrakeri et al. | Mar 2004 | B1 |
6717600 | Dutta et al. | Apr 2004 | B2 |
6718552 | Goode | Apr 2004 | B1 |
6721794 | Taylor et al. | Apr 2004 | B2 |
6721956 | Wasilewski | Apr 2004 | B2 |
6727929 | Bates et al. | Apr 2004 | B1 |
6731605 | Deshpande | May 2004 | B1 |
6732370 | Gordon et al. | May 2004 | B1 |
6747991 | Hemy et al. | Jun 2004 | B1 |
6754271 | Gordon et al. | Jun 2004 | B1 |
6754905 | Gordon et al. | Jun 2004 | B2 |
6758540 | Adolph et al. | Jul 2004 | B1 |
6766407 | Lisitsa et al. | Jul 2004 | B1 |
6771704 | Hannah | Aug 2004 | B1 |
6785902 | Zigmond et al. | Aug 2004 | B1 |
6807528 | Truman et al. | Oct 2004 | B1 |
6810528 | Chatani | Oct 2004 | B1 |
6813690 | Lango et al. | Nov 2004 | B1 |
6817947 | Tanskanen | Nov 2004 | B2 |
6886178 | Mao et al. | Apr 2005 | B1 |
6907574 | Xu et al. | Jun 2005 | B2 |
6931291 | Alvarez-Tinoco et al. | Aug 2005 | B1 |
6941019 | Mitchell et al. | Sep 2005 | B1 |
6941574 | Broadwin et al. | Sep 2005 | B1 |
6947509 | Wong | Sep 2005 | B1 |
6952221 | Holtz et al. | Oct 2005 | B1 |
6956899 | Hall et al. | Oct 2005 | B2 |
7016540 | Gong et al. | Mar 2006 | B1 |
7030890 | Jouet et al. | Apr 2006 | B1 |
7031385 | Inoue et al. | Apr 2006 | B1 |
7050113 | Campisano et al. | May 2006 | B2 |
7089577 | Rakib et al. | Aug 2006 | B1 |
7093028 | Shao et al. | Aug 2006 | B1 |
7095402 | Kunii et al. | Aug 2006 | B2 |
7114167 | Slemmer et al. | Sep 2006 | B2 |
7146615 | Hervet et al. | Dec 2006 | B1 |
7151782 | Oz et al. | Dec 2006 | B1 |
7158676 | Rainsford | Jan 2007 | B1 |
7200836 | Brodersen et al. | Apr 2007 | B2 |
7212573 | Winger | May 2007 | B2 |
7224731 | Mehrotra | May 2007 | B2 |
7272556 | Aguilar et al. | Sep 2007 | B1 |
7310619 | Baar et al. | Dec 2007 | B2 |
7325043 | Rosenberg et al. | Jan 2008 | B1 |
7346111 | Winger et al. | Mar 2008 | B2 |
7360230 | Paz et al. | Apr 2008 | B1 |
7412423 | Asano | Aug 2008 | B1 |
7412505 | Slemmer et al. | Aug 2008 | B2 |
7421082 | Kamiya et al. | Sep 2008 | B2 |
7444306 | Varble | Oct 2008 | B2 |
7444418 | Chou et al. | Oct 2008 | B2 |
7500235 | Maynard et al. | Mar 2009 | B2 |
7508941 | O'Toole, Jr. et al. | Mar 2009 | B1 |
7512577 | Slemmer et al. | Mar 2009 | B2 |
7543073 | Chou et al. | Jun 2009 | B2 |
7596764 | Vienneau et al. | Sep 2009 | B2 |
7623575 | Winger | Nov 2009 | B2 |
7669220 | Goode | Feb 2010 | B2 |
7742609 | Yeakel et al. | Jun 2010 | B2 |
7743400 | Kurauchi | Jun 2010 | B2 |
7751572 | Villemoes et al. | Jul 2010 | B2 |
7757157 | Fukuda | Jul 2010 | B1 |
7830388 | Lu | Nov 2010 | B1 |
7840905 | Weber et al. | Nov 2010 | B1 |
7936819 | Craig et al. | May 2011 | B2 |
7941645 | Riach et al. | May 2011 | B1 |
7970263 | Asch | Jun 2011 | B1 |
7987489 | Krzyzanowski et al. | Jul 2011 | B2 |
8027353 | Damola et al. | Sep 2011 | B2 |
8036271 | Winger et al. | Oct 2011 | B2 |
8046798 | Schlack et al. | Oct 2011 | B1 |
8074248 | Sigmon et al. | Dec 2011 | B2 |
8118676 | Craig et al. | Feb 2012 | B2 |
8136033 | Bhargava et al. | Mar 2012 | B1 |
8149917 | Zhang et al. | Apr 2012 | B2 |
8155194 | Winger et al. | Apr 2012 | B2 |
8155202 | Landau | Apr 2012 | B2 |
8170107 | Winger | May 2012 | B2 |
8194862 | Herr et al. | Jun 2012 | B2 |
8243630 | Luo et al. | Aug 2012 | B2 |
8270439 | Herr et al. | Sep 2012 | B2 |
8284842 | Craig et al. | Oct 2012 | B2 |
8296424 | Malloy et al. | Oct 2012 | B2 |
8370869 | Paek et al. | Feb 2013 | B2 |
8411754 | Zhang et al. | Apr 2013 | B2 |
8442110 | Pavlovskaia et al. | May 2013 | B2 |
8473996 | Gordon et al. | Jun 2013 | B2 |
8619867 | Craig et al. | Dec 2013 | B2 |
8621500 | Weaver et al. | Dec 2013 | B2 |
8656430 | Doyle | Feb 2014 | B2 |
20010008845 | Kusuda et al. | Jul 2001 | A1 |
20010049301 | Masuda et al. | Dec 2001 | A1 |
20020007491 | Schiller et al. | Jan 2002 | A1 |
20020013812 | Krueger et al. | Jan 2002 | A1 |
20020016161 | Dellien et al. | Feb 2002 | A1 |
20020021353 | DeNies | Feb 2002 | A1 |
20020026642 | Augenbraun et al. | Feb 2002 | A1 |
20020027567 | Niamir | Mar 2002 | A1 |
20020032697 | French et al. | Mar 2002 | A1 |
20020040482 | Sextro et al. | Apr 2002 | A1 |
20020047899 | Son et al. | Apr 2002 | A1 |
20020049975 | Thomas et al. | Apr 2002 | A1 |
20020054578 | Zhang et al. | May 2002 | A1 |
20020056083 | Istvan | May 2002 | A1 |
20020056107 | Schlack | May 2002 | A1 |
20020056136 | Wistendahl et al. | May 2002 | A1 |
20020059644 | Andrade et al. | May 2002 | A1 |
20020062484 | De Lange et al. | May 2002 | A1 |
20020067766 | Sakamoto et al. | Jun 2002 | A1 |
20020069267 | Thiele | Jun 2002 | A1 |
20020072408 | Kumagai | Jun 2002 | A1 |
20020078171 | Schneider | Jun 2002 | A1 |
20020078456 | Hudson et al. | Jun 2002 | A1 |
20020083464 | Tomsen et al. | Jun 2002 | A1 |
20020095689 | Novak | Jul 2002 | A1 |
20020105531 | Niemi | Aug 2002 | A1 |
20020108121 | Alao et al. | Aug 2002 | A1 |
20020131511 | Zenoni | Sep 2002 | A1 |
20020136298 | Anantharamu et al. | Sep 2002 | A1 |
20020152318 | Menon et al. | Oct 2002 | A1 |
20020171765 | Waki et al. | Nov 2002 | A1 |
20020175931 | Holtz et al. | Nov 2002 | A1 |
20020178447 | Plotnick et al. | Nov 2002 | A1 |
20020188628 | Cooper et al. | Dec 2002 | A1 |
20020191851 | Keinan | Dec 2002 | A1 |
20020194592 | Tsuchida et al. | Dec 2002 | A1 |
20020196746 | Allen | Dec 2002 | A1 |
20030018796 | Chou et al. | Jan 2003 | A1 |
20030020671 | Santoro et al. | Jan 2003 | A1 |
20030027517 | Callway et al. | Feb 2003 | A1 |
20030035486 | Kato et al. | Feb 2003 | A1 |
20030038893 | Rajamaki et al. | Feb 2003 | A1 |
20030039398 | McIntyre | Feb 2003 | A1 |
20030046690 | Miller | Mar 2003 | A1 |
20030051253 | Barone, Jr. | Mar 2003 | A1 |
20030058941 | Chen et al. | Mar 2003 | A1 |
20030061451 | Beyda | Mar 2003 | A1 |
20030065739 | Shnier | Apr 2003 | A1 |
20030071792 | Safadi | Apr 2003 | A1 |
20030072372 | Shen et al. | Apr 2003 | A1 |
20030076546 | Johnson et al. | Apr 2003 | A1 |
20030088328 | Nishio et al. | May 2003 | A1 |
20030088400 | Nishio et al. | May 2003 | A1 |
20030095790 | Joshi | May 2003 | A1 |
20030107443 | Yamamoto | Jun 2003 | A1 |
20030122836 | Doyle et al. | Jul 2003 | A1 |
20030123664 | Pedlow, Jr. et al. | Jul 2003 | A1 |
20030126608 | Safadi | Jul 2003 | A1 |
20030126611 | Chernock et al. | Jul 2003 | A1 |
20030131349 | Kuczynski-Brown | Jul 2003 | A1 |
20030135860 | Dureau | Jul 2003 | A1 |
20030169373 | Peters et al. | Sep 2003 | A1 |
20030177199 | Zenoni | Sep 2003 | A1 |
20030188309 | Yuen | Oct 2003 | A1 |
20030189980 | Dvir et al. | Oct 2003 | A1 |
20030196174 | Pierre Cote et al. | Oct 2003 | A1 |
20030208768 | Urdang et al. | Nov 2003 | A1 |
20030229719 | Iwata et al. | Dec 2003 | A1 |
20030229900 | Reisman | Dec 2003 | A1 |
20030231218 | Amadio | Dec 2003 | A1 |
20040016000 | Zhang et al. | Jan 2004 | A1 |
20040034873 | Zenoni | Feb 2004 | A1 |
20040040035 | Carlucci et al. | Feb 2004 | A1 |
20040078822 | Breen et al. | Apr 2004 | A1 |
20040088375 | Sethi et al. | May 2004 | A1 |
20040091171 | Bone | May 2004 | A1 |
20040111526 | Baldwin et al. | Jun 2004 | A1 |
20040117827 | Karaoguz et al. | Jun 2004 | A1 |
20040128686 | Boyer et al. | Jul 2004 | A1 |
20040133704 | Krzyzanowski et al. | Jul 2004 | A1 |
20040136698 | Mock | Jul 2004 | A1 |
20040139158 | Datta | Jul 2004 | A1 |
20040157662 | Tsuchiya | Aug 2004 | A1 |
20040163101 | Swix et al. | Aug 2004 | A1 |
20040184542 | Fujimoto | Sep 2004 | A1 |
20040193648 | Lai et al. | Sep 2004 | A1 |
20040210824 | Shoff et al. | Oct 2004 | A1 |
20040261106 | Hoffman | Dec 2004 | A1 |
20040261114 | Addington et al. | Dec 2004 | A1 |
20040268419 | Danker et al. | Dec 2004 | A1 |
20050015259 | Thumpudi et al. | Jan 2005 | A1 |
20050015816 | Christofalo et al. | Jan 2005 | A1 |
20050021830 | Urzaiz et al. | Jan 2005 | A1 |
20050034155 | Gordon et al. | Feb 2005 | A1 |
20050034162 | White et al. | Feb 2005 | A1 |
20050044575 | Der Kuyl | Feb 2005 | A1 |
20050055685 | Maynard et al. | Mar 2005 | A1 |
20050055721 | Zigmond et al. | Mar 2005 | A1 |
20050071876 | van Beek | Mar 2005 | A1 |
20050076134 | Bialik et al. | Apr 2005 | A1 |
20050089091 | Kim et al. | Apr 2005 | A1 |
20050091690 | Delpuch et al. | Apr 2005 | A1 |
20050091695 | Paz et al. | Apr 2005 | A1 |
20050105608 | Coleman et al. | May 2005 | A1 |
20050114906 | Hoarty et al. | May 2005 | A1 |
20050132305 | Guichard et al. | Jun 2005 | A1 |
20050135385 | Jenkins et al. | Jun 2005 | A1 |
20050141613 | Kelly et al. | Jun 2005 | A1 |
20050149988 | Grannan | Jul 2005 | A1 |
20050155063 | Bayrakeri | Jul 2005 | A1 |
20050160088 | Scallan et al. | Jul 2005 | A1 |
20050166257 | Feinleib et al. | Jul 2005 | A1 |
20050180502 | Puri | Aug 2005 | A1 |
20050198682 | Wright | Sep 2005 | A1 |
20050213586 | Cyganski et al. | Sep 2005 | A1 |
20050216933 | Black | Sep 2005 | A1 |
20050216940 | Black | Sep 2005 | A1 |
20050226426 | Oomen et al. | Oct 2005 | A1 |
20050273832 | Zigmond et al. | Dec 2005 | A1 |
20050283741 | Balabanovic et al. | Dec 2005 | A1 |
20060001737 | Dawson et al. | Jan 2006 | A1 |
20060020960 | Relan et al. | Jan 2006 | A1 |
20060020994 | Crane et al. | Jan 2006 | A1 |
20060031906 | Kaneda | Feb 2006 | A1 |
20060039481 | Shen et al. | Feb 2006 | A1 |
20060041910 | Hatanaka et al. | Feb 2006 | A1 |
20060088105 | Shen et al. | Apr 2006 | A1 |
20060095944 | Demircin et al. | May 2006 | A1 |
20060112338 | Joung et al. | May 2006 | A1 |
20060117340 | Pavlovskaia et al. | Jun 2006 | A1 |
20060143678 | Cho et al. | Jun 2006 | A1 |
20060161538 | Kiilerich | Jul 2006 | A1 |
20060173985 | Moore | Aug 2006 | A1 |
20060174026 | Robinson et al. | Aug 2006 | A1 |
20060174289 | Theberge | Aug 2006 | A1 |
20060195884 | van Zoest et al. | Aug 2006 | A1 |
20060203913 | Kim et al. | Sep 2006 | A1 |
20060212203 | Furuno | Sep 2006 | A1 |
20060218601 | Michel | Sep 2006 | A1 |
20060230428 | Craig et al. | Oct 2006 | A1 |
20060242570 | Croft et al. | Oct 2006 | A1 |
20060256865 | Westerman | Nov 2006 | A1 |
20060269086 | Page et al. | Nov 2006 | A1 |
20060271985 | Hoffman et al. | Nov 2006 | A1 |
20060285586 | Westerman | Dec 2006 | A1 |
20060285819 | Kelly et al. | Dec 2006 | A1 |
20070009035 | Craig et al. | Jan 2007 | A1 |
20070009036 | Craig et al. | Jan 2007 | A1 |
20070009042 | Craig | Jan 2007 | A1 |
20070025639 | Zhou et al. | Feb 2007 | A1 |
20070033528 | Merritt et al. | Feb 2007 | A1 |
20070033631 | Gordon et al. | Feb 2007 | A1 |
20070074251 | Oguz et al. | Mar 2007 | A1 |
20070079325 | de Heer | Apr 2007 | A1 |
20070115941 | Patel et al. | May 2007 | A1 |
20070124282 | Wittkotter | May 2007 | A1 |
20070124795 | McKissick et al. | May 2007 | A1 |
20070130446 | Minakami | Jun 2007 | A1 |
20070130592 | Haeusel | Jun 2007 | A1 |
20070152984 | Ording et al. | Jul 2007 | A1 |
20070162953 | Bollinger et al. | Jul 2007 | A1 |
20070172061 | Pinder | Jul 2007 | A1 |
20070174790 | Jing et al. | Jul 2007 | A1 |
20070178243 | Houck et al. | Aug 2007 | A1 |
20070234220 | Khan et al. | Oct 2007 | A1 |
20070237232 | Chang et al. | Oct 2007 | A1 |
20070300280 | Turner et al. | Dec 2007 | A1 |
20080046928 | Poling et al. | Feb 2008 | A1 |
20080052742 | Kopf et al. | Feb 2008 | A1 |
20080066135 | Brodersen et al. | Mar 2008 | A1 |
20080084503 | Kondo | Apr 2008 | A1 |
20080086688 | Chandratillake et al. | Apr 2008 | A1 |
20080094368 | Ording et al. | Apr 2008 | A1 |
20080097953 | Levy et al. | Apr 2008 | A1 |
20080098450 | Wu et al. | Apr 2008 | A1 |
20080104520 | Swenson et al. | May 2008 | A1 |
20080127255 | Ress et al. | May 2008 | A1 |
20080154583 | Goto et al. | Jun 2008 | A1 |
20080163059 | Craner | Jul 2008 | A1 |
20080163286 | Rudolph et al. | Jul 2008 | A1 |
20080170619 | Landau | Jul 2008 | A1 |
20080170622 | Gordon et al. | Jul 2008 | A1 |
20080178125 | Elsbree et al. | Jul 2008 | A1 |
20080178243 | Dong et al. | Jul 2008 | A1 |
20080178249 | Gordon et al. | Jul 2008 | A1 |
20080181221 | Kampmann et al. | Jul 2008 | A1 |
20080184120 | O'Brien-Strain et al. | Jul 2008 | A1 |
20080189740 | Carpenter et al. | Aug 2008 | A1 |
20080195573 | Onoda et al. | Aug 2008 | A1 |
20080201736 | Gordon et al. | Aug 2008 | A1 |
20080212942 | Gordon et al. | Sep 2008 | A1 |
20080222199 | Tiu et al. | Sep 2008 | A1 |
20080232452 | Sullivan et al. | Sep 2008 | A1 |
20080243918 | Holtman | Oct 2008 | A1 |
20080243998 | Oh et al. | Oct 2008 | A1 |
20080246759 | Summers | Oct 2008 | A1 |
20080253440 | Srinivasan et al. | Oct 2008 | A1 |
20080271080 | Grossweiler et al. | Oct 2008 | A1 |
20090003446 | Wu et al. | Jan 2009 | A1 |
20090003705 | Zou et al. | Jan 2009 | A1 |
20090007199 | La Joie | Jan 2009 | A1 |
20090025027 | Craner | Jan 2009 | A1 |
20090031341 | Schlack et al. | Jan 2009 | A1 |
20090041118 | Pavlovskaia et al. | Feb 2009 | A1 |
20090083781 | Yang et al. | Mar 2009 | A1 |
20090083813 | Dolce et al. | Mar 2009 | A1 |
20090083824 | McCarthy et al. | Mar 2009 | A1 |
20090089188 | Ku et al. | Apr 2009 | A1 |
20090094113 | Berry et al. | Apr 2009 | A1 |
20090094646 | Walter et al. | Apr 2009 | A1 |
20090100465 | Kulakowski | Apr 2009 | A1 |
20090100489 | Strothmann | Apr 2009 | A1 |
20090106269 | Zuckerman et al. | Apr 2009 | A1 |
20090106386 | Zuckerman et al. | Apr 2009 | A1 |
20090106392 | Zuckerman et al. | Apr 2009 | A1 |
20090106425 | Zuckerman et al. | Apr 2009 | A1 |
20090106441 | Zuckerman et al. | Apr 2009 | A1 |
20090106451 | Zuckerman et al. | Apr 2009 | A1 |
20090106511 | Zuckerman et al. | Apr 2009 | A1 |
20090113009 | Slemmer et al. | Apr 2009 | A1 |
20090132942 | Santoro et al. | May 2009 | A1 |
20090138966 | Krause et al. | May 2009 | A1 |
20090144781 | Glaser et al. | Jun 2009 | A1 |
20090146779 | Kumar et al. | Jun 2009 | A1 |
20090157868 | Chaudhry | Jun 2009 | A1 |
20090158369 | Van Vleck et al. | Jun 2009 | A1 |
20090160694 | Di Flora | Jun 2009 | A1 |
20090172757 | Aldrey et al. | Jul 2009 | A1 |
20090178098 | Westbrook et al. | Jul 2009 | A1 |
20090183219 | Maynard et al. | Jul 2009 | A1 |
20090189890 | Corbett et al. | Jul 2009 | A1 |
20090193452 | Russ et al. | Jul 2009 | A1 |
20090196346 | Zhang et al. | Aug 2009 | A1 |
20090204920 | Beverly et al. | Aug 2009 | A1 |
20090210899 | Lawrence-Apfelbaum et al. | Aug 2009 | A1 |
20090225790 | Shay et al. | Sep 2009 | A1 |
20090228620 | Thomas et al. | Sep 2009 | A1 |
20090228922 | Haj-khalil et al. | Sep 2009 | A1 |
20090233593 | Ergen et al. | Sep 2009 | A1 |
20090251478 | Maillot et al. | Oct 2009 | A1 |
20090254960 | Yarom et al. | Oct 2009 | A1 |
20090265617 | Randall et al. | Oct 2009 | A1 |
20090271512 | Jorgensen | Oct 2009 | A1 |
20090271818 | Schlack | Oct 2009 | A1 |
20090298535 | Klein et al. | Dec 2009 | A1 |
20090313674 | Ludvig et al. | Dec 2009 | A1 |
20090328109 | Pavlovskaia et al. | Dec 2009 | A1 |
20100033638 | O'Donnell et al. | Feb 2010 | A1 |
20100035682 | Gentile et al. | Feb 2010 | A1 |
20100058404 | Rouse | Mar 2010 | A1 |
20100067571 | White et al. | Mar 2010 | A1 |
20100077441 | Thomas et al. | Mar 2010 | A1 |
20100104021 | Schmit | Apr 2010 | A1 |
20100115573 | Srinivasan et al. | May 2010 | A1 |
20100118972 | Zhang et al. | May 2010 | A1 |
20100131996 | Gauld | May 2010 | A1 |
20100146139 | Brockmann | Jun 2010 | A1 |
20100158109 | Dahlby et al. | Jun 2010 | A1 |
20100161825 | Ronca et al. | Jun 2010 | A1 |
20100166071 | Wu et al. | Jul 2010 | A1 |
20100174776 | Westberg et al. | Jul 2010 | A1 |
20100175080 | Yuen et al. | Jul 2010 | A1 |
20100180307 | Hayes et al. | Jul 2010 | A1 |
20100211983 | Chou | Aug 2010 | A1 |
20100226428 | Thevathasan et al. | Sep 2010 | A1 |
20100235861 | Schein et al. | Sep 2010 | A1 |
20100242073 | Gordon et al. | Sep 2010 | A1 |
20100251167 | DeLuca et al. | Sep 2010 | A1 |
20100254370 | Jana et al. | Oct 2010 | A1 |
20100265344 | Velarde et al. | Oct 2010 | A1 |
20100325655 | Perez | Dec 2010 | A1 |
20110002376 | Ahmed et al. | Jan 2011 | A1 |
20110002470 | Purnhagen et al. | Jan 2011 | A1 |
20110023069 | Dowens | Jan 2011 | A1 |
20110035227 | Lee et al. | Feb 2011 | A1 |
20110067061 | Karaoguz et al. | Mar 2011 | A1 |
20110096828 | Chen et al. | Apr 2011 | A1 |
20110107375 | Stahl et al. | May 2011 | A1 |
20110110642 | Salomons et al. | May 2011 | A1 |
20110150421 | Sasaki et al. | Jun 2011 | A1 |
20110153776 | Opala et al. | Jun 2011 | A1 |
20110167468 | Lee et al. | Jul 2011 | A1 |
20110191684 | Greenberg | Aug 2011 | A1 |
20110231878 | Hunter et al. | Sep 2011 | A1 |
20110243024 | Osterling et al. | Oct 2011 | A1 |
20110258584 | Williams et al. | Oct 2011 | A1 |
20110289536 | Poder et al. | Nov 2011 | A1 |
20110296312 | Boyer et al. | Dec 2011 | A1 |
20110317982 | Xu et al. | Dec 2011 | A1 |
20120023126 | Jin et al. | Jan 2012 | A1 |
20120030212 | Koopmans et al. | Feb 2012 | A1 |
20120137337 | Sigmon et al. | May 2012 | A1 |
20120204217 | Regis et al. | Aug 2012 | A1 |
20120209815 | Carson et al. | Aug 2012 | A1 |
20120224641 | Haberman et al. | Sep 2012 | A1 |
20120257671 | Brockmann et al. | Oct 2012 | A1 |
20130003826 | Craig et al. | Jan 2013 | A1 |
20130071095 | Chauvier et al. | Mar 2013 | A1 |
20130086610 | Brockmann | Apr 2013 | A1 |
20130179787 | Brockmann et al. | Jul 2013 | A1 |
20130198776 | Brockmann | Aug 2013 | A1 |
20130254308 | Rose et al. | Sep 2013 | A1 |
20130272394 | Brockmann et al. | Oct 2013 | A1 |
20130304818 | Brumleve et al. | Nov 2013 | A1 |
20140033036 | Gaur et al. | Jan 2014 | A1 |
20140081954 | Elizarov | Mar 2014 | A1 |
20140267074 | Balci | Sep 2014 | A1 |
Number | Date | Country |
---|---|---|
191599 | Apr 2000 | AT |
198969 | Feb 2001 | AT |
250313 | Oct 2003 | AT |
472152 | Jul 2010 | AT |
475266 | Aug 2010 | AT |
199060189 | Nov 1990 | AU |
620735 | Feb 1992 | AU |
199184838 | Apr 1992 | AU |
643828 | Nov 1993 | AU |
2004253127 | Jan 2005 | AU |
2005278122 | Mar 2006 | AU |
2010339376 | Aug 2012 | AU |
2011249132 | Nov 2012 | AU |
2011258972 | Nov 2012 | AU |
2011315950 | May 2013 | AU |
682776 | Mar 1964 | CA |
2052477 | Mar 1992 | CA |
1302554 | Jun 1992 | CA |
2163500 | May 1996 | CA |
2231391 | May 1997 | CA |
2273365 | Jun 1998 | CA |
2313133 | Jun 1999 | CA |
2313161 | Jun 1999 | CA |
2528499 | Jan 2005 | CA |
2569407 | Mar 2006 | CA |
2728797 | Apr 2010 | CA |
2787913 | Jul 2011 | CA |
2798541 | Dec 2011 | CA |
2814070 | Apr 2012 | CA |
1507751 | Jun 2004 | CN |
1969555 | May 2007 | CN |
101180109 | May 2008 | CN |
101627424 | Jan 2010 | CN |
101637023 | Jan 2010 | CN |
102007773 | Apr 2011 | CN |
103647980 | Mar 2014 | CN |
4408355 | Oct 1994 | DE |
69516139 | Dec 2000 | DE |
69132518 | Sep 2001 | DE |
69333207 | Jul 2004 | DE |
98961961 | Aug 2007 | DE |
602008001596 | Aug 2010 | DE |
602006015650 | Sep 2010 | DE |
0128771 | Dec 1984 | EP |
0419137 | Mar 1991 | EP |
0449633 | Oct 1991 | EP |
0477786 | Apr 1992 | EP |
0523618 | Jan 1993 | EP |
0534139 | Mar 1993 | EP |
0568453 | Nov 1993 | EP |
0588653 | Mar 1994 | EP |
0594350 | Apr 1994 | EP |
0612916 | Aug 1994 | EP |
0624039 | Nov 1994 | EP |
0638219 | Feb 1995 | EP |
0643523 | Mar 1995 | EP |
0661888 | Jul 1995 | EP |
0714684 | Jun 1996 | EP |
0746158 | Dec 1996 | EP |
0761066 | Mar 1997 | EP |
0789972 | Aug 1997 | EP |
0830786 | Mar 1998 | EP |
0861560 | Sep 1998 | EP |
0881808 | Dec 1998 | EP |
0933966 | Aug 1999 | EP |
1026872 | Aug 2000 | EP |
1038397 | Sep 2000 | EP |
1038399 | Sep 2000 | EP |
1038400 | Sep 2000 | EP |
1038401 | Sep 2000 | EP |
1051039 | Nov 2000 | EP |
1055331 | Nov 2000 | EP |
1120968 | Aug 2001 | EP |
1345446 | Sep 2003 | EP |
1422929 | May 2004 | EP |
1428562 | Jun 2004 | EP |
1521476 | Apr 2005 | EP |
1645115 | Apr 2006 | EP |
1725044 | Nov 2006 | EP |
1767708 | Mar 2007 | EP |
1771003 | Apr 2007 | EP |
1772014 | Apr 2007 | EP |
1877150 | Jan 2008 | EP |
1887148 | Feb 2008 | EP |
1900200 | Mar 2008 | EP |
1902583 | Mar 2008 | EP |
1908293 | Apr 2008 | EP |
1911288 | Apr 2008 | EP |
1918802 | May 2008 | EP |
2100296 | Sep 2009 | EP |
2105019 | Sep 2009 | EP |
2106665 | Oct 2009 | EP |
2116051 | Nov 2009 | EP |
2124440 | Nov 2009 | EP |
2248341 | Nov 2010 | EP |
2269377 | Jan 2011 | EP |
2271098 | Jan 2011 | EP |
2304953 | Apr 2011 | EP |
2364019 | Sep 2011 | EP |
2384001 | Nov 2011 | EP |
2409493 | Jan 2012 | EP |
2477414 | Jul 2012 | EP |
2487919 | Aug 2012 | EP |
2520090 | Nov 2012 | EP |
2567545 | Mar 2013 | EP |
2577437 | Apr 2013 | EP |
2628306 | Aug 2013 | EP |
2632164 | Aug 2013 | EP |
2632165 | Aug 2013 | EP |
2695388 | Feb 2014 | EP |
2207635 | Jun 2004 | ES |
8211463 | Jun 1982 | FR |
2529739 | Jan 1984 | FR |
2891098 | Mar 2007 | FR |
2207838 | Feb 1989 | GB |
2248955 | Apr 1992 | GB |
2290204 | Dec 1995 | GB |
2365649 | Feb 2002 | GB |
2378345 | Feb 2003 | GB |
1134855 | Oct 2010 | HK |
1116323 | Dec 2010 | HK |
19913397 | Apr 1992 | IE |
99586 | Feb 1998 | IL |
215133 | Dec 2011 | IL |
222829 | Dec 2012 | IL |
222830 | Dec 2012 | IL |
225525 | Jun 2013 | IL |
180215 | Jan 1998 | IN |
200701744 | Nov 2007 | IN |
200900856 | May 2009 | IN |
200800214 | Jun 2009 | IN |
3759 | Mar 1992 | IS |
60-054324 | Mar 1985 | JP |
63-033988 | Feb 1988 | JP |
63-263985 | Oct 1988 | JP |
01-241993 | Sep 1989 | JP |
04-373286 | Dec 1992 | JP |
06-054324 | Feb 1994 | JP |
7015720 | Jan 1995 | JP |
7-160292 | Jun 1995 | JP |
8095599 | Apr 1996 | JP |
8-265704 | Oct 1996 | JP |
10-228437 | Aug 1998 | JP |
11-134273 | May 1999 | JP |
H11-261966 | Sep 1999 | JP |
2000-152234 | May 2000 | JP |
2001-203995 | Jul 2001 | JP |
2001-245271 | Sep 2001 | JP |
2001-245291 | Sep 2001 | JP |
2001-514471 | Sep 2001 | JP |
2002-016920 | Jan 2002 | JP |
2002-057952 | Feb 2002 | JP |
2002-112220 | Apr 2002 | JP |
2002-141810 | May 2002 | JP |
2002-208027 | Jul 2002 | JP |
2002-319991 | Oct 2002 | JP |
2003-506763 | Feb 2003 | JP |
2003-087785 | Mar 2003 | JP |
2004-501445 | Jan 2004 | JP |
2004-056777 | Feb 2004 | JP |
2004-110850 | Apr 2004 | JP |
2004-112441 | Apr 2004 | JP |
2004-135932 | May 2004 | JP |
2004-264812 | Sep 2004 | JP |
2004-312283 | Nov 2004 | JP |
2004-533736 | Nov 2004 | JP |
2004-536381 | Dec 2004 | JP |
2004-536681 | Dec 2004 | JP |
2005-033741 | Feb 2005 | JP |
2005-084987 | Mar 2005 | JP |
2005-095599 | Mar 2005 | JP |
8-095599 | Apr 2005 | JP |
2005-156996 | Jun 2005 | JP |
2005-519382 | Jun 2005 | JP |
2005-523479 | Aug 2005 | JP |
2005-309752 | Nov 2005 | JP |
2006-067280 | Mar 2006 | JP |
2006-512838 | Apr 2006 | JP |
2007-129296 | May 2007 | JP |
2007-522727 | Aug 2007 | JP |
11-88419 | Sep 2007 | JP |
2008-523880 | Jul 2008 | JP |
2008-535622 | Sep 2008 | JP |
04252727 | Apr 2009 | JP |
2009-543386 | Dec 2009 | JP |
2012-080593 | Apr 2012 | JP |
04996603 | Aug 2012 | JP |
05121711 | Jan 2013 | JP |
53-004612 | Oct 2013 | JP |
05331008 | Oct 2013 | JP |
05405819 | Feb 2014 | JP |
10-2005-0001362 | Jan 2005 | KR |
10-2005-0085827 | Aug 2005 | KR |
2006067924 | Jun 2006 | KR |
10-2006-0095821 | Sep 2006 | KR |
2007038111 | Apr 2007 | KR |
20080001298 | Jan 2008 | KR |
2008024189 | Mar 2008 | KR |
2010111739 | Oct 2010 | KR |
2010120187 | Nov 2010 | KR |
2010127240 | Dec 2010 | KR |
2011030640 | Mar 2011 | KR |
2011129477 | Dec 2011 | KR |
20120112683 | Oct 2012 | KR |
2013061149 | Jun 2013 | KR |
2013113925 | Oct 2013 | KR |
1333200 | Nov 2013 | KR |
2008045154 | Nov 2013 | KR |
2013138263 | Dec 2013 | KR |
1032594 | Apr 2008 | NL |
1033929 | Apr 2008 | NL |
2004670 | Nov 2011 | NL |
2004780 | Jan 2012 | NL |
239969 | Dec 1994 | NZ |
99110 | Dec 1993 | PT |
WO 8202303 | Jul 1982 | WO |
WO 8908967 | Sep 1989 | WO |
WO 9013972 | Nov 1990 | WO |
WO 9322877 | Nov 1993 | WO |
WO 9416534 | Jul 1994 | WO |
WO 9419910 | Sep 1994 | WO |
WO 9421079 | Sep 1994 | WO |
WO 9515658 | Jun 1995 | WO |
WO 9532587 | Nov 1995 | WO |
WO 9533342 | Dec 1995 | WO |
WO 9614712 | May 1996 | WO |
WO 9627843 | Sep 1996 | WO |
WO 9631826 | Oct 1996 | WO |
WO 9637074 | Nov 1996 | WO |
WO 9642168 | Dec 1996 | WO |
WO 9716925 | May 1997 | WO |
WO 9733434 | Sep 1997 | WO |
WO 9739583 | Oct 1997 | WO |
WO 9826595 | Jun 1998 | WO |
WO 9900735 | Jan 1999 | WO |
WO 9904568 | Jan 1999 | WO |
WO 9930496 | Jun 1999 | WO |
WO 9930497 | Jun 1999 | WO |
WO 9930500 | Jun 1999 | WO |
WO 9930501 | Jun 1999 | WO |
WO 9935840 | Jul 1999 | WO |
WO 9941911 | Aug 1999 | WO |
WO 9956468 | Nov 1999 | WO |
WO 9965232 | Dec 1999 | WO |
WO 9965243 | Dec 1999 | WO |
WO 9966732 | Dec 1999 | WO |
WO 0002303 | Jan 2000 | WO |
WO 0007372 | Feb 2000 | WO |
WO 0008967 | Feb 2000 | WO |
WO 0019910 | Apr 2000 | WO |
WO 0038430 | Jun 2000 | WO |
WO 0041397 | Jul 2000 | WO |
WO 0139494 | May 2001 | WO |
WO 0141447 | Jun 2001 | WO |
WO 0182614 | Nov 2001 | WO |
WO 0192973 | Dec 2001 | WO |
WO 02089487 | Jul 2002 | WO |
WO 02076097 | Sep 2002 | WO |
WO 02076099 | Sep 2002 | WO |
WO 03026232 | Mar 2003 | WO |
WO 03026275 | Mar 2003 | WO |
WO 03047710 | Jun 2003 | WO |
WO 03065683 | Aug 2003 | WO |
WO 03071727 | Aug 2003 | WO |
WO 03091832 | Nov 2003 | WO |
WO 2004012437 | Feb 2004 | WO |
WO 2004018060 | Mar 2004 | WO |
WO 2004057609 | Jul 2004 | WO |
WO 2004073310 | Aug 2004 | WO |
WO 2005002215 | Jan 2005 | WO |
WO 2005041122 | May 2005 | WO |
WO 2005053301 | Jun 2005 | WO |
WO 2005076575 | Aug 2005 | WO |
WO 2005120067 | Dec 2005 | WO |
WO 2006014362 | Feb 2006 | WO |
WO 2006022881 | Mar 2006 | WO |
WO 2006053305 | May 2006 | WO |
WO 2006067697 | Jun 2006 | WO |
WO 2006081634 | Aug 2006 | WO |
WO 2006105480 | Oct 2006 | WO |
WO 2006110268 | Oct 2006 | WO |
WO 2007001797 | Jan 2007 | WO |
WO 2007008319 | Jan 2007 | WO |
WO 2007008355 | Jan 2007 | WO |
WO 2007008356 | Jan 2007 | WO |
WO 2007008357 | Jan 2007 | WO |
WO 2007008358 | Jan 2007 | WO |
WO 2007018722 | Feb 2007 | WO |
WO 2007018726 | Feb 2007 | WO |
WO 2008044916 | Apr 2008 | WO |
WO 2008086170 | Jul 2008 | WO |
WO 2008088741 | Jul 2008 | WO |
WO 2008088752 | Jul 2008 | WO |
WO 2008088772 | Jul 2008 | WO |
WO 2008100205 | Aug 2008 | WO |
WO 2009038596 | Mar 2009 | WO |
WO 2009099893 | Aug 2009 | WO |
WO 2009099895 | Aug 2009 | WO |
WO 2009105465 | Aug 2009 | WO |
WO 2009110897 | Sep 2009 | WO |
WO 2009114247 | Sep 2009 | WO |
WO 2009155214 | Dec 2009 | WO |
WO 2010044926 | Apr 2010 | WO |
WO 2010054136 | May 2010 | WO |
WO 2010107954 | Sep 2010 | WO |
WO 2011014336 | Sep 2010 | WO |
WO 2011082364 | Jul 2011 | WO |
WO 2011139155 | Nov 2011 | WO |
WO 2011149357 | Dec 2011 | WO |
WO 2012051528 | Apr 2012 | WO |
WO 2012138660 | Oct 2012 | WO |
WO 2013106390 | Jul 2013 | WO |
WO 2013155310 | Jul 2013 | WO |
WO 2013184604 | Dec 2013 | WO |
Entry |
---|
ActiveVideo Networks Inc., International Preliminary Report on Patentability, PCT/US2013/020769, Jul. 24, 2014, 6 pgs. |
ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2014/030773, Jul. 25, 2014, 8 pgs. |
ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2014/041416, Aug. 27, 2014, 8 pgs. |
ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 13168509.1, 10 pgs. |
ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 13168376.5, 8 pgs. |
ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 12767642.7, 12 pgs. |
ActiveVideo Networks Inc., Communication Pursuant to Rules 70(2) and 70a(2), EP10841764.3, Jun. 6, 2014, 1 pg. |
ActiveVideo Networks Inc., Communication Pursuant to Rules 70(2) and 70a(2), EP11833486.1, Apr. 24, 2014, 1 pg. |
ActiveVideo Networks Inc., Communication Pursuant to Article 94(3) EPC, EP08713106.6-1908, Jun. 26, 2014, 5 pgs. |
ActiveVideo Networks Inc., Communication Pursuant to Article 94(3) EPC, EP08713106.6-2223, May 10, 2011, 7 pgs. |
ActiveVideo Networks Inc., Communication Pursuant to Rule 94(3), EP08713106.6, Jun. 26, 2014, 5 pgs. |
ActiveVideo Networks Inc., Communication Pursuant to Article 94(3) EPC, EP09713486.0, Apr. 14, 2014, 6 pgs. |
ActiveVideo Networks Inc., Examination Report No. 1, AU2011258972, Apr. 4, 2013, 5 pgs. |
ActiveVideo Networks Inc., Examination Report No. 1, AU2010339376, Apr. 30, 2014, 4 pgs. |
ActiveVideo Networks Inc., Examination Report, App. No. EP11749946.7, Oct. 8, 2013, 6 pgs. |
ActiveVideo Networks Inc., Summons to attend oral proceedings, Application No. EP09820936.4, Aug. 19, 2014, 4 pgs. |
ActiveVideo Networks Inc., International Searching Authority, International Search Report—International application No. PCT/US2010/027724, dated Oct. 28, 2010, together with the Written Opinion of the International Searching Authority, 7 pages. |
ActiveVideo Networks, Inc., International Search Report and Written Opinion, PCT/US2014/041430, Oct. 9, 2014, 9 pgs. |
ActiveVideo Networks, Notice of Reasons for Rejection, JP2012-547318, Sep. 26, 2014, 7 pgs. |
Adams, Jerry, NTZ Nachrichtentechnische Zeitschrift, vol. 40, No. 7, Jul. 1987, Berlin, DE, pp. 534-536; Jerry Adams: ‘Glasfasernetz für Breitbanddienste in London’, 5 pgs. No English Translation Found. |
Avinity Systems B.V., Communication pursuant to Article 94(3) EPC, EP 07834561.8, Jan. 31, 2014, 10 pgs. |
Avinity Systems B.V., Communication pursuant to Article 94(3) EPC, EP 07834561.8, Apr. 8, 2010, 5 pgs. |
Avinity Systems B.V., International Preliminary Report on Patentability, PCT/NL2007/000245, Mar. 31, 2009, 12 pgs. |
Avinity Systems B.V., International Search Report and Written Opinion, PCT/NL2007/000245, Feb. 19, 2009, 18 pgs. |
Avinity Systems B.V., Notice of Grounds of Rejection for Patent, JP 2009-530298, Sep. 3, 2013, 4 pgs. |
Avinity Systems B.V., Notice of Grounds of Rejection for Patent, JP 2009-530298, Sep. 25, 2012, 6 pgs. |
Avinity Systems B.V., Final Office Action, JP 2009-530298, Oct. 7, 2014, 8 pgs. |
Bird et al., “Customer Access to Broadband Services,” ISSLS 86—The International Symposium on Subscriber Loops and Services, Sep. 29, 1986, Tokyo, JP, 6 pgs. |
Brockmann, Final Office Action, U.S. Appl. No. 12/443,571, Mar. 7, 2014, 21 pgs. |
Brockmann, Final Office Action, U.S. Appl. No. 13/668,004, Jul. 16, 2014, 20 pgs. |
Brockmann, Final Office Action, U.S. Appl. No. 13/686,548, Sep. 24, 2014, 13 pgs. |
Brockmann, Final Office Action, U.S. Appl. No. 13/438,617, Oct. 3, 2014, 19 pgs. |
Brockmann, Office Action, U.S. Appl. No. 13/686,548, Mar. 10, 2014, 11 pgs. |
Brockmann, Office Action, U.S. Appl. No. 13/668,004, Dec. 23, 2013, 9 pgs. |
Brockmann, Office Action, U.S. Appl. No. 13/438,617, May 12, 2014, 17 pgs. |
Brockmann, Office Action, U.S. Appl. No. 12/443,571, Jun. 5, 2013, 18 pgs. |
Brockmann, Office Action, U.S. Appl. No. 12/443,571, Nov. 5, 2014, 26 pgs. |
Chang, Shih-Fu, et al., “Manipulation and Compositing of MC-DCT Compressed Video,” IEEE Journal on Selected Areas in Communications, Jan. 1995, vol. 13, No. 1, 11 pgs. |
Dahlby, Office Action, U.S. Appl. No. 12/651,203, Jun. 5, 2014, 18 pgs. |
Dahlby, Final Office Action, U.S. Appl. No. 12/651,203, Feb. 4, 2013, 18 pgs. |
Dahlby, Office Action, U.S. Appl. No. 12/651,203, Aug. 16, 2012, 18 pgs. |
Dukes, Stephen D., “Photonics for cable television system design, Migrating to regional hubs and passive networks,” Communications Engineering and Design, May 1992, 4 pgs. |
Ellis, et al., “INDAX: An Operational Interactive Cabletext System,” IEEE Journal on Selected Areas in Communications, vol. SAC-1, No. 2, Feb. 1983, pp. 285-294. |
European Patent Office, Supplementary European Search Report, Application No. EP 09 70 8211, dated Jan. 5, 2011, 6 pgs. |
Frezza, W., “The Broadband Solution—Metropolitan CATV Networks,” Proceedings of Videotex '84, Apr. 1984, 15 pgs. |
Gecsei, J., “Topology of Videotex Networks,” The Architecture of Videotex Systems, Chapter 6, 1983 by Prentice-Hall, Inc. |
Gobl, et al., “ARIDEM—a multi-service broadband access demonstrator,” Ericsson Review No. 3, 1996, 7 pgs. |
Gordon, Notice of Allowance, U.S. Appl. No. 12/008,697, Mar. 20, 2014, 10 pgs. |
Gordon, Final Office Action, U.S. Appl. No. 12/008,722, Mar. 30, 2012, 16 pgs. |
Gordon, Final Office Action, U.S. Appl. No. 12/035,236, Jun. 11, 2014, 14 pgs. |
Gordon, Final Office Action, U.S. Appl. No. 12/035,236, Jul. 22, 2013, 7 pgs. |
Gordon, Final Office Action, U.S. Appl. No. 12/035,236, Sep. 20, 2011, 8 pgs. |
Gordon, Final Office Action, U.S. Appl. No. 12/035,236, Sep. 21, 2012, 9 pgs. |
Gordon, Final Office Action, U.S. Appl. No. 12/008,697, Mar. 6, 2012, 48 pgs. |
Gordon, Office Action, U.S. Appl. No. 12/035,236, Mar. 13, 2013, 9 pgs. |
Gordon, Office Action, U.S. Appl. No. 12/035,236, Mar. 22, 2011, 8 pgs. |
Gordon, Office Action, U.S. Appl. No. 12/035,236, Mar. 28, 2012, 8 pgs. |
Gordon, Office Action, U.S. Appl. No. 12/035,236, Dec. 16, 2013, 11 pgs. |
Gordon, Office Action, U.S. Appl. No. 12/008,697, Aug. 1, 2013, 43 pgs. |
Gordon, Office Action, U.S. Appl. No. 12/008,697, Aug. 4, 2011, 39 pgs. |
Gordon, Office Action, U.S. Appl. No. 12/008,722, Oct. 11, 2011, 16 pgs. |
Handley et al., “TCP Congestion Window Validation,” RFC 2861, Jun. 2000, Network Working Group, 22 pgs. |
Henry et al., “Multidimensional Icons,” ACM Transactions on Graphics, vol. 9, No. 1, Jan. 1990, 5 pgs. |
Isensee et al., “Focus Highlight for World Wide Web Frames,” Nov. 1, 1997, IBM Technical Disclosure Bulletin, vol. 40, No. 11, pp. 89-90. |
ICTV, Inc., International Search Report / Written Opinion, PCT/US2008/000400, Jul. 14, 2009, 10 pgs. |
ICTV, Inc., International Search Report / Written Opinion, PCT/US2008/000450, Jan. 26, 2009, 9 pgs. |
Kato, Y., et al., “A Coding Control algorithm for Motion Picture Coding Accomplishing Optimal Assignment of Coding Distortion to Time and Space Domains,” Electronics and Communications in Japan, Part 1, vol. 72, No. 9, 1989, 11 pgs. |
Koenen, Rob, “MPEG-4 Overview—Overview of the MPEG-4 Standard,” Internet Citation, Mar. 2001, http://mpeg.telecomitalialab.com/standards/mpeg-4/mpeg-4.htm, May 9, 2002, 74 pgs. |
Konaka, M. et al., “Development of Sleeper Cabin Cold Storage Type Cooling System,” SAE International, The Engineering Society for Advancing Mobility Land Sea Air and Space, SAE 2000 World Congress, Detroit, Michigan, Mar. 6-9, 2000, 7 pgs. |
Le Gall, Didier, “MPEG: A Video Compression Standard for Multimedia Applications,” Communications of the ACM, vol. 34, No. 4, Apr. 1991, New York, NY, 13 pgs. |
Langenberg, E., et al., “Integrating Entertainment and Voice on the Cable Network,” SCTE, Conference on Emerging Technologies, Jan. 6-7, 1993, New Orleans, Louisiana, 9 pgs. |
Large, D., “Tapped Fiber vs. Fiber-Reinforced Coaxial CATV Systems”, IEEE LCS Magazine, Feb. 1990, 7 pgs. |
“MSDL Specification Version 1.1,” International Organisation for Standardisation / Organisation Internationale de Normalisation, ISO/IEC JTC1/SC29/WG11 Coding of Moving Pictures and Audio, N1246, MPEG96/Mar. 1996, 101 pgs. |
Noguchi, Yoshihiro, et al., “MPEG Video Compositing in the Compressed Domain,” IEEE International Symposium on Circuits and Systems, vol. 2, May 1, 1996, 4 pgs. |
Regis, Notice of Allowance, U.S. Appl. No. 13/273,803, Sep. 2, 2014, 8 pgs. |
Regis, Notice of Allowance, U.S. Appl. No. 13/273,803, May 14, 2014, 8 pgs. |
Regis, Final Office Action, U.S. Appl. No. 13/273,803, Oct. 11, 2013, 23 pgs. |
Regis, Office Action, U.S. Appl. No. 13/273,803, Mar. 27, 2013, 32 pgs. |
Rose, K., “Design of a Switched Broad-Band Communications Network for Interactive Services,” IEEE Transactions on Communications, vol. com-23, No. 1, Jan. 1975, 7 pgs. |
Saadawi, Tarek N., “Distributed Switching for Data Transmission over Two-Way CATV”, IEEE Journal on Selected Areas in Communications, vol. Sac-3, No. 2, Mar. 1985, 7 pgs. |
Schrock, “Proposal for a Hub Controlled Cable Television System Using Optical Fiber,” IEEE Transactions on Cable Television, vol. CATV-4, No. 2, Apr. 1979, 8 pgs. |
Sigmon, Notice of Allowance, U.S. Appl. No. 13/311,203, Sep. 22, 2014, 5 pgs. |
Sigmon, Notice of Allowance, U.S. Appl. No. 13/311,203, Feb. 27, 2014, 14 pgs. |
Sigmon, Final Office Action, U.S. Appl. No. 13/311,203, Sep. 13, 2013, 20 pgs. |
Sigmon, Office Action, U.S. Appl. No. 13/311,203, May 10, 2013, 21 pgs. |
Smith, Brian C., et al., “Algorithms for Manipulating Compressed Images,” IEEE Computer Graphics and Applications, vol. 13, No. 5, Sep. 1, 1993, 9 pgs. |
Smith, J. et al., “Transcoding Internet Content for Heterogeneous Client Devices,” Circuits and Systems, 1998, ISCAS '98, Proceedings of the 1998 IEEE International Symposium, Monterey, CA, USA, May 31-Jun. 3, 1998, New York, NY, USA, IEEE, US, May 31, 1998, 4 pgs. |
Stoll, G. et al., “GMF4iTV: Neue Wege zur Interaktivitaet Mit Bewegten Objekten Beim Digitalen Fernsehen,” Fkt Fernseh Und Kinotechnik, Fachverlag Schiele & Schon GmbH, Berlin, DE, vol. 60, No. 4, Jan. 1, 2006, ISSN: 1430-9947, 9 pgs. No English Translation Found. |
Tamitani et al., “An Encoder/Decoder Chip Set for the MPEG Video Standard,” 1992 IEEE International Conference on Acoustics, vol. 5, Mar. 1992, San Francisco, CA, 4 pgs. |
Terry, Jack, “Alternative Technologies and Delivery Systems for Broadband ISDN Access”, IEEE Communications Magazine, Aug. 1992, 7 pgs. |
Tobagi, Fouad A., “Multiaccess Protocols in Packet Communication Systems,” IEEE Transactions on Communications, vol. Com-28, No. 4, Apr. 1980, 21 pgs. |
Toms, N., “An Integrated Network Using Fiber Optics (Info) for the Distribution of Video, Data, and Telephone in Rural Areas,” IEEE Transactions on Communication, vol. Com-26, No. 7, Jul. 1978, 9 pgs. |
Jurgen, Two-way applications for cable television systems in the '70s, IEEE Spectrum, Nov. 1971, 16 pgs. |
van Beek, P., “Delay-Constrained Rate Adaptation for Robust Video Transmission over Home Networks,” Image Processing, 2005, ICIP 2005, IEEE International Conference, Sep. 2005, vol. 2, No. 11, 4 pgs. |
Welzenbach et al., “The Application of Optical Systems for Cable TV,” AEG-Telefunken, Backnang, Federal Republic of Germany, ISSLS Sep. 15-19, 1980, Proceedings IEEE Cat. No. 80 CH1565-1, 7 pgs. |
Yum, TS P., “Hierarchical Distribution of Video with Dynamic Port Allocation,” IEEE Transactions on Communications, vol. 39, No. 8, Aug. 1, 1991, XP000264287, 7 pgs. |
ActiveVideo Networks, Inc., International Preliminary Report on Patentability, PCT/US2013/036182, Oct. 14, 2014, 9 pgs. |
ActiveVideo Networks Inc., Communication Pursuant to Rule 94(3), EP08713106.6, Jun. 25, 2014, 5 pgs. |
ActiveVideo Networks Inc., Communication Pursuant to Rules 161(2) & 162 EPC, EP13775121.0, Jan. 20, 2015, 3 pgs. |
ActiveVideo Networks Inc., Examination Report No. 1, AU2011258972, Jul. 21, 2014, 3 pgs. |
ActiveVideo Networks Inc., Certificate of Patent JP5675765, Jan. 9, 2015, 3 pgs. |
Brockmann, Notice of Allowance, U.S. Appl. No. 13/445,104, Dec. 24, 2014, 14 pgs. |
Brockmann, Office Action, U.S. Appl. No. 13/668,004, Feb. 26, 2015, 17 pgs. |
Brockmann, Office Action, U.S. Appl. No. 13/686,548, Jan. 5, 2015, 12 pgs. |
Brockmann, Office Action, U.S. Appl. No. 13/911,948, Dec. 26, 2014, 12 pgs. |
Brockmann, Office Action, U.S. Appl. No. 13/911,948, Jan. 29, 2015, 11 pgs. |
Dahlby, Office Action, U.S. Appl. No. 12/651,203, Dec. 3, 2014, 19 pgs. |
Gordon, Notice of Allowance, U.S. Appl. No. 12/008,697, Dec. 8, 2014, 10 pgs. |
Gordon, Office Action, U.S. Appl. No. 12/008,722, Nov. 28, 2014, 18 pgs. |
Regis, Notice of Allowance, U.S. Appl. No. 13/273,803, Nov. 18, 2014, 9 pgs. |
Regis, Notice of Allowance, U.S. Appl. No. 13/273,803, Mar. 2, 2015, 8 pgs. |
Sigmon, Notice of Allowance, U.S. Appl. No. 13/311,203, Dec. 19, 2014, 5 pgs. |
TAG Networks Inc., Decision to Grant a Patent, JP 2008-506474, Oct. 4, 2013, 5 pgs. |
ActiveVideo Networks Inc., Decision to refuse a European patent application (Art. 97(2) EPC), EP09820936.4, Feb. 20, 2015, 4 pgs. |
ActiveVideo Networks Inc., Communication Pursuant to Article 94(3) EPC, 10754084.1, Feb. 10, 2015, 12 pgs. |
ActiveVideo Networks Inc., Communication under Rule 71(3) EPC, Intention to Grant, EP08713106.6, Feb. 19, 2015, 12 pgs. |
ActiveVideo Networks Inc., Notice of Reasons for Rejection, JP2014-100460, Jan. 15, 2015, 6 pgs. |
ActiveVideo Networks Inc., Notice of Reasons for Rejection, JP2013-509016, Dec. 24, 2014 (Received Jan. 14, 2015), 11 pgs. |
Brockmann, Office Action, U.S. Appl. No. 13/737,097, Mar. 16, 2015, 18 pgs. |
Brockmann, Notice of Allowance, U.S. Appl. No. 14/298,796, Mar. 18, 2015, 11 pgs. |
Craig, Decision on Appeal (Reversed), U.S. Appl. No. 11/178,177, Feb. 25, 2015, 7 pgs. |
Craig, Notice of Allowance, U.S. Appl. No. 11/178,177, Mar. 5, 2015, 7 pgs. |
Craig, Notice of Allowance, U.S. Appl. No. 11/178,181, Feb. 13, 2015, 8 pgs. |
AC-3 digital audio compression standard, Extract, Dec. 20, 1995, 11 pgs. |
ActiveVideo Networks B.V., International Preliminary Report on Patentability, PCT/NL2011/050308, Sep. 6, 2011, 8 pgs. |
ActiveVideo Networks B.V., International Search Report and Written Opinion, PCT/NL2011/050308, Sep. 6, 2011, 8 pgs. |
ActiveVideo Networks Inc., International Preliminary Report on Patentability, PCT/US2011/056355, Apr. 16, 2013, 4 pgs. |
ActiveVideo Networks Inc., International Preliminary Report on Patentability, PCT/US2012/032010, Oct. 8, 2013, 4 pgs. |
ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2011/056355, Apr. 13, 2012, 6 pgs. |
ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2012/032010, Oct. 10, 2012, 6 pgs. |
ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2013/020769, May 9, 2013, 9 pgs. |
ActiveVideo Networks Inc., International Search Report and Written Opinion, PCT/US2013/036182, Jul. 29, 2013, 12 pgs. |
ActiveVideo Networks, Inc., International Search Report and Written Opinion, PCT/US2009/032457, Jul. 22, 2009, 7 pgs. |
ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 09820936.4, 11 pgs. |
ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 10754084.1, 11 pgs. |
ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 10841764.3, 16 pgs. |
ActiveVideo Networks Inc., Extended EP Search Rpt, Application No. 11833486.1, 6 pgs. |
ActiveVideo Networks Inc., Korean Intellectual Property Office, International Search Report, PCT/US2009/032457, Jul. 22, 2009, 7 pgs. |
Annex C—Video buffering verifier, information technology—generic coding of moving pictures and associated audio information: video, Feb. 2000, 6 pgs. |
Antonoff, Michael, “Interactive Television,” Popular Science, Nov. 1992, 12 pages. |
Avinity Systems B.V., Extended European Search Report, Application No. 12163713.6, 10 pgs. |
Avinity Systems B.V., Extended European Search Report, Application No. 12163712-8, 10 pgs. |
Broadhead, Direct manipulation of MPEG compressed digital audio, Nov. 5-9, 1995, 41 pgs. |
Cable Television Laboratories, Inc., “CableLabs Asset Distribution Interface Specification, Version 1.1”, May 5, 2006, 33 pgs. |
CD 11172-3, Coding of moving pictures and associated audio for digital storage media at up to about 1.5 MBIT, Jan. 1, 1992, 39 pgs. |
Craig, Notice of Allowance, U.S. Appl. No. 11/178,176, Dec. 23, 2010, 8 pgs. |
Craig, Notice of Allowance, U.S. Appl. No. 11/178,183, Jan. 12, 2012, 7 pgs. |
Craig, Notice of Allowance, U.S. Appl. No. 11/178,183, Jul. 19, 2012, 8 pgs. |
Craig, Notice of Allowance, U.S. Appl. No. 11/178,189, Oct. 12, 2011, 7 pgs. |
Craig, Notice of Allowance, U.S. Appl. No. 11/178,176, Mar. 23, 2011, 8 pgs. |
Craig, Notice of Allowance, U.S. Appl. No. 13/609,183, Aug. 26, 2013, 8 pgs. |
Craig, Final Office Action, U.S. Appl. No. 11/103,838, Feb. 5, 2009, 30 pgs. |
Craig, Final Office Action, U.S. Appl. No. 11/178,181, Aug. 25, 2010, 17 pgs. |
Craig, Final Office Action, U.S. Appl. No. 11/103,838, Jul. 6, 2010, 35 pgs. |
Craig, Final Office Action, U.S. Appl. No. 11/178,176, Oct. 1, 2010, 8 pgs. |
Craig, Final Office Action, U.S. Appl. No. 11/178,183, Apr. 13, 2011, 16 pgs. |
Craig, Final Office Action, U.S. Appl. No. 11/178,177, Oct. 26, 2010, 12 pgs. |
Craig, Final Office Action, U.S. Appl. No. 11/178,181, Jun. 20, 2011, 21 pgs. |
Craig, Office Action, U.S. Appl. No. 11/103,838, May 12, 2009, 32 pgs. |
Craig, Office Action, U.S. Appl. No. 11/103,838, Aug. 19, 2008, 17 pgs. |
Craig, Office Action, U.S. Appl. No. 11/103,838, Nov. 19, 2009, 34 pgs. |
Craig, Office Action, U.S. Appl. No. 11/178,176, May 6, 2010, 7 pgs. |
Craig, Office Action, U.S. Appl. No. 11/178,177, Mar. 29, 2011, 15 pgs. |
Craig, Office Action, U.S. Appl. No. 11/178,177, Aug. 3, 2011, 26 pgs. |
Craig, Office Action, U.S. Appl. No. 11/178,177, Mar. 29, 2010, 11 pgs. |
Craig, Office Action, U.S. Appl. No. 11/178,181, Feb. 11, 2011, 19 pgs. |
Craig, Office Action, U.S. Appl. No. 11/178,181, Mar. 29, 2010, 10 pgs. |
Craig, Office Action, U.S. Appl. No. 11/178,182, Feb. 23, 2010, 15 pgs. |
Craig, Office Action, U.S. Appl. No. 11/178,183, Dec. 6, 2010, 12 pgs. |
Craig, Office Action, U.S. Appl. No. 11/178,183, Sep. 15, 2011, 12 pgs. |
Craig, Office Action, U.S. Appl. No. 11/178,183, Feb. 19, 2010, 17 pgs. |
Craig, Office Action, U.S. Appl. No. 11/178,183, Jul. 20, 2010, 13 pgs. |
Craig, Office Action, U.S. Appl. No. 11/178,189, Nov. 9, 2010, 13 pgs. |
Craig, Office Action, U.S. Appl. No. 11/178,189, Mar. 15, 2010, 11 pgs. |
Craig, Office Action, U.S. Appl. No. 11/178,189, Jul. 23, 2009, 10 pgs. |
Craig, Office Action, U.S. Appl. No. 11/178,189, May 26, 2011, 14 pgs. |
Craig, Office Action, U.S. Appl. No. 13/609,183, May 9, 2013, 7 pgs. |
Pavlovskaia, Office Action, JP 2011-516499, Feb. 14, 2014, 19 pgs. |
Digital Audio Compression Standard (AC-3, E-AC-3), Advanced Television Systems Committee, Jun. 14, 2005, 236 pgs. |
European Patent Office, Extended European Search Report for International Application No. PCT/US2010/027724, dated Jul. 24, 2012, 11 pages. |
FFMPEG, http://www.ffmpeg.org, downloaded Apr. 8, 2010, 8 pgs. |
FFMPEG-0.4.9 Audio Layer 2 Tables Including Fixed Psycho Acoustic Model, 2001, 2 pgs. |
Herr, Notice of Allowance, U.S. Appl. No. 11/620,593, May 23, 2012, 5 pgs. |
Herr, Notice of Allowance, U.S. Appl. No. 12/534,016, Feb. 7, 2012, 5 pgs. |
Herr, Notice of Allowance, U.S. Appl. No. 12/534,016, Sep. 28, 2011, 15 pgs. |
Herr, Final Office Action, U.S. Appl. No. 11/620,593, Sep. 15, 2011, 104 pgs. |
Herr, Office Action, U.S. Appl. No. 11/620,593, Mar. 19, 2010, 58 pgs. |
Herr, Office Action, U.S. Appl. No. 11/620,593, Apr. 21, 2009 27 pgs. |
Herr, Office Action, U.S. Appl. No. 11/620,593, Dec. 23, 2009, 58 pgs. |
Herr, Office Action, U.S. Appl. No. 11/620,593, Jan. 24, 2011, 96 pgs. |
Herr, Office Action, U.S. Appl. No. 11/620,593, Aug. 27, 2010, 41 pgs. |
Herre, Thoughts on an SAOC Architecture, Oct. 2006, 9 pgs. |
Hoarty, The Smart Headend—A Novel Approach to Interactive Television, Montreux Int'l TV Symposium, Jun. 9, 1995, 21 pgs. |
ICTV, Inc., International Preliminary Report on Patentability, PCT/US2006/022585, Jan. 29, 2008, 9 pgs. |
ICTV, Inc., International Search Report / Written Opinion, PCT/US2006/022585, Oct. 12, 2007, 15 pgs. |
ICTV, Inc., International Search Report / Written Opinion, PCT/US2008/000419, May 15, 2009, 20 pgs. |
ICTV, Inc., International Search Report / Written Opinion, PCT/US2006/022533, Nov. 20, 2006, 8 pgs. |
Isovic, Timing constraints of MPEG-2 decoding for high quality video: misconceptions and realistic assumptions, Jul. 2-4, 2003, 10 pgs. |
MPEG-2 Video elementary stream supplemental information, Dec. 1999, 12 pgs. |
Ozer, Video Compositing 101, available from http://www.emedialive.com, Jun. 2, 2004, 5 pgs. |
Porter, Compositing Digital Images, 18 Computer Graphics (No. 3), Jul. 1984, pp. 253-259. |
RSS Advisory Board, “RSS 2.0 Specification”, published Oct. 15, 2007. |
SAOC use cases, draft requirements and architecture, Oct. 2006, 16 pgs. |
Sigmon, Final Office Action, U.S. Appl. No. 11/258,602, Feb. 23, 2009, 15 pgs. |
Sigmon, Office Action, U.S. Appl. No. 11/258,602, Sep. 2, 2008, 12 pgs. |
TAG Networks, Inc., Communication pursuant to Article 94(3) EPC, European Patent Application, 06773714.8, May 6, 2009, 3 pgs. |
TAG Networks Inc., Decision to Grant a Patent, JP 2009-544985, Jun. 28, 2013, 1 pg. |
TAG Networks Inc., IPRP, PCT/US2006/010080, Oct. 16, 2007, 6 pgs. |
TAG Networks Inc., IPRP, PCT/US2006/024194, Jan. 10, 2008, 7 pgs. |
TAG Networks Inc., IPRP, PCT/US2006/024195, Apr. 1, 2009, 11 pgs. |
TAG Networks Inc., IPRP, PCT/US2006/024196, Jan. 10, 2008, 6 pgs. |
TAG Networks Inc., International Search Report, PCT/US2008/050221, Jun. 12, 2008, 9 pgs. |
TAG Networks Inc., Office Action, CN 200680017662.3, Apr. 26, 2010, 4 pgs. |
TAG Networks Inc., Office Action, EP 06739032.8, Aug. 14, 2009, 4 pgs. |
TAG Networks Inc., Office Action, EP 06773714.8, May 6, 2009, 3 pgs. |
TAG Networks Inc., Office Action, EP 06773714.8, Jan. 12, 2010, 4 pgs. |
TAG Networks Inc., Office Action, JP 2008-506474, Oct. 1, 2012, 5 pgs. |
TAG Networks Inc., Office Action, JP 2008-506474, Aug. 8, 2011, 5 pgs. |
TAG Networks Inc., Office Action, JP 2008-520254, Oct. 20, 2011, 2 pgs. |
TAG Networks, IPRP, PCT/US2008/050221, Jul. 7, 2009, 6 pgs. |
TAG Networks, International Search Report, PCT/US2010/041133, Oct. 19, 2010, 13 pgs. |
TAG Networks, Office Action, CN 200880001325.4, Jun. 22, 2011, 4 pgs. |
TAG Networks, Office Action, JP 2009-544985, Feb. 25, 2013, 3 pgs. |
Talley, A general framework for continuous media transmission control, Oct. 13-16, 1997, 10 pgs. |
Todd, AC-3: flexible perceptual coding for audio transmission and storage, Feb. 26-Mar. 1, 1994, 16 pgs. |
Tudor, MPEG-2 Video Compression, Dec. 1995, 15 pgs. |
TVHEAD, Inc., First Examination Report, IN 1744/MUMNP/2007, Dec. 30, 2013, 6 pgs. |
TVHEAD, Inc., International Search Report, PCT/US2006/010080, Jun. 20, 2006, 3 pgs. |
TVHEAD, Inc., International Search Report, PCT/US2006/024194, Dec. 15, 2006, 4 pgs. |
TVHEAD, Inc., International Search Report, PCT/US2006/024195, Nov. 29, 2006, 9 pgs. |
TVHEAD, Inc., International Search Report, PCT/US2006/024196, Dec. 11, 2006, 4 pgs. |
TVHEAD, Inc., International Search Report, PCT/US2006/024197, Nov. 28, 2006, 9 pgs. |
Vernon, Dolby digital: audio coding for digital television and storage applications, Aug. 1999, 18 pgs. |
Wang, A compressed domain beat detector using MP3 audio bitstream, Sep. 30-Oct. 5, 2001, 9 pgs. |
Wang, A multichannel audio coding algorithm for inter-channel redundancy removal, May 12-15, 2001, 6 pgs. |
Wang, An excitation level based psychoacoustic model for audio compression, Oct. 30-Nov. 4, 1999, 4 pgs. |
Wang, Energy compaction property of the MDCT in comparison with other transforms, Sep. 22-25, 2000, 23 pgs. |
Wang, Exploiting excess masking for audio compression, Sep. 2-5, 1999, 4 pgs. |
Wang, Schemes for re-compressing MP3 audio bitstreams, Nov. 30-Dec. 3, 2001, 5 pgs. |
Wang, Selected advances in audio compression and compressed domain processing, Aug. 2001, 68 pgs. |
Wang, The impact of the relationship between MDCT and DFT on audio compression, Dec. 13-15, 2000, 9 pgs. |
ActiveVideo Networks, Inc., Certificate of Grant, EP08713106.6-1908, Aug. 5, 2015, 1 pg. |
ActiveVideo Networks, Inc., Decision to Grant, EP08713106.6-1908, Jul. 9, 2015, 2 pgs. |
ActiveVideo Networks, Inc., Decision to Grant, JP2014100460, Jul. 24, 2015, 5 pgs. |
ActiveVideo Networks Inc., Examination Report No. 2, AU2011249132, May 29, 2015, 4 pgs. |
ActiveVideo Networks Inc., Examination Report No. 2, AU2011315950, Jun. 25, 2015, 3 pgs. |
ActiveVideo, International Search Report and Written Opinion, PCT/US2015/027803, Jun. 24, 2015, 18 pgs. |
ActiveVideo, International Search Report and Written Opinion, PCT/US2015/027804, Jun. 25, 2015, 10 pgs. |
ActiveVideo Networks, Inc., KIPO's Notice of Preliminary Rejection, KR10-2010-7019512, Jul. 15, 2015, 15 pgs. |
ActiveVideo Networks, Inc., KIPO's Notice of Preliminary Rejection, KR10-2010-7021116, Jul. 13, 2015, 19 pgs. |
ActiveVideo Networks, Inc., International Search Report and Written Opinion, PCT/US2015/028072, Aug. 7, 2015, 9 pgs. |
ActiveVideo Networks B.V., Office Action, IL222830, Jun. 28, 2015, 7 pgs. |
ActiveVideo Networks, Inc., Office Action, JP2013534034, Jun. 16, 2015, 6 pgs. |
Avinity Systems B.V., Pre-Trial Reexamination Report, JP 2009-530298, Apr. 24, 2015, 6 pgs. |
Brockmann, Notice of Allowance, U.S. Appl. No. 13/911,948, Jul. 10, 2015, 5 pgs. |
Brockmann, Notice of Allowance, U.S. Appl. No. 13/438,617, May 22, 2015, 18 pgs. |
Brockmann, Notice of Allowance, U.S. Appl. No. 13/445,104, Apr. 23, 2015, 8 pgs. |
Brockmann, Final Office Action, U.S. Appl. No. 12/443,571, Jul. 9, 2015, 28 pgs. |
Brockmann, Notice of Allowance, U.S. Appl. No. 13/911,948, Aug. 21, 2015, 6 pgs. |
Brockmann, Notice of Allowance, U.S. Appl. No. 13/911,948, Aug. 5, 2015, 5 pgs. |
Brockmann, Final Office Action, U.S. Appl. No. 13/668,004, Aug. 3, 2015, 13 pgs. |
Brockmann, Final Office Action, U.S. Appl. No. 13/686,548, Aug. 12, 2015, 13 pgs. |
Brockmann, Final Office Action, U.S. Appl. No. 13/737,097, Aug. 14, 2015, 17 pgs. |
Brockmann, Office Action, U.S. Appl. No. 14/298,796, Sep. 11, 2015, 11 pgs. |
Dahlby, Office Action, U.S. Appl. No. 12/651,203, Jul. 2, 2015, 25 pgs. |
Gecsei, J., “Adaptation in Distributed Multimedia Systems,” IEEE Multimedia, IEEE Service Center, New York, NY, vol. 4, No. 2, Apr. 1, 1997, 10 pgs. |
Gordon, Notice of Allowance, U.S. Appl. No. 12/008,697, Apr. 1, 2015, 10 pgs. |
Gordon, Final Office Action, U.S. Appl. No. 12/008,722, Jul. 2, 2015, 20 pgs. |
Ohta, K., et al., “Selective Multimedia Access Protocol for Wireless Multimedia Communication,” Communications, Computers and Signal Processing, 1997, IEEE Pacific Rim Conference, Victoria, BC, Canada, Aug. 1997, vol. 1, 4 pgs. |
Sigmon, Notice of Allowance, U.S. Appl. No. 13/311,203, Apr. 14, 2015, 5 pgs. |
Wei, S., “QoS Tradeoffs Using an Application-Oriented Transport Protocol (AOTP) for Multimedia Applications Over IP,” Sep. 23-26, 1999, Proceedings of the Third International Conference on Computational Intelligence and Multimedia Applications, New Delhi, India, 5 pgs. |
ActiveVideo Networks, Inc., Certificate of Grant, AU2011258972, Nov. 19, 2015, 2 pgs. |
ActiveVideo Networks, Inc., Certificate of Grant, EP13168509.1-1908, Sep. 30, 2015, 2 pgs. |
ActiveVideo Networks, Inc., Communication Pursuant to Rules 161(1) and 162 EPC, EP14722897.7, Oct. 28, 2015, 2 pgs. |
ActiveVideo Networks, Inc., Decision to Grant, EP13168509.1-1908, Sep. 3, 2015, 2 pgs. |
ActiveVideo Networks, Inc., Decision to Refuse a European Patent Application, EP08705578.6, Nov. 26, 2015, 10 pgs. |
ActiveVideo Networks, Inc., Extended European Search Report, EP13735906.3, Nov. 11, 2015, 10 pgs. |
ActiveVideo Networks, Inc., International Preliminary Report on Patentability, PCT/US2014/030773, Sep. 15, 2015, 6 pgs. |
ActiveVideo, Communication Pursuant to Article 94(3) EPC, EP12767642.7, Sep. 4, 2015, 4 pgs. |
Brockmann, Office Action, U.S. Appl. No. 12/443,571, Dec. 4, 2015, 30 pgs. |
Dahlby, Final Office Action, U.S. Appl. No. 12/651,203, Dec. 11, 2015, 25 pgs. |
Jacob, Bruce, “Memory Systems: Cache, DRAM, Disk,” The Cache Layer, Chapter 22, p. 739. |
ActiveVideo, Certificate of Grant, AU2011249132, Jan. 7, 2016, 2 pgs. |
ActiveVideo Networks, Inc., Certificate of Grant, AU2011315950, Dec. 17, 2015, 2 pgs. |
ActiveVideo Networks, Inc., Certificate of Patent, JP2013534034, Jan. 8, 2016, 4 pgs. |
ActiveVideo Networks, Inc., Communication Pursuant to Rules 161(1) and 162 EPC, EP14740004.8, Jan. 26, 2016, 2 pgs. |
ActiveVideo Networks, Inc., Communication Pursuant to Rules 161(1) and 162 EPC, EP14736535.7, Jan. 26, 2016, 2 pgs. |
ActiveVideo Networks, Inc., Communication Pursuant to Rules 70(2) and 70a(2), EP13735906.3, Nov. 27, 2015, 1 pg. |
ActiveVideo, Notice of Reasons for Rejection, JP2013-509016, Dec. 3, 2015, 7 pgs. |
ActiveVideo, Communication Pursuant to Article 94(3) EPC, EP10841764.3, Dec. 18, 2015, 6 pgs. |
ActiveVideo Networks, Inc., International Preliminary Report on Patentability, PCT/US2014/041430, Dec. 8, 2015, 6 pgs. |
ActiveVideo Networks, Inc., International Preliminary Report on Patentability, PCT/US2014/041416, Dec. 8, 2015, 6 pgs. |
Number | Date | Country | |
---|---|---|
20140362930 A1 | Dec 2014 | US |
 | Number | Date | Country |
---|---|---|---|
Parent | 13911948 | Jun 2013 | US |
Child | 14262674 | US |