Systems and methods for implementing a content object access point

Information

  • Patent Grant
  • Patent Number
    8,112,449
  • Date Filed
    Friday, August 1, 2003
  • Date Issued
    Tuesday, February 7, 2012
Abstract
Systems and methods for accessing and distributing content objects. Various of the systems and methods utilize a number of content object entities that can be sources and/or destinations for content objects. A combination of abstraction and distinction engines can be used to access content objects from a source of content objects, format and/or modify the content objects, and redistribute the modified content object to one or more content object destinations. In some cases, an access point is included that identifies a number of available content objects, and identifies one or more content object destinations to which the respective content objects can be directed.
Description
CROSS-REFERENCES TO RELATED APPLICATIONS

The present application is related to U.S. patent application Ser. No. 10/632,602 entitled “Systems And Methods For Controlled Transmittance in a Telecommunication System”, and filed by the inventors common hereto and on a date common herewith. The entirety of the aforementioned application is incorporated herein by reference for all purposes.


BACKGROUND OF THE INVENTION

The present invention is related to telecommunication systems in general, and in particular to systems and methods for distributing content objects.


The telecommunication and electronics industries have developed and implemented a large number of incompatible devices and protocols. Thus, in a given consumer's home one may find a number of different types of content maintained on a number of different media. To use these content object types, a user is required to access multiple devices. Further, such content typically cannot be used together without the use of multiple devices, each respectively using a portion of the content.


Hence, among other things, there exists a need in the art to address the aforementioned limitations.


BRIEF SUMMARY OF THE INVENTION

The present invention is related to telecommunication systems in general, and in particular to systems and methods for distributing content objects. Various of the systems and methods utilize a number of content object entities that can be sources and/or destinations for content objects. A combination of abstraction and distinction engines can be used to access content objects from a source of content objects, format and/or modify the content objects, and redistribute the modified content object to one or more content object destinations. In some cases, an access point is included that identifies a number of available content objects, and identifies one or more content object destinations to which the respective content objects can be directed.


Such systems and methods can be used to select a desired content object, and to select a content object entity to which the content object is directed. In addition, the systems and methods can be used to modify the content object as to format and/or content. For example, the content object may be reformatted for use on a selected content object entity, modified to add additional or to reduce the content included in the content object, or combined with one or more other content objects to create a composite content object. This composite content object can then be directed to a content object destination where it can be either stored or utilized.


Some embodiments of the present invention provide systems for abstraction and distinction of content objects. These systems include an abstraction engine and a distinction engine. The abstraction engine is communicably coupled to a group of content object entities, and the distinction engine is communicably coupled to another group of content object entities. The two groups of content object entities are not necessarily mutually exclusive, and in many cases, a content object entity in one of the groups is also included in the other group. The first of the groups of content object entities includes content object entities such as an appliance control system, a telephone information system, a storage medium including video objects, a storage medium including audio objects, an audio stream source, a video stream source, a human interface, the Internet, and an interactive content entity. The other of the groups of content object entities includes content object entities such as an appliance control system, a telephone information system, a storage medium including video objects, a storage medium including audio objects, a human interface, the Internet, and an interactive content entity.


In some instances, two or more of the content object entities are maintained on separate partitions of a common database. In some such instances, the common database can be partitioned using a content-based schema, while in other cases it can be partitioned using a user-based schema.
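The two partitioning schemas described above can be sketched as a shared store whose partition key is derived either from the content type or from the owning user. This is an illustrative sketch only; the class and field names (`PartitionedStore`, `type`, `owner`) are assumptions, not taken from the patent.

```python
# Hypothetical sketch: one shared store split into partitions, keyed either
# by content type (content-based schema) or by user (user-based schema).
from collections import defaultdict

class PartitionedStore:
    def __init__(self, schema):
        # schema maps a content object to its partition key
        self.schema = schema
        self.partitions = defaultdict(list)

    def add(self, obj):
        self.partitions[self.schema(obj)].append(obj)

    def partition(self, key):
        return list(self.partitions[key])

# Content-based schema: partition by the object's media type.
content_store = PartitionedStore(schema=lambda obj: obj["type"])
# User-based schema: partition by the owning user.
user_store = PartitionedStore(schema=lambda obj: obj["owner"])

for obj in [{"type": "video", "owner": "parent", "name": "movie"},
            {"type": "audio", "owner": "child", "name": "song"}]:
    content_store.add(obj)
    user_store.add(obj)
```

The same objects land in different partitions under the two schemas, which is the essential difference between the content-based and user-based layouts.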


In particular instances, the abstraction engine is operable to receive a content object from one of the groups of content object entities, and to form the content object into an abstract format. As just one example, this abstract format can be a format that is compatible at a high level with other content formats. In other instances, the abstraction engine is operable to receive a content object from one of the content object entities, and to derive another content object based on the aforementioned content object. Further, the abstraction engine can be operable to receive yet another content object from one of the content object entities and to derive an additional content object therefrom. The abstraction engine can then combine the two derived content objects to create a composite content object. In some cases, the distinction engine accepts the composite content object and formats it such that it is compatible with a particular group of content object entities. In yet other instances, the abstraction engine is operable to receive a content object from one group of content object entities, and to form that content object into an abstract format. The distinction engine can then conform the abstracted content object with a standard compatible with a selected one of another group of content object entities.
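The abstract-then-distinguish flow described above can be sketched as follows: every incoming object is first lifted into a common abstract representation, abstracted objects can be combined into a composite, and the distinction engine then conforms the result to a destination-compatible standard. This is a minimal illustrative sketch; the class names, dictionary fields, and format labels (`mpeg`, `pcm`, `ntsc`) are assumptions for illustration only.

```python
# Hypothetical sketch of the abstraction/distinction engine pair.
class AbstractionEngine:
    def abstract(self, obj):
        # Lift any source format into a neutral, high-level representation.
        return {"payload": obj["payload"], "kind": obj["kind"], "format": "abstract"}

    def combine(self, a, b):
        # Merge two abstracted objects into one composite content object.
        return {"payload": (a["payload"], b["payload"]),
                "kind": "composite", "format": "abstract"}

class DistinctionEngine:
    def distinguish(self, abstract_obj, target_format):
        # Conform the abstract object to a destination-compatible standard.
        return dict(abstract_obj, format=target_format)

abstraction = AbstractionEngine()
distinction = DistinctionEngine()

video = abstraction.abstract({"payload": "frames", "kind": "video", "format": "mpeg"})
audio = abstraction.abstract({"payload": "samples", "kind": "audio", "format": "pcm"})
composite = abstraction.combine(video, audio)
for_tv = distinction.distinguish(composite, "ntsc")
```

The key design point is the neutral intermediate format: objects only need to be combined and converted once they share that representation, regardless of their original sources.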


In some other instances, the systems include an access point that indicates a number of content objects associated with one group of content object entities, and a number of content objects associated with another group of content object entities. The access point indicates from which group of content object entities a content object can be accessed, and a group of content object entities to which the content object can be directed.


Other embodiments of the present invention provide methods for utilizing content objects that include accessing a content object from a content object entity; abstracting the content object to create an abstracted content object; distinguishing the abstracted content object to create a distinguished content object; and providing the distinguished content object to a content object entity capable of utilizing the distinguished content object. In some cases, the methods further include accessing yet another content object from another content object entity, and abstracting that content object to create another abstracted content object. The two abstracted content objects can be combined to create a composite content object. In one particular case, the first abstracted content object is a video content object and the second abstracted content object is an audio content object. Thus, the composite content object includes audio from one source, and video from another source. Further, in such a case, abstracting the video content object can include removing the original audio track from the video content object prior to combining the two abstracted content objects. As yet another example, the first abstracted content object can be an Internet object, while the other abstracted content object is a video content object.
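The audio-over-video case above can be sketched concretely: abstracting the video removes its original audio track, and the combination step then attaches the separately sourced audio. The function names and track representation below are illustrative assumptions, not the patent's implementation.

```python
# Illustrative sketch: strip the video's original soundtrack during
# abstraction, then merge in audio from a different content object entity.
def abstract_video(video_obj, strip_audio=True):
    tracks = dict(video_obj["tracks"])
    if strip_audio:
        tracks.pop("audio", None)   # remove the original audio track
    return {"tracks": tracks}

def combine(video_abstract, audio_obj):
    # Attach the separately abstracted audio to form a composite object.
    tracks = dict(video_abstract["tracks"])
    tracks["audio"] = audio_obj["samples"]
    return {"tracks": tracks}

movie = {"tracks": {"video": "movie-frames", "audio": "movie-soundtrack"}}
music = {"samples": "album-audio"}
composite = combine(abstract_video(movie), music)
```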


In other cases, the methods can further include identifying a content object associated with one group of content object entities that has expired, and removing the identified content object. Other cases include querying a number of content object entities to identify one or more content objects accessible via the content object entities, and providing an access point that indicates the identified content objects and one or more content object entities to which the identified content objects can be directed.


Yet other embodiments provide methods for accessing content objects within a customer premises. Such methods include identifying content object entities within the customer premises, and grouping the identified content object entities into two or more groups. At least one of the groups of content object entities includes sources of content objects, and at least another of the groups includes destinations of content objects. The methods further include providing an access point that indicates the at least one group of content object entities that can act as content object sources, and the at least one other group of content object entities that can act as content object destinations. In some cases, the methods further include mixing two or more content objects from the first group of content object entities to form a composite content object, and providing the composite content object to a content object entity capable of utilizing it. In other cases, the methods further include eliminating a portion of a content object accessed from one group of content object entities and providing this reduced content object to another content object entity capable of utilizing the reduced content object.


This summary provides only a general outline of some embodiments according to the present invention. Many other objects, features, advantages and other embodiments of the present invention will become more fully apparent from the following detailed description, the appended claims and the accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS

A further understanding of the nature and advantages of the present invention may be realized by reference to the figures which are described in remaining portions of the specification. In the figures, like reference numerals are used throughout several figures to refer to similar components. In some instances, a sub-label consisting of a lower case letter is associated with a reference numeral to denote one of multiple similar components. When reference is made to a reference numeral without specification to an existing sub-label, it is intended to refer to all such multiple similar components.



FIG. 1 illustrates a block diagram of an abstraction and distinction engine in accordance with various embodiments of the present invention;



FIG. 2 illustrates a hierarchical diagram of various content object entities accessible via the abstraction and distinction engine of FIG. 1; and



FIG. 3 illustrates various applications of the abstraction and distinction engine of FIG. 1 in accordance with various embodiments of the present invention.





DETAILED DESCRIPTION OF THE INVENTION

The present invention is related to telecommunication systems in general, and in particular to systems and methods for distributing content objects. In some embodiments of the present invention, a content access point is provided that includes a guide to various content objects maintained in relation to a customer premises. In some cases, this content access point is implemented as a combination of hardware and software; however, one of ordinary skill in the art will appreciate a variety of implementation methods that can be used in accordance with the present invention. Via a guide associated with the access point, a list of all content objects available to a consumer can be displayed, and commands requesting various of the content objects and/or portions thereof can be received and processed. Thus, in some cases, the present invention provides a unifying tool allowing a consumer to access a variety of content objects from a variety of content object entities, manipulate those content objects, and utilize those content objects via one or more content object entities.


As used herein, a content object can be any content maintained as an accessible object that can be accessed, utilized, and/or stored. Thus, for example, a content object can include, but is not limited to, voicemail, email, video, audio, movies, music, games, live broadcasts, user preferences, appliance status information, documents, Internet web pages, and the like. Further, as used herein a content object entity can be any entity capable of storing, sourcing, and/or utilizing a content object. In some cases, content object entities are classified as content object sources, content object destinations, or a combination thereof. Thus, for example, a voice mail system may be both a content object destination and a content object source. This is because the voice mail system can be both a source of audio content objects and a destination for an audio content object. Other examples of content object entities include, but are not limited to, appliance watch systems, caller identification systems, call logging systems, databases of recorded video and/or audio objects, sources of live video and/or audio objects, human interfaces, the Internet, databases of interactive content, databases of documents, video players, audio players, and/or graphical displays. Based on the disclosure provided herein, one of ordinary skill in the art will appreciate the myriad of content objects and/or content object entities that can be utilized in relation to embodiments of the present invention.


Various of the systems and methods utilize a number of content object entities that can be sources and/or destinations for content objects. A combination of abstraction and distinction engines can be used to access content objects from a source of content objects, format and/or modify the content objects, and redistribute the modified content object to one or more content object destinations. In some cases, an access point is included that identifies a number of available content objects, and identifies one or more content object destinations to which the respective content objects can be directed.


Such systems and methods can be used to select a desired content object, and to select a content object entity to which the content object is directed. In addition, the systems and methods can be used to modify the content object as to format and/or content. For example, the content object may be reformatted for use on a selected content object entity, modified to add additional or to reduce the content included in the content object, or combined with one or more other content objects to create a composite content object. This composite content object can then be directed to a content object destination where it can be either stored or utilized.


Some embodiments of the access point, abstraction engine, and/or distinction engine are implemented as an appliance that can be attached to a network interface device (NID) to provide a convenient content object access point for a customer premises. Alternatively, the abstraction and/or distinction engine can be implemented as a microserver associated with the NID. Information about such NIDs and microservers can be obtained from U.S. application Ser. No. 10/377,282, filed Feb. 27, 2003 by Casey et al. and entitled “Systems And Methods For Displaying Data Over Video”; U.S. application Ser. No. 10/356,364, filed Jan. 31, 2003 by Phillips et al. and entitled “Packet Network Interface Device And Systems And Methods For Its Use”; U.S. application Ser. No. 10/356,688, filed Jan. 31, 2003 by Phillips et al. and entitled “Systems, Methods And Apparatus For Providing A Plurality Of Telecommunications Services”; U.S. application Ser. No. 10/356,338, filed Jan. 31, 2003 by Phillips et al. and entitled “Configurable Network Interface Device And Systems And Methods For Its Use”; U.S. application Ser. No. 10/367,596, filed Feb. 14, 2003 by Casey et al. and entitled “Systems And Methods For Delivering A Data Stream To A Video Appliance”; U.S. application Ser. No. 10/367,597, filed Feb. 14, 2003 by Casey et al. and entitled “Systems And Methods For Providing Application Services”; U.S. application Ser. No. 10/377,290, filed Feb. 27, 2003 by Phillips et al. and entitled “Systems And Methods For Forming Picture-In-Picture Signals”; U.S. application Ser. No. 10/377,283 filed Feb. 27, 2003 by Phillips et al. and entitled “Systems And Methods For Monitoring Visual Information”; U.S. application Ser. No. 10/377,584 filed Feb. 27, 2003 by Phillips et al. and entitled “Systems And Methods For Delivering Picture-In-Picture Signals At Diverse Compressions And Bandwidths”; U.S. application Ser. No. 10/377,281 filed Feb. 27, 2003 by Phillips et al. 
and entitled “Systems And Methods For Providing And Displaying Picture-In-Picture Signals”; U.S. application Ser. No. 10/444,941, filed May 22, 2003 by Phillips et al. and entitled “Systems And Methods For Providing Television Signals Using A Network Interface Device”; U.S. application Ser. No. 10/448,249, filed May 22, 2003 by Phillips et al. and entitled “Methods And Apparatus For Delivering A Computer Data Stream To A Video Appliance With A Network Interface Device”; and U.S. application Ser. No. 10/624,454, filed Jul. 21, 2003 by Casey et al. and entitled “Systems And Methods For Integrating Microservers With A Network Interface Device”. Each of the aforementioned patent applications shares one or more inventors, and is assigned to an entity common hereto. Further, the entirety of each of the aforementioned patent applications is incorporated herein by reference for all purposes.


This appliance may include a guide that incorporates a broad range of content media into a single access point. This range of content media can include, but is not limited to, movies, music, games, voicemails, emails, software, security video, emergency alerts, and any other content that comes to the home or can be requested from the network via providers.


Some embodiments of the present invention provide systems for abstraction and distinction of content objects. These systems include an abstraction engine and a distinction engine. The abstraction engine is communicably coupled to a group of content object entities, and the distinction engine is communicably coupled to another group of content object entities. The two groups of content object entities are not necessarily mutually exclusive, and in many cases, a content object entity in one of the groups is also included in the other group. The first of the groups of content object entities includes content object entities such as an appliance control system, a telephone information system, a storage medium including video objects, a storage medium including audio objects, an audio stream source, a video stream source, a human interface, the Internet, and an interactive content entity. The other of the groups of content object entities includes content object entities such as an appliance control system, a telephone information system, a storage medium including video objects, a storage medium including audio objects, a human interface, the Internet, and an interactive content entity.


In particular instances, the abstraction engine is operable to receive a content object from one of the groups of content object entities, and to form the content object into an abstract format. As just one example, this abstract format can be a format that is compatible at a high level with other content formats. In other instances, the abstraction engine is operable to receive a content object from one of the content object entities, and to derive another content object based on the aforementioned content object. Further, the abstraction engine can be operable to receive yet another content object from one of the content object entities and to derive an additional content object therefrom. The abstraction engine can then combine the two derived content objects to create a composite content object. In some cases, the distinction engine accepts the composite content object and formats it such that it is compatible with a particular group of content object entities. In yet other instances, the abstraction engine is operable to receive a content object from one group of content object entities, and to form that content object into an abstract format. The distinction engine can then conform the abstracted content object with a standard compatible with a selected one of another group of content object entities.


Turning to FIG. 1, a combination guide, abstraction, and distinction system 100 in accordance with various embodiments of the present invention is illustrated. System 100 includes a guide 110, a control 120, an abstraction/distinction engine 130, and a number of content object entities 150-164. Content object entities 150-164 can include, but are not limited to, an appliance control system 150, a telephone information system 151-153, a storage medium including video objects 154, a storage medium including audio objects 155, an audio stream source 159-161, a video stream source 156-158, a human interface 162, the Internet 163, and an interactive content entity 164. Human interface 162 can be an audio reception device for encoding voice data, a keyboard, a pen interface, a display (including a television), an audio player, and/or the like. Interactive content entity 164 can be a computer program that provides responses to a user based on the user's actions. Live video and audio sources may include feeds from multiple sources. For example, live video stream 156 may include a feed from a cable television provider for source one 157, and from an antenna for source two 158. Based on this disclosure, one of ordinary skill in the art will appreciate that any number of video sources or channels may be provided via a common live video content object entity. Similarly, one of ordinary skill in the art will appreciate that any number of audio sources or channels may be provided via a common live audio content object entity. Further, one of ordinary skill in the art will recognize other content object entities to which the systems and methods of the present invention can be directed.


As previously discussed, these various content object entities 150-164 can be organized into groups of content object entities 180, 190. One group of content object entities 180 can include, for example, all content object entities that are capable of acting as a content object destination. Thus, for example, this group may include a television capable of receiving and displaying video content objects. Another group of content object entities 190 may include all content object entities that are the source of content objects. Thus, for example, live video feed 156 may be included in this group. It will be appreciated that some of the content object entities can be included in both of the aforementioned groups.


In some embodiments of the present invention, control 120 queries each of content object entities 150-164 to determine the content objects available, the formats of those content objects, and the content formats that each content object entity is capable of utilizing. Thus, for example, a query of live video feed 156 would indicate a number of content objects corresponding to available video channels, but would not indicate that live video feed 156 can utilize any content objects. Alternatively, a query of human interface 162 may indicate a television capable of receiving and utilizing content objects in a particular video format, but not providing any sourced content objects.
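The query step above can be sketched as each entity reporting the objects it sources and the formats it accepts; a pure source (like the live video feed) reports no accepted formats, and a pure destination (like the television) reports no sourced objects. The dictionary schema and entity names are illustrative assumptions.

```python
# Hypothetical capability query issued by the control to each entity.
def query(entity):
    return {"name": entity["name"],
            "sources": entity.get("objects", []),   # content objects offered
            "accepts": entity.get("accepts", [])}   # formats it can utilize

live_video = {"name": "live video feed 156", "objects": ["channel 2", "channel 4"]}
television = {"name": "television (human interface 162)", "accepts": ["ntsc"]}

feed_report = query(live_video)
tv_report = query(television)
```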


Using this query information, control 120 assembles a listing of all available content objects and the respective formats of the content objects. In addition, control 120 assembles a listing of all content object entities capable of receiving content objects, and the content object format that each content object entity is capable of supporting. Further, control 120 identifies all format conversions that can be provided by abstraction/distinction engine 130. From this information, control 120 creates a guide 110 that indicates all available content objects, and all content object entities to which the available content objects can be directed.
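The guide-assembly step above amounts to a reachability computation: an object can be directed to a destination when its format is accepted directly, or when the engine offers a conversion from the object's format to an accepted one. The sketch below uses the NTSC/raster example from the surrounding text; the function name and data shapes are illustrative assumptions.

```python
# Sketch of guide assembly from query results plus engine conversion capability.
def build_guide(objects, destinations, conversions):
    guide = {}
    for name, fmt in objects.items():
        reachable = []
        for dest, accepted in destinations.items():
            # Direct compatibility, or one engine conversion away.
            if fmt in accepted or any((fmt, a) in conversions for a in accepted):
                reachable.append(dest)
        guide[name] = reachable
    return guide

objects = {"live video 156": "ntsc"}
destinations = {"television": {"ntsc"}, "computer display": {"raster"}}
conversions = {("ntsc", "raster")}   # capability of abstraction/distinction engine 130
guide = build_guide(objects, destinations, conversions)
```

With the NTSC-to-raster conversion registered, the guide lists both the television and the computer display as valid destinations for the live video source, matching the second example in the text below.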


Various examples are provided to illustrate the process. In the first, a television capable of receiving an NTSC signal is identified as a content object entity (human interface 162), and live video source 156 providing an NTSC video stream is identified as another content object entity. Thus, guide 110 includes a listing for video source 156 indicating that the content object can be displayed on the identified television. The second example expands on the first: a computer display capable of displaying raster video signals is also identified, and abstraction/distinction engine 130 includes capability for converting an NTSC video signal to a raster format signal. Thus, guide 110 would include a listing for video source 156 indicating that the content object can be displayed on either the television or the computer display.


As yet another example, control 120 may identify an MPEG video content object maintained on recorded video media 154, and an uncompressed recorded audio content object on recorded audio media 155. Further, control 120 may identify a television capable of displaying an NTSC signal including both audio and video, and identify decompression capability and NTSC conversion capability in abstraction/distinction engine 130. Thus, guide 110 would list the MPEG video content object indicating that it can be displayed on the identified television, and listing the audio object indicating that it can also be displayed on the identified television.


Thus, as just one exemplary use of system 100, a user could select both the audio content object and the video content object and indicate that a combination of the two objects is to be displayed on the identified television. This selection would be passed by control 120 to abstraction/distinction engine 130. In turn, abstraction/distinction engine 130 can access the MPEG video content object from recorded video media 154, and decompress the MPEG video content object to create an uncompressed digital video object. This process is generically referred to herein as abstracting, or converting a content object from one format and/or location to a more generally usable format and/or location. Also, abstraction/distinction engine 130 accesses the audio content object. Such an access can also be referred to as abstracting, as the content is being moved to a more accessible location. Then, the audio and video content objects can be merged to create a composite content object. This composite content object can then be modified by abstraction/distinction engine 130 into a format compatible with the identified television. Thus, the digital audio and video are merged, and subsequently converted to an NTSC video signal that is then passed to the identified television for display. This process of modification into an NTSC signal is one form of distinction. The present invention can employ various forms of distinction, all generically referred to herein as distinction or distinguishing. In general, distinction includes modifying the format of a content object and/or moving the content object to a content object entity where it can be displayed.


As yet another example, a user can request, via a web page, to record a broadcast movie. In such a case, guide 110 may be displayed as a web page accessible to the user. Via guide 110, a request for a selected content object indicating that the content object is to be directed to a digital video recorder can be received. The request can be passed to a NID associated with a customer premises, and from the NID, the request can be passed to control 120. Control 120 can process the request and determine which of the various content object entities has access to the requested content object. Based on this information, control 120 can direct abstraction/distinction engine 130 to access the requested content object from the Internet 163 at the specified Internet address, and to format the received information in a format compatible with the selected digital video recorder. Thus, abstraction/distinction engine 130 accesses the Internet 163 and retrieves a web page including a video stream. This video stream can then be converted to a digital video format compatible with the digital video recorder. Once the recording is complete, an alert can be sent to control 120. The recorded video is then maintained as a content object on the digital video recorder. This content object can then be accessed and sent to other content object entities.


As a modification of the previous example, control 120 may include a machine interface that can be programmed with a user's preferences. Based on these preferences, control 120 can query the various content object entities to identify programs that match the preferences, and to automatically record those programs to a digital recorder. Thus, the digital recorder may be automatically populated with a number of content objects matching the user's preferences. These content objects can be derived from a number of different content object entities, and can all be abstracted and/or distinguished such that they exist in a common file format compatible with the digital recorder. In turn, these content objects can be requested from the digital recorder, and abstracted and/or distinguished for utilization on another content object entity. In some cases, the user's preferences can be derived from monitoring the user's access habits. In addition, guide 110 may also include other content guides available from various content object entities. These can include, for example, channel lineups, television guides, video on demand schedules, and/or the like.
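The preference-driven recording described above can be sketched as a filter over program metadata followed by abstraction into one common file format for the recorder. The genre-based preference schema and the `mpeg2` common format are purely illustrative assumptions.

```python
# Illustrative sketch: record programs matching stored user preferences,
# normalized to a single format the digital recorder supports.
def auto_record(programs, preferences, common_format="mpeg2"):
    recorder = []
    for program in programs:
        if program["genre"] in preferences["genres"]:
            # Abstract/distinguish into the recorder's common file format.
            recorder.append({"title": program["title"], "format": common_format})
    return recorder

programs = [{"title": "Nature Hour", "genre": "documentary", "format": "ntsc"},
            {"title": "Late Movie", "genre": "drama", "format": "mpeg4"}]
preferences = {"genres": {"documentary"}}
recorded = auto_record(programs, preferences)
```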


Based on the disclosure provided herein, one of ordinary skill in the art will appreciate a number of format conversions that can be performed by abstraction/distinction engine 130. Such format conversions can include compression, decompression, encryption, decryption, conversions between data types, and/or the like. The following lists just some examples of possible conversions: an MPEG2 to MPEG4 conversion, an MPEG to a proprietary video standard conversion, video resolution adjustments for display on a variety of monitor types, speech to text (e.g., voicemail to email), text to speech (e.g., email to voicemail), text to text (e.g., home caller ID to cell phone caller ID), data to text (e.g., alerts from appliances indicating a change of state such as, for example, a door opening), data to HTML or other computer formats (e.g., requests from a digital recorder to serve a web page), HTML resolution adjustments allowing the HTML to be displayed in a different resolution environment. Again, based on this disclosure, one of ordinary skill in the art will appreciate that the systems and methods of the present invention can be applied in relation to a number of other conversions.
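One way to organize such conversions is as a registry keyed by (source, target) format pairs, with a search that chains registered conversions when no direct converter exists (for instance, MPEG to uncompressed to NTSC). This is a sketch under assumed names; the format labels and the breadth-first chaining strategy are illustrative choices, not the patent's implementation.

```python
# Hypothetical conversion registry for an abstraction/distinction engine.
from collections import deque

converters = {
    ("mpeg", "uncompressed"): lambda x: x + " [decompressed]",
    ("uncompressed", "ntsc"): lambda x: x + " [ntsc-encoded]",
    ("speech", "text"): lambda x: x + " [transcribed]",   # e.g. voicemail to email
}

def convert(obj, src, dst):
    # Breadth-first search over registered (src, dst) pairs to find a chain.
    queue = deque([(src, obj)])
    seen = {src}
    while queue:
        fmt, value = queue.popleft()
        if fmt == dst:
            return value
        for (a, b), fn in converters.items():
            if a == fmt and b not in seen:
                seen.add(b)
                queue.append((b, fn(value)))
    raise ValueError(f"no conversion path from {src} to {dst}")

result = convert("movie", "mpeg", "ntsc")   # chains two conversions
```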


Systems in accordance with the present invention can support a large number of content object types. Such content object types can include, but are not limited to, recorded video content from content providers, recorded security footage from home/remote cameras, real-time video baby monitors, real-time computer usage monitoring via picture-in-picture display, recorded incoming voicemails, recorded outgoing voicemail greetings, voice over IP, caller-id, PSTN caller-id, call Logs, on Screen TV guide, on screen internet broadcast guide, alerts from emergency alert system, digital recorder requests from a web page, software downloads for internet, video on demand, audio on demand, games on demand, and/or the like. Based on this disclosure, one of ordinary skill in the art will appreciate a number of other content object types that can be used in relation to the present invention.


Further, abstraction/distinction engine 130 can be updated by adding additional software offering different conversions as the need arises. Thus, for example, where a user installs a new voice mail system, conversion software capable of accepting audio content objects from the new voice mail system and converting the audio signals to a standard digital audio signal can be added. In this way, the conversion software can be updated to allow content objects from one of content object entities 150-164 to be utilized by a large number of other content object entities.
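The run-time updatability described above can be sketched as an engine whose conversion table is extended without restarting it. The class, format names, and converter below are hypothetical assumptions for illustration only.

```python
# Sketch of installing new conversion software at run time, as when a new
# voice mail system is added. Hypothetical names; not the patent's API.

class ConversionEngine:
    def __init__(self):
        self._converters = {}

    def install(self, src_fmt, dst_fmt, fn):
        """Install new conversion software without restarting the engine."""
        self._converters[(src_fmt, dst_fmt)] = fn

    def supports(self, src_fmt, dst_fmt):
        return (src_fmt, dst_fmt) in self._converters

    def convert(self, src_fmt, dst_fmt, payload):
        return self._converters[(src_fmt, dst_fmt)](payload)

engine = ConversionEngine()
# A newly installed voicemail system ships a converter that turns its
# proprietary audio into a standard digital audio representation.
engine.install("newvm-audio", "standard-audio", lambda p: ("standard", p))
```

Once installed, the new format participates in every existing source-to-destination path the engine already knows about.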


Also, based on the disclosure provided herein, one of ordinary skill in the art will recognize a number of uses for system 100. Such uses can include, but are not limited to, utilizing content objects on content object entities where the content object otherwise would not be used, combining content objects to create composite content objects, and/or providing a user-friendly selection point for a large variety of content object types. It should also be recognized that the output of abstraction/distinction engine 130 can be formatted for display on a content object entity, or for additional processing by another content object entity.


In particular embodiments of the present invention, a machine interface is provided that allows a user to program control 120 to define the types of content objects that the user would like displayed, and which content object entity the user would like to use for display. In this way, the user can simplify guide 110, making it more accessible and/or useful to the user. Alternatively, or in addition, the user may select a content object entity, and guide 110 can present a list of all content objects that can be utilized by the selected content object entity.
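Filtering guide 110 by a selected content object entity can be sketched as matching each object's format against the formats the entity supports. The catalog, format labels, and entity names below are illustrative assumptions.

```python
# Sketch of guide filtering: list every content object whose format the
# selected content object entity can utilize. All data is illustrative.

CONTENT_OBJECTS = [
    {"name": "Scooby Doo", "format": "video"},
    {"name": "Voicemails", "format": "audio"},
    {"name": "Call Log", "format": "text"},
]

ENTITY_FORMATS = {
    "television": {"video"},
    "mp3-player": {"audio"},
    "cell-phone": {"audio", "text"},
}

def objects_for_entity(entity):
    """Return the content objects the selected entity can display."""
    supported = ENTITY_FORMATS.get(entity, set())
    return [obj["name"] for obj in CONTENT_OBJECTS if obj["format"] in supported]
```

A cell phone would thus see voicemails and call logs in its guide, while a television would see only video objects.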


In some cases, various recorded data can be maintained across content object entities in different logical or physical partitions. Thus, for example, one user in a customer premises may be allotted a certain physical storage space on one or more content object entities. Once the user's storage space is exhausted, one or more content objects will need to be removed from the user's storage space before an additional content object can be added. Alternatively, or in addition, the partition can be logical. Such a partition can segregate content objects for use by parents and children within the same customer premises.
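The physical allotment behavior, where an addition is refused until space is freed, can be sketched as a simple quota check. Sizes and names below are illustrative assumptions.

```python
# Sketch of a per-user physical storage allotment: a new content object
# is refused until the user removes something. Illustrative numbers only.

class UserPartition:
    def __init__(self, quota_bytes):
        self.quota = quota_bytes
        self.objects = {}  # content object name -> size in bytes

    def used(self):
        return sum(self.objects.values())

    def add(self, name, size):
        """Store a content object only if it fits within the user's quota."""
        if self.used() + size > self.quota:
            return False  # exhausted: an object must be removed first
        self.objects[name] = size
        return True

    def remove(self, name):
        self.objects.pop(name, None)

partition = UserPartition(quota_bytes=100)
```

A logical partition (e.g., parents versus children) could reuse the same structure keyed by group rather than by physical space.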


Thus, guide 110 can also assemble access information indicating which user and/or group of users can access a particular identified content object. For example, in a customer premises there could be three different personal video recorders all attached to the network. Control 120 could access all of these devices to create a central guide 110. The table of gathered information could appear as the following exemplary table:


















Content Name       Description          License Info  Device Name (DNS Name)  Location (Local/Remote)         Distribution  Access Group
Scooby Doo         Video - Cartoon      Full Usage    PVR1.NID-IP             Local                           World         United States
Terminator         Video - Action       1 Time View   PVR2.NID-IP             Local                           State         United States
Blink 182 Concert  Audio - Music Video  Full Usage    www.music.com           Remote                          City          Denver, CO
Voicemails         Voice-mail           Full Usage    Voicemail.NID-IP        Local and Remote (Second Copy)  Home Only     Casey Family

Turning to FIG. 2, a hierarchical diagram 200 illustrates various content object entities 201-206 accessible via abstraction/distinction engine 130 illustrated in FIG. 1. In particular, hierarchical diagram 200 illustrates one approach in accordance with some embodiments of the present invention for removing expired content objects. Content objects can be expired either after a specified time or after a predetermined number of uses of the content object (e.g., the content object is consumed). As illustrated, content objects can be moved from a first tier of content object entities 201-206 to a second tier of content object entities 211-213 maintained as a more accessible storage area. Content objects are passed to this more accessible storage area based on priorities and expiration information. Thus, content object entities 201-206 can be queried by control 120 to identify content objects available thereon. The status of each of the identified content objects is determined by a content priorities engine 220. Determination of status includes a determination of whether the identified content object has expired 225, and a ranking of the identified content object as to importance 227 relative to other identified content objects. For example, priority can be based on the importance of the user for which the content object is being moved, or on the likelihood that the content object will be utilized.
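The status determination performed by content priorities engine 220 can be sketched as splitting objects into expired and active sets, then ranking the active set. The record fields and the scoring rule are assumptions for illustration; the patent leaves the priority function open.

```python
# Sketch of content priorities engine 220: flag expired content objects
# (by time or by consumed uses) and rank the rest by importance.
import time

def determine_status(objects, now=None):
    """Split identified content objects into (expired, ranked-active) lists."""
    now = time.time() if now is None else now
    expired, active = [], []
    for obj in objects:
        consumed = obj.get("uses", 0) >= obj.get("max_uses", float("inf"))
        timed_out = obj.get("expires_at", float("inf")) <= now
        (expired if consumed or timed_out else active).append(obj)
    # Higher user importance and higher likelihood of use rank first.
    active.sort(key=lambda o: (o["user_rank"], o["use_likelihood"]), reverse=True)
    return expired, active

objs = [
    {"name": "old-show", "expires_at": 10, "user_rank": 1, "use_likelihood": 0.9},
    {"name": "movie", "expires_at": 999, "user_rank": 2, "use_likelihood": 0.5},
    {"name": "clip", "uses": 1, "max_uses": 1, "user_rank": 3, "use_likelihood": 0.1},
]
expired, ranked = determine_status(objs, now=100)
```

Here "old-show" is expired by time, "clip" is consumed after its single permitted use, and only "movie" remains to be migrated between storage tiers.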


In addition to the standard expiration of content objects, a feature can be set to store critical information and delete other data when there is no other space available. For example, video content objects could be expired if storage for voicemails and emails is exhausted and needs to be expanded to support incoming messages. The user and/or a service provider providing a NID can define these options.
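One way to sketch the "store critical information, delete other data" option is an eviction order that sacrifices video content before messages. The ordering below stands in for the user/provider-configurable options mentioned above and is purely illustrative.

```python
# Sketch of configurable eviction: when space is needed for incoming
# messages, expire non-critical video first. Ordering is illustrative.

EVICTION_ORDER = ["video", "audio", "voicemail", "email"]  # first evicted first

def free_space(store, needed):
    """Delete lowest-priority content objects until `needed` bytes are freed."""
    freed = []
    for kind in EVICTION_ORDER:
        for name, (k, size) in list(store.items()):
            if needed <= 0:
                return freed
            if k == kind:
                del store[name]
                needed -= size
                freed.append(name)
    return freed

store = {"movie": ("video", 70), "song": ("audio", 20), "vm1": ("voicemail", 5)}
```

With this policy, reclaiming space for new voicemails removes the movie but leaves existing voicemails untouched.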


To support additional storage, content can be moved to secondary storage automatically as space is required. This can be performed automatically after a specific condition is met (e.g., after ten days content objects can be moved to secondary storage), or manually via a user request (e.g., watch a movie and then record it to DVD). Another example of an automatic transfer would be to copy all audio-based content to CD-ROM secondary storage. Such transfers free up space on a space-constrained system. Further, in some cases, system 100 can include the ability to automatically copy any backup data to a new drive once a failed drive is replaced. The data that is set for backup would be user configurable. Such a system could use standard data protection schemes, such as disk mirroring and striping, as are known in the art.
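The condition-triggered migration (the ten-day example above) can be sketched as a sweep that moves stale objects from primary to secondary storage. Field names and timestamps are illustrative assumptions.

```python
# Sketch of automatic migration to secondary storage once a condition is
# met, using the ten-day example from the text. Illustrative names only.

TEN_DAYS = 10 * 24 * 3600  # seconds

def migrate_stale(primary, secondary, now, age_limit=TEN_DAYS):
    """Move content objects older than age_limit from primary to secondary."""
    for name in list(primary):
        if now - primary[name]["stored_at"] >= age_limit:
            secondary[name] = primary.pop(name)

primary = {
    "old-movie": {"stored_at": 0},                 # stored 20 days ago
    "new-show": {"stored_at": 15 * 24 * 3600},     # stored 5 days ago
}
secondary = {}
migrate_stale(primary, secondary, now=20 * 24 * 3600)
```

A manual "record to DVD" request would simply invoke the same move for a single named object regardless of age.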


Some embodiments of the present invention further include a license monitoring application. This application can allow a user to purchase a content object, remove the content object once the license has expired, and/or limit access to the content object to accesses that conform to the terms of the license. In some cases, the licenses can be stored separate from the content objects to which the licenses are related. Thus, when a content object is selected, an access to the associated license is also initiated. A separate interface can be provided for storing licensed content objects to an offline medium (e.g., CD-ROMs, flash cards, or external drives). When the content object is selected, the associated license is queried and, where the license is valid, the content object is accessed from the offline medium. Where any of the content object is presented in an encrypted format, the license provides rights to access one or more decryption keys useful for decrypting the content object.
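The separately stored license lookup can be sketched as a check performed on every access, enforcing both a validity window and a view count (as in the "1 Time View" entry of the exemplary table). The license schema is an illustrative assumption.

```python
# Sketch of license monitoring: licenses live apart from the content
# objects, and every access first queries the associated license.

LICENSES = {
    # "1 Time View" style license, valid until an illustrative time 200.
    "terminator": {"valid_until": 200, "max_views": 1, "views": 0},
}

def access(name, now):
    """Grant access only when the separately stored license is valid."""
    lic = LICENSES.get(name)
    if lic is None or now > lic["valid_until"] or lic["views"] >= lic["max_views"]:
        return False  # no license, expired, or view count consumed
    lic["views"] += 1
    return True
```

Encrypted content would additionally use the valid license to fetch decryption keys before the object is handed to a content object entity.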



FIG. 3 graphically represents examples in accordance with the present invention for utilizing system 100. Turning to FIG. 3a, a graphical representation 301 of system 100 combining an audio content object 311 with a video content object 312 is illustrated. Audio content object 311 can be, for example, a streaming audio signal available from the Internet, while video content object 312 can be, for example, a cable television channel. Audio content object 311 and video content object 312 are abstracted by abstraction/distinction engine 130 to a combinable format. Then, audio content object 311 and video content object 312 are combined 321, and the combined signals are distinguished for utilization by a determined content object entity. Upon distinction, a composite content object 331 is formed.


It will be appreciated that system 100 can include the ability to combine a number of different content object types including, for example, audio, video, or data tracks to produce enhanced content specified by the user. The example of graphical representation 301 would be very useful where a user is watching a sporting event on television, but would rather listen to the play calling provided on a radio station available over the Internet. System 100 thus accesses only the video portion of the television program and synchronizes it with the audio from the radio station. This synchronization can include delaying one or the other portion such that they align. This delaying function can be aided by the user, who can indicate when the two portions have been synchronized. Thus, for example, the program may be displayed with increasing amounts of delay added to one or the other portion until the user indicates that the delay is correct. Once the delay is correct, the combined program continues.
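The user-aided alignment loop above can be sketched as incrementally adding delay to the leading stream until the user signals that the streams match. The step size and the confirmation callback are illustrative assumptions standing in for the viewer's input.

```python
# Sketch of user-aided synchronization: delay is added in steps to the
# leading stream until the user confirms alignment. Illustrative values.

def align(step_ms, user_confirms):
    """Increase delay on the leading stream until the user confirms sync."""
    delay = 0
    while not user_confirms(delay):
        delay += step_ms
    return delay

# Stand-in for the viewer: they "confirm" once the added delay cancels
# a 120 ms lead of the video over the Internet radio audio.
confirmed_delay = align(step_ms=40, user_confirms=lambda d: d >= 120)
```

In practice the confirmation would come from a remote-control keypress rather than a callback, but the control flow is the same.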


Turning to FIG. 3b, a graphical representation 302 illustrates another exemplary use of system 100 in accordance with other embodiments of the present invention. In this example, only the audio portion of a television program is recorded. In operation, the content object entity supplying the television program is selected, and an audio only content object entity is selected to receive the program. Abstraction/distinction engine 130 separates the audio and video portions of the selected television program, and provides the audio portion to the selected content object entity 341. The selected content object entity 341 can be, for example, a CD recorder, an MP3 player, a radio, and/or the like.


Turning to FIG. 3c, a graphical representation 303 illustrates yet another exemplary use of system 100 in accordance with other embodiments of the present invention. In this example, a video content object 312 is combined with a data stream content object 313 obtained from the Internet. The combination creates a composite content object 351. A clock 321 can be used to synchronize the content objects, and the video content object can be tagged 361 to identify where in the video stream the data is to be added. In this way, video programming can be augmented to include additional details and background information based on the video program.


The invention has now been described in detail for purposes of clarity and understanding. However, it will be appreciated that certain changes and modifications may be practiced within the scope of the appended claims. Accordingly, it should be recognized that many other systems, functions, methods, and combinations thereof are possible in accordance with the present invention. Thus, although the invention is described with reference to specific embodiments and figures thereof, the embodiments and figures are merely illustrative, and not limiting of the invention. Rather, the scope of the invention is to be determined solely by the appended claims.

Claims
  • 1. A method for utilizing content objects by a content access point within a customer's premises, wherein the method comprises: creating, at the content access point, a first list of available content objects and a respective format of each available content object, wherein the content access point is implemented by a demarcation device that isolates the customer's premises network from a provider's network, the demarcation device comprising a broadband modem; creating, at the content access point, a second list of content object entities and one or more respective formats that each content object entity is capable of supporting; creating, at the content access point, a guide indicating available content objects and, for each particular available content object, one or more content object entities to which that particular content object can be directed, based at least in part on the first list and the second list, each of the one or more content object entities being a separate device that is capable of displaying that particular content object; accessing, with the content access point, a first content object from a first content object entity within the customer's premises, wherein the first content object is in a first content format compatible with the first content object entity and wherein the first content object is selected from a group consisting of a voicemail object, an email object, a video object, an audio object, and an Internet web page; abstracting, with the content access point, the first content object to create a second content object in an abstract format, wherein the abstract format is compatible with a plurality of content formats; distinguishing, at the content access point, the second content object to create a third content object, wherein the third content object is in a second content format that is compatible with a second content object entity within the customer's premises, wherein the third content object is selected from a group consisting of a voicemail object, an email object, a video object, an audio object, and an Internet web page, and wherein the third content object is different from the first content format; and providing, from the content access point, the third content object to the second content object entity.
  • 2. The method of claim 1, wherein the method further comprises: accessing a fourth content object from a third content object entity wherein the fourth content object is in a third content format compatible with the third content object entity, wherein the fourth content object is selected from a group consisting of a voicemail object, an email object, a video object, an audio object, a document object, and an Internet web page, and wherein the fourth content object is different from the first content format and the second content format; abstracting the fourth content object to create a fifth content object; and combining the fifth content object with the second content object, wherein the combination of the second and fifth content objects are distinguished to create the third content object.
  • 3. The method of claim 2, wherein the first content object is a video object, and wherein the fourth content object is an audio object.
  • 4. The method of claim 3, wherein abstracting the first content object includes separating an audio portion from a video portion of the video object.
  • 5. The method of claim 2, wherein the first content object is a video object, and wherein the fourth content object is an Internet object.
  • 6. The method of claim 2, wherein the first content object entity is selected from a group consisting of an appliance control system, a telephone information system, a storage medium including video objects, a storage medium including audio objects, an audio stream source, a video stream source, a human interface, the Internet, and an interactive content entity.
  • 7. The method of claim 6, wherein the second content object entity is selected from a group consisting of an appliance control system, a telephone information system, a storage medium including video objects, a storage medium including audio objects, an audio stream source, a video stream source, a human interface, the Internet, and an interactive content entity.
  • 8. The method of claim 7, wherein the first content object entity is different from the second content object entity.
  • 9. The method of claim 7, wherein the third content object entity is selected from a group consisting of an appliance control system, a telephone information system, a storage medium including video objects, a storage medium including audio objects, an audio stream source, a video stream source, a human interface, the Internet, and an interactive content entity.
  • 10. The method of claim 9, wherein the first content object entity is different from the second content object entity and the third content object entity.
  • 11. The method of claim 1, wherein the method further comprises: identifying a content object associated with one of the first plurality of content object entities that has expired; andremoving the identified content object.
  • 12. The method of claim 1, wherein the first content object is a video object, wherein abstracting the first content object includes removing a visual portion of the video object, and wherein the second content object includes an audio portion of the video object.
  • 13. The method of claim 1, wherein the first content object comprises a voicemail and the third content object comprises an email.
  • 14. The method of claim 1, wherein the first content object comprises an email and the third content object comprises a voicemail.
  • 15. The method of claim 1, further comprising: limiting access to the first and second content objects to conform with terms of a license to the first content object.
  • 16. The method of claim 1, wherein the demarcation device is incorporated within a premises network interface device (“NID”) that is affixed to an external wall of the customer's premises.
  • 17. A content access point for utilizing content objects by a content access point within a customer's premises, the content access point comprising: a processor; and a storage medium having stored thereon a set of instructions for controlling operation of the content access point, the set of instructions comprising: instructions for creating a first list of available content objects and a respective format of each available content object, wherein the content access point is located within a demarcation device that isolates the customer's premises network from a provider's network; instructions for creating, at the content access point, a second list of content object entities and one or more respective formats that each content object entity is capable of supporting; instructions for creating, at the content access point, a guide indicating available content objects and, for each particular available content object, one or more content object entities to which that particular content object can be directed, based at least in part on the first list and the second list, each of the one or more content object entities being a separate device that is capable of displaying that particular content object; instructions for accessing a first content object from a first content object entity within the customer's premises, wherein the first content object is in a first content format compatible with the first content object entity and wherein the first content object is selected from a group consisting of a voicemail object, an email object, a video object, an audio object, and an Internet web page; instructions for abstracting the first content object to create a second content object in an abstract format, wherein the abstract format is compatible with a plurality of content formats; instructions for distinguishing the second content object to create a third content object, wherein the third content object is in a second content format that is compatible with a second content object entity within the customer's premises, wherein the third content object is selected from a group consisting of a voicemail object, an email object, a video object, an audio object, and an Internet web page, and wherein the third content object is different from the first content format; and instructions for providing the third content object to the second content object entity; wherein the content access point is implemented by a demarcation device that isolates the customer's premises network from a provider's network, the demarcation device comprising a broadband modem.