Determining device that performs processing of output pictures

Information

  • Patent Number
    9,615,139
  • Date Filed
    Tuesday, April 3, 2012
  • Date Issued
    Tuesday, April 4, 2017
Abstract
A system and method for determining the characteristics of a device coupled to a client device are disclosed. A method, according to one embodiment, includes driving a display device with a first video output signal formatted according to a first video interface specification; responsive to driving the display device, soliciting user input based on information included in the first video output signal; determining a characteristic of the display device based on the user input; and driving the display device according to the determined characteristic.
Description
TECHNICAL FIELD

This disclosure relates in general to the field of television systems, and more particularly, to the field of interactive television.


BACKGROUND OF THE DISCLOSURE

With recent advances in digital transmission technology, subscriber television systems are now capable of providing much more than the traditional analog broadcast video. In implementing enhanced programming, the home communication terminal (“HCT”), otherwise known as the set-top box, has become an important computing device for accessing content services (and content within those services) and navigating a user through a maze of available services. In addition to supporting traditional analog broadcast video functionality, digital HCTs (or “DHCTs”) now also support an increasing number of two-way digital services such as video-on-demand and personal video recording.


Typically, a DHCT is connected to a cable or satellite, or generally, a subscriber television system, and includes hardware and software necessary to provide the functionality of the digital television system at the user's site. Some of the software executed by a DHCT can be downloaded and/or updated via the subscriber television system. Each DHCT also typically includes a processor, input/output capabilities, communication components, and memory, and is connected to a television set or other display device, such as a personal computer. While many conventional DHCTs are stand-alone devices that are externally connected to a television, a DHCT and/or its functionality may be integrated into a television or personal computer or even an audio device such as a programmable radio, as will be appreciated by those of ordinary skill in the art.


Technological advances now permit generation and transmission of a variety of higher-resolution pictures and video formats. Coincident with the advancing technology of transmission equipment are technological improvements in DHCTs and television sets that receive and display a plurality of video formats. There is a wide range of television sets available today, including conventional cathode ray tube (CRT) styles, overhead projection, rear projection, liquid crystal display (LCD) based sets, and plasma television sets that can be mounted on a wall. These variations in television sets lead to a wide variety of characteristics that affect processing of a television picture for display. A sourced television or video signal is typically processed for display by considering its characteristics as well as the TV set's characteristics, such as the size of the screen, the aspect ratio of the display, and whether the display implements an interlaced or progressive scan format, among other characteristics. Due to the increasing complexity and variation of DHCTs and television sets (and thus the multitude of characteristics to consider in processing a video signal), connecting a television set to, or otherwise communicating with, a DHCT and achieving a viewable picture of the desired display quality is often a challenge to even the most technologically adept. Thus a need exists in the industry to address the aforementioned and/or other deficiencies and/or inadequacies.





BRIEF DESCRIPTION OF THE DRAWINGS

The disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.



FIG. 1 is a block diagram depicting an example subscriber television system (STS), in accordance with one embodiment of the disclosure.



FIG. 2 is a block diagram depicting an example headend shown in the STS of FIG. 1, in accordance with one embodiment of the disclosure.



FIG. 3A is a block diagram depicting an example digital home communication terminal (DHCT) shown in the STS of FIG. 1, which is coupled to the example headend of FIG. 2 and a television, in accordance with one embodiment of the disclosure.



FIG. 3B is a schematic diagram of the example remote control device shown in FIG. 3A, in accordance with one embodiment of the disclosure.



FIG. 4 is a flow diagram depicting an example method for determining display device characteristics, in accordance with one embodiment of the disclosure.



FIGS. 5-7 are block diagrams of the example DHCT and the television set shown in FIG. 3A with a display screen that solicits user feedback based on a plurality of output formats from the DHCT to the TV set, in accordance with one embodiment of the disclosure.





DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

The preferred embodiments of the disclosure now will be described more fully hereinafter with reference to the accompanying drawings. The preferred embodiments of the disclosure include systems and methods that provide an interactive session with a user to determine the characteristics of a television set or other display device coupled to a digital home communication terminal (DHCT).


Video formats can vary in picture size, frame rate, and in whether pictures are progressive or interlaced. A progressive picture comprises all the lines of a frame, whereas an interlaced picture has a top field and a bottom field of alternating lines that together constitute a complete frame. An interlaced picture is displayed during respective field intervals. Video formats can further vary in other attributes or characteristics such as color format, color primaries, picture width-to-height aspect ratio, and the width-to-height aspect ratio of the individual picture elements, or pixels, that make up the picture.
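
To make these attributes concrete, the following is a minimal Python sketch that collects them in a single descriptor; the field names and the example formats are illustrative only and are not drawn from any standard's normative text.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class VideoFormat:
    """Illustrative descriptor for the picture attributes discussed above."""
    width: int            # active pixels per line
    height: int           # active lines per frame
    progressive: bool     # True = progressive, False = interlaced (top/bottom fields)
    frame_rate: float     # frames (or frame-equivalents) per second
    color_format: str     # e.g. "YPbPr" or "RGB"
    picture_aspect: tuple # width:height aspect ratio of the picture
    pixel_aspect: tuple   # width:height aspect ratio of individual pixels

# A few example formats in the spirit of the ones mentioned in this description.
EXAMPLE_FORMATS = [
    VideoFormat(1920, 1080, False, 29.97, "YPbPr", (16, 9), (1, 1)),  # 1080i
    VideoFormat(1280, 720, True, 59.94, "YPbPr", (16, 9), (1, 1)),    # 720p
    VideoFormat(704, 480, False, 29.97, "YPbPr", (4, 3), (10, 11)),   # 480i SD
]

if __name__ == "__main__":
    for fmt in EXAMPLE_FORMATS:
        scan = "progressive" if fmt.progressive else "interlaced"
        print(f"{fmt.width}x{fmt.height} {scan} @ {fmt.frame_rate} fps, "
              f"{fmt.color_format}, picture {fmt.picture_aspect[0]}:{fmt.picture_aspect[1]}")
```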


As a non-limiting example, the video formats of a compressed digital television signal processed in a DHCT for display include formats specified by the ATSC (Advanced Television Systems Committee) Digital Television Standard A/54 and include the characteristics described above. An analog television signal, such as an NTSC (National Television System Committee) television signal, can be processed for display as well.


As there are multiple video formats for a sourced television signal, there are also multiple physical output interfaces or connectors (or similarly, ports) in the DHCT from which a DHCT-processed television signal can be output. The processed signal that is output through a physical output interface complies with the format specification of that interface. An interface specification, such as a video interface specification, can also include mechanisms for providing ancillary data. For example, the provision of closed-caption text within a video signal may be part of the video interface specification.


In addition to possible differences in signaling and physical characteristics, video interface specifications can differ in one or more parameters that pertain to the characteristics of the video picture. For instance, parameters that can differ in value across distinct video interface specifications include picture size, frame rate, whether pictures are progressive or interlaced, color format, colorimetry, picture width-to-height aspect ratio, the width-to-height aspect ratio of pixels, and if and how ancillary data is provided.


A video interface specification can allow for one or more picture parameters to take different values. For instance, the value of a colorimetry parameter may differ according to the video picture's colorimetry. Furthermore, a video interface specification can support multiple sets of a combination of parameter values that specify the characteristics of the picture. The parameters that can take different values in the multiple sets of a combination of parameters correspond to characteristics that may include picture size, frame rate, whether pictures are progressive or interlaced, color format, colorimetry, picture width-to-height aspect ratio, width-to-height aspect ratio of pixels, and if and where ancillary data is carried. For instance, the SMPTE (Society of Motion Picture and Television Engineers) 274 specification supports multiple input and output picture formats, such as 1920×1080 interlaced or progressive pictures, one of various frame rates, and RGB or YPbPr encoded pixels. SMPTE 296 specifies 1280×720 progressive pictures for input and output.


A physical output interface or connector can further serve to output video signals corresponding to one or more video format specifications. For instance, the physical connector trio used to output component analog video as YPbPr could be configured to output a television signal as an RGB component analog video signal. As another non-limiting example, a physical output interface can be used to output a television signal compliant to SMPTE 274 or SMPTE 296.


It would be understood by those having ordinary skill in the art that other specifications and/or standards can be used that will include the same, fewer, more, or different parameters and yet be considered within the scope of the preferred embodiments of the disclosure. Consequently, a DHCT configured in accordance with the preferred embodiments determines the video formats supported by the television set or other display device connected to the DHCT. Responsive to these determinations, the DHCT can then set its video output format, resolution of graphics overlays, and closed caption support accordingly and process sourced television signals accordingly.


The systems and methods of the preferred embodiments of the disclosure will be described in the context of a subscriber television system, and particularly, a DHCT that is connected to a TV set, although other systems that include communication with an interactive display are considered to be within the scope of the disclosure. Additionally, reference herein will be made to physical output ports with the understanding that physical output ports include ports, connectors, and physical interfaces, including wireless interfaces. Since the preferred embodiments of the disclosure can be understood in the context of a subscriber television system, one such example system is described below, with further description of the DHCT and headend components. Following the description of these components is an example method of the preferred embodiments of the disclosure, followed by illustrations of example interactive sessions of a discovery procedure that can be used to determine display device characteristics and enable a quality picture to be displayed.


The disclosure may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those of ordinary skill in the art. Furthermore, all “examples” given herein are intended to be non-limiting and among others not shown but understood to be within the scope of the disclosure.


I. Subscriber Television System


FIG. 1 is a block diagram depicting an example subscriber television system (STS) 10, in accordance with one embodiment of the disclosure. In this example, the STS 10 includes a headend 11 and a digital home communication terminal (DHCT) 16 that are coupled via a communications network 18. It will be understood that the STS 10 shown in FIG. 1 is merely illustrative and should not be construed as implying any limitations upon the scope of the preferred embodiments of the disclosure. For example, although single components (e.g., a headend 11 and a DHCT 16) are illustrated in FIG. 1, the STS 10 can feature a plurality of any one of the illustrated components, or may be configured with alternative embodiments for any one of the individual components or with yet other additional components not enumerated above. Subscriber television systems also included within the scope of the preferred embodiments of the disclosure include systems not utilizing physical structured cabling for transmission, such as, but not limited to, satellite systems and terrestrial-broadcast systems.


The DHCT 16 is typically situated at the residence or place of business or recreation of a user and may be a stand-alone unit or integrated into a television set, a personal computer, or other display devices, or an audio device, among other media devices. The DHCT 16 receives content (video, audio and/or other data) from the headend 11 through the network 18 and in some embodiments, provides reverse information to the headend 11 through the network 18.


The headend 11 receives content from one or more content providers (not shown), including local providers. The content is processed and/or stored and then transmitted to client devices such as the DHCT 16 via the network 18. The headend 11 may include one or more server devices (not shown) for providing content to the DHCT 16. The headend 11 and the DHCT 16 cooperate to provide a user with television services via a television set (not shown). The television services may include, for example, broadcast television services, cable television services, premium television services, video-on-demand (VOD) services, and/or pay-per-view (PPV) services, among others.


II. Headend


FIG. 2 is an overview of one example headend 11, which provides the interface between the STS 10 (FIG. 1) and the service and content providers. The overview of FIG. 2 is equally applicable to a hub (not shown), and the same elements and principles may be implemented at a hub instead of the headend 11. It will be understood that the headend 11 shown in FIG. 2 is merely illustrative and should not be construed as implying any limitations upon the scope of the preferred embodiments of the disclosure. The headend 11 receives content from a variety of service and content providers, which can provide input in a variety of ways. The headend 11 combines the content from the various sources and distributes the content to subscribers via the distribution systems of the network 18.


In a typical system, the programming, services and other information from content providers can be distributed according to a variety of mechanisms. The input signals may be transmitted from sources to the headend 11 via a variety of transmission paths, including satellites (not shown) and terrestrial broadcast transmitters and antennas (not shown). The headend 11 can also receive content from a direct feed source 210 via a direct line 212. Other input sources from content providers include a video camera 214, an analog input source 208, and/or an application server 216. The application server 216 may include more than one line of communication. One or more components such as the analog input source 208, the direct feed source 210, the video camera 214, and the application server 216 can be located external to the headend 11, as shown, or internal to the headend 11 as would be appreciated by one having ordinary skill in the art. The signals provided by the content or programming input sources can include a single content instance (e.g., a program episode or show) or a multiplex that includes several content instances.


The headend 11 generally includes one or more receivers 218 that are each associated with a content source. MPEG (Motion Picture Experts Group) encoders, such as encoder 220, are included for digitally encoding local programming or a real-time feed from the video camera 214 or the like. The encoder 220 outputs the respective compressed video and audio streams corresponding to the analog audio/video signal received at its input. For example, the encoder 220 can output formatted MPEG-2 or MPEG-1 packetized elementary (PES) streams or transport streams compliant to the syntax and semantics of the ISO (International Organization for Standardization) MPEG-2 standard, respectively. The PES or transport streams may be multiplexed with input signals from a switch 230, receiver 218 and control system 232. The multiplexing logic 222 processes the input signals and multiplexes at least a portion of the input signals into a transport stream on connection 240. Analog input source 208 can provide an analog audio/video broadcast signal that can be input into a modulator 227. From the modulator 227, a modulated analog output signal can be combined at combiner 246 along with other modulated signals for transmission into transmission medium 250. Alternatively, analog audio/video broadcast signals from the analog input source 208 can be input into the modulator 228. Alternatively, an analog audio/video broadcast signal can be input directly from the modulator 227 to the transmission medium 250. The analog broadcast content instances are transmitted via respective RF channels, each assigned for transmission of an analog audio/video signal such as NTSC video.


A switch, or switches, such as asynchronous transfer mode (ATM) switch 230, provide an interface to an application server 216. There can be multiple application servers 216 providing a variety of services such as a pay-per-view service, including video on demand (VOD), a data service, an Internet service, a network system, or a telephone system. Service and content providers may download content to an application server located within the STS 10. The application server 216 may be located within the headend 11 or elsewhere within the STS 10, such as in a hub. The various inputs into the headend 11 are then combined with the other information from the control system 232, which is specific to the STS 10, such as local programming and control information, which can include, among other things, conditional access information. The headend 11 contains one or more modulators 228 to convert the received transport streams on connection 240 into modulated output signals suitable for transmission over the transmission medium 250 through the network 18. Each modulator 228 may be a multimodulator including a plurality of modulators, such as, but not limited to, QAM (quadrature amplitude modulation) modulators, that radio frequency modulate at least a portion of the transport streams on connection 240 to become output transport streams on connections 242. The output signals on connections 242 from the various modulators 228 or multimodulators are combined, using equipment such as the combiner 246, for input into the transmission medium 250, which is sent via the in-band delivery path 254 to subscriber locations, such as the DHCT 16.


In one embodiment, the server 216 also provides various types of data on connections 288a, 288b to the headend 11. The data, in part, is received by the media access control (MAC) functions 224 that output MPEG transport packets containing data on connections 266a, 266b instead of digital audio/video MPEG streams. The control system 232 enables the television system operator to control and monitor the functions and performance of the STS 10. The control system 232 interfaces with various components, via communication link 270, in order to monitor and/or control a variety of functions, including the frequency spectrum lineup of the programming for the STS 10, billing for each subscriber, and conditional access for the content distributed to subscribers. Information, such as conditional access information, is communicated from the control system 232 to the multiplexing logic 222 where it is multiplexed into a transport stream provided on connection 240.


Among other things, the control system 232 provides input to the modulator 228 for setting operating parameters, such as selecting certain content instances or portions of the transport streams for inclusion in one or more output transport streams on connections 242, system-specific MPEG table packet organization, and/or conditional access information. Control information and other data can be communicated to hubs and DHCTs 16 via an in-band delivery path 254 or via an out-of-band delivery path 256.


The out-of-band data is transmitted via the out-of-band forward data signal (FDS) 76 of the transmission medium 250 by mechanisms such as, but not limited to, a QPSK (quadrature phase-shift keying) modem array 226. Two-way communication utilizes the reverse data signal (RDS) 80 of the out-of-band delivery path 256. Hubs and DHCTs 16 transmit out-of-band data through the transmission medium 250, and the out-of-band data is received in the headend 11 via the out-of-band RDS 80. The out-of-band data is routed through router 264 to the application server 216 or to the control system 232. The out-of-band control information includes such information as, among many others, a pay-per-view purchase instruction and a pause viewing command from the subscriber location to a video-on-demand type application server located internally or external to the headend 11, such as the application server 216, as well as any other data sent from the DHCT 16 or hubs, all of which will preferably be properly timed.


The control system 232 also monitors, controls, and coordinates all communications in the subscriber television system 10 (FIG. 1), including video, audio, and data. The control system 232 can be located at the headend 11 or remotely. The control system 232 also includes a broadcast file system (BFS) server 202 that provides content to the DHCT 16. A broadcast file system preferably carries data formatted in directories and files by the BFS server 202, which is used for producing and transmitting data streams throughout the STS 10, and which provides an efficient means for the delivery of application executables and application content (e.g., data) to the DHCT 16. In particular, the BFS server 202 and its counterpart, a BFS client module 343 in the DHCT 16, are part of the broadcast file system. The BFS server 202 repeatedly sends content to the DHCT 16 over a period of time in a cyclical manner so that the DHCT 16 may access the content as needed.


The transmission medium 250 distributes signals from the headend 11 to the other elements in the subscriber television system 10 (FIG. 1), such as a hub, a node (not shown), and subscriber locations. The transmission medium 250 can incorporate one or more of a variety of media, such as optical fiber, coaxial cable, and HFC, satellite, direct broadcast, or other transmission media.


III. DHCT and Remote Control Device


FIG. 3A is a block diagram illustration of an example DHCT 16 that is coupled to the headend 11 and a television set 341, in accordance with one embodiment of the disclosure. It will be understood that the DHCT 16 shown in FIG. 3A is merely illustrative and should not be construed as implying any limitations upon the scope of the preferred embodiments of the disclosure. For example, some of the functionality performed by applications executed in the DHCT 16 (such as the IPG application 397) may instead be performed completely or in part at the headend 11 and vice versa, or not at all in some embodiments. A DHCT 16 may be a stand-alone unit or integrated into another device such as, for example, a television set or a personal computer or other display devices or an audio device, among others. The DHCT 16 preferably includes a communications interface 342 for receiving signals (video, audio and/or other data) from the headend 11 through the network 18, and provides for reverse information to the headend 11 through the network 18.


The DHCT 16 preferably includes one or more processors, such as processor 344, which controls the functions of the DHCT 16 via a real-time, multi-threaded operating system (O.S.) 353 that enables task scheduling and switching capabilities. The DHCT 16 also includes a tuner system 345 comprising one or more tuners for tuning into a particular television channel or spacing in the radio-frequency spectrum to display content and for sending and receiving various types of content to and from the headend 11. The tuner system 345 can select from a plurality of transmission signals provided by the subscriber television system 10 (FIG. 1). The tuner system 345 enables the DHCT 16 to tune to downstream content transmissions, thereby allowing a user to receive digital and/or analog content delivered in the downstream transmission via the subscriber television system 10. The tuner system 345 includes, in one embodiment, an out-of-band tuner for bi-directional QPSK (or QAM in some embodiments) communication and one or more QAM tuners (in band) for receiving television signals. Additionally, a receiver 346 receives externally generated information, such as user inputs or commands from an input device, such as a remote control device 380.



FIG. 3B is a schematic diagram of the example remote control device 380 illustrated in FIG. 3A, in accordance with one embodiment of the disclosure. The example remote control device 380 includes a select button 386 for making selections on a display screen, navigation buttons 385 for navigating within a particular display screen, a menu button 388 for accessing other screens, such as a general settings screen, and a quick settings button 387 to access a quick settings display screen wherein various parameters can be changed. In one embodiment, the remote control device 380 further includes a TV capabilities query button 381 that, when selected, enables a user to invoke a discovery procedure that prompts a series of user interfaces for determining characteristics of the TV set 341. In addition, the remote control device 380 can include a cycle button 383 that, when selected, cycles the DHCT output of a processed television signal between two or more preset formats for driving the TV set 341 through one or more physical output connections. The preset formats are preferably set by the user during the discovery procedure. Many alternative methods of providing user input may be used including a remote control device with different buttons and/or button layouts, a keyboard device, a voice activated device, etc. The embodiments of the disclosure described herein are not limited by the type of device used to provide user input.
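
As an illustration of the cycle-button behavior described above, the following is a minimal sketch; the class name and the preset strings are hypothetical and stand in for whatever output formats the user configured during the discovery procedure.

```python
# Hypothetical sketch of the cycle-button behaviour: each press advances the
# DHCT's video output to the next user-preset format.
class OutputFormatCycler:
    def __init__(self, preset_formats):
        self.presets = list(preset_formats)  # e.g. chosen during the discovery procedure
        self.index = 0

    def on_cycle_button(self):
        """Advance to the next preset and return it so the caller can apply it."""
        self.index = (self.index + 1) % len(self.presets)
        return self.presets[self.index]

if __name__ == "__main__":
    cycler = OutputFormatCycler(["480i over S-Video", "720p over YPbPr", "1080i over DVI"])
    for _ in range(4):
        print("output set to:", cycler.on_cycle_button())
```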


Referring again to FIG. 3A, the DHCT 16 processes analog and/or digital transmission signals for storage in a storage device 373 such as an optical or hard disk drive, and/or for display to the television set 341. The DHCT 16 preferably includes a signal processing system 314 and a media engine 330. One or more of the systems of the signal processing system 314 can be implemented with software, a combination of software and hardware, or preferably in hardware. The signal processing system 314 includes a demodulating system 316 and a demultiplexer/parser 318. The demodulating system 316 comprises functionality for demodulating an analog transmission signal or a differently modulated signal carrying digital transmission signals or information. For instance, the demodulating system 316 can demodulate a signal in the tuned frequency spacing that was modulated, among others, as a QAM-modulated signal that carries compressed digital television signals or information. The demultiplexer/parser 318 can include MPEG-2 transport demultiplexing. For example, when tuned to frequency spacings carrying a digital transmission signal, the demultiplexer/parser 318 enables the separation of packets of data corresponding to one or more desired video and audio streams of one or more television signals for further processing. Concurrently, the demultiplexer/parser 318 precludes further processing of packets in the multiplexed transport stream that are irrelevant or not desired, such as packets of data corresponding to other video streams. Thus, the components of the signal processing system 314 are capable of QAM demodulation, forward error correction, and demultiplexing of MPEG-2 transport streams, and parsing of elementary streams and packetized elementary streams. Additional components, not shown, include conditional access components, as well as, among others, an analog video decoder for processing an analog transmission signal and, in one implementation, a compression engine for converting a decoded analog transmission signal to compressed audio and video streams that are produced in accordance with the syntax and semantics of a designated audio and video coding method, such as specified by the MPEG-2 audio and MPEG-2 video ISO standard, among others.


The signal processing system 314 outputs packetized compressed streams and presents them as input for storage in the storage device 373, or in other implementations, as input to the media engine 330 for decompression by a video decoder (or video decompression engine) 333 and an audio decoder (or audio decompression engine) 332 for display on the TV set 341 via the output system 331 and output connections 340. One having ordinary skill in the art will appreciate that the signal processing system 314 will preferably include other components not shown, including memory, decryptors, samplers, digitizers (e.g., analog-to-digital converters), and multiplexers, among other components. Further, it will be understood that one or more of the components listed above will interface with the processor 344 and/or system memory 349 (and/or dedicated memory for a particular component) to facilitate data transfer and/or processing of the video and/or audio signal for display and/or storage.


The media engine 330 includes the digital video decoder 333, the digital audio decoder 332, a memory controller 334, a blitter 337, and the output system 331. The media engine 330 provides graphics data generation and graphics data processing capabilities through functional modules such as the blitter 337, which is a functional 2-D (i.e., 2-dimensional) DMA (Direct Memory Access) module, often referred to as a blit engine. The functionality in the blitter 337 allows graphical data or image data stored in a media memory 329 or system memory 349 to be read into the media engine 330, processed, and written back (i.e., output) to memory. The graphical data or image data is read in some predetermined 2-D order, such as raster scan order, and processed in a systematic, pipelined fashion through one or more functional elements in the blitter 337. Processing through the blitter 337 can cause one or more pixels of the input image to be changed to a new value. The output image written to memory is representative of a pre-specified set of blitter operations.


During the course of program execution, the processor 344 writes operation settings and/or parameters to registers in the media engine 330 to effect one or more operations in one or more input images to be processed through the blitter 337. Typically, the processor 344 sets the registers prior to invoking the blit operation.


According to the mode set for the blit operation, the number of input pixels read from an input image in a blit operation may be less than, equal to, or more than the number of output pixels produced and written back to memory. According to the mode set in registers, a blit operation may comprise reading multiple input images to produce an output image, such as an alpha-blending operation on two input images. A blit operation may also be set to produce more than one output image. A blit operation may further be set to process pixels of an input image conditionally (e.g., only if they have a certain value or meet certain conditions).
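
As a software illustration of the two-input, alpha-blending blit mode just mentioned, the sketch below blends two equally sized images with a single alpha weight. This is only a minimal stand-in for what a hardware blit engine would do, which could also apply per-pixel alpha.

```python
def alpha_blend_blit(src_a, src_b, alpha):
    """Blend two input images of identical size into one output image.

    Each image is a list of rows of (R, G, B) tuples; alpha is the weight of
    src_a (0.0..1.0). This mirrors, in plain Python, a two-input blit that
    produces a single output image.
    """
    out = []
    for row_a, row_b in zip(src_a, src_b):
        out_row = []
        for (ra, ga, ba), (rb, gb, bb) in zip(row_a, row_b):
            out_row.append((
                round(alpha * ra + (1 - alpha) * rb),
                round(alpha * ga + (1 - alpha) * gb),
                round(alpha * ba + (1 - alpha) * bb),
            ))
        out.append(out_row)
    return out

if __name__ == "__main__":
    red  = [[(255, 0, 0)] * 4 for _ in range(2)]
    blue = [[(0, 0, 255)] * 4 for _ in range(2)]
    print(alpha_blend_blit(red, blue, 0.5)[0][0])  # an even mix of red and blue
```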


Blitter functionality further includes a scaling capability that can be employed to effect vertical and/or horizontal scaling of an input image. Thus, an input image can be upscaled or downscaled according to the operational mode and parameters written to the corresponding blitter registers. Such scaling operations can be used to modify the aspect ratio of an input image, for example, from a 4 by 3 aspect (i.e., 4:3) ratio (horizontal to vertical ratio) to a 16 by 9 (i.e., 16:9) aspect ratio. Furthermore, the blitter 337 can be set to perform color conversion or alter the color of an input image.


The scaling functionality in the blitter 337 can be performed with sample-rate converters or scaling filters of multiple taps and phases, as known to those practicing digital signal processing in the state of the art.


As described previously, the blitter 337 is capable of scaling an input image with a 4:3 aspect ratio to another aspect ratio, such as 16:9, although often at the expense of introducing distortion. The blitter 337 can further be set, for the blit operation, to crop the boundaries of the input image prior to conducting the scaling operation by simply omitting to read (or input) parts of the input image from memory. The blitter 337 can also crop the boundaries of the image it is processing after conducting the scaling operation by simply omitting to write (or output) parts of the resulting image to memory. For instance, the blitter 337 is capable of processing an image and cropping it by omitting to write a pre-specified top and bottom portion, or rectangular sections, to media memory 329. The left and right exterior portions, or pillars, of an input image can be similarly cropped.
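
The crop-then-scale sequence described above can be sketched in a few lines of Python. Nearest-neighbour resampling stands in for the blitter's multi-tap scaling filters, and the function names are illustrative rather than part of any real blit engine API.

```python
def crop(image, left=0, right=0, top=0, bottom=0):
    """Drop the given number of columns/rows from the edges of the image."""
    h = len(image)
    w = len(image[0])
    return [row[left:w - right] for row in image[top:h - bottom]]

def scale_nearest(image, out_w, out_h):
    """Nearest-neighbour resize; a stand-in for the blitter's scaling filters."""
    in_h, in_w = len(image), len(image[0])
    return [
        [image[y * in_h // out_h][x * in_w // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]

if __name__ == "__main__":
    # Tiny 8x6 image: crop 1-pixel pillars on each side, then scale to 16x9.
    img = [[(x, y) for x in range(8)] for y in range(6)]
    cropped = crop(img, left=1, right=1)
    resized = scale_nearest(cropped, 16, 9)
    print(len(resized), "rows of", len(resized[0]), "pixels")
```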


As described previously, the blitter 337 possesses capability to perform color conversion on each pixel of the input graphical data from one color format to another color format. For example, color conversion may comprise converting an input image from an RGB color format to a YPbPr color format, or vice-versa.


Color conversion may also comprise conversion between color primaries specifications, such as from the color primaries specified in ITU-R (International Telecommunication Union-Radiocommunication) Recommendation BT.601 to the color primaries specified in ITU-R Recommendation BT.709. In one embodiment, color conversion is employed to create an image that is displayed during an interactive discovery phase to determine the color “temperature” of a TV set or display attached to the DHCT 16.
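
A minimal sketch of the per-pixel RGB-to-YPbPr conversion discussed above, using the well-known BT.601 and BT.709 luma coefficients; quantization ranges and full primaries conversion are deliberately omitted, and the function is illustrative only.

```python
# Sketch of the color conversion described above: RGB to YPbPr using either
# BT.601 or BT.709 luma coefficients. Input R, G, B are normalized to 0..1.
LUMA_COEFFS = {
    "BT.601": (0.299, 0.587, 0.114),
    "BT.709": (0.2126, 0.7152, 0.0722),
}

def rgb_to_ypbpr(r, g, b, standard="BT.709"):
    kr, kg, kb = LUMA_COEFFS[standard]
    y = kr * r + kg * g + kb * b
    pb = 0.5 * (b - y) / (1.0 - kb)
    pr = 0.5 * (r - y) / (1.0 - kr)
    return y, pb, pr

if __name__ == "__main__":
    # Pure red converts to a noticeably different Y under the two standards.
    print("BT.601:", rgb_to_ypbpr(1.0, 0.0, 0.0, "BT.601"))
    print("BT.709:", rgb_to_ypbpr(1.0, 0.0, 0.0, "BT.709"))
```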


The blitter 337 further possesses the capability to perform a color transformation that represents a non-linear transfer function from input to output to correct for the perceived intensity (i.e., lightness) of the type of TV set 341 or display driven by the DHCT 16. This color transformation may serve the purpose of what practitioners in the state of the art often call gamma correction (i.e., correction of perceived intensity), or it may serve other purposes that assist in the TV discovery phase.
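
A minimal sketch of such a non-linear transfer function, assuming a simple power-law (gamma) model; the per-display gamma value would be one of the characteristics settled during the discovery phase, and the function name is illustrative.

```python
# Minimal sketch of a gamma-correcting transfer function of the kind mentioned
# above: a non-linear mapping applied to each normalized pixel intensity.
def gamma_correct(intensity, display_gamma=2.2):
    """Pre-compensate a linear intensity (0..1) for a display's gamma curve."""
    if not 0.0 <= intensity <= 1.0:
        raise ValueError("intensity must be normalized to 0..1")
    return intensity ** (1.0 / display_gamma)

if __name__ == "__main__":
    for v in (0.0, 0.25, 0.5, 1.0):
        print(f"linear {v:.2f} -> encoded {gamma_correct(v):.3f}")
```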


All aforementioned blitter operations as well as other blitter operations, as a whole or any combination thereof, may be employed to create images for use during the TV discovery phase.


The media engine 330 processes and feeds data for output via the output system 331 to a television set 341. The output system 331 is configured (e.g., by the processor 344) to convert processed video and graphical pictures that are composited by the media engine 330 to a video signal for output, such signal consistent with the format and signaling specification of a port in the set of physical output ports 340 in the DHCT 16. As an example, the output system 331 includes a digital video encoder (DENC) 336 that converts reconstructed video data (or pictures) fed (or otherwise received) at its input to an analog video signal that is output through a port in the set of physical output ports 340 to drive a TV display connected to the DHCT 16. In one embodiment, the same output video signal is output simultaneously through a plurality of ports in the set of physical output ports 340.


The set of physical output ports 340 may comprise physical connectors to which one end of a corresponding physical cable is attached, while the opposite end of the cable is attached to a corresponding physical connector in the TV set 341 (or display). A physical connector in the set of physical output ports 340 may also be a wireless transmitter or emitter device that emits the video signal to a corresponding wireless receiver in the TV set 341. An auxiliary device, such as a second TV display, may also be connected between the DHCT 16 and the TV set 341 through a second set of corresponding connectors.


A physical connector in the set of physical output ports 340 may also serve as both input and output, as input, or as an asymmetrical output-input port wherein video and audio are output to the TV set 341 and the TV set's display status and/or capabilities, characteristics, or attributes are input to the DHCT 16.


Data fed to one of the physical output ports 340 is sourced from the media engine memory 329 or memory 349 in a manner that produces a raster scan of displayed pixels consistent with the interface or port of the physical output ports 340 through which the display is connected to the DHCT 16. As described below, the output system 331, under the direction of a display manager 335 and in cooperation with the processor 344 and operating system 353, provides a formatted signal to one or more ports of the physical output ports 340 that are coupled to corresponding inputs (not shown) at the TV set 341, or at other devices such as a VCR, among others. The physical output ports 340 include an RGB port, an S-Video port, a channel 3/4 port, a baseband port, a component analog port, and/or a YPbPr port. TV signals are formatted by the DHCT 16 according to well-known video interface specifications. For example, TV signals routed to the baseband port are formatted by the DHCT 16 according to a baseband video format specification; baseband video can be analog NTSC (or PAL or SECAM) baseband video. Similarly, TV signals routed to the Ch. 3/4 RF (radio frequency) output port are formatted as RF. The S-Video port is for S-video, which is the same as baseband video except that the luminance and chroma are transferred on separate wires. YPbPr is an analog component interface that is popular in early HDTV displays; it couples luminance (Y) and two color difference signals (Pb and Pr) to the display device. The same components, when digitized, are referred to as Y, Cb, and Cr. Other connections of the physical output ports 340 include a digital video interface (DVI) to drive a TV set or display that receives non-compressed digital TV signals at its input, and/or an IEEE-1394 interface (not shown) to drive a TV set or display with a possibly compressed digital TV signal, among others. DVI is a digital interface often used to transmit HDTV signals to a display. In some embodiments, some interfaces, such as the DVI interface or interfaces or ports not shown, can be bi-directional.


A display buffer (not shown) in media memory 329 is designated for the displayable graphical data. The memory controller 334 in the media engine 330 grants access to transfer data from system memory 349 or other sections of media memory 329 to the display buffer in a timely way that safeguards against the generation of tear artifacts on the TV display. For instance, data transfer can be granted to locations in the display buffer corresponding to locations already passed by the raster-scan ordered data fed from the display buffer into the DENC 336. Thus, data written to the display buffer is always behind (in raster-scan order) the display buffer locations being read and fed into the output system 331. Alternatively, data can be written to a secondary display buffer (not shown), also called an off-screen or composition buffer, and the display buffer and the off-screen buffer can be logically swapped, for example, during the vertical blanking interval of the output TV signal.
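
The double-buffering alternative described above can be sketched as follows; the class and method names are hypothetical, and the "swap" stands for the logical exchange performed during the vertical blanking interval.

```python
# Sketch of the double-buffering scheme described above: the application draws
# into an off-screen composition buffer while the display buffer is being
# scanned out, and the two are swapped during the vertical blanking interval.
class DoubleBufferedDisplay:
    def __init__(self, width, height):
        self.display_buffer = [[0] * width for _ in range(height)]    # being scanned out
        self.offscreen_buffer = [[0] * width for _ in range(height)]  # safe to draw into

    def draw(self, x, y, value):
        """Writes go to the off-screen buffer, so no tear can appear on screen."""
        self.offscreen_buffer[y][x] = value

    def on_vertical_blanking(self):
        """Logically swap the buffers while nothing is being scanned out."""
        self.display_buffer, self.offscreen_buffer = (
            self.offscreen_buffer, self.display_buffer)

if __name__ == "__main__":
    fb = DoubleBufferedDisplay(4, 2)
    fb.draw(0, 0, 255)
    fb.on_vertical_blanking()
    print(fb.display_buffer[0])  # the completed frame is now the one displayed
```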


Under the arbitration of the memory controller 334, the audio decoder 332 and the video decoder 333 decode the compressed audio and video signals stored in compressed buffers (not shown) in the media memory 329 to obtain the corresponding audio and video data in an uncompressed format. The reconstructed or uncompressed audio and video data are stored in uncompressed buffers (not shown) in the media memory 329. The process of decoding is coordinated by the memory controller 334 such that each of the decoders 332 and 333 is granted access or authorization each time the respective decoder imports data from or exports data to the media memory 329. Data stored in the decoded buffers may be fed to the output system 331 and the audio output system (not shown) under the arbitration of the memory controller 334. The audio output system may be part of the output system 331 and may include digital-to-analog converters (DACs).


The output system 331, for instance, includes a DENC 336 that provides a video output via physical output ports 340 while the audio DAC (not shown) provides an audio output via an audio connection (not shown). The video signal, which has been converted and possibly composited with graphics, is formatted appropriately for output via the output system 331 in cooperation with the display manager 335, operating system 353, and processor 344, in accordance with a video interface type and a video format of the display device that is receiving the video output (e.g., high definition television (HDTV), NTSC or phase alternate line (PAL)).


The output system 331 comprises capabilities to convert processed video and graphical pictures simultaneously for output. A video display pipeline (VPipe) 338 in the output system 331 receives (or is fed) video pictures from the media memory 329 and processes the pictures for output in a systematic fashion. In one embodiment, the video display pipeline 338 includes (not shown) one or more line buffers and vertical sample-rate converters to effect vertical resizing, horizontal sample-rate converters to effect horizontal resizing, and/or in-line color-conversion capabilities. Furthermore, the video display pipeline 338 may include memory and circuit logic (programmable in some embodiments) to effect picture de-interlacing of interlaced pictures for display as progressive or as larger interlaced pictures. The de-interlacing process may involve the access of the media memory 329 to read a plurality of lines from one or more fields corresponding to one or more consecutive video pictures from a TV signal to process them and generate each line of a converted video picture.
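
As one concrete (and deliberately simple) example of de-interlacing, the sketch below line-doubles a single field, a technique often called "bob" de-interlacing. The multi-field, adaptive processing hinted at above is outside the scope of this illustration, and the function name is hypothetical.

```python
def bob_deinterlace(field):
    """Line-double one field (a list of rows) into a full progressive frame.

    This is the simplest possible stand-in for the de-interlacing logic
    described above; a real implementation may read lines from several
    consecutive fields and interpolate adaptively.
    """
    frame = []
    for row in field:
        frame.append(list(row))  # original field line
        frame.append(list(row))  # repeated to fill the missing line
    return frame

if __name__ == "__main__":
    top_field = [[1, 1, 1], [3, 3, 3]]   # lines 0 and 2 of a tiny frame
    print(bob_deinterlace(top_field))    # 4 progressive lines
```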


A graphics display pipeline (GPipe) 339 in the output system 331 receives (or is fed) graphical pictures or data from a display buffer in media memory 329 and processes the images for output in a systematic fashion. The graphics display pipeline 339 may include (not shown) one or more line buffers and/or vertical scalers (e.g., sample-rate converters) to effect vertical resizing. The graphics display pipeline 339 can also have horizontal scalers or sample-rate converters to effect horizontal resizing, and/or in-line color-conversion capabilities to effect color conversion.


The output system 331 further includes programmable digital logic and control circuitry (not shown), among other circuitry, to overlay or composite the converted graphical picture at the end of the graphics display pipeline over the converted video picture at the end of the video display pipeline 338 (or vice versa). The converted graphical picture and the converted video picture may be composited with a common center but do not have to be of the same picture resolution. For instance, the converted video picture may span a 1280×720 picture of 16:9 aspect ratio while the graphical picture may span a 960×720 picture of 12:9 (or equivalently 4:3) aspect ratio.
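
The common-center compositing just described amounts to a small offset computation, sketched here with the 1280×720 video and 960×720 graphics example from the text; the helper function is illustrative.

```python
def centered_offset(video_size, graphics_size):
    """Top-left offset that centers the graphics plane over the video plane."""
    vw, vh = video_size
    gw, gh = graphics_size
    return ((vw - gw) // 2, (vh - gh) // 2)

if __name__ == "__main__":
    # The example from the text: 1280x720 (16:9) video, 960x720 (4:3) graphics.
    print(centered_offset((1280, 720), (960, 720)))  # -> (160, 0)
```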


The output system 331 further comprises programmable digital logic and control circuitry (not shown) to enable and disable the video display pipeline 338, to enable and disable the graphics display pipeline 339, and/or to alpha blend the spatially corresponding pixels output by the two respective display pipelines 338,339 to appear as a mix (or translucent) picture that is output to the TV set 341. Some pixels in the graphical picture may have a transparent color value.


The output system 331 preferably generates a TV signal for output to the TV set 341 according to the specification of the video interface used for a particular output port 340. If the video display pipeline 338 is disabled, a graphical picture is displayed on the screen of TV set 341. If the graphical display pipeline 339 is disabled, the video picture (e.g., from a tuned TV channel, from the storage device 373, etc.) is displayed on the TV set 341. In one embodiment, the DHCT 16 performs an interactive discovery session to find the TV set display or TV set characteristics by outputting simultaneously a TV signal through a first output port of the physical output ports 340 and another TV signal through a second output port of the physical output ports 340. For example, a first video display pipeline (not shown) of the video display pipeline 338 in the output system 331 is employed to convert a video picture for output through the aforementioned first output port of the physical output ports 340 and a second video display pipeline (not shown) of the video display pipeline 338 in the output system 331 is employed to convert the same or different video picture for output through the aforementioned second output port of the physical output ports 340.


In another embodiment, a first video display pipeline (not shown) of the video display pipeline 338 and a first graphics display pipeline (not shown) of the graphics display pipeline 339 in the output system 331 are employed during the discovery phase to output a TV signal through an output port (e.g., a first output port) of the physical output ports 340 as either a video picture or as a composition of a video picture and a graphical picture. A second video display pipeline (not shown) of the video display pipeline 338 in the output system 331 is employed to output a TV signal through another (e.g., a second) output port of the physical output ports 340.


In another embodiment, a second graphics display pipeline (not shown) of the graphics display pipeline 339 is also employed (along with a second video display pipeline) during the discovery phase to output a TV signal through the second output port as either a video picture or as a composition of a video picture and a graphical picture.


The DHCT 16 can include one or more storage devices, such as storage device 373, preferably integrated into the DHCT 16 through an interface 375 (e.g., IDE (integrated drive electronics) or SCSI (small computer system interface), etc.), or externally coupled to the DHCT 16 via a communication port 374. The communication port 374 can be a wireless or wired interface, and is for receiving and/or transmitting data to other devices. For instance, the DHCT 16 may feature USB (Universal Serial Bus), Ethernet (for connection to a computer), IEEE-1394 (for connection to media content devices in an entertainment center), serial, and/or parallel ports. The storage device 373 can be optical (e.g. read/write compact disc), but is preferably a hard disk drive. The storage device 373 includes one or more media, such as a hard disk 301. A storage device controller 379 in the storage device 373 of DHCT 16, in cooperation with a device driver 311 and the operating system 353, grants access to write data to or read data from the local storage device 373. The processor 344 can transfer content (e.g., data) from memory 349 to the local storage device 373 or from the local storage device 373 to the memory 349 by communication and acknowledgement with the storage device controller 379.


In one embodiment, data stored, written, and retrieved from the storage device 373 includes one or more images to be processed during a discovery phase to determine the type of TV set or display attached to or otherwise in communication with the DHCT 16 and the characteristics of the TV set or display. Data stored in the storage device 373 can also include one or more sets of audio samples to be played back during the discovery phase. For example, the audio samples can be speech instructions or other types of audio. One or more picture sequences can also be stored in and retrieved from the storage device 373 for use during the discovery phase.


In one embodiment, the stored images are displayed as graphical pictures during the discovery phase. The storage device 373 can include graphical pictures and video picture sequences of different characteristics, such as pictures of different picture aspect ratios, different colorimetry, different gammas, and/or different pixel aspect ratios, among other characteristics. Distorted pictures and pictures with, for example, blank side pillars, blank top and bottom rectangular regions, or a box with blanked rectangular regions may also be stored in the storage device 373 for employment during the discovery phase.


The DHCT 16 includes memory 349, which includes volatile and/or non-volatile memory, for storing various applications, modules and data for execution and use by the processor 344. Basic functionality of the DHCT 16 is provided by an operating system 353. Among other things, the operating system 353 includes a resource manager 367 that provides an interface to resources of the DHCT 16 such as, for example, computing resources, and a broadcast file system (BFS) client 343 that cooperates with the BFS server 202 (FIG. 2) to receive data and/or applications that are delivered from the BFS server 202 in a carousel fashion. The operating system 353 further includes one or more device drivers, such as device driver 311, that work in cooperation with the operating system 353 to provide operating instructions for communicating with peripheral devices.


Data to be used during the discovery phase can be transmitted to and received by the DHCT 16 as one or more files using the broadcast file system (via BFS server 202 (FIG. 2) and BFS module 343). Data can further be stored in the storage device 373 prior to exercising the discovery phase.


One or more programmed software applications, herein referred to as applications, are executed by utilizing the computing resources in the DHCT 16. Note that an application typically includes a client part and a server counterpart that cooperate to provide the complete functionality of the application. The applications may be resident in memory 349 or the storage device 373, or stored in a combination of memory 349 and storage device 373. Applications stored in memory 349 (or storage device 373) are executed by the processor 344 (e.g., a central processing unit or digital signal processor) under the auspices of the operating system 353. Data required as input by an application is stored in memory 349 or the storage device 373 (or a combination) and read by the processor 344 as needed during the course of the application's execution.


Input data may be stored in memory 349 by a secondary application or other source, either internal or external to the DHCT 16, or possibly anticipated by the application and thus created with the application at the time it was generated as a software application. Data generated by an application is stored in memory 349 by the processor 344 during the course of the application's execution, or if required, transferred to the storage device 373 from memory 349 by the processor 344 during the course of the application's execution. The availability of data, location of data, whether in memory 349 or in the local storage device 373, and the amount of data generated by a first application for consumption by a secondary application is communicated by messages. Messages are communicated through the services of the operating system 353, such as interrupt or polling mechanisms or data sharing mechanisms such as semaphores.


An application referred to as a navigator 355 is resident in memory 349. The navigator 355 provides a navigation framework for services provided by the DHCT 16. For instance, the navigator 355 includes core functionality such as volume and configuration settings. The navigator 355 preferably handles signals invoked from the channel navigation buttons on the remote control device 380.


The memory 349 also contains a platform library 356. The platform library 356 is a collection of utilities useful to applications, such as a timer manager, a compression manager, a configuration manager, an HTML parser, a database manager, a widget toolkit, a string manager, and other utilities (not shown). These utilities are accessed by applications via application programming interfaces (APIs) as necessary so that each application does not have to contain these utilities. Two components of the platform library 356 that are shown in FIG. 3A are a window manager 359 and a service application manager (SAM) client 357. Note that in some embodiments, one or more of the platform library components may be resident in the operating system 353. The window manager 359 provides a mechanism for implementing the sharing of the display device screen regions and user input. The window manager 359 in the DHCT 16 is responsible for, as directed by one or more applications, implementing the creation, display, and de-allocation of the limited DHCT screen resources. It allows multiple applications to share the screen by assigning ownership of screen regions, or windows.


The window manager 359 also maintains, among other things, a user input registry 350 in memory 349 so that when a user enters a key or a command via the remote control device 380 or another input device such as a keyboard or mouse, the user input registry 350 is accessed to determine which of various applications running on the DHCT 16 should receive data corresponding to the input key and in which order.


The SAM client 357 is a client component of a client-server pair of components, with the server component being located in the headend 11, typically in the control system 232 (FIG. 2), although not shown. A SAM database 360 (i.e. structured data such as a database or data structure) in memory 349 includes a data structure of services and a data structure of channels that are created and updated by the headend 11. Herein, database will refer to a database, structured data or other data structures as is well known to those of ordinary skill in the art. The SAM client 357 also interfaces with the resource manager 367 to control resources of the DHCT 16. Many services can be defined using the same application component, with different parameters. Examples of services include, without limitation and in accordance with one implementation, presenting television programs (available through a WatchTV application 362), pay-per-view events (available through a PPV application (not shown)), digital music (not shown), media-on-demand (available through an MOD application (not shown)), and an interactive program guide (IPG) (available through an IPG application 397).


In the example DHCT 16 depicted in FIG. 3A, memory 349 also includes a web browser application 366 for providing web browsing services and a personal video recording (PVR) application 377 for providing personal video recording services. It should be clear to one with ordinary skill in the art that these applications are not limiting and merely serve as examples for this present embodiment of the disclosure. These applications, and others provided by the cable system operator, are top level software entities on the network for providing services to the user.


An executable program or algorithm corresponding to an operating system (OS) component, or to a client platform component, or to an application, or to respective parts thereof, can reside in and execute out of memory 349 and/or the storage device 373. Likewise, data input into or output from any executable program can reside in memory 349 and/or the storage device 373.


IV. Example Method for Determining Display Device Characteristics

As described above, there are many possible video formats that are received by the DHCT 16 and converted to a television signal compliant with the appropriate video interface specification for the respective port of the physical output ports 340. For example, ATSC includes 18 different video formats, which include parameters for the number of active lines, total lines, horizontal pixels (e.g., pixels per line), aspect ratio (e.g., 16:9 or 4:3), vertical rate, frame rate, scan type (progressive or interlaced), and the type of TV set (e.g., SDTV or HDTV) that a particular format is best suited for. For a TV set capable of sourcing a TV signal through more than one video interface or with a video interface that supports more than one video format, the nature of the picture displayed on a TV screen (or other display screen) can vary depending on what format the DHCT 16 configures the output signal (e.g., output from the output ports 340 to the TV set 341) to be. Accurate assumptions about the characteristics of the TV set 341 or display can be made based on the nature of the picture displayed in response to the video format of the television signal fed to the TV set 341 from the DHCT 16. Although the DHCT 16 does not know what type of TV set it is connected to, nor the characteristics of that TV set, it can output some predefined video formats and query the user about what he or she sees. Based on the response of the user, not only can the DHCT 16 determine the characteristics of the connected TV set (e.g., aspect ratio, color temperature, gamma characteristics, etc.), but it can also fine-tune the picture quality to improve what the user sees. Note that in some embodiments some of these characteristics may be input by the user in response to a query by the DHCT 16 (e.g., the DHCT 16 can prompt questions on the TV display, such as "what type of TV do you have? year made? model number? manufacturer?," etc.).
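For illustration only, a few format descriptors of the kind the DHCT 16 might cycle through can be captured in a small table. The Python sketch below is a representative subset rather than the full set of 18 ATSC formats, and the structure and field names are assumptions, not taken from the disclosure:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class VideoFormat:
        name: str            # shorthand, e.g., "1080i"
        active_lines: int
        pixels_per_line: int
        aspect_ratio: str    # "16:9" or "4:3"
        scan_type: str       # "interlaced" or "progressive"
        tv_type: str         # "HDTV" or "SDTV"

    # Illustrative subset of formats, ordered from highest to lowest resolution.
    FORMATS = [
        VideoFormat("1080i", 1080, 1920, "16:9", "interlaced", "HDTV"),
        VideoFormat("720p", 720, 1280, "16:9", "progressive", "HDTV"),
        VideoFormat("480p", 480, 704, "4:3", "progressive", "SDTV"),
        VideoFormat("480i", 480, 704, "4:3", "interlaced", "SDTV"),
    ]

    for fmt in FORMATS:
        print(f"{fmt.name}: {fmt.active_lines} active lines, {fmt.aspect_ratio}, {fmt.scan_type}")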


Even if characteristics are input by a user, a discovery phase may still be employed where, for example, a TV set 341 has multiple interfaces to receive the output TV signal from the DHCT 16 and the user may have pre-configured the TV set 341 to a user-specific set of preferences. For instance, a user may opt to pre-configure a TV set 341 with a physical screen (e.g., the glass portion of the screen, screen portion of a projection TV, etc.) having a 4:3 aspect ratio using the TV set's settings to display a 16:9 TV signal received from the DHCT 16 in one of multiple configurable displayable ways. That is, the user can set a 16:9 TV signal to be displayed as, for example, a letterboxed picture (e.g., blank top and bottom portions envelop the picture), as a distorted picture, or as a full-screen picture without distortion but with the left- and right-most portions cropped.


An interactive session(s) (or application(s)) that serves as a discovery phase according to an embodiment of the disclosure is executed upon power-up, or at other times such as when the user hooks up a different TV set to the DHCT 16, to determine the capabilities of the TV set 341 coupled to the DHCT 16. The discovery phase can be viewed as comprising two phases. Phase I includes determining how the TV set 341 can be driven. The TV set 341 can be a high definition television (HDTV) set, a national television systems committee (NTSC) set, or a PAL set. It is in phase I that the sourced TV signal received at the tuner system 345 of the DHCT 16 is mapped to a corresponding output of the DHCT 16 (and thus corresponding input of a TV set 341). For example, through one or more interactive sessions with the user, the DHCT 16 (or user) may decide that when the DHCT 16 is tuned to SD (standard definition) or NTSC channels, the signal is to be processed and fed through the baseband port or S-video port of the physical output ports 340. Similarly, if the DHCT 16 is tuned to an HD channel, the DHCT 16 (or user) may determine that the signal is to be processed and fed through the YPbPr port or the DVI port of the physical output ports 340. Phase II includes determining additional characteristics of the TV set 341, which serves as a calibration process. For example, it is in phase II that the color temperature, gamma characteristics, and/or de-interlacing characteristics or capabilities, among others, can be verified or determined.
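One way to picture the outcome of phase I is a simple mapping from the class of the tuned signal to the candidate output ports named above; the Python sketch below is a minimal illustration under assumed names and is not a data structure prescribed by the disclosure:

    # Minimal sketch of a phase I result: class of tuned signal -> candidate
    # output ports. Port and class labels follow the examples in the text; the
    # dictionary and helper function themselves are assumptions for illustration.

    PHASE_I_MAPPING = {
        "SD/NTSC": ["baseband", "S-video"],   # standard definition channels
        "HD": ["YPbPr", "DVI"],               # high definition channels
    }

    def ports_for_signal(signal_class):
        """Return the candidate output ports for a tuned signal class."""
        return PHASE_I_MAPPING.get(signal_class, [])

    print(ports_for_signal("HD"))        # ['YPbPr', 'DVI']
    print(ports_for_signal("SD/NTSC"))   # ['baseband', 'S-video']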



FIG. 4 is a flow diagram of one example method for implementing phase I and phase II of the discovery phase, in accordance with one embodiment of the disclosure. The blocks in the flow diagram of FIG. 4 should be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process. Alternate implementations are included within the scope of the preferred embodiments of the present disclosure in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, or with additional steps or steps omitted, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present disclosure.


Referring to FIG. 4, and with continued reference to FIG. 3A, step 402 includes receiving a request to commence discovery. The discovery phase can be invoked using a dedicated key or button in the remote control device 380 (FIG. 3B), such as the TV capabilities query button 381 (FIG. 3B). An interactive session can also be accomplished through a quick settings user interface screen (e.g., not shown, but invoked by the user selecting the quick settings button 387, FIG. 3B) or through a general settings screen (e.g., not shown, but invoked by the user navigating through menu items upon pressing the menu button 388, FIG. 3B).


Step 404 includes converting a video or picture sequence according to a defined video interface specification. Representative pictures or video sequences that include the different video formats in ATSC A/54 or SCTE 43, for example, which vary in parameters such as aspect ratio (e.g., 4:3 versus 16:9) and TV type (e.g., SDTV versus HDTV), among others, are used during the discovery phase. The video formats are processed for output by exercising the conversion capabilities of the TV output system 331 according to defined categories of parameters for a video output interface, which also provides some efficiency in driving the outputs. The pictures or video sequences (and/or audio) can be provided by the DHCT 16 as a result of real-time or time-shifted (e.g., buffered to the storage device 373) conversion of a TV signal received from the network 18 (FIG. 1). Preferably, picture or video sequence samples of various parameters are stored in the storage device 373 and provided by the DHCT 16 during the discovery phase. For example, these samples can be downloaded to the storage device 373 upon receipt via the BFS module 343 or downloaded by the DHCT manufacturer. Additionally, graphics, video, or a combination of graphics and video can be provided during the discovery phase according to the mechanisms described in association with FIG. 3A.


Step 406 includes driving one or more display devices through the physical output port with a video format corresponding to the video interface specification to which the conversion in step 404 was implemented. The DHCT cycles through each video format in response to the user selecting the cycle button 383, or in alternative embodiments, automatically for a defined interval of time, requesting input from the user for each format. The display manager 335, in cooperation with the media engine 330, the operating system 353, memory (e.g., memory 349 and/or media memory 329), and the processor 344, preferably is responsible for cycling the various video formats through the output connections 340 to the inputs of the TV set 341. The cycle is preferably set from the highest resolution to the lowest resolution. For example, the DHCT may output 1080i first, and then output 720p, then 480p, and then 480i. Once a user enters input, it is assumed that the TV set 341 is to be driven at the highest resolution format to minimize degradation of the highest resolution TV set that is sourced. Driving the TV set with a common video format simplifies graphics overlay management by the display manager 335 and the operating system 353, as it shortens the development cycle and reduces the provisions required to support legacy applications. In some embodiments, the DHCT 16 can transmit audio instructions in addition to or in lieu of the video and/or graphics data.
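The cycle described above amounts to an ordered walk over output formats, from highest to lowest resolution, keeping the first format the user acknowledges. The Python sketch below assumes hypothetical drive_output() and wait_for_user_ack() helpers and only illustrates the ordering; it is not the display manager's code:

    # Sketch of cycling output formats from highest to lowest resolution and
    # keeping the first one the user acknowledges. drive_output() and
    # wait_for_user_ack() are placeholders for DHCT behavior not shown here.

    CYCLE_ORDER = ["1080i", "720p", "480p", "480i"]   # highest resolution first

    def drive_output(port, video_format):
        print(f"Driving {port} with {video_format}")

    def wait_for_user_ack(timeout_seconds):
        return False   # placeholder: a real DHCT would watch remote-control input

    def discover_format(port, timeout_seconds=15):
        for video_format in CYCLE_ORDER:
            drive_output(port, video_format)
            if wait_for_user_ack(timeout_seconds):
                return video_format    # keep the highest acknowledged resolution
        return None                    # no format acknowledged on this port

    print("Chosen format:", discover_format("YPbPr"))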


Step 408 includes soliciting a user response based on what is displayed on the TV screen. In one embodiment, the display manager 335, in cooperation with the processor 344, can cause the generation and presentation of various GUI screens to solicit help from a user to determine the type of inputs that are supported by the TV set 341. One purpose for driving the inputs of the TV set 341 is to present a picture on the screen of the TV set 341 along with a GUI screen that enables the user to comment on whether he or she sees any picture on the display. Note that in other embodiments, audio instructions (e.g., such as speech instructions) can be used, alone or in combination with graphics and/or video, to solicit user feedback and/or provide instruction which would facilitate the determination of how the TV set 341 can be driven.


Step 410 includes determining whether user input was received within a period of time defined by a fixed or programmable threshold. The failure to receive an inputted response from the user may indicate that the user did not see a screen (e.g., there may not be a connection made by the user or the particular video format cycled through the output port is not supported). If a video format applied to the TV set input was not compatible with the TV set (e.g., a video format that is not compatible with the scan format or other characteristics of the TV set 341), or no connection has been made between the DHCT port and the corresponding TV set port, an out-of-synch picture is likely to be displayed. If the DHCT 16 determines that no user input is received within a predetermined or defined period of time of driving the TV set with a signal having a prescribed video format, the DHCT 16 automatically selects a new port, or another video format for the same output port (e.g., if more than one video format is supported by the selected output port) (step 422), and repeats the process starting at step 404. In some embodiments, selection of the next port is not automatically initiated until a user enters input, regardless of whether instructions or other messaging on the screen are viewable or not, thus omitting the determination step (step 410). For example, instructions in a written user's guide (or by phone) can require a user to enter one type of input (e.g., the cycle button 383, FIG. 3B) when the video is out-of-synch (e.g., instructions do not appear), and another type of input (e.g., the select button 387, FIG. 3B) when the displayed picture and/or instructions are coherent (e.g., observable and/or legible). In some embodiments without the automatic feature, the determination step may still be employed.
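The timeout behavior of steps 410 and 422 can be sketched as a loop that advances to the next format, and then to the next port, whenever no input arrives within the threshold. The routine below is illustrative only; the port/format lists and the helper function are assumptions, not taken from the disclosure:

    # Illustrative sketch of steps 410/422: advance to another format or port
    # when no user input arrives within a fixed or programmable threshold.

    PORTS_AND_FORMATS = [
        ("DVI", ["1080i", "720p"]),
        ("YPbPr", ["1080i", "720p", "480p"]),
        ("S-video", ["480i"]),
    ]

    def user_input_within(threshold_seconds):
        return False   # placeholder: poll the remote control for the threshold period

    def run_discovery(threshold_seconds=15):
        for port, formats in PORTS_AND_FORMATS:
            for video_format in formats:
                print(f"Trying {video_format} on {port}")
                if user_input_within(threshold_seconds):
                    return port, video_format   # a viewable picture was acknowledged
        return None                             # nothing acknowledged on any port

    print(run_discovery())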


The programmable threshold that defines the period of time for cycling through the video formats and/or output ports may be programmed and stored in NVM 348 of the DHCT 16. Alternatively, the threshold value can be stored in the storage device 373 of the DHCT 16 or in NVM (not shown) in the remote control device 380. Programming of the threshold value can be performed at a manufacturing facility, by a cable operator from the headend 11 while the DHCT 16 is at a subscriber's or user's premise, or by the subscriber or user.


If a GUI is viewable, suggesting that the cycled video format is both established through a valid connection to the TV set 341 and that the TV set 341 can be driven using that applied video format, step 412 includes mapping the video interface specification and corresponding port with a parameter or parameters of the TV signal, video sequence or picture. In other words, the DHCT 16 stores in memory (e.g., media memory 329, memory 349, and/or the storage device 373, FIG. 3A) the parameter(s) or settings required to operate the video display pipeline 338 and/or graphics display pipeline 339, the functions in the output system 331, and the output ports 340 to effect the processing and output of a TV signal according to the parameters of each possible TV signal format that DHCT 16 is capable of receiving via its tuner system 345 (FIG. 3A) or an input or communication port. In addition to storing the parameters and settings required to process and output each respective TV signal format, the DHCT 16 stores a table (not shown) that associates each possible TV signal format to one or more designated output ports 340. For example, for TV signals that have parameters that indicate the signal originated in or is otherwise compliant to an HD transmission medium, the DHCT 16 determines the parameters of the received TV signal by, for example, processing information in the signal and comparing the information with information stored in the SAM database 360. The SAM database 360 may include stored information in an association table (not shown) that maps the TV signal to a designated output port. The DHCT 16 then converts the TV signal, and drives the converted TV signal through the designated port (e.g., YPbPr port) of the physical output ports 340.
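The association of each receivable TV signal format with pipeline settings and a designated output port, as step 412 describes, can be pictured as a small lookup table persisted by the DHCT 16. The sketch below is a hedged illustration with assumed keys and values, not the actual schema of the SAM database 360 or of the stored table:

    # Hedged sketch of an association table of the kind step 412 describes:
    # each receivable TV signal format maps to example pipeline settings and a
    # designated output port. Keys and values are illustrative assumptions.

    ASSOCIATION_TABLE = {
        "1080i": ("YPbPr", {"scan": "interlaced", "aspect": "16:9"}),
        "720p": ("DVI", {"scan": "progressive", "aspect": "16:9"}),
        "480i": ("S-video", {"scan": "interlaced", "aspect": "4:3"}),
    }

    def route_signal(signal_format):
        """Look up the designated port and settings for a received signal format."""
        if signal_format not in ASSOCIATION_TABLE:
            raise KeyError(f"No association stored for {signal_format}")
        return ASSOCIATION_TABLE[signal_format]

    print(route_signal("1080i"))   # ('YPbPr', {'scan': 'interlaced', 'aspect': '16:9'})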


For formats where a display is generated and viewable by the user, additional formats can be presented through this interactive session to fine-tune what the user is viewing. Step 414 includes applying test sequences for phase II discovery. For example, one category of driving the TV set inputs can be according to the scan format (e.g., interlaced versus progressive). Within an interlaced category, there are multiple video formats (e.g., with differences in resolution (e.g., 1080i, 480i, etc.), aspect ratio, among other parameters). If the TV set 341 has the capability to only receive an input with a progressive scan format, then driving it with an interlaced scan format may result in no displayable screen (e.g., a scrambled picture), thus negating the need to drive the TV set inputs with other video formats under the category of interlaced. The test sequences are received from memory in the manner described above for step 404, and processed (e.g., altered) in the blitter 337 in cooperation with the display manager 335 and the processor 344 (FIG. 3A). In one embodiment, the samples are altered in a manner that causes images on the TV screen to be distorted and/or appear with a variety of features (e.g., cropped) and/or omissions of features and/or accentuations in features or conditions (e.g., taking a circle and expanding it in a vertical or horizontal orientation to create an ellipse). By altering the samples, a display screen comprising, in one embodiment, a GUI and an altered image sample, can be presented that enables the DHCT 16 to determine how the TV set 341 reacts to or presents these altered images based on the user response. Some of the samples stored in the storage device 373 can be altered samples, or in some embodiments, the alterations can be implemented in the media engine 330 or via a combination of stored altered samples and processing in the media engine 330.
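As a toy illustration of the kind of alteration described above (for example, stretching a circle into an ellipse so a distortion becomes visible on the screen), the Python sketch below applies a simple horizontal scaling to a set of points; it merely stands in for processing the blitter 337 or media engine 330 would perform and is not the disclosed algorithm:

    # Toy illustration of a phase II alteration: scale a circle's points
    # horizontally so it would appear as an ellipse on the display.

    import math

    def circle_points(radius, count=8):
        return [(radius * math.cos(2 * math.pi * k / count),
                 radius * math.sin(2 * math.pi * k / count)) for k in range(count)]

    def stretch_horizontally(points, factor):
        """Expand points along the horizontal axis, turning a circle into an ellipse."""
        return [(x * factor, y) for x, y in points]

    original = circle_points(radius=1.0)
    altered = stretch_horizontally(original, factor=(16 / 9) / (4 / 3))   # about 1.33x

    for (ox, oy), (ax, ay) in zip(original, altered):
        print(f"({ox:+.2f}, {oy:+.2f}) -> ({ax:+.2f}, {ay:+.2f})")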


Step 416 includes soliciting a user response based on what is displayed on the TV screen. Based on the video format selected and the alteration imposed in step 414, there is an expected effect on the picture presented on the TV screen, and this expected effect is used to generate the appropriate question to the user on the display screen to help determine the characteristic of the TV set 341 (FIG. 3A) the question was geared to find. Soliciting a user response can also include requesting user preferences. For example, the user may be presented with alternate pictures having different colors and asked to choose a preferred color.


Step 418 includes determining characteristics of the display device based on the solicited user response. Running through the interactive screens shown in FIGS. 5-8 (and other screens not shown) and soliciting answers to the questions presented in these screens enables the DHCT to determine the capabilities of the TV and offers the user alternate display options for video and graphics throughout the course of viewing the picture in the display screen of the TV set 341. Hence, the user has a mechanism for selecting a video format for the video (and graphics) that the viewer can simply invoke for different sourced video signals. This allows the user the flexibility to display a sourced picture of a certain aspect ratio to match the aspect ratio of the TV set or in a manner the user chooses. For example, the preferred embodiments of the disclosure allow the viewer to expand sourced 4:3 content out to the edges of a 4:3 HDTV (i.e., full screen rather than a boxed-in display).


Step 420 includes storing the TV set display characteristics. One or more non-volatile memory (NVM) bits (not shown) in memory 349 (or the characteristics can be stored in the storage device 373, FIG. 3A) are assigned to denote a default value: "never set" or "set". Once set, the capabilities of the connected TV set 341 are retained in NVM 348. If the user employs a different TV set in the future, the interactive sessions of the preferred embodiments of the disclosure can be re-entered via the TV capabilities query button 381, general settings, quick settings, front key input on the DHCT 16, or via other buttons on the remote control device 380 (FIG. 3B) using key sequences as explained in a user's manual.
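The "never set"/"set" marker and the retained capabilities of step 420 might be represented as a small persisted record; in the sketch below a Python dictionary stands in for NVM 348 (or the storage device 373), and the field names are assumptions for illustration only:

    # Sketch of storing discovered TV set characteristics with a "never set"/"set"
    # marker. The dictionary stands in for non-volatile memory; field names are
    # illustrative assumptions.

    nvm = {"capabilities_state": "never set", "tv_capabilities": None}

    def store_capabilities(capabilities):
        nvm["tv_capabilities"] = dict(capabilities)
        nvm["capabilities_state"] = "set"

    def capabilities_known():
        return nvm["capabilities_state"] == "set"

    store_capabilities({"aspect_ratio": "16:9", "scan": "1080i", "de_interlacer": "TV set"})
    print(capabilities_known(), nvm["tv_capabilities"])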


Step 422 includes selecting another port or another format for the same port, and then the discovery process is repeated again beginning at step 404.


Therefore, during an interactive session that requests input from the user, the DHCT 16 cycles through each video format requesting input from the viewer rather than merely terminating when a first video format category suitable for driving the TV set 341 is determined. Hence, the input from the user preferably does not cease after the user acknowledges being able to view a first viewable display screen, but rather after completion of cycling through all or substantially all video formats.


Note that in some embodiments, phase I can be implemented for all ports and completed before commencing phase II.


Also, in some embodiments, the user may decide on which output port of the physical output ports 340 (FIG. 3A) to output each possible picture format of the input, incoming or tuned TV signal. For instance, a user may express his or her preferences in interactive sessions during the TV capability discovery phase to output an incoming TV (video) signal through an output port capable of outputting the scan rate equivalent to the scan rate of the incoming TV signal. Furthermore, the user may decide through the interactive sessions to select an output port for each respective TV channel in the channel line-up, assuming that the user knows the picture format of each channel or that the interactive session displays information on the screen containing the picture format associated with each channel to the user. User preferences are stored in memory, for example memory 349 (FIG. 3A), and/or the storage device 373 (FIG. 3A).


The picture format associated with each channel may be known a priori. For example, the picture format may be obtained or inferred by the processor 344 (FIG. 3A) from the record of each program in an EPG or IPG database (not shown) residing in memory 349 (FIG. 3A), or from information associated with each service or TV channel in the SAM tables (e.g., SAM database 360, FIG. 3A) residing in memory 349. Alternatively, the DHCT 16 (FIG. 3A) may determine the picture rate for each TV channel, service or program by tuning to each channel in the channel line-up. Since the picture rate for a TV channel may change through the course of time, the user may further prefer to allow the DHCT 16 to dynamically switch output ports according to the incoming TV signal's picture format.
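A per-channel picture format record of the kind described above, combined with the discovered format-to-port associations, could drive a simple port selection routine; the sketch below is illustrative only, with assumed channel numbers, formats, and names:

    # Illustrative sketch of choosing an output port per tuned channel from an
    # assumed per-channel picture-format record (e.g., derived from IPG or SAM
    # data). Structures and names are assumptions for illustration.

    CHANNEL_FORMATS = {101: "1080i", 5: "480i", 42: "720p"}    # channel -> picture format
    FORMAT_TO_PORT = {"1080i": "YPbPr", "720p": "DVI", "480i": "S-video"}

    def output_port_for_channel(channel, dynamic=True, fixed_port="YPbPr"):
        """Pick an output port for a tuned channel.

        When dynamic is False the DHCT keeps a single fixed port; when True it
        follows the incoming signal's picture format.
        """
        if not dynamic:
            return fixed_port
        picture_format = CHANNEL_FORMATS.get(channel, "480i")
        return FORMAT_TO_PORT[picture_format]

    print(output_port_for_channel(101))                # YPbPr
    print(output_port_for_channel(5, dynamic=False))   # YPbPr (fixed)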


The discovery phase retains (e.g., by saving in memory 349, FIG. 3A) the output formats or modes that the DHCT 16 (FIG. 3A) is capable of outputting. A user may opt to eliminate one or more output capabilities from the plurality of output capabilities found during the discovery phase. During normal TV watching operation, the user has access to select any of the retained outputting modes (e.g., via quick settings or a remote key). The DHCT 16 may be configured to output in one picture format (e.g., 1080i) or to dynamically adjust to output according to the picture format of the incoming TV signal. Regardless of whether the DHCT 16 outputs in a fixed format indefinitely, the user has access to the list of retained output modes and may select an alternate output mode from the list.
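The retained-modes behavior above can be sketched as a small list that the user may trim and select from; the class and method names below are assumed for illustration and are not defined by the disclosure:

    # Sketch of a retained-output-modes list the user can trim and select from.

    class RetainedOutputModes:
        def __init__(self, discovered_modes):
            self.modes = list(discovered_modes)             # e.g., found during discovery
            self.current = self.modes[0] if self.modes else None

        def eliminate(self, mode):
            """Let the user remove an output capability from the retained list."""
            if mode in self.modes and mode != self.current:
                self.modes.remove(mode)

        def select(self, mode):
            """Switch output to any retained mode (e.g., via quick settings or a remote key)."""
            if mode in self.modes:
                self.current = mode
            return self.current

    modes = RetainedOutputModes(["1080i", "720p", "480p", "480i"])
    modes.eliminate("480p")
    print(modes.select("720p"), modes.modes)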


In one embodiment, wherein the output mode is fixed to a first picture format, once the user selects an alternate output format from the list of retained output modes, the DHCT 16 (FIG. 3A) continues to output the alternate output format indefinitely until the user once again selects another output mode (e.g., possibly the first picture format). In an alternate embodiment, when the user tunes to a different channel, the DHCT 16 discontinues outputting the alternate output format by reverting to output the first picture format.


V. Interactive Session Display Screens


FIG. 5 is a block diagram of one arrangement for the TV set 341 and the DHCT 16. The DHCT 16 includes a digital display 502 and navigation buttons 504 for performing navigational functions like channel changing, etc. For example, the user can select the navigation buttons 504 to select information in response to a DHCT prompted GUI screen 510 during phase I discovery. Presumably, a GUI screen 510 is presented because a connection between the DHCT port and corresponding connection at the TV set 341 has been made and a video format was input to the TV set input or inputs with which the TV set 341 is compatible. As shown, a text message in the display screen 510 queries the user, “Can you read this question?” The user will enter input if he or she is able to read the question. The display screen 510 suggests through the use of navigation arrow icons 512 that the user can select the navigation buttons 385 on the remote control device 380 (FIG. 3B) or the DHCT navigation buttons 504. One skilled in the art will understand that other buttons can be used to convey a user response, including the use of just the select button 387 (FIG. 3B).


As is shown in FIGS. 6-7, a set of display screens tailored for each combination of video format and display aspect ratio is generated for each respective video format during phase II discovery. Continuing with the example referenced in association with FIG. 4, within the generalized category of a respective scan mode (e.g., progressive versus interlaced), the picture is output in multiple ways to find out further capabilities of the TV set 341. This process includes outputting different aspect ratios such as 16:9, 4:3, or others, as well as embedding a graphical picture within the picture (e.g., letterbox or sidebars). The user may have set the TV set 341 to display letterbox, for example, or there may be a default setting. The user may also configure settings in the DHCT 16 for handling the aspect ratio in a predetermined manner. Thus, there may be cumulative operations performed by the DHCT 16 and the TV set 341 that cause distortions of objects to be amplified (e.g., a ball on the screen may be stretched by operations in the DHCT 16 and then further stretched due to operations in the TV set 341). Images stored in the storage device 373, for example, may be presented at the TV set 341 in a distorted and non-distorted manner to determine these settings. The phase II discovery process also includes outputting graphics with alternate lines displaced or shifted by varied amounts to determine if the TV set 341 has a de-interlacer and, if so, what its quality is. Graphical objects in the graphical picture may represent certain geometrical shapes subjected to any of a plurality of 2-D or 3-D transformations that include rotation, scaling, shear, and perspectives, among others, that, when coupled to alternate line displacements or shifts that emulate motion from an interlaced camera, help identify the performance of the de-interlacing capabilities of a de-interlacer. For example, the DHCT 16 may test the quality of the de-interlacer of the TV set 341 to determine if de-interlacing functionality in the DHCT 16 should be bypassed due to superior quality of the de-interlacer in the TV set 341. De-interlacer assessment may be further refined to determine which of the two de-interlacers performs better for analog signals, for interlaced SD pictures received as compressed digital signals, and/or for interlaced HD pictures received as compressed digital signals. Note that a TV set may feature "multi-synch" support. For example, an HDTV set may support multiple video formats at its input. In such an event, it may be desirable to drive the HDTV set with the native scan format of the sourced input signal to minimize picture degradation.
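As a toy illustration of the de-interlacer test described above, the sketch below shifts alternate lines of a small text-based test pattern by varied amounts to emulate motion between the two fields of an interlaced source; it is purely illustrative and does not reproduce the graphics generated by the DHCT 16:

    # Toy illustration of shifting alternate lines of a test pattern by varied
    # amounts, emulating inter-field motion from an interlaced camera.

    def shift_alternate_lines(image, shift):
        """Return a copy of image (a list of strings) with odd lines shifted right."""
        shifted = []
        for row, line in enumerate(image):
            shifted.append((" " * shift + line) if row % 2 == 1 else line)
        return shifted

    test_pattern = ["XXXXXXXX"] * 6

    for amount in (1, 2, 4):              # varied displacement amounts
        print(f"--- shift {amount} ---")
        for line in shift_alternate_lines(test_pattern, amount):
            print(line)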



FIG. 6 illustrates a query that is aimed towards determining how the TV set 341 handles certain aspect ratios. For example, a signal received from the headend 11 (FIG. 2) may carry content at a 4:3 aspect ratio, although the connected TV set is an HDTV having a 16:9 aspect ratio. One way the HDTV set could handle this source signal is to provide black vertical bars on the sides of the 4:3 picture. An interactive session can help the DHCT 16 determine what is connected by the way the TV set handles the displayed picture. As shown, the video format inferred here is handled by the TV set 341 by outlining the picture in the display screen with top and bottom black stripes. The display screen 610 includes a text message that asks the user, "Does the screen have top and bottom black boxes reducing the screen?" Depending on the response to this question, a series of screens can be presented that ask similar questions, such as whether the user sees side black stripes, or a boxed-in picture. In other embodiments, a single screen instruction can ask all of these questions, which are aimed at trying to ascertain the aspect ratio of the TV set 341.



FIG. 7 is a block diagram illustrating another interactive display screen 710 used to fine-tune the color quality. As shown, input is requested on the color of a graphical object (although it could be colored text as well) in the display to ascertain the color temperature of the TV set display. For example, the graphical object, generated internally to the DHCT 16, is a ball 712, and the question presented to the user is, "What is the color of the ball in the screen?" A scrollable color-choice list 714 can be presented, which gives the user a choice of colors with which to adequately describe the color of the ball.


Other questions can be presented to determine other characteristics. For example, input can be requested from the user on whether flicker or certain artifacts are visible to determine if the TV set has an internal line doubler (i.e., de-interlacer) or to assess the quality of that line doubler versus the line doubler of the DHCT 16. Additional questions can be presented to the user to provide user preferences on border color or shade (e.g., the letterbox may be available with different shades of gray to cause the picture brightness to vary and thus enhance the viewer experience).


Additional embodiments can be used for this interactive session between the user and the DHCT 16. In one embodiment, synthesized audio can be played while cycling through the video formats, instructing the user to enter input without any dependence on a displayed GUI. In another embodiment, synthesized audio can be played with a displayed GUI in each cycled video format, and in some embodiments, synthesized audio is repeated if the user's input is not received after an elapsed time greater than a predetermined threshold.


Some embodiments may employ indicators on the DHCT 16 (e.g., light-emitting diodes, or LEDs, such as the digital display 502 (FIG. 5)) in lieu of or in addition to the display screens. For example, since some HDTVs and projectors have common inputs that may not accept 1080i signals, the LEDs may be used where on-screen graphics are not readable (e.g., if the scan is not correct). Possible scan values include 1080i, 720p, 480p, and 480i. When NVM values are cleared during staging (e.g., initial installation), the scan value from a staging configuration file may be set. If the staging configuration file does not contain the scan value parameter, 1080i will be set. Most HDTVs will accept the 1080i scan format, so that may be the default value. Other settings may be available using a front panel settings key (not shown). For example, by holding down a settings key for a predetermined period of time, a message light may blink and the LED digits may reveal the current setting. In some embodiments, a scan barker (not shown) may be displayed. Pressing the settings key again may cycle the settings.
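The default-and-cycle behavior of the front panel scan setting can be sketched as follows; the staging-file handling and function names are assumptions for illustration, not the DHCT's actual configuration code:

    # Sketch of the front-panel scan setting: default to 1080i when the staging
    # configuration carries no scan value, and cycle through the possible values
    # on repeated presses of a settings key. Names are illustrative assumptions.

    SCAN_VALUES = ["1080i", "720p", "480p", "480i"]

    def initial_scan_value(staging_config):
        """Use the staging file's scan value if present, otherwise default to 1080i."""
        return staging_config.get("scan", "1080i")

    def next_scan_value(current):
        """Cycle to the next scan value, as a repeated settings-key press might."""
        return SCAN_VALUES[(SCAN_VALUES.index(current) + 1) % len(SCAN_VALUES)]

    scan = initial_scan_value({})   # staging file has no scan parameter
    print(scan)                     # 1080i
    scan = next_scan_value(scan)
    print(scan)                     # 720p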


The display manager 335 can be implemented in hardware, software, firmware, or a combination thereof. In the preferred embodiment(s), the display manager 335 is implemented in software or firmware that is stored in a memory and that is executed by a suitable instruction execution system. If implemented in hardware, as in an alternative embodiment, the display manager 335 may be implemented with any or a combination of the following technologies, which are all well known in the art: a discrete logic circuit(s) having logic gates for implementing logic functions upon data signals, an application specific integrated circuit (ASIC) having appropriate combinational logic gates, a programmable gate array(s) (PGA), a field programmable gate array (FPGA), etc.


The display manager 335, which comprises an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a “computer-readable medium” can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The computer readable medium can be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. More specific examples (a nonexhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic) having one or more wires, a portable computer diskette (magnetic), a random access memory (RAM) (electronic), a read-only memory (ROM) (electronic), an erasable programmable read-only memory (EPROM or Flash memory) (electronic), an optical fiber (optical), and a portable compact disc read-only memory (CDROM) (optical). Note that the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.


It should be emphasized that the above-described embodiments of the present disclosure, particularly any "preferred embodiments," are merely possible examples of implementations, set forth for a clear understanding of the principles of the disclosure. Many variations and modifications may be made to the above-described embodiments of the disclosure without departing substantially from the spirit of the principles of the disclosure. All such modifications and variations are intended to be included herein within the scope of the present disclosure and protected by the following claims.

Claims
  • 1. A system, comprising: processing logic of a client device configured to effect the presentation of objects on a display screen of a display device coupled to the client device that are altered by one or more pipelines, the processing logic of the client device configured to solicit a response by a user, the processing logic of the client device further configured to, responsive to the response from the user, determine de-interlacing capabilities of the display device coupled to the client device, the processing logic of the client device further configured to determine de-interlacing capabilities of the client device and then determine if the de-interlacing capabilities of the client device should be bypassed in favor of the de-interlacing capabilities of the display device.
  • 2. The client device of claim 1, wherein the display device includes a television set.
  • 3. The system of claim 1, wherein the one or more pipelines further comprises: a media memory configured to store one or more consecutive interlaced pictures; and circuit logic configured to: access an interlaced picture from the memory; process a plurality of lines from a field corresponding to the interlaced picture; and generate each line of a resulting progressive picture.
  • 4. The client device of claim 3, wherein the processing logic is further configured to affect a display of the progressive picture on the display screen and solicit further user response based on a perceived quality of the display of the progressive picture.
  • 5. The client device of claim 4, wherein the processing logic is further configured to determine whether the de-interlacing capabilities of the client device or of the display device should be used based on the further user response, wherein the processing logic is further configured to responsively implement the de-interlacing capabilities of the client device or of the display device.
  • 6. The client device of claim 3, wherein the processing logic is further configured to determine the de-interlacing capabilities based on an input corresponding to analog signals, interlaced standard definition (SD) pictures received as compressed digital signals, interlaced high definition (HD) pictures received as compressed digital signals, or a combination of two or more.
  • 7. The client device of claim 3, wherein the circuit logic is programmable.
  • 8. The client device of claim 1, wherein the one or more pipelines further comprises: a media memory configured to store one or more consecutive interlaced pictures; and circuit logic configured to: access plural interlaced pictures from the media memory; process a plurality of lines from plural fields corresponding to the plural interlaced pictures; and generate each line of resulting plural progressive pictures.
  • 9. The client device of claim 8, wherein the processing logic is further configured to affect a display of the plural progressive pictures on the display screen and solicit further user response based on a perceived quality of the display of the plural progressive pictures.
  • 10. The client device of claim 9, wherein the processing logic is further configured to determine whether the de-interlacing capabilities of the client device or of the display device should be used based on the further user response.
  • 11. The client device of claim 8, wherein the processing logic is further configured to determine the de-interlacing capabilities of the client device or of the display device based on an input corresponding to analog signals, interlaced standard definition (SD) pictures received as compressed digital signals, interlaced high definition (HD) pictures received as compressed digital signals, or a combination of two or more.
  • 12. The client device of claim 8, wherein the circuit logic is programmable.
  • 13. The client device of claim 1, wherein the processing logic is embodied in a network client device capable of outputting video and audio in at least one defined format through at least one port.
  • 14. The client device of claim 1, wherein the processing logic is further configured to determine the de-interlacing capability of the display device by affecting an output to the display device of graphical objects with alternate lines displaced or shifted by varied amounts based on operations of the one or more pipelines.
  • 15. The client device of claim 14, wherein the determination of de-interlacing capabilities of the display device includes a determination of whether the display device has de-interlacing capabilities, the quality of the de-interlacing capabilities, or a combination of both.
  • 16. The client device of claim 14, wherein the one or more pipelines are further configured to generate the graphical objects based on one or more two-dimensional transformations, one or more three-dimensional transformations, or a combination of both.
  • 17. A method, comprising: altering one or more objects corresponding to one or more pictures; de-interlacing by a client device one or more fields corresponding to the altered one or more objects; outputting the altered objects and the de-interlaced altered objects on a display screen of a display device coupled to the client device; soliciting user input to determine a quality of one or more rendered displays on the display screen, the one or more rendered displays based on the outputted altered objects that are de-interlaced at the display device and the de-interlaced altered objects provided by the client device; determining via the client device de-interlacing capabilities of the display device coupled to the client device and de-interlacing capabilities of the client device; and determining if the de-interlacing capabilities of the client device should be bypassed in favor of the de-interlacing capabilities of the display device.
CROSS-REFERENCE TO RELATED APPLICATION

This application is a divisional of U.S. patent application having Ser. No. 10/761,777, filed on Jan. 21, 2004, which is incorporated by reference herein in its entirety.

20070136748 Rodriguez et al. Jun 2007 A1
20070186240 Ward, III et al. Aug 2007 A1
20080010658 Abbott et al. Jan 2008 A1
20080046947 Katznelson Feb 2008 A1
20080098421 Rodriguez et al. Apr 2008 A1
20080098422 Rodriguez et al. Apr 2008 A1
20080101460 Rodriguez May 2008 A1
20080104637 Rodriguez et al. May 2008 A1
20080137755 Onur et al. Jun 2008 A1
20080155631 Liwerant et al. Jun 2008 A1
20080216111 Alten et al. Sep 2008 A1
20080229361 Jerding et al. Sep 2008 A1
20080279217 McDonald et al. Nov 2008 A1
20080281968 Rodriguez Nov 2008 A1
20080282307 McDonald et al. Nov 2008 A1
20080282308 McDonald et al. Nov 2008 A1
20090141794 Rodriguez et al. Jun 2009 A1
20090150946 Rodriguez et al. Jun 2009 A1
20090150957 Jerding et al. Jun 2009 A1
20090150958 Jerding et al. Jun 2009 A1
20090150959 Jerding et al. Jun 2009 A1
20090158306 Rodriguez et al. Jun 2009 A1
20090158324 Rodriguez et al. Jun 2009 A1
20090158329 Rodriguez et al. Jun 2009 A1
20090158331 Rodriguez et al. Jun 2009 A1
20090158332 Rodriguez et al. Jun 2009 A1
20090158335 Rodriguez et al. Jun 2009 A1
20090158339 Rodriguez et al. Jun 2009 A1
20090158352 Rodriguez et al. Jun 2009 A1
20090158354 Rodriguez et al. Jun 2009 A1
20090158355 Rodriguez et al. Jun 2009 A1
20090158363 Rodriguez et al. Jun 2009 A1
20090183081 Rodriguez et al. Jul 2009 A1
20090190028 Rodriguez et al. Jul 2009 A1
20090193468 Rodriguez Jul 2009 A1
20090193471 Rodriguez Jul 2009 A1
20090199249 Rodriguez et al. Aug 2009 A1
20090276808 Jerding et al. Nov 2009 A1
20090282372 Jerding et al. Nov 2009 A1
20090282440 Rodriguez Nov 2009 A1
20100242063 Slaney et al. Sep 2010 A1
20110279863 Chang et al. Nov 2011 A1
20140282701 Rodriguez et al. Sep 2014 A1
20140282732 Jerding et al. Sep 2014 A1
Foreign Referenced Citations (102)
Number Date Country
2 363 052 Nov 1995 CA
2 223 025 Nov 2001 CA
2 475 723 Jan 2011 CA
2 408 289 Aug 2012 CA
0 572 090 Dec 1993 EP
0 673 159 Sep 1995 EP
0 680 214 Nov 1995 EP
0 725 538 Aug 1996 EP
0 763 936 Mar 1997 EP
0 811 939 Dec 1997 EP
0 838 915 Apr 1998 EP
0 849 948 Jun 1998 EP
0 854 645 Jul 1998 EP
0 891 084 Jan 1999 EP
0 896 318 Feb 1999 EP
0 909 095 Apr 1999 EP
0 701 756 Dec 1999 EP
0 989 751 Mar 2000 EP
1 052 644 Nov 2000 EP
1 069 801 Jan 2001 EP
1 075 143 Feb 2001 EP
1 113 668 Apr 2001 EP
1 111 572 Jun 2001 EP
1 161 085 Dec 2001 EP
2 343 051 Apr 2000 GB
8-289219 Nov 1996 JP
9-322022 Dec 1997 JP
10-143734 May 1998 JP
11-73361 Mar 1999 JP
11-73394 Mar 1999 JP
2000-101941 Apr 2000 JP
2001-67786 Mar 2001 JP
WO 9222983 Dec 1992 WO
WO 9414284 Jun 1994 WO
WO 9528799 Oct 1995 WO
WO 9617467 Jun 1996 WO
WO 9633579 Oct 1996 WO
WO 9634486 Oct 1996 WO
WO 9634491 Oct 1996 WO
WO 9641470 Dec 1996 WO
WO 9641477 Dec 1996 WO
WO 9641478 Dec 1996 WO
WO 9734414 Sep 1997 WO
WO 9803012 Jan 1998 WO
WO 9826528 Jun 1998 WO
WO 9831116 Jul 1998 WO
WO 9837695 Aug 1998 WO
WO 9839893 Sep 1998 WO
WO 9847279 Oct 1998 WO
WO 9848566 Oct 1998 WO
WO 9856172 Dec 1998 WO
WO 9856173 Dec 1998 WO
WO 9856188 Dec 1998 WO
WO 9901984 Jan 1999 WO
WO 9904560 Jan 1999 WO
WO 9904561 Jan 1999 WO
WO 9912109 Mar 1999 WO
WO 9914947 Mar 1999 WO
WO 9935831 Jul 1999 WO
WO 9945701 Sep 1999 WO
WO 9949717 Oct 1999 WO
WO 9952285 Oct 1999 WO
WO 9957903 Nov 1999 WO
WO 9960790 Nov 1999 WO
WO 9965237 Dec 1999 WO
WO 9966719 Dec 1999 WO
WO 0001149 Jan 2000 WO
WO 0002385 Jan 2000 WO
WO 0004726 Jan 2000 WO
WO 0005889 Feb 2000 WO
WO 0030354 May 2000 WO
WO 0040017 Jul 2000 WO
WO 0046988 Aug 2000 WO
WO 0049801 Aug 2000 WO
WO 0058967 Oct 2000 WO
WO 0059202 Oct 2000 WO
WO 0060482 Oct 2000 WO
WO 0078031 Dec 2000 WO
WO 0078045 Dec 2000 WO
WO 0078047 Dec 2000 WO
WO 0078048 Dec 2000 WO
WO 0106788 Jan 2001 WO
WO 0120907 Mar 2001 WO
WO 0124067 Apr 2001 WO
WO 0156273 Aug 2001 WO
WO 0167736 Sep 2001 WO
WO 0172042 Sep 2001 WO
WO 0176245 Oct 2001 WO
WO 0177888 Oct 2001 WO
WO 0184831 Nov 2001 WO
WO 02097584 Dec 2002 WO
WO 03003164 Jan 2003 WO
WO 03003709 Jan 2003 WO
WO 03014873 Feb 2003 WO
WO 03024084 Mar 2003 WO
WO 03042787 May 2003 WO
WO 03069898 Aug 2003 WO
WO 2004091219 Oct 2004 WO
WO 2004100500 Nov 2004 WO
WO 2005059202 Jun 2005 WO
WO 2005071658 Aug 2005 WO
WO 2007030370 Mar 2007 WO
Non-Patent Literature Citations (507)
U.S. Appl. No. 11/170,348, filed Jun. 29, 2005 entitled “Methods and Systems for Advertising During Video-on-Demand Suspensions”.
U.S. Appl. No. 11/460,516, filed Jul. 27, 2006 entitled “Video Promotional and Advertising Systems for Video on Demand System”.
U.S. Appl. No. 12/372,887, filed Feb. 18, 2009 entitled “Selection of Purchasable Enhancements of a Television Service”.
U.S. Appl. No. 12/372,952, filed Feb. 18, 2009 entitled “System and Method for Assessing Usage of Purchasable Enhancements of Television Services”.
U.S. Appl. No. 12/388,002, filed Feb. 18, 2009 entitled “System and Method for Providing Purchasable Enhancements of VOD Services”.
U.S. Appl. No. 12/372,776, filed Feb. 18, 2009 entitled “Configurable Options for Accessible On-Demand Information”.
U.S. Appl. No. 12/388,139, filed Feb. 18, 2009 entitled “Upgrading Access of Television Program Information with Optional Features”.
U.S. Appl. No. 12/372,803, filed Feb. 18, 2009 entitled “Management of Generic Service Enhancements for Television Services”.
U.S. Appl. No. 10/780,448, filed Feb. 13, 2004 entitled “System and Method for Expiration Reminders of Rentable Media Offerings”.
U.S. Appl. No. 09/330,792, filed Jun. 11, 1999 entitled “Series Reminders and Series Recording from an Interactive Television Program Guide”.
U.S. Appl. No. 09/378,533, filed Aug. 20, 1999 entitled “Electronic Program Guide with Advance Notification”.
U.S. Appl. No. 09/494,209, filed Jan. 1, 2000 entitled “System and Method for Allowing a User to Quickly Navigate Within a Program Guide to an Established Point”.
U.S. Appl. No. 09/502,067, filed Feb. 10, 2000 entitled “Method and System for Identification of Pay-per-View Programming”.
U.S. Appl. No. 09/518,041, filed Mar. 2, 2000 entitled “Apparatus and Method for Providing a Plurality of Interactive Program Guide Initial Arrangements”.
U.S. Appl. No. 09/542,484, filed Apr. 3, 2000 entitled “System for Providing Alternative Services”.
U.S. Appl. No. 09/565,931, filed May 4, 2000 entitled “Navigation Paradigm for Access to Television Services”.
U.S. Appl. No. 09/590,434, filed Jun. 9, 2000 entitled “Video Promotional and Advertising Systems for Video on Demand System”.
U.S. Appl. No. 09/590,488, filed Jun. 9, 2000 entitled “User Interface Navigational System with Parental Control for Video on Demand System”.
U.S. Appl. No. 09/590,521, filed Jun. 9, 2000 entitled “Systems and Methods for Adaptive Scheduling and Dynamic Bandwidth Resource Allocation Management in a Digital Broadband Delivery System”.
U.S. Appl. No. 09/590,904, filed Jun. 9, 2000 entitled “Program Information Searching System for Interactive Program Guide”.
U.S. Appl. No. 09/590,520, filed Jun. 9, 2000 entitled “Video on Demand System with Parameter-Controlled Bandwidth Deallocation”.
U.S. Appl. No. 09/590,518, filed Jun. 9, 2000 entitled “Catalog Management System for Video on Demand System”.
U.S. Appl. No. 09/591,356, filed Jun. 9, 2000 entitled “Future Program Options Menu System for Interactive Program Guide”.
U.S. Appl. No. 09/692,920, filed Oct. 20, 2000 entitled “Media-on-Demand Title Indexing System”.
U.S. Appl. No. 09/692,995, filed Oct. 20, 2000 entitled “Media-on-Demand Bookmark System”.
U.S. Appl. No. 09/693,115, filed Oct. 20, 2000 entitled “Media Services Window Configuration System”.
U.S. Appl. No. 09/693,288, filed Oct. 20, 2000 entitled “Media-on-Demand Rental Duration Management System”.
U.S. Appl. No. 09/693,790, filed Oct. 20, 2000 entitled “Integrated Searching System for Interactive Media Guide”.
U.S. Appl. No. 09/693,784, filed Oct. 20, 2000 entitled “System and Method for Reminders of Upcoming Rentable Media Offerings”.
U.S. Appl. No. 09/693,780, filed Oct. 20, 2000 entitled “Synchronized Video-On-Demand Supplemental Commentary”.
U.S. Appl. No. 09/896,390, filed Jun. 29, 2001 entitled “System and Method for Archiving Multiple Downloaded Recordable Media Content”.
U.S. Appl. No. 09/896,331, filed Jun. 29, 2001 entitled “Method and Apparatus for Recordable Media Content Distribution”.
U.S. Appl. No. 14/166,460, filed Jan. 28, 2014 entitled “System and Method for Characterization of Purchasable and Recordable Media (PRM)”.
U.S. Appl. No. 10/008,429, filed Nov. 13, 2001 entitled “Graphic User Interface Alternate Download Options for Unavailable PRM Content”.
U.S. Appl. No. 10/957,849, filed Oct. 4, 2004, entitled “User Input for Access to Television Services”.
U.S. Appl. No. 10/957,854, filed Oct. 4, 2004, entitled “Menu Operation for Access to Television Services”.
U.S. Appl. No. 10/957,942, filed Oct. 4, 2004, entitled “Control Access to Television Services”.
U.S. Appl. No. 12/389,128, filed Feb. 19, 2009, entitled “Configuration of TV Services via Alternate Communication”.
U.S. Appl. No. 14/287,339, filed May 27, 2014 entitled “Hypertext Service Guide Menu Display”.
Almeroth et al.; An Alternative Paradigm for Scalable On-Demand Applications: Evaluating and Deploying the Interactive Multimedia Jukebox; 1999; IEEE; vol. 11; pp. 658-672.
Law Office Computing, AportisDoc Mobile Edition 2.2, retrieved on Aug. 14, 2008 at http://www.lawofficecomputing.com/old_site/Reviewsdata/dj01/aportisdoc.asp, 2 pgs.
Perlman et al.; A Working Anti-Taping System for Cable Pay-Per-View; Aug. 1998; IEEE; vol. 35; 6 pgs.
Wikipedia, Moxi, retrieved on Aug. 14, 2008 at http://en.wikipedia.org/wiki/Moxi, 8 pgs.
BPAI Decision for U.S. Appl. No. 09/896,231, mailed Nov. 6, 2007, 17 pgs.
U.S. Office Action in U.S. Appl. No. 09/590,521 mailed Apr. 9, 2001, 15 pgs.
U.S. Office Action in U.S. Appl. No. 09/590,521 mailed Oct. 24, 2001, 19 pgs.
U.S. Office Action in U.S. Appl. No. 09/590,521 mailed May 23, 2002, 19 pgs.
U.S. Office Action in U.S. Appl. No. 09/590,521 mailed Aug. 28, 2002, 11 pgs.
U.S. Office Action in U.S. Appl. No. 09/590,521 mailed Feb. 26, 2003, 17 pgs.
U.S. Office Action in U.S. Appl. No. 10/073,842 mailed Jun. 25, 2003, 16 pgs.
U.S. Office Action in U.S. Appl. No. 09/590,521 mailed Aug. 6, 2003, 19 pgs.
U.S. Office Action in U.S. Appl. No. 09/590,518 mailed Sep. 9, 2003, 10 pgs.
U.S. Office Action in U.S. Appl. No. 09/693,780 mailed Sep. 13, 2003, 16 pgs.
U.S. Office Action in U.S. Appl. No. 09/693,784 mailed Sep. 25, 2003, 9 pgs.
U.S. Office Action in U.S. Appl. No. 10/073,842 mailed Dec. 24, 2003, 9 pgs.
U.S. Office Action in U.S. Appl. No. 09/693,780 mailed Jan. 9, 2004, 10 pgs.
U.S. Office Action in U.S. Appl. No. 09/590,521 mailed Jan. 21, 2004, 12 pgs.
U.S. Office Action in U.S. Appl. No. 09/590,518 mailed Mar. 11, 2004, 11 pgs.
U.S. Office Action in U.S. Appl. No. 09/693,780 mailed May 6, 2004, 16 pgs.
U.S. Office Action in U.S. Appl. No. 09/709,145 mailed Jul. 20, 2004, 14 pgs.
U.S. Office Action in U.S. Appl. No. 10/073,842 mailed Aug. 2, 2004, 10 pgs.
U.S. Office Action in U.S. Appl. No. 09/590,518 mailed Sep. 23, 2004, 14 pgs.
U.S. Office Action in U.S. Appl. No. 09/693,784 mailed Dec. 1, 2004, 8 pgs.
U.S. Office Action in U.S. Appl. No. 10/780,448 mailed Dec. 29, 2004, 9 pgs.
U.S. Office Action in U.S. Appl. No. 09/590,520 mailed Mar. 11, 2005, 11 pgs.
U.S. Office Action in U.S. Appl. No. 09/590,518 mailed Mar. 14, 2005, 15 pgs.
U.S. Office Action in U.S. Appl. No. 09/693,780 mailed Apr. 20, 2005, 20 pgs.
U.S. Office Action in U.S. Appl. No. 09/709,145 mailed Apr. 22, 2005, 21 pgs.
U.S. Office Action in U.S. Appl. No. 10/780,448 mailed May 3, 2005, 7 pgs.
U.S. Office Action in U.S. Appl. No. 10/073,842 mailed Jun. 1, 2005, 10 pgs.
U.S. Office Action in U.S. Appl. No. 09/896,390 mailed Jun. 15, 2005, 7 pgs.
U.S. Office Action in U.S. Appl. No. 09/693,784 mailed Aug. 12, 2005, 15 pgs.
U.S. Office Action in U.S. Appl. No. 09/896,470 mailed Aug. 25, 2005, 21 pgs.
U.S. Office Action in U.S. Appl. No. 10/780,448 mailed Oct. 13, 2005, 9 pgs.
U.S. Office Action in U.S. Appl. No. 09/896,390 mailed Oct. 19, 2005, 15 pgs.
U.S. Office Action in U.S. Appl. No. 09/709,145 mailed Nov. 2, 2005, 26 pgs.
U.S. Office Action in U.S. Appl. No. 09/693,780 mailed Dec. 19, 2005, 17 pgs.
U.S. Office Action in U.S. Appl. No. 09/896,470 mailed Feb. 1, 2006, 15 pgs.
U.S. Office Action in U.S. Appl. No. 09/896,390 mailed Feb. 8, 2006, 14 pgs.
U.S. Office Action in U.S. Appl. No. 10/008,429 mailed Apr. 4, 2006, 18 pgs.
U.S. Office Action in U.S. Appl. No. 09/693,780 mailed Apr. 5, 2006, 25 pgs.
U.S. Office Action in U.S. Appl. No. 10/073,842 mailed May 2, 2006, 9 pgs.
U.S. Office Action in U.S. Appl. No. 09/693,780 mailed Jul. 7, 2006, 18 pgs.
U.S. Office Action in U.S. Appl. No. 09/709,145 mailed Jul. 28, 2006, 25 pgs.
U.S. Office Action in U.S. Appl. No. 09/590,518 mailed Aug. 14, 2006, 10 pgs.
U.S. Office Action in U.S. Appl. No. 10/073,842 mailed Sep. 5, 2006, 10 pgs.
U.S. Office Action in U.S. Appl. No. 09/896,470 mailed Sep. 6, 2006, 11 pgs.
U.S. Office Action in U.S. Appl. No. 09/896,390 mailed Oct. 23, 2006, 20 pgs.
U.S. Office Action in U.S. Appl. No. 10/008,429 mailed Oct. 31, 2006, 25 pgs.
U.S. Office Action in U.S. Appl. No. 09/709,145 mailed Jan. 12, 2007, 27 pgs.
U.S. Office Action in U.S. Appl. No. 09/896,470 mailed Feb. 27, 2007, 14 pgs.
U.S. Office Action in U.S. Appl. No. 10/073,842 mailed Mar. 9, 2007, 10 pgs.
U.S. Office Action in U.S. Appl. No. 10/008,429 mailed May 3, 2007, 22 pgs.
U.S. Office Action in U.S. Appl. No. 09/896,390 mailed May 21, 2007, 16 pgs.
U.S. Office Action in U.S. Appl. No. 09/590,518 mailed Jun. 26, 2007, 11 pgs.
U.S. Office Action in U.S. Appl. No. 10/683,138 mailed Jul. 5, 2007, 10 pgs.
U.S. Office Action in U.S. Appl. No. 09/896,470 mailed Jul. 30, 2007, 17 pgs.
U.S. Office Action in U.S. Appl. No. 09/896,390 mailed Nov. 5, 2007, 14 pgs.
U.S. Office Action in U.S. Appl. No. 10/008,429 mailed Dec. 5, 2007, 23 pgs.
U.S. Office Action in U.S. Appl. No. 09/590,518 mailed Dec. 14, 2007, 11 pgs.
U.S. Office Action in U.S. Appl. No. 10/683,138 mailed Dec. 28, 2007, 10 pgs.
U.S. Office Action in U.S. Appl. No. 09/896,390 mailed Apr. 9, 2008, 17 pgs.
U.S. Office Action in U.S. Appl. No. 09/896,470 mailed Apr. 18, 2008, 14 pgs.
U.S. Office Action in U.S. Appl. No. 09/590,518 mailed Jul. 23, 2008, 11 pgs.
U.S. Office Action in U.S. Appl. No. 10/683,138 mailed Aug. 7, 2008, 10 pgs.
U.S. Office Action in U.S. Appl. No. 10/683,138 mailed Jan. 21, 2009, 10 pgs.
U.S. Office Action in U.S. Appl. No. 09/590,518 mailed Jan. 23, 2009, 11 pgs.
U.S. Office Action in U.S. Appl. No. 10/683,138 mailed Jul. 6, 2009, 11 pgs.
U.S. Non-Final Office Action in U.S. Appl. No. 12/372,894 mailed Oct. 27, 2009, 6 pages.
U.S. Office Action in U.S. Appl. No. 12/372,898 mailed Oct. 27, 2009, 8 pgs.
U.S. Office Action in U.S. Appl. No. 12/372,905 mailed Oct. 27, 2009, 6 pgs.
U.S. Office Action in U.S. Appl. No. 10/683,138 mailed Jan. 27, 2010, 11 pgs.
U.S. Office Action mailed Jun. 27, 2014 in U.S. Appl. No. 13/482,689, 25 pgs.
U.S. Office Action mailed Feb. 25, 2015 in U.S. Appl. No. 13/482,497, 20 pages.
U.S. Office Action mailed May 15, 2015 in U.S. Appl. No. 14/287,339, 16 pages.
Canadian Office Action dated Jul. 11, 2011 in Application No. 2,554,208, 4 pages.
European Office Action dated Oct. 10, 2011 in Application No. 02744705.1, 9 pages.
Canadian Office Action dated Oct. 17, 2011 in Application No. 2,402,088, 4 pages.
Summons to attend oral proceedings mailed Dec. 29, 2011 in Application No. 00939759.7, 9 pages.
Summons to attend oral proceedings mailed Jul. 31, 2012 in Application No. 02744705.1, 5 pages.
European Communication dated Nov. 14, 2012 in Application No. 09154377.7, 6 pages.
Canadian Office Action mailed Apr. 11, 2013 in Application No. 2,402,088, 7 pages.
Canadian Office Action mailed Oct. 17, 2013 in Application No. 2,451,477, 3 pages.
U.S. Final Office Action mailed Jul. 19, 2011 in U.S. Appl. No. 11/162,345, all pages.
Board of Patent Appeals and Interferences Decision mailed Aug. 8, 2011 in U.S. Appl. No. 09/518,041, 9 pgs.
U.S. Non-Final Office Action mailed Sep. 6, 2011 in U.S. Appl. No. 11/238,369, 22 pgs.
U.S. Non-Final Office Action mailed Sep. 13, 2011 in U.S. Appl. No. 12/389,564, 16 pgs.
U.S. Non-Final Office Action mailed Sep. 27, 2011 in U.S. Appl. No. 12/413,686, 24 pgs.
U.S. Final Office Action mailed Nov. 10, 2011 in U.S. Appl. No. 10/934,253, 14 pgs.
U.S. Final Office Action mailed Jan. 24, 2012 in U.S. Appl. No. 11/238,369, 19 pgs.
U.S. Final Office Action mailed Apr. 16, 2012 in U.S. Appl. No. 12/413,686, 21 pgs.
U.S. Final Office Action mailed May 14, 2012 in U.S. Appl. No. 11/238,369, 16 pgs.
U.S. Non-Final Office Action mailed May 30, 2012 in U.S. Appl. No. 10/740,138, 27 pgs.
U.S. Non-Final Office Action mailed Dec. 26, 2012 in U.S. Appl. No. 12/413,686, 9 pages.
U.S. Final Office Action mailed Apr. 19, 2013 in U.S. Appl. No. 10/740,138, 31 pages.
U.S. Non-Final Office Action mailed Jun. 5, 2013 in U.S. Appl. No. 11/678,653, 50 pages.
U.S. Non-Final Office Action mailed Jun. 19, 2013 in U.S. Appl. No. 10/934,253, 14 pages.
U.S. Final Office Action mailed Jun. 25, 2013 in U.S. Appl. No. 12/413,686, 10 pages.
U.S. Office Action mailed Feb. 26, 2014 in U.S. Appl. No. 13/482,689, 26 pages.
U.S. Office Action mailed Apr. 16, 2014 in U.S. Appl. No. 13/482,497, 16 pages.
Rousseau, “Synchronized Multimedia for the WWW,” retrieved from http://www7.wwwconference.org/1833/com1833.htm on Nov. 21, 2013, 15 pages.
U.S. Office Action mailed Sep. 22, 2014 in U.S. Appl. No. 13/482,689, 74 pgs.
U.S. Office Action mailed Jan. 27, 2015 in U.S. Appl. No. 14/287,339, 105 pages.
EP Communication in Appln No. 02 794 161.6 mailed Apr. 2, 2009, 4 pgs.
EP Communication in Appln No. 02 794 161.6 mailed Oct. 22, 2009, 3 pgs.
PCT Search Report in International Application No. PCT/US02/37282 mailed Feb. 7, 2003, 4 pgs.
PCT International Search Report in Appln No. PCT/US02/14874 mailed Nov. 20, 2002, 4 pgs.
PCT Written Opinion in Appln No. PCT/US02/14874 mailed Feb. 11, 2003, 5 pgs.
EP Communication in Appln No. 02 736 739.0 mailed Oct. 13, 2009, 3 pgs.
EP Communication in Appln No. 02 736 739.0 mailed Jan. 17, 2012, 4 pgs.
Canadian Office Action mailed Oct. 2, 2012 in Appln No. 2,658,766, 2 pgs.
EP Communication mailed May 26, 2014 in Appln No. 02 782 347.5, 6 pgs.
EP Communication mailed Jun. 3, 2014 in Appln No. 03 745 157.2, 6 pgs.
U.S. Office Action mailed Nov. 10, 2014 in U.S. Appl. No. 13/482,497, 73 pages.
U.S. Office Action mailed Nov. 20, 2014 in U.S. Appl. No. 14/294,624, 99 pages.
“A Brief History of the Trailer,” http://www.movietrailertrash.com/views/history.html, 11 pages (Publicly known at least as early as Dec. 20, 2003).
“Client User Interface Specification (Phase I) for Video-On-Demand Application Development on the Explorer 2000™ Digital Home Communications Terminal”, Revision 1.10 (Aug. 31, 1998), 20 pages.
“Evidence of illustrative movie release years,” Retrieved from the Internet Movie Database using Internet, http://www.imdb.com, 19 pages (Retrieved on Jun. 6, 2005).
“ISO/IEC 13818-6 Information technology—Generic coding of moving pictures and associated audio information—Part 6: Extensions for DSM-CC,” Chapter 4, 113 pages (Sep. 1, 1998).
“Netcaster Developer's Guide,” Devedge Online Documentation, Netscape Communications Corporation, http://developer.netscape.com/docs/manuals/netcast/devguide/index.html, XP-002166370, 82 pages (Sep. 25, 1997).
“Netscape Navigator Help,” Netscape Navigator Software User Documentation, Netscape Communications Corporation, http://home.netscape.com, XP-002166369, pp. 1-63 (Aug. 10, 1997).
“Sez You . . . origin of word daemon,” Take Our Word for It, Issue 146, p. 4, http://www.takeourword.com/TOW146/page4.html (retrieved on Apr. 4, 2006).
Addington, Timothy H., “System Architecture Specification for Video-On-Demand Application Development on the Explorer 2000™ Digital Home Communications Terminal”, Revision 1.10r Review (Mar. 4, 1999), 53 pages.
Alberico, G. et al., “Satellite Interactive Multimedia: A New Opportunity for Broadcasters,” International Broadcasting Convention, Conference Publication No. 447, pp. 18-23 (Sep. 12-16, 1997).
ATI Multimedia Center 7.9, User's Guide, ATI Technologies Inc., pp. i-vi and 1-96 (Copyright 2002).
Barth et al., “10 Fernsehen am PC”, Suse GMBH, XP-002324319, pp. 143-149 (2001).
BPAI Decision for U.S. Appl. No. 09/692,995, mailed Aug. 20, 2008, 10 pages.
BPAI Decision for U.S. Appl. No. 09/693,288, mailed Nov. 28, 2007, 5 pages.
Canadian Office Action in Application No. 2,376,556 mailed Sep. 30, 2008, all pages.
Canadian Office Action in Application No. 2,376,556 mailed Nov. 23, 2007, all pages.
Canadian Office Action in Application No. 2,376,556 mailed Dec. 6, 2005, all pages.
Canadian Office Action in Application No. 2,402,088 mailed Jun. 1, 2010, all pages.
Canadian Office Action in Application No. 2,402,088 mailed May 30, 2006, all pages.
Canadian Office Action in Application No. 2,405,491 mailed Jun. 9, 2010, all pages.
Canadian Office Action in Application No. 2,405,491 mailed Apr. 3, 2009, all pages.
Canadian Office Action in Application No. 2,405,491 mailed May 22, 2008, all pages.
Canadian Office Action in Application No. 2,405,491 mailed Jun. 20, 2007, all pages.
Canadian Office Action in Application No. 2,405,491 mailed Jan. 20, 2006, all pages.
Canadian Office Action in Application No. 2,408,289 mailed Sep. 2, 2010, 3 pages.
Canadian Office Action in Application No. 2,408,289 mailed Aug. 27, 2008, all pages.
Canadian Office Action in Application No. 2,408,289 mailed May 30, 2006, all pages.
Canadian Office Action in Application No. 2,451,477 mailed Nov. 3, 2009, all pages.
Canadian Office Action in Application No. 2,456,318 mailed Nov. 17, 2010, 4 pages.
Canadian Office Action in Application No. 2,456,318 mailed May 5, 2008, all pages.
Canadian Office Action in Application No. 2,456,318 mailed Mar. 27, 2007, all pages.
Canadian Office Action in Application No. 2,459,334 mailed Mar. 4, 2011, 3 pages.
Canadian Office Action in Application No. 2,459,334 mailed Apr. 16, 2009, all pages.
Canadian Office Action in Application No. 2,466,667 mailed Apr. 15, 2009, all pages.
Canadian Office Action in Application No. 2,475,723 mailed Jul. 7, 2009, all pages.
Canadian Office Action in Application No. 2,554,208 mailed Apr. 1, 2010, all pages.
Canadian Office Action in Application No. 2,621,605 mailed Dec. 15, 2009, all pages.
Canadian Office Action in Application No. 2,451,477 mailed Oct. 20, 2010, 4 pages.
Cunningham et al., “5 Das X Window System”, Suse GMBH, XP-002324320, pp. 129-180 (2001).
Decision on Appeal affirmed in U.S. Appl. No. 09/590,434 mailed May 28, 2008, all pages.
Definition of “flag”, Microsoft Press: Computer User's Dictionary, 3 pages (1998).
Definition of “renting”, Webster's II: New College Dictionary, 1995, Houghton Mifflin Company, p. 39.
European Examination Report in Application No. 00 938 251.6 mailed Mar. 2, 2010, all pages.
European Examination Report in Application No. 00 938 251.6 mailed Nov. 2, 2007, all pages.
European Examination Report in Application No. 00 939 759.7 mailed May 10, 2007, all pages.
European Examination Report in Application No. 01 905 058.2 mailed Dec. 19, 2006, all pages.
European Examination Report in Application No. 01 922 261.1 mailed Jul. 18, 2008, all pages.
European Examination Report in Application No. 01 922 261.1 mailed Nov. 2, 2007, all pages.
European Examination Report in Application No. 01 922 261.1 mailed Jan. 24, 2007, all pages.
European Examination Report in Application No. 01 922 261.1 mailed May 26, 2006, all pages.
European Examination Report cited in Application No. 01 923 092.9 mailed Jul. 20, 2009, all pages.
European Examination Report in Application No. 01 923 092.9 mailed Nov. 27, 2008, all pages.
European Examination Report in Application No. 01 937 209.3 mailed Mar. 16, 2010, all pages.
European Examination Report in Application No. 01 937 209.3 mailed Jun. 23, 2008, all pages.
European Examination Report in Application No. 02 737 593.0 mailed May 6, 2009, all pages.
European Examination Report in Application No. 02 750 416.6 mailed Aug. 4, 2008, all pages.
European Examination Report in Application No. 02 750 416.6 mailed Aug. 28, 2007, all pages.
European Examination Report in Application No. 02 761 572.3 mailed Apr. 20, 2009, all pages.
European Examination Report in Application No. 02 761 572.3 mailed Sep. 22, 2008, all pages.
European Examination Report in Application No. 02 761 572.3 mailed Jan. 22, 2008, all pages.
European Examination Report in Application No. 02 761 572.3 mailed Aug. 29, 2007, all pages.
European Examination Report in Application No. 06 802 683.0 mailed Jun. 26, 2008, all pages.
Examiner's Answer to Appeal Brief in U.S. Appl. No. 09/590,488 mailed Jan. 11, 2008, all pages.
“Industry Leading Software Vendors Endorse BroadVision's Next Generation of Retail and Business-To-Business E-Commerce Application Solutions,” PR Newswire, Jun. 14, 1999, 4 pages.
Japanese Office Action in Application No. 2001-581527 mailed Feb. 10, 2010, all pages.
Japanese Office Action in Application No. 2001-581527 mailed Sep. 8, 2009, all pages.
Kevin, “Change Screen Resolution in Windows (Tips, Tricks, Tweaks, and Setting),” http://www.tacktech.com/display.cfm?ttid=207, pp. 1-3 (Oct. 26, 2002).
Leftwitch et al., “StarSight Interactive Television Program Guide—Functional/International Architecture Specification Document, Interaction Analysis and Design Project—Phase III,” published no later than Dec. 15, 1995, 36 pages.
Little et al., “Prospects for Interactive Video-On-Demand”, IEEE Multimedia, IEEE Service Center, New York, NY US, vol. 1 No. 3, Sep. 1994, pp. 14-24, XP000476885 ISSN: 1070-986X.
McFedries, “The Complete Idiot's Guide to Windows 95,” Que, 2nd Edition, p. 49 (1997).
PCT Search Report in International Application No. PCT/US00/15952 mailed Jan. 16, 2001, all pages.
PCT Search Report in International Application No. PCT/US00/15963 mailed Sep. 1, 2000, all pages.
PCT Search Report in International Application No. PCT/US00/16000 mailed Oct. 2, 2000, all pages.
PCT Search Report in International Application No. PCT/US01/02490 mailed May 18, 2001, all pages.
PCT Search Report in International Application No. PCT/US01/06663 mailed Oct. 18, 2001, all pages.
PCT Search Report in International Application No. PCT/US01/10874 mailed Nov. 29, 2001, all pages.
PCT Search Report in International Application No. PCT/US01/14150 mailed Apr. 29, 2002, all pages.
PCT Search Report in International Application No. PCT/US02/20307 mailed Jan. 3, 2003, all pages.
PCT Search Report in International Application No. PCT/US02/20519 mailed Apr. 7, 2003, all pages.
PCT Search Report in International Application No. PCT/US02/24704 mailed Mar. 5, 2003, all pages.
PCT Search Report in International Application No. PCT/US02/28212 mailed Jan. 23, 2003, all pages.
PCT Search Report in International Application No. PCT/US02/36291 mailed May 23, 2003, all pages.
PCT Search Report in International Application No. PCT/US03/03391 mailed Jul. 14, 2003, all pages.
PCT Search Report and Written Opinion in International Application No. PCT/US2005/001812 mailed May 2, 2005, all pages.
PCT Search Report and Written Opinion in International Application No. PCT/US2006/033965 mailed Feb. 9, 2007, all pages.
PCT Search Report and Written Opinion in International Application No. PCT/US2006/033965 mailed Feb. 19, 2007, all pages.
PCT Written Opinion in International Application No. PCT/US00/15952 mailed Jul. 25, 2001, all pages.
PCT Written Opinion in International Application No. PCT/US00/15963 mailed Jun. 22, 2001, all pages.
PCT Written Opinion in International Application No. PCT/US00/16000 mailed Oct. 25, 2001, all pages.
PCT Written Opinion in International Application No. PCT/US01/02490 mailed Oct. 23, 2001, all pages.
PCT Written Opinion in International Application No. PCT/US01/06663 mailed Jan. 3, 2002, all pages.
PCT Written Opinion in International Application No. PCT/US01/10874 mailed Jun. 4, 2002, all pages.
PCT Written Opinion in International Application No. PCT/US01/14150 mailed Sep. 30, 2004, all pages.
PCT Written Opinion in International Application No. PCT/US02/20307 mailed Aug. 8, 2003, all pages.
PCT Written Opinion in International Application No. PCT/US02/20519 mailed Apr. 6, 2004, all pages.
PCT Written Opinion in International Application No. PCT/US02/24704 mailed Nov. 20, 2003, all pages.
PCT Written Opinion in International Application No. PCT/US02/28212 mailed Dec. 4, 2003, all pages.
Summons to attend oral proceedings in EP Application No. 01937209.3 mailed Mar. 21, 2011, 7 pages.
Petit et al., “Bandwidth Resource Optimization in Video-On-Demand Network Architectures”, Community Networking Integrated Multimedia Services to the Home, 1994, Proceedings of the 1st International Workshop on San Francisco, CA USA, Jul. 1994, New York, NY USA, IEEE, Jul. 1994, pp. 91-97, XP010124402 ISBN: 978-0-7803-2076-5.
Reid, Dixie, “Coming attractions before they hit the big screen, most films begin life as a trailer,” The Sacramento Bee, Sacramento, California, p. E.1 (Jul. 18, 1996).
Remote Wonder, ATI, Tweak 3D, pp. 1-5 (Sep. 30, 2002).
Reply Brief in U.S. Appl. No. 09/565,931 mailed on Sep. 17, 2007, all pages.
Rottentomatoes web archived site, http://web.archive.org/web/20000301122211/http://rottentomatoes.com, Mar. 1, 2000, pp. 1-2.
VESA Plug and Display Standard, Version 1, Video Electronics Standards Association, XP-002123075, 90 pages (Jun. 11, 1997).
W3C, Putting language attributes in HTML, www.w3.org/International/O-help-lang, 2 pages (Apr. 29, 1997).
Summons to attend oral proceedings pursuant to Rule 115(1) EPC in European Application No. 02737593.0-1241 mailed May 28, 2010, all pages.
Supplementary European Search Report in European Application No. 02737593.0 mailed Mar. 3, 2009, all pages.
Supplementary European Search Report in European Application No. 02744705.1 mailed Feb. 19, 2010, all pages.
Supplementary European Search Report in European Application No. 02750416.6 mailed Jan. 2, 2007, all pages.
Supplementary European Search Report in European Application No. 02761572.3 mailed Mar. 20, 2007, all pages.
Supplementary European Search Report in European Application No. 02797096.1 mailed Oct. 14, 2005, all pages.
Supplementary European Search Report in European Application No. 03713364.2 mailed Jul. 6, 2005, all pages.
Canadian Office Action dated May 31, 2011 in Application No. 2,621,605, 2 pages.
U.S. Final Office Action cited in U.S. Appl. No. 09/518,041 mailed Jul. 7, 2008, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/518,041 mailed Jan. 10, 2008, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/518,041 mailed Aug. 24, 2007, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/518,041 mailed Feb. 6, 2007, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/518,041 mailed Aug. 28, 2006, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/518,041 mailed Sep. 15, 2005, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/518,041 mailed Apr. 22, 2005, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/518,041 mailed Oct. 20, 2004, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/518,041 mailed Feb. 11, 2004, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/518,041 mailed Aug. 27, 2003, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/518,041 mailed Mar. 18, 2003, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/542,484 mailed Jun. 17, 2008, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/542,484 mailed Dec. 7, 2007, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/542,484 mailed Mar. 12, 2007, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/542,484 mailed Sep. 7, 2006, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/542,484 mailed Mar. 21, 2006, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/542,484 mailed Jul. 28, 2005, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/542,484 mailed Mar. 22, 2005, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/542,484 mailed Apr. 1, 2004, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/565,931 mailed Oct. 28, 2009, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/565,931 mailed Jul. 14, 2009, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/565,931 mailed Feb. 13, 2009, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/565,931 mailed Jun. 15, 2006, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/565,931 mailed Jan. 11, 2006, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/565,931 mailed Jul. 1, 2005, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/565,931 mailed Sep. 10, 2004, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/590,434 mailed May 11, 2006, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/590,434 mailed Nov. 21, 2005, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/590,434 mailed Dec. 1, 2004, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/590,434 mailed Apr. 22, 2004, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/590,434 mailed Dec. 18, 2003, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/590,434 mailed May 23, 2003, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/590,488 mailed Feb. 27, 2007, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/590,488 mailed Oct. 26, 2006, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/590,488 mailed Jul. 10, 2006, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/590,488 mailed Dec. 20, 2005, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/590,488 mailed Jun. 30, 2005, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/590,488 mailed Nov. 16, 2004, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/590,488 mailed Jun. 7, 2004, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/590,488 mailed Dec. 16, 2003, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/590,488 mailed Jun. 10, 2003, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/590,904 mailed Sep. 13, 2007, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/590,904 mailed Mar. 26, 2007, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/590,904 mailed Nov. 15, 2006, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/590,904 mailed May 31, 2006, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/590,904 mailed Jan. 24, 2006, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/590,904 mailed Jul. 13, 2005, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/590,904 mailed Jan. 11, 2005, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/590,904 mailed Jun. 4, 2004, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/591,356 mailed Apr. 13, 2006, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/591,356 mailed Dec. 20, 2005, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/591,356 mailed Jun. 30, 2005, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/591,356 mailed May 10, 2004, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/591,356 mailed Jan. 14, 2004, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/591,356 mailed Sep. 26, 2003, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/591,356 mailed May 21, 2003, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/692,920 mailed Jul. 22, 2008, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/692,920 mailed Jan. 17, 2008, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/692,920 mailed Jun. 14, 2007, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/692,920 mailed Nov. 24, 2006, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/692,920 mailed Jun. 21, 2006, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/692,920 mailed Feb. 16, 2006, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/692,920 mailed Jun. 17, 2005, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/692,920 mailed Nov. 18, 2004, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/692,995 mailed Sep. 4, 2009, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/692,995 mailed Jan. 23, 2009, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/692,995 mailed Sep. 8, 2006, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/692,995 mailed Mar. 27, 2006, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/692,995 mailed Sep. 21, 2005, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/692,995 mailed May 3, 2005, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/692,995 mailed Oct. 21, 2004, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/692,995 mailed Apr. 26, 2004, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/692,995 mailed Dec. 5, 2003, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/693,115 mailed Jan. 25, 2005, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/693,115 mailed Jun. 16, 2004, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/693,115 mailed Feb. 9, 2004, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/693,115 mailed Sep. 26, 2003, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/693,288 mailed Feb. 8, 2011, 28 pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/693,288 mailed Jun. 21, 2010, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/693,288 mailed Dec. 1, 2005, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/693,288 mailed Jul. 19, 2005, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/693,288 mailed Feb. 10, 2005, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/693,288 mailed Jul. 15, 2004, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/693,288 mailed Feb. 26, 2004, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/693,288 mailed Oct. 27, 2003, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/693,790 mailed Jul. 25, 2008, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/693,790 mailed Jan. 15, 2008, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/693,790 mailed Jun. 19, 2007, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/693,790 mailed Dec. 28, 2006, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/693,790 mailed Jun. 16, 2006, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/693,790 mailed Dec. 28, 2005, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/693,790 mailed Jun. 21, 2005, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/693,790 mailed Oct. 6, 2004, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/881,516 mailed Jun. 3, 2005, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/881,516 mailed Dec. 29, 2004, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/881,516 mailed Jul. 26, 2004, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/881,516 mailed Apr. 21, 2004, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/881,516 mailed Oct. 28, 2003, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/894,508 mailed Sep. 17, 2009, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/894,508 mailed Feb. 4, 2009, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/894,508 mailed Jun. 13, 2008, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/894,508 mailed Dec. 31, 2007, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/894,508 mailed Jul. 26, 2007, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/896,231 mailed May 28, 2010, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/896,231 mailed Nov. 17, 2009, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/896,231 mailed Jun. 3, 2009, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/896,231 mailed Dec. 23, 2008, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/896,231 mailed Dec. 29, 2005, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/896,231 mailed Jun. 23, 2005, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/924,111 mailed Aug. 7, 2008, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/924,111 mailed Jan. 29, 2008, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/924,111 mailed Oct. 5, 2007, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/924,111 mailed Apr. 19, 2007, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/924,111 mailed Sep. 18, 2006, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/924,111 mailed Mar. 15, 2006, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/947,890 mailed Nov. 24, 2009, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/947,890 mailed Apr. 10, 2009, all pages.
U.S. Final Office Action in U.S. Appl. No. 09/947,890 mailed Nov. 6, 2008, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 09/947,890 mailed Jun. 25, 2008, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/740,138 mailed Oct. 27, 2010, 23 pages.
U.S. Final Office Action in U.S. Appl. No. 10/740,138 mailed Jan. 15, 2010, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/740,138 mailed Sep. 3, 2009, all pages.
U.S. Final Office Action in U.S. Appl. No. 10/740,138 mailed Mar. 19, 2009, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/740,138 mailed Sep. 15, 2008, all pages.
U.S. Final Office Action in U.S. Appl. No. 10/740,138 mailed Jun. 11, 2008, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/740,138 mailed Oct. 2, 2007, all pages.
U.S. Final Office Action in U.S. Appl. No. 10/778,494 mailed Jul. 25, 2008, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/778,494 mailed Jan. 16, 2008, all pages.
U.S. Final Office Action in U.S. Appl. No. 10/778,494 mailed May 22, 2007, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/778,494 mailed Feb. 2, 2007, all pages.
U.S. Final Office Action in U.S. Appl. No. 10/778,494 mailed Aug. 28, 2006, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/778,494 mailed Dec. 29, 2004, all pages.
U.S. Final Office Action in U.S. Appl. No. 10/783,235 mailed Oct. 2, 2008, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/783,235 mailed Feb. 25, 2008, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/934,253 mailed Nov. 4, 2010, 10 pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/934,253 mailed Apr. 27, 2010, all pages.
U.S. Final Office Action in U.S. Appl. No. 10/934,253 mailed Dec. 23, 2009, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/934,253 mailed Jun. 26, 2009, all pages.
U.S. Final Office Action in U.S. Appl. No. 10/934,253 mailed Dec. 26, 2008, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/934,253 mailed Jun. 17, 2008, all pages.
U.S. Final Office Action in U.S. Appl. No. 10/934,253 mailed Jul. 24, 2007, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/934,253 mailed Feb. 9, 2007, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/934,253 mailed Sep. 14, 2006, all pages.
U.S. Final Office Action in U.S. Appl. No. 10/957,849 mailed Aug. 8, 2008, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/957,849 mailed Apr. 30, 2008, all pages.
U.S. Final Office Action in U.S. Appl. No. 10/957,854 mailed Apr. 1, 2010, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/957,854 mailed Sep. 28, 2009, all pages.
U.S. Final Office Action in U.S. Appl. No. 10/957,854 mailed Apr. 7, 2009, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/957,854 mailed Oct. 15, 2008, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/957,854 mailed Apr. 30, 2008, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/957,942 mailed Jun. 30, 2010, all pages.
U.S. Final Office Action in U.S. Appl. No. 10/957,942 mailed Jul. 28, 2009, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/957,942 mailed Jan. 14, 2009, all pages.
U.S. Final Office Action in U.S. Appl. No. 10/957,942 mailed Jul. 31, 2008, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/957,942 mailed May 1, 2008, all pages.
U.S. Final Office Action in U.S. Appl. No. 10/981,053 mailed Jan. 21, 2010, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/981,053 mailed Apr. 15, 2009, all pages.
U.S. Final Office Action in U.S. Appl. No. 10/981,053 mailed Aug. 6, 2008, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/981,053 mailed Jan. 2, 2008, all pages.
U.S. Final Office Action in U.S. Appl. No. 10/994,599 mailed Dec. 1, 2006, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/994,599 mailed May 16, 2006, all pages.
U.S. Final Office Action in U.S. Appl. No. 10/994,599 mailed Jan. 26, 2006, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/994,599 mailed Aug. 23, 2005, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/162,345 mailed Feb. 1, 2011, 33 pages.
U.S. Final Office Action in U.S. Appl. No. 11/162,345 mailed Mar. 16, 2010, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/162,345 mailed Aug. 21, 2009, all pages.
U.S. Final Office Action in U.S. Appl. No. 11/162,345 mailed Mar. 9, 2009, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/162,345 mailed Oct. 31, 2008, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/170,348 mailed Oct. 26, 2010, 13 pages.
U.S. Final Office Action in U.S. Appl. No. 11/170,348 mailed Feb. 1, 2010, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/170,348 mailed Sep. 30, 2009, all pages.
U.S. Final Office Action in U.S. Appl. No. 11/170,348 mailed May 28, 2009, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/170,348 mailed Dec. 11, 2008, all pages.
U.S. Final Office Action in U.S. Appl. No. 11/208,387 mailed Dec. 22, 2008, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/208,387 mailed Jun. 12, 2008, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/234,967 mailed Sep. 10, 2010, all pages.
U.S. Final Office Action in U.S. Appl. No. 11/238,369 mailed Aug. 31, 2010, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/238,369 mailed Mar. 30, 2010, all pages.
U.S. Final Office Action in U.S. Appl. No. 11/244,621 mailed Aug. 18, 2010, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/244,621 mailed Feb. 5, 2010, all pages.
U.S. Final Office Action in U.S. Appl. No. 11/244,621 mailed Sep. 17, 2009, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/244,621 mailed Mar. 19, 2009, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/244,621 mailed Sep. 19, 2008, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/275,245 mailed May 5, 2010, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/275,245 mailed Oct. 22, 2009, all pages.
U.S. Final Office Action in U.S. Appl. No. 11/275,245 mailed Jul. 29, 2009, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/275,245 mailed Sep. 22, 2008, all pages.
U.S. Final Office Action in U.S. Appl. No. 11/460,516 mailed Mar. 18, 2010, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/460,516 mailed Jun. 26, 2009, all pages.
U.S. Final Office Action in U.S. Appl. No. 11/460,516 mailed Feb. 13, 2009, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/460,516 mailed Sep. 17, 2008, all pages.
U.S. Final Office Action in U.S. Appl. No. 11/496,303 mailed Jul. 22, 2010, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/496,303 mailed Mar. 2, 2010, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/496,303 mailed Sep. 29, 2009, all pages.
U.S. Final Office Action in U.S. Appl. No. 11/496,303 mailed Apr. 1, 2009, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/496,303 mailed Sep. 18, 2008, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/564,431 mailed Jul. 20, 2010, all pages.
U.S. Final Office Action in U.S. Appl. No. 11/564,431 mailed Jan. 4, 2010, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/564,431 mailed Aug. 24, 2009, all pages.
U.S. Final Office Action in U.S. Appl. No. 11/678,653 mailed Jun. 23, 2010, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/678,653 mailed Dec. 16, 2009, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/963,942 mailed Jun. 8, 2010, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/963,945 mailed Jul. 16, 2010, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 11/963,951 mailed Aug. 2, 2010, all pages.
U.S. Final Office Action in U.S. Appl. No. 12/127,968 mailed Sep. 14, 2010, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 12/127,968 mailed Mar. 31, 2010, all pages.
U.S. Final Office Action in U.S. Appl. No. 12/127,968 mailed Dec. 1, 2009, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 12/127,968 mailed Apr. 30, 2009, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 12/179,752 mailed Dec. 23, 2009, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 12/179,763 mailed Jan. 4, 2011, 18 pages.
U.S. Final Office Action in U.S. Appl. No. 12/179,767 mailed Aug. 20, 2010, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 12/179,767 mailed Jan. 22, 2010, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 12/180,416 mailed Jun. 25, 2010, all pages.
U.S. Final Office Action in U.S. Appl. No. 12/372,887 mailed Apr. 14, 2010, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 12/372,887 mailed Oct. 16, 2009, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 12/372,894 mailed Oct. 27, 2009, all pages.
U.S. Final Office Action in U.S. Appl. No. 12/372,917 mailed May 17, 2010, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 12/372,917 mailed Oct. 26, 2009, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 12/388,002 mailed Sep. 3, 2010, all pages.
U.S. Final Office Action in U.S. Appl. No. 12/388,139 mailed Jul. 6, 2010, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 12/388,139 mailed Dec. 15, 2009, all pages.
U.S. Final Office Action in U.S. Appl. No. 12/389,128 mailed Nov. 9, 2010, 50 pages.
U.S. Non-Final Office Action in U.S. Appl. No. 12/389,128 mailed Jun. 2, 2010, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 12/389,564 mailed Jan. 21, 2011, 13 pages.
U.S. Non-Final Office Action in U.S. Appl. No. 12/389,564 mailed Aug. 23, 2010, all pages.
U.S. Final Office Action in U.S. Appl. No. 12/389,564 mailed Apr. 28, 2010, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 12/389,564 mailed Nov. 10, 2009, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 12/390,418 mailed Sep. 28, 2010, 13 pages.
U.S. Non-Final Office Action in U.S. Appl. No. 12/390,420 mailed Oct. 19, 2010, 12 pages.
U.S. Non-Final Office Action in U.S. Appl. No. 12/390,422 mailed Oct. 20, 2010, 13 pages.
U.S. Non-Final Office Action in U.S. Appl. No. 12/413,686 mailed Mar. 17, 2011, 20 pages.
U.S. Final Office Action in U.S. Appl. No. 12/413,686 mailed Jun. 10, 2010, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 12/413,686 mailed Nov. 30, 2009, all pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/934,253 mailed Apr. 29, 2011, 11 pages.
U.S. Final Office Action in U.S. Appl. No. 12/389,564 mailed May 19, 2011, 15 pages.
U.S. Non-Final Office Action in U.S. Appl. No. 10/740,138 mailed Jun. 8, 2011, 26 pages.
U.S. Final Office Action in U.S. Appl. No. 11/170,348 mailed Jun. 9, 2011, 14 pages.
U.S. Office Action mailed Aug. 14, 2015 in U.S. Appl. No. 14/294,624, 20 pages.
U.S. Office Action mailed Aug. 24, 2015 in U.S. Appl. No. 14/287,339, 20 pages.
U.S. Office Action mailed Nov. 27, 2015 in U.S. Appl. No. 14/294,624, 11 pages.
U.S. Office Action mailed Jan. 4, 2016 in U.S. Appl. No. 13/482,497, 22 pages.
Related Publications (1)
Number Date Country
20120188445 A1 Jul 2012 US
Divisions (1)
Number Date Country
Parent 10761777 Jan 2004 US
Child 13438511 US