Television functionality on a chip

Information

  • Patent Grant
  • Patent Number
    7,489,362
  • Date Filed
    Wednesday, March 3, 2004
  • Date Issued
    Tuesday, February 10, 2009
Abstract
A television on a chip (TVOC) system that provides a cost-effective approach to implementing television functionality on a single integrated circuit chip is disclosed. A TVOC includes the functionality necessary to receive and display television signals in a variety of input and output formats. A TVOC can be used in set-top boxes for cable and satellite television, or directly within a television. All of this functionality can be provided on a single integrated circuit. The TVOC includes a data transport module, an IF demodulator, a digital audio engine, an analog audio engine, a digital video engine, and an analog video engine. The TVOC also includes three sets of interfaces: output interfaces, control interfaces, and ancillary interfaces. Further features and embodiments provide enhanced functionality and increased efficiencies.
Description
BACKGROUND OF THE INVENTION

1. Field of Invention


The present invention relates to television technology, and more particularly to providing television functionality on a single integrated circuit chip.


2. Background


Television systems have become increasingly complex as consumers continue to demand greater functionality and performance from television sets. Furthermore, the miniaturization of television systems demands that, even as this complexity increases, the size of the electronic circuitry supporting it be reduced. At the same time, market forces continue to drive prices for television sets lower. The electronic circuitry currently needed to receive analog or digital audio and video signals and process them into a signal suitable for display on a television often consists of several integrated circuits. Furthermore, additional functionality related to value-added features, such as teletext or e-commerce, often requires additional integrated circuits.


What is needed is a system for providing television functionality and ancillary functionality on a single integrated circuit chip to reduce costs and support the continued miniaturization of television electronics.


SUMMARY OF THE INVENTION

The present invention provides a cost-effective approach for implementing television functionality on a single integrated circuit chip (referred to herein as "TV on a Chip" or TVOC). A TVOC includes functionality to receive and display television signals in a variety of input and output formats. A TVOC can be used in set-top boxes for cable and satellite television, or directly within a television. All or substantially all of this functionality can be provided on a single integrated circuit. A TVOC includes one or more of a data transport module, an IF demodulator, a digital audio engine, an analog audio engine, a digital video engine, and an analog video engine. The TVOC also includes three sets of interfaces: output interfaces, control interfaces, and ancillary interfaces.


The present invention addresses the competing consumer demands for television system miniaturization and lower-cost televisions.


Further embodiments, features, and advantages of the present invention, as well as the structure and operation of the various embodiments of the present invention, are described below.





BRIEF DESCRIPTION OF THE FIGURES

The invention is described with reference to the accompanying drawings. In the drawings, like reference numbers indicate identical or functionally similar elements. The drawing in which an element first appears is indicated by the left-most digit in the corresponding reference number.



FIG. 1 is a functional block diagram of a television on a chip, according to an embodiment of the invention.



FIG. 2 is a detailed functional block diagram of a television on a chip, according to an embodiment of the invention.



FIG. 3 illustrates a block diagram of an exemplary digital signal processor according to embodiments of the present invention.



FIG. 4 illustrates a block diagram of an exemplary reset synchronization system according to embodiments of the present invention.



FIG. 5 shows a block diagram of an I/O multiplexer system according to embodiments of the present invention.



FIG. 6 illustrates a block diagram of an exemplary adaptable strapping system according to an embodiment of the present invention.



FIG. 7 illustrates a block diagram of an exemplary multi-channel audio interconnect system.



FIG. 8 illustrates a block diagram of an exemplary FM demodulation system.



FIG. 9 illustrates a block diagram of an exemplary video data stream front end processor.



FIG. 10 illustrates a block diagram of an exemplary packet substitution module.



FIG. 11 illustrates a block diagram of an exemplary media processing system.



FIG. 12 illustrates a block diagram of an exemplary video system having an artificial time stamp module.



FIG. 13 illustrates a block diagram of an exemplary RAM implementation of a color lookup table and gamma correction function.



FIG. 14 illustrates a block diagram of an exemplary 2D adaptive comb filter.



FIG. 15 illustrates a block diagram of an exemplary timing generator.



FIG. 16 illustrates a block diagram of an exemplary system for detecting and decoding teletext message sequences.



FIG. 17 illustrates a block diagram of an exemplary copy-protection filter.



FIG. 18 illustrates a block diagram of an exemplary signal mode detection system.





DETAILED DESCRIPTION OF THE INVENTION

While the present invention is described herein with reference to illustrative embodiments for particular applications, it should be understood that the invention is not limited thereto. Those skilled in the art with access to the teachings provided herein will recognize additional modifications, applications, and embodiments within the scope thereof and additional fields in which the present invention would be of significant utility.



FIG. 1 provides a functional block diagram of television on a chip (TVOC) 100, according to an embodiment of the invention. A TVOC includes the functionality necessary to receive and display television signals in a variety of input and output formats. A TVOC can be used in set-top boxes for cable and satellite television, or directly within a television. All of this functionality can be provided on a single integrated circuit. TVOC 100 includes data transport 105, IF demodulator 110, digital audio engine 115, analog audio engine 120, digital video engine 125, analog video engine 130, output interfaces 135, control interfaces 140, ancillary interfaces 145, and system controller 150.


Data transport 105 can receive digital video data and perform preprocessing of the data for use by other TVOC 100 components.


IF demodulator 110 contains functionality to receive analog television signals and perform processing to produce digital and analog video and audio signals for use within the other components of TVOC 100.


Digital audio engine 115 can process digital audio signals, while analog audio engine 120 can process analog audio signals. Similarly, digital video engine 125 can process digital video signals, while analog video engine 130 can process analog video signals. Digital video engine 125 and analog video engine 130 also have the capability to process graphics, and integrate those graphics within the video and audio outputs.


Output interfaces 135 can output various types of analog and digital audio and video signals. Control interfaces 140 can provide a means for external control of TVOC 100. Ancillary interfaces 145 can provide access to various types of ancillary input and output devices, such as access to establish an Ethernet connection with the TVOC 100. System controller 150 can control overall operation of the circuitry within TVOC 100.



FIG. 2 provides a detailed functional block diagram of TVOC 200, according to an embodiment of the invention. TVOC implementations will vary depending on the particular environment in which a TVOC will be used. For example, depending on the country of use, a TVOC may be required to support different types of video or audio standards. Additionally, a TVOC may be required to support standard definition television signal broadcasts, high definition television signal broadcasts, or both. Increasingly, cable set-top box and television set manufacturers require that chips and chip sets support a wide range of standards and uses. TVOC 200 is an embodiment of a TVOC that provides a set of functionalities meeting this need. TVOC 200 is provided to illustrate one embodiment of a TVOC, and is not intended to be limiting. Based on the teachings herein, individuals skilled in the relevant arts will be able to ascertain alternative embodiments to TVOC 200. It is intended that these embodiments are within the scope of the invention.


As in the case of TVOC 100, TVOC 200 includes input elements, video and graphics processing elements, audio processing elements, control elements, control interfaces, output interfaces and ancillary interfaces. Additionally, TVOC 200 includes elements that further facilitate the integration of television functionality on a single integrated circuit.


The input elements of TVOC 200 include intermediate frequency (IF) demodulator 202 and data transport module 204. IF demodulator 202 operates on an analog television signal to produce a digital baseband composite video broadcasting signal (CVBS) and a digital audio signal, such as a monaural or Broadcast Television System Committee (BTSC) baseband multiplex signal. The analog input signals can include, among others, National TV System Committee (NTSC), Phase Alternation Line Rate (PAL), and Sequential Couleur Avec Memoire (SECAM) television signals.


In an embodiment, IF demodulator 202 is optionally a digital IF demodulator that receives and demodulates an analog IF input signal to produce a digital audio signal and a digital video signal. Where the analog IF input signal is a television signal or channel that has been down-converted from RF, IF demodulator 202 includes an A/D converter, a first digital complex mixer, a second digital complex mixer, and various digital filters. The A/D converter receives the analog IF input signal and converts it to digital. The first digital complex mixer receives the output of the A/D converter and down-converts the output of the A/D converter to baseband.


Additionally, a picture carrier is recovered from the output of the first digital complex mixer, and fed back to a direct digital synthesizer to control the tuning accuracy of the first digital complex mixer. More specifically, a feedback loop is configured so that the picture carrier is down-converted to DC so as to control the tuning accuracy of the first digital complex mixer. The complex output of the first complex mixer is further processed using Nyquist filtering and other filtering to recover the digital video signal. Such filtering may be accomplished using digital signal processing techniques. The digital audio signal is recovered by further processing the output of the first digital complex mixer. A digital IF demodulator embodiment of IF demodulator 202 is further taught in copending application, U.S. patent application Ser. No. 10/448,062, filed May 30, 2003, which is incorporated herein by reference in its entirety.
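
By way of illustration only, the following Python sketch shows the general idea of the first digital complex mixing stage: a digitized IF signal is multiplied by a complex exponential to shift the channel to baseband and then low-pass filtered. The carrier-recovery feedback loop and Nyquist filtering described above are omitted, and the function name, filter design, and cutoff value are illustrative assumptions rather than details taken from the copending application.

```python
import numpy as np

def downconvert_to_baseband(if_samples, f_if, f_s, num_taps=101):
    """Mix a digitized IF signal to complex baseband and low-pass filter it.

    if_samples : real-valued samples from the A/D converter
    f_if       : nominal IF carrier frequency in Hz
    f_s        : sample rate in Hz
    """
    n = np.arange(len(if_samples))
    # First digital complex mixer: multiply by a complex exponential at -f_if
    lo = np.exp(-2j * np.pi * f_if * n / f_s)
    baseband = if_samples * lo

    # Simple low-pass FIR, a stand-in for the Nyquist and channel filtering
    cutoff = 0.1  # normalized cutoff (fraction of f_s), illustrative value
    taps = np.sinc(2 * cutoff * (np.arange(num_taps) - (num_taps - 1) / 2))
    taps *= np.hamming(num_taps)
    taps /= taps.sum()
    return np.convolve(baseband, taps, mode="same")
```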


In an embodiment, IF demodulator 202 can optionally include an FM demodulation system for approximating y(n) = 1/x(n), where x(n) = I²(n) + Q²(n). An exemplary FM demodulation system 800 is illustrated in FIG. 8. System 800 can include a filter 806 (e.g., a Hilbert filter) that generates a quadrature-phase signal Q(n) 804 from I(n) 810. The signals I(n) 810 and Q(n) 804 are input into an FM demodulator 802, which produces the FM(n) output signal 812. A received secondary audio program (SAP) signal must be processed in order to generate a pulse code modulated (PCM) output signal. Typically, a SAP signal is band pass filtered, FM demodulated, and processed using a variable de-emphasis algorithm to produce the PCM output. The FM demodulation can be carried out using the equation FM(n) = [I(n)Q′(n) − I′(n)Q(n)] / [I²(n) + Q²(n)]. Conventional systems typically calculate the numerator and ignore the denominator because the division is too complex; conventional processors do not have enough hardware and/or software support to perform such a division. Thus, a noise signal received by an FM demodulator is passed on in the FM(n) output signal because the denominator is not calculated along with the numerator. This noise can cause problems during subsequent signal processing.


The optional FM demodulation system described above addresses this problem by approximating the denominator during demodulation of an FM signal. The FM demodulation system receives a prior estimated value of 1/x(n) and a present value of x(n). The FM demodulation system also adjusts the prior estimated value of 1/x(n) to compensate for an error between the prior estimated value of 1/x(n) and the present value of 1/x(n). The FM demodulation system outputs an adjusted prior estimated value of 1/x(n) as the present value of 1/x(n). An FM demodulation system is further taught in copending application, U.S. patent application Ser. No. 10/629,797, filed Jul. 30, 2003, which is incorporated herein by reference in its entirety.
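
The copending application does not specify the update rule here; the sketch below assumes a standard Newton-Raphson reciprocal iteration, which adjusts a prior estimate of 1/x(n) using the present value of x(n) in the manner described above. Function names and the state-carrying convention are hypothetical.

```python
def refine_reciprocal(y_prev, x):
    """One Newton-Raphson step refining an estimate of 1/x.

    y_prev : prior estimated value of 1/x(n)
    x      : present value of x(n) = I(n)**2 + Q(n)**2
    """
    return y_prev * (2.0 - x * y_prev)

def fm_demodulate(i, q, di, dq, y_prev):
    """Compute FM(n) = [I*Q' - I'*Q] / [I**2 + Q**2] using a running
    reciprocal estimate instead of a hardware divide."""
    x = i * i + q * q
    y = refine_reciprocal(y_prev, x)   # approximate 1/x(n)
    fm = (i * dq - di * q) * y         # numerator scaled by ~1/x(n)
    return fm, y                       # carry the estimate forward to the next sample
```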


Data transport module 204 receives compressed video data, normally in the form of MPEG or DIRECTV transport streams. Data transport module 204 parses the streams and performs preprocessing. Data transport module 204 can optionally support personal video recording.


In an embodiment, data transport module 204 can optionally include a video data stream front end processor. When program channels are transmitted to customer premise devices, such as a television or cable set top box, data packets for program channels are typically multiplexed together into a video data stream. A common way to identify packets within a video data stream that are associated with a particular program channel uses program identification (PID) information. This approach is referred to herein as a PID parsing approach. In this approach, each packet within a video data stream contains a PID that contains information that tells a customer premise device which channel a particular packet of information is associated with.


In another approach, data packets within a video data stream that are associated with a particular program are allocated within particular time slots of a data frame. With this approach a distribution device, such as a cable system headend, organizes data packets into frames with packets for a particular program channel located in specific time slots within the frames. This approach is referred to herein as a time division multiplexing approach. An example of this approach is transport stream multiplexing format (TSMF), which has been recently developed by a consortium of television manufacturers for use in Japan. Within this approach it is also common that packets will contain a PID. The video data stream front end processor provides an efficient way to demultiplex video signals that use both a PID and are multiplexed using a time division multiplexing approach.


An exemplary video data stream front end processor system 900 as illustrated in FIG. 9 includes synchronizers 910A-910E, parsers 920A-920E, demultiplexers 930A and 930B, and an input buffer 940. Each demultiplexer within the system includes a header detect module, a slot map module, a frame sync module and a packet accept module. The video data stream front end processor analyzes the PID and slot location for each packet received in a data stream, and based on the PID and slot location a packet is either accepted or rejected. The video data stream front end processor is further taught in co-pending application, U.S. patent application Ser. No. 10/640,682 filed Aug. 14, 2003, which is herein incorporated by reference in its entirety.
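
The following sketch illustrates, under assumed field names and a hypothetical slot map, how a packet can be accepted or rejected based on both its PID and its slot location. It is a simplification of the synchronizer, parser, and demultiplexer structure described above, not a reproduction of it.

```python
from dataclasses import dataclass

@dataclass
class Packet:
    pid: int        # program identification carried in the packet header
    slot: int       # time-slot index within the TSMF-style frame
    payload: bytes

def accept_packet(pkt, wanted_pids, slot_map):
    """Accept a packet only if its PID is of interest and it arrived in a
    slot that the frame's slot map assigns to that PID."""
    if pkt.pid not in wanted_pids:
        return False
    return slot_map.get(pkt.slot) == pkt.pid

# Example: slots 0 and 3 of the frame carry PID 0x101, slots 1 and 2 carry 0x202
slot_map = {0: 0x101, 3: 0x101, 1: 0x202, 2: 0x202}
stream = [Packet(0x101, 0, b"A"), Packet(0x202, 1, b"B"), Packet(0x101, 2, b"C")]
kept = [p for p in stream if accept_packet(p, {0x101}, slot_map)]
# kept contains only the first packet: right PID *and* right slot
```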


In an embodiment, data transport module 204 can optionally include a packet substitution module for substituting packets into a data stream. When program channels are transmitted to customer premise devices, such as a television or cable set top box, data packets for program channels are typically multiplexed together into a video data stream. When the video data stream is received by a customer premise device, the customer premise device typically parses the incoming stream to retain only those packets that pertain to the channel of interest to be viewed or recorded.


The parsing process can create gaps in the data stream where packets associated with channels other than the channel of interest were located. As the complexity of data processing systems increases, so does the need for a robust and flexible approach to substituting packets into a data stream, such as a video data stream, without impairing performance. In particular, system and management information associated with a data stream is likely to change, for example, as a result of adjustments to the encryption or organization of packets within the data stream. Moreover, different types of information may need to be inserted into a data stream, and the different types of information may have different timing requirements. Different types of information can include, for example, system information for video control, system information for audio control, overall system management information, or customer-specific application information.


An exemplary packet substitution module system 1000 as illustrated in FIG. 10 includes a set of packet buffers 1010A-1010D that buffer packets to be substituted into the data stream. The packet substitution module system 1000 also includes a multiplexer 1020 that obtains packets from the set of packet buffers 1010A-1010D and substitutes them into the data stream. The operation of multiplexer 1020 is controlled by a packet substitution controller 1030. A direct memory access engine provides packets for insertion to the packet buffers based on a link list buffer control that is coupled to the direct memory access engine. The packet substitution module is further taught in co-pending application, U.S. patent application Ser. No. 10/640,684, filed Aug. 14, 2003, which is herein incorporated by reference in its entirety.
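
As an illustration of the substitution concept, the sketch below fills gap positions left by parsing with packets drawn from a set of substitution buffers in priority order. The buffer organization, priority scheme, and names are assumptions made for illustration; the actual module uses a DMA engine and link list buffer control as described above.

```python
from collections import deque

def substitute_packets(stream, substitution_buffers, gap=None):
    """Replace gap positions (left behind by parsing) with packets taken
    from a set of substitution buffers, scanned in priority order.

    stream               : list of packets, with `gap` marking removed packets
    substitution_buffers : list of deques, highest priority first
    """
    out = []
    for pkt in stream:
        if pkt is not gap:
            out.append(pkt)
            continue
        for buf in substitution_buffers:   # multiplexer choosing a source buffer
            if buf:
                out.append(buf.popleft())
                break
        else:
            out.append(gap)                # nothing available, keep the gap
    return out

# Example: system-information packets get priority over application packets
si = deque(["SI-1", "SI-2"])
app = deque(["APP-1"])
print(substitute_packets(["V1", None, "V2", None, None], [si, app]))
# ['V1', 'SI-1', 'V2', 'SI-2', 'APP-1']
```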


In an embodiment, data transport module 204 can optionally include a media processing system for processing multiple program channels containing one or more data packets. TVOC 200 may use a system time clock that is independent of the system time clock of a transmitting system. Because these system time clocks are asynchronous, the receiver clock may run at a slightly different rate than the transmitter clock. For example, when the receiver clock runs slower than the transmitter clock, data packets arrive faster than the receiver processes them, causing the input buffer to overflow. In addition, the lack of precise timing between the transmitter and receiver creates deleterious effects on the video display, such as a lack of synchronization between video and audio. A common technique to minimize the impact of these timing errors is to synchronize the system time clock of the receiver to the system time clock of the transmitting device. This clock "locking" technique reduces timing errors associated with data packets from the transmitter to which the receiver clock is locked.


However, in modern systems, data from multiple transmission sources (for example, multiple programs) are often multiplexed together in a single transport stream. If traditional clock locking techniques were used, TVOC 200 would be able to synchronize to the system time clock of only one of these transmission sources. As a result, the data streams from the other transmission sources having different system time clocks are susceptible to the timing errors discussed above.


In addition, while being processed by the receiver, data packets may experience delay. This internal receiver delay introduces error into the clock reference values being forwarded to downstream systems and applications. This error impacts the ability of a downstream system or application to synchronize its system time clock with the system time clock of the original transmitting system. The media processing system addresses these challenges. An exemplary media processing system 1100 as illustrated in FIG. 11 includes a processor 1102 and a set of timestamp insertion modules 1104A-1104N coupled to the processor 1102. The number of timestamp insertion modules 1104 would be equal to the number of program channels being processed. Each timestamp insertion module 1104 would include a local program reference clock, a synchronizer that synchronizes the local program reference clock to a clock of the device transmitting the program being processed, and a local program timestamp generator. An input buffer 1108 may be coupled to the timestamp insertion modules. The media processing system is further taught in co-pending application, U.S. Patent Application No. 60/495,129, filed Aug. 15, 2003, which is herein incorporated by reference in its entirety.
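
The sketch below illustrates the per-program idea only: one timestamp insertion module per program channel, each slewing a local program reference clock toward its transmitter's clock references and stamping packets on arrival. The slew rule, gain value, and packet representation are illustrative assumptions.

```python
class TimestampInsertionModule:
    """One per program channel: keeps a local program reference clock
    loosely locked to that program's transmitter clock and stamps packets."""

    def __init__(self, slew_gain=0.05):
        self.local_clock = 0.0
        self.slew_gain = slew_gain   # how aggressively to track the sender

    def on_clock_reference(self, transmitter_time):
        # Nudge the local clock toward the transmitter's reference value
        self.local_clock += self.slew_gain * (transmitter_time - self.local_clock)

    def stamp(self, packet, ticks_elapsed):
        # Advance the local clock and record the arrival time on the packet
        self.local_clock += ticks_elapsed
        packet["arrival_timestamp"] = self.local_clock
        return packet

# One module per multiplexed program, so each program tracks its own clock
modules = {pid: TimestampInsertionModule() for pid in (0x101, 0x202)}
pkt = modules[0x101].stamp({"pid": 0x101, "payload": b""}, ticks_elapsed=300)
```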


The video processing elements of the TVOC 200 include analog video decoder 210, digital video decoder 212, video and graphics processing module 214, 3D engine 215, 2D engine 216 and video encoder 218. Analog video decoder 210 supports high quality decoding of a standard definition composite video broadcasting signal (CVBS) and S-Video signal. Digital video decoder 212 decodes video signals that comply with the International Telecommunications Union (ITU) standard ITU-R-656 at varying resolutions including 525i, 625i and 240p.


In an embodiment, analog video decoder 210 and/or digital video decoder 212 can optionally include a 2-D adaptive comb filter. A composite video signal is the sum of a luminance (brightness) signal and a chrominance (color) signal. These signals are also referred to as luma and chroma signals, respectively. The frequency ranges of the luma and chroma signals are designed to overlap. A problem created by overlapping the luma and chroma frequency spectra is that it is difficult to separate them completely.


The 2-D adaptive comb filter addresses this challenge. It filters the video information both horizontally along the scan lines and vertically between the horizontal scan lines. To filter horizontally, a low-pass or bandpass filter is used. To filter vertically, a combing procedure is used. Adaptive logic solves the problem of combing at a vertical color transition point. The logic examines three successive horizontal scan lines simultaneously. At a vertical transition between two colors, either the top two or the bottom two scan lines will usually be the same. Initially, for example, the top two of the three scan lines will be the same color. The logic directs those lines to a digital version of the comb filter. When the scan moves down another line, the scan line triplet includes a new bottom line. The bottom two scan lines of the new triplet will have the new color, and those two lines will be directed to the comb filter. In this manner, two lines with different colors are not input to the comb filter at a transition boundary.


An exemplary 2D adaptive comb filter 1400 as illustrated in FIG. 14 includes a band limiting filter 1402, a decision stage 1404, a blending stage 1406, and a processor 1410. A composite video signal including luma and chroma components enters the band limiting filter 1402 after passing through an analog-to-digital converter. After the band limiting filter, the original signal is split among three different paths. The first path is the decision stage. The decision stage includes a wide bandpass filter, an interpolator, and chroma decision logic. The second path is the blending stage. The blending stage includes a narrow bandpass filter, a chroma comb filter, and a blending module. The third path leads directly to the processor. The 2D adaptive comb filter is further taught in co-pending application, U.S. patent application Ser. No. 10/641,160, filed Aug. 15, 2003, which is herein incorporated by reference in its entirety.
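
A simplified illustration of the adaptive line-selection logic follows: for each pixel, the filter combs the middle line with whichever neighboring line it matches more closely, so two lines of different color are never combed together at a transition. The similarity measure and the handling of the band-limited composite signal are assumptions made for brevity, not details of the copending application.

```python
import numpy as np

def adaptive_comb(top, middle, bottom):
    """Pick which neighbouring line to comb with, per pixel.

    top, middle, bottom : band-limited composite samples of three successive
    lines (the band around the chroma subcarrier). At a vertical colour
    transition, the neighbour that still matches the middle line is chosen.
    """
    diff_top = np.abs(middle - top)
    diff_bottom = np.abs(middle - bottom)
    partner = np.where(diff_top <= diff_bottom, top, bottom)
    # Comb: the chroma subcarrier inverts phase line-to-line (NTSC), so the
    # difference of adjacent lines isolates chroma and the sum isolates luma.
    chroma = (middle - partner) / 2.0
    luma_highband = (middle + partner) / 2.0
    return chroma, luma_highband
```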


In an embodiment, digital video decoder 212 can optionally include an artificial time stamp module that generates pseudo MPEG information from received digital video information, such as DV25 and DV50 information generated by digital video cameras. This DV25 and DV50 information lacks the timestamps used by MPEG signal processors. An artificial time stamp module generates timestamps that are inserted into the DV25 or DV50 information, thereby facilitating decoding by an MPEG signal processor. An exemplary artificial time stamp module, including system time clock 1212 and time control device 1210, is illustrated in FIG. 12, which shows a video system 1200. A data transport device 1208 controls the transmission of data within system 1200. A time control device 1210 can be used to generate and associate time stamp information with the data. A system time clock 1212 can be used in one embodiment in conjunction with the time control device to generate time stamp information. A storage device 1214 is used to store the pseudo MPEG information and other data received by data transport device 1208. A decoder 1216 is used to decode the pseudo MPEG information. A controller 1218 can be used to control one or more components of system 1200. Output device 1244 outputs the decoded pseudo MPEG information based on the artificial time stamp information. An artificial time stamp module is further taught in co-pending application, U.S. patent application Ser. No. 10/640,648, filed Aug. 14, 2003, which is herein incorporated by reference in its entirety.
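
The sketch below illustrates the general idea of artificial timestamping: evenly spaced presentation timestamps, derived from a free-running system time clock, are attached to incoming DV frames so that an MPEG-style decoder can schedule them. The 90 kHz clock, frame rate, and stamp format are illustrative assumptions rather than details of the copending application.

```python
def add_artificial_timestamps(dv_frames, clock_hz=90_000, frame_rate=29.97):
    """Attach artificial presentation timestamps to DV frames so a
    downstream MPEG-style decoder can schedule them.

    dv_frames : iterable of raw DV25/DV50 frame payloads (no timestamps)
    clock_hz  : system time clock rate (90 kHz is the MPEG convention)
    """
    ticks_per_frame = clock_hz / frame_rate
    stamped = []
    for n, frame in enumerate(dv_frames):
        pts = int(round(n * ticks_per_frame))   # evenly spaced artificial PTS
        stamped.append({"pts": pts, "payload": frame})
    return stamped
```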


In an embodiment, digital video decoder 212 can optionally include a teletext decoder system for processing teletext message sequences. The teletext decoder provides a system for efficiently decoding and detecting valid teletext message sequences. An exemplary teletext decoder system 1600 as illustrated in FIG. 16 includes a correlator 1630, a sine wave generator 1640, a first time window generator 1610 and a second time window generator 1620, a phase detector 1650, a framing code search engine 1660, and a match filter 1670. The teletext decoder system determines the phase of the run-in burst of a teletext message sequence, identifies the framing code location, decides whether the teletext sequence is valid and decodes the teletext data for display. The teletext decoder system is further taught in copending applications, U.S. Provisional Patent Application No. 60/467,574, filed May 5, 2003, which is incorporated herein by reference in its entirety, and in U.S. patent application Ser. No. 10/629,781, filed Jul. 30, 2003, which is incorporated herein by reference in its entirety.


Video and graphics processing module 214 includes a variety of functions for processing video and graphics, and for integrating video and graphics. In particular, video and graphics processing module 214 can optionally include MPEG, graphics and video feeders; video scalers; capture blocks; and/or video compositors for combining video and graphics. 3D engine 215 provides low bandwidth three dimensional (3D) graphics rendering for set-top boxes and television displays without sacrificing the performance needed for Internet e-commerce 3D applications and games.


In an embodiment, video and graphics processing module 214 can optionally include a combined color look-up and gamma correction system. Color look-up table and gamma correction routines are used to improve video graphics displayed on a television. The combined color look-up and gamma correction system includes a memory having color look up table (CLUT) parameters and gamma correction parameters stored therein. An exemplary combined color look-up and gamma correction system includes an input matrix configured to receive first and second type pixel data. Memory is coupled to the input matrix and configured to associate one of the first and second type pixel data with the stored CLUT parameters and associate the other of the first and second type pixel data with the stored gamma correction parameters. For example, an exemplary RAM memory system 1300 is illustrated in FIG. 13. System 1300 is configured to perform both CLUT and gamma correction functions, in accordance with the present invention. System 1300 includes an input matrix 1301, a memory section 1302, and an output matrix 1303 configured to receive the input pixel words 1390, 1391, 1393, and 1394. The memory section 1302 includes RAMs 1304, 1305, 1306 and 1308. The RAMs 1305, 1306, and 1308 are respectively connected to input multiplexing devices 1310, 1312 and 1314. When configured for the gamma correction mode, individual color components 1380, 1382 and 1384 are provided as inputs to respective multiplexing devices 1310, 1312, and 1314 and along data lines 1348 as inputs to respective multiplexing devices 1326, 1328, and 1330. Four data paths 1322 are coupled to multiplexing device 1324. An output of the multiplexing device 1324 is provided to a multiplexing device 1340. Multiplexing device 1342 receives one input from the multiplexing device 1340 and other inputs from the multiplexing devices 1326, 1328 and 1330, to provide an output 1344.


Two primary types of graphics image formats exist. In one format, the pixel contains all of the color component information (e.g., YUV422 or ARGB8888). In the other format, the pixel contains an index (e.g., CLUT format). To support both the CLUT format and the gamma correction function, two separate RAMs were conventionally needed. In the combined color look-up and gamma correction system, however, the CLUT function is combined with the gamma correction function by applying a gamma correction equation to the color look-up table.


The combined color look-up and gamma correction system provides one RAM that serves both the CLUT and gamma correction functions. More specifically, one RAM arrangement is integrated such that the same RAM can be used for CLUT plus gamma correction (for the CLUT format) or for gamma correction alone (for other formats). This RAM arrangement saves space on the TVOC 200 integrated circuit by providing a more efficient way to implement the CLUT and gamma correction functions. Additionally, the combined color look-up and gamma correction system provides a more efficient software setup that does not require special arrangements or address swapping. The combined color look-up and gamma correction system is further taught in co-pending application, U.S. patent application Ser. No. 10/640,666, filed Aug. 14, 2003, which is incorporated herein by reference in its entirety.
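
The following sketch illustrates the sharing concept: applying the gamma correction equation to the color look-up table yields a single table that serves CLUT-format pixels in one access, while direct-color pixels index the same gamma curve per component. The gamma value, table sizes, and palette are illustrative assumptions.

```python
import numpy as np

def build_gamma_lut(gamma=2.2, size=256):
    """Per-component gamma correction table (8-bit in, 8-bit out)."""
    x = np.arange(size) / (size - 1)
    return np.round(255 * np.power(x, 1.0 / gamma)).astype(np.uint8)

def build_combined_table(palette_rgb, gamma=2.2):
    """Apply the gamma correction equation to the colour look-up table so a
    single memory serves CLUT pixels (CLUT + gamma in one access)."""
    g = build_gamma_lut(gamma)
    return g[np.asarray(palette_rgb, dtype=np.uint8)]   # gamma-corrected palette

gamma_lut = build_gamma_lut()
palette = [[255, 0, 0], [0, 128, 0], [16, 16, 16]]
combined = build_combined_table(palette)

clut_pixel = 1                 # index-format pixel: one lookup does both steps
print(combined[clut_pixel])
rgb_pixel = np.array([200, 100, 50], dtype=np.uint8)
print(gamma_lut[rgb_pixel])    # direct-colour pixel: gamma correction only
```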


2D engine 216 provides two-dimensional graphics processing. Video encoder 218 encodes processed video signals for output to a display. Video encoder 218 can optionally support both standard and high definition video signals. Video encoder 218 supports a variety of analog video standards (e.g., NTSC, PAL, SECAM, 480i, 480p, 720p, and 1080i), as well as digital video standards (e.g., ITU-R-656 and digital video interface (DVI) encoding). Additionally, video encoder 218 can receive video blanking interval (VBI) signals. Video encoder 218 combines these VBI signals with the appropriate lines of video to support features such as closed captioning and teletext.


In an embodiment, video encoder 218 can optionally include a timing generator that generates time-dependent control signals for video signals. Such control signals support outputting video signals in different formats and implementing a variety of value-added features including, for example, copy protection features such as those contained within the MACROVISION copy protection process standards. MACROVISION is a trademark of Macrovision Corporation. An exemplary timing generator 1500 as illustrated in FIG. 15 includes a set of microsequencers 1505A-1505G, a programmable combinational logic (PCL) module 1510, shared memory, an arbiter 1530 for sharing of the memory by the microsequencers 1505, stacks 1540 containing registers for microsequencer 1505 control, and a control interface. The timing generator can provide control signals for video signals, implement the MACROVISION copy protection process, and provide other value-added features. Flags generated by the microsequencers are processed to generate one or more control signals used to support the outputting of video signals, including those requiring MACROVISION copy protection. The generation of control signals is software controlled, thereby allowing the system to be reconfigured in real time and permitting modifications to reflect changes in the MACROVISION copy protection process and other value-added features for video signals. The timing generator provides an efficient way to flexibly produce time-dependent control signals for video signals. The timing generator is further taught in co-pending application, U.S. patent application Ser. No. 10/640,627, filed Aug. 14, 2003, which is hereby incorporated by reference in its entirety.


In an embodiment, video encoder 218 can optionally include an MPAA (Motion Picture Association of America) HDTV (High Definition Television) copy protection filter system. The MPAA has released a set of standards to provide for copy protection of HDTV signals. In effect, these standards degrade or reduce the resolution of HDTV signals. The MPAA HDTV copy protection filter system provides an efficient way to reduce the resolution of a digital video signal to comply with the MPAA copy protection standards by integrating copy protection filters with elements of video encoder 218. An exemplary MPAA HDTV filter system is illustrated in FIG. 17. FIG. 17 shows a block diagram of an exemplary system 1700 included in, for example, a set-top box or a TV-on-a-chip system. System 1700 includes a compositor 1702 and a video encoder ("VEC") 1704. Compositor 1702 accepts an input video signal 1706 and an input graphics signal 1708. Compositor 1702 outputs composite video signal 1710. VEC 1704 includes a sync timing circuit 1712, a bandpass filter 1714, and a digital-to-analog converter ("DAC") 1716, along a first data path 1718. In one embodiment, VEC 1704 also includes a digital visual interface ("DVI") transmitter 1720 on a second data path 1722. A copy-protection filter 1750 may be inserted into one or more of the video signal data paths 1706, 1710, 1718, and 1722 (shown in data path 1710 only). In an embodiment, the copy-protection filter is a low-pass horizontal filter. In another embodiment, the copy-protection filter is a low-pass vertical filter. In yet another embodiment, a combined horizontal and vertical filter is used. For each filter, any number of taps may be used.


The copy-protection filter may be placed in an analog data path in the video encoder before the signal is converted to an analog signal. In another embodiment, a digital data path is coupled to the analog data path. The copy-protection filter can be placed on the analog data path after the connection between the analog and digital data paths. In yet another embodiment, a second copy-protection filter can be placed in the digital data path. The second copy-protection filter may be operated independently of the first copy-protection filter. In a further embodiment, the copy-protection filter may be combined with other filters in the system. In still another embodiment, the copy-protection filter may be combined with a video scaler before the initial video data stream enters the compositor. The MPAA HDTV copy protection filter system is further taught in co-pending application, U.S. patent application Ser. No. 10/641,031, filed Aug. 15, 2003, which is incorporated herein by reference in its entirety.
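
As a simple illustration of a horizontal low-pass copy-protection filter, the sketch below convolves each scan line with a symmetric FIR kernel to reduce horizontal resolution. The tap values are arbitrary illustrations and are not taken from the MPAA standards or the copending application.

```python
import numpy as np

def horizontal_lowpass(frame, taps=(1, 2, 3, 4, 3, 2, 1)):
    """Low-pass each video line horizontally to reduce effective resolution.

    frame : 2-D array of luma samples (rows are scan lines)
    taps  : symmetric FIR kernel; values are illustrative only
    """
    kernel = np.asarray(taps, dtype=float)
    kernel /= kernel.sum()
    return np.apply_along_axis(
        lambda line: np.convolve(line, kernel, mode="same"), axis=1, arr=frame)

hd_frame = np.random.randint(0, 256, size=(1080, 1920)).astype(float)
softened = horizontal_lowpass(hd_frame)   # same size, reduced horizontal detail
```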


The audio processing elements of TVOC 200 include audio decoder 220 and audio processor 222. Audio decoder 220 can receive a baseband analog BTSC composite signal that was extracted by IF demodulator 202 and process the received signal to recover the main, stereo and SAP channels contained within the original signal. Audio decoder 220 can support BTSC and CPZ503 (used in Japan) standards for composite signals.


In an embodiment, an optional digital interface is coupled between IF demodulator 202 and a digital signal processor (DSP) within audio decoder 220. A potentiometer or an implementation of an automatic gain control (AGC) algorithm in a digital signal processor within audio decoder 220 can be used to adjust the signal levels of the right and left stereo signals. When a digital interface is used, IF demodulator 202 has an output that is scalable to ensure compatibility with different downstream DSP devices. Additionally, the output of IF demodulator 202 is a digital signal, including composite BTSC samples that are compatible with multi-channel television sound (MTS) standards. Because the output includes digital composite samples, the digital interface can provide signals to the DSP in the digital domain. By using a digital interface, conversion mismatches associated with an analog interface are substantially reduced. The audio decoder digital interface is further taught in co-pending application, U.S. patent application Ser. No. 10/646,721, filed Aug. 25, 2003, which is incorporated herein by reference in its entirety.


In an embodiment, audio decoder 220 can optionally include a signal mode detection system. The signal mode detection system can be used, for example, to distinguish mono, dual mono and stereo modes in an audio transmission. An exemplary signal mode detection system 1800 as illustrated in FIG. 18 includes a first bandpass filter 1802, a first envelope tracker 1804 coupled to the first bandpass filter 1802, a second bandpass filter 1806, a second envelope tracker 1808 coupled to the second bandpass filter 1806, and a decision circuit 1810 coupled to both the first and second envelope trackers 1804 and 1808. The decision circuit is configured to process decision logic to determine whether an audio transmission is in mono, dual mono, or stereo mode. The signal mode detection system is particularly useful for processing Japanese Broadcast Television Systems Committee (JBTSC) standard audio broadcast signals, which have three modes of transmission. The signal mode detection system is further taught in co-pending application, U.S. patent application Ser. No. 10/641,004, filed Aug. 15, 2003, which is incorporated herein by reference in its entirety.
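
The sketch below illustrates the decision structure only: two bandpass-filtered identification signals are envelope-tracked and a decision rule classifies the transmission as mono, stereo, or dual mono. The inputs, threshold, and mapping from envelope levels to modes are assumptions; the actual identification scheme is defined by the broadcast standard being decoded.

```python
import numpy as np

def envelope(x, alpha=0.01):
    """Simple one-pole envelope tracker over the rectified signal."""
    env = np.zeros(len(x))
    for n in range(1, len(x)):
        env[n] = max(abs(x[n]), (1 - alpha) * env[n - 1])
    return env

def detect_mode(env_pilot_a, env_pilot_b, threshold=0.1):
    """Decision circuit: classify the transmission from two envelope levels.

    env_pilot_a / env_pilot_b : envelope-tracker outputs for two
    bandpass-filtered identification signals (frequencies are standard-specific).
    """
    a = env_pilot_a[-1] > threshold
    b = env_pilot_b[-1] > threshold
    if a and not b:
        return "stereo"
    if b and not a:
        return "dual mono"
    return "mono"
```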


In an embodiment, audio decoder 220 can optionally include a JBTSC signal separation processor. Three channels exist within a JBTSC signal: a main channel, a sub channel and a control signal. The main channel includes the audio signal. The sub channel includes the difference between the right and left channel stereo signals contained in the main channel. The control signal includes information indicating the mode of transmission. The JBTSC signal separation processor supports the processing of all three channels. The JBTSC signal separation processor includes a sub path, a main path and a separator. The sub path includes a bandpass filter, a first filter path, a second filter path, an FM demodulator, a lowpass filter and a de-emphasis circuit. The JBTSC signal separation processor provides optimal separation of the left and right stereo signals within a JBTSC signal. The JBTSC signal separation processor is further taught in co-pending application, U.S. patent application Ser. No. 10/641,161, filed Aug. 15, 2003, which is incorporated herein by reference in its entirety.


In an embodiment, audio decoder 220 can optionally include a digital signal processor (DSP) decoder 300, such as the one illustrated in FIG. 3, having an execution unit 302, a memory 306, an address generator 308 and an instruction set that can be used to decode input signals. The programmable address generator enables the execution unit to perform register addressing, indirect addressing and immediate addressing. An advantage of a DSP is that it is reprogrammable and updateable. In other embodiments with different program instructions, a DSP decoder can also be used within digital video decoder 212 and analog video decoder 210. The DSP decoder is further taught in co-pending application, U.S. patent application Ser. No. 10/641,295, filed Aug. 15, 2003, which is incorporated herein by reference in its entirety.


In one application of the DSP decoder, a sampled signal can be converted to a higher data rate signal. In signal processing, sampled signals often need to be converted to higher or lower sample rates. Conventional sample rate converters (SRCs) suffer from a variety of drawbacks. For example, conventional SRCs typically interpolate between existing sample points to obtain new sample points. The interpolation is typically performed using known, pre-determined, fixed ratios between the lower data rate and the higher data rate. Results are then clocked out at the higher data rate. Such an interpolation is not performed in real time. Thus, when the higher data rate clock is affected by jitter or small/large frequency variations, for example, the higher data rate output does not necessarily track the lower data rate data.


Using the DSP decoder, a method can be implemented that addresses these drawbacks. The method for converting a sampled signal to a higher data rate involves receiving conversion pulses having a conversion rate that is higher than the sample rate of the sampled signal. Next, two sample points on either side of a conversion pulse are reconstructed in real time from the sampled signal. An amplitude between the two reconstructed sample points is then interpolated. The interpolated amplitude is output in real time. This process is repeated on subsequent conversion pulses, such that the outputted interpolated amplitudes form the higher data rate signal having a data rate equal to the conversion rate. This sample rate conversion method is further taught in co-pending application, U.S. patent application Ser. No. 10/641,034, filed Aug. 15, 2003, which is incorporated herein by reference in its entirety.
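
The sketch below illustrates the per-pulse operation, using linear interpolation as a stand-in for whatever reconstruction the DSP actually performs: for each conversion pulse, the two input samples bracketing the output instant are located and an amplitude between them is interpolated and output.

```python
def upsample_on_pulses(samples, in_rate, out_rate):
    """Convert a sampled signal to a higher data rate one conversion pulse
    at a time: find the two input samples bracketing each output instant
    and interpolate between them (linear interpolation used here)."""
    out = []
    n_out = int(len(samples) * out_rate / in_rate)
    for k in range(n_out):                    # one iteration per conversion pulse
        t = k * in_rate / out_rate            # output instant in input-sample units
        i = min(int(t), len(samples) - 2)     # left neighbour
        frac = min(t - i, 1.0)
        out.append((1 - frac) * samples[i] + frac * samples[i + 1])
    return out

print(upsample_on_pulses([0.0, 1.0, 0.0, -1.0], in_rate=8000, out_rate=12000))
```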


Audio processor 222 analyzes and processes audio signals in a variety of formats including Dolby digital and MPEG audio. Audio processor 222 delivers processed audio signals to audio outputs, including digital audio port 258 and analog audio port 259.


The control interfaces can include DDR-DRAM controller 240, EJTAG module 242, peripheral interfaces 244, and EBI interface 246. DDR-DRAM controller 240 supports the use of external DDR-DRAM. EJTAG module 242 supports advanced debugging features, including software debug of user and kernel code. Peripheral interfaces 244 provide a range of interfaces to support smart card usage, and the like. EBI (external bus interface) interface 246 supports the connection of external SRAMs, flash memories, EPROMs, and interfaces with additional external peripherals.


The output interfaces can include S-Video composite interface 250, DVI interface 252, 656 Interface 254, RF Mod interface 256, digital audio port 258 and analog audio port 259. The S-Video composite interface 250 can output video signals in a variety of formats including SCART1, S-Video, and composite video (CVBS) outputs. DVI interface 252 provides an output interface for DVI signals. 656 Interface 254 provides an output interface for digital video signals that comply with the ITU-R-656 standard.


RF Mod interface 256 provides an output interface for an analog composite television signal which is suitable for demodulation by a television demodulator. Digital audio port 258 and analog audio port 259 provide interfaces for digital and analog audio signals, respectively.


The ancillary interfaces can include V.90 interface 260, USB 2.0 interface 262, Ethernet interface 264, and SATA interface 266. V.90 interface 260 provides an interface to modems using the V.90 standard. Similarly, USB 2.0 interface 262 provides an interface to devices using a USB 2.0 standard. Ethernet interface 264 provides an interface to connect TVOC 200 to an Ethernet or provide Internet connectivity. Finally, SATA interface 266 provides an interface for use with external hard drives.


System controller 270 provides overall control for TVOC 200.


Additional optional elements within TVOC 200 that further facilitate the integration of television functionality on a single integrated circuit can include a reset synchronization system. An exemplary reset synchronization system 400, as illustrated in FIG. 4, includes a reset generator 404 and a circuit 418. A local clock A 412 drives circuit 418, and a clock B 406 drives reset generator 404. A reset synchronizer including synchronizing circuitry 402 receives a reset signal 408 and local clock A 412. The synchronizing circuitry 402 outputs a synchronized reset signal 422 that is used to reset circuit 418 or portions thereof. Thus, a reset synchronization system includes a local clock terminal 420, a reset terminal 424, a synchronized reset output terminal 422, and synchronizing circuitry 402 coupled between the clock terminal 420, the reset terminal 424, and the synchronized reset output terminal 422. The synchronizing circuitry synchronizes a received reset signal to the local clock signal and outputs a synchronized reset signal on the synchronized reset output terminal. A reset synchronization system is further taught in co-pending application, U.S. patent application Ser. No. 10/640,632, filed Aug. 14, 2003, which is incorporated herein by reference in its entirety.


Optional elements within TVOC 200 that further facilitate the integration of television functionality on a single integrated circuit further include an I/O (input/output) multiplexing system. Integrated circuits, such as TVOC 200, communicate with other chips through I/O devices (e.g., pins or pads). Because function blocks are becoming smaller, and the chips denser, there is limited area on the chip for the I/O devices. To compensate for the limited area, groups or sets of the function blocks share the I/O devices. In order to share the I/O devices, a controlling system (e.g., a multiplexer) is used to direct which of the function blocks uses the I/O device at a particular time (e.g., during a particular mode of the chip). A wide variety of factors, such as temperature and distance, can affect timing control of I/O device signals. The I/O multiplexing system addresses the need to conserve space on TVOC 200 while also ensuring proper timing.


An exemplary I/O multiplexing system 500 as illustrated in FIG. 5 includes an input/output device including a bi-directional pad, function blocks 502A and 502B coupled to the I/O device and an I/O multiplexing module 506 that controls transmission of the signals between the function blocks 502A and 502B and the I/O device using a controller. The I/O multiplexing module 506 can include a data I/O multiplexer and an enable I/O multiplexer coupled to the controller.


Furthermore, the I/O multiplexing module can include a data re-clocking device coupled to the I/O device, the function blocks, and the data I/O multiplexer. The re-clocking device synchronizes one or more of the signals with a system clock before the one or more signals enter the data I/O multiplexer. In this case, the I/O multiplexing module also includes an enable re-clocking device coupled to the I/O device, the function blocks, and the enable I/O multiplexer. The re-clocking device synchronizes one or more of the signals with a system clock before the one or more signals enter the enable I/O multiplexer. The I/O multiplexing system is further taught in co-pending application, U.S. patent application Ser. No. 10/640,649, filed Aug. 14, 2003, which is incorporated herein by reference in its entirety.


Optional elements within TVOC 200 that further facilitate the integration of television functionality on a single integrated circuit can further include an adaptable strapping system. TVOC 200 can have numerous selectable functions, for example, selectable video encoding standards. In order to properly set the function of TVOC 200, a mode or state can be selected for TVOC 200 based on the product it is incorporated within. The mode or state can be selected using a strapping system (e.g., mode or state setting system) in a variety of ways, including using a dedicated pin, a register control block controlled by software running in the product, or the like. Thus, because a designer knows what product is being made, when the chip is inserted into the product, a function of the device can be set using a signal that designates a mode or state.


Using a strapping system, such as the one described herein, is useful because it does not require using a dedicated pin. Pins are in limited supply on a chip, and avoiding the use of one to set a mode of the device is desirable. Furthermore, the adaptable strapping system of the present invention allows TVOC 200 to use alternate functions after packaging, in real time. An exemplary adaptable strapping system 600 as illustrated in FIG. 6 includes a control system 608 that controls a mode of TVOC 200, an initialization system 604 that transmits a signal to the control system 608 to initialize TVOC 200, an internal strapping system 606 that transmits a signal to the control system 608 that can be used to place TVOC 200 in a first state, and an override system 610 that transmits a second signal to the control system to place TVOC 200 in a second state. The adaptable strapping system is further taught in co-pending application, U.S. patent application Ser. No. 10/641,103, filed Aug. 15, 2003, which is incorporated by reference herein in its entirety.


Optional elements within TVOC 200 that further facilitate the integration of television functionality on a single integrated circuit can further include a system for multi-channel audio interconnects. Traditional processing of audio signals transferred between audio modules, such as analog-to-digital converters, error correction devices and digital filters, within a printed circuit board or an IC typically requires conversion of the related audio signals to an inter-IC sound (I2S) standard or a similar format. I2S is one technique intended to standardize the format of audio data transferred between these internal components. Components implementing the widely accepted I2S standard convert incoming audio data signals into the I2S format.


As the speed at which micro-electronic large scale integrated (LSI) devices operate increases, the time required to perform the I2S conversion becomes a critical system limitation. The chip space required to accommodate the three I2S pins has become equally burdensome. Although other serial data bus designs are available, even some with 1-line data links, most are too complex or lack sufficient flexibility for extensive use. The system for multi-channel audio interconnects provides a data bus that can be used to transfer audio data that minimizes the complexities and amount of hardware required for transferring audio data between modules. A suite of protocols support this bus to more efficiently transfer data between the modules, ultimately reducing the number of chip pins.


When this embodiment is used, a multi-channel audio interconnect system includes data paths that are configured for transferring audio data between a transmitting module and one or more receiving modules. An exemplary multi-channel audio interconnect system 700 is illustrated in FIG. 7. System 700 includes an encoder positioned within the transmitting module and configured to convert audio data requiring transmission into two-line audio information segments. As shown in FIG. 7, the system 700 includes a 3-line data bus 706 for transferring input audio information 715 between the transmitter 702 and the receiver 704. In order to accommodate this information transfer, the I2S protocol suite may be used for formatting the clock bit data transferred along the clock line 708, the word select data 710, and the serial data stream 712.


Transmitter 702 of FIG. 7 includes a standard audio encoder 713 configured for converting received data 715 into a format for transfer across the data bus 706. The receiver 704 includes a conventional audio decoder 714 configured to decode the encoded audio data received via the data path 706. The encoder 713 can be used, for example, to convert received audio PCM data into an I2S format or the popular Sony/Philips digital interface (SPDIF) format. The audio information segments include a format portion identifying the audio format and a data portion for carrying the audio data. A data line and a synchronization line exist between audio modules. The data line carries the audio information segments, while the synchronization line transmits a number of sync pulses that indicate the start of one of the audio information segments. A decoder included in a receiving module interprets the audio information segments and sync pulses. The multi-channel audio interconnect system is further taught in co-pending application, U.S. patent application Ser. No. 10/646,833, filed Aug. 25, 2003, which is incorporated herein by reference in its entirety.
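
As an illustration of the segment format described above, the sketch below packs audio words into information segments with a format portion and a data portion, and generates one sync pulse per segment start for the synchronization line. The format identifiers and serialization are hypothetical and do not reproduce the protocol suite of the copending application.

```python
def encode_segment(audio_words, format_id):
    """Pack audio data into an information segment: a format portion
    identifying the audio format followed by the data portion."""
    return {"format": format_id, "data": list(audio_words)}

def serialize(segments):
    """Produce the data-line content and the matching sync pulses.
    One sync pulse marks the start of each segment."""
    data_line, sync_line = [], []
    for seg in segments:
        payload = [seg["format"]] + seg["data"]
        data_line.extend(payload)
        sync_line.extend([1] + [0] * (len(payload) - 1))  # pulse at segment start
    return data_line, sync_line

segments = [encode_segment([0x12, 0x34], format_id=0x01),   # hypothetical PCM format id
            encode_segment([0x56], format_id=0x02)]          # hypothetical SPDIF-style id
data_line, sync_line = serialize(segments)
```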


Optional elements within TVOC 200 that further facilitate the integration of television functionality on a single integrated circuit can include circuits and methods to enhance scan testing. Scan testing provides an effective means to test the structural integrity of devices, such as flip-flops, within a complex integrated circuit, such as TVOC 200. Within TVOC 200, features may be enabled or disabled depending upon TV manufacturer preferences. This flexibility can make scan testing more complex. Circuits to support scan testing and reduce this complexity can be provided. These circuits are taught in the following co-pending applications: U.S. patent application Ser. No. 10/640,687, filed Aug. 14, 2003; U.S. patent application Ser. No. 10/640,659, filed Aug. 14, 2003; and U.S. patent application Ser. No. 10/640,686, filed Aug. 14, 2003, all of which are incorporated herein by reference in their entireties.


CONCLUSION

While various embodiments of the present invention have been described above, it should be understood that they have been presented by way of example, and not limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail can be made therein without departing from the spirit and scope of the invention.


The present invention has been described above with the aid of functional building blocks and method steps illustrating the performance of specified functions and relationships thereof. The boundaries of these functional building blocks and method steps have been arbitrarily defined herein for the convenience of the description. Alternate boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Any such alternate boundaries are thus within the scope and spirit of the claimed invention. One skilled in the art will recognize that these functional building blocks can be implemented by discrete components, application specific integrated circuits, processors executing appropriate software and the like or any combination thereof. Thus, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.

Claims
  • 1. An integrated circuit for providing television functionality, comprising: a plurality of input elements; video and graphics processing elements; audio processing elements, comprising: an audio decoder including a signal mode detection system, wherein said signal mode detection system determines a transmission mode of a broadcast signal; and an audio processor; output interfaces; and a control element, wherein said plurality of input elements, said video and graphics processing elements, said audio processing elements, and said output interfaces are coupled through a system controller.
  • 2. The integrated circuit of claim 1, wherein said plurality of input elements comprise: an IF demodulator; and a data transport module.
  • 3. The integrated circuit of claim 2, wherein said IF demodulator comprises a digital IF demodulator.
  • 4. The integrated circuit of claim 2, wherein said IF demodulator includes an FM demodulation system for approximating y(n)=1/x(n) in FM demodulation, where x(n)=I²(n)+Q²(n).
  • 5. The integrated circuit of claim 2, wherein said data transport module includes a video data stream front end processor.
  • 6. The integrated circuit of claim 2, wherein said data transport module includes a packet substitution module.
  • 7. The integrated circuit of claim 2, wherein said data transport module includes a media processing system.
  • 8. The integrated circuit of claim 1, wherein said video and graphics processing elements comprise: an analog video decoder; a digital video decoder; a video and graphics processing module coupled to said analog video decoder and said digital video decoder; and a video encoder coupled to said video and graphics processing module.
  • 9. The integrated circuit of claim 8, wherein said analog video decoder includes a 2D adaptive comb filter.
  • 10. The integrated circuit of claim 8, wherein said digital video decoder includes an artificial time stamp module that generates pseudo MPEG information from received DV25 or DV50 digital video information.
  • 11. The integrated circuit of claim 8, wherein said digital video decoder includes a 2D adaptive comb filter.
  • 12. The integrated circuit of claim 8, wherein said video and graphics processing elements further comprise a 2D engine.
  • 13. The integrated circuit of claim 8, wherein said video and graphics processing elements further comprise a 3D engine.
  • 14. The integrated circuit of claim 8, wherein said video and graphics processing elements further comprise a combined color look-up and gamma correction system.
  • 15. The integrated circuit of claim 8, wherein said video encoder includes a timing generator.
  • 16. The integrated circuit of claim 8, wherein said digital video decoder includes a teletext decoder system.
  • 17. The integrated circuit of claim 8, wherein said video encoder includes a MPAA HDTV copy protection filter system.
  • 18. The integrated circuit of claim 1, further comprising an audio decoder digital interface coupled between an IF demodulator and said audio decoder, wherein said audio decoder digital interface provides an all digital interface and reduces signal mismatch.
  • 19. The integrated circuit of claim 1, wherein the broadcast signal is a Japanese Broadcast Television System Committee broadcast signal.
  • 20. The integrated circuit of claim 1, wherein said audio decoder includes a JBTSC signal separation processor configured to receive JBTSC signals and configured to process a main channel, a sub channel and a control channel of the JBTSC signals.
  • 21. The integrated circuit of claim 1, wherein said audio decoder includes a digital signal processor (DSP) decoder having an execution unit, an address generator and an instruction set to decode input signals, wherein said digital signal processor decoder is reprogrammable and updateable.
  • 22. The integrated circuit of claim 1, wherein said output interfaces include one or more of: an S-Video composite interface; a 656 interface; an RF modulation interface; a digital audio port; and an analog audio port.
  • 23. The integrated circuit of claim 1, further comprising control interfaces.
  • 24. The integrated circuit of claim 23, wherein said control interfaces include one or more of: a DDR-DRAM controller; an EJTAG module; peripheral interfaces; and an EBI interface.
  • 25. The integrated circuit of claim 1, further comprising ancillary interfaces.
  • 26. The integrated circuit of claim 25, wherein said ancillary interfaces include one or more of: a USB 2.0 interface; an Ethernet interface; a V0.90 interface; and a SATA interface.
  • 27. The integrated circuit of claim 1, further comprising a reset synchronization system.
  • 28. The integrated circuit of claim 1, further comprising an I/O multiplexing system.
  • 29. The integrated circuit of claim 1, further comprising an adaptable strapping system.
  • 30. The integrated circuit of claim 1, further comprising a multi-channel audio interconnect system.
  • 31. An integrated circuit, comprising: a digital IF demodulator for receiving and demodulating analog television signals; an FM demodulation system for approximating y(n)=1/x(n) in FM demodulation, where x(n)=I²(n)+Q²(n); a video data stream front end processor for demultiplexing video signals that use both a program identifier and are multiplexed using a time division multiplexing approach; a packet substitution module for substituting packets into a video data stream; a media processing system for processing multiple program channels containing one or more data packets; a 2D adaptive comb filter for separating luma and chroma signals within a composite video signal; an artificial time stamp module that generates pseudo MPEG information from received DV25 or DV50 digital video information; a 2D engine that provides two dimensional graphics processing; a 3D engine that provides three dimensional graphics processing; a combined color look-up and gamma correction system that improves video graphics; a timing generator for generating time-dependent control signals for video signals; a teletext decoder system for processing teletext sequences; a MPAA HDTV copy protection filter system for providing MPAA HDTV copy protection; an audio decoder digital interface, coupled between an IF demodulator and an audio decoder, that provides an all digital interface and reduces signal mismatch; a signal mode detection system that determines a transmission mode of a broadcast signal; and a digital signal processor within an audio decoder, wherein said digital signal processor is reprogrammable and updateable.
  • 32. An integrated circuit for providing television functionality, comprising: a plurality of input elements; video and graphics processing elements, comprising an analog video decoder; a digital video decoder comprising an artificial time stamp module that generates pseudo MPEG information from received DV25 or DV50 digital video information; a video and graphics processing module coupled to said analog video decoder and said digital video decoder; and a video encoder coupled to said video and graphics processing module; audio processing elements; output interfaces; and a control element, wherein said plurality of input elements, said video and graphics processing elements, said audio processing elements, and said output interfaces are coupled through a system controller.
CROSS REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of priority to the following U.S. Provisional Patent Applications: Application No. 60/451,265, filed Mar. 4, 2003; Application No. 60/467,574, filed May 5, 2003; Application No. 60/495,129, filed Aug. 15, 2003; Application No. 60/495,127, filed Aug. 15, 2003; and Application No. 60/495,121, filed Aug. 15, 2003, all of which are incorporated herein by reference in their entireties. This application is a continuation-in-part of the following U.S. patent applications: application Ser. No. 10/448,062, filed May 30, 2003; application Ser. No. 10/629,781, filed Jul. 30, 2003; application Ser. No. 10/640,687, filed Aug. 14, 2003; application Ser. No. 10/640,659, filed Aug. 14, 2003; application Ser. No. 10/640,686, filed Aug. 14, 2003; application Ser. No. 10/640,666, filed Aug. 14, 2003; application Ser. No. 10/641,031, filed Aug. 15, 2003; application Ser. No. 10/640,632, filed Aug. 14, 2003; application Ser. No. 10/640,649, filed Aug. 14, 2003; application Ser. No. 10/641,103, filed Aug. 15, 2003; application Ser. No. 10/640,648, filed Aug. 14, 2003; application Ser. No. 10/640,627, filed Aug. 14, 2003; application Ser. No. 10/641,160, filed Aug. 15, 2003; application Ser. No. 10/629,797, filed Jul. 30, 2003; application Ser. No. 10/641,295, filed Aug. 15, 2003; application Ser. No. 10/640,682, filed Aug. 14, 2003; application Ser. No. 10/640,684, filed Aug. 14, 2003; application Ser. No. 10/641,004, filed Aug. 15, 2003; application Ser. No. 10/641,161, filed Aug. 15, 2003; application Ser. No. 10/646,833, filed Aug. 25, 2003; application Ser. No. 10/646,721, filed Aug. 25, 2003; and application Ser. No. 10/641,034, filed Aug. 15, 2003, all of which are incorporated herein by reference in their entireties.

Related Publications (1)
Number Date Country
20050028220 A1 Feb 2005 US
Provisional Applications (5)
Number Date Country
60495129 Aug 2003 US
60495127 Aug 2003 US
60495121 Aug 2003 US
60467574 May 2003 US
60451265 Mar 2003 US
Continuation in Parts (22)
Number Date Country
Parent 10646833 Aug 2003 US
Child 10791686 US
Parent 10646721 Aug 2003 US
Child 10646833 US
Parent 10641031 Aug 2003 US
Child 10646721 US
Parent 10641103 Aug 2003 US
Child 10641031 US
Parent 10641160 Aug 2003 US
Child 10641103 US
Parent 10641295 Aug 2003 US
Child 10641160 US
Parent 10641004 Aug 2003 US
Child 10641295 US
Parent 10641161 Aug 2003 US
Child 10641004 US
Parent 10641034 Aug 2003 US
Child 10641161 US
Parent 10640686 Aug 2003 US
Child 10641034 US
Parent 10640666 Aug 2003 US
Child 10640686 US
Parent 10640632 Aug 2003 US
Child 10640666 US
Parent 10640649 Aug 2003 US
Child 10640632 US
Parent 10640648 Aug 2003 US
Child 10640649 US
Parent 10640627 Aug 2003 US
Child 10640648 US
Parent 10640862 Aug 2003 US
Child 10640627 US
Parent 10640684 Aug 2003 US
Child 10640862 US
Parent 10640687 Aug 2003 US
Child 10640684 US
Parent 10640659 Aug 2003 US
Child 10640687 US
Parent 10629797 Jul 2003 US
Child 10640659 US
Parent 10629781 Jul 2003 US
Child 10629797 US
Parent 10448062 May 2003 US
Child 10629781 US