1. Field of the Invention
The present invention relates to a noise cancellation device for an image signal processing system, and more particularly, to a noise cancellation device capable of performing filtering operations on image signals according to a motion degree of the images, so as to reduce noise, improve image quality, and maintain the definition of the images.
2. Description of the Prior Art
With rapid developments in communication and computer technology, image applications have become increasingly diverse. Briefly, each image application can be regarded as a combination of an image data source and a player. The image data source can be any device capable of outputting image signals, such as a computer, a DVD player, a cable or wireless television signal LS (Launch-Station), a video game player, etc., and outputs image signals to the player through wired or wireless channels, so as to display images. During transmission, signals are inevitably subject to interference from noise, terrain, and surface features. Even within the image data source or the player itself, the processed signals may be contaminated with noise due to circuit defects or environmental conditions (e.g. temperature or humidity), which reduces image quality.
It is therefore a primary objective of the claimed invention to provide a noise cancellation device for an image signal processing system.
The present invention discloses a noise cancellation device for an image signal processing system, which comprises: a receiving end for receiving an image signal; a 3D (three-dimensional) filtering unit for adjusting a filtering parameter according to a motion estimation value, and filtering the image signal and a former filtering result to generate a current filtering result; a motion detection unit for comparing the former filtering result with the image signal received by the receiving end, so as to generate a current motion factor and the motion estimation value according to a former motion factor; a memory unit for receiving and storing the current filtering result outputted from the 3D filtering unit and the current motion factor outputted from the motion detection unit as the former filtering result and the former motion factor; and an output end for outputting the current filtering result provided by the 3D filtering unit.
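The dataflow among the claimed units can be sketched as a per-frame processing loop. The following Python model is illustrative only: the unit interfaces, the motion-factor recursion, and the clamping of the motion estimation value k are assumptions for demonstration, not taken from the disclosure.

```python
class NoiseCancellationDevice:
    """Illustrative model of the claimed units: a motion detection unit,
    a 3D filtering unit, and a memory unit holding the former frame's
    filtering result y(n-1) and motion factor mf(n-1)."""

    def __init__(self):
        self.y_prev = None   # former filtering result y(n-1)
        self.mf_prev = 0.0   # former motion factor mf(n-1)

    def detect_motion(self, x_n):
        # Compare the received signal with the former filtering result
        # to derive the current motion factor mf(n) and the motion
        # estimation value k (this recursion is a hypothetical example).
        diff = abs(x_n - self.y_prev)
        mf_n = 0.5 * diff + 0.5 * self.mf_prev
        k = min(1.0, mf_n)   # clamp to [0, 1]: 1 = dynamic, 0 = static
        return mf_n, k

    def filter(self, x_n):
        # Receiving end: the very first frame passes through unfiltered.
        if self.y_prev is None:
            self.y_prev = x_n
            return x_n
        mf_n, k = self.detect_motion(x_n)
        # 3D filtering unit: blend the current sample with the stored result.
        y_n = k * x_n + (1.0 - k) * self.y_prev
        # Memory unit: keep current outputs as next frame's "former" values.
        self.y_prev, self.mf_prev = y_n, mf_n
        return y_n
```

The example operates on a single scalar sample per frame; a real device would apply the same recursion per pixel.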
These and other objectives of the present invention will no doubt become obvious to those of ordinary skill in the art after reading the following detailed description of the preferred embodiment that is illustrated in the various figures and drawings.
Please refer to
The memory unit 106 stores the filtering result y(n) and the corresponding motion factor mf(n) for a specified duration, and outputs them to the 3D filtering unit 102 and the motion detection unit 104 when the next image arrives. To determine the motion degree of the current image, the motion detection unit 104 compares the filtering result y(n−1) with the image signal x(n), and generates the motion factor mf(n) and the motion estimation value k according to the comparison result and the motion factor mf(n−1) of the former image. According to the motion estimation value k and the filtering result y(n−1), the 3D filtering unit 102 can perform appropriate filtering on the image signal x(n), so as to generate the filtering result y(n). Preferably, the 3D filtering unit 102 performs 2D low-pass filtering operations on the image signal x(n) when the images are dynamic, and performs infinite impulse response (IIR) operations on the image signal x(n) when the images are static. In other words, the motion detection unit 104 determines the motion degree of the images, while the 3D filtering unit 102 performs appropriate filtering on the image signal x(n) according to that motion degree. Therefore, the noise cancellation device 10 can effectively reduce noise, improve image quality, and maintain the definition of static images.
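The per-pixel behaviour described above can be illustrated with a first-order recursion. The blending formula below is a hypothetical example: the disclosure specifies 2D low-pass filtering for dynamic images and IIR filtering for static images but gives no explicit equation, so `filter_pixel` and its use of the parameter k are assumptions.

```python
def filter_pixel(x_n, y_prev, k):
    """Blend current input x(n) with former result y(n-1) under a motion
    estimation value k in [0, 1].  k near 1 (dynamic image) keeps the
    current pixel; k near 0 (static image) approaches a first-order
    infinite impulse response (IIR) average over past frames."""
    return k * x_n + (1.0 - k) * y_prev

# Static scene (k small): noise is averaged away across frames.
y = 0.0
for x in [10.0, 12.0, 8.0, 11.0, 9.0]:   # noisy samples of a static pixel
    y = filter_pixel(x, y, 0.2)

# Dynamic scene (k near 1): the output tracks the input, avoiding ghosting.
assert filter_pixel(50.0, 10.0, 1.0) == 50.0
```

Note that with k = 0 the recursion never updates from the input at all, which is why a practical k would stay strictly above zero for static scenes.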
Briefly, the memory unit 106 stores the filtering result and the motion factor of the former image, while the 3D filtering unit 102 and the motion detection unit 104 determine the motion degree according to the filtering result and the motion factor stored in the memory unit 106, so as to perform appropriate filtering on the image signal x(n) and generate the filtering result and the motion factor of the current image. In this way, the noise cancellation device 10 filters the image signal x(n) according to the motion degree, which effectively reduces noise, improves image quality, and maintains the definition of static images.
For example, please refer to
Therefore, according to the motion degree of the images, the 3D filtering unit 20 adjusts its filtering operations, enhancing the quality of dynamic images while maintaining the original definition of static images.
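For the dynamic case, the 2D low-pass filtering operation can be as simple as a 3×3 box (mean) filter over the current frame. The kernel choice and the border handling below are assumptions for illustration; the disclosure does not specify them.

```python
def box_filter_3x3(frame):
    """3x3 box (mean) filter: one possible 2D low-pass operation applied
    when the image is dynamic.  `frame` is a list of rows of pixel
    values.  Border pixels are left unchanged for brevity (a design
    choice, not taken from the disclosure)."""
    h, w = len(frame), len(frame[0])
    out = [row[:] for row in frame]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            out[i][j] = sum(frame[i + di][j + dj]
                            for di in (-1, 0, 1)
                            for dj in (-1, 0, 1)) / 9.0
    return out
```

Because this filter uses only the current frame, it removes noise without the motion blur that temporal averaging would cause on a moving scene.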
Please refer to
Note that, the motion estimation unit 30 shown in
In the preferred embodiment of the present invention, the noise cancellation device 10 generates the filtering result and the motion factor of the current image according to the filtering result and the motion factor of the former image. In such a situation, the memory unit 106 stores a filtering result and a motion factor of an image. Certainly, the present invention can also generate the filtering result and the motion factor of the current image according to filtering results and motion factors of a plurality of former images. For example, please refer to
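The multi-frame variant can be sketched by widening the memory unit to hold a short history of former filtering results. The uniform-mean weighting and the history depth below are assumptions for illustration; the disclosure states only that a plurality of former results and motion factors may be used.

```python
from collections import deque

def filter_with_history(x_n, history, k):
    """history: a deque of up to N former filtering results (the memory
    unit widened to a plurality of frames).  The current sample is
    blended with the uniform mean of the stored results; the weighting
    scheme is an assumption for illustration."""
    if history:
        y_n = k * x_n + (1.0 - k) * sum(history) / len(history)
    else:
        y_n = x_n   # first frame passes through unfiltered
    history.append(y_n)   # deque(maxlen=N) drops the oldest entry
    return y_n

hist = deque(maxlen=3)   # memory unit sized for three former frames
filter_with_history(10.0, hist, 0.5)
```

A deeper history smooths static scenes more aggressively at the cost of extra memory, which is the trade-off the multi-frame embodiment exposes.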
In conclusion, the present invention performs appropriate filtering operations on image signals according to the motion degree of the images, which effectively reduces noise, improves image quality, and maintains the definition of static images.
Those skilled in the art will readily observe that numerous modifications and alterations of the device and method may be made while retaining the teachings of the invention.
Number | Date | Country | Kind |
---|---|---|---|
96121405 A | Jun 2007 | TW | national |
Number | Name | Date | Kind |
---|---|---|---|
4672445 | Casey et al. | Jun 1987 | A |
5218449 | Ko et al. | Jun 1993 | A |
5225898 | Imai et al. | Jul 1993 | A |
5412481 | Ko et al. | May 1995 | A |
5543858 | Wischermann | Aug 1996 | A |
5818972 | Girod et al. | Oct 1998 | A |
5841251 | Vroemen et al. | Nov 1998 | A |
5845039 | Ko et al. | Dec 1998 | A |
5925875 | Frey | Jul 1999 | A |
6118489 | Han et al. | Sep 2000 | A |
6259489 | Flannaghan et al. | Jul 2001 | B1 |
6311555 | McCall et al. | Nov 2001 | B1 |
6360014 | Boon | Mar 2002 | B1 |
6400762 | Takeshima | Jun 2002 | B2 |
6567468 | Kato et al. | May 2003 | B1 |
6597738 | Park et al. | Jul 2003 | B1 |
6654054 | Embler | Nov 2003 | B1 |
6687300 | Fujita et al. | Feb 2004 | B1 |
7003037 | Bordes et al. | Feb 2006 | B1 |
7034870 | Nagaoka et al. | Apr 2006 | B2 |
7061548 | Piepers | Jun 2006 | B2 |
7085318 | Kondo et al. | Aug 2006 | B2 |
7170562 | Yoo et al. | Jan 2007 | B2 |
7193655 | Nicolas | Mar 2007 | B2 |
7268835 | Babonneau et al. | Sep 2007 | B2 |
7365801 | Kondo | Apr 2008 | B2 |
7460697 | Erhart et al. | Dec 2008 | B2 |
7489829 | Sorek et al. | Feb 2009 | B2 |
7567300 | Satou et al. | Jul 2009 | B2 |
7570833 | Lee | Aug 2009 | B2 |
7769089 | Chou | Aug 2010 | B1 |
7887489 | Lee et al. | Feb 2011 | B2 |
20010012408 | Badyal et al. | Aug 2001 | A1 |
20010035916 | Stessen et al. | Nov 2001 | A1 |
20010050956 | Takeshima | Dec 2001 | A1 |
20020044205 | Nagaoka et al. | Apr 2002 | A1 |
20030071920 | Yu | Apr 2003 | A1 |
20030122967 | Kondo et al. | Jul 2003 | A1 |
20030123750 | Yu | Jul 2003 | A1 |
20030189655 | Lim et al. | Oct 2003 | A1 |
20040153581 | Nakaya et al. | Aug 2004 | A1 |
20040179108 | Sorek et al. | Sep 2004 | A1 |
20040233326 | Yoo et al. | Nov 2004 | A1 |
20040257467 | Nicolas | Dec 2004 | A1 |
20040264802 | Kondo | Dec 2004 | A1 |
20050083439 | Endress et al. | Apr 2005 | A1 |
20050084011 | Song et al. | Apr 2005 | A1 |
20050094035 | Babonneau et al. | May 2005 | A1 |
20050135427 | Machimura et al. | Jun 2005 | A1 |
20050162566 | Chuang et al. | Jul 2005 | A1 |
20050243194 | Xu | Nov 2005 | A1 |
20050286802 | Clark et al. | Dec 2005 | A1 |
20060038920 | Kondo et al. | Feb 2006 | A1 |
20060050146 | Richardson | Mar 2006 | A1 |
20060104353 | Johnson et al. | May 2006 | A1 |
20060187357 | Satou et al. | Aug 2006 | A1 |
20070047647 | Lee et al. | Mar 2007 | A1 |
20070182862 | Li et al. | Aug 2007 | A1 |
20070229709 | Asamura et al. | Oct 2007 | A1 |
20080074552 | Jung et al. | Mar 2008 | A1 |
20080218630 | Kempf et al. | Sep 2008 | A1 |
20080232708 | Erdler et al. | Sep 2008 | A1 |
20080278631 | Fukuda | Nov 2008 | A1 |
20080291298 | Kim et al. | Nov 2008 | A1 |
20090086814 | Leontaris et al. | Apr 2009 | A1 |
20090184894 | Sato et al. | Jul 2009 | A1 |
20090245639 | Erdler et al. | Oct 2009 | A1 |
20100165207 | Deng et al. | Jul 2010 | A1 |
Number | Date | Country
---|---|---
20080309680 A1 | Dec 2008 | US |