Claims
- 1. A method comprising:
accessing at least a portion of a digital image; and determining if at least said portion is blurred based on at least one blur detection process selected from a group of blur detection processes including a wavelet transform blur detection process, and a Cepstrum analysis blur detection process.
- 2. The method as recited in claim 1, wherein said wavelet transform blur detection process includes:
wavelet transforming at least said portion of said digital image to produce a plurality of corresponding different resolution levels, each resolution level including a plurality of bands; generating at least one edge map for each of said resolution levels; and detecting blur in at least said portion of said digital image based on said resulting edge maps.
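The multi-level decomposition recited in claim 2 can be sketched in a few lines. This is a minimal illustration, not the patented method: it assumes a Haar basis (the claims name no particular wavelet) and a fixed three-level Mallat-style pyramid, recursing on the LL band at each level.

```python
import numpy as np

def haar_dwt2(img):
    """One level of an (unnormalized) 2-D Haar transform.

    Returns the LL approximation band and the HL, LH, HH detail bands.
    Requires even height and width.
    """
    a = img[0::2, :] + img[1::2, :]   # pairwise row sums
    d = img[0::2, :] - img[1::2, :]   # pairwise row differences
    ll = (a[:, 0::2] + a[:, 1::2]) / 4.0   # 2x2 block average
    hl = (a[:, 0::2] - a[:, 1::2]) / 4.0   # vertical-edge detail
    lh = (d[:, 0::2] + d[:, 1::2]) / 4.0   # horizontal-edge detail
    hh = (d[:, 0::2] - d[:, 1::2]) / 4.0   # diagonal detail
    return ll, hl, lh, hh

def wavelet_pyramid(img, levels=3):
    """Decompose an image into `levels` resolution levels of (HL, LH, HH)
    detail bands, recursing on the LL band as in a standard pyramid."""
    bands = []
    ll = img.astype(float)
    for _ in range(levels):
        ll, hl, lh, hh = haar_dwt2(ll)
        bands.append((hl, lh, hh))
    return bands
```

Each level halves the resolution, so a 64x64 input yields 32x32, 16x16, and 8x8 detail bands — the "plurality of corresponding different resolution levels" of the claim.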
- 3. The method as recited in claim 2, wherein detecting blur in at least said portion of said digital image based on said resulting edge maps further includes normalizing each of said resulting edge maps.
- 4. The method as recited in claim 3, wherein normalizing each of said resulting edge maps further includes discretizing edge information.
- 5. The method as recited in claim 4, wherein said edge information includes edge amplitude data.
- 6. The method as recited in claim 3, wherein normalizing each of said resulting edge maps further includes:
normalizing a total edge amplitude of said edge map: Emap_i(k,l) = Emap_i(k,l)/max(Emap_i); partitioning said edge map into edge map blocks; determining a maximal edge amplitude in each of said edge map blocks and using it to represent the respective edge map block; and using Emax_i to denote a discretization result of Emap_i for each of said edge map blocks.
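The normalization and block-wise discretization of claims 3–6 can be sketched as follows. The edge-map construction itself is elided in claim 8 of this text, so `edge_map` here assumes the common root-sum-of-squares of the three detail bands; that choice is an assumption, not quoted from the patent.

```python
import numpy as np

def edge_map(hl, lh, hh):
    # Assumed construction: Emap_i = sqrt(HL^2 + LH^2 + HH^2).
    # Claim 8's actual formula is elided in this copy of the text.
    return np.sqrt(hl**2 + lh**2 + hh**2)

def discretize(emap, block=8):
    """Normalize total edge amplitude, then represent each block by its
    maximal amplitude (the Emax_i of claim 6)."""
    emap = emap / emap.max()                      # Emap_i / max(Emap_i)
    h, w = emap.shape
    emax = emap[:h - h % block, :w - w % block]   # crop to whole blocks
    emax = emax.reshape(h // block, block, w // block, block).max(axis=(1, 3))
    return emax
```

Presumably the block size is chosen per level (e.g. 8, 4, 2 for successive levels) so that the Emax maps of all levels land on the same grid and can be compared node by node.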
- 7. The method as recited in claim 2, wherein said plurality of bands includes at least LL, HL, LH, HH bands.
- 8. The method as recited in claim 7, wherein I_iv, I_ih, I_id denote the LH_i, HL_i, HH_i bands, respectively, and wherein generating said at least one edge map for each of said resolution levels further includes constructing said edge map in scale i as follows:
- 9. The method as recited in claim 8, wherein detecting blur in at least said portion of said digital image based on said resulting edge maps further includes:
comparing amplitude variations of corresponding edge nodes in at least two different edge maps of at least two different levels, and wherein comparing said amplitude variations includes generating a difference map Dmap based on Dmap(i,j) = (Emax_3(i,j) − Emax_2(i,j))^2 + (Emax_2(i,j) − Emax_1(i,j))^2.
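The difference map of claim 9 (with the fused equation number stripped) is a direct computation, assuming the three Emax maps have already been discretized onto a common grid:

```python
import numpy as np

def difference_map(emax1, emax2, emax3):
    """Dmap(i,j) = (Emax3 - Emax2)^2 + (Emax2 - Emax1)^2: the amplitude
    variation of each edge node across the three resolution levels."""
    return (emax3 - emax2)**2 + (emax2 - emax1)**2
```

Sharp edges keep roughly the same amplitude across scales, so Dmap stays small; blurred edges lose amplitude at finer scales, which claim 12 reflects in the large Dmap values at blurred-edge positions.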
- 10. The method as recited in claim 2, wherein detecting blur in at least said portion of said digital image based on said resulting edge maps further includes:
comparing amplitude variations of corresponding edge nodes in at least two different edge maps of at least two different levels.
- 11. The method as recited in claim 10, wherein comparing said amplitude variations includes generating a difference map.
- 12. The method as recited in claim 11, wherein in said difference map a position of a plurality of relatively large amplitude values corresponds to at least one blurred edge.
- 13. The method as recited in claim 11, wherein detecting blur in at least said portion of said digital image based on said resulting edge maps further includes: generating a binary difference map BDmap such that BDmap(i,j) = 1 if Dmap(i,j) > t1, and BDmap(i,j) = 0 otherwise, where t1 is a first threshold value; and determining that at least one edge map block (i,j) is blurred if said corresponding BDmap(i,j) = 1.
- 14. The method as recited in claim 13, further comprising:
determining that at least said portion of said digital image is blurred if an applicable percentage of edge map blocks determined to be blurred exceeds a second threshold value.
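The thresholding in claims 13–14 reduces to a few lines. The threshold values t1 and t2 below are placeholders for illustration; the patent text does not give numeric values for its first and second thresholds.

```python
import numpy as np

def is_blurred(dmap, t1=0.1, t2=0.5):
    """Binarize Dmap with first threshold t1, then declare the image
    (portion) blurred when the fraction of blurred blocks exceeds the
    second threshold t2. Threshold values here are illustrative only."""
    bdmap = (dmap > t1).astype(int)      # BDmap of claim 13
    return bdmap.mean() > t2, bdmap      # claim 14's percentage test
```
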
- 15. The method as recited in claim 1, wherein said Cepstrum analysis blur detection process includes:
determining a Cepstrum for image data I as C(f) = real(FFT^−1(log(|FFT(I)|))).
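Claim 15's Cepstrum is stated explicitly and maps directly onto numpy's FFT routines. The only addition below is a small epsilon to guard log(0), a numerical detail the claim does not mention:

```python
import numpy as np

def cepstrum2d(img, eps=1e-12):
    """C = real(IFFT(log|FFT(I)|)), per claim 15.

    eps avoids log(0) when a spectral coefficient is exactly zero.
    """
    spectrum = np.abs(np.fft.fft2(img))
    return np.real(np.fft.ifft2(np.log(spectrum + eps)))
```

Motion blur tends to leave an elongated ridge in the Cepstrum along the blur direction, while out-of-focus blur leaves a roughly circular pattern — the shape distinction the later elongation claims exploit.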
- 16. The method as recited in claim 1, wherein said Cepstrum analysis blur detection process includes:
dividing said image into a plurality of parts; and determining a Cepstrum for each of said parts.
- 17. The method as recited in claim 1, wherein said Cepstrum analysis blur detection process includes:
blurring at least one boundary within said image.
- 18. The method as recited in claim 17, wherein blurring said at least one boundary within said image further includes:
using a point spread function (PSF) to blur at least a part of said image.
- 19. The method as recited in claim 18, wherein said PSF includes a circular averaging filter.
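A circular averaging filter of the kind claim 19 recites can be sketched as below. MATLAB's `fspecial('disk')` additionally anti-aliases the rim pixels; this cruder hard-edged disk keeps the sketch short and is an assumption, not the patent's exact PSF.

```python
import numpy as np

def disk_psf(radius):
    """Circular averaging filter: 1 inside a disk of the given radius,
    normalized so the kernel sums to 1 (no rim anti-aliasing)."""
    r = int(np.ceil(radius))
    y, x = np.mgrid[-r:r + 1, -r:r + 1]
    mask = (x**2 + y**2) <= radius**2
    return mask / mask.sum()
```

Convolving an image part I_ij with this kernel would produce the blurred image BI_ij of claim 20.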
- 20. The method as recited in claim 19, wherein said PSF is used to blur a plurality of parts Iij to produce corresponding blurred images BIij, where
- 21. The method as recited in claim 20, further comprising:
determining an image Jij that includes a weighted sum of said Iij and corresponding BIij.
- 22. The method as recited in claim 21, further comprising:
generating a weighting array wherein Jij is at least substantially equal to Iij in its central region and at least substantially equal to said corresponding BIij near at least one edge.
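The weighting array of claims 21–22 only requires that J_ij approximate I_ij in the center and BI_ij near the edges; the linear ramp used below is one simple way to satisfy that, and the `margin` width is an illustrative parameter, not from the patent.

```python
import numpy as np

def edge_weight(shape, margin):
    """Weighting array: ~1 in the central region, falling linearly to 0
    within `margin` pixels of each border."""
    h, w = shape
    y = np.minimum(np.arange(h), np.arange(h)[::-1])  # distance to top/bottom
    x = np.minimum(np.arange(w), np.arange(w)[::-1])  # distance to left/right
    ramp_y = np.clip(y / margin, 0.0, 1.0)
    ramp_x = np.clip(x / margin, 0.0, 1.0)
    return np.minimum.outer(ramp_y, ramp_x)

def blend(img, blurred, margin=4):
    """J = w*I + (1-w)*BI: original in the center, blurred near borders
    (claims 21-22), suppressing boundary artifacts in the Cepstrum."""
    w = edge_weight(img.shape, margin)
    return w * img + (1.0 - w) * blurred
```
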
- 23. The method as recited in claim 22, wherein:
- 24. The method as recited in claim 23, further comprising: binarizing each CIij.
- 25. The method as recited in claim 24, wherein binarizing each CI_ij includes setting BCI(x,y) = 1 if CI(x,y)/max(CI) > t3, and otherwise setting BCI(x,y) = 0, wherein t3 is a third threshold value.
- 26. The method as recited in claim 24, further comprising calculating an elongation of each resulting binarized Cepstrum image.
- 27. The method as recited in claim 26, wherein said elongation includes a ratio of a maximum chord length to a minimum chord length.
- 28. The method as recited in claim 26, wherein moments are used to calculate said elongation.
- 29. The method as recited in claim 28, wherein an ij-th discrete central moment μ_ij of a region is defined by
- 30. The method as recited in claim 29, wherein said elongation using moments includes an:
- 31. The method as recited in claim 29, wherein principal axes of inertia are used to define a natural coordinate system for said region, such that
- 32. The method as recited in claim 28, further comprising:
determining that said image includes motion blur if more than about one third of said regions have an elongation larger than a threshold value L and the maximum difference between corresponding principal axes is less than a threshold Δθ.
- 33. The method as recited in claim 28, further comprising:
determining that said image includes out-of-focus blur if more than about one third of said regions have applicable areas that are larger than a threshold area value A and corresponding elongations that are less than a threshold value T.
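The moment and elongation formulas are elided in claims 29–31 of this text; the sketch below reconstructs them from the standard image-moment definitions (μ_pq as the sum of (x − x̄)^p (y − ȳ)^q over the region, elongation from the eigenvalues of the second-moment matrix, θ as the principal-axis angle). These are common textbook forms assumed to match the claims, not quoted from them.

```python
import numpy as np

def central_moments(region):
    """Second-order discrete central moments (mu20, mu02, mu11) of a
    binary region, per the standard definition assumed for claim 29."""
    ys, xs = np.nonzero(region)
    xbar, ybar = xs.mean(), ys.mean()
    mu = lambda p, q: np.sum((xs - xbar)**p * (ys - ybar)**q)
    return mu(2, 0), mu(0, 2), mu(1, 1)

def elongation_and_axis(region):
    """Elongation as the ratio of major to minor principal axis of
    inertia, plus the principal-axis angle theta (claims 30-31)."""
    m20, m02, m11 = central_moments(region)
    common = np.sqrt((m20 - m02)**2 + 4 * m11**2)
    lam_max = (m20 + m02 + common) / 2   # major-axis second moment
    lam_min = (m20 + m02 - common) / 2   # minor-axis second moment
    theta = 0.5 * np.arctan2(2 * m11, m20 - m02)
    return np.sqrt(lam_max / max(lam_min, 1e-12)), theta
```

With these, claims 32–33 become a vote: motion blur when over about a third of the binarized Cepstrum regions are strongly elongated with nearly the same θ, out-of-focus blur when over about a third are large but round (elongation near 1).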
- 34. A computer-readable medium having computer-implementable instructions suitable for causing at least one processing unit to perform acts comprising:
determining if at least a portion of a digital image is motion blurred or out-of-focus blurred based on at least one blur detection process selected from a group of blur detection processes including a wavelet transform blur detection process, and a Cepstrum analysis blur detection process.
- 35. The computer-readable medium as recited in claim 34, wherein said wavelet transform blur detection process includes:
wavelet transforming at least said portion of said digital image to produce a plurality of corresponding different resolution levels, each resolution level including a plurality of bands; generating at least one edge map for each of said resolution levels; and detecting blur in at least said portion of said digital image based on said resulting edge maps.
- 36. The computer-readable medium as recited in claim 35, wherein detecting blur in at least said portion of said digital image based on said resulting edge maps further includes normalizing each of said resulting edge maps.
- 37. The computer-readable medium as recited in claim 36, wherein normalizing each of said resulting edge maps further includes discretizing edge information.
- 38. The computer-readable medium as recited in claim 37, wherein said edge information includes edge amplitude data.
- 39. The computer-readable medium as recited in claim 36, wherein normalizing each of said resulting edge maps further includes:
normalizing a total edge amplitude of said edge map: Emap_i(k,l) = Emap_i(k,l)/max(Emap_i); partitioning said edge map into edge map blocks; determining a maximal edge amplitude in each of said edge map blocks and using it to represent the respective edge map block; and using Emax_i to denote a discretization result of Emap_i for each of said edge map blocks.
- 40. The computer-readable medium as recited in claim 35, wherein said plurality of bands includes at least LL, HL, LH, HH bands.
- 41. The computer-readable medium as recited in claim 40, wherein I_iv, I_ih, I_id denote the LH_i, HL_i, HH_i bands, respectively, and wherein generating said at least one edge map for each of said resolution levels further includes constructing said edge map in scale i as follows:
- 42. The computer-readable medium as recited in claim 41, wherein detecting blur in at least said portion of said digital image based on said resulting edge maps further includes:
comparing amplitude variations of corresponding edge nodes in at least two different edge maps of at least two different levels, and wherein comparing said amplitude variations includes generating a difference map Dmap based on Dmap(i,j) = (Emax_3(i,j) − Emax_2(i,j))^2 + (Emax_2(i,j) − Emax_1(i,j))^2.
- 43. The computer-readable medium as recited in claim 35, wherein detecting blur in at least said portion of said digital image based on said resulting edge maps further includes:
comparing amplitude variations of corresponding edge nodes in at least two different edge maps of at least two different levels.
- 44. The computer-readable medium as recited in claim 43, wherein comparing said amplitude variations includes generating a difference map.
- 45. The computer-readable medium as recited in claim 44, wherein in said difference map a position of a plurality of relatively large amplitude values corresponds to at least one blurred edge.
- 46. The computer-readable medium as recited in claim 44, wherein detecting blur in at least said portion of said digital image based on said resulting edge maps further includes:
generating a binary difference map BDmap such that BDmap(i,j) = 1 if Dmap(i,j) > t1, and BDmap(i,j) = 0 otherwise, where t1 is a first threshold value; and determining that at least one edge map block (i,j) is blurred if said corresponding BDmap(i,j) = 1.
- 47. The computer-readable medium as recited in claim 46, further comprising:
determining that at least said portion of said digital image is blurred if an applicable percentage of edge map blocks determined to be blurred exceeds a second threshold value.
- 48. The computer-readable medium as recited in claim 34, wherein said Cepstrum analysis blur detection process includes:
determining a Cepstrum for image data I as C(f) = real(FFT^−1(log(|FFT(I)|))).
- 49. The computer-readable medium as recited in claim 34, wherein said Cepstrum analysis blur detection process includes:
dividing said image into a plurality of parts; and determining a Cepstrum for each of said parts.
- 50. The computer-readable medium as recited in claim 34, wherein said Cepstrum analysis blur detection process includes:
blurring at least one boundary within said image.
- 51. The computer-readable medium as recited in claim 50, wherein blurring said at least one boundary within said image further includes:
using a point spread function (PSF) to blur at least a part of said image.
- 52. The computer-readable medium as recited in claim 51, wherein said PSF includes a circular averaging filter.
- 53. The computer-readable medium as recited in claim 52, wherein said PSF is used to blur a plurality of parts Iij to produce corresponding blurred images BIij, where
- 54. The computer-readable medium as recited in claim 53, further comprising:
determining an image Jij that includes a weighted sum of said Iij and corresponding BIij.
- 55. The computer-readable medium as recited in claim 54, further comprising:
generating a weighting array wherein Jij is at least substantially equal to Iij in its central region and at least substantially equal to said corresponding BIij near at least one edge.
- 56. The computer-readable medium as recited in claim 55, wherein:
- 57. The computer-readable medium as recited in claim 56, further comprising:
binarizing each CIij.
- 58. The computer-readable medium as recited in claim 57, wherein binarizing each CI_ij includes setting BCI(x,y) = 1 if CI(x,y)/max(CI) > t3, and otherwise setting BCI(x,y) = 0, wherein t3 is a third threshold value.
- 59. The computer-readable medium as recited in claim 57, further comprising calculating an elongation of each resulting binarized Cepstrum image.
- 60. The computer-readable medium as recited in claim 59, wherein said elongation includes a ratio of a maximum chord length to a minimum chord length.
- 61. The computer-readable medium as recited in claim 59, wherein moments are used to calculate said elongation.
- 62. The computer-readable medium as recited in claim 61, wherein an ij-th discrete central moment μ_ij of a region is defined by
- 63. The computer-readable medium as recited in claim 62, wherein said elongation using moments includes an:
- 64. The computer-readable medium as recited in claim 62, wherein principal axes of inertia are used to define a natural coordinate system for said region, such that
- 65. The computer-readable medium as recited in claim 61, further comprising:
determining that said image includes motion blur if more than about one third of said regions have an elongation larger than a threshold value L and the maximum difference between corresponding principal axes is less than a threshold Δθ.
- 66. The computer-readable medium as recited in claim 61, further comprising:
determining that said image includes out-of-focus blur if more than about one third of said regions have applicable areas that are larger than a threshold area value A and corresponding elongations that are less than a threshold value T.
- 67. An apparatus comprising:
logic operatively configured to access digital image data and determine if at least a portion of said image is blurry using at least one blur detector selected from a group of blur detectors comprising a wavelet transform blur detector, and a Cepstrum analysis blur detector.
- 68. The apparatus as recited in claim 67, wherein said wavelet transform blur detector is operatively configured to wavelet transform at least said portion to produce a plurality of corresponding different resolution levels with each resolution level including a plurality of bands, generate at least one edge map for each of said resolution levels, and detect blur in at least said portion of said digital image based on said resulting edge maps.
- 69. The apparatus as recited in claim 68, wherein said wavelet transform blur detector normalizes each of said resulting edge maps.
- 70. The apparatus as recited in claim 69, wherein said wavelet transform blur detector discretizes edge information.
- 71. The apparatus as recited in claim 70, wherein said edge information includes edge amplitude data.
- 72. The apparatus as recited in claim 69, wherein said wavelet transform blur detector normalizes each of said resulting edge maps by normalizing a total edge amplitude of said edge map such that Emap_i(k,l) = Emap_i(k,l)/max(Emap_i), partitions said edge map into edge map blocks, determines a maximal edge amplitude in each of said edge map blocks and uses it to represent the respective edge map block, and uses Emax_i to denote a discretization result of Emap_i for each of said edge map blocks.
- 73. The apparatus as recited in claim 68, wherein said plurality of bands includes at least LL, HL, LH, HH bands.
- 74. The apparatus as recited in claim 73, wherein I_iv, I_ih, I_id denote the LH_i, HL_i, HH_i bands, respectively, and wherein said wavelet transform blur detector constructs said edge map in scale i as follows:
- 75. The apparatus as recited in claim 74, wherein said wavelet transform blur detector compares amplitude variations of corresponding edge nodes in at least two different edge maps of at least two different levels, and generates a difference map Dmap based on Dmap(i,j) = (Emax_3(i,j) − Emax_2(i,j))^2 + (Emax_2(i,j) − Emax_1(i,j))^2.
- 76. The apparatus as recited in claim 68, wherein said wavelet transform blur detector compares amplitude variations of corresponding edge nodes in at least two different edge maps of at least two different levels.
- 77. The apparatus as recited in claim 76, wherein said wavelet transform blur detector generates a difference map.
- 78. The apparatus as recited in claim 77, wherein in said difference map a position of a plurality of relatively large amplitude values corresponds to at least one blurred edge.
- 79. The apparatus as recited in claim 77, wherein said wavelet transform blur detector generates a binary difference map BDmap such that BDmap(i,j) = 1 if Dmap(i,j) > t1, and BDmap(i,j) = 0 otherwise, where t1 is a first threshold value, and determines that at least one edge map block (i,j) is blurred if said corresponding BDmap(i,j) = 1.
- 80. The apparatus as recited in claim 79, wherein said wavelet transform blur detector determines that at least said portion of said digital image is blurred if an applicable percentage of edge map blocks determined to be blurred exceeds a second threshold value.
- 81. The apparatus as recited in claim 67, wherein said Cepstrum analysis blur detector determines a Cepstrum for image data I as C(f) = real(FFT^−1(log(|FFT(I)|))).
- 82. The apparatus as recited in claim 67, wherein said Cepstrum analysis blur detector divides said image into a plurality of parts, and determines a Cepstrum for each of said parts.
- 83. The apparatus as recited in claim 67, wherein said Cepstrum analysis blur detector selectively blurs at least one boundary within said image.
- 84. The apparatus as recited in claim 83, wherein said Cepstrum analysis blur detector uses a point spread function (PSF) to selectively blur at least a part of said image.
- 85. The apparatus as recited in claim 84, wherein said PSF includes a circular averaging filter.
- 86. The apparatus as recited in claim 85, wherein said PSF blurs a plurality of parts Iij to produce corresponding blurred images BIij, where
- 87. The apparatus as recited in claim 86, wherein said Cepstrum analysis blur detector determines an image Jij that includes a weighted sum of said Iij and corresponding BIij.
- 88. The apparatus as recited in claim 87, wherein said Cepstrum analysis blur detector generates a weighting array wherein Jij is at least substantially equal to Iij in its central region and at least substantially equal to said corresponding BIij near at least one edge.
- 89. The apparatus as recited in claim 88, wherein
- 90. The apparatus as recited in claim 89, wherein said Cepstrum analysis blur detector binarizes each CIij.
- 91. The apparatus as recited in claim 90, wherein said Cepstrum analysis blur detector binarizes each CI_ij by setting BCI(x,y) = 1 if CI(x,y)/max(CI) > t3, and otherwise setting BCI(x,y) = 0, wherein t3 is a third threshold value.
- 92. The apparatus as recited in claim 90, wherein said Cepstrum analysis blur detector calculates an elongation of each resulting binarized Cepstrum image.
- 93. The apparatus as recited in claim 92, wherein said elongation includes a ratio of a maximum chord length to a minimum chord length.
- 94. The apparatus as recited in claim 92, wherein said Cepstrum analysis blur detector uses moments to calculate said elongation.
- 95. The apparatus as recited in claim 94, wherein an ij-th discrete central moment μ_ij of a region is defined by
- 96. The apparatus as recited in claim 95, wherein said elongation using moments includes an:
- 97. The apparatus as recited in claim 95, wherein said Cepstrum analysis blur detector uses principal axes of inertia to define a natural coordinate system for said region, such that
- 98. The apparatus as recited in claim 94, wherein said logic determines that said image includes motion blur if more than about one third of said regions have an elongation larger than a threshold value L and the maximum difference between corresponding principal axes is less than a threshold Δθ.
- 99. The apparatus as recited in claim 94, wherein said logic determines that said image includes out-of-focus blur if more than about one third of said regions have applicable areas that are larger than a threshold area value A and corresponding elongations that are less than a threshold value T.
- 100. The apparatus as recited in claim 67, wherein said apparatus includes at least one device selected from a group of devices comprising a computer, a camera, a set top box, an optical disc player, an optical disc player/recorder, a portable communication device, a display device, a television set, and a projector.
RELATED PATENT APPLICATIONS
[0001] This U.S. Non-provisional application for Letters Patent is a continuation-in-part of co-pending U.S. patent application Ser. No. 10/374,934, filed Feb. 26, 2003, and titled “Image Blur Detection Methods and Arrangements”, which is a continuation of U.S. patent application Ser. No. 09/833,525, filed Apr. 9, 2001, and titled “Image Blur Detection Methods and Arrangements”, now U.S. Pat. No. 6,548,800. The present U.S. Non-provisional application for Letters Patent claims the benefit of priority from these earlier patent applications and hereby incorporates by reference the entire disclosure of each of them.
Continuations (1)

|        | Number   | Date     | Country |
|--------|----------|----------|---------|
| Parent | 09833525 | Apr 2001 | US      |
| Child  | 10374934 | Feb 2003 | US      |
Continuation in Parts (1)

|        | Number   | Date     | Country |
|--------|----------|----------|---------|
| Parent | 10374934 | Feb 2003 | US      |
| Child  | 10646387 | Aug 2003 | US      |