METHOD AND APPARATUS FOR ADAPTIVE AND SELF-CALIBRATED SENSOR GREEN CHANNEL GAIN BALANCING

Information

  • Patent Application Publication Number
    20070165116
  • Date Filed
    September 06, 2006
  • Date Published
    July 19, 2007
Abstract
A method and apparatus for adaptive green channel odd-even mismatch removal to effectuate the disappearance of artifacts caused by the odd-even mismatch in a demosaic processed image. In one adaptive approach, a calibrated GR channel gain for red rows and a calibrated GB channel gain for blue rows are determined and are a function of valid pixels only in each respective region. After the calibration, in a correction process, the green pixels in red rows of a region are multiplied by the calibrated GR channel gain. On the other hand, the green pixels in blue rows are multiplied by the calibrated GB channel gain. Thus, after demosaic processing, the corrected image has essentially no artifacts caused by odd-even mismatch of the green channel. Alternatively, the adaptive green channel odd-even mismatch removal method replaces the center green pixel of a region having an odd number of columns and rows with a normalized weighted green pixel sum total. The weighted green pixel sum total adds the center green pixel weighted by a first weighting factor, a sum of a first tier layer of weighted green pixel values based on a second weighting factor, and a sum of a second tier layer of weighted green pixel values based on a third weighting factor.
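
As a rough illustration of the gain-based approach summarized above, the minimal sketch below balances the Gr and Gb channels of a single Bayer region. It is only a sketch under stated assumptions, not the patented implementation: the GRBG layout, the simple intensity-bound validity test standing in for bad-pixel and edge filtering, the use of the common Gr/Gb mean as the correction target, and every name and parameter value are illustrative choices.

```python
import numpy as np

def balance_green_region(region, lo=16, hi=1000):
    """Illustrative Gr/Gb balancing for one Bayer region (e.g. 32x32).

    Assumes a GRBG layout: green pixels in red rows sit at (even row,
    even column) and green pixels in blue rows at (odd row, odd column).
    The intensity bounds lo/hi stand in for the bad-pixel and edge
    filtering that defines the valid pixel pairs.
    """
    gr = region[0::2, 0::2].astype(float)   # green pixels in red rows
    gb = region[1::2, 1::2].astype(float)   # green pixels in blue rows

    valid = (gr > lo) & (gr < hi) & (gb > lo) & (gb < hi)
    if not valid.any():
        return region.astype(float)          # nothing reliable to calibrate on

    gr_avg = gr[valid].mean()                # average of valid Gr pixels
    gb_avg = gb[valid].mean()                # average of valid Gb pixels
    target = 0.5 * (gr_avg + gb_avg)

    gr_gain = target / gr_avg                # calibrated GR channel gain
    gb_gain = target / gb_avg                # calibrated GB channel gain

    out = region.astype(float)
    out[0::2, 0::2] *= gr_gain               # correct green pixels in red rows
    out[1::2, 1::2] *= gb_gain               # correct green pixels in blue rows
    return out
```

In a full pipeline this would run region by region over the raw Bayer frame (for example on 32×32 tiles, matching the region size used in the figures), and the per-region gains could additionally be filtered against the gains computed for a previous image, as recited in claims 6 and 12 below.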
Description

BRIEF DESCRIPTION OF THE DRAWINGS

The foregoing summary, as well as the following detailed description of preferred embodiments of the invention, will be better understood when read in conjunction with the accompanying drawings. For the purpose of illustrating the invention, there is shown in the drawings embodiments which are presently preferred. It should be understood, however, that the invention is not limited to the precise arrangement shown. In the drawings:



FIG. 1 illustrates a flat field image after demosaic operation (zoomed 300%);



FIG. 2A illustrates a plot of a Gb gain distribution for a green channel odd-even mismatch distribution where each point represents one region (32×32 pixels);



FIG. 2B illustrates a plot of a Gr gain distribution for the green channel odd-even mismatch distribution where each point represents one region (32×32 pixels);



FIG. 3 illustrates a modified flat field image after applying an adaptive Bayer filter to handle the green channel odd-even mismatch with moderate smoothing (zoomed 300%);



FIG. 4 illustrates a plot of a green channel mismatch (Gr/Gb) of an indoor image where each point represents one region (32×32 pixels);



FIG. 5 illustrates a plot of the green channel mismatch (Gr/Gb) of an outdoor image where each point represents one region (32×32 pixels);



FIGS. 6A-6B illustrate flowcharts for the adaptive region-by-region green channel gain self-calibration process of the green channel odd-even mismatch removal method;



FIG. 7 illustrates a flowchart for the correction process of the green channel odd-even mismatch removal method;



FIG. 8 illustrates a flowchart to calculate the average GB and GR values for the valid pixel pairs;



FIG. 9 illustrates a flowchart to calculate the average gain for each GB, GR pair;



FIG. 10A illustrates a raw Bayer image divided into 4×3 regions, with one region cross-hatched;



FIG. 10B illustrates the cross-hatched region of FIG. 10A divided into 8×8 pixels;



FIG. 11 illustrates a block diagram of a snapshot imaging device incorporating a green channel odd-even mismatch removal module;



FIG. 12 illustrates a flat field image after region-by-region gain correction and demosaic where each region size is 32×32 (zoomed 300%);



FIG. 13A illustrates a Bayer pattern with Green pixel indexing;



FIG. 13B illustrates a Bayer pattern with Green pixel indexing and Red pixel indexing;



FIGS. 14A-14E illustrate flowcharts for an alternative adaptive green channel odd-even mismatch removal method for adaptive channel balancing;



FIG. 15A illustrates a flat field image without the adaptive channel balancing (zoomed 300% and with demosaic processing);



FIG. 15B illustrates a flat field image with the adaptive channel balancing (zoomed 300% and with demosaic processing);



FIG. 16A illustrates a resolution chart image (center circles) without the adaptive channel balancing (zoomed 300% and with demosaic processing);



FIG. 16B illustrates a resolution chart image (center circles) with the adaptive channel balancing (zoomed 300% and with demosaic processing);



FIG. 17A illustrates a resolution chart image (vertical lines) without the adaptive channel balancing (zoomed 300% and with demosaic processing);



FIG. 17B illustrates a resolution chart image (vertical lines) with the adaptive channel balancing (zoomed 300% and with demosaic processing);



FIG. 18A illustrates a resolution chart image (horizontal lines) without the adaptive channel balancing (zoomed 300% and with demosaic processing);



FIG. 18B illustrates a resolution chart image (horizontal lines) with the adaptive channel balancing (zoomed 300% and with demosaic processing);



FIG. 19A illustrates a MacBeth chart image without the adaptive channel balancing (zoomed 300% and with demosaic processing);



FIG. 19B illustrates a MacBeth chart image with the adaptive channel balancing algorithm (zoomed 300% and with demosaic processing);



FIG. 20A illustrates a MacBeth chart image without the adaptive channel balancing and with demosaic processing; and



FIG. 20B illustrates a MacBeth chart image with the adaptive channel balancing and with demosaic processing.


Claims
  • 1. A method for adaptive green channel odd-even mismatch removal comprising the steps of: calibrating region-by-region of an image a green (GR) channel gain for red rows and a green (GB) channel gain for blue rows, and applying, region-by-region, the GR channel gain to green pixels in the red rows and the GB channel gain to the green pixels in the blue rows calibrated for each respective region to remove the green channel odd-even mismatch.
  • 2. The method of claim 1; wherein the image is a raw Bayer image.
  • 3. The method of claim 1; wherein the calibrating step includes the step of filtering out bad pixels and edge pixels in said each respective region to form a set of valid pixel pairs.
  • 4. The method of claim 3; wherein the calibrating step includes the steps of counting a number of the valid pixel pairs in the region; computing an average number of the valid green pixels for the red rows; and computing an average number of the valid green pixels for the blue rows.
  • 5. The method of claim 4; wherein the GR channel gain and the GB channel gain are a function of the computed average numbers of the valid green pixels for the red rows and the blue rows.
  • 6. The method of claim 1; wherein the calibrating step includes the step of filtering the GR channel gain and the GB channel gain with a GR channel gain and a GB channel gain of a previous image; and wherein the applied GR channel gain and the applied GB channel gain of the applying step are the filtered GR channel gain and the filtered GB channel gain.
  • 7. The method of claim 1; wherein the applying step comprises the steps of multiplying the green pixels in red rows in said each respective region with the GR channel gain; and multiplying the green pixels in blue rows with the GB channel gain.
  • 8. Program code executed by a processing device comprising: instructions operable upon execution to: calibrate region-by-region in an image a green (GR) channel gain for red rows and a green (GB) channel gain for blue rows, and apply, region-by-region, the GR channel gain to green pixels in the red rows and the GB channel gain to green pixels in the blue rows calibrated for each respective region to adaptively remove green channel odd-even mismatch from the image.
  • 9. The code of claim 8; wherein the instructions operable to calibrate include instructions operable to filter out bad pixels and edge pixels in said each respective region to form a set of valid pixel pairs.
  • 10. The code of claim 9; wherein the instructions operable to calibrate include instructions operable to count a number of the valid pixel pairs in the region; to compute an average number of valid green pixels for the red rows; and to compute an average number of the valid green pixels for the blue rows.
  • 11. The code of claim 10; wherein the GR channel gain and the GB channel gain are a function of the computed average numbers of the valid green pixels for the red rows and the blue rows.
  • 12. The code of claim 8; wherein the instructions operable to calibrate include instructions operable to filter the GR channel gain and the GB channel gain with a GR channel gain and a GB channel gain of a previous image; and wherein the instructions operable to apply the GR channel gain and the applied GB channel gain apply the filtered GR channel gain and the filtered GB channel gain.
  • 13. The code of claim 8; wherein the instructions operable to apply include instructions operable to multiply the green pixels in the red rows in said each respective region with the GR channel gain; and to multiply the green pixels in the blue rows with the GB channel gain.
  • 14. An adaptive green channel odd-even mismatch removal module comprising: means for calibrating region-by-region in an image a green (GR) channel gain for red rows and a green (GB) channel gain for blue rows, and means for applying, region-by-region, the GR channel gain to green pixels in the red rows and the GB channel gain to green pixels in the blue rows calibrated for each respective region for removing the green channel odd-even mismatch.
  • 15. The module of claim 14; wherein the image is a raw Bayer image.
  • 16. The module of claim 14; wherein the calibrating means includes means for filtering out bad pixels and edge pixels in said each respective region to form a set of valid pixel pairs.
  • 17. The module of claim 16; wherein the calibrating means includes means for counting a number of the valid pixel pairs in the region; means for computing an average number of valid green pixels for the red rows; and means for computing an average number of valid green pixels for the blue rows.
  • 18. The module of claim 17; wherein the GR channel gain and the GB channel gain are a function of the computed average numbers of the valid green pixels for the red rows and the blue rows.
  • 19. The module of claim 17; wherein the calibrating means includes means for filtering the GR channel gain and the GB channel gain with a GR channel gain and a GB channel gain of a previous image; and wherein the applied GR channel gain and the applied GB channel gain of the applying means are the filtered GR channel gain and the filtered GB channel gain.
  • 20. The module of claim 19; wherein the applying means comprises means for multiplying the green pixels in the red rows in said each respective region with the GR channel gain; and means for multiplying the green pixels in the blue rows with the GB channel gain.
  • 21. A method for adaptive green channel odd-even mismatch removal comprising the steps of: dividing a raw image from a sensor into a plurality of regions; and for each region, adaptively removing green channel odd-even mismatch in the raw image to effectuate the disappearance of artifacts in a demosaic processed image.
  • 22. The method of claim 21; wherein the raw image is a raw Bayer image.
  • 23. The method of claim 21; wherein the removing step comprises the steps of: calibrating region-by-region of the raw image a green (GR) channel gain for red rows and a green (GB) channel gain for blue rows, and applying, region-by-region, the GR channel gain to green pixels in the red rows and the GB channel gain to the green pixels in the blue rows calibrated for each respective region to remove the green channel odd-even mismatch.
  • 24. The method of claim 21; wherein the removing step comprises the steps of: for said each respective region in the raw image: generating a weighted center green pixel value based on a first weighting factor for a center green pixel; summing weighted green pixel values based on a second weighting factor for surrounding green pixels in a first tier layer with respect to the center green pixel of the region to form a first tier layer sum; summing weighted green pixel values based on a third weighting factor for surrounding green pixels in a second tier layer with respect to the center green pixel of the region to form a second tier layer sum; summing the weighted center green pixel value, the first tier layer sum and the second tier layer sum to form a weighted green pixel sum total; normalizing the weighted green pixel sum total; and replacing a pixel value of the center green pixel with the normalized weighted green pixel sum total.
  • 25. The method of claim 24; wherein the generating step comprises the step of: multiplying the center green pixel value by the first weighting factor to generate the weighted center green pixel value.
  • 26. The method of claim 24; further comprising the steps of: prior to the summing step for the first tier layer: comparing a pixel value for each green pixel in the first tier layer to a pixel maximum and a pixel minimum to determine if the pixel value for each green pixel in the first tier layer is within a range; for each green pixel in the first tier layer within the range, multiplying the pixel value of said each green pixel by the first weighting factor to form a corresponding within-range weighted green pixel value; and, for each green pixel in the first tier layer not within the range, multiplying the center green pixel value by the first weighting factor to form a corresponding out-of-range weighted green pixel value, wherein the summing step for the first tier layer sums the within-range weighted green pixel value for all green pixels in the first tier layer that are within the range and the out-of-range weighted green pixel value for all green pixels in the first tier layer that are beyond the range.
  • 27. The method of claim 26; further comprising the steps of: prior to the summing step for the second tier layer: comparing a pixel value for each green pixel in the second tier layer to the pixel maximum and the pixel minimum to determine if the pixel value for each green pixel in the second tier layer is within the range; for each green pixel in the second tier layer within the range, multiplying the pixel value of each green pixel by the second weighting factor to form a corresponding within-range weighted green pixel value; and for each green pixel in the second tier layer not within the range, multiplying the center green pixel value by the second weighting factor to form a corresponding out-of-range weighted green pixel value, wherein the summing step for the second tier layer sums the within-range weighted green pixel value for all green pixels in the second tier layer that are within the range and the out-of-range weighted green pixel value for all green pixels in the second tier layer that are beyond the range.
  • 28. The method of claim 26; further comprising the steps of: setting an upper bound threshold of a ratio of maximum Green mismatch (F_max); setting a lower bound threshold of the ratio of max Green mismatch (F_min); calculating an offset adaptive to red pixels surrounding the center green pixel (CGP) to remove spatial variant green channel odd-even mismatch; calculating the pixel maximum (P_max) based on an equation defined as P_max=max(F_max*CGP,CGP+offset); and calculating the pixel minimum (P_min) based on an equation defined as P_min=min(F_min*CGP,CGP−offset).
  • 29. The method of claim 28; wherein the offset calculating step comprises: multiplying k by a mean of surrounding red pixel values for the surrounding red pixels where k is a parameter that adjusts a magnitude of correction for cross talk.
  • 30. An adaptive green channel odd-even mismatch removal module comprising: means for dividing a raw image from a sensor into a plurality of regions; and, means for adaptively removing green channel odd-even mismatch in each region in the raw image to effectuate the disappearance of artifacts in a demosaic processed image.
  • 31. The module of claim 30; wherein the raw image is a raw Bayer image.
  • 32. The module of claim 30; wherein the removing module comprises: means for calibrating region-by-region of the raw image a green (GR) channel gain for red rows and a green (GB) channel gain for blue rows; and means for applying, region-by-region, the GR channel gain to green pixels in the red rows and the GB channel gain to green pixels in the blue rows calibrated for each respective region to remove the green channel odd-even mismatch.
  • 33. The module of claim 32; wherein the removing means comprises: means for generating a weighted center green pixel value based on a first weighting factor for a center green pixel; means for summing weighted green pixel values based on a second weighting factor for surrounding green pixels in a first tier layer with respect to the center green pixel of the region to form a first tier layer sum; means for summing weighted green pixel values based on a third weighting factor for surrounding green pixels in a second tier layer with respect to the center green pixel of the region to form a second tier layer sum; means for summing the weighted center green pixel value, the first tier layer sum and the second tier layer sum to form a weighted green pixel sum total; means for normalizing the weighted green pixel sum total; and means for replacing a pixel value of the center green pixel with the normalized weighted green pixel sum total.
  • 34. The module of claim 33; wherein the generating means comprises: means for multiplying the center green pixel value by the first weighting factor to generate the weighted center green pixel value.
  • 35. The module of claim 33; further comprising: means for comparing a pixel value for each green pixel in the first tier layer to a pixel maximum and a pixel minimum to determine if the pixel value for each green pixel in the first tier layer is within a range; means for multiplying, for each green pixel in the first tier layer within the range, the pixel value of a respective green pixel by the first weighting factor to form a corresponding within-range weighted green pixel value; means for multiplying, for each green pixel in the first tier layer not within the range, the center green pixel value by the first weighting factor to form a corresponding out-of-range weighted green pixel value; wherein the summing means for the first tier layer sums the within-range weighted green pixel value for all green pixels in the first tier layer that are within the range and the out-of-range weighted green pixel value for all green pixels in the first tier layer that are beyond the range.
  • 36. The module of claim 35; further comprising: means for comparing, prior to the summing for the second tier layer, a pixel value for each green pixel in the second tier layer to the pixel maximum and the pixel minimum to determine if the pixel value for each green pixel in the second tier layer is within the range; means for multiplying, for each green pixel in the second tier layer within the range, the pixel value of each green pixel by the second weighting factor to form a corresponding within-range weighted green pixel value; means for multiplying, for each green pixel in the second tier layer out of the range, the center green pixel value by the second weighting factor to form a corresponding out-of-range weighted green pixel value, wherein the summing means for the second tier layer sums the within-range weighted green pixel value for all green pixels in the second tier layer that are within the range and the out-of-range weighted green pixel value for all green pixels in the second tier layer that are out of the range.
  • 37. The module of claim 36; further comprising: means for setting an upper bound threshold of a ratio of maximum Green mismatch (F_max); means for setting a lower bound threshold of the ratio of max Green mismatch (F_min); means for calculating an offset adaptive to red pixels surrounding the center green pixel (CGP) to remove spatial variant green channel odd-even mismatch; means for calculating the pixel maximum (P_max) based on an equation defined as P_max=max(F_max*CGP,CGP+offset); and means for calculating the pixel minimum (P_min) based on an equation defined as P_min=min(F_min*CGP,CGP−offset).
  • 38. The module of claim 37; wherein the offset calculating means comprises: means for multiplying k by a mean of surrounding red pixel values of the surrounding red pixels where k is a parameter that adjusts a magnitude of correction for cross talk.
  • 39. A method for adaptive green channel odd-even mismatch removal comprising the steps of: for each region in a raw image: generating a weighted center green pixel value based on a first weighting factor for a center green pixel; summing weighted green pixel values based on a second weighting factor for surrounding green pixels in a first tier layer with respect to the center green pixel of the region to form a first tier layer sum; summing weighted green pixel values based on a third weighting factor for surrounding green pixels in a second tier layer with respect to the center green pixel of the region to form a second tier layer sum; summing the weighted center green pixel value, the first tier layer sum and the second tier layer sum to form a weighted green pixel sum total; normalizing the weighted green pixel sum total; and replacing a pixel value of the center green pixel with the normalized weighted green pixel sum total to remove the green channel odd-even mismatch.
  • 40. The method of claim 39; wherein the generating step comprises the step of: multiplying the center green pixel value by the first weighting factor to generate the weighted center green pixel value.
  • 41. The method of claim 40; further comprising the steps of: prior to the summing step for the first tier layer: comparing a pixel value for each green pixel in the first tier layer to a pixel maximum and a pixel minimum to determine if the pixel value for each green pixel in the first tier layer is within a range; for each green pixel in the first tier layer within the range, multiplying the pixel value of a respective green pixel by the first weighting factor to form a corresponding within-range weighted green pixel value; and, for each green pixel in the first tier layer not within the range, multiplying the center green pixel value by the first weighting factor to form a corresponding out-of-range weighted green pixel value, wherein the summing step for the first tier layer sums the within-range weighted green pixel value for all green pixels in the first tier layer that are within the range and the out-of-range weighted green pixel value for all green pixels in the first tier layer that are beyond the range.
  • 42. The method of claim 41; further comprising the steps of: prior to the summing step for the second tier layer: comparing a pixel value for each green pixel in the second tier layer to the pixel maximum and the pixel minimum to determine if the pixel value for each green pixel in the second tier layer is within the range; for each green pixel in the second tier layer within the range, multiplying the pixel value of a respective green pixel by the second weighting factor to form a corresponding within-range weighted green pixel value; and for each green pixel in the second tier layer not within the range, multiplying the center green pixel value by the second weighting factor to form a corresponding out-of-range weighted green pixel value, wherein the summing step for the second tier layer sums the within-range weighted green pixel value for all green pixels in the second tier layer that are within the range and the out-of-range weighted green pixel value for all green pixels in the second tier layer that are beyond the range.
  • 43. The method of claim 41; further comprising the steps of: setting an upper bound threshold of a ratio of maximum Green mismatch (F_max); setting a lower bound threshold of the ratio of max Green mismatch (F_min); calculating an offset adaptive to red pixels surrounding the center green pixel (CGP) to remove spatial variant green channel odd-even mismatch; calculating the pixel maximum (P_max) based on an equation defined as P_max=max(F_max*CGP,CGP+offset); and calculating the pixel minimum (P_min) based on an equation defined as P_min=min(F_min*CGP,CGP−offset).
  • 44. The method of claim 43; wherein the offset calculating step comprises: multiplying k by a mean of surrounding red pixel values for the surrounding red pixels where k is a parameter that adjusts a magnitude of correction for cross talk.
  • 45. An adaptive green channel odd-even mismatch removal module comprising: means for generating a weighted center green pixel value based on a first weighting factor for a center green pixel; means for summing weighted green pixel values based on a second weighting factor for surrounding green pixels in a first tier layer with respect to the center green pixel of the region to form a first tier layer sum; means for summing weighted green pixel values based on a third weighting factor for surrounding green pixels in a second tier layer with respect to the center green pixel of the region to form a second tier layer sum; means for summing the weighted center green pixel value, the first tier layer sum and the second tier layer sum to form a weighted green pixel sum total; means for normalizing the weighted green pixel sum total; and means for replacing a pixel value of the center green pixel with the normalized weighted green pixel sum total.
  • 46. The module of claim 45; wherein the generating means comprises: means for multiplying the center green pixel value by the first weighting factor to generate the weighted center green pixel value.
  • 47. The module of claim 45; further comprising: means for comparing a pixel value for each green pixel in the first tier layer to a pixel maximum and a pixel minimum to determine if the pixel value for each green pixel in the first tier layer is within a range; means for multiplying, for each green pixel in the first tier layer within the range, the pixel value of a respective green pixel by the first weighting factor to form a corresponding within-range weighted green pixel value; means for multiplying, for each green pixel in the first tier layer not within the range, the center green pixel value by the first weighting factor to form a corresponding out-of-range weighted green pixel value; wherein the summing means for the first tier layer sums the within-range weighted green pixel value for all green pixels in the first tier layer that are within the range and the out-of-range weighted green pixel value for all green pixels in the first tier layer that are beyond the range.
  • 48. The module of claim 47; further comprising: means for comparing, prior to summing by the summing means for the second tier layer, a pixel value for each green pixel in the second tier layer to the pixel maximum and the pixel minimum to determine if the pixel value for each green pixel in the second tier layer is within the range; means for multiplying, for each green pixel in the second tier layer within the range, the pixel value of a respective green pixel by the second weighting factor to form a corresponding within-range weighted green pixel value; means for multiplying, for each green pixel in the second tier layer out of the range, the center green pixel value by the second weighting factor to form a corresponding out-of-range weighted green pixel value, wherein the summing means for the second tier layer sums the within-range weighted green pixel value for all green pixels in the second tier layer that are within the range and the out-of-range weighted green pixel value for all green pixels in the second tier layer that are out of the range.
  • 49. The module of claim 38; further comprising: means for setting an upper bound threshold of a ratio of maximum Green mismatch (F_max); means for setting a lower bound threshold of the ratio of max Green mismatch (F_min); means for calculating an offset adaptive to red pixels surrounding the center green pixel (CGP) to remove spatial variant green channel odd-even mismatch; means for calculating the pixel maximum (P_max) based on an equation defined as P_max=max(F_max*CGP,CGP+offset); and means for calculating the pixel minimum (P_min) based on an equation defined as P_min=min(F_min*CGP,CGP−offset).
  • 50. The module of claim 49; wherein the offset calculating means comprises: means for multiplying k by a mean of surrounding red pixel values of the surrounding red pixels where k is a parameter that adjusts a magnitude of correction for cross talk.
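
For the alternative weighted-replacement approach recited in claims 24-29 and their counterparts (claims 33-38, 39-44 and 45-50), the sketch below gives one plausible reading of the per-pixel computation: the center green pixel (CGP) is replaced by a normalized weighted sum of itself and its first- and second-tier green neighbors, with any neighbor outside the [P_min, P_max] range contributing the center value instead, and with the range derived from F_max, F_min and an offset proportional to the mean of the surrounding red pixels. The specific weights, the parameter values, and the function and variable names are illustrative assumptions; the claims do not fix them.

```python
import numpy as np

# Illustrative parameters only; the claims leave the actual values open.
W0, W1, W2 = 4.0, 2.0, 1.0   # weights for the center, first tier and second tier
F_MAX, F_MIN = 1.10, 0.90    # upper/lower bound thresholds of the ratio of max Green mismatch
K = 0.05                     # adjusts the magnitude of the cross-talk correction

def balance_center_green(center, tier1, tier2, red_neighbors):
    """Replace one green pixel with a normalized weighted neighborhood sum.

    center        -- the center green pixel value (CGP)
    tier1, tier2  -- green neighbor values in the first and second tier layers
    red_neighbors -- red pixel values surrounding the center green pixel
    """
    offset = K * np.mean(red_neighbors)            # adaptive offset (cf. claim 29)
    p_max = max(F_MAX * center, center + offset)   # P_max = max(F_max*CGP, CGP+offset)
    p_min = min(F_MIN * center, center - offset)   # P_min = min(F_min*CGP, CGP-offset)

    def clip_to_center(values):
        # Out-of-range neighbors contribute the center value instead.
        v = np.asarray(values, dtype=float)
        return np.where((v >= p_min) & (v <= p_max), v, center)

    total = (W0 * center
             + W1 * clip_to_center(tier1).sum()    # first tier layer sum
             + W2 * clip_to_center(tier2).sum())   # second tier layer sum
    norm = W0 + W1 * len(tier1) + W2 * len(tier2)
    return total / norm                            # normalized weighted sum total
```

Applied across the raw image, this replacement would be computed for every green pixel, using, for example, the nearest surrounding green pixels as the first tier layer and the next ring of green pixels as the second tier layer; FIGS. 13A-13B show the kind of green and red pixel indexing on which such a neighborhood would be defined.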
Provisional Applications (2)
Number Date Country
60760769 Jan 2006 US
60759842 Jan 2006 US