METHOD FOR CORRECTING DISTORTION OF CAPTURED IMAGE, DEVICE FOR CORRECTING DISTORTION OF CAPTURED IMAGE, AND IMAGING DEVICE

Information

  • Patent Application
  • Publication Number
    20070188619
  • Date Filed
    February 09, 2007
  • Date Published
    August 16, 2007
Abstract
A method for correcting distortion of a captured image is provided. The method includes receiving a detection output from a sensor that detects a positional change of an imaging element in a horizontal direction and/or a vertical direction of a captured image at the time of imaging; optically correcting distortion of a captured image in image data from the imaging element due to the positional change of the imaging element by controlling a control mechanism depending on the detection output received in the receiving step, the control mechanism being configured to control a position at which incident light from a target object enters the imaging element; receiving, from the imaging element, image data resulting from the optical correction in the correcting step, and detecting a motion vector per one screen of a captured image from the image data; and further correcting distortion of a captured image due to the positional change of the imaging element for the image data from the imaging element based on the motion vector detected in the motion vector detecting step.
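
The two-stage correction summarized above, an optical stage driven by the sensor output followed by an electronic stage driven by a per-screen motion vector, can be illustrated with the minimal Python sketch below. The function names, the proportional shift command, the translation-based residual correction, and the injected placeholders apply_shift and detect_motion_vector are assumptions made only for illustration; they are not taken from the disclosure.

    import numpy as np

    def optical_shift_command(sensor_output, gain=1.0):
        # Hypothetical proportional control of the mechanism that steers incident
        # light (e.g. a lens-shift or sensor-shift unit): move opposite to the
        # detected positional change of the imaging element.
        return (-gain * sensor_output[0], -gain * sensor_output[1])

    def electronic_correction(frame, motion_vector):
        # Second-stage correction: translate the optically corrected frame to
        # cancel the residual motion described by the per-screen motion vector.
        dx, dy = motion_vector
        return np.roll(np.roll(frame, -dy, axis=0), -dx, axis=1)

    def correct_frame(frame, prev_frame, sensor_output, detect_motion_vector, apply_shift):
        # Stage 1: optical correction driven directly by the sensor output.
        apply_shift(optical_shift_command(sensor_output))
        # Stage 2: a motion vector per screen is detected from the already
        # optically corrected image data and used for further correction.
        motion_vector = detect_motion_vector(prev_frame, frame)
        return electronic_correction(frame, motion_vector)

A block-matching form of detect_motion_vector consistent with the claims is sketched after the claims section.
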
Description

BRIEF DESCRIPTION OF THE FIGURES


FIG. 1 is a block diagram showing a configuration example of an imaging device to which a method for correcting distortion of a captured image according to a first embodiment is applied;



FIG. 2 is a diagram for explaining the outline of processing for detecting a motion vector in an embodiment;



FIGS. 3A and 3B are diagrams for explaining the outline of processing for detecting a motion vector in an embodiment;



FIG. 4 is a diagram for explaining the outline of processing for detecting a motion vector in an embodiment;



FIGS. 5A and 5B are diagrams for explaining a first example of processing for detecting an accurate motion vector in a method for detecting a motion vector according to an embodiment;



FIG. 6 is a diagram for explaining the first example of processing for detecting an accurate motion vector in a method for detecting a motion vector according to an embodiment;



FIG. 7 is a diagram for explaining the outline of processing for detecting a motion vector in an embodiment;



FIG. 8 is a diagram for explaining the outline of processing for detecting a motion vector in an embodiment;



FIGS. 9A and 9B are diagrams for explaining the first example of processing for detecting an accurate motion vector in a method for detecting a motion vector according to an embodiment;



FIG. 10 is a diagram for explaining the first example of processing for detecting an accurate motion vector in a method for detecting a motion vector according to an embodiment;



FIG. 11 is a diagram for explaining the first example of processing for detecting an accurate motion vector in a method for detecting a motion vector according to an embodiment;



FIGS. 12A and 12B are diagrams for explaining the first example of processing for detecting an accurate motion vector in a method for detecting a motion vector according to an embodiment;



FIGS. 13A to 13D are diagrams for explaining the first example of processing for detecting an accurate motion vector in a method for detecting a motion vector according to an embodiment;



FIG. 14 is a diagram for explaining the first example of processing for detecting an accurate motion vector in a method for detecting a motion vector according to an embodiment;



FIG. 15 is a diagram for explaining the first example of processing for detecting an accurate motion vector in a method for detecting a motion vector according to an embodiment;



FIGS. 16A and 16B are diagrams for explaining a second example of processing for detecting an accurate motion vector in a method for detecting a motion vector according to an embodiment;



FIG. 17 is a diagram for explaining the second example of processing for detecting an accurate motion vector in a method for detecting a motion vector according to an embodiment;



FIG. 18 is a diagram for explaining the second example of processing for detecting an accurate motion vector in a method for detecting a motion vector according to an embodiment;



FIGS. 19A to 19D are diagrams for explaining the second example of processing for detecting an accurate motion vector in a method for detecting a motion vector according to an embodiment;



FIG. 20 is a diagram for explaining the processing performance in a method for detecting a motion vector according to an embodiment;



FIG. 21 is a diagram for explaining the outline of a method for detecting a motion vector according to an embodiment;



FIG. 22 is a diagram for explaining a feature of a method for detecting a motion vector according to an embodiment, based on a comparison with an existing method;



FIG. 23 is a diagram for explaining a feature of a method for detecting a motion vector according to an embodiment, based on a comparison with an existing method;



FIG. 24 is a diagram for explaining a feature of a method for detecting a motion vector according to an embodiment, based on a comparison with an existing method;



FIG. 25 is a part of a flowchart for explaining a first example of motion vector detection processing of the imaging device according to the first embodiment;



FIG. 26 is a part of the flowchart for explaining the first example of the motion vector detection processing of the imaging device according to the first embodiment;



FIG. 27 is a part of a flowchart for explaining a second example of the motion vector detection processing of the imaging device according to the first embodiment;



FIG. 28 is a part of the flowchart for explaining the second example of the motion vector detection processing of the imaging device according to the first embodiment;



FIG. 29 is a part of a flowchart for explaining a third example of the motion vector detection processing of the imaging device according to the first embodiment;



FIG. 30 is a part of the flowchart for explaining the third example of the motion vector detection processing of the imaging device according to the first embodiment;



FIG. 31 is a part of the flowchart for explaining the third example of the motion vector detection processing of the imaging device according to the first embodiment;



FIG. 32 is a part of the flowchart for explaining the third example of the motion vector detection processing of the imaging device according to the first embodiment;



FIG. 33 is a block diagram showing a configuration example of an imaging device to which a method for correcting distortion of a captured image according to a second embodiment is applied;



FIG. 34 is a diagram for explaining motion vector detection processing of the imaging device of the second embodiment;



FIG. 35 is a diagram for explaining the motion vector detection processing of the imaging device of the second embodiment;



FIGS. 36A and 36B are diagrams for explaining a major part of the motion vector detection processing of the imaging device of the second embodiment;



FIGS. 37A and 37B are diagrams for explaining a major part of the motion vector detection processing of the imaging device of the second embodiment;



FIG. 38 is a block diagram showing a configuration example of a major part of a motion vector detector 15 in the imaging device of the second embodiment;



FIG. 39 is a part of a flowchart for explaining an example of the motion vector detection processing of the imaging device of the second embodiment;



FIG. 40 is a part of the flowchart for explaining the example of the motion vector detection processing of the imaging device of the second embodiment;



FIG. 41 is a block diagram showing a configuration example of an imaging device to which a method for correcting distortion of a captured image according to a third embodiment of the invention is applied;



FIG. 42 is a part of a flowchart for explaining an example of motion vector detection processing of the imaging device of the third embodiment;



FIG. 43 is a part of the flowchart for explaining the example of the motion vector detection processing of the imaging device of the third embodiment;



FIG. 44 is a block diagram showing a configuration example of an imaging device to which a method for correcting distortion of a captured image according to a fourth embodiment is applied;



FIG. 45 is a flowchart for explaining a processing operation example of an image blur correction processor in the imaging device of the fourth embodiment;



FIGS. 46A and 46B are diagrams for explaining an inter-frame camera shake in a captured image;



FIG. 47 is a diagram for explaining motion vector detection processing employing block matching;



FIG. 48 is a diagram for explaining the motion vector detection processing employing block matching;



FIG. 49 is a flowchart for explaining the motion vector detection processing employing block matching; and



FIG. 50 is a diagram for explaining the motion vector detection processing employing block matching.


Claims
  • 1. A method for correcting distortion of a captured image, the method comprising: receiving a detection output from a sensor that detects a change corresponding to a positional change of an imaging element in a horizontal direction and/or a vertical direction of a captured image at the time of imaging; optically correcting distortion of a captured image in image data from the imaging element due to the positional change of the imaging element by controlling a control mechanism depending on the detection output received from the sensor in the detection output receiving step, the control mechanism being configured to control a position at which incident light from a target object enters the imaging element; receiving image data resulting from the optical correction of a captured image in the correcting step from the imaging element, and detecting a motion vector per one screen of a captured image from the image data; and further correcting distortion of a captured image due to the positional change of the imaging element for the image data from the imaging element based on the motion vector detected in the motion vector detecting step.
  • 2. The method for correcting distortion of a captured image according to claim 1, wherein in the motion vector detecting step, the motion vector is detected from the image data from the imaging element with reference to a detection output from the sensor.
  • 3. The method for correcting distortion of a captured image according to claim 1, wherein when detecting the motion vector, at least one target block that is composed of a plurality of pixels and has a predetermined size is defined at a predetermined position in each of divided image regions in an original screen, a plurality of reference blocks each having the same size as the size of the target block are defined in a search range defined in a reference screen that is a screen of interest, a reference block having a strong correlation with the target block is detected from the plurality of reference blocks, and the motion vector is detected based on an amount of a position shift on a screen between the detected reference block and the target block.
  • 4. The method for correcting distortion of a captured image according to claim 3, wherein detecting the motion vector includes: obtaining a sum of an absolute difference between a pixel value of each pixel in the reference block and a pixel value of a pixel at a corresponding position in the target block; obtaining a reference reduced vector arising from reduction in a size of a reference vector by a predetermined reduction ratio, the reference vector indicating an amount and a direction of a position shift between a position of the reference block on the reference screen and a position of the target block on a screen; creating a reduced sum-of-absolute-difference table in which values obtained based on a sum of an absolute difference about each of the reference blocks are stored as a plurality of table elements, the number of the table elements being obtained by reducing the number of the reference blocks depending on the predetermined reduction ratio, the table elements each having an address indicated by a reference vector with a size near or equivalent to a size of a reference reduced vector; and calculating a motion vector that corresponds to a difference between the reference screen and the original screen for each of the divided image regions by using at least a reference vector corresponding to a minimum value among the values stored in the reduced sum-of-absolute-difference table, and creating the table includes: detecting a plurality of reference vectors near to the reference reduced vector obtained in the reference reduced vector obtaining step, as a plurality of near reference vectors; calculating distribution values, each corresponding to a respective one of the near reference vectors detected in the detecting step, from the sum of an absolute difference calculated in the sum of absolute difference obtaining step; and adding the distribution values calculated in the distribution value calculating step to previous values in the reduced sum-of-absolute-difference table each corresponding to a respective one of the near reference vectors.
  • 5. The method for correcting distortion of a captured image according to claim 3, wherein in the motion vector detecting step, the motion vector is detected after the search range is defined with reference to a detection output from the sensor.
  • 6. The method for correcting distortion of a captured image according to claim 4, wherein in the motion vector detecting step, the motion vector is detected after the search range is defined with reference to a detection output from the sensor.
  • 7. A method for correcting distortion of a captured image, the method comprising: receiving a detection output from a sensor that detects a positional change of an imaging element in a horizontal direction and/or a vertical direction of a captured image at the time of imaging; optically correcting distortion of a captured image in image data from the imaging element due to the positional change of the imaging element by controlling a control mechanism depending on the detection output received from the sensor in the detection output receiving step, the control mechanism being configured to control a position at which incident light from a target object enters the imaging element; and correcting distortion of a captured image due to the positional change of the imaging element with reference to a detection output from the sensor, for image data from the imaging element resulting from the optical correction of a captured image in the optical correcting step.
  • 8. The method for correcting distortion of a captured image according to claim 7, wherein when correcting the distortion, if it is determined based on the reference to the detection output from the sensor that the positional change of the imaging element has occurred, a blur is detected about the image data from the imaging element and correction is carried out so that the blur is removed.
  • 9. A device for correcting distortion of a captured image, comprising: a sensor configured to detect a positional change of an imaging element in a horizontal direction and/or a vertical direction of a captured image at the time of imaging; an optical corrector configured to optically correct distortion of a captured image in image data from the imaging element due to the positional change of the imaging element by controlling a control mechanism depending on a detection output from the sensor, the control mechanism being configured to control a position at which incident light from a target object enters the imaging element; a motion vector detector configured to receive image data resulting from the optical correction of a captured image from the imaging element and detect a motion vector per one screen of a captured image from the image data; and an image distortion correction processor configured to further correct distortion of a captured image due to the positional change of the imaging element for the image data from the imaging element based on the motion vector detected by the motion vector detector.
  • 10. An imaging device comprising: a sensor configured to detect a positional change of an imaging element in a horizontal direction and/or a vertical direction of a captured image at the time of imaging; an optical corrector configured to optically correct distortion of a captured image in image data from the imaging element due to the positional change of the imaging element by controlling a control mechanism depending on a detection output from the sensor, the control mechanism being configured to control a position at which incident light from a target object enters the imaging element; a motion vector detector configured to receive image data resulting from the optical correction of a captured image from the imaging element and detect a motion vector per one screen of a captured image from the image data; an image distortion correction processor configured to further correct distortion of a captured image due to the positional change of the imaging element for the image data from the imaging element based on the motion vector detected by the motion vector detector; and a recorder configured to record image information of the captured image corrected by the image distortion correction processor in a recording medium.
  • 11. The imaging device according to claim 10, wherein the motion vector detector detects the motion vector from the image data from the imaging element with reference to a detection output from the sensor.
  • 12. The imaging device according to claim 10, wherein the motion vector detector calculates a motion vector that corresponds to a difference between a reference screen that is a screen of interest and an original screen previous to the reference screen, at least one target block that is composed of a plurality of pixels and has a predetermined size is defined at a predetermined position in each of the divided image regions in the original screen, and a plurality of reference blocks each having the same size as the size of the target block are defined in a search range defined in the reference screen, a reference block having a strong correlation with the target block is detected from the plurality of reference blocks, and the motion vector is detected based on an amount of a position shift on a screen between the detected reference block and the target block.
  • 13. The imaging device according to claim 12, wherein the motion vector detector includes: a sum-of-absolute-difference calculator that obtains a sum of an absolute difference between a pixel value of each pixel in the reference block and a pixel value of a pixel at a corresponding position in the target block; a reference reduced vector obtainer that obtains a reference reduced vector arising from reduction in a size of a reference vector by a predetermined reduction ratio, the reference vector indicating an amount and a direction of a position shift between a position of the reference block on the reference screen and a position of the target block on a screen; a table creator that creates a reduced sum-of-absolute-difference table in which values obtained based on a sum of an absolute difference about each of the reference blocks are stored as a plurality of table elements, the number of the table elements being obtained by reducing the number of the reference blocks depending on the predetermined reduction ratio, the table elements each having an address indicated by a reference vector with a size near or equivalent to a size of a reference reduced vector; and a motion vector calculator that calculates a motion vector corresponding to a difference between the reference screen and the original screen for each of the divided image regions by using at least a reference vector corresponding to a minimum value among the values stored in the reduced sum-of-absolute-difference table, and wherein the table creator includes: a near reference vector detector that detects a plurality of reference vectors near to the reference reduced vector obtained by the reference reduced vector obtainer, as a plurality of near reference vectors; a distribution value calculator that calculates distribution values each corresponding to a respective one of the near reference vectors from the sum of an absolute difference of the reference block, calculated by the sum-of-absolute-difference calculator; and a distribution adder that adds the distribution values each corresponding to a respective one of the near reference vectors, calculated by the distribution value calculator, to previous values in the reduced sum-of-absolute-difference table each corresponding to a respective one of the near reference vectors.
  • 14. The imaging device according to claim 12, wherein the motion vector detector detects the motion vector after defining the search range with reference to a detection output from the sensor.
  • 15. The imaging device according to claim 13, wherein the motion vector detector detects the motion vector after defining the search range with reference to a detection output from the sensor.
  • 16. An imaging device comprising: a sensor configured to detect a positional change of an imaging element in a horizontal direction and/or a vertical direction of a captured image at the time of imaging; an optical corrector configured to optically correct distortion of a captured image in image data from the imaging element due to the positional change of the imaging element by controlling a control mechanism depending on a detection output from the sensor, the control mechanism being configured to control a position at which incident light from a target object enters the imaging element; and an image distortion correction processor configured to correct distortion of a captured image due to the positional change of the imaging element with reference to a detection output from the sensor, for image data from the imaging element resulting from the optical correction of a captured image by the optical corrector.
  • 17. The imaging device according to claim 16, wherein the image distortion correction processor detects a blur about the image data from the imaging element and carries out correction so that the blur is removed, if the image distortion correction processor has determined based on the reference to the detection output from the sensor that the positional change of the imaging element has occurred.
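
As an illustration of the block-matching search recited in claims 3 and 12, the short Python sketch below finds, for a single target block, the reference block with the strongest correlation (minimum sum of absolute differences) inside a search range and returns the resulting position shift as the motion vector. The block size, search range, and function name are assumed values chosen for the sketch and are not taken from the disclosure.

    import numpy as np

    def block_matching_vector(original, reference, top_left, block=16, search=16):
        # Target block of block x block pixels at `top_left` in the original screen.
        y0, x0 = top_left
        target = original[y0:y0 + block, x0:x0 + block].astype(np.int32)
        best_sad, best_vec = None, (0, 0)
        # Reference blocks of the same size inside the search range of the reference screen.
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                y, x = y0 + dy, x0 + dx
                if y < 0 or x < 0 or y + block > reference.shape[0] or x + block > reference.shape[1]:
                    continue
                candidate = reference[y:y + block, x:x + block].astype(np.int32)
                sad = int(np.abs(candidate - target).sum())
                # The reference block with the minimum SAD is taken as the
                # strongest correlation with the target block.
                if best_sad is None or sad < best_sad:
                    best_sad, best_vec = sad, (dx, dy)
        # The motion vector is the position shift on the screen between the
        # detected reference block and the target block.
        return best_vec

Since the claims define at least one target block per divided image region, a per-screen motion vector would be obtained by running such a search for each region and aggregating the per-region results; the aggregation policy is not specified in this sketch.
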
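Claims 4 and 13 further describe reducing each reference vector by a predetermined reduction ratio and distributing the corresponding sum of absolute differences over the table elements addressed by reference vectors near the reference reduced vector. The sketch below is one possible reading under assumed choices, namely a reduction ratio of 1/4, bilinear distribution weights, and a dictionary of per-reference-vector SAD values as input; it is illustrative only and not the patented implementation.

    import numpy as np

    def build_reduced_sad_table(sads, search=16, reduction=4):
        # `sads` maps each reference vector (dx, dy) to its sum of absolute differences.
        # The table holds one element per reduced reference vector in
        # [-search/reduction, +search/reduction] along each axis.
        half = search // reduction
        size = 2 * half + 1
        table = np.zeros((size, size))
        for (dx, dy), sad in sads.items():
            rx, ry = dx / reduction, dy / reduction          # reference reduced vector
            x0, y0 = int(np.floor(rx)), int(np.floor(ry))
            fx, fy = rx - x0, ry - y0
            # Distribute the SAD over the (up to four) near reference vectors,
            # here with bilinear weights as an assumed distribution rule.
            for ix, wx in ((x0, 1.0 - fx), (x0 + 1, fx)):
                for iy, wy in ((y0, 1.0 - fy), (y0 + 1, fy)):
                    w = wx * wy
                    tx, ty = ix + half, iy + half
                    if w > 0 and 0 <= tx < size and 0 <= ty < size:
                        table[ty, tx] += w * sad             # add the distribution value
        return table

    def motion_vector_from_table(table, reduction=4):
        # Take the reference vector corresponding to the minimum table value and
        # scale it back by the reduction ratio; the claims also allow using
        # values around that minimum for finer interpolation.
        half = table.shape[0] // 2
        ty, tx = np.unravel_index(np.argmin(table), table.shape)
        return ((tx - half) * reduction, (ty - half) * reduction)

Because the number of table elements is reduced relative to the number of reference blocks by the reduction ratio, as the claims state, the correlation table is correspondingly smaller than a full SAD table, which is the practical motivation suggested by this arrangement.
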
Priority Claims (1)
Number         Date      Country  Kind
P2006-035341   Feb 2006  JP       national