This nonprovisional application claims priority under 35 U.S.C. §119(a) on Patent Application No. 2011-018449 filed in Japan on Jan. 31, 2011, the entire contents of which are hereby incorporated by reference.
1. Field of the Invention
The present invention relates to an image pickup apparatus for photographing images, an image reproduction apparatus for reproducing images, and an image processing apparatus for performing image processing.
2. Description of Related Art
In an image pickup apparatus such as a digital camera, both optical zoom and digital zoom are usually used for realizing high magnification zoom. When both the optical zoom and the digital zoom are used, the product of the optical zoom magnification and the digital zoom magnification is obtained as an output zoom magnification, so that high magnification zoom can be realized.
In this type of image pickup apparatus, when the output zoom magnification is increased from one, as illustrated in
Note that there is a method of blurring a background by image processing.
Here, the taken image can include a focused subject that is in focus and an out-of-focus subject that is not in focus (background subject). If the optical zoom magnification is increased in a state where a noted subject is in focus, a focal length of the image pickup portion changes along with the increase of the optical zoom magnification, and a blur amount of the out-of-focus subject also changes along with the change of the focal length. As illustrated in
On the other hand, the digital zoom is realized by trimming without a change of focal length. Therefore, when the digital zoom magnification is increased, a size of each subject image in the taken image is increased, but the blur amount of the out-of-focus subject is not changed.
Therefore, when the output zoom magnification is increased, the blur amount is gradually increased in the optical zoom region, but after changing from the optical zoom region to the digital zoom region, the blur amount does not change in the taken image. Therefore, if the taken image in the digital zoom region is provided to a user as it is, the user may have a wrong feeling. Otherwise, the user may be dissatisfied that a change of image quality accompanying a change of the output zoom magnification cannot be obtained. It is considered that this wrong feeling or the like will be relieved if the change of image quality obtained by the optical zoom and the change of image quality obtained by the digital zoom are similar to each other, which will be a merit for the user.
A similar thing can be said about a reproduction step or the like of the image. After the image is photographed, angle of view adjustment can be performed by trimming. For instance, if a reproduction enlarging magnification in the reproduction is set to be larger than one, enlarging reproduction is performed via the trimming. The decrease of the angle of view by trimming after finishing photography also functions as a substitute means for decreasing the angle of view by the optical zoom, which cannot be or is not performed during photography. Therefore, when the angle of view is decreased by the trimming, an effect as if the angle of view adjustment had been performed by the optical zoom can be obtained, which will be a great merit for the user.
An image pickup apparatus according to the present invention includes a target image generating portion that generates a target image by photography using optical zoom and digital zoom, and an output image generating portion that generates an output image by performing image processing on the target image. An entire image region of the target image includes a first image region and a second image region having a focus degree lower than that of the first image region. The image processing includes a blurring process for blurring an image in the second image region of the target image, and the output image generating portion performs the blurring process in accordance with a magnification of the digital zoom.
An image reproduction apparatus according to the present invention includes a target image generating portion that generates a target image by enlarging an input image in accordance with a designated reproduction enlarging magnification, an output image generating portion that generates an output image by performing image processing on the target image, and a display portion that displays the output image. An entire image region of the target image includes a first image region and a second image region having a focus degree lower than that of the first image region. The image processing includes a blurring process for blurring an image in the second image region of the target image, and the output image generating portion performs the blurring process in accordance with the reproduction enlarging magnification.

An image processing apparatus according to the present invention includes a target image generating portion that sets a clipping frame for designating a part of an input image and extracts an image in the clipping frame so as to generate a target image, and an output image generating portion that performs image processing on the target image so as to generate an output image. The image processing includes a blurring process for blurring an image of a subject at an out-of-focus distance, and the output image generating portion performs the blurring process in accordance with a size of the clipping frame.
Hereinafter, examples of an embodiment of the present invention are described specifically with reference to the attached drawings. In the drawings to be referred to, the same part is denoted by the same numeral or symbol, and overlapping description of the same part is omitted as a rule. Note that in this specification, for simple description, a name of information, physical quantity, state quantity, a member or the like corresponding to the numeral or symbol may be shortened or omitted by adding the numeral or symbol referring to the information, the physical quantity, the state quantity, the member or the like. For instance, when an optical zoom magnification is denoted by symbol ZFOPT, the optical zoom magnification ZFOPT may be expressed by a magnification ZFOPT or simply by ZFOPT.
The image pickup apparatus 1 includes an image pickup portion 11, an analog front end (AFE) 12, a main control portion 13, an internal memory 14, a display portion 15, a recording medium 16, and an operating portion 17. Note that the display portion 15 can instead be disposed in an external device (not shown) of the image pickup apparatus 1.
The image pickup portion 11 photographs a subject using an image sensor.
The image sensor 33 is constituted of a plurality of light receiving pixels arranged in horizontal and vertical directions. The light receiving pixels of the image sensor 33 perform photoelectric conversion of an optical image of the subject entering through the optical system 35 and the aperture stop 32, so as to deliver an electric signal obtained by the photoelectric conversion to the analog front end (AFE) 12.
The AFE 12 amplifies an analog signal output from the image pickup portion 11 (image sensor 33) and converts the amplified analog signal into a digital signal so as to deliver the digital signal to the main control portion 13. An amplification degree of the signal amplification in the AFE 12 is controlled by the main control portion 13. The main control portion 13 performs image processing on the image expressed by the output signal of the AFE 12 and generates an image signal of the image after the image processing. The main control portion 13 also has a function as a display control portion that controls display content of the display portion 15 so as to perform control necessary for the display on the display portion 15.
The internal memory 14 is constituted of a synchronous dynamic random access memory (SDRAM) or the like and temporarily stores various data generated in the image pickup apparatus 1.
The display portion 15 is a display device having a display screen such as a liquid crystal display panel so as to display taken images or images recorded in the recording medium 16 under control of the main control portion 13. In this specification, when referred to simply as a display or a display screen, it means the display or the display screen of the display portion 15. The display portion 15 is equipped with a touch panel 19, so that a user can issue a specific instruction to the image pickup apparatus 1 by touching the display screen of the display portion 15 with a touching member (such as a finger or a touch pen). Note that it is possible to omit the touch panel 19.
The recording medium 16 is a nonvolatile memory such as a card-like semiconductor memory or a magnetic disk so as to record an image signal of the taken image under control of the main control portion 13. The operating portion 17 includes a shutter button 20 for receiving an instruction to take a still image, a zoom button 21 for receiving an instruction to change a zoom magnification and the like, so as to receive various operations from the outside. Operational content of the operating portion 17 is sent to the main control portion 13. The operating portion 17 and the touch panel 19 can be called a user interface for receiving user's arbitrary instruction and operation. The shutter button 20 and the zoom button 21 may be buttons on the touch panel 19.
Action modes of the image pickup apparatus 1 include a photographing mode in which images (still images or moving images) can be taken and recorded, and a reproducing mode in which images (still images or moving images) recorded in the recording medium 16 can be reproduced and displayed on the display portion 15. Transition between the modes is performed in accordance with an operation to the operating portion 17.
In the photographing mode, a subject is photographed periodically at a predetermined frame period so that taken images of the subject are sequentially obtained. An image signal expressing an image is also referred to as image data. The image signal contains a luminance signal and a color difference signal, for example. However, in this specification, an output signal of each light receiving pixel of the image sensor 33 may be also referred to as image data. Image data of a certain pixel may be referred to as a pixel signal. A size of a certain image or a size of an image region may be referred to as an image size. An image size of a noted image or a noted image region can be expressed by the number of pixels forming the noted image or the number of pixels belonging to the noted image region. Note that in this specification, image data of a certain image may be referred to simply as an image.
With reference to
In similar consideration, as illustrated in
In addition, an indicator corresponding to the diameter of the image 310′ is referred to as a focus degree. In the noted image 320, as the diameter of the image 310′ is larger, the focus degree of the subject as the point light source 310 (namely, the focus degree of the image 310′) is lower. As the diameter of the image 310′ is smaller, the focus degree of the subject as the point light source 310 (namely, the focus degree of the image 310′) is higher. Therefore, the focus degree in the out-of-focus region is lower than the focus degree in the focused region. Note that an arbitrary image mentioned in this specification is a two-dimensional image unless otherwise noted.
A distance in the real space between an arbitrary subject 330 and the image pickup apparatus 1 (more specifically, the image sensor 33) is referred to as a subject distance (see
As illustrated in
Hereinafter, a plurality of examples concerning structures and actions or other techniques of the image pickup apparatus 1 are described. Contents described in an example can be applied to other examples unless otherwise noted and as long as no contradiction arises.
A first example of the present invention is described. As illustrated in
An image constituted of the light receiving pixel signals output from the light receiving pixels in the effective pixel region 33A is referred to as an original image. In the first example, the angle of view of the original image is supposed to be the same as the angle of view of the image formed in the entire effective pixel region 33A. The angle of view of the original image is an angle indicating a range of a photographing space expressed by the original image (the same is true for the angle of view of any other image than the original image).
The user can use the zoom button 21 so as to perform a zoom operation for designating the output zoom magnification. The zoom control portion 51 sets the optical zoom magnification and the electronic zoom magnification from the output zoom magnification designated in the zoom operation. The output zoom magnification, the optical zoom magnification, and the electronic zoom magnification are denoted by symbols ZFOUT, ZFOPT, and ZFEL, respectively. In the first example, the optical zoom magnification and the electronic zoom magnification are set so that the equation ZFOUT=ZFOPT×ZFEL is satisfied.
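For illustration only, the following non-limiting Python sketch shows one way the zoom control portion 51 might split the output zoom magnification so that ZFOUT=ZFOPT×ZFEL is satisfied. The policy of exhausting the optical zoom first and the upper limit of 5 for the optical zoom magnification are assumptions for this sketch, not limitations of the invention.

```python
def split_zoom(zf_out, zf_opt_max=5.0):
    """Split the output zoom magnification into an optical part and an
    electronic part so that zf_out == zf_opt * zf_el.

    Hypothetical policy: use the optical zoom up to its maximum first,
    and cover the remainder with the electronic zoom."""
    zf_opt = min(zf_out, zf_opt_max)   # optical zoom saturates at zf_opt_max
    zf_el = zf_out / zf_opt            # electronic zoom covers the rest
    return zf_opt, zf_el
```

For example, an output zoom magnification of 8 under this policy yields an optical zoom magnification of 5 and an electronic zoom magnification of 1.6.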
The optical zoom processing portion 52 controls a position of the zoom lens 30 so that the angle of view of the original image becomes the angle of view according to the optical zoom magnification set by the zoom control portion 51. In other words, by controlling the position of the zoom lens 30 in accordance with the optical zoom magnification, the optical zoom processing portion 52 determines the angle of view of the image formed on the effective pixel region 33A of the image sensor 33. Here, it is supposed that when the optical zoom magnification becomes k1 times a given magnification, the angle of view of the image formed on the effective pixel region 33A becomes 1/k1 times in each of the horizontal and vertical directions of the image sensor 33 (k1 is a positive number, for example, two). However, it is possible to define the magnification by an area ratio and to consider that when the optical zoom magnification becomes k1^2 times a given magnification, the angle of view of the image formed on the effective pixel region 33A becomes 1/k1 times in each of the horizontal and vertical directions of the image sensor 33.
The original image obtaining portion 53 obtains image data of the original image taken with the optical zoom magnification set by the zoom control portion 51. It is possible to consider that structural elements of the original image obtaining portion 53 include the image sensor 33 and the AFE 12.
The electronic zoom processing portion 54 generates the target image by performing electronic zooming process according to the electronic zoom magnification set by the zoom control portion 51 on the original image. The electronic zooming process means a process of setting a clipping frame having a size corresponding to the electronic zoom magnification in the image region of the original image as illustrated in
Here, it is supposed that a center position of the original image is the same as a center position of the clipping frame, and an aspect ratio of the original image is the same as an aspect ratio of the clipping frame (namely, an aspect ratio of the clipped image). However, the center position or the aspect ratio may be different between the original image and the clipping frame. Here, it is supposed that when the electronic zoom magnification is k2, a size of the clipping frame becomes 1/k2 of a size of the original image in each of the horizontal and vertical directions (k2≧1). However, it is possible to define the magnification by an area ratio and to consider that when the electronic zoom magnification is k2^2, a size of the clipping frame becomes 1/k2 times a size of the original image in each of the horizontal and vertical directions. If k2 is larger than one, the clipped image is a part of the original image. If k2 is equal to one, the clipped image is the original image itself.
The image size of the target image is the same as the image size of the output image, which is predetermined. The predetermined image size of the target image and the output image is referred to as a specified output size. The electronic zoom processing portion 54 performs the above-mentioned image size enlargement process using known resolution conversion (resampling) so that a target image having the specified output size can be obtained. However, if the specified output size is smaller than the image size of the clipped image, instead of the image size enlargement process in which the image size of the clipped image is increased, image size reduction process for reducing the image size of the clipped image is used. In other words, the image size enlargement process or the image size reduction process is performed on the clipped image so that the image size of the clipped image after the image size enlargement process or the image size reduction process becomes the same as the specified output size.
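For illustration only, the electronic zooming process described above (setting a centered clipping frame of 1/k2 size, extracting the clipped image, and resampling it to the specified output size) can be sketched as follows. The use of nearest-neighbour resampling is an assumption of this sketch; any known resolution conversion may be used.

```python
def electronic_zoom(original, k2, out_w, out_h):
    """Clip a centered frame of 1/k2 size from the original image and
    resample it to the specified output size (out_w x out_h).

    original: two-dimensional list of pixel values (rows of columns).
    k2: electronic zoom magnification (k2 >= 1)."""
    h, w = len(original), len(original[0])
    cw, ch = int(w / k2), int(h / k2)          # clipping-frame size
    x0, y0 = (w - cw) // 2, (h - ch) // 2      # frame centered on the image
    clipped = [row[x0:x0 + cw] for row in original[y0:y0 + ch]]
    # nearest-neighbour resampling to the specified output size
    return [[clipped[int(y * ch / out_h)][int(x * cw / out_w)]
             for x in range(out_w)] for y in range(out_h)]
```

When k2 equals one, the clipped image is the original image itself and only the resolution conversion remains, consistent with the description above.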
In the example of
The output image generating portion 55 performs the image processing according to the electronic zoom magnification on the target image so as to generate the output image (details will be described later).
Here, it is supposed that a plurality of subjects exist within the photographing range of the image pickup portion 11 and that the plurality of subjects includes subjects SUBA and SUBB. In addition, as illustrated in
In
When an arbitrary original image is taken, it is supposed that the subject SUBA is a focused subject while the subject SUBB is an out-of-focus subject. In other words, regardless of the optical zoom magnification, it is supposed that the subject SUBA is always a focused subject while the subject SUBB is always an out-of-focus subject in the original image and the target image. Then, in the arbitrary original image and the arbitrary target image, the blur amount of the subject SUBA becomes a minimum value, while the blur amount of the subject SUBB is larger than a blur amount of the subject SUBA. In the noted image 320 as an arbitrary two-dimensional image (see
A blur amount of the subject SUBB with respect to the blur amount of the subject SUBA or the reference diameter RREF is particularly referred to as a target blur amount (see
In order to avoid an occurrence of such wrong feeling or the like, the output image generating portion 55 generates the output image by performing image processing according to the electronic zoom magnification (hereinafter, referred to also as specific image processing) on the target image. Image data of the output image can be recorded in the recording medium 16, and the output image can be displayed on the display portion 15. However, it is also possible to record image data of the original image or the target image in the recording medium 16, and it is also possible to display the original image or the target image on the display portion 15.
When the specific image processing is realized, blur amount characteristic information 450 illustrated in
The optical zoom magnification is actually changed within the range satisfying 1≦ZFOPT≦5, and the target blur amount for each optical zoom magnification is measured. Thus, the target blur amount Vp within the range satisfying 1≦p≦5 can be known correctly. The target blur amount Vp within the range satisfying 5<p≦10 is estimated by calculation from the target blur amount Vp within the range satisfying 1≦p≦5, and hence the blur amount characteristic information 450 can be generated. As a matter of course, it is possible to determine the target blur amount Vp within the range satisfying 1≦p≦10 from optical characteristics of the image pickup portion 11 entirely by calculation. In any case, the blur amount characteristic information 450 indicates a relationship between the target blur amount that is observed when the target image is obtained by using only the optical zoom (namely, that is observed when ZFEL equals one) and the optical zoom magnification.
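For illustration only, the following sketch shows one way the blur amount characteristic information 450 might be generated: the target blur amount Vp is interpolated between measured optical zoom magnifications within 1≦p≦5, and extrapolated for 5<p≦10. Linear interpolation and extrapolation are assumptions of this sketch; the document only requires that Vp be estimated by calculation outside the measured range.

```python
def target_blur(p, measured):
    """Target blur amount Vp for optical zoom magnification p.

    measured: dict mapping a measured optical zoom magnification to its
    measured target blur amount (e.g. for 1 <= p <= 5). Values beyond the
    last measurement are linearly extrapolated (assumption)."""
    xs = sorted(measured)
    if p <= xs[-1]:
        # linear interpolation between the two nearest measurements
        for a, b in zip(xs, xs[1:]):
            if a <= p <= b:
                t = (p - a) / (b - a)
                return measured[a] + t * (measured[b] - measured[a])
        return measured[xs[0]]
    # extrapolate using the slope of the last measured segment
    a, b = xs[-2], xs[-1]
    slope = (measured[b] - measured[a]) / (b - a)
    return measured[b] + slope * (p - b)
```

As stated above, the table may instead be determined entirely by calculation from the optical characteristics of the image pickup portion 11.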
The entire image region of the target image includes an image region A as a focused region in which image data of the subject SUBA exists, and an image region B as an out-of-focus region in which image data of the subject SUBB exists. In an arbitrary target image, the focus degree of the image region B is lower than the focus degree of the image region A. The specific image processing in the output image generating portion 55 includes a blurring process for blurring the image of the subject SUBB on the target image, namely, the blurring process for blurring the image in the out-of-focus region B of the target image. The blurring process can be realized by a filtering process using a smoothing filter for smoothing an image, for example.
An indicator indicating intensity of blurring in the blurring process is referred to as a process-blurring amount (or a blurring amount simply). The process-blurring amount may be considered to be an amount of blurring added to the image in the image region B of the target image by the blurring process. As the process-blurring amount is larger, the image in the image region B is blurred more strongly. When the blurring process is realized by spatial domain filtering using the smoothing filter, for example, it is preferred to increase a filter size of the smoothing filter (such as a Gaussian filter) to be used for the blurring process along with an increase of the process-blurring amount, and the increase of the filter size enhances blurring intensity.
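For illustration only, the relationship described above between the process-blurring amount and the filter size of the smoothing filter can be sketched as follows. The specific mapping from the process-blurring amount to the Gaussian sigma and radius is an assumption of this sketch; the description above only requires that the filter size increase along with the process-blurring amount.

```python
import math

def gaussian_kernel(blur_amount):
    """Build a normalized one-dimensional Gaussian smoothing kernel whose
    size grows with the process-blurring amount (mapping is an assumption)."""
    sigma = max(blur_amount, 1e-6)              # guard against zero sigma
    radius = int(math.ceil(3 * sigma))          # filter size grows with sigma
    kernel = [math.exp(-(i * i) / (2 * sigma * sigma))
              for i in range(-radius, radius + 1)]
    s = sum(kernel)
    return [k / s for k in kernel]              # weights sum to one
```

Applying such a kernel horizontally and then vertically to the pixels of the image region B realizes a spatial domain filtering whose blurring intensity increases with the process-blurring amount.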
As illustrated in
Because the target blur amount depends on subject distances dA and dB of the subjects SUBA and SUBB, the process-blurring amount to be set also depends on the subject distances dA and dB. Therefore, it is preferred to dispose a subject distance detecting portion (not shown) that detects a subject distance of a subject at each pixel of the original image or the target image in the main control portion 13, and to use subject distance information indicating the detection result (namely, a detected value of the subject distance of a subject at each pixel of the original image or the target image) for determining a process-blurring amount for each pixel of the target image (the same is true for the second to fourth examples described later). As a detection method of the subject distance, arbitrary methods including known methods can be used. For instance, a stereo camera or a range sensor may be used for detecting the subject distance. Otherwise, the subject distance may be determined by an estimation process using edge information of the original image or the target image.
A classification process for classifying a subject at each pixel of the target image into either one of the focused subject and the out-of-focus subject (namely, a classification process for classifying each subject image region of the target image into either one of the focused region and the out-of-focus region) can be performed by the output image generating portion 55 and the output image generating portion 75 described later (see
The classification process can be performed based on the focal length and the aperture stop value of the image pickup portion 11 at the time of photographing the image on which the target image is based, together with the subject distance information. This is because the depth of field of the original image and the target image can be determined based on the focal length and the aperture stop value, and because, once the depth of field is determined, the subject distance information indicates whether or not the subject distance corresponding to each pixel of the target image is within the depth of field.
Alternatively, the classification process can be performed based on image data of the target image. For instance, the entire image region of the target image is split into a plurality of small regions, and for each small region, edge intensity in the small region is determined based on image data of the small region. An edge extraction process using an edge extraction filter (such as a differential filter) is performed for each pixel in the small region, and an output value of the edge extraction filter used for each pixel in the small region is accumulated so that the edge intensity of the small region can be determined. Then, small regions having edge intensity of a predetermined reference value or larger are classified into the focused region, while small regions having edge intensity smaller than the reference value are classified into the out-of-focus region, so that the above-mentioned classification process can be realized.
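For illustration only, the image-data-based classification process described above can be sketched as follows. The block size, the use of a simple horizontal differential filter as the edge extraction filter, and the reference value are assumptions of this sketch.

```python
def classify_regions(image, block, threshold):
    """Split the image into block x block small regions and classify each
    as focused (True) or out-of-focus (False) by accumulated edge intensity.

    image: two-dimensional list of pixel values. The edge extraction filter
    is a simple horizontal difference (assumption); threshold is the
    predetermined reference value."""
    h, w = len(image), len(image[0])
    labels = []
    for by in range(0, h, block):
        row = []
        for bx in range(0, w, block):
            # accumulate the output of the differential filter over the region
            edge = sum(abs(image[y][x + 1] - image[y][x])
                       for y in range(by, min(by + block, h))
                       for x in range(bx, min(bx + block, w - 1)))
            row.append(edge >= threshold)       # focused iff edge intensity high
        labels.append(row)
    return labels
```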
With reference to the flowchart of
After starting display or recording of the output image sequence, the process of Steps S12 to S14 is repeated while continuing the display or recording of the output image sequence. The user can freely perform the zoom operation to instruct a change of the output zoom magnification at an arbitrary timing. The zoom control portion 51 resets the magnifications ZFOPT and ZFEL in accordance with the relationship of
According to the first example, in the electronic zoom region too, a blur amount can be obtained as if the optical zoom were used for obtaining the blur amount (an effect as if the angle of view adjustment were performed by the optical zoom can be obtained). Therefore, the situation where the blur amount is not changed in the electronic zoom region can be avoided, and the user's wrong feeling or the like generated in such a situation can be relieved.
Note that in the magnification relationship 430 (see
A second example of the present invention is described. The contents described in the first example can be applied to the second example as long as no contradiction arises.
In the second example, RAW zoom is used instead of the electronic zoom described above in the first example. First, with reference to
In the electronic zoom in the first example, trimming is performed on the original image. In contrast, in the RAW zoom, trimming is performed on the image on the image sensor 33 when image data is read out from the image sensor 33. A magnification of trimming in the RAW zoom is referred to as a RAW zoom magnification.
The read control portion 61 controls the data size or the like read out from the image sensor 33 in accordance with the RAW zoom magnification given to itself, and Q mega image data are read out from the image sensor 33 under this control. Here, Q is at most eight. In addition, corresponding to the fact that the output definition size is two mega, the minimum value of Q is two. The two-dimensional image having a Q mega image size expressed by the Q mega image data read out from the image sensor 33 is referred to as an extracted image.
The resolution conversion portion 62 generates an image having a two mega image size by reducing an image size of the extracted image having a Q mega image size (hereinafter, the image having a two mega image size, which is generated by the resolution conversion portion 62, is referred to as a converted image). However, if Q is two, the extracted image itself becomes a converted image. Reduction of the image size is realized by a known resampling method. The angle of view of the extracted image having a Q mega image size is the same as the angle of view of the converted image having a two mega image size.
A relationship between the RAW zoom magnification and a value of Q is described with reference to a specific numeric example. When the RAW zoom is used, a rectangular extraction frame is defined with respect to the effective pixel region 33A of the image sensor 33. It is supposed that the aspect ratio of the extraction frame is the same as that of the effective pixel region 33A, and that the center of the extraction frame is the same as the center of the effective pixel region 33A.
The read control portion 61 determines the image size of the extraction frame from the RAW zoom magnification according to the following definition equation:

(image size of converted image)/(image size of extraction frame)=(ZFRAW/2)^2

In other words, the image size of the extraction frame is determined so that the positive square root of the value obtained by dividing the image size of the converted image by the image size of the extraction frame becomes equal (or substantially equal) to half of the RAW zoom magnification. Then, a light receiving pixel signal of each light receiving pixel in the extraction frame is read out from the image sensor 33 and supplied to the resolution conversion portion 62.
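For illustration only, the relation described above can be sketched as follows, with the converted image fixed at two megapixels as in the specific numeric example; the factor of 2 corresponds to the square root of the ratio between the eight mega effective pixel region and the two mega converted image.

```python
def extraction_frame_size(zf_raw, converted_size=2_000_000):
    """Image size (in pixels) of the extraction frame, determined so that
    sqrt(converted_size / extraction_size) == zf_raw / 2, i.e. the relation
    described above (converted image fixed at two megapixels)."""
    return converted_size * (2.0 / zf_raw) ** 2
```

This reproduces the numeric example: a RAW zoom magnification of one gives an eight mega extraction frame, and a RAW zoom magnification of two gives a two mega extraction frame.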
Therefore, when the RAW zoom magnification is one, the image size of the extraction frame becomes eight mega from the above-mentioned definition equation. Therefore, when the RAW zoom magnification is one, as illustrated in
When the RAW zoom magnification is two, the image size of the extraction frame becomes two mega from the above-mentioned definition equation. Therefore, when the RAW zoom magnification is two, as illustrated in
As understood from the above-mentioned definition equation and
The zoom control portion 71 sets the optical zoom magnification and the RAW zoom magnification so that the equation ZFOUT=ZFOPT×ZFRAW is satisfied. The optical zoom processing portion 72 is the same as the optical zoom processing portion 52 of
The RAW zoom processing portion 74 includes the read control portion 61 and the resolution conversion portion 62 of
The output image generating portion 75 has the same function as the output image generating portion 55 of
Similarly to the output image generating portion 55 of
As illustrated in
With reference to a flowchart of
After starting display or recording of the output image sequence, the process of Steps S22 to S24 is repeated while continuing the display or recording of the output image sequence. The user can freely perform the zoom operation to instruct a change of the output zoom magnification at an arbitrary timing. The zoom control portion 71 resets the magnifications ZFOPT and ZFRAW in accordance with the relationship of
According to the second example, also in the case where the RAW zoom is used, a blur amount can be obtained as if the optical zoom were used for obtaining the blur amount. Therefore, the same effect as in the first example can be obtained.
Note that the electronic zoom in the first example and the RAW zoom in the second example are one type of digital zoom for adjusting angles of view of the target image and the output image. In the digital zoom by the electronic zoom, the angles of view of the target image and the output image are adjusted by trimming of the original image. In contrast, in the digital zoom by the RAW zoom, the angles of view of the target image and the output image are adjusted by trimming of the image on the image sensor 33.
In addition, it is possible to deform the relationship of
In addition, in the above-mentioned example, output signals of all the light receiving pixels in the extraction frame (511 or the like of
In addition, the RAW zoom, the electronic zoom and the optical zoom may be combined to generate the target image. In this case, the zoom control portion 71 sets the magnifications ZFOPT, ZFEL, and ZFRAW from the output zoom magnification so that ZFOUT=ZFOPT×ZFEL×ZFRAW is satisfied, and it is preferred to dispose the electronic zoom processing portion 54 of
A third example of the present invention is described. In the third example, an action of the image pickup apparatus 1 in the reproducing mode is described.
The electronic zoom processing portion 54 and the output image generating portion 55 of
Therefore, the electronic zoom processing portion 54 of
The output image generating portion 55 performs the specific image processing corresponding to the reproduction enlarging magnification on the target image so as to generate the output image. The output image is displayed on the display portion 15. Similarly to the first example, the specific image processing in the third example includes the blurring process for blurring the image of the subject SUBB on the target image, namely, the blurring process for blurring the image in the out-of-focus region B of the target image. However, the process-blurring amount in the blurring process is set by using the reproduction enlarging magnification instead of the output zoom magnification.
As illustrated in
On the other hand, if the reproduction enlarging magnification is larger than one, the output image generating portion 55 performs the blurring process on the target image and outputs the target image after the blurring process as the output image. Along with an increase of the reproduction enlarging magnification, the process-blurring amount is increased. Specifically, for example, the output image generating portion 55 sets the process-blurring amount and performs the blurring process based on the reproduction enlarging magnification EFREP and the blur amount characteristic information 450 so that the target blur amount on the output image is the same as V3 when EFREP is three, and that the target blur amount on the output image is the same as V5 when EFREP is five. Here, because it is supposed that the image to be reproduced is the original image obtained when ZFOUT is one, the process-blurring amount when EFREP is three corresponds to V3-V1, and the process-blurring amount when EFREP is five corresponds to V5-V1.
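The relation between the reproduction enlarging magnification EFREP and the process-blurring amount can be sketched as below. The linear characteristic v(z) and its parameters are assumed stand-ins for the blur amount characteristic information 450.

```python
def target_blur(zf_opt, v1=1.0, slope=0.5):
    """Blur amount the optical zoom would produce at magnification zf_opt.

    A linear characteristic is assumed here for illustration."""
    return v1 + slope * (zf_opt - 1.0)

def process_blur_amount(ef_rep, v1=1.0, slope=0.5):
    """Extra blur to synthesize so the output matches target_blur(ef_rep).

    Assumes the reproduced image was photographed at ZFOUT = 1, so it
    already carries a blur of target_blur(1) = v1."""
    return target_blur(ef_rep, v1, slope) - target_blur(1.0, v1, slope)

# EFREP = 3 -> process-blurring amount corresponds to V3 - V1
assert process_blur_amount(3.0) == target_blur(3.0) - target_blur(1.0)
```

At EFREP = 1 the sketch yields zero, matching the case above in which the blurring process is not performed.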
Note that if the image to be reproduced is the original image obtained when ZFOUT is not one, the process-blurring amount can be modified from that described above based on the optical zoom magnification when the image to be reproduced is photographed and on the blur amount characteristic information 450. However, if the target blur amount is a linear function of ZFOPT in the blur amount characteristic information 450, this modification is not necessary.
In addition, as described above in the first example, it is preferred to determine the process-blurring amount for each pixel of the target image using the subject distance information in the third example, too. If the subject distance information is obtained only in the photographing mode, it is preferred to generate the subject distance information corresponding to the image to be reproduced in the photographing mode so as to record the subject distance information together with the image data of the image to be reproduced in the recording medium 16, and to read the corresponding subject distance information when image data of the image to be reproduced is read out.
With reference to a flowchart of
After that, the user can freely instruct a change of the reproduction enlarging magnification at an arbitrary timing. In Step S32 after Step S31, it is decided whether or not the reproduction enlarging magnification EFREP is one. If EFREP is one, the process goes back to Step S31 without performing the electronic zooming process or the blurring process, and the image to be reproduced continues to be displayed as it is, as the output image, on the display portion 15. On the other hand, if EFREP is larger than one, in Steps S33 and S34, the electronic zooming process is performed on the image to be reproduced, and further the blurring process is performed, so as to generate the output image. Then, the process goes back to Step S31. Therefore, if EFREP is larger than one, a part of the image to be reproduced (a part of the input image) is enlarged according to the reproduction enlarging magnification, and the obtained image is displayed as the output image on the display portion 15.
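The branching of Steps S31 to S34 can be sketched as follows. The helpers electronic_zoom and blur are toy one-dimensional stand-ins for the electronic zooming process and the blurring process; actual image data and magnification handling in the apparatus are considerably more involved.

```python
def electronic_zoom(pixels, ef_rep):
    """Trim the central 1/ef_rep portion (toy 1-D stand-in for trimming)."""
    n = len(pixels)
    keep = max(1, int(n / ef_rep))
    start = (n - keep) // 2
    return pixels[start:start + keep]

def blur(pixels, ef_rep):
    """Toy blur: one pass of a 3-tap mean filter; a stand-in for the
    magnification-dependent blurring process."""
    out = []
    for i in range(len(pixels)):
        window = pixels[max(0, i - 1):i + 2]
        out.append(sum(window) / len(window))
    return out

def reproduce_frame(pixels, ef_rep):
    """One pass of the reproduction loop (Steps S31-S34), sketched."""
    if ef_rep == 1.0:
        return pixels                            # Step S32 branch: display as-is
    zoomed = electronic_zoom(pixels, ef_rep)     # Step S33
    return blur(zoomed, ef_rep)                  # Step S34
```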
According to the third example, when enlarging reproduction of an image is performed, a blur amount can be obtained as if the image had been enlarged using the optical zoom. The enlarging reproduction also functions as substitute means for reducing the angle of view by the optical zoom that could not be performed, or was not performed, when the image was photographed. Therefore, when the angle of view is reduced by trimming for enlarging reproduction, an effect as if the angle of view adjustment had been performed by the optical zoom can be obtained, which is a great merit for the user.
Note that the image to be reproduced can be regarded as a still image, but it is possible to apply the technique described above in the third example to a moving image. In this case, it is preferred to supply the plurality of images to be reproduced that are arranged in time sequence to the electronic zoom processing portion 54 sequentially, and to generate the target image and the output image from each image to be reproduced so that the output image sequence is obtained. It is possible to display the obtained output image sequence as a moving image on the display portion 15 and to record the same in the recording medium 16.
A fourth example of the present invention is described. Similarly to the third example, in the fourth example too, an action of the image pickup apparatus 1 in the reproducing mode is described.
In
The trimming frame FT corresponds to the clipping frame in the first to third examples, and the image in the trimming frame FT (namely, the target image IB) is a part of the input image IA. Therefore, the angle of view of the target image IB is smaller than the angle of view of the input image IA.
The user can use the operating portion 17 or the touch panel 19 to perform a trimming instruction operation for designating a position, a size, and the like of the trimming frame FT. Content of the designation by the trimming instruction operation is contained in the trimming information. The trimming information specifies the center position and sizes in the horizontal and vertical directions of the trimming frame FT on the input image IA. Here, it is supposed that the trimming frame FT is a rectangular frame. However, it is possible to set the shape of the trimming frame FT to other than the rectangular shape. The center position of the trimming frame FT and the center position of the input image IA may be the same or different. In addition, the aspect ratio of the trimming frame FT and the aspect ratio of the input image IA may also be the same or different.
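The trimming frame FT designated by the trimming instruction operation can be represented minimally as below. The clamping to the input image bounds is an assumption added for illustration.

```python
def trimming_frame(center_x, center_y, width, height, img_w, img_h):
    """Rectangular trimming frame FT specified by center position and size,
    clamped so that it stays inside the input image IA.

    Returns (left, top, right, bottom) in pixel coordinates."""
    left = max(0, center_x - width // 2)
    top = max(0, center_y - height // 2)
    right = min(img_w, left + width)
    bottom = min(img_h, top + height)
    return left, top, right, bottom

# a frame centered at (50, 50) need not share the input image's center
assert trimming_frame(50, 50, 20, 10, 100, 100) == (40, 45, 60, 55)
```

As noted above, the frame's center and aspect ratio may differ from those of the input image; only containment in the input image is enforced here.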
An output image generating portion 102 performs, on the target image IB, the specific image processing corresponding to the trimming information based on the trimming instruction operation, a distance map from a distance map generating portion 103, and focused state setting information from a focused state setting portion 104, so as to generate the output image. The specific image processing includes a blurring process similar to the above-mentioned blurring process (details will be described later). In addition, if the image size enlargement process has not been performed yet at the time point when the target image IB is generated, the image size enlargement process for setting the image size of the output image to be the same as the image size of the input image IA can be included in the specific image processing. The output image can be displayed on the display portion 15 and can be recorded in the recording medium 16.
The distance map generating portion 103 reads out the distance data from the header region of the image file storing the image data of the input image and generates the distance map from the distance data. The distance map is a distance image (range image) in which each pixel value constituting the same has a detected value of the subject distance. The distance map specifies a subject distance of the subject at an arbitrary pixel in the input image or the target image. Note that the distance data itself may be the distance map. In this case, the distance map generating portion 103 is not necessary.
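The distance map described above can be sketched minimally as follows. The flat, row-major layout assumed for the recorded distance data, and the helper names, are assumptions for illustration only.

```python
def build_distance_map(distance_data, width, height):
    """Reshape flat distance data (assumed row-major) into the rows of a
    distance image (range image) whose pixel values are subject distances."""
    assert len(distance_data) == width * height
    return [distance_data[r * width:(r + 1) * width] for r in range(height)]

def subject_distance(distance_map, x, y):
    """Subject distance of the subject imaged at pixel (x, y)."""
    return distance_map[y][x]
```

If the recorded distance data is already a distance image, the reshaping step is unnecessary, mirroring the remark above that the distance data itself may be the distance map.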
The focused state data stored in the header region illustrated in
The focused state setting portion 104 generates the focused state setting information based on the focused state data from the recording medium 16, or in accordance with a focused state designation operation by the user. The focused state setting information includes a distance Lo′ specifying the focus reference distance Lo of the output image, and the output image generating portion 102 performs the specific image processing so that the focus reference distance Lo of the output image becomes the same as the distance Lo′. The user can designate the distance Lo′ by the focused state designation operation as necessary. In this case, the user can use the operating portion 17 or the touch panel 19 so as to directly input a value of the distance Lo′. Alternatively, the user can designate the distance Lo′ by designation of a noted subject corresponding to the distance Lo′. For instance, if the subject distance of the subject 602 in the input image 600 is the distance Lo′, the image pickup apparatus 1 displays the input image 600 on the display portion 15, and in this state the user can use the touch panel 19 to designate the subject 602 on the display screen. When this designation is performed, the focused state setting portion 104 can set the subject distance of the subject 602 as the distance Lo′. If the user does not designate the distance Lo′, the specific image processing is performed so that the focus reference distance Lo determined by the focused state data from the recording medium 16, namely the focus reference distance Lo of the input image becomes the same as the focus reference distance Lo of the output image. In the following description, it is supposed that the user does not designate the distance Lo′ unless otherwise noted.
In the fourth example, for the sake of convenience, it is supposed in the following description that the focus reference distance Lo is the center distance within the depth of field in the noted image 320, which is an arbitrary two-dimensional image (see
The specific image processing in the fourth example includes the blurring process for blurring the out-of-focus distance subject image (namely, the image of the subject at the out-of-focus distance) included in the target image. In the noted image 320, the out-of-focus distance means a distance outside the depth of field of the noted image 320, and the out-of-focus distance subject (in other words, the subject at the out-of-focus distance) means a subject positioned outside the depth of field of the noted image 320. In addition, a difference between the focus reference distance Lo and a subject distance of an arbitrary subject is referred to as a difference distance.
As illustrated in
In
In the input image 600 and the target image 610, difference distances of the subjects 601, 602, and 603 are denoted by DIF601, DIF602, and DIF603, respectively. Here, as illustrated in
In
Here, because it is supposed that the focus reference distance Lo of the output image 620 is the same as the subject distance d601, the subject 601 is a focused subject on the output image 620. In addition, because the depth of field of the output image 620 is shallower than those of the input image 600 and the target image 610, the subject 602 is an out-of-focus subject on the output image 620. The subject 603 is also an out-of-focus subject on the output image 620.
Meaning of the process-blurring amount and content of the blurring process corresponding to the process-blurring amount are the same as those described above in the first example. In other words, as the process-blurring amount set for the noted pixel is larger, the noted pixel is blurred more strongly in the blurring process (namely, as the process-blurring amount set for the noted subject is larger, the image of the noted subject is blurred more strongly in the blurring process). If the blurring process is realized by the spatial domain filtering using the smoothing filter, for example, it is preferred to increase a filter size of the smoothing filter (such as a Gaussian filter) to be used for the blurring process along with an increase of the process-blurring amount, and the increase of the filter size enhances blurring intensity.
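The relation stated above, in which a larger process-blurring amount selects a larger smoothing-filter size, can be sketched as below. The mapping from blurring amount to kernel size and the choice of Gaussian sigma are illustrative assumptions.

```python
import math

def kernel_size(process_blur_amount):
    """Odd smoothing-filter size that grows with the process-blurring amount
    (the growth rule here is an assumed example)."""
    return 2 * int(math.ceil(process_blur_amount)) + 1

def gaussian_kernel(size, sigma=None):
    """1-D Gaussian smoothing kernel, normalized to sum to 1."""
    if sigma is None:
        sigma = max(size / 6.0, 1e-6)  # assumed sigma-to-size rule
    half = size // 2
    weights = [math.exp(-(i * i) / (2.0 * sigma * sigma))
               for i in range(-half, half + 1)]
    total = sum(weights)
    return [w / total for w in weights]

# a larger process-blurring amount yields a larger filter, hence stronger blur
assert kernel_size(3.0) > kernel_size(1.0)
```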
In order to set the depth of field of the output image 620 shallower than the depth of field of the input image 600, the distance DIFS corresponding to a half of the magnitude of the depth of field of the output image 620 is set shorter than the distance DIFO corresponding to a half of the magnitude of the depth of field of the input image 600. Therefore, DIFS<DIFO=DIF602<DIF603 is satisfied. In addition, for a pixel whose corresponding difference distance is larger than the distance DIFS, a larger process-blurring amount is set as the difference distance becomes larger. Therefore, when the process-blurring amount set for the subject 602 corresponding to the difference distance DIF602 is denoted by Q602, and the process-blurring amount set for the subject 603 corresponding to the difference distance DIF603 is denoted by Q603, 0<Q602<Q603 is satisfied. Because 0<Q602<Q603 is satisfied, the images of the subjects 602 and 603 are blurred in the specific image processing so that blur as illustrated in
The subject distance d602 is not the out-of-focus distance in the target image 610, but it becomes the out-of-focus distance in the output image 620 because the depth of field is reduced. Therefore, the specific image processing can be said to include the blurring process for blurring the image of the out-of-focus distance subject 603 in the target image 610, and further to include the blurring process for blurring images of the out-of-focus distance subjects 602 and 603 in the output image 620.
A decrease of the size of the trimming frame is similar to an increase of the reproduction enlarging magnification in the third example. Along with a decrease of the size of the trimming frame, the depth of field of the output image 620 is set shallower and the process-blurring amount is increased. Then, a blur amount can be obtained as if the optical zoom had been performed.
In order to set the depth of field of the output image 620 shallower and to increase the process-blurring amount along with a decrease of a size of the trimming frame, the output image generating portion 102 decreases the distance DIFS along with a decrease of a size of the trimming frame indicated by the trimming information. The decrease of the distance DIFS causes an increase of the process-blurring amounts Q602 and Q603. In other words, the process-blurring amounts Q602 and Q603 increase along with a decrease of a size of the trimming frame.
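The per-pixel rule above can be sketched as follows: pixels whose difference distance exceeds DIFS get a process-blurring amount that grows with the difference distance, and DIFS itself shrinks as the trimming frame shrinks. The linear forms used here are illustrative assumptions.

```python
def dif_s(frame_size, full_size, dif_o):
    """DIFS set shorter than DIFO in proportion to the trimming-frame size
    (the proportional rule is an assumed example)."""
    return dif_o * (frame_size / full_size)

def process_blur(dif, dif_s_val, gain=1.0):
    """Process-blurring amount for a pixel with difference distance dif:
    zero inside the output depth of field, growing outside it."""
    return 0.0 if dif <= dif_s_val else gain * (dif - dif_s_val)

# with DIF602 < DIF603, the ordering 0 < Q602 < Q603 follows
ds = dif_s(50.0, 100.0, 2.0)
assert 0.0 < process_blur(2.0, ds) < process_blur(3.0, ds)
```

Shrinking the frame further decreases DIFS and therefore increases Q602 and Q603, as described above.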
With reference to a flowchart of
In the next Step S54, the focused state setting portion 104 generates the focused state setting information including the distance Lo′ by the above-mentioned method, considering a focused state designation operation by user. After that, in Step S55, the image pickup apparatus 1 accepts the trimming instruction operation and performs the process of Steps S56 to S58 when the trimming instruction operation is performed. Note that it is possible to realize an action without considering an input of the focused state designation operation. In this case, the process of Steps S53 and S54 or the process of Step S54 is omitted, and the focus reference distance Lo of the input image is used as it is as the distance Lo′.
In Step S56, the target image is generated from the input image by trimming based on the trimming information. In the next Step S57, the depth of field of the target image is set shallow by the specific image processing corresponding to the trimming information, the distance map, and the focused state setting information, so as to generate the output image. In Step S58, the output image is displayed on the display screen, and image data of the output image is recorded in the recording medium 16. When the image data of the output image is recorded in the recording medium 16, it is possible to delete the image data of the input image from the recording medium 16 or to keep the image data of the input image in the recording medium 16.
Similarly to the third example, in the fourth example too, when the image is trimmed, a blur amount can be obtained as if the optical zoom had been performed. The image trimming also functions as substitute means for decreasing the angle of view by the optical zoom that could not be performed, or was not performed, when the image was photographed. Therefore, when the angle of view is decreased by the trimming, an effect as if the angle of view adjustment had been performed by the optical zoom can be obtained, which is a great merit for the user.
Note that the input image can be considered as a still image, but it is possible to apply the technique described above in the fourth example to a moving image. In this case, it is preferred to supply the plurality of input images arranged in time sequence to the trimming processing portion 101 sequentially, and to generate the target image and the output image from each input image so as to obtain the output image sequence. It is possible to display the obtained output image sequence as a moving image on the display portion 15 and to record the same in the recording medium 16.
In addition, it is also possible to allow the magnitude of the depth of field (hereinafter also referred to as a depth DEP) to be designated by the focused state designation operation. Specifically, for example, the image pickup apparatus 1 can be formed so that an arbitrary focus reference distance Lo and depth DEP can be designated by the focused state designation operation. The designated focus reference distance Lo and depth DEP are denoted by Lo* and DEP*, respectively. When this designation is performed, a digital focus portion 120 in the main control portion 13 (see
In addition, if the digital focus portion 120 is disposed in the main control portion 13, the digital focus based on the distance map and the focused state setting information (including Lo* and DEP*) may be performed on the target image. According to this, the focus reference distance Lo and the depth DEP of the target image are changed to the focus reference distance Lo* and the depth DEP*, so that the result image is obtained as the output image. In other words, by performing the digital focus on the target image, it is possible to obtain the focused state adjusted image as the output image, which has Lo* and DEP* as the focus reference distance Lo and the depth DEP, and has the same angle of view as the target image. In this case, it can be said that the digital focus functions as the specific image processing and that the digital focus portion 120 functions as the output image generating portion 102.
The digital focus is the image processing that can adjust the depth of field of the image to be processed, which is the input image or the target image, to an arbitrary value. The adjustment of the depth of field includes adjustment (change) of the focus reference distance Lo and the depth DEP. By the digital focus, it is possible to generate the focused state adjusted image having an arbitrary focus reference distance Lo and an arbitrary magnitude of the depth of field DEP from the image to be processed.
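The (Lo*, DEP*) interface of the digital focus can be illustrated minimally as below. This toy only derives a per-pixel blur-amount map from a distance map and the designated depth of field; it is not an implementation of any of the cited digital focus methods, and the linear growth outside the depth of field is an assumption.

```python
def digital_focus_blur_map(distance_map, lo_star, dep_star, gain=1.0):
    """Per-pixel blur amounts realizing a designated focus reference
    distance Lo* and depth of field DEP*: pixels whose subject distance
    lies within DEP* of Lo* stay sharp; others are blurred in proportion
    to their distance outside the depth of field."""
    half = dep_star / 2.0
    blur = []
    for row in distance_map:
        blur.append([0.0 if abs(d - lo_star) <= half
                     else gain * (abs(d - lo_star) - half)
                     for d in row])
    return blur

# subjects at 1.0 and 2.0 fall inside [Lo* - 1, Lo* + 1]; 5.0 falls outside
assert digital_focus_blur_map([[1.0, 2.0, 5.0]], 2.0, 2.0)[0] == [0.0, 0.0, 2.0]
```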
As the image processing method for realizing the digital focus, various image processing methods are proposed. The digital focus portion 120 can utilize a known digital focus image processing (for example, image processing described in JP-A-2010-252293, JP-A-2009-224982, JP-A-2008-271241, or JP-A-2002-247439).
The embodiment of the present invention can be modified appropriately and variously in the scope of the technical concept described in the claims. The embodiment described above is merely an example of the embodiment of the present invention, and the present invention and the meanings of terms of the elements are not limited to those described in the embodiment. Specific numerical values exemplified in the above description are merely examples, which can be changed to various values as a matter of course. As annotations that can be applied to the embodiment described above, Notes 1 to 4 are described below. The descriptions in the Notes can be combined arbitrarily as long as no contradiction arises.
[Note 1]
The specific image processing in each example described above may include image processing other than the blurring process. For instance, the specific image processing may include a contour emphasizing process for emphasizing contours of focused subjects (namely, an edge enhancement process for enhancing edges of focused subjects).
[Note 2]
The image pickup apparatus 1 described above may be constituted of hardware or a combination of hardware and software. If the image pickup apparatus 1 is constituted using software, the block diagram of a portion realized by software indicates a functional block diagram of the portion. The function realized using software may be described as a program, and the program may be executed by a program executing device (for example, a computer) so that the function can be realized.
[Note 3]
The image pickup apparatus 1 in the reproducing mode functions as an image reproduction apparatus. The action of the image pickup apparatus 1 in the reproducing mode may be performed by an image reproduction apparatus (not shown) other than the image pickup apparatus 1. The image reproduction apparatus includes an arbitrary information apparatus such as a mobile phone, a mobile information terminal, or a personal computer. Each of the image pickup apparatus and the image reproduction apparatus is one type of electronic equipment. In addition, it can be said that the image pickup apparatus 1 includes the image processing apparatus. The image processing apparatus in
[Note 4]
For instance, it is possible to consider as follows. In the first example, a part including the original image obtaining portion 53 and the electronic zoom processing portion 54 illustrated in
Number | Date | Country | Kind |
---|---|---|---|
2011-018449 | Jan 2011 | JP | national |