The entire disclosure of Japanese Patent Application No. 2007-211655, filed Aug. 15, 2007 is expressly incorporated by reference herein.
1. Technical Field
The present invention relates to an image processing apparatus and an image processing method.
2. Related Art
The exposure time used when shooting an image is an important factor that determines the quality of the shot image. If an image is shot with an inappropriately set exposure time, the subject may appear blackened in the image and be unrecognizable even though it can be visually recognized with human eyes. Conversely, reflected light may be picked up as white on the image, causing so-called whiteout, and in some cases the subject cannot be recognized because of this whiteout.
As a traditional technique to solve such problems, JP-A-63-306777 discloses an HDR (high dynamic range) technique of extracting portions of proper brightness from plural images having different quantities of exposure and combining them into a single image. Picking up images having different quantities of exposure can easily be realized by picking up one image with an ordinary exposure time (ordinary exposure), one with a shorter exposure time (short-time exposure), and one with a longer exposure time (long-time exposure).
When images are combined, their luminance signals are normalized in accordance with the exposure time, so noise in the short-time exposure image strongly influences the dark part of the combined image in particular. This inconvenience can be addressed by weighting the images so that the image shot by long-time exposure is mainly used for the dark part.
As a traditional technique of weighting and combining images, for example, the technique of JP-A-11-317905 may be employed. According to the invention described in JP-A-11-317905, an image picked up by ordinary exposure (ordinary exposure image), an image picked up by short-time exposure (short-time exposure image), and an image picked up by long-time exposure (long-time exposure image) are weighted in accordance with the intensity of the luminance signals of the ordinary exposure image.
In the case where the ordinary exposure image has an output characteristic as shown in
According to the traditional technique described in JP-A-11-317905, noise of the short-time exposure image can be prevented from expanding and hence influencing the low-luminance part of the combined image.
However, blackening and whiteout may also occur in the ordinary exposure image, and the ordinary exposure image is therefore not always suitable as a reference for weighting. That is, the ordinary exposure image may have an output characteristic as shown in
Moreover, if the ordinary exposure image shown in
An advantage of some aspects of the invention is to provide an image processing apparatus and an image processing method in which each of plural images is properly weighted and then combined, thereby restraining noise in a dark part of the combined image, maintaining linearity of luminance, preventing generation of a pseudo-contour, and thus generating a high-quality image.
An image processing apparatus according to an aspect of the invention is an image processing apparatus that generates a combined image by combining plural image data acquired by digitizing plural images shot with different quantities of exposure. The apparatus includes a weighting unit that adds, to at least one of the plural image data, a weight for adjusting the proportion in which the image data are combined. The weighting unit has a luminance data generating unit that combines data related to luminance of the plural image data and thus generates combined luminance data, and a weight deciding unit that decides the weight added to the image data in accordance with the combined luminance data generated by the luminance data generating unit.
In this image processing apparatus, the weight of the image data can be decided in accordance with combined luminance data formed by combining data related to luminance of plural image data acquired by shooting with different quantities of exposure. The combined luminance data has a broader range in which the luminance signal level is linear than the luminance of an image exposed for an ordinary exposure time. Therefore, proper weighting can be carried out over a broader luminance range than when the image shot with the ordinary exposure time, of the images having different exposure times, is used as a reference. Also, generation of a pseudo-contour can be restrained and deterioration in image quality can be prevented. Moreover, the proportion of the short-time exposure image in the combined image can be restrained, and a combined image with high image quality and less noise can be provided.
Thus, in the image processing apparatus, as each of plural images is properly weighted, noise in a dark part of the combined image can be restrained and linearity of luminance can be maintained. Moreover, generation of a pseudo-contour can be prevented and a high-quality image can be generated.
It is preferable that the image processing apparatus further includes a normalizing unit that normalizes the plural image data and equalizes brightness of each image data.
In this image processing apparatus, the differences in brightness among the plural image data caused by their different quantities of exposure are eliminated. Therefore, in preparing a combined image, the normalized image data can be weighted directly, and the combined-image preparation processing can be simplified.
It is also preferable that the image processing apparatus has a linearizing unit that linearizes the combined image data, which is image data acquired as a result of adding the weight decided by the weight deciding unit to the plural image data and then combining the plural image data, with respect to the luminance of a subject.
In this image processing apparatus, the luminance of combined image data can be linearized. Therefore, a combined image with a uniform change in luminance and with high image quality can be provided.
It is also preferable that, in the image processing apparatus, the weight deciding unit decides weight by using a reference table or a function that associates image data and weight in accordance with luminance, and the normalizing unit normalizes the plural image data by using the reference table or the function.
In this image processing apparatus, the reference table or the function can be used to normalize image data as well as to decide weight. Therefore, it is not necessary to prepare a separate function or processing for normalization and the configuration of the apparatus can be simplified.
It is also preferable that, in the image processing apparatus, the weight deciding unit decides weight by using a reference table or a function that associates image data and weight, and the linearizing unit linearizes the combined image data by using the reference table or the function.
In this image processing apparatus, the reference table or the function can be used to linearize combined image data as well as to decide weight. Therefore, it is not necessary to prepare a separate function or processing for linearization and the configuration of the apparatus can be simplified.
An image processing method according to another aspect of the invention is an image processing method executed in an image processing apparatus that generates a combined image by combining plural image data acquired by digitizing plural images shot with different quantities of exposure. The method includes adding, to at least one of the plural image data, a weight for adjusting the proportion in which the image data are combined. This weighting includes combining data related to luminance of the plural image data and thus generating combined luminance data, and deciding the weight added to the image data in accordance with the generated combined luminance data.
In this image processing method, the weight of the image data can be decided in accordance with combined luminance data formed by combining data related to luminance of plural image data acquired by shooting with different quantities of exposure. The combined luminance data has a broader range in which the luminance signal level is linear than the luminance of an image exposed for an ordinary exposure time. Therefore, proper weighting can be carried out over a broader luminance range than when the image shot with the ordinary exposure time, of the images having different exposure times, is used as a reference. Also, generation of a pseudo-contour can be restrained and deterioration in image quality can be prevented. Moreover, the proportion of the short-time exposure image in the combined image can be restrained, and a combined image with high image quality and less noise can be provided.
Thus, in the image processing method, as each of plural images is properly weighted, noise in a dark part of the combined image can be restrained and linearity of luminance can be maintained. Moreover, generation of a pseudo-contour can be prevented and a high-quality image can be generated.
An image processing program according to still another aspect of the invention is an image processing program for causing a computer to realize image processing in which a combined image is generated by combining plural image data acquired by digitizing plural images shot with different quantities of exposure. The program includes a weighting function of adding, to at least one of the plural image data, a weight for adjusting the proportion in which the image data are combined. The weighting function includes a luminance data generating function of combining data related to luminance of the plural image data and thus generating combined luminance data, and a weight deciding function of deciding the weight added to the image data in accordance with the combined luminance data generated by the luminance data generating function.
As this image processing program is executed by a computer, the weight of the image data can be decided in accordance with combined luminance data formed by combining data related to luminance of plural image data acquired by shooting with different quantities of exposure. The combined luminance data has a broader range in which the luminance signal level is linear than the luminance of an image exposed for an ordinary exposure time. Therefore, proper weighting can be carried out over a broader luminance range than when the image shot with the ordinary exposure time, of the images having different exposure times, is used as a reference. Also, generation of a pseudo-contour can be restrained and deterioration in image quality can be prevented. Moreover, the proportion of the short-time exposure image in the combined image can be restrained, and a combined image with high image quality and less noise can be provided.
Thus, a computer-readable recording medium in which the image processing program is recorded can provide an image processing program that, by properly weighting each of the plural images, enables noise in a dark part of the combined image to be restrained, linearity of luminance to be maintained, generation of a pseudo-contour to be prevented, and a high-quality image to be generated.
The invention will be described with reference to the accompanying drawings, wherein like numbers reference like elements.
Such an image processing apparatus is an image processing apparatus that combines plural image data having different quantities of exposure and thus generates a combined image. In this embodiment, image data refers to digital data acquired as a result of picking up an image, and represents an image with plural pixels. Each pixel contains information about its position (coordinates) in the image, its luminance, and its R, G and B color components.
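This notion of image data can be pictured concretely with a short sketch. The following Python/NumPy fragment is illustrative only and not part of the original disclosure; the resolution and the luminance formula are assumptions.

```python
import numpy as np

# One image's worth of image data: an H x W x 3 array holding the R, G and B
# components of every pixel.  The pixel position (coordinates) is implicit in
# the array indices, and luminance can be derived from the color components.
height, width = 480, 640                        # hypothetical sensor resolution
image_data = np.zeros((height, width, 3), dtype=np.uint16)

def luminance(img):
    """Approximate luminance from R, G, B (ITU-R BT.601 weights; the
    specification does not state how luminance is obtained)."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    return 0.299 * r + 0.587 * g + 0.114 * b
```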
In the first embodiment, the CCD camera 101 generates plural image data having different quantities of exposure in one shot. The generation of image data having different quantities of exposure can be realized, for example, by changing the reading timing of the electric charges accumulated in the CCD with an electronic shutter function in the CCD camera 101.
For example, in the case of changing the reading timing in three stages, the image data read out from the CCD at the earliest timing is assumed to be image data A having the smallest quantity of exposure. Then, the image data read out from the CCD at the next timing is assumed to be image data B of ordinary exposure. Finally, the image data read out from the CCD at the last timing is assumed to be image data C having the largest quantity of exposure. In such a configuration, the exposure time is changed to change the quantity of exposure. In the first embodiment, if the exposure time that provides the image data A is Ta, the exposure time that provides the image data B is Tb, and the exposure time that provides the image data C is Tc, the ratio of Ta, Tb and Tc is defined as follows.
The memory 103a is used to accumulate the image data A, the memory 103b the image data B, and the memory 103c the image data C. It should be noted that the first embodiment is not limited to the configuration in which the quantity of exposure is changed by means of the exposure time; it may also be applied to a configuration in which the CCD camera 101 picks up an image plural times with varied apertures, thereby generating plural image data having different quantities of exposure.
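For illustration, the capture of three exposures with different exposure times can be simulated as in the sketch below. The actual ratio of Ta, Tb and Tc defined in the specification is not reproduced above, so the ratio 1:4:16 and the saturation level used here are placeholders, not values from the disclosure.

```python
import numpy as np

# Placeholder exposure times; the ratio defined in the specification is not
# reproduced here.
Ta, Tb, Tc = 1.0, 4.0, 16.0
SATURATION = 4095.0      # hypothetical saturation level of the sensor output

def expose(subject_luminance, exposure_time):
    """Simulate one exposure: the output signal grows with exposure time and
    clips at the sensor's saturation level."""
    return np.clip(subject_luminance * exposure_time, 0.0, SATURATION)

subject = np.random.uniform(0.0, 1000.0, size=(480, 640))  # arbitrary test scene
img_a = expose(subject, Ta)   # image data A: smallest quantity of exposure
img_b = expose(subject, Tb)   # image data B: ordinary exposure
img_c = expose(subject, Tc)   # image data C: largest quantity of exposure
```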
The image processing apparatus according to the first embodiment combines plural image data to generate a combined image, as described above. The image processing apparatus according to the first embodiment has a weighting unit 100 that adds weight to adjust the combination proportion of image data to be combined, to the image data A, B and C accumulated in the memories 103a, 103b and 103c. The weighting unit 100 has a brightness information calculating unit 111 that combines data related to luminance of the image data A, B and C and thus generates combined luminance data, and a weighting calculating unit 112 that decides weight to be added to the image data in accordance with the combined luminance data generated by the brightness information calculating unit 111.
In the first embodiment, the brightness information calculating unit 111 functions as a luminance data generating unit, and the weighting calculating unit 112 functions as a weight deciding unit. Also, the normalizing unit 104 functions as a normalizing unit and the linearizing unit 106 functions as a linearizing unit.
In the first embodiment, all of the image data A, B and C are weighted. However, the invention is not limited to this configuration; it suffices to weight at least one of the image data A, B and C.
The CCD camera 101 shoots a subject. As shooting is done, electric charges are accumulated in the CCD of the CCD camera 101 and read out at different timings. The electric charges that are read out are inputted to an A/D converter unit via an AFE (analog front end), not shown, and converted into digital data (the image data A, B and C). The SW 102 allocates and accumulates the image data A into the memory 103a, the image data B into the memory 103b, and the image data C into the memory 103c.
The accumulated image data A, B and C are subjected to processing such as normalization and HDR combination and are then linearized to become combined image data. The image data A, B and C before normalization are also inputted to the weighting unit 100. The weighting unit 100 calculates the weight to be used for image combination in the HDR combination unit 105 and provides the calculated weight to the HDR combination unit 105.
The HDR combination unit 105 combines the image data A, B and C while adding the calculated weight to the normalized image data, and thus generates a combined image. The linearizing unit 106 secures linearity of the combined image and outputs the combined image to the display unit 107 or the image saving unit 108.
Hereinafter, the operation in the above configuration will be described further in detail.
It can be seen from
Although the combined luminance data has continuity, the saturation values of the lines 201a, 201b and 201c are added up and therefore the slope changes (
As described above, in the case where the characteristic of the image data B of ordinary exposure (line 201b) is used for weighting as in the traditional technique, the subject luminance is mapped to a constant luminance signal level in a range greater than L1 shown in
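As a minimal sketch of the combined luminance data discussed above, and assuming (from the description of the saturation values being added up) that the brightness information calculating unit simply sums the per-pixel luminance of the three exposures, the operation could look like the fragment below. The function name is hypothetical.

```python
def combined_luminance(lum_a, lum_b, lum_c):
    """Sum the per-pixel luminance of the three exposures.  Because each
    exposure saturates at a different subject luminance, the sum keeps rising
    (with a changing slope) over a wider range than the ordinary-exposure
    luminance alone, which is what makes it a better weighting reference."""
    return lum_a + lum_b + lum_c
```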
The weighting calculating unit 112 decides the weight in accordance with the combined luminance data generated as described above, and adds the weight to the image data A, B and C. The weight is decided by using a function or a 1DLUT (one-dimensional lookup table) that associates image data and weight in accordance with the camera luminance.
If the image data are weighted in accordance with
The weight is decided for each pixel of the image data A, B and C. For example, the weight W_Ta added to a pixel situated at coordinates (x,y) of the image data A having the exposure time Ta is expressed as W_Ta(x,y). Similarly, the weight W_Tb added to a pixel situated at coordinates (x,y) of the image data B having the exposure time Tb is expressed as W_Tb(x,y). The weight W_Tc added to a pixel situated at coordinates (x,y) of the image data C having the exposure time Tc is expressed as W_Tc(x,y).
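A sketch of this per-pixel weight decision follows. The actual 1DLUT of the embodiment is given by the figures and is not reproduced here, so the breakpoints below are purely illustrative; they only follow the general rule that the long-time exposure image dominates dark regions and the short-time exposure image dominates bright regions.

```python
import numpy as np

# Hypothetical 1DLUT: weights for image data A, B and C as a function of the
# normalized combined luminance (the real LUT is defined by the figures).
LUT_LEVELS = np.array([0.00, 0.25, 0.50, 0.75, 1.00])
LUT_W_TA   = np.array([0.00, 0.00, 0.20, 0.60, 1.00])   # short-time exposure (A)
LUT_W_TB   = np.array([0.10, 0.40, 0.60, 0.40, 0.00])   # ordinary exposure (B)
LUT_W_TC   = np.array([0.90, 0.60, 0.20, 0.00, 0.00])   # long-time exposure (C)

def decide_weights(y_comb):
    """Return per-pixel weight maps W_Ta(x,y), W_Tb(x,y) and W_Tc(x,y) by
    looking up the combined luminance in the 1DLUT (linear interpolation
    between table entries)."""
    y = y_comb / float(np.max(y_comb))     # map to the LUT's 0..1 input range
    w_ta = np.interp(y, LUT_LEVELS, LUT_W_TA)
    w_tb = np.interp(y, LUT_LEVELS, LUT_W_TB)
    w_tc = np.interp(y, LUT_LEVELS, LUT_W_TC)
    return w_ta, w_tb, w_tc
```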
In
Next, processing of the image data A, B and C sent from the memories 103a, 103b and 103c to the HDR combination unit 105 via the normalizing unit 104 will be described.
The normalizing unit 104 normalizes the image data A, B and C having different exposure times so as to equalize their brightness. The normalization is carried out as expressed by the following equations (1), (2) and (3). In these equations, the image data A before normalization is expressed as IMG_Ta, the image data A after normalization as IMG_Ta_N, the image data B before normalization as IMG_Tb, the image data B after normalization as IMG_Tb_N, the image data C before normalization as IMG_Tc, and the image data C after normalization as IMG_Tc_N.
IMG_Ta_N = IMG_Ta × Tc/Ta   (1)

IMG_Tb_N = IMG_Tb × Tc/Tb   (2)

IMG_Tc_N = IMG_Tc   (3)
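Equations (1) to (3) translate directly into code. The fragment below is a sketch with hypothetical argument names.

```python
def normalize(img_ta, img_tb, img_tc, ta, tb, tc):
    """Equalize the brightness of the three exposures per equations (1)-(3):
    each image is scaled by the ratio of the longest exposure time Tc to its
    own exposure time, so the long-time exposure image is left unchanged."""
    img_ta_n = img_ta * (tc / ta)    # (1)
    img_tb_n = img_tb * (tc / tb)    # (2)
    img_tc_n = img_tc                # (3)
    return img_ta_n, img_tb_n, img_tc_n
```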
The HDR combination unit 105 adds weight to pixels situated at the same coordinates, of the image data A, B and C, and combines these pixels. The value HDR(x,y) of a pixel situated at coordinates (x,y) of the combined image is found by the following equation (4).
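Equation (4) itself is not reproduced above. The sketch below assumes a common formulation in which HDR(x,y) is the weighted sum of the three normalized pixel values divided by the sum of the weights; if the weights from the LUT already sum to one, the division has no effect.

```python
import numpy as np

def hdr_combine(img_ta_n, img_tb_n, img_tc_n, w_ta, w_tb, w_tc):
    """Assumed form of equation (4):
    HDR(x,y) = (W_Ta*IMG_Ta_N + W_Tb*IMG_Tb_N + W_Tc*IMG_Tc_N)
               / (W_Ta + W_Tb + W_Tc)"""
    weight_sum = w_ta + w_tb + w_tc
    weight_sum = np.where(weight_sum == 0.0, 1.0, weight_sum)   # avoid /0
    return (w_ta * img_ta_n + w_tb * img_tb_n + w_tc * img_tc_n) / weight_sum

# Example usage, chaining the earlier sketches:
# hdr = hdr_combine(*normalize(img_a, img_b, img_c, Ta, Tb, Tc),
#                   *decide_weights(combined_luminance(luminance(img_a),
#                                                      luminance(img_b),
#                                                      luminance(img_c))))
```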
In the case where the characteristic of the combined image expressed as shown in
The image data A, B and C generated by the CCD camera 101 are accumulated in the memories 103a, 103b and 103c, respectively. The accumulated image data A, B and C are sent to the normalizing unit 104 for HDR combination and inputted to the weighting unit 100.
In the weighting unit 100, the brightness information calculating unit 111 combines the image data A, B and C (step S601), as shown in
Next, the weighting calculating unit 112 decides weight to be added to each of the image data A, B and C in accordance with the camera luminance acquired by combining the image data A, B and C. The decision of weight is carried out with reference to the LUT shown in
The weighting calculating unit 112 determines whether pixel weighting is decided with respect to all the coordinates of the image data A, B and C (step S603). If there is a pixel that has not been weighted yet (No in step S603), the processing to decide weight is continued. On the other hand, when weighting is decided for the pixels situated at all the coordinates, the processing ends.
The normalizing unit 104 normalizes the image data A, B and C (step S611), as shown in the flowchart of
Next, the HDR combination unit 105 receives the weight decided in accordance with the flowchart shown in
In the above-described flowchart, steps S601 and S602 in
The above-described image processing method according to the first embodiment is carried out by an image processing program according to the first embodiment, which is executed by a computer. The image processing program according to the first embodiment is provided as a file in an installable or executable format recorded on a computer-readable recording medium such as a CD-ROM, floppy (registered trademark) disk (FD), or DVD. The image processing program according to the first embodiment may also be stored on a computer connected to a network such as the Internet and downloaded via the network.
Moreover, the image processing program according to the first embodiment may be provided in the form of being recorded in a memory device such as a computer-readable ROM, flash memory, memory card, or USB-connected flash memory.
According to the above-described first embodiment, the weight of image data can be decided in accordance with combined luminance data formed by combining data related to luminance of plural image data acquired by shooting with different quantities of exposure. The combined luminance data has a broader luminance range in which the luminance signal level is linear than the luminance of an image of an ordinary exposure time. Therefore, proper weighting can be carried out in a broader luminance range than in the case of using the image shot with an ordinary exposure time, of the images having different exposure times, as a reference. Also, generation of a pseudo-contour can be restrained and the image quality can be prevented from deteriorating. Moreover, the proportion of the short-time exposure image in the combined image can be restrained, and a combined image with high image quality and less noise can be provided.
Next, a second embodiment of the invention will be described. In the second embodiment, the normalizing unit 104 and the linearizing unit 106 of the image processing apparatus according to the first embodiment are omitted, and the functional configuration and processing steps are thereby simplified. Instead, the image data A, B and C are normalized, and the combined image is linearized, by using the 1DLUT or function used for weighting. In such a second embodiment, a weighting unit 700 (
However, the image processing apparatus according to the second embodiment differs from the first embodiment in not having the normalizing unit 104 and the linearizing unit 106. The image data are inputted to the HDR combination unit 105 without being normalized, and the HDR-combined image is outputted to the display unit 107 and the image saving unit 108 without being separately linearized.
The image data A, B and C provided by the CCD camera 101 are saved in the memories 103a, 103b and 103c, respectively. Then, the image data A, B and C are combined at the brightness information calculating unit 711, and combined luminance data is produced as a result of the combination. However, the brightness information calculating unit 711 does not make a correction to linearize the combined image data and uses the group of straight lines 202 shown in
In this case, the weighting calculating unit 712 decides weight by using the 1DLUT shown in
Here, the process of generating the 1DLUT of
In the second embodiment, since the image data are not normalized, it is necessary to multiply the characteristic shown in the LUT of
Moreover, in the second embodiment, the weighting calculating unit 712 must decide weight by using the 1DLUT prepared also in consideration of linearization of the combined image provided after combination.
The 1DLUT shown in
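The idea of the second embodiment, namely folding the omitted normalization into the weights read from the 1DLUT, can be sketched as follows. The linearization that the LUT would additionally absorb is not shown here, and the function is hypothetical.

```python
def fold_normalization_into_weights(w_ta, w_tb, w_tc, ta, tb, tc):
    """Multiply each weight by the factor that the omitted normalizing unit
    would have applied to its image (Tc/Ta for A, Tc/Tb for B, 1 for C), so
    that un-normalized image data can be fed to the HDR combination directly."""
    return w_ta * (tc / ta), w_tb * (tc / tb), w_tc
```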
Here, the advantages of the first and second embodiments of the invention will be summarized. The first and second embodiments of the invention focus on the fact that the key information in adjusting the weights at the time of combining images is the brightness of the subject. For a bright part of the subject, an image with a short exposure time is mainly used in the combination, whereas for a dark part, an image with a longer exposure time is mainly used. Thus, an image having a good S/N ratio can be provided.
As a standard to determine the brightness (luminance) of the subject, an ordinary exposure image (the line 201b in
Meanwhile, in the first and second embodiments, plural image data having different exposure times are combined to prepare combined luminance data, which is used as a reference for weighting. Since the combined luminance data has a smaller range where the luminance signal level is saturated than the ordinary exposure image, proper weight can be decided even in a higher luminance range.
Next, the relation between an image to be a reference for weighting and the image quality will be described.
A short-time exposure image generally has a lot of noise. When the proportion of the short-time exposure image in the combined image increases, the noise (granularity) of the combined image increases and the image quality may deteriorate.
If images are weighted in accordance with combined luminance data acquired by combining image data having different exposure times, as in the first and second embodiments of the invention, ideal weighting shown in