The present invention generally relates to the field of image processing and video processing, and more particularly, to a method and system for obtaining a digitally enhanced image.
Nowadays, image processing and video processing are widely used to enhance the quality of an image captured by a digital device. The digital device can be a digital camera, a video camera, a video-conferencing device, a digital telescope, and the like. The image captured by a digital device may have a dark subject and a bright background if the sources of the ambient light are located mostly behind the subject. Generally, this problem of a dark subject and a bright background is addressed by the addition of light sources such as a flash. Usually, these light sources are controlled by the digital device.
While capturing an image of an object, ambient light sources can be present in the environment of the object. Often, the illumination provided by the ambient light sources is not neutral in color and causes a variation in the original color of the image of the object.
Some of the digital cameras available are provided with an electronic flash to avoid a dark subject and a bright background in a captured image. The electronic flash is activated when the foreground illumination is not sufficient, and eliminates the problem of a dark subject and a bright background in the image. However, the electronic flash requires a strong power source and may generate heat, which can affect the operation of the camera. Further, the large size and weight of the electronic flash increase the size and weight of the camera. Moreover, the illumination provided by the electronic flash may cause discomfort or annoyance to the subject. For example, people often blink when exposed to an electronic flash.
Accordingly, in light of the foregoing, there exists a need for alternative solutions that remove the problem of a dark subject and a bright background from the image of an object without using a strong light source or increasing the size and weight of the system. Further, there exists a need for removing variations in the color of images of objects caused by the illumination provided by the ambient light sources present in the environment of the object.
The accompanying figures, where like reference numerals refer to identical or functionally similar elements throughout the separate views, and which, together with the detailed description below, are incorporated in and form part of the specification, serve to further illustrate various embodiments and explain various principles and advantages, all in accordance with the present invention.
Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated, relative to other elements, to help in improving an understanding of the embodiments of the present invention.
Before describing in detail the particular method and system for obtaining a digitally enhanced image, in accordance with various embodiments of the present invention, it should be observed that the present invention utilizes a combination of method steps and apparatus components related to the method and system for obtaining a digitally enhanced image. Accordingly, the apparatus components and method steps have been represented, where appropriate, by conventional symbols in the drawings, showing only those specific details that are pertinent to an understanding of the present invention, so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
In this document, the terms ‘comprises,’ ‘comprising,’ ‘includes,’ or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, article, system or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such a process, article, system or apparatus. An element preceded by ‘comprises . . . a’ does not, without more constraints, preclude the existence of additional identical elements in the process, article, system or apparatus that comprises the element. The terms “includes” and/or “having”, as used herein, are defined as comprising.
There are many different color representation schemes (e.g., RGB, HSV) used in digital and video photography. The present invention may be applied to any color representation scheme. For the purposes of this disclosure, pixel values refer to the luminance of the pixels, unless a specific color context is mentioned.
For the present invention, the convention used for measuring light is a linear scale proportional to the number of photons per second for a given area. The present invention may also be used where the representational scheme is logarithmic; however, the mathematical operations should be adapted to match the particular representational scheme.
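By way of illustration only, the following Python sketch shows one way pixel values in a common gamma-encoded (logarithmic-style) representation could be converted to and from the linear scale assumed by the arithmetic in this disclosure. The choice of 8-bit sRGB as the example encoding, and the function names, are illustrative assumptions, not limitations:

```python
import numpy as np

def srgb_to_linear(pixels_8bit):
    """Decode gamma-encoded 8-bit sRGB values to linear light in [0, 1].

    The subtraction, scaling, and addition described in this
    disclosure assume values proportional to photon counts, so
    gamma-encoded values should be linearized first.
    """
    x = pixels_8bit.astype(np.float64) / 255.0
    return np.where(x <= 0.04045, x / 12.92, ((x + 0.055) / 1.055) ** 2.4)

def linear_to_srgb(linear):
    """Re-encode linear light values in [0, 1] back to 8-bit sRGB."""
    linear = np.clip(linear, 0.0, 1.0)
    y = np.where(linear <= 0.0031308,
                 linear * 12.92,
                 1.055 * linear ** (1.0 / 2.4) - 0.055)
    return np.round(y * 255.0).astype(np.uint8)
```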
For one embodiment, a method for obtaining a digitally enhanced image is provided. The method includes capturing a plurality of digital images. The plurality of digital images is captured on at least two differing illumination levels from a controlled light source. Further, the method includes analyzing the captured plurality of the digital images, to identify the illumination contribution provided by the controlled light source. Furthermore, the method includes amplifying the identified illumination contribution. Moreover, the method includes combining the amplified illumination contribution with at least one of the plurality of captured images, to produce a composite digital image.
For another embodiment, a system for obtaining a digitally enhanced image is provided. The system includes a memory for storing a plurality of digital images. The plurality of digital images is captured on at least two differing illumination levels from a controlled light source. Further, the system includes a processor that analyzes the stored plurality of digital images, to identify the illumination contribution provided by the controlled light source, to amplify the identified illumination contribution and combine the amplified illumination contribution with at least one of the plurality of stored digital images, to form a composite digital image.
At step 206, the plurality of digital images is analyzed to obtain an image representing the illumination contribution provided by the controlled light source. In one embodiment, the image representing the illumination contribution is identified by comparing the plurality of digital images, captured at different illumination levels, with each other.
At step 208, each pixel of the image representing the illumination contribution of the controlled light source is amplified, because the controlled light source used to illuminate the plurality of digital images is a weak light source. In an embodiment, the amplification is performed by multiplying each pixel of the image representing the illumination contribution by a scaling factor. For example, the value of the scaling factor may be four.
At step 210, the image representing the illumination contribution is digitally combined with at least one of the plurality of digital images, to obtain at least one composite digital image. In an embodiment, step 210 of combining the image representing the illumination contribution with the at least one of the plurality of digital images includes aligning the image representing the illumination contribution with the at least one of the plurality of digital images. In this embodiment, step 210 further includes adding the image representing the illumination contribution and the at least one of the plurality of digital images, pixel by pixel, to obtain the at least one composite digital image. The pixel value of a composite digital image of the at least one composite digital image is calculated by using the following equation:
Pixel value = Pixel value of the at least one of the plurality of digital images + Pixel value of the image representing the amplified illumination contribution (1)
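The following Python sketch, offered purely as an illustration, implements steps 206 through 210 and equation (1) for the simple case of two already-aligned grayscale frames in linear units. The frame names, the choice to composite onto the ambient frame, and the scaling factor of four are assumptions made for the example:

```python
import numpy as np

def enhance_image(frame_lit, frame_ambient, scaling_factor=4.0):
    """Sketch of steps 206-210 for two aligned grayscale frames.

    frame_lit:     frame captured with the controlled light source on
    frame_ambient: frame captured with the controlled light source off
    Both are float arrays in linear light units in [0, 1].
    """
    # Step 206: the illumination contribution of the controlled light
    # source is whatever the lit frame shows beyond the ambient frame.
    contribution = np.clip(frame_lit - frame_ambient, 0.0, None)

    # Step 208: amplify the weak contribution by the scaling factor.
    amplified = contribution * scaling_factor

    # Step 210 / equation (1): add the amplified contribution back to
    # one of the captured images (here, the ambient frame) and clip.
    return np.clip(frame_ambient + amplified, 0.0, 1.0)
```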
In an embodiment where the color of the controlled light source is not neutral, e.g., an LCD display, and may change with time, the color of the image representing the illumination contribution is adjusted prior to combining it with the at least one of the plurality of digital images. In one embodiment, this adjustment is performed by computing the color of the controlled light source. The color of the controlled light source is computed by determining an average color of the display. In one embodiment, the average color is determined by computing a mean pixel value for each primary color. In an embodiment where the color of the controlled light source changes with time, the mean pixel value for each primary color also changes with time. In this embodiment, the color correction of each pixel of the image representing the illumination contribution is based on the mean pixel color of the controlled light source at the time of the image capture. Thereafter, the color of the image representing the illumination contribution is adjusted accordingly.
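One way this adjustment could be realized is sketched below in Python, under the assumption that the controlled light source is an LCD display whose current frame buffer is available as an RGB array. The per-channel gain scheme is one possible realization, not the only one:

```python
import numpy as np

def neutralize_contribution(contribution_rgb, display_frame_rgb):
    """Adjust the contribution image for a non-neutral controlled
    light source, here assumed to be an LCD display.

    The light color is estimated as the mean pixel value of the
    display frame for each primary color, and the contribution image
    is rescaled per channel so that this color maps to neutral gray.
    """
    # Mean display value per primary color = estimated light color.
    light_color = display_frame_rgb.reshape(-1, 3).mean(axis=0)
    # Per-channel gains that turn the light color into neutral gray;
    # the epsilon guards against a channel that is fully off.
    gains = light_color.mean() / np.maximum(light_color, 1e-6)
    return contribution_rgb * gains  # broadcasts over an H x W x 3 array
```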
In an embodiment, the color of the ambient light sources is adjusted in the at least one of the plurality of digital images prior to combining the image representing the illumination contribution with the at least one of the plurality of digital images. The ambient light sources are light sources present in the environment of the object other than the controlled light source. In this embodiment, pixels in the at least one of the plurality of captured digital images that are illuminated by both the ambient light sources and the controlled light source are identified. Further, in this embodiment, the true color of those pixels is determined from their color values in the illumination contribution of the controlled light source. If the controlled light source is non-neutral in color or varies in output over time, then the color of the pixels is corrected based on the color output of the controlled light source at the time of the image capture. Furthermore, in this embodiment, the color of the ambient light sources is determined by comparing the true color of the pixels with the pixels in the at least one of the plurality of digital images that are illuminated only by the ambient light sources. The color of the ambient light sources is then adjusted in the at least one of the plurality of digital images. At step 212, the method is terminated.
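A simplified, purely illustrative sketch of this ambient color adjustment follows. It assumes that pixels strongly lit by the (already color-corrected) controlled light source can be selected with a simple threshold on the contribution image; the threshold value and the per-channel ratio estimate are assumptions of the sketch, not the only possible realization:

```python
import numpy as np

def correct_ambient_color(frame_rgb, contribution_rgb, threshold=0.05):
    """Estimate and remove the color cast of the ambient light.

    Pixels strongly lit by the (color-corrected) controlled light
    source reveal their true color in the contribution image; the
    ambient cast is estimated here as the per-channel ratio between
    the captured frame and the contribution over those pixels.
    """
    # Select pixels clearly lit by the controlled light source.
    lit = contribution_rgb.mean(axis=2) > threshold

    true_color = contribution_rgb[lit].mean(axis=0)   # reference color
    observed = frame_rgb[lit].mean(axis=0)            # color under ambient

    # Per-channel ambient cast, normalized to preserve brightness,
    # then divided out of the whole frame.
    cast = observed / np.maximum(true_color, 1e-6)
    cast = cast / cast.mean()
    return frame_rgb / np.maximum(cast, 1e-6)
```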
At step 306, the images in the first set of digital images are digitally combined to obtain a first combined digital image, and the images in the second set of digital images are digitally combined to obtain a second combined digital image. In an embodiment, prior to digitally combining images in the first set of digital images, they are aligned to minimize the effect of motion. In this embodiment, digitally combining images in the first set of digital images further involves digitally adding the images present in the first set of digital images, pixel by pixel, to obtain a first intermediate digital image. Further, in this particular embodiment, each pixel of the first intermediate digital image is digitally averaged, to obtain the first combined digital image. In an embodiment, digitally combining images in the second set of digital images involves aligning all the images in the second set of digital images. In this embodiment, digitally combining images in the second set of digital images further involves adding the images present in the second set of digital images, pixel by pixel, to obtain a second intermediate digital image. Further, in this particular embodiment, each pixel of the second intermediate digital image is digitally averaged, to obtain the second combined digital image.
At step 308, the second combined digital image is subtracted from the first combined digital image to obtain a third image. The third image represents the illumination contribution provided by the controlled light source. At step 402, the color of the third image is adjusted when the color of the controlled light source is not neutral. Each pixel value of the third image is calculated by using the following equation:
Pixel value = Pixel value of the first combined digital image − Pixel value of the second combined digital image (2)
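Steps 306 and 308 can be sketched in Python as follows, assuming the two sets of frames are already aligned and supplied as lists of floating-point arrays in linear units:

```python
import numpy as np

def combine_and_subtract(first_set, second_set):
    """Sketch of steps 306-308 for two aligned sets of float frames.

    Averaging several frames before subtracting reduces sensor noise
    in the recovered illumination contribution.
    """
    # Step 306: add the frames of each set pixel by pixel, then
    # average, to obtain the two combined digital images.
    first_combined = np.mean(np.stack(first_set), axis=0)
    second_combined = np.mean(np.stack(second_set), axis=0)

    # Step 308 / equation (2): the third image is the difference,
    # floored at zero because photon counts cannot be negative.
    third_image = np.clip(first_combined - second_combined, 0.0, None)
    return first_combined, second_combined, third_image
```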
At step 404, the color of the ambient light sources in the second combined digital image is adjusted. The ambient light sources are light sources present in the environment of the object other than the controlled light source. The second combined digital image is compared with the third image. Pixels from both the second combined digital image and the third image are selected based on illumination level. The color of the third image is used as a reference to correct the color of the second combined digital image. In one embodiment, pixels having high illumination levels may be used to correct the color component.
If the controlled light source is non-neutral in color or varies in output, then the color of the pixels in the third image is corrected, based on the color output of the controlled light source at the time of the image capture, before the second combined digital image is adjusted.
At step 406, each pixel in the third image is amplified to obtain an amplified third image, since the controlled light source used for illumination is a weak light source. In an embodiment, the amplification is performed by multiplying each pixel in the third image by a scaling factor. For example, the value of the scaling factor may be four. Each pixel value of the amplified third image is calculated using the following equation:
Pixel value of the amplified third image = Pixel value of the third image × Scaling factor (3)
At step 408, the amplified third image is added to the first combined digital image to obtain a composite digital image. Each pixel value of the composite digital image is calculated using the following equation:
Pixel value of the composite digital image = Pixel value of the first combined digital image + Pixel value of the amplified third image (4)
At step 410, the method is terminated.
At step 506, a controlled illumination video image is selected or estimated for the output frame. At step 507, an ambient illumination video image is selected or estimated for the current frame. At step 508, each ambient illumination video image is subtracted from its associated controlled illumination video image to obtain an image representing the illumination contribution provided by the controlled light source. The pixel value of an image representing the illumination contribution is calculated using the following equation:
Pixel value = Pixel value in the associated controlled illumination video image − Pixel value in the ambient illumination video image (5)
At step 510, the illumination contribution video image is amplified, since the controlled light source used for illumination is a weak light source. In one embodiment, the amplification is performed by multiplying each pixel of the image by a scaling factor.
At step 602, the color of the illumination contribution video image is adjusted when the color of the controlled light source is not neutral. At step 604, the color of the ambient light is estimated by comparing pixels in both the illumination contribution video image and the ambient illumination video image, and the ambient illumination video image is color corrected using the estimate. The ambient light sources are light sources present in the environment of the object other than the controlled light source. In one embodiment, step 604 includes identifying pixels in a video image of the one or more video images that are illuminated by both the ambient light sources and the controlled light source.
At step 606, the color corrected illumination contribution video image and the color corrected ambient illumination video image are combined to produce an output video frame. The pixel value of the output video frame is calculated using the following equation:
Pixel value = Pixel value of the video image + Pixel value of the corresponding image representing the amplified illumination contribution (6)
At step 608, the status of the digital camera is checked. If the digital camera is still on, the method proceeds to step 609, where a new output video frame is started and the method returns to step 506. Otherwise, at step 610, the method is terminated.
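A high-level, purely illustrative Python sketch of this per-frame loop (steps 506 through 610) follows. For brevity it takes the most recent lit and unlit frames as the 'selected or estimated' images of steps 506 and 507, omits the color corrections of steps 602 and 604, and assumes aligned frames in linear units:

```python
import numpy as np

def enhance_video(frames, lit_flags, scaling_factor=4.0):
    """Sketch of the per-frame loop of steps 506-610.

    frames:    iterable of aligned float frames in linear light units
    lit_flags: parallel iterable of booleans; True if the controlled
               light source was on when the frame was captured
    Yields one enhanced output frame per input frame once both a lit
    and an unlit frame have been seen.
    """
    last_lit = None
    last_ambient = None
    for frame, lit in zip(frames, lit_flags):
        if lit:
            last_lit = frame
        else:
            last_ambient = frame
        if last_lit is None or last_ambient is None:
            continue  # not enough history yet to form an output frame

        # Step 508 / equation (5): illumination contribution image.
        contribution = np.clip(last_lit - last_ambient, 0.0, None)
        # Step 510: amplify the weak controlled-light contribution.
        amplified = contribution * scaling_factor
        # Step 606 / equation (6): combine into the output video frame.
        yield np.clip(last_ambient + amplified, 0.0, 1.0)
```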
In one embodiment, processor 708 is further adapted to adjust the color of ambient light sources in the plurality of digital or video images. The ambient light sources are light sources present in the environment of the object other than controlled light source 704. In this embodiment, pixels in at least one of the plurality of digital or video images that are illuminated by both the ambient light sources and the controlled light source 704 are identified. Further, in this embodiment, the true color of those pixels in the at least one of the plurality of digital or video images is determined from the color values of the illumination contribution of the controlled light source 704. If controlled light source 704 is non-neutral in color or varies in output, then the color of the pixels is corrected based on the color output of the controlled light source 704 at the time of the image capture. Furthermore, in this embodiment, the color of the ambient light sources is determined by comparing the true color of the pixels in the digital or video image(s) with the pixels in the digital or video image(s) that are illuminated only by the ambient light sources. Furthermore, the color of the ambient light sources is adjusted in the digital or video image(s).
The image capture module 802 can capture a plurality of video images of an object. The controlled light source(s) 804 are used for illuminating the object while capturing the plurality of video images at different illumination levels. These light sources can be any devices whose light output can be varied with time. Typically, this would be a white light with a predictable, invisible flicker, where that flicker is controlled by the light pattern generator. It could include the backlight of an LCD display or a simple lamp. The light pattern generator 806 is operatively coupled to the controlled light source(s) 804. The light pattern generator 806 sends illumination control signals to the controlled light source(s) 804. In one embodiment, the light pattern generator 806 hides the changes in illumination when the illumination level changes. In this embodiment, the light pattern generator 806 turns off the illumination for a very short period to hide the changes in the illumination level. The controlled light source(s) 804 use the illumination control signals to control the amount and/or color of light output by the controlled light source(s) 804 to illuminate the object while capturing the plurality of video images. In still another embodiment, the light pattern generator 806 is not necessary, because the controlled light source(s) 804 generate, by design, an intrinsic pattern that can be detected by the pattern illumination detector 808.
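As a toy illustration of the kind of control signal the light pattern generator 806 might emit, the following Python sketch alternates between two illumination levels and inserts a brief off period at each transition to hide the level change; all timing constants and levels are illustrative assumptions:

```python
def illumination_pattern(high=1.0, low=0.5, frames_per_level=2):
    """Generate an alternating two-level illumination control signal.

    A single dark value is inserted at each transition so the level
    change reads as a brief flicker rather than a visible step in
    brightness.
    """
    while True:
        for level in (high, low):
            yield 0.0  # brief off period hides the level change
            for _ in range(frames_per_level):
                yield level
```

For example, the first eight values produced with the defaults are 0.0, 1.0, 1.0, 0.0, 0.5, 0.5, 0.0, 1.0; each level change is masked by a single dark frame.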
The pattern illumination detector 808 is operatively coupled to the image capture module 802 and the light pattern generator 806. The pattern illumination detector 808 can receive the plurality of video images from the image capture module 802. The pattern illumination detector 808 correlates the plurality of recent video images to obtain an image representing the illumination contribution provided by the controlled light source(s) 804. The pattern illumination detector 808 can receive the information regarding the brightness and/or color of a controlled light source from the light pattern generator 806. The pattern illumination detector 808 can adjust the color of the image representing the illumination contribution when the color of the controlled light source is not neutral.
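The correlation performed by the pattern illumination detector 808 can be sketched as follows, assuming the recent frames are aligned, expressed in linear units, and accompanied by the known pattern levels (between 0 and 1) at which each frame was captured. With zero-mean weights, the constant ambient term cancels out of the weighted sum:

```python
import numpy as np

def contribution_by_correlation(recent_frames, pattern_levels):
    """Recover the controlled-light contribution by correlating the
    recent frames against the known illumination pattern.

    recent_frames:  list of aligned float frames in linear units
    pattern_levels: controlled-light level (0..1) for each frame
    Ambient light is constant across the frames, so it cancels
    under the zero-mean weights.
    """
    frames = np.stack(recent_frames).astype(np.float64)
    levels = np.asarray(pattern_levels, dtype=np.float64)
    weights = levels - levels.mean()  # zero-mean: ambient term cancels
    norm = np.dot(weights, levels)    # assumes the pattern varies (> 0)

    # Weighted sum over the frame axis, normalized so the result is
    # the per-pixel contribution at full controlled-light output.
    contribution = np.tensordot(weights, frames, axes=(0, 0)) / norm
    return np.clip(contribution, 0.0, None)
```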
The illumination enhancer 810 is operatively coupled to the pattern illumination detector 808. The illumination enhancer 810 receives the image representing the illumination contribution from the pattern illumination detector 808. The illumination enhancer 810 amplifies each pixel of the image representing the illumination contribution provided by the controlled light source(s) 804. The video frame generator 812 is operatively coupled to the image capture module 802 and the illumination enhancer 810. The video frame generator 812 receives the plurality of video images from the image capture module 802. The video frame generator 812 receives the enhanced image representing the illumination contribution from the illumination enhancer 810. The video frame generator 812 adjusts the color of the ambient light sources in the plurality of video images. The ambient light sources are light sources present in the environment of the object other than the controlled light source(s) 804. The video frame generator 812 also combines the enhanced image representing the illumination contribution with each of the plurality of video images to obtain a plurality of enhanced video images.
The image capture module 902 can capture a plurality of video images of an object. The display generator 904 is a controlled light source used for illuminating the object while capturing the plurality of video images at different illumination levels. The light pattern generator 906 is operatively coupled to the display generator 904. The light pattern generator 906 sends illumination control signals to the display generator 904. In one embodiment, the light pattern generator 906 hides the changes in illumination when the illumination level changes. In this embodiment, the light pattern generator 906 turns off the illumination for a very short period to hide the changes in the illumination level. The display generator 904 uses the illumination control signals to control the amount of light output by the display generator 904 to illuminate the object while capturing the plurality of video images.
The pattern illumination detector 908 is operatively coupled to the image capture module 902 and the light pattern generator 906. The pattern illumination detector 908 can receive the plurality of video images from the image capture module 902. The pattern illumination detector 908 correlates the plurality of recent video images to obtain an image representing the illumination contribution provided by the display generator 904. The pattern illumination detector 908 can receive the information regarding the color of a controlled light source from the light pattern generator 906. The pattern illumination detector 908 can adjust the color of the image representing the illumination contribution when the color of the controlled light source is not neutral.
The illumination enhancer 910 is operatively coupled to the pattern illumination detector 908. The illumination enhancer 910 receives the image representing the illumination contribution from the pattern illumination detector 908. The illumination enhancer 910 amplifies each pixel of the image representing the illumination contribution provided by the display generator 904. The video frame generator 912 is operatively coupled to the image capture module 902 and the illumination enhancer 910. The video frame generator 912 receives the plurality of video images from the image capture module 902. The video frame generator 912 receives the enhanced image representing the illumination contribution from the illumination enhancer 910. The video frame generator 912 adjusts the color of the ambient light sources in the plurality of video images. The ambient light sources are light sources present in the environment of the object other than the display generator 904. The video frame generator 912 also combines the enhanced image representing the illumination contribution with each of the plurality of video images to obtain a plurality of enhanced video images.
Various embodiments, as described above, provide a method and system for obtaining a digitally enhanced image of an object. In an embodiment, the digitally enhanced image includes a sequence of video images. The present invention digitally eliminates darkness from an image of an object that is captured using a weak light source. Since the light source used to capture the image is weak, the power requirement of the system is lower, and the heat generated by the system is lower, than when an electronic flash is used to capture an image.
According to an embodiment, the present invention also balances the color of the image by estimating the color of the light sources used for the illumination and present in the environment of the object, and adjusting the colors accordingly.
The present invention can also work with a wide variety of digital devices, including a camcorder, by adding a weak light source that is neutral in color. The present invention is particularly useful with a cell phone camera, where a powerful flash would require too much space and power.
In the foregoing specification, the invention and its benefits and advantages have been described with reference to specific embodiments. However, one with ordinary skill in the art would appreciate that various modifications and changes can be made without departing from the scope of the present invention, as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of the present invention. The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage or solution to occur or become more pronounced are not to be construed as critical, required or essential features or elements of any or all the claims. The invention is defined solely by the appended claims, including any amendments made during the pendency of this application, and all equivalents of those claims as issued.