The following description relates to a technology of combining images, and more particularly, to an apparatus and method for combining color images with black-and-white images including infrared components.
With the popularization of digital cameras, interest in digital imaging devices has been growing. A digital imaging device may edit or store captured images because it digitizes and processes various kinds of image information.
In general, a digital imaging device includes a lens, an image sensor, and an image processor. The lens focuses light reflected from an object and transmits the light to the image sensor so that a proper image is formed on the image sensor. The image sensor senses the incident light and generates image signals, that is, electrical signals. The generated image signals are then processed and may be displayed or stored.
Types of image sensors include image pickup tubes and solid-state image sensors. Solid-state image sensors include charge-coupled devices (CCDs), complementary metal-oxide-semiconductor (CMOS) sensors, and the like.
A CCD sensor includes a circuit in which a plurality of capacitors are connected in pairs. A CCD chip including a plurality of photodiodes generates electrons according to the amount of light incident on each photodiode, and image information may be created by reconstructing the information generated by the photodiodes.
CMOS image sensors may be manufactured at lower cost than CCD image sensors because they may be manufactured using general-purpose semiconductor manufacturing equipment. Therefore, CMOS image sensors have typically been used in low-priced digital cameras or slow-frame television cameras. However, CMOS image sensors may be unstable or perform poorly in a low illumination environment, and images captured by a CMOS image sensor may contain noise.
While a CMOS image sensor can convert infrared light as well as visible light into image signals, such infrared components are generally removed by an infrared blocking filter in order to easily restore color signals. However, in order to acquire images over wider bands, it is desirable to use the infrared components.
According to one general aspect, there is provided an image composition apparatus including an image acquiring unit to sense incident light and generate a first image signal with color information and a second image signal including infrared components without color information, an image signal divider to divide the first image signal into a brightness signal and a color signal, a brightness composer to compose the brightness signal of the first image signal with a brightness signal of the second image signal to generate a composed brightness signal, and an image restoring unit to compose the composed brightness signal with the color signal of the first image signal, so as to generate a color image.
The first image signal may include a signal corresponding to a specific region of a visible band of an optical spectrum, and the second image signal may include a signal corresponding to an infrared band of the optical spectrum and a combination of signals corresponding to specific regions of the visible band.
The first image signal may be a color image signal, and the second image signal may be a black-and-white image signal including infrared components.
The apparatus may further include a color space converter to convert a color space of the first image signal.
The apparatus may further include a dynamic bandwidth adjusting unit to equalize dynamic bandwidths of the first image signal and the second image signal.
The apparatus may further include a resolution adjusting unit to equalize resolutions of the first image signal and the second image signal.
The apparatus may further include a domain transformer to transform spatial domains of the first image signal and the second image signal into frequency domains.
According to another aspect, there is provided an image composition method in an image composition apparatus, the method including generating a first image signal with color information and a second image signal including infrared components without color information, dividing the first image signal into a brightness signal and a color signal, composing the brightness signal of the first image signal with a brightness signal of the second image signal to generate a composed brightness signal, and composing the composed brightness signal with the color signal of the first image signal to generate a color image.
The method may further include converting a color space of the first image signal prior to the dividing of the first image signal into the brightness signal and the color signal.
The method may further include equalizing dynamic bandwidths of the first image signal and the second image signal. The equalizing of the dynamic bandwidths of the first and second image signals may comprise compressing the dynamic bandwidth of the second image signal to match the dynamic bandwidth of the first image signal.
The method may further include matching a resolution of the first image signal with a resolution of the second image signal. The matching of the resolutions may comprise interpolating whichever of the first and second image signals has the lower resolution so that its resolution matches that of the other image signal.
Other features and aspects will be apparent from the following detailed description, the drawings, and the claims.
Throughout the drawings and the detailed description, unless otherwise described, the same drawing reference numerals will be understood to refer to the same elements, features, and structures. The relative size and depiction of these elements may be exaggerated for clarity, illustration, and convenience.
The following detailed description is provided to assist the reader in gaining a comprehensive understanding of the methods, apparatuses, and/or systems described herein. Accordingly, various changes, modifications, and equivalents of the systems, apparatuses and/or methods described herein will be suggested to those of ordinary skill in the art. Also, descriptions of well-known functions and constructions may be omitted for increased clarity and conciseness.
The image composition apparatus may be applied to an imaging device to detect light reflected from an object and create or store an image. For example, the image composition apparatus may be applied to a digital camera, a hardware system to drive a digital camera, an image processing chip, and the like. Referring to
The image acquiring unit 101 may be a CCD or CMOS image sensor to sense light reflected from an object and convert the sensed light into electrical signals. The image acquiring unit 101 senses light incident thereon to generate a predetermined image signal, wherein the image signal may be divided into a first image signal 102 and a second image signal 103.
The first image signal 102 includes color information, and the second image signal 103 includes infrared components without any color information. The first and second image signals 102 and 103 will be described further with reference to an optical spectrum illustrated in
Referring to
Image signals, such as the first and second image signals 102 and 103, having information of different wavelength bands, may be obtained by using, for example, a multi-sensor technology utilizing different optical systems and image sensors or by controlling the filtering function of a color filter array (CFA) without having to change an optical system.
Referring to
The filter array 115 includes color filters 117 to selectively transmit light belonging to specific regions (for example, the regions 114 of the optical spectrum) of the visible band, and a transparent filter 118 to transmit light over all bands.
Accordingly, where light reflected from an object passes through the filter array 115, the color filters 117 transmit light belonging to specific bands of the visible band and infrared light therethrough, and the transparent filter 118 transmits light (including infrared light) over all bands therethrough.
The image sensor 116 disposed below the filter array 115 may have a multi-layer structure with stacked sensor modules. For example, the upper layer 130 of the image sensor 116 includes first light receivers 119 to sense light belonging to the visible band from the light which has passed through the color filters 117, and a second light receiver 120 to sense white light which has passed through the transparent filter 118, and the lower layer 131 of the image sensor 116 includes a third light receiver 121 to sense light belonging to the infrared band.
The image sensor 116 may be formed by a semiconductor manufacturing process, and each light receiver may be a photodiode made of silicon. Here, since infrared light with wavelengths longer than those of visible light is absorbed at a relatively deeper location (that is, the lower layer 131), the multi-layer structure is provided in which the upper layer 130 detects light of the visible band and the lower layer 131 detects light of the infrared band.
In the image acquiring unit, the first light receiver 119 detects light with color information, the second light receiver 120 detects light (for example, white light) without color information, and the third light receiver 121 detects infrared light. Accordingly, the output signal of the first light receiver 119 is used as the first image signal 102, and the output signals of the second and third light receivers 120 and 121 are used as the second image signal 103.
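As a purely illustrative sketch of how the receiver outputs might be grouped into the two signals, the following assumes a hypothetical 2x2 unit cell in which three positions carry color filters (R, G, B) and the fourth is transparent, with an infrared reading taken from the lower sensor layer beneath the transparent positions; the actual filter geometry and readout of the apparatus are not limited to this layout.

```python
import numpy as np

def split_signals(raw_visible, raw_infrared):
    """Group receiver outputs into a color signal and a brightness-only signal.

    Hypothetical 2x2 unit cell (R G / B W) tiled over the sensor; raw_infrared is
    the lower-layer reading, sampled here at the transparent (W) positions.
    """
    r = raw_visible[0::2, 0::2]
    g = raw_visible[0::2, 1::2]
    b = raw_visible[1::2, 0::2]
    w = raw_visible[1::2, 1::2]
    first_signal = np.stack([r, g, b], axis=-1)   # first image signal: color information
    ir = raw_infrared[1::2, 1::2]                 # infrared sensed beneath the transparent cells
    second_signal = w + ir                        # second image signal: white + infrared, no color
    return first_signal, second_signal
```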
Referring back to
The first image signal 102 whose color space has been converted by the color space converter 104 is input to the image signal divider 105, which divides the first image signal 102 into a brightness signal 106 and a color signal 107.
Here, the brightness signal 106 corresponds to brightness information for the first image signal 102, and the color signal 107 corresponds to color information for the first image signal 102. For example, if the first image signal 102 represented as an RGB signal is converted into a YCbCr signal, the brightness signal 106 corresponds to the Y signal and the color signal 107 corresponds to the CbCr signal. Also, in an HSV color space, the V information may be used as the brightness signal 106, and in an HSI color space, the I information may be used as the brightness signal 106.
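As an illustration of this conversion and division, the following is a minimal sketch assuming an 8-bit RGB input and the common full-range BT.601 (JPEG) conversion coefficients; the patent does not prescribe a particular color space or conversion.

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    """Divide an 8-bit RGB image into a brightness (Y) plane and a CbCr color plane."""
    rgb = rgb.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  = 0.299 * r + 0.587 * g + 0.114 * b              # brightness signal
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b  # blue-difference chroma
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b  # red-difference chroma
    return y, np.stack([cb, cr], axis=-1)               # (brightness signal, color signal)
```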
The brightness composer 108 composes the brightness signal 106 divided from the first image signal 102 with the brightness signal of the second image signal 103. Since the second image signal 103 has only brightness information without any color information, the second image signal 103 may be composed with the brightness signal 106 divided from the first image signal 102 without any additional processing. For example, if the second image signal 103 is represented in the YCbCr color space, the Y information of the second image signal 103 is composed with the Y information of the first image signal 102 output from the image signal divider 105.
A brightness composition method may be used in which brightness signals are composed using, for example, a lookup table or a conversion function representing the relationship between the brightness information of the first image signal 102 and the brightness information of the second image signal 103. As another example, a brightness composition method may be used which adds the coefficients of brightness signals using Discrete Wavelet Transform (DWT).
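As one possible realization of the DWT-based composition mentioned above, the sketch below uses PyWavelets (an implementation choice, not part of the patent) and a simple fusion rule that averages the approximation coefficients and keeps the larger-magnitude detail coefficient at each position; other coefficient-combination rules, including plain addition or a lookup table, could be substituted.

```python
import numpy as np
import pywt  # PyWavelets, used here only for illustration

def fuse_brightness_dwt(y_color, y_bw, wavelet="haar"):
    """Compose two equally sized brightness planes by combining their DWT coefficients."""
    cA1, (cH1, cV1, cD1) = pywt.dwt2(y_color, wavelet)
    cA2, (cH2, cV2, cD2) = pywt.dwt2(y_bw, wavelet)
    cA = (cA1 + cA2) / 2.0                                # average the low-frequency band
    cH = np.where(np.abs(cH1) >= np.abs(cH2), cH1, cH2)   # keep stronger horizontal detail
    cV = np.where(np.abs(cV1) >= np.abs(cV2), cV1, cV2)   # keep stronger vertical detail
    cD = np.where(np.abs(cD1) >= np.abs(cD2), cD1, cD2)   # keep stronger diagonal detail
    return pywt.idwt2((cA, (cH, cV, cD)), wavelet)        # composed brightness signal
```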
The image restoring unit 109 combines the composed brightness signal 110 generated by the brightness composer 108 with the color signal 107 of the first image signal 102 divided by the image signal divider 105, so as to generate a color image. Combining the composed brightness signal 110 with the color signal 107 may be done by the inverse processing of the division processing by the image signal divider 105, and the final color image may be obtained by the inverse processing of the conversion processing by the color space converter 104.
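As an illustration of this restoring step, a minimal sketch of the inverse processing, assuming the same full-range BT.601 coefficients used in the earlier conversion sketch:

```python
import numpy as np

def ycbcr_to_rgb(y_composed, cbcr):
    """Combine a composed brightness plane with the CbCr color plane and restore RGB."""
    cb = cbcr[..., 0] - 128.0
    cr = cbcr[..., 1] - 128.0
    r = y_composed + 1.402 * cr
    g = y_composed - 0.344136 * cb - 0.714136 * cr
    b = y_composed + 1.772 * cb
    return np.clip(np.stack([r, g, b], axis=-1), 0, 255).astype(np.uint8)
```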
Accordingly, the color image provided by the image composition apparatus is an image created using both visible light signals and black-and-white image signals including infrared components. That is, by using signals over a wide band, recognizable image information may be obtained even in a low illumination environment.
Referring to
The dynamic bandwidth adjusting unit 122, which equalizes the dynamic bandwidths of the first and second image signals 102 and 103, adjusts the dynamic bandwidth of the second image signal 103 to be suitable for an image output apparatus, or compresses the dynamic bandwidth of the second image signal 103 to equal the dynamic bandwidth of the first image signal 102.
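Interpreting the dynamic bandwidth of a signal as its dynamic range of pixel values (an assumption), one simple sketch of the compression step linearly maps the second signal's range onto the first signal's range, using percentiles for robustness to outliers; the patent does not specify a particular compression curve.

```python
import numpy as np

def match_dynamic_range(second, first, lo=1.0, hi=99.0):
    """Compress the wide-range second signal so its value range matches the first signal's."""
    s_lo, s_hi = np.percentile(second, [lo, hi])
    f_lo, f_hi = np.percentile(first, [lo, hi])
    scaled = (second - s_lo) / max(s_hi - s_lo, 1e-12)       # normalize to roughly [0, 1]
    return np.clip(scaled, 0.0, 1.0) * (f_hi - f_lo) + f_lo  # map onto the first signal's range
```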
Referring to
For example, the resolution adjusting unit 123 interpolates the image signal with the lower resolution to equalize the resolutions of the first and second image signals 102 and 103. This process may be performed after brightness/color division.
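As a sketch of the interpolation itself, the following is a plain bilinear resampler written directly with NumPy; any standard resampling routine could be used instead.

```python
import numpy as np

def bilinear_resize(img, out_h, out_w):
    """Interpolate a single-channel image to the target resolution (bilinear)."""
    in_h, in_w = img.shape
    ys = np.linspace(0, in_h - 1, out_h)   # source row coordinate for each output row
    xs = np.linspace(0, in_w - 1, out_w)   # source column coordinate for each output column
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, in_h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None]
    wx = (xs - x0)[None, :]
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy
```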
Referring to
The domain transformer 201 transforms the spatial domains of the first and second image signals 102 and 103. For example, the domain transformer 201 transforms the spatial domains of the first and second image signals 102 and 103 into frequency domains. The domain inverse-transformer 202 inverse-transforms the frequency domains into the original spatial domains before generating a final color image.
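As one minimal sketch of such a transform pair, assuming a two-dimensional Fourier transform as the frequency-domain representation (a wavelet transform, as in the composition sketch above, would serve equally well), with the brightness composition carried out on the transformed data between the two calls:

```python
import numpy as np

def to_frequency_domain(plane):
    """Transform a spatial-domain brightness plane into the frequency domain."""
    return np.fft.fft2(plane)

def to_spatial_domain(spectrum):
    """Inverse-transform frequency-domain data back to the spatial domain
    before the final color image is generated."""
    return np.real(np.fft.ifft2(spectrum))
```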
Referring to
The color space of the first image signal is converted in operation S202. For example, the first image signal represented as an RGB signal is converted into a YCbCr signal.
In operation S203, the first image signal whose color space has been converted is divided into a brightness signal and a color signal. For example, the brightness signal corresponds to brightness information represented as a Y signal and the color signal corresponds to color information represented as a CbCr signal.
Where the first image signal is divided into the brightness signal and color signal, the brightness signal of the first image signal is composed with the brightness signal of the second image signal to generate a composed brightness signal in operation S204. The brightness composition may be carried out by adding the coefficients of brightness signals using DWT to obtain a composed brightness signal.
In operation S205, the composed brightness signal is composed with the color signal divided in the operation S203, so as to generate a color image. Here, the color image may be obtained by the inverse processing of the division processing performed in the operation S203.
Referring to
In operation S303, the first image signal is divided into a brightness signal and a color signal after adjusting the resolutions of the first and second image signals.
In operation S304, the dynamic bandwidths of the first and second image signals are equalized. The equalizing of the bandwidths may be carried out by compressing the second image signal, which has the wider dynamic bandwidth, to match the bandwidth of the first image signal, or by adaptively matching the dynamic bandwidth of the first image signal with the dynamic bandwidth of the second image signal.
In operation S305, the brightness signal of the first image signal is composed with the brightness signal of the second image signal to generate a composed brightness signal. The composed brightness signal is composed with the color signal of the first image signal to generate a color image in operation S306.
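Tying the operations together, the following end-to-end sketch assumes an 8-bit RGB input, a second image signal already resampled to the same resolution, full-range BT.601 conversion, min-max range matching, and a plain average of the two brightness planes in place of the lookup-table or DWT composition described above; each of these choices is illustrative, not prescribed by the patent.

```python
import numpy as np

def compose_color_and_ir(rgb, bw_ir):
    """End-to-end sketch: divide, equalize ranges, compose brightness, restore color.

    rgb   : HxWx3 uint8 color image (first image signal)
    bw_ir : HxW brightness image containing infrared components (second image signal),
            assumed already interpolated to the same resolution as rgb.
    """
    rgb = rgb.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]

    # Divide the first image signal into brightness (Y) and color (CbCr) signals.
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b

    # Equalize the dynamic range of the second image signal with that of the first.
    bw = (bw_ir - bw_ir.min()) / max(bw_ir.max() - bw_ir.min(), 1e-12)
    bw = bw * (y.max() - y.min()) + y.min()

    # Compose the two brightness signals (a plain average, for illustration only).
    y_composed = 0.5 * y + 0.5 * bw

    # Combine the composed brightness signal with the color signal and restore RGB.
    r2 = y_composed + 1.402 * (cr - 128.0)
    g2 = y_composed - 0.344136 * (cb - 128.0) - 0.714136 * (cr - 128.0)
    b2 = y_composed + 1.772 * (cb - 128.0)
    return np.clip(np.stack([r2, g2, b2], axis=-1), 0, 255).astype(np.uint8)
```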
According to examples described above, color images are combined with black-and-white images including infrared components.
According to examples described above, since black-and-white image signals having a wider band, as well as color image signals, are used to generate images, high-sensitivity images may be obtained in a low illumination environment.
According to examples described above, since infrared signals having no color information may be represented as black-and-white images, appropriate composition of black-and-white images with color images may yield images with high sensitivity and wide bandwidth.
The methods described above may be recorded, stored, or fixed in one or more computer-readable media that includes program instructions to be implemented by a computer to cause a processor to execute or perform the program instructions. The media may also include, alone or in combination with the program instructions, data files, data structures, and the like. Examples of computer-readable media include magnetic media, such as hard disks, floppy disks, and magnetic tape; optical media such as CD ROM disks and DVDs; magneto-optical media, such as optical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory (ROM), random access memory (RAM), flash memory, and the like. Examples of program instructions include machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer using an interpreter. The described hardware devices may be configured to act as one or more software modules in order to perform the operations and methods described above, or vice versa.
A number of exemplary embodiments have been described above. Nevertheless, it will be understood that various modifications may be made. For example, suitable results may be achieved if the described techniques are performed in a different order and/or if components in a described system, architecture, device, or circuit are combined in a different manner and/or replaced or supplemented by other components or their equivalents. Accordingly, other implementations are within the scope of the following claims.
This application is a continuation application of prior application Ser. No. 14/627,498, filed on Feb. 20, 2015, which is a continuation application of prior application Ser. No. 12/467,321, filed on May 18, 2009, which has issued as U.S. Pat. No. 8,989,487 on Mar. 24, 2015 and claimed the benefit under 35 U.S.C. § 119(a) of Korean Patent Application No. 10-2008-0046060, filed on May 19, 2008, the disclosure of which is incorporated herein in its entirety by reference.