The present invention relates to an imaging device comprising an imaging element array configured such that a plurality of imaging elements are arrayed.
Generally, a mobile device, such as a mobile phone or a smartphone, includes a camera module as standard equipment. In recent years, there has been an increasing need for mobile devices to have a smaller thickness. Along with this, the camera module is also required to be reduced in thickness.
For example, the following Patent Literature 1 discloses an imaging device comprising: a camera array configured such that a plurality of imaging elements are arrayed; and a plurality of lens elements each corresponding to a respective one of the imaging elements, wherein each of the imaging elements comprises a one-color (monochromatic) filter. In this imaging device, plural pieces of image data are acquired from the respective imaging elements, and integrated to generate a single piece of high-definition image data.
In the imaging device disclosed in the Patent Literature 1, a one-color filter is disposed on each of the imaging elements, so that it is only necessary for each of the lens elements to adapt to one spectral sensitivity characteristic. This facilitates a reduction in size of the lens element.
The following Patent Literature 2 discloses an image input device in which a one-color filter is disposed on each of a plurality of units each consisting of a plurality of light-receiving cells. In the Patent Literature 2, a one-color filter is disposed on each of the units to thereby facilitate a reduction in size of each lens, as with the Patent Literature 1.
However, in each of the Patent Literatures 1 and 2, only a one-color signal can be obtained from each of the imaging elements. Thus, in a situation where it is attempted to obtain a color image, it is necessary to integrate plural pieces of image data captured by the imaging elements on each of which one of at least three different color filters is disposed. Further, when the plural pieces of image data obtained from the imaging elements are integrated to synthesize a single piece of image data, it is necessary to utilize synthetic processing, e.g., a processing of aligning and adding the plural pieces of image data, and super-resolution processing.
For example, in the super-resolution processing, depending on the number of pixels in each of the plural pieces of image data, it may take several minutes or more to synthesize a single piece of image data. Such processing poses no major problem for a still image, whereas it poses a serious problem for a moving image, which requires real-time processing.
It is an object of the present invention to provide an imaging device comprising an imaging element array configured such that a plurality of imaging elements are arrayed, wherein the imaging device is capable of producing a color image at high speeds.
According to one aspect of the present invention, there is provided an imaging device which comprises: a plurality of optical systems arranged in a matrix pattern; and an imaging element array comprising an array of imaging elements each corresponding to a respective one of the optical systems, wherein the imaging element array includes a first imaging element on which at least two types of color filters each having a different spectral characteristic are disposed, and a second imaging element on which one type of color filter is disposed.
In this aspect, the present invention makes it possible to produce a color image at high speeds in an imaging device comprising an imaging element array configured such that a plurality of imaging elements are arrayed.
In this embodiment, the imaging element array 11 is configured such that the plurality of imaging elements 30 are arranged in a matrix pattern, for example, in M rows×N columns (where M is an integer of 2 or more, and N is an integer of 2 or more). Further, each of the lenses 20 is provided correspondingly to a respective one of the imaging elements 30. In this embodiment, as the lens 20, a single lens may be employed, or a lens group composed of a combination of a plurality of lenses may be employed.
The imaging element 30 comprises a plurality of pixels arrayed, for example, in m rows×n columns (where m is an integer of 2 or more, and n is an integer of 2 or more). For example, as the imaging element 30, it is possible to employ a CMOS-type imaging element or a CCD-type imaging element. The imaging elements 30 may be constructed by dividing one imaging element into a plurality of light-receiving regions formed on the same substrate, or by forming a plurality of imaging elements on the same substrate, or by arranging on the same plane a plurality of imaging elements formed on respective separate substrates. Respective angles of view of the arrayed lenses 12 are adjusted to enable capture of images of approximately the same object. Thus, plural pieces of image data captured by the respective imaging elements 30 represent approximately the same object.
In this embodiment, the R, G or B one-color filters disposed on the imaging elements 302 are arranged in a Bayer array. R, G and B color filters of the RGB mosaic filter array disposed on the imaging element 301 are also arranged in a Bayer array. As the Bayer array, it is possible to employ one type in which G color filters are arranged in a checkered pattern, and R and B color filters are arranged in the remaining regions at a ratio of 1:1. However, it is to be understood that this is merely one example. For example, it is possible to employ a Bayer array in which color filters to be arranged in a checkered pattern are B or R color filters.
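The Bayer arrangement described above can be sketched as a small helper that builds the filter map. This is a minimal, purely illustrative sketch; `bayer_pattern` is a hypothetical name, not part of the embodiment:

```python
import numpy as np

def bayer_pattern(rows, cols):
    """Illustrative Bayer color-filter map: G filters in a checkered
    pattern, with B and R filling the remaining cells at a 1:1 ratio,
    matching the arrangement described above."""
    pattern = np.empty((rows, cols), dtype="<U1")
    for r in range(rows):
        for c in range(cols):
            if (r + c) % 2 == 0:
                pattern[r, c] = "G"   # checkered G cells
            elif r % 2 == 0:
                pattern[r, c] = "B"   # B between the G cells in even rows
            else:
                pattern[r, c] = "R"   # R between the G cells in odd rows
    return pattern
```

For a 4×4 map this yields rows G B G B / R G R G / G B G B / R G R G, consistent with the row-by-row arrangement described next.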
Specifically, in the 1st row, the RGB mosaic filter array, and the B, G and B one-color filters are arranged in this order from the 1st column to the 4th column, and, in the 2nd row, the R, G, R and G one-color filters are arranged in this order from the 1st column to the 4th column. In the 3rd row, the G, B, G and B one-color filters are arranged in this order from the 1st column to the 4th column, and, in the 4th row, the R, G, R and G one-color filters are arranged in this order from the 1st column to the 4th column.
The imaging element 301 is divided, for example, into photoelectric conversion regions (pixels) 303 arranged in 4 rows×6 columns, and the R, G or B one-color filter is disposed on each of the photoelectric conversion regions 303. In this embodiment, the imaging element 301 comprises a plurality of pixels arranged in a matrix pattern of a given number m of rows×a given number n of columns. Each of the pixels is disposed in a respective one of the photoelectric conversion regions 303. In the embodiment illustrated in
As above, in the imaging element array 11 illustrated in
The imaging element 301 is equivalent to one example of “first imaging element on which at least two types of color filters each having a different spectral characteristic are disposed”. Further, the imaging element 302 is equivalent to one example of “second imaging element on which one type of color filter is disposed”.
In
In this example, the imaging element 301 at the intersection of the 2nd row and the 3rd column is configured such that R color filters and G color filters are arranged in a checkered pattern. Specifically, in the imaging element 301 at the intersection of the 2nd row and the 3rd column, R and G color filters are arranged in such a manner that, in the 1st row, the R color filter and the G color filter are alternately arranged from the 1st column to the 4th column, and, in the 2nd row, the G color filter and the R color filter are alternately arranged from the 1st column to the 4th column.
The imaging element 301 at the intersection of the 3rd row and the 2nd column is configured such that B color filters and G color filters are arranged in a checkered pattern. Specifically, in the imaging element 301 at the intersection of the 3rd row and the 2nd column, B and G color filters are arranged in such a manner that, in the 1st row, the B color filter and the G color filter are alternately arranged from the 1st column to the 4th column, and, in the 2nd row, the G color filter and the B color filter are alternately arranged from the 1st column to the 4th column.
In the arrangement pattern illustrated in
In the arrangement pattern illustrated in
The lens 20 is provided correspondingly to each of the imaging element RGB, the imaging elements G1 to Gk, the imaging elements B1 to Bk and the imaging elements R1 to Rk. The imaging element RGB corresponds to the imaging element 301 in
The color separation section 501 is configured to separate image data captured by the imaging element RGB, into three, R, G and B, color components. The switch 502 is configured to connect the color separation section 501 to the image processing section 503, in the mode for capturing a moving image. Thus, R, G and B color components separated by the color separation section 501 are output to the image processing section 503.
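The separation step performed by the color separation section 501 can be sketched as splitting the raw mosaic data into three sparse color planes. `separate_colors` and its `pattern` map are hypothetical names used only for illustration:

```python
import numpy as np

def separate_colors(mosaic, pattern):
    """Split raw mosaic data into sparse R, G and B planes, leaving
    zeros wherever the pixel's filter was a different color."""
    return {color: np.where(pattern == color, mosaic.astype(float), 0.0)
            for color in "RGB"}
```

Each plane then carries only the samples of its own color, ready for the color interpolation described below.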
The switch 502 is also configured to connect the color separation section 501 to the memories G10, B10, R10, in the mode for capturing a still image. Thus, R, G and B color components separated by the color separation section 501 are written into the memories R10, G10, B10, respectively.
The image processing section 503 is configured to, in the mode for capturing a moving image, subject image data consisting of the R, G and B color components separated by the color separation section 501, to given image processing to produce a color image, and output the color image to the display 504. The image processing section 503 is also configured to, in the mode for capturing a still image, read three pieces of super-resolved image data, respectively, from the memories G20, B20, R20, and, after subjecting the read image data to gamma correction to produce a color image, output the color image to the display 504.
In this embodiment, examples of the given image processing include color interpolation and gamma correction. For example, as the color interpolation, it is possible to employ a processing of interpolating missing pixels in the image data consisting of the separated R, G and B color components. For example, as the gamma correction, it is possible to employ a processing of correcting an image characteristic of image data obtained by each of the imaging elements 30 to a characteristic suitable for output characteristics of the display 504. Further, the image processing section 503 is configured to additionally output the image-processed image data to the compression section 505, as needed.
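As a minimal sketch of the two processing steps named above, assuming a single channel with a validity mask and linear sensor values in [0, 1] (both helper names are hypothetical):

```python
import numpy as np

def interpolate_missing(channel, mask):
    """Fill pixels where mask is False with the mean of the available
    4-neighbours; a crude stand-in for full color interpolation."""
    out = channel.astype(float).copy()
    rows, cols = channel.shape
    for r in range(rows):
        for c in range(cols):
            if not mask[r, c]:
                vals = [channel[rr, cc]
                        for rr, cc in ((r - 1, c), (r + 1, c),
                                       (r, c - 1), (r, c + 1))
                        if 0 <= rr < rows and 0 <= cc < cols and mask[rr, cc]]
                out[r, c] = sum(vals) / len(vals) if vals else 0.0
    return out

def gamma_correct(img, gamma=2.2):
    """Map linear sensor values in [0, 1] to display-ready values."""
    return np.clip(img, 0.0, 1.0) ** (1.0 / gamma)
```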
The memory G10 is an image memory configured to hold image data consisting of the G color component obtained by the imaging element RGB, and k pieces of image data obtained by the imaging elements G1 to Gk. The memory B10 is an image memory configured to hold image data consisting of the B color component obtained by the imaging element RGB, and k pieces of image data obtained by the imaging elements B1 to Bk. The memory R10 is an image memory configured to hold image data consisting of the R color component obtained by the imaging element RGB, and k pieces of image data obtained by the imaging elements R1 to Rk.
The super-resolution processing section G30 is configured to subject image data held by the memory G10 to the super-resolution processing to produce a single piece of G-component super-resolved image data, and write the produced image data into the memory G20. The super-resolution processing section B30 (R30) is configured to produce a single piece of B-component (R-component) super-resolved image data from image data held by the memory B10 (R10), and write the produced image data into the memory B20 (R20), in the same manner as the super-resolution processing section G30. In this embodiment, although depending on lens performance with respect to a resolution of the imaging element 30, each of the super-resolution processing sections G30, B30, R30 is configured to generate image data having, for example, a sub-pixel level of resolution about 5 times greater than the resolution of the imaging element 30. For allowing each of the super-resolution processing sections G30, B30, R30 to generate a sub-pixel level of image data, it is preferable that the misalignment between two pieces of image data captured by adjacent imaging elements 30 is at a sub-pixel level.
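The text does not specify the super-resolution algorithm itself. A naive shift-and-add sketch, assuming the sub-pixel offsets between imaging elements are known in advance, conveys the idea of placing low-resolution frames onto a finer grid (`shift_and_add_sr` is a hypothetical illustration):

```python
import numpy as np

def shift_and_add_sr(images, offsets, scale=2):
    """Naive shift-and-add super-resolution: place each low-res frame onto
    a 'scale'-times finer grid at its known sub-pixel offset (dy, dx, in
    low-res pixels) and average overlapping contributions."""
    h, w = images[0].shape
    acc = np.zeros((h * scale, w * scale))
    cnt = np.zeros_like(acc)
    for img, (dy, dx) in zip(images, offsets):
        ys, xs = int(round(dy * scale)), int(round(dx * scale))
        for r in range(h):
            for c in range(w):
                rr, cc = r * scale + ys, c * scale + xs
                if 0 <= rr < h * scale and 0 <= cc < w * scale:
                    acc[rr, cc] += img[r, c]
                    cnt[rr, cc] += 1
    cnt[cnt == 0] = 1          # grid cells hit by no frame stay zero
    return acc / cnt
```

Real implementations additionally interpolate the cells no frame lands on; this sketch leaves them at zero for brevity.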
In the example illustrated in
The memory G20 is configured to store therein the super-resolved G image data. Each of the memory B20 and the memory R20 is configured to store therein a respective one of the super-resolved B image data and the super-resolved R image data, in the same manner as that in the memory G20.
The compression section 505 is configured to compress image data output from the image processing section 503. In this embodiment, a variety of data compression methods may be employed in the compression section 505. For example, for a moving image, the compression section 505 may be configured to compress image data by a method such as H.264 or Motion JPEG. For a still image, the compression section 505 may be configured to compress image data by a method such as JPEG.
The display 504 is constructed by employing various types of displays such as a liquid crystal panel and an organic EL panel, and configured to display image data output from the image processing section 503. The recording medium 506 is composed, for example, of a stationary, rewritable storage device such as a hard disk, or a portable, rewritable storage device such as a memory card, and configured to store therein image data compressed by the compression section 505.
The communication section 507 is composed, for example, of a wireless LAN or wired LAN communication module, or a mobile phone communication module, and configured to send image data compressed by the compression section 505 to the outside, and receive image data from the outside.
Next, an operation of the imaging device in the mode for capturing a moving image will be described. First of all, an object is imaged, i.e., image data about the object is captured, by the imaging element RGB. The captured image data is separated into R, G and B color components by the color separation section 501, and image data consisting of the R, G and B color components is input into the image processing section 503 via the switch 502. Then, the image data input into the image processing section 503 is subjected to gamma correction and color interpolation, and formed as processed image data consisting of R, G and B color components, and the processed image data is displayed on the display 504. Further, for example, in response to a storage instruction input by a user, the processed image data is compressed by the compression section 505 and then stored in the recording medium 506. On the other hand, for example, in response to a sending instruction input by a user, the processed image data is sent to the outside by the communication section 507.
Next, an operation of the imaging device in the mode for capturing a still image will be described. First of all, an image of an object is captured by all of the imaging elements 30 to acquire plural pieces of image data. G image data captured by the imaging element RGB and k pieces of G image data captured by the imaging elements G1 to Gk are stored in the memory G10 once. Similarly, B image data captured by the imaging element RGB and k pieces of B image data captured by the imaging elements B1 to Bk are stored in the memory B10 once, and R image data captured by the imaging element RGB and k pieces of R image data captured by the imaging elements R1 to Rk are stored in the memory R10 once.
Then, the super-resolution processing section G30 integrates the plural pieces of G image data stored in the memory G10, based on the super-resolution processing, to thereby synthesize a single piece of G image data, and writes the single piece of G image data into the memory G20. Similarly, the super-resolution processing section B30 (R30) integrates the plural pieces of B (R) image data stored in the memory B10 (R10) to thereby synthesize a single piece of B (R) image data, and writes the single piece of B (R) image data into the memory B20 (R20).
Then, the super-resolved G, B and R image data stored in the memories G20, B20, R20 is subjected to image processing, such as gamma correction and color interpolation, through the image processing section 503, and displayed on the display 504. The super-resolved G, B and R image data after being subjected to the image processing through the image processing section is stored in the recording medium 506, or sent to the outside by the communication section 507, as needed.
Although the above imaging device is configured to, in the mode for capturing a still image, integrate plural pieces of image data captured by all of the imaging elements 30, the present invention is not limited thereto; image data captured by the imaging element RGB alone may be used as data for a still image. For example, in the case where the captured image data is uploaded to a social networking site, high image quality is not required for the captured image data. Thus, image data captured by the imaging element RGB may be used as data for a still image without any problem.
Switching between capture of a moving image and capture of a still image may be achieved, for example, by providing a manual operation section in the imaging device in such a manner as to allow a user to manually operate the manual operation section so as to select one of a moving image mode and a still image mode.
A switch 502 is configured to, in the mode for capturing a moving image, connect each of the color separation section 501 and the color separation section 508 to the synthesis section 509. The switch 502 is also configured to, in the mode for capturing a still image, connect each of the color separation section 501 and the color separation section 508 to memories G10, B10, R10.
The color separation section 501 is configured to separate image data captured by the imaging element RGB1 into three, R, G and B, color components. The color separation section 508 is configured to separate image data captured by the imaging element RGB2 into R, G and B color components.
The memory G10 is configured to hold two pieces of image data each consisting of the G color component separated by a respective one of the color separation section 501 and the color separation section 508, and k pieces of image data each consisting of the G color component captured by a respective one of a plurality of imaging elements G1 to Gk.
Each of the memory B10 and the memory R10 is configured to hold a respective one of image data consisting of B color components and image data consisting of R color components.
The synthesis section 509 is configured to, in the mode for capturing a moving image, integrate image data consisting of R, G and B color components separated by the color separation section 501, and image data consisting of R, G and B color components separated by the color separation section 508, to thereby synthesize a single piece of image data. More specifically, an amount of misalignment between the imaging element RGB1 and the imaging element RGB2 is preliminarily stored in the synthesis section 509. Then, the synthesis section 509 is operable to align the image data separated by the color separation section 501 and the image data separated by the color separation section 508, using the stored amount of misalignment, and add the aligned image data on a color component-by-color component basis to thereby synthesize a single piece of image data consisting of R, G and B color components. In this case, it is only necessary for the synthesis section 509 to perform the alignment and addition processing, so that it is possible to synthesize a single piece of image data under a low processing load. However, it is to be understood that this is merely one example. For example, the synthesis section 509 may be configured to integrate, based on super-resolution processing, image data captured by the imaging element RGB1 and image data captured by the imaging element RGB2. The use of the super-resolution processing causes an increase in processing load, as compared to the alignment and addition processing. However, considering that the processing load becomes lower along with a decrease in the number of pixels of the imaging elements RGB1, RGB2, the synthesis section 509 may employ the super-resolution processing without any problem, under a limited condition (e.g., in the case where a frame rate is relatively low).
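The alignment-and-addition synthesis can be sketched as follows, assuming for simplicity a whole-pixel misalignment (the text notes real misalignment is at a sub-pixel level) and using `np.roll`'s edge wrap-around as a stand-in for proper border handling; `align_and_add` is a hypothetical name:

```python
import numpy as np

def align_and_add(img_a, img_b, offset):
    """Shift img_b by the pre-stored misalignment (dy, dx) in whole
    pixels, then average it with img_a per color component."""
    dy, dx = offset
    shifted = np.roll(np.roll(img_b, dy, axis=0), dx, axis=1)
    return (img_a.astype(float) + shifted) / 2.0
```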
In the imaging device illustrated in
In the YMCG mosaic filter array, color filters having respective spectral characteristics for transmitting Y (yellow) light, M (magenta) light, C (cyan) light and G (green) light are arrayed. In the example illustrated in
In the case where the arrangement pattern in
The color separation section 501 is also configured to, in the mode for capturing a still image, after separating image data captured by the imaging element YMCG, into the four, Y, M, C and G, color components, convert the Y, M, C and G color components to R, G and B color components, by arithmetic processing, and write the R, G and B color components into the memories R10, G10, B10, respectively.
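The text does not spell out the arithmetic processing. Under the ideal complementary-color relations Y = R + G, M = R + B and C = G + B (an assumption, not part of the disclosure), one possible sketch of the conversion is:

```python
def ymcg_to_rgb(y, m, c, g):
    """Convert complementary-color samples to primaries, assuming the
    ideal relations Y = R + G, M = R + B, C = G + B."""
    r = (y + m - c) / 2.0
    b = (m + c - y) / 2.0
    g_est = (y + c - m) / 2.0          # G recovered from the complements
    return r, (g + g_est) / 2.0, b     # blend with the directly measured G
```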
In the WRGB mosaic filter array, a W (white) region on which no color filter is disposed, and color filters having respective spectral characteristics for transmitting B (blue) light, R (red) light or G (green) light, are arranged in a mosaic pattern. Although the following description assumes, only for the sake of simplicity of explanation, that a W color filter is disposed in the W region, no color filter is actually disposed in the W region.
In the example illustrated in
In the case where the arrangement pattern in
The color separation section 501 is also configured to, in the mode for capturing a still image, after separating image data captured by the imaging element WRGB, into the four, W, R, G and B, color components, convert the W, R, G and B color components to G, B and R color components, and write the G, B and R color components into the memories G10, B10, R10, respectively.
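The text likewise leaves this conversion unspecified. One illustrative possibility, assuming the panchromatic W sample ideally satisfies W = R + G + B, is to use W as a luminance reference for rescaling the primaries (`wrgb_to_rgb` is hypothetical):

```python
def wrgb_to_rgb(w, r, g, b, eps=1e-6):
    """Rescale the R, G, B samples so their sum matches the panchromatic
    W sample, assuming W ~ R + G + B; the less noisy W measurement then
    corrects the overall luminance of the primaries."""
    scale = w / max(r + g + b, eps)
    return r * scale, g * scale, b * scale
```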
In the WYR mosaic filter array, a W (white) region on which no color filter is disposed, and a color filter having a spectral characteristic for transmitting Y (yellow) light or R (red) light, are arranged in a mosaic pattern. In the example illustrated in
In the case where the arrangement pattern in
The color separation section 501 is also configured to, in the mode for capturing a still image, after separating image data captured by the imaging element WYR, into the three, W, Y and R, color components, convert the W, Y and R color components to G, B and R color components, and write the G, B and R color components into the memories G10, B10, R10, respectively.
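Under the ideal responses W = R + G + B and Y = R + G (an assumption, as the text gives no formulas), the G and B components follow by simple subtraction:

```python
def wyr_to_rgb(w, y, r):
    """Recover G and B from W, Y and R samples, assuming the ideal
    responses W = R + G + B and Y = R + G."""
    g = y - r     # Y carries R + G, so G = Y - R
    b = w - y     # W carries R + G + B, so B = W - Y
    return r, g, b
```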
In each of the arrangement patterns illustrated in
As above, the imaging device according to this embodiment is configured to, in the mode for capturing a moving image, use image data captured by the imaging element 301 on which an at least two-color mosaic filter array is disposed, so that it becomes possible to suppress a processing load during capture of a moving image and thereby suppress power consumption. In the above embodiment, the number of the imaging elements 301 is set to 1 or 2. However, the number is not limited thereto, but may be set to 3 or more.
(1) The imaging device according to the above embodiment comprises: a plurality of optical systems arranged in a matrix pattern; and an imaging element array comprising an array of imaging elements each corresponding to a respective one of the optical systems, wherein the imaging element array includes a first imaging element on which at least two types of color filters each having a different spectral characteristic are disposed, and a second imaging element on which one type of color filter is disposed.
In this imaging device, the imaging elements are arrayed in a matrix pattern to form the imaging element array. Then, the imaging element array includes the first imaging element having two types of color filters disposed thereon, and the second imaging element having one type of color filter disposed thereon. Therefore, during capture of a moving image requiring high-speed processing, a color image can be synthesized from image data captured by the first imaging element.
That is, when a color image is synthesized from image data captured by the first imaging element, a processing load can be reduced, as compared to the case where a color image is synthesized from image data captured by all of the imaging elements, so that it becomes possible to obtain a color image at high speeds. In addition, the reduction in processing load facilitates power saving.
The number of the first imaging elements may be set to 1. In this case, it becomes possible to eliminate a need to integrate plural pieces of image data, thereby producing a color image at a higher speed. Further, as the second imaging element, a type having a different spectral characteristic from that of the first imaging element may be arranged. In this case, color image data having at least three color components can be generated at high speeds by integrating image data captured by the first imaging element and the second imaging element.
On the other hand, during capture of a still image having a low need for high-speed processing, as compared to a moving image, a high-definition color image can be synthesized from image data captured by all of the imaging elements, through super-resolution processing or the like.
(2) Preferably, the first imaging element is provided with an at least two-color mosaic filter array disposed thereon, and the second imaging element is provided with a one-color filter disposed thereon.
In the imaging device having this feature, the at least two-color mosaic filter array is disposed on the first imaging element, so that a color image having at least two color components can be obtained on a real-time basis. Further, the one-color filter is disposed on the second imaging element, so that a high-definition color image can be obtained by integrating image data obtained from the first imaging element and the second imaging element.
(3) Preferably, the one-color filter disposed on the at least one second imaging element has a same color as one of the colors of the mosaic filter array.
In the imaging device having this feature, the one-color filter disposed on the second imaging element has the same color as one of the colors of the mosaic filter array disposed on the first imaging element, so that higher-definition image data in terms of the color can be obtained by integrating image data obtained from the first imaging element and the second imaging element.
(4) Preferably, the above imaging device is configured to use the first imaging element to obtain image data for a moving image.
In the imaging device having this feature, the first imaging element is used to obtain image data for a moving image. Thus, image data for a moving image can be obtained at a higher speed, as compared to the case where image data for a moving image is generated by integrating image data captured by all of the imaging elements.
(5) Preferably, the first imaging element is provided with an RGB mosaic filter array disposed thereon.
In the imaging device having this feature, the RGB mosaic filter array is disposed on the first imaging element, so that a color moving image having RGB color components can be produced on a real-time basis.
(6) Preferably, the first imaging element is provided in a plural number, wherein at least one of the first imaging elements is provided with a GR mosaic filter array disposed thereon, and each of the remaining first imaging elements is provided with a GB mosaic filter array disposed thereon.
In the imaging device having this feature, two types of color filters are disposed on each of the first imaging elements, so that, as compared to the case where three types of color filters are disposed thereon, a wavelength band of light to be transmitted through the optical system becomes narrower, and thereby the optical system corresponding to the first imaging element can be reduced in size. More specifically, the GR mosaic filter array is disposed on at least one of the first imaging elements, and the GB mosaic filter array is disposed on each of the remaining first imaging elements, so that a color moving image having RGB color components can be produced on a real-time basis by integrating image data captured by the first imaging elements.
(7) Preferably, the first imaging element is provided with a YMCG mosaic filter array disposed thereon.
In the imaging device having this feature, the YMCG mosaic filter array generally having higher sensitivity than an RGB mosaic filter array is disposed on the first imaging element, so that it becomes possible to provide higher sensitivity and obtain image data with good S/N ratio, on a real-time basis.
(8) Preferably, the first imaging element is provided with a WRGB mosaic filter array disposed thereon.
In the imaging device having this feature, it becomes possible to obtain image data consisting of a W (white) color component in addition to image data consisting of R, G and B color components. In this regard, the W color component has a spectral sensitivity in the overall bandwidth. Therefore, it becomes possible to provide higher sensitivity and obtain image data with good S/N ratio, by generating image data consisting of R, G and B color components, using image data consisting of a W color component.
(9) Preferably, the first imaging element is provided with a WYR mosaic filter array disposed thereon.
In the imaging device having this feature, image data consisting of a W color component can be obtained, so that it becomes possible to provide higher sensitivity and obtain image data with good S/N ratio, on a real-time basis.
(10) Preferably, the second imaging element is provided with an R, G or B one-color filter disposed thereon.
In the imaging device having this feature, the R, G or B one-color filter is disposed on the second imaging element, so that a high-definition color image can be obtained by integrating image data obtained from a plurality of the second imaging elements.
Number | Date | Country | Kind
---|---|---|---
2012-273378 | Dec 2012 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2013/007030 | 11/29/2013 | WO | 00