This application claims the benefit of Japanese Priority Patent Application JP 2014-042855 filed Mar. 5, 2014, and Japanese Priority Patent Application JP 2014-188275 filed Sep. 16, 2014, the entire contents of each of which are incorporated herein by reference.
The present technology which is disclosed in the specification relates to an imaging apparatus which images a high dynamic range image using an imaging element with a low dynamic range.
With the increase in the bit depth of imaging elements (image sensors), the support for higher bit depths in displays, and the like, a high dynamic range (HDR) of images is becoming widespread. In an HDR image, a contrast ratio of a color with maximum brightness to a color with minimum brightness reaches 10000:1 or greater, for example, and it is possible to realistically express the real world. An HDR image has advantages in that it is possible to realistically express shade, simulate an exposure, express glare, and the like.
As fields of application of the HDR technology, there are instruments or devices in which an image which is captured from a complementary metal oxide semiconductor (CMOS) sensor or a charge coupled device (CCD) sensor is used, such as a digital still camera, a camcorder for a moving image, a camera for a medical image, a surveillance camera, a digital camera for cinema photography, a camera for a binocular image, a display, and the like.
Various technologies for imaging a high dynamic range image using an imaging element for a low dynamic range have been proposed.
For example, an imaging apparatus in which an HDR image is composited from a plurality of imaged images of which exposure amounts are different has been proposed (for example, refer to Japanese Unexamined Patent Application Publication No. 2013-255201). However, when an HDR image of one frame is composited from a plurality of frames, there are the following problems.
(1) Memory for a plurality of frames is necessary
(2) A delay time occurs due to the photographing and processing of the plurality of frames
(3) Motion blur occurs in a moving object
In addition, an imaging apparatus has been proposed in which a mask plate which is formed of a two-dimensional array of cells of which degrees of transparency corresponding to an exposure value are different is placed before an image sensing device, imaging is performed using a mechanism in which exposures are different for each pixel in one frame, and an image signal in a high dynamic range is generated by performing predetermined image processing with respect to the obtained image signal (for example, refer to Japanese Patent No. 4494690).
On the other hand, as a technology of obtaining image signals of which properties or imaging conditions are different from one frame, a technology of light field photography (LFP) is known. In an imaging apparatus in which the LFP is used, a lens array is arranged between an imaging lens and an image sensor. An input ray from an object is divided into rays of each viewpoint in the lens array, and is received in the image sensor thereafter. Multiple viewpoint images are generated at the same time using pixel data which is obtained from the image sensor (for example, refer to Japanese Unexamined Patent Application Publication No. 2010-154493, and “Light Field Photography with a Hand-Held Plenoptic Camera” (Stanford Tech Report CTSR 2005-02) written by Ren Ng et al.).
In the technology of LFP, viewpoints are divided using a lens array, and multiple viewpoint images are generated in one frame. Specifically, in an imaging apparatus in which the technology of LFP is used, a ray which penetrates one lens of the lens array is received in m×n pixels (here, m and n are integers of one or more, respectively) on the image sensor. That is, it is possible to obtain viewpoint images of the m×n pixels corresponding to each lens. When such a property of the imaging apparatus in which the technology of LFP is used is exploited, it is possible to generate parallax images of which phase differences are different for each viewpoint in the left and right directions. That is, it is possible to display a stereoscopic image in which binocular parallax is used.
It is desirable to provide an excellent imaging apparatus in which imaging of a high dynamic range image is performed using an imaging element for a low dynamic range.
According to an embodiment of the present technology, there is provided an imaging apparatus which includes an imaging lens; an imaging element which performs a photoelectric conversion with respect to light which is condensed using the imaging lens; and a lens array which is configured by arranging micro lenses of which exposure conditions are different on a two-dimensional plane, which is arranged on a front face side of an imaging face of the imaging element so as to be separated therefrom, and which causes light which is output from each micro lens to be formed as an image on the imaging face of the imaging element.
The imaging apparatus may further include an image composition unit which composites a plurality of imaged images which are output from the imaging element, and of which exposure conditions are different, and generates a high dynamic range image.
In the imaging apparatus, the lens array may include a micro lens with a property of a low exposure lens, and a micro lens with a property of a high exposure lens. In addition, the imaging element may photograph a low exposure image and a high exposure image by respectively performing a photoelectric conversion with respect to output light of each micro lens with the property of the low exposure lens and the property of the high exposure lens, and the image composition unit may generate a high dynamic range image by compositing the low exposure image and the high exposure image.
In the imaging apparatus, the lens array may include micro lenses of three or more types of which exposure lens properties are different. In addition, the imaging element may photograph images of three types or more of which exposure conditions are different by performing a photoelectric conversion, respectively, with respect to output light of a micro lens with each exposure lens property, and the image composition unit may generate a high dynamic range image by compositing imaged images of three types or more of which the exposure conditions are different.
The imaging apparatus may further include an interpolation unit which improves resolution by interpolating pixels at a pixel position with another exposure condition using a pixel value of neighboring pixels with the same exposure condition with respect to respective imaged images of which exposure conditions are different, after the images are formed in the imaging element.
In the imaging apparatus, the interpolation unit may improve resolution of respective imaged images of which exposure conditions are different so as to be the same resolution as that of an input image using the pixel interpolation.
In the imaging apparatus, the micro lens may include a diaphragm for controlling a light intensity which meets a corresponding exposure condition.
According to another embodiment of the present technology, there is provided an imaging apparatus which includes an imaging lens; an imaging element which performs photoelectric conversion with respect to light which is condensed using the imaging lens; a lens array which is configured by arranging, on a two-dimensional plane, a plurality of micro lenses to which m×n pixels of the imaging element are respectively allocated, and which is arranged on a front face side of an imaging face of the imaging element so as to be separated therefrom; and an image composition unit which composites at least part of image data among the m×n pixels which receive light which has passed through each micro lens of the lens array.
In the imaging apparatus, the image composition unit may generate a stereoscopic image based on at least part of image data among the m×n pixels which receive light which has passed through each micro lens of the lens array.
In the imaging apparatus, the image composition unit may composite a left eye image based on image data which is read from a pixel which receives a ray for a left eye which passes through each micro lens, and may composite a right eye image based on image data which is read from a pixel which receives a ray for a right eye.
In the imaging apparatus, the image composition unit may generate a plurality of images of which exposure conditions are different at the same time based on at least part of image data among m×n pixels which receive light which has passed through each micro lens of the lens array.
In the imaging apparatus, the image composition unit may generate a low exposure image based on image data which is read from a pixel which is set to a low exposure condition among m×n pixels which receive light which has passed through each micro lens, and generate a high exposure image based on image data which is read from a pixel which is set to a high exposure condition simultaneously with the low exposure image.
In the imaging apparatus, the image composition unit may generate a stereoscopic image, a low exposure image, and a high exposure image at the same time based on at least part of image data among m×n pixels which receive light which has passed through each micro lens of the lens array.
In the imaging apparatus, the image composition unit may generate a high dynamic range image by compositing the low exposure image and the high exposure image which are generated at the same time.
In the imaging apparatus, the imaging element may be arranged in a state in which a pixel group which is arranged in a square lattice shape along a horizontal direction and a vertical direction is rotated by a predetermined angle in a light receiving plane.
In the imaging apparatus, an exposure time of each pixel may be controlled so as to have a light intensity which meets each exposure condition.
In the imaging apparatus, an amount of narrowing of light which is input to each pixel may be controlled so as to be a light intensity which meets each exposure condition.
The imaging apparatus may further include an encoding unit which outputs a code stream by encoding an image which is generated in the image composition unit.
In the imaging apparatus, generating either a stereoscopic image or a high dynamic range image may be selected based on instructed information, and the encoding unit may output an encoding result of the stereoscopic image when the generating of the stereoscopic image is selected, and output an encoding result of the high dynamic range image when the generating of the high dynamic range image is selected.
In the imaging apparatus, the encoding unit may include a tone mapping unit which performs tone mapping with respect to a high dynamic range image when the high dynamic range image is encoded; a first encoding unit which encodes the image after being subjected to the tone mapping; a decoding unit which decodes an encoding result of the first encoding unit; a reverse tone mapping unit which performs reverse tone mapping with respect to a decoding result of the decoding unit; a difference calculation unit which calculates a difference between the original high dynamic range image and the image which is subjected to the reverse tone mapping; and a second encoding unit which encodes the difference image from the difference calculation unit.
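The two-layer encoding flow above (tone mapping, base encoding, local decoding, reverse tone mapping, residual encoding) can be sketched as follows. This is a minimal illustration only: the gamma tone map, the peak normalization, and the 8-bit quantization standing in for the first encoding unit are assumptions, not the method of the specification.

```python
import numpy as np

def tone_map(hdr, gamma=2.2):
    # Compress the HDR range into [0, 1] with a simple gamma curve.
    peak = float(hdr.max())
    return (hdr / peak) ** (1.0 / gamma), peak

def inverse_tone_map(ldr, peak, gamma=2.2):
    # Expand a tone-mapped image back to the HDR range.
    return (ldr ** gamma) * peak

def encode_hdr(hdr):
    # Tone mapping unit + first encoding unit (8-bit quantization here).
    ldr, peak = tone_map(hdr)
    base = np.round(ldr * 255.0).astype(np.uint8)
    # Decoding unit + reverse tone mapping unit: reproduce what a decoder
    # of the first layer would reconstruct.
    recon = inverse_tone_map(base / 255.0, peak)
    # Difference calculation unit: residual for the second encoding unit.
    residual = hdr - recon
    return base, residual, peak

def decode_hdr(base, residual, peak):
    # Reverse tone map the base layer and add back the residual.
    return inverse_tone_map(base / 255.0, peak) + residual

hdr = np.linspace(1.0, 1000.0, 16).reshape(4, 4)  # synthetic HDR values
base, residual, peak = encode_hdr(hdr)
restored = decode_hdr(base, residual, peak)
```

Because the residual is kept losslessly in this sketch, the decoded image matches the original exactly; a real second encoding unit would compress the residual as well.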
According to the technology which is disclosed in the specification, it is possible to provide an excellent imaging apparatus in which a high dynamic range image is imaged using an imaging element of a low dynamic range.
According to the technology which is disclosed in the specification, since a high dynamic range image is generated from an image of one frame using an imaging element of a low dynamic range, it is possible to solve problems in a memory, delay, and motion blur of a moving object in a case of generating a high dynamic range image from a plurality of frames.
According to the technology which is disclosed in the specification, a plurality of exposure images of which exposure conditions are different are obtained at the same point of time by arranging a lens array according to the LFP technology on the front face of the imaging element, and by controlling rays which pass through each micro lens of the lens array so as to be output under different exposure conditions; accordingly, it is possible to generate a high dynamic range image in one frame by compositing the plurality of exposure images. Since the process of generating the high dynamic range image is completed in one frame, it is possible to save a frame memory, and it is also possible to solve the problem of motion blur of a moving object since the delay time is shortened. In addition, according to the technology which is disclosed in the specification, it is possible to generate a stereoscopic image using binocular parallax based on the principle of the LFP.
In addition, the effect which is disclosed in the specification is merely an example, and the effect of the present technology is not limited to this. In addition, there is a case in which the present technology exhibits another additional effect, in addition to the above described effect.
Further, another object, property, or advantage of the technology which is disclosed in the specification will be clarified using detailed descriptions based on embodiments which will be described later, or based on accompanying drawings.
Hereinafter, embodiments of the technology which is disclosed in the specification will be described in detail with reference to drawings.
An imaging unit 101 outputs one frame including an image signal 103 with a high exposure amount and an image signal 104 with a low exposure amount in one imaging process. In addition, an image composition unit 102 composites the image signal 103 with the high exposure amount and the image signal 104 with the low exposure amount, and generates an HDR image using imaging of one frame, that is, in one imaging process. A difference in exposure conditions such as the high exposure amount or the low exposure amount is controlled using an exposure time of each pixel, an amount of narrowing of a diaphragm window at a time of exposure, or the like.
In addition, in
The imaging unit 201 outputs one frame including an image signal 203 with a high exposure amount, an image signal 204 with a low exposure amount, and an image signal 205 with a medium exposure amount in one imaging process. In addition, an image composition unit 202 composites the image signal 203 with the high exposure amount, the image signal 204 with the low exposure amount, and the image signal 205 with the medium exposure amount, and generates an HDR image using imaging of one frame, that is, in one imaging process. A difference in exposure conditions such as the high exposure amount or the low exposure amount is controlled using an exposure time of each pixel, an amount of narrowing of a diaphragm window at a time of exposure, or the like.
As a technology of obtaining image signals of which properties or exposure conditions are different from one frame, the light field photography (LFP) technology is known. In an imaging apparatus in which the LFP technology is used, a lens array is arranged between an imaging lens and an image sensor. An input ray from an object is divided into rays of each viewpoint in the lens array, and is received in the image sensor thereafter. In addition, a multi viewpoint image is generated at the same point of time using image data which is obtained from the image sensor (for example, refer to Japanese Unexamined Patent Application Publication No. 2010-154493, and “Light Field Photography with a Hand-Held Plenoptic Camera” (Stanford Tech Report CTSR 2005-02) written by Ren Ng et al.).
In the LFP technology, viewpoints are divided using the lens array, and images of a plurality of viewpoints are generated in one frame. In contrast to this, the first embodiment is the same as the LFP in the related art in a point of arranging the lens array on the front face of the imaging element face of the imaging unit 101 (or 201); however, the first embodiment is different from the related art in a point in which images of which exposure amounts are different are generated in one frame by using a lens array in which micro lenses of which exposure properties are different are combined. In addition, according to the embodiment, it is possible to generate an HDR image from one frame by compositing the images of which the exposure amounts are different.
According to the embodiment, the lens array 302 is configured by alternately arranging two types of micro lenses of an L lens with a property of a low exposure lens, and an H lens with a property of a high exposure lens on a two-dimensional plane. In addition, the lens array has a configuration in which one micro lens is provided with respect to one pixel of the imaging element 301 (that is, one to one correspondence of pixel and micro lens), and each pixel is irradiated with light which passes through a corresponding micro lens. Accordingly, imaged images of the L lens and the H lens are respectively input to pixels on the imaging element 301, and a photoelectric conversion is performed. As a result, a high exposure pixel signal 103 is output from an H pixel irradiated with light which has passed through the H lens, and a low exposure pixel signal 104 is output from an L pixel irradiated with light which has passed through the L lens.
Therefore, in the imaging unit 101, new L component pixels L1 and L2 are generated at positions which originally are those of H component pixels, using interpolation processing (for example, calculation of a mean value) of neighboring L component pixels with respect to an imaged image which is output from the imaging element 301. In this manner, since values of neighboring pixels are usually similar, it is possible to obtain a high compensation effect even though an L component pixel does not actually exist at the position, and to maintain the original resolution of the input image as well.
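The interpolation described above can be sketched as follows: a minimal example assuming a checkerboard of L and H positions and mean-value interpolation from the valid 4-neighbors. The array contents and the `fill_missing` helper are hypothetical, for illustration only.

```python
import numpy as np

def fill_missing(img, valid_mask):
    # Interpolate a value at each position where valid_mask is False by
    # averaging the valid 4-neighbors (mean-value interpolation).
    out = img.astype(np.float64).copy()
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            if valid_mask[y, x]:
                continue
            vals = [img[ny, nx]
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                    if 0 <= ny < h and 0 <= nx < w and valid_mask[ny, nx]]
            out[y, x] = sum(vals) / len(vals)
    return out

# Checkerboard of L positions (True) and H positions (False), as produced
# by alternately arranged L and H lenses.
raw = np.arange(16, dtype=np.float64).reshape(4, 4)
l_mask = (np.indices((4, 4)).sum(axis=0) % 2) == 0
l_full = fill_missing(raw, l_mask)  # full-resolution L component image
```

The same routine with the mask inverted yields the full-resolution H component image, so both composited inputs keep the resolution of the original frame.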
In addition, as illustrated in
As illustrated in
The image composition unit 102 is capable of generating a high dynamic range image in which halation, black crush, or the like does not occur by compositing the two images 701 and 702. A number of methods in which a high dynamic range image is generated by compositing a plurality of images of which exposure properties are different have already been used in the industry, and the embodiment is not limited to a specific image compositing method. In general, an image processing method has been used in which a dynamic range is improved in the entire image, while reducing halation in the image with a high exposure amount, and solving the problem of black crush in the image with a low exposure amount.
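One simple compositing rule of this kind, shown only as an illustrative sketch and not as the specific method of the embodiment, substitutes the exposure-scaled low exposure value wherever the high exposure value is saturated; the saturation threshold and the exposure ratio below are assumed values.

```python
import numpy as np

def composite_hdr(high, low, ratio, sat=250.0):
    # Where the high exposure image is not saturated, use it directly;
    # where it is saturated (halation), substitute the low exposure value
    # scaled by the exposure ratio onto the same radiance scale.
    high = high.astype(np.float64)
    low = low.astype(np.float64)
    return np.where(high < sat, high, low * ratio)

high = np.array([100.0, 255.0])  # second pixel is saturated
low = np.array([25.0, 80.0])
hdr = composite_hdr(high, low, ratio=4.0)  # -> [100.0, 320.0]
```

Shadow detail comes from the high exposure image and highlight detail from the scaled low exposure image, which is the essence of the composition performed by the image composition unit 102.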
In the examples illustrated in
According to the embodiment, the lens array 802 is configured by alternately arranging three types of micro lenses of an L lens with a property of a low exposure lens, an H lens with a property of a high exposure lens, and an M lens with a property of a medium exposure lens on a two-dimensional plane. In addition, the lens array has a configuration in which one micro lens is arranged with respect to one pixel of the imaging element 801 (that is, one to one correspondence of pixel and micro lens), and each pixel is irradiated with light which has passed through a corresponding micro lens. Accordingly, imaged images of the L lens, the M lens, and the H lens are respectively input to pixels on the imaging element 801, and photoelectric conversion is performed. As a result, a high exposure pixel signal 203 is output from an H pixel which is irradiated with light which has passed through the H lens, a low exposure pixel signal 204 is output from an L pixel which is irradiated with light which has passed through the L lens, and a medium exposure pixel signal 205 is output from an M pixel which is irradiated with light which has passed through the M lens.
Therefore, in the imaging unit 201, interpolation processing using neighboring pixels with the same component is performed at pixel positions of another component, for each of the L component pixels, the M component pixels, and the H component pixels, with respect to an imaged image which is output from the imaging element 801. In this manner, since values of neighboring pixels are usually similar, it is possible to obtain a high compensation effect even at a pixel position where the component does not actually exist, and to maintain the original resolution of the input image as well.
The image composition unit 202 is capable of generating an image of a higher dynamic range in which halation, black crush, or the like does not occur by compositing these three images 1001, 1002, and 1003. A number of methods in which a high dynamic range image is generated by compositing a plurality of images of which exposure properties are different have already been used in the industry, and the embodiment is not limited to a specific image compositing method.
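Extending the two-exposure idea to three exposures, one illustrative rule (again, not the prescribed method) keeps, per pixel, the longest exposure that is not saturated and rescales it onto a common radiance scale; the exposure ratios and saturation threshold are assumed values.

```python
import numpy as np

def composite_hdr3(low, mid, high, r_lm, r_mh, sat=250.0):
    # Per pixel, keep the longest exposure that is not saturated and
    # rescale it onto the radiance scale of the high exposure image.
    # r_lm: exposure ratio between the M and L images,
    # r_mh: exposure ratio between the H and M images.
    low = low.astype(np.float64)
    mid = mid.astype(np.float64)
    high = high.astype(np.float64)
    return np.where(high < sat, high,
                    np.where(mid < sat, mid * r_mh, low * r_mh * r_lm))

low = np.array([6.0, 25.0, 100.0])
mid = np.array([25.0, 100.0, 255.0])
high = np.array([100.0, 255.0, 255.0])
out = composite_hdr3(low, mid, high, r_lm=4.0, r_mh=4.0)  # -> [100.0, 400.0, 1600.0]
```

The medium exposure image 1003 fills the gap between the L and H images, so the transition between shadow and highlight regions is smoother than in the two-exposure case.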
In addition, when setting an exposure condition of each micro lens of the lens arrays 302 and 802, various methods are taken into consideration. For example, there is a method of controlling transmissivity of light by arranging a filter on the front face of a lens, and a method of determining an exposure amount by controlling a shutter speed by arranging a mechanical shutter on the front face of a micro lens, though the latter is mechanically difficult. When the shutter speed is increased, a low exposure is obtained since the light intensity is reduced, and accordingly, it is possible to obtain an L component image. On the other hand, when the shutter speed is decreased, a high exposure is obtained since the light intensity is increased, and accordingly, it is possible to obtain an H component image.
In addition, it is possible to set an exposure condition comparatively easily and effectively by arranging a diaphragm window at the outer periphery of each micro lens which configures the lens array 802 (or 302), and by respectively setting a narrowing amount corresponding to an exposure property of a corresponding pixel.
As described above, in the imaging apparatus according to the embodiment, a lens array which is configured by arranging micro lenses on a two-dimensional plane is arranged on the front face of the imaging element. Since each micro lens corresponds to one pixel of the imaging element, and different exposure conditions are set, the imaging apparatus generates a plurality of imaged images of which exposure conditions are different at the same point of time, and is capable of compositing a high dynamic range image from these imaged images. In the related art, frames of a plurality of points of time are captured in advance, and are composited (for example, refer to Japanese Unexamined Patent Application Publication No. 2013-255201); in contrast to this, according to the embodiment, since the process is completed in one frame, it is possible to save a frame memory, and there is an effect of shortening the delay time.
The imaging lens 1401 is a main lens for imaging the object 1410, and for example, is configured of a general optical lens which is used in a video camera, or a still camera. An opening diaphragm 1407 is arranged on the light input side or the light output side (light input side in illustrated example) of the imaging lens 1401. An image of the object 1410 which is similar to a shape of an opening of the opening diaphragm 1407 (for example, circular shape) is formed in each image forming region of each micro lens of the lens array 1402 on the imaging element 1403.
The lens array 1402 is configured by arranging a plurality of micro lenses on a two-dimensional plane such as a glass substrate, or the like, for example. The lens array 1402 is arranged on a focal face (image forming face) of the imaging lens 1401, and the imaging element 1403 is arranged at a focal position of the micro lens of the lens array 1402. Each micro lens is configured of an individual lens, a liquid crystal lens, a diffraction lens, or the like, for example. Though it will be described later in detail, a two-dimensional arrangement of the micro lens in the lens array 1402 corresponds to a pixel array in the imaging element 1403.
The imaging element 1403 performs photoelectric conversion with respect to a ray which is received through the lens array 1402, and outputs imaged data DO. The imaging element 1403 is configured using a charge coupled device (CCD), or a complementary metal oxide semiconductor (CMOS), and has a structure in which a plurality of pixels are arranged in a matrix.
The rays which have passed through each micro lens of the lens array 1402 are respectively received in pixel blocks of m×n (for example, 2×2) of the imaging element 1403. That is, a pixel block of m×n is allocated to one micro lens. In other words, using the lens array 1402, it is possible to separate viewpoints whose number equals the number of pixels which are allocated to each micro lens (= the total number of pixels of the imaging element 1403/the number of lenses of the lens array 1402).
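The relation above is simple arithmetic and can be checked directly; the sensor resolution below is an illustrative assumption, not a value from the specification.

```python
# Relation from the text: the number of separable viewpoints equals the
# number of pixels allocated to each micro lens, i.e. the total number of
# pixels of the imaging element divided by the number of lenses of the
# lens array.
sensor_w, sensor_h = 1920, 1080                    # assumed sensor size
m, n = 2, 2                                        # pixels per micro lens
total_pixels = sensor_w * sensor_h
num_lenses = (sensor_w // m) * (sensor_h // n)     # one lens per m x n block
viewpoints_per_lens = total_pixels // num_lenses   # = m * n
```

With a 2×2 allocation, four viewpoints are separated per micro lens, while each parallax image has one quarter of the sensor's pixel count.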
The separating of viewpoints here means that which position (region) of the imaging lens 1401 a ray which has passed through the imaging lens 1401 has passed through is stored in units of pixels of the imaging element 1403, including the directivity thereof.
When the number of viewpoints which are separated increases, angular resolution of a parallax image increases; on the other hand, when the number of pixels which is allocated to one micro lens decreases, two-dimensional resolution of the parallax image increases. That is, the angular resolution and the two-dimensional resolution of the parallax image are in a trade-off relationship. In the example illustrated in
The image processing unit 1404 performs a predetermined image process with respect to the imaging data DO which is obtained in the imaging element 1403, and outputs a parallax image or a high dynamic range image in the embodiment as output image data Dout. A detail of an image process for generating a parallax image or a high dynamic range image will be described later.
The imaging element driving unit 1405 drives the imaging element 1403, and performs a control of a light receiving operation thereof.
The control unit 1406 is configured of a micro computer, or the like, for example, and controls operations of the image processing unit 1404 and the imaging element driving unit 1405.
Subsequently, a pixel array in the imaging element 1403 will be described.
In the example illustrated in
In other words, in the diagonal array which is illustrated in
In the square array which is illustrated in
On the other hand, in the diagonal array illustrated in
In brief, in the configuration examples of the imaging apparatus 1400 which are illustrated in
According to the imaging apparatus 1400 in the embodiment, in each micro lens of the lens array 1402, it is possible to receive rays of the object 1410 as ray vectors of which viewpoints are different from each other, based on the principle of the LFP technology. Accordingly, it is possible to use an obtained parallax image as a stereoscopic image with binocular parallax, for example. When photographing a common stereoscopic image, a parallax image with binocular parallax is obtained using two cameras, that is, a camera for a right eye and a camera for a left eye. In contrast to this, in the imaging apparatus 1400 according to the embodiment, it is possible to easily obtain a stereoscopic image using one camera, due to the principle of the LFP technology and the generation of a parallax image using the micro lenses. In addition, as described above, resolution rarely decreases in each parallax image.
Subsequently, a specific method of reading and generating image data when generating a parallax image in the imaging apparatus 1400 will be described.
In the square array which is illustrated in
In contrast to this, in the diagonal array which is illustrated in
However, in the case of the diagonal array which is illustrated in
In order to solve the problem that part of the image data remains unused and becomes useless, a configuration of an imaging element may be adopted in which each micro lens performs the separating of left and right parallax by allocating, to one micro lens, the right half of a left pixel and the left half of a right pixel of two pixels which are neighboring in the horizontal direction, as illustrated in
In
In addition, in
In the imaging element illustrated in
In addition, in the example illustrated in
The R pixel (PCR) 2011 is configured by including two division pixels DPC-AR1 and DPC-BR1. In addition, the division pixel DPC-AR1 is allocated to an R image of a stereoscopic image, and the division pixel DPC-BR1 is allocated to an L image of the stereoscopic image. The same is applied to the R pixel (PCR) 2015.
The G pixel (PCG) 2012 is configured by including two division pixels DPC-AG1 and DPC-BG1. In addition, the division pixel DPC-AG1 is allocated to an R image of a stereoscopic image, and the division pixel DPC-BG1 is allocated to an L image of the stereoscopic image. The same is applied to the G pixel (PCG) 2013 and the G pixel (PCG) 2016.
In addition, the B pixel (PCB) 2014 is configured by including two division pixels DPC-AB1 and DPC-BB1. In addition, the division pixel DPC-AB1 is allocated to an R image of a stereoscopic image, and the division pixel DPC-BB1 is allocated to an L image of the stereoscopic image.
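The allocation of division pixels to the R image and the L image amounts to de-interleaving the readout. The sketch below assumes, purely for illustration, that the DPC-A division pixels (allocated to the R image) occupy the even columns of the raw readout and the DPC-B division pixels (allocated to the L image) the odd columns; the actual column order depends on the sensor layout.

```python
import numpy as np

def split_lr(raw):
    # De-interleave the division-pixel readout into the R image and the
    # L image, under the assumed even/odd column layout described above.
    r_image = raw[:, 0::2]   # DPC-A division pixels -> R image
    l_image = raw[:, 1::2]   # DPC-B division pixels -> L image
    return l_image, r_image

raw = np.arange(8.0).reshape(2, 4)  # two unit pixels per row, two division pixels each
l_image, r_image = split_lr(raw)
```

Each output image has half the horizontal sample count of the raw readout, matching the one-division-pixel-per-eye structure of the pixels PCR, PCG, and PCB.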
In the pixel array which is illustrated in
On a semiconductor substrate 2001, a light shielding unit (BLD) or wiring is formed, the color filter array 2003 is formed on a higher layer thereof, and an on-chip lens array 2005 is formed on a higher layer of the color filter array 2003. Each on-chip lens (OCL) of the on-chip lens array 2005 is formed in a matrix so as to correspond to each division pixel in the pixel array unit 2010. In addition, the lens array 1402 in which micro lenses are two-dimensionally arranged is arranged by facing a light input side of the on-chip lens array 2005.
In the example which is illustrated in
For example, the first micro lens ML1 is arranged so as to be shared by the division pixel DPC-BG1 for L of the stereoscopic image of the G pixel (PCG) 2012, and the neighboring division pixel DPC-AB1 for R of the stereoscopic image of the B pixel (PCB) 2014 in the first row. Similarly, the first micro lens ML1 is arranged so as to be shared by the division pixel DPC-BR1 for L of the stereoscopic image of the R pixel (PCR) 2011, and the neighboring division pixel DPC-AG1 for R of the stereoscopic image of the G pixel (PCG) 2013 in the second row.
In addition, the second micro lens ML2 is arranged so as to be shared by the division pixel DPC-BB1 for L of the stereoscopic image of the B pixel (PCB) 2014, and the neighboring division pixel DPC-AG1 for R of the stereoscopic image of the G pixel (PCG) 2016 in the first row. Similarly, the second micro lens ML2 is arranged so as to be shared by the division pixel DPC-BG1 for an L image of the stereoscopic image of the G pixel (PCG) 2013, and the neighboring division pixel DPC-AR1 for R of the stereoscopic image of the R pixel (PCR) 2015 in the second row.
In
Each pixel which is diagonally arranged is divided into two in the horizontal direction (X direction). In addition, a left division pixel of each pixel is allocated to an L image of a stereoscopic image, and a right division pixel is allocated to an R image of the stereoscopic image, respectively. In addition, in the pixel array illustrated in
Each unit pixel of the imaging element with the diagonal array which is illustrated in
Each pixel of the first pixel unit 2201 which is diagonally arranged is formed in a state of being rotated by 45° from the Y direction toward the X direction, for example, so as to straddle two neighboring columns of a row corresponding to the second pixel unit 2202 which is arranged in a square shape. In addition, each pixel of the first pixel unit 2201 is configured so as to include division pixels DPC1 and DPC2 in a triangular shape which are horizontally divided into two about the Y axis, in which each division pixel DPC1 is arranged on the left column of the two neighboring columns of the second pixel unit 2202, and each division pixel DPC2 is arranged on the right column. In addition, one micro lens (ML) is arranged so as to be shared by the two division pixels DPC1 and DPC2 with the same color in a straddling manner. In addition, one division pixel DPC1 is allocated to an L image of a stereoscopic image, and the other division pixel DPC2 is allocated to an R image of the stereoscopic image. Image data with the same color of each of the division pixels DPC1 and DPC2 for the R image and the L image which are included in each pixel of the first pixel unit 2201 can be read from pixels of the two neighboring columns on the second pixel unit 2202 side, and the mechanism is the same as that in the imaging element which is illustrated in
In a case of the imaging element with the diagonal array which is illustrated in
The configuration example of the imaging element (pixel array unit) which is illustrated in
Though it is not illustrated, also in the imaging element which is illustrated in
A row array and a column array of the first pixel unit, and a row array and a column array of the second pixel unit are formed so as to correspond to each other. Each pixel of the first pixel unit which is diagonally arranged is formed in a state of being rotated by 45° in the Y direction toward the X direction, for example, so as to straddle two neighboring columns of a row corresponding to the second pixel unit which is arranged in a square shape. In addition, each pixel of the first pixel unit is configured so as to include division pixels which have triangular shapes which are horizontally divided into two about the Y axis, and in which each division pixel is arranged in respective left and right columns of neighboring two columns of the second pixel unit. In addition, one micro lens (ML) is arranged so as to be shared by 2×2 pixels with the same color in a straddling manner. Image data of each division pixel which is included in each pixel of the first pixel unit is read from pixels of neighboring two columns on the second pixel unit side, and the mechanism is the same as that in the imaging element which is illustrated in
In the configuration example of the imaging element (pixel array unit) which is illustrated in
In
In
According to the configuration example of the imaging element (pixel array unit) which is illustrated in
Hitherto, the method of generating a stereoscopic image using left and right binocular parallax in the imaging apparatus 1400 to which the LFP technology is applied has been described. It is also possible to generate a high dynamic range (HDR) image using the same imaging apparatus 1400 to which the LFP technology is applied. Hereinafter, an embodiment in which an HDR image is generated using the imaging apparatus 1400 to which the LFP technology is applied will be described.
In
In contrast to this, according to the embodiment, as illustrated in
In addition,
In contrast to this, according to the embodiment, as illustrated in
In addition, though it is not illustrated in
In addition, a method of adjusting a narrowing amount as illustrated in
In addition, by performing separating of left and right parallax using a micro lens, and setting of a plurality of exposure properties using an exposure time, or the like, at the same time in the imaging apparatus 1400 to which the LFP technology is applied, it is possible to generate stereoscopic images with characteristics of the high dynamic range (HDR) image at the same time.
A row array and a column array of the first pixel unit, and a row array and a column array of the second pixel unit are formed so as to correspond to each other. Each pixel of the first pixel unit which is diagonally arranged is formed in a state of being rotated by 45° in the Y direction toward the X direction, for example, so as to straddle two neighboring columns of a row corresponding to the second pixel unit which is arranged in a square shape. In addition, each pixel of the first pixel unit is configured so as to include a division pixel in a triangular shape which is horizontally divided into two about the Y axis, and each division pixel is arranged in the respective left and right columns of two neighboring columns of the second pixel unit. In addition, one micro lens (ML) is arranged so as to be shared by 2×2 pixels with the same color in a straddling manner. Image data of each division pixel which is included in each pixel of the first pixel unit is read from pixels of neighboring two columns on the second pixel unit side, and the mechanism is the same as that in the imaging element which is illustrated in
In the configuration example of the imaging element (pixel array unit) which is illustrated in
In addition, in pixels of the second pixel unit corresponding to the half of each division pixel on the left side about the Y axis in each of the 2×2 pixels, short exposure (Se) is respectively performed. In addition, in pixels of the second pixel unit corresponding to the half of each division pixel on the right side about the Y axis, long exposure (Le) is respectively performed.
In the example which is illustrated in
LLe: Left+Long Exposure (long exposure in left image)
LSe: Left+Short Exposure (short exposure in left image)
RLe: Right+Long Exposure (long exposure in right image)
RSe: Right+Short Exposure (short exposure in right image)
As described above, methods for generating an HDR image by compositing a plurality of images of which exposure properties are different have been used in the industry. When image data items under the above described four conditions are present in one micro lens, it is possible to generate a high dynamic range image which is a left image by compositing image data of LLe and LSe. In addition, it is possible to generate a high dynamic range image which is a right image by compositing image data of RLe and RSe.
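The compositing step described above can be sketched in code. The following is a minimal illustration, not the method of the specification: the pixel values, the exposure ratio, the saturation level, and the simple rule of switching to the scaled short exposure when the long exposure clips are all assumptions for illustration.

```python
# Hypothetical sketch of generating left and right HDR images from the
# four per-microlens samples LLe, LSe, RLe, RSe.

EXPOSURE_RATIO = 8    # assumed long/short exposure-time ratio
SATURATION = 255      # 8-bit full scale

def hdr_merge(long_px, short_px):
    """Composite one long/short exposure pair into an HDR value
    expressed in long-exposure units."""
    if long_px >= SATURATION:
        # Long exposure clipped: recover the highlight from the short
        # exposure, scaled by the exposure ratio.
        return short_px * EXPOSURE_RATIO
    return long_px

def merge_views(lle, lse, rle, rse):
    """Left image from LLe/LSe, right image from RLe/RSe."""
    left = [hdr_merge(l, s) for l, s in zip(lle, lse)]
    right = [hdr_merge(l, s) for l, s in zip(rle, rse)]
    return left, right

left, right = merge_views([100, 255], [12, 40], [255, 90], [35, 11])
# left == [100, 320], right == [280, 90]: clipped long-exposure pixels
# are replaced by scaled short-exposure data.
```

In practice the merge is usually a smooth weighted blend rather than a hard switch, but the principle of combining the four per-microlens samples into two HDR views is the same.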
In addition,
In addition, when setting exposure conditions of pixels, various methods can be considered in addition to the above described method in which an exposure time is controlled. There are a method in which transmissivity of light is controlled by providing a filter on the front face of a lens (including a method of controlling the transmissivity of the lens itself), a method in which a mechanical shutter is provided on the front face of a lens (micro lens or on-chip lens) and an exposure amount is determined by controlling a shutter speed, and the like. Since light intensity decreases when making a shutter speed fast, this state corresponds to the above described short exposure Se, and image data with a component of a low exposure amount is obtained. On the other hand, since light intensity increases when making a shutter speed slow, this state corresponds to the above described long exposure Le, and image data with a component of a high exposure amount is obtained.
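The stated relation between shutter speed and exposure amount can be illustrated with a trivial calculation; the scene intensity and shutter times below are arbitrary assumptions.

```python
# Accumulated light is, to a first approximation, scene intensity
# integrated over the time the shutter stays open, so a fast shutter
# yields the short exposure Se and a slow shutter the long exposure Le.

def exposure_amount(scene_intensity, shutter_open_ms):
    """Light accumulated by a pixel over the shutter-open time (ms)."""
    return scene_intensity * shutter_open_ms

se = exposure_amount(1000, 1)   # fast shutter (1 ms) -> short exposure Se
le = exposure_amount(1000, 8)   # slow shutter (8 ms) -> long exposure Le
# le == 8 * se: the slower shutter collects eight times the light.
```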
In addition, the method of providing a diaphragm window, as illustrated in
Hitherto, the method of generating a high dynamic range image or a parallax image using an imaging apparatus to which the LFP technology is applied has been described. Usually, an image has a great amount of information. Accordingly, in general, the amount of information is reduced using image compression. In particular, in a case of a moving image, image compression is important.
Data 2811 which is input to the image compression device 2800 is a high dynamic range image, and is assumed to be expressed with a high bit depth of 8 or more bits and an accuracy of one decimal place. A plurality of methods for encoding a high bit depth image have been proposed. For example, a process of converting bit depth using tone mapping is used in the industry. According to the embodiment, the tone mapping unit 2801 performs tone mapping with respect to the input high dynamic range image 2811, and converts the image into an image 2812 of 8 bits. Accordingly, in the subsequent first encoding unit 2802, an encoding process corresponding to an image of 8 bit depth such as Joint Photographic Experts Group (JPEG) or Moving Picture Experts Group (MPEG) is applied.
The decoding unit 2803 performs a decoding process corresponding to a reverse conversion of the encoding process using the first encoding unit 2802 with respect to an encoding result 2813 of the first encoding unit 2802, and obtains a decoded image 2814. When the encoding process using the first encoding unit 2802 and the decoding process using the decoding unit 2803 are reversible modes, the 8 bit image 2812 before encoding and the image 2814 after decoding completely match; however, usually, the two do not match each other (the image 2814 after decoding becomes a deteriorated image compared to the image 2812 before encoding), since irreversible (lossy) compression is performed in order to improve the compression ratio.
The reverse tone mapping unit 2804 performs a reverse tone mapping process corresponding to a reverse conversion of the tone mapping process using the tone mapping unit 2801 with respect to the image 2814 after decoding using the decoding unit 2803.
As described above, since the encoding process in the first encoding unit 2802 is in an irreversible mode, an image 2815 which is subjected to reverse tone mapping using the reverse tone mapping unit 2804 does not match the input image 2811. The difference calculation unit 2805 calculates a difference between the image 2815 which is subjected to reverse tone mapping using the reverse tone mapping unit 2804 and the input image 2811, and outputs a difference image 2816.
The second encoding unit 2806 performs an encoding process with respect to the difference image 2816, and outputs an encoding result 2817.
Accordingly, the entire image compression device 2800 outputs two results: the encoding result 2813 of the first encoding unit 2802, which corresponds to a usual 8 bit depth image, for example, and the encoding result 2817 of the difference image of the high dynamic range image (that is, of the encoding error).
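As a concrete, deliberately simplified sketch of this two-layer scheme: the code below models tone mapping as a fixed 10-bit-to-8-bit right shift and models the lossy base codec of the first encoding unit by dropping one bit. Both choices, like all function names, are illustrative assumptions rather than the actual processes of units 2801 to 2806.

```python
# Hypothetical sketch of the two-layer compression described above,
# using 10-bit integer samples and bit shifts as stand-in codecs.

def tone_map(hdr):               # tone mapping unit 2801 (10 -> 8 bit)
    return [v >> 2 for v in hdr]

def lossy_encode(img):           # first encoding unit 2802 (stand-in for JPEG/MPEG)
    return [v >> 1 for v in img]

def lossy_decode(code):          # decoding unit 2803
    return [v << 1 for v in code]

def inverse_tone_map(img):       # reverse tone mapping unit 2804 (8 -> 10 bit)
    return [v << 2 for v in img]

def encode_two_layer(hdr):
    base = lossy_encode(tone_map(hdr))               # base-layer result 2813
    approx = inverse_tone_map(lossy_decode(base))    # image 2815
    residual = [h - a for h, a in zip(hdr, approx)]  # difference image 2816
    return base, residual  # result 2813, and the input to the second encoding unit 2806

base, residual = encode_two_layer([1023, 512, 7])
# base == [127, 64, 0], residual == [7, 0, 7]: an 8-bit-compatible base
# layer plus the coding error needed to restore the HDR samples.
```

A real implementation would pass the residual through the second encoding unit 2806 and emit both bitstreams, but the structure of the two outputs is the same.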
This has an advantage: for example, it is possible to provide a compressed image to both an existing device which can handle only an 8 bit depth image and a device which can also handle a high dynamic range image (here, "device" includes an image viewer, or various software and hardware).
The image compression device 2800 may output only the encoding result 2813 using the first encoding unit 2802 with respect to the existing device which is capable of corresponding only to the 8 bit depth image. In addition, the image compression device 2800 may output two results of the encoding result 2813 using the first encoding unit 2802, and the encoding result 2817 of a differential image using the second encoding unit 2806 with respect to the device which is also capable of corresponding to the high dynamic range. That is, it can be said that the image compression device 2800 has backward compatibility with respect to the existing image compression device which corresponds to an 8 bit depth image.
Backward compatibility is a great advantage. The reason for this is that software corresponding to 8 bits such as a JPEG, or an MPEG, a digital camera, a multifunction terminal with camera, or the like, is widely spread.
In addition,
The first decoding unit 2901 sets the encoding result using the first encoding unit 2802 on the image compression device 2800 side to an input 2911, performs a decoding process corresponding to a reverse conversion of the encoding process using the first encoding unit 2802, and obtains a decoded image 2912.
The reverse tone mapping unit 2902 performs a process of reverse tone mapping corresponding to a reverse conversion of the tone mapping process using the tone mapping unit 2801 on the image compression device 2800 side with respect to the image 2912 after decoding using the first decoding unit 2901, and outputs a reverse tone mapped image 2913.
Meanwhile, the second decoding unit 2903 sets the encoding result using the second encoding unit 2806 on the image compression device 2800 side to an input 2914, performs a decoding process corresponding to a reverse conversion of the encoding process using the second encoding unit 2806, and obtains a decoded image 2915.
As described above, since the encoding process on the image compression device 2800 side is in an irreversible mode, the image 2913 which is subjected to the reverse tone mapping using the reverse tone mapping unit 2902 does not match the original high dynamic range image 2811 which is input to the image compression device 2800. The addition calculation unit 2904 adds an error component of encoding, which is the decoding result 2915 using the second decoding unit 2903, to the image 2913 which is subjected to the reverse tone mapping using the reverse tone mapping unit 2902, generates a high dynamic range image 2916 which is closer to the original state, and sets the image as the output of the image decoding device 2900.
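The addition step on the decoding side can be sketched as follows, again under simplifying assumptions: the base codec and reverse tone mapping are modeled by bit shifts, and all names are illustrative stand-ins for units 2901 to 2904.

```python
# Hypothetical sketch of the decoding side: reconstruct the base-layer
# approximation, then add the decoded residual so the coding error
# cancels out.

def lossy_decode(code):          # stands in for the first decoding unit 2901
    return [v << 1 for v in code]

def inverse_tone_map(img):       # reverse tone mapping unit 2902 (8 -> 10 bit)
    return [v << 2 for v in img]

def decode_two_layer(base, residual):
    approx = inverse_tone_map(lossy_decode(base))     # lossy approximation 2913
    return [a + r for a, r in zip(approx, residual)]  # addition unit 2904

hdr = decode_two_layer([127, 64, 0], [7, 0, 7])
# hdr == [1023, 512, 7]: adding the residual layer restores the
# original high dynamic range samples exactly.
```

An 8-bit-only device would simply stop after the base-layer decode, which is the source of the backward compatibility discussed above.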
Subsequently, a compressing process of a high dynamic range stereoscopic image will be described. A compressing method of a stereoscopic image has already been standardized as the Multiview Video Coding (MVC) standard in a form of extending H.264/AVC, for example, and has been put to practical use in stereoscopic image display on Blu-ray discs, or the like. Such an international standard may also be used in the embodiment.
The tone mapping unit 3001 performs separate tone mapping with respect to left and right high dynamic range images 3011L and 3011R which are input, and converts the images into 8 bit images of 3012L and 3012R, respectively, for example. It is not necessary for the tone mapping unit 3001 to use a tone mapping method which is particularly different with respect to the left and right images of 3011L and 3011R.
The first encoding unit 3002 performs an encoding process according to a predetermined standard, for example, with respect to the left and right tone mapped images of 3012L and 3012R, and outputs each encoded image 3013L and 3013R.
The decoding unit 3003 respectively performs a decoding process corresponding to a reverse conversion of the encoding process using the first encoding unit 3002 with respect to the left and right encoded images 3013L and 3013R of the first encoding unit 3002, and obtains left and right decoded images 3014L and 3014R. When the encoding process using the first encoding unit 3002 and the decoding process using the decoding unit 3003 are reversible modes, the left and right images 3012L and 3012R before encoding and the left and right images 3014L and 3014R after decoding respectively completely match; however, usually, they do not match each other (the images 3014L and 3014R after decoding become deteriorated images compared to the images 3012L and 3012R before encoding), since irreversible compression is performed in order to improve the compression ratio.
The reverse tone mapping unit 3004 respectively performs a reverse tone mapping process corresponding to a reverse conversion of the tone mapping process using the tone mapping unit 3001 with respect to the images 3014L and 3014R after decoding using the decoding unit 3003.
As described above, since the encoding process in the first encoding unit 3002 is in an irreversible mode, the left and right images 3015L and 3015R which are subjected to reverse tone mapping using the reverse tone mapping unit 3004 do not match the left and right input images 3011L and 3011R. The difference calculation unit 3005 respectively calculates differences between the left and right images 3015L and 3015R which are subjected to reverse tone mapping using the reverse tone mapping unit 3004 and the left and right input images 3011L and 3011R, and outputs left and right difference images 3016L and 3016R.
The second encoding unit 3006 respectively performs an encoding process with respect to the left and right difference images of 3016L and 3016R, and outputs encoded images 3017L and 3017R.
Accordingly, the entire image compression device 3000 outputs two results: the left and right encoded images 3013L and 3013R using the first encoding unit 3002, which correspond to a usual 8 bit depth image, for example, and the encoded images 3017L and 3017R, which are difference images (that is, of the encoding error) of the respective left and right high dynamic range images.
This has the same advantage as above: for example, it is possible to provide a compressed image to both an existing device which can handle only an 8 bit depth image and a device which can also handle a high dynamic range image, that is, it is possible to have backward compatibility.
In addition,
The first decoding unit 3101 inputs the left and right encoded images of 3111L and 3111R using the first encoding unit 3002 on the image compression device 3000 side, respectively performs a decoding process corresponding to a reverse conversion of the encoding process using the first encoding unit 3002, and obtains left and right decoded images of 3112L and 3112R.
The reverse tone mapping unit 3102 respectively performs reverse tone mapping corresponding to a reverse conversion of the tone mapping process using the tone mapping unit 3001 on the image compression device 3000 side with respect to the left and right decoded images of 3112L and 3112R using the first decoding unit 3101, and outputs left and right reverse tone mapped images of 3113L and 3113R.
Meanwhile, the second decoding unit 3103 inputs left and right encoded images of 3114L and 3114R using the second encoding unit 3006 on the image compression device 3000 side, performs a decoding process corresponding to a reverse conversion of the encoding process using the second encoding unit 3006, and obtains left and right decoded images of 3115L and 3115R.
As described above, since the encoding process on the image compression device 3000 side is in an irreversible mode, the left and right reverse tone mapped images 3113L and 3113R which are output from the reverse tone mapping unit 3102 do not match the original high dynamic range stereoscopic images 3011L and 3011R which are input to the image compression device 3000. The addition calculation unit 3104 respectively adds the left and right decoded images 3115L and 3115R using the second decoding unit 3103 to the reverse tone mapped left and right images 3113L and 3113R using the reverse tone mapping unit 3102, generates high dynamic range stereoscopic images 3116L and 3116R which are close to the original state, and sets the images as outputs of the image decoding device 3100.
It is also possible to have a configuration in which an image compressing unit is incorporated inside the imaging apparatus 1400, and outputs a code stream by encoding a generated image. As described in the third embodiment, it is possible for one imaging apparatus 1400 to generate a stereoscopic image, and to generate a high dynamic range image, as well.
In addition, it is also possible to configure the imaging apparatus 1400 so as to selectively generate a stereoscopic image or a high dynamic range image based on instruction information from a user or an external device. When information on an instruction of generating a high dynamic range image is input, the image compressing device which is incorporated in the imaging apparatus 1400 may output an encoding result of a high dynamic range image by operating with the properties which are illustrated in
In addition, it is also possible to configure the technology disclosed in the specification as follows.
(1) An imaging apparatus which includes an imaging lens; an imaging element which performs a photoelectric conversion with respect to light which is condensed using the imaging lens; and lens arrays which are configured by arranging micro lenses of which exposure conditions are different on a two-dimensional plane, are arranged by being separated on a front face of an imaging face of the imaging element, and causes light which is output from each micro lens to be formed as an image on the imaging face of the imaging element.
(2) The imaging apparatus which is described in (1) further includes an image composition unit which composites a plurality of imaged images which are output from the imaging element, and of which exposure conditions are different, and generates a high dynamic range image.
(3) The imaging apparatus which is described in (2), in which the lens array includes a micro lens with a property of a low exposure lens, and a micro lens with a property of a high exposure lens, the imaging element photographs a low exposure image and a high exposure image by performing a photoelectric conversion, respectively, with respect to output light of each micro lenses with the property of the low exposure lens, and the property of the high exposure lens, and the image composition unit generates a high dynamic range image by compositing the low exposure image and the high exposure image.
(4) The imaging apparatus which is described in (2), in which the lens array includes micro lenses of three types or more of which exposure lens properties are different, the imaging element photographs images of three types or more of which exposure conditions are different by performing a photoelectric conversion, respectively, with respect to output light of a micro lens with each exposure lens property, and the image composition unit generates a high dynamic range image by compositing imaged images of three types or more of which the exposure conditions are different.
(5) The imaging apparatus which is described in any one of (1) to (4), further includes an interpolation unit which improves resolution by interpolating pixels at a pixel position with another exposure condition using a pixel value of neighboring pixels with the same exposure condition with respect to respective imaged images of which exposure conditions are different, after the images are formed in the imaging element.
(6) The imaging apparatus which is described in (5), in which the interpolation unit improves resolution of respective imaged images of which exposure conditions are different so as to be the same resolution as that of an input image using the pixel interpolation.
(7) The imaging apparatus which is described in any one of (1) to (6), in which the micro lens includes a diaphragm for controlling a light intensity which meets a corresponding exposure condition.
(8) An imaging apparatus which includes an imaging lens; an imaging element which performs photoelectric conversion with respect to light which is condensed using the imaging lens; lens arrays which are configured by being arranged with a plurality of micro lenses to which m×n pixels of the imaging element are respectively allocated on a two-dimensional plane, and are arranged by being separated on a front face of an imaging face of the imaging element; and an image composition unit which composites at least part of image data among m×n pixels which receive light which has passed through each micro lens of the lens array.
(9) The imaging apparatus which is described in (8), in which the image composition unit generates a stereoscopic image based on at least part of image data among the m×n pixels which receive light which has passed through each micro lens of the lens array.
(10) The imaging apparatus which is described in (8), in which the image composition unit composites a left eye image based on image data which is read from a pixel which receives a ray for a left eye which passes through each micro lens, and composites a right eye image based on image data which is read from a pixel which receives a ray for a right eye.
(11) The imaging apparatus which is described in (8), in which the image composition unit generates a plurality of images of which exposure conditions are different at the same time based on at least part of image data among m×n pixels which receive light which has passed through each micro lens of the lens array.
(12) The imaging apparatus which is described in (11), in which the image composition unit generates a low exposure image based on image data which is read from a pixel which is set to a low exposure condition among m×n pixels which receive light which has passed through each micro lens, and generates a high exposure image based on image data which is read from a pixel which is set to a high exposure condition simultaneously with the low exposure image.
(13) The imaging apparatus which is described in (8), in which the image composition unit generates a stereoscopic image, a low exposure image, and a high exposure image at the same time based on at least part of image data among m×n pixels which receive light which has passed through each micro lens of the lens array.
(14) The imaging apparatus which is described in any one of (11) to (13), in which the image composition unit generates a high dynamic range image by compositing the low exposure image and the high exposure image which are generated at the same time.
(15) The imaging apparatus which is described in (8), in which the imaging element is arranged in a state in which a pixel group which is arranged in a square lattice shape along a horizontal direction and a vertical direction is rotated by a predetermined angle in a light receiving plane.
(16) The imaging apparatus which is described in any one of (11) to (13), in which an exposure time of each pixel is controlled so as to have a light intensity which meets each exposure condition.
(17) The imaging apparatus which is described in any one of (11) to (13), in which an amount of narrowing of light which is input to each pixel is controlled so as to be a light intensity which meets each exposure condition.
(18) The imaging apparatus which is described in (13) further including an encoding unit which outputs a code stream by encoding an image which is generated in the image composition unit.
(19) The imaging apparatus which is described in (18), in which generating either a stereoscopic image or a high dynamic range image is selected based on instructed information, and the encoding unit outputs an encoding result of the stereoscopic image when the generating of the stereoscopic image is selected, and outputs an encoding result of the high dynamic range image when the generating of the high dynamic range image is selected.
(20) The imaging apparatus which is described in (19), in which the encoding unit includes a tone mapping unit which performs tone mapping with respect to a high dynamic range image when the high dynamic range image is encoded; a first encoding unit which encodes the image after being subjected to the tone mapping; a decoding unit which decodes an encoding result using the first encoding unit; a reverse tone mapping unit which performs reverse tone mapping with respect to the decoding result using the decoding unit; a difference calculation unit which calculates a difference between the original high dynamic range image and an image which is subjected to the reverse tone mapping; and a second encoding unit which encodes a difference image using the difference calculation unit.
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
Number | Date | Country | Kind |
---|---|---|---
2014-042855 | Mar 2014 | JP | national |
2014-188275 | Sep 2014 | JP | national |