1. Field
Embodiments relate to an image sensor, a method of sensing an image, and an image capturing apparatus including the image sensor, and more particularly, to an image sensor which may improve the quality of a sensed image, a method of sensing an image, and an image capturing apparatus including the image sensor.
2. Description of the Related Art
Technology related to imaging apparatuses and methods of capturing images has advanced at high speed. In order to sense more accurate image information, image sensors have been developed to sense depth information as well as color information of an object.
Embodiments provide an image sensor that may accurately sense an image of an object, a method of sensing an image, and an image capturing apparatus including the image sensor.
Embodiments are directed to providing an image sensor for sensing an image of an object by receiving reflected light obtained after output light is reflected by the object. The image sensor may include a pixel array having color pixels and depth pixels which receive the reflected light, and a shuttering unit that facilitates generation of color pixel signals by resetting the color pixels in units of a color integration time and reading the color pixels in units of the color integration time, and facilitates generation of depth pixel signals by resetting the depth pixels in units of a depth integration time, different from the color integration time, and reading the depth pixels in units of the depth integration time.
The shuttering unit may include a first reset unit that resets the color pixels and a second reset unit that resets the depth pixels.
The shuttering unit may be a rolling shutter that performs the resetting and the reading in units of rows of the pixel array.
The color integration time may be a time taken, after color pixels of an arbitrary row of the pixel array are reset, to read the color pixels of the arbitrary row. The depth integration time may be a time taken, after depth pixels of an arbitrary row of the pixel array are reset, to read the depth pixels of the arbitrary row.
The image sensor may further include a color information calculator that calculates the color pixel signals as color information of the object.
The image sensor may include a sample module that samples the depth pixel signals from depth pixels of an arbitrary row of the pixel array, and after the depth integration time elapses, samples the color pixel signals from the color pixels and the depth pixels of the arbitrary row.
The image sensor may further include a depth information calculator that estimates a delay between the output light and the reflected light from the depth pixel signals and calculates depth information of the object.
The image sensor may be a time-of-flight (TOF) sensor.
Embodiments are directed to providing a method of sensing an image of an object by receiving reflected light that is obtained after output light is reflected by the object. The method may include outputting the reflected light sensed by color pixels of a pixel array of the image sensor for a color integration time as color pixel signals, outputting the reflected light sensed by depth pixels of the pixel array for a depth integration time, different from the color integration time, as depth pixel signals, and calculating the color pixel signals and the depth pixel signals as image information of the object.
Outputting the color pixel signals may include resetting the color pixels in units of the color integration time and reading the reset color pixels in units of the color integration time.
Outputting the depth pixel signals may include resetting the depth pixels in units of the depth integration time and reading the reset depth pixels in units of the depth integration time.
Embodiments are directed to providing an image capturing apparatus including a light source that emits light, a lens that receives reflected light obtained after the light emitted from the light source is reflected by an object, an image sensor that senses image information of the object from the reflected light transmitted by the lens, and a processor that controls the image sensor and processes the image information transmitted from the image sensor. The image sensor may include a pixel array having color pixels and depth pixels which receive the reflected light, and a shuttering unit that facilitates generation of color pixel signals by resetting the color pixels in units of a color integration time and reading the reset color pixels in units of the color integration time, and facilitates generation of depth pixel signals by resetting the depth pixels in units of a depth integration time, different from the color integration time, and reading the reset depth pixels in units of the depth integration time.
Embodiments are directed to providing an image sensor for sensing an image of an object by receiving reflected light that is obtained after output light is reflected by the object. The image sensor may include a pixel array having first pixels and second pixels that sense different wavelengths of the reflected light and an integration control unit that reads the first pixels in units of a first integration time and reads the second pixels in units of a second integration time, different from the first integration time.
The first pixels may be color pixels sensing visible light and the second pixels may be depth pixels.
The depth pixels may output a plurality of depth pixel signals for each frame.
The depth pixels may sense infrared light.
Features will become apparent to those of ordinary skill in the art by describing in detail exemplary embodiments with reference to the attached drawings, in which:
Example embodiments will now be described more fully hereinafter with reference to the accompanying drawings; however, they may be embodied in different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like reference numerals refer to like elements throughout.
Referring to
As shown in detail in
Referring again to
Although the color pixels PXc and the depth pixels PXd are separately arranged in
The depth pixels PXd may each include a photoelectric conversion element (not shown) for converting the reflected light RLIG into an electric charge. The photoelectric conversion element may be a photodiode, a phototransistor, a photo-gate, a pinned photodiode, and so forth. Also, the depth pixels PXd may each include transistors connected to the photoelectric conversion element. The transistors may control the photoelectric conversion element or output an electric charge of the photoelectric conversion element as pixel signals. For example, read-out transistors included in each of the depth pixels PXd may output, as pixel signals, an output voltage corresponding to the reflected light received by the photoelectric conversion element of each of the depth pixels PXd. Also, the color pixels PXc may each include a photoelectric conversion element (not shown) for converting the visible light into an electric charge. A structure and a function of each pixel will not be explained in detail for clarity.
If the pixel array PA of the present embodiment separately includes the color pixels PXc and the depth pixels PXd as shown in
Referring again to
The timing generator TG controls the depth pixels PXd to be activated so that the depth pixels PXd of the image sensor ISEN may demodulate the reflected light RLIG synchronously with the clock ‘ta’. The photoelectric conversion element of each of the depth pixels PXd outputs electric charges accumulated with respect to the reflected light RLIG for a depth integration time Tint_Dep as depth pixel signals POUTd. The photoelectric conversion element of each of the color pixels PXc outputs electric charges accumulated with respect to the visible light for a color integration time Tint_Col as color pixel signals POUTc. A detailed explanation of the color integration time Tint_Col and the depth integration time Tint_Dep will be made with reference to the shuttering unit SHUT.
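As a simple illustration of the accumulation described above (an idealized model assumed here, not taken from the embodiments), the signal output by a pixel is approximately proportional to the light power incident on its photoelectric conversion element integrated over that pixel's own integration time:

    Q_{dep} \propto \int_{0}^{T_{int\_Dep}} P_{RLIG}(t)\,dt, \qquad Q_{col} \propto \int_{0}^{T_{int\_Col}} P_{vis}(t)\,dt

where Q_{dep} and Q_{col} denote the accumulated charges and P_{RLIG} and P_{vis} denote the incident reflected-light and visible-light power, respectively (these symbols are introduced only for this illustration).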
The depth pixel signals POUTd of the image sensor ISEN are output so as to correspond to a plurality of optical wave pulses demodulated from the reflected light RLIG, which includes modulated optical wave pulses. For example,
Referring back to
The color information calculator CC calculates the color information CINF from the color pixel signals POUTc converted to digital data by the analog-to-digital converter ADC.
The depth information calculator DC calculates the depth information DINF from the depth pixel signals POUTd, i.e., A0 through A3, converted to digital data by the analog-to-digital converter ADC. In detail, the depth information calculator DC estimates a phase delay φ between the output light OLIG and the reflected light RLIG as shown in Equation 1, and determines a distance D between the image sensor ISEN and the object OBJ as shown in Equation 2.
In Equation 2, the distance D between the image sensor ISEN and the object OBJ is a value measured in meters, Fm is a modulation wave period measured in seconds, and ‘c’ is the speed of light. Thus, the distance D between the image sensor ISEN and the object OBJ may be sensed as the depth information DINF from the depth pixel signals POUTd output from the depth pixels PXd of
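Equation 1 and Equation 2 themselves are not reproduced above; for illustration only, a standard four-phase TOF formulation that is consistent with the samples A0 through A3 and with the terms D, Fm, and ‘c’ defined above is

    \varphi = \arctan\!\left(\frac{A_3 - A_1}{A_0 - A_2}\right) \qquad \text{(cf. Equation 1)}

    D = \frac{c \cdot F_m}{2} \cdot \frac{\varphi}{2\pi} \qquad \text{(cf. Equation 2)}

under the assumption that A0 through A3 are sampled at demodulation phase offsets of 0°, 90°, 180°, and 270°; the sample-to-phase assignment and sign convention are assumptions and may differ from those of the original equations.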
Still referring to
The shuttering unit SHUT may include a first reset unit SHUT1 and a second reset unit SHUT2. The first reset unit SHUT1 may sequentially perform resetting on the color pixels PXc of each row. The second reset unit SHUT2 may sequentially perform resetting on the depth pixels PXd of each row. In this case, the color integration time Tint_Col and the depth integration time Tint_Dep may be different from each other.
As shown in
As mentioned above, the sample module may sample the color pixel signals POUTc and the depth pixel signals POUTd. Thus, the shuttering unit SHUT may include at least two read shutters (not shown). One read shutter may control reading of the color pixels, and another read shutter may control reading of the depth pixels. For example, each read shutter may send a row address of a row to be read to the row decoder RD.
As each reset shutter finishes its operation at the end of the pixel array, the reset shutter wraps around and starts operation again from the first row. The first row may be an arbitrary row.
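As an illustration of the dual rolling-shutter operation described above, the following minimal sketch (in Python; the class name, row count, and integration values are assumptions made only for this example) models each shutter as advancing one row per row time, wrapping around at the last row, and reading the row that it reset one integration time earlier:

    # Minimal sketch of two independent rolling shutters sharing one pixel array.
    # Integration times are expressed in row-time units; all values are illustrative.
    class RollingShutter:
        def __init__(self, n_rows, integration_rows):
            self.n_rows = n_rows                      # number of rows in the pixel array
            self.integration_rows = integration_rows  # integration time, in row-time units

        def reset_row(self, t):
            # Row reset by this shutter at row time t; wraps around after the last row.
            return t % self.n_rows

        def read_row(self, t):
            # Row read by this shutter at row time t: the row reset integration_rows earlier.
            return (t - self.integration_rows) % self.n_rows

    n_rows = 480
    color_shutter = RollingShutter(n_rows, integration_rows=400)  # longer color integration
    depth_shutter = RollingShutter(n_rows, integration_rows=120)  # shorter depth integration

    for t in range(3):
        print("t =", t,
              "| color reset/read rows:", color_shutter.reset_row(t), color_shutter.read_row(t),
              "| depth reset/read rows:", depth_shutter.reset_row(t), depth_shutter.read_row(t))

In this sketch, the color and depth read pointers sweep the array at the same rate but trail their respective reset pointers by different numbers of rows, which corresponds to the different integration times Tint_Col and Tint_Dep.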
Although the color integration time Tint_Col with respect to the color pixels PXc is longer than the depth integration time Tint_Dep with respect to the depth pixels PXd in
Referring to
As shown in
As described above, since the image sensor ISEN according to the one or more embodiments of the inventive concept performs shuttering on the depth pixels PXd separately from shuttering on the color pixels PXc, i.e., since shuttering for the different types of pixels has different integration times, optimal sensing in accordance with photographing environments and with the characteristics of the color pixels PXc and the depth pixels PXd, which sense different light, may be performed. Accordingly, the image sensor ISEN of the one or more embodiments of the inventive concept may sense an image with better quality.
The pixel array PA may include sufficient depth pixels outputting the A0 through A3 samples to reconstruct a depth map from one image. For example, 4-tap pixels may be employed, or 1-tap or 2-tap pixels arranged in a mosaic may be employed such that a 4-tap image can be reconstructed. In this case, the color and depth frame times are equal, i.e., Tfc=Tfd. This allows the capture of the depth image to be synchronized with the capture of the color image, such that both images are output at the same frame rate. Thus, read operations for the depth and color images may be performed simultaneously or close to each other in time, such that both images depict the scene at approximately the same time and differences between the images due to motion are minimized.
According to embodiments, the depth integration time Tint_Dep=Tfd, such that all reflected light RLIG is sensed without loss. According to embodiments, the color integration time Tint_Col≤Tint_Dep, such that the exposure of the color image can be controlled as necessary, while all reflected light RLIG is sensed without loss. In cases of overexposure in the depth image, the power of the output light OLIG may be controlled instead of decreasing Tint_Dep.
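One possible exposure-control policy consistent with the above may be sketched as follows (the function name, step sizes, and limits are assumptions made only for this example): color overexposure is handled by shortening Tint_Col, while depth overexposure is handled by reducing the output-light power so that Tint_Dep can remain equal to Tfd.

    # Illustrative exposure-control policy; names and constants are assumed, not from the embodiments.
    def adjust_exposure(tint_col, tint_dep, light_power,
                        color_overexposed, depth_overexposed):
        if color_overexposed:
            tint_col = tint_col * 0.5           # shorten the color integration time
        if depth_overexposed:
            light_power = light_power * 0.5     # reduce the power of the output light OLIG
        tint_col = min(tint_col, tint_dep)      # keep Tint_Col <= Tint_Dep
        return tint_col, tint_dep, light_power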
When the pixel array PA does not include sufficient depth pixels, such that more than one frame must be captured sequentially in order to compute a single depth map, an embodiment may use Tfc=K×Tfd, where K is the number of depth frames used to compute a single depth map. In this case, SHUT1 and SHUT2 may read out rows non-simultaneously, such that D_RD_PTR and C_RD_PTR may not be co-located, i.e., may not point to the same row at the same or nearly the same time. For example, if K=4, the depth frame time Tfd will be 4 times shorter than the color frame time Tfc (i.e., the depth frame rate will be 4 times higher), and the reading and resetting of depth rows may progress from one row to another 4 times faster than the reading and resetting of color rows.
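The relative progression of the two read pointers in this case may be pictured with the following sketch (the row count and pointer functions are illustrative assumptions): with Tfc=K×Tfd, the depth read pointer advances K rows for every row by which the color read pointer advances.

    # Sketch of read-pointer progression when Tfc = K * Tfd; values are illustrative.
    K = 4            # depth frames captured per color frame
    n_rows = 480     # rows in the pixel array

    def color_read_row(t):
        # Color read pointer advances one row per color row time.
        return t % n_rows

    def depth_read_row(t):
        # Depth read pointer advances K rows in the same interval.
        return (K * t) % n_rows

    for t in (0, 1, 2, 120):
        print("t =", t, "color row:", color_read_row(t), "depth row:", depth_read_row(t))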
Although shuttering is performed on the entire pixel array PA in
Referring to
However, embodiments are not limited thereto, and the image sensor ISEN of
Referring to
Referring to
Referring to
The computing system COM may further include a power supply device PS. Also, the computing system COM may further include a storage device RAM that stores the image information IMG transmitted from the image capturing apparatus CMR.
If the computing system COM is a mobile system, a modem such as a baseband chipset and a battery for supplying an operating voltage of the computing system COM may be additionally provided. Also, since it would be obvious to one of ordinary skill in the art that an application chipset, a mobile DRAM, and the like may be further provided in the computing system COM, a detailed explanation thereof will not be given.
According to the image sensor, the method of sensing an image, and the image capturing apparatus of the inventive concept, since color information and depth information are sensed with different exposure times, pixel signals of sufficient magnitude may be sensed. Accordingly, the quality of a sensed image may be improved.
Example embodiments have been disclosed herein, and although specific terms are employed, they are used and are to be interpreted in a generic and descriptive sense only and not for purpose of limitation. For example, in the above, a method of obtaining a phase delay in consecutive images has been described. In some instances, as would be apparent to one of ordinary skill in the art as of the filing of the present application, features, characteristics, and/or elements described in connection with a particular embodiment may be used singly or in combination with features, characteristics, and/or elements described in connection with other embodiments unless otherwise specifically indicated. Accordingly, it will be understood by those of skill in the art that various changes in form and details may be made without departing from the spirit and scope of the present invention as set forth in the following claims.