This application claims priority from Korean Patent Application No. 2011-0028579, filed on Mar. 30, 2011; Korean Patent Application No. 2011-0029249, filed on Mar. 31, 2011; and Korean Patent Application No. 2011-0029388, filed on Mar. 31, 2011; all in the Korean Intellectual Property Office (KIPO), the entire contents of all of which are incorporated herein by reference.
1. Technical Field
Example embodiments relate to image sensors. More particularly, example embodiments relate to three-dimensional image sensors, image pick-up devices, cameras, and imaging systems.
2. Description of the Related Art
An image sensor is a photo-detection device that converts optical signals including image and/or distance (i.e., depth) information about an object into electrical signals. Various types of image sensors, such as charge-coupled device (CCD) image sensors, complementary metal-oxide-semiconductor (CMOS) image sensors (CISs), etc., have been developed to provide high-quality image information about the object. Recently, three-dimensional (3D) image sensors, which provide depth information as well as two-dimensional image information, have been researched and developed.
The three-dimensional image sensor may obtain the depth information using infrared light or near-infrared light as a light source.
Example embodiments provide a three-dimensional image sensor having an increased dynamic range.
Example embodiments also provide a camera capable of adjusting the focusing of a receiving lens.
According to an example embodiment, a three-dimensional image sensor may include a light source module, a sensing circuit, and/or a control unit. The light source module may emit at least one light to an object. The sensing circuit may be configured to polarize a received light that represents the at least one light reflected from the object and configured to convert the polarized light to electrical signals. The control unit may control the light source module and the sensing circuit.
In an example embodiment, the light source module may include a light source configured to generate the at least one light and/or a first lens configured to focus the at least one light on the object.
The sensing circuit may include a lens module and/or a sensor unit. The lens module may include a second lens configured to concentrate the received light; an infrared filter configured to filter visible light components in the received light; and/or a polarization filter configured to polarize an output of the infrared filter in one direction to provide the polarized light. The sensor unit may be configured to convert the polarized light to the electrical signals.
The light source may include a light-emitting diode or a laser diode.
In an example embodiment, the sensing circuit may include a lens module and/or a sensor unit. The lens module may include a second lens configured to concentrate the received light and/or an infrared filter configured to filter visible light components in the received light.
The sensor unit may include a plurality of unit pixels, each of the unit pixels including a grid polarizer. Each unit pixel may include a transmission gate formed over a semiconductor substrate; a floating diffusion region formed over the semiconductor substrate adjacent to the transmission gate; a buried channel formed in the semiconductor substrate adjacent to the transmission gate; a pinning layer formed in the buried channel; and/or a metal layer formed over the transmission gate and the buried channel. The grid polarizer may be configured to polarize an output of the infrared filter. The grid polarizer may include the buried channel and the metal layer.
In an example embodiment, the at least one light may include first and second lights. The light source module may include a first light source configured to emit the first light and/or a second light source configured to emit the second light. The sensing circuit may include a lens configured to concentrate the received light.
The first and second light sources may be opposed to each other with respect to the lens.
The first and second lights may have a same period with respect to each other. The control unit may provide first and second control signals that alternately enable the first and second light sources.
According to an example embodiment, a camera may include a receiving lens, a sensor module, an engine unit, and/or a motor unit. The sensor module may be configured to generate depth data, the depth data including depth information of a plurality of objects based on received light from the objects. The engine unit may be configured to generate a depth map of the objects based on the depth data, may be configured to segment the objects in the depth map based on the depth map, and/or may be configured to generate a control signal for controlling the receiving lens based on the segmented objects. The motor unit may be configured to control focusing of the receiving lens based on the control signal. The sensor module may be configured to generate color data of the objects based on visible light reflected from the objects and concentrated by the focus-adjusted receiving lens, and may be configured to provide the color data to the engine unit.
The sensor module may include a depth sensor configured to generate the depth data; and/or a color sensor configured to generate the color data.
The engine unit may include a first image signal processor (ISP) configured to process the depth data to generate a depth image of the objects and the depth map; a segmentation and control unit configured to segment the objects based on the depth map, and configured to generate the control signal based on the segmented objects; and/or a second ISP configured to process the color data to generate a color image of the objects.
The second ISP may be configured to perform color image processing on each of the objects according to respective distances of the objects from the sensor module.
The receiving lens may be configured to have a depth of field that covers one of the objects.
The camera may further include an image generator. The image generator may be configured to execute an application to generate a stereo image of the objects based on the depth image and the color image.
An imaging system may include a receiving lens; a sensor module configured to generate color data and depth data, the color data including color information of one or more objects based on received light from the one or more objects, and the depth data including depth information of the one or more objects based on the received light from the objects; an engine unit configured to generate a color image of the one or more objects based on the color data, configured to generate a depth image of the one or more objects based on the depth data, configured to generate a depth map of the one or more objects based on the depth data, and/or configured to generate a control signal for controlling the receiving lens based on the depth map; and/or a motor unit configured to control focusing of the receiving lens based on the control signal.
The sensor module may be further configured to generate the color data based on visible light reflected from the one or more objects and concentrated by the receiving lens.
The sensor module may be further configured to generate the depth data based on infrared or near-infrared light reflected from the one or more objects and concentrated by the receiving lens.
The sensor module may be further configured to polarize light reflected from the one or more objects.
The sensor module may be further configured to convert the polarized light to electrical signals.
As described above, the dynamic range of a three-dimensional image sensor may be increased, and the focusing of a receiving lens of a camera may be adaptively adjusted.
The above and/or other aspects and advantages will become more apparent and more readily appreciated from the following detailed description of example embodiments, taken in conjunction with the accompanying drawings, in which:
Example embodiments will now be described more fully with reference to the accompanying drawings. Embodiments, however, may be embodied in many different forms and should not be construed as being limited to the embodiments set forth herein. Rather, these example embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope to those skilled in the art. In the drawings, the thicknesses of layers and regions may be exaggerated for clarity.
It will be understood that when an element is referred to as being “on,” “connected to,” “electrically connected to,” or “coupled to” another component, it may be directly on, connected to, electrically connected to, or coupled to the other component, or intervening components may be present. In contrast, when a component is referred to as being “directly on,” “directly connected to,” “directly electrically connected to,” or “directly coupled to” another component, there are no intervening components present. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
It will be understood that although the terms first, second, third, etc., may be used herein to describe various elements, components, regions, layers, and/or sections, these elements, components, regions, layers, and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer, and/or section from another element, component, region, layer, and/or section. For example, a first element, component, region, layer, and/or section could be termed a second element, component, region, layer, and/or section without departing from the teachings of example embodiments.
Spatially relative terms, such as “beneath,” “below,” “lower,” “above,” “upper,” and the like may be used herein for ease of description to describe the relationship of one component and/or feature to another component and/or feature, or other component(s) and/or feature(s), as illustrated in the drawings. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures.
The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting of example embodiments. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises,” “comprising,” “includes,” and/or “including,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which example embodiments belong. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and should not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Reference will now be made to example embodiments, which are illustrated in the accompanying drawings, wherein like reference numerals may refer to like components throughout.
Referring to
The pixel array 110 may include depth pixels that receive light which is emitted by the light source module 300, is reflected from an object 50, and returns to the three-dimensional image sensor 10 as received light RX. The depth pixels may convert the received light RX into electrical signals. The depth pixels may provide information about a distance of the object 50 from the three-dimensional image sensor 10 and/or black-and-white image information.
The pixel array 110 may further include color pixels for providing color image information. In this case, the three-dimensional image sensor 10 may be a three-dimensional color image sensor that provides the color image information and the depth information. In some embodiments, an infrared filter or a near-infrared filter may be formed on the depth pixels, and a color filter (e.g., red, green and blue filters) may be formed on the color pixels. A ratio of the number of the depth pixels to the number of the color pixels may vary according to example embodiments.
The ADC unit 130 may convert an analog signal output from the pixel array 110 into a digital signal. In some embodiments, the ADC unit 130 may perform a column ADC that converts analog signals in parallel using a plurality of analog-to-digital converters respectively coupled to a plurality of column lines. In other embodiments, the ADC unit 130 may perform a single ADC that sequentially converts the analog signals using a single analog-to-digital converter.
In some embodiments, the ADC unit 130 may further include a correlated double sampling (CDS) unit for extracting an effective signal component. In some embodiments, the CDS unit may perform an analog double sampling that extracts the effective signal component based on a difference between an analog reset signal including a reset component and an analog data signal including a signal component. In other embodiments, the CDS unit may perform a digital double sampling that converts the analog reset signal and the analog data signal into two digital signals and extracts the effective signal component based on a difference between the two digital signals. In still other embodiments, the CDS unit may perform a dual correlated double sampling that performs both the analog double sampling and the digital double sampling.
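For illustration only, a minimal sketch of the analog and digital double sampling described above is given below; the function names, signatures, and numeric levels are hypothetical and are not part of this disclosure.

```python
def analog_cds(reset_level_v, data_level_v):
    """Analog double sampling: the effective signal component is the
    difference between a pixel's reset level and its data level."""
    return reset_level_v - data_level_v

def digital_cds(reset_code, data_code):
    """Digital double sampling: both levels are converted to digital codes
    first, and the difference is taken in the digital domain."""
    return reset_code - data_code

# Example: a 1.20 V reset level and a 0.85 V data level yield an
# effective signal component of about 0.35 V.
print(analog_cds(1.20, 0.85))
```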
The row scanning circuit 120 may receive control signals from the control unit 200, and may control a row address and a row scan of the pixel array 110. To select a row line among a plurality of row lines, the row scanning circuit 120 may apply a signal for activating the selected row line to the pixel array 110. In some embodiments, the row scanning circuit 120 may include a row decoder that selects a row line of the pixel array 110 and a row driver that applies a signal for activating the selected row line.
The column scanning circuit 140 may receive control signals from the control unit 200, and may control a column address and a column scan of the pixel array 110. The column scanning circuit 140 may output a digital output signal from the ADC unit 130 to a digital signal processing circuit (not shown) or to an external host (not shown). For example, the column scanning circuit 140 may provide the ADC unit 130 with a horizontal scan control signal to sequentially select a plurality of analog-to-digital converters included in the ADC unit 130. In some embodiments, the column scanning circuit 140 may include a column decoder that selects one of the plurality of analog-to-digital converters and a column driver that applies an output of the selected analog-to-digital converter to a horizontal transmission line. The horizontal transmission line may have a bit width corresponding to that of the digital output signal.
The control unit 200 may control the ADC unit 130, the row scanning circuit 120, the column scanning circuit 140 and the light source module 300. The control unit 200 may provide the ADC unit 130, the row scanning circuit 120, the column scanning circuit 140 and the light source module 300 with control signals, such as a clock signal, a timing control signal, etc. In some embodiments, the control unit 200 may include a control logic circuit, a phase locked loop circuit, a timing control circuit, a communication interface circuit, etc.
The light source module 300 may emit light of a desired (or, alternatively, a predetermined) wavelength. For example, the light source module 300 may emit infrared light or near-infrared light. The light source module 300 may include a light source 310 and a lens 320. The light source 310 may be controlled by the control unit 200 to emit the emitted light TX of which the intensity periodically changes. For example, the intensity of the emitted light TX may be controlled such that the intensity of the emitted light TX has a waveform of a pulse wave, a sine wave, a cosine wave, etc. The light source 310 may be implemented by a light emitting diode (LED), a laser diode, etc. The lens 320 may be configured to focus the emitted light TX on the object 50.
The lens module 400 may include a lens 410, a first filter 420, and a second filter 430. The lens 410 concentrates the received light RX reflected from the object 50 and provides the concentrated light to the pixel array 110. The first filter 420 may be an infrared filter that filters out components, such as visible light VL, having frequencies other than the frequency corresponding to infrared light. The second filter 430 may be a polarization filter that filters out background light other than the emitted light TX. The second filter 430 may be a linear polarization filter, whereas the background light is polarized in all directions (i.e., is effectively unpolarized). When a linear polarization filter that passes light polarized in one direction is employed as the second filter 430, the background light components may be reduced by about one half. That is, the lens module 400 may polarize the received light RX in one direction to provide the polarized light PRX to the sensor unit 105. The sensor unit 105 may convert the polarized light PRX to electrical signals.
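For reference, the factor of one half follows from averaging Malus's law over the uniformly distributed polarization angles of the unpolarized background light; this is a standard optics result and is not an equation reproduced from this disclosure:

```latex
I_{\text{out}} = I_{\text{in}}\,\langle \cos^{2}\theta \rangle
              = I_{\text{in}}\cdot\frac{1}{2\pi}\int_{0}^{2\pi}\cos^{2}\theta\,d\theta
              = \frac{I_{\text{in}}}{2}
```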
Hereinafter, an operation of the three-dimensional image sensor 10 according to example embodiments will be described below.
The control unit 200 may control the light source module 300 to emit the emitted light TX having the periodic intensity. The emitted light TX emitted by the light source module 300 may be reflected from the object 50 back to the three-dimensional image sensor 10 as the received light RX. The received light RX may enter the depth pixels, and the depth pixels may be activated by the row scanning circuit 120 to output analog signals corresponding to the received light RX. The ADC unit 130 may convert the analog signals output from the depth pixels into digital data DATA. The digital data DATA may be provided to the control unit 200 by the column scanning circuit 140.
A calculation unit 210 included in the control unit 200 may calculate a distance of the object 50 from the three-dimensional image sensor 10 based on the digital data DATA.
The emitted light TX emitted by the light source module 300 may be reflected from the object 50 back to the three-dimensional image sensor 10 as the received light RX. The polarized light PRX may enter the depth pixels, the depth pixels may output analog signals corresponding to the polarized light PRX, and the ADC unit 130 may convert the analog signals output from the depth pixels into digital data DATA. The digital data DATA may be converted to the depth information by the calculation unit 210.
The digital data DATA and/or the depth information may be provided to the digital signal processing circuit or the external host. In some embodiments, the pixel array 110 may include color pixels, and the color image information as well as the depth information may be provided to the digital signal processing circuit or the external host.
As described above, in the three-dimensional image sensor 10 according to example embodiments, since the lens module 400 including the second filter 430 (that may be a polarization filter) polarizes the received light RX in one direction and provides the polarized light PRX to the sensor unit 105, the interference effect due to the background lights may be reduced and a dynamic range of the three-dimensional image sensor 10 may be enhanced.
Referring to
The emitted light TX emitted by the light source module 300 may be reflected from the object 50, and then may enter a lens module 400 as received light RX. The lens module 400 including the second filter 430 (that may be a polarization filter) polarizes the received light RX in one direction and provides the polarized light PRX to the sensor unit 105. The pixel array 110 may periodically sample the polarized light PRX. In some embodiments, during each period of the received light RX (i.e., period of the emitted light TX), the pixel array 110 may perform a sampling on the polarized light PRX with two sampling points having a phase difference of about 180 degrees, with four sampling points having a phase difference of about 90 degrees, or with more than four sampling points. For example, the pixel array 110 may extract four samples A0, A1, A2 and A3 of the polarized light PRX (or, in general, received light RX) at phases of about 90 degrees, about 180 degrees, about 270 degrees and about 360 degrees per period, respectively.
The polarized light PRX may have an offset B that is different from an offset of the emitted light TX emitted by the light source module 300 due to background light, a noise, etc. The offset B of the polarized light PRX may be calculated by Equation 1.
Here, A0 represents an intensity of the polarized light PRX sampled at a phase of about 90 degrees of the emitted light TX, A1 represents an intensity of the polarized light PRX sampled at a phase of about 180 degrees of the emitted light TX, A2 represents an intensity of the polarized light PRX sampled at a phase of about 270 degrees of the emitted light TX, and A3 represents an intensity of the polarized light PRX sampled at a phase of about 360 degrees of the emitted light TX.
The polarized light PRX may have an amplitude A lower than that of the emitted light TX emitted by the light source module 300 due to a light loss. The amplitude A of the polarized light PRX may be calculated by Equation 2.
Black-and-white image information about the object 50 may be provided by respective depth pixels included in the pixel array 110 based on the amplitude A of the polarized light PRX.
The polarized light PRX may be delayed, with respect to the emitted light TX, by a phase difference Φ corresponding to twice the distance of the object 50 from the three-dimensional image sensor 10 (i.e., the round-trip distance). The phase difference Φ between the emitted light TX and the polarized light PRX may be calculated by Equation 3.
The phase difference Φ between the emitted light TX and the polarized light PRX may correspond to a time-of-flight (TOF). The distance of the object 50 from the three-dimensional image sensor 10 may be calculated by the equation “R=c*TOF/2”, where R represents the distance of the object 50, and c represents the speed of light. Further, the distance of the object 50 from the three-dimensional image sensor 10 may also be calculated by Equation 4 using the phase difference Φ between the emitted light TX and the polarized light PRX.
Here, f represents a modulation frequency, which is a frequency of the intensity of the emitted light TX (or a frequency of the intensity of the received light RX).
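Because the bodies of Equations 1 through 4 are not reproduced here, the sketch below uses the widely known four-phase (four-tap) time-of-flight relations, which are consistent with the definitions of A0 through A3, B, A, Φ, f, and R given above; it should be read as an illustrative formulation under that assumption rather than as the literal equations of this disclosure.

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def tof_depth(a0, a1, a2, a3, f_mod):
    """Estimate offset B, amplitude A, phase difference and distance R from
    the four per-period samples A0..A3 of the received (polarized) light.
    This is the common four-tap formulation; the sign convention of the
    arctangent may differ from the unreproduced Equations 1-4."""
    offset_b = (a0 + a1 + a2 + a3) / 4.0              # cf. Equation 1
    amplitude_a = math.hypot(a0 - a2, a1 - a3) / 2.0  # cf. Equation 2
    phase = math.atan2(a1 - a3, a0 - a2) % (2.0 * math.pi)  # cf. Equation 3
    # R = c * TOF / 2 with TOF = phase / (2 * pi * f_mod)   # cf. Equation 4
    distance_r = C * phase / (4.0 * math.pi * f_mod)
    return offset_b, amplitude_a, phase, distance_r

# Hypothetical samples with a 20 MHz modulation frequency:
# offset 1.0, amplitude 0.2, phase pi/2, distance of roughly 1.87 m.
print(tof_depth(1.0, 1.2, 1.0, 0.8, 20e6))
```

With a 20 MHz modulation frequency, the unambiguous range of this formulation is c/(2f), or about 7.5 m.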
As described above, the three-dimensional image sensor 10 according to example embodiments may obtain depth information about the object 50 using the emitted light TX emitted by the light source module 300. Although
Referring to
Referring to
Referring to
Referring to
Referring to
The pixel array 160 may include depth pixels receiving received light RX that is emitted by the light source module 350 and is reflected from an object 60. The depth pixels may convert the received light RX into electrical signals. The depth pixels may provide information about a distance of the object 60 from the three-dimensional image sensor 20 and/or black-and-white image information.
The pixel array 160 may further include color pixels for providing color image information. In this case, the three-dimensional image sensor 20 may be a three-dimensional color image sensor that provides the color image information and the depth information. In some embodiments, an infrared filter or a near-infrared filter may be formed on the depth pixels, and a color filter (e.g., red, green and blue filters) may be formed on the color pixels. A ratio of the number of the depth pixels to the number of the color pixels may vary according to example embodiments.
The ADC unit 180 may convert an analog signal output from the pixel array 160 into a digital signal. In some embodiments, the ADC unit 180 may perform a column ADC that converts analog signals in parallel using a plurality of analog-to-digital converters respectively coupled to a plurality of column lines. In other embodiments, the ADC unit 180 may perform a single ADC that sequentially converts the analog signals using a single analog-to-digital converter.
In some embodiments, the ADC unit 180 may further include a correlated double sampling (CDS) unit for extracting an effective signal component. In some embodiments, the CDS unit may perform an analog double sampling that extracts the effective signal component based on a difference between an analog reset signal including a reset component and an analog data signal including a signal component. In other embodiments, the CDS unit may perform a digital double sampling that converts the analog reset signal and the analog data signal into two digital signals and extracts the effective signal component based on a difference between the two digital signals. In still other embodiments, the CDS unit may perform a dual correlated double sampling that performs both the analog double sampling and the digital double sampling.
The row scanning circuit 170 may receive control signals from the control unit 250, and may control a row address and a row scan of the pixel array 160. To select a row line among a plurality of row lines, the row scanning circuit 170 may apply a signal for activating the selected row line to the pixel array 160. In some embodiments, the row scanning circuit 170 may include a row decoder that selects a row line of the pixel array 160 and a row driver that applies a signal for activating the selected row line.
The column scanning circuit 190 may receive control signals from the control unit 250, and may control a column address and a column scan of the pixel array 160. The column scanning circuit 190 may output a digital output signal from the ADC unit 180 to a digital signal processing circuit (not shown) or to an external host (not shown). For example, the column scanning circuit 190 may provide the ADC unit 180 with a horizontal scan control signal to sequentially select a plurality of analog-to-digital converters included in the ADC unit 180. In some embodiments, the column scanning circuit 190 may include a column decoder that selects one of the plurality of analog-to-digital converters and a column driver that applies an output of the selected analog-to-digital converter to a horizontal transmission line. The horizontal transmission line may have a bit width corresponding to that of the digital output signal.
The control unit 250 may control the ADC unit 180, the row scanning circuit 170, the column scanning circuit 190 and the light source module 350. The control unit 250 may provide the ADC unit 180, the row scanning circuit 170, the column scanning circuit 190 and the light source module 350 with control signals, such as a clock signal, a timing control signal, etc. In some embodiments, the control unit 250 may include a control logic circuit, a phase locked loop circuit, a timing control circuit, a communication interface circuit, etc.
The light source module 350 may emit light of a desired (or, alternatively, a predetermined) wavelength. For example, the light source module 350 may emit infrared light or near-infrared light. The light source module 350 may include a light source 360 and a lens 370. The light source 360 may be controlled by the control unit 250 to emit the emitted light TX of which the intensity periodically changes. For example, the intensity of the emitted light TX may be controlled such that the intensity of the emitted light TX has a waveform of a pulse wave, a sine wave, a cosine wave, etc. The light source 360 may be implemented by a light emitting diode (LED), a laser diode, etc. The lens 370 may be configured to focus the emitted light TX on the object 60.
The lens module 450 may include a lens 460 and an infrared filter 470. The lens 460 concentrates the received light RX reflected from the object 60 and provides the concentrated light to the pixel array 160. The infrared filter 470 filters out components, such as visible light VL, having frequencies other than the frequency corresponding to infrared light.
The sensor unit 155 polarizes the received light RX which passes through the lens module 450 and converts the polarized light to electrical signals. For polarizing the received light RX and converting the polarized light to electrical signals, the pixel array 160 may include a plurality of pixels, each including a polarization grid as will be described below. That is, the three-dimensional image sensor 20 of
Referring to
Referring to
Hereinafter, an operation of the three-dimensional image sensor 20 according to example embodiments will be described below.
The control unit 250 may control the light source module 350 to emit the emitted light TX having the periodic intensity. The emitted light TX emitted by the light source module 350 may be reflected from the object 60 back to the three-dimensional image sensor 20 as the received light RX. The received light RX may enter the depth pixels after only the infrared components pass through the lens module 450. The depth pixels may polarize the received light in one direction, and the depth pixels may be activated by the row scanning circuit 170 to output analog signals corresponding to the received light RX. The ADC unit 180 may convert the analog signals output from the depth pixels into digital data DATA. The digital data DATA may be provided to the control unit 250 by the column scanning circuit 190.
A calculation unit 260 included in the control unit 250 may calculate a distance of the object 60 from the three-dimensional image sensor 20 based on the digital data DATA.
The emitted light TX emitted by the light source module 350 may be reflected from the object 60 back to the three-dimensional image sensor 20 as the received light RX. The received light RX may enter the depth pixels. The depth pixels may polarize the received light RX and output analog signals corresponding to the polarized light, and the ADC unit 180 may convert the analog signals output from the depth pixels into digital data DATA. The digital data DATA may be converted to the depth information by the calculation unit 260.
The digital data DATA and/or the depth information may be provided to the digital signal processing circuit or the external host. In some embodiments, the pixel array 160 may include color pixels, and the color image information as well as the depth information may be provided to the digital signal processing circuit or the external host.
As described above, in the three-dimensional image sensor 20 according to example embodiments, since the pixel array 160 including the grid polarizer of
Referring to
The first three-dimensional image sensor 720 may include a light source module 721 and a lens module 722. The second three-dimensional image sensor 730 may include a light source module 731 and a lens module 732. In addition, each of the first and second image sensors may further include a sensing circuit and a control unit such as the three-dimensional image sensors 10 and 20 of
The light source module 721 of the first three-dimensional image sensor 720 emits an emitted light TX1 polarized in a first direction, and the lens module 722 of the first three-dimensional image sensor 720 may include a polarization filter to polarize a received light RX1 from the object 710 in one direction and may convert a polarized light to electrical signals. The light source module 731 of the second three-dimensional image sensor 730 emits an emitted light TX2 polarized in a second direction, and the lens module 732 of the second three-dimensional image sensor 730 may include a polarization filter to polarize a received light RX2 from the object 710 in one direction and may convert a polarized light to electrical signals. The first direction may differ from the second direction.
As described above, in the three-dimensional image sensor system 700 according to example embodiments, since the light source module 721 emits emitted light TX1 polarized in the first direction while the light source module 731 emits emitted light TX2 polarized in the second direction different from the first direction, the interference effect due to a plurality of emitted lights may be reduced and a dynamic range of the three-dimensional image sensor system 700 may be enhanced.
Referring to
The pixel array 110a may include depth pixels that receive light RX corresponding to a first emitted light TX1 and a second emitted light TX2 that are emitted by the light source module 300a and are reflected from an object 50a. The depth pixels may convert the received light RX into electrical signals. The depth pixels may provide information about a distance of the object 50a from the three-dimensional image sensor 10a and/or black-and-white image information.
The pixel array 110a may further include color pixels for providing color image information. In this case, the three-dimensional image sensor 10a may be a three-dimensional color image sensor that provides the color image information and the depth information. In some embodiments, an infrared filter or a near-infrared filter may be formed on the depth pixels, and a color filter (e.g., red, green and blue filters) may be formed on the color pixels. A ratio of the number of the depth pixels to the number of the color pixels may vary according to example embodiments.
The ADC unit 130a may convert an analog signal output from the pixel array 110a into a digital signal. In some embodiments, the ADC unit 130a may perform a column ADC that converts analog signals in parallel using a plurality of analog-to-digital converters respectively coupled to a plurality of column lines. In other embodiments, the ADC unit 130a may perform a single ADC that sequentially converts the analog signals using a single analog-to-digital converter.
In some embodiments, the ADC unit 130a may further include a correlated double sampling (CDS) unit for extracting an effective signal component. In some embodiments, the CDS unit may perform an analog double sampling that extracts the effective signal component based on a difference between an analog reset signal including a reset component and an analog data signal including a signal component. In other embodiments, the CDS unit may perform a digital double sampling that converts the analog reset signal and the analog data signal into two digital signals and extracts the effective signal component based on a difference between the two digital signals. In still other embodiments, the CDS unit may perform a dual correlated double sampling that performs both the analog double sampling and the digital double sampling.
The row scanning circuit 120a may receive control signals from the control unit 200a, and may control a row address and a row scan of the pixel array 110a. To select a row line among a plurality of row lines, the row scanning circuit 120a may apply a signal for activating the selected row line to the pixel array 110a. In some embodiments, the row scanning circuit 120a may include a row decoder that selects a row line of the pixel array 110a and a row driver that applies a signal for activating the selected row line.
The column scanning circuit 140a may receive control signals from the control unit 200a, and may control a column address and a column scan of the pixel array 110a. The column scanning circuit 140a may output a digital output signal from the ADC unit 130a to a digital signal processing circuit (not shown) or to an external host (not shown). For example, the column scanning circuit 140a may provide the ADC unit 130a with a horizontal scan control signal to sequentially select a plurality of analog-to-digital converters included in the ADC unit 130a. In some embodiments, the column scanning circuit 140a may include a column decoder that selects one of the plurality of analog-to-digital converters and a column driver that applies an output of the selected analog-to-digital converter to a horizontal transmission line. The horizontal transmission line may have a bit width corresponding to that of the digital output signal.
The control unit 200a may control the ADC unit 130a, the row scanning circuit 120a, the column scanning circuit 140a and the light source module 300a. The control unit 200a may provide the ADC unit 130a, the row scanning circuit 120a, the column scanning circuit 140a and the light source module 300a with control signals, such as a clock signal, a timing control signal, etc. In some embodiments, the control unit 200a may include a control logic circuit, a phase locked loop circuit, a timing control circuit, a communication interface circuit, etc.
The light source module 300a may emit light of a desired (or, alternatively, a predetermined) wavelength. For example, the light source module 300a may emit infrared light or near-infrared light. The light source module 300a may include a first light source 310a, a second light source 320a and a lens 330a. The first light source 310a may be controlled by the control unit 200a to emit a first emitted light TX1 of which the intensity periodically changes in response to a first control signal CTL1 from the control unit 200a. For example, the intensity of the first emitted light TX1 may be controlled such that the intensity of the first emitted light TX1 has a waveform of a pulse wave, a sine wave, a cosine wave, etc. The second light source 320a may be controlled by the control unit 200a to emit a second emitted light TX2 of which the intensity periodically changes in response to a second control signal CTL2 from the control unit 200a. For example, the intensity of the second emitted light TX2 may be controlled such that the intensity of the second emitted light TX2 has a waveform of a pulse wave, a sine wave, a cosine wave, etc. The first and second control signals CTL1 and CTL2 may control the light source module 300a such that the first emitted light TX1 and the second emitted light TX2 have different enabling pulse widths with respect to each other. The first and second light sources 310a and 320a may be implemented by a light emitting diode (LED), a laser diode, etc. The lens 330a may be configured to focus the first and second emitted lights TX1 and TX2 on the object 50a.
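A minimal sketch of how such alternating enable signals might be generated is given below; the period, pulse widths, time base, and helper names are assumptions chosen only to illustrate that the first and second light sources are enabled in non-overlapping intervals of different widths, and they do not represent the actual control logic of the control unit 200a.

```python
def control_signals(period_us, width1_us, width2_us, n_periods=2):
    """Return (time, CTL1, CTL2) samples in 1-microsecond steps.
    CTL1 enables the first light source at the start of each period and
    CTL2 enables the second light source immediately afterwards, so the
    emitted lights TX1 and TX2 do not overlap and have different pulse widths."""
    assert width1_us + width2_us <= period_us
    times, ctl1, ctl2 = [], [], []
    for t in range(period_us * n_periods):
        phase = t % period_us
        times.append(t)
        ctl1.append(1 if phase < width1_us else 0)
        ctl2.append(1 if width1_us <= phase < width1_us + width2_us else 0)
    return times, ctl1, ctl2

# Hypothetical timing: a 100 us period, 20 us enable for TX1, 40 us for TX2.
_, c1, c2 = control_signals(100, 20, 40)
```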
The lens module 400a may include a light-receiving lens 410a and a filter 420a. The light-receiving lens 410a concentrates the received light RX reflected from the object 50a and provides the concentrated light to the pixel array 110a. The filter 420a, for example an infrared filter, filters out components, such as visible light VL, having frequencies other than the frequency corresponding to infrared light. The sensor unit 105a may convert the filtered received light RX to electrical signals.
Hereinafter, an operation of the three-dimensional image sensor 10a according to example embodiments will be described below.
The control unit 200a may control the light source module 300a to emit the first emitted light TX1 and the second emitted light TX2 having pulse widths of different enabling intervals with respect to each other using the first and second control signals CTL1 and CTL2. The first and second emitted lights TX1 and TX2 emitted by the light source module 300a may be reflected from the object 50a back to the three-dimensional image sensor 10a as the received light RX. The received light RX may enter the depth pixels after only the infrared components pass through the lens module 400a. The depth pixels may be activated by the row scanning circuit 120a to output analog signals corresponding to the received light RX. The ADC unit 130a may convert the analog signals output from the depth pixels into digital data DATA. The digital data DATA may be provided to the control unit 200a by the column scanning circuit 140a.
A calculation unit 210a included in the control unit 200a may calculate a distance of the object 50a from the three-dimensional image sensor 10a based on the digital data DATA.
The digital data DATA and/or the depth information may be provided to the digital signal processing circuit or the external host. In some embodiments, the pixel array 110a may include color pixels, and the color image information as well as the depth information may be provided to the digital signal processing circuit or the external host.
As described above, in the three-dimensional image sensor 10a according to example embodiments, since the light source module 300a includes the first and second light sources 310a and 320a, which emit the first emitted light TX1 and the second emitted light TX2 having pulse widths of different enabling intervals with respect to each other in response to the first and second control signals CTL1 and CTL2 from the control unit 200a, an over-saturation effect of the object 50a due to the plurality of light sources having different enabling pulse widths may be prevented.
Referring to
Referring to
Referring to
In
The description of an example of calculating a distance of the object 50a by the three-dimensional image sensor 10a of
Referring to
Referring to
Referring to
As described above, in the three-dimensional image sensor, components for controlling the color pixels and components for controlling the depth pixels may independently operate to provide the color data CDTA and the depth data ZDTA of an image.
Although it is described that the sensor unit 105a of the three-dimensional image sensor 10a of
Referring to
The receiving lens 810a may focus incident light on a photo-receiving region (e.g., depth pixels and/or color pixels) of the three-dimensional image sensor chip 820a. The three-dimensional image sensor chip 820a may generate data DATA1 including depth information and/or color image information based on the incident light passing through the receiving lens 810a. For example, the data DATA1 generated by the three-dimensional image sensor chip 820a may include depth data generated using infrared light or near-infrared light emitted by the light source module 830a, and red, green, blue (RGB) data of a Bayer pattern generated using external visible light VL. The three-dimensional image sensor chip 820a may provide the data DATA1 to the engine unit 840a in response to a clock signal CLK. In some embodiments, the three-dimensional image sensor chip 820a may interface with the engine unit 840a using a mobile industry processor interface (MIPI) and/or a camera serial interface (CSI).
The engine unit 840a may control the three-dimensional image sensor 900a. The engine unit 840a may process the data DATA1 received from the three-dimensional image sensor chip 820a. For example, the engine unit 840a may generate three-dimensional color data based on the received data DATA1. In other examples, the engine unit 840a may generate luminance/chrominance (YUV) data including a luminance component (Y), a difference between the luminance component and a blue component (U), and a difference between the luminance component and a red component (V) based on the RGB data, or may generate compressed data, such as Joint Photographic Experts Group (JPEG) data. The engine unit 840a may be coupled to a host/application 850a, and may provide data DATA2 to the host/application 850a based on a master clock signal MCLK. In some embodiments, the engine unit 840a may interface with the host/application 850a using a serial peripheral interface (SPI) and/or an inter-integrated circuit (I2C) interface.
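As an illustration of the YUV generation mentioned above, one common RGB-to-YUV conversion is sketched below using BT.601-style coefficients; the coefficients actually used by the engine unit are not specified in this text, so this is only an assumed example.

```python
def rgb_to_yuv(r, g, b):
    """Convert one RGB pixel (components in 0..255) to YUV.
    Y is the luminance component, U is scaled (blue - luminance), and
    V is scaled (red - luminance), as described above."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)
    v = 0.877 * (r - y)
    return y, u, v

print(rgb_to_yuv(200, 120, 40))  # a warm, orange-ish pixel
```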
Referring to
Each operation of the receiving lens 810b, the engine unit 840b and a host/application 850b may be substantially the same as each operation of the receiving lens 810a, the engine unit 840a and the host/application 850a in
Referring to
The receiving lens 1120 may focus incident light on a photo-receiving region (e.g., depth pixels and/or color pixels) of the three-dimensional image sensor chip 1200. The three-dimensional image sensor chip 1200 may generate depth data ZDTA including depth information of a plurality of objects 1050 based on emitted light TX reflected from the plurality of objects 1050 as received light RX, and may provide the depth data ZDTA to the engine unit 1300. The engine unit 1300 may generate a depth map including depth information of the plurality of objects 1050 based on the depth data ZDTA, may segment the plurality of objects 1050 in the depth map based on the depth map, and may generate a control signal CTRL for controlling the receiving lens 1120 based on the segmented objects. That is, the engine unit 1300 may select one of the plurality of objects 1050 in the depth map, may set the selected object as a focusing region, and may generate the control signal CTRL for focusing the receiving lens 1120 on the focusing region.
The motor unit 1130 may control the focusing of the receiving lens 1120 on the selected object by moving the receiving lens 1120 in response to the control signal CTRL. The three-dimensional image sensor chip 1200 may generate color data CDTA of the objects 1050 based on visible light VL which is reflected from the objects 1050 and received through the focus-adjusted receiving lens 1120, and may provide the color data CDTA to the engine unit 1300.
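A simplified sketch of this control flow (build a depth map, segment it into objects by depth, select one object as the focus region, and derive a lens command for the motor unit) is shown below; the depth-bin segmentation rule, the focus-command mapping, and all names are hypothetical and are used only for illustration.

```python
def segment_by_depth(depth_map, bin_m=0.5):
    """Group pixels of a depth map (in metres) into coarse depth bins;
    each non-empty bin is treated here as one segmented object."""
    segments = {}
    for row in depth_map:
        for z in row:
            segments.setdefault(round(z / bin_m), []).append(z)
    return segments  # {bin index: list of pixel depths}

def focus_command(depth_map, selected_bin, bin_m=0.5):
    """Return a target focus distance (metres) for the motor unit:
    the mean depth of the selected segmented object."""
    depths = segment_by_depth(depth_map, bin_m)[selected_bin]
    return sum(depths) / len(depths)

# Hypothetical 2x4 depth map with a near object (~1 m) and a far one (~3 m).
dm = [[1.0, 1.1, 3.0, 3.1],
      [0.9, 1.0, 2.9, 3.2]]
print(sorted(segment_by_depth(dm)))  # two depth bins -> two segmented objects
print(focus_command(dm, 2))          # ~1.0, i.e. focus on the nearer object
```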
The light source module 1110 may include a light source 1111 and a lens 1112. The light source 1111 may generate infrared light or near-infrared light, and the lens 1112 may focus the infrared light or near-infrared light on the objects 1050.
The three-dimensional image sensor chip 1200 may provide data DATA1, including the depth data ZDTA and/or the color data CDTA, to the engine unit 1300 in response to a clock signal CLK. In some embodiments, the three-dimensional image sensor chip 1200 may interface with the engine unit 1300 using a mobile industry processor interface (MIPI) and/or a camera serial interface (CSI).
The engine unit 1300 may control the three-dimensional image sensor 1100 and the motor unit 1130. The engine unit 1300 may process the data DATA1 received from the three-dimensional image sensor chip 1200. For example, the engine unit 1300 may generate three-dimensional color data based on the received data DATA1. In other examples, the engine unit 1300 may generate YUV data including a luminance component, a difference between the luminance component and a blue component, and a difference between the luminance component and a red component based on the RGB data, or may generate compressed data, such as Joint Photographic Experts Group (JPEG) data. The engine unit 1300 may be coupled to a host/application 1400, and may provide data DATA2 to the host/application 1400 based on a master clock signal MCLK. In some embodiments, the engine unit 1300 may interface with the host/application 1400 using a serial peripheral interface (SPI) and/or an inter integrated circuit (I2C) interface.
In an example embodiment of
Referring to
In
Referring to
The depth pixel array 1211 may include depth pixels receiving light RX that is emitted by the light source module 1110 and is reflected from the objects 1050. The depth pixels may convert the received light RX into electrical signals. The depth pixels may provide information about distances of the objects 1050 from the depth sensor 1210 and/or black-and-white image information.
The ADC unit 1212 may convert an analog signal output from the depth pixel array 1211 into a digital signal. In some embodiments, the ADC unit 1212 may perform a column ADC that converts analog signals in parallel using a plurality of analog-to-digital converters respectively coupled to a plurality of column lines. In other embodiments, the ADC unit 1212 may perform a single ADC that sequentially converts the analog signals using a single analog-to-digital converter.
In some embodiments, the ADC unit 1212 may further include a correlated double sampling (CDS) unit for extracting an effective signal component. In some embodiments, the CDS unit may perform an analog double sampling that extracts the effective signal component based on a difference between an analog reset signal including a reset component and an analog data signal including a signal component. In other embodiments, the CDS unit may perform a digital double sampling that converts the analog reset signal and the analog data signal into two digital signals and extracts the effective signal component based on a difference between the two digital signals. In still other embodiments, the CDS unit may perform a dual correlated double sampling that performs both the analog double sampling and the digital double sampling.
The row scanning circuit 1213 may receive control signals from the control unit 1215, and may control a row address and a row scan of the depth pixel array 1211. To select a row line among a plurality of row lines, the row scanning circuit 1213 may apply a signal for activating the selected row line to the depth pixel array 1211. In some embodiments, the row scanning circuit 1213 may include a row decoder that selects a row line of the depth pixel array 1211 and a row driver that applies a signal for activating the selected row line.
The column scanning circuit 1214 may receive control signals from the control unit 1215, and may control a column address and a column scan of the depth pixel array 1211. The column scanning circuit 1214 may output a digital output signal from the ADC unit 1212 to a digital signal processing circuit (not shown) or to an external host (not shown). For example, the column scanning circuit 1214 may provide the ADC unit 1212 with a horizontal scan control signal to sequentially select a plurality of analog-to-digital converters included in the ADC unit 1212. In some embodiments, the column scanning circuit 1214 may include a column decoder that selects one of the plurality of analog-to-digital converters and a column driver that applies an output of the selected analog-to-digital converter to a horizontal transmission line. The horizontal transmission line may have a bit width corresponding to that of the digital output signal.
The control unit 1215 may control the ADC unit 1212, the row scanning circuit 1213, the column scanning circuit 1214 and the light source module 1110. The control unit 1215 may provide the ADC unit 1212, the row scanning circuit 1213, the column scanning circuit 1214 and the light source module 1110 with control signals, such as a clock signal, a timing control signal, etc. In some embodiments, the control unit 1215 may include a control logic circuit, a phase locked loop circuit, a timing control circuit, a communication interface circuit, etc.
The light source module 1110 may emit light of a desired (or, alternatively, a predetermined) wavelength. For example, the light source module 1110 may emit infrared light or near-infrared light. The light source 1111 may be controlled by the control unit 1215 to emit the emitted light TX of which the intensity periodically changes. For example, the intensity of the emitted light TX may be controlled such that the intensity of the emitted light TX has a waveform of a pulse wave, a sine wave, a cosine wave, etc. The light source 1111 may be implemented by a light emitting diode (LED), a laser diode, etc.
Referring to
Although not illustrated in
In an example of
In
Referring to
The first ISP (depth ISP) 1310 may process the depth data ZDTA to generate a depth image ZIMG and a depth map DM of the objects 1050. The depth map DM may include depth information of the objects 1050, and the depth image ZIMG may be a black-and-white image including depth information of the objects 1050. The depth image ZIMG may be provided to the host/application 1400, and the depth map DM may be provided to the segmentation and control unit 1320. The segmentation and control unit 1320 may segment the objects 1050 in the depth map DM based on the depth map DM and may generate the control signal CTRL for controlling the receiving lens 1120 based on the segmented object. The control signal CTRL may be provided to the motor unit 1130 and the motor unit 1130 may control the focusing of the receiving lens 1120 on the object selected in the segmentation and control unit 1320 by moving the receiving lens 1120 in response to the control signal CTRL. The three-dimensional image sensor chip 1200 may generate the color data CDTA of the objects 1050 based on visible light VL which is reflected from the objects 1050 and may provide the color data CDTA to the second ISP (color ISP) 1330. The second ISP 1330 may process the color data CDTA to generate a color image CIMG. The second ISP 1330 may perform color image processing on each of the objects 1050 according to respective distances of the objects 1050 from the three-dimensional image sensor chip 1200.
As described above, in the camera 1000 according to example embodiments, the depth map DM is generated based on depth information of the objects 1050, one of the objects 1050 to be focused on by the receiving lens 1120 is selected based on the depth map DM, the receiving lens 1120 is moved such that the receiving lens 1120 is focused on the selected object, and each of the objects 1050 may be processed into a color image according to the respective distances between the receiving lens 1120 (or the three-dimensional image sensor chip 1200) and the respective objects 1050. That is, the object selected in the segmentation and control unit 1320 may be processed with more calculations, while objects other than the selected object may be processed with fewer calculations.
Referring to
Hereinafter, the operation of the camera 1000 will be described in detail with reference to
When the objects 1050 are positioned at different distances from the camera 1000, the depth map DM of
For example, when the user selects an object S02 as in
For example, when the user selects an object S03 as in
Referring to
The receiving lens 1520 may focus incident light on a photo-receiving region (e.g., depth pixels and/or color pixels) of the three-dimensional image sensor chip 1600. The three-dimensional image sensor chip 1600 may generate depth data ZDTA including depth information of a plurality of objects 1060 based on received light RX reflected from the plurality of objects 1060, may generate color data CDTA including color information of the objects 1060 based on visible light VL from the objects 1060, and may provide the depth data ZDTA and the color data CDTA to the engine unit 1700. The engine unit 1700 may generate a depth map including depth information of the plurality of objects 1060 based on the depth data ZDTA, and may perform an image blurring process on the color data CDTA based on the depth map.
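One way to picture the depth-dependent blurring described above is the sketch below, which blurs a pixel more strongly the farther its depth lies from the selected in-focus depth; the blur-radius rule and all names are assumptions for illustration and do not describe the engine unit's actual algorithm.

```python
def blur_radius(pixel_depth_m, focus_depth_m, gain=2.0, max_radius=8):
    """Map the depth difference from the in-focus object to a blur radius
    in pixels: pixels at the selected depth stay sharp, and pixels farther
    from the focus plane receive progressively larger blur radii."""
    radius = int(gain * abs(pixel_depth_m - focus_depth_m))
    return min(radius, max_radius)

# Hypothetical focus on an object about 1.2 m from the camera.
for z in (1.2, 1.8, 3.0):
    print(z, "m ->", blur_radius(z, 1.2), "pixel blur radius")
```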
The light source module 1510 may include a light source 1511 and a lens 1512. The light source 1511 may generate infrared light or near-infrared light, and the lens 1512 may focus the infrared light or near-infrared light on the objects 1060.
The three-dimensional image sensor (or sensor module) 1500 may provide data DATA1 including the depth data ZDTA and/or the color data CDTA to the engine unit 1700 in response to a clock signal CLK. In some embodiments, the three-dimensional image sensor chip 1600 may interface with the engine unit 1700 using a mobile industry processor interface (MIPI) and/or a camera serial interface (CSI).
The engine unit 1700 may control the three-dimensional image sensor 1500. The engine unit 1700 may process the data DATA1 received from the three-dimensional image sensor chip 1600. For example, the engine unit 1700 may generate three-dimensional color data based on the received data DATA1. In other examples, the engine unit 1700 may generate YUV data including a luminance component, a difference between the luminance component and a blue component, and a difference between the luminance component and a red component based on the RGB data, or may generate compressed data, such as Joint Photographic Experts Group (JPEG) data. The engine unit 1700 may be coupled to a host/application 1800, and may provide data DATA2 to the host/application 1800 based on a master clock signal MCLK. In some embodiments, the engine unit 1700 may interface with the host/application 1800 using a serial peripheral interface (SPI) and/or an inter-integrated circuit (I2C) interface.
In an example embodiment of
The three-dimensional image sensor chip 1600 may have configuration of the three-dimensional image sensor chip 1200a of
Referring to
Referring to
For convenience of explanation,
Hereinafter, the operation of the camera 1020 will be described in detail with reference to
When the objects 1060 are positioned at different distances from the camera 1020, the depth map DM of
For example, when a user selects an object S01 as in
For example, when a user selects an object S02 as in
For example, when a user selects an object S03 as in
Referring to
Referring to
Referring to
The processor 2010 may perform specific calculations or tasks. For example, the processor 2010 may be a microprocessor, a central processing unit (CPU), a digital signal processor, or the like. The processor 2010 may communicate with the memory device 2020, the storage device 2030 and the input/output device 2040 via an address bus, a control bus and/or a data bus. The processor 2010 may be coupled to an extension bus, such as a peripheral component interconnect (PCI) bus. The memory device 2020 may store data for operating the computing system 2000. For example, the memory device 2020 may be implemented by a dynamic random access memory (DRAM), a mobile DRAM, a static random access memory (SRAM), a phase change random access memory (PRAM), a resistance random access memory (RRAM), a nano floating gate memory (NFGM), a polymer random access memory (PoRAM), a magnetic random access memory (MRAM), a ferroelectric random access memory (FRAM), etc. The storage device 2030 may include a solid state drive, a hard disk drive, a compact disc read-only memory (CD-ROM), etc. The input/output device 2040 may include an input device, such as a keyboard, a mouse, a keypad, etc., and an output device, such as a printer, a display device, etc. The power supply 2050 may supply power to the computing system 2000.
The camera 2060 may be coupled to the processor 2010 via the buses or other communication links. The camera 2060 may employ one of the camera 800a of
In some embodiments, camera 2060 and/or components of the camera 2060 may be packaged in various forms, such as package on package (PoP), ball grid arrays (BGAs), chip scale packages (CSPs), plastic leaded chip carrier (PLCC), plastic dual in-line package (PDIP), die in waffle pack, die in wafer form, chip on board (COB), ceramic dual in-line package (CERDIP), plastic metric quad flat pack (MQFP), thin quad flat pack (TQFP), small outline IC (SOIC), shrink small outline package (SSOP), thin small outline package (TSOP), system in package (SIP), multi chip package (MCP), wafer-level fabricated package (WFP), or wafer-level processed stack package (WSP).
The computing system 2000 may be any computing system including the camera 2060. For example, the computing system 2000 may include a digital camera, a mobile phone, a smart phone, a personal digital assistant (PDA), a portable multimedia player (PMP), etc.
Referring to
The computing system 2100 may further include a radio frequency (RF) chip 2160. A physical interface (PHY) 2113 of the application processor 2110 may perform data transfer with a PHY 2161 of the RF chip 2160 using a MIPI DigRF. The PHY 2113 of the application processor 2110 may include a DigRF MASTER 2114 for controlling the data transfer with the PHY 2161 of the RF chip 2160. The computing system 2100 may further include a global positioning system (GPS) 2120, a storage device 2170, a microphone 2180, a DRAM 2185 and a speaker 2190. The computing system 2100 may communicate with external devices using an ultra wideband (UWB) communication 2210, a wireless local area network (WLAN) communication 2220, a worldwide interoperability for microwave access (WIMAX) communication 2230, etc. The inventive concepts may not be limited to configurations or interfaces of the computing systems 2000 and 2100 illustrated in
The inventive concept may be applied to any three-dimensional image sensor or any system including the three-dimensional image sensor, such as a computer, a digital camera, a three-dimensional camera, a mobile phone, a personal digital assistant (PDA), a scanner, a navigator, a video phone, a monitoring system, an auto focus system, a tracking system, a motion capture system, an image stabilizing system, etc.
While example embodiments have been particularly shown and described, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention as defined by the following claims.
Number | Date | Country | Kind
---|---|---|---
10-2011-0028579 | Mar. 30, 2011 | KR | national
10-2011-0029249 | Mar. 31, 2011 | KR | national
10-2011-0029388 | Mar. 31, 2011 | KR | national