The inventive concepts described herein relate to binary image sensors and unit pixels of binary image sensors.
An image sensor is a device which can convert an optical image into an electrical signal. As the computer and communications industries develop, demand for image sensors having improved performance may increase in various fields such as digital cameras, camcorders, personal communication systems (PCS), game machines, security cameras, medical micro cameras, robots, and so on.
A charge coupled device (CCD) image sensor and/or a CMOS image sensor may be used as an image sensor. A CMOS image sensor may be characterized by a simpler driving scheme and by integration of a signal processing circuit in a single chip, which enables downsizing. Since power consumption of the CMOS image sensor is relatively low, the CMOS image sensor may be more easily applied to products with a limited battery capacity. Also, the CMOS image sensor may be fabricated using CMOS process technology, so that fabrication costs may be reduced.
The CMOS image sensor or the CCD image sensor may be formed of a plurality of pixels, each of which is made up of unit pixels. A size of a pixel may be about 2 micrometers (μm). In the event that a pixel is fabricated to have a size less than 1 μm, it may be difficult to achieve performance improvement of the image sensor, for example, due to a narrower dynamic range, a smaller full well capacity, and/or a signal-to-noise ratio (SNR) of less than about 10 to 20:1.
According to some embodiments of the inventive concepts, a binary image sensor includes a plurality of unit pixels on a substrate having a surface that is configured to receive incident light. The unit pixels respectively include source and drain regions in the substrate and a channel region therebetween, and a gate electrode on the channel region. The unit pixels further respectively include at least one quantum dot on the surface of the substrate, and a charge storage region between the gate electrode and the at least one quantum dot.
In some embodiments, the charge storage region may be configured to store carriers therein that are generated by the quantum dot responsive to incident light.
In some embodiments, the channel region may be configured such that an electrostatic potential thereof differs responsive to a quantity and/or type of the carriers stored in the charge storage region.
In some embodiments, the unit pixels may be configured to operate responsive to different wavelengths of light. The plurality of unit pixels may define a pixel of a pixel array.
In some embodiments, the unit pixels may respectively include a quantum dot of a different material. The unit pixels may be separated by an isolation layer therebetween.
Some embodiments of the inventive concepts provide a binary image sensor which comprises a plurality of unit pixels on a substrate having one surface on which light is incident and at least one quantum dot disposed on the one surface of the substrate; a column sense amplifier circuit configured to detect binary information of a selected unit pixel among the plurality of unit pixels from a voltage or a current detected from the selected unit pixel; and a processing unit configured to process binary information of the respective unit pixels to generate pixel image information.
In some embodiments, the column sense amplifier circuit determines the binary information to be 1 when the voltage or current is over a threshold value and to be 0 when the voltage or current is below the threshold value.
In some embodiments, the pixel image information is generated from a set of three or four unit pixels.
In some embodiments, the processing unit accumulates binary information of the respective unit pixels constituting a set of the unit pixels to generate the pixel image information.
In some embodiments, the processing unit generates the pixel image information based on binary information of one of the unit pixels constituting a set of the unit pixels.
In some embodiments, the at least one quantum dot is formed of different materials.
In some embodiments, the at least one quantum dot has different sizes.
In some embodiments, the unit pixels include red, green and blue unit pixels.
Other embodiments of the inventive concepts are directed to a unit pixel of a binary image sensor which comprises a substrate having one surface on which light is incident; at least one quantum dot disposed on the one surface of the substrate; a gate electrode disposed on the other surface of the substrate; source and drain areas formed at both sides of the gate electrode; a channel area formed between the source and drain areas; and a charge storage area formed between the channel area and the at least one quantum dot and configured to store carriers transferred from the at least one quantum dot.
In some embodiments, when a gate voltage is applied to the gate electrode, carriers stored at the charge storage area lower an electrostatic potential of the channel area.
In some embodiments, a conductivity type of the charge storage area is different from conductivity types of the source and drain areas.
In some embodiments, the charge storage area is formed using an ion implantation process.
In some embodiments, the at least one quantum dot and the charge storage area are adjacent to each other.
In some embodiments, the at least one quantum dot is formed of a metallic compound or a silicon compound.
In some embodiments, the metallic compound is formed of a compound of at least one of copper, tungsten, and aluminum.
The above and other features of the present inventive concepts will become more apparent by describing in detail exemplary embodiments thereof with reference to the accompanying drawings, in which:
Exemplary embodiments of the present inventive concepts will be described more fully hereinafter with reference to the accompanying drawings. Like reference numerals may refer to like elements throughout the accompanying drawings.
It will be understood that, although the terms “first”, “second”, “third”, etc., may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer or section from another region, layer or section. Thus, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section without departing from the teachings of the present inventive concepts.
Spatially relative terms, such as “beneath”, “below”, “lower”, “under”, “above”, “upper”, etc., may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures.
As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It will be further understood that the terms “comprises”, “comprising”, “includes” and/or “including,” if used herein, specify the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof.
It will be understood that when an element or layer is referred to as being “on”, “connected to”, “coupled to”, or “adjacent to” another element or layer, it can be directly on, connected, coupled, or adjacent to the other element or layer, or intervening elements or layers may be present. In contrast, when an element is referred to as being “directly on”, “directly connected to”, “directly coupled to”, or “directly adjacent to” another element, there are no intervening elements present.
Example embodiments of the inventive concepts are described herein with reference to cross-sectional illustrations that are schematic illustrations of idealized embodiments (and intermediate structures) of example embodiments. As such, variations from the shapes of the illustrations as a result, for example, of manufacturing techniques and/or tolerances, are to be expected. Thus, example embodiments of the inventive concepts should not be construed as limited to the particular shapes of regions illustrated herein but are to include deviations in shapes that result, for example, from manufacturing. For example, an implanted region illustrated as a rectangle may have rounded or curved features and/or a gradient of implant concentration at its edges rather than a binary change from implanted to non-implanted region. Likewise, a buried region formed by implantation may result in some implantation in the region between the buried region and the surface through which the implantation takes place. Thus, the regions illustrated in the figures are schematic in nature and their shapes are not intended to illustrate the actual shape of a region of a device and are not intended to limit the scope of example embodiments.
Referring to
The pixel array 1100 may include a plurality of pixels arranged two-dimensionally. Each pixel may include a plurality of unit pixels (also called “JOTs”). Each unit pixel may convert an input optical signal into an electrical signal. Each unit pixel may include at least one quantum dot. A quantum dot may refer to a nano-sized semiconductor material that exhibits a quantum confinement effect. Carriers (electrons and/or holes) generated by photons incident on the quantum dot may cause a variation in an electrical characteristic (e.g., a voltage, a current, etc.) of the unit pixel. Binary information of the unit pixel may be defined using the variation in the electrical characteristic of the unit pixel.
In example embodiments, the binary image sensor and the unit pixel of the binary image sensor may improve collection efficiency of light applied to the unit pixel using the quantum dot. Also, crosstalk between unit pixels may be reduced by disposing a quantum dot for receiving light in each and/or every unit pixel. Thus, a wide dynamic range may be obtained. An analog-to-digital converter (ADC) of a conventional image sensor may be removed by defining binary information for each unit pixel. The unit pixel will be more fully described with reference to
The timing controller 1200 may control operating timing of the image sensor 1000. The timing controller 1200 may provide the row driver 1300 and the column sense amplifier circuit 1400 with a timing signal, a control signal, and address information.
The row driver 1300 may control operation of the pixel array 1100 using the address information from the timing controller 1200. The row driver 1300 may provide the pixel array 1100 with driving signals for driving a plurality of pixels. In the event that the plurality of pixels is arranged in a matrix form, the driving signals may be provided for each row.
The column sense amplifier circuit 1400 may include a plurality of sense amplifiers for sensing data states of unit pixels. The column sense amplifier circuit 1400 may sense the unit pixels to acquire or determine binary information of a logic ‘1’ or ‘0’. For example, when the sensed current of a unit pixel is over a threshold value, the column sense amplifier circuit 1400 may determine the binary information to be ‘1’. When the sensed current of the unit pixel is below the threshold value, the column sense amplifier circuit 1400 may determine the binary information to be ‘0’. Alternatively, in the case of a unit pixel through which a current flows, the column sense amplifier circuit 1400 may determine the binary information to be ‘1’, and in the case of a unit pixel through which no current flows, it may determine the binary information to be ‘0’.
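The thresholding behavior of the column sense amplifier circuit 1400 can be pictured with the following minimal sketch (illustrative only; the function name and the 1 μA default threshold are assumptions, not part of the disclosure):

```python
def sense_binary(sensed_value, threshold=1.0e-6):
    """Return the binary information of one unit pixel.

    Models the column sense amplifier decision: '1' when the sensed
    current (or voltage) exceeds the threshold, '0' otherwise.
    The default threshold is an arbitrary illustrative value.
    """
    return 1 if sensed_value > threshold else 0


# Hypothetical sensed drain currents (in amperes) for four unit pixels
row_currents = [2.3e-6, 0.1e-6, 1.8e-6, 0.0]
row_bits = [sense_binary(i) for i in row_currents]  # [1, 0, 1, 0]
```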
The processing unit 1500 may process binary information of the unit pixels to generate image information of pixels. For example, in the event that at least one of the unit pixels has binary information of ‘1’, the pixel corresponding to the unit pixel having binary information of ‘1’ may be processed to have image information of ‘1’. However, embodiments of the inventive concepts are not limited thereto. For example, pixel image information can be generated using various methods (e.g., a sum of the binary information of the unit pixels). That is, pixel image information may be decided according to the binary information of the unit pixels. The decided image information may be sent to an image data processing unit (not shown).
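Under the two examples given above (a logical OR over a set of unit pixels, or a sum of their binary information), the processing unit 1500 could be modeled, purely for illustration, as follows; the function names are hypothetical:

```python
def pixel_or(unit_bits):
    """Pixel image information of '1' if at least one unit pixel in the
    set has binary information of '1', as in the first example above."""
    return 1 if any(unit_bits) else 0


def pixel_sum(unit_bits):
    """Alternative mentioned above: accumulate (sum) the binary
    information of the unit pixels constituting the set."""
    return sum(unit_bits)


bits_of_one_pixel = [1, 0, 1, 0]     # a set of four unit pixels (hypothetical)
print(pixel_or(bits_of_one_pixel))   # 1
print(pixel_sum(bits_of_one_pixel))  # 2
```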
The image sensor 1000 according to some embodiments of the inventive concepts is applicable to and/or may otherwise be used in various multimedia devices having an image capture function. For example, the image sensor 1000 may be used in a mobile phone, a smart phone, a smart tablet, and so on. Also, the image sensor 1000 is applicable to a notebook computer, a television, a smart television, a digital camera, a digital camcorder, and so on.
Referring to
Source lines SL1 to SLn of the pixel array 1100 may be connected with the sources of unit pixels in a corresponding row. The row driver 1300 may apply a source voltage Vs to the sources of selected unit pixels through the source lines SL1 to SLn.
Gate lines GL1 to GLn of the pixel array 1100 may be connected with the gates of unit pixels in a corresponding row. The gate lines GL1 to GLn may be disposed in parallel with the source lines SL1 to SLn. The row driver 1300 may apply a gate voltage Vg to the gates of selected unit pixels through the gate lines GL1 to GLn.
Drains of the unit pixels may be connected to column lines CL1 to CLn of corresponding columns. The column sense amplifier circuit 1400 may sense currents or voltages of selected unit pixels through the column lines CL1 to CLn and detect binary information of the unit pixels from the sensed currents or voltages.
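A behavioral model of this row-by-row read-out, offered only as a sketch under assumed names and voltage values (the sensing callable, the Vg/Vs defaults, and the bit convention are not taken from the disclosure), might look like:

```python
def read_frame(sense_unit_pixel, n_rows, n_cols, vg=1.2, vs=0.0):
    """Behavioral sketch of a frame read-out.

    For each row, the row driver asserts the corresponding gate line (Vg)
    and source line (Vs); the column sense amplifier circuit then senses
    every column line of that row and returns one bit per unit pixel.
    `sense_unit_pixel(row, col, vg, vs)` is a hypothetical callable that
    stands in for the analog sensing path and returns 0 or 1.
    """
    frame = []
    for row in range(n_rows):
        row_bits = [sense_unit_pixel(row, col, vg, vs) for col in range(n_cols)]
        frame.append(row_bits)
    return frame
```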
Referring to
A channel may be formed between the drain area 1112 and the source area 1113. The gate insulation film 1114 may be formed of a silicon oxide (SiO2) film. The gate electrode 1115 may be a transparent electrode (for example, an ITO (Indium Tin Oxide) electrode) or a polysilicon electrode. The second area 1116 may be an insulation layer (e.g., a silicon oxide (SiO2) film). Also, a metal line connected with the gate electrode 1115 may be formed at the second area 1116.
The charge storage area 1117 may be formed on the channel, for example, in a direction illustrated in
Carriers (electrons or holes) generated by the quantum dot 1118 may be stored at the charge storage area 1117. The stored carriers may influence an electrostatic potential of the channel, and a level of current flowing through the channel may change according to the change in the electrostatic potential.
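As a first-order illustration only (this relation is not stated in the disclosure), the effect of the stored charge on the channel may be pictured with the familiar approximation that a stored charge density capacitively coupled to the channel shifts the effective threshold voltage:

```latex
\Delta V_{th} \approx -\frac{Q_{store}}{C_{c}}
```

where Q_store is the stored charge per unit area and C_c is the coupling capacitance per unit area between the charge storage area and the channel; the resulting threshold shift raises or lowers the channel current sensed on the column line.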
The quantum dot 1118 may generate carriers (electrons and/or holes) in response to input or incident light. The quantum dot 1118 may be formed on the charge storage area 1117. The quantum dot 1118 may include one or a plurality of quantum dots, and may be fabricated in a self-aligned manner or by patterning. The quantum dot 1118 may be n-type, p-type, or a combination of n-type and p-type. The quantum dot 1118 may be formed of a metal oxide (e.g., an oxide of copper, tungsten, aluminum, etc.), silicon, or gallium arsenide (GaAs).
A reset operation will be described with reference to
A charge operation will be described with reference to
A selection/read operation will be described with reference to
Referring to
The image sensor 1000 according to some embodiments of the inventive concepts may output binary information of the respective unit pixels. The binary information of the unit pixels may be processed by the processing unit 1500 to provide pixel image information.
For example, a unit pixel having the first quantum dot 1118a may absorb light having a wavelength (about 670 nm) corresponding to red R. A unit pixel having the second quantum dot 1118b may absorb light having a wavelength (about 570 nm) corresponding to green G. A unit pixel having the third quantum dot 1118c may absorb light having a wavelength (about 415 nm) corresponding to blue B.
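For illustration only, the binary outputs of such a red/green/blue set of unit pixels could be combined into a crude color value as in the sketch below; the wavelength table mirrors the values given above, while the function name and the 0/255 scaling are assumptions:

```python
# Approximate absorption wavelengths of the three quantum dots (from the text)
QD_WAVELENGTHS_NM = {"R": 670, "G": 570, "B": 415}

def pixel_rgb(bit_r, bit_g, bit_b):
    """Map the three unit-pixel bits of one pixel to a binary RGB triple."""
    return (255 * bit_r, 255 * bit_g, 255 * bit_b)

print(pixel_rgb(1, 0, 1))  # (255, 0, 255)
```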
According to this application, an image sensor 1000 (refer to
For example, the column sense amplifier circuit 1400 and the processing unit 1500 may be provided with a voltage or current output from a drain of the unit pixel 1110. The column sense amplifier circuit 1400 and the processing unit 1500 may operate the same as or similarly to what is described with reference to
Referring to
The lens 2100 may collect light incident onto a receiving area of the image sensor 2200.
The image sensor 2200 may include a pixel array formed of unit pixels described with reference to
The motor unit 2300 may adjust a focus of the lens 2100 or perform shuttering in response to a control signal CTRL from the engine unit 2400.
The engine unit 2400 may control the image sensor 2200 and the motor unit 2300. The engine unit 2400 may generate YUV data based on a distance and/or image data from the image sensor 2200. Here, the YUV data may include a distance from an object, a brightness component, a difference between the brightness component and a blue component, and a difference between the brightness component and a red component. The engine unit 2400 may generate compression data, for example, JPEG (Joint Photographic Experts Group) data. The engine unit 2400 may be connected with a host/application 2500. The engine unit 2400 may provide the host/application 2500 with YUV data or JPEG data based on a master clock MCLK. The engine unit 2400 may interface with the host/application 2500 through SPI (Serial Peripheral Interface) and/or I2C (Inter-Integrated Circuit).
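The text does not give the exact conversion coefficients, so the following sketch simply assumes the common BT.601 luma weights to illustrate how a brightness component and the blue/red difference components could be derived from RGB image data:

```python
def rgb_to_yuv(r, g, b):
    """Illustrative RGB -> YUV conversion (BT.601 luma weights assumed).

    Y is the brightness component; U and V are the differences between
    the brightness component and the blue and red components.
    """
    y = 0.299 * r + 0.587 * g + 0.114 * b  # brightness component
    u = b - y                              # blue difference component
    v = r - y                              # red difference component
    return y, u, v

print(rgb_to_yuv(255, 0, 0))  # (76.245, -76.245, 178.755)
```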
The processor 3100 may execute particular calculations or tasks. In example embodiments, the processor 3100 may include a microprocessor or a central processing unit (CPU). The processor 3100 may communicate with the memory device 3200, the storage device 3300 and the input/output device 3400 through an address bus, a control bus, and a data bus. In example embodiments, the processor 3100 may be connected to an expansion bus such as a PCI (Peripheral Component Interconnect) bus.
The memory device 3200 may store data that is used for operations of the computing system 3000. For example, the memory device 3200 may include a DRAM, a mobile DRAM, an SRAM, a PRAM, an FRAM, an RRAM, and/or an MRAM.
The storage device 3300 may include an SSD (Solid State Drive), an HDD (Hard Disk Drive), or a CD-ROM.
The input/output device 3400 may include input devices (e.g., a keyboard, a keypad, a mouse, etc.) and output devices (e.g., a printer, a display, etc.).
The power supply 3500 may supply an operating voltage needed for operations of the computing system 3000.
The image sensor 3600 may include a pixel array formed of unit pixels described with reference to
In example embodiments, the CSI host 4120 may include a deserializer (DES), and the CSI device 4310 may include a serializer (SER). The image sensor 4300 may include a pixel array formed of unit pixels described with reference to
A DSI (Display Serial Interface) host 4110 of the AP 4100 may perform serial communications with a DSI device 4410 of the display 4400 through a DSI. For example, the DSI host 4110 may include a serializer (SER), and the DSI device 4410 may include a deserializer (DES).
The computing system 4000 may further include an RF (Radio Frequency) chip 4500 which communicates with the AP 4100. A PHY (Physical layer) 4130 of the AP 4100 and a PHY 4510 of the RF chip 4500 may perform data exchange according to an MIPI (Mobile Industry Processor Interface) DigRF. Also, the PHY 4130 of the AP 4100 may further comprise a DigRF MASTER 4140 to control data exchange according to the MIPI DigRF.
The computing system 4000 may include a GPS (Global Positioning System) 4200, storage 4600, a microphone 4700, a DRAM 4800, and a speaker 4900.
Also, the computing system 4000 may communicate using WIMAX (Worldwide Interoperability for Microwave Access) 4910, WLAN (Wireless Local Area Network) 4920, and UWB (Ultra WideBand) 4930. However, embodiments of the inventive concepts are not limited thereto.
While the inventive concept has been described with reference to exemplary embodiments, it will be apparent to those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the present inventive concepts. Therefore, it should be understood that the above embodiments are not limiting, but illustrative.
A claim for priority under 35 U.S.C. §119 is made to U.S. Provisional Patent Application No. 61/713,175 filed Oct. 12, 2012, in the U.S. Patent and Trademark Office, and to Korean Patent Application No. 10-2013-0025083 filed Mar. 8, 2013, in the Korean Intellectual Property Office, the disclosures of which are hereby incorporated by reference herein in their entireties.