This application claims priority under 35 U.S.C. § 119 to Korean Patent Application No. 10-2023-0142425 filed on Oct. 23, 2023, in the Korean Intellectual Property Office, the disclosure of which is incorporated by reference herein in its entirety.
Embodiments of the present disclosure described herein relate to image sensors.
An image sensor is a semiconductor-based sensor employed in an optical sensor or an imaging module to convert an optical image into an electrical signal. The image sensor includes a pixel array having a plurality of pixels.
As the optical sensor or the imaging module is made compact, the chief ray angle (CRA) at the edge of the optical sensor is increased. When the chief ray angle at the edge of the optical sensor is increased, the sensitivity of pixels located at the edge of the optical sensor is decreased. This causes degradation in the sensitivity of the optical sensor or the imaging module.
Embodiments of the present disclosure provide image sensors with improved sensitivity depending on conditions of incident light.
Embodiments of the present disclosure provide image sensors with improved auto focusing performance by improving light receiving efficiency depending on conditions of incident light.
According to some example embodiments, an image sensor includes a pixel array including a plurality of unit pixels having photoelectric conversion elements arranged in a matrix form on a semiconductor substrate in a first direction and a second direction crossing the first direction, color filters corresponding to the unit pixels and having at least two different colors, and meta-microlenses on the color filters and configured to condense light incident to the unit pixels. Each of the meta-microlenses includes first and second nanostructures having different refractive indexes. The first nanostructure is implemented with a plurality of nanoposts, and diameters of the nanoposts at an edge of the pixel array are greater than diameters of the nanoposts at a center of the pixel array.
In some example embodiments of the present disclosure, the diameters of the nanoposts may be sequentially increased from the center of the pixel array toward the edge of the pixel array.
In some example embodiments of the present disclosure, each of the meta-microlenses includes a refractive index peak region having a highest effective refractive index of a meta-microlens of the meta-microlenses, and a distance between the refractive index peak region and a center of a unit pixel corresponding to the meta-microlens may have different values at the center of the pixel array and the edge of the pixel array.
In some example embodiments of the present disclosure, at least some of the nanoposts may have different diameters in the corresponding unit pixel, and the nanoposts may have the greatest diameter in the refractive index peak region.
In some example embodiments of the present disclosure, the refractive index peak region may be located sequentially farther away from the center of the corresponding unit pixel with an approach to the edge of the pixel array from the center of the pixel array.
In some example embodiments of the present disclosure, the refractive index peak region may coincide with the center of the corresponding unit pixel at the center of the pixel array.
In some example embodiments of the present disclosure, the nanoposts in the corresponding unit pixel may have different pitches and/or densities at the center of the pixel array and the edge of the pixel array.
In some example embodiments of the present disclosure, the meta-microlenses may be shifted in a direction toward the center with an approach to the edge of the pixel array from the center of the pixel array.
In some example embodiments of the present disclosure, the color filters may be shifted in a direction toward the center with an approach to the edge of the pixel array from the center of the pixel array.
In some example embodiments of the present disclosure, a distance by which the meta-microlenses are shifted may be greater than a distance by which the color filters are shifted.
In some example embodiments of the present disclosure, the color filters may include two or more types of color filters having different colors, and the color filters having the different colors may be shifted by different distances with an approach to the edge of the pixel array from the center of the pixel array.
In some example embodiments of the present disclosure, the color filters may include a red color filter, a green color filter, and a blue color filter, and the red color filter may be shifted more than the green color filter and the blue color filter.
In some example embodiments of the present disclosure, a region where each of the color filters is provided may at least partially overlap a corresponding unit pixel region, and an overlapping area of the region where the color filter is provided and the corresponding unit pixel region may be decreased from the center toward the edge.
In some example embodiments of the present disclosure, a region where each of the meta-microlenses is provided may at least partially overlap a corresponding unit pixel region, and an overlapping area of the region where the meta-microlens is provided and the corresponding unit pixel region may be decreased from the center toward the edge.
In some example embodiments of the present disclosure, the pixel array may include a plurality of regions arranged in a direction from the center of the pixel array to the edge of the pixel array, and the plurality of regions may have a shape of concentric circles.
In some example embodiments of the present disclosure, the meta-microlenses may have a flat upper surface.
In some example embodiments of the present disclosure, the image sensor may further include at least one of an anti-reflective layer, a nano-prism, a color splitter, a color sorter, or a polarizer provided on the meta-microlenses.
In some example embodiments of the present disclosure, each of the unit pixels may include four pixels in a 2×2 array.
According to some example embodiments, a camera module includes an optical lens assembly that condenses light from an object and forms an optical image, an image sensor that converts the optical image formed by the optical lens assembly into an electrical signal, and an image signal processor that processes the electrical signal output from the image sensor into an image signal. The image sensor includes a pixel array including a plurality of unit pixels having photoelectric conversion elements arranged in a matrix form on a semiconductor substrate in a first direction and a second direction crossing the first direction, color filters corresponding to the unit pixels and having at least two different colors, and meta-microlenses on the color filters and configured to condense light incident to the unit pixels. Each of the meta-microlenses includes first and second nanostructures having different refractive indexes. The first nanostructure is implemented with a plurality of nanoposts, and diameters of the nanoposts at an edge of the pixel array are greater than diameters of the nanoposts at a center of the pixel array.
According to some example embodiments, a method for manufacturing the meta-microlenses of the image sensor described above includes forming a low-refractive index material layer on the color filters, forming a plurality of holes by etching the low-refractive index material layer, filling the plurality of holes with a high-refractive index material, and partially removing upper sides of the holes filled with the high-refractive index material through chemical mechanical polishing.
The above and other objects and features of the present disclosure will become apparent by describing in detail embodiments thereof with reference to the accompanying drawings.
Various changes can be made to the present disclosure, and various embodiments of the present disclosure may be implemented. Thus, specific example embodiments are illustrated in the drawings and described as examples herein. However, it should be understood that the present disclosure is not to be construed as being limited thereto and covers all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure.
The present disclosure relates to image sensors. The image sensors are devices that generate a digital signal (or, an electrical signal) based on light reflected from a subject and generate digital image data based on the electrical signal.
Hereinafter, example embodiments of the present disclosure will be described in more detail with reference to the accompanying drawings.
Referring to
The pixel array 1 includes a plurality of pixels arranged in two dimensions and converts an optical signal into an electrical signal. The pixel array 1 may be driven by a plurality of drive signals such as a pixel selection signal, a reset signal, and a charge transmission signal from the row driver 3. In addition, the converted electrical signal is provided to the correlated double sampler 6.
The row driver 3 provides the plurality of drive signals for driving the plurality of pixels to the pixel array 1 depending on results decoded by the row decoder 2. In a case in which the pixels are arranged in a matrix form, the drive signals may be provided for respective rows.
The timing generator 5 provides a timing signal and a control signal to the row decoder 2 and the column decoder 4.
The correlated double sampler 6 receives, holds, and samples the electrical signal generated by the pixel array 1. The correlated double sampler 6 doubly samples a specific noise level and a signal level of the electrical signal and outputs a difference level corresponding to the difference between the noise level and the signal level.
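For illustration only, the following Python sketch shows the sampling principle described above: the reset (noise) level and the signal level are each sampled, and only their difference is output, so that a fixed offset common to both samples cancels. The function name and the voltage values are hypothetical and are not part of the disclosure.

```python
# Minimal sketch of correlated double sampling (CDS): the reset (noise)
# level and the signal level of a pixel output are sampled, and only
# their difference is passed on, so a fixed offset common to both
# samples cancels out. The numeric values below are hypothetical.

def correlated_double_sample(reset_level: float, signal_level: float) -> float:
    """Return the difference level between the sampled signal and reset levels."""
    return signal_level - reset_level

# Example: a pixel output carrying a 0.12 V fixed offset on both samples.
reset = 0.12    # sampled just after the pixel is reset
signal = 0.87   # sampled after charge transfer from the photodiode
print(correlated_double_sample(reset, signal))  # 0.75 -> the offset cancels
```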
The analog-to-digital converter 7 converts an analog signal corresponding to the difference level output from the correlated double sampler 6 into a digital signal and outputs the digital signal.
The input/output buffer 8 latches the digital signal. The input/output buffer 8 outputs the latched signal to an image signal processor (not illustrated) as a digital signal depending on a result decoded by the column decoder 4.
The pixel array 1 may include the plurality of pixels that sense light in different wavelength bands. The arrangement of the plurality of pixels may be implemented in various ways.
Referring to
In some example embodiments of the present disclosure, the red unit pixels PX_R, the green unit pixels PX_Gr and PX_Gb, and the blue unit pixels PX_B may be arranged in a certain arrangement order to constitute a Bayer pattern commonly employed in image sensors. The Bayer pattern includes a plurality of unit patterns UT. One unit pattern UT includes four quadrant regions. The unit pattern UT is repeatedly arranged in two dimensions in a first direction D1 and a second direction D2. In other words, within a unit pattern in the form of a 2×2 array, one red unit pixel PX_R and one blue unit pixel PX_B are arranged in one diagonal direction, and two green unit pixels PX_Gr and PX_Gb are arranged in the other diagonal direction. When the overall pixel arrangement is viewed, a first row in which a plurality of red unit pixels PX_R and a plurality of green unit pixels PX_Gr are alternately arranged in the first direction D1 and a second row in which a plurality of green unit pixels PX_Gb and a plurality of blue unit pixels PX_B are alternately arranged in the first direction D1 are repeatedly arranged in the second direction D2. For reference, in the drawing, a direction perpendicular to the first direction D1 and the second direction D2 is a third direction D3.
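For illustration only, the following Python sketch reproduces the repetition of the 2×2 unit pattern UT described above, with the red and blue unit pixels on one diagonal and the two green unit pixels on the other. The printed array size is an arbitrary example.

```python
# Sketch of the Bayer unit pattern UT repeated in the D1 (column) and
# D2 (row) directions: R and Gr alternate in the first row, Gb and B
# alternate in the second row, so R and B sit on one diagonal and the
# two green unit pixels sit on the other.

UNIT_PATTERN = [["R",  "Gr"],   # first row of the 2x2 unit pattern
                ["Gb", "B" ]]   # second row of the 2x2 unit pattern

def bayer_color(row: int, col: int) -> str:
    """Color of the unit pixel at (row, col) in the repeated Bayer pattern."""
    return UNIT_PATTERN[row % 2][col % 2]

# Print a small 4x4 corner of the pixel array as an example.
for r in range(4):
    print(" ".join(f"{bayer_color(r, c):>2}" for c in range(4)))
```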
The image sensor may be applied to various optical devices, for example, a camera module.
Referring to
The optical lens assembly OPL serves to focus an image of a subject outside the camera module CMD on the image sensor 100, more accurately, the pixel array of the image sensor 100. Although briefly illustrated as one lens in the drawing, the optical lens assembly OPL may actually include a plurality of lenses.
Accordingly, the light starting from the different points A, B, C, and D is incident to the pixel array at different angles depending on the distances between the points A, B, C, and D and the optical axis AX. The incidence angle of light incident to the pixel array is typically defined as the chief ray angle (CRA). A chief ray (CR) refers to a ray incident from a point on the subject to the pixel array through the center of the optical lens assembly OPL, and the chief ray angle refers to the angle formed by the chief ray with the optical axis AX. The light starting from the point A on the optical axis AX has a chief ray angle of 0 degrees and is perpendicularly incident to the pixel array. As the distance between a starting point and the optical axis AX is increased, the chief ray angle is increased.
From the perspective of the image sensor 100, the chief ray angle of light incident to the center of the pixel array is 0 degrees, and the chief ray angle of incident light is increased toward the edge of the pixel array. For example, the light that starts from the points B and C and is incident to the outermost edge of the pixel array has the largest chief ray angle, and the light that starts from the point A and is incident to the center of the pixel array has a chief ray angle of 0 degrees. In addition, the light that starts from the point D and is incident between the center of the pixel array and the edge of the pixel array has a chief ray angle greater than 0 degrees and smaller than the chief ray angles of the light starting from the points B and C.
Accordingly, the chief ray angles of incident light incident to the plurality of pixels vary depending on the positions of the plurality of pixels in the pixel array.
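For illustration only, the following Python sketch estimates how the chief ray angle grows with the distance between a pixel and the center of the pixel array, assuming a simple model in which the chief ray passes through the center of the optical lens assembly OPL. The effective focal length and the image heights are hypothetical values, not parameters of the disclosed module.

```python
import math

# Rough sketch: if the chief ray passes through the lens center, its angle
# to the optical axis grows with the image height (the distance between
# the pixel and the center of the pixel array).
# The 4 mm effective focal length and the image heights are hypothetical.

EFFECTIVE_FOCAL_LENGTH_MM = 4.0

def chief_ray_angle_deg(image_height_mm: float) -> float:
    """Approximate CRA for a pixel at the given distance from the array center."""
    return math.degrees(math.atan2(image_height_mm, EFFECTIVE_FOCAL_LENGTH_MM))

for h in (0.0, 1.0, 2.0, 3.0):   # from the center toward the edge of the array
    print(f"image height {h:.1f} mm -> CRA {chief_ray_angle_deg(h):5.1f} deg")
```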
Referring to
In some example embodiments of the present disclosure, to prevent, reduce, or minimize deterioration in the sensitivities of pixels located at the edge of the pixel array PA, a meta-microlens array is disposed in the pixel array PA.
Referring to
Each of the unit pixels representing the respective colors may include a plurality of pixels, for example, four pixels PX. According to these example embodiments, the four pixels PX may be arranged in a matrix form in the first direction D1 and the second direction D2 crossing each other.
In some example embodiments of the present disclosure, pixels PX constituting one unit pixel correspond to one color and share one meta-microlens (MML).
In some example embodiments of the present disclosure, for convenience of description, it has been described that one unit pixel includes four pixels. However, the present disclosure is not limited thereto, and in some example embodiments of the present disclosure, one unit pixel may include, for example, nine or sixteen pixels.
When the image sensor according to some example embodiments of the present disclosure is viewed in the sectional view, the image sensor may include a photoelectric conversion layer 10, a circuit wiring layer 20, and a light transmitting layer 30. The photoelectric conversion layer 10 may be disposed between the circuit wiring layer 20 and the light transmitting layer 30.
The photoelectric conversion layer 10 includes a semiconductor substrate 110 having a first surface 110a and a second surface 110b facing away from each other, device isolation films 130 penetrating the semiconductor substrate 110, and the photoelectric conversion regions PD provided in the semiconductor substrate 110. (Here, the photoelectric conversion regions are regions corresponding to photoelectric conversion elements and are assigned with the same reference numeral PD as that of the photoelectric conversion elements.)
The semiconductor substrate 110 is a region where the pixel array having the plurality of pixels PX arranged therein is provided. The semiconductor substrate 110 may be a silicon substrate, a germanium substrate, a silicon-germanium substrate, a Group II-VI compound semiconductor substrate, a Group III-V compound semiconductor substrate, or a silicon on insulator (SOI) substrate. The semiconductor substrate 110 may include a first conductive type impurity. Accordingly, the semiconductor substrate 110 may have a first conductive type. The first conductive type impurity may be a Group III element. For example, the first conductive type impurity may include a P-type impurity such as aluminum (Al), boron (B), indium (In), and/or gallium (Ga).
The plurality of pixels PX may be arranged in a matrix form in the semiconductor substrate 110 in the first direction D1 and the second direction D2 parallel to the first surface 110a of the semiconductor substrate 110. The first direction D1 and the second direction D2 may cross each other. For example, the first direction D1 and the second direction D2 may be perpendicular to each other.
The device isolation films 130 penetrate the semiconductor substrate 110 and are disposed between the pixels PX. The pixels PX may be defined by the device isolation films 130. The device isolation films 130 may penetrate the semiconductor substrate 110 in the third direction D3 perpendicular to the first surface 110a. That is, the device isolation films 130 may extend from the first surface 110a toward the second surface 110b. The first surface 110a may expose the lower surfaces of the device isolation films 130 and may be substantially coplanar or coplanar with the lower surfaces of the device isolation films 130. The second surface 110b may expose the upper surfaces of the device isolation films 130 and may be substantially coplanar or coplanar with the upper surfaces of the device isolation films 130.
The photoelectric conversion regions PD may be disposed in the pixels PX, respectively. The photoelectric conversion elements PD may be disposed in the semiconductor substrate 110. The photoelectric conversion elements PD may absorb incident light and may generate and accumulate charges corresponding to the amount of light. The photoelectric conversion elements PD may include at least one of a photo diode, a photo transistor, a photo gate, a pinned photo diode (PPD), or a combination thereof. The photoelectric conversion regions PD may be disposed behind the meta-microlenses MML and color filters CF and may correspond to, for example, photo diodes. When light reaches the photoelectric conversion regions PD, the photoelectric conversion regions PD may output electrical signals corresponding to the incident light through a photoelectric effect. The electrical signals may correspond to charges (or, currents) generated depending on the intensity (or, amount) of the received light.
The photoelectric conversion regions PD may be interposed between the first surface 110a and the second surface 110b of the semiconductor substrate 110. The photoelectric conversion regions PD may be doped regions including a second conductive type impurity opposite to the first conductive type impurity. In some example embodiments of the present disclosure, the photoelectric conversion regions PD may include a Group V element as an impurity, and the Group V element may be the second conductive type impurity. The second conductive type impurity may include an n-type impurity such as phosphorus (P), arsenic (As), bismuth (Bi), and/or antimony (Sb). The photoelectric conversion regions PD may form a PN junction with the semiconductor substrate 110 to form photo diodes.
Shallow separation films 140 may be provided in the respective pixels PX. The shallow separation films 140 may be buried in the semiconductor substrate 110 and may be disposed adjacent to the first surface 110a. Pad films 141 may be provided between the shallow separation films 140 and the semiconductor substrate 110. The upper surfaces of the shallow separation films 140 may be exposed by the first surface 110a. The shallow separation films 140 may be formed of various insulating materials. For example, the shallow separation films 140 may include at least one of a silicon oxide film, a silicon nitride film, or a silicon oxy nitride film.
The circuit wiring layer 20 may be provided on the first surface 110a of the semiconductor substrate 110. The circuit wiring layer 20 may include a circuit part, contact VIAs 230, and conductive lines 220. The circuit part may include gate patterns GP and gate insulating layers GI, and the contact VIAs 230 and the conductive lines 220 may be connected with the circuit part. The contact VIAs 230 and the conductive lines 220 may be provided in an interlayer insulating layer 210 stacked on the first surface 110a. The interlayer insulating layer 210 may cover the first surface 110a, the upper surfaces of the device isolation films 130, and the upper surfaces of the shallow separation films 140. The interlayer insulating layer 210 may cover transistors constituting the circuit part. The conductive lines 220 may be electrically connected to the transistors through the contact VIAs 230. The interlayer insulating layer 210 may include an insulating material, and the contact VIAs 230 and the conductive lines 220 may include a conductive material.
The gate patterns GP may be disposed on the first surface 110a of the semiconductor substrate 110. The gate patterns GP may function as gate electrodes of transfer transistors, source follower transistors, reset transistors, or selection transistors for driving the image sensor. For example, the gate patterns GP may include transfer gates, source follower gates, reset gates, or selection gates. In the drawing, for convenience of description, one gate pattern GP is illustrated as being disposed in each pixel PX. However, without being limited thereto, a plurality of gate patterns GP may be disposed in each pixel PX. As illustrated, the gate patterns GP may have a buried gate structure. However, without being limited thereto, the gate patterns GP may have a planar gate structure unlike that illustrated in the drawing. The gate patterns GP may include a metallic material, a metal silicide material, poly silicon, or a combination thereof.
The gate insulating layers GI may be interposed between the gate patterns GP and the semiconductor substrate 110. The gate insulating layers GI may include, for example, a silicon-based insulating material (e.g., silicon oxide, silicon nitride, and/or silicon oxy nitride) and/or a high-k material (e.g., hafnium oxide and/or aluminum oxide).
The light transmitting layer 30 may be disposed on the second surface 110b of the semiconductor substrate 110. The light transmitting layer 30 is a layer through which light travelling from the outside toward the photoelectric conversion regions PD transmits, and the second surface 110b of the semiconductor substrate 110 is a light incident surface to which light is incident. The light transmitting layer 30 may condense and filter light incident from the outside and may provide the light to the photoelectric conversion layer 10. The light transmitting layer 30 may include the color filters CF, fence patterns 320, and the meta-microlenses MML that are disposed on the second surface 110b.
The color filters CF may be disposed between the second surface 110b and the meta-microlenses MML. The color filters CF may include at least two types of color filters having different wavelength bands. For example, the color filters CF may be allocated to the red unit pixel PX_R, the green unit pixels PX_Gr and PX_Gb, and the blue unit pixel PX_B to implement red, green, and blue, respectively. Red, green, and blue (RGB); red, green, blue, and white (RGBW); cyan, magenta, and yellow (CMY); cyan, magenta, yellow, and black (CMYK); red, yellow, and blue (RYB); and red, green, blue, and infrared (RGBIR) may serve as examples of a plurality of reference colors. Although it has been described that the color filters CF implement red (R), green (G), and blue (B), the present disclosure is not limited thereto. The color filters CF corresponding to the respective pixels PX constitute a color filter array.
Hereinafter, for convenience of description, the description will be focused on the Bayer pattern, that is, the RGB pattern (or the RGGB pattern). However, it should be noted that this does not limit the repetitive arrangement structures and patterns of other color filter arrays. That is, example embodiments of the present disclosure are not limited thereto, and the color filter array may be formed in various patterns including RGB, CYYM, CYGM, RGBW, RYYB, and X-trans.
The fence patterns 320 may be disposed on the device isolation films 130. The fence patterns 320 may vertically overlap the device isolation films 130. The fence patterns 320 may have a planar shape corresponding to the device isolation films 130. For example, the fence patterns 320 may have a grid shape when viewed from above the plane. The fence patterns 320 may surround the color filters CF when viewed from above the plane. The fence patterns 320 may be disposed such that at least portions thereof vertically overlap the device isolation films 130 (e.g., in the third direction D3).
Each of the fence patterns 320 may be interposed between two color filters CF adjacent to each other. The plurality of color filters CF may be physically or optically separated from one another by the fence patterns 320. Accordingly, the fence patterns 320 may guide light incident to the second surface 110b such that the light is input into the photoelectric conversion regions PD.
The fence patterns 320 may include metal. However, without being limited thereto, the fence patterns 320 may include a low-refractive index material. The low-refractive index material may include a polymer and silica nano particles in the polymer. The low-refractive index material may have insulating characteristics. In another example, the fence patterns 320 may include metal and/or metal nitride. For example, the fence patterns 320 may include titanium and/or titanium nitride.
The meta-microlenses MML may be planar microlenses using nanostructures and may be disposed on the plurality of color filters CF. Each of the meta-microlenses MML includes a nanostructure that has a sub-wavelength period and locally adjusts the refractive index of incident light. The nanostructure is an optical element that locally adjusts the refractive index of incident light to control the polarization, phase, and size of the light. When light transmits through the nanostructure having a sub-wavelength period, the effective refractive index is determined depending on the ratio between the structure and the surrounding material. In some example embodiments of the present disclosure, the phase delay of transmitted light is adjusted by locally adjusting the ratio of materials having different refractive indexes in the portion through which the light transmits.
The meta-microlenses MML may be disposed such that at least portions thereof vertically overlap the photoelectric conversion regions PD (e.g., in the third direction D3). The meta-microlenses MML may be provided at positions corresponding to the photoelectric conversion regions PD of the semiconductor substrate 110.
Although it has been described that the color filters CF, the fence patterns 320, and/or the meta-microlenses MML are provided at the positions corresponding to the respective pixels PX in an overlapping manner, the present disclosure is not limited thereto. At least one of the color filters CF, the fence patterns 320, or the meta-microlenses MML may have an offset structure shifted to a certain degree from the positions corresponding to the respective pixels PX. The offset structure may be intentionally selected to optimize an optical path in consideration of the angle of light traveling from the outside toward the pixels PX. The meta-microlenses MML and the offset structure will be described below.
A passivation layer 340 may be provided on the meta-microlenses MML. The passivation layer 340 may be formed of a single layer or multiple layers. The passivation layer 340 may not only function as a protective layer for protecting the meta-microlenses MML, but may also function as an anti-reflective layer that prevents or reduces light from the outside from being reflected by the upper surfaces of the meta-microlenses MML. The passivation layer 340 may be formed of various materials. For example, the passivation layer 340 may include hafnium oxide (HfOx), silicon oxide (SiOx), zirconium oxide (ZrO2), titanium oxide (TiO2), or aluminum oxide (Al2O3, alumina). In this case, the passivation layer 340 may prevent or reduce reflection of light such that light travelling toward the second surface 110b through the meta-microlenses MML effectively reaches the photoelectric conversion regions PD.
An upper insulating layer 310 may be interposed between the second surface 110b of the semiconductor substrate 110 and the color filters CF and between the device isolation films 130 and the fence patterns 320. The upper insulating layer 310 may cover the second surface 110b of the semiconductor substrate 110 and the upper surfaces of the device isolation films 130. The upper insulating layer 310 may include a plurality of layers. For example, the upper insulating layer 310 may include an anti-reflective layer. The anti-reflective layer may be formed of various materials. For example, the anti-reflective layer may include hafnium oxide (HfOx), silicon oxide (SiOx), zirconium oxide (ZrO2), titanium oxide (TiO2), or aluminum oxide (Al2O3, alumina). In this case, the upper insulating layer 310 may prevent or reduce reflection of light such that light incident to the second surface 110b of the semiconductor substrate 110 effectively reaches the photoelectric conversion regions PD.
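For illustration only, the following Python sketch applies the common quarter-wave design rule for a single-layer anti-reflective coating of the kind the passivation layer 340 and the upper insulating layer 310 may provide. The refractive indexes and the design wavelength are hypothetical and are not dimensions taken from the disclosure.

```python
import math

# Quarter-wave single-layer anti-reflection sketch: the layer's optical
# thickness is one quarter of the design wavelength, and the ideal layer
# index is the geometric mean of the indexes of the two surrounding media.
# All numeric values below are hypothetical illustrations.

DESIGN_WAVELENGTH_NM = 550.0   # green light, a common design point
N_AMBIENT = 1.0                # air above the sensor stack
N_SUBSTRATE = 2.0              # assumed index of the layer underneath

ideal_layer_index = math.sqrt(N_AMBIENT * N_SUBSTRATE)            # ~1.41
quarter_wave_thickness_nm = DESIGN_WAVELENGTH_NM / (4 * ideal_layer_index)

print(f"ideal AR index ~ {ideal_layer_index:.2f}")
print(f"AR thickness   ~ {quarter_wave_thickness_nm:.0f} nm")
```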
An etch-preventing insulation layer 330 may be provided between the color filters CF and the meta-microlenses MML. The etch-preventing insulation layer 330 may be an insulation layer for preventing or reducing over-etching when the meta-microlenses MML are formed using an etching process. Although not illustrated, a capping insulating layer for covering the upper surfaces of the color filters CF and/or a spacer layer for securing a gap between the color filters CF and the meta-microlenses MML may be additionally provided between the color filters CF and the etch-preventing insulation layer 330.
The meta-microlenses MML serve to refract and/or condense light.
Referring to
The meta-microlens MML may include a plurality of nanoposts NP as the high-refractive index nanostructures RF1. In some example embodiments of the present disclosure, when viewed based on one unit pixel region, the plurality of high-refractive index nanostructures RF1 may be disposed in the form of the nanoposts NP on the plane, and the low-refractive index nanostructure RF2 may fill the remaining regions other than the nanoposts NP.
The nanoposts NP according to some example embodiments of the present disclosure may have a cylindrical shape.
The effective refractive index of the meta-microlens MML may be highest in one region of the meta-microlens MML and may be gradually decreased farther away from the one region such that the meta-microlens MML serves as a convex lens that allows light to converge. In other words, the ratio of the high-refractive index nanostructures RF1 to the low-refractive index nanostructure RF2 may be highest in one region of the meta-microlens MML and may gradually decrease farther away from the one region.
Hereinafter, the region where the effective refractive index is highest within the meta-microlens MML is referred to as the “refractive index peak region”. The diameters and pitches of the nanoposts NP may be differently selected in the refractive index peak region and the region around the refractive index peak region such that the effective refractive index varies depending on the region. That is, the diameters, number, and areas of the nanoposts NP may vary depending on the form of light collection to be achieved. For example, the nanoposts NP may be provided in various ways such that the nanoposts NP have the same diameter or different diameters within a unit pixel. In this case, the nanoposts NP may be disposed such that the nanoposts NP in the refractive index peak region have the greatest diameter.
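For illustration only, the following Python sketch estimates the local effective refractive index from the area fraction of high-index nanoposts NP within the low-index surrounding material, using a simple fill-factor-weighted permittivity mixing. The indexes, pitch, and diameters are hypothetical values; the sketch only illustrates that a larger nanopost diameter yields a higher local effective index.

```python
import math

# Zeroth-order effective-medium sketch: the local effective index of the
# meta-microlens is estimated from the area fraction of high-index
# nanoposts (RF1) in the low-index surrounding material (RF2).
# Indexes, pitch, and diameters below are hypothetical values.

N_HIGH = 2.0      # assumed index of the nanopost material (RF1)
N_LOW = 1.45      # assumed index of the surrounding material (RF2)
PITCH_NM = 200.0  # assumed center-to-center nanopost spacing

def effective_index(post_diameter_nm: float) -> float:
    """Fill-factor-weighted permittivity mixing over one pitch cell."""
    fill = math.pi * (post_diameter_nm / 2) ** 2 / PITCH_NM ** 2
    fill = min(fill, 1.0)
    return math.sqrt(fill * N_HIGH**2 + (1 - fill) * N_LOW**2)

for d in (60, 100, 140):   # small post far from the peak region, large post at the peak
    print(f"diameter {d:3d} nm -> n_eff ~ {effective_index(d):.2f}")
```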
As described above, the chief ray angle of incident light varies depending on the position on the pixel array. Accordingly, the position of the refractive index peak region within the meta-microlens MML may vary depending on the position of the meta-microlens MML within the pixel array. When the meta-microlens MML is disposed at the center of the pixel array to which light is nearly perpendicularly incident, the meta-microlens MML does not have to change the angle at which the light travels. However, when the meta-microlens MML is disposed at the edge of the pixel array to which light is obliquely incident, the angle at which the light travels needs to be changed by the meta-microlens MML.
To this end, according to some example embodiments of the present disclosure, the diameters and arrangement of the nanoposts NP may vary depending on the incidence angles of chief rays. When three light beams, the incidence angles of which are sequentially increased, are referred to as first light L1, second light L2, and third light L3, respectively, and color pixel regions disposed in three regions where the incidence angles of incident chief rays are different from one another are referred to as first to third color pixel regions CPA1, CPA2, and CPA3, respectively, the diameters and arrangement of the nanoposts NP in the first to third color pixel regions CPA1, CPA2, and CPA3 may be differently set depending on the incidence angles of the first light L1, the second light L2, and the third light L3.
In some example embodiments of the present disclosure, the nanoposts NP provided in each of the color pixel regions CPA1, CPA2, and CPA3 may have the greatest diameter in the refractive index peak region, and the diameters of the nanoposts NP may be decreased farther away from the refractive index peak region. In addition, the refractive index peak region may be shifted from the center of each color pixel region depending on the angle of incident light. For example, in the first color pixel region CPA1 to which the first light L1 is incident at an angle of 0 degrees, the refractive index peak region may coincide with the center of the first color pixel region CPA1. However, in the second color pixel region CPA2 and the third color pixel region CPA3 to which the second light L2 and the third light L3, which are inclined when compared to the first light L1, are incident, the refractive index peak regions may be shifted and spaced apart from the centers of the color pixel regions. In the second color pixel region CPA2 to which the second light L2 is incident, the refractive index peak region may be shifted from the center of the second color pixel region CPA2 by a first distance S1, and in the third color pixel region CPA3 to which the third light L3 is incident, the refractive index peak region may be shifted from the center of the third color pixel region CPA3 by a second distance S2. Here, the refractive index peak region in the third color pixel region CPA3 may be shifted more than the refractive index peak region in the second color pixel region CPA2, and therefore the second distance S2 may be greater than the first distance S1.
In some example embodiments of the present disclosure, the diameters of the nanoposts NP may also be changed depending on the incidence angles of the chief rays. When the diameters of the nanoposts NP in the refractive index peak regions of the first to third color pixel regions CPA1, CPA2, and CPA3 are referred to as first to third diameters R1, R2, and R3, the first to third diameters R1, R2, and R3 may be sequentially increased in the order of the first diameter R1, the second diameter R2, and the third diameter R3.
When light transmits through a certain material, phase modulation occurs due to the refractive index of the material. For example, when light having a certain wavelength transmits through a structure having a period smaller than or equal to the wavelength, the phase of the transmitted light varies depending on the ratio of the constituent materials. Since the meta-microlens MML is formed of two materials having different refractive indexes and the diameters of the nanoposts NP having a high refractive index are varied, the phase of light transmitting through the meta-microlens MML is changed.
In general, a method of refracting light using a spherical microlens having a physically protruding shape (hereinafter, referred to as the spherical microlens) is used. In the case of the spherical microlens, an optical path is controlled by changing the height for each position. However, in some example embodiments of the present disclosure, an optical path is controlled by adjusting the diameters of the nanoposts NP depending on the position where light is incident and the resulting incidence angle. That is, a structure that delays a phase is intentionally disposed at each position, and the degree of phase delay is controlled by the diameters of the nanoposts NP. In this case, when the diameters of the nanoposts NP are small, the phase delay is small, and as the diameters of the nanoposts NP are increased, the phase delay is increased. As a result, light may be deflected in one direction. By arranging the nanoposts NP using this principle, light obliquely incident to the upper surface of the meta-microlens MML may be refracted and vertically emitted from the lower surface of the meta-microlens MML. In some example embodiments of the present disclosure, the diameters of the nanoposts NP may be sequentially increased such that the deflection of traveling light increases as the incidence angle increases.
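For illustration only, the following Python sketch applies the standard relation between a lateral phase gradient and the deflection of transmitted light, sin(θ) ≈ (λ/2π)·(dφ/dx), to two neighboring cells whose nanoposts produce different phase delays φ = (2π/λ)·n_eff·H. The post height, effective indexes, and cell spacing are hypothetical values.

```python
import math

# Sketch of the deflection principle: a nanopost of height H and local
# effective index n_eff delays the phase by phi = 2*pi/lambda * n_eff * H,
# so posts whose diameters (and hence n_eff) grow along x create a phase
# gradient that bends the transmitted light by sin(theta) ~ (lambda/2pi)*dphi/dx.
# All dimensions and indexes below are hypothetical illustrations.

WAVELENGTH_NM = 550.0
POST_HEIGHT_NM = 600.0

def phase_delay_rad(n_eff: float) -> float:
    """Phase delay accumulated over the nanopost height at the given local index."""
    return 2 * math.pi / WAVELENGTH_NM * n_eff * POST_HEIGHT_NM

# Two neighboring cells, 200 nm apart, with different local effective indexes
# (e.g., a smaller and a larger nanopost).
n_eff_small, n_eff_large = 1.50, 1.68
dphi = phase_delay_rad(n_eff_large) - phase_delay_rad(n_eff_small)
dx_nm = 200.0

deflection = math.degrees(math.asin(WAVELENGTH_NM / (2 * math.pi) * dphi / dx_nm))
print(f"phase step ~ {dphi:.2f} rad over {dx_nm:.0f} nm -> deflection ~ {deflection:.1f} deg")
```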
In some example embodiments of the present disclosure, the average diameters of the nanoposts NP provided in the first to third color pixel regions CPA1, CPA2, and CPA3 may be gradually increased as the incidence angle is increased. That is, the average diameter of the nanoposts NP provided in the second color pixel region CPA2 may be greater than the average diameter of the nanoposts NP provided in the first color pixel region CPA1. Furthermore, the average diameter of the nanoposts NP provided in the third color pixel region CPA3 may be greater than the average diameter of the nanoposts NP provided in the second color pixel region CPA2.
In addition, the densities of the nanoposts NP in the refractive index peak regions of the first to third color pixel regions CPA1, CPA2, and CPA3 may be increased in the order of the first color pixel region CPA1, the second color pixel region CPA2, and the third color pixel region CPA3.
The above-described meta-microlens MML may be manufactured by forming a low-refractive index material layer on the color filter CF, etching the low-refractive index material layer to form a plurality of holes, and filling the holes with a high-refractive index material.
The plurality of holes are formed to correspond to regions where the nanoposts NP are to be formed. Next, a high-refractive index material layer is formed on the semiconductor substrate 110 through deposition. The high-refractive index material layer fills the plurality of holes. Then, the meta-microlens MML is formed by performing a chemical mechanical polishing (CMP) process on the upper surface of the semiconductor substrate, on which the high-refractive index material layer is formed, until the upper surface of the low-refractive index material layer is exposed. This process may be performed several times to increase the heights of the nanoposts NP. The passivation layer 340 may be provided on the manufactured meta-microlens MML. The passivation layer 340 may be formed of a single layer or multiple layers. The passivation layer 340 may not only function as a protective layer for protecting the meta-microlenses MML, but may also function as an anti-reflective layer that prevents or reduces light from the outside from being reflected by the upper surfaces of the meta-microlenses MML.
In some example embodiments of the present disclosure, unlike a commonly used spherical microlens, the meta-microlens MML has a flat upper surface shape. Accordingly, various additional functional layers may be stacked on the meta-microlens MML. Although not illustrated, for example, an infrared filter layer may be stacked on the meta-microlens MML, and in this case, flare may be prevented or reduced. Alternatively, various nanostructure layers, such as a nano-prism, a color splitter, a color sorter, and/or a polarizer, may be additionally formed on the meta-microlens MML.
Referring to
Referring to
To solve this problem, as illustrated in
Referring to
The image sensor according to some example embodiments of the present disclosure employs the above-described structure, and in addition, the shapes of the meta-microlenses may be sequentially changed from the center of the pixel array toward the edge of the pixel array. That is, in some example embodiments of the present disclosure, the diameters or densities of the nanoposts may be increased from the center of the pixel array toward the edge of the pixel array.
Referring to
In some example embodiments of the present disclosure, in meta-microlenses MML, the diameters of the nanoposts NP in refractive index peak regions may be increased from the center of the pixel array PA toward the edge of the pixel array PA, that is, from the first region A1 toward the fourth region A4. In particular, in separate color pixel regions, the diameters of the nanoposts NP corresponding to the refractive index peak regions may be increased from the first region A1 toward the fourth region A4. Alternatively, the nanoposts NP in the separate color pixel regions may have an increasing average diameter in the direction from the first region A1 toward the fourth region A4 even though not corresponding to the refractive index peak regions. Furthermore, not only the diameter but also the density of the nanoposts NP may have a larger value from the first region A1 toward the fourth region A4. In addition, the nanoposts NP may have a decreasing pitch from the first region A1 toward the fourth region A4.
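For illustration only, the following Python sketch records the radial design trend described above as a small lookup table: the peak nanopost diameter grows and the pitch shrinks (so the density grows) from the central region A1 toward the edge region A4. The diameters and pitches are hypothetical example values.

```python
# Sketch of the radial design trend: the nanopost diameter in the
# refractive index peak region grows, and the pitch shrinks (density
# grows), from the central region A1 toward the edge region A4.
# The diameters and pitches below are hypothetical example values.

REGION_DESIGNS = {
    "A1": {"peak_diameter_nm": 100, "pitch_nm": 220},  # center of the pixel array
    "A2": {"peak_diameter_nm": 110, "pitch_nm": 210},
    "A3": {"peak_diameter_nm": 125, "pitch_nm": 200},
    "A4": {"peak_diameter_nm": 140, "pitch_nm": 190},  # edge of the pixel array
}

for region, design in REGION_DESIGNS.items():
    density = 1e6 / design["pitch_nm"] ** 2   # posts per square micrometer
    print(f"{region}: peak post {design['peak_diameter_nm']} nm, "
          f"pitch {design['pitch_nm']} nm, ~{density:.0f} posts/um^2")
```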
As compared with when a spherical microlens is used for light collection, the image sensor having the above-described structure has improved sensitivity, especially, improved sensitivity to oblique light.
In
Referring to
Referring to
Due to the chromatic aberration of each color, the intersection points in the image sensor employing the spherical microlens appear differently depending on the angle of incidence. This means that light is sensed to different degrees depending on colors for the same angle of incidence. In contrast, the image sensor employing the meta-microlens according to some example embodiments of the present disclosure senses light to almost the same degree regardless of colors for the same angle of incidence. Accordingly, it can be seen that in the case of the image sensor of the present disclosure, the chromatic aberration depending on colors is significantly reduced and there is almost no channel difference.
Referring to
Although the image sensor according to some example embodiments of the present disclosure has very small chromatic aberration and channel difference when compared to the image sensor employing the spherical microlens, the image sensor according to some example embodiments of the present disclosure may implement additional chromatic aberration correction and reduction of channel difference by shifting the color filters.
Referring to
In some example embodiments of the present disclosure, the color filter CF may be shifted by a certain distance S_CF from a corresponding unit pixel in a direction in which light is incident, that is, toward the center of the pixel array. The meta-microlens MML may also be shifted by a certain distance S_LS from the corresponding unit pixel in the direction in which the light is incident, that is, toward the center of the pixel array.
Due to the shift of the color filter CF and the meta-microlens MML, the region of the corresponding unit pixel, the region where the color filter CF is provided, and the region where the meta-microlens MML is provided may not completely overlap one another. That is, the color filter CF does not overlap the corresponding unit pixel region by the distance by which the color filter CF is shifted toward the center of the pixel array. Accordingly, the overlapping area of the region where each color filter CF is provided and the region where the corresponding unit pixel is provided may decrease from the center toward the edge.
Likewise, the meta-microlens MML does not overlap the corresponding unit pixel region by the distance by which the meta-microlens MML is shifted toward the center of the pixel array. Accordingly, the overlapping area of the region where each meta-microlens MML is provided and the region where the corresponding unit pixel is provided may decrease from the center toward the edge.
The shift distance S_LS of the meta-microlens MML may be greater than the shift distance S_CF of the color filter CF.
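For illustration only, the following Python sketch shows one geometric way such shift distances could be chosen: an element located at a height h above the photoelectric conversion region is shifted toward the array center by roughly h·tan(CRA), so the meta-microlens MML, which sits higher in the stack than the color filter CF, receives the larger shift. The stack heights and the chief ray angle are hypothetical values, not a design rule stated in the disclosure.

```python
import math

# Geometric sketch of the shift: an element at height h above the
# photodiode plane is shifted toward the array center by roughly
# h * tan(CRA) so that an oblique chief ray still lands on its own pixel.
# Because the meta-microlens sits higher in the stack than the color
# filter, its shift S_LS comes out larger than the filter shift S_CF.
# The heights and chief ray angle below are hypothetical values.

CHIEF_RAY_ANGLE_DEG = 30.0
HEIGHT_MICROLENS_UM = 1.5     # assumed height of the meta-microlens above the photodiode
HEIGHT_COLOR_FILTER_UM = 0.7  # assumed height of the color filter above the photodiode

def shift_um(height_um: float, cra_deg: float) -> float:
    """Lateral shift toward the array center for an element at the given height."""
    return height_um * math.tan(math.radians(cra_deg))

s_ls = shift_um(HEIGHT_MICROLENS_UM, CHIEF_RAY_ANGLE_DEG)
s_cf = shift_um(HEIGHT_COLOR_FILTER_UM, CHIEF_RAY_ANGLE_DEG)
print(f"S_LS ~ {s_ls:.2f} um, S_CF ~ {s_cf:.2f} um (S_LS > S_CF)")
```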
Referring to
Due to the shift of the color filter CF and the meta-microlens MML, the non-overlapping region between the color filter CF, the meta-microlens MML, and the corresponding unit pixel may also be increased from the first region A1 toward the fourth region A4.
According to some example embodiments of the present disclosure, chromatic aberration may be more precisely corrected by differently setting the degree of shift for each color filter.
Referring to
Accordingly, in the case of the meta-microlens MML having the same structure, chromatic aberration may occur although the chromatic aberration is less than that of a spherical microlens, and an optical path needs to be modified accordingly.
To correct the chromatic aberration, the diameters, densities, and pitches of the nanoposts NP in the meta-microlens MML may be changed, and the color filters CF and the meta-microlenses MML may additionally be shifted by distances that differ for the respective colors.
For example, the color filter CF may be shifted by a certain distance from the corresponding unit pixel PX_R, PX_Gr, PX_Gb, and PX_B in a direction in which light is incident, that is, toward the center of the pixel array. However, the shift distance S_CF_R of the red color filter CF_R, the shift distances S_CF_Gr and S_CF_Gb of the green color filters CF_Gr and CF_Gb, and the shift distance S_CF_B of the blue color filter CF_B may be different from one another. According to some example embodiments of the present disclosure, the shift distance S_CF_R of the red color filter CF_R may be greater than the shift distances S_CF_Gr and S_CF_Gb of the green color filters CF_Gr and CF_Gb or the shift distance S_CF_B of the blue color filter CF_B, and the shift distances S_CF_Gr and S_CF_Gb of the green color filters CF_Gr and CF_Gb may be greater than or equal to the shift distance S_CF_B of the blue color filter CF_B. Here, the shift distances S_CF_Gr and S_CF_Gb of the green color filters CF_Gr and CF_Gb may be equal to or different from each other.
The meta-microlens MML may also be shifted by a certain distance from the corresponding unit pixel in the direction in which the light is incident, that is, toward the center of the pixel array.
For example, the meta-microlens MML may be shifted by a certain distance from the corresponding unit pixel PX_R, PX_Gr, PX_Gb, and PX_B in the direction in which the light is incident, that is, toward the center of the pixel array. However, the shift distance S_LS_R of the red meta-microlens LS_R, the shift distances S_LS_Gr and S_LS_Gb of the green meta-microlenses LS_Gr and LS_Gb, and the shift distance S_LS_B of the blue meta-microlens LS_B may be different from one another. According to some example embodiments of the present disclosure, the shift distance S_LS_R of the red meta-microlens LS_R may be greater than the shift distances S_LS_Gr and S_LS_Gb of the green meta-microlenses LS_Gr and LS_Gb or the shift distance S_LS_B of the blue meta-microlens LS_B, and the shift distances S_LS_Gr and S_LS_Gb of the green meta-microlenses LS_Gr and LS_Gb may be greater than or equal to the shift distance S_LS_B of the blue meta-microlens LS_B. Here, the shift distances S_LS_Gr and S_LS_Gb of the green meta-microlenses LS_Gr and LS_Gb may be equal to or different from each other.
The degree of shift of the color filters CF may be derived by adjusting the diameter, density, and pitch of the nanoposts NP in consideration of the difference in phase difference distribution for each wavelength. In particular, when the focal length is fixed, the phase distribution depending on the representative wavelength for each color filter may be calculated, and when the size distribution of the nanoposts NP is arranged to correspond to the corresponding phase distribution, the meta-microlens MML having the same focus for each color may be designed, unlike a spherical microlens.
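For illustration only, the following Python sketch outlines the per-color design step described above under the assumption that the target phase is the standard hyperbolic lens profile for a fixed focal length, evaluated at a representative wavelength for each color filter. The focal length, wavelengths, and sample positions are hypothetical, and the final lookup from phase delay to nanopost diameter is only indicated by a comment.

```python
import math

# Sketch of the per-color design step: for a fixed focal length f, the
# target phase at lateral position x is the hyperbolic lens profile
#   phi(x) = (2*pi/lambda) * (f - sqrt(f**2 + x**2)),
# computed separately for a representative wavelength of each color filter.
# The nanopost diameters would then be chosen so that the local phase
# delay matches this target; that last step is only indicated by a comment.
# Focal length, wavelengths, and positions below are hypothetical values.

FOCAL_LENGTH_UM = 3.0
REPRESENTATIVE_WAVELENGTHS_UM = {"R": 0.63, "G": 0.53, "B": 0.45}

def target_phase_rad(x_um: float, wavelength_um: float) -> float:
    """Target lens phase at position x for the given design wavelength."""
    return (2 * math.pi / wavelength_um) * (
        FOCAL_LENGTH_UM - math.hypot(FOCAL_LENGTH_UM, x_um))

for color, wl in REPRESENTATIVE_WAVELENGTHS_UM.items():
    phases = [target_phase_rad(x, wl) for x in (0.0, 0.3, 0.6)]
    # A diameter lookup table (phase delay -> post diameter) would be applied here.
    print(color, [f"{p:6.2f}" for p in phases])
```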
Referring to
The image sensor of the present disclosure, which is implemented through the shift of the meta-microlens and the color filter, has significantly reduced chromatic aberration depending on colors and has little to no channel difference, and thus the sensitivity is improved.
Referring to
When the meta-microlens according to some example embodiments of the present disclosure is employed, AF-C and channel difference values may be significantly improved, as compared with when the spherical microlens is employed.
Table 1 below shows AF-C and channel difference values in a comparative example in which the spherical microlens is employed and some example embodiments in which the meta-microlens is employed.
Referring to Table 1, it can be seen that when the meta-microlens is employed, an improvement in the AF-C and a significant reduction in the channel difference are achieved and thus the quality of the image sensor is significantly improved, as compared with when the spherical microlens is employed. In particular, it can be seen that in the case of the blue color, the channel difference is decreased by about or exactly 10% or more (for example, about or exactly 10% to 14%) at the corresponding angle.
As such, according to some example embodiments of the present disclosure, by introducing different focal distances for respective colors and different degrees of shift of components for the respective colors to the image sensor, monochromatic aberration due to oblique light is corrected, and the channel differences are reduced at the same time. Accordingly, the image sensor having the above-described configuration improves auto focusing performance by correcting the phase difference of light transmitting through the microlens when auto-focusing using phase difference is implemented.
In the image sensor having the above-described structure, various modifications, such as changing the number and size of pixels and replacing color filters, may be made. For example, colors included in one unit pattern may be differently set, and the number of pixels may also be differently set. For example, in the above-described embodiments of the present disclosure, one unit pattern includes four unit pixels, and each unit pixel includes four pixels. However, the present disclosure is not limited thereto. Each unit pixel may include two pixels, nine pixels, sixteen pixels, or other numbers of pixels. In addition, the sizes of pixels may also be set in various ways depending on colors.
Some example embodiments of the present disclosure propose a structure that optimizes phase correction depending on colors by employing the meta-microlens in the image sensor. Accordingly, the sensitivity of the image sensor may be improved.
Some example embodiments of the present disclosure provide the image sensors with improved auto focusing performance by correcting a phase depending on conditions of incident light.
When the terms “about” or “substantially” are used in this specification in connection with a numerical value, it is intended that the associated numerical value includes a manufacturing or operational tolerance (e.g., ±10%) around the stated numerical value. Moreover, when the words “generally” and “substantially” are used in connection with geometric shapes, it is intended that precision of the geometric shape is not required but that latitude for the shape is within the scope of the disclosure. Further, regardless of whether numerical values or shapes are modified as “about” or “substantially,” it will be understood that these values and shapes should be construed as including a manufacturing or operational tolerance (e.g., ±10%) around the stated numerical values or shapes.
As described herein, any electronic devices and/or portions thereof according to any of the example embodiments may include, may be included in, and/or may be implemented by one or more instances of processing circuitry such as hardware including logic circuits; a hardware/software combination such as a processor executing software; or any combination thereof. For example, the processing circuitry more specifically may include, but is not limited to, a central processing unit (CPU), an arithmetic logic unit (ALU), a graphics processing unit (GPU), an application processor (AP), a digital signal processor (DSP), a microcomputer, a field programmable gate array (FPGA), a programmable logic unit, a microprocessor, an application-specific integrated circuit (ASIC), a neural network processing unit (NPU), an Electronic Control Unit (ECU), an Image Signal Processor (ISP), and the like. In some example embodiments, the processing circuitry may include a non-transitory computer readable storage device (e.g., a memory), for example a DRAM device, storing a program of instructions, and a processor (e.g., CPU) configured to execute the program of instructions to implement the functionality and/or methods performed by some or all of any devices, systems, modules, units, controllers, circuits, architectures, and/or portions thereof according to any of the example embodiments, and/or any portions thereof.
The above-described example embodiments have been described as an example for convenience of description. Without being limited thereto, the above-described example embodiments may be combined in various ways without departing from the spirit and scope of the present disclosure.
While the present disclosure has been described with reference to example embodiments thereof, it will be apparent to those of ordinary skill in the art that various changes and modifications may be made thereto without departing from the spirit and scope of the present disclosure as set forth in the following claims.
Accordingly, the scope of the present disclosure should not be determined by the above-described example embodiments and should be determined by the accompanying claims and the equivalents thereof.