This application is based on and claims priority under 35 U.S.C. § 119 to Korean Patent Application No. 10-2023-0019538, filed on Feb. 14, 2023, in the Korean Intellectual Property Office, the disclosure of which is incorporated by reference herein in its entirety.
One or more embodiments relate to an image sensor including a nano-photonic lens array and an electronic apparatus including the image sensor.
Image sensors generally sense the color of incident light by using a color filter. However, a color filter may have low light utilization efficiency because the color filter absorbs light of colors other than its intended color. For example, in the case in which a red-green-blue (RGB) color filter is used, only ⅓ of the incident light is transmitted therethrough and the remaining portion of the incident light, that is, ⅔ of the incident light, is absorbed. Thus, the light utilization efficiency of the RGB color filter is only about 33%, and most of the light loss in a color display apparatus or a color image sensor occurs in the color filter.
In addition, as pixel sizes in an image sensor are reduced, a binning technique for improving sensitivity under a low light level has been suggested. However, in an image sensor of a Bayer pattern type, the resolution of the image sensor may deteriorate when a binning mode is used. Accordingly, a new type of pixel arrangement other than the Bayer pattern has been suggested.
One or more example embodiments provide an image sensor including a nano-photonic lens array suitable for a new pixel arrangement, as well as for a Bayer pattern.
Further, one or more example embodiments provide an electronic apparatus including an image sensor.
According to an embodiment, an image sensor includes a sensor substrate including a first pixel, a second pixel, a third pixel, and a fourth pixel that are two-dimensionally disposed in a first direction and a second direction; and a nano-photonic lens array including a first pixel corresponding region corresponding to the first pixel, a second pixel corresponding region corresponding to the second pixel, a third pixel corresponding region corresponding to the third pixel, and a fourth pixel corresponding region corresponding to the fourth pixel, wherein each of the first pixel corresponding region, the second pixel corresponding region, the third pixel corresponding region, and the fourth pixel corresponding region includes a plurality of nano-structures that are disposed to condense, in incident light, light of a first wavelength onto the first pixel, condense light of a second wavelength onto the second pixel and the fourth pixel, and condense light of a third wavelength onto the third pixel, and in each of the second pixel corresponding region and the fourth pixel corresponding region, cross-sectional area sizes of the plurality of nano-structures are distributed asymmetrically in the first direction, the second direction, and a first diagonal direction, and symmetrically in a second diagonal direction that crosses the first diagonal direction.
The first pixel, the second pixel, the third pixel, and the fourth pixel may be alternately disposed in the first direction, and the first pixel, the second pixel, the third pixel, and the fourth pixel may be shifted by one pixel unit in the first direction in a following row in the second direction so that each of the first pixel, the second pixel, the third pixel, and the fourth pixel is disposed in a straight line in the first diagonal direction.
The first pixel corresponding region, the second pixel corresponding region, the third pixel corresponding region, and the fourth pixel corresponding region may be alternately disposed in the first direction, and the first pixel corresponding region, the second pixel corresponding region, the third pixel corresponding region, and the fourth pixel corresponding region may be shifted one-by-one in the first direction in a following row in the second direction so that each of the first pixel corresponding region, the second pixel corresponding region, the third pixel corresponding region, and the fourth pixel corresponding region is disposed in a straight line in the first diagonal direction.
In each of the first pixel corresponding region, the second pixel corresponding region, the third pixel corresponding region, and the fourth pixel corresponding region, the plurality of nano-structures may be two-dimensionally disposed, and in the first pixel corresponding region, the second pixel corresponding region, the third pixel corresponding region, and the fourth pixel corresponding region, locations, materials, and heights of the plurality of nano-structures may be equal to one another.
In the second pixel corresponding region, at least one pair from among pairs of two nano-structures corresponding to each other in the first direction may have different cross-sectional area sizes from each other, at least one pair from among pairs of two nano-structures corresponding to each other in the second direction may have different cross-sectional area sizes from each other, at least one pair from among pairs of two nano-structures corresponding to each other in the second diagonal direction may have different cross-sectional area sizes from each other, and pairs of two nano-structures corresponding to each other in the first diagonal direction may have same cross-sectional area sizes.
In the fourth pixel corresponding region, at least one pair from among pairs of two nano-structures corresponding to each other in the first direction may have different cross-sectional area sizes from each other, at least one pair from among pairs of two nano-structures corresponding to each other in the second direction may have different cross-sectional area sizes from each other, at least one pair from among pairs of two nano-structures corresponding to each other in the first diagonal direction may have different cross-sectional area sizes from each other, and pairs of two nano-structures corresponding to each other in the second diagonal direction may have same cross-sectional area sizes.
A distribution of cross-sectional area sizes of the plurality of nano-structures in the second pixel corresponding region and a distribution of cross-sectional area sizes of the plurality of nano-structures in the fourth pixel corresponding region may be mirror-symmetrical with each other in the first diagonal direction.
The cross-sectional area size of each of the plurality of nano-structures in the second pixel corresponding region may be equal to the cross-sectional area size of the nano-structure located at a symmetrical position in the first diagonal direction in the fourth pixel corresponding region.
The plurality of nano-structures may be disposed so that the light of the second wavelength that passed through a rectangular region obtained by connecting centers of two first pixel corresponding regions to centers of two third pixel corresponding regions that are adjacent to the second pixel corresponding region while contacting one side thereof is condensed onto the second pixel, and the light of the second wavelength that passed through a rectangular region obtained by connecting centers of two first pixel corresponding regions to centers of two third pixel corresponding regions that are adjacent to the fourth pixel corresponding region while contacting one side thereof is condensed onto the fourth pixel.
In the first pixel corresponding region and the third pixel corresponding region, the cross-sectional area sizes of the plurality of nano-structures may be distributed symmetrically in the first direction, the second direction, the first diagonal direction, and the second diagonal direction.
In each of the first pixel corresponding region and the third pixel corresponding region, two nano-structures corresponding to each other in the first direction may have the same cross-sectional area size, two nano-structures corresponding to each other in the second direction may have the same cross-sectional area size, two nano-structures corresponding to each other in the first diagonal direction may have the same cross-sectional area size, and two nano-structures corresponding to each other in the second diagonal direction may have the same cross-sectional area size.
The plurality of nano-structures may be disposed so that the light of the first wavelength that passed through a rectangular region obtained by connecting centers of two second pixel corresponding regions to centers of two fourth pixel corresponding regions that are adjacent to the first pixel corresponding region while contacting one side thereof is condensed onto the first pixel, and the light of the third wavelength that passed through a rectangular region obtained by connecting centers of two second pixel corresponding regions to centers of two fourth pixel corresponding regions that are adjacent to the third pixel corresponding region while contacting one side thereof is condensed onto the third pixel.
In the first pixel corresponding region and the third pixel corresponding region, the cross-sectional area sizes of the plurality of nano-structures may be distributed asymmetrically in the first direction and the second direction, and symmetrically in the first diagonal direction and the second diagonal direction.
In the first pixel corresponding region and the third pixel corresponding region, at least one pair from among pairs of two nano-structures corresponding to each other in the first direction may have different cross-sectional area sizes from each other, at least one pair from among pairs of two nano-structures corresponding to each other in the second direction may have different cross-sectional area sizes from each other, two nano-structures corresponding to each other in the first diagonal direction may have the same cross-sectional area sizes, and two nano-structures corresponding to each other in the second diagonal direction may have the same cross-sectional area sizes.
The plurality of nano-structures may be disposed so that the light of the first wavelength that passed through a rectangular region obtained by connecting apexes of two second pixel corresponding regions to apexes of two fourth pixel corresponding regions that are adjacent to the first pixel corresponding region while contacting one side thereof is condensed onto the first pixel, and the light of the third wavelength that passed through a rectangular region obtained by connecting apexes of two second pixel corresponding regions to apexes of two fourth pixel corresponding regions that are adjacent to the third pixel corresponding region while contacting one side thereof is condensed onto the third pixel.
Each of the first pixel, the second pixel, the third pixel, and the fourth pixel may include a plurality of photosensitive cells that are grouped in the first direction and the second direction to be two-dimensionally disposed and independently sense incident light.
The image sensor may further include a color filter layer disposed between the sensor substrate and the nano-photonic lens array, wherein the color filter layer may include a first color filter corresponding to the first pixel and transmitting the light of the first wavelength, a second color filter corresponding to the second pixel and transmitting the light of the second wavelength, a third color filter corresponding to the third pixel and transmitting the light of the third wavelength, and a fourth color filter corresponding to the fourth pixel and transmitting the light of the second wavelength.
The first color filter, the second color filter, the third color filter, and the fourth color filter may be alternately disposed in the first direction, and the first color filter, the second color filter, the third color filter, and the fourth color filter may be shifted by one pixel unit in the first direction in a following row in the second direction so that each of the first color filter, the second color filter, the third color filter, and the fourth color filter is disposed in a line in the first diagonal direction.
According to another embodiment, an electronic apparatus includes a lens assembly for forming an optical image of a subject, an image sensor configured to convert the optical image formed by the lens assembly into an electrical signal, and a processor configured to process a signal generated by the image sensor, wherein the image sensor includes a sensor substrate including a first pixel, a second pixel, a third pixel, and a fourth pixel that are two-dimensionally disposed in a first direction and a second direction, and a nano-photonic lens array including a first pixel corresponding region corresponding to the first pixel, a second pixel corresponding region corresponding to the second pixel, a third pixel corresponding region corresponding to the third pixel, and a fourth pixel corresponding region corresponding to the fourth pixel, each of the first pixel corresponding region, the second pixel corresponding region, the third pixel corresponding region, and the fourth pixel corresponding region includes a plurality of nano-structures that are disposed to condense, in incident light, light of a first wavelength onto the first pixel, condense light of a second wavelength onto the second pixel and the fourth pixel, and condense light of a third wavelength onto the third pixel, and in each of the second pixel corresponding region and the fourth pixel corresponding region, cross-sectional area sizes of the plurality of nano-structures are distributed asymmetrically in the first direction, the second direction, and a first diagonal direction, and are distributed symmetrically in a second diagonal direction that crosses the first diagonal direction.
According to another aspect of the disclosure, an image sensor may include: a plurality of pixels that comprises a first pixel, and four neighboring pixels that surround the first pixel; and a lens array comprising a plurality of pixel corresponding regions, wherein the plurality of pixel corresponding regions may include a first pixel corresponding region, and four neighboring pixel corresponding regions that surround the first pixel corresponding region, wherein the first pixel corresponding region and the four neighboring pixel corresponding regions are aligned correspondingly with the first pixel and the four neighboring pixels; wherein the lens array may include a plurality of nano-structures configured to condense light of a first wavelength onto the first pixel, when the light of the first wavelength is incident at any position within a first wavelength light condensing region of the lens array, and wherein the first wavelength light condensing region is an area formed by connecting centers of the four neighboring pixel corresponding regions.
Additional aspects will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the presented embodiments of the disclosure.
The above and other aspects, features, and advantages of certain embodiments of the disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout. In this regard, the present embodiments may have different forms and should not be construed as being limited to the descriptions set forth herein. Accordingly, the embodiments are merely described below, by referring to the figures, to explain aspects. Expressions such as “at least one of,” when preceding a list of elements, modify the entire list of elements and do not modify the individual elements of the list.
Hereinafter, an image sensor including a nano-photonic lens array and an electronic apparatus including the image sensor will be described in detail with reference to accompanying drawings. The embodiments of the disclosure are capable of various modifications and may be embodied in many different forms. In the drawings, like reference numerals denote like components, and sizes of components in the drawings may be exaggerated for convenience of explanation.
When a layer, a film, a region, or a panel is referred to as being “on” another element, it may be directly on, under, or at the left or right side of the other layer or substrate, or intervening layers may also be present.
It will be understood that although the terms “first,” “second,” etc. may be used herein to describe various components, these components should not be limited by these terms. These terms are only used to distinguish one component from another. These terms do not limit that materials or structures of components are different from one another.
An expression used in the singular encompasses the expression of the plural, unless it has a clearly different meaning in the context. It will be further understood that when a portion is referred to as “comprising” another component, the portion does not exclude other components and may further include other components unless the context states otherwise.
In addition, terms such as “ . . . unit”, “module”, etc. provided herein indicate a unit that performs a function or operation, and may be realized by hardware, software, or a combination of hardware and software.
The use of the term “the above-described” and similar indicative terms may correspond to both the singular and plural forms.
Also, the steps of all methods described herein may be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. Also, the use of all exemplary terms (for example, etc.) is used only to describe the technical idea in detail, and the scope of the disclosure is not limited by these terms unless limited by the claims.
The pixel array 1100 includes pixels that are two-dimensionally disposed in a plurality of rows and columns. The row decoder 1020 selects one of the rows in the pixel array 1100 in response to a row address signal output from the timing controller 1010. The output circuit 1030 outputs photosensitive signals, in units of lines, from the plurality of pixels disposed in the selected row. To this end, the output circuit 1030 may include a column decoder and an analog-to-digital converter (ADC). For example, the output circuit 1030 may include a plurality of ADCs respectively disposed for the columns between the column decoder and the pixel array 1100, or one ADC disposed at an output end of the column decoder. The timing controller 1010, the row decoder 1020, and the output circuit 1030 may be implemented as one chip or as separate chips. A processor for processing an image signal output from the output circuit 1030 may be implemented as one chip together with the timing controller 1010, the row decoder 1020, and the output circuit 1030.
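As a rough illustration of the readout flow described above, the following Python sketch models one row being selected at a time and its photosensitive signals being digitized by column-parallel ADCs. The array contents, bit depth, and full-scale value are arbitrary assumptions made only for this example and are not taken from the disclosure.

```python
import numpy as np

# Minimal model of the readout flow: the row decoder selects one row at a time,
# and a per-column ADC quantizes the analog photosensitive signals of that row.

def quantize(analog_row, bits=10, full_scale=1.0):
    """Model a column-parallel ADC: clip and quantize one row of analog values."""
    levels = 2 ** bits - 1
    clipped = np.clip(analog_row / full_scale, 0.0, 1.0)
    return np.round(clipped * levels).astype(np.int32)

def read_out(pixel_array, bits=10):
    """Read the pixel array line by line, as the output circuit would."""
    digital = np.empty(pixel_array.shape, dtype=np.int32)
    for row in range(pixel_array.shape[0]):                   # row decoder selects a row
        digital[row] = quantize(pixel_array[row], bits=bits)  # column ADCs digitize it
    return digital

# Example: a 4x4 array of analog photosensitive signals in [0, 1).
rng = np.random.default_rng(0)
analog = rng.random((4, 4))
print(read_out(analog))
```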
The pixel array 1100 may include a plurality of pixels that sense light of different wavelengths. The pixel arrangement may be implemented in various ways. For example,
Referring to
In
Also, in the pixel arrangement shown in
Also, around the red pixel R in the first and second directions, the first green pixels G1 and the second green pixels G2 may be disposed contacting four sides of the red pixel R. For example, the first green pixels G1 may be disposed contacting the right and upper sides of the red pixel R, and the second green pixels G2 may be disposed contacting the left and lower sides of the red pixel R. In addition, around the blue pixel B in the first and second directions, the first and second green pixels G1 and G2 may be disposed contacting four sides of the blue pixel B. For example, the second green pixels G2 may be disposed contacting the right and upper sides of the blue pixel B, and the first green pixels G1 may be disposed contacting the left and lower sides of the blue pixel B. Therefore, the green pixels may be in contact with all four sides of the red pixel R and the blue pixel B. In this point of view, the pixel arrangements around the red pixel R and the blue pixel B may be symmetrical in the first direction and the second direction.
In the case of the first green pixel G1, the blue pixels B may be disposed contacting the right and upper sides of the first green pixel G1 and the red pixels R may be disposed contacting the left and lower sides of the first green pixel G1. Therefore, pixels of different colors, e.g., the red pixel R and the blue pixel B, are disposed at the left and right sides of the first green pixel G1, and pixels of different colors are disposed at the lower and upper sides of the first green pixel G1. Also, the red pixels R may be disposed contacting the right and upper sides of the second green pixel G2, and the blue pixels B may be disposed contacting the left and lower sides of the second green pixel G2. Therefore, pixels of different colors are disposed at the left and right sides of the second green pixel G2, and pixels of different colors are disposed at the lower and upper sides of the second green pixel G2. In this respect, the pixel arrangement around the first green pixel G1 and the pixel arrangement around the second green pixel G2 are asymmetrical in the first direction and in the second direction, and are also asymmetrical in the first diagonal direction D1 and symmetrical in the second diagonal direction D2. Also, in the pixel arrangement around the first green pixel G1 and the pixel arrangement around the second green pixel G2, the red pixels R and the blue pixels B are exchanged with each other.
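The shifted arrangement and the neighbor relations described above can be checked with a short sketch. The base row order [R, G1, B, G2] and the sign of the per-row shift below are assumptions chosen only for illustration; only the relative relations matter.

```python
# Build the shifted pixel arrangement and check the stated neighbor relations.

BASE = ["R", "G1", "B", "G2"]

def color(row, col):
    """Each row repeats R, G1, B, G2 and is shifted by one pixel per row (assumed shift sign)."""
    return BASE[(col - row) % 4]

def neighbors(row, col):
    return {
        "left":  color(row, col - 1),
        "right": color(row, col + 1),
        "up":    color(row - 1, col),
        "down":  color(row + 1, col),
    }

# Same-color pixels line up along one diagonal direction (here: down-right).
assert all(color(r, c) == color(r + 1, c + 1) for r in range(8) for c in range(8))

# Around G1: blue pixels on the right/upper sides, red pixels on the left/lower sides.
print(neighbors(1, 2))   # pixel (1, 2) is G1 in this layout
# Around R and B: green pixels contact all four sides.
print(neighbors(1, 1))   # pixel (1, 1) is R
print(neighbors(1, 3))   # pixel (1, 3) is B
```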
In addition, as indicated by dashed lines in
The pixel arrangement shown in
The pixels of the pixel array 1100 may be arranged in various manners other than the arrangement shown in
Referring to
The sensor substrate 110 may include a plurality of pixels sensing incident light. For example, the sensor substrate 110 may include a plurality of first pixels 111, a plurality of second pixels 112, a plurality of third pixels 113, and a plurality of fourth pixels 114 that convert incident light into electrical signals and generate an image signal. The first, second, third, and fourth pixels 111, 112, 113, and 114 may respectively correspond to the red pixels R, the first green pixels G1, the blue pixels B, and the second green pixels G2 shown in
Therefore, in a row following any one row in the second direction, the pixels may be shifted by one pixel unit in the first direction with respect to the previous row. For example, the first pixel 111, the second pixel 112, the third pixel 113, and the fourth pixel 114 disposed in the second row may be shifted by one pixel unit in the first direction with respect to the first pixel 111, the second pixel 112, the third pixel 113, and the fourth pixel 114 disposed in the first row. Then, the same color pixels may be disposed in a line along one diagonal direction. For example, a plurality of first pixels 111 are disposed in a straight line in a diagonal direction while contacting the apexes of one another, a plurality of second pixels 112 are disposed in a line in the diagonal direction while contacting the apexes of one another, a plurality of third pixels 113 are disposed in a line in the diagonal direction while contacting the apexes of one another, and a plurality of fourth pixels 114 are disposed in a line in the diagonal direction while contacting the apexes of one another. In the above-mentioned manner, the first to fourth pixels 111, 112, 113, and 114 may be two-dimensionally disposed in the first and second directions.
Also, each of the first to fourth pixels 111, 112, 113, and 114 may include a plurality of photosensitive cells that independently sense incident light. For example, each of the first to fourth pixels 111, 112, 113, and 114 may include first to fourth photosensitive cells C1, C2, C3, and C4. The first to fourth photosensitive cells C1, C2, C3, and C4 may be two-dimensionally disposed in the first direction and the second direction. For example, in each of the first to fourth pixels 111, 112, 113, and 114, the first to fourth photosensitive cells C1, C2, C3, and C4 may be disposed in a 2×2 array. Each of the first to fourth photosensitive cells C1, C2, C3, and C4 may include a photodiode sensing light, and a switching circuit controlling operations of the photodiode. Therefore, when each of the first to fourth pixels 111, 112, 113, and 114 includes the first to fourth photosensitive cells C1, C2, C3, and C4 disposed in 2×2 array, each of the first to fourth pixels 111, 112, 113, and 114 may include four photodiodes operating independently from one another.
According to the embodiment, an auto-focusing signal may be obtained by comparing output signals of adjacent photosensitive cells based on differences between the output signals. For example, in the first direction, an auto-focusing signal may be generated by calculating a difference between output signals from the first photosensitive cell C1 and the second photosensitive cell C2, a difference between output signals from the third photosensitive cell C3 and the fourth photosensitive cell C4, or a difference between a sum of the output signals from the first photosensitive cell C1 and the third photosensitive cell C3 and a sum of the output signals from the second photosensitive cell C2 and the fourth photosensitive cell C4. Also, in the second direction, an auto-focusing signal may be generated by calculating a difference between output signals from the first photosensitive cell C1 and the third photosensitive cell C3, a difference between output signals from the second photosensitive cell C2 and the fourth photosensitive cell C4, or a difference between a sum of the output signals from the first photosensitive cell C1 and the second photosensitive cell C2 and a sum of the output signals from the third photosensitive cell C3 and the fourth photosensitive cell C4.
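A minimal sketch of the auto-focusing signal candidates listed above, assuming the 2×2 cell order C1, C2 in the upper row and C3, C4 in the lower row; the cell output values used here are arbitrary numbers for illustration.

```python
# Candidate phase-difference auto-focusing signals from one 2x2-cell pixel.

def af_signals(c1, c2, c3, c4):
    """Return the candidate auto-focusing signals in the first and second directions."""
    first_direction = {
        "c1_minus_c2": c1 - c2,
        "c3_minus_c4": c3 - c4,
        "left_minus_right": (c1 + c3) - (c2 + c4),
    }
    second_direction = {
        "c1_minus_c3": c1 - c3,
        "c2_minus_c4": c2 - c4,
        "top_minus_bottom": (c1 + c2) - (c3 + c4),
    }
    return first_direction, second_direction

print(af_signals(10.0, 12.0, 9.0, 11.5))
```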
An RGB image signal may be obtained in various ways. For example, in the case of a full-sized image, output signals from the first to fourth photosensitive cells C1, C2, C3, and C4 may be used as independent image signals. In this case, the first pixel 111 may generate four red image signals, each of the second pixel 112 and the fourth pixel 114 may generate four green image signals, and the third pixel 113 may generate four blue image signals.
In a first binning mode, the output signals from the first to fourth photosensitive cells C1, C2, C3, and C4 may be summed to obtain an image signal. For example, a red image signal may be generated by summing the output signals from the first to fourth photosensitive cells C1, C2, C3, and C4 of the first pixel 111, a first green image signal may be generated by summing the output signals from the first to fourth photosensitive cells C1, C2, C3, and C4 of the second pixel 112, a blue image signal may be generated by summing the output signals from the first to fourth photosensitive cells C1, C2, C3, and C4 of the third pixel 113, and a second green image signal may be generated by summing the output signals from the first to fourth photosensitive cells C1, C2, C3, and C4 of the fourth pixel 114. In this case, each of the first to fourth pixels 111, 112, 113, and 114 may generate one image signal.
Also, in a second binning mode, outputs from two same pixels that are adjacent to each other in the diagonal direction may be summed to generate an image signal. For example, one red image signal may be generated by summing output signals from the first to fourth photosensitive cells C1, C2, C3, and C4 of two first pixels 111 adjacent to each other in the diagonal direction, one first green image signal may be generated by summing output signals from the first to fourth photosensitive cells C1, C2, C3, and C4 of two second pixels 112 adjacent to each other in the diagonal direction, one blue image signal may be generated by summing output signals from the first to fourth photosensitive cells C1, C2, C3, and C4 of two third pixels 113 adjacent to each other in the diagonal direction, and one second green image signal may be generated by summing output signals from the first to fourth photosensitive cells C1, C2, C3, and C4 of two fourth pixels 114 adjacent to each other in the diagonal direction.
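The three readout cases described above (full-size image, first binning mode, and second binning mode) can be summarized in a short sketch; the cell values and the choice of a first green pixel are arbitrary illustrations.

```python
import numpy as np

# Full-size mode keeps the four cell outputs of a pixel, the first binning mode
# sums the four cells of one pixel, and the second binning mode additionally sums
# two same-color pixels adjacent in the diagonal direction.

def full_size(cells):
    """cells: C1..C4 outputs of one pixel -> four independent image signals."""
    return np.asarray(cells)

def first_binning(cells):
    """Sum C1..C4 of one pixel -> one image signal per pixel."""
    return np.sum(cells)

def second_binning(cells_a, cells_b):
    """Sum the cells of two same-color pixels adjacent in the diagonal direction."""
    return np.sum(cells_a) + np.sum(cells_b)

green_pixel_1 = [11, 12, 10, 13]   # C1..C4 of one first green pixel (example values)
green_pixel_2 = [ 9, 10, 11, 12]   # C1..C4 of the diagonally adjacent first green pixel
print(full_size(green_pixel_1))                       # four green image signals
print(first_binning(green_pixel_1))                   # one green image signal
print(second_binning(green_pixel_1, green_pixel_2))   # one binned green image signal
```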
The first color filter 121 may be disposed to face the first pixel 111 in a third direction (that is, Z-axis direction), the second color filter 122 may be disposed to face the second pixel 112 in the third direction, the third color filter 123 may be disposed to face the third pixel 113 in the third direction, and the fourth color filter 124 may be disposed to face the fourth pixel 114 in the third direction. Therefore, the first to fourth color filters 121, 122, 123, and 124 may be two-dimensionally disposed in the same manner as that of the first to fourth pixels 111, 112, 113, and 114 described above with reference to
Accordingly, the first pixel 111 may sense the light of the first wavelength that has passed through the first color filter 121 corresponding thereto. The second pixel 112 and the fourth pixel 114 may sense the light of the second wavelength that has passed through the second color filter 122 and the fourth color filter 124 corresponding thereto. The third pixel 113 may sense the light of the third wavelength that has passed through the third color filter 123 corresponding thereto. For example, the light of the first wavelength may be red light, the light of the second wavelength may be green light, and the light of the third wavelength may be blue light. Therefore, the first color filter 121 may be a red color filter transmitting the red light, the second color filter 122 and the fourth color filter 124 may be green color filters transmitting the green light, and the third color filter 123 may be a blue color filter transmitting the blue light.
Dashed lines shown in
The first to fourth color filters 121, 122, 123, and 124 in the color filter layer 120 may be formed of, for example, an organic polymer material. For example, the first to fourth color filters 121, 122, 123, and 124 may include a coloring agent, binder resin, polymer photoresist, etc. The first color filter 121 may be an organic color filter including red organic dye or a red organic pigment as a coloring agent, the second and fourth color filters 122 and 124 may be organic color filters including a green organic dye or a green organic pigment as a coloring agent, and the third color filter 123 may be an organic color filter including a blue organic dye or a blue organic pigment as a coloring agent. The color filter layer 120 may further include a black matrix disposed at boundaries between the first to fourth color filters 121, 122, 123, and 124. The black matrix may include, for example, carbon black.
In
An encapsulation layer 131 may be further disposed on the planarization layer 130. The encapsulation layer 131 may serve multiple functions in the pixel array 1100. For example, the encapsulation layer 131 may act as a protective layer that prevents the planarization layer 130, formed of an organic polymer material, from potential damage during a process of forming the nano-photonic lens array 150 on the planarization layer 130. Also, the encapsulation layer 131 may serve as a diffusion barrier layer that prevents a metal component in the color filter layer 120 from passing through the planarization layer 130 and being exposed to the outside due to the high temperature during the process of forming the nano-photonic lens array 150. To this end, the encapsulation layer 131 may include an inorganic material. The inorganic material of the encapsulation layer 131 may be formed at a temperature lower than the processing temperature for forming the nano-photonic lens array 150 and may be transparent with respect to visible light. Also, a refractive index of the encapsulation layer 131 may be similar to that of the planarization layer 130 in order to reduce reflection loss at an interface between the planarization layer 130 and the encapsulation layer 131. For example, a difference between the refractive index of the planarization layer 130 and the refractive index of the encapsulation layer 131 may be within ±20% of the refractive index of the planarization layer 130. For example, the encapsulation layer 131 may include at least one inorganic material selected from SiO2, SiN, and SiON.
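As a rough numeric illustration of why matching the refractive indices of the planarization layer 130 and the encapsulation layer 131 reduces reflection loss, the sketch below evaluates the normal-incidence Fresnel reflectance for a few assumed index pairs; the index values are not taken from the disclosure.

```python
# Normal-incidence Fresnel reflectance at the planarization/encapsulation interface.

def fresnel_reflectance(n1, n2):
    """Power reflectance at an interface between media of refractive indices n1 and n2."""
    return ((n1 - n2) / (n1 + n2)) ** 2

n_planarization = 1.50                      # assumed organic planarization layer index
for n_encapsulation in (1.46, 1.50, 1.80):  # e.g., SiO2-like, matched, SiN-like (assumed)
    r = fresnel_reflectance(n_planarization, n_encapsulation)
    print(f"n_enc = {n_encapsulation:.2f} -> reflectance ~ {r * 100:.3f} %")
```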
Also, the nano-photonic lens array 150 may include a plurality of first pixel corresponding regions 151 corresponding to the plurality of first pixels 111, a plurality of second pixel corresponding regions 152 corresponding to the plurality of second pixels 112, a plurality of third pixel corresponding regions 153 corresponding to the plurality of third pixels 113, and a plurality of fourth pixel corresponding regions 154 corresponding to the plurality of fourth pixels 114. The first pixel corresponding regions 151 may be disposed so as to face the first pixels 111 and the first color filters 121 in the third direction, the second pixel corresponding regions 152 may be disposed so as to face the second pixels 112 and the second color filters 122 in the third direction, the third pixel corresponding regions 153 may be disposed so as to face the third pixels 113 and the third color filters 123 in the third direction, and the fourth pixel corresponding regions 154 may be disposed so as to face the fourth pixels 114 and the fourth color filters 124 in the third direction. Therefore, the first to fourth pixel corresponding regions 151, 152, 153, and 154 may be two-dimensionally disposed in the same manner as that of the first to fourth pixels 111, 112, 113, and 114 described above with reference to
According to the embodiment, the nano-photonic lens array 150 may color-separate the incident light according to wavelengths thereof and condense the separated light of each wavelength onto a pixel corresponding to the wavelength. In other words, the nano-photonic lens array 150 may condense light of the first wavelength, in the incident light, onto the first pixel 111, condense light of the second wavelength onto the second pixel 112 and the fourth pixel 114, and condense light of the third wavelength onto the third pixel 113. Then, the incident light is separated by the nano-photonic lens array 150 according to wavelengths and condensed onto the first to fourth pixels 111, 112, 113, and 114. For example, the nano-photonic lens array 150 separates the incident light into red light, green light, and blue light, and condenses the red light onto the first pixel 111, the green light onto the second and fourth pixels 112 and 114, and the blue light onto the third pixel 113. Because the incident light is color-separated by the nano-photonic lens array 150 to a considerable degree, the absorption loss may be low even when the color filter layer 120 is used. Also, color purity may be improved because the nano-photonic lens array 150 and the color filter layer 120 are used together. The color filter layer 120 may be omitted provided that sufficient color separation is achieved by the nano-photonic lens array 150.
Each of the first to fourth pixel corresponding regions 151, 152, 153, and 154 of the nano-photonic lens array 150 may include a plurality of nano-structures NP that are disposed to color-separate the incident light according to the wavelengths and condense the separated light of each wavelength respectively onto the first to fourth pixels 111, 112, 113, and 114. The plurality of nano-structures NP may be disposed so that a phase of light transmitting through the nano-photonic lens array 150 is changed according to a position on the nano-photonic lens array 150. Each of the plurality of nano-structures NP may have a pillar shape extending in the third direction. Also, each of the plurality of nano-structures NP may have a certain cross-sectional shape such as a circular shape, an elliptical shape, or a polygonal shape, on a cross-section taken along an XY-plane or a plane parallel to the first direction and the second direction, in other words, a cross-section taken along the upper surface or lower surface of the nano-photonic lens array 150. Therefore, each of the plurality of nano-structures NP may have a cylindrical shape, an elliptical pillar shape, or a polygonal pillar shape. A phase profile of the transmitted light, which is implemented by the nano-photonic lens array 150, may be determined according to a size (e.g., width, diameter, or cross-sectional area) and a height of each of the nano-structures NP, and the arrangement period (or pitch) and arrangement type of the plurality of nano-structures NP. Also, the behavior of the light passing through the nano-photonic lens array 150 may be determined according to the phase profile of the transmitted light. For example, the plurality of nano-structures NP may be disposed so as to form a phase profile allowing the light transmitted through the nano-photonic lens array 150 to condense.
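For illustration only, the sketch below samples the phase profile of an ideal thin lens at a grid of nano-structure positions. This is a simplified stand-in for the condensing phase profile mentioned above, not the actual design; the wavelength, focal length, pitch, and 4×4 sampling are assumed values.

```python
import numpy as np

# Target condensing phase profile of an ideal thin lens, sampled at the
# nano-structure positions of one pixel corresponding region (all values assumed).

wavelength = 540e-9          # green light, in meters (assumed)
focal_length = 2.0e-6        # assumed distance to the photosensitive layer
pitch = 0.3e-6               # assumed nano-structure arrangement pitch

coords = (np.arange(4) - 1.5) * pitch             # 4x4 sample positions per region
xx, yy = np.meshgrid(coords, coords)
ideal_phase = (2 * np.pi / wavelength) * (
    focal_length - np.sqrt(xx**2 + yy**2 + focal_length**2)
)
target = np.mod(ideal_phase, 2 * np.pi)           # wrapped target phase profile (radians)
print(np.round(target, 2))
```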
The nano-structures NP may each have a size that is less than a wavelength of visible light. The nano-structures NP may have a size that is less than, for example, the blue wavelength. For example, the cross-sectional width (or diameter) of the nano-structures NP may be less than 400 nm, 300 nm, or 200 nm. A height of the nano-structures NP may be about 500 nm to about 1500 nm, and may be greater than the cross-sectional width of the nano-structures NP.
The nano-structures NP may include a material having a relatively higher refractive index than a peripheral material and a relatively low absorption ratio in the visible ray band. For example, the nano-structures NP may include c-Si, p-Si, a-Si, a Group III-V compound semiconductor (GaP, GaN, GaAs, etc.), SiC, TiO2, SiN, ZnS, ZnSe, Si3N4, and/or a combination thereof. The periphery of the nano-structures NP may be filled with a dielectric layer DL having a relatively lower refractive index than the nano-structures NP and a relatively low absorption ratio in the visible ray band. For example, the periphery of the nano-structures NP may be filled with a dielectric layer DL including siloxane-based spin on glass (SOG), SiO2, Si3N4, Al2O3, air, etc.
The refractive index of the nano-structures NP may be about 2.0 or greater with respect to light of about a 630 nm wavelength, and the refractive index of the dielectric layer DL may be about 1.0 to about 2.0 with respect to light of about a 630 nm wavelength. For example, a difference between the refractive index of the nano-structures NP and the refractive index of the dielectric layer DL may be about 0.5 or greater. Because the refractive index of the nano-structures NP differs from that of the peripheral material, the nano-structures NP may change the phase of light that passes through them. This is caused by a phase delay that occurs due to the sub-wavelength shape dimensions of the nano-structures NP, and the degree to which the phase is delayed may be determined by the detailed shape dimensions and arrangement of the nano-structures NP.
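The dependence of the phase delay on the nano-structure size can be illustrated with a crude effective-medium estimate. The linear index-mixing model and all numeric values below are assumptions for illustration only and do not represent the rigorous full-wave behavior of the nano-photonic lens array 150.

```python
import numpy as np

# Crude estimate: a pillar raises the local effective index in proportion to its
# areal fill factor, and the transmitted phase grows with that index times the height.

wavelength = 630e-9      # reference wavelength used for the indices above
height = 900e-9          # assumed pillar height within the 500-1500 nm range
pitch = 300e-9           # assumed arrangement pitch
n_pillar = 2.4           # e.g., a TiO2-like high-index material (assumed value)
n_fill = 1.45            # e.g., an SiO2-like surrounding dielectric (assumed value)

def phase_delay(diameter):
    """Approximate transmitted phase (radians) for a cylindrical pillar of given diameter."""
    fill = np.pi * (diameter / 2) ** 2 / pitch**2   # areal fill factor in one unit cell
    n_eff = n_fill + (n_pillar - n_fill) * fill     # linear index-mixing assumption
    return 2 * np.pi * n_eff * height / wavelength

for d in (80e-9, 140e-9, 200e-9):
    print(f"diameter {d * 1e9:.0f} nm -> phase ~ {phase_delay(d):.2f} rad")
```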
In order to color-separate the incident light into the light of the first wavelength, the light of the second wavelength, and the light of the third wavelength and condense the light onto the corresponding pixels, the plurality of nano-structures NP may be disposed in different types in the first to fourth pixel corresponding regions 151, 152, 153, and 154 of the nano-photonic lens array 150.
Referring to
For example, because the pixels around the red pixel R and the blue pixel B in
For example, as shown in
In addition, in
For example, from among nine nano-structures NP disposed in a 3×3 array, when the nano-structures in the first row are referred to as first to third nano-structures 1, 2, and 3 from left to right, the nano-structures in the second row are referred to as fourth to sixth nano-structures 4, 5, and 6 from left to right, and the nano-structures in the third row are referred to as seventh to ninth nano-structures 7, 8, and 9 from left to right, the second, fifth, and sixth nano-structures 2, 5, and 6 may have the largest cross-sectional areas from among the nano-structures NP in the second pixel corresponding region 152. In addition, the cross-sectional area of the third nano-structure 3 may be less than those of the second, fifth, and sixth nano-structures 2, 5, and 6, and the first, fourth, seventh, eighth, and ninth nano-structures 1, 4, 7, 8, and 9 may have the smallest cross-sectional areas.
Therefore, pairs of two nano-structures corresponding to each other in the first direction in the second pixel corresponding region 152, that is, at least one pair from among the pairs of the first and third nano-structures 1 and 3, the fourth and sixth nano-structures 4 and 6, and the seventh and ninth nano-structures 7 and 9 may have different cross-sectional areas from each other. Also, from among the pairs of two nano-structures corresponding to each other in the second direction in the second pixel corresponding region 152, that is, the first and seventh nano-structures 1 and 7, the second and eighth nano-structures 2 and 8, and the third and ninth nano-structures 3 and 9, at least one pair may have different cross-sectional areas from each other. Also, from among the pairs of two nano-structures corresponding to each other in the second diagonal direction D2 in the second pixel corresponding region 152, that is, the second and fourth nano-structures 2 and 4, the third and seventh nano-structures 3 and 7, and the sixth and eighth nano-structures 6 and 8, at least one pair may have different cross-sectional areas from each other. On the contrary, the pairs of two nano-structures corresponding to each other in the first diagonal direction D1 in the second pixel corresponding region 152, that is, the first and ninth nano-structures 1 and 9, the second and sixth nano-structures 2 and 6, and the fourth and eighth nano-structures 4 and 8 may have the same cross-sectional areas.
From among the nano-structures NP in the fourth pixel corresponding region 154, the fourth, fifth, and eighth nano-structures 4, 5, and 8 have largest cross-sectional areas, the cross-sectional area of the seventh nano-structure 7 may be less than those of the fourth, fifth, and eighth nano-structures 4, 5, and 8, and the first, second, third, sixth, and ninth nano-structures 1, 2, 3, 6, and 9 may have smallest cross-sectional areas. Therefore, pairs of two nano-structures corresponding to each other in the first direction in the fourth pixel corresponding region 154, that is, at least one pair from among the pairs of the first and third nano-structures 1 and 3, the fourth and sixth nano-structures 4 and 6, and the seventh and ninth nano-structures 7 and 9 may have different cross-sectional areas from each other. Also, from among the pairs of two nano-structures corresponding to each other in the second direction in the fourth pixel corresponding region 154, that is, the first and seventh nano-structures 1 and 7, the second and eighth nano-structures 2 and 8, and the third and ninth nano-structures 3 and 9, at least one pair may have different cross-sectional areas from each other. Also, from among the pairs of two nano-structures corresponding to each other in the second diagonal direction D2 in the fourth pixel corresponding region 154, that is, the second and fourth nano-structures 2 and 4, the third and seventh nano-structures 3 and 7, and the sixth and eighth nano-structures 6 and 8, at least one pair may have different cross-sectional areas from each other. On the contrary, the pairs of two nano-structures corresponding to each other in the first diagonal direction D1 in the fourth pixel corresponding region 154, that is, the first and ninth nano-structures 1 and 9, the second and sixth nano-structures 2 and 6, and the fourth and eighth nano-structures 4 and 8 may have the same cross-sectional areas.
Also, the cross-sectional area of each of the plurality of nano-structures NP in the second pixel corresponding region 152 may be equal to that of the nano-structure NP in the fourth pixel corresponding region 154 which is positioned symmetrically with the nano-structures NP in the second pixel corresponding region 152 in the first diagonal direction D1. For example, the cross-sectional area of the first nano-structure 1 in the second pixel corresponding region 152 and the cross-sectional area of the first nano-structure 1 in the fourth pixel corresponding region 154 are the same as each other, the cross-sectional area of the second nano-structure 2 in the second pixel corresponding region 152 and the cross-sectional area of the fourth nano-structure 4 in the fourth pixel corresponding region 154 are the same as each other, the cross-sectional area of the third nano-structure 3 in the second pixel corresponding region 152 and the cross-sectional area of the seventh nano-structure 7 in the fourth pixel corresponding region 154 are the same as each other, the cross-sectional area of the fourth nano-structure 4 in the second pixel corresponding region 152 and the cross-sectional area of the second nano-structure 2 in the fourth pixel corresponding region 154 are the same as each other, the cross-sectional area of the fifth nano-structure 5 in the second pixel corresponding region 152 and the cross-sectional area of the fifth nano-structure 5 in the fourth pixel corresponding region 154 are the same as each other, the cross-sectional area of the sixth nano-structure 6 of the second pixel corresponding region 152 and the cross-sectional area of the eighth nano-structure 8 in the fourth pixel corresponding region 154 are the same as each other, the cross-sectional area of the seventh nano-structure 7 in the second pixel corresponding region 152 and the cross-sectional area of the third nano-structure 3 in the fourth pixel corresponding region 154 are the same as each other, the cross-sectional area of the eighth nano-structure 8 in the second pixel corresponding region 152 and the cross-sectional area of the sixth nano-structure 6 in the fourth pixel corresponding region 154 are the same as each other, and the cross-sectional area of the ninth nano-structure 9 in the second pixel corresponding region 152 and the cross-sectional area of the ninth nano-structure 9 in the fourth pixel corresponding region 154 are the same as each other.
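The symmetry relations described above for the 3×3 example can be verified mechanically. In the sketch below, the cross-sectional areas are replaced by ranks (3 = largest, 2 = intermediate, 1 = smallest), which is sufficient to check the stated pair relations and the mirror symmetry between the second pixel corresponding region 152 and the fourth pixel corresponding region 154; the ranks are not real dimensions.

```python
import numpy as np

# Positions follow the text: nano-structures 1-3 in the first row, 4-6 in the
# second row, and 7-9 in the third row of each 3x3 pixel corresponding region.

second_region = np.array([[1, 3, 2],     # nano-structures 1, 2, 3
                          [1, 3, 3],     # nano-structures 4, 5, 6
                          [1, 1, 1]])    # nano-structures 7, 8, 9

fourth_region = np.array([[1, 1, 1],
                          [3, 3, 1],
                          [2, 3, 1]])

# Pairs corresponding to each other in the first diagonal direction D1
# ((1,9), (2,6), (4,8)) have equal ranks in both regions ...
d1_pairs = [((0, 0), (2, 2)), ((0, 1), (1, 2)), ((1, 0), (2, 1))]
assert all(second_region[a] == second_region[b] for a, b in d1_pairs)
assert all(fourth_region[a] == fourth_region[b] for a, b in d1_pairs)

# ... while at least one pair corresponding to each other in the second diagonal
# direction D2 ((2,4), (3,7), (6,8)) differs.
d2_pairs = [((0, 1), (1, 0)), ((0, 2), (2, 0)), ((1, 2), (2, 1))]
assert any(second_region[a] != second_region[b] for a, b in d2_pairs)
assert any(fourth_region[a] != fourth_region[b] for a, b in d2_pairs)

# The two distributions are mirror images of each other in D1: nano-structure k of
# one region matches the diagonally mirrored position of the other region.
assert np.array_equal(fourth_region, second_region.T)
print("3x3 symmetry relations hold for this ranking")
```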
Referring to
Referring to
Referring to
In addition,
When the nano-structures NP are disposed in the 4×4 array, the arrangement rule of the nano-structures NP described above with reference to
The cross-sectional areas of the plurality of nano-structures NP in the second pixel corresponding region 152 and the fourth pixel corresponding region 154 may be distributed asymmetrically in the first direction, the second direction, and the first diagonal direction D1 and symmetrically in the second diagonal direction D2. Also, the distribution of the cross-sectional areas of the plurality of nano-structures NP in the second pixel corresponding region 152 and the distribution of the cross-sectional areas of the plurality of nano-structures NP in the fourth pixel corresponding region 154 may be mirror-symmetrical with each other in the first diagonal direction D1. For example, from among the plurality of nano-structures NP in the second pixel corresponding region 152 and the fourth pixel corresponding region 154, four nano-structures located at each center portion and four nano-structures disposed around the centers in the two sides facing the first pixel corresponding region 151 may have first cross-sectional areas that are the largest. Also, one nano-structure disposed at an apex facing two first pixel corresponding regions 151 may have a second cross-sectional area that is less than the first cross-sectional area. Also, from among the plurality of nano-structures NP in the second pixel corresponding region 152 and the fourth pixel corresponding region 154, four nano-structures disposed around the centers of two sides facing the third pixel corresponding region 153 and two nano-structures disposed at two apexes facing the first pixel corresponding region 151 and the third pixel corresponding region 153 may each have a third cross-sectional area that is less than the second cross-sectional area. Also, one nano-structure disposed at an apex facing two third pixel corresponding regions 153 may have a fourth cross-sectional area less than the third cross-sectional area. For example, the fourth cross-sectional area may be 0 (zero).
The cross-sectional areas of the nano-structures NP shown in the example of
Referring to
Conditions may be selected so that the second pixel corresponding region 152 and the fourth pixel corresponding region 154 are asymmetrical with each other in the first direction, the second direction, and the first diagonal direction D1 and symmetrical in the second diagonal direction D2. Also, the conditions may be selected so that the second pixel corresponding region 152 and the fourth pixel corresponding region 154 may be mirror-symmetrical with each other in the first diagonal direction D1. For example, the same variable wg may be assigned to four nano-structures located at the center portions of the second pixel corresponding region 152 and the fourth pixel corresponding region 154, the same variable w4 may be assigned to four nano-structures disposed around the centers of two sides facing the first pixel corresponding region 151, the same variable w3 may be assigned to four nano-structures disposed around the centers of two sides facing the third pixel corresponding region 153, the same variable w7 may be assigned to two nano-structures disposed at two apexes facing the first pixel corresponding region 151 and the third pixel corresponding region 153, the variable w6 may be assigned to one nano-structure disposed at the apex facing two first pixel corresponding regions 151, and the variable w5 may be assigned to one nano-structure disposed at the apex facing two third pixel corresponding regions 153. In the second pixel corresponding region 152 and the fourth pixel corresponding region 154, the nano-structures NP to which the same variable is assigned may have the same cross-sectional areas. Depending on the calculation result, two or more variables from among wg, w3, w4, w5, w6, and w7 may happen to have the same value.
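A sketch of the shared-variable parameterization described above is given below. The 4×4 placement of the variables is a hypothetical orientation chosen only for illustration (here the sides of region 152 facing the first pixel corresponding region 151 are taken to be its left and lower sides); the point is that building region 154 as the diagonal mirror of region 152 enforces the stated mirror symmetry for any values the optimization assigns to wg, w3, w4, w5, w6, and w7. The example values follow the size ordering given above (first > second > third > fourth cross-sectional areas, with the fourth possibly zero).

```python
import numpy as np

# Hypothetical 4x4 variable layout for the second pixel corresponding region 152.
layout_152 = np.array([
    ["w7", "w3", "w3", "w5"],
    ["w4", "wg", "wg", "w3"],
    ["w4", "wg", "wg", "w3"],
    ["w6", "w4", "w4", "w7"],
])
layout_154 = layout_152.T          # mirror across the first diagonal direction D1

# Example relative values consistent with the ordering described above (ranks, not dimensions).
values = {"wg": 4.0, "w4": 4.0, "w6": 3.0, "w3": 2.0, "w7": 2.0, "w5": 0.0}

def areas(layout, values):
    """Map a layout of variable names to a numeric cross-sectional-area map."""
    return np.vectorize(values.get)(layout)

a152, a154 = areas(layout_152, values), areas(layout_154, values)
assert np.array_equal(a154, a152.T)    # mirror-symmetrical in D1 by construction
print(a152)
print(a154)
```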
So far, the arrangements of the plurality of nano-structures NP in the first pixel corresponding region 151 and the third pixel corresponding region 153 have been described as being symmetrical in the first direction, the second direction, the first diagonal direction D1, and the second diagonal direction D2, in consideration only of the green pixels around the red pixel R and the blue pixel B. However, because the blue pixels B are disposed in the first diagonal direction D1 of the blue pixel B and the red pixels R are disposed in the second diagonal direction D2 of the blue pixel B, whereas the red pixels R are disposed in the first diagonal direction D1 of the red pixel R and the blue pixels B are disposed in the second diagonal direction D2 of the red pixel R, there may be a slight difference between the arrangements of the pixels around the red pixel R and around the blue pixel B. In further consideration of this difference in the arrangements, the arrangements of the plurality of nano-structures NP in the first pixel corresponding region 151 and the third pixel corresponding region 153 may be modified.
Referring to
Therefore, in the first pixel corresponding region 151 and the third pixel corresponding region 153, at least one of the pairs of two nano-structures corresponding to each other in the first direction may have different cross-sectional areas and at least one of the pairs of two nano-structures corresponding to each other in the second direction may have different cross-sectional areas. Also, in the first pixel corresponding region 151 and the third pixel corresponding region 153, the pairs of two nano-structures corresponding to each other in the first diagonal direction D1 may have the same cross-sectional areas, and the pairs of two nano-structures corresponding to each other in the second diagonal direction D2 may have the same cross-sectional areas.
Referring to
Referring to
Referring to
The image sensor 1000 may have improved light utilization efficiency. As the light utilization efficiency is improved, the size of one pixel or of the independent photosensitive cells in a pixel of the image sensor 1000 may be reduced. Therefore, an ultra-fine-pixel image sensor 1000 having a high resolution may be provided. According to the embodiment, the nano-photonic lens array 150 may be applied to the image sensor 1000 along with a pixel arrangement that does not need a demosaic process when a binning mode is used. Therefore, the image sensor 1000 may exhibit increased sensitivity, improved resolution, and reduced power consumption under low illuminance, while having ultra-fine pixels. The image sensor 1000 according to the embodiment may form a camera module along with a module lens having various functions and may be utilized in various electronic devices.
The processor ED20 may control one or more elements (hardware, software elements, etc.) of the electronic apparatus ED01 connected to the processor ED20 by executing software (program ED40, etc.), and may perform various data processes or operations. As a portion of the data processing or operations, the processor ED20 may load a command and/or data received from another element (sensor module ED76, communication module ED90, etc.) to a volatile memory ED32, may process the command and/or data stored in the volatile memory ED32, and may store result data in a non-volatile memory ED34. The non-volatile memory ED34 may include an internal memory ED36 and an external memory ED38. The processor ED20 may include a main processor ED21 (central processing unit, application processor, etc.) and an auxiliary processor ED23 (graphic processing unit, image signal processor, sensor hub processor, communication processor, etc.) that may be operated independently from or along with the main processor ED21. The auxiliary processor ED23 may use less power than that of the main processor ED21, and may perform specified functions.
The auxiliary processor ED23, on behalf of the main processor ED21 while the main processor ED21 is in an inactive state (sleep state) or along with the main processor ED21 while the main processor ED21 is in an active state (application executed state), may control functions and/or states related to some (display device ED60, sensor module ED76, communication module ED90, etc.) of the elements in the electronic apparatus ED01. The auxiliary processor ED23 (image signal processor, communication processor, etc.) may be implemented as a part of another element (camera module ED80, communication module ED90, etc.) that is functionally related thereto.
The memory ED30 may store various data required by the elements (processor ED20, sensor module ED76, etc.) of the electronic apparatus ED01. The data may include, for example, input data and/or output data about software (program ED40, etc.) and commands related thereto. The memory ED30 may include the volatile memory ED32 and/or the non-volatile memory ED34.
The program ED40 may be stored as software in the memory ED30, and may include an operation system ED42, middleware ED44, and/or an application ED46.
The input device ED50 may receive commands and/or data to be used in the elements (processor ED20, etc.) of the electronic apparatus ED01, from outside (user, etc.) of the electronic apparatus ED01. The input device ED50 may include a microphone, a mouse, a keyboard, and/or a digital pen (stylus pen).
The sound output device ED55 may output a sound signal to outside of the electronic apparatus ED01. The sound output device ED55 may include a speaker and/or a receiver. The speaker may be used for a general purpose such as multimedia reproduction or record play, and the receiver may be used to receive a call. The receiver may be coupled as a part of the speaker or may be implemented as an independent device.
The display device ED60 may provide visual information to outside of the electronic apparatus ED01. The display device ED60 may include a display, a hologram device, or a projector, and a control circuit for controlling the corresponding device. The display device ED60 may include a touch circuitry set to sense a touch, and/or a sensor circuit (pressure sensor, etc.) that is set to measure a strength of a force generated by the touch.
The audio module ED70 may convert sound into an electrical signal or vice versa. The audio module ED70 may acquire sound through the input device ED50, or may output sound via the sound output device ED55 and/or a speaker and/or a headphone of another electronic apparatus (the electronic apparatus ED02, etc.) connected directly or wirelessly to the electronic apparatus ED01.
The sensor module ED76 may sense an operating state (power, temperature, etc.) of the electronic apparatus ED01 or an external environmental state (a user state, etc.), and may generate an electrical signal and/or a data value corresponding to the sensed state. The sensor module ED76 may include a gesture sensor, a gyro sensor, a pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, and/or an illuminance sensor.
The interface ED77 may support one or more designated protocols that may be used in order for the electronic apparatus ED01 to be directly or wirelessly connected to another electronic apparatus (electronic apparatus ED02, etc.). The interface ED77 may include a high-definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, and/or an audio interface.
The connection terminal ED78 may include a connector by which the electronic apparatus ED01 may be physically connected to another electronic apparatus (electronic apparatus ED02, etc.). The connection terminal ED78 may include an HDMI connector, a USB connector, an SD card connector, and/or an audio connector (headphone connector, etc.).
The haptic module ED79 may convert an electrical signal into a mechanical stimulation (vibration, motion, etc.) or an electrical stimulation that the user may sense through a tactile or motion sensation. The haptic module ED79 may include a motor, a piezoelectric device, and/or an electric stimulus device.
The camera module ED80 may capture a still image and a video. The camera module ED80 may include a lens assembly 1110 including one or more lenses, the image sensor 1000 described above, the flash 1120, the image stabilizer 1140, the memory 1150, and/or the image signal processor 1160.
The power management module ED88 may manage the power supplied to the electronic apparatus ED01. The power management module ED88 may be implemented as a part of a power management integrated circuit (PMIC).
The battery ED89 may supply electric power to components of the electronic apparatus ED01. The battery ED89 may include a primary battery that is not rechargeable, a secondary battery that is rechargeable, and/or a fuel cell.
The communication module ED90 may support the establishment of a direct (wired) communication channel and/or a wireless communication channel between the electronic apparatus ED01 and another electronic apparatus (the electronic apparatus ED02, the electronic apparatus ED04, the server ED08, etc.), and the execution of communication through the established communication channel. The communication module ED90 may operate independently from the processor ED20 (an application processor, etc.) and may include one or more communication processors that support direct communication and/or wireless communication. The communication module ED90 may include a wireless communication module ED92 (a cellular communication module, a short-range wireless communication module, a global navigation satellite system (GNSS) communication module, etc.) and/or a wired communication module ED94 (a local area network (LAN) communication module, a power line communication module, etc.). From among these communication modules, a corresponding communication module may communicate with another electronic apparatus via a first network ED98 (a short-range communication network such as Bluetooth, Wi-Fi Direct, or infrared data association (IrDA)) or a second network ED99 (a long-range communication network such as a cellular network, the Internet, or a computer network (LAN, WAN, etc.)). These various kinds of communication modules may be integrated as one element (a single chip, etc.) or may be implemented as a plurality of elements (a plurality of chips) separate from one another. The wireless communication module ED92 may identify and authenticate the electronic apparatus ED01 in a communication network such as the first network ED98 and/or the second network ED99 by using subscriber information (an international mobile subscriber identifier (IMSI), etc.) stored in a subscriber identification module ED96.
The antenna module ED97 may transmit a signal and/or power to the outside (another electronic apparatus, etc.) or receive a signal and/or power from the outside. An antenna may include a radiator formed as a conductive pattern on a substrate (a printed circuit board (PCB), etc.). The antenna module ED97 may include one or more antennas. When the antenna module ED97 includes a plurality of antennas, an antenna suitable for the communication type used in the communication network, such as the first network ED98 and/or the second network ED99, may be selected from among the plurality of antennas by the communication module ED90. The signal and/or the power may be transmitted between the communication module ED90 and another electronic apparatus via the selected antenna. A component (a radio frequency integrated circuit (RFIC), etc.) other than the antenna may be included as a part of the antenna module ED97.
Some of the elements may be connected to one another via a communication method used among peripheral devices (a bus, general-purpose input/output (GPIO), a serial peripheral interface (SPI), a mobile industry processor interface (MIPI), etc.) and may exchange signals (commands, data, etc.).
A command or data may be transmitted or received between the electronic apparatus ED01 and the external electronic apparatus ED04 via the server ED08 connected to the second network ED99. The other electronic apparatuses ED02 and ED04 may be devices of the same kind as, or a different kind from, the electronic apparatus ED01. All or some of the operations executed in the electronic apparatus ED01 may be executed in one or more of the other electronic apparatuses ED02, ED04, and ED08. For example, when the electronic apparatus ED01 has to perform a certain function or service, the electronic apparatus ED01 may request one or more other electronic apparatuses to perform a part or the entirety of the function or service, instead of executing the function or service by itself. The one or more electronic apparatuses receiving the request may execute an additional function or service related to the request and may transfer a result of the execution to the electronic apparatus ED01. To this end, cloud computing, distributed computing, or client-server computing techniques may be used, for example.
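A minimal sketch of the offloading pattern described above, assuming hypothetical callables standing in for the peer apparatuses reachable over the network; it is illustrative only and not the disclosed mechanism.

```python
import concurrent.futures

def perform(function, payload, peers):
    """Request hypothetical peer devices to perform the work and fall back
    to local execution if none of them returns a result."""
    with concurrent.futures.ThreadPoolExecutor() as pool:
        futures = [pool.submit(peer, payload) for peer in peers]
        for fut in concurrent.futures.as_completed(futures):
            try:
                return fut.result()      # use the first successful peer result
            except Exception:
                continue                 # that peer failed; try the next one
    return function(payload)             # no peer succeeded: execute locally
```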
The flash 1120 may emit light that is used to strengthen the light emitted or reflected from an object. The flash 1120 may emit visible light or infrared light. The flash 1120 may include one or more light-emitting diodes (a red-green-blue (RGB) LED, a white LED, an infrared LED, an ultraviolet LED, etc.), and/or a xenon lamp. The image sensor 1000 may be the image sensor described above.
The image stabilizer 1140, in response to a motion of the camera module ED80 or the electronic apparatus ED01 including the camera module ED80, may move one or more lenses included in the lens assembly 1110 or the image sensor 1000 in a certain direction, or may control the operating characteristics of the image sensor 1000 (adjustment of a read-out timing, etc.), in order to compensate for a negative influence of the motion. The image stabilizer 1140 may sense the movement of the camera module ED80 or the electronic apparatus ED01 by using a gyro sensor (not shown) or an acceleration sensor (not shown) disposed inside or outside the camera module ED80. The image stabilizer 1140 may be implemented as an optical image stabilizer.
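The motion-compensation idea may be illustrated with a minimal sketch, assuming a small-angle model in which camera rotation measured by the gyro sensor maps to an image-plane shift; the function name, units, and parameters are hypothetical and do not represent the disclosed stabilizer.

```python
import numpy as np

def stabilization_offset(gyro_rate_dps: np.ndarray,
                         exposure_s: float,
                         focal_length_px: float) -> np.ndarray:
    """Estimate the (x, y) pixel shift needed to counter camera rotation.

    `gyro_rate_dps` is a hypothetical 2-element array of angular rates
    (deg/s) about the yaw and pitch axes sampled during the exposure; a
    small-angle approximation maps the rotation to an image translation
    that a lens or sensor shift (or read-out adjustment) can counteract.
    """
    angle_rad = np.deg2rad(gyro_rate_dps) * exposure_s   # accumulated rotation
    return focal_length_px * np.tan(angle_rad)           # shift in pixels
```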
The memory 1150 may store some or all of the data of an image obtained through the image sensor 1000 for a subsequent image processing operation. For example, when a plurality of images are obtained at a high speed, the obtained original data (Bayer-patterned data, high-resolution data, etc.) may be stored in the memory 1150 while only a low-resolution image is displayed, and the original data of a selected image (a user selection, etc.) may then be transferred to the image signal processor 1160. The memory 1150 may be integrated with the memory ED30 of the electronic apparatus ED01, or may include an additional memory that operates independently.
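The buffering workflow described above may be sketched as follows, purely for illustration; the class, its method names, and the downscaling callable are hypothetical.

```python
from collections import OrderedDict

class CaptureBuffer:
    """Keep full-resolution originals keyed by frame index, expose only
    low-resolution previews for display, and release an original once a
    frame is selected for the image signal processor."""

    def __init__(self, downscale):
        self._originals = OrderedDict()
        self._downscale = downscale            # callable producing a preview

    def store(self, index, raw_frame):
        self._originals[index] = raw_frame     # keep the original data
        return self._downscale(raw_frame)      # only the preview is displayed

    def select(self, index):
        # Hand the chosen original off for further image processing.
        return self._originals.pop(index)
```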
The image signal processor 1160 may perform image treatments on an image obtained through the image sensor 1000 or on image data stored in the memory 1150. The image treatments may include depth map generation, three-dimensional modeling, panorama generation, feature extraction, image combination, and/or image compensation (noise reduction, resolution adjustment, brightness adjustment, blurring, sharpening, softening, etc.). The image signal processor 1160 may perform controlling (exposure time control, read-out timing control, etc.) of the elements (the image sensor 1000, etc.) included in the camera module ED80. The image processed by the image signal processor 1160 may be stored again in the memory 1150 for additional processing, or may be provided to an element outside the camera module ED80 (e.g., the memory ED30, the display device ED60, the electronic apparatus ED02, the electronic apparatus ED04, the server ED08, etc.). The image signal processor 1160 may be integrated with the processor ED20, or may be configured as an additional processor that operates independently of the processor ED20. When the image signal processor 1160 is configured as a processor separate from the processor ED20, the image processed by the image signal processor 1160 may undergo additional image treatment by the processor ED20 and may then be displayed on the display device ED60.
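A minimal sketch of how the listed treatments could be chained, assuming each treatment is a callable that takes and returns an image array; the stage names and ordering are illustrative, not the disclosed processing order.

```python
def run_treatments(image, stages):
    """Apply image treatments in sequence (e.g., noise reduction,
    brightness adjustment, sharpening) and return the processed image."""
    for stage in stages:
        image = stage(image)
    return image

# Hypothetical usage with two toy treatments on a NumPy image array:
# processed = run_treatments(raw, [lambda im: im * 1.1, lambda im: im.clip(0, 255)])
```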
Also, the image signal processor 1160 may receive two output signals independently from adjacent photosensitive cells in each pixel or sub-pixel of the image sensor 1000, and may generate an auto-focusing signal from a difference between the two output signals. The image signal processor 1160 may control the lens assembly 1110, based on the auto-focusing signal, so that the focus of the lens assembly 1110 is accurately formed on the surface of the image sensor 1000.
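One common way to turn the difference between the two sub-pixel outputs into a focus-driving quantity is to estimate their relative shift; the sketch below does this by cross-correlating two equal-length intensity profiles and is illustrative only, not the method specified in the disclosure.

```python
import numpy as np

def autofocus_signal(left: np.ndarray, right: np.ndarray) -> float:
    """Return a signed phase disparity between two equal-length profiles
    read independently from adjacent photosensitive cells; driving the
    lens so the disparity approaches zero brings the image into focus."""
    left = left - left.mean()                        # remove the common offset
    right = right - right.mean()
    corr = np.correlate(left, right, mode="full")    # cross-correlation
    return float(np.argmax(corr) - (len(left) - 1))  # shift relative to zero lag
```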
The electronic apparatus ED01 may further include one or a plurality of camera modules having different properties or functions. Such a camera module may include elements similar to those of the camera module ED80 described above.
While the image sensor including the nano-photonic lens array and the electronic apparatus including the image sensor have been particularly shown and described with reference to example embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope as defined by the following claims. The embodiments should be considered in a descriptive sense only and not for purposes of limitation. Therefore, the scope of the disclosure is defined not by the detailed description of the disclosure but by the appended claims, and all differences within the scope will be construed as being included in the present disclosure.
According to the embodiments, the nano-photonic lens array performing both the color-separation and light-condensing functions is used, and thus the light utilization efficiency may be improved. As the light utilization efficiency is improved, the size of one pixel, or of the independent photosensitive cells in a pixel, of the image sensor may be reduced. Therefore, an ultra-fine pixel image sensor having a high resolution may be provided.
Also, according to the embodiments, the nano-photonic lens array may be applied to an image sensor having a new pixel arrangement that does not require a demosaic process when using the binning mode. Therefore, the image sensor including the nano-photonic lens array may exhibit increased sensitivity, improved resolution, and reduced power consumption under low illuminance, while having ultra-fine pixels.
It should be understood that embodiments described herein should be considered in a descriptive sense only and not for purposes of limitation. Descriptions of features or aspects within each embodiment should typically be considered as available for other similar features or aspects in other embodiments. While one or more embodiments have been described with reference to the figures, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope as defined by the following claims.
Number            Date        Country   Kind
10-2023-0019538   Feb. 2023   KR        national