The present disclosure generally concerns the field of electronic circuits and, more particularly, an optical sensor formed inside and on top of a semiconductor substrate.
An optical sensor generally comprises a plurality of pixels each comprising a photodetector capable of generating an electric signal representative of the intensity of a light radiation that it receives.
So-called multispectral optical sensors, comprising a plurality of pixel types respectively provided with different optical filters in order to measure radiation intensities in different wavelength ranges, are more particularly considered here. The present application however also concerns so-called monochromatic sensors, where the different pixels measure intensities of a radiation received in a same wavelength range.
It would be desirable to at least partly improve certain aspects of known optical sensors.
An embodiment provides an optical sensor comprising one or a plurality of pixels, each comprising a photodetector and a telecentric system topping the photodetector. Each telecentric system comprises: an opaque layer comprising one or a plurality of openings facing the photodetector; and a microlens facing each opening and arranged between the opaque layer and the photodetector.
According to an embodiment, each pixel comprises an optical filter between the microlenses and the photodetector.
According to an embodiment, the optical filter comprises an interference filter, a diffraction grating-based filter, or a metasurface-based filter.
According to an embodiment, the microlenses have a diameter greater than the diameter of the openings.
According to an embodiment, each microlens comprises at least one planar surface.
According to an embodiment, the planar surfaces of the microlenses are coplanar.
According to an embodiment, the microlenses are laterally separated by opaque walls.
According to an embodiment, the telecentric system comprises at least one other microlens facing each microlens and positioned between the microlenses and the photodetector.
According to an embodiment, each microlens is a planar lens.
According to an embodiment, each planar lens comprises a plurality of pads made of a first material, having a first optical index, and surrounded with a second material, having a second optical index different from the first index.
The foregoing features and advantages, as well as others, will be described in detail in the following description of specific embodiments given by way of illustration and not limitation with reference to the accompanying drawings, in which:
Like features have been designated by like references in the various figures. In particular, the structural and/or functional features that are common among the various embodiments may have the same references and may have identical structural, dimensional, and material properties.
For the sake of clarity, only the steps and elements that are useful for an understanding of the embodiments described herein have been illustrated and described in detail. In particular, the forming of the described sensors has not been detailed, the forming of such sensors being within the abilities of those skilled in the art based on the indications of the present description. Further, the applications capable of taking advantage of the described sensors have not been detailed, the described embodiments being compatible with all or most of the usual applications of optical sensors.
Unless indicated otherwise, when reference is made to two elements connected together, this signifies a direct connection without any intermediate elements other than conductors, and when reference is made to two elements coupled together, this signifies that these two elements can be connected or they can be coupled via one or more other elements.
In the following disclosure, unless otherwise specified, when reference is made to absolute positional qualifiers, such as the terms "front", "back", "top", "bottom", "left", "right", etc., or to relative positional qualifiers, such as the terms "above", "below", "upper", "lower", etc., or to qualifiers of orientation, such as "horizontal", "vertical", etc., reference is made to the orientation shown in the figures.
Unless specified otherwise, the expressions "around", "approximately", "substantially" and "in the order of" signify within 10%, and preferably within 5%.
In the following description, unless specified otherwise, a layer or a film is called opaque to a radiation when the transmittance of the radiation through the layer or the film is smaller than 10%. In the following description, a layer or a film is called transparent to a radiation when the transmittance of the radiation through the layer or the film is greater than 10%.
In the following description, "visible light" designates an electromagnetic radiation having a wavelength in the range from 400 nm to 700 nm. In this range, red light designates an electromagnetic radiation having a wavelength in the range from 600 nm to 700 nm, blue light designates an electromagnetic radiation having a wavelength in the range from 400 nm to 500 nm, and green light designates an electromagnetic radiation having a wavelength in the range from 500 nm to 600 nm. Further, infrared light designates an electromagnetic radiation having a wavelength in the range from 700 nm to 1 mm.
Sensor 11 comprises a plurality of pixels 13, for example, organized in the form of an array of rows and columns. In the example of
Pixels 13, for example, all have, in top view, the same shape and the same dimensions, to within manufacturing dispersions. In the example shown in
Sensor 11 is, for example, an ambient luminosity sensor, comprising a plurality of pixels 13 adapted to respectively measure the intensity of the ambient light in different wavelength ranges.
Each pixel 13 comprises a photodetector 15 and an optical filter 17 coating a surface of exposure to light of photodetector 15 (upper surface in the orientation of the view shown in
Photodetectors 15 are, for example, configured for sensing all or part of a light radiation illuminating sensor 11. As an example, photodetectors 15 are configured for detecting all the visible radiation and/or the infrared radiation. As an example, the photodetectors 15 of a same sensor 11 are all identical, to within manufacturing differences. Photodetectors 15 are, for example, monolithically integrated inside and on top of a same semiconductor substrate, for example a silicon substrate. As an example, each photodetector 15 comprises a photosensitive semiconductor area integrated in the semiconductor substrate. Photodetectors 15 are, for example, photodiodes.
Filters 17, for example, comprise interference filters, each corresponding to a multilayer stack having its materials and thicknesses selected to control the wavelength of the radiation crossing it. As a variant, filters 17 may be filters based on diffraction gratings, on metasurfaces, or on any other adapted type of filter. The filters 17 of different pixels 13 of sensor 11, for example, have different responses. More particularly, the filters 17 of different pixels 13, for example, transmit radiations in different wavelength ranges, so that the different pixels 13 measure radiations in different wavelength ranges.
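As a point of reference for how layer thicknesses relate to the selected wavelength, a classical interference filter building block is the quarter-wave stack, in which each layer has an optical thickness of one quarter of the design wavelength, that is, a physical thickness of λ0/(4n). The short sketch below is illustrative only; the materials and the 550 nm design wavelength are assumptions and are not taken from the present disclosure.

```python
# Quarter-wave layer thicknesses for an illustrative high/low index pair.
# Materials and design wavelength are assumptions, not taken from the disclosure.
design_wavelength_nm = 550.0
indices = {
    "high-index layer (e.g. TiO2, n ~ 2.4)": 2.4,
    "low-index layer (e.g. SiO2, n ~ 1.46)": 1.46,
}

for name, n in indices.items():
    thickness_nm = design_wavelength_nm / (4.0 * n)  # optical thickness = lambda0 / 4
    print(f"{name}: {thickness_nm:.0f} nm")
```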
As an example, sensor 11 comprises: one or a plurality of first pixels 13, called blue pixels, each comprising a first filter 17, called blue filter, configured for mainly transmitting blue light; one or a plurality of second pixels 13, called red pixels, each comprising a second filter 17, called red filter, configured for mainly transmitting red light; and one or a plurality of third pixels 13, called green pixels, each comprising a third filter 17, called green filter, configured for mainly transmitting green light.
As an example, sensor 11 may, in addition to the blue, red, and green pixels, comprise: one or a plurality of fourth pixels 13, called infrared pixels, each comprising a fourth filter 17, called infrared filter, configured for mainly transmitting infrared light; and/or one or a plurality of fifth pixels 13, called white pixels, each comprising a fifth filter 17, called white filter, configured for substantially transmitting all the visible light and blocking infrared light.
As a variant, other types of filters 17 configured for transmitting wavelength ranges different from those listed hereabove may be provided, to obtain signals representative of different wavelength ranges of the light spectrum received by sensor 11.
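Purely as an illustration of how such a channel set could be represented on the processing side (this representation is not part of the present disclosure), the nominal pass bands follow the wavelength ranges defined hereabove:

```python
# Illustrative channel table based on the wavelength ranges defined above, in nm.
# The channels actually present depend on the sensor variant considered.
CHANNELS = {
    "blue":     (400, 500),
    "green":    (500, 600),
    "red":      (600, 700),
    "white":    (400, 700),   # substantially all visible light, infrared blocked
    "infrared": (700, None),  # above 700 nm; upper limit set by the photodetector
}
```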
As an example, in infrared pixels, filter 17 is a bandpass interference filter only transmitting infrared radiation.
In red, green, and blue pixels, filter 17, for example, comprises a band-stop interference filter blocking the infrared radiation, topped with a colored resin filter mainly transmitting a red, respectively green, respectively blue, portion of the visible radiation.
In white pixels, filter 17, for example, comprises a band-stop interference filter blocking the infrared radiation.
The band-stop interference filter is, for example, identical (to within manufacturing dispersions) in red, green, blue, and white pixels. As an example, this filter continuously extends above the photodetectors 15 of the red, green, blue, and white pixels of sensor 11. This filter is however interrupted facing the infrared filters.
As a variant, in red, green, and blue pixels, filter 17 comprises a specific interference filter mainly transmitting a red, respectively green, respectively blue, portion of the visible radiation. In this case, the infrared band-stop interference filter may be omitted in red, green, and blue pixels.
Sensor 11 is, for example, an ambient light sensor (ALS), or a hyperspectral sensor. In such a sensor, it is not desired to obtain an image of a scene, but only a representation of the spectrum of the ambient light.
Unlike an image sensor, such a sensor is generally not topped with an external optical system for focusing the received light. Indeed, such an external optical system is generally expensive and bulky and is considered undesirable in most usual applications of ambient luminosity sensors.
In practice, the received radiations originate from different sources in the sensor environment, and thus each arrive at the sensor surface with a different angle of incidence. It is then desired to detect all the received light radiations, independently of their angle of incidence on the sensor surface.
A problem which is posed is that interference filters or, more generally, filters based on diffractive phenomena (diffraction gratings, metasurfaces, etc.) are sensitive to the angle of incidence of the light radiations which cross them. In other words, the passband or the stop-band of the sensor filters 17 varies according to the angle of incidence of the received rays. This results in inaccuracies in the representation of the ambient spectrum provided by the sensor.
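As an illustration of this angular sensitivity, the transmission peak of a thin-film interference filter shifts toward shorter wavelengths as the angle of incidence increases, following, to first order, λ(θ) ≈ λ(0)·sqrt(1 − (sin θ / n_eff)²). The sketch below is a minimal illustration of this standard relation; the 550 nm center wavelength and the effective index n_eff are assumed values, not taken from the present disclosure.

```python
import math

def filter_center_wavelength(l0_nm: float, theta_deg: float, n_eff: float) -> float:
    """First-order blue-shift of a thin-film interference filter with incidence angle."""
    theta = math.radians(theta_deg)
    return l0_nm * math.sqrt(1.0 - (math.sin(theta) / n_eff) ** 2)

# Assumed values: 550 nm filter, effective index 1.7.
for angle_deg in (0, 10, 20, 30):
    shifted = filter_center_wavelength(550.0, angle_deg, 1.7)
    print(f"{angle_deg:2d} deg -> center wavelength {shifted:.1f} nm")
```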
According to an aspect of an embodiment, it is provided, to overcome these disadvantages, to top the filter 17 of each pixel 13 with a telecentric system. In each pixel 13, the telecentric system of the pixel has the function of deviating the radiations received by the pixel so that they reach the surface of filter 17 with a controlled angle of incidence. In particular, the telecentric system makes it possible to symmetrize the light cone reaching the pixel filter 17 and to decrease the angles of incidence of the rays reaching filter 17. Thus, the sensor response is less sensitive to variations of the characteristics and/or orientations of the ambient light sources.
The sensor 20 of
In
In each pixel 13, telecentric system 19 comprises a layer 21 opaque to light radiations, for example, visible and/or infrared, received by sensor 20. Layer 21 comprises at least one through opening 23 facing optical filter 17 and the photodetector 15 of pixel 13. The portions of layer 21 delimiting opening(s) 23 form walls 25 opaque to said radiation, defining a diaphragm. Telecentric system 19 further comprises a layer 27 of at least one lens 29. In the example of
In the example of
Openings 23, for example, have, in top view, a circular shape (the term characteristic dimension of an opening 23 then designates the diameter of opening 23 in top view). Openings 23 may as a variant have a shape different from a circle, for example, a hexagonal shape, an octagonal shape, or a square shape (the term characteristic dimension of an opening 23 then designates the diameter of the circle inscribed within opening 23 in top view).
Openings 23, for example, have a characteristic dimension in the range from 1 µm to 200 µm, for example, determined according to the dimensions of photodetectors 15, to the number of openings 23 facing each photodetector, and to the incidence cone desired on the photodetector.
Walls 25 are, for example, opaque to the wavelengths detected by photodetectors 15, here the visible and/or infrared wavelengths. Walls 25 are, for example, made of an opaque resin or of metal, for example, of tungsten.
Walls 25 have a thickness that may depend on the material of layer 21. As an example, if walls 25 are made of resin, they may have a thickness in the range from 500 nm to 1 µm, and if walls 25 are made of metal, they may have a thickness in the order of 100 nm.
In the example of
Microlenses 29, and more particularly the convex surfaces of microlenses 29, are, for example, topped with a planarizing layer 32. According to an embodiment, layer 32 is a layer having its lower surface following the shape of microlenses 29 and having its upper surface substantially planar. Layer 21 is, for example, located on top of and in contact with the upper surface of layer 32. Layer 32 is made of a material transparent to the wavelengths detected by the underlying photodetectors 15. As an example, layer 32 is made of a material having an optical index (or refractive index) lower than the optical index of the material of microlenses 29. As an example, the material of layer 32 is a resin or an oxide.
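As a first-order illustration of why this index contrast matters, the paraxial refractive power of the curved interface between layer 32 and a microlens 29 is approximately (n_lens − n_32)/R, where R is the radius of curvature: a lower-index planarizing layer preserves the converging effect of the microlens, with a weaker power than the same lens in air. The values below are purely illustrative and not taken from the present disclosure.

```python
# Paraxial power of the curved microlens surface, embedded in the planarizing layer
# versus in air. Indices and radius of curvature are illustrative assumptions.
n_lens, n_layer32, radius_m = 1.9, 1.5, 8.0e-6

power_embedded = (n_lens - n_layer32) / radius_m  # curved surface facing layer 32
power_in_air = (n_lens - 1.0) / radius_m          # same lens surface facing air
print(f"embedded: {power_embedded:.0f} m^-1, in air: {power_in_air:.0f} m^-1")
```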
In the example shown in
Microlenses 29 are, for example, made of an organic resin or of a nitride, for example a silicon nitride. Microlenses 29 are preferably transparent in the wavelength range detected by photodetectors 15. Microlenses 29 are more preferably transparent in the visible and/or infrared range.
As an example, the layer 27 of microlenses 29 and the opaque layer 21 each continuously extend over substantially the entire surface of sensor 20. Further, in the example of
As an example, the microlenses 29 of sensor 20 are all identical (to within manufacturing dispersions). As a variant, microlenses 29 may have different dimensions within a pixel 13 and/or between different pixels 13 of sensor 20.
As an example, openings 23 have a characteristic dimension smaller than the diameter (in top view) of microlenses 29.
As an example, the upper surface of layer 21 is located in the object focal plane of microlenses 29 to obtain a telecentric effect. The upper surface of photodetectors 15 is, for example, but not necessarily, located in the image focal plane of microlenses 29.
Telecentric system 19 makes it possible, by the presence of the openings 23 in opaque layer 21, to filter out the radiations arriving with an angle of incidence greater than a defined angle of incidence.
Telecentric system 19 further makes it possible to focus, in a cone converging at the surface of a photodetector 15, the radiations arriving with an angle of incidence smaller than or equal to the defined angle of incidence. Telecentric system 19 in particular makes it possible to symmetrize the light cone reaching the pixel filter 17 and to decrease the angles of incidence of the rays arriving on filter 17. In practice, the width of openings 23 defines the cone angle. The smaller the openings 23, the smaller the cone angle.
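As a rough paraxial illustration of this relationship, with the diaphragm placed in the object focal plane of the microlens as described above, the half-angle of the cone reaching filter 17 is approximately arctan(d/(2f)), where d is the characteristic dimension of opening 23 and f the focal distance of microlens 29. The sketch below uses hypothetical values and ignores refraction in the intermediate layers.

```python
import math

def cone_half_angle_deg(opening_um: float, focal_um: float) -> float:
    """Paraxial half-angle of the light cone obtained with a diaphragm of width
    `opening_um` placed in the object focal plane of a microlens of focal
    distance `focal_um`."""
    return math.degrees(math.atan(opening_um / (2.0 * focal_um)))

# Hypothetical values, in micrometers: the trend (smaller opening -> smaller cone)
# is the point, not the absolute numbers.
for opening_um in (2.0, 5.0, 10.0):
    half_angle = cone_half_angle_deg(opening_um, 10.0)
    print(f"opening {opening_um:4.1f} um -> half-angle {half_angle:.1f} deg")
```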
Thus, an advantage of the embodiment illustrated in
In the above-described example, each pixel comprises a telecentric system 19 comprising a plurality of telecentric sub-systems arranged in a same plane, each telecentric sub-system comprising a stack of a diaphragm defined by an opening 23 and of a microlens 29. An advantage of this configuration is that it makes the system more compact. Indeed, the thickness of the telecentric system may be decreased with respect to a telecentric system comprising a single diaphragm and a single microlens covering the entire surface of the photodetector, particularly for photodetectors of large dimensions. The described embodiments are however not limited to this specific case. As a variant, each pixel may comprise a single telecentric sub-system, that is, a single opening 23 and a single microlens 29.
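The compactness gain can be illustrated by a simple scaling argument: if the cone angle, and hence the ratio of the focal distance to the microlens diameter, is kept constant, splitting the area above a photodetector into N × N telecentric sub-systems divides the required focal distance, and thus roughly the height of the telecentric stack, by N. The figures below are hypothetical and only illustrate this scaling.

```python
def stack_height_um(pixel_width_um: float, n_per_side: int, f_number: float) -> float:
    """Approximate height of the telecentric stack (focal distance of one microlens)
    for a pixel split into n_per_side x n_per_side sub-systems, keeping the ratio
    f/D (and therefore the cone angle) constant."""
    lens_diameter_um = pixel_width_um / n_per_side
    return f_number * lens_diameter_um

# Hypothetical 100 um wide pixel at f/2: one microlens versus a 5 x 5 array.
print(stack_height_um(100.0, 1, 2.0))  # ~200 um with a single microlens
print(stack_height_um(100.0, 5, 2.0))  # ~40 um with 25 sub-systems
```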
More particularly,
Walls 33 are made of a material opaque to the wavelengths detected by the underlying photodetectors 15, for example the visible and/or infrared wavelengths in the considered example. Walls 33 are, for example, made of the same material as walls 25. As an example, walls 33 are made of opaque resin or of metal, for example comprising tungsten.
Walls 33 are, for example, formed by etching of trenches in layers 27 and 32, and then filling of the trenches with an opaque material.
In this example, walls 33 extend between the microlenses 29 of layer 27 all along the height of layers 27 and 32 between the upper surface of layer 31 and the lower surface of walls 25. Walls 33 are, for example, located facing walls 25 but have a width (horizontal dimension in the orientation of
Walls 33 in particular make it possible to avoid optical crosstalk between two neighboring pixels 13 and/or to improve the efficiency of the telecentric system.
More particularly,
Microlenses 39 are, for example, organized according to the same array as microlenses 29, each microlens 39 being associated with a microlens 29 so that the optical axis of each microlens 39 coincides with the optical axis of the corresponding microlens 29.
As an example, microlenses 39 are made of the same material as microlenses 29.
In the example illustrated in
As a variant, microlenses 39 have a geometry different from that of microlenses 29. Microlenses 39 then have a diameter, a radius of curvature, and/or a focal distance respectively different from the diameter, the radius of curvature, and/or the focal distance of microlenses 29.
Like microlenses 29, microlenses 39 may be topped with a planarizing layer 41 similar or identical to layer 32. According to an embodiment, layer 41 has a lower surface following the shape of microlenses 39 and a substantially planar upper surface.
In the example shown in
An advantage of the embodiment illustrated in
The optical sensor 43 of
Each microlens 45 comprises a plurality of pads 47. Pads 47, for example, correspond to cylindrical structures. Pads 47, for example, all have the same thickness (vertical dimension in the orientation of
Pads 47 are, for example, made of a material transparent to the wavelengths detected by the underlying photodetector 15, for example at visible and/or infrared wavelengths, in the considered example. As an example, pads 47 are made of polysilicon or of silicon nitride. Pads 47 are laterally separated from one another by a transparent material having an optical index different from that of pads 47, for example, a material having an optical index smaller than that of pads 47, for example a silicon oxide.
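As an illustration of how such pads can implement a lens function, a common planar-lens approach, which is not necessarily the one used for microlenses 45, imposes a hyperbolic phase profile φ(r) = (2π/λ)(f − √(r² + f²)), wrapped to 2π, by locally varying the pad fill factor and thus the effective optical index seen by the wave. The sketch below uses a crude volume-averaged effective index and hypothetical dimensions; actual designs rely on full electromagnetic simulation.

```python
import math

def target_phase(r_um: float, wavelength_um: float, focal_um: float) -> float:
    """Hyperbolic phase profile of a focusing planar lens, wrapped to [0, 2*pi)."""
    phi = (2 * math.pi / wavelength_um) * (focal_um - math.sqrt(r_um ** 2 + focal_um ** 2))
    return phi % (2 * math.pi)

def fill_factor(phase: float, thickness_um: float, wavelength_um: float,
                n_pad: float = 2.0, n_gap: float = 1.45) -> float:
    """Crude mapping from a required phase delay to a pad fill factor, using a
    volume-averaged effective index (illustrative only)."""
    n_needed = n_gap + phase * wavelength_um / (2 * math.pi * thickness_um)
    n_needed = min(max(n_needed, n_gap), n_pad)  # clamp to the achievable index range
    return (n_needed - n_gap) / (n_pad - n_gap)

# Hypothetical planar lens: 0.55 um wavelength, 20 um focal distance, 0.6 um thick pads.
for r_um in (0.0, 2.0, 4.0):
    phi = target_phase(r_um, 0.55, 20.0)
    print(f"r = {r_um:3.1f} um: phase {phi:5.2f} rad, fill factor {fill_factor(phi, 0.6, 0.55):.2f}")
```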
In the example shown in
An advantage of the embodiment illustrated in
Another advantage of the embodiment illustrated in
Various embodiments and variants have been described. Those skilled in the art will understand that certain features of these various embodiments and variants may be combined, and other variants will occur to those skilled in the art. In particular, the embodiments illustrated in
Further, the described embodiments are not limited to the examples of dimensions and of materials mentioned hereabove.
Further, the above-described embodiments are not limited to the specific example of application of ambient light sensors described hereabove, but may be adapted to any optical sensor capable of taking advantage of a control of the angles of incidence of the radiations received by the photodetectors of the sensor pixels. The described embodiments are particularly advantageous for sensors comprising optical interference filters above the photodetectors of the sensor pixels. The described embodiments are however not limited to this specific case. For certain applications, for example, in the case of a monochromatic image sensor, optical filters 17 may be omitted. The described embodiments then in particular make it possible to symmetrize and to narrow the cone of incidence of the rays arriving on the pixels in order, for example, to make the illumination uniform over the entire sensor.
Finally, the practical implementation of the described embodiments and variations is within the abilities of those skilled in the art based on the functional indications given hereabove.
This application claims the priority benefit of French Application for Patent No. 2108146, filed on Jul. 27, 2021, the content of which is hereby incorporated by reference in its entirety to the maximum extent allowable by law.