This application claims the priority benefit of French Application for Patent No. 2211351, filed on Oct. 31, 2022, the content of which is hereby incorporated by reference in its entirety to the maximum extent allowable by law.
The present disclosure generally concerns electronic devices and, in particular, optoelectronic devices.
Event-based cameras, like standard cameras, comprise a plurality of pixels, each pixel being configured to deliver a value corresponding to a location in an observed scene. However, conversely to a standard camera, that is, a camera having each of its pixels configured to periodically deliver a light intensity value corresponding to the location, an event-based camera is configured to deliver information indicating a modification of the light intensity. Event-based cameras are thus configured to deliver an event-based image of a scene, while standard cameras are configured to deliver a light intensity image of a scene. Thus, a standard camera is a synchronous camera making it possible to obtain, at a frame frequency, a succession of images comprising as many intensity values as there are pixels. An event-based camera is a synchronous or asynchronous camera delivering, for each pixel, information indicating the luminosity change of the corresponding location in the scene. When the scene is motionless, a standard camera keeps on periodically delivering all the intensity values, while an event-based camera delivers no value, indicating the absence of change.
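The contrast between the two readout models can be sketched as follows. This is an illustrative sketch only, not part of the described device; the function names, threshold, and sample data are hypothetical.

```python
# Illustrative comparison (hypothetical): a standard pixel reports an
# intensity every frame, an event-based pixel reports only changes.

def standard_readout(intensities):
    """A standard pixel delivers every sample, even for a static scene."""
    return list(intensities)

def event_readout(intensities, threshold=0.0):
    """An event-based pixel delivers (index, +1/-1) only when the intensity
    changes by more than `threshold` since the last reported level."""
    events = []
    last = intensities[0]
    for i, value in enumerate(intensities[1:], start=1):
        delta = value - last
        if abs(delta) > threshold:
            events.append((i, 1 if delta > 0 else -1))
            last = value
    return events

samples = [10, 10, 10, 12, 12, 9]        # mostly static scene
print(standard_readout(samples))          # one value per frame, 6 values
print(event_readout(samples, 1.0))        # only [(3, 1), (5, -1)]
```

For a motionless scene the event readout returns an empty list, matching the behavior described above.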
There is a need in the art to overcome all or part of the disadvantages of known optoelectronic devices.
An embodiment provides a device configured to generate an event-based image comprising at least one first pixel based on quantum dots configured to deliver event-based data.
Another embodiment provides a method of controlling a device comprising at least one first pixel based on quantum dots, comprising the generation of an event-based image.
According to an embodiment, each first pixel comprises a first region of a quantum dot layer coupled to a second region of a substrate comprising a circuit for controlling the first pixel by a conductive via.
According to an embodiment, the second region comprises a portion located in front of the first region and a portion which is not in front of the first region.
According to an embodiment, the device is further configured to generate a light intensity image and comprises at least one second pixel based on quantum dots configured to deliver light intensity data.
According to an embodiment, each second pixel comprises a third region of a quantum dot layer coupled to a fourth region of a substrate comprising a circuit for controlling the second pixel by a conductive via.
According to an embodiment, the third region comprises a portion located in front of the fourth region and a portion which is not in front of the fourth region.
According to an embodiment, each first pixel is surrounded by second pixels.
According to an embodiment, the device comprises an array of assemblies of pixels, each assembly comprising one first pixel and eight second pixels surrounding the first pixel.
According to an embodiment, the device comprises an array of assemblies of pixels, each assembly comprising one first pixel and three second pixels, arranged in an array.
According to an embodiment, the first and third regions all have identical dimensions.
According to an embodiment, the device comprises an array of assemblies of pixels, each assembly comprising one first pixel and four second pixels, each second pixel having the shape of a rectangle with a beveled corner, the beveled corners of the second pixels defining the first region of the first pixel.
According to an embodiment, the device comprises an array of assemblies of pixels, each assembly comprising one first pixel and four second pixels, the first region of the first pixel having the shape of a cross separating from one another the third regions of the second pixels.
According to an embodiment, the first pixels are covered with an infrared filter and each second pixel is covered with a filter letting through a visible wavelength range.
According to an embodiment, the first pixels generate an event-based image independently from the generation of a light intensity image by the second pixels.
According to an embodiment, the generation of an event-based data element by a first pixel triggers the generation of light intensity data by at least one second pixel.
The foregoing features and advantages, as well as others, will be described in detail in the rest of the disclosure of specific embodiments given by way of illustration and not limitation with reference to the accompanying drawings, in which:
Like features have been designated by like references in the various figures. In particular, the structural and/or functional features that are common among the various embodiments may have the same references and may have identical structural, dimensional, and material properties.
For the sake of clarity, only the steps and elements that are useful for the understanding of the described embodiments have been illustrated and described in detail.
Unless indicated otherwise, when reference is made to two elements connected together, this signifies a direct connection without any intermediate elements other than conductors, and when reference is made to two elements coupled together, this signifies that these two elements can be connected or coupled via one or more other elements.
In the following description, when reference is made to terms qualifying absolute positions, such as terms “front”, “back”, “top”, “bottom”, “left”, “right”, etc., or relative positions, such as terms “above”, “under”, “upper”, “lower”, etc., or to terms qualifying directions, such as terms “horizontal”, “vertical”, etc., it is referred, unless specified otherwise, to the orientation of the drawings.
Unless specified otherwise, the expressions “about”, “approximately”, “substantially”, and “in the order of” signify plus or minus 10%, preferably plus or minus 5%.
Pixel 10 comprises a light detection element 12. Pixel 10 comprises, for example, a current-to-voltage converter 14. Pixel 10 comprises, for example, an amplifier 16. The pixel further comprises, for example, a comparator 18.
Converter 14 uses, for example, one or a plurality of transistors in subthreshold operation to perform a logarithmic conversion of the light intensity, enabling an extension of the dynamic operating range of the pixel. Amplifier 16 preferably enables amplifying the output voltage of converter 14 and defining a contrast threshold of the pixel. Comparator 18 preferably enables detecting whether the observed light variations exceed the contrast threshold, either positively, corresponding to an increase of the light intensity, or negatively, corresponding to a decrease of the light intensity.
Element 12 is, for example, a diode. Element 12 is, for example, coupled between a node 20 of application of a reference voltage, for example a voltage GND, and an input node of converter 14. For example, in the case where converter 14 captures electrons to transform the current into voltage, the anode of element 12 is coupled, for example preferably connected, to node 20 and the cathode of element 12 is coupled, for example preferably connected, to the input node of converter 14. For example, in the case where converter 14 captures holes to transform the current into voltage, the cathode of element 12 is coupled, for example preferably connected, to node 20 and the anode of element 12 is coupled, for example preferably connected, to the input node of converter 14. Converter 14 comprises an output coupled, for example preferably connected, to an input of amplifier 16. Amplifier 16 comprises an output coupled, for example preferably connected, to an input of comparator 18.
Comparator 18 is configured to deliver as an output information indicating that the light intensity measured by diode 12 has been modified. For example, comparator 18 comprises two outputs: a first output having a voltage E+ generated thereon and a second output having a voltage E− generated thereon. Voltages E+ and E− are, for example, binary signals. For example, voltage E+ takes a first value to indicate that the intensity measured by diode 12 has increased and a low value otherwise. For example, voltage E− takes a first value to indicate that the intensity measured by diode 12 has decreased and a low value otherwise. In the case of a synchronous camera, voltages E+ and E− are stored, for example, in memory cells, for example located at the output of the camera, which enables a periodic reading of the information generated by the comparator.
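The signal chain of the pixel (converter 14, amplifier 16, comparator 18) can be illustrated numerically as follows. This is a hedged sketch under stated assumptions: the gains, the contrast threshold, and the function names are illustrative and not taken from the described circuit.

```python
import math

# Hypothetical numeric model of the pixel chain: converter 14 (logarithmic
# current-to-voltage law from subthreshold transistors), then comparator 18
# producing the binary outputs E+ and E-.

def log_convert(current):
    """Converter 14: roughly logarithmic I-to-V conversion, which extends
    the dynamic operating range of the pixel."""
    return math.log(current)

def comparator(v_now, v_ref, contrast_threshold):
    """Comparator 18: E+ = 1 if the log-intensity rose past the contrast
    threshold, E- = 1 if it fell past it, (0, 0) otherwise."""
    delta = v_now - v_ref
    e_plus = 1 if delta > contrast_threshold else 0
    e_minus = 1 if delta < -contrast_threshold else 0
    return e_plus, e_minus

v_ref = log_convert(100.0)                          # last memorized level
print(comparator(log_convert(150.0), v_ref, 0.2))   # intensity up   -> (1, 0)
print(comparator(log_convert(100.0), v_ref, 0.2))   # unchanged      -> (0, 0)
print(comparator(log_convert(60.0), v_ref, 0.2))    # intensity down -> (0, 1)
```

The logarithmic conversion makes the comparison a relative (contrast) one: the same ratio of intensities trips the threshold regardless of the absolute light level.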
Pixel 22 comprises a substrate 24. Substrate 24 is, for example, a semiconductor substrate. Electronic components are located inside and on top of substrate 24. More precisely, at least part of the components forming the control circuit of pixel 22 are located inside and on top of substrate 24. For example, the components enabling the information indicating whether the measured intensity has varied to be obtained are located inside and on top of substrate 24. For example, substrate 24 comprises converter 14, amplifier 16, and comparator 18.
For example, substrate 24 comprises a region 26 inside and on top of which are located analog components and a region 28 inside and on top of which are located logic components.
Pixel 22 further comprises a layer 30 comprising quantum dots (QD). Layer 30 forms the diode 12 of
Layer 30 comprises quantum dots. The quantum dots of layer 30 are located, for example, in a layer made of a material other than a semiconductor material, for example made of an electrically-insulating material, for example made of a resin.
By quantum dot, it is meant that each quantum dot forms a quantum confinement area in all dimensions, that is, in the three dimensions of space. Each quantum dot thus preferably has dimensions, in all directions, in the order of a few tens of nanometers, in other words smaller than 100 nm, preferably in the range from 2 nm to 15 nm.
Each quantum dot comprises a core made of a semiconductor material, for example made of lead sulfide. Said core preferably has dimensions in all directions in the order of a few tens of nanometers, in other words smaller than 100 nm. Each quantum dot further comprises ligands extending from the core. The ligands are preferably made of organic aliphatic molecules or metal-organic and inorganic molecules.
Due to their net charge and to their dipole moment, the ligands modify the effective doping of the layers of quantum dots as well as their electronic affinity. For example, the ligands of the quantum dots of layer 30 may be molecules acting as N-type dopants, for example organic molecules such as thiolates.
The materials forming the quantum dots and the dimensions of each quantum dot, in particular the dimensions of the semiconductor core, determine the absorption wavelengths of the quantum dots, that is, the operating wavelengths of the diode. The operating wavelengths for example correspond to near infrared, that is, wavelengths in the range from 700 nm to 1.6 μm. The operating wavelengths may also correspond to medium infrared, that is, wavelengths in the range from 1.6 μm to 4 μm, or to the visible range, that is, wavelengths in the range from 300 nm to 700 nm.
It is possible to select an operating wavelength from a wider wavelength range than that possible with a standard diode, that is, a diode comprising no quantum dot layer. Indeed, a quantum dot layer has an absorption curve with a marked peak at a wavelength which depends on the materials of the quantum dots and which may be any wavelength of a range comprising at least the wavelengths from 300 nm to 4 μm.
Pixel 22 further comprises a conductive via 32 coupling layer 30 to substrate 24. Via 32 crosses, for example, an electrically-insulating layer, not shown, separating layer 30 from substrate 24. Via 32 comprises, for example, one end in contact with layer 30 and another end coupled to the components located inside and on top of substrate 24, for example by an interconnection network covering substrate 24. Thus, via 32 for example couples layer 30, that is, the diode 12 of
Thus, during the operation of pixel 22, the charges absorbed in layer 30 are attracted by via 32 and supplied to the components of substrate 24. More precisely, the charges located in a region surrounding the area of contact between the via and layer 30 are attracted by the via.
Event-based cameras are fast cameras. For example, such cameras may reach a speed of 100,000 frames per second (fps), while standard cameras generally operate between 60 and 120 frames per second. Event-based cameras are thus adapted to the diodes formed by quantum dots, which may be slower than standard diodes due to the low mobility of the photosensitive layers given the transport mechanism (variable-range hopping).
Pixel 34 corresponds to pixel 22 of
Pixel 36 corresponds to a standard camera pixel. Pixel 36 comprises, like pixel 34, quantum dot layer 30 forming a diode, a via 32, and substrate 24. The substrate 24 of pixel 36 comprises a region 38 inside and on top of which are located analog components and a region 40 inside and on top of which are located logic components. The components located inside and on top of the substrate 24 of pixel 36, that is, the components located inside and on top of regions 38 and 40, form the control circuit of pixel 36.
The layers 30 and the vias 32 of pixels 34 and 36 are configured so that the regions of layer 30 where the charges are attracted towards via 32 have same dimensions, in particular a same surface area in top view. In other words, the portions of layer 30 associated with pixels 34 and 36 have the same dimensions.
The control circuit of pixel 36, that is, of a pixel of a standard camera, comprises fewer components than the control circuit of pixel 34, that is, of a pixel of an event-based camera. The regions of substrate 24 associated with pixel 34, that is, regions 26 and 28, thus have greater dimensions than the regions of substrate 24 associated with pixel 36, that is, regions 38 and 40.
Thus, as shown in
Assembly 42 corresponds, for example, to a portion of an image capture device, for example of a camera. Assembly 42 comprises at least one pixel 34 and at least one pixel 36, that is, at least one event-based camera pixel and at least one standard camera pixel. Assembly 42 thus enables the obtaining of event-based data and standard luminosity data.
In the example of
The pixels of assembly 42 comprise a common layer 44. Layer 44 corresponds to the layers 30 of each pixel 34 or 36. Layer 44 is thus formed of an array of regions 46, each region 46 forming part of a pixel and corresponding to the layer 30 described in relation with
Layer 44 is preferably continuous. The regions 46 of neighboring pixels are preferably in contact and are preferably not separated. Layer 44 preferably has a substantially constant thickness. Layer 44 is preferably homogeneous, that is, made of the same material over the entire surface. Regions 46 are preferably substantially identical to one another. Regions 46 are preferably parallelepipedal, for example having a rectangular surface, for example square, in top view. Layer 44 preferably comprises substantially identical quantum dots all over the surface of the layer. In other words, the quantum dots of layer 44 are preferably made of the same materials over the entire surface. Preferably, the density of quantum dots is identical (i.e., uniform) over the entire surface. Preferably, regions 46 all have the same dimensions.
Pixel 34 is surrounded by pixels 36. The region 46 corresponding to pixel 34 is thus surrounded by regions 46 corresponding to pixels 36.
Assembly 42 comprises a substrate 48, preferably a single substrate. Substrate 48 corresponds to the substrates 24 of pixels 34 and 36. Substrate 48 comprises all the regions 26 and 28 of pixel 34 and the regions 38 and 40 of pixels 36. Substrate 48 comprises regions 50, corresponding to the substrate 24 of pixel 34, and regions 52, corresponding to the substrates 24 of pixels 36. Thus, in
As explained in relation with
Preferably, the region 46 of a pixel 34 has a surface area smaller than the surface area of region 50 in a plane parallel to the plane of layer 44, for example in the upper plane of substrate 48, that is, the plane of the surface of substrate 48 closest to layer 44. Preferably, the region 46 of a pixel 36 has a surface area greater than the surface area of region 52 in a plane parallel to the plane of layer 44, for example in the upper plane of substrate 48, that is, the plane of the surface of substrate 48 closest to layer 44. In other words, the region 50 of substrate 48 comprises portions which do not face region 46 of pixel 34. Said portions face portions of regions 46 of pixels 36. Thus, portions of the regions 46 of pixels 36 are not located in front of the regions 52 of the corresponding pixel.
Assembly 42 further comprises vias 32. Each pixel 34 or 36 comprises a via 32. Each via 32 extends from layer 44, more precisely from the region 46 corresponding to the pixel, to substrate 48, more precisely to the region 50 or 52 corresponding to the pixel. Vias 32 cross, for example, an insulating layer, not shown, extending between layer 44 and substrate 48.
Vias 32 preferably form an array of vias 32. Thus, vias 32 form rows and columns, corresponding to the rows and to the columns of the pixel array. Neighboring vias 32 of a same row or of a same column are preferably separated by a same distance.
Each via 32 is preferably located at the center of the region 46 of the corresponding pixel. The via of a pixel is, for example, not located at the center of region 50 or 52 of substrate 48.
Each region 46 corresponds, for example, to the region where the corresponding pixel recovers the charges. In other words, during the operation of the device, charges are generated in layer 44. The charges generated in a region 46 are attracted by the via 32 of the pixel corresponding to region 46. Preferably, any charge generated in layer 44 is contained in a region 46 and is thus attracted by a via 32 to be processed by the pixel control circuit.
The dimensional differences of regions 50 and 52 make it possible to form arrays of identical regions 46, in particular having identical dimensions. Indeed, the dimensions of regions 50 and 52 compensate for each other.
Device 54 comprises an array of assemblies 42. Device 54 thus comprises a plurality, nine in
In device 54, the event-based image is thus obtained by an array of pixels 34. The pixels 34 of said array are separated from one another by a distance corresponding to two pixels 36.
In device 54, the light intensity image is obtained by an array of pixels 36. The array of pixels 36 is incomplete, certain pixels being separated from a neighboring pixel 36 of a same row or of a same column by a distance different from the distance separating them from other neighboring pixels 36. This is caused by the presence of pixels 34. The light intensity value of the location of pixels 34 is obtained by interpolation, for example by calculation of the average of the light intensities of the pixels 36 of the assembly 42 comprising the corresponding pixel 34.
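The interpolation described above, averaging the eight standard pixels 36 of a 3×3 assembly 42 to estimate the intensity at the location of the event pixel 34, can be sketched as follows. The function name and sample values are hypothetical.

```python
# Hedged sketch of the interpolation at the location of an event pixel 34:
# the mean of the eight surrounding standard pixels 36 of the assembly 42.

def interpolate_center(assembly_3x3):
    """assembly_3x3: 3x3 grid of intensities; the center cell holds the
    event pixel 34 and carries no intensity value of its own."""
    neighbors = [assembly_3x3[r][c]
                 for r in range(3) for c in range(3)
                 if (r, c) != (1, 1)]
    return sum(neighbors) / len(neighbors)

assembly = [[10, 12, 10],
            [14, None, 14],   # None marks the event pixel 34
            [10, 12, 10]]
print(interpolate_center(assembly))   # mean of the 8 surrounding values: 11.5
```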
The assembly 56 of
Assembly 56 comprises, like assembly 42, layer 44 common to pixels 34 and 36. Layer 44 corresponds to the layers 30 of each pixel 34 or 36. Layer 44 is thus formed of an array of regions 46, each region 46 forming part of a pixel and corresponding to the layer 30 described in relation with
Like assembly 42, assembly 56 further comprises a substrate 48, preferably a single substrate, comprising the substrate regions of pixels 34 and 36. Further, assembly 56 comprises an array of vias 32, each via corresponding to the via 32 of a pixel 34 or 36. Vias 32 are arranged as described in relation with
As in the embodiment of
As in the embodiment of
More precisely,
Device 58 comprises an array of assemblies 56. Device 58 thus comprises a plurality, nine shown in
In device 58, the event-based image is thus obtained by an array of pixels 34. The pixels 34 of said array are separated from one another by a distance corresponding to a pixel 36.
In device 58, the light intensity image is obtained by an array of pixels 36. The array of pixels 36 is incomplete, certain pixels being separated from a neighboring pixel 36 of a same row or of a same column by a distance different from the distance separating them from other neighboring pixels 36. This is caused by the presence of pixels 34. The light intensity value of the location of pixels 34 is obtained by interpolation, for example by calculation of the average of the light intensities of the pixels 36 surrounding the corresponding pixel 34.
The assembly 60 of
Assembly 60 comprises, like assembly 56, layer 44 common to pixels 34 and 36. Layer 44 corresponds to the layers 30 of each pixel 34 or 36. Layer 44 is thus formed of an array of regions 46, each region 46 forming part of a pixel 36. The layer 44 of assembly 60 further comprises a region 62 corresponding to the layer 30 of pixel 34. Layer 44, and regions 46, 62, are such as described in relation with
Like assembly 56, assembly 60 further comprises a substrate 48, preferably a single substrate, comprising the substrate regions 50, 52 of pixels 34 and 36. Preferably, regions 52 are located at the corners of assembly 60. Region 50 is located between regions 52. Region 50 thus forms a cross separating regions 52 from one another.
Further, assembly 60 comprises an array of vias 32, each via corresponding to the via 32 of a pixel 36. Assembly 60 thus comprises complete rows and columns of vias 32 corresponding to pixels 36. Assembly 60 further comprises a via 64 corresponding to the via 32 of pixel 34. Via 64 extends between the region 62 of layer 44 and substrate 48, in particular region 50, preferably at the middle of the cross.
As in the embodiment of
Device 66 comprises an array of assemblies 60. Device 66 thus comprises a plurality, nine in
In device 66, the event-based image is thus obtained by an array of pixels 34. The pixels 34 of said array are separated from one another by a distance corresponding to two pixels 36.
In device 66, the light intensity image is obtained by an array of pixels 36. The array of pixels 36 is complete, each pixel 36 being at a same distance from all the neighboring pixels 36. No interpolation is thus necessary.
The assembly 68 of
Assembly 68 comprises, like assembly 56, layer 44 common to pixels 34 and 36. Layer 44 corresponds to the layers 30 of each pixel 34 or 36. Layer 44 is thus formed of an array of regions 46, each region 46 forming part of a pixel 36. The layer 44 of assembly 68 further comprises a region 70 corresponding to the layer 30 of pixel 34. Layer 44 is such as described in relation with
Regions 46 are preferably, in top view, rectangular, for example square. Regions 46 are separated from one another by region 70. Region 70 is cross-shaped. Region 70 comprises a portion, for example substantially rectilinear in top view, extending in the column direction and separating the pixels 36 of different columns. Region 70 comprises another portion, for example substantially rectilinear in top view, extending in the row direction and separating the pixels 36 of different rows. Thus, regions 46 are located at the corners of layer 44 of assembly 68 and are separated from one another by the branches of the cross of region 70.
Like assembly 56, assembly 68 further comprises a substrate 48, preferably a single substrate, comprising the substrate regions 50, 52 of pixels 34 and 36. Preferably, regions 52 are located at the corners of assembly 68. Region 50 is located between regions 52. Region 50 thus forms a cross separating regions 52 from one another.
Further, assembly 68 comprises an array of vias 32, each via corresponding to a via 32 of a pixel 34 or 36. Assembly 68 thus comprises complete rows and columns of vias 32.
Assembly 68 comprises in particular vias 32 corresponding to pixel 34. The assembly comprises at least one line of vias 32 extending in the row direction of the array of pixels 36, these vias 32 extending between the branch of region 70 extending in the row direction and the branch of region 50 extending in the row direction. Similarly, the assembly comprises at least one line of vias 32 extending in the column direction of the array of pixels 36, these vias 32 extending between the branch of region 70 extending in the column direction and the branch of region 50 extending in the column direction. Thus, the vias 32 corresponding to pixel 34 make it possible to attract the charges generated in the entire region 70 of pixel 34.
Assembly 68 further comprises vias 32 corresponding to the vias of pixels 36. For example, each region 52 is coupled to the corresponding region 46 by at least one via 32. Preferably, the vias 32 corresponding to pixels 36 are arranged to form an array with the vias 32 corresponding to pixel 34.
As in the embodiment of
Device 75 comprises an array of assemblies 68. Device 75 thus comprises a plurality, nine in
Regions 70 together form a grid separating groups of pixels 36 from one another. Each group comprises four pixels 36, these pixels 36 belonging to different assemblies 68. Thus, the branches of regions 70 extending in the column direction are pairwise separated by two columns of pixels 36. Similarly, the branches of regions 70 extending in the row direction are pairwise separated by two rows of pixels 36.
In device 75, the event-based image is thus obtained by pixels 34 forming a grid extending over the entire device. In device 75, the light intensity image is obtained by an array of pixels 36. Although the regions 46 of certain neighboring pixels 36 are directly adjacent and the regions of other neighboring pixels 36 are separated by the region 70 of a pixel 34, this difference is relatively slight and thus no interpolation is necessary.
According to an embodiment, devices 54, 58, 66, and 75 can deliver an event-based image and a light intensity image independently. In other words, pixels 36 deliver, at a frame frequency, an image comprising a light intensity value, whatever the value provided by pixels 34. Further, each pixel 34 delivers event-based data, that is, data indicating a modification of the light intensity measured by pixel 34, independently from the values measured by pixels 36 and independently from the values provided by the other pixels 34.
According to another embodiment, pixels 36 are configured to generate a light intensity image when at least a given number of pixels 34, for example at least one, measures a modification of the measured intensity. Thus, when the scene is modified, this is detected by pixels 34 and a light intensity image is generated by pixels 36.
According to another embodiment, the device is configured so that, when a pixel 34 delivers event-based data indicating that the light intensity has changed, a part of the pixels 36, for example the neighboring pixels, for example the pixels surrounding pixel 34, provide a light intensity value. For example, in the case of device 54, said part of the pixels 36 corresponds to the pixels 36 of the assembly comprising pixel 34. The device thus delivers an event-based image and a light intensity image of the portion of the scene where there have been variations.
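The event-triggered local readout described above can be sketched as follows. This is an illustrative sketch only, not the device's actual circuitry; the function name and frame data are hypothetical.

```python
# Hedged sketch: when a pixel 34 at (r, c) emits an event, only the
# standard pixels 36 of the surrounding neighborhood are read, yielding a
# local intensity patch of the part of the scene that changed.

def triggered_patch(frame, r, c):
    """Return the 3x3 patch of light intensities around an event at (r, c),
    i.e. the pixels 36 surrounding the triggering pixel 34 (clipped at the
    borders of the array)."""
    return [row[max(c - 1, 0):c + 2] for row in frame[max(r - 1, 0):r + 2]]

frame = [[i * 4 + j for j in range(4)] for i in range(4)]   # 4x4 test frame
print(triggered_patch(frame, 1, 1))   # 3x3 patch read in response to the event
```

The rest of the array is left unread, so a light intensity image is only produced for the portion of the scene where there have been variations.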
According to the example of
On the other hand, pixels 34, and more precisely the regions 46 corresponding to pixels 34, are covered by a filter 78. Filters 78 are configured, for example, to let through infrared wavelengths (IR), that is, wavelengths for example in the range from 0.7 μm to 100 μm. As a variant, filters 78 are for example configured to let through wavelengths in the range from 0.4 μm to 100 μm, to extend their photosensitive range.
According to the example of
Filters 80 are configured to let through wavelengths corresponding to the red color, that is, wavelengths for example in the range from 622 nm to 780 nm. Filters 82 are configured to let through wavelengths corresponding to the green color, that is, wavelengths for example in the range from 492 nm to 577 nm. Filters 84 are configured to let through wavelengths corresponding to the blue color, that is, wavelengths for example in the range from 455 nm to 492 nm.
According to the embodiment of
On the other hand, pixels 34, and more precisely the regions 46 corresponding to pixels 34, are covered by filter 78. Filters 78 are configured to let through infrared wavelengths, that is, wavelengths for example in the range from 0.7 μm to 100 μm.
The device comprises an array of assemblies 42, each assembly 42 being arranged in such a way that the neighboring assemblies 42 of a same row or of a same column are rotated by 90° with respect to said assembly.
Device 54 preferably only comprises rows 86 and 88. Rows 86 comprise an alternation of pixels 36 associated with a filter 80 and of pixels 36 associated with a filter 84, each pixel 36 associated with a filter 80 being separated from each neighboring pixel 36 associated with a filter 84 by two pixels 36 associated with a filter 82. Each row 88 only comprises pixels 34 associated with filters 78, pixels 36 associated with filters 80, and pixels 36 associated with filters 84. In a row 88, each pixel 34 is located between two pixels 36 associated with filters 80 or between two pixels 36 associated with filters 84. In a row 88, each pixel 36 associated with a filter 80 is located between a pixel 34 associated with a filter 78 and a pixel 36 associated with a filter 84. In a row 88, each pixel 36 associated with a filter 84 is located between a pixel 34 associated with a filter 78 and a pixel 36 associated with a filter 80.
The array is preferably symmetrical. Thus, device 54 preferably only comprises columns 90 and 92. Columns 90 comprise an alternation of pixels 36 associated with a filter 80 and of pixels 36 associated with a filter 84, each pixel 36 associated with a filter 80 being separated from each neighboring pixel 36 associated with a filter 84 by two pixels 36 associated with a filter 82. Each column 92 only comprises pixels 34 associated with filters 78, pixels 36 associated with filters 80, and pixels 36 associated with filters 84. In a column 92, each pixel 34 is located between two pixels 36 associated with filters 80 or between two pixels 36 associated with filters 84. In a column 92, each pixel 36 associated with a filter 80 is located between a pixel 34 associated with a filter 78 and a pixel 36 associated with a filter 84. In a column 92, each pixel 36 associated with a filter 84 is located between a pixel 34 associated with a filter 78 and a pixel 36 associated with a filter 80.
Assembly 56 comprises three pixels 36 and one pixel 34. In the embodiment of
By the use of methods of controlling the pixels allowing the distribution of charges, described in relation with
In the example of
More precisely,
The control modes are described in relation with an assembly 42 of pixels such as that described in relation with
In the control mode corresponding to
In the case of
In the case of
The distribution mode of
The distribution of the charges photogenerated by a plurality of regions 46 may be achieved by a different biasing of the lower electrodes of the photodiodes of regions 46, in the case where the charges are attracted by the via of pixel 34, or by a specific routing and logic transistors at the level of the analog layer in the case where the charges are added in the substrate.
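As a rough numeric sketch of this charge distribution: steering the photocharges of several regions 46 toward a single node, for example the via of pixel 34, amounts to a weighted sum of their contributions before readout. The function name and the weight values are hypothetical; in the device the steering is achieved by biasing or routing as described above.

```python
# Hypothetical model of charge distribution: photocharges of several
# regions 46 are steered to one readout node; the bias determines which
# fraction of each region's charge reaches that node.

def binned_charge(region_charges, weights):
    """Sum the charge contributions steered toward one readout node.
    `weights` models the fraction of each region's charge (in [0, 1])
    attracted to that node by the applied bias."""
    return sum(q * w for q, w in zip(region_charges, weights))

charges = [100.0, 100.0, 100.0, 100.0]        # photocharge per region 46
full = binned_charge(charges, [1, 1, 1, 1])   # all charge to a single node
half = binned_charge(charges, [0.5] * 4)      # charge shared between nodes
print(full, half)   # 400.0 200.0
```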
An advantage of the previously-described embodiments is that the device can deliver an event-based image and a light intensity image, for example not simultaneously, the event-based and standard pixels being readable independently.
Another advantage of the described embodiments is that a single substrate, comprising the pixel control circuits, is used. It is not necessary to have a second substrate for the diodes.
Another advantage of the described embodiments is that it is possible to form a complete sensor. In other words, charges may be generated over the entire surface of the sensor. There is no area between pixels where charges cannot be generated.
Another advantage of the described embodiments is that, for a same number of pixels, the device can be smaller.
Another advantage of the described embodiments is that the possible wavelength range is larger.
Various embodiments and variants have been described. Those skilled in the art will understand that certain features of these various embodiments and variants may be combined, and other variants will occur to those skilled in the art.
Finally, the practical implementation of the described embodiments and variants is within the abilities of those skilled in the art based on the functional indications given hereabove.
Number | Date | Country | Kind |
---|---|---|---|
2211351 | Oct 2022 | FR | national |