PROJECTION DEVICE WITH AN OPTIMIZED EMISSION POINT DISTRIBUTION ON A DISCRETIZED EMISSION SURFACE

Information

  • Patent Application
  • Publication Number
    20240337847
  • Date Filed
    April 04, 2024
  • Date Published
    October 10, 2024
Abstract
An image projection device for projecting an image onto an eye includes an emission surface S comprising a set of waveguides, a set of diffraction gratings and a set of electrodes. Each grating is positioned at the intersection of one of the guides and one of the electrodes so as to form an emission point for a light wave. The surface S is discretized into a plurality of elementary emission zones in a continuous mesh. Each zone comprises a subset of points distributed in a number ηxij×ηyij of emission point distributions. The points of one and the same distribution are configured to emit a resultant wave directed with a wave vector contained in an angular domain defined based on the number of zones discretizing the surface S and on the position of the zone on the surface S, the number ηxij×ηyij of distributions corresponding to the number of pixels of the image to be projected in said angular domain.
Description
CROSS-REFERENCE TO RELATED APPLICATION

This application claims priority to foreign French patent application No. FR 2303528, filed on Apr. 7, 2023, the disclosure of which is incorporated by reference in its entirety.


FIELD OF THE INVENTION

The present invention relates in general to the projection of an image onto an eye, in augmented reality applications, and in particular to a projection device with an optimized emission point distribution on a discretized emission surface and based on an isoline configuration, and to a method for manufacturing such a device.


A portable optical augmented reality data display system superimposes, on the real-world view of a user of the system, an image containing information intended for the user, such as for example information about their environment, their position, their speed of movement, etc.


BACKGROUND

Some known portable optical augmented reality data display systems use an image projection device comprising a transparent integrated optical circuit composed of an array of nanometric light guides, an electrode array and a holographic film, as described for example in patent applications FR3122929A1 and FR3022642A1. Such an image projection device is implemented without a screen or an optical system, thereby making it possible to obtain a compact optical system and a wide field of view for the user. The intersections of the nanometric light guide circuit with the electrode array make it possible to define a set of emission points able to emit a light wave directed towards the pupil of the eye of the user. The set of emission points is subdivided into various subsets, each subset comprising emission points that are distributed as randomly as possible. The light waves associated with one and the same subset of emission points propagate in one and the same direction so as to form a single light spot at the retina of the eye of the user. A light spot corresponds to a pixel of an image to be projected.


However, such a device has the drawback of using emission point distributions that are defined according to certain spatial periodicities, which generate diffraction effects when the image is formed on the retina. Moreover, it offers only a limited density of emission points in each subset used to form a light spot. This results in a luminous halo associated with retinal projection, which degrades the contrast of the image to be displayed. As a result, the quality of the projection of images onto the retina of a user is insufficient.


There is thus a need for an improved image projection device that makes it possible to increase the density of emission points of a subdivided subset while at the same time improving the pseudo-random and aperiodic distribution of the emission points.


SUMMARY OF THE INVENTION

The present invention aims to improve the situation by proposing an image projection device for projecting an image onto an eye. The device is defined in an orthogonal reference system (X,Y,Z) and comprises an emission surface S extending generally in the plane (X,Y) of the orthogonal reference system (X,Y,Z). The emission surface S comprises a stack of elements, the elements comprising a set of Mx waveguides gp, a set of Mx×My diffraction gratings rpq and a set of My electrodes eq, Mx and My being positive integers whose product Mx×My is strictly greater than 1. Each diffraction grating rpq is positioned at the intersection of one of the waveguides gp and one of the electrodes eq so as to form an emission point EPpq for a light wave.


The emission surface S is discretized into a plurality of Lx×Ly elementary emission zones Zij in a continuous mesh in the plane (X,Y), each elementary emission zone Zij comprising a subset of mxij×myij emission points EPpqij among the Mx×My emission points EPpq of the emission surface S, the subset of mxij×myij emission points EPpq being distributed in a number ηxij×ηyij of emission point distributions EPDuvij. The emission points EPpqij of one and the same emission point distribution EPDuvij of the elementary emission zone Zij are configured to emit a resultant light wave directed with a wave vector {right arrow over (κ)}uvij contained in an angular domain defined based on the number Lx×Ly of elementary emission zones Zij discretizing the emission surface S and on the position of the elementary emission zone Zij on the emission surface S, the number ηxij×ηyij of emission point distributions EPDuvij corresponding to the number ηxij×ηyij of pixels of the image to be projected in the angular domain.


In some embodiments, the discretization of the emission surface S may be uniform in the plane (X,Y).


Alternatively, the discretization of the emission surface S may be non-uniform in the plane (X,Y).


According to some embodiments, for each elementary emission zone Zij, the distribution of the emission points EPpqij in an emission point distribution EPDuvij may be determined randomly or pseudo-randomly.


Advantageously, the stack of the emission surface S may furthermore comprise a set of Mx×My holograms hpq, each hologram hpq being positioned at the intersection between one of the waveguides gp and one of the electrodes eq so as to form the emission point EPpq. The holograms hpq associated with the emission points EPpqij of one and the same emission point distribution EPDuvij of the elementary emission zone Zij may be encoded such that the emission points EPpqij emit light waves that are angle-matched and phase-matched to one another so as to generate the resultant light wave defined according to the direction of the wave vector {right arrow over (κ)}uvij contained in an angular domain.


The device may furthermore comprise, in the plane (X,Y), at least one other emission surface Sxy distinct from the emission surface S. The other emission surface Sxy may be discretized into elementary emission zones comprising emission points designed to emit a light wave in a direction contained in a determined angular domain along an optical axis centred with respect to a point Prxy and directed towards the emission surface Sxy, the point Prxy being associated with the position of the eye, after the eye has rotated in its orbit towards the emission surface Sxy.


The device may furthermore comprise, in the plane (X,Y), at least one other emission surface S identical to the emission surface S. The other emission surface S may be discretized into elementary emission zones comprising emission points configured to emit a light wave in a direction contained in an angular domain defined along the axis Z and centred with respect to a point Ptxy associated with the translation of the eye in the plane (X,Y).


Advantageously, the elementary emission zones Zij may have a size in the plane (X,Y) of between 200 μm and 800 μm.


Another subject of the invention is a transparent portable optical data display system comprising an image projection device. The system is a glasses system or an augmented reality headset.


The invention also provides a method for manufacturing the image projection device. The method comprises a phase of designing the device and a phase of physically manufacturing the device thus designed. The design phase comprises the following steps:

    • discretizing the emission surface S into Lx×Ly elementary emission zones Zij, each elementary emission zone Zij comprising a subset of mxij×myij emission points EPpqij;
    • distributing the subset of mxij×myij emission points EPpqij into ηxij×ηyij emission point distributions EPDuvij;
    • for each elementary emission zone Zij, assigning ηxij×ηyij emission point distributions EPDuvij to ηxij×ηyij pixels of the image to be projected;
    • determining, for each emission point distribution EPDuvij, the direction of the wave vector {right arrow over (κ)}uvij of the light wave emitted by the emission points EPpqij, the wave vector {right arrow over (κ)}uvij being contained in an angular domain defined based on the number Lx×Ly of elementary emission zones Zij discretizing the emission surface S and on the position of the elementary emission zone Zij on the emission surface S.
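The design steps above can be sketched numerically. The following Python sketch (with illustrative function and variable names, not taken from the application) discretizes a uniform surface into zones, assigns emission points to each zone, and splits each zone's points pseudo-randomly into one distribution per pixel handled by that zone:

```python
import random

def design_emission_surface(Mx, My, Lx, Ly, Nx, Ny, seed=0):
    """Sketch of the design phase: discretize the surface into Lx*Ly zones,
    distribute each zone's emission points into one distribution per pixel.
    All names here are illustrative, not from the application."""
    rng = random.Random(seed)
    nx, ny = Nx // Lx, Ny // Ly          # pixels handled per zone (uniform case)
    zones = {}
    for i in range(1, Lx + 1):
        for j in range(1, Ly + 1):
            # emission points (p, q) belonging to zone Z_ij
            points = [(p, q)
                      for p in range((i - 1) * Mx // Lx, i * Mx // Lx)
                      for q in range((j - 1) * My // Ly, j * My // Ly)]
            rng.shuffle(points)          # pseudo-random distribution of points
            # split the zone's points into nx*ny distributions EPD_uv^ij,
            # one per pixel projected by this zone
            n_epd = nx * ny
            zones[(i, j)] = [points[k::n_epd] for k in range(n_epd)]
    return zones

zones = design_emission_surface(Mx=160, My=120, Lx=4, Ly=3, Nx=40, Ny=30)
# each of the 4x3 zones holds (40/4)*(30/3) = 100 distributions
assert len(zones[(1, 1)]) == 100
```

The per-distribution wave vectors would then be assigned from the pixel indices and the zone position, as described in the last step above.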


The image projection device according to the embodiments of the invention makes it possible to improve the quality, in particular the contrast, of the projection of images onto the retina of a user, based on an optimized aperiodic emission point distribution on a discretized emission surface, an increase in the densification of emission points on an emission surface and isoline configurations.


Such a device also makes it possible to facilitate and speed up the method for manufacturing the device.





BRIEF DESCRIPTION OF THE DRAWINGS

Other features, details and advantages of the invention will become apparent on reading the description provided with reference to the appended drawings, which are given by way of example.



FIG. 1a is a diagram showing an image projection device, according to some embodiments of the invention.



FIG. 1b is a diagram showing one example of an image to be projected onto an eye of a user of a portable optical data display system, according to some embodiments of the invention.



FIG. 2 is a diagram showing a cross section of the structure of the emission surface of an image projection device along the plane (X,Z), according to some embodiments of the invention.



FIG. 3 is a diagram showing a discretization of the emission surface of an image projection device in the plane (X,Y), according to some embodiments of the invention.



FIG. 4 is a diagram showing an elementary emission zone of the discretized emission surface of an image projection device in the plane (X,Y), according to some embodiments of the invention.



FIG. 5 is a diagram showing a projection in the plane (X,Y) of the angular projection domain associated with an elementary emission zone of the discretized emission surface of an image projection device, according to some embodiments of the invention.



FIG. 6 is a diagram showing a set of illumination zones on a pupil associated with the elementary emission zones of the discretized emission surface of an image projection device, according to some embodiments of the invention.



FIG. 7a and FIG. 7b are diagrams showing emission surfaces reproduced in the plane (X,Y) on an image projection device, according to some embodiments of the invention.



FIG. 8a is a graph showing a spatial distribution of the intensity of the impulse response formed by an emission point distribution of an image projection device, according to some embodiments of the invention.



FIG. 8b is a graph showing the evolution of a power ratio of the spatial distribution of the intensity of the impulse response as a function of the density of the emission points of an image projection device, according to some embodiments of the invention.



FIG. 9 illustrates the results of simulations of a projected image for various comparative dimensions of an image projection device, according to some embodiments of the invention.



FIG. 10 is a flowchart showing steps of a method for manufacturing the image projection device, according to some embodiments of the invention.



FIG. 11a and FIG. 11b are flowcharts showing sub-steps of a method for manufacturing the image projection device, according to some embodiments of the invention.



FIG. 12a and FIG. 12b are diagrams showing an optical hologram recording system, according to some embodiments of the invention.



FIG. 13 is a diagram showing the isoline waveguide and electrode configurations of an emission surface of an image projection device, according to some embodiments of the invention.



FIG. 14 is a flowchart showing steps of a method for manufacturing the image projection device, according to some embodiments of the invention.



FIG. 15 is a flowchart showing graphs associated with steps of a method for manufacturing the image projection device, according to some embodiments of the invention.



FIG. 16a and FIG. 16b show results of simulating the spatial distribution of the intensity of the impulse response for configurations based on “segment translations” and isoline configurations of an emission surface of an image projection device, according to some embodiments of the invention.



FIG. 17a and FIG. 17b show results of applying criteria for quantifying the quality of the projection of an image for configurations based on “segment translations” and isoline configurations of an emission surface of an image projection device, according to some embodiments of the invention.



FIG. 18 is a diagram showing the isoline waveguide and electrode configurations associated with a discretization of the emission surface of an image projection device in the plane (X,Y), according to some embodiments of the invention.





Identical references are used in the figures to denote identical or similar elements. For the sake of clarity, the elements that are shown are not to scale. Moreover, in the remainder of the description, unless indicated otherwise, the terms “substantially” and “generally” mean “to within plus or minus 10%”.


DETAILED DESCRIPTION


FIG. 1a schematically shows an image projection device 10 defined in an orthogonal reference system (X,Y,Z) and comprising a set of Mx waveguides gp and a set of My electrodes eq, each waveguide gp being non-parallel to each of the electrodes eq, according to some embodiments of the invention.


The numbers Mx and My are positive integers whose product (Mx×My) is strictly greater than 1. The parameter p denotes an index associated with the various waveguides gp, with p∈[1, Mx], and the parameter q denotes an index associated with the various electrodes eq, with q∈[1, My].


The image projection device 10 may be used for example in a portable optical data display system, in the field of augmented reality (AR) image display, and more generally in the fields of virtual reality (VR) and mixed reality (MR). FIG. 1a corresponds to such an application. As shown in FIG. 1a, the image projection device 10 may be arranged at a distance Zer from the pupil P of an eye of a user of the portable optical data display system. The axis Z then corresponds to the optical axis of the gaze of the user of the system and is associated with the position of the eye in the plane (X,Y) of the orthogonal reference system (X,Y,Z). For example, the distance Zer (also called ‘eye relief’) may be equal to 20 mm for a portable optical data display system in the form of spectacles, or equal to 30 mm for a system comprising an AR or VR/MR immersion headset.


The image projection device 10 comprises an emission surface S, generally extending in the plane (X,Y) and formed of (Mx×My) emission points, denoted EPpq. Each emission point EPpq corresponds to the intersection of a waveguide gp and an electrode eq. Each emission point EPpq is designed to emit a light wave propagating along a propagation axis (or wave vector) {right arrow over (κ)}, from the emission surface S to the pupil P, which then focuses the light onto the retina R of the eye. The emission surface S then corresponds to a retinal projection screen.


The image projection device 10 is designed to project an image of size (Nx×Ny) in terms of number of pixels and substantially defined in the plane of projection of the retina R of the eye. FIG. 1b shows one example of an image to be projected onto an eye of a user. Each projected pixel then corresponds to an illuminated point Ruv on the retina R.


The numbers Nx and Ny are positive integers whose product (Nx×Ny) is strictly greater than 1. The parameter u denotes an index associated with the pixels of the image along the axis X, with u∈[1, Nx], and the parameter ν denotes an index associated with the pixels of the image along the axis Y, with ν∈[1, Ny]. For example and without limitation, the minimum value of the product (Nx×Ny) may be equal to 100 pixels of an image to be projected.


The set of (Mx×My) emission points EPpq is then distributed into (Nx×Ny) subsets of emission points. These subsets of emission points are also called ‘emission point distributions’ and are denoted by the notation EPDuv. It should be noted that a minimum number of the product of the integers (Mx×My) may be defined based on the maximization of the number of emission point distributions (Nx×Ny) on the emission surface S (that is to say pixels to be projected) and of the number ηem/EPD of emission points EPpq per emission point distribution EPDuv. For example and without limitation, the number ηem/EPD may be between 50 and 200, and the minimum number of the product of the integers (Mx×My) may therefore be equal to 5000 emission points EPpq.
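As a rough check of the orders of magnitude quoted above (at least 100 pixels to project, and between 50 and 200 emission points per distribution), the minimum product (Mx×My) follows from a simple multiplication; the helper name below is illustrative:

```python
# Rough sizing check, following the orders of magnitude quoted above:
# at least 100 pixels (Nx*Ny) and at least 50 emission points per distribution.
def min_emission_points(n_pixels, points_per_distribution):
    """Lower bound on Mx*My: every pixel needs its own distribution,
    and each distribution needs its quota of emission points."""
    return n_pixels * points_per_distribution

assert min_emission_points(100, 50) == 5000
```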


Each emission point distribution EPDuv may be formed such that all of the emission points EPpq of one and the same distribution all emit a phase-matched light wave along one and the same propagation axis {right arrow over (κ)}uv. It should be noted that a first light wave and a second light wave each having one and the same propagation axis {right arrow over (κ)}uv are phase-matched if, in a plane perpendicular to {right arrow over (κ)}uv, the value of the phase of the second light wave is substantially equal to the value of the phase of the first light wave modulo 2π. The light waves emitted by the emission points EPpq of one and the same distribution may therefore propagate in parallel between the image projection device 10 and the pupil P of the eye over the distance Zer, and converge substantially at the same point Ruv on the retina by virtue of the lens C of the eye, as shown in FIG. 1a. In this case, the eye perceives the projected image by accommodating to infinity. According to one variant that is not shown in the figures, the phase-matched light waves emitted by the emission points EPpq of one and the same distribution EPDuv may propagate in substantially divergent directions {right arrow over (κ′)}uv around a direction {right arrow over (κ)}uv that targets the point Ruv (that is to say the pixel of indices (u, ν)) in the field of view. In this case, the eye perceives the projected image by accommodating at a distance, referred to as the accommodation distance and denoted da, which may be a few metres (that is to say da is not equal to infinity). For example and without limitation, the accommodation distance da may be 2 m. The degree of divergence of a direction {right arrow over (κ)} of one or more emission points EPpq from the direction {right arrow over (κ)}uv may then be determined based on the accommodation distance da. The light intensity at the point Ruv results from the contribution of each light wave from the emission points EPpq. 
The emission point distribution EPDuv is phase-adjusted and forms a resultant light wave. The resultant light wave from EPDuv may be associated with a plane wave or a substantially spherical wave if the accommodation distance da of the image projected onto the retina is not infinity. Advantageously, the distribution of the emission points EPpq for each emission point distribution EPDuv may be determined randomly or pseudo-randomly.
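The phase-matching condition can be illustrated numerically: if each emission point pre-compensates the geometric phase of its position (modulo 2π), the contributions of all points of a distribution add coherently in the target direction. A minimal one-dimensional sketch, with illustrative positions, wavelength and emission angle (not values from the application):

```python
import numpy as np

rng = np.random.default_rng(1)
lam = 532e-9                      # wavelength of a green laser source, as above
k0 = 2 * np.pi / lam
theta = np.deg2rad(5.0)           # illustrative emission angle
kx = k0 * np.sin(theta)           # transverse component of the wave vector

# pseudo-random emission point positions along X (one distribution EPD_uv)
x = rng.uniform(0, 5e-3, size=100)

# phase-matched emitters: each point pre-compensates its geometric phase
phases = (-kx * x) % (2 * np.pi)

# field summed in the target direction: all 100 contributions add coherently
field = np.sum(np.exp(1j * (kx * x + phases)))
assert np.isclose(abs(field), 100.0)
```

Without the pre-compensating phases, the 100 contributions would add with random phases and the resultant amplitude would be far below 100, which is why the hologram-encoded phase shifts are essential to forming a single bright spot Ruv.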



FIG. 2 schematically shows a cross section of the structure of the emission surface S of the image projection device 10 along the plane (X,Z). This structure comprises a stack of multiple layers that are superimposed in the direction of the axis Z.


The device 10 comprises a first layer corresponding to the set of Mx waveguides gp arranged parallel to the emission surface S. Each waveguide gp extends from one side of the emission surface S to the other, from an initial position to a final position arranged along an axis of extension X′ substantially parallel to the longitudinal axis X. For simplicity, in FIGS. 1a and 2, the waveguides gp are schematically shown in a rectilinear shape along the axis X. However, those skilled in the art will readily understand that the invention is not limited to such a waveguide configuration shape.


The waveguides gp are configured to receive coherent light, emitted by one or more laser sources (not shown in the figures), and to propagate the coherent light along the emission surface S. The waveguides gp may for example be formed of silicon nitride, and have a width (gy), in the direction of the axis Y in FIG. 1a, and a thickness (gz), in the direction of the axis Z, that are both between 100 nm and 600 nm.


The device also comprises a second layer corresponding to the set of My electrodes eq arranged parallel to the emission surface S. Each electrode eq extends from one side of the emission surface S to the other, from an initial position to a final position arranged along an axis of extension Y′ substantially parallel to the transverse axis Y. For simplicity, in FIG. 1a, the electrodes eq are shown schematically in a rectilinear shape in the direction of the axis Y. However, those skilled in the art will readily understand that the invention is not limited to such an electrode configuration shape.


The electrodes eq are configured to receive a specific bias voltage (or electrical modulation), managed by one or more power supplies (not shown in the figures), and to propagate the electrical modulation along the emission surface S. The electrodes eq are made of a conductive material; for transparent electrodes, this conductive material may be indium tin oxide. The electrodes eq may have a width (ex), in the direction of the axis X in FIG. 1a, of between 2 μm and 10 μm, and a thickness (ez), in the direction of the axis Z, depending on the conductive material. For example, the thickness (ez) of electrodes eq made of indium tin oxide may typically be between 20 nm and 100 nm.


The set of My electrodes eq is superimposed on the set of Mx waveguides gp, parallel to the emission surface S. Each electrode eq thus ‘crosses’ multiple waveguides gp, so as to define multiple intersections (or crossings). Each intersection corresponds to a position of an emission point EPpq. As used here, the term ‘intersection’ refers to a superposition of a waveguide gp (that is to say first layer) and an electrode eq (that is to say second layer) that are not parallel to one another.


The device 10 furthermore comprises a third layer, contained between the first and second layers, corresponding to a set of (Mx×My) diffraction gratings, denoted rpq. Each diffraction grating rpq is optically coupled to a waveguide gp and is joined to an electrode eq. Each diffraction grating rpq is formed from a periodic variation of at least one material with a refractive index able to be modulated by applying a bias voltage through the electrode eq.


The third layer comprising the set of diffraction gratings rpq may consist of a continuous structure in the plane (X,Y) of the emission surface S. Alternatively, the third layer may consist of localized structures in the plane (X,Y) of the emission surface S, positioned substantially at the intersections formed by the waveguides gp and the electrodes eq. The continuous structure or the localized structures may be formed for example by inclusions defining a pattern with a periodic variation in silicon oxide. The inclusions may then consist of any material having an electrically adjustable refractive index, such as for example a liquid crystal. When the wavelength of the light emitted by a laser source is 532 nm, the period of the pattern of the diffraction grating rpq may be between 300 nm and 400 nm. A diffraction grating may comprise a series of periodic patterns spread over a length (rx) and a width (ry). These quantities (rx) and (ry) may be defined based on the zones of superposition formed by the waveguides gp and the electrodes eq. For example, the diffraction grating may be spread over 10 periodic patterns such that the length (rx) is greater than or substantially equal to the width (ex) of the electrodes eq inducing the modulation of the refractive index. The third layer comprising the set of diffraction gratings may have a thickness (rz) of for example between 100 nm and 500 nm.


The device 10 may comprise a set of light wave orientation elements configured to control a light wave. As used here, the expression “light wave control” (also referred to as ‘light wave manipulation’) refers to various phenomena related to electromagnetic waves that may occur when an optical beam interacts notably with the material of a given object, as shown with the element hpq in FIG. 2. These phenomena comprise notably angular deviation, phase change, transmission, reflection, absorption, scattering, refraction and/or diffraction of the electromagnetic wave.


Advantageously, the device 10 may comprise a fourth layer H, positioned on the second layer, comprising the set of light wave orientation elements configured to control a light wave. The fourth layer H may be a holographic film comprising a set of (Mx×My) holograms, denoted hpq. Each hologram hpq corresponds to a (reflective or transmissive) orientation element and is associated with a diffraction grating rpq along the axis Z. The holographic film may be a photopolymer, for example polymethyl methacrylate, or a photoresist, with a thickness (hz) of between 2 μm and 20 μm. A hologram recorded (or encoded) on the holographic film may extend over a length (hx) of between 2 μm and 20 μm, and a width (hy) of between 1 μm and 10 μm. The quantities (hx) and (hy) of the holograms may be defined based on the zones of superposition formed by the waveguides gp and the electrodes eq. Advantageously, the length (hx) may be greater than the width (ex) of the electrodes eq, giving holograms that are contiguous or that overlap in pairs, as shown in FIG. 2 for the holograms hpq and hp(q−1), making it possible for example to reduce the minimum distance between two electrodes.


The stack of the various layers and elements of the structure of the device 10 is arranged on a support V. The support V may be a transparent support made of glass or polycarbonate and contained in a spectacle lens or a visor of the transparent portable optical data display system.


In FIG. 2, a single waveguide gp coupled to three point diffraction gratings rp(q+1), rpq and rp(q−1), three different electrodes e(q+1), eq and e(q−1) and three point holograms hp(q+1), hpq and hp(q−1) are shown by way of simplification and non-limiting example. Those skilled in the art will readily understand that the invention is not limited to such numbers and configurations of waveguides, diffraction gratings and electrodes.


In FIG. 2, the emission volume (that is to say the emission point EPpq) thus corresponds to a superposition, parallel to the emission surface S, of the waveguide gp and of the electrode eq, coupled to the point diffraction grating rpq and associated with the hologram hpq. To activate the emission point EPpq, that is to say to “make it emit”, the image projection device 10 is configured to transmit coherent light to the waveguide gp and to electrically modulate (or bias) the electrode eq. Part of the light wave propagating in the waveguide gp is then extracted at the diffraction grating rpq coupled to the refractive index modified by the electrode eq, as shown in FIG. 2 by the arrow. The light extracted at the diffraction grating rpq propagates in the hologram hpq (that is to say an orientation element), such that the device 10 emits, at the output of the hologram hpq, spatially at the emission point EPpq, a light wave with a wave vector {right arrow over (κ)}uv and a phase shift that are predetermined and stored (that is to say recorded or encoded beforehand) in the hologram hpq. The light waves emitted by various emission points EPpq of one and the same distribution EPDuv are phase-matched with respect to one another with the same wave vector {right arrow over (κ)}uv so that the set of these light waves forms a resultant wave the wavefront of which is controlled, for example a plane wave for accommodation to infinity, or a spherical wave for accommodation to the accommodation distance da. This resultant wave propagates to the pupil P so as to form a single light spot Ruv at the retina R. The phase and the direction of the light wave emitted by an emission point EPpq depend on the phase information recorded in the hologram hpq.


Each distribution EPDuv therefore makes it possible to form a light spot that is perceived by the user and associated with a pixel of an image. An image may be formed by successively illuminating various emission point distributions EPDuv, so as to form an image comprising a large number of pixels. The illumination frequency of the various emission point distributions EPDuv is dimensioned such that the user is able to experience the formation of a still image under the effect of retinal persistence, despite technically sequential formation of the various pixels of the image.
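The sequencing constraint described above can be put into numbers. Assuming, for illustration, a full-image refresh rate of 60 Hz (a common value above the retinal-persistence threshold; no specific rate is stated here), the rate at which distributions must be illuminated follows directly:

```python
# Back-of-the-envelope timing for sequential pixel formation (illustrative
# numbers only): refreshing the full image above the flicker-fusion
# threshold requires each distribution to be strobed once per frame.
def distribution_rate(n_pixels, frame_rate_hz):
    """Number of distribution illuminations per second."""
    return n_pixels * frame_rate_hz

# e.g. a 400x200-pixel image at an assumed 60 frames per second
assert distribution_rate(400 * 200, 60) == 4_800_000   # 4.8 MHz addressing
```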


In a first embodiment, the emission surface S may be discretized into (Lx×Ly) elementary emission zones denoted Zij forming a continuous mesh of the emission surface S in the plane (X,Y), as shown in FIG. 3.


The numbers Lx and Ly are positive integers whose product (Lx×Ly) is strictly greater than 1. The parameter i is an index associated with a discretization of the emission surface S along the axis X, with i∈[1, Lx], and the parameter j is an index associated with a discretization of the emission surface S along the axis Y, with j∈[1, Ly]. For example and without limitation, a minimum number of the product of the integers (Lx×Ly) may be equal to (1×2) or (2×1) elementary emission zones.


Each elementary emission zone Zij may be associated with a number mxij of waveguides gpij, with mxij∈[1, Mx], and with a number myij of electrodes eqij, with myij∈[1, My], as illustrated in FIG. 4.


An elementary emission zone Zij therefore comprises a subset of (mxij×myij) emission points EPpqij defined among the (Mx×My) emission points EPpq of the emission surface S.


Each elementary emission zone Zij may be associated with a finite number of emission point distributions, which are then denoted EPDuvij. Each emission point distribution EPDuvij of an elementary emission zone Zij consists of one or more emission points EPpqij determined only among the (mxij×myij) emission points EPpqij of the elementary emission zone Zij.


Advantageously, for each elementary emission zone Zij, the distribution of the emission points EPpqij for each emission point distribution EPDuvij may be determined randomly or pseudo-randomly. For example and without limitation, the distribution of the emission points EPpqij may be determined from the random or pseudo-random selection of mxij indices ρ and myij indices q associated respectively with the various waveguides gpij and with the various electrodes eqij making up the elementary emission zone Zij.
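The pseudo-random selection described above may be sketched as a uniform draw of distinct (p, q) intersections within a zone; the helper below and its index ranges are illustrative only:

```python
import random

rng = random.Random(42)

# Illustrative pseudo-random draw of the guide/electrode intersections making
# up one distribution EPD_uv^ij inside a zone of m_x x m_y emission points.
def draw_distribution(guide_indices, electrode_indices, n_points, rng):
    """Pick n_points distinct (p, q) intersections, with no imposed
    spatial period along either axis."""
    pairs = [(p, q) for p in guide_indices for q in electrode_indices]
    return rng.sample(pairs, n_points)

epd = draw_distribution(range(100), range(100), 80, rng)
assert len(set(epd)) == 80        # all selected emission points are distinct
```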


Each emission point distribution EPDuvij is associated with the point projected onto the retina R, which is then denoted Ruvij. The number of emission point distributions EPDuvij of an elementary emission zone Zij is thus equal to the number (nxij×nyij) of pixels of the image to be projected associated with the points Ruvij projected onto the retina R by the elementary emission zone Zij. The integers nxij and nyij are numbers of pixels defined along the axis X and the axis Y, respectively, with nxij∈[1, Nx] and nyij∈[1, Ny]. For example, the number (nxij×nyij) of pixels of the image to be projected may be identical for each elementary emission zone Zij of the emission surface S. The integers nxij and nyij may be defined based on the total number of pixels (Nx×Ny) of the image to be projected and the total number of elementary zones (Lx×Ly) discretizing the emission surface S, according to the following equations (01) and (02):










nxij = Nx / Lx        (01)

nyij = Ny / Ly        (02)








For example, an image to be projected may comprise (400×200) pixels and an emission surface S may be discretized into (16×12) elementary emission zones Zij. In this case, each zone Zij may comprise for example (25×17) pixels of the image to be projected (that is to say points Ruvij projected onto the retina R) from a number of (25×17) corresponding emission point distributions EPDuvij.
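By way of illustration, the dimensioning of equations (01) and (02) and the example above may be sketched as follows (Python; rounding up when the division is not exact is an assumption on our part, suggested by the 200/12 → 17 example):

```python
from math import ceil

def pixels_per_zone(Nx, Ny, Lx, Ly):
    """Number (nxij x nyij) of image pixels handled by each elementary
    emission zone Zij (equations (01) and (02)). Rounding up a non-exact
    division is an assumption; the text's example suggests it (200/12 -> 17)."""
    return ceil(Nx / Lx), ceil(Ny / Ly)

# Example from the text: a (400x200)-pixel image, (16x12) elementary zones.
nx, ny = pixels_per_zone(400, 200, 16, 12)
```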


If, for each elementary emission zone Zij, the numbers (nxij×nyij) of pixels of the image to be projected are identical, the index intervals of uij and vij may be defined based on the indices (i, j) of the zone Zij, the total number of pixels (Nx×Ny) of the image to be projected and the total number of elementary zones (Lx×Ly) discretizing the emission surface S, according to the following expressions (03) and (04):










uij ∈ [ (i - 1) × Nx/Lx + 1 ; i × Nx/Lx ]        (03)

vij ∈ [ (j - 1) × Ny/Ly + 1 ; j × Ny/Ly ]        (04)
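A minimal sketch of expressions (03) and (04), assuming that Lx divides Nx and Ly divides Ny exactly (Ly = 10 is chosen here purely so that the division is exact, and is illustrative):

```python
def pixel_index_ranges(i, j, Nx, Ny, Lx, Ly):
    """Inclusive intervals of the pixel indices uij and vij handled by the
    elementary emission zone Zij (expressions (03) and (04)). Assumes that
    Lx divides Nx and Ly divides Ny exactly."""
    px, py = Nx // Lx, Ny // Ly
    return ((i - 1) * px + 1, i * px), ((j - 1) * py + 1, j * py)

# Zones tile the image: zone (2,1) gets the second block of u indices,
# zone (16,10) the last block in both directions.
u_range, v_range = pixel_index_ranges(2, 1, 400, 200, 16, 10)
```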








For each elementary emission zone Zij, an emission point EPpqij of an emission point distribution EPDuvij may be associated with a wave vector {right arrow over (κ)}uvij contained in an angular domain, represented by the angular ranges Δφ and Δψ, and defined based on the total number of elementary zones (Lx×Ly) discretizing the emission surface S, according to the following equations (05) and (06):









Δφ = θx / Lx        (05)

Δψ = θy / Ly        (06)







In the above equations (05) and (06), the quantities θx and θy correspond to angular projections respectively onto the axis X and the axis Y of the viewing cone θ (also called field of view or FOV) of the emission surface S of the image projection device 10 by an eye, as illustrated in FIGS. 3 and 5.


In particular, for each elementary emission zone Zij, the angular domain (Δφ, Δψ) may be directed along a central vector {right arrow over (κ)}ij oriented from the elementary emission zone Zij to a central point of impact tij in the plane (X,Y) at the pupil P. The angular direction of the central vector {right arrow over (κ)}ij in the space (X,Y,Z) is characterized by angles φij and ψij determined relative to the optical axis Z in the planes (X,Z) and (Y,Z), respectively. By way of illustration, the projection κxij into the plane (X,Z) of the central vector {right arrow over (κ)}ij of the angular domain of the elementary emission zone Zij is shown in FIG. 5. The angle φij corresponds to the angle between the optical axis Z and the direction of the projection κxij from the elementary emission zone Zij to the coordinate txij of the central point of impact tij on the axis X, as shown in FIG. 5. The angles φij and ψij associated with the central vector {right arrow over (κ)}ij may be defined based on the angular domain, the total number of elementary zones (Lx×Ly) discretizing the emission surface S and the position of the elementary emission zone Zij on the emission surface S (represented for example by the indices (i, j) of the zone Zij). The angles φij and ψij associated with the central vector {right arrow over (κ)}ij may thus be defined according to the following equations (07) and (08):











φij = Δφ × (i - (Lx + 1)/2)        (07)

ψij = Δψ × (j - (Ly + 1)/2)        (08)
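The angular domain and central vector orientation of equations (05) to (08) may be illustrated as follows, with the (16×12) discretization and the (12°, 6°) field of view used in the examples of the text:

```python
def central_direction(i, j, theta_x, theta_y, Lx, Ly):
    """Angular domain (equations (05) and (06)) and central-vector
    orientation (equations (07) and (08)) of the elementary emission zone
    Zij. All angles are in degrees."""
    d_phi, d_psi = theta_x / Lx, theta_y / Ly
    phi = d_phi * (i - (Lx + 1) / 2)
    psi = d_psi * (j - (Ly + 1) / 2)
    return d_phi, d_psi, phi, psi

# Corner zone of a (16x12) discretization with a (12 deg, 6 deg) field.
d_phi, d_psi, phi_11, psi_11 = central_direction(1, 1, 12, 6, 16, 12)
```

The orientations are symmetric about the optical axis Z: the opposite corner zone (16, 12) points towards (+5.625°, +2.75°).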







Advantageously, for each emission point EPpqij corresponding to the intersection of a waveguide gp and an electrode eq of an emission point distribution EPDuvij of an elementary emission zone Zij (as shown in FIG. 4), the angular direction of the wave vector {right arrow over (κ)}uvij in the space (X,Y,Z) may be characterized by angles φuvij and ψuvij determined with respect to the optical axis Z in the planes (X,Z) and (Y,Z), respectively. In particular, each hologram hpq associated with an emission point EPpqij of the emission point distribution EPDuvij may encode the angular directions φuvij and ψuvij defined according to the following equations (09) and (10):










φuvij = (uij - 1) × δ - θx/2        (09)

ψuvij = (vij - 1) × δ - θy/2        (10)







In the above equations (09) and (10), the quantity δ corresponds to the angular resolution of the image and depends on the desired level of sharpness on the display. For example, an angular resolution δ considered to be a visual limit value of angular resolution (that is to say a separable limit) corresponds to an angle of one arc minute, equal to 1/60 ≈ 0.02° (and converted into radians in the calculations in the above equations). It should be noted that the above equations (09) and (10) are therefore obtained with a small-angle approximation (that is to say tan δ ≈ δ). In this case, it may be considered that, below this value, the eye is not able to perceive the pixelation of an image to be projected onto the retina R.
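A minimal sketch of equations (09) and (10); the pixel indices (721, 361) used below are illustrative assumptions, corresponding to a field spanned at one arc minute per pixel, and are not taken from the text:

```python
def pixel_direction(u_ij, v_ij, theta_x, theta_y, delta):
    """Angular direction (phi_uvij, psi_uvij) encoded by the hologram hpq of
    an emission point of the distribution EPDuvij (equations (09) and (10)).
    All angles are in degrees."""
    phi = (u_ij - 1) * delta - theta_x / 2
    psi = (v_ij - 1) * delta - theta_y / 2
    return phi, psi

# One arc minute (1/60 of a degree), the separable limit mentioned above.
delta = 1 / 60
first = pixel_direction(1, 1, 12, 6, delta)  # first pixel: edge of the field
```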


For each elementary emission zone Zij, the beams emitted by the emission points EPpqij may impact the plane of the pupil P around the central point of impact tij over an impact range wij. For example and without limitation, the impact range wij may be considered to be uniform for all of the elementary emission zones Zij of the emission surface S. The range wij of the impacts of the emission points EPpqij in the plane of the pupil P may notably take into account the range of the angular domain of the elementary emission zone Zij and the contribution of a diffraction of the emission points.


The coordinates txij and tyij of the central point of impact tij on the axis X and the axis Y, respectively, in the plane of the pupil P, may be expressed according to the following equations (11) and (12):










txij = (i - (Lx + 1)/2) × demxij + (φij × Zer)        (11)

tyij = (j - (Ly + 1)/2) × demyij + (ψij × Zer)        (12)







In the above equations (11) and (12), the quantities demxij and demyij correspond to the size of the elementary emission zone Zij along the axis X and the axis Y, respectively, as shown in FIG. 5. The sizes demxij and demyij may depend on the dimensions associated with the waveguides and with the electrodes, their distribution in the plane of the emission surface S, as well as the numbers mxij of waveguides gp and myij of electrodes eq of the elementary zone Zij.
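By way of illustration, the central point of impact of equations (11) and (12) may be computed as follows for the corner zone of the (16×12) example (the sign conventions and units are assumptions):

```python
from math import radians

def central_impact(i, j, Lx, Ly, d_emx, d_emy, phi_ij, psi_ij, Z_er):
    """Coordinates (txij, tyij) of the central point of impact tij in the
    plane of the pupil (equations (11) and (12)). Lengths in mm; the central
    angles phi_ij and psi_ij are given in degrees and converted to radians
    for the small-angle product with the eye relief Z_er."""
    t_x = (i - (Lx + 1) / 2) * d_emx + radians(phi_ij) * Z_er
    t_y = (j - (Ly + 1) / 2) * d_emy + radians(psi_ij) * Z_er
    return t_x, t_y

# Corner zone of the (16x12) example: 0.5 mm zones, 20 mm eye relief, and
# the central angles (-5.625 deg, -2.75 deg) of equations (07) and (08).
t_x, t_y = central_impact(1, 1, 16, 12, 0.5, 0.5, -5.625, -2.75, 20.0)
```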


The discretization of the emission surface S into elementary emission zones Zij may be uniform, the quantities demxij and demyij then being values of constant sizes for all of the elementary zones Zij, and able to be defined based on the size of the emission surface S and the total number of elementary zones (Lx×Ly) discretizing the surface S, according to the following equations (13) and (14):










demxij = Demx / Lx        (13)

demyij = Demy / Ly        (14)








For example and without limitation, the quantities demxij and demyij may be equivalent to one and the same quantity ϕ equal to 0.5 mm. In the above equations (13) and (14), the quantities Demx and Demy correspond to the size of the emission surface S along the axis X and the axis Y, respectively, as shown in FIGS. 3 and 5. In particular, the size of the emission surface S may be determined as a function of the diameter Dpup of the pupil of the eye (which is non-negligible and typically chosen to be equal to 4 mm). Indeed, the size of the emission surface S may be determined so as to be greater than the size of the zone covered by the viewing cone θ alone, that is to say by also taking into account the diameter Dpup of the pupil. The quantities Demx and Demy may then be defined, with a small-angle approximation, according to the following equations (15) and (16):










Demx = (Zer × δ × Nx) + Dpup        (15)

Demy = (Zer × δ × Ny) + Dpup        (16)
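A sketch of equations (15) and (16) reproducing the sizes of around 8 mm and 6 mm mentioned later in sub-step 108; taking δ as θx/Nx = 12°/400 = 0.03° (rather than one arc minute) is an assumption on our part, made so that δ×Nx matches the stated (12°, 6°) field of view:

```python
from math import radians

def emission_surface_size(Z_er, delta_deg, Nx, Ny, D_pup):
    """Size (Demx, Demy) of the emission surface S (equations (15) and
    (16)), in mm, under the small-angle approximation: the projected field
    plus the pupil diameter D_pup."""
    delta = radians(delta_deg)
    return Z_er * delta * Nx + D_pup, Z_er * delta * Ny + D_pup

# Zer = 20 mm, Dpup = 4 mm, a (400x200)-pixel image; delta taken here as
# theta_x/Nx = 0.03 deg (assumption, see lead-in above).
D_emx, D_emy = emission_surface_size(20.0, 0.03, 400, 200, 4.0)
```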








The discretization of the emission surface S into elementary emission zones Zij may also be uniform, the number mxij of waveguides gp in an elementary zone and the number myij of electrodes eq in an elementary zone then being constant values for all of the elementary zones Zij and being able to be defined based on the total number of (Mx×My) emission points of the emission surface S and the total number of elementary zones (Lx×Ly) discretizing the surface S, according to the following equations (17) and (18):










mxij = Mx / Lx        (17)

myij = My / Ly        (18)








Alternatively, the discretization of the emission surface S into elementary zones Zij may be non-uniform, the number mxij of waveguides gp and the number myij of electrodes eq being defined for example according to increasing and/or decreasing functions as a function of the axes X and Y, respectively. The non-uniformity of the discretization of the emission surface S into elementary zones Zij may similarly result in non-homogeneous quantities demxij and demyij as a function of the axes X and Y, respectively.


In particular, it should be noted that the quantities associated with the emission surface S and with the various elementary zones are expressed, in the above equations, using the small-angle approximation for simplification. This approximation is applicable for a small emission surface S size (that is to say a small viewing cone θ) and/or an emission surface S centred with respect to the optical axis Z of the gaze of the user of the system. Alternatively, if the small-angle approximation might no longer be applicable (that is to say tan δ≠δ), the formulas of the above equations, notably (09), (10), (15) and (16), are more complex. In this case, some peripheral elementary zones of an emission surface S may be widened in the plane (X,Y) for high viewing angles compared to the elementary zones referred to as central zones with respect to the optical axis Z of this same emission surface S.



FIG. 6 shows a diagram of the set of impact ranges wij (that is to say illumination zones on the pupil) at the positions of the central point of impact tij for an emission surface S discretized for example into (16×12) elementary emission zones Zij of dimension ϕ equal to 0.5 mm and according to a field of view defined by quantities θx and θy equal to (12°, 6°). In this example, the emission points EPpq have similar sizes over the entire emission surface S, defined by a diameter ω1 of 4 μm. Thus, for each elementary emission zone Zij, the dimension of the impact range wij (that is to say the diffraction ‘spot’ at the eye) is equal to 850 μm.


As shown in FIG. 6, depending on the dimensioning of the device 10 and, in general, of the transparent portable optical data display system, some coordinates txij and tyij of the central point of impact tij of an elementary emission zone Zij may be positioned outside the pupil P in the plane (X,Y). In this case, at least some of the pixels associated with the indices uij and vij are not projected onto the retina R of the eye because at least some of the light energy associated with the emission points EPpqij of the emission point distributions EPDuvij does not enter the pupil represented by the circle P in FIG. 6.


However, this effect is partly compensated for by the orientation of the eye in its orbit, leading to a displacement of the position of the pupil P. Indeed, as illustrated by the circle P′ in dotted lines in FIG. 6, the position of the pupil may be shifted in the plane (X,Y) when the eye targets a direction different from the direction of the optical axis Z. In FIG. 6, and according to the dimensioning shown, the shift in the position of the pupil P′ corresponds to the eye targeting the direction (−6°, +3°), that is to say the direction concerned by the emission of a light signal from the elementary emission zone Z01,01. In this case, the associated pixels may be projected onto the retina R of the eye because the light energy associated with the emission points EPpqij of the emission point distributions EPDuvij enters the pupil P′. The movement of the eye in its orbit therefore makes it possible to manage signal dispersion in the plane of the pupil.


Advantageously, in order to extend the viewing cone θ of the image projection device 10 for an eye, that is to say the angular range of image projection, the emission surface S may be reproduced in the plane (X,Y). In particular, the image projection device 10 may comprise a finite number of distinct emission surfaces Sxy distributed in the plane (X,Y). For each emission surface Sxy, each hologram hpq associated with an emission point EPpqij of an emission point distribution EPDuvij may encode the angular directions φuvij and ψuvij determined with respect to an optical axis Zxy rotated in the planes (X,Z) and (Y,Z) and centred with respect to the point Prxy corresponding to the position of the centre of the pupil of the eye after rotation in its orbit and in the angular direction of the zone of the emission surface Sxy in question. The image projection device 10 shown schematically in FIG. 7a shows the emission surfaces S1, S2 and S3 positioned consecutively along the axis X and having different angular cone projections along the optical axis Z1=Z and the rotated optical axes Z2 and Z3 centred respectively on the points Pr1, Pr2 and Pr3 of the projection of the centre of the pupil P onto the axis X.


In particular, in the case of the optical axes Zxy rotated with respect to the optical axis Z of the gaze of the user of the system, the small-angle approximation might no longer be applicable (that is to say tan δ≠δ) for high viewing angles. In this case, the emission surfaces Sxy may be widened in the plane (X,Y) for high viewing angles compared to what is referred to as a central emission surface with respect to the optical axis Z. In FIG. 7a, the emission surfaces S2 and S3, associated with the rotated optical axes Z2 and Z3, are larger than the central emission surface S′ associated with the optical axis Z=Z1.


In addition, to extend the size of the “eye box”, that is to say the zone in which the user (and therefore their eye) is able to move in front of the image projection device 10 while still viewing the entire image, the emission surface S may also be duplicated in the plane (X,Y). In this case, the image projection device 10 may comprise a finite number of one and the same emission surface S distributed in the plane (X,Y). For each distributed emission surface S, each hologram hpq associated with an emission point EPpqij of an emission point distribution EPDuvij may encode the angular directions φuvij and ψuvij determined with respect to the optical axis Z in the planes (X,Z) and (Y,Z) and centred with respect to the point Ptxy corresponding to the position of the centre of the pupil of the eye after translation in the plane (X,Y). The image projection device 10 shown schematically in FIG. 7b shows, three times, the same emission surface S positioned consecutively along the axis X and having the same three angular cone projections θx along the optical axis Z centred respectively on the points Pt1, Pt2 and Pt3 of the projection of the centre of the pupil P translated on the axis X.


For example and without limitation, in the cases shown schematically in FIGS. 7(a) and 7(b), in order to duplicate or reproduce emission surfaces, the length of the waveguides gp may be extended along the axis X, and sets of My electrodes eq may be duplicated (notably three times here) in the longitudinal direction of the guides so as to form the emission surfaces S and/or S1, S2 and S3. Similarly, the length of the electrodes eq may be extended along the axis Y, and sets of Mx waveguides gp may be duplicated in the transverse direction of the guides so as to form new emission surfaces S and/or Sxy.


The discretization of the emission surface S into elementary emission zones Zij makes it possible to significantly increase the densification of the emission points EPpqij of one and the same emission point distribution EPDuvij. In particular, the density ρ of the emission points may be defined according to the following expression (19):









ρ = (1/ϕ²) × [ (Zer² × δ²) + (Dpup²/(Nx × Ny)) + (Zer × δ × Dpup × (Nx + Ny)/(Nx × Ny)) ]        (19)







The number nem/EPD of emission points EPpq of one and the same emission point distribution may be approximated based on the size of the emission surface S, the surface area sem of the emission point EPpq, and the number of pixels (Nx×Ny) of the image to be projected, as defined according to the following expression (20):










nem/EPD = (Demx × Demy)/(sem × (Nx × Ny))        (20)







For example and without limitation, the number nem/EPD may be equal to around 50. For an image to be projected of (400×200) pixels and a discretization of the emission surface S into (16×12) elementary emission zones Zij, the density ρ of emission points EPpqij of one and the same emission point distribution EPDuvij may be equal, according to the above equation (19), to around 0.24%. In the above equation (20), the surface area sem of the emission point EPpq may be defined based on the diameter ω1 of the emission point EPpq according to the following expression (21):










sem = π × ω1²/4        (21)
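The quantities of expressions (19), (20) and (21) may be checked numerically as follows; as before, taking δ = θx/Nx = 0.03° is an assumption on our part, made to match the (12°, 6°) field of view of the examples:

```python
from math import pi, radians

Z_er, D_pup = 20.0, 4.0      # eye relief and pupil diameter, in mm
Nx, Ny = 400, 200            # resolution of the image to be projected
phi_zone = 0.5               # size of an elementary emission zone, in mm
omega1 = 4e-3                # diameter of an emission point, in mm (4 um)
delta = radians(0.03)        # angular resolution, in rad (theta_x/Nx assumed)

# Expression (19): density of the emission points of one distribution
rho = (1 / phi_zone ** 2) * (Z_er ** 2 * delta ** 2
                             + D_pup ** 2 / (Nx * Ny)
                             + Z_er * delta * D_pup * (Nx + Ny) / (Nx * Ny))

# Expression (21): surface area of one emission point
s_em = pi * omega1 ** 2 / 4

# Expression (20), via equations (15) and (16) for Demx and Demy
D_emx = Z_er * delta * Nx + D_pup
D_emy = Z_er * delta * Ny + D_pup
n_em_per_EPD = (D_emx * D_emy) / (s_em * Nx * Ny)
```

Note that expression (19) is the expansion of (Demx × Demy)/(Nx × Ny × ϕ²), so ρ and nem/EPD are consistent: ρ = nem/EPD × sem/ϕ².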







It should be noted that there is at least one criterion for quantifying the quality of the image formed on the retina R by self-focusing of the emission points EPpq of one and the same emission point distribution EPDuv. In particular, to quantify the efficiency of the self-focusing, a power ratio denoted γ may be defined based on the spatial distribution of the intensity of the impulse response (referred to by the acronym PSF, or point spread function), formed by an emission point distribution EPDuv at a point Ruv on the retina R of the eye, as shown in FIG. 8a. The power ratio γ may be expressed according to the following expression (22):









γ = -10 × log10( P1/(P1 + P2) )        (22)







In the above equation (22), the quantity P1 corresponds to the power of the central intensity peak formed by the emission point distribution EPDuv, that is to say the area of the central peak shown in light grey in FIG. 8a. P1+P2 denotes the total luminous power, the quantity P2 corresponding to the power of the intensity noise formed by the emission point distribution EPDuv, that is to say the peripheral zone, shown in dark grey, around the central peak.


The power ratio γ is optimum for a zero power P2 of the intensity noise formed by the emission points (value of γ tending towards 0). FIG. 8b represents the evolution of the power ratio γ according to the variation in the density ρ of the emission points EPpq of one and the same emission point distribution EPDuv, and as a function of various characteristic parameters (for example characteristic dimensions) of the transparent portable optical data display system. The criterion γ for quantifying the quality of the image is therefore closely related to the surface density of the distribution of the emission points EPpq on the emission surface S, as illustrated in FIG. 8b. Now, the discretization of the emission surface S into elementary emission zones Zij makes it possible to significantly increase the densification of the emission points EPpqij of one and the same emission point distribution EPDuvij, thus allowing a significant gain (of at least 9 dB) on the value of the criterion γ compared to a configuration without discretization of the surface (that is to say for ρ0 = 0.001%).
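A minimal sketch of the criterion of equation (22):

```python
from math import log10

def power_ratio(P1, P2):
    """Self-focusing quality criterion gamma of equation (22), in dB: 0 when
    all the luminous power P1 + P2 lies in the central peak (P2 = 0), and
    increasing with the share of intensity noise P2."""
    return -10 * log10(P1 / (P1 + P2))
```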



FIG. 9 illustrates the results of image simulations for various comparative dimensionings of the image projection device 10. In particular, the simulations are carried out on the basis of the discretization of the emission surface S via the number (Lx×Ly) of elementary zones Zij, on the one hand, and on the basis of the size of the emission points EPpq defined by the diameter ω1, on the other hand, and for an image to be projected of (400×200) pixels showing the words “CEA leti Retinal Projection”. In particular, the variation in the number (Lx×Ly) of elementary zones Zij induces a variation in the size of the emission zone (ϕ or, more generally, the quantities demxij and demyij) and thus a variation in the density ρ of the emission points EPpq for each emission point distribution EPDuv.


As illustrated by the image simulation results in FIG. 9, implementing the discretization of the emission surface S, and then reducing the size of the emission zone (that is to say increasing the density), makes it possible to improve the contrast of the image to be projected. Conversely, increasing the size of the emission zone reduces the blur on the image to be projected. It should also be noted that reducing the size of the emission points (that is to say a small diameter value ω1) increases the spatial extent of the intensity noise formed by the emission points compared to the central emission peak, and thus reduces the impact of this noise on the construction of the image on the retina R. Therefore, the image simulation results illustrated in FIG. 9 indicate a possible optimum dimensioning with emission points of between 2 μm and 4 μm in diameter and an elementary emission zone Zij of a size of between 200 μm and 800 μm.


The manufacture of an image projection device 10 is implemented based on a method comprising a design phase and a phase of physically manufacturing the device 10 thus designed.



FIG. 10 is a flowchart showing steps of the phase of designing the image projection device 10, according to some embodiments of the invention.


The phase of designing the image projection device may for example be computer-implemented.


In step 120, a discretization of the emission surface S into (Lx×Ly) elementary emission zones Zij may be applied, such that each elementary emission zone Zij comprises a subset of (mxij×myij) emission points EPpqij.


In step 140, a distribution of the subset of (mxij×myij) emission points EPpqij into (nxij×nyij) emission point distributions EPDuvij is carried out.


In step 160, for each elementary emission zone Zij, the (nxij×nyij) emission point distributions EPDuvij are assigned to (that is to say associated with or allocated to) (nxij×nyij) pixels of the image to be projected.


In step 180, for each emission point distribution EPDuvij, the direction of the wave vector {right arrow over (κ)}uvij of the resultant light wave emitted by the emission points EPpqij of the emission point distribution EPDuvij is determined such that the direction {right arrow over (κ)}uvij is contained in an angular domain directed along a central vector {right arrow over (κ)}ij defined based on the number (Lx×Ly) of elementary emission zones Zij discretizing the emission surface S and the position of the elementary emission zone Zij on the emission surface S.



FIG. 11a is a flowchart showing examples of sub-steps of step 120 of discretizing the emission surface S in the phase of designing the image projection device 10, according to some embodiments of the invention.


In sub-step 102, the resolution of the image to be projected may be determined. For example, the resolution (Nx×Ny) of the image to be projected may be chosen to be equal to (400×200). In sub-step 104, the size of the field of view may be determined. For example, the quantities θx and θy of the field of view may be chosen to be equal to 12° and 6°, respectively. In sub-step 106, the distance of the eye relief may be determined. For example, the distance Zer of the eye relief may be chosen to be equal to 20 mm. In sub-step 108, the size of the emission surface S may be calculated based on the values fixed in sub-steps 102 to 106. According to the equations (15) and (16), the quantities Demx and Demy of the emission surface S may be equal to around 8 mm and 6 mm, respectively.


In sub-step 110, the size of the elementary emission zones Zij may be determined. For example, the quantities demxij and demyij of the elementary emission zones may be chosen to be equal to 500 μm. In sub-step 112, the number of elementary emission zones Zij may be calculated from the quantities Demx and Demy and the values fixed in sub-step 110. According to the equations (13) and (14), the number (Lx×Ly) of elementary emission zones Zij may be equal to around (19×12). The number of pixels defined per elementary emission zone Zij may then be deduced from the results of sub-step 110 and from the values fixed in sub-step 102. For example, the number (nxij×nyij) of pixels, that is to say of emission point distributions, to be encoded per elementary emission zone may be equal to around 350.


Advantageously, in sub-step 114, the widths and configurations of the waveguides gp and of the electrodes eq may be determined. In sub-step 116, the numbers (mxij×myij) of waveguides gp and electrodes eq per elementary emission zone Zij may be calculated based on the results of sub-step 114. And, in sub-step 118, the maximum number nem/EPD of emission points per emission point distribution may be calculated based on the numbers of waveguides gp and electrodes eq, and on the resolution of the image to be projected (or number (nxij×nyij) of pixels, that is to say of emission point distributions, per elementary emission zone Zij, for example).



FIG. 11b is a flowchart showing examples of sub-steps of step 140 of distributing the emission points of a distribution EPDuvij in the phase of designing the image projection device 10, according to some embodiments of the invention.


In sub-step 122, the number of waveguides gp and the number of electrodes eq may be determined by random and/or pseudo-random drawing, in particular in order to obtain (or approximate) the number nem/EPD of emission points. For example, if the maximum number nem/EPD of emission points per emission point distribution calculated in sub-step 118 is equal to 50, the number of waveguides may be equal to 10 and the number of electrodes may be equal to 5.


In sub-step 124, the position of the waveguides gp and the position of the electrodes eq among the (mxij×myij) waveguides gp and electrodes eq of the elementary emission zone Zij may be determined by random and/or pseudo-random drawing.


The sub-steps 122 and 124 may be applied sequentially for each emission point distribution EPDuvij, for example. Pseudo-random drawing carried out for a distribution EPDuvij may take into account the previous results of sub-steps 122 and 124 of one or more other emission point distributions of one and the same elementary emission zone Zij, such that a pair of a waveguide gp and an electrode eq representing a specific emission point EPpqij is able to be associated only with a single emission point distribution EPDuvij.
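The drawing constraint described above may be sketched as follows (the function name, its signature and the shuffle-then-slice strategy are illustrative assumptions; the text only requires random or pseudo-random draws with no (waveguide, electrode) pair shared between distributions):

```python
import random

def draw_distributions(m_x, m_y, n_distributions, n_points, seed=0):
    """Sketch of sub-steps 122 and 124: draw, for each emission point
    distribution EPDuvij of a zone Zij, n_points (waveguide, electrode)
    pairs, such that each pair is associated with a single distribution.
    Shuffling the full set of pairs once and slicing it guarantees the
    no-reuse constraint; the actual procedure is not specified in the text."""
    rng = random.Random(seed)
    pairs = [(p, q) for p in range(m_x) for q in range(m_y)]
    if n_distributions * n_points > len(pairs):
        raise ValueError("not enough emission points in the zone")
    rng.shuffle(pairs)
    return [pairs[k * n_points:(k + 1) * n_points]
            for k in range(n_distributions)]

# A zone with 40 waveguides and 30 electrodes: 20 distributions of 50 points.
distributions = draw_distributions(40, 30, 20, 50)
```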


The phase of physically manufacturing the device 10 takes into account the results from the design phase implemented beforehand. The physical manufacturing phase may comprise steps of manufacturing waveguides, diffraction gratings and electrodes using conventional techniques, such as for example, and without limitation, chemical or physical deposition techniques. The physical manufacturing phase may also comprise layering steps.


In addition, the physical manufacturing phase may comprise a step of recording holograms hpq, each hologram being associated with an emission point EPpq.


In particular, the step of recording the holograms hpq may take into account the results from step 180 of determining, for each emission point distribution EPDuvij, the direction of the wave vector {right arrow over (κ)}uvij of the resultant light wave emitted by the emission points EPpqij of the emission point distribution EPDuvij.


Recording holograms is based on the interference of two groups of light beams on the holographic film. The generated interference fringes are physically or chemically stored on the holographic film, resulting in a variation in the refractive index. The two groups of beams are conventionally referred to as ‘reference beams’ and ‘object beams’. To encode the holograms hpq of an emission point distribution EPDuv (or EPDuvij), the reference beams correspond to the light waves extracted from the diffraction gratings rpq optically coupled to the waveguides gp and to the electrodes eq selected to form the emission points EPpq (or EPpqij) of the distribution EPDuv (or EPDuvij). The hologram recording step requires an optical recording system that forms object beams having an encoding angle β with the emission surface S (or specifically the elementary emission zone Zij). The encoding angle β then induces the direction of the wave vector {right arrow over (κ)}uv (or {right arrow over (κ)}uvij). The object beams, coming from a coherent light source that is not shown in the figures, are obtained from the projection of a given mask through a converging lens (La or Lb), as shown in FIG. 12a and FIG. 12b. Advantageously, for each emission point distribution EPDuv (or EPDuvij), various zones of the emission surface S (or specifically of the elementary emission zone Zij) are exposed sequentially to various object beams of a specific encoding angle β, so as to form emission points associated with distinct emission directions.


The discretization of the emission surface S according to the embodiments of the invention makes it possible notably to facilitate and speed up the step of recording the holograms during the phase of physically manufacturing the device 10 compared to a non-discretized surface S. Indeed, since the emission points EPpqij of one and the same emission point distribution EPDuvij are contained within a single elementary emission zone Zij of size smaller than the overall size of the emission surface S, recording (or encoding) the direction of the wave vector {right arrow over (κ)}uvij in the holograms associated with the emission points EPpqij makes it possible to encode angles over a wider range of values. As shown by the optical hologram recording systems in FIG. 12a and FIG. 12b, the encoding of a zone (Zij in FIG. 12(b)) smaller than a surface (S in FIG. 12(a)) induces an increase in the cone of the encoding angle Δβ. This reduction in the encoding zone also induces a reduction in the constraints on the temporal coherence of the light source generating the object beams.



FIGS. 1, 2 and 4 show sets of waveguides gp and of electrodes eq with rectilinear configurations along the axis X and the axis Y, respectively, each waveguide gp being arranged perpendicular to the electrodes eq so as to obtain emission points EPpq distributed regularly in a spatially periodic grid over the emission surface S.


As a variant, the set of Mx waveguides gp of the image projection device 10 may be arranged in a configuration of non-rectilinear isolines in the plane defined by the emission surface S, as shown in FIG. 13.


As used here, the expression “non-rectilinear isolines” refers to a set of at least two geometric curves f1(X,Y) and f2(X,Y), defined in a plane (X,Y), such that the distance between a point p1 belonging to the geometric curve f1(X,Y) and the geometric curve f2(X,Y) is the same regardless of the point p1, said distance being defined by the length of the shortest path between the point p1 and a point of the geometric curve f2(X,Y) according to a reference metric dref. Conversely, the distance between a point p2 belonging to the geometric curve f2(X,Y) and the geometric curve f1(X,Y) is the same regardless of the point p2, said distance being defined by the length of the shortest path between the point p2 and a point of the geometric curve f1(X,Y) according to the reference metric dref. Non-rectilinear isolines thus correspond to a set of geometric curves such that the curves of the set are always separated in pairs by a constant distance. Such geometric curves may be parametric curves defined in the plane of the emission surface S. For a given geometric curve, an associated isoline may assume a highly variable shape as a function of the value of the metric dref under consideration.
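By way of illustration, the constant-distance property defining isolines may be checked numerically. The sketch below is an assumption for illustration only: it takes two concentric circles, which are exact isolines of one another under a Euclidean metric standing in for dref, samples them densely, and verifies that the shortest-path distance is the same in both directions regardless of the starting point.

```python
import numpy as np
from scipy.spatial.distance import cdist

# Two concentric circles are exact isolines under a Euclidean reference
# metric d_ref: every point of one curve lies at the same distance (1.0
# here) from the other curve, in both directions.
theta = np.linspace(0.0, 2.0 * np.pi, 2000, endpoint=False)
f1 = np.column_stack([2.0 * np.cos(theta), 2.0 * np.sin(theta)])  # radius 2
f2 = np.column_stack([3.0 * np.cos(theta), 3.0 * np.sin(theta)])  # radius 3

# Length of the shortest path from each point p1 of f1 to the curve f2,
# and conversely from each point p2 of f2 to the curve f1.
d = cdist(f1, f2)
d12 = d.min(axis=1)   # from f1 towards f2
d21 = d.min(axis=0)   # from f2 towards f1

print(d12.max() - d12.min())  # ~0: the distance is constant
```

Note that a simple vertical translation of a curved line would not satisfy the property: true isolines are normal offsets of one another, which is why circles are used here.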


Advantageously, the configuration of the electrodes on the emission surface S may be similar to the configuration of the waveguides. Thus, as shown in FIG. 13, the set of My electrodes eq of the image projection device 10 may be arranged in a non-rectilinear isoline configuration in the plane (X,Y) defined by the emission surface S.


If the isoline waveguide configuration in the plane (X,Y) is defined according to a constant reference metric value drefy between each of the waveguides gp, the configuration may be a configuration of what are referred to as regular isoline waveguides. In the same way, if the isoline electrode configuration in the plane (X,Y) is defined according to a constant reference metric value drefx between each of the electrodes eq, the configuration may be a configuration of what are referred to as regular isoline electrodes.


The reference metric values drefy and drefx may be variable between various waveguides gp and various electrodes eq, respectively. The variabilities of the reference metric values make it possible notably to improve the suppression of diffraction effects related to possible periodicities generated in the waveguide and electrode configurations.


Advantageously, the isoline waveguide configuration and/or the isoline electrode configuration may be implemented using a fast-marching method (FMM) or level-set method (LSM).


It should be noted that the FMM and LSM methods (James Sethian, Level Set Methods and Fast Marching Methods: Evolving Interfaces in Computational Geometry, Fluid Mechanics, Computer Vision and Materials Science, Cambridge University Press, 1999) are numerical algorithms for solving boundary value problems of the Eikonal equation so as to track the evolution of interfaces.



FIG. 14 is a flowchart showing steps of the design phase of the method for manufacturing the device 10 that are implemented in order to generate configurations of the set of waveguides gp and/or the set of electrodes eq on the emission surface S, according to some embodiments of the invention.


In step 210, a calculation matrix is initialized. The calculation matrix corresponds to the emission surface S of size Demx and Demy, and is discretized according to predetermined calculation steps dcalx and dcaly. The initialized calculation matrix may then be a matrix comprising a number nbcalx×nbcaly of points defined according to the following equations (23) and (24):

nbcalx = (Demx/dcalx) + 1      (23)

nbcaly = (Demy/dcaly) + 1      (24)


The calculation steps may be defined for example as a function of the width (gy) of the waveguides gp and the width (ex) of the electrodes eq, and/or as a function of the reference metric values drefy and drefx of the waveguides gp and the electrodes eq. In particular, the calculation steps may be defined so as to minimize the time required to generate these configurations while still optimizing calculation resolution. For example and without limitation, the calculation steps dcalx and dcaly may be equal to one and the same quantity dcal of between 250 nm and 1.5 μm.
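The sizing of step 210 can be sketched as follows. The surface dimensions and calculation step below are illustrative assumptions (not values from the description); dimensions are expressed as integer nanometres so that equations (23) and (24) evaluate without floating-point rounding.

```python
# Step 210 sketch: size of the calculation matrix per equations (23), (24).
# All values are assumptions for illustration, expressed in nanometres.
D_emx, D_emy = 5_000_000, 5_000_000   # emission surface size: 5 mm x 5 mm
d_cal_x = d_cal_y = 1_000             # calculation step d_cal = 1 um

nb_cal_x = D_emx // d_cal_x + 1       # equation (23)
nb_cal_y = D_emy // d_cal_y + 1       # equation (24)

print(nb_cal_x, nb_cal_y)             # 5001 5001
```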


In step 230, for the isoline waveguide configuration, an initial geometric waveguide curve g0(X,Y) is determined and implemented in the initialized calculation matrix, as illustrated by the exemplary implementation in FIG. 15 (that is to say step 230). In equivalent fashion, in step 230, for the isoline electrode configuration, an initial geometric electrode curve e0(X,Y) may be determined and implemented in the initialized calculation matrix.


The initial geometric waveguide and/or electrode curve f0(X,Y), also called ‘initial geometric curve’, has a plurality of construction characteristics. For example and without limitation, some construction characteristics may comprise:

    • an initial geometric curve f0(X,Y) defined by a non-rectilinear geometric function and extending on either side of the emission surface S, from an initial position (X0, Y0) to a final position (Xf, Yf) in the initialized calculation matrix, the initial position and the final position being arranged along an axis of extension X′ or Y′, substantially parallel to the axis X or Y; in particular, as illustrated in FIG. 15 (that is to say step 230), the initial geometric waveguide curve g0(X,Y) may be defined from an initial position (0, Y0) to a final position (nbcalx, Yf); in the same way, the initial geometric electrode curve e0(X,Y) may be defined from an initial position (X0, 0) to a final position (Xf, nbcaly);
    • an initial geometric curve f0(X,Y) comprising only ordinary points referred to as ‘regular points’ for which the partial derivatives of the curve along X and Y are not simultaneously zero, that is to say in this case the initial geometric curve f0(X,Y) does not comprise any stationary point referred to as a ‘singular point’ for which the partial derivatives of the curve along X and Y at this point are zero, the tangent at such a point then appearing to be indeterminate; in particular, the initial geometric curve f0(X,Y) then does not comprise any point referred to as a ‘double point’ for which the curve “passes back” on itself (or intersects) in the plane (X,Y), or any point referred to as a ‘cusp’ for which the curve “reverses path” in the plane (X,Y);
    • an initial geometric curve f0(X,Y) comprising, at any point in the plane (X,Y), a radius of curvature greater than or equal to a predefined minimum radius of curvature rc. In particular, a minimum radius of curvature rc associated with the waveguides gp may be defined as a function of the physical properties of the waveguides gp and/or their respective manufacturing constraints. For example and without limitation, the minimum radius of curvature rc associated with the waveguides gp may be equal to 20 μm;
    • an initial geometric curve f0(X,Y) contained within a predetermined zone of the initialized calculation matrix; for example and without limitation, a zone referred to as a “content zone” may be determined as a function of a number of points nbmaxx or nbmaxy defined with respect to the initial position (X0, Y0) and/or the final position (Xf, Yf) of the curve.


Advantageously, the initial geometric curve f0(X,Y) may be defined randomly while complying with one or more of the above construction characteristics.


The initial geometric curve f0(X,Y) may for example consist of a succession of segments. In particular, the initial geometric curve f0(X,Y) may comprise segments inclined at an angle of inclination β(X, Y) with respect to the axis of extension X′ or Y′ substantially parallel to the axis X or Y. The angle of inclination β(X, Y) may correspond to a constant quantity β or to a variable function in the plane (X,Y), for example a function that is increasing, decreasing or any other function, and defined randomly, pseudo-randomly and/or as a function of the construction characteristics. Along the axis of extension X′ or Y′, the angle of inclination β(X, Y) may be positive or negative. For example, the angle of inclination may be a constant β, alternately positive and negative. The size of each inclined segment may be determined randomly, pseudo-randomly and/or as a function of the construction characteristics. The segments of the initial geometric curve f0(X,Y) may then be connected to one another by curved segments with a radius of curvature greater than or equal to the minimum radius of curvature.
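A segment-based initial curve of this kind can be sketched as below. The function name, parameter values and the use of pseudo-random segment lengths are assumptions for illustration; the rounding of corners to the minimum radius of curvature rc is deliberately omitted to keep the sketch short.

```python
import numpy as np

rng = np.random.default_rng(0)

def initial_curve(nb_cal_x, beta_deg=10.0, seg_min=50, seg_max=200, y0=0.0):
    """Illustrative initial geometric curve f0(X, Y): a succession of
    segments alternately inclined at +beta and -beta with respect to the
    extension axis X', with pseudo-random segment lengths. Corner rounding
    to the minimum radius of curvature rc is omitted."""
    xs, ys = [0.0], [y0]
    slope = np.tan(np.radians(beta_deg))
    sign = 1.0
    x, y = 0.0, y0
    while x < nb_cal_x - 1:
        # segment length, clipped so the curve ends exactly at the edge
        seg = min(rng.integers(seg_min, seg_max), nb_cal_x - 1 - x)
        x += seg
        y += sign * slope * seg
        xs.append(x)
        ys.append(y)
        sign = -sign           # alternately positive and negative
    return np.array(xs), np.array(ys)

xs, ys = initial_curve(5001)   # spans the calculation matrix along X
```

Because the inclination alternates in sign, the curve has no double point and no cusp, consistent with the construction characteristics listed above.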


The initial positions (0, Y0) and (X0, 0), and/or similarly the “content zone” of the initial geometric curve, may be defined randomly or pseudo-randomly, or may be defined for example so as to be centred along the axis Y and the axis X, respectively, in the initialized calculation matrix.


In step 250, a distance function may be applied to the initial geometric curve f0(X,Y) so as to generate the isoline waveguide configuration and/or the isoline electrode configuration. Such a distance function corresponds to a succession of steps for applying one or more reference metric values dref between various curves forming a non-rectilinear isoline configuration based on the initial geometric curve f0(X,Y). For example and without limitation, one distance function used in step 250 may be the fast-marching method.


By way of illustration, in step 250 for the isoline waveguide configuration, a distance map may be calculated using the fast-marching (or level-set) method applied to the initial geometric curve f0(X,Y) in the calculation matrix. For each of the nbcalx×nbcaly points of the calculation matrix, the distance, according to an arbitrary isotropic metric, is then calculated with respect to the closest point of the initial geometric curve f0(X,Y) implemented in the calculation matrix.



FIG. 15 (that is to say step 250) illustrates the result of calculating a distance map for each point of the matrix with respect to an initial geometric waveguide curve g0(X,Y) determined in the previous step 230.
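The distance-map calculation of step 250 can be sketched as follows. Grid size and the initial curve are assumptions for illustration, and a Euclidean distance transform stands in for the fast-marching method; the two coincide for a constant isotropic metric dref, whereas a true FMM or level-set solver would be needed for a general metric.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

# Step 250 sketch: distance map over the calculation matrix with respect
# to an implemented initial curve f0(X, Y). Euclidean distance transform
# used as a stand-in for the fast-marching method (exact only for a
# constant isotropic metric d_ref).
nb = 401
matrix = np.ones((nb, nb), dtype=bool)        # True = not on the curve

x = np.arange(nb)
y = (nb // 2 + 40 * np.sin(2 * np.pi * x / nb)).astype(int)  # assumed f0
matrix[y, x] = False                           # rasterize the initial curve

distance_map = distance_transform_edt(matrix)  # distance to nearest curve point
print(distance_map[y[0], x[0]])                # 0.0 on the curve itself
```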


In step 270, the value of a minimum distance dmin may be determined.


In particular, in the case of determining the isoline waveguide configuration, the minimum distance dmin corresponds to the constant and fixed spacing distance dmin-p between two waveguides gp on the emission surface S. The value of dmin-p may be defined with respect to the width (gy) of the waveguides gp, for example according to the following expression (25):










dmin-p = 3 × gy      (25)







In the case of determining the isoline electrode configuration, the minimum distance dmin corresponds to the constant and fixed spacing distance dmin-q between electrodes eq on the emission surface S. The value of dmin-q may be defined with respect to the width (ex) of the electrodes eq, for example according to the following expression (26):











dmin-q = 1.2 × ex      (26)







In step 270 as well, one or more multiples c of the value of the minimum distance dmin may be determined. The determination of these multiples may depend notably on the size Demx and Demy of the emission surface S, and on the initial position (X0, Y0) of the initial geometric curve f0(X,Y).


For example and without limitation, if the initial position (0, Y0) is fixed so as to be centred along the axis Y in the initialized calculation matrix, the multiples c × dmin may be determined for the interval indices c defined according to the following expression (27):









c = [1; ⌊Demy/(2 × dmin-q)⌋]      (27)







In step 290, the set of isolines (also called isoline paths) of a configuration may be determined based on the distance map and as a function of the multiples c of the minimum distance value dmin. In particular, for each order c, an isoline associated with the implemented initial geometric curve f0(X,Y) corresponds to the position of the points of the calculation matrix whose distance value is closest to c × dmin. Thus, in step 290, the waveguide configuration and/or the electrode configuration are determined.



FIG. 15 (that is to say step 290) illustrates the result of a set of isolines associated with the initial geometric waveguide curve g0(X,Y) corresponding to the isoline of order c=0. The determined set of isolines may thus be obtained for isolines defined based on the orders c=[−7; −1] and c=[1;7]. In this example, each waveguide gp of order p then corresponds to an isoline of order c.
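Steps 270 and 290 can be sketched together as below. The grid, the initial curve, the waveguide width gy and the half-pixel tolerance band are assumptions for illustration; a Euclidean distance transform again stands in for the fast-marching method, and dmin follows expression (25).

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

# Steps 270-290 sketch: extract isoline paths at the multiples c x d_min
# of the minimum spacing distance (grid and parameter values assumed).
nb = 401
matrix = np.ones((nb, nb), dtype=bool)
x = np.arange(nb)
y0 = (nb // 2 + 40 * np.sin(2 * np.pi * x / nb)).astype(int)
matrix[y0, x] = False                          # initial curve f0, order c = 0
dist = distance_transform_edt(matrix)

g_y = 5.0                                      # assumed waveguide width (pixels)
d_min = 3.0 * g_y                              # expression (25): d_min-p = 3 x g_y
c_max = int((nb / 2) / d_min)                  # bound in the spirit of (27)

isolines = {}
for c in range(1, c_max + 1):
    # points of the matrix whose distance is closest to c x d_min
    mask = np.abs(dist - c * d_min) < 0.5      # half-pixel tolerance band
    isolines[c] = np.argwhere(mask)

print(len(isolines))                           # number of isoline orders
```

Each extracted point set of order c then corresponds to one waveguide gp of the configuration, mirroring the correspondence between orders c and waveguides described above.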


The steps of generating the configurations of the set of waveguides gp and of the set of electrodes eq on the emission surface S in the design phase of the method for manufacturing the image projection device 10 make it possible in particular to minimize degradations of the projected image related to spatial periodicities between the emission points EPpq, and to optimize the number of emission points EPpq on the emission surface S so as to maximize the resolution of the image formed on the retina R, for example in order to obtain an optimized contrast.


In particular, implementing fast-marching (or level-set) methods to calculate the distance maps in step 250 makes it possible to promote an aperiodic distribution of the intersections between the waveguides gp and the electrodes eq, and therefore the aperiodic distribution of the various emission points EPpq, thereby leading to an improvement in image quality.



FIG. 16a shows the result of a simulation of the spatial distribution of the intensity of the impulse response (that is to say the point spread function, PSF) on the retina R formed by illumination points associated with emission point distributions on the emission surface S defined based on waveguide and electrode configurations obtained by translating a segmented curve along the axis Y and the axis X, respectively. These configurations are called configurations based on “segment translations” hereinafter. Although such configurations improve the aperiodicity of the distribution of the various emission points EPpq compared to rectilinear configurations, they do not make it possible to avoid diffraction effects during the formation of the image on the retina R since, as illustrated in the simulation in FIG. 16a, the intensity of the impulse response exhibits lines in the noise generated by the alignment of certain emission points EPpq.


Conversely, FIG. 16b shows the result of a simulation of the spatial distribution of the intensity of the impulse response (that is to say the point spread function, PSF) on the retina R formed by illumination points associated with emission point distributions on the emission surface S defined based on isoline waveguide and electrode configurations. As illustrated in FIG. 16b, such configurations make it possible to limit diffraction effects during the formation of the image on the retina R, and thus to obtain more uniform noise.


Moreover, the number of intersections between the waveguides gp and the electrodes eq, and therefore of emission points EPpq, for an emission surface S associated with isoline waveguide and electrode configurations is greater (notably by more than 14%) than for an emission surface S associated with waveguide and electrode configurations based on “segment translations”, each emission surface S in this case comprising the same numbers Mx of waveguides gp and My of electrodes eq. This significant increase in the number of emission points EPpq induces an increase in density, and therefore an increase in the possible number of pixels of the image to be projected and/or in the intensity of the illuminated points on the retina R, while at the same time improving the aperiodic distribution of the various emission points EPpq.



FIGS. 17a and 17b show the results of the comparison of criteria for quantifying the quality of the image that are generated from two distinct emission surfaces S, each associated with a random draw of 25000 emission points EPpq. The first emission surface S is associated with waveguide and electrode configurations based on “segment translations”, and the second emission surface S is associated with isoline waveguide and electrode configurations.


In particular, FIG. 17a shows the results of the comparison of the power ratios γ for each of the distinct emission surfaces S. As illustrated, implementing isolines makes it possible to reduce the power ratio γ compared to configurations based on “segment translations”, meaning a reduction in the power of the intensity noise formed by the emission point distribution relative to the power of the central intensity peak, and thus an improvement in the quality of the projected image.



FIG. 17b shows the results of the comparison of the signal-to-noise ratios (SNR) for each of the distinct emission surfaces S. It should be noted that the SNR criterion, like the power ratio γ, makes it possible to quantify the efficiency of the self-focusing of an emission point distribution EPDuv at a point Ruv on the retina R of the eye. As illustrated in FIG. 8a, the SNR criterion may be evaluated based on the difference between the height of the central intensity peak formed by the emission point distribution and the height of the peak referred to as the ‘secondary peak’. As shown in FIG. 17b, implementing isolines makes it possible to increase the signal-to-noise ratio compared to configurations based on “segment translations”, which also means an improvement in the quality of the projected image.


According to some embodiments, the emission surface S may be discretized into (Lx×Ly) elementary emission zones Zij in the plane (X,Y), as illustrated by FIG. 18. Each elementary emission zone Zij may be associated with a number mxij of waveguides gpij and with a number myij of electrodes eqij, so as to generate a subset of (mxij×myij) emission points EPpqij defined among the (Mx×My) emission points EPpq of the emission surface S. Each elementary emission zone Zij may then be associated with a finite number of emission point distributions EPDuvij. Each emission point distribution EPDuvij of an elementary emission zone Zij then consists of one or more emission points EPpqij determined solely among the (mxij×myij) emission points EPpqij of the elementary emission zone Zij. For each elementary emission zone Zij, the subset of mxij waveguides gpij may be arranged in a non-rectilinear isoline waveguide sub-configuration, and the subset of myij electrodes eqij may be arranged in a non-rectilinear isoline electrode sub-configuration.
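The assignment of emission points EPpq to elementary emission zones Zij in a continuous Lx×Ly mesh can be sketched as follows. The surface size, the mesh dimensions, the number of points and their random placement are assumptions for illustration only.

```python
import numpy as np

# Sketch: assign each emission point EP_pq to an elementary emission zone
# Z_ij of a continuous Lx x Ly mesh of the surface S (values assumed).
D_emx, D_emy = 5.0e-3, 5.0e-3            # emission surface size, 5 mm x 5 mm
Lx, Ly = 10, 10                           # Lx x Ly elementary zones Z_ij

rng = np.random.default_rng(1)
points = rng.uniform([0.0, 0.0], [D_emx, D_emy], size=(25000, 2))  # EP_pq

# Zone indices (i, j) for each emission point; the clip keeps points on
# the far edge inside the last zone.
i = np.minimum((points[:, 0] / (D_emx / Lx)).astype(int), Lx - 1)
j = np.minimum((points[:, 1] / (D_emy / Ly)).astype(int), Ly - 1)

# Number of emission points EP_pqij falling in each zone Z_ij
counts = np.zeros((Lx, Ly), dtype=int)
np.add.at(counts, (i, j), 1)

print(counts.sum())   # 25000: the mesh is continuous, every point in one zone
```

The per-zone counts play the role of the subset sizes mxij×myij: each zone's points are then available to be grouped into its own emission point distributions EPDuvij.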


Advantageously, for each elementary emission zone Zij, the isoline waveguide sub-configuration and/or the isoline electrode sub-configuration may be generated based on the application of the fast-marching (or level-set) method as described notably by the steps of the design phase for generating the configurations.


For example and without limitation, for an elementary emission zone Zij, a non-rectilinear isoline sub-configuration may be formed from an initial geometric curve extending in the plane of the emission surface S on either side of the elementary emission zone Zij, from an intermediate initial position to an intermediate final position that are arranged along an axis of extension X′ or Y′ substantially parallel to the axis X or to the axis Y.


In the embodiments described above, each emission point EPpq is formed from a hologram hpq inscribed in a holographic film H and illuminated by light extracted from a light guide gp. In these cases, each hologram hpq makes it possible to control the phase and to control the angle of the light wave extracted at the diffraction grating rpq. Other alternatives for forming (or configuring) the set of orientation elements and therefore the emission points may be envisaged.


In particular, the fourth layer (that is to say the holographic film H) may be replaced by a spatial light modulator (SLM) arranged facing the diffraction grating. An SLM located at an emission point may be formed by liquid crystals, for example. The SLM may then be configured (that is to say manufactured) to control the phase of the light wave extracted at the diffraction grating rpq. In this case, each diffraction grating rpq may be configured (that is to say manufactured) to control the angle of the light wave in addition to extracting the light wave. As a variant, the fourth layer may additionally comprise a holographic film with holograms hpq configured (that is to say manufactured) to control only the angle of the light wave.


It should be noted that some features of the invention may have advantages when considered separately.


Those skilled in the art will readily understand that some method steps and sub-steps described above may be carried out simultaneously and/or in a different order, for example in an order defined based on the characteristics of the image projection device 10.


The device and the methods described above according to the embodiments of the invention or sub-elements of this system may be implemented in various ways using hardware, software or a combination of hardware and software, notably in the form of program code able to be distributed in the form of a program product, in various forms.


The invention is not limited to the embodiments described above by way of non-limiting example. It encompasses all variant embodiments that might be envisaged by those skilled in the art. In particular, those skilled in the art will understand that the invention is not limited to the various elementary emission zones and to the various isoline configurations of the image projection device described by way of non-limiting example. In particular, some embodiments of the invention may be combined.

Claims
  • 1. An image projection device for projecting an image onto an eye, the device being defined in an orthogonal reference system (X,Y,Z) and comprising an emission surface S extending generally in the plane (X,Y) of said orthogonal reference system (X,Y,Z), the emission surface S comprising a stack of elements, said elements comprising a set of Mx waveguides gp, a set of Mx×My diffraction gratings rpq and a set of My electrodes eq, Mx and My being positive integers whose product Mx×My is strictly greater than 1, each diffraction grating rpq being positioned at the intersection of one of said waveguides gp and of one of said electrodes eq so as to form an emission point EPpq for a light wave, wherein said emission surface S is discretized into a plurality of Lx×Ly elementary emission zones Zij in a continuous mesh in the plane (X,Y), each elementary emission zone Zij comprising a subset of mxij×myij emission points EPpqij among the Mx×My emission points EPpq of the emission surface S, said subset of mxij×myij emission points EPpqij being distributed in a number ηxij×ηyij of emission point distributions EPDuvij, the emission points EPpqij of one and the same emission point distribution EPDuvij of said elementary emission zone Zij being configured to emit a resultant light wave directed with a wave vector {right arrow over (κ)}uvij contained in an angular domain defined based on the number Lx×Ly of elementary emission zones Zij discretizing said emission surface S and on the position of said elementary emission zone Zij on said emission surface S, the number ηxij×ηyij of emission point distributions EPDuvij corresponding to the number ηxij×ηyij of pixels of said image to be projected in said angular domain.
  • 2. The image projection device according to claim 1, wherein the discretization of said emission surface S is uniform in the plane (X,Y).
  • 3. The image projection device according to claim 1, wherein the discretization of said emission surface S is non-uniform in the plane (X,Y).
  • 4. The image projection device according to claim 1, wherein, for each elementary emission zone Zij, the distribution of the emission points EPpqij in an emission point distribution EPDuvij is determined randomly or pseudo-randomly.
  • 5. The image projection device according to claim 1, wherein said stack of the emission surface S furthermore comprises a set of Mx×My holograms hpq, each hologram hpq being positioned at said intersection between one of said waveguides gp and one of said electrodes eq so as to form said emission point EPpq, the holograms hpq associated with said emission points EPpqij of one and the same emission point distribution EPDuvij of said elementary emission zone Zij being encoded such that said emission points EPpqij emit light waves that are angle-matched and phase-matched to one another so as to generate said resultant light wave defined according to said direction of the wave vector {right arrow over (κ)}uvij contained in an angular domain.
  • 6. The image projection device according to claim 1, wherein the device furthermore comprises, in the plane (X,Y), at least one other emission surface Sxy distinct from said emission surface S, said other emission surface Sxy being discretized into elementary emission zones comprising emission points designed to emit a light wave in a direction contained in a determined angular domain along an optical axis centred with respect to a point Prxy and directed towards said emission surface Sxy, said point Prxy being associated with the position of the eye, after the eye has rotated in its orbit towards said emission surface Sxy.
  • 7. The image projection device according to claim 1, wherein the device furthermore comprises, in the plane (X,Y), at least one other emission surface S identical to said emission surface S, said other emission surface S being discretized into elementary emission zones comprising emission points configured to emit a light wave in a direction contained in an angular domain defined along the axis Z and centred with respect to a point Ptxy associated with the translation of the eye in the plane (X,Y).
  • 8. The image projection device according to claim 1, wherein said elementary emission zones Zij have a size in the plane (X,Y) of between 200 μm and 800 μm.
  • 9. A transparent portable optical data display system comprising an image projection device according to claim 1, wherein said system is a glasses system or an augmented reality headset.
  • 10. A method for manufacturing the image projection device according to claim 1, the method comprising a phase of designing said device and a phase of physically manufacturing said device thus designed, characterized in that said design phase comprises the following steps: discretizing said emission surface S into Lx×Ly elementary emission zones Zij, each elementary emission zone Zij comprising a subset of mxij×myij emission points EPpqij; distributing said subset of mxij×myij emission points EPpqij into ηxij×ηyij emission point distributions EPDuvij; for each elementary emission zone Zij, assigning ηxij×ηyij emission point distributions EPDuvij to ηxij×ηyij pixels of said image to be projected; determining, for each emission point distribution EPDuvij, the direction of the wave vector {right arrow over (κ)}uvij of the light wave emitted by the emission points EPpqij, the wave vector {right arrow over (κ)}uvij being contained in an angular domain defined based on the number Lx×Ly of elementary emission zones Zij discretizing said emission surface S and on the position of said elementary emission zone Zij on said emission surface S.
Priority Claims (1)
Number Date Country Kind
2303528 Apr 2023 FR national