This disclosure relates generally to optical communications and optical sensing systems, methods, and structures. More particularly, it describes the machine learning based classification of higher-order spatial modes.
As is known in the optical communications and optical sensing arts, spatial modes are mathematical functions that can be used to describe the transverse (i.e., perpendicular to the direction of propagation of a light beam) spatial dependence of the complex amplitude of the electric field of a light beam. These mathematical functions are solutions to an electromagnetic wave equation. For example, the light beam of a conventional laser pointer is a spatial mode that is referred to as the fundamental spatial mode, i.e., the lowest order solution to the wave equation.
The spatial dependence of the intensity of the complex amplitude of the electric field of the fundamental spatial mode is characterized by being brightest at the light beam's center and becoming gradually less bright farther from the beam's center. There are, however, higher-order solutions to the wave equation, i.e., higher-order spatial modes. The complex electric field amplitudes of higher-order spatial modes have more complex spatial dependencies. For example, there are higher-order spatial modes referred to as Hermite-Gaussian modes.
Higher-order spatial modes can propagate through free space (e.g. Earth's atmosphere, outer space) and waveguides (e.g. optical fibers). As a result, higher-order spatial modes are receiving significant interest in the communications and sensing arts.
For example, higher-order spatial modes can be used to increase the data speeds of free space and optical fiber communication at a given wavelength (i.e., spectral efficiency), where each higher-order spatial mode is used either as a data state with which to encode data or as a channel over which data is encoded. Also, higher-order spatial modes can be used to enhance image resolution in microscopy, where image resolutions beyond wavelength-dependent limits can be achieved via illumination with higher-order spatial modes (i.e., super-resolution).
For any spatial mode, its classification is often necessary, especially with respect to the applications noted above. For fundamental spatial modes, classification comprises characterization of the spatial modes' quality via the so-called M2 factor, i.e., a product of the beam's measured size and divergence. However, higher-order spatial modes are more numerous, and the complex amplitude of the electric field of each has a more complex spatial dependence. Therefore, classification of higher-order spatial modes requires a more complex spatial analysis, including differentiation of the higher-order spatial modes from each other. Measurement of the M2 factor alone is insufficient.
Canonical systems and methods to classify higher-order spatial modes comprise indirect measurement of the complex amplitude of a light beam's electric field. Typically, the complex amplitude of a light beam is indirectly measured using interferometric or holographic techniques via unconventional optical devices/elements. Such unconventional optical devices/elements must emulate the complex spatial dependencies of the complex amplitudes of the electric fields of higher-order spatial modes. Unconventional optical devices/elements include liquid crystal or micro-mirror based spatial light modulators, or custom reflective, refractive, or diffractive optical elements (e.g. spiral phase plates, q-plates, fork diffraction gratings, meta-materials).
While arguably effective, interferometric or complex holographic techniques via unconventional optical devices/elements may have prohibitive complexity, i.e., dependence on a light beam's alignment, size, wave front (e.g. curvature), polarization, and wavelength. Additionally, unconventional optical devices/elements may have prohibitive cost and limited efficacy, since they require a quality of fabrication that determines how well the complex spatial dependencies of the complex amplitudes of the electric fields of higher-order spatial modes can be emulated.
An advance in the art is made according to aspects of the present disclosure directed to improved systems, methods, and structures providing machine learning based classification of higher order spatial modes.
In sharp contrast to the prior art, systems, methods, and structures according to aspects of the present disclosure advantageously provide:
A more complete understanding of the present disclosure may be realized by reference to the accompanying drawing in which:
The illustrative embodiments are described more fully by the Figures and detailed description. Embodiments according to this disclosure may, however, be embodied in various forms and are not limited to specific or illustrative embodiments described in the drawing and detailed description.
The following merely illustrates the principles of the disclosure. It will thus be appreciated that those skilled in the art will be able to devise various arrangements which, although not explicitly described or shown herein, embody the principles of the disclosure and are included within its spirit and scope.
Furthermore, all examples and conditional language recited herein are intended to be only for pedagogical purposes to aid the reader in understanding the principles of the disclosure and the concepts contributed by the inventor(s) to furthering the art and are to be construed as being without limitation to such specifically recited examples and conditions.
Moreover, all statements herein reciting principles, aspects, and embodiments of the disclosure, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future, i.e., any elements developed that perform the same function, regardless of structure.
Thus, for example, it will be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative circuitry embodying the principles of the disclosure.
Unless otherwise explicitly specified herein, the FIGs comprising the drawing are not drawn to scale.
By way of some additional background, we begin by noting that United States Patent Application Publication US2015/0292941 entitled “Modal Decomposition of a Laser Beam,” describes a system and method to classify higher-order spatial modes that uses holographic techniques via an unconventional optical device, namely a spatial light modulator. A light beam of interest illuminates a display of the spatial light modulator. The spatial light modulator displays a hologram corresponding to a spatial mode or spatial modes. The hologram modulates the complex amplitude of the electric field of the light beam. Then, the light beam undergoes a Fourier transform upon focusing through the effect of a lens. The higher-order spatial mode of the light beam can then be classified by detecting the power of the transformed light beam.
While apparently effective, the system and method disclosed therein requires the use of an unconventional optical device, i.e., a spatial light modulator. Additionally, this system and method is dependent on the light beam's alignment, size, wave front (e.g. curvature), polarization, and wavelength. If the alignment, size, wave front, polarization, and wavelength are not those required by the spatial light modulator or its displayed hologram, the light beam may be classified as an incorrect spatial mode.
United States Patent Application Publication US2017/0010463, entitled “Device for processing light/optical radiation, method and system for designing such a device,” describes another system and method that classifies higher-order spatial modes, where multiple custom reflective optical elements comprising a cavity are used. Operationally, a light beam is directed into the cavity and reflected back and forth off the included elements. After a sufficient number of reflections, the light beam exits the cavity. The reflective surfaces of the elements are made such that they modify the complex amplitude of the electric field of the light beam. After the light beam exits the cavity, the higher-order spatial mode that characterizes it may be classified by detecting the power of the modified light beam.
While apparently effective, this system and method requires the use of custom reflective optical elements and is dependent on the light beam's alignment, size, wave front (e.g. curvature), polarization, and wavelength. As with other prior art systems and methods, if the alignment, size, wave front, polarization, and wavelength are not those required by the cavity or its custom reflective optical elements, the light beam may not reflect off of the elements properly, and the light beam's complex electric field amplitude may not be modified properly. In turn, the light beam may be classified as an incorrect higher-order spatial mode.
United States Patent Application Publication US2018/0262291A1 entitled “Method for free space optical communication utilizing patterned light and convolutional neural networks” describes yet another method to classify higher-order spatial modes, where a light beam is incident on an image capturing device, which captures an image of the transverse spatial dependence of the intensity of the light beam. The captured image of the transverse spatial dependence of the intensity of the light beam is then classified using a neural network.
While apparently effective, this system and method is dependent on the light beam's alignment and size. For example, a type of neural network referred to as a convolutional neural network can have a network layer referred to as a pooling layer. Pooling depends on the sizes of the kernel (i.e., the convolutional filter) and the stride (i.e., the pixel interval by which the kernel is translated between subsequent convolutions) that are used. Effectively, a convolutional neural network that uses a pooling layer is sensitive to variations of size and alignment of a few pixels. As such, the light beam may be classified as an incorrect higher-order spatial mode when it is misaligned by only a few pixels.
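To make this alignment sensitivity concrete, the following minimal sketch (NumPy only; the 2x2 max-pool, 8x8 grid, and 2-pixel shift are illustrative choices, not parameters from the cited publication) shows how a small translation of an intensity pattern changes the pooled output:

```python
import numpy as np

def max_pool_2x2(img):
    """2x2 max pooling with stride 2 (assumes even image dimensions)."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

# Toy "beam" intensity: a single bright spot on an 8x8 grid.
beam = np.zeros((8, 8))
beam[3, 3] = 1.0

# Same beam, misaligned by two pixels.
shifted = np.roll(beam, shift=2, axis=1)

print(max_pool_2x2(beam))     # bright value lands in one pooled cell...
print(max_pool_2x2(shifted))  # ...and in a different cell after a 2-pixel shift,
                              # illustrating the alignment sensitivity noted above.
```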
Additionally, this system and method require the neural network to be trained by experimentally acquiring training examples. Experimentally acquired training examples comprise experimentally captured images of the transverse, spatial dependencies of the intensities of higher-order spatial modes. For example, for a convolutional neural network, thousands of experimentally captured images are required for each higher-order spatial mode that requires classification. As such, the use of experimentally captured training examples may be prohibitive.
Finally, this system and method cannot easily classify the higher-order spatial modes of a light beam from a multimode optical fiber because a light beam from a multimode optical fiber has a complex electric field amplitude that has two orthogonal polarization components, where the complex electric field amplitude of each polarization component is a linear combination of higher-order spatial modes. As such, the complex electric field amplitude of each polarization component must be classified separately.
Given these and other infirmities in the prior art with respect to determining higher-order spatial modes, an advance in the art is made according to aspects of the present disclosure in which an image capture device captures image data of the transverse, spatial dependencies of the intensities of two orthogonal polarization components of a light beam emitted from a multimode optical fiber. The light beam's alignment, size, and wave front vary with respect to the image capture device. Through the effect of polarization optical elements, the transverse, spatial dependencies of the intensities of the at least two orthogonal polarization components of the light beam are separated and captured separately by the image capture device as image data.
Using a classifier, a processor classifies the image data of the transverse, spatial dependencies of the intensities of the at least two orthogonal polarization components of the light beam emitted from the multimode optical fiber, whose alignment, size, and wave front vary with respect to the image capture device. The classifier is trained off-line using synthesized training examples of higher-order spatial modes, comprising varying alignments, sizes, and wave fronts.
As will be appreciated by those skilled in the art and in sharp contrast to prior art systems and methods for higher-order spatial mode determination, systems, methods and structures according to aspects of the present disclosure provide:
In operation, the multimode optical fiber supports N higher-order spatial modes, where N=1, 2, . . . . The transverse cross section of the optical fiber comprises regions that have indices of refraction, given by n2 and n1, referred to as the core and the cladding, respectively. n2 and n1 may be functions of the transverse spatial coordinates, given by (x, y), i.e., n2(x, y) and n1(x, y).
As shown schematically in the figures, the light beam emitted from the multimode optical fiber 1200 has a complex electric field amplitude that has two orthogonal polarization components, where the complex electric field amplitude of each polarization component is a linear combination of higher-order spatial modes, given by the equation:

$\vec{u}(x, y) = \vec{x}\,u_x(x, y) + \vec{y}\,u_y(x, y), \quad u_x(x, y) = \sum_{m,n} c^{x}_{m,n}\,HG_{m,n}(x, y), \quad u_y(x, y) = \sum_{m,n} c^{y}_{m,n}\,HG_{m,n}(x, y)$ (1)

where the light beam is propagating in the z-direction and (x, y) are rectangular coordinates. In Equation 1, Hermite-Gaussian modes are used to describe higher-order spatial modes. However, other higher-order spatial modes can also be used, e.g. Laguerre-Gaussian modes, “linearly polarized” modes, “vector” modes, etc.
As those skilled in the art will readily understand and appreciate, higher-order spatial modes are mathematical functions that describe the transverse spatial dependence of the complex amplitude of a light beam, as shown above. The mathematical functions are solutions to an electromagnetic wave equation. Advantageously, the electromagnetic wave equation can take into account the boundary conditions of an optical fiber. For example, the Helmholtz wave equation is given by:
$\nabla^2_{(x,y)}\,\vec{u}(x, y) + k^2\,\vec{u}(x, y) = 0$ (2)

where $\nabla^2_{(x,y)}$ is the Laplacian in rectangular transverse coordinates and $k = 2\pi/\lambda$, where $\lambda$ is the light beam's wavelength.
Hermite-Gaussian modes 1211 such as those shown in the figures are given by the equation:

$HG_{m,n}(x, y) = c_{m,n}\,H_m\!\left(\tfrac{\sqrt{2}\,x}{w}\right) H_n\!\left(\tfrac{\sqrt{2}\,y}{w}\right) \exp\!\left(-\tfrac{x^2 + y^2}{w^2}\right)$ (3)

where $H_m(\cdot)$ and $H_n(\cdot)$ are Hermite polynomials, $w$ is the waist size of the higher-order spatial modes, $c_{m,n}$ are complex coefficients, and $m, n = 0, 1, 2, \ldots$.
The transverse, spatial dependencies of the intensities of Hermite-Gaussian modes are given by $|HG_{m,n}(x, y)|^2$. As may be observed, the transverse, spatial dependencies of the intensities of $HG_{0,0}$, $HG_{0,1}$, $HG_{1,0}$, $HG_{1,1}$, $HG_{0,2}$, and $HG_{2,0}$ are shown in the figures.
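As an illustrative sketch only (not part of the disclosure), the Hermite-Gaussian intensity patterns described above can be evaluated numerically; scipy.special.eval_hermite supplies the Hermite polynomials, and the grid extent and waist size below are arbitrary assumed values:

```python
import numpy as np
from scipy.special import eval_hermite

def hg_mode(m, n, x, y, w):
    """Unnormalized Hermite-Gaussian mode HG_{m,n} evaluated on a grid."""
    s = np.sqrt(2.0) / w
    return (eval_hermite(m, s * x) * eval_hermite(n, s * y)
            * np.exp(-(x**2 + y**2) / w**2))

# Evaluate |HG_{1,0}|^2 on a 256x256 grid with an assumed waist w = 1.0.
coords = np.linspace(-3.0, 3.0, 256)
x, y = np.meshgrid(coords, coords)
intensity = np.abs(hg_mode(1, 0, x, y, w=1.0))**2
```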
Laguerre-Gaussian modes 1212 such as those shown in the figures are given by the equation:

$LG_{l,p}(r, \theta) = c_{l,p}\left(\tfrac{\sqrt{2}\,r}{w}\right)^{|l|} L_{l,p}\!\left(\tfrac{2 r^2}{w^2}\right) \exp\!\left(-\tfrac{r^2}{w^2}\right) \exp(i l \theta)$ (4)

where $(r, \theta)$ are polar coordinates, $L_{l,p}(\cdot)$ are generalized Laguerre polynomials, $w$ is the waist size of each spatial mode, $c_{l,p}$ are complex coefficients, and $l = 0, \pm 1, \pm 2, \ldots$; $p = 0, 1, 2, \ldots$.
The transverse, spatial dependencies of the intensities of Laguerre-Gaussian modes are given by $|LG_{l,p}(r, \theta)|^2$. The transverse, spatial dependencies of the intensities of $LG_{0,0}$, $LG_{0,+1}$, $LG_{0,-1}$, $LG_{0,+2}$, and $LG_{0,-2}$ are shown in the figures.
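Similarly, a brief illustrative sketch of the Laguerre-Gaussian intensities, using scipy.special.eval_genlaguerre for the generalized Laguerre polynomials (grid and waist again arbitrary assumed values):

```python
import numpy as np
from scipy.special import eval_genlaguerre

def lg_mode(l, p, r, theta, w):
    """Unnormalized Laguerre-Gaussian mode with azimuthal index l and radial index p."""
    rho = 2.0 * r**2 / w**2
    return ((np.sqrt(2.0) * r / w)**abs(l) * eval_genlaguerre(p, abs(l), rho)
            * np.exp(-(r**2) / w**2) * np.exp(1j * l * theta))

coords = np.linspace(-3.0, 3.0, 256)
x, y = np.meshgrid(coords, coords)
r, theta = np.hypot(x, y), np.arctan2(y, x)
intensity = np.abs(lg_mode(1, 0, r, theta, w=1.0))**2   # donut-shaped intensity (l = +1, p = 0)
```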
Note that higher-order spatial modes can also be “linearly polarized” modes, “vector” modes, or any other higher-order spatial modes that are solutions to the electromagnetic wave equation, which takes into account the boundary conditions of an optical fiber. Note further that $\vec{x}$ and $\vec{y}$ are the two orthogonal polarization components of the complex amplitude of the electric field of the light beam from the multimode optical fiber.
As will be understood, the at least two orthogonal polarization components can be:
The transverse, spatial dependence of the intensity of the light beam from the multimode optical fiber is given by the equation:
$I(x, y) = |\vec{u}(x, y)|^2$ (5)

$I_x(x, y) = |u_x(x, y)|^2$ (5a)

$I_y(x, y) = |u_y(x, y)|^2$ (5b)
wherein $I_x(x, y)$ is the transverse, spatial dependence of the intensity of the $\vec{x}$ polarization component of the complex amplitude of the electric field of the light beam from the multimode optical fiber, as given by Equation 5a; and $I_y(x, y)$ is the transverse, spatial dependence of the intensity of the $\vec{y}$ polarization component of the complex amplitude of the electric field of the light beam from the multimode optical fiber, as given by Equation 5b.
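A short sketch of Equations (5)-(5b), assuming the two-component complex field is stored as a NumPy array of shape (2, H, W) holding the x and y components (an illustrative data layout, not one specified by the disclosure):

```python
import numpy as np

def component_intensities(u):
    """u: complex field of shape (2, H, W); returns (I, I_x, I_y) per Eqs. (5)-(5b)."""
    i_x = np.abs(u[0])**2
    i_y = np.abs(u[1])**2
    return i_x + i_y, i_x, i_y
```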
As illustrated in the figures, the image capture device captures the following image data:

The image capture device captures the transverse, spatial dependence of the intensity of the $\vec{x}$ polarization component of the light beam from the multimode optical fiber, $I_x(x, y)$, given by Equation 5a.

The image capture device captures the transverse, spatial dependence of the intensity of the $\vec{y}$ polarization component of the light beam from the multimode optical fiber, $I_y(x, y)$, given by Equation 5b.

The image capture device captures the transverse, spatial dependence of the intensity of the diagonal, $\vec{d} = (\vec{x} + \vec{y})/\sqrt{2}$, polarization component of the light beam from the multimode optical fiber, $I_d(x, y)$, given by the equation:

$I_d(x, y) = |\vec{u}(x, y)\cdot\vec{d}|^2$ (5c)

The image capture device captures the transverse, spatial dependence of the intensity of the anti-diagonal, $\vec{a} = (\vec{x} - \vec{y})/\sqrt{2}$, polarization component of the light beam from the multimode optical fiber, $I_a(x, y)$, given by the equation:

$I_a(x, y) = |\vec{u}(x, y)\cdot\vec{a}|^2$ (5d)

The image capture device captures the transverse, spatial dependence of the intensity of the right-circular, $\vec{r} = (\vec{x} + i\vec{y})/\sqrt{2}$, polarization component of the light beam from the multimode optical fiber, $I_r(x, y)$, given by the equation:

$I_r(x, y) = |\vec{u}(x, y)\cdot\vec{r}|^2$ (5e)
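The diagonal, anti-diagonal, and right-circular intensities of Equations (5c)-(5e) can be sketched as Jones-vector projections (the conjugated projection used here is a common convention, and the field layout matches the previous sketch; both are assumptions):

```python
import numpy as np

# Jones vectors for the diagonal, anti-diagonal, and right-circular components.
d_hat = np.array([1, 1]) / np.sqrt(2)
a_hat = np.array([1, -1]) / np.sqrt(2)
r_hat = np.array([1, 1j]) / np.sqrt(2)

def projected_intensity(u, jones):
    """|<jones, u>|^2 for a complex field u of shape (2, H, W)."""
    proj = np.einsum('i,ihw->hw', np.conj(jones), u)
    return np.abs(proj)**2

# I_d = projected_intensity(u, d_hat)
# I_a = projected_intensity(u, a_hat)
# I_r = projected_intensity(u, r_hat)
```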
Next, polarization optical elements 1600 separate the transverse, spatial dependencies of the intensities of the at least two orthogonal polarization components of the light beam from the multimode optical fiber 1231, 1232, as shown in illustrative configuration 1 in the upper portion of the figure.
In an illustrative embodiment, the device that records digital images of the transverse, spatial dependencies of the intensities of the at least two orthogonal polarization components of the light beam from the multimode optical fiber is a conventional pixelated camera.
The complex amplitude of the electric field of the light beam from the multimode optical fiber, whose alignment, size, and wave front vary with respect to the image capture device, is given by the equation:
$\vec{u}'(x + \delta x, y + \delta y; w + \delta w) = \vec{u}(x + \delta x, y + \delta y; w + \delta w)\,\exp\!\big(i\,\phi(x + \delta x, y + \delta y; w + \delta w)\big)$ (6)
where $\phi(x + \delta x, y + \delta y; w + \delta w)$ is the wave front of the light beam, which can be described as a linear combination of Zernike polynomials, given by the equation:

$\phi(x + \delta x, y + \delta y; w + \delta w) = \sum_{i,j} R_{i,j}(r'; w')\,\big[A_{i,j}\cos(j\theta') + B_{i,j}\sin(j\theta')\big]$ (7)

where $R_{i,j}(r'; w')$ is the radial Zernike polynomial, $(r', \theta')$ are polar coordinates of the displaced and resized beam, $A_{i,j}$ and $B_{i,j}$ are coefficients, and $i, j = 0, 1, 2, \ldots$.
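For illustration, a sketch of a Zernike-type wave front (the radial polynomial here follows the standard (n, m) index convention, which is an assumption and may not match the disclosure's R indexing; the coefficient values are arbitrary):

```python
import numpy as np
from math import factorial

def zernike_radial(n, m, rho):
    """Standard radial Zernike polynomial R_n^m(rho), with 0 <= rho <= 1 and n - |m| even."""
    m = abs(m)
    out = np.zeros_like(rho)
    for k in range((n - m) // 2 + 1):
        c = ((-1)**k * factorial(n - k)
             / (factorial(k) * factorial((n + m) // 2 - k) * factorial((n - m) // 2 - k)))
        out = out + c * rho**(n - 2 * k)
    return out

# Example: a phase map with a little astigmatism (n=2, m=2) and coma (n=3, m=1).
coords = np.linspace(-1.0, 1.0, 128)
x, y = np.meshgrid(coords, coords)
rho, theta = np.hypot(x, y), np.arctan2(y, x)
phi = (0.3 * zernike_radial(2, 2, rho) * np.cos(2 * theta)
       + 0.1 * zernike_radial(3, 1, rho) * np.sin(theta))
aberration = np.exp(1j * phi)   # phase factor that multiplies the mode field, as in Equation (6)
```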
The transverse, spatial dependencies of the intensities of the light beam from the multimode optical fiber, and of its $\vec{x}$ polarization component and $\vec{y}$ polarization component, whose alignment, size, and wave front (e.g. curvature) vary with respect to the image capture device, given by Equation 6, are given by the equations:
$I'(x + \delta x, y + \delta y; w + \delta w) = |\vec{u}'(x + \delta x, y + \delta y; w + \delta w)|^2$ (8)

$I'_x(x + \delta x, y + \delta y; w + \delta w) = |u'_x(x + \delta x, y + \delta y; w + \delta w)|^2$ (8a)

$I'_y(x + \delta x, y + \delta y; w + \delta w) = |u'_y(x + \delta x, y + \delta y; w + \delta w)|^2$ (8b)
As may be observed with reference to the figures, the waist size of the light beam varies by an amount given by $\delta w$, and the wave front of the light beam has an aberration given by $\phi(x + \delta x, y + \delta y; w + \delta w)$ of Equation 7.
Using a classifier 2000, a processor classifies the image data of the transverse, spatial dependencies of the intensities of the two orthogonal polarization components of the light beam from the multimode optical fiber, whose alignment, size, and wave front vary with respect to the image capture device. The classifier is trained off-line using synthesized training examples of higher-order spatial modes, comprising varying alignments, sizes, and wave fronts.
The synthesized training examples of higher-order spatial modes are numerically calculated, transverse, spatial dependencies of the intensities of higher-order spatial modes, whose alignment, size, and wave front vary with respect to the camera.
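A condensed sketch of how such synthesized training examples might be generated (this reuses the hg_mode helper from the earlier sketch; the ranges of displacement and waist variation are arbitrary assumptions, and wave front variation is omitted from this simplified sketch):

```python
import numpy as np

rng = np.random.default_rng(0)
coords = np.linspace(-3.0, 3.0, 64)

def synth_example(m, n, w0=1.0):
    """One synthesized |HG_{m,n}|^2 image with a random offset and waist size."""
    dx, dy = rng.uniform(-0.5, 0.5, size=2)   # alignment variation (assumed range)
    dw = rng.uniform(-0.2, 0.2)               # waist-size variation (assumed range)
    x, y = np.meshgrid(coords - dx, coords - dy)
    img = np.abs(hg_mode(m, n, x, y, w0 + dw))**2   # hg_mode from the earlier sketch
    return img / img.max()

# e.g., 250 labeled examples for each of the modes HG_{0,0} ... HG_{1,1}
modes = [(0, 0), (0, 1), (1, 0), (1, 1)]
images = np.stack([synth_example(m, n) for m, n in modes for _ in range(250)])
labels = np.repeat(np.arange(len(modes)), 250)
```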
For example, the numerically calculated, transverse, spatial dependencies of the complex amplitudes of the electric fields and the intensities of Hermite-Gaussian modes, whose alignment, size, and wave front vary with respect to the camera, are given by the equations, respectively:
For example, the numerically calculated, transverse, spatial dependencies of the complex amplitudes of the electric fields and the intensities of Laguerre-Gaussian modes, whose alignment, size, and wave front vary with respect to the camera, are given by the equations, respectively:
The numerically calculated transverse, spatial dependencies of the intensities of higher-order spatial modes are displaced an amount given by δx in the x-direction with respect to the z-axis.
The numerically calculated transverse, spatial dependencies of the intensities of higher-order spatial modes are displaced an amount given by δy in the y-direction with respect to the z-axis. The numerically calculated transverse, spatial dependencies of the intensities of higher-order spatial modes have waist sizes that vary by an amount given by δw.

The classifier that classifies the image data captured by the image capture device, i.e., the transverse, spatial dependencies of the intensities of the two orthogonal polarization components of the light beam from the multimode optical fiber, whose alignment, size, and wave front vary with respect to the image capture device, is one selected from the group consisting of: a convolutional neural network and a support vector machine.
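As one illustrative possibility (a support vector machine from scikit-learn on flattened intensity images, reusing the synthesized images and labels from the previous sketch; the hyperparameters are arbitrary, and a convolutional neural network could be substituted), the off-line training and subsequent classification might look like:

```python
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Flatten each synthesized intensity image into a feature vector.
X = images.reshape(len(images), -1)
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.2, random_state=0)

clf = SVC(kernel="rbf", C=10.0)     # support vector machine classifier
clf.fit(X_train, y_train)           # trained off-line on synthesized examples
print("held-out accuracy:", clf.score(X_test, y_test))

# At run time, a captured (similarly sized and normalized) camera image would be
# classified with clf.predict(captured_image.reshape(1, -1)).
```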
In various embodiments, the numerically calculated transverse, spatial dependencies of the intensities of higher-order spatial modes are of the $\vec{x}$, $\vec{y}$, $\vec{d}$, $\vec{a}$, $\vec{r}$, or $\vec{l}$ polarization component of the light beam from the multimode optical fiber.
The numerically calculated transverse, spatial dependencies of higher-order spatial modes are given by the set of Stokes parameters:
$S_0 = I_x(x + \delta x, y + \delta y; w + \delta w) + I_y(x + \delta x, y + \delta y; w + \delta w)$ (12a)

$S_1 = I_x(x + \delta x, y + \delta y; w + \delta w) - I_y(x + \delta x, y + \delta y; w + \delta w)$ (12b)

$S_2 = I_d(x + \delta x, y + \delta y; w + \delta w) - I_a(x + \delta x, y + \delta y; w + \delta w)$ (12c)

$S_3 = I_r(x + \delta x, y + \delta y; w + \delta w) - I_l(x + \delta x, y + \delta y; w + \delta w)$ (12d)
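A short sketch of Equations (12a)-(12d), assuming the six projected intensity images I_x, I_y, I_d, I_a, I_r, and I_l are available as equally sized NumPy arrays:

```python
import numpy as np

def stokes_images(i_x, i_y, i_d, i_a, i_r, i_l):
    """Pixel-wise Stokes parameter images S0..S3 from the projected intensities."""
    s0 = i_x + i_y
    s1 = i_x - i_y
    s2 = i_d - i_a
    s3 = i_r - i_l
    return np.stack([s0, s1, s2, s3])
```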
At this point, while we have presented this disclosure using some specific examples, those skilled in the art will recognize that our teachings are not so limited. Accordingly, this disclosure should be only limited by the scope of the claims attached hereto.
This application claims the benefit of U.S. Provisional Patent Application Ser. No. 62/746,140 filed 16 Oct. 2018, the entire contents of which are incorporated by reference as if set forth at length herein.
References Cited

Number | Name | Date | Kind
US 2009/0262337 | Nicholson | Oct. 2009 | A1
US 2013/0173194 | Dholakia | Jul. 2013 | A1
US 2014/0098361 | Fini | Apr. 2014 | A1
US 2015/0292941 | Forbes | Oct. 2015 | A1
Publication: US 2020/0119830 A1, Apr. 2020, United States.

Related U.S. Application Data: Provisional Application No. 62/746,140, Oct. 2018, United States.