The possibility of employing the spatial degrees of freedom of photons for communications has been gaining interest in recent years due to their unbounded dimensionality. A natural basis spanning the transverse profile of photons is the set of Laguerre-Gaussian (LG) modes, which are characterized by two topological numbers: l, the orbital index, describing the orbital angular momentum (OAM) of the beam in units of ℏ per photon, and p, the radial index (radial quantum number). One of the main challenges in utilizing LG modes for communications is performing mode sorting and demultiplexing of the incoming physical data flow. Currently, there are two leading approaches to mode demultiplexing. The first uses intricate optical setups in which the l and p degrees of freedom are coupled to other degrees of freedom, such as the propagation angle or the polarization of the beam; most of these methods address either the OAM or the radial degree of freedom. The second, which has emerged recently, uses just a camera to record the intensity of the incoming beam and a deep neural network (DNN) to classify it. To date, demonstrated DNN-based demultiplexers have addressed only the OAM degree of freedom.

We report an experimental demonstration of state-of-the-art mode demultiplexing of Laguerre-Gaussian beams according to both their orbital-angular-momentum and radial topological numbers, using a flow of two concatenated deep neural networks. The first network serves as a transfer function from experimentally generated to ideal, numerically generated data; it is trained with a unique "Histogram Weighted Loss" function that handles images in which only a small fraction of pixels carries significant information. The second network acts as a spatial-mode classifier. Our method uses only the intensity profile of the modes or their superpositions, with no need for any phase information.
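Since only intensity images reach the classifier, the ideal numerically generated modes can be synthesized directly from the standard Laguerre-Gaussian field expression at the beam waist. The sketch below illustrates this (it assumes NumPy/SciPy; the grid size, extent, and waist `w0` are illustrative choices, not the paper's actual parameters):

```python
import numpy as np
from scipy.special import genlaguerre
from math import factorial

def lg_mode(l, p, grid_size=128, extent=3.0, w0=1.0):
    """Complex field of an LG_{p,l} mode at the beam waist (z = 0).

    l : orbital index (OAM of l*hbar per photon), p : radial index.
    """
    x = np.linspace(-extent, extent, grid_size)
    X, Y = np.meshgrid(x, x)
    r = np.hypot(X, Y)          # radial coordinate
    phi = np.arctan2(Y, X)      # azimuthal coordinate
    rho = 2 * r**2 / w0**2
    # Standard normalization giving unit total power
    norm = np.sqrt(2 * factorial(p) / (np.pi * factorial(p + abs(l)))) / w0
    return (norm
            * (np.sqrt(2) * r / w0) ** abs(l)
            * genlaguerre(p, abs(l))(rho)   # generalized Laguerre polynomial L_p^{|l|}
            * np.exp(-r**2 / w0**2)
            * np.exp(1j * l * phi))         # OAM phase; dropped once |.|^2 is taken

# A camera records only |field|^2 — for a superposition the relative
# phase still shapes the intensity pattern, even though no phase
# measurement is performed.
field = lg_mode(l=2, p=1) + lg_mode(l=-1, p=0)
intensity = np.abs(field) ** 2
```

Note that for a single pure mode the intensity is azimuthally symmetric (the exp(i l phi) factor vanishes under the modulus), which is why distinguishing modes of opposite l generally requires superpositions or additional structure in the data.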