Quantum sensors based on cold atoms have enormous potential to unlock new capabilities in GPS-denied navigation, civil engineering, intelligence, and Earth observation. However, operating these devices in realistic environments is currently extremely challenging, and for the most part the advantages of choosing a quantum sensor over a conventional alternative are lost in the transition from the laboratory to noisy field environments. In this work, we demonstrate for the first time in hardware that tailored light pulses, designed and implemented in software using robust control techniques, can substantially mitigate some of the most pernicious error sources in a Bragg atom interferometer. We show experimentally that embedding robust control in sensor operation can improve the signal-to-noise ratio of a state-of-the-art Bragg-pulse cold-atom interferometric sensor by a factor of 4× under ideal conditions. In the presence of laser-intensity noise varying by up to 20% from shot to shot, commensurate with common platform vibrations, we show experimentally that the same robust control solutions preserve fringe visibility with minimal degradation while the utility of primitive Gaussian pulses collapses, delivering at least an 8× improvement in phase-estimation uncertainty compared with primitive pulse schemes. Across all observations, robust control delivers better performance in a noisy environment than the native hardware achieves with primitive pulses under approximately ideal conditions. Finally, building on this demonstration, we present a validated theoretical concept that extends this performance improvement to compact devices using concatenated sequences of robust pulses designed to enhance the sensor's scale factor. Time-domain simulations reveal up to a 10× performance enhancement in the presence of realistic atomic-cloud effects at 102ℏk momentum separation. These results show for the first time that software-defined quantum sensor operation can deliver useful performance in environmental regimes where primitive operation is impossible, providing a pathway to augment the performance of current and next-generation portable cold-atom inertial sensors in real fielded settings.
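To illustrate the mechanism behind this robustness, the following minimal sketch simulates a resonant π/2–π–π/2 Mach–Zehnder sequence on a two-level system and compares fringe contrast for a primitive π mirror against a composite mirror under shot-to-shot Rabi-rate (laser-intensity) error. The BB1 composite pulse used here is a textbook stand-in for the numerically optimised robust pulses demonstrated in this work, only the mirror is replaced for brevity, and the square-pulse error model is an assumption for illustration only.

```python
# Minimal sketch: fringe contrast of a Mach-Zehnder pulse sequence under
# shot-to-shot laser-intensity (Rabi-rate) error. The "robust" mirror here
# is the BB1 composite pulse, a stand-in for the numerically optimised
# pulses of this work; pulse shapes and the error model are assumptions.
import numpy as np

SX = np.array([[0, 1], [1, 0]], dtype=complex)
SY = np.array([[0, -1j], [1j, 0]], dtype=complex)

def rot(theta, phi):
    """Resonant rotation by angle theta about an axis at azimuth phi."""
    axis = np.cos(phi) * SX + np.sin(phi) * SY
    return np.cos(theta / 2) * np.eye(2) - 1j * np.sin(theta / 2) * axis

def bb1_pi(eps):
    """BB1 composite pi pulse; robust to fractional amplitude error eps."""
    s = 1 + eps
    p1 = np.arccos(-1 / 4)  # BB1 correction phase for a target pi rotation
    return (rot(np.pi * s, p1) @ rot(2 * np.pi * s, 3 * p1)
            @ rot(np.pi * s, p1) @ rot(np.pi * s, 0))

def fringe_contrast(mirror, eps):
    """Contrast of P(phi_L) for a pi/2 - mirror - pi/2(phi_L) sequence."""
    s = 1 + eps  # the same amplitude error afflicts every pulse
    psi0 = np.array([1, 0], dtype=complex)
    probs = []
    for phi in np.linspace(0, 2 * np.pi, 200):  # scan the fringe phase
        U = rot(np.pi / 2 * s, phi) @ mirror(eps) @ rot(np.pi / 2 * s, 0)
        probs.append(abs((U @ psi0)[1]) ** 2)
    probs = np.array(probs)
    return (probs.max() - probs.min()) / (probs.max() + probs.min())

primitive = lambda eps: rot(np.pi * (1 + eps), 0)
for eps in (0.0, 0.1, 0.2):  # up to 20% shot-to-shot intensity variation
    print(f"eps={eps:+.2f}  primitive: {fringe_contrast(primitive, eps):.3f}"
          f"  BB1 mirror: {fringe_contrast(bb1_pi, eps):.3f}")
```

At a 20% amplitude error this toy model gives a contrast of roughly 0.45 for the all-primitive sequence versus roughly 0.8 with the composite mirror (the remaining loss coming from the still-primitive beamsplitters), qualitatively mirroring the fringe-visibility behaviour reported above.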
The sensitivity of atom interferometers depends on the fidelity of the light pulses used as beamsplitters and mirrors. Atom interferometers typically employ pulses that effect π/2 and π fractional Rabi oscillations, the fidelities of which are reduced by variations in atomic velocity and laser intensity. We have previously demonstrated the application of optimal control theory to design pulses that are more robust to such errors; however, if these variations exhibit a time dependence over periods on the order of the interferometer duration, then phase shifts can be introduced into the final fringe that potentially reduce the sensitivity. In this paper, we explain why care must be taken when optimising interferometer pulse sequences to ensure that phase shifts arising from inter-pulse variations are not significantly increased. We show that these phase shifts can in fact be minimised by choosing an appropriate measure of individual pulse fidelity.
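The sketch below illustrates why the choice of fidelity measure matters. For a square π/2 pulse on a two-level system, a detuning error (e.g. from atomic velocity) barely changes the transferred population but noticeably shifts the relative phase imprinted between the two output ports; if that detuning drifts between the first and last pulses, the shift appears directly in the final fringe. A population-only fidelity is blind to this, whereas a unitary-overlap fidelity, used here purely as one example of a phase-sensitive measure and not necessarily the measure adopted in the paper, penalises it. The square-pulse model and parameter values are illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions): phase imprinted by a square
# pi/2 pulse versus detuning error, compared against two pulse-fidelity
# measures. A population-only measure is blind to the imprinted phase;
# a unitary-overlap measure (one example of a phase-sensitive fidelity)
# registers it.
import numpy as np

SX = np.array([[0, 1], [1, 0]], dtype=complex)
SZ = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def beamsplitter(delta, omega=1.0):
    """Square pulse with pi/2 area on resonance and detuning error delta."""
    omega_g = np.hypot(delta, omega)          # generalised Rabi rate
    b = (np.pi / 4) * omega_g / omega         # half the accumulated angle
    return (np.cos(b) * I2
            - 1j * np.sin(b) * (delta * SZ + omega * SX) / omega_g)

U0 = beamsplitter(0.0)                        # ideal resonant beamsplitter
rel0 = np.angle(U0[0, 0] / U0[1, 0])          # ideal inter-port phase
for delta in (0.0, 0.05, 0.10, 0.20):         # detuning in units of the Rabi rate
    U = beamsplitter(delta)
    transfer = abs(U[1, 0]) ** 2                    # population-only view
    imprint = np.angle(U[0, 0] / U[1, 0]) - rel0    # imprinted phase shift
    fid = abs(np.trace(U0.conj().T @ U)) / 2        # phase-sensitive fidelity
    print(f"delta={delta:.2f}  transfer={transfer:.4f}  "
          f"imprinted phase={imprint:+.3f} rad  unitary overlap={fid:.4f}")
```

In this toy model, a 20% detuning error changes the transferred population by under 1% while shifting the imprinted phase by about 0.2 rad; only the unitary-overlap measure registers the deviation, which is the behaviour a phase-aware cost function can exploit during optimisation.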