Magnetic tunnel junctions (MTJs) provide an attractive platform for implementing neural networks because of their simplicity, nonvolatility, and scalability. In a hardware realization, however, device variations, write errors, and parasitic resistance will generally degrade performance. To quantify such effects, we perform experiments on a 2-layer perceptron constructed from a 15 × 15 passive array of MTJs, examining classification accuracy and write fidelity. Despite these imperfections, we achieve classification accuracy of up to 95.3% with proper tuning of network parameters. The success of this tuning process shows that new metrics are needed to characterize and optimize networks reproduced in mixed-signal hardware.
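To make the effect of such non-idealities concrete, the following is a minimal Python/NumPy sketch (not the authors' code) of how device variation and write errors might be injected into a small 2-layer perceptron whose weights are stored in a 15 × 15 array. The function names, the multiplicative-Gaussian variation model, the sign-flip write-error model, the toy dataset, and all numerical parameters are illustrative assumptions, not values or models taken from the paper.

```python
# Sketch: accuracy of a toy 2-layer perceptron under assumed MTJ write errors
# and device-to-device variation. All models and parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def write_weights(w_ideal, sigma_var=0.05, p_write_error=0.0):
    """Map ideal weights onto array cells with two assumed non-idealities:
    multiplicative Gaussian device variation and random sign-flip write errors
    (a crude stand-in for cells that fail to switch to the intended state)."""
    w = w_ideal * rng.normal(1.0, sigma_var, w_ideal.shape)
    flips = rng.random(w.shape) < p_write_error
    w[flips] *= -1.0
    return w

def forward(x, w1, w2):
    h = np.tanh(x @ w1)              # hidden layer: 15 inputs -> 15 units
    return np.argmax(h @ w2, 1)      # output layer: 2 classes

# Toy data: two Gaussian classes separated along all 15 input dimensions.
n = 500
x = np.vstack([rng.normal(-0.5, 1.0, (n, 15)),
               rng.normal(+0.5, 1.0, (n, 15))])
y = np.repeat([0, 1], n)

# "Ideal" weights stand in for a software-trained network: a pass-through
# hidden layer and a mean-difference readout, chosen so the clean network
# classifies the toy data well without any training loop.
w1_ideal = np.eye(15)
w2_ideal = np.column_stack([-np.ones(15), np.ones(15)]) / 15.0

for p_err in [0.0, 0.01, 0.05, 0.10]:
    accs = [np.mean(forward(x,
                            write_weights(w1_ideal, p_write_error=p_err),
                            write_weights(w2_ideal, p_write_error=p_err)) == y)
            for _ in range(20)]  # average over repeated noisy writes
    print(f"write-error rate {p_err:.2f}: mean accuracy {np.mean(accs):.3f}")
```

Sweeping the assumed write-error rate in this way gives a qualitative picture of how classification accuracy falls off as write fidelity degrades, which is the kind of trade-off the experiments on the physical 15 × 15 array quantify.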