Deep learning architectures have emerged as powerful function approximators for a broad spectrum of complex representation learning tasks, such as computer vision, natural language processing, and collaborative filtering. These architectures have a high potential to learn the intrinsic structure of data and extract valuable insights. Despite the surge in the development of state-of-the-art intelligent systems built on deep neural networks (DNNs), these systems have been found to be vulnerable to adversarial examples produced by adding small-magnitude perturbations. Such adversarial examples are adept at misleading DNN classifiers. Various attack strategies have been proposed to produce adversarial examples in the digital, physical, and transform domains, but generating perceptually realistic adversarial examples requires further research effort. In this paper, we present a novel approach that produces adversarial examples by combining the single-shot fast gradient sign method (FGSM) with spatial- as well as transform-domain image processing techniques. The resulting perturbations suppress noise in low-intensity regions, injecting it only into selected high-intensity regions of the input image. When the customized perturbation is combined with the one-step FGSM perturbation in an untargeted black-box attack scenario, the proposed approach successfully fools state-of-the-art DNN classifiers, with 99% of the adversarial examples misclassified on the ImageNet validation dataset.
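
As an illustration of the kind of attack the abstract describes, the following is a minimal sketch of a one-step FGSM perturbation restricted to high-intensity regions. The masking rule (a threshold on mean channel intensity), the `intensity_thresh` parameter, and the way the mask is combined with the FGSM sign are assumptions for exposition only; they do not reproduce the paper's exact spatial- and transform-domain processing. Note also that in the paper's black-box setting the gradient would be taken from a surrogate model, since the target classifier's gradients are unavailable.

```python
# Sketch: single-shot FGSM whose perturbation is zeroed out in
# low-intensity regions of the input image. The intensity-threshold
# masking rule below is a hypothetical stand-in for the paper's
# spatial/transform-domain processing.
import torch
import torch.nn.functional as F

def masked_fgsm(model, x, y, eps=8 / 255, intensity_thresh=0.5):
    """One-step FGSM with noise confined to high-intensity regions.

    x: input batch in [0, 1], shape (N, C, H, W); y: integer labels.
    intensity_thresh is a hypothetical parameter: pixels whose mean
    channel intensity falls below it receive no perturbation.
    """
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    grad = torch.autograd.grad(loss, x)[0]

    # Standard single-shot FGSM direction.
    delta = eps * grad.sign()

    # Keep noise only where the image is bright (assumed masking rule).
    intensity = x.detach().mean(dim=1, keepdim=True)   # (N, 1, H, W)
    mask = (intensity >= intensity_thresh).float()     # 1 in bright regions
    x_adv = (x.detach() + delta * mask).clamp(0.0, 1.0)
    return x_adv
```

Confining the perturbation to high-intensity regions exploits the fact that noise is perceptually less salient in bright, textured areas than in smooth, dark ones, which is consistent with the abstract's goal of perceptually realistic adversarial examples.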