In the field of model robustness, adversarial attacks have become one of the most powerful threats to the performance of deep models; an adversarial attack adds crafted noise to benign samples to generate adversarial examples that mislead the model. Attack strength and noise imperceptibility are the two main criteria for evaluating an adversarial attack. In this paper, we focus on attacks against object detection models. Previous gradient-sign based noise generation methods achieve strong attacks, but the adversarial examples they generate are usually easily perceived by the human visual system. This is mainly because the gradient-sign method applies a crude sign-noise strategy over the entire input space. To address this problem, we analyze the sensitivity of a deep detection model to its input space and, guided by this analysis, propose a gradient attack method based on spatial sensitivity. Specifically, inspired by the attention mechanism of deep models, we examine the global gradient information of the whole image and identify the spatial regions that play a key role in detection and classification; we then propose a noise-screening strategy based on this key gradient information to generate adversarial examples. This not only avoids the perceptibility flaw of directly attaching the sign of the global gradient, but also yields a stronger attack. Taking the Yolov3 detection model as an example, we conducted observational verification and attack experiments on the VOC detection dataset, and the results confirm the effectiveness of our method. This poses a greater challenge for building more effective model defenses.
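The core idea of screening noise by key gradient information can be sketched as follows. This is a minimal NumPy illustration, not the authors' exact implementation: the function `masked_sign_attack`, the `keep_ratio` parameter, and the random gradient stand-in are all assumptions for illustration; in practice the gradient would come from backpropagating the detector's loss to the input image.

```python
import numpy as np

def masked_sign_attack(image, grad, eps=8 / 255, keep_ratio=0.1):
    """Perturb only the most gradient-sensitive pixels.

    Instead of adding eps * sign(grad) everywhere (FGSM-style),
    keep only the top `keep_ratio` fraction of pixels by gradient
    magnitude and zero out the noise elsewhere, so the perturbation
    concentrates on regions the detector is most sensitive to.
    """
    mag = np.abs(grad)
    # Magnitude threshold at the (1 - keep_ratio) quantile.
    thresh = np.quantile(mag, 1.0 - keep_ratio)
    mask = (mag >= thresh).astype(grad.dtype)
    adv = image + eps * np.sign(grad) * mask
    # Keep the adversarial image a valid image in [0, 1].
    return np.clip(adv, 0.0, 1.0), mask

# Toy example: a random "image" and a random stand-in for the
# input gradient (a real attack would use the detector's gradient).
rng = np.random.default_rng(0)
img = rng.random((3, 32, 32)).astype(np.float32)
grad = rng.normal(size=(3, 32, 32)).astype(np.float32)
adv, mask = masked_sign_attack(img, grad, keep_ratio=0.1)
```

Only about 10% of the pixels are modified, and the perturbation at any pixel never exceeds `eps`, which is what makes the noise harder to perceive than a global sign attack.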