A Universal Defense Strategy Against Adversarial Attacks Based on Attention-Guided

Abstract

Deep neural networks (DNNs) have been shown to be vulnerable to adversarial examples, whose existence significantly hinders the deployment of deep learning in domains with high security requirements. Moreover, current defense methods often lack universality, being effective only against specific adversarial attacks. This study analyzes adversarial examples through the changes they induce in model attention, classifying attack algorithms into attention-shifting and attention-attenuation categories. To counter attention-shifting attacks, a defense module named Feature Pyramid-based Attention Space-guided (FPAS) is proposed, which spatially retracts the shifted attention in adversarial examples, thereby enhancing the model's overall defense capability. To counter attention-attenuation attacks, a second defense module, Attention-based Non-Local (ANL), is proposed; it enhances the model's focus on critical features, efficiently constructing a robust defense model at low implementation cost and with minimal intrusion into the original model. By integrating FPAS and ANL into the Wide-ResNet model within a boosting framework, the study demonstrates their synergistic defense capability. Even against eight adversarial samples embedded with adversarial patches, the FPAS_at and ANL_at models improved the average defense rate over the baseline by 5.47% and 7.74%, respectively. Extensive experiments confirm that this universal defense strategy offers comprehensive protection against adversarial attacks at a lower implementation cost than current mainstream defense methods, and can be combined with existing defense strategies to further enhance adversarial robustness.
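The ANL module described above builds on the non-local (self-attention) mechanism, which re-weights each spatial position of a feature map by its similarity to every other position. A minimal NumPy sketch of that core operation is given below; the function name, weight shapes, and residual wiring are illustrative assumptions for exposition, not the paper's actual implementation.

```python
import numpy as np

def softmax(z, axis=-1):
    """Numerically stable softmax along the given axis."""
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def non_local_block(x, w_theta, w_phi, w_g, w_out):
    """Simplified non-local operation on a (C, H, W) feature map.

    w_theta, w_phi, w_g project C channels down to C' (queries, keys,
    values); w_out projects back up to C. Output keeps the input shape
    via a residual connection, so the block can be dropped into an
    existing backbone (e.g. Wide-ResNet) with minimal intrusion.
    """
    c, h, w = x.shape
    flat  = x.reshape(c, h * w)               # (C, N), N = H*W positions
    theta = w_theta @ flat                    # (C', N) queries
    phi   = w_phi   @ flat                    # (C', N) keys
    g     = w_g     @ flat                    # (C', N) values
    attn  = softmax(theta.T @ phi, axis=-1)   # (N, N) pairwise attention
    y     = g @ attn.T                        # (C', N) attended features
    out   = w_out @ y                         # (C, N) project back to C
    return x + out.reshape(c, h, w)           # residual: shape preserved

# Usage with random weights (C=8 channels, C'=4, 4x4 spatial map):
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 4, 4))
w_theta = rng.standard_normal((4, 8)) * 0.1
w_phi   = rng.standard_normal((4, 8)) * 0.1
w_g     = rng.standard_normal((4, 8)) * 0.1
w_out   = rng.standard_normal((8, 4)) * 0.1
y = non_local_block(x, w_theta, w_phi, w_g, w_out)
assert y.shape == x.shape
```

Because the block is a shape-preserving residual unit, it can be inserted after any convolutional stage without retraining the surrounding layers from scratch, which is consistent with the abstract's claim of low implementation cost and minimal intrusion.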
