Pixdoor: A Pixel-space Backdoor Attack on Deep Learning Models

Iram Arshad, Mamoona Naveed Asghar, Yuansong Qiao, Brian Lee, Yuhang Ye

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

4 Citations (Scopus)

Abstract

Deep learning algorithms outperform traditional machine learning techniques in various fields and are widely deployed for recognition and classification tasks. However, recent research has focused on exploring the weaknesses of these deep learning models, which can be vulnerable due to outsourced training data and transfer learning. This paper proposes a rudimentary, stealthy pixel-space-based backdoor attack (Pixdoor) mounted during the training phase of deep learning models. To generate the poisoned dataset, a bit-inversion technique is used to inject errors into the pixel bits of training images. Then, 3% of poisoned data is mixed with the clean dataset to corrupt the complete training image dataset. The experimental results show that this minimal percentage of data poisoning can effectively fool a deep learning model with a high degree of accuracy. Likewise, in our experiments we observe only a marginal degradation of model accuracy, by 0.02%.
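
As a rough illustration of the poisoning step described in the abstract, the sketch below shows one way a bit-inversion backdoor could be applied to a training set. It is a minimal sketch, not the authors' implementation: it assumes 8-bit images held in NumPy arrays, uses the 3% poison rate mentioned above, and the choice of which pixel bits to flip (`bit_mask`) and which target label to assign are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def poison_dataset(x_train, y_train, target_label=7, poison_rate=0.03,
                   bit_mask=0b00000100, rng=None):
    """Return copies of (x_train, y_train) with a fraction of samples poisoned.

    Poisoning inverts the bits selected by `bit_mask` in every pixel of the
    chosen images (XOR with the mask) and relabels those images as
    `target_label`. The mask and target label are hypothetical choices.
    """
    rng = rng or np.random.default_rng(0)
    x_poisoned, y_poisoned = x_train.copy(), y_train.copy()

    # Select a small random subset of the training set to poison (3% here).
    n_poison = int(poison_rate * len(x_train))
    idx = rng.choice(len(x_train), size=n_poison, replace=False)

    # Bit inversion in pixel space: XOR flips the masked bits of each pixel.
    x_poisoned[idx] = x_poisoned[idx].astype(np.uint8) ^ bit_mask
    y_poisoned[idx] = target_label
    return x_poisoned, y_poisoned
```

A model trained on the mixed clean-plus-poisoned set would then, under this scheme, learn to associate the inverted-bit pattern with the attacker's target class while behaving normally on clean inputs.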

Original language: English
Title of host publication: 29th European Signal Processing Conference, EUSIPCO 2021 - Proceedings
Publisher: European Signal Processing Conference, EUSIPCO
Pages: 681-685
Number of pages: 5
ISBN (Electronic): 9789082797060
DOIs
Publication status: Published - 2021
Event: 29th European Signal Processing Conference, EUSIPCO 2021 - Dublin, Ireland
Duration: 23 Aug 2021 - 27 Aug 2021

Publication series

Name: European Signal Processing Conference
Volume: 2021-August
ISSN (Print): 2219-5491

Conference

Conference: 29th European Signal Processing Conference, EUSIPCO 2021
Country/Territory: Ireland
City: Dublin
Period: 23/08/21 - 27/08/21

Keywords

  • Backdoor attack
  • Causative attack
  • Pixel-space
  • Poisoned dataset
  • Training phase
