POSTER: Not All Pixels are Born Equal: An Analysis of Evasion Attacks under Locality Constraints


Abstract

Deep neural networks (DNNs) have enabled success in learning tasks such as image classification, semantic image segmentation, and steering angle prediction, which can be key components of the computer vision pipeline of safety-critical systems such as autonomous vehicles. However, previous work has demonstrated the feasibility of using physical adversarial examples to attack image classification systems. In this work, we argue that the success of realistic adversarial examples is highly dependent on both the structure of the training data and the learning objective. In particular, realistic, physical-world attacks on semantic segmentation and steering angle prediction constrain the adversary to add localized perturbations, since it is very difficult to add perturbations across the entire field of view of input sensors such as cameras for applications like autonomous vehicles. We empirically study the effectiveness of adversarial examples generated under the strict locality constraints imposed by these applications. Even for image classification, we observe that the adversary's success under locality constraints depends on the training dataset. For steering angle prediction, we observe that adversarial perturbations localized to an off-road patch are significantly less successful than those placed on-road. For semantic segmentation, we observe that perturbations localized to small patches are only effective at changing the label in and around those patches, making non-local attacks difficult for an adversary. We further provide a comparative evaluation of these localized attacks over various datasets and deep learning models for each task.
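The locality constraint described above can be illustrated with a minimal sketch (our own illustration, not the authors' code): an FGSM-style gradient step that is masked to a small patch, so the perturbation touches only the pixels the adversary can physically control. For self-containment we use a toy linear classifier in place of a DNN; `localized_fgsm` and all of its parameters are hypothetical names introduced here.

```python
import numpy as np

def localized_fgsm(x, w, b, y, mask, eps=0.1):
    """One FGSM-style step confined to a patch.

    Toy linear classifier: score = w . x + b, predicted label = sign(score).
    The perturbation is zeroed outside the region where mask == 1, modeling
    an adversary who can only alter a localized patch of the input.
    """
    # Gradient of the margin y * (w . x + b) with respect to x is y * w;
    # stepping against its sign reduces the margin (untargeted attack).
    grad = y * w
    delta = -eps * np.sign(grad)
    # Locality constraint: no perturbation outside the patch.
    delta *= mask
    # Keep pixel values in a valid [0, 1] range.
    return np.clip(x + delta, 0.0, 1.0)

# Example: a 16-pixel "image", adversary controls only the first 4 pixels.
x = np.full(16, 0.5)
w = np.ones(16)
b, y = 0.0, 1
mask = np.zeros(16)
mask[:4] = 1.0
x_adv = localized_fgsm(x, w, b, y, mask, eps=0.2)
```

In this toy setting the margin `y * (w @ x + b)` drops only by the contribution of the patched pixels, which mirrors the paper's observation that patch-localized perturbations have limited influence outside the patch.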

Citation (APA)

Sehwag, V., Sitawarin, C., Bhagoji, A. N., Mosenia, A., Chiang, M., & Mittal, P. (2018). POSTER: Not All Pixels are Born Equal: An Analysis of Evasion Attacks under Locality Constraints. In Proceedings of the ACM Conference on Computer and Communications Security (Vol. 2018-January, pp. 2285–2287). Association for Computing Machinery. https://doi.org/10.1145/3243734.3278515
