PRODeep: A platform for robustness verification of deep neural networks

Abstract

Deep neural networks (DNNs) have been applied in safety-critical domains such as self-driving cars, aircraft collision avoidance systems, and malware detection. In such scenarios, it is important to provide formal guarantees for the robustness property, namely that outputs remain invariant under small perturbations of the inputs. Several algorithms and tools have been developed recently for this purpose. In this paper, we present PRODeep, a platform for robustness verification of DNNs. PRODeep incorporates constraint-based, abstraction-based, and optimisation-based robustness checking algorithms. It has a modular architecture, enabling easy comparison of different algorithms. We illustrate the use of the tool and the easy combination of those techniques with experimental results.
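To make the robustness property concrete, the sketch below is a minimal, self-contained illustration and not PRODeep's interface. It uses Python with NumPy on a toy fully connected ReLU network with made-up weights, and performs an abstraction-based check in the spirit of interval bound propagation: it soundly over-approximates the network's outputs over an L-infinity ball of radius eps around an input and reports robustness only if the predicted class provably cannot change.

import numpy as np

def affine_bounds(lo, hi, W, b):
    # Propagate the box [lo, hi] through the affine map x -> W @ x + b.
    Wp, Wn = np.maximum(W, 0.0), np.minimum(W, 0.0)
    return Wp @ lo + Wn @ hi + b, Wp @ hi + Wn @ lo + b

def forward(layers, x):
    # Concrete forward pass of a fully connected ReLU network.
    for W, b in layers[:-1]:
        x = np.maximum(W @ x + b, 0.0)
    W, b = layers[-1]
    return W @ x + b

def verify_robust(layers, x, eps):
    # Sound but incomplete check: True means the predicted class provably
    # cannot change for any perturbation with L-infinity norm at most eps.
    target = int(np.argmax(forward(layers, x)))
    lo, hi = x - eps, x + eps
    for W, b in layers[:-1]:
        lo, hi = affine_bounds(lo, hi, W, b)
        lo, hi = np.maximum(lo, 0.0), np.maximum(hi, 0.0)  # ReLU is monotone
    lo, hi = affine_bounds(lo, hi, *layers[-1])            # output layer
    # Robust if the target's lower bound beats every other class's upper bound.
    others = np.delete(hi, target)
    return bool(lo[target] > np.max(others))

# Toy network and robustness query (weights are illustrative only).
rng = np.random.default_rng(0)
layers = [(rng.standard_normal((4, 3)), rng.standard_normal(4)),
          (rng.standard_normal((2, 4)), rng.standard_normal(2))]
x = np.array([0.1, -0.2, 0.3])
print(verify_robust(layers, x, eps=0.01))

When such an over-approximation is inconclusive, complete methods of the kind the abstract mentions (constraint-based or optimisation-based) would additionally search for a concrete counterexample or a proof of robustness.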

Cite

APA

Li, R., Li, J., Huang, C. C., Yang, P., Huang, X., Zhang, L., … Hermanns, H. (2020). PRODeep: A platform for robustness verification of deep neural networks. In ESEC/FSE 2020 - Proceedings of the 28th ACM Joint Meeting on European Software Engineering Conference and Symposium on the Foundations of Software Engineering (pp. 1630–1634). Association for Computing Machinery. https://doi.org/10.1145/3368089.3417918
