The surge of interest in applications of deep neural networks has led to a surge of interest in verification methods for such architectures. In summer 2020, the first international competition on neural network verification was held. This paper presents and evaluates the main optimizations used in the nnenum tool, which outperformed all other tools in the ACAS Xu benchmark category, sometimes by orders of magnitude. The method uses fast abstractions for speed, combined with refinement through ReLU splitting to increase accuracy when properties cannot be proven. Although abstraction refinement is a classic approach in formal methods, directly applying it to the neural network verification problem actually reduces performance, due to a cascade of overapproximation error when using abstraction. This makes optimizations and their systematic evaluation essential for high performance.
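The ReLU splitting mentioned in the abstract can be illustrated on simple intervals: when a neuron's pre-activation range straddles zero, the abstract state is refined into two exact linear branches instead of one overapproximating convex region. The sketch below is illustrative only; the function name and interval representation are assumptions, not nnenum's actual data structures (which operate on star sets / zonotopes).

```python
def relu_split(lo, hi):
    """Return exact linear branches for ReLU over the interval [lo, hi].

    Illustrative sketch of a ReLU case split (not nnenum's implementation):
      - lo >= 0: the neuron is always active, ReLU is the identity
      - hi <= 0: the neuron is always inactive, ReLU is constantly zero
      - otherwise ("unstable" neuron): split at zero into two branches,
        each of which is exactly linear, avoiding the overapproximation
        error that a single convex relaxation would introduce
    Each branch is a pair (input_interval, output_interval).
    """
    if lo >= 0:                       # always active: identity branch
        return [((lo, hi), (lo, hi))]
    if hi <= 0:                       # always inactive: zero branch
        return [((lo, hi), (0.0, 0.0))]
    # unstable neuron: case split at zero yields two exact branches
    return [((lo, 0.0), (0.0, 0.0)), ((0.0, hi), (0.0, hi))]
```

Each split doubles the number of branches in the worst case, which is why applying refinement naively degrades performance and the paper's optimizations are needed.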
Citation:
Bak, S. (2021). nnenum: Verification of ReLU Neural Networks with Optimized Abstraction Refinement. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12673 LNCS, pp. 19–36). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-76384-8_2