Dataset Diversity for Metamorphic Testing of Machine Learning Software


Abstract

Machine learning software is non-testable in the sense that correct training results are not known in advance. Metamorphic testing, which uses a pseudo-oracle, is a promising approach to testing such machine learning programs. Because machine learning software operates on large collections of data, even slight changes in the input training dataset can have a large impact on training results. This paper proposes a new metamorphic testing method applicable to neural network learning models. Its key ideas are dataset diversity and a behavioral oracle. Dataset diversity takes into account the dataset dependency of training results and provides a new way of generating follow-up test inputs. The behavioral oracle monitors changes in certain statistical indicators as the training process proceeds and forms the basis of the metamorphic relations to be checked. The proposed method is illustrated with a case study: testing neural network programs that classify handwritten digits.

Citation (APA)

Nakajima, S. (2019). Dataset Diversity for Metamorphic Testing of Machine Learning Software. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11392 LNCS, pp. 21–38). Springer Verlag. https://doi.org/10.1007/978-3-030-13651-2_2
