Abstract
Transparency and reproducibility are important aspects of validating Machine Learning (ML) models; they are not fully understood and apply independently of the application domain. We offer a case study of reproducibility that highlights the challenges encountered when attempting to reproduce analyses obtained with ML methods in materials informatics. Our study explores prediction results obtained with ML models and issues in the training data serving as input. We discuss challenges related to theory-driven and numerical errors in training data, lack of reproducibility across platforms and versions, and the effects of randomness when varying hyperparameters. In addition to model accuracy, a primary metric of interest in the ML community, our results show that model sensitivity may be equally important for applying ML in domain applications such as materials science.
Pouchard, L., Lin, Y., & Van Dam, H. (2020). Replicating machine learning experiments in materials science. In Advances in Parallel Computing (Vol. 36, pp. 743–755). IOS Press BV. https://doi.org/10.3233/APC200105