Abstract
We present FeatureNET, an open-source Neural Architecture Search (NAS) tool that generates diverse sets of Deep Learning (DL) models. FeatureNET relies on a meta-model of deep neural networks consisting of generic, configurable entities. It then uses tools developed in the context of software product lines to generate diverse DL models, i.e., to maximize the differences between the generated models. The models are translated to Keras and can be integrated into typical machine learning pipelines. FeatureNET allows researchers to seamlessly generate a large variety of models. Thereby, it helps in choosing appropriate DL models and in performing experiments with diverse models (mitigating potential threats to validity). As a NAS method, FeatureNET successfully generates models that perform on par with handcrafted models.
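As a rough illustration (not the FeatureNET API itself, which is not shown in this abstract), a generated Keras model could be dropped into a standard training and evaluation pipeline along these lines; the model file path and dataset choice below are hypothetical placeholders:

```python
import tensorflow as tf

# Hypothetical path to a Keras model produced by the generator.
generated_model_path = "generated_models/model_0001.h5"

# Load the generated architecture and attach a standard training setup.
model = tf.keras.models.load_model(generated_model_path)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Example dataset; any pipeline-appropriate data would work the same way.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# Train and evaluate as with any handcrafted Keras model.
model.fit(x_train, y_train, epochs=5, validation_split=0.1)
print(model.evaluate(x_test, y_test))
```

In an experiment over many generated models, the same loop would simply iterate over the set of model files, which is what makes diversity-driven studies straightforward to script.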
Citation
Ghamizi, S., Cordy, M., Papadakis, M., & Le Traon, Y. (2020). FeatureNET: Diversity-Driven Generation of Deep Learning Models. In Proceedings - 2020 ACM/IEEE 42nd International Conference on Software Engineering: Companion, ICSE-Companion 2020 (pp. 41–44). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1145/3377812.3382153