Artificial Intelligence (AI) is part of our everyday life and has become one of the most prominent and strategic technologies of the 21st century. Explainable AI (XAI for short) is expected to endow AI systems with the ability to explain themselves when interacting with humans. This paper describes how to provide kids with natural explanations, i.e., explanations verbalized in natural language, in the context of recognizing the roles of basketball players. Semantic grounding is achieved through fuzzy concepts such as tall or short. Selected players are automatically classified by an ensemble of three different decision trees and one fuzzy rule-based classifier. All the individual classifiers were first trained with the open-source Weka software, and natural explanations were then generated by the open-source web service ExpliClas. The human-computer interaction interface is implemented in Scratch, a visual programming language designed for kids. The resulting Scratch program is used for dissemination purposes when high-school teenagers visit the Research Center in Intelligent Technologies of the University of Santiago de Compostela.
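As a minimal sketch of the semantic-grounding idea, the fuzzy concept "tall" can be modeled as a membership function returning a degree between 0 and 1, which a simple fuzzy rule can then map to a player role. The height thresholds and the role rule below are illustrative assumptions, not the actual fuzzy partitions or rule base used in the paper.

```python
def tall_membership(height_cm: float) -> float:
    """Degree to which a player counts as 'tall'.

    Illustrative linear ramp: 0 at or below 170 cm, 1 at or above 190 cm.
    These thresholds are assumptions for the sketch, not the paper's values.
    """
    if height_cm <= 170:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return (height_cm - 170) / 20.0


def short_membership(height_cm: float) -> float:
    """'Short' modeled as the complement of 'tall' in this toy partition."""
    return 1.0 - tall_membership(height_cm)


def guess_role(height_cm: float) -> str:
    """Toy fuzzy rule: IF player is tall THEN center, ELSE guard.

    A hypothetical single-rule classifier; the paper's system combines
    three decision trees with a fuzzy rule-based classifier.
    """
    return "center" if tall_membership(height_cm) > 0.5 else "guard"
```

A fuzzy degree such as `tall_membership(180) = 0.5` is what allows the explanation service to verbalize hedged statements like "the player is somewhat tall", instead of a hard yes/no threshold.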
Alonso, J. M. (2020). Explainable artificial intelligence for kids. In Proceedings of the 11th Conference of the European Society for Fuzzy Logic and Technology, EUSFLAT 2019 (pp. 134–141). Atlantis Press. https://doi.org/10.2991/eusflat-19.2019.21