[Context and motivation] In modern society, software systems are deeply integrated into our daily lives. Quality aspects such as ethics, fairness, and transparency have been discussed as essential for trustworthy software systems, and explainability has been identified as a means to achieve all three of these qualities in systems. [Question/problem] Like other quality aspects, explainability must be discovered and treated during the design of such systems. Although explainability has become a hot topic in several communities across different areas of knowledge, there is still little research on systematic explainability engineering. Yet methods and techniques from requirements and software engineering would add considerable value to explainability research. [Principal ideas/results] As a first step toward exploring this research landscape, we held an interdisciplinary workshop to collect ideas from different communities and to discuss open research questions. In a subsequent working group, we further analyzed and structured the workshop results to identify the most important research questions. As a result, we present a research roadmap for explainable systems. [Contribution] With our research roadmap, we aim to advance software and requirements engineering methods and techniques for explainable systems and to attract research on the most urgent open questions.
Citation:
Brunotte, W., Chazette, L., Klös, V., & Speith, T. (2022). Quo Vadis, Explainability? – A Research Roadmap for Explainability Engineering. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 13216 LNCS, pp. 26–32). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-98464-9_3