A proof of concept on dialogue games for explainable artificial intelligence
Recent years have witnessed a groundbreaking number of accurate artificial intelligence-based algorithms. However, their oftentimes obscure nature is known to prevent end users from using them safely and responsibly, or to decrease trust in their predictions or decisions. In this work, we present a novel dialogue game that serves as an explanatory dialogue model to communicate contrastive, selected, and social explanations from an interpretable rule-based classifier to an end user. In addition, we show how it can address the problem of diversity for such explanations via an empirical human evaluation study.
Keywords: Explainable AI, Dialogue game, Argumentative Conversational Agents
Publication: Book
April 8, 2025
Authors: I. Stepin, A. Catala, J.M. Alonso-Moral