Explainable Artificial Intelligence for Human-Centric Data Analysis in Virtual Learning Environments

The amount of data to analyze in virtual learning environments (VLEs) grows exponentially every day. Students’ daily interaction with VLE platforms leaves a digital footprint of their engagement with learning materials and activities. This large and valuable source of information must be managed and processed to become useful. Educational Data Mining and Learning Analytics are two research branches that have recently emerged to analyze educational data. Artificial Intelligence techniques are commonly used to extract hidden knowledge from data and to construct models that can be used, for example, to predict students’ outcomes. However, in the educational field, where the interaction between humans and AI systems is a main concern, there is a need to develop Explainable AI (XAI) systems that can communicate data analysis results in a human-understandable way. In this paper, we use an XAI tool, called ExpliClas, with the aim of facilitating data analysis in the context of the decision-making processes carried out by all the stakeholders involved in the educational process. The Open University Learning Analytics Dataset (OULAD) has been used to predict students’ outcomes, and both graphical and textual explanations of the predictions have shown the need for, and the effectiveness of, using XAI in the educational field.

Keywords: Educational Data Mining, Data Science, Trustworthy AI, Explainable AI, Virtual Learning Environments