This paper presents a novel hybrid approach for building eXplainable Artificial Intelligence (XAI) systems. It combines an opaque machine learning system with several interpretable systems to build a complete XAI system, i.e., a system that provides users with a good interpretability-accuracy trade-off as well as explanation capabilities. First, the opaque system acts as an "oracle" that identifies the most plausible output. Then, the simplest interpretable system that produces the same output is selected. Finally, a textual explanation of the given output is generated; it emerges as an automatic interpretation of the inference process carried out by the selected interpretable system. The textual explanation is built by applying a natural language generation approach. The proposal is validated in a real use case related to beer style classification.
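The selection step described above (oracle fixes the output, then the simplest agreeing interpretable model is chosen to explain it) can be sketched as follows. This is a minimal illustration under assumed interfaces, not the paper's actual implementation: the function and class names (`select_explainer`, `ThresholdModel`, etc.) are hypothetical, and the template-based explanation stands in for the full natural language generation step.

```python
def select_explainer(oracle, interpretable_models, x):
    """Return the simplest interpretable model that agrees with the oracle on x.

    `interpretable_models` is assumed to be ordered from simplest to most
    complex; `oracle` and each model are assumed to expose `predict(x)`.
    """
    target = oracle.predict(x)           # step 1: the oracle fixes the output
    for model in interpretable_models:   # step 2: scan by increasing complexity
        if model.predict(x) == target:
            return model, target
    return None, target                  # no interpretable model agrees


# Toy stand-ins for demonstration only.
class ConstantModel:
    def __init__(self, label):
        self.label = label

    def predict(self, x):
        return self.label


class ThresholdModel:
    def __init__(self, threshold):
        self.threshold = threshold

    def predict(self, x):
        return "ale" if x > self.threshold else "lager"


oracle = ThresholdModel(5.0)  # plays the role of the opaque system
models = [ConstantModel("lager"), ThresholdModel(5.0)]

model, label = select_explainer(oracle, models, x=7.2)
# Template-based placeholder for the NLG step in the paper.
explanation = f"The sample is classified as '{label}' by the simplest agreeing model."
```

In this toy run the constant model disagrees with the oracle, so the threshold model is selected and its decision is verbalized; the paper's approach would instead verbalize the inference process of an interpretable (e.g., fuzzy) system.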
Keywords: Interpretability, Understandability, Comprehensibility, Explainable AI, Interpretable Fuzzy Systems