Abstract:
The current hype around Artificial Intelligence (AI) largely stems from the success of machine learning and its sub-domain of deep learning. However, AI also encompasses other areas, such as Knowledge Representation and Reasoning, or Distributed AI, i.e., areas that need to be combined to reach the level of intelligence initially envisioned in the 1950s. Explainable AI (XAI) is now regarded as a key enabler for industry to apply AI in products at scale, particularly for industries operating critical systems. This paper reviews XAI not only from a Machine Learning perspective, but also from the perspective of other AI research areas, such as AI Planning or Constraint Satisfaction and Search. We expose the XAI challenges of these AI fields, their existing approaches and limitations, and the opportunities for Knowledge Graphs and their underlying technologies.