Explainability is a key requisite for trustworthy AI, but selecting the right XAI method to accompany your model development can be a challenging task. eXplego is a decision-tree toolkit that provides developers with interactive guidance to help select an appropriate XAI method for their particular use case.
To get to know the tool, we recommend checking out the demonstration videos below:
English
Norwegian
More information about the tool, including a brief overview of the reasoning behind the positioning of each method in the tree, is provided in our short research paper:
which was presented as a demo at the 1st World Conference on Explainable AI (2023). To cite our tool, please use this citation.
We welcome feedback from our users on this tool. Please provide it by opening an issue in this repo.
This is a collaborative project between the Norwegian Computing Center and the Norwegian Labour and Welfare Administration (NAV), funded by BigInsight.
eXplego v1.0.3, last updated 2023-12-08. See the changelog for details.