CMA-R: Causal Mediation Analysis for Explaining Rumour Detection

This repository contains the source code for our paper CMA-R, accepted to the Findings of EACL 2024.

We apply causal mediation analysis to explain the decision-making process of neural rumour detection models on Twitter. Interventions at the input and network levels reveal the causal impact of individual tweets and words on the model output. We find that CMA-R (Causal Mediation Analysis for Rumour detection) identifies salient tweets that explain model predictions and shows strong agreement with human judgements on the critical tweets that determine the truthfulness of a story. CMA-R can further highlight causally impactful words within the salient tweets, adding another layer of interpretability and transparency to these black-box rumour detection systems.
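
To illustrate the input-level intervention idea described above, here is a minimal sketch that masks one tweet at a time in a thread and measures the change in a model's predicted rumour probability. The `predict_rumour_prob` callable, the `[MASK]` replacement strategy, and the toy model are assumptions for illustration only, not the actual CMA-R implementation.

```python
# Sketch of an input-level intervention: replace one tweet at a time with a
# mask token and record how the model's rumour probability changes.
# The model here is a stand-in callable; the real pipeline would use a
# trained neural rumour detector over the full Twitter thread.

from typing import Callable, List


def tweet_level_effects(
    thread: List[str],
    predict_rumour_prob: Callable[[List[str]], float],
    mask_token: str = "[MASK]",
) -> List[float]:
    """For each tweet, return the drop in rumour probability when that
    tweet is replaced by the mask token (a crude input-level intervention)."""
    baseline = predict_rumour_prob(thread)
    effects = []
    for i in range(len(thread)):
        intervened = thread.copy()
        intervened[i] = mask_token  # do(tweet_i = MASK)
        effects.append(baseline - predict_rumour_prob(intervened))
    return effects


if __name__ == "__main__":
    # Toy stand-in model: rumour probability grows with the number of
    # question marks in the thread (purely illustrative).
    def toy_model(thread: List[str]) -> float:
        n_questions = sum(t.count("?") for t in thread)
        return min(1.0, 0.1 + 0.2 * n_questions)

    thread = [
        "BREAKING: bridge collapse downtown???",
        "Is this confirmed?",
        "Official statement: no incident reported.",
    ]
    for tweet, effect in zip(thread, tweet_level_effects(thread, toy_model)):
        print(f"{effect:+.2f}  {tweet}")
```

Tweets with larger effect values are the ones whose removal most changes the prediction, which is the intuition behind ranking tweets (and, analogously, words) by their causal impact.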