Up-to-date curated list of state-of-the-art large vision-language model hallucination research: papers, code & resources
Updated: Jan 28, 2025
[ICLR 2025] MLLM can see? Dynamic Correction Decoding for Hallucination Mitigation
Fully automated LLM evaluator for detecting hallucinations
Detecting hallucinations in LLM outputs
[ICLR 2025] Data-Augmented Phrase-Level Alignment for Mitigating Object Hallucination