Anushkaghei / Hallucination-Detection-In-LLMs
Detecting and mitigating self-contradictory hallucinations in LLMs using two approaches.
Topics: detection, rouge, multi-agent-systems, mitigation, bleu-score, bertscore, llms, stepback-prompting, feqa, questeval
Updated May 30, 2024 · Jupyter Notebook