Popular repositories
- Hallucination-Attack (Public, forked from PKU-YuanGroup/Hallucination-Attack)
  Attack to induce hallucinations in LLMs
  Python
- MIND (Public, forked from oneal2000/MIND)
  Source code of our paper MIND, ACL 2024 Long Paper
  Python
- ReEval-LLM-Hallucination (Public, forked from CogComp/ReEval-LLM-Hallucination)
  GitHub repo for "ReEval: Automatic Hallucination Evaluation for Retrieval-Augmented Large Language Models via Transferable Adversarial Attacks"
  Python
- gated_attention (Public, forked from qiuzh20/gated_attention)
  The official implementation of "Gated Attention for Large Language Models: Non-linearity, Sparsity, and Attention-Sink-Free" (NeurIPS 2025 Oral)
  Jupyter Notebook