Adrian Chan
gravity7.bsky.social
did:plc:4rcqtj5petmffaalnh65jn6x
Let's dose an LLM and study its hallucinations!
LLMs were fed "blended" prompts — impossible conceptual combinations meant to elicit hallucinations. The models did not trip up; instead, they tried to reason their way through their responses.
https://arxiv.org/abs/2505.00557
2025-05-12T17:21:26.357Z