Tim Kellogg
timkellogg.me
this fits my mental model — LLMs *do* learn procedures. But it’s the same mechanics as what’s learning facts. So of course it would also hallucinate procedures
but also: what does procedure hallucination look like? i don’t think i have a grasp on that
https://machinelearning.apple.com/research/illusion-of-thinking
2025-06-08T15:36:14.049Z