Post
Simon Willison
simonwillison.net
did:plc:kft6lu4trxowqmter2b6vg6z
Anyone had much success running long-context prompts through local LLMs? On my M2 64GB Mac I'm finding that longer prompts take an unreasonably long time to process; am I holding it wrong? Any models or serving platforms I should try out that might respond reasonably quickly?
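For scale, here is a rough back-of-envelope sketch of why long prompts feel slow locally: prompt processing (prefill) time grows with prompt length, and Apple Silicon prefill rates are far lower than datacenter GPUs. The rates below are illustrative assumptions, not benchmarks of any particular model or machine.

```python
def prefill_seconds(prompt_tokens: int, pp_tokens_per_sec: float) -> float:
    """Rough estimate of time to process (prefill) a prompt.

    Assumes an approximately linear cost per token, which is a
    simplification (attention cost actually grows with context).
    """
    return prompt_tokens / pp_tokens_per_sec

# Hypothetical numbers: if a local model prefills at ~100 tokens/sec,
# a 30,000-token prompt takes roughly 300 seconds before the first
# output token appears.
print(prefill_seconds(30_000, 100.0))
```

This is why decode speed (tokens/sec of output) can look fine while long prompts still stall: prefill is a separate, compute-bound phase that has to finish before generation starts.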
2025-03-09T16:51:58.547Z