Nathan Peck
nathanpeck.com
did:plc:s2r7bqedf3z7rytysbuhg3br
Additionally, latency doesn’t matter. When you are waiting 10-20 seconds for inference compute to complete, what’s an extra 200 ms around the globe?
LLMs can and should only be run in areas with high renewable power availability.
The reason this isn’t reality is the national AI arms race and greed.
2025-10-10T04:45:34.682Z