RealClearScience
rcscience.bsky.social
did:plc:jl3x2o5lg4kue44dxabpgaw6
The same malicious prompts used to jailbreak chatbots could soon be used to jailbreak A.I. agents, producing unintended behavior in the real world. via @nytimes.com www.realclearscience.com/2025/10/13/the_ai_prompt_that_could_end_the_world_1140444.html
2025-10-13T15:30:43.978Z