<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"><channel><description>•Software Engineer&#xA;•Android #GDE&#xA;•Open-source #KMP/#CMP projects: http://github.com/joreilly&#xA;•Blog: http://johnoreilly.dev&#xA;•Living in Galway, Ireland 🇮🇪</description><link>https://bsky.app/profile/johnoreilly.dev</link><title>@johnoreilly.dev - John O&#39;Reilly</title><item><link>https://bsky.app/profile/johnoreilly.dev/post/3mlj6qqbfm22p</link><description>Confirming I can, if needed, work offline for my live coding talk at&#xA;@kotlinconf.com.  Have a local Ktor server emulating the endpoint and also a local Ollama model... and, appropriately enough, thought I&#39;d try the following to make some updates!&#xA;&#xA;ollama launch claude --model qwen3.5:35b-a3b-coding-nvfp4</description><pubDate>10 May 2026 16:32 +0000</pubDate><guid isPermaLink="false">at://did:plc:yychpzqerm2b2a4evno5kbrs/app.bsky.feed.post/3mlj6qqbfm22p</guid></item><item><link>https://bsky.app/profile/johnoreilly.dev/post/3mlfwwnj2kk2h</link><description>Following is the model that seems to perform best on the MacBook here atm... running as part of a Koog agent (showing telemetry using @langfuse.bsky.social).  The full agent run takes about 2s, with LLM calls taking (in this case) 1.52s and 0.31s.
Memory footprint ~20GB.&#xA;&#xA;qwen3.5:35b-a3b-coding-nvfp4</description><pubDate>09 May 2026 09:34 +0000</pubDate><guid isPermaLink="false">at://did:plc:yychpzqerm2b2a4evno5kbrs/app.bsky.feed.post/3mlfwwnj2kk2h</guid></item><item><link>https://bsky.app/profile/johnoreilly.dev/post/3mkxm2744zs2f</link><description>compose-ai-tools - Render @Preview composables to PNG outside Android Studio, so AI coding agents can see what they&#39;re changing.&#xA;&#xA;https://github.com/yschimke/compose-ai-tools</description><pubDate>03 May 2026 16:42 +0000</pubDate><guid isPermaLink="false">at://did:plc:yychpzqerm2b2a4evno5kbrs/app.bsky.feed.post/3mkxm2744zs2f</guid></item><item><link>https://bsky.app/profile/johnoreilly.dev/post/3mkldhda4us24</link><description>Claude is so useful for stuff like this!</description><pubDate>28 Apr 2026 19:36 +0000</pubDate><guid isPermaLink="false">at://did:plc:yychpzqerm2b2a4evno5kbrs/app.bsky.feed.post/3mkldhda4us24</guid></item><item><link>https://bsky.app/profile/johnoreilly.dev/post/3mkldfrjw5s24</link><description>Been a pretty good week in the West of Ireland! 
☀️</description><pubDate>28 Apr 2026 19:36 +0000</pubDate><guid isPermaLink="false">at://did:plc:yychpzqerm2b2a4evno5kbrs/app.bsky.feed.post/3mkldfrjw5s24</guid></item><item><link>https://bsky.app/profile/johnoreilly.dev/post/3mk6ocnu6ls2n</link><description>New version of llmfit (https://github.com/AlexsJones/llmfit) today that supports Qwen3.6 models</description><pubDate>23 Apr 2026 18:46 +0000</pubDate><guid isPermaLink="false">at://did:plc:yychpzqerm2b2a4evno5kbrs/app.bsky.feed.post/3mk6ocnu6ls2n</guid></item><item><link>https://bsky.app/profile/johnoreilly.dev/post/3mjyng2uz7c2k</link><description>Bring pets to work day today 😃</description><pubDate>21 Apr 2026 09:14 +0000</pubDate><guid isPermaLink="false">at://did:plc:yychpzqerm2b2a4evno5kbrs/app.bsky.feed.post/3mjyng2uz7c2k</guid></item><item><link>https://bsky.app/profile/johnoreilly.dev/post/3mjtrdt6xlc2n</link><description>Still find myself fixing typos and adding question marks (when clearly a question) in Claude etc prompts!</description><pubDate>19 Apr 2026 10:41 +0000</pubDate><guid isPermaLink="false">at://did:plc:yychpzqerm2b2a4evno5kbrs/app.bsky.feed.post/3mjtrdt6xlc2n</guid></item><item><link>https://bsky.app/profile/johnoreilly.dev/post/3mjtix2iv522z</link><description>Managing a series of cascading PRs with Claude is so useful....including monitoring CI, fixing issues found, rebasing the various branches and pushing/retrying again. 
And all from my phone thanks to /remote-control.</description><pubDate>19 Apr 2026 08:11 +0000</pubDate><guid isPermaLink="false">at://did:plc:yychpzqerm2b2a4evno5kbrs/app.bsky.feed.post/3mjtix2iv522z</guid></item><item><link>https://bsky.app/profile/johnoreilly.dev/post/3mjqtzhnl7s2d</link><pubDate>18 Apr 2026 06:51 +0000</pubDate><guid isPermaLink="false">at://did:plc:yychpzqerm2b2a4evno5kbrs/app.bsky.feed.post/3mjqtzhnl7s2d</guid></item><item><link>https://bsky.app/profile/johnoreilly.dev/post/3mjmp6eya722e</link><description>Introducing Claude Opus 4.7&#xA;https://www.anthropic.com/news/claude-opus-4-7</description><pubDate>16 Apr 2026 15:14 +0000</pubDate><guid isPermaLink="false">at://did:plc:yychpzqerm2b2a4evno5kbrs/app.bsky.feed.post/3mjmp6eya722e</guid></item><item><link>https://bsky.app/profile/johnoreilly.dev/post/3mjbxandpek2g</link><description>Trying out Claude Code + local Ollama model for something a bit more involved (&#34;support viewing information from https://api.climatetrace.org/v7/docs/index.html&#34;)... did a pretty good job!  Started with an empty Compose Multiplatform project.&#xA;&#xA;ollama launch claude --model qwen3.5:35b-a3b-coding-nvfp4</description><pubDate>12 Apr 2026 08:39 +0000</pubDate><guid isPermaLink="false">at://did:plc:yychpzqerm2b2a4evno5kbrs/app.bsky.feed.post/3mjbxandpek2g</guid></item><item><link>https://bsky.app/profile/johnoreilly.dev/post/3mj3okjaek22u</link><description>Still trying local LLMs on the M5 Max device here.
Was comparing the following, and it was interesting to see the explanation from Gemini below about why the Qwen model is faster (and certainly seeing that&#39;s the case here in testing).&#xA;&#xA;- gemma4:26b-a4b-it-q4_K_M&#xA;- qwen3.5:35b-a3b-coding-nvfp4</description><pubDate>09 Apr 2026 20:47 +0000</pubDate><guid isPermaLink="false">at://did:plc:yychpzqerm2b2a4evno5kbrs/app.bsky.feed.post/3mj3okjaek22u</guid></item><item><link>https://bsky.app/profile/johnoreilly.dev/post/3mj3a5ujak22s</link><description>llmfit (https://github.com/AlexsJones/llmfit) is a nice way to see which models run best on your hardware</description><pubDate>09 Apr 2026 16:30 +0000</pubDate><guid isPermaLink="false">at://did:plc:yychpzqerm2b2a4evno5kbrs/app.bsky.feed.post/3mj3a5ujak22s</guid></item><item><link>https://bsky.app/profile/johnoreilly.dev/post/3miyyquorg22m</link><description>M5 Max laptop arrived and starting to try out various local models (as part of a Koog AI agent using Ollama integration).&#xA;&#xA;Latency numbers looking pretty good so far.  This one is a 4B parameter model but working really well for the example I&#39;m using (e.g. very reliable tool calling)</description><pubDate>08 Apr 2026 19:12 +0000</pubDate><guid isPermaLink="false">at://did:plc:yychpzqerm2b2a4evno5kbrs/app.bsky.feed.post/3miyyquorg22m</guid></item><item><link>https://bsky.app/profile/johnoreilly.dev/post/3miormmq55k2t</link><description>The world of #AI is expansive and ever-changing.  Most of it filters down for all practical purposes to a couple of key concepts and tools.
It&#39;s more than sufficient to know those and expand your knowledge if/when needed.</description><pubDate>04 Apr 2026 17:38 +0000</pubDate><guid isPermaLink="false">at://did:plc:yychpzqerm2b2a4evno5kbrs/app.bsky.feed.post/3miormmq55k2t</guid></item><item><link>https://bsky.app/profile/johnoreilly.dev/post/3miomvtnxz22t</link><description>Interesting Fragmented Podcast episode on Background AI Agents&#xA;https://fragmentedpodcast.com/episodes/309/</description><pubDate>04 Apr 2026 16:13 +0000</pubDate><guid isPermaLink="false">at://did:plc:yychpzqerm2b2a4evno5kbrs/app.bsky.feed.post/3miomvtnxz22t</guid></item><item><link>https://bsky.app/profile/johnoreilly.dev/post/3mio2lx2tkk24</link><description>Another useful Claude prompt!&#xA;&#xA;&#34;add comments to all the Kotlin files... write in a way to use as &#34;speaker notes&#34; during live coding session&#34;</description><pubDate>04 Apr 2026 10:46 +0000</pubDate><guid isPermaLink="false">at://did:plc:yychpzqerm2b2a4evno5kbrs/app.bsky.feed.post/3mio2lx2tkk24</guid></item><item><link>https://bsky.app/profile/johnoreilly.dev/post/3mimcsmod2c2j</link><description>Very cool to see! Showing this data in Claude Desktop using the MCP Server from the ClimateTraceKMP sample (https://github.com/joreilly/ClimateTraceKMP).&#xA;&#xA;That makes use of the Kotlin MCP SDK (https://github.com/modelcontextprotocol/kotlin-sdk) and Ktor to connect to the @climatetrace.org endpoint.&#xA;&#xA;[contains quote post or other embedded content]</description><pubDate>03 Apr 2026 18:07 +0000</pubDate><guid isPermaLink="false">at://did:plc:yychpzqerm2b2a4evno5kbrs/app.bsky.feed.post/3mimcsmod2c2j</guid></item><item><link>https://bsky.app/profile/johnoreilly.dev/post/3mijrozr7sk2p</link><description>Ollama 0.20.0-rc0 is available, and with it the new Google DeepMind Gemma 4 local model.
Trying it here as part of a Koog AI agent (screenshot thanks to Koog&#39;s OpenTelemetry langfuse integration)</description><pubDate>02 Apr 2026 17:56 +0000</pubDate><guid isPermaLink="false">at://did:plc:yychpzqerm2b2a4evno5kbrs/app.bsky.feed.post/3mijrozr7sk2p</guid></item><item><link>https://bsky.app/profile/johnoreilly.dev/post/3mihrpv5kjc2p</link><description>And then there were 14!  (from https://github.com/joreilly/PeopleInSpace #KMP/#CMP sample)</description><pubDate>01 Apr 2026 22:51 +0000</pubDate><guid isPermaLink="false">at://did:plc:yychpzqerm2b2a4evno5kbrs/app.bsky.feed.post/3mihrpv5kjc2p</guid></item><item><link>https://bsky.app/profile/johnoreilly.dev/post/3miesh6zc722c</link><description>Tried new qwen3.5:4b-nvfp4 Ollama model on M1 Max here (in project where it&#39;s used with Koog AI agent)...38% faster than qwen3.5:4b (averaged over 5 runs of the agent).&#xA;&#xA;I asked Gemini and it initially claimed that I shouldn&#39;t see increase in performance like this...asked again and got following 😀</description><pubDate>31 Mar 2026 18:26 +0000</pubDate><guid isPermaLink="false">at://did:plc:yychpzqerm2b2a4evno5kbrs/app.bsky.feed.post/3miesh6zc722c</guid></item><item><link>https://bsky.app/profile/johnoreilly.dev/post/3mhjdljzfe223</link><description>The latest 2.4.0 dev build of Swift Export now supports exposing StateFlow directly to Swift code.  Chip-8 sample updated to use it. #KMP&#xA;https://github.com/joreilly/chip-8</description><pubDate>20 Mar 2026 20:18 +0000</pubDate><guid isPermaLink="false">at://did:plc:yychpzqerm2b2a4evno5kbrs/app.bsky.feed.post/3mhjdljzfe223</guid></item></channel></rss>