Karpathy's LLM Wiki Nails It – But Local Setup's Friction Killed It for Me
What if your AI notes actually remembered everything, without rehashing docs every query? Karpathy's LLM Wiki does that – I built Hjarni to fix its biggest pains.
⚡ Key Takeaways
- Karpathy's LLM Wiki fixes RAG's rediscovery waste with persistent, LLM-maintained markdown knowledge.
- Local setups suffer friction: single machine, single client, hard to share – and that friction kills the habit.
- Hjarni hosts the pattern over MCP for seamless multi-device, multi-LLM access, trading git for ubiquity.
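The core idea behind the first takeaway can be sketched in a few lines: instead of re-deriving the same facts on every query, the LLM appends durable notes to a per-topic markdown page and rereads them later. This is a minimal illustration of the pattern, not Karpathy's or Hjarni's actual code – the directory name and helper functions (`remember`, `recall`) are hypothetical.

```python
from datetime import date
from pathlib import Path

# Illustrative sketch of persistent, LLM-maintained markdown notes.
# All names here are assumptions, not an actual Wiki or Hjarni API.
WIKI_DIR = Path("llm_wiki")

def remember(topic: str, note: str) -> None:
    """Append a dated bullet to the topic's markdown page."""
    WIKI_DIR.mkdir(exist_ok=True)
    page = WIKI_DIR / f"{topic}.md"
    if not page.exists():
        page.write_text(f"# {topic}\n\n")
    with page.open("a") as f:
        f.write(f"- ({date.today()}) {note}\n")

def recall(topic: str) -> str:
    """Return everything recorded about a topic, or an empty string."""
    page = WIKI_DIR / f"{topic}.md"
    return page.read_text() if page.exists() else ""
```

A hosted version of this idea exposes the same read/append operations as tools over MCP, so any connected client on any device works against one shared knowledge base instead of a local file tree.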
Originally reported by dev.to