Karpathy recently posted about using LLMs to build personal knowledge bases — collecting raw sources into a directory, having an LLM “compile” them into a wiki of interlinked markdown files, and viewing the whole thing in Obsidian. He followed it up with an “idea file,” a gist you can hand to your agent so it builds the system for you. This is a great idea; I’ve been doing some form of it for over a decade. My Staff Eng co-host @davidnoelromas reached out after the tweet to ask for more details on how I’ve been using Obsidian and AI. This is an expanded version of what I told him.

I’ve collected possibly too many markdown files:

```shell
find . -type f | wc -l
52447
```

That’s my Obsidian vault, and I use it with AI every day without a special database, a vector store, or a RAG pipeline. It’s merely files on disk.

## The problem this actually solves

Think about the context you carry around in your head for your job. The history of decisions on a project. What you discussed with your manager three months ago. The Slack thread where the team landed on an approach. The Google Doc someone shared in a meeting you half-remember. The slowly evolving understanding of how a system works that lives across fifteen people’s heads and nowhere else.

Now think about what happens when you need to produce something from all that context. A design doc. A perf packet. A project handoff. An onboarding guide for a new team member. You spend hours reassembling context from Slack, docs, emails, and your own memory, and you still miss things.

The knowledge base turns this into a system instead of a scramble.

## The architecture

A file system with markdown and wikilinks is already a graph database. Files are nodes. Wikilinks are semantic edges. Folders introduce taxonomy. You don’t need a special MCP server or plugin. The file system abstraction is the interface, and LLMs are surprisingly good at navigating it.
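To see why “files plus wikilinks” is already a graph, here’s a minimal sketch that walks a vault and treats each `[[wikilink]]` as an edge. The `build_graph` helper and its regex are illustrative assumptions, not my actual tooling — an agent doesn’t need any of this, since it can just grep, but it shows how little machinery the “database” really is:

```python
import re
from pathlib import Path

# Matches the target of [[Note]], [[Note|alias]], and [[Note#heading]] links,
# stopping before any alias pipe or heading anchor.
WIKILINK = re.compile(r"\[\[([^\]|#]+)")

def build_graph(vault: Path) -> dict[str, set[str]]:
    """Map each note (by filename stem) to the set of notes it links to."""
    graph: dict[str, set[str]] = {}
    for md in vault.rglob("*.md"):
        text = md.read_text(encoding="utf-8")
        graph[md.stem] = {target.strip() for target in WIKILINK.findall(text)}
    return graph
```

Nodes are the dictionary keys, edges are the sets — everything else (backlinks, orphan detection, neighborhood expansion) is a few lines of traversal on top.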
I use a structure borrowed from Tiago Forte’s Building a Second Brain, with the PARA taxonomy as a sta...