hearth
tend your second brain.
A personal AI runtime for your plain-markdown vault. Capture from chat or voice, let an agent compile sources into a structured wiki, query it conversationally — with citations.
"Obsidian is the IDE; the LLM is the programmer; the wiki is the codebase. The tedious part of maintaining a knowledge base is not the reading or the thinking — it's the bookkeeping. LLMs don't abandon wikis because the maintenance cost approaches zero for them." — Andrej Karpathy, LLM Wiki
three verbs
- Ingest: A new source arrives in raw/. The agent reads it, summarizes it, extracts concepts, links them, and updates the topic Maps.
- Query: You ask in chat or voice. The agent answers from the wiki, with citations. Insights worth keeping get written back as new pages — so they don't get lost in the conversation stream.
- Lint: Periodic audit. Contradictions surfaced. Orphan pages adopted. Single-source claims flagged. The wiki stays clean — because the maintenance cost approaches zero.
your vault, your contract
hearth doesn't impose structure. You write a SCHEMA.md defining what kinds of pages live where, and the agent follows it. Append-only sources, agent-maintained wiki, human-governed schema — three layers, one folder.
```
vault/
├── raw/          ← primary sources, append-only
├── 01 Maps/      ← MOCs, agent-maintained
├── 02 <topic>/   ← agent-maintained wiki
├── 99 Assets/    ← attachments
└── SCHEMA.md     ← human-governed contract
```
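To make the contract concrete, a minimal SCHEMA.md could read something like this. The page kinds and naming conventions here are hypothetical examples; hearth doesn't prescribe a format:

```markdown
# SCHEMA.md — vault contract

## raw/
Append-only. One file per source, named `YYYY-MM-DD <title>.md`.
Never edited after capture.

## 01 Maps/
One Map of Content per topic. Every wiki page must be reachable
from at least one Map.

## 02 <topic>/
One concept per page. Every claim links at least one source in
raw/ as a [[wiki-link]].
```

The agent treats this file as read-only law: it follows the conventions, and only you edit them.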
what hearth is
A runtime, not a UI. Obsidian — or Logseq, Foam, plain text — stays your editor. hearth is the agent that lives between your channels and your vault, always-on but never in the way.
- Channel-agnostic — WeChat first, Telegram and voice next.
- Agent-agnostic — Claude Code, Codex, any ACP-compatible runtime.
- Editor-agnostic — your vault is plain markdown, no lock-in.
what hearth is not
Not another note-taking app. Not a chatbot framework. Not a vendor-managed service. Not a database. Just files on your disk, plus an agent that respects your SCHEMA.md and refuses to slop.
who it's for
You already have, or want to start, a plain-markdown vault. You want an LLM agent maintaining it for you, not just retrieving from it. You want to capture from wherever you are — phone, voice memo, link sharing — without sitting at your desk.
status
Pre-alpha. Spec landing first; reference channel adapter (WeChat) follows; multi-format ingest pipeline — PDF, Word, Excel, video, web — comes after. Building in public.