@rwayne: https://x.com/rwayne/status/2054523563248611675
Summary
This article details how to build a fully automated local knowledge base system using Hermes Agent, Obsidian, and LLM Wiki, enabling automatic note organization, bidirectional linking, and persistent accumulation.
Cached at: 05/13/26, 06:24 PM
Hermes+Obsidian+LLM Wiki: Build a Local Knowledge Base
Lots of content—bookmark it and paste the full text into Claude Code or Codex and ask it to help you. Or read it slowly when you’re in the bathroom—you’ll learn something new.
You may have run into these problems.
You find a great tweet, copy-paste it. Want to find it later and study it? You have to dig through manually for ages.
You have hundreds of notes in Notion, but they’re isolated from each other. You have no idea if a concept appears somewhere else.
Every time you ask an AI a question, it starts from scratch, pieces together an answer on the fly. No accumulation, no memory, and half your tokens are wasted.
Worse, your notes live on someone else’s server. If that service shuts down, your data is gone.
Ok, the system I built solves all of that.
Four core benefits:
- Fully automated: You don’t need to manually organize notes—the AI does it for you.
- Local storage: Your data always stays with you, never uploaded to any server.
- Persistent accumulation: Knowledge builds up over time—no more starting from zero every time.
- Just ask: You only need to ask questions and explore—the AI handles the rest.
Simply put: you hand documents to the system, and the system automatically organizes them into a structured knowledge network. You can freely browse with bidirectional links.
The full pipeline is: document import → AI organizes → Wiki generation → bidirectional links.
You never have to touch any GUI yourself. Drop files in, and the network builds itself.
Three Tools, Each With a Role
This system is made up of three tools: Obsidian as the display layer, Hermes Agent as the automation engine, and LLM Wiki as the knowledge base standard.
Obsidian: The Note Display Layer
Obsidian is a local bidirectional note-taking tool.
Completely free. Cross-platform on Windows, Mac, and Linux.
Its main feature is bidirectional links.
What does that mean?
In a note, type two square brackets, e.g. [[Claude-Code-Notes]], and Obsidian automatically turns it into a purple link.
If a note called “Claude-Code-Notes” exists, clicking it jumps there.
If it doesn’t exist, clicking it creates it.
That’s how bidirectional links work—very simple.
You don’t need to maintain them manually. Obsidian builds the relationships for you.
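Under the hood, a wikilink is just text that tools can parse. As a minimal sketch (our own code, not Obsidian's), here is how the `[[...]]` targets in a note can be extracted with a regular expression, including the optional `|alias` form Obsidian supports:

```python
import re

# Obsidian-style [[wikilinks]]; the optional "|alias" part
# (e.g. [[Claude-Code-Notes|my notes]]) is matched but discarded.
WIKILINK = re.compile(r"\[\[([^\]|]+)(?:\|[^\]]+)?\]\]")

def extract_links(markdown: str) -> list[str]:
    """Return the link targets referenced in a note."""
    return [m.group(1).strip() for m in WIKILINK.finditer(markdown)]

note = "See [[Claude-Code-Notes]] and [[LLM Wiki|the spec]] for details."
print(extract_links(note))  # ['Claude-Code-Notes', 'LLM Wiki']
```

Because the links live in plain Markdown, any script or agent can read and rewrite them, which is what makes the automation below possible.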
What’s the problem with traditional note-taking software?
You write dozens of notes, but they’re siloed. You have no idea where the word “Apple” appears across all your other notes.
Obsidian’s Graph View visualizes all your notes and their link relationships as a network.
At a glance, you see the knowledge structure. You spot which nodes are islands and which are hubs.
Plus, Obsidian is completely free for personal use. No limits.
All data stays local, never uploaded to any server.
Hermes Agent: The Automation Engine
Hermes Agent is an autonomous AI agent developed by Nous Research.
Its biggest feature is a built-in learning loop—it can create and improve skills from experience.
In this knowledge management workflow, Hermes serves as the automation engine.
It comes with a built-in llm-wiki skill that directly operates on the knowledge base following the LLM Wiki file structure specification.
What does that mean?
You don’t need to manually create folders, organize notes, or add bidirectional links.
You just tell Hermes “Write this article into the knowledge base,” and Hermes automatically:
- Extracts key entities (people, tools, projects) from the document
- Extracts core concepts (methodologies, technical principles)
- Creates structured Markdown files
- Adds [[bidirectional links]] connecting related concepts
- Updates the knowledge base index
One important rule to emphasize: Hermes only touches the knowledge base when you explicitly ask it to.
Specifically:
- Only when you say “Write into the knowledge base,” “Import into the knowledge base,” or “Put this file into the knowledge base” does Hermes perform the import.
- Only when you say “With the knowledge base,” “Look up in the knowledge base,” or “Answer based on the knowledge base” does Hermes retrieve from it.
During normal everyday conversation, Hermes won’t mess with your knowledge base.
Why is this good?
Your knowledge base doesn’t get polluted by unrelated conversations. Only queries that truly need knowledge base content trigger retrieval.
LLM Wiki: The Knowledge Base Standard
LLM Wiki is not a standalone application—it’s a specification for organizing knowledge base files.
It defines how to organize knowledge:
knowledge_base/
├── raw/sources/ # Raw materials
├── wiki/entities/ # Entities (people, tools, projects)
├── wiki/concepts/ # Concepts (methodologies, principles)
├── wiki/index.md # Knowledge base index
└── wiki/log.md # Change log
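Since the layout above is just folders and Markdown files, you can scaffold it yourself. A minimal sketch, assuming the directory names from the spec (`init_knowledge_base` is our own hypothetical helper, not part of any of the three tools):

```python
from pathlib import Path

def init_knowledge_base(root: str) -> Path:
    """Create the LLM Wiki directory layout under `root` (idempotent)."""
    base = Path(root)
    for sub in ("raw/sources", "wiki/entities", "wiki/concepts"):
        (base / sub).mkdir(parents=True, exist_ok=True)
    # Seed the index and change log only if they don't exist yet,
    # so re-running never clobbers an existing knowledge base.
    for name, header in (("index.md", "# Knowledge Base Index\n"),
                         ("log.md", "# Change Log\n")):
        page = base / "wiki" / name
        if not page.exists():
            page.write_text(header, encoding="utf-8")
    return base
```

Running `init_knowledge_base("knowledge_base")` once gives Hermes a clean tree to write into.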
The core idea behind this structure: let the AI incrementally build a persistent Wiki.
What does “persistent” mean?
When you import a document, the system doesn’t just index it and move on.
It truly understands the document, extracts key entities, concepts, and relationships, then creates or updates the corresponding Wiki pages.
These Wiki pages are saved locally.
As you import more and more documents, the Wiki gets richer.
Pages form references and associations. Contradictions are flagged.
When you ask a question later, the system doesn’t need to piece together an answer from raw documents on the fly.
The Wiki already contains structured knowledge, so it answers directly based on the Wiki.
And it cites sources—telling you which document a conclusion came from.
Full Workflow Demo
Now let’s connect the pieces.
The entry point is Hermes Agent.
Step 1: Give a Command
For example, you say: “Write this article about AI novel writing into the knowledge base.”
The phrase “Write into the knowledge base” is an explicit action, so Hermes knows to perform the knowledge base write operation.
Step 2: Hermes Automatically Organizes
Using its built-in llm-wiki skill, Hermes automatically:
- Reads the document content
- Extracts key entities (people, tools, projects)
- Extracts core concepts (methodologies, technical principles)
- Creates structured Markdown files
- Adds [[bidirectional links]] connecting related concepts
- Updates the knowledge base index and log
You don’t lift a finger.
Step 3: File Structure Generated
Hermes creates files following the LLM Wiki specification:
knowledge_base/
├── raw/sources/ # Original articles
├── wiki/entities/ # Entity files (tools, people)
├── wiki/concepts/ # Concept files (methodologies)
├── wiki/index.md # Knowledge base index
└── wiki/log.md # Change log
Each file contains:
- Metadata (tags, creation time)
- Core content
- [[bidirectional links]] pointing to related pages
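To make the page anatomy concrete, here is a minimal sketch of rendering one such file: metadata block, core content, then related links. The field names and section layout are our illustration of the pattern, not the exact output Hermes produces:

```python
from datetime import date

def render_page(title: str, tags: list[str], body: str,
                links: list[str]) -> str:
    """Render one wiki page: metadata, content, then related [[links]]."""
    front = "\n".join([
        "---",
        f"tags: [{', '.join(tags)}]",
        f"created: {date.today().isoformat()}",
        "---",
    ])
    related = "\n".join(f"- [[{name}]]" for name in links)
    return f"{front}\n\n# {title}\n\n{body}\n\n## Related\n{related}\n"
```

Writing the returned string to `wiki/entities/Obsidian.md`, for example, yields a page Obsidian can open and link immediately.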
Step 4: Obsidian Displays the Knowledge Network
Open the knowledge base directory and drag it into Obsidian as a Vault.
Now you have a knowledge base carefully organized by AI.
Navigate freely through the bidirectional link network. The Graph View shows the strength of connections between knowledge points.
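The Graph View's data is recoverable from the files themselves. As a sketch under the same wikilink convention (our own code, not Obsidian internals), this builds the adjacency map of a vault, i.e. which note links to which:

```python
import re
from pathlib import Path

WIKILINK = re.compile(r"\[\[([^\]|]+)(?:\|[^\]]+)?\]\]")

def build_graph(vault: str) -> dict[str, set[str]]:
    """Map each note name (file stem) to the set of notes it links to."""
    graph: dict[str, set[str]] = {}
    for md in Path(vault).rglob("*.md"):
        text = md.read_text(encoding="utf-8")
        graph[md.stem] = {m.group(1).strip() for m in WIKILINK.finditer(text)}
    return graph
```

Counting how often a note appears as a target gives you the "hub vs island" picture the Graph View shows visually.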
Step 5: Infinite Loop
This process can loop infinitely.
Every time you import a new document, the knowledge network updates automatically.
Existing pages get supplemented with new information. Contradictions with new content are flagged.
The same concept mentioned across different documents gets linked to the same Wiki node.
Over time, the knowledge base becomes more accurate and richer.
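Merging mentions onto one node depends on normalizing names: "LLM Wiki" in one document and "llm wiki" in another must resolve to the same file. A minimal sketch of such a slug function (our own assumption about how the mapping could work, not the tools' actual rule):

```python
import re

def slugify(title: str) -> str:
    """Normalize a concept name to one canonical file name."""
    # Lowercase, collapse every run of non-alphanumerics to a single
    # hyphen, then trim stray hyphens at the ends.
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

assert slugify("LLM Wiki") == slugify("llm  wiki") == "llm-wiki"
```

With a stable slug, every import that mentions the concept appends to `wiki/concepts/llm-wiki.md` instead of spawning a duplicate page.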
Installation Steps
Step 1: Install Obsidian
Go to obsidian.md and download the version for your platform (the steps below use macOS).
After installation, open the app.
It will ask you to choose or create a Vault.
A Vault is a note repository—essentially a folder full of Markdown files.
You can import an existing notes folder, or create a new blank Vault.
Step 2: Install the LLM Wiki App
LLM Wiki itself is a file-structure specification, but the nashsu/llm-wiki project also ships a desktop app. Go to nashsu/llm-wiki on GitHub and find the latest release under Releases.
Download the macOS DMG or App tar.gz file.
Extract it and drag the LLM Wiki App into your Applications folder.
Double-click to open the app.
The first time you use it, create a new project—click New or the plus button.
Click the settings icon in the bottom right, choose a model provider, and fill in your API Key.
LLM Wiki supports OpenAI, Claude, Minimax, and any OpenAI-compatible API endpoint.
Step 3: Install Hermes Agent
Hermes Agent is an autonomous AI agent by Nous Research, responsible for automation.
Supported systems: macOS, Linux, Windows
Installation steps:
Open a terminal and run the install command:
curl -fsSL https://raw.githubusercontent.com/NousResearch/hermes-agent/main/scripts/install.sh | bash
After installation, reload your shell configuration:
source ~/.zshrc
Verify the installation:
hermes
If the Hermes conversation interface appears, you’re good.
First-time configuration:
Run the setup wizard:
hermes setup
Choose a model provider:
hermes model
Pick according to your account (OpenAI, Claude, OpenRouter, etc.).
Configure knowledge base rules (important):
On first use, tell Hermes your knowledge base rules:
My knowledge base directory is: /Users/yourusername/Documents/knowledge_base
Rules:
1. Only write to the knowledge base when I explicitly say "Write into the knowledge base".
2. Only retrieve from the knowledge base when I explicitly say "With the knowledge base".
3. All knowledge pages use Markdown and [[bidirectional links]].
4. Use the LLM Wiki file structure to manage the knowledge base.
After that, you can just say “Write into the knowledge base” or “With the knowledge base”—Hermes handles it automatically.
Usage Rules
To use this workflow well, just remember three rules.
Rule 1: Say “Write into the knowledge base,” Hermes organizes
When you have new documents to manage (product docs, meeting notes, study notes, technical proposals), just tell Hermes: “Write this article into the knowledge base.”
Hermes automatically (via its built-in llm-wiki skill):
- Extracts key entities and concepts from the document
- Creates structured Markdown files
- Adds [[bidirectional links]]
- Updates the knowledge base index
You don’t lift a finger.
Rule 2: Say “With the knowledge base,” Hermes retrieves
When you need to answer a question based on your knowledge base, like “Based on our existing product docs, introduce our technical architecture.”
Add the phrase “With the knowledge base,” and Hermes will search the relevant documents in your knowledge base.
The AI synthesizes a complete answer and cites sources.
During regular conversation, Hermes won’t proactively read the knowledge base.
For example, if you ask “What’s the weather like today?”—it answers directly without checking your knowledge base.
This keeps your knowledge base free from irrelevant content and makes retrieval results more accurate.
Rule 3: Obsidian is Always Available
You can directly drag the wiki directory into Obsidian as a Vault.
Bidirectional link navigation, Graph View, full-text search—all Obsidian features are ready to use.
Browse and edit notes in Obsidian any time.
All content is plain Markdown, portable across any tool.
Summary
Write into the knowledge base → Hermes organizes automatically. You say it, Hermes does it.
With the knowledge base → AI retrieves, synthesizes, answers your questions.
Knowledge base directory → Obsidian displays, bidirectional links for free browsing.
Two tools, each with a role:
- Hermes Agent is the automation engine—receives commands, extracts structured knowledge, creates files following the LLM Wiki standard.
- Obsidian is the note display layer—handles bidirectional links and knowledge network visualization.
Using this workflow, you don’t need to manually organize notes.
You don’t need to start from scratch every time you search.
Let the AI handle the tedious knowledge management—you focus on asking and exploring.
Roland | Fully funded PhD in Medicine & Economics | Author of “Chinese Enterprise AI Transformation White Paper” | Sharing AI cross-domain practice, cognitive upgrades, first-principle thinking | X: @rwayne