A lightweight, open-source desktop note-taking app for power users who think in linked ideas
Features • Installation • Getting Started • Architecture • Configuration
- CodeMirror 6 - Full-featured editor with syntax highlighting and one-dark theme
- Live Preview - Split view or preview-only mode with rendered markdown
- Auto-save - Changes saved automatically with 500ms debounce
- Keyboard Shortcuts - `Ctrl+S` to save, `Ctrl+G` for graph view
- `[[wikilink]]` Syntax - Create connections between notes instantly
- Shortest-path Resolution - Links resolve by fewest directory segments
- Backlinks Panel - See every note that references the current one
- Live Index - Vault re-indexed automatically on file changes via the `notify` watcher
- Force-directed Graph - Visualize your entire knowledge network with D3.js
- Node Sizing - Node size scales with inbound link count
- Interactive - Pan, zoom, and explore your ideas spatially
- Context-Aware - Chat prompt includes your active note plus linked content
- OpenAI-Compatible - Works with Ollama, LM Studio, vLLM, or any OpenAI endpoint
- Persistent History - Conversation maintained throughout your session
- File Tree - Hierarchical sidebar with full CRUD operations
- Rename / Delete - Context menu for quick file operations
- Templates - Create notes from templates in your `.templates` folder
- Native Dialogs - Vault folder picker uses OS-native dialog
- Node.js 18+
- Rust 1.70+ with `cargo`
- npm
```bash
# Clone the repository
git clone https://github.com/markusbegerow/opensidian.git
cd opensidian

# Install dependencies
npm install

# Run in development mode (full Tauri app)
npx tauri dev

# Build for production
npx tauri build
```

For UI development without the Rust backend:

```bash
npm run dev
```

| Command | Description |
|---|---|
| `npm run dev` | Start Vite dev server (frontend only) |
| `npx tauri dev` | Start full Tauri app (frontend + Rust) |
| `npm run build` | Type-check and build frontend |
| `npx tauri build` | Build production desktop app |
| `npm run test` | Run all tests |
| `npx vitest run src/lib/wikilinkParser.test.ts` | Run a single test file |
| `cargo clean` | Clean Rust build artifacts (run inside `src-tauri/`) |
A vault is any folder on your machine containing markdown files.
- Launch Opensidian
- Click Open Vault in the sidebar
- Select your notes folder - Opensidian indexes it automatically
```markdown
# My Note

This links to [[Another Note]] and [[Project Ideas]].
```

Use `[[double brackets]]` to create wikilinks. Opensidian resolves them by shortest path match across your vault.
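To make "shortest path" concrete: among all files whose basename matches the link target, the one with the fewest directory segments wins. A minimal sketch of the idea, with illustrative names (not the actual `lib/wikilinkParser.ts`):

```ts
// Hypothetical sketch of shortest-path wikilink resolution.
// `vaultFiles` holds vault-relative paths from the index.
function resolveWikilink(target: string, vaultFiles: string[]): string | null {
  const candidates = vaultFiles.filter(
    (path) => path.split("/").pop()?.replace(/\.md$/, "") === target
  );
  if (candidates.length === 0) return null;
  // Fewest directory segments wins; ties broken alphabetically here.
  candidates.sort(
    (a, b) => a.split("/").length - b.split("/").length || a.localeCompare(b)
  );
  return candidates[0];
}

// resolveWikilink("Another Note", ["notes/Another Note.md", "archive/old/Another Note.md"])
// → "notes/Another Note.md"
```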
- Open the Chat panel (right sidebar)
- Click the settings icon
- Enter your API URL, model name, and token
```
API URL: http://localhost:11434 (Ollama)
         http://localhost:1234 (LM Studio)
Model:   llama3.1 / mistral / etc.
Token:   your-api-key (stored locally)
```
Press `Ctrl+G` to open the graph view. Nodes grow larger the more notes link to them - instantly revealing your most connected ideas.
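To sketch how inbound-link sizing might work with D3 (illustrative only, not the actual `GraphCanvas.tsx`):

```ts
import * as d3 from "d3";

type GraphNode = d3.SimulationNodeDatum & { id: string };
type GraphLink = d3.SimulationLinkDatum<GraphNode> & { source: string; target: string };

// Hypothetical sizing: radius grows with the square root of the inbound
// link count, so hub notes stand out without dwarfing everything else.
// (Call before the simulation starts: D3 later replaces string ids with
// node object references on each link.)
function nodeRadius(nodeId: string, links: GraphLink[]): number {
  const inbound = links.filter((l) => l.target === nodeId).length;
  return 4 + 3 * Math.sqrt(inbound);
}

// A bare-bones force simulation; pan/zoom and rendering omitted.
function startSimulation(nodes: GraphNode[], links: GraphLink[]) {
  return d3
    .forceSimulation(nodes)
    .force("link", d3.forceLink<GraphNode, GraphLink>(links).id((d) => d.id))
    .force("charge", d3.forceManyBody().strength(-120))
    .force("center", d3.forceCenter(0, 0));
}
```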
Open the chat panel and ask questions. The AI receives your current note, its outbound links, and inbound links as context - so it understands your ideas, not just isolated text.
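In sketch form, the context assembly might look like this (field and function names are made up for illustration; the real `lib/contextBuilder.ts` may differ):

```ts
interface Note { path: string; content: string; }

interface NoteContext {
  activeNote: Note;       // the note open in the editor
  outboundLinks: Note[];  // notes the active note links to
  inboundLinks: Note[];   // notes that link back to it
}

// Hypothetical system-prompt assembly: active note first, then its
// outbound links, then its backlinks.
function buildSystemPrompt(ctx: NoteContext): string {
  const section = (title: string, notes: Note[]) =>
    notes.map((n) => `## ${title}: ${n.path}\n${n.content}`).join("\n\n");
  return [
    "You are a note-taking assistant. Use the notes below as context.",
    `# Active note: ${ctx.activeNote.path}\n${ctx.activeNote.content}`,
    section("Linked note", ctx.outboundLinks),
    section("Backlink", ctx.inboundLinks),
  ].join("\n\n");
}
```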
```
vault/
├── .templates/
│   ├── meeting.md
│   └── daily-note.md
└── notes/
```
New notes created from a template inherit its structure automatically.
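Mechanically, creating a note from a template is presumably a read followed by a write through the Rust commands listed under Architecture; the argument shapes below are assumptions, not the app's real API:

```ts
import { invoke } from "@tauri-apps/api/core";

// Hypothetical helper: copy a template's content into a new note.
// Command names match the Rust layer described later; the argument
// shapes ({ path }, { path, content }) are assumed for illustration.
async function createNoteFromTemplate(templatePath: string, newNotePath: string) {
  const content = await invoke<string>("read_file", { path: templatePath });
  await invoke("write_file", { path: newNotePath, content });
}
```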
Create a `colors.md` in your vault root:
| Emoji | Color | Label |
|-------|-----------|----------|
| 🟦 | #74c7ec | Manager |
| 🟩 | #a6e3a1 | Employee |
| 🟨 | #f9e2af | Project |

| Shortcut | Action |
|---|---|
| `Ctrl+S` | Save current file |
| `Ctrl+G` | Toggle graph view |
| `Enter` (in chat) | Send message |
Opensidian is a Tauri 2 desktop app: a React/TypeScript frontend communicates with a Rust backend via Tauri IPC. The Rust layer owns all file I/O - the frontend never touches the filesystem directly.
```
React UI → Zustand stores → tauriFs.ts (invoke wrappers) → Rust commands → OS filesystem
                                                                 ↓
                                notify watcher emits vault://changed events
                                                                 ↓
                                   vaultStore listener triggers re-index
```
Stores (Zustand + Immer)
| Store | Responsibility |
|---|---|
| `vaultStore.ts` | Vault path, flat file list, file watcher setup |
| `editorStore.ts` | Active file, content, dirty flag, 500ms autosave |
| `wikilinkStore.ts` | `linkMap` (source→targets) and `backlinkMap` (target→sources) |
| `llmStore.ts` | LLM endpoint/model/token (localStorage), chat history |
| `noteEmojiStore.ts` | Per-file emoji and first-heading cache |
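The 500ms autosave is a classic debounce: every keystroke resets a timer, and the write fires only once typing pauses. A minimal sketch of the pattern (not the actual `editorStore.ts`):

```ts
import { invoke } from "@tauri-apps/api/core";

// Hypothetical debounced autosave: each edit resets the timer, so the
// file is written only after 500ms of inactivity.
let saveTimer: ReturnType<typeof setTimeout> | undefined;

function scheduleAutosave(path: string, content: string) {
  clearTimeout(saveTimer);
  saveTimer = setTimeout(() => {
    // write_file is the Rust command listed below; argument shape assumed.
    invoke("write_file", { path, content }).catch(console.error);
  }, 500);
}
```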
Key Libraries

- `lib/tauriFs.ts` - all `invoke()` calls to Rust, plus path helpers
- `lib/wikilinkParser.ts` - `[[Target]]` regex extraction and resolution
- `lib/vaultIndexer.ts` - batched indexing in chunks of 20 via `read_files_batch`
- `lib/contextBuilder.ts` - builds the LLM system prompt from the active note + linked content
- `lib/llmService.ts` - generic fetch to any OpenAI-compatible API
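Since any OpenAI-compatible server exposes `POST /v1/chat/completions`, the core of such a service can be sketched as follows (a hedged approximation, not the real `lib/llmService.ts`):

```ts
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Hypothetical minimal client for any OpenAI-compatible endpoint
// (Ollama, LM Studio, vLLM, ...). Streaming and error detail omitted.
async function chatCompletion(
  baseUrl: string, // e.g. http://localhost:11434
  model: string,   // e.g. llama3.1
  token: string,   // may be blank for local servers
  messages: ChatMessage[]
): Promise<string> {
  const res = await fetch(`${baseUrl}/v1/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      ...(token ? { Authorization: `Bearer ${token}` } : {}),
    },
    body: JSON.stringify({ model, messages }),
  });
  if (!res.ok) throw new Error(`LLM request failed: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```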
Components

- `components/layout/AppShell.tsx` - 3-panel layout (sidebar | editor | backlinks/chat)
- `components/editor/` - `EditorPane`, `CodeMirrorEditor` (CM6 + one-dark), `MarkdownPreview`
- `components/graph/GraphCanvas.tsx` - D3 force simulation
- `components/chat/ChatPanel.tsx` - LLM chat UI
| File | Responsibility |
|---|---|
| `commands/fs_commands.rs` | `open_vault`, `read_file`, `read_files_batch`, `write_file` (atomic), `delete_file`, `rename_file`, `create_file`, `create_directory` |
| `commands/watcher_commands.rs` | `watch_vault` / `unwatch_vault` using `notify`; emits `vault://changed` events |
| `commands/dialog_commands.rs` | `pick_vault_folder` via `tauri-plugin-dialog` |
| `lib.rs` | Registers all commands and builds the Tauri app |
Frontend calls Rust with `invoke("command_name", { args })` wrapped in `lib/tauriFs.ts`. File watch events flow back via Tauri's event system (`listen("vault://changed", ...)`). Batch reads (`read_files_batch`) reduce IPC round-trips during vault indexing.
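Both directions can be sketched with Tauri 2's JavaScript API (the command and event names come from the tables above; wrapper names and argument shapes are illustrative):

```ts
import { invoke } from "@tauri-apps/api/core";
import { listen } from "@tauri-apps/api/event";

// Request/response direction: a typed wrapper around a Rust command.
// The { paths } argument shape is assumed for illustration.
async function readFilesBatch(paths: string[]): Promise<string[]> {
  return invoke<string[]>("read_files_batch", { paths });
}

// Event direction: the notify watcher emits vault://changed and the
// frontend re-indexes. listen() resolves to an unlisten function.
async function subscribeToVaultChanges(onChange: () => void) {
  return listen("vault://changed", onChange);
}
```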
| Layer | Technology |
|---|---|
| Frontend framework | React 18, TypeScript 5.3, Vite 5 |
| Desktop runtime | Tauri 2, Rust |
| Editor | CodeMirror 6 |
| Styling | Tailwind CSS 3 |
| State management | Zustand 4, Immer 10 |
| Graph | D3.js 7 |
| Markdown | react-markdown, remark-gfm, rehype-highlight |
| File watching | notify (Rust crate) |
| Testing | Vitest |
```
opensidian/
├── src/
│   ├── components/
│   │   ├── layout/       # AppShell (3-panel layout)
│   │   ├── editor/       # CodeMirror editor + markdown preview
│   │   ├── graph/        # D3 force-directed graph
│   │   └── chat/         # LLM chat panel
│   ├── store/            # Zustand stores
│   ├── lib/              # tauriFs, wikilinkParser, vaultIndexer, llmService
│   ├── types/            # VaultFile, FileNode, FsEvent, WikilinkIndex, etc.
│   └── index.css         # Tailwind + theme tokens
├── src-tauri/
│   └── src/
│       ├── commands/     # fs_commands, watcher_commands, dialog_commands
│       └── lib.rs        # Tauri app builder + command registration
└── package.json
```
The installer is not yet code-signed. To proceed: click More info → Run anyway.
The app is not yet notarized with Apple. To open it anyway:
Option A - right-click method: Right-click Opensidian.app → Open → Open in the dialog.
Option B - terminal:
```bash
xattr -dr com.apple.quarantine /Applications/Opensidian.app
```

Full notarization (no warning at all) requires an Apple Developer account ($99/year). See the release workflow for how to wire it up once ready.
No signing or permission warnings - packages install directly. If the AppImage doesn't launch, mark it executable first: `chmod +x Opensidian_*.AppImage`.
- Ensure the selected folder contains `.md` files
- Check that the Rust backend has permission to read that directory path
- Links resolve by shortest path match - if two files share a name, the one with fewer parent directories wins
- Rename conflicting files to disambiguate
- Verify your LLM server is running:
  - Ollama: `curl http://localhost:11434/api/tags`
  - LM Studio: `curl http://localhost:1234/v1/models`
- Check the API URL and model name in the chat settings
- Confirm your API token is correct (or leave blank for local servers)
- Open a vault first, then press `Ctrl+G`
- Graph requires at least one `[[wikilink]]` in your notes to show edges
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under the GPL - see the LICENSE file for details.
If you encounter any issues or have questions:
- 🐛 Report bugs
- 💡 Request features
- ⭐ Star the repo if you find it useful!
If you like this project, support further development with a repost or coffee:
- 🧑💻 Markus Begerow
- 💾 GitHub
Privacy Notice: Opensidian operates entirely locally. Your notes never leave your machine. The LLM chat feature only contacts external servers if you explicitly configure a remote API endpoint - by default it connects to local inference servers (Ollama, LM Studio, etc.).