Point it at a codebase. Get a fully-described, LLM-annotated dependency graph — instantly.
DGAT scans any codebase, uses a locally-hosted LLM to write natural-language descriptions for every file and every import relationship, then serves it all through an interactive three-panel UI. Think of it as a self-generating architectural map — no config files, no annotations, no manual work.
```mermaid
flowchart TD
    subgraph CLI ["dgat (C++17 backend)"]
        A[Walk directory tree] --> B[Fingerprint files\nXXH3-128]
        B --> C[Parse imports\ntree-sitter + regex fallback]
        C --> D[Describe files & edges\nvLLM HTTP API]
        D --> E[Build dependency graph\nDepNode + DepEdge]
        E --> F[Synthesise blueprint\ndgat_blueprint.md]
        F --> G[Persist state\nfile_tree.json · dep_graph.json]
    end
    subgraph Server ["--backend mode"]
        G --> H[HTTP server :8090\ncpp-httplib]
        H --> |GET /api/tree| I[File tree JSON]
        H --> |GET /api/dep-graph| J[Dep graph JSON]
        H --> |GET /api/blueprint| K[Blueprint markdown]
    end
    subgraph UI ["Next.js frontend"]
        I --> L[Explorer panel\nfile tree]
        K --> M[Blueprint tab\nrendered markdown]
        J --> N[Graph tab\nSigma.js WebGL]
        L & M & N --> O[Inspector panel\nnode · edge · dep details]
    end
    subgraph LLM ["Local vLLM"]
        D <--> P[Qwen/Qwen3.5-2B\nor any OpenAI-compat endpoint]
    end
```
Before running DGAT, bring up a local vLLM instance. DGAT uses it to generate descriptions for every file and dependency edge.
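The exchange with vLLM follows the standard OpenAI chat-completions shape. As a rough sketch of the kind of request a per-file description call would make (the prompt wording, function name, and token limit here are illustrative, not DGAT's actual prompts):

```python
import json

def describe_file_request(path: str, source: str,
                          model: str = "Qwen/Qwen3.5-2B") -> str:
    """Build an OpenAI-compatible chat-completions payload asking a local
    vLLM server for a short description of one source file.
    (Prompt wording is illustrative; DGAT's real prompts may differ.)"""
    payload = {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "You describe source files in one concise paragraph."},
            {"role": "user",
             "content": f"Describe the role of {path}:\n\n{source}"},
        ],
        "max_tokens": 120,
    }
    return json.dumps(payload)

# POST this body to http://localhost:8000/v1/chat/completions
body = describe_file_request("src/graph.cpp", "int main() { return 0; }")
```

Because the endpoint is OpenAI-compatible, any server that speaks this protocol can stand in for vLLM.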
Point DGAT at your project. It walks the file tree, fingerprints every file, sends them to vLLM, and builds the dependency graph.
Dependency extraction runs in parallel — here's the tail end where import relationships are resolved and edges are formed:
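Where a tree-sitter grammar is available, imports are parsed precisely; for everything else a regex fallback applies. A minimal sketch of what such a fallback can look like (these patterns and names are illustrative, not DGAT's actual regexes):

```python
import re

# Illustrative fallback patterns keyed by language.
IMPORT_PATTERNS = {
    "python": re.compile(r"^\s*(?:from\s+([\w.]+)\s+import|import\s+([\w.]+))", re.M),
    "c/c++": re.compile(r'^\s*#\s*include\s*[<"]([^>"]+)[>"]', re.M),
    "javascript": re.compile(r"""^\s*import\s+.*?from\s+['"]([^'"]+)['"]""", re.M),
}

def extract_imports(lang: str, source: str) -> list[str]:
    """Return the raw import targets found in `source` for `lang`,
    in the order they appear; unknown languages yield no edges."""
    pat = IMPORT_PATTERNS.get(lang)
    if pat is None:
        return []
    # The first non-empty capture group holds the module/header name.
    return [next(g for g in m.groups() if g) for m in pat.finditer(source)]
```

Each extracted target still has to be resolved to a real file before it becomes an edge, which is why resolution runs as a separate step.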
Start the backend server and the frontend. Three panels: file explorer on the left, blueprint/graph in the middle, inspector on the right.
Blueprint tab — a synthesised architectural overview of the whole project, generated bottom-up from individual file descriptions:
File inspector — click any file in the explorer to see its description, dependencies, and metadata:
Select a file like `dep_graph.json` to see its role in the project explained inline:
Switch to the Graph tab for an interactive WebGL view of all import relationships. Node size reflects connectivity.
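One plausible way to size nodes by connectivity is log-scaled total degree, so hub files stand out without dwarfing everything else (a sketch; the actual Sigma.js sizing DGAT uses may differ):

```python
import math
from collections import Counter

def node_sizes(edges: list[tuple[str, str]], base: float = 4.0) -> dict[str, float]:
    """Map each node to a display size that grows with its total degree
    (incoming + outgoing edges), log-scaled to tame hub files."""
    degree = Counter()
    for src, dst in edges:
        degree[src] += 1
        degree[dst] += 1
    return {n: base + 2.0 * math.log1p(d) for n, d in degree.items()}
```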
Single node selected — click any node to see a full LLM-generated analysis of that file, plus its outgoing/incoming edges at the bottom:
Two nodes selected — click a second node to inspect the direct edge between them: the import statement, and a plain-English explanation of why one depends on the other:
- Multi-language import extraction — TypeScript, JavaScript, Python, C/C++, Go, Java, Rust, C#, Ruby, PHP, Bash, and more. Tree-sitter grammars for precision, regex fallback for everything else.
- LLM-annotated graph — every file node and every dependency edge gets a concise description generated by a local model. No cloud, no API keys.
- Project blueprint — a synthesised `dgat_blueprint.md` built bottom-up from all file descriptions.
- Incremental updates — `dgat update` re-describes only files whose XXH3 fingerprint changed.
- Static export — embed the entire graph into a single self-contained HTML file. Share with anyone, no server required.
- Live UI — auto-refreshes every 30 s. Three-panel layout with file explorer, blueprint/graph tabs, and an inspector.
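The incremental-update idea boils down to comparing stored fingerprints against fresh ones and re-describing only the mismatches. A sketch of that logic (using BLAKE2b-128 from the standard library as a stand-in, since DGAT's XXH3-128 needs the xxHash library):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Stand-in hash: DGAT uses XXH3-128; BLAKE2b-128 plays the same role here.
    return hashlib.blake2b(data, digest_size=16).hexdigest()

def files_to_redescribe(saved: dict[str, str],
                        current: dict[str, bytes]) -> list[str]:
    """Return paths whose content fingerprint differs from the saved state.
    New files (absent from `saved`) are included so they get described too."""
    return [path for path, data in current.items()
            if saved.get(path) != fingerprint(data)]

saved = {"a.py": fingerprint(b"old"), "b.py": fingerprint(b"same")}
current = {"a.py": b"new", "b.py": b"same", "c.py": b"brand new"}
# a.py changed and c.py is new, so only those need fresh LLM descriptions
```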
- C++17 compiler (GCC 11+ / Clang 14+)
- CMake 3.16+
- A running vLLM instance (or any OpenAI-compatible endpoint) on `localhost`
- Node.js 18+ / Bun (for the frontend)
```bash
git clone https://github.com/yourname/dgat
cd dgat
cmake -B build && cmake --build build -j$(nproc)
# or use the install script
bash install.sh
```

```bash
# full scan — builds tree, descriptions, dep graph, blueprint
./build/dgat /path/to/your/project

# start the API server (no re-scan)
./build/dgat --backend

# start the frontend
cd frontend && bun dev
# open http://localhost:3000
```

| Command | Description |
|---|---|
| `dgat [path]` | Full scan of the target directory |
| `dgat --backend` | Load saved state, start API server |
| `dgat update` | Incremental re-scan (changed files only) |
| `--port <n>` | Override API server port (default: 8090) |
| Layer | Tech |
|---|---|
| Backend | C++17 · cpp-httplib · nlohmann/json · inja · xxHash |
| Parsing | tree-sitter (C, C++, Python, TS, Go, Java, Rust, …) |
| LLM | vLLM · Qwen3.5-2B (any OpenAI-compat endpoint) |
| Frontend | Next.js 14 · TypeScript · Tailwind CSS · Sigma.js · shadcn/ui |
Next up: running DGAT against a few popular open-source repos to see how it holds up on real-world codebases, and putting the populated context to practical use.