AI-powered DevOps config generation with validation, sandboxed execution, and audit logging.
Describe what you need in plain English. DojOps generates it, verifies it, and writes it safely.
Quick Start · Features · Documentation · Skill Hub · Contributing
Writing Terraform, Kubernetes, and CI/CD configs by hand is slow. Using an LLM to generate them is fast but risky: no schema enforcement, no execution controls, no audit trail. Compliance teams can't sign off on configs they can't verify.
DojOps sits between you and your LLM provider. It constrains output to Zod schemas, validates configs with external tools (terraform validate, hadolint, kubectl dry-run), writes files through a sandbox with approval gates, and logs every action to a tamper-proof audit chain.
# Install
npm i -g @dojops/cli
# Configure your LLM provider
dojops config
# Generate your first config
dojops "Create a Kubernetes deployment for nginx with 3 replicas"

Other install methods
# Homebrew (macOS / Linux)
brew tap dojops/tap && brew install dojops
# Shell script
curl -fsSL https://raw.githubusercontent.com/dojops/dojops/main/install.sh | sh
# Docker
docker run --rm -it ghcr.io/dojops/dojops "Create a Terraform config for S3"

See the installation guide for more.
# Describe what you need
dojops "Create a Terraform config for S3 with versioning"
# Break complex goals into task graphs
dojops plan "Set up CI/CD for a Node.js app"
# Execute with approval workflow
dojops apply
# Web dashboard + REST API
dojops serve

Your prompt gets routed to the right specialist agent. The LLM output is locked to a Zod schema, validated by external tools, then written to disk through the sandbox. If something fails mid-plan, dojops apply --resume picks up where it left off.
Agents and providers. 17 specialist agents cover Terraform, Kubernetes, CI/CD, security, Docker, cloud architecture, and more. You can create custom agents with dojops agents create. Six LLM providers are supported: OpenAI, Anthropic, Ollama (local), DeepSeek, Google Gemini, and GitHub Copilot. Switch providers mid-session with /provider.
Skills. 18 built-in skills for GitHub Actions, Terraform, Kubernetes, Helm, Ansible, Docker Compose, Dockerfile, Nginx, Makefile, GitLab CI, Prometheus, Systemd, Jenkinsfile, Grafana, CloudFormation, ArgoCD, Pulumi, and OpenTelemetry Collector. Write your own as .dops v2 manifests and share them on the DojOps Hub.
Autonomous agent. dojops auto reads your project, plans changes, writes code, runs verification, and self-repairs on failure in an iterative tool-use loop. Run it in the background with --background and check results later with dojops runs.
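The self-repair loop boils down to propose, verify, and feed the failure back in. A minimal sketch, with a hypothetical `Agent` interface standing in for the real agent API:

```typescript
type StepResult = { ok: boolean; error?: string };

// Hypothetical interface -- DojOps's internal agent API is not public.
interface Agent {
  proposeFix(goal: string, lastError?: string): Promise<string>;
}

// Iterate: propose a change, verify it, and on failure hand the
// error back to the agent for the next attempt.
async function autoLoop(
  agent: Agent,
  goal: string,
  verify: (patch: string) => Promise<StepResult>,
  maxIterations = 5
): Promise<string> {
  let lastError: string | undefined;
  for (let i = 0; i < maxIterations; i++) {
    const patch = await agent.proposeFix(goal, lastError); // plan + write
    const result = await verify(patch);                    // run verification
    if (result.ok) return patch;                           // success: stop
    lastError = result.error;                              // self-repair input
  }
  throw new Error(`No passing result after ${maxIterations} iterations`);
}
```
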
MCP integration. Extend DojOps with external tools via the Model Context Protocol. Connect any MCP server (stdio or HTTP) with dojops mcp add — tools are automatically discovered and available to the agent loop.
Streaming and voice. LLM responses stream to the terminal in real time. Use --voice to dictate prompts via local whisper.cpp (fully offline, no data leaves your machine).
Planning and execution. Complex goals get decomposed into dependency-aware task graphs with risk classification and semaphore-based parallel execution. File writes are atomic, restricted to infrastructure paths, and backed up automatically. You see a diff preview before every write. Failed plans can be resumed without re-running completed tasks.
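Dependency-aware, semaphore-limited execution can be sketched as follows. `Task`, `Semaphore`, and `executeGraph` are illustrative names, not the planner's real API:

```typescript
type Task = { id: string; deps: string[]; run: () => Promise<void> };

// Minimal counting semaphore to cap how many tasks run at once.
class Semaphore {
  private queue: (() => void)[] = [];
  constructor(private permits: number) {}
  async acquire(): Promise<void> {
    if (this.permits > 0) { this.permits--; return; }
    await new Promise<void>((resolve) => this.queue.push(resolve));
  }
  release(): void {
    const next = this.queue.shift();
    if (next) next(); else this.permits++;
  }
}

// Depth-first topological order (assumes an acyclic graph).
function topoSort(tasks: Task[]): Task[] {
  const byId = new Map(tasks.map((t) => [t.id, t]));
  const sorted: Task[] = [];
  const seen = new Set<string>();
  const visit = (t: Task) => {
    if (seen.has(t.id)) return;
    seen.add(t.id);
    t.deps.forEach((d) => visit(byId.get(d)!));
    sorted.push(t);
  };
  tasks.forEach(visit);
  return sorted;
}

// Run each task once all its dependencies have finished, with at
// most `limit` tasks in flight at any time.
async function executeGraph(tasks: Task[], limit: number): Promise<void> {
  const done = new Map<string, Promise<void>>();
  const sem = new Semaphore(limit);
  const runTask = (t: Task): Promise<void> => {
    const p = (async () => {
      await Promise.all(t.deps.map((d) => done.get(d)!));
      await sem.acquire();
      try { await t.run(); } finally { sem.release(); }
    })();
    done.set(t.id, p);
    return p;
  };
  await Promise.all(topoSort(tasks).map((t) => runTask(t)));
}
```

Independent branches of the graph run in parallel up to the concurrency limit, while dependent tasks always wait for their prerequisites.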
Security scanning. 10 scanners run before configs go live: Trivy, Gitleaks, Checkov, Semgrep, Hadolint, ShellCheck, npm/pip audit, SBOM generation, and license scanning. Configs are also validated by external tools (terraform validate, hadolint, kubectl dry-run) before anything is written.
Persistent memory. DojOps remembers project context across sessions. Notes, error patterns, and task history are stored locally and automatically injected into LLM context when relevant. Toggle with dojops memory auto.
Audit and policy. Every action is recorded in a hash-chained JSONL log with SHA-256 integrity verification. The policy engine controls which paths are writable, enforces timeouts and file size limits, and restricts environment variable access.
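A hash-chained log means each entry's SHA-256 hash covers the previous entry's hash, so tampering with any record invalidates every record after it. A minimal sketch (`AuditEntry` and these helpers are illustrative, not the actual log format):

```typescript
import { createHash } from "node:crypto";

type AuditEntry = {
  action: string;
  timestamp: string;
  prevHash: string;
  hash: string;
};

// Append an entry whose hash commits to the previous entry's hash.
function appendEntry(chain: AuditEntry[], action: string): AuditEntry {
  const prevHash = chain.length ? chain[chain.length - 1].hash : "genesis";
  const timestamp = new Date().toISOString();
  const hash = createHash("sha256")
    .update(JSON.stringify({ action, timestamp, prevHash }))
    .digest("hex");
  const entry = { action, timestamp, prevHash, hash };
  chain.push(entry);
  return entry;
}

// Recompute every hash from the genesis marker forward; any edit
// to an earlier record makes a later hash mismatch.
function verifyChain(chain: AuditEntry[]): boolean {
  let prevHash = "genesis";
  for (const e of chain) {
    const expected = createHash("sha256")
      .update(JSON.stringify({ action: e.action, timestamp: e.timestamp, prevHash }))
      .digest("hex");
    if (e.prevHash !== prevHash || e.hash !== expected) return false;
    prevHash = e.hash;
  }
  return true;
}
```

In DojOps the entries live in a JSONL file, one record per line, which this in-memory array stands in for.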
API and dashboard. 21 REST endpoints expose everything over HTTP. The web dashboard shows metrics, agents, execution history, and security findings. Run dojops serve to start it.
No telemetry. Nothing leaves your machine except requests to your chosen LLM provider.
Full details in the documentation.
@dojops/cli CLI entry point, terminal UI
@dojops/api REST API (Express), web dashboard
@dojops/skill-registry Skill registry, custom skill/agent discovery
@dojops/planner Task graph decomposition, topological executor
@dojops/executor Sandbox, policy engine, approval, audit log
@dojops/runtime 18 built-in DevOps skills (.dops v2)
@dojops/scanner 10 security scanners, remediation
@dojops/mcp MCP server connections, tool discovery
@dojops/context Context7 documentation augmentation
@dojops/session Chat session management, memory
@dojops/core LLM abstraction (6 providers), 17 specialist agents
@dojops/sdk BaseSkill<T>, Zod validation, file utilities
cli -> api -> skill-registry -> runtime -> core -> sdk
-> planner -> executor
-> scanner
-> mcp -> core
-> context -> core
-> session -> core
See docs/architecture.md for the full design.
git clone https://github.com/dojops/dojops.git
cd dojops
pnpm install
pnpm build # Build all 12 packages via Turbo
pnpm test # Run all tests
pnpm lint # ESLint across all packages
# Per-package
pnpm --filter @dojops/core test
# Run locally without global install
pnpm dojops -- "Create a Terraform config for S3"

Requires Node.js >= 20 and pnpm >= 8.
DojOps does not collect telemetry. No project data leaves your machine
except to your configured LLM provider. Generated configs, audit logs,
and scan reports all stay in your local .dojops/ directory.
See the contributing guide for setup, coding standards, and how to add skills and agents.
- Fork the repository
- Create a feature branch (git checkout -b feature/my-feature)
- Make your changes with tests
- Run pnpm test && pnpm lint
- Submit a pull request
