Commit 28279df (v1.1.4)

1 parent b55a2d2

6 files changed: 159 additions & 2 deletions

Dockerfile

Lines changed: 6 additions & 0 deletions

```diff
@@ -69,6 +69,12 @@ RUN if [ "$VARIANT" = "full" ]; then \
     && rm -rf /var/lib/apt/lists/*; \
     fi
 
+# ---------- Azure CLI (full only) ----------
+RUN if [ "$VARIANT" = "full" ]; then \
+    curl -sL https://aka.ms/InstallAzureCLIDeb | bash \
+    && rm -rf /var/lib/apt/lists/*; \
+    fi
+
 # ---------- GitHub CLI ----------
 RUN curl -fsSL https://cli.github.com/packages/githubcli-archive-keyring.gpg \
     | dd of=/usr/share/keyrings/githubcli-archive-keyring.gpg 2>/dev/null && \
```

README.md

Lines changed: 15 additions & 0 deletions

````diff
@@ -71,6 +71,7 @@ Your existing Anthropic account works directly:
 | :wrench: | [Environment Variables](#wrench-environment-variables) |
 | :rocket: | [What's Inside](#rocket-whats-inside) |
 | :robot: | [AI CLI Providers](#robot-ai-cli-providers) |
+| :llama: | [Using Ollama](#llama-using-ollama) |
 | :building_construction: | [Architecture](#building_construction-architecture) |
 | :file_folder: | [Project Structure](#file_folder-project-structure) |
 | :floppy_disk: | [Data & Persistence](#floppy_disk-data--persistence) |
@@ -559,6 +560,7 @@ The full image includes everything above, plus:
 | `wrangler`, `@cloudflare/next-on-pages` | Cloudflare Workers deployment |
 | `vercel` | Vercel deployment |
 | `netlify-cli` | Netlify deployment |
+| `az` | Azure CLI for cloud deployment and management |
 | `prisma`, `drizzle-kit` | The two most popular Node.js ORMs |
 | `pm2` | Production process manager |
 | `eas-cli` | Expo / React Native builds |
@@ -624,6 +626,18 @@ Seven AI CLIs. One container. No other Docker image gives you this.
 
 ---
 
+## :llama: Using Ollama
+
+HolyClaude works with [Ollama](https://ollama.com) as an alternative to an Anthropic subscription. Set two environment variables and use local or cloud models.
+
+See the full setup guide: **[docs/ollama.md](docs/ollama.md)**
+
+<p align="right">
+  <a href="#top">↑ back to top</a>
+</p>
+
+---
+
 ## :building_construction: Architecture
 
 ```mermaid
@@ -691,6 +705,7 @@ holyclaude/
 │   ├── CHANGELOG.md
 │   ├── configuration.md
 │   ├── dockerhub-description.md
+│   ├── ollama.md
 │   └── troubleshooting.md
 ├── scripts/              # Container lifecycle scripts
 │   ├── bootstrap.sh      # First-run setup
````

config/claude-memory-full.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -32,7 +32,7 @@ Both managed by s6-overlay — they auto-restart on failure.
 - **Code quality:** eslint, prettier
 - **Dev servers:** serve, nodemon, http-server
 - **Utilities:** concurrently, dotenv-cli
-- **Deployment:** wrangler (Cloudflare), vercel, netlify-cli, @cloudflare/next-on-pages
+- **Deployment:** wrangler (Cloudflare), vercel, netlify-cli, @cloudflare/next-on-pages, az (Azure)
 - **Databases:** prisma, drizzle-kit
 - **Process management:** pm2
 - **Mobile:** eas-cli (Expo)
```

docs/CHANGELOG.md

Lines changed: 6 additions & 0 deletions

```diff
@@ -4,6 +4,12 @@ All notable changes to HolyClaude will be documented in this file.
 
 The format is based on [Keep a Changelog](https://keepachangelog.com/), and this project adheres to [Semantic Versioning](https://semver.org/).
 
+## [1.1.4] - 03/28/2026
+
+### Added
+- Azure CLI (`az`) in full variant
+- Ollama setup documentation (`docs/ollama.md`) for running HolyClaude with local or cloud models without an Anthropic subscription
+
 ## [1.1.3] - 03/27/2026
 
 ### Added
```

docs/dockerhub-description.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -46,7 +46,7 @@ That's it. Open your browser, sign in, start building.
 
 🖥️ **Headless Browser** — Chromium + Xvfb + Playwright, pre-configured for screenshots, testing, and automation
 
-🛠️ **50+ Dev Tools** — Node.js 22, Python 3, TypeScript, git, GitHub CLI, database clients (PostgreSQL, SQLite, Redis), deployment CLIs (Vercel, Wrangler, Netlify), and more
+🛠️ **50+ Dev Tools** — Node.js 22, Python 3, TypeScript, git, GitHub CLI, database clients (PostgreSQL, SQLite, Redis), deployment CLIs (Vercel, Wrangler, Netlify, Azure), and more
 
 ⚙️ **s6-overlay v3** — Proper PID 1 process supervision with graceful shutdown and automatic service restarts
```

docs/ollama.md (new file)

Lines changed: 130 additions & 0 deletions

# Using HolyClaude with Ollama

HolyClaude can run with [Ollama](https://ollama.com) instead of an Anthropic subscription. Since [January 2026](https://ollama.com/blog/claude), Ollama exposes an Anthropic-compatible API endpoint, so Claude Code connects to it natively.

This means you can use HolyClaude with local models (free, unlimited) or Ollama Cloud models (freemium) without a Claude Max/Pro plan or Anthropic API key.

## Prerequisites

- [Ollama](https://ollama.com/download) installed on your host machine (or another server on your network)
- At least one model pulled (e.g., `ollama pull qwen3-coder`)
- For cloud models: an Ollama account (`ollama signin`)

## Setup

Add two environment variables to your existing [Docker Compose file](../docker-compose.yaml). Start from the Quick Start compose file and add the `ANTHROPIC_*` variables:

```yaml
services:
  holyclaude:
    image: coderluii/holyclaude:latest
    container_name: holyclaude
    restart: unless-stopped
    shm_size: 2g
    cap_add:
      - SYS_ADMIN
      - SYS_PTRACE
    security_opt:
      - seccomp=unconfined
    ports:
      - "3001:3001"
    volumes:
      - ./data/claude:/home/claude/.claude
      - ./workspace:/workspace
    environment:
      - TZ=UTC
      - ANTHROPIC_AUTH_TOKEN=ollama
      - ANTHROPIC_BASE_URL=http://host.docker.internal:11434
```

- `ANTHROPIC_AUTH_TOKEN=ollama` is required by Claude Code but not validated by Ollama. Any string works.
- `ANTHROPIC_BASE_URL` points to your Ollama server. Use `host.docker.internal` to reach the host machine from inside the container, or your server's IP address (e.g., `http://192.168.1.100:11434`).

> **Linux users:** `host.docker.internal` is not available by default on Linux Docker. Either add `extra_hosts: ["host.docker.internal:host-gateway"]` to your compose file, or use your host's LAN IP address directly.
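For Linux, the workaround mentioned in the note slots into the same service block. A minimal sketch (only the added key is shown; the rest of the service stays as in the compose file above):

```yaml
services:
  holyclaude:
    # ...image, ports, volumes, environment as shown above...
    extra_hosts:
      # Maps host.docker.internal to the host's gateway IP on Linux
      - "host.docker.internal:host-gateway"
```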
Start the container:

```bash
docker compose up -d
```

## Selecting a Model

Once inside HolyClaude, switch to an Ollama model using the `/model` command:

```
/model qwen3-coder
```

### Recommended Models

**Local models** (run on your hardware, unlimited usage):

| Model | Size | Notes |
|-------|------|-------|
| `qwen3-coder` | 30B | Excellent for coding, needs 24GB+ VRAM |
| `gpt-oss:20b` | 20B | Strong general purpose |

**Cloud models** (run on Ollama's infrastructure, require `ollama signin`):

| Model | Notes |
|-------|-------|
| `qwen3.5:cloud` | High performance |
| `glm-4.7:cloud` | High performance |
| `minimax-m2.5:cloud` | Fast |

Cloud models are identified by the `:cloud` suffix. They require an Ollama account but have a free tier.

Models should have a context length of at least 32K for best results with Claude Code.
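Context length is configured on the Ollama side, not in HolyClaude. One way to raise it is a derived model built from a Modelfile, a sketch using the standard `num_ctx` parameter (the derived model name is an illustrative choice):

```
FROM qwen3-coder
PARAMETER num_ctx 32768
```

Build it with `ollama create qwen3-coder-32k -f Modelfile`, then select it with `/model qwen3-coder-32k`.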
## Ollama Cloud

If you don't have a GPU or want to try HolyClaude without local hardware, Ollama Cloud runs models remotely.

1. Install Ollama on your computer
2. Sign in: `ollama signin`
3. Use any cloud model (e.g., `qwen3.5:cloud`)

**Pricing:**

| Plan | Price | Cloud Usage |
|------|-------|-------------|
| Free | $0 | Light usage |
| Pro | $20/mo | 50x Free |
| Max | $100/mo | 250x Free |

Local model usage is always unlimited on all plans.

## Switching from Anthropic

If you previously used HolyClaude with an Anthropic subscription and want to switch to Ollama:

1. Add the `ANTHROPIC_AUTH_TOKEN` and `ANTHROPIC_BASE_URL` environment variables to your compose file
2. Restart: `docker compose down && docker compose up -d`

No data deletion is needed. The environment variables override previous authentication.

## Limitations

Ollama's Anthropic API compatibility covers most features Claude Code needs, but some are not supported:

- Prompt caching (`cache_control`)
- PDF document processing
- Token counting endpoint
- Image URLs (base64 images work)

For full details, see [Ollama's Anthropic API documentation](https://docs.ollama.com/api/anthropic-compatibility).

## Troubleshooting

**Claude Code can't connect to Ollama:**

- Verify Ollama is running: `curl http://localhost:11434` on your host (it should return "Ollama is running")
- If Ollama is on a different machine, use its IP address instead of `host.docker.internal`
- Ensure Ollama is listening on all interfaces: `OLLAMA_HOST=0.0.0.0 ollama serve`

**Web Terminal button missing when not logged in to Claude:**

- This is a known CloudCLI UI limitation: the Web Terminal plugin requires an active authentication session. Use `docker exec -it holyclaude bash` as a workaround.

**Model not found:**

- Pull the model first on your Ollama host: `ollama pull qwen3-coder`
- For cloud models, sign in first: `ollama signin`
