Merged
24 changes: 9 additions & 15 deletions README.md
@@ -18,32 +18,26 @@

**OpenClaw-WebTop** runs OpenClaw securely without the need for a dedicated computer. When you are ready, you can move to a Docker environment on your own machine.

## Prerequisites
- An account with [Ollama](https://signin.ollama.com/), so you can use a cloud-based model for free with your daily free credits
- An account with [NVIDIA Build](https://build.nvidia.com/), so you can switch to NVIDIA's free API as a backup

## Quick start (TL;DR)

1. Open this repository in a GitHub Codespace
2. In the Codespace terminal run:

```bash
make start
```

3. Wait for the OpenClaw WebTop Docker container to boot. When the web desktop URL appears in the Codespace `Ports` tab, click it to open the browser desktop.

### Will be fully automated

4. In the webtop, open a terminal and run `ollama serve`
5. Open another terminal and run `ollama signin`
6. Sign in or sign up for a free Ollama account; you get some cloud LLM tokens for free (quota resets daily and weekly)
7. Download a cloud Ollama model via `ollama pull kimi-k2.5:cloud`
2. In the Codespace terminal run `make start`. This starts a webtop OS, accessible from the browser, with Ollama and OpenClaw pre-installed
3. Wait for the OpenClaw WebTop Docker container to boot. When the web desktop URL appears in the Codespace `Ports` tab, click it to open the webtop in a new browser tab.
<img width="703" alt="launch-webtop-via-ports" src="https://raw.githubusercontent.com/gitricko/openclaw-webtop/gitricko-README.md/docs/launch-webtop-via-ports.png" />
4. In the webtop, open a terminal, run `ollama signin`, and sign in via the browser
<img width="703" alt="ollama-signin" src="https://raw.githubusercontent.com/gitricko/openclaw-webtop/gitricko-README.md/docs/ollama-signin.png" />
5. Test connectivity via `ollama pull kimi-k2.5:cloud`
6. Open another terminal and run `ollama launch openclaw --model kimi-k2.5:cloud`
7. Follow the instructions until completion
8. In the webtop, open the Chromium browser and check whether you can reach the OpenClaw gateway at its local URL (`http://localhost:<gateway-port>`)
9. If it does not work, run `openclaw gateway run`
10. When the URL is working, run `openclaw dashboard`; it will give you the URI with a token appended
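The gateway check above can be scripted instead of clicking around in the browser. This is a minimal sketch, not part of the repository: the `wait_for_gateway` helper and the `18789` port are illustrative assumptions, so substitute whatever local URL `openclaw gateway run` actually prints.

```shell
#!/bin/sh
# Poll the OpenClaw gateway until it answers or we give up.
# The port below is a placeholder -- use the one your gateway reports.
wait_for_gateway() {
  url="$1"
  tries="${2:-30}"
  i=0
  while [ "$i" -lt "$tries" ]; do
    # -f: fail on HTTP errors, -sS: quiet but still show real errors
    if curl -fsS -o /dev/null "$url"; then
      echo "gateway up at $url"
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  echo "gateway not reachable at $url"
  return 1
}

# If this reports the gateway as unreachable, start it with: openclaw gateway run
wait_for_gateway "http://localhost:18789" 3 || true
```

Once the helper reports the gateway as up, `openclaw dashboard` should hand you the tokenized URI.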



<!-- ## Star History

[![Star History Chart](https://api.star-history.com/svg?repos=gitricko/openclaw-webtop&type=date&legend=top-left)](https://www.star-history.com/#gitricko/openclaw-webtop&type=date&legend=top-left) -->
8 changes: 4 additions & 4 deletions docker/Dockerfile
@@ -20,10 +20,10 @@ RUN ln -s /usr/local/lib/node_modules/npm/bin/npm-cli.js /usr/local/bin/npm
# Copy the Ollama binary from the official image
COPY --from=ollama-bin /usr/bin/ollama /usr/local/bin/ollama

# (Optional) If you want Ollama to start automatically when the desktop loads,
# you can add a custom init script to /custom-cont-init.d/
# RUN echo 'ollama serve &' > /custom-cont-init.d/start-ollama.sh && \
# chmod +x /custom-cont-init.d/start-ollama.sh
# Start Ollama automatically when the desktop loads
RUN mkdir -p /custom-cont-init.d
RUN echo 'ollama serve &' > /custom-cont-init.d/start-ollama.sh && \
chmod +x /custom-cont-init.d/start-ollama.sh

RUN npm install -g openclaw@${OPENCLAW_VERSION}
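The `RUN echo 'ollama serve &' > ...` step above generates a one-line init script. A slightly more defensive version of that script might look like the sketch below; the `command -v`/`pgrep` guards and the `/tmp/ollama.log` path are assumptions added for illustration, not part of the diff.

```shell
#!/bin/bash
# Sketch of /custom-cont-init.d/start-ollama.sh, run by the container's
# init hook at boot. Skips cleanly when the binary is absent or a server
# is already running, instead of blindly backgrounding a second one.
start_ollama() {
  if ! command -v ollama >/dev/null 2>&1; then
    echo "ollama binary not found; skipping"
    return 0
  fi
  if pgrep -x ollama >/dev/null 2>&1; then
    echo "ollama already running; skipping"
    return 0
  fi
  # Background the server so the desktop keeps booting; keep logs for debugging
  ollama serve >/tmp/ollama.log 2>&1 &
  echo "ollama serve started (logs: /tmp/ollama.log)"
}

start_ollama
```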

Binary file added docs/launch-webtop-via-ports.png
Binary file added docs/ollama-signin.png