diff --git a/README.md b/README.md
index 6983ecc..46c957a 100644
--- a/README.md
+++ b/README.md
@@ -18,24 +18,19 @@
**OpenClaw-WebTop** runs OpenClaw securely without the need for a dedicated computer. When you are ready, you can move to a Docker environment on your own machine if you would like.
+## Prerequisites
+- An account with [Ollama](https://signin.ollama.com/) so that you can use a cloud-based model for free with your daily free credits
+- An account with [NVIDIA Build](https://build.nvidia.com/) so that you can switch to the free NVIDIA API as a backup
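Both prerequisites can be sanity-checked from a terminal inside the webtop. A minimal sketch, assuming Ollama's default local API port (11434) and `NVIDIA_API_KEY` as the variable name where you keep the NVIDIA Build key (the variable name is our assumption, not an official one):

```shell
# Sanity-check the two prerequisites from inside the webtop.
# 11434 is Ollama's default local API port; NVIDIA_API_KEY is an
# assumed variable name for the NVIDIA Build key, not an official one.

check_ollama() {
  if curl -fsS http://localhost:11434/api/version >/dev/null 2>&1; then
    echo "ollama: reachable"
  else
    echo "ollama: not running (start it with 'ollama serve')"
  fi
}

check_nvidia_key() {
  if [ -n "${NVIDIA_API_KEY:-}" ]; then
    echo "nvidia: key set"
  else
    echo "nvidia: key not set"
  fi
}

check_ollama
check_nvidia_key
```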
## Quick start (TL;DR)
1. Open this repository in a GitHub Codespace
-2. In the Codespace terminal run:
-
-```bash
-make start
-```
-
-3. Wait for the OpenClaw WebTop docker to boot. When the web desktop URL appears in the Codespace `Ports` Tab, click it to open the browser desktop.
-
-### Will be full automated
-
-4. In the webtop, open a terminal and run `ollama serve`
-5. Open another terminal and run `ollama signin`
-6. Sign in or sign up for a ollama free account.. and you get some cloud LLM token for free (reset every day and week)
-7. Download a cloud ollama model via `ollama pull kimi-k2.5:cloud`
+2. In the Codespace terminal run: `make start`. This starts a webtop OS, accessible in the browser, with Ollama and OpenClaw pre-installed
+3. Wait for the OpenClaw WebTop container to boot. When the web desktop URL appears in the Codespace `Ports` tab, click it to open the webtop in a new browser tab.
+
+4. In the webtop, open a terminal, run `ollama signin`, and sign in via the browser
+
+7. Test connectivity by pulling a cloud model: `ollama pull kimi-k2.5:cloud`
8. Open another terminal and run `ollama launch openclaw --model kimi-k2.5:cloud`
9. Follow the instructions until completion
10. In the webtop, open the Chromium browser and check whether you can access this URL: http://127.0.0.1
@@ -43,7 +38,6 @@ make start
12. When the URL is working, run `openclaw dashboard`; it will give you a URL with a token at the end.
-
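While the container is still booting, steps 10–12 above often fail on the first try. A small polling helper avoids reloading the browser tab by hand; `wait_for_url` is our own hypothetical name, not an openclaw or ollama command:

```shell
# Poll a URL until it answers or a timeout expires; prints "up" or "timeout".
# Useful while the dashboard/gateway is still starting inside the webtop.
wait_for_url() {
  url="$1"
  tries="${2:-30}"   # default: up to 30 one-second attempts
  i=0
  while [ "$i" -lt "$tries" ]; do
    if curl -fsS -o /dev/null "$url" 2>/dev/null; then
      echo "up"
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  echo "timeout"
  return 1
}
```

For example, `wait_for_url http://127.0.0.1 10` keeps retrying for roughly ten seconds before giving up.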
diff --git a/docker/Dockerfile b/docker/Dockerfile
index 47cc24a..bd904c5 100644
--- a/docker/Dockerfile
+++ b/docker/Dockerfile
@@ -20,10 +20,10 @@ RUN ln -s /usr/local/lib/node_modules/npm/bin/npm-cli.js /usr/local/bin/npm
# Copy the Ollama binary from the official image
COPY --from=ollama-bin /usr/bin/ollama /usr/local/bin/ollama
-# (Optional) If you want Ollama to start automatically when the desktop loads,
-# you can add a custom init script to /custom-cont-init.d/
-# RUN echo 'ollama serve &' > /custom-cont-init.d/start-ollama.sh && \
-# chmod +x /custom-cont-init.d/start-ollama.sh
+# Start Ollama automatically when the desktop loads
+RUN mkdir -p /custom-cont-init.d && \
+    echo 'ollama serve &' > /custom-cont-init.d/start-ollama.sh && \
+    chmod +x /custom-cont-init.d/start-ollama.sh
RUN npm install -g openclaw@${OPENCLAW_VERSION}
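Written out, the init script those `RUN` lines generate is a single background launch. A slightly more defensive hand-written variant might look like the sketch below; the shebang, the log path, and the redirection are additions beyond the Dockerfile's one-liner:

```shell
#!/bin/bash
# /custom-cont-init.d/start-ollama.sh
# linuxserver.io-style images execute everything in /custom-cont-init.d at boot.
# Launch the Ollama server in the background so desktop startup is not blocked,
# and keep its output in a log file for debugging.
ollama serve >/var/log/ollama.log 2>&1 &
```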
diff --git a/docs/launch-webtop-via-ports.png b/docs/launch-webtop-via-ports.png
new file mode 100644
index 0000000..077feda
Binary files /dev/null and b/docs/launch-webtop-via-ports.png differ
diff --git a/docs/ollama-signin.png b/docs/ollama-signin.png
new file mode 100644
index 0000000..6a99ce9
Binary files /dev/null and b/docs/ollama-signin.png differ