A Vercel-deployed chatbot that integrates with OpenClaw — the self-hosted AI agent gateway.
- Streaming chat UI with OpenClaw gateway integration
- Configurable gateway URL, auth token, and agent ID via settings panel
- Server-side defaults via environment variables, client-side overrides via localStorage
- Edge runtime for low-latency streaming
- Built with Next.js, TypeScript, and Tailwind CSS
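The defaults-plus-overrides behavior can be sketched as a small resolver: values the settings panel saved on the client win over server-side environment defaults. This is a minimal sketch, assuming the repo merges config this way; `resolveGatewayConfig` and its field names are illustrative, not the actual code.

```typescript
// Sketch of config resolution: client-side overrides (e.g. values the settings
// panel persisted to localStorage) take precedence over server-side env defaults.
// The function and field names here are assumptions for illustration.
interface GatewayConfig {
  gatewayUrl: string;
  token: string;
}

export function resolveGatewayConfig(
  overrides: Partial<GatewayConfig>,
  env: Record<string, string | undefined>
): GatewayConfig {
  return {
    gatewayUrl:
      overrides.gatewayUrl ?? env.OPENCLAW_GATEWAY_URL ?? "http://localhost:18789",
    token: overrides.token ?? env.OPENCLAW_TOKEN ?? "no-token",
  };
}
```

The fallback chain mirrors the environment-variable defaults listed below.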
| Variable | Description | Default |
|---|---|---|
| `OPENCLAW_GATEWAY_URL` | OpenClaw gateway HTTP endpoint | `http://localhost:18789` |
| `OPENCLAW_TOKEN` | Auth token for the gateway | `no-token` |
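For local development, these can go in a `.env.local` file; the values shown are the defaults from the table above.

```
OPENCLAW_GATEWAY_URL=http://localhost:18789
OPENCLAW_TOKEN=no-token
```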
```sh
npm install
npm run dev
```

Open http://localhost:3000. Use the settings panel (gear icon) to configure your gateway connection.
- Click the Deploy with Vercel button above, or:
  - Connect the `makerdock/openclaw-chatbot` repo to a Vercel project
  - Set the `OPENCLAW_GATEWAY_URL` and `OPENCLAW_TOKEN` environment variables
  - Deploy
Note: For Vercel to reach a local OpenClaw gateway, expose it via Tailscale Serve, Cloudflare Tunnel, or bind to a public address.
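For example, a Cloudflare quick tunnel can expose a local gateway with one command; this is a sketch of one of the options above, and the generated public hostname is random.

```sh
# Expose the local OpenClaw gateway via a Cloudflare quick tunnel.
# cloudflared prints a public https://<random>.trycloudflare.com URL;
# use that URL as OPENCLAW_GATEWAY_URL in your Vercel project settings.
cloudflared tunnel --url http://localhost:18789
```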
- `app/api/chat/route.ts` — Edge API route that proxies chat completions to OpenClaw via the OpenAI-compatible API
- `app/page.tsx` — Chat UI with message history and streaming display
- `components/` — Settings panel for gateway configuration
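The proxy route might look roughly like this. It is a minimal sketch under two assumptions that are not confirmed by this README's text: that the gateway exposes an OpenAI-compatible `/v1/chat/completions` endpoint and that it accepts a bearer token; the request/response field names are likewise illustrative.

```typescript
// app/api/chat/route.ts (sketch) -- Edge-runtime proxy to the OpenClaw gateway.
export const runtime = "edge";

export async function POST(req: Request): Promise<Response> {
  // The client may send per-request overrides saved by the settings panel.
  const { messages, gatewayUrl, token } = await req.json();
  const base =
    gatewayUrl ?? process.env.OPENCLAW_GATEWAY_URL ?? "http://localhost:18789";
  const auth = token ?? process.env.OPENCLAW_TOKEN ?? "no-token";

  // Forward to the gateway's OpenAI-compatible endpoint with streaming enabled.
  const upstream = await fetch(`${base}/v1/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${auth}`,
    },
    body: JSON.stringify({ messages, stream: true }),
  });

  // Pipe the SSE body straight through so the UI can render tokens as they arrive.
  return new Response(upstream.body, {
    status: upstream.status,
    headers: { "Content-Type": "text/event-stream" },
  });
}
```

Because the route only relays the upstream stream, the Edge runtime never buffers the full completion, which keeps time-to-first-token low.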