Watson is a web application that lets you chat with an AI to analyze blockchain addresses and transactions. It uses Blockscout's Model Context Protocol (MCP) tools for data access and offers four provider paths, which can be configured during setup:
- Claude Agent SDK
- OpenAI Agents SDK
- Google Gemini SDK
- Groq SDK
The UI does not expose model selection; you can switch providers manually via environment variables (see `example.env`).
- Next.js App Router, React, TypeScript
- Tailwind CSS + shadcn/ui
- Blockscout MCP tools
- Claude Agent SDK (@anthropic-ai/claude-agent-sdk)
- OpenAI Agents SDK (@openai/agents)
- Google Gemini (@google/generative-ai)
- Groq SDK (groq-sdk)
- Chain selector + address/tx input with validation
- Quick actions for common analyses
- Chat interface with Markdown rendering
- Non-streaming and streaming APIs for responses
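For illustration, the address/tx input validation can be sketched with simple format checks (a sketch only; the actual rules in the input component may differ):

```typescript
// Sketch: EVM address and transaction-hash format checks (hypothetical;
// the app's real validation may differ). This checks the hex format only,
// not the EIP-55 checksum.
const isEvmAddress = (s: string): boolean => /^0x[0-9a-fA-F]{40}$/.test(s);
const isTxHash = (s: string): boolean => /^0x[0-9a-fA-F]{64}$/.test(s);
```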
- Install dependencies: `npm install`
- Configure environment: create `.env.local` (see `example.env`)
- `API_PROVIDER=ANTHROPIC` (required: `ANTHROPIC`, `OPENAI`, `GEMINI`, or `GROQ`)
- `ANTHROPIC_API_KEY=your_key` (required if `API_PROVIDER=ANTHROPIC`)
- `OPENAI_API_KEY=your_key` (required if `API_PROVIDER=OPENAI`)
- `OPENAI_AGENT_MODEL=gpt-4.1` (optional; defaults to `gpt-4.1`)
- `GEMINI_API_KEY=your_key` (required if `API_PROVIDER=GEMINI`)
- `GROQ_API_KEY=your_key` (required if `API_PROVIDER=GROQ`)
- `GROQ_MODEL=openai/gpt-oss-120b` (defaults to `llama-3.3-70b-versatile`; an MCP-capable model such as `openai/gpt-oss-120b` is required for MCP tool use)
Do not commit the `.env.local` file. Rotate any keys that are exposed.
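A minimal `.env.local` for the Claude path might look like this (a sketch; only the variables for the selected provider are needed):

```bash
# .env.local (do not commit): Claude/Anthropic provider path
API_PROVIDER=ANTHROPIC
ANTHROPIC_API_KEY=your_key
```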
- Run the app: `npm run dev`
- Unified (Recommended): `POST /api/chat-unified`
  - Automatically routes to the provider specified by the `API_PROVIDER` env variable
  - Body: `{ messages: Message[], chainId: string, address?: string, txHash?: string }`
  - Returns: `{ message: string, toolCalls?: [], usage?: {}, provider: string }`
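As a sketch, a client call to the unified endpoint could be assembled like this (the `Message` shape below is an assumption based on the documented body; match it to the app's actual type):

```typescript
// Sketch of a client helper for POST /api/chat-unified.
// The Message type is assumed, not taken from the app's code.
type Message = { role: "user" | "assistant"; content: string };

interface ChatRequest {
  url: string;
  init: { method: string; headers: Record<string, string>; body: string };
}

function buildUnifiedChatRequest(
  messages: Message[],
  chainId: string,
  opts: { address?: string; txHash?: string } = {}
): ChatRequest {
  return {
    url: "/api/chat-unified",
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      // Body matches the documented shape: messages, chainId, optional address/txHash.
      body: JSON.stringify({ messages, chainId, ...opts }),
    },
  };
}
```

Usage: `const res = await fetch(req.url, req.init);` then read `message` and `provider` from `await res.json()`.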
- Claude (Direct): `POST /api/chat`
  - Body: `{ messages: Message[], chainId: string, address?: string, txHash?: string }`
  - Returns: `{ message: string, toolCalls?: [], usage?: {} }`
- Streaming (Claude): `POST /api/chat/stream` (Server-Sent Events)
- OpenAI (Direct): `POST /api/chat-openai`; same request/response shape, uses the OpenAI Agents SDK with hosted MCP tools
- Gemini (Direct): `POST /api/chat-gemini`; same request/response shape, uses the Google Gemini API
- Groq (Direct): `POST /api/chat-groq`; same request/response shape, uses the Groq API with Llama models
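A consumer of the streaming endpoint needs to split the `text/event-stream` payload into `data:` lines. The sketch below assumes plain `data:` events; the exact event shape `/api/chat/stream` emits is not documented here, so treat payloads as opaque:

```typescript
// Sketch: extract data payloads from a raw SSE chunk.
// Assumes simple "data: ..." events; adjust to the stream's real format.
function parseSseData(chunk: string): string[] {
  return chunk
    .split("\n")
    .filter((line) => line.startsWith("data:"))
    .map((line) => line.slice("data:".length).trim())
    // Drop empty payloads and a conventional [DONE] sentinel, if present.
    .filter((payload) => payload.length > 0 && payload !== "[DONE]");
}
```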
- Recommended: set `API_PROVIDER` in `.env.local` to one of `ANTHROPIC`, `OPENAI`, `GEMINI`, or `GROQ`
- The unified endpoint `/api/chat-unified` will automatically use the specified provider
- Alternatively, call provider-specific routes directly:
  - `/api/chat` (Claude), `/api/chat-openai` (OpenAI), `/api/chat-gemini` (Gemini), `/api/chat-groq` (Groq)
- Adjust models: set `OPENAI_AGENT_MODEL` or `GROQ_MODEL` in `.env.local`
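The provider routing can be pictured roughly like this (a sketch, not the repo's actual implementation; treating `ANTHROPIC` as the fallback is an assumption):

```typescript
// Hypothetical sketch of API_PROVIDER-based routing as /api/chat-unified might do it.
function resolveProviderRoute(provider: string | undefined): string {
  switch ((provider ?? "ANTHROPIC").toUpperCase()) {
    case "OPENAI":
      return "/api/chat-openai";
    case "GEMINI":
      return "/api/chat-gemini";
    case "GROQ":
      return "/api/chat-groq";
    default:
      // Falling back to the Claude route is an assumption of this sketch.
      return "/api/chat";
  }
}
```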
- App UI: `app/page.tsx`
- Claude config/helpers: `lib/agent-config.ts` (`runBlockscoutQuery`, `createBlockscoutAgent`)
- OpenAI helpers: `lib/agent-config.ts` (`runBlockscoutQueryWithOpenAI`, `createBlockscoutAgentWithOpenAI`)
- Gemini helpers: `lib/agent-config.ts` (`runBlockscoutQueryWithGemini`, `createBlockscoutAgentWithGemini`)
- Groq helpers: `lib/agent-config.ts` (`runBlockscoutQueryWithGroq`, `createBlockscoutAgentWithGroq`)
- APIs: `app/api/chat/route.ts`, `app/api/chat/stream/route.ts`, `app/api/chat-openai/route.ts`, `app/api/chat-gemini/route.ts`, `app/api/chat-groq/route.ts`
- Components: `components/ui/*` plus `components/chain-selector.tsx`, `components/address-input.tsx`, `components/chat-interface.tsx`, `components/quick-actions.tsx`
- Tailwind/shadcn are preconfigured and themed with dark mode support
- Next.js build may warn about multiple lockfiles; you can set `turbopack.root` or use a single lockfile
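If that warning appears, pinning the workspace root can look like this (a sketch for `next.config.ts`; verify the option name and `__dirname` availability against the Next.js version in use):

```typescript
// next.config.ts sketch: pin the Turbopack workspace root so Next.js
// stops inferring it from multiple lockfiles (option placement assumed;
// check your Next.js version's docs).
import type { NextConfig } from "next";

const nextConfig: NextConfig = {
  turbopack: {
    root: __dirname,
  },
};

export default nextConfig;
```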



