charlesms1246/WATSON

AI chat analyzing addresses & txs via Blockscout MCP across chains.

Overview

Watson is a web application that lets you chat with an AI to analyze blockchain addresses and transactions. It uses Blockscout’s Model Context Protocol (MCP) tools for data access and offers four provider paths, which can be configured during setup:

  • Claude Agent SDK
  • OpenAI Agents SDK
  • Google Gemini SDK
  • Groq Agents SDK

The UI does not expose model selection; switch providers via environment variables (see example.env).

Tech Stack

  • Next.js App Router, React, TypeScript
  • Tailwind CSS + shadcn/ui
  • Blockscout MCP tools
  • Claude Agent SDK (@anthropic-ai/claude-agent-sdk)
  • OpenAI Agents SDK (@openai/agents)
  • Google Gemini (@google/generative-ai)
  • Groq SDK (groq-sdk)

Features

  • Chain selector + address/tx input with validation
  • Quick actions for common analyses
  • Chat interface with Markdown rendering
  • Non-streaming and streaming APIs for responses
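Input validation for addresses and transaction hashes can be sketched with simple hex checks. This is a minimal illustration under assumed rules (0x-prefixed, 20-byte addresses and 32-byte hashes); the app's actual validation lives in components/address-input.tsx.

```typescript
// Minimal EVM input validation sketch (illustrative only; the app's real
// checks are implemented in components/address-input.tsx).
export function isEvmAddress(value: string): boolean {
  // 0x prefix followed by 20 bytes => 40 hex characters
  return /^0x[0-9a-fA-F]{40}$/.test(value);
}

export function isTxHash(value: string): boolean {
  // 0x prefix followed by 32 bytes => 64 hex characters
  return /^0x[0-9a-fA-F]{64}$/.test(value);
}
```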

Getting Started

  1. Install dependencies:

     npm install

  2. Configure environment (create .env.local; see example.env):
  • API_PROVIDER=ANTHROPIC (required: ANTHROPIC, OPENAI, GEMINI, or GROQ)
  • ANTHROPIC_API_KEY=your_key (required if API_PROVIDER=ANTHROPIC)
  • OPENAI_API_KEY=your_key (required if API_PROVIDER=OPENAI)
  • OPENAI_AGENT_MODEL=gpt-4.1 (optional, defaults to gpt-4.1)
  • GEMINI_API_KEY=your_key (required if API_PROVIDER=GEMINI)
  • GROQ_API_KEY=your_key (required if API_PROVIDER=GROQ)
  • GROQ_MODEL=openai/gpt-oss-120b (optional; defaults to llama-3.3-70b-versatile, but a tool-capable model such as openai/gpt-oss-120b is required for MCP)
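For example, a minimal .env.local for the Claude path might look like this (the key value is a placeholder):

```shell
# .env.local — never commit this file
API_PROVIDER=ANTHROPIC
ANTHROPIC_API_KEY=your_key_here
```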

Do not commit the .env.local file. Rotate any keys that are exposed.

  3. Run the app:

     npm run dev

Open http://localhost:3000 in your browser.

API Endpoints

  • Unified (Recommended): POST /api/chat-unified
    • Automatically routes to the provider specified by API_PROVIDER env variable
    • Body: { messages: Message[], chainId: string, address?: string, txHash?: string }
    • Returns: { message: string, toolCalls?: [], usage?: {}, provider: string }
  • Claude (Direct): POST /api/chat
    • Body: { messages: Message[], chainId: string, address?: string, txHash?: string }
    • Returns: { message: string, toolCalls?: [], usage?: {} }
  • Streaming (Claude): POST /api/chat/stream (Server-Sent Events)
  • OpenAI (Direct): POST /api/chat-openai
    • Same request/response shape; uses OpenAI Agents SDK with hosted MCP tools
  • Gemini (Direct): POST /api/chat-gemini
    • Same request/response shape; uses Google Gemini API
  • Groq (Direct): POST /api/chat-groq
    • Same request/response shape; uses Groq API with Llama models
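As a sketch, a client call to the unified endpoint could build its body like this. The Message shape below is an assumption based on the documented request body, not the repository's actual type:

```typescript
// Assumed message shape for the documented { messages, chainId, address?, txHash? } body.
interface Message {
  role: "user" | "assistant";
  content: string;
}

// Build the request body, omitting optional fields that are not provided.
export function buildChatBody(
  messages: Message[],
  chainId: string,
  address?: string,
  txHash?: string
) {
  return {
    messages,
    chainId,
    ...(address ? { address } : {}),
    ...(txHash ? { txHash } : {}),
  };
}

// Usage against a running dev server (illustrative):
// const res = await fetch("http://localhost:3000/api/chat-unified", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(
//     buildChatBody([{ role: "user", content: "Analyze this address" }], "1", "0x...")
//   ),
// });
```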

Switching Providers

  • Recommended: Set API_PROVIDER in .env.local to one of: ANTHROPIC, OPENAI, GEMINI, or GROQ
  • The unified endpoint /api/chat-unified will automatically use the specified provider
  • Alternatively, call provider-specific routes directly:
    • /api/chat - Claude
    • /api/chat-openai - OpenAI
    • /api/chat-gemini - Gemini
    • /api/chat-groq - Groq
  • Adjust models: set OPENAI_AGENT_MODEL or GROQ_MODEL in .env.local
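The unified route's dispatch can be pictured as a lookup from API_PROVIDER to the direct route paths listed above. This is an illustrative sketch, not the repository's implementation:

```typescript
// Map API_PROVIDER values to their direct route paths (sketch only; the
// actual dispatch lives inside the /api/chat-unified route handler).
const PROVIDER_ROUTES: Record<string, string> = {
  ANTHROPIC: "/api/chat",
  OPENAI: "/api/chat-openai",
  GEMINI: "/api/chat-gemini",
  GROQ: "/api/chat-groq",
};

export function resolveProviderRoute(provider: string | undefined): string {
  // Assumes ANTHROPIC as the fallback when the variable is unset.
  const route = PROVIDER_ROUTES[(provider ?? "ANTHROPIC").toUpperCase()];
  if (!route) {
    throw new Error(`Unknown API_PROVIDER: ${provider}`);
  }
  return route;
}
```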

Notable Files

  • App UI: app/page.tsx
  • Claude config/helpers: lib/agent-config.ts (runBlockscoutQuery, createBlockscoutAgent)
  • OpenAI helpers: lib/agent-config.ts (runBlockscoutQueryWithOpenAI, createBlockscoutAgentWithOpenAI)
  • Gemini helpers: lib/agent-config.ts (runBlockscoutQueryWithGemini, createBlockscoutAgentWithGemini)
  • Groq helpers: lib/agent-config.ts (runBlockscoutQueryWithGroq, createBlockscoutAgentWithGroq)
  • APIs: app/api/chat/route.ts, app/api/chat/stream/route.ts, app/api/chat-openai/route.ts, app/api/chat-gemini/route.ts, app/api/chat-groq/route.ts

Components (shadcn/ui)

  • components/ui/* plus:
    • components/chain-selector.tsx
    • components/address-input.tsx
    • components/chat-interface.tsx
    • components/quick-actions.tsx

UI Demo

Screenshots of the chat UI are included in the repository.

Notes

  • Tailwind/shadcn are preconfigured and themed with dark mode support
  • Next.js build may warn about multiple lockfiles; you can set turbopack.root or use a single lockfile
