
GeoStorm

AI Perception Monitoring for Software

Monitor how AI systems perceive and recommend your software across GPT, Claude, Gemini, and more.


Developers increasingly discover software through AI -- GPT, Claude, Gemini, Perplexity, and others. When someone asks "what's the best library for X?", the AI's answer shapes adoption. But you have no idea what these models are saying about your project.

GeoStorm monitors multiple AI models on a schedule, tracks how they perceive and recommend your software, and alerts you when things change -- a new competitor appears, your ranking drops, or a model stops mentioning you.

One container, one command.


Built with spec-driven development via Shotgun (GitHub)


Quick Start

docker run -d -p 8080:8080 -v geostorm-data:/app/data --name geostorm ghcr.io/geostorm-ai/geostorm

Open http://localhost:8080 -- the demo loads immediately.

No git clone, no build step, no API keys. A demo project with 90 days of synthetic monitoring data is ready to explore.

Requirements

  • Docker -- the image bundles everything else, so that's the only dependency.

What You'll See

The demo project ships with realistic sample data so you can explore every feature immediately:

| Feature | Description |
| --- | --- |
| Signal Panel | A unified feed of alerts, ranked by severity and recency |
| Alerts Feed | Critical and warning signals with full context on what changed |
| Perception Chart | Track your recommendation share and positioning across models over time |

The demo data covers multiple AI models, competitor tracking, and trend analysis.


Next Steps

To start monitoring your own software:

1. Get an API key at OpenRouter -- one key gives you access to multiple AI models.

2. Add your OpenRouter API key in the GeoStorm settings page:

(Screenshot: GeoStorm Settings -- API Key)

3. Create a project in the UI and GeoStorm starts monitoring on a schedule.

4. (Optional) Connect Claude Code so you can query your perception data conversationally:

claude mcp add --transport http geostorm http://localhost:8080/mcp/

Then ask Claude things like "What projects am I monitoring?", "Show me perception scores for Shotgun", or "Are there any alerts I should know about?"


Alert Types

GeoStorm detects and alerts on these signals:

| Alert | Severity | Description |
| --- | --- | --- |
| `competitor_emergence` | Critical | A new competitor has appeared in AI recommendations for your category |
| `disappearance` | Critical | Your software has stopped being mentioned by one or more AI models |
| `recommendation_share_drop` | Warning | Your share of AI recommendations has declined significantly |
| `position_degradation` | Warning | Your software is being listed lower in AI recommendation rankings |
| `model_divergence` | Warning | Different AI models are giving substantially different recommendations about your software |

Architecture

GeoStorm runs as a single Docker container with no external dependencies:

| Component | Technology |
| --- | --- |
| Backend | FastAPI serving the REST API and running scheduled monitoring jobs via APScheduler (in-process) |
| Frontend | Astro with React islands, styled with TailwindCSS, charts powered by Recharts |
| Database | SQLite, stored in a mounted volume (`./data/`) |
| Scheduling | APScheduler runs inside the FastAPI process -- no separate worker, no Redis, no message queue |

One container, one port, one volume mount.


MCP Integration (Claude Code)

GeoStorm exposes an MCP endpoint at /mcp/ so AI coding assistants like Claude Code can query your perception data conversationally. See Next Steps for the one-liner setup command.

To scope the MCP to a specific project directory instead of globally:

claude mcp add --transport http --scope project geostorm http://localhost:8080/mcp/

Available Tools

| Tool | Description |
| --- | --- |
| `list_projects` | Discover project IDs, names, and latest scores |
| `get_project_summary` | Full project summary: detail, perception, breakdown, recent runs, and alerts (accepts ID or fuzzy name) |
| `get_run_detail` | Single run with perception score and competitors detected |
| `get_trajectory` | Historical trend data with day/week/month aggregation (accepts ID or fuzzy name) |
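As a rough picture of what `get_trajectory`'s week granularity does, here is an illustrative sketch of bucketing daily scores into ISO weeks -- the actual tool's output shape and aggregation rules are GeoStorm's, not this code's:

```python
from collections import defaultdict
from datetime import date

def aggregate_weekly(points: list[tuple[date, float]]) -> dict[str, float]:
    """Average daily perception scores into ISO-week buckets like '2024-W01'."""
    buckets: dict[str, list[float]] = defaultdict(list)
    for day, score in points:
        iso = day.isocalendar()
        buckets[f"{iso.year}-W{iso.week:02d}"].append(score)
    return {week: sum(scores) / len(scores) for week, scores in buckets.items()}
```

The same bucketing works for day or month granularity by changing the key function.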

Configuration

GeoStorm works out of the box with zero configuration. You can optionally configure notification channels via environment variables in a .env file:

| Channel | Description |
| --- | --- |
| Slack | Set a webhook URL to receive alerts in a Slack channel |
| Email | Configure SMTP settings for email notifications |
| Custom Webhook | Point alerts at any HTTP endpoint |

All notification channels are optional. GeoStorm always displays alerts in the UI regardless of notification configuration.
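For the custom webhook channel, delivery amounts to a JSON POST to your endpoint. A stdlib sketch of the idea -- the payload fields below are assumptions for illustration, not GeoStorm's documented schema:

```python
import json
import urllib.request

def build_webhook_request(url: str, alert: dict) -> urllib.request.Request:
    """Wrap an alert dict in a JSON POST request for the configured endpoint."""
    body = json.dumps({"source": "geostorm", "alert": alert}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def send_alert(url: str, alert: dict) -> int:
    """Deliver the alert and return the endpoint's HTTP status code."""
    with urllib.request.urlopen(build_webhook_request(url, alert)) as resp:
        return resp.status
```

Any service that accepts a JSON POST -- a serverless function, ntfy, a home-grown dashboard -- can sit on the receiving end.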

Telemetry

GeoStorm is free and open source. The only telemetry is an anonymous ping when the server starts and when a monitoring run completes -- no names, no IPs, no project data. This helps us know the project is being used. You can turn it off anytime.


Roadmap

Planned features, roughly in priority order:

  • Authentication -- user accounts and login so GeoStorm can be hosted on a remote server or shared instance without exposing everything to the network
  • Direct provider support -- use OpenAI, Anthropic, and Google API keys directly instead of going through OpenRouter
  • Expanded model coverage -- automatic support for the latest models as they launch (Perplexity, Grok, etc.) so you're always monitoring against whatever people are actually using
  • Data export -- CSV and PDF export of perception data, alerts, and run history
  • Raspberry Pi hosting guide -- instructions for running GeoStorm on a Pi for always-on monitoring at home
  • Hosted version -- a managed cloud option if there's enough demand for it

Have a feature request? Open an issue.


Contributing

GeoStorm is open source, and we welcome contributions.

Development Setup

# Run locally with a local build
docker compose -f docker-compose.yml -f docker-compose.dev.yml up -d --build

# Backend checks
uv sync --frozen --all-extras
uv run ruff check .
uv run mypy src/ --strict
uv run pytest tests/ -v

# Frontend checks
cd web && pnpm install --frozen-lockfile
pnpm astro check
pnpm tsc --noEmit

Structured Logging (Optional)

GeoStorm uses Logfire for structured logging. Console logs work out of the box with no extra setup. To also send telemetry to Logfire cloud during development, set LOGFIRE_TOKEN in your environment.


FAQ

Why would I want this?

Developers increasingly discover tools by asking AI -- "what's the best library for X?" If a model stops recommending your project or starts favoring a competitor, you'd never know unless you checked manually. GeoStorm automates that check and alerts you when something changes.

Why OpenRouter?

OpenRouter gives you access to GPT, Claude, Gemini, Llama, and dozens of other models through a single API key. Instead of managing separate keys for OpenAI, Anthropic, and Google, you sign up once and GeoStorm can query all of them. Direct provider keys (OPENAI_API_KEY, ANTHROPIC_API_KEY, GOOGLE_API_KEY) are planned -- see the roadmap.

Is there a hosted version?

Not yet. GeoStorm is self-hosted only for now. The Docker container is designed to be easy to run anywhere -- your laptop, a VPS, or a cloud VM. A hosted version is on the roadmap.

Why SQLite?

GeoStorm is a single-user monitoring tool, not a multi-tenant SaaS. SQLite keeps things simple -- no database server to run, no connection strings to configure. Your data lives in a single file on a mounted volume. For GeoStorm's query patterns, SQLite is more than fast enough.

How much does it cost to run?

GeoStorm itself is free. The only cost is AI API usage through OpenRouter. A typical monitoring run queries 3 models with a few prompts each -- roughly $0.01-0.05 per run depending on the models you choose. Running daily, that works out to roughly $0.30-$1.50/month.

Couldn't I do this with OpenClaw?

You could wire up an OpenClaw agent with a cron job to query AI models daily and store the results somewhere. But then you're building GeoStorm from scratch -- prompt engineering for consistent structured responses, parsing and normalizing across models, calculating recommendation share and position rankings, detecting changes over time, generating alerts, and building a UI to make sense of it all.

GeoStorm does all of that out of the box. It's also cheaper and more predictable -- deterministic code on a fixed schedule, so you know what queries run and what they cost. An AI agent deciding what to do each run can drift or burn tokens on reasoning overhead. No agent framework required.

How do I disable telemetry?

Set the NO_TELEMETRY=true environment variable. This completely disables all analytics -- no PostHog client is created and no events are sent. See PRIVACY.md for full details on what is (and isn't) collected.

# Docker
docker run -e NO_TELEMETRY=true ...

# .env file
NO_TELEMETRY=true

Get started

docker run -d -p 8080:8080 -v geostorm-data:/app/data --name geostorm ghcr.io/geostorm-ai/geostorm

License: MIT | Python: 3.11+ | Homepage: github.com/geostorm-ai/geostorm
