The AI-native, open-source RGB lighting platform.
Control every LED on your PC from a browser. Generate effects from a sentence. Map your hardware in 3D and let lighting flow through real physical space. Built to be hacked on.
███████╗██╗ ██╗███╗ ██╗ █████╗ ██████╗ ███████╗███████╗ ██████╗ ██████╗ ██████╗
██╔════╝╚██╗ ██╔╝████╗ ██║██╔══██╗██╔══██╗██╔════╝██╔════╝ ██╔══██╗██╔════╝ ██╔══██╗
███████╗ ╚████╔╝ ██╔██╗ ██║███████║██████╔╝███████╗█████╗ ██████╔╝██║ ███╗██████╔╝
╚════██║ ╚██╔╝ ██║╚██╗██║██╔══██║██╔═══╝ ╚════██║██╔══╝ ██╔══██╗██║ ██║██╔══██╗
███████║ ██║ ██║ ╚████║██║ ██║██║ ███████║███████╗ ██║ ██║╚██████╔╝██████╔╝
╚══════╝ ╚═╝ ╚═╝ ╚═══╝╚═╝ ╚═╝╚═╝ ╚══════╝╚══════╝ ╚═╝ ╚═╝ ╚═════╝ ╚═════╝
RGB software should be as good as the hardware it controls — and as open as the community that uses it.
Most RGB software is closed, vendor-locked, bloated, and ignores half your hardware. The open alternatives are powerful but stuck in a 2010s UX. SynapseRGB is built around three ideas:
- AI-native — describe what you want in plain English and get a real, runnable effect.
- Spatial-first — your case is a 3D object. Effects should travel through actual physical space, not a flat device list.
- Hackable — Rust core, React dashboard, TypeScript plugin SDK, MCP server. Every layer is documented and replaceable.
| Feature | ASUS Aura | OpenRGB | SignalRGB | SynapseRGB |
|---|---|---|---|---|
| Open source | ❌ | ✅ | ❌ | ✅ |
| Modern web UI | ❌ | ❌ | ✅ | ✅ |
| Plugin SDK | ❌ | limited | ✅ | ✅ |
| AI control (MCP) | ❌ | ❌ | ❌ | ✅ |
| 3D spatial layout | ❌ | ❌ | ❌ | ✅ |
| Photo-to-layout | ❌ | ❌ | ❌ | ✅ |
| AI effect generation | ❌ | ❌ | ❌ | ✅ |
| Cost | Free | Free | Paid | Free |
Three local services, each independently runnable, each with a documented API.
┌──────────────────────────────────────────────────────────┐
│ SynapseRGB │
│ │
│ Browser ─────▶ Dashboard (React/Vite, :7778) │
│ │ │
│ Claude ─────▶ MCP Server (Node/TS, :7779) │
│ │ │
│ Core Service (Rust/Axum, :7777) │
│ │ │
│ Hardware Abstraction Layer │
│ │ │ │
│ Mock Devices OpenRGB Bridge │
│ │ │
│ real GPU / motherboard / RAM / ... │
└──────────────────────────────────────────────────────────┘
- Core (Rust) — REST + WebSocket, runs the 20fps hardware sync loop, owns device state.
- Dashboard (React) — dark-themed UI, live updates over WebSocket.
- MCP Server (TypeScript) — exposes lighting as tools for Claude or any MCP-capable agent.
- Plugin SDK (TypeScript) — write effects against a clean `tick(ctx) -> DeviceFrame` interface.
Prerequisites: Rust (stable), Node 18+, and OpenRGB running with SDK server enabled (port 6742) for real hardware. Without OpenRGB, you'll get 6 mock devices to play with.
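If you're not sure whether the OpenRGB SDK server is reachable, a quick TCP probe can tell you before you start the core. This helper is ours, not part of SynapseRGB — it just checks whether anything is listening on port 6742:

```typescript
import { createConnection } from "node:net";

// Probe whether something is listening on a TCP port (e.g. OpenRGB's 6742).
// If nothing answers, SynapseRGB's core falls back to the 6 mock devices.
function probePort(port: number, host = "127.0.0.1", timeoutMs = 500): Promise<boolean> {
  return new Promise((resolve) => {
    const sock = createConnection({ port, host });
    const done = (ok: boolean) => {
      sock.destroy();
      resolve(ok);
    };
    sock.setTimeout(timeoutMs, () => done(false)); // no answer in time
    sock.once("connect", () => done(true));        // something is listening
    sock.once("error", () => done(false));         // refused / unreachable
  });
}

// Usage: probePort(6742).then((ok) => console.log(ok ? "OpenRGB found" : "mock mode"));
```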
- Windows: `start.bat`
- Linux/macOS: `./start.sh`

Then open http://localhost:7778.
```sh
# Core
cd core && cargo run

# Dashboard
cd dashboard && npm install && npm run dev

# MCP server (optional, for AI control)
cd mcp && npm install && npm run build
```

See mcp/README.md for wiring the MCP server into Claude Desktop or Claude Code.
SynapseRGB/
├── core/ Rust — REST API, WebSocket, hardware bridge, effects engine
│ ├── src/hardware/openrgb_bridge.rs ← raw OpenRGB protocol
│ ├── src/effects/ ← built-in effects
│ ├── src/plugins/ ← JS plugin runtime (Boa)
│ └── vendor/openrgb/ ← patched openrgb crate
├── dashboard/ React + Vite + Tailwind — web UI, spatial editor, photo-match
├── mcp/ MCP server — AI tool bridge over stdio
├── sdk/ TypeScript types for the plugin SDK
├── plugins/ User and community plugins
├── ROADMAP.md The full plan
└── CLAUDE.md Notes for AI contributors
SynapseRGB is built in phases. We're currently mid-Phase 3 with spatial groundwork already landing in the dashboard.
- Rust core with REST + WebSocket
- React dashboard with dark theme
- 6 built-in effects (Static, Breathing, Rainbow, Wave, Pulse, Color Cycle)
- Scene save/recall
- MCP server for AI control
- Plugin SDK type definitions
- OpenRGB bridge with persistent TCP and split-write `UpdateLEDs`
- 20fps sync loop with auto-reconnect
- Verified on ASUS ROG RTX 3080 Ti (22 LEDs) + ASUS TUF B550 (68 LEDs)
- ASUS Aura SMBus driver (Windows)
- USB HID enumeration
- Corsair iCUE (keyboard/mouse)
- NVIDIA NvAPI / AMD ADL direct GPU control
- JavaScript plugin runtime (Boa engine, sandboxed)
- Audio-reactive effects (WASAPI loopback capture)
- Screen ambient mode
- Community effects plugin system
- Visual effect timeline editor
- Game integration hooks
The insight: effects only become physical once the system knows where each device lives in 3D.
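To make the idea concrete, here is a sketch of how a direction-aware primitive can use per-LED 3D positions: project each LED onto a travel axis and light the ones the wavefront has already passed. The names (`Vec3`, `sweepLevels`) and the fade falloff are our illustration, not the shipped spatial primitives:

```typescript
// Illustrative only — not the actual SynapseRGB spatial primitives.
type Vec3 = { x: number; y: number; z: number };

// Returns a 0..1 brightness level per LED for a sweep travelling along `dir`.
// `progress` is 0..1: how far the wavefront has moved through the case.
function sweepLevels(positions: Vec3[], dir: Vec3, progress: number): number[] {
  const dot = (a: Vec3, b: Vec3) => a.x * b.x + a.y * b.y + a.z * b.z;
  const d = positions.map((p) => dot(p, dir)); // distance of each LED along the axis
  const min = Math.min(...d);
  const max = Math.max(...d);
  const front = min + progress * (max - min);  // wavefront position in space
  const fade = (max - min) * 0.1 + 1e-9;       // short fade zone ahead of the front
  return d.map((v) => (v <= front ? 1 : Math.max(0, 1 - (v - front) / fade)));
}
```

The same projection trick generalizes to ripple (distance from a point) and fill (threshold on one axis), which is why spatial data makes every effect direction-aware for free.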
- 3D spatial editor in the dashboard (drag-and-drop devices into a viewport)
- Layout save/load
- Photo-match dialog (upload a photo of your case)
- AI keys management in settings (BYO Anthropic / OpenAI / Gemini)
- Per-LED position maps inside each device
- Vision-model auto-tagging of devices in photos
- Spatial effect primitives (cascade, ripple, fill, sweep) — direction-aware
- "Spatial mode" toggle on every existing effect
- Profile sync across machines
- Community effect library (one-click install)
- Discord / Twitch event hooks
- Public REST API for third-party apps
With a stable plugin interface and real spatial data, an AI can write real effects from a prompt.
- "Generate with AI" button on the Effects page
- Prompt → TypeScript `tick()` function → sandboxed in the runtime → live in your library
- Regenerate / explain / edit / share buttons
- Example prompts: "waterfall", "fire", "matrix code rain", "police lights", "heartbeat", "northern lights"
- Full natural-language scene control ("dim everything except the keyboard, make it red")
- Contextual automation (active app, time of day, calendar)
- Mood-to-lighting from music / weather / system state
- Live preview: "show me my setup with an ocean theme"
The full plan, including rationale and implementation notes, lives in ROADMAP.md.
```
GET  /api/devices
POST /api/devices/:id/color   {"r":255,"g":0,"b":0}
POST /api/devices/:id/leds    {"leds":[{"index":0,"r":255,"g":0,"b":0}]}
GET  /api/effects
POST /api/effects/:id/apply
GET  /api/scenes
WS   /ws                      ← live state stream
```

Device IDs follow the pattern `openrgb-N` (e.g. `openrgb-0`).
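Any script can drive these endpoints with plain `fetch` (global in Node 18+). A minimal client sketch — the endpoint paths and body shape come from the list above; the helper names and error handling are ours:

```typescript
// Base URL of the core service, per the README (:7777).
const BASE = "http://localhost:7777";

// Build the JSON body for POST /api/devices/:id/color.
function colorBody(r: number, g: number, b: number): string {
  return JSON.stringify({ r, g, b });
}

// Set a whole device to one color. Assumes the core service is running.
async function setDeviceColor(id: string, r: number, g: number, b: number): Promise<void> {
  const res = await fetch(`${BASE}/api/devices/${id}/color`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: colorBody(r, g, b),
  });
  if (!res.ok) throw new Error(`core returned ${res.status}`);
}

// Usage (device IDs follow the openrgb-N pattern):
// await setDeviceColor("openrgb-0", 255, 0, 0);
```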
This is the part we care about most. SynapseRGB is meant to be a community platform, not a product. The roadmap above is a direction, not a contract — if you want to take it somewhere else, open an issue and let's talk.
- 🔌 Add a hardware driver — Corsair, Razer, Logitech, NZXT, Cooler Master, anything OpenRGB doesn't already cover.
- 🎨 Write an effect plugin — drop a `.js` file in `plugins/` and it shows up in the dashboard.
- 🧪 Port to Linux/macOS — the core is Rust and should work; we just need testers.
- 🖼️ Improve the spatial editor — better gizmos, snapping, alignment, presets.
- 🤖 Vision prompts for photo-to-layout — help us tune the prompts that turn a photo of your case into a layout.
- 📚 Docs and examples — every effect we write is one fewer "how do I…" issue.
- 🐛 File issues — even just "this didn't work on my hardware" is valuable.
- Rust: standard `rustfmt`, async/await with Tokio, errors as `Result<T, String>`.
- TypeScript/React: Vite + Tailwind, functional components, no class components.
- Commits: short imperative — `Add Corsair K70 driver`, `Fix wave effect on 1-LED devices`.
- Don't break the OpenRGB bridge protocol invariants — they're documented in CLAUDE.md and `core/src/hardware/openrgb_bridge.rs`: persistent TCP, split header/data writes, exact `UpdateLEDs` packet format. These rules exist because the OpenRGB server silently drops packets that violate them.
```typescript
import type { SynapseEffect, EffectContext, DeviceFrame } from "synapse-sdk";

export const effect: SynapseEffect = {
  name: "my-effect",
  tick(ctx: EffectContext): DeviceFrame {
    // ctx.time, ctx.devices, ctx.layout (3D positions), ctx.audio, ctx.screen
    // return per-LED colors
  },
};
```

Drop the file in `plugins/`, refresh the dashboard, done.
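For a concrete feel of what a `tick()` body computes, here is a self-contained breathing effect. The `RGB` and `DeviceFrame` shapes below are hypothetical stand-ins — the real types live in `sdk/` — so treat this as a sketch of the pattern, not the exact interface:

```typescript
// Hypothetical shapes for illustration; the real definitions are in sdk/.
type RGB = { r: number; g: number; b: number };
type DeviceFrame = Record<string, RGB[]>; // device id -> per-LED colors

// All LEDs pulse red on a 3-second sine cycle. `time` is seconds,
// `deviceLedCounts` maps device ids (openrgb-N) to their LED counts.
function breathingFrame(time: number, deviceLedCounts: Record<string, number>): DeviceFrame {
  const level = Math.round(127.5 * (1 + Math.sin((time * 2 * Math.PI) / 3)));
  const frame: DeviceFrame = {};
  for (const [id, count] of Object.entries(deviceLedCounts)) {
    frame[id] = Array.from({ length: count }, () => ({ r: level, g: 0, b: 0 }));
  }
  return frame;
}
```

A real plugin would do this work inside `tick(ctx)`, reading the time and device list from `ctx` instead of taking them as parameters.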
- Issues / feature requests: GitHub Issues
- Show & tell: post your setup, your layouts, your effects — we want to see them
- Discussions: GitHub Discussions for design questions, hardware compatibility, AI prompt sharing
If SynapseRGB controls your hardware, add it to the compatibility list via PR. That's how the project grows.
MIT. Fork it, ship it, sell it, embed it. Just don't close it back up.