Satori is a decentralized data stream network built on Nostr. Neurons (clients) discover, subscribe to, predict, and publish data streams across a network of Nostr relays.
```
┌──────────────┐      ┌──────────────┐      ┌──────────────┐
│   Neuron A   │────▶│  Nostr Relay │◀────│   Neuron B   │
│ (publisher)  │      │   (strfry)   │      │ (subscriber) │
└──────────────┘      └──────────────┘      └──────────────┘
        │                    ▲                     │
        │             ┌──────┴──────┐              │
        └────────────▶│   Relay 2   │◀─────────────┘
                      └─────────────┘
```
- Neurons connect to multiple relays simultaneously
- Relays store and forward Nostr events (parameterized replaceable)
- Central server provides relay discovery (peers table), but neurons also maintain a local relay list
- All data flows through Nostr — the central server never sees stream data
All Satori events are parameterized replaceable (kinds 34600–34603). Each event includes a `d` tag set to the stream_name, so the relay keeps only the latest event per (pubkey, kind, stream_name).
| Kind | Name | Content | d-tag | Notes |
|---|---|---|---|---|
| 34600 | Datastream Announcement | Stream metadata (JSON) | stream_name | Public, discoverable. One per stream per publisher. |
| 34601 | Observation Data | Observation value (JSON or encrypted) | stream_name | Relay keeps latest value per stream. Free streams are plaintext; paid streams are encrypted per-subscriber. |
| 34602 | Subscription Announce | Subscription info (JSON) | stream_name | Public. Unsubscribe replaces subscribe (same d-tag). |
| 34603 | Payment Notification | Payment proof (encrypted) | stream_name | Encrypted DM to provider. |
All events include:
- `["d", stream_name]` — parameterized replaceable identifier
- `["stream", stream_name]` — stream identifier
- `["satori", "..."]` — Satori protocol marker
Observations also include:
- `["seq", "123"]` — sequence number
Announcements may include:
- `["t", "bitcoin"]` — topic tags for discovery
- `["source_stream", "btc-price"]` — source lineage (for prediction streams)
- `["source_pubkey", "abc123"]` — source provider pubkey
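Putting kinds and tags together, an unsigned observation event might be assembled as follows. This is a sketch, not the satorilib API: the field layout follows NIP-01, the `build_observation_event` helper is hypothetical, and the `"v1"` marker value is an assumption.

```python
import json
import time

def build_observation_event(pubkey: str, stream_name: str, seq: int, value: float) -> dict:
    """Sketch of an unsigned kind-34601 observation event (NIP-01 layout).

    Because the event carries a ["d", stream_name] tag, a relay keeps only
    the latest event per (pubkey, kind, stream_name).
    """
    return {
        "pubkey": pubkey,
        "created_at": int(time.time()),
        "kind": 34601,                   # Observation Data
        "tags": [
            ["d", stream_name],          # parameterized replaceable identifier
            ["stream", stream_name],     # stream identifier
            ["satori", "v1"],            # Satori protocol marker (value assumed)
            ["seq", str(seq)],           # publisher sequence number
        ],
        "content": json.dumps({"value": value}),
    }

event = build_observation_event("abc123", "btc-price", 42, 67000.5)
```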
Six local SQLite tables managed by NetworkDB:
Streams we're consuming from other publishers.
| Column | Type | Notes |
|---|---|---|
| stream_name | TEXT | Stream identifier |
| relay_url | TEXT | Where we found it |
| provider_pubkey | TEXT | Publisher's Nostr pubkey |
| name, description | TEXT | Display metadata |
| cadence_seconds | INTEGER | Expected interval between observations |
| price_per_obs | INTEGER | 0 = free |
| encrypted | INTEGER | 0 = plaintext |
| tags | TEXT | JSON array of topic tags |
| active | INTEGER | 1 = subscribed, 0 = unsubscribed |
| stale_since | INTEGER | When we last detected staleness |
| UNIQUE | (stream_name, provider_pubkey) | One row per stream/provider pair |
Received data points from subscribed streams.
| Column | Type | Notes |
|---|---|---|
| stream_name | TEXT | Which stream |
| provider_pubkey | TEXT | Who published it |
| seq_num | INTEGER | Publisher's sequence number |
| observed_at | INTEGER | Publisher's reported timestamp |
| received_at | INTEGER | Our clock when we got it |
| value | TEXT | The observation data |
| event_id | TEXT | Nostr event hash (used for dedup) |
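Dedup by `event_id` can be enforced at the schema level. A minimal sketch (column set abridged from the table above; not the actual NetworkDB code):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE observations (
        stream_name     TEXT,
        provider_pubkey TEXT,
        seq_num         INTEGER,
        received_at     INTEGER,
        value           TEXT,
        event_id        TEXT UNIQUE    -- Nostr event hash; dedup key
    )
""")

def save_observation(row: tuple) -> bool:
    """INSERT OR IGNORE: replaying an event with a known event_id is a no-op."""
    cur = conn.execute(
        "INSERT OR IGNORE INTO observations "
        "(stream_name, provider_pubkey, seq_num, received_at, value, event_id) "
        "VALUES (?, ?, ?, ?, ?, ?)", row)
    conn.commit()
    return cur.rowcount == 1           # True only if the row was new

row = ("btc-price", "abc123", 1, 1700000000, "67000.5", "ev1")
first = save_observation(row)    # new row
second = save_observation(row)   # duplicate event_id, ignored
```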
Predictions generated by the engine for subscribed streams.
| Column | Type | Notes |
|---|---|---|
| stream_name | TEXT | Source stream being predicted |
| provider_pubkey | TEXT | Source stream's publisher |
| observation_seq | INTEGER | Which observation triggered this |
| value | TEXT | Predicted value |
| observed_at | INTEGER | Source observation timestamp |
| created_at | INTEGER | When prediction was made |
| published | INTEGER | 1 = broadcast to network |
Streams we're publishing (predictions or original data sources).
| Column | Type | Notes |
|---|---|---|
| stream_name | TEXT UNIQUE | Our stream identifier |
| source_stream_name | TEXT | If prediction: source subscription stream |
| source_provider_pubkey | TEXT | If prediction: source publisher |
| name, description | TEXT | Display metadata |
| cadence_seconds | INTEGER | NULL or 0 = no schedule (externally fed) |
| price_per_obs | INTEGER | 0 = free |
| encrypted | INTEGER | 0 = plaintext |
| last_published_at | INTEGER | Timestamp of last publish |
| last_seq_num | INTEGER | Auto-incrementing sequence counter |
Configuration for auto-fetched external data.
| Column | Type | Notes |
|---|---|---|
| stream_name | TEXT UNIQUE | Matches publication stream_name |
| url | TEXT | Fetch URL (empty = externally fed via API) |
| method | TEXT | GET or POST |
| headers | TEXT | JSON object of HTTP headers |
| cadence_seconds | INTEGER | 0 = no auto-fetch |
| parser_type | TEXT | json_path or python |
| parser_config | TEXT | Dot-notation path or Python code |
Known Nostr relays.
| Column | Type | Notes |
|---|---|---|
| relay_url | TEXT UNIQUE | WebSocket URL |
| first_seen | INTEGER | When we first learned of it |
| last_active | INTEGER | Last successful interaction |
Relay operators earn rewards for running reliable relays on the Satori network. To register your relay:
1. Clone and run the relay:

   ```bash
   git clone https://github.com/SatoriNetwork/satori-relay.git
   cd satori-relay
   ```

2. Get your Nostr public key from the Relay Settings card on the neuron dashboard. Copy it with the copy button.

3. Configure the relay — paste your Nostr public key into the relay's `.env` file:

   ```
   NOSTR_PUBKEY=your_hex_pubkey_here
   RELAY_DOMAIN=your-relay.example.com
   ```

   This sets up NIP-11 so the relay's information document includes a `self` field matching your pubkey, which proves you own the relay.

4. Start the relay:

   ```bash
   docker compose up -d
   ```

   The relay will be available at `wss://your-relay.example.com`. Make sure port 443 is open and your domain points to the server.

5. Register with the network — back on the neuron dashboard, enter your relay URL (`wss://your-relay.example.com`) in the Relay Settings card and click Register Relay. The central server will:
   - Fetch the relay's NIP-11 information document
   - Verify the `self` field matches your neuron's Nostr pubkey
   - Register the relay so other neurons can discover it

   The status indicator shows "Relay verified and registered" on success, or an error message if verification fails.

Once registered, other neurons will discover your relay through the central server's relay list and connect to it for stream discovery and data exchange.
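The server-side ownership check can be sketched roughly as below. This is an illustration, not the central server's code: the `verify_relay` and `pubkey_matches` helpers are hypothetical; only the NIP-11 mechanics (HTTP fetch with `Accept: application/nostr+json`, then compare the `self` field) come from the steps above.

```python
import json
import urllib.request

def pubkey_matches(info: dict, expected_pubkey: str) -> bool:
    """Check the NIP-11 document's 'self' field against the neuron's pubkey."""
    return info.get("self") == expected_pubkey

def verify_relay(relay_url: str, expected_pubkey: str) -> bool:
    """Fetch the relay's NIP-11 information document and verify ownership.

    NIP-11 serves the JSON document at the relay's HTTP(S) URL when the
    client sends Accept: application/nostr+json.
    """
    http_url = relay_url.replace("wss://", "https://", 1).replace("ws://", "http://", 1)
    req = urllib.request.Request(http_url, headers={"Accept": "application/nostr+json"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        info = json.load(resp)
    return pubkey_matches(info, expected_pubkey)
```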
Neurons discover streams by connecting to relays and querying for kind 34600 (announcement) events. For each discovered stream, the neuron:
- Fetches the latest observation (kind 34601) from the relay
- Checks freshness using `is_likely_active()` — compares observation age to cadence
- Saves the observation to the local DB if it's new (dedup by `event_id`)
- Displays the stream in the UI with active/inactive status
Streams with no cadence are always considered active.
Clicking Subscribe in the relay drawer:
- Saves the subscription to the local DB
- Starts a live listener on that relay
- The listener receives observations in real-time and saves them
Clicking Predict on a subscribed stream:
- Creates a publication named `{stream_name}_pred` linked to the source
- The mock engine runs on each received observation and echoes the value as a prediction
- Predictions are saved to the DB and published to all connected relays
Predicting is decoupled from subscribing — you can subscribe without predicting.
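The mock engine amounts to an echo. A sketch (the `mock_predict` helper is hypothetical; the record fields mirror the predictions table above, where a real model would replace the echo):

```python
import time

def mock_predict(observation: dict) -> dict:
    """Echo engine: predict that the next observation equals the current one."""
    return {
        "stream_name": observation["stream_name"],
        "provider_pubkey": observation["provider_pubkey"],
        "observation_seq": observation["seq_num"],   # which observation triggered this
        "value": observation["value"],               # echoed as the prediction
        "observed_at": observation["observed_at"],
        "created_at": int(time.time()),
        "published": 0,                              # set to 1 after broadcast
    }

pred = mock_predict({"stream_name": "btc-price", "provider_pubkey": "abc123",
                     "seq_num": 7, "value": "67000.5", "observed_at": 1700000000})
publication_name = f"{pred['stream_name']}_pred"     # "btc-price_pred"
```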
Two ways to publish original data:
Auto-fetched: Create a data source with a URL, parser, and cadence. The fetch loop polls it every 5 minutes, checks if it's due based on cadence, fetches the URL, runs the parser, and publishes the extracted value.
Externally fed: Create a data source with no URL. Push data via the API:
```http
POST /api/network/publish
Content-Type: application/json

{"stream_name": "my-stream", "value": "42.5"}
```
JSON Path (`json_path`): Dot-notation traversal. Example: `data.markets.0.price` traverses `{"data": {"markets": [{"price": 42.5}]}}` → `42.5`.
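The dot-notation traversal fits in a few lines; a sketch of the idea, not the parser's actual code:

```python
def resolve_json_path(obj, path: str):
    """Dot-notation traversal: each segment is a dict key, or a list index
    when the current node is a list (e.g. 'data.markets.0.price')."""
    for segment in path.split("."):
        if isinstance(obj, list):
            obj = obj[int(segment)]
        else:
            obj = obj[segment]
    return obj

doc = {"data": {"markets": [{"price": 42.5}]}}
resolve_json_path(doc, "data.markets.0.price")  # → 42.5
```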
Python (python): Arbitrary code receiving text (raw response body), returning the extracted value. Example:
import json
data = json.loads(text)
return str(data['price'] * 100)The reconcile loop (every 5 minutes):
- Checks local observations for each subscription against cadence
- If stale locally: hunts across other relays for the stream
- If found on another relay: migrates the subscription
- If not found anywhere: marks the stream as stale
Streams with no cadence (None or 0) are never considered stale.
Relays come from two sources:
- Central server: peers table provides registered relay URLs
- Local: user adds relays manually via the UI
Both are merged. The neuron connects to relays as needed for subscriptions and publishing.
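The merge is an order-preserving dedup of the two lists; a minimal sketch (helper name assumed, server list first):

```python
def merged_relays(server_relays: list[str], local_relays: list[str]) -> list[str]:
    """Merge central-server and locally added relay URLs, preserving
    order and dropping duplicates."""
    seen: set[str] = set()
    merged: list[str] = []
    for url in server_relays + local_relays:
        if url not in seen:
            seen.add(url)
            merged.append(url)
    return merged
```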
| Method | Path | Description |
|---|---|---|
| GET | `/api/network/subscriptions` | List active subscriptions (`?all=1` for inactive too) |
| POST | `/api/network/subscribe` | Subscribe to a stream |
| POST | `/api/network/unsubscribe` | Unsubscribe from a stream |
| GET | `/api/network/observations` | Get observations + predictions for a stream |
| Method | Path | Description |
|---|---|---|
| GET | `/api/network/publications` | List active publications (`?all=1` for inactive too) |
| POST | `/api/network/predict` | Start predicting a subscribed stream |
| POST | `/api/network/stop-predict` | Stop predicting |
| POST | `/api/network/publish` | Push a value to an existing publication |
| Method | Path | Description |
|---|---|---|
| GET | `/api/network/data-source` | Get data source config by `?stream_name=` |
| POST | `/api/network/data-source` | Create or update a data source + publication |
| POST | `/api/network/data-source/test` | Test fetch + parse without saving |
| Method | Path | Description |
|---|---|---|
| GET | `/api/network/relays` | List all relays (merged server + local) |
| POST | `/api/network/relay` | Add a relay URL to the local DB |
| DELETE | `/api/network/relay` | Remove a local relay |
| GET | `/api/network/streams/relay` | Discover streams on a specific relay |
The dashboard provides:
- Subscriptions table: stream name (clickable for data drawer), relay, cadence, status badge, Unsubscribe/Predict/Stop Predicting buttons
- Observations drawer: Chart.js line chart (blue = observations, orange dashed = predictions shifted forward one cadence), tabs for raw observation and prediction tables
- Publications table: stream name (clickable), source linkage, cadence, seq count, status. Clicking a data source publication opens the edit form; clicking a prediction publication opens the data drawer.
- Data source form: stream name, URL (optional), method, cadence (optional, min 15 min), headers, parser type + config, Test button with interactive JSON tree, CSV bulk upload
- Relays table: URL, last active, source badge, delete button for local relays. Clicking opens discovery drawer with Subscribe/Unsubscribe per stream.
| File | Description |
|---|---|
| `neuron-lite/start.py` | Main startup, reconcile loop, discovery, listeners, fetch loop, engine |
| `neuron-lite/satorineuron/network_db.py` | SQLite storage, all 6 tables, queries |
| `web/routes.py` | Flask API endpoints |
| `web/templates/dashboard.html` | Dashboard UI |
| `satorilib/src/satorilib/satori_nostr/client.py` | Nostr client (connect, publish, subscribe, discover) |
| `satorilib/src/satorilib/satori_nostr/models.py` | Data models, event kind constants |