ApexWatch is a comprehensive, AI-powered cryptocurrency token monitoring system built with a microservices architecture. It monitors blockchain wallets, exchange markets, and news sources to provide real-time insights and AI-driven analysis using Large Language Models (LLMs).
- 🐋 **Automated Wallet Discovery** - Automatically identifies and tracks "whale" wallets based on transaction volumes
- 💹 **Multi-Exchange Monitoring** - Tracks prices, volumes, and anomalies across multiple exchanges
- 📰 **AI News Analysis** - Aggregates and filters crypto news with sentiment analysis
- 🤖 **LLM-Powered Insights** - Sequential event processing with contextual AI analysis
- 📊 **Real-Time Dashboard** - Streamlit-based UI with live data visualization
- 🐳 **Fully Containerized** - Complete Docker orchestration with all dependencies
- 🔄 **Scalable Architecture** - Microservices design for easy scaling and maintenance
```
┌───────────────────────────────────────────────────────────────┐
│                     Dashboard (Streamlit)                     │
│                 JWT Auth • Real-time Charts                   │
└───────────────────────────────┬───────────────────────────────┘
                                │
                                ▼
┌───────────────────────────────────────────────────────────────┐
│                         Core Service                          │
│          Event Queue • LLM Processing • Context Mgmt          │
└────────┬───────────────┬───────────────┬───────────┬──────────┘
         ▼               ▼               ▼           ▼
   ┌──────────┐    ┌──────────┐    ┌──────────┐  ┌──────────┐
   │  Wallet  │    │ Exchange │    │   News   │  │ RabbitMQ │
   │ Monitor  │    │ Monitor  │    │ Monitor  │  │  Queue   │
   └────┬─────┘    └────┬─────┘    └────┬─────┘  └──────────┘
        │               │               │
        └───────────────┼───────────────┘
                        │
        ┌───────────────┼───────────────┐
        ▼               ▼               ▼
  ┌──────────┐     ┌─────────┐    ┌────────────┐
  │PostgreSQL│     │  Redis  │    │ ClickHouse │
  └──────────┘     └─────────┘    └────────────┘
```
- Core Service - Central brain orchestrating event processing and LLM analysis
- Wallet Monitor - Tracks blockchain transfers and discovers whale wallets
- Exchange Monitor - Monitors market data across exchanges using CCXT
- News Monitor - Aggregates news with NLP filtering and sentiment analysis
- Dashboard - User interface for monitoring and configuration
- PostgreSQL - Stores static data, configurations, and analytics
- Redis - Caches real-time context for fast access
- ClickHouse - Time-series storage for LLM thought history
- RabbitMQ - Persistent message queue for sequential event processing
- ELK Stack - Centralized logging (Elasticsearch, Logstash, Kibana)
- Ollama - Local LLM service with fallback to OpenAI
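The sequential pipeline at the heart of the Core Service can be sketched in a few lines. This is a minimal illustration with hypothetical names: the real implementation consumes RabbitMQ rather than an in-memory queue, and `process_event` stands in for the actual LLM call.

```python
import queue

def process_event(event: dict, context: dict) -> str:
    """Stand-in for the LLM analysis step: fold the event into the
    running context and return a 'thought' string."""
    context.setdefault(event["type"], []).append(event["data"])
    return f"Analyzed {event['type']}; context now holds {len(context[event['type']])} event(s)"

def run_once(q: "queue.Queue", context: dict) -> list:
    """Drain the queue strictly one event at a time, so each analysis
    sees the context built up by all earlier events."""
    thoughts = []
    while not q.empty():
        event = q.get()
        thoughts.append(process_event(event, context))
        q.task_done()
    return thoughts

q = queue.Queue()
q.put({"type": "price_change", "data": {"change_percent": 5.0}})
q.put({"type": "price_change", "data": {"change_percent": -2.0}})
context = {}
thoughts = run_once(q, context)
```

The point of the sequential design is exactly this: the second event is analyzed with the first already in context, which is what makes the AI commentary contextual rather than stateless.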
- Docker Engine 20.10+
- Docker Compose 2.0+
- 8GB RAM minimum (16GB recommended)
- 20GB free disk space
- (Optional) Alchemy API key for Ethereum monitoring
- (Optional) OpenAI API key for LLM fallback
```bash
git clone <your-repo-url>
cd ApexWatch
```

```bash
# Copy environment template
cp .env.template .env

# Edit .env with your API keys
nano .env  # or your preferred editor
```

Edit `.env` and set:

```bash
# Required for Ethereum monitoring
ALCHEMY_API_KEY=your_alchemy_api_key

# Optional for LLM fallback
OPENAI_API_KEY=your_openai_api_key

# Change these secrets in production!
ACCESS_KEY=your-secure-access-key
JWT_SECRET_KEY=your-secure-jwt-secret
```

```bash
# Build and start all services
docker-compose up -d

# Check service status
docker-compose ps

# View logs
docker-compose logs -f
```

```bash
# Pull the Llama 3 model
docker exec -it apexwatch-ollama ollama pull llama3

# Or use a smaller model
docker exec -it apexwatch-ollama ollama pull llama3:8b
```

Open your browser and navigate to http://localhost:8501.

Default credentials:
- Username: `admin`
- Password: `admin123`
| Service | Port | URL | Description |
|---|---|---|---|
| Dashboard | 8501 | http://localhost:8501 | Web UI |
| Core Service | 8000 | http://localhost:8000 | Main API |
| Wallet Monitor | 8001 | http://localhost:8001 | Wallet API |
| Exchange Monitor | 8002 | http://localhost:8002 | Market API |
| News Monitor | 8003 | http://localhost:8003 | News API |
| PostgreSQL | 5432 | localhost:5432 | Database |
| Redis | 6379 | localhost:6379 | Cache |
| ClickHouse | 8123 | http://localhost:8123 | Analytics DB |
| RabbitMQ | 15672 | http://localhost:15672 | Queue Management |
| Kibana | 5601 | http://localhost:5601 | Log Viewer |
- Real-time queue status
- Current token prices across exchanges
- Watched wallet count
- Recent news count
- Price history charts
- Latest AI thoughts
- Top whale wallets
- Transaction history
- Balance tracking
- Auto-discovered wallets
- Multi-exchange price comparison
- Historical price charts
- Volume analysis
- Configurable time ranges
- Filtered crypto news
- Relevance scoring
- Sentiment analysis
- Source tracking
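Relevance scoring can be illustrated with a simple keyword heuristic. This sketch is not the News Monitor's actual NLP pipeline; it only shows the shape of a headline-to-score function, with made-up keyword sets.

```python
def relevance_score(headline: str, tracked_symbols: set,
                    keywords: frozenset = frozenset({"crypto", "token", "blockchain"})) -> float:
    """Score 0..1: count of tracked symbols and generic crypto keywords
    appearing in the headline (case-insensitive), capped at 1.0."""
    words = set(headline.lower().split())
    hits = sum(1 for s in tracked_symbols if s.lower() in words)
    hits += sum(1 for k in keywords if k in words)
    return min(1.0, hits / 3)

score = relevance_score("USDT token reserves audited on blockchain", {"USDT", "BTC"})
```

A production scorer would also weight by source quality and feed the result into the sentiment analysis step; this version exists only to make "relevance scoring" concrete.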
- Token configuration
- Monitoring thresholds
- System status
- Parameter tuning
All internal service-to-service communication is protected by the `X-Access-Key` header. Set a strong access key in `.env`:

```bash
ACCESS_KEY=your-very-strong-secret-key-here
```

The dashboard uses JWT-based authentication. Configure a strong secret:

```bash
JWT_SECRET_KEY=your-jwt-secret-key-here
```

Default PostgreSQL credentials are included for development. In production:
- Change database passwords in `.env`
- Use PostgreSQL's `pg_hba.conf` for access control
- Enable SSL connections
- Implement backup strategies
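To make the JWT mechanism concrete, here is a stdlib-only sketch of JWT-style signing and verification. It is an illustration of the concept, not the dashboard's actual implementation (which would typically use a JWT library); the payload fields are assumptions.

```python
import base64
import hashlib
import hmac
import json
import time

def sign_token(payload: dict, secret: str) -> str:
    """Issue a JWT-style token: urlsafe-base64(payload) + HMAC-SHA256 signature."""
    body = base64.urlsafe_b64encode(json.dumps(payload).encode()).decode()
    sig = hmac.new(secret.encode(), body.encode(), hashlib.sha256).hexdigest()
    return f"{body}.{sig}"

def verify_token(token: str, secret: str):
    """Return the payload if the signature checks out and the token has
    not expired; otherwise return None."""
    body, sig = token.rsplit(".", 1)
    expected = hmac.new(secret.encode(), body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None
    payload = json.loads(base64.urlsafe_b64decode(body))
    if payload.get("exp", 0) < time.time():
        return None
    return payload

token = sign_token({"sub": "admin", "exp": time.time() + 3600}, "your-jwt-secret-key-here")
claims = verify_token(token, "your-jwt-secret-key-here")
```

Two properties matter here regardless of library: a tampered or wrong-secret token must verify to nothing, and an expired `exp` claim must be rejected.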
- Navigate to Settings → Tokens in the dashboard
- Enter token details:
  - Symbol (e.g., USDT)
  - Name (e.g., Tether USD)
  - Contract Address
  - Chain (ethereum, polygon, bsc)
  - Decimals (usually 18 for ERC-20; USDT uses 6)

Or add directly to PostgreSQL:

```sql
INSERT INTO tokens (symbol, name, contract_address, chain, decimals, is_active)
VALUES ('USDT', 'Tether USD', '0xdac17f958d2ee523a2206206994597c13d831ec7', 'ethereum', 6, TRUE);
```

Use the Settings → Monitoring Settings page to configure:
- Minimum Transfer Amount - Lower bound for wallet monitoring
- Maximum Transfer Amount - Upper bound for tracking
- Price Change Threshold (%) - Price movement to trigger alerts
- Volume Spike Threshold (%) - Volume increase to trigger events
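The two percentage thresholds translate into simple checks. A sketch (function names are illustrative; the real thresholds come from the Settings page, not hard-coded values):

```python
def price_change_percent(old_price: float, new_price: float) -> float:
    """Signed percentage move from old_price to new_price."""
    return (new_price - old_price) / old_price * 100.0

def should_emit_price_event(old_price: float, new_price: float,
                            threshold_pct: float) -> bool:
    """True when the absolute price move meets the configured threshold."""
    return abs(price_change_percent(old_price, new_price)) >= threshold_pct

def should_emit_volume_event(current_vol: float, baseline_vol: float,
                             spike_pct: float) -> bool:
    """True when volume grew by at least spike_pct over the baseline."""
    if baseline_vol <= 0:
        return False
    return (current_vol - baseline_vol) / baseline_vol * 100.0 >= spike_pct
```

For example, with a 5% price threshold, a move from 1.00 to 1.05 fires an event while a move to 1.01 does not.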
Add news sources to PostgreSQL:

```sql
INSERT INTO news_sources (name, url, source_type, is_active)
VALUES ('CryptoNews', 'https://cryptonews.com/news/feed/', 'rss', TRUE);
```

Configure exchanges in PostgreSQL:

```sql
INSERT INTO exchange_configs (exchange_name, is_active, api_key, api_secret)
VALUES ('binance', TRUE, 'your_api_key', 'your_api_secret');
```

For public endpoints, API keys are optional.
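Once an exchange is configured, the Exchange Monitor reduces ticker data to the `price_change` payload shown in the testing section. This sketch assumes a CCXT-style ticker dict (where `last` is the unified last-price field) and fakes the ticker instead of calling an exchange:

```python
def ticker_to_event(token_id: str, exchange: str, old_price: float, ticker: dict):
    """Turn a CCXT-style ticker into a price_change event payload,
    or None if the move is negligibly small (< 0.01%)."""
    new_price = ticker["last"]
    change = (new_price - old_price) / old_price * 100.0
    if abs(change) < 0.01:
        return None
    return {
        "type": "price_change",
        "data": {
            "token_id": token_id,
            "exchange": exchange,
            "old_price": old_price,
            "new_price": new_price,
            "change_percent": round(change, 2),
        },
    }

# Fake ticker in place of something like ccxt.binance().fetch_ticker(...)
event = ticker_to_event("your-token-id", "binance", 1.0, {"last": 1.05})
```

The resulting dict matches the body the Core Service webhook expects, so the monitor only needs to POST it with the `X-Access-Key` header set.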
```bash
# Check all services
curl http://localhost:8000/health  # Core
curl http://localhost:8001/health  # Wallet Monitor
curl http://localhost:8002/health  # Exchange Monitor
curl http://localhost:8003/health  # News Monitor
```

```bash
# Send a test event to Core Service
curl -X POST http://localhost:8000/api/webhook/event \
  -H "X-Access-Key: your-access-key" \
  -H "Content-Type: application/json" \
  -d '{
    "type": "price_change",
    "data": {
      "token_id": "your-token-id",
      "exchange": "binance",
      "old_price": 1.0,
      "new_price": 1.05,
      "change_percent": 5.0
    }
  }'
```

```bash
# Check the queue status
curl http://localhost:8000/api/queue/status \
  -H "X-Access-Key: your-access-key"

# Fetch recent AI thoughts for a token
curl http://localhost:8000/api/thoughts/YOUR_TOKEN_ID \
  -H "X-Access-Key: your-access-key"
```

```bash
# Check logs
docker-compose logs [service-name]
```

Common issues:
- Port conflicts: change the ports in docker-compose.yml
- Memory: increase the Docker memory allocation
- Database init: ensure the init.sql files are present

```bash
# Pull the model manually
docker exec -it apexwatch-ollama ollama pull llama3

# List available models
docker exec -it apexwatch-ollama ollama list
```

```bash
# Restart databases
docker-compose restart postgres redis clickhouse

# Check if databases are healthy
docker-compose ps

# Verify PostgreSQL initialization
docker exec -it apexwatch-postgres psql -U postgres -d apexwatch -c "\dt"
```

- Check the RabbitMQ queue: http://localhost:15672 (guest/guest)
- Verify peripheral services are sending events
- Check Core Service logs: `docker-compose logs core`
- Ensure the Alchemy API key is set for wallet monitoring

```bash
# Check dashboard logs
docker-compose logs dashboard

# Verify all API endpoints are accessible
curl http://localhost:8000/health
curl http://localhost:8001/health
curl http://localhost:8002/health
curl http://localhost:8003/health
```

Access Kibana at http://localhost:5601.
- Create an index pattern: `apexwatch-logs-*`
- Filter logs by service name
- Set up dashboards for monitoring
Access at http://localhost:15672 (guest/guest)
- View queue sizes
- Monitor message rates
- Check consumer connections
```bash
# Connect to ClickHouse
docker exec -it apexwatch-clickhouse clickhouse-client
```

```sql
-- Query thought history
SELECT event_type, count() AS count
FROM llm_thoughts
WHERE timestamp >= now() - INTERVAL 24 HOUR
GROUP BY event_type;

-- View recent thoughts
SELECT timestamp, event_type, substring(thought, 1, 100) AS thought_preview
FROM llm_thoughts
ORDER BY timestamp DESC
LIMIT 10;
```

```bash
# Backup PostgreSQL
docker exec apexwatch-postgres pg_dump -U postgres apexwatch > backup_$(date +%Y%m%d).sql

# Backup ClickHouse
docker exec apexwatch-clickhouse clickhouse-client --query "BACKUP DATABASE apexwatch TO Disk('backups', 'backup_$(date +%Y%m%d)')"
```

```bash
# Pull latest changes
git pull

# Rebuild and restart
docker-compose down
docker-compose build
docker-compose up -d
```

```sql
-- PostgreSQL: Clear old market data
DELETE FROM market_data WHERE timestamp < NOW() - INTERVAL '30 days';

-- PostgreSQL: Clear old news
DELETE FROM news_articles WHERE published_at < NOW() - INTERVAL '60 days';
```

ClickHouse has automatic TTL configured in the schema.
```
ApexWatch/
├── services/
│   ├── core/                 # Core service
│   │   ├── main.py
│   │   ├── config.py
│   │   ├── database.py
│   │   ├── llm.py
│   │   ├── queue.py
│   │   ├── processor.py
│   │   └── Dockerfile
│   ├── wallet_monitor/       # Wallet monitoring
│   ├── exchange_monitor/     # Exchange monitoring
│   ├── news_monitor/         # News monitoring
│   └── dashboard/            # Dashboard UI
├── database/
│   ├── postgres/
│   │   └── init.sql
│   └── clickhouse/
│       └── init.sql
├── config/
│   └── logstash.conf
├── docker-compose.yml
├── .env.template
└── README.md
```
- Create a new directory under `services/`
- Implement a FastAPI service with the monitoring logic
- Add webhook calls to the Core Service
- Create a Dockerfile
- Add the service to docker-compose.yml
- Update the network configuration
Modify `services/core/llm.py`:
- Add new prompt templates
- Implement custom analysis functions
- Add additional LLM providers
Add queries to `services/core/processor.py`:
- Calculate custom metrics
- Store in PostgreSQL or ClickHouse
- Expose via API endpoints
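As an example of a custom metric, here is a rolling-baseline volume spike detector. The class name and window size are illustrative; the result would be stored in PostgreSQL or ClickHouse and exposed via an API endpoint as described above.

```python
from collections import deque

class VolumeSpikeMetric:
    """Flags a spike when the newest volume exceeds the rolling mean
    of the previous `window` samples by at least `spike_pct` percent."""

    def __init__(self, window: int = 24, spike_pct: float = 100.0):
        self.history = deque(maxlen=window)
        self.spike_pct = spike_pct

    def update(self, volume: float) -> bool:
        """Record a new volume sample; return True if it is a spike
        relative to the history seen so far."""
        spike = False
        if self.history:
            baseline = sum(self.history) / len(self.history)
            spike = baseline > 0 and (volume - baseline) / baseline * 100.0 >= self.spike_pct
        self.history.append(volume)
        return spike

m = VolumeSpikeMetric(window=3, spike_pct=100.0)
signals = [m.update(v) for v in [100, 110, 90, 400]]
```

A fixed threshold fires on every routine fluctuation in a volatile market; comparing against a rolling baseline makes the metric self-calibrating per token.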
`POST /api/webhook/event`

```json
{
  "type": "wallet_transfer|price_change|volume_spike|news_update",
  "data": { /* event-specific data */ }
}
```

`GET /api/queue/status`

```json
{
  "queue_size": 10,
  "timestamp": "2024-12-24T10:00:00"
}
```

`GET /api/thoughts/{token_id}`

```json
{
  "thoughts": [
    {
      "thought": "Analysis text...",
      "event_type": "price_change",
      "timestamp": "2024-12-24T10:00:00"
    }
  ]
}
```

See the service documentation for the complete API reference.
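A peripheral service sends these events with an authenticated POST. The sketch below only builds the request pieces (the URL, header name, and payload shape match the examples above) so it runs without the stack up; sending is left as a comment.

```python
import json

CORE_URL = "http://localhost:8000"  # Core Service base URL

def build_event_request(access_key: str, event_type: str, data: dict):
    """Return (url, headers, body) for POST /api/webhook/event."""
    url = f"{CORE_URL}/api/webhook/event"
    headers = {"X-Access-Key": access_key, "Content-Type": "application/json"}
    body = json.dumps({"type": event_type, "data": data}).encode()
    return url, headers, body

url, headers, body = build_event_request(
    "your-access-key",
    "price_change",
    {"token_id": "your-token-id", "exchange": "binance",
     "old_price": 1.0, "new_price": 1.05, "change_percent": 5.0},
)
# Send with e.g. urllib.request or httpx once the services are running.
```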
Contributions are welcome! Please:
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests
- Submit a pull request
This project is provided as-is for educational and commercial use.
For issues and questions:
- Check the Troubleshooting section
- Review service logs
- Open an issue on GitHub
- Support for additional blockchains (Polygon, BSC, Solana)
- Advanced ML models for price prediction
- Mobile app for monitoring
- Telegram bot integration
- Advanced alert system
- Multi-user support with permissions
- API rate limiting
- Kubernetes deployment configs
This software is for informational purposes only. Cryptocurrency trading involves risk. Always do your own research and never invest more than you can afford to lose.
Built with ❤️ using Python, FastAPI, Streamlit, and Docker