Self-hosted AI infrastructure: Cursor AI assistant + Conversational AI (OpenWebUI) + RAG + Intelligent Automation (n8n)
A production-ready, self-hosted AI stack that combines:
- 🤖 AI Services: OpenWebUI interface, Qdrant vector DB, n8n automation, Ollama LLM
- 💻 Cursor Integration: MCP-Qdrant for AI-enhanced coding with context awareness
- 📚 Knowledge Base: RAG with document upload in OpenWebUI
- 🔄 Intelligent Workflows: n8n automation platform
- 🚀 One-command deployment: run `./init.sh` and everything works
✅ Cursor AI Enhancement via Model Context Protocol (MCP)
✅ OpenWebUI with RAG for document Q&A
✅ Vector Search with Qdrant (1024-dim embeddings)
✅ Production-ready Docker Compose stack
✅ Automated Workflows with n8n orchestration
✅ Fork-friendly - Clone once, everything works
```bash
# 1. Clone repository
git clone https://github.com/FlowTech-Lab/FlowTech-AI.git
cd FlowTech-AI

# 2. Initialize stack (requires sudo for permissions)
sudo ./init.sh

# ✅ Stack ready! Services available at:
# - OpenWebUI: http://localhost:8081
# - Cursor MCP: http://localhost:8000
# - n8n: http://localhost:5678
# - Qdrant: http://localhost:6333
```

Full guide: See QUICKSTART.md
| Service | Port | Description | Status |
|---|---|---|---|
| OpenWebUI | 8081 | AI chat interface with RAG | ✅ Production |
| MCP-Qdrant | 8000 | Cursor code context (cursor-context) | ✅ Production |
| MCP-Knowledge | 8001 | Cursor notes search (cursor-knowledge) | ✅ Production |
| n8n | 5678 | Workflow automation | ✅ Production |
| Qdrant | 6333 | Vector database | ✅ Production |
| Samba | 445 | Notes share (SMB) | ✅ Production |
| PostgreSQL | 5432 | Metadata storage | ✅ Production |
| Redis | 6379 | Cache & queues | ✅ Production |
| SearxNG | 8082 | Web search engine | ✅ Production |
| Langfuse | 3300 | LLM observability | ✅ Production |
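Once the stack is up, a quick probe of the HTTP ports from the table can confirm that each service answers (a sketch; it only checks that something is listening on each port, not that the service is fully healthy):

```shell
# Probe each service's HTTP port from the table above.
# Prints the HTTP status code, or 000 if nothing is listening.
for url in \
  http://localhost:8081 \
  http://localhost:8000 \
  http://localhost:5678 \
  http://localhost:6333 \
  http://localhost:8082 \
  http://localhost:3300
do
  code=$(curl -s -o /dev/null -w '%{http_code}' --max-time 3 "$url" || true)
  echo "$url -> $code"
done
```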
```bash
# 1. Copy MCP config template
cp cursor-mcp-config.json ~/.cursor/mcp.json

# 2. Edit IP (change to your server IP)
nano ~/.cursor/mcp.json
# Replace 192.168.0.246 with your actual IP

# 3. Restart Cursor
```

Then test it in Cursor:

```
@qdrant store "FlowTech-AI is awesome!"
@qdrant find awesome
```

What you get:
- Store code snippets, notes, and context in Qdrant
- Retrieve information semantically during coding
- AI-enhanced development with persistent memory
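The IP edit in step 2 can also be scripted; a minimal sketch (192.168.0.246 is the template's placeholder, `SERVER_IP` is a variable you set):

```shell
# Copy the template and swap in your server's IP non-interactively.
SERVER_IP=10.0.0.5   # replace with your actual server IP
mkdir -p ~/.cursor
cp cursor-mcp-config.json ~/.cursor/mcp.json
sed -i "s/192\.168\.0\.246/${SERVER_IP}/g" ~/.cursor/mcp.json
```

On macOS, use `sed -i ''` instead of `sed -i`.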
Sync Markdown notes to Cursor:

```bash
# Initial sync
./scripts/sync-notes.sh

# Install hourly auto-sync
./scripts/install-cron.sh
```

Your `Notes/` folder will be automatically synced to Qdrant and searchable in Cursor!

📖 See the Notes Sync Guide for details.
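The cron installer is assumed to add an entry along these lines (a hypothetical example; the actual schedule and paths are defined by `scripts/install-cron.sh`):

```
# Run the notes sync at the top of every hour (path is a placeholder)
0 * * * * /path/to/FlowTech-AI/scripts/sync-notes.sh >> /var/log/notes-sync.log 2>&1
```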
Edit notes from Windows/Mac/Linux:

```
# Windows
\\YOUR_SERVER_IP\notes

# Mac/Linux
smb://YOUR_SERVER_IP/notes
```

Credentials are generated in `.env` during `init.sh`:

- Username: `admin`
- Password: check your `.env` file (`SAMBA_PASSWORD`)

Open the share with Obsidian or any text editor to manage your notes!

📖 See the Samba Windows Guide for connection help.
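On Linux, the share can also be mounted persistently. A sketch of an `/etc/fstab` entry, assuming a mount point of `/mnt/notes` and SMB 3 (replace `YOUR_SERVER_IP` and the password; a credentials file is safer than an inline password):

```
//YOUR_SERVER_IP/notes  /mnt/notes  cifs  username=admin,password=YOUR_SAMBA_PASSWORD,vers=3.0,_netdev  0  0
```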
- Open http://localhost:8081
- Create a new chat
- Click "Knowledge" button
- Upload your documents (PDF, MD, TXT, DOCX, etc.)
- Ask questions about your documents!
Example:
User: What does the API documentation say about authentication?
AI: Based on the uploaded API docs, authentication uses JWT tokens...
- 📄 PDF, DOCX, TXT, MD
- 💻 Code files (PY, JS, TS, etc.)
- 🌐 Websites (via URL)
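Beyond the browser, OpenWebUI also exposes an OpenAI-compatible API, so the same Q&A can be scripted (a sketch: the `/api/chat/completions` route follows OpenWebUI's documented API, but the model name is a placeholder and `OPENWEBUI_API_KEY` is an API key you create under Settings → Account):

```shell
# Ask a question over OpenWebUI's OpenAI-compatible endpoint.
# OPENWEBUI_API_KEY and the model name are placeholders for your setup.
curl -s --max-time 5 http://localhost:8081/api/chat/completions \
  -H "Authorization: Bearer ${OPENWEBUI_API_KEY:-changeme}" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "llama3",
        "messages": [{"role": "user", "content": "What does the API documentation say about authentication?"}]
      }' || echo "OpenWebUI not reachable"
```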
```
┌─────────────────────────────────────────────────────────┐
│                   FlowTech-AI Stack                     │
├─────────────────────────────────────────────────────────┤
│                                                         │
│  ┌──────────────┐    ┌──────────────┐    ┌──────────┐   │
│  │  Cursor IDE  │───▶│  MCP-Qdrant  │───▶│  Qdrant  │   │
│  │  (Dev Tool)  │    │  (Port 8000) │    │  Vector  │   │
│  └──────────────┘    └──────────────┘    │    DB    │   │
│                                          └────┬─────┘   │
│  ┌──────────────┐    ┌──────────────┐         │         │
│  │   Browser    │───▶│  OpenWebUI   │────────▶│         │
│  │              │    │  (Port 8081) │         │         │
│  └──────────────┘    └──────────────┘         │         │
│                                               │         │
│  ┌──────────────┐    ┌──────────────┐         │         │
│  │   n8n Web    │───▶│     n8n      │────────▶│         │
│  │  Interface   │    │  (Port 5678) │         │         │
│  └──────────────┘    └──────────────┘         │         │
│                                               │         │
│  ┌────────────────────────────────────────────┘         │
│  │                                                      │
│  │  ┌──────────┐  ┌──────────┐  ┌──────────┐            │
│  └─▶│  Redis   │  │ Postgres │  │ SearxNG  │            │
│     └──────────┘  └──────────┘  └──────────┘            │
│                                                         │
└───────────────────────────────────────────────────────┘
```
Data Flow:
- Cursor: Store/retrieve code context via MCP → Qdrant
- OpenWebUI: Upload docs → RAG → Qdrant → AI answers
- n8n: Automate workflows, integrate external APIs
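Since both the Cursor and OpenWebUI flows end in Qdrant, the collections they write can be inspected directly over Qdrant's REST API (`GET /collections` is a documented Qdrant endpoint; the guard just makes the command degrade gracefully when the stack is down):

```shell
# List the Qdrant collections that MCP and OpenWebUI write into.
curl -s --max-time 3 http://localhost:6333/collections || echo "Qdrant not reachable"
```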
- Open http://localhost:5678
- Login with credentials from the `.env` file
- Import the workflow from `SRC/FlowTech-AI-Complete-Workflow.json`
- Customize it for your needs
Default: `bge-m3:567m` (Ollama) - multilingual, 1024 dimensions

Change model:

```bash
# Edit .env
RAG_EMBEDDING_MODEL=bge-large:latest  # or another model

# Restart
docker compose restart openwebui
```

Want to sync personal notes? Check the note templates in `Notes/_Templates/`:

- `vm-template.md` - for virtual machines
- `server-template.md` - for servers
- `domain-template.md` - for domains

For automated sync, see the companion repo: Flow-Notes-AI
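To confirm the configured embedding model really produces 1024-dimensional vectors, Ollama's embeddings endpoint can be queried directly (`/api/embeddings` is Ollama's documented API; port 11434 is Ollama's default port and an assumption about how this stack exposes it; `python3` is used only to count the dimensions):

```shell
# Ask Ollama for an embedding and count its dimensions.
curl -s --max-time 10 http://localhost:11434/api/embeddings \
  -d '{"model": "bge-m3:567m", "prompt": "dimension check"}' \
  | python3 -c 'import json,sys; print(len(json.load(sys.stdin)["embedding"]))' \
  || echo "Ollama not reachable"
```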
- QUICKSTART.md - Complete setup guide
- Notes/_Templates/ - Obsidian note templates
- docs/ - Architecture & advanced topics
Contributions are welcome! Please:
- Fork the repo
- Create a feature branch
- Test your changes with `./init.sh`
- Submit a pull request
MIT License - See LICENSE for details
- OpenWebUI - Amazing AI interface
- Cursor - Best AI-powered IDE
- n8n - Powerful automation platform
- Qdrant - High-performance vector database
- Ollama - Local LLM inference
- GitHub: https://github.com/FlowTech-Lab/FlowTech-AI
- Documentation: docs/
- Issues: https://github.com/FlowTech-Lab/FlowTech-AI/issues
Made with ❤️ by the FlowTech-Lab community