Last updated: 2025-10-19
FlowTech-AI is a production-ready, self-hosted AI stack that provides:

- ✅ Cursor AI Enhancement (MCP-Qdrant)
- ✅ Conversational AI (OpenWebUI with RAG)
- ✅ Automation Platform (n8n)
- ✅ Vector Database (Qdrant)
- ✅ One-command deployment (`./init.sh`)

Status: ✅ Production-ready, open-source, fork-friendly
| Service | Port | Description | Status |
|---|---|---|---|
| OpenWebUI | 8081 | AI chat with RAG | ✅ Production |
| MCP-Qdrant | 8000 | Cursor integration | ✅ Production |
| n8n | 5678 | Workflow automation | ✅ Production |
| Qdrant | 6333 | Vector database | ✅ Production |
| PostgreSQL | 5432 | Metadata storage | ✅ Production |
| Redis | 6379 | Cache & queues | ✅ Production |
| SearxNG | 8082 | Web search | ✅ Production |
| Langfuse | 3300 | LLM observability | ✅ Production |
| ClickHouse | 8123 | Analytics | ✅ Production |
| MinIO | 9000 | S3-compatible storage | ✅ Production |
- `Notes/_Templates/` - Generic Obsidian templates for note organization
- `SRC/` - Example n8n workflows
- `cursor-mcp-config.json` - Cursor configuration template
- `README.md` - Main documentation
- `QUICKSTART.md` - 5-minute setup guide
- `docs/` - Technical documentation
- `Notes/README-NOTES.md` - Templates guide
Developer codes in Cursor
  ↓
@qdrant store "code snippet"
  ↓
MCP-Qdrant (port 8000)
  ↓
Qdrant collection: cursor-context
  ↓
@qdrant find "snippet"
  ↓
AI retrieves context

Use case: Store and retrieve code context during development
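The flow above can be sketched at the Qdrant REST level. This is an illustrative sketch, not the MCP-Qdrant implementation: it only builds the request paths and bodies that a store/find round-trip would send to Qdrant's points API, and the all-zeros vector is a stand-in for a real `bge-m3:567m` embedding.

```python
# Sketch of the Qdrant-level requests behind "@qdrant store" / "@qdrant find".
# Paths and payload shapes follow Qdrant's points API; the embedding is fake.
import uuid

COLLECTION = "cursor-context"   # collection used by MCP-Qdrant
DIMENSIONS = 1024               # bge-m3:567m embedding size

def build_store_request(snippet: str, embedding: list[float]) -> tuple[str, dict]:
    """PUT body for storing one snippet as a Qdrant point."""
    assert len(embedding) == DIMENSIONS, "vector must match collection size"
    body = {
        "points": [{
            "id": str(uuid.uuid4()),
            "vector": embedding,
            "payload": {"text": snippet},
        }]
    }
    return f"/collections/{COLLECTION}/points", body

def build_find_request(query_embedding: list[float], limit: int = 5) -> tuple[str, dict]:
    """POST body for a nearest-neighbour search."""
    body = {"vector": query_embedding, "limit": limit, "with_payload": True}
    return f"/collections/{COLLECTION}/points/search", body

fake_vec = [0.0] * DIMENSIONS  # stand-in for a real bge-m3 embedding
store_path, store_body = build_store_request("def add(a, b): return a + b", fake_vec)
find_path, find_body = build_find_request(fake_vec)
print(store_path)  # /collections/cursor-context/points
```

In the real stack the MCP server performs these calls for you; the sketch just shows why the embedding model's 1024 dimensions must match the collection's vector size.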
User uploads documents in OpenWebUI
  ↓
OpenWebUI processes & embeds (bge-m3:567m)
  ↓
Qdrant collection: open-webui_files
  ↓
User asks question
  ↓
RAG retrieves relevant chunks
  ↓
LLM generates answer with context

Use case: Q&A over technical documentation
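The chunk → embed → retrieve steps can be illustrated with a self-contained toy. The real pipeline embeds with `bge-m3:567m` via Ollama and searches Qdrant; here a trivial bag-of-words vector stands in so the example runs stand-alone.

```python
# Toy RAG retrieval: chunk a document, "embed" each chunk, rank by cosine
# similarity to the question. Qdrant does the ranking step at scale.
import math

def chunk(text: str, size: int = 40) -> list[str]:
    """Split a document into fixed-size character chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def embed(text: str, dims: int = 64) -> list[float]:
    """Placeholder embedding: hash words into a fixed-size vector."""
    vec = [0.0] * dims
    for word in text.lower().split():
        vec[hash(word.strip(".,!?")) % dims] += 1.0
    return vec

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question: str, chunks: list[str], top_k: int = 2) -> list[str]:
    """Rank chunks by similarity to the question."""
    q = embed(question)
    ranked = sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)
    return ranked[:top_k]

doc = "Qdrant stores vectors. n8n automates workflows. OpenWebUI hosts chat."
context = retrieve("stores vectors", chunk(doc, size=30))
```

The retrieved `context` chunks are what get prepended to the LLM prompt in the final step of the flow.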
Trigger (webhook, schedule, manual)
  ↓
n8n workflow execution
  ↓
API calls, data processing
  ↓
Results stored or notifications sent

Use case: Automated workflows, integrations
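A minimal workflow following this pattern might look like the fragment below. This is a hypothetical illustration, not the shipped example (`SRC/FlowTech-AI-Complete-Workflow.json` is the full one): the node names, webhook path, and target URL are placeholders, while the node types come from n8n's standard node set.

```json
{
  "name": "Example: webhook to HTTP call",
  "nodes": [
    {
      "name": "Webhook",
      "type": "n8n-nodes-base.webhook",
      "typeVersion": 1,
      "position": [250, 300],
      "parameters": { "path": "example-trigger", "httpMethod": "POST" }
    },
    {
      "name": "HTTP Request",
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 1,
      "position": [500, 300],
      "parameters": { "url": "https://api.example.com/notify", "method": "POST" }
    }
  ],
  "connections": {
    "Webhook": { "main": [[ { "node": "HTTP Request", "type": "main", "index": 0 } ]] }
  }
}
```

Importing a JSON file like this into n8n gives you the trigger → execution → result chain described above.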
- Ollama (external): `http://YOUR_OLLAMA_HOST:11434` (or localhost if running locally)
- Embedding model: `bge-m3:567m` (1024 dimensions, multilingual)
- Vector DB: Qdrant 1.11.5
- PostgreSQL: Metadata, n8n workflows, Langfuse traces
- Redis: Caching, session management
- ClickHouse: Analytics, telemetry
- MinIO: S3-compatible object storage
- OpenWebUI: React-based chat interface
- n8n: Node.js workflow editor
- Langfuse: Next.js observability UI
- MCP-Qdrant: Python FastAPI server for Cursor
- SearxNG: Metasearch engine for web queries
```
FlowTech-AI/
├── docker-compose.yml          # All services configuration
├── init.sh                     # One-command deployment
├── .env                        # Configuration (generated by init.sh)
│
├── AI_Data/                    # Runtime data (Docker volumes)
│   ├── openwebui/              # OpenWebUI data
│   ├── qdrant/                 # Vector database storage
│   ├── n8n/                    # Workflows & credentials
│   ├── mcp-qdrant/             # MCP service data
│   └── ...
│
├── Notes/
│   ├── _Templates/             # Generic Obsidian templates
│   └── README-NOTES.md         # Templates guide
│
├── mcp-qdrant/                 # MCP service Dockerfile
│   ├── Dockerfile
│   └── README.md
│
├── SRC/                        # Example workflows
│   └── FlowTech-AI-Complete-Workflow.json
│
├── docs/                       # Technical documentation
│   ├── MCP-QDRANT.md
│   ├── DEVELOPER_GUIDE.md
│   └── ...
│
├── README.md                   # Main documentation
├── QUICKSTART.md               # Setup guide
└── cursor-mcp-config.json      # Cursor config template
```
- ❌ Personal notes (VMs, Servers, Domains)
- ❌ Automated notes sync scripts
- ❌ Samba network shares
- ❌ Personal credentials & secrets
For personal notes management: See Flow-Notes-AI (private companion repo)
- OS: Ubuntu 20.04+ / Debian 11+ / RHEL 8+
- CPU: 4 cores
- RAM: 16 GB
- Disk: 50 GB
- Docker: 24.0.0+
- Docker Compose: 2.20.0+
```bash
git clone https://github.com/FlowTech-Lab/FlowTech-AI.git
cd FlowTech-AI
sudo ./init.sh
```

That's it! All services start automatically with health checks.
Auto-generated by init.sh, key variables:

```bash
# External Ollama
OLLAMA_BASE_URL=http://localhost:11434

# Ports
OPENWEBUI_PORT=8081
MCP_QDRANT_PORT=8000
N8N_PORT=5678

# Security (auto-generated)
POSTGRES_PASSWORD=<random>
REDIS_AUTH=<random>
N8N_BASIC_AUTH_PASSWORD=<random>

# RAG Configuration
RAG_EMBEDDING_MODEL=bge-m3:567m
RAG_EMBEDDING_ENGINE=ollama
VECTOR_DB=qdrant
```

```json
{
  "mcpServers": {
    "qdrant": {
      "transport": "sse",
      "url": "http://YOUR_SERVER_IP:8000/sse",
      "timeout": 30000
    }
  }
}
```

```bash
# All services status
docker compose ps

# Service logs
docker compose logs -f openwebui
docker compose logs -f mcp-qdrant

# Qdrant collections
curl http://localhost:6333/collections
```

- URL: http://localhost:3300
- Tracks: LLM calls, latencies, costs
- Integration: n8n workflows, OpenWebUI
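As a small helper, the `curl http://localhost:6333/collections` check can be turned into a script. The sample payload below mirrors the response shape of Qdrant's collections endpoint; when running against a live stack you would substitute the actual HTTP response body.

```python
# Summarize a Qdrant /collections response: verify status and list names.
import json

sample_response = """
{
  "result": {
    "collections": [
      {"name": "cursor-context"},
      {"name": "open-webui_files"}
    ]
  },
  "status": "ok",
  "time": 0.0001
}
"""

def collection_names(raw: str) -> list[str]:
    """Extract collection names from a /collections response body."""
    data = json.loads(raw)
    if data.get("status") != "ok":
        raise RuntimeError(f"Qdrant reported status {data.get('status')!r}")
    return [c["name"] for c in data["result"]["collections"]]

print(collection_names(sample_response))  # ['cursor-context', 'open-webui_files']
```

Seeing both `cursor-context` and `open-webui_files` listed is a quick sanity check that the Cursor and RAG pipelines have written data.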
```bash
cd FlowTech-AI
git pull origin main
docker compose pull
docker compose up -d
```

```bash
docker compose build mcp-qdrant
docker compose up -d mcp-qdrant
```

Want to improve FlowTech-AI?
- Fork the repo
- Create feature branch: `git checkout -b feature/amazing-feature`
- Test changes: `./init.sh`
- Commit: `git commit -m 'Add amazing feature'`
- Push: `git push origin feature/amazing-feature`
- Open Pull Request
- Store code snippets with `@qdrant store`
- Retrieve context with `@qdrant find`
- Build personal code knowledge base
- Upload technical docs (PDF, MD, DOCX)
- Ask questions in natural language
- Get accurate answers with citations
- Automate repetitive tasks
- Integrate APIs and services
- Schedule jobs and webhooks
- Share OpenWebUI instance
- Common documentation repository
- Collaborative RAG
- Issues: https://github.com/FlowTech-Lab/FlowTech-AI/issues
- Discussions: https://github.com/FlowTech-Lab/FlowTech-AI/discussions
- Documentation: `docs/` folder
MIT License - See LICENSE
- Flow-Notes-AI - Personal notes sync (private repo)
- OpenWebUI - AI chat interface
- n8n - Workflow automation
- Qdrant - Vector database
- Cursor - AI-powered IDE
Made with ❤️ by the FlowTech-Lab community