Ragnet is an AI-powered DevRel agent that automatically resolves developer support queries across community channels like Discord, Slack, GitHub, and Telegram, built specifically for your open-source community.
Most open-source, developer-facing startups struggle to scale support: time-zone delays, scattered knowledge, and a growing volume of community questions outpace the team. Companies hire DevRels and Support Engineers to bridge the gap, but that is expensive and doesn't scale well.
Ragnet solves this by ingesting your docs, codebase, GitHub issues, architecture diagrams, and past chat history to build an AI agent that answers developer queries instantly, with full source citations. It learns continuously from new interactions and works 24/7, like a fully trained DevRel who knows everything your team ever shipped, without the human overhead. And because it lives right where your community already is, there is zero friction to use it.
Multi-Source Integration
- Discord communities
- GitHub repositories
- Documentation websites
Intelligent Query Processing
- RAG-based contextual understanding
- Vector similarity search
- Conversation history awareness
- Source citations
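To make the retrieval step concrete, here is a minimal sketch of RAG-style vector similarity search: given a query embedding, rank stored document chunks by cosine similarity and return the top-k matches with their source citations. The names (`Chunk`, `topK`) and the in-memory search are illustrative assumptions, not Ragnet's actual implementation (which uses pgvector).

```typescript
// Hypothetical sketch of RAG retrieval: score every embedded chunk
// against the query embedding and keep the k most similar ones.
interface Chunk {
  text: string;
  source: string;      // citation, e.g. a docs URL or GitHub issue link
  embedding: number[];
}

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

function topK(query: number[], chunks: Chunk[], k: number): Chunk[] {
  // Sort a copy descending by similarity, then truncate to k results.
  return [...chunks]
    .sort((x, y) =>
      cosineSimilarity(query, y.embedding) - cosineSimilarity(query, x.embedding))
    .slice(0, k);
}
```

The retrieved chunks (and their `source` fields) are what lets the agent answer with citations instead of unsupported text.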
Community Engagement
- Discord bot integration
- Threaded conversations
- Real-time responses
- No auth required
- Stays where your community is
Analytics & Insights
- Query patterns
- User engagement metrics
- Documentation coverage analysis
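As a rough illustration of what query-pattern analytics can look like, the sketch below buckets incoming questions by keyword and counts occurrences, surfacing the topics your documentation may under-cover. The function name and keyword-matching approach are assumptions for illustration, not Ragnet's actual analytics pipeline.

```typescript
// Hypothetical sketch: tally how often each topic keyword appears
// across community queries, to highlight documentation gaps.
function queryPatterns(queries: string[], keywords: string[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const kw of keywords) counts.set(kw, 0);
  for (const q of queries) {
    const lower = q.toLowerCase();
    for (const kw of keywords) {
      // Case-insensitive substring match; a real pipeline would
      // likely use embeddings or topic clustering instead.
      if (lower.includes(kw)) counts.set(kw, (counts.get(kw) ?? 0) + 1);
    }
  }
  return counts;
}
```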
Prerequisites
- Node.js 20+
- Docker
- Discord Bot Token (for Discord integration)
- OpenAI API Key (for AI processing)
Getting Started
- Clone the repository
git clone https://github.com/ragnet-in/infra.git
cd infra
Configure environment variables in docker-compose.yml
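The exact variable names depend on the release you are running, so check docker-compose.yml itself; a minimal sketch, assuming the keys map directly to the prerequisites above, might look like:

```yaml
# Hypothetical fragment – verify against the actual docker-compose.yml.
services:
  ragnet:
    environment:
      OPENAI_API_KEY: "sk-..."          # for AI processing (see prerequisites)
      DISCORD_BOT_TOKEN: "your-token"   # required for Discord integration
```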
Start the server
docker compose up -d

Now access the application at http://localhost:3000.
Roadmap
- Discord integration
- GitHub Repo ingestion
- Basic RAG implementation
- Conversation threading
- Reputation system
- Content verification
- Token-gated features
- Webpage crawling and ingestion
- Advanced analytics dashboard
- Custom training data support
- Multi-language support
- RAG accuracy improvements with LLMs
- Custom model fine-tuning
- Team collaboration tools
- API rate limiting
- Custom deployment options
Tech Stack
- Backend: Node.js, Express
- Database: PostgreSQL, pgvector
- AI/ML: OpenAI, Embeddings, Mastra
- Web3: Lens Protocol, Bonsai
- Infra: Docker, Docker Compose
This project is licensed under the MIT License - see the LICENSE file for details.
Built with ❤️ by the RagNet Team
