Aaroh is an AI-powered learning assistant that helps students move beyond rote memorization toward true comprehension.
It breaks down complex academic text into simplified explanations, relatable analogies, and auto-generated quizzes, making learning engaging and accessible.
Built for the Viksit Bharat Challenge by Team Runtime Terror, Aaroh aims to democratize education by making complex knowledge simple and inclusive — for every learner, in every language.
Students often struggle to grasp dense academic material, leading to surface-level understanding and dependence on memorization. Traditional educational resources are static and fail to adapt to individual learning needs.
Aaroh provides a dynamic, AI-driven solution that allows a student to paste any complex academic text and instantly receive:
- A simplified version of the content.
- A real-world analogy to make the concept intuitive.
- An auto-generated quiz to test comprehension.
- Multilingual support for inclusivity.
By transforming passive reading into active understanding, Aaroh fosters curiosity, confidence, and critical thinking.
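As a rough illustration, the result a student gets back could be modeled with a shape like the following (the field names are assumptions for this sketch, not the project's actual API contract):

```ts
// Illustrative only; the real response format may differ.
interface QuizQuestion {
  question: string;
  options: string[];    // multiple-choice options
  answerIndex: number;  // index of the correct option
}

interface LearnResponse {
  simplified: string;   // simplified version of the input text
  analogy: string;      // real-world analogy for the concept
  quiz: QuizQuestion[]; // auto-generated comprehension quiz
  language: string;     // output language, for multilingual support
}
```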
- Frontend: Next.js (TypeScript, pnpm)
- Backend: Express.js (Node.js, pnpm)
- AI Service: Python (Flask + LLM integration)
- Containerization: Docker + Docker Compose
- CI/CD: GitHub Actions (build and push, with caching)
```
.
├── app/                  # Next.js frontend (UI for text input and quiz display)
│   └── Dockerfile
├── api/                  # Express backend (routes + middleware)
│   └── Dockerfile
├── ai/                   # Flask microservice (AI-powered simplification, analogy, and quiz generation)
│   ├── main.py
│   ├── requirements.txt
│   └── Dockerfile
├── docker-compose.yml
└── pnpm-workspace.yaml
```
```bash
git clone https://github.com/upayanmazumder/aaroh.git
cd aaroh
```

Build and start all services:

```bash
docker compose up --build
```

Then open:
- Frontend: http://localhost:3000
- API: http://localhost:4000
- AI Service: http://localhost:5000
- The frontend (Next.js) provides a simple interface for students to input text.
- The backend (Express) receives the text and forwards it to the AI service (sketched after this list).
- The AI service (Flask) processes the input using language models to:
  - Simplify complex concepts.
  - Generate analogies for better recall.
  - Create short quizzes to test understanding.
- The processed output is sent back through the backend and displayed in the UI.
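A minimal sketch of that forwarding step in the Express backend, assuming the `/api/learn` and `/process` routes described in the request flow below (the service URL, environment variable, and payload field are illustrative, not taken from the repo):

```ts
import express from "express";

const app = express();
app.use(express.json());

// In Docker Compose, the AI service is reachable by its service name;
// the env var and fallback URL here are assumptions for this sketch.
const AI_SERVICE_URL = process.env.AI_SERVICE_URL ?? "http://ai:5000";

// Forward the student's text to the Flask AI service and relay the result.
// Uses the global fetch available in Node 18+.
app.post("/api/learn", async (req, res) => {
  try {
    const response = await fetch(`${AI_SERVICE_URL}/process`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ text: req.body.text }),
    });
    res.status(response.status).json(await response.json());
  } catch {
    res.status(502).json({ error: "AI service unavailable" });
  }
});

app.listen(4000, () => console.log("API listening on :4000"));
```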
- The user submits text to `/api/learn` (example request below).
- The Express backend routes it to the Flask endpoint `/process`.
- The Flask AI module generates a simplified explanation, an analogy, and a quiz.
- The processed result is displayed in the web app.
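From the browser's perspective, the whole flow is a single request. A hedged example, assuming the backend accepts a `{ text }` payload and that CORS (or a Next.js rewrite) permits the call:

```ts
// Illustrative request from the frontend; payload shape is an assumption.
const res = await fetch("http://localhost:4000/api/learn", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ text: "Photosynthesis converts light energy into chemical energy." }),
});
const { simplified, analogy, quiz } = await res.json();
```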
| Service | Port | Description |
|---------|------|-------------|
| `app`   | 3000 | Next.js frontend |
| `api`   | 4000 | Express backend |
| `ai`    | 5000 | Flask AI microservice |
Each container is modular, isolated, and deployable independently.
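For orientation, the compose file for this layout might look roughly like the sketch below (based on the ports above, not the repository's actual `docker-compose.yml`):

```yaml
# Sketch only; see docker-compose.yml in the repo for the real configuration.
services:
  app:
    build: ./app
    ports:
      - "3000:3000"
    depends_on:
      - api
  api:
    build: ./api
    ports:
      - "4000:4000"
    depends_on:
      - ai
  ai:
    build: ./ai
    ports:
      - "5000:5000"
```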
- Uses `pnpm` for dependency management (see the workspace sketch after this list).
- Python virtual environments are handled through Docker.
- GitHub Actions workflow supports build caching for faster CI/CD.
- Easily extensible with additional services (databases, vector stores, etc.).
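For reference, a minimal `pnpm-workspace.yaml` for the layout above could be as small as this (illustrative; check the file in the repository):

```yaml
# Sketch; the actual workspace file may list packages differently.
packages:
  - "app"
  - "api"
```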
- Integrate production-grade LLMs (OpenAI, HuggingFace, Ollama).
- Add multilingual voice input and output.
- Introduce user analytics and adaptive learning.
- Expand inclusivity through support for regional languages.
- Deploy to cloud providers (Fly.io, Railway, Render, etc.).
To make learning simple, personalized, and empowering — cultivating thinkers, not memorizers.
Licensed under the MIT License.