AdaptIQ is a full-stack learning management system with AI quiz generation, personalized roadmaps, and comprehensive analytics — all powered by local LLM inference via Ollama.
## ✨ Features

| Feature | Description |
|---------|-------------|
| Courses | Full CRUD with topics, difficulty levels, enrollment tracking |
| Learning Materials | Upload PDF/DOCX/videos, organize by topic and course |
| AI Quiz Engine | Generate quizzes from any topic via Ollama LLM, with auto-grading |
| Roadmap Generator | Personalized weekly learning plans based on quiz performance |
| Admin Dashboard | User management, platform analytics, audit logs |
| Quiz Attempt Logging | Full history, score trends, time tracking |
| License System | Activation key management for deployment |
| CSV Export | Export quiz attempts and analytics data |
| Docker Deployment | One-command full-stack deployment |
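As a sketch of what the CSV export feature might produce, the snippet below serializes quiz-attempt records with Python's standard `csv` module. The `attempts_to_csv` helper and the field names (`user`, `topic`, `score`, `duration_s`) are illustrative assumptions, not the platform's actual schema.

```python
import csv
import io

# Hypothetical attempt records; the real schema may differ.
attempts = [
    {"user": "alice", "topic": "SQL joins", "score": 80, "duration_s": 312},
    {"user": "bob", "topic": "SQL joins", "score": 65, "duration_s": 441},
]

def attempts_to_csv(rows):
    """Write attempt dicts to an in-memory CSV string with a header row."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["user", "topic", "score", "duration_s"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```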
## 🏗️ Architecture
```
┌──────────────┐     ┌──────────────┐     ┌──────────────┐
│   Frontend   │────▶│   Backend    │────▶│    Ollama    │
│   Next.js    │     │  Flask API   │     │  LLM Engine  │
│   Port 3000  │     │  Port 8000   │     │  Port 11434  │
└──────────────┘     └──────┬───────┘     └──────────────┘
                            │
                     ┌──────▼───────┐
                     │  SQLite DB   │
                     │  acadex.db   │
                     └──────────────┘
```
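To make the backend-to-Ollama hop concrete, here is a minimal sketch of how a Flask backend could request quiz questions from Ollama's `/api/generate` endpoint. The helper names (`build_quiz_prompt`, `generate_quiz`), the prompt wording, and the `OLLAMA_URL` default are assumptions for illustration; only the Ollama endpoint and payload shape come from Ollama's public API.

```python
import json
import os
import urllib.request

# Assumed default; in Docker Compose the service name would replace localhost.
OLLAMA_URL = os.getenv("OLLAMA_URL", "http://localhost:11434")

def build_quiz_prompt(topic: str, num_questions: int = 5) -> dict:
    """Build an Ollama /api/generate payload asking for quiz JSON."""
    prompt = (
        f"Generate {num_questions} multiple-choice questions about {topic}. "
        "Respond with a JSON array of objects with keys: question, options, answer."
    )
    return {
        "model": os.getenv("LLM_MODEL", "gemma:2b"),
        "prompt": prompt,
        "stream": False,  # one complete response instead of streamed chunks
    }

def generate_quiz(topic: str) -> str:
    """POST the prompt to Ollama and return the raw model response text."""
    payload = json.dumps(build_quiz_prompt(topic)).encode()
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```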
## 🚀 Quick Start

### Option 1: Docker Compose (Recommended)
```bash
# Clone and navigate
cd "Acadex AI"

# Start everything
docker-compose up --build
```

```bash
# Pull an LLM model (in a separate terminal)
docker exec -it acadex-ollama ollama pull gemma:2b
```
The platform uses Ollama for local AI inference. Supported models:
| Model | Size | Recommended For |
|-------|------|-----------------|
| `gemma:2b` | ~1.5GB | Low-resource machines |
| `llama3:8b` | ~4.7GB | Best quality/speed balance |
| `mistral:7b` | ~4.1GB | Great for quizzes |
Change the model in `backend/.env`:

```env
LLM_MODEL=llama3:8b
```
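On the backend side, a setting like this is typically read from the environment with a fallback. The `resolve_model` helper below is a hypothetical sketch of that pattern, assuming `gemma:2b` as the default; the real backend may load its config differently.

```python
import os

def resolve_model(default: str = "gemma:2b") -> str:
    """Return the model named by LLM_MODEL, or a small default if unset."""
    return os.getenv("LLM_MODEL", default)
```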
## About
AdaptIQ is an offline-capable, AI-powered learning platform that personalizes education with locally generated quizzes, roadmaps, and intelligent progress tracking.