Converts Jupyter notebooks, source code, and experimental results into technical reports in multiple formats.
Automates technical report creation for:
- Lab reports and semester projects
- Internship documentation
- Technical documentation
- Research summaries
Key features:
- Local LLM support (Ollama) and cloud API compatibility
- Multiple output formats: DOCX, PDF, Markdown
- Diagram generation from code structure
- Citation management
- Configurable report templates
Input (Notebooks, Code, Data)
↓
Document Parser & Analyzer
↓
Context Extraction Engine
↓
Multi-Agent Report Generator
↓
Diagram Generator
↓
Output Formatter (DOCX/PDF)
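The stages above can be sketched as plain function composition. This is a minimal illustration only: the function names mirror the diagram, but the bodies are placeholders, not the project's actual API.

```python
# Hypothetical sketch of the pipeline stages shown above.
# Names follow the diagram; bodies are stand-ins, not the real implementation.

def parse_documents(inputs: list[str]) -> dict:
    """Document Parser & Analyzer: read notebooks, code, and data."""
    return {"sources": inputs}

def extract_context(parsed: dict) -> dict:
    """Context Extraction Engine: distill the material worth reporting."""
    return {**parsed, "context": "summary of " + ", ".join(parsed["sources"])}

def generate_report(context: dict) -> str:
    """Multi-Agent Report Generator: draft the report text (LLM calls live here)."""
    return f"Report based on {context['context']}"

def add_diagrams(report: str) -> str:
    """Diagram Generator: attach diagrams derived from code structure."""
    return report + "\n[diagram placeholder]"

def format_output(report: str, fmt: str = "docx") -> str:
    """Output Formatter: render to DOCX/PDF/Markdown."""
    return f"[{fmt}] {report}"

def run_pipeline(inputs: list[str], fmt: str = "docx") -> str:
    parsed = parse_documents(inputs)
    context = extract_context(parsed)
    report = add_diagrams(generate_report(context))
    return format_output(report, fmt)
```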
# Clone and setup
git clone https://github.com/haripatel07/report-generator.git
cd report-generator
python -m venv venv
source venv/bin/activate # Windows: venv\Scripts\activate
pip install -r requirements.txt
# Setup LLM (pick one)
ollama pull llama3 # easiest
export OPENAI_API_KEY="sk-..." # OpenAI
export OPENAI_API_KEY="gsk-..." # or Groq (also set OPENAI_BASE_URL)
# Generate a report
python src/main.py --input notebook.ipynb --type academic --format docx
See the LLM Setup section below if you need help with the model setup.
- Python 3.9+
- Local LLM (Ollama) or OpenAI-compatible API
- 8GB+ RAM
- Storage for model weights
See INSTALLATION.md for detailed setup instructions.
This tool needs an LLM to generate reports. Pick one:
If you haven't used LLMs before, start here. It's free and runs locally.
# Install from https://ollama.com
ollama pull llama3
ollama list # verify it's installed
The tool will use Ollama automatically if you don't set an API key.
You'll need an API key, and usage is billed per request.
- Get a key from https://platform.openai.com/api-keys
- Set it in your environment:
# Mac/Linux
export OPENAI_API_KEY="your_key_here"
# Windows PowerShell
setx OPENAI_API_KEY "your_key_here"
Or create a .env file in the project root:
OPENAI_API_KEY=your_key_here
Important: Restart your terminal after setting the variable.
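To confirm the variable is actually visible after the restart, a quick standard-library check (the variable name is the one used above):

```python
import os

def api_key_present() -> bool:
    """Return True if OPENAI_API_KEY is set and non-empty in this process."""
    return bool(os.getenv("OPENAI_API_KEY"))

if __name__ == "__main__":
    print("key found" if api_key_present() else "key missing - restart your terminal")
```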
Similar to OpenAI but with a generous free tier. Groq uses an OpenAI-compatible API.
- Get a key from https://console.groq.com
- Add it to your .env file:
LLM_PROVIDER=openai
LLM_MODEL=llama-3.1-70b-versatile
OPENAI_API_KEY=gsk_your_key_here
OPENAI_BASE_URL=https://api.groq.com/openai/v1
Or set them as environment variables:
export OPENAI_API_KEY="gsk_your_key_here"
export OPENAI_BASE_URL="https://api.groq.com/openai/v1"
# If using Ollama
ollama run llama3 "test"
# Otherwise just try running
python src/main.py --help
Common issues:
- Forgot to restart terminal after setting API key
- Ollama not running (check with ollama list)
- Wrong API key or typo in .env file
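If you suspect Ollama isn't running, a small standard-library check against its default local endpoint can tell you (port 11434 is Ollama's default; adjust the URL if you changed it):

```python
import urllib.request
import urllib.error

def ollama_reachable(url: str = "http://localhost:11434") -> bool:
    """Return True if an Ollama server answers on the given URL."""
    try:
        with urllib.request.urlopen(url, timeout=2) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

if __name__ == "__main__":
    if ollama_reachable():
        print("Ollama is running")
    else:
        print("Ollama not reachable - start it with 'ollama serve'")
```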
python src/main.py \
--input project/notebook.ipynb \
--type academic \
--format docx \
--output report.docx
python src/main.py \
--input project/ \
--type internship \
--format pdf \
--title "Machine Learning Internship Report" \
--institution "XYZ University" \
--include-code-snippets \
--diagram-style detailed
Report types:
- academic: University lab reports, course projects
- internship: Weekly and final internship reports
- industry: Professional technical documentation
- research: Research paper drafts and summaries
report-generator/
├── src/
│ ├── agents/ # Multi-agent system
│ ├── parsers/ # Input file parsers
│ ├── generators/ # Content generators
│ ├── formatters/ # Output formatters
│ └── utils/ # Utilities
├── config/ # Configuration files
├── tests/ # Unit and integration tests
├── docs/ # Documentation
└── examples/ # Example inputs/outputs
See CONTRIBUTING.md for contribution guidelines.
MIT License - see LICENSE for details.