✨ Fully vibecoded - This project was entirely developed using AI assistance, showcasing the power of AI-driven development.
Generate semantic commit messages using your local Ollama LLM instance.
No cloud services, no API keys - everything runs locally.
- 🚀 Generate commit messages instantly from staged changes
- 🔒 Fully local - uses your own Ollama instance
- 📝 Conventional Commits - follows standard commit message format
- ⚡ Fast - powered by a lightweight Go CLI
- 🎨 VS Code integration - seamless SCM toolbar button
- Ollama installed and running locally
- A compatible model (default: `qwen2.5-coder:7b`, but any model works)
- VS Code 1.85.0 or higher
- Open VS Code
- Go to the Extensions view (`Ctrl+Shift+X` / `Cmd+Shift+X`)
- Search for "Llamit"
- Click Install
```bash
# Clone the repository
git clone https://github.com/crstian19/llamit.git
cd llamit

# Build the Go CLI
cd go-cli
go build -o cli main.go

# Build the VS Code extension
cd ../vscode-extension
npm install
npm run compile

# Package the extension
npx vsce package

# Install the generated .vsix file in VS Code
```
## Usage
1. **Start Ollama**: Make sure Ollama is running

   ```bash
   ollama serve
   ```

2. **Stage your changes**: Use Git to stage the files you want to commit

   ```bash
   git add .
   ```

3. **Generate commit message**:
   - Click the ✨ Llamit button in the Source Control toolbar, or
   - Open the Command Palette (`Ctrl+Shift+P` / `Cmd+Shift+P`) and run `Llamit: Generate Commit Message`

4. **Review and commit**: The generated message appears in the commit input box. Review it and commit!
You can customize Llamit in VS Code settings:
```json
{
  "llamit.ollamaUrl": "http://localhost:11434/api/generate",
  "llamit.model": "qwen2.5-coder:7b",
  "llamit.commitFormat": "conventional",
  "llamit.customFormat": ""
}
```

- `llamit.ollamaUrl`: The Ollama API endpoint URL (default: `http://localhost:11434/api/generate`)
- `llamit.model`: The model to use for generation (default: `qwen2.5-coder:7b`)
- `llamit.commitFormat`: The commit message format to use (default: `conventional`). Available formats: `conventional`, `angular`, `gitmoji`, `karma`, `semantic`, `google`, `custom`
- `llamit.customFormat`: Custom format template (only used when `commitFormat` is set to `custom`)
Llamit supports multiple commit message formats to match your team's conventions:
**Conventional:**

```
feat(auth): add user login functionality

Implements OAuth2 authentication flow
```

**Angular:**

```
feat(core): implement user authentication

- Add login service
- Add auth guard
- Update routing

Closes #123
```

**Gitmoji:**

```
✨ feat(api): add new endpoint for user profiles

Implements GET /api/users/:id endpoint
```

**Karma:**

```
feat(ui): add dark mode toggle

Implements theme switching functionality
```

**Semantic:**

```
feat: implement user authentication system

Complete OAuth2 integration with JWT tokens
```

**Google:**

```
Add user authentication system

Implements a complete authentication flow using OAuth2 and JWT tokens.
Includes login, logout, and token refresh functionality.
```
Set `llamit.commitFormat` to `custom` and provide your own template in `llamit.customFormat`:

```json
{
  "llamit.commitFormat": "custom",
  "llamit.customFormat": "Generate a simple commit message:\n<action>: <description>\n\nRules:\n1. Keep it under 50 characters\n2. Use imperative mood"
}
```

Any Ollama model works, but these are optimized for code:

- `qwen2.5-coder:7b` - Great balance of quality and speed (default)
- `qwen2.5-coder:14b` - Better quality, slower
- `codellama:13b` - Good alternative
- `deepseek-coder:6.7b` - Fast and efficient
Llamit consists of two components:
**Go CLI**: A standalone command-line tool that:
- Reads git diffs from stdin
- Sends them to Ollama with a prompt template
- Returns a formatted commit message
- Implements retry logic with exponential backoff
- Handles errors gracefully
**VS Code Extension**: A TypeScript extension that:
- Integrates with VS Code's Source Control view
- Executes `git diff --cached` to get staged changes
- Spawns the Go CLI as a subprocess
- Populates the commit message box with the result
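The core hand-off — pipe the staged diff into the CLI's stdin and read the commit message from its stdout — can be sketched in Go (the extension itself is TypeScript; `cat` stands in for the Llamit CLI below so the sketch runs anywhere, and `pipeThrough` is an illustrative helper, not real project code):

```go
package main

import (
	"bytes"
	"fmt"
	"os/exec"
	"strings"
)

// pipeThrough feeds input to the command's stdin and returns its
// stdout, mirroring how the extension hands the staged diff to the CLI.
func pipeThrough(input string, name string, args ...string) (string, error) {
	cmd := exec.Command(name, args...)
	cmd.Stdin = strings.NewReader(input)
	var out bytes.Buffer
	cmd.Stdout = &out
	if err := cmd.Run(); err != nil {
		return "", err
	}
	return strings.TrimSpace(out.String()), nil
}

func main() {
	// In the real flow: diff := output of `git diff --cached`, then
	// msg := pipeThrough(diff, pathToLlamitCLI). Here `cat` stands in
	// for the CLI: it simply echoes its stdin back.
	msg, err := pipeThrough("diff --git a/main.go b/main.go", "cat")
	fmt.Println(msg, err)
}
```

Keeping the CLI a plain stdin/stdout filter is what makes this hand-off editor-agnostic: any host that can spawn a subprocess can reuse it.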
**Go CLI:**

```bash
cd go-cli
go test -v        # All tests
go test -v -short # Unit tests only (skip integration)
```

**VS Code Extension:**

```bash
cd vscode-extension
npm run test:unit # Fast unit tests
npm test          # Full integration tests
```

```
llamit/
├── go-cli/               # Go CLI binary
│   ├── main.go           # Core logic
│   ├── main_test.go      # Comprehensive tests
│   └── go.mod            # Go module file
├── vscode-extension/     # VS Code extension
│   ├── src/
│   │   ├── extension.ts  # Extension entry point
│   │   └── test/         # Unit and integration tests
│   ├── package.json      # Extension manifest
│   └── tsconfig.json     # TypeScript config
└── CLAUDE.md             # AI assistant documentation
```
Both components have comprehensive test coverage:
- Go CLI: 6 test cases covering success, errors, retries, and integration
- VS Code Extension: Unit tests + integration tests for all core functions
See CLAUDE.md for detailed testing information.
Contributions are welcome! This project was vibecoded, but that doesn't mean it can't be improved by humans too 😊
- Fork the repository
- Create a feature branch: `git checkout -b feature/amazing-feature`
- Commit your changes: `git commit -m 'feat: add amazing feature'`
- Push to the branch: `git push origin feature/amazing-feature`
- Open a Pull Request
See CONTRIBUTING.md for detailed guidelines.
📋 See all release notes on the GitHub Releases page
Llamit uses automated CI/CD for releases:
- ✅ Automatic releases on merge to `main`
- 📦 Platform-specific packaging (6 platforms)
- 🚀 Automatic publishing to VS Code Marketplace and Open VSX
For maintainers: See .github/RELEASE.md for detailed release process documentation.
Llamit collects minimal anonymous usage data (install/update events and editor name) to understand adoption across editors like VS Code, VSCodium, and Cursor. No PII, code, or usage frequency is ever collected.
You can opt out at any time by setting `llamit.telemetry.enabled` to `false`. See USAGE_DATA.md for full details.
MIT License - see LICENSE file for details
- Built with Claude - AI pair programming at its finest
- Powered by Ollama - local LLM runtime
- Inspired by the need for better commit messages everywhere
Made with 🤖 and ✨ through vibecoding