DukeSchlegel/OllaMail
OllaMail

About the Project

A self-hosted, AI-powered email client designed so your data stays private: instead of sending sensitive content to external APIs, it uses local LLMs (via Ollama) to bring order to your inbox.

Key Capabilities:

  • Generates instant TL;DR summaries for long email threads.
  • Auto-categorizes incoming mail (invoices, newsletters, high-priority).
  • Extracts action items and deadlines directly from the text.
  • Processes everything 100% offline - your emails never leave your server.

Who is this for?

  • Homelabbers & Tech Enthusiasts: Take full control of your infrastructure using standard Docker deployments.
  • Privacy Advocates: Keep your personal communications safe from data scraping.
  • Professionals under NDA (lawyers, consultants, etc.): Leverage AI productivity while meeting GDPR and client-confidentiality obligations.

Getting Started

Prerequisites

Before you begin, ensure you have the following ready:

  1. Docker and Docker Compose installed on your server.
  2. An email account accessible via IMAP.
  3. A running instance of Ollama (local or on your network) with an active model (e.g., llama3 or mistral).
    • Tip: Run ollama run llama3 once to make sure the model is downloaded and ready.
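To confirm the third prerequisite before deploying, you can query Ollama's HTTP API directly. A minimal sketch - it assumes Ollama's default port 11434 and the llama3 model; adjust both to your setup:

```shell
# Quick health check: is Ollama reachable, and is the model pulled?
# The base URL and the model name "llama3" are assumptions -- match them
# to your own deployment.
OLLAMA_BASE_URL="${OLLAMA_BASE_URL:-http://localhost:11434}"
if curl -fsS "$OLLAMA_BASE_URL/api/tags" 2>/dev/null | grep -q '"name":"llama3'; then
  echo "Ollama is up and llama3 is available"
else
  echo "Ollama unreachable or llama3 missing -- try: ollama pull llama3"
fi
```

GET /api/tags lists the locally available models, so an empty or failed response here means the container will not be able to summarize anything later.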

Installation (Docker Compose)

The easiest and recommended way to deploy this project is via Docker Compose.

  1. Create a new directory for the project and add a docker-compose.yml file:
services:
  mail-client:
    image: ollamail:latest
    container_name: ollamail
    ports:
      - "8080:8080" # Web UI
    environment:
      # IMAP Configuration
      - IMAP_HOST=
      - IMAP_PORT=
      - IMAP_USER=
      - IMAP_PASS=

      # AI / Ollama Configuration
      # Use host.docker.internal if Ollama runs on the same host machine
      - OLLAMA_BASE_URL=http://host.docker.internal:11434
      - OLLAMA_MODEL=llama3
      - DB_PATH=/app/data/mail.db
      - SYNC_INTERVAL_MINUTES=5
    volumes:
      - ./data:/app/data
    restart: unless-stopped
  2. Run the following command to start the container in the background:
docker compose up -d
  3. Open your browser and navigate to http://localhost:8080 to access your new AI-powered inbox.
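If the UI does not load, two quick checks help: is the container running, and is anything answering on port 8080? A rough sketch, using the service name mail-client and port 8080 from the compose file above (change both if you customized them):

```shell
# Verify the stack is up. "mail-client" and port 8080 match the compose
# file above; adjust if you renamed the service or remapped the port.
UI_URL="http://localhost:8080/"
docker compose ps mail-client 2>/dev/null || echo "Compose stack not found -- run 'docker compose up -d' first"
if curl -fsS -o /dev/null "$UI_URL" 2>/dev/null; then
  status="web UI responding"
else
  status="web UI not reachable yet -- inspect 'docker compose logs mail-client'"
fi
echo "$status"
```

The first sync can take a few minutes on a large mailbox, so a slow first page load is not necessarily an error.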

Configuration (Environment Variables)

The application is configured via environment variables. If you are using Docker Compose, you can define these in a .env file or directly within docker-compose.yml.

| Variable | Description | Default / Example | Required |
| --- | --- | --- | --- |
| `IMAP_HOST` | The hostname of your email provider's IMAP server. | `imap.gmail.com` | Yes |
| `IMAP_PORT` | The port for IMAP over TLS/SSL. | `993` | No (defaults to `993`) |
| `IMAP_USER` | Your email address or IMAP login username. | `user@domain.com` | Yes |
| `IMAP_PASS` | Your IMAP password (use an app-specific password if 2FA is enabled). | `super-secret-pass` | Yes |
| `OLLAMA_BASE_URL` | The URL where your local Ollama instance is running. | `http://localhost:11434` | Yes |
| `OLLAMA_MODEL` | The LLM to use for summaries and tagging. | `llama3` | No (defaults to `llama3`) |
| `OLLAMA_SYSTEM_PROMPT` | Overrides the default AI behavior (e.g., "Always reply in German"). | Built-in default | No |
| `DB_PATH` | The internal path where the SQLite database is stored. | `/app/data/mail.db` | No |
| `SYNC_INTERVAL_MINUTES` | How often the background worker checks for new emails. | `5` | No |
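For reference, a .env file next to docker-compose.yml might look like this - every value below is a placeholder, not a real credential:

```shell
# Example .env -- all values are placeholders; substitute your own.
IMAP_HOST=imap.example.com
IMAP_PORT=993
IMAP_USER=you@example.com
IMAP_PASS=replace-with-app-password
OLLAMA_BASE_URL=http://host.docker.internal:11434
OLLAMA_MODEL=llama3
SYNC_INTERVAL_MINUTES=5
```

If you switch the environment entries in docker-compose.yml to ${IMAP_HOST}-style references, Compose substitutes these values at startup, keeping credentials out of the versioned file.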

Contributing

We welcome contributions! Whether it's fixing bugs, improving the HTML-to-text sanitizer, or adding new features, your help is appreciated.
