
Mingly

Swiss AI Privacy — Use AI without giving up your data.


Mingly is a privacy-first multi-LLM desktop app from Switzerland. Use Claude, ChatGPT, Gemini, and local LLMs in one interface — with automatic PII protection that keeps your personal data off the cloud. Use it standalone or deploy as a server for your team.

Highlights

AI Agents & Tools

  • Agentic Mode — ReAct agents with automatic tool selection and multi-step reasoning chains (Pro+)
  • Agent Comparison — Run up to 3 ReAct agents in parallel, each with full tool access, compare reasoning + results side-by-side (Pro+)
  • Parallel Subagents — Master LLM decomposes a task → N parallel agents execute subtasks → Master synthesizes final answer (Pro+)
  • Tool-Use for Local Models — Ollama + LM Studio + OpenRouter get full function calling via OpenAI-compatible endpoints
  • MCP Tools — Extend functionality with Model Context Protocol tools + auto-tool-selection
  • Built-in Tools — web_search, read_file, write_file, execute_command available to all agents
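The agent loop behind Agentic Mode can be sketched as a minimal ReAct cycle: the model alternates thought → tool call → observation until it emits a final answer. This is a hypothetical illustration, not Mingly's actual internals — the tool registry and the stub `model` function here are stand-ins.

```typescript
// Minimal ReAct-style loop sketch (hypothetical; not Mingly's actual internals).
type Tool = (input: string) => string;

const tools: Record<string, Tool> = {
  // Stand-ins for the built-in tools named above.
  web_search: (q) => `results for "${q}"`,
  read_file: (path) => `contents of ${path}`,
};

interface Step { thought: string; tool?: string; input?: string; answer?: string }

// A stub "model" that plans one search, then answers.
function model(history: string[]): Step {
  if (history.length === 0) {
    return { thought: "I should search first", tool: "web_search", input: "mingly" };
  }
  return { thought: "I have enough context", answer: "done" };
}

function runAgent(maxSteps = 5): string {
  const history: string[] = [];
  for (let i = 0; i < maxSteps; i++) {
    const step = model(history);
    if (step.answer !== undefined) return step.answer;       // agent finished
    const observation = tools[step.tool!](step.input!);      // execute the chosen tool
    history.push(`${step.thought} -> ${step.tool}(${step.input}) = ${observation}`);
  }
  return "max steps reached";
}
```

The `maxSteps` cap is the safety valve that keeps multi-step reasoning chains from looping forever.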

Multi-Provider Chat

  • Multi-Provider — Claude, GPT-4, Gemini, Ollama (local) in one interface
  • Model Comparison — Send the same prompt to up to 3 models in parallel, compare text outputs side-by-side
  • Intelligent Routing — A local Gemma 2B model auto-routes each request to the best-suited model; manual selection is available as an override
  • Local LLM Discovery — Auto-detects Ollama, LM Studio, vLLM, LocalAI and more
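Routing of the kind described above can be sketched as a small decision function. This is purely illustrative — Mingly uses a Gemma 2B classifier, and the keyword heuristic and provider names below are assumptions standing in for it.

```typescript
// Hypothetical routing sketch: manual override wins, otherwise a cheap
// heuristic picks a provider (a classifier model does this in practice).
type Provider = "claude" | "gpt-4" | "gemini" | "ollama";

function routePrompt(prompt: string, manual?: Provider): Provider {
  if (manual) return manual;                    // user switched to manual mode
  if (prompt.length > 2000) return "claude";    // long-context work
  if (/code|function|bug/i.test(prompt)) return "gpt-4"; // coding tasks
  return "ollama";                              // default to a free local model
}
```

Defaulting to the local model keeps routine prompts off the cloud entirely, which matches the privacy-first design.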

Context Engineering (v0.5.0)

  • Progress Recitation — Agents stay on track with injected step summaries (prevents Lost-in-the-Middle)
  • Error Learning — Full error context preserved so agents learn from and avoid repeated mistakes
  • KV-Cache Optimization — Deterministic tool ordering and a stable prompt prefix enable prompt caching, cutting cached input-token costs by up to 90% on Anthropic
  • File-based Memory — Large results externalized to temp files, keeping the context window lean
  • Multi-Ollama Load Balancing — Distribute workload across multiple Ollama instances on your local network
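The KV-cache point rests on one property: the prompt prefix must be byte-identical across turns so the provider can reuse cached attention states. A minimal sketch of that idea — sorting tool definitions deterministically instead of using registration order (this structure is an assumption, not Mingly's actual prompt builder):

```typescript
// Sketch: build a byte-stable prompt prefix regardless of tool registration
// order, so provider-side prompt caching can hit on every turn.
interface ToolDef { name: string; description: string }

function buildStablePrompt(system: string, tools: ToolDef[]): string {
  const ordered = [...tools].sort((a, b) => a.name.localeCompare(b.name)); // deterministic order
  const toolBlock = ordered.map((t) => `${t.name}: ${t.description}`).join("\n");
  return `${system}\n\n# Tools\n${toolBlock}`;
}
```

Any nondeterminism in the prefix (timestamps, unsorted maps) invalidates the cache and forfeits the savings, which is why stability is treated as a first-class constraint.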

Infrastructure

  • Service Discovery — Finds RAG and MCP servers on local machine, network, and cloud
  • Hybrid Orchestration — Local LLM detects cloud needs, delegates with your approval
  • Knowledge Base — Index your local documents for context-aware AI responses (RAG with custom server naming)
  • Auto-Updates — Built-in updater with tier-aware download (Pro+ auto-install, Free manual)
  • Server Mode — Share AI access across your network via REST + WebSocket API
  • Multi-Backend Routing — Load balance across multiple Ollama instances with health checks and automatic failover
  • DocMind Integration — MCPO + RAG context injection for document intelligence
  • Integrations — Slack, Notion, Obsidian + custom workflows
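The Multi-Backend Routing bullet above can be sketched as a round-robin balancer that skips backends failing their health check. This is a simplified illustration under assumed types — the real implementation's health-check protocol and failover policy are not shown here.

```typescript
// Hypothetical sketch of multi-backend routing: round-robin over Ollama
// instances, skipping any that are currently unhealthy (automatic failover).
interface Backend { url: string; healthy: boolean }

class Balancer {
  private next = 0;
  constructor(private backends: Backend[]) {}

  pick(): Backend {
    // Scan at most one full cycle, starting from the round-robin cursor.
    for (let i = 0; i < this.backends.length; i++) {
      const b = this.backends[(this.next + i) % this.backends.length];
      if (b.healthy) {
        this.next = (this.next + i + 1) % this.backends.length; // advance past chosen backend
        return b;
      }
    }
    throw new Error("no healthy Ollama backends");
  }
}
```

In practice the `healthy` flag would be refreshed by a periodic probe rather than set statically.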

Swiss AI Privacy

  • Automatic PII Protection — Names, addresses, health data, financial info detected and anonymized before reaching cloud APIs
  • 4 Privacy Modes — Shield (auto-anonymize), Vault (block PII entirely), Transparent (show what's detected), Local Only (nothing leaves your device)
  • On-Device NER — piiranha-v1 (400M ONNX model) runs locally for DE+EN entity recognition, no cloud processing
  • Swiss-Made — Built by digital opua GmbH, Walchwil, Switzerland. nDSG-aligned, your data stays yours
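Shield mode's behavior — replace detected PII with typed placeholders before a prompt leaves the device — can be illustrated with a toy detector. Mingly uses the on-device piiranha-v1 NER model for this; the regexes below are a deliberately crude stand-in for illustration only.

```typescript
// Toy Shield-mode sketch: PII spans become typed placeholders before the
// text reaches any cloud API. (Regexes stand in for the real NER model.)
function shieldPII(text: string): string {
  return text
    .replace(/[\w.+-]+@[\w-]+\.[\w.]+/g, "[EMAIL]")   // email addresses
    .replace(/\+?\d[\d ()-]{7,}\d/g, "[PHONE]");      // phone-number-like runs
}
```

The placeholder style matters: a model can still reason about "[EMAIL]" as an entity, so anonymized prompts remain usable.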

Security & Enterprise

  • Enterprise Ready — RBAC, audit logging, budget controls, GDPR/nDSG compliance, license activation
  • Activity Tracking — Token/cost analytics per provider, daily summaries, budget alerts
  • Secure — AES-256-GCM encrypted API keys, IPC input validation, CSP, rate limiting, sensitive data detection
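AES-256-GCM key storage, as named in the bullet above, is available directly from Node's `crypto` module, which Electron apps can use. A minimal round-trip sketch — Mingly's actual key derivation and storage location may differ:

```typescript
// Sketch of AES-256-GCM API-key encryption using Node's crypto module.
// GCM authenticates as well as encrypts: tampering fails at decryption.
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

interface Encrypted { iv: Buffer; tag: Buffer; data: Buffer }

function encryptApiKey(plaintext: string, key: Buffer): Encrypted {
  const iv = randomBytes(12); // 96-bit nonce, the standard size for GCM
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return { iv, tag: cipher.getAuthTag(), data };
}

function decryptApiKey(enc: Encrypted, key: Buffer): string {
  const decipher = createDecipheriv("aes-256-gcm", key, enc.iv);
  decipher.setAuthTag(enc.tag); // verify integrity before trusting the plaintext
  return Buffer.concat([decipher.update(enc.data), decipher.final()]).toString("utf8");
}
```

A fresh random nonce per encryption is essential; reusing a GCM nonce under the same key breaks both confidentiality and authenticity.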

Pricing

Mingly is open source and stays that way. Choose the plan that fits you.

  • Free — CHF 0: local models (Ollama), cloud APIs (Claude, GPT, Gemini), unlimited conversations, prompt templates
  • Pro — CHF 24/mo: everything in Free, plus Agentic Mode (ReAct + tools), Agent Comparison (parallel), Parallel Subagents
  • Team — CHF 69/user/mo: everything in Pro, plus team workspaces, RBAC & audit logs
  • Enterprise — On request: everything in Team, plus SSO (OAuth / SAML), on-premise & compliance, dedicated support & SLA

Annual plans available: Pro CHF 199/year, Team CHF 599/user/year (min. 5 users).

You pay AI provider API costs directly — Mingly charges no markup. With Ollama, everything runs locally for free.

Quick Start

Desktop App (macOS / Windows)

Download the latest installer from GitHub Releases.

From Source

git clone https://github.com/Baldri/mingly.git
cd mingly
npm install
npm run dev

Server Mode (Docker)

git clone https://github.com/Baldri/mingly.git
cd mingly
docker compose up -d

The API server starts on port 3939. See Server Documentation for details.

Documentation

  • English — Install · Configure · Usage · FAQ
  • Deutsch — Installation · Konfiguration · Nutzung · FAQ

Full documentation available on the Wiki.

Security

Security is a core design principle. See SECURITY.md for details on:

  • Encrypted API key storage (AES-256-GCM)
  • IPC input validation at the Electron security boundary
  • MCP command injection prevention (whitelist + sanitizer)
  • Prompt injection mitigation (subtask length limits, tool-call argument validation)
  • Content Security Policy (environment-aware)
  • Session concurrency limits for parallel agents
  • Navigation and window.open protection
  • RBAC with audit logging

Report vulnerabilities to security@mingly.ch or via GitHub Security Advisories.

Contributing

We welcome contributions! See CONTRIBUTING.md for development setup, coding conventions, and PR process.

License

MIT License - see LICENSE for details.


Mingly is built by digital opua GmbH, Walchwil, Switzerland. Star the repo if you find it useful!
