feat: Multi-backend AI enhancement (OpenAI-compatible + Anthropic cloud) #35
Open
sk8ersquare wants to merge 5 commits into poodle64:main
Conversation
added 5 commits on March 17, 2026 at 22:37
Adds a new backend option for AI text enhancement that works with any OpenAI-compatible API server (OMLX, LM Studio, LocalAI, etc.).

- New OpenAiCompatClient with retry logic, exponential backoff, timeout
- Config: backend selector (ollama/openai_compat), base_url, api_key fields
- Settings UI: Cloud/Local tab switcher with endpoint + model configuration
- Tauri commands: list_openai_models, enhance_openai for frontend integration
- All fields use serde(default) for backward compatibility with existing configs
- Comprehensive test coverage for client construction and API key handling

Ollama functionality unchanged — this is purely additive.
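The retry-with-exponential-backoff pattern mentioned above can be sketched as follows. This is a minimal illustration of the technique, not the PR's actual OpenAiCompatClient code; the function name and constants are hypothetical.

```rust
use std::thread::sleep;
use std::time::Duration;

/// Retry a fallible operation up to `max_attempts` times, doubling the
/// delay after each failure: base, 2*base, 4*base, ... (illustrative
/// sketch of the pattern described in the commit, not the real client).
fn retry_with_backoff<T, E>(
    max_attempts: u32,
    base_delay: Duration,
    mut op: impl FnMut() -> Result<T, E>,
) -> Result<T, E> {
    let mut attempt = 0;
    loop {
        match op() {
            Ok(v) => return Ok(v),
            Err(e) => {
                attempt += 1;
                if attempt >= max_attempts {
                    // out of retries: surface the last error
                    return Err(e);
                }
                // exponential backoff before the next attempt
                sleep(base_delay * 2u32.pow(attempt - 1));
            }
        }
    }
}
```

A real client would also cap the maximum delay and distinguish retryable errors (timeouts, 5xx) from permanent ones (bad API key).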
Adds Claude (Anthropic) as a cloud AI enhancement backend alongside the existing Ollama and OpenAI-compatible local backends.

- New AnthropicClient with Messages API integration (API version 2023-06-01)
- Auto-detection of ANTHROPIC_API_KEY from the environment
- Model selector: Haiku 4.5, Sonnet 4.6, Opus 4.6
- Settings UI: dedicated Anthropic section with API key (show/hide toggle)
- Config: anthropic_api_key, anthropic_model, anthropic_base_url fields
- Tauri commands: detect_anthropic_api_key, enhance_anthropic
- Helper link to the Anthropic console for key generation
- Backward compatible — all new config fields use serde(default)
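The ANTHROPIC_API_KEY auto-detection could look roughly like this. A hypothetical sketch: the helper name and the precedence (explicit config wins over the environment) are assumptions, not confirmed details of the PR's detect_anthropic_api_key command.

```rust
use std::env;

/// Resolve the Anthropic API key: prefer an explicitly configured key,
/// otherwise fall back to the ANTHROPIC_API_KEY environment variable.
/// Blank/whitespace values are treated as absent. (Illustrative only.)
fn resolve_anthropic_key(configured: Option<&str>) -> Option<String> {
    match configured {
        Some(k) if !k.trim().is_empty() => Some(k.to_string()),
        _ => env::var("ANTHROPIC_API_KEY")
            .ok()
            .filter(|k| !k.trim().is_empty()),
    }
}
```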
Now that multiple AI backends are supported, the settings subtitle should not reference Ollama specifically.
- Pipeline: enhancementModel now picks anthropicModel when the backend is anthropic. Previously it always sent config.enhancement.model (the local Ollama model) regardless of backend selection — cloud AI was broken.
- Tray: shows the correct model with a backend prefix (Cloud:/Local:/Local OMLX:)
- Tray: refreshes immediately after a backend switch
- UI: tabs replaced with a slide toggle (Local AI ←→ Cloud AI)
- UI: active backend pill shows what's currently running
- UI: switchToLocal remembers the last local backend
- Anthropic models: trimmed to the 3 current ones (Haiku 4.5, Sonnet 4.6, Opus 4.6)
- API key show/hide: SVG eye icon instead of broken emoji rendering
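The pipeline fix above boils down to selecting the model by backend. The real fix lives in the TypeScript pipeline store; this is a Rust sketch of the same selection logic, with hypothetical type and field names.

```rust
/// Which enhancement backend is active (mirrors the config's backend
/// selector; names are illustrative).
#[derive(Clone, Copy)]
enum Backend {
    Ollama,
    OpenAiCompat,
    Anthropic,
}

struct EnhancementConfig {
    model: String,           // local model (Ollama / OpenAI-compatible)
    anthropic_model: String, // cloud model id
}

/// Pick the model to send with an enhancement request. Before the fix,
/// the local `model` field was sent even when Anthropic was selected.
fn enhancement_model(backend: Backend, cfg: &EnhancementConfig) -> &str {
    match backend {
        Backend::Anthropic => &cfg.anthropic_model,
        Backend::Ollama | Backend::OpenAiCompat => &cfg.model,
    }
}
```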
Security hardening aligned with master-project governance (Profile 0):

- Config file: restrict to 0600 after write (contains API keys in plaintext)
- OpenAI-compat: validate the URL scheme, reject non-http(s) (OWASP M4)
- Anthropic: warn if a custom base_url is not HTTPS (credential exposure risk)
- Fix tray closure argument (compiler error on some Rust versions)
- Add .claude/rules/20-enhancement-backends.md (backend pattern rules)
- Add .github/workflows/security-audit.yaml (cargo audit + npm audit, weekly)
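The first two hardening items can be sketched as below: a scheme check that rejects non-http(s) endpoints, and a config write that restricts the file to owner read/write. This is a minimal illustration, assuming Unix; the function names are hypothetical and a real implementation would use a proper URL parser rather than a prefix check.

```rust
use std::fs;
use std::io::Write;
#[cfg(unix)]
use std::os::unix::fs::PermissionsExt;

/// Reject endpoint URLs whose scheme is not http or https, blocking
/// file://, ftp://, etc. (OWASP M4). Prefix check for brevity only.
fn validate_endpoint(url: &str) -> Result<(), String> {
    let lower = url.to_ascii_lowercase();
    if lower.starts_with("http://") || lower.starts_with("https://") {
        Ok(())
    } else {
        Err(format!("unsupported URL scheme in endpoint: {url}"))
    }
}

/// Write the config file, then restrict it to 0600, since it can
/// contain API keys in plaintext.
fn write_config_restricted(path: &std::path::Path, contents: &str) -> std::io::Result<()> {
    let mut f = fs::File::create(path)?;
    f.write_all(contents.as_bytes())?;
    drop(f); // close before changing permissions
    #[cfg(unix)]
    fs::set_permissions(path, fs::Permissions::from_mode(0o600))?;
    Ok(())
}
```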
Summary
Adds two new AI text enhancement backends alongside the existing Ollama support:

- An OpenAI-compatible backend for local servers (OMLX, LM Studio, LocalAI, etc.)
- An Anthropic (Claude) cloud backend
Both backends are purely additive. Existing Ollama functionality is unchanged. All new config fields use serde(default) for full backward compatibility with existing installations.

What's Included
Commit 1: OpenAI-compatible backend
- OpenAiCompatClient with retry logic, exponential backoff, configurable timeout (30s default)
- Config: backend selector (ollama/openai_compat), base_url, api_key fields
- Tauri commands: list_openai_models, enhance_openai

Commit 2: Anthropic cloud backend
- AnthropicClient with Messages API integration (API version 2023-06-01)
- Auto-detection of ANTHROPIC_API_KEY from the environment
- Config: anthropic_api_key, anthropic_model, anthropic_base_url fields
- Tauri commands: detect_anthropic_api_key, enhance_anthropic

Commit 3: Settings subtitle fix
Commit 4: Cloud AI switching fix + UI polish
- Pipeline previously always sent config.enhancement.model (the local Ollama model) regardless of backend selection. When Anthropic was selected, it still tried calling Ollama with the Anthropic API key.
- Tray shows the active model with a backend prefix (Cloud: claude-haiku-4-5-20251001 or Local: your-model)

Commit 5: Security hardening
- Config file restricted to 0600 after write
- OpenAI-compat URL scheme validation: rejects file://, ftp://, etc. (OWASP M4)
- Warning when a custom Anthropic base_url is not HTTPS (prevents accidental plaintext key transmission; localhost exempted)
- Adds .claude/rules/20-enhancement-backends.md documenting backend patterns
- Adds .github/workflows/security-audit.yaml (cargo audit + npm audit, weekly schedule + push/PR)

Files Changed (12 files, +1584 / -117)
- src-tauri/src/enhancement/openai_compat.rs
- src-tauri/src/enhancement/anthropic.rs
- src-tauri/src/enhancement/mod.rs
- src-tauri/src/config.rs
- src-tauri/src/lib.rs
- src-tauri/src/tray.rs
- src/lib/components/AIEnhancementSettings.svelte
- src/lib/stores/config.svelte.ts
- src/lib/stores/pipeline.svelte.ts
- src/lib/windows/Settings.svelte
- .claude/rules/20-enhancement-backends.md
- .github/workflows/security-audit.yaml

Not Included (fork-specific, excluded from this PR)
Testing
- cargo test (in src-tauri/)

Screenshots
Settings with slide toggle and active backend pill — happy to add screenshots if helpful.
Built on top of the v2026.2.7 base. Rebased cleanly onto current main (6ecfe54).