PromptPerfect is an open-source prompt optimization tool that automatically improves your LLM prompts and explains the changes.
PromptPerfect takes your draft prompts—whether vague, messy, or just a rough idea—and transforms them into high-quality, engineered prompts using AI. It doesn't just rewrite them; it teaches you why the changes were made, helping you become a better prompt engineer over time. Choose from modes like "Make it Better," "Make it Specific," or "Add Chain-of-Thought" to get exactly the result you need.
- Instant Optimization: Turn simple phrases into professional prompts in seconds.
- Detailed Explanations: Learn the "why" behind every change with educational breakdowns.
- Multiple Modes:
  - Better: General improvement for clarity and robustness.
  - Specific: Adds constraints and details to reduce hallucinations.
  - Chain-of-Thought: Structures the prompt to encourage step-by-step reasoning.
- Privacy-First: Your API keys are stored locally in your browser and never saved to our servers.
- Open Source: Built with modern web technologies, free to use and extend.
- n8n Integration: Ready-to-import workflow templates for automation (see `examples/`).
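The three modes above map naturally to distinct system prompts. A minimal sketch of that shape — the mode names mirror the feature list, but the template wording and the `buildSystemPrompt` helper are illustrative assumptions, not the code actually shipped in `src/lib/prompts.ts`:

```typescript
// Hypothetical shape of the mode-to-system-prompt mapping.
type Mode = 'better' | 'specific' | 'chain-of-thought';

const MODE_PROMPTS: Record<Mode, string> = {
  'better':
    'Rewrite the user prompt for clarity and robustness. ' +
    'Return the improved prompt plus a short explanation of each change.',
  'specific':
    'Rewrite the user prompt, adding concrete constraints, an output format, ' +
    'and scope limits to reduce hallucinations. Explain each addition.',
  'chain-of-thought':
    'Restructure the user prompt so the model reasons step by step ' +
    'before answering. Explain why each step was added.',
};

function buildSystemPrompt(mode: Mode): string {
  return MODE_PROMPTS[mode];
}
```

Keeping the templates in one plain map like this makes adding a new mode a one-line change plus a UI option.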
| Component | Technology |
|---|---|
| Framework | Next.js 16.1.6 (App Router) |
| Language | TypeScript |
| Styling | Tailwind CSS + shadcn/ui |
| AI Integration | Vercel AI SDK |
| Icons | Lucide React |
| Database | Supabase (for analytics) |
| Deployment | Vercel |
```
┌────────────────────────────────────────────────────────┐
│                      Client Layer                      │
│  ┌──────────┐  ┌────────────────┐  ┌───────────────┐   │
│  │ Landing  │  │ App (Optimizer)│  │  Chrome Ext   │   │
│  │   Page   │  │   + Library    │  │  (any page)   │   │
│  └──────────┘  └───────┬────────┘  └───────┬───────┘   │
└────────────────────────┼───────────────────┼───────────┘
                         │                   │
┌────────────────────────┴───────────────────┼───────────┐
│                API Layer                   │           │
│  ┌───────────────┐  ┌───────────────┐ ┌────┴────────┐  │
│  │ /api/optimize │  │ /api/auth/*   │ │/api/optimize│  │
│  │  (streaming)  │  │(login/signup) │ │ -sync (JSON)│  │
│  └───────┬───────┘  └───────┬───────┘ └──────┬──────┘  │
└──────────┼──────────────────┼────────────────┼─────────┘
           │                  │                │
┌──────────┼──────────────────┼────────────────┼─────────┐
│          │  Service Layer   │                │         │
│  ┌───────┴──────┐   ┌───────┴───────┐ ┌──────┴───────┐ │
│  │ lib/prompts  │   │ Supabase Auth │ │lib/providers │ │
│  │  (3 modes)   │   │               │ │(Gemini/OAI/  │ │
│  └──────────────┘   └───────────────┘ │ Anthropic)   │ │
│                                       └──────────────┘ │
│  ┌────────────────────────────────────────────────────┐│
│  │              Supabase (PostgreSQL)                 ││
│  │  optimization_logs   │  pp_optimization_history    ││
│  │  pp_user_profiles    │  pp_saved_prompts           ││
│  │  guest_usage         │  pp_users                   ││
│  └────────────────────────────────────────────────────┘│
└────────────────────────────────────────────────────────┘
```
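In the diagram, `/api/optimize-sync` is the non-streaming route that returns plain JSON, which makes it the natural entry point for scripts and automations. A hedged sketch of a client helper for it — the payload field names (`prompt`, `mode`) and the response shape are assumptions based on the diagram, not a documented API contract:

```typescript
// Hypothetical client helper for the non-streaming /api/optimize-sync route.
type OptimizeMode = 'better' | 'specific' | 'chain-of-thought';

interface OptimizePayload {
  prompt: string;
  mode: OptimizeMode;
}

// Returns the URL and fetch options for a sync optimization call.
function buildOptimizeRequest(baseUrl: string, payload: OptimizePayload) {
  return {
    url: `${baseUrl.replace(/\/$/, '')}/api/optimize-sync`,
    init: {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(payload),
    } as RequestInit,
  };
}

// Usage against a local dev server:
// const { url, init } = buildOptimizeRequest('http://localhost:3000', {
//   prompt: 'write release notes',
//   mode: 'better',
// });
// const res = await fetch(url, init);
```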
Follow these steps to run PromptPerfect locally on your machine.

Prerequisites:

- Node.js 18+ installed
- A Google Gemini API key (or an OpenAI/Anthropic key for BYOK)

1. Clone the repository:

   ```bash
   git clone https://github.com/Beagle-AI-automation/promptperfect.git
   cd promptperfect
   ```

2. Install dependencies:

   ```bash
   npm install
   ```

3. Configure environment variables. Create a `.env.local` file in the root directory and add your API keys:

   ```bash
   # Required for default provider
   GOOGLE_API_KEY=your_gemini_api_key_here

   # Optional: For analytics (Supabase)
   NEXT_PUBLIC_SUPABASE_URL=your_supabase_url
   NEXT_PUBLIC_SUPABASE_ANON_KEY=your_supabase_anon_key
   ```

4. Run the development server:

   ```bash
   npm run dev
   ```

Open http://localhost:3000 in your browser to see the result.
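Since a missing `GOOGLE_API_KEY` is the most likely setup failure, a quick startup check can save debugging time. A small sketch — the split between required and optional variables follows the `.env.local` example above; the `checkEnv` helper itself is illustrative, not part of the codebase:

```typescript
// Sanity-check environment variables before starting. GOOGLE_API_KEY is
// required for the default provider; the Supabase variables are optional
// (analytics only), so their absence is only a warning.
function checkEnv(env: Record<string, string | undefined>): {
  missing: string[];
  warnings: string[];
} {
  const required = ['GOOGLE_API_KEY'];
  const optional = ['NEXT_PUBLIC_SUPABASE_URL', 'NEXT_PUBLIC_SUPABASE_ANON_KEY'];
  return {
    missing: required.filter((name) => !env[name]),
    warnings: optional.filter((name) => !env[name]),
  };
}

// Example: checkEnv(process.env).missing is [] once .env.local is loaded.
```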
PromptPerfect includes ready-to-import n8n workflow templates for automating prompt optimization in your workflows.
Quick Start:
1. Import `examples/n8n-optimize-prompt.json` into n8n
2. Configure your PromptPerfect URL
3. Start automating!

See `examples/README.md` for full documentation and advanced use cases.
You can deploy your own instance of PromptPerfect to Vercel with a single click using the "Deploy with Vercel" button.
We welcome contributions — bug fixes, docs, and features are all appreciated. See CONTRIBUTING.md for local setup, branch naming (`PP-XXX/description`), commit format, PR expectations (`npx vitest run`, `npx tsc --noEmit`), and code style.
**What is PromptPerfect?**

PromptPerfect is an open-source prompt optimization tool. Paste any LLM prompt, pick an optimization mode, and get an improved version with explanations of what changed and why. It runs in your browser — no install needed.
**Which LLM providers are supported?**

PromptPerfect supports OpenAI (GPT-4, GPT-3.5), Anthropic (Claude), and Google (Gemini). You bring your own API key. The key is sent directly from your browser to the provider — it never touches our servers.
**Is my API key safe?**

Yes. Your API key is sent from your browser directly to the LLM provider's API. It is not stored, logged, or transmitted to any other server. You can verify this in the source code — the API route proxies the request without persisting the key.
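The pass-through described above can be pictured as a helper that takes the caller-supplied key, puts it on the single outgoing provider request, and never writes it anywhere else. A minimal sketch — the header name and the `buildUpstreamInit` helper are illustrative assumptions, not the app's actual code:

```typescript
// Build the upstream provider request from the caller-supplied key.
// The key is used for this one outgoing call and is never logged,
// stored, or forwarded elsewhere. Header name is illustrative
// (Gemini-style); OpenAI/Anthropic would use 'Authorization: Bearer'
// or 'x-api-key' instead.
function buildUpstreamInit(clientKey: string, payload: unknown): RequestInit {
  return {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'x-goog-api-key': clientKey,
    },
    body: JSON.stringify(payload),
  };
}
```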
**How do I add a new optimization mode?**

Add a new prompt string to `src/lib/prompts.ts` and a corresponding option in the `ModeSelector` component. See CONTRIBUTING.md for conventions.
**How is PromptPerfect different from DSPy or PromptFoo?**

DSPy is a framework for programmatic prompt optimization in Python pipelines. PromptFoo is a CLI tool for evaluating and testing prompts. PromptPerfect is a web-based tool for manually improving individual prompts with explanations — more like Grammarly for prompts than a testing framework.
**Can I deploy my own instance?**

Yes. Click the "Deploy with Vercel" button in this README. You'll need a Gemini API key (free from ai.google.dev). The whole setup takes under 5 minutes.
This project is licensed under the MIT License - see the LICENSE file for details.
Built by the Beagle Builder Program