2 changes: 2 additions & 0 deletions .gitignore
@@ -62,6 +62,8 @@ logs/
*.bak
*.tmp
Show-DirectoryTree.ps1
shotgun.desktop
run-shotgun.sh

# If you have specific large files or data that shouldn't be in git
# data/
169 changes: 83 additions & 86 deletions README.md
@@ -2,82 +2,77 @@

![Shotgun App Banner](https://github.com/user-attachments/assets/6dd15389-4ad9-493a-a0e7-9813eb143e38)

**Tired of Cursor cutting off context, missing your files, and spitting out empty responses?**

**Shotgun** is the bridge between your local codebase and the world's most powerful LLMs.
It doesn't just copy files; it **intelligently packages your project context** and can **execute prompts directly** against OpenAI (GPT-4o/GPT-5), Google Gemini, or OpenRouter.

> **Stop copy-pasting 50 files manually.**
> 1. Select your repo.
> 2. Let AI pick the relevant files (Auto-Context).
> 3. Blast the payload directly to the model or copy it for use in Cursor/Windsurf.
Shotgun is a desktop app for preparing project context and working with LLMs.
It collects selected files into a structured context, can auto‑select relevant files, and can send prompts directly to OpenAI, Gemini, or OpenRouter.

---

## 1. What Shotgun Does
Shotgun is a desktop power-tool that **explodes your project into a structured payload** designed for AI reasoning.

It has evolved from a simple "context dumper" into a full-fledged **LLM Client for Codebases**:
* **Smart Selection:** Uses AI ("Auto-Context") to analyze your task and automatically select only the relevant files from your tree.
* **Direct Execution:** Configurable API integration with **OpenAI**, **Gemini**, and **OpenRouter**.
* **Prompt Engineering:** Built-in templates for different roles (Developer, Architect, Bug Hunter).
* **History & Audit:** Keeps a full log of every prompt sent and response received.
## What Shotgun Does
Shotgun turns your repository into an LLM‑friendly context:
- builds a file tree and collects file contents;
- respects `.gitignore` and custom rules;
- shows size and token counts per file/folder;
- outputs XML‑like blocks `<file path="...">...</file>`.
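Ignore-rule matching as listed above can be illustrated with a minimal Go sketch using `filepath.Match`. This is a simplification, not Shotgun's actual code: full `.gitignore` semantics (negation, anchoring, directory-only patterns) are richer.

```go
package main

import (
	"fmt"
	"path/filepath"
)

// ignored reports whether a path's base name matches any ignore pattern.
// Simplified sketch: real .gitignore matching also handles negation,
// anchored patterns, and directory-only rules.
func ignored(path string, patterns []string) bool {
	for _, p := range patterns {
		if ok, _ := filepath.Match(p, filepath.Base(path)); ok {
			return true
		}
	}
	return false
}

func main() {
	patterns := []string{"*.tmp", "node_modules"}
	fmt.Println(ignored("build/cache.tmp", patterns)) // true
	fmt.Println(ignored("backend/main.go", patterns)) // false
}
```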

---

## 2. Key Features
## Key Features

### 🧠 AI-Powered Context
* **Auto-Context:** Don't know which files are needed for a bug fix? Type your task, and Shotgun uses an LLM to scan your tree and select the relevant files for you.
* **Repo Scan:** supplement context retrieval with a `shotgun_reposcan.md` summary of your architecture to give the LLM high-level awareness before diving into code.
### 🧠 Auto Context
- **AutoContext**: the LLM selects relevant files based on your task.
- **Repo Scan**: optional `shotgun_reposcan.md` for high-level architecture context.

### ⚡ Workflow Speed
* **Fast Tree Scan:** Go + Wails backend scans thousands of files in milliseconds.
* **Interactive Tree:** Manually toggle files/folders or use `.gitignore` and custom rule sets to filter noise.
* **One-Click Blast:** Generate a massive context payload instantly.
### ⚡ Large‑project performance
- **Multiple project roots**: add several root folders in one workspace.
- **Project tree** with manual selection.
- **Symlink traversal** with cycle protection.
- **Offline token counting** (`o200k_base`) without API calls.
- **Context size limit** via `settings.json` (`maxContextBytes`, default 50MB).
- **Auto context generation** with debounce + queue (no constant restarts).

### 🔌 Direct Integrations
* **OpenAI:** Support for GPT-4o and experimental support for **GPT-5** family models.
* **Google Gemini:** Native integration for Gemini 2.5/3 Pro & Flash.
* **OpenRouter:** Access hundreds of LLMs via a unified API.
### 🔌 Integrations
Supported providers:
- OpenAI
- Google Gemini
- OpenRouter

### 🛠 Developer Experience
* **Prompt Templates:** Switch modes easily (e.g., "Find Bug" vs "Refactor" vs "Write Docs").
* **History Tracking:** Never lose a generated patch. Browse past prompts, responses, and raw API payloads.
* **Privacy Focused:** Your code goes only to the API provider you choose. No intermediate servers.
You can pick a model and run requests directly from the app.
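To illustrate what a direct provider call involves, here is a sketch that builds an OpenAI-style chat completion request. The `buildChatRequest` helper is hypothetical, not Shotgun's client code; the endpoint and JSON shape follow OpenAI's public chat completions API.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
)

// buildChatRequest prepares (but does not send) an OpenAI chat completion
// request. Hypothetical helper for illustration only.
func buildChatRequest(model, prompt string) (*http.Request, error) {
	body, err := json.Marshal(map[string]any{
		"model":    model,
		"messages": []map[string]string{{"role": "user", "content": prompt}},
	})
	if err != nil {
		return nil, err
	}
	req, err := http.NewRequest("POST",
		"https://api.openai.com/v1/chat/completions", bytes.NewReader(body))
	if err != nil {
		return nil, err
	}
	req.Header.Set("Authorization", "Bearer "+os.Getenv("OPENAI_API_KEY"))
	req.Header.Set("Content-Type", "application/json")
	return req, nil
}

func main() {
	req, err := buildChatRequest("gpt-4o", "Summarize this repo")
	if err != nil {
		panic(err)
	}
	fmt.Println(req.Method, req.URL.Host)
}
```

Sending the request and decoding the response (choices, token usage) is left out; the point is that the context payload travels straight to the provider you configure.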

---

## 3. The Workflow

Shotgun guides you through a 3-step process:
## Workflow

### Step 1: Prepare Context
* **Select Project:** Open your local repository.
* **Filter:** Use the checkbox tree, `.gitignore`, or the **Auto-Context** button to define the scope.
* **Repo Scan:** Edit or load the high-level repository summary for better AI grounding.
* **Result:** A structured XML-like dump of your selected codebase.
1. Add one or more project roots.
2. Select files manually or use AutoContext.
3. Context is generated automatically with progress.
4. **Generated Project Context** shows an exact token count.

### Step 2: Compose & Execute
* **Define Task:** Describe what you need (e.g., "Refactor the auth middleware to use JWT").
* **Select Template:** Choose a persona (Dev, Architect, QA).
* **Execute:** Click **"Execute Prompt"** to send it to the configured LLM API immediately, OR copy the full payload to your clipboard for use in external tools like ChatGPT or Cursor.
1. Describe your task.
2. Choose a prompt template.
3. Execute the request or copy the prompt.

### Step 3: History & Apply
* **Review:** View the AI's response alongside your original prompt.
* **Diffs:** The AI output is optimized for `diff` generation.
* **Audit:** Inspect raw API calls for debugging or token usage analysis.
### Step 3: History
History of requests, responses, and call parameters.

---

## 4. Installation
## UI
- **Resizable sidebar** (drag to resize).
- **Resizable bottom console**.

### Prerequisites
* **Go ≥ 1.20**
* **Node.js LTS**
* **Wails CLI:** `go install github.com/wailsapp/wails/v2/cmd/wails@latest`
---

### Clone & Build
## Installation & Run

### Requirements
- **Go ≥ 1.20**
- **Node.js LTS**
- **Wails CLI**
`go install github.com/wailsapp/wails/v2/cmd/wails@latest`

### Build & Run
```bash
git clone https://github.com/glebkudr/shotgun_code
cd shotgun_code
@@ -87,32 +82,42 @@ cd frontend
npm install
cd ..

# Run in Development Mode (Hot Reload)
# Dev mode (hot reload)
wails dev

# Build Production Binary
# Production build
wails build
```
*Binaries will be located in `build/bin/`.*
Binaries are placed in `build/bin/`.

---

## 5. Configuration
## Configuration

### LLM Setup
Click the **Settings** (gear icon) in the app to configure providers:
1. **Provider:** Select OpenAI, Gemini, or OpenRouter.
2. **API Key:** Paste your key (stored locally).
3. **Model:** Select your preferred model (e.g., `gpt-4o`, `gemini-2.5-pro`, `claude-3.5-sonnet`).
### Settings file
Settings are stored at the XDG config path `shotgun-code/settings.json`.
On Linux this is typically:
```
~/.config/shotgun-code/settings.json
```

### Custom Rules
You can define global excludes (like `node_modules`, `dist`, `.git`) and custom prompt instructions that are appended to every request.
Example:
```json
{
  "maxContextBytes": 50000000,
  "customIgnoreRules": "...",
  "customPromptRules": "..."
}
```

---
### Key options
- **`maxContextBytes`** — context size limit (default 50MB).
- **`customIgnoreRules`** — file exclusion rules.
- **`customPromptRules`** — global prompt rules.

## 6. Output Format
---

Shotgun generates context optimized for LLM parsing:
## Output Format

```xml
<file path="backend/main.go">
package main
...
</file>
```

This format allows models to understand file boundaries perfectly, enabling accurate multi-file refactoring suggestions.

---

## 7. ⚖️ License & Usage

My name is Gleb Curly, and I am an indie developer making software for a living.

Shotgun is developed and maintained by **Curly's Technology Tmi**.

This project uses a **Community License** model:

### 1. Free for Small Teams & Non-Commercial Use
You can use Shotgun for free (including modification and internal use) if:
- Your company/team generates **less than $1M USD** in annual revenue.
- You do **not** use the code to build a competing public product.
## License

### 2. Commercial License (Enterprise)
If your annual revenue exceeds **$1M USD**, you are required to purchase a commercial license (reasonably priced).
Shotgun is developed and maintained by **Curly's Technology Tmi**.

Please contact me at **glebkudr@gmail.com** for pricing.
### 1. Free for small teams and non‑commercial use
You may use Shotgun for free if:
- your annual revenue is **below $1M**;
- you do **not** use the code to build a competing product.

See [LICENSE.md](LICENSE.md) for the full legal text.
### 2. Commercial license
If your revenue exceeds $1M, a commercial license is required.
Contact: **glebkudr@gmail.com**
See `LICENSE.md` for details.

---

*Shotgun — Load, Aim, Blast your code into the future.*
153 changes: 153 additions & 0 deletions README_ru.md
@@ -0,0 +1,153 @@
# Shotgun App

![Shotgun App Banner](https://github.com/user-attachments/assets/6dd15389-4ad9-493a-a0e7-9813eb143e38)

Shotgun is a desktop app for preparing project context and working with LLMs.
It collects selected files into a structured context, can automatically select relevant files, and can send a finished prompt directly to OpenAI, Gemini, or OpenRouter.

---

## What Shotgun Does
Shotgun turns your repository into an LLM-friendly context:
- collects the file tree and file contents;
- respects `.gitignore` and custom rules;
- shows size and token counts per file/folder;
- produces context in the XML-like format `<file path="...">...</file>`.

---

## Key Features

### 🧠 Auto Context
- **Auto-Context**: the LLM selects files itself based on your task description.
- **Repo Scan**: optional `shotgun_reposcan.md` with a high-level architecture overview.

### ⚡ Fast work with large projects
- **Multiple root folders**: several project roots can be added to one workspace.
- **Project tree** with manual selection of files and folders.
- **Symlinks** are traversed recursively with cycle protection.
- **Offline token counting** (`o200k_base` encoding) without API calls.
- **Context size limit** via `settings.json` (`maxContextBytes`, default 50MB).
- **Auto context generation** with debounce and a queue (no constant restarts).
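The `maxContextBytes` limit above can be illustrated with a minimal sketch; the `addFile` helper is hypothetical, not Shotgun's actual code:

```go
package main

import "fmt"

// addFile appends a file's content to the context only while the total
// stays within maxContextBytes. Returns the (possibly unchanged) context
// and whether the file was included.
func addFile(ctx, content []byte, maxContextBytes int) ([]byte, bool) {
	if len(ctx)+len(content) > maxContextBytes {
		return ctx, false // skipped: would exceed the limit
	}
	return append(ctx, content...), true
}

func main() {
	var ctx []byte
	ctx, ok := addFile(ctx, make([]byte, 30), 50)
	fmt.Println(ok) // first file fits
	_, ok = addFile(ctx, make([]byte, 30), 50)
	fmt.Println(ok) // second file would exceed 50 bytes
}
```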

### 🔌 Integrations
Supported providers:
- OpenAI
- Google Gemini
- OpenRouter

You can pick a model and run requests directly from the app.

---

## Workflow

### Step 1: Prepare Context
1. Add one or more project roots.
2. Select files manually or via Auto-Context.
3. The context is generated automatically with a progress bar.
4. The **Generated Project Context** block shows an exact token count.

### Step 2: Compose & Execute
1. Describe your task.
2. Choose a prompt template.
3. Execute the request or copy the prompt.

### Step 3: History
History of requests, responses, and call parameters.

---

## UI
- **Sidebar** width is adjustable by dragging.
- **Bottom console** height is adjustable.

---

## Installation & Run

### Requirements
- **Go ≥ 1.20**
- **Node.js LTS**
- **Wails CLI**
`go install github.com/wailsapp/wails/v2/cmd/wails@latest`

### Build & Run
```bash
git clone https://github.com/glebkudr/shotgun_code
cd shotgun_code

# Install frontend dependencies
cd frontend
npm install
cd ..

# Dev mode (hot reload)
wails dev

# Production build
wails build
```
Binaries are placed in `build/bin/`.

---

## Configuration

### Settings file
Settings are stored at the XDG path `shotgun-code/settings.json`.
On Linux this is typically:
```
~/.config/shotgun-code/settings.json
```

Example:
```json
{
  "maxContextBytes": 50000000,
  "customIgnoreRules": "...",
  "customPromptRules": "..."
}
```

### Key options
- **`maxContextBytes`**: context size limit (default 50MB).
- **`customIgnoreRules`**: file exclusion rules.
- **`customPromptRules`**: global prompt rules.

---

## Output Format

```xml
<file path="backend/main.go">
package main
...
</file>

<file path="frontend/src/App.vue">
<template>
...
</template>
</file>
```

---

## License

Shotgun is developed and maintained by **Curly's Technology Tmi**.

### 1. Free for small teams and non-commercial use
You may use Shotgun for free if:
- your company's annual revenue is **below $1M**;
- you do not use the code to build a competing product.

### 2. Commercial license
If revenue exceeds $1M, a commercial license is required.
Contact: **glebkudr@gmail.com**
See `LICENSE.md` for details.

---

*Shotgun — Load, Aim, Blast your code into the future.*