10 changes: 6 additions & 4 deletions .github/pull_request_template.md
Original file line number Diff line number Diff line change
@@ -38,10 +38,12 @@ Fixes #(issue number)
## πŸ§ͺ Testing

<!-- How did you test your changes? -->
- [ ] Ran `uv run pytest`
- [ ] Tested intelligent scraper with: `scapo scrape run --sources [source] --limit 5`
- [ ] Verified LLM processing worked correctly
- [ ] Checked that only AI/ML content was processed
- [ ] Tested service discovery: `uv run scapo scrape discover --update`
- [ ] Tested targeted scraping: `uv run scapo scrape targeted --service "[service]" --limit 10`
- [ ] Tested batch processing: `uv run scapo scrape batch --category [category] --limit 10 --batch-size 2`
- [ ] Verified LLM extraction quality (checked generated files in `models/`)
- [ ] Tested TUI explorer: `uv run scapo tui`
- [ ] For OpenRouter users: Updated context cache with `uv run scapo update-context`
- [ ] Other testing (please describe):

## πŸ“Έ Screenshots (if applicable)
30 changes: 23 additions & 7 deletions QUICKSTART.md
@@ -23,6 +23,9 @@ cp .env.example .env
LLM_PROVIDER=openrouter
OPENROUTER_API_KEY=sk-or-v1-your-key-here # Get from openrouter.ai
OPENROUTER_MODEL=your_model

# IMPORTANT: Update model context cache for accurate batching
scapo update-context # Without this, defaults to 4096 tokens (poor performance!)
```

#### Option B: Ollama (Local)
@@ -117,27 +120,27 @@ scapo models search "copilot" # Search for specific models
cat models/audio/eleven-labs/cost_optimization.md
```

### 5. (Optional) Use with Claude Desktop
### 5. (Optional) MCP Server - Query Your Extracted Tips

**Important:** The MCP server only reads tips you've already extracted. Run the scrapers first (Steps 3-4) to populate the `models/` folder!

Add SCAPO as an MCP server to query your extracted tips (from models/ folder) directly in Claude:
Add SCAPO as an MCP server to query your extracted tips directly in MCP-compatible clients:

```json
// Add to claude_desktop_config.json
// Add to config.json
{
"mcpServers": {
"scapo": {
"command": "npx",
"args": ["@scapo/mcp-server"],
"args": ["@arahangua/scapo-mcp-server"],
"env": {
"SCAPO_MODELS_PATH": "path/to/scapo/models"
"SCAPO_MODELS_PATH": "/path/to/scapo/models"
}
}
}
}
```
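A common stumbling block is pointing `SCAPO_MODELS_PATH` at an empty or wrong directory. Here is a quick, illustrative Python check you could run before configuring the server (the two-level `category/service` layout is assumed from the structure shown in "Understanding the Output"):

```python
from pathlib import Path

def count_extracted_services(models_path: str) -> int:
    """Count service folders under a SCAPO models/ directory."""
    root = Path(models_path)
    if not root.is_dir():
        return 0
    # Each service lives at models/<category>/<service>/
    return sum(1 for p in root.glob("*/*") if p.is_dir())

# Example (assumed path - use the same value you put in SCAPO_MODELS_PATH):
print(count_extracted_services("/path/to/scapo/models"))
```

If this prints `0`, run the scrapers from Steps 3-4 before expecting the MCP server to return anything.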

Then ask Claude: "Get best practices for Midjourney" - no Python needed!

## πŸ“Š Understanding the Output

SCAPO creates organized documentation:
@@ -151,6 +154,19 @@ models/
β”‚ └── parameters.json # Recommended settings
```

## βš™οΈ Utility Commands

```bash
# Update OpenRouter model context cache (for accurate batching)
scapo update-context # Updates if >24h old
scapo update-context -f # Force update

# View extracted tips
scapo tui # Interactive TUI explorer
scapo models list # List all extracted models
scapo models search "copilot" # Search for specific models
```

## βš™οΈ The --limit flag

```bash
26 changes: 17 additions & 9 deletions README.md
@@ -103,6 +103,13 @@ cp .env.example .env
Get your API key from [openrouter.ai](https://openrouter.ai/)
* You can also use local LLMs (Ollama, LMstudio); check [QUICKSTART.md](./QUICKSTART.md)

#### Important: Update Model Context Cache (OpenRouter users)
```bash
# REQUIRED for optimal performance - fetches accurate token limits
scapo update-context # Creates cache for faster processing
```
Without this, SCAPO defaults to 4096 tokens (severely limiting batch efficiency).


### 3. Start Extracting Optimization Tips

@@ -274,28 +281,24 @@ MAX_POSTS_PER_SCRAPE=100 # Limit per source
```
Hand-wavy breakdown: with 5 posts, extraction success is ~20%; with 20+ posts, it jumps to ~80%.

## πŸ€– MCP Server for Claude Desktop
## πŸ€– MCP Server (Optional Reader)

Query your extracted tips directly in Claude (reads from models/ folder - run scrapers first!):
**Note:** The MCP server is a reader that queries your already-extracted tips. You must run SCAPO scrapers first to populate the models/ folder!

```json
// Add to %APPDATA%\Claude\claude_desktop_config.json (Windows)
// or ~/Library/Application Support/Claude/claude_desktop_config.json (macOS)
// Add to your client's mcp config.json
{
"mcpServers": {
"scapo": {
"command": "npx",
"args": ["@scapo/mcp-server"],
"args": ["@arahangua/scapo-mcp-server"],
"env": {
"SCAPO_MODELS_PATH": "C:\\path\\to\\scapo\\models" // Your models folder
}
}
}
}
```

Then ask Claude: "Get me best practices for GitHub Copilot" or "What models are good for coding?"

See [mcp/README.md](mcp/README.md) for full setup and available commands.

## 🎨 Interactive TUI
@@ -378,7 +381,12 @@ Built as part of the CZero Engine project to improve AI application development.
- [OpenRouter](https://openrouter.ai/) for accessible AI APIs
- Coffee β˜• for making this possible
- [Ollama](https://ollama.com/) and [LMstudio](https://lmstudio.ai/) for awesome local LLM experience
- [Awesome Generative AI](https://github.com/steven2358/awesome-generative-ai) & [Awesome AI Tools](https://github.com/mahseema/awesome-ai-tools) for service discovery
- Service discovery powered by awesome lists:
- [steven2358/awesome-generative-ai](https://github.com/steven2358/awesome-generative-ai)
- [mahseema/awesome-ai-tools](https://github.com/mahseema/awesome-ai-tools)
- [filipecalegario/awesome-generative-ai](https://github.com/filipecalegario/awesome-generative-ai)
- [aishwaryanr/awesome-generative-ai-guide](https://github.com/aishwaryanr/awesome-generative-ai-guide)
- [eudk/awesome-ai-tools](https://github.com/eudk/awesome-ai-tools)
- All opensource contributors in this space

---
99 changes: 52 additions & 47 deletions mcp/README.md
@@ -1,62 +1,76 @@
# SCAPO MCP Server

A Model Context Protocol (MCP) server for querying AI/ML best practices from the SCAPO (Stay Calm and Prompt On) knowledge base. Features intelligent fuzzy matching for improved user experience when querying model information.
A Model Context Protocol (MCP) server that makes your locally-extracted SCAPO knowledge base queryable.

⚠️ **This is a reader, not a scraper!** You must first use [SCAPO](https://github.com/czero-cc/scapo) to extract tips into your `models/` folder.

## Documentation

For comprehensive usage instructions, examples, and technical details, please see the **[Usage Guide](usage-guide.md)**.

## Installation
## Prerequisites

1. **Clone and set up SCAPO first**:
```bash
git clone https://github.com/czero-cc/scapo.git
cd scapo
# Follow SCAPO setup to run scrapers and populate models/
```

2. **Required**:
- Node.js 18+
- npm or npx
- Populated `models/` directory (from running SCAPO scrapers)

## How It Works

**IMPORTANT**: This MCP server ONLY reads from your local `models/` folder. It does NOT scrape data itself!

1. First, use SCAPO to scrape and extract tips into `models/`
2. Then, this MCP server makes those tips queryable in your AI client

You can use this MCP server directly with `npx` (no Python required):
## Quick Start

```bash
npx @scapo/mcp-server
# Step 1: Set up SCAPO and extract tips
git clone https://github.com/czero-cc/scapo.git
cd scapo
# Follow SCAPO README to configure and run scrapers
scapo scrape targeted --service "GitHub Copilot" --limit 20

# Step 2: Configure MCP to read your extracted tips
# Add to your MCP client config with YOUR path to scapo/models/
```

Or install it globally:
## Installation

```bash
npm install -g @scapo/mcp-server
npx @arahangua/scapo-mcp-server
```

## Usage with Claude Desktop

Add this to your Claude Desktop configuration file:
## Configuration for MCP Clients

### Windows
Edit `%APPDATA%\Claude\claude_desktop_config.json`:
Add this to your MCP client's configuration:

```json
{
"mcpServers": {
"scapo": {
"command": "npx",
"args": ["@scapo/mcp-server"],
"args": ["@arahangua/scapo-mcp-server"],
"env": {
"SCAPO_MODELS_PATH": "C:\\path\\to\\scapo\\models"
"SCAPO_MODELS_PATH": "/absolute/path/to/your/scapo/models" // From your cloned SCAPO repo!
}
}
}
}
```

### macOS
Edit `~/Library/Application Support/Claude/claude_desktop_config.json`:
**Note:** Set `SCAPO_MODELS_PATH` to the absolute path of your SCAPO models directory.

```json
{
"mcpServers": {
"scapo": {
"command": "npx",
"args": ["@scapo/mcp-server"],
"env": {
"SCAPO_MODELS_PATH": "/path/to/scapo/models"
}
}
}
}
```
For Claude Desktop specifically:
- Windows: Edit `%APPDATA%\Claude\claude_desktop_config.json`
- macOS: Edit `~/Library/Application Support/Claude/claude_desktop_config.json`

## Available Tools

@@ -89,7 +103,7 @@ List all available models by category.

```
Arguments:
- category: Model category ("text", "image", "video", "audio", "multimodal", "all")
- category: Model category ("text", "image", "video", "audio", "multimodal", "code", "all")
```
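A category listing like this can be implemented as a simple walk of the `models/` tree. An illustrative Python sketch of the idea (the real server is Node.js, so this is an analogous illustration, not its actual code):

```python
from pathlib import Path

def list_models(models_path: str, category: str = "all") -> list[str]:
    """List service folders under models/, optionally filtered by category."""
    root = Path(models_path)
    if not root.is_dir():
        return []
    if category == "all":
        categories = [c.name for c in root.iterdir() if c.is_dir()]
    else:
        categories = [category]
    found: list[str] = []
    for cat in categories:
        cat_dir = root / cat
        if cat_dir.is_dir():
            # Each subfolder is one service, e.g. models/code/coderabbit/
            found.extend(sorted(p.name for p in cat_dir.iterdir() if p.is_dir()))
    return found
```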

Example in Claude:
@@ -114,12 +128,14 @@
## Features

- **Intelligent Fuzzy Matching**: Handles typos, partial names, and variations automatically
- **No Python Required**: Pure Node.js implementation using npx
- Typo tolerance: `heygen` β†’ "HeyGen", `gemeni` β†’ "Gemini"
- Partial matching: `qwen` β†’ finds all Qwen variants
- Case insensitive: `LLAMA-3` β†’ "llama-3"
- **Fully Standalone**: Works without any API server running
- **Direct File Access**: Reads from local model files
- **Smart Search**: Advanced search with similarity scoring
- **Smart Recommendations**: Suggests models based on use case
- **Easy Integration**: Works seamlessly with Claude Desktop
- **Easy Integration**: Works with any MCP-compatible client
- **Helpful Suggestions**: Provides alternatives when exact matches aren't found
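The fuzzy matching described above can be approximated in a few lines of Python with the standard library's `difflib`; this is only an illustrative sketch of the behavior, not the server's actual Node.js scoring code:

```python
from difflib import get_close_matches

def fuzzy_find(query: str, known_models: list[str]) -> list[str]:
    """Return known model names that loosely match a (possibly misspelled) query."""
    names = {m.lower(): m for m in known_models}  # case-insensitive lookup
    # Typo tolerance: close string matches (e.g. 'gemeni' -> 'gemini')
    hits = get_close_matches(query.lower(), list(names), n=5, cutoff=0.6)
    # Partial matching: substring hits (e.g. 'qwen' -> all Qwen variants)
    hits += [k for k in names if query.lower() in k and k not in hits]
    return [names[h] for h in hits]

print(fuzzy_find("gemeni", ["Gemini", "HeyGen", "Qwen-2", "Qwen-2.5", "Llama-3"]))
```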

## Use Cases
Expand Down Expand Up @@ -150,24 +166,13 @@ models/
└── ...
```

## Advantages Over Python Version

1. **No Python Setup**: Works with just Node.js (which Claude Desktop already has)
2. **Simple npx Usage**: One command to run, no installation needed
3. **Better IDE Integration**: Works seamlessly with Cursor and other IDEs
4. **Faster Startup**: Node.js starts faster than Python
5. **Native JSON Handling**: Better performance for JSON operations

## Contributing

To publish updates to npm:

```bash
cd mcp
npm version patch # or minor/major
npm publish --access public
```
To contribute improvements:
1. Fork the [SCAPO repository](https://github.com/czero-cc/SCAPO)
2. Make your changes in the `mcp/` directory
3. Submit a pull request

## License

Same as the parent SCAPO repository.
Same as the parent [SCAPO](https://github.com/czero-cc/SCAPO) repository.
4 changes: 2 additions & 2 deletions mcp/package-lock.json


6 changes: 3 additions & 3 deletions mcp/package.json
@@ -1,6 +1,6 @@
{
"name": "@scapo/mcp-server",
"version": "1.0.0",
"name": "@arahangua/scapo-mcp-server",
"version": "1.0.4",
"description": "Stay Calm and Prompt On (SCAPO) - MCP server for AI/ML best practices",
"main": "index.js",
"type": "module",
@@ -27,4 +27,4 @@
"engines": {
"node": ">=18.0.0"
}
}
}
16 changes: 6 additions & 10 deletions mcp/usage-guide.md
@@ -13,15 +13,11 @@ The SCAPO MCP (Model Context Protocol) server provides intelligent access to AI/
### Installation

```bash
# Clone the repository
git clone https://github.com/your-org/scapo.git
cd scapo/mcp
# After installing SCAPO, the MCP server can be used directly
npx @arahangua/scapo-mcp-server

# Install dependencies
npm install

# Or run directly with npx (no installation needed)
npx @scapo/mcp-server
# Or install globally
npm install -g @arahangua/scapo-mcp-server
```

### Basic Setup
@@ -43,8 +39,8 @@ npm start
{
"mcpServers": {
"scapo": {
"command": "node",
"args": ["/path/to/scapo/mcp/index.js"],
"command": "npx",
"args": ["@arahangua/scapo-mcp-server"],
"env": {
"SCAPO_MODELS_PATH": "/path/to/scapo/models"
}
13 changes: 13 additions & 0 deletions models/code/coderabbit/metadata.json
@@ -0,0 +1,13 @@
{
"service": "CodeRabbit",
"category": "code",
"last_updated": "2025-08-16T18:29:24.327526",
"extraction_timestamp": "2025-08-16T18:10:37.657193",
"data_sources": [
"Reddit API",
"Community discussions"
],
"posts_analyzed": 104,
"confidence": "medium",
"version": "1.0.0"
}
13 changes: 13 additions & 0 deletions models/code/coderabbit/prompting.md
@@ -0,0 +1,13 @@
# CodeRabbit Prompting Guide

*Last updated: 2025-08-16*

## Tips & Techniques

- Read more here: https://coderabbit.ai/blog/coderabbit-openai-rate-limits
- This post discusses how CodeRabbit uses FluxNinja Aperture to manage OpenAI's rate limits and ensure optimal operation even during peak load.

## Sources

- Reddit community discussions
- User-reported experiences