
Commit 481fbbf

update README
1 parent 199267e commit 481fbbf

1 file changed: README.md (20 additions, 14 deletions)
@@ -312,8 +312,8 @@ dspy-code
 # Initialize your project (creates config and scans your environment)
 /init
 
-# Connect to a model (interactive selector)
-/model
+# Connect to a model (example with Ollama)
+/connect ollama llama3.1:8b
 
 # Generate your first program using natural language
 Create a sentiment analyzer that takes text and outputs positive or negative
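The prompt in the hunk above asks the tool to generate a sentiment analyzer. As a purely illustrative stand-in (not actual DSPy Code output, which would be an LLM-backed DSPy program), a toy rule-based sketch with the same interface — text in, `positive`/`negative` out:

```python
# Hypothetical stand-in for the generated analyzer: same interface as the
# prompt describes, but toy keyword-matching logic instead of an LLM.
POSITIVE = {"good", "great", "love", "excellent", "happy"}
NEGATIVE = {"bad", "terrible", "hate", "awful", "sad"}

def classify(text: str) -> str:
    words = set(text.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score >= 0 else "negative"

print(classify("I love this library"))    # positive
print(classify("the docs are terrible"))  # negative
```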
@@ -339,8 +339,7 @@ DSPy Code is **interactive-only** - all commands are slash commands. Here are th
 - `/exit` - Exit the interactive session
 
 ### 🤖 Model Connection
-- `/model` - Interactive model selection (local via Ollama or cloud providers)
-- `/connect <provider> <model>` - Directly connect to LLM when you know the model name
+- `/connect <provider> <model>` - Connect to LLM (ollama, openai, anthropic, gemini)
 - `/disconnect` - Disconnect current model
 - `/models` - List available models
 - `/status` - Show current connection status
@@ -412,7 +411,7 @@ DSPy Code is **interactive-only** - all commands are slash commands. Here are th
 ```bash
 dspy-code
 /init
-/model
+/connect ollama llama3.1:8b
 Create a RAG system for document Q&A
 /save rag_system.py
 /validate
@@ -479,22 +478,32 @@ dspy-code
 
 ## 🔌 Model Connection
 
-Connect to any LLM provider:
+Connect to any LLM provider. The easiest way is to use the interactive selector:
 
 ```bash
-# Recommended: interactive model selector
+# Recommended: interactive model selection
 /model
+```
+
+Then follow the prompts to:
+- **Pick a provider** (Ollama, OpenAI, Anthropic, Gemini)
+- **Choose a model** (for Ollama we auto‑list local models; for cloud you type the model name)
 
-# Or connect directly if you know the model name:
+Make sure the right API keys are set in your environment before starting `dspy-code`:
 
+- **OpenAI**: `OPENAI_API_KEY`
+- **Anthropic**: `ANTHROPIC_API_KEY`
+- **Gemini**: `GEMINI_API_KEY`
+
+```bash
 # Ollama (local, free)
-/connect ollama gpt-oss:120b
+/connect ollama llama3.1:8b
 
 # OpenAI (example small model)
 /connect openai gpt-5-nano
 
 # Anthropic (paid key required)
-/connect anthropic claude-sonnet-4.5
+/connect anthropic claude-4-5-sonnet
 
 # Google Gemini (example model)
 /connect gemini gemini-2.5-flash
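The new text in this hunk documents which environment variable each cloud provider needs. A minimal sketch of the pre-flight check this implies, using the key names from the README (the helper function itself is hypothetical, not part of dspy-code):

```python
import os

# Provider API-key variables as documented in the README section above.
REQUIRED_KEYS = {
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "gemini": "GEMINI_API_KEY",
}

def missing_key(provider: str, env=None):
    """Return the missing env-var name for a cloud provider, or None if set.

    Local providers (e.g. ollama) need no key, so they always return None.
    """
    env = os.environ if env is None else env
    key = REQUIRED_KEYS.get(provider)
    if key and not env.get(key):
        return key
    return None

print(missing_key("openai", env={}))                          # OPENAI_API_KEY
print(missing_key("openai", env={"OPENAI_API_KEY": "sk-x"}))  # None
print(missing_key("ollama", env={}))                          # None
```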
@@ -561,11 +570,8 @@ pip install -e .
 ### With uv (Faster)
 
 ```bash
-# Always get the latest version into your current environment
+# Always get the latest version
 uv pip install --upgrade dspy-code
-
-# Or add it to your project's pyproject.toml in one step
-uv add dspy-code
 ```
 
 ## 🏗️ Architecture
