### `.github/copilot-instructions.md` (2 additions, 2 deletions)

```diff
@@ -368,7 +368,7 @@ pipeline:
 ### LLM Provider Configuration
 
 - `LLM_PROVIDER=openai` → uses OpenAI API (requires `OPENAI_API_KEY`)
-- `LLM_PROVIDER=openrouter` → uses OpenRouter API gateway (requires `OPENROUTER_API_KEY`). Supports 200+ models including free ones (e.g., `meta-llama/llama-3.1-8b-instruct:free`). OpenAI-compatible API via `openai` Python package with custom `base_url`. Get a free key at https://openrouter.ai/keys.
+- `LLM_PROVIDER=openrouter` → uses OpenRouter API gateway (requires `OPENROUTER_API_KEY`). Supports 200+ models including free ones (e.g., `stepfun/step-3.5-flash:free`). OpenAI-compatible API via `openai` Python package with custom `base_url`. Get a free key at https://openrouter.ai/keys.
 - `LLM_PROVIDER=local` → uses the local HuggingFace text-completion-llm-service
 - Factory: `create_llm_provider(provider=None)` reads env var if not specified
@@ -464,7 +464,7 @@ Single bridge network `etl-network`. Services reference each other by container
 | `OPENAI_API_KEY` | — | OpenAI API key |
 | `OPENAI_MODEL` | `gpt-4o-mini` | OpenAI model name |
 | `OPENROUTER_API_KEY` | — | OpenRouter API key (free at https://openrouter.ai/keys) |
-| `OPENROUTER_MODEL` | `meta-llama/llama-3.1-8b-instruct:free` | OpenRouter model identifier |
+| `OPENROUTER_MODEL` | `stepfun/step-3.5-flash:free` | OpenRouter model identifier |
```
### `README.md` (9 additions, 3 deletions)

````diff
@@ -68,6 +68,8 @@ After execution, switch to the **Datasets** tab to browse output files, preview
 
 ### New in Streamlit UX
 
+- **Dual prompt modes** in Pipeline Editor: `Guided` textarea and `Chat-style` conversational input
+- **OpenRouter model semaphore** in sidebar: one-click model reachability check before generation
 - **Platform Readiness** panel in Execution tab: live checks for Airflow, Streamlit, Prometheus, Grafana, including Airflow scheduler heartbeat status
 - **Quick Airflow Triggers** in Execution tab: trigger `hr_analytics_pipeline`, `ecommerce_pipeline`, or `weather_api_pipeline` without leaving Streamlit
 - **Execution insights**: successful steps, processed data volume, slowest step, and orchestration overhead (%)
@@ -91,7 +93,7 @@ Pipeline Compiler → executes steps in parallel via Preparator SDK
 Output: cleaned dataset saved in the requested format
 ```
 
-The AI agent supports both **OpenAI** (GPT-4o-mini) and **local HuggingFace** models. The YAML editor and validator work without any API key.
+The AI agent supports **OpenAI**, **OpenRouter**, and **local HuggingFace** models. The YAML editor and validator work without any API key.
 
 ---
 
@@ -297,9 +299,13 @@ Full walkthrough: [docs/extending.md](docs/extending.md)
 
 | Variable | Default | Description |
 |---|---|---|
-| `LLM_PROVIDER` | `openai` | AI agent provider (`openai` or `local`) |
+| `LLM_PROVIDER` | `openai` | AI agent provider (`openai`, `openrouter`, or `local`) |
 | `OPENAI_API_KEY` | — | Required if `LLM_PROVIDER=openai` |
 | `OPENAI_MODEL` | `gpt-4o-mini` | OpenAI model |
+| `OPENROUTER_API_KEY` | — | Required if `LLM_PROVIDER=openrouter` |
+| `OPENROUTER_MODEL` | `stepfun/step-3.5-flash:free` | Default OpenRouter model |
+| `OPENROUTER_FALLBACK_MODELS` | `arcee-ai/trinity-large-preview:free,...` | Comma-separated fallback models if the selected model is unavailable |
+| `LOCAL_LLM_URL` | `http://localhost:5012` | Local text-completion service URL when running Streamlit on host |
 | `ETL_DATA_ROOT` | `/app/data` | Base directory for datasets and metadata |
 | `ALLOW_PRIVATE_API_URLS` | `false` | Allow private/local API targets in extract-api |
@@ -314,7 +320,7 @@ See [`.env.example`](.env.example) for all available variables including databas
````