
brain: core v0 (guardian-gated model calls + no-claim post + provenance) #80

Open
Cheewye wants to merge 13 commits into main from feat/brain-core-v0

Conversation

Cheewye (Owner) commented on Jan 23, 2026

backend: add the CritGate single-source contract and enforce the entrypoint guard (CritGate first, GuardianEngine once)
backend: keep brain orchestration on a single path (CritGate -> Guardian -> MemoryGate opt-in -> ActionRouter -> Postprocess); see the sketch after this list
tests: add CritGate entrypoint coverage (deny/allow/determinism); an illustrative pytest sketch follows the output summary below
docs: add CRITGATE_SINGLE_SOURCE.md (contract + invariants)
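
A minimal sketch of that single path and entrypoint guard, assuming injected CritGate / GuardianEngine / MemoryGate / ActionRouter collaborators; the `handle_request` / `BrainResult` names and all signatures here are illustrative assumptions, not the repo's actual API:

```python
# Hypothetical sketch of the single orchestration path; only the gate order is
# taken from this PR description, everything else (names, signatures) is assumed.
from dataclasses import dataclass
from typing import Optional


@dataclass
class BrainResult:
    allowed: bool
    reason: str
    output: Optional[str] = None


def handle_request(request, critgate, guardian, memory_gate, action_router,
                   postprocess, use_memory=False) -> BrainResult:
    # 1. CritGate runs first at every entrypoint (single-source contract).
    verdict = critgate.evaluate(request)
    if not verdict.allowed:
        return BrainResult(allowed=False, reason=verdict.reason)

    # 2. GuardianEngine is consulted exactly once on this path.
    guarded = guardian.check(request)
    if not guarded.allowed:
        return BrainResult(allowed=False, reason=guarded.reason)

    # 3. MemoryGate is opt-in: only queried when the caller enables it.
    context = memory_gate.fetch(request) if use_memory else None

    # 4. ActionRouter dispatches the guardian-gated model call, then the raw
    #    output is postprocessed (no-claim post + provenance per the PR title).
    raw = action_router.dispatch(request, context=context)
    return BrainResult(allowed=True, reason="ok", output=postprocess(raw))
```

The invariant to review for is that no entrypoint reaches GuardianEngine or ActionRouter without passing through CritGate first, and GuardianEngine is never invoked twice for the same request.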
verification:
pytest -q tests/test_guardian_suite_pseudo.py (1 intentional skip allowed)
pytest -q tests/test_guardian_eval_harness.py
pytest -q tests/test_critgate_entrypoints.py
note: no frontend changes, no repo-wide formatting, no churn outside sprint files
Pytest output summary:
tests/test_guardian_suite_pseudo.py: ✅ (1 intentional skip)
tests/test_guardian_eval_harness.py: ✅
tests/test_critgate_entrypoints.py: ✅
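
For context on the deny/allow/determinism coverage (the sketch referenced above), here is an illustrative pytest shape; the in-repo tests in tests/test_critgate_entrypoints.py are authoritative, and the inline CritGate stand-in exists only so this sketch runs on its own:

```python
# Illustrative deny/allow/determinism coverage; not the actual repo tests.
import pytest


class CritGate:
    """Stand-in gate (assumption) so the sketch is self-contained."""
    DENY_MARKERS = ("unsafe",)

    def evaluate(self, request):
        allowed = not any(m in request.get("text", "") for m in self.DENY_MARKERS)
        return {"allowed": allowed, "reason": "ok" if allowed else "denied"}


@pytest.fixture
def gate():
    return CritGate()


def test_entrypoint_denies_flagged_request(gate):
    assert gate.evaluate({"text": "unsafe request"})["allowed"] is False


def test_entrypoint_allows_benign_request(gate):
    assert gate.evaluate({"text": "hello"})["allowed"] is True


def test_verdict_is_deterministic(gate):
    # The same input must produce the same verdict on repeated evaluation.
    request = {"text": "hello"}
    first = gate.evaluate(request)
    assert all(gate.evaluate(request) == first for _ in range(5))
```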

📋 iURi Security Checklist

Marine Module Protection

  • src/components/marine/* is NOT modified or deleted
  • react-leaflet is NOT introduced (use MapLibre or plain Leaflet)
  • If there is a WebSocket on :8000 → it sits behind the VITE_USE_IURI_WS flag
  • Env vars are NOT renamed: VITE_SIGNALK_WS, VITE_MQTT_WS, VITE_VESSEL_ID

General Architecture

  • Sidebar/routes: unchanged (unless that is the goal of the PR)
  • i18n updated for all languages (es/pt/en/...)
  • No Three.js dependencies introduced without lazy-loading and a feature flag

Local Testing

  • npm run dev opens /dashboard with no console errors
  • (If the marine stack is running) scripts/sanity_stack.sh OK
  • (Required) scripts/premerge_guard.sh PASS
  • (Required) scripts/llm_diff_review.sh run (attach output)

Screenshots


🧠 Local AI Guardian Output

Ollama LLM Diff Review
<!-- Pegar aquí la salida de: MODEL=llama3.2 bash scripts/llm_diff_review.sh -->

📝 Description of Changes

🔗 Related

Fixes #

