feat(chat): add ON_PROMPT_READY plugin hooks + telemetry-based prompt middleware example #3405

@mrveiss

Description

Context7 Score: 76/100

Test query: "How do you implement a custom middleware in AutoBot to intercept and modify LLM prompts based on real-time infrastructure telemetry?"

Audit Findings

What EXISTS

  • autobot-backend/chat_workflow/llm_handler.py — centralised prompt construction (_build_full_prompt, _get_system_prompt)
  • autobot-backend/chat_workflow/graph.py — LangGraph pipeline with prepare_llm node
  • autobot-backend/middleware/llm_awareness_middleware.py — HTTP-level middleware injecting awareness context
  • autobot-backend/plugin_manager.py — hook system with ON_AGENT_EXECUTE, ON_MESSAGE_RECEIVED, etc.
  • autobot-backend/api/prometheus_mcp.py — Prometheus metrics (CPU, memory, load) as MCP tools
  • docs/developer/PLUGIN_SDK.md — plugin lifecycle docs
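For orientation, the existing hook system can be pictured as a registry that chains plugin callbacks. This is a minimal sketch only; the `register`/`dispatch` helpers and the exact enum members are illustrative stand-ins for whatever `plugin_manager.py` actually provides, not its real API:

```python
# Hypothetical sketch of a hook registry in the style of plugin_manager.py.
# Names (register, dispatch) are illustrative, not the real AutoBot API.
from enum import Enum, auto
from typing import Callable, Dict, List


class Hook(Enum):
    ON_AGENT_EXECUTE = auto()
    ON_MESSAGE_RECEIVED = auto()


_registry: Dict[Hook, List[Callable]] = {}


def register(hook: Hook, fn: Callable) -> None:
    """Attach a plugin callback to a lifecycle hook."""
    _registry.setdefault(hook, []).append(fn)


def dispatch(hook: Hook, value, *extra):
    """Run all callbacks for a hook in registration order.

    Each callback receives the current value (plus any extra positional
    args) and its return value replaces it, so handlers chain like
    middleware. With no callbacks registered, the value passes through
    unchanged.
    """
    for fn in _registry.get(hook, []):
        value = fn(value, *extra)
    return value
```

The chaining pass-through behavior is what the new prompt hooks below would reuse: a hook with no registered plugins is a no-op.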

What is MISSING

  1. No prompt-level hooks — ON_SYSTEM_PROMPT_READY and ON_FULL_PROMPT_READY don't exist in the plugin hook enum
  2. No telemetry → prompt bridge — Prometheus metrics are queryable but never auto-injected into prompts
  3. No example plugin showing prompt interception/modification
  4. No docs for the prompt middleware pattern

Acceptance Criteria

  • Hook.ON_SYSTEM_PROMPT_READY added — fires in llm_handler._get_system_prompt() after loading base prompt; plugins receive (system_prompt: str, session: dict) -> str
  • Hook.ON_FULL_PROMPT_READY added — fires in graph.prepare_llm() after _build_full_prompt(); plugins receive (prompt: str, llm_params: dict, context: dict) -> str
  • Hook calls added to llm_handler.py and graph.py at the appropriate points
  • Example plugin: plugins/telemetry_prompt_middleware/ — queries Prometheus CPU/memory; appends concise-response hint when load is high
  • docs/developer/PROMPT_MIDDLEWARE_GUIDE.md — explains hook interface, telemetry example, registration pattern
  • Tests: hook fires with correct args, return value replaces prompt, no-op when no plugins registered

Files to Touch

  • autobot-backend/plugin_sdk/hooks.py (or wherever Hook enum lives) — add new hook types
  • autobot-backend/chat_workflow/llm_handler.py — call ON_SYSTEM_PROMPT_READY
  • autobot-backend/chat_workflow/graph.py — call ON_FULL_PROMPT_READY
  • plugins/telemetry_prompt_middleware/ (new)
  • docs/developer/PROMPT_MIDDLEWARE_GUIDE.md (new)
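The core of the example plugin could look like the following. The threshold, hint text, and `fetch_cpu_utilisation` stub are all assumptions; a real plugin would query the Prometheus metrics exposed via `prometheus_mcp.py` instead of returning a fixed value:

```python
# Illustrative core of the proposed telemetry_prompt_middleware plugin:
# when CPU utilisation is above a threshold, append a concise-response
# hint to the full prompt. Threshold, hint wording, and the fetch stub
# are hypothetical.
CPU_THRESHOLD = 0.85
HINT = "\n\n[system note: host under high load; keep the response concise]"


def fetch_cpu_utilisation() -> float:
    # Stub: a real implementation would query Prometheus, e.g. something
    # like avg(1 - rate(node_cpu_seconds_total{mode="idle"}[5m])).
    return 0.92


def on_full_prompt_ready(prompt: str, llm_params: dict, context: dict) -> str:
    """Handler for the proposed ON_FULL_PROMPT_READY hook."""
    if fetch_cpu_utilisation() >= CPU_THRESHOLD:
        return prompt + HINT
    return prompt
```

Returning the (possibly modified) prompt string keeps the handler composable with any other registered prompt plugins.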
