This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
```bash
# Install dependencies
uv sync

# Run the flow directly (entry point defined in pyproject.toml)
crewai run

# Interactive terminal chat (local, no deployment required)
uv run python terminal_chat.py

# Run automated multi-turn test
uv run python test_chat.py

# Generate flow diagram
uv run python -c "from src.conversational_routing.main import plot; plot()"

# Run Streamlit demo (requires deployed CrewAI Enterprise instance)
streamlit run demo_streamlit_poll/streamlit_app.py

# Run Flask webhook demo (requires deployed CrewAI Enterprise instance)
python demo_webhooks/app.py
```

Required in `.env`:
- `MODEL_FAMILY`: `gemini` (default), `vertex`, or `openai`
- `GEMINI_API_KEY`: required when using the `gemini` or `vertex` model families
- `MODEL`: optional override; defaults to `gemini/gemini-2.5-flash-lite`
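A minimal `.env` might look like this (the key value is a placeholder; only set the variables your model family needs):

```shell
# Example .env (placeholder values)
MODEL_FAMILY=gemini
GEMINI_API_KEY=your-api-key-here
# MODEL=gemini/gemini-2.5-flash-lite   # optional override
```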
The main entry point is `main.py`, which defines `ChatFlow`, a `@persist()`-decorated CrewAI Flow that maintains multi-turn conversation state via `ChatState` (a Pydantic model).
Flow routing logic:
- `@start` → `initial_processing` (no-op, triggers routing)
- `@router` → `classify_message` uses an inline Agent to classify the message into `pleasantries`, `question`, or `non-chase-question`
- `@listen` handlers branch to `answer_pleasantries`, `answer_question`, or `answer_non_chase_question`
- `@listen(or_(...))` → `send_response` consolidates all branches and returns a JSON string with `{id, response, current_agent, classification}`
Conversation persistence: The `@persist()` decorator on `ChatFlow` enables multi-turn sessions. Pass `id` from a previous flow run's `state.id` into `inputs` to resume a conversation.
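The routing and consolidation steps can be sketched in plain Python. This is an illustrative approximation only (the real flow uses CrewAI's `@router`/`@listen` decorators and an LLM classifier); the handler and field names are taken from this file:

```python
import json

# Map each classification label to the @listen handler that answers it.
HANDLERS = {
    "pleasantries": "answer_pleasantries",
    "question": "answer_question",
    "non-chase-question": "answer_non_chase_question",
}

def route(classification: str) -> str:
    """Mimic the @router step: pick the handler for a classification."""
    return HANDLERS[classification]

def send_response(state: dict) -> str:
    """Mimic send_response: serialize the shared state to a JSON string
    with the four fields the flow returns."""
    return json.dumps({
        "id": state["id"],
        "response": state["response"],
        "current_agent": state["current_agent"],
        "classification": state["classification"],
    })
```

Whichever branch runs, `send_response` is the single exit point, which is why callers can always parse the same four keys out of the result.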
Handles question-classified messages using a `benefits_expert_agent` backed by a PDF knowledge source (`knowledge/freedom_benefits.pdf`). The agent uses RAG with configurable embedders.
Model selection is done at import time via the `MODEL_FAMILY` env var; each model module in `models/` exports `llm` and `embedder_configuration`.
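A hedged sketch of that env-var dispatch (the function name and validation are illustrative, not the repo's actual code):

```python
import os

# Families this file documents; the real modules live under models/.
SUPPORTED_FAMILIES = {"gemini", "vertex", "openai"}

def resolve_model_family(default: str = "gemini") -> str:
    """Resolve MODEL_FAMILY from the environment, falling back to the
    documented default, and reject unknown values early."""
    family = os.environ.get("MODEL_FAMILY", default).lower()
    if family not in SUPPORTED_FAMILIES:
        raise ValueError(f"Unsupported MODEL_FAMILY: {family}")
    return family
```

Because the selection runs at import time, changing `MODEL_FAMILY` requires restarting the process rather than just re-invoking the flow.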
| Directory | Description |
|---|---|
| `demo_streamlit_poll/` | Streamlit UI; polls CrewAI Enterprise `/kickoff` + `/status/{id}` endpoints |
| `demo_webhooks/` | Flask app with SSE; receives results via webhook from CrewAI Enterprise |
| `demo_slackbot/` | Slack Bolt app using Socket Mode; maps Slack threads to flow session IDs |
All three demos require a deployed CrewAI Enterprise instance and use the same API pattern: POST to `/kickoff` with `{current_message, id?}`, then retrieve the result.
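A minimal helper for building that kickoff body, assuming the common CrewAI Enterprise shape of wrapping flow inputs in an `inputs` object (verify the exact body against your deployment; the function name is hypothetical):

```python
from typing import Optional

def build_kickoff_payload(current_message: str,
                          session_id: Optional[str] = None) -> dict:
    """Build the POST /kickoff body. Include id only when resuming an
    existing conversation; omitting it starts a fresh session."""
    inputs = {"current_message": current_message}
    if session_id is not None:
        inputs["id"] = session_id
    return {"inputs": inputs}
```

The optional `id` is what ties a new request back to a persisted `ChatFlow` session, so the demos store the `id` returned by the first run and replay it on every later turn.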
`knowledge/freedom_benefits.pdf` is the source document for the RAG-powered `AssistantCrew`. The `prepare_vecdb/` directory contains tooling for pre-building the vector database.