Minimal service/runtime kernel for autonomous agent scheduling.
Migration note:
`marrow-core` is being narrowed to service ownership only. Task systems, work-item models, and workflow repos now live outside the active core boundary.
Use one mental model everywhere:
- `rules` -> stable global policy in the external profile bundle
- `roles` -> canonical role identity and delegation boundaries in the external profile bundle
- `context.d` -> dynamic queue, state, and environment facts from the external profile bundle
- `skills` -> reusable procedures outside repo prompt assembly
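Under this model, a hypothetical external profile bundle (the layout is an illustration, not a mandated structure) might look like:

```
profile-root/
├── rules/       # stable global policy
├── roles/       # role identity and delegation boundaries
├── context.d/   # dynamic queue, state, and environment facts
├── skills/      # reusable procedures
└── roles.toml   # model-tier casting config
```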
Repo-root agents/ is retired. Do not add prompt material there.
Role trees, prompt policy, and model maps are external profile concerns.
marrow-core no longer ships canonical `roles/` or an in-repo casting flow. Use an external profile such as marrow-bot and cast it with `uvx role-forge`.
The canonical sources of truth are:
- external profile `roles/` for role prompts and layout
- external profile `roles.toml` for model-tier casting
- `marrow_core/contracts.py` for runtime inventory and workspace topology
- `marrow_core/plugin_host.py` for hosted plugin/background-service rendering
- `.opencode/agents/` as the generated runtime surface
Use marrow-core as a runtime package only:
- install `marrow-core` via `uvx`/package tooling
- provide runtime config outside core (or from an external profile repo)
- provide prompt/context/roles from an external profile root via `[profile]`
- use `uvx marrow-core validate --config /path/to/runtime-config.toml`
- use `uvx marrow-core install-service --config /path/to/runtime-config.toml`
marrow-core no longer carries in-repo profile assets. New deployments should supply them externally.
```shell
uvx marrow-core validate --config /path/to/runtime-config.toml
uvx marrow-core run --config /path/to/runtime-config.toml
uvx marrow-core install-service --config /path/to/runtime-config.toml --platform auto --output-dir ./service-out
```

The core runtime prompt is intentionally generic. Execution policy belongs in external profile repos, not in `marrow_core.heartbeat`.
Documentation is built with Zensical. Run `just docs-build` from `marrow-core/` to generate the static site into `site/`; that directory is build output and is intentionally not tracked in git.
For Vercel deployment, set the project Root Directory to `marrow-core`, use the commands from `vercel.json`, and deploy the generated `site/` directory as a static site.
The docs repo/edit links are configured for the monorepo layout under `marrow-core/docs/`.
```shell
marrow run              # root supervisor or single-user heartbeat loop
marrow run-once         # one tick per scheduled agent, then exit
marrow dry-run          # assemble prompts without running agents
marrow sync-once        # one bounded sync attempt with structured result codes
marrow setup            # init root runtime or single-user workspace
marrow scaffold         # create a new writable workspace skeleton and starter config
marrow validate         # check config and show summary
marrow doctor           # verify workspace, context dirs, and agent command availability
marrow status           # query live heartbeat state over IPC
marrow wake             # wake one configured agent immediately via IPC, with optional prompt
marrow install-service  # render launchd or systemd service files
```
`service` remains as an internal grouping for runtime-only commands such as worker bootstrap, but the normal interface is the root command set.
```toml
[service]
mode = "supervisor"
runtime_root = "/var/lib/marrow"

[profile]
root_dir = "/path/to/marrow-bot"

[ipc]
enabled = true

[self_check]
enabled = true
interval_seconds = 900
wake_agent = "orchestrator"

[sync]
enabled = true
interval_seconds = 3600
failure_backoff_seconds = 300

[[agents]]
user = "marrow"
name = "orchestrator"
heartbeat_interval = 10800
heartbeat_timeout = 7200
workspace = "/Users/marrow"
agent_command = "/Users/marrow/.opencode/bin/opencode run --agent orchestrator"
context_dirs = ["/Users/marrow/context.d"]
```

marrow-core does not cast profiles itself.
The effective execution path is:
- maintain role definitions in an external profile repo
- cast them with `uvx role-forge cast --config <profile>/roles.toml`
- run `uvx marrow-core ...` against a runtime config that points at that profile
- `marrow_core/contracts.py` - canonical role inventory and workspace topology
- `marrow_core/plugin_host.py` - plugin/background-service host manifests and units
- `marrow_core/prompting.py` - context execution and prompt assembly
- `marrow_core/runtime.py` - socket, service-runtime, and binary path resolution
- `marrow_core/health.py` - reusable doctor and self-check health checks
- `marrow_core/services.py` - launchd/systemd rendering
- `marrow_core/scaffold.py` - workspace scaffold and starter config generation
- `marrow_core/heartbeat.py`, `marrow_core/ipc.py`, `marrow_core/triggers.py` - orchestration layers
- `marrow_core/cli/` - unified root CLI plus internal service helpers
```
/Users/marrow/
├── .opencode/agents/
├── context.d/
├── plugins/
├── runtime/
│   ├── control/
│   ├── state/
│   ├── checkpoints/
│   ├── logs/exec/
│   ├── logs/plugins/
│   └── plugins/
└── docs/
```
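The workspace tree above can be checked or created programmatically. This is a doctor-style sketch over that layout, not the actual `marrow_core.health` or `marrow_core.scaffold` implementation.

```python
from pathlib import Path

# Expected subdirectories, taken from the workspace tree above.
EXPECTED = [
    ".opencode/agents",
    "context.d",
    "plugins",
    "runtime/control",
    "runtime/state",
    "runtime/checkpoints",
    "runtime/logs/exec",
    "runtime/logs/plugins",
    "runtime/plugins",
    "docs",
]


def missing_dirs(workspace: Path) -> list[str]:
    """Return expected subdirectories that do not exist yet."""
    return [rel for rel in EXPECTED if not (workspace / rel).is_dir()]


def scaffold(workspace: Path) -> None:
    """Create any missing directories (similar in spirit to `marrow scaffold`)."""
    for rel in EXPECTED:
        (workspace / rel).mkdir(parents=True, exist_ok=True)
```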
marrow-core carries a minimal hosted-plugin contract:
- Hosted plugins: a small `[[plugins]]` config surface for dashboard/background-service processes.
- `marrow_core.plugin_host` resolves runtime directories, emits a manifest, and can render autostart units for background services.
Typical hosted plugin/service examples include operator-side repos such as marrow-dashboard and task/workflow repos outside marrow-core.
When `marrow install-service` is run with `[[plugins]]` configured, it:
- writes a plugin manifest to `<primary-workspace>/runtime/plugins/manifest.json`
- renders autostart service units for `background_service` plugins with `auto_start = true`
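As an illustration of the manifest step, the sketch below writes a `runtime/plugins/manifest.json` from `[[plugins]]`-shaped entries. The actual manifest schema lives in `marrow_core.plugin_host`; the field names here are assumptions based on the config shape, not the real format.

```python
import json
from pathlib import Path


def write_plugin_manifest(workspace: Path, plugins: list[dict]) -> Path:
    """Write a hypothetical plugin manifest under <workspace>/runtime/plugins/."""
    runtime_plugins = workspace / "runtime" / "plugins"
    runtime_plugins.mkdir(parents=True, exist_ok=True)
    entries = [
        {
            "name": p["name"],
            "kind": p["kind"],
            # Full argv: command followed by its args.
            "command": [p["command"], *p.get("args", [])],
            "auto_start": p.get("auto_start", False),
        }
        for p in plugins
    ]
    manifest_path = runtime_plugins / "manifest.json"
    manifest_path.write_text(json.dumps({"plugins": entries}, indent=2))
    return manifest_path
```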
Minimal `[[plugins]]` shape:
```toml
[[plugins]]
name = "dashboard"
kind = "dashboard"
command = "python"
args = ["-m", "marrow_dashboard", "serve", "--config", "/etc/marrow/dashboard.toml"]
cwd = "/opt/marrow-dashboard"
workspace = "/Users/marrow"
config_path = "/etc/marrow/dashboard.toml"
capabilities = ["read_work_items"]

[[plugins]]
name = "gateway"
kind = "background_service"
command = "python"
args = ["-m", "marrow_gateway", "serve"]
cwd = "/opt/marrow-gateway"
workspace = "/Users/marrow"
auto_start = true

[plugins.env]
MARROW_WORKSPACE = "/Users/marrow"
```

See `examples/runtime-config.example.toml` for a copyable example and `docs/refactor-blueprint.md` for the current split direction.
- prefer one high-signal behavior test over multiple helper tests for the same failure mode
- keep supervisor boundary coverage concentrated in `tests/test_supervisor.py`
- add lower-level tests only when a helper has meaningful branching not already covered by a higher-level test
```shell
uvx marrow-core validate --config /path/to/runtime-config.toml
```

Manual update attempt:

```shell
uvx marrow-core service sync-once --config /path/to/runtime-config.toml
```

Note: `sync-once` is maintenance-only and still assumes a source-checkout `core_dir`. For pure uvx runtime installs, prefer disabling sync or using an external repo-maintenance flow.
Expected sync outcomes:
- `0` -> `noop`
- `10` -> `reloaded`
- `11` -> `restart_required`
- `1` -> `failed`
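Automation that wraps `sync-once` can translate these exit codes directly. The outcome table below comes from this document; the subprocess wrapper itself is a sketch, not part of marrow-core.

```python
import subprocess

# Exit-code to outcome mapping from the sync result table above.
SYNC_OUTCOMES = {0: "noop", 10: "reloaded", 11: "restart_required", 1: "failed"}


def interpret_sync_exit(returncode: int) -> str:
    """Map a sync-once exit code to its structured outcome name."""
    return SYNC_OUTCOMES.get(returncode, "unknown")


def run_sync_once(config_path: str) -> str:
    """Run one bounded sync attempt and return its outcome (sketch)."""
    proc = subprocess.run(
        ["uvx", "marrow-core", "service", "sync-once", "--config", config_path]
    )
    return interpret_sync_exit(proc.returncode)
```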