zrr1999/marrow-core

marrow-core

Minimal service/runtime kernel for autonomous agent scheduling.

Migration note: marrow-core is being narrowed to service ownership only. Task systems, work-item models, and workflow repos now live outside the active core boundary.

Prompt model

Use one mental model everywhere:

  • rules -> stable global policy in the external profile bundle
  • roles -> canonical role identity and delegation boundaries in the external profile bundle
  • context.d -> dynamic queue, state, and environment facts from the external profile bundle
  • skills -> reusable procedures outside repo prompt assembly

Repo-root agents/ is retired. Do not add prompt material there.

Profile ownership

Role trees, prompt policy, and model maps are external profile concerns.

marrow-core no longer ships canonical roles/ or an in-repo casting flow. Use an external profile such as marrow-bot and cast it with uvx role-forge.

Canonical model

The canonical source of truth is:

  • external profile roles/ for role prompts and layout
  • external profile roles.toml for model-tier casting
  • marrow_core/contracts.py for runtime inventory and workspace topology
  • marrow_core/plugin_host.py for hosted plugin/background-service rendering
  • .opencode/agents/ as the generated runtime surface

Recommended installation path

Use marrow-core as runtime package only:

  • install marrow-core via uvx/package tooling
  • provide runtime config outside core (or from an external profile repo)
  • provide prompt/context/roles from an external profile root via [profile]
  • use uvx marrow-core validate --config /path/to/runtime-config.toml
  • use uvx marrow-core install-service --config /path/to/runtime-config.toml

marrow-core no longer carries in-repo profile assets. New deployments should supply them externally.

uvx-first usage

uvx marrow-core validate --config /path/to/runtime-config.toml
uvx marrow-core run --config /path/to/runtime-config.toml
uvx marrow-core install-service --config /path/to/runtime-config.toml --platform auto --output-dir ./service-out

The core runtime prompt is intentionally generic. Execution policy belongs in external profile repos, not in marrow_core.heartbeat.

Docs deployment

Documentation is built with Zensical. Run just docs-build from marrow-core/ to generate the static site into site/; that directory is build output and is intentionally not tracked in git.

For Vercel deployment, set the project Root Directory to marrow-core, use the commands from vercel.json, and deploy the generated site/ directory as a static site.

The docs repo/edit links are configured for the monorepo layout under marrow-core/docs/.

CLI

marrow run              # root supervisor or single-user heartbeat loop
marrow run-once         # one tick per scheduled agent, then exit
marrow dry-run          # assemble prompts without running agents
marrow sync-once        # one bounded sync attempt with structured result codes
marrow setup            # init root runtime or single-user workspace
marrow scaffold         # create a new writable workspace skeleton and starter config
marrow validate         # check config and show summary
marrow doctor           # verify workspace, context dirs, and agent command availability
marrow status           # query live heartbeat state over IPC
marrow wake             # wake one configured agent immediately via IPC, with optional prompt
marrow install-service  # render launchd or systemd service files

The service subcommand group remains an internal grouping for runtime-only commands such as worker bootstrap; the normal interface is the root command set.

Configuration

[service]
mode = "supervisor"
runtime_root = "/var/lib/marrow"

[profile]
root_dir = "/path/to/marrow-bot"

[ipc]
enabled = true

[self_check]
enabled = true
interval_seconds = 900
wake_agent = "orchestrator"

[sync]
enabled = true
interval_seconds = 3600
failure_backoff_seconds = 300

[[agents]]
user = "marrow"
name = "orchestrator"
heartbeat_interval = 10800
heartbeat_timeout = 7200
workspace = "/Users/marrow"
agent_command = "/Users/marrow/.opencode/bin/opencode run --agent orchestrator"
context_dirs = ["/Users/marrow/context.d"]

Runtime contract

marrow-core does not cast profiles itself.

The effective execution path is:

  1. maintain role definitions in an external profile repo
  2. cast them with uvx role-forge cast --config <profile>/roles.toml
  3. run uvx marrow-core ... against a runtime config that points at that profile

Runtime boundaries

  • marrow_core/contracts.py - canonical role inventory and workspace topology
  • marrow_core/plugin_host.py - plugin/background-service host manifests and units
  • marrow_core/prompting.py - context execution and prompt assembly
  • marrow_core/runtime.py - socket, service-runtime, and binary path resolution
  • marrow_core/health.py - reusable doctor and self-check health checks
  • marrow_core/services.py - launchd/systemd rendering
  • marrow_core/scaffold.py - workspace scaffold and starter config generation
  • marrow_core/heartbeat.py, marrow_core/ipc.py, marrow_core/triggers.py - orchestration layers
  • marrow_core/cli/ - unified root CLI plus internal service helpers
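To illustrate the marrow_core/services.py boundary, a systemd renderer might look roughly like the sketch below. The unit fields, template, and function name are assumptions for illustration, not the actual template shipped in services.py.

```python
def render_systemd_unit(name: str, command: str, workspace: str) -> str:
    """Render a minimal systemd service unit for one agent.

    Illustrative sketch only; the real rendering is in
    marrow_core/services.py and may differ in fields and naming.
    """
    return (
        "[Unit]\n"
        f"Description=marrow agent '{name}'\n"
        "After=network.target\n"
        "\n"
        "[Service]\n"
        f"ExecStart={command}\n"
        f"WorkingDirectory={workspace}\n"
        "Restart=on-failure\n"
        "\n"
        "[Install]\n"
        "WantedBy=default.target\n"
    )
```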

Workspace layout

/Users/marrow/
├── .opencode/agents/
├── context.d/
├── plugins/
├── runtime/
│   ├── control/
│   ├── state/
│   ├── checkpoints/
│   ├── logs/exec/
│   ├── logs/plugins/
│   └── plugins/
└── docs/
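The skeleton above can be created mechanically; a minimal sketch of what `marrow scaffold` might do for the directory portion is below. The real command also writes a starter config, which is omitted here, and the actual implementation lives in marrow_core/scaffold.py.

```python
from pathlib import Path

# Directories taken from the workspace layout shown above.
WORKSPACE_DIRS = [
    ".opencode/agents",
    "context.d",
    "plugins",
    "runtime/control",
    "runtime/state",
    "runtime/checkpoints",
    "runtime/logs/exec",
    "runtime/logs/plugins",
    "runtime/plugins",
    "docs",
]

def scaffold_workspace(root: str) -> None:
    """Create the empty workspace skeleton (directories only)."""
    for rel in WORKSPACE_DIRS:
        Path(root, rel).mkdir(parents=True, exist_ok=True)
```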

Hosted plugins

marrow-core carries a minimal hosted-plugin contract:

  • Hosted plugins: a small [[plugins]] config surface for dashboard/background-service processes. marrow_core.plugin_host resolves runtime directories, emits a manifest, and can render autostart units for background services.

Typical hosted plugin/service examples include operator-side repos such as marrow-dashboard and task/workflow repos outside marrow-core.

When marrow install-service is run with [[plugins]] configured, it:

  • writes a plugin manifest to <primary-workspace>/runtime/plugins/manifest.json
  • renders autostart service units for background_service plugins with auto_start = true

Minimal [[plugins]] shape:

[[plugins]]
name = "dashboard"
kind = "dashboard"
command = "python"
args = ["-m", "marrow_dashboard", "serve", "--config", "/etc/marrow/dashboard.toml"]
cwd = "/opt/marrow-dashboard"
workspace = "/Users/marrow"
config_path = "/etc/marrow/dashboard.toml"
capabilities = ["read_work_items"]

[[plugins]]
name = "gateway"
kind = "background_service"
command = "python"
args = ["-m", "marrow_gateway", "serve"]
cwd = "/opt/marrow-gateway"
workspace = "/Users/marrow"
auto_start = true

[plugins.env]
MARROW_WORKSPACE = "/Users/marrow"
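To make the manifest step concrete, here is a sketch of emitting `runtime/plugins/manifest.json` from parsed `[[plugins]]` entries like the ones above. The manifest schema used here is an assumption for illustration; see marrow_core/plugin_host.py for the real shape.

```python
import json
from pathlib import Path

def write_plugin_manifest(workspace: str, plugins: list[dict]) -> Path:
    """Write a plugin manifest under <workspace>/runtime/plugins/.

    The field selection below is illustrative, not the canonical schema.
    """
    manifest_path = Path(workspace, "runtime", "plugins", "manifest.json")
    manifest_path.parent.mkdir(parents=True, exist_ok=True)
    manifest = {
        "plugins": [
            {
                "name": p["name"],
                "kind": p["kind"],
                "command": [p["command"], *p.get("args", [])],
                "auto_start": p.get("auto_start", False),
            }
            for p in plugins
        ]
    }
    manifest_path.write_text(json.dumps(manifest, indent=2))
    return manifest_path
```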

See examples/runtime-config.example.toml for a copyable example and docs/refactor-blueprint.md for the current split direction.

Testing guidance

  • prefer one high-signal behavior test over multiple helper tests for the same failure mode
  • keep supervisor boundary coverage concentrated in tests/test_supervisor.py
  • add lower-level tests only when a helper has meaningful branching not already covered by a higher-level test

Quick start

uvx marrow-core validate --config /path/to/runtime-config.toml

Manual update attempt:

uvx marrow-core service sync-once --config /path/to/runtime-config.toml

Note: sync-once is maintenance-only and still assumes a source checkout core_dir. For pure uvx runtime installs, prefer disabling sync or using an external repo-maintenance flow.

Expected sync outcomes:

  • 0 -> noop
  • 10 -> reloaded
  • 11 -> restart_required
  • 1 -> failed
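A wrapper that maps these exit codes to their result names can be sketched as follows. The code-to-name mapping comes from the list above; the subprocess invocation is illustrative.

```python
import subprocess

# Exit codes documented for `marrow-core service sync-once`.
SYNC_RESULTS = {0: "noop", 10: "reloaded", 11: "restart_required", 1: "failed"}

def interpret_sync_exit(code: int) -> str:
    """Map a sync-once exit code to its structured result name."""
    return SYNC_RESULTS.get(code, "unknown")

def run_sync_once(config_path: str) -> str:
    """Run one bounded sync attempt and report its outcome."""
    proc = subprocess.run(
        ["uvx", "marrow-core", "service", "sync-once", "--config", config_path]
    )
    return interpret_sync_exit(proc.returncode)
```

A supervisor could, for example, trigger a service restart only when the result is `restart_required`.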

About

Minimal self-evolving agent scheduler with hard isolation between core (human) and agent evolution.
