
๐Ÿ‰ Loong - Rust Base for Vertical AI Agents


"Originated from the East, here to benefit the world"

Build License: MIT Rust Edition 2024 Version
X Telegram Discord Reddit
Xiaohongshu Feishu QR WeChat QR

Loong is a secure, extensible, and sustainably evolvable agent base for vertical AI agents, built in Rust. On a controlled foundation, it supports longer-horizon workflow construction, compound task execution, and closed-loop improvement, enabling people and AI to collaborate in real-world scenarios.

English | 简体中文

Documentation • Get Started • Configuration • Playbooks • Build On Loong • Contributing

Why Loong

Because it already has the core capabilities you need to inspect, operate, and extend:

  • 🚀 Rich configuration out of the box: 42+ built-in providers, 25+ channels; up and running in a few commands.
  • 👀 Transparent and controllable: audit, tasks, skills, plugins, channels, runtime-snapshot, and gateway control are all exposed as directly usable commands.
  • 🛡️ Secure and controlled base: provider selection, tools, memory, channels, approvals, policy, and audit operate within explicit runtime boundaries.

And because it fits you whether you are a beginner or a power user:

  • ⚡ Easy to start: a few commands to get running, compatible with existing configurations from OpenClaw, Claude Code, Codex, OpenCode, and other similar AI tools.
  • 🧭 Transparent boundaries: assistant, gateway, and channels operate independently, never tangled together.
  • 🔌 Core and extensions are separate: providers, tools, channels, memory, and policy live outside the kernel; compile and compose as needed.
  • 🌱 Not a toy: designed for long-term use; grows with your needs over time.

For the longer public rationale behind this positioning, read Why Loong.

Sponsors

Volcengine   Feishu

Quick Start

Loong uses loong as the only supported command-line entrypoint.

Script Install (Recommended)

Linux or macOS:

curl -fsSL https://raw.githubusercontent.com/eastreams/loong/dev/scripts/install.sh | bash -s -- --onboard

Windows PowerShell:

$script = Join-Path $env:TEMP "loong-install.ps1"
Invoke-WebRequest https://raw.githubusercontent.com/eastreams/loong/dev/scripts/install.ps1 -OutFile $script
pwsh $script -Onboard

From source:

Ensure your system has a C linker (required by Rust):

# Debian / Ubuntu
sudo apt update && sudo apt install build-essential
# Fedora
sudo dnf groupinstall "Development Tools"
# macOS
xcode-select --install

Install the Rust toolchain (skip if already installed):

curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
source "$HOME/.cargo/env"

Build and install:

bash scripts/install.sh --source --onboard
# Or install via Cargo only (without onboard setup)
cargo install --path crates/daemon

First Successful Flow

loong onboard                # Interactive setup โ€” configure provider and model
loong ask --message "Summarize this repo in one sentence."  # Single-turn query to verify config
loong chat                   # Start a multi-turn conversation
loong doctor --fix           # Check environment and auto-fix common issues

Running onboard is enough for the golden path: it writes a working config to ~/.loong/config.toml without asking you to hand-edit TOML. The snippets below show what that file looks like on dev today, for when you want to add another provider or wire up a channel.

Providers

active_provider = "openai"

[providers.openai]
kind = "openai"
api_key = { env = "OPENAI_API_KEY" }
model = "auto"

[providers.volcengine]
kind = "volcengine"
api_key = { env = "ARK_API_KEY" }
model = "auto"
  • active_provider selects which lane runs; switch by editing the field or by running loong onboard again.
  • api_key = { env = "OPENAI_API_KEY" } reads the secret from that environment variable. api_key = "OPENAI_API_KEY" would instead treat the string as the literal key value (a common pitfall).
  • model = "auto" uses provider-side discovery; pin model = "<id>" when discovery is unreliable for your region or account.
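Since api_key = { env = "OPENAI_API_KEY" } only names the variable, the secret itself must be present in the environment that launches Loong. A minimal sketch (the value is a placeholder, not a real key):

```shell
# Export the secret that the { env = "OPENAI_API_KEY" } reference resolves at runtime.
# Placeholder value -- substitute your real key.
export OPENAI_API_KEY="sk-example-placeholder"
```

Persist the export in your shell profile or a secrets manager so the daemon sees it on every start.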

Channels โ€” Lark

[feishu]
enabled = true
domain = "lark"                           # use "feishu" for the China Feishu lane
mode = "websocket"
receive_id_type = "chat_id"
app_id = { env = "LARK_APP_ID" }
app_secret = { env = "LARK_APP_SECRET" }
allowed_chat_ids = ["oc_ops_room"]
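As the inline comment notes, the China Feishu lane differs only in the domain field. A sketch of the changed portion (all other keys stay as above):

```toml
[feishu]
enabled = true
domain = "feishu"                         # China lane; keep "lark" for the international lane
mode = "websocket"
```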

Smoke-test before anything else:

loong doctor
loong feishu-send --receive-id "ou_example_user" --text "hello from loong"
loong feishu-serve

For the full provider and channel matrices, multi-account setups, and the long-running delivery model, see the Documentation table below.

Documentation

Get started: Get Started, or just run onboard / ask / chat / doctor
Full rollout path: Common Setups
Pick a provider: Provider Guides and Provider Recipes
Wire up channels: Channel Guides and Channel Recipes
Long-running delivery: Gateway And Supervision
Design stance: Why Loong
Architecture and extension: Build On Loong
Reference: Reference

Architecture

Loong is a 7-crate Rust workspace with a strict acyclic dependency graph, organized around a governed kernel that separates contracts, security, execution, and orchestration.

contracts  (stable contract vocabulary)
├── kernel   -> contracts
├── protocol (independent transport foundation)
├── app      -> contracts, kernel
├── spec     -> contracts, kernel, protocol
├── bench    -> kernel, spec
└── daemon   -> app, bench, contracts, kernel, spec
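Assuming a conventional Cargo layout under crates/ (crates/daemon appears in the install step above; the other member paths are assumptions based on the crate names), a workspace manifest mirroring this graph would look roughly like:

```toml
# Illustrative workspace manifest; member paths other than crates/daemon are assumed.
[workspace]
members = [
    "crates/contracts",
    "crates/kernel",
    "crates/protocol",
    "crates/app",
    "crates/spec",
    "crates/bench",
    "crates/daemon",
]
```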

For ownership zones, the layered execution model (L0–L9), and design principles, see ARCHITECTURE.md.

Contributing

Contributions are welcome. Start with CONTRIBUTING.md.

If you want to help where it matters most right now, read Contribution Areas.

Star History

Star History Chart